Telangana used an algorithm to build 360-degree profiles of citizens and decide whether they were eligible for the state's welfare schemes. The profiles were faulty. As a result, thousands of poor people were denied subsidised food. The state government knew of the blunder but did not fix it. The Collective's Tapasya travelled to the state to understand how states are using artificial intelligence in an unintelligent way, at a cost to citizens. She reports along with Kumar Sambhav and Divij Joshi. The story was produced with support from the Pulitzer Center's AI Accountability Network.
Haryana used the state’s Family ID database and algorithms to identify genuine beneficiaries of welfare schemes. The database wrongly listed thousands of citizens as dead. Result: They lost their pensions. The Collective’s Tapasya travelled to Haryana to understand how states are using artificial intelligence in an unintelligent way, at a cost to citizens. She writes along with Kumar Sambhav and Divij Joshi. The story was produced with support from the Pulitzer Center’s AI Accountability Network.
In the past few years, at least half a dozen states have adopted profiling software to predict citizens' eligibility for welfare schemes. These algorithms have wrongly declared the living as dead, the poor as well-off, and the disabled as able-bodied, robbing thousands of people of subsidised food, old-age pensions, disability pensions, widow pensions, and other welfare benefits meant for the poor. An investigation into how states are using artificial intelligence in an unintelligent way, at a cost to citizens. With support from the Pulitzer Center's AI Accountability Network.