Human Rights Watch: Flawed technology shuts out deserving families from aid
A MARTÍNEZ, HOST:
Algorithms are everywhere in today's world, and it turns out they're used in the social safety net, too. The government of Jordan uses algorithms to screen applicants for its national cash assistance program. And that program, which is partially supported by the World Bank, provides aid to more than 200,000 poor households. But according to Human Rights Watch, the technology is shutting some people out. Our co-host, Leila Fadel, spoke with senior researcher Amos Toh.
AMOS TOH: What we found was that it actually leads to very trivial distinctions between families who qualify and families who don't qualify. One of the indicators, for example, measures whether you have a car, and it also takes into account factors like the value and the age of your car. People who have cars less than five years old are automatically disqualified. But if you have a car older than five years, those other factors come into play. But even this seemingly complicated indicator doesn't actually account for situations where, you know, families who have older cars might be able to afford petrol one week, and the next week they have to hitchhike because they simply have run out of money.
LEILA FADEL, BYLINE: So it doesn't take into account the nuance of people's lived experiences. If you could give me an example of somebody that you spoke to that was shut out of this program, what got them shut out, and how did that impact their life?
TOH: We spoke to a woman from a family in one of the poorest villages in the country. She had received the benefit for some time and then was dropped in 2022. She is still trying to make sense of why she was dropped from the benefit, but she suspects that the fact that they own a car may have factored into their subsequent rejection from the program. She said, you know, the car destroyed us. Look at it. It's sitting down there on the street. And we can't use it, and we actually have to hitchhike to do basic tasks. So that's the kind of situation that we kept coming across when we were interviewing families in Jordan.
FADEL: The World Bank has talked about this automated tech that you studied as, quote, the cornerstone of inclusive social policies. What did it say when it looked at your findings?
TOH: I think the World Bank's response really is that this form of targeting is one of the most cost-effective ways to distribute social protection and to distribute the limited resources that are available.
FADEL: So what is the solution here, then? Are you saying that they should just make the algorithms smarter?
TOH: Our recommendation is that these algorithms should be gotten rid of altogether. Targeting algorithms do not work in this context. They have been flawed for a long time, and the tendency now to try to improve them with better data, better technology - we worry that it will actually exacerbate chronic problems with poverty-targeted programs.
FADEL: How many places depend on algorithms like this to disseminate social welfare?
TOH: Globally, the World Bank has said that the number of countries that have taken up this technology has gone from 23 to 60. So increasingly, countries are relying on these kinds of technologies to distribute cash assistance, despite some of the growing problems that we have documented.
FADEL: Amos Toh is a senior researcher with Human Rights Watch. Thank you so much for your time.
TOH: Thanks for having me.
MARTÍNEZ: The World Bank says in a statement that the cash transfer program it supports in Jordan has provided a lifeline to the country's poorest people and made the government's social safety net more expansive and effective. Transcript provided by NPR, Copyright NPR.
NPR transcripts are created on a rush deadline by an NPR contractor. This text may not be in its final form and may be updated or revised in the future. Accuracy and availability may vary. The authoritative record of NPR’s programming is the audio record.