‘Computer says no’: algorithms, ideology and inequality


Algorithms are increasingly being used in public services to determine how resources are distributed, and the decisions these algorithms make often disadvantage the most marginalised and vulnerable in society.

Is it possible that public services algorithms are not simply objective and neutral tools, but rather processes that contain political and economic assumptions about what types of people are worthy of ‘care’ and which are not?

Is it further possible that such algorithms are ideological tools created and administered by elites, designed and used to exclude ever-increasing numbers of people from public service provision, thus forming part of a broader neoliberal agenda?

These are just some of the questions Virginia Eubanks asks in her latest book Automating Inequality: How High-Tech Tools Profile, Police, and Punish the Poor.


In the book she examines several ways in which algorithms are used, with bias, in public services; I consider three of them below, along with some of the implications.

The Allegheny Family Screening Tool

The Allegheny Family Screening Tool (AFST) draws on a large administrative database maintained since 1999, and is used to predict which children are most likely to be neglected or abused in the future. The tool is essentially a statistical regression: it identifies the background factors of previously abused children, and these are used to help case workers decide which reports to screen in for further investigation.

However, the database only contains data on children and parents who have received public assistance – in other words, the poor. It includes no information about the harms that might be happening to middle-class kids.

The problem with this is that poor kids who aren't being abused may be wrongly flagged as at risk, while middle-class kids who are being abused can slip under the radar.
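To make the selection-bias problem concrete, here is a deliberately simplified sketch in Python. The feature names and weights are invented for illustration – this is not the actual AFST model or Allegheny County data – but it shows how a score trained only on welfare-system records can end up treating poverty proxies as if they were evidence of risk:

```python
import math

def risk_score(features, weights):
    """Logistic score: weighted sum of background factors squashed to 0..1."""
    z = sum(w * features.get(name, 0) for name, w in weights.items())
    return 1 / (1 + math.exp(-z))

# Hypothetical weights a regression might learn when every training
# record comes from families already inside the welfare system:
# poverty proxies end up carrying most of the signal.
weights = {
    "uses_public_assistance": 2.0,  # a poverty proxy, not evidence of abuse
    "prior_cps_report": 1.5,
    "single_parent": 0.5,
}

poor_family = {"uses_public_assistance": 1, "single_parent": 1}
middle_class_family = {"prior_cps_report": 1}

print(round(risk_score(poor_family, weights), 2))          # ~0.92: flagged, despite no abuse history
print(round(risk_score(middle_class_family, weights), 2))  # ~0.82: lower, despite a prior report
```

Because "uses public assistance" appears in every training record, a model like this cannot distinguish poverty from risk, so a poor family with no abuse history can out-score a middle-class family with a prior report.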

This looks like a case of conflating poverty with poor parenting – the poor are subjected to extra scrutiny and surveillance, which may not be the most effective way of tackling abuse across all social classes.

Interestingly, Eubanks notes that bias against the poor has always existed; such databases simply reinforce it further.

The Electronic registry of the unhoused in Los Angeles

There are around 58,000 unhoused people in Los Angeles, 75% of them completely unsheltered, and the goal of the registry is to determine who the most vulnerable unhoused people are, so that they can be housed.

The problem with the registry is that it asks people a barrage of moral and potentially incriminating questions – for example:

• ‘Are you having unprotected sex?
• Are you selling drugs for money?
• Is there an open warrant for your arrest?
• Have you thought about harming yourself or others?’

The responses to the survey are shared with over a hundred different agencies across Los Angeles County, and some of the information is available to the LAPD, which can obtain it simply by making a verbal request.

Some of the people who have done the survey feel as if they are being asked to incriminate themselves in exchange for a slightly better chance of getting housing.

This article provides further information.

The Automation of benefit claims in Indiana

In 2006 the governor of Indiana signed a $1.4 billion contract with a collection of high-tech companies (including IBM) to automate all of the state's eligibility processes for its welfare programs.

They did this by replacing local case workers with online forms and regional call centres, which meant that anyone claiming benefits no longer had a person who knew their case.

If anyone made a mistake filling in one of the online forms – or if the state or IBM made a mistake, and a lot of mistakes were made – the claimant was deemed ineligible, had their benefits cut, and then faced a monumental struggle and weeks if not months of delays to get them reinstated.

Millions of people lost out on benefits they were entitled to, and people even died as a result.

The project was such a failure that the governor cancelled the contract three years into its ten-year term, for which IBM sued and received an additional $50 million on top of the $437 million it had already collected.

However, from a neoliberal perspective, this experiment did its job: many people who applied for benefits and were wrongly rejected, even if they successfully fought to get them reinstated, are now reluctant to claim benefits again, even if they’re entitled to them, because the experience of dealing with that system was so horrendous.


Final Thoughts/Conclusions

The problem with relying on databases and algorithms in public services to decide who should and shouldn't get access to scarce resources is that we use them as an 'empathy override' – 'I don't want to decide which of the 58,000 homeless people in LA gets housed, so let's just enter some data and leave the decision to a computer'.

Then there's the fact that relying on algorithms is basically doing triage on social problems the system can't cope with – the reason we need algorithms is that there aren't enough resources. So perhaps if we got over neoliberalism, got a grip on tax havens, and provided sufficient resources to meet social need, we wouldn't need algorithms to make the unpleasant decisions humans cannot face.

It might just be that the use of algorithms in public services compounds social problems: we focus on collecting ever-increasing amounts of data and using it more effectively, when what we really need to solve our welfare problems is political change.

Eubanks suggests that the kinds of abuses states commit with algorithms may stem from our still being in the 'Wild West' period of big data, in which the people who create and use these databases have an 'anything goes' mentality about how they can use the data collected.

However, this might be coming to an end: more and more people, especially those in the marginalized communities most harmed by these systems, are becoming concerned about how their data is shared – whether it's legal, whether it's morally right – and are developing ways of resisting giving away so much of it.

Virginia Eubanks is a political science professor at the State University of New York.




This is an important topic and it ties into crypto as well. It would be possible to put data in the hands of the people it is about and let them fine-tune, using cryptography, the level of granularity at which they share it and its metadata. Since data is such a valuable resource, the public shouldn't be forced to divulge it for free – or at all, if they don't want to.

You're absolutely right that there are many questionable ways in which data are gathered on people and used against them. The child abuse risk model is a prime example. Assessing child abuse, or the possibility thereof, is such a subjective matter that it is hard to completely trust the process by which people do it, particularly when certain families are branded as at risk by an algorithm they understand only vaguely at best. Any family can be found at fault if scrutinized closely enough.

Child abuse takes place, and having the healthcare system check up on kids is understandably a necessity. But when I compare how things have changed in this regard since I was a kid (I have a ten-year-old daughter), the prevailing ethos seems to have shifted from common sense to population-wide screening and suspicion.

When my daughter started school, all the parents were given a questionnaire before a doctor's examination at the school clinic that asked very detailed questions about our private lives, apparently for some kind of statistical risk assessment model. I heard some parents chuckle at some of the questions.

We had had to move a couple of years prior to that because of my wife's work, and our daughter had had some kids bully her around the time she started school. We'd moved another time right before her school started because we wanted to live in a better area and were better able to afford it at the time. I was at the examination with my daughter, and at one point the doctor questioned our moving; I had to defend our decision by explaining that vacant jobs in my wife's field were few and far between in the area we moved from. I don't know what sort of information my daughter's teacher had passed on to the doctor. Otherwise it was a pleasant conversation, but it felt odd at that point.

All first-world countries, at least in the West, seem keen on implementing many types of mass surveillance of their populations. It is starting to feel somewhat unpleasant. Continental Europe, particularly the Nordic countries, is generally much less paranoid about child safety than the Anglosphere, particularly North America (and perhaps Australia, too). Kids here walk to school alone from first grade on, unlike in the USA. In Illinois it is in fact illegal to leave children home alone until they turn 14, which is patently absurd given that the same kids will be allowed to drive a car at 16. But the same type of thinking of parenting as a profession, spearheaded by Americans, seems to be spreading here too.

In those first-world countries without mass immigration or religiosity keeping birth rates up, they have been plummeting, particularly in the last 5-10 years. I suspect a lot of people on the fence about becoming parents in the first place, or about having a second or third child, have decided against it because the standards for good parenting have been creeping up all the time. In a modern society, particularly when the middle class has less job security than in past decades, not having children is increasingly the rational choice, as people know that being poor makes them suspect even as parents.


Thanks for the detailed comment.

I'm just glad I don't want kids... they do seem to open the door to more state surveillance.

I think Britain is worse than most countries in Europe for the things you mention – so much surveillance but then so many cock-ups too (I'm sure you've heard of the Rotherham case).

The US is probably a step up, at least for the poor.

Here in the UK there are so many measures of child development your kid will always fail at something... a friend of mine's got a 'gifted' kid – he was told his son had problems relating to other children, when actually the boy was just bored because he was so far ahead of them all.

Data does seem to be used in more and more oppressive ways.

Posted using Partiko Android


I feel there's a kind of naivety about some of these approaches, which seem like a good idea on the surface. As part of a health and social care masters we were looking at mass screening of patient data. Several problems:

  • although apparently anonymised, GPs were able to identify individual patients from the clusters of data;
  • although apparently aiming to identify the most vulnerable patients, the data designers had no concept of how this data might be misused (throughout the whole presentation the presenters were alarmingly bright-eyed, bushy-tailed and gushing, while the audience, all health and social care workers, sat ominously silent with accompanying "I can't believe I'm hearing this" body language);
  • it failed to understand the interrelationship between social factors and health outcomes – for example, the health consequences borne by (unpaid, usually family) carers.

@roused wrote a post yesterday about fachidiots. Between these, neoliberal policies and unconscious bias, some groups of people are completely stuffed.

Thanks for sharing the book.

I've heard about the first one of those... and the other two don't surprise me.

I can imagine that masters degree is depressing and enlightening at the same time.



It was a bit of a struggle.

