Governments Creating Precrime Child Abuse Algorithms to Predict Child Abuse - Or Just to Kidnap More Children

in #cps · 6 years ago (edited)

In the effort to find and help vulnerable children, authorities are looking at precrime intervention in the form of artificial intelligence and data processing to find at-risk children before they are harmed. The world of sci-fi precrime, like that depicted in Minority Report, is heading towards a reality near you.


Image source: pixabay

In the U.K., child protective services are said to be £800 million short in funding, according to The Guardian, which covered the move by Bristol authorities to experiment with algorithms and AI to remove children from homes before they are harmed. But this raises moral and ethical questions. Can it work? Or will it criminalize parents and kidnap children from families where they were never in any real danger at all?

Bristol council's predictive system is overseen by Gary Davies, who sees advantages in it. Rather than removing humans from the process, Davies argues, it will mean data is examined to stop humans from making mistakes:

"It's not saying you will be sexually exploited, or you will go missing. It's demonstrating the behavioural characteristics of being exploited or going missing. It's flagging up that risk and vulnerability."

But his comments seem to suggest that the fate of families will be in the hands of computers that categorize children's behaviors, or their parents, as fostering an environment that favors exploitation, and hence call for attention from the child "protective" services. As many victims of CPS know, once the Child Procurement Syndicate targets you as a threat to a child, there is a high probability that your child will be taken from you, and you will have to battle your way to get them back.

Could the cost-cutting of services be a problem-reaction-solution designed to bring in more Orwellian-dystopian micromanagement and control over our lives? Maybe. In explaining how useful the system could be, The Guardian points to AI's efficiency at mining consumer-behavior data to suggest how effective it could be when applied to family and child behavior:

Machine learning systems built to mine massive amounts of personal data have long been used to predict customer behaviour in the private sector.

Computer programs assess how likely we are to default on a loan, or how much risk we pose to an insurance provider.

For children, the system would work by looking at indicators that predict the emergence of an outcome. The plan is to compile data on children who have already entered the care system, yielding a model that can then be used to target other children for child care services:

They then attempt to identify characteristics commonly found in children who enter the care system. Once these have been identified, the model can be run against large datasets to find other individuals who share the same characteristics.

What characteristics would be looked at as "predictive indicators", for example?

... history of domestic abuse, youth offending and truancy.

But that's not all. The Guardian mentions another indicator that is often used to treat parents as unfit and therefore as harming their children: unemployment. Some indicators, such as "rent arrears and health data", luckily didn't make it in, but will it stay that way in the future? Any non-positive life circumstance can be used to "predict" that a parent is creating an unfavorable or harmful environment for a child.

Predictive metrics used in other areas are not 100% accurate; a track record of about 80% accuracy is considered good. Think about that in terms of families and parents. What happens to the 20% who are falsely flagged as bad parents or outright child abusers, whose children were never going to come to harm? Does that not matter compared to the greater good of catching the other 80%? Wrong. And that figure doesn't even account for base rates: because actual abuse is rare, even an "accurate" model will flag far more innocent families that happen to match the predictive indicators than families where abuse would ever have occurred.
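The base-rate arithmetic behind this point can be made concrete. The numbers below are purely illustrative assumptions (not figures from the article): 100,000 families screened, 1% genuinely at risk, and a model that is "80% accurate" in both directions (catches 80% of real cases, correctly clears 80% of innocent families).

```python
def flagged_breakdown(population, prevalence, sensitivity, specificity):
    """Return (true_positives, false_positives) for a screening model.

    true_positives:  at-risk families the model correctly flags
    false_positives: innocent families the model flags anyway
    """
    at_risk = population * prevalence
    not_at_risk = population - at_risk
    true_pos = at_risk * sensitivity              # real cases caught
    false_pos = not_at_risk * (1 - specificity)   # innocent families flagged
    return true_pos, false_pos


# Hypothetical numbers: 100,000 families, 1% actual abuse, 80%/80% model.
tp, fp = flagged_breakdown(100_000, 0.01, 0.80, 0.80)
precision = tp / (tp + fp)

print(f"correctly flagged: {tp:.0f}")
print(f"falsely flagged:   {fp:.0f}")
print(f"share of flagged families actually at risk: {precision:.1%}")
```

Under these assumptions, roughly 800 at-risk families are flagged alongside nearly 20,000 innocent ones, so only about 4% of the families the system flags would actually be at risk. That is the "casting the net widely" trade-off in plain numbers.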

Data privacy is also a concern, as The Guardian points out. Some people think this must be pursued with the aim of "balancing" protection of the vulnerable against people's right to privacy. Automated predictive profiling is a wide-open area: there is no national oversight of what authorities are doing, or of how much they disclose about what they're doing.

Michael Veale, a researcher in public-sector machine learning at University College London, seems to be in favor of the system. He justifies the false positives, innocent parents dragged through the mud and the court system, with trauma inflicted on their children, as the price of casting a wide net:

"When you predict events like these, you tend to get false positives. If you want to make sure you don’t miss children who are at risk, you’re going to play it safe and cast your net widely."

I see this system being used to justify more kidnapping of children from parents who are no threat, further terrorizing families and parents and spreading trauma to children whose characteristics happen to match the predictive indicators. This will of course be welcomed by the social workers, the foster system and the agencies, who will gain more justification for what they do and for demanding more funding.


Thank you for your time and attention. Peace.


Support @familyprotection. Expose CPS, vaccines and other harms to families and children. Donate if/when you can :)


If you appreciate and value the content, please consider: Upvoting, Sharing or Reblogging below.
Follow me for more content to come!


My goal is to share knowledge, truth and moral understanding in order to help change the world for the better. If you appreciate and value what I do, please consider supporting me as a Steem Witness by voting for me at the bottom of the Witness page.

Thank-you @krnel for submitting this post with the #familyprotection tag. It has been UPVOTED by @familyprotection and RESTEEMED TO OUR Community Supporters.

"Child Protection Agencies" are taking children away from their loving families.
THESE FAMILIES NEED PROTECTING.

(If you feel that our community has brought more rewards and attention to this post, please consider contributing a portion of those rewards back to our cause.)

@familyprotection has been chosen to receive another donation of 10 SBD from the A Dollar A Day project - supported today by @pennsif, @coruscate, @cryptocurator, @d-vine, @globocop, @goldendawne, @hopehuggs, @riverflows, @steeminganarchy, @theadmiral0 & witnesses @drakos, @followbtcnews, @helpie, @quochuy, @steemcommunity & @yabapmatt

Contact @pennsif if you would like to know more about A Dollar A Day.


The A Dollar A Day charitable giving project.


Thanks :)

It's the easiest thing in the world to get the public to accept dubious things like this when you're ostensibly doing it for the kids. Data privacy is already a non-factor when the safety of children is in question.

Let's put this in context. Negative life outcomes such as unemployment are considered risk factors. The middle class is already shrinking in much of the developed world. AI and robotics are destroying jobs at an accelerating rate with no jobs in new sectors in sight to replace them - except for the care sector, of course. An increasing number of us will find employment in the social services in the future. The family is already under assault from multiple directions. This may also be a way to keep the undesirables from breeding and taxing the welfare system - the undesirables meaning anyone unable to keep up with the forces of accelerating technological change driving humanity toward obsolescence.

Perhaps the plummeting birth rates in countries such as Finland are partly a result of people seeing the writing on the wall. The ever increasing structural unemployment is making family formation increasingly difficult and the more developed a country the stricter and the more demanding the standards for good parenting. It seems to me that if someone really wants a family, it could be best for them to look for a country nowhere near the global top of "human development" but developed and stable enough not to have any serious problems with preventable diseases, organized crime or anything like that. Too much modernity is poison. It seems that too much modernity on a social level is analogous to too high standards of hygiene causing allergies and autoimmune diseases.

Yup, it targets a certain demographic of society for sure :/ All fits into a larger agenda. You make a good point about more development and more restriction on parents.

They leave no doubt they believe we are their possessions to do with as they wish. It is a tense time as the line grows stronger for those who are ok with being owned and those of us who are tired of this and would prefer to be free.

Given how many say things like politics doesn't affect them, or that the constitution has outlived its purpose - all the insane arguments that make one's jaw drop from the lack of clarity and common sense - I imagine there will still be support for such a tactic. They will say, "It's ok that they took your kids, they're the government, you know. If you hadn't done anything harmful to those kids, they wouldn't have taken them." Pathetic.

Yup, if you have nothing to hide, then everything will be fine with you. The government doesn't make mistakes. And if they do, they are trustworthy and you will get your kids back, no problem. Hail big brother!

Scary stuff isn't it! The UK is turning into Looney Land. We just can't make this stuff up! VERY concerning.

"When you predict events like these, you tend to get false positives. If you want to make sure you don’t miss children who are at risk, you’re going to play it safe and cast your net widely." Oh yeah, that's not scary as hell!

Yeah, nothing wrong with thinking like that. Everything will turn out great!

It’s always to kidnap more! So many unexplained missing children, orphans, etc. It’s crazy, and 90% of the population are asleep and don’t even care or wanna believe any of it. They do it right in front of our faces - some big game they’re playing with us. They will all get what’s coming to them soon!

Yup, one sick big joke :/

Thanks

Curated for #informationwar (by @wakeupnd)

  • Our purpose is to encourage posts discussing Information War, Propaganda, Disinformation and other false narratives. We currently have over 10,000 Steem Power and 20+ people following the curation trail to support our mission.

  • Join our discord and chat with 250+ fellow Informationwar Activists.

  • Join our brand new reddit! and start sharing your Steemit posts directly to The_IW, via the share button on your Steemit post!!!

  • Connect with fellow Informationwar writers in our Roll Call! InformationWar - Leadership/Contributing Writers/Supporters: Roll Call

Ways you can help the @informationwar

  • Upvote this comment.
  • Delegate Steem Power. 25 SP 50 SP 100 SP
  • Join the curation trail here.
  • Tutorials on all ways to support us and useful resources here

This is really out of the norm and intrusive, considering it will involve gathering behavioral data on all children, not just those suspected of being abused. I really don't see a need for such an intrusive program, given that teachers are often the first line of defense for children when it comes to spotting neglect or abuse. I think this program will leave them spending more time, energy and money running down false claims along with those that are legit.

@krnel This post will be featured on MSP Waves' The CHAOS Show during its @familyprotection segment today, Friday. The show starts at 7 pm Central.

If you wish to appear on the show, please contact me on MSP Discord.

Since you mentioned false positives please take a look at this: https://medium.com/s/story/im-a-heart-doctor-heres-why-im-wary-of-the-new-apple-watch-2b1999f2d942

When a certain negative element is uncommon (e.g. abusive parents), any sophisticated system is going to produce more false positives than true positives. Just let that sink in for a moment.
