AI and the Judicial System: Using Artificial Intelligence In Ruling of Court Cases

in #stemng, 7 years ago (edited)

Esteemed readers,
The use of artificial intelligence (AI) has helped people perform various technical jobs with far less stress. AI built to learn and approach general intelligence is in vogue, but purpose-built, problem-solving AI is the new frontier; AI shouldn't be all about robotics. One area currently being influenced by AI is the American judicial system.

Before the incorporation of AI into U.S. courts, an accused person had to appear before a judge shortly after arrest. The judge then set a trial date, which could fall weeks or months after that first appearance in court. The judge also decided whether the defendant would be remanded in jail while awaiting trial or released until the court date.

Most courts also use a bail system: the judge sets an amount (bail money) that the accused can pay to avoid remaining in jail while awaiting trial. Under this system, judges set a very high price for defendants who pose a risk of not coming back for trial. The basis for release on bail is essentially a price tag attached to a charge, so no matter the price, those rich enough can pay and return to the community to cause havoc, while those who cannot pay (and may be innocent) stay detained. It is worth noting:

Under the American system of justice, there’s a presumption that defendants are innocent until proven guilty.
source

Image Source
First Day of Software Usage, 30th August, 2017.

Using Artificial Intelligence to Assess Risk

In some American courts, judges use AI systems to help determine whether, and for how long, an accused person should be remanded. To build such a system, researchers use computers to analyze thousands of old court cases. An algorithm then uses that data to predict whether an accused person will commit a new crime or fail to honour his or her court date.
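
To make the idea concrete, here is a hypothetical toy in Python. The real system's data, features and model are not public, so the records, the two factors and the lookup below are all invented for illustration:

```python
from collections import defaultdict

# Hypothetical historical records: (age band, prior convictions, failed to appear)
HISTORY = [
    ("18-25", "2+",  True),
    ("18-25", "2+",  True),
    ("18-25", "0-1", False),
    ("26-40", "0-1", False),
    ("26-40", "2+",  True),
    ("26-40", "0-1", False),
    ("40+",  "0-1", False),
    ("40+",  "0-1", False),
]

def failure_rates(history):
    """Tabulate the observed failure-to-appear rate for each factor profile."""
    counts = defaultdict(lambda: [0, 0])  # profile -> [failures, total]
    for age, priors, failed in history:
        counts[(age, priors)][1] += 1
        if failed:
            counts[(age, priors)][0] += 1
    return {k: failures / total for k, (failures, total) in counts.items()}

def predicted_risk(rates, age, priors, default=0.5):
    """Score a new defendant by looking up the rate for their profile."""
    return rates.get((age, priors), default)

rates = failure_rates(HISTORY)
print(predicted_risk(rates, "18-25", "2+"))   # 1.0 in this toy data
print(predicted_risk(rates, "40+", "0-1"))    # 0.0
```

The key point the toy makes is that such a predictor can only reproduce patterns already present in the historical records it is fed.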

The Public Safety Assessment (PSA) is one of the AI systems United States judges use when ruling on cases in court.
We developed the PSA after conducting extensive research. A team under the direction of two leading researchers in the pretrial field analyzed 750,000 cases from more than 300 jurisdictions to identify the factors that were most predictive of

(1) failure to appear,
(2) criminal activity, and
(3) violence.
These factors include a defendant’s age, current charge, and key aspects of a person’s criminal history.
source

The Public Safety Assessment tool was created by the privately financed Laura and John Arnold Foundation, based in Texas.
According to the foundation, the system is designed to help judges make fair decisions when ruling on a case, using the most detailed and objective information available.
State judges in New Jersey have used the assessment program to help make decisions about defendants before trials, and judges in other states have used the system as well.
The first stage of the assessment process is to take the suspect's fingerprints; information about the person is then sent to a centralized computer system. The accused appears for the first hearing by video conference, and the judges take his or her risk score into account. An accused with a low risk score is often released for the pretrial period, while one with a high risk score can be detained, or released only under tight supervision by the court until the next court date.

The PSA measures the likelihood that an individual will commit a new crime—particularly a violent crime—upon release, as well as the likelihood that he or she will appear at a future court hearing. The risk assessment considers nine factors related to a defendant’s age, criminal history and current charge that research has shown accurately predict risk. The tool then generates risk scores for each defendant. This information, along with other pertinent facts from a defendant’s case, is provided to judges to assist in their pretrial decision making. While the PSA can be a helpful informational tool, it is important to note that judges always have the final say in every decision.
source
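
As a hedged sketch of how a point-based score like this might work: the foundation's actual weights and cut-offs are not reproduced here, so the factor weights, field names and thresholds below are purely illustrative.

```python
# Illustrative point weights for a PSA-style score. These are NOT the
# foundation's actual weights; the real tool uses nine factors drawn from
# age, criminal history and the current charge.
def raw_points(defendant):
    points = 0
    if defendant["age"] < 23:                      # youth is weighted
        points += 2
    points += min(defendant["prior_convictions"], 4)
    if defendant["pending_charge"]:
        points += 1
    if defendant["prior_failure_to_appear"]:
        points += 2
    return points

def risk_score(points, max_points=9):
    """Map a raw point total onto a 1-6 scale, as the quote describes."""
    return min(1 + round(5 * points / max_points), 6)

def recommendation(score):
    """Turn a 1-6 risk score into the kinds of outcomes the article lists."""
    if score <= 2:
        return "release"
    if score <= 4:
        return "release with supervision"
    return "detain pending hearing"

low = {"age": 35, "prior_convictions": 0,
      "pending_charge": False, "prior_failure_to_appear": False}
print(recommendation(risk_score(raw_points(low))))   # release
```

Note that the score is only a recommendation fed to the judge, matching the quote's point that judges always have the final say.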

Judge Ernest Caposela told the Associated Press that he supports the state's efforts to use technology to provide the best information available to help judges make careful decisions about defendants.

Caposela compared the automatic system to “the same way you buy something from Amazon. Once you’re in the system, they’ve got everything they need on you.”

Image Source

Is Replacing Judgement with Data Viable?

Some legal experts praised the assessment system, saying it has kept dangerous people off the streets, while accused persons who pose no threat to the safety of the community are free to go.

The introduction of the AI system reduces the problem of biased rulings, which may be influenced by an accused person's gender, race or appearance. The assessment used to determine the risk factors relies on age and past criminal convictions; race, employment background, gender, place of residence and arrest history are not considered.
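
A minimal sketch of that design choice, assuming hypothetical field names for the record, would simply strip the excluded attributes before any scoring happens:

```python
# Attributes the article says the assessment deliberately ignores.
# Field names here are hypothetical.
PROTECTED = {"race", "gender", "employment", "residence", "arrest_history"}

def assessment_features(record):
    """Keep only the factors the tool is said to use, dropping the rest."""
    return {k: v for k, v in record.items() if k not in PROTECTED}

record = {"age": 29, "prior_convictions": 1, "race": "n/a", "gender": "n/a"}
print(assessment_features(record))  # {'age': 29, 'prior_convictions': 1}
```

Of course, excluding a field does not guarantee fairness: as the conclusion below notes, other fields can still correlate with the excluded ones.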

Many people criticise the use of AI-powered data; they believe judges' own judgement might end up being replaced in pre-trial decisions and sentencing.

Conclusion

Though the price-tag bail system has been replaced with something better, the AI (the PSA) is not all-knowing. Havoc has still been caused by those deemed low risk, so the low-risk system is not foolproof.

Judge Glenn Grant, the acting administrative director of the courts in New Jersey, publicly stated, there are “probably 20 cases from 2016 where individuals were released on bail and people were murdered.”
source

In modern artificial intelligence, data rules. A.I. software is only as smart as the data used to train it, as Steve Lohr recently wrote, and that means that some of the biases in the real world can seep into A.I.
source

The PSA may have been fed thousands of records to determine low-risk defendants, but it can still be biased in choosing those who are truly low risk. An example of bias in an AI system can be read in this article.

So what do you think about AI helping judges apply the law? Do you think AI will be the panacea, or should judges instead be trained to analyse cases and determine low-risk defendants themselves? Leave your thoughts in the comments below.

Source/Reference

Ref1

Ref2

Ref3

Ref4


Nice write-up; I really missed your blog during the long break.

The use of artificial intelligence powered machine or AI has helped people perform various technical jobs without stress.

I can see where this is going: making tasks easier for people.

Thanks @valchiz for stopping by.

We are going digital and everything is becoming automated.

Fascinating concept. However, I think training it on the vast data of older court cases simply means it will absorb the biases of the judges and lawyers who adjudicated those old cases. The data set is already inherently poisoned.

What we need is a system that can look at all that data from the past and make better judgments than they did.

That's because a lot of innocent persons have been sent to prison due to negligence and flouting of the law. The AI should do better.

I think the intelligence (AI) should not be allowed to make all decisions. Its evaluation should still be re-evaluated.

Yeah, but it hasn't been used to give final verdicts yet. It's only used for pre-trial judgement.

Hey it's been a while.. I'm back :) How are you doing?

Yeah.. Good to have you back.

Very interesting post, my friend. Artificial intelligence extends its reach across the planet every day. Greetings.

Thanks for stopping by. I don't understand the language you used but I hope it's related to the post.

Nice, love steemit
