AI in the courtroom: Judges enlist ChatGPT help, critics cite risks

An Indian High Court judge used AI chatbot ChatGPT to summarize case law. The use of AI chatbots in the legal system is growing, with proponents praising their potential to streamline processes while critics warn of biases and false results.

The ChatGPT app is displayed on an iPhone in New York, on May 18, 2023. Richard Drew/AP

May 30, 2023

Indian High Court judge Anoop Chitkara has presided over thousands of cases. But when he refused bail to a man accused of assault and murder, he turned to ChatGPT to help justify his reasoning.

He is among a growing number of justices using artificial intelligence (AI) chatbots to assist them in rulings, with supporters saying the tech can streamline court processes while critics warn it risks bias and injustice.

“AI cannot replace a judge. ... However, it has immense potential as an aid in judicial processes,” said Mr. Chitkara.

“The knowledge revolution has started, and these AI platforms have in certain situations demonstrated their capabilities to instantaneously transform queries into outstanding results.”

Chatbots like ChatGPT and Google’s Bard are software applications designed to mimic human conversation in response to users’ questions.

Mr. Chitkara said he did not rely on ChatGPT to help decide his ruling in the 2020 case at the Punjab and Haryana High Court.

However, he wondered if he was relying too heavily on his own “consistent view” that allegations involving an unusually high level of cruelty should count against granting bail, and asked ChatGPT to summarize case law on the issue.

The justice ministry did not immediately respond to a request for comment.

The use of AI in the criminal justice system is growing quickly worldwide, from the popular DoNotPay chatbot lawyer mobile app to robot judges in Estonia adjudicating small claims and AI judges in Chinese courts.

In the Caribbean Colombian city of Cartagena, Judge Juan Manuel Padilla also turned to ChatGPT for help in a lawsuit in which an autistic boy’s parents were suing his health care provider for treatment costs and expenses.

“[ChatGPT] is generating text that is very reliable, very concrete, and applicable to a case in a specific way,” said Mr. Padilla.

He asked the chatbot several legal questions, such as whether an autistic child is exempt from fees for therapy, and included the details in his ruling, which found in favor of the child.

Concerns over false results

But chatbots’ reliability is questionable, said several legal and tech experts.

“Some judges are trying to find a way to make the job faster – but they don’t always know the limits or risks,” said Juan David Gutierrez, professor of public policy and data at Universidad del Rosario in Bogota, Colombia.

“ChatGPT can make up laws and rulings that don’t exist. In my view, it shouldn’t be used for anything important.”

There have been numerous examples of chatbots getting information wrong or making up plausible but incorrect answers – which have been dubbed “hallucinations” – such as inventing fictional articles and academic papers.

When ChatGPT was tested on its responses to 50 legal questions by Linklaters, a global law firm headquartered in London, legal experts found it proficient in some areas but severely lacking in others.

The AI confused sections of the Data Protection Act 2018, and failed to give complete answers on English contract law.

“If you didn’t already have a very good understanding of that area of law, it would be very hard for you to work that out,” solicitor Peter Church, an expert in data privacy at Linklaters, told the Thomson Reuters Foundation.

Use of chatbot “a disaster”

Better technology promises a way to alleviate the huge backlog that is clogging some legal systems.

India alone had more than 40 million cases pending in its lower courts last year, while Brazil saw 26 million new lawsuits filed in 2020 – more than 6,000 per judge.

But AI risks oversimplifying complex problems and could raise unrealistic expectations of tech’s capabilities, Dona Mathew and Urvashi Aneja from the research collective Digital Futures Lab wrote in a recent report.

There are also concerns over privacy violations and the exploitation of judicial data for profit.

“With biased and incomplete datasets, no legal remedies and accountability safeguards ... these changes can lead to systematic harms like threats to judicial independence and stagnation of legal principles,” they wrote.

Raquel Guerrero, a lawyer for three journalists in Bolivia who were accused of posting photos of a victim of violence without her permission, expressed concerns when the court consulted ChatGPT during an online hearing in April.

Ms. Guerrero said the complainant gave permission for the photos to be shared online but later denied she had done so.

Constitutional judges asked ChatGPT about any “legitimate public interest” for journalists posting online photos of a “woman showing parts of her body” without her consent.

ChatGPT answered it was a “violation of the person’s privacy and dignity.” The judges ordered the photos to be removed from social media.

The court record said ChatGPT does not replace decisions made by jurists, but that it can be used as additional support to be able to “clarify certain concepts.”

But Ms. Guerrero said the chatbot’s use in the hearing was “arbitrary” and a “disaster.”

“It can’t be used as if it’s a calculator that takes away the obligation of judges to use reason and to apply justice and to apply it correctly,” Ms. Guerrero said, adding she is considering filing a complaint against the judges for using the chatbot.

“Obviously, ChatGPT doesn’t stop being a robot. If you ask it in the right way, it will answer what you want to hear.”

This story was reported by the Thomson Reuters Foundation.