Tech Law

AI in the Courts: The Jury Is Out

A session on the role of emerging technologies in the courtroom was part of last month’s New York State Bar Association Annual Meeting in New York City.

“Emerging Technologies in Litigation” included a panel of local and federal judges as well as an e-discovery researcher and emerging technology attorney. The group discussed the use of artificial intelligence in the courtroom.

The session addressed the role that AI could play in judicial decision-making, where algorithms can potentially predict behavior and outcomes resulting from different legal strategies. The rationale is that law is built on precedent: if a case closely resembles past cases, its outcome should be broadly predictable.
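To make the idea concrete, here is a minimal sketch of precedent matching in principle: past cases are reduced to numeric features, and the predicted outcome for a new case is the majority outcome among its most similar precedents. The features, cases, and outcomes below are invented for illustration and do not represent any system discussed at the session.

```python
# A toy precedent-matching sketch: represent past cases as feature vectors,
# find the most similar precedents to a new case, and report the majority
# outcome among them. All case data here is invented purely for illustration.
from collections import Counter
from math import dist

# Hypothetical features: (claim amount in $ thousands, prior filings, days to trial)
past_cases = [
    ((50.0, 1, 120), "settled"),
    ((48.0, 0, 110), "settled"),
    ((300.0, 4, 400), "plaintiff_win"),
    ((280.0, 3, 380), "plaintiff_win"),
    ((60.0, 2, 130), "dismissed"),
]

def predict_outcome(new_case, precedents, k=3):
    """Return the majority outcome among the k most similar past cases."""
    ranked = sorted(precedents, key=lambda case: dist(new_case, case[0]))
    outcomes = [outcome for _, outcome in ranked[:k]]
    return Counter(outcomes).most_common(1)[0][0]

print(predict_outcome((55.0, 1, 125), past_cases))  # prints "settled" for this toy data
```

Real research systems work on far richer inputs, such as the full text of opinions, but the underlying intuition that similar cases tend to have similar outcomes is the same.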

However, given the rise of deepfakes — and the possibility that AI, in effect, could manufacture evidence — some argued that the technology should be excluded from court proceedings.

Despite such concerns, the global “legal tech” market for AI is expected to grow in the coming years, driven by major law firms adopting legal tech solutions that aim to reduce turnaround time on cases.

AI is used to help with document management systems, e-discovery, e-billing, contract management, and even practice and case management.

AI has already been employed at a lower level in the Los Angeles Superior Court to handle seemingly mundane traffic citations. Visitors to the court’s website can interact with Gina, an AI-powered online avatar, to pay a traffic ticket, register for traffic school, or schedule a court date.

Since being installed in 2016, Gina — which is part of an effort by the LA Superior Court to reduce the backlog of cases — has had more than 200,000 interactions a year and has reduced traffic court wait times dramatically.

One Step Closer to PreCrime

AI’s predictive algorithms can be used by police departments to strategize about where to send patrols, and facial recognition systems can be used to help identify suspects.

Combined, these approaches sound eerily similar to the Philip K. Dick short story, “The Minority Report,” which became the basis of the Steven Spielberg-directed film Minority Report, in which the police department’s PreCrime unit apprehends criminals based on foreknowledge of criminal activity.

“Courts currently are using AI algorithms to determine the defendant’s ‘risk,’ which can range from the probability that the defendant will commit another crime to whether or not they will appear for their next court date, for bail, sentencing and parole decisions,” explained technology inventor/consultant Lon Safko.

AI can often be wrong, not only in determining where officers should patrol but also in recommending how offenders should be sentenced. This is where the Correctional Offender Management Profiling for Alternative Sanctions (COMPAS) tool comes into play. It compares a defendant’s answers to a questionnaire, along with personal factors, against a nationwide dataset and assigns a score, which can be used to inform sentencing.
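The sketch below illustrates that general idea of a risk score: weighted questionnaire answers and personal factors are totaled and placed against a reference population. The factors, weights, and cut-offs here are entirely hypothetical; the actual COMPAS model is proprietary and considerably more complex.

```python
# An entirely hypothetical risk-scoring sketch in the spirit of tools like
# COMPAS: weight a defendant's questionnaire answers, then compare the total
# against a reference population to produce a decile-style score. The weights,
# factors, and population cut-offs are invented; the real model is proprietary.

FACTOR_WEIGHTS = {                      # hypothetical factors and weights
    "prior_arrests": 2.0,
    "age_under_25": 3.0,
    "unstable_housing": 1.5,
    "failed_to_appear_before": 2.5,
}

# Hypothetical cut-offs dividing a reference population's raw scores into ten
# deciles (decile 10 = highest assessed risk).
POPULATION_DECILE_CUTOFFS = [1, 2, 3, 4, 5, 6, 7, 8, 9]

def raw_score(answers):
    """Sum the weights of every factor the defendant's answers flag as true."""
    return sum(w for factor, w in FACTOR_WEIGHTS.items() if answers.get(factor))

def decile(score, cutoffs=POPULATION_DECILE_CUTOFFS):
    """Map a raw score onto a 1-10 decile relative to the reference population."""
    return 1 + sum(score > c for c in cutoffs)

answers = {"prior_arrests": True, "age_under_25": True, "unstable_housing": False}
print(decile(raw_score(answers)))  # prints 5, a mid-range decile for this toy input
```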

“Recently, in Wisconsin, a defendant was found guilty for his participation in a drive-by shooting,” Safko told TechNewsWorld.

“While being booked, the suspect answered several questions that were entered into the AI system COMPAS,” he continued. “The judge gave this defendant a long sentence partially because he was labeled ‘high risk’ by this assessment tool.”

AI in the Courts

It isn’t yet clear how widespread the use of AI in the courts will become, in part because courts at all levels have been slow to embrace new technology. This could be changing, however, as AI can help streamline the courts in ways that could benefit all parties.

“We believe the courts are leading digital transformation in the market, and approximately 90 percent of courts have evolved from traditional court reporting to professional digital court reporting,” said Jacques Botbol, vice president of marketing at software firm Verbit.

“Certain applications of AI are often adopted faster than others, particularly those surrounding the automation of routine tasks and workflows,” he told TechNewsWorld.

“It’s interesting to note that AI is also being utilized in more complex applications, such as making decisions regarding cases,” added Botbol. “These use cases will be adopted more slowly as there are significant concerns about due process, biases, etc.”

AI Court Reporting

Supporters of AI technology in the courts point to how it can help court reporters do their job better.

“Today, most court reporting firms reject work since they don’t have the necessary workforce to handle it all,” explained Botbol.

“AI is helping to fill the gaps that the retiring court reporters and the legacy court reporter market have left,” he noted.

At the same time, “lawyers want to receive materials quickly, and today depositions are getting delayed because of the shortage in the market, with the shortfall in some areas exceeding 35 percent,” Botbol added.

AI, along with automatic speech recognition (ASR), allows proceedings to be recorded and transcribed more quickly.

“There is a backlog of cases that need to be transcribed, yet with AI-based ASR tools, these transcripts can be processed at faster turnaround times,” said Botbol. “Instead of relying on court transcriptionists, the courts have multiple court reporting agencies that they can assign the work out to in order to clear their backlog and work more efficiently.”
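As a rough illustration of that workflow, the following sketch batch-transcribes a folder of hearing recordings with an off-the-shelf ASR model. It assumes the open-source Whisper library (installed via `pip install openai-whisper`) and hypothetical directory names; it is not the toolchain any particular court or vendor actually uses, and a human reporter would still review the drafts.

```python
# A minimal backlog-clearing sketch using an off-the-shelf ASR model.
# Assumes the open-source Whisper library; directory names are hypothetical.
from pathlib import Path

import whisper

def transcribe_backlog(audio_dir: str, out_dir: str, model_name: str = "base") -> None:
    """Transcribe every .wav recording in audio_dir into a .txt draft in out_dir."""
    model = whisper.load_model(model_name)
    out = Path(out_dir)
    out.mkdir(parents=True, exist_ok=True)
    for recording in sorted(Path(audio_dir).glob("*.wav")):
        result = model.transcribe(str(recording))
        (out / f"{recording.stem}.txt").write_text(result["text"], encoding="utf-8")

if __name__ == "__main__":
    transcribe_backlog("hearings/", "transcripts/")  # hypothetical folders
```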

Judge and/or Jury

No one is expecting that AI will fill the role of judge or jury, at least not in the legal system of the United States. However, AI could help ensure that the accused in criminal cases truly are granted the right to a speedy trial while also addressing the backlogs in the civil courts.

“In the future, AI will not only serve as an add-on but will also help to streamline trials by removing delays, which will lead to smarter and faster decisions being made,” said Tony Sirna, legal specialist at Verbit.

“Applications of AI are being studied and piloted for a number of use cases,” he told TechNewsWorld.

These include not only sentencing and risk assessment, such as COMPAS, but also the settlement of disputes.

“Online Dispute Resolution is another aspect where we may see automated adjudication of small civil cases,” noted Sirna.

AI could help the parties reach an equitable settlement in civil cases.

“Mining extensive amounts of related court cases and decisions will come into play, with parties submitting their cases and using AI combined with data mining for settlement options or fair adjudication,” noted Sirna.

AI Rights

Another consideration that likely will come up is how AI will be treated by the courts. Can AI be an “expert witness,” for example? If so, how will AI need to be treated by the courts? Will AI need to be granted some form of rights?

“AI likely won’t need ‘rights,’ but it will need control and a team that manages the innovation in each court,” said Sirna.

“The aspect of ‘rights’ related to AI poses interesting legal questions: Who is responsible for the AI? Is the AI algorithm fair or biased? At what point does the AI make its own decisions? Who is liable for results or decisions rendered by algorithms — the user, the designer, or the court?” pondered Sirna.

However, many of these questions likely won’t need to be addressed anytime soon — nor will AI have the power to pass judgment.

“Our judicial system is by no means ‘early adopters,’ but for good cause,” said Safko.

“Rendering a just verdict and sentence is paramount, and we have to be sure that the defendants and plaintiffs are properly represented and that their information is protected,” he said. “This is why doctors insist on still using fax machines over email, which can easily be hacked.”

Automated Recommendations

AI could have a place in the courtroom, but perhaps only to aid human lawyers, judges, court reporters, and jurors. AI shouldn’t replace any of those humans but should help them do their jobs.

“Once a technology has proven itself to be reliable and shown time or cost savings, it has been and will be adopted,” suggested Safko.

“AI is not a perfect science — it is still programmed by humans, and not every set of data perfectly matches the predetermined rules programmed into the application,” he warned.

However, with increasing pressure on court dockets, any time- or cost-saving measures need to be considered. It is also important to consider how AI could then affect people’s lives.

“Every automated recommendation should be reviewed by a qualified judge to verify the outcome. Then their recommendation needs to be fed back into that system to allow it to become more proficient at rendering appropriate decisions,” said Safko.

“We can’t risk people’s lives on automated apps that save money,” he noted.
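A minimal sketch of that human-in-the-loop arrangement might look like the following: each automated recommendation is held for a judge’s decision, and the reviewed outcome is logged as feedback that could later be used to recalibrate the model. The data structures and case numbers are hypothetical.

```python
# A hypothetical human-in-the-loop review loop: the system only records the
# judge's decision and keeps it as feedback; it never acts on its own output.
from dataclasses import dataclass, field
from typing import List, Optional

@dataclass
class Recommendation:
    case_id: str
    suggested_outcome: str
    reviewed_outcome: Optional[str] = None   # set only by the human reviewer

@dataclass
class ReviewLoop:
    feedback_log: List[Recommendation] = field(default_factory=list)

    def review(self, rec: Recommendation, judge_decision: str) -> str:
        """Record the judge's decision and retain it as future training feedback."""
        rec.reviewed_outcome = judge_decision
        self.feedback_log.append(rec)
        return judge_decision

loop = ReviewLoop()
decision = loop.review(Recommendation("2023-CV-0042", "settle"), judge_decision="proceed to trial")
print(decision, len(loop.feedback_log))  # prints: proceed to trial 1
```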

“Even the Chief Justice of our Supreme Court, John Roberts, is concerned about how AI is affecting the U.S. legal system,” Safko explained. “When asked about AI in our legal system, he said, ‘It’s a day that’s here, and it’s putting a significant strain on how the judiciary goes about doing things.’”

Peter Suciu

Peter Suciu has been an ECT News Network reporter since 2012. His areas of focus include cybersecurity, mobile phones, displays, streaming media, pay TV and autonomous vehicles. He has written and edited for numerous publications and websites, including Newsweek, Wired and FoxNews.com.
