SAN FRANCISCO — A smartphone app that can tell a defendant what to say in court using artificial intelligence has been used for the first time — and it’s a lot cheaper than a real lawyer.
California-based company DoNotPay says this is the first time AI has been used during a trial anywhere in the world. The neural network will listen to all statements made in court by witnesses, lawyers, and the judge.
The program then tells the defendant exactly what to say through an earpiece, urging them to stick to only those words. In this case, legal history is being made over a speeding ticket. The breakthrough may set a precedent for more serious cases in the future.
Identities and locations are being kept confidential by DoNotPay. It would be illegal to use this technology in most countries, but British-born founder Joshua Browder has successfully argued that the app qualifies as a hearing aid.
“It is technically within the rules, but I don’t think it is in the spirit of the rules,” the entrepreneur says in a statement provided to SWNS.
The company is paying people to use the app
The landmark test case is taking place in February 2023. The firm has agreed to pay any penalties imposed for using the app. DoNotPay is also offering $1 million to anyone with an upcoming case at the U.S. Supreme Court if they will do the same thing.
The AI has already been used to talk directly to a bank’s customer service staff using a synthesized voice. The program successfully reversed several bank fees on its own, according to reports.
“It is the most mind-blowing thing I have ever done. It is only $16 that we got reversed, but that is the perfect job for AI – who has time to waste on hold for $16?” Browder tells SWNS.
The AI program has been trained in a range of case law topics, including immigration law. It has intervened in about three million cases in the U.S. and the U.K. The app sticks to factual statements, rather than saying whatever it could to win regardless of the truth.
“We are trying to minimize our legal liability. And it is not good if it actually twists facts and is too manipulative,” Browder says.
The audio tool is also tweaked to not automatically react to statements.
“Sometimes silence is the best answer,” Browder continues.
His ultimate goal is that the software will eventually replace some lawyers.
“It is all about language, and that is what lawyers charge hundreds or thousands of dollars an hour to do,” the entrepreneur explains. “There will still be a lot of good lawyers out there who may be arguing in the European Court of Human Rights (ECHR).”
“But a lot of lawyers are just charging way too much money to copy and paste documents and I think they will definitely be replaced, and they should be replaced.”
The app raises ethical concerns in the courtroom
Dr. Nikos Aletras, a computer scientist at the University of Sheffield, has created an AI program that can accurately predict the outcomes of ECHR cases. He has seen growing use of machine learning in the legal system, but warns its adoption needs careful consideration.
Providing real-time audio legal advice in a courtroom would still be a technological challenge. Ethical issues remain, such as whether it would even be legal to use.
Using recording equipment in a U.K. courtroom would breach the Contempt of Court Act 1981, and this AI system may fall foul of that rule.
“It appears to involve transmitting the audio to a third party’s servers and processing that audio within the resulting computer system,” says Neil Brown of the law firm decoded.legal. “I’d have thought a judge might well conclude it was being recorded, even if deleted soon afterwards.”
“So probably not something to try here unless you fancy contempt proceedings, at least not without checking it with the judge first,” Brown tells SWNS.
Asked if such a trial would be legal, the U.K. Ministry of Justice likened AI to “McKenzie friends” — people with no legal training who assist in court. They don’t have to be qualified lawyers but defendants have the right to have them sit in court and offer advice, a spokesperson explains.
South West News Service writer Mark Waghorn contributed to this report.