Business Standard

US lawyer in legal trouble after citing cases 'invented' by ChatGPT



Benjamin Weiser

The lawsuit began like so many others: A man named Roberto Mata sued the airline Avianca, saying he was injured when a metal serving cart struck his knee during a flight to Kennedy International Airport in New York.
 
When Avianca asked a Manhattan federal judge to toss out the case, Mata’s lawyers vehemently objected, submitting a 10-page brief that cited more than half a dozen relevant court decisions. There was Martinez v. Delta Air Lines, Zicherman v. Korean Air Lines and, of course, Varghese v. China Southern Airlines, with its learned discussion of federal law and “the tolling effect of the automatic stay on a statute of limitations.”
 
There was just one hitch: No one — not the airline’s lawyers, not even the judge himself — could find the decisions or the quotations cited and summarised in the brief. That was because ChatGPT had invented everything.
 
The lawyer who created the brief, Steven A Schwartz of the firm Levidow, Levidow & Oberman, threw himself on the mercy of the court on Thursday, saying in an affidavit that he had used the artificial intelligence program to do his legal research — “a source that has revealed itself to be unreliable.”
 
Schwartz, who has practised law in New York for three decades, told Judge P Kevin Castel that he had no intent to deceive the court or the airline. Schwartz said that he had never used ChatGPT, and “therefore was unaware of the possibility that its content could be false.”
 
He had, he told Judge Castel, even asked the program to verify that the cases were real. It had said yes. Schwartz said he “greatly regrets” relying on ChatGPT “and will never do so in the future without absolute verification of its authenticity.”
 
Judge Castel said in an order that he had been presented with “an unprecedented circumstance,” a legal submission replete with “bogus judicial decisions, with bogus quotes and bogus internal citations.” He ordered a hearing for June 8 to discuss potential sanctions.
 
As artificial intelligence sweeps the online world, it has conjured dystopian visions of computers replacing not only human interaction, but also human labour. The fear has been especially intense for knowledge workers, many of whom worry that their daily activities may not be as rarefied as thought.
 
Stephen Gillers, a legal ethics professor at New York University School of Law, said the issue was particularly acute among lawyers, who have been debating the value and the dangers of AI software like ChatGPT, as well as the need to verify whatever information it provides. “The discussion now among the bar is how to avoid exactly what this case describes,” Gillers said. “You cannot just take the output and cut and paste it into your court filings.”
 
The real-life case of Roberto Mata v Avianca shows that white-collar professions may have at least a little time left before the robots take over.
 
It began when Mata was a passenger on Avianca Flight 670 from El Salvador to New York on August 27, 2019, when an airline employee bonked him with the serving cart, according to the lawsuit. After Mata sued, the airline filed papers asking that the case be dismissed because the statute of limitations had expired. In a brief filed in March, Mata’s lawyers said the lawsuit should continue, bolstering their argument with references and quotes from the many court decisions that have since been debunked.
 
Soon, Avianca’s lawyers wrote to Judge Castel, saying they were unable to find the cases that were cited in the brief.
When it came to Varghese v China Southern Airlines, they said they had “not been able to locate this case by caption or citation, nor any case bearing any resemblance to it.”
 
Indeed, the lawyers added, the quotation, purportedly drawn from Varghese itself, cited something called Zicherman v Korean Air Lines Co Ltd, an opinion supposedly handed down by the US Court of Appeals for the 11th Circuit in 2008. They said they could not find that, either. Judge Castel ordered Mata’s attorneys to provide copies of the opinions referred to in their brief. The lawyers submitted a compendium of eight; in most cases, they listed the court and judges who issued them, the docket numbers and dates.
 
The copy of the supposed Varghese decision, for example, is six pages long and says it was written by a member of a three-judge panel of the 11th Circuit. But Avianca’s lawyers told the judge that they could not find that opinion, or the others, on court dockets or legal databases.
 
Bart Banino, a lawyer for Avianca, said that his firm, Condon & Forsyth, specialised in aviation law and that its lawyers could tell the cases in the brief were not real. He added that they had an inkling a chatbot might have been involved.
 
Schwartz did not respond to a message seeking comment, nor did Peter LoDuca, another lawyer at the firm, whose name appeared on the brief.


First Published: May 28 2023 | 11:05 PM IST
