(Bloomberg Law) -- A New York lawyer told a judge he never meant to fool anybody when he filed a court brief full of phony legal precedents invented by ChatGPT.
Steven Schwartz, who faces punishment for the brief, on Thursday asked US District Judge P. Kevin Castel for leniency. The lawyer claims he had no idea the free artificial intelligence tool could create fake case citations and court opinions.
“There were many things I should have done to assure the veracity of these cases,” Schwartz told the judge. “I failed miserably at that.”
Generative artificial intelligence tools like ChatGPT promise to upend white-collar professions, changing how law firms, financial institutions and others conduct business. But the technology also carries risks, including the danger of over-reliance.
ChatGPT, a chatbot launched by OpenAI, can carry on human-like conversations and generate responses drawn from vast troves of internet data—though OpenAI acknowledges the tool is prone to hallucinations and can provide inaccurate information.
Schwartz admitted this week that ChatGPT invented six cases he cited in a brief in a case against Avianca Airlines.
“I just never could imagine that ChatGPT would produce fabricated cases,” Schwartz told the judge in Manhattan federal court.
Crux of Case
Schwartz’s client claimed an Avianca employee hit him in the left knee with a metal serving cart on a 2019 flight from El Salvador to New York, causing him “severe personal injuries.”
The airline sought to dismiss the suit, arguing it was filed too late. In researching the statute of limitations issue, Schwartz acknowledged he used ChatGPT.
After Avianca’s lawyers filed papers saying they couldn’t find the cases, Schwartz continued to rely on the AI tool.
Castel told the lawyers that the case hinges on the actions of Schwartz and a colleague from his firm, Peter LoDuca, after they learned about the fraudulent citations.
“I doubt we would be here today if the narrative had ended there,” he said.
Castel walked Schwartz through his faulty brief, asking whether he’d thought to check the cases on legal research databases, in books at a law library, or even on Google. Schwartz’s answer each time was “no.”
The judge asked Schwartz whether he was suspicious of one of the main phony cases cited in the brief, the non-existent “Varghese v. China Southern Airlines Co.,” which the judge said included information that made no sense.
“Can we agree that is legal gibberish?” Castel asked.
Schwartz said he could not fathom that ChatGPT would invent a case from whole cloth and that he never considered such a possibility until Castel’s May 4 order to show cause.
“I continued to be duped by ChatGPT,” Schwartz said. “It’s embarrassing.”
After the problems with Schwartz’s brief were identified, federal judges in Illinois and Texas issued standing orders requiring lawyers to certify that their filings were created without generative AI, or that a human reviewed the accuracy of any language AI did craft.
“These platforms are incredibly powerful and have many uses in the law,” Northern District of Texas Judge Brantley Starr wrote in his order. “But legal briefing is not one of them.”
In the Avianca case, Castel ordered the sanctions hearing for Schwartz, his colleague LoDuca, who signed the brief and filed it with the court, and their firm, the four-lawyer Manhattan personal injury shop Levidow, Levidow & Oberman.
Castel in his order to show cause also raised the possibility of referring Schwartz to a state attorney grievance committee, which can investigate professional conduct.
Schwartz’s legal team is asking Castel not to hit him with sanctions, noting he and his firm’s reputations have already been tarnished by the episode.
Schwartz’s defense counsel, Ronald Minkoff, said the public embarrassment they’ve been exposed to is deterrent enough.
The case is “schadenfreude for any lawyer,” Minkoff said, “because lawyers have historically had difficulties with new technology.”
Castel adjourned the hearing without saying when he’ll decide on possible sanctions.
To contact the reporter on this story: Justin Wise at firstname.lastname@example.org; Bob Van Voris at email@example.com
To contact the editors responsible for this story: Chris Opfer at firstname.lastname@example.org; John Hughes at email@example.com
©2023 Bloomberg L.P.