
Image by Sasun Bughdaryan, from Unsplash
Experts Warn Courts May Overlook AI Hallucinations In Legal Filings
A Georgia court overturned a divorce order after discovering fake legal citations, likely generated by AI, raising alarms about growing risks in justice systems.
In a rush? Here are the quick facts:
- Georgia court vacated order due to suspected AI-generated fake case citations.
- Judge Jeff Watkins cited “generative AI” as a likely source of bogus cases.
- Experts say courts are likely to miss more AI errors in filings.
A recent case before a Georgia court demonstrates how artificial intelligence (AI) might quietly degrade the public’s trust in American legal institutions. Ars Technica reports that Judge Jeff Watkins of the Georgia Court of Appeals vacated a divorce order after finding that it cited two made-up cases, likely generated by AI.
The order had been drafted by attorney Diana Lynch, a practice Ars Technica reports is now common in overworked courts. With attorneys increasingly turning to AI to produce such filings, that shortcut becomes particularly risky.
Lynch was sanctioned $2,500, and Judge Watkins wrote, “the irregularities in these filings suggest that they were drafted using generative AI,” adding that AI hallucinations can “waste time and money,” damage the system’s reputation, and allow a “litigant […] to defy a judicial ruling by disingenuously claiming doubt about its authenticity,” as reported by Ars Technica.
Experts warn this is not an isolated case. John Browning, a former Texas appeals judge, said it’s “frighteningly likely” more trial courts will mistakenly rely on AI-generated fake citations, especially in overburdened systems. “I can envision such a scenario in any number of situations,” he told Ars Technica.
Other recent examples echo the concern. In Colorado, a judge fined two attorneys representing MyPillow CEO Mike Lindell a total of $3,000 after they filed AI-generated legal documents containing more than twenty major errors. Judge Nina Y. Wang wrote, “this Court derives no joy from sanctioning attorneys,” but emphasized that lawyers are responsible for verifying their filings.
In California, another judge fined two law firms $31,000 after they submitted briefs containing fake citations. “That’s scary,” wrote Judge Michael Wilner, who was nearly persuaded by the fake rulings. Unless courts adapt quickly, AI hallucinations could become a recurring nightmare in American justice.
The trend is especially troubling given how expensive legal representation already is. Clients reasonably expect that their fees buy accuracy and professional diligence, yet as attorneys lean on AI shortcuts, those clients may end up paying the bill for a machine’s mistakes.
These hallucinations don’t just threaten legal outcomes; they may also reinforce inequality by making justice even harder to access for those who can least afford to fight it.