
Anthropic’s Lawyers Take Responsibility For AI Hallucination In Copyright Lawsuit
On Thursday, a lawyer for Anthropic acknowledged using an incorrect citation generated by the company’s AI chatbot, Claude, in an ongoing copyright case in Northern California. The AI company is in a legal battle with Universal Music Group (UMG), which sued Anthropic over the use of copyrighted song lyrics to train its AI models.
In a rush? Here are the quick facts:
- Lawyer defending Anthropic in court acknowledged using an incorrect citation generated by Claude.
- The error was addressed by UMG’s lawyers during the ongoing music copyright case in Northern California.
- Dukanovic described the situation as an “embarrassing and unintentional mistake.”
The lawyer, Ivana Dukanovic, an associate at Latham & Watkins, stated in a court filing that Claude made a citation error by listing the wrong title and author for an article used in the case. However, she noted that the publication, link, year, and content of the article were correct.
“Our investigation of the matter confirms that this was an honest citation mistake and not a fabrication of authority,” states the document. “We apologize for the inaccuracy and any confusion this error caused.”
According to Reuters, Dukanovic’s explanation comes after UMG’s lawyers said data scientist Olivia Chen used AI-fabricated sources to defend Anthropic in the case.
Dukanovic explained that Chen had cited the correct article from the journal The American Statistician, but that lawyers at Latham & Watkins added the incorrect footnote provided by Claude.
“After the Latham & Watkins team identified the source as potential additional support for Ms. Chen’s testimony, I asked Claude.ai to provide a properly formatted legal citation for that source using the link to the correct article,” explained the lawyer. “Unfortunately, although providing the correct publication title, publication year, and link to the provided source, the returned citation included an inaccurate title and incorrect authors. Our manual citation check did not catch that error.”
Dukanovic described the situation as an “embarrassing and unintentional mistake” and said the firm has implemented new safeguards to ensure it doesn’t happen again.
A few weeks ago, at an earlier stage of the case, a federal judge had ruled in favor of Anthropic, finding UMG’s requests too broad. This new error could jeopardize Anthropic’s advantage in the case.
In recent months, multiple lawyers in the United States have submitted incorrect AI-generated documents in court, prompting judicial concern and sanctions. This week, a judge fined two law firms $31,000 over fake AI-generated legal citations.