ChatGPT, Incorrect Citations, and Sanctions

Attorney Lombardi
Law Firm Sanctioned for Citations Generated by ChatGPT — Implications for Personal Injury Pleadings
In Mata v. Avianca, Inc., a U.S. District Court dismissed a personal injury claim and sanctioned plaintiff’s counsel for submitting fake case citations generated by ChatGPT. The incident raises important ethical and practical issues for personal injury practitioners who rely on AI tools in their pleadings.
Facts & Court Action
- The plaintiff alleged he was injured when a metal service cart struck his knee aboard a flight.
- In their filings, counsel cited authorities that turned out to be fabricated: cases invented by ChatGPT, not real precedents.
- The Court dismissed the personal injury case and fined the lawyers $5,000 for relying on the fabricated, AI-generated precedent.
Legal & Ethical Analysis
- Duty of verification: Even when using AI tools, attorneys remain responsible for ensuring the accuracy and authenticity of legal citations. Courts will not excuse reliance on fictitious cases.
- Risks of overreliance on AI: The case is a cautionary tale. AI can produce plausible-sounding but incorrect or fabricated legal material, and uncritical reliance risks sanctions, loss of credibility, and dismissal.
- Discovery and motion-stage risks: Opposing counsel may scrutinize citations and flag suspicious or nonexistent authority. Sanctions or motions to strike may follow.
- Best practices for AI in legal drafting:
  - Use AI as a supplement to, not a substitute for, legal research.
  - Always cross-check authority in official law databases (Westlaw, Lexis, public reporters).
  - Maintain logs or records of how AI was used, and flag AI-generated content carefully.
  - Avoid copying large chunks of AI output without vetting, editing, and verifying it.
Takeaway for Personal Injury Lawyers
This ruling serves as a red flag for personal injury and tort practitioners: the allure of AI must be tempered by rigorous human oversight. In complex cases, especially when dispositive motions or appeals are expected, attorneys cannot gamble with the integrity of their cited authority. A single misstep can sink an entire case.
The Denver Post recently published a story by Shelly Bradbury, dated October 3, 2025, titled "Denver DA failed to disclose police records in as many as 756 criminal cases," with the subheadline "'Technical issue' kept some police files from reaching defendants in hundreds of cases since 2022."
The Denver District Attorney’s Office failed to share police records with defense attorneys in as many as 756 criminal cases since 2022, potentially violating court discovery rules, a probe by the office found.
I doubt this was purposeful; more likely it was a computer problem. We lawyers and judges will see more of this in the future. It is a growing pain that comes with trying to streamline the system to make it work more efficiently.
We all need to check twice to confirm things are correct before hitting the send button.
If you have questions about your Iowa personal injury or workers’ compensation case, call Mr. Lombardi.
Steve Lombardi
