
Two attorneys for MyPillow founder Mike Lindell have been fined by a federal judge after filing an AI-generated court brief filled with fake citations—prompting a blistering rebuke over what the court called “gross carelessness.”
At a Glance
- Two of Mike Lindell’s attorneys were fined $3,000 each for filing an error-filled legal brief written using AI.
- The motion contained nearly 30 false or misquoted citations, including references to non-existent cases.
- Judge Nina Y. Wang denounced the filing as careless and the attorneys’ response as “puzzlingly defiant.”
- Lindell himself was not sanctioned, but the case adds pressure to regulate AI use in legal settings.
A Defiant Filing—and a Harsh Response
Attorneys Jennifer DeMaster and Christopher Kachouroff submitted the disputed brief in Lindell's ongoing legal battle with former Dominion Voting Systems executive Eric Coomer. The brief, generated using ChatGPT, cited fabricated precedents in an attempt to exclude trial evidence. According to the Colorado Sun, Judge Wang concluded that the errors were not accidental and said the attorneys failed to check or correct the fake citations, even after being given multiple opportunities to do so.
She issued sanctions under Rule 11 of the Federal Rules of Civil Procedure, which permits penalties for filings that demonstrate bad faith or reckless disregard for the truth. As reported by AP News, Judge Wang called their defense "troubling" and warned that AI tools do not relieve lawyers of their professional duty to vet legal arguments.
AI Hallucinations Spreading in Courtrooms
This is not an isolated incident. A Washington Post investigation found at least 95 similar AI-generated legal filings since mid-2023. In one Wyoming case, plaintiffs suing Walmart were sanctioned for citing fabricated rulings. Courts in Indiana, Utah, and California have issued similar reprimands in recent months.
The growing trend has judges, bar associations, and ethicists sounding the alarm. The American Bar Association has issued early guidelines on responsible AI use, but enforcement mechanisms remain limited. In Lindell’s case, the judge noted that both attorneys may be referred for bar discipline.
Ethics in the Age of Automation
As reported by the My Journal Courier, Judge Wang stressed that sanctions were not aimed at punishing Lindell himself but at upholding professional standards in an era of “automated misinformation.” While some firms have adopted internal policies to verify AI output, others continue to rely blindly on tools like ChatGPT—creating serious risks in high-stakes litigation.
In the end, the “puzzlingly defiant” tone of the attorneys’ responses may have sealed their fate. According to LiveMint, the judge warned that future offenses could result in far harsher penalties, adding: “This court derives no joy from these sanctions—but professional integrity must be preserved.”