The Independent UK
Josh Marcus

The new public defender: Some are turning to ChatGPT to offer legal advice and win small claims cases

There has been an explosion of people turning to artificial intelligence tools like ChatGPT and Perplexity for legal advice in recent years, with litigants in some cases ditching human lawyers altogether in favor of AI.

“It was like having God up there responding to my questions,” Lynn White, who used both tools in an eviction case in Long Beach, California, told NBC News.

White, behind on payments for her mobile home, lacked the funds to hire a lawyer, and told the outlet that a court-appointed attorney lost in her initial case.

On appeal, White represented herself and turned to AI to scrutinize the judge’s decisions, research case law, and draft responses.

She was able to overturn the eviction notice and avoid over $70,000 in penalties and overdue rent.

A New Yorker told the Staten Island Advance in August that he was representing himself in a civil case against his former employer and found that using AI helped him plot the best course of action to “checkmate” the case.

“The research, thinking and drafting I did with AI in five days would have otherwise taken months and months, and multiple attorneys,” Richard Hoffmann, 42, of Annadale, told the outlet.

Though AI companies typically warn their customers not to rely entirely on these products for legal advice, AI possesses serious capabilities in this arena. OpenAI’s GPT-4 model was able to pass a bar exam in 2023 with high marks, researchers found.

Court systems and legal advocates have gotten in on the action. A 2024 Stanford University analysis found that three out of four lawyers plan to use generative AI in their practice, and the legal profession has explored how AI tools might increase access to justice.

The Alaska Court System is developing an AI-powered chatbot called the Alaska Virtual Assistant to assist people representing themselves in probate estate cases. The legal nonprofit Public Counsel has also offered workshops on how to use AI for those representing themselves.

Still, despite the enthusiasm and some success stories, others point to AI's penchant for "hallucinations," false claims the tools pass off as true, as reason to be wary.


Last month, a California attorney was hit with a $10,000 fine for filing a state court appeal filled with inaccurate legal quotes generated by ChatGPT. A judge noted that 21 of the 23 citations were entirely fabricated.

In July, two lawyers for MyPillow CEO Mike Lindell were ordered to pay $3,000 each after allegedly using AI to file a document that cited nonexistent cases and misquoted case law.

A Thomson Reuters Institute study examining cases between June 30 and August 1 found "pervasive" hallucinations in filings. Researchers identified 22 cases in which courts or opposing parties found nonexistent cases quoted in legal filings, in many instances leading to disciplinary measures or sanctions against those who introduced them.

In one noted case, a dispute between a family and a school board, a self-represented defendant was found to have submitted pleadings with 42 "nonexistent legal authorities."

It led a judge to conclude that “an AI program hallucinated them in an effort to meet whatever [the defendant’s] desired outcome was based on the prompt that she put into the AI program.”
