We are now at 58 reported AI hallucination cases (suspected or confirmed) in the UK.

We have over 1100 internationally. #aihallucinations #ailaw #ai

https://naturalandartificiallaw.com/ai-hallucination-cases-uk-58/

UK AI Hallucination Cases: 4 New Cases, 58 in Total

UK AI hallucination cases now stand at 58. This update reviews three new decisions, a possible Irish incident, and what the judgments suggest.

Natural and Artificial Law

Re-sharing BuzzStream's "Do Americans Use AI for News?" - https://www.buzzstream.com/blog/ai-news-usage

The hyperlink "getting better at hallucinating" in their piece goes to my article, "Are AI Hallucinations Getting Better or Worse? We Analyzed the Data".

#AI #AIResearch #AIHallucinations

Do Americans Use AI for News?

We surveyed 1,000 Americans to understand if and how they interact with AI in getting their news. The results are eye-opening.

BuzzStream
The State Bar of my state (the organization that handles licensing of lawyers) sent a letter to its members warning them about using #ai to prepare court filings. Apparently lawyers have been disciplined by judges for "ai hallucinations," which resulted in fictitious court cases being cited and referenced. I'd never heard that term before. 😂 #AiHallucinations

Amelia Hill: US startup advertises ‘AI bully’ role to test patience of leading #chatbots. $800-a-day position involves exposing a chatbot’s inconsistencies as it forgets, fudges or hallucinates.

"The only prerequisite is having an 'extensive personal history of being let down by #technology' – and the patience to ask the same question over and over again."

#AI #bullying #AIHallucinations #job
https://www.theguardian.com/technology/2026/mar/19/us-startup-advertises-ai-bully-role-to-test-patience-of-leading-chatbots

US startup advertises ‘AI bully’ role to test patience of leading chatbots

$800-a-day position involves exposing a chatbot’s inconsistencies as it forgets, fudges or hallucinates

The Guardian

University of Waterloo: Top AI coding tools make mistakes one in four times. “Even the most advanced models achieved only about 75 per cent accuracy in the tests, while open-source models performed closer to 65 per cent. The study evaluated 11 LLM models across 18 structured output formats and 44 tasks designed to assess how reliably the systems followed structured rules.”

https://rbfirehose.com/2026/03/20/university-of-waterloo-top-ai-coding-tools-make-mistakes-one-in-four-times/
University of Waterloo: Top AI coding tools make mistakes one in four times

ResearchBuzz: Firehose

"Today, AI is rapidly changing the way we build software, and the pace of that change is only accelerating," said Astral founder Charlie Marsh.

"If our goal is to make programming more productive, then building at the frontier of AI and software feels like the highest-leverage thing we can do."

RE: https://seekingalpha.com/news/4566413-openai-plans-to-acquire-startup-astral-to-boost-codex-abilities

#SayNoToMANGO #DataSecurity #Privacy #PrivacyMatters #JobSecurity #HumanityOverProfits #SlopWare #AIStealsJobs #AISlop #VibeCoding #VibeSlop #AIHallucinations

Techdirt: A Reddit Post, An AI Hallucination, And Two Lawyers Who Never Checked Citations Walk Into A Dog Custody Case. “A hallucinated citation traveled through an entire legal proceeding — from a Reddit blog post to a client’s declaration to an attorney’s letter to the opposing attorney’s draft of the court order to the judge’s signature to appellate filings — and at no point […]”

https://rbfirehose.com/2026/03/18/techdirt-a-reddit-post-an-ai-hallucination-and-two-lawyers-who-never-checked-citations-walk-into-a-dog-custody-case/
Techdirt: A Reddit Post, An AI Hallucination, And Two Lawyers Who Never Checked Citations Walk Into A Dog Custody Case

ResearchBuzz: Firehose

Gothamist: AI v. Nicki Minaj: How chatbots are colliding with NY’s court system. “The state’s court system acknowledges that AI’s use in legal proceedings has become ‘increasingly common’ and that ‘judges are seeking guidance’ on how to deal with it in their courtrooms, according to an October memorandum issued by an advisory committee established in 2024 to ensure the technology is […]”

https://rbfirehose.com/2026/03/17/ai-v-nicki-minaj-how-chatbots-are-colliding-with-nys-court-system-gothamist/
AI v. Nicki Minaj: How chatbots are colliding with NY’s court system (Gothamist)

ResearchBuzz: Firehose

Are AI hallucinations getting better or worse? We analyzed the data.

See the report here: https://scottgraffius.com/blog/files/ai-hallucinations-2026.html

#AI #AIHallucinations #AISafety #AIResearch #AIErrors

AI Hallucinations: Causes, Effects, and Mitigation Techniques

Introduction: Artificial Intelligence tools such as ChatGPT, Claude, and Gemini have transformed the way students and researchers access information. They can summarize papers, explain theories, draft outlines, and even assist in coding or data analysis. However, these systems sometimes produce confident but incorrect information. In AI terminology, this phenomenon is called hallucination. Unlike human errors that come from […]

https://solomonaganai.wordpress.com/2026/03/15/ai-hallucinations-causes-effects-and-mitigation-techniques/

AI Hallucinations: Causes, Effects, and Mitigation Techniques

Solomon Agan -AI in Education Consultant