Case Report: A Lawyer Cited Five Fake AI-Generated Cases (Ayinde v London Borough of Haringey)
A shocking UK court case exposes how fake AI-generated legal citations slipped into court, raising serious ethical questions for the legal profession.
A recent English court case has sent shockwaves through the legal world after a barrister submitted legal arguments based on completely made-up cases, likely generated by artificial intelligence. What unfolded was a gripping lesson in digital overreach, professional responsibility, and the real-world risks of relying on AI without fact-checking.
🏛️ Court: King’s Bench Division, High Court (Administrative Court)
🗓️ Judgment Date: 30 April 2025
🗂️ Case Number: [2025] EWHC 1040 (Admin)
The Misuse of AI in Legal Proceedings
In the Ayinde v London Borough of Haringey case, a major legal issue arose not from the substance of the homelessness claim, but from how the claimant’s legal representatives presented their arguments: specifically, their use (or misuse) of artificial intelligence (AI) to cite legal authorities.
This spotlighted a troubling and increasingly relevant issue in the legal profession: unverified reliance on AI tools when drafting legal documents.
At the core of the controversy were five entirely fake legal cases cited in the claimant’s written arguments.
These cases were presented as if they were real High Court or Court of Appeal decisions, complete with legal principles supposedly extracted from them.
The five fictitious cases generated by AI, submitted to the court by the claimant’s legal team, were:
R (on the application of El Gendi) v. Camden London Borough Council [2020] EWHC 2435 (Admin)
R (on the application of Ibrahim) v. Waltham Forest LBC [2019] EWHC 1873 (Admin)
R (on the application of H) v. Ealing London Borough Council [2021] EWHC 939 (Admin)
R (on the application of KN) v. Barnet LBC [2020] EWHC 1066 (Admin)
R (on the application of Balogun) v. London Borough of Lambeth [2020] EWCA Civ 1442
When the opposing counsel searched for these five cases, none of them could be found: they did not exist. 😳📚
This was not just a minor mistake.
Submitting fake case law, knowingly or carelessly, can mislead the court, undermine justice, and violate a lawyer’s ethical duties.
While the barrister claimed the cases had been “dragged and dropped” from a personal archive, the court found this explanation unconvincing. The judge could not confirm whether an AI tool such as ChatGPT had been used to generate these citations, but considered it a strong possibility.
Why is this a big deal?
Because AI tools, while powerful, can “hallucinate”: they sometimes invent facts, quotes, or, as in this case, entire legal decisions. Lawyers are expected to verify everything they submit, and failing to do so not only damages the credibility of the profession but also wastes the court's time and resources.
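Part of what makes hallucinated citations so dangerous is that they look perfectly genuine on their face. As a rough illustration, here is a minimal Python sketch (the regex is a simplified approximation of the UK neutral citation format, not an exhaustive one): all five fabricated citations from this case pass a purely syntactic check, which is exactly why existence must be verified against an authoritative source such as a law report database, not just eyeballed.

```python
import re

# Simplified pattern for UK neutral citations, e.g.
# "[2020] EWHC 2435 (Admin)" or "[2020] EWCA Civ 1442".
# Illustrative only; real citation formats have more variants.
NEUTRAL_CITATION = re.compile(
    r"\[\d{4}\]\s+(?:"
    r"EWHC\s+\d+\s+\((?:Admin|Ch|QB|KB|Fam|Pat|TCC|Comm)\)"
    r"|EWCA\s+(?:Civ|Crim)\s+\d+"
    r")"
)

# The five fabricated citations submitted in this case.
fake_citations = [
    "[2020] EWHC 2435 (Admin)",
    "[2019] EWHC 1873 (Admin)",
    "[2021] EWHC 939 (Admin)",
    "[2020] EWHC 1066 (Admin)",
    "[2020] EWCA Civ 1442",
]

# Every fabricated citation is syntactically flawless,
# so a format check alone cannot expose the fraud.
for citation in fake_citations:
    assert NEUTRAL_CITATION.fullmatch(citation)
```

In other words, the only reliable safeguard is the old-fashioned one: look each case up and read it.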