TechAI Analysis

Microsoft Copilot AI Hallucination Leads to Wrongful Police Ban


West Midlands Police admit that a ban on Maccabi Tel Aviv fans was based on a Microsoft Copilot AI hallucination. Explore the impact of AI errors in law enforcement.

A weeks-long wall of denial has finally crumbled. The chief constable of the West Midlands Police admitted that a controversial decision to ban Maccabi Tel Aviv fans from the UK was fueled by Microsoft Copilot hallucinations, sparking a firestorm over AI accountability in law enforcement.

The Microsoft Copilot AI Hallucination Fallout

According to reports, the controversy dates back to October 2025, when Birmingham’s Safety Advisory Group (SAG) met to assess the risks of a match between Aston Villa and Maccabi Tel Aviv. Tensions were high following a terror attack at a Manchester synagogue on October 2, in which an attacker killed several people. Amid this sensitive climate, police turned to AI for rapid intelligence gathering.

However, the Microsoft Copilot tool generated hallucinated information (false data presented as fact) suggesting specific threats from the visiting fans. Based on this flawed output, authorities implemented a total ban, a move that civil liberties advocates now call a dangerous precedent of "algorithmic policing" without human oversight.

Trust and Technology in Turmoil

The admission comes after the force repeatedly denied that AI tools were used during the initial decision-making process. The reversal highlights a growing gap between the rapid adoption of generative AI and the absence of robust verification protocols in public safety sectors.

