
After repeatedly denying for weeks that his force used AI tools, the chief constable of the West Midlands Police has finally admitted that a hugely controversial decision to ban Maccabi Tel Aviv football fans from a match in the UK did involve hallucinated information from Microsoft Copilot.
In October 2025, Birmingham’s Safety Advisory Group (SAG) met to decide whether an upcoming football match between Aston Villa (based in Birmingham) and Maccabi Tel Aviv could be held safely.
Tensions were heightened in part by an October 2 terror attack on a synagogue in Manchester, in which two people were killed by an Islamist attacker.
West Midlands Police, a key member of the SAG, argued that the upcoming match could lead to violence in Birmingham and recommended banning the visiting fans from the game. The police pointed specifically to claims that Maccabi Tel Aviv fans had been violent at a recent match in Amsterdam.
This decision was hugely controversial, and it quickly became political. To some Jews and conservatives, it looked like Jewish fans were being banned from the match even though Islamist terrorism was the more serious source of violence. The match went ahead on November 6 without away fans, but the controversy around the ban has persisted for months.
Making it worse was the fact that the West Midlands Police narrative rapidly fell apart. According to the BBC, police claimed that "500-600 Maccabi fans [had] targeted Muslim communities the night before the Amsterdam fixture," with "serious assaults including throwing random members of the public" into a river. Police also claimed that 5,000 officers were needed to deal with the unrest in Amsterdam, after previously putting that figure at 1,200.
Amsterdam police made clear that the West Midlands account of bad Maccabi fan behavior was highly exaggerated, and the BBC recently obtained a letter from the Dutch inspector general confirming that the claims were inaccurate.
But it was one flat-out error—a small one, really—that has made the West Midlands Police recommendation look particularly shoddy. In a list of recent games with Maccabi Tel Aviv fans present, the police included a match between West Ham (UK) and Maccabi Tel Aviv. The only problem? No such match occurred.
So where had this wholly invented detail come from? As an inquiry into the whole situation was mounted, Craig Guildford, the chief constable of the West Midlands Police, was hauled before Parliament in December 2025 and again in early January 2026 to answer questions. Both times, he claimed the police did not use AI, the obvious suspect in a case like this. In December, Guildford blamed "social media scraping" gone wrong; in January, he chalked it up to some bad Googling.
“We do not use AI,” he told Parliament on January 6. “On the West Ham side of things and how we gained that information, in producing the report, one of the officers would usually go to… a system, which football officers use all over the country, that has intelligence reports of previous games. They did not find any relevant information within the searches that they made for that. They basically Googled when the last time was. That is how the information came to be.”
But Guildford admitted this week that this explanation was, in fact, bollocks. As he acknowledged in a letter on January 12, “I [recently] became aware that the erroneous result concerning the West Ham v Maccabi Tel Aviv match arose as result of a use of Microsoft Co Pilot.”
He had not intended to deceive anyone, he added, saying that “up until Friday afternoon, [I] understood that the West Ham match had only been identified through the use of Google.”
This has made a bad situation even worse. Today, in the House of Commons, Home Secretary Shabana Mahmood gave a long statement on the case in which she threw Guildford under the bus and backed over him five or six times.
Home Secretary Shabana Mahmood making a statement in Parliament today.
Mahmood blamed the ban on "confirmation bias" by the police. She said the Amsterdam stories they used were "exaggerated or simply untrue." And she highlighted the fact that Guildford had claimed "AI tools were not used to prepare intelligence reports," even though "AI hallucination" was now said to be responsible.
The whole thing was a “failure of leadership,” and Guildford “no longer has my confidence,” she said.
This last bit was something that everyone in the UK appears to agree on. Conservatives want Guildford to go, too, with party leaders calling for his resignation. MP Nick Timothy has been ranting for days on X about the issue, especially the fact that hallucination-prone AI tools are being used to inform security decisions.
“More detail on the misuse of AI by the police,” he wrote today. “They didn’t just deny it to the home affairs committee. They denied it in FOI requests. They said they have no AI policy. So officers are using a new, unreliable technology for sensitive purposes without training or rules.”
