Thames Valley Police Wrongly Arrest British-Asian Man After Racist AI Facial-Recognition Misidentification

26 February 2026 | 9 sources compared
Technology and Science

Key Points from 9 News Sources

  1. Alvi Choudhury, a software engineer from Southampton, was wrongly arrested for a Milton Keynes burglary

  2. He was detained and handcuffed at his Southampton home for nearly ten hours

  3. AI facial-recognition produced a racially biased match with another person of South Asian heritage

Full Analysis Summary

Wrongful facial-ID arrest

Thames Valley Police arrested a man after running CCTV of a Milton Keynes burglary through an automated face-matching search against police records, but the match was wrong and the individual was later released.

Brave Search reported that Alvi Choudhury, a 26-year-old software engineer of Bangladeshi origin, was wrongly arrested after Thames Valley Police ran burglary CCTV through a face-matching search against the Police National Database (PND).

The Guardian reported that Alvi Choudhury, 26, a software engineer from Southampton, was wrongly arrested in January after Thames Valley Police used automated facial-recognition software that matched him to CCTV footage of a suspect in a £3,000 burglary in Milton Keynes, about 100 miles away.

The sources describe Choudhury's background in different terms: Brave Search called him 'of Bangladeshi origin' while The Guardian described him as 'from Southampton'; the descriptions refer to heritage and residence respectively and are not mutually exclusive.

EasternEye summarised the event with the headline 'UK Police Wrongly Arrest Man in Facial ID Error'.

Coverage Differences

Narrative Framing

- thecanary.co (Other): Frames the incident as proof of systemic, racist AI and uses the case to criticise political moves to expand AI in policing and courts.
- BBC (Western Mainstream): Presents the police defence that the technology provided intelligence only and emphasises a distinction between retrospective PND searches and live camera van systems.
- The Guardian (Western Mainstream): Highlights both the victim’s account and official acknowledgement that the arrest “may have been the result of bias within facial recognition technology”, portraying systemic risk while noting police claim a human visual assessment led to arrest.

Alleged wrongful detention

The detained man says officers handcuffed him at home and held him for many hours despite strong alibi evidence and obvious differences between him and the CCTV image.

The Guardian reports he 'was handcuffed at home, held in custody for nearly 10 hours and released at 2 a.m.'

Attack of the Fanboy describes that he 'was completely shocked, detained for 11 hours despite having a strong alibi and evidence he was at work, and that the CCTV image did not resemble him (different age, skin tone, nose, facial hair and features); an officer reportedly laughed when he pointed this out.'

The Canary similarly says he 'was held for around 11 hours before officers reviewed evidence; the suspect did not resemble him and officers reportedly laughed when releasing him.'

The reports disagree on the detention length, citing nearly 10 hours, 11 hours, and around 11 hours.

Coverage Differences

Tone

- thecanary.co (Other): Strongly polemical and alarmist tone — treats the error as evidence of a racist, politically driven AI rollout and is scathing of government/party policy.
- LADbible (Western Tabloid): Sensational and emotive: frames the story as a shocking, dystopian error and stresses the personal indignity and bias angle.
- BBC (Western Mainstream): Measured, neutral tone — reports police statements and the victim’s account while emphasising official caveats about the technologies and their use.

Biometric identification failures

Reporting highlights technical and data-retention failures that enabled the misidentification.

The match reportedly came from an earlier custody photograph retained in police records, and the arrest relied on an algorithmic lead.

Brave Search explains: 'The algorithm returned a 2021 mugshot of Choudhury — from an earlier arrest that did not lead to charges — and officers arrested him; he was held about 9–10 hours and later released without charge.'

The Canary notes the match linked 'a custody photo of him taken four years earlier to a suspect in a city he has never visited (about 100 miles from his home).'

Attack of the Fanboy and other coverage point to two related failures, which Brave Search summarises as 'long-term retention of biometric records for people who were innocent or never charged, and operational overreliance on automated face-matching outputs without sufficient safeguards, training or independent verification'.

Coverage Differences

Contradictory Detention Times

- BBC (Western Mainstream): Reports the length of detention as over nine hours.
- The Guardian (Western Mainstream): Reports the detention as nearly ten hours and gives a release time (2am).
- The Times of India (Asian): States the man was detained for 10 hours.
- LADbible (Western Tabloid): Gives a longer detention figure, saying he was held for 11 hours before officers spoke to him.

Facial-recognition bias reports

Campaigners and multiple reports frame the case as an example of known racial and demographic biases in facial‑recognition technology and call for stronger safeguards.

The Canary quotes Liberty's director and others describing 'well-documented racial bias in face-recognition systems — trained predominantly on white faces — making people who are young, women, Asian or Black far more likely to be misidentified,' and says Hart cited a figure that black women were 250 times more likely to be misidentified.

LADbible cites published error‑rate figures used to underscore disparity: 'false-positive rates of 5.5% for Black faces and 4.0% for Asian faces versus 0.04% for white faces'.

Attack of the Fanboy notes similar documented higher false‑positive rates.

The coverage explicitly links those technical biases to the subject's claim that he was targeted because he is 'a brown person with curly hair' (LADbible).

Coverage Differences

Technical Detail/Omissions

- The Guardian (Western Mainstream): Provides technical and procurement detail (vendor, scale of searches, database size) and cites Home Office research on disparate false‑positive rates.
- BBC (Western Mainstream): Highlights the operational distinction between retrospective PND searches and live facial‑recognition in camera vans, but does not delve into vendor or monthly search statistics.
- The Times of India (Asian): Focuses on the victim’s experience and the wrongful-arrest narrative but omits vendor, search‑volume and Home Office technical figures.

Police AI controversy

Thames Valley Police apologised for the distress and said the arrest followed officers’ visual assessment of AI-provided intelligence.

The individual has launched legal action and campaigners demand transparency and reform.

LADbible records that Thames Valley Police apologised for the distress, saying the arrest was based on officers’ visual assessment following AI-provided intelligence, that the AI did not determine the arrest, that later enquiries cleared the individual, and that the force will continue using such tools while aiming to build trust.

Brave Search notes that Choudhury has sued Thames Valley Police and Hampshire Constabulary.

The Canary describes wider policy context and concerns, reporting that police AI lead Alex Murray admitted on 24 February that a new £115m national police data centre will produce discriminatory results, and that campaigners demand vendor and system transparency, external audits, removal policies, improved officer training and stronger oversight.

Sources spell the man's name inconsistently: Brave Search and The Guardian use 'Choudhury' while The Canary uses 'Choudary', a discrepancy the sources do not reconcile.

All 9 Sources Compared

Attack of the Fanboy

AI facial recognition technology wrongly targets innocent citizen, and the police’s response will leave you shocked


BBC

Mistaken arrest victim says police were laughing


Brave Search

alvi choudary


DESIblitz

British Bangladeshi Man Arrested after Facial Recognition Error


EasternEye

UK police wrongly arrest Asian man in Southampton after facial recognition error


LADbible

Dystopian error leads to arrest of man for burglary in city he's never visited


The Guardian

Facial recognition error prompts police to arrest Asian man for burglary 100 miles away


The Times of India

'Do I look like this?' Bangladeshi man wrongfully detained in UK after facial recognition software identi


thecanary.co

Wrongful arrest due to racist AI tech illustrates the disaster that is Labour's AI 'justice' initiative
