"it is more likely to incorrectly match black and Asian people"
Ministers are under pressure to tighten safeguards on facial recognition after the Home Office admitted the technology is more likely to misidentify black and Asian people at some settings.
The Home Office said it was “more likely to incorrectly include some demographic groups in its search results”.
Police and crime commissioners said the findings “shed light on a concerning inbuilt bias” and urged caution as the government moves towards a national expansion of the technology.
The results emerged hours after policing minister Sarah Jones described facial recognition as the “biggest breakthrough since DNA matching”.
The technology scans faces and cross-references images against watchlists of known or wanted criminals. It can be deployed on live camera feeds in public spaces or used retrospectively to identify suspects through police, passport or immigration databases.
At a lower system setting, analysts found major racial disparities.
The report said: “The false positive identification rate (FPIR) for white subjects (0.04%) is lower than that for Asian subjects (4.0%) and black subjects (5.5%).
“The FPIR for black male subjects (0.4%) is lower than that for black female subjects (9.9%).”
The Association of Police and Crime Commissioners said the data confirmed an inbuilt bias within the system.
It said: “This has meant that in some circumstances it is more likely to incorrectly match black and Asian people than their white counterparts.
“The language is technical but behind the detail it seems clear that technology has been deployed into operational policing without adequate safeguards in place.”
The statement questioned why the findings were not shared sooner with affected communities.
It said: “Although there is no evidence of adverse impact in any individual case, that is more by luck than design.
“System failures have been known for some time, yet these were not shared with those communities affected, nor with leading sector stakeholders.”
The government has launched a 10-week public consultation aimed at expanding police use of facial recognition.
The public will be asked whether police should be allowed to search beyond existing records, including passport and driving licence databases, to locate suspects.
Civil servants are also working with police on a new national facial recognition system expected to hold millions of images.
Charlie Whelton, a policy and campaigns officer at Liberty, warned of serious consequences:
“The racial bias in these stats shows the damaging real-life impacts of letting police use facial recognition without proper safeguards in place.
“With thousands of searches a month using this discriminatory algorithm, there are now serious questions to be answered over just how many people of colour were falsely identified, and what consequences this had.
“This report is yet more evidence that this powerful and opaque technology cannot be used without robust safeguards in place to protect us all, including real transparency and meaningful oversight.
“The government must halt the rapid rollout of facial recognition technology until these are in place to protect each of us and prioritise our rights – something we know the public wants.”
Concerns have also been raised by senior politicians. Former cabinet minister David Davis reacted after police leaders suggested cameras could be placed at shopping centres, stadiums and transport hubs:
“Welcome to Big Brother Britain. It is clear the government intends to roll out this dystopian technology across the country.
“Something of this magnitude should not happen without full and detailed debate in the House of Commons.”
Officials maintain that facial recognition is necessary to catch serious offenders. They say manual safeguards are built into training, operational practice and guidance.
These require all potential matches returned from the police national database to be visually reviewed by trained users and investigating officers.
A Home Office spokesperson said:
“The Home Office takes the findings of the report seriously and we have already taken action.
“A new algorithm has been independently tested and procured, which has no statistically significant bias. It will be tested early next year and will be subject to evaluation.
“Given the importance of this issue, we have also asked the police inspectorate, alongside the forensic science regulator, to review law enforcement’s use of facial recognition.
“They will assess the effectiveness of the mitigations, which the National Police Chiefs’ Council supports.”
The findings now place the government’s expansion plans under renewed scrutiny, as campaigners and commissioners demand stronger oversight before further rollout.