“The change significantly reduces the impact of bias”
UK police forces successfully lobbied to use a facial recognition system known to be biased against women, young people, and ethnic minority groups.
UK forces use the Police National Database to conduct retrospective facial recognition searches, comparing a suspect’s image against more than 19 million custody photographs.
Last week, the Home Office admitted the technology was biased after a National Physical Laboratory review found higher misidentification rates for Black and Asian people and women.
The Home Office said it “had acted on the findings”, but documents reveal the bias had been known for more than a year.
Police leaders were first informed in September 2024, following a Home Office-commissioned review by the NPL.
That review found the system was more likely to suggest incorrect matches for women, Black people, and those aged 40 and under.
The National Police Chiefs’ Council ordered the confidence threshold required for potential matches to be raised, in order to reduce the bias.
The decision was reversed the following month after police forces complained the system produced fewer investigative leads.
According to The Guardian, NPCC documents show the proportion of searches returning potential matches dropped from 56% to 14% after the threshold was raised.
A recent NPL study found false positives for Black women could occur almost 100 times more frequently than for white women at certain settings.
Publishing those findings, the Home Office said: “The testing identified that in a limited set of circumstances the algorithm is more likely to incorrectly include some demographic groups in its search results.”
NPCC documents described the impact of the higher threshold, stating: “The change significantly reduces the impact of bias across protected characteristics of race, age and gender but had a significant negative impact on operational effectiveness”.
They added that forces had complained that “a once effective tactic returned results of limited benefit”.
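For illustration, the trade-off the documents describe can be sketched in a few lines of code: raising the confidence threshold on similarity scores suppresses low-confidence candidate matches, where misidentifications concentrate, but also cuts the number of investigative leads returned. The embedding model, the use of cosine similarity, and the threshold values below are assumptions chosen for illustration, not details of the Police National Database system.

```python
# Minimal, hypothetical sketch of threshold-gated face matching.
# All names, scales, and values are illustrative assumptions only.
import numpy as np

rng = np.random.default_rng(seed=0)

def embed(n: int, dim: int = 128) -> np.ndarray:
    """Stand-in for a face-embedding model: random unit vectors."""
    v = rng.normal(size=(n, dim))
    return v / np.linalg.norm(v, axis=1, keepdims=True)

gallery = embed(10_000)   # custody-image embeddings (illustrative scale only)
probe = embed(1)[0]       # the probe (suspect) image

# Cosine similarity reduces to a dot product for unit-length vectors.
scores = gallery @ probe

def candidate_matches(scores: np.ndarray, threshold: float) -> np.ndarray:
    """Indices of gallery images whose similarity clears the threshold."""
    return np.flatnonzero(scores >= threshold)

# Raising the threshold suppresses low-confidence candidates, but it
# also returns fewer leads overall, which is the operational trade-off
# the NPCC documents describe.
for t in (0.15, 0.25):
    print(f"threshold={t}: {candidate_matches(scores, t).size} candidate matches")
```

Under these assumptions, the higher threshold returns an order of magnitude fewer candidates, mirroring the drop in match rates the NPCC documents record.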
The government has launched a 10-week consultation on plans to widen the use of facial recognition technology.
Policing minister Sarah Jones has described the technology as the “biggest breakthrough since DNA matching”.
Professor Pete Fussey, a former independent reviewer of the Met’s use of facial recognition, questioned police priorities:
“This raises the question of whether facial recognition only becomes useful if users accept biases in ethnicity and gender.
“Convenience is a weak argument for overriding fundamental rights, and one unlikely to withstand legal scrutiny.”
Abimbola Johnson, chair of the police race action plan’s independent scrutiny board, criticised the lack of oversight:
“There was very little discussion through race action plan meetings of the facial recognition rollout despite obvious cross-over with the plan’s concerns.
“These revelations show once again that the anti-racism commitments policing has made through the race action plan are not being translated into wider practice.
“Our reports have warned that new technologies are being rolled out in a landscape where racial disparities, weak scrutiny and poor data collection already persist.
“Any use of facial recognition must meet strict national standards, be independently scrutinised, and demonstrate it reduces rather than compounds racial disparity.”
Chief Constable Amanda Blakeman, NPCC lead for the Police National Database, said safeguards had been introduced:
“The decision to revert to the original algorithm threshold was not taken lightly and was made to best protect the public from those who could cause harm, illustrating the balance that must be struck in policing’s use of facial recognition.
“Following the identification of bias, we reissued and promoted training and guidance to Police National Database (PND) users to ensure all existing safeguards were used.
“We are confident these safeguards protect the public from the identified bias and enable us to use retrospective facial recognition responsibly and transparently.”
A Home Office spokesperson said action was already underway.
They said: “The Home Office takes the findings of the report seriously and we have already taken action.
“A new algorithm has been independently tested and procured, which has no statistically significant bias.
“It will be tested early next year and will be subject to evaluation.
“Our priority is protecting the public. This game-changing technology will support police to put criminals and rapists behind bars.
“There is human involvement in every step of the process and no further action would be taken without trained officers carefully reviewing results.”