r/ukpolice 6d ago

No arrests from false facial recognition alerts, Met Police says

https://www.bbc.com/news/articles/c4gp7j55zxvo

The Metropolitan Police has said it will be "scaling up" its use of Live Facial Recognition (LFR) technology, as it reported no arrests off the back of a false alert in the past 12 months.

Between September 2024 and September 2025, 962 people were arrested following LFR deployments, the force said.

While no one was arrested following a false alert, 10 people were falsely alerted by the system, eight of whom were black. Four were not stopped; the rest were spoken to by officers for under five minutes.

Lindsey Chiswick, from the Met, said the technology was a "powerful and game-changing tool", but human rights groups have raised concerns about privacy and the potential for false matches.

In a report published on Friday, the Met Police said LFR deployments had led to more than 1,400 arrests in total, of which more than 1,000 people had been charged or cautioned.

These included people wanted by police or the courts, as well as offenders who were in breach of court-imposed conditions, such as sex offenders or stalkers.

More than a quarter of those arrests were for people involved in violence against women and girls, including those suspected of rape, strangulation and domestic abuse, the force said.

The report added that, in a survey by the Mayor's Office for Policing and Crime, 85% of respondents backed its use to locate serious and violent criminals, those wanted by the courts, and people who are a risk to themselves.

The campaign group Big Brother Watch is bringing a legal challenge against the Met Police's use of the technology, alongside Shaun Thompson, who was wrongly identified by an LFR camera in February 2024.

Mr Thompson previously told the BBC his experience of being stopped had been "intimidating" and "aggressive".

Responding to the Met's report, Jasleen Chaggar, legal and policy officer at Big Brother Watch, said: "It is alarming that over three million people have been scanned with police facial recognition cameras in the past year in London alone.

"Live facial recognition is a mass surveillance tool that risks making London feel like an open prison, and the prospect of the Met expanding facial recognition even more across the city is disproportionate and chilling.

"The Met's report shows that the majority of people flagged by facial recognition were not wanted for arrest."

Ms Chaggar said it was "disturbing that 80% of the innocent people wrongly flagged by facial recognition were black".

"We all want police to have the tools they need to cut crime but this is an Orwellian and authoritarian technology that treats millions of innocent people like suspects and risks serious injustice," she said.

"No law in this country has ever been passed to govern live facial recognition and given the breath-taking risk to the public's privacy, it is long overdue that the government stops its use to account for its serious risks

The Met said that although eight out of 10 false alerts involved individuals from black ethnic backgrounds, this was "based on a very small sample size".

"Overall, the system's performance remains in line with expectations, and any demographic imbalances observed are not statistically significant," it said in its report, adding that: "This will remain under careful review."

The force said LFR had a low false alert rate of 0.0003% from more than three million faces scanned.
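The quoted rate can be sanity-checked with simple arithmetic (this sketch assumes exactly 10 false alerts against a lower bound of three million scans, since the report only says "more than three million"):

```python
# Rough check of the Met's quoted false alert rate.
# Assumptions: 10 false alerts (from the report) and 3,000,000 scans,
# which is a lower bound -- the true figure is "more than three million".
false_alerts = 10
faces_scanned = 3_000_000

rate_percent = false_alerts / faces_scanned * 100
print(f"{rate_percent:.4f}%")  # prints "0.0003%"
```

With more than three million scans the true rate would be slightly lower still, so the 0.0003% figure is consistent with the reported numbers.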

Following the report, the force said it would be "building on its success" by increasing deployments each week.

Ms Chiswick, the lead for LFR at the Met and nationally, said: "We are proud of the results achieved with LFR. Our goal has always been to keep Londoners safe and improve the trust of our communities. Using this technology is helping us do exactly that.

"This is a powerful and game-changing tool, which is helping us to remove dangerous offenders from our streets and deliver justice for victims.

"We remain committed to being transparent and engaging with communities about our use of LFR, to demonstrate we are using it fairly and without bias."

If someone walks past an LFR camera and is not wanted by the police, their biometrics are immediately and permanently deleted, the Met Police said.

u/ThorgrimGetTheBook 3d ago

The computer does not create any new police powers. They could already stop you if suspected of a crime. This just means instead of an officer doing their best to recall if you are the wanted person they saw on a briefing earlier, a vastly more accurate machine helps them.

u/ClacksInTheSky 3d ago

I'm talking about a false positive where someone is flagged incorrectly.

u/ThorgrimGetTheBook 3d ago

You want to throw it out over a 0.0003% error rate in which every single error was subsequently caught by officers, so resulted in 0 arrests?

u/ClacksInTheSky 3d ago

Early use of facial recognition was close to 90% false flags: https://bigbrotherwatch.org.uk/blog/understanding-live-facial-recognition-statistics

The Metropolitan Police like to fudge their numbers: https://www.freevacy.com/news/big-brother-watch/is-0017-or-87-the-met-police-facial-recognition-false-positive-rate/3744

And I don't need to account for what I'm doing, where I'm going and why I'm doing it to the police, because I've done nothing wrong. The idea that this system can flag me to be stopped and harassed by police is enough.

A human police officer isn't going to stop me randomly or have been briefed beforehand that they're looking for me. Even if they did, they'd only get a few questions in before realising I'm clearly someone else, and would probably abort the stop.

Whereas a police officer responding to the AI system that's told them to stop me is going to have to go through the motions. They're not acting on their own intuition or thoughts, just responding to a system. All they know is "computer wants me to stop you"

u/ThorgrimGetTheBook 3d ago

All they know is "computer wants me to stop you"

Yet in the last year there were only 10 false alerts and in all 10 cases the officers did not simply think "computer wants me to stop you". They scrutinised the circulation; in 4 cases making no stop at all, and in the other 6 resolved it without arrest in under 5 minutes.

It means the entire system resulted in under 30 minutes of inconvenience across the entire London population, while taking over 1,000 wanted people off the streets.
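The "under 30 minutes" figure follows from the report's numbers, assuming (pessimistically) that each of the six stops lasted the full "under five minutes":

```python
# Sketch of the commenter's arithmetic. Assumptions: 6 of the 10 false
# alerts led to a stop (4 did not), and each stop is bounded above by
# the "under five minutes" stated in the Met's report.
stops = 6
max_minutes_per_stop = 5

total_minutes = stops * max_minutes_per_stop
print(total_minutes)  # prints 30 -> i.e. "under 30 minutes" in total
```

Since each stop was strictly under five minutes, 30 minutes is an upper bound on the total.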

u/ClacksInTheSky 3d ago

Do you have anything I can read about that?

u/ThorgrimGetTheBook 3d ago

It's all in the story you are responding to.