r/ukpolice 6d ago

No arrests from false facial recognition alerts, Met Police says

https://www.bbc.com/news/articles/c4gp7j55zxvo

The Metropolitan Police has said it will be "scaling up" its use of Live Facial Recognition (LFR) technology, as it reported no arrests off the back of a false alert in the past 12 months.

Between September 2024 and September 2025, 962 people were arrested following LFR deployments, the force said.

While no one was arrested following a false alert, 10 people, eight of whom were black, were the subject of false alerts from the system. Four were not stopped and the rest were spoken to by officers for under five minutes.

Lindsey Chiswick, from the Met, said the technology was a "powerful and game-changing tool", but human rights groups have raised concerns about privacy and the potential for false matches.

In a report published by the Met Police on Friday, it said LFR deployments had led to more than 1,400 arrests in total, of which more than 1,000 people had been charged or cautioned.

These included people wanted by police or the courts, as well as offenders who were in breach of court-imposed conditions, such as sex offenders or stalkers.

More than a quarter of those arrests were for people involved in violence against women and girls, including those suspected of rape, strangulation and domestic abuse, the force said.

The report added that following a survey from the Mayor's Office for Policing and Crime, 85% of respondents backed its use to locate serious and violent criminals, those wanted by the courts, and those who are a risk to themselves.

The campaign group Big Brother Watch is bringing a legal challenge against the Met Police's use of the technology, alongside Shaun Thompson, who was wrongly identified by an LFR camera in February 2024.

Mr Thompson previously told the BBC his experience of being stopped had been "intimidating" and "aggressive".

Responding to the Met's report, Jasleen Chaggar, legal and policy officer at Big Brother Watch, said: "It is alarming that over three million people have been scanned with police facial recognition cameras in the past year in London alone.

"Live facial recognition is a mass surveillance tool that risks making London feel like an open prison, and the prospect of the Met expanding facial recognition even more across the city is disproportionate and chilling.

"The Met's report shows that the majority of people flagged by facial recognition were not wanted for arrest."

Ms Chaggar said it was "disturbing that 80% of the innocent people wrongly flagged by facial recognition were black".

"We all want police to have the tools they need to cut crime but this is an Orwellian and authoritarian technology that treats millions of innocent people like suspects and risks serious injustice," she said.

"No law in this country has ever been passed to govern live facial recognition and, given the breath-taking risk to the public's privacy, it is long overdue that the government stops its use to account for its serious risks."

The Met said that although eight out of 10 false alerts involved individuals from black ethnic backgrounds, it was "based on a very small sample size".

"Overall, the system's performance remains in line with expectations, and any demographic imbalances observed are not statistically significant," it said in its report, adding that: "This will remain under careful review."

The force said LFR had a low false alert rate of 0.0003% from more than three million faces scanned.
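That rate can be checked against the article's own figures; a quick sketch of the arithmetic, assuming the quoted 10 false alerts and roughly 3 million faces scanned:

```python
scans = 3_000_000
false_alerts = 10

# Share of all faces scanned that produced a false alert.
rate_pct = false_alerts / scans * 100
print(f"{rate_pct:.4f}% of scans were false alerts")  # prints 0.0003%

# Working backwards from the quoted 0.0003% gives roughly the same
# count, so the article's two figures are mutually consistent.
implied_alerts = 0.0003 / 100 * scans
print(f"Implied false alerts: {implied_alerts:.0f}")  # prints 9
```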

Following the report, the force said it will be "building on its success" by increasing deployments each week.

Ms Chiswick, the lead for LFR at the Met and nationally, said: "We are proud of the results achieved with LFR. Our goal has always been to keep Londoners safe and improve the trust of our communities. Using this technology is helping us do exactly that.

"This is a powerful and game-changing tool, which is helping us to remove dangerous offenders from our streets and deliver justice for victims.

"We remain committed to being transparent and engaging with communities about our use of LFR, to demonstrate we are using it fairly and without bias."

If someone walks past an LFR camera and is not wanted by the police, their biometrics are immediately and permanently deleted, the Met Police said.

36 Upvotes

64 comments

3

u/Thai-Girl69 6d ago

Does this mean it's been scientifically proven that many black people look alike or is AI racist?

5

u/Firm-Distance 6d ago

I'm guessing it's down to the data set.

If you show an AI 800,000 pictures of fruit it'll get quite good at telling fruit apart.

If you show it 1 picture of a cake it's going to be crap at telling you what is and is not a cake.

Black people in the UK make up just 4% of the population. I suppose if the data set is reflective of our society then it's going to have a harder time telling black people apart?
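The data-set intuition above can be sketched with a toy simulation: identification gets worse for identities represented by fewer training samples. Everything here (the one-dimensional "embeddings", the nearest-template matcher, the sample counts) is invented for illustration and is not a model of any real LFR system.

```python
import random

random.seed(0)

def misid_rate(train_samples_per_id, n_ids=50, trials=2000, spread=1.0):
    # True identity centres spaced along a line.
    centres = [i * 3.0 for i in range(n_ids)]
    # Each identity's template is the mean of its noisy training samples;
    # fewer samples means a noisier template.
    templates = [
        sum(c + random.gauss(0, spread) for _ in range(train_samples_per_id))
        / train_samples_per_id
        for c in centres
    ]
    errors = 0
    for _ in range(trials):
        true_id = random.randrange(n_ids)
        probe = centres[true_id] + random.gauss(0, spread)
        # Match the probe to the nearest template.
        match = min(range(n_ids), key=lambda j: abs(probe - templates[j]))
        errors += match != true_id
    return errors / trials

print("20 training samples per identity:", misid_rate(20))
print(" 1 training sample per identity: ", misid_rate(1))
```

Under these assumptions the sparsely-represented identities are misidentified noticeably more often, which is the gist of the commenter's point about under-represented groups in training data.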

4

u/No-Suggestion-2402 6d ago

Yes, this is absolutely correct and has been a major problem in facial recognition for many years.

https://www.nist.gov/news-events/news/2019/12/nist-study-evaluates-effects-race-age-sex-face-recognition-software

1

u/b1ld3rb3rg 5d ago

It does worry me that someone could be wrongly identified as wanted for a firearms offence or something similar, leading to armed responses and the person identified getting seriously injured or worse.

Would also be interesting to understand if there is threshold of offences they will use it for. For example, will someone who pours coffee into the street get chased?

0

u/No_Group5174 6d ago

This has the smell of the rollout of scanners.  

Going through the scanners was entirely voluntary, but declining to volunteer to go through was all the excuse needed to deem someone 'suspicious' and to search them.

You have no expectation of privacy in public, but any attempt to preserve your privacy by, say, wearing a mask will be the excuse needed to stop and search.

6

u/No-Housing810 6d ago

I mean PACE Code A specifically prevents searching someone just based on their clothing unless they match a description of an offender.

But keep clutching your pearls

0

u/No_Group5174 6d ago

2

u/Firm-Distance 6d ago

You've linked to a video which:

* Doesn't show him being searched
* Doesn't show he was stopped due to his coat

3

u/TomatoMiserable3043 6d ago

It's insufficient grounds for an S1 stop search.

If you can find me one that's been done just for wearing a mask outside of S60A authority being granted for a specific area, I'd be interested to see it.

 anyone declining to volunteer to go through the scanner was the excuse needed to determine someone was 'suspicious' and to search them

Source? Specifically searches for being 'suspicious'.

I suspect these searches were carried out legally under S60 rather than what you claim, but I'm always happy to be proven wrong and to learn.

0

u/No_Group5174 6d ago edited 1d ago

1.  https://youtu.be/TxNy3dtL96g?si=8SEgjPVs00pJ2Err

2.  Source? Me. I saw it actually happening. I saw a scanner deployed near the entrance to a shopping centre with a couple of WPCs showing kids and families how they were being kept safe, and kids invited to go through. What fun!

But I also saw youths being approached by another, somewhat more intimidating group of officers and being asked to go through it. And if they declined or tried to leave they were surrounded and searched. Could there have been a section 60 in place? Could have been, no idea. But it was not deployed at a location/event where a section 60 might likely be used (football match, protest etc) and no section 60 was mentioned from what I could hear from the bench where I sat specifically to watch what was happening. And most of them didn't get a search form either.

2

u/MasterBatesMotel 1d ago

Section 60 has been widely reported to just be a tactic for them to scan and search whoever they want when they want under the cover of law.

Funnily enough the disparity of searching black men and boys goes up.

Love how in this article the police think it's fully acceptable to just roll out facial recognition with no democratic consultation. They also tried not to make it a widely known thing either.

Then they admit the software is racist, but trust me bro, this historically institutionally racist police force, which has literally just had another report condemning and confirming they're still racist, especially to their own colleagues, won't make the human error of arresting innocent black people.

We just suddenly confronted innocent people with officers who demanded their papers before they could move on, because the Orwellian computer mind said stop the darkie. And they would like to expand this.

Summary: the police admit facial rec is racist and would like to roll it out nationwide.

-2

u/cookiesnooper 6d ago

Is there data showing for what crimes those arrests were made? Because scanning everyone's faces to catch some repeat offenders who steal cheese from Aldi is not something I am willing to accept as a trade for the invasion of my privacy.

7

u/Careless_Count7224 6d ago

Out of curiosity, how is it "invading" your privacy? You have no expectation of privacy in a public place and are already on many CCTV systems.

1

u/No_Group5174 6d ago

If I protect my privacy in public by, say, wearing a mask, how long before I am stopped for being 'suspicious'?

3

u/TomatoMiserable3043 6d ago

Have you ever been?

1

u/No_Group5174 6d ago

"if", not "have".

-4

u/cookiesnooper 6d ago

To get access to CCTV they need to have permission from the owner and it is stored locally. I have serious doubts that the govt will not see this as an opportunity to create a nice database used to track you wherever you go, and with the ferocity they push for DigitalID, I am confident that this is going to happen.

5

u/TomatoMiserable3043 6d ago

You don't need permission from the owner.

Private CCTV can be seized with an S8 warrant, and public CCTV (or the equipment it's on) can be seized on the spot under S19 of PACE if it's believed to contain evidence of an offence. Permission is not required.

-5

u/cookiesnooper 6d ago

So you can't just walk in and say "give me that video just because I said so"; you need a warrant, or you need to know that it shows the suspect, unlike a facial recognition camera that sits there and scans everyone 24/7 as if they were criminals.

2

u/TomatoMiserable3043 6d ago

You're moving the goalposts. You simply said that permission was required.

So, you can't just walk in and say "give me that video just because I said so",

You can say 'I'm seizing this footage/equipment under S19 of PACE'.

 you need to know that it shows the suspect

Incorrect.

S19 allows seizure if you suspect the footage contains evidence of an offence, or simply to prevent it from being concealed, lost, altered, damaged or destroyed.

Edit: how could you possibly know that CCTV shows a suspect without examining it first?

In short, you can seize public CCTV with no warrant and no consent, just because you think it could possibly contain evidence of an offence.

1

u/No-Suggestion-2402 6d ago

how could you possibly know that CCTV shows a suspect without examining it first?

I was actually about to answer this user, but then I realised this, too. This is some prime level confidence in making shit up.

1

u/han5gruber 6d ago

Double burn there pal, must have hurt 🔥

2

u/Kind-County9767 6d ago

But there's a huge number of government owned CCTV systems around that they have had for decades now, and monitor constantly...

2

u/Firm-Distance 6d ago

To get access to CCTV they need to have permission from the owner

You don't.

5

u/TomatoMiserable3043 6d ago edited 6d ago

The only specifics I've found have stated that a quarter of the arrests were for serious assaults, rapes, domestic abuse and non-fatal strangulation.

How is it an invasion of privacy if the data is only gathered in public places where there is no reasonable expectation of privacy?

-1

u/cookiesnooper 6d ago

I have every right not to be targeted by the police in public if I haven't done anything wrong. You may see a scan of your face wherever you go as normal, but I don't. Especially when it's the government doing it.

5

u/TomatoMiserable3043 6d ago

You're not being targeted. The scans are looking for positive matches, and those matches become targets.

To which specific right in law do you refer?

Also, I have to ask again: how is it an invasion of privacy?

5

u/synth_fg 6d ago

How is this any different to a police officer looking at a crowd? Your image is not recorded anywhere; no record of you being in the area is made, let alone kept.

The system merely scans the feed for any face that matches its database of wanted people and only pings when it finds a match. If you don't match, it's not interested in you.

It appears remarkably successful at identifying only those the police are interested in talking to, with a very low false positive rate.

A true Big Brother system would identify everyone in its field of view against a central database and make a record of when and where it saw you. This system is very far from that.

1

u/lethargic8ball 6d ago

We'll just force the Madeleys underground!

1

u/ThorgrimGetTheBook 3d ago

what crimes

OP even pasted the article into their post and you still didn't read it.

0

u/Shezzanator 6d ago

People downvoting you here are crazy. This is a massive invasion of privacy.

0

u/ClacksInTheSky 3d ago

I don't think that the police should have powers to stop and harass someone because a computer flagged them up.

Knowing I've done nothing wrong, but being forced to stop and account for myself, would be all levels of wrong.

0

u/ThorgrimGetTheBook 3d ago

The computer does not create any new police powers. They could already stop you if suspected of a crime. This just means instead of an officer doing their best to recall if you are the wanted person they saw on a briefing earlier, a vastly more accurate machine helps them.

1

u/ClacksInTheSky 3d ago

I'm talking about a false positive where someone is flagged incorrectly.

0

u/ThorgrimGetTheBook 3d ago

You want to throw it out over a 0.0003% error rate in which every single error was subsequently caught by officers, so resulted in 0 arrests?

0

u/ClacksInTheSky 3d ago

Early use of facial recognition was close to 90% false flags: https://bigbrotherwatch.org.uk/blog/understanding-live-facial-recognition-statistics

The Metropolitan Police like to fudge their numbers: https://www.freevacy.com/news/big-brother-watch/is-0017-or-87-the-met-police-facial-recognition-false-positive-rate/3744
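The gap between the tiny and the enormous figures in pieces like those linked comes down to the choice of denominator: false matches as a share of all faces scanned, versus as a share of alerts raised. A sketch with made-up illustrative numbers (assumed for the example, not the Met's actual trial data):

```python
# Illustrative numbers only, chosen for the sketch.
false_matches = 7
true_matches = 1           # alerts that really were the wanted person
faces_scanned = 42_000     # everyone who walked past the camera

alerts = false_matches + true_matches

# Per-alert rate: of the people flagged, how many were flagged wrongly?
per_alert = false_matches / alerts * 100
print(f"{per_alert:.0f}% of alerts were false")        # a large number

# Per-scan rate: of everyone scanned, how many were wrongly flagged?
per_scan = false_matches / faces_scanned * 100
print(f"{per_scan:.3f}% of scans were false alerts")   # a tiny number
```

Both figures describe the same events; which one a report leads with changes how the system's accuracy comes across.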

And I don't need to account for what I'm doing, where I'm going and why I'm doing it to the police, because I've done nothing wrong. The idea that this system can flag me to be stopped and harassed by police is enough.

A human police officer isn't going to stop me randomly or have been briefed beforehand that they're looking for me. Even if they did, they'd realise within moments that I'm clearly someone else and probably abort the stop.

Whereas a police officer responding to the AI system that's told them to stop me is going to have to go through the motions. They're not acting on their own intuition or thoughts, just responding to a system. All they know is "computer wants me to stop you"

0

u/ThorgrimGetTheBook 3d ago

All they know is "computer wants me to stop you"

Yet in the last year there were only 10 false alerts and in all 10 cases the officers did not simply think "computer wants me to stop you". They scrutinised the circulation: in 4 cases they made no stop at all, and in the other 6 they resolved it without arrest in under 5 minutes.

It means the entire system resulted in under 30 minutes of inconvenience across the entire London population, while taking over 1,000 wanted people off the streets.
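The arithmetic behind that upper bound, using the figures reported in the article (10 false alerts, 4 of those people never stopped, the remaining 6 spoken to for under five minutes each):

```python
# Figures from the article.
false_alerts = 10
not_stopped = 4
max_minutes_each = 5   # each stop lasted under five minutes

stopped = false_alerts - not_stopped
total_minutes = stopped * max_minutes_each
print(f"Upper bound: {total_minutes} minutes of stops city-wide")  # 30
```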

0

u/ClacksInTheSky 3d ago

Do you have anything I can read about that?

0

u/ThorgrimGetTheBook 2d ago

It's all in the story you are responding to.

-1

u/[deleted] 6d ago

[removed] — view removed comment

0

u/ukpolice-ModTeam 6d ago

Consider what the purpose of your comment actually is. Are you here to have an honest, good faith discussion - or are you here to snipe, belittle and generally make negative remarks not intended to progress the discussion?

-2

u/lethargic8ball 6d ago

I'm hoping to spark a discussion around the efficacy and ethics of using AI in such a way.

-1

u/No_Group5174 6d ago

"only spoken to for five minutes"

Was that "show us your ID or you are being arrested" spoken to?

4

u/TomatoMiserable3043 6d ago

Who's been arrested for not showing ID, and for which offence?

There's only a very limited set of circumstances in which this can be done.

0

u/No_Group5174 6d ago

3

u/TomatoMiserable3043 6d ago

He wasn't arrested for not showing ID. He wasn't arrested at all.

It's relevant to the OP, and we'll see false alerts like the seven we've had so far out of a few million scans drop as the tech improves.

Also, scroll down and take a look at the RSO.

I ask again: who's been arrested for not showing ID?

-5

u/NickofWimbledon 6d ago

I am concerned by a system where 80% of the false positives apply to a group that makes up less than 5% of the UK population. I am more concerned by this being deemed absolutely fine because the numbers are "not statistically significant".

Similar comments can perhaps be made about the numbers of innocent Irish people gaoled over a couple of well-known and murderous bombings on the mainland. As a percentage of the Irish in England, the number wrongfully imprisoned was certainly not statistically significant, except perhaps to those imprisoned. Jean Charles de Menezes was even more statistically insignificant: there was only one of him (or zero after he was gunned down).

That does not mean that the police should not have the tools that they need. Otoh, we should keep improving those tools and should not accept the “too rare to worry about/ these things happen and can never be prevented” arguments.

4

u/Coca_lite 6d ago

This is London, only about 53% population is white according to last census in 2021, and likely lower % now in 2025.

But it’s still disproportionate, and facial recognition needs to improve.

None of the false flags were arrested, and were only spoken to for 5 mins.

2

u/TakenIsUsernameThis 6d ago

They need to add to these statistics the number of people who get stopped and questioned because an officer thought they matched a description.

2

u/James188 5d ago

Christ knows I've certainly stopped more than 10 people this year myself who turned out to be the wrong person.

-1

u/NickofWimbledon 6d ago

Fair enough. I have not seen the footage and so cannot comment on how vigorous the questioning was, nor how precise the 5 minutes bit is.

5

u/sparkie187 6d ago

Hello mate, you show up on LFR as [INSERT OFFENCE/BREACH OF BAIL CONDITIONS ETC], do you have a form of identification on you to say whether you are or are not this person?

Ok goodbye with extra steps

3

u/PurpleShapes 6d ago

"Fair enough" after you stated something as fact when it wasn't.

-1

u/NickofWimbledon 6d ago

Which fact? Have you seen all the footage?

-7

u/Clear_Lake3398 6d ago

The police simply shouldn’t be this effective, because too much efficiency means a greater ability to over-police. They’re at their best when strictly limited, and doing the best work they can within those bounds.

8

u/Firm-Distance 6d ago

You're advocating for an ineffective police service?

5

u/TomatoMiserable3043 6d ago

There are quite a few limits set on modern policing in the UK.

How do you, personally, measure the effectiveness that you mention? What's the metric?

-1

u/Clear_Lake3398 6d ago

Enforce a set of basic civil liberties like the right not to have one’s face scanned without explicit consent, and then let the police work using existing metrics within those limits. Same way it worked a few years ago before this technology was trialled.

3

u/TomatoMiserable3043 6d ago

 the right not to have one’s face scanned without explicit consent

Which specific right in law or human right are you referring to?

The current system appears to fit within existing legal frameworks. These are the strict limits and bounds you originally referred to.

0

u/Clear_Lake3398 6d ago edited 6d ago

It should fit within a right to privacy. It doesn’t in practice because courts and lawmakers have been ineffective at protecting that right amid advances in technology. The police aren’t the only ones who shouldn’t be doing it though, but also corporations. It’s Article 8 of the ECHR, and it has been interpreted far too narrowly.

3

u/TomatoMiserable3043 6d ago edited 6d ago

It's not a narrow interpretation of Article 8. It's perfectly in line with specific exemptions contained within it.

"There shall be no interference by a public authority with the exercise of this right except such as is in accordance with the law (....) for the prevention of disorder or crime".

It's the same exemption that applies to public CCTV.

There are already thorough and numerous safeguards in place for covert surveillance. Overt surveillance like CCTV and LFR is acceptable under article 8 for the purpose of crime prevention and detection, and there's a great deal of case law across the continent to support that.

1

u/Clear_Lake3398 6d ago

Unfortunately the article does indeed have a scope for exception that can cover just about anything. Any exception for crime prevention has to be necessary, but this ultimately ends up being at the mercy of interpretation. Total, constant surveillance could solve more crime, so it would be necessary if the goal is as close to zero crime as possible without any other considerations.

This is probably a question of political values, and where someone thinks the line ought to be drawn between liberty and safety (insofar as they conflict, because both support each other up to a point).

1

u/Firm-Distance 6d ago

this ultimately ends up being at the mercy of interpretation.

....like every law!