Two people who took the Metropolitan Police to court over their use of facial recognition cameras have lost their challenge.
Shaun Thompson said the force’s technology wrongly identified him as a suspect in February 2024 outside London Bridge Tube station.
Along with Big Brother Watch director Silkie Carlo, Shaun launched a judicial review into the use of the cameras.
Lawyers representing the pair told a High Court hearing last month that the police’s use of LFR is increasing ‘exponentially’.
The Met Police used facial recognition 231 times and scanned around four million faces last year, Dan Squires KC said during the judicial review, which was heard as facial recognition is being extended to other forces.
They argued facial recognition is ‘similar to a DNA profile’ and that plans to mount permanent installations in the capital would make it ‘impossible’ for Londoners to travel without their biometric data being taken and processed.
There are plans to extend the cameras’ use, although consultation on it is still underway.
‘They do not reflect the rest of society’
Sir Mark Rowley said live facial recognition technology ‘won’t be as ubiquitous as CCTV’.
Speaking at Charing Cross Police Station, he said the force is using the technology ‘in a targeted way’ and in ‘high-crime areas’, adding that ‘it’s not going to be on every street corner’.
He continued: ‘In the surveys we’ve done, 80% of Londoners support us using it. So, while there are pressure groups who are concerned, that doesn’t reflect the rest of society.’
Earlier this year, the technology went live at one of the UK’s busiest train stations – London Bridge.
The tech, which uses artificial intelligence, scans faces at the station – which saw over 54 million passengers last year – and compares them against a list of serious criminals.
If the system finds a match, it sends an alert to an officer, who manually reviews it and makes further checks to decide whether the person is a suspect, the force said.
‘The law needs to catch up first’
Privacy and civil liberties campaigners have warned against the rollout of the tech, saying its use by police forces across the country is currently not monitored.
Big Brother Watch previously described the ‘mass biometric surveillance’ of people as ‘disturbing.’
In February last year, facial recognition software was installed across Cardiff for the Six Nations games. The cameras scanned 162,680 faces but did not lead to a single arrest, according to the organisation.
Madeleine Stone, a senior advocacy officer at the organisation, previously told Metro the law needs to catch up with the technology first, as there is no legislation governing the use of facial recognition cameras.
‘The police have essentially been left off the leash and can do what they want with this,’ Madeleine said.
‘Everyone gets something wrong sometimes, but what happens when the algorithm gets it wrong? Who is responsible then?’
What is Live Facial Recognition?
The Metropolitan Police say they use Live Facial Recognition (LFR) technology to prevent and detect crime and find wanted criminals.
When people pass through an area with a camera, their images are streamed directly to the LFR system and compared to a watchlist.
It can also help identify a person who is unable to communicate who they are.
LFR is often used at large events or in busy areas, with cameras typically mounted on top of police vans.
It was first used in England and Wales at the 2017 UEFA Champions League final in Cardiff.
Responding to the ruling, Carlo said: ‘This is a disappointing judgment, but the fight against live facial recognition mass surveillance is far from over.
‘There has never been a more important time to stand up for the public’s rights against dystopian surveillance tech that turns us into walking ID cards and treats us like a nation of suspects.
‘Innocent people deserve clear and strict protections from live facial recognition cameras, which should be reserved for the most serious cases rather than used to scan millions of people, and that is what the appeal will seek to achieve.
‘This legal challenge, which was made possible by concerned members of the public, has already led to a change in the Met’s facial recognition policy and to a payment awarded to Mr Thompson, who was misidentified by the tech and threatened with arrest.
‘He has been courageous in challenging the police, defending his rights and now standing up for the rights of millions of others in the country.’
Thompson added: ‘I’ve considered the court’s judgment today and decided to appeal it to protect Londoners from facial recognition being used for mass surveillance and leading to situations like mine, where I was misidentified, detained and threatened with arrest.
‘No one should be treated like a criminal due to a computer error.
‘I was compliant with the police, but my bank cards and passport weren’t enough to convince the police the facial recognition tech was wrong.
‘It’s like stop and search on steroids.
‘It’s clear the more widely this is used, the more innocent people like me risk being criminalised.
‘My daily work getting knives off the streets with the Street Fathers proves we can keep London safe through community action, not by spying on the public with cameras that real criminals already know how to dodge.’
What does Sir Mark Rowley think about the LFR ruling?
Sir Mark said the judgment was a ‘significant and important victory for public safety’ and that LFR ‘is not secret surveillance’.
He said: ‘The courts have confirmed our approach is lawful. The public supports its use. It works. And it helps us keep Londoners safe.
‘The question is no longer whether we should use live facial recognition – it’s why we would choose not to.
‘It’s very safe and if you’re nervous about your civil liberties, it’s important you know that your image is deleted in less than a second after you’ve walked past the camera.’
Policing Minister Sarah Jones said: ‘I welcome today’s ruling because there can be no true liberty when people live in fear of crime in their communities. Live facial recognition only locates specifically wanted people – law-abiding citizens have nothing to fear.
‘This technology puts dangerous rapists and murderers behind bars – and I question any group who call that uncivil.
‘We are rolling out facial recognition across the country with record investment to keep communities safe.’
