‘This is Big Brother… you have just been detected in Camden High Street’

Human rights groups raise alarm over police use of facial recognition software

Friday, 27th October 2023 — By Frankie Lister-Fell


The use of LFR has been described as an ‘Orwellian’ nightmare



POLICE are deploying an “Orwellian mass surveillance tool” in Camden, campaigners have warned.

This year, the Met have used live facial recognition technology (LFR) three times in Camden: twice in April and once last month, on September 8, in Camden High Street. The police did not declare in advance where it would be deployed and gave little warning on social media before its use.

There are two types of facial recognition used by the police: retrospective recognition, where images of someone suspected of doing something illegal are compared against custody image databases, and LFR.

LFR operates from police vans that are deployed in a specific area. Signs are displayed at the location to say that facial recognition is taking place.

The cameras scan anyone who walks past. Every face is mapped and converted into a “biometric face print”, similar to fingerprints, that is checked against a police watchlist looking for matches – without consent from those walking by.
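In outline, that matching step works like any biometric comparison: each face is reduced to a numeric vector, and that vector is scored against every vector on the watchlist, raising an alert when the score crosses a threshold. The sketch below is a generic illustration of this idea, not the Met’s proprietary system; the function names, the 128-dimension “face print” and the 0.6 threshold are all hypothetical.

```python
import numpy as np

def face_print(image_pixels: np.ndarray) -> np.ndarray:
    """Stand-in for a real face-embedding model: reduces an image to a
    fixed-length vector (the 'biometric face print'). Purely illustrative;
    a deterministic seed substitutes for a trained neural network."""
    rng = np.random.default_rng(int(image_pixels.sum()) % 2**32)
    v = rng.standard_normal(128)
    return v / np.linalg.norm(v)  # unit-normalise for cosine comparison

def check_against_watchlist(probe: np.ndarray,
                            watchlist: dict[str, np.ndarray],
                            threshold: float = 0.6) -> str | None:
    """Score one face print against every watchlist print using cosine
    similarity; return the best match above the threshold, else None."""
    best_name, best_score = None, threshold
    for name, ref in watchlist.items():
        score = float(probe @ ref)  # cosine similarity of unit vectors
        if score > best_score:
            best_name, best_score = name, score
    return best_name

# Hypothetical usage: every passer-by is scanned and checked in turn.
watchlist = {"wanted_person_A": face_print(np.ones((64, 64)))}
passerby = face_print(np.zeros((64, 64)))
print(check_against_watchlist(passerby, watchlist))  # None -> no alert
```

The key point campaigners raise is visible in the structure: every face is converted and compared, whether or not it triggers an alert.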

Madeleine Stone, senior advocacy officer at Big Brother Watch, told the New Journal: “It’s the same thing as having your fingerprint taken by the police in order to walk down the street.” The Met say the technology is used to “locate dangerous individuals”.

Last week, commissioner Sir Mark Rowley said the force’s new plan to use retrospective facial recognition to identify shoplifters “was pushing boundaries”. But human rights organisations have said the technology is discriminatory, inaccurate and a “big privacy concern”.

Ms Stone said: “Police targeting Camden’s residents with this dangerous mass surveillance tool treats them like suspects, erodes public freedoms and wastes public money. Live facial recognition is not an efficient crime-fighting tool, with the police’s own statistics revealing that more than 8 out of 10 facial recognition matches have been inaccurate since its introduction. This is an Orwellian mass surveillance tool rarely seen outside of Russia and China and has absolutely no place in London.”

The police tech being used at Highbury Corner [Big Brother Watch]

Ms Stone said Big Brother Watch has seen people, including children, pulled aside by the police following an incorrect LFR match and forced to give their fingerprints or show their ID to prove they are not a criminal. She said the technology suffers from algorithmic bias, with studies showing live facial recognition is “less accurate for women and people of colour”.

And certain communities are over-policed and more likely to end up on watchlists, which means they are more likely to be flagged by the technology, rightly or wrongly.

Caroline Russell, chair of London Assembly’s police committee, said: “The lack of transparency about the make-up of watchlists and the purpose of deployments is unhelpful in the context of the Met trying to improve Londoners’ trust and confidence in policing.

“LFR is a really dangerous technology and it’s very difficult to see how it’s actually making a difference in terms of policing.”

The London Policing Ethics Panel has reported that younger people and Asian, black and mixed ethnic groups are more likely not to attend events if they know they will be monitored with LFR.

A Met spokesperson said: “The Met has primarily focused the use of LFR on the most serious crimes; locating people wanted for violent offences, including knife and gun crime, or those with outstanding warrants who are proving hard to find. Operational deployments of LFR technology have been in support of longer-term violence reduction initiatives and have resulted in a number of arrests for serious offences including conspiracy to supply Class A drugs, assault on emergency service workers, possession with intent to supply Class A & B drugs, grievous bodily harm and being unlawfully at large having escaped from prison.”

It added that alert rates across its deployments are between 0 per cent and 0.08 per cent, although Big Brother Watch claims the Met uses a different formula to calculate accuracy: the number of false matches measured against the total number of faces seen, rather than against the number of matches made.
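The gap between the two headline figures comes down to that choice of denominator. The worked sketch below uses hypothetical deployment numbers (the article does not give the Met’s underlying counts) to show how the same deployment can be reported as a 0.08 per cent error rate or as 80 per cent of matches being inaccurate.

```python
# Hypothetical numbers for illustration only; not the Met's actual data.
faces_scanned = 10_000   # everyone who walked past the cameras
alerts = 10              # faces the system matched to the watchlist
false_alerts = 8         # alerts later found to be wrong

# Formula attributed to the Met: false matches / total faces seen.
met_rate = false_alerts / faces_scanned
print(f"False alerts per face scanned: {met_rate:.2%}")    # 0.08%

# Big Brother Watch's framing: false matches / total matches made.
bbw_rate = false_alerts / alerts
print(f"Inaccurate share of all matches: {bbw_rate:.2%}")  # 80.00%
```

Both figures are arithmetically correct; they simply answer different questions, which is why the two sides can quote such different numbers about the same technology.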

Human rights organisation Liberty has launched a petition calling on the Home Office to ban facial recognition technology.

Emmanuelle Andrews, Liberty policy and campaigns manager, said: “We cannot police and surveil our way out of social problems, and facial recognition is the wrong response – especially to people in need of support who are struggling to survive amidst the cost-of-living crisis.

“The safest thing to do for everyone is ban facial recognition technology.”


