Face-off In Britain Over Controversial Surveillance Tech

Facial recognition technology was used at King Charles III's coronation and the British Grand Prix


On a grey, cloudy morning in December, London police deployed a state-of-the-art AI-powered camera near the railway station in the suburb of Croydon and quietly scanned the faces of unsuspecting passersby.

The use of live facial recognition (LFR) technology -- which creates biometric facial signatures before instantaneously running them through a watchlist of suspects -- led to 10 arrests for crimes including threats to kill, bank fraud, theft and possession of a crossbow.

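In broad terms, systems of this kind reduce each detected face to a numerical embedding and compare it against reference embeddings of people on a watchlist, declaring a match only when the similarity clears a threshold and discarding everything else. The sketch below is a minimal, hypothetical Python illustration of that matching step; the embedding size, threshold and names are invented for clarity and do not describe the system UK police actually use.

```python
import numpy as np

# Conceptual sketch only: a toy illustration of matching a live face
# embedding against a watchlist. All names and thresholds are hypothetical.

EMBEDDING_DIM = 128      # hypothetical size of a biometric "signature"
MATCH_THRESHOLD = 0.85   # hypothetical cosine-similarity cut-off

def cosine_similarity(a: np.ndarray, b: np.ndarray) -> float:
    """Similarity between two face embeddings (1.0 = same direction)."""
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

def check_against_watchlist(face_embedding: np.ndarray,
                            watchlist: dict[str, np.ndarray]) -> str | None:
    """Return the watchlist ID of the best match above the threshold, else None.

    Police say data on anyone who does not match a watchlist entry is
    deleted immediately; in this toy version, that simply means the
    caller discards the embedding when None is returned.
    """
    best_id, best_score = None, 0.0
    for person_id, ref_embedding in watchlist.items():
        score = cosine_similarity(face_embedding, ref_embedding)
        if score > best_score:
            best_id, best_score = person_id, score
    return best_id if best_score >= MATCH_THRESHOLD else None

# Toy usage with random vectors standing in for real embeddings.
rng = np.random.default_rng(0)
watchlist = {"suspect-001": rng.normal(size=EMBEDDING_DIM)}
live_capture = rng.normal(size=EMBEDDING_DIM)
print(check_against_watchlist(live_capture, watchlist))  # most likely None
```
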
The technology, which was used at the British Grand Prix in July and at King Charles III's coronation in May, has proved so effective in trials that the UK government wants it used more.

"Developing facial recognition as a crime fighting tool is a high priority," policing minister Chris Philp told police chiefs in October, adding that the technology has "great potential".

"Recent deployments have led to arrests that would otherwise have been impossible and there have been no false alerts," he added.

But the call to expedite its roll-out has outraged some parliamentarians, who want the government's privacy regulator to take "assertive, regulatory action" to prevent its abuse.


"Facial recognition surveillance involves the processing, en masse, of the sensitive biometric data of huge numbers of people -- often without their knowledge," they wrote in a letter.

"It poses a serious risk to the rights of the British public and threatens to transform our public spaces into ones in which people feel under the constant control of corporations and the government."

False matches

Lawmakers allege that false matches by the technology, which is yet to be debated in parliament, have led to more than 65 wrongful interventions by the police.

One was the arrest of a 14-year-old boy in school uniform, who was surrounded by officers and had his fingerprints taken before his eventual release.

MPs said the use of the technology by private companies, meanwhile, represented a "radical transfer of power" from ordinary people to companies in private spaces, with potentially serious consequences for anyone misidentified.

Members of the public, they said, could be prevented from making essential purchases like food, be subject to intrusive interventions or be brought into dangerous confrontations with security staff.

Last year the owner of the Sports Direct chain, Frasers Group, defended the use of LFR technology in its stores, saying the technology had "significantly" cut shoplifting and reduced violence against staff.

'Walking ID cards'

Civil liberties groups say the technology is oppressive and has no place in a democracy.

Mark Johnson, an advocacy manager at Big Brother Watch, compared the technology to the world of George Orwell's novel "Nineteen Eighty-Four", a portrait of a totalitarian state in which the characters are under constant surveillance.

The technology, he told AFP, "is an Orwellian mass surveillance tool that turns us all into walking ID cards".

Activists argue the technology places too much unmonitored power in the hands of the police, who have been given increased powers of arrest over protests through the Public Order Act.

The new laws, pushed through parliament by the right-wing Tory government four days before the coronation, give police the power to stop a protest if they believe it could cause "more than minor disruption to the life of the community".

Critics are especially concerned about the lack of oversight in the composition of police watchlists, saying some have been populated with protestors and people with mental health issues, who are not suspected of any offences.

"Off-the-shelf versions of these tools need legal and technical oversight to be used responsibly and ethically," one activist told AFP.

"I worry police forces don't have that resource and capacity to do this right now."

The police say that the details of anyone who is not a match on a watchlist are immediately and automatically deleted.

The Home Office, Britain's interior ministry, insists that data protection, equality and human rights laws strictly govern the use of the technology.

But that has not satisfied opponents, in a country where previous attempts to introduce compulsory identity cards have met fierce resistance.

In June 2023, the European Parliament voted to ban live facial recognition in public spaces.

In the UK, lawmakers who oppose the technology want to go further.

"Live facial recognition has never been given explicit approval by parliament," said Conservative MP David Davis, who once resigned his seat alleging the extension of custody time limits for terror suspects without charge was a breach of civil liberties.

"It is a suspicionless mass surveillance tool that has no place in Britain."


Source: AFP
