
A police force will resume its use of live facial recognition technology after the publication of a new report – but campaigners have made renewed calls for a ban.

South Wales Police had paused its use of the technology amid concerns over discrimination but said it would now use it again.

The Court of Appeal found in 2020 that South Wales Police had breached privacy rights, data protection laws and equality legislation through its use of facial recognition technology.

Since that ruling, South Wales Police says a code of practice has been established to set out the force's obligations.

The report found “minimal discrepancies” for race and sex when the technology is used in certain settings.

The technology could differentiate between identical twins, according to the report.

The Metropolitan Police has welcomed the National Physical Laboratory's findings, which were published on Wednesday.

But human rights groups have warned the technology should have no place in a democracy.

Madeleine Stone, legal and policy officer at Big Brother Watch, said: “This Orwellian technology may be used in China and Russia but has no place in British policing.

“This report confirms that live facial recognition does have significant race and sex biases, but says that police can use settings to mitigate them.

“Given repeated findings of institutional racism and sexism within the police, forces should not be using such discriminatory technology at all.”

Katy Watts, a lawyer at the human rights group Liberty, added: “We should all be able to live our lives without the threat of being watched, tracked and monitored by the police.

“Facial recognition technology is a discriminatory and oppressive surveillance tool that completely undermines this basic right.”


‘We have listened’

The report was commissioned by the Met and South Wales Police in late 2021, amid public debate over police use of the technology.

Lindsey Chiswick, director of intelligence for the Met, said: “We know that at the setting we have been using it, the performance is the same across race and gender and the chance of a false match is just one in 6,000 people who pass the camera.

“All matches are manually reviewed by an officer. If the officer thinks it is a match, a conversation will follow to check.

“We understand the concerns raised by some groups and individuals about emerging technology and the potential for bias. We have listened to these voices.”

South Wales Police chief constable Jeremy Vaughan added that the system is "a force for good".
