
Facial recognition cameras used by UK police are 98 percent inaccurate

Photo credit: DANIEL LEAL-OLIVAS/AFP/Getty Images

A system of facial recognition software and surveillance cameras used by law enforcement in the UK has misidentified innocent people in 98 percent of its matches, a civil liberties watchdog revealed in a report Tuesday.

Automated facial recognition (AFR) technology used by London’s Metropolitan Police is designed to find persons of interest within large crowds by comparing the biometrics of attendees caught on camera with information already stored in law enforcement databases.
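In broad terms, systems of this kind reduce each detected face to a numeric embedding and raise an alert when the distance to a watch-list embedding falls below a threshold. The sketch below is purely illustrative of that general approach, not the Met’s actual system; all names, values and the threshold are hypothetical.

```python
# Illustrative sketch of threshold-based face matching. NOT the Met's system:
# real deployments derive embeddings from a face-recognition model, and the
# threshold choice drives the false-positive rate discussed in this article.
import math

def euclidean(a, b):
    """Distance between two face embeddings (tuples of floats)."""
    return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))

def find_alerts(camera_faces, watch_list, threshold=0.6):
    """Return (camera_id, watchlist_id) pairs whose embeddings are close enough."""
    alerts = []
    for cam_id, cam_vec in camera_faces.items():
        for person_id, ref_vec in watch_list.items():
            if euclidean(cam_vec, ref_vec) < threshold:
                alerts.append((cam_id, person_id))
    return alerts

# Toy data: two faces seen on camera, one watch-list entry.
watch_list = {"suspect_17": (0.12, 0.80, 0.33)}
camera_faces = {"frame_001": (0.11, 0.79, 0.35),   # near-match -> alert
                "frame_002": (0.90, 0.10, 0.50)}   # no match
print(find_alerts(camera_faces, watch_list))  # [('frame_001', 'suspect_17')]
```

A looser threshold catches more genuine targets but flags more innocent passers-by, which is the trade-off at the heart of the figures below.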

Police records suggest the technology is grossly unreliable, however, and authorities who continue using AFR risk potentially violating British privacy laws, according to Big Brother Watch, a nonprofit civil liberties group that released the report.

London’s Metropolitan Police has tested AFR at three events: the city’s Notting Hill Carnival in 2016 and 2017, and a “Remembrance Sunday” event in November, the watchdog discovered.

The tests correctly identified a total of two people who appeared on police databases, but neither was a wanted criminal and no arrests were made, according to law enforcement documents obtained by the report’s authors.

But the same system incorrectly flagged 102 people as potential suspects, and authorities subsequently pulled aside and interviewed at least five people during the 2017 carnival and made them prove their identities, the report said.
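The headline 98 percent figure follows directly from those counts: of 104 total alerts, 102 were false. A quick check of the arithmetic:

```python
# Reproducing the report's headline figure from the counts reported above.
false_alerts = 102                  # people incorrectly flagged
true_alerts = 2                     # people correctly identified
total_alerts = false_alerts + true_alerts
print(f"{false_alerts / total_alerts:.1%}")  # 98.1% of alerts were false positives
```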

“Real-time facial recognition is a dangerously authoritarian surveillance tool that could fundamentally change policing in the U.K. Members of the public could be tracked, located and identified — or misidentified — everywhere they go,” said Silkie Carlo, the watchdog’s director.

“It is deeply disturbing and undemocratic that police are using a technology that is almost entirely inaccurate,” Ms. Carlo said. “It must be dropped.”

The Metropolitan Police told the BBC it tested the technology to see whether it could “assist police in identifying known offenders in large events, in order to protect the wider public.”

“Regarding ‘false’ positive matches — we do not consider these as false positive matches because additional checks and balances are in place to confirm identification following system alerts,” the Met told the BBC. “All alerts against the watch list are deleted after 30 days. Faces in the video stream that do not generate an alert are deleted immediately.”

Big Brother Watch said it planned to bring its report to Parliament and demand that police stop using automated facial recognition, citing potential violations of the Human Rights Act 1998.

