Surveillance Commissioner Blasts Cops for Data Retention


British police continue to hold millions of images of innocent citizens a decade after being ordered to destroy them, the UK’s outgoing biometrics and surveillance commissioner has revealed.

Fraser Sampson, who will end his term tomorrow, told The Guardian that a high court ruling in 2012 ordered police to dispose of custody photographs for individuals who were subsequently not charged with any crimes.

He estimated the number of these photographs to be “well over” three million.

“Eleven years later they have still got them. And the answer to that has been: ‘Well they’re retained on a database that doesn’t have the bulk delete capability,’” Sampson told the paper.

“I’ve said: ‘Well that can’t possibly be a defence because you built it.’ So what the police are trying to do at the moment is migrate that, create a new database and move everything off it that is legitimate, and by a process of elimination all that’s left can be deleted. But in terms of public trust and confidence it’s not a great place to be if that’s the only plan you have.”


Retention of these images is of particular concern in an age of AI-powered facial recognition, which could match faces captured in a crowd against retained custody photographs and flag individuals as suspects.

Sampson argued there are currently insufficient checks and balances governing police use of such privacy-invasive technology.

Databases of facial images could be compiled not just by capture from official police cameras but also from social media, where huge volumes are shared on a daily basis by private users. 

“All of a sudden you can tap that ocean. Our regulatory framework hasn’t seen any account of that at all. It still just regulates those cameras that are owned and operated by police and local authorities, which is a tiny fraction,” Sampson argued.

Controversial facial recognition firm Clearview AI has built a database of billions of images scraped from the internet to help law enforcement clients identify suspects via surveillance cameras.

However, it recently escaped a multimillion-pound regulatory fine after a UK court ruled that it only provided services to non-UK or EU law enforcement.
