The Independent UK
Lizzie Dearden

Facial recognition company used by British police fined £7.5m over unlawful image database

A facial recognition company used by British police forces has been fined more than £7.5m for creating an unlawful database of 20 billion images.

The Information Commissioner’s Office said Clearview AI had scraped people’s private photos from social media and across the internet without their knowledge.

It created an app, sold to customers including the police, where they could upload a photograph to check for a match against images in the database.

The app would provide a list of images with similar characteristics, along with links to the websites where they were sourced.
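
In broad terms, a service like this performs a nearest-neighbour search over face "embeddings": fixed-length vectors that a recognition model derives from each photo. The sketch below is purely illustrative and is not Clearview AI's actual system; the index, URLs and scoring are all hypothetical, and random vectors stand in for real embeddings.

# Illustrative sketch only -- not Clearview AI's implementation.
# Assumes photos have already been reduced to fixed-length embedding
# vectors by some face-recognition model (hypothetical here).
import numpy as np

def search(query: np.ndarray, index: np.ndarray, urls: list, top_k: int = 5):
    """Rank stored embeddings by cosine similarity to the query
    and return the source URLs of the closest matches."""
    scores = index @ query / (
        np.linalg.norm(index, axis=1) * np.linalg.norm(query) + 1e-9
    )
    best = np.argsort(scores)[::-1][:top_k]
    return [(urls[i], float(scores[i])) for i in best]

# Toy stand-ins for a scraped image index (hypothetical data).
rng = np.random.default_rng(0)
index = rng.normal(size=(1000, 128))   # 1,000 stored face embeddings
urls = ["https://example.com/photo/%d" % i for i in range(1000)]
probe = rng.normal(size=128)           # embedding of the uploaded photo

for url, score in search(probe, index, urls):
    print("%.3f  %s" % (score, url))

At the scale reported here (20 billion images), a real system would presumably rely on an approximate nearest-neighbour index rather than the brute-force scan shown above.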

“Given the high number of UK internet and social media users, Clearview AI Inc’s database is likely to include a substantial amount of data from UK residents, which has been gathered without their knowledge,” a spokesperson for the Information Commissioner said.

“Although Clearview AI Inc no longer offers its services to UK organisations, the company has customers in other countries, so the company is still using personal data of UK residents.”

Documents reviewed by BuzzFeed News in 2020 indicated that the Metropolitan Police, National Crime Agency, Northamptonshire Police, North Yorkshire Police, Suffolk Constabulary, Surrey Police and Hampshire Police were among the forces to have used the technology.

The ICO found that Clearview AI had committed multiple breaches of data protection laws, including failing to have a lawful reason for collecting people’s information, failing to be “fair and transparent” and asking people who questioned whether they were on the database for additional personal information including photos.

John Edwards, the Information Commissioner, said: “Clearview AI Inc has collected multiple images of people all over the world, including in the UK, from a variety of websites and social media platforms, creating a database with more than 20 billion images.

“The company not only enables identification of those people, but effectively monitors their behaviour and offers it as a commercial service. That is unacceptable. That is why we have acted to protect people in the UK by both fining the company and issuing an enforcement notice.

“People expect that their personal information will be respected, regardless of where in the world their data is being used.”

The fine was the result of a joint investigation with the Australian Information Commissioner, which started in July 2020.

(Video: Police trial live facial recognition technology in Stratford)

The ICO also issued an enforcement notice, ordering the company to stop obtaining and using the personal data of UK residents that is publicly available on the internet, and to delete the data of UK residents from its systems.

A law firm representing Clearview AI said the fine was “incorrect as a matter of law”, because the company no longer does business in the UK and is “not subject to the ICO’s jurisdiction”.

Hoan Ton-That, the company’s CEO, said: “I am deeply disappointed that the UK Information Commissioner has misinterpreted my technology and intentions. I created the consequential facial recognition technology known the world over.

“My company and I have acted in the best interests of the UK and their people by assisting law enforcement in solving heinous crimes against children, seniors, and other victims of unscrupulous acts.

“It breaks my heart that Clearview AI has been unable to assist when receiving urgent requests from UK law enforcement agencies seeking to use this technology to investigate cases of severe sexual abuse of children in the UK. We collect only public data from the open internet and comply with all standards of privacy and law.”

Clearview AI’s app was separate from live facial recognition systems used by the Metropolitan Police and South Wales Police, which use video footage to scan for matches to a “watchlist” of images in real time.

The use of the technology, which has also been expanding into the private sector, has drawn controversy and several legal challenges.

A man who was scanned in Cardiff won a Court of Appeal case in 2020, with judges finding that the automatic facial recognition used violated human rights, data protection and equality laws.
