Controversial facial recognition firm Clearview AI has been handed a $13.3 million fine by the UK privacy watchdog following a joint investigation with its Australian counterpart.
Clearview AI sells a facial recognition app which allows users to upload a photo of an individual and have it matched with images in the company’s database of at least three billion photos, which it automatically scrapes from social media platforms and a range of other sources online.
It was revealed that a number of Australian police forces had tried out the app, feeding images of themselves, suspects and victims into the system.
The Office of the Australian Information Commissioner (OAIC) found that Clearview’s “indiscriminate and automated” collection of sensitive biometric information of Australians on a “large scale, for profit” breached Australian privacy laws.
The privacy watchdog also found that the Australian Federal Police (AFP) had breached privacy rules in using Clearview by failing to properly assess the risks associated with the tool.
The UK Information Commissioner’s Office (ICO) made similar findings in November last year following a joint investigation by the two watchdogs.
The two privacy offices’ investigation came to an end late last year, with Clearview ordered to stop collecting any further information on Australians and to delete all collected images on them.
The company will not be fined in Australia, as issuing fines is not within the Australian privacy watchdog's current powers, and the regulator opted not to apply to the courts for a penalty against the firm. The OAIC has recommended that the Privacy Act be amended to give it the power to issue public infringement notices for interferences with privacy, which would allow it to levy fines in the same way as the ICO in the UK.
The ICO originally intended to fine Clearview $30.2 million (17 million pounds), but this figure has been reduced to $13.3 million (7.5 million pounds) after representations from the company were taken into consideration.
Clearview has also been ordered to delete all data of UK residents held on its systems.
“Clearview AI has collected multiple images of people all over the world, including in the UK, from a variety of websites and social media platforms, creating a database with more than 20 billion images,” UK Information Commissioner John Edwards said.
“The company not only enables identification of those people, but effectively monitors their behaviour and offers it as a commercial service. That is unacceptable. That is why we have acted to protect people in the UK by both fining the company and issuing an enforcement notice.”
The fine is the third largest ever imposed by the ICO in the UK.
In a statement, Clearview chief executive Hoan Ton-That said that the UK authorities had “misrepresented” the company.
“I am deeply disappointed that the UK Information Commissioner has misinterpreted my technology and intentions…I would welcome the opportunity to engage in conversation with leaders and lawmakers so the true value of this technology, which has proven so essential to law enforcement, can continue to make communities safe,” he said.
The company has agreed to stop selling its services to private companies and individuals, but will continue to offer them to law enforcement authorities around the world.