Clearview AI has come under yet another sanction for violating European privacy regulations.
Greece’s data protection authority, based in Athens, has fined the controversial facial recognition firm 20 million euros and banned it from collecting and processing the personal data of people living in Greece. The authority also ordered Clearview to delete all data on Greek citizens it had already collected.
Since the end of last year, national DPAs in the United Kingdom, Italy and France have issued similar rulings sanctioning Clearview, effectively preventing it from selling its services in their markets, as any local customers would risk being fined themselves.
The American company gained notoriety for collecting selfies from the Internet to create a commercial algorithmic identity matching service for law enforcement and other organizations, including those in the private sector.
More recently, in May, Clearview agreed to severe restrictions on its services within the US to settle a 2020 lawsuit brought by the American Civil Liberties Union (ACLU), which accused it of violating an Illinois law that bans the use of individuals’ biometric data without consent.
The European Union’s data protection framework, the General Data Protection Regulation (GDPR), sets an equally high bar for the legitimate use of biometrics to identify individuals, a standard that applies across the entire bloc as well as in some non-member states (including the United Kingdom): around 30 countries in all.
Under the GDPR, such a sensitive purpose for personal data (for example, facial recognition for an ID matching service) requires, at a minimum, the explicit consent of the data subjects to the processing of their biometric data.
However, it is clear that Clearview did not obtain consent from the billions of people (likely including millions of Europeans) whose selfies it quietly scraped from social media platforms and other online sources to train its facial recognition AI, repurposing people’s data for privacy-hostile ends. So the growing string of GDPR sanctions against the company in Europe is hardly surprising. And additional penalties may follow.
In its 23-page decision, the Hellenic DPA found that Clearview breached the GDPR’s principles of lawfulness and transparency, citing violations of Articles 5(1)(a), 6 and 9, as well as of its obligations under Articles 12, 14, 15 and 27.
The Greek DPA’s decision follows a complaint filed in May 2021 by a local human rights group, Homo Digitalis, which declared victory in a press release, saying the €20 million fine sends “a strong signal against the intrusive business models of companies that seek to make money through illegal processing of personal data.”
The advocacy group also suggested that the fine sends “a clear signal to law enforcement agencies working with companies of this nature that such actions are illegal and grossly violate the rights of data subjects.” (In an even clearer message last year, Sweden’s DPA fined local police €250,000 for illegal use of Clearview, which it said violated the country’s Criminal Data Act.)
Clearview has been contacted for comment on the Hellenic DPA’s sanction.
By our running tally, the company has now been fined almost 50 million euros, on paper, by European regulators. It is not yet clear whether it has paid any of those fines, given potential appeals and the difficulty international regulators face in enforcing local laws against a US company that chooses not to cooperate.
The UK DPA has told us that Clearview is appealing the sanctions in that market.
“We have received notice that Clearview AI has filed an appeal. Clearview AI is not required to comply with the Notice of Enforcement or pay the Notice of Penalty until a decision on the appeal is made. We will not comment further on this case while the lawsuit is ongoing,” an ICO spokesperson said.
Clearview’s responses to earlier GDPR sanctions have indicated that it does not currently do business in the affected markets. But it remains to be seen whether enforcement will permanently shut the company out of the region, or whether it might try to circumvent the sanctions by somehow adapting its product.
In the US, Clearview called the settlement with the ACLU a “huge win” for its business, saying it would be largely unaffected because it could still sell its algorithm (rather than access to its database) to private companies in the US.
The US settlement also included a carve-out for government contractors, suggesting that Clearview can continue to work with US federal agencies, such as Homeland Security and the FBI, while observing a five-year ban on providing its software to any government contractors or to state or local governments in Illinois itself.
Notably, European DPAs have yet to order the destruction of the Clearview algorithm, despite several regulators concluding that it was trained on illegally obtained personal data.
As we previously reported, legal experts have suggested there is a gray area as to whether the GDPR empowers supervisors to order the removal of AI models trained on illegally obtained data, rather than just ordering the removal of the data itself, which appears to have been the approach taken so far in the Clearview saga.
But incoming EU AI legislation could enable regulators to go further: the (still draft) Artificial Intelligence Act gives market surveillance authorities the power to “take all necessary corrective actions” to bring an AI system into compliance, including withdrawing it from the market (which, in essence, amounts to commercial destruction), depending on the nature of the risk it poses.
If the version of the AI Act finally passed by EU lawmakers retains this provision, any wiggle room that currently allows commercial entities to keep deploying illegally trained AI models within the bloc could soon give way to some uncompromising legal clarity.
In the meantime, if Clearview complies with all these international orders to delete and stop processing citizens’ data, it will be unable to refresh its AI models with biometric data from people in the countries where such processing is banned, meaning the utility of its product should gradually degrade with each fully enforced order.
Credit: techcrunch.com