Facial Recognition Laws in Europe

tl;dr

In the first part of this series, we looked at how the United States has responded to the proliferation of harmful facial recognition technology (FRT) in that country. In this post, we look at the markedly different approach taken in Europe. While the US is responding to the use of FRT after the fact, Europe is grappling with the question of how the use of FRT fits within its existing laws.

FRT regulation in Europe

Europe consists of around fifty countries, of which 27 are members of the European Union. The table below sets out each country's stance on FRT. Countries whose stance could not be verified through a reliable source have been omitted.

Country        | Stance on FRT
Armenia        | In use
Austria        | In use
Belarus        | In use
Belgium        | Ban
Bulgaria       | In use
Croatia        | Considering use
Cyprus         | In use
Czech Republic | In use
Denmark        | In use
England        | In use
Estonia        | Approved for use (not implemented)
Finland        | Approved for use (not implemented)
France         | In use
Germany        | In use
Greece         | In use
Hungary        | In use
Iceland        | In use
Ireland        | In use
Italy          | In use
Luxembourg     | Ban
Malta          | In use
Moldova        | In use
Netherlands    | In use
Norway         | In use
Poland         | Approved for use (not implemented)
Portugal       | In use
Romania        | Approved for use (not implemented)
Russia         | In use
Scotland       | Considering use
Serbia         | In use
Slovakia       | Approved for use (not implemented)
Slovenia       | In use
Spain          | In use
Sweden         | Approved for use (not implemented)
Switzerland    | In use

Existing and proposed regulations for FRT in Europe

Unlike the US and India, the member countries of the EU have a first line of defence against FRT in the form of the GDPR. Under Art. 5, the GDPR lays down the principles on which any processing of personal data must be based: 'lawfulness, fairness and transparency', 'purpose limitation', 'data minimisation', 'accuracy', 'storage limitation', 'integrity and confidentiality' and 'accountability'. Further, under Art. 9, the GDPR prohibits the processing of biometric data for the purpose of uniquely identifying a natural person.

However, according to Daniel Leufer, Access Now's Europe Policy Analyst, "(t)here are quite a few exceptions to Art 9, such as (e) "processing relates to personal data which are manifestly made public by the data subject;" and (g) "processing is necessary for reasons of substantial public interest, on the basis of Union or Member State law which shall be proportionate to the aim pursued, respect the essence of the right to data protection and provide for suitable and specific measures to safeguard the fundamental rights and the interests of the data subject", which allow for exemptions and/or abuse." It is probable that the use of FRT in multiple countries across Europe, as seen in the table above, relies on these exceptions. In response, Access Now, EDRi, and other civil society organisations across Europe have come together to launch the Reclaim Your Face campaign to stop mass surveillance through facial recognition.

The Law Enforcement Directive (LED) is legislation that runs parallel to the GDPR in the EU and relates specifically to the processing of personal data by data controllers for 'law enforcement purposes', which falls outside the scope of the GDPR. Art. 10 of the LED ('Processing of special categories of personal data') states: "Processing of personal data revealing racial or ethnic origin, political opinions, religious or philosophical beliefs, or trade union membership, and the processing of genetic data, biometric data for the purpose of uniquely identifying a natural person, data concerning health or data concerning a natural person's sex life or sexual orientation shall be allowed [emphasis added] only where strictly necessary, subject to appropriate safeguards for the rights and freedoms of the data subject, and only:

  1. where authorised by Union or Member State law;

  2. to protect the vital interests of the data subject or of another natural person; or

  3. where such processing relates to data which are manifestly made public by the data subject.”

At present, a proposal for harmonised rules on artificial intelligence (the Artificial Intelligence Act) is pending before the European Commission. Under Art. 5(1)(d) of the proposal, the use of 'real-time' remote biometric identification systems in publicly accessible spaces for the purpose of law enforcement is prohibited. However, the wide exemptions attached to this provision severely weaken it by acting as loopholes. The article also does not apply to other authorities or private companies and covers only "real-time" uses, thereby creating another loophole. In response to the proposal, the European Data Protection Supervisor (EDPS), the EU's privacy watchdog, stated in a press release that a "ban on remote biometric identification in public space is necessary".

Specific developments in the United Kingdom

Since the UK has exited the EU, the GDPR and the LED no longer apply to it (following the transition period), and the new AI Act proposal will not apply to it either. However, the UK has the Data Protection Act 2018, which regulates how personal information is used by organisations, businesses and the government. The Data Protection Act 2018 is the UK's implementation of the GDPR and thus, in practice, applies the same principles as the GDPR with regard to the processing of personal data, while also sharing similar shortcomings.

In addition to legislation, one of the leading legal precedents in this area is Ed Bridges v. South Wales Police, a landmark FRT case decided by the Court of Appeal (Civil Division). After the High Court ruled in September 2019 that the use of FRT on Ed Bridges was not unlawful, the Court of Appeal ruled in August 2020 that the "use of automatic facial recognition (AFR) technology by South Wales Police is unlawful". This favourable outcome has been hailed by many as a big step in the fight against FRT, as it is touted to be the world's first legal challenge to police use of facial recognition technology.

We hope this analysis helps guide researchers, advocates and groups in India on the path forward for crafting a legal framework for FRT regulation in India that may include banning its deployment for specific purposes.

FRT regulation around the world

Under IFF’s Project Panoptic, we have called for a ban on government entities, police and other security/intelligence agencies using FRT. In this series, we will be taking a look at how our stance measures up against developments on FRT regulations around the world. For the third post in this series we will be focusing on China, so stay in touch with us and keep following Project Panoptic.

Published By
Anushka Jain

Policy Counsel, IFF