6 Jul. 2022

EDPB – Guidelines 05/2022 on the use of facial recognition technology in the area of law enforcement – Version 1.0

More and more law enforcement authorities (LEAs) apply or intend to apply facial recognition technology (FRT). It may be used to authenticate or to identify a person and can be applied to videos (e.g. CCTV footage) or photographs. It may serve various purposes, including searching for persons on police watch lists or monitoring a person’s movements in public spaces.

FRT is built on the processing of biometric data and therefore entails the processing of special categories of personal data. FRT often relies on components of artificial intelligence (AI) or machine learning (ML). While this enables large-scale data processing, it also introduces the risk of discrimination and false results. FRT may be used in controlled 1:1 verification settings, but also to identify persons in huge crowds and at major transport hubs.
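
To illustrate the two modes of operation mentioned above, the sketch below (a purely illustrative Python example using NumPy; the embedding vectors, threshold value and watch-list structure are assumptions, not part of the guidelines) contrasts 1:1 verification against a single enrolled template with 1:N identification against a whole watch list, where the outcome is only ever a probabilistic match.

```python
# Illustrative sketch only: contrasts 1:1 verification with 1:N identification.
# The embeddings, threshold and watch-list structure are hypothetical assumptions.
import numpy as np


def cosine_similarity(a: np.ndarray, b: np.ndarray) -> float:
    """Similarity between two face-embedding vectors (1.0 = identical direction)."""
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))


def verify(probe: np.ndarray, enrolled: np.ndarray, threshold: float = 0.6) -> bool:
    """1:1 verification: does the probe match one specific enrolled template?"""
    return cosine_similarity(probe, enrolled) >= threshold


def identify(probe: np.ndarray, watchlist: dict[str, np.ndarray],
             threshold: float = 0.6) -> str | None:
    """1:N identification: search the probe against every template in a watch list.

    Returns the best-matching identity above the threshold, or None ("no hit").
    The result depends on the chosen threshold and is never a definitive answer.
    """
    best_id, best_score = None, threshold
    for identity, template in watchlist.items():
        score = cosine_similarity(probe, template)
        if score >= best_score:
            best_id, best_score = identity, score
    return best_id
```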

FRT is a sensitive tool for LEAs. LEAs are executive authorities with sovereign powers. FRT is prone to interfere with fundamental rights, also beyond the right to the protection of personal data, and can affect our social and democratic political stability.

For personal data protection in the law enforcement context, the requirements of the Law Enforcement Directive (Directive (EU) 2016/680, “LED”) have to be met. The LED provides a certain framework regarding the use of FRT, in particular Article 3(13) LED (definition of “biometric data”), Article 4 (principles relating to the processing of personal data), Article 8 (lawfulness of processing), Article 10 (processing of special categories of personal data) and Article 11 LED (automated individual decision-making).

Several other fundamental rights may also be affected by the application of FRT. Hence, the EU Charter of Fundamental Rights (“the Charter”) is essential for the interpretation of the LED, in particular the right to the protection of personal data under Article 8 of the Charter, but also the right to privacy laid down in Article 7 of the Charter.

Legislative measures that serve as a legal basis for the processing of personal data directly interfere with the rights guaranteed by Articles 7 and 8 of the Charter. The processing of biometric data constitutes, under all circumstances, a serious interference in itself; this does not depend on the outcome, e.g. a positive match. Any limitation on the exercise of fundamental rights and freedoms must be provided for by law and respect the essence of those rights and freedoms.

The legal basis must be sufficiently clear in its terms to give citizens an adequate indication of conditions and circumstances in which authorities are empowered to resort to any measures of collection of data and secret surveillance. A mere transposition into domestic law of the general clause in Article 10 LED would lack precision and foreseeability.

Before the national legislator creates a new legal basis for any form of processing of biometric data using facial recognition, the competent data protection supervisory authority should be consulted.

Legislative measures have to be appropriate for attaining the legitimate objectives pursued by the legislation at issue. An objective of general interest – however fundamental it may be – does not, in itself, justify a limitation to a fundamental right. Legislative measures should differentiate and target those persons covered by it in the light of the objective, e.g. fighting specific serious crime. If the measure covers all persons in a general manner without such differentiation, limitation or exception, it intensifies the interference. It also intensifies the interference if the data processing covers a significant part of the population.

The data has to be processed in a way that ensures the applicability and effectiveness of the EU data protection rules and principles. In each situation, the assessment of necessity and proportionality also has to identify and consider all possible implications for other fundamental rights. If the data is systematically processed without the knowledge of the data subjects, it is likely to generate a general conception of constant surveillance. This may lead to chilling effects with regard to some or all of the fundamental rights concerned, such as human dignity under Article 1 of the Charter, freedom of thought, conscience and religion under Article 10 of the Charter, freedom of expression under Article 11 of the Charter, as well as freedom of assembly and association under Article 12 of the Charter.

Processing of special categories of data, such as biometric data, can only be regarded as “strictly necessary” (Article 10 LED) if the interference with the protection of personal data and its restrictions is limited to what is absolutely necessary, i.e. indispensable, excluding any processing of a general or systematic nature.

The fact that a photograph has been manifestly made public (Article 10 LED) by the data subject does not mean that the related biometric data, which can be retrieved from the photograph by specific technical means, is also considered to have been manifestly made public. Default settings of a service (e.g. templates being made publicly available) or the absence of choice (e.g. templates being made public without the user being able to change this setting) should not in any way be construed as data manifestly made public.

Article 11 LED establishes a framework for automated individual decision-making. The use of FRT entails the use of special categories of data and may lead to profiling, depending on how and for what purpose FRT is applied. In any case, in accordance with Union law and Article 11(3) LED, profiling that results in discrimination against natural persons on the basis of special categories of personal data is prohibited.

Article 6 LED concerns the need to distinguish between different categories of data subjects. With regard to data subjects for whom there is no evidence capable of suggesting that their conduct might have a link, even an indirect or remote one, with the legitimate aim pursued under the LED, there is most likely no justification for an interference.

The data minimisation principle (Article 4(1)(c) LED) also requires that any video material not relevant to the purpose of the processing should always be removed or anonymised (e.g. by blurring with no retroactive ability to recover the data) before deployment.
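
As a purely illustrative sketch of what irreversible anonymisation by blurring might look like in practice (the example below assumes OpenCV and an already detected face region; the parameters and file names are illustrative, not prescribed by the guidelines), the key point is that the original pixels are overwritten and only the anonymised copy is retained:

```python
# Illustrative sketch: irreversibly blur a detected face region in a video frame.
# Assumes OpenCV (cv2) and an already-detected bounding box; both are illustrative.
import cv2


def anonymise_region(frame, box):
    """Overwrite the region (x, y, w, h) with a heavy Gaussian blur, in place.

    Because the original pixels are replaced and the raw frame is discarded,
    the retained material offers no retroactive ability to recover the data.
    """
    x, y, w, h = box
    region = frame[y:y + h, x:x + w]
    frame[y:y + h, x:x + w] = cv2.GaussianBlur(region, (51, 51), 0)
    return frame


# Usage (illustrative): only the anonymised frame is written out.
# raw = cv2.imread("frame.png")
# safe = anonymise_region(raw, box=(120, 80, 64, 64))
# cv2.imwrite("frame_anonymised.png", safe)
```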

Since FRT often involves the processing of special categories of personal data without any apparent interaction with the data subject, the controller must carefully consider how, or whether, it can meet the requirements for data subjects’ rights before any FRT processing is launched.

The effective exercise of data subjects’ rights depends on the controller fulfilling its information obligations (Article 13 LED). When assessing whether a “specific case” according to Article 13(2) LED exists, several factors need to be taken into consideration, including whether personal data is collected without the knowledge of the data subject, since providing information may then be the only way to enable data subjects to effectively exercise their rights. Where decisions are made solely on the basis of FRT, data subjects also need to be informed about the features of the automated decision-making.

As regards access requests, where biometric data is stored and also connected to an identity by alphanumerical data, the principle of data minimisation implies that the competent authority should be able to confirm an access request on the basis of a search by those alphanumerical data, without launching any further processing of the biometric data of others (i.e. without searching a database with FRT).
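
A minimal sketch of this design choice, assuming a hypothetical record store keyed by alphanumerical identifiers (the record structure and field names are assumptions, not prescribed by the guidelines): the access request is answered by a direct lookup, so no biometric comparison against other persons’ templates is triggered.

```python
# Illustrative sketch: answering an access request by an alphanumerical lookup
# instead of a biometric (1:N) search. The record structure is a hypothetical
# assumption, not a prescription of the guidelines.
from dataclasses import dataclass


@dataclass
class BiometricRecord:
    subject_id: str   # alphanumerical identifier linking the record to an identity
    name: str
    template: bytes   # stored biometric template (not searched for this purpose)


def confirm_access_request(records: dict[str, BiometricRecord], subject_id: str) -> bool:
    """Confirm whether data on the requesting person is held, by key lookup only.

    Because the store is indexed by the alphanumerical identifier, answering the
    request does not require comparing the requester's face against anyone
    else's biometric data.
    """
    return subject_id in records
```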

The risks for the data subjects are particularly serious if inaccurate data is stored in a police database and/or shared with other entities. The controller must correct stored data and FRT systems accordingly, see recital 47 LED.

The right to restriction becomes especially important in the context of facial recognition technology, which is based on algorithms and therefore never yields a definitive result, in situations where large quantities of data are gathered and the accuracy and quality of the identification may vary.

A data protection impact assessment (DPIA) before the use of FRT is a mandatory requirement, cf. Article 27 LED. The EDPB recommends making public the results of such assessments, or at least the main findings and conclusions of the DPIA, as a trust and transparency enhancing measure.

Most cases of deployment and use of FRT involve an intrinsically high risk to the rights and freedoms of data subjects. Therefore, the authority deploying the FRT should consult the competent supervisory authority prior to the deployment of the system.

Given the unique nature of biometric data, the authority implementing and/or using FRT should pay special attention to the security of processing, in line with Article 29 LED. In particular, the law enforcement authority should ensure that the system complies with the relevant standards and implements biometric template protection measures. Data protection principles and safeguards must be embedded in the technology before the start of the processing of personal data. Therefore, even when a LEA intends to apply and use FRT from external providers, it has to ensure, e.g. through the procurement procedure, that only FRT built upon the principles of data protection by design and by default is deployed.

Logging (cf. Article 25 LED) is an important safeguard for verifying the lawfulness of the processing, both internally (i.e. self-monitoring by the controller/processor concerned) and by external supervisory authorities. In the context of facial recognition systems, logging is recommended also for changes to the reference database and for identification or verification attempts, including the user, the outcome and the confidence score.
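
A minimal sketch of what such a log entry might contain, covering the elements named above (user, outcome, confidence score) together with a timestamp and the reference-database version; the field names and the append-only JSON-lines format are illustrative assumptions, not requirements of the LED.

```python
# Illustrative sketch of a log record for an identification/verification attempt,
# capturing the elements recommended above: user, outcome and confidence score.
# Field names and storage format are assumptions, not prescribed by the LED.
import json
from dataclasses import asdict, dataclass


@dataclass
class FrtLogEntry:
    timestamp: str              # when the attempt was made (UTC, ISO 8601)
    operator: str               # the user who triggered the attempt
    operation: str              # "identification" or "verification"
    reference_db_version: str   # state of the reference database that was queried
    outcome: str                # e.g. "match" or "no_match"
    confidence: float           # similarity/confidence score reported by the system


def append_log(path: str, entry: FrtLogEntry) -> None:
    """Append the entry as one JSON line to an append-only audit log."""
    with open(path, "a", encoding="utf-8") as f:
        f.write(json.dumps(asdict(entry)) + "\n")


# Example (illustrative values):
# append_log("frt_audit.log", FrtLogEntry(
#     timestamp="2022-07-06T10:15:00Z", operator="officer_123",
#     operation="identification", reference_db_version="2022-07-01",
#     outcome="match", confidence=0.87))
```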

The EDPB recalls its and the EDPS’ joint call for a ban on certain kinds of processing, in relation to:

  • remote biometric identification of individuals in publicly accessible spaces;
  • AI-supported facial recognition systems categorising individuals based on their biometrics into clusters according to ethnicity, gender, political or sexual orientation, or other grounds for discrimination;
  • the use of facial recognition or similar technologies to infer the emotions of a natural person; and
  • the processing of personal data in a law enforcement context that would rely on a database populated by the collection of personal data on a mass scale and in an indiscriminate way, e.g. by “scraping” photographs and facial pictures accessible online.

These guidelines address law makers at EU and national level, as well as LEAs and their officers implementing and using FRT systems. Individuals are also addressed, insofar as they have a general interest in the topic or are concerned as data subjects, in particular as regards data subjects’ rights.

The guidelines are intended to inform about certain properties of FRT and the applicable legal framework in the context of law enforcement (in particular the LED).

  • In addition, they provide a tool to support a first classification of the sensitivity of a given use case (Annex I).
  • They also contain practical guidance for LEAs that wish to procure and run an FRT system (Annex II).
  • The guidelines also depict several typical use cases and list numerous relevant considerations, especially with regard to the necessity and proportionality test (Annex III).
