Apple’s Face ID in View of the General Data Protection Regulation
Even as legislation grows ever stricter, new technologies increasingly request our personal data, often of a sensitive nature.
Facial recognition
Facial recognition is an innovation now widely used by major players in the tech industry, such as Samsung and Huawei. Apple, however, undeniably remains the company that has made the greatest impact: when it released its new iPhone X in November 2017, it presented its Face ID technology, which allows users to unlock their phone effortlessly. To do so, the innovation relies on extremely precise measurements of the dimensions of the user’s face. This biometric data then makes it possible to recognise the user’s face in any circumstance, whatever its position.
Both practical and fun, this technology nevertheless raises the question of whether such data processing by Apple is compatible with current and future legislation, in particular the General Data Protection Regulation, which comes into effect on 25 May 2018. As the latter is markedly stricter with companies that collect personal data, it is worth analysing whether the collection of such biometric data could be challenged under the Regulation.
Processing of sensitive data
Unlike the previously applicable Personal Data Directive, the General Data Protection Regulation specifies that biometric data falls within the scope of “sensitive data” (Article 9 of the Regulation). Recital 51 of the Regulation describes biometric data as data “processed through a specific technical means allowing the unique identification or authentication of a natural person”. Although this definition remains relatively vague, it is a safe bet that the courts will include facial recognition in this category of personal data. The qualification is of great importance, insofar as the Regulation in principle prohibits such collection unless it fulfils one of the conditions set out in Article 9(2). Processing is indeed permitted where “the data subject has given explicit consent to the processing of those data (…) for one or more specified purposes”. By taking care to fulfil these conditions rigorously, Apple could carry out such data processing, provided that the Member State in which the processing takes place has not introduced more restrictive provisions, as Article 9(4) of the Regulation authorises it to do.
The company must also endeavour to meet the requirements of Article 35 of the Regulation. Where data collected through the use of new technologies presents a high risk to the rights and freedoms of individuals, the Regulation requires companies to carry out a data protection impact assessment. By collecting biometric data via its iPhone X, Apple faces precisely such an obligation. The assessment must include a systematic description of the envisaged processing operations, an assessment of the necessity and proportionality of the processing operations in relation to their purposes, and an assessment of the risks to the rights and freedoms of the persons concerned.
Regarding the potential risks, Apple had already communicated on the heightened degree of security it provides for this type of data: the company does not keep the user’s biometric data on any external server; instead, the data is encrypted and locked inside the smartphone’s processor via the Secure Enclave, a dedicated security coprocessor embedded in the device. However, this degree of security was called into question by a controversy that erupted a few months ago. The American Civil Liberties Union (ACLU), a leading American civil liberties organisation, warned that Apple shares facial recognition data with third-party application developers, to allow them to add new features to their applications. Even though Apple forbids developers from using the data for advertising or marketing purposes, security experts pointed out that there remains a risk of developers using the data fraudulently, diverting it from its intended use.
Is Face ID consistent with the General Data Protection Regulation?
Apple will have to take the challenges of the General Data Protection Regulation seriously by simultaneously ensuring that users give explicit and informed consent to the processing of their data, guaranteeing a high level of data security, and keeping the use of the images strictly proportionate to the purpose for which they are collected. The company’s Face ID innovation is a typical example of the growing use of increasingly sensitive data through new technologies. This is precisely what the European Union understood during its reflections on the General Data Protection Regulation. Even if, at this stage, only assumptions can be made about the alignment of high-tech companies with this legislation, it will be necessary to pay close attention to how the courts interpret the processing of this type of data.