Article 29 WP highlights dangers of facial recognition technologies


At the end of last month the Article 29 Working Party (WP) adopted a new opinion (PDF) on developments in biometric technologies. Almost ten years ago the WP wrote its first report on this matter, but since then the availability, cost, effectiveness and use of these technologies have "dramatically changed", according to the body of national data protection commissioners.

One of the trends that justified a new report was the perceived move from developing biometric systems that allow the identification of persons to systems that detect behaviour or specific needs of people, including through the (covert) use of remote collection of body traits. Such use results in "a serious threat for privacy and a lack of control over personal data", as it has "serious consequences on the capacity of people to exercise free consent or simply get information about the processing".

The Article 29 WP develops this further in relation to the use of new facial recognition technologies. Photographs found online (for instance on social media) may not be further processed "in order to extract biometric templates" or "to recognise the persons on the pictures automatically" without a specific legal basis (e.g. consent) for this new purpose. If facial recognition tools were employed on a large scale, the Article 29 WP warns, their ability to capture biometric data without the knowledge of the person would "terminate anonymity in public spaces and allow consistent tracking of individuals."

As a general rule, the WP states that the use of biometrics “for general security requirements of property and individuals” cannot be regarded as a legitimate interest overriding the interests or fundamental rights and freedoms of the data subject.

“On the contrary, the processing of biometric data can only be justified as a required tool securing the property and/or individuals, where there is evidence, on the basis of objective and documented circumstances, of the concrete existence of a considerable risk. To that end the controller needs to prove that specific circumstances pose a concrete, considerable risk, which the controller is required to assess with special care. In order to comply with the proportionality principle, the controller, in presence of these high risk situations, is obliged to verify if possible alternative measures could be equally effective but less intrusive in relation to the aims pursued and choose such alternatives.”

The opinion further addresses specific biometric systems such as vein pattern recognition technology, fingerprint databases, voice recognition systems and DNA analysis.

Categories: Privacy, Security