Author Archive

European Commission proposes law enforcement access to EURODAC

The European Commission on Wednesday (30 May) proposed to allow law enforcement authorities access to EURODAC, a biometric database of asylum seekers. The proposal will be presented to the Home Affairs Ministers at the next Justice and Home Affairs Council on 7-8 June 2012. The Commission has yet to release the full details of the proposal. Member state law enforcement authorities and EUROPOL would be able to request the comparison of fingerprint data with those already stored in the EURODAC central database, but under strict conditions. The comparison with the EURODAC database for law enforcement purposes would be strictly limited to the prevention, detection or investigation of terrorist offences as defined in the Council Framework Decision on combating terrorism (2002/475/JHA) and of other serious criminal offences as defined in the Council Framework Decision on the European Arrest Warrant (2002/584/JHA).

The new proposal introduces the possibility for Member States’ law enforcement authorities and Europol to request a comparison of fingerprint data with those stored in the EURODAC central database in a specific case, when they seek to establish the exact identity of, or obtain further information about, a person who is suspected of a serious crime or is a victim of crime. Law enforcement authorities may only request a comparison with EURODAC data if there are reasonable grounds to consider that such a comparison will substantially contribute to the prevention, detection or investigation of the serious criminal offence in question. The proposal makes clear that a comparison of fingerprint data using EURODAC may only be made after the national fingerprint databases and the automated fingerprint databases of other Member States under Council Decision 2008/615/JHA (the Prüm Agreements) have been consulted and have returned negative results. A comparison using the EURODAC database returns results on a ‘hit’/‘no hit’ basis. Following a hit, the available information on the person (related to his/her asylum application) can then be requested from the relevant Member State using existing instruments for information exchange, such as Framework Decision 2006/960/JHA on simplifying the exchange of information and intelligence between law enforcement authorities. The proposal rules out systematic searches of the EURODAC database by law enforcement authorities, and prohibits them from sharing personal data obtained with third countries, organisations or other entities.
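The tiered search order the proposal mandates can be summarised in a short sketch. This is illustrative only: the database interfaces and names below are hypothetical (no public EURODAC API exists), and the point is simply the ordering constraint and the hit/no-hit answer.

```python
# Illustrative sketch of the lookup order described in the proposal:
# EURODAC may be queried only after the national database and the
# Prüm-linked databases of other Member States return no match.

def fingerprint_lookup(prints, national_db, prum_dbs, eurodac):
    """Return ('hit'|'no hit', source) following the mandated search order."""
    if national_db.match(prints):
        return "hit", "national"
    for db in prum_dbs:  # other Member States' databases (Prüm Agreements)
        if db.match(prints):
            return "hit", "prum"
    # Last resort: EURODAC, restricted to serious-crime investigations,
    # and answering only on a hit/no-hit basis.
    return ("hit", "eurodac") if eurodac.match(prints) else ("no hit", None)


class FakeDB:
    """Stand-in database holding a set of known fingerprint identifiers."""
    def __init__(self, known):
        self.known = set(known)

    def match(self, prints):
        return prints in self.known


# Example: unknown nationally and via Prüm, but stored in EURODAC.
result = fingerprint_lookup(
    "fp-123",
    national_db=FakeDB([]),
    prum_dbs=[FakeDB([]), FakeDB([])],
    eurodac=FakeDB(["fp-123"]),
)
```

A hit here yields no asylum-file data by itself; as the proposal notes, the requesting authority must then use existing information-exchange instruments to obtain further details from the Member State concerned.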
According to the responsible Commissioner, Cecilia Malmström, “robust safeguards have been introduced to guarantee full respect of fundamental rights and of privacy and to ensure that the right to asylum is not in any way adversely affected.” But Melita Sunjic, spokeswoman for the United Nations High Commissioner for Refugees (UNHCR) in Brussels, told EUobserver that law enforcement access to the database would equate asylum seekers with criminality. A similar proposal was tabled by the Commission in 2009 but was quickly shot down by the European Data Protection Supervisor (EDPS) and the Meijers Committee, a group of experts on international immigration, refugee and criminal law. “The proposal would effectively transform all asylum seekers whose data is stored into criminal suspects and it will, indeed, increase the chance of prosecution of asylum seekers solely on the basis that they have once lodged an asylum claim somewhere,” said Dr. Maarten den Heijer, a member of the Meijers Committee. Furthermore, Dr den Heijer argues the proposal would violate the data protection principle of ‘purpose limitation’, which holds that stored personal data may only be used for the purpose for which it was initially collected. He cited a case brought against Germany by an Austrian national at the European Court of Justice in 2008, in which the Court ruled that a system for processing personal data specific to foreign nationals for the purpose of fighting crime is not permissible.

Data Protection and Privacy

Data protection and privacy (as far as their overlapping facets are concerned) are concrete fundamental rights, whose enjoyment depends on data controllers and processors actively providing all the necessary conditions. These certainly include the duty to inform data subjects about how their data are used, processed, and stored. After all, rights are worth very little if their bearers are not aware of them, or of how to enforce them. In this light, the recent outrage over Google Street View’s surreptitious collection of personal data such as email addresses, passwords, and IP addresses is easily explained. The related report published at the end of April demonstrates that the collection of such data by Google’s vehicles mapping different cities’ streets was not the decision of a single employee acting in his own capacity. Rather, it was a well-orchestrated programme of which many people inside the company were aware. Consequently, EU prosecutors are planning to resume actions against the company. This comes one month after the Commission flexed its muscles over Google’s privacy policy change, whereby all accounts held by one user are to be consolidated into a single access point to its services, in an effort to step up identity management, i.e. identification of customers, to better extract revenue from users. The problem here is a ‘take-it-or-leave-it’ approach, whereby customers are left with no choice but to accept the changes or quit, provided they have read and understood the new policy at all. Google is hardly the only company facing criticism over this issue; Facebook, for instance, has also been criticized for its constantly changing, ‘take-it-or-leave-it’ approach. These policies seem to be a tick-box exercise: they comply with legal requirements without providing users with meaningful instruments to protect their privacy.
In the online world, by contrast, privacy and data protection policies are an important, and sometimes the only, means toward this end (provided they are succinct, complete, and written in layman’s language). Appeals to foster clear information started some years ago, and have recently been restated in programmatic speeches on data protection legislation reform, such as Viviane Reding’s. Unfortunately, this is often forgotten. Either privacy policies are used as a tick-box exercise, or they are nonexistent, even among those who preach the importance of enhancing human rights. Indeed, we reviewed 20 related projects under the SEC and SSH calls. Of the 18 fully operational project websites, only a few have a legal notice, and the overwhelming majority do not provide a data protection policy, even among those whose research focus is privacy. Click on the link below to read the SURVEILLE FP7 project’s data protection policy. Comments and suggestions are most welcome.

SURVEILLE project partner Efus sets up new working group on security technologies. Interested members are invited to participate!

The European Forum for Urban Security (Efus), a project partner of SURVEILLE, has just announced an open invitation to interested members to join its new working group on technologies. The initiative will build on the organisation’s ongoing work for a responsible and democratic use of video surveillance. Efus notes that technologies are increasingly part of our daily urban life, and are thus an important working theme for the organisation: “Technologies offer a wide range of opportunities, but they also pose new threats. For instance, they provide new security tools, but these raise questions of efficiency and ethics. Through the internet and social networks, they create new, virtual public spaces and new forms of life in society, which create new opportunities for social cohesion and democratic participation, but also new forms of control and crime. They provide us with an increasingly sophisticated and high-performing infrastructure, but it comes with new vulnerabilities and risks.” Efus explains that the project is an opportunity for members of the Forum to obtain direct access to the state of the art of research in this field, but also to give feedback on their views on, wishes for, and experiences with these technologies. The working group will accompany the SURVEILLE project, providing input and feedback. Its members will be invited to participate in the project’s annual Forum for European decision makers, which will take place each September in Brussels. In addition, the working group will gather at least once a year for a one-day meeting; travel and subsistence costs will be paid. Other meetings will also take place in connection with events organised by Efus, such as its yearly General Assembly. Moreover, working group members will have the opportunity to participate in thematic meetings of the SURVEILLE project.
If you are interested in participating in this working group, please contact Sebastian Sperber at Efus: [email protected]. For further information, see the announcement on the Efus website.

UK's Metropolitan Police to extract suspects' mobile data on a wider basis

Recent news that the Metropolitan Police has put in place a system to extract mobile phone data from suspects held in custody has set alarm bells ringing in certain quarters. Privacy International expressed concern over the Metropolitan Police’s use of the Radio Tactics ACESO mobile phone data extraction system. “We are looking at a possible breach of human rights law,” spokeswoman Emma Draper told the BBC. The Register notes that the Radio Tactics device is basically a Windows 7 PC with forensics software installed and a touch interface, complete with step-by-step instructions on where to plug each cable into the mobile device. The move to a standardised system is anticipated to save the Metropolitan Police time and resources, with The Register noting: “Different police forces use different companies, but it’s an expensive and time-consuming process and as the number of smartphones increases, the police would like to be able to get at more data more easily and in less time.” It will be interesting to see how the terminals are used once they have been rolled out, and whether the planned training for police officers using the technology will prove sufficient to ensure that those whose mobile phones are subject to forensic examination are duly accorded the protections afforded under existing legislation. Article 8 of the UK’s Human Rights Act, for example, guarantees individuals the right to privacy in their communications, except where a public authority, such as the police, believes it necessary to interfere with that right “in the interests of national security, public safety or the economic well-being of the country, for the prevention of disorder or crime, for the protection of health or morals, or for the protection of the rights and freedoms of others”. A further concern is whether long-term storage of the data retrieved from these devices will remain secure.
Privacy International is warning that the technology, while currently limited to terminals in police stations, may eventually be used on the street: “Examining suspects’ mobile phones after they are arrested is one thing, but if this technology was to be taken out onto the streets and used in stop-and-searches, that would be a significant and disturbing expansion of police powers,” said Privacy International spokeswoman Emma Draper.

Article 29 WP highlights dangers of facial recognition technologies

At the end of last month the Article 29 Working Party (WP) adopted a new opinion (PDF) on developments in biometric technologies. Almost ten years ago the WP wrote its first report on this matter, but since then the availability, cost, effectiveness and use of these technologies have “dramatically changed”, according to the body of national data protection commissioners. One of the trends justifying a new report is the perceived move from biometric systems that allow the identification of persons to systems that detect the behaviour or specific needs of people, including through the (covert) remote collection of body traits. Such use results in “a serious threat for privacy and a lack of control over personal data”, as it has “serious consequences on the capacity of people to exercise free consent or simply get information about the processing”. The Article 29 WP develops this further in relation to the use of new facial recognition technologies. Photographs found online (for instance on social media) may not be further processed “in order to extract biometric templates” or “to recognise the persons on the pictures automatically” without a specific legal basis (e.g. consent) for this new purpose. If facial recognition tools were employed on a large scale, the Article 29 WP warns, their ability to capture biometric data without a person’s knowledge would “terminate anonymity in public spaces and allow consistent tracking of individuals”.

As a general rule, the WP states that the use of biometrics “for general security requirements of property and individuals” cannot be regarded as a legitimate interest overriding the interests or fundamental rights and freedoms of the data subject. “On the contrary, the processing of biometric data can only be justified as a required tool securing the property and/or individuals, where there is evidence, on the basis of objective and documented circumstances, of the concrete existence of a considerable risk. To that end the controller needs to prove that specific circumstances pose a concrete, considerable risk, which the controller is required to assess with special care. In order to comply with the proportionality principle, the controller, in presence of these high risk situations, is obliged to verify if possible alternative measures could be equally effective but less intrusive in relation to the aims pursued and choose such alternatives.” The opinion further addresses specific biometric systems such as vein pattern recognition technology, fingerprint databases, voice recognition systems and DNA.