On 29 and 30 October 2014, the SURVEILLE project will organise, together with the IRISS and RESPECT projects, an Annual Forum for Decision Makers entitled “Democracy and Security”. The event will take place at the Diamant Brussels Conference & Business Centre (Bvd A. Reyerslaan 80, 1030 Brussels, Belgium).
It will be a significant moment for the three projects, which will present the culmination of months of work on the use of surveillance technologies for security purposes and its impact on our democratic societies.
Further information will be provided soon on a dedicated event website.
On June 8th the European Data Protection Supervisor (EDPS) published an opinion commenting on the Commission’s ‘Recommendation on preparations for the roll-out of smart metering systems’ (issued March 9th), and on smart energy metering systems in general. Provided that the economic assessment to be carried out by Member States by the end of the summer gives favourable results, the roll-out for the electricity and gas markets should take place by 2020. The EDPS warned that the introduction of smart meters is likely to pose serious threats to privacy, in terms of the right to respect for family life and the home, data protection, and the security of citizens. It could amount to unwarranted surveillance if appropriate safeguards are not adopted.
Like traditional gas and electricity meters, the new devices would be installed in all households. What makes them ‘smart’ is that they enable the automatic transmission of consumption data from each household to energy suppliers. The reading, recording and transmission of such data would occur regularly – and could take place as frequently as every fifteen minutes. Smart meters will pave the way for a dynamic, ‘demand and response’ pricing system whereby energy consumed at peak times is more expensive than energy consumed off-peak, and tariffs could even vary from customer to customer. As such, smart meters are a precondition for modernising energy supply chains and delivering ‘smart grids’, which are expected to provide considerable economic benefits.
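The ‘demand and response’ mechanism described above can be illustrated with a minimal sketch. The tariff figures, the single evening peak window and the sample readings are purely illustrative assumptions, not values taken from the Commission’s recommendation:

```python
from datetime import time

# Hypothetical time-of-use tariffs in EUR per kWh -- illustrative only.
PEAK_RATE = 0.30      # applied during the assumed peak window
OFF_PEAK_RATE = 0.12  # applied at all other times

def is_peak(t: time) -> bool:
    """Assume a single evening peak window (17:00-20:00) for this sketch."""
    return time(17, 0) <= t < time(20, 0)

def bill(readings):
    """readings: list of (interval start time, kWh used in that 15-min interval)."""
    return sum(kwh * (PEAK_RATE if is_peak(t) else OFF_PEAK_RATE)
               for t, kwh in readings)

# A full day at 15-minute granularity would be 96 readings; three shown here.
sample = [(time(8, 15), 0.4), (time(18, 0), 0.9), (time(23, 30), 0.2)]
print(round(bill(sample), 3))  # -> 0.342
```

The point of the sketch is simply that billing under such a scheme requires the supplier to hold time-stamped, interval-level consumption data – exactly the fine-grained records whose privacy implications the EDPS examines below.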
The collection of such fine-grained information on consumption, though, could allow for the extraction of personal information, which could impinge upon the privacy of the members of EU households. The EDPS notes, for instance, that “by analysing detailed electricity usage data it may be possible in the future to infer or predict – also on a basis of deductions about the way in which electronic tools work – when members of a household are away on holiday or at work, when they sleep and awake, whether they watch television or use certain tools or devices, or entertain guests in their free time, how often they do their laundry, if someone uses a specific medical device or a baby monitor, whether a kidney problem has suddenly appeared or developed over time, if anyone suffers from insomnia, or indeed whether individuals sleep in the same room.” (p. 5).
Over time, the collection of such massive amounts of information can amount to tracking and can reveal very detailed behavioural patterns, or profiles, which could prove of benefit both to businesses (for targeted advertising and value-added services) and to law enforcement agencies. Moreover, if the data were not properly secured, criminals could hack into the servers of energy suppliers to obtain information on individuals – for instance, in order to plan a burglary.
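The kind of inference the EDPS warns about can be sketched in a few lines: given interval readings, stretches of near-baseline consumption suggest nobody is home. The threshold and the sample data are invented for illustration; real inference systems would be far more sophisticated, which only sharpens the concern:

```python
# Toy illustration of the inference risk: flag hours in which every
# 15-minute reading sits near the household's baseline load (fridge plus
# standby devices), suggesting the occupants are away.
AWAY_THRESHOLD_KWH = 0.05  # invented baseline figure for this sketch

def likely_away_hours(readings):
    """readings: dict mapping hour of day (0-23) -> four 15-minute kWh values."""
    return sorted(h for h, vals in readings.items()
                  if all(v <= AWAY_THRESHOLD_KWH for v in vals))

day = {
    8:  [0.30, 0.25, 0.10, 0.04],   # breakfast, then leaving the house
    10: [0.04, 0.03, 0.04, 0.04],   # house empty
    11: [0.03, 0.04, 0.05, 0.04],   # still empty
    19: [0.60, 0.80, 0.75, 0.50],   # cooking and television
}
print(likely_away_hours(day))  # -> [10, 11]
```

Even this crude heuristic recovers occupancy from nothing but meter readings – a small taste of the profiling and burglary-targeting risks described above.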
The EDPS commented extensively on the content of the recommendation. Whilst the recommendation incorporates new concepts such as privacy by design, privacy impact assessments (PIAs) and notification of data breaches, the EDPS highlighted a number of shortcomings, such as the omission of basic principles of, and practical guidance on, data protection. He then suggested introducing specific guidance and a clear methodology into the Template to be prepared by the Commission for the voluntary impact assessments to be carried out by Member States, and proposed assessing additional legislative measures at EU level to guarantee homogeneity of applicable laws and data protection standards.
Data Protection and Privacy
Data protection and privacy (insofar as they overlap) are concrete fundamental rights, whose enjoyment depends on data controllers and processors actively providing all the necessary conditions. This certainly includes the duty to inform data subjects about how their data are used, processed, and stored. After all, rights are worth very little if their bearers are not aware of them, or of how to enforce them.
In this light, the recent outrage concerning Google Street View’s surreptitious collection of personal data such as email addresses, passwords and IP addresses is easily explained. The related report published at the end of April demonstrates that the collection of such data by Google’s street-mapping vehicles was not the decision of a single employee acting on his own initiative. Rather, it was a well-orchestrated programme of which many people inside the company were aware. Consequently, EU prosecutors are planning to resume actions against the company.
In the online world, by contrast, privacy and data protection policies are an important – and sometimes the only – means toward this end, provided they are succinct, complete, and written in layman’s language. Appeals for clearer information began some years ago and have recently been restated in programmatic speeches on data protection reform, such as Viviane Reding’s.
Unfortunately, this is often forgotten. Privacy policies are either used as a tick-box exercise or are nonexistent – even among those who preach the importance of enhancing human rights. Indeed, we reviewed 20 related projects under the SEC and SSH calls. Of the 18 fully operational project websites, only a few have a legal notice, and the overwhelming majority do not provide a data protection policy – even among those whose research focus is privacy.
Click on the link below to read the SURVEILLE FP7 project’s data protection policy. Comments and suggestions are most welcome.
Recent news that the Metropolitan Police has put in place a system to extract mobile phone data from suspects held in custody has sent alarm bells ringing in certain quarters.
Privacy International expressed concern over the Metropolitan Police’s use of the Radio Tactics ACESO mobile phone data extraction system. “We are looking at a possible breach of human rights law,” spokeswoman Emma Draper told the BBC.
The Register notes that the Radio Tactics device is basically a Windows 7 PC with forensics software installed, and a touch interface complete with step-by-step instructions on where to plug each cable into the mobile device. The move to a standardised system is expected to save the Metropolitan Police time and resources, with The Register noting: “Different police forces use different companies, but it’s an expensive and time-consuming process and as the number of smartphones increases, the police would like to be able to get at more data more easily and in less time.”
It will be interesting to see how the terminals are used once they have been rolled out, and whether the planned training for police officers will prove sufficient to ensure that those whose mobile phones are subject to forensic examination are duly accorded the protections afforded under existing legislation. Article 8 of the UK’s Human Rights Act, for example, guarantees individuals the right to respect for their private life and correspondence, except where interference by a public authority, such as the police, is necessary “in the interests of national security, public safety or the economic well-being of the country, for the prevention of disorder or crime, for the protection of health or morals, or for the protection of the rights and freedoms of others”.
A further concern is whether the data retrieved from these devices will remain secure in long-term storage.
Privacy International is warning that the technology, while currently limited to terminals in police stations, may eventually be used on the street:
“Examining suspects’ mobile phones after they are arrested is one thing, but if this technology was to be taken out onto the streets and used in stop-and-searches, that would be a significant and disturbing expansion of police powers,” said Privacy International spokeswoman Emma Draper.
At the end of last month the Article 29 Working Party (WP) adopted a new opinion (PDF) on developments in biometric technologies. Almost ten years ago the WP wrote its first report on this matter, but since then the availability, cost, effectiveness and use of these technologies have “dramatically changed”, according to the body of national data protection commissioners.
One of the trends that justified a new report was the perceived move from developing biometric systems that allow the identification of persons to systems that detect behaviour or specific needs of people, including through the (covert) use of remote collection of body traits. Such use results in “a serious threat for privacy and a leak of control over personal data”, as it has “serious consequences on the capacity of people to exercise free consent or simply get information about the processing”.
The Article 29 WP develops this further in relation to the use of new facial recognition technologies. Photographs found online (for instance on social media) may not be further processed “in order to extract biometric templates” or “to recognise the persons on the pictures automatically” without a specific legal basis (e.g. consent) for this new purpose. If facial recognition tools were employed on a large scale, the Article 29 WP warns, their ability to capture biometric data without a person’s knowledge would “terminate anonymity in public spaces and allow consistent tracking of individuals”. As a general rule, the WP states that the use of biometrics “for general security requirements of property and individuals” cannot be regarded as a legitimate interest overriding the interests or fundamental rights and freedoms of the data subject.
“On the contrary, the processing of biometric data can only be justified as a required tool securing the property and/or individuals, where there is evidence, on the basis of objective and documented circumstances, of the concrete existence of a considerable risk. To that end the controller needs to prove that specific circumstances pose a concrete, considerable risk, which the controller is required to assess with special care. In order to comply with the proportionality principle, the controller, in presence of these high risk situations, is obliged to verify if possible alternative measures could be equally effective but less intrusive in relation to the aims pursued and choose such alternatives.”
The opinion further addresses specific biometric systems such as vein pattern recognition technology, fingerprint databases, voice recognition systems and DNA.