In July 2014, the International Organization for Standardization (“ISO”) and the International Electrotechnical Commission (“IEC”) published ISO/IEC 27018, a code of practice that sets forth standards and guidelines for the protection of “personally identifiable information” processed by public cloud service providers.
ISO/IEC 27018 is the first International Standard that focuses on protection of personal data in the cloud. Although only a few months old, the new standard should finally give cloud users confidence that their service provider is well-placed to keep data private and secure.
ISO/IEC 27018 specifies certain minimum types of security measures that cloud providers should adopt, if applicable, including encryption and access controls. The cloud standard also requires cloud providers to implement security awareness policies and make relevant staff aware of the potential consequences (for staff, the cloud provider and the customer) of breaching privacy and security rules.
As the first-ever standard dealing with the protection of personal data in the cloud, ISO/IEC 27018 has two key objectives: it provides a practical basis for instilling confidence in the cloud industry, and it gives public cloud providers clear guidance for meeting some of the legal and regulatory concerns of their clients.
ISO/IEC 27018:2014 establishes commonly accepted control objectives, controls and guidelines for implementing measures to protect “personally identifiable information” in accordance with the privacy principles in ISO/IEC 29100 for the public cloud computing environment.
In particular, ISO/IEC 27018:2014 specifies guidelines based on ISO/IEC 27002, taking into consideration the regulatory requirements for the protection of “personally identifiable information” which might be applicable within the context of the information security risk environment(s) of a provider of public cloud services.
ISO/IEC 27018:2014 is applicable to all types and sizes of organizations, including public and private companies, government entities, and not-for-profit organizations, which provide information processing services as “personally identifiable information” processors via cloud computing under contract to other organizations.
The guidelines in ISO/IEC 27018:2014 might also be relevant to organizations acting as “personally identifiable information” controllers; however, such controllers can be subject to additional data protection legislation, regulations and obligations that do not apply to “personally identifiable information” processors. ISO/IEC 27018:2014 is not intended to cover such additional obligations.
As a guiding principle, the standards and guidelines of ISO/IEC 27018 help the cloud service customer retain authority to determine the scope of any use and handling of its “personally identifiable information”. ISO/IEC 27018 also sets forth controls and implementation guidelines that are generally applicable to cloud service providers processing “personally identifiable information”.
The Safe Harbor framework, in place since the early years of the tech boom in the late 1990s, allows US companies to satisfy EU rules by signing up to a self-certification scheme supervised by the US Federal Trade Commission. It is based on the principle that US data privacy standards are equivalent to those in Europe.
Viviane Reding, the European Commissioner overseeing data protection, said that her office had begun an assessment of the Safe Harbor framework used by Google and Facebook, as well as by thousands of smaller US tech companies.
The Safe Harbor agreement between the EU and US is under review, as it may provide a “loophole” allowing data transfers to take place at a lower standard of data protection than EU law permits, the European Commission has said.
The European Commission has therefore launched a review of the U.S. Department of Commerce’s Safe Harbor framework. The review comes in the wake of PRISM, the US National Security Agency’s data collection program. Safe Harbor is a voluntary program that lets U.S.-based companies with operations in the EU transfer personal data out of the EU.
The EU argues that the Safe Harbor program may contain “loopholes” that let companies skirt EU data privacy rules. The International Trade Administration (ITA) acknowledges the “criticisms” but disagrees, saying that the program operates within its framework. Safe Harbor is based on the EU Data Protection Directive and, as the ITA notes, is limited where national security or defense matters are in question.
EU officials would like to review Safe Harbor for compatibility with the new EU laws on data protection. While the U.S. is open to discussions on Safe Harbor, it is unlikely to accept tighter restrictions on the program.
At issue is the reach of the draft EU legislation. It would require non-European companies to comply with EU laws in full when serving European customers – something that US officials argue is extraterritorial. It would also allow Brussels to fine companies that did not comply up to 2 per cent of their total annual turnover.
In his Opinion in Case C-131/12, Google Spain SL, Google Inc. v Agencia Española de Protección de Datos and Mario Costeja González, the Advocate General considers that search engine service providers are not responsible, on the basis of the Data Protection Directive, for personal data appearing on the web pages they process.
In early 1998, a newspaper widely circulated in Spain published in its printed edition two announcements concerning a real-estate auction connected with attachment proceedings prompted by social security debts. A person was mentioned as the owner. At a later date an electronic version of the newspaper was made available online by its publisher.
In November 2009 this person contacted the publisher of the newspaper asserting that, when his name and surnames were entered in the Google search engine, a reference appeared linking to pages of the newspaper with these announcements. He argued that the proceedings had been concluded and resolved many years earlier and were now of no relevance. The publisher replied that erasure of his data was not appropriate, given that the publication was effected by order of the Spanish Ministry of Labour and Social Affairs.
In February 2010, he contacted Google Spain and requested that the search results show no links to the newspaper when his name and surnames were entered into Google search engine. Google Spain forwarded the request to Google Inc., whose registered office is in California, United States, taking the view that the latter was the undertaking providing the internet search service.
Thereafter he lodged a complaint with the Agencia Española de Protección de Datos (Spanish Data Protection Agency, AEPD) against the publisher and Google. By a decision of 30 July 2010, the Director of the AEPD upheld the complaint against Google Spain and Google Inc., calling on them to withdraw the data from their index and to render future access to them impossible. The complaint against the publisher was rejected, however, because publication of the data in the press was legally justified. Google Inc. and Google Spain brought two appeals before the Audiencia Nacional (National High Court, Spain), seeking annulment of the AEPD decision. In this context, the Spanish court referred a series of questions to the Court of Justice.
In his Opinion, Advocate General Niilo Jääskinen addresses first the question of the territorial scope of the application of national data protection legislation. The primary factor that gives rise to its application is the processing of personal data carried out in the context of the activities of an establishment of the controller (according to the Data Protection Directive, the “controller” is the person or body which alone or jointly with others determines the purposes and means of the processing of personal data) on the territory of the Member State. However, Google claims that no processing of personal data relating to its search engine takes place in Spain: Google Spain acts merely as a commercial representative of Google for its advertising functions, and in this capacity has taken responsibility for the processing of personal data relating to its Spanish advertising customers.
The Advocate General considers that this question should be examined taking into account the business model of internet search engine providers. This normally relies on keyword advertising which is the source of income and the reason for the provision of a free information location tool. The entity in charge of keyword advertising is linked to the internet search engine. This entity needs a presence on national advertising markets and that is why Google has established subsidiaries in many Member States. Hence, in his view, it must be considered that an establishment processes personal data if it is linked to a service involved in selling targeted advertising to inhabitants of a Member State, even if the technical data processing operations are situated in other Member States or third countries. Therefore, Mr Jääskinen proposes that the Court declare that processing of personal data takes place within the context of a controller’s establishment and, therefore, that national data protection legislation is applicable to a search engine provider when it sets up in a Member State, for the promotion and sale of advertising space on the search engine, an office which orientates its activity towards the inhabitants of that State.
Secondly, as for the legal position of Google as an internet search engine provider, Mr Jääskinen recalls that, when the Directive was adopted in 1995, the Internet and search engines were new phenomena and their current development was not foreseen by the Community legislator. He takes the view that Google is not generally to be considered as a “controller” of the personal data appearing on web pages it processes, who, according to the Directive, would be responsible for compliance with data protection rules. In effect, provision of an information location tool does not imply any control over the content included on third party web pages. It does not even enable the internet search engine provider to distinguish between personal data in the sense of the Directive, which relates to an identifiable living natural person, and other data. In his opinion, the internet search engine provider cannot in law or in fact fulfil the obligations of the controller provided in the Directive in relation to personal data on source web pages hosted on third party servers.
Therefore, a national data protection authority cannot require an internet search engine service provider to withdraw information from its index except in cases where this service provider has not complied with the exclusion codes or where a request emanating from a website regarding an update of cache memory has not been complied with. This scenario does not seem pertinent in the present case. A possible “notice and take down procedure” concerning links to source web pages with illegal or inappropriate content is a matter for national civil liability law based on grounds other than data protection.
Thirdly, the Directive does not establish a general “right to be forgotten”. Such a right cannot therefore be invoked against search engine service providers on the basis of the Directive, even when it is interpreted in accordance with the Charter of Fundamental Rights of the European Union (in particular, the rights of respect for private and family life under Article 7 and protection of personal data under Article 8 versus freedom of expression and information under Article 11 and freedom to conduct a business under Article 16).
The rights to rectification, erasure and blocking of data provided in the Directive concern data whose processing does not comply with the provisions of the Directive, in particular because of the incomplete or inaccurate nature of the data. This does not seem to be the case in the current proceedings.
The Directive also grants any person the right to object at any time, on compelling legitimate grounds relating to his particular situation, to the processing of data relating to him, save as otherwise provided by national legislation. However, the Advocate General considers that a subjective preference alone does not amount to a compelling legitimate ground and thus the Directive does not entitle a person to restrict or terminate dissemination of personal data that he considers to be harmful or contrary to his interests.
It is possible that the secondary liability of the search engine service providers under national law may lead to duties amounting to blocking access to third party websites with illegal content such as web pages infringing intellectual property rights or displaying libellous or criminal information. In contrast, requesting search engine service providers to suppress legitimate and legal information that has entered the public domain would entail an interference with the freedom of expression of the publisher of the web page. In his view, it would amount to censorship of his published content by a private party.