December GDPR Updates: Right to be Forgotten, Access Requests, Facial Recognition and More

EDPB guidelines on the right to be forgotten in search engines. The European Data Protection Board (“EDPB”) released for public comments its draft guidelines on the right to be forgotten under the General Data Protection Regulation (“GDPR”), as it applies to online search engines. 

The draft distinguishes between delisting requests and full erasure requests. A delisting request seeks to delete the link between a particular search term and the resulting content relating to the data subject; different search terms may still lead to the same content. A full erasure request, on the other hand, seeks to delete every possible link to the content, regardless of the search term queried. 

The draft also clarifies that the GDPR does not require search engine providers to notify the third-party publisher of the content to which the searched keyword leads about the data subject’s delisting request. 
The draft further explains that, unlike the Data Protection Directive that preceded the GDPR, which required the data subject to base his or her request “on compelling legitimate grounds relating to his [or her] particular situation”, the GDPR shifts the burden of proof to the controller (the search engine) to demonstrate “compelling legitimate grounds for the processing”. 

CLICK HERE to read the draft EDPB guidelines. 

ICO draft guidance on data subjects’ right of access. The UK Information Commissioner’s Office (“ICO”) published its draft guidance on data subjects’ right of access under the GDPR. This right allows data subjects to request a copy of the personal data a controller processes about them, as well as further information about how the controller processes their personal data and shares it with third parties. 
The ICO draft guidance recommends that organizations properly prepare for such access requests by, among other things: 

  • Explaining to individuals how they can request access; 
  • Training staff to recognize data subjects’ requests under the GDPR;  
  • Maintaining information asset registers which document where and how personal data is stored, to speed up the process of locating the information in response to a request; 
  • Having documented retention and deletion policies for the personal data the organization processes. This helps to ensure that the organization does not keep information longer than necessary and can potentially reduce the amount of information it will need to review when responding to a data subject request. 

The guidance also discusses instances where a controller may refuse a request for access or limit the scope of the disclosure. For example, a controller may do so where the request would entail disclosure of information about the controller’s management forecasting, planning, or other business activity, and such disclosure is likely to prejudice that business or activity; or where the request covers information protected by the attorney-client privilege. 

CLICK HERE to read the ICO guidance. 

German privacy regulator imposes a $10 million fine on telecom company for data security violations. Germany’s Federal Commissioner for Data Protection and Freedom of Information (“BFDI”) imposed a fine of nearly $10 million on 1&1 Telecommunications, one of the largest mobile and internet providers in Germany, for failing to implement sufficient technical and organizational measures to protect customer data in its call center environments. 

The BFDI began investigating the matter after a data subject filed a complaint alleging that the company allowed callers to access extensive customer information simply by providing a customer’s name and date of birth. The BFDI found this to be an insufficient mode of authentication for protecting customer data, in violation of Article 32 of the GDPR, which obligates a controller to take appropriate data security measures.

The BFDI stated that the penalty was justified because the data security risk spanned the company’s entire customer base. However, the company’s cooperation and transparency were taken into consideration, resulting in what the BFDI regards as a relatively low fine. 

CLICK HERE to read the BFDI’s press release on the matter (in German). 

CNIL opinion on challenges of facial recognition technology. The French data protection authority (“CNIL”) has published an opinion on facial recognition technology and the challenges it raises. 

Facial recognition is a biometric technique for the automated recognition of a person based on the characteristics of his or her face. The CNIL states that most of the risks facial recognition entails stem from the biometric nature of the data it processes. Because facial recognition relies on probability rather than certainty, errors or variations in performance can have far-reaching consequences for individuals. Moreover, the technology can process such data without the data subject even being aware that it is being collected, making it a highly intrusive tool that erodes people’s anonymity in public spaces. 

The CNIL finds it crucial to prioritize the security of such biometric data, for example by storing it on a personal device belonging to, and accessible by, the user rather than in a central database. 

The CNIL also gives examples of uses of the technology it deems acceptable, such as filtering access to a carnival (on an experimental basis), as well as examples of prohibited uses, such as school entry controls, whose purpose, in the CNIL’s opinion, can be achieved by less intrusive means. 

Finally, the CNIL calls for establishing a fully-fledged European model that implements the principles of respecting individuals’ rights by obtaining their consent wherever feasible and adopting a rigorous experimental methodology that will allow careful testing of different technical solutions. 

CLICK HERE to read the CNIL opinion on facial recognition technology. 

Advocate General’s opinion on the validity of the EU standard contractual clauses. The Advocate General for the Court of Justice of the European Union (“CJEU”) published an opinion affirming the validity of the standard contractual clauses mechanism for data transfers from the EU to third countries under the GDPR. 

The opinion was issued in connection with a case brought by Maximillian Schrems, an Austrian Facebook user, attorney, and privacy activist, who complained that Facebook unlawfully transfers the personal data of its European users to the United States for processing. 

The GDPR allows personal data to be transferred to a country outside the European Union if that country ensures an adequate level of protection of the data, or through alternative transfer mechanisms such as the standard contractual clauses promulgated by the European Commission in 2010.

In the opinion, the Advocate General concluded that the standard contractual clauses impose a duty on controllers and supervisory authorities to suspend or prohibit a transfer where they have reason to believe that a conflict arises between the privacy obligations under the standard clauses and the laws of the destination country, such that the clauses cannot properly be complied with. In the Advocate General’s view, this duty provides a sufficient guarantee that transfers based on the clauses will be suspended or prohibited where the clauses are breached or impossible to honor, and therefore renders the standard clauses a valid transfer mechanism.

The Advocate General’s opinion does not bind the CJEU, which is expected to issue its judgment on the matter in 2020.

CLICK HERE to read the CJEU Advocate General’s opinion.