By Rebecca Morgan, Jade Kowalski & Charlotte Burke


Published 04 October 2024

Overview

On 12 September 2024, Ireland's data protection regulator, the Data Protection Commission ("DPC"), announced that it had commenced a cross-border statutory inquiry into Google Ireland Limited ("Google") arising from Google's processing of user personal data in the development of its AI model, Pathways Language Model 2 (PaLM 2).

The inquiry is set to focus on whether Google complied with any obligations it may have had to carry out a Data Protection Impact Assessment ("DPIA") prior to commencing the processing in question. In particular, the DPC has confirmed that it will consider whether the rights and freedoms of the affected data subjects were adequately considered and protected before training of PaLM 2 commenced.

This inquiry demonstrates a growing regulatory trend to focus on the proper completion of the DPIA process itself, as well as the legality of the underlying processing it considers. In this article we look back at the engagement between the UK Information Commissioner's Office ("ICO") and Snap Inc ("Snap") on a very similar topic, set out practical tips and consider what this all means for your DPIAs.

The reader will recall that, back in the autumn of 2023, the ICO published a Preliminary Enforcement Notice ("PEN") in relation to Snap's MyAI chatbot on the grounds that Snap had not met its obligations in relation to carrying out a DPIA (under Article 35 UK GDPR) or consulting with the ICO prior to the commencement of high risk processing (under Article 36 UK GDPR). In June 2024, the ICO published its final decision, which concluded that Snap had in fact complied with its obligations under Article 35 and that there was no breach of Article 36. The final decision is a must-read for practitioners dealing with DPIAs, both in relation to the deployment of AI and more broadly, and it illuminates the level of detail the ICO expects a DPIA to contain.

By way of reminder, MyAI is a chatbot that provides Snapchat users with the ability to raise queries via a conversational interface. It is founded on ChatGPT, which is a form of large language model (LLM). The MyAI chatbot was the first of its kind in a messaging platform, and was designed to answer questions, offer advice and converse with the Snapchat user. Over time and with more interaction, MyAI would become a more personalised experience.

 

Preliminary Enforcement Notice

Snap approached the ICO voluntarily about its MyAI service in March and April 2023. At that point, Snap had already released a trial version to a subset of premium users signed up to Snapchat+, and a wider release in the UK followed in April 2023, just before Snap and the ICO met to discuss it. Following the meeting, the ICO made informal requests for Snap to answer eight specific questions regarding generative AI, with a further request to share its DPIA. Snap answered the questions but did not share its DPIA. What followed was a formal ICO request, in the form of an Information Notice (dated 19 May 2023), requiring Snap to provide all versions of its DPIA relating to MyAI. After some to-ing and fro-ing, Snap did provide the ICO with full copies of all versions of its MyAI DPIA.

In June 2023, the ICO opened its investigation into MyAI, prompted by concerns about the adequacy of Snap's assessment of the data protection risks posed by the chatbot.

When the ICO issued its PEN (which we wrote about here), it did so on the basis that, although Snap had undertaken a DPIA (and had, in fact, produced four iterations), that DPIA did not meet the requirements of Article 35 UK GDPR. The ICO deemed it to be inadequate in its identification and assessment of the privacy risks to Snap's millions of users, a significant proportion of whom were teenagers. Furthermore, because the DPIA had concluded that the processing would result in a high risk to the rights and freedoms of those teenage users, the PEN provisionally found a breach of Article 36(1) due to Snap's failure to undertake prior consultation with the ICO before the processing commenced. The ICO's provisional conclusion was that Snap must cease processing until a revised DPIA was carried out.

The PEN itself effectively provided Snap with a checklist of requirements for bringing its DPIA up to scratch in the "fifth version". The PEN identified that Snap's existing DPIA had fallen short of UK GDPR requirements in that it failed to:

  • systematically describe the nature, scope and context of the processing operations performed in connection with My AI (Article 35(7)(a));
  • assess the necessity and proportionality of its processing activities, specifically in relation to the use of generative AI technology and how this changed the nature of the personal data processed and the processing operations performed by Snap (Article 35(7)(b));
  • assess the risks posed to the rights and freedoms of users of My AI, specifically in respect of the targeting of users aged 13-17 for marketing purposes, the processing of special category data on a large scale and the effect of the use of new technologies on users’ understanding of how their personal data is processed (Article 35(7)(c)); and
  • identify measures which Snap envisaged would address the risks resulting from the processing operations performed in connection with My AI, including a failure to set out measures designed to address the “compounded” risks posed to users aged 13-17 and the risk that the content of conversations with My AI could be intentionally or inadvertently tracked, along with the failure to explain the removal of Snap’s in-app warning against sharing confidential and sensitive information with My AI (Article 35(7)(d)).

Armed with the PEN and a clear checklist of where previous iterations of the DPIA had fallen short, Snap got to work on the fifth version. This was provided to the ICO, preceded by written representations, in November 2023, with a skeleton argument following in December. A few days later, the ICO held an oral hearing attended by ICO representatives, along with Snap executives and legal representatives. Follow-up questions were then raised by the ICO, which Snap responded to in February 2024.

If this sounds like a rather protracted and expensive way to undertake a DPIA, then there are certainly some key takeaways from the ICO's Final Decision Notice which organisations can use, alongside the ICO's more formal DPIA guidance, to provide a clear set of guidelines to follow when completing a DPIA.

 

The Decision Notice

The ICO's final Decision Notice is dated 21 May 2024. It was announced by the ICO that month, and the publication of the full 62-page notice followed in June. 

In his final Decision Notice, the Commissioner concluded that:

  • the DPIA dating from November 2023 (the Fifth DPIA) complied with Article 35 requirements and therefore there were no grounds for the Commissioner to issue an Enforcement Notice requiring Snap to take any steps to bring the processing into compliance with the UK GDPR; and
  • Snap did not infringe Article 36(1) by failing to consult the Commissioner prior to commencement of the processing in connection with MyAI.

It is worth highlighting that, throughout the decision, the Commissioner points to the ICO's DPIA Guidance and notes that, as guidance, it is not legally binding but is nonetheless "intended to assist controllers to comply with their obligations under Articles 35 and 36 UK GDPR, in indicating how the Commissioner interprets these Articles. Therefore, it is referred to throughout this document to support the explanation of the reasoning underpinning the Commissioner’s conclusions". Organisations operating as controllers would therefore be wise to heed the ICO's DPIA guidance, for two reasons: first, it is clearly going to be followed by the ICO if it has cause to consider your DPIA and whether it meets the requirements; and second, it is actually pretty helpful. This Decision Notice now provides further illumination, so do not depart from the guidance unless you have a clear and justified reason for doing so.

Also worth bearing in mind are the EDPB Guidelines on Data Protection Impact Assessment, which include guidance on the criteria for an acceptable DPIA, albeit less extensive than the ICO's 'How do we do a DPIA?' section.

 

The ICO's Non-Infringement Findings – Article 35

The Decision Notice comprehensively sets out the Commissioner's findings regarding how and why the Fifth DPIA did not contravene the Article 35 requirements. Principally, the Commissioner was satisfied that the Fifth DPIA provided substantially greater detail and analysis than its four predecessors, and in particular that Snap had addressed the four limbs specified in the PEN and as required under Article 35(7)(a)-(d).

 

Systematic description of the processing operations – Article 35(7)(a)

The Fifth DPIA was found specifically to meet this requirement as it "describes the nature of the processing performed by Snap and its processors in the course of using OpenAI’s GPT technology to generate My AI’s responses to user queries; considers the wider context in which the processing takes place, including the emergence of generative AI and public concerns regarding its use; outlines the scope of the processing carried out in connection with My AI by reference to specific user statistics; and sets out the purposes for which the processing is performed, which are defined as 'providing a personalised My AI experience, improving the service, delivering contextual advertisements and providing a safety and security-oriented feature.'"

To meet this limb, your DPIA should outline both how and why you plan to use the personal data for your envisaged processing, including details of the nature, scope, context and purposes of the processing (an illustrative way of structuring these elements is sketched after the list below):

  • Nature of the processing – this should include a description of how personal data is collected, used and shared. This might include a step-by-step breakdown of the processing operations; in Snap's case, its Fifth DPIA provided this for all entities operating in the chain, covering a list of the personal data required for different processing activities. The key is for this to be a granular description, rather than just a high-level overview. Notably, the Decision Notice highlighted that Snap had included a detailed explanation of the applicable retention periods in this section, which the Commissioner considered helped Snap to demonstrate that the Fifth DPIA contained a systematic description of the processing.
  • Scope of the processing – this should include the volume, variety and sensitivity of the personal data. It is especially important to be clear about any special category or criminal offence data that will or could be processed in connection with the envisaged processing. You should also make sure this covers any data that might not immediately come to mind, such as metadata, and recognise that where data subjects or users have the option to enter free text within a tool, that text could include special category and criminal offence data.
  • Context of the processing – this should include detail of the wider context in which the processing takes place. In the MyAI example, the Commissioner had provisionally concluded that Snap "had failed to consider issues of public concern relating to the use of generative AI and individuals' expectations in respect of geolocation data." The Decision Notice suggests that a DPIA should address any advances in technology that are relevant to the processing.
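
For teams that keep DPIAs as structured records rather than purely free-form documents, the elements above can be captured in a simple template. The sketch below is purely illustrative: the field names are our own assumptions and are not drawn from the ICO's guidance or from Snap's DPIA; it simply mirrors the nature, scope, context and purposes breakdown described in this section.

```python
from dataclasses import dataclass, field

@dataclass
class ProcessingDescription:
    """Illustrative record of the Article 35(7)(a) elements of a DPIA.

    All field names are hypothetical; they mirror the nature, scope,
    context and purposes headings discussed above.
    """
    # Nature: how personal data is collected, used and shared, step by step,
    # for each entity in the processing chain, plus retention periods.
    processing_steps: list = field(default_factory=list)
    data_categories: list = field(default_factory=list)
    retention_periods: dict = field(default_factory=dict)  # data category -> period

    # Scope: volume, variety and sensitivity of the data.
    estimated_data_subjects: int = 0
    includes_special_category_data: bool = False
    includes_criminal_offence_data: bool = False
    free_text_inputs_possible: bool = False  # free text may contain sensitive data

    # Context: the wider setting, e.g. new technology and public concern.
    context_notes: list = field(default_factory=list)

    # Purposes: why the processing is performed.
    purposes: list = field(default_factory=list)


# Minimal usage example; all values are invented for illustration only.
description = ProcessingDescription(
    processing_steps=["collect user query", "send prompt to LLM provider", "return response"],
    data_categories=["user queries", "device metadata"],
    retention_periods={"user queries": "30 days"},
    estimated_data_subjects=1_000_000,
    free_text_inputs_possible=True,
    context_notes=["generative AI is an emerging technology attracting public concern"],
    purposes=["personalised experience", "service improvement"],
)
```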

 

Necessity and proportionality – Article 35(7)(b)

This part of a DPIA should include an assessment of whether the processing will achieve the desired purpose and whether there are other reasonable means of achieving the same result. This will need to include, amongst other things, the lawful basis for the processing, how transparency information will be provided to individuals, measures taken to ensure that processors comply with their data protection obligations, and safeguards relating to international data transfers.

In the MyAI decision, Snap's Fifth DPIA successfully addressed this limb; in particular, the Commissioner noted that it "considers the extent to which the processing entailed on My AI will in real, concrete terms differ from the data processing that is entailed on traditional (non-GAIT-reliant) query-based online services…", and that the Fifth DPIA documented how the use of a new and innovative form of technology would affect the proportionality of the processing performed in connection with MyAI.

 

Assessment of risks – Article 35(7)(c)

This will require the DPIA to document the potential impact on individuals and any harm or damage which the processing may cause, with specific reference to any impact the processing may have on the rights and freedoms protected by the European Convention on Human Rights. It should include an objective assessment of the risks posed by the processing activities.

The Commissioner's provisional conclusion in relation to MyAI was that Snap's four original DPIAs did not contain an assessment of the risks to the rights and freedoms of data subjects in relation to three areas: (a) the targeting of 13-17 year olds for advertising purposes; (b) the large scale processing of special category data; and (c) the impact of the use of generative AI technology and associated risks especially in relation to teenage users.

In its Fifth DPIA, Snap had applied the structured risk matrix recommended in the ICO's DPIA guidance, assessing the likelihood of each risk occurring and the potential severity of that risk, designating an overall risk level, and setting out mitigatory measures and residual risk ratings.
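
To make that process concrete, the logic can be sketched in a few lines of code. This is a minimal illustration only: the likelihood and severity labels and the scoring bands below are our own simplifying assumptions rather than a reproduction of the ICO's published matrix, and the example risk entry is invented.

```python
from enum import IntEnum

class Likelihood(IntEnum):
    REMOTE = 1
    POSSIBLE = 2
    PROBABLE = 3

class Severity(IntEnum):
    MINIMAL = 1
    SIGNIFICANT = 2
    SEVERE = 3

def risk_level(likelihood: Likelihood, severity: Severity) -> str:
    """Combine likelihood and severity into an overall rating.

    The bands are assumed for illustration, not taken from the ICO matrix.
    """
    score = int(likelihood) * int(severity)
    if score >= 6:
        return "high"
    if score >= 3:
        return "medium"
    return "low"

# A risk entry records the initial rating, the planned mitigations and a
# residual rating reassessed after those mitigations have been applied
# (the reassessment step that Snap's earlier DPIAs failed to reflect).
risk = {
    "description": "chatbot responses expose sensitive data of teenage users",  # invented example
    "initial": risk_level(Likelihood.PROBABLE, Severity.SIGNIFICANT),   # -> "high"
    "mitigations": ["age-appropriate design measures", "content filtering"],
    "residual": risk_level(Likelihood.REMOTE, Severity.SIGNIFICANT),    # -> "low"
}
```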

 

Mitigatory measures – Article 35(7)(d)

The key here is to ensure that the mitigatory measures included in your DPIA are accurate and actually address the specific risk(s) you have identified.

For example, Snap had identified that particular risks were compounded in respect of its teenage users but had then failed to identify any additional child-specific measures to address the heightened level of risk posed to this particular group. The Fifth DPIA did identify child-specific measures to be implemented to directly address the risks posed to that group of users.

 

The ICO's Non-Infringement Findings – Article 36

The Decision Notice records that the Commissioner concluded that Snap did not infringe the prior consultation requirements regarding the processing of personal data in connection with MyAI. This was because Snap's preliminary DPIAs had erroneously recorded a high risk to the rights and freedoms of 13-17 year old users; that record did not reflect Snap's true assessment of the risks once mitigatory measures were taken into consideration.

This clearly points to ensuring that your DPIA is reviewed and proof-read to check for any errors, such as incorrectly recording the risks, or not adjusting a residual risk rating once mitigatory measures have been considered.

 

What does this all mean for your DPIAs?

The Snap Decision Notice is an interesting read – both in terms of the Commissioner's eventual findings but also the broader history of the matter. In conjunction with the ICO's DPIA guidance, it provides some clear takeaways that will help DPOs and their teams to get DPIAs right first time. Here are our thoughts on the key learnings from the decision:

  • Prepare your DPIA with the ICO in mind as the ultimate audience – Snap eventually had to share all four iterations of its DPIA which were carried out prior to the ICO issuing its PEN.
  • Make sure you have all relevant stakeholders lined up to assist with the DPIA. For example, in the context of a DPIA covering the development or deployment of an AI system, this will include input from stakeholders who can provide sufficient and understandable detail about the technology in question, and where that fits into the wider AI/technology landscape.
  • Be specific about the relevant retention periods for the personal data involved in the envisaged processing. In our experience, this is an area that clients frequently overlook, but the ICO has specifically called it out, so it should be given appropriate consideration when undertaking a DPIA. If your organisation doesn't have a detailed retention schedule, then you should take the opportunity to address that as part of your ongoing data protection compliance programme. We have often found that the issue of retention comes to the fore when a data breach is suffered, but it is clear that the ICO will also be thinking about it if it has reason to look at a DPIA.
  • Use the ICO's risk matrix, or if you are using your organisation's own risk methodology, ensure that it allows you to capture and document all of the key features of the ICO's version.
  • If data subjects are children, then your DPIA will need to include child-focused measures. The same would apply to other vulnerable data subject groups - mitigation measures will need to be carefully thought about and targeted at the particular cohort. This might involve some creative thinking, such as delivering transparency information in more dynamic ways.
  • Check, check and check again, especially the risk ratings; consider having them validated by peer review or in discussion with your DPO or legal counsel. Snap's error in not updating the residual risk rating was a simple one that was repeated and not spotted, leading to an unnecessary provisional finding by the ICO in relation to the prior consultation obligation.
  • If in doubt, include more detail. Try to avoid cross-referencing other material without explaining how it applies in the context of the processing.

Of course, a DPIA should be completed before the envisaged processing begins. However, your organisation might have undertaken DPIAs for particularly complex or novel processing that, with the benefit of hindsight and armed with this Decision Notice, it might have approached differently. Perhaps it is time to think about a remediation plan to update any inadequate DPIAs, especially if they involve personal data of children or other vulnerable data subjects, or large-scale special category or criminal offence data. In particular, the DPC announcement clearly indicates that, going forward, any processing of personal data to train or develop AI systems will be subject to close regulatory scrutiny, and organisations should carefully consider their obligation to prepare a DPIA before commencing any such processing.
