
ICO concludes its first investigation into GenAI use: Snap's 'My AI'


By Astrid Hardy & Hans Allnutt


Published 10 June 2024

Overview

On 21 May 2024, the Information Commissioner's Office (ICO) issued a press release warning organisations not to ignore data protection risks, in light of its findings in the Snap 'My AI' chatbot investigation. The ICO highlights the importance of protecting privacy rights in the context of Generative AI (GenAI). This comes after the ICO issued its first preliminary enforcement notice on 6 October 2023 against Snap Inc and Snap Group Limited (Snap) over their potential failure to properly assess the privacy risks posed by Snap's generative AI chatbot "My AI", particularly in respect of children.

GenAI chatbots advertised as "virtual friends" have become increasingly popular among children. While these "virtual friends" may seem harmless, there are significant concerns about the privacy risks they pose to children. A string of regulatory decisions in Europe last year addressed other GenAI chatbots, and it is a positive development that the UK is following suit. This is the ICO's first investigation into GenAI use, and we predict it will be the first of many.

What is Snap's 'My AI'?

Snap's 'My AI', launched in February 2023, is a 'virtual friend' chatbot that Snap built using ChatGPT (OpenAI). It acts as a virtual friend on Snapchat. In the early days, users quickly identified that the chatbot was responding in an unsafe and inappropriate manner, and red flags were raised with Snap. It was initially rolled out as a premium Snapchat+ feature but has since been made available for free; it appears at the top of a user's Friends feed and there is no option to delete it.

'My AI' has been trained to manipulate users into providing sensitive data by convincing them that they are in a real friendship. Some users (especially children) can easily forget that the chatbot is nothing more than a language model-based programme and not a human. It encourages users to share information in what feels like a safe space, prompting users who would not ordinarily share such sensitive information to do so. As a user interacts more with 'My AI', it becomes more personalised over time, learning more about the user and making them feel as though they are speaking to a real friend. The concerns raised centred mainly on the lack of a risk assessment and Snap's collection and processing of underage data subjects' data.

On Snapchat, users as young as 13 can sign up to the app without the need for parental consent. We understand that younger users manage to access Snapchat by lying about their age when joining, and that the safeguards in place are insufficient. Snapchat has over 363 million users globally, the majority of whom are under the age of 21. Snapchat has recently been criticised for failing to remove underage users from its platform: a report shared by Ofcom confirmed that Snapchat removed only 700 – 1,000 suspected underage accounts in the UK between March 2022 and March 2023. It is therefore understandable why the ICO launched an investigation.

The ICO's investigation

The ICO launched its investigation in June 2023 following concerns that Snap had "not met its legal obligation to adequately assess the data protection risks posed by the new chatbot". This potential failure was particularly significant in the context of 'My AI''s large user base, which includes children aged 13 to 17.

On 6 October 2023, the ICO issued a Preliminary Enforcement Notice to Snap, setting out the steps the Commissioner required Snap to take. The ICO has since worked alongside Snap and, in May this year, concluded its investigation into Snap's approach to assessing the data protection risks of 'My AI'.

On 21 May 2024, the ICO confirmed that its investigation had led Snap to take significant steps to carry out a more thorough review of the risks posed by 'My AI', and that Snap had demonstrated it had implemented appropriate mitigations. The ICO concluded that it is satisfied Snap's 'My AI' is now compliant, given the risk assessment Snap has undertaken.

Although the full decision has not yet been published, the ICO's Executive Director of Regulatory Risk has made it clear that, for GenAI tools, the regulator will continue to "monitor organisations’ risk assessments and use the full range of our enforcement powers – including fines – to protect the public from harm."

The ICO has increasingly warned of data protection issues arising from the use of GenAI, so it will be interesting to review the full decision once it is published. The ICO has also reminded organisations that they must innovate responsibly.

For now, the ICO has reiterated its priority of protecting children's privacy online in its April 2024 press release, which emphasised the work the ICO is doing in line with its 2021 Children's Code of Practice.

The ICO has also released its 2024-2025 priorities for protecting children's personal information online, with John Edwards, the UK Information Commissioner, confirming that:

"Children’s privacy must not be traded in the chase for profit. How companies design their online services and use children’s personal information have a significant impact on what young people see and experience in the digital world. I’m calling on social media and video-sharing platforms to assess and understand the potential data harms to children on their platforms, and to take steps to mitigate them."

Crystal ball gazing

The UK GDPR already requires organisations to conduct a data protection impact assessment (DPIA) before carrying out high-risk processing, a requirement which can be triggered by the use of children's data. The ICO's investigation into Snap's 'My AI' is an important reminder that organisations need to carefully assess privacy risks before launching new GenAI tools, especially when those tools are accessed by children.

The ICO has been clear that GenAI is a "key priority" and that its investigation into 'My AI' "should act as a warning shot for industry". We await the ICO's full decision in the coming weeks, so watch this space. As we predicted last year, 2024 will be a significant year for the regulation of GenAI tools more widely, with a particular focus on the protection of children.
