The most recent instalment in the ICO's consultation series on the application of the UK GDPR to the development of generative AI spotlights individual rights.
This fourth call for evidence considers the information rights that the UK data protection regime confers on individuals over their personal information, and how these apply to the training and fine-tuning of generative AI. The ICO is encouraging organisations developing AI to submit evidence demonstrating tested and verified methods for meeting these obligations.
Background
Individuals are granted various rights over their personal data, and organisations must enable them to exercise those rights. These include the rights:
- to be informed about whether their personal data is being processed;
- to access a copy of it;
- to have information about them rectified, if it is inaccurate;
- not to be subject to solely automated decisions that have legal or other significant effects on them; and
- in some circumstances, to request the deletion of their information and to restrict its use.
In the context of generative AI, these rights apply to personal data used in training data, fine-tuning data, AI model outputs, and user queries. Therefore, the call for evidence is focused on how organisations developing generative AI will enable people to exercise these rights, with issues of automated decision-making excluded from the scope of the consultation.
Although the ICO provides guidance on logging and responding to rights requests, generative AI poses specific challenges such as identifying individuals, preventing memorisation, and ensuring accuracy. The call for evidence seeks input on the methods and best practices that developers use to comply with their legal obligations.
ICO analysis on individual rights and generative AI
The right to be informed
The right to be informed is a prerequisite for individuals to exercise their other rights and to understand how their data is used. Development of generative AI platforms requires the use of various data sources for training and fine-tuning, such as web-scraped data, public datasets, or user-generated data.
Developers must inform individuals about the use of their data and their rights under Articles 13 and 14 of the UK GDPR, unless an exemption applies, such as the disproportionate effort exemption for web-scraped data. Even where an exemption is relied upon, developers must still protect individual rights appropriately, for example by publishing specific and accessible information on the personal data being used and explaining the purposes and lawful basis for doing so.
The ICO wants to establish what measures generative AI developers should take to safeguard individuals' rights and freedoms, such as privacy-enhancing technologies and pseudonymisation techniques.
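By way of a purely illustrative sketch (the ICO does not prescribe any particular technique, and the regex, key handling and output format below are assumptions), keyed pseudonymisation of identifiers in training text might look something like this:

```python
# Illustrative sketch only: keyed pseudonymisation of email addresses in text
# before it enters a training or fine-tuning corpus. Not an ICO-endorsed method.
import hashlib
import hmac
import re

EMAIL_RE = re.compile(r"[\w.+-]+@[\w-]+\.[\w.-]+")

def pseudonymise(text: str, key: bytes) -> str:
    """Replace each email address with a consistent keyed pseudonym."""
    def _replace(match: re.Match) -> str:
        digest = hmac.new(key, match.group(0).lower().encode(), hashlib.sha256)
        return f"<EMAIL_{digest.hexdigest()[:12]}>"
    return EMAIL_RE.sub(_replace, text)

if __name__ == "__main__":
    key = b"store-and-rotate-this-key-separately"
    sample = "Contact jane.doe@example.com for details."
    print(pseudonymise(sample, key))  # Contact <EMAIL_...> for details.
```

Because the pseudonym is keyed and deterministic, the same identifier maps to the same token across the corpus, while re-identification requires access to the separately held key.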
The right of access
The right of access allows individuals to request copies of the personal data held about them. Where a developer argues that it cannot respond because it is unable to identify the individual, this must be explained (and, where possible, demonstrated) to the individual in question; the individual may then choose to submit further information to facilitate identification.
The ICO expects developers to have clear, documented methods for responding to requests, irrespective of how the data is used.
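As a hypothetical illustration of what clear, documented request handling could involve in practice, a developer might keep a simple register of incoming requests together with their statutory deadlines. The field names, statuses and structure below are assumptions, not an ICO requirement:

```python
# Illustrative sketch only: a minimal in-memory register for logging rights
# requests and tracking response deadlines (one month, extendable to three
# months in total for complex requests).
import calendar
from dataclasses import dataclass, field
from datetime import date
from enum import Enum

class RequestType(Enum):
    ACCESS = "access"
    ERASURE = "erasure"
    RECTIFICATION = "rectification"
    RESTRICTION = "restriction"
    OBJECTION = "objection"

def add_months(start: date, months: int) -> date:
    """Same day-of-month `months` later, clamped to the month end."""
    month_index = start.month - 1 + months
    year, month = start.year + month_index // 12, month_index % 12 + 1
    day = min(start.day, calendar.monthrange(year, month)[1])
    return date(year, month, day)

@dataclass
class RightsRequest:
    requester_ref: str            # internal reference, not raw identity data
    request_type: RequestType
    received: date
    extended: bool = False        # complex requests may be extended
    notes: list[str] = field(default_factory=list)

    @property
    def due(self) -> date:
        # One month by default; up to three months in total if extended.
        return add_months(self.received, 3 if self.extended else 1)

if __name__ == "__main__":
    req = RightsRequest("REQ-0001", RequestType.ACCESS, date(2024, 5, 15))
    print(req.request_type.value, "due by", req.due)
```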
Rights to erasure, rectification, restriction of processing and objecting to processing
The rights to erasure, rectification, restriction and objection enable individuals to control their personal data in certain circumstances. As these requests must be addressed within a relatively short timescale (one month, extendable to three months in certain cases), the ICO is seeking input on how they are respected in practice. Applying these rights to generative AI models raises memorisation issues: models retain imprints of their training data, which can result in outputs containing sections of that data.
The ICO notes that developers use input and output filters to mitigate these risks, but as their efficacy is in question, the ICO asks developers to clarify how they implement these rights and how they prevent or detect memorisation.
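As a minimal sketch of how an output-side filter might detect verbatim memorisation, assuming a pre-built index of training-data snippets (the class, n-gram length and matching approach are hypothetical, not methods described by the ICO):

```python
# Illustrative sketch only: a naive output filter that flags model responses
# sharing long verbatim character spans with indexed training snippets.
# Real deployments would need scalable indexes and fuzzier matching.
def char_ngrams(text: str, n: int) -> set[str]:
    """All character n-grams of length n in a whitespace-normalised string."""
    norm = " ".join(text.lower().split())
    return {norm[i:i + n] for i in range(len(norm) - n + 1)}

class MemorisationFilter:
    def __init__(self, training_snippets: list[str], n: int = 50):
        self.n = n
        self.index: set[str] = set()
        for snippet in training_snippets:
            self.index |= char_ngrams(snippet, n)

    def flags(self, model_output: str) -> bool:
        """True if the output shares any n-character span with the index."""
        return not self.index.isdisjoint(char_ngrams(model_output, self.n))

if __name__ == "__main__":
    f = MemorisationFilter(["Jane Doe lives at 10 Example Street, London."], n=20)
    print(f.flags("Records show Jane Doe lives at 10 Example Street."))  # True
    print(f.flags("No personal data appears in this answer."))           # False
```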
The deployment stage
The ICO highlights that generative AI models are used by deployers and end-users, who may have different purposes and expectations from the developers. The responsibility for fulfilling individual rights lies with the controllers or joint controllers of the personal data, who must respect those rights throughout the AI lifecycle and supply chain. Controllership in the generative AI supply chain will form part of the fifth call for evidence in this series.
Next steps
As expected, the ICO concludes that organisations must enable individuals to exercise their rights over their personal data in generative AI models, and is seeking evidence on the methods being developed to help inform its own guidance. The consultation closes on 10 June 2024.
The ICO continues to provide regular updates on the development of its approach to AI, having recently published details of its strategic approach and of its collaboration with other regulators, whether through direct bilateral contact or through groups such as the Regulators and AI Working Group and the Digital Regulation Cooperation Forum (DRCF).