Key developments in data protection and privacy in 2024

By Charlotte Halford & Jade Kowalski

Published 05 December 2024

Overview

DAC Beachcroft has recently launched our Informed Insurance Predictions for 2025, our annual predictions for the insurance sector, which provide critical insights from our Data, Privacy and Cyber team on the pressing issues facing data protection and privacy practitioners in the coming year.

In order to understand the direction of travel for these issues, we must reflect on the significant developments of the past year, including regulatory and legislative changes and technological advancements like artificial intelligence (AI).

 

AI and its regulatory landscape

AI remains the most significant technological development facing companies, insurers, governments and regulators. The data protection and privacy implications of AI are vast. The past year clarified the regulatory environment for AI through key developments:

  • EU AI Act: The world's first comprehensive piece of legislation dealing with AI, the EU AI Act, entered into force on 1 August this year following years of political debate and negotiation. Our data and privacy experts provided a deep dive into the legislation, addressing how organisations would need to scope their exposure to it. A complex piece of legislation, the AI Act establishes a risk-based approach to the regulation of the entire life cycle of AI systems in the EU, setting out a legal framework for the development, placing on the market, putting into service and use of AI systems. The AI Act has implications for organisations and individuals both in the EU and more widely, often irrespective of jurisdiction. Our technology team also addressed how the AI Act deals with 'prohibited AI practices' as part of our DACB AI Explainer series. The European Data Protection Supervisor also published guidance providing practical advice to EU institutions and agencies on the processing of personal data in their use of generative AI, albeit without prejudice to the AI Act.
  • UK's Flexible Approach: The prescriptive nature of the legislation introduced in the EU offers a sharp contrast to the current position in the UK, where a non-statutory, flexible framework is in place. The previous government indicated a continued commitment to its pro-innovation approach earlier this year, but the General Election precipitated a sea change in this area. The first King's Speech of the new Labour government indicated that it is now considering measures comparable to the EU AI Act, expressing an intention to "establish the appropriate legislation to place requirements on those working to develop the most powerful artificial intelligence models." Developments are expected in 2025.

 

The ICO's approach to regulating AI

The UK's data protection regulator, the Information Commissioner's Office (ICO), has been particularly active, publishing details of its strategic approach to AI in April. The report highlighted how principles-based data protection law aligns with the proposed flexible AI framework in the UK which is built on five key AI principles. The report also noted the potential for the ICO to update its guidance in response to future AI-related legislation.

The first investigation by the ICO into a generative AI tool was concluded this year, resulting in a warning to organisations not to ignore data protection risk in relation to artificial intelligence tools. The Snap investigation concerned a potential failure to properly assess the privacy risks posed by Snap's generative AI chatbot "My AI", particularly in respect of children. The investigation resulted in Snap conducting a revised data protection impact assessment (DPIA) in relation to the data protection risks posed by "My AI", thus complying with the necessary UK GDPR requirements. More generally, the outcome reflects the growing regulatory trend to focus on the proper completion of the DPIA, which our team discussed in detail in October in the context of the Snap outcome. When considered alongside the ICO's DPIA guidance, it provided some clear takeaways that will help DPOs and their teams to get DPIAs right first time.

The ICO also launched a consultation series in January considering how data protection law should apply to the development and use of generative AI. We produced a series of articles commenting on the various chapters, covering lawful basis, purpose limitation, accuracy of training data, individual rights and the allocation of accountability. The consultation series closed in September, and we await the outcome with interest, particularly how any guidance offered by the ICO will interact with other requirements.

 

The wider regulatory picture

Looking to the wider regulatory environment, sector-specific regulators and industry bodies have taken steps to ensure that the implications of artificial intelligence are fully understood by organisations and that they are compliant with existing rules and regulation.

The Association of British Insurers issued guidance on using AI responsibly, and the Bank of England and Prudential Regulation Authority responded to government questions on the safe delivery of AI and machine learning. Finally, the Digital Regulation Cooperation Forum, comprising four UK regulators — the Competition and Markets Authority (CMA), the Office of Communications (Ofcom), the ICO and the Financial Conduct Authority (FCA) — published its 2024/25 Workplan, which confirmed that, among other projects, the group will work to improve regulatory coherence and business compliance in line with the government's AI framework.

In addition, both in the UK and the EU, regulators have been active in responding to social media companies' proposals to use public and non-public user data to train AI models. These proposals were the subject of extensive discussions and agreements, and our team has provided a thorough analysis of how privacy practitioners should address the question of how personal data can be compliantly processed to train AI systems.

 

The UK Data (Use and Access) Bill

In the UK, the retained version of the EU GDPR remains in place, implemented through the UK GDPR. At the start of 2024, efforts were still underway to pass the second iteration of the Data Protection and Digital Information (DPDI) Bill. However, as a result of the General Election, the DPDI Bill was abandoned. In October 2024, the new government laid the Data (Use and Access) Bill (DUAB) before Parliament in the House of Lords.

Among other measures, the DUAB will make targeted reforms to parts of the UK's existing and retained data protection framework. The DUAB has progressed through the House of Lords, recently reaching Committee stage, and early indications are that the DUAB will proceed to conclusion as early as Summer 2025, particularly as there is overlap with previous iterations, with some elements of the DPDI Bill reproduced in full. Our team recently reviewed the proposed measures within the DUAB and their implications.

Of particular importance in the consideration of the DUAB is that the EU's adequacy decisions in respect of UK transfers under the GDPR are due to expire in June 2025. The UK House of Lords European Affairs Committee launched an inquiry into UK-EU adequacy earlier this year, and this issue will inform discussions about the DUAB. During the second reading stage of the DUAB, Baroness Jones, representing the Government, expressly stated that the Government recognised the importance of retaining those adequacy decisions with the EU, and ministers are engaging with the European Commission on this issue. The decision around data adequacy will arguably be the key data protection issue moving into 2025.

 

Consent models and privacy challenges

In recent months, UK users visiting certain news and other websites will have encountered a choice upon accessing the sites. They can either consent to the use of their data for personalised advertising in exchange for free access, or opt for a subscription service that provides access without sharing personal data for advertising purposes. Those who decline both options are denied access to the website.

This approach, known as 'pay or ok' or 'consent or pay', has sparked significant debate and regulatory challenges in the EU throughout 2024. Unsurprisingly, Meta has found itself at the sharp end of regulatory interventions, given its reliance on data-driven business models. This controversy follows earlier challenges to Meta's stated legal bases for processing personal data for behavioural advertising, namely contractual necessity and legitimate interests. In response to these disputes, Meta introduced the 'pay or ok' model, signalling its intent to shift to a consent-based legal framework for processing personal data. As part of this change, Meta also proposed offering ad-free subscription tiers for users in the EU, EEA, and Switzerland.

In the early part of 2024, a number of EU data protection authorities sought clarification from the European Data Protection Board on the legality of these models, arguing that it represented "a huge fork in the road. Is data protection a fundamental right for everyone, or is it a luxury reserved for the wealthy?"

The EDPB issued a much-anticipated opinion, yet the opinion addressed only a narrow cross-section of 'large online platforms' as defined by the Digital Services Act and Digital Markets Act. The opinion held that "in most cases, it will not be possible for large online platforms to comply with the requirements for valid consent if they confront users only with a binary choice between consenting to processing of personal data for behavioural advertising purposes and paying a fee." A high bar for consent was established. However, some of the organisations using 'pay or ok' models include media publishers, and the opinion did not resolve those questions. The EDPB did confirm that it intends to develop guidelines on 'consent or pay' models with a broader scope. In response, Meta has issued proceedings against the EDPB seeking the annulment of the opinion, financial damages associated with the opinion, and costs.

The challenges to 'pay or ok' models did not end there, and Meta again found itself under scrutiny. The European Commission also issued a preliminary opinion finding that Meta's 'pay or ok' model was not compliant with Meta's obligations under the Digital Markets Act. The Commission has also coordinated action with Consumer Protection Cooperation (CPC) Network to issue a letter to Meta raising specific concerns about its practices under EU consumer law. We await the outcome of both of these disputes.

In the UK, the ICO responded to the growing noise in the EU by opening a call for views on this business model. Equivalence between the ad-funded and paid-for services, and the balance of power between service provider and users, were identified as two of a range of factors organisations will need to consider when assessing whether 'pay or ok' would provide valid consent for personalised ads. Although the consultation closed in 2024, the ICO has yet to publish its views, although it did highlight in August that a response will be forthcoming, with the specific expectation that Meta will "consider any data protection concerns we raise prior to any introduction of a subscription service for its UK users."

The 'pay or ok' call for views was identified as part of the ICO's ongoing cookie compliance work, and in July, the Deputy Commissioner of the ICO expressed disappointment that Google would be abandoning its longstanding efforts to remove third-party cookie tracking technology from the Chrome web browser.

 

Data breaches and fines

Judgments in data breach claims across both the UK and EU have continued this year to offer insight into the challenges that these claims face, and how defendant practitioners can respond.

  • The decision in Farley v Paymaster is significant for any controller dealing with the question of whether Articles 33 and 34 UK GDPR are triggered where personal data is placed into an incorrect envelope or sent to the wrong address. Controllers who might have assumed that a risk was unlikely, given that the sealed envelope was directed to the addressee only, may have to assess risks differently depending on the outcome of Farley.
  • Adams v Ministry of Defence dealt with the question of omnibus claim forms (multiple claimants added to a single claim), which seek to avoid the complication of Group Litigation Orders. The decision means the use of single claim forms by multiple claimants is likely to become more challenging, and will likely have an immediate impact on the viability of omnibus claim forms in data breach claims.
  • Looking to the EU, the Court of Justice of the European Union handed down a judgment of interest in the claim of TR v Land Hessen. The CJEU found that when a breach of personal data has been established, a supervisory authority is not obliged to exercise a corrective power, in particular the power to impose an administrative fine, where this is not necessary to remedy the shortcoming found and to ensure that the GDPR is fully enforced.

Although the CJEU held in TR that data protection regulators are not obliged to exercise corrective measures, it is worth noting that, from a UK perspective, the ICO's Data Protection Fining Guidance was updated this year, setting out the circumstances in which a fine might be imposed and the factors to be considered. The ICO also confirmed that the revised approach it adopted in 2022 for working with public sector organisations, introduced partly to ensure that fines do not unnecessarily impact the provision of public services and budgets, would be reviewed. We await the outcome.

Noting the enforcement powers held by the ICO, in August, the regulator stated it intended to issue a £6.09m fine to Advanced Computer Software Group Ltd. Not only was this a substantial sum, but this preliminary notice also represented the first instance we have seen in the UK of the ICO pursuing a processor for a breach of its obligations under data protection law. Although the final enforcement measures are yet to be confirmed, the news created a renewed focus and raised potential concerns for processors, reminding them of the importance of measures such as multi-factor authentication.

In the European Union, a number of significant fines for GDPR breaches have been issued. The Dutch data protection authority (Autoriteit Persoonsgegevens) imposed a fine of €290 million on Uber, having found that Uber had collected sensitive data from drivers and stored it on US-based servers for a period of two years without using appropriate transfer mechanisms. The Irish Data Protection Commission then issued a reprimand to Meta, together with administrative fines totalling €91 million, pursuant to the GDPR, after Meta notified the DPC that certain passwords of social media users had been stored on internal systems without protection or encryption.

In addition, and as our team highlighted in April, a somewhat remarkable decision by the European Data Protection Supervisor (EDPS) found that the European Commission's use of Microsoft 365 was in breach of Regulation 2018/1725 (the GDPR equivalent applicable to EU institutions).

 

Developments in data subject access requests

As part of our monthly detailed analysis articles, our team analysed the modern landscape relating to data subject access requests and provided some practical tips for handling and responding to them effectively, as well as considering the decision in Harrison v Cameron and ACL.

 

Looking forward

The year 2024 has been transformative for data protection and privacy. As we approach 2025, beyond those issues addressed in the DACB Informed Insurance thought leadership, data protection practitioners will need to remain mindful of technological developments and must navigate evolving legislation and regulatory expectations to safeguard data and uphold privacy rights.

The second edition of the ICO Tech Horizons report demonstrated the wide range of advances likely to require proactive data protection assessments in the coming year and beyond. Genomics, virtual worlds, neurotechnologies, quantum computing and next-generation search are all terms that will become consequential to data protection rights in the coming years, and we will continue to help our clients navigate these new challenges.