By Jade Kowalski & Alistair Cooper


Published 07 November 2023

Overview

As facial recognition technology ("FRT") becomes more sophisticated and its use as a means of combatting crime and protecting the public becomes more widespread, scrutiny has increased. Back in July, concerns were raised about the use of FRT by Northamptonshire Police at the F1 British Grand Prix, seemingly implemented in light of the protests staged by Just Stop Oil at the event in 2022. More recently, there has been criticism of the involvement of major retailers in a new government-backed scheme focussed on tackling shoplifting with the help of FRT.

This increased attention is unsurprising, as there are strong arguments to suggest that the purpose and inner workings of FRT will always be at odds with the fundamental rights and freedoms of the individuals who attend events or venues where the technology is in operation. We explore this conflict – as well as the key data protection requirements – below. (NB: although the use of FRT has implications from both a human rights and a data privacy perspective, this note focusses on the latter. The human rights implications were explored in a 2020 case involving South Wales Police.)

 

What is FRT?

In its 2021 opinion on this topic, the ICO defined facial recognition as "the process by which a person can be identified or otherwise recognised from a digital facial image". It went on to say that "Cameras are used to capture these images and FRT software produces a biometric template. Often, the system will then estimate the degree of similarity between two facial templates to identify a match (e.g. to verify someone’s identity), or to place a template in a particular category (e.g. age group)".

Importantly, the ICO makes a distinction between 'one-to-one' FRT (e.g. the Face ID feature used to unlock an iPhone) and live FRT, which is FRT "directed towards everyone in a particular area rather than specific individuals". In the live scenario, biometric data regarding the individuals who are 'in range' is captured indiscriminately in real time, and there is "often a lack of awareness, choice or control for the individual in this process". For these reasons, there are real and obvious concerns around whether live FRT can ever be used in a way which complies with data protection laws.
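To make the ICO's description concrete, the matching step can be pictured as comparing two numerical templates and declaring a match when their similarity exceeds a threshold. The sketch below is purely illustrative – the toy vectors, the similarity measure and the threshold are our own assumptions for this note, not a description of any real FRT system:

```python
import math

def cosine_similarity(a, b):
    """Cosine similarity between two equal-length feature vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(x * x for x in b))
    return dot / (norm_a * norm_b)

def is_match(template_a, template_b, threshold=0.8):
    """Declare a 'match' if similarity meets the (assumed) threshold."""
    return cosine_similarity(template_a, template_b) >= threshold

# Toy 4-dimensional 'templates'; real systems derive high-dimensional
# embeddings from the captured facial image.
enrolled = [0.1, 0.9, 0.3, 0.5]
probe_same = [0.12, 0.88, 0.31, 0.49]
probe_other = [0.9, 0.1, 0.8, 0.2]

print(is_match(enrolled, probe_same))   # True  (templates very similar)
print(is_match(enrolled, probe_other))  # False (templates dissimilar)
```

The key point for data protection purposes is that each template is personal data derived by "specific technical processing" from a facial image, whether or not a match is ultimately found.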

 

What is biometric data?

Biometric data – which is likely to fall within the definition of "special category" personal data – is "personal data resulting from specific technical processing relating to the physical, physiological or behavioural characteristics of a natural person, which allow or confirm the unique identification of that natural person, such as facial images or dactyloscopic data". Importantly, the ICO opinion referred to above confirms that data which is processed only momentarily can still constitute biometric data, and the question of whether the identification of the relevant individual is automated or not (i.e. whether it requires human intervention) is irrelevant. All that matters is that the personal data results from specific technical processing (e.g. converting an image into data regarding the attributes of an individual) which allows the unique identification of an individual.

As we alluded to in our previous article on this topic, the ICO is clearly alive to the risks posed by the processing of biometric data and is currently focussing a great deal of attention on it, having recently published draft guidance, as well as blogposts regarding: (i) a live FRT product operated by security company Facewatch; and (ii) data protection issues in relation to the sharing of data by retailers to combat shoplifting. As noted in that article, we feel the draft guidance fails to address some key challenges, in particular in the context of biometric data processing for crime prevention purposes.

 

Key data protection requirements

Unsurprisingly, there is a high bar for data protection compliance when it comes to the use of FRT, given that it involves the processing of biometric data which is likely to constitute special category personal data. As always with data compliance, those building or using FRT solutions should look at things holistically, starting with the principles set out in Article 5 of the UK GDPR. Compliance in this context cannot be viewed as a mere 'box-ticking exercise'.

A few of the key requirements which will be particularly important include:

  • Data Protection Impact Assessment ("DPIA") – a data controller is required to carry out a DPIA where a particular kind of personal data processing is likely to result in a high risk to the rights and freedoms of natural persons, particularly where new technologies are involved. It goes without saying that a DPIA would be required in the context of FRT. This will help to inform wider steps to be taken with a view to achieving compliance.
  • Transparency – data subjects need to be told how their data is being processed in a way which is concise, transparent and easy to understand. In practice, it may be difficult for the data controller to comply with this principle when FRT is being used to collect biometric data in an indiscriminate way (a website privacy policy is unlikely to help much here...).
  • Fairness – any personal data processing via FRT must be fair, and must not create a risk of bias or discrimination. As above, this may be difficult in practice given how the technology works, and this is a big area of concern being highlighted in the media.
  • Identify a lawful basis for processing – as mentioned in the ICO opinion referred to above, the lawful basis most commonly relied upon in the context of these technologies is the 'legitimate interests' basis. This in itself is a balancing act between the interests of the relevant controller, and the fundamental rights and freedoms of the relevant data subject, and a separate assessment will need to be performed to determine if this basis can be relied upon.
  • Special category and criminal data – where relevant, additional requirements will apply where these types of data are involved, which includes identifying specific conditions for processing under Articles 9 and 10 of the UK GDPR.
  • Purpose limitation – as tempting as it might be to use the data collected via FRT for purposes other than those for which it was originally collected, this is not permitted.

 

Our view

Whilst this is still a rapidly evolving issue, what is apparent is that: (i) the use of FRT is particularly high risk from a data protection perspective; and (ii) there are no clear answers to the compliance questions it throws up. Those looking to develop or implement FRT should take extreme care when doing so and take time to assess their legal and regulatory responsibilities.
