Biometric technologies – sorting the fact from the science fiction

Watching the recent BBC series The Capture, one might think that privacy-invasive biometric technologies such as live facial recognition (LFR) can be lawfully used in the UK. In the real world, things are not so straightforward.

What is biometric data?

Biometric data is personal data obtained from or relating to a person’s body or behaviour which may be used to uniquely identify them. Traditionally this has included fingerprints, DNA, iris scans, and voice and facial recognition, used to verify or identify an individual, for example to grant access to a smartphone, a system or a building. More recently, used in conjunction with CCTV, video platforms such as Zoom, and artificial intelligence, more novel forms of biometric technology are becoming available in the UK. Technologies exist which claim to analyse physical characteristics such as perspiration, gait and keystrokes to derive information about an individual’s emotional state or intention.

What laws relate to biometric data and artificial intelligence?

Using personal data in combination with artificial intelligence is not new, but as yet there is no single legal framework or regulatory authority in the UK for the use of biometric technology, or artificial intelligence more generally[1]. This means that organisations seeking to deploy these technologies must consider a patchwork of laws, including laws relating to data privacy, human rights and specific sector laws (such as financial services)[2].

Data protection is a common concern and any organisation seeking to use these technologies must ensure that its processing of biometric personal data meets the fundamental data protection principles of lawfulness, fairness and transparency, as well as the other requirements of data protection law.

In the context of novel biometric technologies, particularly those involving LFR, there are significant risks: a lack of transparency towards the individuals whose personal data is being collected and processed, uncertainty about the ways in which their data will or may be used in future, and the risk that significantly more data is gathered than is necessary. Because biometric data is unique to an individual and very difficult or impossible to change should it ever be lost, stolen or inappropriately used, the data security risks to individuals are even greater than for other categories of personal data.

What is the Information Commissioner’s Office’s position?

The UK’s privacy regulator, the Information Commissioner’s Office (ICO), has demonstrated its intention to come down very hard on the misuse of personal data in connection with LFR. This is illustrated by the recent £7.5 million fine imposed on Clearview AI Inc., the third largest fine issued to date by the ICO, for Clearview’s collection of facial images to create a global online database for facial recognition technology. The ICO is not alone: various European states’ privacy commissioners have issued high fines to Clearview, and there are calls for the use of LFR to be banned or, at the very least, suspended until there is a clear legal framework for its use.

In its opinions on the use of LFR by law enforcement in public places and on the use of LFR in public places more generally, and in its statement responding to the complaint made by Big Brother Watch about Southern Co-operative’s use of LFR in stores to detect crime, the ICO has made it clear that privacy must be at the heart of any use of new technology.

ICO warning about ‘immature biometric technologies’

The ICO has recently gone a step further, issuing a broad warning to all organisations against the use of ‘emotional analysis’ technology. Developers claim that this technology can make HR processes more efficient and even improve diversity and inclusion in the workplace by ‘cancelling out human biases’ in recruitment, using algorithms to read speech patterns and facial movements to assess candidates for jobs or in examinations. The ICO considers this kind of use to be far riskier than traditional biometric technologies.

New research by Cambridge University’s Centre for Gender Studies argues that these claims about ‘emotional analysis’ amount to little more than an ‘automated pseudoscience’. In light of the scientific research, the ICO’s view is that this is not simply a question of the technology being in its infancy: more fundamentally, there is no scientific basis for the claim that personality, emotions or intent can be deduced from an individual’s external characteristics, and the use of this technology can result in ‘systemic bias, inaccuracy and even discrimination’. Speaking on BBC Tech Tent, Deputy Commissioner Stephen Bonner described this kind of technology as ‘sorting hat’ technology and ‘fundamentally flawed’, and expressed the view that investing in it is likely to be a poor decision resulting in financial loss.

Is any further ICO guidance anticipated?

The good news is that the ICO has explained that it will act positively towards organisations that demonstrate good practice, and that it is developing wider guidance on the use of biometric technologies, including facial, fingerprint and voice recognition, which are already successfully used in industry. The guidance is due to be published in spring 2023 and will aim to help businesses, as well as highlighting the importance of data security.

How should organisations manage risks in the acquisition of novel technology?

In view of the risks, organisations procuring new technological solutions should not rely solely on developers’ assurances but should undertake a full analysis covering the following:

  • Data protection by design and default – understand how the technology actually works and ensure data protection by design and default is at the heart of the technology;
  • Data Protection Impact Assessment – carry out a DPIA to understand how personal data will be impacted by the technology;
  • Technical effectiveness – assess the effectiveness of the technology, including its security measures;
  • False results – ensure false positives and false negatives are assessed as part of technological due diligence, to reduce the risks of discrimination and bias (see the sketch after this list); and
  • Watchlists – undertake due diligence on the provider, consulting the ICO’s records to understand whether any investigations are ongoing or flagged.
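
By way of illustration, the following is a minimal sketch (in Python) of the kind of false-result check described above: computing false positive and false negative rates separately for each demographic group, so that disparities between groups become visible. The group labels, data structure and sample figures are entirely hypothetical; a real assessment would use the provider’s test data or an independent benchmark, evaluated at scale.

```python
# Minimal sketch: per-group false positive / false negative rates for a
# biometric matching system. Group labels and sample data are hypothetical.
from dataclasses import dataclass


@dataclass
class MatchResult:
    group: str        # demographic group label (hypothetical categories)
    predicted: bool   # did the system report a match?
    actual: bool      # was it genuinely the same person?


def error_rates_by_group(results: list[MatchResult]) -> dict[str, dict[str, float]]:
    """Return false positive and false negative rates for each group."""
    rates: dict[str, dict[str, float]] = {}
    for g in {r.group for r in results}:
        subset = [r for r in results if r.group == g]
        negatives = [r for r in subset if not r.actual]   # genuine non-matches
        positives = [r for r in subset if r.actual]       # genuine matches
        fpr = sum(r.predicted for r in negatives) / len(negatives) if negatives else 0.0
        fnr = sum(not r.predicted for r in positives) / len(positives) if positives else 0.0
        rates[g] = {"false_positive_rate": fpr, "false_negative_rate": fnr}
    return rates


# Hypothetical sample: in practice this would be a large, representative test set.
sample = [
    MatchResult("group_a", predicted=True, actual=False),   # false positive
    MatchResult("group_a", predicted=True, actual=True),
    MatchResult("group_b", predicted=False, actual=True),   # false negative
    MatchResult("group_b", predicted=True, actual=True),
]
print(error_rates_by_group(sample))
```

A material gap in error rates between groups would be a red flag for discrimination risk, and something to raise with the provider before any deployment.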

The bottom line is that if you cannot understand how the technology works, and if you cannot satisfy yourself that there are no outstanding data protection questions, then, regardless of the anticipated commercial benefits, the technology is unlikely to be a good investment and may even result in regulatory fines or legal claims being brought against your organisation.

We are experienced in assisting organisations with the acquisition of novel technologies. If you require further advice or assistance in relation to this area, please do get in touch with Anne Todd or Nathaniel Lane.

[1] The EU is proposing an Artificial Intelligence Act which will likely influence the development of the UK’s artificial intelligence legislation as well as legislation around the world.

[2] The Ryder Review, an independent legal review of the governance of biometric data in England and Wales commissioned by the Ada Lovelace Institute reports in detail on the various laws and regulators.
