Police Use of Facial Recognition Technology Found to be Unlawful

The Court of Appeal in R (Bridges) -v- Chief Constable of South Wales Police and Others [2020] EWCA Civ 1058 overturned the High Court's decision and found that the use of facial recognition technology in the circumstances of that case was unlawful.

The Court of Appeal determined that the use of facial recognition technology in the pilot project run by South Wales Police was not in accordance with Article 8(2) of the European Convention on Human Rights (the right to respect for private and family life). This decision was based on the following:

  • There was no clear guidance on where the facial recognition technology could be used or who could be put on a watchlist. Police officers therefore had too broad a discretion to satisfy Article 8(2) of the ECHR, which requires that any interference by a public authority with the right be in accordance with the law and go no further than necessary.
  • The data protection impact assessment was inadequate and did not comply with Part 3 of the Data Protection Act 2018 (as law enforcement processing, this falls under the Data Protection Act 2018 rather than the GDPR).
  • South Wales Police had not taken reasonable steps to investigate whether the technology had a racial or gender bias, an investigation required by the public sector equality duty. Even though this was a pilot, the public sector equality duty still applied.

This case arose when Mr Bridges was scanned by facial recognition technology while Christmas shopping in 2017 and while at a peaceful protest in 2018. Mr Bridges subsequently brought a judicial review, arguing that the use of the technology breached his human rights. He was unsuccessful in the High Court but succeeded on appeal.

The automated facial recognition technology captures digital images of members of the public, extracts facial biometric data from them, and compares that data against a watchlist of individuals, in this case South Wales Police's. If no match is made, the image is automatically and immediately deleted.
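For readers curious about the processing flow described above, the following is a minimal illustrative sketch in Python of a match-or-delete pipeline of this kind. It is purely hypothetical: the names, the 0.9 match threshold and the use of cosine similarity over feature vectors are assumptions made for illustration, and none of it represents South Wales Police's actual system.

```python
from dataclasses import dataclass

# Hypothetical sketch of a match-or-delete facial recognition pipeline.
# All names and parameters here are illustrative assumptions, not details
# of the system considered in Bridges.

@dataclass
class WatchlistEntry:
    name: str
    embedding: list[float]  # pre-computed facial feature vector

def cosine_similarity(a: list[float], b: list[float]) -> float:
    """Similarity between two feature vectors, in the range [-1, 1]."""
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = sum(x * x for x in a) ** 0.5
    norm_b = sum(x * x for x in b) ** 0.5
    return dot / (norm_a * norm_b) if norm_a and norm_b else 0.0

def process_frame(face_embedding: list[float],
                  watchlist: list[WatchlistEntry],
                  match_threshold: float = 0.9):
    """Compare one captured face against the watchlist.

    Returns the matched entry for human review, or None, in which
    case the captured biometric data is discarded immediately.
    """
    best = max(watchlist,
               key=lambda e: cosine_similarity(face_embedding, e.embedding),
               default=None)
    if best and cosine_similarity(face_embedding, best.embedding) >= match_threshold:
        return best  # flag for an operator to review
    # No match: delete the biometric data straight away, mirroring the
    # immediate-deletion step described above.
    return None

# Example: one hypothetical frame compared against a two-person watchlist.
watchlist = [WatchlistEntry("Person A", [0.1, 0.9, 0.3]),
             WatchlistEntry("Person B", [0.8, 0.2, 0.5])]
result = process_frame([0.11, 0.88, 0.31], watchlist)
print(result.name if result else "No match - biometric data deleted")
```

The legally significant design choice is the final branch: where no match is made, the biometric data is deleted immediately rather than retained, which was a feature of the system considered by the Court.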

Here, the Court of Appeal ruled that the data protection impact assessment was inadequate. A DPIA is required for any processing that is likely to result in a high risk to the rights and freedoms of individuals, as is the case when using facial recognition technology. This is an important decision because it is a reminder of the importance of DPIAs: they must properly assess the risks to the rights and freedoms of individuals and must address measures that could mitigate any identified risks.

The judgment also addresses appropriate policy documents. It is important to remember that where there is sensitive processing, as in this case, an appropriate policy document is required by the Data Protection Act 2018. However, the Court held that it did not need to determine whether such a policy was in place here.

The use of facial recognition technology is a developing area, and as the technology improves there is greater interest in using it in public spaces to assist with security, create efficiencies and, potentially, in a post-Covid-lockdown world, to enable social distancing. This judgment nevertheless serves as a reminder that technology which processes personal data must always be used within the boundaries of data protection law.

The UK is, of course, not the only country grappling with how facial recognition technology fits within the legal framework. In 2019, a school in Sweden was fined by the Swedish data protection authority, Datainspektionen, for using facial recognition technology to monitor pupil attendance. The local authority that implemented the technology had done so to see whether facial recognition could speed up the morning registration process and therefore free up teachers' time. However, the Swedish data protection authority found that valid consent had not been obtained, and it also had concerns about the data protection impact assessment that had been carried out.

The Bridges case does not say that facial recognition technology will always be unlawful, but it clearly underlines that the use of any such technology must be proportionate and in line with the legal framework. Controllers deciding whether to deploy such technology must ensure that they comply with data protection and human rights law at all times, and careful consideration must be given to those issues before rolling out technology of this kind.

This article is for general awareness only and does not constitute legal or professional advice. The law may have changed since this page was first published. If you would like further advice and assistance in relation to any of the issues raised in this article, please contact us today by telephone or email enquiries@sharpepritchard.co.uk.
