
Automated facial recognition technology used by police breaches right to private life

The Queen (Bridges) v The Chief Constable of South Wales Police & others [2020] EWCA Civ 1058

Related Practice Area(s):
Discrimination and Equality, Human Rights, Police Law, Media and Information Law
Court:
Court of Appeal (Civil Division)
This appeal concerns the lawfulness of the use of automated facial recognition technology (“AFR”) in a pilot project by the South Wales Police Force (“SWP”). AFR is a new technology used to assess whether two facial images depict the same person. The specific type of AFR at issue, known as AFR Locate, works by extracting faces captured in a live feed from a camera and automatically comparing them to faces on a watchlist. If no match is detected, the software will automatically delete the facial image captured from the live feed. If a match is detected, the technology produces an alert and the person responsible for the technology, usually a police officer, will review the images to determine whether to make an intervention.

To date, SWP watchlists have comprised between 400 and 800 people, and the maximum capacity of a watchlist is 2,000 images. AFR Locate is capable of scanning 50 faces per second. Over the 50 deployments undertaken in 2017 and 2018, it is estimated that around 500,000 faces may have been scanned.

The Appellant brought a claim for judicial review on the basis that AFR was not compatible with the right to respect for private life under Article 8 of the European Convention on Human Rights, with data protection legislation, and with the Public Sector Equality Duty (“PSED”) under section 149 of the Equality Act 2010. As to the PSED, the Appellant argued that SWP had failed to consider the possibility that AFR Locate might produce results that were indirectly discriminatory on the grounds of sex and/or race, because the software produces a higher rate of false positive matches for female faces and/or for black and minority ethnic faces.

Held: the appeal is allowed on grounds 1, 3 and 5.

The appeal succeeded on Ground 1, that the Divisional Court (“DC”) erred in concluding that SWP’s interference with Mr Bridges’s Article 8(1) rights was “in accordance with the law” for the purposes of Article 8(2). The Court held that although the legal framework comprised primary legislation (the DPA 2018), secondary legislation (the Surveillance Camera Code of Practice), and local policies promulgated by SWP, there was no clear guidance on where AFR Locate could be used or who could be put on a watchlist. The Court held that this left too broad a discretion to individual police officers to meet the standard required by Article 8(2).

The appeal succeeded on Ground 3, that the DC was wrong to hold that SWP provided an adequate “data protection impact assessment” (“DPIA”) as required by section 64 of the DPA 2018. The Court found that, as the DPIA was written on the basis that Article 8 was not infringed, the DPIA was deficient.

The appeal succeeded on Ground 5, that the DC was wrong to hold that SWP complied with the PSED. The Court held that the purpose of the PSED was to ensure that public authorities give thought to whether a policy will have a potential discriminatory impact. SWP erred by failing to take reasonable steps to make enquiries about whether the AFR Locate software was biased on grounds of race or sex. The Court did note, however, that there was no clear evidence that the AFR Locate software was in fact biased on the grounds of race and/or sex.

Dan Squires QC and Aidan Wills were involved in this case.
