Bridges v Chief Constable of South Wales Police (2019) is a landmark decision on the legality of automated facial recognition technology (AFR). Put simply, AFR is a technology used to assess whether two facial images show the same person. A digital image of a person's face is processed to extract biometric data, such as the measurements of their facial features, and this data is then compared with biometric data from images contained in databases such as police watchlists.
The case has added significance because it marks the first legal challenge launched against the police for using this form of artificial intelligence. Due to its novelty, discussing the case’s privacy and data protection rights concerns and their implications will gain you valuable points in a summative essay or exam on human rights law.
The following brief will give you an overview of the facts, core issues and key takeaways from the case.
On 21st December 2017 and 27th March 2018, South Wales Police (SWP) deployed AFR in Cardiff as part of an AFR Locate pilot project. This involved the deployment of surveillance cameras which captured digital images of people in public. These images were then processed and compared with those of persons on watchlists compiled by the police. After spotting a police van with AFR cameras in public on these two separate occasions, Edward Bridges, a civil liberties campaigner living in Cardiff, launched a legal challenge against the police.
For pragmatic reasons, SWP accepted Bridges' claim that he was present on these two occasions. The real dispute concerned whether this use of AFR was lawful. Bridges claimed that by using AFR to analyse his biometric data (a digital mapping of his facial features) without his consent or knowledge, South Wales Police breached his human rights. In response, Deputy Chief Constable Richard Lewis argued that this use of biometric data analysis was lawful and proportionate.
The legislation this case was concerned with was the European Convention on Human Rights, Data Protection Act 1998 and DPA 2018, Protection of Freedoms Act 2012, and the Law Enforcement Directive.
First, Bridges claimed that SWP’s use of AFR Locate was in breach of the requirements of ECHR Article 8, which provides that everyone “has the right to respect for his private and family life, his home and his correspondence” (Art 8(1)) and that there “shall be no interference by a public authority with the exercise of this right except such as is in accordance with the law and is necessary” for national security and public safety purposes, for the economic well-being of the country, for the prevention of disorder or crime, for the protection of health or morals, or for the protection of rights and freedoms of others (Article 8(2)).
Second, Bridges claimed that SWP’s use of AFR Locate was contrary to data protection legislation requirements. He argued that the SWP acted contrary to s. 4(4) DPA 1998 and s. 35 DPA 2018 by failing to act in accordance with the data protection principles. Furthermore, Bridges claimed that the use of AFR falls within s. 64(1) DPA 2018 and therefore that a data protection impact assessment must be conducted.
Third, Bridges claimed that the SWP failed to comply with the obligation under s. 149(1) of the Equality Act 2010, which is to have “due regard” to certain prescribed matters when exercising their functions. The argument here was that it was evident from the equality impact assessment document created by SWP for the proposed use of AFR Locate that they had failed to consider the possibility that AFR software could produce a disproportionately higher rate of false-positive matches for women and minority ethnic groups. Bridges argued that in this way, the use of AFR Locate indirectly discriminated against those groups. Bridges claimed that this meant SWP had failed to have due regard for relevant considerations, which is required by s. 149(1)(a)-(c) of the 2010 Act.
The wider legal issue was "whether the current legal regime in the United Kingdom is adequate to ensure the appropriate and non-arbitrary use of AFR in a free and civilized society." The core issue was the use of AFR itself and its potential implications for privacy and data protection rights.
On whether AFR breached Bridges' Article 8 rights, the Court decided that its use does infringe those rights. AFR engages Article 8 because such "automated capture of facial biometrics, and conversion of those images into biometric data, involves large scale and relatively indiscriminate processing of personal data. If such processing is not subject to appropriate safeguards, such data…could be collected…in a manner amounting to a serious interference with privacy rights."
However, the interference with Article 8(1) rights could be justified because the police's purpose for using AFR, namely the prevention and detection of crime, was well documented. This objective was considered sufficiently important to justify the interference, and the use of AFR was found to be rationally connected to it. AFR was also used for a limited time and a specific purpose, covered only a limited area, and led to the detection of crime. Other important factors in favour of AFR use were public safety, the lack of impact on Bridges (since he was not on a watchlist, his data was not stored), the targeted nature of the watchlist, and the success of the project (in 50 instances it had resulted in some 37 arrests or disposals).
Bridges contended that there must be some specific statutory basis for the use of AFR Locate. The Court disagreed and considered that the police’s common law powers are “amply sufficient” in relation to the use of AFR Locate. As such, they did not need new express statutory powers for this usage.
On the s. 149(1) of the Equality Act 2010 claim, the Court noted that no evidence showed that AFR produced indirectly discriminatory results. Furthermore, the SWP demonstrated they had considered their obligations early on as they had prepared an Equality Impact Assessment – Initial Assessment. As such, Bridges’ arguments in relation to the equality ground failed.
This case is important because, as the judgment stated, "the law must keep pace with new and emerging technologies." The case raised new and significant issues about the police's use of AFR, and it was the first case in which any court in the world had specifically considered AFR. In doing so, the Court needed to strike a balance between the protection of privacy rights and the public interest in using new technology to enable the detection and prevention of crime.
The Court recognised that the use of AFR could aid the apprehension of suspects or offenders, and provide public protection through its potential to detect crime. Yet, it was also recognised that AFR raises significant civil liberties concerns. Since it involves processing facial biometric data of a large number of people, it can violate important privacy and data protection rights. As such, there is a need to carefully and continuously consider the implications of such technologies.
In the end, the Court decided that the use of automated facial recognition was a justified privacy intrusion. This technology has significant value as it can enable police to identify suspects and offenders where they would normally not be able to do so. However, the Court did note that whilst the current legal framework is sufficient for now, it may need regular review as the use of AFR increases.
These are certainly exciting times for those interested in the legal implications of technology. As the police continue to acquire new forms of technology, the law must keep up to ensure that their use does not violate privacy and data protection rights. For now, however, it seems that the use of such technology will be considered lawful in the eyes of the court if deployed for the legitimate purpose of detecting and preventing crime.
Words by: Kristin Klungtveit