UK’s facial recognition technology ‘breaches privacy rights’

6/24/2020

Automated facial recognition technology that searches for people in public places breaches privacy rights and will “radically” alter the way Britain is policed, the court of appeal has been told.

At the opening of a legal challenge against the use by South Wales police of the mass surveillance system, lawyers for the civil rights organisation Liberty argued that it is also racially discriminatory and contrary to data protection laws.

In written submissions to the court, Dan Squires QC, who is acting for Liberty and Ed Bridges, a Cardiff resident, said that the South Wales force had already captured the biometrics of 500,000 faces, the overwhelming majority of them belonging to people not suspected of any wrongdoing.

Bridges, 37, whose face was scanned while he was Christmas shopping in Cardiff in 2017 and at a peaceful anti-arms protest outside the city’s Motorpoint Arena in 2018, says the use of automatic facial recognition (AFR) by South Wales police caused him “distress”.

The case has been brought after South Wales police and the Home Office won a high court case last year that effectively gave the green light for national deployment of the technology.

In a case focused on the potential of digital technology, the online hearing was, ironically, plagued by technical difficulties – muffled sounds, echoing voices and disruptions to the live-streamed public service. At one point, while the sitting was suspended, one of the three appeal court judges, Dame Victoria Sharp, president of the Queen’s Bench Division, could be heard remarking: “Now we know what the crown court judges feel like.” Use of remote technology during the pandemic, many lawyers feel, has not been an unqualified success.

In his written submission, Squires said: “If AFR is rolled out nationally, it will change radically the way that Britain is policed … Connected to a database with the right information, AFR could be used to identify very large numbers of people in a given place at a given time – for example, those present at a protest that the police are monitoring.”

He added: “Given the proliferation of databases operated by the police and other public authorities, the exponential increase in information held by public bodies and the ever-increasing practice of sharing that information between public bodies, it is not difficult to imagine that police forces nationally could soon – if they cannot already – have access to photographs of the vast majority of the population.”

On concerns about AFR’s potential for discrimination, Squires said there was a heightened risk of “racial bias leading to erroneous police stops”. In order to comply with the public sector equality duty, he added, the police force “was required to make concerted efforts to determine whether the particular AFR software it was using suffered from the problems of race or gender bias identified in other software, and to take reasonable steps to obtain material to enable it to make that determination correctly”. However, Squires said: “South Wales police failed to do so.”

The hearing continues.
