
Further concerns about the use of facial recognition technology in Britain

(Image by Tumisu via Pixabay)

A recent inquiry by the House of Lords Justice and Home Affairs Committee into the level of shoplifting in Britain has raised a number of ongoing concerns about the use of facial recognition technology by retailers and other companies. Peers on the Committee argued that regulations and best practice guidelines should be put forward for the adoption of the ‘controversial’ technology, equating it with what is essentially ‘privatised policing’.

While the investigation focused on how shoplifting is not being effectively tackled and its impact on the UK economy, an interesting picture of the unregulated use of facial recognition systems by private companies emerged during the evidence-taking sessions.

Facial recognition technology is still relatively new, but has been used by two police forces in Britain for a number of years: South Wales Police and London’s Metropolitan Police. The use of facial recognition systems by these forces has been widely criticised by civil liberties and data protection groups, with claims that the tools invade the privacy of innocent people, are not entirely accurate for certain groups and are difficult to use at scale.

The police forces in question naturally argue that facial recognition technology is a useful tool to ensure the safety of the public, especially at events.

The EU recently banned the use of real-time remote biometric identification by police in public spaces, while also banning the use of facial recognition technology by private sector organisations, and classified most facial recognition systems as ‘high-risk’ AI practices under its recently adopted AI Act. In contrast, the newly elected British Prime Minister, Keir Starmer, has called for the technology to be expanded to help reduce violent disorder across the country, a move campaigners say could be unlawful under the GDPR.

Simply put, there are many incentives for governments and private sector companies to adopt the technology, but little UK regulation or guidance to say when it is proportionate and lawful to do so. Right now, it’s essentially the Wild West as to when and where it gets adopted, and people’s biometric data is stored in databases across the country.

Guidance needed

During the hearing, the committee heard how CCTV plays a crucial role in enabling police to identify perpetrators and build an intelligence picture. It was also revealed that some retailers are using facial recognition technology provided by private companies. Although adoption is relatively low, with some witnesses suggesting that fewer than 10% of stores are affected, the technology’s effectiveness suggests its use may grow in the future.

The main concern is that its use amounts to a privatised policing system. Retailers use private companies such as Facewatch, which neither receive information from nor pass information to the police, and there is no requirement of criminality for being placed on a watchlist. This means that an individual can be placed on a private facial recognition watchlist and blacklisted from retailers, at the discretion of a security guard, without a police report being made and without the individual being notified that he or she has been added to a watchlist.

Big Brother Watch, a civil liberties organisation, said that “it is highly likely that such mass indiscriminate biometric processing by private companies for the purpose of loss prevention is unlawful under the GDPR”, and that there is potential for bias and discrimination within the algorithms used to power the surveillance software, because the technology is “less accurate for people with dark skin”.

Professor Emmeline Taylor, Professor of Criminology, School of Policy and Global Affairs, City, University of London, also told the committee that the technology is “controversial”, while Adam Ratcliffe, Operations Director of Safer Business Network CIC, said there is “nervousness in that industry around the legality and human rights element, because with live facial recognition you scan everyone, so you process someone’s data when they walk into the store, even if they are not an offender.”

In response to the evidence received, the committee wrote to the Minister for Policing, Fire and Crime Prevention, Dame Diana Johnson MP, stating:

The committee has serious concerns about the use of facial recognition technologies by private companies. We are concerned about the implications of what is effectively privatised policing, about the hidden nature of decisions made on the basis of a match against data held in a private database, and about the lack of recourse for those who may have been wrongfully entered into such a database due to incorrect identification. We are concerned about potential breaches of the GDPR and the risk of misidentification due to bias and discrimination within the algorithms.

It added that it supports the introduction of regulations and best practice guidelines for the use of facial recognition technology by private companies.

My take

The use of technologies that collect biometric data en masse, without effective regulatory mechanisms in place to allow individuals to challenge that collection or request the deletion of their data, is, in my view, a recipe for disaster. Your face is your data, and it feels like a very slippery slope to have that information in the hands of unregulated companies acting as privatised police forces. Regardless of its usefulness in detecting bad actors, if we are not all protected in the way our data is used, this remains a high-risk technology. More regulation is urgently needed.