
Other concerns raised about the use of facial recognition technology in the UK

(Image by Tumisu from Pixabay)

A recent House of Lords Justice and Home Affairs Committee inquiry into the level of shoplifting in the UK has raised a number of ongoing concerns about the use of facial recognition technology by retailers and other businesses. Peers on the committee argued that regulations and best practice guidelines need to be put forward for the adoption of the “controversial” technology, which they equated to what is essentially “privatized policing.”

While the inquiry focused on how shoplifting is not being tackled effectively and its impact on the UK economy, what emerged during the evidence-gathering sessions was a revealing picture of the unregulated use of facial recognition systems by private companies.

Facial recognition technology is still relatively new, but it has been used by two UK police forces for several years now: South Wales Police and the London Metropolitan Police. These forces’ use of facial recognition systems has been widely criticized by civil liberties and data protection groups, who argue that the tools violate the privacy of innocent people, are less accurate for certain groups, and are difficult to use at scale.

The police forces in question, of course, argue that facial recognition technology is a useful tool for keeping the public safe, especially at events.

The EU recently banned the use of real-time remote biometric identification by police in public spaces, while also banning the use of facial recognition technology by private sector organisations, classifying most facial recognition systems as “high-risk” AI practices under the recent AI Act. In contrast, the newly elected UK Prime Minister, Keir Starmer, called for the technology to be expanded to help reduce violent unrest across the country, a move campaigners claim could be illegal under the GDPR.

Put simply, there are plenty of incentives for public authorities and private sector companies to adopt the technology, but little UK regulation or guidance to say when it is proportionate and legal to do so. Currently, it’s essentially the Wild West in terms of where and when it’s adopted, and people’s biometrics are stored in databases across the country.

Guidance is needed

During its evidence hearings, the Committee heard how CCTV plays a crucial role in enabling police to identify criminals and build an intelligence picture. It found that some retailers use facial recognition technology provided by private companies. Although the numbers are relatively low, with some witnesses suggesting it is under 10% of stores, its effectiveness suggests that more may do so in the future.

The main concern is that its use amounts to a privatized policing system. Retailers use private companies such as Facewatch, which neither receive information from nor send information to the police, and there is no criminal threshold for being placed on a watchlist. This means that a person can be added to a private facial recognition watchlist and blacklisted from retailers at the discretion of a security guard, without any police report being made and without the person being informed that they have been added to a watchlist.

Big Brother Watch, a civil liberties organization, said that “such mass indiscriminate biometric processing by private companies for loss prevention is highly likely to be unlawful under the GDPR” and that there is potential for bias and discrimination within algorithms that power surveillance software because the technology is “less accurate for people with darker skin”.

Equally, Professor Emmeline Taylor, Professor of Criminology, School of Politics and Global Affairs, City, University of London, told the Committee that the technology was “controversial”, while Adam Ratcliffe, Chief Operating Officer, Safer Business Network CIC, said there is “nervousness in that industry about the legality and human rights element, because with live facial recognition you’re scanning everybody, so you’re processing someone’s data when they come into the store, even if they’re not a criminal.”

In response to the evidence received, the Committee wrote in a letter to the Minister for Policing, Fire and Crime Prevention, Dame Diana Johnson MP, stating:

The committee has serious concerns about the use of facial recognition technologies by private companies. We are concerned about the implications of what is effectively the privatization of policing, the hidden nature of decisions being made on the basis of data linked to entries in a private database, and the lack of recourse for people who may have been wrongly entered into that database due to misidentification. We are concerned about potential GDPR violations and the risk of misidentification due to bias and discrimination within algorithms.

It added that it supports the introduction of regulations and best practice guidelines for the use of facial recognition technology by private companies.

My take

Using technologies that collect biometric data en masse, without effective mechanisms for individuals to query that collection or request deletion of their data through regulatory channels, is, in my view, a recipe for disaster. Your face is your data, and it feels like a very slippery slope to place that information in the hands of unregulated companies acting as privatized police forces. Regardless of its usefulness in catching bad actors, if we are not all protected in how our data is used, it remains a high-risk technology. More regulation is urgently needed.