
Civil society coalition calls for ban on police use of biometric surveillance

30/07/24

Mark Say, Managing Editor



[Image: digital face. Source: istock.com/Maksim Tkachenko]

A group of civil society organisations has written to the home secretary calling for safeguards in the use of AI systems in policing – including an outright ban on predictive policing and biometric surveillance systems.

The #SafetyNotSurveillance coalition comprises 17 organisations including the Open Rights Group, Big Brother Watch, Liberty and the Network for Police Monitoring.

Its letter says the Government should ban systems that use AI, data and algorithms to identify, profile and target individuals, groups and locations in an attempt to predict certain criminal acts or the risk of them.

“These systems have been proven to reproduce and reinforce discrimination and inequality, along the lines of, but not limited to, racial and ethnic origin, nationality, socio-economic status, disability, gender and migration status,” it says. “Data reflecting existing inequalities and prejudices is used to recreate and reinforce these inequalities.”

It also calls for safeguards, transparency and accountability in all other uses. This would involve regulating systems that are not prohibited through a legislative framework.

Surveil and control

Sara Chitseko, pre-crime programme manager for the Open Rights Group, said: “In the UK, and around the world, police already use AI systems to ‘predict’ our likelihood of committing future crimes, to surveil and control us in public spaces, to profile us and to make decisions that determine our access to vital public services, including welfare, education and housing.

“AI and automated systems have been proven to magnify discrimination and inequality in policing. Of particular concern are so-called ‘predictive policing’ and biometric surveillance systems which are disproportionately used to target racialised, working class and migrant communities.

“These systems must be banned if we are to protect the right to be presumed innocent. Without strong regulation, police will continue to use AI systems which infringe our rights and exacerbate structural power imbalances, while big tech companies profit.”

The letter is most immediately relevant to police forces’ use of facial recognition technology. This has stirred protests from civil liberties groups, and calls for a clear legal foundation from the House of Lords Justice and Home Affairs Committee. The Home Office, however, has made clear that it sees scope for widespread use of the technology.

Recent initiatives have included the launch of an effort to develop an application programming interface for facial matching in policing, and an investment in the technology as part of an anti-shoplifting campaign.
