Police use of AI in urgent need of oversight, warns report
Image credit: Palinchak | Dreamstime
A report investigating police use of artificial intelligence (AI) and data-driven technologies has called for urgent national guidance amid concerns that their uptake could lead to discrimination.
The study, published by the Royal United Services Institute (RUSI), said new guidelines were crucial to ensure the use of data analytics, AI and computer algorithms develops “legally and ethically”.
Forces' expanding use of digital technology to tackle crime is driven in part by funding cuts requiring more efficient use of resources, the report claimed. It added that officers are battling “information overload” as the volume of data surrounding their work grows, alongside a perceived need to take a “preventative” rather than “reactive” stance to policing.
The pressures resulting from this overload have led forces to develop tools to forecast demand in control centres; to “triage” investigations according to their “solvability”; and to assess the risks posed by known offenders. Examples of the latter include Hampshire Police’s domestic violence risk-forecasting model; Durham Police’s 'Harm Assessment Risk Tool' (HART), and West Midlands Police’s draft 'Integrated Offender Management' model.
Commissioned by the Centre for Data Ethics and Innovation (CDEI), the report stated that while technology could help improve police “effectiveness and efficiency”, it was held back by “the lack of a robust empirical evidence base, poor data quality and insufficient skills and expertise”.
Furthermore, while the report did not directly focus on biometric, live-facial recognition or digital forensic technologies, it explored general issues of data protection and human rights underlying all types of police technology.
The report noted: “It could be argued that the use of such tools would not be ‘necessary’ if the police forces had the resources needed to deploy a non-technological solution to the problem at hand, which may be less intrusive in terms of its use of personal data.”
It also advised that an “integrated impact assessment” was needed to justify each new police analytics project, noting that such initiatives were often not underpinned by sufficient evidence of their claimed benefits, scientific validity or cost-effectiveness.
The report’s authors noted criticism of “predictive policing” tools as “racially biased”, but said there was “a lack of sufficient evidence” to assess whether this occurs in England and Wales and whether it results in unlawful discrimination. Studies claiming to demonstrate racial bias were mostly based on analysis in the US, it stated, and it remains unclear whether such concerns apply in a UK context.
“However, there is a legitimate concern that the use of algorithms may replicate or amplify the disparities inherent in police-recorded data, potentially leading to discriminatory outcomes,” the report stated. “For this reason, ongoing tracking of discrimination risk is needed at all stages of a police data analytics project, from problem formulation and tool design to testing and operational deployment.”
Roger Taylor, chairman of the CDEI, said: “There are significant opportunities to create better, safer and fairer services for society through AI and we see this potential in policing. But new national guidelines, as suggested by RUSI, are crucial to ensure police forces have the confidence to innovate legally and ethically.”
The report called on the National Police Chiefs’ Council (NPCC) to work with the Home Office and College of Policing to develop the new technology guidelines. Commissioner Ian Dyson, NPCC’s lead for information management, said it would work with the Government and regulators to consider the report’s recommendations.
“Data-driven technology can help us to keep the public safe,” he added. “Police chiefs recognise the need for guidelines to ensure legal and ethical development of new technologies and to build confidence in their ongoing use.”
In September 2018, RUSI released a report which called for the introduction of regulations ensuring transparency and responsibility when UK police forces trial machine learning (ML) tools.