UK Government to fund expansion of Minority Report-like policing system
A highly controversial crime prediction system has received an additional £5m in funding to expand it to other parts of the UK.
The National Data Analytics Solution (NDAS) is currently in the testing phase and is meant to aid police in assessing the risk of someone committing a crime.
The UK Government will expand the system across the country with a further £5m from the Police Transformation Fund, on top of its initial £4.5m in funding. NDAS analyses large volumes of police-held data to assess the risk of a person committing a crime or becoming a victim, according to a Home Office spokesperson.
While the programme is designed to support police officers rather than replace their decision-making, criticism of NDAS and similar projects from experts persists.
A major concern is that the machine learning used in prediction algorithms could produce biased results and potentially disadvantage minority groups. In a 2017 report by the Royal United Services Institute for Defence and Security Studies, the authors warn that machine learning could create “inherent biases presented in the data”.
Experts admit that a high false-positive rate would complicate matters. At the China-Britain AI Summit 2019 earlier this year, Areiel Wolanow, managing director at Finserv Experts, a provider of financial services technology, told E&T that the biggest problem is that the technology is still very immature. “[With] a big false positive rate, [algorithms] are not that good at it yet because we only started building solutions that can do this. You want something that is more accurate that is not going to be unfairly subjecting people to embarrassment and stigma”, he said. “It would take time but it would be a good thing to have in a place like the UK where terrorism is a big problem.”
A key question is: “how much longer are you prepared to invest and pour money into something until you get [a system] that works to a level of safety that we are all comfortable with? That is a very tough balancing act,” he added.
In the UK, other case studies are exploring the viability of advanced law-enforcement technology and AI. Durham Constabulary, in the north of England, is developing an AI system to evaluate the risk of convicts reoffending.
The system, called the Harm Assessment Risk Tool (HART), was criticised in a 2019 Big Brother Watch report. The independent non-profit organisation stated that "if the system does produce a discriminatory, inaccurate prediction, it is likely to negatively impact the individual because the system is designed to over-estimate individuals' risk of re-offending".
Alongside NDAS and HART, there are other examples such as geographic crime prediction systems like PredPol. Big Brother Watch expressed concerns about tools like PredPol, which uses past crime data from police records to predict future crime patterns. According to the report, police data reflects "systematic under-reporting and systematic over-reporting of certain types of crime and in certain locations". Police data can also reflect discriminatory policing practices and societal inequalities, such as those which result in black men being more than three times more likely to be arrested than white men in the UK.
Individual-oriented crime prediction systems such as the Offender Assessment System (OASys) and the Offender Group Reconviction Scale (OGRS) are further examples where prediction models can produce biased results. OASys and OGRS are used to assess the risk offenders pose to others and how likely an offender is to re-offend, as well as to assess offender needs. A 2014 National Offender Management Service analysis found that the algorithms used by these systems "generate different predictions based on race and gender".