
Big Brother Watch highlights police force's 'chilling' use of Experian's profiling tool

Image credit: Dreamstime

Machine learning system fed profiles generated by model that categorises people under headings including “disconnected youth” and “Asian heritage” - but force insists aim is improving life chances.

Privacy group Big Brother Watch has said it is “chilling” that police have used profiles generated by a credit check company as part of a new semi-automated custody process.

Durham Constabulary last year became the first force in the country to use a so-called ‘random forest’ machine learning system to assess the risk of certain individuals reoffending.

The police force uses the Harm Assessment Risk Tool (HART) - an algorithm developed at Cambridge University - to provide guidance to custody officers in some cases.

The system predicts an individual’s risk of reoffending based on 34 variables, most of which focus on prior history of criminal behaviour. It is used as part of a scheme called Checkpoint which aims to divert ‘low risk’ offenders from prosecution and prison.
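Neither HART’s code nor its training data are public, but the general technique it is reported to use - a random forest classifier over a few dozen variables - can be sketched briefly. The Python example below uses scikit-learn with entirely invented data and 34 placeholder variables; it is a minimal illustration of the approach described above, not Durham Constabulary’s actual model.

```python
# Hypothetical sketch of a random forest risk model in the spirit of HART.
# HART's real features, data and code are not public; everything below is
# invented purely to illustrate the technique named in the article.
import numpy as np
from sklearn.ensemble import RandomForestClassifier

rng = np.random.default_rng(0)

# Stand-ins for the 34 custody variables (mostly prior offending history).
n_offenders, n_features = 1000, 34
X = rng.integers(0, 10, size=(n_offenders, n_features))

# Stand-in outcome labels: 0 = low risk, 1 = medium risk, 2 = high risk.
y = rng.integers(0, 3, size=n_offenders)

# A random forest averages many decision trees fitted to bootstrap samples.
model = RandomForestClassifier(n_estimators=500, random_state=0)
model.fit(X, y)

# Score a new custody record (again, invented values).
new_record = rng.integers(0, 10, size=(1, n_features))
print(model.predict(new_record))        # e.g. [1] -> medium risk
print(model.predict_proba(new_record))  # per-category probabilities
```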

Labour MP David Lammy’s review of the treatment of Black, Asian and minority ethnic (BAME) people in the UK’s criminal justice system highlighted Checkpoint as a positive example of a model that could be replicated more widely to bring about greater equality in the system.

As part of the programme, certain offenders have prosecution deferred - provided they agree to submit to various structured interventions, for example treatment for drug or alcohol addiction.

Individuals’ age, gender and place of residence, but not their ethnicity, are said to be taken into account by the HART tool.

However, Big Brother Watch has now uncovered a register of contracts showing that Durham Constabulary paid £25,913 to the company Experian for its ‘Mosaic’ system, which claims to combine “unparalleled data resources” to profile households and individuals in the UK.

Among other things, Mosaic sorts people into groups, including ones labelled “disconnected youth”, “dependent greys” and “Asian heritage”. Big Brother Watch has suggested that this could perpetuate stereotypes, with potentially “dystopian” implications for people with whom the police come into contact.

Silkie Carlo, the group’s director, said: “For a credit checking company to collect millions of pieces of information about us and sell profiles to the highest bidder is chilling. But for police to feed these crude and offensive profiles through artificial intelligence to make decisions on freedom and justice in the UK is truly dystopian.

“We wouldn’t accept people going through our bins to collect information about us. Nor should we accept multi-billion pound companies like Experian scavenging for information about us online or offline, whether for profit or policing.”

A spokesman for Experian told E&T: “Experian Mosaic is a classification tool which helps organisations understand someone’s likely lifestyle characteristics based on where they live. Most of the information we use is from consumer surveys and public data, such as the census. We use anonymised, aggregated data to build the segments and factual information about individuals is never shared with any organisation.”

He added: “The Mosaic names are built on information, not on stereotypes. We don’t make any assumptions when we’re creating a new Mosaic, and we work hard to make sure our segmentation reflects what’s really happening in a particular area or group of people.”

Sheena Urwin, head of criminal justice at Durham Constabulary, said the force had worked with Experian to improve its understanding of, and engagement with, local communities.

“Our aim is to reduce harm to the communities we serve and improve life chances for the people we come into contact with,” she said. She stressed the force was “continuing to evaluate the research with our academic partners”.

The aim of the HART process is to sort offenders into categories. These are ‘high risk’ (at risk of committing a new serious offence: murder, attempted murder, grievous bodily harm, robbery, a sexual offence or a firearms offence), ‘medium risk’ (at risk of committing a new offence defined as non-serious) and ‘low risk’ (no reoffending of any kind).
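As a rough illustration of that three-way split, the short Python function below maps an observed follow-up offence to one of the categories listed above. The function and the offence strings it checks are hypothetical; only the category definitions come from the article.

```python
# Hypothetical helper mapping an observed outcome to the three risk
# categories described in the article; an illustration only, not
# Durham Constabulary's code.
from typing import Optional

SERIOUS_OFFENCES = {
    "murder", "attempted murder", "grievous bodily harm",
    "robbery", "sexual offence", "firearms offence",
}

def risk_category(new_offence: Optional[str]) -> str:
    """Return 'high', 'medium' or 'low' for an observed follow-up offence."""
    if new_offence is None:
        return "low"      # no reoffending of any kind
    if new_offence in SERIOUS_OFFENCES:
        return "high"     # new serious offence
    return "medium"       # any other (non-serious) new offence

print(risk_category("robbery"))      # high
print(risk_category("shoplifting"))  # medium
print(risk_category(None))           # low
```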

Real-world outcomes are monitored against the machine’s predictions, and one academic involved in trials of HART said it was now running at around 90 per cent accuracy. Officers are advised to continue using their discretion rather than blindly following what the computer says. Durham Constabulary is particularly keen to identify the ‘medium risk’ group, as these people are the intended recipients of Checkpoint interventions.
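Comparing predictions with observed outcomes is a standard evaluation step for any classifier, and a minimal sketch of such monitoring might look like the Python below. The example data are invented, and the roughly 90 per cent figure quoted above comes from one academic involved in the trials rather than from this calculation.

```python
# Hypothetical sketch of monitoring predictions against real-world outcomes.
# The figures are invented; the article only reports an accuracy of around
# 90 per cent, as cited by one academic involved in the trials.
from sklearn.metrics import accuracy_score, confusion_matrix

predicted = ["low", "medium", "high", "medium", "low", "high", "medium", "low"]
observed  = ["low", "medium", "high", "low",    "low", "high", "medium", "low"]

print(accuracy_score(observed, predicted))   # fraction of exact matches
print(confusion_matrix(observed, predicted,
                       labels=["low", "medium", "high"]))
```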

Big Brother Watch was founded by Matthew Elliott, one of the leading figures in Vote Leave and the founder of the libertarian TaxPayers’ Alliance. Its office is based in the same Westminster building as six other right-of-centre organisations, including the climate-sceptic Global Warming Policy Foundation.
