Knives out: tech tackles violent crime
Knife crime across the UK has risen steeply in recent years – can advanced technology help the police put an end to such violence?
Knife crime offences in the UK rose by 7 per cent in the 12 months to the end of June 2019, reaching a record high. It’s a worrying trend, both in itself and because it makes people feel less safe on the streets. But a range of advanced technologies, including controversial predictive policing tools, could help to halt such attacks.
The number of offences involving a knife or sharp instrument reached 47,513 in England and Wales in the year ending June 2019, according to figures provided by the Office for National Statistics (ONS). What are the reasons for this increase in knife violence?
Explanations range from a rise in gang- and drug-related violence, to police budget cuts. This is a politically charged issue, with Theresa May insisting, during her time as Prime Minister, that there is “no direct correlation” between the rise in knife crime and a reduction of 20,000 in police numbers since 2010. However, in 2018 a Home Affairs Committee report said police forces were “struggling to cope” due to cutbacks, while a leaked Home Office report said falling police numbers had “likely contributed” to a rise in violent crime, the BBC reported.
In a bid to fight knife crime, the government has pledged to recruit 20,000 new police officers over the next three years and to introduce tougher sentences for serious offenders. Home Secretary Priti Patel recently announced a £35m cash boost to tackle violent crime across the 18 worst-affected areas in the country. The Offensive Weapons Act also makes it harder to obtain and own knives, including so-called zombie knives, and the Home Secretary is taking further action against retailers found to be selling knives to children: Ministry of Justice figures show one in five knife crimes are committed by under-18s.
Knife crime statistics
Official figures show the number of recorded offences involving a knife or sharp instrument has risen for five consecutive years.
The police recorded 47,513 offences involving a knife or sharp instrument in England and Wales in the year ending June 2019 – the highest number since the year ending March 2011, the earliest point for which comparable data are available.
Furthermore, the volume of knife and sharp instrument offences has increased by 44 per cent since the year ending March 2011.
While such measures will undoubtedly help, some experts believe technology will have an increasingly important role to play, even if critics worry we could be creeping towards the dystopian future portrayed in the film ‘Minority Report’, in which ‘criminals’ can be caught and punished even before crimes are committed.
Several police forces across the UK have been trialling facial-recognition systems to help catch criminals. South Wales Police is using automated facial-recognition (AFR) software, which can automatically detect faces in an image or video, such as CCTV, and compare them with a database of around 500,000 faces of ‘persons of interest’. It can also be deployed in real time so that live camera feeds of faces can be compared against a specially made watch list to ‘locate’ individuals. This means it can be used at big events, such as sporting fixtures and music concerts, to keep an eye on known troublemakers, for example.
South Wales Police says AFR has assisted in the identification of “hundreds of suspects” across South Wales. Responding to privacy concerns, the force says the system cannot be used to identify anyone who is not on a watch list, and that images of people who are not matched are not retained.
Similarly, the Metropolitan Police (the Met) carried out a total of ten public trials of live facial recognition across London. NEC’s NeoFace technology was used in targeted areas to analyse facial features – including the overall structure of the face and the distances between eyes, nose, mouth and jaw – and generate facial data to check against a watch list of offenders. If there was a match, officers could decide to stop the person of interest. While not commenting on the success rate of the trials, the Met has ceased using the technology while its merits are reviewed.
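NEC does not publish NeoFace’s internals, but systems of this kind generally reduce each face to a numeric feature vector (an ‘embedding’) and compare it against stored watch-list templates. A minimal illustrative sketch in Python – all names and the similarity threshold are hypothetical:

```python
import math

def cosine_similarity(a, b):
    """Cosine similarity between two equal-length feature vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(y * y for y in b))
    return dot / (norm_a * norm_b)

def match_against_watchlist(probe, watchlist, threshold=0.8):
    """Return (name, score) for the best watch-list match above the
    threshold, or None when nobody on the list is similar enough –
    mirroring the 'no match, nothing retained' policy described above."""
    best = None
    for name, template in watchlist.items():
        score = cosine_similarity(probe, template)
        if score > threshold and (best is None or score > best[1]):
            best = (name, score)
    return best
```

In a live deployment the embeddings would come from a trained face-recognition model; here they are just plain lists of numbers to show the matching step.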
Facial recognition could further aid the police in stop-and-search operations, if they ever choose to use live facial-recognition technology in conjunction with wearable cameras. Like similar systems, Digital Barriers’ technology identifies faces from video footage and compares them, in real time, to a watch list of people of interest.
However, what makes its technology different is its secure video-streaming capability that enables users to transmit pictures and audio at bandwidths as low as 9kbps, according to Farah Plange, marketing manager of Digital Barriers. “The combination of these two technologies enables us to offer real-time facial recognition from a body-worn camera, smartphone or other connected device, even when network conditions are poor, thereby supporting police and security forces with an easily deployable and highly effective tool,” she says.
This means it could prove useful in stop-and-search operations, which are arguably an important tool in fighting knife crime, but are not without controversy about bias and discrimination. Plange says the technology could ensure individuals are not unfairly targeted and allows an officer in a potential stop-and-search scenario to very quickly confirm whether an individual is on an active police watch list, providing crucial intelligence as to whether there is reasonable justification for a search.
“Meanwhile, an individual not identified as a person of interest and whose behaviour does not otherwise lead to reasonable grounds for suspicion can be sent on their way, freeing up valuable time to target those most likely to threaten public safety while minimising concerns around bias,” she adds.
As well as pinpointing known criminals, technology can also be used to detect weapons themselves. Of course, metal detectors can help, but distinguishing concealed blades from other items such as keys and coins is challenging for the police, particularly among large crowds. This is why the Home Office and Department for Transport have shared £460,000 among six companies to fast-track innovative technologies that aim to detect people carrying knives in crowded places such as streets, railway stations and major events.
While there is little information available about the proofs of concept delivered by Security Screening Technologies, Iconal Technology, Loughborough University, Xenint, Thales UK, and Advanced Nano Tech and Scientific, they are exploring how solutions involving radar and electromagnetic and acoustic sensors could help detect steel-bladed knives. And there are plenty of other companies developing indicative detection technologies too.
US Customs and Border Protection already uses ThruVision’s people-screening technology, and it is now being trialled on the London Underground. The system detects objects concealed under clothing – including knives – because they block the wearer’s body heat, and it works at distances of up to 10 metres. Officers can identify the size, shape and location of any concealed object that could be used as a weapon without needing physical searches, keeping busy stations flowing. The technology does not show intimate body parts, and it is impossible to tell a person’s gender, age or ethnicity from the imagery it produces. It is already used on the Los Angeles Metro and was trialled at Stratford station for five days in September 2019.
Another system, developed by Canadian company Patriot One, uses radar, magnetic, video and chemical detection sensors, plus AI software, to detect concealed weapons. It is one of many security measures being developed by private firms that harness the power of artificial intelligence (AI). “If an individual with a knife were to pass by one of these cognitive radar or magnetic sensors (or the combination of both working together), it would immediately send an alert to security officers, who could then identify and intercept the individual,” explains Martin Cronin, CEO of Patriot One.
The technology, called the PATSCAN multi-sensor covert threat detection platform, can also be used in conjunction with CCTV systems, and sensors deployed covertly to detect weapons on people or in a bag without the need for invasive, intrusive or prejudicial measures, says Cronin.
“This means that in a crowded environment like a train station or shopping centre, video-recognition systems that are enhanced by AI-powered threat-object recognition software can be used to identify and flag forbidden objects such as knives,” he says. “Also, in this instance, cognitive radar and/or magnetic technologies can be used to screen for concealed weapons at the double door entry to these facilities. If an individual with a knife were to pass through one of these sensors, it would alert security officers who could then use the surveillance camera system to identify and intercept the individual before they have the chance to draw and use the weapon.”
One of the possible reasons for the increase in knife crime is fewer bobbies on the beat, but technology could help accelerate some of the most time-consuming tasks. For example, South Wales Police says: “Prior to AFR, identification of a suspect typically took two weeks to achieve. The same result can now be achieved the same day.” The force says AFR allows resources to be deployed elsewhere to protect its communities.
Another technology that has yet to be used by UK police forces is Genetec Valcri – a crime investigation tool that helps crime analysts to speed up the process of reviewing evidence, so that meaningful patterns can be more easily spotted and investigatory leads prioritised. It enables the typical analyst to achieve in two to three hours what would previously have taken two to three days, according to David Petrook, the product group director of Genetec.
The tool can analyse information from multiple databases, applying its correlation engine and real-time semantic search. When investigating a knife attack, investigators might start by looking for similar cases in terms of time, location or method, but there is no guarantee that the word ‘knife’ will have appeared in previous police reports, with ‘weapon’, ‘blade’ and ‘sword’ all commonly used. “Valcri helps to make these associations ensuring that investigators are presented with all of the variables, which enables them to enrich the trail of evidence,” Petrook says.
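Valcri’s correlation engine is proprietary, but the synonym problem Petrook describes can be illustrated simply: expand the query term before searching free-text incident records. A sketch, with a hypothetical term map:

```python
# Hypothetical synonym map for illustration – a real system would draw
# on a maintained ontology rather than a hard-coded dictionary.
WEAPON_SYNONYMS = {
    "knife": {"knife", "blade", "sword", "machete", "weapon"},
}

def search_reports(reports, term):
    """Return the indices of reports mentioning the term or any synonym."""
    terms = WEAPON_SYNONYMS.get(term, {term})
    hits = []
    for i, text in enumerate(reports):
        words = set(text.lower().split())
        if words & terms:          # any overlap between report and query terms
            hits.append(i)
    return hits
```

A plain keyword search for ‘knife’ would miss a report that only says ‘blade’; the expanded query catches both.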
The software can suggest possible relationships between crimes and individuals. “For example, Valcri can connect the dots and uncover that crime was committed by a perpetrator arrested for another crime,” he explains. Valcri then visualises the results in a form that enables investigators to test hypotheses quickly, and decide the best next steps for pursuing a resolution to that case.
Valcri is designed to ensure historical data is properly processed and analysed so that police can make sensible, informed decisions on how best to deploy their resources, but it is not able to predict future incidents. However, work is under way to do just that.
Cambridge University is at the forefront of research into how assault data can forecast fatal stabbings. So far, criminologists working with the Met have shown that the number of assaults resulting in knife injuries in one year correlates with an increased risk of deadly knife crime in the same small areas the following year.
For the study, published in the Cambridge Journal of Evidence-Based Policing, DCI John Massey from the Met’s Homicide Command trawled through thousands of knife-crime records to pick out and ‘geo-code’ incidents where people were stabbed but survived during the 2016/17 financial year, in what is the first dataset of non-fatal knife assault ‘hotspots’ in the UK.
He discovered that 3,543 knife assaults had occurred during the 12-month period – a ratio of 66 non-fatal stabbings for every knife homicide that year. Each assault was matched to one of London’s 4,835 local census areas and compared to the locations of the 97 homicides from the following financial year. The data showed that of the 41 neighbourhoods that had six or more injuries from knife assaults in the first year, 15 per cent went on to experience a homicide in the next year.
The researchers say their analysis reveals a large increase in homicide risk: the biggest assault hotspots were 15 times more likely to suffer a knife homicide the following year than areas with no assaults. They believe the data provides a ‘consistent pattern’ of greater knife-homicide risk, meaning the police could be alert to particularly risky areas and deploy their resources more efficiently. Study author Professor Lawrence Sherman says: “When combined with intelligence-gathering on the streets, this form of data analysis could enhance the effectiveness of scarce resources to create a new and more powerful preventative toolkit. Our study is just the first step.”
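The study’s core comparison can be reproduced on toy data: geo-code each assault to an area in year one, then compare year-two homicide rates between high-assault ‘hotspots’ and everywhere else. A sketch using synthetic data, not the study’s real figures:

```python
from collections import Counter

def hotspot_homicide_rates(assault_areas, homicide_areas, threshold=6):
    """Return (hotspot_rate, other_rate): the share of areas in each
    group that records a homicide in the following year. An area is a
    'hotspot' if it saw at least `threshold` assaults in year one."""
    assaults = Counter(assault_areas)        # year-one assault counts per area
    homicides = set(homicide_areas)          # areas with a year-two homicide
    all_areas = set(assault_areas) | homicides
    hotspots = {a for a in all_areas if assaults[a] >= threshold}
    others = all_areas - hotspots

    def rate(areas):
        return sum(a in homicides for a in areas) / len(areas) if areas else 0.0

    return rate(hotspots), rate(others)
```

With the study’s real inputs – 3,543 geo-coded assaults across 4,835 census areas and 97 subsequent homicides – this kind of comparison is what yields the ‘15 per cent of high-assault areas’ figure.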
‘Predictive’ tools have already been trialled in the real world. One such system, PredPol, is trained on historical crime data to highlight hotspots where crimes may be more likely to occur. Kent Police was the first force in England and Wales to introduce the ‘predictive policing’ system, in 2013, but has since stopped using the software: while it was useful for proactive policing, Supt John Phillips told the BBC it was “challenging” to demonstrate whether the system had enabled police to reduce crime.
PredPol’s technology has been used by approximately 60 police departments in the US, where it has attracted criticism that it is not making policing fairer and more accountable as hoped. Although the AI tool identifies areas in a neighbourhood where serious crimes are more likely to occur, not who is likely to commit them, civil rights organisations still have ethical concerns. They worry about bias being built into the software, and about it reinforcing prejudices about which neighbourhoods are ‘good’ and ‘bad’, perhaps changing the way the police deal with crimes, however small.
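PredPol’s model is proprietary, but the family of techniques it belongs to – place-based forecasting – can be approximated, crudely, by ranking map grid cells by historical incident counts. A simplified sketch; the cell size and ranking rule are assumptions for illustration:

```python
from collections import Counter

def top_hotspots(incidents, cell_size=0.005, k=3):
    """incidents: list of (lat, lon) pairs. Buckets each incident into a
    grid cell roughly 500m across and returns the k busiest cells,
    busiest first."""
    def cell(lat, lon):
        return (round(lat / cell_size), round(lon / cell_size))

    counts = Counter(cell(lat, lon) for lat, lon in incidents)
    return [c for c, _ in counts.most_common(k)]
```

This also makes the civil-rights critique concrete: the forecast is only ever a reflection of where incidents were recorded before, so biased recording patterns feed straight back into where patrols are sent.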
Regulating the sale of certain knives, education programmes and stop-and-search campaigns will all almost certainly help cut the number of knife assaults. It may take time to resolve the ethical concerns surrounding AI tools that predict crime hotspots, but technology seems certain to play a role in fighting knife crime.
Critically, the intelligent policing tools being developed are not designed to take the place of officers, but to help them make better decisions in a transparent way and to bear the brunt of time-consuming admin tasks, leaving them free to patrol the streets. The future of fighting knife crime will surely be found, at least partially, in technology.
Could AI predict who will reoffend?
Police forces across England and Wales are developing AI models to assess reoffending risk for convicted offenders.
Durham Constabulary was the first force in the UK to use the AI-based Harm Assessment Risk Tool, or HART. Its purpose was to help custody officers decide whether a suspect would be suitable for deferred prosecution, based on their risk of reoffending.
The Chief Constable praised the tool for allowing magistrates’ courts to focus on the more serious offences. However, it came under scrutiny for using Experian Mosaic data, which arguably classifies UK postcodes, households and even individuals into stereotypes, to predict a suspect’s reoffending risk.
Silkie Carlo, director of Big Brother Watch, says: “For police to feed these crude and offensive profiles through artificial intelligence to make decisions on freedom and justice in the UK is truly dystopian.”
West Midlands Police is building an AI system to assess reoffending risk, while accommodating the concerns of its ethics committee. In September 2019, E&T was told that the tool had reached 75 per cent accuracy in correctly predicting that someone would reoffend, and was “near perfect” at not misidentifying people as ‘high risk’, but it had yet to be piloted in real-life conditions.
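The two figures quoted – overall accuracy, and how rarely people are wrongly labelled ‘high risk’ – correspond to standard classifier metrics: accuracy and the false-positive rate for the high-risk class. A sketch of how they would be computed (the labels and data are hypothetical, and the force’s exact evaluation protocol is not public):

```python
def evaluate(predicted, actual, positive="high"):
    """Return (accuracy, false_positive_rate) for the 'high risk' label.
    The false-positive rate is the share of people who did NOT reoffend
    who were nonetheless flagged as high risk."""
    assert len(predicted) == len(actual)
    correct = sum(p == a for p, a in zip(predicted, actual))
    accuracy = correct / len(actual)
    non_reoffenders = [p for p, a in zip(predicted, actual) if a != positive]
    fpr = (sum(p == positive for p in non_reoffenders) / len(non_reoffenders)
           if non_reoffenders else 0.0)
    return accuracy, fpr
```

A ‘near perfect’ score on the second metric means the false-positive rate is close to zero – which matters most to the ethics committee, since those are the people who would be wrongly treated as dangerous.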