Committee challenges police use of AI
Police use of AI and facial-recognition technology is not subject to proper oversight and risks exacerbating discrimination, a parliamentary committee has warned.
The Lords Justice and Home Affairs committee said new technologies were being created in a “new Wild West” without the law and public awareness keeping up with developments, warning that the lack of oversight meant “users are in effect making it up as they go along”.
The cross-party group added that while AI has the potential to improve people’s lives, it could have “serious implications” for human rights and civil liberties in the justice system.
“Algorithms are being used to improve crime detection, aid the security categorisation of prisoners, streamline entry clearance processes at our borders and generate new insights that feed into the entire criminal justice pipeline,” the peers said.
The committee added that scrutiny was not ensuring new tools were “safe, necessary, proportionate, and effective”.
According to the group, police forces and other law-enforcement agencies were buying equipment in a “worryingly opaque” market, with details of how systems work kept secret because of firms’ insistence on commercial confidentiality.
As a result, they have called for a mandatory register of algorithms used in criminal justice tools, a national body to set standards and certify new technology, and new local ethics committees to oversee its use.
They have also called for a duty of candour to be imposed on the police so that there is full transparency. The committee said AI can have huge effects on people’s lives, particularly those in marginalised communities, and that without transparency there can be no scrutiny and no accountability when things go wrong.
Baroness Hamwee, chairwoman of the committee, said: “What would it be like to be convicted and imprisoned based on AI which you don’t understand and which you can’t challenge?
“Without proper safeguards, advanced technologies may affect human rights, undermine the fairness of trials, worsen inequalities and weaken the rule of law. The tools available must be fit for purpose and not be used unchecked.”
Hamwee said the committee welcomed the advantages AI can bring to the UK’s justice system, but not without adequate oversight. “Humans must be the ultimate decision-makers, knowing how to question the tools they are using and how to challenge their outcome,” she said.
The peers also raised concerns about AI being used in “predictive policing” (forecasting crime before it happens), warning that it could make problems of discrimination worse by embedding “human bias” in algorithms.
Professor Karen Yeung, an expert in law, ethics, and informatics at the University of Birmingham, told the committee that “criminal risk assessment” tools were not focused on white-collar crimes such as insider trading, due to lack of data, but instead focused on the crimes for which there was more information.
“This is really pernicious. We are looking at high-volume data that is mostly about poor people, and we are turning it into prediction tools about poor people,” Yeung argued. “We are leaving whole swathes of society untouched by those tools.”