Online Safety Bill ‘not fit for purpose’, IT experts say
The Online Safety Bill, which will compel social media platforms to tackle online harms, is “not fit for purpose”, IT experts have said.
In a new poll conducted by BCS, The Chartered Institute for IT, some 46 per cent of tech professionals said the bill was not workable, with only 14 per cent believing the legislation was ‘fit for purpose’.
The Online Safety Bill would put a duty on tech giants like Facebook and Google to develop systems to identify and remove illegal material, as well as deal with content that is harmful to adults and children. Ofcom would enforce this as the regulator.
But the poll found that the majority of IT specialists (58 per cent) said it would have a negative effect on freedom of speech, with only 19 per cent saying that the measures proposed would make the internet safer.
Last month, the Institute of Economic Affairs warned the bill could hand the Secretary of State and Ofcom “unprecedented powers to define and limit speech, with limited parliamentary or judicial oversight”.
There were nearly 1,300 responses from tech professionals to the survey by BCS.
The government has stressed the bill does not require removal of legal content, but larger platforms must set out how ‘priority content that is harmful to adults’ – such as suicide-related material – is dealt with by their service.
This could include specifying that such content would be removed, deprioritising it in news feeds, or simply making clear what content is freely accessible. Types of harmful content will be specified by parliament; platforms must also take steps not to limit freedom of speech.
Just 9 per cent of IT specialists polled said they were confident that ‘legal but harmful content’ could be effectively and proportionately removed.
Some 74 per cent of tech specialists said they felt the bill would do nothing to stop the spread of disinformation and fake news.
Just one quarter (25 per cent) felt it was realistic for Ofcom to ask platforms to ‘develop or source’ accurate technology to detect sexual abuse material; 57 per cent said it was not.
The Online Safety Bill is now due to return to Parliament after the new Prime Minister is chosen on 5 September.
Rob Deri, chief executive of BCS, said: “There is a real need to prevent online harm, but this law only goes part way to trying to achieve that. The aim should be to prevent hatred and abusive online behaviours, by stopping harmful material appearing online in the first place – and that takes a mix of both technical and societal changes.
“A new Prime Minister should take the opportunity to fundamentally review the Online Safety Bill in its current form.
“The technology itself has an important part to play in keeping people safe on social media platforms. However, the Bill leans too heavily on tech solutions to prevent undesirable content, which can’t be relied upon to do that well enough, and could affect freedom of speech and privacy in ways that are unacceptable in a democratic society.
“The legislation should also focus on substantive programmes of digital education and advice, so that young people and their parents can confidently navigate the risks of social media throughout their lives.”