UK public backs limits on ‘legal but harmful’ content online
Image credit: Reuters
Research carried out by anti-abuse campaign group Hope not Hate, together with other organisations, suggests that the public does not trust social media companies to deal with online abuse and hateful content. The research also found majority support for increased regulation of tech companies.
The research was conducted by Hope not Hate alongside Demos and the Antisemitism Policy Trust, among other groups. The study surveyed more than 1,500 people, weighted to be representative of the UK public.
It identified online abuse as a key issue among the public, with 73 per cent stating that they are worried about the quantity of this content on social media.
A large majority (74 per cent) of respondents said they do not trust social media companies alone to decide what counts as extreme content or disinformation on their platforms. There is strong public support for tighter regulation forcing these companies to take action against harmful content, with 71 per cent agreeing they should be held legally responsible for user-generated content on their platforms and 73 per cent agreeing they should be forced to remove harmful content when it appears.
The government’s long-awaited Online Safety Bill, which would impose on platforms a statutory duty of care to their users and levy large financial penalties for failures of that duty, is due to be scrutinised by legislators this month following delays. The proposals include plans to induce platforms to identify content which is “legal but harmful”, such as celebration of self-mutilation and suicide, and anonymous bullying.
The Hope not Hate research suggests strong public support for these measures. Eighty per cent of those asked stated that, while they believe in free speech, limits must be placed on it to prevent the spread of extremist content online. The report points to the existing legal obligations placed on broadcast media regarding “legal but harmful” content, such as under the Communications Act 2003, which gives Ofcom the duty to set standards for programme content.
Regarding specific types of harmful content, just nine per cent of respondents want racist content to be allowed on social media (eight per cent for specifically antisemitic content, including Holocaust denial), eight per cent for homophobic content, and nine per cent for sexist content.
“Allowing people to spew hateful and offensive content online is not a way to protect freedom of speech, but rather risks sowing divisions and amplifying the vile views of a tiny minority,” said Joe Mulhall, head of research for Hope not Hate. “At present, online speech that causes division and harm is often defended on the basis that to remove it would undermine free speech.
“In reality, allowing the amplification of such speech only erodes the quality of public debate and causes harm to the groups such speech targets. This defence, in theory and in practice, minimises free speech overall. As our polling shows, there is clearly an overwhelming consensus that hateful content, even when legal, is too visible on social media platforms.
“The only way to really make sure that everyone has freedom of speech is to protect anyone who is currently being attacked or marginalised based on characteristics such as race, gender, or sexual orientation. That’s why continuing to include legal but harmful content in the Online Safety Bill is the best way to ensure social media companies apply effective systems and processes to reduce the promotion of hate and abuse, while preserving freedom of expression.”
The report states: “Those condemning the bill on free speech grounds underestimate the potential for social inequalities to be reflected in public debate, and disregard the nature and extent of these inequalities in the 'marketplace of ideas'. As such, the position of some 'free speech' advocates can be paradoxical. They claim to be committed to valuing free speech above all else, propagating an unequal debate that further undermines the free speech of those who are already harmed by social inequalities.
“Some of those currently arguing for the removal of legal but harmful content from this legislation are instead proposing to criminalise speech that is currently legal – a proposal potentially at odds with the aim of preserving free expression.”