Make social media companies liable for abusive content, ethics watchdog concludes
A Parliamentary committee has published a report which recommends introducing laws to make internet companies such as Facebook, Twitter and Google criminally responsible for illegal intimidatory content hosted on their platforms.
Although the inquiry is largely focused on the intimidation of politicians and political campaigners on social media, it is sure to have implications for other abusive content posted online. The Committee on Standards in Public Life began the inquiry following reports of serious online abuse of parliamentary candidates during the 2017 general election. In some cases, social media companies were accused of harmful passivity, even after being notified of abuse of candidates using their platforms.
“The widespread use of social media has been the most significant factor accelerating and enabling intimidatory behaviour in recent years,” the committee wrote in its report, Intimidation in Public Life.
Since the murder of Labour MP Jo Cox by a constituent associated with far-right politics during the 2016 EU referendum campaign, the intimidation of politicians, both online and offline, has become a high-profile subject of discussion.
Candidates from already disadvantaged groups (women, ethnic minorities and LGBT people) were disproportionately likely to be targeted, the committee found, and the abuse directed at them is already discouraging some from standing.
While intimidation of politicians is nothing new, the committee said, there is now a serious issue with the scale and intensity of intimidation of public figures. The committee placed responsibility with social media companies for being too slow to take action, but also with political parties for negative campaigning and failure to call out abusive behaviour, and with the police for inconsistency in their handling of illegal intimidation.
The report recommends that – among other things – the government bring forward legislation to shift the liability for illegal web content onto social media companies. Currently, social media companies are not liable for the user-generated content hosted on their websites.
This is largely due to the EU E-Commerce Directive 2000, which treats social media companies as hosts of content, rather than publishers. This legislation is “out of date”, the committee said.
“Facebook, Twitter and Google are not simply platforms for the content that others post, they play a role in shaping what users see [...] with developments in technology, the time has come for the companies to take more responsibility for illegal material that appears on their platforms,” the report said.
According to the committee, social media companies must ensure they are equipped to make quick, fair decisions on whether or not aggressive content should remain online.
It also recommended that the government should consider making intimidation of campaigners and candidates an offence in electoral law, that police officers must be trained to investigate social media offences, and that political parties should work together to develop and enforce a code of conduct on intimidatory behaviour during elections.
Lord Paul Bew, chair of the committee, said in a statement that: “This level of vile and threatening behaviour, albeit by a minority of people, against those standing for public office is unacceptable in a healthy democracy. We cannot get to a point where people are put off standing, retreat from debate, and even fear for their lives as a result of their engagement in politics. This is not about protecting elites or stifling debate, it is about ensuring we have a vigorous democracy in which participants engage in a responsible way which recognises others’ right to participate and to hold different points of view.
“The increasing scale and intensity of this issue demands a serious response [...] we have been persuaded that the time has come for the government to legislate to shift the liability for illegal content online towards social media companies,” he continued.
“Many of the recommendations we are making today are not limited solely to election periods but will have wider relevance across our public life.”
In November, Chamath Palihapitiya, the former vice-president for user growth at Facebook, spoke critically about the social media giant at an event at Stanford Business School, stating that Facebook has helped create “short-term, dopamine-driven feedback loops that [...] are destroying how society works. No civil discourse, no cooperation, misinformation, mistruth.”