Suicide online

Social media bosses called to solve self-harm video epidemic


The suicide prevention charity Samaritans has been invited to an upcoming meeting of social media companies, initiated by the UK government, to discuss disturbing content on their platforms.

The growing prevalence of such content on social media sites has led the government to get involved in tackling the issue.

It has initiated a second meeting with representatives from social media giants Facebook, Google, Snapchat and Instagram, alongside Samaritans, to discuss plans to clean up the web.

Health Secretary Matt Hancock - who limits the access his children have to social media because of the risks it would expose them to - said he was "delighted to announce a world-leading partnership, which will see us team up with Samaritans to enable social media companies to go further in achieving our goal of making the UK the safest place to be online".

According to data from the Office for National Statistics (ONS), 5,821 suicides were registered in 2017 alone, corresponding to a rate of 10.1 deaths per 100,000 citizens (or 7.6 per 100,000 in 2016 when adjusted for age, according to WHO data).

The age group with the highest suicide rate was found to be adults aged between 45 and 49, older than the average age of social network users.

The data shows no clear connection between how much time users spend on social media and suicide rates.

According to 2017 survey results from GlobalWebIndex, Filipinos spent around 3.57 hours on social media every day, among the highest figures of the countries surveyed and roughly twice as much as UK citizens, yet the Philippines' suicide rate is only 5.2 deaths per 100,000, lower than the rates estimated for the UK. UK users, by contrast, spend around 1.54 hours per day on social media but experience a higher suicide rate.

Because young children are particularly vulnerable, greater attention is being paid to how they are affected by online material, and experts are calling for more collaboration. "There has been a worrying growth of dangerous online content, which is an urgent issue to combat and something we cannot solve alone," said Ruth Sutherland, chief executive of Samaritans.

A previous summit resulted in Instagram banning images portraying self-harm.

In the lead-up to the meeting, Hancock promised to work with social media companies which agreed that “normalising or glamorising of eating disorders, suicide and self-harm on social media platforms is never acceptable and the proliferation of this material is causing real harm”.

A working paper was announced that pushes for establishing a legal basis for a “new duty of care towards [online] users, which will be overseen by an independent regulator”.

Under the proposed law, firms would face increased scrutiny over how they tackle a comprehensive set of online harms, ranging from "illegal activity and content to behaviours which are harmful but not necessarily illegal".

In mid-April, new rules were proposed that could force Facebook and Instagram to deactivate 'like' buttons and geolocation tracking services for children in the UK.

In 2017, Molly Russell, 14, took her own life, prompting her family to search her social media accounts, where they found material relating to depression and suicide. Following her death, the government and social media companies have come under increased pressure to act.
