Legal measures required to guarantee internet safety, says digital minister

In an appearance before the House of Commons Science and Technology Committee, Margot James, minister for digital and the creative industries, said that the government is considering legal measures to control the behaviour of social media companies.

In 2017, the government published its Internet Safety Strategy green paper, which laid out a set of broad principles for appropriate online conduct. The paper states that what is unacceptable offline should be unacceptable online; that all users should be empowered to stay safe online; and that technology companies have a responsibility for the content they host. The government is due to publish a white paper this winter laying out how these principles could be enforced through both legislative and non-legislative mechanisms.

James confirmed to the committee that the government was now considering introducing laws to guarantee internet safety.

“It is now our consideration that some legal measures will be required, they will need to be enforced and that will be by an approved regulator, although we have not come to any conclusion [as to which],” James said, adding that it was possible that a new regulator could be established for this purpose.

In January, the German government began enforcing a new law requiring large social media companies to remove “obviously illegal” content within 24 hours of it being reported, or else face fines of up to €50m (£44m). The policy was broadly seen as a test case for government regulation of social media content; elsewhere in Europe, governments had gone no further than seeking voluntary agreements with social media companies on limiting dangerous content online.

Some groups – including Human Rights Watch – criticised the law for potentially leading to heavy-handed censorship, and government officials later acknowledged that an amendment could be added to help users restore content that was inappropriately removed.

James told the committee that the government was considering introducing similar legal requirements for social media companies, and was awaiting data and feedback on the German legislation before UK rules could be debated in the next parliamentary session. She said she felt the voluntary approach to removing dangerous content had not been working, hence her support for legal requirements.

“I think a lot of the highly dangerous material has gone under the radar in the dark web, but there’s still too much material that is available, apparently, on various platforms, and it takes them too long to remove it,” she said. She added that in some cases voluntary removal was inaccurate, and that “it shouldn’t be just left to the platforms to decide what is and what isn’t acceptable”.

MPs also questioned James about a new law requiring websites to verify that visitors are at least 18 years old before they can access explicit adult content; its introduction has been delayed by uncertainty over how it could be enforced. James said she was confident that “the majority of large commercial porn producers” would comply with the requirements, and that she hoped the age-verification process would be applied in full from Easter 2019.

James also acknowledged that social media age filters are “not robust enough” and that technological innovation will be needed to bring about more robust verification systems. She said the government was “working towards a system” of digital ID issued to young people, which could potentially be used to control access to social media and adult content as well as to deliver government services online, although this would not be seen before 2021 at the earliest.
