Facebook cracks down on users sharing misinformation; EU tightens content rules

Facebook has said it will take action against users who repeatedly share misinformation on its platform by reducing how many users see their posts.

Posts containing misleading content about Covid-19, vaccines, climate change, elections and other divisive topics will be reviewed by fact-checkers, whose conclusions will be displayed more prominently on the page.

Facebook said it already reduces the reach of individual posts deemed misleading, but this restriction will now apply to entire accounts.

Users will see a notification on misleading posts that includes the fact-checker’s article debunking the claim, as well as a prompt to share the article with their followers.

The notification also warns that people who repeatedly share false information may have their posts moved lower in News Feed, making them less likely to be seen by others.

Facebook first brought its fact-checking initiative to the UK in 2019. It works with independent charity Full Fact to review stories, images and videos which have been flagged by users, rating them based on their accuracy.

The European Commission also released new guidelines this week for its Code of Practice on Disinformation, stating that platforms such as Google and Facebook must not make money from advertising linked to false posts.

The Commission is also proposing to introduce legislation this year to improve the transparency of political advertising, as social media plays an increasingly important role in elections around the world.

“Disinformation cannot remain a source of revenue. We need to see stronger commitments by online platforms, the entire advertising ecosystem and networks of fact-checkers,” Thierry Breton, EU industry chief, said in a statement.

Vera Jourova, the Commission’s vice president for values and transparency, said the issue was urgent because of the fast-evolving threats posed by disinformation: “We need online platforms and other players to address the systemic risks of their services and algorithmic amplification, stop policing themselves alone and stop allowing to make money on disinformation, while fully preserving the freedom of speech”.

Facebook said it supported the Commission's focus on greater transparency for users and better collaboration amongst platforms and the advertising ecosystem.

The Code of Practice was first introduced in 2018 and includes Google, Facebook, Twitter, Microsoft, Mozilla, TikTok and some advertising and tech lobbying groups among its signatories. The Commission expects all signatories to set out how they will comply with the updated guidelines by the end of 2021 and to implement their plans by early next year.

In 2019, EU officials expressed frustration over Facebook’s reluctance to share important data related to its efforts to curb disinformation campaigns.
