
Facebook admits millions of child exploitation posts had to be deleted from its service

Image credit: Pexels

Antigone Davis, Facebook’s global head of safety, has revealed that over eight million pieces of content violating the site’s rules on child nudity and exploitation had been removed from the site in the last three months.

Over the last year, a machine-learning tool has been used to identify images that contain both nudity and a child, allowing increased enforcement of Facebook’s ban on photos that show minors in a sexualised context. The company also disclosed a similar system that catches users engaged in grooming, the befriending of minors with a view to sexual exploitation.
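The approach described above rests on two separate signals agreeing before content is flagged. The sketch below illustrates that idea only; the scoring functions, thresholds and names are hypothetical, not Facebook's actual models.

```python
# Illustrative sketch of a two-signal policy: an image is flagged under the
# child-nudity rules only when separate detectors indicate both nudity and a
# minor. Classifier scores and the threshold are invented for illustration.

def flag_image(nudity_score: float, minor_score: float,
               threshold: float = 0.8) -> bool:
    """Flag an image for review when both signals exceed the threshold."""
    return nudity_score >= threshold and minor_score >= threshold

# A high nudity score alone does not trigger this particular policy:
print(flag_image(0.95, 0.10))  # False: nudity detected, but no minor signal
print(flag_image(0.95, 0.90))  # True: both signals present
```

Requiring both signals is what distinguishes this policy from the general adult-nudity filter mentioned later in the article.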

In a blog post addressing the issue, Davis wrote that Facebook also works with organisations such as the US-based National Center for Missing and Exploited Children (NCMEC), reporting content to them.

“Our Community Standards ban child exploitation and to avoid even the potential for abuse, we take action on non-sexual content as well, like seemingly benign photos of children in the bath,” she said.

“With this comprehensive approach, in the last quarter alone, we removed 8.7 million pieces of content on Facebook that violated our child nudity or sexual exploitation of children policies, 99 per cent of which was removed before anyone reported it. We also remove accounts that promote this type of content.

“We have specially trained teams with backgrounds in law enforcement, online safety, analytics, and forensic investigations, which review content and report findings to NCMEC.”

Davis also revealed that the social media firm was helping NCMEC develop software to prioritise the reports it receives and shares with law enforcement, so that the most serious cases are addressed first.

“We also collaborate with other safety experts, NGOs and companies to disrupt and prevent the sexual exploitation of children across online technologies.

“For example, we work with the Tech Coalition to eradicate online child exploitation, the Internet Watch Foundation and the multi-stakeholder WePROTECT Global Alliance to End Child Exploitation Online. Next month, Facebook will join Microsoft and other industry partners to begin building tools for smaller companies to prevent the grooming of children online.”

The child grooming system evaluates factors such as how many people have blocked a user and whether that user quickly attempts to contact many children, Davis said.
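The factors Davis names suggest a risk score built from behavioural signals. The sketch below is purely illustrative: the field names, weights and thresholds are invented, not details of Facebook's system.

```python
# Hypothetical rules-based risk score using the two factors named in the
# article: how many people have blocked a user, and rapid contact with many
# minors. All weights and cut-offs here are illustrative assumptions.

from dataclasses import dataclass

@dataclass
class AccountActivity:
    blocks_received: int       # how many people have blocked this user
    minors_contacted_24h: int  # distinct minor accounts contacted in a day

def grooming_risk(a: AccountActivity) -> float:
    """Combine the behavioural signals into a simple 0.0-1.0 score."""
    score = 0.0
    if a.blocks_received >= 5:
        score += 0.5
    if a.minors_contacted_24h >= 10:
        score += 0.5
    return score

def needs_review(a: AccountActivity, threshold: float = 0.5) -> bool:
    """Queue the account for human review once the score reaches the threshold."""
    return grooming_risk(a) >= threshold
```

In practice such signals would feed a trained model rather than fixed rules, but the sketch shows why block counts and contact patterns are useful inputs.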

Michelle DeLaune, chief operating officer at NCMEC, said the organisation expects to receive about 16 million child porn tips worldwide this year from Facebook and other tech companies, up from 10 million last year.

DeLaune acknowledged that a crucial blind spot is encrypted chat apps and secretive ‘dark web’ sites, where much newly produced child abuse imagery originates. As the volume of reports grows, the prioritisation software will help decide which tips to assess and pursue first.
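Triaging tips so the most serious are handled first is, at its core, a priority-queue problem. The sketch below shows that pattern with Python's standard `heapq`; the severity scale and example tips are invented.

```python
# Sketch of report triage as a priority queue, the pattern the prioritisation
# software described above would implement: lowest severity number = most
# serious, handled first. Severity values and tip labels are hypothetical.

import heapq

def triage(reports):
    """Order (severity, tip) pairs by ascending severity (1 = most serious).

    The enumerate index breaks ties so tips never need comparing directly.
    """
    heap = [(severity, i, tip) for i, (severity, tip) in enumerate(reports)]
    heapq.heapify(heap)
    return [tip for _, _, tip in (heapq.heappop(heap) for _ in range(len(heap)))]

reports = [(3, "possible duplicate"),
           (1, "imminent risk to a child"),
           (2, "re-share of known image")]
print(triage(reports))  # most serious tip comes out first
```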

Before this software was developed, the firm relied on users and its adult nudity filters to catch child images, with a separate system blocking child pornography that had previously been reported to the authorities.
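The "separate system" for previously reported material works by matching uploads against a database of hashes of known images. Industry systems use robust perceptual hashes such as Microsoft's PhotoDNA, which survive resizing and re-compression; the plain SHA-256 lookup below only illustrates the matching pattern, with invented example data.

```python
# Simplified sketch of known-image matching: hash each upload and check it
# against a set of hashes of previously reported images. Real deployments use
# perceptual hashing (e.g. PhotoDNA), not exact cryptographic hashes as here.

import hashlib

# Hypothetical database of hashes of previously reported images.
KNOWN_HASHES = {hashlib.sha256(b"previously-reported-image-bytes").hexdigest()}

def is_known_image(image_bytes: bytes) -> bool:
    """Return True if the upload exactly matches a previously reported image."""
    return hashlib.sha256(image_bytes).hexdigest() in KNOWN_HASHES
```

An exact hash catches only byte-identical copies, which is why perceptual hashing matters in practice: a single pixel change defeats SHA-256 but not PhotoDNA.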

Facebook has not previously disclosed data on child nudity removals, although some would have been counted among the 21 million posts and comments it removed in the first quarter for sexual activity and adult nudity.

Throughout the course of the year, the social media giant has come under fire regarding data and inappropriate content. Last month, Facebook was sued over the exposure of 50 million users’ data, after two people filed a lawsuit against the company in a federal court in California.

In September 2018, Home Secretary Sajid Javid warned tech firms to do more to fight online paedophiles, cautioning that tens of thousands of youngsters are in danger of being groomed, exploited and blackmailed by sexual predators on the internet.

Also, in July, a Channel 4 investigation found that Facebook moderators have been treating abusive, violent and racist content with leniency, even when the content explicitly violates its guidelines.
