TikTok deleted more than 49 million videos for breaking content rules
Image credit: Tashatuvango/Dreamstime
TikTok removed more than 49 million videos in just six months last year, the video app’s latest transparency report has revealed.
The popular video-sharing app said it took down 49,247,689 videos in the second half of 2019, more than 98 per cent of which were removed before they were reported by users.
Just over a quarter of those removed videos in December 2019 broke the site’s rules on adult nudity and sexual activities, with another 24.8 per cent violating rules designed to protect minors.
“Around the world, tens of thousands of videos are uploaded on TikTok every minute,” the firm said in a blog post. “With every video comes a greater responsibility on our end to protect the safety and well-being of our users.”
It added: “As a global platform, we have thousands of people across the markets where TikTok operates working to maintain a safe and secure app environment for everyone.”
The firm said it was currently only able to publish figures on the type of video violation for one month of the period in question, because its new content moderation system was introduced only in December. It said it would be able to share this data for the full period covered by each report in the future.
During the month of December, 25.5 per cent of the videos taken down fell under the category of adult nudity and sexual activities. Another 24.8 per cent violated minor safety policies, which include content depicting harmful, dangerous or illegal behaviour by minors, such as alcohol or drug use. These were removed due to “an abundance of caution for child safety”.
Content containing illegal activities and regulated goods made up 21.5 per cent of take-downs, while 15.6 per cent violated the app’s suicide, self-harm and dangerous acts policy.
Of the remaining videos removed, 8.6 per cent violated violent and graphic content policies, 3 per cent fell under “harassment and bullying” and less than 1 per cent contained content that violated policies on hate speech, dangerous individuals and organisations, and “integrity and authenticity”.
According to the report, just over two million of the total removed videos were from users in the UK. It added that the 49 million removed videos accounted for “less than 1 per cent” of all the videos created in the app, from 1 July to 31 December 2019.
Furthermore, the firm revealed that it received 500 legal requests for user information from government agencies in connection with law-enforcement investigations over the six-month period, 10 of which were made in the UK.
The platform has come under increased scrutiny in recent weeks because of its Chinese roots and ongoing concerns, particularly in the US, that links to the country could pose a security risk for users.
Earlier this week, US Secretary of State Mike Pompeo said that the Trump administration was “looking at” banning Chinese apps such as TikTok.
The video app did not address the issue directly in its transparency report but said it was “committed to taking a responsible approach” to building its platform.
“We’re working every day to be more transparent about the violating content we take down and offer our users meaningful ways to have more control over their experience, including the option to appeal if we get something wrong,” the company said.