TikTok introduces child safety features
Wildly popular video-sharing app TikTok has added new safety features, including screen time management prompts and parental controls over the content children can view.
“Family safety mode” will enable parents to control the amount of screen time permitted on the app, send messages to their children’s accounts, and restrict certain types of content from appearing in their feeds.
TikTok – which was downloaded more than 700 million times last year – is a social media app for sharing short videos, often featuring lip-syncing, reactions, and comic skits set to music. The ByteDance-owned app is extremely popular among younger users, and has served as a vehicle to create memes, hit songs, and even celebrities. However, it has drawn criticism from users of other social media platforms for its potential to facilitate cyberbullying and child grooming – as well as for its alleged censorship of content critical of the Chinese Communist Party and violation of child privacy laws in the US.
Although the app introduced some screen time management tools last year, this update marks its most significant move yet towards addressing concerns about the wellbeing of its young user base. Users will now see prompts about screen time management appearing automatically in their content feed.
TikTok’s European head of trust and safety Cormac Keenan said that the prompts would “remind our community to be aware of the time they spend on TikTok and to encourage them to consider taking some time out.” He added that the company had worked with some of the platform’s most prominent figures on the prompt system.
“When people use TikTok, we know they expect an experience that is fun, authentic, and safe,” he said. “As part of our ongoing commitment to providing users with features and resources to have the best experience on TikTok, we are announcing family safety mode, a new feature to help parents and guardians keep their teens safe on TikTok.”
“We will keep introducing ways to keep our community safe so they can stay focused on what matters to them – creating, sharing, and enjoying the creativity of TikTok’s community.”
The announcement comes as the UK government finalises its plans to regulate major online platforms hosting user-generated content, particularly social media platforms. Last week, the government announced that Ofcom will be given responsibility for ensuring that internet platforms are protecting users against “online harms” such as terrorist propaganda and child exploitation. It is not yet known what punishments Ofcom will be able to wield to ensure these platforms abide by their statutory duty of care.
Meanwhile, the German cabinet has approved a bill that will require social media platforms to report certain forms of hate speech to law enforcement, including far-right propaganda, graphic portrayals of violence, murder and rape threats, and posts indicating preparation for a terrorist attack. The measures will also extend the definition of criminal hate speech to include threats of rape and vandalism.