
Facebook’s ‘recklessness’ put children at risk, says regulator

US regulators have proposed tightening privacy regulations after finding that Facebook misled parents and failed to protect the privacy of children using its Messenger Kids app.

The Federal Trade Commission (FTC) has proposed sweeping changes to a 2020 privacy order with Facebook's owner, Meta, to limit its use of children's data. 

The regulator found that the social media company misled parents about how much control they had over who their children could contact in the Messenger Kids app, and that it was deceptive about how much access app developers had to users' private data, in breach of a previous privacy agreement.

As a result, Meta could now face further limitations, including new privacy rules that would bar the company from making money off data collected from underage users, as well as restrictions on its use of facial recognition technology.

"Facebook has repeatedly violated its privacy promises," said Samuel Levine, director of the FTC's Bureau of Consumer Protection. "The company's recklessness has put young users at risk, and Facebook needs to answer for its failures."

Meta called the announcement a “political stunt”, adding: “Despite three years of continual engagement with the FTC around our agreement, they provided no opportunity to discuss this new, totally unprecedented theory.

“Let’s be clear about what the FTC is trying to do: usurp the authority of Congress to set industry-wide standards and instead single out one American company while allowing Chinese companies, like TikTok, to operate without constraint on American soil.

“We have spent vast resources building and implementing an industry-leading privacy programme under the terms of our FTC agreement. We will vigorously fight this action and expect to prevail.”

Messenger Kids was first launched in 2017. At the time, it was described as an extension of a parent's account that would allow children to chat with family members and friends. The company said the app “helps parents and children to chat in a safer way”, and emphasised that parents are “always in control” of their children’s activity.

When launching the product, Facebook stressed the service would not show ads or collect data for marketing, though it would collect some data it said was necessary to run the service. However, in early 2018, a group of 100 experts, advocates and parenting organisations contested Facebook’s claims that the app was filling a need for a children’s messaging service.

“Messenger Kids is not responding to a need — it is creating one,” the letter said. “It appeals primarily to children who otherwise would not have their own social media accounts.”

The FTC has previously settled with Facebook over privacy violations. In 2019, Facebook agreed to pay a record $5bn fine to resolve allegations that it had violated the 2012 consent order by misleading users about how much control they had over their personal data.
