Comment

View from Washington: Google and Facebook need to cut out this kidology

This simply isn't the time for dedicated social media sites for pre-teens.

Christmas is coming and with perfect timing the issue of children’s safety online is back on the agenda. So too, though, is that of Silicon Valley’s increasingly bad timing.

From a marketing perspective, you can see why Facebook has chosen now to launch a text, voice and video messaging service for pre-teens. Many have asked Santa to put a tablet or some other connected present under the tree. Guess who wants to be their first app download?

Tonally, however, the move strikes a bum note.

Russian hacking has alerted millions of adults to the risks of being deceived themselves on the Internet. This week has also seen two major news stories directly connected to the risks youngsters face in cyberspace.

YouTube CEO Susan Wojcicki appears to have been bounced into increasing the number of content moderators on Google’s social media sites to 10,000 as inappropriate content continues to slip onto its channels, particularly material aimed at children.

Meanwhile in the UK, both the National Crime Agency and the National Police Chiefs’ Council have launched a campaign to warn adults about how paedophiles are using live-streaming services to prey upon the young.

Add to that existing and broader concerns about the quality of content monitoring across all social media, and Facebook’s timing becomes even harder to understand.

There have been corporate assurances that services intended for those below the age of 13 will be more rigorously controlled than those for adults (though that's arguably not much of a stretch).

Moreover, as such services are mostly conceived in the US, they are developed to comply with the country’s Children’s Online Privacy Protection Act (COPPA). Most Internet companies have hitherto banned under-13s from using their products because COPPA compliance was considered too difficult. COPPA does still set a high bar, so – the argument goes – if the companies now think they can scale it, we should have more confidence in these new products.

Finally, parents must give approval for their kids to join these platforms. They can just say ‘No’. Even if they don’t, parental oversight of how their young use connected hardware will always be the best regulator of all.

Yet, as with all online services, it appears that those aimed at our offspring can be spoofed – and often are, as YouTube’s experiences show. Our kids aren’t bad at spoofing either, with at least 20 million under-13s in the US alone thought to be on grown-up Facebook, having given false ages.

Then, by creating products specifically for the young, you inevitably attract those who would abuse them. These sites seem to create another cyber-security version of Whack-A-Mole, a game kids would probably be better off playing for real.

Beyond the question of child safety, there is that of monetisation. Why exactly is Facebook doing this?

COPPA does control what data firms can harvest from children, but it was introduced 17 years ago. Since then, the algorithms used to profile users have matured to an incredible degree. It’s arguable that even a few nuggets gleaned from your kids could significantly refine how a company profiles you, even if it cannot target your children specifically.

Or is it simply a bid to ‘catch them young’ and protect the user base over time? Blogger John Gruber caught what many of us will feel about that: “This is like Philip Morris introducing officially licensed candy cigarettes.”

At the risk of being slated for restraining innovation, I simply cannot see how these types of service - especially where kids are encouraged to upload various kinds of content - deliver their claimed benefits to a degree that sufficiently outweighs the risks.

Our trust in social media is at an all-time low, with considerable justification. The specific issue of online sex trafficking - including the abuse of minors - is also currently high on the legislative agenda in Washington, indicating just how big a threat it is.

Most of all, we don’t really know what impact these apps will have. The research is evolving all the time (indeed, different expert views cited at its launch argued that Facebook Kids would be both ‘good’ and ‘bad’ for children).

This is only my personal view, but without that final element in particular, I don’t just think you should avoid Facebook Kids, YouTube Kids and any of the other platforms beginning to emerge. I think we all need to tell the social media companies not to go there at all, until it can (if ever) be done with enough confidence to satisfy everyone.

So, leave them kids alone. And then get down to fixing the obvious problems with your adult services.
