View from Washington: Silent Zuck slings muck at us all
The Cambridge Analytica scandal lumbers on as genuine researchers ponder their future in Facebook's silence.
Day three of the Cambridge Analytica (CA) scandal and so flows on the torrent of revelations (catch up here on earlier analysis).
First, CA boss Alexander Nix got caught by Channel 4 on a hidden camera saying his company can augment its cyber-shenanigans with some odious, old school dirty politics.
Then, The New York Times reported that Facebook’s security chief, Alex Stamos, was serving out his notice before the scandal broke, having banged heads with senior management over its failure to fully research and disclose the platform’s hacking by bad actors.
Next, the UK’s Information Commissioner revealed that she had to have a Facebook ‘forensic’ team kicked out of CA’s offices last night while she seeks a warrant to seize the research company’s servers.
Yet still no Zuck. Or Sheryl. Facebook’s CEO and COO remain in Wally-like concealment – though there are reports a staff briefing will take place later today (20 March) in the US. Great news for the 50 million users unwillingly profiled.
Oh, this snark feels good. Too good, unfortunately.
Because I’m going to tack away from the main story and, as I began yesterday, highlight another way that this increasingly ghastly miasma hurts technology in the wider sense. Let’s go back to the programme from which users’ data was originally harvested.
Watergate gave us the saying, “It’s not the crime; it’s the cover-up.” For Facebook, that seems to be the case here. Indeed, it’s hard to say that it committed any ‘crime’ at all, certainly initially. Rather, events tend to support the old canard that, “The road to hell is paved with good intentions.”
It all starts with researcher Aleksandr Kogan. In 2014, the Cambridge University-accredited academic launched a Facebook personality app to profile users and gather data from their friends under a framework made available to researchers and approved developers.
Ironically, Facebook set up the channel Kogan used partly because there had been complaints that the company was being opaque about its data and social impact. In response, the framework allowed accredited researchers to look into its user-base and, subject to appropriate controls over the sample’s distribution, analyse it and publish independently on its make-up and behaviour.
For its part, Facebook knew that it had unleashed something extraordinary and wanted to know how it actually worked. For the researchers, then as now, social media is an evolving beast, fast becoming so ubiquitous that much more digging is needed. Academic, social and commercial goals coincided.
The programme was (and, in whatever form it survives, remains) a good thing.
There now seems little doubt that Dr Kogan was a bad actor. He abused the privilege Facebook gave him by sharing the data his app accumulated outside his GSR research vehicle with CA. And having discovered this back in 2015, Facebook actually did the right thing.
It contacted the researcher and CA, told them that the data had been distributed in breach of its terms of service, and instructed them to delete it immediately. In 2016, Facebook also tracked down whistleblower Christopher Wylie, who had since left CA (but says he worked with Dr Kogan on the project), and issued the same instruction:
“Because this data was obtained and used without permission, and because GSR was not authorised to share or sell it to you, it cannot be used legitimately in the future and must be deleted immediately.”
However, things then started to go wonky. According to Wylie, little further was done by Facebook to ensure compliance: “That to me was the most astonishing thing. They waited two years and did absolutely nothing to check that the data was deleted. All they asked me to do was tick a box on a form and post it back,” he told The Observer.
Remember, it’s the cover-up. But let’s not keep gnawing at Facebook’s dilettante attitude to a data leak (this wasn’t technically a breach), or its subsequent failure to inform the users who were involved, or its unhelpful evidence before subsequent formal political hearings. Done that. Kicking ahead. Hope it’s hard.
Rather, you must wonder what impact the scandal will have on relationships between technology companies and the academic and bona fide researchers who follow them. This incident will tempt many companies to close off access to their data entirely, or to constrain that access and the data’s use to the nth degree.
Our understanding of technology will be much worse as a result. And all that on the eve of what could be massive changes to our world, wrought not just by social media but also virtual and augmented reality, the Internet of Things, machine learning and, well, who knows. These questions and concerns arguably could not have emerged at a worse time.
So before knees jerk, let’s remember the following.
- Dr Kogan committed a massive breach of academic ethics. Thankfully, these are rare.
- Facebook was unacceptably lazy, so let’s treat this as a teachable moment rather than a reason to panic.
- The value of independent research still outweighs the risks. And we need it more than ever.
Before we get consumed with anger about the many serious issues this scandal is raising – and at great speed – we need to make sure that we don’t let it do far greater damage than it already has. Genuine researchers must have access. We depend on their work, and always have.
But finally, back to the invisible Mr Zuckerberg and Ms Sandberg (I think we’re done with first names now, don’t you?).
Yesterday’s wheelbarrow and shovel weren’t enough? We can get bigger. Or was it the wellies? You deserve Hunters, not Dunlops? No problem. The whole industry will do a whip-round for you. But do us all a favour. And yourselves.
The Augean Stables remain. But now there’s a Hydra outside too.