View from Washington: WannaCry? Wanna wake the hell up, more like
The WannaCry attack again shows an urgent need for greater collaboration on cybersecurity best practices.
It did not take long for the ‘blame game’ to start as it emerged that hundreds of thousands of computers had been hit by the ‘WannaCry’ ransomware attack. But the real issue remains the culture of confrontation rather than cooperation that still dogs all things cyber-security.
Although it issued a security patch in March that appears to have protected most Windows-based computers from the attack, Microsoft has pointed the finger at the US government and its security services. It argues that, as demonstrated by Wikileaks’ Vault 7 leak and the Shadow Brokers’ release of NSA tools, the CIA and NSA have been stockpiling knowledge of security vulnerabilities so that they themselves can exploit them, rather than alerting technology companies.
Brad Smith, Microsoft’s president and chief legal officer, has a point there (and being fair to him, he notes a need for greater collaboration between industry and government). But it is hard to see how that inevitable tension can ever be resolved entirely to both sides’ satisfaction. The whole point of national cyber-espionage operations is to discover and then exploit flaws, supposedly in their various national interests.
One thing is for sure: if there is little constructive dialogue among all the interested parties, we can expect ‘WannaCry’ to be but one of many such attacks in the years to come.
That such dialogue is not yet taking place was further illustrated for me when I recently had the chance to chat with Wally Rhines, chairman and CEO of ‘Big 3’ EDA vendor Mentor, now a business within the Siemens conglomerate.
Rhines, a necessarily keen observer of IC design trends, has recently been banging the drum hard about design for security and its increasing impact. With particular regard to chip verification, his mantra has been that the process is expanding from simply verifying that a chip does what it is supposed to do to also verifying that it does not do anything it is not supposed to do.
In this respect, his take on the threat posed by Trojans – such as WannaCry or Stuxnet – is interesting, particularly in terms of the response from the silicon community so far (note also that he was speaking before WannaCry struck).
“I’ve gone to meetings and heard that [embedded Trojans] don’t seem to be a problem. If they are around, our customers who buy design tools don’t seem particularly troubled about it. But then if I tell this to guys at the NSA, they roll their eyes like I came out of La-La-Land,” said Rhines.
“So you have to figure that this is happening and that it’s being done by all the players. Why are there restrictions on some imported equipment? Well, you have to guess that it’s partly – at least – because you can embed Trojans in them.”
Rhines sees the immediate problem as one of cost and complexity. Hardware companies can take steps such as adding physically unclonable functions (or PUFs) to a design. But this complicates an already difficult design task – particularly in today’s billion-gate era – and chops a few cents off already tight margins.
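The PUF idea Rhines mentions can be sketched in software. Below is a deliberately toy Python model (all class and function names are mine, not from any real toolchain) of the challenge-response authentication a PUF enables: a trusted party records challenge-response pairs before deployment, then later checks a live device against them. A real PUF derives its per-device secret from uncontrollable silicon manufacturing variation; here a stored random value stands in for that.

```python
import hashlib
import secrets

class SimulatedPUF:
    """Toy stand-in for a hardware PUF: a device-unique value that
    deterministically maps challenges to responses. Real PUFs get this
    uniqueness from manufacturing variation, not a stored key."""
    def __init__(self) -> None:
        # Stands in for per-die silicon randomness.
        self._device_variation = secrets.token_bytes(32)

    def respond(self, challenge: bytes) -> bytes:
        return hashlib.sha256(self._device_variation + challenge).digest()

def enroll(puf: SimulatedPUF, n: int = 8) -> dict:
    """Trusted party records n challenge-response pairs before deployment."""
    return {c: puf.respond(c) for c in (secrets.token_bytes(16) for _ in range(n))}

def authenticate(device: SimulatedPUF, crp_table: dict) -> bool:
    """Spend one recorded challenge and check the device's live response."""
    challenge, expected = crp_table.popitem()  # each pair is used once
    return device.respond(challenge) == expected
```

A genuine device passes authentication against its own enrollment table; a cloned or substituted part, lacking the original's physical variation, fails. The cost Rhines flags is real even in this caricature: the enrollment database, the extra silicon, and the protocol all have to be designed, tested and paid for.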
“So, people don’t want to go through that if they don’t have to,” says Rhines. “But one of these days, something is going to happen and everybody will go, ‘OK, where’s the toolkit?’”
Maybe that day has arrived. But now, the issue of collaboration once more raises its head. Because guess what happens once a toolkit is available?
“At that point, the system builder will go to the chip supplier and say, ‘I’d like to add this sentence to the purchase order. It says that there are no Trojans built into your chips.’
“You’ll go to your lawyers and ask if that’s OK, and you know what they’ll say: ‘Are you kidding? Absolutely not. There’s no way in the world we’re doing that.’”
That said, some kind of coordinated response seems inevitable. Darpa, the US defence research agency, is understood to be working towards one.
“The way forward – and it’s not really happening yet to scale – is that we’ll have to agree across the industry on a set of best practices. Maybe Darpa takes us in that direction,” says Rhines. “But what you will have to offer ultimately, I think, will be an ability to say, ‘We followed these agreed best practices. We can’t say that there’s nothing in there for sure, but we did use these best practices to prevent it.’ For even that, though, we need the negotiation to start.”
Rhines’ comments are specific to hardware but they could just as easily be applied to software development and – as Darpa’s involvement arguably shows – government. The negotiation needs to start.
There have been efforts on that front. In their times in office, former US President Barack Obama and former UK Prime Minister David Cameron both raised the need for broader cyber-security debate and strategy development at national and international levels. Their successors, though, seem preoccupied with other matters right now (and, yes, it is also fair to ask just how much Obama or Cameron really achieved).
But can this wait any longer? What Rhines’ observations also show is that developing a workable cyber-security design strategy will pull on not one but two hideously complex areas: technological capability and legal liability. Getting agreement on either of those in isolation is hard enough. But it’s still true that putting off a likely arduous task combining both is not going to make any of us more secure.
Because, be honest, when you look at where we are today, you really do wanna cry.
For those of you looking for the promised detailed analysis of the Kapor Centre’s Tech Leavers Study: with ‘WannaCry’ dominating the headlines, it will now appear on Thursday (May 18).