National Cyber Security Centre officially opened by Queen as Chancellor warns business to ‘sharpen’ its approach
Image credit: PA
Dr Ian Levy, technical director of the new National Cyber Security Centre, opened by the Queen in London today, reveals how he hopes to shame trusted firms into complying with best practice on stopping hackers
“At the moment it’s fear, uncertainty, doubt – that is the entire narrative.”
This is computer scientist Dr Ian Levy’s précis of public attitudes to cyber crime.
As someone at the forefront of the fight against hackers – as technical director of the National Cyber Security Centre (NCSC), officially opened by the Queen today – he should know.
Levy is plain-speaking and refreshingly critical of much that is wrong with the current mindset around cyber-crime.
“The interesting thing for me is that this narrative is driven by a massively incentivised group: the cyber-security industry,” he says. “They are incentivised to make it sound bad so you buy their magic ambulance.”
Experts’ responses to whatever botnet or bug is in the news can entail little more than “giving it a really cool name, shouting about it and running round claiming it’s the end of the world,” says Levy, witheringly.
So, keep calm and carry on?
“Regardless of what you read in the media, the attacks are limited by the laws of physics,” Levy states. “They are limited by computer science. They are just software, in the end.
“So, regardless of who is doing it - whether it is a spotty 15-year-old, a nation state or an organised crime group - they have the same limitations in terms of computer science. The difference is the amount of investment they are willing to make.”
Keeping calm about the prospect of cyber attacks is no easy feat.
Be it the hacking of the Democratic National Committee ahead of the US elections or denial-of-service attacks on banks, incidents appear to be on the rise. Earlier this year Levy’s boss Ciaran Martin said the UK was being bombarded by dozens of serious cyber attacks every month. Many are believed to originate from Russia or China.
Yet Levy remains reassuringly optimistic about the future outlook and fizzes with enthusiasm about the task facing NCSC, which started its work with surprisingly little fanfare last year.
Today marks its official opening, amid a blitz of publicity and a speech by Chancellor Philip Hammond in which he will urge business to "sharpen its approach" to cyber crime.
The NCSC’s role encompasses everything from “target hardening” to helping police take down criminals and aiding the Secret Service in pinpointing state-sponsored mischief.
One of Levy’s first goals is to make sure email systems serving roughly 5,000 web domains are effectively bulletproof by the close of this year, cutting email spoofing off at the knees.
The project is about “getting the basics right”, he insists, but it is nonetheless a big job. Measures already in place for the tax authorities mean half a billion spoof emails will go undelivered by the end of 2017.
Levy intends to publish instructions for companies, showing precisely how he has achieved this reduction. “The government is going to be our guinea pig,” he says.
“Once we do it for government, I'm going to go to every single organisation that has a high public trust brand and say, right, if you're not doing it, why the hell not? We've done it for the biggest enterprise in the country - government - and here's all the software. Why aren't you doing it?”
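The article does not name the standards involved, but anti-spoofing work of this kind is widely understood to rest on email authentication records such as SPF and DMARC, which a domain publishes in DNS and receiving mail servers check. As a minimal illustrative sketch (the parser and the example record below are assumptions, not NCSC code), this is roughly what a receiver reads when it decides whether to honour a domain's anti-spoofing policy:

```python
def parse_dmarc(record: str) -> dict:
    """Split a DMARC TXT record ("v=DMARC1; p=reject; ...") into tag/value pairs."""
    tags = {}
    for field in record.split(";"):
        field = field.strip()
        if not field:
            continue
        key, _, value = field.partition("=")
        tags[key.strip()] = value.strip()
    return tags

# Hypothetical record of the kind a protected government domain might publish.
record = "v=DMARC1; p=reject; rua=mailto:dmarc-reports@example.gov.uk"
policy = parse_dmarc(record)

# p=reject asks receiving servers to discard mail that fails SPF/DKIM
# alignment checks, so spoofed messages are simply never delivered.
assert policy["p"] == "reject"
```

A policy of `p=reject`, rolled out domain by domain, is what turns "half a billion spoof emails will not get delivered" from an aspiration into a mechanical outcome.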
His desire to open source defence techniques is unusual for someone working for an adjunct of a spy agency.
Levy is still an employee of GCHQ, just as he has been for over a decade. It was from that agency’s secretive, doughnut-shaped base that he transferred to the office in a commercial block in London’s Victoria which now houses the NCSC. He describes himself as an “early adopter” of this new, rather more casual way of working. The place is hardly Fort Knox – there is a Shake Shack on the ground floor, for goodness’ sake – and is a far cry from the spy clichés.
Levy has been at pains to claim that the NCSC will be “as transparent as possible”. It will publish performance data and “How To” kits showing its working-out – even where this might cause embarrassment. Time will tell whether, and how far, such transparency pans out in practice.
“If your only defence is that your attacker doesn’t know what he’s going up against, you fail,” Levy says, when asked if his desire for openness flies in the face of Secret Service protocol. “That is just an axiomatic principle of security - if the only defence you’ve got is pretending nobody knows what you’re doing, forget it.”
He adds: “Could we get to the point where we publish on the internet our threat feed? Maybe, I don’t know.
“We’re certainly going to try and publish the algorithm in the middle that ingests all those different data types and comes up with a threat feed.
“With some of the software that we’re using, our intention is to open source a lot of the tools we’re building to help other people implement the same thing.
“The architecture of protection is not what protects you. What matters is how you use it and how you integrate all that together on a national scale.”
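Levy gives no detail of the “algorithm in the middle”, but its core idea – normalising heterogeneous data types into a single deduplicated feed of indicators – can be sketched. Everything below (the source names, field names and schema) is invented for illustration, not a description of NCSC systems:

```python
# Illustrative only: the source formats and field names are assumptions.

def normalise(source: str, record: dict) -> dict:
    """Map one source-specific record onto a common indicator schema."""
    if source == "mail_logs":
        return {"indicator": record["sender_domain"], "type": "domain",
                "seen": record["timestamp"]}
    if source == "netflow":
        return {"indicator": record["dst_ip"], "type": "ip",
                "seen": record["ts"]}
    raise ValueError(f"unknown source: {source}")

def build_feed(batches):
    """Ingest (source, record) pairs and emit one deduplicated threat feed."""
    feed = {}
    for source, record in batches:
        item = normalise(source, record)
        key = (item["type"], item["indicator"])
        # Keep only the most recent sighting of each indicator.
        if key not in feed or item["seen"] > feed[key]["seen"]:
            feed[key] = item
    return sorted(feed.values(), key=lambda i: i["indicator"])

feed = build_feed([
    ("mail_logs", {"sender_domain": "phish.example", "timestamp": 100}),
    ("netflow",   {"dst_ip": "203.0.113.9", "ts": 200}),
    ("mail_logs", {"sender_domain": "phish.example", "timestamp": 300}),
])
```

Open-sourcing a normaliser like this is cheap precisely because, as Levy argues, the algorithm is not where the protection lives – the value is in the data flowing through it and the scale at which it is deployed.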
As well as being an early adopter of the new office and style of working, Levy is an early adopter of the Internet of Things (IoT).
Much of the fear around technology is now shifting towards the threats posed by homes full of internet-connected appliances: video cameras that can be watched, and potentially hacked, from afar; smart energy meters; and autonomous cars whose brakes might be disabled by malware. Surely this leaves us vulnerable?
“It’s an interesting question,” replies Levy. “I’ve got some IoT-type things in my house, but you can’t talk to them from outside. You know, I’m a geek, so I’ve got a nice big firewall on the edge of my broadband.
“I’ve got all sorts of stuff that protects me, so I know exactly what those things are talking to.
“I know exactly how they can be talked to, and I’ve made a judgement - right or wrong - that for my particular use, that’s acceptable given the benefit I get. But how do you scale that out?
“Take Mirai [the IoT botnet which laid siege to routers in the UK last year]. There are two responses to it. The first is to go, right, IoT device manufacturers are all stupid and need to be educated and we need to make them write better software. That’s the current narrative.”
A better response, he says, would be to make changes necessary to simply “take that vulnerability away”.
The cyber-security industry could do worse than to learn from the experience of engineers working in aviation, he says.
“If you want proof of principle of how you manage systemic vulnerabilities, it’s aircraft,” he states, citing as a case study the move from square aircraft windows to ones with rounded corners in the 1950s, after a notorious series of incidents in which three de Havilland Comets broke up in mid-air as a result of metal fatigue.
“That was basically a fielded system with a vulnerability,” he says. “The aircraft industry at that time said, right, number one, we’re going to make a systemic design change to say you will no longer be allowed to make planes with square windows – so you take the vulnerability away.
“In the interim, every plane has got square windows, so you can’t ground them all. So buttress around the window corner, do some operational monitoring to make sure the fuselage is working properly – those kinds of things - to manage the risk.
“That’s where we need to be in cyber-security. Because that’s harm reduction, not vulnerability reduction. The cyber-security industry today runs on vulnerability reduction. We need to move to harm reduction.”
He adds: “If you look at Mirai, the right answer is not that all these Chinese camera manufacturers need to be sued into making their software better. That’s just not going to work, not in the short term. That's maybe a 10-year goal.
“The right answer is to say, as a system, how do I mitigate the fact that these cameras are vulnerable?”
This could be done by having telecoms networks simply turn off various obscure network protocols, such as Telnet and UPnP, which allow remote access to computers in homes or business premises. In practice hardly anyone needs to use these, says Levy, so why not switch them off and simplify things?
“There are a very tiny minority of people who need this. By default, let's turn it off,” he declares. “Then you mitigate the harm caused by the fact that these things are vulnerable, because nobody can talk to the vulnerability anymore.
“And if you can't talk to the vulnerability, you can't cause it to do bad things. For the tiny minority of people who need those protocols open on the WAN [wide area network] side, they're going to know what they're doing and will go into their control panel, or to their ISP, and say yes, turn it on.
“That's basically scaling up what I said about my IoT setup at home. By default, you cannot talk to it from outside my house unless I make a decision that you can. That's the sort of thing I think we need to do nationally.
“I can't expect 66 million people to know what those protocols do, but I can expect a few people who need them to know what they do and to turn them back on.”
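The default-off policy Levy describes reduces to a very small decision rule at the network edge. The sketch below is an assumption-laden illustration (the port list and opt-in mechanism are invented, not any ISP's actual implementation), showing Telnet and UPnP/SSDP blocked inbound unless a customer has explicitly turned them on:

```python
# Ports for remote-management protocols that, on Levy's argument, almost
# nobody needs reachable from the WAN side: Telnet (23) and UPnP/SSDP (1900).
BLOCKED_BY_DEFAULT = {23, 1900}

def allow_inbound(dst_port: int, opted_in: set) -> bool:
    """Decide whether inbound WAN traffic to a customer should be delivered.

    Blocked-by-default ports are dropped unless the customer has explicitly
    opted in - the "go into your control panel and turn it on" step.
    """
    if dst_port in BLOCKED_BY_DEFAULT and dst_port not in opted_in:
        return False
    return True

# An ordinary customer: Telnet probes from the internet never reach the camera.
assert allow_inbound(23, opted_in=set()) is False
# The rare customer who genuinely needs it has flipped the switch.
assert allow_inbound(23, opted_in={23}) is True
# Ordinary traffic, such as HTTPS on port 443, is untouched.
assert allow_inbound(443, opted_in=set()) is True
```

The point of the sketch is the asymmetry Levy relies on: the rule itself needs no per-customer intelligence, so it scales to 66 million people, while the knowledge burden falls only on the tiny minority who opt back in.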
Such clarity is another refreshing side to Levy’s style of communication. He acknowledges the often convoluted way in which cyber-security is talked about and the bamboozling array of different agencies dealing with the issue. The NCSC was set up partly to try to make this landscape simpler.
Communicating data, findings and best practice to the public and industry figures will be a “huge challenge”, he admits.
“Publishing stuff that a bunch of academics can read and make a judgement on? That’s easy,” he concludes. “Publishing stuff my granny can understand? That’s hard.”
He says he wants to “take away a whole bunch of noise” around cyber-crime, adding, “We’ll automate the defence against the stuff where we can automate that. And with the really smart stuff – that’s the top-end stuff – you then free up the professional defenders to look after that.”
His message for IET members as IoT moves forward also concerns communication.
“There’s a very well established safety culture in terms of how you build safe systems,” he says. “There's also a very well established cyber-security culture. But they use totally different languages.
“They have totally different views about what the right initial answer is. For example, if you build a safety critical system, you want to keep it in its intended state as far as possible - so you never patch it. If you're a cyber-security person you say, well, I need to patch it because it's got all these vulnerabilities.
“We need to bring a common language to those two disparate communities because otherwise you're going to have a safe system that's not secure or a secure system that's not safe.”
In a word, simplify. In a complex world, that’s sound advice.