The State vs The People

As the world becomes ever more digitised, the uneasy relationship between personal privacy and national security grows increasingly complex.

The summer of 2013 brought into focus a conflict that has lurked in the background for over a decade. The individual's right to privacy versus his or her security is now officially on society's agenda.

The line between protection and prying has never been more blurred. This is most apparent on the Internet – that vexed playground that offers unprecedented potential for both freedom (to the point of lawlessness) and protection (to the point of oppression).

Privacy may well be a human right, but it has recently clashed with the need of liberal democracies to defend themselves. It all boils down to the fact that, in order to prevent future terrorist attacks, security agencies need to accumulate large quantities of data from cyberspace and analyse it pre-emptively for patterns, suspects and possible connections. Infringements of privacy are justified by balancing them against the greater good of security.

Citizens are meant to trust the good will of governments, but doubts – often triggered by the utmost secrecy that shrouds information-gathering programmes – remain.

Back in the last century, at the start of the shift towards digitisation, powerful content providers in the US and Europe launched an attack against 'piracy' (the sharing of copyrighted content such as films and music). The attack included attempts to pass restrictive legislation that would have affected all Internet users. One legacy of that campaign – now involving mobile telcos – is the repeated call to abandon the neutrality inherent in the current Internet architecture in favour of offering 'privileged' services that discriminate between types of traffic. A draft EU regulation currently under discussion includes this possibility. Implementing it would require monitoring systems that record what kind of services each customer uses over time: an infrastructure that could easily be repurposed for surveillance.

A reported flood of child pornography is a particularly persuasive argument for monitoring and controlling Internet usage – as is the need to curb the excesses of online gambling. Finally, accumulating data about economic transactions and profiling can help to combat tax evasion and fraud: this is what is happening, for instance, in Italy, where more than 100 databases are constantly mined – without a warrant – to identify and profile 'suspects'.

These are all worthy objectives, but citizens should seriously ask themselves whether giving away a fundamental human right is the right thing to do. Chinese and Russian authorities maintain near-absolute control over cyberspace, and there has already been an attempt to shift governance of the Internet as a whole to nation-states through the UN agency ITU, moving away from the current private, multi-stakeholder model.

The expansion of the Internet in the 1990s, and the culture of freedom it created, were made possible by its technological underpinnings (open, standard protocols, for instance), by the opening of the infrastructure to commercial activity, by its governance model and by an initial absence of laws and regulations, all aided by the free software revolution. None of this should be taken for granted: it can be endangered, and that is exactly what is happening.

The nuts and bolts of snooping

To understand how, in practice, intelligence agencies can tap into traffic, we have to look at the physical structure of the Internet – at least at a general level. The backbone of the Internet is made up of fibre-optic cables, both on land and under the sea. This technology has supplanted – among other things – satellite links and microwave transmission for voice applications as telephony switched from analogue to digital.

This switch rendered obsolete interception systems such as Echelon, and forced agencies such as the USA's National Security Agency (NSA) to look for other solutions. They started tapping the cables themselves – usually at or near the landing stations, of which there are a limited number – or inserting special equipment into the switching centres of major providers (with their compelled collaboration, of course).

Tapping a fibre-optic cable is not as easy as putting crocodile clips on an old telephone wire: the cable has to be carefully severed and a 'splitter' inserted, connected at both ends and to a third branch that carries off the intercepted traffic; the main component of a splitter is a precisely crafted optical prism. Accessing trunk cables presents further challenges, starting with the sheer volume of data: total traffic over the world's underwater cables in 2012 topped 50TB per second.
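To put that figure in perspective, a back-of-the-envelope calculation shows the scale of the storage problem even for an agency that kept only a tiny fraction of the flow. The 0.1 per cent retention rate below is a purely hypothetical assumption for illustration; only the 50TB/s total comes from the figures above:

```python
# Back-of-the-envelope estimate of the storage burden of bulk interception.
# The 50TB/s total is the figure quoted in the text; the retained fraction
# is a purely hypothetical assumption, chosen only for illustration.

TOTAL_TRAFFIC_TB_PER_S = 50       # worldwide undersea-cable traffic, 2012
RETAINED_FRACTION = 0.001         # hypothetical: keep just 0.1% of the flow
SECONDS_PER_DAY = 24 * 60 * 60

retained_tb_per_day = TOTAL_TRAFFIC_TB_PER_S * RETAINED_FRACTION * SECONDS_PER_DAY
print(f"Retained per day: {retained_tb_per_day:,.0f} TB "
      f"(~{retained_tb_per_day / 1000:.1f} PB)")
# -> Retained per day: 4,320 TB (~4.3 PB)
```

Even at that hypothetical rate, a single day's haul runs to petabytes – which goes some way towards explaining why purpose-built storage facilities become necessary.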

Storage is another substantial challenge. Stretching the law somewhat, the NSA enlisted the cooperation of major operators: 'secret rooms' were installed at strategically placed switching centres and Internet Exchange Points (IXPs), the main nodes where Internet traffic is routed. There, special government equipment was installed to duplicate and store data. The deluge was such that the NSA ran out of storage space at its main location in Fort Meade, and a new multi-billion-dollar data centre in Utah is now being brought into service.

For the whole exercise to make sense, information has to be extracted from the flood of raw data, and here technologies based on diverse strategies are used. There is simple filtering based on metadata – sender and receiver addresses, geographic location and the type of protocol used, to name just a few. Then there are state-of-the-art machine-learning techniques capable of extracting patterns from the whole traffic stream, including content – from email messages to YouTube videos, from web requests to Facebook usage. Prominent among the NSA's suppliers of analysis technology is Narus, an Israeli-founded company now absorbed by Boeing.
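The first of these strategies – metadata filtering – can be illustrated with a minimal sketch. The record fields, addresses and watch list below are invented for the example and bear no relation to any real system:

```python
# Minimal sketch of metadata-based filtering. The record format and the
# selection criteria are hypothetical, invented purely for illustration.

records = [
    {"sender": "alice@example.org", "receiver": "bob@example.com",
     "country": "DE", "protocol": "smtp"},
    {"sender": "carol@example.net", "receiver": "dave@example.org",
     "country": "XX", "protocol": "http"},
]

WATCHED_COUNTRIES = {"XX"}   # hypothetical watch list

def flagged(record):
    """Select a record on metadata alone - no content is inspected."""
    return record["country"] in WATCHED_COUNTRIES

hits = [r for r in records if flagged(r)]
print(len(hits))  # one record selected without reading any message content
```

The point of the sketch is that selection happens without reading any message content – which is why agencies argue that metadata collection is less intrusive, and why critics point out how revealing metadata alone can be.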

Besides intercepting raw traffic, the NSA and its sister agencies have also enlisted the cooperation (albeit often reluctant) of major Internet companies under the PRISM programme. Under US law – section 702 of the FISA Amendments Act of 2008 – Internet companies are compelled to supply the NSA with information about 'targets' on request. The programme was unveiled in June 2013 by former NSA contractor Edward Snowden, a whistleblower who has effectively forced the US authorities to admit to these previously secret operations.

The list of companies involved includes Microsoft and Skype, Google and YouTube, Yahoo!, Facebook, PalTalk, AOL (America Online) and Apple. PRISM requests allow the government to access stored data: for instance, to circumvent email encryption and gain access to the content of messages. Yahoo!, Google and others apparently tried to fight the practice in court, but all were unsuccessful. The knowledge that email communications are open to wholesale interception without a warrant, especially for non-US users, could very well undermine trust in all major cloud services.

Encryption has always been problematic for security agencies, ever since the 'crypto wars' of the 1990s, when strong encryption techniques were classified in the US as munitions – military technology whose export was banned. Banning the export of software and algorithms proved unfeasible, and in time the restrictions were lifted.

Historically, the NSA has always tried to weaken the cryptographic technology available to the general public in order to preserve its ability to read any encrypted material. It is common knowledge that the agency possesses perhaps the most advanced cryptological capabilities in the world, probably including a good deal of classified advances in code-breaking, but its preferred method of gaining access to encrypted traffic nowadays seems to be inserting so-called 'back doors' into software and systems, always with the collaboration – willing or unwilling – of vendors.

A back door is an in-built way to bypass one or more security features of a software system in order to gain access or control. Apple's operating systems are strongly suspected of including features that could allow easy circumvention of the whole-disk encryption provided by FileVault. Even de facto cryptographic standards such as SSL (the basis of the 'https' protocol, commonly indicated in the browser by a small closed padlock) seem to be compromised. This could have dire consequences for the level of trust we place in web transactions, even over supposedly 'secure' sites.
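One partial defence against compromised SSL infrastructure – a technique not discussed in the text, but worth noting – is certificate 'pinning': the client compares the fingerprint of the certificate a server presents against a value obtained in advance through a separate, trusted channel, rather than relying on the certificate-authority chain alone. A minimal sketch, in which the byte strings stand in for real DER-encoded certificates:

```python
import hashlib

# Sketch of certificate pinning: rather than trusting the CA chain alone,
# the client checks the SHA-256 fingerprint of the server's certificate
# (its DER-encoded bytes) against a pin obtained out of band.
# The byte strings below are placeholders, not real certificates.

def fingerprint(der_bytes: bytes) -> str:
    return hashlib.sha256(der_bytes).hexdigest()

PINNED = fingerprint(b"genuine-server-certificate")   # stored ahead of time

def accept(der_bytes: bytes) -> bool:
    """Accept the connection only if the presented certificate matches the pin."""
    return fingerprint(der_bytes) == PINNED

print(accept(b"genuine-server-certificate"))  # True  - certificate matches
print(accept(b"substituted-certificate"))     # False - possible interception
```

Pinning cannot defeat a back door in the endpoint software itself, but it does raise the cost of silently substituting certificates in transit.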

Public policies

The deployment of widespread data interception and collection by states over the last decade has been empowered not only by technical advances but, first and foremost, by policies and legislation. The US Patriot Act, along with other legislation introduced after the 11 September 2001 terrorist attacks, formed the basis for subsequent developments. Fear of another attack was, of course, the prime consideration; it also allowed the expansion of NSA interception activities to US citizens, in practice bypassing the court that should have authorised them on a case-by-case basis.

Agreements among the security agencies of the so-called 'five eyes', sanctioned by their governments, allowed for the driftnet tactics put into practice by the NSA, Britain's GCHQ, and the Canadian, Australian and New Zealand services. In August 2013, New Zealand passed, by a narrow margin, legislation allowing its Government Communications Security Bureau (GCSB) to gather data on its own citizens as well as foreign residents. In Italy, the revenue service routinely harvests large-scale databases on citizens – from vehicle registrations to bank transactions – without warrants, in order to compile pre-emptive lists of citizens 'at risk of tax evasion' to be checked.

The European Commission, the main legislative engine of the European Union, is following the same path with a stream of proposed directives and regulations emphasising the need to 'secure the Internet' and build a safe environment across Europe. For example, Directive 2013/40 on cybercrime, in force since summer 2013, allows for data exchange between European law-enforcement agencies.

The perils for the Internet industry

The economic growth enabled by the advent of the Internet more than two decades ago has been huge, but, as we have seen, it is based on the Internet as it was: structured around a neutral network where people and businesses could expect reasonable privacy and reasonable security in their activities. This created a level playing field where operators and service providers, big and small, could thrive. Even more than cybercrime, the interception operations of the last decade, coupled with the relentless penetration by states of even encrypted data and communications, have undermined what lies at the base of it all: trust. Without a certain level of trust, cyberspace is not a viable environment in which to develop a business, conduct transactions and communicate.

Now that the forced collaboration of the big players has been revealed, many of them want to disclose the full details of government requests, precisely because their main asset – trust – is in danger. And if those names cannot resist, how can small start-ups assure prospective users of reasonable protection for their data and information?

We should be aware that in the name of security – ostensibly to protect us from highly infrequent events such as terrorist attacks – we are in the process of dismantling one of the most important industries created in recent decades: Internet software and cloud services, an industry that is an enabler for many other sectors.

Besides the destruction of trust in cyberspace, a more immediate danger of this trend is the possible 'Balkanisation' of the Internet. If nation-states take over the governance of cyberspace, we face a real risk of fracturing the worldwide network into a series of national environments divided by new 'cyber-borders'. There are already examples of this in China and Iran. Such a development would have obvious negative impacts on the ability of Internet operators everywhere to rely on a worldwide market.

The trends we are witnessing pose a threat not only to the online privacy of individuals, which is important in itself, but also to a whole industry built on open foundations, technological and cultural. A high (or at least adequate) level of trust and expectation of privacy is a prerequisite for conducting any kind of transaction on the web. Losing this could have long-term repercussions, and while protecting us from security menaces should be a priority, destroying the very foundations of our way of life in the process would be conceding victory to our enemies.
