Maria and Giuseppe Fidecaro inside the spark chamber, 1963

60 years of CERN: the nuclear heart of Europe

CERN, the organisation that runs the world's largest particle physics laboratory, turned 60 last month. Its efforts to unravel the mysteries of the universe are well documented, but the pioneering role of the engineers and computer scientists supporting the physics is sometimes overlooked.

The European Organisation for Nuclear Research - known by its French acronym CERN - was founded on 29 September 1954 with four principles at the heart of its mission: to answer questions about the nature of the universe; to train the scientists of tomorrow; to heal the scars of the Second World War through international collaboration; and, importantly, to advance the frontiers of technology.

The technology at the heart of everything CERN has done for the past six decades is the particle accelerator. In 1957, the 600MeV (megaelectronvolt) synchrocyclotron came online, producing the beams for the laboratory's first particle and nuclear physics programmes. Fast-forward 50 years and the 27km circular Large Hadron Collider (LHC) has reached record energies of 7TeV - more than 10,000 times higher - and this figure will double to 14TeV when the facility reopens following upgrades started in February 2013.

The technological advances that enabled such dramatic progress have generally been incremental, but certain step changes are discernible. One was conceived in the late 1950s when physicists realised the gateway to higher energies lay in colliding two particle beams rather than firing beams at a stationary target.

Doubling up

In 1965, CERN approved plans for the Intersecting Storage Rings (ISR), which would use the 28GeV proton synchrotron to feed proton beams into two 300m storage rings surrounded by high-powered magnets to contain and direct the particles to eight collision points where the rings intersected. The gain was enormous - colliding two beams at a centre-of-mass energy of 62GeV was the equivalent of a 2,000GeV beam hitting a fixed target.
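
That equivalence follows from basic relativistic kinematics: in a fixed-target collision the useful centre-of-mass energy grows only with the square root of the beam energy, so matching the ISR's 62GeV head-on collisions takes a vastly more energetic single beam. As a back-of-the-envelope check, using the proton mass of 0.938GeV:

```latex
\sqrt{s} \approx \sqrt{2\,m_p c^2\,E_{\text{beam}}}
\quad\Longrightarrow\quad
E_{\text{beam}} \approx \frac{s}{2\,m_p c^2}
= \frac{(62\ \text{GeV})^2}{2 \times 0.938\ \text{GeV}}
\approx 2{,}000\ \text{GeV}.
```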

This marked a "new era for accelerators" according to Frédérick Bordry, director for accelerators and technology at CERN, and marked the first time the laboratory had led the way in accelerator technology.

The laboratory's next major contribution to the field was 'stochastic cooling', which solved the challenge of stacking particles in bunches tight enough to ensure they collided often enough. Invented by CERN engineer Simon van der Meer, the process uses a wideband pickup to sense the spread in the momenta of particles in a bunch, then uses this signal to drive an electromagnetic 'kicker' that knocks them back into line. The technique eventually won van der Meer the 1984 Nobel Prize in physics alongside Italian physicist Carlo Rubbia, who used it to discover the W and Z bosons, the carriers of the weak nuclear force.
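
A rough feel for the feedback idea can be had from a toy model (this is an illustrative sketch, not CERN's hardware or control algorithm): particles carry random momentum errors, the 'pickup' measures the average error of small samples, and the 'kicker' subtracts it. Because each particle's own error contributes to the sample average it helps correct, repeated passes shrink the overall spread.

```cpp
// Toy model of stochastic cooling: a pickup measures the mean momentum error of
// small samples of particles and a kicker subtracts it. Purely illustrative - it
// ignores real beam dynamics, electronic noise and bandwidth limits.
#include <algorithm>
#include <cmath>
#include <iostream>
#include <numeric>
#include <random>
#include <vector>

int main() {
    std::mt19937 rng(42);
    std::normal_distribution<double> spread(0.0, 1.0);

    const int nParticles = 100000;
    const int sampleSize = 100;   // how many particles the pickup resolves at once
    const int nTurns     = 200;

    std::vector<double> dp(nParticles);   // momentum deviations from the ideal
    for (double& x : dp) x = spread(rng);

    auto rms = [](const std::vector<double>& v) {
        double s2 = 0.0;
        for (double x : v) s2 += x * x;
        return std::sqrt(s2 / v.size());
    };

    std::cout << "initial rms spread: " << rms(dp) << "\n";

    for (int turn = 0; turn < nTurns; ++turn) {
        std::shuffle(dp.begin(), dp.end(), rng);   // particles mix between turns
        for (int i = 0; i < nParticles; i += sampleSize) {
            const int end = std::min(i + sampleSize, nParticles);
            const double mean =
                std::accumulate(dp.begin() + i, dp.begin() + end, 0.0) / (end - i);
            for (int j = i; j < end; ++j) dp[j] -= mean;   // kicker correction
        }
    }

    std::cout << "rms spread after " << nTurns << " turns: " << rms(dp) << "\n";
}
```

With samples of 100 particles the spread in this toy shrinks by roughly a factor of three over 200 turns, hinting at why a fast, wideband pickup-and-kicker loop is worth the engineering effort.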

According to Bordry, though, the large-scale application of superconductivity has been CERN's principal contribution to accelerator science. American particle physics laboratory Fermilab was the first to take advantage of the higher field strength of superconducting electromagnets in its 1.8TeV Tevatron collider, which opened in 1983.

But in 1986 research and development began at CERN on what would become the largest accelerator ever built. "With superconductivity we pushed the technology to the limit," says Bordry. "We didn't develop something we could say is new, but we pushed industry and universities to think about engineering very large-scale devices."

By then the CERN Council had already approved what was, until the Channel Tunnel, Europe's largest civil-engineering project: the excavation of a 27km tunnel for the Large Electron-Positron Collider (LEP). The LHC was to be built inside this same tunnel. Superconductivity was the only way to achieve the 8.4-tesla fields necessary to steer the beams at the energies envisaged by the LHC's creators, so the 1,232 35-tonne dipole magnets that bend the beams and the 392 smaller quadrupole magnets that focus them would be made from superconducting niobium-titanium alloy.

This would require the largest cryogenic system in the world to provide the 96 tonnes of superfluid helium needed to keep the magnets at their operational temperature of 1.9K (-271.3°C). Keeping the cryogenic systems insulated and providing an interference-free environment for the beams would also demand the world's largest operational vacuum, with a total of 104km of piping.

After eight years of R&D, the CERN Council approved the project, and on the morning of 10 September 2008 the first beam was circulated through the collider. But like any ground-breaking project there were risks: the collider's magnets carry 10GJ of energy when in operation. "That's roughly the equivalent of an A380 Airbus travelling at 700km/h, but if there's any problem you have to land it in one minute," says Bordry.
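
The analogy stands up to a simple kinetic-energy estimate. Taking a fully laden A380 at about 560 tonnes (an assumed figure, close to the aircraft's maximum take-off mass) and the 700km/h of Bordry's comparison:

```latex
E_k = \tfrac{1}{2} m v^2
    = \tfrac{1}{2} \times 5.6\times10^{5}\ \text{kg}
      \times \left(\tfrac{700}{3.6}\ \text{m/s}\right)^{2}
    \approx 1.1\times10^{10}\ \text{J} \approx 10\ \text{GJ}.
```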

Just nine days later, during a magnet powering test, a faulty electrical connection caused a discharge that punctured the cryogenic system's helium enclosure, venting liquid helium that expanded with explosive force, damaging 53 magnets and contaminating the vacuum pipe. According to Bordry, who oversaw the commissioning of the LHC, the fault was traced to a single superconducting wire interconnect, one among some 10,000, with a resistance of 200 nano-ohms rather than the required 1nΩ.
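
A few hundred nano-ohms sounds negligible, but the dipole circuits carry currents of thousands of amps, and Ohmic heating scales with the square of the current. As an illustration (assuming a circuit current of roughly 10kA, a round figure rather than the exact value at the moment of the incident):

```latex
P = I^2 R \approx (10^{4}\ \text{A})^{2} \times 200\times10^{-9}\ \Omega = 20\ \text{W},
```

compared with around 0.1W for a joint within the 1nΩ specification - more than enough extra heat, deposited in a joint held at 1.9K, to drive the superconductor back into its resistive state.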

"In every large project it is inevitable you get one event," he says. "For me, what's more important is our organisation and recovery from that one event. That's part of the engineering challenge - to manage risk and limit risk, but to have zero risk is impossible."

Tolerant technology

Just over a year later the LHC was up and running, and on 30 November 2009 it achieved 1.18TeV per beam, beating the Tevatron's eight-year record of 0.98TeV per beam. The following spring the collider hit its initial operating energy of a combined 7TeV. Achieving such high energies was a marvel of modern engineering, but also presented a challenge to the designers of the facility's detectors.

A key issue was radiation tolerance. According to Dave Barney, a physicist working on the LHC's CMS detector, the amount of radiation accumulated over the collider's lifetime will be comparable to an atomic bomb blast. "It's spread over 10 or 15 years rather than 10 or 15 milliseconds, but the level is the same so you have to move to military specs for the detectors and the electronics that go with it," he says. "There are very few companies that produce radiation-tolerant electronics and even fewer who will tell you about it."

CERN researchers worked with IBM, using its 0.25μm CMOS (complementary metal oxide semiconductor) process to produce circuits capable of withstanding the high radiation levels at the LHC. These also had to be engineered to be extremely fast to cope with the 20 million collisions a second the LHC produced prior to the shutdown that began in early 2013. When it comes back online early next year this will increase to 40 million.
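
Those rates set the time budget the front-end electronics have to work with. At 40 million collisions a second, successive bunch crossings are separated by just

```latex
\Delta t = \frac{1}{40\times10^{6}\ \text{s}^{-1}} = 25\ \text{ns},
```

so every channel must capture and buffer its signal within a 25-nanosecond window (50ns at the pre-shutdown rate of 20 million a second).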

By comparison, CERN's early detectors dealt with collision rates of just a few per second. Early experiments used bubble chambers filled with a superheated liquid - normally hydrogen - in which charged particles leave trails of bubbles as they pass through. A magnetic field bends the particles' paths according to their momentum and charge, and photographs are taken of their tracks.

Entering the digital age

"It was a fairly slow, laborious process to analyse the data because you effectively just have to scan through all the photos," says Barney. With the advent of electronic computers a process for digitising the images was created, but it wasn't until French physicist Georges Charpak invented the 'multiwire proportional chamber' in 1968 that detectors became truly digital.

Charpak's device consisted of a box filled with a gas that is ionised by passing particles, along with a large number of parallel detector wires, each producing an electronic signal when a particle ionises the gas nearby. The invention increased the detection rate from one or two particles a second to around 1,000 particles a second, and even today nearly every particle physics experiment features a detector based on these principles.

A so-called 'onion structure' for detectors began to appear in the 1970s, layering multiple sub-detectors in one device. Typically the innermost layer would be trackers to measure the momentum of particles, followed by calorimeters to measure the energy of particles by absorbing them in dense material, and finally muon detectors, to spot the one type of particle capable of penetrating the calorimeters.
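
In software terms the onion structure maps naturally onto a layered pipeline, with each sub-detector adding its own measurement as a particle works its way outwards. The sketch below is purely illustrative - the class and member names are invented, not taken from any CERN framework:

```cpp
// Illustrative 'onion' layering of sub-detectors; all names are hypothetical.
#include <iostream>
#include <memory>
#include <string>
#include <vector>

struct Particle {
    std::string type;
    double momentumGeV;
    bool absorbed;   // set once a calorimeter stops the particle
};

struct SubDetector {
    virtual ~SubDetector() = default;
    virtual void measure(Particle& p) const = 0;
};

struct Tracker : SubDetector {       // innermost layer: measures momentum
    void measure(Particle& p) const override {
        std::cout << "tracker: momentum ~" << p.momentumGeV << " GeV\n";
    }
};

struct Calorimeter : SubDetector {   // absorbs most particles to measure energy
    void measure(Particle& p) const override {
        if (p.type != "muon") p.absorbed = true;
        std::cout << "calorimeter: energy deposit recorded\n";
    }
};

struct MuonSystem : SubDetector {    // outermost layer: only muons get this far
    void measure(Particle& p) const override {
        if (!p.absorbed) std::cout << "muon chambers: hit\n";
    }
};

int main() {
    std::vector<std::unique_ptr<SubDetector>> layers;
    layers.push_back(std::make_unique<Tracker>());
    layers.push_back(std::make_unique<Calorimeter>());
    layers.push_back(std::make_unique<MuonSystem>());

    Particle muon{"muon", 45.0, false};
    for (const auto& layer : layers) layer->measure(muon);   // inner to outer
}
```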

Once again the chief innovation at the LHC was pushing the scale of the devices. "It was a massive leap forward. Though it had been used before, never to the scale or complexity used in the LHC experiments," says Barney. The ATLAS detector features the largest superconducting magnet ever built to bend the paths of particles produced in collisions. The CMS experiment features 200m² of silicon sensors in its tracker and nearly 75,000 23cm-long lead tungstate crystals in its calorimeter, worth $1,000 each.

Dealing with data

Operating at 7TeV, the LHC produces approximately 27 petabytes of collision data annually. While probably the best-known technology to come out of CERN is the World Wide Web, this was somewhat peripheral to the central mission of the laboratory's IT department, which has always been data analysis.
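
Averaged over a year, that headline figure corresponds to a substantial sustained flow of data (a crude average, since the machine does not actually run continuously):

```latex
\frac{27\times10^{15}\ \text{bytes}}{3.15\times10^{7}\ \text{s}} \approx 0.9\ \text{GB/s}.
```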

The first computer installed at CERN was a Ferranti Mercury - a vacuum-tube machine with paper-tape input and output. A series of machines from IBM and the now-defunct Control Data Corporation followed, but their limited processing power and high maintenance requirements restricted their usefulness.

Frédéric Hemmer, head of CERN's IT department, explains that the early machines had proprietary assembly languages, which meant rewriting software libraries for each new machine. It was the popularisation of IBM's Fortran programming language in the 1970s - the first high-level language to be supported by multiple vendors - that led to one of CERN's biggest contributions to computing. "Little by little, over time, CERN and its partners developed a number of tools and now these are more or less common to all experiments in physics," says Hemmer.

The CERN Program Library became a large collection of tools for tasks such as numerical analysis and simulation that, while orientated towards the needs of high-energy physics, were applicable to a wide range of problems. Recognising the need for access to the expanding suite of commercial software packages that became available in the 1990s, the organisation shifted to C++ as its principal programming language in 1996. The CERN Program Library has since been discontinued, but the majority of its tools were carried over into ROOT, the object-oriented data analysis framework and library that replaced it.
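
ROOT remains the day-to-day analysis toolkit for most LHC physicists. A minimal example of the kind of task it is used for - booking a histogram, filling it with (here, simulated) data and writing it to a ROOT file - might look like this; the file, function and histogram names are arbitrary and it assumes a standard ROOT installation:

```cpp
// Minimal ROOT macro: fill a histogram with toy data and write it to disk.
// Run with: root -l -q fill_histogram.C
#include "TFile.h"
#include "TH1F.h"
#include "TRandom3.h"

void fill_histogram() {
    TFile output("example.root", "RECREATE");                 // output file
    TH1F hist("mass", "Invariant mass;GeV;events", 100, 0.0, 200.0);

    TRandom3 rng(0);
    for (int i = 0; i < 10000; ++i)
        hist.Fill(rng.Gaus(91.0, 2.5));                        // toy data near the Z mass

    hist.Write();                                              // store in the open file
    output.Close();
}
```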

The shift from mainframe computing to PCs in the 1980s and the rise of computer networks in the 1990s saw distributed computing take hold at CERN, but it was quickly realised that the amount of data the LHC would produce required far more computing power than the organisation could afford. In 1999 computer scientists Ian Foster and Carl Kesselman came up with the concept of 'grid computing', which would link the resources of the hundreds of organisations collaborating with CERN to create a giant distributed network offering near real-time access to LHC data across the world with no single point of failure.

Today, the Worldwide LHC Computing Grid processes two million jobs a day on roughly 250,000 cores from more than 160 institutions worldwide, interlinked by a combination of private fibre-optic cable and high-speed Internet connections. Hemmer believes CERN's innovation lay in making such a complex system workable and resilient. "We already had experience with distributed computing," he says. "The challenge was to do it world-wide where you are not able to control the nodes."
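
Those headline numbers imply a fairly modest load on any individual machine, which is part of what makes such a loosely coupled, globally distributed system workable (a crude average, assuming jobs of broadly similar length):

```latex
\frac{2\times10^{6}\ \text{jobs/day}}{2.5\times10^{5}\ \text{cores}}
= 8\ \text{jobs per core per day} \approx \text{one job every three hours per core}.
```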

Looking forward

As the LHC comes back online at its full design energy and plans for future accelerators progress, the amount of data to be processed will only increase.

Whatever solutions CERN devises to deal with the increasing complexity of its experiments, it will be able to rely on industry to help. "Most of the time we class people from industry who work with us as collaborators," says Barney. "I don't think they make a lot of money with us, but they learn because they have to solve problems they have never encountered before."

Whatever the future holds, Bordry is sure that CERN will play a leading role. "CERN benefits from the last 60 years of building bigger and bigger accelerators. More than that, we have the structure, the personnel, the resources to build the next one."
