A novel look at CERN

Shutdown at CERN

After the discovery of the Higgs boson, CERN's Large Hadron Collider has been powered down for a series of upgrades.

Visitors entering the control centre at CERN could be forgiven for thinking they have walked into Switzerland's answer to Disney's EPCOT Center. Groups of children mill around the foyer looking at the various photographs and diagrams on the walls, their mood alternating between boisterous impatience and hushed reverence, while tourists fill the air with the incessant clicking of cameras. But delve further into the labyrinthine corridors that make up the facility's nerve centre and the atmosphere becomes altogether more serious.

Save for the gentle hum of computer fans, the whole building is filled with the kind of heavy silence that only serious thought and painstaking endeavour seem able to conjure into being. It's little wonder, as it's here that some of the keenest minds in the world sit wrangling with one of the most profound questions asked by man: what is the universe and how did we come to be in it?

Located just inside the Franco-Swiss border in the pastureland separating Lake Geneva from the Jura Mountains, the institution sits in a landscape more conducive to evenings drinking pastis than the uncovering of the universe's deepest secrets. Farmers with weather-beaten faces periodically emerge from quaint stone cottages, while cattle graze on the gently undulating hills. Even the buildings that house the colossal particle detectors dotted around the Large Hadron Collider's 27km circumference are somewhat reminiscent of dairy farms – all pipes, vents and lichen-coated corrugated iron.

Venture 100m down into the tunnels and chambers of the LHC, however, and an entirely different picture emerges. Here, among millions of pounds' worth of advanced electronic wizardry, researchers stare intently at huge banks of computer screens as subatomic particles travelling at near-light speed smash together in dramatic collisions in tunnels just metres away. Through a combination of seriously high-tech engineering, advanced computing methods and brute force brain power, the physicists gathered here are able to unravel the very fabric of the universe thread by thread.

It was earlier this year at CERN that the famously shy Higgs boson finally revealed itself after 20 years of subatomic hide and seek. The discovery of the Higgs, the particle responsible for granting mass to the building blocks of the universe, was accompanied by the sort of media circus usually reserved for politicians with suspect private lives. The story captured the public's interest in a way that pure scientific research rarely does, and for several months the whole world seemed to be caught up in 'Higgsteria'. The media may now have turned its attention away from particle physics and the LHC and on to the next celebrity calamity or political scandal, but the research at CERN continues apace.

The Large Hadron Collider

In February the LHC was powered down in preparation for upgrades that will allow the scientists to, among other things, further pin down the properties of the Higgs boson. This is the first of several 'Long Shutdown' periods planned for the next decade or so, and will continue until 2015, when the accelerator will be fired up once more.

Particle accelerators such as the LHC use powerful electromagnetic fields to propel charged particles around a circular track at speeds of up to 99.9999991 per cent of the speed of light. Electromagnetic resonators provide the acceleration while quadrupole magnets bring the beams into tight focus. The superconducting electromagnets in the LHC are capable of producing a magnetic field of 8.3T, 250,000 times greater than the Earth's magnetic field, but in order to do so they must be cooled to extremely low temperatures. This is achieved via an advanced superfluid helium cooling system which maintains the temperature at -271.3°C, colder than the -270.5°C of outer space, making the LHC the largest cryogenic system in the world.
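For readers who like to check the arithmetic, a quick Python sketch reproduces the figures above. The Earth-field value used is an assumption chosen as a typical figure (roughly 33 microtesla), not a number from CERN:

```python
import math

# Lorentz factor at the speed quoted above: 99.9999991 per cent of c
beta = 0.999999991
gamma = 1.0 / math.sqrt(1.0 - beta**2)

# Field comparison: 8.3 T LHC dipoles vs a typical Earth field.
# The ~33 microtesla Earth value is an assumption; the real field
# varies with location between roughly 25 and 65 microtesla.
B_LHC, B_EARTH = 8.3, 3.3e-5

print(f"Lorentz factor: {gamma:,.0f}")                # ~7,450
print(f"Field ratio: {B_LHC / B_EARTH:,.0f} times")   # ~250,000
```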

In the LHC two beams of protons are shot around the circular tunnels, one clockwise and one anticlockwise. Each beam consists of 2,808 bunches of protons, each comprising 1,011 particles. These bunches change in volume as they travel through the accelerator but are squeezed into an area of around 16'microns, about a quarter of the width of a'human hair, shortly before the collision takes place. Currently the bunches are spaced at 50'nanoseconds apart generating up to 300'million particle collisions per second but plans are in place to drop this to 25'nanoseconds upon reopening in 2015.
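Those figures hang together, as a rough sketch shows. The number of proton-proton collisions per bunch crossing ('pile-up') used below is an illustrative assumption, not a figure quoted by CERN:

```python
# Rough numbers from the text: 2,808 bunches of ~1e11 protons,
# spaced 50 ns apart now and 25 ns after the 2015 restart
bunches, protons_per_bunch = 2808, 1e11
print(f"protons per beam: ~{bunches * protons_per_bunch:.1e}")

for spacing_ns in (50, 25):
    crossings_per_second = 1e9 / spacing_ns   # 20 MHz, then 40 MHz
    pile_up = 15   # assumed collisions per crossing, for illustration
    print(f"{spacing_ns} ns spacing: ~{crossings_per_second * pile_up:.0e} "
          "collisions per second")
```

With those assumptions the 50 nanosecond spacing gives roughly 300 million collisions per second, doubling to around 600 million at 25 nanoseconds, consistent with the peak figure quoted later in this article.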

"The LHC is currently delivering more than ten times more collisions than the previous best hadron collider, which was the Tevatron in the US," says Phil Allport, upgrade coordinator for the LHC's ATLAS detector. "I think when people originally proposed the LHC it was thought that delivering the design luminosity was already pushing the envelope. It was only five or six years ago that people started saying 'hang 'on, can't we push that any further?'. There were studies to see what the limitations were and what would have to be upgraded in the accelerator. It was found that quite a lot of the complex can be kept as it is and still push the luminosity up to these values."

As Allport says, one measure of the performance of a particle accelerator is luminosity, which varies according to the number of particles passing through a unit area per unit time. The higher the luminosity, the more collisions per second and thus the more likely it is that a particular process will occur. This means that generating as high a luminosity as possible is a major factor in the success of the experiments on the LHC, and it is one of the main priorities for improvement during the shutdown. There are two methods of upping the luminosity: increasing the number of protons per bunch, or squeezing the same number of protons into a smaller space.
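The standard textbook formula makes that trade-off explicit: for head-on Gaussian beams, luminosity grows with the square of the protons per bunch and shrinks with the beam's cross-sectional area. A minimal sketch follows, using illustrative LHC-like inputs; the revolution frequency and beam size are assumptions, not figures from this article:

```python
import math

def luminosity(protons_per_bunch, n_bunches, rev_freq_hz, sigma_x_m, sigma_y_m):
    """Textbook luminosity for head-on Gaussian beams:
    L = f * n_b * N^2 / (4 * pi * sigma_x * sigma_y)."""
    L = (rev_freq_hz * n_bunches * protons_per_bunch**2
         / (4 * math.pi * sigma_x_m * sigma_y_m))
    return L * 1e-4   # m^-2 s^-1 -> the customary cm^-2 s^-1

# Assumed inputs: ~11,245 revolutions per second and a ~16.7 micron
# beam size at the collision point, with the article's 2,808 bunches
print(f"L ~ {luminosity(1.15e11, 2808, 11245, 16.7e-6, 16.7e-6):.1e} cm^-2 s^-1")
# ~1e34 cm^-2 s^-1, which matches the LHC's design luminosity
```

The squared dependence on protons per bunch is why both routes Allport describes, packing in more protons and squeezing the beams tighter, pay off so handsomely.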

"We hope to go beyond the designed luminosity later this decade and then, with a substantial upgrade to the accelerator, be able to deliver enough luminosity for both experiments to get roughly ten times as much data in the 2030s," explains Allport. "At the moment we have been coping with a 50'nanosecond beam crossing. The intention is when the machine turns back on at the beginning of 2015 it will be running at 25'nanoseconds and that already gives you a factor of two in luminosity.

"Another way of looking at it is that you get the same luminosity for a factor of two less collisions per beam crossing, which is why the experiments are very keen that the machine establishes that it can operate with 25 nanoseconds. In fact, we will design it so that we will have a little bit of headroom on that as well. The way in which they propose to deliver the luminosity is for the machine to have a capability of delivering even more collisions but then throttle it back."

When protons collide

Another key feature of the upgrade is an increase in the energy of the collisions from 8 tera-electronvolts (TeV) up to 14TeV, an electronvolt (eV) being the amount of energy gained by a single electron when accelerated through a potential difference of 1V. In everyday terms this is fairly insignificant: 1TeV is roughly the energy of motion of a flying mosquito. But it is the energy density that is important. In the LHC this energy is packed into a space a million times smaller than a mosquito.
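The conversion is easy to verify. In the sketch below the mosquito's mass and speed are assumptions picked to make the comparison concrete:

```python
# 1 eV = 1.602e-19 J, so 1 TeV (1e12 eV) in joules:
eV_in_joules = 1.602e-19
E_1TeV = 1e12 * eV_in_joules
print(f"1 TeV = {E_1TeV:.2e} J")               # ~1.6e-7 J

# Kinetic energy of a flying mosquito: ~2.5 mg at ~0.36 m/s
# (both values are assumptions, chosen for illustration)
m_kg, v_ms = 2.5e-6, 0.36
print(f"mosquito KE = {0.5 * m_kg * v_ms**2:.2e} J")   # same ballpark
```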

As Einstein's famous equation, E=mc², tells us, mass and energy are interchangeable. When two high-energy particles collide, part of their energy is converted into mass, creating new particles. These can be any of the entire family of elementary particles, which means that the higher the energy, the wider the variety of particles that can be produced. The spray of particles resulting from each collision is recorded by a specially designed detector and analysed, allowing scientists to reconstruct what took place in minute detail.
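Rearranging the equation as m = E/c² shows just how little mass even these enormous energies buy, one way of seeing why such extreme energy densities are needed. A quick sketch (the 125GeV figure anticipates the Higgs mass discussed later in this article):

```python
# m = E / c^2: the mass a given collision energy can create
c = 2.998e8       # speed of light, m/s
eV = 1.602e-19    # joules per electronvolt

def mass_kg(energy_eV):
    return energy_eV * eV / c**2

print(f"125 GeV (a Higgs boson): {mass_kg(125e9):.2e} kg")   # ~2.2e-25 kg
print(f"14 TeV (full collision): {mass_kg(14e12):.2e} kg")   # ~2.5e-23 kg
```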

There are six detectors on the LHC – ATLAS, CMS, ALICE, LHCb, TOTEM and LHCf – and while they differ slightly they all share the same basic set-up. They each measure the position, charge, speed, mass and energy of the particles entering them and consist of two main parts: a tracker and a calorimeter. The tracker reveals the trajectory left by a charged particle as it ionises matter, while the calorimeter determines the energy of particles by measuring the amount of energy required to stop them. Over time, all of the radiation crashing into the detectors takes its toll, and so several of the components situated nearest to the firing line need to be replaced during the shutdown.

"Both ATLAS and CMS will replace their trackers for several reasons," says Allport. "One is that they won't have the radiation tolerance. If you look at a typical event at the LHC, when you throw protons at each other you are basically throwing mush on mush with small bits inside. The collisions you are interested in are inside. So the detectors closest to the beam in either direction get the worst battering in terms of radiation.

"The trackers are the innermost parts of the experiment and won't be able to survive ten times the radiation they were designed for. We are still looking at whether or not the forward calorimeters will be able to stand the environment that we are expecting."

At peak performance the LHC is capable of generating 600 million collisions per second. But, as only a very small fraction of these collisions are of interest, the detectors are calibrated to selectively record only the most interesting events.

"Clever electronics discard the collisions we don't want," says Allport. "The LHC will give you a huge number of collisions and if you filter through them you can find the rare collision at high energies. So things such as Higgs production or looking for super symmetry are buried in there, the trick is how you dig them out both in terms of not throwing them away and then what analysis tricks you use to pull the interesting signals out of this huge background. Each experiment is a computer in its own right that basically does a huge amount of data reduction before anything is even stored."

Even when being selective, however, the collisions generate an enormous amount of data. Currently, the data collected by all four main experiments amounts to around 15PB per year. If this were burnt to CD it would take a stack of discs 20km tall to contain it all. Of course, once all of this data has been collected it needs to be analysed.
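The CD comparison stands up to a back-of-the-envelope check, assuming a 700MB disc around 1.2mm thick (both disc figures are assumptions):

```python
data_bytes = 15e15                         # ~15 PB per year
cd_bytes, cd_thickness_m = 700e6, 1.2e-3   # assumed disc capacity and thickness
discs = data_bytes / cd_bytes
print(f"{discs:.1e} discs, a stack ~{discs * cd_thickness_m / 1000:.0f} km tall")
# ~21 million discs and a stack in the 20-26 km range, depending on
# the disc capacity assumed
```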

It's common knowledge that the World Wide Web was developed at CERN in the late 1980s to help scientists share information more easily. This tradition of innovation in computing continues with the current developments in grid computing.

CERN does not have the computing or financial resources to process all of the data generated, so in 2002 it set up the Worldwide LHC Computing Grid to share the work with computer centres around the world. Data from CERN is transferred to 11 large computing centres at rates of up to 10 gigabits per second via fibre-optic cables. Those large centres then send and receive data from 200 smaller centres spread across the globe, giving near real-time data access to more than 8,000 physicists worldwide.
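A rough calculation, derived only from the numbers already quoted, shows why a single link would never be enough and why the data is fanned out across tiers of centres:

```python
link_bits_per_s = 10e9                         # one 10 Gbit/s link
bytes_per_day = link_bits_per_s / 8 * 86400    # ~108 TB per day
print(f"one link moves ~{bytes_per_day / 1e12:.0f} TB/day")

year_of_data = 15e15                           # ~15 PB per year
print(f"shipping a year's data over one link: "
      f"~{year_of_data / bytes_per_day:.0f} days")   # ~139 days
```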

Further experiments

When put back online in early 2015, the upgraded accelerator and detectors will be set to work on a number of experiments, among them determining exactly what kind of Higgs boson has been found. This will again be the task of the ATLAS and CMS experiments.

"The LHC has landed on the shore of a new landscape in physics and the first thing you see on that shore is the Higgs," explains Jon Butterworth, physics professor at University College London and member of the High Energy Physics group on ATLAS. "We kind of knew that it would be there or that we would have had something instead of it.

"It's a completely qualitatively new area of physics because above this energy scale the forces of unified particles are effectively massless, it's terra icognita. The fact that the Higgs is there means that the standard model might work but in a different way from how it works in everyday life. If we hadn't found the Higgs we would have no predictions for what might be up there. As we have found the Higgs we do have predictions but they are still qualitatively different from everyday life. Really we are starting that exploration."

The standard model of particle physics is a collection of theories that accounts for our understanding of fundamental particles and forces. It states that matter consists of tiny subatomic particles: quarks, which combine to form protons and neutrons, and leptons, such as electrons and muons. The four forces – gravity, the electromagnetic force, the strong force (which binds quarks together to form larger particles) and the weak force (which underpins radioactivity) – act through carrier particles exchanged between quarks and leptons.
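The article doesn't name the carrier particles it alludes to, but they are well established physics and can be summarised briefly; note that the graviton remains hypothetical and sits outside the standard model:

```python
# Force -> carrier particle, as standard physics has it
force_carriers = {
    "electromagnetic": "photon",
    "strong": "gluon",
    "weak": "W and Z bosons",
    "gravity": "graviton (hypothetical; not part of the standard model)",
}
for force, carrier in force_carriers.items():
    print(f"{force:>15}: {carrier}")
```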

Earlier this year both CMS and ATLAS confirmed the discovery of a previously unknown particle with a mass between 125GeV and 127GeV. This particle displays many of the properties the standard model states a Higgs boson should possess, but further tests need to be run.

"There are several properties the Higgs has to have," Butterworth says. "A Higgs is something that is responsible for the mass of fundamental particles and in particular for this breaking of the symmetry between the weak force and the electromagnetic force.

"We have got enough evidence that this is a Higgs. It is interacting with the weak force and the electromagnetic force in the way we expected. But also in the standard model the Higgs has got to do more than that. It has to give mass to the electron and the muon and all the quarks as well and we have very limited evidence that it is doing that yet. I'd like to measure the Higgs decaying to quarks because that would be the first time we have seen it directly coupling to quarks. The key thing we want to do is to see as many of these decays as accurately as possible."

Another area of investigation for the team at ATLAS is dark matter. Astrophysicists are baffled by the fact that visible matter only accounts for a tiny fraction of the mass in the universe, around 4 per cent. Physicists around the world have been scratching their heads trying to figure out what is responsible for the remaining 96 per cent and Butterworth hopes his team will find it first.

"There is evidence that there is something beyond the standard model there and it would be great to find that and learn what is next," he says. "I believe that [we are looking for] some sort of particle. It's probably weakly interacting and it's got a mass that may be in reach of the LHC. It's not the same kind of research question as we had with the Higgs. If we don't find it it won't prove there's no dark matter whereas if we hadn't found the Higgs it would've proved there's no Higgs."

Butterworth continues: "There are a few ways of finding dark matter. One is to make it in a lab, which is what we'd do. But it will fly out of the detector and we will only know it is there because there will be missing energy or something. We could provide a candidate for a new fundamental particle with all of the properties consistent with dark matter but we would want to see it in the underground detectors and in additional evidence. If there is dark matter and it comes together in the middle of galaxies then that's what people are looking for in things like AMS [the Alpha Magnetic Spectrometer, a detector mounted on the International Space Station]."

It's early days but perhaps the next time the eyes of the world fall on the physicists of CERN it will be due to them shedding light on the elusive nature of dark matter.
