Big-science experiments are expensive, so electronics designers are turning to off-the-shelf technologies to keep costs under control.
Earlier this year, scientists celebrated the centenary of the discovery of cosmic rays during balloon flights over the Viennese countryside by physicist and Nobel laureate Victor Hess. Today, several hundred miles to the north in Berlin, people such as Peter Wegner, theoretical physicist and mathematician at the Deutsches Elektronen-Synchrotron (DESY), are working on a much larger-scale project to look more closely at the incredibly energetic radiation that emanates from quasars and supernovas.
"We have an energy density towards the top that is much higher than we can achieve on the Earth at facilities such as CERN," Wegner says. Luckily for us, the atmosphere intercepts most of the rays before they reach the ground, so researchers look instead for indirect evidence of the particles and rays: a spray of Cherenkov radiation.
A charged particle hit by one of these photons can be accelerated to such a degree that it travels faster than the speed of light in air. The electromagnetic field around the fast-moving particle excites other atoms in the air, producing a telltale glow of coherent blue light.
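That "faster than light in air" condition sets a minimum energy for the emitting particle. As a back-of-envelope sketch (assuming sea-level air with a refractive index of roughly 1.0003 and an electron as the emitter - numbers not quoted in the article), the threshold follows from requiring the particle's speed fraction β to exceed 1/n:

```python
import math

def cherenkov_threshold_mev(n, rest_energy_mev):
    """Minimum total energy for Cherenkov emission: beta must exceed 1/n."""
    beta = 1.0 / n                               # speed fraction at threshold
    gamma = 1.0 / math.sqrt(1.0 - beta ** 2)     # Lorentz factor at that speed
    return gamma * rest_energy_mev

# Electron (rest energy ~0.511 MeV) in sea-level air (n ~ 1.0003):
threshold = cherenkov_threshold_mev(1.0003, 0.511)
print(f"{threshold:.1f} MeV")  # roughly 21 MeV
```

Only particles above roughly this energy glow at all, which is one reason the light tags the extreme end of the cosmic-ray spectrum.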
The problem with using Cherenkov radiation is that it is hard to correlate the emissions with a source in space. An array of detectors makes it possible to map them more accurately. With three or more detectors, Wegner says, it is possible to determine not just the energy but also the orientation and shape of the Cherenkov glow and, using a stereo reconstruction, to point back to the source of the original cosmic ray.
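In its simplest form, the stereo reconstruction Wegner describes amounts to intersecting the image axes seen by different telescopes once they are projected into a shared plane. The helper below is an illustrative sketch of that geometry, not the CTA reconstruction code:

```python
def intersect(p1, d1, p2, d2):
    """Intersect two lines, each given as (point, direction) in a shared
    2D plane - e.g. two telescopes' Cherenkov image axes."""
    # Solve p1 + t*d1 = p2 + s*d2 for t (Cramer's rule on a 2x2 system).
    det = d1[0] * (-d2[1]) - d1[1] * (-d2[0])
    if abs(det) < 1e-12:
        raise ValueError("axes are parallel; a third view is needed")
    rx, ry = p2[0] - p1[0], p2[1] - p1[1]
    t = (rx * (-d2[1]) - ry * (-d2[0])) / det
    return (p1[0] + t * d1[0], p1[1] + t * d1[1])

# Two image axes that both point back at a common origin at (2, 2):
print(intersect((0, 0), (1, 1), (2, 0), (0, 1)))  # (2.0, 2.0)
```

With more than two telescopes the axes are intersected pairwise and averaged, which is why adding detectors sharpens the pointing.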
Several installations have multiple telescopes set up to look for cosmic-ray sources but not on the scale envisaged by an international group of research institutes. "We intend to increase the number dramatically - to 50 telescopes in all," says Wegner. "Once installed, we will have a thousand times more sensitivity than any of the other Cherenkov detectors."
DESY developers working on the array control software and electronics for the Cherenkov Telescope Array (CTA) expect to be able to take advantage of existing work on synchronising large scientific systems, reusing their code and architectures.
It is all part of a growing trend to use off-the-shelf components where possible to keep the cost down in big-science projects - mixing custom design with the smart application of existing designs. For example, to ensure that the computers running the CTA can correlate detection events, each of the telescopes must be synchronised in time. The prime candidate for that is the White Rabbit time-distribution protocol developed at CERN and at the GSI accelerator laboratory in Germany. White Rabbit itself uses a lot of off-the-shelf technology. The packets that synchronise the nodes run over an Ethernet network. "You just need special devices for the time switches," Wegner notes.
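White Rabbit builds on the IEEE 1588 Precision Time Protocol, and the heart of that scheme is a four-timestamp exchange carried in those ordinary Ethernet packets. A minimal sketch of the standard offset-and-delay arithmetic (assuming a symmetric link, as PTP does; the timestamps here are invented):

```python
def ptp_offset_and_delay(t1, t2, t3, t4):
    """Classic two-way time transfer, as in IEEE 1588 PTP (which White
    Rabbit extends): master sends at t1, slave receives at t2, slave
    replies at t3, master receives at t4. Assumes a symmetric link."""
    offset = ((t2 - t1) - (t4 - t3)) / 2.0   # slave clock minus master clock
    delay = ((t2 - t1) + (t4 - t3)) / 2.0    # one-way link delay
    return offset, delay

# Slave clock 5 ticks ahead, link delay 2 ticks each way:
# master sends at t1=100, slave stamps arrival t2=107,
# slave replies at t3=110, master stamps arrival t4=107.
print(ptp_offset_and_delay(100, 107, 110, 107))  # (5.0, 2.0)
```

The slave then steers its clock by the computed offset; White Rabbit adds sub-nanosecond refinements on top of this basic exchange.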
The world of radio telescope arrays will provide the framework for the software: the Alma Common Software was written to coordinate individual antennas remotely. Elements across the network publish and subscribe to data feeds using the standard Data Distribution Service (DDS). Lower-level control over the sensors and actuators that steer the detectors is then handled through OPC UA, a standard from the world of industrial automation. National Instruments' Labview is used to pull the pieces of hardware together into the software framework - and has become a commonly used tool in projects like this. "If you perform upgrades or have new instruments you don't have to change the functionality of the framework," explains Wegner.
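The publish/subscribe idea at the core of DDS can be illustrated with a toy in-process message bus. The sketch below shows only the pattern - loosely coupled producers and consumers joined by named topics - not the real DDS API, and the topic name is invented:

```python
from collections import defaultdict

class Bus:
    """Toy in-process publish/subscribe bus, purely to illustrate the
    pattern the ACS framework uses over DDS."""
    def __init__(self):
        self._subscribers = defaultdict(list)

    def subscribe(self, topic, callback):
        """Register a callback for every future sample on a topic."""
        self._subscribers[topic].append(callback)

    def publish(self, topic, sample):
        """Deliver a sample to every subscriber of the topic."""
        for callback in self._subscribers[topic]:
            callback(sample)

bus = Bus()
readings = []
bus.subscribe("telescope/7/pointing", readings.append)
bus.publish("telescope/7/pointing", {"az": 181.2, "alt": 45.0})
```

Because publishers never name their consumers, new instruments can be added without touching the framework - the property Wegner highlights.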
Biggest radio telescope
The Square Kilometre Array (SKA) of radio telescopes will perform astronomy on an even larger scale. Covering a much greater total area than its name suggests, the SKA is a huge undertaking destined for remote spots in both Australia and South Africa. When completed it will be the largest and most sensitive radio telescope to date, and another of the great exemplars of current 'big science'. The project name comes from the size of the total collecting area, which will create an aggregate instrument that has 50 times the sensitivity of the best telescopes now in place. Thousands of antennas will extend to distances of 3,000km from the centre of the array.
Like the CTA, the array divides into three, covering different parts of the radio spectrum, each one requiring new approaches to design to make it possible to deploy the instruments without incurring astronomical costs. In the low-frequency range, there are 2.5 million individual receiver elements, says Gary Kemp, programme director at Cambridge Consultants. The company was engaged to bring down the production cost of the antennas and their associated electronics.
"Our role is to take a prototype design that the University of Cambridge has been working on and turn that into something that is manufacturable in volume and has integrated receiver electronics with it," Kemp explains. "Replicating a design many, many times calls for particular skills. When you look at the work we do for other clients - mass consumer products like coffee machines - that's where we come in.
"The target cost is as much a part of the specification as things such as noise performance," Kemp adds. "The target for each element is £75. To get to that you have to consider how much metal you use in the antenna for a start. The cost of metal is a significant part of the cost as well as the labour to make the elements."
The university's initial design resembles a stepped pyramid structure cut out of large pieces of sheet metal. "Cutting that out of sheet means you have quite a lot of metal, but you are also throwing a lot away. Where we started was to look at these filled antenna elements and work out whether they actually needed to be filled - the electrical current may just flow around the edges," Kemp explains. "If all you have to do is draw around the edge you don't have to make it out of sheet metal. You can make it out of wire."
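A rough sanity check shows why tracing the outline in wire rather than cutting filled shapes from sheet saves metal: a panel's mass scales with its area, a wire outline's with its perimeter. The dimensions below are invented purely for illustration and are not the SKA design values:

```python
import math

# Hypothetical numbers: one square antenna element, 0.5 m on a side,
# in 1 mm steel sheet, versus the same outline bent from 4 mm steel wire.
side_m = 0.5
sheet_thickness_m = 1e-3
wire_diameter_m = 4e-3

sheet_volume = side_m ** 2 * sheet_thickness_m                   # filled panel
wire_volume = 4 * side_m * math.pi * (wire_diameter_m / 2) ** 2  # outline only

print(f"wire uses {wire_volume / sheet_volume:.0%} of the sheet metal")
```

With these made-up figures the wire outline uses roughly a tenth of the metal, before counting the offcuts that sheet cutting throws away.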
By using wire construction, it became possible to go to the same manufacturers that produce supermarket trolleys and coathangers - companies that are used to making very cheap metal shapes. "With wire, the process is more readily automated, which frees up the decision on where to make the thing. You don't need to go where the labour cost is low," says Kemp.
Having a product that could be made in a local highly automated plant would help deliver on one of the aims of the SKA: to promote UK and European manufacturing as well as European design. "It was a fundamental change to the design. We had the same shape but it was made of line elements rather than two-dimensional elements. We fed back the proposed modifications to the university so they could confirm with simulations that it would still work," Kemp explains. "At the same time we were talking with manufacturing partners to get their advice on what was the best approach. Manufacturers would say: 'you can't do that but we can do this'."
Kemp adds: "Collaborating with industry at this stage is really important. If you bring in experience in manufacturing at the start of the design process you approach the design with a mindset that gets you where you want to go more quickly. You get the technical performance you want and make sure that the whole build is viable from the start. It avoids the problem of where you have something that works but now you have to figure out how to make it."
The ability to use off-the-shelf components helps keep the cost of these large systems under control - scientific users can piggyback on the massive scale of the consumer electronics market to deliver cheaper components every year. "Designing a broadband amplifier that's matched to an antenna with a complex impedance was a challenge," admits Kemp, "but we have a design that can use commercial components that plug into the top of the structure, which also lends some structural integrity to the antenna elements."
Off-the-shelf design cannot go everywhere. Electronic sensors that go into systems such as the particle detectors at CERN's Large Hadron Collider (LHC) can demand specialised design - although it is possible to use commercial manufacturing processes to keep costs under control. "We supplied several thousand detectors for high-energy particles to CERN," says Giorgio Fallica, manager of advanced sensor development at STMicroelectronics. The LHC detectors rely on semiconductors but they need to cover a wide area, which does not fit well with the way that chips are normally made. Most chips cover an area of a square centimetre or less. Even the very largest devices are limited by the size of the reticle that holds the mask used to define features on their surfaces - a maximum of 600mm² - so a hundred or more of those will fit onto a typical wafer.
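The arithmetic behind that mismatch is straightforward: a reticle-limited die is tiny next to a wafer, while a 100cm² sensor is not. A crude upper bound (ignoring edge loss and the rectangular die grid, and assuming the common 150mm, 200mm and 300mm wafer diameters rather than figures from the article):

```python
import math

def max_dies(wafer_diameter_mm, die_area_mm2):
    """Crude upper bound on dies per wafer: circular wafer area divided
    by die area, ignoring edge loss and the rectangular die grid."""
    wafer_area = math.pi * (wafer_diameter_mm / 2) ** 2
    return int(wafer_area // die_area_mm2)

# Reticle-limited 600 mm^2 dies on 200 mm and 300 mm wafers:
print(max_dies(200, 600), max_dies(300, 600))  # 52 117
# versus a 100 cm^2 (10,000 mm^2) sensor on a 150 mm wafer:
print(max_dies(150, 10_000))  # 1
```

Even as a generous upper bound, a 100cm² device leaves room for only one die - which is why wide-area detectors sit so awkwardly in a process tuned for stamping out many small chips.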
Each individual sensor in the Compact Muon Solenoid (CMS) detector needs to measure 100cm² in area. "There is only one detector per wafer," says Fallica. The surface features on the wafer are not as fine as those used in today's chips, which makes it possible to use much larger contact masks than reticle-based projection optics allow.
In reactors and particle accelerators, electronic components may be bombarded with heat, radiation, and high-energy particles many times during their operational life: they have to be designed to withstand conditions generally only encountered by deep-space probes. Alpha particles, gamma rays, and high-energy neutrons disrupt semiconductors in a variety of ways. They can cause 'soft errors', in which a logic gate flips from one state to another.
Sometimes the damage is permanent: transistors go into 'latch-up' and pass so much current that they destroy themselves. And the components that power and control the sensor arrays are at as much risk as the sensors themselves. Lorenzo Naso, divisional marketing manager at STMicroelectronics, says: "A standard power technology cannot allow a sufficient level of radiation hardness." However, STMicroelectronics was able to use an existing design that, with some modifications, could be hardened against the intense radiation generated by particle collisions.
A slightly unconventional basic design turned out to be more resistant to radiation than most. "This technology is not conventional for a voltage regulator," says Naso. "It is based on transistors that are arranged vertically rather than horizontally."