It's 70 years since two devastating atomic bombs forced Japan's surrender and ended the Second World War. The energy in atoms has since been harnessed for peaceful ends, but an isolated Manhattan Project site shows both the promise of nuclear power and the limits of our ability to cope with it.
“Some recent work by E. Fermi and L. Szilard leads me to expect that the element uranium may be turned into a new and important source of energy in the immediate future. Certain aspects of the situation which has arisen seem to call for watchfulness and, if necessary, quick action.” When Albert Einstein tells you to pay attention, it’s safe to assume that something important is occurring. The legendary scientist’s letter to US President Franklin Roosevelt in August 1939 resulted in quick action indeed: the launch of a programme code-named Development of Substitute Materials that eventually became known simply as the Manhattan Project.
Few technologies have such an obvious duality of purpose as the unleashing of the power locked within atoms of uranium and plutonium. In a controlled nuclear chain reaction, a small amount of radioactive material can generate enormous quantities of thermal energy, heating liquids and driving turbines to generate clean, carbon-free electricity. In the absence of nuclear reactors generating over 10 per cent of the world’s electricity, we would pump an additional 3.5 gigatons of carbon dioxide into the atmosphere each year, accelerating global warming.
In 2013, Nasa’s Goddard Institute for Space Studies calculated that, despite high profile accidents at Chernobyl, Three Mile Island and Fukushima, nuclear power also avoids an average of 76,000 deaths annually from toxic air pollution caused by the burning of fossil fuels.
However, if nuclear energy is released in a fraction of a second instead, the same material can cause massive devastation, laying waste to cities and irradiating vast areas for decades. The atomic bombs dropped on Hiroshima and Nagasaki, ending the Second World War, killed over 110,000 people immediately and ultimately another 240,000 from radiation poisoning. Some scientists estimate that decades of bomb testing have since resulted in the deaths of hundreds of thousands more.
Rarely can the good and the evil threads of nuclear technology be completely untangled. Power stations that provided emissions-free energy for peaceful civilian applications also bred material for nuclear weapons. Meanwhile, technologies and systems developed to churn out plutonium for warheads enabled scientists to produce radio-thermal fuel for deep space probes and Mars rovers.
Perhaps one location sums up the split personality of nuclear technology more than any other. In the summer of 1942, Hanford was just another sleepy farming community in rural Washington State, in the northwest corner of America. But it was blessed - many would now say cursed - with geography that made it irresistible to the nascent Manhattan Project. The nearest towns were small, major roads and railways were distant, yet the mighty Columbia River ran right next door (ideal for cooling reactors).
Construction of the world’s first industrial plutonium reactor began here in late 1943 - and was complete little more than a year later. Reactor B went on to produce the plutonium for the Trinity ‘gadget’ and both atomic bombs detonated over Japan in 1945. But that was just the start for Hanford. By the early 1960s, the town was home to nine nuclear reactors, five reprocessing plants and nearly a thousand support buildings and laboratories. It eventually supplied the plutonium for the majority of the 70,000 warheads ever made in America, only ceasing production in 1987 as the Cold War thawed.
In the rush to produce weapons, little attention was given to Hanford’s by-products: large quantities of radioactive and chemically hazardous waste. Over the decades, an estimated 1.7 trillion litres of liquid waste was simply discharged into the ground. The most toxic waste was pumped into underground tanks, where 211 million litres of highly radioactive sludge still remain.
The plan was for a rigorous remediation operation managed by the Department of Energy (DOE) that would take 30 years. Low-level waste would be isolated and the most dangerously radioactive materials vitrified. This process, in which waste is sealed into glass bricks that would then be buried deep in mountain repositories, was initially developed at the Pacific Northwest National Laboratory, also based at Hanford. Construction of the vitrification plant began in 2001 and it was meant to be operational in 2011. However, technical and managerial failures have pushed that out to the late 2020s. Tanks, designed in the 1950s to store waste for 20 years, will now have to hold it until as late as 2040 - but a third have already failed and some are still leaking.
In a damning report published this May, the US Government Accountability Office (GAO) observed that over the last 25 years, the DOE has spent more than $19bn on projects and strategies for managing the contents of the underground tanks, “none of which,” it notes, “have succeeded in treating any waste.” The DOE’s latest strategy is for two new facilities to pre-treat and stage low-level waste while the department works to resolve technical problems in the vitrification plant (more on that later). The GAO concluded that the DOE had come up with this idea based on facilities it had proposed in prior years but had never actually constructed, thus potentially excluding cheaper options. “Furthermore, DOE does not have reliable estimates for the cost or schedule for constructing the projects,” the report continues.
Even if these facilities could spring into existence overnight, they would have nowhere to send their pre-treated waste. The Waste Treatment Plant (WTP), where the materials are intended to be blended with glass-forming substances, heated to over 1100°C and poured into stainless steel casks to cool, is far from complete. Work on a key part of this vitrification plant stopped in 2012 because there was no clear technical route forward. A major problem is that the WTP builder, Bechtel National, bid for the project in 2000 under a ‘design-build’ contract. This is a supposedly low-risk, high-speed approach where technology development, plant design and construction happen simultaneously rather than one after another.
In fact, it meant that difficult scientific and technical challenges in processing hot, liquid radioactive waste were not adequately tackled until far too late. The GAO report notes drily, “DOE no longer encourages the use [of the design-build approach] for complex, first-of-a-kind facilities but has continued to use it for the WTP. Even as unresolved technical challenges persist, DOE has continued the design-build approach to constructing the WTP without fully implementing aggressive risk mitigation strategies.” Without these strategies, such as enlisting experts to help with the design, the GAO believes: “DOE will have little assurance that technical challenges will be solved or that emerging ones will be mitigated in design.”
Uncertainty over the treatment and destination of nuclear waste extends to the very highest levels in America. In 2011, the Obama administration withdrew funding for a proposed deep geological repository for high-level radioactive waste at Yucca Mountain in Nevada - a move that the GAO slammed as politically rather than scientifically motivated. There is now no destination for Hanford’s vitrified waste, even if the WTP does get up and running in the next couple of decades.
Meanwhile, all the waste that was released into the ground is forming ‘plumes’: subterranean streams that are mingling with groundwater and heading slowly (although no one knows exactly how slowly) towards the Columbia River. Around 190 square kilometres of the site are contaminated with a range of toxic and radioactive chemicals, including carbon tetrachloride, uranium, caesium, tritium, cyanide and strontium-90.
At some points, compounds that trap some of the most dangerous chemicals (such as strontium-90) have been injected underground. Elsewhere, water is being pumped to the surface, treated, and then returned to the ground, with the aim of both protecting the Columbia and removing the contaminants for good. However, there was no existing method for measuring how well this ‘pump-and-treat’ system was working, so contractors and DOE scientists developed their own.
James Hanson and Matthew Tonkin started by generating 'capture maps' based on the water levels observed in monitoring wells, as well as simulated maps developed from computer models of how water moves through the geological strata. They then overlaid these maps on the distribution of hexavalent chromium (a strong carcinogen) and established a relationship between the water being captured and the contaminant plumes. Stretches of river shoreline showing less protection could then be targeted for extra treatment.
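The overlay idea can be sketched in a few lines of code. This is a toy illustration with invented numbers and a grossly simplified geometry - not the contractors' actual model or Hanford data: grid cells count as 'captured' where simulated drawdown from an extraction well exceeds a cutoff, and the plume fraction lying outside the capture zone flags where extra treatment might be needed.

```python
import numpy as np

# Hypothetical 1 km x 1 km site discretised into 50 m grid cells.
# All values are illustrative, not real Hanford measurements.
n = 20
y, x = np.mgrid[0:n, 0:n]

# Simulated head drawdown (m) around an extraction well at the grid
# centre: drawdown decays with distance from the well.
dist_to_well = np.hypot(x - n // 2, y - n // 2)
drawdown = 2.0 * np.exp(-dist_to_well / 4.0)

# A cell counts as 'captured' if drawdown exceeds a cutoff.
captured = drawdown > 0.2

# Hypothetical hexavalent-chromium plume (ug/L), offset towards one
# edge of the site, as if creeping towards the river shoreline.
plume = 50.0 * np.exp(-np.hypot(x - 14, y - 10) / 3.0)
contaminated = plume > 5.0

# Fraction of the plume inside the capture zone; contaminated cells
# outside it would be candidates for extra treatment.
overlap = (captured & contaminated).sum() / contaminated.sum()
print(f"{overlap:.0%} of the plume lies inside the capture zone")
```

In the real analysis the capture maps came from observed well levels and calibrated groundwater models rather than a single analytic decay curve, but the overlay-and-compare step works the same way.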
Hanford engineers are also optimising the pump-and-treat process itself. Mark Carlson and Mark Byrnes of contractors CH2M Hill wanted to figure out why their decontamination systems were not working properly. The systems rely on fluidised bed bioreactors (FBRs) that use bacteria to continuously remove nitrates, metals, and volatile organic compounds. Carlson and Byrnes found that the micronutrient recipe typically used to feed the bacteria was too high in carbon and manganese, causing slimy biological fouling in the reactor. Adjusting the feed eventually restored the reactors’ performance.
The remediation work at Hanford could now help other rivers threatened by contaminated groundwater. Luckily, few countries need to deal with the residue of decades of plutonium production. But septic systems, landfills, leaking underground storage tanks, and agricultural and industrial operations can also render groundwater hazardous to aquatic life or unfit for consumption. California alone estimates that more than a third of its groundwater by area has been contaminated beyond use.
While nuclear warheads are no longer produced at Hanford, one operational reactor remains. The Columbia Generating Station is an 1150MW boiling water reactor that has been in operation since 1984. It is the Pacific Northwest’s only nuclear power plant, although Washington did start building several others in the 1980s. Today, it supplies electricity at a levelised cost of about 5 cents per kilowatt-hour, which is cheaper than gas, wind or solar.
The security and isolation of Hanford’s nuclear facilities meant the site was attractive for other projects that thrive far from the madding crowd. In 2001, Caltech and MIT opened one of two Laser Interferometer Gravitational-Wave Observatories (LIGO) on the Hanford site. This large-scale experiment can also trace its history back to Einstein, who in 1916 predicted the existence of gravitational waves as part of his theory of General Relativity.
When large, dense cosmic bodies move suddenly, they cause ripples in the fabric of space-time, carrying away gravitational energy in a similar way to waves on a pond. However, gravitational waves have never been directly observed as they are almost unbelievably faint. Theoretically, they should be detectable by a pair of laser interferometers measuring the time it takes light to travel between suspended mirrors. With the passage of even a massive gravitational wave, the distance measured by a light beam would shift by the tiniest amount: for a detector 1m long, the change would be millions of times smaller than a proton.
LIGO solves this in part by using large (4km) detectors, very high quality laser light and interferometers separated by thousands of kilometres: Hanford’s twin is in Louisiana. But they still require a very calm environment, with no ambient vibration from passing lorries or bustling city streets. (Even stray air currents can fox the detectors.) The remoteness of Hanford, together with its ample supply of scientists and technicians, made it the perfect location. Despite such attention to detail, LIGO failed to detect a single gravitational wave during its first decade of operation. It is now undergoing an upgrade that will extend its observational reach from 50 million to 500 million light years, and should be fully operational later this year.
In the greater scheme of things, it is far too early for nuclear technology to be declared good or evil. The nuclear era is still younger than some of the candidates in next year’s US Presidential elections and even if we cease smashing atoms today, the effects of our experimentation will linger for centuries. Cleaning up Hanford has cost around $40bn so far, a figure estimated to top $110bn by the time major remediation efforts cease in 2090. Cleaning up is predicted to continue to require at least $2m annually well into the 22nd century and, judging by the sluggish progress to date, probably long beyond.
But humanity has also learned much over these short decades. Commercial power stations are, despite the 2011 accident at Fukushima, generally safer and more secure than they were in the 20th century. And while the threat of nuclear weapons remains very real, a steep reduction in the number of warheads (the US now has fewer than 5,000) lowers the chance of a thermonuclear accident. There are also start-ups trying to reboot nuclear power with smaller, smarter designs. Transatomic Power’s molten salt reactors could run on the spent fuel rods from traditional power plants, while TerraPower, a company backed by Bill Gates, promises a fail-safe reactor that cannot be used to make nuclear weapons.
Perhaps we are about to enter an age where nuclear power finally delivers on its promise of cheap, safe, carbon-free energy. Or perhaps we will continue to struggle to reconcile our dreams of atoms for peace with the realities of pollution, proliferation and politics.
One thing is certain. Since uranium-235 - the payload of many early atomic weapons - has a half-life of 700 million years, this is a debate that will not be settled for some time to come.