The seven ages of materials

Image: stone axe head (credit: Dreamstime)

What are the great building blocks of engineering materials? There are thousands, of course, but here we take a whistle-stop tour of seven fundamental substances that have left their mark on the way we built the world around us.

Ever since we first picked up a rock to crack open a nut, sharpened a stick to spear a fish, or used a flint to strike a spark and light a fire, we’ve been using tools. Ever since we first plaited long grasses into a rope with which to lash together a few branches for shelter, we’ve been creating buildings. And from the moment we realised that some rocks split open to produce sharp edges, we’ve been developing weapons.

The evolution of humankind is inextricably bound up with our use of materials and tools. So much so that the main prehistoric phases of our civilisation aren’t named after our linguistic prowess, social interactions or economic achievements, but after the engineering materials of the age. It’s a bias that reflects how materials have changed society with increasing rapidity. While the Stone Age is reckoned in millions of years, subsequent ages are counted in millennia and ever-decreasing units, until we reach the Industrial Revolution, when new processes were introduced quickly enough for the engineering age to be measured in decades.

Many of the materials the Victorians worked with, especially iron and glass, weren’t new, even though elements were being added to the periodic table all the time. In materials science, discoveries and processes overlap and reinvent themselves, often finding new applications and solutions to long-established challenges. By the time we reach the 21st century, innovations in materials such as silicon and carbon are so rapid that we need monthly reports to keep up.

Finally, why no carbon? That’s because here at E&T, when it comes to the fourth most abundant element in the known universe, we reckon its heyday is yet to come.

Stone: from cooking to calendars

Stone Age: c2,500,000-3200 BC

While learning about the Stone Age at school we were routinely told that the earliest humans were hunter-gatherers, living on what they could find or catch. What we weren’t told was that this was the first technology age, when we learnt to shape stone to make tools for hunting, for food preparation, to defend ourselves, to create structures for living in and monuments that acted as astronomical calendars and spiritual centres.

The Stone Age is the first in a three-phase framework of human prehistory – the others being the Bronze and Iron Ages – a scheme devised in the 19th century by Danish scholar Christian J Thomsen, who assumed that each period was technologically more complex than the one that preceded it. Today, this is seen as an oversimplification that emphasises the evolution of technology above other factors such as the development of language, agriculture and society.

The consensus is that the Stone Age started around 2.5 million years ago, the date of the earliest known stone tools, and ended around 3300 BC, when bronze was first manufactured in western Asia. Its final phase, the Neolithic (literally ‘new stone’) age, brought with it cereal cultivation, irrigation and the expansion of villages into towns and cities. The Göbekli Tepe archaeological site in Turkey is the location of the world’s oldest megaliths (i.e. prehistoric stone monuments) and dates back to the 10th millennium BC.

While stone wasn’t the only material of the age – pottery made massive advances, and organic materials such as antler and bone were in common use – the era is named after the most hard-wearing substance of the time. The advent of bronze metallurgy as the new technology for tools and weapons effectively made stone obsolete for these purposes.

Bronze: from axe-heads to armour

Bronze Age: c3200-1200 BC

There was a time when bronze, an alloy of copper and tin, was the hardest common metal known to humankind, lending its name to the Bronze Age, which occupies the gap between the Stone and Iron Ages and in Europe is thought to span around 3200-600 BC. Technically, it was preceded by the lesser-known Chalcolithic period, when copper metalworking was the main technology.

The manufacture of bronze is one of the first recognisable industrial processes, as it required systematic procedures: tin was mined and smelted separately, then added to molten copper. As copper and tin ores are rarely found together (Cornwall being a notable exception), early trade infrastructures in raw materials developed, with Cornish tin exported as far as Phoenicia in the eastern Mediterranean.

Copper mines up to 70m deep, dating back as far as c2000 BC, survive in the UK, while archaeological remains of sword moulds dating to the 12th century BC have been discovered in Somerset. During this period copper-zinc alloys known imprecisely as ‘brass’ were also being produced, which is why ‘copper alloy’ is today the preferred term for both bronze and brass.

This was when geographically wide-ranging commercial networks started to form, leading to the spread of the technology across Ancient Egypt, Asia and Europe. Both iron and copper smelting appeared in Africa, although there is no evidence to show whether the techniques evolved there independently or were introduced from elsewhere.

The benefit of bronze was simply that it was harder and more durable than copper. As uses for bronze increased, metal casting evolved, resulting in better tools, weapons, armour and building materials. Archaeological discoveries of hoards of bronze artefacts also suggest that the material represented wealth in the form of stored value as well as status.

Iron: from quality to utility

Iron Age: c1200 BC-AD 100

Third and last in the conventional sequence of prehistoric technology ages, the Iron Age is by far the shortest, sitting between the collapse of the Bronze Age and the beginning of recorded (i.e. written) history. Although the period takes its name from the widespread employment of ferrous metallurgy and the early evolution of carbon steels for tools and weaponry, iron’s emergence as the pre-eminent engineering material of the era was not ushered in by any significant innovation in materials processing. Its rise was more a reflection of the economic conditions created by a series of events in the Mediterranean – volcanic eruptions, invasions and failures of government – that led to massive disruption of the international tin trade around 1300 BC.

Market pressures caused by tin scarcity forced metalworkers to search for alternatives, and plentiful iron – known in the Bronze Age but considered inferior – became the material of the day. The main aim of technologists was to improve iron by hardening it through physical and chemical processes, and there is also evidence that metalworkers were recycling bronze. As cheap steel improved, weapons became stronger, lighter and harder, with the result that when tin re-emerged onto the market, bronze was no longer price-competitive as a mass-production metal.

The time frame for the Iron Age varies widely depending on location, but in Western Europe the start point is roughly when iron replaced bronze in arms manufacture. The end point is the Roman Conquest (AD 43 in Britain), although iron-working remained a mainstream technology in much of Europe until the Industrial Revolution of the 19th century.

Beyond the swords, daggers, lanceheads and spearheads made to arm the militia, Iron Age ferrous metals were also used in the burgeoning agricultural and construction sectors, for chains, ploughs, reaping hooks, scythes, hammers and saws.

The end of the Iron Age is generally considered to coincide with the Roman Conquest, and history books tell us that it was succeeded by Antiquity and then the Middle Ages. It wasn’t until the 1300s that another material, glass, could lay claim to an age of its own.

Glass: windows, bottles and beyond

Glass Age: 1300-present

Glass in its natural form has been with us since lightning first struck sand to produce fulgurites of fused quartz – long before humans started experimenting with what is now defined as an amorphous solid. Man-made glass can be traced back to around 3500 BC, when the Egyptians and Mesopotamians started to produce jewellery in the form of beads. And although we’ve made glass for ornamentation ever since, by far its most important feature in everyday use is that it can be produced to be transparent to light.

According to the Glass Alliance Europe, “no other man-made material provides so many possibilities across so many industries and disciplines”. In terms of everyday applications, the generic term ‘glass’ tends to refer to familiar uses such as liquid containers, construction materials and consumer optoelectronics, or more simply bottles, windows and lenses. There are of course thousands of other ways in which glass is deployed, from scientific and medical equipment to fibre-optics, from renewable energy to automotive, while in the world of materials science lively debate continues on what substances actually constitute glass.

Although there are a multitude of different types of glass, they are all produced by the same fundamental process: melting silicon dioxide (sand) at high temperature and mixing it with various additives (such as sodium carbonate, or ‘soda’) to create different characteristics such as strength, chemical durability and colour, before cooling it to form a new material. Although industrial production of glass can be traced back to the 13th century, it really came into its own in the mid-20th century, when Sir Alastair Pilkington’s ‘float’ process allowed flat glass to be mass-produced.

One of the most important properties of glass is that it can, according to the Glass Packaging Institute, “be recycled endlessly without any loss in purity or quality”, with manufacturing from ‘cullet’ (recovered glass) requiring less energy than making it from raw materials.

Steel: from skyscrapers to spoons

Steel Age: 1800s-present

With more than 1.6 billion tonnes produced globally each year, steel is one of the most abundant man-made materials on the planet. An alloy, it is made up almost exclusively of iron (as much as 99 per cent), with its secondary component, carbon, contributing up to 2 per cent by weight. Carbon is added to increase iron’s tensile strength, but it also contributes other properties such as hardness, resulting in a metal so versatile that it is one of the great building blocks of the modern world.

Although steel has been known to civilisations globally for up to 4,000 years, it wasn’t until the arrival of the Bessemer process in the mid-19th century that it could be mass-produced in industrial quantities.

Steel started its journey to ubiquity as a semi-precious metal, often produced in a haphazard way in bloomeries, a rudimentary type of smelting furnace, but by the Iron Age it was an established alternative to copper alloys. Because of its hardness, along with its ability to hold a long-lasting sharp edge, it was vital to the arms industry. In antiquity, when steel was still very rare, such was its value that when Alexander the Great defeated the ancient Indian king Porus, he is said to have been presented not with gold, but with steel.

Bessemer’s invention, which removed impurities from iron via an oxidation process to produce steel, was the catalyst required to set the Industrial Revolution in full flow. A new breed of entrepreneurs such as Andrew Carnegie emerged to exploit this seemingly inexhaustible new material, laying rail networks across continents, building vertical cities in the form of skyscrapers and rolling out vast quantities of low-cost utility items such as cutlery.

Globally, production has shifted, with the recent economic boom in China and India creating mushrooming demand. China is currently the top producer, taking a one-third market share. Steel is also the world’s most recycled material.

Aluminium: from tin cans to outer space

Aluminium Age: 1800s-present

Given that it is the most abundant metal in the Earth’s crust, aluminium was always going to play a leading role as an engineering material. The problem is that it only rarely occurs naturally in pure metallic form, being locked away chemically in some 270 different minerals. Despite the difficulties of extraction, aluminium is the second most used metal, with global production in 2016 at 58.8 million tonnes (iron currently holds the record at 1,232Mt).

Taken for granted today as the material of choice when low-density metals are required – from ‘tin’ foil to aeroplanes, drinks cans to construction, food processing to machinery components – large-scale industrial production didn’t arrive until the late 19th century when the Hall-Héroult process signalled the start of aluminium smelting (bit.ly/ecceng-aluminium). In 1889, Austrian chemist Carl Joseph Bayer discovered a method of purifying bauxite (aluminium’s most common ore) to yield alumina. The Bayer and Hall-Héroult processes remain the basis of aluminium production to this day.

As for the name, ‘aluminium’ was adopted in the early 19th century because the -ium suffix sounded ‘classical’. Most people agreed, except Webster’s Dictionary, which listed ‘aluminum’. US journalists adopted that spelling, but the American Chemical Society only followed suit as recently as 1925.

Plastic: from packaging to pollution

Plastic Age: 1907-present

If ever there was a material that is both a blessing and a curse, it is plastic. While we tend to think of it more in terms of what it represents, either negatively (cheap, insubstantial, environmentally threatening) or positively (affordable, utilitarian, recyclable), the word ‘plastic’ these days is a collective term for a vast range of materials that interact with virtually every aspect of 21st-century life: from the trivial toys you find in Christmas crackers (‘commodity’ plastics) to life-saving synthetic heart valves (‘engineering’ plastics). Depending on your point of view, these petrochemical derivatives have the potential to stimulate the economies of emerging nations or create environmental meltdown.

They contaminate our oceans and yet have made spaceflight possible. They are destructive and innovative. In fact, the only thing plastics really have in common is that they fall under a broad definition of synthetic or semi-synthetic organic compounds that are malleable and can be moulded. And even that’s up for debate, with the International Union of Pure and Applied Chemistry issuing guidelines to remove ambiguity when using the word (for example, not treating it as a synonym for polymer).

The first man-made plastic was invented by metallurgist Alexander Parkes, who exhibited his Parkesine nitrocellulose compound (intended to be a substitute for ivory) at the Great London Exposition of 1862, where it won a bronze medal. The first fully-synthetic plastic was Bakelite, invented in New York in 1907 by Leo Baekeland, who also gave us the term ‘plastics’. The rest is history, with the material becoming a convenient and economic alternative for just about any engineering material imaginable: metals, wood, ceramics, stone or glass.

Because commodity plastics are cheap, they are ubiquitous in food packaging, most of which has historically been thrown away; and because they are made of large molecules, they decompose slowly. This, in turn, has led to the development of industries such as recycling and bioplastics.
