
Can digital twin Earths help us prepare for what lies ahead?

Image credit: Alamy

Weather forecasting and climate modelling have come a long way over the past 50 years. But life on Earth is affected by more than just weather. Could future technology fully predict the ups and downs of the planet’s behaviour, including human activity?

Misfortunes never come singly, says an old proverb. And so, as the Covid-19 pandemic continued wreaking havoc in the world, the UN Food and Agriculture Organization (FAO) revealed in January 2021 that prices of key crops had been rising at the highest rate in six years. Wars, pest infestations, droughts, hurricanes, and floods disrupted agriculture in many parts of the world over the course of the ill-fated 2020, and suddenly there was less produce left in global reserves. The price of soy shot up by 56 per cent compared to the previous year; corn gained 45 per cent. While not as serious as the global food crises of the late 2000s, the trend was concerning, especially for poorer countries, according to the FAO.

“The production of crops such as wheat and corn is very climate-dependent,” said Professor Francisco Doblas Reyes, director of the Department of Earth Sciences at the Barcelona Supercomputing Centre, which is part of the European scientific community developing next-generation climate models. “Countries can prepare for fluctuations in agricultural production and dampen price increases by cleverly managing their stock. But to do that effectively, they need information several years in advance about the situation on both the regional and global level.”

The FAO and other international organisations make global food security forecasts based on climate predictions and global political developments to help countries plan ahead. But current weather and climate models are neither sufficiently accurate nor sufficiently detailed for longer-term predictions, and analysts need to combine information from various sources. The work is time-consuming and tedious.

The next generation of environmental models could tackle this problem in a more automated way, according to Doblas Reyes. Not only for agriculture and food security, but for all the other sectors affected by weather, climate fluctuations, natural disasters, and human activity.

“It’s not just farming; there are many climate-sensitive sectors that would benefit from credible and timely information to make decisions,” said Doblas Reyes. “For example, people who make decisions about big renewable energy infrastructure projects need to be able to adapt to climate change. They need to know what the situation is going to look like in the next 10 to 20 years. But to provide them with such information, we need to be able to model the progress of climate change with greater spatial and temporal resolution. We need to be able to model its progress in a more realistic way.”

One such next-generation environmental modelling project hoping to tackle this challenge is Destination Earth. Launched in 2020, this ambitious endeavour, spearheaded by the European Commission, aims to develop an environmental super-model: a digital replica of the planet’s entire system, including human activity. The model, also called a digital twin Earth, will rely on well-established methodologies of weather and climate forecasting but take these technologies one step further. It will weave cutting-edge climate simulations together with information about all other aspects of the Earth’s system, gathered by sensors in space, on the ground, underground, and under water, to effectively simulate the proverbial ‘butterfly effect’: the impact of seemingly minor regional events across the planet.

How do volcanic eruptions affect weather patterns on the other side of the globe? Is there a likelihood of a devastating drought in South America in the next five years? What would the changes in precipitation patterns mean for farmers in Argentina? And how would their decreased wheat production affect the global food market? How will car electrification targets influence air pollution levels in Europe? And what can be done to mitigate any effects? A decade from now, anybody needing answers to such questions might be able to get them without complex training in data processing and analysis.

According to Nicolaus Hanowski, head of mission management and ground segment at the European Space Agency (ESA), the ambitious technology is feasible thanks to the revolution in Earth observation that has taken place over the past decade. Since 2010, hundreds of satellites have been launched into orbit around the Earth to take the planet’s pulse on a daily basis.

“Today, we have huge amounts of Earth-observation data, not only imagery, but all kinds of data, about the state of things on the surface of the Earth, even in the Earth’s interior,” Hanowski said. “We have detailed measurements of pollutants; we have data on wind patterns throughout the Earth’s atmosphere. With this rise of data diversity and data volumes beyond meteorological data, we should be able to get to the level of prognostic and predictive power similar to what we have in weather and climate modelling.”

The Copernicus Earth-observation programme, funded by the European Commission but overseen by ESA, is at the heart of these efforts. Currently consisting of eight satellites monitoring various aspects of the Earth’s behaviour including sea-level rise, wind circulation, air quality, and the land and sea surface, the constellation provides data free to users all over the world.

Combining high-resolution and novel multispectral capabilities, a swath width of 290km and frequent revisit times, the Copernicus Sentinel-2 mission offers views of Earth’s changing lands in unprecedented detail.

Image credit: ESA/ATG medialab

When setting up Copernicus, the Commission’s goal was to stimulate European companies to develop novel applications based on this data that would make the benefits of Earth observation more widely accessible and introduce the technology to sectors that would not traditionally use it. According to Hanowski, the mission has so far been successful.

“Ten years ago, only about one-30th of the data generated by ESA’s missions was actually downloaded by users,” said Hanowski. “Since then, the ratio has completely reversed and every data product that is generated today is on average downloaded 30 times. Just from our ESA hub, we disseminate 250 terabytes of data per day. If you put that on discs, you would reach the height of the Eiffel Tower.”

While in the past most users of ESA’s data were scientists looking to solve particular scientific problems, nowadays the applications developed by third-party companies deal with a wide variety of problems: identifying water leaks in pipes, helping renewable energy companies select the most convenient spots for new installations, measuring the thermal behaviour of buildings in large cities. The aim of Destination Earth is to foster this trend even further.

One of the roadblocks to wider adoption of Earth-observation data, according to Hanowski, is the rather technical skillset and expertise still required to work with the data sets. The core platform of Destination Earth will therefore offer a more intuitive interface, using graphics and visualisations to allow users to interact with the data and test various scenarios of future developments.

“We want to move away from the rather IT-centric technology of today in which customers frequently have to download and analyse many different types of data,” said Hanowski. “For example, today, if you are interested in vegetation predictions, you need to access multiple data sets, bring together many types of measurements. It’s an exercise that requires a very deep understanding of the problem.”

No more computer code and algorithms. Future users will be able to interact with the data as they would with a computer game or a military simulation, changing default parameters to simulate various scenarios and possible outcomes.

“In our case, you could visually simulate the environmental impact of various policies,” said Hanowski. “For example, how many fossil-burning power plants will be operating in a certain timeframe if a certain policy is implemented? How will that affect our CO2 output? How could traffic management and the level of car electrification contribute to the CO2 output? Subsequently, you could even have a look at the impact on sea levels.”

The European Commission’s goal is for the first two priority digital twins to be operational on cloud services by the end of 2023. These two will focus on predictions of extreme natural disasters and climate adaptation. The full replica of the entire Earth system could be available by 2027.

According to Doblas Reyes, the model will require more than just satellite data. To accurately simulate the behaviour of the ocean, the computers need to know the state of the ocean, all the way to the bottom, as accurately as possible; they need to understand how ice sheets and glaciers behave beneath the surface, and how much water is in the soil far deeper than the surface level measurable from space.

“To be accurate, these models need to know the state of the Earth in the present moment and in the past as accurately as possible,” he said. “The bigger the problem, the more initial data points you need.”

The first step, he added, will be increasing the resolution of existing weather models from the current standard of 5km to 1km. With such a resolution, the models will be able to forecast the behaviour of severe weather phenomena on a more regional level: What does a thunderstorm do within a single city? How will a windstorm rampage through the mountains?
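A rough back-of-the-envelope sketch shows why this jump in resolution is so computationally expensive. The figures below (the Earth’s approximate surface area, the textbook rule that a finer grid also forces a shorter timestep) are standard approximations for illustration, not numbers from the Destination Earth project:

```python
# Back-of-the-envelope estimate of the extra computation a
# 5km -> 1km resolution jump demands. All figures are textbook
# approximations, not Destination Earth specifications.

EARTH_SURFACE_KM2 = 510e6  # approximate surface area of the Earth

def horizontal_cells(resolution_km: float) -> float:
    """Approximate number of horizontal grid cells covering the globe."""
    return EARTH_SURFACE_KM2 / resolution_km**2

coarse = horizontal_cells(5.0)  # ~20 million cells at 5 km
fine = horizontal_cells(1.0)    # ~510 million cells at 1 km

# Refining the grid 5x in each horizontal direction gives 25x the
# cells; numerical stability (CFL-type) constraints also force a
# roughly 5x shorter timestep, so total work grows by about 125x.
cell_factor = fine / coarse   # 25.0
work_factor = cell_factor * 5 # ~125

print(f"cells at 5 km: {coarse:.2e}")
print(f"cells at 1 km: {fine:.2e}")
print(f"estimated extra work: ~{work_factor:.0f}x")
```

In other words, even a fivefold improvement in grid spacing multiplies the computational bill by two orders of magnitude, before any new data sources are added to the mix.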

“The goal is to create a weather-forecasting model that will be much more realistic, much closer to the observations, and also hopefully better at predicting what would happen in the next few days, particularly for extreme events,” said Doblas Reyes.

Working at greater resolution and including more types of data in the initial mix will require more calculations, and that, in turn, will require more powerful supercomputers. Such machines might not even exist today. But Europe is currently in the middle of a major investment drive to beef up the continent’s supercomputing facilities: €8bn (£7bn) has been pledged to develop and deploy new exascale computers (machines capable of at least 10¹⁸ operations per second), quantum computers, and data infrastructure.
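To put the exascale figure in perspective, here is a small illustrative calculation. The workload of 10²¹ operations is an assumed round number chosen for the example, not a measured requirement of any of these models:

```python
# What 'exascale' means in practice: sustaining at least 1e18
# operations per second. The workload below is an assumed round
# number for illustration, not a real model's measured cost.

EXAFLOP = 1e18   # ops/second for an exascale machine
PETAFLOP = 1e15  # ops/second for a typical large 2010s supercomputer

workload_ops = 1e21  # hypothetical cost of one simulation run

exa_seconds = workload_ops / EXAFLOP    # 1,000 s, about 17 minutes
peta_seconds = workload_ops / PETAFLOP  # 1,000,000 s, about 11.6 days

print(f"exascale machine:  {exa_seconds / 60:.0f} minutes")
print(f"petascale machine: {peta_seconds / 86400:.1f} days")
```

The same job that ties up a petascale machine for nearly two weeks finishes in minutes at exascale, which is what makes routinely re-running planet-wide simulations under different scenarios plausible at all.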

In October 2020, two new powerful supercomputers, the €144.5m (£125m) Lumi and the €120m (£104m) Leonardo, were unveiled in Finland and Italy respectively. Together with other supercomputers under development as part of the European High-Performance Computing (EuroHPC) Joint Undertaking, Lumi and Leonardo will help crunch data for Destination Earth.

According to Doblas Reyes, the major challenge will be for the European supercomputing facilities to find enough skilled engineers capable of working with such advanced technology. “These machines are very complex, they have many different elements that require expertise to make the best use of,” he said. “We need software engineers who actually have this experience, but these people usually work for the private sector, for companies such as Google and IBM, [or] for the video-game industry. It will be very difficult for the public sector to compete with those big players.”

Hanowski agrees that while the weather-forecasting and climate-modelling domains have experience with supercomputers, this does not apply to the entire field of Earth observation. “We are not really yet accustomed to systematically using high-performance computing,” he said. “We are not yet accustomed to applying AI algorithms on a robust operational scale. There is a lot of experimental and technological work done in this domain, but really employing these means of enabling predictive information generation is new and requires the bringing together of many different technological communities.”

The ultimate coming together of experts from fields such as Earth observation, IT, and AI will determine whether Destination Earth succeeds. The European Commission sees the project as key to helping the continent become climate-neutral by 2050. And while no environmental super-model may be able to predict the next pandemic outbreak, it might help deal with the ensuing difficulties, mitigate future food crises, and plan more smartly for whatever lies ahead.


UK, Copernicus and Destination Earth

The UK is currently not participating in the Destination Earth project, according to a European Commission spokesperson. The conditions under which users outside the EU will be able to access and use the digital replica of the Earth, once it is developed, remain to be defined, he added.

As part of its Brexit withdrawal negotiations, the UK agreed to continue its participation in the Copernicus Earth-observation programme, which provides a big portion of the data for the future digital twin Earth. The details of its participation for the 2021 to 2027 period, including an annual third-country fee, are yet to be negotiated.

The UK remains part of the European Space Agency, which co-develops Copernicus with the European Commission.

It also remains a member of the European Centre for Medium-Range Weather Forecasts and EUMETSAT, the European Organisation for the Exploitation of Meteorological Satellites, both of which will play key roles in the development of Destination Earth.

