Computer models that simulate the response of the global climate system to rising concentrations of carbon dioxide and other greenhouse gases predicted warming along the broad trend lines now observed over recent decades, according to climate scientists.
Global temperatures have ticked up along with a rise in the intensity and frequency of floods, droughts and fires. Insurance companies, a financial barometer of what’s happening in the real world, bear this out: They’re exiting markets increasingly vulnerable to climate-driven weather extremes.
“We’re in this situation where the climate is in fact changing and changing quite rapidly and it’s clear that we all have to live with the new climate in some fashion and need to adapt to it,” Tapio Schneider, a climate scientist and professor of environmental science and engineering at the California Institute of Technology in Pasadena, told me.
Schneider is leading one of several efforts that are harnessing advances in computer technology and artificial intelligence to build climate models that provide useful information for people adapting to climate change at the scale of their homes, neighborhoods and businesses.
For example, the storm drains in the foothills near Schneider’s home in California were built more than 50 years ago to handle the runoff from even the heaviest rainfall events expected then. But they’re not right for the extremes of today or those that are expected to come. The storm drains need to be rebuilt to protect Schneider’s home and neighborhood from floods.
“You need to know what kind of rainfall extremes to expect. And for that, climate models aren’t quite good enough,” he said. “I mean, yes, extreme rainfall will get more intense. We know that. But what, exactly, does it mean for stormwater management in LA, or more to the point, more vulnerable areas on the US Gulf Coast for hurricanes and storm surges?”
High resolution in demand
Current climate models divide the globe into three-dimensional grid cells and then calculate equations that represent the processes and interactions driving the Earth’s climate within each cell, as well as how the cells interact with one another. The accuracy of any given model depends in part on the size of each cell. Most models today use cells that measure roughly 100 kilometers on a side in latitude and longitude.
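The grid-cell structure described above can be illustrated with a toy sketch. This is an illustration of the data layout only, not a real climate model: the 1-degree cell size, the simple diffusion update and all the numbers are assumptions chosen for demonstration.

```python
import numpy as np

# Toy illustration: divide the globe into grid cells and update each
# cell from its neighbors at every time step. A ~100 km cell is roughly
# 1 degree at the equator, giving a 180 x 360 grid.
N_LAT, N_LON = 180, 360

def step(temp, diffusivity=0.1):
    """One explicit time step: each cell relaxes toward its neighbors.

    North-south neighbors use clamped edges at the poles; east-west
    neighbors wrap around, since longitude is periodic.
    """
    north = np.vstack([temp[:1], temp[:-1]])
    south = np.vstack([temp[1:], temp[-1:]])
    east = np.roll(temp, -1, axis=1)
    west = np.roll(temp, 1, axis=1)
    laplacian = north + south + east + west - 4 * temp
    return temp + diffusivity * laplacian

# Warm equator, cold poles as an initial state (degrees Celsius).
lats = np.linspace(-89.5, 89.5, N_LAT)
temp = np.repeat((30 * np.cos(np.radians(lats)) - 10)[:, None], N_LON, axis=1)
temp = step(temp)
print(temp.shape)  # (180, 360)
```

Real models solve far richer equations (fluid dynamics, radiation, moisture) in each cell, but the computational pattern is the same: the smaller the cells, the more of them there are, and the more expensive each simulated day becomes.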
As today’s climate changes, financial institutions, insurance companies and other businesses are seeking information about climate impacts at the scale of a specific home or warehouse or piece of infrastructure such as a road or pipeline.
Demand for this information has “leap-frogged the current capabilities of climate science and climate models by at least a decade,” Andy Pitman, a climate scientist at the University of New South Wales in Sydney, Australia, and colleagues write in a 2021 paper in Nature Climate Change that summarizes the demand for reliable climate information and the potential and limitations of what current climate models can provide.
Nevertheless, the demand has spurred the emergence of companies that provide climate services such as hazard risk assessments of local climate impacts. Many make the best of today’s 100-kilometer resolution models combined with a variety of geospatial and other data to assess an entity’s risk profile. Concerns are rising within the academic community that some of these companies are misusing information from climate models, for example by overstating its accuracy.
“Some are ethical and are highlighting the uncertainties, but there are unquestionably some bad actors out there,” Pitman told me in an email exchange.
The climate modeling community is scrambling to play catch up to the demand, noted Schneider, who leads the Climate Modeling Alliance, a coalition of scientists, engineers and applied mathematicians from Caltech, MIT and NASA’s Jet Propulsion Laboratory who are building a next-generation climate model from the ground up.
The coalition, known as CliMA, envisions a climate model with a resolution in the tens of kilometers, which is within the capabilities of today’s computing infrastructure, and calibrated with Earth observations and higher-resolution regional simulations using AI tools. They outlined the approach in the September issue of Nature Climate Change.
Global resolution in the tens of kilometers is a significant improvement over the current standard of 100 kilometers. Recent advances in computing resources allow a model at this resolution to run hundreds of times, generating the large ensembles that are necessary to quantify uncertainties and “explore scenarios of what might happen,” Schneider said.
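The ensemble idea is simple in outline: run the model many times with perturbed inputs, then read uncertainty off the spread of outcomes. A minimal sketch follows; the `toy_model` function and its parameter distribution are stand-ins invented for illustration, not anything from CliMA.

```python
import numpy as np

# Hedged sketch of ensemble-based uncertainty quantification.
# The "model" here is a trivial stand-in, not an actual climate model.
rng = np.random.default_rng(0)

def toy_model(climate_sensitivity, forcing=3.7):
    """Stand-in: equilibrium warming for a given sensitivity parameter."""
    return climate_sensitivity * forcing / 3.7

# Sample the uncertain input parameter hundreds of times...
sensitivities = rng.normal(loc=3.0, scale=0.7, size=500)
ensemble = np.array([toy_model(s) for s in sensitivities])

# ...and quantify uncertainty from the ensemble spread.
low, median, high = np.percentile(ensemble, [5, 50, 95])
print(f"warming: {median:.1f} C (5-95% range: {low:.1f}-{high:.1f} C)")
```

The expensive part in practice is that each ensemble member is a full model run, which is why the feasible resolution and the feasible ensemble size trade off against each other.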
Yet even at 10-kilometer resolution, some of the biggest challenges to the climate modeling community remain unsolved, he added. For example, Schneider has a long-standing interest in modeling the microphysical processes that control the formation of clouds.
“We would like to know how ice crystals form, how droplets form, in detail, and how they interact with turbulence at scales of millimeters,” he said.
The challenge, he added, is that “the only thing we see is the end effect of these processes. And yet, these small-scale processes are what we need to model. So, the hard part is using the data we have, which are only indirectly informative about the small-scale processes, to learn about these small-scale processes. You can use AI tools to do this, you just need to use them in different ways from how people usually use them.”
For example, the CliMA team is exploring a set of techniques called ensemble Kalman methods, which are widely used in weather forecasting to estimate the state of the atmosphere at the beginning of a forecast. The CliMA team is using variants of the methods to solve inverse problems, which require working backward from observed or measured data to infer the underlying properties of a system – in essence, to figure out what’s happening behind the scenes based on what can be observed.
In their case, they aim to use Earth observation data such as global cloud cover data from NASA’s CloudSat and CALIPSO satellites as the AI model input to calibrate and quantify uncertainties in their models of clouds and their underlying turbulence.
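The ensemble Kalman approach described above can be sketched on a toy inverse problem. The forward map `G`, the “true” parameters and the noise levels below are all assumptions for illustration, not CliMA’s actual setup: the point is only the mechanic of working backward from observations to parameters using an ensemble of forward-model runs, with no gradients required.

```python
import numpy as np

# Minimal sketch of ensemble Kalman inversion (EKI) on a linear toy
# problem: recover unknown parameters from noisy observations.
rng = np.random.default_rng(1)

def G(theta):
    """Toy forward model: observations depend linearly on parameters."""
    A = np.array([[1.0, 2.0], [3.0, 1.0], [0.5, 0.5]])
    return A @ theta

theta_true = np.array([2.0, -1.0])
noise_cov = 0.01 * np.eye(3)
y = G(theta_true) + rng.multivariate_normal(np.zeros(3), noise_cov)

# Start from a broad prior ensemble of candidate parameter vectors.
ensemble = rng.normal(0.0, 2.0, size=(100, 2))

for _ in range(20):  # EKI iterations
    preds = np.array([G(th) for th in ensemble])
    th_mean, g_mean = ensemble.mean(0), preds.mean(0)
    dth, dg = ensemble - th_mean, preds - g_mean
    cov_tg = dth.T @ dg / (len(ensemble) - 1)   # parameter-output covariance
    cov_gg = dg.T @ dg / (len(ensemble) - 1)    # output-output covariance
    kalman_gain = cov_tg @ np.linalg.inv(cov_gg + noise_cov)
    # Nudge each member toward agreement with (perturbed) observations.
    perturbed = y + rng.multivariate_normal(np.zeros(3), noise_cov, len(ensemble))
    ensemble = ensemble + (perturbed - preds) @ kalman_gain.T

print(ensemble.mean(0))  # should approach theta_true
```

In the real application the forward model is a cloud or turbulence parameterization and the observations are satellite data, but the structure is the same: the ensemble itself supplies the statistics that the update step needs.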
The team is also using additional data generated computationally in high-resolution simulations of the cloud formation process, nesting these simulations within the larger climate model. The high-resolution simulations provide detailed local climate information about the modeled areas and inform modeling of these processes everywhere else.
Schneider and his colleagues are on track to release a working prototype of their model within a few years.
“Our take is we need to find ways to use the data we have for the Earth system – 50 terabytes of data every day from NASA alone – to inform models,” Schneider said.
The European Union’s Destination Earth initiative is also building a next-generation climate model that leverages advances in AI and cloud computing. Its aim is a digital twin of the planet with one-kilometer resolution, which can improve simulations of extreme events such as thunderstorms and the resulting flood damage.
The first phase of the program – configuring, deploying and demonstrating the initial infrastructure building blocks – will be completed by June 2024. The team aims to have a full digital replica of Earth ready by 2030.
Pitman at the University of New South Wales said both approaches will advance climate science, and both are needed in the same way that the astronomical community benefits from every advance in telescopes, be they radio, optical or gravitational-wave detectors.
“There is no right or wrong here,” he said. “We need both. We also need innovative use of machine learning that uses the large ensembles to create large ensembles of the very high-resolution models. So, lots to do.”
Top image: Flooding near CA 99 is shown in Merced on a day when recent rain resulted in a noticeably high water level in Merced County. Photo taken January 10, 2023. Photo courtesy of Andrew Innerarity / California Department of Water Resources