Red sky at morning, sailor take warning.
For centuries, people have relied on weather forecasts to protect their lives and livelihoods.
Fortunately, modern-day techniques are much more sophisticated than gazing up at the sky. Today’s forecasters harness supercomputers that process vast amounts of data using mathematical models. These gigantic machines can perform more than 10,000 trillion calculations per second as they assimilate hundreds of billions of weather observations from around the globe every day.
Yet it is still difficult to predict meteorological conditions in advance. The atmosphere is a dynamic, chaotic system with many unknowns. To make matters worse, global warming is increasing the frequency of severe weather events such as heatwaves, floods and storms.
The need for fast and accurate weather forecasts has never been greater: weather hazards cause some USD200 billion in economic damage a year, according to Swiss Re estimates based on four weather perils – floods, tropical cyclones, winter storms in Europe and severe thunderstorms (https://www.swissre.com/press-release/Economic-losses-set-to-increase-due-to-climate-change-with-US-and-Philippines-the-hardest-hit-Swiss-Re-Institute-finds/3051a9b0-e379-4bcb-990f-3cc8236d55a1).
Encouragingly, the advent of machine learning (ML) and big data modelling is allowing scientists to predict changes in the atmosphere with greater precision and over a longer time horizon. At the same time, technological advances are transforming weather forecasting from an underfunded public service into a profitable commercial enterprise. A new industry is taking shape.
Perhaps the most important recent technological breakthrough came last year from Chinese tech company Huawei in the form of a pre-trained ML model. Once trained, Pangu-Weather, a 3D high-resolution model, produces faster and more accurate weather predictions than conventional methods using just one server.
Hot on Huawei's heels came Google, whose DeepMind unit in 2023 unveiled its AI-based weather forecasting system GraphCast, which runs on a desktop computer. And in March, chip producer Nvidia launched its own digital twin platform, Earth-2, to simulate weather and climate conditions.
“Data-driven models... can compete with physics-based models in terms of extremes. The scientific community is catching up very quickly and machine learning can help us jump and push further towards faster and higher resolution modelling," says Dr Nicolas Gruber, Professor of Environmental Physics at ETH Zurich and an expert on climate modelling.
“Currently, most innovation in this field is driven by private companies. All the big tech companies have made major investments in the area. At the same time, it is important to note that the private sector relies on the vast amount of open data that the scientific community and public entities, such as the weather services, provide for free. Without those data, the training of these data-based models would not have been possible."
Black box and black swans
Modern weather forecasting depends on physics-based numerical weather prediction (NWP), which uses the latest weather observations – such as temperature, rainfall, pressure, wind and humidity – alongside a mathematical computer model of the atmosphere to produce a forecast.
Today, Prof Gruber says, advances in high-performance computing have enabled higher-resolution NWP forecasts, with a grid length of around 1.5km over a period of 7-10 days, double the range of 20 years ago.
While this is a breakthrough, the resolution is still not sharp enough for authorities to plan effectively for disaster prevention and evacuation ahead of extreme events such as hurricanes, or for businesses to avoid losses from floods or droughts.
“The challenge lies in being more precise in identifying storm paths and predicting the areas that need to be evacuated,” says Prof Gruber.
In fact, even finer resolution, in the order of 1 metre, is required to accurately capture how clouds form and develop, which scientists say remains the greatest source of uncertainty in projections. This doesn’t come cheap with conventional NWP methods: 1 metre resolution models would need 100 billion times more computational resources than today’s most advanced 1km models (https://dl.acm.org/doi/pdf/10.1145/3592979.3593412).
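To see why the cost escalates so steeply, a common back-of-envelope scaling argument helps: refining a 3D grid by a factor r multiplies the number of cells by r³, and stability constraints typically force the time step to shrink in proportion, giving a total cost growth of roughly r⁴. The sketch below applies this rule of thumb to a 1km-to-1m refinement; it is an illustration under those simplified assumptions, not the methodology of the cited study.

```python
# Back-of-envelope scaling of NWP compute cost with grid resolution.
# Assumption (illustrative, not from the article): cost grows as r**3
# for the 3D grid plus a factor of r for the shorter time step
# (CFL-type constraint), i.e. roughly r**4 for a refinement factor r.

def relative_cost(coarse_km: float, fine_km: float) -> float:
    """Relative compute cost of refining grid spacing from coarse to fine."""
    r = coarse_km / fine_km      # refinement factor per spatial dimension
    return r**3 * r              # 3 spatial dimensions + shorter time step

print(f"~{relative_cost(1.0, 0.001):.0e}x more compute")  # ~1e+12x
```

Under these assumptions the blow-up is around 10¹²; the 10¹¹ figure cited above reflects the study's more detailed algorithmic assumptions, but the order of magnitude tells the same story.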
"Machine learning becomes problematic if something unexpected and not part of your training happens. With global warming, we’re going into an unknown territory – we don’t know what the 2°C or 4°C warmer world will look like."
State-of-the-art AI and ML models promise to address these challenges. For example, Nvidia’s Earth-2 generates images at 12 times the resolution of its NWP counterpart, at 1,000 times the speed and with 3,000 times the energy efficiency (https://nvidianews.nvidia.com/news/nvidia-announces-earth-climate-digital-twin).
However, Prof Gruber is less confident about the ability of ML models to cope with unprecedentedly volatile and intense weather that will result from climate change.
Typically, machine learning relies on statistical regression models trained on hundreds of millions of historical meteorological data points, covering parameters such as barometric pressure or sky conditions.
But severe weather events, such as the deadly floods in Dubai in 2024 or the record-breaking heatwave in Delhi the same year, usually fall outside the scope of such models.
“Machine learning models are good at reproducing parameters and data space that you already have. But you only get what they’re trained for. It becomes problematic if something unexpected and not part of your training happens,” Prof Gruber says.
“With global warming, we’re going into an unknown territory – we don’t know what the 2°C or 4°C warmer world will look like.”
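Prof Gruber's point can be illustrated with a toy example (hypothetical, and no relation to the models discussed above): a regression model fitted to historical observations reproduces the range it has seen but can fail badly when queried outside it.

```python
import numpy as np

# Toy illustration of the extrapolation problem, not a real forecast model.
# Fit a polynomial to noisy "historical" data, then query beyond that range.
rng = np.random.default_rng(0)
x_hist = np.linspace(0.0, 1.0, 200)          # normalised historical regime
y_hist = np.sin(2 * np.pi * x_hist) + 0.1 * rng.standard_normal(200)

coeffs = np.polyfit(x_hist, y_hist, deg=7)   # fits the seen range well

for x in (0.5, 1.2, 1.5):                    # in-sample vs. unseen regime
    truth = np.sin(2 * np.pi * x)
    pred = np.polyval(coeffs, x)
    print(f"x={x:.1f}  truth={truth:+.2f}  model={pred:+.2f}")
# Inside the training range the fit is close; beyond it the polynomial
# diverges - the statistical analogue of an unprecedented weather event.
```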
Power and resource hungry
ML models require extensive training on large data centre infrastructure equipped with supercomputers. Prof Gruber also says power consumption from these set-ups is becoming a constraint on the future development of weather and climate modelling systems.
For example, a model Prof Gruber uses runs on a supercomputer that requires 1 megawatt (1MW) of electricity. If this electricity comes from an average European grid, it results in emissions of more than 4,000 tonnes of CO2 per year, equivalent to over 25 million miles driven by an average gasoline car. Some of the world’s biggest supercomputers consume as much as 20MW.
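The arithmetic behind those figures is easy to reproduce. The grid emission factor used below (~0.46 tonnes of CO2 per MWh) is inferred from the numbers quoted above rather than stated in the article:

```python
# Reproduce the supercomputer emissions arithmetic quoted above.
power_mw = 1.0                   # continuous draw of the supercomputer
hours_per_year = 24 * 365        # 8,760 hours
energy_mwh = power_mw * hours_per_year

grid_factor = 0.46               # t CO2 per MWh, inferred from the article
emissions_t = energy_mwh * grid_factor
print(f"{energy_mwh:,.0f} MWh/year -> ~{emissions_t:,.0f} t CO2/year")
# 8,760 MWh/year -> ~4,030 t CO2/year; a 20MW machine scales this by 20.
```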
This comes at a time when the data centres that support AI and other cutting-edge technologies are already straining local power grids. The International Energy Agency expects electricity consumed by data centres globally to more than double by 2026, to over 1,000 terawatt hours – roughly equivalent to Japan's annual consumption.
“Scaling ML models is challenging, as increased compute power does not always yield proportionate improvements in performance,” Prof Gruber says.
“The computational power required for sustaining AI’s rise is doubling roughly every 100 days, which is not sustainable. It’s imperative that we balance the progression of AI with the imperatives of sustainability.”
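A doubling every 100 days compounds quickly, and a one-line calculation makes the growth rate concrete:

```python
# Implied growth from "doubling roughly every 100 days".
doubling_days = 100
annual_growth = 2 ** (365 / doubling_days)
print(f"~{annual_growth:.0f}x per year")          # ~13x per year
print(f"~{annual_growth**2:.0f}x over two years") # ~158x over two years
```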
Investment insights
by Jennifer Boscardin-Chin, Senior Client Portfolio Manager, Thematic Equities, Pictet Asset Management
- The advent of machine learning and artificial intelligence brings huge benefits. Laying the foundation for this cutting-edge technology are data centres, which compete for energy, water and land with other businesses, industries and local residents. Their growing environmental footprint means demand will rise for solutions that optimise energy and resource usage.
- More frequent extreme weather events are spurring investment in climate adaptation, designed to boost urban and infrastructure resilience. The climate adaptation market – including green buildings and technologies, sustainable agricultural practices and enhanced disaster preparedness – is expected to hit USD2 trillion in the coming years (source: Bank of America). What is more, the return on investment is attractive: every dollar invested in adaptation strategies could generate returns of USD2-10 by 2030 (source: World Resources Institute).
- Be it energy-efficient technologies or climate adaptation strategies, companies that provide environmental solutions are likely to deliver a persistent return premium over the long term, often because the market underestimates the enduring drivers of long-term environmental growth while overemphasising short-term earnings estimates.