As we at Verne Global prepare to attend the Meteorological Technology Expo in Amsterdam later this month, it occurred to me that as a society we have an extraordinary number of superstitious methods for predicting the weather, from cows lying down meaning that it’s going to rain (though there’s actually no truth in that) to “red sky at night, shepherd’s delight” (which might actually have a little truth in it).
The fact that there are so many of these emphasises, first, how important the weather is to our daily lives, and second, that for a long time superstitions were all we had to go on.
Thanks to greatly improved means of measurement, and to analysis infrastructure in the form of high performance computing (HPC), weather forecasting has improved immeasurably in recent decades. Many of us can remember when the weather forecast was unreliable beyond the next 24 hours (the Great Storm of 1987 being a famous example), but that hasn’t been the case for years. Experts say that accuracy is increasing by around one day every decade, and today’s five-day forecasts are almost as accurate as two-day forecasts were three decades ago.
In the most general terms, accurate forecasts rely on three things: the availability of data, reliable models and simulations of weather trends, and industrial-scale HPC to process it all – which is why we will be showcasing our innovative hpcDIRECT platform to the global meteorological community in Amsterdam.
Getting it right is vitally important. In rural parts of the world, the livelihood of farmers can be shaped by significant weather events. Entire industries, for example the travel industry, can have their fortunes altered in ways that have a noticeable effect on the bottom line. More warning of significant weather changes allows people to make better plans.
On top of that, there are the public health and safety benefits of being able to predict extreme weather events. Even a few extra hours of warning of a hurricane can be a matter of life and death. And finally, as climate change alters long-standing weather patterns, our ability to predict medium-to-long-term changes will become more significant.
More than 11,000 weather stations worldwide measure a range of factors every hour, from temperature to air pressure, and relay this to state-run and private forecasters, where it is combined with data from ships, weather balloons, satellites and aircraft and then run through computer models.
These models require vast amounts of data so that they can quantify uncertainty. The models will inevitably contain errors, but so will the observations, so it is not possible to pin down a single “state zero” from which the forecast starts. Instead, most weather models estimate a range of plausible starting points and forecast from each of them. The output is a corresponding range of predictions, and the goal is to make that range as narrow as possible while still containing the outcome that actually occurs. The modern weather forecast, then, is not the result of one simulation, but of many.
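The ensemble idea can be sketched in a few lines of code. Below is a minimal, illustrative Python sketch (not any forecaster’s actual system): it uses the chaotic Lorenz-63 equations as a toy stand-in for an atmospheric model, perturbs the observed starting state to build an ensemble of starting points, and reports the mean and spread of the resulting forecasts. All names here are illustrative.

```python
import random

# Toy Lorenz-63 system, advanced with simple Euler integration.
# An illustrative stand-in for a real atmospheric model.
def lorenz_step(state, dt=0.01, sigma=10.0, rho=28.0, beta=8.0 / 3.0):
    x, y, z = state
    return (x + dt * sigma * (y - x),
            y + dt * (x * (rho - z) - y),
            z + dt * (x * y - beta * z))

def run_model(initial_state, steps=1000):
    state = initial_state
    for _ in range(steps):
        state = lorenz_step(state)
    return state

def ensemble_forecast(observation, n_members=20, obs_error=0.1, seed=1):
    """Run the model from many slightly perturbed starting points and
    return the ensemble mean and spread of the first variable."""
    rng = random.Random(seed)
    outcomes = []
    for _ in range(n_members):
        # Each member starts from the observation plus a small random
        # perturbation representing observation error.
        perturbed = tuple(v + rng.gauss(0.0, obs_error) for v in observation)
        outcomes.append(run_model(perturbed)[0])
    mean = sum(outcomes) / len(outcomes)
    spread = max(outcomes) - min(outcomes)
    return mean, spread
```

Because the toy system is chaotic, tiny differences between the starting points grow as the run proceeds, so the ensemble spread is a rough measure of forecast confidence: the narrower the spread, the more trustworthy the prediction.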
But accuracy depends to a large extent on the size of the area to which the forecast applies. In the early 1980s, the ‘spatial resolution’ of weather forecasts covered around 200 square kilometres. Within that area, a lot of different things could be happening: if the forecast predicted rain, there might still be large parts of that region that stayed dry. Today, the spatial resolution of many forecasts is around nine square kilometres.
In a recent talk at the Platform for Advanced Scientific Computing Conference (PASC18), held in July in Basel, Dr Nils P Wedi of the European Centre for Medium-Range Weather Forecasts (ECMWF) said that ECMWF is aiming for a resolution of one square kilometre, but that it will take more than a decade to get there.
The increase in spatial resolution has followed the rise in computing power. At higher resolutions, more data is needed, and processing it all in a useful timeframe is a challenge. A forecast at nine square kilometres requires around 48 terabytes of data; at 1.25 square kilometres, around 1.8 petabytes will be needed.
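Those figures follow roughly from how grid cell counts scale: halving the grid spacing quadruples the number of horizontal cells. Here is a back-of-envelope sketch in Python, assuming (purely for illustration) that the 9 and 1.25 figures behave like grid spacings in kilometres and that data volume scales only with horizontal cell count; in practice, vertical levels, output fields and compression also matter, which is why the rough estimate differs from the quoted 1.8 petabytes.

```python
def scaled_volume_tb(base_volume_tb, base_spacing_km, target_spacing_km):
    """Scale a data volume by the change in horizontal grid cell count.
    Halving the grid spacing quadruples the number of cells."""
    cell_count_factor = (base_spacing_km / target_spacing_km) ** 2
    return base_volume_tb * cell_count_factor

# Illustrative only: 48 TB at a 9 km spacing, refined to 1.25 km.
estimate = scaled_volume_tb(48, 9, 1.25)  # roughly 2,500 TB, i.e. a few PB
```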
At the moment, said Dr Wedi, ECMWF is archiving around one petabyte of data every week. He referred to a 2018 study from the Federal Institute of Meteorology and Climatology in Zurich that ran a kilometre-scale Earth system simulation using 4,888 GPUs.
More and more weather companies are partnering with specialist HPC providers like Verne Global so that the compute they need is available when they need it. The volume of compute required to achieve accurate forecasting is ever increasing, and HPC has become the engine room of weather forecasting. In 1959, the Met Office’s computer, Meteor, was capable of doing 30,000 calculations a second. The Cray supercomputer the Met Office operates today is capable of over 14,000 trillion arithmetic operations per second – that’s more than 2 million calculations per second for every man, woman and child on the planet. Embracing HPC provider services enables these crucial organisations to further their efforts and improve their results.
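That per-person figure is easy to sanity-check. A quick sketch, assuming a world population of roughly 7 billion (the round figure such comparisons typically use):

```python
ops_per_second = 14_000 * 10**12   # 14,000 trillion operations per second
world_population = 7 * 10**9       # assumption: roughly 7 billion people
ops_per_person = ops_per_second // world_population
print(ops_per_person)              # prints 2000000, i.e. 2 million each
```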
This won’t just lead to more accurate forecasts; it will also bring new services as data starts to get connected. Your sat-nav might route you away from a particular bridge in high winds or let you know that a road is flooded, for example, while theme parks might give customers real-time information about weather trends so that they can choose which rides to go on, and when.
It might even be possible to determine once and for all whether cows really do know when rain is coming.
If you’re in Amsterdam for the Meteorological Technology Expo please do come along to our stand (5023) to hear how hpcDIRECT can support your compute demands. I look forward to meeting you.