My father was a frustrated meteorologist. Oh, sure, like most people who farm and live in the country, he lived by the weather most of his life, and he could predict what the day would be like with pretty fair accuracy by employing one simple technique: going outside and looking up. He would look at the clouds and the wind direction, feel the air, and if he said, "It's going to rain later today," you could pretty much take that to the bank.
The frustrated meteorologist part, though, was thanks to the Army Air Forces during World War II. Dad was trained as a bomber navigator, and in those days, a navigator had to know a fair bit about the weather and how it worked. He had to be able to look at a weather system and decide whether the crew should fly over it, around it, under it, or turn back. One of Dad's few regrets in his long, long life was not studying the weather and its prediction more, because it was a subject he found genuinely interesting.
Nowadays, though, we're still only able to predict the weather a few days out, at least with any accuracy. But the National Oceanic and Atmospheric Administration (NOAA) is rolling out a suite of artificial intelligence (AI)-aided weather models, which are supposed to deliver forecasts with greater accuracy and longer lead times.
I'm skeptical. Watts Up With That's scribe Anthony Watts has more.
NOAA has launched a groundbreaking new suite of operational, artificial intelligence (AI)-driven global weather prediction models, marking a significant advancement in forecast speed, efficiency, and accuracy. The models will provide forecasters with faster delivery of more accurate guidance, while using a fraction of computational resources.
“NOAA’s strategic application of AI is a significant leap forward in American weather model innovation,” said Neil Jacobs, Ph.D., NOAA administrator. “These AI models reflect a new paradigm for NOAA in providing improved accuracy for large-scale weather and tropical tracks, and faster delivery of forecast products to meteorologists and the public at a lower cost through drastically reduced computational expenses.”
Is it, really, though? AI can only use what we give it. It can't really extrapolate; it can only take the data it already has and try to squeeze a few billion round pegs into a few billion square holes. Weather, like climate - they aren't the same thing - is chaotic, vast in scale, with billions of interacting elements and components. We can sort of predict the broad strokes, and we've been able to since the early 1940s, when the first long-range weather flights and, later, weather radar made it possible to see weather fronts forming several days before they arrive at a particular location.
Now we do it with satellites, and it's still amazing how often we get it wrong when trying to predict more than a few days out.
Here's what NOAA's new systems look like:
The new suite of AI weather models includes three distinct applications:
- AIGFS (Artificial Intelligence Global Forecast System): A weather forecast model that implements AI to deliver improved weather forecasts more quickly and efficiently (using up to 99.7% less computing resources) than its traditional counterpart.
- AIGEFS (Artificial Intelligence Global Ensemble Forecast System): An AI-based ensemble system that provides a range of probable forecast outcomes to meteorologists and decision-makers. Early results show improved performance over the traditional GEFS, extending forecast skill by an additional 18 to 24 hours.
- HGEFS (Hybrid-GEFS): A pioneering, hybrid “grand ensemble” that combines the new AI-based AIGEFS (above) with NOAA’s flagship ensemble model, the Global Ensemble Forecast System. Initial testing shows that this model, a first-of-its-kind approach for an operational weather center, consistently outperforms both the AI-only and physics-only ensemble systems.
First, AIGFS: How does notoriously energy-hungry AI use up to "99.7% less computing resources" while still powering through a weather forecast, crunching the same data that much faster, without a huge energy demand? I'm reminded of the caution I used to give potential clients in my corporate consulting days: "You can have speed, you can have quality, you can have economy. Pick two."
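Now, to be fair, there is a standard answer to that question, and it's worth spelling out: the heavy computing for an AI model happens once, up front, during training; the day-to-day forecast runs are cheap by comparison, and NOAA's figure presumably counts only those runs. Here's a toy sketch of that trade - my own illustration, nothing to do with NOAA's actual code - where a simple diffusion solver stands in for the physics model and a precomputed operator stands in for the trained AI:

```python
# A toy illustration (my sketch, not NOAA's systems) of why an AI
# emulator can claim huge savings at run time: the expensive work is
# done once, up front ("training"), and each forecast after that is
# cheap.
#
# The "physics model" here is an explicit finite-difference solver for
# 1-D diffusion, stepped 1,000 times. The "emulator" is the same linear
# update composed into a single matrix ahead of time, so a forecast is
# one matrix-vector product instead of 1,000 timesteps.
import time
import numpy as np

n_grid, n_steps, alpha = 500, 1000, 0.2

# One explicit diffusion step as a matrix: u_new = A @ u
A = np.eye(n_grid) * (1 - 2 * alpha)
A += np.diag(np.full(n_grid - 1, alpha), 1)
A += np.diag(np.full(n_grid - 1, alpha), -1)

u0 = np.exp(-np.linspace(-5, 5, n_grid) ** 2)  # initial "weather" blob

# Physics route: step the solver n_steps times, every single forecast.
t0 = time.perf_counter()
u = u0.copy()
for _ in range(n_steps):
    u = A @ u
physics_time = time.perf_counter() - t0

# "Training": compose the 1,000 steps into one operator, paid for once.
A_emulator = np.linalg.matrix_power(A, n_steps)

# Emulator route: one matrix-vector product per forecast.
t0 = time.perf_counter()
u_fast = A_emulator @ u0
emulator_time = time.perf_counter() - t0

print(f"physics: {physics_time:.4f}s  emulator: {emulator_time:.6f}s")
print("max difference:", np.abs(u - u_fast).max())
```

Per forecast, that's one matrix product instead of a thousand timesteps - a "99.9% less computing" headline of its own. The catch, and where my "pick two" rule still applies, is that the up-front bill for building the composed operator (read: training) never shows up in the per-forecast number.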
Second, AIGEFS: So, we get another 18 to 24 hours of forecast skill - at what cost?
Third, HGEFS: So it combines the AI-based AIGEFS with the older, physics-based GEFS - and what's the result? Outperforms, how?
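For what it's worth, the "grand ensemble" idea itself isn't black magic. An ensemble forecast is just the same model run dozens of times from slightly perturbed starting conditions; a grand ensemble pools the members of two different systems into one bigger set and takes its statistics from the combined pile. A back-of-the-envelope sketch, with made-up numbers that have nothing to do with NOAA's verification:

```python
# A toy sketch (mine, with hypothetical numbers) of what a "grand
# ensemble" is: you don't average the two systems' answers, you pool
# all their members into one bigger ensemble and take statistics from
# the combined set.
import numpy as np

rng = np.random.default_rng(42)
truth = 20.0  # the temperature that actually happens, in C

# Toy stand-ins: each system's members scatter around its own forecast,
# with its own bias. Real members come from perturbed model runs.
physics_members = truth + 1.5 + rng.normal(0, 2.0, size=31)  # GEFS-like
ai_members      = truth - 1.2 + rng.normal(0, 1.5, size=31)  # AIGEFS-like

grand_ensemble = np.concatenate([physics_members, ai_members])

# With the toy biases chosen here to offset, the pooled mean tends to
# land closer to the truth than either system alone.
for name, ens in [("physics", physics_members),
                  ("AI", ai_members),
                  ("grand", grand_ensemble)]:
    err = abs(ens.mean() - truth)
    print(f"{name:8s} mean error {err:.2f} C, spread {ens.std():.2f} C")
```

The intuition behind "outperforms" is that when the AI system and the physics system err in different directions, pooling the members washes some of that error out. How often that actually happens with the real AIGEFS and GEFS is precisely the claim NOAA needs to show its work on.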
Here's where I have a problem, aside from how much all this is costing. The weather is chaotic. It's unpredictable. Every weather system has billions of inputs and outputs, influences, and effects. Like biology, weather is complex beyond our understanding; any model we can come up with, AI-enhanced or not, is laughably crude by comparison.
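If "chaotic" sounds like hand-waving, it isn't; it's the founding observation of modern predictability theory, courtesy of meteorologist Edward Lorenz back in 1963. Here's the textbook demonstration - again my own sketch, a toy three-equation convection model, not anything of NOAA's - in which two runs start a millionth apart and part company entirely:

```python
# The "chaotic" point, made concrete: Edward Lorenz's famous three-
# variable system (a drastically stripped-down convection model), run
# twice from starting points that differ by one part in a million.
import numpy as np

def lorenz_step(state, dt=0.01, sigma=10.0, rho=28.0, beta=8.0 / 3.0):
    # One forward-Euler step of the Lorenz-63 equations.
    x, y, z = state
    return state + dt * np.array([sigma * (y - x),
                                  x * (rho - z) - y,
                                  x * y - beta * z])

a = np.array([1.0, 1.0, 1.0])
b = a + np.array([1e-6, 0.0, 0.0])  # nudge x by one part in a million

# Watch the two runs diverge: the separation grows roughly
# exponentially until it's as large as the system itself.
for step in range(0, 3001, 500):
    if step:
        for _ in range(500):
            a, b = lorenz_step(a), lorenz_step(b)
    print(f"t={step * 0.01:5.1f}  separation={np.linalg.norm(a - b):10.6f}")
```

That exponential parting of ways is why every extra day of lead time gets harder and harder to buy: small errors in your starting picture of the atmosphere roughly double every couple of days, no matter how clever the model chewing on them is.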
So, we may gain another 24-hour lead time in knowing what the weather is likely to be - but at what cost?
Read More: Photographs and Memories: Colder Weather
The Polar Vortex Is Back, and Alaska Would Like a Word
I guess that no matter how sophisticated things get, we're still only gaining on the margins. So, we get an extra day before the next thunderstorm, blizzard, or heat wave. What do we do with that knowledge? Maybe we'd be better off learning to relax a little. Take prudent precautions for your area; here, we have plenty of firewood and a full heating oil tank. Wherever you live, being prepared for the kind of weather you're likely to see is always a good idea, and it doesn't require wheelbarrows full of taxpayer money. The weather will be what the weather will be, no matter how much money we spend trying to look an additional 24 hours ahead; take an even strain.
Personally, I'm sticking with Dad's method: Go outside, look up. Even here, amid Alaska's notoriously fickle weather, I'm getting pretty good at it.