
Supercomputers Will Improve Hurricane Forecasts


Row of refrigerator-sized supercomputers for NOAA



Predicting the path and power of a hurricane, and understanding how one strengthens, is a challenge for forecasters. Two new supercomputers will help them more accurately predict this year’s storms, which are being made stronger and more erratic by global warming.






 

In October 2018, Hurricane Michael struck Florida’s Gulf Coast as a Category 5 storm with wind speeds in excess of 150 mph. Meteorologists and storm trackers had predicted it would max out as a Category 2-3 storm, with winds ranging from 96 to 129 mph.


Why were they so far off the mark? Because the data needed to accurately assess a hurricane’s path and intensity exceeded what the available computers could process.


Model Prediction Improvements Over The Past Three Decades


Hurricane analysis requires a tremendous amount of computing power. As computers have grown in calculating abilities, so has the accuracy of hurricane modeling.


Projections of hurricane paths have grown steadily more accurate over the past 30 years as large-scale weather models, and the computers running them, have improved. Average errors in hurricane path predictions dropped from about 100 miles in 2005 to about 65 miles in 2020. The difference might seem small when storms can be hundreds of miles wide, but when it comes to predicting where the worst effects of a hurricane will hit, “every little wiggle matters,” says Michael Brennan, head of the Hurricane Specialist Unit at the National Hurricane Center.


Since the strength of a hurricane is driven by local factors like wind speed and temperature at the center of the storm, understanding and predicting hurricane intensity has been more challenging than predicting hurricane paths. Still, intensity predictions have also begun to improve over the past decade, with 48-hour intensity forecast errors decreasing by 20% to 30% between 2005 and 2020 (1).


Supercomputer Impacts on Modeling


The Coastal Storm Modeling System (CoSMoS) provides emergency responders and coastal planners with critical storm-hazard information such as flood extent, flood depth, duration of flooding, wave height, and currents that can be used to increase public safety, mitigate physical damages, and more effectively manage complex coastal settings (2).


Weather modeling requires huge amounts of computing power due to the complicated physics used to analyze changing atmospheric conditions.


Weather models work by splitting the globe into a grid of cells and calculating what will happen in each one. A higher-resolution model divides the globe into smaller cells, which means there are many more of them to compute. Greater computing power is what makes that geographical precision possible.
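To get a rough feel for why resolution is so expensive, here is a minimal back-of-the-envelope sketch in Python. The numbers (Earth's surface area, 50 vertical levels, the example grid spacings) are generic illustrations, not parameters of any particular NOAA model:

```python
# Rough illustration: how grid resolution drives computing cost.
# Figures are generic, not tied to any specific forecast model.

EARTH_SURFACE_KM2 = 510_000_000  # approximate surface area of Earth

def grid_cells(spacing_km, vertical_levels=50):
    """Approximate number of 3-D grid cells for a given horizontal spacing."""
    horizontal_cells = EARTH_SURFACE_KM2 / (spacing_km ** 2)
    return horizontal_cells * vertical_levels

for spacing in (25, 13, 3):  # coarser to finer horizontal spacing, in km
    print(f"{spacing:>2} km spacing -> ~{grid_cells(spacing):,.0f} cells")

# Halving the horizontal spacing roughly quadruples the cell count, and
# finer grids also need shorter time steps, so cost grows even faster.
```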


With a more robust computer system, researchers can also build an ensemble, running the model as many as 20 or 30 times. Each run is slightly altered to reflect different starting conditions in order to see how the predictions differ. The results are then tallied up and considered together to give forecasters a better understanding of the conditions affecting each storm.
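The ensemble idea can be sketched in a few lines of Python. The toy "model" below is just a stand-in function, not real storm physics; the point is the pattern of perturbing the starting conditions, running many members, and summarizing the spread:

```python
import random
import statistics

def toy_forecast(initial_wind_mph):
    """Stand-in for a weather model: not real physics, just a placeholder."""
    # Pretend the storm intensifies by a noisy 10-20% over the forecast window.
    return initial_wind_mph * random.uniform(1.10, 1.20)

observed_wind = 100.0   # best estimate of current wind speed (mph)
ensemble_size = 30      # run the toy model 30 times

runs = []
for _ in range(ensemble_size):
    # Each member starts from slightly different conditions, reflecting
    # uncertainty in the observations feeding the model.
    perturbed_wind = observed_wind + random.gauss(0, 5)
    runs.append(toy_forecast(perturbed_wind))

print(f"Ensemble mean: {statistics.mean(runs):.0f} mph")
print(f"Spread (min-max): {min(runs):.0f}-{max(runs):.0f} mph")
```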


Here Come The Crays


In early 2020, NOAA announced it would triple its operational supercomputing capacity for weather and climate with the acquisition of two Cray supercomputers. The new Crays will replace four existing systems named Luna, Surge, Mars, and Venus, and will be based in Manassas, Virginia, and Phoenix, Arizona (backing each other up). These comparatively compact supercomputers (each estimated at the size of 10 refrigerators) will each be capable of 12.1 petaflops – a significant increase over the previous systems – and will be operational by the start of the 2022 hurricane season.


Peta-What?

Chart showing the improvement in supercomputing capacity in petaflops from 2009 to 2020

A petaflop is a measure of processing speed: how many floating-point arithmetic calculations, or "flops," a supercomputer can do each second (floating-point numbers are numbers with decimals in them). One petaflop is one thousand trillion floating-point operations per second. For some perspective, an iPhone manages roughly 600 gigaflops, and there are 1 million gigaflops in a petaflop. Another way to look at it: to match what a 1-petaflop system can do in just one second, you'd have to perform one calculation every second for 31,688,765 years.
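The arithmetic behind those comparisons is simple enough to check yourself; this short Python snippet reproduces the conversion to gigaflops and the "millions of years" figure (the 600-gigaflop iPhone number is the article's own ballpark):

```python
PETAFLOP = 1e15   # floating-point operations per second
GIGAFLOP = 1e9

# One petaflop expressed in gigaflops
print(f"{PETAFLOP / GIGAFLOP:,.0f} gigaflops")        # 1,000,000 gigaflops

# Years needed to perform 1e15 calculations at one per second
seconds_per_year = 365.2425 * 24 * 3600
print(f"{PETAFLOP / seconds_per_year:,.0f} years")     # about 31.7 million years
```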


Local Meteorologists Are Thrilled


The improved forecast accuracy these supercomputers deliver will make the lives of local meteorologists much easier. Longer warning times and better preparation advisories will save lives and property.


Many meteorologists caution, however, that even the most accurate forecasting won’t prevent loss of life if the public isn’t paying attention. Many of us have to Google the difference between a storm watch (hazardous conditions are possible) and a storm warning (hazardous conditions are expected or already occurring) nearly every time a storm is forecast. Just think back to the times regular programming was interrupted to announce tornado watches and tornado warnings – in those moments, was the distinction clear in our minds? If we don’t know what is occurring, we can’t know whether to head for the basement or tie down the lawn furniture.


Climate change means an ever-increasing risk of extreme weather. Forecasters hope that clearly communicated, more accurate predictions will help us make the right decisions when storms pass through.




 

(2) USGS.gov

Sources: NOAA, YCombinator, University Information Technology Services, USGS, hpcwire
