Upgrading the Hurricane Forecast

TACC's Ranger supercomputer helps researchers predict storm intensity with greater accuracy

Tropical Storm Irene makes landfall near New York City. [Courtesy of NOAA]

When Hurricane Irene swept through New England in August 2011, the National Hurricane Center (NHC) did an astounding job of predicting its path. However, Irene arrived significantly weaker than originally forecast, leading to a larger evacuation than would have occurred had NHC's intensity forecasts been closer to the mark. 
 
"The National Hurricane Center has been doing an excellent job over the past few decades of persistently increasing the hurricane forecast track accuracy," said Fuqing Zhang, professor of meteorology at the Pennsylvania State University. "But there have been virtually no improvements in the intensity forecast."

Mean absolute intensity forecast error, averaged over all 72 airborne Doppler missions from 2008 to 2011, for the NHC official forecast (black), the NOAA GFDL (green) and HWRF (blue) models, and the PSU WRF-EnKF forecast (red), after making the samples homogeneous across all four forecasts. The y-axis shows intensity forecast error in knots; the x-axis shows forecast lead time in hours.
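For reference, the error metric plotted in the figure, mean absolute forecast error, is simply the average of the absolute differences between the forecast intensity and the verified intensity over a set of forecast times. The sketch below uses made-up intensity values for illustration, not data from the figure:

```python
def mean_absolute_error(forecast_kt, observed_kt):
    """Mean absolute intensity error in knots over a set of verification times."""
    return sum(abs(f - o) for f, o in zip(forecast_kt, observed_kt)) / len(forecast_kt)

# Hypothetical maximum-wind forecasts vs. verified (best-track) values
# at successive forecast lead times, all in knots.
forecast = [65, 70, 80, 90, 95]
observed = [60, 75, 85, 80, 90]

mae = mean_absolute_error(forecast, observed)   # average miss, in knots
```

Plotting this quantity at each lead time, for each model, produces curves like those in the figure above.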

Predicting how a hurricane forms, intensifies, or dissipates is a different and more challenging problem than predicting its path.

Zhang, along with his Penn State colleague, Yonghui Weng, in collaboration with researchers at the National Oceanic and Atmospheric Administration (NOAA), has developed a pioneering hurricane modeling and forecasting system that improves on today's operational methods in several important ways.

First, the system adopts a state-of-the-art, higher-resolution computer model (4.5-kilometer grid spacing, compared with the 9-kilometer spacing of current operational models) capable of more explicitly resolving a hurricane's inner-core dynamics and structure, which are crucial to its intensity.

Second, it ingests airborne Doppler radar data collected by planes flying through the hurricane, which helps initialize the models. "NOAA has been collecting this data for 30 years, but it has yet to be put into operational models used by NHC forecasters," Zhang said.

Third, a new data assimilation method using an ensemble Kalman filter (Zhang's area of expertise) has improved how the system ingests Doppler radar data.

"We use statistical information about the background and merge it with the observations," he explained. "By combining the forecast and the observations in a new optimal way, we greatly improved the model initial fields over the existing methods used by the NOAA operational hurricane models."
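The idea Zhang describes can be illustrated with a minimal ensemble Kalman filter update. This is a generic textbook sketch of the perturbed-observation EnKF on a toy three-variable state, not the operational PSU WRF-EnKF code; all names and values are illustrative assumptions:

```python
import numpy as np

def enkf_update(ensemble, obs, obs_op, obs_err_var, rng):
    """Perturbed-observation ensemble Kalman filter update.

    ensemble    : (n_members, n_state) array of background (forecast) states
    obs         : (n_obs,) observation vector
    obs_op      : (n_obs, n_state) linear observation operator H
    obs_err_var : scalar observation error variance R
    """
    n_members = ensemble.shape[0]
    mean = ensemble.mean(axis=0)
    # Ensemble perturbations about the mean encode the background covariance P.
    X = (ensemble - mean).T / np.sqrt(n_members - 1)   # (n_state, n_members)
    HX = obs_op @ X                                    # (n_obs, n_members)
    # Kalman gain K = P H^T (H P H^T + R)^-1, with P = X X^T.
    S = HX @ HX.T + obs_err_var * np.eye(len(obs))
    K = X @ HX.T @ np.linalg.inv(S)
    # Nudge each member toward a perturbed copy of the observations.
    updated = np.empty_like(ensemble)
    for i, member in enumerate(ensemble):
        perturbed_obs = obs + rng.normal(0.0, np.sqrt(obs_err_var), len(obs))
        updated[i] = member + K @ (perturbed_obs - obs_op @ member)
    return updated

# Toy example: 40 members, a 3-variable state, one observed variable.
rng = np.random.default_rng(0)
truth = np.array([10.0, 5.0, -2.0])
ensemble = truth + rng.normal(0.0, 2.0, (40, 3))   # background spread of 2.0
obs_op = np.array([[1.0, 0.0, 0.0]])               # observe only variable 0
obs = np.array([truth[0]]) + rng.normal(0.0, 0.5)  # obs error s.d. of 0.5
analysis = enkf_update(ensemble, obs, obs_op, 0.25, rng)
```

The analysis ensemble is pulled toward the observation and its spread in the observed variable shrinks, which is the sense in which the forecast and the observations are combined "in a new optimal way." In the real system the state has millions of variables and the observations are airborne Doppler radar winds.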

Using the Ranger supercomputer at the Texas Advanced Computing Center (TACC), Zhang's system forecast the track and intensity of every major storm in the Atlantic throughout the 2011 hurricane season, and he shared the results of his simulations on his personal research website.

A group photo of Zhang's research team. (From left to right: Ben Green, Junhong Wei, Yonghui Weng, Ellie Meng, Xinghua Bao, Jianhua Sun, Fuqing Zhang, Jon Poterjoy, Thomas Hinson and Baoguo Xie)

As a result of these changes, Zhang's forecasts improved intensity predictions by an average of 20 to 40 percent over the NHC official forecasts for 2008-2011 storms for which airborne Doppler radar data were available.

First deployed on Ranger at TACC during Hurricane Ike in 2008, Zhang's prediction system is one of a handful being assessed by the NHC to become part of the operational forecasting system used in emergency situations. Forecasters at the NHC believe it offers several strong features that improve on current methods.

"It may be one of our best hopes for trying to find models that can depict these kinds of changing conditions," said James Franklin, chief of the Hurricane Specialist Unit at the NHC.

Frank Marks, director of NOAA's Hurricane Research Division, said Zhang has radically changed the landscape in hurricane intensity prediction.

"The results of his work and the subsequent implementation of his approach in the operational models have demonstrated the first potential breakthrough in intensity prediction since the advent of the Dvorak technique for estimating the storm's current intensity from satellite," he said. "Fuqing's tireless energy and drive has led to a major potential breakthrough."

Just as several generations of jets are in development while earlier versions fly, the National Hurricane Center tests experimental methods concurrently with operational systems. To do so, they rely on advanced computing systems like Ranger at the Texas Advanced Computing Center, supported by the National Science Foundation.

The ensemble forecasts that Zhang produces require the hurricane simulation to run dozens of times simultaneously, demanding thousands of computer processors working in parallel, a feat possible on only a few dozen supercomputing systems in the world.
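The structure of such an ensemble is embarrassingly parallel: each member is an independent simulation started from slightly different initial conditions. The toy sketch below shows that structure with a stand-in for the model; in the real system each member is a full WRF run launched as its own large parallel job, which is why thousands of processors are needed at once. All names and numbers here are illustrative:

```python
from concurrent.futures import ThreadPoolExecutor
import statistics

def run_member(member_id):
    """Stand-in for one ensemble member. In the real system each member is a
    full hurricane simulation from perturbed initial conditions; here we just
    return a toy peak intensity in knots."""
    base_intensity = 95.0                     # hypothetical storm, in knots
    perturbation = (member_id % 7 - 3) * 2.5  # deterministic toy spread
    return base_intensity + perturbation

# Dozens of members run concurrently; a thread pool is used here only to
# show the pattern. On a supercomputer each member is a separate MPI job.
with ThreadPoolExecutor() as pool:
    intensities = list(pool.map(run_member, range(60)))

mean_kt = statistics.mean(intensities)     # ensemble-mean intensity forecast
spread_kt = statistics.stdev(intensities)  # spread quantifies the uncertainty
```

The ensemble mean gives the forecast itself, while the spread among members provides exactly the kind of "error bars" on the prediction that Zhang describes wanting to communicate to the public.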

Zhang's website shows his group's real-time Atlantic Hurricane Forecast (created using Ranger) plotted alongside comparable model forecasts, satellite images of the storm, and other pertinent information.

"It would not be feasible to do any of this without the initial jumpstart by TACC and its computing resources," Zhang said.

Zhang's research is part of NOAA's Hurricane Forecast Improvement Project, a 10-year effort to make significant gains in extreme weather prediction. The National Science Foundation and the Office of Naval Research also support Zhang's work.

Hurricane Irene may have been less intense than originally predicted, but it brought drenching rains and massive flooding throughout Vermont and much of New England.

"We still have problems in predicting the genesis, rapid intensification or weakening, and the eye-wall replacement cycle," Zhang said. "We still have difficulties in quantifying the uncertainties associated with the prediction. The public wants to know how much weather they'll get, but given the inherent uncertainties we want to include error bars."

Improving these aspects of hurricane forecasting will require even more powerful supercomputers. With TACC's Stampede system — approximately 20 times larger than Ranger — coming online in January 2013, Zhang will have the opportunity to put additional predictive power into his storm forecasts, a fact that galvanizes him.

"We're having an impact in terms of what we do, and the effects on day-to-day life," Zhang said. "And of course the nature of hurricanes always excites us."

Aaron Dubrow, Science and Technology Writer
February 8, 2012


The Texas Advanced Computing Center (TACC) at The University of Texas at Austin is one of the leading centers of computational excellence in the United States. The center's mission is to enable discoveries that advance science and society through the application of advanced computing technologies. To fulfill this mission, TACC identifies, evaluates, deploys, and supports powerful computing, visualization, and storage systems and software. TACC's staff experts help researchers and educators use these technologies effectively, and conduct research and development to make these technologies more powerful, more reliable, and easier to use. TACC staff also help encourage, educate, and train the next generation of researchers, empowering them to make discoveries that change the world.

  • Researchers used Ranger to develop and test a new, high-resolution hurricane forecasting system that incorporates Doppler radar data from planes flying into the storm.
  • The forecasts were shown to improve intensity predictions by an average of 20 to 40 percent over the official forecasts of the National Hurricane Center (NHC).
  • The prediction system is one of a handful being assessed by the NHC to become part of the operational forecasting system used in emergency situations.

Aaron Dubrow
Science and Technology Writer
aarondubrow@tacc.utexas.edu