Getting Ahead of the Spill
University of Texas researchers aid Gulf oil spill effort with satellite detection and supercomputer-enabled forecasts
Growing up in an oil camp in East Texas, Gordon Wells, the first person to see the satellite images of the BP disaster on a daily basis, experienced conditions comparable to the aftermath of an oil spill.
"There were all sorts of sludge pits and oil out there on the surface where I was raised," Wells said. "In coastal Louisiana, we're not talking about an area that's never been touched by oil. The marshes do have some resilience to come back, so long as they're not completely obliterated by this particular event."
Wells, a researcher at The University of Texas at Austin's Center for Space Research (CSR), has spent his career using satellite sensor systems to perform environmental monitoring and resource assessment. He has seen oil spills in Iraq, Kuwait, and even Louisiana, when the Alvenus supertanker broke up south of Lake Charles in 1984.
"I took aerial photographs of that spill and watched it come in along beaches near Galveston," Wells recalled.
This may be the first event, however, where his lifetime of experience with oil will have such a direct impact on the response to the disaster.
Using CSR's three rooftop antennas, Wells receives regular satellite photos and data transmissions from 14 American and international satellites passing over the Gulf, and streams these real-time images of the oil spill to 30 national and local relief agencies even faster than NASA can.
"With something this large, you have to have overhead observation that gives you a comprehensive and reasonably uniform view of the Gulf, day-by-day," Wells said.
Radar satellite observations in late July show oil extending approximately 60 miles east of the Deepwater Horizon drilling site, though only in minor patches. Stringers and filaments of oil lie near the Mississippi River Delta.
A key member of the NASA-funded Mid-American Geospatial Information Center (MAGIC) and the Governor's Emergency Management Council, Wells is at the front line of efforts to monitor and minimize damage from the oil—right where he wants to be.
Wells' collaborator, Clint Dawson, is a more reluctant hero.
A professor of aerospace engineering and engineering mechanics and head of the Computational Hydraulics Group at UT's Institute for Computational Engineering and Sciences (ICES), Dawson, along with Rick Luettich, head of the Institute of Marine Sciences at the University of North Carolina at Chapel Hill, and Joannes Westerink, a civil engineering professor at the University of Notre Dame, developed ADCIRC, the leading storm surge model used to predict the impact of hurricanes on the Gulf Coast.
Dawson and his collaborators are authorities on the specific dynamics of the Gulf's tides, currents, and wave systems, and among the best able to predict the oil spill's movements and landfall.
"I knew it was going to be a catastrophe from the minute I heard about it," Dawson said, sitting in his office overlooking the Spanish tile roofs of the UT campus. "It's in the 'bird's foot' area, where the Mississippi dumps into the Gulf, and where all kinds of fishing goes on. It's an important area economically, and also for environmental reasons, and it's going to be a long time before that area recovers."
The ADCIRC model depicts the interactions between weather, currents, oil, and land, including the specific topography of the coast and its marshes and wetlands. The models recently showed oil drifting toward Texas, just as residents began to see it wash ashore along the upper Texas coast.
The models are a feat of software engineering. They also reflect our growing ability to use supercomputers to simulate natural processes with enough fidelity to trace the complex dynamics of the Gulf of Mexico, and even to project the spill's progress into the future.
Running night and day on the Ranger supercomputer at the Texas Advanced Computing Center (TACC), ADCIRC, which uses CSR's satellite data as a starting point, has modeled the day-to-day movements of the oil spill and predicted its course up to three days into the future. To do so, Dawson's group has used more than 1.7 million computing hours (the equivalent of 194 years' worth of calculations on a single processor) in the eight weeks since they began simulating. They intend to run numerical simulations throughout the hurricane season.
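The compute figure cited above can be checked with simple arithmetic: 1.7 million processor-hours, divided by the number of hours in a year, works out to roughly 194 single-processor years.

```python
# Sanity check on the article's compute figures:
# 1.7 million processor-hours expressed as years on a single processor.
cpu_hours = 1_700_000
hours_per_year = 24 * 365  # 8,760 hours in a non-leap year

single_cpu_years = cpu_hours / hours_per_year
print(f"{single_cpu_years:.0f} years")  # ~194 years
```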
The satellite images and oil spill forecasts are available to first responders as they position booms along the shore and direct U.S. Coast Guard ships to the thickest plumes of oil, and, in the worst-case scenario of a hurricane, would help emergency teams get ahead of the disaster to save lives and property along the coast.
Still image from an animation developed by TACC staff that shows the transport of particles representing surface oil near the Louisiana coast.
Harnessing Technology for Action
Dawson and his colleagues at UNC and Notre Dame are able to perform these feats of forecasting because of their expert ability to harness the power of tens of thousands of computer processors to simulate the hydrodynamics of Gulf coast waters with extreme accuracy.
Wells, too, uses advanced computing technology to ingest, interpret, and distribute remote sensing information. Radar peers through clouds night and day. Feature-detecting algorithms interpret satellite imagery to differentiate mangroves from coastal grasslands, and sand bars from beachfront homes. And all the data passes through the pulsing fabric of Ranger and Corral, two of the most powerful advanced computing systems in the country, housed at TACC on The University of Texas at Austin's J.J. Pickle Research Campus.
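Feature detection of this kind is often built on spectral indices computed from a sensor's color bands. The toy classifier below illustrates the idea with an NDVI-style vegetation index; the reflectance values and thresholds are invented for illustration and are far simpler than the operational algorithms the article describes.

```python
# Toy land-cover classifier using an NDVI-style spectral index.
# Band values and thresholds are invented for illustration only.

def ndvi(nir, red):
    """Normalized difference vegetation index from near-infrared and red reflectance."""
    return (nir - red) / (nir + red)

def classify(nir, red):
    """Label a pixel by a simple (assumed) threshold on its vegetation index."""
    index = ndvi(nir, red)
    if index > 0.5:
        return "vegetation"            # e.g. mangroves, coastal grassland
    elif index > 0.1:
        return "sparse cover"
    else:
        return "water or bare surface"

print(classify(nir=0.45, red=0.08))    # a strongly vegetated pixel
```

Real feature detectors combine many bands, textures, and training data, but the threshold-on-index pattern is the common starting point.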
As Paul Simon sang: "These are the days of miracles and wonders."
In terms of lives protected and property preserved, the national investment in high-tech cyberinfrastructure is paying off, and scientists' use of advanced computing is only getting better.
ADCIRC simulations of the Gulf Coast in 2004 prophesied the potential impact of a storm like Hurricane Katrina, but were ignored until after the storm. In early July of this year, Dawson's team began simulating the storm surge caused by Hurricane Alex, based on the advisories coming from the National Hurricane Center. Within two hours of each advisory's release, they were able to produce forecasts, post-process the results, and send them to Wells at the State Operations Center in Austin. According to Wells, the timely results were shared there with the Texas Speaker of the House, Joe Straus, and other state officials.
"This is the first time the new forecast system has been run for a real hurricane, and it worked beautifully," Dawson wrote in an email to TACC and UT Austin leadership. "Without Ranger and the facilities at ICES we would not be able to do this."
Wells uses the satellite data and computer-driven forecasts to direct rapid responses to emergencies in the Gulf region. During Hurricane Ike, in 2008, Wells used the information to help state emergency managers position rescue teams, arrange transportation away from the impending storm, and track the transport of hospital patients and at-risk individuals.
Wells' outfit was monitoring conditions in the south-central U.S. when the Deepwater Horizon explosion occurred. Between May 25 and 27, it became apparent that the oil spill was much larger than had been reported.
"The spread of the spill was so rapid and over such a large surface area that we could see that it was going to be a bigger spill," Wells said.
A view of the Gulf Coast from a NASA satellite, one of the 14 spacecraft whose imagery Wells receives via CSR's rooftop antennas.
He began to accumulate all of the available satellite imagery of the area and created a workflow to distribute it to the National Oceanic and Atmospheric Administration's (NOAA) marine pollution surveillance team, who took the lead in organizing the imagery.
The data — several terabytes a day — is stored on TACC's Corral system, which distributes it to users through its high-speed network. "The data flows out to every other federal responding agency," Wells explained. "The Coast Guard is using our information to decide where they're going to have an operation that day and where they're most likely to intersect the part of the oil that they're interested in."
Trading Hurricanes for Oil Plumes
With a rapid-response grant from the National Science Foundation (NSF) and a million computing hours on TACC's Ranger supercomputer at their disposal, the ADCIRC team began adapting their storm surge modeling tool to track oil spills.
Since May 27, Dawson and his colleagues have been calculating one "nowcast" (a near-term prediction) and one forecast (tracking the spill up to three days into the future) per day. The researchers hope this time frame provides enough warning to alert and protect coastal areas in case of an emergency.
The simulations provide 50-meter resolution for the near-shore regions of interest — 10 to 20 times more detailed than other models of the oil spill. The models show thick blobs of oil entering the "bird's foot," and beginning to wash up into the marshes there.
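The animation of surface-oil transport suggests the underlying approach: particles representing oil parcels are carried along by the model's simulated surface currents. The sketch below shows the idea in its simplest form; the current field, time step, and particle positions are assumptions for illustration, not ADCIRC output.

```python
# Minimal sketch of passive-tracer advection for surface oil.
# The steady, uniform current below is an assumption for illustration;
# a real model uses a time- and space-varying simulated current field.

def advect(particles, velocity, dt):
    """Move each (x, y) particle one forward-Euler step along the current."""
    u, v = velocity
    return [(x + u * dt, y + v * dt) for x, y in particles]

# Three hypothetical oil parcels (positions in km) in a steady drift.
parcels = [(0.0, 0.0), (1.0, 0.5), (2.0, -0.5)]
current = (-0.4, 0.1)   # km per hour, west-northwest drift (assumed)

for hour in range(72):  # a three-day forecast window, hourly steps
    parcels = advect(parcels, current, dt=1.0)

print(parcels[0])  # first parcel's position after 72 hours
```

In practice the forecast also has to model spreading, weathering, and wind stress, which is part of why the team stresses how much remains unknown.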
Getting ahead of the spill's movements is the scientists' ultimate goal.
"Having the storm surge model allows us to isolate areas in danger, and get our search and rescue teams out there to do operations and do them effectively," said Wells. "Buying a little time can make a big difference."
The satellite images have a secondary value that is largely psychological. Along with oil-covered birds and underwater footage of the spewing wellhead, satellite images of the spill, with its grey-white tentacles spreading toward the green coast, have been among the most iconic reflections of the event.
"With the satellite images, people begin to understand the scale of the disaster," said Wells. "They can see what big is, just as they can look at the oil-soaked birds and know that it's bad. It's important for people to get the dimensions in their head."
The oil spill is now estimated to extend over more than 20,000 square miles.
As much as Dawson's predictive forecasts and Wells' satellite images are success stories, their experience tracking the spill points to the gaps in knowledge and the limitations of today's technology.
"We don't know how much oil is out there to be honest with you. Things just keep popping up from seemingly nowhere," Dawson said. "We're only modeling the surface oil because nobody knows the 3D extent of the plume."
In the case of a hurricane, things could get a lot worse fast. A storm passing over the spill would likely push oil far inland, and if the hurricane made landfall in a populated Gulf Coast city like New Orleans or Houston, it could present an environmental disaster on a massive scale. It is this scenario that keeps the researchers up at night.
For that reason, Dawson and his colleagues have been using data from previous hurricanes to predict potential outcomes if a hurricane passed directly over or near the spill.
"Our Hurricane Gustav and Ike simulations actually show that some of the oil is pushed to the west, causing it to wash up in places that you wouldn't expect, like Texas, for instance," said Dawson.
And then there's another possibility, the researchers say: that a hurricane could churn up the oil, dissipate the spill, and actually reduce its impact.
No one truly knows.
"Putting all of these pieces together in such a short timeframe has been a Herculean effort by a group of really top-notch researchers," Dawson said. "Just getting this to where it's automated and picks up the forecasts every day and simulates the spill is a major accomplishment."
For now, the scientists ask for a little patience and maturity when dealing with the spill and its clean-up.
"People are hurting," Dawson pleaded. "Let's figure out a way to keep them employed and supported, and get the spill cleaned up as quickly as possible."
By providing critical and timely information, Dawson and Wells' efforts are helping to do just that.
August 2, 2010
The Texas Advanced Computing Center (TACC) at The University of Texas at Austin is one of the leading centers of computational excellence in the United States. The center's mission is to enable discoveries that advance science and society through the application of advanced computing technologies. To fulfill this mission, TACC identifies, evaluates, deploys, and supports powerful computing, visualization, and storage systems and software. TACC's staff experts help researchers and educators use these technologies effectively, and conduct research and development to make these technologies more powerful, more reliable, and easier to use. TACC staff also help encourage, educate, and train the next generation of researchers, empowering them to make discoveries that change the world.
- Researchers from The University of Texas at Austin are using the advanced computing resources at TACC to observe and predict the movement of the Gulf oil spill.
- Information from satellite data and high-resolution simulations is passed to the federal agencies coordinating the response effort.
- By getting ahead of the oil spill, and forecasting the impact of a hurricane on the Gulf coast, the researchers hope to protect lives and property and avert an environmental disaster.
To learn more about these research efforts, visit the following sites:
- Mid-American Geospatial Information Center (MAGIC)
- ADCIRC Coastal Circulation and Storm Surge Model
- The Institute for Computational Engineering and Sciences (ICES)
Science and Technology Writer