Two Years of Super Science

Published on February 4, 2010 by Aaron Dubrow

A view of Ranger, which resides on The University of Texas at Austin's J.J. Pickle Research Campus.

The machine room at the Texas Advanced Computing Center (TACC) hums with science. Invisible to the naked eye, simulated universes are forming in the eons after the big bang, virtual drug compounds are binding to viral proteins, and models of tomorrow's solar cells are absorbing photons of light.

Since 2008, Ranger — one of the world's most powerful supercomputers — has churned out computations to help solve some of the world's most critical problems, from climate change to swine flu. With a peak performance of 579 trillion calculations per second (579 teraflops) and 123 terabytes of memory, Ranger has helped researchers address large, important scientific problems with incredible speed.
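The 579-teraflop figure can be sanity-checked with back-of-the-envelope arithmetic. A minimal sketch, assuming the commonly cited configuration for Ranger's quad-core Opteron nodes (node count, clock speed, and flops per cycle below are assumptions, not quoted from this article):

```python
# Back-of-the-envelope check of Ranger's 579-teraflop peak rating.
# Figures below are assumed specs for the quad-core AMD Opteron
# ("Barcelona") configuration, used purely as an illustration.
nodes = 3936              # assumed compute-node count
cores_per_node = 16       # four quad-core Opteron sockets per node
clock_hz = 2.3e9          # assumed 2.3 GHz core clock
flops_per_cycle = 4       # double-precision flops per core per cycle

peak_tflops = nodes * cores_per_node * clock_hz * flops_per_cycle / 1e12
print(f"{peak_tflops:.0f} Tflops")  # → 579 Tflops
```

Under these assumptions, 62,976 cores at 2.3 GHz and 4 flops per cycle reproduce the quoted peak almost exactly.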

On Feb. 4, 2010, Ranger marked its second anniversary of scientific research in the service of society.

The story begins in 2006, when a relatively new and rapidly growing advanced computing center in Austin, Texas, competed for, and won, the first of the National Science Foundation's (NSF) "Path to Petascale" grants. The purpose of the grant was to stimulate the U.S. national research community with an investment in supercomputers an order of magnitude larger than those that previously existed in the open science community.

"TACC had emerged on the national landscape through the acquisition of the first IBM Power4 System available for the national open science community in 2001, and the first terascale Linux cluster for the national open science community in 2003," said Jay Boisseau, TACC director and principal investigator for Ranger. "However, we were still a relatively new center, with a small but growing staff. We had tremendous talent and passion, though, with the confidence to compete and a vision for how to succeed."

Ranger entered formal production on Feb. 4, 2008, and premiered on the June 2008 TOP500 list as the #4 system, and the top academic supercomputer, in the world. Ranger was 50,000 times more powerful than a PC, and five times more capable than any open-science supercomputer available in the U.S. at the time. The system immediately doubled the computing time available to scientists through the NSF TeraGrid — the NSF's network of cyberinfrastructure resources — and was pivotal in paving the way for the next generation of supercomputers.

"Ranger is an incredible asset for The University of Texas at Austin, for the national scientific community, and for society as a whole," William Powers Jr., president of The University of Texas at Austin, said at the time. "That the National Science Foundation has entrusted us with this vital scientific instrument for research is a testament to the expertise of the staff at TACC, who are among the most innovative leaders in high performance computing."

At $59 million, funding for Ranger represents the largest NSF award ever made to The University of Texas at Austin.

Much about the system was novel. Working with technology partners Sun Microsystems and Advanced Micro Devices (AMD), TACC built Ranger around a first-of-its-kind Sun Datacenter Switch 3456, which provided a scalable InfiniBand network infrastructure. Ranger was also the first large supercomputer to use the new quad-core AMD Opteron™ processors.

"When we built Ranger, there was only one piece of hardware that existed as a product," explained Karl Schulz, TACC's associate director for Application Collaborations and co-principal investigator on the Ranger project. "Everything else was brand new."

The most pioneering aspect of the system was not the novelty of its parts, but the way it was able to integrate consumer components and open source software and tools to create a well-balanced, versatile system.

Over the last two years, Ranger has evolved from a novel cluster-based supercomputer into a mainstay of the academic computational research community. It has enabled 981 research projects and served more than 2,800 users in almost every discipline of science and engineering, and even in the social sciences and humanities.

Simulations of Hurricane Ike on Ranger helped improve NOAA's forecasting methodology and demonstrated how The University of Texas at Austin is leading the nation in the development of scientific and computing solutions. [Credit: NOAA; Bill Barth, John Cazes, Greg P. Johnson, Romy Schneider and Karl Schulz, TACC.]

"We've made it reliable, high performance and scalable, and we've provided great user support," said Boisseau. "It's a system that is constantly in demand — often far in excess of what we can even provide — because we've established a reputation for making it a great working environment."

Of these hundreds of projects, a subset of research groups have successfully scaled their applications to full-system runs employing more than 60,000 processing cores, pushing the limits of what is possible with advanced computing capabilities. Meanwhile, runs of more than 5,000 cores have become routine, enabling the entire field of computational science to address more complex problems.

This explosion of available computing has made possible a host of improvements for domain scientists. "We have seen higher resolution models, more physics, more coupling of applications, larger single applications, larger parameter space sweeps, and larger ensembles of applications," Tommy Minyard, director of Advanced Computing Systems at TACC, enumerated. "It's game-changing."

In many cases, the research could not have been completed on a reasonable time-scale — or at all — without the use of Ranger's massive processing power and memory.

The scientific insights enabled by Ranger are numerous and diverse, and have been presented in publications in leading science journals and conference proceedings. But perhaps the most notable, and socially impactful, projects involved cases where Ranger's computational power helped save lives and protect property.

When Hurricanes Ike and Gustav struck the Gulf Coast in 2008, an ongoing collaboration with the National Oceanic and Atmospheric Administration (NOAA) placed TACC at the center of emergency scientific activity. Hurricane forecasting requires massive computing power and cyberinfrastructure to be effective, so as the storms approached landfall, NOAA researchers turned to TACC to test their next-generation hurricane forecasting system on Ranger.

Understanding and controlling charge separation at the nanoscale from first-principles calculations. This example shows the difference between semiconducting and metallic carbon nanotubes at a heterojunction (right).

The multi-tiered effort used global models with twice the resolution of the best operational simulations and regional models with six times the resolution of those in operational use. For the first time, Doppler radar data was streamed directly from NOAA planes to TACC servers, and instead of a single forecast, researchers ran 30 forecasts for each prediction to perform the ensemble blending that provides important uncertainty measures. Taken together, these improvements in accuracy proved immense.
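The idea behind running 30 forecasts per prediction can be sketched in a few lines: perturbed ensemble members spread out, and that spread is itself the uncertainty estimate a single deterministic run cannot provide. A minimal illustration with made-up numbers (the landfall-longitude values are purely hypothetical, not NOAA data):

```python
# Toy illustration of ensemble blending: 30 forecast "members"
# stand in for perturbed model runs; their mean is the blended
# forecast and their spread is the uncertainty measure.
import random
import statistics

random.seed(1)
# Hypothetical landfall longitudes (degrees west) for one storm;
# member-to-member noise mimics perturbed initial conditions.
members = [94.5 + random.gauss(0, 0.8) for _ in range(30)]

track_mean = statistics.mean(members)     # blended "best" forecast
track_spread = statistics.stdev(members)  # uncertainty estimate
print(f"ensemble mean {track_mean:.1f} deg W, spread +/- {track_spread:.1f} deg")
```

Operational systems blend far more sophisticated physics, but the payoff is the same: a forecast accompanied by a quantified error bar.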

"Hurricane track forecasts improved by 15 percent for a five-day forecast," said Robert Gall, NOAA Hurricane Forecast Improvement Project manager. "The demonstration led to a revised strategy for our next-generation operational data assimilation which would not have been possible without Ranger."

Ranger had another chance to show off its urgent computing capabilities and potential for social impact when the H1N1 pandemic emerged in mid-2009. What was this new virus? How did it function? Would our medicines be effective against it?

A team led by Klaus Schulten at the University of Illinois at Urbana-Champaign immediately got to work answering these questions with Ranger's assistance. In quick succession, they modeled the virus's neuraminidase protein — the target of all current antiviral medications — and simulated how Tamiflu, Relenza, and several Phase 3 drug candidates bound to the drug site, as well as to that of a likely mutant strain. Their molecular simulations revealed the pathway and the dynamics that led the drugs to bind to the active site of the viral protein. They also discovered that slight changes in the protein's structure, found in mutant forms, could allow the virus to evade our current drugs.

"A great fear is that the swine flu will mutate quickly and become drug resistant, and in fact, there are drug-resistant strains already," said Eric Lee, a postdoctoral researcher in Schulten's lab.

With the knowledge gained through their virtual neuraminidase and drug-binding model, Schulten's team is guiding the way to the creation of new drugs and drug approaches that will continue to protect us against the flu.

Ranger is assisting with another kind of emergency: climate change driven by greenhouse gas emissions. Projects running on the system include efforts to model the waters around Antarctica, map the winds of Texas, and make fundamental improvements in our understanding of solar cells.

The last of these projects, conducted by Jeffrey Grossman, professor of Materials Science and Engineering at MIT, aims to find cheap and efficient solar energy alternatives through computer simulation.

"The only way we can do these calculations, at these sizes, with the complexity that we're looking for, is on a machine like Ranger," Grossman said. "It's essentially the difference between being able to solve the problem and not."

Add to this list research in basic science, including how the universe formed, what matter is made of, and how tectonic plates move, and you have a sense of the scope and potential of the research enabled by Ranger's processing power.

Two years after its debut — halfway through its four-year life cycle — Ranger remains one of the top 10 systems in the world and is doing its best work: a testament to its extraordinary scale and to the talented staff who continue to improve the system.

From an experimental, untested platform, Ranger has evolved into the workhorse of the TeraGrid, a model for smaller Linux clusters around the world — from the Australian Weather Agency to South Africa's Centre for High Performance Computing — and a launching pad for academic researchers preparing their computational methodologies for newer petascale systems such as Kraken and the forthcoming Blue Waters.

The Ranger team has not rested on its laurels. In Ranger's short life, the team has upgraded its processors, tuned its software stack, and made the system more versatile by adding a connected visualization resource, Spur. With Spur, visualization, data analysis, and compute-intensive simulation and modeling can all be done on the same system, without time-consuming file transfers.

Over the next two years, Ranger will continue to support the open science community, assist in emergencies, and enable fundamental discoveries in energy, the environment, and the life sciences that will change the world.

"I'm proud that TACC earned the opportunity and the responsibility from the National Science Foundation of providing the first system with hundreds of teraflops of performance for the open science community," Boisseau said. "As the first such system, we led the way towards petascale computing, and we're very pleased with how favorably Ranger has been adopted and is being used for scientific discovery."

Story Highlights

On Feb. 4, 2010, Ranger — one of the world's most powerful supercomputers — celebrated its second anniversary.

Since its deployment, Ranger has enabled 981 projects, assisting more than 2,800 researchers.

Among its achievements, Ranger has helped scientists improve hurricane forecasts, evaluate antiviral drugs against the H1N1 virus, and understand solar energy capture.

