Better Living Through Big Data

Supercomputing Shakes Up Earthquake Preparedness

Pearly Tan, Writer

Scientists say it’s time to put more sensors in the ground so today’s supercomputers can help us better understand earth movements and prepare for the next Big One.

People in the San Francisco Bay Area are often warned to prepare for the “Big One,” but few imagine that a supercomputer might save their lives.

This year marks the 25th anniversary of the devastating magnitude 6.9 Loma Prieta earthquake. Earthquake preparedness has evolved since then, including the enforcement of stricter building codes, but scientists still don’t know how much damage the next big quake will cause.

Now, a global team of geologists, mathematicians and computer scientists is getting closer to an answer by using a supercomputer to mimic a complex quake involving multiple fault lines. Their work has big implications for how people study and prepare for disasters.

The work demonstrates what’s possible when detailed data is combined with powerful processing, according to Alexander Heinecke, the researcher who led Intel Labs’ efforts. He joined a core team that included Dr. Michael Bader, Alexander Breuer and Sebastian Rettenberger of Technische Universitaet Muenchen (TUM) and Dr. Alice-Agnes Gabriel and Dr. Christian Pelties of Ludwig-Maximilians-Universitaet Muenchen (LMU), both in Germany.

Their work, supported by researchers inside Intel and computing centers from around the world, was nominated this year for the prestigious Gordon Bell Prize, which is given for achievement in high-performance computing.

Earthquakes are difficult to model because of the hundreds of factors that determine their size and severity. Heinecke, a computer scientist by training, says he was drawn to the “grand challenge” of seeing whether a supercomputer could make sense of all of the different variables.

For this project, the international team picked the magnitude 7.3 Landers earthquake that struck the Mojave Desert in 1992. No one died in the quake because of the area’s isolation, yet it’s regarded as one of the most instructive earthquakes on record because the main shock triggered ruptures on five adjacent faults, one after the other. Recording stations in the desert captured a wealth of data that now serves as a reference point.

To simulate the earthquake, Heinecke and his team built 3D models of the earth and tested different scenarios. They used supercomputers powered by Intel Xeon Phi coprocessors, such as the Texas Advanced Computing Center’s Stampede, to simulate ground motion with unprecedented accuracy.
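To get a feel for what “simulating ground motion” means numerically, here is a deliberately tiny sketch: a one-dimensional wave moving through rock, stepped forward in time with finite differences. It is not the team’s production code, which solves far richer 3D physics across multiple interacting faults; the grid size, wave speed and source below are illustrative assumptions only.

```python
import numpy as np

# Toy 1D seismic wave propagation with finite differences.
# A conceptual illustration only, not the team's 3D multi-fault solver.
nx, nt = 500, 1000        # grid points and time steps (illustrative)
dx, dt = 10.0, 0.001      # grid spacing in meters, time step in seconds
c = 3000.0                # assumed wave speed in rock, m/s

u_prev = np.zeros(nx)     # displacement field one step in the past
u_curr = np.zeros(nx)     # displacement field now
u_curr[nx // 2] = 1.0     # an impulsive "source" at the center of the model

r2 = (c * dt / dx) ** 2   # squared Courant number; must stay <= 1 for stability

for _ in range(nt):
    u_next = np.zeros(nx)
    # Second-order centered differences in space and time (ends held fixed).
    u_next[1:-1] = (2 * u_curr[1:-1] - u_prev[1:-1]
                    + r2 * (u_curr[2:] - 2 * u_curr[1:-1] + u_curr[:-2]))
    u_prev, u_curr = u_curr, u_next

print("Peak displacement in the toy model:", u_curr.max())
```

Scaling that basic idea up to three dimensions, realistic geology and several faults breaking in sequence is what pushes the calculation onto machines like Stampede.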

The team’s earthquake simulation code performed up to 8.6 quadrillion calculations per second. An average computer manages about 10 billion calculations per second, roughly a million times slower.
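As a quick back-of-the-envelope check, using only the two figures cited above, the gap works out to a factor of roughly a million:

```python
simulation_rate = 8.6e15  # 8.6 quadrillion calculations per second
desktop_rate = 1e10       # about 10 billion calculations per second

print(f"Roughly {simulation_rate / desktop_rate:,.0f} times faster")  # ~860,000x
```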

The work is gaining notice among earthquake scientists.

“Ten years ago, computers were preventing us from understanding the ground,” said Brad Aagaard, a research geophysicist for the U.S. Geological Survey (USGS).

“Today, the limiting factor is not computing power but our understanding of the earth. We need more observations and sensors out there.”

That’s starting to happen, too. Newer forms of seismic data collection include accelerometers placed in buildings and in the ground to measure shaking.

California Memorial Stadium at the University of California, Berkeley, sits right on top of the Hayward Fault. It underwent a $321 million seismic retrofit in 2010.

The university’s Seismological Laboratory created an app called MyQuake, which estimates the shaking intensity at the phone’s location during an earthquake.
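MyQuake’s own model isn’t described here, but the general idea of estimating shaking from an earthquake’s size and your distance from it can be sketched with a generic attenuation formula. Everything below, the coefficients especially, is a hypothetical illustration rather than the app’s actual calculation.

```python
import math

def estimated_intensity(magnitude: float, distance_km: float) -> float:
    """Very rough shaking-intensity estimate at a given distance from a quake.

    A generic "grows with magnitude, decays with log distance" form with
    illustrative coefficients; NOT the model the MyQuake app actually uses.
    """
    a, b, c = 3.0, 1.0, 3.0              # illustrative coefficients only
    distance_km = max(distance_km, 1.0)  # avoid taking the log of tiny distances
    intensity = a + b * magnitude - c * math.log10(distance_km)
    return max(1.0, min(12.0, intensity))  # clamp to the 1-12 intensity scale

# Example: shaking felt 10 km from a magnitude 6.9 earthquake.
print(round(estimated_intensity(6.9, 10.0), 1))
```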

Aagaard hopes that with better ground-shaking forecasts, architects will be able to design buildings that can survive even the biggest earthquakes.

As supercomputing evolves and earthquake simulation becomes more accurate, it could have a big impact on the design of buildings in quake-prone areas like San Francisco.

“When the ground moves during an earthquake, it affects buildings and the systems in them,” said Stephen Mahin, a professor of structural engineering at the University of California, Berkeley.

“How the building moves will change the ground, and affect how the next building shakes.”

With more complex buildings going up in San Francisco, researchers are using supercomputing not only to analyze how tall buildings move but also to model the effects of earthquakes on entire cities. Such rich detail and high-level analysis require millions of calculations, just the sort of work supercomputers excel at.

Simulations are also helping engineers determine whether a building’s foundation needs to be driven deep into bedrock or spread wide to grip more soil.

So what about the Big One? The Hayward Fault, the main threat to the Bay Area, ruptures every 140 years or so, most recently in 1868. So it’s due. Six other nearby faults could also be triggered by a major quake.

Right now, the biggest obstacle is data. Scientists can prepare for the Big One by simulating every plausible scenario, but until the Hayward Fault actually ruptures and reveals how much of it broke and where, no one will know for sure.

“In working on earthquake simulations, the best you can achieve is reasonably accurate scientific outcomes. You can never perfectly simulate real-world outcomes,” Heinecke said.

Feature image of Loma Prieta quake by USGS / Bay Bridge earthquake photo by Joe Lewis, CIR Online
