The 1989 Loma Prieta earthquake was a wake-up call for the Central Coast and San Francisco Bay Area, which had gone decades without a major earthquake. In the 20 years since then, increasingly sophisticated assessments of earthquake hazards have led to a better understanding of the seismic threat in the Bay Area and throughout California.
The bottom line from those assessments is clear and simple: Be prepared.
“It is good that people are remembering Loma Prieta 20 years later, because it is an example of what is to come. There will be comparable events in California over the next few decades, and some will be more damaging,” said seismologist Thorne Lay, professor of Earth and planetary sciences at UCSC.
Karen McNally, professor emerita of Earth and planetary sciences, was director of the Richter Seismological Laboratory and the Institute of Tectonics at UCSC in 1989. Thanks in part to her efforts, the seismological recordings of the Loma Prieta earthquake were the best that had ever been obtained for a large continental earthquake. As a result, scientists learned a great deal, not only about the faults in this area, but more generally about the factors that contribute to damaging ground shaking during an earthquake.
McNally, who recognized two earlier temblors in the region as possible precursors to a large earthquake, had installed state-of-the-art digital instruments for recording ground motions in the area where the San Andreas fault ruptured on October 17, 1989.
“I felt that a large earthquake was probably imminent on the timescale of a few months to a few years,” she said. “I was then testing prototype instruments designed for field work on large earthquakes, so I put those out in August.”
After the earthquake, additional instruments that had just been purchased by the Incorporated Research Institutions for Seismology (IRIS) were sent to UCSC and installed in the field to record aftershocks.
“It was certainly my entrance into modern digital seismology,” said Susan Schwartz, then a postdoctoral researcher working with McNally and now director of the Keck Seismology Laboratory at UCSC. “We had a lot of graduate students helping to put out the instruments, and we set up a processing center to handle the data. We learned a lot about the geometry of the fault zone and how the intensity of the shaking was affected by surface geology.”
Since Loma Prieta, California building codes have been updated to reflect new understanding of ground motions during an earthquake and better mapping of seismic hazards. McNally’s instruments recorded ground accelerations of 1 g or more, much greater than was thought possible for a magnitude 6.9 earthquake. Peak ground acceleration is a key factor in determining the amount of damage caused by an earthquake. Measurements of 1 g or more were later recorded during the 1994 Northridge earthquake (magnitude 6.7) in southern California, confirming McNally’s findings.
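For readers unfamiliar with the unit, a brief note on scale may help; the numbers here (1 g, magnitude 6.9) come from the recordings described above, and the conversion is just standard gravitational acceleration, offered as an illustration rather than part of the original report:

```latex
% Peak ground acceleration is commonly reported in units of g,
% the standard gravitational acceleration at Earth's surface.
a_{\mathrm{peak}} = 1\,g \approx 9.81\ \mathrm{m\,s^{-2}}
% A 1 g peak horizontal acceleration means the ground briefly pushes
% on a structure with an inertial force comparable to its own weight,
% since F = m a = m g.
```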
The data recorded from Loma Prieta clearly showed that ground shaking is much more violent on soft sediments, such as those found on the margins of San Francisco Bay, than on bedrock. Geologists had previously recognized this effect, but the new data helped to quantify it. There were also some surprises in the damage patterns from Loma Prieta that ultimately led to a better understanding of how the propagation of seismic waves through different geological structures can focus energy in certain areas.
“We now know so much more and can predict what the shaking is going to be in a given area,” Schwartz said. “The U.S. Geological Survey can now issue a shake map right after an earthquake occurs that gives a preliminary estimate of the intensity of the shaking in different locations.”
Shake maps are a valuable resource for emergency personnel and agencies directing the response to an earthquake. Lay noted that the capacity to rapidly record and analyze data from an earthquake has led to the development of automated systems for communicating information about an earthquake after it happens.
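Shake maps and the underlying event data are published openly by the USGS. As a rough illustration of how that information can be pulled programmatically, the sketch below queries the agency's public FDSN event web service for recent moderate earthquakes near the Bay Area. The endpoint and field names are the publicly documented ones; the coordinates, time window, and magnitude cutoff are arbitrary example values, and this is not the automated response system Lay describes.

```python
# Minimal sketch: query the public USGS FDSN event service for recent
# moderate earthquakes near the San Francisco Bay Area.
# (Illustrative only; example search parameters, not an official pipeline.)
import requests

USGS_ENDPOINT = "https://earthquake.usgs.gov/fdsnws/event/1/query"

params = {
    "format": "geojson",        # machine-readable GeoJSON response
    "starttime": "2024-01-01",  # example time window
    "latitude": 37.8,           # rough center of the Bay Area
    "longitude": -122.3,
    "maxradiuskm": 100,         # search radius around that point
    "minmagnitude": 3.0,        # ignore very small events
    "orderby": "time",          # most recent first
}

response = requests.get(USGS_ENDPOINT, params=params, timeout=30)
response.raise_for_status()

for event in response.json()["features"]:
    props = event["properties"]
    # 'mag' is the reported magnitude, 'place' a human-readable location,
    # and 'url' links to the event page, which includes ShakeMap products
    # when one has been generated.
    print(f"M{props['mag']:.1f}  {props['place']}  {props['url']}")
```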
“Technological advances have really been speeding up how fast we can communicate information about an earthquake to governments, utilities, and railways, so they can do rapid responses like stopping trains and inspecting bridges,” Lay said. “Our response capability in California is getting better every year, although it’s still not quite as advanced as in Japan.”
Predicting earthquakes remains a challenging and perhaps insurmountable problem. But McNally emphasized the value of the earthquake probability forecasts that are available now.
“Intermediate and long-term warnings are very useful if people know what to do with that information,” she said.
The most recent assessment of California earthquake hazards, the Uniform California Earthquake Rupture Forecast, was issued in 2008. The forecast includes a greater than 99 percent probability of one or more magnitude 6.7 or larger earthquakes in California over the next 30 years. In the greater San Francisco Bay Area, the probability of a magnitude 6.7 or greater earthquake is 63 percent, or about 2 out of 3.
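The 63 percent figure is a 30-year probability. As a back-of-the-envelope illustration only, assuming a simple Poisson model with a constant annual rate (which is not how the forecast itself is computed), it corresponds to roughly a 3 percent chance in any given year:

```latex
% Under a constant-rate Poisson assumption, the probability of at least
% one event in T years is
P = 1 - e^{-\lambda T}
% Solving for the annual rate with P = 0.63 and T = 30 years:
\lambda = -\frac{\ln(1 - 0.63)}{30} \approx 0.033 \ \text{per year}
```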
“These are sobering numbers,” Lay said. “For the state as a whole, it’s almost a guaranteed occurrence. In the Bay Area, the primary concern is the Hayward fault, which has a particularly high perceived risk. Loma Prieta caused tremendous damage, but its epicenter was actually in a pretty remote area. A comparable event on the Hayward fault would be right under a densely populated area.”
Schwartz noted that while a lot has changed in earthquake science over the past 20 years, the basic messages of earthquake safety and preparedness are much the same: plan ahead; have emergency supplies on hand; eliminate hazards such as unsecured bookcases; and “drop, cover, and hold on” during an earthquake.
“I look at this anniversary as an opportunity for people to think about earthquake preparedness,” Schwartz said. “It amazes me to realize that most of the students in my classes now weren’t even born then. It could happen again, and people should think about whether they’re prepared.”