This research introduces the concept of a Virtual Seismologist (VS) capable of automating seismic monitoring and early warning systems. By drawing parallels to weather forecasting, the VS aims to provide timely notifications about ongoing seismic events. The system will leverage data analytics from seismographic networks to continuously update users on the location, magnitude, and potential impacts of earthquakes. Automated decision-making processes will enhance response actions, addressing the challenges of uncertainty and false alarms. The study combines Bayesian approaches with robust data assimilation techniques to improve seismic risk assessment and ensure community preparedness.
Creating the Virtual Seismologist. Tom Heaton, Caltech; Georgia Cua, Univ. of Puerto Rico; Masumi Yamada, Caltech. http://etd.caltech.edu/etd/
Earthquake Alerting … a different kind of prediction • What if earthquakes were really slow, like the weather? • We could recognize that an earthquake is beginning and then broadcast information on its development … on the news. • “an earthquake on the San Andreas started yesterday. Seismologists warn that it may continue to strengthen into a great earthquake and they predict that severe shaking will hit later today.”
If the earthquake is fast, can we be faster? • Everything must be automated • Data analysis that a seismologist uses must be automated • Communications must be automated • Actions must be automated • Common sense decision making must be automated
How would the system work? • Seismographic network computers provide estimates of the location, size, and reliability of events using data available at any instant … estimates are updated each second • Each user is continuously notified of updated information … the user's computer estimates the distance to the event, then calculates an arrival time, expected size, and uncertainty • An action is taken when the expected benefit of the action exceeds its cost • In the presence of uncertainty, false alarms must be expected and managed
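A minimal sketch of the user-side cost/benefit trigger described above, with hypothetical numbers; the threshold logic, not the values, is the point:

```python
# Minimal sketch of the decision rule: act only when the expected avoided loss,
# given the current probability of damaging shaking, exceeds the cost of acting.
# The probability, loss, and cost values below are hypothetical.

def should_act(p_damaging_shaking: float,
               loss_if_unprepared: float,
               cost_of_action: float) -> bool:
    """Return True when the expected benefit of acting exceeds its cost."""
    expected_benefit = p_damaging_shaking * loss_if_unprepared
    return expected_benefit > cost_of_action

# Example: 20% chance of damaging shaking, $50,000 avoidable loss,
# $2,000 cost to halt an assembly line -> act (10,000 > 2,000).
print(should_act(0.20, 50_000.0, 2_000.0))  # True
```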
What we need is a special seismologist • Someone who has good knowledge of seismology • Someone who has good judgment • Someone who works very, very fast • Someone who doesn’t sleep • We need a Virtual Seismologist
Virtual Seismologist (VS) method for seismic early warning • Bayesian approach to seismic early warning designed for regions with distributed seismic hazard/risk • Modeled on “back of the envelope” methods of human seismologists for examining waveform data • Shape of envelopes, relative frequency content • Robust analysis • Capacity to assimilate different types of information • Previously observed seismicity • State of health of seismic network • Known fault locations • Gutenberg-Richter recurrence relationship
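As an illustration of the Bayesian idea (a sketch, not the published VS formulation), the snippet below combines a placeholder envelope-amplitude likelihood with a Gutenberg-Richter magnitude prior over a grid of candidate magnitudes and distances:

```python
import numpy as np

# Hypothetical grid search over magnitude and epicentral distance for a single
# station. The amplitude "attenuation" model is a placeholder, NOT the published
# VS relations; it only illustrates likelihood x prior = posterior.

M_grid = np.linspace(2.0, 7.5, 111)       # candidate magnitudes
R_grid = np.linspace(1.0, 200.0, 200)     # candidate distances, km
M, R = np.meshgrid(M_grid, R_grid, indexing="ij")

def predicted_log_amp(M, R):
    # Placeholder model: amplitude grows with M, decays with log-distance.
    return 0.5 * M - 1.3 * np.log10(R) - 1.0

observed_log_amp = -1.2    # hypothetical observed envelope amplitude (log10)
sigma = 0.3                # assumed scatter of the envelope model

# Gaussian likelihood of the observation given (M, R)
log_like = -0.5 * ((observed_log_amp - predicted_log_amp(M, R)) / sigma) ** 2

# Gutenberg-Richter prior on magnitude: log10 N(M) = a - b*M, with b ~ 1
b_value = 1.0
log_prior = -b_value * M * np.log(10.0)

log_post = log_like + log_prior
i, j = np.unravel_index(np.argmax(log_post), log_post.shape)
print(f"MAP estimate: M = {M_grid[i]:.1f}, R = {R_grid[j]:.0f} km")
```

Dropping the `log_prior` term reproduces the "no prior information" case; adding further terms (Voronoi cells, known faults, network state of health) follows the same pattern.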
Ground motion envelope: our definition. The full acceleration time history is reduced, for efficient data transmission, to 3 components each of acceleration, velocity, and displacement at 9 samples per second. Envelope definition: maximum absolute value over a 1-second window.
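A sketch of this envelope computation, assuming a centered 1-second max-absolute-value window resampled to 9 samples per second; the window alignment and the synthetic input trace are assumptions:

```python
import numpy as np

def envelope_max_abs(x: np.ndarray, fs: float, out_rate: float = 9.0,
                     window_s: float = 1.0) -> np.ndarray:
    """Max absolute value over a sliding 1-second window, sampled at out_rate."""
    half = int(round(0.5 * window_s * fs))
    out_times = np.arange(0.0, len(x) / fs, 1.0 / out_rate)
    env = np.empty(len(out_times))
    for k, t in enumerate(out_times):
        c = int(round(t * fs))                       # center sample of the window
        lo, hi = max(0, c - half), min(len(x), c + half + 1)
        env[k] = np.max(np.abs(x[lo:hi]))
    return env

# Example: 100 samples/s synthetic acceleration trace, 10 s long
fs = 100.0
t = np.arange(0, 10, 1 / fs)
acc = np.sin(2 * np.pi * 3 * t) * np.exp(-0.3 * t)
print(envelope_max_abs(acc, fs)[:5])
```

The same operation would be applied to the velocity and displacement traces obtained by integration.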
Data set for learning the envelope characteristics Most data are from TriNet, but many larger records are from COSMOS • 70 events, 2 < M < 7.3, R < 200 km • Non-linear model estimation (inversion) to characterize waveform envelopes for these events • ~30,000 time histories
Average rock and soil envelopes as functions of M and R (rms horizontal acceleration).
Amplitudes relative to the average rock site: horizontal acceleration, vertical P-wave acceleration, horizontal velocity, and vertical P-wave velocity.
Estimating M from ratios of P-wave motions • P-wave frequency content scales with M (Allen and Kanamori, 2003; Nakamura, 1988) • Find the linear combination of log(acc) and log(disp) that minimizes the variance within magnitude-based groups while maximizing separation between groups (an eigenvalue problem) • Estimate M from the resulting discriminant, Zad
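The eigenvalue problem in the second bullet can be sketched as a Fisher-style generalized eigenproblem; the synthetic features and magnitude groups below are placeholders, not the calibrated VS coefficients:

```python
import numpy as np
from scipy.linalg import eigh

# Find weights w for features [log(acc), log(disp)] that minimize within-group
# scatter and maximize between-group scatter (generalized eigenvalue problem).

def fisher_weights(X: np.ndarray, labels: np.ndarray) -> np.ndarray:
    overall_mean = X.mean(axis=0)
    d = X.shape[1]
    S_w = np.zeros((d, d))   # within-group scatter
    S_b = np.zeros((d, d))   # between-group scatter
    for g in np.unique(labels):
        Xg = X[labels == g]
        mg = Xg.mean(axis=0)
        S_w += (Xg - mg).T @ (Xg - mg)
        diff = (mg - overall_mean)[:, None]
        S_b += len(Xg) * (diff @ diff.T)
    # Leading generalized eigenvector of S_b w = lambda S_w w
    vals, vecs = eigh(S_b, S_w)
    return vecs[:, np.argmax(vals)]

# Hypothetical usage: Zad is the projection onto the leading discriminant.
rng = np.random.default_rng(0)
X = rng.normal(size=(300, 2))              # stand-in for [log(acc), log(disp)]
labels = rng.integers(0, 3, size=300)      # stand-in magnitude-based groups
w = fisher_weights(X, labels)
Zad = X @ w
print(w, Zad[:3])
```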
Station map: CPP, MLS, WLT, DLA, SRN, PLS, LLS, STG • Voronoi cells are nearest-neighbor regions • If the first arrival is at SRN, the event must be within SRN's Voronoi cell • Green circles are seismicity in the week prior to the mainshock
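A sketch of the Voronoi-cell constraint: a candidate epicenter is admissible only if the first-triggered station is its nearest station. The station coordinates here are made up for illustration, not the real network geometry:

```python
import numpy as np

# If the first P arrival is at "SRN", admissible epicenters are those whose
# nearest station is SRN. Coordinates below are hypothetical (x, y) in km.

stations = {
    "SRN": (0.0, 0.0), "DLA": (-15.0, 8.0), "PLS": (20.0, -5.0),
    "STG": (5.0, -18.0), "WLT": (-8.0, 22.0),
}
names = list(stations)
coords = np.array([stations[n] for n in names])

def in_voronoi_cell(point_xy, first_arrival_station: str) -> bool:
    """True if the candidate epicenter's nearest station is the first to trigger."""
    d = np.linalg.norm(coords - np.asarray(point_xy), axis=1)
    return names[int(np.argmin(d))] == first_arrival_station

print(in_voronoi_cell((3.0, -2.0), "SRN"))    # True: nearest station is SRN
print(in_voronoi_cell((-14.0, 10.0), "SRN"))  # False: nearest station is DLA
```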
M, R estimates using 3 sec of observations at SRN, 3 sec after the initial P detection (single-station estimates; the star marks the actual M and distance to SRN): • Prior information (Voronoi cells, Gutenberg-Richter): epicentral distance estimate 33 km, M = 5.5 • No prior information: 8 km, M = 4.4 • Prior information (Voronoi cells, no Gutenberg-Richter): 9 km, M = 4.8
What about Large Earthquakes with Long Ruptures? • Large events are infrequent, but they have potentially grave consequences • Large events potentially provide the longest warning times to heavily shaken regions • Point-source characterizations are adequate for M < 7, but long ruptures (e.g., 1906, 1857) require a finite-fault characterization
Strategy to Handle Long Ruptures • Determine the rupture dimension by using high-frequencies to recognize which stations are near source • Determine the approximate slip (and therefore instantaneous magnitude) by using low-frequencies and evolving knowledge of rupture dimension • We are using Chi-Chi earthquake data to develop and test algorithms
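One way the first step might look in code (a sketch under assumed coordinates and classification flags): take the stations currently flagged as near-source and use their along-strike extent as a crude rupture-length proxy:

```python
import numpy as np

# Once high-frequency criteria flag a set of stations as "near-source", a crude
# rupture-length proxy is the extent of those stations along their principal
# axis. The coordinates below are hypothetical.

near_field_xy = np.array([      # stations currently classified near-source (km)
    [12.0, 3.0], [27.0, 9.0], [41.0, 14.0], [55.0, 21.0],
])

# Principal axis of the near-source stations approximates the strike direction.
centered = near_field_xy - near_field_xy.mean(axis=0)
_, _, vt = np.linalg.svd(centered, full_matrices=False)
along_strike = centered @ vt[0]
rupture_length_est = along_strike.max() - along_strike.min()
print(f"Estimated rupture extent: {rupture_length_est:.0f} km")
```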
We are experimenting with different Linear Discriminant analyses to distinguish near-field from far-field records
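A sketch of such a two-class discriminant using scikit-learn; the two features and the synthetic records are placeholders standing in for the envelope-derived measurements actually used:

```python
import numpy as np
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis

# Two-class linear discriminant separating near-field from far-field records.
# The features (e.g., peak envelope amplitude and a high/low-frequency ratio)
# and the synthetic training data are placeholders.

rng = np.random.default_rng(1)
near = rng.normal(loc=[0.5, 1.5], scale=0.4, size=(100, 2))   # near-field
far = rng.normal(loc=[-0.5, 0.3], scale=0.4, size=(100, 2))   # far-field
X = np.vstack([near, far])
y = np.array([1] * 100 + [0] * 100)                            # 1 = near-field

clf = LinearDiscriminantAnalysis().fit(X, y)
new_record = np.array([[0.4, 1.2]])
print("near-field" if clf.predict(new_record)[0] == 1 else "far-field")
```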
Classification snapshots at 10, 20, 30, and 40 seconds after origin, each separating near-field from far-field stations.
Strategy for acceleration envelopes • High-frequency energy is proportional to rupture area (Brune scaling) • Sum envelopes from 10-km patches
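A sketch of the patch-summation idea: model the rupture as sequential 10-km point-source patches and sum their predicted envelopes at a site. The single-patch envelope shape, wave and rupture speeds, and geometry below are schematic assumptions, not the calibrated envelope model:

```python
import numpy as np

# Model a long rupture as a line of point-source patches that rupture
# sequentially, predict each patch's envelope at the site, and sum them.

def patch_envelope(t, t_onset, dist_km):
    """Schematic envelope: delayed onset, 1/R geometric decay, exponential coda."""
    s_vel = 3.5                                    # km/s, assumed S-wave speed
    t_arr = t_onset + dist_km / s_vel
    rise = np.clip(t - t_arr, 0.0, None)
    return (1.0 / max(dist_km, 1.0)) * rise * np.exp(-0.5 * rise)

t = np.arange(0.0, 120.0, 1.0 / 9.0)               # 9 samples/s, as transmitted
site = np.array([30.0, 40.0])                      # hypothetical site (km)
v_rupture = 2.5                                    # km/s, assumed rupture speed

total = np.zeros_like(t)
for k in range(9):                                 # nine 10-km patches
    patch_xy = np.array([10.0 * k, 0.0])
    onset = 10.0 * k / v_rupture                   # patch ruptures as the front passes
    dist = np.linalg.norm(site - patch_xy)
    total += patch_envelope(t, onset, dist)

print(f"Peak summed envelope: {total.max():.3f} at t = {t[np.argmax(total)]:.1f} s")
```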
Sum of 9 point-source envelopes (vertical acceleration).
Once rupture dimension is known • Obtain approximate slip from long-period motions • Real-time GPS would be very helpful • Evolving moment magnitude is useful for estimating probable rupture length • Magnitude is critical for tsunami warning
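A worked example of the moment-magnitude step, using the standard relations M0 = mu * A * D and Mw = (2/3)(log10 M0 - 9.1) with M0 in N*m (Hanks and Kanamori); the rupture dimensions and slip are illustrative only:

```python
import math

# Convert an evolving rupture-dimension and slip estimate into a moment magnitude.
# M0 = mu * A * D; Mw = (2/3) * (log10(M0) - 9.1), with M0 in N*m.
# The rigidity, rupture dimensions, and average slip below are illustrative.

mu = 3.0e10                          # Pa, assumed crustal rigidity
length_km, width_km = 90.0, 15.0     # evolving rupture-dimension estimate
avg_slip_m = 2.0                     # approximate slip from long-period motions

area_m2 = (length_km * 1e3) * (width_km * 1e3)
m0 = mu * area_m2 * avg_slip_m       # seismic moment, N*m
mw = (2.0 / 3.0) * (math.log10(m0) - 9.1)
print(f"M0 = {m0:.2e} N*m  ->  Mw = {mw:.1f}")   # ~Mw 7.2 for these values
```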
Conclusions • A Bayesian statistical framework allows integration of many types of information to produce the most probable solution and error estimates • Waveform envelopes can be used for rapid and robust real-time analysis • Strategies to determine rupture dimension and slip look very promising • User decision making should be based on cost/benefit analysis • The Bayesian approach needs to be carried from source estimation through user response; in particular, the Gutenberg-Richter recurrence relationship should be included in either the source estimation or the user response • If a user wants to ensure that proper actions are taken during the "Big One", false alarms must be tolerated