Plate-tectonic analysis of shallow earthquakes: Toward long-term seismic hazard models based on tectonics. Peter Bird Yan Y. Kagan Zheng-Kang Shen Zhen Liu UCLA Department of Earth & Space Sciences October 1, 2003 presented to USGS, Menlo Park, CA.

  1. Plate-tectonic analysis of shallow earthquakes: Toward long-term seismic hazard models based on tectonics Peter Bird Yan Y. Kagan Zheng-Kang Shen Zhen Liu UCLA Department of Earth & Space Sciences October 1, 2003 presented to USGS, Menlo Park, CA

  2. Approaches to improving seismic hazard models: • Do a good job on the long-term-average (Poissonian) hazard maps before tackling time-dependence. • Base seismic coupling and frequency/magnitude relations on global statistics. • Determine fault slip rates and anelastic strain rates from unified kinematic models.

  3. I. Do a good job on the long-term-average (Poissonian) hazard maps before tackling time-dependence. • Conceptually simpler. Perhaps we can all agree on basic principles. • Needed for LONG-term planning (nuclear waste repositories, dams, new Pantheons). • A stable hazard map simplifies public education. • Supports studies of time-dependent hazard by showing which clustering patterns are permanent, and which are time-dependent.

  4. II. Base seismic coupling and frequency/magnitude relations on global statistics. • In continents, “characteristic earthquake” sequences may be the exception, not the rule. • Anticipating fault segmentation is subjective. • Some recent large earthquakes have ignored expected rupture segments (Northridge, Landers, Denali). • Instead, use the “ergodic assumption”: Global data over one century may substitute for local data covering thousands of years.

  5. The kinematic basis for the global calibration: Plate boundary model PB2002 has 52 plates and 13 orogens: Bird [2003] An updated digital model of plate boundaries, Geochemistry Geophysics Geosystems, 4(3), 1027, doi:10.1029/2001GC000252.

  6. Source data for the PB2002 plate boundary model: • Plate Tectonic Map of the Circum-Pacific Region [Circum-Pacific Map Project, 1981; 1986]; • gridded topography/bathymetry from ETOPO5 [NOAA-NGDC, 1988]; • 14 Euler poles for large plates from NUVEL-1A [DeMets et al., 1994]; • 10 small plates, and orogen concept, from Gordon [1995]; • 1,511 subaerial volcano locations from the Smithsonian's Global Volcanism Program [Simkin & Siebert, 1995]; • mid-ocean spreading ridge boundaries from Paleo-Oceanographic Mapping Project [Mueller et al., 1997]; • gridded sea floor ages from the Paleo-Oceanographic Mapping Project; • Global Seismic Hazard Map [Giardini et al., 1999]; • 168 regional studies, including 32 using GPS; • locations and nodal planes of ~15,000 shallow earthquakes from the Harvard CMT catalog.

  7. Example of a simple improvement: Recognition of a Mariana plate (MA) which separates from the Philippine Sea plate (PS) by back-arc spreading:

  8. Example of a complex improvement: The Banda Sea-New Guinea region (8 small plates):

  9. Regions of non-rigid lithosphere (or very many small plates) are designated as “orogens” in which this model is not expected to be accurate. Here: the Philippines orogen.

  10. Model PB2002 includes estimated Euler poles and velocities for each plate:

  11. Relative plate velocities predicted by PB2002:
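The velocities shown on this slide follow from the rigid-plate relation v = ω × r. A minimal sketch of that computation (the pole position and rotation rate below are illustrative placeholders, not actual PB2002 entries):

```python
import numpy as np

EARTH_RADIUS_KM = 6371.0

def unit_vector(lat_deg, lon_deg):
    """Cartesian unit vector of a geographic point."""
    lat, lon = np.radians(lat_deg), np.radians(lon_deg)
    return np.array([np.cos(lat) * np.cos(lon),
                     np.cos(lat) * np.sin(lon),
                     np.sin(lat)])

def plate_velocity(pole_lat, pole_lon, rate_deg_per_myr, site_lat, site_lon):
    """Surface velocity v = omega x r predicted by an Euler pole,
    returned in mm/yr (1 km/Myr == 1 mm/yr)."""
    omega = np.radians(rate_deg_per_myr) * unit_vector(pole_lat, pole_lon)  # rad/Myr
    r = EARTH_RADIUS_KM * unit_vector(site_lat, site_lon)                   # km
    return np.cross(omega, r)

# Illustrative pole (48N, 105W, 0.6 deg/Myr) -- NOT a PB2002 value --
# evaluated at a site near Los Angeles:
v = plate_velocity(48.0, -105.0, 0.6, 34.0, -118.0)
speed = float(np.linalg.norm(v))   # about 20 mm/yr for this pole
```

The relative velocity across a boundary is the difference of two such vectors, one per plate.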

  12. Every digitization step (from 1 to 109 km long) along every plate boundary is classified as being one of 7 types:
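The decision can be sketched as a function of crust type and style of relative motion. The class names are those of PB2002; the 45°/135° angle cuts and the boolean inputs are simplifying assumptions of this sketch — the actual classification in Bird [2003] also weighs bathymetry, sea-floor age, and trench geometry:

```python
def classify_step(continental: bool, subducting: bool, velocity_angle_deg: float) -> str:
    """Assign one of the 7 PB2002 boundary classes.
    velocity_angle_deg: angle between relative velocity and boundary strike
    normal sense: 0 = pure divergence, 90 = pure strike slip, 180 = pure convergence.
    CRB/CTF/CCB = continental rift / transform / convergent boundary;
    OSR/OTF/OCB = oceanic spreading ridge / transform / convergent boundary;
    SUB = subduction zone."""
    if subducting:
        return "SUB"
    if velocity_angle_deg < 45.0:           # dominantly divergent
        return "CRB" if continental else "OSR"
    if velocity_angle_deg <= 135.0:         # dominantly strike slip
        return "CTF" if continental else "OTF"
    return "CCB" if continental else "OCB"  # dominantly convergent
```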

  13. New work, to determine:“For each of the 7 types of plate boundary, • What is the average rate of damaging earthquakes (above some magnitude threshold, per unit boundary length)? • How large might the largest event be (in the next century, at a given confidence)? • What fraction of low-temperature (frictional) inter-plate slip is expressed as earthquakes?”

  14. Using the Harvard CMT catalog of 15,015 shallow events:

  15. We study histograms of earthquake* frequency as a function of distance to the nearest plate boundary: [*shallow earthquakes of appropriate focal mechanism, excluding those in orogens] Note that the distribution for SUBduction zones is asymmetrical:

  16. We adopt a “two-sigma” rule for apparent boundary half-width. This is generally greater than or equal to the half-widths expected a priori. This rule selects ~95% of shallow non-orogenic EQs into one boundary or another. Table 1. Estimates of Apparent Boundary Half-Width (in km) and CMT catalog statistics
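The two-sigma rule can be sketched as follows; the synthetic Gaussian distances here stand in for the real CMT event-to-boundary measurements, and the ~95% selection fraction emerges only when the scatter is roughly Gaussian:

```python
import numpy as np

def apparent_half_width_km(distances_km):
    """Two-sigma rule: the apparent boundary half-width is twice the
    standard deviation of earthquake-to-boundary distances."""
    return 2.0 * np.asarray(distances_km, dtype=float).std()

def select_events(distances_km, half_width_km):
    """Boolean mask of events falling within the half-width."""
    return np.abs(np.asarray(distances_km, dtype=float)) <= half_width_km

rng = np.random.default_rng(0)
d = rng.normal(0.0, 30.0, size=2000)   # synthetic scatter, sigma = 30 km
hw = apparent_half_width_km(d)          # close to 60 km
frac = select_events(d, hw).mean()      # close to 0.95
```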

  17. Formal assignment of an earthquake to a plate boundary step is by a probabilistic algorithm that considers all available information: step type, spatial relations, EQ depth, and focal mechanism:

  18. The A factor takes into account the length, velocity, and inherent seismicity of each candidate plate boundary step. The inherent seismicity levels of the 7 types of plate boundary (obtained by iteration of this classification algorithm) are valuable basic information for seismicity forecasts:
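A minimal sketch of how such factors might be combined into assignment probabilities. The factor names A (step length × velocity × inherent seismicity), D (depth), and E (mechanism grade) follow the slides; the spatial factor's name S, and the multiplicative combination with renormalization, are assumptions of this sketch:

```python
def assignment_probabilities(candidates):
    """Turn per-candidate factor products into normalized probabilities
    of assigning one earthquake to each candidate plate-boundary step."""
    weights = [c["A"] * c["S"] * c["D"] * c["E"] for c in candidates]
    total = sum(weights)
    if total == 0.0:
        return [0.0] * len(candidates)   # event left unassigned
    return [w / total for w in weights]

# Hypothetical factor values for one earthquake with two candidate steps:
probs = assignment_probabilities([
    {"A": 2.0, "S": 0.8, "D": 0.5, "E": 0.9},   # nearby subduction step
    {"A": 0.5, "S": 0.6, "D": 0.2, "E": 0.4},   # more distant transform step
])
```

Iterating this assignment and re-estimating each class's inherent seismicity until convergence is what makes the algorithm self-consistent.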

  19. We allow for several different “model earthquake” focal mechanisms on each plate boundary step. When the step is oblique to relative plate motion (the general case), the Earth may produce either oblique- slip EQs, or sets of partitioned-slip EQs:

  20. For SUB steps, the depth PDF function D associated with each “model earthquake” helps to separate mechanisms expected to be shallow (green curve) from those expected to be within the slab (blue curve, for case of slab top at 25 km depth) and those thrust events expected to lie along the slab-top plate interface (a Gaussian PDF centered on this depth; not shown here).
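The interface-thrust depth PDF described above — a Gaussian centered on the slab-top depth — might look like the sketch below; the 10-km width is an illustrative assumption, not a value from the study:

```python
import math

def interface_depth_pdf(depth_km, slab_top_km=25.0, sigma_km=10.0):
    """Gaussian depth PDF for 'model earthquakes' on the slab-top plate
    interface of a SUB step (width sigma_km is an assumed value)."""
    z = (depth_km - slab_top_km) / sigma_km
    return math.exp(-0.5 * z * z) / (sigma_km * math.sqrt(2.0 * math.pi))
```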

  21. The final classification factor (E) “grades” a possible match on the angular discrepancy between actual and model focal mechanism:
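A hedged sketch of such a grading factor. Here the discrepancy is proxied by the angle between unit-normalized moment tensors (a Frobenius inner product), tapered linearly to zero at an assumed 60° cutoff; the study's actual E factor grades on the minimum 3-D rotation angle between double couples, which is more involved to compute:

```python
import numpy as np

def mechanism_grade(m_actual, m_model, max_angle_deg=60.0):
    """Grade (1 = perfect, 0 = unacceptable) for the match between an
    actual and a model focal mechanism, using the angle between the two
    moment tensors in 6-D tensor space as a stand-in for the true
    double-couple rotation angle."""
    a = np.asarray(m_actual, dtype=float)
    b = np.asarray(m_model, dtype=float)
    cosang = np.tensordot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b))
    angle = np.degrees(np.arccos(np.clip(cosang, -1.0, 1.0)))
    return max(0.0, 1.0 - angle / max_angle_deg)

strike_slip = [[0.0, 1.0, 0.0], [1.0, 0.0, 0.0], [0.0, 0.0, 0.0]]
rotated     = [[1.0, 0.0, 0.0], [0.0, -1.0, 0.0], [0.0, 0.0, 0.0]]
perfect = mechanism_grade(strike_slip, strike_slip)   # ~1.0
poor    = mechanism_grade(strike_slip, rotated)       # 0.0 (90 deg apart in tensor space)
```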

  22. The result…

  23. The frequency/moment distribution that we fit to these subcatalogs is the tapered Gutenberg-Richter distribution.
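For reference, the tapered Gutenberg-Richter form (slide 25 calls these "tapered G-R distributions") can be written as a survivor function in scalar seismic moment $M$, following Kagan's formulation:

```latex
G(M) \;=\; \left(\frac{M_t}{M}\right)^{\beta}
\exp\!\left(\frac{M_t - M}{M_c}\right),
\qquad M \ge M_t ,
```

where $M_t$ is the catalog threshold moment, $\beta$ is the spectral slope (related to the Gutenberg-Richter value by $b = 1.5\beta$), and $M_c$ is the corner moment corresponding to corner magnitude $m_c$.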

  24. Advantages of this distribution: • Simple (only one more parameter than G-R); • Has a finite integrated moment (unlike G-R) for b < 1; • Fits global subcatalogs slightly better than the gamma distribution.

  25. The maximum-likelihood method is used to determine the parameters of these tapered G-R distributions (and their uncertainties): An ideal case (both parameters determined) A typical case (corner magnitude unbounded from above)
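A sketch of such a maximum-likelihood fit. The density follows from the tapered G-R survivor function; the synthetic subcatalog exploits the fact that a tapered-Pareto variate is the minimum of a Pareto variate and a shifted exponential variate (their survivor functions multiply). The threshold and "true" parameter values are illustrative, not those of any subcatalog in the study:

```python
import numpy as np
from scipy.optimize import minimize

def neg_log_likelihood(params, moments, m_t):
    """Negative log-likelihood of the tapered Gutenberg-Richter density
    f(M) = (beta/M + 1/Mc) * (Mt/M)**beta * exp((Mt - M)/Mc), M >= Mt,
    with the corner moment parametrized as log10(Mc) for conditioning."""
    beta, log10_mc = params
    mc = 10.0 ** log10_mc
    if beta <= 0.0:
        return np.inf
    return -np.sum(np.log(beta / moments + 1.0 / mc)
                   + beta * np.log(m_t / moments)
                   + (m_t - moments) / mc)

# Synthetic subcatalog of scalar moments (N m):
rng = np.random.default_rng(1)
n, m_t = 4000, 1.0e17                      # threshold ~ magnitude 5.3
beta_true, mc_true = 0.65, 3.0e19          # corner ~ magnitude 7.0
pareto = m_t * (1.0 + rng.pareto(beta_true, n))
expo = m_t + rng.exponential(mc_true, n)
moments = np.minimum(pareto, expo)         # exact tapered-Pareto sample

fit = minimize(neg_log_likelihood, x0=[0.5, 19.0],
               args=(moments, m_t), method="Nelder-Mead")
beta_hat, mc_hat = fit.x[0], 10.0 ** fit.x[1]
```

Fixing b (equivalently beta) at its 1977-2002 CMT value and optimizing only log10_mc, as slide 29 describes for the merged catalogs, is the same minimization with one parameter held constant.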

  26. [Figure: fitted frequency/moment distribution annotated with the threshold magnitude and the 95%-confidence lower and upper limits on corner magnitude; the unbounded upper limit is marked “not to be taken literally!” (“a large number”).]

  27. Review of results on spectral slope, b: Although there are variations, none is significant at 95% confidence. Kagan’s [1999] hypothesis of uniform b still stands.

  28. In many cases, subcatalogs obtained from the Harvard CMT catalog for non-orogen regions are not large enough to define 95%-confidence upper limits on the corner magnitudes. We next enlarged some of our subcatalogs in three ways: • included events of 1976 AD from the catalog of Ekström & Nettles [1997] (mt = 6.28); • included events of 1900-1975 AD from the catalog of Pacheco & Sykes [1992] (mt = 7.10); • included plate-boundary-associated events from within the 13 orogens of PB2002:

  29. But it is necessary to be careful: • Catalog data from 1900-1975 are less accurate in every way (moment/magnitude, location, depth, focal mechanism?), and therefore these events are more likely to be misclassified. • The high catalog threshold (mt = 7.1) makes b very hard to determine, and risks biasing mc values that are smaller than it. • We chose not to work with merged subcatalogs for OSR and OTF/medium-fast, where we already know that mc < 7.1. • We fix b at the value determined from the 1977-2002 Harvard CMT catalog, and optimize only the corner magnitude mc.
