State of Practice of Seismic Hazard Analysis: From the Good to the Bad

  1. State of Practice of Seismic Hazard Analysis: From the Good to the Bad. Norm Abrahamson, Seismologist, Pacific Gas & Electric Company

  2. Seismic Hazard Analysis • Approaches to design ground motion • Deterministic • Probabilistic (PSHA) • Continuing debate in the literature about PSHA • Time Histories • Scaling • Spectrum compatible

  3. Seismic Hazard Approaches • Deterministic approach • Rare earthquake selected • Median or 84th percentile ground motion • Probabilistic approach • Probability of ground motion selected • Return period defines rare • Performance approach • Probability of damage states of structure • Structural fragility needed • Risk approach • Probability of consequence • Loss of life • Dollars

  4. Deterministic vs Probabilistic • Deterministic • Consider a small number of scenarios (magnitude, distance, number of standard deviations of ground motion) • Choose the largest ground motion from the cases considered • Probabilistic • Consider all possible scenarios (all magnitudes, distances, and numbers of standard deviations) • Compute the rate of each scenario • Combine the rates of scenarios with ground motion above a threshold to determine the probability of “exceedance”
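The probabilistic bookkeeping on this slide can be sketched in a few lines: for each scenario, multiply its annual rate by the probability that its (lognormal) ground motion exceeds the threshold, then sum. The scenario rates, median PGA values, and sigma below are illustrative assumptions, not output of any real source or ground-motion model.

```python
import math

def p_exceed(ln_threshold, ln_median, sigma):
    # P(ln GM > ln threshold) for a lognormal ground-motion distribution
    z = (ln_threshold - ln_median) / sigma
    return 0.5 * math.erfc(z / math.sqrt(2.0))

# (annual rate, median PGA in g) for three hypothetical scenarios
scenarios = [(0.01, 0.10), (0.002, 0.30), (0.0005, 0.60)]
sigma = 0.6        # aleatory standard deviation in ln units (assumed)
threshold = 0.3    # PGA level of interest, in g

# Combine the rates of all scenarios with ground motion above the threshold
rate = sum(r * p_exceed(math.log(threshold), math.log(med), sigma)
           for r, med in scenarios)
print(rate)  # annual rate of exceeding 0.3 g
```

Note that every scenario contributes to the exceedance rate, including those whose median is below the threshold; that is the key difference from picking the single largest deterministic case.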

  5. Deterministic Approach • Select a specific magnitude and distance (location) • For dams, typically the “worst-case” earthquake (MCE) • Design for ground motion, not earthquakes • Ground motion has large variability for a given magnitude, distance, and site condition • Key issue: what ground motion level do we select?

  6. 2004 Parkfield Near-Fault PGA Values

  7. Worst-Case Ground Motion is Not Selected in Deterministic Approach • Combining the largest earthquake with the worst-case ground motion is too unlikely a case • The occurrence of the maximum earthquake is rare, so it is not “reasonable” to use a worst-case ground motion for this earthquake • Choose something smaller than the worst-case ground motion that is “reasonable”

  8. What is “Reasonable”? • The same number of standard deviations of ground motion may not be “reasonable” for all sources • The median may be reasonable for low-activity sources, but a higher value may be needed for high-activity sources • Need to consider both the rate of the earthquake and the chance of the ground motion

  9. Components of PSHA • Source Characterization • Size, location, mechanism, and rates of earthquakes • Ground motion characterization • Ground motion for a given earthquake • Site Response • Amplification of ground motion at a site • Hazard Analysis • Hazard calculation • Select representative scenarios • Earthquake scenario and ground motion

  10. Selected Issues in Current Practice • (Less) Common Problems with current Practice • Max magnitude • VS30 • Spatial smoothing of seismicity • Double counting some aspects of ground motion variability • Epistemic uncertainties • Mixing of epistemic and aleatory on the logic tree • Underestimation of epistemic uncertainties • Over-estimation of epistemic uncertainties • Hazard reports / hand off of information • UHS and Scenario Spectra

  11. Common Misunderstandings • Distance Measures • Different distance metrics for ground motion models often used interchangeably • Rupture distance • JB distance • Rx (new for NGA models) • Hypocentral distance • Epicentral distance • Return Period and Recurrence Interval used interchangeably • Recurrence interval used for earthquakes • Return period for ground motion at a site

  12. Common Misunderstandings • Standard ground motion models thought to give the larger component • Most ground motion models give the average horizontal component • Average is more robust for regression • Scale factors have been available to compute the larger component • Different definitions of what is the larger component • Larger for a random orientation • Larger for all orientations • Sa(T) corresponding to the larger PGA • Can be lower than the average!

  13. Use and Misuse of VS30 • VS30 is not the fundamental physical parameter • For typical sites, VS30 is correlated with the deeper Vs profile • Most soil sites are in alluvial basins (deep soils) • CA empirical models are not applicable to shallow soil sites • Proper use • Clear hand-off between ground motion and site response • Consistent definition of “rock” • Use for deep soil sites that have typical profiles • Misuse • Replacing site-specific analysis for any profile (not typical of those in the GM database) • Using ground motion models with VS30 for shallow soil sites (CA models) • Need to select a deeper layer and conduct a site response study, or use models with both soil depth and VS30

  14. Sloppy Use of Terms: Mmax • Most hazard reports list a maximum magnitude for each source • It has different meanings for different types of sources • Zones: maximum magnitude, usually applied to the exponential model • Faults: mean magnitude for full rupture, usually applied to characteristic-type models • Allows for earthquakes larger than Mmax • Called the mean characteristic earthquake • Issue • Some analyses use the exponential model for faults or characteristic models for regions • Not clear how to interpret Mmax • Improved practice • Define both Mmax and Mchar in hazard reports

  15. Terminology • Aleatory Variability (random) • Randomness in M, location, ground motion (e) • Incorporated in hazard calculation directly • Refined as knowledge improves • Epistemic Uncertainty (scientific) • Due to lack of information • Incorporated in PSHA using logic trees (leads to alternative hazard curves) • Reduced as knowledge improves

  16. Aleatory and Epistemic • For mean hazard, not important to keep separate • Good practice • Keep aleatory and epistemic separate • Not always easy • Allows identification of key uncertainties, guides additional studies, future research • Source characterization • Common to see some aleatory variability in the logic tree (treated as epistemic uncertainty) • Rupture behavior (segmentation, clustering) • Ground motion characterization • Standard practice uses the ergodic assumption • Some epistemic uncertainty is treated as aleatory variability

  17. Example: Unknown Die • Observed outcome of four rolls of a die • 3, 4, 4, 5 • What is the model of the die? • Probabilities for future rolls (aleatory) • How well do we know the model of the die? • Develop alternative models (epistemic)
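The die example can be sketched with two hypothetical candidate models (a fair die and one loaded toward the middle faces; both are assumptions for illustration). Each model's face probabilities are the aleatory part, while the likelihood-based weights between the alternative models represent the epistemic part.

```python
from fractions import Fraction

observations = [3, 4, 4, 5]  # the four observed rolls

# Two hypothetical candidate models for P(face); each sums to 1
fair = {f: Fraction(1, 6) for f in range(1, 7)}
loaded = {f: Fraction(1, 10) for f in (1, 2, 6)}
loaded.update({f: Fraction(7, 30) for f in (3, 4, 5)})

def likelihood(model):
    # Probability of the observed rolls under a given model (aleatory part)
    p = Fraction(1)
    for roll in observations:
        p *= model[roll]
    return p

# Relative weights for the alternative models (epistemic part), equal priors
lf, ll = likelihood(fair), likelihood(loaded)
w_fair = lf / (lf + ll)
w_loaded = ll / (lf + ll)
print(float(w_fair), float(w_loaded))
```

More rolls would sharpen the weights (reducing the epistemic uncertainty) without ever removing the roll-to-roll randomness, which mirrors the distinction on the previous slide.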

  18. Unknown Die Example

  19. Epistemic Uncertainty • Less data/knowledge implies greater epistemic uncertainty • In practice, this is often not the case • Tend to consider only available (e.g. published) models • More data/studies leads to more available models • Greater epistemic uncertainty included in PSHA

  20. Characterization of Epistemic Uncertainty • Regions with little data • Tendency to underestimate epistemic • With little data, use simple models • Often assume that the simple model is correct with no uncertainty • Regions with more data • Broader set of models • More complete characterization of epistemic • Sometimes overestimates epistemic

  21. Underestimation of Epistemic Uncertainty • Standard practice: if there are no data on the time of the last earthquake, assume the Poisson model only • Good practice: scale the Poisson rates to capture the range from the renewal model

  22. Overestimate of Epistemic Uncertainty • Rate constrained by paleo-earthquake recurrence: 600 years for full rupture • Mean characteristic magnitude = 9.0 • Alternative magnitude distributions considered as epistemic uncertainty • The exponential model is brought along with low weight, but it leads to over-estimation of uncertainty

  23. Epistemic Uncertainty • Good Practice • Consider alternative credible models • Use minimum uncertainty for regions with few available models • Check that observations are not inconsistent with each alternative model • Poor Practice • Models included because they were used in the past • Trouble comes from applying models in ways not consistent with their original development • E.g. exponential model intended to fit observed rates of earthquakes, not to be scaled to fit paleo-seismic recurrence intervals

  24. Ground Motion Models • Aleatory • Standard practice to use published standard deviations • Ergodic assumption - GM median and variability are the same for all data used in the GM model • Standard deviation applies to a single site / single path • Epistemic • Standard practice to use alternative available models (median and standard deviation) • Do the available models cover the epistemic uncertainty? • Issue with use of NGA models

  25. Problems with Current Practice • Major problems have been related to the ground motion variability • Ignoring the ground motion variability • Assumes σ=0 for ground motion • Considers including ground motion σ as a conservative option • This is simply wrong • Applying severe truncation to the ground motion distribution • e.g. distribution truncated at +1σ • Ground motions above 1σ are considered unreasonable • No empirical basis for truncation at less than 3σ • Physical limits of the material will truncate the distribution
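The truncation problem is easy to quantify with a small sketch: for a standard normal epsilon truncated at some maximum, compute the probability of exceeding a given epsilon level. Truncating at +1 sigma declares a +2 sigma motion impossible, even though such motions are observed. The sigma levels used are illustrative.

```python
import math

def phi(z):
    # Standard normal CDF via the complementary error function
    return 0.5 * math.erfc(-z / math.sqrt(2.0))

def p_exceed_truncated(eps, eps_max):
    # P(epsilon > eps) when the distribution is truncated at +eps_max
    if eps >= eps_max:
        return 0.0
    return (phi(eps_max) - phi(eps)) / phi(eps_max)

# Exceedance probability of a +2 sigma motion: essentially untruncated
# (cut at +10 sigma) versus severely truncated at +1 sigma
print(p_exceed_truncated(2.0, 10.0))
print(p_exceed_truncated(2.0, 1.0))
```

With a +3 sigma cut the untruncated probabilities are barely changed, which is why a truncation below 3 sigma, not truncation itself, is what lacks empirical support.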

  26. Example of GM Variability

  27. GM Variability Example

  28. GM Truncation Effects (Bay Bridge)

  29. 2004 Parkfield

  30. Ergodic Assumption • Trade space for time

  31. Mixing Epistemic and Aleatory (in Aleatory)

  32. Standard Deviations for LN PGA

  33. Single Ray Path

  34. Standard Deviations for LN PGA

  35. Removing the Ergodic Assumption • Significant reduction in the aleatory variability of ground motion • 40-50% reduction for single path - single site
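The variance bookkeeping behind that reduction can be sketched as follows, assuming illustrative values for the systematic site-to-site and path-to-path components (once the ergodic assumption is removed, those components move to the epistemic side):

```python
import math

# Illustrative (assumed) components of the total aleatory sigma, in ln units
sigma_total = 0.65     # global (ergodic) standard deviation
site_term = 0.40       # systematic site-to-site component
path_term = 0.35       # systematic path-to-path component

# Remove the systematic components in variance (not sigma) space to get
# the single-site, single-path aleatory standard deviation
sigma_single = math.sqrt(sigma_total**2 - site_term**2 - path_term**2)
reduction = 1.0 - sigma_single / sigma_total
print(sigma_single, reduction)
```

Because the components combine in variance space, even moderate site and path terms remove a large fraction of the apparent aleatory variability, consistent with the 40-50% reduction quoted above.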

  36. Hazard Example

  37. Die: combine rolls (ergodic)

  38. Non-Ergodic: Reduced Aleatory

  39. Removing the Ergodic Assumption • Penalty: must include increased epistemic uncertainty • Requires a model for the median ground motion for a specific path and site • Benefits come with constraints on the median • Data • Numerical simulations • Current state of practice • Most studies use the ergodic assumption • Mean hazard is OK, given no site/path-specific information • Some use of reduced standard deviations (reduced aleatory), but without the increased epistemic • Underestimates the mean hazard • Bad practice

  40. Non-Ergodic: Increased Epistemic

  41. Standard Deviations for Surface Fault Rupture

  42. Removing the Ergodic Assumption • Single site aleatory variability • Much smaller than global variability • Value of even small number of site-specific observations

  43. Large Impacts on Hazard

  44. Keeping Track of Epistemic and Aleatory • If no new data • Broader fractiles • No impact on mean hazard • Provides a framework for incorporation of new data as it becomes available • Identifies key sources of uncertainty • Candidates for additional studies • Shows clear benefits of collecting new data

  45. Hazard Reports • Uniform Hazard Spectra • The UHS is an envelope of the spectra from a suite of earthquakes • Standard practice hazard report includes: • UHS at a range of return periods gives the level of the ground motion • Deaggregation at several spectral periods for each return period identifies the controlling M,R • Good practice hazard report includes: • UHS • Deaggregation • Representative scenario spectra that make up the UHS. • Conditional Mean Spectra (CMS)

  46. Crane Valley Dam Example • Controlling scenarios from deaggregation • For return period = 1500 years: • Sa(T=0.2 s): M=5.5-6.0, R=20-30 km • Sa(T=2 s): M=7.5-8.0, R=170 km

  47. Scenario Ground Motions (Baker and Cornell Approach: Conditional Mean Spectra) • Find the number of standard deviations needed to reach the UHS • Next, construct the rest of the spectrum
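The two steps can be sketched in code. The median spectrum, sigma, target UHS value, and correlation coefficients below are all illustrative assumptions, not the published Baker and Cornell values: epsilon at the conditioning period is scaled down at other periods by the epsilon correlation.

```python
import math

# Hypothetical inputs (all assumed for illustration)
periods = [0.2, 0.5, 1.0, 2.0]
median_sa = {0.2: 0.50, 0.5: 0.35, 1.0: 0.20, 2.0: 0.10}  # in g
sigma = 0.6                  # ln-units standard deviation
tstar = 1.0                  # conditioning period T*, in seconds
uhs_at_tstar = 0.45          # UHS value at T*, in g
rho = {0.2: 0.45, 0.5: 0.70, 1.0: 1.00, 2.0: 0.75}  # correlation with T*

# Step 1: number of standard deviations needed to reach the UHS at T*
eps_star = (math.log(uhs_at_tstar) - math.log(median_sa[tstar])) / sigma

# Step 2: construct the rest of the spectrum using the correlated epsilon
cms = {t: median_sa[t] * math.exp(rho[t] * eps_star * sigma) for t in periods}
print(eps_star)
print(cms)  # equals the UHS at T*; other periods use the reduced epsilon
```

Because the correlation is below 1 away from T*, the conditional spectrum falls below the UHS everywhere except the conditioning period, which is why the UHS is an envelope rather than a realistic single-event spectrum.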

  48. Correlation of Epsilons (T=1.5 s and T=0.3 s)

  49. Correlation of Variability • Correlation decreases away from the reference period • The increase at short periods results from the nature of the Sa spectral slope

  50. Scenario Spectra for UHS • Develop a suite of deterministic scenarios that comprise the UHS • Time histories should be matched to the scenarios individually, not to the entire UHS