
High Performance Computing at SCEC




Presentation Transcript


  1. High Performance Computing at SCEC Scott Callaghan Southern California Earthquake Center University of Southern California

  2. Why High Performance Computing? • What is HPC? • Using large machines with many processors to compute quickly • Why is it important? • Only way to perform large-scale simulations • Two main types of SCEC HPC projects • What kind of shaking will this earthquake cause in a region? • What kind of shaking will this single location experience?

  3. SCEC Scenario Simulations • Simulations of individual earthquakes • Determine shaking over a region caused by a single event (usually M > 7) [Figure: peak ground velocities for a Mw 8.0 Wall-to-Wall scenario on the San Andreas Fault (1 Hz), calculated using AWP-ODC on NICS Kraken]

  4. Simulating Large Events • Must break up the work into pieces • Most commonly, spatially • Give work to each processor • Run a timestep • Communicate with neighbors • Repeat • As the number of processors increases, it gets harder to maintain good performance
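A minimal sketch of this timestep loop, assuming a 1-D spatial decomposition and the mpi4py package; the grid size, step count, and simple stencil update are illustrative stand-ins, not SCEC's actual wave-propagation solvers (e.g., AWP-ODC).

```python
import numpy as np
from mpi4py import MPI

comm = MPI.COMM_WORLD
rank = comm.Get_rank()
size = comm.Get_size()

nx = 1000                       # grid points owned by this rank (illustrative)
u = np.zeros(nx + 2)            # local slab plus one ghost cell on each side

# Neighboring ranks in the 1-D decomposition; PROC_NULL at the domain edges
left = rank - 1 if rank > 0 else MPI.PROC_NULL
right = rank + 1 if rank < size - 1 else MPI.PROC_NULL

for step in range(100):         # fixed step count, just for the sketch
    # Communicate with neighbors: exchange ghost cells in both directions
    comm.Sendrecv(u[1:2], dest=left, recvbuf=u[nx + 1:nx + 2], source=right)
    comm.Sendrecv(u[nx:nx + 1], dest=right, recvbuf=u[0:1], source=left)
    # Run a timestep: update interior points (stand-in for the real solver)
    u[1:nx + 1] += 0.1 * (u[:nx] - 2.0 * u[1:nx + 1] + u[2:])
```

As the processor count grows, the interior work per rank shrinks while the neighbor communication per step stays roughly fixed, which is one reason good scaling becomes harder.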

  5. Probabilistic Seismic Hazard Analysis • Builders ask seismologists: “What will the peak ground motion be at my new building in the next 50 years?” • Seismologists answer this question using Probabilistic Seismic Hazard Analysis (PSHA) • PSHA results used in building codes, insurance • California building codes impact billions of dollars of construction yearly

  6. PSHA Reporting • PSHA information is relayed through • Hazard curves (for one location) • Hazard maps (for a region) [Figures: hazard curve for downtown LA, showing 0.6 g at a 2% probability of exceedance in 50 years; hazard map of the probability of exceeding 0.1 g in 50 years]
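A minimal sketch of reading a "2% in 50 years" value off a hazard curve, assuming the curve is stored as arrays of ground-motion levels and their 50-year exceedance probabilities; the numbers below are made up for illustration.

```python
import numpy as np

# Hazard curve for one site: ground-motion level vs. 50-year exceedance probability
accel_g = np.array([0.1, 0.2, 0.4, 0.6, 0.8, 1.0])           # spectral accel (g)
prob_50yr = np.array([0.60, 0.30, 0.08, 0.02, 0.007, 0.002])  # P(exceedance)

# np.interp needs increasing x values, so interpolate on the reversed curve
level_2pct = np.interp(0.02, prob_50yr[::-1], accel_g[::-1])
print(level_2pct)   # ground motion with a 2% chance of exceedance in 50 years
```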

  7. PSHA Methodology Pick a location of interest. Define what future earthquakes might happen. Estimate the magnitude and probability of each earthquake from an earthquake rupture forecast (ERF). Determine the shaking caused by each earthquake at the site of interest. Aggregate the shaking levels with the probabilities to produce a hazard curve. Repeat for multiple sites for a hazard map. Traditionally, the shaking estimates come from attenuation relationships.
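A minimal sketch of the aggregation step, assuming each rupture carries an annual rate from the ERF and an estimated probability that its shaking exceeds the level of interest at the site, and that ruptures occur as independent Poisson processes; the values and names are illustrative. Repeating over shaking levels gives the full hazard curve, and repeating over sites gives a hazard map.

```python
import math

# (annual_rate, prob_shaking_exceeds_level) for each rupture at one site
ruptures = [(0.01, 0.30), (0.002, 0.80), (0.05, 0.05)]

def prob_exceedance(ruptures, years=50.0):
    """Probability of exceeding the shaking level at least once in `years`."""
    # Annual rate of exceedance, summed over all ruptures
    annual_rate = sum(rate * p_exceed for rate, p_exceed in ruptures)
    # Poisson assumption converts a rate into a probability over a time window
    return 1.0 - math.exp(-annual_rate * years)

print(prob_exceedance(ruptures))   # one point on the hazard curve
```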

  8. CyberShake Approach • Uses a physics-based approach • 3-D ground motion simulation with anelastic wave propagation • Considers ~415,000 rupture variations per site • 7,000 ruptures in the ERF • <200 km from the site of interest • Magnitude >6.5 • Add variability • More accurate than traditional attenuation methods • 100+ sites in Southern California needed to calculate a hazard map [Figure: probability-of-exceedance curves for site LADT at SA 3.0 s; blue and green are common attenuation relationships, black is CyberShake]
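A minimal sketch of the rupture selection described above, keeping ERF ruptures within 200 km of the site with magnitude above 6.5; the Rupture class, its fields, and the numbers are illustrative, not the actual ERF data model.

```python
from dataclasses import dataclass

@dataclass
class Rupture:
    magnitude: float
    distance_km: float   # closest distance from the rupture surface to the site
    n_variations: int    # hypocenter/slip variations generated for this rupture

# A toy ERF; the real forecast contains thousands of ruptures
erf = [Rupture(7.2, 35.0, 80), Rupture(6.1, 10.0, 20), Rupture(7.8, 250.0, 150)]

# Keep ruptures that are both close enough and large enough to matter at this site
selected = [r for r in erf if r.distance_km < 200.0 and r.magnitude > 6.5]
total_variations = sum(r.n_variations for r in selected)
print(len(selected), total_variations)
```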

  9. Results [Figure: hazard map from an attenuation relationship compared with the CyberShake hazard map]

  10. Results (difference) [Figure: CyberShake map compared to the attenuation map, shown alongside population density]

  11. Some recent numbers • Wall-to-wall simulation • 2 TB output • 100,000 processors • CyberShake • Hazard curves for 223 sites • 8.5 TB output files • 46 PB of file I/O • 190 million jobs executed • 4500 processors for 54 days
