
ARSC Initiatives in Weather, Climate and Ocean Modeling


Presentation Transcript


  1. ARSC Initiatives in Weather, Climate and Ocean Modeling Dr. Gregory Newby, Chief Scientist, Arctic Region Supercomputing Center. Presentation to the HPC User Forum, September 9, 2008

  2. Common Themes with Other User Forum Presenters
  • Lots of work with WRF
  • Work with CCSM
  • Mature community-developed applications that scale reasonably well and are hungry for more CPU power
  • Desire for higher resolution to better approximate reality
  • Desire to add increasingly sophisticated processes to the phenomena under investigation
  • Therefore, these items will not be major themes of today’s talk

  3. ARSC Themes of Potential Interest to the HPC User Forum
  • Performance analysis of current (multicore) and forthcoming (Cell, GPU, FPGA, T2+) processors for WRF
  • Quasi-operational WRF with NWS/NOAA users
  • Wildfire smoke prediction based on WRF/Chem
  • Model coupling
  • High-resolution ocean forecasts
  • Arctic ice tide impacts
  • WRF sensitivity analysis
  • These will be the major themes of today’s talk

  4. Multicore Performance Analysis on ARSC’s Supercomputers
  • Mostly on Midnight (dual-core)
    • Sun x2200m2 & x4600 cluster nodes
    • 2312 Opteron cores, InfiniBand, 4 GB/core, Lustre
    • 12.88 TFLOP theoretical peak
    • Benchmarks
    • Real-world applications
  • Forthcoming: Pingo (quad-core)
    • Cray XT5
    • 3456 cores, SeaStar, 4 GB/core, Lustre
    • About 30 TFLOP theoretical peak
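
As a rough sanity check on those peak numbers: theoretical peak is just cores × clock × floating-point operations per cycle. A minimal Python sketch; the clock rate and flops/cycle figures below are my assumptions for illustration, not values from the slides.

```python
# Theoretical peak = cores x clock (GHz) x FP operations per cycle.
def peak_tflops(cores, ghz, flops_per_cycle):
    """Theoretical peak performance in TFLOP/s."""
    return cores * ghz * flops_per_cycle / 1000.0

# Pingo (Cray XT5): 3456 cores; assuming 2.3 GHz quad-core Opterons
# doing 4 double-precision flops/cycle (assumed, not from the slides).
print(peak_tflops(3456, 2.3, 4))  # ~31.8, consistent with "about 30 TFLOP"
```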

  5. 2-socket AMD64 topology
[Diagram: two dual-core packages, CPU-0,1 and CPU-2,3, linked by HyperTransport.] One 1 GHz 16x16 HyperTransport link per supported processor, with 8 GB/second of bandwidth.

  6. 8-socket AMD64: Sun Fire X4600 Server

  7. 8-socket AMD64 topology
[Diagram: eight sockets, CPU0 through CPU7, connected by HyperTransport links.] All HT links operate at 1 GHz and 8 GB/s.

  8. WRF Performance on Midnight
[Chart: time required to compute 1 forecast hour on the 3-nest domain.]

  9. Next-Generation Processor Performance
  • GPU, Cell, quad-core Xeon & Opteron, FPGA, CMT
  • Many benchmarks completed, some applications
  • Forthcoming: WRF test cases for AK domains (submitted for AGU 2008)
  • GPU (nVidia 9800 GTX), possibly Firestream
  • FPGA: WRF doesn’t have enough “hot spots,” and porting BLAS is too much work (but RapidMind might help)
  • Cell: QS22 cluster being configured; should work well
  • UltraSPARC T2+: scaled well to 8 threads per socket; WRF is too CPU-bound to benefit much from T2+ CMT

  10. Quasi-Operational and Research Weather Modeling
  • Work has been evolving since 2005
  • Initially, efforts were directed at quasi-operational, scheduled WRF forecasts for the Fairbanks Weather Forecast Office (WFO). The goal was to leverage lots of CPUs to provide high-resolution forecasts, and to continually assess & improve
  • The scheduled runs continue to dominate our efforts, but rigorous analysis and study have yielded important results

  11. 2-way Nested Domains for ARSCwrf
[Map: three nested domains of 421x328x75, 200x200x75, and 151x151x75 cells.]
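
For a sense of scale, the per-nest grid-point counts implied by those dimensions, as a quick Python calculation; the outer-to-inner ordering is my assumption based on the horizontal sizes.

```python
# Grid-point counts for the three ARSCwrf nests (nx, ny, nz).
# The outer-to-inner ordering is assumed, not stated on the slide.
nests = {"outer": (421, 328, 75), "middle": (200, 200, 75), "inner": (151, 151, 75)}
for name, (nx, ny, nz) in nests.items():
    print(f"{name}: {nx * ny * nz:,} cells")
# outer: 10,356,600  middle: 3,000,000  inner: 1,710,075
```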

  12. Operational WRF – AWIPS
This is the end product of our twice-daily WRF runs, as seen on the NWS Advanced Weather Interactive Prediction System at the Fairbanks WFO (FAI).

  13. Value to WFO FAI
  • Grid resolution is approximately 5 km. The ability to populate forecast grids with a reasonably accurate, high-resolution weather model saves the forecasters a lot of time. In the best case, the model output becomes the NDFD (National Digital Forecast Database) product.

  14. Verification Products
  • Four days after a forecast, we automatically retrieve all available observations and tabulate a comparison of the WRF forecast for each location vs. the observations
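
A minimal Python sketch of this kind of forecast-vs-observation tabulation for one station and one variable; the sample numbers are hypothetical, and the operational product aggregates all available observations per location.

```python
import math

def verify(pairs):
    """Tabulate bias and RMSE from (forecast, observation) pairs.
    A toy version of the per-location comparison described above."""
    errors = [f - o for f, o in pairs]
    bias = sum(errors) / len(errors)
    rmse = math.sqrt(sum(e * e for e in errors) / len(errors))
    return bias, rmse

# Hypothetical 2 m temperature pairs (deg C) for one station:
pairs = [(-3.1, -2.5), (-1.0, -0.4), (0.8, 1.5), (2.2, 2.0)]
bias, rmse = verify(pairs)
print(f"bias = {bias:+.2f} C, RMSE = {rmse:.2f} C")
```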

  15. Data Assimilation
  • Motivation: a better set of initial conditions
  • Current initial conditions are interpolated from coarser-resolution model runs – clearly much room for error
  • With data assimilation, we perturb our original input grid by carefully assimilating available observations
  • We are currently adding this capability to our operational runs; once it is implemented, we will perform assimilated and non-assimilated runs side by side for comparison

  16. Data Assimilation
  • Test case: 48-hour forecast starting at 00Z on 01 May 2007
  • Vertical soundings from raobs and satellite: temperature, dewpoint, pressure, winds
  • Surface obs: temperature, dewpoint, pressure, winds
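
To make the “perturb the input grid with observations” idea concrete, here is a toy scalar analysis update in the optimal-interpolation spirit. All numbers are illustrative assumptions; a real system solves this over the full 3-D grid with spatial error covariances.

```python
def assimilate(background, obs, sigma_b2, sigma_o2):
    """Blend one observation into one background (first-guess) value.
    The weight (gain) depends on background vs. observation error
    variances: the larger the background error, the more we trust
    the observation. A scalar sketch only."""
    weight = sigma_b2 / (sigma_b2 + sigma_o2)
    return background + weight * (obs - background)

# Hypothetical surface temperature: first guess -5.0 C, raob says -3.0 C.
analysis = assimilate(-5.0, -3.0, sigma_b2=1.0, sigma_o2=1.0)
print(analysis)  # -4.0 C: halfway, since the error variances are equal
```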

  17. Alaska Air Quality Impacted by Wildfires
South Fairbanks, June 28, 2004: air quality particulate level at approximately 900 micrograms/cubic meter.
South Fairbanks, July 6, 2004: air quality particulate level at approximately 10 micrograms/cubic meter.
Photos courtesy of Dr. James Conner, FNSB. http://www.dec.state.ak.us/air/am/2004_wf_sum.htm

  18. Initial Architecture: WRF/Chem Smoke Dispersion System
[Flow diagram: MODIS fire detection & burn area, DEM and static fuel data, and a WRF-MET meteorology forecast feed fuel moisture estimates; FOFEM emission factors yield fire emissions, which with fire spread and plume dynamics produce WRF/Chem gridded hourly emissions; an identically initialized WRF domain (SI/WPS) then drives the WRF-Chem particulate, chemical & meteorological forecast, whose netCDF output goes to postprocessing.]
Adapted from a scheme used by our partners from the US Forest Service Fire Science Lab in Missoula.
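
A schematic, heavily simplified sketch of that data flow. Every function and number below is a hypothetical stub standing in for a separate program (MODIS processing, FOFEM, WRF/Chem) that in reality exchanges files rather than Python objects.

```python
def detect_fires_modis(cycle):
    # MODIS fire detection & burn area (one made-up example fire)
    return [{"lat": 65.1, "lon": -147.5, "area_ha": 120.0}]

def run_wrf_met(cycle):
    # Meteorology-only WRF forecast feeding fuel-moisture estimates
    return {"recent_rain_mm": 0.0}

def fofem_emissions(fires, met):
    # FOFEM-style consumption/emission estimate; 50 kg PM2.5/ha/hr
    # is an invented number for illustration only.
    return [50.0 * f["area_ha"] for f in fires]

def grid_hourly_emissions(per_fire):
    # Map per-fire emissions onto the WRF/Chem grid (here: just a total)
    return sum(per_fire)

def run_smoke_forecast(cycle):
    fires = detect_fires_modis(cycle)
    met = run_wrf_met(cycle)
    total = grid_hourly_emissions(fofem_emissions(fires, met))
    print(f"{cycle}: {total:.0f} kg/hr PM2.5 handed to WRF/Chem")

run_smoke_forecast("2008-09-08T09Z")
```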

  19. Fire Detection and Burn Area Example

  20. Example Product for Monday, September 8, 2008, 9:00 UTC
A weak smoke concentration in the Fairbanks area, caused by northeasterly winds and extensive fires north and east of the Yukon, was clearly confirmed. Source: smoke.uaf.edu

  21. Coupling Work: Regional Arctic Climate Model & More
  • Practical ongoing work, DoE funded. Wieslaw Maslowski, PI.
  • ARSC is also looking at some more general approaches to coupled climate models, including adding permafrost, ice sheets, hydrology, ecosystems, human infrastructure, and coastal erosion. Major partner: International Arctic Research Center, UAF.
  • RACM is a regional Arctic climate system model including WRF, VIC (land surface model), and NAPC (ocean and sea ice model). Current emphasis is coupling WRF, VIC and NAPC via cpl7.
  • Cpl7 is still in development; our work will keep pace with cpl7 progress. CCSM4 project groups have made early-release codes available to RACM partners.
  • Since cpl7 is designed for CCSM4, whose framework differs from that of WRF and VIC, some changes to cpl7 and WRF are necessary.

  22. MPI_COMM_WORLD Partitioning of RACM Processing
[Diagram: MPI_COMM_WORLD divided among 5 components (ATM, ICE, LND, OCN, CPL) and 10 communication groups, including joint coupler groups such as CPLATM, CPLICE, and CPLLND.]
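
A minimal mpi4py sketch of this kind of partitioning: splitting MPI_COMM_WORLD into per-component communicators. The rank-to-component assignment below is an arbitrary assumption, not RACM’s actual layout, and the joint coupler groups would be built similarly with additional splits.

```python
from mpi4py import MPI

world = MPI.COMM_WORLD
rank = world.Get_rank()

# Hypothetical rank-to-component assignment (not RACM's real layout).
component_ids = {"ATM": 0, "OCN": 1, "ICE": 2, "LND": 3, "CPL": 4}
layout = ["ATM", "ATM", "OCN", "OCN", "ICE", "LND", "CPL"]
mine = layout[rank % len(layout)]

# Ranks passing the same color end up in the same sub-communicator.
comp_comm = world.Split(color=component_ids[mine], key=rank)
print(f"world rank {rank}: {mine}, local rank "
      f"{comp_comm.Get_rank()} of {comp_comm.Get_size()}")
```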

  23. ROMS and Beyond
  • ROMS is an ocean model that can operate at both very large and very small scales, and takes shorelines into account
  • At ARSC: ongoing research for the northwest Pacific, Gulf of Alaska, Beaufort Sea & Bering Sea
  • Also partnering to couple ROMS with other best-of-breed models for ocean/ice prediction

  24. CCSM3 Coupling for Ocean / Ice Modeling

  25. The ROMS Regional Setups

  26. ROMS Driving Ecosystem Model
[Diagram: physical forcing (wind, temperature, sunlight, mixing) drives nutrients (NO3, NH4, …), which feed primary producers (phytoplankton), then secondary producers (zooplankton), then fish – a chain of Nutrient-Phytoplankton-Zooplankton (NPZ) models.]
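
Since the slide’s chain is the classic NPZ structure, here is a toy nutrient-phytoplankton-zooplankton box model in Python. All parameter values are illustrative assumptions, not those of the ARSC configuration, and the real model is forced by ROMS physics rather than held constant.

```python
def npz_step(N, P, Z, dt=0.1):
    """One forward-Euler step of a closed NPZ box model.
    Parameters below are invented for illustration."""
    Vm, ks = 1.0, 0.5      # max phytoplankton uptake rate, half-saturation
    Rm, kp = 0.5, 0.4      # max grazing rate, grazing half-saturation
    gamma = 0.7            # zooplankton assimilation efficiency
    mP, mZ = 0.05, 0.08    # mortality rates (recycled back to nutrients)

    uptake = Vm * N / (ks + N) * P       # nutrient-limited growth
    grazing = Rm * P / (kp + P) * Z      # Holling type-II grazing
    dN = -uptake + mP * P + mZ * Z + (1 - gamma) * grazing
    dP = uptake - grazing - mP * P
    dZ = gamma * grazing - mZ * Z
    return N + dt * dN, P + dt * dP, Z + dt * dZ

N, P, Z = 1.6, 0.3, 0.1  # initial pools; total N+P+Z is conserved
for _ in range(100):
    N, P, Z = npz_step(N, P, Z)
print(f"N={N:.3f} P={P:.3f} Z={Z:.3f} total={N + P + Z:.3f}")
```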

  27. Beaufort Sea WRF Sensitivity Analysis
  • Funded by MMS
  • Emphasis includes near-coast oil spills in the Beaufort Sea (north of AK & Yukon Territory)
  • Many scenarios, many WRF runs
    • Reanalysis study
    • Storm case studies
    • Impact of winds on waves

  28. MMS WRF Report: Higher Resolution Not Needed for Effective Near-Shore Wind Fields

  29. From Field to Supercomputer: Designing Next-Generation Sea Ice Modules for Very High Resolution Ice-Ocean Models
Data + models

  30. Tidal and wind forcing of the ice-ocean boundary layer versus simple wind forcing (14 km resolution)
These animations demonstrate the difference between simulations that are purely wind driven (right) and those that are both tidally and wind driven (left). Both animations span the same period and use the same timestep, which is roughly 50 minutes.
Arctic System Modeling

  31. Conclusions
  • Climate, weather and related phenomena (sea ice, land surface, physical oceanography) models tend to be community developed and supported
    • Scale fairly well
    • Ported & maintained for new hardware
    • Mostly Fortran + MPI, but very complex codes
  • Scientists are hungry for higher resolution, longer runs, and additional runs to parameterize phenomena
  • The Arctic requires several adjustments to models for optimal verisimilitude, versus mid-latitudes

  32. Huge THANKS
  • Don Morton (ARSC & U. Montana): Weather
  • Abdullah Kayi (GWU): HyperTransport
  • Martin Stuefer (UAF), Georg Grell (NOAA) & Saulo Freitas (CPTEC/INPE): Wildfire smoke
  • Kate Hedstrom (ARSC): ROMS
  • Georgina Gibson (ARSC): NPZ
  • Wieslaw Maslowski (NPS): POP, RACM
  • Andrew Roberts (ARSC), Jennifer Hutchings (UAF): Tidal ice
  • Juanxiong He (ARSC): Coupling
  • Jing Zhang, Jeremy Krieger (UAF), Don Morton: MMS WRF sensitivity
  • Many other partners & collaborators
