Weather Research and Forecast Model: Software and Computing Issues

John Michalakes (email@example.com)
Mesoscale and Microscale Meteorology, National Center for Atmospheric Research

With: Dave Gill (NCAR); Tom Black, S.G. Gopalakrishnan (NCEP); Jacques Middlecoff, Dan Schaffer (FSL); V. Balaji (GFDL); Jennifer Abernethy (U. Colorado); and many others
WRF Project Collaborators

• Signatory Partners:
  • NCAR Mesoscale and Microscale Meteorology Division
  • NOAA National Centers for Environmental Prediction
  • NOAA Forecast Systems Laboratory
  • Air Force Weather Agency
  • Federal Aviation Administration
  • Navy, Naval Research Laboratory
• Additional Collaborators:
  • OU Center for the Analysis and Prediction of Storms
  • Department of Defense HPCMO
  • CMA Chinese Academy of Meteorological Sciences
  • NOAA Geophysical Fluid Dynamics Laboratory
  • NASA GSFC Atmospheric Sciences Division
  • NOAA National Severe Storms Laboratory
  • EPA Atmospheric Modeling Division
  • University community
Weather Research and Forecast Model

Goals: Develop an advanced mesoscale forecast and assimilation system, and accelerate the transfer of research advances into operations.

• Large collaborative effort to develop a community model with a direct path to operations
• Advanced numerics, data assimilation, and model physics
• Well suited for a broad range of applications:
  • Limited-area NWP
  • Cloud modeling, Large Eddy Simulation
  • Synoptic-scale research
  • Chemistry and air-quality research and prediction
  • Regional climate
• Designed for 1-10 km resolution, but must also perform at higher (LES; dx ~ 100 m) and lower (synoptic scale; dx ~ 100 km) resolutions
• Portable and efficient on parallel computers

[Figure: 12 km WRF simulation of a large-scale baroclinic cyclone, Oct. 24, 2001]
Aspects of WRF Software Design

• Hierarchical software architecture
  • Insulates scientists' code from parallelism and other architecture/implementation-specific details
  • Well-defined interfaces between layers, and external packages for communications, I/O, and model coupling, facilitate code reuse and the exploitation of community infrastructure, e.g. ESMF
• Multi-level parallelism
  • Decomposition over distributed-memory patches (one per MPI process), then over shared-memory tiles (one per OpenMP thread)
  • The same code is adaptable to shared-memory, distributed-memory, and hybrid parallel systems
  • Control over the size and shape of the working subdomain for cache/vector efficiency
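The two-level decomposition above can be sketched as follows. This is an illustrative toy, not WRF code: all function names and the simple equal-split strategy are assumptions, but it shows the idea of splitting a global grid into per-process patches and then splitting each patch into tiles for threads and cache blocking.

```python
# Illustrative sketch (not the WRF implementation): a two-level domain
# decomposition. A global (nx x ny) grid is split into per-process
# "patches", and each patch is further split into "tiles".

def decompose(n, parts):
    """Split n points into `parts` nearly equal contiguous ranges."""
    base, rem = divmod(n, parts)
    ranges, start = [], 0
    for p in range(parts):
        size = base + (1 if p < rem else 0)
        ranges.append((start, start + size))  # half-open [start, end)
        start += size
    return ranges

def patches_and_tiles(nx, ny, px, py, tiles_per_patch):
    """Lay out a px-by-py process grid; split each patch's y-extent into tiles."""
    layout = {}
    for pi, (is_, ie) in enumerate(decompose(nx, px)):
        for pj, (js, je) in enumerate(decompose(ny, py)):
            tiles = [(ts + js, te + js)
                     for ts, te in decompose(je - js, tiles_per_patch)]
            layout[(pi, pj)] = {"x": (is_, ie), "y": (js, je), "tiles": tiles}
    return layout

layout = patches_and_tiles(nx=100, ny=80, px=2, py=2, tiles_per_patch=4)
print(layout[(0, 0)]["x"], layout[(0, 0)]["tiles"][0])  # → (0, 50) (0, 10)
```

In WRF the patch boundaries also carry halo regions for communication; that detail is omitted here.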
Aspects of WRF Software Design (continued)

• Multiple, run-time selectable dycore options
  • Eulerian Mass (Skamarock, Klemp, Wicker)
  • NH-Meso Eta (Janjic)
  • Semi-implicit semi-Lagrangian (J. Purser)
  • WRF 3DVAR is also implemented as a "core" within the WRF software framework
  • WRF software framework selected by the China Met. Admin. for GRAPES
• Active data dictionary: the WRF Registry
  • Compile-time database of WRF state data and its attributes
  • 30 thousand lines of WRF code are auto-generated at compile time
  • Allows rapid development of WRF by automating repetitive, error-prone code
• I-K-J order for storage and loop nesting
  • Detailed studies (with Rich Loft and Pat Worley)
  • Provides the best compromise between vector and microprocessor performance
• Grid nesting
  • Two-way interacting, coincident (non-rotated)
  • Run-time instantiation
  • Moving nests (Hurricane WRF; NOAA requirement for 2006)
  • Target performance: no more than 15% overhead
• Model coupling…
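The I-K-J convention means arrays are stored with i (west-east) varying fastest in memory, then k (vertical), then j (south-north), and loops nest j outermost, k, then i innermost. A minimal NumPy sketch (array sizes and values are invented) shows the layout and the unit-stride inner loop:

```python
# Sketch of the I-K-J convention: storage order (i, k, j) with i fastest,
# loops nested j / k / i so the innermost loop is unit stride in memory.
import numpy as np

nx, nz, ny = 8, 4, 6  # i, k, j extents (hypothetical sizes)

# Fortran-order array dimensioned (i, k, j): the leftmost index, i,
# varies fastest in memory, as in WRF's Fortran arrays.
a = np.zeros((nx, nz, ny), order="F")

for j in range(ny):          # outermost: j
    for k in range(nz):      # middle: k
        for i in range(nx):  # innermost: i walks contiguous memory
            a[i, k, j] = i + 10 * k + 100 * j

# Strides confirm i is unit stride, then k, then j.
s = a.itemsize
assert a.strides == (s, s * nx, s * nx * nz)
```

Keeping the innermost loop long and unit-stride is what makes this ordering a workable compromise for both vector machines (long vectors in i) and cache-based microprocessors (sequential access).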
Model Coupling

• Coupling as an extension of the WRF I/O API
  • The coupling mechanism appears as an I/O package (early '90s: Coats, Models-III)
  • Concurrent and sequential coupling modes supported
• Two implementations to date:
  • Model Coupling Toolkit (MCT, Argonne National Laboratory)
  • Model Coupling Environment Library (MCEL, U. Southern Mississippi)
• Projects
  • WRF/Rutgers Ocean Model coupling under the NSF TeraGrid project: Modeling Environment for Atmospheric Discovery (MEAD), NCSA at U. of Illinois
  • WRF/HFSoLE atmosphere-ocean coupling under PET Project CWO-002 FY03: Software Infrastructure for Regional Coupled Geophysical Modeling
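The "coupling as I/O" idea can be sketched as follows. This is a hypothetical illustration, not the WRF I/O API: the model reads and writes fields through one abstract interface, and a coupler is simply another backend alongside a file package, so the model code is identical whether it is writing a history file or exchanging fields with another model.

```python
# Sketch (illustrative names only): coupling exposed through the same
# abstract interface as a file-output package.

class IOBackend:
    def write(self, name, data): raise NotImplementedError
    def read(self, name): raise NotImplementedError

class FileBackend(IOBackend):
    """Stand-in for an ordinary file-output package."""
    def __init__(self):
        self.store = {}
    def write(self, name, data):
        self.store[name] = data
    def read(self, name):
        return self.store[name]

class CouplerBackend(IOBackend):
    """'Writing' sends a field to the other model; 'reading' receives one."""
    def __init__(self, channel):
        self.channel = channel  # shared exchange buffer between models
    def write(self, name, data):
        self.channel[name] = data
    def read(self, name):
        return self.channel[name]

# The model calls write()/read() the same way in both cases.
channel = {}
atm_side = CouplerBackend(channel)  # atmosphere writes its winds
ocn_side = CouplerBackend(channel)  # ocean reads them from the same channel
atm_side.write("winds", [5.0, 7.5])
print(ocn_side.read("winds"))  # → [5.0, 7.5]
```

The design pays off exactly as the slide suggests: adding a new coupling library (MCT, MCEL) means adding one more backend, with no change to the model code.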
WRF / HFSoLE Coupling

High Fidelity Simulation of Littoral Environments (HFSoLE): "…effective, relocatable, high-resolution, dynamically linked mesoscale, meteorological, coastal oceanographic, surface water/groundwater, riverine/estuarine and sediment transport models." (Rick Allard, NRL, UGC 2002)

[Figure: coupling diagram — WRF winds, SWAN wave height, NCOM SST]
WRF / HFSoLE Coupling

Coupling overhead is a small fraction of the total run time:
• Total run time: 1713 s
• Storing to coupler: 3.5 s
• Updating from coupler: 3.25 s

[Figure labels: WRF winds, NCOM SST]
ESMF and WRF

• WRF is a cooperating ESMF application
• Opportunities for integration:
  • Incorporate ESMF utilities, e.g. the Time Manager (completed Summer 2003)
  • WRF as an ESMF component model: WRF has adopted the top-level Initialize/Run/Finalize ESMF component formalism
  • Integration of the ESMF coupling superstructure (when available)
  • Grid nesting as a special application of ESMF regridding
  • ESMF driver layer (when available)
  • ESMF I/O (merger with WRF I/O functionality in progress)
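The Initialize/Run/Finalize formalism can be sketched as below. This is a toy illustration of the pattern only; the class and method names are invented and are not the ESMF or WRF interfaces. The key point is that a component exposes exactly three phases and an external driver owns the calling sequence.

```python
# Sketch of the Initialize/Run/Finalize component formalism
# (illustrative names, not the ESMF API).

class Component:
    """A model component driven through three phases by an external driver."""
    def initialize(self, config): raise NotImplementedError
    def run(self, steps): raise NotImplementedError
    def finalize(self): raise NotImplementedError

class ToyAtmosphere(Component):
    def initialize(self, config):
        self.dt = config["dt"]   # time step in seconds
        self.time = 0.0
    def run(self, steps):
        for _ in range(steps):
            self.time += self.dt  # advance the model clock
    def finalize(self):
        return self.time          # report final model time

# The driver, not the component, owns the sequence of calls.
comp = ToyAtmosphere()
comp.initialize({"dt": 60.0})
comp.run(steps=10)
print(comp.finalize())  # → 600.0
```

Because every component presents the same three entry points, a generic driver or coupler can schedule WRF alongside other components without knowing anything about their internals.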
WRF Porting and Performance

• Computing platforms
  • IBM Power3, Power4
  • HP Alpha and Itanium
  • SGI MIPS and Itanium
  • Linux
    • Compilers: Portland Group, Intel
    • Message passing: MPICH, LAM, Scali, Myrinet, …
    • Processors: Pentium, Itanium, Alpha
• Ports in progress
  • NEC SX/6
  • Cray X-1
WRF Scaling

• Communication alone fails to account for all the observed inefficiency as processor counts increase:
  • On 256 processors, the run should be 93% efficient relative to 16 processors
  • It is actually only 71% efficient -- where is the rest?
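The efficiency figures above are relative parallel efficiency: how close a large run comes to the ideal speedup extrapolated from a smaller run, eff = (t1 · p1) / (t2 · p2). A short sketch, with made-up run times chosen only so the result reproduces the slide's 71%:

```python
# Relative parallel efficiency of a run on p2 processors with respect
# to a baseline run on p1 processors. The timings below are invented
# for illustration; they are not measured WRF numbers.

def relative_efficiency(t1, p1, t2, p2):
    """Fraction of ideal (linear) speedup achieved going from p1 to p2."""
    return (t1 * p1) / (t2 * p2)

# e.g. a 16-processor run taking 1600 s and a 256-processor run taking 141 s:
eff = relative_efficiency(t1=1600.0, p1=16, t2=141.0, p2=256)
print(f"{eff:.0%}")  # → 71%
```

A value of 1.0 would mean perfect scaling; the gap between the predicted 93% and the measured 71% is what the load-imbalance analysis on the next slide sets out to explain.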
WRF Scaling

Instrumented load imbalance over 128 processors for the 12 km CONUS run.

Load imbalance: 1 - (T_mean / T_max)
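The slide's metric is easy to compute from per-processor timings: it is the fraction of the slowest processor's time that the average processor spends idle. A minimal sketch (the sample timings are invented):

```python
# Load-imbalance metric from the slide: 1 - (T_mean / T_max), computed
# over per-processor times for one step. 0 means perfectly balanced;
# values approaching 1 mean most processors wait on one straggler.

def load_imbalance(times):
    t_mean = sum(times) / len(times)
    t_max = max(times)
    return 1.0 - t_mean / t_max

times = [1.0, 1.1, 0.9, 1.4]  # hypothetical seconds on 4 processors
print(round(load_imbalance(times), 3))  # → 0.214
```

Because a time step cannot complete until the slowest patch finishes, this metric directly bounds the scaling loss that communication costs alone cannot explain.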
MM5 Performance

"Modeling the Weather on a Cray X1," Tony Meys, Army HPC Research Center / Network Computing Services, Inc., in Proceedings of CUG, August 2003. (P = MSP)

http://www.mmm.ucar.edu/mm5/mpp/performance
WRF in BAMEX, May 20 - July 6, 2003

• Real-time forecasts
  • WRF, RAMS, MM5, NCEP models
  • Distributed to forecasters at NOAA WFOs
  • Used to position aircraft and mobile profilers in advance of the system to be observed
• WRF forecasts
  • Daily: 4 km at 00Z
  • Twice daily: 10 km at 00Z and 12Z
  • Dedicated 3.5-hour partition of 128 processors on the NCAR bluesky system, plus 4 processors for pre-/post-processing
• First forecaster reactions to high-resolution WRF
WRF in BAMEX, May 20 - July 6, 2003 (continued)

[Figure: BAMEX 6/8/03 — WRF 4 km reflectivity vs. composite radar]

Forecaster reactions:

"… the model is uncanny at reproducing the essential reflectivity and timing of the bow echoes."

"… it seems to know how to generate the cold pools and keep track of them."

"I really liked the 4 km BAMEX model run and DON'T want it to go away."
WRF Status and Continuing Work

• First released to the community December 2000; current "beta" release, WRF 1.3, March 2003
• Fully functional research release by end of 2003, followed by an operational release and deployment in the 2004-05 time frame
• Testing and verification underway for operational implementation at NCEP and AFWA
• Real-time forecasts for the BAMEX field experiment
• Ensemble forecast system using the WRF Eulerian Mass and NH-Eta cores, with physics including the unified NOAH LSM
• Numerous month-long seasonal retrospective runs at NCAR, FSL, and DoD
• WRF Developmental Testbed Center under development

http://www.wrf-model.org