Higher Resolution Operational Models

Presentation Transcript


  1. Higher Resolution Operational Models

  2. Operational Mesoscale Model History • Early: LFM, NGM (history) • Eta (mainly history) • MM5: still used by some, but being phased out • NMM: the main NWS mesoscale model, sometimes called WRF-NMM • WRF-ARW: heavily used by the research community and by some operational communities • The NWS calls its mesoscale run the NAM (North American Mesoscale); it is now based on the NMM

  3. Eta Model

  4. Eta Coordinate and Step Mountains (diagram: η = 0 at the model top, p = Ptop; η = 1 at mean sea level, MSL; the ground is represented by step mountains)
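
For reference, the step-mountain eta coordinate sketched on this slide is commonly written (following Mesinger) as

```latex
\eta = \frac{p - p_T}{p_S - p_T}\,\eta_S ,
\qquad
\eta_S = \frac{p_{\mathrm{rf}}(z_S) - p_T}{p_{\mathrm{rf}}(0) - p_T},
```

where \(p_T\) is the model-top pressure, \(p_S\) the surface pressure, \(z_S\) the elevation of the (step-mountain) model surface, and \(p_{\mathrm{rf}}(z)\) a reference pressure profile. This gives η = 0 at the model top and η = 1 at mean sea level, consistent with the diagram.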

  5. Horizontal resolution of 12 km (figure: model terrain at 12-km grid spacing)

  6. Drawbacks of the Eta Coordinate • The failure to generate downslope windstorms in regions of complex terrain • Weak boundary-layer winds over elevated terrain compared to observations • The displacement of precipitation maxima too far toward the bottom of steeply sloping terrain, as opposed to the observed location near the top half of the terrain slope • The reduction in the number of vertical layers used to define the model atmosphere above elevated topography, particularly within the boundary layer

  7. WRF and NMM

  8. Why WRF? • An attempt to create a national mesoscale prediction system to be used by both the operational and research communities • A new, state-of-the-art model with good conservation characteristics (e.g., conservation of mass) and good numerics (so not too much numerical diffusion) • A model that parallelizes well on many processors and is easy to modify • Plug-compatible physics to foster improvements in model physics (illustrated in the sketch below) • Designed for grid spacings of 1-10 km
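
To make the "plug-compatible physics" idea concrete, here is a minimal Python analogy (not WRF's actual Fortran interface; the scheme names, fields, and numbers are made up): every physics scheme exposes the same call signature, so the driver can swap schemes without touching the dynamical core.

```python
from typing import Callable, Dict

# A model column's state; the field names here are hypothetical.
Column = Dict[str, float]

# Every physics scheme maps (column state, time step) -> tendencies with the
# same signature, so schemes are interchangeable ("plug-compatible").
PhysicsScheme = Callable[[Column, float], Dict[str, float]]

def toy_boundary_layer(column: Column, dt: float) -> Dict[str, float]:
    """Toy PBL scheme: relax air temperature toward the skin temperature."""
    return {"t_tendency": (column["t_skin"] - column["t_air"]) / 3600.0}

def toy_convection(column: Column, dt: float) -> Dict[str, float]:
    """Toy convection scheme: small warming when the surface is much warmer than the air."""
    return {"t_tendency": 0.001 if column["t_skin"] > column["t_air"] + 2.0 else 0.0}

def call_physics(schemes: Dict[str, PhysicsScheme], column: Column, dt: float):
    """The driver only knows the common interface, so any scheme can plug in."""
    return {name: scheme(column, dt) for name, scheme in schemes.items()}

print(call_physics(
    {"pbl": toy_boundary_layer, "convection": toy_convection},
    column={"t_air": 285.0, "t_skin": 288.0},
    dt=60.0,
))
```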

  9. WRF Software Infrastructure (diagram of the WRF modeling system: observational data and analyses; static initialization; 3DVAR data assimilation; dynamic cores (Mass core, NMM core, …); a standard physics interface to the physics packages; post-processors and verification)

  10. Two WRF Cores • ARW (Advanced Research WRF) core, developed at NCAR • NMM (Nonhydrostatic Mesoscale Model) core, developed at NCEP • Both work under the WRF I/O infrastructure

  11. The NCAR ARW Core Model (see www.wrf-model.org) • Terrain-following vertical coordinate • Two-way nesting, any ratio • Conserves mass, entropy, and scalars using up to 6th-order spatial differencing for fluxes; very good numerics with less implicit smoothing • NCAR physics packages (converted from MM5 and Eta), Noah unified land-surface model; NCEP physics adapted as well
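
For reference, the terrain-following coordinate of the original ARW formulation is a mass-based coordinate defined from the hydrostatic component of pressure; a sketch of the standard definition:

```latex
\eta = \frac{p_h - p_{ht}}{\mu}, \qquad \mu = p_{hs} - p_{ht},
```

where \(p_h\) is the hydrostatic pressure and \(p_{hs}\), \(p_{ht}\) its values at the surface and model top, so η runs from 1 at the terrain-following surface to 0 at the model top.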

  12. The NCEP Nonhydrostatic Mesoscale Model: NMM (Janjic et al. 2001), the NWS WRF • Hybrid sigma-pressure vertical coordinate • 3:1 nesting ratio • Conserves kinetic energy, enstrophy, and momentum using 2nd-order differencing • Modified Eta physics, Noah unified land-surface model; NCAR physics adapted as well
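
The hybrid sigma-pressure idea can be written generically (this is the generic hybrid form, not necessarily the exact NMM formulation):

```latex
p(\eta) = A(\eta) + B(\eta)\,p_s ,
```

with \(p_s\) the surface pressure, \(B \to 1\) (terrain-following, sigma-like) near the surface, and \(B \to 0\) (constant-pressure surfaces) toward the model top.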

  13. The National Weather Service dropped the Eta in 2006 as the NAM (North American Mesoscale) run and replaced it with the WRF-NMM. • The Air Force uses WRF-ARW. • Most universities use WRF-ARW.

  14. NWS NMM: The NAM Run • Run every six hours over North America and adjacent oceans • Run to 84 hours at 12-km grid spacing • Uses the Gridpoint Statistical Interpolation (GSI) data assimilation system (3DVAR) • Starts with the GDAS (GFS analysis) as the initial first guess at t - 12 hours (the start of the analysis cycle) • Runs an intermittent data assimilation cycle every three hours until the initialization time (sketched below)
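
The intermittent assimilation cycle described on this slide can be sketched in a few lines; the code below is an illustrative toy (the model and 3DVAR steps are stand-ins, not the NMM or GSI), showing the t - 12 h GDAS first guess, the 3-hourly analysis/forecast cycling to t = 0, and the 84-hour forecast.

```python
ASSIM_INTERVAL_H = 3     # analyses every 3 hours
CYCLE_START_H = -12      # first guess valid 12 h before the initialization time
FORECAST_LENGTH_H = 84   # length of the free forecast

def toy_forecast(state: float, hours: int) -> float:
    """Stand-in for a model integration over `hours` hours (does nothing here)."""
    return state

def toy_3dvar(background: float, obs: list) -> float:
    """Stand-in for a 3DVAR analysis: average the background with the obs mean."""
    return background if not obs else 0.5 * (background + sum(obs) / len(obs))

def run_nam_cycle(first_guess: float, obs_by_hour: dict) -> float:
    state, hour = first_guess, CYCLE_START_H
    while hour < 0:
        state = toy_3dvar(state, obs_by_hour.get(hour, []))    # analysis
        state = toy_forecast(state, ASSIM_INTERVAL_H)          # 3-h forecast
        hour += ASSIM_INTERVAL_H
    state = toy_3dvar(state, obs_by_hour.get(0, []))           # analysis at t = 0
    return toy_forecast(state, FORECAST_LENGTH_H)              # 84-h forecast

print(run_nam_cycle(first_guess=280.0, obs_by_hour={-6: [281.0], 0: [282.0]}))
```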

  15. October 2011 Update: NMMB • One-way nested forecasts computed concurrently with the 12-km NMMB parent run for: • CONUS (4 km, to 60 hours) • Alaska (6 km, to 60 hours) • Hawaii (3 km, to 60 hours) • Puerto Rico (3 km, to 60 hours) • For fire weather, movable 1.33-km CONUS and 1.5-km Alaska nests are also run concurrently (to 36 hours) • A change in the horizontal grid from the Arakawa E-grid to the Arakawa B-grid, which speeds up computations without degrading the forecast

  16. September 2011 NAM Upgrade • Current NAM: WRF-NMM (E-grid); 4 runs/day (6-hour update); forecasts to 84 hours; 12-km horizontal grid spacing • New NAM: NEMS-based NMMB; B-grid replaces E-grid; parent remains 12 km to 84 hours • Four fixed nests run to 60 hours: 4-km CONUS nest, 6-km Alaska nest, 3-km HI and PR nests • Single placeable 1.33-km (CONUS) or 1.5-km (Alaska) FireWeather/IMET/DHS nest run to 36 hours
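
For quick reference, the upgraded NAM/NMMB configuration above can be restated as a small data structure; the Python dict below simply collects the numbers from the slides (the layout and key names are made up, not an operational configuration file).

```python
# Illustrative summary of the 2011 NAM (NMMB) parent and nest setup.
NAM_NESTS = {
    "parent":         {"grid_km": 12.0, "forecast_h": 84, "domain": "North America"},
    "conus":          {"grid_km": 4.0,  "forecast_h": 60, "domain": "CONUS"},
    "alaska":         {"grid_km": 6.0,  "forecast_h": 60, "domain": "Alaska"},
    "hawaii":         {"grid_km": 3.0,  "forecast_h": 60, "domain": "Hawaii"},
    "puerto_rico":    {"grid_km": 3.0,  "forecast_h": 60, "domain": "Puerto Rico"},
    "fire_wx_conus":  {"grid_km": 1.33, "forecast_h": 36, "domain": "movable CONUS nest"},
    "fire_wx_alaska": {"grid_km": 1.5,  "forecast_h": 36, "domain": "movable Alaska nest"},
}

for name, cfg in NAM_NESTS.items():
    print(f"{name}: {cfg['grid_km']} km grid, {cfg['forecast_h']}-h forecast, {cfg['domain']}")
```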

  17. NMMB 4-km CONUS nest (figure)

  18. NMM • Was generally inferior to GFS

  19. Looks like it has improved…
