
  1. Storm Prediction Center Highlights • EMC Production Suite Review • December 8, 2010 • Steven Weiss, Storm Prediction Center, Norman, OK • National Weather Center

  2. Acknowledgements • SPC: Russ Schneider, Greg Carbin, Gregg Grosshans, Joe Byerly, Jay Liang, Andy Dean, Israel Jirak, Chris Melick, Chris Siewert, and SPC Forecasters • NSSL: Jack Kain, Adam Clark, Patrick Marsh, Ryan Sobash, Mike Coniglio, Bob Rabin • HPC: Dave Novak, Faye Barthold, Mike Bodner, Keith Brill, and HPC Forecasters • AWC: Jason Levit, Bruce Entwistle, and AWC Forecasters • NASA/SPoRT/USRA: Bill McCaul, Jon Case • EMC: Geoff DiMego, Zavisa Janjic, Matt Pyle, Jun Du, Geoff Manikin, Brad Ferrier • GSD: Stan Benjamin, Steve Weygandt, John Brown, Curtis Alexander • NESDIS/NOAA Satellite: Steve Goodman, Bonnie Reed, Bill Campbell • OU/CAPS: Ming Xue, Fanyou Kong, Kevin Thomas • NCAR: Morris Weisman, Wei Wang • DTC: Tara Jensen, Barb Brown, Bill Kuo • CIMSS-UW: Jason Otkin, Wayne Feltz • CIRA-CSU: Louie Grasso, Dan Lindsey

  3. Outline • Thanks to EMC/NCO • SPC Mission and Responsibility • Virtual temperature in CAPE calculations • Hazardous Weather Testbed • Multiple WRF models including 26 member Storm Scale Ensemble Forecast (SSEF) • Convection-allowing WRF guidance for severe convective weather • Two case examples from 2010 • Discussion of Upcoming Modeling Needs • SPC “Wish List”

  4. Good News From SPC Perspective (or a Late Thanksgiving Card) • Model production continues to be remarkably timely and reliable • Forecasters know when model output will be available • Excellent working relationship with EMC/NCO • Very responsive to inquiries and requests • RUC Extension to 18 hrs; GFS Resolution Increase • Support and improvements to 4 km WRF models • Operational Hi Res Window • Continued development of experimental 4 km WRF-NMM • 00 and 12z runs over full CONUS domain • Convective parameter maximum grids • Plans for operational CONUS 4 km nest in new NAM (NMM-B) • Upcoming SREF and Rapid Refresh implementations in 2011 • Outstanding collaboration/support for Hazardous Weather Testbed

  5. STORM PREDICTION CENTER HAZARDOUS PHENOMENA • Hail, Wind, Tornadoes • Excessive rainfall • Fire Weather • Winter weather

  6. SPC Mission and Responsibility: Storm Prediction Center Primary Products • Tornado and Severe Thunderstorm Watches • Watch Status Reports • Severe Weather Outlooks through Day 8 • 4-hr Period Enhanced Thunderstorm Outlooks for Day 1 • Short-Term Mesoscale Discussions • Severe Convective Weather • Heavy Rain • Hazardous Winter Weather • Fire Weather Outlooks through Day 8 • New fire weather day shift; enhanced collaboration with NIFC & GACCs • Categorical and Probabilistic Products

  7. “Virtual” CAPE Computation • Historically, CAPE calculations in mid-latitudes and tropics have used different formulations • Tropical analysis of instability has typically included virtual temperature (MWR 1993)

  8. “Virtual” CAPE Computation • Historically, CAPE calculations in mid-latitudes and tropics have had different formulations • Until the 1990s, mid-latitude CAPE calculation often did not include virtual temperature (WAF 1994)
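For context, these are the standard textbook definitions behind the correction (the slides do not write them out, so the notation below is supplied here): virtual temperature is the temperature dry air would need in order to match the density of the moist air, and it enters CAPE and LI through the lifted-parcel buoyancy. Because the correction grows with water-vapor mixing ratio, a moist boundary-layer parcel gains more than the usually drier environment, so including it generally increases CAPE.

```latex
% Standard definitions (textbook forms, not taken from the slides):
\[ T_v \approx T \,(1 + 0.61\, r) \qquad \text{($r$ = water-vapor mixing ratio, kg/kg)} \]
\[ \mathrm{CAPE} = g \int_{\mathrm{LFC}}^{\mathrm{EL}}
     \frac{T_{v,\mathrm{parcel}} - T_{v,\mathrm{env}}}{T_{v,\mathrm{env}}} \, dz \]
% Lifted Index with the virtual correction applied to both profiles:
\[ \mathrm{LI} = T_{v,\mathrm{env}}(500~\mathrm{hPa}) - T_{v,\mathrm{parcel}}(500~\mathrm{hPa}) \]
```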

  9. “Virtual” CAPE Computation • By the mid-1990s, SPC instability computations were corrected to include virtual temperature in N-SHARP sounding analysis program and other diagnostic CAPE and LI fields • SPC Hourly Mesoanalysis fields (SPC web page) incorporate virtual temperature in lifted parcel computations • These are widely used by WFOs for real-time analysis • For consistency and more proper representation of instability, we recommend use of virtual temperature in NCEP unified post processor • What is the effect of incorporating virtual temperature on CAPE and LI computations? • The impact is proportional to the water vapor content of the atmosphere • Typically, larger differences in CAPE magnitude are apparent when the lifted parcel has high mixing ratio • CAPE differences: ~100-250 J/kg (most important in low CAPE situations) • LI differences: ~ 1 deg K
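A minimal numerical sketch of where the quoted ~100-250 J/kg and ~1 K differences come from (the parcel and environment values are invented for illustration; this is not the NCEP unified post processor code):

```python
# Illustrative sketch of the virtual-temperature effect on lifted-parcel
# buoyancy. Not the NCEP unified post processor code; the profile values
# below are invented for the example.
G = 9.81  # gravitational acceleration, m/s^2

def virtual_temperature(t_kelvin, mixing_ratio):
    """Tv ~= T * (1 + 0.61 * r), with r in kg/kg."""
    return t_kelvin * (1.0 + 0.61 * mixing_ratio)

def buoyancy(t_parcel, r_parcel, t_env, r_env, use_virtual=True):
    """CAPE integrand g * (Tp - Te) / Te, with or without the Tv correction."""
    if use_virtual:
        t_parcel = virtual_temperature(t_parcel, r_parcel)
        t_env = virtual_temperature(t_env, r_env)
    return G * (t_parcel - t_env) / t_env

# Moist boundary-layer parcel (r = 14 g/kg) lifted into a drier environment:
b_virtual = buoyancy(302.0, 0.014, 300.0, 0.008, use_virtual=True)
b_plain = buoyancy(302.0, 0.014, 300.0, 0.008, use_virtual=False)
print(f"with Tv: {b_virtual:.4f} m/s^2, without: {b_plain:.4f} m/s^2")
# The parcel picks up roughly 0.61 * (0.014 - 0.008) * 300 ~= 1.1 K of extra
# warmth relative to the environment; integrated over a deep free-convective
# layer this is on the order of the ~100-250 J/kg CAPE difference cited above.
```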

  10. Doswell/Rasmussen Study • All 1992 00z soundings with positive CAPE (5876 total) • Difference between virtual and non-virtual CAPE • For CAPE < 3000 J/kg, difference < 250 J/kg • As CAPE increases beyond 1000 J/kg, differences trend toward 10-15%

  11. “Virtual” CAPE Computation • SPC has been working with Geoff Manikin for several years to test code to compute virtual CAPE • He has tested virtual CAPE in NAM and Rapid Refresh computations and examined difference fields • Values and differences between virtual and non-virtual CAPE correspond well to those found by Doswell and Rasmussen • Here are some recent examples

  12. NAM “Low CAPE” Case (over land) – 12z 30 Nov 2010, 9-hr forecast valid 21z. [Panels: “Non-Virtual” CAPE; “Virtual” CAPE; CAPE Difference (<150 J/kg over land); CIN Difference]

  13. RR “High CAPE” Case – 16z 21 Oct 2010, 3-hr forecast valid 19z. [Panels: “Non-Virtual” CAPE; “Virtual” CAPE; CAPE Difference (generally <300 J/kg, max difference 400 J/kg)]

  14. NAM Lifted Index Case – 12z 1 Dec 2010, 15-hr forecast valid 03z. [Panels: “Non-Virtual” LI; “Virtual” LI; LI Difference]

  15. Virtual CAPE Computation • Recommendation: NCEP unified post processor CAPE and LI calculations should be corrected to include effects of virtual temperature • Options • Change current grids to incorporate virtual temperature • Add new “virtual grids” in addition to existing “non-virtual grids” • SPC prefers first option

  16. Use of Convection-Allowing WRF Models at the SPC

  17. Convection-Allowing WRF Models • Severe weather types (tornadoes, hail, wind damage) can be closely related to convective mode • Tornadoes (discrete or embedded supercells) • Damaging wind (bow echoes and QLCSs) • Very large hail (supercells) • Traditional operational models (NAM, GFS, RUC) do not resolve convective-scale details • Convection-allowing WRF models with 4 km grid length • Resolve mesoscale structure (linear MCSs, bow echoes) • Can approximate storm-scale features including supercells • But analysis and prediction of the pre-storm and near-storm mesoscale environment is of paramount importance

  18. Examples of Different Convective Modes as Seen by Radar Reflectivity. [Panels: Discrete Cells; Multi-cell Cluster; Bow Echo; Linear System; System with Embedded Bows (and leading cells)]

  19. 4 km WRF Models Used Daily at SPC • WRF-NMM (EMC) and WRF-ARW (EMC and NSSL) • Operational EMC WRFs three times daily in HiResWindow to 48h • 00 and 12 UTC over Central/East domain • 06 UTC over West/Central domain • Experimental EMC and NSSL WRFs • EMC WRF-NMM run twice daily (00 and 12 UTC) • NSSL WRF-ARW run once daily (12 UTC) • 36h forecasts over full CONUS • Cold start with NAM initial and boundary conditions • No parameterized convection • Unique convective fields such as: • Simulated reflectivity • Measures of updraft rotation in model storms • Hourly maximum fields (“history variables”) (see the sketch below)
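As a rough illustration of the last two items, here is a minimal sketch of accumulating an hourly maximum “history variable” for 2-5 km updraft helicity, a common measure of updraft rotation. The function names, array shapes, and toy data are assumptions for this sketch, not the actual WRF implementation:

```python
# Sketch: accumulating the hourly maximum of 2-5 km updraft helicity (UH)
# across model time steps. Tracking the max at every step (not just at
# hourly output times) is what lets these fields capture short-lived
# rotation peaks. Field names and shapes are invented for illustration.
import numpy as np

def updraft_helicity(w, zeta, dz):
    """Layer-integrated UH = sum(w * zeta * dz) over the 2-5 km AGL levels.
    w, zeta: (nlev, ny, nx) vertical velocity (m/s) and vertical vorticity
    (1/s) on levels spanning 2-5 km AGL; dz: layer thicknesses (m)."""
    return np.sum(w * zeta * dz[:, None, None], axis=0)

def hourly_max_uh(step_fields, dz):
    """step_fields: iterable of (w, zeta) pairs, one per model time step
    within the hour. Returns the gridpoint-wise hourly maximum UH."""
    uh_max = None
    for w, zeta in step_fields:
        uh = updraft_helicity(w, zeta, dz)
        uh_max = uh if uh_max is None else np.maximum(uh_max, uh)
    return uh_max

# Toy usage with random fields standing in for model output:
rng = np.random.default_rng(0)
dz = np.full(6, 500.0)                        # six 500 m layers: 2-5 km AGL
steps = [(rng.normal(2, 1, (6, 40, 40)),      # w
          rng.normal(0, 2e-3, (6, 40, 40)))   # zeta
         for _ in range(60)]                  # e.g. 60 steps per hour
print(hourly_max_uh(steps, dz).max())         # m^2/s^2
```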

  20. NOAA Hazardous Weather Testbed • An organization and facility that supports and promotes collaborative research activities between SPC, NSSL, WFO OUN, and the broader national and international meteorological community of research scientists, academia, and forecasters. • Initially two main program areas: • Experimental Forecast Program (EFP): prediction of hazardous mesoscale and stormscale events from a few hours to a week in advance • Experimental Warning Program (EWP): detection and prediction of hazardous mesoscale and stormscale events up to several hours in advance

  21. NOAA Hazardous Weather Testbed • An organization that supports and promotes collaborative research activities between SPC, NSSL, WFO OUN, and the broader national and international meteorological community of research scientists, academia, and forecasters. • Since 2009, three main program areas: • Experimental Forecast Program (EFP) • Experimental Warning Program (EWP) • GOES-R Proving Ground (GOES-R PG): real-time OT&E of GOES-R products prior to launch

  22. Spring Experiments http://www.nssl.noaa.gov/projects/hwt/efp/ • When: ~8 am to 4 pm M-F from May into mid-June • Where: In the National Weather Center HWT, between OUN and SPC • Focus: Evaluate emerging scientific concepts and tools in a simulated operational forecasting environment • Participation: ~60-70 researchers and forecasters from U.S. and international government agencies, academia, and the private sector; 8-14 active participants at any time

  23. HWT Spring Experiment 2010 Participating Institutions • NOAA: NCEP/EMC (2), NCEP/AWC (6), NCEP/HPC (5), NCEP/SPC (7), NCEP/OPC, OAR/NSSL (4), OAR/GSD (3), OAR/PSD, NESDIS (2), NWS/OST (5), NWS/MDL (2), NWS/ATCSCC, and NWS offices ABQ, HUN, ANC, CAE, DTX, EAX, EKA, TWC, FGZ, PIH, TFX, RAH, ILN, OKX, RLX • Universities and cooperative institutes: Oklahoma, Iowa State, Albany/SUNY (3), Texas A&M, MIT/LL, UA/Huntsville (2), CIMSS/UW (8), CIRA/CSU (2), NCAR/DTC (6) • Other government agencies: FAA/Academy (2), FAA/ATCSCC (2), NASA/SPoRT (4), AFWA (2), Environment Canada (3) • Private: Mitre (CAASD), FirstEnergy, SSAI • The HWT is a facilitator of R2O & O2R across the larger community

  24. Hazardous Weather Testbed Unique Benefits • The close working relationship between operational and research meteorologists has fostered • Increased appreciation by Research Scientists of forecaster insights, and operational constraints and requirements • Education of Operational Forecasters about cutting-edge NWP models & science concepts for application to severe weather forecasting • Accelerated Transfer of useful new science and technology from research to operations

  25. Primary HWT Collaborative Forecast Projects • Focus has been on exploring advanced NWP applications for severe weather prediction • WINWEX ’97 (1997) • SE2000: Evaluation of model soundings, RUC-based SFCOA, hail • SE2001: Subjective verification, convective parameterization • SE2002: IHOP forecasting support (convective initiation) • SE2003: Short-Range Ensembles (SREF) • SE2004: First detailed look at Hi-Res WRF • SE2005: Hi-Res WRF configuration testing • SE2006: Pre-implementation evaluation of NAM-WRF • SE2007-09: Hi-Res WRF & Ensembles • SE2010: Hi-Res WRF & Ensembles; Severe storm, Aviation, and QPF Desks

  26. 2010 HWT Spring Experiment • Expansion of Convective Hazards • Explore high resolution NWP applications for • Severe thunderstorms (SPC-NSSL) • Aviation-impacts (AWC) • QPF/Heavy Rain (HPC)

  27. High Resolution Community Models Used in the 2010 Spring Experiment

  28. Hazardous Weather Testbed data flow (courtesy Patrick Marsh, NSSL/OU) • Daily feeds into SPC: NSSL 4 GB/day (00 UTC); CAPS 36 GB/day (00, 09, 12, 15, 18 UTC); NCAR 5 GB/day (00, 12 UTC); HRRR/RUC13 70 GB/day (00, 09, 12, 15, 18 UTC; 10 GB/day archived); EMC 18 GB/day (00, 12 UTC) • DTC website archive: CAPS 1.5 TB; EMC 544 GB; HRRR 270 GB; NCAR 152 GB; NSSL 235 GB (3+ TB total)

  29. June 17, 2010 • Tornado Outbreak – Upper Midwest • Record tornado day in Minnesota (48 tornadoes) • 17 EF2+ tornadoes including four rated EF4 (in MN and ND) • 3 killed, more than 40 injured, and widespread damage • Strongly forced situation with well-defined frontal boundary • [Photo: Tornado damage in Holmes, ND]

  30. Deterministic 00z WRF Model Forecasts valid 18z-04z. [Panels: EMC 4 km WRF-NMM; NSSL 4 km WRF-ARW]
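For reference, the “simulated reflectivity” in the panels that follow is diagnosed from forecast hydrometeor fields, not observed. A minimal rain-only sketch of one common diagnostic follows (assuming a fixed-intercept Marshall-Palmer rain distribution; the actual NMM4/NSSL4 post processing also includes snow and graupel and may use different constants):

```python
# Rain-only simulated reflectivity sketch. Assumes a fixed-intercept
# Marshall-Palmer rain size distribution; operational post processors
# also add frozen-hydrometeor contributions.
import numpy as np

def simulated_dbz(rho_air, q_rain):
    """rho_air: air density (kg/m^3); q_rain: rain-water mixing ratio (kg/kg).
    Ze = 3.63e9 * (rho * qr)**1.75 gives mm^6/m^3 for Marshall-Palmer rain,
    then converted to dBZ. The floor avoids log(0) where there is no rain."""
    z_e = 3.63e9 * np.maximum(rho_air * q_rain, 1.0e-12) ** 1.75
    return 10.0 * np.log10(z_e)

print(simulated_dbz(1.0, 1.0e-3))  # ~1 g/kg of rain -> roughly 43 dBZ
```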

  31. WRF Simulated Reflectivity, 18-hr forecast valid 18z 17 June 2010. [Panels: NMM4; NSSL4; Radar]

  32. WRF Simulated Reflectivity, 19-hr forecast valid 19z 17 June 2010. [Panels: NMM4; NSSL4; Radar]

  33. WRF Simulated Reflectivity, 20-hr forecast valid 20z 17 June 2010. [Panels: NMM4; NSSL4; Radar]

  34. WRF Simulated Reflectivity, 21-hr forecast valid 21z 17 June 2010. [Panels: NMM4; NSSL4; Radar. Annotation: Otter Tail Cnty EF4, 1 killed]

  35. WRF Simulated Reflectivity, 22-hr forecast valid 22z 17 June 2010. [Panels: NMM4; NSSL4; Radar]

  36. WRF Simulated Reflectivity, 23-hr forecast valid 23z 17 June 2010. [Panels: NMM4; NSSL4; Radar. Annotation: Polk Cnty EF3, 1 killed]

  37. WRF Simulated Reflectivity, 24-hr forecast valid 00z 18 June 2010. [Panels: NMM4; NSSL4; Radar. Annotation: Freeborn Cnty EF4, 1 killed]

  38. WRF Simulated Reflectivity, 25-hr forecast valid 01z 18 June 2010. [Panels: NMM4; NSSL4; Radar]

  39. WRF Simulated Reflectivity, 26-hr forecast valid 02z 18 June 2010. [Panels: NMM4; NSSL4; Radar]

  40. WRF Simulated Reflectivity, 27-hr forecast valid 03z 18 June 2010. [Panels: NMM4; NSSL4; Radar]

  41. WRF Simulated Reflectivity, 28-hr forecast valid 04z 18 June 2010. [Panels: NMM4; NSSL4; Radar] • The NSSL WRF developed an excessive cold pool and surged the leading edge of convection too far southeast across Iowa. The initially “slow” NMM was better with location, but its intensity was too weak near the Mississippi River.

  42. SSEF Tests of Double-Moment Microphysics, 27-hr forecast valid 03z 18 June 2010. [Panels: SSEF 4 km ARW control (upper left); Thompson; WDM6; Morrison; Radar] • All members are identical to the SSEF 4 km ARW control run except for different double-moment microphysics schemes. All moved convection southeastward too rapidly.

  43. Examination of HRRR Forecasts for Short-Term Update Information • Simulated reflectivity from the 14z, 16z, and 18z HRRR runs, valid 14-23z

  44. HRRR Simulated Reflectivity (1 km AGL), forecasts valid 14z 17 June 2010. [Panels: 14z run (00-hr); 16z run (NA); 18z run (NA); Observed Radar; GOES Water Vapor Imagery]

  45. HRRR Simulated Reflectivity (1 km AGL), forecasts valid 15z 17 June 2010. [Panels: 14z run (01-hr); 16z run (NA); 18z run (NA); Observed Radar; GOES Water Vapor Imagery]

  46. HRRR Simulated Reflectivity (1 km AGL), forecasts valid 16z 17 June 2010. [Panels: 14z run (02-hr); 16z run (00-hr); 18z run (NA); Observed Radar; GOES Water Vapor Imagery]

  47. HRRR Simulated Reflectivity (1 km AGL), forecasts valid 17z 17 June 2010. [Panels: 14z run (03-hr); 16z run (01-hr); 18z run (NA); Observed Radar; GOES Water Vapor Imagery]

  48. HRRR Simulated Reflectivity (1 km AGL), forecasts valid 18z 17 June 2010. [Panels: 14z run (04-hr); 16z run (02-hr); 18z run (00-hr); Observed Radar; GOES Water Vapor Imagery]

  49. HRRR Simulated Reflectivity (1 km AGL), forecasts valid 19z 17 June 2010. [Panels: 14z run (05-hr); 16z run (03-hr); 18z run (01-hr); Observed Radar; GOES Water Vapor Imagery]

  50. HRRR Simulated Reflectivity (1 km AGL), forecasts valid 20z 17 June 2010. [Panels: 14z run (06-hr); 16z run (04-hr); 18z run (02-hr); Observed Radar; GOES Water Vapor Imagery]
