
Real Time Mesoscale Analysis


Presentation Transcript


  1. Real Time Mesoscale Analysis RTMA Temperature 1500 UTC 14 March 2008 John Horel Department of Meteorology University of Utah john.horel@utah.edu

  2. Acknowledgements • Dan Tyndall (Univ. of Utah) • Dave Myrick (WRH/SSD) • Manuel Pondeca (NCEP/EMC) • References • Benjamin, S., J. M. Brown, G. Manikin, and G. Mann, 2007: The RTMA background – hourly downscaling of RUC data to 5-km detail. Preprints, 22nd Conf. on WAF/18th Conf. on NWP, Park City, UT, Amer. Meteor. Soc., 4A.6. • De Pondeca, M., and Coauthors, 2007: The status of the Real Time Mesoscale Analysis at NCEP. Preprints, 22nd Conf. on WAF/18th Conf. on NWP, Park City, UT, Amer. Meteor. Soc., 4A.5. • Horel, J., and B. Colman, 2005: Real-time and retrospective mesoscale objective analyses. Bull. Amer. Meteor. Soc., 86, 1477–1480. • Tyndall, D., 2008: Sensitivity of Surface Temperature Analyses to Specification of Background and Observation Error Covariances. M.S. thesis, University of Utah.

  3. An Analysis of Record • Forecasters need higher-resolution analyses than are currently available • Localized weather forecasting • Gridded forecast verification • Climatological applications • Analysis of Record (AOR) program established in 2004 • Three phases • Real Time Mesoscale Analysis: Phase I • Delayed analysis: Phase II • Retrospective reanalysis: Phase III

  4. Real-Time Mesoscale Analysis (RTMA) • Fast-track, proof-of-concept effort intended to: • Enhance existing analysis capabilities at the NWS and generate near real-time hourly analyses of surface observations on domains matching the NDFD grids • Define background errors using characteristics of the background fields (terrain, potential temperature, wind, etc.) • Provide estimates of analysis uncertainty • Developed at NCEP, ESRL, and NESDIS • Implemented in August 2006 for the CONUS (and southernmost Canada) and recently for Alaska, Guam, and Puerto Rico • Analyzed parameters: 2-m T, 2-m q, 2-m Td, surface pressure, 10-m winds, precipitation, and effective cloud amount • 5-km resolution for the CONUS with plans for 2.5-km resolution

  5. More Info… www.meted.ucar.edu

  6. The Real-Time Mesoscale Analysis • Several layers of quality control for surface observations • Two dimensional variational surface analysis (2D-Var) using recursive filters • Utilizes NCEP’s Gridpoint Statistical Interpolation software (GSI) • Uses various surface observations and satellite winds • METAR, PUBLIC, RAWS, other mesonets • SSM/I and QuikSCAT satellite winds • Analysis became available in 2006, and is being evaluated by WFOs and several universities

  7. The actual ABCs… • The RTMA analysis equation looks like: • Covariances are error correlation measures between all pairs of gridpoints • Background error covariance matrix can be extremely large • 2,900 GB memory requirement for continental scale • Recursive filters significantly reduce this demand
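
The equation itself appears only as a figure on the original slide. A standard 2D-Var cost function and analysis equation consistent with the surrounding description (background x_b, observations y, observation operator H, background and observation error covariance matrices B and R) would look like the following sketch:

```latex
J(\mathbf{x}) = (\mathbf{x}-\mathbf{x}_b)^{\mathrm T}\mathbf{B}^{-1}(\mathbf{x}-\mathbf{x}_b)
              + (\mathbf{y}-H\mathbf{x})^{\mathrm T}\mathbf{R}^{-1}(\mathbf{y}-H\mathbf{x}),
\qquad
\mathbf{x}_a = \mathbf{x}_b + \mathbf{B}H^{\mathrm T}\left(H\mathbf{B}H^{\mathrm T}+\mathbf{R}\right)^{-1}(\mathbf{y}-H\mathbf{x}_b)
```

Each entry of B is the error covariance between a pair of gridpoints, which is why storing B explicitly for a continental-scale grid requires terabytes of memory and why recursive filters are used to apply it implicitly.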

  8. The Real-Time Mesoscale Analysis

  9. RTMA Domains

  10. Vancouver Area – CVOC & CVOI, CVOD, and RTMA analyses

  11. Background Downscaling (Benjamin et al. 2007) • CONUS RTMA background = 1-h forecast from the NCEP-operational 13-km RUC downscaled to the 5-km NDFD terrain • Horizontal – bilinear interpolation • Vertical interpolation – varies by variable; for temperature it is based on near-surface stability and moisture from the native RUC data, used to adjust to the RTMA 5-km terrain • If the RTMA terrain is lower than the RUC terrain, a local RUC lapse rate is used (bounded between dry adiabatic and isothermal) • If the RTMA terrain is higher, values are interpolated between native RUC vertical levels, but shallow, surface-based inversions are maintained • Coastline sharpening
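
As a rough illustration of the lapse-rate adjustment described above, the sketch below moves a background temperature from the RUC terrain to the NDFD terrain using a locally diagnosed lapse rate bounded between dry adiabatic and isothermal. The function and variable names are illustrative, not the operational downscaling code, and only the case where the NDFD terrain is lower is handled; the higher-terrain case (interpolation within the native RUC column) is noted in the docstring.

```python
import numpy as np

DRY_ADIABATIC = -9.8e-3   # K per meter (temperature falls with height)
ISOTHERMAL = 0.0          # K per meter

def downscale_temperature(t_ruc, z_ruc, z_ndfd, local_lapse):
    """Adjust a RUC near-surface temperature (K) from the RUC terrain
    height z_ruc (m) to the 5-km NDFD terrain height z_ndfd (m).

    Illustrates only the case where the NDFD terrain is *lower* than the
    RUC terrain; when it is higher, the operational downscaling instead
    interpolates between native RUC vertical levels while preserving
    shallow, surface-based inversions.
    """
    # Bound the locally diagnosed lapse rate between dry adiabatic and
    # isothermal, as described on the slide.
    lapse = np.clip(local_lapse, DRY_ADIABATIC, ISOTHERMAL)
    return t_ruc + lapse * (z_ndfd - z_ruc)

# Example: RUC terrain at 1500 m, NDFD valley point at 1200 m,
# local lapse rate of -6.5 K/km -> temperature increases by about 1.95 K.
print(downscale_temperature(280.0, 1500.0, 1200.0, -6.5e-3))
```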

  12. Surface Data Issues • Real-time RTMA analysis begins ~30 min past the hour • Getting the data in time is a challenge • RTMA uses obs taken within ±12 min of the top of the hour • RAWS and QuikSCAT winds: −30 min to +12 min • Many remote obs still don't make it in time, as well as some obs with longer latencies (e.g., SNOTEL reports every 3 h) • Table: Observation Network / MADIS / NCEP
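
A minimal sketch of the time-window check described above, assuming the window is applied per network; the function and network labels are illustrative, not the operational ingest code:

```python
from datetime import datetime

def in_rtma_window(ob_time, analysis_time, network):
    """True if an observation taken at ob_time can be used in the
    analysis valid at analysis_time (top of the hour).

    Most observations: within +/-12 min of the top of the hour.
    RAWS and QuikSCAT winds: -30 min to +12 min.
    """
    offset_min = (ob_time - analysis_time).total_seconds() / 60.0
    if network in ("RAWS", "QUIKSCAT"):
        return -30.0 <= offset_min <= 12.0
    return -12.0 <= offset_min <= 12.0

# Example: a RAWS ob at 0840 UTC is accepted for the 0900 UTC analysis.
print(in_rtma_window(datetime(2007, 10, 22, 8, 40),
                     datetime(2007, 10, 22, 9, 0), "RAWS"))
```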

  13. RTMA CONUS Temperature Analysis

  14. Subjective Evaluation of the RTMA Temperature • Development of web-based graphics for many analyses over selected subdomains • Wasatch Valley, UT • Puget Sound, WA • Norman, OK • Shenandoah Valley, VA • Analysis problems based on subjective evaluation • Non-physical treatment of temperature inversions • Smoothed features • Bull's-eye features around observations • Overfitting issues • Case study • 0900 UTC 22 October 2007 • Shenandoah Valley, VA

  15. Case Study Orientation RTMA 0900 UTC 22 October 2007

  16. RTMA Topography vs. Actual Topography

  17. Atmospheric Sounding – KIAD (Sterling, VA), 1200 UTC 22 October 2007

  18. RUC 1-hr Forecast – Valid 0900 UTC 22 October 2007

  19. RTMA Temperature Analysis – 0900 UTC 22 October 2007

  20. RTMA Temperature Analysis Increments – 0900 UTC 22 October 2007

  21. Observations • Surface observations: METAR, RAWS, PUBLIC, OTHER • Observations must fall in ±12 min time window (−30/+12 min for RAWS) • Observations undergo quality control by MADIS and internal checks by RTMA • Approximately 11,000 observations for almost 900,000 gridpoints for CONUS

  22. Observation Density • METAR: 16/59/1,744 • PUBLIC: 215/575/6,486 • RAWS: 3/11/1,301 • OTHER: 10/75/1,961

  23. Observation and Background Error Variances • Ratio of σo²/σb² estimated by accumulating statistics over a time period for the entire CONUS • 8 May 2008 – 7 June 2008 • RTMA used σo²/σb² = 1 for METAR observations and σo²/σb² = 1.2 for mesonet observations • Statistics computed in our research suggest σo²/σb² between 2 and 3 • Background should be “trusted” more than observations
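
For context, the textbook scalar form of this weighting (for a single observation co-located with a gridpoint; not shown on the slide) makes the role of the ratio explicit:

```latex
x_a = x_b + \frac{\sigma_b^2}{\sigma_b^2+\sigma_o^2}\,(y - x_b)
    = x_b + \frac{1}{1+\sigma_o^2/\sigma_b^2}\,(y - x_b)
```

With σo²/σb² = 1 the observation increment receives half the weight; with a ratio between 2 and 3, it receives only one quarter to one third of the weight, i.e., the background is trusted more than the observations.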

  24. Estimation of Observation and Background Error Covariances • Temperature errors at two gridpoints may be correlated with each other • Error covariances specify over what distance an observation increment should be “spread” • RTMA used decorrelation lengths of: • Horizontal (R): 40 km • Vertical (Z): 100 m
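
One common way to express such a covariance is a separable Gaussian in horizontal distance and elevation difference; the sketch below assumes that form with the decorrelation lengths quoted above (the operational RTMA correlation model may differ in detail):

```python
import numpy as np

def background_error_correlation(dist_km, dz_m, R_km=40.0, Z_m=100.0):
    """Correlation between background errors at two points separated by
    dist_km horizontally and dz_m in elevation, assuming a Gaussian
    falloff with horizontal scale R_km and vertical scale Z_m."""
    return np.exp(-(dist_km / R_km) ** 2 - (dz_m / Z_m) ** 2)

# Two points 40 km apart at the same elevation: correlation ~ 0.37.
print(background_error_correlation(40.0, 0.0))
# Same horizontal location but 200 m apart in elevation: ~ 0.02.
print(background_error_correlation(0.0, 200.0))
```

Under this assumed form, increments spread much more readily along terrain at a common elevation than across large elevation differences.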

  25. Error Correlation Example – KOKV (R = 40 km, Z = 100 m)

  26. Error Correlation Example – KOKV (R = 80 km, Z = 200 m)

  27. Local Surface Analysis • RTMA experiments run on NCEP’s Haze supercomputer, but limited computer time is available • Development of a local surface analysis (LSA) • Same background field • Same observation dataset, but without internal quality control • Similar 2D-Var method, but does not use recursive filters • Smaller domain • Control run uses the same characteristics as the RTMA at the time of the study (decorrelation length scales; observation-to-background error ratio)

  28. LSA Temperature Analysis (R = 40 km, Z = 100 m, σo²/σb² = 1)

  29. RTMA Temperature Analysis – 0900 UTC 22 October 2007

  30. Data Denial Experiments • Evaluation of analyses done by splitting up all observations randomly into 10 groups • Two error measures: • Root-mean-square error calculated using the withheld observations only vs. using the observations assimilated into the analysis • To avoid overfitting, want RMSE at withheld locations to be similar to RMSE at locations used in the analysis • Root-mean-square sensitivity computed at all gridpoints • Want analyses to be less sensitive to withheld observations
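
A minimal sketch of the two error measures described above, assuming the ten denial analyses have already been produced (the function and array names are illustrative, not from the RTMA code):

```python
import numpy as np

def rmse(analysis_at_stations, station_obs):
    """RMSE of an analysis interpolated to station locations, computed
    either for the withheld group or for the assimilated observations."""
    return float(np.sqrt(np.mean((analysis_at_stations - station_obs) ** 2)))

def rms_sensitivity(control_analysis, denial_analysis):
    """Root-mean-square difference over all gridpoints between the control
    analysis (all obs) and the analysis with one group of obs withheld."""
    return float(np.sqrt(np.mean((control_analysis - denial_analysis) ** 2)))
```

Comparable RMSE at withheld and assimilated stations, together with low sensitivity, indicates the analysis is not overfitting the observations.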

  31. Withholding Observations – Group 5 (R = 40 km, Z = 100 m, σo²/σb² = 1)

  32. Withheld Data Groups 1–5 (R = 40 km, Z = 100 m, σo²/σb² = 1)

  33. Withheld Data Groups 6–10 (R = 40 km, Z = 100 m, σo²/σb² = 1)

  34. RTMA Temperature Analysis Sensitivity – Group 1

  35. LSA RMSE and Sensitivity Compared to RTMA • Suggests overfitting • Overly sensitive to withheld observations

  36. How good does the RTMA have to be? • Is the downscaled RUC background field good enough? • RUC RMSE is approximately 2°C relative to all observations in the CONUS • How much does a 2D-Var analysis improve the RMSE? • 0.1–0.2°C based on LSA results

  37. Verifying Forecasts with Analyses • Motivating question: Is there a way to define what constitutes a “good enough” forecast? • Problem: Concerns about grid-based verification • Reality: Forecasters need feedback for entire forecast grids, not just selected key observation locations

  38. Forecaster Concerns • Quality of the verifying analysis • Penalty for adding mesoscale detail to grids in areas unresolved by analysis • Bad (mesonet) observations influencing analysis • Analysis in remote areas – driven mostly by the background model

  39. Using RTMA Analyses and RTMA Uncertainty Estimates for Forecast Verification • RTMA uncertainty estimate grids are experimental products under development for temperature, moisture, wind • Goal: • Higher uncertainty in data sparse areas or areas with larger representativeness errors • Lower uncertainty in data dense areas or areas with smaller representativeness errors • Basic premise • Forecasters not penalized as much where the analysis is more uncertain

  40. Forecast Verification Example – Temperature (°C) • Panels: Forecast, RTMA, and RTMA Uncertainty grids • Valley with many observations; mountains with unrepresentative observations

  41. Forecast Verification Example – Temperature (°C) • Panels: Forecast, RTMA, and RTMA Uncertainty grids • Valley with many observations; mountains with unrepresentative observations • No color = difference between the forecast and analysis is less than the analysis uncertainty • Red = abs(RTMA – Forecast) > Uncertainty
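
A sketch of the verification rule illustrated on this slide, assuming the forecast, RTMA, and uncertainty grids are co-located arrays (the function name is illustrative):

```python
import numpy as np

def verification_flags(forecast, rtma, uncertainty):
    """True ('red') where the forecast differs from the RTMA by more than
    the RTMA uncertainty; elsewhere the difference is within uncertainty."""
    return np.abs(rtma - forecast) > uncertainty

# Example: a 3°C forecast error is flagged only where uncertainty < 3°C.
flags = verification_flags(np.array([7.0, 10.0]),
                           np.array([4.0, 7.0]),
                           np.array([1.0, 4.0]))
print(flags)  # [ True False]
```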

  42. Summary • Improving current analyses such as RTMA requires improving observations, background fields, and analysis techniques • Increase number of high-quality observations available to the analysis • Improve background forecast/analysis from which the analyses begin • Adjust assumptions regarding how background errors are related from one location to another • Future approaches • Treat analyses like forecasts: best solutions are ensemble ones rather than deterministic ones • Depend on assimilation system to define error characteristics of modeling system including errors of the background fields • Improve forward operators that translate how background values correspond to observations
