
VISTAS Meteorological Modeling November 6, 2003 National RPO Meeting St. Louis, MO


Presentation Transcript


  1. VISTAS Meteorological Modeling. November 6, 2003, National RPO Meeting, St. Louis, MO. Mike Abraczinskas, North Carolina Division of Air Quality.

  2. Contract with Baron Advanced Meteorological Systems (BAMS), formerly known as MCNC. Don Olerud, BAMS Technical Lead. Contract initiated January 2003.

  3. Meteorological Modeling Goals. Phase I: test the model to define the appropriate setup for our region. Investigate -> Model -> Evaluate -> Make decisions.

  4. Meteorological Modeling Goals, Phase I. Summary of recent and relevant MM5 sensitivity studies (draft delivered January 2003): learn from what others have done; inter-RPO collaboration; will serve as a starting point for VISTAS. Recommend a set of sensitivity tests (draft delivered January 2003): different physics options and inputs proposed for testing.

  5. Meteorological Modeling Goals, Phase I. Evaluation methodologies (draft delivered January 2003, updated April 2003). Assessing model performance: Is the conceptual understanding correct (placement, timing of features)? Are diurnal features adequately captured? Are clouds reasonably well modeled? Are precipitation fields reasonable? Do wind fields generally match observations? Do temperature and moisture fields match observations? And the million-dollar question: do the meteorological fields produce acceptable air quality model results?

  6. Meteorological Modeling Goals, Phase I. Evaluation product types: spatial, spatial aloft, timeseries, sounding, spatial statistics, timeseries statistics, combination, timeseries statistics aloft, statistical tables, profiler, and cross-sensitivity products.

  7. Meteorological Modeling Goals. Phase I: test the model to define the appropriate setup for our region. Investigate -> Model -> Evaluate -> Make decisions. Which periods are we modeling? What is the geographical extent of testing?

  8. Sensitivity episodes: Episode 1, January 1–20, 2002; Episode 2, July 13–27, 2001; Episode 3, July 13–21, 1999. Choice of episode periods was based on: availability of robust AQ databases; a full AQ cycle (clean-dirty-clean); availability of meteorological data; air quality and meteorological regime.
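
For concreteness, the episode windows can be carried as simple date ranges when organizing runs and matching observations. The snippet below is an illustrative sketch only; the `EPISODES` structure and its layout are hypothetical, not part of the VISTAS deliverables.

```python
from datetime import date

# Sensitivity episodes from the slide above; the dict layout and the
# name EPISODES are illustrative only, not from the VISTAS workflow.
EPISODES = {
    "Episode 1": (date(2002, 1, 1), date(2002, 1, 20)),   # winter case
    "Episode 2": (date(2001, 7, 13), date(2001, 7, 27)),  # summer case
    "Episode 3": (date(1999, 7, 13), date(1999, 7, 21)),  # summer case
}

for name, (start, end) in EPISODES.items():
    print(f"{name}: {start} to {end} ({(end - start).days + 1} days)")
```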

  9. [Figure: map of the nested modeling domains, 36 km outer grid with a 12 km inner grid]

  10. Sensitivity tests (PX_ACM is the base case):
  - PX_ACM: Pleim-Xiu land-surface model, ACM PBL scheme
  - NOAH_MRF: NOAH land-surface model, MRF PBL scheme
  - Multi_Blkdr: multi-layer soil model, Blackadar PBL scheme
  - NOAH_ETA-MY: NOAH land-surface model, Eta Mellor-Yamada PBL scheme
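
Restated as data, the test matrix pairs each land-surface model with a PBL scheme. The mapping below simply re-encodes the slide; the strings are descriptive labels, not actual MM5 namelist settings.

```python
# Sensitivity runs: land-surface model (LSM) paired with PBL scheme.
# Labels come from the slide above, not from MM5 namelist codes.
SENSITIVITY_RUNS = {
    "PX_ACM":      ("Pleim-Xiu LSM",    "ACM"),                # base case
    "NOAH_MRF":    ("NOAH LSM",         "MRF"),
    "Multi_Blkdr": ("multi-layer soil", "Blackadar"),
    "NOAH_ETA-MY": ("NOAH LSM",         "Eta Mellor-Yamada"),
}

for run, (lsm, pbl) in SENSITIVITY_RUNS.items():
    print(f"{run}: {lsm} + {pbl} PBL")
```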

  11. January 2002, Episode 1: The PX_ACM case is significantly cold-biased. PX_ACM runs are continuous (i.e., soil/moisture values from one modeling segment serve as initial conditions for the following segment). Significantly better results were obtained by making each P-X run independent (PX_ACM2), as sketched below.
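
The continuous-versus-independent distinction can be made concrete in pseudocode. `run_mm5_segment` and `analysis_soil_state` below are hypothetical placeholders for the actual BAMS workflow, and the 5-day segment boundaries are illustrative.

```python
# Hypothetical sketch of two ways to initialize soil fields across
# consecutive MM5 segments; function names and segment boundaries
# are placeholders, not from the BAMS setup.

def analysis_soil_state(start_day):
    """Placeholder: soil temperature/moisture taken from a fresh analysis."""
    return {"start_day": start_day}

def run_mm5_segment(start_day, soil_state):
    """Placeholder: run one MM5 segment; return soil fields at segment end."""
    return soil_state  # in reality, the evolved soil fields

segment_starts = [1, 6, 11, 16]  # e.g., splitting the January episode

# PX_ACM (continuous): each segment inherits the previous segment's soil
# state, so any drift (here, the cold bias) accumulates across the episode.
soil = analysis_soil_state(segment_starts[0])
for day in segment_starts:
    soil = run_mm5_segment(day, soil)

# PX_ACM2 (independent): every segment is re-initialized from analysis,
# which gave significantly better results in the January 2002 tests.
for day in segment_starts:
    run_mm5_segment(day, analysis_soil_state(day))
```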

  12.–15. [Figures: two-panel temperature (T) comparison maps for the sensitivity runs]

  16. 1.5 m temperature stats, 12 km domain, all hours, Episode 1:

  Run   Bias    AbsErr   IA
  PX    -2.68   3.15     0.854
  PX2   -1.38   2.25     0.877
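
For reference, the three columns can be computed from paired model/observation values as below. This is a minimal sketch using standard definitions (Willmott's index of agreement for IA), not BAMS's actual evaluation code.

```python
import numpy as np

def temp_stats(model, obs):
    """Bias, mean absolute error, and Willmott index of agreement (IA)
    for paired model/observation arrays (standard definitions)."""
    model, obs = np.asarray(model, float), np.asarray(obs, float)
    bias = np.mean(model - obs)
    abserr = np.mean(np.abs(model - obs))
    obar = np.mean(obs)
    ia = 1.0 - np.sum((model - obs) ** 2) / np.sum(
        (np.abs(model - obar) + np.abs(obs - obar)) ** 2
    )
    return bias, abserr, ia

# Toy example; a negative bias corresponds to the cold bias noted above.
b, ae, ia = temp_stats(model=[270.1, 272.4, 275.0], obs=[273.0, 274.9, 276.2])
print(f"bias={b:.2f}  abserr={ae:.2f}  IA={ia:.3f}")
```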

  17. [Figure: temperature (T) comparison map]

  18. [Figure: daytime cloud fraction (CFRAC, alt)]

  19. [Figure: daytime cloud fraction (CFRAC, alt) differences]

  20. [Figure: nighttime cloud fraction (CFRAC, alt)]

  21. [Figure: nighttime cloud fraction (CFRAC, alt) differences]

  22. [Figure: 24-h precipitation]

  23. [Figure: 24-h precipitation differences]

  24. [Figure: daytime precipitation]

  25. [Figure: daytime precipitation differences]

  26. [Figure: temperature (T) comparison map]

  27. PBL heights, subjective observations: NOAH_MRF is by far the highest and smoothest, and probably too high. PX_ACM2 is roughly comparable to Multi_Blkdr. PX_ACM2 is subject to some suppressed PBL heights in places during the day; some of this may be real (over melting snow, or in the presence of clouds/precipitation), but the lack of observations makes this nearly impossible to evaluate. PX_ACM2 is very low at night. NOAH_ETA-MY is lowest during the day.
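
Since direct PBL-height observations are scarce, diagnosed heights are usually compared run-to-run. One common diagnostic (not necessarily the one used in this study) takes the lowest level where the bulk Richardson number from the surface exceeds a critical value, sketched below.

```python
import numpy as np

def pbl_height_bulk_ri(z, theta_v, u, v, ri_crit=0.25):
    """Estimate PBL height (m AGL) as the lowest level where the bulk
    Richardson number, computed from the lowest level, exceeds ri_crit.
    z: heights AGL (m); theta_v: virtual potential temperature (K);
    u, v: wind components (m/s). A common diagnostic, for illustration."""
    g = 9.81
    for k in range(1, len(z)):
        shear2 = max((u[k] - u[0]) ** 2 + (v[k] - v[0]) ** 2, 1e-6)
        ri_b = g * (theta_v[k] - theta_v[0]) * (z[k] - z[0]) / (theta_v[0] * shear2)
        if ri_b > ri_crit:
            return z[k]
    return z[-1]  # PBL top at or above the top of the profile

# Toy daytime profile: well-mixed below ~1500 m, capped by an inversion.
z = np.array([10.0, 250.0, 500.0, 1000.0, 1500.0, 2000.0])
theta_v = np.array([300.0, 300.1, 300.1, 300.2, 302.5, 305.0])
u = np.array([2.0, 4.0, 5.0, 8.0, 8.0, 10.0])
v = np.zeros(6)
print(pbl_height_bulk_ri(z, theta_v, u, v))  # -> 1500.0
```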

  28. Time series statistics, 3-panel plots: bias, error, and index of agreement for t, q, cld, spd, dir, and RH; bias, accuracy, and equitable threat score for pcp (thresholds of 0.01, 0.05, 0.10, 0.25, 0.5, and 1.0 in). Labels are sometimes difficult to see, so colors remain consistent across plots: px_acm(2) blue, noah_mrf red, multi_blkdr black, noah_eta-my purple. Pcp plots are only available for "Full" regions.
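
The precipitation scores reduce to a 2x2 contingency table at each threshold. Below is a minimal sketch of the standard frequency bias, accuracy, and equitable threat score definitions (not the BAMS plotting code); the example arrays are toy data.

```python
import numpy as np

def precip_scores(model, obs, threshold):
    """Frequency bias, accuracy, and equitable threat score (ETS) for
    precipitation exceeding `threshold` (standard 2x2 definitions)."""
    f = np.asarray(model, float) >= threshold   # forecast yes/no
    o = np.asarray(obs, float) >= threshold     # observed yes/no
    hits = np.sum(f & o)
    false_alarms = np.sum(f & ~o)
    misses = np.sum(~f & o)
    correct_neg = np.sum(~f & ~o)
    n = hits + false_alarms + misses + correct_neg
    chance_hits = (hits + misses) * (hits + false_alarms) / n  # random hits
    bias = (hits + false_alarms) / (hits + misses)
    accuracy = (hits + correct_neg) / n
    ets = (hits - chance_hits) / (hits + misses + false_alarms - chance_hits)
    return bias, accuracy, ets

# Scores at the slide's thresholds (inches), for toy 24-h precip fields:
model = [0.00, 0.20, 0.60, 1.20]
obs = [0.05, 0.10, 0.40, 1.50]
for thr in (0.01, 0.05, 0.10, 0.25, 0.5, 1.0):
    b, a, e = precip_scores(model, obs, thr)
    print(f"{thr:.2f} in: bias={b:.2f} accuracy={a:.2f} ETS={e:.2f}")
```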

  29. [Figure: temperature stats (Episode 1)]

  30. [Figure: mixing ratio stats]

  31. [Figure: wind speed stats]

  32. [Figure: wind direction stats]

  33. [Figure: cloud fraction stats]

  34. [Figure: cloud fraction stats (alt)]

  35. [Figure: relative humidity stats]

  36. [Figure: T (~500 m aloft)]

  37. [Figure: T (~1600 m aloft)]

  38. [Figure: T (~3400 m aloft)]

  39.–41. [Figures: mixing ratio (Q) aloft]

  42.–43. [Figures: wind direction (D) aloft]
