
Regional Haze Modeling: Recent Modeling Results for VISTAS and WRAP


Presentation Transcript


  1. Regional Haze Modeling: Recent Modeling Results for VISTAS and WRAP. University of California, Riverside. October 27, 2003, CMAS Annual Meeting, RTP, NC

  2. Modeling Team Participants • UC Riverside: Gail Tonnesen, Zion Wang, Chao-Jung Chien, Mohammad Omary, Bo Wang • Ralph Morris et al., ENVIRON Corporation • Zac Adelman et al., Carolina Environmental Program • Tom Tesche et al., Alpine Geophysics • Don Olerud, BAMS

  3. Acknowledgments • Western Regional Air Partnership: John Vimont, Mary Uhl, Kevin Briggs, Tom Moore • VISTAS: Pat Brewer, Jim Boylan, Sheila Holman

  4. Topics • Model Performance Evaluation • WRAP 1996 Model Performance Evaluation • VISTAS 2002 Sensitivity Results • CMAQ Benchmarks

  5. WRAP Modeling • 1996 annual modeling • 36 km grid for the western US: 95 x 85 cells, 18 vertical layers • MM5 meteorology by Olerud et al.

  6. WRAP Emissions Updates • Corrections to point sources • MOBILE6 beta for WRAP states • Monthly corrections to NH3 based on EPA/ORD inverse modeling • Updated non-road model • Typical fires used for the results shown here • 1996 NEI for non-WRAP states

  7. WRAP - CMAQ revisions • v0301, released in March 2001: used as the base case and for all sensitivity cases in WRAP's Section 309 simulations • v0602, released in June 2002 • v4.2.2, released in March 2003 • v4.3, released in September 2003

  8. Comparisons based on IMPROVE evaluation

  9. Model Performance Metrics • How well does the model reproduce mean, modal, and variational characteristics? • Using observations to normalize model error and bias can lead to misleading conclusions: a very small observation produces a large bias or error; model under-prediction is bounded by -1; so over-prediction is weighted more heavily than under-prediction • We used Mean Normalized Error & Bias in the Section 309 work: a poor metric for clean conditions (see the sketch below)
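
To make that asymmetry concrete, here is a minimal Python sketch (ours, not the presentation's code) of the mean normalized bias: a model that triples every observation scores +2 and is unbounded above, while a model that misses everything entirely can never score worse than -1.

```python
import numpy as np

def mean_normalized_bias(model, obs):
    """Mean normalized bias: average of (model - obs) / obs over all pairs."""
    model = np.asarray(model, dtype=float)
    obs = np.asarray(obs, dtype=float)
    return np.mean((model - obs) / obs)

obs = np.array([1.0, 1.0, 1.0])
# A 3x over-prediction scores +2.0 -- unbounded above...
print(mean_normalized_bias([3.0, 3.0, 3.0], obs))  # 2.0
# ...but even a total miss can never score below -1.0.
print(mean_normalized_bias([0.0, 0.0, 0.0], obs))  # -1.0
```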

  10. Recommended Performance Metrics • Use fractional error and bias: both are bounded by symmetric limits of ±2 • Normalized Mean Error & Bias: divide the sum of the errors by the sum of the observations • Coefficient of determination (R2): indicates how much of the variability in the model predictions is explained by their relationship to the ambient observations, i.e., how close the predictions are to the observations (each metric is sketched in code below)
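
A minimal Python sketch of the recommended metrics as defined above; the function names are ours, and model and obs are assumed to be NumPy arrays of paired values.

```python
import numpy as np

def fractional_bias(model, obs):
    """Fractional bias, bounded by symmetric limits of +/-2."""
    return np.mean(2.0 * (model - obs) / (model + obs))

def fractional_error(model, obs):
    """Fractional error, bounded by [0, 2]."""
    return np.mean(2.0 * np.abs(model - obs) / (model + obs))

def normalized_mean_bias(model, obs):
    """Sum of the errors divided by the sum of the observations."""
    return np.sum(model - obs) / np.sum(obs)

def normalized_mean_error(model, obs):
    """Sum of the absolute errors divided by the sum of the observations."""
    return np.sum(np.abs(model - obs)) / np.sum(obs)

def r_squared(model, obs):
    """Coefficient of determination via the Pearson correlation."""
    r = np.corrcoef(model, obs)[0, 1]
    return r ** 2

model = np.array([3.1, 4.8, 2.2, 6.4])
obs = np.array([2.9, 5.5, 1.8, 6.0])
print(fractional_bias(model, obs))  # small positive FB: slight over-prediction
print(normalized_mean_error(model, obs), r_squared(model, obs))
```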

  11. Statistical measures used in model performance evaluation

  12. Statistical measures used in model performance evaluation

  13. Statistical measures used in model performance evaluation • In addition… • Mean observation • Mean prediction • Standard deviation (SD) of observation • Standard deviation (SD) of prediction • Correlation variance
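
A minimal sketch of these additional summary measures in Python (our own helper, not the presenters' code); we read the slide's "correlation variance" as the squared Pearson correlation, an assumption since the slide does not define the term.

```python
import numpy as np

def summary_stats(model, obs):
    """Additional summary measures listed on the slide.

    'corr_variance' is computed here as the squared Pearson correlation
    (our interpretation of the slide's 'correlation variance').
    """
    return {
        "mean_obs": np.mean(obs),
        "mean_model": np.mean(model),
        "sd_obs": np.std(obs, ddof=1),
        "sd_model": np.std(model, ddof=1),
        "corr_variance": np.corrcoef(model, obs)[0, 1] ** 2,
    }
```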

  14. Expanded Model Evaluation Software to include… • Ambient data evaluation for air quality monitoring networks: IMPROVE (24-hour average PM), CASTNet (weekly average PM and gases), STN (24-hour average PM), AQS (hourly gases), NADP (weekly total deposition), SEARCH • 17 statistical measures for model performance evaluation • All performance metrics can be computed in an automated process for the selected model and data, aggregated as: allsite_daily · onesite_daily · allsite_monthly · onesite_monthly · allsite_yearly · onesite_yearly (see the sketch below)
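
A sketch of how such aggregation modes might be driven with pandas; the table layout, column names, site codes, and values are illustrative assumptions, and only the mode names (e.g. allsite_daily, onesite_monthly) come from the slide.

```python
import pandas as pd

# Hypothetical paired model/observation table -- columns and values are
# illustrative; only the aggregation-mode names come from the slide.
pairs = pd.DataFrame({
    "site":  ["SHEN1", "SHEN1", "GRSM1", "GRSM1"],
    "date":  pd.to_datetime(["1996-01-01", "1996-01-02",
                             "1996-01-01", "1996-01-02"]),
    "obs":   [4.2, 5.1, 3.8, 6.0],
    "model": [3.9, 5.6, 4.4, 5.2],
})

def nmb(g):
    """Normalized mean bias for one group of paired values."""
    return (g["model"] - g["obs"]).sum() / g["obs"].sum()

# allsite_daily: one statistic per day, pooled over all sites
allsite_daily = pairs.groupby("date").apply(nmb)

# onesite_monthly: one statistic per site per calendar month
month = pairs["date"].dt.to_period("M")
onesite_monthly = pairs.groupby(["site", month]).apply(nmb)

print(allsite_daily)
print(onesite_monthly)
```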

  15. Community Model Evaluation Tool? • Facilitate model evaluation • Benefit from shared development of the tool • Share monitoring data • UCR software available at: www.cert.ucr.edu/aqm

  16. WRAP 1996 Evaluation, CMAQ v4.3

  17. WRAP 1996 Evaluation, CMAQ v4.3

  18. WRAP 1996 Evaluation, CMAQ v4.3

  19. WRAP 1996 Evaluation, CMAQ v4.3

  20. WRAP 1996 cases in progress • New fugitive dust emissions model • New NH3 emissions model • Actual prescribed and agricultural burning emissions • 2002 annual simulations being developed

  21. VISTAS Model, 12 km Domain • 34-layer MM5 by Olerud • 1999 NEI • CMAQ v3

  22. VISTAS Sensitivity Cases • 3 episodes: Jan 2002, July 1999, July 2001 • Sensitivity cases: MM5 MRF and ETA-MY, PBL height, Kz_min, layer collapsing, CB4-2002, SAPRC99, CMAQ-AIM, GEOS-CHEM for boundary conditions, NH3 emissions

  23. VISTAS Key Findings • NO3 over-predicted in winter, under-predicted in summer • The Thornton et al. N2O5 chemistry update gave a small benefit: July MNB improved from -50% to -45% • SO4 performance reasonably good • Problems with PBL heights: Kz_min = 1 improved performance; PBL height corrections under investigation • Minor differences between 19 and 34 layers

  24. Benchmarks • Athlon MP 2000 (1.66 GHz) vs. Opteron 246 (2.0 GHz) • 32-bit vs. 64-bit code • Compare 1, 4, and 8 CPUs • Ported CMAQ to 64-bit SuSE: pointers and memory allocation updated for 64 bit

  25. Test Case for Benchmarks • VISTAS 12 km domain: 168 x 177 x 19 layers • Benchmarks for CMAQ 4.3: one-day simulation, CB4, MEBI solver • Single-CPU run times (hours:minutes): Athlon 2 GHz, 14:10; Opteron 32-bit 2 GHz, 12:49; Opteron 64-bit 2 GHz, 10:57
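
In minutes, those wall-clock times are 850 (Athlon), 769 (Opteron 32-bit), and 657 (Opteron 64-bit). So on the same Opteron hardware, recompiling for 64 bits gives roughly a 17% speedup (769/657 ≈ 1.17), and the 64-bit Opteron run is about 29% faster than the Athlon run (850/657 ≈ 1.29).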

  26. Optimal Cost Configuration • Small cluster (< 8 CPUs): use Athlons • Large cluster (> 16 CPUs): use Opterons?

  27. Conclusions • Major improvements in the WRAP 1996 model • WRAP 2002 annual modeling underway • VISTAS sensitivity studies still show problems with NO3 • Need a better NH3 inventory • Need more attention to PBL heights in MM5 • Community model evaluation tool?
