
Presentation Transcript


  1. AMS/EPA Workshop on the Evaluation of Regional-Scale Air Quality Modeling Systems: Overview & Next Steps Tyler Fox, USEPA 6th Annual CMAS Conference October 1, 2007

  2. Evolving US Air Quality Management System Source: John Bachmann, EM Magazine, June 2007

  3. Steering Committee Members • S.T. Rao • Alice Gilliland • Kenneth Schere • Robin Dennis • Dr. Steven Hanna • John S. Irwin • Christian Hogrefe • Prof. Douw Steyn • Prof. Montserrat Fuentes • Prof. Akula Venkatram • Christian Seigneur • Rich Scheffe • Tyler Fox

  4. Workshop Objectives • Discuss approaches to advance process-level evaluations of meteorology, emissions, chemistry, and chemical-transport modeling. • Discuss and develop approaches to advance air quality model evaluation methods and procedures. • Develop specific recommendations for model evaluations from air quality management and forecasting perspectives.

  5. Keynote Topics • Evaluating performance of • meteorological processes within air quality modeling systems • source and sink processes within air quality modeling systems • chemistry and aerosol processes within air quality modeling systems • Methods and processes for evaluating the performance of air quality modeling system components

  6. Some discussion items • Some of the most important MET variables for AQ modeling are not routinely and reliably evaluated (e.g., PBL depth & cloud properties) • Discussed a number of model probing tools: source apportionment & receptor modeling; sensitivity analysis & process analysis • Measurements are critical, so it is imperative that model developers/users be involved in the design of monitoring networks and special field studies. • Model evaluation methods should be specific to the context of the application (i.e., fit for purpose). • Purpose of evaluation? Acceptance for an application; guide and influence further modeling system improvements

  7. Processes Affecting Modeled PM2.5 (Source: Prakash Bhave) • Diagram plotting individual processes by their potential to improve model performance (LOW to HIGH) against the availability of lab or field measurements (SPARSE to ABUNDANT) • Processes shown include: emissions; MET processes (PBL height, clouds, precipitation, wind speed/direction, temperature, RH); source/sink processes (SOA, wet removal, dry deposition); and chemistry/aerosol processes (SO4 aqueous chemistry, N2O5/NO3 heterogeneous chemistry, OC aging, aerosol thermodynamics, gas mechanisms, coagulation, nucleation)

  8. MODEL EVALUATION FRAMEWORK • Modeling system components: model inputs (meteorology and emissions); chemical transformation (gas, aerosol, and aqueous phases); transport (advection and diffusion); removal (dry and wet deposition); model-predicted concentration and deposition • Operational Evaluation (Are we getting the right answers?): How do the model-predicted concentrations compare to observed concentration data? What are the overall temporal or spatial prediction errors or biases? • Dynamic Evaluation (Can we capture observed changes in air quality?): Can the model capture changes related to meteorological events or variations? Can the model capture changes related to emission reductions? Can we identify needed improvements for modeled processes or inputs? • Diagnostic Evaluation (Are we getting the right answers for the right, or wrong, reasons?): Are model errors or biases caused by model inputs or by modeled processes? Can we identify the specific modeled process(es) responsible? • Probabilistic Evaluation (What is our confidence in the model predictions?): How do observed concentrations compare within an uncertainty range of model predictions?
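  As a concrete illustration of the operational-evaluation questions above, the following Python sketch computes common summary statistics (mean bias, mean error, normalized mean bias and error, correlation) from paired model predictions and observations. The choice of statistics, the function name, and the example data are illustrative assumptions and are not taken from the workshop material.

```python
# Minimal sketch of an operational evaluation for paired model/observation
# data. The statistic definitions below (mean bias, mean error, normalized
# mean bias/error, correlation) are standard choices, not prescribed by the
# workshop; names and example values are hypothetical.
import numpy as np

def operational_stats(model, obs):
    """Summary statistics for paired predicted and observed concentrations."""
    model = np.asarray(model, dtype=float)
    obs = np.asarray(obs, dtype=float)
    diff = model - obs
    return {
        "mean_bias": diff.mean(),                              # same units as data
        "mean_error": np.abs(diff).mean(),
        "normalized_mean_bias_pct": 100.0 * diff.sum() / obs.sum(),
        "normalized_mean_error_pct": 100.0 * np.abs(diff).sum() / obs.sum(),
        "correlation": float(np.corrcoef(model, obs)[0, 1]),
    }

# Example with made-up 24-hr sulfate concentrations (ug/m3)
print(operational_stats(model=[2.1, 3.4, 1.8, 4.0], obs=[1.9, 3.0, 2.2, 3.5]))
```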

  9. EPA Modeling Guidance for SIP Demonstrations • In April 2007, EPA released the final version of Guidance on the Use of Models and Other Analyses for Demonstrating Attainment of Air Quality Goals for Ozone, PM2.5, and Regional Haze • Chapter 18 “What are the procedures for evaluating model performance and what is the role of diagnostic analysis?” • Appendix B “Summary of recent model performance evaluations” • Available at: http://www.epa.gov/scram001/guidance_sip.htm

  10. Morris, R., et al., “Model and Chemistry Inter-comparison: CMAQ with CB4, CB4-2002, SAPRC99”, National RPO Modeling Meeting, Denver, CO, 2005b. • Based on US (36-km) and VISTAS (12-km) January 2002 modeling, conducted chemistry mechanism inter-comparisons for CMAQ with CB4, CB4-2002, and SAPRC99. • The performance of CB4 and CB4-2002 was similar for PM, and superior to SAPRC99 overall (for the Jan02 case). • Model performance for CMAQ/CB4 on the US 36-km domain was in the range of: Sulfate: MFE = 42% ~ 73%, MFB = -21% ~ +14%; Nitrate: MFE = 62% ~ 105%, MFB = -21% ~ +46%; Organic: MFE = 50% ~ 77%, MFB = +3% ~ +59%; EC: MFE = 59% ~ 88%, MFB = +2% ~ +70%; Soil: MFE = 165% ~ 180%, MFB = +164% ~ +180%; PM2.5: MFE = 48% ~ 88%, MFB = +25% ~ +81% • Given that the computational cost of SAPRC99 is twice that of CB4, it was suggested to use 36- and 12-km grids with CB4 chemistry for PM modeling for the time being. • Noted that both CB4 and SAPRC significantly underpredicted winter O3.
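  For context on the statistics quoted above, mean fractional bias (MFB) and mean fractional error (MFE) are the standard symmetrically normalized metrics MFB = (2/N) Σ (M_i − O_i)/(M_i + O_i) and MFE = (2/N) Σ |M_i − O_i|/(M_i + O_i), usually expressed in percent. The Python sketch below implements these standard definitions; the function name and example concentrations are hypothetical and are not drawn from the Morris et al. study.

```python
# Illustrative implementation of mean fractional bias (MFB) and mean
# fractional error (MFE), the metrics cited in the Morris et al. results:
#   MFB = (2/N) * sum((M_i - O_i) / (M_i + O_i)) * 100%
#   MFE = (2/N) * sum(|M_i - O_i| / (M_i + O_i)) * 100%
# Function name and example values are hypothetical.
import numpy as np

def fractional_bias_error(model, obs):
    """Return (MFB %, MFE %) for paired model and observed concentrations."""
    model = np.asarray(model, dtype=float)
    obs = np.asarray(obs, dtype=float)
    frac = 2.0 * (model - obs) / (model + obs)
    return 100.0 * frac.mean(), 100.0 * np.abs(frac).mean()

# Example: paired 24-hr average sulfate concentrations (ug/m3)
mfb, mfe = fractional_bias_error([1.2, 2.5, 0.8, 3.1], [1.0, 3.0, 0.9, 2.6])
print(f"MFB = {mfb:+.1f}%, MFE = {mfe:.1f}%")
```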

  11. Workshop Next Steps • The workshop Steering Committee is currently preparing a manuscript summarizing the recommendations of the workshop participants for publication in the Bulletin of the American Meteorological Society • Conduct follow-on workshop(s) in 2008 to discuss the results of applying the recommended methods and the lessons learned.

  12. Expected Outcomes • Promote dialogue across the community to gain better understanding and ultimately agreement on “accepted” evaluation methods and techniques • Build confidence in the use of regional-scale air quality model outputs for air quality management and air quality forecasting purposes.

  13. Evolving US Air Quality Management System Source: John Bachmann, EM Magazine, June 2007

  14. Challenges Ahead • SIP Modeling for Attainment Demos • Dynamic evaluations of model responsiveness • Public Health and Exposure Assessments • Improve air quality characterization for health studies at local and neighborhood scales • Integrated, Multi-Pollutant AQM Planning • “One-atmosphere” modeling to better inform control strategy development & more comprehensive planning • Climate-Air Quality Linkages • Link climate and regional modeling systems to address feedbacks on emissions, meteorology, and chemistry.
