
Refinement of the 16-20 Sept ‘00 Modeling Episode: Part II—Performance Improvement



Presentation Transcript


1. Refinement of the 16-20 Sept ’00 Modeling Episode: Part II—Performance Improvement
Central California Ozone Study: Modeling Support for 16-20 Sept ’00 Episode
Kickoff Meeting, Sacramento, CA
T. W. Tesche
24 October 2003

2. Part II: Presentation Overview
• Regulatory Modeling Episode Development Issues
• Components of the Model Performance Evaluation (MPE)
• Model Performance Improvement Process
• Potential Sources of Model Performance Problems
  • Meteorology
  • Emissions
  • Other
• Diagnostic Tools and Analyses
• Process Analysis
• Emissions Inventory Sensitivity/Uncertainty Analyses
• Use of Aircraft and Other Aloft Air Quality and Met Data
• Initial Diagnostic Steps

3. Regulatory Modeling Episode Development Issues
• Modeling should adhere to EPA photochemical modeling guidance
• EPA guidance requires modeling protocols and adherence to them
• Performance goals are identified by EPA for 1-hr SIP modeling (see the statistics sketch after this list)
• Schedule and resource constraints limit a ‘full-science’ approach to model base case development and evaluation (i.e., limited multispecies, compensating-error, and alternative base case analyses)
• Weight-of-evidence analyses are more strongly encouraged given growing experience in regulatory applications
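As a concrete reference point, the 1-hr statistics EPA guidance asks for are straightforward to compute from paired hourly predictions and observations. Below is a minimal sketch, assuming paired surface ozone arrays in ppb and the commonly used 60 ppb observed-concentration cutoff; the goal ranges in the comments are the values typically cited for 1-hr SIP modeling, not numbers taken from this presentation.

```python
import numpy as np

def ozone_mpe_stats(obs, prd, cutoff=60.0):
    """EPA 1-hr ozone operational statistics (illustrative sketch).

    obs, prd : paired hourly observed/predicted ozone (ppb).
    cutoff   : observed-concentration cutoff (60 ppb is a common choice).
    Commonly cited goals: bias within roughly +/-5 to 15%, gross error
    below roughly 30 to 35%, unpaired peak accuracy within +/-15 to 20%.
    """
    obs = np.asarray(obs, dtype=float)
    prd = np.asarray(prd, dtype=float)
    mask = obs >= cutoff                        # use pairs above cutoff only
    resid = (prd[mask] - obs[mask]) / obs[mask]
    mnb = 100.0 * resid.mean()                  # mean normalized bias (%)
    mnge = 100.0 * np.abs(resid).mean()         # mean normalized gross error (%)
    upa = 100.0 * (prd.max() - obs.max()) / obs.max()  # unpaired peak accuracy (%)
    return mnb, mnge, upa
```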

4. Remember…
“Making policy decisions based on mathematical models is like marriage… at some point you decide that you can live with certain flaws and trade-offs.”
(A student’s answer to a test in one of Prof. Jeffries’ classes)

5. Components of the Model Performance Evaluation (MPE)
• Initial screening (i.e., the ‘big picture’)
• Refined diagnostic evaluations
• Sensitivity/uncertainty experiments
• Corroborative modeling (e.g., other grid models, observation-based models)
• ‘Weight of evidence’ analyses
• Overall assessment of episode suitability for use in regulatory modeling
• Alternative base case and compensatory error analyses

6. Components of the Model Evaluation
• Operational Evaluation. Tests model ability to estimate 1-hr ground-level ozone concentrations at regulatory monitors.
• Diagnostic Evaluation. Tests model ability to estimate ozone precursor and product species (e.g., NO, NO2, VOCs, NOy, NOz, CO), species ratios (e.g., VOC/NOx), associated oxidants (H2O2, HNO3), other ‘tracer’ species (CO), and the temporal and spatial variations and mass budgets of key species (see the ratio sketch after this list).
• Mechanistic/Scientific Evaluation. Tests model ability to predict the response of ozone and product species to changes in variables such as meteorology, emissions, land use, and so on.
• Probabilistic Evaluation. Accounts for uncertainties associated with model predictions and observations of ozone and precursor species.
• Comparative Evaluation. Quantifies differences between alternative model codes (including preprocessors such as MM5), configurations, or operating modes; emphasis is normally on operational-evaluation investigative methods.
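For the diagnostic evaluation, the species-ratio comparisons above reduce to simple arithmetic once model and observed species are paired. A minimal sketch, assuming concentrations in ppb (VOC in ppbC); the ratio names follow common diagnostic practice (e.g., O3/NOz as a photochemical-aging indicator) rather than anything specific to this study.

```python
import numpy as np

def indicator_ratios(no, no2, noy, voc_ppbc, o3):
    """Diagnostic species ratios (sketch; concentrations in ppb, VOC in ppbC).

    NOz = NOy - NOx is the pool of NOx oxidation products; ratios such as
    VOC/NOx and O3/NOz are compared model-vs-observed as checks on
    chemical regime and photochemical aging.
    """
    nox = np.asarray(no, float) + np.asarray(no2, float)
    noz = np.asarray(noy, float) - nox
    return {
        "VOC/NOx": np.asarray(voc_ppbc, float) / nox,
        "O3/NOz": np.asarray(o3, float) / np.where(noz > 0, noz, np.nan),
    }
```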

7. Principles Governing Model Performance Improvement
• Alterations to model inputs, science algorithms, user-specified parameters, core code, and preprocessor algorithms should be technically justified
• Alterations to model inputs should be documented
• Alterations, where significant, should be vetted with study sponsors and the technical review committee
• Diagnostic analyses and model input alterations must fit within the regulatory timeframe and project resources
• The process should align with EPA modeling guidelines

8. Potential Sources of Model Performance Difficulties
• Meteorology
  • Vertical turbulent mixing rates, maxima, and minima
  • Wind speed and direction errors (see the sketch after this list)
  • Planetary boundary layer (PBL) height predictions
  • Surface temperature errors (local & regional cool bias, nighttime warm bias, daytime cool bias)
  • Potential errors in surface and RASS temperature measurements at some stations; uncertainties in measurement heights
  • Uncertain conformance with aloft measurements (aircraft, fixed profilers, sondes)
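Wind direction errors deserve care because direction is circular: a 350° prediction against a 10° observation is a 20° error, not 340°. A minimal sketch of speed and direction statistics, assuming paired hourly surface winds; the residual wrapping is the standard trick, while the function and argument names are illustrative.

```python
import numpy as np

def wind_stats(obs_spd, prd_spd, obs_dir, prd_dir):
    """Surface wind performance statistics (sketch).

    Speeds in m/s, directions in degrees. Direction residuals are
    wrapped into [-180, 180] before averaging so errors across north
    are counted correctly.
    """
    spd_bias = np.mean(np.asarray(prd_spd, float) - np.asarray(obs_spd, float))
    ddir = (np.asarray(prd_dir, float) - np.asarray(obs_dir, float)
            + 180.0) % 360.0 - 180.0
    # speed bias, direction bias, direction gross error
    return spd_bias, np.mean(ddir), np.mean(np.abs(ddir))
```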

9. Potential Sources of Model Performance Difficulties
• Biogenic Emissions
  • Uncertainty in the proper selection of surface temperature for input to biogenic emissions models
  • Apparently overestimated PAR (i.e., photosynthetically active radiation) inputs to the biogenic emissions model (see the sketch after this list)
  • California isoprene emissions likely underestimated by 50% or more (Geron et al., 2001)
  • Implementation of representative canopy height
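The PAR and temperature concerns above act through the standard light and temperature corrections applied to standard-condition isoprene emission rates. Below is a minimal sketch of the Guenther et al. (1993) correction factors, with parameter values as published in that paper (worth verifying against whatever biogenic model the study actually uses). An inflated PAR input inflates the light factor CL, and the choice of surface versus leaf/canopy temperature shifts the temperature factor CT, which is why both appear on this slide.

```python
import numpy as np

R = 8.314  # universal gas constant, J mol-1 K-1

def isoprene_correction(par, temp_k, ts=303.0, alpha=0.0027, cl1=1.066,
                        ct1=95000.0, ct2=230000.0, tm=314.0):
    """Guenther et al. (1993) light/temperature factors (sketch).

    par    : photosynthetically active radiation (umol m-2 s-1)
    temp_k : leaf temperature (K); ts = 303 K is the reference state.
    Returns CL * CT, the factor that scales the standard-condition
    isoprene emission rate. Parameter values as published.
    """
    cl = alpha * cl1 * par / np.sqrt(1.0 + alpha**2 * par**2)
    ct = (np.exp(ct1 * (temp_k - ts) / (R * ts * temp_k))
          / (1.0 + np.exp(ct2 * (temp_k - tm) / (R * ts * temp_k))))
    return cl * ct
```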

10. Potential Sources of Model Performance Difficulties
• On-Road Motor Vehicle Emissions
  • Potential underestimation in the motor vehicle VOC and/or NOx emissions inventory (Harley et al., 2003)
  • Concerns over adequacy of weekend traffic emissions
  • Alternative temperature and RH inputs (from meteorological processors) to the motor vehicle model yield different motor vehicle emissions estimates

11. Potential Sources of Model Performance Difficulties
• Area, Point Source, Non-Road Emissions
  • Potentially missing or poorly characterized wildfires
  • Non-road NOx emissions from small engines, pumps, etc.
  • Potentially outdated (erroneous) point source data from SARMAP
  • Concerns over adequacy of point source defaults based on source type

12. Potential Sources of Model Performance Difficulties
• ICs/BCs/Model Structure
  • Optimal number of vertical layers and horizontal grid resolution (how determined?)
  • Adequacy of the spin-up period (how determined? see the sketch after this list)
  • Reasonableness and impact of BCs (including BCs aloft) on ground-level ozone
  • Need for fine-grid (e.g., 1.33 km) resolution within the domain?
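One common way to test spin-up adequacy is a staggered-start comparison: run the model from two different initialization days and check that the fields converge before the episode days of interest. A minimal sketch, assuming hourly ozone arrays shaped (time, row, col) from the two runs over a common window; the 1 ppb threshold is an illustrative choice, not a guideline value.

```python
import numpy as np

def spinup_converged(run_a, run_b, threshold=1.0):
    """Staggered-start spin-up check (sketch).

    run_a, run_b : hourly ozone fields (time, row, col) from simulations
    begun on different start days but covering the same analysis window.
    If the initial conditions have been 'forgotten', the two runs agree
    to within `threshold` ppb over the episode days of interest.
    """
    diff = np.abs(np.asarray(run_a, float) - np.asarray(run_b, float))
    # hourly domain-max difference, and an overall pass/fail flag
    return diff.max(axis=(1, 2)), bool(diff.max() < threshold)
```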

13. Diagnostic Tools and Analyses to Be Considered
• Sensitivity analysis (e.g., alternative MM5 PBL schemes, boundary conditions, PBL height patches, choice of Kv diffusivities, various emissions inventory sensitivity runs)
• Process Analysis (IPR & IRR)
• Ozone source apportionment
• DDM and brute-force methods (see the sketch after this list)
• Uncertainty analyses
• Imputed VOC and/or NOx emissions to address suspected methodological biases in current emissions estimation procedures
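Of the sensitivity tools listed above, the brute-force method is the simplest: perturb one input (e.g., an emissions category) up and down and difference the resulting ozone fields, whereas DDM obtains the same first-order derivatives within a single model run. A minimal sketch; `run_model` is a hypothetical wrapper standing in for launching the photochemical model, and the 20% perturbation is an illustrative choice.

```python
def brute_force_sensitivity(run_model, base_emis, category, delta=0.20):
    """First-order brute-force sensitivity (sketch).

    run_model : hypothetical callable mapping an emissions dict to an
                ozone field; in practice this is a full CAMx run.
    Central-differences a +/-delta fractional perturbation of one
    emissions category, approximating d(ozone)/d(fractional change).
    """
    up = dict(base_emis)
    up[category] = base_emis[category] * (1.0 + delta)
    dn = dict(base_emis)
    dn[category] = base_emis[category] * (1.0 - delta)
    return (run_model(up) - run_model(dn)) / (2.0 * delta)
```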

14. Elements of Process Analysis
• Integrated Reaction Rate (IRR) Analysis: a chemical budget analysis of radicals, NOy, Ox, and O3
• Integrated Process Rate (IPR) Analysis: a local budget analysis of chemical, transport, deposition, and emissions processes (see the budget-closure sketch after this list)
• Example application of PA to the Houston SIP
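IPR’s defining property is closure: for a given grid cell and hour, the per-process contributions should sum to the model’s actual concentration change. A minimal sketch of that bookkeeping check, with illustrative process names; a nonzero residual flags an extraction or accounting error rather than a model-physics problem.

```python
def ipr_budget_check(conc_start, conc_end, process_rates, tol=1e-6):
    """IPR mass-budget closure check for one cell and hour (sketch).

    process_rates : dict of per-process concentration changes (ppb/hr),
                    e.g. {'h_advection': ..., 'v_advection': ...,
                          'diffusion': ..., 'chemistry': ...,
                          'emissions': ..., 'deposition': ...}
                    (names illustrative).
    The summed contributions should reproduce the simulated change.
    """
    total = sum(process_rates.values())
    residual = (conc_end - conc_start) - total
    return total, residual, abs(residual) < tol
```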

15. Integrated Reaction Rate (IRR) Analysis: Ox Production [O3 Production + NO Oxidation] on 8 Sept ’93 Base Case Ozone

16. Integrated Reaction Rate (IRR) Analysis: Impact of Isoprene Emissions on 8 Sept ’93 Base Case Ozone

17. Integrated Process Rate (IPR) Analysis: Ozone Process Analysis for Croquet [Cell 26,31,1] on 8 Sept ’93

18. Integrated Process Rate (IPR) Analysis: NO2 Process Analysis for Croquet [Cell 26,31,1] on 8 Sept ’93

19. Supporting Diagnostic Analyses: E-W Vertical Ozone Slice Plot on 8 Sept ’93 Near Croquet Cell

20. Supporting Diagnostic Analyses: Level 1 (0–20 m) and Level 2 (20–80 m)
Stagnant winds in the Croquet region (Level 1) and strong NW winds in the adjacent layer (Level 2) produce sustained vertical advection of precursors aloft: 8 Sept ’93, 0300–0400 CST

21. Flying Data Grabber: CAMx Aloft MPE for Houston SIP

22. Project Organization
• ARB Project Officer: Dr. Ajith Kaduwela
• CCOS Technical Committee
• Project Manager: T. W. Tesche, Alpine
• Co-Principal Investigators: T. W. Tesche and D. McNally, Alpine
• Task 1, Evaluate Modeling Assumptions and Procedures: T. Tesche, D. McNally, C. Loomis, G. Stella, G. Wilkinson (Alpine); R. Morris, G. Yarwood, C. Emery, G. Mansell (Environ); H. Jeffries (OThree)
• Task 2, Model Performance Improvement Plan: T. Tesche, D. McNally, C. Loomis, G. Stella, G. Wilkinson (Alpine); R. Morris, G. Yarwood (Environ); H. Jeffries (OThree)
• Task 3, Establish Refined CAMx Base Case: D. McNally, T. Tesche, C. Loomis (Alpine); R. Morris, G. Yarwood, C. Emery, G. Mansell (Environ); H. Jeffries, Y. Kimura (OThree)
• Task 4, Model Performance Evaluation: T. Tesche, D. McNally, C. Loomis (Alpine); R. Morris, G. Yarwood, C. Emery, G. Mansell (Environ); H. Jeffries, Y. Kimura (OThree)
• Task 5, Document Refinements, Data Base Transfer: D. McNally, T. Tesche (Alpine); R. Morris, G. Yarwood, G. Mansell (Environ); H. Jeffries (OThree)
• Task 6, Management, Meetings, Reporting: T. Tesche, D. McNally, C. Loomis, G. Stella (Alpine); R. Morris, G. Yarwood, C. Emery, G. Mansell (Environ); H. Jeffries (OThree)
Imputed Inventory Factors Used in Houston SIP Modeling

23. Initial Diagnostic Steps
• Acquisition of relevant CCOS modeling datasets (e.g., emissions, aerometric, aircraft, profiler, sonde, land use)
• Evaluation of pertinent meteorological simulations (MM5: NOAA/ARB/Alpine; CALMET: ARB)
• Evaluation of initial CAMx simulations (e.g., ARB004)
• Synthesis of modeling experience with other episodes and domains (e.g., ENVIRON Bay Area, UCR CCOS modeling)
• Development of the Model Performance Improvement Plan (MPIP)
