
National Air Quality Forecasting Capability: Nationwide Prediction and Future Enhancements


Presentation Transcript


  1. National Air Quality Forecasting Capability: Nationwide Prediction and Future Enhancements Ivanka Stajner 1,5, Daewon Byun 2,†, Pius Lee 2, Jeff McQueen 3, Roland Draxler 2, Phil Dickerson 4, Kyle Wedmark 1,5 • 1 National Oceanic and Atmospheric Administration (NOAA), National Weather Service • 2 NOAA, Air Resources Laboratory (ARL) • 3 NOAA, National Centers for Environmental Prediction (NCEP) • 4 Environmental Protection Agency (EPA) • 5 Noblis, Inc. • † Deceased • 2011 National Air Quality Conference, San Diego, CA, March 9, 2011

  2. Overview • Background • Recent Progress • Ozone • Smoke • Dust • PM2.5 • Feedback and Outreach • Summary and Plans

  3. National Air Quality Forecast Capability: Current and Planned Capabilities, 3/11 [Timeline graphic: 2005: O3; 2007: O3 & smoke; 2009: smoke; 2010: smoke, ozone] • Improving the basis for AQ alerts • Providing AQ information for people at risk • Prediction Capabilities: • Operations: Ozone nationwide: expanded from EUS to CONUS (9/07), AK (9/10) and HI (9/10); Smoke nationwide: implemented over CONUS (3/07), AK (9/09), and HI (2/10) • Experimental testing: Ozone upgrades; Dust predictions • Developmental testing: Ozone upgrades; Components for particulate matter (PM) forecasts • Near-term Targets: Higher resolution prediction • Longer range: Quantitative PM2.5 prediction; Extend air quality forecast range to 48-72 hours; Include broader range of significant pollutants

  4. National Air Quality Forecast Capability: End-to-End Operational Capability • Model Components: Linked numerical prediction system • Operationally integrated on NCEP’s supercomputer • NCEP mesoscale NWP: WRF-NMM • NOAA/EPA community model for AQ: CMAQ • NOAA HYSPLIT model for smoke prediction • Observational Input: • NWS weather observations; NESDIS fire locations • EPA emissions inventory/monitoring data • Gridded forecast guidance products: • On NWS servers: www.weather.gov/aq and FTP servers • On EPA servers • Updated 2x daily • Verification basis, near-real time: • Ground-level AIRNow observations • Satellite smoke observations • Customer outreach/feedback: • State & local AQ forecasters coordinated with EPA • Public and private sector AQ constituents

  5. Progress in 2010: Ozone, Smoke Operational Nationwide; Dust Testing • Ozone: Expanded forecast guidance to Alaska and Hawaii domains in NWS operations (9/10) • Operations, 2010: Updates for CONUS (emissions), new 1- and 8-hour daily maximum products • Developmental testing: changing boundary conditions, dry deposition, PBL in CB-05 system • Smoke: Expanded forecast guidance to Hawaii domain in NWS operations (2/10) • Operations: CONUS predictions operational since 2007, AK predictions since 2009 • Developmental testing: Improvements to verification • Aerosols: Developmental testing providing comprehensive dataset for diagnostic evaluations (CONUS) • CMAQ (aerosol option), testing CB-05 chemical mechanism with AERO-4 aerosol modules • Qualitative; summertime underprediction consistent with missing source inputs • Dust and smoke inputs: testing dust contributions to PM2.5 from global sources • Real-time testing of combining smoke inputs with CMAQ-aerosol • Testing experimental prediction of dust from CONUS sources • Developing prototype for assimilation of surface PM2.5 measurements • R&D efforts continuing in chemical data assimilation, real-time emissions sources, advanced chemical mechanisms

  6. OZONE: Operational predictions at http://www.weather.gov/aq/

  7. Near-real-time Ozone Verification • Comparing model with observations from ground-level monitors compiled and provided by EPA’s AIRNow program • Fraction correct with respect to the 75 ppb (code orange) threshold for operational prediction (using CMAQ CB-IV mechanism) over the 48 contiguous states [Charts: Year 2009 and Year 2010; skill target: fraction correct ≥ 0.9]
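A fraction-correct score of this kind reduces to checking whether forecast and observation fall on the same side of the exceedance threshold. A minimal sketch in Python with made-up data (not the operational AIRNow verification code):

```python
import numpy as np

def fraction_correct(forecast, observed, threshold=75.0):
    """Fraction of forecast/observation pairs that agree on
    whether the 75 ppb (code orange) threshold is exceeded."""
    forecast = np.asarray(forecast, dtype=float)
    observed = np.asarray(observed, dtype=float)
    agree = (forecast >= threshold) == (observed >= threshold)
    return agree.mean()

# Example: paired daily-maximum 8-hour ozone values (ppb)
fc = [62, 81, 74, 90, 55]
ob = [60, 78, 77, 88, 58]
print(f"fraction correct: {fraction_correct(fc, ob):.2f}")  # 0.80
```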

  8. Chemical mechanism sensitivity analysis • Experimental ozone prediction (with updated CB05 mechanism) shows larger biases than operational ozone prediction (CB-IV mechanism) in summertime over the Eastern US • Sensitivity studies in progress: • Chemical speciation • Indicator reactions in box model studies (organic nitrate, NOx, PAN) • Mechanism differences: • Ozone production • Precursor budget • Seasonal input differences in 2009: • Emissions • Meteorological parameters [Figure: Seasonal ozone bias for CONUS, Operational (CB-IV) vs. Experimental (CB05)]

  9. Impact of NMM-B meteorology: August 10, 2010 example • Overprediction case highlighted by Michael Giegert, CT DEP • NMMB-CMAQ reduces overprediction of 8-hr maximum ozone in the coastal region of the northeastern US [Maps: NMMB-CMAQ vs. WRF/NMM-CMAQ 8-hr maximum ozone, ppb]

  10. SMOKE: Operational predictions at http://www.weather.gov/aq/

  11. Smoke Prediction Example: Four Mile Canyon Fire, September 7-8, 2010 • “The blaze broke out Monday morning in Four Mile Canyon northwest of Boulder and rapidly spread across roughly 1,400 hectares. Erratic wind gusts sometimes sent the fire in two directions at once.” • “The 11-square-mile blaze had destroyed at least 92 structures and damaged at least eight others by Tuesday night, Boulder County sheriff's Cmdr. Rick Brough said.”

  12. DUST: Experimental predictions at http://www.weather.gov/aq-expr/

  13. CONUS Dust Predictions: Experimental Testing • Standalone prediction of airborne dust from dust storms: • Wind-driven dust emitted where surface winds exceed thresholds over source regions • Source regions with emission potential estimated from monthly MODIS deep blue climatology (2003-2006) • HYSPLIT model for transport, dispersion and deposition • Updated source map in experimental testing (Draxler et al., JGR, 2010) [Maps: source regions in experimental and developmental testing]
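The wind-threshold emission step lends itself to a short illustration. The functional form and constants below are illustrative assumptions only, not the operational HYSPLIT dust parameterization:

```python
import numpy as np

def dust_emission_flux(u10, threshold, potential, k=1e-9):
    """Illustrative threshold emission: no flux until the 10-m wind
    exceeds the local threshold, then flux scales with wind speed
    and the cell's source potential (0-1, e.g. derived from a MODIS
    deep blue climatology). k is an arbitrary tuning constant."""
    excess = np.maximum(u10 - threshold, 0.0)
    return k * potential * u10**2 * excess

# Example cells: wind speed (m/s), local threshold, source potential
u10 = np.array([4.0, 9.5, 12.0])
thr = np.array([7.0, 7.0, 8.5])
pot = np.array([0.2, 0.8, 0.5])
print(dust_emission_flux(u10, thr, pot))  # zero flux in the calm cell
```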

  14. Dust simulation compared with IMPROVE data • HYSPLIT simulated air concentrations are compared with IMPROVE data in southern California and western Arizona: • Model simulations: contours • IMPROVE dust concentrations: numbers next to “+” • Variability in model simulation • Highest measured value in Phoenix • Model and IMPROVE comparable in the 3-10 µg/m3 range • Spotty pattern indicates under-prediction; considering ways to improve representation of smaller dust emission sources [Figure: June-July 2007 average dust concentration (Draxler et al., JGR, 2010)]

  15. PM2.5: Developmental predictions

  16. Developmental predictions, Summer 2010: Aerosols over CONUS • From National Emissions Inventory (NEI) sources only • CMAQ: CB05 gases, AERO-4 aerosols • Sea salt emissions and reactions • Seasonal bias: • Winter: overprediction • Summer: underprediction • Testing of real-time wildfire smoke emissions in CMAQ [Maps: Summer 2010]
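Seasonal over- and underprediction of this kind is usually summarized as a mean bias over paired model/observation values; a minimal sketch with hypothetical numbers (not the NAQFC evaluation code):

```python
import numpy as np

def mean_bias(model, obs):
    """Mean bias (model minus observation); positive means
    overprediction, negative means underprediction."""
    return (np.asarray(model) - np.asarray(obs)).mean()

# Example: daily PM2.5 (ug/m3) for a winter and a summer sample
print(mean_bias([14.0, 11.2, 9.8], [10.5, 9.0, 8.1]))  # positive: winter overprediction
print(mean_bias([6.0, 7.5, 5.2], [11.0, 12.3, 9.4]))   # negative: summer underprediction
```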

  17. Assimilation of surface PM data • Assimilation increases correlation between forecast and observed PM2.5 • Control is the forecast without assimilation • Assimilation reduces bias and increases equitable threat score (Pagowski et al., QJRMS, 2010) [Figure panels: bias ratio; equitable threat score, %] • Exploring bias correction approaches, e.g., Djalalova et al., Atmospheric Env., 2010
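The equitable threat score is a standard categorical skill measure; a minimal sketch computing it from forecast/observation exceedance counts (the threshold and data here are illustrative, not from Pagowski et al.):

```python
import numpy as np

def equitable_threat_score(forecast, observed, threshold):
    """ETS = (hits - hits_random) / (hits + misses + false_alarms - hits_random),
    with hits_random = (hits + misses) * (hits + false_alarms) / N."""
    f = np.asarray(forecast) >= threshold
    o = np.asarray(observed) >= threshold
    hits = np.sum(f & o)
    misses = np.sum(~f & o)
    false_alarms = np.sum(f & ~o)
    hits_random = (hits + misses) * (hits + false_alarms) / f.size
    return (hits - hits_random) / (hits + misses + false_alarms - hits_random)

# Example: hourly PM2.5 (ug/m3) against a 35 ug/m3 threshold
fc = [40, 20, 38, 10, 36, 30]
ob = [42, 18, 30, 12, 39, 37]
print(f"ETS: {equitable_threat_score(fc, ob, 35.0):.2f}")  # 0.20
```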

  18. Partnering with AQ Forecasters http://www.epa.gov/airnow/airaware/ • Focus group, State/local AQ forecasters: • Participate in real-time developmental testing of new capabilities, e.g., aerosol predictions • Provide feedback on reliability and utility of test products • Emphasis on local episodes/case studies • Regular meetings; working together with EPA’s AIRNow and NOAA • Feedback is essential for refining/improving coordination • This year’s Air Awareness Week is May 2nd - May 6th, 2011

  19. AQ Forecaster Feedback Examples • Skill Scores (2010) for Code Orange O3 Threshold (≥ 76 ppbv, 8-hour) • William Ryan, Penn State*: • Significant, and continuing, reductions in NOx emissions on a regional scale since 2003 have lowered O3 concentrations • The key relationship that powers statistical forecast models (O3 vs. Tmax) has weakened • Forecasters must now rely on numerical forecast models, updated yearly with EGU emissions data, for forecast guidance • NAQFC model guidance out-performed other forecast methods in 2010 • Precursor emissions and O3 concentrations have not been static since 2003 and are not likely to be in the near future • *From Bill Ryan’s 2011 AMS presentation, “The Effect of Recent Regional Emissions Reductions on Air Quality Forecasting in the mid-Atlantic”

  20. Summary • US national AQ forecasting capability status: • Ozone prediction nationwide (AK and HI since September 2010) • Smoke prediction nationwide (HI since February 2010) • Experimental testing of dust prediction from CONUS sources (since June 2010) • Developmental testing of CMAQ aerosol predictions with NEI sources • Near-term plans for improvements and testing: • Operational dust predictions over CONUS • Development of satellite product for verification of dust predictions • Understanding/reduction of summertime ozone biases in the CB05 system • Transition to meteorological inputs from the NMM-B regional weather model • Development of higher resolution predictions • Target: operational implementation of initial quantitative total PM2.5 forecasts for the northeastern US in 2015

  21. Acknowledgments: AQF Implementation Team Members • Special thanks to Paula Davidson, OST chief scientist and former NAQFC Manager • NOAA/NWS/OST: Tim McClung, NAQFC Manager • NOAA/OAR: Jim Meagher, NOAA AQ Matrix Manager • NWS/OCWWS: Jannie Ferrell, Outreach, Feedback • NWS/OPS/TOC: Cindy Cromwell, Bob Bunge, Data Communications • NWS/OST/MDL: Jerry Gorline, Marc Saccucci, Tim Boyer, Dave Ruth, Dev. Verification, NDGD Product Development • NWS/OST: Ken Carey, Kyle Wedmark, Program Support • NESDIS/NCDC: Alan Hall, Product Archiving • NWS/NCEP: Jeff McQueen, Youhua Tang, Marina Tsidulko, Jianping Huang, AQF model interface development, testing, & integration; *Sarah Lu, Ho-Chun Huang, Global data assimilation and feedback testing; *Brad Ferrier, *Dan Johnson, *Eric Rogers, *Hui-Ya Chuang, WRF/NAM coordination; Geoff Manikin, Smoke Product testing and integration; Dan Starosta, Chris Magee, NCO transition and systems testing; Robert Kelly, Bob Bodner, Andrew Orrison, HPC coordination and AQF webdrawer • NOAA/OAR/ARL: Pius Lee, Rick Saylor, Daniel Tong, Tianfeng Chai, Fantine Ngan, Yunsoo Choi, Hyun-Cheol Kim, Yunhee Kim, Mo Dan, CMAQ development, adaptation of AQ simulations for AQF; Roland Draxler, Glenn Rolph, Ariel Stein, HYSPLIT adaptations • NESDIS/STAR: Shobha Kondragunta, Jian Zeng, Smoke verification product development • NESDIS/OSDPD: Matt Seybold, Mark Ruminski, HMS product integration with smoke forecast tool • EPA/OAQPS partners: Chet Wayland, Phil Dickerson, Scott Jackson, Brad Johns, AIRNow development, coordination with NAQFC • *Guest Contributors

  22. Operational AQ Forecast Guidance: www.weather.gov/aq • Ozone: CONUS expansion implemented in September 2007; AK and HI implemented in September 2010 • Smoke Products: CONUS implemented in March 2007; AK implemented in September 2009; HI implemented in February 2010 • Further information: www.nws.noaa.gov/ost/air_quality

  23. Backup

  24. Smoke forecast tool: Alaska example • Very active fire season over Alaska in 2009: more than 2,950,000 acres burned • GOES smoke product confirms areal extent of peak concentrations • Example, 7/13/09, 17-18Z: FMS = 35% for column-averaged smoke > 1 µg/m3 [Panels: observation vs. prediction] • Real-time objective verification: • Based on NOAA/NESDIS satellite imagery: GOES Aerosol Smoke Product • Smoke from identified fires only • Filtered for interference: clouds, surface reflectance, solar angle, other aerosol • “Footprint” comparison: threshold concentration (1 µg/m3) for average smoke in the column • Tracking threat scores, or figure-of-merit statistics: FMS = Area(Pred ∩ Obs) / Area(Pred ∪ Obs) • July 2009 target skill is 0.08
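The footprint comparison behind the FMS is an intersection-over-union of the predicted and observed exceedance areas. A minimal sketch on boolean grids (synthetic masks, not the NESDIS verification code):

```python
import numpy as np

def figure_of_merit(pred_mask, obs_mask):
    """FMS = area(pred AND obs) / area(pred OR obs), on boolean
    masks marking where column-average smoke exceeds the threshold."""
    inter = np.logical_and(pred_mask, obs_mask).sum()
    union = np.logical_or(pred_mask, obs_mask).sum()
    return inter / union if union else np.nan

# Example: 1 ug/m3 exceedance masks on a tiny grid
conc_pred = np.array([[0.5, 1.4], [2.0, 0.2]])
conc_obs  = np.array([[0.8, 1.6], [0.9, 0.1]])
print(f"FMS: {figure_of_merit(conc_pred > 1.0, conc_obs > 1.0):.2f}")  # 0.50
```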
