OMI validation status

  1. OMI validation status Ankie Piters, KNMI

  2. What is the status of OMI validation? How good is our knowledge of the accuracy and quality of the data (rather than: "how good is our data")? How well is that knowledge communicated to the user? A product is "validated" when:
  - The newest version of the product has been compared, on a global scale and for all seasons, to different sources of known accuracy.
  - Possible systematic effects have been studied with respect to instrumental, algorithmic, or atmospheric parameters.
  - The results of these studies are quantified in terms of, e.g., a bias and a precision, and their dependence on certain parameters; regimes with different error characteristics are specified.
  - The results are published in peer-reviewed literature, and summarised in (or referenced from) the product README files.
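The bias and precision quantification described above can be sketched in a few lines; this is an illustrative calculation on made-up paired measurements, not an OMI validation algorithm (the array values and variable names are assumptions):

```python
import numpy as np

# Paired satellite and correlative ground-based column measurements
# (synthetic example values, e.g. ozone columns in DU).
satellite = np.array([300.0, 310.0, 295.0, 305.0])
ground    = np.array([305.0, 315.0, 300.0, 310.0])

# Relative difference of each pair, in percent.
rel_diff = (satellite - ground) / ground * 100.0

bias = rel_diff.mean()           # mean relative bias (%)
precision = rel_diff.std(ddof=1) # scatter around the bias (%)
```

In this scheme the bias captures the systematic offset against the reference, and the precision captures the scatter; dependencies on latitude, season, or viewing geometry would then be studied by computing these per regime.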

  3. Questions to product developers:
  - To what correlative data has your product been compared?
  - Has your product been compared to data at different latitude bands?
  - Has your product been compared for different time periods or seasons?
  - Is it possible to state an average bias, precision, or accuracy for your product?
  - Do the validation studies show systematic effects or dependencies w.r.t. certain regions, atmospheric circumstances, or instrumental parameters?
  - Can you specify regimes where your product is less accurate, or should be used with caution?
  - What knowledge about the quality and accuracy of your product is still missing?

  4. Colour coding used in the status overview:
  - global comparisons: green = yes; orange = large latitude range not covered
  - seasonal comparisons: green = yes; orange = not all seasons covered
  - correlative data: green = more independent validation sources; orange = fewer validation sources
  - bias / precision: green = quantified; orange = not yet; yellow = missing info, or waiting for validation of the current version
  - version: green = latest version validated; orange = newer version than the one validated
  - results published: green = peer-reviewed paper(s); orange = other
  - additional information: blue = known issues; yellow = missing info
  (*) probably solved in the next version

  5. OMDOAO3, global ozone columns (KNMI), compared with Dobson and Brewer data as a function of latitude. Bias w.r.t. Brewer: -1 to -2%; precision: 3%. Comparisons with Dobson indicate a weak slope as a function of latitude. (OMI Science Team Meeting, 11 March 2014)

  6. OMDOAO3, ozone columns vs. solar zenith angle (KNMI), compared with Dobson and Brewer data as a function of SZA. The negative bias compared to Brewer increases at SZA > 75°, reaching -3% at 85°.

  7. OMDOAO3, time series of Northern Hemisphere ozone columns (KNMI), compared with Dobson and Brewer data. A small trend over time of +0.1% per year.
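A per-year drift such as the +0.1% per year quoted above is typically estimated by fitting a straight line to the time series of relative differences against the ground network. A minimal sketch with synthetic numbers (the 8-year span, monthly sampling, and assumed drift are all illustrative, not OMI values):

```python
import numpy as np

# Time axis: 8 years of monthly samples, in fractional years.
years = np.arange(0.0, 8.0, 1.0 / 12.0)

# Synthetic relative differences (%): a -1.5% bias with a +0.1 %/yr drift.
rel_diff = -1.5 + 0.1 * years

# Least-squares line; the slope is the drift in percent per year.
slope, intercept = np.polyfit(years, rel_diff, 1)
```

In practice one would also propagate the uncertainty of the slope to decide whether such a trend is significant, and whether it reflects the atmosphere or instrument degradation.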

  8. NO2 columns, NASA and KNMI. Zonal mean stratospheric NO2 (January and July) and Northern Hemisphere (35-55°) tropospheric NO2 over the USA (U), Europe (E), and East Asia (A). Comparison of the NASA "Standard Product" (SP; blue, red) and DOMINO v2 (green) NO2, based on the same initial NO2 slant columns: the differences lie in the separation of stratosphere and troposphere. SP2 is new w.r.t. SP1 and matches DOMINO better [Bucsela et al., 2013].

  9. NO2 columns, KNMI. Three (sub)urban MAX-DOAS instruments compared against daily OMI tropospheric NO2 (×1e15 molec/cm²) on selected days in 2006-2011: the magnitudes match, but the correlation is not strong [Lin et al., 2013]. Monthly average tropospheric NO2 over Beijing from OMI and MAX-DOAS (2008-2011): OMI underestimates NO2 [Ma et al., 2013].

  10. NO2 (NASA and KNMI) and SO2 (NASA): comparison of satellite columns with in-situ data via an air-quality model, using the original and an improved AMF. [Figure panels: NO2-KNMI, NO2-NASA, and SO2-NASA maps.] Source: McLinden et al., 2013, ACPD.

  11. Comparison of OMI tropospheric NO2 VCD with (car) MAX-DOAS for different locations: Beijing, Wuxi, New Delhi, Paris, and Islamabad (SCIA 2008-2010; OMI 2008-2009). Example for Beijing [Ma et al., ACP 2013]: OMI is always lower than MAX-DOAS. The MAX-DOAS values are hourly averaged tropospheric NO2 VCDs during the satellite overpass; red points: CTH < 1 km, blue points: CTH > 1 km.

  12. Validation of OMI satellite observations over Paris and Delhi-Agra (16.01.2011). R. Shaiganfar, MPIC Mainz; Shaiganfar et al., ACP 2011.

  13. Effective cloud fraction, NASA: comparison of PDFs of effective cloud fraction (Alexander Vasilkov, SSAI). The PDFs are practically the same, which was not anticipated given the different sizes of the OMI and OMPS footprints.

  14. Cloud pressure, NASA and KNMI: comparison of PDFs of cloud (optical centroid) pressure (Alexander Vasilkov, SSAI) for NASA, KNMI, and OMPS over the southern mid-latitudes, northern mid-latitudes, and tropics. In general, OMI-NASA retrieves somewhat lower cloud OCPs than OMPS does; the differences are most pronounced in the tropics. Differences between the NASA and OMPS cloud pressures appear similar to those between NASA and KNMI, except in the tropics.

  15. OMI BrO: data consistency (Raid Suleiman, CFA). The trend varies from -0.7% per year at southern mid-latitudes to -1.7% per year at northern high latitudes. Real, or instrument degradation?

  16. OMI BrO: comparison with ground-based observations (Raid Suleiman, CFA). OMI and ground-based zenith-sky total column BrO at Harestua, Norway, prior to the row anomaly; Harestua columns by F. Hendrick (BIRA). Comparison between OMI and Harestua (monthly means) for the period 1 February 2005 to 8 August 2012. The varying trend here is very small.

  17. Aerosol, NASA: OMI AOD vs. AERONET, 44 sites, 4 years of data. Linear correlation coefficient: 0.81; linear regression (solid line): slope 0.79, intercept 0.10. [AP: the agreement may be better than the authors suggest.] Source: Ahn et al., 2014, JGR.
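The correlation coefficient and regression line quoted for the OMI-vs-AERONET comparison come from a standard scatter-plot evaluation; a minimal sketch with synthetic AOD values (the numbers below are assumptions, not the Ahn et al. data):

```python
import numpy as np

# Synthetic matched AOD pairs: AERONET as reference, OMI as retrieval.
aeronet = np.array([0.10, 0.25, 0.40, 0.55, 0.70])
omi     = np.array([0.14, 0.30, 0.42, 0.50, 0.68])

# Linear (Pearson) correlation coefficient between the two datasets.
r = np.corrcoef(aeronet, omi)[0, 1]

# Ordinary least-squares regression line: omi ~ slope * aeronet + intercept.
slope, intercept = np.polyfit(aeronet, omi, 1)
```

A slope below 1 with a positive intercept, as reported on the slide, would indicate the satellite overestimating low AOD and underestimating high AOD relative to AERONET.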

  18. Conclusions
  - Several OMI products have had recent validation; uncertainties have been quantified and systematic effects characterised since 2011.
  - 18 new publications with validation results: 4 in 2012 (NO2, clouds, BrO, SO2), 9 in 2013 (aerosol, NO2, SO2), 2 in 2014 (aerosol, strat/total NO2), 1 submitted (aerosol), and 2 in preparation (NO2, O3).
  - Gaps in the validation have been identified for almost all products. Main issues: the latest versions of some products need validation (cloud (KNMI), HCHO, BrO, OClO), and many products need characterisation over snow/ice or under partially cloudy conditions.
  - Most README files have been updated in the last 3 years; the README files for HCHO and SO2 are more than 5 years old.