

  1. WP 17 Product Quality
MyOcean2 First Annual Meeting, Cork, 16-17 April 2013

  2. Work Package Main Objectives
Ensure that:
• the accuracy of MyOcean products is adequately monitored
• changes to the MyOcean system are justified from a product quality point of view
• information on product quality is scientifically sound and consistent, is useful, and is communicated effectively
• product quality assessment takes into account user requirements and feedback

  3. Background: validation categories and who does what
• Calibration / pre-upgrade tests (PC x.5/6, WP18) → QuID: static document for each product containing all quality info; updated when the product changes
• Routine validation, 1-10 days behind production (PC x.2, operational monitoring) → quarterly validation report: single document covering all real-time products, with results for each quarter
• Offline validation with new metrics, 1-3 months behind production (WP17.4) → also feeds the quarterly validation report
• User validation, by users for themselves (coordinated by WP3; interaction needed)
All categories are governed by the guidelines developed in WP17.2.

  4. Partnership

  5. Achievements with respect to WP17 workplan

  6. Main achievements in year 1
• CalVal guidelines published (WP17.2)
• Quality Assurance Review Group initiated; performed its first review (V3)
• QuIDs more complete and more consistent (WP17.3)
• Initial results from development of new metrics (WP17.4)
• Prototype quarterly validation reports produced (WP17.5)

  7. Achievements: CalVal guidelines

  8. Achievements: CalVal guidelines – planned process
WP17.2 sub-teams draft proposals; all PC representatives in WP17.2 review them, each liaising with their own PC; the agreed guidelines are published once per year.

  9. Achievements: CalVal guidelines
• In April we chose topics to focus on for phase 1 (we kept the scope limited to make it achievable in a short period)
• In December phase 1 of the guidelines was published, containing the 3 topics which were mature enough:
  • Conventions for Estimated Accuracy Numbers (EANs)
  • Guidelines for sharing QC flags between TACs and MFCs
  • Conventions for providing statistics for quarterly validation reports
• PCs are requested to adopt the guidelines by June 2013

  10. CalVal guidelines: challenges
It was difficult to keep the working group discussions moving between face-to-face meetings (April 2012, November 2012, April 2013). For the next phase, we propose to schedule regular teleconferences to maintain momentum.

  11. CalVal guidelines: next steps
We have agreed to work on the following topics for phase 2 of the guidelines:
• TAC-MFC links: putting more concrete processes in place for QC feedback
• Defining a framework for process-focussed model-observation intercomparisons
• Defining the quality thresholds which should be used to trigger user alerts when quality is degraded
• EAN: agreeing a long-term strategy for BGC metrics (will link up with the GODAE OceanView MEP task team where appropriate)
• EAN: seeking common metrics for sea-ice products in the short to medium term

  12. Achievements: improvements in review process for PQ

  13. Achievements: improvements in review process for PQ
• The Quality Assurance Review Group (QuARG) was formed: Fabrice Hernandez (chair), Henning Wehde, Laurent Bertino, Gilles Larnicol, Alistair Sellar
• Following discussions with the PMO, the PQ documentation requirements for PCs were (slightly) simplified
• A brief summary of the review process was written to give clear guidance to PCs on the requirements for them

  14. QuID status at V3
• QuARG reviewed all QuIDs submitted for V3 to increase consistency and readability; each document was reviewed by at least 2 members
• For V3 there are QuIDs covering 87 products: a big effort by many people
• The remaining 20 should be ready soon, by V3.1?

  15. PQ review process: challenges
Timescales to review and correct QuIDs were very tight, and many QuIDs were submitted late. This required a big effort from the QuARG chair in the final weeks of V3 integration to review the final changes.

  16. Achievements: metrics development

  17. Achievements: metrics development
3 partners (MUMM, UK Met Office, Mercator-Ocean) are taking novel validation techniques from NWP and adapting them for ocean products. Initial results follow...

  18. Accuracy of site-specific forecasts (UK Met Office)
Comparing single-site timeseries observations with gridded products, focusing on critical thresholds and categorical statistics:
• hit/miss rates and derivatives (CSI, ETS, REV, …), as in the sketch below
• use time/space windows on the gridded product to understand the best method to generate site-specific forecasts
• user-focused metrics such as “good weather windows”: e.g. periods with currents below a given threshold
Anticipated information for users:
• “The hit rate for site-specific forecasts is ...”
• “The best method to produce site-specific forecasts is to take the mean/max of X neighbouring points in space/time.”
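The categorical scores named here all derive from a 2x2 contingency table of threshold exceedances. A minimal sketch (the function name and the exceedance-event convention are ours, not from the presentation):

```python
import numpy as np

def categorical_scores(model, obs, threshold):
    """Binary-event verification scores for one critical threshold.

    model, obs: 1-D arrays of collocated values (e.g. current speed).
    An "event" is an exceedance of `threshold`.
    """
    m = np.asarray(model) >= threshold
    o = np.asarray(obs) >= threshold

    hits = np.sum(m & o)           # event forecast and observed
    false_alarms = np.sum(m & ~o)  # forecast but not observed
    misses = np.sum(~m & o)        # observed but not forecast
    correct_neg = np.sum(~m & ~o)  # neither

    pod = hits / (hits + misses)                 # probability of detection (hit rate)
    far = false_alarms / (hits + false_alarms)   # false alarm ratio
    csi = hits / (hits + misses + false_alarms)  # critical success index

    # Equitable threat score: CSI corrected for hits expected by chance
    n = hits + misses + false_alarms + correct_neg
    hits_random = (hits + misses) * (hits + false_alarms) / n
    ets = (hits - hits_random) / (hits + misses + false_alarms - hits_random)

    return {"POD": pod, "FAR": far, "CSI": csi, "ETS": ets}
```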

  19. Accuracy of site-specific forecasts (UK Met Office)
[Figure: Met Office marine verification scores (POD, FAR, CSI) for winds >10.0 m/s (mid-strong wind speed), hourly model vs hourly observations. Applied to hourly data the method gives poor results.]

  20. Accuracy of site-specific forecasts (UK Met Office)
[Figure: POD, FAR, CSI for weekly averaged model vs weekly averaged observations. Applied to weekly averaged data the method gives better results; the averaging step is sketched below.]
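The hourly-to-weekly improvement comes from averaging both series before scoring. A sketch of that step, reusing `categorical_scores` from the block above (the series names are hypothetical; the 10.0 m/s threshold mirrors the figure):

```python
import pandas as pd

# model_ts, obs_ts: collocated hourly pandas Series with a DatetimeIndex
weekly_model = model_ts.resample("7D").mean()
weekly_obs = obs_ts.resample("7D").mean()

# Score exceedances of 10.0 m/s on the smoothed series
scores = categorical_scores(weekly_model.values, weekly_obs.values, threshold=10.0)
```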

  21. Metrics development - biogeochemistry (MUMM)
• Checking different terminologies and calculation methods
• Fine-tuning the calculation method (time windows)
• NWS chlorophyll comparisons
[Figure: NWS chlorophyll, satellite 26 July vs model 26 July and model 31 July]

  22. Metrics development - biogeochemistry (MUMM)
Also investigating:
• Spatial neighborhood method (see the sketch below)
• Added research areas: Mediterranean Sea, Atlantic Ocean, …
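To illustrate what a spatial neighbourhood method does: instead of matching an observation to the single nearest model grid point, an event counts as captured if it occurs anywhere in a window around that point. A minimal sketch under our own conventions (window size and names are assumptions):

```python
import numpy as np
from scipy.ndimage import maximum_filter

def neighbourhood_event_map(model_field, threshold, radius=2):
    """True wherever the event (e.g. chlorophyll above `threshold`)
    occurs anywhere within a (2*radius+1) x (2*radius+1) window
    of the 2-D model field."""
    exceed = (model_field >= threshold).astype(float)
    return maximum_filter(exceed, size=2 * radius + 1) > 0

# An observed event at grid point (i, j) is then a "neighbourhood hit"
# if neighbourhood_event_map(field, thr)[i, j] is True.
```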

  23. Triple collocation to determine relative error ranges (Mercator-Ocean)
• The TC method is used to compare data sets of the same geophysical field (wind (Stoffelen 1998), waves (Caires et al. 2003), SST (O’Carroll et al. 2008), soil moisture (Zwieback et al. 2012))
• TC gives an estimation of the error associated with each data set and allows the calibration of the data sets. The errors are generally supposed to be uncorrelated with each other and with the geophysical field (a minimal sketch follows below)
• Application: one year of SST data (Aug 2011-Sep 2012), Bay of Biscay region
• Data sets: multi-sensor L3 observations (taken as reference), IBI and NWS forecasts (1-D averaged). Calculation performed for grid points with at least 200 valid obs
• Error models: several tested; illustrated with O’Carroll et al. 2008
[Figure: maps of error variance for the L3 obs, IBI MFC and NWS MFC; colour scale 0.0-0.5]
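In its basic covariance form (without the calibration step the slide mentions), TC estimates each data set's error variance from the three pairwise covariances. A sketch under exactly those assumptions of mutually uncorrelated errors:

```python
import numpy as np

def triple_collocation_errors(x, y, z):
    """Estimate the error variance of three collocated data sets of the
    same geophysical field (e.g. L3 SST obs, IBI and NWS forecasts),
    assuming errors are mutually uncorrelated, uncorrelated with the
    true signal, and free of calibration biases.

    Returns (var_ex, var_ey, var_ez)."""
    x, y, z = (np.asarray(a, dtype=float) for a in (x, y, z))
    c = np.cov(np.vstack([x, y, z]))  # 3x3 sample covariance matrix

    # Each cross-covariance equals the signal variance, so the error
    # variance is the excess of a set's own variance over the signal.
    var_ex = c[0, 0] - c[0, 1] * c[0, 2] / c[1, 2]
    var_ey = c[1, 1] - c[0, 1] * c[1, 2] / c[0, 2]
    var_ez = c[2, 2] - c[0, 2] * c[1, 2] / c[0, 1]
    return var_ex, var_ey, var_ez
```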

  24. Achievements: quarterly validation reports

  25. Achievements: quarterly validation reports We will produce online reports displaying validation statistics for the real-time MyOcean products, updated quarterly These will be published as a set of web pages to make it easy to browse results

  26. Quarterly reports: planned process
Operations (PC x.2/3) and the new metrics (WP17.4) supply quarterly statistics (netCDF) → UKMO (WP17.5) produces plots → PC reps (WP17.5) add interpretation (style must be consistent) → completed report. A plotting sketch follows below.
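To make the plotting step concrete, a sketch of turning one PC's quarterly statistics file into a figure. The file name, the `rmse` variable and its (area, lead time) layout are all hypothetical, not the agreed WP17 format:

```python
import xarray as xr
import matplotlib.pyplot as plt

# Hypothetical layout: one netCDF file per PC and quarter, holding an
# "rmse" variable dimensioned (area, lead_time).
ds = xr.open_dataset("example_pc_2013Q1_stats.nc")

fig, ax = plt.subplots()
for area in ds["area"].values:
    # One line of RMSE vs forecast lead time per validation area
    ds["rmse"].sel(area=area).plot(ax=ax, label=str(area))
ax.set_xlabel("forecast lead time (days)")
ax.set_ylabel("RMSE")
ax.legend()
fig.savefig("example_pc_2013Q1_rmse.png")
```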

  27. Quarterly reports: status and challenges
• We have a prototype containing results from most PCs
• Currently working on defining conventions for summary text to ensure consistency
• Challenges:
  • Defining a data format which accommodates results from diverse PCs – but we seem to have managed it
  • Defining conventions for summary text will be harder

  28. Prototype pages
[Screenshot: report pages are organised by production centre → product → variable or dataset, with an overview of results for each product]

  29. Prototype pages
[Screenshot: RMS errors for each area (full domain, on shelf, off shelf, Norwegian Trench) and forecast lead time; colour scale 0.0-0.6 RMSE (K)]

  30. Prototype pages
[Screenshot: select an area to get timeseries statistics for that area]

  31. Achievements in Production Centres

  32. Achievements in Production Centres
Validation highlights from production centres. Some of these were presented at the MyOcean Science Days, which had a session devoted to validation. Thanks to all contributors.

  33. Assessing the Armor global observational analysis using velocity computation (Gulf Stream area, section at 60°W)
• Improvement of the T/S field resolution → assessment of the density-gradient shear via the zonal velocity relative to the surface (higher shear in the new fields; thermal wind relation below)
• Adding the absolute surface velocity u(z = 0) (Mulet et al., 2012) gives the absolute zonal geostrophic velocity → better estimate
• Comparison with independent data: Argo drift at 1000 m, ANDRO (Ollitrault and Rannou, 2010)
• Standard deviations: 6 cm/s, 7 cm/s, 10 cm/s
[Figure: depth-latitude sections of zonal velocity, old vs new analysis]
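For reference, the velocity computation behind this assessment follows from thermal wind; a standard formulation (ours, consistent with the slide's "relative velocity + u(z=0)" decomposition, with g gravity, f the Coriolis parameter and rho_0 a reference density):

```latex
\frac{\partial u}{\partial z} = \frac{g}{f\rho_0}\,\frac{\partial \rho}{\partial y},
\qquad
u(z) = \underbrace{u(z{=}0)}_{\text{altimetry (Mulet et al., 2012)}}
\;-\; \frac{g}{f\rho_0}\int_{z}^{0}\frac{\partial \rho}{\partial y}\,\mathrm{d}z' .
```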

  34. ARC-MFC: validation of MIZ sea ice concentration
Periods: near sea ice minimum (2012-08 to 2012-09) and near sea ice maximum (2013-02 to 2013-03). Character size in the plots codes the SIC class: 1-3%, 4-8%, 9-14%, ≥15%.
We find:
1. Somewhat low SIC values in the model near the sea ice minimum
2. Modelling sea ice categories exactly is not easy! (low values on the diagonal; see the sketch below)
ARC-MFC validation results are available from http://myocean.met.no/ARC-MFC/Validation/
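The "values on the diagonal" refer to a cross-tabulation of modelled vs observed SIC categories. A sketch of building one; the category edges are our reading of the classes above, not ARC-MFC's exact definition:

```python
import numpy as np

def sic_confusion_matrix(model_sic, obs_sic, edges=(0.01, 0.04, 0.09, 0.15)):
    """Cross-tabulate modelled vs observed sea ice concentration
    categories (inputs as fractions, 0-1). With the default edges the
    classes are <1%, 1-3%, 4-8%, 9-14%, >=15%. Rows are observed,
    columns are model; a model that captures the categories exactly
    puts most counts on the diagonal."""
    m_cat = np.digitize(np.ravel(model_sic), edges)
    o_cat = np.digitize(np.ravel(obs_sic), edges)
    n = len(edges) + 1
    table = np.zeros((n, n), dtype=int)
    np.add.at(table, (o_cat, m_cat), 1)  # accumulate joint counts
    return table
```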

  35. Baltic MFC: sea level validation
• Time series of sea level data at 56 tide gauges
• Hourly data
• Satellite-borne altimeter: severe limitations in semi-enclosed seas due to limited accuracy and spatial resolution

  36. BAL MFC: ice metrics experiments
• Ice edge RMS distance is highly dependent on the reference data chosen (model or obs), as the sketch below illustrates
• Minimum ice edge length needs to be tuned to suppress false alarms
• The score can be misleading at the start/end of the ice period
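A sketch of an ice-edge RMS distance metric of this kind, which makes the reference dependence explicit; the edge definition and the use of binary masks (e.g. SIC ≥ 15%) are our assumptions:

```python
import numpy as np
from scipy.ndimage import binary_erosion, distance_transform_edt

def ice_edge_rms_distance(ice_ref, ice_other, grid_spacing_km=1.0):
    """RMS distance from each ice-edge cell of `ice_other` to the
    nearest edge cell of `ice_ref`. Inputs are 2-D boolean ice masks.
    Note the metric is asymmetric: swapping the masks, i.e. changing
    the reference data, changes the score."""
    def edge(mask):
        # Edge cells: ice cells with at least one non-ice neighbour
        return mask & ~binary_erosion(mask)

    # Distance (in grid cells) from every cell to the nearest ref-edge cell
    dist_to_ref = distance_transform_edt(~edge(ice_ref))
    d = dist_to_ref[edge(ice_other)]
    return float(np.sqrt(np.mean(d ** 2))) * grid_spacing_km
```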

  37. IBI validation improvements
NARVAL continuous developments. Working lines:
• To include new OBS sources: SMOS, ARGO (in D-M: automatic generation of nc files for WP17)
[Figure: bias and RMSE maps, February 2012]

  38. MED-MFC biogeochemistry, Oct 2012 - Mar 2013
Time series of product quality statistics and EAN. Product quality is estimated over sub-regions; two examples are reported: North-Western Mediterranean (NWM) and Levantine (LEV).
Known sources of error in forecasts:
• Western regions: patchy bloom events and high spatial variability, which are not completely resolved
• Satellite coverage decreases in winter, reducing the statistical comparison
• Eastern regions: oligotrophic conditions at the surface not fully handled by satellite observations and models

  39. Black Sea MFC: quality monitoring for sea level and currents
Example of the modelled sea level quality control: root mean square deviation and correlation coefficient between modelled and satellite sea levels, January-September 2012 (per-track computation sketched below).
[Figures: standard deviations between modelled and observed sea level for Jason-2, track 007 (RMSD (cm) per cycle); RMSD histogram over all satellites and tracks (frequency vs RMSD (cm))]
Also shown: an example of the modelled current velocities quality control.
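A minimal sketch of the per-track statistics (function and variable names are ours; inputs are collocated along-track values):

```python
import numpy as np

def track_stats(model_sla, obs_sla):
    """RMSD (same units as input) and correlation coefficient between
    modelled and observed sea level along one satellite track."""
    m, o = np.asarray(model_sla, float), np.asarray(obs_sla, float)
    valid = ~(np.isnan(m) | np.isnan(o))  # drop missing collocations
    m, o = m[valid], o[valid]
    rmsd = np.sqrt(np.mean((m - o) ** 2))
    corr = np.corrcoef(m, o)[0, 1]
    return rmsd, corr
```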

  40. Validation highlights: TAC-SL
Degrees of Freedom of Signal: evolution of the mean contribution of the different altimeters in the merged product.
• Jason-1 unavailable 1st to 15th March 2013
• Jason-2 reference mission unavailable for 10 days just after the Jason-1 recovery
• Any change in the constellation can impact the quality of the products (improved/degraded sampling)

  41. In situ validation: Key Performance Indicators
• KPI 1: Data availability
• KPI 2: Input data coverage
• KPI 3: Metadata quality
• KPI 4: Output data quality
[Figures: example KPI panels for the Global, Arctic and Mediterranean regions]

  42. First period summary and next steps
The outcomes promised in the plan:
• CalVal guidelines
  • Phase 1 published; phase 2 will be published Autumn 2013
• Operation of QuARG (Quality Assurance Review Group)
  • Completed reviews for V3, QuIDs very complete; lessons learned...
• Metrics development
  • Initial results are promising; investigations continue
• Quarterly validation reporting
  • Prototype developed; expect publication in July
• Establishing links and processes (service desk, users, reanalysis WP, between PCs)
  • Some progress via guidelines; need to do more next year (top priority)

  43. Thank you

  44. Proposed links – production mode
Real-time operations (PC x.2/3) send the results of routine validation for the last quarter, and WP17.4 sends the results from the new metrics for the last quarter, to the WP17.5 quarterly report. The service desk (WP18) contributes quality incidents and user queries, and possibly a list of recent quality incidents (open question). The report is disseminated via the web portal to PCs, users and others…

  45. Proposed links – offline mode
WP3 (users) helps WP17.2 (CalVal guidelines) and WP17.5 (quarterly reports) understand user needs for accuracy info and discuss the impact on the usefulness of products. PC reps in WP17.2 liaise with the rest of their PC. WP17.3 (QuARG) discusses the impact on development priorities with R&D (WP19; PC x.5, WP18) and provides evidence for the benefit of upgrades. PC representatives are part of WP17 for all of these activities.
