Data Quality



Presentation Transcript


  1. Data Quality Ed Chapman OOI Chief Systems Engineer

  2. Goal
     Address Areas for Recommendations: #2 “Data Policy, Data Quality Protocols and Procedures” and #4 “Data sampling rate strategy development and management”.
     Specific topics: “Shoreside & at-sea instrument and subsystem quality/calibration procedures/protocols, automated thresholds/flags, manual data QC, exception management, and long term time-series data sampling rate management.”

  3. Shoreside & at-sea instrument and subsystem quality/calibration procedures/protocols

  4. Pre-Deployment Procedures
     1. Incoming Inspection • Completed for all Instruments and Platforms • Verifies configuration and state as delivered
     2. Calibration Records • Vendor records for each instrument or platform
     3. Quality Conformance Tests (QCT) • Completed for all Instruments and Platforms • Confirms basic functionality (“bench test”), detects failures or damage
     4. Instrument Requirements Verification • Completed for each instrument model • Evaluates first article against requirements and specifications
     5. Platform Integration and Test • Platform operation verified using platform controller • End-to-end communication verified, instrument to shore station • Platform Requirements Verification

  5. At-Sea Procedures: Pioneer-1
     Platform monitoring • Full platform function available when within WiFi range • Communication with shore station when out of range
     Shipboard underway sampling • Meteorology time series from Knorr bow mast • Thermosalinograph time series from Knorr system • Bathymetry from echosounder and multi-beam
     Shipboard CTD profiles • Post-deployment casts at each of 3 sites for Pioneer-1 • Seabird 9-11 with DO, Fluoro, beam x-miss, turb, PAR
     Physical Samples • Post-deployment casts at each of 3 sites for Pioneer-1 • Salinity and Oxygen completed onboard • Nitrate/Nitrite, Chlorophyll and Carbon system done in shore labs

  6. Automated QC Thresholds and Flags: L1b and L2b

  7. [Diagram: data flow from the instrument driver and agent into permanent storage and the data product algorithm; a calibration table supplies secondary post-deployment and post-recovery calibration values, interpolated and applied through a POLYVAL algorithm; lookup tables feed the QC algorithms (range, spike, stuck, gradient, trend, combined), which produce QC flags delivered to the user.]
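  A minimal sketch of the calibration step shown in the diagram, assuming the secondary calibration is a polynomial evaluated with numpy.polyval and that the post-deployment and post-recovery coefficient sets are interpolated linearly across the deployment; the function and variable names are illustrative, not the OOINet implementation.

      import numpy as np

      def apply_secondary_calibration(t, raw, t_deploy, t_recover,
                                      coeffs_deploy, coeffs_recover):
          """Convert L1a values to L1b with a polynomial (POLYVAL) calibration
          whose coefficients are interpolated in time between the post-deployment
          and post-recovery calibration records (illustrative sketch only)."""
          # Fraction of the deployment elapsed at each sample time, clipped to [0, 1].
          frac = np.clip((np.asarray(t) - t_deploy) / (t_recover - t_deploy), 0.0, 1.0)
          c_dep = np.asarray(coeffs_deploy, dtype=float)
          c_rec = np.asarray(coeffs_recover, dtype=float)
          out = np.empty(len(raw), dtype=float)
          for i, (f, x) in enumerate(zip(frac, raw)):
              coeffs = (1.0 - f) * c_dep + f * c_rec   # interpolated coefficient set
              out[i] = np.polyval(coeffs, x)           # POLYVAL step from the diagram
          return out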

  8. Automated QC Checks • Seven QC Checks:
     • Global Range Test
     • Local Range Test
     • Spike Test
     • Stuck Value Test
     • Trend Test
     • Gradient Test
     • Combined QC Flags
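  Two of the seven checks, sketched to show the flavour of the automated tests; the flag convention (1 = pass, 4 = fail) and the exact spike criterion are assumptions for illustration and may differ from the algorithms OOI deploys.

      import numpy as np

      def global_range_test(values, valid_min, valid_max):
          """Flag samples outside the sensor's valid global range.
          Returns one flag per sample: 1 = pass, 4 = fail (illustrative values)."""
          values = np.asarray(values, dtype=float)
          flags = np.ones(values.size, dtype=int)
          flags[(values < valid_min) | (values > valid_max)] = 4
          return flags

      def spike_test(values, threshold):
          """Flag samples that differ from the mean of their two neighbours by
          more than `threshold` (a common spike-test formulation, used here as
          a stand-in for the OOI spike algorithm)."""
          values = np.asarray(values, dtype=float)
          flags = np.ones(values.size, dtype=int)
          neighbour_mean = 0.5 * (values[:-2] + values[2:])
          spikes = np.abs(values[1:-1] - neighbour_mean) > threshold
          flags[1:-1][spikes] = 4
          return flags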

  9. When? • QC checks are run periodically, when data are ingested from the uncabled instruments, and continuously for the cabled instruments • QC flags are stored.

  10. Automated QC Actions • PSs create lookup tables, and the values are uploaded through the UI as CSV files
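  A sketch of what such a CSV might contain and how the values could be read back; the column names, reference designator, and limits are hypothetical, not the OOINet schema.

      import csv, io

      # Hypothetical lookup-table CSV a PS might upload through the UI;
      # columns and values are illustrative, not the OOINet schema.
      csv_text = (
          "reference_designator,parameter,qc_test,valid_min,valid_max\n"
          "EXAMPLE-MOORING-CTD01,seawater_temperature,global_range,-2.0,35.0\n"
          "EXAMPLE-MOORING-CTD01,practical_salinity,global_range,0.0,42.0\n"
      )

      # Parse into a lookup keyed by instrument, parameter, and QC test.
      lookup = {}
      for row in csv.DictReader(io.StringIO(csv_text)):
          key = (row["reference_designator"], row["parameter"], row["qc_test"])
          lookup[key] = (float(row["valid_min"]), float(row["valid_max"]))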

  11. Automated QC Updates • If new values are uploaded for any of the QC flags, those values overwrite the original values • OOINet reruns the QC check for all data products and creates and stores new QC flags • QC is “value added”, so we don’t retain prior flags
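  The overwrite-on-rerun behaviour described above, reduced to a sketch in which stored flags are keyed by data product and test; the storage layout is an assumption for illustration.

      # Flags keyed by (data product, QC test); a rerun replaces the stored
      # array outright, so no history of prior flags is kept ("value added").
      qc_flag_store = {}

      def rerun_qc(data_product_id, qc_test, values, run_test):
          new_flags = run_test(values)                            # e.g. global_range_test
          qc_flag_store[(data_product_id, qc_test)] = new_flags   # overwrite, no versioning
          return new_flags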

  12. Human in the Loop QC: L1c and L2c

  13. [Diagram: the same data flow as slide 7, with a human-in-the-loop step added after the automated QC algorithms, producing L1c values.]

  14. Human in the Loop QC Actions
     • PS periodically downloads an L1 or L2 product
     • PS performs HITL QC locally on PS machine
     • PS uploads L1c or L2c values, and HITL metadata (provenance, etc.) into OOINet
     • User who downloads L1 or L2 product to which HITL QC has been applied will see L1c or L2c variables in the downloaded time series • Only for the time range for which the HITL QC was applied
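  A sketch of what the HITL upload in the workflow above might carry: corrected L1c values, the time range they apply to, and provenance metadata; the field names are hypothetical and not the OOINet upload format.

      from datetime import datetime, timezone

      def build_hitl_upload(l1c_values, times, operator, method_notes):
          """Assemble an illustrative HITL QC upload: L1c values for a specific
          time range plus provenance metadata (hypothetical field names)."""
          return {
              "product_level": "L1c",
              "values": list(l1c_values),
              "time_range": {"start": times[0], "end": times[-1]},
              "provenance": {
                  "operator": operator,          # PS who performed the HITL QC
                  "performed_at": datetime.now(timezone.utc).isoformat(),
                  "method": method_notes,        # e.g. "manual despike in local tools"
              },
          }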

  15. Human in the Loop QC Updates • If new HITL values are uploaded for a time period that has already been uploaded, those values overwrite the original values.

  16. Relationship of QC levels a, b, and c

  17. [Diagram: L0 data pass through the primary calibration function to L1a, then through secondary calibration functions and the QC algorithms to L1b with QC flags, and through human-in-the-loop QC to L1c; the L2 data product algorithm similarly produces L2b and, after human-in-the-loop QC, L2c. The database stores each level, and the GUI delivers L0, L1a, L1b and QC flags, L1c, L2b, and L2c to the user.]
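  The level chain in the diagram, written out as a sketch of how each product level is derived from the one before it; the function names are illustrative.

      def derive_qc_levels(l0, primary_cal, secondary_cal, qc_tests, hitl_edit=None):
          """Illustrative chain from the diagram: L0 -> L1a (primary calibration)
          -> L1b plus QC flags (secondary calibration and automated QC)
          -> L1c (human-in-the-loop corrections, when applied)."""
          l1a = primary_cal(l0)                         # primary calibration function
          l1b = secondary_cal(l1a)                      # secondary calibration functions
          qc_flags = {name: test(l1b) for name, test in qc_tests.items()}
          l1c = hitl_edit(l1b) if hitl_edit else None   # human in the loop
          return l1a, l1b, qc_flags, l1c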

  18. Exception management

  19. Long term time-series data sampling rate management

  20. Pivotal and Default Sampling Rates
     • PSs and external scientists established pivotal and default sampling rates
     • Pivotal is the minimum necessary sampling to answer long term science questions
     • Default is the standard sampling rate
     • Rates guided by power and energy budgets for each platform type.
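  A sketch of the feasibility check this implies: a requested rate must not drop below the pivotal rate and must fit the platform's power and energy budget; the simple energy model and parameter names are assumptions for illustration.

      def sampling_rate_feasible(requested_hz, pivotal_hz,
                                 energy_per_sample_j, deployment_s, energy_budget_j):
          """Illustrative check of a requested sampling rate against the pivotal
          minimum and a per-sample energy budget (all parameters hypothetical)."""
          if requested_hz < pivotal_hz:
              return False, "below the pivotal rate needed for long term science questions"
          if requested_hz * deployment_s * energy_per_sample_j > energy_budget_j:
              return False, "exceeds the platform energy budget"
          return True, "feasible"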

  21. Annual Review of Default Rate
     • Annual external reviews to recommend default rate changes
     • Annual iteration between OOI engineers and external science advisors (via UNOLS committee) to assure that recommended adjustments are feasible

  22. Adjustment to sampling rates
     • Any recommended change in sampling rate by external proposers will be evaluated with respect to the required maintenance of 'pivotal' rates and power and energy budgets (this becomes part of the annual interaction between OOI and the UNOLS advisory structure)
     • O&M team will sustain the default sampling rates.
     • Changes in sampling rates only occur in response to:
        • safety threats to life or property
        • technical issues that compromise performance
        • pre-approved responses to defined events (volcanic eruptions, hurricanes…)
        • approved retasking of observatory elements (NSF proposal process, UNOLS scheduling)

  23. Questions? Specific topics: “Shoreside & at-sea instrument and subsystem quality/calibration procedures/protocols, automated thresholds/flags, manual data QC, exception management, and long term time-series data sampling rate management.”
