
EOS Terra MODIS Quality Assurance Overview


Presentation Transcript


  1. EOS Terra MODIS Quality Assurance Overview Joseph M. Glassy, Director, MODIS Software Development, Numerical Terradynamic Simulation Group (NTSG), School of Forestry, University of Montana, Missoula, Montana, USA

  2. Introduction • Goals: • Summarize our SCF Quality Assurance program (goals, protocols, interface to ECS) • Describe how QA/Validation information is organized, formatted, and retrieved. • Introduce key Quality Assurance concepts and techniques. • Invite feedback on how we can improve.

  3. Quality Assurance, Validation, and Evaluation • Integrating distributed quality control measures into a NASA operational mission is a relatively new approach (e.g., SCFs actively participating in QA beyond their initial assignment). • Quality control measures are divided into QA, evaluation, and a formal validation program.

  4. Quality Assurance, Evaluation, and Validation • Quality Assurance: near-real-time, tile- and pixel-level evaluation of tile products. • Evaluation: reasonableness checks on trends toward “correctness”, stopping short of declaring outputs absolutely right. • Validation: after-the-fact, formal, global analysis using sparse point comparisons (FluxNet, core validation sites, campaigns such as SAFARI 2000, etc.) to attempt to assess absolute accuracy…

  5. Quality Assurance • Two emphases: routine and trouble-ticket driven. • Two scopes: tile level (ECS metadata) vs. pixel level (separate pixel QA, often stored as categorical scores). • Tile-level QA: mandated EOSDIS Core System fields stored as searchable metadata, used to qualify data orders and the use of tiles in specific analysis applications. • ECS metadata: Object Description Language / Parameter Value Language (ODL/PVL) hierarchies.

  6. Quality Assurance Data Flow • Production at GSFC and/or MODAPS. • Goal: SCFs acquire and analyze 5% of tiles. • The SCF checks tile- and pixel-level QA, then updates SCIENCEQUALITYFLAG and its explanation. • The SCF posts its QA evaluation back to LDOPE. • LDOPE updates the database at the EROS Data Center. • Users order and receive tiles with the updated QA fields.

  7. QA VALIDS for SCIENCEQUALITYFLAG • Passed • Suspect • Failed • Inferred Passed • Inferred Failed • Being investigated • Not being investigated (default)
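
  A minimal sketch, in Python, of how an SCF script might guard against typos before posting a SCIENCEQUALITYFLAG evaluation. The set of values mirrors the VALIDS listed above; the helper name is illustrative and is not part of any ECS or LDOPE interface.

      # Valid SCIENCEQUALITYFLAG values, as listed on slide 7.
      SCIENCE_QUALITY_FLAG_VALIDS = {
          "Passed",
          "Suspect",
          "Failed",
          "Inferred Passed",
          "Inferred Failed",
          "Being investigated",
          "Not being investigated",  # default assigned at production time
      }

      def check_science_quality_flag(flag: str) -> str:
          """Return the flag unchanged if it is one of the allowed VALIDS."""
          if flag not in SCIENCE_QUALITY_FLAG_VALIDS:
              raise ValueError(f"Unrecognized SCIENCEQUALITYFLAG: {flag!r}")
          return flag

      print(check_science_quality_flag("Inferred Passed"))  # -> Inferred Passed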

  8. ECS Tile-Level Metadata • Appears in HDF-EOS tiles as three file-level character attributes: • StructMetadata.0 – tile geolocation parameters • CoreMetadata.0 – user-searchable key fields used to qualify queries/orders and to assess usability in scientific analysis and field applications • ArchiveMetadata.0 – overlapping information, intended mainly for ECS/EDC use in organizing archives (can generally be ignored)
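
  As a minimal sketch of reading these three attributes, the following assumes a Python environment with the pyhdf library; the placeholder file name is taken from the CoreMetadata.0 example on the next slide.

      from pyhdf.SD import SD, SDC

      # Open the HDF-EOS tile read-only and list its file-level (global) attributes.
      tile = SD("MOD15A1.A2000208.h18v04.002.2000322020100.hdf", SDC.READ)
      attrs = tile.attributes()

      # Each of the three metadata attributes is one long ODL/PVL-formatted string.
      for name in ("StructMetadata.0", "CoreMetadata.0", "ArchiveMetadata.0"):
          text = attrs.get(name, "")
          print(f"{name}: {len(text)} characters of ODL/PVL text")

      tile.end()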

  9. CoreMetadata.0 Example • ODL/PVL information is organized into a nested hierarchy of object “stanzas”. • Fields such as QAPERCENTGOODQUALITY or QAPERCENTCLOUDCOVER may be helpful. • Example of the ODL/PVL stanzas for LOCALGRANULEID and PRODUCTIONDATETIME:

      CoreMetadata.0 =
        GROUP = INVENTORYMETADATA
          GROUPTYPE = MASTERGROUP

          GROUP = ECSDATAGRANULE

            OBJECT = LOCALGRANULEID
              NUM_VAL = 1
              VALUE = "MOD15A1.A2000208.h18v04.002.2000322020100.hdf"
            END_OBJECT = LOCALGRANULEID

            OBJECT = PRODUCTIONDATETIME
              NUM_VAL = 1
              VALUE = "2000-11-17T02:01:01.000Z"
            END_OBJECT = PRODUCTIONDATETIME
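
  A quick way to pull a single value out of this text, without writing a full ODL/PVL parser, is a regular expression over the OBJECT stanza of interest. This is a sketch only; a complete parser would walk the GROUP/OBJECT hierarchy instead.

      import re

      def odl_value(metadata_text: str, object_name: str) -> str:
          """Return the VALUE field of the named OBJECT stanza (quotes stripped)."""
          pattern = (
              rf"OBJECT\s*=\s*{object_name}\b"     # start of the stanza
              rf".*?VALUE\s*=\s*\"?([^\"\n]+)\"?"  # capture the VALUE token
          )
          match = re.search(pattern, metadata_text, re.DOTALL)
          if match is None:
              raise KeyError(f"{object_name} not found in metadata")
          return match.group(1).strip()

      # e.g., applied to the CoreMetadata.0 text read on the previous slide:
      # odl_value(attrs["CoreMetadata.0"], "LOCALGRANULEID")
      # -> 'MOD15A1.A2000208.h18v04.002.2000322020100.hdf'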

  10. Embedded QA Bitfield Legend (via Noesys 2.0)

  11. …all QA bitfields merged…

  12. …vs. each QA bitfield separated (panels for MODLAND_QC, ALGOR_PATH, and CLOUD_STATE). Example frequency table for CLOUD STATE (0 = clear, 1 = cloudy, 2 = mixed, 3 = not set):

      QC   N. Pixels   Rel. Frequency (%)
       0      648401        45.028
       1      404148        28.066
       2       82511         5.730
       3      304940        21.176
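
  A sketch of how such a frequency table can be reproduced for one bitfield, assuming Python with numpy and a pixel-level QC byte array already read from the tile. The bit offset and width used for CLOUD STATE here are placeholders; take the real positions from the product's embedded QA bitfield legend (slide 10).

      import numpy as np

      CLOUD_STATE_OFFSET = 3   # assumed start bit of the 2-bit CLOUD STATE field
      CLOUD_STATE_WIDTH = 2
      LABELS = {0: "clear", 1: "cloudy", 2: "mixed", 3: "not set"}

      def cloud_state_frequencies(qc: np.ndarray) -> dict:
          """Tally pixel counts and relative frequencies per CLOUD STATE class."""
          mask = (1 << CLOUD_STATE_WIDTH) - 1
          states = (qc.astype(np.uint8) >> CLOUD_STATE_OFFSET) & mask
          counts = np.bincount(states.ravel(), minlength=len(LABELS))
          total = counts.sum()
          return {LABELS[i]: (int(c), 100.0 * c / total) for i, c in enumerate(counts)}

      # Toy stand-in for a real 1200 x 1200 tile's QC science data set:
      qc_demo = np.random.randint(0, 256, size=(1200, 1200), dtype=np.uint8)
      for label, (n, pct) in cloud_state_frequencies(qc_demo).items():
          print(f"{label:8s} {n:8d} {pct:7.3f}%")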

  13. Quality Assurance Techniques • Tile-level analysis using visualization tools such as HDFLook, ENVI/IDL, Noesys, and HDF-Explorer • Tile-level statistical analysis and characterization (see the sketch after this list) • Statistical characterization of multiple tiles across time periods • Reprojection/mosaic post-processing to import tiles into GIS/RS environments for formal comparison with other geospatial datasets.
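
  As one example of the tile-level statistical characterization mentioned above, the following sketch assumes Python with pyhdf and numpy; the "Lai_1km" science data set name and its _FillValue attribute are assumptions to check against the product documentation.

      import numpy as np
      from pyhdf.SD import SD, SDC

      def summarize_sds(path: str, sds_name: str) -> dict:
          """Basic per-tile statistics for one science data set, ignoring fill pixels."""
          tile = SD(path, SDC.READ)
          sds = tile.select(sds_name)
          data = sds.get().astype(np.float64)
          fill = sds.attributes().get("_FillValue")
          valid = data[data != fill] if fill is not None else data
          stats = {
              "n_valid": int(valid.size),
              "pct_valid": 100.0 * valid.size / data.size,
              "min": float(valid.min()),
              "max": float(valid.max()),
              "mean": float(valid.mean()),
              "std": float(valid.std()),
          }
          tile.end()
          return stats

      # e.g. summarize_sds("MOD15A1.A2000208.h18v04.002.2000322020100.hdf", "Lai_1km")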

  14. Quality Assurance Tools Available • Public domain: HDFLook (Lille, FR); WebWinds (JPL) • Commercial: Noesys, ENVI/IDL, HDF-Explorer • Future: ERDAS Imagine, PCI EasiPACE, ArcInfo • Open-source utilities: reprojtool.exe, qcbits.exe, and others as developed… • Ask your vendor to support HDF-EOS!

  15. Quality Assurance Resources Available • Consult the MODIS Product User’s Guide and/or the product Abstract, available with orders at EDC • Consult the main NASA EOSDIS Core System metadata web site: • observer.gsfc.nasa.gov/applied_tech/metadata • The main MODIS Quality Assurance URL is: • http://modland.nascom.nasa.gov/QA_WWW/qahome.html
