
LAI Evaluation and Protocols Workshop Background and Charge


Presentation Transcript


  1. LAI Evaluation and Protocols Workshop: Background and Charge • Jeffrey L. Privette, Chair, Land Product Validation Subgroup

  2. Land Remote Sensing in the 21st Century • Data-rich environment (moderate resolution) • AATSR, VEGETATION, MODIS, AVHRR, GLI, SAC-C, MERIS, POLDER, NPP/NPOESS, SeaWiFS • Operational land products • e.g., LAI, FPAR, Land Cover/Change, Albedo, NPP • Multiple temporal and spatial scales

  3. Implications of Operational Land Products • Single algorithm • Consistent global approach • Accuracy varies with target • Less scientific oversight • Much greater – but less critical – user community • Product interdependencies (e.g., surface reflectance) • Products used in large-scale process models • National and international policies • Errors increasingly important • e.g., Antarctic ozone hole “found” late

  4. Evaluation: QA vs. Validation • Quality Assessment • Basic sensibility checks • Qualitative • Validation • Comparisons with an independent measurement source • Quantitative • Global products point to international cooperation/standards • CEOS WGCV Land Product Validation Subgroup
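
The quantitative comparisons described on this slide typically reduce to a few summary statistics over matched product/field pairs. The following is a minimal sketch of that idea in Python; the function name, the example numbers, and the choice of statistics (bias, RMSE, correlation) are illustrative assumptions, not part of the workshop material.

```python
# Minimal sketch: quantitative comparison of a satellite LAI product against
# independent field measurements. All numbers below are made up for illustration.
import numpy as np

def validation_stats(product_lai, field_lai):
    """Bias, RMSE, and correlation for matched product/field LAI pairs."""
    product_lai = np.asarray(product_lai, dtype=float)
    field_lai = np.asarray(field_lai, dtype=float)
    diff = product_lai - field_lai
    return {
        "n": diff.size,
        "bias": diff.mean(),
        "rmse": np.sqrt((diff ** 2).mean()),
        "r": np.corrcoef(product_lai, field_lai)[0, 1],
    }

if __name__ == "__main__":
    product = [1.8, 3.2, 4.5, 0.9, 2.7]   # product LAI (m2/m2)
    field   = [1.5, 3.0, 5.1, 1.1, 2.4]   # field LAI at the same sites
    print(validation_stats(product, field))
```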

  5. Context of LPV Subgroup • Committee on Earth Observation Satellites (CEOS) • Purpose: to coordinate civilian observations from space • Complementarity, compatibility, exchange • Composed of the world’s space agencies • Two Working Groups • Working Group on Information Systems and Services (WGISS) • Working Group on Calibration and Validation (WGCV) • Charter: facilitate and encourage standards, efficiencies, best practices • Five Subgroups • Synthetic Aperture Radar • Microwave Sensors • Terrain Mapping • Infrared and Visible Optical Sensors • Land Product Validation

  6. LPV Subgroup Organization • Implementation • Topical workshops • Reports and recommendations • Community activities to address issues/problems • Leadership • Chair: Jeff Privette (NASA; tenure 3-5 years) • Deputy Chair: Jeff Morisette (SSAI) • European Lead: Kurt Gunther (DLR)

  7. LPV History • Organizational meeting in Ispra (2000) • Charter, Vision, Work Plan • GOFC chosen to provide an initial programmatic focus for LPV • WGCV, CEOS approval in October 2000 • Additional discussions in January 2001 • Aussois, France (ISPRS PMSRS) • Washington DC (MODIS Science Team Mtg.)

  8. Global Observation of Forest Cover (GOFC) • Part of the Global Terrestrial Observing System (GTOS) – contributing to the IGOS Carbon Theme • Objective: to improve operational monitoring of land surface from the carbon perspective • GOFC themes • Land cover • Biophysical Parameters (LAI/FPAR/NPP) • Fire • Providing validated products is a high priority for GOFC

  9. Year 2001 Focus Topic: Biophysical Parameters • Initial Focus: Leaf Area Index and FPAR • MODIS product released in August 2000 • Forthcoming: GLI, MISR, MERIS (VI-derived) • Informal Meetings • May ’00 (Ispra), Jan. ’01 (ISPRS/Aussois), Jan. ’01 (MODIS Sci. Team Mtg., Washington) • Major validation projects afoot • VALERI, BigFoot, Canadian Network, SAVE

  10. LAI Product Evaluation • 12-18 month pathfinder • Shake-out for more rigorous, longer-term and multiproduct comparisons • Evaluation of operational and research products • Leverage Year 2000 field campaigns (24) • 1st Workshop: Frascati, June 6-7, 2001

  11. LAI Product Evaluation: Sites [map of candidate validation sites not reproduced in this transcript] * LAI and satellite data collected in Year 2000

  12. Support of LAI Evaluation Sites • Satellite data: multiple sensors, various scales, subsetted over the site and WWW accessible • Ancillary/GIS layers, such as elevation, land cover, and a reference layer (with political boundaries, airports, water bodies) • Scientific networks, such as AERONET and FLUXNET data • Field and airborne data, WWW accessible • Graphic courtesy of the BigFoot program
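
One way to realize the satellite data "subsetted over the site" described above is to cut a fixed pixel window around each site's coordinates from a georeferenced product file. The sketch below uses the rasterio library and assumes a single-band GeoTIFF with site coordinates given in the raster's own CRS; the file name, window size, and coordinates are hypothetical, not the actual tooling of any of the projects named on the slide.

```python
# Illustrative per-site subsetting of a gridded LAI product.
# File name, window size, and site coordinates are hypothetical.
import rasterio
from rasterio.windows import Window

def subset_around_site(path, x, y, half_width_px=10):
    """Read a square pixel window centred on map coordinates (x, y)."""
    with rasterio.open(path) as src:
        row, col = src.index(x, y)                # map coords -> pixel indices
        size = 2 * half_width_px + 1
        window = Window(col - half_width_px, row - half_width_px, size, size)
        data = src.read(1, window=window)         # first band only
        transform = src.window_transform(window)  # georeferencing of the subset
    return data, transform

# Example call (hypothetical file and site location):
# chip, tfm = subset_around_site("lai_product_2000.tif", x=-105.5, y=40.0)
```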

  13. Steps to Global LAI Validation • Identify and endorse best methods → Standards (v.1) • Link (by methods, data exchange) existing project-based site networks into a “Global Test Site Network” • Critical assessment of the Global Test Site Network • Statistical credibility, representativity, size, scheduling • Feedback to algorithm developers, CEOS, and the community
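
For the critical-assessment step, one simple way to gauge representativity is to compare the biome mix of the candidate test-site network against the global area fraction of each biome. The toy sketch below assumes that framing; the biome classes, area fractions, and site list are invented purely for illustration.

```python
# Toy representativity check: biome mix of a test-site network vs. assumed
# global biome area fractions. All values are illustrative placeholders.
from collections import Counter

GLOBAL_BIOME_FRACTION = {
    "broadleaf forest": 0.20,
    "needleleaf forest": 0.15,
    "savanna": 0.20,
    "grassland": 0.25,
    "cropland": 0.20,
}

site_biomes = ["broadleaf forest", "needleleaf forest", "needleleaf forest",
               "grassland", "cropland", "savanna", "broadleaf forest"]

counts = Counter(site_biomes)
n_sites = len(site_biomes)
print(f"{'biome':<20}{'network':>10}{'global':>10}")
for biome, global_frac in GLOBAL_BIOME_FRACTION.items():
    network_frac = counts.get(biome, 0) / n_sites
    print(f"{biome:<20}{network_frac:>10.2f}{global_frac:>10.2f}")
```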

  14. Charge • Evaluate LAI products from satellites (Day 1) • Challenging conditions, biomes • Product idiosyncrasies • Discuss/draft “Best Methods” for LAI data collection and analysis (Days 1 and 2) • Site measurements (space, time, method) • Scaling • Comparisons with products • Requirements (tools) for the community to do the job • Think and work as a team in service to the satellite-product user community
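
The scaling and product-comparison items in the charge often come down to aggregating fine-resolution field or airborne LAI maps up to the product grid before comparing. The sketch below block-averages a fine-resolution array to a coarser cell size; the array shape, aggregation factor, and random test data are assumptions for illustration only.

```python
# Illustrative scaling step: block-average a fine-resolution LAI map up to a
# coarser product grid before comparison. Shapes and factor are assumptions.
import numpy as np

def block_average(fine_lai, factor):
    """Aggregate a 2-D array by averaging non-overlapping factor x factor blocks."""
    rows, cols = fine_lai.shape
    rows -= rows % factor          # trim edges that do not fill a full block
    cols -= cols % factor
    trimmed = fine_lai[:rows, :cols]
    return trimmed.reshape(rows // factor, factor,
                           cols // factor, factor).mean(axis=(1, 3))

if __name__ == "__main__":
    fine = np.random.uniform(0.5, 6.0, size=(100, 100))  # fake 30 m LAI map
    coarse = block_average(fine, factor=33)               # ~1 km cells
    print(coarse.shape, float(coarse.mean()))
```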
