
Workshop on Quantitative Evaluation of Downscaled Climate Projections (August 12-16, 2013)






Presentation Transcript


  1. The National Climate Predictions and Projections Platform. Workshop on Quantitative Evaluation of Downscaled Climate Projections (August 12-16, 2013)

  2. Motivation: Practitioner’s Dilemma • How to choose among the many available sources of climate information for a given place and application?

  3. Needs • Objective evaluation of datasets prepared for use in planning for climate change • Provision of application-specific guidance to improve the usability of climate-change projections in planning • Initiation of a standards-based community to build and sustain a practice of evaluation and informed use of climate-change projections

  4. When, Where, Who • August 12-16, 2013 • Boulder, Colorado • Participants: use cases from the sectoral working groups (agricultural impacts, ecological impacts, water resources impacts, human health impacts); datasets from the downscaling working groups; NCPP community, agency partners, program sponsors, international observers, and interested parties

  5. Week at a Glance • Monday 12 August and Tuesday 13 August – Days 1 and 2: Evaluation focus • Wednesday 14 August – Day 3: Transition • Thursday 15 August and Friday 16 August – Days 4 and 5: Guidance focus

  6. Expected Outcomes (Data, Evaluation, Communities of Practice) • Database for access to high-resolution datasets with standardized metadata of downscaling methods • Demonstration of a flexible, transparent climate index calculation service (Climate Translator v.0) • First version of a standardized evaluation capability and infrastructure for high-resolution climate datasets, incl. application-oriented evaluations • Description of a sustainable infrastructure for evaluation services • Sector- and problem-specific case studies within the NCPP environment • First version of a comparative evaluation environment to develop translational and guidance information • Identification of value added, remaining gaps, and needs for further development of the evaluation framework and metadata, incl. templates

  7. Evaluation: Downscaling working groups • Statistical downscaling datasets: ARRM, BCSD, MACA, BCCA • Dynamical downscaling datasets: NARCCAP data, Hostetler data, RegCM2 • Baseline: Delta method • Gridded observational data sets

  8. Guidance: Applications (Water resources, Ecological impacts, Agriculture, Health impacts) • Application use cases • Identification of a network of application specialists • Definition of representative questions to focus the evaluations • Representation of application needs: scales, indices, etc. • Feedback on guidance and translational information needs • Feedback on design/requirements of the software environment for the workshop • Contributions to reports from the workshop

  9. About 75 participants • Downscaling working groups: BCCA, BCSD, ARRM, NARCCAP, MACA, etc. teams – approx. 20 people • Sectoral working groups: agricultural impacts, ecological impacts, water resources impacts, human health impacts – approx. 30 people • NCPP community: Executive Board, Climate Science Applications Team, Core & Tech Teams – approx. 18 people • Program managers, reporters, international guests – about 5 people

  10. Week in More Detail
  Days 1 and 2 (Monday, Tuesday) – EVALUATION focus • Intercomparison of downscaling methods • Fine-tuning the evaluation framework – what worked and what did not? • Interpretation of results and development of guidance for user groups • Identification of gaps and needs for downscaled data for the participating applications
  Day 3 (Wednesday) – TRANSITION: EVALUATION and GUIDANCE • Morning – summary of the downscaling method attributes and evaluation results by sector and protocol • Afternoon – start of the sectoral applications groups’ work
  Days 4 and 5 (Thursday, Friday) – GUIDANCE focus • Interpretation of results and guidance for user groups • Presentation of metadata descriptions and their usage • Presentation of indices provision – OCGIS • Identification of gaps and needs for downscaled data for application needs • Identification of future steps

  11. Below are categories of supplemental information

  12. Days in More Detail

  13. Proposed structure: Monday
  8:30-9:00: Breakfast and coffee/tea
  9:00-9:30: Logistics, welcome and introductions, technicalities (catchy intro: RAL director? Head of NOAA over video?) • Brief introductions of workshop members • Technical logistics: internet access, ESG node and CoG environment? • Overview of the workshop and key objectives
  9:30-10:30: Keynote – Practitioner’s Dilemma: A call from the desperate world for help
  Break
  11:00-12:30: Evaluation approach of NCPP • Framework presentation of the evaluation of downscaled projections: data, protocols, standards … • Introduction of version 0: how the evaluations were done, tools, images, metadata/CIM, potential plans forward (DIBBs structure), working groups and workshops, … community of practice
  Lunch 12:30-2:00
  2:00-3:30: High-resolution data providers: observed and projected gridded information • What distinguishes your method, and what were you trying to accomplish with it? (getting to the value-added question) • Presentations from developers of downscaling methods and datasets
  Break
  4:00-5:00: Key discussion: Directions of downscaling

  14. Proposed structure: Tuesday
  8:30-9:00: Breakfast and coffee/tea
  9:00-10:30: Results from evaluations – data perspective • Evaluation and characteristics of the baseline data: observed gridded data comparisons to station data and inter-comparisons – short presentations • Evaluation of the characteristics of the downscaled projections data: downscaled projections evaluation – presentations and discussion
  Break
  11:00-12:30: Continued
  Lunch 12:30-2:00
  2:00-3:30: Results from evaluations – user perspective • Short introduction of application needs • Case-study presentations and critique of the evaluations
  Break
  4:00-5:00: Key discussion: issues related to the framework • Next steps in fine-tuning the evaluation framework – what worked and what did not? What else needs to be added? What needs to be changed? What needs to be done by the developers of downscaled data – what gaps are there in relation to applications?

  15. Proposed structure: Wednesday
  8:30-9:00: Breakfast and coffee/tea
  9:00-10:00: Keynote – Downscaling for the World of Water (Maurer?)
  10:00-10:30: Summary of the first two days and future evaluation potential using Protocols 2 and 3 • Summary of the first two days • Perfect-model experiments and evaluations – presentation and discussions • Process-based metrics and evaluations – presentation and discussions
  Break
  11:00-12:30: User communities
  Lunch 12:30-2:00
  Break
  4:00-5:00: Key discussion

  16. Days 4 and 5

  17. More Detail on Participants and Partnerships

  18. Partnership through downscaling working groups • GFDL – perfect-model experiments: Keith Dixon, V. Balaji, A. Radhakrishnan • Texas Tech University, SC CSC: Katharine Hayhoe – ARRM • DOI USGS, Bureau of Reclamation, Santa Clara University, Scripps Institution, Climate Central, NOAA/NWS: E. Maurer, H. Hidalgo, D. Cayan, A. Wood – BCSD, BCCA • University of Idaho: J. Abatzoglou – MACA • DOI USGS, Oregon State University: S. Hostetler – RegCM2, dynamically downscaled data • NCAR: Linda Mearns – NARCCAP, dynamically downscaled data

  19. Partnerships through sectoral working groups • Health impacts: NOAA/NWS – David Green; NYC Dept of Health – Dr. Shao Lin; NCAR – Olga Wilhelmi; Columbia University – Patrick Kinney; University of Florida – Chris Uejio • Agricultural impacts: AgMIP, USDA, NIDIS, SE RISA • Ecological impacts: DOI USGS NC CSC • Water resources impacts: Bureau of Reclamation, California ……

  20. Partnership through infrastructure, metadata and standards development • ES-DOC • IS-ENES, METAFOR project (CIM and CVs) • NESII • CoG, OCGIS • EU CHARMe project (metadata archive and search) • EU CORDEX (dynamical downscaling CV), NA CORDEX (archive and metadata standardization) • ESGF (data and images archiving) • DOI-USGS (data access) • GLISA (translational information archiving)

  21. More Details on Protocols and Metrics

  22. Downscaling working groups • Statistical downscaling datasets: ARRM, BCSD, MACA, BCCA • Dynamical downscaling datasets: NARCCAP data, Hostetler data, RegCM2 • Baseline: Delta method

  23. Downscaling working groups – the same diagram, rearranged: statistical downscaling datasets (ARRM, BCSD, MACA, BCCA), dynamical downscaling datasets (NARCCAP data, Hostetler data, RegCM2), and the Delta-method baseline

  24. Evaluation framework: protocols and metrics. Types of protocols: • Observational – validation by comparison to observed data • Perfect model – comparison to a high-resolution GCM; allows evaluation of nonstationarity • Idealized scenarios – comparison to synthetic data with known properties

  25. Groups of metrics • Group 1 – a standard set of metrics calculated for all methods, describing the statistical distribution and temporal characteristics of the downscaled data: central tendency, tails of the distribution, variability, temporal characteristics • Group 2 – sets of metrics useful for specific sectoral and impacts applications: water resources, human health, ecological impacts, agricultural impacts • Group 3 – sets of metrics used to evaluate climate-system processes and phenomena: Southwest monsoon, extreme precipitation processes, atmospheric rivers, other extreme-event-related processes
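As a sketch of what a Group 1 computation involves, the snippet below derives central tendency, distribution tails, and variability from a daily series and compares a downscaled product against an observational baseline. The function name, the simple rank-based percentile rule, and the toy data are illustrative assumptions, not NCPP code.

```python
import statistics

def group1_metrics(daily_values):
    """Group 1 descriptive metrics: central tendency, distribution
    tails, and variability of a daily series (illustrative sketch)."""
    s = sorted(daily_values)
    n = len(s)
    def pct(p):  # simple rank-based percentile (assumption, not NCPP's rule)
        return s[min(n - 1, int(p * n))]
    return {
        "mean": statistics.fmean(s),       # central tendency
        "p05": pct(0.05),                  # lower tail
        "p95": pct(0.95),                  # upper tail
        "stdev": statistics.stdev(s),      # variability
    }

# Toy comparison of a downscaled series against observations:
obs = [10.0, 12.5, 11.0, 9.5, 13.0, 14.5, 12.0, 10.5]
downscaled = [10.4, 12.1, 11.6, 9.9, 12.7, 14.0, 12.3, 10.8]
bias = group1_metrics(downscaled)["mean"] - group1_metrics(obs)["mean"]
```

In a real evaluation the same summary would be computed per grid cell and season for every downscaling method over the common baseline period, which is what makes the Group 1 set comparable across methods.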

  26. More detailed architectural diagrams • Original vision of the NCPP architecture • Commodity Governance (CoG) / Earth System Grid Federation (ESGF) infrastructure to support the 2013 workshop • OpenClimateGIS systems figure

  27. Original vision of the NCPP architecture (Summer 2011) – not complete or final!
  Interface layer • Information interpretation: NCPP website, project workspaces for communities of practice (CoG for community connections) • Support for inter-comparison projects and workflows representing solution patterns (Curator display, CoG) • Composition and display of guidance documents and other text related to the use of climate data (climate.gov approaches)
  Service layer • Downscaling and data-formatting services, visualization, faceted data search, bookmarking (OpenClimateGIS, LAS, ESGF search, USGS tools, ENSEMBLES) • Search and semantic services associated with web content and other sources (Consiliate, Drupal database tools)
  Resource layer • Federated data archival and access (ESGF, THREDDS, data.gov platforms; data at USGS, PCMDI, NASA, NOAA, …) • Federated metadata collection and display (Curator tools, METAFOR, Kepler and other workflow solutions)

  28. CoG ESGF Infrastructure to support 2013 Workshop

  29. OpenClimateGIS Systems Figure

  30. Design Considerations: Climate Translator v.0

  31. Design considerations: Climate Translator v.0 • Indices: predefined; defined by users • Geography: define locality; GIS; web mapping • Multiple basic data archives: USGS GeoData Portal, Earth System Grid, … • Evaluation: protocols, metrics, observations • Analysis & synthesis of information: definitions, sources, metadata, fact sheets, narratives, guidance
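The "predefined vs. user-defined indices" idea can be sketched as a small registry of index functions: a service ships with standard indices and lets users register their own. All names here (frost_days, register_index, the thresholds) are hypothetical illustrations, not the actual Climate Translator or OCGIS API.

```python
# Sketch of an index registry (hypothetical names, not NCPP/OCGIS code).
def frost_days(tmin_series, threshold=0.0):
    """Predefined index: count of days with minimum temperature
    below a threshold (degrees C)."""
    return sum(1 for t in tmin_series if t < threshold)

INDICES = {"frost_days": frost_days}  # predefined indices

def register_index(name, func):
    """Users may add their own index definitions."""
    INDICES[name] = func

# A user-defined index: days with maximum temperature above 30 C.
register_index("hot_days", lambda ts: sum(1 for t in ts if t > 30.0))

tmin = [-2.0, 1.5, -0.5, 3.0, 0.0]
n_frost = INDICES["frost_days"](tmin)
```

In the full design the same index function would be applied after the geography step has subset the archive to the user's locality, so index definitions stay independent of data access.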

  32. NCPP Architecture: What Data Goes Where

  33. What data goes where • Primary data: existing downscaled datasets; validation datasets (observations or hi-res model output) – held on ESGF (local) or other OPeNDAP servers (e.g. GeoData Portal) • Quantitative evaluation: computation of indices, if not already available; computation of metrics according to NCPP protocols – evaluation code (NCL; Python?) run from local disk (may be at NOAA, NCAR, or at a scientist’s institution), with the code repository linked to the CoG environment (ideally) • Products of QED: new indices datasets (?); evaluation data bundles; image bundles (ESGF? local database?) • CIM documents describe: downscaled datasets, index/metric code, groups of metrics, downscaling model components (processor component? experiment?), downscaling simulations and ensembles, evaluation protocols, experiments (e.g. NCPP Protocol 1) • In the diagram, object locations are color-coded: orange = CoG or other NCPP database; gray = “don’t know yet”

  34. Products of the workshop/working groups • Evaluation data bundles and image bundles (ESGF? local database?) – search and compare via the CoG wiki and linked tools • Further visualization – other images (unstructured) • Expert analysis – text (structured case studies; other text) • Integration with other NCPP translational information – CoG wiki and linked tools, or a GLISA-like CMS/database ??? • Translational/interpretive CIM document? (CIM)

  35. Design Considerations • These diagrams were meant to help define the computational environment supporting Workshop 2013 (read the notes sections of the slides) • Focus on evaluation of existing data products • Linking to protocols and metrics; development of a capability to compare and describe gridded data systems • Separate the output interface into types, to facilitate development of services versus the internal NCPP environment

  36. Two Classes of Evaluation
  Evaluation of data products • Important for end users • Informs data-set developers • Definable problem with our resources • Fundamental descriptions are of value and support NCPP’s mission
  Evaluation of methodology • Important for data-set developers • Informs uncertainty description and translation • “Perfect model” strategically central

  37. 2013 Workshop focus – evaluation of data products • Quantified Description Environment (QDE) • Focus on T and P; quantify differences in standard data sets • Data-set choice criteria: meaningful contribution; standard treatment across datasets; gridded; what is in the literature? • (Evaluation of data products: important for end users; informs data-set developers; definable problem with our resources; fundamental descriptions are of value and support NCPP’s mission)
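"Quantify differences in standard data sets" for a variable like T reduces, at its simplest, to pairwise difference statistics between products over a common grid and period. The snippet below shows mean bias and RMSE on two 1-D toy series; the function name and data are illustrative assumptions, and real QDE comparisons would run over full gridded fields.

```python
import math

def bias_and_rmse(a, b):
    """Mean bias and root-mean-square error between two matched
    series (e.g. temperature from two data products)."""
    diffs = [x - y for x, y in zip(a, b)]
    bias = sum(diffs) / len(diffs)
    rmse = math.sqrt(sum(d * d for d in diffs) / len(diffs))
    return bias, rmse

# Toy temperature series from two hypothetical gridded products:
t_product_a = [15.2, 16.0, 14.8, 15.5]
t_product_b = [15.0, 16.4, 14.6, 15.3]
b, r = bias_and_rmse(t_product_a, t_product_b)
```

Applying one such standard treatment identically across all candidate datasets is what makes the resulting descriptions comparable, which is the point of the QDE.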

  38. Quantified Description Environment (QDE): Input → Calculation → Output

  39. QDE: Input • Station data (observations) • Gridded data: observations; models

  40. QDE: Output • Digital data: primary data; derived data • Non-digital data: software descriptions (structured, unstructured) • Destinations: research environment; support of services • Environments (end-user / us & not us): analysis, collaborative, end-user

  41. 2013 Workshop and NCPP

  42. NCPP Strategy and Projects • The 2013 workshop starts a progression of workshops that focus the overall evaluation activity and strategy of NCPP

  43. NCPP Strategy and Projects • The 2013 workshop is a focal point and an integration of all NCPP projects • Start of a progression of workshops that focus the overall evaluation activity and strategy of NCPP • (Diagram: Workshop 2013 integrates the NCPP projects – NC CSC, downscaling metadata, downscaling evaluation, climate indices, NCPP software environment, interagency community integration)

  44. Principles and Values – Workshop Goals • Standardization • Reproducibility and transparency • Comparability of methods • Extensibility • Co-development • Quantitative evaluation • Infrastructure support • Description, guidance and interpretation • Informed decision-making

  45. Contributions to NCPP development goals • Evaluation standards: develop a suite of evaluation metrics for downscaled data; design a common suite of tests to evaluate downscaling methods • Guidance documents: produce guidance on the advantages and limitations of various downscaling techniques for specific user applications, based on the quantitative evaluations; inform development of standard, vetted downscaled climate prediction and projection products • Translation for users and development of metadata: educate users in the evaluation and application of downscaled climate prediction and projection products; develop searchable, structured metadata to describe downscaling methods as well as their evaluations; develop an initial platform for data discovery, exploration, and analysis that serves and makes use of the translational information • Cyberinfrastructure: develop information technology infrastructure to support community analysis and provision of climate information
