
Earth System Modeling Infrastructure



Presentation Transcript


  1. Earth System Modeling Infrastructure Cecelia DeLuca/ESMF-NCAR March 31-April 1, 2009 CHyMP Meeting

  2. Outline • Elements of interoperability platforms • Integrating across elements • Summary

  3. Elements of interoperability platforms • Tight coupling tools and interfaces - hierarchical and peer component relationships - frequent, high-volume transfers on high performance computers • Loose coupling tools and interfaces - generally peer-peer component relationships - lower volume and infrequent transfers on desktop and distributed systems • Science gateways - browse, search, and distribution of model components, models, and datasets - visualization and analysis services - workspaces and management tools for collaboration • Metadata conventions and ontologies - ideally, with automated production of metadata from models • Governance - coordinated and controlled evolution of systems

  4. Tight coupling tools and interfaces • Examples: • Earth System Modeling Framework (ESMF) - NASA, NOAA, Department of Defense, community weather and climate models, U.S. operational numerical weather prediction centers (HPC focus) • http://www.esmf.ucar.edu • Flexible Modeling System (FMS) – NOAA precursor to ESMF, still used at the Geophysical Fluid Dynamics Laboratory for climate modeling • http://www.gfdl.noaa.gov/fms/ • Space Weather Modeling Framework (SWMF) – NASA-funded, used at the University of Michigan for space weather prediction

  5. How coupling tools work: • Users wrap their native data in framework data structures • Users adopt standard calling interfaces for a set of methods that enable data exchange between components • Development toolkits help users with routine functions (regridding, time management, etc.)

  6. ESMF: Standard interfaces • Three ESMF component methods: Initialize, Run, and Finalize (I/R/F) • Each can have multiple phases • Users register their native I/R/F methods with an ESMF Component • Small set of arguments: call ESMF_GridCompRun(myComp, importState, exportState, clock, phase, blockingFlag, rc)
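A minimal Fortran sketch (not from the slides) of the registration pattern slide 6 describes: the user's SetServices routine tells ESMF which native routine implements Run, and that routine then receives the standard import State / export State / Clock argument list. The myModel_* names are hypothetical, and the module and flag spellings (use ESMF, ESMF_METHOD_RUN) follow recent ESMF releases; older releases spelled these ESMF_Mod and ESMF_SETRUN.

    module myModel_mod
      use ESMF            ! "use ESMF_Mod" in older ESMF releases
      implicit none
    contains

      ! Registered with ESMF via ESMF_GridCompSetServices; tells the framework
      ! which user routines implement Initialize, Run, and Finalize.
      ! Only the Run registration is shown here.
      subroutine myModel_SetServices(gcomp, rc)
        type(ESMF_GridComp)  :: gcomp
        integer, intent(out) :: rc
        call ESMF_GridCompSetEntryPoint(gcomp, ESMF_METHOD_RUN, &
             userRoutine=myModel_Run, rc=rc)
      end subroutine myModel_SetServices

      ! The standard I/R/F argument list: component, import/export States, Clock.
      subroutine myModel_Run(gcomp, importState, exportState, clock, rc)
        type(ESMF_GridComp)  :: gcomp
        type(ESMF_State)     :: importState, exportState
        type(ESMF_Clock)     :: clock
        integer, intent(out) :: rc
        rc = ESMF_SUCCESS
        ! ... advance the wrapped native model by one coupling interval ...
      end subroutine myModel_Run

    end module myModel_mod

A driver then invokes the component with the small argument set shown on the slide, e.g. call ESMF_GridCompRun(myComp, importState=impState, exportState=expState, clock=clock, rc=rc); phase and blocking flags are optional arguments.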

  7. ESMF: Distributed data representation 1. Representation in index space (Arrays) • Simple, flexible multi-dimensional array structure • Regridding via sparse matrix multiply with user-supplied interpolation weights (see the sketch below) • Scalable to 10K+ processors - no global information held locally 2. Representation in physical space (Fields) • Built on Arrays + some form of Grid • Grids are: logically rectangular, unstructured mesh, or observational data streams • Regridding via parallel on-line interpolation weight generation, bilinear or higher-order options • Intrinsically holds significant amounts of metadata - dynamic, usable for multiple purposes, limited annotation required (Figure: supported Array distributions)
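To make the "sparse matrix multiply" bullet concrete, here is a serial, self-contained sketch (plain Fortran, not ESMF code) of regridding with precomputed interpolation weights: each destination point is a weighted sum of a few source points. ESMF applies the same operation in parallel across distributed Arrays; the weight and index values below are made up for illustration.

    program sparse_regrid_sketch
      implicit none
      integer, parameter :: nWeights = 4, nDst = 2, nSrc = 3
      ! Interpolation weights in coordinate (index, index, value) form:
      ! dst(dstIdx(k)) accumulates weight(k) * src(srcIdx(k))
      integer  :: dstIdx(nWeights) = [1, 1, 2, 2]
      integer  :: srcIdx(nWeights) = [1, 2, 2, 3]
      real(8)  :: weight(nWeights) = [0.75d0, 0.25d0, 0.5d0, 0.5d0]
      real(8)  :: src(nSrc) = [10.0d0, 20.0d0, 30.0d0]
      real(8)  :: dst(nDst)
      integer  :: k

      ! The regridding step itself: a sparse matrix-vector multiply.
      dst = 0.0d0
      do k = 1, nWeights
         dst(dstIdx(k)) = dst(dstIdx(k)) + weight(k) * src(srcIdx(k))
      end do
      print *, "regridded values:", dst   ! expect 12.5 and 25.0
    end program sparse_regrid_sketch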

  8. ESMF: Coupling options • Generally a single executable, for simpler deployment • Push mode of data communication is very efficient • Coupling communications can be set up and called in a coupler, or called directly from within components (for I/O, data assimilation) • Hierarchical components for organization into sub-processes (sketched below) • Recursive components for nesting higher-resolution regions • Coupling across C/C++ and Fortran • Ensemble management (Figure: ESMF-based hierarchical structure of the GEOS-5 atmospheric GCM)
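A sketch of the hierarchical pattern mentioned above, with hypothetical component names: a parent "driver" component's Run method sequences two child Gridded Components through a Coupler Component. Creation of the children during Initialize and all error handling are omitted, so this is an outline of the call structure rather than a complete component.

    subroutine driver_Run(gcomp, importState, exportState, clock, rc)
      use ESMF
      implicit none
      type(ESMF_GridComp)  :: gcomp
      type(ESMF_State)     :: importState, exportState
      type(ESMF_Clock)     :: clock
      integer, intent(out) :: rc
      ! Child components and intermediate States; in real code these are
      ! created during Initialize (creation omitted in this sketch).
      type(ESMF_GridComp), save :: atmComp, ocnComp
      type(ESMF_CplComp),  save :: atm2ocn
      type(ESMF_State),    save :: atmExport, ocnImport

      ! Run the atmosphere, regrid/redistribute its exports via the coupler,
      ! then run the ocean with the coupled fields.
      call ESMF_GridCompRun(atmComp, exportState=atmExport, clock=clock, rc=rc)
      call ESMF_CplCompRun(atm2ocn, importState=atmExport, &
           exportState=ocnImport, clock=clock, rc=rc)
      call ESMF_GridCompRun(ocnComp, importState=ocnImport, clock=clock, rc=rc)
    end subroutine driver_Run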

  9. ESMF: Performance portability • ESMF is highly performance portable, with low (<5%) overhead • 3000+ regression tests run nightly on 30+ platform/compiler combinations; see http://www.esmf.ucar.edu/download/platforms • Newer ports include native Windows and Solaris • Using the TeraGrid Build and Test Service to simplify regression testing Performance at the petascale: (Figure: scaling of the ESMF sparse matrix multiply, used in regridding transformations, out to 16K processors; ASMM run-time comparison in msec; ESMF v3.1.0rp2; tested on ORNL XT4, where -N1 means 1 core per node; plot from Peggy Li, NASA/JPL)

  10. ESMF: Higher-order interpolation techniques in CCSM • ESMF higher-order interpolation weights were used to map from a 2-degree Community Atmosphere Model (CAM) grid to a POP ocean grid (384x320, irregularly spaced) • 33% global reduction in noise in a quantity critical for ocean circulation, compared to the previous bilinear interpolation approach • ESMF weights are now the CCSM default (Figure: interpolation noise in the derivative of the zonal wind stress vs. grid index in the latitudinal direction; black = bilinear, red = higher-order ESMF v3.1.1, green = higher-order ESMF v4.0.0)

  11. ESMF: Model map (Diagram: map of ESMF components and models, color-coded by sponsor - NOAA, Department of Defense, university, NASA, Department of Energy, National Science Foundation - with ESMF coupling marked as complete or in progress; thin lines denote components, thick lines denote models. Models and components shown include NEMS with GFS and NMM-B, GEOS-5 and its subcomponents, CCSM4 with POP, CLM, and CICE, FIM, SWMF, HAF, GAIM, the Land Information System, GSI, WRF, ROMS, MOM4, the UCLA AGCM, SWAN, COAMPS, MITgcm, pWASH123, ADCIRC, NCOM, and HYCOM.) Legend: ovals show ESMF components and models that are at the working prototype level or beyond.

  12. Loose coupling tools and interfaces • Examples: • OpenMI • http://www.openmi.org • Web service approaches • Coupling options: • Generally multiple executables • Pull mode of data communication is simple but not efficient (ask for a data point based on coordinates; illustrated below) • Generally peer-to-peer component relationships • Coupling across multiple computer languages (Python, Java, C++, etc.)
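An illustration of the pull-mode point above, using a hypothetical interface rather than any real OpenMI or web-service API: the consumer asks for one value at a time by coordinate and time, which is simple to program but generates many small requests compared with the bulk push transfers used in tight coupling.

    program pull_mode_sketch
      implicit none
      real :: depth
      ! One value per request; a loosely coupled provider answers each
      ! coordinate query individually.
      depth = get_value(lon=254.5, lat=40.0, time=0.0)
      print *, "value at requested point:", depth
    contains
      ! Stand-in provider; in a loosely coupled system this would be a remote
      ! component or web service rather than a local function.
      real function get_value(lon, lat, time)
        real, intent(in) :: lon, lat, time
        get_value = 1.5 + 0.01*(lon - 254.5) + 0.02*(lat - 40.0) + 0.0*time
      end function get_value
    end program pull_mode_sketch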

  13. Science gateways – access centers • Examples: • Earth System Grid (ESG) – DOE, NCAR, and NOAA support, used to distribute Intergovernmental Panel on Climate Change data and for climate research • http://www.earthsystemgrid.org • Hydrologic Information System (HIS) – NSF funded, used to enhance access to data for hydrologic analysis • http://his.cuahsi.org • Object Modeling System (OMS) – USDA effort, used for agricultural modeling and analysis • http://javaforge.com/project/1781

  14. Metadata conventions and ontologies • Examples: • Climate and Forecast (CF) conventions – spatial and temporal properties of fields used in weather and climate • http://cf-pcmdi.llnl.gov • METAFOR Common Information Model (CIM) – large EU-funded project, climate model component structure and properties (including technical and scientific properties) • http://metaforclimate.eu • WaterML – schema for hydrologic data developed by the Consortium of Universities for the Advancement of Hydrologic Science (CUAHSI) • http://his.cuahsi.org/wofws.html

  15. Governance • Pervasive issue in community modeling • Divergent effects of: • Multiple institutions • Geographic dispersion • Multiple domains of interest (working groups) • Must be balanced by a strong integration body; strategies: • Meets frequently enough to affect routine development (quarterly) • Meets virtually to get sufficient representation • Includes user and other stakeholder representatives • Authorized to prioritize and set the development schedule • Supported by web-based management tools

  16. Integrating across interoperability elements • Examples from the Curator project (NSF and NASA): • Automated output of CF and CIM XML schema from ESMF (tight coupling + ontology; sketched below) • Ingest of ESMF-generated schema into ESG, and propagation into tools for search, browse, inter-comparison, and distribution of model components and models (tight coupling + ontology + science gateway) • Implementation of dataset “trackback” in ESG that connects datasets with detailed information about the models used to create the data (tight coupling + ontology + science gateway) • Implementation of personal and group workspaces in ESG (science gateway + governance)
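A minimal sketch of the mechanism behind the first item, assuming the ESMF Attribute API: metadata is attached to components and fields as name/value Attributes, which ESMF can then export as CF- and CIM-style XML (via its Attribute write capability) for ingest into a gateway such as ESG. The subroutine name, field, and attribute values below are illustrative, not taken from the slides.

    ! Illustrative only: attach CF-style metadata to an ESMF Field so it can
    ! later be harvested automatically rather than hand-written.
    subroutine annotate_sst_field(sstField, rc)
      use ESMF
      implicit none
      type(ESMF_Field)     :: sstField
      integer, intent(out) :: rc

      call ESMF_AttributeSet(sstField, name="standard_name", &
           value="sea_surface_temperature", rc=rc)
      call ESMF_AttributeSet(sstField, name="units", value="K", rc=rc)
      call ESMF_AttributeSet(sstField, name="long_name", &
           value="sea surface temperature exported to the atmosphere", rc=rc)
    end subroutine annotate_sst_field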

  17. Integrating across interoperability elements (cont.) • Translation of ESMF interfaces into web services to enable invocation of ESMF applications from a science gateway, and to enable data and metadata from the run to be stored back to the gateway (tight coupling + loose coupling + science gateway + ontology; new TeraGrid funding) (Diagram: a web service interface in front of loosely coupled components, layered over an ESMF interface to tightly coupled HPC components; note the issue of switching from push to pull data interactions)

  18. Screenshot: Component trackback

  19. Screenshot: Faceted search

  20. Summary • Cross-domain interoperability platforms have multiple elements • Many of these elements already exist • Integration activities (such as Earth System Curator) are the next focus (Image courtesy of Rocky Dunlap, Georgia Institute of Technology)
