
The Earth System Modeling Framework


  1. The Earth System Modeling Framework Robert Oehmke, Gerhard Theurich, Cecelia DeLuca NOAA Cooperative Institute for Research in Environmental Sciences University of Colorado, Boulder robert.oehmke@noaa.gov Gung Ho Meeting Bath, England November 2, 2014

  2. Outline • ESMF Overview • ESMF Infrastructure • ESMF Superstructure • New Directions: NUOPC/ESPS and Cupid • Current Status and Future Work

  3. Motivation In climate research and numerical weather prediction… an increased emphasis on the detailed representation of individual physical processes requires many teams of specialists to contribute components to an overall “coupled” modeling system (e.g. atmosphere, ocean, sea ice). In computing technology… hardware and software complexity in high-performance computing keeps increasing as we shift toward scalable and now accelerator-based computing architectures. In software… frameworks are emerging to promote code reuse and interoperability. ESMF is a focused community effort to tame the complexity of models and the computing environment. It leverages, unifies, and extends existing software frameworks, creating new opportunities for scientific contribution and collaboration.

  4. What is ESMF? ESMF provides: • Tools for building models • Tools for coupling models ESMF is: • Portable: • Unix/Linux and Windows (Cygwin/MinGW) systems • Tested on >40 different OS/Compiler/MPI combinations every night • Well tested: • More than 6000 unit/system tests • Parallel: • Based on MPI (with “mpiuni” bypass mode available) • OpenMP and Pthreads support • Flexible: • Interfaces to multiple languages • A wide range of functionality and supported options

  5. Interfaces • Complete F95 API: • use ESMF • Derived types and methods • Investigating moving to Fortran 2003 • Limited C API: • #include “ESMC.h” • Structs and methods • Limited Python API: • import ESMPy • Classes with methods • Applications: • File-based interpolation weight generation: mpirun -np <N> ESMF_RegridWeightGen -s …. • File-based weight generation AND application of weights: mpirun -np <N> ESMF_Regrid (coming next release)

  6. The ESMF Sandwich • Superstructure: Component data structures and methods for coupling model components • Infrastructure: Field data structures and methods for building model components, and utilities for coupling YOU DON’T NEED TO USE THEM BOTH!

  7. Outline • ESMF Overview • ESMF Infrastructure • ESMF Superstructure • New Directions: NUOPC/ESPS and Cupid • Current Status and Future Work

  8. ESMF Infrastructure • Distributed data classes: Used to hold data spread across a set of processors • Represents data so ESMF can perform operations on it • Provides a standard representation to be passed between components • Can reference user memory (usually) • Consists of two kinds of structures: • Index Space classes (Arrays) • Physical Space classes (Fields) • Utilities: • Time Manager: Classes to represent time, time intervals, alarms… • Used in ESMF for passing time info between models, time loops, etc. • Also useful for doing calculations with time, conversions, etc. • Attributes: Allow metadata to be attached to ESMF classes • Instrument models to be more self describing • Can be written to various file formats, e.g. CIM compliant XML • Others: Logging (LogError), Virtual Machine (VM), Config
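The Time Manager idea above can be pictured with a plain-Python sketch using the standard datetime module. This is a conceptual analogy only, not the ESMF Time Manager API: a model clock advances by a fixed timestep, and an "alarm" rings at every coupling-interval boundary.

```python
from datetime import datetime, timedelta

# Illustrative clock/alarm loop (plain datetime, not ESMF_Clock/ESMF_Alarm).
start = datetime(2014, 11, 2, 0, 0)
stop = datetime(2014, 11, 2, 6, 0)
timestep = timedelta(minutes=30)
coupling_interval = timedelta(hours=3)

current = start
alarms = []
while current < stop:
    current += timestep
    # The "alarm" rings whenever a whole coupling interval has elapsed.
    if (current - start) % coupling_interval == timedelta(0):
        alarms.append(current)

print([a.strftime("%H:%M") for a in alarms])  # → ['03:00', '06:00']
```

The real Time Manager adds calendars (Gregorian, 360-day, no-leap, etc.) and exact fractional-second arithmetic on top of this basic pattern.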

  9. Index Space Distributed Data • DistGrid: Represents index space and distribution across processors • Supports multiple index space representations • From arbitrary 1-D sequences • To N-D tiles • To N-D tiles connected together along their edges • Supports multiple distribution options • Array: A distributed index space data container • Array = data + DistGrid • Supports different data types: integer 4, real 4, real 8, … • Other options: halos, undistributed dimensions, … • ArrayBundle: A set of Arrays which can be operated on at one time.
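The index-space decomposition that a DistGrid describes can be sketched in a few lines of plain Python. The `decompose` helper below is hypothetical (not an ESMF call); it just shows the basic idea of splitting a 1-D global index range as evenly as possible across processors.

```python
# Illustrative 1-D block decomposition (not the ESMF DistGrid API).
def decompose(global_size, nprocs):
    """Return one (start, stop) pair per processor; 0-based, stop exclusive."""
    base, extra = divmod(global_size, nprocs)
    bounds = []
    start = 0
    for rank in range(nprocs):
        count = base + (1 if rank < extra else 0)  # spread the remainder
        bounds.append((start, start + count))
        start += count
    return bounds

print(decompose(10, 4))  # → [(0, 3), (3, 6), (6, 8), (8, 10)]
```

A real DistGrid generalizes this to N-D tiles, edge connections, and arbitrary (non-contiguous) index sequences.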

  10. Physical Space Grid Representation Classes • Grid: • Structured representation of a region • Consists of logically rectangular tiles • Mesh: • Unstructured representation of a region • In 2D: polygons with any number of sides • N-gon support added for MetOffice users • In 3D: tetrahedrons & hexahedrons • XGrid (Exchange Grid): • Represents boundary layer between two regions • Represented by a custom constructed Mesh • LocStream (Location Stream): • Set of disconnected points • E.g. locations of observations

  11. Physical Space Distributed Data • Field: A distributed physical space data container • Field = data + grid representation class (e.g. Grid, Mesh, …) • Based on Array/DistGrid, so supports those index/distribution options • Can get corresponding coordinates from the grid representation class • Supports different data types: integer 4, real 4, real 8, … • Other options: halos, undistributed dimensions, … • FieldBundle: A set of Fields which can be operated on at one time.

  12. Distributed Data Class Operations • Sparse Matrix Multiply: • Apply coefficients (weights) in parallel to distributed data • Highly tuned for efficiency/auto-tunes for optimal execution • Underlies most of ESMF distributed data operations • Redistribution: • Move data between distributions without changing values • Useful in cases where grid doesn’t change, but distribution does • Halo: • Fill “Halo” cells which hold data from another processor • Useful during computations on distributed data • Regridding: • Move data from one grid to a different one • Useful when moving data between models with different grids • Only available on physical space data classes
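The sparse matrix multiply that underlies these operations can be sketched in plain Python. This is conceptual only, not ESMF's tuned parallel implementation: regrid and redistribution weights are (destination index, source index, weight) triples applied to source data.

```python
# Illustrative sparse-matrix-multiply pattern (not ESMF's implementation).
def apply_weights(weights, src, ndst):
    """Apply (dst_index, src_index, weight) triples to a source vector."""
    dst = [0.0] * ndst
    for d, s, w in weights:
        dst[d] += w * src[s]
    return dst

# Destination point 0 averages source points 0 and 1; point 1 copies source 2.
weights = [(0, 0, 0.5), (0, 1, 0.5), (1, 2, 1.0)]
print(apply_weights(weights, [10.0, 20.0, 30.0], 2))  # → [15.0, 30.0]
```

Redistribution is the special case where every weight is 1.0 and only the data layout changes; regridding computes the weights from the source and destination grid geometry.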

  13. Basic Computation

   ! Create grid representation class
   grid = ESMF_GridCreate(....)
   ! Create Field on grid representation class
   Field = ESMF_FieldCreate(grid, ....)
   ! Create halo communication structure
   call ESMF_FieldHaloStore(Field, ..., routehandle)
   ! Loop over time
   do t = 1, ....
      ! Get Field data
      call ESMF_FieldGet(Field, farrayPtr=ptr_to_data, ...)
      ! Loop over memory doing computations on data
      do i = 1, ...
         do j = 1, ...
            ptr_to_data(i,j) = ....
         enddo
      enddo
      ! Update halo
      call ESMF_FieldHalo(Field, ...)
   enddo
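The halo update in the loop above can be pictured with a plain-Python sketch. This is illustrative only, not ESMF's MPI-based halo exchange: each "rank" owns a slice of a periodic 1-D field plus halo cells that mirror its neighbors' edge values.

```python
# Illustrative halo exchange on a periodic 1-D decomposition (not ESMF).
def halo_update(chunks):
    n = len(chunks)
    for rank, chunk in enumerate(chunks):
        left = chunks[(rank - 1) % n]    # periodic left neighbor
        right = chunks[(rank + 1) % n]   # periodic right neighbor
        chunk["halo_left"] = left["data"][-1]   # neighbor's last owned value
        chunk["halo_right"] = right["data"][0]  # neighbor's first owned value

chunks = [{"data": [1, 2]}, {"data": [3, 4]}, {"data": [5, 6]}]
halo_update(chunks)
print(chunks[1])  # → {'data': [3, 4], 'halo_left': 2, 'halo_right': 5}
```

In ESMF the communication pattern is computed once (the Store call) and reused every timestep, which is why the Store/apply split appears in the code above.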

  14. Sparse Matrix Performance Example • Performance of sparse matrix multiply in CESM (1_1_beta09) CPL7 • Run on Cray XK6 (jaguarpf)

  15. Regrid: Features • Interfaces: F90, C, Python, ESMF_RegridWeightGen (file-based separate application) • Multiple interpolation types: • Bilinear • Higher order patch recovery • Yields better derivatives/smoother results than bilinear • Based on “patch recovery” used in finite element modeling [1][2] • Nearest neighbor • First order conservative • Path between points in bilinear: options for straight line or great circle • Added for MetOffice customers because of “accuracy” questions • Normalization options for conservative: destination area or fraction • Added for MetOffice customers • Pole options for global spherical logically rectangular grids: • Full circle average, N-point average, teeth, no pole • Other: masking, user area, ignore unmapped,…
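The two conservative normalization options listed above can be illustrated with a small plain-Python sketch. The `conservative` helper below is hypothetical, not the ESMF API: destination-area normalization divides by the full destination cell area, while fraction normalization divides by only the covered area, which is what lets partially covered destination cells receive sensible values.

```python
# Illustrative conservative-regrid normalization (not the ESMF implementation).
def conservative(overlaps, dst_area, norm="dstarea"):
    """overlaps: (src_value, overlap_area) pairs covering one destination cell."""
    weighted = sum(value * area for value, area in overlaps)
    covered = sum(area for _, area in overlaps)
    if norm == "dstarea":
        return weighted / dst_area   # partially covered cells are scaled down
    return weighted / covered        # "fraction": value of the covered part only

# A destination cell of area 2.0, half covered by a source cell of value 10.0:
print(conservative([(10.0, 1.0)], 2.0, "dstarea"))   # → 5.0
print(conservative([(10.0, 1.0)], 2.0, "fracarea"))  # → 10.0
```

Both choices conserve the total integrated quantity over the covered region; they differ only in how the uncovered part of a destination cell is treated.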

  16. Regrid: Spherical Grids • Support grids with spherical (lon, lat, rad) coordinates • Mix and match pairs of: • Global 2D logically rectangular Grids • Regional 2D logically rectangular Grids • 2D unstructured Meshes composed of polygons with any number of sides: • ESMF internally represents these as triangles and quadrilaterals • Supported elements: triangles, quadrilaterals, pentagons, hexagons, … • Multi-patch grids (e.g. cubed spheres) currently supported via Meshes [Figures: HOMME cubed sphere grid with pentagons (courtesy Mark Taylor, Sandia); regional grid; FIM unstructured grid]

  17. Regrid: Cartesian Grids In addition, regridding supports Cartesian (x, y, z) coordinates: • Regridding between any pair of: • 2D Meshes composed of polygons with any number of sides • 2D logically rectangular Grids composed of a single patch • Bilinear or conservative regridding between any pair of: • 3D Meshes composed of hexahedrons • 3D logically rectangular Grids composed of a single patch [Figures: 2D unstructured mesh (from www.ngdc.noaa.gov); 3D grid; 3D unstructured mesh]

  18. Regrid Weight Calculation Performance Platform: IBM iDataPlex cluster (Yellowstone at NCAR)

  19. Outline • ESMF Overview • ESMF Infrastructure • ESMF Superstructure • New Directions: NUOPC/ESPS and Cupid • Current Status and Future Work

  20. ESMF Superstructure • State: structure for transferring data between models in a standard way. Can contain Arrays, Fields, Bundles, other States, etc. • Gridded Component: wraps a model and allows it to be called in a standard way. • Coupler Component: wraps user code for translating data between models and allows it to be called in a standard way. [Diagram: two Gridded Components exchanging States through a Coupler Component]
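The component pattern above can be sketched in plain Python. This is a toy stand-in, not the ESMF component API: a gridded component exposes standard initialize/run/finalize entry points and exchanges data only through import and export states (plain dicts here; the field names are made up for illustration).

```python
# Illustrative gridded-component pattern (not the ESMF superstructure API).
class GriddedComponent:
    def __init__(self, name):
        self.name = name

    def initialize(self, import_state, export_state):
        export_state[self.name + "_ready"] = True

    def run(self, import_state, export_state):
        # Toy "physics": derive an export field from an imported one.
        export_state["sst"] = import_state.get("heat_flux", 0.0) * 0.5

    def finalize(self, import_state, export_state):
        pass

ocean = GriddedComponent("ocean")
imp, exp = {"heat_flux": 4.0}, {}
ocean.initialize(imp, exp)
ocean.run(imp, exp)
print(exp)  # → {'ocean_ready': True, 'sst': 2.0}
```

Because every component is driven through the same three entry points and touches only States, a driver can swap one ocean model for another without changing the calling code; that is the interoperability the superstructure provides.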

  21. Component Hierarchy • Using the superstructure, components can be arranged hierarchically, helping to organize complex models • Different groups may create different kinds or levels of components [Figure: ESMF components in the GEOS-5 atmospheric GCM]

  22. Component Overhead • Overhead of ESMF component wrapper around native CCSM4 component. • No significant performance overhead (<3% is typical) • Platform: IBM Power 575, bluefire, at NCAR

  23. Outline • ESMF Overview • ESMF Infrastructure • ESMF Superstructure • New Directions: NUOPC/ESPS and Cupid • Current Status and Future Work

  24. New Directions The initial ESMF software fell short of the vision for common infrastructure in several ways: • Implementations of ESMF could vary widely and did not guarantee a minimum level of technical interoperability among sites → creation of the NUOPC Layer • It was difficult to track who was using ESMF and how they were using it → initiation of the Earth System Prediction Suite • There was a significant learning curve for implementing ESMF in a modeling code → the Cupid Integrated Development Environment New development directions address these gaps…

  25. NUOPC Layer 1. Implementations of ESMF could vary widely and did not guarantee a minimum level of technical interoperability among sites National Unified Operational Prediction Capability: Consortium of U.S. operational weather and water prediction centers. • Participants: NOAA, Navy, Air Force, NASA, and other associated modeling groups. • Overall goals: • Improve collaboration among agencies. • Accelerate the transition of new technology into the operational centers. • Technical goal: Increase interoperability of ESMF-based applications • NUOPC websites: • http://www.weather.gov/nuopc/ • http://earthsystemcog.org/projects/nuopc/

  26. The NUOPC Layer An interoperability layer on top of ESMF that adds: • Definitions for the component interactions during Initialize, Run, Finalize. • Generic components that provide a standard implementation of interoperable components. • A field dictionary, based on Climate & Forecast (CF) conventions, as the basis for a standard identification of fields between components. • Mechanisms to report component incompatibilities detected during run-time. • A compliance checker option that serves as a development and debugging tool. • A collection of example applications. • See: https://www.earthsystemcog.org/projects/nuopc/proto_codes

  27. NUOPC Layer Generic Components • Model: • Implements a specific physical domain, e.g. atmosphere, ocean, wave, ice. • Connector: • Connects pairs of components in one direction, e.g. Model to/from Model, or Model to/from Mediator. • Executes simple transforms (Regrid/Redist, units). • Mediator: • Scientific coupling code (flux calculations, accumulation, averaging, etc.) between (potentially multiple) Models. • Driver: • Provides a harness for Models, Mediators, and Connectors (supporting hierarchies). • Coordinates initialize and run sequences.
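The four generic-component roles above can be pictured with a plain-Python sketch. This is conceptual only, not the NUOPC Layer API, and the field names are made up: a Driver executes a run sequence in which Connectors move fields between Models and each Model advances its own domain.

```python
# Illustrative Driver/Model/Connector roles (not the NUOPC Layer API).
def make_model(name, step):
    # A Model advances one physical domain; `step` is its (toy) run method.
    return {"kind": "model", "name": name, "run": step}

def make_connector(src, dst):
    # A Connector moves (and could transform) fields one way between components.
    def run(states):
        states[dst].update(states[src])
    return {"kind": "connector", "name": src + "->" + dst, "run": run}

states = {"ATM": {"wind": 8.0}, "OCN": {}}
run_sequence = [
    make_connector("ATM", "OCN"),  # ATM exports flow to the OCN import state
    make_model("OCN", lambda s: s["OCN"].update(sst=s["OCN"]["wind"] * 0.1)),
]
for comp in run_sequence:  # the Driver executes the sequence in order
    comp["run"](states)
print(states["OCN"])  # → {'wind': 8.0, 'sst': 0.8}
```

A Mediator would slot into this sequence as another component holding the scientific coupling code (flux calculation, averaging) between several Models, with Connectors on either side.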

  28. NUOPC Layer Examples

  29. The Earth System Prediction Suite • 2. It was difficult to track who was using ESMF and how they were using it • The Earth System Prediction Suite (ESPS) is a collection of weather and climate modeling codes that use ESMF with the NUOPC conventions. • The ESPS makes clear which codes are available as ESMF components and modeling systems. ESPS website: https://www.earthsystemcog.org/projects/esps/ • Inclusion criteria for the ESPS: • ESPS components and coupled modeling systems are NUOPC-compliant. • A minimal, prescribed set of model documentation that conforms to the Common Information Model standard is provided for each version of the ESPS component or modeling system. • ESPS codes must have clear terms of use (e.g. public domain statement, open source license, proprietary status), and must have a way for credentialed ESPC collaborators to request access. • Regression tests are provided for each component and modeling system. • There is a commitment to continued NUOPC compliance and ESPS participation for new versions of the code.

  30. Model Codes in the ESPS Currently, components in the ESPS can be of the following types: coupled system, atmosphere, ocean, wave, sea ice. Target codes include: • The Community Earth System Model (CESM) and its constituent components • The NOAA Environmental Modeling System (NEMS), including the new Climate Forecast System • The MOM5 and HYCOM oceans • SWAN and WaveWatch 3 wave models • The Navy Global Environmental Model (NavGEM)-HYCOM-CICE coupled system • The Navy Coupled Ocean Atmosphere Mesoscale Prediction System (COAMPS) and COAMPS Tropical Cyclone (COAMPS-TC) • NASA GEOS-5 • NASA ModelE

  31. Cupid Development and Training Environment • 3. There was a significant learning curve for implementing ESMF in a modeling code CUPID GOAL: Make ESMF training and development simpler and more appealing • NOAA CIRES, Georgia Institute of Technology, and NASA GISS/GSFC collaboration • Eclipse-based “Integrated Development Environment” or IDE • Customized for ESMF applications with NUOPC conventions Cupid is a working prototype expected to be ready for first public release in 2014. Cupid project: https://earthsystemcog.org/projects/cupid/ Cupid tutorial: https://github.com/cupid-ide/cupid/blob/master/org.earthsystemcurator.cupid.nuopc.fsml/doc/latex/cupid.pdf?raw=true

  32. Cupid Development and Training Environment • Select a sample code or model • Pick a training problem (or coupled model) • Generate a framework-aware outline of the source code • Navigate around the source code using the outline • Use an editor to modify the source code • Automatically generate code needed for NUOPC compliance • Compile and run locally or on a cloud (currently Amazon Web Services) [Screenshot: source code editor, NUOPC outline, project explorer, console for viewing output]

  33. Outline • ESMF Overview • ESMF Infrastructure • ESMF Superstructure • New Directions: NUOPC/ESPS and Cupid • Current Status and Future Work

  34. Current Status Released ESMF 6.3.0r in January. Highlights: • Added support for n-gons in Mesh • Great circle paths for bilinear Released ESMF 6.3.0rp1 in July. Highlights: • Python interface (ESMPy) brought into ESMF source • Fraction normalization for conservative regridding • Allows regridding of partial destination cells without user normalization

  35. Scheduled for Upcoming Releases • Support for 4-sided concave cells in regridding – now all cases work correctly (7.0.0) • Implemented and available as a snapshot • For ESMPy, removed requirement that cell centers of Fields be defined, even for operations where they were not needed (7.0.0) • Implemented and available as a snapshot • Higher order conservative regridding (7.0.0) • Breaking up grid files to increase maximum grid size possible for interpolation weight generation (7.0.0) • MOAB finite element library integration: (7.0.0 & 7.1.0) • Introducing MOAB finite element library in addition to ESMF native finite element library • Will be testing to see if we replace native library with MOAB • Would bring in support for higher order elements • Extrapolation of points that lie outside the source grid (7.0.0)

  36. References Patch interpolation: • [1] Khoei, S.A., Gharehbaghi, A.R. The superconvergent patch recovery technique and data transfer operators in 3D plasticity problems. Finite Elements in Analysis and Design, 43(8), 2007. • [2] Hung, K.C., Gu, H., Zong, Z. A modified superconvergent patch recovery method and its application to large deformation problems. Finite Elements in Analysis and Design, 40(5-6), 2004. If you have questions or requests, come talk to me, or email: esmf_support@list.woc.noaa.gov
