
Intercomparisons Working Group activities

Intercomparisons Working Group activities. Prepared by F. Hernandez







  1. Intercomparisons Working Group activities Prepared by F. Hernandez, K. Lisaeter, L. Bertino, F. Davidson, M. Kamachi, G. Brassington, P. Oke, A. Schiller, C. Maes, J. Cummings, E. Chassignet, H. Hurlburt, P. Hacker, J. Siddorn, M. Martin, S. Dobricic, C. Regnier, L. Crosnier, N. Verbrugge, M. Drévillon, J-M Lellouche. Status of the intercomparison exercise. Some examples of diagnostics based on Class 1/2 metrics.

  2. Status of the intercomparison exercise • Methodology decided: • Compare operational/dedicated hindcasts over the February–March–April period • Consistency and quality assessment (not performance) • Intercomparison based on Class 1 and Class 2 metrics and on reference data • Files shared via OPeNDAP/FTP; assessment performed by different teams on dedicated ocean basins • Preliminary work performed: • Intercomparison plan endorsed • Technical implementation documents (metrics definitions) written and distributed

  3. Status of the intercomparison exercise: definition of metrics • New description of the Class 1, Class 2 and Class 3 metrics: • Regional areas revisited to fit recommendations • Complete description of moorings, sections, etc. • NetCDF file definitions upgraded for consistency with the COARDS/CF-1.2 conventions • Sea-ice variables included in the definitions • Half the storage capacity saved by using “compressed” NetCDF files (data written as “short” integers instead of “floats”, via “scale_factor” attributes) • Proposed set of reference data (availability, access)
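The “compressed” NetCDF trick above is just the CF scale_factor/add_offset packing convention, and its arithmetic can be sketched without the NetCDF library itself. A minimal illustration (the function names are mine, not taken from the exercise's Fortran programs):

```python
import numpy as np

def pack_to_short(data):
    """Pack float values into int16, as in CF "compressed" NetCDF
    variables: unpacked = packed * scale_factor + add_offset."""
    dmin, dmax = float(np.min(data)), float(np.max(data))
    # spread the data range over the usable int16 range
    scale_factor = (dmax - dmin) / (2**16 - 2) or 1.0
    add_offset = (dmax + dmin) / 2.0
    packed = np.round((data - add_offset) / scale_factor).astype(np.int16)
    return packed, scale_factor, add_offset

def unpack(packed, scale_factor, add_offset):
    """Invert the packing (NetCDF readers typically do this automatically)."""
    return packed.astype(np.float64) * scale_factor + add_offset
```

Storage halves because an int16 takes 2 bytes against float32's 4, at the cost of a quantisation error of at most scale_factor/2. This is also why a field spanning many orders of magnitude cannot be packed linearly without losing the small values, and is better stored as log10 before packing.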

  4. Status of the intercomparison exercise: definition of metrics • Class 1 definition (provided with Fortran programs):

  5. Status of the intercomparison exercise: definition of metrics • Class 1 definition (provided with Fortran programs): • 2D fields: • The zonal and meridional wind stress (Pa) at the ocean surface • The total net heat flux (including the relaxation term) (W/m2) into the sea water • The surface solar heat flux (W/m2) into the sea water • The freshwater flux (including the relaxation term) (kg/m2/s) into the ocean • The Mixed Layer Depth (henceforth MLD) (m). Two kinds of MLD diagnostics are provided, compliant with [de Boyer Montégut et al., 2004] and [D'Ortenzio et al., 2005]: a temperature criterion MLD(θ), defined by a temperature difference of ΔT = 0.2°C from the ocean surface, and a surface potential density criterion MLD(ρ), defined by a 0.03 kg/m3 difference in surface potential density • The Sea Surface Height (SSH) (m) • 3D fields: • The potential temperature (K) and salinity (psu) • The zonal and meridional velocity fields (m/s) • The vertical eddy diffusivity (kz, in m2/s): if compressed, stored as LOG10 values!
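Both MLD criteria above reduce to the same profile scan: find the depth where the variable first departs from its near-surface value by more than the threshold. A minimal sketch, assuming a monotonically sampled profile and a 10 m reference depth (the reference depth and the function name are assumptions, not taken from the slides):

```python
import numpy as np

def mixed_layer_depth(depth, prof, delta, ref_depth=10.0):
    """Depth where |prof - prof(ref_depth)| first exceeds delta,
    located by linear interpolation between the bracketing levels.
    Use delta=0.2 with temperature (degC) for MLD(theta), or
    delta=0.03 with potential density (kg/m3) for MLD(rho)."""
    ref = np.interp(ref_depth, depth, prof)
    diff = np.abs(prof - ref)
    exceeds = np.nonzero(diff > delta)[0]
    if exceeds.size == 0:
        return float(depth[-1])          # criterion never met
    i = int(exceeds[0])
    if i == 0:
        return float(depth[0])           # already stratified at the surface
    d0, d1 = depth[i - 1], depth[i]
    f0, f1 = diff[i - 1], diff[i]
    return float(d0 + (delta - f0) * (d1 - d0) / (f1 - f0))
```

The interpolation between the two bracketing levels avoids quantising the MLD to the model's vertical grid, which matters when levels are tens of metres apart.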

  6. Status of the intercomparison exercise: definition of metrics • Class 1 definition (provided with Fortran programs): • 2D fields (for ARC, ACC, NAT, NPA and GLO): • Sea-ice thickness (m) • Sea-ice concentration (fraction) • Sea-ice x and y velocities (m/s) • Surface snow thickness over sea ice (m) • Sea-ice downward x and y stress (Pa) • Tendency of sea-ice thickness due to thermodynamics (m/s) • Surface downward heat flux in air (W/m2) • Ancillary data: • The Mean Dynamic Topography (henceforth MDT) (m) used as the reference sea level during the assimilation procedure. The MDT is also called the Mean Sea Surface Height (MSSH) • Climatologies of Sea Surface Temperature (SST) (K), of surface currents (m/s) and of MLD (m) • Climatology of the potential temperature (K) and salinity (psu) fields used as a (T,S) reference

  7. Status of the intercomparison exercise: definition of metrics • Class 2 moorings/sections: • Potential temperature (K) and salinity (psu) • Zonal and meridional velocity fields (m/s) • Sea Surface Height (SSH) (m)

  8. Status of the intercomparison exercise: definition of metrics [Figure: straight sections (yellow); XBT sections (brown); glider sections (purple); tide gauges (blue); other moorings (red). 78 vertical levels (WOA and GDEM 3.0 standard levels).]

  9. Status of the intercomparison exercise: definition of metrics • Class 3 definition (transport): In black, sections with no specific class decomposition on the vertical. Transport computed by classes of temperature (red), salinity (yellow), density (blue) and depth (green).
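The "transport per class" diagnostic above amounts to binning per-cell volume fluxes by a water-mass property. A hypothetical sketch (the interface and names are mine, not from the Class 3 definition document):

```python
import numpy as np

def transport_by_class(v, dz, dx, tracer, edges):
    """Volume transport (in Sv) across a vertical section, binned
    into classes of a tracer (temperature, salinity, density or depth).
    v: velocity normal to the section (m/s); dz, dx: cell thickness
    and width (m); tracer: same shape as v; edges: class boundaries."""
    flux = v * dz * dx                   # m^3/s through each cell
    cls = np.digitize(tracer, edges)     # class index of each cell
    out = np.zeros(len(edges) + 1)
    np.add.at(out, cls.ravel(), flux.ravel())
    return out / 1e6                     # m^3/s -> Sverdrup
```

Summing the returned vector recovers the total section transport, so the class decomposition is a strict refinement of the plain (black) section diagnostic.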

  10. Status of the intercomparison exercise: assessment through Class 1-2-3 metrics • Consistency: monthly averaged fields compared to: • WOA 2005, Hydrobase, CARS, MEDATLAS and Janssen climatologies • The de Boyer Montégut MLD climatology • An SST climatology • Quality: daily fields compared to: • In situ data (Coriolis data server) • Dynamic topography, or SLA (AVISO products) • SST (depending on the group) • SSM/I sea-ice concentration and drift products • Surface currents (DBCP data, OSCAR and SURCOUF products)
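In the simplest case, a consistency check of the kind listed above reduces to an area-weighted RMS difference between a model monthly mean and the climatology. A minimal sketch, assuming a regular lat/lon grid (the function name is mine, not from the metrics documents):

```python
import numpy as np

def area_weighted_rms(model, clim, lat):
    """RMS of (model - clim) on a regular lat/lon grid, weighted by
    cos(latitude) so each cell counts by its approximate area.
    model, clim: 2D fields (nlat, nlon); lat: latitudes in degrees."""
    w = np.cos(np.deg2rad(lat))[:, None] * np.ones_like(model)
    d = model - clim
    ok = np.isfinite(d)                  # skip land / missing values
    return float(np.sqrt(np.sum(w[ok] * d[ok] ** 2) / np.sum(w[ok])))
```

The cos(latitude) weighting keeps polar grid rows from dominating the score, which matters when the same metric is compared across the basins assigned to different teams.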

  11. Status of the intercomparison exercise: Where are we? • Partners involved / status:

  12. Status of the intercomparison exercise: Where are we? • Agenda: • One-month shift (so far) in the availability of products • No clear view of the intercomparison workload in the different areas (i.e. how many groups plan dedicated work looking at more than their own hindcast?) • Target: define a deadline so as to be prepared for the Symposium: Validation and intercomparison of analysis and forecast products, F. Hernandez (Mercator-Ocean), G. Brassington (BoM), J. Cummings (NRL), L. Crosnier (Mercator-Ocean), F. Davidson (DFO), S. Dobricic (ICMCC), P. Hacker (Univ. of Hawaii), M. Kamachi (JMA), K. A. Lisæter (NERSC), M. Martin (UK Met Office) • Availability of products (end of July?) • Availability of intercomparison results (mid-October?) • Managing the outcomes: • How do we benefit from the feedback? • An initiative to keep this activity going?

  13. Assessment diagnostics [Figure: SST; SST − WOA05; NOAA RTG SST; SST − RTG]

  14. Assessment diagnostics

  15. Assessment diagnostics [Figure: Salinity; Salinity − WOA05; surface currents compared to drifters]

  16.–21. Assessment diagnostics (figure-only slides)

  22. Status of the intercomparison exercise: Where are we? (repeat of slide 12)
