- How does the coupled modeling system work? and - Setting up a coupled application

Presentation Transcript


  1. - How does the coupled modeling system work? and - Setting up a coupled application

  2. Coupled Modeling System. Model Coupling Toolkit, Mathematics and Computer Science Division, Argonne National Laboratory, http://www-unix.mcs.anl.gov/mct/. MCT is an open-source package that provides MPI-based communications between all nodes of a distributed-memory modeling component system. Download and compile it as libraries that the models link to. [Diagram: Model A running on M nodes, Model B running on N nodes, Model C, ...; MCT provides communications between all models, and it also works between the nodes within a single model.] Warner, J.C., Perlin, N., and Skyllingstad, E. (2008). Using the Model Coupling Toolkit to couple earth system models. Environmental Modelling & Software.

  3. Libraries • MCT - v2.60 or higher (distributed)
     1) cd to the MCT dir
     2) ./configure - this creates Makefile.conf; you can edit this file
     3) make
     4) make install
     5) set environment vars:
        setenv MCT_INCDIR COAWST/Lib/MCT/include
        setenv MCT_LIBDIR COAWST/Lib/MCT/lib
        (or wherever you installed them, see last slide)
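
The same steps collected as shell commands (a sketch: it assumes the MCT copy distributed under COAWST/Lib/MCT and a csh/tcsh shell for the setenv lines; in practice the two environment variables usually point at the full paths of the installed include/ and lib/ directories):

    # Build the MCT library distributed with COAWST and point the build at it.
    cd COAWST/Lib/MCT           # assumed location of the distributed MCT
    ./configure                 # writes Makefile.conf (edit it if needed)
    make
    make install
    # csh/tcsh environment variables read by the COAWST build:
    setenv MCT_INCDIR COAWST/Lib/MCT/include
    setenv MCT_LIBDIR COAWST/Lib/MCT/lib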

  4. Compilers dir (side note)

  5. Model organization: master.F calls mpi_init, reads the init file (# procs/model), and then drives each model through its init, run, and finalize phases (ROMS { init, run, finalize } and SWAN { init, run, finalize }). (A schematic sketch follows slide 6.)

  6. init, run, and finalize
     init (grid decomp): ROMS - roms_init (MPI_INIT, init_param, init_parallel, init_scalars, init_coupling); SWAN - SWINIT, SWREAD (grid), init_coupling, SWINITMPI
     run (sync. point): ROMS - roms_run (main3d ... ocean_coupling ...); SWAN - SWMAIN (swanmain ... waves_coupling ...)
     finalize: ROMS - roms_finalize (close_io, mpi_finalize); SWAN - SWEXITMPI (close_io, mpi_finalize)
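
To make the organization on slides 5-6 concrete, here is a minimal, schematic Fortran sketch of a coupled driver in the master.F style. It is an illustration only, not the COAWST source: the processor counts are hard-wired, and the ocean/wave phase routines are stubs standing in for the real roms_init/roms_run/roms_finalize and SWINIT/SWMAIN/SWEXITMPI calls.

      PROGRAM master_sketch
!  Schematic coupled driver in the style of master.F (illustration only).
!  MPI is initialized, the processor count per model is set, the world
!  communicator is split, and each model runs its init / run / finalize
!  phases on its own group of processors.
      USE mpi
      IMPLICIT NONE
      integer :: ierr, rank, nprocs_ocn, my_comm, color

      CALL mpi_init (ierr)
      CALL mpi_comm_rank (MPI_COMM_WORLD, rank, ierr)

!  In COAWST the per-model counts come from the coupling input file;
!  they are hard-wired here for illustration.
      nprocs_ocn = 2

!  Ranks 0..nprocs_ocn-1 run the ocean model, the rest run the wave
!  model; each model sees only its own communicator.
      IF (rank.lt.nprocs_ocn) THEN
        color = 1
      ELSE
        color = 2
      END IF
      CALL mpi_comm_split (MPI_COMM_WORLD, color, rank, my_comm, ierr)

      IF (color.eq.1) THEN
        CALL ocean_init (my_comm)      ! cf. roms_init on slide 6
        CALL ocean_run ()              ! cf. roms_run (coupling inside)
        CALL ocean_finalize ()         ! cf. roms_finalize
      ELSE
        CALL wave_init (my_comm)       ! cf. SWINIT / SWREAD / SWINITMPI
        CALL wave_run ()               ! cf. SWMAIN
        CALL wave_finalize ()          ! cf. SWEXITMPI
      END IF

      CALL mpi_finalize (ierr)

      CONTAINS
!  Stub phases so the sketch compiles; the real models do their own
!  initialization, time stepping, coupling, and I/O in these phases.
      SUBROUTINE ocean_init (comm)
      integer, intent(in) :: comm
      print *, 'ocean init on communicator', comm
      END SUBROUTINE ocean_init
      SUBROUTINE ocean_run
      print *, 'ocean run'
      END SUBROUTINE ocean_run
      SUBROUTINE ocean_finalize
      print *, 'ocean finalize'
      END SUBROUTINE ocean_finalize
      SUBROUTINE wave_init (comm)
      integer, intent(in) :: comm
      print *, 'wave init on communicator', comm
      END SUBROUTINE wave_init
      SUBROUTINE wave_run
      print *, 'wave run'
      END SUBROUTINE wave_run
      SUBROUTINE wave_finalize
      print *, 'wave finalize'
      END SUBROUTINE wave_finalize
      END PROGRAM master_sketch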

  7. Grid decomposition (during initialization) [figures: SWAN and ROMS grids split into tiles] • Each tile is on a separate processor. • Each tile registers with MCT.

  8. init_coupling [diagram: ROMS init_coupling and SWAN init_coupling, matching steps 1-3 on each side, processed by each ROMS tile and by each SWAN tile]
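
As a rough illustration of what registering with MCT during init_coupling involves, here is a schematic Fortran sketch using MCT's documented initialization pattern (MCTWorld, GlobalSegMap, AttrVect, Router). The module name, variable names, field list, and component IDs are assumptions for this sketch, not the COAWST init_coupling source.

      MODULE coupling_sketch_mod
!  Illustrative MCT registration for one model component.  Every tile
!  (MPI process) of the model executes init_coupling once during its
!  init phase.
      USE m_GlobalSegMap, ONLY : GlobalSegMap
      USE m_AttrVect,     ONLY : AttrVect
      USE m_Router,       ONLY : Router
      IMPLICIT NONE
      TYPE(GlobalSegMap) :: GSMap          ! this model's decomposition
      TYPE(AttrVect)     :: ToOther        ! fields sent to the other model
      TYPE(AttrVect)     :: FromOther      ! fields received from it
      TYPE(Router)       :: RoutToOther    ! tile-to-tile communication pattern

      CONTAINS

      SUBROUTINE init_coupling (MyComm, MyID, OtherID, istart, ilength)
      USE m_MCTWorld,     ONLY : MCTWorld_init => init
      USE m_GlobalSegMap, ONLY : GlobalSegMap_init => init
      USE m_AttrVect,     ONLY : AttrVect_init => init
      USE m_Router,       ONLY : Router_init => init
      USE mpi
      integer, intent(in) :: MyComm, MyID, OtherID
      integer, intent(in) :: istart(:), ilength(:)  ! this tile's grid segments
      integer :: lsize

!  1) Register this component with MCT (two coupled components assumed).
      CALL MCTWorld_init (2, MPI_COMM_WORLD, MyComm, MyID)

!  2) GlobalSegMap: which global grid points this tile owns.
      CALL GlobalSegMap_init (GSMap, istart, ilength, 0, MyComm, MyID)
      lsize = SUM (ilength)

!  3) Attribute vectors holding the exchanged fields (names illustrative).
      CALL AttrVect_init (ToOther,   rList='SST',         lsize=lsize)
      CALL AttrVect_init (FromOther, rList='Hwave:Lwave', lsize=lsize)

!  4) Router: how this model's tiles map onto the other model's tiles.
      CALL Router_init (OtherID, GSMap, MyComm, RoutToOther)

      END SUBROUTINE init_coupling
      END MODULE coupling_sketch_mod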

  9. Synchronization (run phase) [diagram: ROMS ocean_coupling and SWAN waves_coupling exchange fields through MCT; processed by each ROMS tile and by each SWAN tile]
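
The exchange at the synchronization point follows MCT's send/receive pattern. Below is a minimal sketch of the ocean side, reusing the module from the previous sketch; the field names, the send/receive ordering, and the assumption that MCT was built with double-precision reals are illustrative, and this is not the actual ocean_coupling/waves_coupling source.

      SUBROUTINE ocean_coupling_sketch (lsize, sst_model, hwave_model)
!  Illustrative exchange at a coupling (synchronization) point: each
!  tile packs its fields, the two models swap data through MCT, and
!  each tile unpacks what it received.
      USE coupling_sketch_mod, ONLY : ToOther, FromOther, RoutToOther
      USE m_AttrVect, ONLY : importRAttr, exportRAttr
      USE m_Transfer, ONLY : MCT_Send => send, MCT_Recv => recv
      IMPLICIT NONE
      integer, intent(in)  :: lsize
      real,    intent(in)  :: sst_model(lsize)    ! model SST on this tile
      real,    intent(out) :: hwave_model(lsize)  ! wave height received
      real(kind=8), pointer :: A(:)   ! work array (assumes double-precision MCT)
      integer :: Asize

      Asize = lsize
      allocate ( A(Asize) )

!  Pack the local SST values into the outgoing attribute vector.
      A = sst_model
      CALL importRAttr (ToOther, 'SST', A, Asize)

!  Blocking exchange: both models must reach their coupling call (the
!  synchronization point) before either proceeds.  The send/recv order
!  must pair with the order used in the other model's coupling routine.
      CALL MCT_Send (ToOther, RoutToOther)
      CALL MCT_Recv (FromOther, RoutToOther)

!  Unpack the received significant wave height onto this tile.
      CALL exportRAttr (FromOther, 'Hwave', A, Asize)
      hwave_model = A

      deallocate (A)
      END SUBROUTINE ocean_coupling_sketch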

  10. ATM to OCN data fields. The ATM model sends Uwind, Vwind, Patm, RH, Tair, cloud, rain, evap, Swrad, Lwrad, LH, HFX, Ustress, and Vstress toward the OCN model. Two options control how the ocean surface fluxes are formed: #define ATM2OCN_FLUXES uses the momentum + heat fluxes computed in WRF for both ROMS+WRF (Ustress, Vstress, Swrad, Lwrad, LH, HFX, with stflx_temp = Swrad + Lwrad + LH + HFX), or #define BULK_FLUXES uses the WRF variables (Uwind, Vwind, Swrad, Lwrad, RH, Tair, cloud, rain, evap) in the COARE algorithm, with LH + HFX computed in bulk_fluxes. Salt flux: #define EMINUSP gives stflx_salt = evap - rain; #define ATM_PRESS passes Patm. (Symbols courtesy of the Integration and Application Network, ian.umces.edu/symbols, University of Maryland Center for Environmental Science.)

  11. OCN to ATM data fields. The OCN model sends SST to the ATM model; the WAV model sends Hwave, Lpwave, and Tpsurf. In the ATM model the surface fluxes of momentum, heat, and moisture are then computed as f(Hwave, Lpwave, Tpsurf) together with the SST.

  12. How to create a coupled application
     1) Create all input, BC, init, forcing, etc. files for each model as if running separately. I recommend that you run each model separately first.
     2) Modify the cppdefs in your header file.
     3) SCRIP (if different grids or grid refinement).
     4) coupling.in
     5) coawst.bash
     6) Run it as coawstM.
     - Handout ends here - More in the online ppt - Classroom tutorial will now follow: Projects/Sandy/create_sandy_application.m

  13. 1) Use each model separately. These models are on different grids.
     WRF (6 km grid): 27 vertical levels; dt 36 s; Physics: Lin microphysics, RRTM longwave, Dudhia shortwave, Mellor-Yamada-Janjic (MYJ) PBL, Kain-Fritsch (KF) cumulus scheme.
     ROMS (5 km and 1 km grids): 16 vertical levels; dt 240 s, 48 s; Physics: GLS turbulence closure, COARE bulk fluxes, BCs from HYCOM.

  14. 2) sandy.h
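
The slide shows the project header for the Sandy application. As a purely hypothetical illustration of what such a header looks like, the sketch below collects cpp options named elsewhere in these slides; the model-activation defines and the exact option set are assumptions, so consult the sandy.h actually distributed with COAWST.

/* Hypothetical sketch in the style of a COAWST application header
** (sandy.h).  Only options mentioned in these slides are shown; the
** model-activation defines are assumptions, not the distributed file. */

#define ROMS_MODEL         /* activate the ocean model (assumption)      */
#define WRF_MODEL          /* activate the atmosphere model (assumption) */
#define MCT_LIB            /* couple the models through MCT (assumption) */
#define MCT_INTERP_OC2AT   /* SCRIP remapping because the ocean and
                              atmosphere grids differ (assumption)       */

/* air-sea flux options from slide 10 */
#define ATM2OCN_FLUXES     /* use momentum + heat fluxes computed in WRF */
#undef  BULK_FLUXES        /* alternative: COARE bulk fluxes in ROMS     */
#define EMINUSP            /* salt flux: stflx_salt = evap - rain        */
#define ATM_PRESS          /* pass atmospheric pressure to the ocean     */

/* ROMS physics from slide 13 */
#define GLS_MIXING         /* GLS turbulence closure                     */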

  15. 3) SCRIP - grid interpolation, http://climate.lanl.gov/Software/SCRIP/ [figure: ocean grid 5 km, atm grid 6 km, GFS data, HFLX, SST]. The ocean model provides higher resolution and a coupled response of SST to the atmosphere, but the ocean grid is limited in spatial coverage, so the atmosphere model must combine data from different sources, which can create a discontinuity in the forcing. The atmosphere model provides heat flux to cover the entire ocean grid. SCRIP interpolation weights are needed to remap the data fields, using a flux-conservative remapping scheme.

  16. Libraries • SCRIP - v1.6 (distributed). Used when 2 or more models are not on the same grid.
     1) cd to the COAWST/Lib/SCRIP/source dir
     2) edit makefile
     3) make
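
The same steps as shell commands (a sketch; which compiler and flags to set in the makefile depends on your system):

    # Build the SCRIP library distributed with COAWST.
    cd COAWST/Lib/SCRIP/source
    $EDITOR makefile     # set the Fortran compiler and flags for your system
    make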

  17. 3) SCRIP. Create the SCRIP weights using COAWST/Tools/mfiles/mtools/create_scrip_weights_master.m

  18. 3) SCRIP. A weight file is created between each pair of grids. So if you have 1 WRF grid and 2 ROMS grids, it will produce atm1_to_ocn1_weights.nc, atm1_to_ocn2_weights.nc, ocn1_to_atm1_weights.nc, and ocn2_to_atm1_weights.nc. [figure: WRF grid 1 containing ROMS grid 1 and ROMS grid 2, at 5000 m and 1000 m resolution]

  19. 4) coupling.in (this is a ROMS+WRF app). Set the # procs for each model; set the coupling interval (it can be different for each direction); give the input file names (only 1 for WRF, 1 for ROMS, multiple for SWAN); the SCRIP weights files are listed here.
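
A schematic of what such a file contains is below. The keyword names, paths, and values are illustrative assumptions from memory, not a verified listing, so start from the coupling_*.in file distributed with COAWST rather than from this sketch.

! Sketch of a coupling.in for a ROMS+WRF application (illustrative only).

! number of processors allocated to each model
     NnodesATM = 28                   ! WRF
     NnodesOCN = 48                   ! ROMS

! coupling interval (seconds); can differ for each direction
     TI_ATM2OCN = 1200.0d0
     TI_OCN2ATM = 1200.0d0

! input file names: one for WRF, one for ROMS (multiple for SWAN)
     ATM_name = namelist.input
     OCN_name = Projects/Sandy/ocean_sandy.in

! SCRIP interpolation weights (needed because the grids differ)
     SCRIP_COAWST_NAME = Projects/Sandy/scrip_sandy_weights.nc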

  20. 4) namelist.input. WRF needs a dt of 30 s to divide evenly into the coupling interval of 1200 sec; also set the # procs for the atm model here.
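
A minimal sketch of the relevant entries in WRF's namelist.input (only the time step and decomposition settings discussed here; all values are illustrative, and using nproc_x/nproc_y to pin the atm processor layout is an assumption):

 &domains
 time_step   = 30,    ! divides the 1200 s coupling interval evenly
 max_dom     = 1,
 nproc_x     = 4,     ! atm processor decomposition; the product must
 nproc_y     = 7,     ! match the # procs given to WRF in coupling.in
 /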

  21. 6) run it as coawstM • use total number of procs from coupling.in • only 1 executable
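
For example, with an MPI launcher (a sketch: the launcher syntax, processor total, and coupling input file name are illustrative; the total must equal the sum of the per-model counts in coupling.in):

    # One executable, run on the total number of processors from coupling.in
    # (e.g. 28 WRF + 48 ROMS = 76 in the earlier sketch).
    mpirun -np 76 ./coawstM Projects/Sandy/coupling_sandy.in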

  22. Processor allocation. The stdout reports the processor allocation. (This screenshot is from a different run, but you get the idea.)

  23. Processor allocation. In the stdout, "Timing for ...." lines are WRF and lines like "1 179 52974 02:59:00" are ROMS. Here is where the model coupling synchronization occurs, so in this case more nodes could probably be re-allocated to WRF.

  24. JOE_TC - test case examples. The JOE_TC test cases are distributed applications for testing ROMS+WRF coupling: • JOE_TCw = WRF only • JOE_TCs = same grid, ROMS + WRF coupled • JOE_TCd = different grids for ROMS and WRF, needs SCRIP weights
