
WRF4G

WRF4G: the Weather Research and Forecasting (WRF) model workflow for the GRID (meteo@unican.es). GRID developers: Valvanuz Fernández, Antonio S. Cofiño. Application developers: Jesús Fernández, Lluís Fita. Department of Applied Mathematics & Computer Sciences, University of Cantabria, Spain.

Presentation Transcript


  1. WRF4G: The Weather Research and Forecasting model workflow for the GRID. meteo@unican.es. GRID developers: Valvanuz Fernández, Antonio S. Cofiño. Application developers: Jesús Fernández, Lluís Fita. Department of Applied Mathematics & Computer Sciences, University of Cantabria, Spain. Santander Meteorology Group: a multidisciplinary approach to weather & climate. http://www.meteo.unican.es. EGU General Assembly 2010, Vienna, 2-7 May 2010.

  2. WRF Model. A regional Numerical Weather Prediction model, open source (Fortran90), with a world community of 6000+ registered users. Applied in a wide variety of studies: weather forecasting (operational at NCEP), data assimilation studies, regional climate studies, idealized simulations, educational applications, ...

  3. WRF requirements. Computing and storage requirements: intensive use of CPU; supports a variety of parallel programming paradigms (OpenMP, MPI, serial, ...); it is a common benchmarking application for HPC. Large amounts of input, output and restart data. Typical experiments last for days (even in parallel). An application with a complex workflow (preprocessing, execution & postprocessing). [Diagram: WRF workflow. The WRF Preprocessing System (geogrid, ungrib, metgrid; driven by namelist.wps) feeds real and wrf in the WRF ARW core (driven by namelist.input), with file handling via WRFGEL.]
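
To make the chain concrete, here is a minimal shell sketch of running the classic WRF workflow by hand; the directories and the GFS input data location are illustrative assumptions, not part of the talk:

    # Minimal sketch of the WRF workflow run by hand (directories are illustrative).
    cd $WPS_DIR
    ./geogrid.exe                      # interpolate static geographical data (namelist.wps)
    ./link_grib.csh /data/gfs/gfs.*    # link the gridded input data (GRIB)
    ./ungrib.exe                       # decode GRIB into the WPS intermediate format
    ./metgrid.exe                      # horizontally interpolate met fields onto the domain
    cd $WRF_DIR/run
    ln -sf $WPS_DIR/met_em.d0?.* .     # hand the WPS output to the ARW core
    ./real.exe                         # vertical interpolation; writes wrfinput_d01, wrfbdy_d01
    ./wrf.exe                          # the simulation itself (namelist.input)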

  4. Resource-demanding WRF scientific experiments. Reanalysis/reforecasting: a high number (~10^4) of independent simulations and a high volume of output data (>1 TB); requires scalability. Regional projections for climate change: contiguous simulations (~10), weeks of walltime each, with a high volume of output data (>10 TB); requires a recovery system for simulation restarts. Weather forecasting: QoS and optimal resources, with a deadline for delivery. Sensitivity studies for climate and weather: sampling of uncertainties in physical schemes, initial conditions and boundary conditions.

  5. WRF4G: Goals. Make it easier for the user to design, execute and monitor WRF experiments. Develop a framework that allows the user to exploit different computing resources at the same time, in a transparent way: local clusters (PBS, SGE, LoadLeveler, ...), Grid infrastructures (gLite, Globus, ...) and local resources (SSH, fork, ...). Develop a set of command-line tools and a Web portal for WRF users (a hypothetical session is sketched below).
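
As an illustration only, a session on the User Interface might look like the following; wrf4g_submitter.sh appears later in the talk, while the editing step is an assumption about typical usage:

    # Hypothetical session on the UI (illustrative, not a verbatim WRF4G tutorial).
    cd my_experiments/exp
    $EDITOR wrf.input         # describe the scientific experiment
    $EDITOR wrf4g.conf        # describe the infrastructure to run it on
    wrf4g_submitter.sh        # submit every realization of the experiment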

  6. Role of WRF4G in CORDEX. CORDEX (COordinated Regional climate Downscaling EXperiment) is a framework to improve the coordination of international efforts in regional climate downscaling research. CORDEX was initiated as a result of the Task Force on Regional Climate Downscaling, formed by the World Climate Research Programme (WCRP). http://wcrp.ipsl.jussieu.fr/RCD_Projects/CORDEX , http://www.meteo.unican.es/wiki/cordexwrf . A set of target regions has been proposed, and modeling groups willing to contribute must comply with the simulation specifications. CORDEX will produce an ensemble of simulations sampling uncertainties related to: (i) varying Global Climate Model (GCM) simulations; (ii) varying greenhouse gas (GHG) concentration scenarios; (iii) natural climate variability; and (iv) different downscaling methods. There are currently 15 groups planning to contribute to CORDEX with WRF, 5 of them on the African domain (a key region for the AR5). All of them could benefit from the WRF4G app.

  7. EGEE: example of a big Grid infrastructure. 150,000 CPUs, 70 PB of storage, 260 sites worldwide. Architectures: i386, x86_64. LRMS: Torque, SGE, LSF, BQS. Shared and non-shared home directories. Computing resources: OS: Debian, SL/CentOS 4 and 5; memory: 250 MB to 16 GB; processors: P4 to i7. Different queue limitations: walltime, memory & disk quotas, ... Bandwidth at some sites is very small.

  8. WRF4G: GRID challenges. Develop an application able to run on different architectures, OSs, LRMSs and parallel environments, with a repository of WRF binaries adapted to those environments. Detect sites that do not meet the application requirements: memory, processor, queue time limitations, ... (a sketch of such checks follows). Optimize data transfers and replication between geographically distributed resources. Develop a monitoring tool for experiment status. Create a checkpointing management system that allows simulations to be restarted. Failure detection and recovery.
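
For instance, a pilot script can cheaply reject worker nodes that do not meet the requirements before deploying WRF. The following shell sketch is illustrative; the thresholds and the specific checks are assumptions, not the actual WRF4G_ini.sh logic:

    # Illustrative worker-node sanity checks (assumed thresholds; not the real WRF4G_ini.sh).
    MIN_MEM_MB=1000
    MIN_DISK_MB=5000
    mem=$(awk '/MemTotal/ {print int($2/1024)}' /proc/meminfo)   # total RAM in MB
    disk=$(df -Pm . | awk 'NR==2 {print $4}')                    # free disk in MB
    [ "$mem"  -ge "$MIN_MEM_MB" ]  || { echo "site fails memory requirement" >&2; exit 1; }
    [ "$disk" -ge "$MIN_DISK_MB" ] || { echo "site fails disk requirement"  >&2; exit 1; }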

  9. WRF for GRID: WRF4G. WRF4G is a port of the WRF Modeling System prepared to run in GRID environments. WRF has been split into two layers, separating the scientific experiment from the infrastructure details (wrf.input & wrf4g.conf; see the sketch below). WRF4G supports several data transfer services (gsiftp, rsync, ...) and execution systems (gLite, Globus, PBS, SGE, ...). It can be run from the user's machine: no deployment is required on the sites, since the application is deployed by a pilot job.
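
To illustrate the two-layer split, the fragments below are hypothetical: only the file names (wrf.input, wrf4g.conf) and the separation itself come from the talk, and every key name is invented for the example:

    # wrf.input: the scientific experiment (hypothetical keys).
    experiment_name = "exp"
    start_date      = "1990-01-01_00:00:00"
    end_date        = "1990-07-01_00:00:00"

    # wrf4g.conf: the infrastructure details (hypothetical keys).
    WRF4G_BASEPATH  = "gsiftp://se.example.org/meteo/WRF"
    RESOURCES       = "glite, pbs_local"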

  10. WRF4G Workflow. [Diagram: the standard WRF chain. Static geographical data and gridded data (NAM, GFS, RUC, AGRMET, etc.) enter the WRF Preprocessing System (geogrid, ungrib, metgrid; namelist.wps), which feeds real and wrf in the WRF ARW core (namelist.input).]

  11. WRF4G Workflow. [Same diagram, with WRF4G added on top: the user describes the experiment in wrf.input and wrf4g.conf, which are processed locally.]

  12. WRF4G Workflow. [Same diagram, extended: on the UI, wrf4g_submitter.sh reads wrf.input and wrf4g.conf and goes through a job submission abstraction layer; on the WN, WRF4G_ini.sh and WRF4G.sh drive the chain, with WRFGEL providing the input data abstraction layer.]

  13. WRF4G Workflow. [Same diagram, with the storage locations added: WRF4G_DOMAINS holds geo_em.nc and namelist.wps; WRF4G_APPS holds the packaged application (WRF4G-0.0.2.tgz, WRF4Gbin-3.1r83.tgz); WRF4G_INPUT holds the gridded input data; and WRF4G_BASEPATH collects wrfout_*, wrfrst_*, wrfinput_* and wrfbdy_*.]
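
Reading the diagram bottom-up, the worker-node side amounts to staging data in, running the chain, and registering results out. A shell sketch under these assumptions (the location names come from the slide; the exact commands and order are illustrative):

    # Illustrative staging on the worker node (names from slide 13; logic assumed).
    vcp "$WRF4G_APPS/WRF4Gbin-3.1r83.tgz" . && tar xzf WRF4Gbin-3.1r83.tgz   # deploy binaries
    vcp "$WRF4G_DOMAINS/geo_em.nc" .          # static domain data
    vcp "$WRF4G_DOMAINS/namelist.wps" .
    # ... run ungrib, metgrid, real and wrf as in the standard chain ...
    vcp wrfout_* "$WRF4G_BASEPATH/experiments/exp/exp__rea1/output/"    # register output
    vcp wrfrst_* "$WRF4G_BASEPATH/experiments/exp/exp__rea1/restart/"   # register checkpoints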

  14. WRF4G structure. ${WRF4G_ROOT}/ contains:
      examples/: sample experiments: cccantabria (wrf.input, wrf4g.conf); domain_partition (wrf.input.in, wrf4g.conf.in); nino50 (wrf.input, wrf4g.conf); operativo (operativo.sh, post_operatorio.sh, wrf.input.in, wrf4g.conf)
      ui/scripts/: wrf4g_submitter.sh, wrf4g_make_tarball.sh, wrf4g_make_tarball_bin.sh, wrf4g_submit.EELA_grid_job, wrf4g_submit.MDMcluster, wrf4g_submit.MDMclusterIFB, wrf4g_wrfgel_environment
      wn/: WRF4G.sh, WRF4G_ini.sh, WPS, WRFV3, bin, lib, WRFGEL (vcp, fortnml, ncdump, preprocessor.*, create_output_structure, download_file, exist_wps, get_date_restart, register_file)

  15. WRF4G structure. [Same tree, highlighting wrf.input and wrf4g.conf:] These are sample configuration files for a WRF4G experiment.

  16. WRF4G structure. [Same tree, highlighting the wrf4g_* scripts:] These are user scripts launched from the UI.

  17. WRF4G structure. [Same tree, highlighting the wn/ contents:] This is what the Worker Node sees just before running WRF4G.sh.

  18. Create and run an experiment. [Diagram: on the User Interface, my_experiments/exp contains wrf4g.conf and wrf.input.]

  19. Create and run an experiment. [Diagram: the submitter expands my_experiments/exp into realizations exp__rea1 ... exp__reaN, each split into chunks 0001, 0002, ...; every chunk gets a sandbox.tgz packing wrf.chunk, wrf.input, wrf4g.conf and WRF4G_ini.]

  20-29. Create and run an experiment. [Diagram sequence: three realizations (exp__rea1, exp__rea2, exp__rea3), each split into two chunks, over a time axis 0 ... 6. On the Storage Element, ${WRF4G_BASEPATH}/experiments/exp (METEO4G/WRF/experiments/exp in the example) holds one directory per realization, each with output, restart and wpsout subdirectories. The sequence then follows exp__rea1 step by step: preprocessing writes wrfinput_t0 and wrfbdy_t0; the first chunk produces wrfout_t0 ... wrfout_t4 and the checkpoints wrfrst_t2 and wrfrst_t4; the boundary files wrfinput_t4 and wrfbdy_t4 seed the second chunk, which appends wrfout_t5 ... wrfout_t8 with wrfrst_t6 and wrfrst_t8; the final frame shows all three realizations completed with the same file sets.]
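
The wrfrst_* checkpoints are what make recovery possible: a new job only needs to find the newest restart file of its realization. A minimal shell sketch, assuming the experiment tree is reachable as a local path (remote protocols would go through vcp) and that this mirrors what helpers like get_date_restart decide:

    # Minimal restart decision for one realization (illustrative; see get_date_restart on slide 14).
    rea_dir=${WRF4G_BASEPATH}/experiments/exp/exp__rea1
    last_rst=$(ls "$rea_dir"/restart/wrfrst_* 2>/dev/null | sort | tail -n 1)
    if [ -n "$last_rst" ]; then
        echo "warm start: continue from checkpoint $last_rst"
    else
        echo "cold start: begin from wrfinput_t0 / wrfbdy_t0"
    fi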

  30. WRF4G architecture (I). To hide the complexity of the GRID from the application developer, WRF4G includes several abstraction layers which encapsulate the routine tasks in generic function calls. Data management: vcp provides transparent copies between any of the following protocols: gsiftp, LFC, rsync and local copies.
    vcp -r /local/dir gridftp://server:port/remote/path
    vcp gsiftp://srv1:port/rmt/file gsiftp://srv2:port/other/file
    vcp /local/file1 rsync://server/other/local/path/
    vcp gsiftp://srv1:port/rmt/file ln:link
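
The talk does not show how vcp is implemented; as an illustration, a wrapper of this kind can simply dispatch on the URL scheme to standard tools (the mapping below is an assumption, not the actual WRF4G code):

    # Illustrative protocol dispatch for a vcp-like wrapper (not the real implementation).
    src=$1; dst=$2
    case "$src" in
        gsiftp://*|gridftp://*) globus-url-copy "$src" "file://$dst" ;;  # GridFTP
        lfn:*)                  lcg-cp "$src" "file://$dst" ;;           # LFC catalogue
        rsync://*)              rsync -a "$src" "$dst" ;;                # rsync server
        *)                      cp -r "$src" "$dst" ;;                   # local copy
    esac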

  31. WRF4G architecture (II). Execution management: through the use of plugins, the user can submit jobs to different infrastructures: Grid (Globus 4.2 and gLite CE), local clusters (PBS, SGE, SLURM, ...) and local resources. Prepared to use different execution environments: OpenMP and MPI. A strong scheduling policy based on history records and resource characteristics. Failure detection and recovery.
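
Slide 33 mentions GridWay-based plugins; as a hypothetical example of what the submission layer might emit, a GridWay job template for one chunk could look as follows (every field value here is invented):

    # Hypothetical GridWay job template generated by the submission layer (values invented).
    EXECUTABLE   = WRF4G_ini.sh
    INPUT_FILES  = sandbox.tgz
    OUTPUT_FILES = stdout.log, stderr.log
    REQUIREMENTS = ARCH = "x86_64"    # e.g. restrict to sites meeting the binary's needs
    RANK         = CPU_MHZ            # prefer faster hosts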

  32. Conclusions & current work. The application is being supported on the EELA-2, EGEE & NGI-ES infrastructures, and on GT2.4 & GT4. A release candidate (v1.0 RC1) has been launched to run realistic experiments; it consists of a virtual-machine UI with all the application components. Collaborations with end users in Latin America (CETA-CIIFEN-UPS), Europe (CESGA-MeteoGalicia) and Asia (HAII, Thailand) for feedback. We are currently testing MPI support on the GRID. A daily operational version runs on the GRID for ensembles of weather forecasts.

  33. Future work. Develop a monitoring system oriented to application status. Execution management currently works through GridWay plugins for Globus and gLite CEs; PBS and SGE are used as well, but dedicated plugins will be developed for better integration with GridWay. Develop a scalable replica management system integrated with GridWay (data movement is a bottleneck in GRID infrastructures). Create a user portal that allows users to manage and monitor their experiments. Collaborate with EGI and other international GRID initiatives like EUAsiaGrid. Incorporate user feedback.

  34. Thanks !!! meteo@unican.es
