
Utilizing E-SciencE through Charon Extension Layer (CEL) Toolkit



Presentation Transcript


  1. Utilizing E-SciencE through Charon Extension Layer (CEL) Toolkit Petr Kulhánek, Martin Petřek, Jan Kmuníček CESNET

  2. Contents • Introduction • CEL Infrastructure • Application Management • Module System • Applications in Grid • Job Management • Charon System • Multi-Site Approach • Conclusions GCCP 2006, November 27-29, 2006

  3. Introduction • What is the Charon Extension Layer? • a uniform and modular approach to (complex) computational job submission and management • a generic system for using application programs in the Grid environment (LCG/gLite middleware, …) • Why the Charon Extension Layer? • many different batch systems & scheduling components are used in the grid environment • each batch system has unique tools and a different usage philosophy • the tools provided by LCG/gLite are quite raw and simple • many additional tasks are needed to use computing resources properly

  4. CEL Infrastructure • Application management • single/parallel execution without job script modification • Job management • easy job submission, monitoring, and result retrieval • Command Line Interface (CLI) approach

  5. Application management • Requirements • easy application initialization • version conflict handling • handling of inter-application conflicts and/or dependencies • same usage in single/parallel execution • different levels of parallelization → Module System

  6. Module System • similar approach as in the Environment Modules Project* • applications are activated by modifications of the shell environment (e.g. PATH, LD_LIBRARY_PATH, etc.) • a particular build of an application is described by a realization • e.g. by instructions that describe the shell environment modifications • a realization is identified by a name consisting of four parts: name:version:architecture:parallelmode • the user can specify only part of the realization; in that case, the Module System completes the full realization name in such a way that the application best fits the available computational resources *) http://modules.sourceforge.net/
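The name-completion step described above can be sketched in a few lines. This is an illustrative Python model only (the real Module System is not implemented this way); the default values are hypothetical, and resolving "auto" parts against the actual computational resources is not modeled here:

```python
# Illustrative defaults; the real Module System derives these per site.
DEFAULTS = {"version": "3.6", "architecture": "auto", "parallelmode": "auto"}
PARTS = ["name", "version", "architecture", "parallelmode"]

def complete_realization(spec):
    """Expand a partially specified module name (e.g. 'povray' or
    'povray:3.6') into a full name:version:architecture:parallelmode
    realization by filling in the missing trailing parts."""
    given = [p for p in spec.split(":") if p]
    if not given:
        raise ValueError("module name is mandatory")
    # Fill every part the user omitted with its default value.
    full = given + [DEFAULTS[p] for p in PARTS[len(given):]]
    return ":".join(full)
```

A later resolution step (not shown) would then replace "auto" parts with concrete values such as i386:single, as in the povray example on the next slide.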

  7. Module System continued • Commands of the Module System module [action] [module1 [module2] …] • main command of the Module System actions: • add (load), remove (unload) • avail, list*, active, exported, versions, realizations • disp, isactive * list is the default action modconfig • menu-driven configuration of the Module System (visualizations, autorestored modules, etc.)

  8. Module System continued • Example of Module Activation $ module add povray Module specification: povray (add action) ============================================================== WARNING: Nonoptimal architecture is used for module 'povray' Cache type : system cache Architecture : i786 Number of CPUs : 1 Max CPUs per node : 1 Exported module : povray:3.6 Complete module : povray:3.6:i386:single • Module Name Completion povray → povray:3.6:auto:auto → povray:3.6:i386:single user → default values → resolved final name

  9. Module System continued • Module Name Completion name - specified by user (mandatory) version - specified by user / default architecture - specified by user / default / automatically determined • the Module System tries to find the realization that is closest to the system architecture parallelmode - specified by user / default / automatically determined • para - always • p4 - NCPU > MaxCPUs/node • shmem - 1 < NCPU <= MaxCPUs/node • node - NCPU <= MaxCPUs/node • single - NCPU = 1
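The parallelmode rules above can be expressed as a small predicate. A minimal sketch, not the actual implementation: it returns every mode admissible for a requested CPU count, given the per-node CPU limit:

```python
def admissible_modes(ncpu, max_cpus_per_node):
    """Return the parallel modes admissible for a job requesting
    `ncpu` CPUs, following the slide's rules: para is always allowed,
    p4 when the job must span nodes, shmem/node when it fits in one
    node, single for exactly one CPU."""
    modes = ["para"]                        # para - always
    if ncpu > max_cpus_per_node:
        modes.append("p4")                  # job spans several nodes
    if 1 < ncpu <= max_cpus_per_node:
        modes.append("shmem")               # fits in one shared-memory node
    if ncpu <= max_cpus_per_node:
        modes.append("node")
    if ncpu == 1:
        modes.append("single")
    return modes
```

Automatic determination would then pick the mode from this set that best matches the available resources.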

  10. Applications in Grid • Model I - METACentrum (Czech national Grid) • applications are on a shared volume available to all grid elements • Model II – EGEE GRID • applications cannot be shared with all grid elements • their “sharing” is provided by deploying them to an SE (one time) • only the required applications are then installed on the CE during each job execution Legend: UI - user interface, CE - computing element, SE - storage element, WN - worker node, app - application

  11. Applications in Grid continued • both models can be handled without any modification of the Module System • desired “non-standard” functions can be carried out by user (administrator) defined hooks (so-called modactions) • a modaction is a script that is executed during any action of the module command • a modaction script for the add action solves the problems with applications in Model II • it behaves differently on the UI and on a WN • on the UI it activates applications from the shared volume • on a WN (CE) it downloads the package from the SE and installs it to a temporary volume; the Module System then sets the environment in such a way that the application is used from that volume • Advantages: all applications are available in the whole grid immediately after their deployment to the SE • Drawbacks: this approach is suitable only for medium- and long-term jobs
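The UI/WN branching of such an add-action modaction could look roughly like this. Everything here is an assumption for illustration: the function name, the `/software` shared-volume path, and `fetch_from_se`, which is an injected stand-in for the real grid transfer tool (not shown), so the sketch stays self-contained:

```python
import os
import tempfile

def modaction_add(app, on_worker_node, fetch_from_se,
                  shared_volume="/software"):
    """Hypothetical sketch of a Model II modaction hook for the 'add'
    action. On the UI the application is activated from the shared
    volume; on a WN the package is first downloaded from the SE into a
    temporary volume. Returns the directory the Module System should
    point the environment (e.g. PATH) at."""
    if not on_worker_node:
        # UI: applications live on the shared volume.
        return os.path.join(shared_volume, app)
    # WN: create a temporary install location and fetch the package.
    tmp = tempfile.mkdtemp(prefix=app + "-")
    fetch_from_se(app, tmp)   # download + unpack from the storage element
    return tmp
```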

  12. Job management • Requirements • easy job submission • the user should focus only on the desired task, not on everything related to submission • easy parallel execution of applications • frequently repeated steps should be processed automatically • keep information about the job during and/or after execution → Charon System

  13. Charon System • Overview • it is an application in the context of the Module System • it separates resource settings from job submission • Job Submission and Management • psubmit <resources> <job_script> [NCPU] [syncmode] • pinfo • psync • pkill • pgo (does not work in the EGEE GRID environment) • Charon Setup • pconfigure

  14. Charon System continued • Typical job flow • no additional arguments are required – all information about the job is stored in control files in the job directory.

  15. Charon System continued • Job Restrictions • a job is described by a script* • each job has to be in a separate directory – control files need to be unique • job directories must not overlap – because the job directory is copied to the WN and then back • only paths relative to the job directory may be used in the job script – only data from the job directory will be present on the WN • software should be activated by the Module System – only then can the best utilization of resources be reached • Job autodetection* • in some cases, the user can specify an input file instead of a script and the Charon System will prepare a script for its processing • *) currently autodetected jobs are: gaussian, povray, and precycle
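The relative-path restriction lends itself to a simple pre-submission check. A deliberately naive heuristic sketch (not part of Charon): it flags tokens in the job script that look like absolute paths, since only files inside the job directory reach the WN:

```python
def find_absolute_paths(job_script_text):
    """Return tokens in a job script that look like absolute paths.
    Comments are stripped first; tokenization is a naive
    whitespace split, so this is a lint-style warning, not a proof."""
    hits = []
    for line in job_script_text.splitlines():
        code = line.split("#", 1)[0]        # ignore shell comments
        for token in code.split():
            if token.startswith("/"):
                hits.append(token)
    return hits
```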

  16. Charon System continued • Configuration • Sync Mode – option for data transfer between UI and WN • gridcopy • all data within the job directory as input • all data within the job directory as result • stdout • all data within the job directory as input • only standard output as result (other data are discarded) • Resources – identification of a particular CE • Properties – fine-grained selection of computational resources (through the Requirements item in JDL) • Alias – combines the above setup items under a single name • the menu-driven pconfigure command serves for configuration
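The two sync modes reduce to a simple selection of which files come back from the WN. A minimal sketch, assuming a hypothetical stdout file name (`job.stdout` is an illustration, not Charon's actual convention):

```python
def files_to_retrieve(job_files, sync_mode, stdout_name="job.stdout"):
    """Model of the result transfer for the two sync modes above:
    'gridcopy' brings the whole job directory back from the WN,
    'stdout' keeps only the standard-output file and discards the
    rest."""
    if sync_mode == "gridcopy":
        return list(job_files)              # everything comes back
    if sync_mode == "stdout":
        return [f for f in job_files if f == stdout_name]
    raise ValueError("unknown sync mode: " + sync_mode)
```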

  17. Multi-Site Approach • a so-called “site” in CEL • is a special module within the Module System which switches the environment from one resource to another • represents a virtual encapsulation of computational resources • allows utilization of different Grids (sites) from one computer • all sites share the same software repository, but the list of available applications depends on the site's Module System setup

  18. Interactive Software Repository • interactive browser of the module database • another extension of the Module System containing a real-time list of available software realizations • this service can show the list of available applications with or without the accessible application versions • this information is integrated with a detailed description of applications (documentation of the particular compilation and installation in MediaWiki) http://troll.chemi.muni.cz/whitezone/development/charon/isoftrepo/

  19. Interactive Software Repository

  20. User Enhancements • Current enhancements of the Module System • easy-to-use configuration of user preferences via a single menu-driven command • the user can set up the behavior of module name completion and the visualization of listed modules • the user is allowed to select modules to be automatically loaded during site activation • the user can change the priority between system and user module caches • user modules • the user is permitted to extend the available application portfolio with their own module specifications

  21. Conclusions • Single job management • encapsulation of a single computational job • minimization of the overhead resulting from direct middleware usage (JDL file preparation, etc.) • easy submission and navigation during the job lifetime • same usage in single/parallel executions of applications • Application management • easy application initialization; handling of inter-application/version conflicts and dependencies • easy enlargement / modification of the available application portfolio

  22. Acknowledgements • Luděk Matyska (CESNET) • Jaroslav Koča (NCBR) • European Commission • EGEE II (contract number RI-031688) • EGEE (contract number IST-2003-508833) • Ministry of Education, Youth, and Physical Training of the Czech Republic (contract number MSM0021622413) • Grant Agency of Czech Republic (204/03/H016)
