
Developing Scientific Applications Using Standard Grid Middleware

  1. Developing Scientific Applications Using Standard Grid Middleware. Hiroshi Takemiya, Grid Technology Research Center, National Institute of Advanced Industrial Science and Technology

  2. Goal of the research
  • Examining and evaluating the effectiveness of grid middleware by gridifying “real” applications
    • Computational Grid infrastructure has become feasible for running applications (climate simulation, molecular simulation, fluid simulation, virtual screening, astronomical virtual observatory, …)
    • Many kinds of middleware have been provided (Globus Toolkit, UNICORE, MPICH-G, Ninf-G, …)
    • Several pioneering works have succeeded in running applications on the grid
  • There is still little information on how to “gridify” legacy applications
    • How easily can the application be gridified?
    • How efficiently can the application be executed on the grid?
  • This information will be valuable for application programmers as well as middleware providers

  3. Weather Forecasting System
  • Predicting short- to middle-term weather change
  • Based on the barotropic S-model proposed by Dr. Tanaka
    • Legacy FORTRAN program
    • Simple and precise
  • Treating vertically averaged quantities
    • Solving the shallow water equations (2D simulation)
    • 150 sec per simulation for a 100-day prediction
  • Succeeded in reproducing characteristic phenomena
    • Distribution of jet streams
    • Blocking of high atmospheric pressure (observed 1989/1/30-2/12)
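  For reference, the vertically averaged (barotropic) model solves the rotating shallow water equations; a standard textbook form, not transcribed from the slide, is

  \[
  \partial_t u + u\,\partial_x u + v\,\partial_y u - f v = -g\,\partial_x h, \qquad
  \partial_t v + u\,\partial_x v + v\,\partial_y v + f u = -g\,\partial_y h, \qquad
  \partial_t h + \partial_x (hu) + \partial_y (hv) = 0,
  \]

  where u and v are the vertically averaged horizontal velocities, h is the fluid depth, f the Coriolis parameter, and g the gravitational acceleration.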

  4. Ensemble Simulation
  • Keep high precision over a long period
    • Taking a statistical ensemble mean
    • Introducing a perturbation for each sample simulation
    • Requires 100 ~ 1000 sample simulations
  • Gridifying the program enables quick response
  [Diagram: N sample simulations, each given its own perturbation of the observational data and evolved in time by the leap-frog method; the 100 ~ 1000 results are combined into a statistical result]
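  To make the ensemble structure concrete, the following is a minimal C sketch of the procedure above; the routine names and sizes are hypothetical stand-ins, not the real S-model code.

      /* Minimal sketch of the ensemble procedure.  All routines and sizes
       * are hypothetical stand-ins, not the real S-model code. */
      #include <string.h>
      #include <stdlib.h>

      #define N_SAMPLES 100     /* 100 ~ 1000 sample simulations in practice */
      #define N_POINTS  1024    /* size of the (flattened) model state       */

      /* stand-in: add a small random perturbation to the initial state */
      static void add_perturbation(double *state, int n, unsigned seed)
      {
          srand(seed);
          for (int i = 0; i < n; i++)
              state[i] += 1e-3 * ((double)rand() / RAND_MAX - 0.5);
      }

      /* stand-in for the leap-frog time evolution of one sample */
      static void time_evolution_leapfrog(double *state, int n, int days)
      {
          (void)state; (void)n; (void)days;   /* real model omitted */
      }

      int main(void)
      {
          static double mean[N_POINTS], state[N_POINTS];

          for (int s = 0; s < N_SAMPLES; s++) {
              memset(state, 0, sizeof state);   /* stand-in for reading the observational initial state */
              add_perturbation(state, N_POINTS, (unsigned)s);
              time_evolution_leapfrog(state, N_POINTS, 100);   /* 100-day forecast */
              for (int i = 0; i < N_POINTS; i++)
                  mean[i] += state[i] / N_SAMPLES;             /* ensemble mean */
          }
          return 0;   /* mean[] now holds the ensemble-mean forecast */
      }

  Each sample simulation is independent of the others, which is why the ensemble maps naturally onto the task-parallel GridRPC calls introduced on the next slides.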

  5. Ninf-G Library at a glance (http://ninf.apgrid.org/)
  • Reference implementation of GridRPC, standardized at the GGF GridRPC WG
  • Providing RPC semantics on the grid
    • Suited for gridifying task-parallel programs: assigning independent tasks to distributed resources
    • Providing asynchronous RPC calls as well
  • Built on top of the Globus Toolkit
    • GRAM: invocation of server programs
    • Globus I/O, GASS: communication between client and server
    • MDS: managing stub information
    • GSI: authentication between client and server
  [Diagram: the numerical library and its IDL file on the server side are turned into a remote executable by the IDL compiler, with the interface information stored as an LDIF file in MDS; at run time the client (1) sends an interface request to MDS, (2) receives the interface reply, (3) invokes the remote executable through GRAM, which forks it, and (4) the executable connects back to the client over Globus I/O/GASS, with GSI securing the connection]

  6. Advantages of using the Ninf-G library (1)
  • Construct grid applications easily
    • Hiding the complicated mechanisms of the grid: no detailed knowledge of the grid infrastructure is required
    • Based on familiar RPC semantics
  • Providing support tools for programming
    • ns_client_gen: client program compiler
    • ns_gen: stub generator
  • Client program (simplified, as shown on the slide):
        main() {
            for (i = 0; i < task_no; i++)
                grpc_call_async(dest[i], "remote_call", args);   /* issue the tasks asynchronously */
            grpc_wait_all(dest);                                 /* wait until all tasks finish */
        }
  • Server program:
        remote_call() {
            Processing();   /* the routine executed on the remote server */
            return;
        }
  [Diagram: the server source and IDL file are compiled by the IDL compiler into a server stub; the client library performs the same four-step sequence as on the previous slide (interface request/reply via MDS, invocation via GRAM, connect back via Globus I/O/GASS, GSI authentication)]

  7. Advantages of using the Ninf-G library (2)
  • Write once, run everywhere
    • Based on a standard API: many libraries (NetSolve, DIET, OmniRPC…) provide the same API (cf. MPI vs. the NX library)
    • Constructed on the most popular grid middleware
  • Robust and flexible applications can be constructed
    • Dynamic server program invocation
    • Recovering from failures in establishing connections and allocating tasks (see the sketch below)
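  As a rough illustration of the last point, dynamic server invocation makes a simple retry strategy possible. The sketch below uses the simplified grpc_* calling style of the code on slide 6 (the real GridRPC API works with function handles and session IDs), and the server list and return convention are hypothetical.

      /* Sketch of failure recovery enabled by dynamic server invocation.
       * Slide-style pseudo-API; not the literal GridRPC signatures. */
      extern int grpc_call(const char *dest, const char *func, double *args);

      int run_task_with_retry(const char *servers[], int n_servers, double *args)
      {
          for (int s = 0; s < n_servers; s++) {
              /* try to invoke the server program on servers[s] */
              if (grpc_call(servers[s], "remote_call", args) == 0)
                  return 0;             /* task completed on this server        */
              /* connection or task allocation failed: retry on the next server */
          }
          return -1;                    /* every candidate server failed        */
      }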

  8. Gridifying the Weather Forecasting System
  • Gridifying two parts of the program
    • Simulation part (solving the equations for each sample)
    • Visualization part
    • The two parts are executed in a pipelined fashion (see the sketch below)
  • The user accesses the system through the GridLib portal
  [Diagram: the S-model program reads the data, solves the equations for the samples in parallel through Ninf-G, averages the results, and visualizes them; the user interacts with the system through GridLib]
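  The pipelined structure can be sketched as follows, again in the slide-style simplified grpc_* notation with hypothetical names (run_sample, visualize, the server lists); grpc_wait(i) here stands for waiting on the i-th outstanding call.

      /* Pipeline sketch: visualization of sample i starts as soon as that
       * sample finishes, overlapping with the remaining simulations.
       * Slide-style pseudo-API with hypothetical task and variable names. */
      for (i = 0; i < n_samples; i++)
          grpc_call_async(sim_server[i % n_sim_servers], "run_sample",
                          seed[i], result[i]);              /* dispatch all samples */

      for (i = 0; i < n_samples; i++) {
          grpc_wait(i);                                     /* sample i has finished */
          grpc_call_async(vis_server, "visualize", result[i]);
      }
      grpc_wait_all(vis_server);                            /* drain the visualization calls */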

  9. Behavior of the System
  [Diagram: interaction among the user, the GridLib portal, the client, the simulation servers, and the visualization server, using grpc_call, globus-job-run, and gass-url-copy]

  10. Current Status of the System
  • Deployed on the ApGrid Testbed
    • Client program: a cluster at AIST, Japan (10 CPUs), accessed through GridLib
    • Server programs: 10 clusters in 9 institutes
  • Succeeded in giving demonstrations
    • CCGrid conference: using 110 CPUs
    • PRAGMA workshop: using 183 CPUs
  [Map of the ApGrid Testbed sites running the weather simulation server program: Osaka U, Japan (156 CPUs); Doshisha U, Japan (16 CPUs); NCHC, Taiwan and HKU, Hong Kong (16 and 32 CPUs); KISTI, Korea (64 CPUs); Tsukuba U, Japan (16 CPUs); AIST, Japan (64 CPUs, also hosting the visualization program); two TITECH, Japan clusters (16 CPUs each); KU, Thailand (15 CPUs)]

  11. Lessons Learned (1)
  • How easily could the application be gridified?
  • Gridifying the application using Ninf-G was very easy!
    • 13 days of work (10 days on a single computer, 3 days on the grid environment), about 300 lines modified
    • Most of the program modification can be performed on a single computer
    • Replacing a local call with a remote one is straightforward (see the sketch below)
    • Creating the server program is automated
  • Steps performed on a single computer
    • Specifying the interface of the server function
    • Eliminating data dependences between client and server
    • Inserting Ninf-G functions into the client program
    • Creating the server stubs
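  As a concrete illustration of the "replace a local call with a remote one" step, the change in the client loop is roughly the following (hypothetical routine name run_sample, slide-style simplified API):

      /* before: the sample simulations run locally, one after another */
      for (i = 0; i < n_samples; i++)
          run_sample(seed[i], result[i]);

      /* after: the same loop issues asynchronous Ninf-G calls, so the
       * samples run concurrently on the remote servers */
      for (i = 0; i < n_samples; i++)
          grpc_call_async(servers[i % n_servers], "run_sample",
                          seed[i], result[i]);
      grpc_wait_all(servers);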

  12. Lessons Learned (2)
  • How was the performance of the application?
  • Attaining good performance is difficult
    • Initialization/termination costs are large
      • Large look-up cost of MDS
      • Long polling period of GRAM
      • Blocking behavior of the Ninf-G initialization/termination functions
    • Modification of the application, as well as of the middleware, is needed
  [Bar chart: elapsed time (sec), scale 0 to 1400, showing the initialization and termination costs on the KISTI, DU, and AIST clusters]

  13. Performance Result of Executing 200 Sample Simulations
  • Executing an optimized program
    • Bypassing the MDS lookup
    • Modifying the GRAM source to reduce the polling period
    • Multi-threading the application to avoid blocking (see the sketch below)
    • …
  [Bar chart: elapsed time (sec), scale 0 to 1400, for the KISTI, DU, and AIST clusters with the optimized program]
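  One possible shape of the "multi-threading to avoid blocking" optimization is sketched below: one POSIX thread per server overlaps the blocking per-server initialization instead of paying for it sequentially. init_server() is a hypothetical wrapper around the blocking Ninf-G initialization for one server, not an actual Ninf-G function.

      /* Overlap the blocking per-server initialization with POSIX threads. */
      #include <pthread.h>

      #define N_SERVERS 10

      extern void *init_server(void *server_name);   /* hypothetical wrapper around
                                                         the blocking initialization */

      void init_all_servers(char *servers[N_SERVERS])
      {
          pthread_t tid[N_SERVERS];

          for (int i = 0; i < N_SERVERS; i++)
              pthread_create(&tid[i], NULL, init_server, servers[i]);
          for (int i = 0; i < N_SERVERS; i++)
              pthread_join(tid[i], NULL);     /* all servers are ready here */
      }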

  14. Future Work
  • Middleware level
    • Improving Ninf-G based on the knowledge gained through this work (Ninf-G2 will be released in Nov. 2003)
    • Designing a task-farming API on top of Ninf-G
      • It is still difficult to implement efficient, robust, and flexible applications by hand
      • The API will hide the scheduling, error recovery, and multi-threading mechanisms
  • Application level
    • Checking the scalability of the middleware and the applications using more than 500 CPUs
