
Presentation Transcript


  1. 17th APAN Meetings Research Collaborations Among Institutions Employing Grid Technology January 29, 2004 Hawaii Imin International Conference Center, University of Hawaii, Honolulu, Hawaii, USA Principal Investigator: Jai-Ho Oh (Pukyong National Univ, KOREA) Co-researchers: In-Sik Kang (Seoul National Univ, KOREA), Byung-Lyol Lee (NCAM/KMA, KOREA)

  2. Main Goals Establishment of uMeteo-K, a ubiquitous Korean meteorological research cooperation system

  3. About uMeteo-K The concept of a virtual laboratory for interdisciplinary meteorological research • Cooperative meteorological research environment (Access Grid) • Parallelized numerical weather/climate prediction modeling (Computational Grid) • Virtual server for large meteorological data (Data Grid) Grid technology is essential to accomplishing these goals.

  4. uMeteo-K Achievements (Apr. 2003 - present) • Establishment of a virtual cooperative research environment using Access Grid • Pilot operation of a numerical weather prediction system under a Computational Grid environment • Establishment of the Data Grid and application of virtual scenarios for uMeteo-K

  5. PKNU video-conferencing & AG room

  6. PKNU room-node AG server spec <Table of components: CPUs, RAM, motherboard, FDD, SCSI-FDD, ODD, video device 1, video device 2, animation capture device, audio device>

  7. PKNU QuickBridge server installed • bridge.pknu.ac.kr: uMeteo-K bridge server • Supports connecting a PIG without a multicast network • Operates the uMeteo-K homepage server • Registered with the uMeteo-K virtual venues service (ANL) • Operating the QuickBridge server (bridge.pknu.ac.kr : 210.107.209.219)

  8. uMeteo-K portable AG server spec.

  9. uMeteo-K AG instruments - Camera: EVI-D30 - Microphones: cordless & boundary - Echo-cancellation unit

  10. uMeteo-K new AG configuration <Diagram: the PKNU (부경대) Quick Bridge links multicast sites (APAG, ANL, SNU, KISTI, KJIST, KAIST, KMA, meteorological universities) with unicast sites (KMA meteorological organization AG, other users)>

  11. PKNU Venue-server registered to ANL

  12. Remote PPT - Shares presentation data and connects with AG - Data sharing through a simple virtual-host configuration

  13. VNC Server & Viewer • VNC stands for Virtual Network Computing. • VNC allows you to view and interact with one computer (the "server") using a simple program (the "viewer") on another computer anywhere. • The two computers don't even have to run the same type of OS.
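As a minimal sketch (the host name and display number are hypothetical, and exact commands vary by VNC distribution), a session is typically started and viewed like this:

  # On the machine to be shared (the "server"): start a VNC server on display :1
  vncserver :1
  # On the remote machine (the "viewer"): connect to that host and display
  vncviewer ag-room.pknu.ac.kr:1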

  14. Sample of VNC Server & Viewer

  15. Samples of uMeteo-K AG operation Seminars: 4 international + 14 domestic Co-work and general use: 60

  16. uMeteo-K AG operation

  17. Meteorology Session at the 16th APAN Meeting, Marriott Hotel, Busan, Korea <Aug. 26~27, 2003>

  18. 2003 Grid Forum Korea Winter meetings at Chosun Hotel, Seoul <Dec. 1~2, 2003>

  19. GLOBEC workshop at NFRDI, Korea <Dec. 17~18, 2003>

  20. Tera Computing and Network Demonstration, KISTI <Dec. 23, 2003>

  21. Workshop on Grid Technology for Environmental study at PKNU <Jan. 8~10, 2004>

  22. uMeteo-K AG 2.0 tutorials uMeteo-K meeting & tutorials at PKNU, Busan, Korea <Jan. 8~10, 2004>

  23. Costs & Benefits Analysis for uMeteo-K AG Assumed costs per person: domestic expert = 100,000 KRW; foreign expert = 1,000,000 KRW; participating researcher = 50,000 KRW

  24. Differences between AG 1.2 and AG 2.0 1. Easier Venue Server configuration and usage 2. Individual Venue Server used instead of a PIG 3. Complexity around the Certification Authority 4. Data sharing, shared applications, and shared services enforced 5. GUI 6. Total service management of Access Grid utilities <Diagram: in AG 1.2, clients reach separate servers for data sharing and remote control directly or via a bridge, and applications are managed through different programs; in AG 2.0, all applications are managed through one program>

  25. Operate AG 2.0 – (1) Video Conferencing

  26. Operate AG 2.0 – (2) Sharing Data

  27. Operate AG 2.0 – (3) Data Download

  28. Operate AG 2.0 – (4) Shared Browser

  29. Operate AG 2.0 – (5) Shared Presentation

  30. Client Services & Applications • Enable face-to-face meeting activities • What can be done: data sharing, shared applications, text chat • Applications: Shared Presentation, Shared Web Browser, Whiteboard, Voting Tool, Question & Answer Tool, Shared Desktop Tool • Integrates legacy single-user apps • Communicates media addresses to the node service

  31. uMeteo-K CG Testbed • uMeteo-K computational grid test-bed (two clusters, each with 4 nodes) • <Table: a node's specification>

  32. uMeteo-K CG Testbed Configuration <Diagram: two 4-node (single-CPU) clusters, each with a NAS storage server, connected through a 10/100 switch hub to KOREN over 10/100 Ethernet, with a UPS, an electrometer, and a monitoring system>

  33. Globus linkage between testbed clusters <Diagram: Master A (CA-A) and Master B (CA-B), each controlling its own group of slave nodes through PBS> • An independent SimpleCA is installed on each master node • A group of slave nodes is controlled by each master node's PBS scheduler • Used to simulate severe weather: typhoon Mae-mi
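As an illustrative sketch of this linkage (the master host name is hypothetical), a job can be sent from one cluster to the other's PBS queue through the Globus gatekeeper:

  # Obtain a proxy credential signed under the local SimpleCA
  grid-proxy-init
  # Run a test command on cluster B's nodes via its PBS jobmanager
  globus-job-run master-b.pknu.ac.kr/jobmanager-pbs /bin/hostname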

  34. CA information of each cluster
  - CA-A
    subject  : /O=uMeteoK/OU=pknuGB_A01/OU=pknu.ac.kr/CN=globus/CN=proxy
    issuer   : /O=uMeteoK/OU=pknuGB_A01/OU=pknu.ac.kr/CN=globus
    identity : /O=uMeteoK/OU=pknuGB_A01/OU=pknu.ac.kr/CN=globus
    type     : full legacy globus proxy
    strength : 512 bits
    path     : /tmp/x509up_u533
    timeleft : 7:52:58
  - CA-B
    subject  : /O=uMeteoK/OU=pknuGB_B01/OU=pknu.ac.kr/CN=globus05/CN=proxy
    issuer   : /O=uMeteoK/OU=pknuGB_B01/OU=pknu.ac.kr/CN=globus05
    identity : /O=uMeteoK/OU=pknuGB_B01/OU=pknu.ac.kr/CN=globus05
    type     : full legacy globus proxy
    strength : 512 bits
    path     : /tmp/x509up_u535
    timeleft : 7:52:43
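The listings above are the standard output of the Globus proxy-inspection command, run on each cluster after grid-proxy-init:

  grid-proxy-info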

  35. uMeteo-K CG testbed monitoring

  36. MM5 (Mesoscale Model Version 5) • Non-hydrostatic NWP model developed by PSU/NCAR • KMA's operational model for short-term forecasting • Parallel NWP model for real-time short-term forecasting

  37. Precipitation, MSLP & Wind for 24 hrs <Figures: precipitation; MSLP and wind — Sep. 12, 2003 09:00 LST ~ Sep. 13, 2003 09:00 LST>

  38. Parallel MM5 Benchmarks with GLOBUS • Average job waiting time (including CA): 25 sec • <Table: required time for a 3600 sec (1 hour) model integration> • <Table: required time for an 86400 sec (1 day) model integration>

  39. Connecting to the KISTI testbed
  - CA information
    subject  : /O=Grid/O=Globus/OU=pknu.ac.kr/CN=cgtest/CN=proxy/CN=proxy
    issuer   : /O=Grid/O=Globus/OU=pknu.ac.kr/CN=cgtest/CN=proxy
    type     : full
    strength : 512 bits
    path     : /tmp/x509up_p2358.fileEaIRDc.1
    timeleft : 7:28:17
  - Connecting to venus.gridcenter.or.kr using gsissh
    gsissh -p 2222 venus.gridcenter.or.kr
  - File transfer to venus.gridcenter.or.kr using gsincftpput and gsincftpget
    gsincftpput -p 2811 venus.gridcenter.or.kr ./ ~/MM5.tar
    gsincftpput -p 2811 venus.gridcenter.or.kr ./ ~/CReSS.tar
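Results can be retrieved the same way in the other direction; a hedged example (the archive name is hypothetical):

  # Fetch a result archive from the KISTI testbed into the current directory
  gsincftpget -p 2811 venus.gridcenter.or.kr . ~/CReSS_output.tar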

  40. Running numerical weather models • MM5 (Mesoscale Model Version 5) and CReSS (Cloud Resolving Storm Simulator) were compiled successfully • CReSS runs successfully, but MM5 fails because some libraries and routines are not supported on the testbed
  - RSL file to run CReSS
    + ( &(resourceManagerContact="venus.gridcenter.or.kr")
        (count=8)
        (label="subjob 0")
        (environment=(GLOBUS_DUROC_SUBJOB_INDEX 0)
                     (LD_LIBRARY_PATH /usr/local/gt2/lib/))
        (directory="/home/ngrid/ngrid017/CReSS1.3m")
        (executable="/home/ngrid/ngrid017/CReSS1.3m/run.sh") )
      ( &(resourceManagerContact="jupiter.gridcenter.or.kr")
        (count=8)
        (label="subjob 8")
        (environment=(GLOBUS_DUROC_SUBJOB_INDEX 1)
                     (LD_LIBRARY_PATH /usr/local/gt2/lib/))
        (directory="/home/ngrid/ngrid017/CReSS1.3m")
        (executable="/home/ngrid/ngrid017/CReSS1.3m/run.sh") )
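Assuming the multi-request RSL above is saved to a file (the name cress.rsl is hypothetical), the two-cluster DUROC job can be submitted with globusrun:

  grid-proxy-init
  globusrun -f cress.rsl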

  41. CReSS (Cloud Resolving Storm Simulator) • Researched and developed at Nagoya Univ., Japan • Non-hydrostatic compressible dynamic model • Meso-to-cloud-scale simulation system • Includes detailed cloud microphysical processes • Simulations take a long time, so parallel-processing performance matters

  42. Downburst simulation for 21000 sec <Figure: wind vector in y-section [m s-1]>

  43. Parallel CReSS Benchmarks with GLOBUS • <Table: required time for a 21000 sec model integration on the uMeteoK testbed> • <Table: required time for a 21000 sec model integration on the KISTI testbed>

  44. Data Grid in Globus Tools Data transport & access: • GASS - simple, multi-protocol file transfer tools, integrated with GRAM • GridFTP - provides high-performance, reliable data transfer for modern WANs Data replication and management: • Replica Catalog - provides a catalog service for keeping track of replicated datasets • Replica Management - provides services for creating and managing replicated datasets
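As a hedged example of a GridFTP transfer (the host and paths are hypothetical), globus-url-copy copies a file from a GridFTP server to local disk:

  globus-url-copy gsiftp://cdldata.snu.ac.kr:2811/data/model_output.nc file:///home/umeteok/model_output.nc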

  45. Data grid for climate prediction <Diagram: the atmospheric e-science Data Grid connects KMA, the KMA supercomputer, NCEP, NASA, SNU, PKNU, and the KISTI supercomputer, exchanging model output, forecast output, observation data, and data input via wu-ftp>

  46. Hardware structure of Data-GRID <Diagram: Intel dual-CPU Linux servers at Pukyong National University, Seoul National University, and the Korea Meteorology Administration (hosts neosky15, cdldata, cpsdata, pknuGB01, climate) linked over the KOREN network, with disk RAIDs of 2.3 TB, 1.2 TB, 1.8 TB, 500 GB, and 500 GB>

  47. Overview of the Data Grid (steps performed with ldapadd, globus-replica-catalog, and globus-replica-management) • Adding the uMeteoK-DataGrid to the LDAP server (ldapadd) • Adding the test-catalog to the LDAP server (ldapadd) • Creating the test-collection in the test-catalog • Registering locations for the test-collection (snu-kma-pknu) • Creating, registering, and listing file lists for all locations • Listing all locations where given files can be found • Copying files snu->kma or snu->pknu • Deleting the files & locations • Deleting the test-collection & test-catalog

  48. Data grid middleware • User certification: Globus Toolkit grid-proxy-init • GridFTP (data transfer & access using 'globus-url-copy'): reliable and restartable data transfer, parallel and striped data transfer, automatic negotiation of TCP buffer/window sizes • GSINCFTP: secured FTP using the 'NCFTP' programs, including Globus Toolkit GridFTP

  49. Data Grid in Globus Tools - Globus Toolkit installation process • Grid software: Globus Toolkit 2.4 • CA software: SimpleCA • Verification: grid-proxy-init, globus-personal-gatekeeper, globusrun (executing /bin/date) • The globusrun test of the Globus software completed successfully
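A minimal verification sequence along these lines (the contact string is whatever globus-personal-gatekeeper prints when it starts):

  grid-proxy-init
  globus-personal-gatekeeper -start
  # Substitute the contact string reported by the previous command
  globusrun -o -r "<contact-string>" '&(executable=/bin/date)'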

  50. Data Grid in Globus Tools - GSINCFTP testing (successful) • SNU -> SNU: cdldata -> climate • SNU -> KMA: cdldata -> neosky15 • SNU -> PKNU: cdldata -> pknuGB01
