
DUCKS – Distributed User-mode Chirp-Knowledgeable Server


Presentation Transcript


  1. DUCKS – Distributed User-mode Chirp-Knowledgeable Server (Joe Thompson, Jay Doyle)

  2. DUCKS Motivation: 1) Performance, 2) Chirp & Condor, 3) Usability

  3. DUCKS Goals • Bring together the functionality of Condor and Chirp in an easy-to-use package. • Abstract the Condor and Chirp interfaces. • Intelligently distribute files over Chirp servers. • Provide a simple interface for the Chirp Active Storage Program-To-Data model. • Provide a simple interface for the Condor Data-To-Program model.

  4. (Diagram) DUCKS server architecture. Components shown: the DUCKS client, incoming message queues, a transaction handler with its transaction list, a Chirp server list and Chirp tracker, a timeout handler, a garbage collector, and a MySQL database.
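
The slide names the server-side components without showing how they interact. Purely as an illustration, here is a minimal sketch of a dispatch loop, assuming incoming client messages are queued and routed by message type; every name and the threading layout are assumptions, not the DUCKS implementation.

```python
# Minimal sketch of a DUCKS-style server loop: messages arrive on an
# incoming queue and a transaction handler dispatches them by type.
# All names and the dispatch scheme are illustrative assumptions.
import queue
import threading

incoming = queue.Queue()      # one of the incoming message queues
transactions = []             # the transaction list

def handle_store(payload):
    print("Store_Request:", payload)   # stand-in for the real logic

def handle_get(payload):
    print("Get_Request:", payload)     # stand-in for the real logic

def transaction_handler():
    while True:
        kind, payload = incoming.get()   # block until a message arrives
        transactions.append((kind, payload))
        if kind == "Store_Request":
            handle_store(payload)
        elif kind == "Get_Request":
            handle_get(payload)
        # delete_Request, ls_Request, timeouts, and garbage collection
        # would hang off this loop or sibling threads.
        incoming.task_done()

threading.Thread(target=transaction_handler, daemon=True).start()
```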

  5. ducks_put <username> <local_file> <ducks_name> (message sequence between the local machine, the DUCKS server, and a Chirp node) • The client sends Store_Request <username> <ducks_name> <filesize> to the DUCKS server. • The server 1) verifies the <username/ducks_name> pair is not already in the DB, 2) finds a Chirp node with enough free space to store the file, and 3) queries the database for the path name to use on that node. • The server replies with Store_Response <Chirp_node> <Path_on_node>. • The client 4) parses the response to get the Chirp storage location for <local_file> and transfers the file with chirp_put <local_file> <chirp_node> <path_on_node>. • The client then sends Store_success <username> <ducks_name> <chirp_node> <path_on_node>, and the server 5) updates the DB to reflect this file storage.
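
This sequence maps naturally onto a small client function. The sketch below follows the slide's messages step by step; the server port, the newline-delimited text framing, and the function name are assumptions, while chirp_put is the real Chirp transfer tool the slide invokes.

```python
# Hypothetical client side of ducks_put, following the slide's sequence.
import os
import socket
import subprocess

DUCKS_PORT = 9094   # assumed; the real port is not given on the slide

def ducks_put(server, username, local_file, ducks_name):
    filesize = os.path.getsize(local_file)
    with socket.create_connection((server, DUCKS_PORT)) as s:
        # Ask the server where the file should be stored.
        s.sendall(f"Store_Request {username} {ducks_name} {filesize}\n".encode())
        # Expected reply: Store_Response <chirp_node> <path_on_node>
        _, chirp_node, path_on_node = s.makefile().readline().split()
        # Push the file straight to the chosen Chirp node.
        subprocess.run(["chirp_put", local_file, chirp_node, path_on_node],
                       check=True)
        # Confirm, so the server can record the placement in its DB.
        s.sendall(f"Store_success {username} {ducks_name} "
                  f"{chirp_node} {path_on_node}\n".encode())
```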

  6. ducks_get <username> <ducks_name> <local_name> (message sequence between the local machine, the DUCKS server, and a Chirp node) • The client sends Get_Request <username> <ducks_name> to the DUCKS server. • The server 1) queries the DB for the Chirp location of <ducks_name> and replies with Get_Response <Chirp_node> <Path_on_node>. • The client 2) parses the response to get the Chirp storage location for <ducks_name> and fetches the file with chirp_get <chirp_node> <path_on_node> <local_file>.
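
The retrieval path is the mirror image. A sketch under the same assumptions as the ducks_put example (port and text framing are guesses; chirp_get is the real Chirp tool):

```python
# Hypothetical client side of ducks_get, mirroring the slide's sequence.
import socket
import subprocess

DUCKS_PORT = 9094   # assumed, as in the ducks_put sketch

def ducks_get(server, username, ducks_name, local_name):
    with socket.create_connection((server, DUCKS_PORT)) as s:
        # Ask the server where the file lives.
        s.sendall(f"Get_Request {username} {ducks_name}\n".encode())
        # Expected reply: Get_Response <chirp_node> <path_on_node>
        _, chirp_node, path_on_node = s.makefile().readline().split()
    # Pull the file directly from the Chirp node that stores it.
    subprocess.run(["chirp_get", chirp_node, path_on_node, local_name],
                   check=True)
```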

  7. ducks_delete <username> <ducks_name> (handled by the DUCKS server and its garbage collector) • The client sends delete_Request <username> <ducks_name> to the DUCKS server. • The server 1) sets the delete_flag of the <username/ducks_name> entry in the DUCKS DB. • The garbage collector 2) periodically queries the DB for files with the delete_flag set, receiving a result set of <chirp_node> <path_on_node> entries, 3) deletes each file in the list from its Chirp node, and 4) removes the files from the DB.
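
Deletion is deliberately asynchronous: the client call only flips a flag, and the garbage collector does the real work later. A sketch of that background loop, assuming a hypothetical files table with delete_flag, chirp_node, and path_on_node columns; the slide says the store is MySQL, but a generic DB-API connection is used here, and the chirp command invocation is also an assumption.

```python
# Hypothetical garbage-collector loop: find flagged rows, delete the
# remote files, then drop the rows. Schema and CLI calls are assumed.
import subprocess
import time

def garbage_collect(db, interval=60):
    while True:
        cur = db.cursor()
        cur.execute("SELECT username, ducks_name, chirp_node, path_on_node "
                    "FROM files WHERE delete_flag = 1")
        for username, ducks_name, chirp_node, path_on_node in cur.fetchall():
            # Remove the file from its Chirp node (invocation assumed).
            subprocess.run(["chirp", chirp_node, "rm", path_on_node])
            # Then forget about it in the DB. NB: the "?" paramstyle is
            # driver-dependent; MySQL drivers typically use %s.
            cur.execute("DELETE FROM files WHERE username = ? AND ducks_name = ?",
                        (username, ducks_name))
        db.commit()
        time.sleep(interval)   # "periodically query the DB"
```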

  8. ducks_ls <username> <search_string> (handled by an ls thread on the DUCKS server) • The client sends ls_Request <username> <search_string>. • The server 1) queries the DB for all files owned by the user that match "%<search_string>%", 2) starts a background thread and passes it the result set containing the found files, and 3) the thread iterates through the set and sends the File_info for each file back to the client.
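
Handing the result set to a background thread keeps the main handler free for other requests. A sketch of the server side, again with an assumed schema and message framing:

```python
# Hypothetical server-side ls handling: a SQL LIKE match, with a
# background thread streaming File_info lines back to the client.
import threading

def handle_ls(db, client_sock, username, search_string):
    cur = db.cursor()
    cur.execute("SELECT ducks_name, chirp_node, path_on_node FROM files "
                "WHERE username = ? AND ducks_name LIKE ?",
                (username, f"%{search_string}%"))
    rows = cur.fetchall()   # the result set handed to the ls thread

    def send_results():
        for ducks_name, chirp_node, path_on_node in rows:
            client_sock.sendall(
                f"File_info {ducks_name} {chirp_node} {path_on_node}\n".encode())

    threading.Thread(target=send_results, daemon=True).start()
```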

  9. (Diagram) Chirp Active Storage model: distribution to the Chirp nodes already storing the input files (input01.txt … input04.txt). The client's wrapper/submit scripts send a job request to the DUCKS server, which responds with the locations of the input files; the executable and libraries reach each node via a ducks_get request.

  10. (Diagram) Condor model: distribute tasks to any available node. Each node gets the input and exe/lib files from other nodes in the Chirp cluster with ducks_get; the exe/libs/input_file requests are resolved through the DUCKS server.
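
In this model the compute node is arbitrary, so something has to stage files before the job runs. A hypothetical wrapper for the Condor path, reusing the ducks_get command line from slide 6 (file names and argument handling are invented for illustration):

```python
# Hypothetical Condor job wrapper: stage the executable and input from
# DUCKS-managed Chirp storage with ducks_get, then run the job locally.
import os
import subprocess
import sys

def run_job(username, exe_name, input_name):
    # ducks_get <username> <ducks_name> <local_name>, per slide 6.
    subprocess.run(["ducks_get", username, exe_name, "job_exe"], check=True)
    subprocess.run(["ducks_get", username, input_name, "input.txt"], check=True)
    os.chmod("job_exe", 0o755)        # staged binaries arrive non-executable
    subprocess.run(["./job_exe", "input.txt"], check=True)

if __name__ == "__main__":
    run_job(*sys.argv[1:4])
```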

  11. DUCKS Future Work • The basic framework is implemented. • Add a more robust file interface. • Implement DUCKS management of job status information (Queued, Running, Complete).

  12. Questions?
