
LCD Computing Setup




Presentation Transcript


  1. LCD Computing Setup • Server Specs and Setup • Desktop Cluster Organization • Physics Software Distribution • Plans and Schemes • Jeremy McCormick, Sergey Uzunyan, Guilherme Lima, et al.

  2. Server Hardware Specs • Dual-CPU Athlon: 2.133 GHz • 2 GB RAM • HD: 1 master disk, 1 secondary, 4 RAID disks • ~1 TB total storage

  3. Server Configuration • Red Hat 9 • kernel 2.4.20-13.9smp • hostname k2 • IP 131.156.85.141 • accessible via SSH

  4. Server Filesystem

  Filesystem           Size   Mount     Description
  /dev/hda1            13G    /         root fs
  /dev/hda3            19G    /home     home dirs
  /dev/hdb2            185G   /k2bkp    backup disk
  /dev/lcd/k2dist      9.9G   /k2dist   physics distribution
  /dev/lcd/k2work      145G   /k2work   project work dirs
  /dev/raid0/lcd_data  734G   /k2data   project data

  5. Conceptual Diagram

  6. NFS Structure

  Mount              Description
  /k2dist            physics software distribution
  /k2bkp             backup
  /home/$USERNAME    user home directory
  /k2work/$USERNAME  project work directory
  /k2data/$USERNAME  project data
  /sdisk/$MACHINE    workstation shared disks
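The mount table above implies client-side NFS mounts on each desktop node against the server k2 (named on slide 3). A hypothetical /etc/fstab fragment for a workstation might look like the sketch below; the mount options are illustrative assumptions, not taken from the actual setup (the /sdisk/$MACHINE exports go the other direction, from workstations to the cluster, and are omitted):

```
# /etc/fstab fragment on a desktop node (illustrative sketch, not the real config)
k2:/k2dist   /k2dist   nfs   ro,hard,intr   0 0
k2:/home     /home     nfs   rw,hard,intr   0 0
k2:/k2work   /k2work   nfs   rw,hard,intr   0 0
k2:/k2data   /k2data   nfs   rw,hard,intr   0 0
```

Mounting /k2dist read-only on clients would match its role as a shared, centrally managed software distribution.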

  7. Physics Software Distribution • simulation, analysis & event generation • shared binaries, libraries, scripts, includes, etc. • usable from any node in desktop cluster • common environment & setup scripts • /k2work/$USERNAME application builds possible • /k2dist isolates physics apps from Linux fs

  8. /k2dist Directory Structure

  Directory      Description          Contains
  apps           applications         directory tree with packages, libs, binaries
  bin            executables          shell scripts, binaries, symlinks to binaries
  config         configuration info   currently contains node list
  doc            documentation        application docs (pdf, html, ps, etc.)
  include        source includes      symlinks to include dirs
  install_files  installation files   install packages (tar.gz) in dirs
  lib            libraries            symlinks to libraries

  9. /k2dist/apps

  Analysis:          jas, root
  Event Generation:  pandora-pythia, peg
  Libraries:         aida, boost, cernlib, clhep, freehep, g4phys, geant4, lcio, Mesa, pegs4, xml4c
  Simulation:        lcdg4, mokka, tbeam
  Utilities:         david, dawn

  10. Useful /k2dist/bin Scripts
  • prjenv.sh: project environment; add ". /k2dist/bin/prjenv.sh" to ~/.bash_profile or ~/.cshrc
  • appenv.sh: application environment; sets Java vars, PATH, LD_LIBRARY_PATH, etc.; included by prjenv.sh
  • g4_5_2_p01env.sh: Geant4 environment vars; included by prjenv.sh
  • nodes.sh: prints node names, IPs, hostnames; e.g.: for n in `nodes.sh hn`; do ping $n; done
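As a rough sketch, a prjenv.sh-style setup script usually just exports a project root and prepends the distribution's bin and lib directories to the search paths. Everything below (the K2DIST variable name, the exact paths) is an assumption for illustration; the real /k2dist/bin/prjenv.sh may differ:

```shell
#!/bin/sh
# Illustrative sketch of a prjenv.sh-style environment script.
# K2DIST and the path layout are assumptions, not the real script.
K2DIST=${K2DIST:-/k2dist}
export K2DIST

# put distribution executables first on PATH
PATH="$K2DIST/bin:$PATH"
export PATH

# make shared libraries under /k2dist/lib visible to the dynamic linker
LD_LIBRARY_PATH="$K2DIST/lib${LD_LIBRARY_PATH:+:$LD_LIBRARY_PATH}"
export LD_LIBRARY_PATH
```

Sourcing such a script from ~/.bash_profile, as the slide suggests, gives every login shell on every node the same toolchain.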

  11. Simple Remote Job

  test_job.sh:

    # set project environment
    . /k2dist/bin/prjenv.sh
    # run job
    nohup testbeam -m /k2work/jeremy/run10.mac -o /k2data/jeremy/tb-test.txt &> /k2work/jeremy/tb-test.log &

  Start the job on node rio:
    [jeremy@lepton-physics jeremy]$ ssh rio /k2work/jeremy/test_job.sh

  Get the pid:
    [jeremy@lepton-physics jeremy]$ ssh rio pgrep testbeam
    22742

  Kill the job:
    [jeremy@lepton-physics jeremy]$ ssh rio kill 22742
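The ssh pattern on this slide (start the job, pgrep for its pid, kill it) can be wrapped in a small helper so the node name is typed once. This helper and its names (run_on, the RSH override) are an illustrative sketch, not part of the actual k2 distribution:

```shell
#!/bin/sh
# run_on NODE CMD...: run a command on a remote node via ssh.
# Hypothetical helper, not from /k2dist. RSH can be overridden
# (e.g. RSH=echo) to dry-run the command instead of connecting.
RSH=${RSH:-ssh}

run_on() {
    node=$1
    shift
    $RSH "$node" "$@"
}

# typical lifecycle, mirroring the slide:
#   run_on rio /k2work/jeremy/test_job.sh   # start the job
#   run_on rio pgrep testbeam               # find its pid
#   run_on rio kill 22742                   # stop it
```

Combined with nodes.sh from the previous slide, the same helper could loop a command over every cluster node.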

  12. k2 + Desktop Cluster • new user setup • common development platform • “standard” NICADD physics apps • no more desktop installs • ample storage area • shared datasets • centralized authentication • distributed computing

  13. Plans and Schemes • no more development on nicadd & individual desktops • setup & packaging scripts • migrate SIO-Server • batch computing (fbs, pbs) • physics software packages • framework applications • add more cluster nodes • world hegemony (or at least Western hemisphere)

  14. K2 says “Hello”
