Oxford University Particle Physics Unix Overview
Pete Gronbech, Senior Systems Manager and SouthGrid Technical Co-ordinator
Strategy
• Local Cluster Overview
• Connecting to it
• Grid Cluster
• Computer Rooms
Particle Physics Strategy: The Server / Desktop Divide
[Diagram: central servers (general purpose Unix server, Linux worker nodes, group DAQ systems, Linux file servers, web server) serving Windows XP and Linux desktops.]
Desktops: approximately 200 Windows XP desktop PCs, with Exceed, Xming or ssh used to access the central Linux systems.
Particle Physics Linux
• Aim to provide a general purpose Linux based system for code development, testing and other Linux based applications.
• Unix Admin (Ewan MacMahon).
• Interactive login servers and batch queues are provided.
• Systems run Scientific Linux, a free Red Hat Enterprise based distribution.
• All systems are currently running SL4, the same version as used on the Grid and at CERN.
• The systems will be migrated to SL5 in the coming months. Students are encouraged to test pplxint5 and let us know of any problems.
• Worker nodes form a PBS (aka Torque) cluster accessed via batch queues, as sketched below.
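As a rough illustration of how the batch queues are used, the sketch below shows a minimal Torque/PBS job script and how it might be submitted from an interactive node. The queue name "normal", the script name submit.sh and the executable run_analysis are placeholders rather than site-specific values; the real queue names can be listed with qstat -q.

    #!/bin/bash
    #PBS -N myanalysis       # job name
    #PBS -q normal           # assumed queue name -- list the real ones with "qstat -q"
    #PBS -l nodes=1:ppn=1    # one core on one worker node
    #PBS -j oe               # merge stdout and stderr into a single output file
    cd $PBS_O_WORKDIR        # start in the directory the job was submitted from
    ./run_analysis           # placeholder for your own program

Submit the script from one of the interactive nodes with "qsub submit.sh" and monitor it with "qstat -u $USER".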
Current Clusters
• Particle Physics local batch cluster
• Oxford's Tier 2 Grid cluster
PP Linux Batch Farm (upgrade Nov/Dec 08): Scientific Linux 4, 160 active slots
[Diagram: pplxwn worker nodes, mostly with 8 Intel 5420 cores each plus pplxwn18–21 with 8 Intel 5345 cores; interactive login nodes pplxint1–3; file servers pplxfs2–8 serving volumes of 4–19 TB including CDF and ATLAS data areas; the older pplxgenng (AMD 285, 2.6 GHz) server; one of the file servers is marked "Hacked".]
Strong Passwords etc
• Use a strong password that is not open to dictionary attack!
• fred123 – no good
• Uaspnotda!09 – much better
• Better still, use ssh with a passphrased key stored on your desktop, as sketched below.
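A minimal sketch of setting up a passphrase-protected key, assuming an interactive node such as pplxint2 and a placeholder account name; the fully qualified hostname shown is an assumption, not taken from these slides.

    # On your desktop: generate a key pair and give it a passphrase when prompted
    ssh-keygen -t rsa -b 4096 -C "your.name@physics.ox.ac.uk"

    # Copy the public half to the cluster (replace "username" and the host with your own details)
    ssh-copy-id username@pplxint2.physics.ox.ac.uk

    # Log in with the key; an agent (ssh-agent on Linux, Pageant on Windows) caches the passphrase
    ssh username@pplxint2.physics.ox.ac.uk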
PP Linux Batch Farm: Scientific Linux 5 migration plan
[Diagram: the same batch farm, with pplxwn20 and pplxwn21 being used as pilot SL5 worker nodes and pplxint5 as a pilot SL5 interactive node alongside the pplxint1–3 interactive login nodes.]
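If you want to try the SL5 pilot, a quick check is sketched below; the fully qualified hostname and account name are placeholders.

    ssh username@pplxint5.physics.ox.ac.uk   # pilot SL5 interactive node
    cat /etc/redhat-release                  # should report a Scientific Linux 5.x release
    uname -r                                 # kernel version, useful when reporting problems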
Cluster monitoring (Ganglia): http://pplxconfig.physics.ox.ac.uk/ganglia
Connecting with PuTTY: Demo
• Plain ssh terminal connection
• With key and Pageant
• ssh with X windows tunnelled to passive Exceed
• ssh, X windows tunnel, passive Exceed, KDE session
http://www.physics.ox.ac.uk/it/unix/particle/XTunnel%20via%20ssh.htm
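From a Linux or Mac desktop the same effect can be had with ssh's built-in X11 forwarding instead of PuTTY and Exceed; a sketch, with the hostname and account name as placeholders:

    ssh -X username@pplxint2.physics.ox.ac.uk   # -X enables X11 forwarding (-Y for trusted forwarding)
    xterm &                                     # any X client started on the remote node now displays locally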
SouthGrid Member Institutions
• Oxford
• RAL PPD
• Cambridge
• Birmingham
• Bristol
• JET at Culham
Oxford Tier 2 Grid Upgrade 2008
• 13 systems, 26 servers, 52 CPUs, 208 cores. Intel 5420 Clovertown CPUs provide ~540 kSI2K.
• 3 servers each providing 20 TB of usable storage after RAID 6, ~60 TB in total.
• One rack, 2 PDUs, 2 UPSs, 3 3COM 5500G switches.
Get a Grid Certificate
Remember to use the same web browser to request and retrieve the Grid certificate. Once you have it in your browser you can export it to the Linux cluster to run grid jobs. Details of these steps, and of how to request membership of the SouthGrid VO (if you do not belong to an existing group such as ATLAS or LHCb), are here: http://www.gridpp.ac.uk/southgrid/VO/instructions.html
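The linked instructions are authoritative; as a rough sketch of the export step, once the certificate has been saved from the browser as a PKCS#12 file (mycert.p12 is an illustrative name) and copied to the cluster, it is typically split into the PEM files that grid tools expect:

    mkdir -p ~/.globus
    openssl pkcs12 -in mycert.p12 -clcerts -nokeys -out ~/.globus/usercert.pem   # public certificate
    openssl pkcs12 -in mycert.p12 -nocerts -out ~/.globus/userkey.pem            # private key (keep it passphrased)
    chmod 444 ~/.globus/usercert.pem
    chmod 400 ~/.globus/userkey.pem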
Two New Computer Rooms provide excellent infrastructure for the future
The new computer room built at Begbroke Science Park, jointly for the Oxford Super Computer and the Physics department, provides space for 55 computer racks of 11 kW each, 22 of which will be for Physics. Up to a third of these can be used for the Tier 2 centre. This £1.5M project is funded by SRIF with a contribution of ~£200K from Oxford Physics. The room was ready in December 2007, and the Oxford Tier 2 Grid cluster was moved there during spring 2008. All new Physics high performance clusters will be installed here.
Local Oxford DWB Physics Infrastructure Computer Room
Completely separate from the Begbroke Science Park, a local Physics department infrastructure computer room with 100 kW of cooling and >200 kW of power has been built for ~£150K of Oxford Physics money. It was completed in September 2007. This allowed local computer rooms to be refurbished as offices again, and racks that were in unsuitable locations to be re-housed.
The end for now…
• Ewan will give more details on using the clusters next week
• Help pages
• http://www.physics.ox.ac.uk/it/unix/default.htm
• http://www.physics.ox.ac.uk/pp/computing/
• Questions…
• Network topology (following slides)
Network
• Gigabit connection to campus operational since July 2005.
• Second gigabit connection installed September 2007.
• Dual 10 gigabit links installed August 2009.
• Gigabit firewall installed for Physics. A commercial unit (a Juniper ISG 1000 running NetScreen) was purchased to minimise the manpower required for development and maintenance.
• The firewall also supports NAT and VPN services, which is allowing us to consolidate and simplify the network services.
• Moving to the firewall NAT has solved a number of problems we were having previously, including unreliable videoconferencing connections.
• Physics-wide wireless network installed in DWB public rooms, Martin Wood, AOPP and Theory. The new firewall provides routing and security for this network.
Network Access
[Diagram: SuperJanet 4 connection (2 × 10 Gb/s, with SuperJanet 5), campus backbone router, backbone edge routers, OUCS firewall, Physics firewall and Physics backbone router, with links of 10 Gb/s, 1 Gb/s and 100 Mb/s down to departments.]
Physics Backbone
[Diagram: Physics firewall and Physics backbone router with a 1 Gb/s link to a server switch carrying Linux and Win 2k servers, and 1 Gb/s / 100 Mb/s links to Particle Physics, Clarendon Lab, Astro, Atmos and Theory, with desktops on 100 Mb/s connections.]