The University of Florida's Research Computing initiative aims to advance research and scholarship through high-performance computing and big-data resources. We provide support, training, and consulting services to UF researchers and help improve competitiveness for external funding. Our infrastructure includes over 7,300 cores, fast scratch storage, and dedicated clusters. The Matching Program helps faculty secure hardware for academic needs, and faculty members can purchase computing units at competitive prices, with comprehensive management and 24/7 monitoring that relieves them of most system administration tasks.
UF Research Computing: Introduction and Getting Started
Erik Deumens, Research Computing
Mission • Create opportunities for research and scholarship • Improve competitiveness in securing external funding • Provide UF researchers with • High-performance computing and big-data storage resources • Support, training, and consulting services
Funding and operation • Funding for people and machines • Provost, VP Research, CIO, Deans • Faculty participation for hardware purchases • Research Computing Matching Program • Limited use for any UF faculty member • Up to 8 cores at the same time • Buying in gives access to, and priority on, additional resources • Comprehensive management • Hardware maintenance and 24x7 monitoring • Relieves researchers of most systems administration tasks
Matching Program 2012 • Same $300K budget as before • $250K already committed (as of Nov 2012)
Services offered • NCU, Normalized Compute Unit • 1 core in a node, with RAM, network, and access to 230 TB of fast scratch storage • Acquired for 5 years • Cost: $400 per NCU • RSU, Replicated long-term Storage Unit • 1 TB for 1 year • Cost: $250 per RSU • LSU, Long-term Storage Unit • 1 TB for 1 year • Cost: $125 per LSU • A worked cost example follows below
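To make the price list concrete, here is a minimal worked cost example. Only the per-unit prices come from the list above; the quantities (32 NCUs, 10 TB of replicated storage, 20 TB of long-term storage) are made up purely for illustration.

```bash
# Hypothetical purchase, for illustration only:
#   32 NCUs (32 cores for 5 years), 10 TB of RSU for 1 year, 20 TB of LSU for 1 year
ncus=32
rsu_tb=10
lsu_tb=20
total=$(( ncus*400 + rsu_tb*250 + lsu_tb*125 ))
echo "Estimated cost: \$${total}"    # prints: Estimated cost: $17800
```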
Resources • Hardware Resources • Over 7,300 cores, AMD and Intel • High-speed, low-latency InfiniBand interconnects • >1.5 PB of high-performance Lustre and Nexenta storage • Dedicated MPI cluster for 64-way jobs and larger, up to 512-way • 80 GPUs for CUDA code, etc. • Several large-memory nodes (512 GB, 1 TB)
Coming in 2013 • Internet2 Innovation Platform • Campus connection to the Internet • Upgrade from 10 Gbps to 100 Gbps • Campus Research Network from 20 to 200 Gbps • New Data Center on the Eastside Campus • 5,000 sq. ft. of space for research computing • 10,000 sq. ft. and 1.75 MW total • New cluster • 16,000 AMD cores • InfiniBand interconnect • 2.8 PB (raw) of fast, high-availability storage
Usage priorities • Investors get fast access • To the number of cores they bought • Within minutes, to at most an hour • Check that your job can actually run (see the sketch after this list) • Also check that others in the group have not used all the slots • Can burst up to 10 times that number of cores if cores are idle • But not continuous use of 10 times that number • Non-investors can use • Up to 8 cores simultaneously • Lower priority than investors for starting jobs
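As a practical aid for the "check before you submit" advice above, the sketch below assumes a PBS/Torque-style scheduler with the standard qstat command; the actual scheduler, commands, and output columns on the UF systems may differ, so treat it only as an illustration.

```bash
# List your own jobs (assumes a PBS/Torque-style qstat; adjust for the actual scheduler)
qstat -u $USER

# Rough count of running jobs on the cluster: in default qstat output the 5th
# column is the job state, and "R" means running. Column positions may differ.
qstat | awk '$5 == "R"' | wc -l
```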
Getting started • User Accounts • Qualifications: • Current UF faculty, graduate students, and researchers • Request at: http://www.hpc.ufl.edu/support/ • Requirements: • GatorLink authentication • Faculty sponsorship for graduate students and researchers
Help and support • Help Request Tickets • https://support.hpc.ufl.edu • For any kind of question or help request • Searchable database of solutions • We are here to help! • support@hpc.ufl.edu
Documentation • http://wiki.hpc.ufl.edu • Documents on hardware and software resources • Various user guides • Many sample submission scripts (a minimal sketch appears below) • http://hpc.ufl.edu/support • Frequently Asked Questions • Account setup and maintenance
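The wiki's sample submission scripts are the authoritative reference; the sketch below only illustrates what such a script typically looks like, assuming a PBS/Torque-style scheduler. The job name, resource limits, module names, program name, and e-mail address are placeholders, not actual UF settings.

```bash
#!/bin/bash
#PBS -N myjob                    # job name (placeholder)
#PBS -l nodes=1:ppn=8            # 1 node, 8 cores (the non-investor limit)
#PBS -l walltime=04:00:00        # 4 hours of wall-clock time
#PBS -l pmem=2gb                 # memory per core
#PBS -M your_gatorlink@ufl.edu   # e-mail for job notifications (placeholder)
#PBS -m abe                      # mail on abort, begin, and end

cd $PBS_O_WORKDIR                # start in the directory the job was submitted from
module load intel openmpi        # module names are assumptions; check the wiki
mpiexec -np 8 ./my_program       # 8-way MPI run; program name is a placeholder
```

Such a script would typically be submitted with qsub (for example, `qsub myjob.pbs`) and monitored with qstat, as in the earlier sketch.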