
Campus Network




Presentation Transcript


  1. Campus Network: Do the (B)right things (CS group meeting)

  2. Main Objectives for 2002
  • Upgrade of the Accelerator Network & LHC network design
  • Completion of S.C. phase II
  • Selection, configuration and operation of IP appliances
  • LHC computing prototypes
  IT/CS Campus Network structure by team:
  • CN (jmj)
  • Infrastructure (CM, ES, JS, MD): passive/active infrastructure logistics and CFT, S.C. phase II, Accelerator Network infrastructure upgrade, experimental areas & pits, network extensions
  • Software Support & Engineering (DF, MZ, NG): product evaluation and integration, DNS, DHCP, CFMGR (device configuration), accelerator active infrastructure logistics & CFT
  • 513C & LHC computing (JAB, MC): Computer Center and backbone network design, product validation, Data Challenges, OpenLab
  • Interactions with IT services: Remedy workflow
  • NSO (Ovdv): ST-EL-FO, SL-CO, TCR relations
  • Network Consultation Office (AR, IG + DCS team): first-line support, contract follow-up, database consistency, network analysis
  • Software for Operation (EV, IL, NT, PK): LANDB, MTP, WEBREQ, SPECTRUM, Remedy support

  3. In practice…
  • CN and NSO are now so intertwined that it was difficult to split the presentation between them.

  4. CN Section: Marc Collignon, Maryse Da Costa, Daniel Francart, Jacques Anthonioz-Blanc, Marianna Zuin, Nick Garfield, Claude Mertina, Jean Simeoni, Eric Sallaz

  5. Infrastructure
  • A lot was done:
    • Completed structured cabling phase II
    • Started the LHC control installations
    • Two proposals for the PS/SPS/TCR rejuvenation
    • Preparation of the PS rejuvenation
    • … and coped with (always urgent) user requests
  • Despite:
    • Logistics problems
    • Environment (e.g. power)
    • Budget constraints

  6. Software Support and Engineering
  • Studies of “real time” constraints for the LHC
  • Preparation for wireless deployment & management
  • Call for tenders (routers and switches)
  • Evaluation of new industrial switches
  • Definition of a common service platform (Linux)

  7. Computer Center, Backbone, Openlab, …
  • End of FDDI: history moves on
  • End of 128.141, after 7 years of common and intensive effort
  • Towards an IP-only network: studies have started
  • Preparation of the Computer Center move to the Vault: this is not just a DEMECO problem
  • Openlab: 10 Gigabit Ethernet tests, and outstanding TESTBED performance results…

  8. Bravo!

  9. What next? For the CN and NSO sections, many projects have started.

  10. Technical Network implementation
  • We managed to convince, and got agreement
  • Huge logistics problem: call for tenders, deliveries, stock management, etc.
  • A lot to install!
  • Adaptation of services: new switches, database & tools extensions
  • New features to study & deploy: clock distribution, multicast, QoS, …
  • Integrated, with the GPN, into one single 24x24, 365x365 support schema

  11. And LHC physics!!! …
  • The LHC project is not only the machine
  • Physics at the pits: very large and complex systems to install, starting end 2003 for the (large) infrastructures
  • Computer Center farms challenge: solutions for installing the network for 10,000s of nodes, and operating such large systems

  12. Services High Availability
  • GPN backbone redundancy: implemented
  • DHCP: redundant services done
  • Technical Network: part of the challenge
  • CSAM
  • ST/EL: better power feeds…
  • DNS: soon to improve
  • Spectrum everywhere, 24x24 (see the availability-probe sketch below)
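As a rough illustration of what a 24x24 high-availability check implies in practice, the sketch below probes each server of a redundant pair and reports which ones answer, using DNS as the example service. It is a minimal sketch only: the dnspython library, the placeholder server addresses and the test name are assumptions, not a description of the actual setup.

```python
# Minimal sketch: probe each server of a redundant pair and report its status.
# Assumes dnspython is installed; addresses and test name are placeholders.
import dns.resolver

SERVERS = ["192.0.2.10", "192.0.2.11"]   # hypothetical primary / secondary
TEST_NAME = "www.example.org"            # a name both servers should resolve

def probe(server: str) -> bool:
    """Return True if the server answers an A query within two seconds."""
    resolver = dns.resolver.Resolver(configure=False)
    resolver.nameservers = [server]
    resolver.lifetime = 2.0              # fail fast if the server is unreachable
    try:
        resolver.resolve(TEST_NAME, "A")
        return True
    except Exception:
        return False

if __name__ == "__main__":
    for srv in SERVERS:
        print(f"{srv}: {'up' if probe(srv) else 'DOWN'}")
```

Run from a scheduler, a check of this kind is the sort of input a round-the-clock monitoring service can alert on.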

  13. Soon to come
  • Better network access control
  • Pilot of high-density 100Base-T to the desk
  • All database-based applications move to web interfaces, including MTP (yes, really!)
  • LANDB2 in January
  • LANDB3 in mid-2003 will fully integrate Network/Register and LANDB2
  • All services moved to Intel/Linux
  • End of NCDs
  • DXCOMS off…

  14. Thank you! Questions?

  15. The place of NSO? [diagram: NSO within the Campus Network]

  16. The people (1): Italo Gard, Pavel Krysin, Ignacio Leon, Alasdair Ross, Tamara Smoliakova, Nikolaos Trikoupis, Evgeuni Vedeniapine, Olaf van der Vossen (photo: http://it-div-cs.web.cern.ch/it-div-cs/public/di/nso_photo.html)

  17. The people (2): Benoit Clement, Sylvestre Catin, Thomas Nederman, Hector Guajardo, David Parra

  18. Services (Network Services & Operation)
  • WEBREQ
  • LANDB
  • Network management
  • Operation: NETOPS & First Line (DCS)
  • SLNET support
  • Wireless

  19. Achievements in the last year
  • We survived the departure of 3 people!
  • Version one of the LANDB and WEBREQ changes almost ready.
  • Spectrum now firmly in the hands of Nikos.
  • Network operation OK, most users satisfied.
  • First line without major problems.
  • SL network operation OK, upgrade started.
  • Wireless (production) deployment started.
  • CS web pages maintained by Tamara.

  20. WEBREQ changes
  • New web interface
  • Logic differentiates: interface card / network service
  • Signature: NICE login
  • N interface cards
  • Wireless
  • New fields: for Technet, IT Division layout

  21. LANDB2 changes
  • LANDB / MANUTPtools merge.
  • SLNET-DB absorbed.
  • SLA tools integrated.

  22. Network Management
  • Spectrum everywhere!
  • All services managed by Spectrum only
  • More redundancy, new architecture and hardware, 24x24 service
  • PCALARM off…
  • End of HP OpenView => port the SLNET functionality.
  • Non-specialist interface (Helpdesk, TCR, operators)
  • Network statistics: for network services, integrated into WEBREQ v5 (see the SNMP sketch below)
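The network statistics mentioned above ultimately come from per-port counters that switches and routers expose over SNMP, which is also the kind of data a platform like Spectrum collects. The sketch below reads the standard IF-MIB traffic counters for one interface; it assumes the pysnmp library, and the host address, community string and interface index are illustrative placeholders rather than details of the real installation.

```python
# Sketch: read interface traffic counters over SNMP (the raw data behind
# per-port statistics). Host, community and ifIndex are placeholders.
from pysnmp.hlapi import (SnmpEngine, CommunityData, UdpTransportTarget,
                          ContextData, ObjectType, ObjectIdentity, getCmd)

HOST, COMMUNITY, IF_INDEX = "192.0.2.1", "public", 1   # hypothetical switch

def get_counter(oid: str) -> int:
    """Fetch a single integer-valued OID with an SNMPv2c GET."""
    error_ind, error_status, _, var_binds = next(getCmd(
        SnmpEngine(),
        CommunityData(COMMUNITY, mpModel=1),            # SNMPv2c
        UdpTransportTarget((HOST, 161), timeout=2, retries=1),
        ContextData(),
        ObjectType(ObjectIdentity(oid))))
    if error_ind or error_status:
        raise RuntimeError(f"SNMP query failed: {error_ind or error_status}")
    return int(var_binds[0][1])

in_octets = get_counter(f"1.3.6.1.2.1.2.2.1.10.{IF_INDEX}")   # IF-MIB::ifInOctets
out_octets = get_counter(f"1.3.6.1.2.1.2.2.1.16.{IF_INDEX}")  # IF-MIB::ifOutOctets
print(f"ifIndex {IF_INDEX}: in={in_octets} octets, out={out_octets} octets")
```

Polling counters like these at regular intervals and storing the deltas is enough to derive the per-service traffic figures that a web front end such as WEBREQ could then display.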

  23. NETOPS & First Line
  • NETOPS:
    • Continue the service to the users.
    • Adopt the new tools.
    • Service the Technet and the legacy 128.142 (in 9 months from now?)
  • First Line:
    • Improve the SLA, reporting and statistics.
    • Technet integrates the first line.
    • Massive new installations in the Computer Center.

  24. SLNET support
  • 24x24 support and interventions on the remaining 128.142.
  • Coordinate the transition to the Technet with the users.
  • Assist the CN teams in preparing the eradication of the 128.142 in PS, TCR and SPS.

  25. Wireless
  • Deployment: speed up deployment in public meeting rooms.
  • Manage “private” base stations sold by the CS group.
  • Assistance.
  • NETOPS.

  26. Things to come
  • Replacement for Nacho, who will leave in the spring.
