
Controls: The roadmap towards LHC controls


Presentation Transcript


  1. Effects of the compressed schedule. Controls: The roadmap towards LHC controls. Rüdiger Schmidt and Robin Lauckner, “Chamonix” 2005, on behalf of the Controls Group and collaborators. Outline: Where to go – LHC beam operation; LHC controls; Hardware commissioning and efficient controls; Milestones – past and future; Conclusions.

  2. Effects of the compressed schedule. Controls: The roadmap towards LHC controls. Rüdiger Schmidt and Robin Lauckner, “Chamonix” 2005, on behalf of the Controls Group and collaborators. Perspective of a user of the controls … NOT as a controls expert.

  3. AB-CO meeting (AB-CO-Day) in December. Considering • the compressed LHC schedule • the concurrent requirements from LEIR, CNGS and the CCC • other developments, prompt availability of the Control System becomes even more critical. The meeting had the following aims: • to ensure that the Controls Systems are ready on time • to establish the goals and milestones for 2005 and 2006 • to prepare for 2007 and 2008. The approach is a progressive deployment of solutions to meet successive milestones across CERN. The workshop is intended to report status, to expose problems if any, and to help provide written documents for the controls clients. Link: http://ab-div-co.web.cern.ch/ab-div-co/coday.htm

  4. This presentation and the AB-CO-Day. This presentation is based on the material presented during the AB-CO meeting (CO-Day). It does NOT address controls issues in other groups (only a few remarks)! B.Frammery, P.Charrue, M.Vanden Eynden, F.Di Maio, C.H.Sicard, E.Hatziangeli, R.Billen, R.Lauckner, P.Gayet, M.Gourber-Pace, V.Baggiolini, K.Kostro, J.L.Nougaret, J.Lewis, M.Peryt, C.Roderick, K.Sigerud, L.Mestre, H.Milcent, M.Zerlauth, M.Zaera Sanz, R.Denz, J.Casas-Cubillos, R.Gavaggio, P.Gomes, I.Laugier, M.Lamont, S.Page … and many others inside and outside AB-CO who contributed to the work.

  5. Summary of AB/CO main milestones (labels recovered from the planning chart): large electrical circuit commissioning (HWC proper); Quench Protection equipment tests; LHC ready for beam; DFB test; LHC sector test with beam; TT40 – TI8 test; LSS8L (“String 3”); start SPS with fast SC change; QRL start of commissioning; TI2 commissioning; SPS new timing; prototype Real Time orbit feedback; LEIR beam commissioning; CNGS beam commissioning; prototype collimator control; CCC ready; CNGS HW commissioning.

  6. LHC controls: LHC ready for beam.
Beam control: • Transfer lines • Injection and Extraction (beam dumping system) • Beam optics controls • Beam instrumentation • RF • Beam interlocks • Collimation • Real Time orbit feedback • Radiation monitors.
Hardware control: • Power converters • Quench protection system • Cryogenics system • Powering interlocks • Vacuum systems • Uninterruptible Power Supplies (UPS), Arret Urgence Generale (AUG) • Safety systems and general services.

  7. LHC controls: prototyping (these include the main systems for Hardware Commissioning).
Beam control: • Transfer lines -> TT40 and TI8 • Injection and Extraction (beam dumping system) -> SPS to TT40 • Beam optics controls -> TT40 and TI8 • Beam instrumentation -> TT40 and TI8 • RF • Beam interlocks -> TT40 and TI8 • Collimation -> SPS and TT40, prototyping but with stand-alone controls • Real Time orbit feedback -> SPS • Radiation monitors.
Hardware control: • Power converters • Quench protection system • Cryogenics system -> LHC cryoplants (not for ring cryogenics) • Powering interlocks -> TT40 and TI8 (for normal conducting magnets) • Vacuum systems -> SPS, TT40 and TI8 • Uninterruptible Power Supplies (UPS), Arret Urgence Generale (AUG) • Safety systems and general services (display of TI8 tunnel temperatures by the normal conducting magnet supervision using PVSS).

  8. Hardware for LHC controls provided by AB-CO. Frontends and gateways: computers (PLCs, PCs, VME systems). Networks: • WorldFIP network and Profibus • Ethernet (provided by IT, but needing coordination and testing). Field Control Rooms and servers: consoles, displays, … Specific electronics (some of it being developed): • timing modules • Beam and Powering interlock electronics (VME, PLC and custom) • modules for generation and distribution of Safe LHC Parameters (VME). Cables, cables and cables. HW installation planning is being addressed; the planning for sectors 7-8 and 8-1 will be confirmed by end January. An installation body will be in place before end January to • define detailed tasks and responsibilities (cable verifications, HW procurement, installation, functional testing, etc.) • hold weekly follow-ups linked to the aforementioned planning.

  9. HARDWARE INSTALLATION (controls group): ~300 VME systems; ~150 rackable PC “gateways” and the WorldFIP infrastructure (200 km of cabling, ~40000 passive and ~1100 active elements); timing system infrastructure (timing distribution via copper cables and optical transmission, VME MTGs in the CERN Control Centre); remote reboot service for PC gateways, PLCs and field interfaces (~30 Schneider PLCs for the reset of ~300 systems) plus local cabling; terminal service for PC gateways and VME crates (terminal server boxes in SRx + …) plus local cabling; local consoles (SRx and underground, based on the PC gateway platform). Well under way, but this is most urgent and there is no time to lose!

  10. PC Gateways, VME CPUs and Remote Reboot in stock M.Vanden Eynden for the AB/CO/HT section

  11. Software services and deployment • Logging • Post Mortem • Timing • Transient recording (OASIS) • Alarms (LASER)

  12. Software services and deployment • Logging: TT40/TI8 extractions, SPS vacuum • Post Mortem: some experience from TT40, SM18 quench analysis – reuse? • Timing: PS and SPS • Transient recording (OASIS): TT40 and TI8 • Alarms (LASER)

  13. Software services and deployment • Logging: TT40/TI8 extractions, SPS vacuum • Post Mortem: some experience from TT40, SM18 quench analysis – reuse? • Timing: PS and SPS • Transient recording (OASIS): TT40 and TI8 • Alarms (LASER) • LSA Project: developing generic application software, a collaboration of AB-CO and AB-OP (LSA Team) • FESA: a collaboration of AB-CO and BDI (FESA Team) • CMW: Controls Middleware, AB-CO • PVSS / UNICOS: Supervisory Control and Data Acquisition (PVSS as a commercial product, UNICOS as a home-made framework for PVSS)

  14. Software services and deployment • Logging: TT40/TI8 extractions, SPS vacuum • Post Mortem: some experience from TT40, SM18 quench analysis – reuse? • Timing: PS and SPS • Transient recording (OASIS): TT40 and TI8 • Alarms (LASER) • LSA Project: magnet control, beam instruments, orbit control, driving fixed displays, … • FESA: equipment access – Beam Interlocks, OASIS, … • CMW: communication glue between “everything” • PVSS / UNICOS: vacuum, cryogenics, magnet interlocks, display of data from cooling and ventilation

  15. LHC complexity and data management (diagram, inspired by R.Billen, AB-CO and TS): LHC layout (mechanical, optical, DC powering); physical equipment (equipment catalogue, serial numbers); controls configuration (hardware topology, software topology, RackWizard, ABCAM – a portal to capture assets and installation data); operational data (settings, measurements, logging, post mortem).

  16. (Same data-management diagram as slide 15.) Everything should fall into place: consistent data management is a great challenge. Several critical milestones are upcoming (in January and February). Naming of LHC entities and signals is an important issue => see the conventions (e.g. EDMS 473091).

  17. Hardware commissioning.
Before starting – QRL commissioning: test of a section of the QRL; vacuum, cryogenics.
The first part, all main systems together – type test of the magnet feedbox (DFBA) commissioning: test with 4 electrical circuits in SM18; vacuum, cryogenics, (power converters), quench protection, powering interlocks.
The fun part – first magnets in the LHC, LSS8L (“String 3”) commissioning: commissioning of 2 powering subsectors with 29 electrical circuits; vacuum, cryogenics, power converters, quench protection, powering interlocks, UPS and AUG.
The tough part (HEP community impatiently waiting…) – all the rest: commissioning of 26 powering subsectors with 1585 electrical circuits.

  18. Hardware Commissioning … controls perspective. Commissioning of the (superconducting) magnet powering system. Objectives: • safe magnet powering operational • ramping of magnets works • magnets can operate at high field reliably for several hours • qualify electrical circuits and capture information • commission and validate controls for operational powering of LHC magnets (… everything that can be done without beam). To be performed in two phases: power converters not connected to magnets; power converters connected to magnets. Driven by the Hardware Commissioning WG; the software is discussed in the SACEC team (Software Application for Commissioning of Electrical Circuits), HW and SW experts (F.Rodriguez-Mateos / R.Lauckner et al.).

  19. (Sketch of circuit current versus time during commissioning.) First current ramp for all circuits with I > 120 A; ramp to nominal current (global protection ok); observe the voltage drop across the entire circuit – it should be constant and close to zero when I is constant; discharge from nominal current (energy extraction ok); minimum quench current; minimum current (about 2% of nominal). Scaling from String 2: commissioning the circuits one by one would take 15 years for the LHC – a substantial increase of efficiency is essential. Efficient controls allow commissioning of several circuits in parallel.

  20. Key to efficiency: automated testing. Detailed specs are being written by SACEC (M.Zerlauth, B.Puccio, R.Denz, H.Thiesen, S.Page, M.Zaera Sanz, H.Milcent, F.Chevrier); implementation by AB-CO based on existing tools. (Architecture diagram: the automated test procedures talk via PVSS-CMW interfaces and CMW over Ethernet to the Quench Protection PVSS supervision, the Powering Interlocks PVSS supervision and the power converter gateway, which reach the QPS hardware, the PIC hardware and the power converters over WorldFIP.)
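To make the idea of an automated test procedure more concrete, here is a minimal Java sketch. It is not the real SACEC / PVSS / CMW implementation: the PowerConverter, QuenchProtection and PoweringInterlock interfaces, their methods, the thresholds and the circuit name are hypothetical stand-ins; only the sequence of steps (first ramp above 120 A, ramp to nominal, check that the voltage drop stays close to zero at constant current, discharge with energy extraction) follows slide 19.

```java
// Hedged sketch of an automated circuit test sequence; NOT the real SACEC / PVSS / CMW code.
public class CircuitTestSketch {

    // Hypothetical stand-ins for the supervision layers named on slide 20.
    interface PowerConverter   { void ramp(double amperes); double circuitVoltage(); }
    interface QuenchProtection { boolean energyExtractionOk(); }
    interface PoweringInterlock { boolean powerPermit(); }

    /** Runs the slide-19 sequence on one circuit; several circuits could run in parallel threads. */
    static boolean commissionCircuit(String name, PowerConverter pc, QuenchProtection qps,
                                     PoweringInterlock pic, double nominalAmps)
            throws InterruptedException {
        if (!pic.powerPermit()) return fail(name, "no power permit from PIC");

        pc.ramp(125.0);                                  // first current ramp, I > 120 A
        Thread.sleep(1_000);

        pc.ramp(nominalAmps);                            // ramp to nominal current
        Thread.sleep(1_000);
        // At constant current the voltage drop across the circuit should be close to zero.
        if (Math.abs(pc.circuitVoltage()) > 0.1) return fail(name, "unexpected voltage drop");

        pc.ramp(0.02 * nominalAmps);                     // discharge towards ~2% of nominal
        if (!qps.energyExtractionOk()) return fail(name, "energy extraction not confirmed");

        System.out.println(name + ": sequence passed");
        return true;
    }

    static boolean fail(String name, String reason) {
        System.out.println(name + ": FAILED - " + reason);
        return false;
    }

    public static void main(String[] args) throws InterruptedException {
        // Dummy in-memory implementation so the sketch runs stand-alone.
        PowerConverter pc = new PowerConverter() {
            public void ramp(double a) { System.out.println("ramp to " + a + " A"); }
            public double circuitVoltage() { return 0.0; }
        };
        commissionCircuit("CIRCUIT.TEST.1", pc, () -> true, () -> true, 11_850.0);
    }
}
```
The point of such a structure is that one generic procedure, parameterised by circuit, can be launched for many circuits in parallel, which is exactly the efficiency gain the slide argues for.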

  21. Major controls contributions from other groups to LHC Hardware commissioning • Vacuum system (controls by AT-VAC, Alarms and Logging from AB-CO) • Quench protection system (controls shared by AT-MEL and AB-CO) • Power converters (controls shared by AB-PO and AB-CO) • Cryogenics system (controls shared by AT-ACR and AB-CO) • A worry is the manpower for writing software specifications and then translating the specs into PLC and PVSS code, and for controls commissioning; this is clearly aggravated by several activities running in parallel • Many other tools developed by CO are being used; clarifications between AT-ACR and AB-CO (remote reset, rack wizard, …) are underway.

  22. Milestones and main requirements. (Table plotting tools and milestones against time and the hardware commissioning (HC) schedule, not reproduced here.) The table is not complete; it shows only the most relevant tools and milestones!

  23. Logging, Shot-by-Shot Logging, Alarms, Post Mortem recording. All these systems store data “for certain variables”… Post Mortem analysis uses data from all these systems and other data (e.g. from LSA, databases, …)!

  24. Post Mortem: beam incident during the TT40 tests. Data acquired from the Shot-by-Shot and WIC Logging clients proved essential for the post-mortem analysis of the TT40 high-intensity beam test accident. (Plot: magnetic septum current change from the Logging System versus time within the SPS super-cycle, together with a reconstructed power converter simulation (PC off) by AB/PO; J.Wenninger, AB-OP.)

  25. Logging during the TT40/TI8 tests: transient recording with T = 1 ms (horizontal zoom) – used as post mortem.

  26. Example: SPS and LHC transfer lines vacuum control system (I.Laugier, AT-VAC, AT-VAC/IN section). (Screenshots: synoptic of the SPS complex; pressure profile in LHC TI8; vacuum layout of LHC TI8; vacuum sectorisation of LHC TI8.)

  27. Example: applications deployed in 2004 for the cryogenics system (Ph.Gayet, AB-CO; AB-CO-IS; AT-ACR). (Deployment map across Point 1.8, Point 2, Point 4, Point 6, Point 8 and the central installation, from the surface buildings and areas (HP storage, QSC, QSCA, QSCB, QSCC, LN2) through the shafts (QSRA, QSRB) and caverns (QUI, QURA, QURC) to the QRL in tunnel sectors 1-2 to 8-1. Status per installation: to be deployed; deployed Concept/Unity V1/PCVue; deployed Concept/Unity V1/PVSS 2.12; deployed ABB; deployed Unity V1.)

  28. Power Converter Controls Architecture (S.Page, AB-PO). (Diagram: an operator console runs a client application (JAPC) that sends commands over the controls Ethernet via CMW to a LynxOS PC gateway (FGC gateway); the gateway exchanges status and commands with ~30 Function Generator Controllers (FGCs), one per power converter, over a WorldFIP bus with a 50 Hz cycle. Note on the slide: 2 ppm = 14 MeV of full current.)
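As an illustration of the client/gateway split on this slide, here is a small Java sketch. It deliberately does not use the real JAPC, CMW or FGC APIs: the FgcGateway interface, its methods and the converter name are hypothetical; only the command/status split and the 50 Hz cycle quoted on the slide are taken from the source.

```java
// Hedged illustration of the console-to-gateway split; not the real JAPC / CMW / FGC API.
import java.util.concurrent.Executors;
import java.util.concurrent.ScheduledExecutorService;
import java.util.concurrent.TimeUnit;

public class ConverterClientSketch {

    /** Hypothetical stand-in for the LynxOS PC gateway as seen through the middleware. */
    interface FgcGateway {
        void sendCurrentSetting(String converter, double amperes); // command path
        double readCurrent(String converter);                      // status path
    }

    public static void main(String[] args) {
        FgcGateway gw = new FgcGateway() {             // dummy in-memory gateway for the sketch
            private double last;
            public void sendCurrentSetting(String c, double a) { last = a; }
            public double readCurrent(String c) { return last; }
        };

        gw.sendCurrentSetting("CONVERTER.DEMO.1", 100.0); // hypothetical converter name

        // Poll status at the WorldFIP cycle rate quoted on the slide (50 Hz = every 20 ms).
        ScheduledExecutorService poller = Executors.newSingleThreadScheduledExecutor();
        poller.scheduleAtFixedRate(
                () -> System.out.println("current: " + gw.readCurrent("CONVERTER.DEMO.1") + " A"),
                0, 20, TimeUnit.MILLISECONDS);

        poller.schedule(poller::shutdown, 1, TimeUnit.SECONDS); // stop the demo after 1 s
    }
}
```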

  29. PVSS Temperature Supervision – Warm magnet interlocks for TI8 M.Zaera-Sanz AB-CO

  30. Comments • Controls is for the LHC and for all other accelerators at CERN • As for the hardware systems, controls differ between accelerators • There were THREE controls groups, now there is ONLY ONE • A limited variety of solutions emerged / have been selected – standardisation as far as is reasonable • The strategy of implementing these solutions progressively started in 2003 (TT40) and 2004 (SPS and TT40/TI8), with many intermediate milestones on the way to LHC beam commissioning. For hardware commissioning, some issues I consider to be more critical than others… • getting the controls hardware installed and tested • post mortem recording and analysis • automatic test procedures • integration of controls efforts with other groups

  31. Conclusions. Hardware commissioning – the best validation of controls is via QRL tests -> QPS surface tests -> DFB tests -> “String 3” -> the rest. Beam commissioning – the best validation of beam controls is via TI8 / TT40 -> LEIR -> SPS -> sector test -> TI2 -> LHC. This gives us the chance to start the most critical phase of hardware commissioning with well-tested and efficient controls. Develop / implement / commission, if possible well ahead of the date when controls are needed with very high efficiency…! Dry runs for all critical systems!

  32. Reserve slides

  33. Risks. I do not see any showstoppers, but… • Unprecedented complexity of the LHC and its controls (so limit controls complexity whenever possible) • Suppression of intermediate milestones • Discussing controls philosophy in principle instead of going ahead with what can be done with what we have • Wrong priorities – diverting from the essential • Squeezing out the last 5% (forgetting other issues) – diverting from the essential

  34. Abstract. Controls for beam operation are similar to those of the other CERN accelerators. Building on the experience gained in the TI8 tests, and with milestones such as the commissioning of LEIR and TI2, the controls for the LHC will be realised with the same building blocks as used today. Hardware commissioning will provide a different challenge, with a large-scale use of industrial solutions for accelerator control. Already long before first injection into the LHC, the controls for vacuum, cryogenics, quench protection, powering interlocks and power converters must be fully operational. Current plans for the installation and commissioning of the controls infrastructure will be explained and the main aspects highlighted. The generic controls facilities required for the commissioning of the hardware and for beam operation will be discussed, with emphasis on how the work advances in the different domains (post-mortem, logging, alarms, timing, specific applications, etc.). The requirements, in particular for Hardware Commissioning, and how they translate into existing controls and / or technical specifications will be addressed. The presentation is based on the summary of an AB-CO meeting (CO-Day) that was organised in December.

  35. Supervision of the Beam Interlock Controller (BIC), based on Java. (Architecture diagram. On the VME front end: the BIC hardware, the BIC driver and I/O library, a 1 Hz real-time task, a message queue, a shared-memory segment holding the on-line buffer (inputs + time), and a CMW server (equipment software, RDA, CORBA) with its subscribers list and a data-ready handshake. On the i-th client PC running on a console: the BIC console application (Java) runs in a JVM and uses JAPC / RDA / CORBA with GET( ), SET( ) and Monitor[On, Off]; resets etc. are sent back to the controller.)
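The console side of this diagram can be sketched in Java as follows. The DeviceProperty interface, the connect() helper and the device/property name are hypothetical placeholders, not the real JAPC/RDA API; the sketch only illustrates the GET( ), SET( ) and Monitor[On, Off] pattern and the 1 Hz publication mentioned on the slide.

```java
// Hedged sketch of the GET/SET/Monitor pattern from the slide; not the real JAPC / RDA API.
import java.util.function.Consumer;

public class BicConsoleSketch {

    /** Hypothetical placeholder for a remote device property exposed by the CMW server. */
    interface DeviceProperty {
        int[] get();                                   // GET( ): inputs + time
        void set(String command);                      // SET( ): e.g. a reset
        AutoCloseable monitor(Consumer<int[]> onData); // Monitor[On]: subscribe to updates
    }

    public static void main(String[] args) throws Exception {
        DeviceProperty bic = connect("BIC.DEMO/Status");  // hypothetical device/property name

        int[] snapshot = bic.get();                        // one-shot acquisition
        System.out.println("BIC inputs: " + java.util.Arrays.toString(snapshot));

        // Monitor[On]: the front end publishes its 1 Hz on-line buffer to all subscribers.
        try (AutoCloseable subscription =
                     bic.monitor(data -> System.out.println("update: " + data.length + " values"))) {
            bic.set("RESET");                              // SET( ): send a reset to the controller
            Thread.sleep(3_000);                           // keep the subscription open briefly
        }                                                  // Monitor[Off] when the subscription closes
    }

    /** Dummy local implementation so the sketch is self-contained. */
    static DeviceProperty connect(String name) {
        return new DeviceProperty() {
            public int[] get() { return new int[] {1, 0, 1}; }
            public void set(String command) { System.out.println(name + " <- " + command); }
            public AutoCloseable monitor(Consumer<int[]> onData) {
                onData.accept(get());
                return () -> System.out.println("monitor off for " + name);
            }
        };
    }
}
```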

  36. Issues deserving special attention • Hardware installation • Timing hardware • Data management • VDSL for QRL commissioning • PVSS – SIEMENS PLC driver • Logging, post-mortem, shot-by-shot logging, SDSS logging • Naming conventions and their application • FESA for QPS • Users of FESA need support • Controls system and machine protection / interlocks • Software interlocks and integrity of safety-critical information • Network security • Real time feedback • Collimator controls

  37. The Consoles • “commercial off-the-shelf” PCs • with 1 to 3 screens • installed in local or central control rooms • used to run operational software • Operating system: customised version of NICE XP, or SLC3 Linux distribution • Basic installed packages: Common Console Manager, Java runtime environment, Exceed

  38. The Application Server • HP ProLiant • 2 x 2.8 GHz Xeon, 3 GB RAM, 2 x 16 GB system disks, dual power supply • Redundancy, hot swap, hardware RAID 0, …, allowing continuity of service and tolerance to faults • SLC3 Linux distribution, or RedHat RHEL-3 for Oracle 9i AS • “transfer.ref” mechanism to launch system and user services and applications • System resources and user applications are monitored by clic and reported to the Alarm system

  39. Status of magnet interlocks. Normal conducting magnets (WIC): • operational during several months of running, without a single failure during the TI8 and TT40 tests • several magnets “saved” from overheating • interfaced to other systems (cooling and ventilation) • the WIC is the small brother of the PIC • partial validation of the PVSS and PLC architecture for magnet interlocks • the next milestone is LEIR. Superconducting magnets (PIC): • the PIC is well advanced • milestone for the DFB tests.

  40. Issues • Software maintenance: unlike in previous years, patches and upgrades have to be installed during operation; therefore, in collaboration with the software providers and OP, we will have to test and deploy the patches throughout the year • System configuration: curtain policy not yet finalised (who can do what and where, between development and operational zones); homogenisation between the (old) PS and (old) SL not yet finalised • Third-party support: Java products (OC4J, SonicMQ, …) or PVSS package configuration need manual intervention; we need resources to set up an automatic recovery procedure; third-party software can prevent system upgrades…

  41. Hardware Commissioning in 2 phases (a sketch of this plan captured as data follows below). Power converters not connected to magnets: • test of electrical circuits one by one • initial continuity of circuits, integrity of instrumentation • insulation tests, interlocks on water, switches • 8 h operation at ultimate current • 24 h run of all power converters • power converters, QPS and PIC tested together before connection to the magnets. Power converters connected to magnets: • commission electrical circuits one by one at low current • commission electrical circuits one by one or in groups to nominal current • commissioning of electrical circuits powered in unison to nominal current.
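As referenced above, here is a minimal Java sketch of how such a two-phase plan could be captured as data for automated execution, in the spirit of the SACEC tools; the step names are copied from the slide, while the class, enum and structure are invented for illustration.

```java
// Hedged sketch of a data-driven commissioning plan; step names from the slide, the rest invented.
import java.util.List;
import java.util.Map;

public class CommissioningPlanSketch {

    enum Phase { CONVERTERS_NOT_CONNECTED, CONVERTERS_CONNECTED }

    /** Each step would map to an automated test procedure run per circuit or group of circuits. */
    static final Map<Phase, List<String>> PLAN = Map.of(
            Phase.CONVERTERS_NOT_CONNECTED, List.of(
                    "Test of electrical circuits one by one",
                    "Initial continuity of circuits, integrity of instrumentation",
                    "Insulation tests, interlocks on water, switches",
                    "8 h operation at ultimate current",
                    "24 h run of all power converters",
                    "Power converters, QPS and PIC tested together"),
            Phase.CONVERTERS_CONNECTED, List.of(
                    "Commission circuits one by one at low current",
                    "Commission circuits one by one or in groups to nominal current",
                    "Commission circuits powered in unison to nominal current"));

    public static void main(String[] args) {
        for (Phase phase : Phase.values()) {
            System.out.println("== " + phase + " ==");
            PLAN.get(phase).forEach(step -> System.out.println("  - " + step));
        }
    }
}
```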

  42. Logging, SbS Logging, Alarms, Post Mortem recording. Logging System: • stores logging data for certain variables (logging entities) • collects data, typically at 1 Hz or slower, at regular intervals or on change. Alarm System: • handles alarms in case of fault conditions and stores alarm data • defines and processes alarm data for certain variables (fault members). Shot-by-Shot Logging System (designed to monitor LHC filling): • stores logging data for certain variables (logging entities) for each extraction from the SPS to TT40 / TI8 (after an event … the shot) • variables can be stored as one data point, or as transient data with many data points. Post Mortem transient recording: • stores transient data for certain variables after a post-mortem event (post mortem entities) or after an internal fault of a system • a name and time stamp are required for each entity • Post Mortem analysis uses data from all these systems.
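To make the distinctions between the four systems concrete, here is a small Java sketch of the data shapes involved. All class and field names are invented for illustration; only the distinctions themselves (periodic values at ~1 Hz or on change, alarm state changes for fault members, one record per shot, and a named, time-stamped transient buffer per post-mortem entity) come from the slide.

```java
// Hedged sketch of the data shapes behind the four systems; names are invented for illustration.
import java.time.Instant;
import java.util.List;

public class RecordingShapesSketch {

    /** Logging System: one value per variable (logging entity), taken at ~1 Hz or on change. */
    record LoggedValue(String loggingEntity, Instant timestamp, double value) {}

    /** Alarm System: a fault state change for a fault member. */
    record AlarmEvent(String faultMember, Instant timestamp, boolean active, String description) {}

    /** Shot-by-Shot Logging: one record per extraction (the "shot"), a point or a transient. */
    record ShotRecord(String loggingEntity, Instant shotTime, double[] samples) {}

    /** Post Mortem recording: a named, time-stamped transient buffer frozen after a PM event. */
    record PostMortemBuffer(String entityName, Instant eventTime, double samplePeriodSeconds,
                            double[] samples) {}

    public static void main(String[] args) {
        Instant now = Instant.now();
        LoggedValue v = new LoggedValue("EXAMPLE.PRESSURE", now, 1.2e-9);        // ~1 Hz value
        ShotRecord shot = new ShotRecord("EXAMPLE.INTENSITY", now, new double[] {3.1e13});
        PostMortemBuffer pm = new PostMortemBuffer("EXAMPLE.SEPTUM.I", now, 0.001,
                new double[4096]);                                                // 1 ms transient

        // Post Mortem analysis correlates all of these by entity name and time stamp.
        List<Object> forAnalysis = List.of(v, shot, pm);
        forAnalysis.forEach(r -> System.out.println(r));
    }
}
```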

  43. LHC Controls, Option 1 (simplified). (Diagram: operator consoles 1 and 2, running programs to interface to the operators, sit in the CCC or a Field Control Room and talk over Ethernet directly to the Front End Controllers – PLC / VME / custom electronics – located around the LHC in underground or surface buildings.)

  44. LHC Controls, Option 2 (simplified). (Diagram: as Option 1, with in addition a server running application programs / alarms / logging, located at the back of the CCC or elsewhere, between the operator consoles and the Front End Controllers around the LHC in underground or surface buildings.)

  45. LHC Controls, Option 3 (simplified). (Diagram: operator consoles in the CCC or a Field Control Room; a server running application programs / alarms / logging at the back of the CCC or elsewhere; gateways (Front End Computers) in surface buildings around the LHC; and Front End Controllers, e.g. QPS and power converters, reached via ProfiBus / WorldFIP, in underground or surface buildings around the LHC.)

  46. Data management • LHC machine description (mechanical, optics, electrical): LHC layout, DC magnet powering • Accelerator Controls Configuration: model for the PS complex extended to the LHC; ABCAM – controls equipment installation database (first version for end January 2005); Rack Wizard (to describe racks and what is inside them); other projects depend on data management (e.g. FESA, installation, …) • Calibration information (e.g. for thermometers) • Asset management: CERN-wide with MTF. Several critical milestones are upcoming (in January and February). Naming is an important issue => see the naming conventions (e.g. EDMS 473091).

  47. Logging during the TT40/TI8 tests: vertical zoom (T = 28.8 s, I = 29 A). Data acquired from the Shot-by-Shot and WIC Logging clients proved invaluable for the post-mortem analysis of the TT40 high-intensity beam test accident (25-10-2004).
