Controls and Monitoring Status Update

  1. Controls and Monitoring Status Update J. Leaver 29/05/2009

  2. Infrastructure

  3. Infrastructure Issues
    • General EPICS infrastructure
    • EPICS server / client organisation
    • Unification of control systems
    • Remote access
      • Monitoring
      • Controls
    • Configuration database
    • Schedule

  4. EPICS Client / Server Overview

  5. EPICS Server / Client Organisation
    • Wide variety of EPICS server applications permitted
      • Typically connect to physical hardware
      • Impossible to enforce common interface/processor/OS specifications
    • Each server is maintained by 'owner' of respective control system
      • Strict central administration unnecessary – 'end user' only concerned with availability of PVs on network (a minimal example record is sketched after this slide)
    • EPICS clients also varied, but must be uniformly accessible
      • Users should not have difficulty finding/launching clients
      • Applications should be consistently organised/updated
      • MICE Online Group (MOG) responsibility
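As an illustration of the server-side PVs referred to above, the sketch below shows what a minimal EPICS database record might look like in one of the IOCs. The PV name, units and alarm limits are hypothetical, not taken from any MICE configuration.

    # Hypothetical analogue input record: name, units and alarm limits
    # are illustrative only
    record(ai, "MICE:TARGET:DIP_DEPTH") {
        field(DESC, "Target dip depth")
        field(SCAN, "1 second")
        field(EGU,  "mm")
        field(HIHI, "50")
        field(HHSV, "MAJOR")
    }

Any Channel Access client on the Controls Network can then read this PV by name, without knowing which server or processor hosts it.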

  6. EPICS Client Organisation
    • All client-side applications run on miceecserv
      • Central installation repository greatly simplifies configuration/maintenance/backup
    • MOG collates individual applications, applies updates when available from control system 'owners'
    [Diagram: EPICS client applications on miceecserv, miceopi1 & miceopi2; EPICS server applications (IOCs & Portable CA Servers) on the Controls Network]

  7. EPICS Client Organisation
    • Client control/monitoring GUIs viewed directly on miceecserv, or one of 2 'Operator Interface' PCs
    • OPI PCs act as 'dumb terminals', running displays from miceecserv via SSH
    [Diagram: EPICS client applications on miceecserv, miceopi1 & miceopi2; EPICS server applications (IOCs & Portable CA Servers) on the Controls Network]

  8. Unification of Control Systems
    • At user level: simple 'wrapper' GUI provides menu for launching individual client applications
    • At system level: employ 2 standard EPICS tools (running as background services on miceecserv)
      • Alarm Handler
        • Monitors all servers & warns operators of abnormal/dangerous conditions
      • Channel Archiver
        • Automatically records PV parameters to disk & provides several visualisation options (an example engine configuration follows this slide)
        • See PH's talk
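For reference, a rough sketch of the kind of engine configuration the Channel Archiver accepts is shown below; the group name, channel name and period are hypothetical, and the exact schema should be checked against the archiver manual.

    <!-- Hypothetical archiver engine configuration: archive monitored
         updates of one PV (the period gives the expected update rate) -->
    <engineconfig>
        <group>
            <name>Target</name>
            <channel>
                <name>MICE:TARGET:DIP_DEPTH</name>
                <period>10</period>
                <monitor/>
            </channel>
        </group>
    </engineconfig>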

  9. User Interface

  10. User Interface
    [Screenshot: large wall-mounted display showing the message log, Alarm Handler & any important parameters for the current run]

  11. User Interface
    [Screenshot: standard desktop monitor showing the client application launcher & a client GUI]

  12. User Interface
    [Screenshot: display connected to miceecserv]

  13. User Interface
    [Screenshot: displays connected to miceopi1 & miceopi2]

  14. Remote Monitoring: General Principles
    • Remote users should have simple, easily accessible interface for routine monitoring
    • 'Expert' remote users should have access to monitoring displays which match those in MLCR
    • No machine on Controls Network should be directly accessible over the internet
    • System load generated by remote monitoring should have minimal impact on control & monitoring services

  15. Remote Monitoring: Web Server
    [Diagram: Channel Archiver PV archive on miceecserv exported via NFS mount to a data server / web server (CGI export) on the PPD network; remote web browsers & the Java Archive Viewer connect over the internet through the RAL Gateway; EPICS IOCs & Portable CA Servers remain on the Controls Network]

  16. Remote Monitoring: Direct PV Access
    • Could recreate normal client displays using web interface, but would involve impractical development overheads
    • Provide direct read-only access to PVs so actual client GUIs may be run remotely
    [Diagram: standard client GUI running on remote PC (read only) connects through the RAL Gateway & read-only CA Gateways to the IOCs & Portable CA Servers on the Controls Network]

  17. Remote Monitoring: Direct PV Access
    • CA Gateway makes PVs available across subnets (with full access control), while minimising load on underlying servers (a read-only access file is sketched after this slide)
    • To simplify end-user support, virtual machine disk image containing EPICS + all client applications will be made available
    [Diagram: standard client GUI running on remote PC (read only) connects through the RAL Gateway & read-only CA Gateways to the IOCs & Portable CA Servers on the Controls Network]
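As a sketch of how the gateways above can be locked down to read-only operation, the EPICS access security file supplied to the CA Gateway could contain only a READ rule, so that puts from remote clients are rejected; the file name and scope are illustrative.

    # gateway.access (illustrative): every PV forwarded by the gateway is
    # readable, and no WRITE rule is defined, so puts are refused
    ASG(DEFAULT) {
        RULE(1, READ)
    }

A companion PV list file would normally restrict which PV name patterns the gateway is allowed to forward at all.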

  18. Remote Control
    • Where possible, operations affecting the state of any MICE system should only be performed within MLCR
      • Remote users accessing controls can lead to unknown/unexpected running conditions – should be discouraged
    • If necessary, off-site experts will be permitted to run control client applications on miceecserv, via SSH through RAL Gateway
      • Each expert will have an account on miceecserv which only contains client applications for their designated system

  19. Configuration Database
    • Necessary to integrate control systems with central MICE configuration database
      1. Read set point values from database
      2. Upload PV values to EPICS servers
      3. Modify PVs with client GUIs
      4. Download PV values from EPICS servers
      5. Write new set point values to database
    • For (2) & (4), propose use of standard EPICS Backup & Restore Tool (BURT)
      • Backup/restore PV values to/from snapshot files

  20. Configuration Database
    • BURT snapshot files may be written in 'Self-Describing Data Sets' (SDDS) format
    • For (1) & (5), propose development of application to write/read database values to/from SDDS files (sketched after this slide)
      • C API for generating SDDS snapshots provided with BURT
      • C/C++ APIs for database (PostgreSQL) available
    • NB: configuration database interface still in very early planning stages – details to be discussed/decided
      • Have not rejected possibility of developing custom backup/restore client which accesses database directly
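A minimal sketch of the proposed database-to-snapshot step (1) is given below, using the PostgreSQL C API (libpq). The host, table and column names are hypothetical, and the plain text output stands in for the SDDS snapshot that the real tool would generate through the BURT/SDDS C API.

    /* Sketch: read set point values from the configuration database and
     * dump them as "PV value" pairs.  Table/column names are hypothetical. */
    #include <stdio.h>
    #include <stdlib.h>
    #include <libpq-fe.h>

    int main(void)
    {
        PGconn *conn = PQconnectdb("host=configdb dbname=mice user=reader");
        if (PQstatus(conn) != CONNECTION_OK) {
            fprintf(stderr, "connection failed: %s", PQerrorMessage(conn));
            return EXIT_FAILURE;
        }

        PGresult *res = PQexec(conn, "SELECT pv_name, value FROM set_points");
        if (PQresultStatus(res) != PGRES_TUPLES_OK) {
            fprintf(stderr, "query failed: %s", PQerrorMessage(conn));
            PQclear(res);
            PQfinish(conn);
            return EXIT_FAILURE;
        }

        FILE *out = fopen("setpoints.snap", "w");
        for (int i = 0; i < PQntuples(res); ++i)
            fprintf(out, "%s %s\n", PQgetvalue(res, i, 0), PQgetvalue(res, i, 1));
        fclose(out);

        PQclear(res);
        PQfinish(conn);
        return EXIT_SUCCESS;
    }

The reverse step (5) would follow the same pattern, parsing a snapshot and issuing INSERT/UPDATE statements against the database.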

  21. Infrastructure Schedule

  22. Control & Monitoring Systems

  23. C&M Systems Overview

  24. C&M Systems Developed by Local MICE Community

  25. Target: Controller
    • Existing Target Controller system stable/reliable, but only has 'push button' interface & limited upgradeability
    • Currently undergoing complete redesign to increase functionality and enable PC control
    • Based on USBDAQ
      • Contains 1M gate FPGA
      • USB interface for PC communication
    • Will be fully integrated with EPICS

  26. Target: Controller
    • In hardware/firmware design stage – EPICS development not yet commenced
    • Stage 1 upgrade will be complete end of July 2009
      • Interfaces USBDAQ with existing analogue electronics
      • EPICS C&M system recreating current 'push button' controls (actuation, target dip depth, timing)
    • Stage 2 upgrade to be completed end of December 2009
      • Redesign of analogue electronics
      • Enable fine control of subsystems

  27. Target: Beam Loss
    • Beam loss IOC reads local data archive written by DAQ system
    • Clients provide virtual scope display, history plots & analysis
    • System functionally complete, but requires final selection of algorithm for calculating 'absolute' beam loss

  28. FNAL Beam Profile Monitors • EPICS Server/client applications complete • Well tested, used for monitor calibration procedures

  29. Cherenkov System

  30. Tracker: Magnetic Field Probes
    • NIKHEF Hall probes will be installed
      • In homogeneous region of Tracker volume
      • At Z-edges of Tracker volume
      • Outside solenoids (backup check of field polarity)
    • Hall probes read out via CAN interface using Windows application
    • Portable CA server reads parameters from Windows PC via network socket (a readout sketch follows this slide)
    • Monitor B-field (X, Y, Z components) + probe temperature
    [Diagram: Hall Probes → CAN Bus → Standalone Probe Interface (Windows PC) → network socket → EPICS server (Linux PC)]
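How the Linux-side server might poll the probe interface is sketched below. The host address, port and line-based "READ" request are assumptions for illustration; the real message format is defined by the Windows readout application.

    /* Sketch: fetch one reading (Bx, By, Bz, temperature) from the
     * Windows probe interface over TCP.  Address, port and protocol
     * are hypothetical. */
    #include <string.h>
    #include <unistd.h>
    #include <arpa/inet.h>
    #include <netinet/in.h>
    #include <sys/socket.h>

    int read_probe_values(char *buf, size_t len)
    {
        int fd = socket(AF_INET, SOCK_STREAM, 0);
        struct sockaddr_in addr;
        memset(&addr, 0, sizeof(addr));
        addr.sin_family = AF_INET;
        addr.sin_port = htons(5000);                      /* hypothetical port */
        inet_pton(AF_INET, "192.168.1.50", &addr.sin_addr);

        if (connect(fd, (struct sockaddr *)&addr, sizeof(addr)) < 0) {
            close(fd);
            return -1;
        }

        write(fd, "READ\n", 5);                           /* request one sample */
        ssize_t n = read(fd, buf, len - 1);               /* e.g. "0.01 0.02 3.99 301.2" */
        close(fd);
        if (n <= 0)
            return -1;
        buf[n] = '\0';
        return 0;
    }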

  31. Tracker: Magnetic Field Probes
    • C&M system functionally complete
      • Just requires error handling refinements & definition of alarm limits
    • To be installed at RAL in November 2009
      • Dependent on Tracker schedule – could change
    • No dedicated client will be written – sufficient to display parameters via Channel Archiver Data Server

  32. Tracker: AFEIIts
    • AFEIIt configuration, control & monitoring software complete
    • Finalisation of DATE integration details required
      • Need DATE-side client to enable/disable triggers (i.e. run control)

  33. Tracker: AFEIIt Infrastructure
    • 'Infrastructure' corresponds to miscellaneous auxiliary hardware associated with AFEIIts
      • Somewhat ill-defined, since most hardware (AFEIIt cryo systems & safety interlocks) integrated with Spectrometer Solenoid controls
    • Currently require C&M for AFEIIt power supplies
      • 4 Wiener PSUs (1 per cryo)
      • CAN Bus or RS232 communication interface
      • Intend to use RS232 for simplicity (a port set-up sketch follows this slide)
    • No progress yet – expect manpower to be available for completion in August
    • Additional C&M requirements may develop (to be discussed)
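If the RS232 route is taken, opening and configuring the serial link from the C&M host is straightforward; the sketch below uses POSIX termios with an assumed device node and baud rate, and the actual command set would come from the Wiener documentation.

    /* Sketch: open the serial port to a Wiener PSU (device and settings
     * are assumptions, e.g. 9600 baud, 8N1). */
    #include <fcntl.h>
    #include <termios.h>

    int open_psu_port(const char *dev)          /* e.g. "/dev/ttyS0" */
    {
        int fd = open(dev, O_RDWR | O_NOCTTY);
        if (fd < 0)
            return -1;

        struct termios tio;
        tcgetattr(fd, &tio);
        cfsetispeed(&tio, B9600);
        cfsetospeed(&tio, B9600);
        tio.c_cflag |= (CLOCAL | CREAD);
        tio.c_cflag &= ~(PARENB | CSTOPB);      /* no parity, 1 stop bit */
        tio.c_cflag &= ~CSIZE;
        tio.c_cflag |= CS8;                     /* 8 data bits */
        tcsetattr(fd, TCSANOW, &tio);
        return fd;
    }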

  34. Hydrogen Absorbers: Focus Coils
    • Absorber Focus Coils expected to require C&M systems very similar to Pion Decay Solenoid & Spectrometer Solenoids
    • Would be most efficient for DL to take over project (wealth of relevant expertise)
      • Unfortunately prevented by MICE funding constraints
    • Task assigned to MOG

  35. Hydrogen Absorbers: Focus Coils
    • If possible, will attempt to use DL's existing magnet designs as template
      • DL C&M systems have vxWorks IOCs
        • For MICE to develop vxWorks software, expensive (~£15.2K) licence required
        • Investigate replacement with RTEMS controllers ('similar' real-time OS, free to develop)
      • DL systems include custom in-house hardware
        • Not available for general MICE usage – will check alternatives
    • However, will consider possibility of entirely new design (perhaps with Linux PC-based IOCs)

  36. Hydrogen Absorbers: Focus Coils
    • Work on Focus Coil C&M system has not yet commenced
      • Need to confirm availability of PH
      • Assistance from FNAL Controls Group would be highly beneficial – need to discuss
    • Expect to start project in September 2009

  37. RF Cavities: Coupling Coils
    • Cavity Coupling Coil C&M situation identical to Focus Coils
      • Similar requirements to other MICE magnets
      • MOG responsibility (need to confirm PH's availability)
    • Project should run in parallel with Focus Coil C&M system

  38. DATE Status
    • Need mechanism for reporting current DAQ state via EPICS
    • Simple ('dumb') data server hosts DATE status PV
    • Client application reads DATE status from DIM server, forwards value to EPICS server (a forwarding sketch follows this slide)
    • Server & display client complete; DATE-side client to be implemented
    [Diagram: DATE client → EPICS data server (single 'status' PV)]
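The EPICS-side 'forward' step is simple enough to sketch with the Channel Access client library; the status PV name and integer state value below are hypothetical, and in the real client the state would come from the DIM subscription.

    /* Sketch: write a DAQ state value to the status PV hosted by the
     * 'dumb' EPICS data server.  PV name is hypothetical. */
    #include <cadef.h>

    int forward_date_status(dbr_long_t state)
    {
        chid ch;
        SEVCHK(ca_context_create(ca_disable_preemptive_callback), "context");
        SEVCHK(ca_create_channel("MICE:DAQ:DATE_STATUS", NULL, NULL, 0, &ch),
               "create channel");
        SEVCHK(ca_pend_io(2.0), "connect");

        SEVCHK(ca_put(DBR_LONG, ch, &state), "put");     /* write new state */
        SEVCHK(ca_pend_io(2.0), "flush put");

        ca_clear_channel(ch);
        ca_context_destroy();
        return 0;
    }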

  39. Network Status
    • Need to verify that all machines on DAQ & control networks are functional throughout MICE operation
    • Two types of machine
      • Generic PC (Linux, Windows)
      • 'Hard' IOC (vxWorks, possibly RTEMS)
    • EPICS Network Status server contains one status PV for each valid MICE IP address

  40. Network Status
    • Read status: PC
      • SSH into PC
        • Verifies network connectivity & PC identity
      • If successful, check list of currently running processes for required services
    • Read status: 'Hard' IOC
      • Check that standard internal status PV is accessible, with valid contents
        • e.g. 'TIME' PV, served by all MICE 'hard' IOCs

  41. Network Status
    • Currently have working prototype (the PC check is sketched after this slide)
      • EPICS server connects to PCs via SSH, checks contents of 'key' ID file
      • Client displays status of all PCs, scans at user-specified period (with 'check now' override)
    • Need to add service checking & 'hard' IOC support
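A sketch of the PC check described in the prototype is given below: the server shells out to SSH, reads a 'key' identity file and compares it with the expected contents. The file path and expected string are placeholders, and the real server additionally checks the list of running processes.

    /* Sketch: return 1 if 'host' answers over SSH and presents the
     * expected node ID; the path and ID format are hypothetical. */
    #include <stdio.h>
    #include <string.h>

    int check_pc(const char *host, const char *expected_id)
    {
        char cmd[256], reply[128] = "";
        snprintf(cmd, sizeof(cmd),
                 "ssh -o ConnectTimeout=5 %s cat /etc/mice-node-id 2>/dev/null",
                 host);

        FILE *p = popen(cmd, "r");
        if (!p)
            return 0;
        if (fgets(reply, sizeof(reply), p) == NULL)
            reply[0] = '\0';
        pclose(p);

        reply[strcspn(reply, "\n")] = '\0';     /* strip trailing newline */
        return strcmp(reply, expected_id) == 0;
    }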

  42. Unassigned Control Systems
    • The following systems currently have no allocated C&M effort
      • Time of Flight system
      • Diffuser
      • Calorimeter system
    • Request help from MICE community to identify system requirements
    • Need to find additional C&M resources
      • MOG operating at full capacity & no funds for DL to undertake these projects
      • Expect those responsible for each system will be required to implement corresponding EPICS controls
      • Assistance from FNAL Controls Group would be welcome (to be discussed)

  43. MICE Community C&M Projects Schedule

  44. C&M Systems Developed by Daresbury

  45. Target: Drive
    • Significant work required for Target upgrade
      • Additional temperature sensors
      • Split power supply to reduce current → duplication of C&M components
    • On schedule for Target installation

  46. Beamline Magnets • C&M system complete • DL provides ongoing support & maintenance

  47. Pion Decay Solenoid • C&M system complete • DL provides ongoing support & maintenance

  48. Tracker: Spectrometer Solenoids
    • Controls rack layout essentially complete
    • Associated wiring diagrams ~50% complete
      • Require ~4 weeks work
    • Rack, cabling, distribution costs: ~£5K
    • C&M system to follow standard DL design
      • Controls interface hardware costs: ~£13K
      • Software development effort: ~0.4 man years

  49. Tracker: Spectrometer Solenoids
    • Work currently halted due to budget constraints
    • 3 options:
      1. Allow DL to complete project
         • Requires ~£18K capital + 0.4 man years effort
      2. Take DL's current design & complete within the collaboration
         • Requires ~£18K capital + ~£15.2K vxWorks developer licence + 0.6-0.8 man years effort
         • Insufficient MICE manpower available…
      3. Discard DL's design & start over within the collaboration
         • Unknown capital requirements (likely ~£18K)
         • Requires ~1.5 man years effort
         • Insufficient MICE manpower available…

  50. Tracker: Spectrometer Solenoids
    • Only reasonable option: provide DL with funds to complete project
      • Cannot pay for work out of UK budget
      • Possibly utilise common fund?
      • AB currently in negotiations with MZ
    • Must decide on course of action (preferably before end of CM24)
