LCLS & LUSI

Experimental Area Controls and Data-Acquisition for LCLS and LUSI Instruments
Parallel Session Presentation
Gunther Haller, Research Engineering Group
SLAC Particle Physics and Astrophysics Division
Photon Control and Data Systems (PCDS)
30 October 2007, v5


Presentation Transcript


  1. Experimental Area Controls and Data-Acquisition for LCLS and LUSI Instruments
  Parallel Session Presentation
  Gunther Haller, Research Engineering Group
  SLAC Particle Physics and Astrophysics Division
  Photon Control and Data Systems (PCDS)
  30 October 2007, v5

  2. LCLS & LUSI
  • Common controls and DAQ system design for LCLS and LUSI
  • Common CAM (G. Haller) for LCLS and LUSI controls & DAQ, including XTOD controls
  • Group is called the Photon Control and Data Systems (PCDS) group
  • LCLS controls & DAQ responsibility
    • Common services for all hutches: PPS, laser safety, accelerator timing interface, 120-Hz beam-quality data interface, machine-protection-system interface, user safeguards, network interface
    • AMO experiment (NEH, hutch 2): all controls and DAQ
    • 2-D detector: control and DAQ for the detector itself only
  • LUSI controls & DAQ responsibility
    • X-Ray Pump Probe, XPP (NEH, hutch 3)
    • X-Ray Photon Correlation Spectroscopy, XCS (FEH, hutch 1)
    • Coherent X-Ray Imaging, CXI (FEH, hutch 2)
  • LCLS & LUSI: local data storage in halls

  3. E.g. Overall LUSI Specifications
  • Per-pulse data collection: experimental diagnostics (EO signal, e- and γ beam parameters)
  • Raw data rate and volume: 2 Gb/s or higher; on-line storage capacity 20 TB/day (consistency check below)
  • Timing/triggering: EO timing measurement < 1 ps; detector trigger < 1 ms
  • Real-time analysis: frame correction, quality control; to the extent possible, binning, sparsification, FFT
  • Quick view: quasi-real-time feedback, 5 frames/s; alignment
  • Data management: unified data model; archiving capacity 5 PB/year; analysis staging storage capacity 20 TB
  • Offline analysis: > 1000-node cluster
  • Also covered: pump-laser operation, vacuum controls, MPS systems, laser PPS systems, EO system
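
  As a quick consistency check (not on the slide): sustaining the quoted raw rate for a full day lands near the on-line storage figure:

      2 Gb/s = 0.25 GB/s;  0.25 GB/s x 86,400 s/day ≈ 21.6 TB/day,

  i.e. roughly the 20 TB/day on-line storage capacity above.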

  4. Controls – List of Components
  • Optics: KB mirrors for focusing, refractive lens for focusing, monochromator, collimator, slits, attenuators, split-delay, pulse picker, compressor
  • Sample environment: particle injector, cryostat, cryo-EM stage, precision stages
  • Beam diagnostics: intensity monitors, beam position monitor, wavefront sensor
  • Measurement instruments: diffractometer, e- and ion TOF, mass spectrometer, EO timing measurement
  • Laser systems: pump laser and diagnostics, EO laser, molecular-alignment laser
  • Vacuum systems: turbo pumps, ion pumps
  • 2D detectors: Cornell detector for CXI, BNL detector for XPP, BNL detector for XCS

  5. Basic Motion System Block Diagram
  • 67 stepper motors
  • Hytec SMDS4-B driver
  • Newport XCS controller
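
  Since the control subsystem is EPICS-based (see the summary slide), a minimal move command over Channel Access might look like the following sketch; the PV name XPP:MOT:X.VAL and the target position are hypothetical placeholders, not actual record names from the project.

      // Minimal EPICS Channel Access client moving one stepper axis.
      // Hypothetical PV name "XPP:MOT:X.VAL"; real record names TBD.
      #include <cadef.h>
      #include <cstdio>

      int main() {
          chid axis;
          double target = 12.5;                      // hypothetical user units

          SEVCHK(ca_context_create(ca_disable_preemptive_callback), "context");
          SEVCHK(ca_create_channel("XPP:MOT:X.VAL",  // motor setpoint field
                                   nullptr, nullptr, CA_PRIORITY_DEFAULT, &axis),
                 "create channel");
          SEVCHK(ca_pend_io(5.0), "connect");        // wait for connection

          SEVCHK(ca_put(DBR_DOUBLE, axis, &target), "put");
          SEVCHK(ca_pend_io(5.0), "flush");          // deliver the write
          std::printf("move requested to %.3f\n", target);

          ca_context_destroy();
          return 0;
      }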

  6. Viewing/Camera System Block Diagram
  • Cameras: PULNiX 6710CL (648 x 484 pixels, 9 µm x 9 µm)
  • Up to 120 Hz
  • Triggered by timing signal from the timing-system event-receiver PMC card
  • Centroid-finding software running on VME IOC (sketch below)
  • (Block diagram: EDT DV CameraLink PMC card, PULNiX 6710CL camera)
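
  A minimal sketch of the centroid-finding step named above; the production algorithm on the VME IOC is not shown in the slides, so the thresholding and weighting here are assumptions.

      // Intensity-weighted centroid of one camera frame (illustrative only).
      #include <cstddef>
      #include <cstdint>
      #include <utility>

      std::pair<double, double> centroid(const uint8_t* img, int w, int h,
                                         uint8_t threshold) {
          double sum = 0.0, sx = 0.0, sy = 0.0;
          for (int y = 0; y < h; ++y) {
              for (int x = 0; x < w; ++x) {
                  double v = img[static_cast<size_t>(y) * w + x];
                  if (v < threshold) continue;   // suppress dark background
                  sum += v;
                  sx  += v * x;
                  sy  += v * y;
              }
          }
          if (sum == 0.0) return {-1.0, -1.0};   // no beam spot found
          return {sx / sum, sy / sum};           // pixel coordinates
      }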

  7. Example: Timing Control – Electro-Optic Sampling
  • Machine timing distribution: ~20 ps jitter (plus longer-term drifts)
  • Separate fast-timing network to get < 100 fs timing
  • Beam-to-laser timing difference is measured and used, at 120 Hz, to process images in the DAQ, e.g. sorting/binning of images for a pump-probe experiment (sketch below)
  • (Diagram: stabilized fiber-optic RF distribution, 10 fs, LBNL; electro-optic sampling laser, pump-probe laser, gun laser; Sector 20, LTU, NEH)
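
  A sketch of the sorting/binning idea mentioned above, assuming each frame arrives tagged with the EO-measured beam-to-laser offset; the Shot structure and the bin width are illustrative, not the actual DAQ data model.

      // Bin pump-probe frames by the 120-Hz EO timing measurement.
      #include <cmath>
      #include <cstdint>
      #include <map>
      #include <vector>

      struct Shot {
          std::vector<uint16_t> image;  // one detector frame
          double dt_fs;                 // measured beam-laser offset, fs
      };

      // Accumulate frames into time bins; each bin is averaged afterwards
      // to build up the pump-probe time trace.
      void bin_by_timing(const Shot& s, double bin_fs,
                         std::map<long, std::vector<const Shot*>>& bins) {
          long idx = std::lround(s.dt_fs / bin_fs);  // nearest-bin assignment
          bins[idx].push_back(&s);
      }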

  8. Data Acquisition/Mgmt Architecture
  • Detector (bump-bonded or integrated)
  • Detector-specific Front-End Electronics (FEE)
    • Local configuration registers
    • State machine to generate low-level detector readout signals
    • Digitize analog pixel voltages
    • Organize bits into pixel words
    • Transmit to DAQ system: IP core in FPGA for communication, up to 4 x 2.5-Gb/s fibers
  • (Architecture diagram: 2D detector -> detector-specific FEE -> SLAC LCLS DAQ box (ADC, FPGA, data server) via up to 4 x 2.5 Gbit/s fiber; 10-G Ethernet fabric linking detector control node, quick-view rendering node, volume rendering node/cluster, data servers, and online processors; accelerator 120-Hz data exchange & timing interface; SCCS offline storage with disk arrays/controller and tape drives/robots; partitions: detector-specific / experiment-common, online / offline)

  9. Example: Experiment Front-End Board
  • Interfaces to detector ASIC: control signals, row/column clocks, biases/thresholds, analog pixel voltage
  • Contains: communication IP core, local configuration state machine, local image-readout state machine (illustrated below)
  • Example: SLAC development board
    • FPGA with MGT interfaces, up to 4 x 2.5 Gbit/s fiber IO, ~200 digital IO
    • VHDL programmed; includes communication IP core provided by SLAC
  • Every detector system needs such a board to interface to the detector/ASIC, possibly with detector-specific ADCs/DACs integrated on board (or on a separate board connected to this one)
  • No additional modules needed to connect to the common DAQ
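
  The readout sequencing actually lives in the FPGA as VHDL; purely for illustration, the same control flow rendered as a C++ state machine (the state names and transition conditions are assumptions):

      // Illustrative C++ rendering of the FEE readout sequencing that the
      // slide assigns to a VHDL state machine.
      enum class FeeState { Idle, Configure, WaitTrigger, ReadRow, Transmit };

      FeeState step(FeeState s, bool trigger, bool last_row) {
          switch (s) {
              case FeeState::Idle:        return FeeState::Configure;
              case FeeState::Configure:   // load local configuration registers
                  return FeeState::WaitTrigger;
              case FeeState::WaitTrigger: // wait for timing-system trigger
                  return trigger ? FeeState::ReadRow : FeeState::WaitTrigger;
              case FeeState::ReadRow:     // clock one row, digitize, pack words
                  return last_row ? FeeState::Transmit : FeeState::ReadRow;
              case FeeState::Transmit:    // ship frame over the 2.5 Gb/s fibers
                  return FeeState::WaitTrigger;
          }
          return FeeState::Idle;
      }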

  10. Data Acquisition/Mgmt Architecture – Level 1
  • (Same architecture diagram as slide 8, with the Level 1 nodes highlighted)
  • Level 1 DAQ nodes are responsible for:
    • Control FEE parameters
    • Receive machine timing signals
    • Send trigger signals to FEE
    • Acquire FEE data
    • Merge FEE data with beam-line data (sketch below)
    • Low-level real-time data processing, e.g. filtering of images based on beam-line data, pixel correction using calibration constants
    • Send collected data to Level 2 nodes
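
  A minimal sketch of the merge step, assuming frames and 120-Hz beam-line records share a common pulse identifier; the structures and field names are invented for illustration, not the actual data model.

      // Pair each FEE frame with the beam-line record for the same pulse.
      #include <cstdint>
      #include <optional>
      #include <unordered_map>
      #include <vector>

      struct Frame    { uint64_t pulse_id; std::vector<uint16_t> pixels; };
      struct BeamData { uint64_t pulse_id; double energy_mJ; double charge_nC; };
      struct Event    { Frame frame; BeamData beam; };

      std::optional<Event> merge(const Frame& f,
                                 std::unordered_map<uint64_t, BeamData>& pending) {
          auto it = pending.find(f.pulse_id);
          if (it == pending.end()) return std::nullopt;  // beam record not here yet
          Event ev{f, it->second};
          pending.erase(it);                             // consume the beam record
          return ev;
      }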

  11. ATCA Crate
  • ATCA: based on 10-Gigabit Ethernet backplane serial communication fabric
  • 2 custom boards:
    • Reconfigurable Cluster Element (RCE) module: interface to detector, up to 8 x 2.5 Gbit/s links to detector modules
    • Cluster Interconnect Module (CIM): managed 24-port 10-G Ethernet switching
  • One ATCA crate can hold up to 14 RCEs & 2 CIMs: essentially 480 Gbit/s switch capacity
  • Naturally scalable; can also scale up to multiple crates
  • (Photo: RCE, ATCA crate, CIM)

  12. Reconfigurable Cluster Element (RCE) Boards
  • Addresses performance issues with off-the-shelf hardware:
    • Processing/switching limited by the CPU-memory subsystem, not the CPU's MIPS count
    • Scalability, cost, networking architecture
  • Reconfigurable Cluster Element module, with 2 each of the following:
    • Virtex-4 FPGA
    • 2 PowerPC processor IP cores
    • 512 Mbyte RLDRAM, 8 Gbytes/s CPU-data memory interface
    • 10-G Ethernet event-data interface, 1-G Ethernet control interface
    • RTEMS operating system, EPICS
    • Up to 512 Gbyte of FLASH memory
  • (Photo: rear transition module, Reconfigurable Cluster Element module)

  13. Cluster Interconnect Module
  • Network card with 2 x 24-port 10-G Ethernet Fulcrum switch ASICs
  • Managed via Virtex-4 FPGA
  • Interconnects up to 14 in-crate RCE boards
  • Interconnects multiple crates or farm machines

  14. Data Acquisition/Mgmt Architecture – Level 2
  • (Same architecture diagram as slide 8, with the Level 2 nodes highlighted)
  • Level 2 DAQ nodes are responsible for:
    • High-level data processing, e.g. combining 10^5 to 10^7 images into a 3-D data set; learning/pattern recognition/classification/sorting of images; alignment, reconstruction
    • Local caching of the data
    • Generating real-time monitoring information
  • Different technologies are being evaluated for Level 2 nodes: ATCA/RCE crates; Linux cluster based on commercial machines (e.g. Dell PowerEdge 1950)
  • Level 3: SCCS archive/bulk storage

  15. Real-time Processing – Sorting in CXI
  • Diffraction from a single molecule: a single LCLS pulse yields a noisy diffraction pattern of unknown orientation
  • Combine 10^5 to 10^7 measurements into a 3D dataset: classify/sort, average, align, then reconstruct by oversampling phase retrieval
  • The highest achievable resolution is limited by the ability to group patterns of similar orientation; can this be done in real time? (similarity sketch below)
  • Miao, Hodgson, Sayre, PNAS 98 (2001); Gösta Huldt, Abraham Szöke, Janos Hajdu, J. Struct. Biol. (2003); 02-ERD-047
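
  One simple similarity score that could support such grouping is a normalized cross-correlation between two equal-sized patterns; this is a sketch of the general technique only, not the classification scheme the project will actually adopt.

      // Normalized cross-correlation of two diffraction patterns
      // (assumes a.size() == b.size() and non-constant inputs).
      #include <cmath>
      #include <cstddef>
      #include <vector>

      double similarity(const std::vector<float>& a, const std::vector<float>& b) {
          double ma = 0, mb = 0;
          for (size_t i = 0; i < a.size(); ++i) { ma += a[i]; mb += b[i]; }
          ma /= a.size(); mb /= b.size();
          double num = 0, da = 0, db = 0;
          for (size_t i = 0; i < a.size(); ++i) {
              num += (a[i] - ma) * (b[i] - mb);
              da  += (a[i] - ma) * (a[i] - ma);
              db  += (b[i] - mb) * (b[i] - mb);
          }
          return num / std::sqrt(da * db);  // 1.0 = identical up to scale/offset
      }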

  16. Computational Alignment
  • "The number currently used to obtain high-resolution structures of specimens prepared as 2D crystals is estimated to require at least 10^17 floating-point operations" (R. M. Glaeser, J. Struct. Biol. 128, 1999)
  • Experimental data (ALS): difference of pyramid diffraction patterns 10° apart (Gösta Huldt, Uppsala University)
  • "Computational alignment" requires large computational power that might only be available by performing offline analysis: save first, analyze later? To be investigated (rough estimate below)
  • (Diagram: scattering geometry with incident wavevector k_in, scattered wavevector k_out, momentum transfer q)
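
  A rough scale estimate, assuming (hypothetically) ~10 GFLOP/s sustained per node on the >1000-node cluster from the specifications slide:

      10^17 FLOP / (1,000 nodes x 10^10 FLOP/s per node) = 10^4 s ≈ 3 hours,

  which is why the offline "save first, analyze later" approach is on the table.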

  17. Data Acquisition System
  • Example: BNL XAMP detector, 1,024 x 1,024 pixel array
    • Row-by-row readout: 64 readout IOs
    • 8 eight-channel 20-MHz 14-bit ADCs + range bit (packing sketch below)
    • Instantaneous readout: 64 ch x 20 MHz x 16 bit = 20 Gbit/s into FPGA
    • Output of FPGA: 250 Mbytes/s at 120 Hz (1024 x 1024 x 2 bytes x 120 Hz)
  • (Block diagram: XAMP detector electronics (1,024 x 1,024 pixels & ASIC) -> experiment-specific front-end board over up to 4 x 2.5 Gb/s fibers (experiment-specific FEE, DAQ-to-experiment interface); SLAC common DAQ: ATCA with RCE & CIM; trigger strobe from VME IOC with EVR PMC timing module; fiber from EVG machine timing system (EVG-EVR protocol with time-stamp, beam code); dedicated 1-G Ethernet for 120-Hz beam and experiment data; 1-G Ethernet private subnet with e-log and controller PC; EPICS Channel Access gateway to SLAC WAN; 10-G Ethernet science data to NEH or FEH hutch-common local data storage and to SCCS)
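
  A sketch of how the 14-bit ADC sample plus range bit might be packed into the 16-bit pixel word used in the rate arithmetic above; the exact bit layout is an assumption.

      // Pack one 14-bit ADC sample plus its range bit into a 16-bit word.
      #include <cstdint>

      inline uint16_t pack_pixel(uint16_t adc14, bool range) {
          // bits [13:0] = ADC counts, bit 14 = range, bit 15 = spare
          return static_cast<uint16_t>((adc14 & 0x3FFF) | (range ? 0x4000 : 0));
      }

      inline uint16_t unpack_counts(uint16_t w) { return w & 0x3FFF; }
      inline bool     unpack_range (uint16_t w) { return (w & 0x4000) != 0; }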

  18. Data Acquisition and Processing
  • Example: pixel detectors
  • Calibration: without beam (dedicated calibration runs or in-between ~8-ms-spaced beams)
    • Images used to calculate calibration constants
    • Examples: dark-image accumulation and averaging, pedestal calculation; transfer-curve mapping, gain calculation; neighbor-pixel cross-talk correction
  • Readout of 120-Hz science images
  • Calibration correction: with beam; pedestal subtraction, piece-wise linear or polynomial correction, cross-talk compensation, other corrections? (sketch below)
  • Requirements to be determined after prototype detectors/ASICs are evaluated
  • Investigate C++ RTEMS processing versus VHDL
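
  A sketch of the per-pixel correction chain listed above (pedestal subtraction followed by a piece-wise linear correction); the two-segment model and the field names are assumptions, and the real constants would come from the calibration runs.

      // Per-pixel pedestal subtraction + two-segment piece-wise linear gain.
      #include <cstddef>
      #include <cstdint>
      #include <vector>

      struct PixelCal {
          float pedestal;  // dark-level offset (ADC counts)
          float knee;      // counts where the response changes slope
          float gain_lo;   // conversion gain below the knee
          float gain_hi;   // conversion gain above the knee
      };

      void correct(const std::vector<uint16_t>& raw,
                   const std::vector<PixelCal>& cal,
                   std::vector<float>& out) {
          out.resize(raw.size());
          for (size_t i = 0; i < raw.size(); ++i) {
              float v = raw[i] - cal[i].pedestal;        // pedestal subtraction
              out[i] = (v <= cal[i].knee)
                         ? v * cal[i].gain_lo
                         : cal[i].knee * cal[i].gain_lo
                           + (v - cal[i].knee) * cal[i].gain_hi;
          }
      }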

  19. Data Acquisition and Processing (2)
  • Event building with 120-Hz accelerator timing & beam-quality data
    • Attach 120-Hz time-stamp and beam code to pipelined images
    • Attach 120-Hz beam-quality data to pipelined images
  • Dataflow consistency monitoring & diagnostics; error detection and recovery
  • Filtering of images: explore partial online data reduction
    • Investigate rejection using 120-Hz accelerator beam data (sketch below)
    • Investigate feature extraction from calibrated images
    • Investigate filtering using extracted features
    • Requirements to be determined after prototype detectors/ASICs are evaluated
    • Investigate C++ RTEMS processing versus VHDL
  • Real-time event monitoring
    • Monitor quality of images at a ~5-Hz rate
    • Processing of images; display on user monitor
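
  A minimal sketch of beam-data-based rejection; the cut variables and limits are placeholders, since the slide lists the actual criteria as still to be investigated.

      // Reject frames using the attached 120-Hz beam-quality record.
      struct BeamQuality { double pulse_energy_mJ; double ebeam_charge_nC; };

      bool keep_frame(const BeamQuality& bq) {
          if (bq.pulse_energy_mJ < 0.5) return false;  // FEL pulse too weak
          if (bq.ebeam_charge_nC < 0.1) return false;  // bad machine shot
          return true;                                 // pass to Level 2 / storage
      }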

  20. Spectrometer Data Acquisition
  • Up to 8-GHz waveform sampling: 1-µs record, or 1 GHz with a 500-µs window
  • At 120 Hz -> ~100 Mbytes/s (arithmetic below)
  • High performance needed to move or process the data
  • Waveform is time-stamped via EVR; the respective beam-quality data is attached to each waveform
  • Agilent Acqiris 8-GHz 10-bit DC282 cPCI digitizer module
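
  Checking the quoted rate for the 1-GHz/500-µs mode, assuming the 10-bit samples are stored in 16-bit words:

      10^9 samples/s x 500 µs x 2 bytes x 120 Hz = 120 MB/s,

  in line with the ~100 Mbytes/s figure above.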

  21. Offline Data Management and Scientific Computing
  • Data format and API (online/offline)
  • Data catalog, metadata management
  • Electronic logbook
  • Processing framework, workflow, pipeline
  • Mass storage system
  • Offline data-export system
  • Offline processing cluster
  • Offline disk server
  • Science tools
  • Scientific computing for LUSI science
    • Opportunities and needs being evaluated; very dependent on the detailed nature of the science
    • Unprecedented size (for photon science) of data sets to be analyzed; unprecedented computational needs (for photon science)
    • Comparable in scale to a major high-energy physics experiment, but with a greater need for flexibility

  22. Applications
  • User programs: endstation operation, calibration, alignment
  • Interface to software for diffraction/scattering experiments: SPEC
  • Interface to instrumentation/analysis software: MATLAB, LabVIEW
  • User tools: STRIP tool, ALARM handler

  23. Summary
  • Control subsystem based on EPICS, the standard at SLAC
  • Controller selection in progress, paced by beam-line definition
  • Starting to assemble test setups
  • Moving and processing science data is the key data-acquisition task
    • AMO data acquisition via 10-bit 8-GHz cPCI waveform-sampling modules
    • 2D pixel-detector acquisition via SLAC ATCA DAQ modules
  • Peak data rate/volume requirements are comparable to HEP experiments, requiring a separate data-acquisition and management system
  • Leverage significant expertise at SLAC in data acquisition and management
  • Prototypes of ATCA DAQ modules in hand
