
EIC Workshop, 21 May 2008: Experience with High Trigger Rates @ JLAB. R. Chris Cuevas, Jefferson Lab.


Presentation Transcript


  1. EIC Workshop 21 May 2008. Experience with high trigger rates @ JLAB. R. Chris Cuevas, Jefferson Lab, Experimental Nuclear Physics Division.
Topics:
• CEBAF's Large Acceptance Spectrometer (CLAS)
• Trigger design parameters
• Performance notes, 1996 to 2008
• 12 GeV Upgrade: trigger requirements & solutions
• EIC trigger requirements: new challenges (a few slides from the Dec 2007 workshop)
• Future technology

  2. CLAS Trigger Design Parameters
• Photon & electron experiments with polarized targets, polarized beam
• High luminosities, 10^34 cm^-2 s^-1; DAQ event rate designed to 10 kHz
• Dead-timeless, low-latency Level 1 (<125 ns)
• Pipelined (133 MHz) clock
• Fast Level 1 for ADC gate, TDC start
• Level 2 (Drift Chamber) pass/fail
• Up to 32 front-end readout crates (ROCs)
• Sector-based, pattern-recognition programming
Implementation
• VXI 9U x 360 mm sector modules (Level 1 Router)
• Event Processor (programmable sector coincidences)
• Very low propagation pipeline stage (15 ns)
• ECLps technology for most logic
• Trigger Supervisor manages trigger signals and interrupts the front-end crate ReadOut Controllers
• Level 2 trigger signal created with external logic from the Drift Chamber system; a Level 2 fail issues a Fast Clear to the FastBus modules
(Figure: CLAS detector, 6 identical sectors; labels: TOF, Drift Chambers (3 regions), Cerenkov, ECal)
(A small software model of the programmable sector coincidence follows below.)
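A minimal sketch, in Python and purely illustrative, of how the Event Processor's programmable sector coincidence could be modeled: each sector contributes one Level 1 bit, and a programmable set of patterns selects which sector combinations fire the trigger. The real Event Processor is custom ECLps/VXI hardware; the function names and pattern choices below are assumptions.

# Illustrative model of a programmable sector-coincidence trigger
# (hypothetical software sketch, not the actual CLAS Event Processor logic).

def sector_coincidence(sector_bits, allowed_patterns):
    """sector_bits: list of 6 booleans, one Level 1 bit per CLAS sector.
    allowed_patterns: set of 6-bit masks that should produce a trigger."""
    pattern = 0
    for i, hit in enumerate(sector_bits):
        if hit:
            pattern |= (1 << i)
    return pattern in allowed_patterns

# Example: trigger on any pair of opposite sectors (1&4, 2&5, 3&6), a
# hypothetical topology for two-track events.
opposite_pairs = {0b001001, 0b010010, 0b100100}
print(sector_coincidence([True, False, False, True, False, False], opposite_pairs))  # True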

  3. CLAS Overall Trigger Block Diagram (block diagram; labels only). Forward Carriage trigger crate contains the Event Processor, Trigger Supervisor, and Sector 1-6 Level 1 logic, with TOF, EC, and CC inputs (LAC not shown). Level 2 pass/fail is formed by external Level 2 logic from the Drift Chambers (ADBs); a fail issues a Clear. DG535 delay units feed 1877 ROC stops on Decks 1-3; the Level 1 Accept ("HBTG") provides start/gate to the forward carriage ROCs, with branch cables to all ROCs and a BUSY return to the Trigger Supervisor. Cuevas, Electronics Group, 18 May 1999.

  4. Level 2 Trigger Block Diagram: Track Segment Finders and Majority Logic (block diagram; labels only). Drift chamber superlayers (R1 Axial, R2 Axial/Stereo, R3 Axial/Stereo) feed Segment Finders and Segment Collectors on Space Frame Decks 1-3, one chain per sector (Sectors 1-6). The VME Majority Logic boards, designed on the VME Flexible I/O format, have two NIM outputs per board: one drives a local TDC and the other goes to the forward carriage Level 2 Latch module. The majority function is selected by VME control from a CPU. Cuevas, Electronics Group, 18 May 1999. (A simple software model of the majority function follows below.)
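A hedged illustration of the majority-logic idea (Python model, not the actual board firmware): count how many superlayers of a sector report a track segment and pass Level 2 if the count reaches a programmable majority. The threshold value used here is a placeholder.

# Illustrative model of drift-chamber majority logic (hypothetical sketch;
# the real boards implement this in hardware with a VME-selectable function).

def level2_majority(segment_flags, majority=3):
    """segment_flags: booleans, one per superlayer, True if a track segment
    was found in that superlayer of a given sector.
    majority: programmable number of superlayers required for a Level 2 pass."""
    return sum(segment_flags) >= majority

# Example: segments found in 4 of 5 superlayers -> Level 2 pass
superlayers = [True, True, False, True, True]
print(level2_majority(superlayers, majority=3))  # True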

  5. CLAS Trigger Performance Notes (1996 to Present)
• All FastBus front-end modules: 1872A TDC, 1877 TDC, 1881 ADC
• Struck FastBus Interface, Motorola VME CPU
• No Level 2 implemented initially
• Long conversion time (1.8 µs/channel, 1872A single-hit TDC) limits the trigger rate
• Events NOT pipelined in the ReadOut Controller (ROC)
• ATM network to the ROCs
• < 3 kHz event trigger rate
Upgrades:
• Level 2 implemented: Drift Chamber regions with majority logic
• An electron (L1) and a track in a sector can be combined for an L2 pass or fail
• Replaced the 1872A with a VME pipeline TDC (CAEN)
• Upgraded the Motorola CPU (ROC)
• 100 MB Ethernet network adapted to the ROCs
• Other DAQ methods improve the event trigger rate to ~8 kHz
Limitations:
• Trigger Supervisor supports 32 crates max
• Triggers are not pipelined (1 trigger -> 1 event readout cycle)
• Gated ADC (1881): analog signal "stored" in delay cable
• Photon experiment triggers not easily implemented with the Level 1 hardware
(A back-of-the-envelope dead-time model follows below.)
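To illustrate why a long, non-pipelined conversion and readout time caps the trigger rate, here is a hedged back-of-the-envelope model. The raw rate and per-event readout time below are hypothetical placeholders, not numbers from the slides; only the "1 trigger -> 1 readout cycle" behavior comes from the slide.

# Hedged illustration of dead-time-limited trigger rate for a
# non-pipelined (1 trigger -> 1 readout cycle) system.

def accepted_rate(raw_rate_hz, dead_time_s):
    """Classic non-paralyzable dead-time model:
    accepted = raw / (1 + raw * dead_time)."""
    return raw_rate_hz / (1.0 + raw_rate_hz * dead_time_s)

raw = 20e3      # 20 kHz of raw Level 1 triggers (hypothetical)
dead = 100e-6   # 100 us per event readout cycle (hypothetical placeholder)
print(f"accepted ~ {accepted_rate(raw, dead):.0f} Hz")  # ~6667 Hz, well below the raw rate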

  6. Other Performance Notes, CLAS Trigger (1996 to Present)
• Relatively low failure rate of the FastBus instrumentation
• Aging BiRa FastBus crate power supplies will be replaced with a Wiener product for the 1877 TDC crates only
• The air-flow cooling design has worked well
• Virtually no hardware failures for the custom Level 1 Trigger System (VXI crate and power supply converted to a Wiener product)
• Virtually no hardware failures for the custom Level 2 Trigger modules (~400)
• Very recent implementation of VME CAEN programmable (FPGA) logic modules for the g12 photon beam experiment. The photon trigger hardware is coupled to the original Level 1 modules to create triggers for CLAS. In use since April 2008, with excellent results and a new trigger GUI.

  7. 12 GeV Trigger Requirements: Hall D and Hall B

  8. 12 GeV Trigger Requirements: Hall D
• Reduce the total hadronic rate from 350 kHz to the true tagged hadronic rate of ~14 kHz
• Use the Level 1 hardware trigger and the Level 3 farm to achieve this 25:1 reduction
• The Level 1 hardware trigger efficiently cuts low-energy photon interactions
• The Level 1 trigger hardware design goal is 200 kHz
• Level 1 uses:
  • Energy sum from the Barrel Calorimeter
  • Energy sum from the Forward Calorimeter
  • Charged track counts (hits) from the TOF
  • Charged track counts (hits) from the Start Counter
  • Tagger energy (hit counts)
• Simulations show that this Level 1 cut method achieves a ~150 kHz trigger rate
• Relatively 'open' Level 1 trigger
(Figure: energy spectrum; labels: cut background vs. 'physics' event)

  9. 12 GeV Trigger Hardware: Hall D. How do you perform this Level 1 cut with hardware?
(Figure: front-end data path; labels: analog data into the front-end "digitizer", digital pipeline, FE/DAQ interface with event block buffers to the ROC; a trigger is formed every event, readout is blocked every n (256) events)
• Use Flash ADCs for the detector signals that are included in the Level 1 trigger
• Detector signals are stored in the front-end boards
• The energy sum is computed at the board, crate, and subsystem (BCAL, FCAL) level
• A synchronous system and the Trigger Supervisor perform event blocking at the ROC level
• An 8 µs buffer on the front-end boards allows for the trigger decision (latency)
• Use high-speed fiber-optic/serial data transfer between front-end crates
• Easily supports 64 readout crates and is easily expandable
(A behavioral sketch of the hierarchical energy sum follows below.)
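A hedged behavioral sketch (Python; the real system does this in FPGA logic on the FADC boards and crate sum modules) of the hierarchical energy sum: channel samples are summed per board, board sums per crate/subsystem, while the raw samples sit in a latency buffer until the Level 1 decision arrives. Class and variable names here are illustrative assumptions.

# Behavioral sketch of the hierarchical Level 1 energy sum
# (hypothetical Python model of what the FADC/crate-sum firmware computes).
from collections import deque

LATENCY_SAMPLES = 2000          # 8 us buffer / 4 ns per sample = 2000 samples

class FadcBoard:
    def __init__(self, n_channels=16):
        self.pipeline = deque(maxlen=LATENCY_SAMPLES)   # raw samples awaiting the L1 decision
        self.n_channels = n_channels

    def clock(self, samples):
        """samples: one ADC value per channel for this 4 ns clock tick."""
        self.pipeline.append(samples)                   # keep raw data for later readout
        return sum(samples)                             # board-level partial energy sum

def subsystem_sum(board_sums):
    """Crate/subsystem level: add up the partial sums from all boards."""
    return sum(board_sums)

# Example tick: 3 boards, each reporting 16 channel samples
boards = [FadcBoard() for _ in range(3)]
tick = [[10] * 16, [0] * 16, [5] * 16]
print(subsystem_sum(b.clock(s) for b, s in zip(boards, tick)))   # 240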

  10. Block Diagram: Hall D Level 1 Trigger (12 GeV trigger hardware; diagram labels only). Crate level: VXS crates with FADCs and crate sum/time (8-input) processing for the BCAL sum (12 crates, fiber links), FCAL sum (12 crates, fiber links), TOF track count (2 crates, longest link), Tagger energy (2 crates), Start Counter track count (1 crate), and the Pair Spectrometer. Subsystem level: Energy Sum Processors (the TOF/Start Counter versions process track counts instead). Global level: the Global Trigger Processor selects on FCAL energy, BCAL energy, photon energy, and track counts (<, >, =). The Trigger Supervisor distributes clock, trigger, and sync and controls the ROCs; signal distribution to the front-end crates is over fiber links.

  11. Latest Designs: VXS high-speed serial backplane, 16-channel 250 Msps Flash ADC, Energy Sum Module.

  12. 12 GeV Trigger Requirements: Hall B
• Photon & electron experiments with polarized targets, polarized beam
• Increase luminosity to 10^35 cm^-2 s^-1
• DAQ event rate increases to 10 kHz (25-30 MB/s)
• Retain the sector-based trigger scheme
• Add PreCal and a Low Threshold Cerenkov counter
• Add a Silicon Vertex Tracker and a Central TOF
• Upgrade the Drift Chamber Level 2 hardware
• Replace the FastBus ADC modules with Flash ADCs (keep the multi-hit 1877 TDCs)
• The Flash ADC design will be used for calorimeter energy sums and 'cluster' finding
• The Level 1 trigger will be promptly sent to the SVT and the Drift Chamber Level 2 hardware
• Level 2 will employ a 'road finder' to link all three Drift Chamber regions per sector
• Flash ADC and custom trigger modules will be identical to Hall D for cost savings and efficient use of design and implementation resources!

  13. 12 GeV Front End Electronics & Level 1 Trigger Modules (Cuevas, updated 28 March 2008; diagram labels only). Fiber-optic clock/trigger distribution runs from the Trigger Supervisor crate to the VXS front-end crates; detector signals enter the VXS crates; the Crate Trigger Processors connect over fiber-optic links to the Global Trigger Processor crate (a standard VXS crate, with the GTP implemented on two switch slots).
Approximate board counts:
• FADC250: ~400
• F1TDC: ~140
• Crate Trigger Processor (CTP): <40
• Subsystem Processor (SSP): 8
• Global Trigger Processor (GTP): 2
• Trigger Distribution (TD): 12; Trigger Supervisor (TS): 1
• Trigger Interface: ~80
• Signal Distribution: ~80

  14. A few other nice features of the JLAB custom trigger modules (12 GeV trigger hardware). Collaborative effort of the JLAB Fast Electronics, JLAB DAQ, and Christopher Newport University groups.

  15. Subsystem Processors (SSPs)
• All subsystem processors reside in the Global Trigger Crate
• All subsystem processors are the same physical PC boards!
• Each SSP receives up to eight four-lane "crate data links"
• Some SSPs are divided into two boards (because of crate count); if so, both boards' "partial results" are sent to the global processor
• Eight SSPs are needed:
  • Two for BCAL: Energy Subsystem Processor (ESP)
  • Two for FCAL: Energy Subsystem Processor (ESP)
  • One for the Start Counter: Hit Subsystem Processor (HSP)
  • One for the TOF: Hit Subsystem Processor (HSP)
  • One for the Tagger: Tagger Subsystem Processor
  • One spare!
• Each subsystem processor sends time-stamped Subsystem Event Reports (SERs) to all Global Trigger Processors (as in the CTP-SSP link); a hedged sketch of such a report follows below.
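A minimal sketch of what a time-stamped Subsystem Event Report might carry, written as a Python dataclass purely for illustration; the field names, widths, and example values are assumptions, not the actual SER link format.

# Hypothetical illustration of a time-stamped Subsystem Event Report (SER);
# field names and contents are assumptions, not the real data format.
from dataclasses import dataclass

@dataclass
class SubsystemEventReport:
    timestamp: int        # global clock tick (4 ns granularity) of the report
    subsystem: str        # e.g. "BCAL", "FCAL", "TOF", "START", "TAGGER"
    energy_sum: int       # summed energy (ADC counts) for energy subsystems
    hit_count: int        # track/hit count for hit subsystems
    partial: bool         # True if this SSP carries only part of the subsystem

# Example: one report from a (hypothetical) BCAL energy subsystem processor
ser = SubsystemEventReport(timestamp=123456, subsystem="BCAL",
                           energy_sum=5120, hit_count=0, partial=True)
print(ser)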

  16. SSP

  17. The Global Trigger Crate
• Eight SubSystem Processors (SSPs) on one logical "side"
• One or more Global Trigger Processors (GTPs) on the other "side"
• SSPs are connected to the GTPs via a "partial-mesh" backplane
• 8 x 2 mesh initially (VXS!)
• Each SSP talks to each GTP via a four-lane Aurora backplane link
• Each SSP sources two four-lane links to the backplane
• Each GTP sinks eight four-lane links from the backplane
(A small link-budget check follows below.)
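A small consistency check of the link budget implied by those numbers (Python, purely illustrative): with 8 SSPs each sourcing two four-lane links and 2 GTPs each sinking eight four-lane links, the 8 x 2 partial mesh gives every SSP a dedicated link to every GTP.

# Link-budget check for the 8 x 2 SSP-to-GTP partial mesh (illustrative only).
N_SSP, N_GTP = 8, 2
LINKS_PER_SSP = 2        # each SSP sources two four-lane links
LINKS_PER_GTP = 8        # each GTP sinks eight four-lane links

sourced = N_SSP * LINKS_PER_SSP
sunk = N_GTP * LINKS_PER_GTP
assert sourced == sunk == N_SSP * N_GTP   # one link per SSP-GTP pair
print(f"{sourced} four-lane links: every SSP reaches every GTP")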

  18. The Global Trigger Crate (logical view). (Diagram; labels only: VME/VXS crate with clock/trigger in; SSP array of BCAL ESPs, FCAL ESPs, TOF HP, Start Counter HP, and photon (Tagger) EP feeding the GTP array.)

  19. GTP Logic (diagram; labels only: 2-8 trigger bits, 8 trigger bits)

  20. Example GTP Trigger Equation Implementation
• Z >= TFM*HTOF + EFM*EFCal + RM*((EFCal + 1)/(EBCal + 1))
• HTOF: hits, forward TOF
• EFCal: energy, forward calorimeter
• EBCal: energy, barrel calorimeter
• The equation was implemented in VHDL using Xilinx synthesis tools and a Virtex 5 LX220 FPGA
• All computing is done in pipelined, 32-bit floating-point arithmetic
• Subsystem processor data is converted from integers to floating point
• The equation is computed every 4 ns, and the trigger bit is updated if Z is above a programmable threshold
• Each coefficient is "variable": it can be changed very quickly without having to reprogram the FPGA
• Used Xilinx-specific math libraries (+, -, *, /, sqrt)
• Synthesis and implementation used 3% of the LX220 FPGA
• Latency was 69 clock cycles => 276 ns delay introduced for forming the L1 trigger
• The algorithm was targeted to run at a 300 MHz clock speed without significant effort
(A worked software model of this equation follows below.)
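Below is a hedged software model of that trigger equation (Python, floating point; the firmware itself is pipelined VHDL on the FPGA, producing a new Z every 4 ns clock tick). Only the form of the equation comes from the slide; the coefficient values, threshold, and example inputs are made-up placeholders.

# Software model of the example GTP trigger equation
#   Z = TFM*HTOF + EFM*EFCal + RM*((EFCal + 1)/(EBCal + 1))
# Coefficients, threshold, and inputs below are hypothetical placeholders.

def gtp_trigger(htof, efcal, ebcal,
                tfm=1.0, efm=0.5, rm=2.0, z_threshold=10.0):
    """Return (Z, trigger_bit) for one evaluation of the equation."""
    z = tfm * htof + efm * efcal + rm * ((efcal + 1.0) / (ebcal + 1.0))
    return z, z >= z_threshold

# Example: 3 forward-TOF hits, 12 units of FCAL energy, 4 units of BCAL energy
z, fired = gtp_trigger(htof=3, efcal=12.0, ebcal=4.0)
print(f"Z = {z:.2f}, trigger bit = {fired}")   # Z = 14.20, trigger bit = True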

  21. EIC Trigger Hardware Goals (Copied Slides)

  22. EIC Trigger Hardware Goals (Copied Slides)

  23. Summary
• The CLAS high-rate trigger system has been very reliable for over a decade
• Evolution of improvements to the trigger rate by 'upgrading' aging hardware
• Flash ADCs are used for the detector signals that 'create' the trigger
  • 250 MHz sample rate (4 ns) with an 8 µs data buffer
  • Energy summing and other trigger logic created from detector signals
• Elegant VXS backplane implementation takes advantage of high-speed serial links
• Latest field-programmable gate arrays used to implement trigger 'equations'
• Fully synchronous system managed by the Trigger Supervisor
  • Trigger distribution
  • Event data blocking and ReadOut Controller interrupts
• Flexible system design (same module designs) used for two complex experimental halls
• New FPGA inputs can accept very high-rate (1.2 Gbps) input data from higher-speed ADC chips

  24. Questions? Discussion?

  25. Other Stuff

  26. Examples of physical rack layout drawings
• ALL equipment must be shown to identify rack space issues (i.e. network gear, patch panels, splitter panels, etc.)
• Airflow/cooling issues will need to be identified and resolved
(Figure: 36" deep, 19" standard JLAB rack holding a VXS crate with (16) FADC-250, a sum board, (1) clock/trig/sync module, and fan tray/crate control; CPU not shown)
