
SDHCAL Data Acquisition Status and Plans — C. Combaret, G. Beaulieu, L. Mirabito

This document provides an overview of the current hardware and software setup for SDHCAL data acquisition, as well as plans for future development. It includes information on the hardware configuration, software architecture, and upcoming projects for both ILC and CMS applications.





Presentation Transcript


  1. SDHCAL Data Acquisition Status and Plans — C. Combaret, G. Beaulieu, L. Mirabito

  2. Current Status
  Hardware
  • 50 planes (1 m² each) of GRPC equipped with HR2 chips, inside a stainless-steel structure
  • Clock and synchronous commands are sent over HDMI cables between a central SDCC and 7 DCC boards (1 DCC for 8 planes)
  • Three new DIF firmwares: the central DIF of each plane fans in/out the clock, commands and busy to the 2 lateral ones, so only one HDMI connection is needed per plane
  • Data is read out through USB2
  • A dedicated mezzanine with a USB2 FTDI chip is plugged onto each DIF
  • Same firmware and software as with the USB1 FTDI chip
  Software
  • Raspberry Pi micro PCs equipped with 12-port homemade USB hubs as DAQ frontend
  • 150 USB2 links to 13 Raspberry Pi micro PCs (1 PC for 4 planes)
  • XDAQ not yet supported on the Raspberry Pi
  • DAQ software migrated to DIM for communication between PCs (a minimal server-side sketch follows below)
  • Possibility to keep the XDAQ (or EUDAQ) architecture in the backend
  • Same data structure and DB configuration
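As an illustration of the DIM-based communication mentioned above, here is a minimal sketch of a DIM server running on a readout PC and publishing a counter to the rest of the DAQ network. The service name DIF_01/EVENT_COUNT, the task name and the published variable are hypothetical placeholders, not the actual SDHCAL service layout; only the standard DIM C++ API (dis.hxx) is assumed.

```cpp
// Minimal DIM server sketch (hypothetical service name, not the real SDHCAL layout).
// Requires the CERN DIM library and DIM_DNS_NODE pointing to the DIM name server.
#include <dis.hxx>
#include <unistd.h>

int main() {
  int nEvents = 0;                                    // value published to other PCs
  DimService status("DIF_01/EVENT_COUNT", nEvents);   // hypothetical service name
  DimServer::start("DIF_READOUT_01");                 // register this task with the DIM DNS

  while (true) {                                      // stand-in for the real readout loop
    ++nEvents;                                        // would be driven by actual USB readout
    status.updateService();                           // push the new value to all subscribers
    sleep(1);
  }
  return 0;
}
```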

  3. December 2014 Beam Test Setup
  • A single HDMI connection per plane
  • Flat-cable connection to the lateral DIFs
  • 13 RPis with 12-port USB hubs

  4. Hardware General View of m3 (block diagram): trigger and beam start/stop signals reach the SDCC (read out over USB by an RPi), which distributes the clock and commands over HDMI with a homemade protocol to the DCCs (1 DCC for 8 cassettes, plus 1 spare). In each cassette the DIFs (DIF_G, DIF_M, DIF_D) serve 6 ASUs of 24 HR2 chips each; board and ASIC configuration and acquisition data flow over USB/USB2 through a USB hub to a Raspberry Pi (1 hub + RPi for 4 cassettes).

  5. Software General View of m3 (block diagram): a configuration DB (Oracle) holds the DIF and ASIC configurations and is served through a DIM DB Manager. On each DAQ RPi a DIF Data Handler sits on the USB device driver and is exposed as a DIM DIF Server; DIM DIF Clients feed the DIF Readout, which fills shared memory read by the DIM Writer (archived data) and the online analysis. The SDCC is handled by a CCC Manager / DIM CCC Server over its USB device driver, with its own CCC Readout. Run control talks to all components through the DIM interface (a minimal client-side sketch follows); the readout processes run on the DAQ RPis, everything else can run anywhere on the DAQ network.
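To complement the server sketch above, here is a minimal sketch of the client side of such an architecture: a process anywhere on the DAQ network subscribing to a DIF service over DIM. The service name DIF_01/EVENT_COUNT is the same hypothetical placeholder as before; only the standard DIM C++ client API (dic.hxx) is assumed.

```cpp
// Minimal DIM client sketch, subscribing to a (hypothetical) DIF service
// published by a readout RPi; requires DIM_DNS_NODE to be set.
#include <dic.hxx>
#include <cstdio>
#include <unistd.h>

class EventCountInfo : public DimInfo {
public:
  // -1 is the "no link" value returned when the server is unreachable
  EventCountInfo() : DimInfo("DIF_01/EVENT_COUNT", -1) {}
  void infoHandler() override {
    std::printf("DIF_01 event count: %d\n", getInt());  // called on every update
  }
};

int main() {
  EventCountInfo counter;   // subscription starts as soon as the object exists
  while (true) pause();     // DIM delivers updates from its own thread
  return 0;
}
```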

  6. Plans
  What is in progress (ILC)
  • Common beam test with other CALICE detectors
  • Light(?!) adaptation of software, hardware and firmware to make it compliant with the other detectors
  • The main issue is clock and command compatibility
  What we plan to build very soon (ILC)
  • A few (4) GRPC detectors (2 or 3 m²) equipped with HR3 chips
  • One (new) DIF per plane
  • Data is read out through TCP/IP links (see the TCP sketch after this slide)
  • Clock and fast commands are sent to the DIF using the dedicated TTC protocol
  • Software still based on DIM
  What we plan to build very soon (CMS)
  • A few (tbd) GRPC inside leak-tight cassettes with TDC/Petiroc electronics
  • Eventually the DAQ will need to be CMS compliant (GBT use?) and integrate into the CMS DAQ/Trigger framework
  • In the short term, a TTC-compliant DAQ is sufficient.
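Since the Wiznet chip on the planned DIF terminates the TCP stack on the detector side, the host side of the TCP/IP readout can be a plain TCP client. Below is a minimal sketch of such a client; the IP address, port and block size are hypothetical placeholders, and the data framing of the real DIF is not assumed.

```cpp
// Minimal PC-side TCP readout sketch for the planned HR3 DIF.
// IP address, port and buffer size are hypothetical; only POSIX sockets are used.
#include <arpa/inet.h>
#include <sys/socket.h>
#include <unistd.h>
#include <cstdio>
#include <vector>

int main() {
  int fd = socket(AF_INET, SOCK_STREAM, 0);
  sockaddr_in dif{};
  dif.sin_family = AF_INET;
  dif.sin_port = htons(10001);                         // hypothetical DIF data port
  inet_pton(AF_INET, "192.168.1.21", &dif.sin_addr);   // hypothetical DIF address

  if (connect(fd, reinterpret_cast<sockaddr*>(&dif), sizeof(dif)) < 0) {
    std::perror("connect to DIF failed");
    return 1;
  }

  std::vector<char> buffer(64 * 1024);                 // raw readout block
  for (;;) {
    ssize_t n = read(fd, buffer.data(), buffer.size());
    if (n <= 0) break;                                 // link closed or error
    std::printf("received %zd bytes of DIF data\n", n);  // hand off to DIM / event builder here
  }
  close(fd);
  return 0;
}
```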

  7. TCP/IP + TTC solution
  Advantages
  • No HW implementation needed: the whole TCP stack is already implemented in the Wiznet chip
  • The TTC protocol is well known; many hardware and software implementations already exist
  • TTC hardware is well known; many boards already exist and are available (especially from the CERN pool)
  • Could quite easily be made compliant with the m3 prototype (tbc) to be used as a tail catcher
  • Current baseline for the ILC SDHCAL
  Disadvantages
  • Requires 1 Ethernet cable per DIF for data and 1 for TTC
  • TTC technology is becoming old
  • Not compliant with the CMS upgrade plans

  8. GBT solution
  Advantages
  • Technology compliant with the CMS upgrades
  • Lighter in terms of cables (everything on 2 fibers)
  • More modern technology
  Disadvantages
  • Requires more expensive hardware (tbc)
  • Designed for CMS -> overkill for ILC in terms of bandwidth
  • Power consumption quite high for ILC (even if an LP-GBT exists with <1 W dissipation)
  • That said, high-quality hardware and software work has already been performed in the CMS collaboration
  Ideas
  • Could be used as 1 GBT link per « super module » if someone designs a concentrator board to accommodate the DIFs
  • Maybe a lighter (slower, lower-bandwidth, lower-consumption) GBT could be imagined between the DIFs and the concentrator (but is it really needed?)

  9. Summary: What will/could be done?
  ILC: develop the HR3 system with the current baseline (TTC & TCP/IP DIF) -> mandatory to have a technological prototype of the SDHCAL with HR3.
  CMS: use the new DIF as a GBT experimentation platform to read Petiroc+TDC; otherwise use a GLIB board (an FMC board to accommodate the ASU link is required).
