
GLAST Large Area Telescope ISOC Peer Review Test Bed
Terry Schalk, GLAST Flight Software


Presentation Transcript


  1. Gamma-ray Large Area Space Telescope
     GLAST Large Area Telescope ISOC Peer Review Test Bed
     Terry Schalk, GLAST Flight Software
     t@slac.stanford.edu

  2. General Overview
     • The DAQ group will provide 3 sets of hardware, built in parallel, to test the Trigger and Data Flow system:
       • Front End Simulator (FES) using a "Test Bed LAT" (LAT-TD-02895):
         • Real electronics (all of it) up to the front-end cables on the TEMs for all subsystems
         • No front-end electronics or sensors
         • Simulated data is driven down the cables for testing
         • This configuration allows both static (data flow integrity) and dynamic (data flow throughput) testing; see the sketch after this slide
       • Functioning Tower without sensors
       • Functioning Tower with a subset of working sensors
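
The bullet on static vs. dynamic testing above is the core of what the FES configuration buys: the same simulated data stream can be checked for bit-level integrity or pushed at rate to measure throughput. The C++ sketch below illustrates that distinction; the Event type, the readOutEvents() loopback, and all numbers are hypothetical stand-ins, not the actual test-bed interfaces.

// Sketch (hypothetical names) of the two test modes the FES enables: a static
// check that every injected event comes back intact, and a dynamic check that
// measures sustained event rate through the T&DF chain.
#include <chrono>
#include <cstddef>
#include <cstdint>
#include <cstdio>
#include <vector>

struct Event { std::uint32_t id; std::vector<std::uint8_t> payload; };

// Placeholder for the real readout path (TEM -> GASU -> EPU in the test bed).
std::vector<Event> readOutEvents(const std::vector<Event>& injected) {
    return injected;  // loopback stand-in
}

bool staticIntegrityTest(const std::vector<Event>& injected) {
    auto readBack = readOutEvents(injected);
    if (readBack.size() != injected.size()) return false;
    for (std::size_t i = 0; i < injected.size(); ++i)
        if (readBack[i].payload != injected[i].payload) return false;
    return true;  // every event survived the data path unmodified
}

double dynamicThroughputTest(const std::vector<Event>& injected) {
    auto t0 = std::chrono::steady_clock::now();
    readOutEvents(injected);
    std::chrono::duration<double> dt = std::chrono::steady_clock::now() - t0;
    return injected.size() / dt.count();  // events per second
}

int main() {
    std::vector<Event> events(1000, Event{0, std::vector<std::uint8_t>(64, 0xAB)});
    std::printf("integrity : %s\n", staticIntegrityTest(events) ? "pass" : "fail");
    std::printf("throughput: %.0f events/s\n", dynamicThroughputTest(events));
}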

  3. LAT FSW: Executes on SIU and EPU
     FSW is an integral part of the data acquisition (DAQ) subsystem and is managed, budgeted, and scheduled as part of the DAQ subsystem.
     Block diagram labels: TKR Front-End Electronics (MCM), ACD Front-End Electronics (FREE), CAL Front-End Electronics (AFEE), 16 Tower Electronics Modules & Supplies, Global-Trigger/ACD-EM/Signal-Distribution Unit*
     • 3 Event-Processor Units (EPU) (2 + 1 spare)
       • Event processing CPU
       • LAT Communication Board
       • SIB
     • Spacecraft Interface Units (SIU)*
       • Storage Interface Board (SIB): spacecraft interface, control & telemetry
       • LAT control CPU
       • LAT Communication Board (LCB): LAT command and data interface
     • Power-Distribution Unit (PDU)*
       • Spacecraft interface, power
       • LAT power distribution
       • LAT health monitoring
     * Primary & Secondary Units shown in one chassis

  4. Software Test-Bed for EM2/FU Tests
     • Provides a full DAQ system with EM2 hardware (interfaces and functionality identical to flight) before flight hardware is available:
       • 16 TEMs and 16 TEM power supplies
       • Front-end simulators
       • Full ACD EM2 electronics
       • Fully redundant, complete GASU and PDU
       • 2 SIUs and 3 EPUs
     • Allows realistic boot and startup tests
     • Because all the redundant data paths and components are present, failover procedures can also be tested (a sketch of such a test follows this slide)
     Figure labels: TKR and CAL Electronics Simulators, TEM DAQ Modules, TEM Power Supplies, 12 ACD Electronics Cards
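
As noted in the last bullet, having both primary and redundant GASU/PDU sides populated means failover procedures can be rehearsed before flight hardware exists. The sketch below outlines one such exercise in C++; selectSide(), bootAndConfigure(), and runEventFlowCheck() are hypothetical stand-ins for real test-bed operations, not actual FSW or ground-software calls.

// Sketch (hypothetical interfaces) of a failover exercise the EM2 test bed
// permits: run on the primary side, switch to the redundant side, and confirm
// the data path still delivers events.
#include <cstdio>

enum class Side { Primary, Secondary };

// Stand-ins for real test-bed control and readout operations.
bool selectSide(Side s) {
    std::printf("selecting %s side\n", s == Side::Primary ? "primary" : "secondary");
    return true;
}
bool bootAndConfigure()  { return true; }  // boot SIU/EPUs, load FSW, configure TEMs
bool runEventFlowCheck() { return true; }  // inject FES data, verify readout

int main() {
    // Exercise the nominal (primary) path first.
    bool ok = selectSide(Side::Primary) && bootAndConfigure() && runEventFlowCheck();
    // Then fail over to the redundant side and repeat the same checks.
    ok = ok && selectSide(Side::Secondary) && bootAndConfigure() && runEventFlowCheck();
    std::printf("failover exercise %s\n", ok ? "passed" : "failed");
}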

  5. Test-Stand (2)
     • Racks for FES "feeder" CPUs
     • 4x4 grid for electronics mounting
     • Cable feedthroughs (as flight)
     • Cable management (as flight)

  6. Software Test Bed Hardware Configuration
     Block diagram labels: Host, CPU Crate, Target(s), GASU, GEM, AEM, CRU, EBM, 1553, instrument comm's, 16 TEMs + FESs, discretes, host comm's, high-speed science, power dist'n, instrument switches, "SC" power, instrument power, SC switches
     • Host
       • Spacecraft Instrument Interface Simulator (ethernet)
     • CPU Crate
       • cPCI flight-equivalent chassis or commercial cPCI chassis
       • Instrument communications: LAT Communication Board (LCB) (cPCI form factor)
       • Host communications: Storage and Interface Board (SIB) (also does power bootstrap)
       • CPU: RAD750 processor or Motorola MCP750 COTS processor
     • GASU
       • Full GASU implementation (including primary and redundant sides)
     • Power Distribution
       • Full PDU implementation (including both primary and redundant sides)
     • Target(s)
       • 16 Tower Electronics Modules (backed by Front End Simulators)

  7. Front End Simulator: FES Hardware Configuration
     • The FES hardware consists of 10 off-the-shelf PCs with Moselle PCI bridges and 4 x 120 Gbytes of local disk space.
     • Eight of the PCs will service 2 TEMs each.
     • The remaining 2 PCs will service the AEM and be home to the FES control system.
     • These PCs are connected to Front End Simulator boards through PCI:
       • These boards accept data at rates dictated by bandwidth considerations between the PC and the boards themselves.
       • On the output side, the boards manage the time-sequenced distribution of data and its presentation to the T&DF subsystem trigger and data cables.
       • There are 3 flavors of these boards: a TKR board, a CAL board, and an ACD board. All are physically identical, differing only in the code loaded.
       • Each board has two large banks of memory, one intended to hold the data and the other the timing transition vector information.
       • These memories also hold the storage used to implement the register model: a LAT command that writes a front-end register will actually write a location in these memories (see the sketch after this slide).
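
The register model described in the last bullets can be pictured as a key/value store kept in the board's memory banks: a front-end register write is redirected there, and readbacks are served from the same storage. The C++ sketch below is a minimal illustration under that assumption; the class name, register address, and value are hypothetical.

// Minimal sketch (hypothetical addresses and layout) of the register-model idea:
// the FES board has no real front-end ASICs, so a LAT command that "writes a
// front-end register" is redirected to the board's memory bank, from which
// later reads are also served.
#include <cstdint>
#include <cstdio>
#include <unordered_map>

class FesRegisterModel {
public:
    // A LAT command that writes a front-end register lands in the memory bank.
    void writeRegister(std::uint32_t feAddress, std::uint32_t value) {
        bank_[feAddress] = value;
    }
    // Readback commands are served from the same memory, so FSW sees a
    // consistent register image even though no front-end hardware is attached.
    std::uint32_t readRegister(std::uint32_t feAddress) const {
        auto it = bank_.find(feAddress);
        return it != bank_.end() ? it->second : 0;
    }
private:
    std::unordered_map<std::uint32_t, std::uint32_t> bank_;  // stand-in for board memory
};

int main() {
    FesRegisterModel tkrBoard;             // same code serves the TKR/CAL/ACD flavors
    tkrBoard.writeRegister(0x0040, 0x1F);  // hypothetical register address and value
    std::printf("readback: 0x%02X\n", (unsigned)tkrBoard.readRegister(0x0040));
}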

  8. Front End Simulator: Functional Pieces
     • The FES consists of the following major functional blocks, presented roughly in data-stream order (a sketch of the chain follows this slide):
       • Data Source Generator
         • Simple pattern generator or physics Monte Carlo (e.g., GlastSim)
       • Data Munger (offline)
         • Munges the output of the Data Source Generator into a format suitable for FES PC consumption
       • Data Transfer Agent/Storage
         • Moves data from the storage media where it was generated to the FES storage system, i.e., the PCs' local disk system
       • Data Munger (FES front-end PCs)
         • Moves and possibly reformats the data from the FES storage system to the FES tower/subsystem simulator boards
       • FES Tower/Subsystem Simulator Boards
         • Receive the data from the FES front-end PCs. These boards simulate both the sensor side and the cable side of the LAT front-end electronics.
       • Hardware System Control
         • Provides for the coordination and synchronization of the multiple PCs that compose the FES
       • User System Control
         • The interface used to control the FES
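
The functional blocks above form a simple chain from data generation to the simulator boards. The C++ sketch below strings hypothetical stand-ins for each stage together in data-stream order; none of the function names correspond to actual FES software.

// Sketch (all names hypothetical) of the FES functional chain:
// source generation -> offline munging -> transfer to FES storage ->
// front-end-PC munging -> load into the simulator boards.
#include <cstddef>
#include <cstdint>
#include <cstdio>
#include <vector>

using RawEvents  = std::vector<std::vector<std::uint8_t>>;  // Data Source Generator output
using FesRecords = std::vector<std::vector<std::uint8_t>>;  // format the FES PCs/boards consume

RawEvents  generateSource(std::size_t n)            { return RawEvents(n, {0xCA, 0xFE}); }
FesRecords mungeOffline(const RawEvents& raw)       { return raw; }  // reformat for FES PC consumption
void       transferToFesStorage(const FesRecords&)  { /* copy onto the FES PCs' local disks */ }
FesRecords mungeOnFrontEndPc(const FesRecords& r)   { return r; }    // final per-board format
void       loadSimulatorBoards(const FesRecords& r) { std::printf("loaded %zu records\n", r.size()); }

int main() {
    auto raw      = generateSource(100);         // Data Source Generator
    auto records  = mungeOffline(raw);           // Data Munger (offline)
    transferToFesStorage(records);               // Data Transfer Agent / Storage
    auto boardFmt = mungeOnFrontEndPc(records);  // Data Munger (FES front-end PCs)
    loadSimulatorBoards(boardFmt);               // FES tower/subsystem simulator boards
}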

  9. Rest of the Test Hardware
     • The DAQ group will provide 3 sets of hardware, built in parallel, to test the Trigger and Data Flow system:
       • Front End Simulator (FES) using a "Test Bed LAT" (described above)
       • Functioning Tower without sensors:
         • The test Tower contains operational front-end electronics
         • Allows FSW to verify that its TEMs can be integrated with a Tower and that data can be successfully written to the Tower's registers
       • Functioning Tower with a subset of working sensors:
         • The test Tower contains a subset of sensors as well as front-end electronics
         • Allows FSW to test the event trigger using real physics data (a sketch of such a check follows this slide)
         • Also tests timing and calibration
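
One way to exercise the event trigger with real physics data, as the last bullets describe, is to count triggers from cosmic rays over a fixed live-time window and check the rate against a rough expectation. The C++ sketch below shows the shape of such a check; readTriggerCounter() and the rate bounds are hypothetical placeholders, not the tower's real interface or expected rate.

// Sketch (hypothetical interface and numbers) of a trigger-rate sanity check:
// with real sensors present, cosmic rays produce genuine triggers, so counting
// them over a fixed window exercises the full trigger path.
#include <cstdio>

// Stand-in for reading the trigger counter through the test-bed interface.
unsigned readTriggerCounter() {
    static unsigned count = 0;
    return count += 12;   // placeholder: pretend 12 triggers arrive per read
}

bool triggerRateCheck(double windowSeconds, double minHz, double maxHz) {
    unsigned before = readTriggerCounter();
    // On the real test stand this would wait for windowSeconds of live time.
    unsigned after = readTriggerCounter();
    double rate = (after - before) / windowSeconds;
    std::printf("measured trigger rate: %.1f Hz\n", rate);
    return rate >= minHz && rate <= maxHz;
}

int main() {
    // The rate bounds here are placeholders, not the tower's expected cosmic-ray rate.
    std::printf("trigger check %s\n", triggerRateCheck(1.0, 1.0, 1000.0) ? "passed" : "failed");
}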

  10. Single Complete Tower
      • VME CPU with LAT Communication Board (LCB)
      • Tower Electronics Module with Tower Power Supply
      • Full set of 4 CAL AFEE boards (4 sides, 1 each)
      • Full set of 36 TKR MCMs (4 sides, 9 each)
