
25th ALICE RRB


Presentation Transcript


  1. 25th ALICE RRB
     J. Schukraft, CERN-RRB-2008-115
     • Collaboration News
     • ALICE Status
       • Hardware status
       • Commissioning activities
       • Planning

  2. Collaboration News
     • New institutes
       • Houston (USA): EMCAL, computing
     • Institutes leaving
       • BARC Mumbai (India) leaving ALICE
         • associate member, completed its technical contribution to the PMD electronics
     • Ongoing discussions
       • Comsats, Pinstec (Pakistan): physics, computing
     • Elections/nominations
       • Spokesperson: J. Schukraft re-elected 1.3.2009 until end 2010
       • Deputy Spokesperson: P. Kuijer (NIKHEF/Utrecht), 2 years (2009-2010)
       • Resource Coordinator: C. Decosse (CERN) (J. de Groot decided to step down end 2008)
         • Special thanks to J. de Groot, first (and only) RC since 1996
       • Run Coordinator: J. Rak (Jyväskylä), 1 year (2009)

  3. Funding
     • UK: continued support from STFC for ALICE
       • ALICE & STAR groups merged, will apply at the next funding round for a Nuclear Physics grant
     • CERN: 'white paper money'
       • CERN share of 2.5 MCHF has been agreed to complete the ALICE detector
       • restore full DAQ capacity, complete TRD/EMCAL services & infrastructure
       • repay loan to CF for the ZDC vacuum chamber
     • Additional contributions received (or requested) from other funding agencies

  4. ALICE Status: Detector Installation (mid 2008)
     • Complete: ITS, TPC, TOF, HMPID, FMD, T0, V0, ZDC, muon arm, ACORDE
     • Partial installation: 1/5 PHOS, 4/18 TRD, 9/48 PMD, 0/6 EMCAL, ~40% DAQ/HLT
     • Installation targets met by mid 2008

  5. 24 April: Insertion of final TOF super module

  6. Installation of final muon chamber

  7. Installation of the 1st PHOS module

  8. Formal end of ALICE installation July 2008

  9. Data Taking & Commissioning
     • Commissioning runs (24/7)
       • Cosmics I (2 weeks, Dec 2007)
         • local (individual detectors) and start of global (several detectors) commissioning
       • Cosmics II (3 weeks, Feb/Mar 2008)
         • local/global commissioning, first few days of alignment 'test' run, magnet commissioning
         • TPC was turned off after ~1 week (drift HV instability, now understood & corrected)
       • Cosmics III (since 5 May 2008, continuous operation 24/7)
         • global commissioning, calibration & alignment production runs
     • Injection tests
       • TI2 dump in June, injection tests in August, first circulating beam in September
       • observed very high particle fluxes during dumps and even during injection through ALICE
         • 10's to 1000's of particles/cm^2 with beam screens in LHC and/or TI2
       • decided to switch off all sensitive detectors during injection
         • SPD, V0 always on (trigger)
         • SSD, SDD, FMD, T0, HMPID, muon trigger occasionally
         • (beam was useful only for a subset of detectors!)
     • > 6 months of 24/7 running: ~6700 shifts (~30 MY) + 2800 'on call' (rough check below)
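  A rough consistency check on the quoted manpower, assuming standard 8-hour shifts and roughly 1800 working hours per person-year (neither figure is stated on the slide):

    6700\ \text{shifts} \times 8\,\mathrm{h/shift} \approx 5.4\times10^{4}\,\mathrm{h},
    \qquad \frac{5.4\times10^{4}\,\mathrm{h}}{1800\,\mathrm{h/MY}} \approx 30\,\mathrm{MY}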

  10. Injection tests
      [Figures: SPD hits versus bunch intensity (beam through ALICE); FMD event display (1 bunch through ALICE, > 100 000 hits); FMD hits versus SPD hits (beam through ALICE); SPD/SSD, Sunday 15.6, dump on TED]

  11. First beam, 10th September
      [Figures: SPD hits on 10.9.2008, shortly after 9 am; V0 hits on 10.9.2008, shortly after 9 am]

  12. Luminosity monitor (V0)
      [Figures: single turn; double turn, beam 1 back at Point 2!]

  13. Beam pick-up
      [Figures: trigger timing (before alignment) versus bunch number, single shot for SPD, V0, beam-pickup BPTX, T0 triggers; auto-correlation for the SPD trigger, with multi-turn correlations (3564 bunch crossings)]

  14. First interactions, 12th September
      • Circulating beam 2: stray particle causing an interaction in the ITS
      • 7 reconstructed tracks, common vertex
      [Figure: ITS tracks on 12.9.2008]

  15. TPC Commissioning
      • TPC up and running since end 2007
        • chamber + FC (field cage), gas system, cooling, FEE, laser, DCS, DAQ, HLT, …
      • noise reduction: modify PS (power supplies); some occasional excess noise remains in outer parts
      • replaced several faulty electronics components (FEE, RCU, …)
      • some IROC decoupling capacitors show leakage after months of operation
        • several already replaced, currently 2 'weak' left + 2 IROCs not operational
      • difficult or no access to parts of the TPC => modify cabling during shutdown
      [Figure: noise — TDR: σ ~ 1, measured: 0.7-0.8]

  16. TPC Calibration
      • Gain, drift velocity, ExB, alignment, …
      • Laser, cosmics, krypton, drift velocity monitor, gas composition, T/P sensors, …
      [Figures: ExB scan (laser tracks); drift velocity from laser (~ 2×10^-4); krypton gain calibration: chamber gain variation (σ ~ 8%)]

  17. TPC Performance
      • First preliminary results from cosmics
      • dE/dx resolution (PPR goal: ~ 5.5%): < 6%
      • pt resolution (PPR goal: ~ 5% @ 10 GeV): ~ 10% @ 10 GeV w/o calibration
      [Figures: momentum resolution (uncalibrated); particle identification]

  18. Silicon: ITS & FMD status
      • all silicon detectors fully commissioned (detector, DAQ, DCS, cooling, …)
      • pixel trigger worked 'out of the box' and was heavily used
      • individual bad/noisy channels ~ within specs (< 0.15% SPD, FMD; 1.5% SSD; 3% SDD)
      • lost some modules/half-ladders (tracking is sufficiently redundant, so physics impact is minor)
        (ITS detectors could only be fully tested after cabling on the miniframe, i.e. w/o access)
        • SPD: 12.5% of half-staves (2 connection, 13 insufficient cooling flow)
          • plan to improve flow regulation for individual circuits next WS (winter shutdown)
        • SDD: 5% of modules (a few can be recovered during shutdown)
        • SSD: 3.8% of modules, 9% of half-ladders
          • mostly Sintef ladders which developed excessive bias current (8/22)
      [Figures: signal-to-noise — FMD: S/N ~ 30 (FMD1-i, FMD2-i, FMD2-o, FMD3-i, FMD3-o); SSD: S/N ~ 40]

  19. ALICE Inner Tracking System: Alignment with Cosmics
      • Alignment using tracks and the Millepede program in a hierarchical approach
      • ~50k cosmic muons for alignment collected since end of May (~0.1 Hz), using the pixel trigger
      • Silicon Pixel Detector (SPD): ~10M channels, 240 sensitive volumes (60 ladders)
      • Silicon Drift Detector (SDD): ~133k channels, 260 sensitive volumes (36 ladders)
      • Silicon Strip Detector (SSD): ~2.6M channels, 1698 sensitive volumes (72 ladders)
      • ITS total: 2.2k alignable sensitive volumes → 13k degrees of freedom (see the check below)
      [Figures: ITS event display; distribution of clusters in the 6 layers]
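  The quoted totals follow from the per-detector counts above, assuming the usual 6 rigid-body alignment parameters (3 translations + 3 rotations) per sensitive volume:

    240 + 260 + 1698 = 2198 \approx 2.2\,\mathrm{k}\ \text{volumes},
    \qquad 2198 \times 6 = 13\,188 \approx 13\,\mathrm{k}\ \text{degrees of freedom}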

  20. ALICE Inner Tracking System: Alignment with Cosmics
      • Preliminary results for SPD (pixels, ~80% preliminarily aligned):
        • residual misalignment < 10 μm (detector resolution = 12 μm in rφ)
      • SSD & SDD, as well as ITS/TPC alignment, has started (smaller coverage)
      [Figures: track-to-"extra clusters" distance in the transverse plane (sensor overlap) and track-to-track (top vs bottom) distance in the transverse plane, before and after alignment; σ = 55 μm (vs 40 μm in simulation without misalignment) and σ = 21 μm (vs 15 μm in simulation without misalignment)]

  21. Other Detectors
      • TOF: 2/3 fully commissioned (3 finishing gas tests, several missing DC-DC converters)
        • TOF trigger works very well (noise a factor of 2 better than in tests)
      • TRD: 4 SMs installed (2 final ones), noise and L1 trigger ok
      • HMPID: fully commissioned, noise at specs
      • PHOS: 1 module fully commissioned, trigger commissioning ongoing
        • occasional noise pickup (depends on the day..)
      • Muon tracking: ~70% commissioned, noise problem cured in all but station 3
      • Muon trigger, V0, T0, ACORDE, ZDC: all up and running
      [Figures (Oct 08 / Jul 08): TRD cosmic; HMPID cosmic event (no calib/slewing, offset correction etc.); TOF raw spectra (no calib/slewing, offset correction etc.)]

  22. Online
      • Continuous online operation since March (24/7)
      • DAQ: 40% of HW installed, stable operation up to 500 MB/s transfer, data taking > 3 kHz
        • complete HW for 2009
      • DCS/ECS: all essential functionality operational, continuous improvement in functionality
        • some problems encountered with scalability (mostly resolved)
      • CTP: some problems with spurious triggers resolved, running well
      • HLT: 500 CPUs, very successful operation (online reconstruction, data reduction)
        • complete HW in 2009
      [Figure: aggregated cumulative time in the global partition]

  23. DAQ: Global Partitions
      • Now also runs with a large number of detectors; runs stable over hours with realistic trigger sequences!
      • (Useful cosmic data taking done for subsets of detectors working with different triggers)
      [Figure: number of detectors in the global partition]

  24. Offline & Physics 'Commissioning'
      • Offline
        • exercise data transfer to T0/T1, prompt reconstruction, prompt calibration, data Q/A
      • Concentrating on early physics at 900 GeV and 10 TeV
        • global event features, pt spectra, particle ratios, baryon transport, …
        • detailed trigger & data-taking scenario elaborated
        • fast calibration/alignment/reconstruction prepared
        • daily meetings of the 'first physics task force' to exercise analysis until mid-Sept.

  25. Computing
      • Computing resources (updated after CRSG review)
        • ALICE deficit in 2009: ~25% CPU, 60% disk (error found by CRSG), 50% tape
        • 'optimistic running scenario' of 10^7 s, but ALICE pp data depend much less on it!
          • need 10^9 pp MB events/year, i.e. 10^7 s @ 100 Hz or 10^6 s @ 1000 Hz (see the check below)
      • Addressing the shortfall
        • adding resources: partially successful and ongoing
          • in discussion with a number of specific partners
        • redistribution of existing resources between experiments (integral LHC resources much more adequate than their differential distribution)
          • CRSG: 'A very limited degree of redistribution of resources may be advisable in 2009'
        • 'living with less': scenarios & physics impact under discussion with the LHCC
          • a modest shortfall may be accommodated with a modest loss of physics …
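  The two quoted running scenarios are simply different factorisations of the same minimum-bias sample size; as a quick arithmetic check:

    N_{\mathrm{MB}} = R \times t : \qquad
    100\,\mathrm{Hz} \times 10^{7}\,\mathrm{s} \;=\; 1000\,\mathrm{Hz} \times 10^{6}\,\mathrm{s}
    \;=\; 10^{9}\ \text{pp MB events}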

  26. Planning
      • Until 12 October
        • continue cosmic running & consolidation of online software
      • Medium term (until LHC restart in 2009)
        • 'shutdown mode', start with activities planned for the winter shutdown
        • re-arrange cabling on the mini-frame (ITS, TPC) to allow better access to electronics
          • major activity, requiring > 4 months of work both underground & on the surface
        • detector consolidation: TPC capacitors; recover some ITS modules, improve SPD cooling, …
        • detector installation (TRD, PHOS, PMD, EMCAL), …
          • 3 PHOS, 6-8 TRD modules, 2 EMCAL modules, complete PMD
      • Long term: complete the detector
        • PMD (2008), DAQ/HLT (2009), TRD (2009), PHOS (2010/11), EMCAL (2010/11)

  27. Summary
      • Installation
        • met all installation goals by mid 2008
      • Commissioning and initial calibration/alignment
        • went rather well, sometimes even better than expected (e.g. TPC, SPD alignment + trigger)
        • some (mostly minor) hiccups / problems (noise, lost channels) / bug fixing
        • to be improved:
          • access to the TPC for repairs/FEE exchanges
          • cooling flow of the SPD
      • Detector performance
        • better than, within, or very close to specs (at least as far as could be verified with cosmics)

      After > 15 years of designing & constructing ALICE we do have a working experiment ready for physics. Thanks to all the Funding Agencies who have made this possible with their strong support over 1½ decades!
