
First attempt at using WIRED



  1. First attempt at using WIRED Eric van Herwijnen

  2. Overview • Gaudi MC event server • Tracking group’s reconstructed events • Test version released • Future plans

  3. Sicb dataset → EventDumpAlgorithm (Gaudi) → XML event files: /run1/event1.xml, /event2.xml, /......., /run2/event1.xml, /event2.xml, /....., /....../...... (Eventx.xml) → served to WIRED via http://lhcb.cern.ch:8080/servlets/eventselector&runx#eventy
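The flow on this slide, a run/event request resolved to an XML file in the dataset tree, can be sketched as below. This is a minimal illustration only; the class and method names are hypothetical placeholders, not the actual Gaudi or WIRED code.

```java
// Hypothetical sketch: map a run/event request (as in the servlet URL
// .../eventselector&runx#eventy) onto the dataset's XML file layout.
// EventSelector and pathFor are illustrative names, not real Gaudi code.
public class EventSelector {

    // Build the dataset-relative path for a given run and event number,
    // matching the /runN/eventM.xml layout shown on the slide.
    static String pathFor(int run, int event) {
        return "/run" + run + "/event" + event + ".xml";
    }

    public static void main(String[] args) {
        // A request for run 2, event 1 resolves to:
        System.out.println(pathFor(2, 1)); // prints /run2/event1.xml
    }
}
```

A servlet wrapping such a lookup could then stream the XML file back to the WIRED client.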

  4. Open an MC event from Web server

  5. Production plans

  6. Status of simulated event production • 180 k minimum-bias events • 270 k bb inclusive • 7 × 10⁶ events simulated in total so far (≈ 1 TB of data)
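As a quick sanity check on the figures above (the per-event size is not stated on the slide, only derived here), 7 × 10⁶ events occupying about 1 TB works out to roughly 140 kB per event:

```java
// Back-of-envelope check of the slide's production figures.
public class EventSize {
    public static void main(String[] args) {
        double totalBytes = 1e12; // about 1 TB of simulated data
        double nEvents = 7e6;     // events simulated so far
        // Average size per event, in kB.
        double kBPerEvent = totalBytes / nEvents / 1e3;
        System.out.printf("~%.0f kB per event%n", kBPerEvent); // prints ~143 kB per event
    }
}
```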

  7. Current capacity • Production centers for simulation: • No dedicated LHCb facility • CERN CSF (being phased out for us) • RAL CSF • IN2P3 • Heidelberg • For analysis: • CERN RSBATCH/RSPLUS (being phased out for us)

  8. New production centers • CERN (planned by end 1999): • CSF → PCSF (first production of 300 k events done for the new magnet study; ≈ 150 k events/week) • RSBATCH → Linux cluster (analysis batch) • RSPLUS → LXTEST (interactive logon service) • LHCb LSF cluster (corridor PCs, 50 LHCb machines available, ≈ 75 k events/week)

  9. New production centers • Outside CERN (figures for v116/117): • Liverpool (≈ 50 × PCSF, ≈ 7500 k events/week) • RAL PC farm (half the size of PCSF, i.e. ≈ 75 k events/week) • RAL CSF (≈ 10 k events/week) • Lyon IN2P3 (≈ 20 k events/week) • RIO Linux farm (?)

  10. SICB performance

  11. Test release • http://lhcb.cern.ch/computing/analysis/default.htm • Download the file install.class • Install JDK 1.1.8 (from IBM) for best performance • Type java install

  12. Future work • Make an XML conversion service instead of an algorithm • Re-arrange hits & reconstructed tracks • Improve interactivity • Get feedback
