
Opening Remarks




Presentation Transcript


  1. Opening Remarks John Womersley Fermilab DØ Collaboration Meeting, February 2002

  2. St. Valentine’s Day

  3. St. Valentine’s Day in Chicago Thursday, February 14, 1929 • Members of “Bugs” Moran’s gang assassinated by four men posing as police officers, on the orders of Al Capone

  4. Progress • There has been a great deal of progress in the experiment which is reflected in a full agenda this week and in the plenary talks today and on Friday • Physics and Object-ID • Certified physics objects; Moriond analyses • Detector hardware • Fiber tracker, Level 2, DAQ • Computing and software • Run 2B • Technical Review in December; L1 Trigger MRI proposal submitted in January (Speakers: Klima, Fox, Ginther, Moore, Brooijmans, Hadley, Kotcher)

  5. Milestone status (done / in progress / not yet done) • October 15 • W/Z → e signal (analysis done, no signal found; trigger w/o pT cut) • reporting and oversight established • establish a plan for analysis infrastructure (e.g. event display, luminosity tools) and for farm operations during shutdown (people identified, effort starting) • November 1 • W/Z → μ signal (both WAMUS and FAMUS now) • luminosity available by run and trigger • November 15 • J/ψ → μμ signal for calibration • Operations: • 20 Hz to tape • Examines, online monitoring of data quality • production offline capable of keeping up with data taking & plans to cope with increases (new switch, new nodes) • use new tape handling system • Last date for major changes to Dec 1 software release

  6. Milestone status (done / in progress / not yet done) • December 1 • start stable global running; freeze global trigger list, L1 and L3 (list 3.x used for all Moriond data) • Establish EM scale (lots of work in progress, quite a struggle . . . “So close I can smell it” – G.G.) • b tagging with muons • P10.x software release, the final reco code for Moriond (P10.11 used to process all data; desirable to reprocess subsamples with later releases) • January 10 • certified e, μ physics objects (p10.x) • certified tracks (p10.x; tracking first) • certified jet physics objects and certified JES • February 1 • b tagging with sec vtx or IP • complete data taking for Moriond • February 8 • complete data processing for Moriond (reprocessing too) • March 1 • Plots/results ready for approval (Speakers: Söldner-Rembold, Crépé-R., Iashvili, Gutierrez, Kupčo)

  7. Detector Performance Ginther

  8. Gutierrez

  9. Run 2 luminosity to date: ~22 pb⁻¹ (other curves on the plot: ~8 pb⁻¹ and ~6 pb⁻¹) • Slopes are more important than absolute numbers at this stage

  10. Beams Division Plan for 2002: peak luminosity (10³¹) and integrated luminosity (pb⁻¹) projected from 1/1/02 through 1/1/03. Church

  11. How are we doing so far? • On track for the last 2 weeks: ~2 pb⁻¹ delivered per week, following the scheduled evolution

  12. DØ Operations • Histograms of DØ performance (live fraction, now ~70%, and rate to tape, ~25 Hz since the new VBDi; plots cover Jul 01 through Feb 02, including a period of poor performance) are now maintained on the Internal Documents page (though the reports they are based on are unfortunately broken this week) • We intend to track these numbers in order to improve! • Also, audible alarms now stop data taking Savage

  13. DAQ • A decision has been made to implement the “commodity solution” (Ethernet-based) DAQ system • The project is managed by Gustaaf Brooijmans and Gordon Watts • Procurement of (almost) all items is underway • We are already using SBCs to read out the fiber tracker • Plan • Convert software first, so that existing VRCs emulate SBCs • Then switch over to the new system one crate at a time • Paced by deliveries • Start in April, finish by June? Brooijmans

  14. DAQ • Brown/ZRL solution will be maintained and developed in the meantime • Hardware VBDis installed and working • We can now (2/9/02) run stably while taking ~80 Hz into L3 • Need new trigger list with better rejection at L3! • Trigger meisters, L3 filters group, trigger board • 2/10/02 record-breaking day: • 1.1 million events recorded • 0.7 million events processed through the reco farm Gallas

  15. 91.8 Hz

  16. Data Monitoring and Quality • Pushpa Bhat will coordinate detector and global Examine programs, online data quality monitoring and certification, and the online event display. She’s working with detector experts to improve and update the detector Examines, and with Michiel Sanders, who is creating a global physics Examine. • Stefan Söldner-Rembold will coordinate tracking and reporting mechanisms for data quality and the definitions of criteria. This is what offline analysis will use to determine "good runs". We will have a number of criteria based on detector, ID and physics group requirements. The quality info will be implemented in the Runs database by the DB experts. • These are both very important tasks to ensure quality physics data. Please give these people your help and support.

  17. Shifts • 262 people have taken shifts in the period Jul 1, 2001 – Jan 31, 2002 (25-30 have done SAM or farm shifts) • 84 people who did not take shifts so far indicated on their effort reports that they intended to do so • We still need one DAQ shifter for the period ending June • a detector shift person willing to be reassigned? • The shift situation will be revisited at the end of May • Thanks to everyone who stepped up to the plate after our discussions at the last meeting (and of course to those who were already contributing before that) • Thanks also to all those who put a lot of time into scheduling and training shift physicists • Institutions planning to send summer students to Fermilab: • Alan Stone is looking for DAQ shifters for the summer

  18. Effort Reporting • DØ effort report completed and posted on the IB web page • thanks to Sharon Hagopian, Peter Ratoff, and Henry Lubatti • We have passed the results on to all the subgroup leaders • In general, it is not a very pretty picture. Some examples: • Most projects peak on the left • many people working on a given project at the 10–30% level • very few working on a project at the 80–100% level (Example histograms: vertex algorithms, EM ID, calibration, SMT detector)

  19. Future trends • We asked people to tell us their future plans too: • My conclusion: we are inefficient in the way we apply our effort • Way too many small fractions • Not clear how to improve … (Plot annotations: some things are going to get worse; SMT algorithms now planned; not a lot of “undecided” manpower)

  20. DØRACE workshop • Very successful • Lots of interest: 65 registered attendees, 45-50 in “hands-on” session • Two themes: 1. Down to earth: set up present DØ software at remote institutions • Now 14 institutions running SAM stations • was 5 before this week

  21. DØRACE continued 2. Reach for the stars: discuss/plan for the future • Grid efforts inside and outside DØ • Offsite reconstruction • Remote analysis, regional centers(?) • Videoconferencing • Sociology • U.S. interest is low so far — why? • Regular on-week DØRACE meetings • On the web: DØ at work → DØRACE • Organize another “hands-on” workshop of this type? Thanks to Jae, Lee, Igor, Iain, Kors, et al.!

  22. Videoconferencing Task Force • We have received the report of the videoconferencing task force • Thanks to Ursula Bassler, Sarah Eno, Ron Lipton, Tom Marshall, and Gordon Watts • Major Recommendations • Post electronic copies of talks at least an hour before the meeting • Sound systems in the video rooms should be improved • Detailed prescriptions were given for each room • We should simultaneously connect to ISDN (our current system) and IP videoconferencing systems (like VRVS) • Try this in the 9th circle as soon as possible, as a trial • Implement a talk archiving and posting system such as the one used by ATLAS at CERN • This helps make the first recommendation easier to implement, and therefore more likely • A FNAL technician should be assigned to be responsible for videoconferencing equipment

  23. Videoconferencing recommendations • Other Recommendations that don’t cost anything • “Suggested rules for meetings” should be implemented • Meeting conduct, use of microphones, font size (22–24) • Clearly designated person in charge of video during each meeting, camera control etc. (convenor or delegate) • Standing DØ video task force of 5 people to field video questions and give advice to the lab • Should include strong representation from offsite people and should include groups with limited financial resources • Create a file with default line/font settings for Root plots • Chain a set of instructions for operating the video to each system • The task force also made recommendations on disk space, and on buying wireless microphones, a laptop projector and document camera, scanners, and new equipment for the Far Side

  24. Computer Security Issues • As of Jan 1st 2002 we are expected to be in compliance with Fermilab’s Strong Authentication Plan • What does this mean to the user? • You must have a Kerberos principal for access to Fermilab computing systems • Machines on the Fermilab network may only offer Kerberized versions of network services • i.e. telnet, ftp, rsh, etc. • Services such as PCanywhere and VNC should only be accessible from fully Kerberized nodes • Any NECESSARY services which cannot be Kerberized require exemptions

  25. Computer Security Issues: exemptions • You will need to provide details of the services offered as well as administrative info and plans for bringing the system into compliance to get an exemption • See the Computer Security link off the D0 Computing web page for more details • Note that exemptions are presumed to be temporary until the system can be brought into compliance • Systems covered by the Online Critical Systems Plan have a blanket exemption • Such systems must no longer be located on the offline subnets (addresses 224–227)

  26. Computer Security Issues: laptops • Don’t forget that the Strong Authentication requirements apply to your laptop once it is connected to the FNAL network • The easiest way to become compliant is to simply turn off all network services. Few people really need to have telnet, ftp, or rsh servers running on their laptops • Note it is only the SERVERS that are in question. There is no restriction on running the client end of the connections, i.e. you may telnet, ftp, etc. FROM your laptop without restriction • Automated scans to detect non-compliant systems are being run. You may receive notice that you need to bring your system up to standard
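The “turn off servers” advice above can be sanity-checked from the laptop itself. A minimal illustrative sketch (the port list and localhost-only probing are my assumptions, not an official FNAL tool): it tries to connect to the well-known ports for the ftp, telnet, and rsh services and flags any that accept connections.

```python
import socket

# Well-known ports for the services named above (assumed list, extend as needed)
SERVICES = {21: "ftp", 23: "telnet", 514: "rsh"}

def listening(port, host="127.0.0.1", timeout=0.5):
    """Return True if something accepts TCP connections on the given port."""
    with socket.socket(socket.AF_INET, socket.SOCK_STREAM) as s:
        s.settimeout(timeout)
        # connect_ex returns 0 on a successful connect, an errno otherwise
        return s.connect_ex((host, port)) == 0

for port, name in sorted(SERVICES.items()):
    state = "OPEN - consider disabling it" if listening(port) else "closed"
    print(f"{name:6} (port {port:3}): {state}")
```

A closed port reports “closed”; any open one is a server you probably want switched off before connecting to the FNAL network.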

  27. Computer Security Issues • There are several specific security vulnerabilities that you should be aware of. See Nick Hadley’s Computing and Software plenary talk on Friday for details. • Many threats require that we be able to locate specific nodes quickly. This is particularly difficult for machines that have legs. • Please register your laptop! • You will need the MAC address of your laptop to do this. • See the Security link off the D0 Computing web page for more details. • Please make an effort to ensure that your systems and usage are in compliance with the Fermi Computing Policy. • If you have an issue that doesn’t seem to be covered in the policy, or if it is but the policy just sounds like incoherent gibberish to you, then please don’t hesitate to ASK QUESTIONS! • Contact Michael Diesburg (diesburg@fnal.gov) or Greg Cisko (cisko@fnal.gov). If we can’t figure it out either, we will find someone who can.
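For the laptop-registration step, Python’s standard library can report the machine’s MAC address. A small sketch (format_mac is a hypothetical helper of mine; note that uuid.getnode() falls back to a random 48-bit number if it cannot read a hardware address, so verify the result against ifconfig/ipconfig before registering):

```python
import uuid

def format_mac(mac_int):
    """Render a 48-bit integer MAC address as colon-separated hex bytes."""
    return ":".join(f"{(mac_int >> shift) & 0xff:02x}"
                    for shift in range(40, -8, -8))

# uuid.getnode() returns the hardware address as a 48-bit integer
print("MAC address:", format_mac(uuid.getnode()))
```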

  28. Run 1 analysis • Boaz will cover in his talk • because … • Jianming Qian has stepped down as Run 1 coordinator • Has been expert at “herding of cats” since 1999 • About 40 papers published on his watch • THANK YOU! • Note: All Run 1 analysis issues (requests for EBs, etc.) now go to Boaz

  29. Miscellaneous • Chip Brock • Has agreed to co-lead the Data Access and Databases group together with Lee Lueking. • Thanks to Vicky White and good luck to Chip. • New DØ web pages brought online • Thanks to Alan Jonckheere and also Sherry Towers, Chip Brock and Boaz Klima. • New link to “daily meetings” on DØ at work page • Thanks to Florida State. • Weekly all-experimenters meeting talks are now posted on DØ at work

  30. Traffic control • This gravel lot is our overflow parking (map labels: P, DAB, Outback)

  31. Welcome to the Collaboration Meeting • Reminder • Social get-together this evening at 6pm at the Users’ Center

  32. Welcome to the DØRACE Workshop John Womersley Fermilab DØ Collaboration Meeting, February 2002

  33. Why remote analysis? • Demographics • Technological evolution

  34. Demographics • The DØ collaboration is more distributed, more international than ever before • We have 77 active institutions in 17 countries • 585 collaborators of whom only 216 are resident at Fermilab • I know that at least some U.S. institutions plan to reduce the fraction of time their graduate students spend at Fermilab during the later stages of the run

  35. Technological evolution • Ian Foster in Physics Today, February 2002: “A useful metric for the rate of technological change is the average period during which speed or capacity doubles or, more or less equivalently, halves in price. For storage, networks, and computing power, these periods are around 12, 9, and 18 months, respectively. The different time constants associated with these three exponentials have significant implications. The annual doubling of data storage capacity, as measured in bits per unit area, has already reduced the cost of a terabyte (10¹² bytes) disk farm to less than $10 000. Anticipating that the trend will continue, the designers of major physics experiments are planning petabyte data archives.” (that’s us, by the way)

  36. Ian Foster, continued … “Such large data volumes demand more from our analysis capabilities. Dramatic improvements in microprocessor performance mean that the lowly desktop or laptop is now a powerful computational engine. Nevertheless, computer power is falling behind storage. By doubling "only" every 18 months or so, computer power takes five years to increase by a single order of magnitude. Assembling the computational resources needed for large-scale analysis at a single location is becoming infeasible. The solution to these problems lies in dramatic changes taking place in networking. Spurred by […] innovations […] and by the demands of the Internet economy, the performance of wide area networks doubles every nine months or so; every five years it increases by two orders of magnitude.” • The message for us is clear
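Foster’s doubling periods translate directly into five-year growth factors; a quick check of the arithmetic behind his “one order of magnitude” and “two orders of magnitude” claims (periods and horizon taken from the quote):

```python
def growth_factor(doubling_months, horizon_months=60):
    """Capacity multiple reached over the horizon, given a doubling period."""
    return 2 ** (horizon_months / doubling_months)

# Doubling periods from Foster: storage 12 mo, networks 9 mo, computing 18 mo
for name, months in [("storage", 12), ("networks", 9), ("computing", 18)]:
    print(f"{name:9}: doubles every {months:2d} months "
          f"-> x{growth_factor(months):.0f} in five years")
```

Computing grows ~10x in five years (2^(60/18) ≈ 10, one order of magnitude) while networks grow ~100x (2^(60/9) ≈ 102, two orders of magnitude), which is exactly the gap the quote is pointing at.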

  37. Remote analysis • We have a responsibility to the collaboration to fully support and facilitate remote physics analysis • We have a self-interest in doing so: it will maximize the physics output from the experiment • In the longer term, a growing reliance on remote resources is almost certain to be critical if we are to provide adequate computing for DØ (not just for analysis) • What I would like to see come out of this workshop • Establish a direction for remote physics analysis in 2002 • Input to the upcoming review on DØ computing needs for “Greater Run 2” (2003-7)
