A summary of the plans and goals of the Diphoton+X group, including details on meetings, common tasks, and the diphoton analysis.
Diphoton+X 2015: Overview of Plans and Goals
Bruce Schumm, SCIPP
On behalf of the Photon+X Working Group: La Plata, LPNHE Paris, Milano, Tokyo Tech, UCSC
03 June 2015
8 TeV Results
• Draft includes
  • Diphoton
  • Photon+b
  • Photon+jets
  • Photon+lepton
• The exercise of mushing all four results together into a single paper was taxing but worthwhile
• Can we work more closely together to make the combined paper easier and more natural to write?
Photon + X Group Organization
• New Photon+X Twiki (thanks to Hernan)
  • https://twiki.cern.ch/twiki/bin/viewauth/AtlasProtected/SUSYPhotonX
• Meetings (plan on 90 min) Mondays 16:30 CERN time
• Chaired on a rotating basis by the Photon+X contact people:
  • Leonardo Carminati (Milano)
  • Bruce Schumm (UCSC)
  • Hernan Wahlberg (La Plata)
Photon + X Group Commonality
TASKS POTENTIALLY COMMON TO ALL GROUPS
• SM MC samples
• xAOD framework issues
• Code snippets reflecting agreed-upon object definitions
• Isolation variable code
• Pseudophoton object definition code
• e→γ fake rate study
• dPhi studies (one-sided or two-sided?)
• Common systematics tasks?
• Common journal submission
Diphoton Analysis
• 8 TeV gluino-bino model limits
• 8 TeV wino-bino model limits
Tasks Overview
KEY: DONE / UNDERWAY / UPCOMING
• Backgrounds
  • QCD
  • Electroweak
  • Irreducible
• Models
  • SM samples
  • Strong & EW signal
  • Full vs. fast sim (more on this in a moment)
• Code/Infrastructure
  • xAOD
  • Derivations
  • Higher-level infrastructure (Ryan’s package)
  • Event variables (MET with photons, etc.)
• Event selection
  • Preliminary studies
  • Optimization
Focus Points for Diphoton Analysis
• As before, optimize separately for low/high bino mass for both strong (gluino) and EW (wino) production
• Assume the same a·ε (about 12%) and expected-events limits (3 events for strong and 5 events for weak production)
• Assume 1.5 fb-1 luminosity at 13 TeV
• Mass of each focus point is that giving a cross section of (arithmetic sketched below):
  • 3 events = (1.5 fb-1) × σ(gluino) × (0.12) → σ(gluino) = 17 fb
  • 5 events = (1.5 fb-1) × σ(wino) × (0.12) → σ(wino) = 28 fb
• Mgluino = 1500 GeV, Mwino = 600 GeV
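For concreteness, a minimal sketch of the focus-point arithmetic on this slide; the luminosity and a·ε values are the ones quoted above, and everything else is illustrative:

```python
# Minimal sketch of the focus-point cross-section arithmetic above.
# Assumes the quoted acceptance-times-efficiency (0.12) and 1.5 fb^-1 at 13 TeV;
# the focus-point masses are then read off the theory cross-section curves (not shown here).

LUMI_FB = 1.5        # assumed integrated luminosity at 13 TeV [fb^-1]
ACC_EFF = 0.12       # assumed acceptance x efficiency, taken from the 8 TeV analysis

def focus_point_xsec(n_events_limit):
    """Cross section [fb] at which the expected yield equals the events limit."""
    return n_events_limit / (LUMI_FB * ACC_EFF)

print("strong (3-event limit): %.0f fb" % focus_point_xsec(3))  # ~17 fb
print("weak   (5-event limit): %.0f fb" % focus_point_xsec(5))  # ~28 fb
```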
Diphoton Analysis Issues
• Not planning to use SUSY derivations (using EXOT10)
• Next big step to get underway is optimization; for this we await completion of the MC samples
• Only need four signal focus points (40,000 events)
  • Can we get highest priority for these points?
• Also need γγ, γ-jet, Wγ, Zγ, tt̄, Wγγ and Zγγ samples in order to perform optimization
Thoughts About Diphoton Analysis Timeline
• Optimization should take ~2 weeks once MC samples are available (July 1?)
• Background estimate techniques in place on roughly the same time scale
  • Except for the QCD background, approaches are largely identical to those of the 8 TeV analysis, but almost all rely on controls making use of 13 TeV data
• Will develop control samples with the limited initial data, but will need to run through the full data set and assemble estimates when the full sample becomes available
Models
• SM samples defined; generation underway. Much overlap with other groups; much hard work by the Milano group!
• Gluino, wino grids defined
  • All BF and decay-length issues resolved
  • Fast sim sufficiently validated
  • Of order ~2M events at 10k/point, more or less approved
  • Final validation step (generator-level filter to ensure two binos in each event) underway; a generic sketch of such a filter is given below
  • Need this soon! (optimization)
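As an illustration of the kind of generator-level requirement mentioned above, here is a hedged, generic sketch; the truth-record interface (`event.truth_particles`, `p.pdgId`) is purely schematic and is not the ATLAS generator-filter API:

```python
# Hedged, generic sketch of a generator-level "two binos per event" filter.
# A real implementation would live in the ATLAS generator-filter framework, use its own
# truth-record accessors, and count only the final copy of each bino in the generator record.

BINO_PDGID = 1000022  # lightest neutralino, taken to be the bino in these grids

def has_two_binos(event):
    """Return True if the truth record contains at least two binos."""
    n_binos = sum(1 for p in event.truth_particles if abs(p.pdgId) == BINO_PDGID)
    return n_binos >= 2

# Events failing the filter would be rejected before detector simulation, so that
# all ~10k events per grid point contain the targeted diphoton + MET topology.
```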
Backgrounds - QCD
• Prior approach was to assume real diphotons are 75 ± 25% of the low-MET background
  • Diphoton MC used to estimate the high-MET contribution
  • Pseudophoton control sample, scaled to the remainder of the low-MET events, used to estimate the γ-jet contribution
• Using 8 TeV data to explore a new approach (ABCD method with pseudophotons and relaxed isolation); preliminary results expected soon (a sketch of the ABCD estimate follows below)
• If this doesn’t work, will need to fall back to the old pseudophoton control-sample technique
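For reference, a minimal sketch of the ABCD estimate under the standard factorization assumption; the mapping of pseudophoton / relaxed-isolation categories onto the A/B/C/D regions, and the example yields, are placeholders rather than the group's region definitions:

```python
# Hedged sketch of an ABCD background estimate. The standard assumption is that the two
# discriminating variables (here: photon-candidate quality and isolation) are uncorrelated
# for the background, so that N_A / N_B = N_C / N_D.
#
# Region A: signal-like in both variables (e.g. tight, isolated photons)  -> to be estimated
# Region B: signal-like in variable 1, background-like in variable 2
# Region C: background-like in variable 1, signal-like in variable 2
# Region D: background-like in both
# Which pseudophoton / relaxed-isolation categories map onto B, C, D is an assumption here
# and would be fixed by the 8 TeV studies mentioned above.

def abcd_estimate(n_b, n_c, n_d):
    """QCD yield predicted in region A from the yields observed in B, C, D."""
    if n_d == 0:
        raise ValueError("Region D has no events; ABCD estimate undefined")
    return n_b * n_c / float(n_d)

# Example with made-up yields:
print(abcd_estimate(n_b=40.0, n_c=25.0, n_d=200.0))  # -> 5.0 expected QCD events in A
```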
Backgrounds - EW
• Estimate with an eγ control sample scaled by the e→γ fake rate (a sketch of the scaling follows below)
• Need to select the eγ control sample
• Tag & probe study of the e→γ fake rate underway [Giacomo]
MAYBE:
• Wγ MC suggests that ~25% of the EW background doesn’t arise from e→γ fakes
  • Some of this may be accounted for in the QCD background
  • Some of the QCD background may include e→γ fake events
  • Prior approach was to include a 25% systematic error on the EW background
• Perform a QCD/EW background overlap study?
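A minimal sketch of the fake-rate scaling, assuming a per-electron fake probability f and the common f/(1−f) scale-factor convention; the actual convention and numbers used in the analysis may differ:

```python
# Hedged sketch of the e->gamma fake-rate scaling for the EW background.
# The scale factor f / (1 - f), with f the per-electron probability to be reconstructed
# as a tight photon, is one common convention; the analysis definition may differ.

def ew_background(n_egamma_control, fake_rate):
    """EW (e->gamma fake) yield predicted in the diphoton sample
    from the e-gamma control-sample yield."""
    return n_egamma_control * fake_rate / (1.0 - fake_rate)

# Example with the ~2% fake rate quoted later for the 8 TeV optimization
# (the control-sample yield is made up):
print(ew_background(n_egamma_control=150.0, fake_rate=0.02))  # ~3.1 events
```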
Backgrounds - Irreducible
• Wγγ contribution estimated via an ℓγγ control sample and a simultaneous fit with the SR
  • Question about the comparison with the VBFNLO expectation
  • Need to develop the control sample and explore
• Zγγ contribution from Sherpa, scaled to VBFNLO (via MadGraph) in the relevant kinematic region (the scaling itself is sketched below)
  • Big difference between VBFNLO and Sherpa not understood (Sherpa much larger)
  • Need to revisit
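A minimal sketch of the Sherpa-to-VBFNLO rescaling, with placeholder fiducial cross sections (not analysis numbers):

```python
# Hedged sketch of the Z-gamma-gamma normalisation: the Sherpa prediction provides the
# shape and raw yield, which is rescaled to the VBFNLO cross section evaluated (via
# MadGraph) in a matching kinematic region. All numbers below are placeholders.

def rescaled_yield(n_sherpa, xsec_vbfnlo_fid, xsec_sherpa_fid):
    """Sherpa event yield rescaled by the ratio of fiducial cross sections."""
    return n_sherpa * xsec_vbfnlo_fid / xsec_sherpa_fid

# Placeholder example (Sherpa larger than VBFNLO, so the yield is scaled down):
print(rescaled_yield(n_sherpa=0.8, xsec_vbfnlo_fid=1.2, xsec_sherpa_fid=2.0))  # -> 0.48
```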
Event Selection: Preliminary Studies
• In the past, formal optimization was the last step, considering only Meff (or HT) and MET
• Individual, preliminary studies used to establish:
  • Photon pT cut; see e.g. https://indico.cern.ch/event/165989/contribution/0/material/slides/0.pdf
  • Δφ(γ, MET): make use of or not; cut value. Should we also cut on (Δφ(γ, MET) − …)?
  • Δφ(jet, MET): cut value. Should we also cut on (Δφ(jet, MET) − …)? (a Δφ sketch is given below)
• For 8 TeV, used the Meff vs. MET visualization plane (see below)
• Will need signal grid points for this already!
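A small sketch of the Δφ quantities in question, showing the difference between a one-sided and a two-sided requirement; the cut value and the object access are illustrative assumptions:

```python
# Hedged sketch of the Delta-phi quantities discussed above, illustrating a one-sided
# versus a two-sided requirement. The object access (jet_phi, met_phi) and the 0.4 cut
# value are schematic placeholders.
import math

def delta_phi(phi1, phi2):
    """Azimuthal separation wrapped into [0, pi]."""
    dphi = abs(phi1 - phi2) % (2.0 * math.pi)
    return 2.0 * math.pi - dphi if dphi > math.pi else dphi

def passes_dphi_jet_met(jet_phi, met_phi, cut=0.4, two_sided=False):
    """One-sided: reject events where the jet is aligned with the MET (likely mismeasurement).
    Two-sided: additionally reject the back-to-back configuration."""
    dphi = delta_phi(jet_phi, met_phi)
    if dphi < cut:
        return False
    if two_sided and dphi > math.pi - cut:
        return False
    return True

# Example: jet at phi = 0.1, MET at phi = 0.3 -> fails the 0.4 one-sided cut
print(passes_dphi_jet_met(0.1, 0.3))  # False
```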
Optimization: The Conundrum
• How to estimate backgrounds when the final background estimates are not available?
• For the 8 TeV analysis optimization, backgrounds were estimated as follows (a combined sketch is given below):
  • QCD background estimated by scaling the 1-tight + 1-non-isolated-pseudophoton sample to the 2-tight sample, with no Meff (HT) cut, for 0 < MET < 60 GeV (DATA)
  • EW background estimated by scaling the eγ sample by a uniform 2% e→γ scale factor (DATA)
  • Wγγ, Zγγ from MC
• SUSY group will accept leaving the final data-driven step and a quick reoptimization until just before unblinding. Or, pre-optimize as a function of one to-be-determined background value
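A hedged sketch of how these placeholder estimates could be combined per candidate cut during optimization; all names, the normalisation factor, and the example yields are illustrative:

```python
# Hedged sketch of combining the placeholder background estimates above for each candidate
# (Meff, MET) cut during optimization. The histogram-like dicts, the QCD normalisation from
# the low-MET region, and the example numbers are all schematic.

EGAMMA_FAKE_FACTOR = 0.02  # uniform e->gamma scale factor used at 8 TeV

def total_background(cut, pseudo_yield, egamma_yield, wgg_mc, zgg_mc, qcd_norm):
    """Total expected background for a given cut, summing:
       - QCD: pseudophoton template scaled by the low-MET normalisation factor
       - EW:  e-gamma control yield times the fake factor
       - irreducible: Wgg and Zgg taken directly from MC."""
    qcd = qcd_norm * pseudo_yield[cut]
    ew = EGAMMA_FAKE_FACTOR * egamma_yield[cut]
    return qcd + ew + wgg_mc[cut] + zgg_mc[cut]

# Example with a single made-up cut point:
cut = ("Meff>1500", "MET>250")
print(total_background(cut,
                       pseudo_yield={cut: 3.0}, egamma_yield={cut: 20.0},
                       wgg_mc={cut: 0.2}, zgg_mc={cut: 0.1}, qcd_norm=0.15))  # -> 1.15
```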
TASKS POTENTIALLY COMMON TO ALL GROUPS
• SM MC samples
• xAOD framework issues (not completely sure what I mean by this…)
• Code snippets reflecting agreed-upon object definitions
• Isolation variable code
• Pseudophoton object definition code
• e→γ fake rate study
• dPhi studies (one-sided or two-sided?)
MAYBE?
• Common derivation (probably not, since different triggers)?
• Common systematics tasks?
Code/Infrastructure
• xAOD-based analysis: TokyoTech, UCSC need to catch up
• Derivations followed through upon by Milano (status?)
• Higher-level statistics and plotting utility (Ryan…)
• Past quantities that have required study (do we need to look into these?)
  • MET
  • Isolation definition
  • ???
Optimization: 8 TeV Approach
• Last step done by inspection of the Meff (or HT) vs. MET plane
• Can be confounded by statistics; also look at background and signal statistics over the same plane (a scan sketch is given below)
• See 8 TeV backup note
[Plot: WP2 optimization in the Meff vs. MET plane]
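A hedged sketch of a brute-force scan over the Meff vs. MET plane; the simple significance used as the figure of merit is an assumption for illustration (the real optimization would presumably use expected limits, e.g. from HistFitter):

```python
# Hedged sketch of a brute-force scan over the Meff vs. MET plane. The figure of merit
# (S / sqrt(B + (0.3*B)^2), i.e. a simple significance with an assumed 30% background
# uncertainty) is a placeholder, as are the toy yield functions.
import math

def significance(s, b, rel_bkg_unc=0.3):
    """Simple significance estimate with a flat relative background uncertainty."""
    return s / math.sqrt(b + (rel_bkg_unc * b) ** 2) if b > 0 else float("inf")

def scan(signal_yield, background_yield, meff_cuts, met_cuts):
    """Return the (significance, Meff cut, MET cut) triplet maximizing the figure of merit.
    signal_yield and background_yield are callables returning expected yields for a cut pair."""
    best = None
    for meff in meff_cuts:
        for met in met_cuts:
            z = significance(signal_yield(meff, met), background_yield(meff, met))
            if best is None or z > best[0]:
                best = (z, meff, met)
    return best

# Example with purely illustrative toy yield functions:
sig = lambda meff, met: 5.0 * math.exp(-met / 500.0) * math.exp(-meff / 3000.0)
bkg = lambda meff, met: 50.0 * math.exp(-met / 100.0) * math.exp(-meff / 800.0)
print(scan(sig, bkg, meff_cuts=range(1000, 2500, 250), met_cuts=range(150, 400, 50)))
```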
What SRs to Create?
• For the 8 TeV analysis:
  • Strong production: high Meff; backgrounds near 0
  • EW production: intermediate HT; backgrounds 1-2 events
  • Low-mass bino, high-mass bino for both
  • SP1, SP2, WP1, WP2
• Also: model-independent SR (MIS), no Meff (HT) cut. Based on choosing the MET cut at which the EW and QCD backgrounds are about the same (~1 event each)
Model-Independent SR (?)
• 8 TeV analysis: at MET = 250 GeV with no Meff cut, the EW and QCD backgrounds are about the same
[Plot: EW and QCD background yields vs. MET]
• Question: Should we rethink? What do we really want to do to minimize the chance that we miss a signal?
• Hmmm… How do we think about this?
What Physics Could Hide Signal with Dominant BF into Photons and DM?
• Degenerate SUSY scenarios? No: energy has to go somewhere; we would see it in photons and/or MET. Photons will not be soft, because the decaying state will either be high-mass or boosted.
• Low photonic BF? Would need to accelerate the single-photon analyses. Not really practical.
• Long-lived scenarios? Need to re-create non-pointing photon reconstruction. Probably no competition from CMS here anyway.
• Perhaps the most likely scenario is a lower-than-expected cross section from a non-SUSY process. Probably best addressed by what was done before, or perhaps just use no Meff or HT cut together with the lower MET cut of the other, model-dependent SRs. Could perhaps also maintain a low photon ET cut, but that could be a “can of worms”.
Wrap Up
• I haven’t mentioned limit setting within HistFitter
  • Immediate motivation is to unblind before, or simultaneously with, CMS
  • I’m not assuming we’ll necessarily be setting limits!
• Our work is cut out for us. Thoughts?
• We should start writing the skeleton of the backup note. If anyone is itching to do this, by all means. Otherwise, I’m very happy to do that.