
First Oscillation Results from the MiniBooNE Experiment




Presentation Transcript


  1. First Oscillation Results from the MiniBooNE Experiment Jonathan Link Virginia Polytechnic Institute & State University April 19th 2007 Jonathan Link

  2. The MiniBooNE Collaboration: University of Alabama, Bucknell University, University of Cincinnati, University of Colorado, Columbia University, Embry-Riddle University, Fermi National Accelerator Laboratory, Indiana University, Los Alamos National Laboratory, Louisiana State University, University of Michigan, Princeton University, Saint Mary's University of Minnesota, Virginia Polytechnic Institute, Western Illinois University, Yale University

  3. The LSND Experiment LSND took data from 1993–98. Neutrino oscillation probability: P(ν̄μ→ν̄e) = sin²2θ sin²(1.27Δm²L/E). Baseline of 30 meters, energy range of 20 to 55 MeV, L/E of about 1 m/MeV. Neutrino source: π⁺ → μ⁺ νμ followed by μ⁺ → e⁺ νe ν̄μ at rest. LSND's signature: ν̄e p → e⁺ n, with Čerenkov and scintillation light from the e⁺ and a delayed 2.2 MeV γ from neutron capture (n + H → D + γ).

  4. The LSND Signal LSND found an excess of ν̄e events in a ν̄μ beam: 87.9 ± 22.4 ± 6.0 events over expectation, corresponding to an oscillation probability of (0.264 ± 0.067 ± 0.045)%. The decay-in-flight analysis (νμ→νe) gave an oscillation probability of (0.10 ± 0.16 ± 0.04)%.

  5. Why is this Result Interesting? Existing neutrino oscillation data (mass-squared levels ν1, ν2, ν3 separated by Δm²₁₂ and Δm²₂₃): LEP found that there are only 3 light neutrinos that interact weakly. Three neutrinos allow only 2 independent Δm² scales, since |Δm²₃₁| = |Δm²₃₂ ± Δm²₂₁|, but there are experimental results at 3 different Δm² scales.

  6. What Does it Mean? • First, one or more of the experiments may be wrong. LSND, being the leading candidate, has to be checked → MiniBooNE. • Otherwise, add one or more sterile neutrinos to the usual 3ν (mass)² spectrum, giving more independent Δm² scales. • Best fits to the data require at least two sterile neutrinos.

  7. The MiniBooNE Design Strategy Beamline: Booster (primary beam, protons) → target and horn (secondary beam, mesons: π⁺, K⁺) → decay region → absorber → dirt → detector (tertiary beam, neutrinos: νμ → νe?). Keep L/E the same while changing systematics, neutrino energy and event signature: P(νμ→νe) = sin²2θ sin²(1.27Δm²L/E). Order of magnitude higher energy (~500 MeV) than LSND (~30 MeV); order of magnitude longer baseline (~500 m) than LSND (~30 m).
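The L/E scaling in the two-flavor formula above can be sketched numerically. This is an illustrative helper (the function name and the mixing-parameter values are assumptions, not from the talk), showing that LSND and MiniBooNE sit at a similar L/E despite very different L and E:

```python
import math

def osc_prob(sin2_2theta, dm2_eV2, L_m, E_MeV):
    """Two-flavor appearance probability:
    P = sin^2(2θ) · sin^2(1.27 · Δm² [eV²] · L [m] / E [MeV])."""
    return sin2_2theta * math.sin(1.27 * dm2_eV2 * L_m / E_MeV) ** 2

# LSND-like point: L/E ≈ 30 m / 30 MeV = 1 m/MeV
p_lsnd = osc_prob(0.003, 1.0, 30.0, 30.0)
# MiniBooNE keeps L/E ≈ 541 m / 500 MeV ≈ 1 m/MeV
p_mb = osc_prob(0.003, 1.0, 541.0, 500.0)
```

Because the oscillation phase depends only on Δm²·L/E, the two probabilities are comparable, which is the design strategy the slide describes.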

  8. The Neutrino Beam

  9. The Neutrino Beam MiniBooNE extracts beam from the 8 GeV Booster and delivers it to a 1.7 λ Be target in the Target Hall, within a magnetic horn (2.5 kV, 174 kA) that increases the flux by a factor of 6. These results correspond to (5.58 ± 0.12) × 10²⁰ POT.

  10. Modeling Production of Pions HARP (CERN): • 5% λ beryllium target • 8.9 GeV proton beam momentum. Data are fit to a Sanford-Wang parameterization. HARP collaboration, hep-ex/0702024

  11. Modeling Production of Kaons K⁺ data from 10–24 GeV are fit with a Feynman scaling parameterization (points: data; dashes: total error of the fit parameterization). K⁰ data are also parameterized. An in situ measurement of K⁺ from the LMC agrees within errors with the parameterization.

  12. Neutrino Flux from GEANT4 Simulation Flux components: π⁺ → μ⁺ νμ, K⁺ → μ⁺ νμ, μ⁺ → e⁺ νe ν̄μ, K → π e νe. "Intrinsic" νe + ν̄e sources: • μ⁺ → e⁺ νe ν̄μ (52%) • K⁺ → π⁰ e⁺ νe (29%) • K⁰ → π e νe (14%) • Other (5%). Antineutrino content: 6%. νe/νμ = 0.5%

  13. Stability of Running Over the full ν run, neutrino interactions per proton-on-target are stable, and observed and expected events per minute agree.

  14. Events in the Detector

  15. The MiniBooNE Detector • 541 meters downstream of target • 12 meter diameter sphere • Filled with 950,000 liters of pure mineral oil (20+ meter attenuation length) • Light-tight inner region with 1280 photomultiplier tubes • Outer veto region with 240 PMTs • Simulated with a GEANT3 Monte Carlo

  16. Optical and PMT Response Models Attenuation length: >20 m @ 400 nm. We have developed a 39-parameter "optical model" based on internal calibration and external measurement. Detected photons come from prompt light (Cherenkov) and late light (scintillation, fluorescence), in a 3:1 ratio for β ≈ 1.

  17. The Beam Window The 1.6 μs spill sits inside a 19.2 μs beam trigger window. An unbroken time cluster of hits forms a "subevent". Most events are from νμ charged current (CC) interactions, with a characteristic two-"subevent" structure from stopped-μ decay: first the μ, then the decay e.

  18. Event Timing Structure with Simple Clean-up Cuts Progressively introducing cuts on the beam window: Veto hits < 6 removes through-going cosmic rays. This leaves "Michel electrons" (from μ → e νν̄ decay) and beam events. Tank hits > 200 (effectively an energy cut) removes Michel electrons, which have a 52 MeV endpoint.

  19. Calibration Systems and Sources

  20. Predicted Event Rates Before Cuts From the NUANCE event generator, D. Casper, Nucl. Phys. B Proc. Suppl. 112 (2002) 161, shown as a function of event neutrino energy (GeV).

  21. Charged Current Quasi-elastic CCQE events are 39% of all events. Events are "clean" (few particles), and the energy of the neutrino can be reconstructed from the scattering angle θ of the outgoing μ or e and the visible energy (Evisible). An oscillation signal is an excess of νe events as a function of EνQE.
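The reconstruction from angle and lepton energy can be sketched with the standard QE two-body formula. This is a simplified version (neglecting binding energy and Fermi motion, which the real analysis includes), and the function name and constants are illustrative:

```python
import math

M_N = 939.57   # nucleon (neutron) mass [MeV]
M_MU = 105.66  # muon mass [MeV]

def e_nu_qe(E_lep, cos_theta, m_lep=M_MU, M=M_N):
    """Neutrino energy assuming QE scattering on a nucleon at rest:
    Eν = (2·M·E_lep − m_lep²) / (2·(M − E_lep + p_lep·cosθ)).
    Binding energy and Fermi motion are neglected here."""
    p_lep = math.sqrt(E_lep ** 2 - m_lep ** 2)  # lepton momentum
    return (2.0 * M * E_lep - m_lep ** 2) / (2.0 * (M - E_lep + p_lep * cos_theta))
```

At fixed lepton energy, a larger scattering angle implies a larger reconstructed Eν, which is why both the angle and the visible energy are needed.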

  22. Neutrino Interaction Parameters in NUANCE From Q² fits to MiniBooNE νμ CCQE data: MAeff (effective axial mass) and EloSF (Pauli blocking parameter). From electron scattering data: Eb (binding energy) and pf (Fermi momentum). After the fit, data/MC ≈ 1 across all angles and energies (as a function of muon kinetic energy); the model describes the CCQE νμ data well.

  23. Events Producing Pions CC π⁺ (25% of events), ν N → μ π⁺ N via the Δ: easy to tag due to 3 subevents; not a substantial background to the oscillation analysis. NC π⁰ (8% of events), ν N → ν π⁰ N via the Δ: the π⁰ decays to 2 photons, which can look "electron-like", mimicking the signal; <1% of π⁰s contribute to the background. The Δ also decays to a single photon with ~0.5% probability.
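The two-photon kinematics behind this background can be sketched. The invariant mass of a photon pair is m² = 2·E1·E2·(1 − cos θ); when one photon is weak or escapes, the reconstructed mass collapses toward zero and the event looks like a single electron-like shower (the function name is illustrative):

```python
import math

PI0_MASS = 134.98  # π⁰ mass [MeV]

def mgg(E1, E2, opening_angle):
    """Invariant mass [MeV] of two photons with energies E1, E2
    and opening angle (radians): m² = 2·E1·E2·(1 − cos θ)."""
    return math.sqrt(2.0 * E1 * E2 * (1.0 - math.cos(opening_angle)))

# Symmetric π⁰ decay at rest: back-to-back photons of 67.49 MeV each
m_rest = mgg(PI0_MASS / 2.0, PI0_MASS / 2.0, math.pi)
```

A mass cut on mgg is one of the handles the analysis uses to separate π⁰s from true electrons (slide 31).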

  24. Particles Produced in the Detector and PID Muons: produced in most CC events; usually 2 subevents or exiting. Electrons: tag for the νμ→νe CC quasi-elastic (QE) signal; 1 subevent. π⁰s: can form a background if one photon is weak or exits the tank; in the NC case, 1 subevent.

  25. Two Independent Analyses

  26. Analysis Objectives Example signals (arbitrary units, vs. neutrino energy from 0 to 3 GeV): Δm² = 0.4, 0.7 and 1.0 eV². The goal of both analyses is to minimize background while maximizing signal efficiency. The "signal range" is approximately 300 MeV < EνQE < 1500 MeV. One can then either look for a total excess ("counting experiment") or fit for both an excess and energy dependence ("energy fit").

  27. Open Data for Studies MiniBooNE is searching for a small but distinguishable event signature. In order to maintain blindness, electron-like events were sequestered, leaving ~99% of the in-beam events available for study. Rule for the cuts used to sequester events: <1σ of signal outside of the box. Low-level information which did not allow particle ID was available for all events.

  28. Analysis Pre-cuts Both analyses share these hit-level pre-cuts: only 1 subevent, veto hits < 6, tank hits > 200, and a radius pre-cut R < 500 cm (where the reconstructed radius is algorithm-dependent).
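The shared pre-cuts amount to a simple event filter. A minimal sketch, assuming a hypothetical per-event dict (the key names are illustrative; the cut values are the ones quoted above):

```python
def passes_precuts(event):
    """Hit-level pre-cuts shared by both analyses:
    exactly 1 subevent, veto hits < 6, tank hits > 200, R < 500 cm."""
    return (event["n_subevents"] == 1
            and event["veto_hits"] < 6
            and event["tank_hits"] > 200
            and event["radius_cm"] < 500.0)

# A Michel-electron-like candidate fails on tank hits (below the
# 52 MeV-endpoint rejection threshold):
michel = {"n_subevents": 1, "veto_hits": 2, "tank_hits": 150, "radius_cm": 300.0}
```

Everything downstream (likelihood ratios, mass cuts, BDT scores) is applied only to events surviving this filter.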

  29. Analysis 1: "Track-Based" (TB) Analysis Philosophy: uses detailed, direct reconstruction of particle tracks, and the ratio of fit likelihoods for the various event-type hypotheses (μ, e and π⁰), to identify particles. This algorithm was found to have the better sensitivity to νμ→νe appearance; therefore, before unblinding, it was chosen as the algorithm for the "primary result".

  30. Rejecting Muon-like Events Using log(Le/Lμ) log(Le/Lμ) > 0 favors the electron-like hypothesis (shown for MC νe CCQE and νμ CCQE events). Note: photon conversions are electron-like, so this does not separate e/π⁰. Separation is clean at high energies, where muon-like events are long. The analysis cut was chosen to maximize the νμ→νe sensitivity.

  31. Rejecting π⁰ Events Using a mass cut and log(Le/Lπ) (shown for MC νe CCQE and NC π⁰ events). Cuts were chosen to maximize the νμ→νe sensitivity.

  32. Testing e-π⁰ Separation with Data Selection: 1 subevent, log(Le/Lμ) > 0 (e-like), log(Le/Lπ) < 0 (π-like), γγ invariant mass > 50 MeV (high mass). The signal region remains blind; the Monte Carlo shown is π⁰-only. The sideband region has a χ² probability of 69%.

  33. Summary of Track Based Analysis Cuts Efficiency after "precuts" + log(Le/Lμ) + log(Le/Lπ) + invariant mass cuts, with simulated backgrounds after cuts.

  34. Analysis 2: Boosted Decision Trees (BDT) Boosted decision trees, or "boosting", is a technique for pattern recognition through training (like a neural net). Philosophy: construct a set of low-level analysis variables which are combined to give a score to each event, which is used to select electron-like events. This algorithm represents an independent cross-check of the Track Based Analysis. In the interest of time I will focus on analysis 1, which is the primary analysis.
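The idea of boosting (combining weak classifiers into a per-event score) can be illustrated with a toy AdaBoost on 1-D threshold stumps. This is a pedagogical sketch only, not the MiniBooNE boosting code, and all names and the toy data are invented:

```python
import math

def train_stump(xs, ys, ws):
    """Best weighted 1-D threshold stump (labels ±1).
    Returns (weighted_error, threshold, polarity)."""
    best = (float("inf"), None, None)
    for thr in xs:
        for pol in (1, -1):
            err = sum(w for x, y, w in zip(xs, ys, ws)
                      if (pol if x >= thr else -pol) != y)
            if err < best[0]:
                best = (err, thr, pol)
    return best

def adaboost(xs, ys, n_rounds=3):
    """Each round fits a stump, then boosts the weights of the
    events it misclassified, so later stumps focus on them."""
    ws = [1.0 / len(xs)] * len(xs)
    model = []  # list of (alpha, threshold, polarity)
    for _ in range(n_rounds):
        err, thr, pol = train_stump(xs, ys, ws)
        err = max(err, 1e-9)  # guard against log(0)
        alpha = 0.5 * math.log((1.0 - err) / err)
        model.append((alpha, thr, pol))
        ws = [w * math.exp(-alpha * y * (pol if x >= thr else -pol))
              for x, y, w in zip(xs, ys, ws)]
        total = sum(ws)
        ws = [w / total for w in ws]
    return model

def score(model, x):
    """Weighted vote of the stumps; the sign is the predicted class."""
    return sum(a * (p if x >= t else -p) for a, t, p in model)

# Toy separable data: "background" below 0.5, "signal" above
xs = [0.1, 0.2, 0.3, 0.8, 0.9, 1.0]
ys = [-1, -1, -1, 1, 1, 1]
model = adaboost(xs, ys)
```

In the real analysis the inputs are many low-level reconstructed variables rather than a single number, and a cut on the combined score selects electron-like events.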

  35. Errors, Constraints and Sensitivity

  36. Sources of νe Background We have two main categories of background: νμ mis-identification and intrinsic νe. Uncertainties in the background rates are among the sources of significant error in the analysis. We have handles in the MiniBooNE data for each source of background.

  37. Determining the Misidentification Background Rates From the νμ CCQE events, predict the normalization and energy dependence of both background and signal. Data/MC ratios: Track Based 1.32 ± 0.26; Boosted Decision Tree 1.22 ± 0.29. Tying the νe background and signal prediction to the νμ flux constrains this analysis to a strict νμ→νe appearance-only search.

  38. νμ Constraint on Intrinsic νe from Beam μ⁺ Decay In the chain π → μ νμ followed by μ → e νe ν̄μ: measure the νμ flux; kinematics (Eν ≈ 0.43 Eπ) allows a connection to the π flux. Once the π flux is known, the μ flux is determined.

  39. K⁺ and K⁰ Decay Backgrounds At high energies, well above the signal range, νμ and νe-like events are largely from neutrinos from kaon decay. The events in these high-energy bins are used to constrain the νe background from kaon decay.

  40. π⁰ Production Constrained with MiniBooNE Data This reduces the error on predicted mis-identified π⁰s. Reweighting improves agreement in other variables as well. Because this constrains the Δ resonance rate, it also constrains the rate of Δ → Nγ.

  41. External Sources of Background External events (sometimes referred to as "dirt" events) are from ν interactions outside of the detector. After PID cuts, Ndata/NMC = 0.99 ± 0.15. Cosmic rays: measured from out-of-beam data, 2.1 ± 0.5 events.

  42. Summary of Predicted Background Rates Beam-intrinsic νe and νμ misidentification.

  43. Handling Uncertainties in the Analyses What we begin with: for a given source of uncertainty, errors on a wide range of parameters in the underlying model. What we need: for a given source of uncertainty, errors in bins of reconstructed EνQE and information on the correlations between bins.

  44. Optical Model Uncertainties 39 parameters must be varied; the allowed variations are set by the Michel calibration sample. To understand the allowed variations, we ran 70 hit-level simulations with randomly selected parameters: "multisims".

  45. Multisims Used to Convert Uncertainties to Errors Using multisims to convert from parameter uncertainties to errors in EνQE bins: for each error source, multisims are generated within the allowed variations by reweighting the standard Monte Carlo; in the case of the optical model, hit-level simulations are used. 1000 multisims for K⁺ production; 70 multisims for the optical model (e.g. the spread, relative to the standard MC, in the number of events passing cuts in 500 < EνQE < 600 MeV).

  46. Form Large Error Matrix for Uncertainties and Correlations Error matrix elements: Eij = (1/M) Σα (Niα − NiMC)(Njα − NjMC), where: • N is the number of events passing cuts • MC denotes the standard Monte Carlo • α labels a given multisim • M is the total number of multisims • i, j are EνQE bins. The total error matrix is the sum from each source. TB: νe-only total error matrix; BDT: νμ-νe total error matrix.
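The multisim covariance construction can be sketched directly. This is an illustrative implementation with invented toy numbers (the real matrices have many EνQE bins and up to 1000 multisims per source):

```python
def error_matrix(multisims, central):
    """E_ij = (1/M) Σ_α (N_i^α − N_i^MC)(N_j^α − N_j^MC),
    where each multisim is a list of binned event counts and
    'central' is the standard-MC prediction per bin."""
    M = len(multisims)
    n = len(central)
    E = [[0.0] * n for _ in range(n)]
    for sim in multisims:
        d = [sim[i] - central[i] for i in range(n)]  # deviation per bin
        for i in range(n):
            for j in range(n):
                E[i][j] += d[i] * d[j] / M
    return E

# Toy: 3 multisims, 2 EνQE bins, standard-MC central values
central = [100.0, 50.0]
sims = [[110.0, 55.0], [90.0, 45.0], [100.0, 50.0]]
cov = error_matrix(sims, central)
# diagonal: per-bin variance; off-diagonal: bin-to-bin correlations
```

Summing such matrices over all error sources gives the total error matrix used in the fits.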

  47. Sensitivity of the Two Analyses With all the errors calculated, and before the box is opened, we can calculate the sensitivity of the two analyses. The Track-Based sensitivity is better, thus this becomes the pre-determined default analysis.

  48. The Initial Results

  49. Box Opening Procedure Progress cautiously, in a step-wise fashion. After applying all analysis cuts: 1. Fit sequestered data to an oscillation hypothesis, returning no fit parameters. Return the χ² of the data/MC comparison for a set of diagnostic variables. 2. Open up the plots from step 1. The fitted signal is unreported. Plots are chosen to be useful diagnostics, without indicating if signal was added. 3. Report the χ² for a fit to EνQE, without returning fit parameters. 4. Compare EνQE in data and Monte Carlo, returning the fit parameters. At this point, the box is open.

  50. Step 1 Return the χ² of the data/MC comparison for a set of diagnostic variables. 12 variables are tested for Track Based; 46 variables are tested for Boosted Decision Tree. All analysis variables were returned with good probability except one: the Track Based analysis χ² probability for the Evisible fit was only 1%. This probability was sufficiently low to merit further consideration.
