Development of New Kaon Selectors



  1. Development of New Kaon Selectors. Kalanand Mishra, University of Cincinnati

  2. Overview of Neural Net Training • The input variables for the neural net are: likelihoods from the SVT, DCH, and DRC (both global and track-based) and the momentum and polar angle (θ) of the tracks. • Separate neural net trainings for 'Good Quality' and 'Poor Quality'* tracks give two families of selectors: "KNNGoodQual" and "KNNNoQual". A training sketch follows this slide. • *Poor Quality tracks are defined as belonging to one of the following categories: outside DIRC acceptance; passing through the cracks between DIRC bars; no DCH hits in layers > 35; EMC energy < 0.15 GeV
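To make the training setup concrete, here is a minimal sketch using scikit-learn as a stand-in for the original BaBar tools. The feature names, array layout, network size, and the quality masks are illustrative assumptions, not the actual analysis code.

```python
# Minimal sketch of the neural-net training described above, using
# scikit-learn in place of the original BaBar tools. Feature names,
# array layout, and the quality masks are illustrative assumptions.
from sklearn.neural_network import MLPClassifier
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler

# One row per track: likelihoods from SVT, DCH, DRC (global and
# track-based), then momentum p and polar angle theta.
FEATURES = ["lh_svt", "lh_dch", "lh_drc_global", "lh_drc_track", "p", "theta"]

def train_knn_selector(X, y):
    """Train one selector: y = 1 for kaons, 0 for background tracks."""
    clf = make_pipeline(StandardScaler(),
                        MLPClassifier(hidden_layer_sizes=(20,), max_iter=500))
    clf.fit(X, y)
    return clf

# Separate trainings give the two selector families:
#   knn_goodqual = train_knn_selector(X[good_mask], y[good_mask])
#   knn_noqual   = train_knn_selector(X[~good_mask], y[~good_mask])
```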

  3. Performance of 'KNNxQual' selectors [Plots: background rejection vs. signal efficiency for the NoQual and GoodQual selectors; the higher curve/point represents better performance. Panels are annotated "An absolute improvement", "An overall improvement", and "Deterioration"; one panel shows GoodQual for 0.3 < P < 0.5 GeV/c.] Most of the tracks used in B-tagging have low momenta, and the B-tagging group is the biggest consumer of such a selector.
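Curves like these can be produced for any trained selector by scanning the classifier output. A sketch, assuming a trained selector and a labeled test set in the style of the previous sketch:

```python
# Sketch of the performance curves above: background rejection vs.
# signal efficiency, scanned over the classifier output. Assumes a
# trained selector and a labeled test set from the previous sketch.
from sklearn.metrics import roc_curve

def rejection_vs_efficiency(clf, X_test, y_test):
    scores = clf.predict_proba(X_test)[:, 1]   # kaon probability per track
    fpr, tpr, _ = roc_curve(y_test, scores)
    return tpr, 1.0 - fpr                      # signal eff., bkgd. rejection

# eff, rej = rejection_vs_efficiency(knn_goodqual, X_test, y_test)
# At a given signal efficiency, the selector with higher rejection wins.
```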

  4. Tried different algorithms …. [Plot: event distributions vs. classifier output for Binary AdaBoost, Simple Binary Split, Fisher, AdaBoost Decision Tree, and Bagger Decision Tree; the AdaBoost Decision Tree provides the best separation.]
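A hedged sketch of such a comparison, with scikit-learn classifiers standing in for the StatPatternRecognition implementations (Fisher is approximated by linear discriminant analysis; X_train and y_train are the assumed training arrays; the estimator= keyword needs a recent scikit-learn):

```python
# Sketch of the comparison: the same training sample fed to several
# classifiers, with scikit-learn stand-ins for the StatPatternRecognition
# implementations (Fisher approximated by linear discriminant analysis).
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
from sklearn.ensemble import AdaBoostClassifier, BaggingClassifier
from sklearn.tree import DecisionTreeClassifier

candidates = {
    "Simple Binary Split": DecisionTreeClassifier(max_depth=1),
    "Fisher": LinearDiscriminantAnalysis(),
    "Binary AdaBoost": AdaBoostClassifier(n_estimators=200),
    "AdaBoost Decision Tree": AdaBoostClassifier(
        estimator=DecisionTreeClassifier(max_depth=4), n_estimators=200),
    "Bagger Decision Tree": BaggingClassifier(n_estimators=200),
}

for name, clf in candidates.items():
    clf.fit(X_train, y_train)   # X_train, y_train: assumed training arrays
    # Histogram clf.predict_proba(X_test)[:, 1] separately for signal and
    # background to compare the output separation classifier by classifier.
```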

  5. AdaBoost Decision Tree For details on the algorithms and software used, see arXiv:physics/0507143 (by Ilya Narsky). [Plot: signal and background event distributions vs. classifier output.] • A Decision Tree splits nodes recursively until a stopping criterion is satisfied. • AdaBoost combines weak classifiers by applying them sequentially. At each step it enhances the weights of misclassified events and reduces the weights of correctly classified events.
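The weight-update step described above can be written in a few lines. A from-scratch sketch (illustrative, not the StatPatternRecognition implementation), using decision stumps as the weak classifiers and labels y in {-1, +1}:

```python
# From-scratch sketch of the AdaBoost step described above (illustrative,
# not the StatPatternRecognition implementation). Weak classifiers are
# decision stumps; labels y must be in {-1, +1}.
import numpy as np
from sklearn.tree import DecisionTreeClassifier

def adaboost_fit(X, y, n_rounds=50):
    w = np.full(len(y), 1.0 / len(y))            # start from uniform weights
    trees, alphas = [], []
    for _ in range(n_rounds):
        tree = DecisionTreeClassifier(max_depth=1).fit(X, y, sample_weight=w)
        pred = tree.predict(X)
        eps = max(np.sum(w[pred != y]), 1e-12)   # weighted error rate
        alpha = 0.5 * np.log((1.0 - eps) / eps)  # classifier vote weight
        # Enhance weights of misclassified events (y*pred = -1),
        # reduce weights of correctly classified ones (y*pred = +1):
        w *= np.exp(-alpha * y * pred)
        w /= w.sum()
        trees.append(tree)
        alphas.append(alpha)
    return trees, alphas

def adaboost_score(trees, alphas, X):
    """Weighted vote of the weak classifiers; its sign gives the class."""
    return sum(a * t.predict(X) for t, a in zip(trees, alphas))
```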

  6. AdaBoost Decision Tree • Training on "real data". • Visual inspection shows a significant improvement over the neural-network performance. • Need to retrain after randomizing the momentum distributions and with additional input variables.

  7. Performance in select momentum bins [Plots of selector performance in four momentum bins:] • dE/dx - DRC transition region: 0.8 < P < 1.0 GeV/c • Low momentum: 0.3 < P < 0.5 GeV/c • Intermediate range: 1.9 < P < 2.1 GeV/c • High momentum: 3.0 < P < 3.2 GeV/c
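A small sketch of such a binned check, reusing rejection_vs_efficiency from the earlier sketch; the bin edges follow the slide, while the momentum column index is an assumption:

```python
# Sketch of the binned check: evaluate efficiency and rejection in each
# momentum bin shown above, reusing rejection_vs_efficiency from the
# earlier sketch. The momentum column index is an assumption.
import numpy as np

BINS = [(0.8, 1.0), (0.3, 0.5), (1.9, 2.1), (3.0, 3.2)]  # GeV/c
P_COL = 4   # index of momentum p in the feature layout sketched earlier

for lo, hi in BINS:
    in_bin = (X_test[:, P_COL] > lo) & (X_test[:, P_COL] < hi)
    eff, rej = rejection_vs_efficiency(clf, X_test[in_bin], y_test[in_bin])
    # One performance curve per bin, as in the four panels of this slide.
```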

  8. Things to do …. • Randomize the momentum distributions of signal and background events before training (a reweighting sketch follows this slide). • Add additional discriminating input variables: number of signal and background Cherenkov photons in the ring; total number of drift-chamber hits and hits in the last 5 layers; number of hits in the silicon detector; …… other suggestions! • Add other background categories: proton, ….. • Finalize the cuts and implement the selectors: it will be a single family of selectors, with no separate selectors for "good" and "poor" quality tracks. Should we still call it a KNN selector?
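For the first item, one common approach (an assumption here; the slide does not specify the method) is to reweight events so that signal and background share the same momentum spectrum, removing momentum itself as a discriminant. A sketch:

```python
# Sketch of the first to-do item, assuming a histogram-reweighting
# approach (the slide does not specify the method): weight background
# events so their momentum spectrum matches the signal's.
import numpy as np

def momentum_weights(p_sig, p_bkg, n_bins=40, p_range=(0.3, 4.0)):
    """Per-event weights for background that equalize the p spectra."""
    edges = np.linspace(p_range[0], p_range[1], n_bins + 1)
    h_sig, _ = np.histogram(p_sig, bins=edges, density=True)
    h_bkg, _ = np.histogram(p_bkg, bins=edges, density=True)
    ratio = np.divide(h_sig, h_bkg, out=np.zeros_like(h_sig),
                      where=h_bkg > 0)                 # bin-by-bin ratio
    idx = np.clip(np.digitize(p_bkg, edges) - 1, 0, n_bins - 1)
    return ratio[idx]

# w_bkg = momentum_weights(p_signal, p_background)
# Pass the weights via sample_weight to a classifier that supports them,
# e.g. the AdaBoost fit sketched earlier.
```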

  9. Summary • Significant efforts are underway to develop a "new version" of the KNN selectors. • The goal is to develop a powerful non-LH kaon selector using the best-performing classifier (or a combination of classifiers). • Such a selector is expected to replace the current KNN selectors for B-tagging purposes, and should be a meaningful alternative to the LH selectors for physics analyses.
