This document presents an overview of the background rejection trees developed by T. Burnett, covering the methodology and the implementation in the merit file structure. Each selection is described by a set of weighted trees and nodes listed in `dtree.txt`, with the corresponding variables in `variables.txt`. Evaluation proceeds by passing a vector of floats ordered according to the variable list. The slides also cover training on EVEN events with optional boosting, and preliminary results on energy resolution and validity fractions under equal signal and background weighting.
The eight selections (from Bill) T. Burnett
The prefilter cuts
Implementation in the merit file structure
• Each tree is described by two files:
• dtree.txt – an ASCII file listing the weighted trees and their nodes:
• tree: specifies the weight assigned to the tree
• branch: variable index and cut value
• leaf: purity
• variables.txt – the list of the corresponding tuple variables
• Evaluation is done by passing a vector of floats, ordered according to the variable list.
• Proposal: incorporate the prefilter cut in the tree description
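The slides only sketch the node types (tree weight, branch = variable index plus cut, leaf = purity), not the exact `dtree.txt` layout. A minimal evaluation sketch under those assumptions, with the ensemble held in memory as nested dicts rather than parsed from the real file format:

```python
# Hypothetical in-memory form of a dtree.txt ensemble (the actual file
# layout is not reproduced here).  Each tree is (weight, root node);
# a branch node holds a variable index and cut value, a leaf holds purity.

def eval_tree(node, x):
    """Walk one tree: follow branches until a leaf, return its purity."""
    while "purity" not in node:
        node = node["left"] if x[node["var"]] < node["cut"] else node["right"]
    return node["purity"]

def eval_ensemble(trees, x):
    """Weighted mean of leaf purities over all trees.

    x is the vector of floats, ordered according to variables.txt."""
    total_w = sum(w for w, _ in trees)
    return sum(w * eval_tree(root, x) for w, root in trees) / total_w

# Toy ensemble: one tree of weight 1.0 cutting on variable 0 at 0.5.
trees = [
    (1.0, {"var": 0, "cut": 0.5,
           "left": {"purity": 0.2}, "right": {"purity": 0.9}}),
]
print(eval_ensemble(trees, [0.7]))  # 0.9
```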
Training details
• Weight the signal and background samples equally
• Train on the EVEN events, with optional boosting
• Test with the ODD events
• Save the training and testing efficiency curves
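The even/odd split and the equal weighting above can be sketched as follows. The split convention (event-number parity) and the dict-based event records are assumptions for illustration, not details from the slides:

```python
# Sketch of the training setup: EVEN events for training, ODD for testing,
# and background reweighted so its total weight matches the signal's.

def split_even_odd(events):
    """Return (train, test): even event numbers train, odd ones test."""
    train = [e for e in events if e["event_id"] % 2 == 0]
    test = [e for e in events if e["event_id"] % 2 == 1]
    return train, test

def equalize_weights(signal, background):
    """Scale background weights in place so both samples sum equally."""
    scale = sum(e["weight"] for e in signal) / sum(e["weight"] for e in background)
    for e in background:
        e["weight"] *= scale

signal = [{"event_id": i, "weight": 1.0} for i in range(3)]
background = [{"event_id": i, "weight": 1.0} for i in range(2)]
equalize_weights(signal, background)
train, test = split_even_odd(signal + background)
print([e["event_id"] for e in train])  # signal 0, 2 and background 0
```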
Boosting: what does it do?
• Nice interpolation for low background
• Not much improvement in actual separation (so far)
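The slides do not say which boosting variant was used; as a generic illustration, one AdaBoost-style reweighting round looks like this. Misclassified events gain weight, so the next tree focuses on them:

```python
import math

# One AdaBoost-style boosting round (generic sketch, not necessarily the
# variant used in the training described above).

def boost_round(weights, misclassified):
    """Given per-event weights and a per-event misclassification flag,
    return (alpha, new_weights): alpha is the weight the boosted tree
    would get, and new_weights are renormalized event weights."""
    total = sum(weights)
    err = sum(w for w, bad in zip(weights, misclassified) if bad) / total
    alpha = 0.5 * math.log((1.0 - err) / err)
    new_w = [w * math.exp(alpha if bad else -alpha)
             for w, bad in zip(weights, misclassified)]
    norm = sum(new_w)
    return alpha, [w / norm for w in new_w]

# Four events, one misclassified: its weight rises from 0.25 to 0.5.
alpha, w = boost_round([0.25] * 4, [True, False, False, False])
print(round(w[0], 3))  # 0.5
```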
Preliminary single-tree background
What about the energy resolution? Validity fractions
The fraction of time each estimate is best