
The Neural Network Objects RhoNNO: A ROOT package for general use



  1. The Neural Network Objects RhoNNO: A ROOT package for general use
  Marcel Kunze, Institut für Experimentalphysik 1, Ruhr-Universität Bochum
  3rd ROOT Workshop, M. Kunze

  2. Why?
  • Neural Network Objects ("NNO") have been in use and tested in various applications for several years
  • ROOT is very appealing with respect to object persistence and interactivity
  • NNO has been overhauled to re-use ROOT functionality ("RhoNNO")
  • Integration of other ROOT applications (the Neural Network Kernel of J.P. Ernenwein)
  • RhoNNO complies with ROOT's coding standards

  3. Neural Network Models
  • Implementation of the most popular supervised and unsupervised network models

  4. Growing Neural Gas (GNG)
  • Example: adaptation to a multi-dimensional PDF (1D-2D-3D)
  • Fractal growth process: density and topology conservation
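The adaptation behind this growth process can be pictured in a few lines: for each input sample drawn from the PDF, the closest node moves toward the sample. The following is a minimal, self-contained sketch of that winner-adaptation step only, with illustrative names; it is not the RhoNNO implementation, and full GNG additionally moves graph neighbours by a smaller rate, ages edges, and inserts nodes where error accumulates.

```cpp
#include <cstddef>
#include <vector>

// One GNG node: a reference vector (position) in input space.
struct Node { std::vector<double> w; };

// Squared Euclidean distance between a node and a sample.
double dist2(const Node& n, const std::vector<double>& x) {
    double d = 0.0;
    for (std::size_t i = 0; i < x.size(); ++i) {
        double diff = n.w[i] - x[i];
        d += diff * diff;
    }
    return d;
}

// Move the best-matching node toward the sample x by rate epsB.
// In full GNG, the winner's graph neighbours also move, by a
// smaller rate, which is what preserves the topology.
void adapt(std::vector<Node>& nodes, const std::vector<double>& x,
           double epsB) {
    std::size_t best = 0;
    for (std::size_t i = 1; i < nodes.size(); ++i)
        if (dist2(nodes[i], x) < dist2(nodes[best], x)) best = i;
    for (std::size_t i = 0; i < x.size(); ++i)
        nodes[best].w[i] += epsB * (x[i] - nodes[best].w[i]);
}
```

Repeating this step while periodically inserting nodes in high-error regions is what produces the density-conserving growth shown on the slide.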

  5. Architecture
  • RhoNNO class hierarchy

  6. Interactive Control
  • The plotter highlights training progress: output functions and error functions, for positive samples ("pro") and negative samples ("con")

  7. Management of Data Sets
  • TDataServe: a mini database to support management of input/output vector relations
  - Add vector pairs to data sets
  - Partition data sets for training/testing
  - Shuffle data sets
  - Serve vector pairs
  - Save and load data sets using ROOT persistence
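The operations listed above can be pictured with a few standard containers. This is a minimal sketch of the idea with illustrative names, not the actual TDataServe API (and it omits ROOT persistence entirely):

```cpp
#include <algorithm>
#include <cstddef>
#include <random>
#include <utility>
#include <vector>

// A vector pair: a network input and its desired output.
using Pair = std::pair<std::vector<float>, std::vector<float>>;

// Minimal data server: holds pairs, shuffles them, and partitions
// the set into a training part and a test part.
class DataServer {
public:
    void AddPair(Pair p) { fPairs.push_back(std::move(p)); }

    void Shuffle(unsigned seed = 0) {
        std::mt19937 gen(seed);
        std::shuffle(fPairs.begin(), fPairs.end(), gen);
    }

    // Reserve the last nTest pairs for testing.
    void Partition(std::size_t nTest) {
        fNTest = std::min(nTest, fPairs.size());
    }

    std::size_t TrainSize() const { return fPairs.size() - fNTest; }
    std::size_t TestSize() const { return fNTest; }

    const Pair& TrainPair(std::size_t i) const { return fPairs[i]; }
    const Pair& TestPair(std::size_t i) const {
        return fPairs[TrainSize() + i];
    }

private:
    std::vector<Pair> fPairs;
    std::size_t fNTest = 0;
};
```

Shuffling before partitioning is what makes the training/test split unbiased when the input files are ordered by class.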

  8. VNeuralNet Interface
  • Abstract base class of all network models

    // Abstract interface for all networks
    virtual void AllocNet() = 0;
    virtual void InitNet() = 0;
    virtual void WriteText() = 0;
    virtual void WriteBinary() = 0;
    virtual void ReadText() = 0;
    virtual void ReadBinary() = 0;
    virtual Double_t* Recall(NNO_INTYPE* in, NNO_OUTTYPE* out = 0) = 0;
    virtual Double_t Train(NNO_INTYPE* in, NNO_OUTTYPE* out = 0) = 0;

    // Training and testing
    Double_t TrainEpoch(TDataServe* server, Int_t nEpoch = 1);
    Double_t TestEpoch(TDataServe* server);
    void BalanceSamples(Bool_t yesNo = kTRUE);
    virtual void SetMomentumTerm(Double_t f);
    virtual void SetFlatSpotElimination(Double_t f);
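The TrainEpoch/TestEpoch pair suggests a simple epoch driver on top of the abstract Train method: feed every training pair to the model and accumulate the per-pattern error. The following self-contained sketch shows that pattern with stand-in types; it is not the actual VNeuralNet/TDataServe code.

```cpp
#include <cstddef>
#include <vector>

// Stand-in for the abstract network interface: Train() processes one
// input/target pair and returns the error for that pattern (a real
// network would also update its weights here).
struct Model {
    virtual double Train(const std::vector<double>& in,
                         const std::vector<double>& out) = 0;
    virtual ~Model() = default;
};

// One training epoch: present every pair once and return the
// mean per-pattern error, as an epoch method plausibly would.
double TrainEpoch(Model& net,
                  const std::vector<std::vector<double>>& ins,
                  const std::vector<std::vector<double>>& outs) {
    double error = 0.0;
    for (std::size_t i = 0; i < ins.size(); ++i)
        error += net.Train(ins[i], outs[i]);
    return error / ins.size();
}
```

The virtue of pushing this loop into the base class is that every concrete model (MLP, GNG, ...) gets training, testing, and sample balancing for free by implementing only Train and Recall.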

  9. NetworkTrainer
  • A RhoNNO sample application to train and test neural networks
  - Assemble training and test data sets out of ROOT trees, based on TFormula
  - Define network architecture and transfer functions
  - Define and execute a training schedule
  - Persist networks
  - Generate C++ code to perform network recall

  10. Example: PID Tagging
  • Identify charged tracks (e, μ, π, K, p) using dE/dx, the Cherenkov angle, etc.
  [Figure: separation of the particle species as a function of momentum P [GeV/c]]

  11. The ROOT Training File
  • An arbitrary standard tree provides the training and test sample: measurement and shape variables, likelihood values, and the MC truth (1 = electron, 2 = muon, 3 = pion, 4 = kaon, 5 = proton)

  12. Steering File

    # Example: Training of PIDSelectors with NNO
    # Define the network topology and training schedule
    xmlp 7 15 10 1                 # MLP with 2 hidden layers
    transfer TR_FERMI              # Transfer function
    momentum 0.2                   # Momentum term
    balance true                   # Assure same statistics for pro and con samples
    plots true                     # Show updating error plots on training progress
    test 10000                     # Number of test vector pairs to reserve
    start 1                        # First training epoch
    stop 200                       # Last training epoch

    # Define the data source, take two input files
    datapath ../Data               # Directory to look up data files
    networkpath ../Networks        # Directory to persist network files
    file PidTuple1.root            # First file to get input from
    file PidTuple2.root            # Second ... (ad infinitum)

    # Set up the input layer (use branch names)
    tree PidTuple                  # This is the tree to look up data
    cut mom>0.5&&dch>0&&dch<10000  # Preselection of samples
    input mom:acos(theta):svt:emc:drc:dch:ifr  # Input layer
    autoscale true                 # Apply a scale to assure inputs are O(1)

    # Set up the output layer (use branch names)
    # Particles pid = {electron=1,muon,pion,kaon,proton}
    output abs(pid)==3             # Perform training for pions
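A steering file of this shape (keyword, value, optional `#` comment) is straightforward to parse line by line. This is a minimal sketch under the assumption that the first token on a line is the keyword and `#` starts a comment; it is not the actual NetworkTrainer parser, and unlike the real trainer it keeps only the last value of a repeated keyword such as `file`.

```cpp
#include <istream>
#include <map>
#include <sstream>
#include <string>

// Parse steering text of the form "keyword value  # comment"
// into a keyword -> value map. Blank and comment-only lines
// are skipped; anything after the value is ignored here.
std::map<std::string, std::string> ParseSteering(std::istream& in) {
    std::map<std::string, std::string> cfg;
    std::string line;
    while (std::getline(in, line)) {
        auto hash = line.find('#');            // strip trailing comment
        if (hash != std::string::npos) line.erase(hash);
        std::istringstream ls(line);
        std::string key, value;
        if (ls >> key >> value) cfg[key] = value;
    }
    return cfg;
}
```

A real parser would additionally accumulate repeated keys and split colon-separated formula lists such as the `input` line.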

  13. Generation of Recall Code

    // TXMLP network trained with NNO NetworkTrainer at Fri Apr 27
    // Input parameters: mom:acos(theta):svt:emc:drc:dch:ifr
    // Output parameters: abs(pid)==3
    // Training files:
    // ../Data/PidTuple1.root
    // ../Data/PidTuple2.root
    #include "RhoNNO/TXMLP.h"

    Double_t* Recall(Double_t *invec)
    {
        static TXMLP net("TXMLP.net");
        Float_t x[7];
        x[0] = 0.76594 * invec[0]; // mom
        x[1] = 2.21056 * invec[1]; // acos(theta)
        x[2] = 0.20365 * invec[2]; // svt
        x[3] = 2.2859  * invec[3]; // emc
        x[4] = 1.75435 * invec[4]; // drc
        x[5] = 0.00165 * invec[5]; // dch
        x[6] = 0.85728 * invec[6]; // ifr
        return net.Recall(x);
    }
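The constants baked into the generated code are the scale factors produced by the `autoscale` option. One plausible rule for deriving them, and not necessarily NNO's exact prescription, is the reciprocal of the mean absolute value of each input variable over the training sample, so that every scaled input is O(1):

```cpp
#include <cmath>
#include <cstddef>
#include <vector>

// Compute one scale factor per input variable: the reciprocal of the
// mean absolute value over the sample (an assumed autoscale rule,
// illustrative only). Variables that are identically zero keep
// a scale of 1 to avoid dividing by zero.
std::vector<double> AutoScale(
        const std::vector<std::vector<double>>& samples) {
    const std::size_t nVar = samples.front().size();
    std::vector<double> scale(nVar, 1.0);
    for (std::size_t v = 0; v < nVar; ++v) {
        double mean = 0.0;
        for (const auto& s : samples) mean += std::fabs(s[v]);
        mean /= samples.size();
        if (mean > 0.0) scale[v] = 1.0 / mean;
    }
    return scale;
}
```

This explains the spread of the constants above: a variable measured in large units (such as dch) gets a tiny factor, while an already O(1) variable gets a factor near 1.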

  14. Execute the Example
  • NetworkTrainer <steering file> <first epoch> <last epoch>

  15. Summary
  • The re-use of ROOT functionality improves NNO a lot
  • TFormula works great to pre-process samples
  • Run training either from CINT C++ or from an ASCII steering file
  • A RhoNNO GUI and/or wizard is still missing
  • RhoNNO comes as part of the Rho package, but can be used independently: installation of the shared lib plus the headers is sufficient
  • The NetworkTrainer application runs standalone
  • Documentation: http://www.ep1.ruhr-uni-bochum.de/~marcel/RhoNNO.html
