TMVAMultipleBackgroundExample.C File Reference

Detailed Description

This example shows the training of a signal against three different backgrounds. In the application step, a tree is created containing all signal and background events, to which the true class ID and the three classifier outputs are added. Finally, using this application tree, the significance is maximized with the help of the TMVA genetic algorithm.

  • Project : TMVA - a Root-integrated toolkit for multivariate data analysis
  • Package : TMVA
  • Executable: TMVAGAexample
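The macro, reproduced in full at the bottom of this page, chains three steps; a condensed view of the flow, taken from TMVAMultipleBackgroundExample() below:

// 1) train one BDTG classifier per background against the common signal sample
Training();
// 2) apply all three classifiers and write a combined tree holding classID, cls0, cls1 and cls2
ApplicationCreateCombinedTree();
// 3) scan the three cut values on the classifier outputs with the TMVA genetic algorithm
MaximizeSignificance();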
Start Test TMVAGAexample
========================
... event: 0 (200)
======> EVENT:0
var1 = -1.14361
var2 = -0.822373
var3 = -0.395426
var4 = -0.529427
created tree: TreeS
... event: 0 (200)
======> EVENT:0
var1 = -1.54361
var2 = -1.42237
var3 = -1.39543
var4 = -2.02943
created tree: TreeB0
... event: 0 (200)
======> EVENT:0
var1 = -1.54361
var2 = -0.822373
var3 = -0.395426
var4 = -2.02943
created tree: TreeB1
======> EVENT:0
var1 = 0.463304
var2 = 1.37192
var3 = -1.16769
var4 = -1.77551
created tree: TreeB2
created data file: tmva_example_multiple_background.root
========================
--- Training
<HEADER> DataSetInfo : [datasetBkg0] : Added class "Signal"
: Add Tree TreeS of type Signal with 200 events
<HEADER> DataSetInfo : [datasetBkg0] : Added class "Background"
: Add Tree TreeB0 of type Background with 200 events
<HEADER> Factory : Booking method: BDTG
:
: the option NegWeightTreatment=InverseBoostNegWeights does not exist for BoostType=Grad
: --> change to new default NegWeightTreatment=Pray
: Rebuilding Dataset datasetBkg0
: Building event vectors for type 2 Signal
: Dataset[datasetBkg0] : create input formulas for tree TreeS
: Building event vectors for type 2 Background
: Dataset[datasetBkg0] : create input formulas for tree TreeB0
<HEADER> DataSetFactory : [datasetBkg0] : Number of events in input trees
:
:
: Number of training and testing events
: ---------------------------------------------------------------------------
: Signal -- training events : 100
: Signal -- testing events : 100
: Signal -- training and testing events: 200
: Background -- training events : 100
: Background -- testing events : 100
: Background -- training and testing events: 200
:
<HEADER> DataSetInfo : Correlation matrix (Signal):
: ----------------------------------------
: var1 var2 var3 var4
: var1: +1.000 +0.427 +0.620 +0.834
: var2: +0.427 +1.000 +0.756 +0.779
: var3: +0.620 +0.756 +1.000 +0.854
: var4: +0.834 +0.779 +0.854 +1.000
: ----------------------------------------
<HEADER> DataSetInfo : Correlation matrix (Background):
: ----------------------------------------
: var1 var2 var3 var4
: var1: +1.000 +0.390 +0.543 +0.801
: var2: +0.390 +1.000 +0.787 +0.768
: var3: +0.543 +0.787 +1.000 +0.837
: var4: +0.801 +0.768 +0.837 +1.000
: ----------------------------------------
<HEADER> DataSetFactory : [datasetBkg0] :
:
<HEADER> Factory : Train all methods
<HEADER> Factory : [datasetBkg0] : Create Transformation "I" with events from all classes.
:
<HEADER> : Transformation, Variable selection :
: Input : variable 'var1' <---> Output : variable 'var1'
: Input : variable 'var2' <---> Output : variable 'var2'
: Input : variable 'var3' <---> Output : variable 'var3'
: Input : variable 'var4' <---> Output : variable 'var4'
<HEADER> Factory : [datasetBkg0] : Create Transformation "D" with events from all classes.
:
<HEADER> : Transformation, Variable selection :
: Input : variable 'var1' <---> Output : variable 'var1'
: Input : variable 'var2' <---> Output : variable 'var2'
: Input : variable 'var3' <---> Output : variable 'var3'
: Input : variable 'var4' <---> Output : variable 'var4'
<HEADER> Factory : [datasetBkg0] : Create Transformation "P" with events from all classes.
:
<HEADER> : Transformation, Variable selection :
: Input : variable 'var1' <---> Output : variable 'var1'
: Input : variable 'var2' <---> Output : variable 'var2'
: Input : variable 'var3' <---> Output : variable 'var3'
: Input : variable 'var4' <---> Output : variable 'var4'
<HEADER> Factory : [datasetBkg0] : Create Transformation "G" with events from all classes.
:
<HEADER> : Transformation, Variable selection :
: Input : variable 'var1' <---> Output : variable 'var1'
: Input : variable 'var2' <---> Output : variable 'var2'
: Input : variable 'var3' <---> Output : variable 'var3'
: Input : variable 'var4' <---> Output : variable 'var4'
<HEADER> Factory : [datasetBkg0] : Create Transformation "D" with events from all classes.
:
<HEADER> : Transformation, Variable selection :
: Input : variable 'var1' <---> Output : variable 'var1'
: Input : variable 'var2' <---> Output : variable 'var2'
: Input : variable 'var3' <---> Output : variable 'var3'
: Input : variable 'var4' <---> Output : variable 'var4'
<HEADER> TFHandler_Factory : Variable Mean RMS [ Min Max ]
: -----------------------------------------------------------
: var1: 0.0025285 1.0135 [ -3.1150 2.2852 ]
: var2: 0.015478 1.1254 [ -3.6952 3.1113 ]
: var3: 0.083688 1.1724 [ -3.3587 3.9796 ]
: var4: 0.18853 1.3296 [ -3.7913 4.1179 ]
: -----------------------------------------------------------
: Preparing the Decorrelation transformation...
<HEADER> TFHandler_Factory : Variable Mean RMS [ Min Max ]
: -----------------------------------------------------------
: var1: -0.12706 1.0000 [ -3.2013 2.4661 ]
: var2: -0.094932 1.0000 [ -2.7387 2.4399 ]
: var3: -0.0075796 1.0000 [ -2.7068 3.2704 ]
: var4: 0.28226 1.0000 [ -1.9230 2.3683 ]
: -----------------------------------------------------------
: Preparing the Principle Component (PCA) transformation...
<HEADER> TFHandler_Factory : Variable Mean RMS [ Min Max ]
: -----------------------------------------------------------
: var1: 3.3271e-09 2.0955 [ -6.9024 6.2810 ]
: var2: 5.4250e-10 0.81719 [ -2.1933 1.8247 ]
: var3: 7.3866e-10 0.50438 [ -1.2415 1.1920 ]
: var4: 2.1420e-10 0.35074 [ -0.85693 1.0044 ]
: -----------------------------------------------------------
: Preparing the Gaussian transformation...
: Preparing the Decorrelation transformation...
<HEADER> TFHandler_Factory : Variable Mean RMS [ Min Max ]
: -----------------------------------------------------------
: var1: 0.18815 1.0000 [ -1.2538 5.4391 ]
: var2: 0.14382 1.0000 [ -2.0629 6.0054 ]
: var3: 0.11380 1.0000 [ -2.0399 7.5442 ]
: var4: 0.048569 1.0000 [ -2.7199 5.5633 ]
: -----------------------------------------------------------
: Ranking input variables (method unspecific)...
<HEADER> IdTransformation : Ranking result (top variable is best ranked)
: -----------------------------------
: Rank : Variable : Separation
: -----------------------------------
: 1 : Variable 4 : 4.418e-01
: 2 : Variable 3 : 3.388e-01
: 3 : Variable 2 : 2.147e-01
: 4 : Variable 1 : 1.485e-01
: -----------------------------------
<HEADER> Factory : Train method: BDTG for Classification
:
<HEADER> BDTG : #events: (reweighted) sig: 100 bkg: 100
: #events: (unweighted) sig: 100 bkg: 100
: Training 1000 Decision Trees ... patience please
: Elapsed time for training with 200 events: 0.104 sec
<HEADER> BDTG : [datasetBkg0] : Evaluation of BDTG on training sample (200 events)
: Elapsed time for evaluation of 200 events: 0.0169 sec
: Creating xml weight file: datasetBkg0/weights/TMVAMultiBkg0_BDTG.weights.xml
: Creating standalone class: datasetBkg0/weights/TMVAMultiBkg0_BDTG.class.C
: TMVASignalBackground0.root:/datasetBkg0/Method_BDT/BDTG
<HEADER> Factory : Training finished
:
: Ranking input variables (method specific)...
<HEADER> BDTG : Ranking result (top variable is best ranked)
: --------------------------------------
: Rank : Variable : Variable Importance
: --------------------------------------
: 1 : var1 : 2.673e-01
: 2 : var2 : 2.603e-01
: 3 : var3 : 2.490e-01
: 4 : var4 : 2.234e-01
: --------------------------------------
<HEADER> Factory : === Destroy and recreate all methods via weight files for testing ===
:
: Reading weight file: datasetBkg0/weights/TMVAMultiBkg0_BDTG.weights.xml
<HEADER> Factory : Test all methods
<HEADER> Factory : Test method: BDTG for Classification performance
:
<HEADER> BDTG : [datasetBkg0] : Evaluation of BDTG on testing sample (200 events)
: Elapsed time for evaluation of 200 events: 0.012 sec
<HEADER> Factory : Evaluate all methods
<HEADER> Factory : Evaluate classifier: BDTG
:
<HEADER> BDTG : [datasetBkg0] : Loop over test events and fill histograms with classifier response...
:
<HEADER> TFHandler_BDTG : Variable Mean RMS [ Min Max ]
: -----------------------------------------------------------
: var1: 0.13613 0.97981 [ -2.0823 2.9998 ]
: var2: 0.085482 0.86846 [ -1.9349 2.0015 ]
: var3: 0.16949 0.99559 [ -2.4774 3.0223 ]
: var4: 0.33525 1.2442 [ -2.9030 3.3317 ]
: -----------------------------------------------------------
:
: Evaluation results ranked by best signal efficiency and purity (area)
: -------------------------------------------------------------------------------------------------------------------
: DataSet MVA
: Name: Method: ROC-integ
: datasetBkg0 BDTG : 0.936
: -------------------------------------------------------------------------------------------------------------------
:
: Testing efficiency compared to training efficiency (overtraining check)
: -------------------------------------------------------------------------------------------------------------------
: DataSet MVA Signal efficiency: from test sample (from training sample)
: Name: Method: @B=0.01 @B=0.10 @B=0.30
: -------------------------------------------------------------------------------------------------------------------
: datasetBkg0 BDTG : 0.000 (0.975) 0.770 (0.977) 0.975 (0.982)
: -------------------------------------------------------------------------------------------------------------------
:
<HEADER> Dataset:datasetBkg0 : Created tree 'TestTree' with 200 events
:
<HEADER> Dataset:datasetBkg0 : Created tree 'TrainTree' with 200 events
:
<HEADER> Factory : Thank you for using TMVA!
: For citation information, please visit: http://tmva.sf.net/citeTMVA.html
<HEADER> DataSetInfo : [datasetBkg1] : Added class "Signal"
: Add Tree TreeS of type Signal with 200 events
<HEADER> DataSetInfo : [datasetBkg1] : Added class "Background"
: Add Tree TreeB1 of type Background with 200 events
<HEADER> Factory : Booking method: BDTG
:
: the option NegWeightTreatment=InverseBoostNegWeights does not exist for BoostType=Grad
: --> change to new default NegWeightTreatment=Pray
: Rebuilding Dataset datasetBkg1
: Building event vectors for type 2 Signal
: Dataset[datasetBkg1] : create input formulas for tree TreeS
: Building event vectors for type 2 Background
: Dataset[datasetBkg1] : create input formulas for tree TreeB1
<HEADER> DataSetFactory : [datasetBkg1] : Number of events in input trees
:
:
: Number of training and testing events
: ---------------------------------------------------------------------------
: Signal -- training events : 100
: Signal -- testing events : 100
: Signal -- training and testing events: 200
: Background -- training events : 100
: Background -- testing events : 100
: Background -- training and testing events: 200
:
<HEADER> DataSetInfo : Correlation matrix (Signal):
: ----------------------------------------
: var1 var2 var3 var4
: var1: +1.000 +0.427 +0.620 +0.834
: var2: +0.427 +1.000 +0.756 +0.779
: var3: +0.620 +0.756 +1.000 +0.854
: var4: +0.834 +0.779 +0.854 +1.000
: ----------------------------------------
<HEADER> DataSetInfo : Correlation matrix (Background):
: ----------------------------------------
: var1 var2 var3 var4
: var1: +1.000 +0.390 +0.543 +0.801
: var2: +0.390 +1.000 +0.787 +0.768
: var3: +0.543 +0.787 +1.000 +0.837
: var4: +0.801 +0.768 +0.837 +1.000
: ----------------------------------------
<HEADER> DataSetFactory : [datasetBkg1] :
:
<HEADER> Factory : Train all methods
<HEADER> Factory : [datasetBkg1] : Create Transformation "I" with events from all classes.
:
<HEADER> : Transformation, Variable selection :
: Input : variable 'var1' <---> Output : variable 'var1'
: Input : variable 'var2' <---> Output : variable 'var2'
: Input : variable 'var3' <---> Output : variable 'var3'
: Input : variable 'var4' <---> Output : variable 'var4'
<HEADER> Factory : [datasetBkg1] : Create Transformation "D" with events from all classes.
:
<HEADER> : Transformation, Variable selection :
: Input : variable 'var1' <---> Output : variable 'var1'
: Input : variable 'var2' <---> Output : variable 'var2'
: Input : variable 'var3' <---> Output : variable 'var3'
: Input : variable 'var4' <---> Output : variable 'var4'
<HEADER> Factory : [datasetBkg1] : Create Transformation "P" with events from all classes.
:
<HEADER> : Transformation, Variable selection :
: Input : variable 'var1' <---> Output : variable 'var1'
: Input : variable 'var2' <---> Output : variable 'var2'
: Input : variable 'var3' <---> Output : variable 'var3'
: Input : variable 'var4' <---> Output : variable 'var4'
<HEADER> Factory : [datasetBkg1] : Create Transformation "G" with events from all classes.
:
<HEADER> : Transformation, Variable selection :
: Input : variable 'var1' <---> Output : variable 'var1'
: Input : variable 'var2' <---> Output : variable 'var2'
: Input : variable 'var3' <---> Output : variable 'var3'
: Input : variable 'var4' <---> Output : variable 'var4'
<HEADER> Factory : [datasetBkg1] : Create Transformation "D" with events from all classes.
:
<HEADER> : Transformation, Variable selection :
: Input : variable 'var1' <---> Output : variable 'var1'
: Input : variable 'var2' <---> Output : variable 'var2'
: Input : variable 'var3' <---> Output : variable 'var3'
: Input : variable 'var4' <---> Output : variable 'var4'
<HEADER> TFHandler_Factory : Variable Mean RMS [ Min Max ]
: -----------------------------------------------------------
: var1: 0.0025285 1.0135 [ -3.1150 2.2852 ]
: var2: 0.31548 1.0836 [ -3.0952 3.1113 ]
: var3: 0.58369 1.0377 [ -2.3587 3.9796 ]
: var4: 0.18853 1.3296 [ -3.7913 4.1179 ]
: -----------------------------------------------------------
: Preparing the Decorrelation transformation...
<HEADER> TFHandler_Factory : Variable Mean RMS [ Min Max ]
: -----------------------------------------------------------
: var1: -0.18796 1.0000 [ -3.2043 2.5135 ]
: var2: 0.060618 1.0000 [ -2.5942 2.5176 ]
: var3: 0.71489 1.0000 [ -1.9164 4.0104 ]
: var4: -0.014100 1.0000 [ -2.1785 2.3322 ]
: -----------------------------------------------------------
: Preparing the Principle Component (PCA) transformation...
<HEADER> TFHandler_Factory : Variable Mean RMS [ Min Max ]
: -----------------------------------------------------------
: var1: 2.2165e-09 1.9481 [ -6.5131 5.8550 ]
: var2: 1.9686e-09 0.87136 [ -2.4299 2.1873 ]
: var3: 8.5915e-10 0.53326 [ -1.6219 1.2402 ]
: var4:-3.8999e-10 0.45543 [ -1.1278 1.1965 ]
: -----------------------------------------------------------
: Preparing the Gaussian transformation...
: Preparing the Decorrelation transformation...
<HEADER> TFHandler_Factory : Variable Mean RMS [ Min Max ]
: -----------------------------------------------------------
: var1: 0.18140 1.0000 [ -1.2839 5.4441 ]
: var2: 0.12101 1.0000 [ -2.0797 6.0929 ]
: var3: 0.13453 1.0000 [ -1.6667 5.8802 ]
: var4: 0.068813 1.0000 [ -1.8739 5.5007 ]
: -----------------------------------------------------------
: Ranking input variables (method unspecific)...
<HEADER> IdTransformation : Ranking result (top variable is best ranked)
: -----------------------------------
: Rank : Variable : Separation
: -----------------------------------
: 1 : Variable 4 : 4.418e-01
: 2 : Variable 1 : 1.485e-01
: 3 : Variable 3 : 5.784e-02
: 4 : Variable 2 : 3.636e-02
: -----------------------------------
<HEADER> Factory : Train method: BDTG for Classification
:
<HEADER> BDTG : #events: (reweighted) sig: 100 bkg: 100
: #events: (unweighted) sig: 100 bkg: 100
: Training 1000 Decision Trees ... patience please
: Elapsed time for training with 200 events: 0.0984 sec
<HEADER> BDTG : [datasetBkg1] : Evaluation of BDTG on training sample (200 events)
: Elapsed time for evaluation of 200 events: 0.0162 sec
: Creating xml weight file: datasetBkg1/weights/TMVAMultiBkg1_BDTG.weights.xml
: Creating standalone class: datasetBkg1/weights/TMVAMultiBkg1_BDTG.class.C
: TMVASignalBackground1.root:/datasetBkg1/Method_BDT/BDTG
<HEADER> Factory : Training finished
:
: Ranking input variables (method specific)...
<HEADER> BDTG : Ranking result (top variable is best ranked)
: --------------------------------------
: Rank : Variable : Variable Importance
: --------------------------------------
: 1 : var3 : 2.759e-01
: 2 : var1 : 2.623e-01
: 3 : var4 : 2.431e-01
: 4 : var2 : 2.187e-01
: --------------------------------------
<HEADER> Factory : === Destroy and recreate all methods via weight files for testing ===
:
: Reading weight file: datasetBkg1/weights/TMVAMultiBkg1_BDTG.weights.xml
<HEADER> Factory : Test all methods
<HEADER> Factory : Test method: BDTG for Classification performance
:
<HEADER> BDTG : [datasetBkg1] : Evaluation of BDTG on testing sample (200 events)
: Elapsed time for evaluation of 200 events: 0.0112 sec
<HEADER> Factory : Evaluate all methods
<HEADER> Factory : Evaluate classifier: BDTG
:
<HEADER> BDTG : [datasetBkg1] : Loop over test events and fill histograms with classifier response...
:
<HEADER> TFHandler_BDTG : Variable Mean RMS [ Min Max ]
: -----------------------------------------------------------
: var1: 0.13613 0.97981 [ -2.0823 2.9998 ]
: var2: 0.38548 0.81654 [ -1.3349 2.5106 ]
: var3: 0.66949 0.88808 [ -1.4774 3.9796 ]
: var4: 0.33525 1.2442 [ -2.9030 3.3317 ]
: -----------------------------------------------------------
:
: Evaluation results ranked by best signal efficiency and purity (area)
: -------------------------------------------------------------------------------------------------------------------
: DataSet MVA
: Name: Method: ROC-integ
: datasetBkg1 BDTG : 0.993
: -------------------------------------------------------------------------------------------------------------------
:
: Testing efficiency compared to training efficiency (overtraining check)
: -------------------------------------------------------------------------------------------------------------------
: DataSet MVA Signal efficiency: from test sample (from training sample)
: Name: Method: @B=0.01 @B=0.10 @B=0.30
: -------------------------------------------------------------------------------------------------------------------
: datasetBkg1 BDTG : 0.000 (0.985) 0.985 (0.987) 0.989 (0.991)
: -------------------------------------------------------------------------------------------------------------------
:
<HEADER> Dataset:datasetBkg1 : Created tree 'TestTree' with 200 events
:
<HEADER> Dataset:datasetBkg1 : Created tree 'TrainTree' with 200 events
:
<HEADER> Factory : Thank you for using TMVA!
: For citation information, please visit: http://tmva.sf.net/citeTMVA.html
<HEADER> DataSetInfo : [datasetBkg2] : Added class "Signal"
: Add Tree TreeS of type Signal with 200 events
<HEADER> DataSetInfo : [datasetBkg2] : Added class "Background"
: Add Tree TreeB2 of type Background with 200 events
<HEADER> Factory : Booking method: BDTG
:
: the option NegWeightTreatment=InverseBoostNegWeights does not exist for BoostType=Grad
: --> change to new default NegWeightTreatment=Pray
: Rebuilding Dataset datasetBkg2
: Building event vectors for type 2 Signal
: Dataset[datasetBkg2] : create input formulas for tree TreeS
: Building event vectors for type 2 Background
: Dataset[datasetBkg2] : create input formulas for tree TreeB2
<HEADER> DataSetFactory : [datasetBkg2] : Number of events in input trees
:
:
: Number of training and testing events
: ---------------------------------------------------------------------------
: Signal -- training events : 100
: Signal -- testing events : 100
: Signal -- training and testing events: 200
: Background -- training events : 100
: Background -- testing events : 100
: Background -- training and testing events: 200
:
<HEADER> DataSetInfo : Correlation matrix (Signal):
: ----------------------------------------
: var1 var2 var3 var4
: var1: +1.000 +0.427 +0.620 +0.834
: var2: +0.427 +1.000 +0.756 +0.779
: var3: +0.620 +0.756 +1.000 +0.854
: var4: +0.834 +0.779 +0.854 +1.000
: ----------------------------------------
<HEADER> DataSetInfo : Correlation matrix (Background):
: ----------------------------------------
: var1 var2 var3 var4
: var1: +1.000 -0.689 -0.032 +0.201
: var2: -0.689 +1.000 +0.051 -0.112
: var3: -0.032 +0.051 +1.000 -0.090
: var4: +0.201 -0.112 -0.090 +1.000
: ----------------------------------------
<HEADER> DataSetFactory : [datasetBkg2] :
:
<HEADER> Factory : Train all methods
<HEADER> Factory : [datasetBkg2] : Create Transformation "I" with events from all classes.
:
<HEADER> : Transformation, Variable selection :
: Input : variable 'var1' <---> Output : variable 'var1'
: Input : variable 'var2' <---> Output : variable 'var2'
: Input : variable 'var3' <---> Output : variable 'var3'
: Input : variable 'var4' <---> Output : variable 'var4'
<HEADER> Factory : [datasetBkg2] : Create Transformation "D" with events from all classes.
:
<HEADER> : Transformation, Variable selection :
: Input : variable 'var1' <---> Output : variable 'var1'
: Input : variable 'var2' <---> Output : variable 'var2'
: Input : variable 'var3' <---> Output : variable 'var3'
: Input : variable 'var4' <---> Output : variable 'var4'
<HEADER> Factory : [datasetBkg2] : Create Transformation "P" with events from all classes.
:
<HEADER> : Transformation, Variable selection :
: Input : variable 'var1' <---> Output : variable 'var1'
: Input : variable 'var2' <---> Output : variable 'var2'
: Input : variable 'var3' <---> Output : variable 'var3'
: Input : variable 'var4' <---> Output : variable 'var4'
<HEADER> Factory : [datasetBkg2] : Create Transformation "G" with events from all classes.
:
<HEADER> : Transformation, Variable selection :
: Input : variable 'var1' <---> Output : variable 'var1'
: Input : variable 'var2' <---> Output : variable 'var2'
: Input : variable 'var3' <---> Output : variable 'var3'
: Input : variable 'var4' <---> Output : variable 'var4'
<HEADER> Factory : [datasetBkg2] : Create Transformation "D" with events from all classes.
:
<HEADER> : Transformation, Variable selection :
: Input : variable 'var1' <---> Output : variable 'var1'
: Input : variable 'var2' <---> Output : variable 'var2'
: Input : variable 'var3' <---> Output : variable 'var3'
: Input : variable 'var4' <---> Output : variable 'var4'
<HEADER> TFHandler_Factory : Variable Mean RMS [ Min Max ]
: -----------------------------------------------------------
: var1: 0.29768 0.91323 [ -2.7150 2.2852 ]
: var2: 0.66936 0.96658 [ -3.0952 3.1113 ]
: var3: 0.30872 1.1413 [ -2.3587 3.9796 ]
: var4: 0.48019 1.1841 [ -2.2913 4.1179 ]
: -----------------------------------------------------------
: Preparing the Decorrelation transformation...
<HEADER> TFHandler_Factory : Variable Mean RMS [ Min Max ]
: -----------------------------------------------------------
: var1: 0.22260 1.0000 [ -2.8899 2.2151 ]
: var2: 0.64848 1.0000 [ -2.8577 2.8017 ]
: var3: 0.093503 1.0000 [ -2.1097 2.6394 ]
: var4: 0.29279 1.0000 [ -2.2171 2.6253 ]
: -----------------------------------------------------------
: Preparing the Principle Component (PCA) transformation...
<HEADER> TFHandler_Factory : Variable Mean RMS [ Min Max ]
: -----------------------------------------------------------
: var1: 1.7369e-09 1.5388 [ -5.4229 5.6879 ]
: var2: 2.3402e-09 0.94775 [ -2.3763 2.7626 ]
: var3: 3.1758e-09 0.82690 [ -1.9785 1.7544 ]
: var4: 9.3132e-10 0.72324 [ -1.7482 1.7182 ]
: -----------------------------------------------------------
: Preparing the Gaussian transformation...
: Preparing the Decorrelation transformation...
<HEADER> TFHandler_Factory : Variable Mean RMS [ Min Max ]
: -----------------------------------------------------------
: var1: 0.17819 1.0000 [ -1.4362 4.6688 ]
: var2: 0.15184 1.0000 [ -1.4113 5.3518 ]
: var3: 0.12791 1.0000 [ -1.8368 5.3543 ]
: var4: 0.099146 1.0000 [ -2.1654 4.5855 ]
: -----------------------------------------------------------
: Ranking input variables (method unspecific)...
<HEADER> IdTransformation : Ranking result (top variable is best ranked)
: -----------------------------------
: Rank : Variable : Separation
: -----------------------------------
: 1 : Variable 2 : 3.993e-01
: 2 : Variable 4 : 2.811e-01
: 3 : Variable 3 : 2.659e-01
: 4 : Variable 1 : 1.571e-01
: -----------------------------------
<HEADER> Factory : Train method: BDTG for Classification
:
<HEADER> BDTG : #events: (reweighted) sig: 100 bkg: 100
: #events: (unweighted) sig: 100 bkg: 100
: Training 1000 Decision Trees ... patience please
: Elapsed time for training with 200 events: 0.101 sec
<HEADER> BDTG : [datasetBkg2] : Evaluation of BDTG on training sample (200 events)
: Elapsed time for evaluation of 200 events: 0.016 sec
: Creating xml weight file: datasetBkg2/weights/TMVAMultiBkg2_BDTG.weights.xml
: Creating standalone class: datasetBkg2/weights/TMVAMultiBkg2_BDTG.class.C
: TMVASignalBackground2.root:/datasetBkg2/Method_BDT/BDTG
<HEADER> Factory : Training finished
:
: Ranking input variables (method specific)...
<HEADER> BDTG : Ranking result (top variable is best ranked)
: --------------------------------------
: Rank : Variable : Variable Importance
: --------------------------------------
: 1 : var4 : 2.842e-01
: 2 : var1 : 2.630e-01
: 3 : var2 : 2.360e-01
: 4 : var3 : 2.168e-01
: --------------------------------------
<HEADER> Factory : === Destroy and recreate all methods via weight files for testing ===
:
: Reading weight file: datasetBkg2/weights/TMVAMultiBkg2_BDTG.weights.xml
<HEADER> Factory : Test all methods
<HEADER> Factory : Test method: BDTG for Classification performance
:
<HEADER> BDTG : [datasetBkg2] : Evaluation of BDTG on testing sample (200 events)
: Elapsed time for evaluation of 200 events: 0.0109 sec
<HEADER> Factory : Evaluate all methods
<HEADER> Factory : Evaluate classifier: BDTG
:
<HEADER> BDTG : [datasetBkg2] : Loop over test events and fill histograms with classifier response...
:
<HEADER> TFHandler_BDTG : Variable Mean RMS [ Min Max ]
: -----------------------------------------------------------
: var1: 0.31824 0.87725 [ -1.8821 2.9998 ]
: var2: 0.68634 0.81995 [ -1.2800 2.0015 ]
: var3: 0.28439 1.0366 [ -1.8691 3.0223 ]
: var4: 0.66443 1.1236 [ -1.7755 3.3317 ]
: -----------------------------------------------------------
:
: Evaluation results ranked by best signal efficiency and purity (area)
: -------------------------------------------------------------------------------------------------------------------
: DataSet MVA
: Name: Method: ROC-integ
: datasetBkg2 BDTG : 0.943
: -------------------------------------------------------------------------------------------------------------------
:
: Testing efficiency compared to training efficiency (overtraining check)
: -------------------------------------------------------------------------------------------------------------------
: DataSet MVA Signal efficiency: from test sample (from training sample)
: Name: Method: @B=0.01 @B=0.10 @B=0.30
: -------------------------------------------------------------------------------------------------------------------
: datasetBkg2 BDTG : 0.000 (0.975) 0.000 (0.979) 0.979 (0.986)
: -------------------------------------------------------------------------------------------------------------------
:
<HEADER> Dataset:datasetBkg2 : Created tree 'TestTree' with 200 events
:
<HEADER> Dataset:datasetBkg2 : Created tree 'TrainTree' with 200 events
:
<HEADER> Factory : Thank you for using TMVA!
: For citation information, please visit: http://tmva.sf.net/citeTMVA.html
========================
--- Application & create combined tree
: Booking "BDT method" of type "BDT" from datasetBkg0/weights/TMVAMultiBkg0_BDTG.weights.xml.
: Reading weight file: datasetBkg0/weights/TMVAMultiBkg0_BDTG.weights.xml
<HEADER> DataSetInfo : [Default] : Added class "Signal"
<HEADER> DataSetInfo : [Default] : Added class "Background"
: Booked classifier "BDTG" of type: "BDT"
: Booking "BDT method" of type "BDT" from datasetBkg1/weights/TMVAMultiBkg1_BDTG.weights.xml.
: Reading weight file: datasetBkg1/weights/TMVAMultiBkg1_BDTG.weights.xml
<HEADER> DataSetInfo : [Default] : Added class "Signal"
<HEADER> DataSetInfo : [Default] : Added class "Background"
: Booked classifier "BDTG" of type: "BDT"
: Booking "BDT method" of type "BDT" from datasetBkg2/weights/TMVAMultiBkg2_BDTG.weights.xml.
: Reading weight file: datasetBkg2/weights/TMVAMultiBkg2_BDTG.weights.xml
<HEADER> DataSetInfo : [Default] : Added class "Signal"
<HEADER> DataSetInfo : [Default] : Added class "Background"
: Booked classifier "BDTG" of type: "BDT"
--- Select signal sample
: Rebuilding Dataset Default
: Rebuilding Dataset Default
: Rebuilding Dataset Default
--- End of event loop: Real time 0:00:00, CP time 0.030
--- Select background 0 sample
--- End of event loop: Real time 0:00:00, CP time 0.020
--- Select background 1 sample
--- End of event loop: Real time 0:00:00, CP time 0.040
--- Select background 2 sample
--- End of event loop: Real time 0:00:00, CP time 0.020
--- Created root file: "tmva_example_multiple_backgrounds__applied.root" containing the MVA output histograms
==> Application of readers is done! combined tree created
========================
--- maximize significance
Classifier ranges (defined by the user)
range: -1 1
range: -1 1
range: -1 1
<HEADER> FitterBase : <GeneticFitter> Optimisation, please be patient ... (inaccurate progress timing for GA)
: Elapsed time: 13.1 sec
======================
Efficiency : 0.955
Purity : 0.880184
True positive weights : 191
False positive weights: 26
Signal weights : 200
cutValue[0] = -0.950311;
cutValue[1] = 0.986588;
cutValue[2] = 0.905048;
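The figures above follow directly from the printed weights: efficiency = true-positive weights / signal weights = 191 / 200 = 0.955, and purity = true-positive weights / (true-positive + false-positive weights) = 191 / (191 + 26) ≈ 0.880. The fitness function listed below maximises the product of these two quantities by minimising its inverse, and the three cutValue entries are the optimised cuts on cls0, cls1 and cls2.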
#include <iostream> // Stream declarations
#include <vector>
#include <limits>
#include "TChain.h"
#include "TCut.h"
#include "TDirectory.h"
#include "TH1F.h"
#include "TH1.h"
#include "TMath.h"
#include "TFile.h"
#include "TStopwatch.h"
#include "TROOT.h"
#include "TSystem.h"
#include "TMVA/Factory.h"
#include "TMVA/DataLoader.h"//required to load dataset
#include "TMVA/Reader.h"
using namespace std;
using namespace TMVA;
// ----------------------------------------------------------------------------------------------
// Training
// ----------------------------------------------------------------------------------------------
//
void Training(){
std::string factoryOptions( "!V:!Silent:Transformations=I;D;P;G,D:AnalysisType=Classification" );
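// Transformations=I;D;P;G,D lists the variable transformations prepared for the evaluation output:
// I = identity, D = decorrelation, P = principal component analysis, G = gaussianisation
// (these correspond to the "Create Transformation" blocks in the log above)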
TString fname = "./tmva_example_multiple_background.root";
TFile *input(0);
input = TFile::Open( fname );
TTree *signal = (TTree*)input->Get("TreeS");
TTree *background0 = (TTree*)input->Get("TreeB0");
TTree *background1 = (TTree*)input->Get("TreeB1");
TTree *background2 = (TTree*)input->Get("TreeB2");
/// global event weights per tree (see below for setting event-wise weights)
Double_t signalWeight = 1.0;
Double_t background0Weight = 1.0;
Double_t background1Weight = 1.0;
Double_t background2Weight = 1.0;
// Create a new root output file.
TString outfileName( "TMVASignalBackground0.root" );
TFile* outputFile = TFile::Open( outfileName, "RECREATE" );
// background 0
// ____________
TMVA::Factory *factory = new TMVA::Factory( "TMVAMultiBkg0", outputFile, factoryOptions );
TMVA::DataLoader *dataloader=new TMVA::DataLoader("datasetBkg0");
dataloader->AddVariable( "var1", "Variable 1", "", 'F' );
dataloader->AddVariable( "var2", "Variable 2", "", 'F' );
dataloader->AddVariable( "var3", "Variable 3", "units", 'F' );
dataloader->AddVariable( "var4", "Variable 4", "units", 'F' );
dataloader->AddSignalTree ( signal, signalWeight );
dataloader->AddBackgroundTree( background0, background0Weight );
// factory->SetBackgroundWeightExpression("weight");
TCut mycuts = ""; // for example: TCut mycuts = "abs(var1)<0.5 && abs(var2-0.5)<1";
TCut mycutb = ""; // for example: TCut mycutb = "abs(var1)<0.5";
// tell the factory to use all remaining events in the trees after training for testing:
dataloader->PrepareTrainingAndTestTree( mycuts, mycutb,
"nTrain_Signal=0:nTrain_Background=0:SplitMode=Random:NormMode=NumEvents:!V" );
// Boosted Decision Trees
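// key options: NTrees = number of boosted trees, BoostType=Grad = gradient boosting, Shrinkage = learning rate,
// BaggedSampleFraction = fraction of events drawn for each tree, MaxDepth = maximum depth of each tree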
factory->BookMethod( dataloader, TMVA::Types::kBDT, "BDTG",
"!H:!V:NTrees=1000:BoostType=Grad:Shrinkage=0.30:UseBaggedBoost:BaggedSampleFraction=0.6:SeparationType=GiniIndex:nCuts=20:MaxDepth=2" );
factory->TrainAllMethods();
factory->TestAllMethods();
factory->EvaluateAllMethods();
outputFile->Close();
delete factory;
delete dataloader;
// background 1
// ____________
outfileName = "TMVASignalBackground1.root";
outputFile = TFile::Open( outfileName, "RECREATE" );
dataloader=new TMVA::DataLoader("datasetBkg1");
factory = new TMVA::Factory( "TMVAMultiBkg1", outputFile, factoryOptions );
dataloader->AddVariable( "var1", "Variable 1", "", 'F' );
dataloader->AddVariable( "var2", "Variable 2", "", 'F' );
dataloader->AddVariable( "var3", "Variable 3", "units", 'F' );
dataloader->AddVariable( "var4", "Variable 4", "units", 'F' );
dataloader->AddSignalTree ( signal, signalWeight );
dataloader->AddBackgroundTree( background1, background1Weight );
// dataloader->SetBackgroundWeightExpression("weight");
// tell the factory to use all remaining events in the trees after training for testing:
dataloader->PrepareTrainingAndTestTree( mycuts, mycutb,
"nTrain_Signal=0:nTrain_Background=0:SplitMode=Random:NormMode=NumEvents:!V" );
// Boosted Decision Trees
factory->BookMethod( dataloader, TMVA::Types::kBDT, "BDTG",
"!H:!V:NTrees=1000:BoostType=Grad:Shrinkage=0.30:UseBaggedBoost:BaggedSampleFraction=0.6:SeparationType=GiniIndex:nCuts=20:MaxDepth=2" );
factory->TrainAllMethods();
factory->TestAllMethods();
factory->EvaluateAllMethods();
outputFile->Close();
delete factory;
delete dataloader;
// background 2
// ____________
outfileName = "TMVASignalBackground2.root";
outputFile = TFile::Open( outfileName, "RECREATE" );
factory = new TMVA::Factory( "TMVAMultiBkg2", outputFile, factoryOptions );
dataloader=new TMVA::DataLoader("datasetBkg2");
dataloader->AddVariable( "var1", "Variable 1", "", 'F' );
dataloader->AddVariable( "var2", "Variable 2", "", 'F' );
dataloader->AddVariable( "var3", "Variable 3", "units", 'F' );
dataloader->AddVariable( "var4", "Variable 4", "units", 'F' );
dataloader->AddSignalTree ( signal, signalWeight );
dataloader->AddBackgroundTree( background2, background2Weight );
// dataloader->SetBackgroundWeightExpression("weight");
// tell the dataloader to use all remaining events in the trees after training for testing:
dataloader->PrepareTrainingAndTestTree( mycuts, mycutb,
"nTrain_Signal=0:nTrain_Background=0:SplitMode=Random:NormMode=NumEvents:!V" );
// Boosted Decision Trees
factory->BookMethod( dataloader, TMVA::Types::kBDT, "BDTG",
"!H:!V:NTrees=1000:BoostType=Grad:Shrinkage=0.30:UseBaggedBoost:BaggedSampleFraction=0.5:SeparationType=GiniIndex:nCuts=20:MaxDepth=2" );
factory->TrainAllMethods();
factory->TestAllMethods();
factory->EvaluateAllMethods();
outputFile->Close();
delete factory;
delete dataloader;
}
// ----------------------------------------------------------------------------------------------
// Application
// ----------------------------------------------------------------------------------------------
//
// create a summary tree with all signal and background events; for each event store the three classifier values and the true classID
void ApplicationCreateCombinedTree(){
// Create a new root output file.
TString outfileName( "tmva_example_multiple_backgrounds__applied.root" );
TFile* outputFile = TFile::Open( outfileName, "RECREATE" );
TTree* outputTree = new TTree("multiBkg","multiple backgrounds tree");
Float_t var1, var2;
Float_t var3, var4;
Int_t classID = 0;
Float_t weight = 1.f;
Float_t classifier0, classifier1, classifier2;
outputTree->Branch("classID", &classID, "classID/I");
outputTree->Branch("var1", &var1, "var1/F");
outputTree->Branch("var2", &var2, "var2/F");
outputTree->Branch("var3", &var3, "var3/F");
outputTree->Branch("var4", &var4, "var4/F");
outputTree->Branch("weight", &weight, "weight/F");
outputTree->Branch("cls0", &classifier0, "cls0/F");
outputTree->Branch("cls1", &classifier1, "cls1/F");
outputTree->Branch("cls2", &classifier2, "cls2/F");
// create three readers for the three different signal/background classifications, one for each background
TMVA::Reader *reader0 = new TMVA::Reader( "!Color:!Silent" );
TMVA::Reader *reader1 = new TMVA::Reader( "!Color:!Silent" );
TMVA::Reader *reader2 = new TMVA::Reader( "!Color:!Silent" );
reader0->AddVariable( "var1", &var1 );
reader0->AddVariable( "var2", &var2 );
reader0->AddVariable( "var3", &var3 );
reader0->AddVariable( "var4", &var4 );
reader1->AddVariable( "var1", &var1 );
reader1->AddVariable( "var2", &var2 );
reader1->AddVariable( "var3", &var3 );
reader1->AddVariable( "var4", &var4 );
reader2->AddVariable( "var1", &var1 );
reader2->AddVariable( "var2", &var2 );
reader2->AddVariable( "var3", &var3 );
reader2->AddVariable( "var4", &var4 );
// load the weight files for the readers
TString method = "BDT method";
reader0->BookMVA( "BDT method", "datasetBkg0/weights/TMVAMultiBkg0_BDTG.weights.xml" );
reader1->BookMVA( "BDT method", "datasetBkg1/weights/TMVAMultiBkg1_BDTG.weights.xml" );
reader2->BookMVA( "BDT method", "datasetBkg2/weights/TMVAMultiBkg2_BDTG.weights.xml" );
// load the input file
TFile *input(0);
TString fname = "./tmva_example_multiple_background.root";
input = TFile::Open( fname );
TTree* theTree = NULL;
// loop through signal and all background trees
for( int treeNumber = 0; treeNumber < 4; ++treeNumber ) {
if( treeNumber == 0 ){
theTree = (TTree*)input->Get("TreeS");
std::cout << "--- Select signal sample" << std::endl;
// theTree->SetBranchAddress( "weight", &weight );
weight = 1;
classID = 0;
}else if( treeNumber == 1 ){
theTree = (TTree*)input->Get("TreeB0");
std::cout << "--- Select background 0 sample" << std::endl;
// theTree->SetBranchAddress( "weight", &weight );
weight = 1;
classID = 1;
}else if( treeNumber == 2 ){
theTree = (TTree*)input->Get("TreeB1");
std::cout << "--- Select background 1 sample" << std::endl;
// theTree->SetBranchAddress( "weight", &weight );
weight = 1;
classID = 2;
}else if( treeNumber == 3 ){
theTree = (TTree*)input->Get("TreeB2");
std::cout << "--- Select background 2 sample" << std::endl;
// theTree->SetBranchAddress( "weight", &weight );
weight = 1;
classID = 3;
}
theTree->SetBranchAddress( "var1", &var1 );
theTree->SetBranchAddress( "var2", &var2 );
theTree->SetBranchAddress( "var3", &var3 );
theTree->SetBranchAddress( "var4", &var4 );
std::cout << "--- Processing: " << theTree->GetEntries() << " events" << std::endl;
TStopwatch sw;
sw.Start();
Int_t nEvent = theTree->GetEntries();
// Int_t nEvent = 100;
for (Long64_t ievt=0; ievt<nEvent; ievt++) {
if (ievt%1000 == 0){
std::cout << "--- ... Processing event: " << ievt << std::endl;
}
theTree->GetEntry(ievt);
// get the classifiers for each of the signal/background classifications
classifier0 = reader0->EvaluateMVA( method );
classifier1 = reader1->EvaluateMVA( method );
classifier2 = reader2->EvaluateMVA( method );
outputTree->Fill();
}
// get elapsed time
sw.Stop();
std::cout << "--- End of event loop: "; sw.Print();
}
input->Close();
// write output tree
/* outputTree->SetDirectory(outputFile);
outputTree->Write(); */
outputFile->Write();
outputFile->Close();
std::cout << "--- Created root file: \"" << outfileName.Data() << "\" containing the MVA output histograms" << std::endl;
delete reader0;
delete reader1;
delete reader2;
std::cout << "==> Application of readers is done! combined tree created" << std::endl << std::endl;
}
// -----------------------------------------------------------------------------------------
// Genetic Algorithm Fitness definition
// -----------------------------------------------------------------------------------------
//
class MyFitness : public IFitterTarget {
public:
// constructor
MyFitness( TChain* _chain ) : IFitterTarget() {
chain = _chain;
hSignal = new TH1F("hsignal","hsignal",100,-1,1);
hFP = new TH1F("hfp","hfp",100,-1,1);
hTP = new TH1F("htp","htp",100,-1,1);
TString cutsAndWeightSignal = "weight*(classID==0)";
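// the Draw below simply fills hSignal with the weight of every true signal event; its integral gives the total signal weight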
nSignal = chain->Draw("Entry$/Entries$>>hsignal",cutsAndWeightSignal,"goff");
weightsSignal = hSignal->Integral();
}
// the output of this function will be minimized
Double_t EstimatorFunction( std::vector<Double_t> & factors ){
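// 'factors' holds the three candidate cut values (one per classifier output cls0, cls1, cls2) proposed by the genetic algorithm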
TString cutsAndWeightTruePositive = Form("weight*((classID==0) && cls0>%f && cls1>%f && cls2>%f )",factors.at(0), factors.at(1), factors.at(2));
TString cutsAndWeightFalsePositive = Form("weight*((classID >0) && cls0>%f && cls1>%f && cls2>%f )",factors.at(0), factors.at(1), factors.at(2));
// Entry$/Entries$ just draws something reasonable; it could in principle be anything
Float_t nTP = chain->Draw("Entry$/Entries$>>htp",cutsAndWeightTruePositive,"goff");
Float_t nFP = chain->Draw("Entry$/Entries$>>hfp",cutsAndWeightFalsePositive,"goff");
weightsTruePositive = hTP->Integral();
weightsFalsePositive = hFP->Integral();
efficiency = 0;
if( weightsSignal > 0 )
efficiency = weightsTruePositive/weightsSignal;
purity = 0;
if( weightsTruePositive+weightsFalsePositive > 0 )
purity = weightsTruePositive/(weightsTruePositive+weightsFalsePositive);
Float_t effTimesPur = efficiency*purity;
Float_t toMinimize = std::numeric_limits<float>::max(); // set to the highest existing number
if( effTimesPur > 0 ) // if larger than 0, take 1/x. This is the value to minimize
toMinimize = 1./(effTimesPur); // we want to minimize 1/efficiency*purity
// Print();
return toMinimize;
}
void Print(){
std::cout << std::endl;
std::cout << "======================" << std::endl
<< "Efficiency : " << efficiency << std::endl
<< "Purity : " << purity << std::endl << std::endl
<< "True positive weights : " << weightsTruePositive << std::endl
<< "False positive weights: " << weightsFalsePositive << std::endl
<< "Signal weights : " << weightsSignal << std::endl;
}
Float_t nSignal;
Float_t efficiency;
Float_t purity;
Float_t weightsTruePositive;
Float_t weightsFalsePositive;
Float_t weightsSignal;
private:
TChain* chain;
TH1F* hSignal;
TH1F* hFP;
TH1F* hTP;
};
// ----------------------------------------------------------------------------------------------
// Call of Genetic algorithm
// ----------------------------------------------------------------------------------------------
//
void MaximizeSignificance(){
// define all the parameters by their minimum and maximum value
// in this example 3 parameters (=cuts on the classifiers) are defined.
vector<Interval*> ranges;
ranges.push_back( new Interval(-1,1) ); // for some classifiers (especially LD) the ranges have to be taken larger
ranges.push_back( new Interval(-1,1) );
ranges.push_back( new Interval(-1,1) );
std::cout << "Classifier ranges (defined by the user)" << std::endl;
for( std::vector<Interval*>::iterator it = ranges.begin(); it != ranges.end(); it++ ){
std::cout << " range: " << (*it)->GetMin() << " " << (*it)->GetMax() << std::endl;
}
TChain* chain = new TChain("multiBkg");
chain->Add("tmva_example_multiple_backgrounds__applied.root");
IFitterTarget* myFitness = new MyFitness( chain );
// prepare the genetic algorithm with an initial population size of 100 (PopSize below)
// mind: big population sizes help in searching the domain space of the solution,
// but you have to weigh this against the number of generations:
// the extreme case of 1 generation and population size n is equal to
// a Monte Carlo calculation with n tries
const TString name( "multipleBackgroundGA" );
const TString opts( "PopSize=100:Steps=30" );
GeneticFitter mg( *myFitness, name, ranges, opts);
// mg.SetParameters( 4, 30, 200, 10,5, 0.95, 0.001 );
std::vector<Double_t> result;
Double_t estimator = mg.Run(result);
dynamic_cast<MyFitness*>(myFitness)->Print();
std::cout << std::endl;
int n = 0;
for( std::vector<Double_t>::iterator it = result.begin(); it<result.end(); it++ ){
std::cout << " cutValue[" << n << "] = " << (*it) << ";"<< std::endl;
n++;
}
}
void TMVAMultipleBackgroundExample()
{
// ----------------------------------------------------------------------------------------
// Run all
// ----------------------------------------------------------------------------------------
cout << "Start Test TMVAGAexample" << endl
<< "========================" << endl
<< endl;
TString createDataMacro = gROOT->GetTutorialDir() + "/tmva/createData.C";
gROOT->ProcessLine(TString::Format(".L %s",createDataMacro.Data()));
gROOT->ProcessLine("create_MultipleBackground(200)");
cout << endl;
cout << "========================" << endl;
cout << "--- Training" << endl;
Training();
cout << endl;
cout << "========================" << endl;
cout << "--- Application & create combined tree" << endl;
ApplicationCreateCombinedTree();
cout << endl;
cout << "========================" << endl;
cout << "--- maximize significance" << endl;
MaximizeSignificance();
}
int main( int argc, char** argv ) {
TMVAMultipleBackgroundExample();
}
Author
Andreas Hoecker

Definition in file TMVAMultipleBackgroundExample.C.