TMVAMulticlass.C File Reference

Detailed Description

This macro provides a simple example for the training and testing of TMVA multiclass classification.

  • Project : TMVA - a ROOT-integrated toolkit for multivariate data analysis
  • Package : TMVA
  • ROOT Macro: TMVAMulticlass
==> Start TMVAMulticlass
--- TMVAMulticlass: Using input file: ./files/tmva_multiclass_example.root
DataSetInfo : [dataset] : Added class "Signal"
: Add Tree TreeS of type Signal with 2000 events
DataSetInfo : [dataset] : Added class "bg0"
: Add Tree TreeB0 of type bg0 with 2000 events
DataSetInfo : [dataset] : Added class "bg1"
: Add Tree TreeB1 of type bg1 with 2000 events
DataSetInfo : [dataset] : Added class "bg2"
: Add Tree TreeB2 of type bg2 with 2000 events
: Dataset[dataset] : Class index : 0 name : Signal
: Dataset[dataset] : Class index : 1 name : bg0
: Dataset[dataset] : Class index : 2 name : bg1
: Dataset[dataset] : Class index : 3 name : bg2
Factory : Booking method: BDTG
:
: the option NegWeightTreatment=InverseBoostNegWeights does not exist for BoostType=Grad
: --> change to new default NegWeightTreatment=Pray
: Building event vectors for type 2 Signal
: Dataset[dataset] : create input formulas for tree TreeS
: Building event vectors for type 2 bg0
: Dataset[dataset] : create input formulas for tree TreeB0
: Building event vectors for type 2 bg1
: Dataset[dataset] : create input formulas for tree TreeB1
: Building event vectors for type 2 bg2
: Dataset[dataset] : create input formulas for tree TreeB2
DataSetFactory : [dataset] : Number of events in input trees
:
: Number of training and testing events
: ---------------------------------------------------------------------------
: Signal -- training events : 1000
: Signal -- testing events : 1000
: Signal -- training and testing events: 2000
: bg0 -- training events : 1000
: bg0 -- testing events : 1000
: bg0 -- training and testing events: 2000
: bg1 -- training events : 1000
: bg1 -- testing events : 1000
: bg1 -- training and testing events: 2000
: bg2 -- training events : 1000
: bg2 -- testing events : 1000
: bg2 -- training and testing events: 2000
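The even split above (1000 training and 1000 testing events out of 2000 per class) comes from the macro's event-splitting options passed to the DataLoader. A minimal Python sketch of such a per-class split (the function name and seed are illustrative, not TMVA API):

```python
import random

def split_per_class(events_by_class, train_fraction=0.5, seed=100):
    """Split each class's events into training and testing sets,
    mirroring the per-class bookkeeping in the table above."""
    rng = random.Random(seed)
    split = {}
    for cls, events in events_by_class.items():
        shuffled = events[:]
        rng.shuffle(shuffled)        # randomize before splitting
        n_train = int(len(shuffled) * train_fraction)
        split[cls] = {"train": shuffled[:n_train], "test": shuffled[n_train:]}
    return split

# 2000 events per class, as in the log above
data = {c: list(range(2000)) for c in ("Signal", "bg0", "bg1", "bg2")}
split = split_per_class(data)
```

With a 0.5 fraction this reproduces the 1000/1000 counts per class reported by the DataSetFactory.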
:
DataSetInfo : Correlation matrix (Signal):
: ----------------------------------------
: var1 var2 var3 var4
: var1: +1.000 +0.397 +0.623 +0.832
: var2: +0.397 +1.000 +0.716 +0.737
: var3: +0.623 +0.716 +1.000 +0.859
: var4: +0.832 +0.737 +0.859 +1.000
: ----------------------------------------
DataSetInfo : Correlation matrix (bg0):
: ----------------------------------------
: var1 var2 var3 var4
: var1: +1.000 +0.365 +0.592 +0.811
: var2: +0.365 +1.000 +0.708 +0.740
: var3: +0.592 +0.708 +1.000 +0.859
: var4: +0.811 +0.740 +0.859 +1.000
: ----------------------------------------
DataSetInfo : Correlation matrix (bg1):
: ----------------------------------------
: var1 var2 var3 var4
: var1: +1.000 +0.407 +0.610 +0.834
: var2: +0.407 +1.000 +0.710 +0.741
: var3: +0.610 +0.710 +1.000 +0.851
: var4: +0.834 +0.741 +0.851 +1.000
: ----------------------------------------
DataSetInfo : Correlation matrix (bg2):
: ----------------------------------------
: var1 var2 var3 var4
: var1: +1.000 -0.647 -0.016 -0.013
: var2: -0.647 +1.000 +0.015 +0.002
: var3: -0.016 +0.015 +1.000 -0.024
: var4: -0.013 +0.002 -0.024 +1.000
: ----------------------------------------
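Each matrix above is the Pearson correlation of the four input variables, computed separately per class. A small pure-Python sketch of that computation (the helper name is illustrative):

```python
import math

def correlation_matrix(columns):
    """Pearson correlation matrix of equal-length columns, as printed
    per class in the DataSetInfo blocks above."""
    def pearson(x, y):
        n = len(x)
        mx, my = sum(x) / n, sum(y) / n
        cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
        sx = math.sqrt(sum((a - mx) ** 2 for a in x))
        sy = math.sqrt(sum((b - my) ** 2 for b in y))
        return cov / (sx * sy)
    return [[pearson(c1, c2) for c2 in columns] for c1 in columns]

# toy columns: the second is a scaled copy of the first (+1 correlation),
# the third runs in the opposite direction (-1 correlation)
m = correlation_matrix([[1, 2, 3, 4], [2, 4, 6, 8], [4, 3, 2, 1]])
```

Note how bg2 stands out above: var1 and var2 are anti-correlated (-0.647) while var3 and var4 are essentially uncorrelated with everything, unlike the other three classes.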
DataSetFactory : [dataset] :
:
Factory : Booking method: MLP
:
MLP : Building Network.
: Initializing weights
Factory : Booking method: PDEFoam
:
Factory : Booking method: DL_CPU
:
: Parsing option string:
: ... "!H:V:ErrorStrategy=CROSSENTROPY:VarTransform=N:WeightInitialization=XAVIERUNIFORM:Architecture=GPU:Layout=TANH|100,TANH|50,TANH|10,LINEAR:TrainingStrategy=Optimizer=ADAM,LearningRate=1e-3,TestRepetitions=1,ConvergenceSteps=10,BatchSize=100"
: The following options are set:
: - By User:
: <none>
: - Default:
: Boost_num: "0" [Number of times the classifier will be boosted]
: Parsing option string:
: ... "!H:V:ErrorStrategy=CROSSENTROPY:VarTransform=N:WeightInitialization=XAVIERUNIFORM:Architecture=GPU:Layout=TANH|100,TANH|50,TANH|10,LINEAR:TrainingStrategy=Optimizer=ADAM,LearningRate=1e-3,TestRepetitions=1,ConvergenceSteps=10,BatchSize=100"
: The following options are set:
: - By User:
: V: "True" [Verbose output (short form of "VerbosityLevel" below - overrides the latter one)]
: VarTransform: "N" [List of variable transformations performed before training, e.g., "D_Background,P_Signal,G,N_AllClasses" for: "Decorrelation, PCA-transformation, Gaussianisation, Normalisation, each for the given class of events ('AllClasses' denotes all events of all classes, if no class indication is given, 'All' is assumed)"]
: H: "False" [Print method-specific help message]
: Layout: "TANH|100,TANH|50,TANH|10,LINEAR" [Layout of the network.]
: ErrorStrategy: "CROSSENTROPY" [Loss function: Mean squared error (regression) or cross entropy (binary classification).]
: WeightInitialization: "XAVIERUNIFORM" [Weight initialization strategy]
: Architecture: "GPU" [Which architecture to perform the training on.]
: TrainingStrategy: "Optimizer=ADAM,LearningRate=1e-3,TestRepetitions=1,ConvergenceSteps=10,BatchSize=100" [Defines the training strategies.]
: - Default:
: VerbosityLevel: "Default" [Verbosity level]
: CreateMVAPdfs: "False" [Create PDFs for classifier outputs (signal and background)]
: IgnoreNegWeightsInTraining: "False" [Events with negative weights are ignored in the training (but are included for testing and performance evaluation)]
: InputLayout: "0|0|0" [The Layout of the input]
: BatchLayout: "0|0|0" [The Layout of the batch]
: RandomSeed: "0" [Random seed used for weight initialization and batch shuffling]
: ValidationSize: "20%" [Part of the training data to use for validation. Specify as 0.2 or 20% to use a fifth of the data set as validation set. Specify as 100 to use exactly 100 events. (Default: 20%)]
DL_CPU : [dataset] : Create Transformation "N" with events from all classes.
:
: Transformation, Variable selection :
: Input : variable 'var1' <---> Output : variable 'var1'
: Input : variable 'var2' <---> Output : variable 'var2'
: Input : variable 'var3' <---> Output : variable 'var3'
: Input : variable 'var4' <---> Output : variable 'var4'
<ERROR> : CUDA backend not enabled. Please make sure you have CUDA installed and it was successfully detected by CMAKE by using -Dtmva-gpu=On
: Will now use instead the CPU architecture !
: Will now use the CPU architecture with BLAS and IMT support !
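The "N" transformation booked above (VarTransform=N) rescales each input variable linearly into [-1, 1], which is visible later in the TFHandler_DL_CPU tables. A minimal sketch, assuming a simple min/max mapping (helper name illustrative):

```python
def norm_transform(values, vmin, vmax):
    """TMVA-style 'N' normalisation: map [vmin, vmax] linearly onto [-1, 1]."""
    return [2.0 * (v - vmin) / (vmax - vmin) - 1.0 for v in values]

# var1's raw range from the TFHandler_Factory table further down
scaled = norm_transform([-3.6592, 0.0, 3.2645], -3.6592, 3.2645)
```

The range endpoints map exactly to -1 and +1, and every training value lands strictly inside; test events outside the training range can fall slightly outside [-1, 1], as the evaluation-stage TFHandler_DL_CPU table later shows (e.g. var1 max 1.0914).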
Factory : Train all methods
Factory : [dataset] : Create Transformation "I" with events from all classes.
:
: Transformation, Variable selection :
: Input : variable 'var1' <---> Output : variable 'var1'
: Input : variable 'var2' <---> Output : variable 'var2'
: Input : variable 'var3' <---> Output : variable 'var3'
: Input : variable 'var4' <---> Output : variable 'var4'
Factory : [dataset] : Create Transformation "D" with events from all classes.
:
: Transformation, Variable selection :
: Input : variable 'var1' <---> Output : variable 'var1'
: Input : variable 'var2' <---> Output : variable 'var2'
: Input : variable 'var3' <---> Output : variable 'var3'
: Input : variable 'var4' <---> Output : variable 'var4'
Factory : [dataset] : Create Transformation "P" with events from all classes.
:
: Transformation, Variable selection :
: Input : variable 'var1' <---> Output : variable 'var1'
: Input : variable 'var2' <---> Output : variable 'var2'
: Input : variable 'var3' <---> Output : variable 'var3'
: Input : variable 'var4' <---> Output : variable 'var4'
Factory : [dataset] : Create Transformation "G" with events from all classes.
:
: Transformation, Variable selection :
: Input : variable 'var1' <---> Output : variable 'var1'
: Input : variable 'var2' <---> Output : variable 'var2'
: Input : variable 'var3' <---> Output : variable 'var3'
: Input : variable 'var4' <---> Output : variable 'var4'
Factory : [dataset] : Create Transformation "D" with events from all classes.
:
: Transformation, Variable selection :
: Input : variable 'var1' <---> Output : variable 'var1'
: Input : variable 'var2' <---> Output : variable 'var2'
: Input : variable 'var3' <---> Output : variable 'var3'
: Input : variable 'var4' <---> Output : variable 'var4'
TFHandler_Factory : Variable Mean RMS [ Min Max ]
: -----------------------------------------------------------
: var1: 0.047647 1.0025 [ -3.6592 3.2645 ]
: var2: 0.32647 1.0646 [ -3.6891 3.7877 ]
: var3: 0.11493 1.1230 [ -4.5727 4.5640 ]
: var4: -0.076531 1.2652 [ -4.8486 5.0412 ]
: -----------------------------------------------------------
: Preparing the Decorrelation transformation...
TFHandler_Factory : Variable Mean RMS [ Min Max ]
: -----------------------------------------------------------
: var1: 0.082544 1.0000 [ -3.6274 3.1017 ]
: var2: 0.36715 1.0000 [ -3.3020 3.4950 ]
: var3: 0.066865 1.0000 [ -2.9882 3.3086 ]
: var4: -0.20593 1.0000 [ -3.3088 2.8423 ]
: -----------------------------------------------------------
: Preparing the Principal Component (PCA) transformation...
TFHandler_Factory : Variable Mean RMS [ Min Max ]
: -----------------------------------------------------------
: var1: 5.7502e-10 1.8064 [ -8.0344 7.8312 ]
: var2:-1.6078e-11 0.90130 [ -2.6765 2.7523 ]
: var3: 3.0841e-10 0.73386 [ -2.6572 2.2255 ]
: var4:-2.6886e-10 0.62168 [ -1.7384 2.2297 ]
: -----------------------------------------------------------
: Preparing the Gaussian transformation...
: Preparing the Decorrelation transformation...
TFHandler_Factory : Variable Mean RMS [ Min Max ]
: -----------------------------------------------------------
: var1: 0.013510 1.0000 [ -2.6520 6.2074 ]
: var2: 0.0096839 1.0000 [ -2.8402 6.3073 ]
: var3: 0.010397 1.0000 [ -3.0251 5.8860 ]
: var4: 0.0053980 1.0000 [ -3.0998 5.7078 ]
: -----------------------------------------------------------
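The "D" step decorrelates the inputs so that each variable ends up with RMS exactly 1.0000 and vanishing linear correlations, as the tables above show. A pure-Python two-variable sketch using Cholesky whitening (TMVA itself applies the symmetric square root of the covariance matrix; this variant is an illustrative stand-in and the helper name is made up):

```python
import math
import random

def cholesky_whiten(xs, ys):
    """Decorrelate two variables: factor the sample covariance C = L L^T
    and apply L^-1, giving unit variance and zero linear correlation."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    a = sum((x - mx) ** 2 for x in xs) / (n - 1)          # var(x)
    b = sum((y - my) ** 2 for y in ys) / (n - 1)          # var(y)
    c = sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / (n - 1)
    l11 = math.sqrt(a)
    l21 = c / l11
    l22 = math.sqrt(b - l21 ** 2)
    # forward-substitute L^-1 applied to each (x, y) pair
    us = [x / l11 for x in xs]
    vs = [(y - l21 * u) / l22 for u, y in zip(us, ys)]
    return us, vs

random.seed(1)
xs = [random.gauss(0, 1) for _ in range(500)]
ys = [0.8 * x + 0.6 * random.gauss(0, 1) for x in xs]     # correlated pair
us, vs = cholesky_whiten(xs, ys)

n = len(us)
mu, mv = sum(us) / n, sum(vs) / n
var_u = sum((u - mu) ** 2 for u in us) / (n - 1)
var_v = sum((v - mv) ** 2 for v in vs) / (n - 1)
cov_uv = sum((u - mu) * (v - mv) for u, v in zip(us, vs)) / (n - 1)
```

Because covariance is translation-invariant, the transformed sample has unit variances and zero covariance by construction, matching the RMS = 1.0000 columns above (the means need not be zero, just as in the table).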
: Ranking input variables (method unspecific)...
Factory : Train method: BDTG for Multiclass classification
:
: Training 1000 Decision Trees ... patience please
: Elapsed time for training with 4000 events: 5.84 sec
: Dataset[dataset] : Create results for training
: Dataset[dataset] : Multiclass evaluation of BDTG on training sample
: Dataset[dataset] : Elapsed time for evaluation of 4000 events: 2.29 sec
: Creating multiclass response histograms...
: Creating multiclass performance histograms...
: Creating xml weight file: dataset/weights/TMVAMulticlass_BDTG.weights.xml
: Creating standalone class: dataset/weights/TMVAMulticlass_BDTG.class.C
: TMVAMulticlass.root:/dataset/Method_BDT/BDTG
Factory : Training finished
:
Factory : Train method: MLP for Multiclass classification
:
: Training Network
:
: Elapsed time for training with 4000 events: 24.1 sec
: Dataset[dataset] : Create results for training
: Dataset[dataset] : Multiclass evaluation of MLP on training sample
: Dataset[dataset] : Elapsed time for evaluation of 4000 events: 0.0151 sec
: Creating multiclass response histograms...
: Creating multiclass performance histograms...
: Creating xml weight file: dataset/weights/TMVAMulticlass_MLP.weights.xml
: Creating standalone class: dataset/weights/TMVAMulticlass_MLP.class.C
: Write special histos to file: TMVAMulticlass.root:/dataset/Method_MLP/MLP
Factory : Training finished
:
Factory : Train method: PDEFoam for Multiclass classification
:
: Build up multiclass foam 0
: Elapsed time: 0.669 sec
: Build up multiclass foam 1
: Elapsed time: 0.67 sec
: Build up multiclass foam 2
: Elapsed time: 0.675 sec
: Build up multiclass foam 3
: Elapsed time: 0.473 sec
: Elapsed time for training with 4000 events: 2.66 sec
: Dataset[dataset] : Create results for training
: Dataset[dataset] : Multiclass evaluation of PDEFoam on training sample
: Dataset[dataset] : Elapsed time for evaluation of 4000 events: 0.127 sec
: Creating multiclass response histograms...
: Creating multiclass performance histograms...
: Creating xml weight file: dataset/weights/TMVAMulticlass_PDEFoam.weights.xml
: writing foam MultiClassFoam0 to file
: writing foam MultiClassFoam1 to file
: writing foam MultiClassFoam2 to file
: writing foam MultiClassFoam3 to file
: Foams written to file: dataset/weights/TMVAMulticlass_PDEFoam.weights_foams.root
: Creating standalone class: dataset/weights/TMVAMulticlass_PDEFoam.class.C
Factory : Training finished
:
Factory : Train method: DL_CPU for Multiclass classification
:
TFHandler_DL_CPU : Variable Mean RMS [ Min Max ]
: -----------------------------------------------------------
: var1: 0.070769 0.28960 [ -1.0000 1.0000 ]
: var2: 0.074130 0.28477 [ -1.0000 1.0000 ]
: var3: 0.026106 0.24582 [ -1.0000 1.0000 ]
: var4: -0.034951 0.25587 [ -1.0000 1.0000 ]
: -----------------------------------------------------------
: Start of deep neural network training on CPU using MT, nthreads = 1
:
TFHandler_DL_CPU : Variable Mean RMS [ Min Max ]
: -----------------------------------------------------------
: var1: 0.070769 0.28960 [ -1.0000 1.0000 ]
: var2: 0.074130 0.28477 [ -1.0000 1.0000 ]
: var3: 0.026106 0.24582 [ -1.0000 1.0000 ]
: var4: -0.034951 0.25587 [ -1.0000 1.0000 ]
: -----------------------------------------------------------
: ***** Deep Learning Network *****
DEEP NEURAL NETWORK: Depth = 4 Input = ( 1, 1, 4 ) Batch size = 100 Loss function = C
Layer 0 DENSE Layer: ( Input = 4 , Width = 100 ) Output = ( 1 , 100 , 100 ) Activation Function = Tanh
Layer 1 DENSE Layer: ( Input = 100 , Width = 50 ) Output = ( 1 , 100 , 50 ) Activation Function = Tanh
Layer 2 DENSE Layer: ( Input = 50 , Width = 10 ) Output = ( 1 , 100 , 10 ) Activation Function = Tanh
Layer 3 DENSE Layer: ( Input = 10 , Width = 4 ) Output = ( 1 , 100 , 4 ) Activation Function = Identity
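The layout printed above (4 → 100 → 50 → 10 → 4, matching Layout=TANH|100,TANH|50,TANH|10,LINEAR) fixes the number of trainable parameters. A quick check, counting weights and biases per dense layer (helper name illustrative):

```python
def dense_param_count(widths):
    """Weights plus biases of a fully connected stack: each layer
    contributes fan_in * fan_out weights and fan_out biases."""
    return sum(fan_in * fan_out + fan_out
               for fan_in, fan_out in zip(widths, widths[1:]))

# the network printed above: 4 inputs, three tanh layers, 4 linear outputs
n_params = dense_param_count([4, 100, 50, 10, 4])
```

For this network that gives 500 + 5050 + 510 + 44 = 6104 parameters, one output per class.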
: Using 3200 events for training and 800 for testing
: Compute initial loss on the validation data
: Training phase 1 of 1: Optimizer ADAM Learning rate = 0.001 regularization 0 minimum error = 0.694109
: --------------------------------------------------------------
: Epoch | Train Err. Val. Err. t(s)/epoch t(s)/Loss nEvents/s Conv. Steps
: --------------------------------------------------------------
: Start epoch iteration ...
: 1 Minimum Test error found - save the configuration
: 1 | 0.601639 0.529598 0.0779234 0.00683489 45014.3 0
: 2 Minimum Test error found - save the configuration
: 2 | 0.501703 0.482517 0.078214 0.00672277 44760.7 0
: 3 Minimum Test error found - save the configuration
: 3 | 0.468575 0.450963 0.0794362 0.00693783 44138.9 0
: 4 Minimum Test error found - save the configuration
: 4 | 0.441521 0.423264 0.0796778 0.00684587 43936.8 0
: 5 Minimum Test error found - save the configuration
: 5 | 0.418245 0.399452 0.0795148 0.00687393 44052.4 0
: 6 Minimum Test error found - save the configuration
: 6 | 0.400239 0.381813 0.0805903 0.00715567 43576.2 0
: 7 Minimum Test error found - save the configuration
: 7 | 0.385134 0.366955 0.0802139 0.00693059 43666.1 0
: 8 Minimum Test error found - save the configuration
: 8 | 0.372892 0.359176 0.0802719 0.00695463 43645.9 0
: 9 Minimum Test error found - save the configuration
: 9 | 0.362315 0.345947 0.0800879 0.00692915 43740.5 0
: 10 Minimum Test error found - save the configuration
: 10 | 0.352219 0.337112 0.0806098 0.00719091 43585.5 0
: 11 Minimum Test error found - save the configuration
: 11 | 0.343396 0.328213 0.0814565 0.00704443 43003.8 0
: 12 Minimum Test error found - save the configuration
: 12 | 0.334729 0.320262 0.0813125 0.00705132 43091.2 0
: 13 Minimum Test error found - save the configuration
: 13 | 0.326497 0.315931 0.0815685 0.00709421 42967.8 0
: 14 Minimum Test error found - save the configuration
: 14 | 0.319745 0.311981 0.0814144 0.00711322 43067.9 0
: 15 Minimum Test error found - save the configuration
: 15 | 0.316114 0.305233 0.0820269 0.00709927 42707.9 0
: 16 Minimum Test error found - save the configuration
: 16 | 0.30824 0.297451 0.0818797 0.00711326 42800 0
: 17 Minimum Test error found - save the configuration
: 17 | 0.303741 0.293735 0.0818713 0.00723151 42872.5 0
: 18 Minimum Test error found - save the configuration
: 18 | 0.300183 0.288567 0.082169 0.00714796 42654.7 0
: 19 | 0.297913 0.288721 0.0819832 0.00701055 42682.2 1
: 20 Minimum Test error found - save the configuration
: 20 | 0.293851 0.287432 0.0821713 0.00716516 42663.2 0
: 21 Minimum Test error found - save the configuration
: 21 | 0.291091 0.284773 0.0818996 0.00715152 42810.4 0
: 22 Minimum Test error found - save the configuration
: 22 | 0.28867 0.27893 0.0822568 0.00717801 42621.9 0
: 23 Minimum Test error found - save the configuration
: 23 | 0.287568 0.27744 0.0822581 0.00723197 42651.8 0
: 24 | 0.285236 0.278107 0.0819974 0.00703381 42687.4 1
: 25 Minimum Test error found - save the configuration
: 25 | 0.282269 0.273126 0.0822672 0.00719141 42623.6 0
: 26 | 0.280313 0.274299 0.0825596 0.00706982 42389.8 1
: 27 Minimum Test error found - save the configuration
: 27 | 0.278656 0.268562 0.0825362 0.00721191 42483 0
: 28 Minimum Test error found - save the configuration
: 28 | 0.277803 0.267208 0.0827061 0.00720253 42382.1 0
: 29 Minimum Test error found - save the configuration
: 29 | 0.276087 0.266467 0.0829573 0.00721609 42249.2 0
: 30 Minimum Test error found - save the configuration
: 30 | 0.27497 0.26454 0.0838896 0.00854444 42471.2 0
: 31 Minimum Test error found - save the configuration
: 31 | 0.272911 0.263488 0.0836593 0.00721598 41861.1 0
: 32 Minimum Test error found - save the configuration
: 32 | 0.272072 0.260027 0.0828512 0.00719819 42298.4 0
: 33 Minimum Test error found - save the configuration
: 33 | 0.270275 0.258824 0.0835476 0.00727989 41957.5 0
: 34 Minimum Test error found - save the configuration
: 34 | 0.26943 0.258575 0.0830947 0.00722386 42176.9 0
: 35 Minimum Test error found - save the configuration
: 35 | 0.267961 0.257068 0.084461 0.00728816 41465.3 0
: 36 Minimum Test error found - save the configuration
: 36 | 0.267414 0.255256 0.0839349 0.007312 41763 0
: 37 | 0.265198 0.257091 0.0837838 0.00708912 41723.9 1
: 38 Minimum Test error found - save the configuration
: 38 | 0.263875 0.2515 0.0839749 0.00731214 41741.3 0
: 39 | 0.263742 0.252848 0.0835269 0.00714072 41892.4 1
: 40 | 0.262368 0.25218 0.0828004 0.00709134 42267.1 2
: 41 | 0.26108 0.255272 0.0830726 0.0071841 42167.1 3
: 42 Minimum Test error found - save the configuration
: 42 | 0.260779 0.250018 0.0829705 0.00723241 42250.9 0
: 43 Minimum Test error found - save the configuration
: 43 | 0.258294 0.247138 0.0829927 0.00727374 42261.5 0
: 44 Minimum Test error found - save the configuration
: 44 | 0.257424 0.246331 0.082817 0.00724448 42343.4 0
: 45 | 0.255627 0.248431 0.0828506 0.00712582 42258.3 1
: 46 Minimum Test error found - save the configuration
: 46 | 0.255865 0.241371 0.0831288 0.00725381 42174.6 0
: 47 | 0.253965 0.24606 0.0828958 0.00710974 42224.1 1
: 48 Minimum Test error found - save the configuration
: 48 | 0.253274 0.240937 0.0832742 0.00734591 42145 0
: 49 Minimum Test error found - save the configuration
: 49 | 0.252453 0.239544 0.0831857 0.00726895 42151.4 0
: 50 | 0.249973 0.241542 0.0831159 0.00716548 42132.7 1
: 51 Minimum Test error found - save the configuration
: 51 | 0.249954 0.23924 0.0833719 0.00727502 42051.6 0
: 52 Minimum Test error found - save the configuration
: 52 | 0.249039 0.238564 0.0830818 0.00728164 42216.3 0
: 53 Minimum Test error found - save the configuration
: 53 | 0.247667 0.237269 0.0835753 0.0073023 41954.6 0
: 54 | 0.245584 0.238542 0.0832525 0.00745591 42218.2 1
: 55 Minimum Test error found - save the configuration
: 55 | 0.244835 0.233356 0.0833052 0.00729967 42102.2 0
: 56 | 0.245473 0.241083 0.0832101 0.00712703 42059.3 1
: 57 | 0.242822 0.23354 0.0829879 0.00713344 42186.1 2
: 58 Minimum Test error found - save the configuration
: 58 | 0.241648 0.231286 0.0835115 0.00730249 41989.8 0
: 59 | 0.240651 0.233094 0.0833037 0.00715281 42021.9 1
: 60 Minimum Test error found - save the configuration
: 60 | 0.239609 0.229201 0.0835801 0.0073839 41996.9 0
: 61 | 0.238865 0.232234 0.0834593 0.00713777 41927.9 1
: 62 | 0.237175 0.232358 0.0834557 0.00713567 41928.7 2
: 63 | 0.236377 0.229504 0.0831698 0.00713272 42084.7 3
: 64 Minimum Test error found - save the configuration
: 64 | 0.2366 0.228354 0.0834461 0.0073129 42031.6 0
: 65 | 0.235037 0.231788 0.0832566 0.00717017 42057.4 1
: 66 | 0.234731 0.23122 0.0831345 0.00715363 42115.8 2
: 67 | 0.233597 0.228464 0.0840417 0.00726356 41678.5 3
: 68 Minimum Test error found - save the configuration
: 68 | 0.23284 0.225191 0.0832484 0.00730499 42136.6 0
: 69 Minimum Test error found - save the configuration
: 69 | 0.231989 0.224381 0.0834417 0.00734305 42050.7 0
: 70 | 0.23117 0.225086 0.0841558 0.00734658 41661.7 1
: 71 | 0.230284 0.229931 0.0838117 0.00717027 41752.9 2
: 72 | 0.229937 0.22689 0.0837218 0.00719759 41816.8 3
: 73 | 0.229306 0.225456 0.0836035 0.00716305 41862.6 4
: 74 Minimum Test error found - save the configuration
: 74 | 0.228141 0.223579 0.0837216 0.00730626 41876.4 0
: 75 Minimum Test error found - save the configuration
: 75 | 0.228865 0.221979 0.0841765 0.00743592 41698.9 0
: 76 | 0.226426 0.224212 0.0835975 0.00718907 41880.2 1
: 77 | 0.227039 0.229741 0.0847699 0.00716912 41236.7 2
: 78 Minimum Test error found - save the configuration
: 78 | 0.227255 0.219558 0.083895 0.00733759 41798.7 0
: 79 | 0.225984 0.225662 0.0834502 0.00716382 41947.2 1
: 80 | 0.225118 0.223068 0.083753 0.00716918 41784.3 2
: 81 | 0.223169 0.226894 0.0834617 0.00718382 41951.8 3
: 82 | 0.223516 0.222787 0.083681 0.00717904 41829 4
: 83 | 0.224552 0.223731 0.0834398 0.0071717 41957.2 5
: 84 | 0.223034 0.219893 0.0833029 0.00718394 42039.5 6
: 85 Minimum Test error found - save the configuration
: 85 | 0.222191 0.218094 0.0837802 0.0073355 41860.3 0
: 86 | 0.223876 0.219623 0.0833149 0.00716946 42024.9 1
: 87 | 0.220952 0.226054 0.0836528 0.00720221 41857.1 2
: 88 | 0.221036 0.222959 0.0842447 0.00744891 41668.9 3
: 89 | 0.22029 0.220036 0.0835495 0.00729568 41965.1 4
: 90 | 0.220745 0.218357 0.0837272 0.0072058 41818.3 5
: 91 | 0.220514 0.218643 0.0836497 0.00718725 41850.6 6
: 92 | 0.21934 0.21968 0.0834726 0.00718659 41947.4 7
: 93 | 0.218849 0.225004 0.0832939 0.00717277 42038.3 8
: 94 | 0.219878 0.220582 0.0837471 0.00718594 41796.7 9
: 95 | 0.218333 0.220219 0.0835197 0.00725048 41956.6 10
: 96 | 0.218262 0.218382 0.0834367 0.00718141 41964.3 11
:
: Elapsed time for training with 4000 events: 7.98 sec
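The "Conv. Steps" column in the epoch table implements early stopping: training halts once the validation error has gone more than ConvergenceSteps = 10 check steps without a new minimum, which is why the run above ends at epoch 96 (counter 11) and keeps the last saved configuration from epoch 85. A sketch of that logic (function name illustrative):

```python
def train_with_early_stopping(val_errors, convergence_steps=10):
    """Stop once the validation error has not reached a new minimum
    for more than convergence_steps consecutive epochs; return the
    epoch at which training stopped and the best error seen."""
    best = float("inf")
    since_best = 0
    for epoch, err in enumerate(val_errors, start=1):
        if err < best:
            best, since_best = err, 0     # "Minimum Test error found"
        else:
            since_best += 1               # the Conv. Steps counter
        if since_best > convergence_steps:
            return epoch, best
    return len(val_errors), best
```

With errors that improve for three epochs and then plateau, training stops exactly 11 non-improving epochs after the last minimum, mirroring the log.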
: Dataset[dataset] : Create results for training
: Dataset[dataset] : Multiclass evaluation of DL_CPU on training sample
: Dataset[dataset] : Elapsed time for evaluation of 4000 events: 0.114 sec
: Creating multiclass response histograms...
: Creating multiclass performance histograms...
: Creating xml weight file: dataset/weights/TMVAMulticlass_DL_CPU.weights.xml
: Creating standalone class: dataset/weights/TMVAMulticlass_DL_CPU.class.C
Factory : Training finished
:
: Ranking input variables (method specific)...
BDTG : Ranking result (top variable is best ranked)
: --------------------------------------
: Rank : Variable : Variable Importance
: --------------------------------------
: 1 : var4 : 3.117e-01
: 2 : var1 : 2.504e-01
: 3 : var2 : 2.430e-01
: 4 : var3 : 1.949e-01
: --------------------------------------
MLP : Ranking result (top variable is best ranked)
: -----------------------------
: Rank : Variable : Importance
: -----------------------------
: 1 : var4 : 6.076e+01
: 2 : var2 : 4.824e+01
: 3 : var1 : 2.116e+01
: 4 : var3 : 1.692e+01
: -----------------------------
PDEFoam : Ranking result (top variable is best ranked)
: --------------------------------------
: Rank : Variable : Variable Importance
: --------------------------------------
: 1 : var4 : 2.991e-01
: 2 : var1 : 2.930e-01
: 3 : var3 : 2.365e-01
: 4 : var2 : 1.714e-01
: --------------------------------------
: No variable ranking supplied by classifier: DL_CPU
TH1.Print Name = TrainingHistory_DL_CPU_trainingError, Entries= 0, Total sum= 26.1181
TH1.Print Name = TrainingHistory_DL_CPU_valError, Entries= 0, Total sum= 25.306
Factory : === Destroy and recreate all methods via weight files for testing ===
:
: Reading weight file: dataset/weights/TMVAMulticlass_BDTG.weights.xml
: Reading weight file: dataset/weights/TMVAMulticlass_MLP.weights.xml
MLP : Building Network.
: Initializing weights
: Reading weight file: dataset/weights/TMVAMulticlass_PDEFoam.weights.xml
: Read foams from file: dataset/weights/TMVAMulticlass_PDEFoam.weights_foams.root
: Reading weight file: dataset/weights/TMVAMulticlass_DL_CPU.weights.xml
Factory : Test all methods
Factory : Test method: BDTG for Multiclass classification performance
:
: Dataset[dataset] : Create results for testing
: Dataset[dataset] : Multiclass evaluation of BDTG on testing sample
: Dataset[dataset] : Elapsed time for evaluation of 4000 events: 1.19 sec
: Creating multiclass response histograms...
: Creating multiclass performance histograms...
Factory : Test method: MLP for Multiclass classification performance
:
: Dataset[dataset] : Create results for testing
: Dataset[dataset] : Multiclass evaluation of MLP on testing sample
: Dataset[dataset] : Elapsed time for evaluation of 4000 events: 0.0154 sec
: Creating multiclass response histograms...
: Creating multiclass performance histograms...
Factory : Test method: PDEFoam for Multiclass classification performance
:
: Dataset[dataset] : Create results for testing
: Dataset[dataset] : Multiclass evaluation of PDEFoam on testing sample
: Dataset[dataset] : Elapsed time for evaluation of 4000 events: 0.129 sec
: Creating multiclass response histograms...
: Creating multiclass performance histograms...
Factory : Test method: DL_CPU for Multiclass classification performance
:
: Dataset[dataset] : Create results for testing
: Dataset[dataset] : Multiclass evaluation of DL_CPU on testing sample
: Dataset[dataset] : Elapsed time for evaluation of 4000 events: 0.115 sec
: Creating multiclass response histograms...
: Creating multiclass performance histograms...
Factory : Evaluate all methods
: Evaluate multiclass classification method: BDTG
: Creating multiclass response histograms...
: Creating multiclass performance histograms...
: Creating multiclass response histograms...
: Creating multiclass performance histograms...
TFHandler_BDTG : Variable Mean RMS [ Min Max ]
: -----------------------------------------------------------
: var1: 0.070153 1.0224 [ -4.0592 3.5808 ]
: var2: 0.30372 1.0460 [ -3.6952 3.7877 ]
: var3: 0.12152 1.1222 [ -3.6800 3.9200 ]
: var4: -0.072602 1.2766 [ -4.8486 4.2221 ]
: -----------------------------------------------------------
: Evaluate multiclass classification method: MLP
: Creating multiclass response histograms...
: Creating multiclass performance histograms...
: Creating multiclass response histograms...
: Creating multiclass performance histograms...
TFHandler_MLP : Variable Mean RMS [ Min Max ]
: -----------------------------------------------------------
: var1: 0.070153 1.0224 [ -4.0592 3.5808 ]
: var2: 0.30372 1.0460 [ -3.6952 3.7877 ]
: var3: 0.12152 1.1222 [ -3.6800 3.9200 ]
: var4: -0.072602 1.2766 [ -4.8486 4.2221 ]
: -----------------------------------------------------------
: Evaluate multiclass classification method: PDEFoam
: Creating multiclass response histograms...
: Creating multiclass performance histograms...
: Creating multiclass response histograms...
: Creating multiclass performance histograms...
TFHandler_PDEFoam : Variable Mean RMS [ Min Max ]
: -----------------------------------------------------------
: var1: 0.070153 1.0224 [ -4.0592 3.5808 ]
: var2: 0.30372 1.0460 [ -3.6952 3.7877 ]
: var3: 0.12152 1.1222 [ -3.6800 3.9200 ]
: var4: -0.072602 1.2766 [ -4.8486 4.2221 ]
: -----------------------------------------------------------
: Evaluate multiclass classification method: DL_CPU
: Creating multiclass response histograms...
: Creating multiclass performance histograms...
: Creating multiclass response histograms...
: Creating multiclass performance histograms...
TFHandler_DL_CPU : Variable Mean RMS [ Min Max ]
: -----------------------------------------------------------
: var1: 0.077270 0.29534 [ -1.1155 1.0914 ]
: var2: 0.068045 0.27981 [ -1.0016 1.0000 ]
: var3: 0.027548 0.24565 [ -0.80459 0.85902 ]
: var4: -0.034157 0.25816 [ -1.0000 0.83435 ]
: -----------------------------------------------------------
TFHandler_DL_CPU : Variable Mean RMS [ Min Max ]
: -----------------------------------------------------------
: var1: 0.077270 0.29534 [ -1.1155 1.0914 ]
: var2: 0.068045 0.27981 [ -1.0016 1.0000 ]
: var3: 0.027548 0.24565 [ -0.80459 0.85902 ]
: var4: -0.034157 0.25816 [ -1.0000 0.83435 ]
: -----------------------------------------------------------
:
: 1-vs-rest performance metrics per class
: -------------------------------------------------------------------------------------------------------
:
: Considers the listed class as signal and the other classes
: as background, reporting the resulting binary performance.
: A score of 0.820 (0.850) means 0.820 was achieved on the
: test set and 0.850 on the training set.
:
: Dataset MVA Method ROC AUC Sig eff@B=0.01 Sig eff@B=0.10 Sig eff@B=0.30
: Name: / Class: test (train) test (train) test (train) test (train)
:
: dataset BDTG
: ------------------------------
: Signal 0.968 (0.978) 0.508 (0.605) 0.914 (0.945) 0.990 (0.996)
: bg0 0.910 (0.931) 0.256 (0.288) 0.737 (0.791) 0.922 (0.956)
: bg1 0.947 (0.954) 0.437 (0.511) 0.833 (0.856) 0.971 (0.971)
: bg2 0.978 (0.982) 0.585 (0.678) 0.951 (0.956) 0.999 (0.996)
:
: dataset MLP
: ------------------------------
: Signal 0.970 (0.975) 0.596 (0.632) 0.933 (0.938) 0.988 (0.993)
: bg0 0.929 (0.934) 0.303 (0.298) 0.787 (0.793) 0.949 (0.961)
: bg1 0.962 (0.967) 0.467 (0.553) 0.881 (0.906) 0.985 (0.992)
: bg2 0.975 (0.979) 0.629 (0.699) 0.929 (0.940) 0.998 (0.998)
:
: dataset PDEFoam
: ------------------------------
: Signal 0.916 (0.928) 0.294 (0.382) 0.744 (0.782) 0.932 (0.952)
: bg0 0.837 (0.848) 0.109 (0.147) 0.519 (0.543) 0.833 (0.851)
: bg1 0.890 (0.902) 0.190 (0.226) 0.606 (0.646) 0.923 (0.929)
: bg2 0.967 (0.972) 0.510 (0.527) 0.900 (0.926) 0.993 (0.998)
:
: dataset DL_CPU
: ------------------------------
: Signal 0.974 (0.975) 0.581 (0.642) 0.934 (0.944) 0.991 (0.993)
: bg0 0.927 (0.932) 0.280 (0.320) 0.780 (0.776) 0.942 (0.958)
: bg1 0.961 (0.963) 0.494 (0.485) 0.888 (0.891) 0.987 (0.991)
: bg2 0.975 (0.974) 0.660 (0.672) 0.908 (0.899) 0.999 (0.999)
:
: -------------------------------------------------------------------------------------------------------
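Each ROC AUC in the table above can be read as the probability that a randomly chosen event of the listed class outscores a randomly chosen event of the combined background. A minimal sketch of that interpretation, counting ordered pairs (O(n·m), for illustration only; helper name is not TMVA API):

```python
def roc_auc(signal_scores, background_scores):
    """ROC AUC as the fraction of (signal, background) pairs in which
    the signal event gets the higher score; ties count one half."""
    wins = sum(
        1.0 if s > b else 0.5 if s == b else 0.0
        for s in signal_scores
        for b in background_scores
    )
    return wins / (len(signal_scores) * len(background_scores))

# one confused pair out of nine drags the AUC below 1
auc = roc_auc([0.9, 0.8, 0.7], [0.1, 0.2, 0.8])
```

Perfect separation gives 1.0, a random classifier 0.5; values like BDTG's 0.968 for Signal mean almost every signal event outscores almost every background event.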
:
:
: Confusion matrices for all methods
: -------------------------------------------------------------------------------------------------------
:
: Does a binary comparison between the two classes given by a
: particular row-column combination. In each case, the class
: given by the row is considered signal while the class given
: by the column index is considered background.
:
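The per-cell numbers in the matrices below are signal efficiencies at a fixed background efficiency (0.01, 0.10 or 0.30). A sketch of how such a working point is extracted from per-event scores (helper name and cut convention are illustrative):

```python
def sig_eff_at_bkg_eff(signal_scores, background_scores, bkg_eff):
    """Pick the score cut that keeps a fraction bkg_eff of the
    background, then report the fraction of signal passing it."""
    bkg_sorted = sorted(background_scores, reverse=True)
    n_keep = max(1, int(round(bkg_eff * len(bkg_sorted))))
    cut = bkg_sorted[n_keep - 1]   # loosest cut keeping n_keep background events
    return sum(1 for s in signal_scores if s >= cut) / len(signal_scores)

# at 10% background efficiency, 8 of 10 signal events survive the cut
eff = sig_eff_at_bkg_eff([0.9] * 8 + [0.1] * 2, [0.2] * 10, 0.10)
```

Tightening the background working point (0.30 → 0.10 → 0.01) raises the cut and lowers the signal efficiency, which is exactly the trend across the three matrices for every method.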
: === Showing confusion matrix for method : BDTG
: (Signal Efficiency for Background Efficiency 0.01%)
: ---------------------------------------------------
: Signal bg0 bg1 bg2
: test (train) test (train) test (train) test (train)
: Signal - 0.497 (0.373) 0.710 (0.693) 0.680 (0.574)
: bg0 0.271 (0.184) - 0.239 (0.145) 0.705 (0.667)
: bg1 0.855 (0.766) 0.369 (0.222) - 0.587 (0.578)
: bg2 0.714 (0.585) 0.705 (0.581) 0.648 (0.601) -
:
: (Signal Efficiency for Background Efficiency 0.10%)
: ---------------------------------------------------
: Signal bg0 bg1 bg2
: test (train) test (train) test (train) test (train)
: Signal - 0.911 (0.853) 0.991 (0.981) 0.945 (0.913)
: bg0 0.833 (0.774) - 0.654 (0.582) 0.930 (0.901)
: bg1 0.971 (0.980) 0.716 (0.681) - 0.871 (0.862)
: bg2 0.976 (0.951) 0.971 (0.973) 0.936 (0.941) -
:
: (Signal Efficiency for Background Efficiency 0.30%)
: ---------------------------------------------------
: Signal bg0 bg1 bg2
: test (train) test (train) test (train) test (train)
: Signal - 0.978 (0.957) 0.999 (1.000) 0.998 (0.997)
: bg0 0.965 (0.926) - 0.874 (0.835) 0.991 (0.976)
: bg1 1.000 (0.999) 0.916 (0.894) - 0.988 (0.985)
: bg2 0.999 (0.999) 0.997 (0.999) 0.996 (0.997) -
:
: === Showing confusion matrix for method : MLP
: (Signal Efficiency for Background Efficiency 0.01%)
: ---------------------------------------------------
: Signal bg0 bg1 bg2
: test (train) test (train) test (train) test (train)
: Signal - 0.465 (0.490) 0.974 (0.953) 0.632 (0.498)
: bg0 0.320 (0.269) - 0.224 (0.250) 0.655 (0.627)
: bg1 0.943 (0.920) 0.341 (0.275) - 0.632 (0.687)
: bg2 0.665 (0.642) 0.697 (0.680) 0.706 (0.598) -
:
: (Signal Efficiency for Background Efficiency 0.10%)
: ---------------------------------------------------
: Signal bg0 bg1 bg2
: test (train) test (train) test (train) test (train)
: Signal - 0.865 (0.854) 0.996 (0.994) 0.908 (0.907)
: bg0 0.784 (0.776) - 0.666 (0.655) 0.919 (0.895)
: bg1 0.998 (0.998) 0.791 (0.785) - 0.912 (0.902)
: bg2 0.943 (0.903) 0.946 (0.939) 0.924 (0.928) -
:
: (Signal Efficiency for Background Efficiency 0.30%)
: ---------------------------------------------------
: Signal bg0 bg1 bg2
: test (train) test (train) test (train) test (train)
: Signal - 0.978 (0.964) 0.997 (0.997) 0.993 (0.986)
: bg0 0.952 (0.924) - 0.936 (0.928) 0.992 (0.990)
: bg1 1.000 (1.000) 0.945 (0.936) - 0.998 (0.995)
: bg2 0.994 (0.985) 0.998 (0.998) 0.998 (0.998) -
:
: === Showing confusion matrix for method : PDEFoam
: (Signal Efficiency for Background Efficiency 0.01%)
: ---------------------------------------------------
: Signal bg0 bg1 bg2
: test (train) test (train) test (train) test (train)
: Signal - 0.289 (0.233) 0.467 (0.436) 0.421 (0.332)
: bg0 0.100 (0.045) - 0.132 (0.116) 0.540 (0.313)
: bg1 0.209 (0.434) 0.153 (0.092) - 0.347 (0.323)
: bg2 0.560 (0.552) 0.445 (0.424) 0.501 (0.506) -
:
: (Signal Efficiency for Background Efficiency 0.10%)
: ---------------------------------------------------
: Signal bg0 bg1 bg2
: test (train) test (train) test (train) test (train)
: Signal - 0.665 (0.640) 0.854 (0.822) 0.807 (0.790)
: bg0 0.538 (0.520) - 0.415 (0.374) 0.843 (0.833)
: bg1 0.885 (0.886) 0.542 (0.491) - 0.728 (0.646)
: bg2 0.928 (0.890) 0.956 (0.959) 0.847 (0.895) -
:
: (Signal Efficiency for Background Efficiency 0.30%)
: ---------------------------------------------------
: Signal bg0 bg1 bg2
: test (train) test (train) test (train) test (train)
: Signal - 0.898 (0.878) 0.971 (0.950) 0.982 (0.975)
: bg0 0.828 (0.810) - 0.696 (0.676) 0.954 (0.951)
: bg1 0.951 (0.966) 0.803 (0.745) - 0.958 (0.966)
: bg2 0.998 (0.991) 0.998 (0.996) 0.998 (0.993) -
:
: === Showing confusion matrix for method : DL_CPU
: (Signal Efficiency for Background Efficiency 0.01%)
: ---------------------------------------------------
: Signal bg0 bg1 bg2
: test (train) test (train) test (train) test (train)
: Signal - 0.458 (0.448) 0.949 (0.947) 0.679 (0.644)
: bg0 0.383 (0.386) - 0.165 (0.194) 0.618 (0.591)
: bg1 0.903 (0.901) 0.350 (0.293) - 0.502 (0.657)
: bg2 0.631 (0.614) 0.699 (0.664) 0.720 (0.692) -
:
: (Signal Efficiency for Background Efficiency 0.10%)
: ---------------------------------------------------
: Signal bg0 bg1 bg2
: test (train) test (train) test (train) test (train)
: Signal - 0.871 (0.876) 0.993 (0.989) 0.939 (0.929)
: bg0 0.786 (0.773) - 0.671 (0.681) 0.878 (0.845)
: bg1 0.998 (0.993) 0.818 (0.785) - 0.842 (0.854)
: bg2 0.898 (0.886) 0.907 (0.935) 0.895 (0.907) -
:
: (Signal Efficiency for Background Efficiency 0.30%)
: ---------------------------------------------------
: Signal bg0 bg1 bg2
: test (train) test (train) test (train) test (train)
: Signal - 0.973 (0.967) 0.998 (1.000) 0.998 (1.000)
: bg0 0.966 (0.942) - 0.926 (0.913) 0.990 (0.988)
: bg1 1.000 (0.998) 0.950 (0.949) - 0.999 (0.993)
: bg2 0.999 (0.997) 0.999 (1.000) 0.998 (0.998) -
:
: -------------------------------------------------------------------------------------------------------
:
Dataset:dataset : Created tree 'TestTree' with 4000 events
:
Dataset:dataset : Created tree 'TrainTree' with 4000 events
:
Factory : Thank you for using TMVA!
: For citation information, please visit: http://tmva.sf.net/citeTMVA.html
==> Wrote root file: TMVAMulticlass.root
==> TMVAMulticlass is done!
#include <cstdlib>
#include <iostream>
#include <map>
#include <string>
#include <vector>

#include "TFile.h"
#include "TTree.h"
#include "TString.h"
#include "TSystem.h"
#include "TROOT.h"

#include "TMVA/Tools.h"
#include "TMVA/Factory.h"
#include "TMVA/DataLoader.h"
#include "TMVA/TMVAMultiClassGui.h"

using namespace TMVA;
void TMVAMulticlass( TString myMethodList = "" )
{
   // This loads the library
   // to get access to the GUI and all tmva macros
   //
   // TString tmva_dir(TString(gRootDir) + "/tmva");
   // if(gSystem->Getenv("TMVASYS"))
   //    tmva_dir = TString(gSystem->Getenv("TMVASYS"));
   // gROOT->SetMacroPath(tmva_dir + "/test/:" + gROOT->GetMacroPath() );
   // gROOT->ProcessLine(".L TMVAMultiClassGui.C");

   //---------------------------------------------------------------
   // Default MVA methods to be trained + tested
   std::map<std::string,int> Use;
   Use["MLP"]     = 1;
   Use["BDTG"]    = 1;
#ifdef R__HAS_TMVAGPU
   Use["DL_CPU"]  = 1;
   Use["DL_GPU"]  = 1;
#else
   Use["DL_CPU"]  = 1;
   Use["DL_GPU"]  = 0;
#endif
   Use["FDA_GA"]  = 0;
   Use["PDEFoam"] = 1;
   //---------------------------------------------------------------

   std::cout << std::endl;
   std::cout << "==> Start TMVAMulticlass" << std::endl;

   if (myMethodList != "") {
      for (std::map<std::string,int>::iterator it = Use.begin(); it != Use.end(); it++) it->second = 0;
      std::vector<TString> mlist = TMVA::gTools().SplitString( myMethodList, ',' );
      for (UInt_t i=0; i<mlist.size(); i++) {
         std::string regMethod(mlist[i]);
         if (Use.find(regMethod) == Use.end()) {
            std::cout << "Method \"" << regMethod << "\" not known in TMVA under this name. Choose among the following:" << std::endl;
            for (std::map<std::string,int>::iterator it = Use.begin(); it != Use.end(); it++) std::cout << it->first << " ";
            std::cout << std::endl;
            return;
         }
         Use[regMethod] = 1;
      }
   }
   // Create a new root output file.
   TString outfileName = "TMVAMulticlass.root";
   TFile* outputFile = TFile::Open( outfileName, "RECREATE" );

   TMVA::Factory *factory = new TMVA::Factory( "TMVAMulticlass", outputFile,
      "!V:!Silent:Color:DrawProgressBar:Transformations=I;D;P;G,D:AnalysisType=multiclass" );

   TMVA::DataLoader *dataloader = new TMVA::DataLoader("dataset");

   dataloader->AddVariable( "var1", 'F' );
   dataloader->AddVariable( "var2", "Variable 2", "",      'F' );
   dataloader->AddVariable( "var3", "Variable 3", "units", 'F' );
   dataloader->AddVariable( "var4", "Variable 4", "units", 'F' );

   TFile *input(0);
   TString fname = "./tmva_multiclass_example.root";
   if (!gSystem->AccessPathName( fname )) {
      input = TFile::Open( fname ); // check if file in local directory exists
   }
   else {
      TFile::SetCacheFileDir(".");
      input = TFile::Open("http://root.cern.ch/files/tmva_multiclass_example.root", "CACHEREAD");
   }
   if (!input) {
      std::cout << "ERROR: could not open data file" << std::endl;
      exit(1);
   }
   std::cout << "--- TMVAMulticlass: Using input file: " << input->GetName() << std::endl;

   TTree *signalTree  = (TTree*)input->Get("TreeS");
   TTree *background0 = (TTree*)input->Get("TreeB0");
   TTree *background1 = (TTree*)input->Get("TreeB1");
   TTree *background2 = (TTree*)input->Get("TreeB2");

   gROOT->cd( outfileName+TString(":/") );
   dataloader->AddTree( signalTree,  "Signal" );
   dataloader->AddTree( background0, "bg0" );
   dataloader->AddTree( background1, "bg1" );
   dataloader->AddTree( background2, "bg2" );

   dataloader->PrepareTrainingAndTestTree( "", "SplitMode=Random:NormMode=NumEvents:!V" );
   if (Use["BDTG"]) // gradient boosted decision trees
      factory->BookMethod( dataloader, TMVA::Types::kBDT, "BDTG", "!H:!V:NTrees=1000:BoostType=Grad:Shrinkage=0.10:UseBaggedBoost:BaggedSampleFraction=0.50:nCuts=20:MaxDepth=2");
   if (Use["MLP"]) // neural network
      factory->BookMethod( dataloader, TMVA::Types::kMLP, "MLP", "!H:!V:NeuronType=tanh:NCycles=1000:HiddenLayers=N+5,5:TestRate=5:EstimatorType=MSE");
   if (Use["FDA_GA"]) // functional discriminant with GA minimizer
      factory->BookMethod( dataloader, TMVA::Types::kFDA, "FDA_GA", "H:!V:Formula=(0)+(1)*x0+(2)*x1+(3)*x2+(4)*x3:ParRanges=(-1,1);(-10,10);(-10,10);(-10,10);(-10,10):FitMethod=GA:PopSize=300:Cycles=3:Steps=20:Trim=True:SaveBestGen=1" );
   if (Use["PDEFoam"]) // PDE-Foam approach
      factory->BookMethod( dataloader, TMVA::Types::kPDEFoam, "PDEFoam", "!H:!V:TailCut=0.001:VolFrac=0.0666:nActiveCells=500:nSampl=2000:nBin=5:Nmin=100:Kernel=None:Compress=T" );

   if (Use["DL_CPU"]) {
      TString layoutString("Layout=TANH|100,TANH|50,TANH|10,LINEAR");
      TString trainingStrategyString("TrainingStrategy=Optimizer=ADAM,LearningRate=1e-3,"
                                     "TestRepetitions=1,ConvergenceSteps=10,BatchSize=100");
      TString nnOptions("!H:V:ErrorStrategy=CROSSENTROPY:VarTransform=N:"
                        "WeightInitialization=XAVIERUNIFORM:Architecture=CPU");
      nnOptions.Append(":");
      nnOptions.Append(layoutString);
      nnOptions.Append(":");
      nnOptions.Append(trainingStrategyString);
      factory->BookMethod(dataloader, TMVA::Types::kDL, "DL_CPU", nnOptions);
   }
   if (Use["DL_GPU"]) {
      TString layoutString("Layout=TANH|100,TANH|50,TANH|10,LINEAR");
      TString trainingStrategyString("TrainingStrategy=Optimizer=ADAM,LearningRate=1e-3,"
                                     "TestRepetitions=1,ConvergenceSteps=10,BatchSize=100");
      TString nnOptions("!H:V:ErrorStrategy=CROSSENTROPY:VarTransform=N:"
                        "WeightInitialization=XAVIERUNIFORM:Architecture=GPU");
      nnOptions.Append(":");
      nnOptions.Append(layoutString);
      nnOptions.Append(":");
      nnOptions.Append(trainingStrategyString);
      factory->BookMethod(dataloader, TMVA::Types::kDL, "DL_GPU", nnOptions);
   }
   // Train MVAs using the set of training events
   factory->TrainAllMethods();

   // Evaluate all MVAs using the set of test events
   factory->TestAllMethods();

   // Evaluate and compare performance of all configured MVAs
   factory->EvaluateAllMethods();

   // --------------------------------------------------------------

   // Save the output
   outputFile->Close();

   std::cout << "==> Wrote root file: " << outputFile->GetName() << std::endl;
   std::cout << "==> TMVAMulticlass is done!" << std::endl;

   delete factory;
   delete dataloader;

   // Launch the GUI for the root macros
   if (!gROOT->IsBatch()) TMVAMultiClassGui( outfileName );
}

int main( int argc, char** argv )
{
   // Select methods (don't look at this code - not of interest)
   TString methodList;
   for (int i=1; i<argc; i++) {
      TString regMethod(argv[i]);
      if (regMethod=="-b" || regMethod=="--batch") continue;
      if (!methodList.IsNull()) methodList += TString(",");
      methodList += regMethod;
   }
   TMVAMulticlass(methodList);
   return 0;
}
Author
Andreas Hoecker

Definition in file TMVAMulticlass.C.