Reference Guide
TMVAMulticlass.C File Reference

Detailed Description

This macro provides a simple example of the training and testing of TMVA multiclass classification.

  • Project : TMVA - a ROOT-integrated toolkit for multivariate data analysis
  • Package : TMVA
  • Root Macro: TMVAMulticlass
==> Start TMVAMulticlass
--- TMVAMulticlass: Using input file: ./files/tmva_multiclass_example.root
DataSetInfo : [dataset] : Added class "Signal"
: Add Tree TreeS of type Signal with 2000 events
DataSetInfo : [dataset] : Added class "bg0"
: Add Tree TreeB0 of type bg0 with 2000 events
DataSetInfo : [dataset] : Added class "bg1"
: Add Tree TreeB1 of type bg1 with 2000 events
DataSetInfo : [dataset] : Added class "bg2"
: Add Tree TreeB2 of type bg2 with 2000 events
: Dataset[dataset] : Class index : 0 name : Signal
: Dataset[dataset] : Class index : 1 name : bg0
: Dataset[dataset] : Class index : 2 name : bg1
: Dataset[dataset] : Class index : 3 name : bg2
Factory : Booking method: BDTG
:
: the option NegWeightTreatment=InverseBoostNegWeights does not exist for BoostType=Grad
: --> change to new default NegWeightTreatment=Pray
: Building event vectors for type 2 Signal
: Dataset[dataset] : create input formulas for tree TreeS
: Building event vectors for type 2 bg0
: Dataset[dataset] : create input formulas for tree TreeB0
: Building event vectors for type 2 bg1
: Dataset[dataset] : create input formulas for tree TreeB1
: Building event vectors for type 2 bg2
: Dataset[dataset] : create input formulas for tree TreeB2
DataSetFactory : [dataset] : Number of events in input trees
: Signal -- number of events : 2000
: bg0 -- number of events : 2000
: bg1 -- number of events : 2000
: bg2 -- number of events : 2000
: Number of training and testing events
: ---------------------------------------------------------------------------
: Signal -- training events : 1000
: Signal -- testing events : 1000
: Signal -- training and testing events: 2000
: bg0 -- training events : 1000
: bg0 -- testing events : 1000
: bg0 -- training and testing events: 2000
: bg1 -- training events : 1000
: bg1 -- testing events : 1000
: bg1 -- training and testing events: 2000
: bg2 -- training events : 1000
: bg2 -- testing events : 1000
: bg2 -- training and testing events: 2000
:
DataSetInfo : Correlation matrix (Signal):
: ----------------------------------------
: var1 var2 var3 var4
: var1: +1.000 +0.397 +0.623 +0.832
: var2: +0.397 +1.000 +0.716 +0.737
: var3: +0.623 +0.716 +1.000 +0.859
: var4: +0.832 +0.737 +0.859 +1.000
: ----------------------------------------
DataSetInfo : Correlation matrix (bg0):
: ----------------------------------------
: var1 var2 var3 var4
: var1: +1.000 +0.365 +0.592 +0.811
: var2: +0.365 +1.000 +0.708 +0.740
: var3: +0.592 +0.708 +1.000 +0.859
: var4: +0.811 +0.740 +0.859 +1.000
: ----------------------------------------
DataSetInfo : Correlation matrix (bg1):
: ----------------------------------------
: var1 var2 var3 var4
: var1: +1.000 +0.407 +0.610 +0.834
: var2: +0.407 +1.000 +0.710 +0.741
: var3: +0.610 +0.710 +1.000 +0.851
: var4: +0.834 +0.741 +0.851 +1.000
: ----------------------------------------
DataSetInfo : Correlation matrix (bg2):
: ----------------------------------------
: var1 var2 var3 var4
: var1: +1.000 -0.647 -0.016 -0.013
: var2: -0.647 +1.000 +0.015 +0.002
: var3: -0.016 +0.015 +1.000 -0.024
: var4: -0.013 +0.002 -0.024 +1.000
: ----------------------------------------
DataSetFactory : [dataset] :
:
Factory : Booking method: MLP
:
MLP : Building Network.
: Initializing weights
Factory : Booking method: PDEFoam
:
Factory : Booking method: DL_CPU
:
: Parsing option string:
: ... "!H:V:ErrorStrategy=CROSSENTROPY:VarTransform=N:WeightInitialization=XAVIERUNIFORM:Architecture=GPU:Layout=TANH|100,TANH|50,TANH|10,LINEAR:TrainingStrategy=Optimizer=ADAM,LearningRate=1e-3,TestRepetitions=1,ConvergenceSteps=10,BatchSize=100"
: The following options are set:
: - By User:
: <none>
: - Default:
: Boost_num: "0" [Number of times the classifier will be boosted]
: Parsing option string:
: ... "!H:V:ErrorStrategy=CROSSENTROPY:VarTransform=N:WeightInitialization=XAVIERUNIFORM:Architecture=GPU:Layout=TANH|100,TANH|50,TANH|10,LINEAR:TrainingStrategy=Optimizer=ADAM,LearningRate=1e-3,TestRepetitions=1,ConvergenceSteps=10,BatchSize=100"
: The following options are set:
: - By User:
: V: "True" [Verbose output (short form of "VerbosityLevel" below - overrides the latter one)]
: VarTransform: "N" [List of variable transformations performed before training, e.g., "D_Background,P_Signal,G,N_AllClasses" for: "Decorrelation, PCA-transformation, Gaussianisation, Normalisation, each for the given class of events ('AllClasses' denotes all events of all classes, if no class indication is given, 'All' is assumed)"]
: H: "False" [Print method-specific help message]
: Layout: "TANH|100,TANH|50,TANH|10,LINEAR" [Layout of the network.]
: ErrorStrategy: "CROSSENTROPY" [Loss function: Mean squared error (regression) or cross entropy (binary classification).]
: WeightInitialization: "XAVIERUNIFORM" [Weight initialization strategy]
: Architecture: "GPU" [Which architecture to perform the training on.]
: TrainingStrategy: "Optimizer=ADAM,LearningRate=1e-3,TestRepetitions=1,ConvergenceSteps=10,BatchSize=100" [Defines the training strategies.]
: - Default:
: VerbosityLevel: "Default" [Verbosity level]
: CreateMVAPdfs: "False" [Create PDFs for classifier outputs (signal and background)]
: IgnoreNegWeightsInTraining: "False" [Events with negative weights are ignored in the training (but are included for testing and performance evaluation)]
: InputLayout: "0|0|0" [The Layout of the input]
: BatchLayout: "0|0|0" [The Layout of the batch]
: RandomSeed: "0" [Random seed used for weight initialization and batch shuffling]
: ValidationSize: "20%" [Part of the training data to use for validation. Specify as 0.2 or 20% to use a fifth of the data set as validation set. Specify as 100 to use exactly 100 events. (Default: 20%)]
DL_CPU : [dataset] : Create Transformation "N" with events from all classes.
:
: Transformation, Variable selection :
: Input : variable 'var1' <---> Output : variable 'var1'
: Input : variable 'var2' <---> Output : variable 'var2'
: Input : variable 'var3' <---> Output : variable 'var3'
: Input : variable 'var4' <---> Output : variable 'var4'
<ERROR> : CUDA backend not enabled. Please make sure you have CUDA installed and it was successfully detected by CMAKE by using -Dtmva-gpu=On 
: Will now use instead the CPU architecture !
: Will now use the CPU architecture with BLAS and IMT support !
Factory : Train all methods
Factory : [dataset] : Create Transformation "I" with events from all classes.
:
: Transformation, Variable selection :
: Input : variable 'var1' <---> Output : variable 'var1'
: Input : variable 'var2' <---> Output : variable 'var2'
: Input : variable 'var3' <---> Output : variable 'var3'
: Input : variable 'var4' <---> Output : variable 'var4'
Factory : [dataset] : Create Transformation "D" with events from all classes.
:
: Transformation, Variable selection :
: Input : variable 'var1' <---> Output : variable 'var1'
: Input : variable 'var2' <---> Output : variable 'var2'
: Input : variable 'var3' <---> Output : variable 'var3'
: Input : variable 'var4' <---> Output : variable 'var4'
Factory : [dataset] : Create Transformation "P" with events from all classes.
:
: Transformation, Variable selection :
: Input : variable 'var1' <---> Output : variable 'var1'
: Input : variable 'var2' <---> Output : variable 'var2'
: Input : variable 'var3' <---> Output : variable 'var3'
: Input : variable 'var4' <---> Output : variable 'var4'
Factory : [dataset] : Create Transformation "G" with events from all classes.
:
: Transformation, Variable selection :
: Input : variable 'var1' <---> Output : variable 'var1'
: Input : variable 'var2' <---> Output : variable 'var2'
: Input : variable 'var3' <---> Output : variable 'var3'
: Input : variable 'var4' <---> Output : variable 'var4'
Factory : [dataset] : Create Transformation "D" with events from all classes.
:
: Transformation, Variable selection :
: Input : variable 'var1' <---> Output : variable 'var1'
: Input : variable 'var2' <---> Output : variable 'var2'
: Input : variable 'var3' <---> Output : variable 'var3'
: Input : variable 'var4' <---> Output : variable 'var4'
TFHandler_Factory : Variable Mean RMS [ Min Max ]
: -----------------------------------------------------------
: var1: 0.047647 1.0025 [ -3.6592 3.2645 ]
: var2: 0.32647 1.0646 [ -3.6891 3.7877 ]
: var3: 0.11493 1.1230 [ -4.5727 4.5640 ]
: var4: -0.076531 1.2652 [ -4.8486 5.0412 ]
: -----------------------------------------------------------
: Preparing the Decorrelation transformation...
TFHandler_Factory : Variable Mean RMS [ Min Max ]
: -----------------------------------------------------------
: var1: 0.082544 1.0000 [ -3.6274 3.1017 ]
: var2: 0.36715 1.0000 [ -3.3020 3.4950 ]
: var3: 0.066865 1.0000 [ -2.9882 3.3086 ]
: var4: -0.20593 1.0000 [ -3.3088 2.8423 ]
: -----------------------------------------------------------
: Preparing the Principal Component (PCA) transformation...
TFHandler_Factory : Variable Mean RMS [ Min Max ]
: -----------------------------------------------------------
: var1: 5.7502e-10 1.8064 [ -8.0344 7.8312 ]
: var2:-1.6078e-11 0.90130 [ -2.6765 2.7523 ]
: var3: 3.0841e-10 0.73386 [ -2.6572 2.2255 ]
: var4:-2.6886e-10 0.62168 [ -1.7384 2.2297 ]
: -----------------------------------------------------------
: Preparing the Gaussian transformation...
: Preparing the Decorrelation transformation...
TFHandler_Factory : Variable Mean RMS [ Min Max ]
: -----------------------------------------------------------
: var1: 0.013510 1.0000 [ -2.6520 6.2074 ]
: var2: 0.0096839 1.0000 [ -2.8402 6.3073 ]
: var3: 0.010397 1.0000 [ -3.0251 5.8860 ]
: var4: 0.0053980 1.0000 [ -3.0998 5.7078 ]
: -----------------------------------------------------------
: Ranking input variables (method unspecific)...
Factory : Train method: BDTG for Multiclass classification
:
: Training 1000 Decision Trees ... patience please
: Elapsed time for training with 4000 events: 5.46 sec
: Dataset[dataset] : Create results for training
: Dataset[dataset] : Multiclass evaluation of BDTG on training sample
: Dataset[dataset] : Elapsed time for evaluation of 4000 events: 1.94 sec
: Creating multiclass response histograms...
: Creating multiclass performance histograms...
: Creating xml weight file: dataset/weights/TMVAMulticlass_BDTG.weights.xml
: Creating standalone class: dataset/weights/TMVAMulticlass_BDTG.class.C
: TMVAMulticlass.root:/dataset/Method_BDT/BDTG
Factory : Training finished
:
Factory : Train method: MLP for Multiclass classification
:
: Training Network
:
: Elapsed time for training with 4000 events: 24.3 sec
: Dataset[dataset] : Create results for training
: Dataset[dataset] : Multiclass evaluation of MLP on training sample
: Dataset[dataset] : Elapsed time for evaluation of 4000 events: 0.0177 sec
: Creating multiclass response histograms...
: Creating multiclass performance histograms...
: Creating xml weight file: dataset/weights/TMVAMulticlass_MLP.weights.xml
: Creating standalone class: dataset/weights/TMVAMulticlass_MLP.class.C
: Write special histos to file: TMVAMulticlass.root:/dataset/Method_MLP/MLP
Factory : Training finished
:
Factory : Train method: PDEFoam for Multiclass classification
:
: Build up multiclass foam 0
: Elapsed time: 0.663 sec
: Build up multiclass foam 1
: Elapsed time: 0.668 sec
: Build up multiclass foam 2
: Elapsed time: 0.67 sec
: Build up multiclass foam 3
: Elapsed time: 0.47 sec
: Elapsed time for training with 4000 events: 2.64 sec
: Dataset[dataset] : Create results for training
: Dataset[dataset] : Multiclass evaluation of PDEFoam on training sample
: Dataset[dataset] : Elapsed time for evaluation of 4000 events: 0.126 sec
: Creating multiclass response histograms...
: Creating multiclass performance histograms...
: Creating xml weight file: dataset/weights/TMVAMulticlass_PDEFoam.weights.xml
: writing foam MultiClassFoam0 to file
: writing foam MultiClassFoam1 to file
: writing foam MultiClassFoam2 to file
: writing foam MultiClassFoam3 to file
: Foams written to file: dataset/weights/TMVAMulticlass_PDEFoam.weights_foams.root
: Creating standalone class: dataset/weights/TMVAMulticlass_PDEFoam.class.C
Factory : Training finished
:
Factory : Train method: DL_CPU for Multiclass classification
:
TFHandler_DL_CPU : Variable Mean RMS [ Min Max ]
: -----------------------------------------------------------
: var1: 0.070769 0.28960 [ -1.0000 1.0000 ]
: var2: 0.074130 0.28477 [ -1.0000 1.0000 ]
: var3: 0.026106 0.24582 [ -1.0000 1.0000 ]
: var4: -0.034951 0.25587 [ -1.0000 1.0000 ]
: -----------------------------------------------------------
: Start of deep neural network training on CPU using MT, nthreads = 1
:
TFHandler_DL_CPU : Variable Mean RMS [ Min Max ]
: -----------------------------------------------------------
: var1: 0.070769 0.28960 [ -1.0000 1.0000 ]
: var2: 0.074130 0.28477 [ -1.0000 1.0000 ]
: var3: 0.026106 0.24582 [ -1.0000 1.0000 ]
: var4: -0.034951 0.25587 [ -1.0000 1.0000 ]
: -----------------------------------------------------------
: ***** Deep Learning Network *****
DEEP NEURAL NETWORK: Depth = 4 Input = ( 1, 1, 4 ) Batch size = 100 Loss function = C
Layer 0 DENSE Layer: ( Input = 4 , Width = 100 ) Output = ( 1 , 100 , 100 ) Activation Function = Tanh
Layer 1 DENSE Layer: ( Input = 100 , Width = 50 ) Output = ( 1 , 100 , 50 ) Activation Function = Tanh
Layer 2 DENSE Layer: ( Input = 50 , Width = 10 ) Output = ( 1 , 100 , 10 ) Activation Function = Tanh
Layer 3 DENSE Layer: ( Input = 10 , Width = 4 ) Output = ( 1 , 100 , 4 ) Activation Function = Identity
: Using 3200 events for training and 800 for testing
: Compute initial loss on the validation data
: Training phase 1 of 1: Optimizer ADAM (beta1=0.9,beta2=0.999,eps=1e-07) Learning rate = 0.001 regularization 0 minimum error = 0.689348
: --------------------------------------------------------------
: Epoch | Train Err. Val. Err. t(s)/epoch t(s)/Loss nEvents/s Conv. Steps
: --------------------------------------------------------------
: Start epoch iteration ...
: 1 Minimum Test error found - save the configuration
: 1 | 0.585859 0.513179 0.0770618 0.00680656 45548.2 0
: 2 Minimum Test error found - save the configuration
: 2 | 0.494514 0.472595 0.0782847 0.00673694 44725.4 0
: 3 Minimum Test error found - save the configuration
: 3 | 0.460313 0.441617 0.0792387 0.0068158 44184.9 0
: 4 Minimum Test error found - save the configuration
: 4 | 0.433511 0.414374 0.0789596 0.00681079 44352.8 0
: 5 Minimum Test error found - save the configuration
: 5 | 0.408516 0.391522 0.0796459 0.00714388 44136.7 0
: 6 Minimum Test error found - save the configuration
: 6 | 0.389172 0.374154 0.0802346 0.00693585 43657 0
: 7 Minimum Test error found - save the configuration
: 7 | 0.375151 0.36088 0.0802025 0.00693302 43674.4 0
: 8 Minimum Test error found - save the configuration
: 8 | 0.364006 0.350309 0.0799224 0.00690333 43824.2 0
: 9 Minimum Test error found - save the configuration
: 9 | 0.355301 0.341487 0.0814222 0.00701618 43007.3 0
: 10 Minimum Test error found - save the configuration
: 10 | 0.348055 0.336465 0.0807696 0.00697556 43363.9 0
: 11 Minimum Test error found - save the configuration
: 11 | 0.341541 0.330644 0.080693 0.00700171 43424.4 0
: 12 Minimum Test error found - save the configuration
: 12 | 0.335818 0.32025 0.0814411 0.0070455 43013.3 0
: 13 Minimum Test error found - save the configuration
: 13 | 0.329728 0.316073 0.0807996 0.00699318 43356.7 0
: 14 Minimum Test error found - save the configuration
: 14 | 0.324898 0.311109 0.0808676 0.00702662 43336.4 0
: 15 Minimum Test error found - save the configuration
: 15 | 0.320485 0.305903 0.0809312 0.00701366 43291.5 0
: 16 Minimum Test error found - save the configuration
: 16 | 0.316391 0.299418 0.0810307 0.00702514 43240 0
: 17 Minimum Test error found - save the configuration
: 17 | 0.310835 0.294764 0.0811975 0.00703809 43150.3 0
: 18 Minimum Test error found - save the configuration
: 18 | 0.306948 0.291669 0.0811564 0.0070421 43176.6 0
: 19 | 0.302894 0.292923 0.0812182 0.00696606 43096.4 1
: 20 Minimum Test error found - save the configuration
: 20 | 0.299705 0.285545 0.0814297 0.00705982 43028.2 0
: 21 Minimum Test error found - save the configuration
: 21 | 0.296297 0.281174 0.0813758 0.00706517 43062.5 0
: 22 | 0.292534 0.282411 0.0817453 0.00701216 42819 1
: 23 Minimum Test error found - save the configuration
: 23 | 0.290563 0.279206 0.0817646 0.00711286 42865.7 0
: 24 Minimum Test error found - save the configuration
: 24 | 0.288294 0.276363 0.0821264 0.00714938 42679.8 0
: 25 Minimum Test error found - save the configuration
: 25 | 0.285826 0.273398 0.0817472 0.00710205 42869.5 0
: 26 | 0.283597 0.273863 0.081559 0.00699075 42913.7 1
: 27 Minimum Test error found - save the configuration
: 27 | 0.282664 0.270684 0.0816967 0.00707515 42883 0
: 28 Minimum Test error found - save the configuration
: 28 | 0.279816 0.268888 0.0817107 0.00709825 42888.3 0
: 29 Minimum Test error found - save the configuration
: 29 | 0.278549 0.267211 0.0822081 0.00720978 42667.6 0
: 30 Minimum Test error found - save the configuration
: 30 | 0.276603 0.265524 0.0822306 0.00713992 42615.1 0
: 31 | 0.275571 0.27219 0.0820113 0.00703592 42680.7 1
: 32 Minimum Test error found - save the configuration
: 32 | 0.273938 0.265166 0.0824101 0.00716928 42530.1 0
: 33 Minimum Test error found - save the configuration
: 33 | 0.273043 0.262158 0.0831342 0.00719433 42138.6 0
: 34 | 0.270929 0.266126 0.0822184 0.00704979 42570.9 1
: 35 Minimum Test error found - save the configuration
: 35 | 0.269982 0.261688 0.082349 0.00715048 42554 0
: 36 Minimum Test error found - save the configuration
: 36 | 0.269369 0.261057 0.0824727 0.00719757 42510.7 0
: 37 Minimum Test error found - save the configuration
: 37 | 0.267988 0.258422 0.082084 0.00715537 42707.3 0
: 38 | 0.267136 0.261039 0.0818535 0.00703911 42772.5 1
: 39 | 0.26639 0.259965 0.0820376 0.00704323 42669.9 2
: 40 | 0.264237 0.258516 0.0829376 0.00709333 42191.7 3
: 41 Minimum Test error found - save the configuration
: 41 | 0.265309 0.256802 0.0825376 0.00719099 42470.4 0
: 42 Minimum Test error found - save the configuration
: 42 | 0.264678 0.254968 0.082478 0.0071773 42496.3 0
: 43 | 0.261063 0.254988 0.0823363 0.0070655 42513.2 1
: 44 | 0.259478 0.25538 0.0825383 0.00708286 42409.1 2
: 45 Minimum Test error found - save the configuration
: 45 | 0.259611 0.252583 0.0823139 0.00717097 42585.5 0
: 46 | 0.258594 0.252907 0.0822661 0.00711637 42581.6 1
: 47 | 0.260198 0.25518 0.082345 0.00707842 42515.5 2
: 48 Minimum Test error found - save the configuration
: 48 | 0.257634 0.251211 0.0827716 0.00729663 42398.2 0
: 49 Minimum Test error found - save the configuration
: 49 | 0.254325 0.250771 0.0830595 0.00723749 42204.1 0
: 50 Minimum Test error found - save the configuration
: 50 | 0.254537 0.250684 0.0823181 0.00712909 42559.4 0
: 51 Minimum Test error found - save the configuration
: 51 | 0.253139 0.24834 0.0823812 0.00717305 42548.6 0
: 52 Minimum Test error found - save the configuration
: 52 | 0.25229 0.245617 0.0823699 0.00718753 42563.2 0
: 53 | 0.2508 0.246733 0.082334 0.00707583 42520.3 1
: 54 Minimum Test error found - save the configuration
: 54 | 0.250397 0.245504 0.0829735 0.00724971 42258.9 0
: 55 Minimum Test error found - save the configuration
: 55 | 0.250192 0.245481 0.0826991 0.00722384 42398 0
: 56 Minimum Test error found - save the configuration
: 56 | 0.247785 0.243017 0.0828973 0.00724057 42296.3 0
: 57 | 0.248088 0.243808 0.0825055 0.007102 42438.3 1
: 58 | 0.247958 0.244931 0.082577 0.00711672 42406.4 2
: 59 Minimum Test error found - save the configuration
: 59 | 0.246151 0.243004 0.0826303 0.00719635 42421.2 0
: 60 | 0.245284 0.244758 0.0830378 0.00709729 42138.2 1
: 61 Minimum Test error found - save the configuration
: 61 | 0.245138 0.240574 0.082982 0.00727669 42269.2 0
: 62 Minimum Test error found - save the configuration
: 62 | 0.243781 0.238625 0.083644 0.00736897 41953.5 0
: 63 Minimum Test error found - save the configuration
: 63 | 0.244894 0.237914 0.0831039 0.00723235 42176.6 0
: 64 Minimum Test error found - save the configuration
: 64 | 0.243299 0.237113 0.0825814 0.0072217 42463 0
: 65 | 0.242373 0.240268 0.0827045 0.00711536 42334.1 1
: 66 | 0.241129 0.241459 0.0829668 0.00712766 42194.6 2
: 67 | 0.240585 0.237871 0.0830663 0.00729906 42234.6 3
: 68 Minimum Test error found - save the configuration
: 68 | 0.240612 0.236486 0.0828257 0.00719233 42309.3 0
: 69 | 0.239335 0.237961 0.0824938 0.00711158 42450.3 1
: 70 | 0.240655 0.241996 0.0827235 0.00712138 42326.9 2
: 71 Minimum Test error found - save the configuration
: 71 | 0.238389 0.234177 0.0828632 0.00725951 42326 0
: 72 | 0.23731 0.237473 0.0829158 0.00712977 42224.1 1
: 73 | 0.236778 0.236013 0.0831582 0.00714752 42099.3 2
: 74 Minimum Test error found - save the configuration
: 74 | 0.23704 0.232216 0.0830355 0.00724472 42221.5 0
: 75 | 0.235177 0.235389 0.0829994 0.00715093 42189.4 1
: 76 | 0.234452 0.237081 0.0832187 0.00712613 42054 2
: 77 | 0.234776 0.234543 0.0828482 0.00713445 42264.5 3
: 78 | 0.23422 0.233126 0.0832729 0.00720366 42066.9 4
: 79 | 0.233863 0.232421 0.0831498 0.00713987 42099.7 5
: 80 | 0.232848 0.234785 0.0826263 0.00711393 42377.2 6
: 81 Minimum Test error found - save the configuration
: 81 | 0.232822 0.230318 0.082654 0.00719895 42409.4 0
: 82 | 0.232487 0.230779 0.0827276 0.00715992 42346.1 1
: 83 Minimum Test error found - save the configuration
: 83 | 0.231728 0.228941 0.0835732 0.00736864 41992.2 0
: 84 | 0.231125 0.2302 0.0836009 0.00715831 41861.5 1
: 85 | 0.230178 0.228942 0.0828464 0.00712817 42261.9 2
: 86 Minimum Test error found - save the configuration
: 86 | 0.229453 0.228516 0.0832373 0.00738226 42185.7 0
: 87 | 0.228935 0.231839 0.083236 0.00715704 42061.6 1
: 88 | 0.227634 0.231077 0.0831174 0.00715147 42124.1 2
: 89 Minimum Test error found - save the configuration
: 89 | 0.22723 0.226266 0.083151 0.0072557 42163.3 0
: 90 | 0.227472 0.228064 0.0831021 0.00713951 42126 1
: 91 Minimum Test error found - save the configuration
: 91 | 0.228178 0.225826 0.0833073 0.00727432 42087 0
: 92 | 0.225487 0.231137 0.0831989 0.00715799 42082.6 1
: 93 Minimum Test error found - save the configuration
: 93 | 0.225811 0.225512 0.0833409 0.00727295 42067.7 0
: 94 Minimum Test error found - save the configuration
: 94 | 0.225993 0.225302 0.083491 0.00727251 41984.6 0
: 95 | 0.224028 0.227236 0.0831478 0.00716512 42114.9 1
: 96 | 0.223426 0.225838 0.0832772 0.00715873 42039.7 2
: 97 Minimum Test error found - save the configuration
: 97 | 0.222469 0.223811 0.0835767 0.0072669 41934.3 0
: 98 Minimum Test error found - save the configuration
: 98 | 0.221844 0.222024 0.0833981 0.0072862 42043.4 0
: 99 Minimum Test error found - save the configuration
: 99 | 0.221483 0.220963 0.086683 0.00731505 40318.5 0
: 100 | 0.220225 0.224752 0.0832868 0.00716827 42039.7 1
: 101 Minimum Test error found - save the configuration
: 101 | 0.220367 0.220907 0.0834077 0.0072798 42034.5 0
: 102 | 0.218637 0.224252 0.0834953 0.00716282 41921.9 1
: 103 | 0.219283 0.22296 0.0831355 0.00716139 42119.6 2
: 104 Minimum Test error found - save the configuration
: 104 | 0.217987 0.220746 0.0830948 0.00724725 42189.9 0
: 105 Minimum Test error found - save the configuration
: 105 | 0.218154 0.219828 0.0831114 0.00724781 42181 0
: 106 | 0.218006 0.223147 0.0830638 0.00716511 42161.5 1
: 107 | 0.218126 0.222556 0.0830688 0.00714916 42149.9 2
: 108 Minimum Test error found - save the configuration
: 108 | 0.21567 0.217989 0.0831153 0.00723578 42172.1 0
: 109 Minimum Test error found - save the configuration
: 109 | 0.215246 0.217636 0.0836383 0.00733199 41936.3 0
: 110 | 0.214425 0.219203 0.0834668 0.00718861 41951.7 1
: 111 Minimum Test error found - save the configuration
: 111 | 0.21506 0.216768 0.0835006 0.00729434 41991.3 0
: 112 Minimum Test error found - save the configuration
: 112 | 0.21454 0.2142 0.0835479 0.00729217 41964.1 0
: 113 | 0.213627 0.221701 0.0834529 0.00724156 41988.5 1
: 114 | 0.212547 0.215926 0.0835138 0.00718259 41922.5 2
: 115 | 0.213138 0.216498 0.0835181 0.00718078 41919.2 3
: 116 | 0.211619 0.21796 0.0835474 0.00718787 41907 4
: 117 | 0.212765 0.217522 0.0834914 0.00719538 41941.9 5
: 118 | 0.211992 0.220704 0.0835478 0.00719236 41909.3 6
: 119 | 0.212321 0.22159 0.0835703 0.00719482 41898.3 7
: 120 | 0.21092 0.214799 0.083667 0.00721466 41856.1 8
: 121 | 0.210093 0.215246 0.0832136 0.00719155 42093.1 9
: 122 Minimum Test error found - save the configuration
: 122 | 0.20972 0.212589 0.0841467 0.00751112 41756.1 0
: 123 | 0.209015 0.218563 0.0837485 0.00720175 41804.5 1
: 124 | 0.20883 0.217614 0.0839855 0.00719599 41672.4 2
: 125 Minimum Test error found - save the configuration
: 125 | 0.209083 0.212388 0.0833316 0.00726377 42067.7 0
: 126 | 0.209113 0.220801 0.0830931 0.00715685 42140.6 1
: 127 Minimum Test error found - save the configuration
: 127 | 0.209072 0.209977 0.0838271 0.00735307 41844.3 0
: 128 | 0.208177 0.215882 0.083666 0.00719702 41847.1 1
: 129 | 0.208392 0.212737 0.0837161 0.00720818 41825.7 2
: 130 | 0.207473 0.214539 0.0833574 0.00718301 42008.9 3
: 131 | 0.206834 0.214137 0.083208 0.00717829 42088.8 4
: 132 | 0.206619 0.216419 0.0832371 0.00719015 42079.3 5
: 133 | 0.205706 0.214099 0.0832886 0.00718741 42049.3 6
: 134 | 0.205564 0.210928 0.083304 0.00718749 42040.8 7
: 135 | 0.204179 0.21078 0.0833144 0.00718632 42034.4 8
: 136 Minimum Test error found - save the configuration
: 136 | 0.204867 0.209197 0.0842208 0.00736808 41638.1 0
: 137 | 0.205348 0.212868 0.0835291 0.00720746 41927.8 1
: 138 | 0.203781 0.21461 0.0835484 0.00720729 41917.1 2
: 139 | 0.204506 0.215568 0.0840829 0.00730063 41676.3 3
: 140 | 0.203138 0.213975 0.0838367 0.00721971 41766.2 4
: 141 | 0.204524 0.209916 0.0839085 0.00724726 41742.1 5
: 142 | 0.203557 0.215018 0.0838832 0.0072138 41737.6 6
: 143 | 0.203364 0.211967 0.0836054 0.00720851 41886.5 7
: 144 | 0.202802 0.213637 0.0834848 0.00719332 41944.4 8
: 145 | 0.203439 0.214003 0.0837297 0.00723711 41834.1 9
: 146 Minimum Test error found - save the configuration
: 146 | 0.201782 0.207864 0.0837979 0.00733519 41850.5 0
: 147 | 0.201822 0.215091 0.0838384 0.0072213 41766.1 1
: 148 | 0.201108 0.211555 0.0837553 0.00723482 41818.9 2
: 149 | 0.202877 0.212093 0.0837183 0.00723217 41837.7 3
: 150 | 0.202242 0.216875 0.0837437 0.00721722 41815.6 4
: 151 | 0.202963 0.216674 0.0837591 0.00721918 41808.2 5
: 152 | 0.202122 0.212511 0.08388 0.00722474 41745.3 6
: 153 | 0.201004 0.209889 0.0839183 0.00722269 41723.4 7
: 154 | 0.201004 0.212207 0.0837124 0.00721459 41831.3 8
: 155 | 0.199523 0.216074 0.0836804 0.00721146 41847.1 9
: 156 | 0.200066 0.21936 0.0836678 0.00721534 41856.1 10
: 157 | 0.199925 0.210179 0.0840806 0.00721082 41628.8 11
:
: Elapsed time for training with 4000 events: 13 sec
: Dataset[dataset] : Create results for training
: Dataset[dataset] : Multiclass evaluation of DL_CPU on training sample
: Dataset[dataset] : Elapsed time for evaluation of 4000 events: 0.118 sec
: Creating multiclass response histograms...
: Creating multiclass performance histograms...
: Creating xml weight file: dataset/weights/TMVAMulticlass_DL_CPU.weights.xml
: Creating standalone class: dataset/weights/TMVAMulticlass_DL_CPU.class.C
Factory : Training finished
:
: Ranking input variables (method specific)...
BDTG : Ranking result (top variable is best ranked)
: --------------------------------------
: Rank : Variable : Variable Importance
: --------------------------------------
: 1 : var4 : 3.117e-01
: 2 : var1 : 2.504e-01
: 3 : var2 : 2.430e-01
: 4 : var3 : 1.949e-01
: --------------------------------------
MLP : Ranking result (top variable is best ranked)
: -----------------------------
: Rank : Variable : Importance
: -----------------------------
: 1 : var4 : 6.076e+01
: 2 : var2 : 4.824e+01
: 3 : var1 : 2.116e+01
: 4 : var3 : 1.692e+01
: -----------------------------
PDEFoam : Ranking result (top variable is best ranked)
: --------------------------------------
: Rank : Variable : Variable Importance
: --------------------------------------
: 1 : var4 : 2.991e-01
: 2 : var1 : 2.930e-01
: 3 : var3 : 2.365e-01
: 4 : var2 : 1.714e-01
: --------------------------------------
: No variable ranking supplied by classifier: DL_CPU
TH1.Print Name = TrainingHistory_DL_CPU_trainingError, Entries= 0, Total sum= 39.2372
TH1.Print Name = TrainingHistory_DL_CPU_valError, Entries= 0, Total sum= 38.9762
Factory : === Destroy and recreate all methods via weight files for testing ===
:
: Reading weight file: dataset/weights/TMVAMulticlass_BDTG.weights.xml
: Reading weight file: dataset/weights/TMVAMulticlass_MLP.weights.xml
MLP : Building Network.
: Initializing weights
: Reading weight file: dataset/weights/TMVAMulticlass_PDEFoam.weights.xml
: Read foams from file: dataset/weights/TMVAMulticlass_PDEFoam.weights_foams.root
: Reading weight file: dataset/weights/TMVAMulticlass_DL_CPU.weights.xml
Factory : Test all methods
Factory : Test method: BDTG for Multiclass classification performance
:
: Dataset[dataset] : Create results for testing
: Dataset[dataset] : Multiclass evaluation of BDTG on testing sample
: Dataset[dataset] : Elapsed time for evaluation of 4000 events: 1.01 sec
: Creating multiclass response histograms...
: Creating multiclass performance histograms...
Factory : Test method: MLP for Multiclass classification performance
:
: Dataset[dataset] : Create results for testing
: Dataset[dataset] : Multiclass evaluation of MLP on testing sample
: Dataset[dataset] : Elapsed time for evaluation of 4000 events: 0.0143 sec
: Creating multiclass response histograms...
: Creating multiclass performance histograms...
Factory : Test method: PDEFoam for Multiclass classification performance
:
: Dataset[dataset] : Create results for testing
: Dataset[dataset] : Multiclass evaluation of PDEFoam on testing sample
: Dataset[dataset] : Elapsed time for evaluation of 4000 events: 0.126 sec
: Creating multiclass response histograms...
: Creating multiclass performance histograms...
Factory : Test method: DL_CPU for Multiclass classification performance
:
: Dataset[dataset] : Create results for testing
: Dataset[dataset] : Multiclass evaluation of DL_CPU on testing sample
: Dataset[dataset] : Elapsed time for evaluation of 4000 events: 0.114 sec
: Creating multiclass response histograms...
: Creating multiclass performance histograms...
Factory : Evaluate all methods
: Evaluate multiclass classification method: BDTG
: Creating multiclass response histograms...
: Creating multiclass performance histograms...
: Creating multiclass response histograms...
: Creating multiclass performance histograms...
TFHandler_BDTG : Variable Mean RMS [ Min Max ]
: -----------------------------------------------------------
: var1: 0.070153 1.0224 [ -4.0592 3.5808 ]
: var2: 0.30372 1.0460 [ -3.6952 3.7877 ]
: var3: 0.12152 1.1222 [ -3.6800 3.9200 ]
: var4: -0.072602 1.2766 [ -4.8486 4.2221 ]
: -----------------------------------------------------------
: Evaluate multiclass classification method: MLP
: Creating multiclass response histograms...
: Creating multiclass performance histograms...
: Creating multiclass response histograms...
: Creating multiclass performance histograms...
TFHandler_MLP : Variable Mean RMS [ Min Max ]
: -----------------------------------------------------------
: var1: 0.070153 1.0224 [ -4.0592 3.5808 ]
: var2: 0.30372 1.0460 [ -3.6952 3.7877 ]
: var3: 0.12152 1.1222 [ -3.6800 3.9200 ]
: var4: -0.072602 1.2766 [ -4.8486 4.2221 ]
: -----------------------------------------------------------
: Evaluate multiclass classification method: PDEFoam
: Creating multiclass response histograms...
: Creating multiclass performance histograms...
: Creating multiclass response histograms...
: Creating multiclass performance histograms...
TFHandler_PDEFoam : Variable Mean RMS [ Min Max ]
: -----------------------------------------------------------
: var1: 0.070153 1.0224 [ -4.0592 3.5808 ]
: var2: 0.30372 1.0460 [ -3.6952 3.7877 ]
: var3: 0.12152 1.1222 [ -3.6800 3.9200 ]
: var4: -0.072602 1.2766 [ -4.8486 4.2221 ]
: -----------------------------------------------------------
: Evaluate multiclass classification method: DL_CPU
: Creating multiclass response histograms...
: Creating multiclass performance histograms...
: Creating multiclass response histograms...
: Creating multiclass performance histograms...
TFHandler_DL_CPU : Variable Mean RMS [ Min Max ]
: -----------------------------------------------------------
: var1: 0.077270 0.29534 [ -1.1155 1.0914 ]
: var2: 0.068045 0.27981 [ -1.0016 1.0000 ]
: var3: 0.027548 0.24565 [ -0.80459 0.85902 ]
: var4: -0.034157 0.25816 [ -1.0000 0.83435 ]
: -----------------------------------------------------------
TFHandler_DL_CPU : Variable Mean RMS [ Min Max ]
: -----------------------------------------------------------
: var1: 0.077270 0.29534 [ -1.1155 1.0914 ]
: var2: 0.068045 0.27981 [ -1.0016 1.0000 ]
: var3: 0.027548 0.24565 [ -0.80459 0.85902 ]
: var4: -0.034157 0.25816 [ -1.0000 0.83435 ]
: -----------------------------------------------------------
:
: 1-vs-rest performance metrics per class
: -------------------------------------------------------------------------------------------------------
:
: Considers the listed class as signal and the other classes
: as background, reporting the resulting binary performance.
: A score of 0.820 (0.850) means 0.820 was achieved on the
: test set and 0.850 on the training set.
:
: Dataset MVA Method ROC AUC Sig eff@B=0.01 Sig eff@B=0.10 Sig eff@B=0.30
: Name: / Class: test (train) test (train) test (train) test (train)
:
: dataset BDTG
: ------------------------------
: Signal 0.968 (0.978) 0.508 (0.605) 0.914 (0.945) 0.990 (0.996)
: bg0 0.910 (0.931) 0.256 (0.288) 0.737 (0.791) 0.922 (0.956)
: bg1 0.947 (0.954) 0.437 (0.511) 0.833 (0.856) 0.971 (0.971)
: bg2 0.978 (0.982) 0.585 (0.678) 0.951 (0.956) 0.999 (0.996)
:
: dataset MLP
: ------------------------------
: Signal 0.970 (0.975) 0.596 (0.632) 0.933 (0.938) 0.988 (0.993)
: bg0 0.929 (0.934) 0.303 (0.298) 0.787 (0.793) 0.949 (0.961)
: bg1 0.962 (0.967) 0.467 (0.553) 0.881 (0.906) 0.985 (0.992)
: bg2 0.975 (0.979) 0.629 (0.699) 0.929 (0.940) 0.998 (0.998)
:
: dataset PDEFoam
: ------------------------------
: Signal 0.916 (0.928) 0.294 (0.382) 0.744 (0.782) 0.932 (0.952)
: bg0 0.837 (0.848) 0.109 (0.147) 0.519 (0.543) 0.833 (0.851)
: bg1 0.890 (0.902) 0.190 (0.226) 0.606 (0.646) 0.923 (0.929)
: bg2 0.967 (0.972) 0.510 (0.527) 0.900 (0.926) 0.993 (0.998)
:
: dataset DL_CPU
: ------------------------------
: Signal 0.976 (0.978) 0.633 (0.661) 0.941 (0.947) 0.991 (0.995)
: bg0 0.926 (0.932) 0.321 (0.319) 0.785 (0.789) 0.937 (0.958)
: bg1 0.965 (0.967) 0.473 (0.553) 0.901 (0.902) 0.990 (0.994)
: bg2 0.982 (0.982) 0.688 (0.694) 0.940 (0.957) 1.000 (0.998)
:
: -------------------------------------------------------------------------------------------------------
:
:
: Confusion matrices for all methods
: -------------------------------------------------------------------------------------------------------
:
: Does a binary comparison between the two classes given by a
: particular row-column combination. In each case, the class
: given by the row is considered signal while the class given
: by the column index is considered background.
:
: === Showing confusion matrix for method : BDTG
: (Signal Efficiency for Background Efficiency 0.01%)
: ---------------------------------------------------
: Signal bg0 bg1 bg2
: test (train) test (train) test (train) test (train)
: Signal - 0.497 (0.373) 0.710 (0.693) 0.680 (0.574)
: bg0 0.271 (0.184) - 0.239 (0.145) 0.705 (0.667)
: bg1 0.855 (0.766) 0.369 (0.222) - 0.587 (0.578)
: bg2 0.714 (0.585) 0.705 (0.581) 0.648 (0.601) -
:
: (Signal Efficiency for Background Efficiency 0.10%)
: ---------------------------------------------------
: Signal bg0 bg1 bg2
: test (train) test (train) test (train) test (train)
: Signal - 0.911 (0.853) 0.991 (0.981) 0.945 (0.913)
: bg0 0.833 (0.774) - 0.654 (0.582) 0.930 (0.901)
: bg1 0.971 (0.980) 0.716 (0.681) - 0.871 (0.862)
: bg2 0.976 (0.951) 0.971 (0.973) 0.936 (0.941) -
:
: (Signal Efficiency for Background Efficiency 0.30%)
: ---------------------------------------------------
: Signal bg0 bg1 bg2
: test (train) test (train) test (train) test (train)
: Signal - 0.978 (0.957) 0.999 (1.000) 0.998 (0.997)
: bg0 0.965 (0.926) - 0.874 (0.835) 0.991 (0.976)
: bg1 1.000 (0.999) 0.916 (0.894) - 0.988 (0.985)
: bg2 0.999 (0.999) 0.997 (0.999) 0.996 (0.997) -
:
: === Showing confusion matrix for method : MLP
: (Signal Efficiency for Background Efficiency 0.01%)
: ---------------------------------------------------
: Signal bg0 bg1 bg2
: test (train) test (train) test (train) test (train)
: Signal - 0.465 (0.490) 0.974 (0.953) 0.632 (0.498)
: bg0 0.320 (0.269) - 0.224 (0.250) 0.655 (0.627)
: bg1 0.943 (0.920) 0.341 (0.275) - 0.632 (0.687)
: bg2 0.665 (0.642) 0.697 (0.680) 0.706 (0.598) -
:
: (Signal Efficiency for Background Efficiency 0.10%)
: ---------------------------------------------------
: Signal bg0 bg1 bg2
: test (train) test (train) test (train) test (train)
: Signal - 0.865 (0.854) 0.996 (0.994) 0.908 (0.907)
: bg0 0.784 (0.776) - 0.666 (0.655) 0.919 (0.895)
: bg1 0.998 (0.998) 0.791 (0.785) - 0.912 (0.902)
: bg2 0.943 (0.903) 0.946 (0.939) 0.924 (0.928) -
:
: (Signal Efficiency for Background Efficiency 0.30%)
: ---------------------------------------------------
: Signal bg0 bg1 bg2
: test (train) test (train) test (train) test (train)
: Signal - 0.978 (0.964) 0.997 (0.997) 0.993 (0.986)
: bg0 0.952 (0.924) - 0.936 (0.928) 0.992 (0.990)
: bg1 1.000 (1.000) 0.945 (0.936) - 0.998 (0.995)
: bg2 0.994 (0.985) 0.998 (0.998) 0.998 (0.998) -
:
: === Showing confusion matrix for method : PDEFoam
: (Signal Efficiency for Background Efficiency 0.01%)
: ---------------------------------------------------
: Signal bg0 bg1 bg2
: test (train) test (train) test (train) test (train)
: Signal - 0.289 (0.233) 0.467 (0.436) 0.421 (0.332)
: bg0 0.100 (0.045) - 0.132 (0.116) 0.540 (0.313)
: bg1 0.209 (0.434) 0.153 (0.092) - 0.347 (0.323)
: bg2 0.560 (0.552) 0.445 (0.424) 0.501 (0.506) -
:
: (Signal Efficiency for Background Efficiency 0.10%)
: ---------------------------------------------------
: Signal bg0 bg1 bg2
: test (train) test (train) test (train) test (train)
: Signal - 0.665 (0.640) 0.854 (0.822) 0.807 (0.790)
: bg0 0.538 (0.520) - 0.415 (0.374) 0.843 (0.833)
: bg1 0.885 (0.886) 0.542 (0.491) - 0.728 (0.646)
: bg2 0.928 (0.890) 0.956 (0.959) 0.847 (0.895) -
:
: (Signal Efficiency for Background Efficiency 0.30%)
: ---------------------------------------------------
: Signal bg0 bg1 bg2
: test (train) test (train) test (train) test (train)
: Signal - 0.898 (0.878) 0.971 (0.950) 0.982 (0.975)
: bg0 0.828 (0.810) - 0.696 (0.676) 0.954 (0.951)
: bg1 0.951 (0.966) 0.803 (0.745) - 0.958 (0.966)
: bg2 0.998 (0.991) 0.998 (0.996) 0.998 (0.993) -
:
: === Showing confusion matrix for method : DL_CPU
: (Signal Efficiency for Background Efficiency 0.01%)
: ---------------------------------------------------
: Signal bg0 bg1 bg2
: test (train) test (train) test (train) test (train)
: Signal - 0.484 (0.478) 0.916 (0.933) 0.720 (0.677)
: bg0 0.319 (0.275) - 0.246 (0.246) 0.625 (0.555)
: bg1 0.929 (0.893) 0.333 (0.265) - 0.685 (0.665)
: bg2 0.690 (0.688) 0.694 (0.685) 0.701 (0.680) -
:
: (Signal Efficiency for Background Efficiency 0.10%)
: ---------------------------------------------------
: Signal bg0 bg1 bg2
: test (train) test (train) test (train) test (train)
: Signal - 0.895 (0.887) 0.995 (0.991) 0.933 (0.928)
: bg0 0.785 (0.770) - 0.686 (0.710) 0.883 (0.850)
: bg1 0.997 (0.993) 0.784 (0.803) - 0.867 (0.888)
: bg2 0.979 (0.946) 0.966 (0.949) 0.928 (0.935) -
:
: (Signal Efficiency for Background Efficiency 0.30%)
: ---------------------------------------------------
: Signal bg0 bg1 bg2
: test (train) test (train) test (train) test (train)
: Signal - 0.978 (0.971) 0.998 (0.999) 0.998 (0.999)
: bg0 0.959 (0.936) - 0.931 (0.915) 0.976 (0.970)
: bg1 1.000 (0.998) 0.945 (0.954) - 0.999 (0.995)
: bg2 0.999 (1.000) 0.998 (1.000) 0.998 (0.999) -
:
: -------------------------------------------------------------------------------------------------------
:
Dataset:dataset : Created tree 'TestTree' with 4000 events
:
Dataset:dataset : Created tree 'TrainTree' with 4000 events
:
Factory : Thank you for using TMVA!
: For citation information, please visit: http://tmva.sf.net/citeTMVA.html
==> Wrote root file: TMVAMulticlass.root
==> TMVAMulticlass is done!
#include <cstdlib>
#include <iostream>
#include <map>
#include <string>

#include "TFile.h"
#include "TTree.h"
#include "TString.h"
#include "TSystem.h"
#include "TROOT.h"

#include "TMVA/Tools.h"
#include "TMVA/Factory.h"
#include "TMVA/DataLoader.h"
#include "TMVA/TMVAMultiClassGui.h"

using namespace TMVA;
void TMVAMulticlass( TString myMethodList = "" )
{
   // This loads the library
   TMVA::Tools::Instance();

   // to get access to the GUI and all tmva macros
   //
   //     TString tmva_dir(TString(gRootDir) + "/tmva");
   //     if(gSystem->Getenv("TMVASYS"))
   //        tmva_dir = TString(gSystem->Getenv("TMVASYS"));
   //     gROOT->SetMacroPath(tmva_dir + "/test/:" + gROOT->GetMacroPath() );
   //     gROOT->ProcessLine(".L TMVAMultiClassGui.C");
   //---------------------------------------------------------------
   // Default MVA methods to be trained + tested
   std::map<std::string,int> Use;
   Use["MLP"]     = 1;
   Use["BDTG"]    = 1;
#ifdef R__HAS_TMVAGPU
   Use["DL_CPU"]  = 1;
   Use["DL_GPU"]  = 1;
#else
   Use["DL_CPU"]  = 1;
   Use["DL_GPU"]  = 0;
#endif
   Use["FDA_GA"]  = 0;
   Use["PDEFoam"] = 1;
   //---------------------------------------------------------------

   std::cout << std::endl;
   std::cout << "==> Start TMVAMulticlass" << std::endl;

   if (myMethodList != "") {
      for (std::map<std::string,int>::iterator it = Use.begin(); it != Use.end(); it++) it->second = 0;

      std::vector<TString> mlist = TMVA::gTools().SplitString( myMethodList, ',' );
      for (UInt_t i=0; i<mlist.size(); i++) {
         std::string regMethod(mlist[i]);

         if (Use.find(regMethod) == Use.end()) {
            std::cout << "Method \"" << regMethod << "\" not known in TMVA under this name. Choose among the following:" << std::endl;
            for (std::map<std::string,int>::iterator it = Use.begin(); it != Use.end(); it++) std::cout << it->first << " ";
            std::cout << std::endl;
            return;
         }
         Use[regMethod] = 1;
      }
   }
   // Create a new root output file.
   TString outfileName = "TMVAMulticlass.root";
   TFile* outputFile = TFile::Open( outfileName, "RECREATE" );

   TMVA::Factory *factory = new TMVA::Factory( "TMVAMulticlass", outputFile,
      "!V:!Silent:Color:DrawProgressBar:Transformations=I;D;P;G,D:AnalysisType=multiclass" );

   TMVA::DataLoader *dataloader = new TMVA::DataLoader("dataset");

   dataloader->AddVariable( "var1", 'F' );
   dataloader->AddVariable( "var2", "Variable 2", "",      'F' );
   dataloader->AddVariable( "var3", "Variable 3", "units", 'F' );
   dataloader->AddVariable( "var4", "Variable 4", "units", 'F' );

   TFile *input(0);
   TString fname = "./tmva_example_multiclass.root";
   if (!gSystem->AccessPathName( fname )) {
      input = TFile::Open( fname ); // check if file in local directory exists
   }
   else {
      input = TFile::Open("http://root.cern.ch/files/tmva_multiclass_example.root", "CACHEREAD");
   }
   if (!input) {
      std::cout << "ERROR: could not open data file" << std::endl;
      exit(1);
   }
   std::cout << "--- TMVAMulticlass: Using input file: " << input->GetName() << std::endl;

   TTree *signalTree  = (TTree*)input->Get("TreeS");
   TTree *background0 = (TTree*)input->Get("TreeB0");
   TTree *background1 = (TTree*)input->Get("TreeB1");
   TTree *background2 = (TTree*)input->Get("TreeB2");

   gROOT->cd( outfileName+TString(":/") );
   dataloader->AddTree( signalTree,  "Signal" );
   dataloader->AddTree( background0, "bg0" );
   dataloader->AddTree( background1, "bg1" );
   dataloader->AddTree( background2, "bg2" );

   dataloader->PrepareTrainingAndTestTree( "", "SplitMode=Random:NormMode=NumEvents:!V" );
   if (Use["BDTG"]) // gradient boosted decision trees
      factory->BookMethod( dataloader, TMVA::Types::kBDT, "BDTG",
         "!H:!V:NTrees=1000:BoostType=Grad:Shrinkage=0.10:UseBaggedBoost:BaggedSampleFraction=0.50:nCuts=20:MaxDepth=2");
   if (Use["MLP"]) // neural network
      factory->BookMethod( dataloader, TMVA::Types::kMLP, "MLP",
         "!H:!V:NeuronType=tanh:NCycles=1000:HiddenLayers=N+5,5:TestRate=5:EstimatorType=MSE");
   if (Use["FDA_GA"]) // functional discriminant with GA minimizer
      factory->BookMethod( dataloader, TMVA::Types::kFDA, "FDA_GA",
         "H:!V:Formula=(0)+(1)*x0+(2)*x1+(3)*x2+(4)*x3:ParRanges=(-1,1);(-10,10);(-10,10);(-10,10);(-10,10):FitMethod=GA:PopSize=300:Cycles=3:Steps=20:Trim=True:SaveBestGen=1" );
   if (Use["PDEFoam"]) // PDE-Foam approach
      factory->BookMethod( dataloader, TMVA::Types::kPDEFoam, "PDEFoam",
         "!H:!V:TailCut=0.001:VolFrac=0.0666:nActiveCells=500:nSampl=2000:nBin=5:Nmin=100:Kernel=None:Compress=T" );
   if (Use["DL_CPU"]) {
      TString layoutString("Layout=TANH|100,TANH|50,TANH|10,LINEAR");
      TString trainingStrategyString("TrainingStrategy=Optimizer=ADAM,LearningRate=1e-3,"
                                     "TestRepetitions=1,ConvergenceSteps=10,BatchSize=100");
      TString nnOptions("!H:V:ErrorStrategy=CROSSENTROPY:VarTransform=N:"
                        "WeightInitialization=XAVIERUNIFORM:Architecture=CPU");
      nnOptions.Append(":");
      nnOptions.Append(layoutString);
      nnOptions.Append(":");
      nnOptions.Append(trainingStrategyString);
      factory->BookMethod(dataloader, TMVA::Types::kDL, "DL_CPU", nnOptions);
   }
   if (Use["DL_GPU"]) {
      TString layoutString("Layout=TANH|100,TANH|50,TANH|10,LINEAR");
      TString trainingStrategyString("TrainingStrategy=Optimizer=ADAM,LearningRate=1e-3,"
                                     "TestRepetitions=1,ConvergenceSteps=10,BatchSize=100");
      TString nnOptions("!H:V:ErrorStrategy=CROSSENTROPY:VarTransform=N:"
                        "WeightInitialization=XAVIERUNIFORM:Architecture=GPU");
      nnOptions.Append(":");
      nnOptions.Append(layoutString);
      nnOptions.Append(":");
      nnOptions.Append(trainingStrategyString);
      factory->BookMethod(dataloader, TMVA::Types::kDL, "DL_GPU", nnOptions);
   }
   // Train MVAs using the set of training events
   factory->TrainAllMethods();

   // Evaluate all MVAs using the set of test events
   factory->TestAllMethods();

   // Evaluate and compare performance of all configured MVAs
   factory->EvaluateAllMethods();

   // --------------------------------------------------------------

   // Save the output
   outputFile->Close();

   std::cout << "==> Wrote root file: " << outputFile->GetName() << std::endl;
   std::cout << "==> TMVAMulticlass is done!" << std::endl;

   delete factory;
   delete dataloader;

   // Launch the GUI for the root macros
   if (!gROOT->IsBatch()) TMVAMultiClassGui( outfileName );
}
int main( int argc, char** argv )
{
   // Select methods (don't look at this code - not of interest)
   TString methodList;
   for (int i=1; i<argc; i++) {
      TString regMethod(argv[i]);
      if (regMethod=="-b" || regMethod=="--batch") continue;
      if (!methodList.IsNull()) methodList += TString(",");
      methodList += regMethod;
   }
   TMVAMulticlass(methodList);
   return 0;
}
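The macro can be run from the ROOT prompt or the command line, assuming a ROOT installation with TMVA enabled; the method names are those registered in the `Use` map above:

```shell
# run all default methods in batch mode
root -l -b -q TMVAMulticlass.C

# run only a subset of methods (comma-separated list)
root -l -b -q 'TMVAMulticlass.C("BDTG,MLP")'
```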
Author
Andreas Hoecker

Definition in file TMVAMulticlass.C.
