Running with nthreads  = 4
DataSetInfo              : [dataset] : Added class "Signal"
                         : Add Tree sig_tree of type Signal with 1000 events
DataSetInfo              : [dataset] : Added class "Background"
                         : Add Tree bkg_tree of type Background with 1000 events
Factory                  : Booking method: BDT
                         : 
                         : Rebuilding Dataset dataset
                         : Building event vectors for type 2 Signal
                         : Dataset[dataset] :  create input formulas for tree sig_tree
                         : Using variable vars[0] from array expression vars of size 256
                         : Building event vectors for type 2 Background
                         : Dataset[dataset] :  create input formulas for tree bkg_tree
                         : Using variable vars[0] from array expression vars of size 256
DataSetFactory           : [dataset] : Number of events in input trees
                         : 
                         : 
                         : Number of training and testing events
                         : ---------------------------------------------------------------------------
                         : Signal     -- training events            : 800
                         : Signal     -- testing events             : 200
                         : Signal     -- training and testing events: 1000
                         : Background -- training events            : 800
                         : Background -- testing events             : 200
                         : Background -- training and testing events: 1000
                         : 
Factory                  : Booking method: TMVA_DNN_CPU
                         : 
                         : Parsing option string: 
                         : ... "!H:V:ErrorStrategy=CROSSENTROPY:VarTransform=None:WeightInitialization=XAVIER:Layout=DENSE|100|RELU,BNORM,DENSE|100|RELU,BNORM,DENSE|100|RELU,BNORM,DENSE|100|RELU,DENSE|1|LINEAR:TrainingStrategy=LearningRate=1e-3,Momentum=0.9,Repetitions=1,ConvergenceSteps=5,BatchSize=100,TestRepetitions=1,MaxEpochs=10,WeightDecay=1e-4,Regularization=None,Optimizer=ADAM,DropConfig=0.0+0.0+0.0+0.:Architecture=CPU"
                         : The following options are set:
                         : - By User:
                         :     <none>
                         : - Default:
                         :     Boost_num: "0" [Number of times the classifier will be boosted]
                         : Parsing option string: 
                         : ... "!H:V:ErrorStrategy=CROSSENTROPY:VarTransform=None:WeightInitialization=XAVIER:Layout=DENSE|100|RELU,BNORM,DENSE|100|RELU,BNORM,DENSE|100|RELU,BNORM,DENSE|100|RELU,DENSE|1|LINEAR:TrainingStrategy=LearningRate=1e-3,Momentum=0.9,Repetitions=1,ConvergenceSteps=5,BatchSize=100,TestRepetitions=1,MaxEpochs=10,WeightDecay=1e-4,Regularization=None,Optimizer=ADAM,DropConfig=0.0+0.0+0.0+0.:Architecture=CPU"
                         : The following options are set:
                         : - By User:
                         :     V: "True" [Verbose output (short form of "VerbosityLevel" below - overrides the latter one)]
                         :     VarTransform: "None" [List of variable transformations performed before training, e.g., "D_Background,P_Signal,G,N_AllClasses" for: "Decorrelation, PCA-transformation, Gaussianisation, Normalisation, each for the given class of events ('AllClasses' denotes all events of all classes, if no class indication is given, 'All' is assumed)"]
                         :     H: "False" [Print method-specific help message]
                         :     Layout: "DENSE|100|RELU,BNORM,DENSE|100|RELU,BNORM,DENSE|100|RELU,BNORM,DENSE|100|RELU,DENSE|1|LINEAR" [Layout of the network.]
                         :     ErrorStrategy: "CROSSENTROPY" [Loss function: Mean squared error (regression) or cross entropy (binary classification).]
                         :     WeightInitialization: "XAVIER" [Weight initialization strategy]
                         :     Architecture: "CPU" [Which architecture to perform the training on.]
                         :     TrainingStrategy: "LearningRate=1e-3,Momentum=0.9,Repetitions=1,ConvergenceSteps=5,BatchSize=100,TestRepetitions=1,MaxEpochs=10,WeightDecay=1e-4,Regularization=None,Optimizer=ADAM,DropConfig=0.0+0.0+0.0+0." [Defines the training strategies.]
                         : - Default:
                         :     VerbosityLevel: "Default" [Verbosity level]
                         :     CreateMVAPdfs: "False" [Create PDFs for classifier outputs (signal and background)]
                         :     IgnoreNegWeightsInTraining: "False" [Events with negative weights are ignored in the training (but are included for testing and performance evaluation)]
                         :     InputLayout: "0|0|0" [The Layout of the input]
                         :     BatchLayout: "0|0|0" [The Layout of the batch]
                         :     RandomSeed: "0" [Random seed used for weight initialization and batch shuffling]
                         :     ValidationSize: "20%" [Part of the training data to use for validation. Specify as 0.2 or 20% to use a fifth of the data set as validation set. Specify as 100 to use exactly 100 events. (Default: 20%)]
                         : Will now use the CPU architecture with BLAS and IMT support !
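The `ValidationSize: "20%"` option above is what produces the 1280/320 train/validation split reported once training starts. A quick arithmetic check (plain Python; the variable names are mine, not TMVA's):

```python
# 800 signal + 800 background events enter the training sample (per the log).
n_train_total = 800 + 800
n_validation = int(n_train_total * 0.20)   # ValidationSize=20% held out
n_train = n_train_total - n_validation
print(n_train, n_validation)  # 1280 320
```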
Factory                  : Booking method: TMVA_CNN_CPU
                         : 
                         : Parsing option string: 
                         : ... "!H:V:ErrorStrategy=CROSSENTROPY:VarTransform=None:WeightInitialization=XAVIER:InputLayout=1|16|16:Layout=CONV|10|3|3|1|1|1|1|RELU,BNORM,CONV|10|3|3|1|1|1|1|RELU,MAXPOOL|2|2|1|1,RESHAPE|FLAT,DENSE|100|RELU,DENSE|1|LINEAR:TrainingStrategy=LearningRate=1e-3,Momentum=0.9,Repetitions=1,ConvergenceSteps=5,BatchSize=100,TestRepetitions=1,MaxEpochs=10,WeightDecay=1e-4,Regularization=None,Optimizer=ADAM,DropConfig=0.0+0.0+0.0+0.0:Architecture=CPU"
                         : The following options are set:
                         : - By User:
                         :     <none>
                         : - Default:
                         :     Boost_num: "0" [Number of times the classifier will be boosted]
                         : Parsing option string: 
                         : ... "!H:V:ErrorStrategy=CROSSENTROPY:VarTransform=None:WeightInitialization=XAVIER:InputLayout=1|16|16:Layout=CONV|10|3|3|1|1|1|1|RELU,BNORM,CONV|10|3|3|1|1|1|1|RELU,MAXPOOL|2|2|1|1,RESHAPE|FLAT,DENSE|100|RELU,DENSE|1|LINEAR:TrainingStrategy=LearningRate=1e-3,Momentum=0.9,Repetitions=1,ConvergenceSteps=5,BatchSize=100,TestRepetitions=1,MaxEpochs=10,WeightDecay=1e-4,Regularization=None,Optimizer=ADAM,DropConfig=0.0+0.0+0.0+0.0:Architecture=CPU"
                         : The following options are set:
                         : - By User:
                         :     V: "True" [Verbose output (short form of "VerbosityLevel" below - overrides the latter one)]
                         :     VarTransform: "None" [List of variable transformations performed before training, e.g., "D_Background,P_Signal,G,N_AllClasses" for: "Decorrelation, PCA-transformation, Gaussianisation, Normalisation, each for the given class of events ('AllClasses' denotes all events of all classes, if no class indication is given, 'All' is assumed)"]
                         :     H: "False" [Print method-specific help message]
                         :     InputLayout: "1|16|16" [The Layout of the input]
                         :     Layout: "CONV|10|3|3|1|1|1|1|RELU,BNORM,CONV|10|3|3|1|1|1|1|RELU,MAXPOOL|2|2|1|1,RESHAPE|FLAT,DENSE|100|RELU,DENSE|1|LINEAR" [Layout of the network.]
                         :     ErrorStrategy: "CROSSENTROPY" [Loss function: Mean squared error (regression) or cross entropy (binary classification).]
                         :     WeightInitialization: "XAVIER" [Weight initialization strategy]
                         :     Architecture: "CPU" [Which architecture to perform the training on.]
                         :     TrainingStrategy: "LearningRate=1e-3,Momentum=0.9,Repetitions=1,ConvergenceSteps=5,BatchSize=100,TestRepetitions=1,MaxEpochs=10,WeightDecay=1e-4,Regularization=None,Optimizer=ADAM,DropConfig=0.0+0.0+0.0+0.0" [Defines the training strategies.]
                         : - Default:
                         :     VerbosityLevel: "Default" [Verbosity level]
                         :     CreateMVAPdfs: "False" [Create PDFs for classifier outputs (signal and background)]
                         :     IgnoreNegWeightsInTraining: "False" [Events with negative weights are ignored in the training (but are included for testing and performance evaluation)]
                         :     BatchLayout: "0|0|0" [The Layout of the batch]
                         :     RandomSeed: "0" [Random seed used for weight initialization and batch shuffling]
                         :     ValidationSize: "20%" [Part of the training data to use for validation. Specify as 0.2 or 20% to use a fifth of the data set as validation set. Specify as 100 to use exactly 100 events. (Default: 20%)]
                         : Will now use the CPU architecture with BLAS and IMT support !
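As the parsed options show, a TMVA `TrainingStrategy` is a single comma-separated string of `key=value` pairs. A minimal illustration of how such a string decomposes (plain Python, not TMVA's actual parser):

```python
# TrainingStrategy string copied from the log above.
strategy = ("LearningRate=1e-3,Momentum=0.9,Repetitions=1,ConvergenceSteps=5,"
            "BatchSize=100,TestRepetitions=1,MaxEpochs=10,WeightDecay=1e-4,"
            "Regularization=None,Optimizer=ADAM,DropConfig=0.0+0.0+0.0+0.0")
# Split on commas, then on the first '=' of each pair.
options = dict(item.split("=", 1) for item in strategy.split(","))
print(options["Optimizer"], options["BatchSize"])  # ADAM 100
```

Note that `DropConfig` uses `+` as its internal separator precisely so it survives the comma split.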
Factory                  : Train all methods
Factory                  : Train method: BDT for Classification
                         : 
BDT                      : #events: (reweighted) sig: 800 bkg: 800
                         : #events: (unweighted) sig: 800 bkg: 800
                         : Training 200 Decision Trees ... patience please
                         : Elapsed time for training with 1600 events: 0.856 sec         
BDT                      : [dataset] : Evaluation of BDT on training sample (1600 events)
                         : Elapsed time for evaluation of 1600 events: 0.00917 sec       
                         : Creating xml weight file: dataset/weights/TMVA_CNN_Classification_BDT.weights.xml
                         : Creating standalone class: dataset/weights/TMVA_CNN_Classification_BDT.class.C
                         : TMVA_CNN_ClassificationOutput.root:/dataset/Method_BDT/BDT
Factory                  : Training finished
                         : 
Factory                  : Train method: TMVA_DNN_CPU for Classification
                         : 
                         : Start of deep neural network training on CPU using MT,  nthreads = 4
                         : 
                         : *****   Deep Learning Network *****
DEEP NEURAL NETWORK:   Depth = 8  Input = ( 1, 1, 256 )  Batch size = 100  Loss function = C
   Layer 0   DENSE Layer:   ( Input =   256 , Width =   100 )  Output = (  1 ,   100 ,   100 )   Activation Function = Relu
   Layer 1   BATCH NORM Layer:    Input/Output = ( 100 , 100 , 1 )    Norm dim =   100  axis = -1
 
   Layer 2   DENSE Layer:   ( Input =   100 , Width =   100 )  Output = (  1 ,   100 ,   100 )   Activation Function = Relu
   Layer 3   BATCH NORM Layer:    Input/Output = ( 100 , 100 , 1 )    Norm dim =   100  axis = -1
 
   Layer 4   DENSE Layer:   ( Input =   100 , Width =   100 )  Output = (  1 ,   100 ,   100 )   Activation Function = Relu
   Layer 5   BATCH NORM Layer:    Input/Output = ( 100 , 100 , 1 )    Norm dim =   100  axis = -1
 
   Layer 6   DENSE Layer:   ( Input =   100 , Width =   100 )  Output = (  1 ,   100 ,   100 )   Activation Function = Relu
   Layer 7   DENSE Layer:   ( Input =   100 , Width =     1 )  Output = (  1 ,   100 ,     1 )   Activation Function = Identity
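The layer widths printed above fix the network's parameter count. A sketch tallying weights and biases for the five dense layers (batch-norm parameters excluded for simplicity; the tally is my own arithmetic, not a number TMVA prints):

```python
# (n_in, n_out) for each DENSE layer in the printed topology.
dense_layers = [(256, 100), (100, 100), (100, 100), (100, 100), (100, 1)]
# Each layer contributes n_in*n_out weights plus n_out biases.
n_params = sum(n_in * n_out + n_out for n_in, n_out in dense_layers)
print(n_params)  # 56101
```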
                         : Using 1280 events for training and 320 for testing
                         : Compute initial loss  on the validation data 
                         : Training phase 1 of 1:  Optimizer ADAM (beta1=0.9,beta2=0.999,eps=1e-07) Learning rate = 0.001 regularization 0 minimum error = 70.7592
                         : --------------------------------------------------------------
                         :      Epoch |   Train Err.   Val. Err.  t(s)/epoch   t(s)/Loss   nEvents/s Conv. Steps
                         : --------------------------------------------------------------
                         :    Start epoch iteration ...
                         :          1 Minimum Test error found - save the configuration 
                         :          1 |      0.94074    0.968124    0.131438   0.0154636     10347.2           0
                         :          2 Minimum Test error found - save the configuration 
                         :          2 |     0.703334    0.783123    0.119331   0.0119066     11170.6           0
                         :          3 Minimum Test error found - save the configuration 
                         :          3 |     0.589155    0.722468    0.147514    0.017118     9202.75           0
                         :          4 Minimum Test error found - save the configuration 
                         :          4 |     0.509683    0.671764     0.15428   0.0128141     8482.61           0
                         :          5 |     0.438998    0.687484    0.118798   0.0118839       11224           1
                         :          6 |     0.390283     0.69251    0.125828   0.0124177       10581           2
                         :          7 |     0.359919    0.685568    0.125937    0.012847       10611           3
                         :          8 |     0.299083    0.684519     0.11628   0.0120905     11517.4           4
                         :          9 Minimum Test error found - save the configuration 
                         :          9 |     0.254566    0.665318    0.130274   0.0130577     10237.5           0
                         :         10 |     0.229768    0.683192    0.123444   0.0120913     10776.6           1
                         : 
                         : Elapsed time for training with 1600 events: 1.32 sec         
                         : Evaluate deep neural network on CPU using batches with size = 100
                         : 
TMVA_DNN_CPU             : [dataset] : Evaluation of TMVA_DNN_CPU on training sample (1600 events)
                         : Elapsed time for evaluation of 1600 events: 0.0658 sec       
                         : Creating xml weight file: dataset/weights/TMVA_CNN_Classification_TMVA_DNN_CPU.weights.xml
                         : Creating standalone class: dataset/weights/TMVA_CNN_Classification_TMVA_DNN_CPU.class.C
Factory                  : Training finished
                         : 
Factory                  : Train method: TMVA_CNN_CPU for Classification
                         : 
                         : Start of deep neural network training on CPU using MT,  nthreads = 4
                         : 
                         : *****   Deep Learning Network *****
DEEP NEURAL NETWORK:   Depth = 7  Input = ( 1, 16, 16 )  Batch size = 100  Loss function = C
   Layer 0   CONV LAYER:   ( W = 16 ,  H = 16 ,  D = 10 )    Filter ( W = 3 ,  H = 3 )    Output = ( 100 , 10 , 10 , 256 )     Activation Function = Relu
   Layer 1   BATCH NORM Layer:    Input/Output = ( 10 , 256 , 100 )   Norm dim =    10  axis = 1
 
   Layer 2   CONV LAYER:   ( W = 16 ,  H = 16 ,  D = 10 )    Filter ( W = 3 ,  H = 3 )    Output = ( 100 , 10 , 10 , 256 )     Activation Function = Relu
   Layer 3   POOL Layer:   ( W = 15 ,  H = 15 ,  D = 10 )    Filter ( W = 2 ,  H = 2 )    Output = ( 100 , 10 , 10 , 225 ) 
   Layer 4   RESHAPE Layer     Input = ( 10 , 15 , 15 )  Output = ( 1 , 100 , 2250 ) 
   Layer 5   DENSE Layer:   ( Input =  2250 , Width =   100 )  Output = (  1 ,   100 ,   100 )   Activation Function = Relu
   Layer 6   DENSE Layer:   ( Input =   100 , Width =     1 )  Output = (  1 ,   100 ,     1 )   Activation Function = Identity
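The shapes in the CNN printout follow from the standard convolution/pooling output-size formula. A sketch reproducing the logged dimensions (function and variable names are mine), showing where the dense layer's input width of 2250 comes from:

```python
def out_dim(size, filt, stride, pad):
    # Standard output-size formula for convolution/pooling.
    return (size + 2 * pad - filt) // stride + 1

# CONV|10|3|3|1|1|1|1: 3x3 filter, stride 1, padding 1 keeps 16x16.
conv = out_dim(16, filt=3, stride=1, pad=1)
# MAXPOOL|2|2|1|1: 2x2 window, stride 1, no padding -> 15x15.
pool = out_dim(conv, filt=2, stride=1, pad=0)
# RESHAPE|FLAT: 10 feature maps of 15x15 flatten to the dense input.
flat = 10 * pool * pool
print(conv, pool, flat)  # 16 15 2250
```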
                         : Using 1280 events for training and 320 for testing
                         : Compute initial loss  on the validation data 
                         : Training phase 1 of 1:  Optimizer ADAM (beta1=0.9,beta2=0.999,eps=1e-07) Learning rate = 0.001 regularization 0 minimum error = 81.2146
                         : --------------------------------------------------------------
                         :      Epoch |   Train Err.   Val. Err.  t(s)/epoch   t(s)/Loss   nEvents/s Conv. Steps
                         : --------------------------------------------------------------
                         :    Start epoch iteration ...
                         :          1 Minimum Test error found - save the configuration 
                         :          1 |      1.77763    0.729008    0.950787   0.0854753     1386.78           0
                         :          2 Minimum Test error found - save the configuration 
                         :          2 |     0.824963    0.718483    0.962825   0.0760971     1353.29           0
                         :          3 Minimum Test error found - save the configuration 
                         :          3 |     0.695163    0.683414    0.878163   0.0753567     1494.76           0
                         :          4 Minimum Test error found - save the configuration 
                         :          4 |     0.674264    0.675706     1.05573   0.0809404     1231.04           0
                         :          5 Minimum Test error found - save the configuration 
                         :          5 |     0.669128    0.669623    0.897981   0.0885406     1482.51           0
                         :          6 Minimum Test error found - save the configuration 
                         :          6 |     0.643772    0.645234    0.975632   0.0942706     1361.53           0
                         :          7 Minimum Test error found - save the configuration 
                         :          7 |     0.631765    0.629899    0.929489   0.0723486        1400           0
                         :          8 |      0.60654    0.641677    0.917316    0.110693     1487.68           1
                         :          9 Minimum Test error found - save the configuration 
                         :          9 |     0.601893    0.622961     1.00362    0.086273     1308.12           0
                         :         10 |     0.578338    0.623074    0.919736    0.111236     1484.23           1
                         : 
                         : Elapsed time for training with 1600 events: 9.58 sec         
                         : Evaluate deep neural network on CPU using batches with size = 100
                         : 
TMVA_CNN_CPU             : [dataset] : Evaluation of TMVA_CNN_CPU on training sample (1600 events)
                         : Elapsed time for evaluation of 1600 events: 0.546 sec       
                         : Creating xml weight file: dataset/weights/TMVA_CNN_Classification_TMVA_CNN_CPU.weights.xml
                         : Creating standalone class: dataset/weights/TMVA_CNN_Classification_TMVA_CNN_CPU.class.C
Factory                  : Training finished
                         : 
                         : Ranking input variables (method specific)...
BDT                      : Ranking result (top variable is best ranked)
                         : --------------------------------------
                         : Rank : Variable  : Variable Importance
                         : --------------------------------------
                         :    1 : vars      : 1.042e-02
                         :    2 : vars      : 9.804e-03
                         :    3 : vars      : 9.785e-03
                         :    4 : vars      : 9.386e-03
                         :    5 : vars      : 9.109e-03
                         :    6 : vars      : 8.809e-03
                         :    7 : vars      : 8.571e-03
                         :    8 : vars      : 8.414e-03
                         :    9 : vars      : 8.406e-03
                         :   10 : vars      : 8.293e-03
                         :   11 : vars      : 8.183e-03
                         :   12 : vars      : 8.154e-03
                         :   13 : vars      : 7.936e-03
                         :   14 : vars      : 7.895e-03
                         :   15 : vars      : 7.865e-03
                         :   16 : vars      : 7.785e-03
                         :   17 : vars      : 7.725e-03
                         :   18 : vars      : 7.702e-03
                         :   19 : vars      : 7.566e-03
                         :   20 : vars      : 7.515e-03
                         :   21 : vars      : 7.487e-03
                         :   22 : vars      : 7.309e-03
                         :   23 : vars      : 7.225e-03
                         :   24 : vars      : 7.059e-03
                         :   25 : vars      : 7.053e-03
                         :   26 : vars      : 7.046e-03
                         :   27 : vars      : 6.989e-03
                         :   28 : vars      : 6.897e-03
                         :   29 : vars      : 6.856e-03
                         :   30 : vars      : 6.844e-03
                         :   31 : vars      : 6.839e-03
                         :   32 : vars      : 6.815e-03
                         :   33 : vars      : 6.765e-03
                         :   34 : vars      : 6.689e-03
                         :   35 : vars      : 6.676e-03
                         :   36 : vars      : 6.667e-03
                         :   37 : vars      : 6.628e-03
                         :   38 : vars      : 6.607e-03
                         :   39 : vars      : 6.584e-03
                         :   40 : vars      : 6.538e-03
                         :   41 : vars      : 6.419e-03
                         :   42 : vars      : 6.373e-03
                         :   43 : vars      : 6.358e-03
                         :   44 : vars      : 6.306e-03
                         :   45 : vars      : 6.295e-03
                         :   46 : vars      : 6.252e-03
                         :   47 : vars      : 6.164e-03
                         :   48 : vars      : 6.154e-03
                         :   49 : vars      : 6.130e-03
                         :   50 : vars      : 6.124e-03
                         :   51 : vars      : 6.106e-03
                         :   52 : vars      : 6.069e-03
                         :   53 : vars      : 6.066e-03
                         :   54 : vars      : 6.049e-03
                         :   55 : vars      : 5.969e-03
                         :   56 : vars      : 5.889e-03
                         :   57 : vars      : 5.889e-03
                         :   58 : vars      : 5.873e-03
                         :   59 : vars      : 5.869e-03
                         :   60 : vars      : 5.858e-03
                         :   61 : vars      : 5.856e-03
                         :   62 : vars      : 5.846e-03
                         :   63 : vars      : 5.815e-03
                         :   64 : vars      : 5.738e-03
                         :   65 : vars      : 5.664e-03
                         :   66 : vars      : 5.640e-03
                         :   67 : vars      : 5.632e-03
                         :   68 : vars      : 5.613e-03
                         :   69 : vars      : 5.548e-03
                         :   70 : vars      : 5.545e-03
                         :   71 : vars      : 5.543e-03
                         :   72 : vars      : 5.542e-03
                         :   73 : vars      : 5.541e-03
                         :   74 : vars      : 5.508e-03
                         :   75 : vars      : 5.485e-03
                         :   76 : vars      : 5.436e-03
                         :   77 : vars      : 5.342e-03
                         :   78 : vars      : 5.308e-03
                         :   79 : vars      : 5.239e-03
                         :   80 : vars      : 5.232e-03
                         :   81 : vars      : 5.209e-03
                         :   82 : vars      : 5.187e-03
                         :   83 : vars      : 5.153e-03
                         :   84 : vars      : 5.144e-03
                         :   85 : vars      : 5.135e-03
                         :   86 : vars      : 5.090e-03
                         :   87 : vars      : 5.076e-03
                         :   88 : vars      : 5.062e-03
                         :   89 : vars      : 5.060e-03
                         :   90 : vars      : 5.034e-03
                         :   91 : vars      : 5.008e-03
                         :   92 : vars      : 5.001e-03
                         :   93 : vars      : 4.945e-03
                         :   94 : vars      : 4.892e-03
                         :   95 : vars      : 4.847e-03
                         :   96 : vars      : 4.784e-03
                         :   97 : vars      : 4.783e-03
                         :   98 : vars      : 4.780e-03
                         :   99 : vars      : 4.774e-03
                         :  100 : vars      : 4.772e-03
                         :  101 : vars      : 4.722e-03
                         :  102 : vars      : 4.709e-03
                         :  103 : vars      : 4.707e-03
                         :  104 : vars      : 4.694e-03
                         :  105 : vars      : 4.686e-03
                         :  106 : vars      : 4.658e-03
                         :  107 : vars      : 4.622e-03
                         :  108 : vars      : 4.605e-03
                         :  109 : vars      : 4.541e-03
                         :  110 : vars      : 4.520e-03
                         :  111 : vars      : 4.519e-03
                         :  112 : vars      : 4.507e-03
                         :  113 : vars      : 4.480e-03
                         :  114 : vars      : 4.387e-03
                         :  115 : vars      : 4.362e-03
                         :  116 : vars      : 4.361e-03
                         :  117 : vars      : 4.351e-03
                         :  118 : vars      : 4.298e-03
                         :  119 : vars      : 4.287e-03
                         :  120 : vars      : 4.280e-03
                         :  121 : vars      : 4.269e-03
                         :  122 : vars      : 4.266e-03
                         :  123 : vars      : 4.227e-03
                         :  124 : vars      : 4.205e-03
                         :  125 : vars      : 4.167e-03
                         :  126 : vars      : 4.161e-03
                         :  127 : vars      : 4.128e-03
                         :  128 : vars      : 4.087e-03
                         :  129 : vars      : 4.082e-03
                         :  130 : vars      : 4.075e-03
                         :  131 : vars      : 4.055e-03
                         :  132 : vars      : 4.031e-03
                         :  133 : vars      : 3.994e-03
                         :  134 : vars      : 3.983e-03
                         :  135 : vars      : 3.845e-03
                         :  136 : vars      : 3.815e-03
                         :  137 : vars      : 3.809e-03
                         :  138 : vars      : 3.799e-03
                         :  139 : vars      : 3.781e-03
                         :  140 : vars      : 3.769e-03
                         :  141 : vars      : 3.760e-03
                         :  142 : vars      : 3.710e-03
                         :  143 : vars      : 3.692e-03
                         :  144 : vars      : 3.663e-03
                         :  145 : vars      : 3.659e-03
                         :  146 : vars      : 3.635e-03
                         :  147 : vars      : 3.630e-03
                         :  148 : vars      : 3.625e-03
                         :  149 : vars      : 3.621e-03
                         :  150 : vars      : 3.593e-03
                         :  151 : vars      : 3.592e-03
                         :  152 : vars      : 3.591e-03
                         :  153 : vars      : 3.591e-03
                         :  154 : vars      : 3.590e-03
                         :  155 : vars      : 3.573e-03
                         :  156 : vars      : 3.552e-03
                         :  157 : vars      : 3.537e-03
                         :  158 : vars      : 3.511e-03
                         :  159 : vars      : 3.494e-03
                         :  160 : vars      : 3.440e-03
                         :  161 : vars      : 3.439e-03
                         :  162 : vars      : 3.420e-03
                         :  163 : vars      : 3.363e-03
                         :  164 : vars      : 3.337e-03
                         :  165 : vars      : 3.305e-03
                         :  166 : vars      : 3.287e-03
                         :  167 : vars      : 3.197e-03
                         :  168 : vars      : 3.162e-03
                         :  169 : vars      : 3.152e-03
                         :  170 : vars      : 3.076e-03
                         :  171 : vars      : 3.069e-03
                         :  172 : vars      : 3.036e-03
                         :  173 : vars      : 3.021e-03
                         :  174 : vars      : 2.950e-03
                         :  175 : vars      : 2.917e-03
                         :  176 : vars      : 2.887e-03
                         :  177 : vars      : 2.868e-03
                         :  178 : vars      : 2.850e-03
                         :  179 : vars      : 2.799e-03
                         :  180 : vars      : 2.787e-03
                         :  181 : vars      : 2.777e-03
                         :  182 : vars      : 2.749e-03
                         :  183 : vars      : 2.721e-03
                         :  184 : vars      : 2.721e-03
                         :  185 : vars      : 2.641e-03
                         :  186 : vars      : 2.628e-03
                         :  187 : vars      : 2.625e-03
                         :  188 : vars      : 2.598e-03
                         :  189 : vars      : 2.597e-03
                         :  190 : vars      : 2.576e-03
                         :  191 : vars      : 2.574e-03
                         :  192 : vars      : 2.465e-03
                         :  193 : vars      : 2.427e-03
                         :  194 : vars      : 2.262e-03
                         :  195 : vars      : 2.261e-03
                         :  196 : vars      : 2.226e-03
                         :  197 : vars      : 2.172e-03
                         :  198 : vars      : 1.947e-03
                         :  199 : vars      : 1.821e-03
                         :  200 : vars      : 1.728e-03
                         :  201 : vars      : 1.673e-03
                         :  202 : vars      : 1.649e-03
                         :  203 : vars      : 1.413e-03
                         :  204 : vars      : 1.375e-03
                         :  205 : vars      : 6.298e-04
                         :  206 : vars      : 0.000e+00
                         :  207 - 256 : vars      : 0.000e+00   (all remaining variables have zero importance)
                         : --------------------------------------
                         : No variable ranking supplied by classifier: TMVA_DNN_CPU
                         : No variable ranking supplied by classifier: TMVA_CNN_CPU
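The BDT ranking above orders the 256 input variables by importance; many of them are never used by the trees and receive exactly zero importance. A minimal sketch of that bookkeeping, using illustrative values rather than the actual numbers from this log:

```python
# Toy per-variable importances (illustrative values, not taken from the log above)
importances = {"vars[203]": 1.413e-03, "vars[204]": 1.375e-03,
               "vars[205]": 6.298e-04, "vars[206]": 0.0, "vars[207]": 0.0}

# Rank descending by importance, as in the "Ranking result" table
ranking = sorted(importances.items(), key=lambda kv: kv[1], reverse=True)

# Variables the BDT never split on end up with exactly zero importance
unused = [name for name, imp in importances.items() if imp == 0.0]
print(len(unused))  # -> 2
```

In this run, 51 of the 256 pixel variables carry zero importance, i.e. the BDT never selected them for a split.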
TH1.Print Name  = TrainingHistory_TMVA_DNN_CPU_trainingError, Entries= 0, Total sum= 4.71553
TH1.Print Name  = TrainingHistory_TMVA_DNN_CPU_valError, Entries= 0, Total sum= 7.24407
TH1.Print Name  = TrainingHistory_TMVA_CNN_CPU_trainingError, Entries= 0, Total sum= 7.70345
TH1.Print Name  = TrainingHistory_TMVA_CNN_CPU_valError, Entries= 0, Total sum= 6.63908
Factory                  : === Destroy and recreate all methods via weight files for testing ===
                         : 
                         : Reading weight file: dataset/weights/TMVA_CNN_Classification_BDT.weights.xml
                         : Reading weight file: dataset/weights/TMVA_CNN_Classification_TMVA_DNN_CPU.weights.xml
                         : Reading weight file: dataset/weights/TMVA_CNN_Classification_TMVA_CNN_CPU.weights.xml
Factory                  : Test all methods
Factory                  : Test method: BDT for Classification performance
                         : 
BDT                      : [dataset] : Evaluation of BDT on testing sample (400 events)
                         : Elapsed time for evaluation of 400 events: 0.0023 sec       
Factory                  : Test method: TMVA_DNN_CPU for Classification performance
                         : 
                         : Evaluate deep neural network on CPU using batches with size = 400
                         : 
TMVA_DNN_CPU             : [dataset] : Evaluation of TMVA_DNN_CPU on testing sample (400 events)
                         : Elapsed time for evaluation of 400 events: 0.0182 sec       
Factory                  : Test method: TMVA_CNN_CPU for Classification performance
                         : 
                         : Evaluate deep neural network on CPU using batches with size = 400
                         : 
TMVA_CNN_CPU             : [dataset] : Evaluation of TMVA_CNN_CPU on testing sample (400 events)
                         : Elapsed time for evaluation of 400 events: 0.151 sec       
Factory                  : Evaluate all methods
Factory                  : Evaluate classifier: BDT
                         : 
BDT                      : [dataset] : Loop over test events and fill histograms with classifier response...
                         : 
                         : Dataset[dataset] :  variable plots are not produced! The number of variables (256) is larger than the limit of 200
Factory                  : Evaluate classifier: TMVA_DNN_CPU
                         : 
TMVA_DNN_CPU             : [dataset] : Loop over test events and fill histograms with classifier response...
                         : 
                         : Evaluate deep neural network on CPU using batches with size = 1000
                         : 
                         : Dataset[dataset] :  variable plots are not produced! The number of variables (256) is larger than the limit of 200
Factory                  : Evaluate classifier: TMVA_CNN_CPU
                         : 
TMVA_CNN_CPU             : [dataset] : Loop over test events and fill histograms with classifier response...
                         : 
                         : Evaluate deep neural network on CPU using batches with size = 1000
                         : 
                         : Dataset[dataset] :  variable plots are not produced! The number of variables (256) is larger than the limit of 200
                         : 
                         : Evaluation results ranked by best signal efficiency and purity (area)
                         : -------------------------------------------------------------------------------------------------------------------
                         : DataSet       MVA                       
                         : Name:         Method:          ROC-integ
                         : dataset       TMVA_CNN_CPU   : 0.787
                         : dataset       BDT            : 0.739
                         : dataset       TMVA_DNN_CPU   : 0.676
                         : -------------------------------------------------------------------------------------------------------------------
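The ROC-integ column above is the area under the curve of signal efficiency versus background efficiency; 0.5 corresponds to a random classifier and 1.0 to a perfect one. A minimal sketch of that integral with the trapezoid rule, using made-up curve points rather than the tutorial's actual ROC curves:

```python
def roc_integral(points):
    """Trapezoidal area under a list of (background_eff, signal_eff) points."""
    pts = sorted(points)
    area = 0.0
    for (x0, y0), (x1, y1) in zip(pts, pts[1:]):
        area += 0.5 * (y0 + y1) * (x1 - x0)
    return area

# A random classifier has signal eff == background eff everywhere -> area 0.5
print(roc_integral([(0.0, 0.0), (0.5, 0.5), (1.0, 1.0)]))  # -> 0.5
```

By this measure the CNN (0.787) clearly outperforms both the BDT (0.739) and the dense network (0.676) on this sample.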
                         : 
                         : Testing efficiency compared to training efficiency (overtraining check)
                         : -------------------------------------------------------------------------------------------------------------------
                         : DataSet              MVA              Signal efficiency: from test sample (from training sample) 
                         : Name:                Method:          @B=0.01             @B=0.10            @B=0.30   
                         : -------------------------------------------------------------------------------------------------------------------
                         : dataset              TMVA_CNN_CPU   : 0.015 (0.090)       0.405 (0.439)      0.739 (0.768)
                         : dataset              BDT            : 0.050 (0.305)       0.355 (0.658)      0.652 (0.824)
                         : dataset              TMVA_DNN_CPU   : 0.008 (0.170)       0.230 (0.620)      0.587 (0.797)
                         : -------------------------------------------------------------------------------------------------------------------
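The overtraining check compares, at fixed background efficiencies (@B=0.01, 0.10, 0.30), the signal efficiency measured on the test sample with the one from the training sample (in parentheses); a large gap, as seen here for the BDT at @B=0.01 (0.050 vs. 0.305), indicates overtraining. A sketch of how such a working point is extracted from classifier scores, with toy data in place of the real MVA outputs:

```python
def sig_eff_at_bkg_eff(sig_scores, bkg_scores, bkg_eff):
    """Signal efficiency at the cut that passes a fraction bkg_eff of background."""
    bkg = sorted(bkg_scores, reverse=True)  # higher score = more signal-like
    n_pass = max(1, int(round(bkg_eff * len(bkg))))
    cut = bkg[n_pass - 1]
    return sum(s >= cut for s in sig_scores) / len(sig_scores)

# Toy scores (illustrative only); compute the working point at @B=0.20
sig = [0.9, 0.8, 0.7, 0.6, 0.4]
bkg = [0.5, 0.4, 0.3, 0.2, 0.1]
print(sig_eff_at_bkg_eff(sig, bkg, 0.2))  # -> 0.8
```

Evaluating the same function on the training and test scores of a method and comparing the two numbers reproduces the paired columns in the table above.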
                         : 
Dataset:dataset          : Created tree 'TestTree' with 400 events
                         : 
Dataset:dataset          : Created tree 'TrainTree' with 1600 events
                         : 
Factory                  : Thank you for using TMVA!
                         : For citation information, please visit: http://tmva.sf.net/citeTMVA.html