Running with nthreads  = 4
DataSetInfo              : [dataset] : Added class "Signal"
                         : Add Tree sig_tree of type Signal with 1000 events
DataSetInfo              : [dataset] : Added class "Background"
                         : Add Tree bkg_tree of type Background with 1000 events
Factory                  : Booking method: BDT
                         : 
                         : Rebuilding Dataset dataset
                         : Building event vectors for type 2 Signal
                         : Dataset[dataset] :  create input formulas for tree sig_tree
                         : Using variable vars[0] from array expression vars of size 256
                         : Building event vectors for type 2 Background
                         : Dataset[dataset] :  create input formulas for tree bkg_tree
                         : Using variable vars[0] from array expression vars of size 256
DataSetFactory           : [dataset] : Number of events in input trees
                         : 
                         : 
                         : Number of training and testing events
                         : ---------------------------------------------------------------------------
                         : Signal     -- training events            : 800
                         : Signal     -- testing events             : 200
                         : Signal     -- training and testing events: 1000
                         : Background -- training events            : 800
                         : Background -- testing events             : 200
                         : Background -- training and testing events: 1000
                         : 
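The per-class event counts above follow from a simple holdout split. A minimal sketch of the bookkeeping, assuming a 20% test fraction inferred from the 800/200 numbers in the log (the fraction itself is not printed in the options):

```python
# Reproduce the per-class train/test counts shown in the log above.
# The 0.2 test fraction is an assumption inferred from 800/200 out of 1000;
# TMVA's actual split is configured when the trees are added to the DataLoader.
def split_counts(n_events, test_fraction=0.2):
    n_test = int(n_events * test_fraction)
    n_train = n_events - n_test
    return n_train, n_test

for label, n in (("Signal", 1000), ("Background", 1000)):
    n_train, n_test = split_counts(n)
    print(f"{label:<10} -- training: {n_train}, testing: {n_test}, total: {n_train + n_test}")
```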
Factory                  : Booking method: TMVA_DNN_CPU
                         : 
                         : Parsing option string: 
                         : ... "!H:V:ErrorStrategy=CROSSENTROPY:VarTransform=None:WeightInitialization=XAVIER:Layout=DENSE|100|RELU,BNORM,DENSE|100|RELU,BNORM,DENSE|100|RELU,BNORM,DENSE|100|RELU,DENSE|1|LINEAR:TrainingStrategy=LearningRate=1e-3,Momentum=0.9,Repetitions=1,ConvergenceSteps=5,BatchSize=100,TestRepetitions=1,WeightDecay=1e-4,Regularization=None,Optimizer=ADAM,DropConfig=0.0+0.0+0.0+0.,MaxEpochs=10:Architecture=CPU"
                         : The following options are set:
                         : - By User:
                         :     <none>
                         : - Default:
                         :     Boost_num: "0" [Number of times the classifier will be boosted]
                         : Parsing option string: 
                         : ... "!H:V:ErrorStrategy=CROSSENTROPY:VarTransform=None:WeightInitialization=XAVIER:Layout=DENSE|100|RELU,BNORM,DENSE|100|RELU,BNORM,DENSE|100|RELU,BNORM,DENSE|100|RELU,DENSE|1|LINEAR:TrainingStrategy=LearningRate=1e-3,Momentum=0.9,Repetitions=1,ConvergenceSteps=5,BatchSize=100,TestRepetitions=1,WeightDecay=1e-4,Regularization=None,Optimizer=ADAM,DropConfig=0.0+0.0+0.0+0.,MaxEpochs=10:Architecture=CPU"
                         : The following options are set:
                         : - By User:
                         :     V: "True" [Verbose output (short form of "VerbosityLevel" below - overrides the latter one)]
                         :     VarTransform: "None" [List of variable transformations performed before training, e.g., "D_Background,P_Signal,G,N_AllClasses" for: "Decorrelation, PCA-transformation, Gaussianisation, Normalisation, each for the given class of events ('AllClasses' denotes all events of all classes, if no class indication is given, 'All' is assumed)"]
                         :     H: "False" [Print method-specific help message]
                         :     Layout: "DENSE|100|RELU,BNORM,DENSE|100|RELU,BNORM,DENSE|100|RELU,BNORM,DENSE|100|RELU,DENSE|1|LINEAR" [Layout of the network.]
                         :     ErrorStrategy: "CROSSENTROPY" [Loss function: Mean squared error (regression) or cross entropy (binary classification).]
                         :     WeightInitialization: "XAVIER" [Weight initialization strategy]
                         :     Architecture: "CPU" [Which architecture to perform the training on.]
                         :     TrainingStrategy: "LearningRate=1e-3,Momentum=0.9,Repetitions=1,ConvergenceSteps=5,BatchSize=100,TestRepetitions=1,WeightDecay=1e-4,Regularization=None,Optimizer=ADAM,DropConfig=0.0+0.0+0.0+0.,MaxEpochs=10" [Defines the training strategies.]
                         : - Default:
                         :     VerbosityLevel: "Default" [Verbosity level]
                         :     CreateMVAPdfs: "False" [Create PDFs for classifier outputs (signal and background)]
                         :     IgnoreNegWeightsInTraining: "False" [Events with negative weights are ignored in the training (but are included for testing and performance evaluation)]
                         :     InputLayout: "0|0|0" [The Layout of the input]
                         :     BatchLayout: "0|0|0" [The Layout of the batch]
                         :     RandomSeed: "0" [Random seed used for weight initialization and batch shuffling]
                         :     ValidationSize: "20%" [Part of the training data to use for validation. Specify as 0.2 or 20% to use a fifth of the data set as validation set. Specify as 100 to use exactly 100 events. (Default: 20%)]
                         : Will now use the CPU architecture with BLAS and IMT support !
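The "By User" vs. "Default" listing above comes from parsing the booking string, where top-level options are separated by `:`, booleans can be negated with a `!` prefix, and compound values such as `Layout` and `TrainingStrategy` are kept as raw strings. An illustrative sketch of that decomposition (TMVA's real `Configurable` parser additionally validates each option against the method's registered defaults, which this toy version does not):

```python
# Toy decomposition of a TMVA-style option string: ':'-separated tokens,
# '!' prefix for False booleans, bare tokens for True, 'key=value' otherwise.
# Compound values (e.g. Layout=DENSE|100|RELU,...) contain no ':' and so
# survive the split intact.
def parse_tmva_options(option_string):
    options = {}
    for token in option_string.split(":"):
        if "=" in token:
            key, value = token.split("=", 1)
            options[key] = value
        elif token.startswith("!"):
            options[token[1:]] = False
        elif token:
            options[token] = True
    return options

opts = parse_tmva_options("!H:V:ErrorStrategy=CROSSENTROPY:VarTransform=None")
print(opts)  # {'H': False, 'V': True, 'ErrorStrategy': 'CROSSENTROPY', 'VarTransform': 'None'}
```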
Factory                  : Booking method: TMVA_CNN_CPU
                         : 
                         : Parsing option string: 
                         : ... "!H:V:ErrorStrategy=CROSSENTROPY:VarTransform=None:WeightInitialization=XAVIER:InputLayout=1|16|16:Layout=CONV|10|3|3|1|1|1|1|RELU,BNORM,CONV|10|3|3|1|1|1|1|RELU,MAXPOOL|2|2|1|1,RESHAPE|FLAT,DENSE|100|RELU,DENSE|1|LINEAR:TrainingStrategy=LearningRate=1e-3,Momentum=0.9,Repetitions=1,ConvergenceSteps=5,BatchSize=100,TestRepetitions=1,WeightDecay=1e-4,Regularization=None,Optimizer=ADAM,DropConfig=0.0+0.0+0.0+0.0,MaxEpochs=10:Architecture=CPU"
                         : The following options are set:
                         : - By User:
                         :     <none>
                         : - Default:
                         :     Boost_num: "0" [Number of times the classifier will be boosted]
                         : Parsing option string: 
                         : ... "!H:V:ErrorStrategy=CROSSENTROPY:VarTransform=None:WeightInitialization=XAVIER:InputLayout=1|16|16:Layout=CONV|10|3|3|1|1|1|1|RELU,BNORM,CONV|10|3|3|1|1|1|1|RELU,MAXPOOL|2|2|1|1,RESHAPE|FLAT,DENSE|100|RELU,DENSE|1|LINEAR:TrainingStrategy=LearningRate=1e-3,Momentum=0.9,Repetitions=1,ConvergenceSteps=5,BatchSize=100,TestRepetitions=1,WeightDecay=1e-4,Regularization=None,Optimizer=ADAM,DropConfig=0.0+0.0+0.0+0.0,MaxEpochs=10:Architecture=CPU"
                         : The following options are set:
                         : - By User:
                         :     V: "True" [Verbose output (short form of "VerbosityLevel" below - overrides the latter one)]
                         :     VarTransform: "None" [List of variable transformations performed before training, e.g., "D_Background,P_Signal,G,N_AllClasses" for: "Decorrelation, PCA-transformation, Gaussianisation, Normalisation, each for the given class of events ('AllClasses' denotes all events of all classes, if no class indication is given, 'All' is assumed)"]
                         :     H: "False" [Print method-specific help message]
                         :     InputLayout: "1|16|16" [The Layout of the input]
                         :     Layout: "CONV|10|3|3|1|1|1|1|RELU,BNORM,CONV|10|3|3|1|1|1|1|RELU,MAXPOOL|2|2|1|1,RESHAPE|FLAT,DENSE|100|RELU,DENSE|1|LINEAR" [Layout of the network.]
                         :     ErrorStrategy: "CROSSENTROPY" [Loss function: Mean squared error (regression) or cross entropy (binary classification).]
                         :     WeightInitialization: "XAVIER" [Weight initialization strategy]
                         :     Architecture: "CPU" [Which architecture to perform the training on.]
                         :     TrainingStrategy: "LearningRate=1e-3,Momentum=0.9,Repetitions=1,ConvergenceSteps=5,BatchSize=100,TestRepetitions=1,WeightDecay=1e-4,Regularization=None,Optimizer=ADAM,DropConfig=0.0+0.0+0.0+0.0,MaxEpochs=10" [Defines the training strategies.]
                         : - Default:
                         :     VerbosityLevel: "Default" [Verbosity level]
                         :     CreateMVAPdfs: "False" [Create PDFs for classifier outputs (signal and background)]
                         :     IgnoreNegWeightsInTraining: "False" [Events with negative weights are ignored in the training (but are included for testing and performance evaluation)]
                         :     BatchLayout: "0|0|0" [The Layout of the batch]
                         :     RandomSeed: "0" [Random seed used for weight initialization and batch shuffling]
                         :     ValidationSize: "20%" [Part of the training data to use for validation. Specify as 0.2 or 20% to use a fifth of the data set as validation set. Specify as 100 to use exactly 100 events. (Default: 20%)]
                         : Will now use the CPU architecture with BLAS and IMT support !
Factory                  : Train all methods
Factory                  : Train method: BDT for Classification
                         : 
BDT                      : #events: (reweighted) sig: 800 bkg: 800
                         : #events: (unweighted) sig: 800 bkg: 800
                         : Training 400 Decision Trees ... patience please
                         : Elapsed time for training with 1600 events: 1.47 sec         
BDT                      : [dataset] : Evaluation of BDT on training sample (1600 events)
                         : Elapsed time for evaluation of 1600 events: 0.0169 sec       
                         : Creating xml weight file: dataset/weights/TMVA_CNN_Classification_BDT.weights.xml
                         : Creating standalone class: dataset/weights/TMVA_CNN_Classification_BDT.class.C
                         : TMVA_CNN_ClassificationOutput.root:/dataset/Method_BDT/BDT
Factory                  : Training finished
                         : 
Factory                  : Train method: TMVA_DNN_CPU for Classification
                         : 
                         : Start of deep neural network training on CPU using MT,  nthreads = 4
                         : 
                         : *****   Deep Learning Network *****
DEEP NEURAL NETWORK:   Depth = 8  Input = ( 1, 1, 256 )  Batch size = 100  Loss function = C
   Layer 0   DENSE Layer:   ( Input =   256 , Width =   100 )  Output = (  1 ,   100 ,   100 )   Activation Function = Relu
   Layer 1   BATCH NORM Layer:    Input/Output = ( 100 , 100 , 1 )    Norm dim =   100  axis = -1
 
   Layer 2   DENSE Layer:   ( Input =   100 , Width =   100 )  Output = (  1 ,   100 ,   100 )   Activation Function = Relu
   Layer 3   BATCH NORM Layer:    Input/Output = ( 100 , 100 , 1 )    Norm dim =   100  axis = -1
 
   Layer 4   DENSE Layer:   ( Input =   100 , Width =   100 )  Output = (  1 ,   100 ,   100 )   Activation Function = Relu
   Layer 5   BATCH NORM Layer:    Input/Output = ( 100 , 100 , 1 )    Norm dim =   100  axis = -1
 
   Layer 6   DENSE Layer:   ( Input =   100 , Width =   100 )  Output = (  1 ,   100 ,   100 )   Activation Function = Relu
   Layer 7   DENSE Layer:   ( Input =   100 , Width =     1 )  Output = (  1 ,   100 ,     1 )   Activation Function = Identity
                         : Using 1280 events for training and 320 for testing
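The 1280/320 split reported here follows from the `ValidationSize: "20%"` option listed during booking: the validation set (labelled "Test" in the epoch table below) is carved out of the 1600 training events. In sketch form:

```python
# ValidationSize=20% applied to the 1600-event training sample:
# TMVA holds out this fraction for per-epoch validation and fits on the rest.
n_train_total = 1600
validation_fraction = 0.20
n_val = int(n_train_total * validation_fraction)
n_fit = n_train_total - n_val
print(n_fit, n_val)  # 1280 320
```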
                         : Compute initial loss  on the validation data 
                         : Training phase 1 of 1:  Optimizer ADAM (beta1=0.9,beta2=0.999,eps=1e-07) Learning rate = 0.001 regularization 0 minimum error = 248.84
                         : --------------------------------------------------------------
                         :      Epoch |   Train Err.   Val. Err.  t(s)/epoch   t(s)/Loss   nEvents/s Conv. Steps
                         : --------------------------------------------------------------
                         :    Start epoch iteration ...
                         :          1 Minimum Test error found - save the configuration 
                         :          1 |      1.17607     1.04166    0.149421   0.0165287     9029.85           0
                         :          2 Minimum Test error found - save the configuration 
                         :          2 |     0.753481    0.832032    0.150776    0.016124     8911.85           0
                         :          3 Minimum Test error found - save the configuration 
                         :          3 |      0.62635    0.789719    0.135382   0.0137732     9867.68           0
                         :          4 Minimum Test error found - save the configuration 
                         :          4 |     0.531105    0.746739    0.144025   0.0158814      9364.5           0
                         :          5 Minimum Test error found - save the configuration 
                         :          5 |     0.479718    0.744022    0.132796     0.01821     10472.4           0
                         :          6 Minimum Test error found - save the configuration 
                         :          6 |     0.422271    0.740446    0.125372   0.0144334     10816.8           0
                         :          7 Minimum Test error found - save the configuration 
                         :          7 |     0.392474    0.691584    0.123574   0.0124396     10797.7           0
                         :          8 |     0.342873    0.705644     0.10921   0.0102209     12122.5           1
                         :          9 |      0.28376    0.704744    0.107085     0.01052     12426.8           2
                         :         10 |     0.257901    0.777081    0.113647   0.0105553     11640.2           3
                         : 
                         : Elapsed time for training with 1600 events: 1.32 sec         
                         : Evaluate deep neural network on CPU using batches with size = 100
                         : 
TMVA_DNN_CPU             : [dataset] : Evaluation of TMVA_DNN_CPU on training sample (1600 events)
                         : Elapsed time for evaluation of 1600 events: 0.0627 sec       
                         : Creating xml weight file: dataset/weights/TMVA_CNN_Classification_TMVA_DNN_CPU.weights.xml
                         : Creating standalone class: dataset/weights/TMVA_CNN_Classification_TMVA_DNN_CPU.class.C
Factory                  : Training finished
                         : 
Factory                  : Train method: TMVA_CNN_CPU for Classification
                         : 
                         : Start of deep neural network training on CPU using MT,  nthreads = 4
                         : 
                         : *****   Deep Learning Network *****
DEEP NEURAL NETWORK:   Depth = 7  Input = ( 1, 16, 16 )  Batch size = 100  Loss function = C
   Layer 0   CONV LAYER:   ( W = 16 ,  H = 16 ,  D = 10 )    Filter ( W = 3 ,  H = 3 )    Output = ( 100 , 10 , 10 , 256 )     Activation Function = Relu
   Layer 1   BATCH NORM Layer:    Input/Output = ( 10 , 256 , 100 )   Norm dim =    10  axis = 1
 
   Layer 2   CONV LAYER:   ( W = 16 ,  H = 16 ,  D = 10 )    Filter ( W = 3 ,  H = 3 )    Output = ( 100 , 10 , 10 , 256 )     Activation Function = Relu
   Layer 3   POOL Layer:   ( W = 15 ,  H = 15 ,  D = 10 )    Filter ( W = 2 ,  H = 2 )    Output = ( 100 , 10 , 10 , 225 ) 
   Layer 4   RESHAPE Layer     Input = ( 10 , 15 , 15 )  Output = ( 1 , 100 , 2250 ) 
   Layer 5   DENSE Layer:   ( Input =  2250 , Width =   100 )  Output = (  1 ,   100 ,   100 )   Activation Function = Relu
   Layer 6   DENSE Layer:   ( Input =   100 , Width =     1 )  Output = (  1 ,   100 ,     1 )   Activation Function = Identity
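The layer shapes printed above can be recomputed from the `Layout` string: `CONV|10|3|3|1|1|1|1` is depth 10 with a 3x3 filter, stride 1, padding 1 (spatial size preserved at 16x16), and `MAXPOOL|2|2|1|1` is a 2x2 window with stride 1 (16 -> 15). A sketch using the standard output-size formulas:

```python
# Recompute the CNN layer dimensions from the booking options, starting
# from the 1|16|16 InputLayout. Standard formulas:
#   conv: (size - filter + 2*pad) // stride + 1
#   pool: (size - filter) // stride + 1
def conv_out(size, filt, stride, pad):
    return (size - filt + 2 * pad) // stride + 1

def pool_out(size, filt, stride):
    return (size - filt) // stride + 1

w = h = 16
w, h = conv_out(w, 3, 1, 1), conv_out(h, 3, 1, 1)  # CONV layer 0: stays 16x16
w, h = conv_out(w, 3, 1, 1), conv_out(h, 3, 1, 1)  # CONV layer 2: stays 16x16
w, h = pool_out(w, 2, 1), pool_out(h, 2, 1)        # MAXPOOL layer 3: 15x15
flat = 10 * w * h                                  # RESHAPE|FLAT input to the DENSE layer
print(w, h, flat)  # 15 15 2250
```

This matches the printout: the pool layer outputs 15x15 per channel (225 elements), and flattening 10 channels gives the 2250 inputs of the first dense layer.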
                         : Using 1280 events for training and 320 for testing
                         : Compute initial loss  on the validation data 
                         : Training phase 1 of 1:  Optimizer ADAM (beta1=0.9,beta2=0.999,eps=1e-07) Learning rate = 0.001 regularization 0 minimum error = 163.907
                         : --------------------------------------------------------------
                         :      Epoch |   Train Err.   Val. Err.  t(s)/epoch   t(s)/Loss   nEvents/s Conv. Steps
                         : --------------------------------------------------------------
                         :    Start epoch iteration ...
                         :          1 Minimum Test error found - save the configuration 
                         :          1 |      1.88436    0.797553    0.872348   0.0774184     1509.57           0
                         :          2 Minimum Test error found - save the configuration 
                         :          2 |      0.80801     0.74686    0.873605   0.0834448     1518.68           0
                         :          3 Minimum Test error found - save the configuration 
                         :          3 |      0.69641    0.704115    0.844998    0.076029     1560.53           0
                         :          4 Minimum Test error found - save the configuration 
                         :          4 |     0.665548     0.68328      0.9214   0.0750789      1417.9           0
                         :          5 |      0.64438    0.699963    0.858884   0.0871269     1554.89           1
                         :          6 Minimum Test error found - save the configuration 
                         :          6 |     0.644333    0.662935    0.894486   0.0687608     1453.27           0
                         :          7 Minimum Test error found - save the configuration 
                         :          7 |     0.611541    0.655652    0.933213   0.0898542     1422.88           0
                         :          8 Minimum Test error found - save the configuration 
                         :          8 |      0.56978    0.653833    0.944789   0.0728782     1376.29           0
                         :          9 Minimum Test error found - save the configuration 
                         :          9 |     0.570617    0.591576    0.858244   0.0797985     1541.53           0
                         :         10 |     0.522539    0.616709    0.834237   0.0887981     1609.79           1
                         : 
                         : Elapsed time for training with 1600 events: 8.91 sec         
                         : Evaluate deep neural network on CPU using batches with size = 100
                         : 
TMVA_CNN_CPU             : [dataset] : Evaluation of TMVA_CNN_CPU on training sample (1600 events)
                         : Elapsed time for evaluation of 1600 events: 0.415 sec       
                         : Creating xml weight file: dataset/weights/TMVA_CNN_Classification_TMVA_CNN_CPU.weights.xml
                         : Creating standalone class: dataset/weights/TMVA_CNN_Classification_TMVA_CNN_CPU.class.C
Factory                  : Training finished
                         : 
                         : Ranking input variables (method specific)...
BDT                      : Ranking result (top variable is best ranked)
                         : --------------------------------------
                         : Rank : Variable  : Variable Importance
                         : --------------------------------------
                         :    1 : vars      : 1.092e-02
                         :    2 : vars      : 9.600e-03
                         :    3 : vars      : 8.439e-03
                         :    4 : vars      : 8.367e-03
                         :    5 : vars      : 8.348e-03
                         :    6 : vars      : 7.988e-03
                         :    7 : vars      : 7.797e-03
                         :    8 : vars      : 7.599e-03
                         :    9 : vars      : 7.451e-03
                         :   10 : vars      : 7.393e-03
                         :   11 : vars      : 7.365e-03
                         :   12 : vars      : 7.341e-03
                         :   13 : vars      : 7.336e-03
                         :   14 : vars      : 7.274e-03
                         :   15 : vars      : 7.016e-03
                         :   16 : vars      : 6.993e-03
                         :   17 : vars      : 6.966e-03
                         :   18 : vars      : 6.802e-03
                         :   19 : vars      : 6.723e-03
                         :   20 : vars      : 6.691e-03
                         :   21 : vars      : 6.573e-03
                         :   22 : vars      : 6.535e-03
                         :   23 : vars      : 6.517e-03
                         :   24 : vars      : 6.498e-03
                         :   25 : vars      : 6.492e-03
                         :   26 : vars      : 6.491e-03
                         :   27 : vars      : 6.433e-03
                         :   28 : vars      : 6.424e-03
                         :   29 : vars      : 6.396e-03
                         :   30 : vars      : 6.363e-03
                         :   31 : vars      : 6.361e-03
                         :   32 : vars      : 6.327e-03
                         :   33 : vars      : 6.248e-03
                         :   34 : vars      : 6.217e-03
                         :   35 : vars      : 6.214e-03
                         :   36 : vars      : 6.204e-03
                         :   37 : vars      : 6.083e-03
                         :   38 : vars      : 6.024e-03
                         :   39 : vars      : 6.004e-03
                         :   40 : vars      : 5.972e-03
                         :   41 : vars      : 5.963e-03
                         :   42 : vars      : 5.913e-03
                         :   43 : vars      : 5.868e-03
                         :   44 : vars      : 5.866e-03
                         :   45 : vars      : 5.753e-03
                         :   46 : vars      : 5.734e-03
                         :   47 : vars      : 5.726e-03
                         :   48 : vars      : 5.706e-03
                         :   49 : vars      : 5.694e-03
                         :   50 : vars      : 5.528e-03
                         :   51 : vars      : 5.507e-03
                         :   52 : vars      : 5.486e-03
                         :   53 : vars      : 5.396e-03
                         :   54 : vars      : 5.375e-03
                         :   55 : vars      : 5.362e-03
                         :   56 : vars      : 5.357e-03
                         :   57 : vars      : 5.342e-03
                         :   58 : vars      : 5.241e-03
                         :   59 : vars      : 5.237e-03
                         :   60 : vars      : 5.200e-03
                         :   61 : vars      : 5.193e-03
                         :   62 : vars      : 5.170e-03
                         :   63 : vars      : 5.137e-03
                         :   64 : vars      : 5.134e-03
                         :   65 : vars      : 5.132e-03
                         :   66 : vars      : 5.123e-03
                         :   67 : vars      : 5.119e-03
                         :   68 : vars      : 5.086e-03
                         :   69 : vars      : 5.085e-03
                         :   70 : vars      : 5.072e-03
                         :   71 : vars      : 5.010e-03
                         :   72 : vars      : 4.977e-03
                         :   73 : vars      : 4.961e-03
                         :   74 : vars      : 4.846e-03
                         :   75 : vars      : 4.845e-03
                         :   76 : vars      : 4.765e-03
                         :   77 : vars      : 4.729e-03
                         :   78 : vars      : 4.704e-03
                         :   79 : vars      : 4.696e-03
                         :   80 : vars      : 4.690e-03
                         :   81 : vars      : 4.665e-03
                         :   82 : vars      : 4.660e-03
                         :   83 : vars      : 4.655e-03
                         :   84 : vars      : 4.650e-03
                         :   85 : vars      : 4.636e-03
                         :   86 : vars      : 4.628e-03
                         :   87 : vars      : 4.593e-03
                         :   88 : vars      : 4.584e-03
                         :   89 : vars      : 4.576e-03
                         :   90 : vars      : 4.549e-03
                         :   91 : vars      : 4.544e-03
                         :   92 : vars      : 4.542e-03
                         :   93 : vars      : 4.537e-03
                         :   94 : vars      : 4.527e-03
                         :   95 : vars      : 4.467e-03
                         :   96 : vars      : 4.399e-03
                         :   97 : vars      : 4.388e-03
                         :   98 : vars      : 4.373e-03
                         :   99 : vars      : 4.366e-03
                         :  100 : vars      : 4.333e-03
                         :  101 : vars      : 4.266e-03
                         :  102 : vars      : 4.261e-03
                         :  103 : vars      : 4.239e-03
                         :  104 : vars      : 4.228e-03
                         :  105 : vars      : 4.225e-03
                         :  106 : vars      : 4.224e-03
                         :  107 : vars      : 4.155e-03
                         :  108 : vars      : 4.151e-03
                         :  109 : vars      : 4.148e-03
                         :  110 : vars      : 4.143e-03
                         :  111 : vars      : 4.109e-03
                         :  112 : vars      : 4.102e-03
                         :  113 : vars      : 4.082e-03
                         :  114 : vars      : 4.059e-03
                         :  115 : vars      : 4.022e-03
                         :  116 : vars      : 4.006e-03
                         :  117 : vars      : 3.985e-03
                         :  118 : vars      : 3.971e-03
                         :  119 : vars      : 3.971e-03
                         :  120 : vars      : 3.966e-03
                         :  121 : vars      : 3.886e-03
                         :  122 : vars      : 3.881e-03
                         :  123 : vars      : 3.871e-03
                         :  124 : vars      : 3.837e-03
                         :  125 : vars      : 3.819e-03
                         :  126 : vars      : 3.805e-03
                         :  127 : vars      : 3.803e-03
                         :  128 : vars      : 3.773e-03
                         :  129 : vars      : 3.766e-03
                         :  130 : vars      : 3.766e-03
                         :  131 : vars      : 3.762e-03
                         :  132 : vars      : 3.756e-03
                         :  133 : vars      : 3.750e-03
                         :  134 : vars      : 3.738e-03
                         :  135 : vars      : 3.723e-03
                         :  136 : vars      : 3.678e-03
                         :  137 : vars      : 3.676e-03
                         :  138 : vars      : 3.653e-03
                         :  139 : vars      : 3.653e-03
                         :  140 : vars      : 3.624e-03
                         :  141 : vars      : 3.592e-03
                         :  142 : vars      : 3.581e-03
                         :  143 : vars      : 3.572e-03
                         :  144 : vars      : 3.553e-03
                         :  145 : vars      : 3.551e-03
                         :  146 : vars      : 3.537e-03
                         :  147 : vars      : 3.515e-03
                         :  148 : vars      : 3.501e-03
                         :  149 : vars      : 3.500e-03
                         :  150 : vars      : 3.486e-03
                         :  151 : vars      : 3.477e-03
                         :  152 : vars      : 3.435e-03
                         :  153 : vars      : 3.434e-03
                         :  154 : vars      : 3.433e-03
                         :  155 : vars      : 3.405e-03
                         :  156 : vars      : 3.368e-03
                         :  157 : vars      : 3.343e-03
                         :  158 : vars      : 3.313e-03
                         :  159 : vars      : 3.304e-03
                         :  160 : vars      : 3.301e-03
                         :  161 : vars      : 3.292e-03
                         :  162 : vars      : 3.288e-03
                         :  163 : vars      : 3.286e-03
                         :  164 : vars      : 3.243e-03
                         :  165 : vars      : 3.213e-03
                         :  166 : vars      : 3.196e-03
                         :  167 : vars      : 3.180e-03
                         :  168 : vars      : 3.171e-03
                         :  169 : vars      : 3.149e-03
                         :  170 : vars      : 3.114e-03
                         :  171 : vars      : 3.109e-03
                         :  172 : vars      : 3.104e-03
                         :  173 : vars      : 3.083e-03
                         :  174 : vars      : 3.080e-03
                         :  175 : vars      : 3.070e-03
                         :  176 : vars      : 3.060e-03
                         :  177 : vars      : 3.025e-03
                         :  178 : vars      : 2.907e-03
                         :  179 : vars      : 2.891e-03
                         :  180 : vars      : 2.890e-03
                         :  181 : vars      : 2.862e-03
                         :  182 : vars      : 2.847e-03
                         :  183 : vars      : 2.826e-03
                         :  184 : vars      : 2.814e-03
                         :  185 : vars      : 2.783e-03
                         :  186 : vars      : 2.783e-03
                         :  187 : vars      : 2.782e-03
                         :  188 : vars      : 2.715e-03
                         :  189 : vars      : 2.711e-03
                         :  190 : vars      : 2.700e-03
                         :  191 : vars      : 2.692e-03
                         :  192 : vars      : 2.687e-03
                         :  193 : vars      : 2.678e-03
                         :  194 : vars      : 2.607e-03
                         :  195 : vars      : 2.599e-03
                         :  196 : vars      : 2.580e-03
                         :  197 : vars      : 2.542e-03
                         :  198 : vars      : 2.451e-03
                         :  199 : vars      : 2.418e-03
                         :  200 : vars      : 2.414e-03
                         :  201 : vars      : 2.390e-03
                         :  202 : vars      : 2.385e-03
                         :  203 : vars      : 2.341e-03
                         :  204 : vars      : 2.339e-03
                         :  205 : vars      : 2.294e-03
                         :  206 : vars      : 2.286e-03
                         :  207 : vars      : 2.254e-03
                         :  208 : vars      : 2.210e-03
                         :  209 : vars      : 2.204e-03
                         :  210 : vars      : 2.187e-03
                         :  211 : vars      : 2.185e-03
                         :  212 : vars      : 2.109e-03
                         :  213 : vars      : 2.105e-03
                         :  214 : vars      : 2.080e-03
                         :  215 : vars      : 2.070e-03
                         :  216 : vars      : 2.045e-03
                         :  217 : vars      : 2.042e-03
                         :  218 : vars      : 2.028e-03
                         :  219 : vars      : 2.010e-03
                         :  220 : vars      : 1.974e-03
                         :  221 : vars      : 1.926e-03
                         :  222 : vars      : 1.919e-03
                         :  223 : vars      : 1.876e-03
                         :  224 : vars      : 1.857e-03
                         :  225 : vars      : 1.803e-03
                         :  226 : vars      : 1.746e-03
                         :  227 : vars      : 1.684e-03
                         :  228 : vars      : 1.654e-03
                         :  229 : vars      : 1.616e-03
                         :  230 : vars      : 1.608e-03
                         :  231 : vars      : 1.592e-03
                         :  232 : vars      : 1.523e-03
                         :  233 : vars      : 1.512e-03
                         :  234 : vars      : 1.510e-03
                         :  235 : vars      : 1.481e-03
                         :  236 : vars      : 1.416e-03
                         :  237 : vars      : 1.292e-03
                         :  238 : vars      : 9.729e-04
                         :  239 : vars      : 8.143e-04
                         :  240 : vars      : 7.239e-04
                         :  241 : vars      : 6.101e-04
                         :  242 : vars      : 5.436e-04
                         :  243 : vars      : 2.542e-05
                         :  244 : vars      : 0.000e+00
                         :  245 : vars      : 0.000e+00
                         :  246 : vars      : 0.000e+00
                         :  247 : vars      : 0.000e+00
                         :  248 : vars      : 0.000e+00
                         :  249 : vars      : 0.000e+00
                         :  250 : vars      : 0.000e+00
                         :  251 : vars      : 0.000e+00
                         :  252 : vars      : 0.000e+00
                         :  253 : vars      : 0.000e+00
                         :  254 : vars      : 0.000e+00
                         :  255 : vars      : 0.000e+00
                         :  256 : vars      : 0.000e+00
                         : --------------------------------------
                         : No variable ranking supplied by classifier: TMVA_DNN_CPU
                         : No variable ranking supplied by classifier: TMVA_CNN_CPU
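The deep-learning methods report no built-in variable ranking (unlike the BDT importances listed above). A model-agnostic substitute is permutation importance: shuffle one input column at a time and measure how much a performance metric drops. Below is a minimal sketch in plain Python; the toy predictor, data, and metric are all hypothetical, not part of TMVA:

```python
import random

def permutation_importance(predict, X, y, metric, n_rounds=5, seed=0):
    """Per-variable importance = average metric drop when that column is shuffled."""
    rng = random.Random(seed)
    base = metric(predict(X), y)          # score on the intact inputs
    importances = []
    for j in range(len(X[0])):
        drops = []
        for _ in range(n_rounds):
            col = [row[j] for row in X]
            rng.shuffle(col)              # break the column's relation to y
            Xp = [row[:j] + [c] + row[j + 1:] for row, c in zip(X, col)]
            drops.append(base - metric(predict(Xp), y))
        importances.append(sum(drops) / n_rounds)
    return importances

# Toy check: a "model" that only looks at column 0.
predict = lambda X: [row[0] for row in X]
accuracy = lambda p, y: sum((a > 0.5) == (b == 1) for a, b in zip(p, y)) / len(y)
X = [[0.9, 0.1], [0.8, 0.7], [0.2, 0.4], [0.1, 0.9]]
y = [1, 1, 0, 0]
imp = permutation_importance(predict, X, y, accuracy)
# Column 1 is ignored by the model, so its importance is exactly zero.
```

A column the model never uses shows zero importance, while informative columns show a positive drop, which gives a ranking even for black-box networks.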
TH1.Print Name  = TrainingHistory_TMVA_DNN_CPU_trainingError, Entries= 0, Total sum= 5.26601
TH1.Print Name  = TrainingHistory_TMVA_DNN_CPU_valError, Entries= 0, Total sum= 7.77367
TH1.Print Name  = TrainingHistory_TMVA_CNN_CPU_trainingError, Entries= 0, Total sum= 7.61752
TH1.Print Name  = TrainingHistory_TMVA_CNN_CPU_valError, Entries= 0, Total sum= 6.81248
Factory                  : === Destroy and recreate all methods via weight files for testing ===
                         : 
                         : Reading weight file: dataset/weights/TMVA_CNN_Classification_BDT.weights.xml
                         : Reading weight file: dataset/weights/TMVA_CNN_Classification_TMVA_DNN_CPU.weights.xml
                         : Reading weight file: dataset/weights/TMVA_CNN_Classification_TMVA_CNN_CPU.weights.xml
Factory                  : Test all methods
Factory                  : Test method: BDT for Classification performance
                         : 
BDT                      : [dataset] : Evaluation of BDT on testing sample (400 events)
                         : Elapsed time for evaluation of 400 events: 0.0035 sec       
Factory                  : Test method: TMVA_DNN_CPU for Classification performance
                         : 
                         : Evaluate deep neural network on CPU using batches with size = 400
                         : 
TMVA_DNN_CPU             : [dataset] : Evaluation of TMVA_DNN_CPU on testing sample (400 events)
                         : Elapsed time for evaluation of 400 events: 0.0127 sec       
Factory                  : Test method: TMVA_CNN_CPU for Classification performance
                         : 
                         : Evaluate deep neural network on CPU using batches with size = 400
                         : 
TMVA_CNN_CPU             : [dataset] : Evaluation of TMVA_CNN_CPU on testing sample (400 events)
                         : Elapsed time for evaluation of 400 events: 0.118 sec       
Factory                  : Evaluate all methods
Factory                  : Evaluate classifier: BDT
                         : 
BDT                      : [dataset] : Loop over test events and fill histograms with classifier response...
                         : 
                         : Dataset[dataset] :  variable plots are not produced! The number of variables is 256, which is larger than 200
Factory                  : Evaluate classifier: TMVA_DNN_CPU
                         : 
TMVA_DNN_CPU             : [dataset] : Loop over test events and fill histograms with classifier response...
                         : 
                         : Evaluate deep neural network on CPU using batches with size = 1000
                         : 
                         : Dataset[dataset] :  variable plots are not produced! The number of variables is 256, which is larger than 200
Factory                  : Evaluate classifier: TMVA_CNN_CPU
                         : 
TMVA_CNN_CPU             : [dataset] : Loop over test events and fill histograms with classifier response...
                         : 
                         : Evaluate deep neural network on CPU using batches with size = 1000
                         : 
                         : Dataset[dataset] :  variable plots are not produced! The number of variables is 256, which is larger than 200
                         : 
                         : Evaluation results ranked by best signal efficiency and purity (area)
                         : -------------------------------------------------------------------------------------------------------------------
                         : DataSet       MVA                       
                         : Name:         Method:          ROC-integ
                         : dataset       BDT            : 0.773
                         : dataset       TMVA_CNN_CPU   : 0.763
                         : dataset       TMVA_DNN_CPU   : 0.621
                         : -------------------------------------------------------------------------------------------------------------------
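The ROC integrals above can be cross-checked offline from the classifier scores and true labels stored in the TestTree. The integral equals the Mann-Whitney statistic: the probability that a randomly chosen signal event scores above a randomly chosen background event. A short plain-Python sketch (the input arrays here are hypothetical toy values, not the actual TestTree contents):

```python
def roc_integral(scores, labels):
    """ROC integral (AUC) as the fraction of signal/background pairs ranked correctly.
    labels: 1 = signal, 0 = background; ties count as half a correct pair."""
    sig = [s for s, y in zip(scores, labels) if y == 1]
    bkg = [s for s, y in zip(scores, labels) if y == 0]
    wins = sum((s > b) + 0.5 * (s == b) for s in sig for b in bkg)
    return wins / (len(sig) * len(bkg))

# Perfectly separated toy sample -> ROC integral of 1.0
print(roc_integral([0.9, 0.8, 0.3, 0.1], [1, 1, 0, 0]))  # 1.0
```

The pairwise form is O(n_sig * n_bkg) and fine for a few hundred test events; for large samples one would sort once and use ranks instead.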
                         : 
                         : Testing efficiency compared to training efficiency (overtraining check)
                         : -------------------------------------------------------------------------------------------------------------------
                         : DataSet              MVA              Signal efficiency: from test sample (from training sample) 
                         : Name:                Method:          @B=0.01             @B=0.10            @B=0.30   
                         : -------------------------------------------------------------------------------------------------------------------
                         : dataset              BDT            : 0.175 (0.345)       0.420 (0.693)      0.685 (0.904)
                         : dataset              TMVA_CNN_CPU   : 0.045 (0.180)       0.435 (0.573)      0.695 (0.756)
                         : dataset              TMVA_DNN_CPU   : 0.025 (0.105)       0.188 (0.484)      0.495 (0.700)
                         : -------------------------------------------------------------------------------------------------------------------
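Each column in the table above is a signal efficiency at a fixed background efficiency (@B=0.01, 0.10, 0.30); a large gap between the test and training numbers signals overtraining. The computation can be sketched in plain Python as below (the score and label arrays are hypothetical toy data):

```python
def signal_eff_at_bkg(scores, labels, b_target):
    """Signal efficiency at the score cut that passes a fraction b_target of background.
    labels: 1 = signal, 0 = background; higher score = more signal-like."""
    bkg = sorted((s for s, y in zip(scores, labels) if y == 0), reverse=True)
    k = max(1, round(b_target * len(bkg)))   # number of background events allowed through
    threshold = bkg[k - 1]
    sig = [s for s, y in zip(scores, labels) if y == 1]
    return sum(s >= threshold for s in sig) / len(sig)

scores = [0.95, 0.8, 0.6, 0.2,   # signal
          0.9, 0.5, 0.3, 0.1]    # background
labels = [1, 1, 1, 1, 0, 0, 0, 0]
print(signal_eff_at_bkg(scores, labels, 0.50))  # 0.75
```

Applying this to the TrainTree and TestTree scores separately reproduces the paired "test (training)" numbers in the table.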
                         : 
Dataset:dataset          : Created tree 'TestTree' with 400 events
                         : 
Dataset:dataset          : Created tree 'TrainTree' with 1600 events
                         : 
Factory                  : Thank you for using TMVA!
                         : For citation information, please visit: http://tmva.sf.net/citeTMVA.html