Running with nthreads  = 4
DataSetInfo              : [dataset] : Added class "Signal"
                         : Add Tree sig_tree of type Signal with 1000 events
DataSetInfo              : [dataset] : Added class "Background"
                         : Add Tree bkg_tree of type Background with 1000 events
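
The two classes and their trees are registered through a TMVA::DataLoader. A minimal sketch of the setup that produces the lines above, assuming the trees come from an already-open input file (the file name is an assumption; the tree names, class labels, and the 256-element "vars" array are taken from the log):

    #include "TFile.h"
    #include "TTree.h"
    #include "TMVA/DataLoader.h"

    TFile *input   = TFile::Open("images.root");   // hypothetical input file name
    TTree *sigTree = input->Get<TTree>("sig_tree");
    TTree *bkgTree = input->Get<TTree>("bkg_tree");

    auto *loader = new TMVA::DataLoader("dataset");
    loader->AddSignalTree(sigTree, 1.0);       // -> class "Signal", 1000 events
    loader->AddBackgroundTree(bkgTree, 1.0);   // -> class "Background", 1000 events
    loader->AddVariablesArray("vars", 256);    // one input variable per array element
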
Factory                  : Booking method: BDT
                         : 
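
Methods are booked on a TMVA::Factory before the dataset is built. A hedged sketch consistent with this log, reusing the loader from the previous fragment; the factory option string and all BDT options except NTrees=400 (which matches "Training 400 Decision Trees" further down) are assumptions:

    #include "TMVA/Factory.h"
    #include "TMVA/Types.h"

    TFile *outFile = TFile::Open("TMVA_CNN_ClassificationOutput.root", "RECREATE");
    auto *factory  = new TMVA::Factory("TMVA_CNN_Classification", outFile,
                                       "!V:!Silent:Color:AnalysisType=Classification"); // options assumed
    factory->BookMethod(loader, TMVA::Types::kBDT, "BDT",
                        "!V:NTrees=400:MaxDepth=2:BoostType=AdaBoost"); // only NTrees=400 is grounded in the log
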
                         : Rebuilding Dataset dataset
                         : Building event vectors for type 2 Signal
                         : Dataset[dataset] :  create input formulas for tree sig_tree
                         : Using variable vars[0] from array expression vars of size 256
                         : Building event vectors for type 2 Background
                         : Dataset[dataset] :  create input formulas for tree bkg_tree
                         : Using variable vars[0] from array expression vars of size 256
DataSetFactory           : [dataset] : Number of events in input trees
                         : 
                         : 
                         : Number of training and testing events
                         : ---------------------------------------------------------------------------
                         : Signal     -- training events            : 800
                         : Signal     -- testing events             : 200
                         : Signal     -- training and testing events: 1000
                         : Background -- training events            : 800
                         : Background -- testing events             : 200
                         : Background -- training and testing events: 1000
                         : 
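
The 800/200 split per class comes from a PrepareTrainingAndTestTree call. A sketch that reproduces the numbers in the table above (the split option string itself is an assumption):

    loader->PrepareTrainingAndTestTree(TCut(""),
        "nTrain_Signal=800:nTrain_Background=800:SplitMode=Random:NormMode=NumEvents:!V");
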
Factory                  : Booking method: TMVA_DNN_CPU
                         : 
                         : Parsing option string: 
                         : ... "!H:V:ErrorStrategy=CROSSENTROPY:VarTransform=None:WeightInitialization=XAVIER:Layout=DENSE|100|RELU,BNORM,DENSE|100|RELU,BNORM,DENSE|100|RELU,BNORM,DENSE|100|RELU,DENSE|1|LINEAR:TrainingStrategy=LearningRate=1e-3,Momentum=0.9,Repetitions=1,ConvergenceSteps=5,BatchSize=100,TestRepetitions=1,WeightDecay=1e-4,Regularization=None,Optimizer=ADAM,DropConfig=0.0+0.0+0.0+0.,MaxEpochs=10:Architecture=CPU"
                         : The following options are set:
                         : - By User:
                         :     <none>
                         : - Default:
                         :     Boost_num: "0" [Number of times the classifier will be boosted]
                         : Parsing option string: 
                         : ... "!H:V:ErrorStrategy=CROSSENTROPY:VarTransform=None:WeightInitialization=XAVIER:Layout=DENSE|100|RELU,BNORM,DENSE|100|RELU,BNORM,DENSE|100|RELU,BNORM,DENSE|100|RELU,DENSE|1|LINEAR:TrainingStrategy=LearningRate=1e-3,Momentum=0.9,Repetitions=1,ConvergenceSteps=5,BatchSize=100,TestRepetitions=1,WeightDecay=1e-4,Regularization=None,Optimizer=ADAM,DropConfig=0.0+0.0+0.0+0.,MaxEpochs=10:Architecture=CPU"
                         : The following options are set:
                         : - By User:
                         :     V: "True" [Verbose output (short form of "VerbosityLevel" below - overrides the latter one)]
                         :     VarTransform: "None" [List of variable transformations performed before training, e.g., "D_Background,P_Signal,G,N_AllClasses" for: "Decorrelation, PCA-transformation, Gaussianisation, Normalisation, each for the given class of events ('AllClasses' denotes all events of all classes, if no class indication is given, 'All' is assumed)"]
                         :     H: "False" [Print method-specific help message]
                         :     Layout: "DENSE|100|RELU,BNORM,DENSE|100|RELU,BNORM,DENSE|100|RELU,BNORM,DENSE|100|RELU,DENSE|1|LINEAR" [Layout of the network.]
                         :     ErrorStrategy: "CROSSENTROPY" [Loss function: Mean squared error (regression) or cross entropy (binary classification).]
                         :     WeightInitialization: "XAVIER" [Weight initialization strategy]
                         :     Architecture: "CPU" [Which architecture to perform the training on.]
                         :     TrainingStrategy: "LearningRate=1e-3,Momentum=0.9,Repetitions=1,ConvergenceSteps=5,BatchSize=100,TestRepetitions=1,WeightDecay=1e-4,Regularization=None,Optimizer=ADAM,DropConfig=0.0+0.0+0.0+0.,MaxEpochs=10" [Defines the training strategies.]
                         : - Default:
                         :     VerbosityLevel: "Default" [Verbosity level]
                         :     CreateMVAPdfs: "False" [Create PDFs for classifier outputs (signal and background)]
                         :     IgnoreNegWeightsInTraining: "False" [Events with negative weights are ignored in the training (but are included for testing and performance evaluation)]
                         :     InputLayout: "0|0|0" [The Layout of the input]
                         :     BatchLayout: "0|0|0" [The Layout of the batch]
                         :     RandomSeed: "0" [Random seed used for weight initialization and batch shuffling]
                         :     ValidationSize: "20%" [Part of the training data to use for validation. Specify as 0.2 or 20% to use a fifth of the data set as validation set. Specify as 100 to use exactly 100 events. (Default: 20%)]
                         : Will now use the CPU architecture with BLAS and IMT support!
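
The option string echoed twice above is passed verbatim by the BookMethod call for the dense network; only the booking call around it is reconstructed here:

    factory->BookMethod(loader, TMVA::Types::kDL, "TMVA_DNN_CPU",
        "!H:V:ErrorStrategy=CROSSENTROPY:VarTransform=None:WeightInitialization=XAVIER:"
        "Layout=DENSE|100|RELU,BNORM,DENSE|100|RELU,BNORM,DENSE|100|RELU,BNORM,"
        "DENSE|100|RELU,DENSE|1|LINEAR:"
        "TrainingStrategy=LearningRate=1e-3,Momentum=0.9,Repetitions=1,ConvergenceSteps=5,"
        "BatchSize=100,TestRepetitions=1,WeightDecay=1e-4,Regularization=None,"
        "Optimizer=ADAM,DropConfig=0.0+0.0+0.0+0.,MaxEpochs=10:Architecture=CPU");
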
Factory                  : Booking method: TMVA_CNN_CPU
                         : 
                         : Parsing option string: 
                         : ... "!H:V:ErrorStrategy=CROSSENTROPY:VarTransform=None:WeightInitialization=XAVIER:InputLayout=1|16|16:Layout=CONV|10|3|3|1|1|1|1|RELU,BNORM,CONV|10|3|3|1|1|1|1|RELU,MAXPOOL|2|2|1|1,RESHAPE|FLAT,DENSE|100|RELU,DENSE|1|LINEAR:TrainingStrategy=LearningRate=1e-3,Momentum=0.9,Repetitions=1,ConvergenceSteps=5,BatchSize=100,TestRepetitions=1,WeightDecay=1e-4,Regularization=None,Optimizer=ADAM,DropConfig=0.0+0.0+0.0+0.0,MaxEpochs=10:Architecture=CPU"
                         : The following options are set:
                         : - By User:
                         :     <none>
                         : - Default:
                         :     Boost_num: "0" [Number of times the classifier will be boosted]
                         : Parsing option string: 
                         : ... "!H:V:ErrorStrategy=CROSSENTROPY:VarTransform=None:WeightInitialization=XAVIER:InputLayout=1|16|16:Layout=CONV|10|3|3|1|1|1|1|RELU,BNORM,CONV|10|3|3|1|1|1|1|RELU,MAXPOOL|2|2|1|1,RESHAPE|FLAT,DENSE|100|RELU,DENSE|1|LINEAR:TrainingStrategy=LearningRate=1e-3,Momentum=0.9,Repetitions=1,ConvergenceSteps=5,BatchSize=100,TestRepetitions=1,WeightDecay=1e-4,Regularization=None,Optimizer=ADAM,DropConfig=0.0+0.0+0.0+0.0,MaxEpochs=10:Architecture=CPU"
                         : The following options are set:
                         : - By User:
                         :     V: "True" [Verbose output (short form of "VerbosityLevel" below - overrides the latter one)]
                         :     VarTransform: "None" [List of variable transformations performed before training, e.g., "D_Background,P_Signal,G,N_AllClasses" for: "Decorrelation, PCA-transformation, Gaussianisation, Normalisation, each for the given class of events ('AllClasses' denotes all events of all classes, if no class indication is given, 'All' is assumed)"]
                         :     H: "False" [Print method-specific help message]
                         :     InputLayout: "1|16|16" [The Layout of the input]
                         :     Layout: "CONV|10|3|3|1|1|1|1|RELU,BNORM,CONV|10|3|3|1|1|1|1|RELU,MAXPOOL|2|2|1|1,RESHAPE|FLAT,DENSE|100|RELU,DENSE|1|LINEAR" [Layout of the network.]
                         :     ErrorStrategy: "CROSSENTROPY" [Loss function: Mean squared error (regression) or cross entropy (binary classification).]
                         :     WeightInitialization: "XAVIER" [Weight initialization strategy]
                         :     Architecture: "CPU" [Which architecture to perform the training on.]
                         :     TrainingStrategy: "LearningRate=1e-3,Momentum=0.9,Repetitions=1,ConvergenceSteps=5,BatchSize=100,TestRepetitions=1,WeightDecay=1e-4,Regularization=None,Optimizer=ADAM,DropConfig=0.0+0.0+0.0+0.0,MaxEpochs=10" [Defines the training strategies.]
                         : - Default:
                         :     VerbosityLevel: "Default" [Verbosity level]
                         :     CreateMVAPdfs: "False" [Create PDFs for classifier outputs (signal and background)]
                         :     IgnoreNegWeightsInTraining: "False" [Events with negative weights are ignored in the training (but are included for testing and performance evaluation)]
                         :     BatchLayout: "0|0|0" [The Layout of the batch]
                         :     RandomSeed: "0" [Random seed used for weight initialization and batch shuffling]
                         :     ValidationSize: "20%" [Part of the training data to use for validation. Specify as 0.2 or 20% to use a fifth of the data set as validation set. Specify as 100 to use exactly 100 events. (Default: 20%)]
                         : Will now use the CPU architecture with BLAS and IMT support!
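
Same pattern for the convolutional network. Reading the layout tokens (my interpretation of the TMVA layout grammar, depth|filterH|filterW|strideRows|strideCols|padH|padW|activation): CONV|10|3|3|1|1|1|1|RELU is 10 filters of 3x3 with stride 1 and zero-padding 1, and InputLayout=1|16|16 declares a single-channel 16x16 image. The option string is verbatim from the log; the booking call is reconstructed:

    factory->BookMethod(loader, TMVA::Types::kDL, "TMVA_CNN_CPU",
        "!H:V:ErrorStrategy=CROSSENTROPY:VarTransform=None:WeightInitialization=XAVIER:"
        "InputLayout=1|16|16:"
        "Layout=CONV|10|3|3|1|1|1|1|RELU,BNORM,CONV|10|3|3|1|1|1|1|RELU,"
        "MAXPOOL|2|2|1|1,RESHAPE|FLAT,DENSE|100|RELU,DENSE|1|LINEAR:"
        "TrainingStrategy=LearningRate=1e-3,Momentum=0.9,Repetitions=1,ConvergenceSteps=5,"
        "BatchSize=100,TestRepetitions=1,WeightDecay=1e-4,Regularization=None,"
        "Optimizer=ADAM,DropConfig=0.0+0.0+0.0+0.0,MaxEpochs=10:Architecture=CPU");
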
Factory                  : Train all methods
Factory                  : Train method: BDT for Classification
                         : 
BDT                      : #events: (reweighted) sig: 800 bkg: 800
                         : #events: (unweighted) sig: 800 bkg: 800
                         : Training 400 Decision Trees ... patience please
                         : Elapsed time for training with 1600 events: 1.71 sec         
BDT                      : [dataset] : Evaluation of BDT on training sample (1600 events)
                         : Elapsed time for evaluation of 1600 events: 0.0164 sec       
                         : Creating xml weight file: dataset/weights/TMVA_CNN_Classification_BDT.weights.xml
                         : Creating standalone class: dataset/weights/TMVA_CNN_Classification_BDT.class.C
                         : TMVA_CNN_ClassificationOutput.root:/dataset/Method_BDT/BDT
Factory                  : Training finished
                         : 
Factory                  : Train method: TMVA_DNN_CPU for Classification
                         : 
                         : Start of deep neural network training on CPU using MT,  nthreads = 4
                         : 
                         : *****   Deep Learning Network *****
DEEP NEURAL NETWORK:   Depth = 8  Input = ( 1, 1, 256 )  Batch size = 100  Loss function = C
   Layer 0   DENSE Layer:   ( Input =   256 , Width =   100 )  Output = (  1 ,   100 ,   100 )   Activation Function = Relu
   Layer 1   BATCH NORM Layer:    Input/Output = ( 100 , 100 , 1 )    Norm dim =   100  axis = -1
 
   Layer 2   DENSE Layer:   ( Input =   100 , Width =   100 )  Output = (  1 ,   100 ,   100 )   Activation Function = Relu
   Layer 3   BATCH NORM Layer:    Input/Output = ( 100 , 100 , 1 )    Norm dim =   100  axis = -1
 
   Layer 4   DENSE Layer:   ( Input =   100 , Width =   100 )  Output = (  1 ,   100 ,   100 )   Activation Function = Relu
   Layer 5   BATCH NORM Layer:    Input/Output = ( 100 , 100 , 1 )    Norm dim =   100  axis = -1
 
   Layer 6   DENSE Layer:   ( Input =   100 , Width =   100 )  Output = (  1 ,   100 ,   100 )   Activation Function = Relu
   Layer 7   DENSE Layer:   ( Input =   100 , Width =     1 )  Output = (  1 ,   100 ,     1 )   Activation Function = Identity
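
A quick sanity check of the trainable parameter count implied by this printout (my own arithmetic, not in the log): a DENSE layer carries in*out weights plus out biases, and a BNORM layer two trainable parameters (gamma, beta) per normalized feature:

    #include <cstdio>
    int main() {
       auto dense = [](int in, int out) { return in * out + out; };
       int total = dense(256, 100) + 2 * 100   // Layer 0 + Layer 1 (BNORM)
                 + dense(100, 100) + 2 * 100   // Layers 2-3
                 + dense(100, 100) + 2 * 100   // Layers 4-5
                 + dense(100, 100)             // Layer 6
                 + dense(100, 1);              // Layer 7
       std::printf("trainable parameters: %d\n", total); // prints 56701
       return 0;
    }
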
                         : Using 1280 events for training and 320 for testing
                         : Compute initial loss on the validation data 
                         : Training phase 1 of 1:  Optimizer ADAM (beta1=0.9,beta2=0.999,eps=1e-07) Learning rate = 0.001 regularization 0 minimum error = 21.8273
                         : --------------------------------------------------------------
                         :      Epoch |   Train Err.   Val. Err.  t(s)/epoch   t(s)/Loss   nEvents/s Conv. Steps
                         : --------------------------------------------------------------
                         :    Start epoch iteration ...
                         :          1 Minimum Test error found - save the configuration 
                         :          1 |       1.0348     1.50777    0.147606   0.0187439     9312.28           0
                         :          2 Minimum Test error found - save the configuration 
                         :          2 |     0.700302     1.27075    0.176006    0.019543     7669.53           0
                         :          3 Minimum Test error found - save the configuration 
                         :          3 |     0.625502     1.12068    0.169952   0.0188981     7944.18           0
                         :          4 Minimum Test error found - save the configuration 
                         :          4 |     0.561372     1.11073    0.176747   0.0200042     7655.85           0
                         :          5 Minimum Test error found - save the configuration 
                         :          5 |     0.496457     1.09581    0.174257   0.0192665     7742.44           0
                         :          6 |     0.431827     1.14041    0.172394   0.0184929     7797.22           1
                         :          7 |     0.397026     1.15564    0.171003   0.0189866     7893.87           2
                         :          8 Minimum Test error found - save the configuration 
                         :          8 |     0.342025     1.09064    0.170182    0.019092     7942.26           0
                         :          9 |     0.286433     1.18584    0.171795    0.018873     7847.15           1
                         :         10 |     0.261693      1.2418    0.164411   0.0174628     8166.14           2
                         : 
                         : Elapsed time for training with 1600 events: 1.72 sec         
                         : Evaluate deep neural network on CPU using batches with size = 100
                         : 
TMVA_DNN_CPU             : [dataset] : Evaluation of TMVA_DNN_CPU on training sample (1600 events)
                         : Elapsed time for evaluation of 1600 events: 0.102 sec       
                         : Creating xml weight file: dataset/weights/TMVA_CNN_Classification_TMVA_DNN_CPU.weights.xml
                         : Creating standalone class: dataset/weights/TMVA_CNN_Classification_TMVA_DNN_CPU.class.C
Factory                  : Training finished
                         : 
Factory                  : Train method: TMVA_CNN_CPU for Classification
                         : 
                         : Start of deep neural network training on CPU using MT,  nthreads = 4
                         : 
                         : *****   Deep Learning Network *****
DEEP NEURAL NETWORK:   Depth = 7  Input = ( 1, 16, 16 )  Batch size = 100  Loss function = C
   Layer 0   CONV LAYER:   ( W = 16 ,  H = 16 ,  D = 10 )    Filter ( W = 3 ,  H = 3 )    Output = ( 100 , 10 , 10 , 256 )     Activation Function = Relu
   Layer 1   BATCH NORM Layer:    Input/Output = ( 10 , 256 , 100 )   Norm dim =    10  axis = 1
 
   Layer 2   CONV LAYER:   ( W = 16 ,  H = 16 ,  D = 10 )    Filter ( W = 3 ,  H = 3 )    Output = ( 100 , 10 , 10 , 256 )     Activation Function = Relu
   Layer 3   POOL Layer:   ( W = 15 ,  H = 15 ,  D = 10 )    Filter ( W = 2 ,  H = 2 )    Output = ( 100 , 10 , 10 , 225 ) 
   Layer 4   RESHAPE Layer     Input = ( 10 , 15 , 15 )  Output = ( 1 , 100 , 2250 ) 
   Layer 5   DENSE Layer:   ( Input =  2250 , Width =   100 )  Output = (  1 ,   100 ,   100 )   Activation Function = Relu
   Layer 6   DENSE Layer:   ( Input =   100 , Width =     1 )  Output = (  1 ,   100 ,     1 )   Activation Function = Identity
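
The shapes in this printout follow the usual formula out = (in - filter + 2*pad)/stride + 1; a standalone check of the numbers (my arithmetic):

    #include <cstdio>
    int main() {
       auto outDim = [](int in, int flt, int stride, int pad) {
          return (in - flt + 2 * pad) / stride + 1;
       };
       int conv = outDim(16, 3, 1, 1);     // 16: 3x3 filter, stride 1, padding 1
       int pool = outDim(conv, 2, 1, 0);   // 15: 2x2 max pool with stride 1
       std::printf("conv %dx%d, pool %dx%d, flattened %d\n",
                   conv, conv, pool, pool, 10 * pool * pool); // 2250 inputs to Layer 5
       return 0;
    }
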
                         : Using 1280 events for training and 320 for testing
                         : Compute initial loss on the validation data 
                         : Training phase 1 of 1:  Optimizer ADAM (beta1=0.9,beta2=0.999,eps=1e-07) Learning rate = 0.001 regularization 0 minimum error = 184.928
                         : --------------------------------------------------------------
                         :      Epoch |   Train Err.   Val. Err.  t(s)/epoch   t(s)/Loss   nEvents/s Conv. Steps
                         : --------------------------------------------------------------
                         :    Start epoch iteration ...
                         :          1 Minimum Test error found - save the configuration 
                         :          1 |       6.2677     3.35653     1.01181   0.0837096     1292.96           0
                         :          2 Minimum Test error found - save the configuration 
                         :          2 |      1.70727    0.732628    0.918189   0.0875828     1444.73           0
                         :          3 |     0.890542    0.750677    0.942725   0.0853785     1399.67           1
                         :          4 Minimum Test error found - save the configuration 
                         :          4 |     0.704229    0.686011     1.15176    0.117186      1159.9           0
                         :          5 Minimum Test error found - save the configuration 
                         :          5 |     0.671897     0.67311    0.923088    0.081224     1425.41           0
                         :          6 |     0.664014    0.710039     1.02978    0.111701     1307.07           1
                         :          7 Minimum Test error found - save the configuration 
                         :          7 |     0.652514    0.636182     1.04979   0.0881565     1247.87           0
                         :          8 Minimum Test error found - save the configuration 
                         :          8 |     0.634474    0.626774    0.944327   0.0841509     1395.06           0
                         :          9 Minimum Test error found - save the configuration 
                         :          9 |      0.66644    0.611112     1.02156   0.0837529     1279.58           0
                         :         10 Minimum Test error found - save the configuration 
                         :         10 |     0.589074    0.595838     1.03715    0.114535     1300.66           0
                         : 
                         : Elapsed time for training with 1600 events: 10.2 sec         
                         : Evaluate deep neural network on CPU using batches with size = 100
                         : 
TMVA_CNN_CPU             : [dataset] : Evaluation of TMVA_CNN_CPU on training sample (1600 events)
                         : Elapsed time for evaluation of 1600 events: 0.525 sec       
                         : Creating xml weight file: dataset/weights/TMVA_CNN_Classification_TMVA_CNN_CPU.weights.xml
                         : Creating standalone class: dataset/weights/TMVA_CNN_Classification_TMVA_CNN_CPU.class.C
Factory                  : Training finished
                         : 
                         : Ranking input variables (method specific)...
BDT                      : Ranking result (top variable is best ranked)
                         : --------------------------------------
                         : Rank : Variable  : Variable Importance
                         : --------------------------------------
                         :    1 : vars      : 8.150e-03
                         :    2 : vars      : 7.852e-03
                         :    3 : vars      : 7.321e-03
                         :    4 : vars      : 7.018e-03
                         :    5 : vars      : 6.905e-03
                         :    6 : vars      : 6.821e-03
                         :    7 : vars      : 6.733e-03
                         :    8 : vars      : 6.710e-03
                         :    9 : vars      : 6.660e-03
                         :   10 : vars      : 6.587e-03
                         :   11 : vars      : 6.575e-03
                         :   12 : vars      : 6.547e-03
                         :   13 : vars      : 6.516e-03
                         :   14 : vars      : 6.492e-03
                         :   15 : vars      : 6.434e-03
                         :   16 : vars      : 6.382e-03
                         :   17 : vars      : 6.381e-03
                         :   18 : vars      : 6.377e-03
                         :   19 : vars      : 6.357e-03
                         :   20 : vars      : 6.160e-03
                         :   21 : vars      : 6.146e-03
                         :   22 : vars      : 6.119e-03
                         :   23 : vars      : 6.116e-03
                         :   24 : vars      : 6.062e-03
                         :   25 : vars      : 6.059e-03
                         :   26 : vars      : 5.951e-03
                         :   27 : vars      : 5.893e-03
                         :   28 : vars      : 5.858e-03
                         :   29 : vars      : 5.815e-03
                         :   30 : vars      : 5.815e-03
                         :   31 : vars      : 5.809e-03
                         :   32 : vars      : 5.804e-03
                         :   33 : vars      : 5.800e-03
                         :   34 : vars      : 5.788e-03
                         :   35 : vars      : 5.768e-03
                         :   36 : vars      : 5.764e-03
                         :   37 : vars      : 5.709e-03
                         :   38 : vars      : 5.663e-03
                         :   39 : vars      : 5.642e-03
                         :   40 : vars      : 5.596e-03
                         :   41 : vars      : 5.596e-03
                         :   42 : vars      : 5.578e-03
                         :   43 : vars      : 5.523e-03
                         :   44 : vars      : 5.474e-03
                         :   45 : vars      : 5.462e-03
                         :   46 : vars      : 5.413e-03
                         :   47 : vars      : 5.381e-03
                         :   48 : vars      : 5.363e-03
                         :   49 : vars      : 5.337e-03
                         :   50 : vars      : 5.316e-03
                         :   51 : vars      : 5.279e-03
                         :   52 : vars      : 5.274e-03
                         :   53 : vars      : 5.269e-03
                         :   54 : vars      : 5.246e-03
                         :   55 : vars      : 5.242e-03
                         :   56 : vars      : 5.212e-03
                         :   57 : vars      : 5.175e-03
                         :   58 : vars      : 5.173e-03
                         :   59 : vars      : 5.139e-03
                         :   60 : vars      : 5.133e-03
                         :   61 : vars      : 5.091e-03
                         :   62 : vars      : 5.068e-03
                         :   63 : vars      : 5.050e-03
                         :   64 : vars      : 5.037e-03
                         :   65 : vars      : 5.000e-03
                         :   66 : vars      : 4.993e-03
                         :   67 : vars      : 4.977e-03
                         :   68 : vars      : 4.941e-03
                         :   69 : vars      : 4.913e-03
                         :   70 : vars      : 4.906e-03
                         :   71 : vars      : 4.898e-03
                         :   72 : vars      : 4.838e-03
                         :   73 : vars      : 4.831e-03
                         :   74 : vars      : 4.829e-03
                         :   75 : vars      : 4.817e-03
                         :   76 : vars      : 4.779e-03
                         :   77 : vars      : 4.771e-03
                         :   78 : vars      : 4.745e-03
                         :   79 : vars      : 4.709e-03
                         :   80 : vars      : 4.670e-03
                         :   81 : vars      : 4.666e-03
                         :   82 : vars      : 4.646e-03
                         :   83 : vars      : 4.624e-03
                         :   84 : vars      : 4.620e-03
                         :   85 : vars      : 4.573e-03
                         :   86 : vars      : 4.572e-03
                         :   87 : vars      : 4.549e-03
                         :   88 : vars      : 4.549e-03
                         :   89 : vars      : 4.523e-03
                         :   90 : vars      : 4.513e-03
                         :   91 : vars      : 4.510e-03
                         :   92 : vars      : 4.501e-03
                         :   93 : vars      : 4.491e-03
                         :   94 : vars      : 4.475e-03
                         :   95 : vars      : 4.456e-03
                         :   96 : vars      : 4.453e-03
                         :   97 : vars      : 4.438e-03
                         :   98 : vars      : 4.429e-03
                         :   99 : vars      : 4.399e-03
                         :  100 : vars      : 4.343e-03
                         :  101 : vars      : 4.328e-03
                         :  102 : vars      : 4.300e-03
                         :  103 : vars      : 4.286e-03
                         :  104 : vars      : 4.251e-03
                         :  105 : vars      : 4.242e-03
                         :  106 : vars      : 4.239e-03
                         :  107 : vars      : 4.231e-03
                         :  108 : vars      : 4.224e-03
                         :  109 : vars      : 4.212e-03
                         :  110 : vars      : 4.197e-03
                         :  111 : vars      : 4.188e-03
                         :  112 : vars      : 4.161e-03
                         :  113 : vars      : 4.116e-03
                         :  114 : vars      : 4.109e-03
                         :  115 : vars      : 4.094e-03
                         :  116 : vars      : 4.090e-03
                         :  117 : vars      : 4.085e-03
                         :  118 : vars      : 4.072e-03
                         :  119 : vars      : 4.062e-03
                         :  120 : vars      : 4.036e-03
                         :  121 : vars      : 4.014e-03
                         :  122 : vars      : 4.001e-03
                         :  123 : vars      : 3.993e-03
                         :  124 : vars      : 3.979e-03
                         :  125 : vars      : 3.978e-03
                         :  126 : vars      : 3.968e-03
                         :  127 : vars      : 3.967e-03
                         :  128 : vars      : 3.947e-03
                         :  129 : vars      : 3.947e-03
                         :  130 : vars      : 3.932e-03
                         :  131 : vars      : 3.915e-03
                         :  132 : vars      : 3.909e-03
                         :  133 : vars      : 3.888e-03
                         :  134 : vars      : 3.882e-03
                         :  135 : vars      : 3.878e-03
                         :  136 : vars      : 3.854e-03
                         :  137 : vars      : 3.837e-03
                         :  138 : vars      : 3.836e-03
                         :  139 : vars      : 3.829e-03
                         :  140 : vars      : 3.826e-03
                         :  141 : vars      : 3.803e-03
                         :  142 : vars      : 3.796e-03
                         :  143 : vars      : 3.791e-03
                         :  144 : vars      : 3.785e-03
                         :  145 : vars      : 3.766e-03
                         :  146 : vars      : 3.748e-03
                         :  147 : vars      : 3.713e-03
                         :  148 : vars      : 3.677e-03
                         :  149 : vars      : 3.664e-03
                         :  150 : vars      : 3.655e-03
                         :  151 : vars      : 3.652e-03
                         :  152 : vars      : 3.650e-03
                         :  153 : vars      : 3.619e-03
                         :  154 : vars      : 3.618e-03
                         :  155 : vars      : 3.611e-03
                         :  156 : vars      : 3.577e-03
                         :  157 : vars      : 3.576e-03
                         :  158 : vars      : 3.549e-03
                         :  159 : vars      : 3.517e-03
                         :  160 : vars      : 3.502e-03
                         :  161 : vars      : 3.484e-03
                         :  162 : vars      : 3.477e-03
                         :  163 : vars      : 3.475e-03
                         :  164 : vars      : 3.445e-03
                         :  165 : vars      : 3.405e-03
                         :  166 : vars      : 3.390e-03
                         :  167 : vars      : 3.349e-03
                         :  168 : vars      : 3.347e-03
                         :  169 : vars      : 3.341e-03
                         :  170 : vars      : 3.338e-03
                         :  171 : vars      : 3.283e-03
                         :  172 : vars      : 3.281e-03
                         :  173 : vars      : 3.255e-03
                         :  174 : vars      : 3.225e-03
                         :  175 : vars      : 3.212e-03
                         :  176 : vars      : 3.192e-03
                         :  177 : vars      : 3.145e-03
                         :  178 : vars      : 3.142e-03
                         :  179 : vars      : 3.140e-03
                         :  180 : vars      : 3.134e-03
                         :  181 : vars      : 3.112e-03
                         :  182 : vars      : 3.105e-03
                         :  183 : vars      : 3.102e-03
                         :  184 : vars      : 3.088e-03
                         :  185 : vars      : 3.059e-03
                         :  186 : vars      : 3.056e-03
                         :  187 : vars      : 3.048e-03
                         :  188 : vars      : 3.028e-03
                         :  189 : vars      : 3.026e-03
                         :  190 : vars      : 3.025e-03
                         :  191 : vars      : 3.025e-03
                         :  192 : vars      : 2.991e-03
                         :  193 : vars      : 2.960e-03
                         :  194 : vars      : 2.857e-03
                         :  195 : vars      : 2.834e-03
                         :  196 : vars      : 2.833e-03
                         :  197 : vars      : 2.816e-03
                         :  198 : vars      : 2.807e-03
                         :  199 : vars      : 2.790e-03
                         :  200 : vars      : 2.760e-03
                         :  201 : vars      : 2.759e-03
                         :  202 : vars      : 2.750e-03
                         :  203 : vars      : 2.739e-03
                         :  204 : vars      : 2.675e-03
                         :  205 : vars      : 2.669e-03
                         :  206 : vars      : 2.627e-03
                         :  207 : vars      : 2.609e-03
                         :  208 : vars      : 2.576e-03
                         :  209 : vars      : 2.574e-03
                         :  210 : vars      : 2.573e-03
                         :  211 : vars      : 2.554e-03
                         :  212 : vars      : 2.496e-03
                         :  213 : vars      : 2.477e-03
                         :  214 : vars      : 2.440e-03
                         :  215 : vars      : 2.402e-03
                         :  216 : vars      : 2.379e-03
                         :  217 : vars      : 2.326e-03
                         :  218 : vars      : 2.293e-03
                         :  219 : vars      : 2.279e-03
                         :  220 : vars      : 2.187e-03
                         :  221 : vars      : 2.158e-03
                         :  222 : vars      : 2.143e-03
                         :  223 : vars      : 2.102e-03
                         :  224 : vars      : 2.088e-03
                         :  225 : vars      : 2.081e-03
                         :  226 : vars      : 2.062e-03
                         :  227 : vars      : 1.939e-03
                         :  228 : vars      : 1.921e-03
                         :  229 : vars      : 1.903e-03
                         :  230 : vars      : 1.849e-03
                         :  231 : vars      : 1.828e-03
                         :  232 : vars      : 1.705e-03
                         :  233 : vars      : 1.705e-03
                         :  234 : vars      : 1.701e-03
                         :  235 : vars      : 1.693e-03
                         :  236 : vars      : 1.649e-03
                         :  237 : vars      : 1.624e-03
                         :  238 : vars      : 1.555e-03
                         :  239 : vars      : 1.429e-03
                         :  240 : vars      : 1.417e-03
                         :  241 : vars      : 1.361e-03
                         :  242 : vars      : 1.345e-03
                         :  243 : vars      : 1.220e-03
                         :  244 : vars      : 1.039e-03
                         :  245 : vars      : 6.119e-04
                         :  246 : vars      : 0.000e+00
                         :  247 : vars      : 0.000e+00
                         :  248 : vars      : 0.000e+00
                         :  249 : vars      : 0.000e+00
                         :  250 : vars      : 0.000e+00
                         :  251 : vars      : 0.000e+00
                         :  252 : vars      : 0.000e+00
                         :  253 : vars      : 0.000e+00
                         :  254 : vars      : 0.000e+00
                         :  255 : vars      : 0.000e+00
                         :  256 : vars      : 0.000e+00
                         : --------------------------------------
                         : No variable ranking supplied by classifier: TMVA_DNN_CPU
                         : No variable ranking supplied by classifier: TMVA_CNN_CPU
TH1.Print Name  = TrainingHistory_TMVA_DNN_CPU_trainingError, Entries= 0, Total sum= 5.13744
TH1.Print Name  = TrainingHistory_TMVA_DNN_CPU_valError, Entries= 0, Total sum= 11.9201
TH1.Print Name  = TrainingHistory_TMVA_CNN_CPU_trainingError, Entries= 0, Total sum= 13.4482
TH1.Print Name  = TrainingHistory_TMVA_CNN_CPU_valError, Entries= 0, Total sum= 9.37891
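
The four TrainingHistory_* histograms printed above are stored in the factory output file; a hedged sketch for retrieving one after the run (the in-file directory path below is an assumption and may differ between ROOT versions):

    #include "TFile.h"
    #include "TH1.h"

    TFile *f = TFile::Open("TMVA_CNN_ClassificationOutput.root");
    auto *hErr = f->Get<TH1>(
       "dataset/Method_TMVA_DNN_CPU/TMVA_DNN_CPU/TrainingHistory_TMVA_DNN_CPU_trainingError"); // path assumed
    if (hErr) hErr->Draw();
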
Factory                  : === Destroy and recreate all methods via weight files for testing ===
                         : 
                         : Reading weight file: dataset/weights/TMVA_CNN_Classification_BDT.weights.xml
                         : Reading weight file: dataset/weights/TMVA_CNN_Classification_TMVA_DNN_CPU.weights.xml
                         : Reading weight file: dataset/weights/TMVA_CNN_Classification_TMVA_CNN_CPU.weights.xml
Factory                  : Test all methods
Factory                  : Test method: BDT for Classification performance
                         : 
BDT                      : [dataset] : Evaluation of BDT on testing sample (400 events)
                         : Elapsed time for evaluation of 400 events: 0.00466 sec       
Factory                  : Test method: TMVA_DNN_CPU for Classification performance
                         : 
                         : Evaluate deep neural network on CPU using batches with size = 400
                         : 
TMVA_DNN_CPU             : [dataset] : Evaluation of TMVA_DNN_CPU on testing sample (400 events)
                         : Elapsed time for evaluation of 400 events: 0.0251 sec       
Factory                  : Test method: TMVA_CNN_CPU for Classification performance
                         : 
                         : Evaluate deep neural network on CPU using batches with size = 400
                         : 
TMVA_CNN_CPU             : [dataset] : Evaluation of TMVA_CNN_CPU on testing sample (400 events)
                         : Elapsed time for evaluation of 400 events: 0.128 sec       
Factory                  : Evaluate all methods
Factory                  : Evaluate classifier: BDT
                         : 
BDT                      : [dataset] : Loop over test events and fill histograms with classifier response...
                         : 
                         : Dataset[dataset] : variable plots are not produced! The number of variables is 256, which is larger than 200
Factory                  : Evaluate classifier: TMVA_DNN_CPU
                         : 
TMVA_DNN_CPU             : [dataset] : Loop over test events and fill histograms with classifier response...
                         : 
                         : Evaluate deep neural network on CPU using batches with size = 1000
                         : 
                         : Dataset[dataset] : variable plots are not produced! The number of variables is 256, which is larger than 200
Factory                  : Evaluate classifier: TMVA_CNN_CPU
                         : 
TMVA_CNN_CPU             : [dataset] : Loop over test events and fill histograms with classifier response...
                         : 
                         : Evaluate deep neural network on CPU using batches with size = 1000
                         : 
                         : Dataset[dataset] : variable plots are not produced! The number of variables is 256, which is larger than 200
                         : 
                         : Evaluation results ranked by best signal efficiency and purity (area)
                         : -------------------------------------------------------------------------------------------------------------------
                         : DataSet       MVA                       
                         : Name:         Method:          ROC-integ
                         : dataset       TMVA_CNN_CPU   : 0.756
                         : dataset       BDT            : 0.744
                         : dataset       TMVA_DNN_CPU   : 0.692
                         : -------------------------------------------------------------------------------------------------------------------
                         : 
                         : Testing efficiency compared to training efficiency (overtraining check)
                         : -------------------------------------------------------------------------------------------------------------------
                         : DataSet              MVA              Signal efficiency: from test sample (from training sample) 
                         : Name:                Method:          @B=0.01             @B=0.10            @B=0.30   
                         : -------------------------------------------------------------------------------------------------------------------
                         : dataset              TMVA_CNN_CPU   : 0.025 (0.095)       0.365 (0.411)      0.680 (0.748)
                         : dataset              BDT            : 0.045 (0.375)       0.345 (0.709)      0.638 (0.883)
                         : dataset              TMVA_DNN_CPU   : 0.025 (0.105)       0.305 (0.545)      0.600 (0.736)
                         : -------------------------------------------------------------------------------------------------------------------
                         : 
Dataset:dataset          : Created tree 'TestTree' with 400 events
                         : 
Dataset:dataset          : Created tree 'TrainTree' with 1600 events
                         : 
Factory                  : Thank you for using TMVA!
                         : For citation information, please visit: http://tmva.sf.net/citeTMVA.html
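
The Train / Test / Evaluate phases above map onto the closing calls of a standard TMVA workflow. A sketch reusing the factory and loader from the earlier fragments; GetROCIntegral is assumed to be available as in current TMVA::Factory, and the printed values come from the run, not the code:

    #include <iostream>

    factory->TrainAllMethods();     // "Train all methods"
    factory->TestAllMethods();      // "Test all methods"
    factory->EvaluateAllMethods();  // "Evaluate all methods": fills TestTree / TrainTree

    for (TString m : {"BDT", "TMVA_DNN_CPU", "TMVA_CNN_CPU"})
       std::cout << m << " ROC-integ = " << factory->GetROCIntegral(loader, m) << std::endl;

    outFile->Close();
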