// Author: Andreas Hoecker, Joerg Stelzer, Helge Voss, Kai Voss, Eckhard v. Toerne, Jan Therhaag

/**********************************************************************************
 * Project: TMVA - a Root-integrated toolkit for multivariate data analysis       *
 * Package: TMVA                                                                  *
 * Class  : MethodBDT (BDT = Boosted Decision Trees)                              *
 * Web    : http://tmva.sourceforge.net                                           *
 *                                                                                *
 * Description:                                                                   *
 *      Analysis of Boosted Decision Trees                                        *
 *                                                                                *
 * Authors (alphabetical):                                                        *
 *      Andreas Hoecker <Andreas.Hocker@cern.ch> - CERN, Switzerland              *
 *      Helge Voss      <Helge.Voss@cern.ch>     - MPI-K Heidelberg, Germany      *
 *      Kai Voss        <Kai.Voss@cern.ch>       - U. of Victoria, Canada         *
 *      Doug Schouten   <dschoute@sfu.ca>        - Simon Fraser U., Canada        *
 *      Jan Therhaag    <jan.therhaag@cern.ch>   - U. of Bonn, Germany            *
 *      Eckhard v. Toerne     <evt@uni-bonn.de>        - U of Bonn, Germany       *
 *                                                                                *
 * Copyright (c) 2005-2011:                                                       *
 *      CERN, Switzerland                                                         *
 *      U. of Victoria, Canada                                                    *
 *      MPI-K Heidelberg, Germany                                                 *
 *      U. of Bonn, Germany                                                       *
 *                                                                                *
 * Redistribution and use in source and binary forms, with or without             *
 * modification, are permitted according to the terms listed in LICENSE           *
 * (http://tmva.sourceforge.net/LICENSE)                                          *
 **********************************************************************************/

//_______________________________________________________________________
//
// Analysis of Boosted Decision Trees
//
// Boosted decision trees have been successfully used in High Energy
// Physics analysis, for example by the MiniBooNE experiment
// (Yang-Roe-Zhu, physics/0508045). In Boosted Decision Trees, the
// selection is based on a majority vote over the results of several
// decision trees, which are all derived from the same training sample
// by supplying different event weights during the training.
//
// Decision trees:
//
// Successive decision nodes are used to categorize the events of the
// sample as either signal or background. Each node uses only a single
// discriminating variable to decide if the event is signal-like ("goes
// right") or background-like ("goes left"). This forms a tree-like
// structure with "baskets" at the end (leaf nodes), and an event is
// classified as either signal or background according to whether the
// basket where it ends up has been classified as signal or background
// during the training. Training a decision tree is the process of
// defining the "cut criteria" for each node. The training starts with
// the root node: here one takes the full training event sample and
// selects the variable and corresponding cut value that gives the best
// separation between signal and background at this stage. Using this
// cut criterion, the sample is then divided into two subsamples, a
// signal-like (right) and a background-like (left) sample. Two new
// nodes are then created for each of the two sub-samples and they are
// constructed using the same mechanism as described for the root node.
// The division is stopped once a certain node has reached either a
// minimum number of events or a minimum or maximum signal purity. These
// leaf nodes are then called "signal" or "background" depending on
// whether they contain more signal or background events from the
// training sample.
//
// Boosting:
//
// The idea behind adaptive boosting (AdaBoost) is that signal events
// from the training sample that end up in a background node (and vice
// versa) are given a larger weight than events that are in the correct
// leaf node. This results in a re-weighted training event sample, with
// which a new decision tree can then be developed. The boosting can be
// applied several times (typically 100-500 times) and one ends up with
// a set of decision trees (a forest).
// Gradient boosting works more like a function expansion approach, where
// each tree corresponds to a summand. The parameters for each summand
// (tree) are determined by the minimization of an error function
// (binomial log-likelihood for classification and Huber loss for
// regression). A greedy algorithm is used, which means that only one
// tree is modified at a time, while the other trees stay fixed.
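//
// As a rough sketch of the AdaBoost re-weighting (the exact convention is
// the one implemented in the AdaBoost() routine below and described in the
// TMVA Users Guide): if err is the weighted misclassification rate of the
// current tree and beta is the AdaBoostBeta parameter, misclassified events
// have their weights multiplied by
//
//    alpha = ((1 - err)/err)^beta
//
// and the weights are subsequently renormalised such that their sum stays
// constant.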
//
// Bagging:
//
// In this particular variant of the Boosted Decision Trees the boosting
// is not done on the basis of previous training results, but by a simple
// stochastic re-sampling of the initial training event sample.
//
// Random Trees:
// Similar to the "Random Forests" from Leo Breiman and Adele Cutler, it
// uses the bagging algorithm together and bases the determination of the
// best node-split during the training on a random subset of variables only
// which is individually chosen for each split.
//
// Analysis:
//
// Applying an individual decision tree to a test event results in a
// classification of the event as either signal or background. For the
// boosted decision tree selection, an event is successively subjected to
// the whole set of decision trees and, depending on how often it is
// classified as signal, a "likelihood" estimator is constructed for the
// event being signal or background. The value of this estimator is then
// used to select the events from an event sample, and the cut value on
// this estimator defines the efficiency and purity of the selection.
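//
// Schematically (a sketch of the combination, not the exact normalisation
// used in the code): for a forest of trees with boost weights alpha_i and
// per-tree responses h_i(x) (either the leaf type +-1 or the leaf purity,
// depending on UseYesNoLeaf), the BDT response is of the form
//
//    y(x) = ( Sum_i alpha_i * h_i(x) ) / ( Sum_i alpha_i )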
//
//_______________________________________________________________________
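
// Usage sketch (user code, not part of this implementation file; the
// file/tree setup is purely illustrative): the method is typically booked
// through the TMVA Factory with an option string built from the options
// declared in DeclareOptions(), e.g.
//
//    TMVA::Factory factory("TMVAClassification", outputFile,
//                          "AnalysisType=Classification");
//    // ... AddVariable / AddSignalTree / AddBackgroundTree calls ...
//    factory.BookMethod(TMVA::Types::kBDT, "BDT",
//                       "NTrees=800:MaxDepth=3:MinNodeSize=5%:nCuts=20:"
//                       "BoostType=AdaBoost:AdaBoostBeta=0.5:SeparationType=GiniIndex");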

#include <algorithm>

#include <math.h>
#include <fstream>

#include "Riostream.h"
#include "TRandom3.h"
#include "TMath.h"
#include "TObjString.h"
#include "TGraph.h"

#include "TMVA/ClassifierFactory.h"
#include "TMVA/MethodBDT.h"
#include "TMVA/Tools.h"
#include "TMVA/Timer.h"
#include "TMVA/Ranking.h"
#include "TMVA/SdivSqrtSplusB.h"
#include "TMVA/BinarySearchTree.h"
#include "TMVA/SeparationBase.h"
#include "TMVA/GiniIndex.h"
#include "TMVA/GiniIndexWithLaplace.h"
#include "TMVA/CrossEntropy.h"
#include "TMVA/MisClassificationError.h"
#include "TMVA/Results.h"
#include "TMVA/ResultsMulticlass.h"
#include "TMVA/Interval.h"
#include "TMVA/LogInterval.h"
#include "TMVA/PDF.h"
#include "TMVA/BDTEventWrapper.h"

#include "TMatrixTSym.h"

using std::vector;
using std::make_pair;

REGISTER_METHOD(BDT)

ClassImp(TMVA::MethodBDT)

   const Int_t TMVA::MethodBDT::fgDebugLevel = 0;

//_______________________________________________________________________
TMVA::MethodBDT::MethodBDT( const TString& jobName,
                            const TString& methodTitle,
                            DataSetInfo& theData,
                            const TString& theOption,
                            TDirectory* theTargetDir ) :
   TMVA::MethodBase( jobName, Types::kBDT, methodTitle, theData, theOption, theTargetDir )
   , fTrainSample(0)
   , fNTrees(0)
   , fSigToBkgFraction(0) 
   , fAdaBoostBeta(0)
   , fTransitionPoint(0)
   , fShrinkage(0)
   , fBaggedBoost(kFALSE)
   , fBaggedGradBoost(kFALSE)
   , fSumOfWeights(0)
   , fMinNodeEvents(0)
   , fMinNodeSize(5)
   , fMinNodeSizeS("5%")
   , fNCuts(0)
   , fUseFisherCuts(0)        // don't use this initialisation, only here to make Coverity happy. Is set in DeclareOptions()
   , fMinLinCorrForFisher(.8) // don't use this initialisation, only here to make Coverity happy. Is set in DeclareOptions()
   , fUseExclusiveVars(0)     // don't use this initialisation, only here to make Coverity happy. Is set in DeclareOptions()
   , fUseYesNoLeaf(kFALSE)
   , fNodePurityLimit(0)
   , fNNodesMax(0)
   , fMaxDepth(0)
   , fPruneMethod(DecisionTree::kNoPruning)
   , fPruneStrength(0)
   , fFValidationEvents(0)
   , fAutomatic(kFALSE)
   , fRandomisedTrees(kFALSE)
   , fUseNvars(0)
   , fUsePoissonNvars(0)  // don't use this initialisation, only here to make  Coverity happy. Is set in Init()
   , fUseNTrainEvents(0)
   , fBaggedSampleFraction(0)
   , fNoNegWeightsInTraining(kFALSE)
   , fInverseBoostNegWeights(kFALSE)
   , fPairNegWeightsGlobal(kFALSE)
   , fTrainWithNegWeights(kFALSE)
   , fDoBoostMonitor(kFALSE)
   , fITree(0)
   , fBoostWeight(0)
   , fErrorFraction(0)
   , fCss(0)
   , fCts_sb(0)
   , fCtb_ss(0)
   , fCbb(0)
   , fDoPreselection(kFALSE)
   , fHistoricBool(kFALSE) 
{
   // the standard constructor for the "boosted decision trees"
   fMonitorNtuple = NULL;
   fSepType = NULL;
}

//_______________________________________________________________________
TMVA::MethodBDT::MethodBDT( DataSetInfo& theData,
                            const TString& theWeightFile,
                            TDirectory* theTargetDir )
   : TMVA::MethodBase( Types::kBDT, theData, theWeightFile, theTargetDir )
   , fTrainSample(0)
   , fNTrees(0)
   , fSigToBkgFraction(0) 
   , fAdaBoostBeta(0)
   , fTransitionPoint(0)
   , fShrinkage(0)
   , fBaggedBoost(kFALSE)
   , fBaggedGradBoost(kFALSE)
   , fSumOfWeights(0)
   , fMinNodeEvents(0)
   , fMinNodeSize(5)
   , fMinNodeSizeS("5%")
   , fNCuts(0)
   , fUseFisherCuts(0)        // don't use this initialisation, only here to make Coverity happy. Is set in DeclareOptions()
   , fMinLinCorrForFisher(.8) // don't use this initialisation, only here to make Coverity happy. Is set in DeclareOptions()
   , fUseExclusiveVars(0)     // don't use this initialisation, only here to make Coverity happy. Is set in DeclareOptions()
   , fUseYesNoLeaf(kFALSE)
   , fNodePurityLimit(0)
   , fNNodesMax(0)
   , fMaxDepth(0)
   , fPruneMethod(DecisionTree::kNoPruning)
   , fPruneStrength(0)
   , fFValidationEvents(0)
   , fAutomatic(kFALSE)
   , fRandomisedTrees(kFALSE)
   , fUseNvars(0)
   , fUsePoissonNvars(0)  // don't use this initialisation, only here to make  Coverity happy. Is set in Init()
   , fUseNTrainEvents(0)
   , fBaggedSampleFraction(0)
   , fNoNegWeightsInTraining(kFALSE)
   , fInverseBoostNegWeights(kFALSE)
   , fPairNegWeightsGlobal(kFALSE)
   , fTrainWithNegWeights(kFALSE)
   , fDoBoostMonitor(kFALSE)
   , fITree(0)
   , fBoostWeight(0)
   , fErrorFraction(0)
   , fCss(0)
   , fCts_sb(0)
   , fCtb_ss(0)
   , fCbb(0)
   , fDoPreselection(kFALSE)
   , fHistoricBool(kFALSE) 
{
   fMonitorNtuple = NULL;
   fSepType = NULL;
   // constructor for calculating the BDT-MVA using previously generated decision trees.
   // The results of the previous training (the decision trees) are read in via the
   // weight file. Make sure that the variables correspond to the ones used in
   // creating the "weight"-file
}

//_______________________________________________________________________
Bool_t TMVA::MethodBDT::HasAnalysisType( Types::EAnalysisType type, UInt_t numberClasses, UInt_t numberTargets )
{
   // BDT can handle classification with multiple classes and regression with one regression-target
   if (type == Types::kClassification && numberClasses == 2) return kTRUE;
   if (type == Types::kMulticlass ) return kTRUE;
   if( type == Types::kRegression && numberTargets == 1 ) return kTRUE;
   return kFALSE;
}

//_______________________________________________________________________
void TMVA::MethodBDT::DeclareOptions()
{
   // define the options (their key words) that can be set in the option string
   // known options:
   // nTrees        number of trees in the forest to be created
   // BoostType     the boosting type for the trees in the forest (AdaBoost etc.)
   //                  known: AdaBoost
   //                         AdaBoostR2 (AdaBoost for regression)
   //                         Bagging
   //                         GradBoost
   // AdaBoostBeta     the boosting parameter, beta, for AdaBoost
   // UseRandomisedTrees  choose at each node splitting a random set of variables
   // UseNvars         use UseNvars variables in randomised trees
   // UsePoissonNvars  use UseNvars not as a fixed number but as the mean of a Poisson distribution
   // SeparationType   the separation criterion applied in the node splitting
   //                  known: GiniIndex
   //                         MisClassificationError
   //                         CrossEntropy
   //                         SDivSqrtSPlusB
   // MinNodeSize:     minimum percentage of training events in a leaf node (leaf criterion, stop splitting)
   // nCuts:           the number of steps in the optimisation of the cut for a node (if < 0, then
   //                  the step size is determined by the events)
   // UseFisherCuts:   use multivariate splits using the Fisher criterion
   // UseYesNoLeaf     decide if the classification is done simply by the node type, or by the S/B
   //                  (from the training) in the leaf node
   // NodePurityLimit  the minimum purity to classify a node as a signal node (used in pruning and boosting to determine
   //                  the misclassification error rate)
   // PruneMethod      the pruning method:
   //                  known: NoPruning  // switch off pruning completely
   //                         ExpectedError
   //                         CostComplexity
   // PruneStrength    a parameter to adjust the amount of pruning. Should be large enough such that overtraining is avoided.
   // PruningValFraction   fraction of events to use for optimizing pruning (only if PruneStrength < 0, i.e. automatic pruning)
   // NegWeightTreatment      IgnoreNegWeightsInTraining  ignore events with negative weights in the training
   //                         DecreaseBoostWeight         boost events with negative weight with 1/boostweight instead of boostweight
   //                         PairNegWeightsGlobal        pair events with negative and positive weights in the training sample and "annihilate" them
   // MaxDepth         maximum depth of the decision tree allowed before further splitting is stopped
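   //
   // An illustrative option string combining some of the options above (values
   // are examples only; the actual defaults are set below via DeclareOptionRef):
   //   "NTrees=1000:BoostType=Grad:Shrinkage=0.10:UseBaggedBoost:BaggedSampleFraction=0.6:nCuts=20:MaxDepth=3"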

   DeclareOptionRef(fNTrees, "NTrees", "Number of trees in the forest");
   if (DoRegression()) {
      DeclareOptionRef(fMaxDepth=50,"MaxDepth","Max depth of the decision tree allowed");
   }else{
      DeclareOptionRef(fMaxDepth=3,"MaxDepth","Max depth of the decision tree allowed");
   }

   TString tmp="5%"; if (DoRegression()) tmp="0.2%";
   DeclareOptionRef(fMinNodeSizeS=tmp, "MinNodeSize", "Minimum percentage of training events required in a leaf node (default: Classification: 5%, Regression: 0.2%)");
   // MinNodeSize:     minimum percentage of training events in a leaf node (leaf criteria, stop splitting)
   DeclareOptionRef(fNCuts, "nCuts", "Number of grid points in variable range used in finding optimal cut in node splitting");

   DeclareOptionRef(fBoostType, "BoostType", "Boosting type for the trees in the forest (note: AdaCost is still experimental)");

   AddPreDefVal(TString("AdaBoost"));
   AddPreDefVal(TString("RealAdaBoost"));
   AddPreDefVal(TString("AdaCost"));
   AddPreDefVal(TString("Bagging"));
   //   AddPreDefVal(TString("RegBoost"));
   AddPreDefVal(TString("AdaBoostR2"));
   AddPreDefVal(TString("Grad"));
   if (DoRegression()) {
      fBoostType = "AdaBoostR2";
   }else{
      fBoostType = "AdaBoost";
   }
   DeclareOptionRef(fAdaBoostR2Loss="Quadratic", "AdaBoostR2Loss", "Type of Loss function in AdaBoostR2");
   AddPreDefVal(TString("Linear"));
   AddPreDefVal(TString("Quadratic"));
   AddPreDefVal(TString("Exponential"));

   DeclareOptionRef(fBaggedBoost=kFALSE, "UseBaggedBoost","Use only a random subsample of all events for growing the trees in each boost iteration.");
   DeclareOptionRef(fShrinkage=1.0, "Shrinkage", "Learning rate for GradBoost algorithm");
   DeclareOptionRef(fAdaBoostBeta=.5, "AdaBoostBeta", "Learning rate  for AdaBoost algorithm");
   DeclareOptionRef(fRandomisedTrees,"UseRandomisedTrees","Determine at each node splitting the cut variable only as the best out of a random subset of variables (like in RandomForests)");
   DeclareOptionRef(fUseNvars,"UseNvars","Size of the subset of variables used with RandomisedTree option");
   DeclareOptionRef(fUsePoissonNvars,"UsePoissonNvars", "Interpret \"UseNvars\" not as fixed number but as mean of a Poisson distribution in each split with RandomisedTree option");
   DeclareOptionRef(fBaggedSampleFraction=.6,"BaggedSampleFraction","Relative size of bagged event sample to original size of the data sample (used whenever bagging is used, i.e. UseBaggedBoost, Bagging)" );

   DeclareOptionRef(fUseYesNoLeaf=kTRUE, "UseYesNoLeaf",
                    "Use Sig or Bkg categories, or the purity=S/(S+B) as classification of the leaf node -> Real-AdaBoost");
   if (DoRegression()) {
      fUseYesNoLeaf = kFALSE;
   }

   DeclareOptionRef(fNegWeightTreatment="InverseBoostNegWeights","NegWeightTreatment","How to treat events with negative weights in the BDT training (in particular the boosting): IgnoreInTraining; Boost with inverse boostweight; Pair events with negative and positive weights in the training sample and *annihilate* them (experimental!)");
   AddPreDefVal(TString("InverseBoostNegWeights"));
   AddPreDefVal(TString("IgnoreNegWeightsInTraining"));
   AddPreDefVal(TString("NoNegWeightsInTraining"));    // well, let's be nice to users and keep at least this old name anyway .. 
   AddPreDefVal(TString("PairNegWeightsGlobal"));
   AddPreDefVal(TString("Pray"));



   DeclareOptionRef(fCss=1.,   "Css",   "AdaCost: cost of true signal selected signal"); 
   DeclareOptionRef(fCts_sb=1.,"Cts_sb","AdaCost: cost of true signal selected bkg"); 
   DeclareOptionRef(fCtb_ss=1.,"Ctb_ss","AdaCost: cost of true bkg    selected signal"); 
   DeclareOptionRef(fCbb=1.,   "Cbb",   "AdaCost: cost of true bkg    selected bkg "); 

   DeclareOptionRef(fNodePurityLimit=0.5, "NodePurityLimit", "In boosting/pruning, nodes with purity > NodePurityLimit are signal; background otherwise.");


   DeclareOptionRef(fSepTypeS, "SeparationType", "Separation criterion for node splitting");
   AddPreDefVal(TString("CrossEntropy"));
   AddPreDefVal(TString("GiniIndex"));
   AddPreDefVal(TString("GiniIndexWithLaplace"));
   AddPreDefVal(TString("MisClassificationError"));
   AddPreDefVal(TString("SDivSqrtSPlusB"));
   AddPreDefVal(TString("RegressionVariance"));
   if (DoRegression()) {
      fSepTypeS = "RegressionVariance";
   }else{
      fSepTypeS = "GiniIndex";
   }

   DeclareOptionRef(fDoBoostMonitor=kFALSE,"DoBoostMonitor","Create control plot with ROC integral vs tree number");

   DeclareOptionRef(fUseFisherCuts=kFALSE, "UseFisherCuts", "Use multivariate splits using the Fisher criterion");
   DeclareOptionRef(fMinLinCorrForFisher=.8,"MinLinCorrForFisher", "The minimum linear correlation between two variables demanded for use in Fisher criterion in node splitting");
   DeclareOptionRef(fUseExclusiveVars=kFALSE,"UseExclusiveVars","Variables already used in the Fisher criterion are not analysed individually anymore for node splitting");


   DeclareOptionRef(fDoPreselection=kFALSE,"DoPreselection","apply automatic pre-selection for 100% efficient signal (bkg) cuts prior to training");


   DeclareOptionRef(fSigToBkgFraction=1,"SigToBkgFraction","Sig to Bkg ratio used in Training (similar to NodePurityLimit, which cannot be used in real AdaBoost)"); 

   DeclareOptionRef(fPruneMethodS, "PruneMethod", "Note: for BDTs use small trees (e.g. MaxDepth=3) and NoPruning:  Pruning: Method used for pruning (removal) of statistically insignificant branches ");
   AddPreDefVal(TString("NoPruning"));
   AddPreDefVal(TString("ExpectedError"));
   AddPreDefVal(TString("CostComplexity"));

   DeclareOptionRef(fPruneStrength, "PruneStrength", "Pruning strength");

   DeclareOptionRef(fFValidationEvents=0.5, "PruningValFraction", "Fraction of events to use for optimizing automatic pruning.");

   // deprecated options, still kept for the moment:
   DeclareOptionRef(fMinNodeEvents=0, "nEventsMin", "deprecated: Use MinNodeSize (in % of training events) instead");

   DeclareOptionRef(fBaggedGradBoost=kFALSE, "UseBaggedGrad","deprecated: Use *UseBaggedBoost* instead:  Use only a random subsample of all events for growing the trees in each iteration.");
   DeclareOptionRef(fBaggedSampleFraction, "GradBaggingFraction","deprecated: Use *BaggedSampleFraction* instead: Defines the fraction of events to be used in each iteration, e.g. when UseBaggedGrad=kTRUE. ");
   DeclareOptionRef(fUseNTrainEvents,"UseNTrainEvents","deprecated: Use *BaggedSampleFraction* instead: Number of randomly picked training events used in randomised (and bagged) trees");
   DeclareOptionRef(fNNodesMax,"NNodesMax","deprecated: Use MaxDepth instead to limit the tree size" );


}

//_______________________________________________________________________
void TMVA::MethodBDT::DeclareCompatibilityOptions() {
   // options that are used ONLY for the READER to ensure backward compatibility

   MethodBase::DeclareCompatibilityOptions();


   DeclareOptionRef(fHistoricBool=kTRUE, "UseWeightedTrees",
                    "Use weighted trees or simple average in classification from the forest");
   DeclareOptionRef(fHistoricBool=kFALSE, "PruneBeforeBoost", "Flag to prune the tree before applying boosting algorithm");
   DeclareOptionRef(fHistoricBool=kFALSE,"RenormByClass","Individually re-normalize each event class to the original size after boosting");

   AddPreDefVal(TString("NegWeightTreatment"),TString("IgnoreNegWeights"));

}




//_______________________________________________________________________
void TMVA::MethodBDT::ProcessOptions()
{
   // the option string is decoded, for available options see "DeclareOptions"

   fSepTypeS.ToLower();
   if      (fSepTypeS == "misclassificationerror") fSepType = new MisClassificationError();
   else if (fSepTypeS == "giniindex")              fSepType = new GiniIndex();
   else if (fSepTypeS == "giniindexwithlaplace")   fSepType = new GiniIndexWithLaplace();
   else if (fSepTypeS == "crossentropy")           fSepType = new CrossEntropy();
   else if (fSepTypeS == "sdivsqrtsplusb")         fSepType = new SdivSqrtSplusB();
   else if (fSepTypeS == "regressionvariance")     fSepType = NULL;
   else {
      Log() << kINFO << GetOptions() << Endl;
      Log() << kFATAL << "<ProcessOptions> unknown Separation Index option " << fSepTypeS << " called" << Endl;
   }

   fPruneMethodS.ToLower();
   if      (fPruneMethodS == "expectederror")  fPruneMethod = DecisionTree::kExpectedErrorPruning;
   else if (fPruneMethodS == "costcomplexity") fPruneMethod = DecisionTree::kCostComplexityPruning;
   else if (fPruneMethodS == "nopruning")      fPruneMethod = DecisionTree::kNoPruning;
   else {
      Log() << kINFO << GetOptions() << Endl;
      Log() << kFATAL << "<ProcessOptions> unknown PruneMethod " << fPruneMethodS << " option called" << Endl;
   }
   if (fPruneStrength < 0 && (fPruneMethod != DecisionTree::kNoPruning) && fBoostType!="Grad") fAutomatic = kTRUE;
   else fAutomatic = kFALSE;
   if (fAutomatic && fPruneMethod==DecisionTree::kExpectedErrorPruning){
      Log() << kFATAL 
            <<  "Sorry autmoatic pruning strength determination is not implemented yet for ExpectedErrorPruning" << Endl;
   }


   if (fMinNodeEvents > 0){
      fMinNodeSize = Double_t(fMinNodeEvents*100.) / Data()->GetNTrainingEvents();
      Log() << kWARNING << "You have explicitly set ** nEventsMin = " << fMinNodeEvents<<" ** the min ablsolut number \n"
            << "of events in a leaf node. This is DEPRECATED, please use the option \n"
            << "*MinNodeSize* giving the relative number as percentage of training \n"
            << "events instead. \n"
            << "nEventsMin="<<fMinNodeEvents<< "--> MinNodeSize="<<fMinNodeSize<<"%" 
            << Endl;
      Log() << kWARNING << "Note also that explicitly setting *nEventsMin* so far OVERWRITES the option recomeded \n" 
            << " *MinNodeSize* = " << fMinNodeSizeS << " option !!" << Endl ;         
      fMinNodeSizeS = Form("%3.2f",fMinNodeSize);
      
   }else{
      SetMinNodeSize(fMinNodeSizeS);
   }


   fAdaBoostR2Loss.ToLower();
   
   if (fBoostType=="Grad") {
      fPruneMethod = DecisionTree::kNoPruning;
      if (fNegWeightTreatment=="InverseBoostNegWeights"){
         Log() << kWARNING << "the option *InverseBoostNegWeights* does not exist for BoostType=Grad --> change to *IgnoreNegWeightsInTraining*" << Endl;
         fNegWeightTreatment="IgnoreNegWeightsInTraining";
         fNoNegWeightsInTraining=kTRUE;
      }
   } else if (fBoostType=="RealAdaBoost"){
      fBoostType    = "AdaBoost";
      fUseYesNoLeaf = kFALSE;
   } else if (fBoostType=="AdaCost"){
      fUseYesNoLeaf = kFALSE;
   }

   if (fFValidationEvents < 0.0) fFValidationEvents = 0.0;
   if (fAutomatic && fFValidationEvents > 0.5) {
      Log() << kWARNING << "You have chosen to use more than half of your training sample "
            << "to optimize the automatic pruning algorithm. This is probably wasteful "
            << "and your overall results will be degraded. Are you sure you want this?"
            << Endl;
   }


   if (this->Data()->HasNegativeEventWeights()){
      Log() << kINFO << " You are using a Monte Carlo that has also negative weights. "
            << "That should in principle be fine as long as on average you end up with "
            << "something positive. For this you have to make sure that the minimal number "
            << "of (un-weighted) events demanded for a tree node (currently you use: MinNodeSize="
            << fMinNodeSizeS << "  ("<< fMinNodeSize << "%)" 
            <<", (or the deprecated equivalent nEventsMin) you can set this via the " 
            <<"BDT option string when booking the "
            << "classifier) is large enough to allow for reasonable averaging!!! "
            << " If this does not help.. maybe you want to try the option: IgnoreNegWeightsInTraining  "
            << "which ignores events with negative weight in the training. " << Endl
            << Endl << "Note: You'll get a WARNING message during the training if that should ever happen" << Endl;
   }

   if (DoRegression()) {
      if (fUseYesNoLeaf && !IsConstructedFromWeightFile()){
         Log() << kWARNING << "Regression Trees do not work with fUseYesNoLeaf=TRUE --> I will set it to FALSE" << Endl;
         fUseYesNoLeaf = kFALSE;
      }

      if (fSepType != NULL){
         Log() << kWARNING << "Regression Trees do not work with Separation type other than <RegressionVariance> --> I will use it instead" << Endl;
         fSepType = NULL;
      }
      if (fUseFisherCuts){
         Log() << kWARNING << "Sorry, UseFisherCuts is not available for regression analysis, I will ignore it!" << Endl;
         fUseFisherCuts = kFALSE;
      }
      if (fNCuts < 0) {
         Log() << kWARNING << "Sorry, the option of nCuts<0 using a more elaborate node splitting algorithm " << Endl;
         Log() << kWARNING << "is not implemented for regression analysis ! " << Endl;
         Log() << kWARNING << "--> I switch do default nCuts = 20 and use standard node splitting"<<Endl;
         fNCuts=20;
      }
   }
   if (fRandomisedTrees){
      Log() << kINFO << " Randomised trees use no pruning" << Endl;
      fPruneMethod = DecisionTree::kNoPruning;
      //      fBoostType   = "Bagging";
   }

   if (fUseFisherCuts) {
      Log() << kWARNING << "Sorry, when using the option UseFisherCuts, the other option nCuts<0 (i.e. using" << Endl;
      Log() << kWARNING << " a more elaborate node splitting algorithm) is not implemented. I will switch o " << Endl;
      Log() << kWARNING << "--> I switch do default nCuts = 20 and use standard node splitting WITH possible Fisher criteria"<<Endl;
      fNCuts=20;
   }
   
   if (fNTrees==0){
      Log() << kERROR << " Zero Decision Trees demanded... that does not work !! "
            << " I set it to 1 .. just so that the program does not crash"
            << Endl;
      fNTrees = 1;
   }

   fNegWeightTreatment.ToLower();
   if      (fNegWeightTreatment == "ignorenegweightsintraining")   fNoNegWeightsInTraining = kTRUE;
   else if (fNegWeightTreatment == "nonegweightsintraining")   fNoNegWeightsInTraining = kTRUE;
   else if (fNegWeightTreatment == "inverseboostnegweights") fInverseBoostNegWeights = kTRUE;
   else if (fNegWeightTreatment == "pairnegweightsglobal")   fPairNegWeightsGlobal   = kTRUE;
   else if (fNegWeightTreatment == "pray")   Log() << kWARNING << "Yes, good luck with praying " << Endl;
   else {
      Log() << kINFO << GetOptions() << Endl;
      Log() << kFATAL << "<ProcessOptions> unknown option for treating negative event weights during training " << fNegWeightTreatment << " requested" << Endl;
   }
   
   if (fNegWeightTreatment == "pairnegweightsglobal") 
      Log() << kWARNING << " you specified the option NegWeightTreatment=PairNegWeightsGlobal : This option is still considered EXPERIMENTAL !! " << Endl;


   // dealing with deprecated options !
   if (fNNodesMax>0) {
      UInt_t tmp=1; // depth=0  == 1 node
      fMaxDepth=0;
      while (tmp < fNNodesMax){
         tmp+=2*tmp;
         fMaxDepth++;
      }
      Log() << kWARNING << "You have specified a deprecated option *NNodesMax="<<fNNodesMax
            << "* \n this has been translated to MaxDepth="<<fMaxDepth<<Endl;
   }


   if (fUseNTrainEvents>0){
      fBaggedSampleFraction  = (Double_t) fUseNTrainEvents/Data()->GetNTrainingEvents();
      Log() << kWARNING << "You have specified a deprecated option *UseNTrainEvents="<<fUseNTrainEvents
            << "* \n this has been translated to BaggedSampleFraction="<<fBaggedSampleFraction<<"(%)"<<Endl;
   }      

   if (fBoostType=="Bagging") fBaggedBoost = kTRUE;
   if (fBaggedGradBoost){
      fBaggedBoost = kTRUE;
      Log() << kWARNING << "You have specified a deprecated option *UseBaggedGrad* --> please use  *UseBaggedBoost* instead" << Endl;
   }

}


//_______________________________________________________________________

void TMVA::MethodBDT::SetMinNodeSize(Double_t sizeInPercent){
   if (sizeInPercent > 0 && sizeInPercent < 50){
      fMinNodeSize=sizeInPercent;
      
   } else {
      Log() << kFATAL << "you have demanded a minimal node size of " 
            << sizeInPercent << "% of the training events.. \n"
            << " that somehow does not make sense "<<Endl;
   }

}
//_______________________________________________________________________
void TMVA::MethodBDT::SetMinNodeSize(TString sizeInPercent){
   sizeInPercent.ReplaceAll("%","");
   sizeInPercent.ReplaceAll(" ","");
   if (sizeInPercent.IsFloat()) SetMinNodeSize(sizeInPercent.Atof());
   else {
      Log() << kFATAL << "I had problems reading the option MinNodeEvents, which "
            << "after removing a possible % sign now reads " << sizeInPercent << Endl;
   }
}



//_______________________________________________________________________
void TMVA::MethodBDT::Init( void )
{
   // common initialisation with defaults for the BDT-Method
      
   fNTrees         = 800;
   if (fAnalysisType == Types::kClassification || fAnalysisType == Types::kMulticlass ) {
      fMaxDepth        = 3;
      fBoostType      = "AdaBoost";
      if(DataInfo().GetNClasses()!=0) //workaround for multiclass application
         fMinNodeSize = 5.;
   }else {
      fMaxDepth = 50;
      fBoostType      = "AdaBoostR2";
      fAdaBoostR2Loss = "Quadratic";
      if(DataInfo().GetNClasses()!=0) //workaround for multiclass application
         fMinNodeSize  = .2;
   }
   

   fNCuts          = 20;
   fPruneMethodS   = "NoPruning";
   fPruneMethod    = DecisionTree::kNoPruning;
   fPruneStrength  = 0;
   fAutomatic      = kFALSE;
   fFValidationEvents = 0.5;
   fRandomisedTrees = kFALSE;
   //   fUseNvars        =  (GetNvar()>12) ? UInt_t(GetNvar()/8) : TMath::Max(UInt_t(2),UInt_t(GetNvar()/3));
   fUseNvars        =  UInt_t(TMath::Sqrt(GetNvar())+0.6);
   fUsePoissonNvars = kTRUE;
   fShrinkage       = 1.0;
   fSumOfWeights    = 0.0;

   // reference cut value to distinguish signal-like from background-like events
   SetSignalReferenceCut( 0 );
}


//_______________________________________________________________________
void TMVA::MethodBDT::Reset( void )
{
   // reset the method, as if it had just been instantiated (forget all training etc.)
   
   // I keep the BDT EventSample and its Validation sample (eventually they should all
   // disappear and we should just use the DataSet samples ..)
   
   // remove all the trees 
   for (UInt_t i=0; i<fForest.size();           i++) delete fForest[i];
   fForest.clear();

   fBoostWeights.clear();
   if (fMonitorNtuple) fMonitorNtuple->Delete(); fMonitorNtuple=NULL;
   fVariableImportance.clear();
   fResiduals.clear();
   // now done in "InitEventSample" which is called in "Train"
   // reset all previously stored/accumulated BOOST weights in the event sample
   //for (UInt_t iev=0; iev<fEventSample.size(); iev++) fEventSample[iev]->SetBoostWeight(1.);
   if (Data()) Data()->DeleteResults(GetMethodName(), Types::kTraining, GetAnalysisType());
   Log() << kDEBUG << " successfully(?) reset the method " << Endl;                                      
}


//_______________________________________________________________________
TMVA::MethodBDT::~MethodBDT( void )
{
   //destructor
   //   Note: fEventSample and ValidationSample are already deleted at the end of TRAIN,
   //         when they are not used anymore
   //   for (UInt_t i=0; i<fEventSample.size();      i++) delete fEventSample[i];
   //   for (UInt_t i=0; i<fValidationSample.size(); i++) delete fValidationSample[i];
   for (UInt_t i=0; i<fForest.size();           i++) delete fForest[i];
}

//_______________________________________________________________________
void TMVA::MethodBDT::InitEventSample( void )
{
   // initialize the event sample (i.e. reset the boost-weights... etc)

   if (!HasTrainingTree()) Log() << kFATAL << "<Init> Data().TrainingTree() is zero pointer" << Endl;

   if (fEventSample.size() > 0) { // do not re-initialise the event sample, just set all boostweights to 1. as if it were untouched
      // reset all previously stored/accumulated BOOST weights in the event sample
      for (UInt_t iev=0; iev<fEventSample.size(); iev++) fEventSample[iev]->SetBoostWeight(1.);
   } else {
      Data()->SetCurrentType(Types::kTraining);
      UInt_t nevents = Data()->GetNTrainingEvents();

      std::vector<const TMVA::Event*> tmpEventSample;
      for (Long64_t ievt=0; ievt<nevents; ievt++) {
         //  const Event *event = new Event(*(GetEvent(ievt)));
         Event* event = new Event( *GetTrainingEvent(ievt) );
         tmpEventSample.push_back(event);
      }

      if (!DoRegression()) DeterminePreselectionCuts(tmpEventSample);
      else fDoPreselection = kFALSE; // just to make sure...

      for (UInt_t i=0; i<tmpEventSample.size(); i++) delete tmpEventSample[i];
     

      Bool_t firstNegWeight=kTRUE;
      Bool_t firstZeroWeight=kTRUE;
      for (Long64_t ievt=0; ievt<nevents; ievt++) {
         //         const Event *event = new Event(*(GetEvent(ievt)));
         // const Event* event = new Event( *GetTrainingEvent(ievt) );
         Event* event = new Event( *GetTrainingEvent(ievt) );
         if (fDoPreselection){
            if (TMath::Abs(ApplyPreselectionCuts(event)) > 0.05) continue;
         }

         if (event->GetWeight() < 0 && (IgnoreEventsWithNegWeightsInTraining() || fNoNegWeightsInTraining)){
            if (firstNegWeight) {
               Log() << kWARNING << " Note, you have events with negative event weight in the sample, but you've chosen to ignore them" << Endl;
               firstNegWeight=kFALSE;
            }
            delete event;
         }else if (event->GetWeight()==0){
            if (firstZeroWeight) {
               firstZeroWeight = kFALSE;
               Log() << "Events with weight == 0 are going to be simply ignored " << Endl;
            }
            delete event;
         }else{
            if (event->GetWeight() < 0) {
               fTrainWithNegWeights=kTRUE;
               if (firstNegWeight){
                  firstNegWeight = kFALSE;
                  if (fPairNegWeightsGlobal){
                     Log() << kWARNING << "Events with negative event weights are found and "
                           << " will be removed prior to the actual BDT training by global "
                           << " paring (and subsequent annihilation) with positiv weight events"
                           << Endl;
                  }else{
                     Log() << kWARNING << "Events with negative event weights are USED during "
                           << "the BDT training. This might cause problems with small node sizes " 
                           << "or with the boosting. Please remove negative events from training "
                           << "using the option *IgnoreEventsWithNegWeightsInTraining* in case you "
                           << "observe problems with the boosting"
                           << Endl;
                  }
               }
            }
            // if fAutomatic == true you need a validation sample to optimize pruning
            if (fAutomatic) {
               Double_t modulo = 1.0/(fFValidationEvents);
               Int_t   imodulo = static_cast<Int_t>( fmod(modulo,1.0) > 0.5 ? ceil(modulo) : floor(modulo) );
               if (ievt % imodulo == 0) fValidationSample.push_back( event );
               else                     fEventSample.push_back( event );
            }
            else {
               fEventSample.push_back(event);
            }
         }
      }

      if (fAutomatic) {
         Log() << kINFO << "<InitEventSample> Internally I use " << fEventSample.size()
               << " for Training  and " << fValidationSample.size()
               << " for Pruning Validation (" << ((Float_t)fValidationSample.size())/((Float_t)fEventSample.size()+fValidationSample.size())*100.0
               << "% of training used for validation)" << Endl;
      }
      
      // some pre-processing for events with negative weights
      if (fPairNegWeightsGlobal) PreProcessNegativeEventWeights();
   }

   if (!DoRegression()){
      Log() << kINFO << "<InitEventSample> For classification trees, "<< Endl;
      Log() << kINFO << " the effective number of backgrounds is scaled to match "<<Endl;
      Log() << kINFO << " the signal. Othersise the first boosting step would do 'just that'!"<<Endl;
      // it does not make sense in decision trees to start with unequal number of signal/background
      // events (weights) .. hence normalize them now (happens atherwise in first 'boosting step'
      // anyway..  
      // Also make sure, that the sum_of_weights == sample.size() .. as this is assumed in
      // the DecisionTree to derive a sensible number for "fMinSize" (min.#events in node)
      // that currently is an OR between "weighted" and "unweighted number"
      // I want:
      //    nS + nB = n
      //   a*SW + b*BW = n
      //   (a*SW)/(b*BW) = fSigToBkgFraction
      //
      // ==> b = n/((1+f)BW)  and a = (nf/(1+f))/SW
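      //
      // Illustrative numerical check (example numbers only): with n = 1000 events,
      // SW = 300, BW = 700 and f = fSigToBkgFraction = 1, the formulas give
      // a = (1000*1/2)/300 = 1.67 and b = 1000/(2*700) = 0.71, so that
      // a*SW = b*BW = 500 and a*SW + b*BW = 1000 = n, as required.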

      Double_t nevents = fEventSample.size();
      Double_t sumSigW=0, sumBkgW=0;
      Int_t    sumSig=0, sumBkg=0;
      for (UInt_t ievt=0; ievt<fEventSample.size(); ievt++) {
         if ((DataInfo().IsSignal(fEventSample[ievt])) ) {
            sumSigW += fEventSample[ievt]->GetWeight();
            sumSig++;
         } else {
            sumBkgW += fEventSample[ievt]->GetWeight(); 
            sumBkg++;
         }
      }
      if (sumSigW && sumBkgW){
         Double_t normSig = nevents/((1+fSigToBkgFraction)*sumSigW)*fSigToBkgFraction;
         Double_t normBkg = nevents/((1+fSigToBkgFraction)*sumBkgW);
         Log() << kINFO << "re-normalise events such that the ratio of the Sig and Bkg sums of weights = "
               << fSigToBkgFraction << Endl;
         Log() << kINFO << "  sig->sig*"<<normSig << "ev. bkg->bkg*"<<normBkg << "ev." <<Endl;
         Log() << kINFO << "#events: (reweighted) sig: "<< sumSigW*normSig << " bkg: " << sumBkgW*normBkg << Endl;
         Log() << kINFO << "#events: (unweighted) sig: "<< sumSig << " bkg: " << sumBkg << Endl;
         Log() << kINFO << "#events: (unweighted) sig: "<< sumSig << " bkg: " << sumBkg << Endl;
         for (Long64_t ievt=0; ievt<nevents; ievt++) {
            if ((DataInfo().IsSignal(fEventSample[ievt])) ) fEventSample[ievt]->SetBoostWeight(normSig);
            else                                            fEventSample[ievt]->SetBoostWeight(normBkg);
         }
      }else{
         Log() << kINFO << "--> could not determine scaleing factors as either there are " << Endl;
         Log() << kINFO << " no signal events (sumSigW="<<sumSigW<<") or no bkg ev. (sumBkgW="<<sumBkgW<<")"<<Endl;
      }
      
   }
   
   fTrainSample = &fEventSample;
   if (fBaggedBoost){
      GetBaggedSubSample(fEventSample);
      fTrainSample = &fSubSample;
   }

   //just for debug purposes..
   /*
     sumSigW=0;
     sumBkgW=0;
     for (UInt_t ievt=0; ievt<fEventSample.size(); ievt++) {
     if ((DataInfo().IsSignal(fEventSample[ievt])) ) sumSigW += fEventSample[ievt]->GetWeight();
     else sumBkgW += fEventSample[ievt]->GetWeight();
     }
     Log() << kWARNING << "sigSumW="<<sumSigW<<"bkgSumW="<<sumBkgW<< Endl;
   */
}

//_______________________________________________________________________
void TMVA::MethodBDT::PreProcessNegativeEventWeights(){
   // o.k., you know there are events with negative event weights. This routine will remove
   // them by pairing them with the closest event(s) of the same event class with positive
   // weights.
   // A first attempt is "brute force": I don't try to be clever using search trees etc.,
   // just quick and dirty to see if the result is any good.
   Double_t totalNegWeights = 0;
   Double_t totalPosWeights = 0;
   Double_t totalWeights    = 0;
   std::vector<const Event*> negEvents;
   for (UInt_t iev = 0; iev < fEventSample.size(); iev++){
      if (fEventSample[iev]->GetWeight() < 0) {
         totalNegWeights += fEventSample[iev]->GetWeight();
         negEvents.push_back(fEventSample[iev]);
      } else {
         totalPosWeights += fEventSample[iev]->GetWeight();
      }
      totalWeights += fEventSample[iev]->GetWeight();
   }
   if (totalNegWeights == 0 ) {
      Log() << kINFO << "no negative event weights found .. no preprocessing necessary" << Endl;
      return;
   } else {
      Log() << kINFO << "found a total of " << totalNegWeights << " of negative event weights which I am going to try to pair with positive events to annihilate them" << Endl;
      Log() << kINFO << "found a total of " << totalPosWeights << " of events with positive weights" << Endl;
      Log() << kINFO << "--> total sum of weights = " << totalWeights << " = " << totalNegWeights+totalPosWeights << Endl;
   }
   
   std::vector<TMatrixDSym*>* cov = gTools().CalcCovarianceMatrices( fEventSample, 2);
   
   TMatrixDSym *invCov;

   for (Int_t i=0; i<2; i++){
      invCov = ((*cov)[i]);
      if ( TMath::Abs(invCov->Determinant()) < 10E-24 ) {
         std::cout << "<MethodBDT::PreProcessNeg...> matrix is almost singular with determinant="
                   << TMath::Abs(invCov->Determinant())
                   << " did you use variables that are linear combinations or highly correlated?"
                   << std::endl;
      }
      if ( TMath::Abs(invCov->Determinant()) < 10E-120 ) {
         std::cout << "<MethodBDT::PreProcessNeg...> matrix is singular with determinant="
                   << TMath::Abs(invCov->Determinant())
                   << " did you use variables that are linear combinations?"
                   << std::endl;
      }
      
      invCov->Invert();
   }
   


   Log() << kINFO << "Found a total of " << totalNegWeights << " in negative weights out of " << fEventSample.size() << " training events "  << Endl;
   Timer timer(negEvents.size(),"Negative Event paired");
   for (UInt_t nev = 0; nev < negEvents.size(); nev++){
      timer.DrawProgressBar( nev );
      Double_t weight = negEvents[nev]->GetWeight();
      UInt_t  iClassID = negEvents[nev]->GetClass();
      invCov = ((*cov)[iClassID]);
      while (weight < 0){
         // find closest event with positive event weight and "pair" it with the negative event
         // (add their weight) until there is no negative weight anymore
         Int_t iMin=-1;
         Double_t dist, minDist=10E270;
         for (UInt_t iev = 0; iev < fEventSample.size(); iev++){
            if (iClassID==fEventSample[iev]->GetClass() && fEventSample[iev]->GetWeight() > 0){
               dist=0;
               for (UInt_t ivar=0; ivar < GetNvar(); ivar++){
                  for (UInt_t jvar=0; jvar<GetNvar(); jvar++){
                     dist += (negEvents[nev]->GetValue(ivar)-fEventSample[iev]->GetValue(ivar))*
                        (*invCov)[ivar][jvar]*
                        (negEvents[nev]->GetValue(jvar)-fEventSample[iev]->GetValue(jvar));
                  }
               }
               if (dist < minDist) { iMin=iev; minDist=dist;}
            }
         }
         
         if (iMin > -1) { 
            //     std::cout << "Happily pairing .. weight before : " << negEvents[nev]->GetWeight() << " and " << fEventSample[iMin]->GetWeight();
            Double_t newWeight = (negEvents[nev]->GetWeight() + fEventSample[iMin]->GetWeight());
            if (newWeight > 0){
               negEvents[nev]->SetBoostWeight( 0 );
               fEventSample[iMin]->SetBoostWeight( newWeight/fEventSample[iMin]->GetOriginalWeight() );  // note the weight*boostweight should be "newWeight"
            } else {
               negEvents[nev]->SetBoostWeight( newWeight/negEvents[nev]->GetOriginalWeight() ); // note the weight*boostweight should be "newWeight"
               fEventSample[iMin]->SetBoostWeight( 0 );
            }        
            //     std::cout << " and afterwards " <<  negEvents[nev]->GetWeight() <<  " and the paired " << fEventSample[iMin]->GetWeight() << " dist="<<minDist<< std::endl;
         } else Log() << kFATAL << "preprocessing didn't find an event to pair with the negative weight ... probably a bug" << Endl;
         weight = negEvents[nev]->GetWeight();
      }
   }
   Log() << kINFO << "<Negative Event Pairing> took: " << timer.GetElapsedTime()
         << "                              " << Endl;

   // just check.. now there should be no negative event weight left anymore
   totalNegWeights = 0;
   totalPosWeights = 0;
   totalWeights    = 0;
   Double_t sigWeight=0;
   Double_t bkgWeight=0;
   Int_t    nSig=0;
   Int_t    nBkg=0;

   std::vector<const Event*> newEventSample;

   for (UInt_t iev = 0; iev < fEventSample.size(); iev++){
      if (fEventSample[iev]->GetWeight() < 0) {
         totalNegWeights += fEventSample[iev]->GetWeight();
         totalWeights    += fEventSample[iev]->GetWeight();
      } else {
         totalPosWeights += fEventSample[iev]->GetWeight();
         totalWeights    += fEventSample[iev]->GetWeight();
      }
      if (fEventSample[iev]->GetWeight() > 0) {
         newEventSample.push_back(new Event(*fEventSample[iev]));
         if (fEventSample[iev]->GetClass() == fSignalClass){
            sigWeight += fEventSample[iev]->GetWeight();
            nSig+=1;
         }else{
            bkgWeight += fEventSample[iev]->GetWeight();
            nBkg+=1;
         }
      }
   }
   if (totalNegWeights < 0) Log() << kFATAL << " compensation of negative event weights with positive ones did not work " << totalNegWeights << Endl;

   for (UInt_t i=0; i<fEventSample.size();      i++) delete fEventSample[i];
   fEventSample = newEventSample;

   Log() << kINFO  << " after PreProcessing, the Event sample is left with " << fEventSample.size() << " events (unweighted), all with positive weights, adding up to " << totalWeights << Endl;
   Log() << kINFO  << " nSig="<<nSig << " sigWeight="<<sigWeight <<  " nBkg="<<nBkg << " bkgWeight="<<bkgWeight << Endl;
   

}

//

//_______________________________________________________________________
std::map<TString,Double_t>  TMVA::MethodBDT::OptimizeTuningParameters(TString fomType, TString fitType)
{
   // call the Optimizer with the set of parameters and ranges that
   // are meant to be tuned.
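   //
   // Usage sketch (user code, not part of this file; the argument values are
   // illustrative): the tuning is usually triggered from the TMVA Factory, e.g.
   //    factory.OptimizeAllMethods("ROCIntegral", "FitGA");
   // which in turn calls this method with the chosen figure of merit and fitter type.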

   // fill all the tuning parameters that should be optimized into a map:
   std::map<TString,TMVA::Interval*> tuneParameters;
   std::map<TString,Double_t> tunedParameters;

   // note: the 3rd parameter in the interval is the "number of bins", NOT the stepsize !!
   //       the actual VALUES (at least for the scan, presumably also in the GA) are always
   //       read from the middle of the bins. Hence choose the Intervals, e.g. for
   //       MaxDepth, such that they give nice integer values!!!

   // find some reasonable ranges for the optimisation of MinNodeEvents:

   tuneParameters.insert(std::pair<TString,Interval*>("NTrees",         new Interval(10,1000,5))); //  stepsize 50
   tuneParameters.insert(std::pair<TString,Interval*>("MaxDepth",       new Interval(2,4,3)));    // stepsize 1
   tuneParameters.insert(std::pair<TString,Interval*>("MinNodeSize",    new LogInterval(1,30,30)));    // 
   //tuneParameters.insert(std::pair<TString,Interval*>("NodePurityLimit",new Interval(.4,.6,3)));   // stepsize .1
   //tuneParameters.insert(std::pair<TString,Interval*>("BaggedSampleFraction",new Interval(.4,.9,6)));   // stepsize .1

   // method-specific parameters
   if        (fBoostType=="AdaBoost"){
      tuneParameters.insert(std::pair<TString,Interval*>("AdaBoostBeta",   new Interval(.2,1.,5)));   
  
   }else if (fBoostType=="Grad"){
      tuneParameters.insert(std::pair<TString,Interval*>("Shrinkage",      new Interval(0.05,0.50,5)));  
  
   }else if (fBoostType=="Bagging" && fRandomisedTrees){
      Int_t min_var  = TMath::FloorNint( GetNvar() * .25 );
      Int_t max_var  = TMath::CeilNint(  GetNvar() * .75 ); 
      tuneParameters.insert(std::pair<TString,Interval*>("UseNvars",       new Interval(min_var,max_var,4)));
     
   }
   
   Log()<<kINFO << " the following BDT parameters will be tuned on the respective *grid*\n"<<Endl;
   std::map<TString,TMVA::Interval*>::iterator it;
   for(it=tuneParameters.begin(); it!= tuneParameters.end(); it++){
      Log() << kWARNING << it->first << Endl;
      (it->second)->Print(Log());  
      Log()<<Endl;
   }
   
   OptimizeConfigParameters optimize(this, tuneParameters, fomType, fitType);
   tunedParameters=optimize.optimize();

   return tunedParameters;

}

//_______________________________________________________________________
void TMVA::MethodBDT::SetTuneParameters(std::map<TString,Double_t> tuneParameters)
{
   // set the tuning parameters according to the argument

   std::map<TString,Double_t>::iterator it;
   for(it=tuneParameters.begin(); it!= tuneParameters.end(); it++){
      Log() << kWARNING << it->first << " = " << it->second << Endl;  
      if (it->first ==  "MaxDepth"       ) SetMaxDepth        ((Int_t)it->second);
      else if (it->first ==  "MinNodeSize"    ) SetMinNodeSize     (it->second);
      else if (it->first ==  "NTrees"         ) SetNTrees          ((Int_t)it->second);
      else if (it->first ==  "NodePurityLimit") SetNodePurityLimit (it->second);
      else if (it->first ==  "AdaBoostBeta"   ) SetAdaBoostBeta    (it->second);
      else if (it->first ==  "Shrinkage"      ) SetShrinkage       (it->second);
      else if (it->first ==  "UseNvars"       ) SetUseNvars        ((Int_t)it->second);
      else if (it->first ==  "BaggedSampleFraction" ) SetBaggedSampleFraction (it->second);
      else Log() << kFATAL << " SetParameter for " << it->first << " not yet implemented " <<Endl;
   }
   

}

//_______________________________________________________________________
void TMVA::MethodBDT::Train()
{
   // BDT training
   TMVA::DecisionTreeNode::fgIsTraining=true;

   // fill the STL Vector with the event sample
   // (needs to be done here and cannot be done in "init" as the options need to be 
   // known). 
   InitEventSample();

   if (fNTrees==0){
      Log() << kERROR << " Zero Decision Trees demanded... that does not work !! "
            << " I set it to 1 .. just so that the program does not crash"
            << Endl;
      fNTrees = 1;
   }

   // HHV (it's been here for long, but I really don't know why we cannot handle
   // normalized variables in BDTs...  todo)
   if (IsNormalised()) Log() << kFATAL << "\"Normalise\" option cannot be used with BDT; "
                             << "please remove the option from the configuration string, or "
                             << "use \"!Normalise\""
                             << Endl;

   Log() << kINFO << "Training "<< fNTrees << " Decision Trees ... patience please" << Endl;

   Log() << kDEBUG << "Training with maximal depth = " <<fMaxDepth 
         << ", MinNodeEvents=" << fMinNodeEvents
         << ", NTrees="<<fNTrees
         << ", NodePurityLimit="<<fNodePurityLimit
         << ", AdaBoostBeta="<<fAdaBoostBeta
         << Endl;

   // weights applied in boosting
   Int_t nBins;
   Double_t xMin,xMax;
   TString hname = "AdaBooost weight distribution";

   nBins= 100;
   xMin = 0;
   xMax = 30;

   if (DoRegression()) {
      nBins= 100;
      xMin = 0;
      xMax = 1;
      hname="Boost event weights distribution";
   }

   // book monitoring histograms (for AdaBoost only)   

   TH1* h = new TH1F("BoostWeight",hname,nBins,xMin,xMax);
   TH1* nodesBeforePruningVsTree = new TH1I("NodesBeforePruning","nodes before pruning",fNTrees,0,fNTrees);
   TH1* nodesAfterPruningVsTree = new TH1I("NodesAfterPruning","nodes after pruning",fNTrees,0,fNTrees);

      

   if(!DoMulticlass()){
      Results* results = Data()->GetResults(GetMethodName(), Types::kTraining, GetAnalysisType());

      h->SetXTitle("boost weight");
      results->Store(h, "BoostWeights");
  

      // Monitor the performance (on TEST sample) versus number of trees
      if (fDoBoostMonitor){
         TH2* boostMonitor = new TH2F("BoostMonitor","ROC Integral Vs iTree",2,0,fNTrees,2,0,1.05);
         boostMonitor->SetXTitle("#tree");
         boostMonitor->SetYTitle("ROC Integral");
         results->Store(boostMonitor, "BoostMonitor");
         TGraph *boostMonitorGraph = new TGraph();
         boostMonitorGraph->SetName("BoostMonitorGraph");
         boostMonitorGraph->SetTitle("ROCIntegralVsNTrees");
         results->Store(boostMonitorGraph, "BoostMonitorGraph");
      }

      // weights applied in boosting vs tree number
      h = new TH1F("BoostWeightVsTree","Boost weights vs tree",fNTrees,0,fNTrees);
      h->SetXTitle("#tree");
      h->SetYTitle("boost weight");
      results->Store(h, "BoostWeightsVsTree");
      
      // error fraction vs tree number
      h = new TH1F("ErrFractHist","error fraction vs tree number",fNTrees,0,fNTrees);
      h->SetXTitle("#tree");
      h->SetYTitle("error fraction");
      results->Store(h, "ErrorFrac");
      
      // nNodesBeforePruning vs tree number
      nodesBeforePruningVsTree->SetXTitle("#tree");
      nodesBeforePruningVsTree->SetYTitle("#tree nodes");
      results->Store(nodesBeforePruningVsTree);
      
      // nNodesAfterPruning vs tree number
      nodesAfterPruningVsTree->SetXTitle("#tree");
      nodesAfterPruningVsTree->SetYTitle("#tree nodes");
      results->Store(nodesAfterPruningVsTree);

   }
   
   fMonitorNtuple= new TTree("MonitorNtuple","BDT variables");
   fMonitorNtuple->Branch("iTree",&fITree,"iTree/I");
   fMonitorNtuple->Branch("boostWeight",&fBoostWeight,"boostWeight/D");
   fMonitorNtuple->Branch("errorFraction",&fErrorFraction,"errorFraction/D");

   Timer timer( fNTrees, GetName() );
   Int_t nNodesBeforePruningCount = 0;
   Int_t nNodesAfterPruningCount = 0;

   Int_t nNodesBeforePruning = 0;
   Int_t nNodesAfterPruning = 0;


   if(fBoostType=="Grad"){
      InitGradBoost(fEventSample);
   }

   Int_t itree=0;
   Bool_t continueBoost=kTRUE;
   //for (int itree=0; itree<fNTrees; itree++) {
   while (itree < fNTrees && continueBoost){
      timer.DrawProgressBar( itree );
      // Results* results = Data()->GetResults(GetMethodName(), Types::kTraining, GetAnalysisType());
      // TH1 *hxx = new TH1F(Form("swdist%d",itree),Form("swdist%d",itree),10000,0,15);
      // results->Store(hxx,Form("swdist%d",itree));
      // TH1 *hxy = new TH1F(Form("bwdist%d",itree),Form("bwdist%d",itree),10000,0,15);
      // results->Store(hxy,Form("bwdist%d",itree));
      // for (Int_t iev=0; iev<fEventSample.size(); iev++) {
      //    if (fEventSample[iev]->GetClass()!=0) hxy->Fill((fEventSample[iev])->GetWeight());
      //    else          hxx->Fill((fEventSample[iev])->GetWeight());
      // }

      if(DoMulticlass()){
         if (fBoostType!="Grad"){
            Log() << kFATAL << "Multiclass is currently only supported by gradient boost. "
                  << "Please change boost option accordingly (GradBoost)."
                  << Endl;
         }
         UInt_t nClasses = DataInfo().GetNClasses();
         for (UInt_t i=0;i<nClasses;i++){            
            fForest.push_back( new DecisionTree( fSepType, fMinNodeSize, fNCuts, &(DataInfo()), i,
                                                 fRandomisedTrees, fUseNvars, fUsePoissonNvars, fMaxDepth,
                                                 itree*nClasses+i, fNodePurityLimit, itree*nClasses+1));
            fForest.back()->SetNVars(GetNvar());
            if (fUseFisherCuts) {
               fForest.back()->SetUseFisherCuts();
               fForest.back()->SetMinLinCorrForFisher(fMinLinCorrForFisher); 
               fForest.back()->SetUseExclusiveVars(fUseExclusiveVars); 
            }
            // the minimum linear correlation between two variables demanded for use in fisher criterion in node splitting

            nNodesBeforePruning = fForest.back()->BuildTree(*fTrainSample);
            Double_t bw = this->Boost(*fTrainSample, fForest.back(),i);
            if (bw > 0) {
               fBoostWeights.push_back(bw);
            }else{
               fBoostWeights.push_back(0);
               Log() << kWARNING << "stopped boosting at itree="<<itree << Endl;
               //               fNTrees = itree+1; // that should stop the boosting
               continueBoost=kFALSE; 
            }
         }
      }
      else{
         fForest.push_back( new DecisionTree( fSepType, fMinNodeSize, fNCuts, &(DataInfo()), fSignalClass,
                                              fRandomisedTrees, fUseNvars, fUsePoissonNvars, fMaxDepth,
                                              itree, fNodePurityLimit, itree));
         fForest.back()->SetNVars(GetNvar());
         if (fUseFisherCuts) {
            fForest.back()->SetUseFisherCuts();
            fForest.back()->SetMinLinCorrForFisher(fMinLinCorrForFisher); 
            fForest.back()->SetUseExclusiveVars(fUseExclusiveVars); 
         }
         
         nNodesBeforePruning = fForest.back()->BuildTree(*fTrainSample);
         
         if (fUseYesNoLeaf && !DoRegression() && fBoostType!="Grad") { // remove leaf nodes where both daughter nodes are of same type
            nNodesBeforePruning = fForest.back()->CleanTree();
         }

         nNodesBeforePruningCount += nNodesBeforePruning;
         nodesBeforePruningVsTree->SetBinContent(itree+1,nNodesBeforePruning);
         
         fForest.back()->SetPruneMethod(fPruneMethod); // set the pruning method for the tree
         fForest.back()->SetPruneStrength(fPruneStrength); // set the strength parameter
         
         std::vector<const Event*> * validationSample = NULL;
         if(fAutomatic) validationSample = &fValidationSample;
         
         Double_t bw = this->Boost(*fTrainSample, fForest.back());
         if (bw > 0) {
            fBoostWeights.push_back(bw);
         }else{
            fBoostWeights.push_back(0);
            Log() << kWARNING << "stopped boosting at itree="<<itree << Endl;
            continueBoost=kFALSE;
         }
         

         
         // if fAutomatic == true, pruneStrength will be the optimal pruning strength
         // determined by the pruning algorithm; otherwise, it is simply the strength parameter
         // set by the user
         if  (fPruneMethod != DecisionTree::kNoPruning) fForest.back()->PruneTree(validationSample);
         
         if (fUseYesNoLeaf && !DoRegression() && fBoostType!="Grad"){ // remove leaf nodes where both daughter nodes are of same type
            fForest.back()->CleanTree();
         }
         nNodesAfterPruning = fForest.back()->GetNNodes();
         nNodesAfterPruningCount += nNodesAfterPruning;
         nodesAfterPruningVsTree->SetBinContent(itree+1,nNodesAfterPruning);
         
         fITree = itree;
         fMonitorNtuple->Fill();
         if (fDoBoostMonitor){
            if (! DoRegression() ){
               if (  itree==fNTrees-1 ||  (!(itree%500)) ||
                     (!(itree%250) && itree <1000)||
                     (!(itree%100) && itree < 500)||
                     (!(itree%50)  && itree < 250)||
                     (!(itree%25)  && itree < 150)||
                     (!(itree%10)  && itree <  50)||
                     (!(itree%5)   && itree <  20)
                     ) BoostMonitor(itree);
            }
         }
      }
      itree++;
   }

   // get elapsed time
   Log() << kINFO << "<Train> elapsed time: " << timer.GetElapsedTime()
         << "                              " << Endl;
   if (fPruneMethod == DecisionTree::kNoPruning) {
      Log() << kINFO << "<Train> average number of nodes (w/o pruning) : "
            << nNodesBeforePruningCount/GetNTrees() << Endl;
   }
   else {
      Log() << kINFO << "<Train> average number of nodes before/after pruning : "
            << nNodesBeforePruningCount/GetNTrees() << " / "
            << nNodesAfterPruningCount/GetNTrees()
            << Endl;
   }
   TMVA::DecisionTreeNode::fgIsTraining=false;


   // reset all previously stored/accumulated BOOST weights in the event sample
   //   for (UInt_t iev=0; iev<fEventSample.size(); iev++) fEventSample[iev]->SetBoostWeight(1.);
   Log() << kDEBUG << "Now I delete the private data sample"<< Endl;
   for (UInt_t i=0; i<fEventSample.size();      i++) delete fEventSample[i];
   for (UInt_t i=0; i<fValidationSample.size(); i++) delete fValidationSample[i];
   fEventSample.clear();
   fValidationSample.clear();

}


//_______________________________________________________________________
Double_t TMVA::MethodBDT::GetGradBoostMVA(const TMVA::Event* e, UInt_t nTrees)
{
   //returns MVA value: -1 for background, 1 for signal
   Double_t sum=0;
   for (UInt_t itree=0; itree<nTrees; itree++) {
      //loop over all trees in forest
      sum += fForest[itree]->CheckEvent(e,kFALSE);
 
   }
   return 2.0/(1.0+exp(-2.0*sum))-1; //MVA output between -1 and 1
}
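
// Example (sketch): a summed tree response of sum = 0.5 is mapped to
// 2/(1+exp(-1)) - 1 ~= 0.46, i.e. the raw gradient-boost score is squashed into
// the open interval (-1,1), with 0 corresponding to equal signal and background
// probability (p_sig = 0.5).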

//_______________________________________________________________________
void TMVA::MethodBDT::UpdateTargets(std::vector<const TMVA::Event*>& eventSample, UInt_t cls)
{
   // Calculate residuals for all events

   if(DoMulticlass()){
      UInt_t nClasses = DataInfo().GetNClasses();
      for (std::vector<const TMVA::Event*>::iterator e=eventSample.begin(); e!=eventSample.end();e++) {
         fResiduals[*e].at(cls)+=fForest.back()->CheckEvent(*e,kFALSE);
         if(cls == nClasses-1){
            for(UInt_t i=0;i<nClasses;i++){
               Double_t norm = 0.0;
               for(UInt_t j=0;j<nClasses;j++){
                  if(i!=j)
                     norm+=exp(fResiduals[*e].at(j)-fResiduals[*e].at(i));
               }
               Double_t p_cls = 1.0/(1.0+norm);
               Double_t res = ((*e)->GetClass()==i)?(1.0-p_cls):(-p_cls);
               const_cast<TMVA::Event*>(*e)->SetTarget(i,res);
            }
         }
      }
   }
   else{
      for (std::vector<const TMVA::Event*>::const_iterator e=eventSample.begin(); e!=eventSample.end();e++) {
         fResiduals[*e].at(0)+=fForest.back()->CheckEvent(*e,kFALSE);
         Double_t p_sig=1.0/(1.0+exp(-2.0*fResiduals[*e].at(0)));
         Double_t res = (DataInfo().IsSignal(*e)?1:0)-p_sig;
         const_cast<TMVA::Event*>(*e)->SetTarget(0,res);
      }
   }   
}
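
// Note (two-class case): fResiduals accumulates the forest response F(x); the
// probability estimate is p_sig = 1/(1+exp(-2F)) and the stored target
// y - p_sig (y = 1 for signal, 0 for background) is, up to a constant factor,
// the negative gradient of the binomial log-likelihood loss, which the next
// regression tree is then fitted to.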

//_______________________________________________________________________
void TMVA::MethodBDT::UpdateTargetsRegression(std::vector<const TMVA::Event*>& eventSample, Bool_t first)
{
   //Calculate current residuals for all events and update targets for next iteration
   for (std::vector<const TMVA::Event*>::const_iterator e=fEventSample.begin(); e!=fEventSample.end();e++) {
      if(!first){
         fWeightedResiduals[*e].first -= fForest.back()->CheckEvent(*e,kFALSE);
      }
      
   }
   
   fSumOfWeights = 0;
   vector< std::pair<Double_t, Double_t> > temp;
   for (std::vector<const TMVA::Event*>::const_iterator e=eventSample.begin(); e!=eventSample.end();e++){
      temp.push_back(make_pair(fabs(fWeightedResiduals[*e].first),fWeightedResiduals[*e].second));
      fSumOfWeights += (*e)->GetWeight();
   }
   fTransitionPoint = GetWeightedQuantile(temp,0.7,fSumOfWeights);

   Int_t i=0;
   for (std::vector<const TMVA::Event*>::const_iterator e=eventSample.begin(); e!=eventSample.end();e++) {
 
      if(temp[i].first<=fTransitionPoint)
         const_cast<TMVA::Event*>(*e)->SetTarget(0,fWeightedResiduals[*e].first);
      else
         const_cast<TMVA::Event*>(*e)->SetTarget(0,fTransitionPoint*(fWeightedResiduals[*e].first<0?-1.0:1.0));
      i++;
   }
}

//_______________________________________________________________________
Double_t TMVA::MethodBDT::GetWeightedQuantile(vector<  std::pair<Double_t, Double_t> > vec, const Double_t quantile, const Double_t norm){
   //calculates the quantile of the distribution of the first pair entries weighted with the values in the second pair entries
   Double_t temp = 0.0;
   std::sort(vec.begin(), vec.end());
   UInt_t i = 0;
   while(i<vec.size() && temp <= norm*quantile){
      temp += vec[i].second;
      i++;
   }
   if (i >= vec.size()) return 0.; // prevent uncontrolled memory access in return value calculation 
   return vec[i].first;
}

//_______________________________________________________________________
Double_t TMVA::MethodBDT::GradBoost(std::vector<const TMVA::Event*>& eventSample, DecisionTree *dt, UInt_t cls)
{
   //Calculate the desired response value for each region
   std::map<TMVA::DecisionTreeNode*,std::vector<Double_t> > leaves;
   for (std::vector<const TMVA::Event*>::const_iterator e=eventSample.begin(); e!=eventSample.end();e++) {
      Double_t weight = (*e)->GetWeight();
      TMVA::DecisionTreeNode* node = dt->GetEventNode(*(*e));
      if ((leaves[node]).empty()){
         (leaves[node]).push_back((*e)->GetTarget(cls)* weight);
         (leaves[node]).push_back(fabs((*e)->GetTarget(cls))*(1.0-fabs((*e)->GetTarget(cls))) * weight* weight);
      }
      else {
         (leaves[node])[0]+=((*e)->GetTarget(cls)* weight);
         (leaves[node])[1]+=fabs((*e)->GetTarget(cls))*(1.0-fabs((*e)->GetTarget(cls))) * weight* weight;
      }
   }
   for (std::map<TMVA::DecisionTreeNode*,std::vector<Double_t> >::iterator iLeave=leaves.begin();
        iLeave!=leaves.end();++iLeave){
      if ((iLeave->second)[1]<1e-30) (iLeave->second)[1]=1e-30;

      (iLeave->first)->SetResponse(fShrinkage/DataInfo().GetNClasses()*(iLeave->second)[0]/((iLeave->second)[1]));
   }
   
   //call UpdateTargets before next tree is grown

   DoMulticlass() ? UpdateTargets(fEventSample, cls) : UpdateTargets(fEventSample);
   return 1; //trees all have the same weight
}
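
// Leaf-response example (sketch): for a leaf containing two unit-weight events,
// each with target 0.4, the accumulated sums are [0] = 0.8 and
// [1] = 2 * 0.4*(1-0.4) = 0.48, so the leaf response is set to
// fShrinkage/nClasses * 0.8/0.48 ~= fShrinkage/nClasses * 1.67.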

//_______________________________________________________________________
Double_t TMVA::MethodBDT::GradBoostRegression(std::vector<const TMVA::Event*>& eventSample, DecisionTree *dt )
{
   // Implementation of M_TreeBoost using a Huber loss function as described by Friedman 1999
   std::map<TMVA::DecisionTreeNode*,Double_t > leaveWeights;
   std::map<TMVA::DecisionTreeNode*,vector< std::pair<Double_t, Double_t> > > leaves;
   UInt_t i =0;
   for (std::vector<const TMVA::Event*>::const_iterator e=eventSample.begin(); e!=eventSample.end();e++) {
      TMVA::DecisionTreeNode* node = dt->GetEventNode(*(*e));      
      (leaves[node]).push_back(make_pair(fWeightedResiduals[*e].first,(*e)->GetWeight()));
      (leaveWeights[node]) += (*e)->GetWeight();
      i++;
   }

   for (std::map<TMVA::DecisionTreeNode*,vector< std::pair<Double_t, Double_t> > >::iterator iLeave=leaves.begin();
        iLeave!=leaves.end();++iLeave){
      Double_t shift=0,diff= 0;
      Double_t ResidualMedian = GetWeightedQuantile(iLeave->second,0.5,leaveWeights[iLeave->first]);
      for(UInt_t j=0;j<((iLeave->second).size());j++){
         diff = (iLeave->second)[j].first-ResidualMedian;
         shift+=1.0/((iLeave->second).size())*((diff<0)?-1.0:1.0)*TMath::Min(fTransitionPoint,fabs(diff));
      }
      (iLeave->first)->SetResponse(fShrinkage*(ResidualMedian+shift));          
   }
   
   UpdateTargetsRegression(*fTrainSample);
   return 1;
}

//_______________________________________________________________________
void TMVA::MethodBDT::InitGradBoost( std::vector<const TMVA::Event*>& eventSample)
{
   // initialize targets for first tree
   fSumOfWeights = 0;
   fSepType=NULL; //set fSepType to NULL (regression trees are used for both classification and regression)
   std::vector<std::pair<Double_t, Double_t> > temp;
   if(DoRegression()){
      for (std::vector<const TMVA::Event*>::const_iterator e=eventSample.begin(); e!=eventSample.end();e++) {
         fWeightedResiduals[*e]= make_pair((*e)->GetTarget(0), (*e)->GetWeight());
         fSumOfWeights+=(*e)->GetWeight();
         temp.push_back(make_pair(fWeightedResiduals[*e].first,fWeightedResiduals[*e].second));
      }
      Double_t weightedMedian = GetWeightedQuantile(temp,0.5, fSumOfWeights);
     
      //Store the weighted median as the first boost weight for later use
      fBoostWeights.push_back(weightedMedian);
      std::map<const TMVA::Event*, std::pair<Double_t, Double_t> >::iterator res = fWeightedResiduals.begin();
      for (; res!=fWeightedResiduals.end(); ++res ) {
         //subtract the global median from all residuals
         (*res).second.first -= weightedMedian;  
      }

      UpdateTargetsRegression(*fTrainSample,kTRUE);

      return;
   }
   else if(DoMulticlass()){
      UInt_t nClasses = DataInfo().GetNClasses();
      for (std::vector<const TMVA::Event*>::const_iterator e=eventSample.begin(); e!=eventSample.end();e++) {
         for (UInt_t i=0;i<nClasses;i++){
            //Calculate initial residuals, assuming equal probability for all classes
            Double_t r = (*e)->GetClass()==i?(1-1.0/nClasses):(-1.0/nClasses);
            const_cast<TMVA::Event*>(*e)->SetTarget(i,r);
            fResiduals[*e].push_back(0);   
         }
      }
   }
   else{
      for (std::vector<const TMVA::Event*>::const_iterator e=eventSample.begin(); e!=eventSample.end();e++) {
         Double_t r = (DataInfo().IsSignal(*e)?1:0)-0.5; //Calculate initial residual
         const_cast<TMVA::Event*>(*e)->SetTarget(0,r);
         fResiduals[*e].push_back(0);         
      }
   }

}
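
// Note: for two-class classification the initial target is y - p with the flat
// prior p = 0.5 (i.e. +0.5 for signal and -0.5 for background events); the
// multiclass branch uses the analogous flat prior 1/nClasses.
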
//_______________________________________________________________________
Double_t TMVA::MethodBDT::TestTreeQuality( DecisionTree *dt )
{
   // test the tree quality in terms of misclassification: returns the weighted
   // fraction of correctly classified events in the validation sample

   Double_t ncorrect=0, nfalse=0;
   for (UInt_t ievt=0; ievt<fValidationSample.size(); ievt++) {
      Bool_t isSignalType= (dt->CheckEvent(fValidationSample[ievt]) > fNodePurityLimit ) ? 1 : 0;

      if (isSignalType == (DataInfo().IsSignal(fValidationSample[ievt])) ) {
         ncorrect += fValidationSample[ievt]->GetWeight();
      }
      else{
         nfalse += fValidationSample[ievt]->GetWeight();
      }
   }

   return  ncorrect / (ncorrect + nfalse);
}

//_______________________________________________________________________
Double_t TMVA::MethodBDT::Boost( std::vector<const TMVA::Event*>& eventSample, DecisionTree *dt, UInt_t cls )
{
   // apply the boosting algorithm (the algorithm is selected via the "option" given
   // in the constructor). The return value is the boost weight.

   Double_t returnVal=-1;

   if      (fBoostType=="AdaBoost")    returnVal = this->AdaBoost  (eventSample, dt);
   else if (fBoostType=="AdaCost")     returnVal = this->AdaCost   (eventSample, dt);
   else if (fBoostType=="Bagging")     returnVal = this->Bagging   ( );
   else if (fBoostType=="RegBoost")    returnVal = this->RegBoost  (eventSample, dt);
   else if (fBoostType=="AdaBoostR2")  returnVal = this->AdaBoostR2(eventSample, dt);
   else if (fBoostType=="Grad"){
      if(DoRegression())
         returnVal = this->GradBoostRegression(eventSample, dt);
      else if(DoMulticlass())
         returnVal = this->GradBoost (eventSample, dt, cls);
      else
         returnVal = this->GradBoost (eventSample, dt);
   }
   else {
      Log() << kINFO << GetOptions() << Endl;
      Log() << kFATAL << "<Boost> unknown boost option " << fBoostType<< " called" << Endl;
   }

   if (fBaggedBoost){
      GetBaggedSubSample(fEventSample);
   }


   return returnVal;
}
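
// Example (user-side sketch, not part of this class): the boost algorithm
// dispatched above is chosen through the "BoostType" option of the booking
// string, e.g.
//
//    factory->BookMethod( TMVA::Types::kBDT, "BDTG",
//                         "NTrees=1000:BoostType=Grad:Shrinkage=0.10:MaxDepth=3" );
//
// (illustrative option values only; see the option declarations for the defaults)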

//_______________________________________________________________________
void TMVA::MethodBDT::BoostMonitor(Int_t iTree)
{
   // fills the ROC-integral-vs-tree-number monitoring plots during the
   // training, using the testing events

   Results* results = Data()->GetResults(GetMethodName(),Types::kTraining, Types::kMaxAnalysisType);

   TH1F *tmpS = new TH1F( "tmpS", "",     100 , -1., 1.00001 );
   TH1F *tmpB = new TH1F( "tmpB", "",     100 , -1., 1.00001 );
   TH1F *tmp;


   UInt_t signalClassNr = DataInfo().GetClassInfo("Signal")->GetNumber();
 
   // const std::vector<Event*> events=Data()->GetEventCollection(Types::kTesting);
   // //   fMethod->GetTransformationHandler().CalcTransformations(fMethod->Data()->GetEventCollection(Types::kTesting));
   // for (UInt_t iev=0; iev < events.size() ; iev++){
   //    if (events[iev]->GetClass() == signalClassNr) tmp=tmpS;
   //    else                                          tmp=tmpB;
   //    tmp->Fill(PrivateGetMvaValue(*(events[iev])),events[iev]->GetWeight());
   // }
   
   UInt_t nevents = Data()->GetNTestEvents();
   for (UInt_t iev=0; iev < nevents; iev++){
      const Event* event = GetTestingEvent(iev);

      if (event->GetClass() == signalClassNr) {tmp=tmpS;}
      else                                    {tmp=tmpB;}
      tmp->Fill(PrivateGetMvaValue(event),event->GetWeight()); 
   }
   Double_t max=1;

   std::vector<TH1F*> hS;
   std::vector<TH1F*> hB;
   for (UInt_t ivar=0; ivar<GetNvar(); ivar++){
      hS.push_back(new TH1F(Form("SigVar%dAtTree%d",ivar,iTree),Form("SigVar%dAtTree%d",ivar,iTree),100,DataInfo().GetVariableInfo(ivar).GetMin(),DataInfo().GetVariableInfo(ivar).GetMax()));
      hB.push_back(new TH1F(Form("BkgVar%dAtTree%d",ivar,iTree),Form("BkgVar%dAtTree%d",ivar,iTree),100,DataInfo().GetVariableInfo(ivar).GetMin(),DataInfo().GetVariableInfo(ivar).GetMax()));
      results->Store(hS.back(),hS.back()->GetTitle());
      results->Store(hB.back(),hB.back()->GetTitle());
   }
   

   for (UInt_t iev=0; iev < fEventSample.size(); iev++){
      if (fEventSample[iev]->GetBoostWeight() > max) max = 1.01*fEventSample[iev]->GetBoostWeight();
   }
   TH1F *tmpBoostWeightsS = new TH1F(Form("BoostWeightsInTreeS%d",iTree),Form("BoostWeightsInTreeS%d",iTree),100,0.,max); 
   TH1F *tmpBoostWeightsB = new TH1F(Form("BoostWeightsInTreeB%d",iTree),Form("BoostWeightsInTreeB%d",iTree),100,0.,max); 
   results->Store(tmpBoostWeightsS,tmpBoostWeightsS->GetTitle());
   results->Store(tmpBoostWeightsB,tmpBoostWeightsB->GetTitle());

   TH1F *tmpBoostWeights;
   std::vector<TH1F*> *h;

   for (UInt_t iev=0; iev < fEventSample.size(); iev++){
      if (fEventSample[iev]->GetClass() == signalClassNr) {
         tmpBoostWeights=tmpBoostWeightsS;
         h=&hS;
      }else{
         tmpBoostWeights=tmpBoostWeightsB;
         h=&hB;
      }
      tmpBoostWeights->Fill(fEventSample[iev]->GetBoostWeight());
      for (UInt_t ivar=0; ivar<GetNvar(); ivar++){
         (*h)[ivar]->Fill(fEventSample[iev]->GetValue(ivar),fEventSample[iev]->GetWeight());
      }
   }
   

   TMVA::PDF *sig = new TMVA::PDF( " PDF Sig", tmpS, TMVA::PDF::kSpline3 );
   TMVA::PDF *bkg = new TMVA::PDF( " PDF Bkg", tmpB, TMVA::PDF::kSpline3 );
   

   TGraph*  gr=results->GetGraph("BoostMonitorGraph");
   Int_t nPoints = gr->GetN();
   gr->Set(nPoints+1);
   gr->SetPoint(nPoints,(Double_t)iTree+1,GetROCIntegral(sig,bkg));

   tmpS->Delete();
   tmpB->Delete();
   
   delete sig;
   delete bkg;

   return;
}

//_______________________________________________________________________
Double_t TMVA::MethodBDT::AdaBoost( std::vector<const TMVA::Event*>& eventSample, DecisionTree *dt )
{
   // the AdaBoost implementation.
   // a new training sample is generated by re-weighting
   // events that are misclassified by the decision tree. The weight
   // applied is w = (1-err)/err, or more generally:
   //            w = ((1-err)/err)^beta
   // where err is the fraction of misclassified events in the tree (err < 0.5,
   // assuming that the previous selection was better than random guessing)
   // and "beta" is a free parameter (standard: beta = 1) that modifies the
   // boosting.

   Double_t err=0, sumGlobalw=0, sumGlobalwfalse=0, sumGlobalwfalse2=0;

   std::vector<Double_t> sumw(DataInfo().GetNClasses(),0); //for individually re-scaling  each class
   std::map<Node*,Int_t> sigEventsInNode; // how many signal events of the training tree

   Double_t maxDev=0;
   for (std::vector<const TMVA::Event*>::const_iterator e=eventSample.begin(); e!=eventSample.end();e++) {
      Double_t w = (*e)->GetWeight();
      sumGlobalw += w;
      UInt_t iclass=(*e)->GetClass();
      sumw[iclass] += w;

      if ( DoRegression() ) {
         Double_t tmpDev = TMath::Abs(dt->CheckEvent(*e,kFALSE) - (*e)->GetTarget(0) ); 
         sumGlobalwfalse += w * tmpDev;
         sumGlobalwfalse2 += w * tmpDev*tmpDev;
         if (tmpDev > maxDev) maxDev = tmpDev;
      }else{

         if (fUseYesNoLeaf){ 
            Bool_t isSignalType = (dt->CheckEvent(*e,fUseYesNoLeaf) > fNodePurityLimit );
            if (!(isSignalType == DataInfo().IsSignal(*e))) {
               sumGlobalwfalse+= w;
            }
         }else{
            Double_t dtoutput = (dt->CheckEvent(*e,fUseYesNoLeaf) - 0.5)*2.;
            Int_t    trueType;
            if (DataInfo().IsSignal(*e)) trueType = 1;
            else trueType = -1;
            sumGlobalwfalse+= w*trueType*dtoutput;
         }
      }
   }

   err = sumGlobalwfalse/sumGlobalw ;
   if ( DoRegression() ) {
      // compute the error according to the chosen AdaBoostR2 loss type:
      if (fAdaBoostR2Loss=="linear"){
         err = sumGlobalwfalse/maxDev/sumGlobalw ;
      }
      else if (fAdaBoostR2Loss=="quadratic"){
         err = sumGlobalwfalse2/maxDev/maxDev/sumGlobalw ;
      }
      else if (fAdaBoostR2Loss=="exponential"){
         err = 0;
         for (std::vector<const TMVA::Event*>::const_iterator e=eventSample.begin(); e!=eventSample.end();e++) {
            Double_t w = (*e)->GetWeight();
            Double_t  tmpDev = TMath::Abs(dt->CheckEvent(*e,kFALSE) - (*e)->GetTarget(0) ); 
            err += w * (1 - exp (-tmpDev/maxDev)) / sumGlobalw;
         }
         
      }
      else {
         Log() << kFATAL << " You have chosen a loss type for AdaBoost other than linear, quadratic or exponential, "
               << "namely " << fAdaBoostR2Loss << ",\n"
               << "which is not implemented... a typo in the options ??" <<Endl;
      }
   }

   Log() << kDEBUG << "BDT AdaBoost  wrong/all: " << sumGlobalwfalse << "/" << sumGlobalw << Endl;


   Double_t newSumGlobalw=0;
   std::vector<Double_t> newSumw(sumw.size(),0);

   Double_t boostWeight=1.;
   if (err >= 0.5 && fUseYesNoLeaf) { // sanity check ... should never happen as otherwise there is apparently
      // something odd with the assignment of the leaf nodes (remember: you use the training
      // events for this determination of the error rate)
      if (dt->GetNNodes() == 1){
         Log() << kERROR << " YOUR tree has only 1 node... kind of a funny *tree*. I cannot "
               << "boost such a thing... if after 1 step the error rate is == 0.5,"
               << Endl
               << "please check why this happens; maybe too many events per node were requested?"
               << Endl;
         
      }else{
         Log() << kERROR << " The error rate in the BDT boosting is > 0.5 ("<< err
               << "). That should not happen; please check your code (i.e. the BDT code). I"
               << " stop boosting here." <<  Endl;
         return -1;
      }
      err = 0.5;
   } else if (err < 0) {
      Log() << kERROR << " The error rate in the BDT boosting is < 0. That can happen"
            << " due to improper treatment of negative weights in a Monte Carlo (if you have"
            << " an idea on how to do it in a better way, please let me know: Helge.Voss@cern.ch)."
            << " For the time being I set it to its absolute value, just to continue." <<  Endl;
      err = TMath::Abs(err);
   }
   if (fUseYesNoLeaf)
      boostWeight = TMath::Log((1.-err)/err)*fAdaBoostBeta;
   else
      boostWeight = TMath::Log((1.+err)/(1-err))*fAdaBoostBeta;
   
   
   Log() << kDEBUG << "BDT AdaBoost  wrong/all: " << sumGlobalwfalse << "/" << sumGlobalw << " boostWeight=" << boostWeight << " log(boostWeight)=" << TMath::Log(boostWeight) << Endl;

   Results* results = Data()->GetResults(GetMethodName(),Types::kTraining, Types::kMaxAnalysisType);


   for (std::vector<const TMVA::Event*>::const_iterator e=eventSample.begin(); e!=eventSample.end();e++) {
 
      if (fUseYesNoLeaf||DoRegression()){ 
         if ((!( (dt->CheckEvent(*e,fUseYesNoLeaf) > fNodePurityLimit ) == DataInfo().IsSignal(*e))) || DoRegression()) {
            Double_t boostfactor = TMath::Exp(boostWeight);
         
            if (DoRegression()) boostfactor = TMath::Power(1/boostWeight,(1.-TMath::Abs(dt->CheckEvent(*e,kFALSE) - (*e)->GetTarget(0) )/maxDev ) );
            if ( (*e)->GetWeight() > 0 ){
               (*e)->SetBoostWeight( (*e)->GetBoostWeight() * boostfactor);
               // Helge change back            (*e)->ScaleBoostWeight(boostfactor);
               if (DoRegression()) results->GetHist("BoostWeights")->Fill(boostfactor);
            } else {
               if ( fInverseBoostNegWeights )(*e)->ScaleBoostWeight( 1. / boostfactor); // if the original event weight is negative, and you want to "increase" the event's "positive" influence, you'd rather make the event weight "smaller" in terms of its absolute value while still keeping it negative
               else (*e)->SetBoostWeight( (*e)->GetBoostWeight() * boostfactor);
               
            }
         }

      }else{
         Double_t dtoutput = (dt->CheckEvent(*e,fUseYesNoLeaf) - 0.5)*2.;
         Int_t    trueType;
         if (DataInfo().IsSignal(*e)) trueType = 1;
         else trueType = -1;
         Double_t boostfactor = TMath::Exp(-1*boostWeight*trueType*dtoutput);
         
         if ( (*e)->GetWeight() > 0 ){
            (*e)->SetBoostWeight( (*e)->GetBoostWeight() * boostfactor);
            // Helge change back            (*e)->ScaleBoostWeight(boostfactor);
            if (DoRegression()) results->GetHist("BoostWeights")->Fill(boostfactor);
         } else {
            if ( fInverseBoostNegWeights )(*e)->ScaleBoostWeight( 1. / boostfactor); // if the original event weight is negative, and you want to "increase" the event's "positive" influence, you'd rather make the event weight "smaller" in terms of its absolute value while still keeping it negative
            else (*e)->SetBoostWeight( (*e)->GetBoostWeight() * boostfactor);
         }
      }
      newSumGlobalw+=(*e)->GetWeight();
      newSumw[(*e)->GetClass()] += (*e)->GetWeight();
   }


   //   Double_t globalNormWeight=sumGlobalw/newSumGlobalw;
   Double_t globalNormWeight=( (Double_t) eventSample.size())/newSumGlobalw;
   Log() << kDEBUG << "new Nsig="<<newSumw[0]*globalNormWeight << " new Nbkg="<<newSumw[1]*globalNormWeight << Endl;


   for (std::vector<const TMVA::Event*>::const_iterator e=eventSample.begin(); e!=eventSample.end();e++) {
      //      if (fRenormByClass) (*e)->ScaleBoostWeight( normWeightByClass[(*e)->GetClass()] );
      //      else                (*e)->ScaleBoostWeight( globalNormWeight );
      if (DataInfo().IsSignal(*e))(*e)->ScaleBoostWeight( globalNormWeight * fSigToBkgFraction );
      else                (*e)->ScaleBoostWeight( globalNormWeight );
   }

   if (!(DoRegression()))results->GetHist("BoostWeights")->Fill(boostWeight);
   results->GetHist("BoostWeightsVsTree")->SetBinContent(fForest.size(),boostWeight);
   results->GetHist("ErrorFrac")->SetBinContent(fForest.size(),err);

   fBoostWeight = boostWeight;
   fErrorFraction = err;

   return boostWeight;
}
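
// Worked example (sketch): with fAdaBoostBeta = 1, fUseYesNoLeaf = true and an
// error fraction err = 0.2, the tree weight is boostWeight = log(0.8/0.2) ~= 1.386
// and each misclassified event with positive weight has its boost weight multiplied
// by exp(1.386) ~= 4; the subsequent renormalisation then rescales the sum of event
// weights back to the number of training events (modulo the fSigToBkgFraction
// factor applied to signal events).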


//_______________________________________________________________________
Double_t TMVA::MethodBDT::AdaCost( vector<const TMVA::Event*>& eventSample, DecisionTree *dt )
{
   // the AdaCost boosting algorithm takes a simple cost matrix (currently fixed for
   // all events... later it could be modified to use individual cost matrices for each
   // event as in the original paper...
   // 
   //                   true_signal true_bkg 
   //     ----------------------------------
   //     sel_signal |   Css         Ctb_ss    Cxx.. in the range [0,1]
   //     sel_bkg    |   Cts_sb      Cbb
   //
   //    and takes this into account when calculating the misclassification cost (formerly: error fraction):
   //    
   //    err = sum_events( weight * y_true * y_sel * beta(event) )
   //
 
   Double_t Css = fCss; 
   Double_t Cbb = fCbb; 
   Double_t Cts_sb = fCts_sb; 
   Double_t Ctb_ss = fCtb_ss; 

   Double_t err=0, sumGlobalWeights=0, sumGlobalCost=0;

   std::vector<Double_t> sumw(DataInfo().GetNClasses(),0);      //for individually re-scaling  each class
   std::map<Node*,Int_t> sigEventsInNode; // how many signal events of the training tree

   for (vector<const TMVA::Event*>::const_iterator e=eventSample.begin(); e!=eventSample.end();e++) {
      Double_t w = (*e)->GetWeight();
      sumGlobalWeights += w;
      UInt_t iclass=(*e)->GetClass();

      sumw[iclass] += w;

      if ( DoRegression() ) {
         Log() << kFATAL << " AdaCost not implemented for regression"<<Endl;
      }else{
  
         Double_t dtoutput = (dt->CheckEvent(*e,false) - 0.5)*2.;
         Int_t    trueType;
         Bool_t   isTrueSignal = DataInfo().IsSignal(*e);
         Bool_t   isSelectedSignal = (dtoutput>0);
         if (isTrueSignal) trueType = 1;
         else trueType = -1;

         Double_t cost=0;
         if       (isTrueSignal  && isSelectedSignal)  cost=Css;
         else if  (isTrueSignal  && !isSelectedSignal) cost=Cts_sb;
         else if  (!isTrueSignal  && isSelectedSignal) cost=Ctb_ss;
         else if  (!isTrueSignal && !isSelectedSignal) cost=Cbb;
         else Log() << kERROR << "something went wrong in AdaCost" << Endl;

         sumGlobalCost+= w*trueType*dtoutput*cost;

      }
   }

   if ( DoRegression() ) {
      Log() << kFATAL << " AdaCost not implemented for regression"<<Endl;
   }

   //   Log() << kDEBUG << "BDT AdaBoos  wrong/all: " << sumGlobalCost << "/" << sumGlobalWeights << Endl;
   //      Log() << kWARNING << "BDT AdaBoos  wrong/all: " << sumGlobalCost << "/" << sumGlobalWeights << Endl;
   sumGlobalCost /= sumGlobalWeights;
   //   Log() << kWARNING << "BDT AdaBoos  wrong/all: " << sumGlobalCost << "/" << sumGlobalWeights << Endl;


   Double_t newSumGlobalWeights=0;
   vector<Double_t> newSumClassWeights(sumw.size(),0);

   Double_t boostWeight = TMath::Log((1+sumGlobalCost)/(1-sumGlobalCost)) * fAdaBoostBeta;

   Results* results = Data()->GetResults(GetMethodName(),Types::kTraining, Types::kMaxAnalysisType);

   for (vector<const TMVA::Event*>::const_iterator e=eventSample.begin(); e!=eventSample.end();e++) {
      Double_t dtoutput = (dt->CheckEvent(*e,false) - 0.5)*2.;
      Int_t    trueType;
      Bool_t   isTrueSignal = DataInfo().IsSignal(*e);
      Bool_t   isSelectedSignal = (dtoutput>0);
      if (isTrueSignal) trueType = 1;
      else trueType = -1;
     
      Double_t cost=0;
      if       (isTrueSignal  && isSelectedSignal)  cost=Css;
      else if  (isTrueSignal  && !isSelectedSignal) cost=Cts_sb;
      else if  (!isTrueSignal  && isSelectedSignal) cost=Ctb_ss;
      else if  (!isTrueSignal && !isSelectedSignal) cost=Cbb;
      else Log() << kERROR << "something went wrong in AdaCost" << Endl;

      Double_t boostfactor = TMath::Exp(-1*boostWeight*trueType*dtoutput*cost);
      if (DoRegression())Log() << kFATAL << " AdaCost not implemented for regression"<<Endl; 
      if ( (*e)->GetWeight() > 0 ){
         (*e)->SetBoostWeight( (*e)->GetBoostWeight() * boostfactor);
         // Helge change back            (*e)->ScaleBoostWeight(boostfactor);
         if (DoRegression())Log() << kFATAL << " AdaCost not implemented for regression"<<Endl;
      } else {
         if ( fInverseBoostNegWeights )(*e)->ScaleBoostWeight( 1. / boostfactor); // if the original event weight is negative, and you want to "increase" the event's "positive" influence, you'd rather make the event weight "smaller" in terms of its absolute value while still keeping it negative
      }

      newSumGlobalWeights+=(*e)->GetWeight();
      newSumClassWeights[(*e)->GetClass()] += (*e)->GetWeight();
   }


   //  Double_t globalNormWeight=sumGlobalWeights/newSumGlobalWeights;
   Double_t globalNormWeight=Double_t(eventSample.size())/newSumGlobalWeights;
   Log() << kDEBUG << "new Nsig="<<newSumClassWeights[0]*globalNormWeight << " new Nbkg="<<newSumClassWeights[1]*globalNormWeight << Endl;


   for (std::vector<const TMVA::Event*>::const_iterator e=eventSample.begin(); e!=eventSample.end();e++) {
      // if (fRenormByClass) (*e)->ScaleBoostWeight( normWeightByClass[(*e)->GetClass()] );
      // else                (*e)->ScaleBoostWeight( globalNormWeight );
      if (DataInfo().IsSignal(*e))(*e)->ScaleBoostWeight( globalNormWeight * fSigToBkgFraction );
      else                (*e)->ScaleBoostWeight( globalNormWeight );
   }


   if (!(DoRegression()))results->GetHist("BoostWeights")->Fill(boostWeight);
   results->GetHist("BoostWeightsVsTree")->SetBinContent(fForest.size(),boostWeight);
   results->GetHist("ErrorFrac")->SetBinContent(fForest.size(),err);

   fBoostWeight = boostWeight;
   fErrorFraction = err;


   return boostWeight;
}
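
// Note: with Css = Cbb = Cts_sb = Ctb_ss = 1 the cost-weighted sum above reduces to
// the real-valued AdaBoost misclassification measure, and the boost weight
// log((1+x)/(1-x))*fAdaBoostBeta coincides with the non-YesNoLeaf AdaBoost case, so
// AdaCost then essentially behaves like plain (real-valued) AdaBoost.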


//_______________________________________________________________________
Double_t TMVA::MethodBDT::Bagging( )
{
   // call it boot-strapping, re-sampling or whatever you like, in the end it is nothing
   // else but applying "random" Poisson weights to each event.

   // this is now done in "MethodBDT::Boost" as it might be used by other boost methods, too
   // GetBaggedSample(eventSample);

   return 1.;  //here as there are random weights for each event, just return a constant==1;
}

//_______________________________________________________________________
void TMVA::MethodBDT::GetBaggedSubSample(std::vector<const TMVA::Event*>& eventSample)
{
   // fills fSubSample with, on average, fBaggedSampleFraction*NEvents random training events

  
   Double_t n;
   TRandom3 *trandom   = new TRandom3(100*fForest.size()+1234);

   if (!fSubSample.empty()) fSubSample.clear();

   for (std::vector<const TMVA::Event*>::const_iterator e=eventSample.begin(); e!=eventSample.end();e++) {
      n = trandom->PoissonD(fBaggedSampleFraction);
      for (Int_t i=0;i<n;i++) fSubSample.push_back(*e);
   }
   
   delete trandom;
   return;

   /*
     UInt_t nevents = fEventSample.size();
   
     if (!fSubSample.empty()) fSubSample.clear();
     TRandom3 *trandom   = new TRandom3(fForest.size()+1);

     for (UInt_t ievt=0; ievt<nevents; ievt++) { // recreate new random subsample
     if(trandom->Rndm()<fBaggedSampleFraction)
     fSubSample.push_back(fEventSample[ievt]);
     }
     delete trandom;
   */

}
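
// Example (sketch): with BaggedSampleFraction = 0.6 each training event enters the
// subsample n times, with n drawn from a Poisson distribution of mean 0.6; the
// bagged sample therefore contains on average 0.6*N entries, and a fraction
// exp(-0.6) ~= 55% of the events are left out of any given subsample.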

//_______________________________________________________________________
Double_t TMVA::MethodBDT::RegBoost( std::vector<const TMVA::Event*>& /* eventSample */, DecisionTree* /* dt */ )
{
   // a special boosting only for Regression ...
   // maybe I'll implement it later...

   return 1;
}

//_______________________________________________________________________
Double_t TMVA::MethodBDT::AdaBoostR2( std::vector<const TMVA::Event*>& eventSample, DecisionTree *dt )
{
   // adaptation of AdaBoost to regression problems (see H. Drucker 1997)

   if ( !DoRegression() ) Log() << kFATAL << "Somehow you chose a regression boost method for a classification job" << Endl;

   Double_t err=0, sumw=0, sumwfalse=0, sumwfalse2=0;
   Double_t maxDev=0;
   for (std::vector<const TMVA::Event*>::const_iterator e=eventSample.begin(); e!=eventSample.end();e++) {
      Double_t w = (*e)->GetWeight();
      sumw += w;

      Double_t  tmpDev = TMath::Abs(dt->CheckEvent(*e,kFALSE) - (*e)->GetTarget(0) );
      sumwfalse  += w * tmpDev;
      sumwfalse2 += w * tmpDev*tmpDev;
      if (tmpDev > maxDev) maxDev = tmpDev;
   }

   // compute the error according to the chosen AdaBoostR2 loss type:
   if (fAdaBoostR2Loss=="linear"){
      err = sumwfalse/maxDev/sumw ;
   }
   else if (fAdaBoostR2Loss=="quadratic"){
      err = sumwfalse2/maxDev/maxDev/sumw ;
   }
   else if (fAdaBoostR2Loss=="exponential"){
      err = 0;
      for (std::vector<const TMVA::Event*>::const_iterator e=eventSample.begin(); e!=eventSample.end();e++) {
         Double_t w = (*e)->GetWeight();
         Double_t  tmpDev = TMath::Abs(dt->CheckEvent(*e,kFALSE) - (*e)->GetTarget(0) ); 
         err += w * (1 - exp (-tmpDev/maxDev)) / sumw;
      }
      
   }
   else {
      Log() << kFATAL << " You have chosen a loss type for AdaBoost other than linear, quadratic or exponential, "
            << "namely " << fAdaBoostR2Loss << ",\n"
            << "which is not implemented... a typo in the options ??" <<Endl;
   }


   if (err >= 0.5) { // sanity check ... should never happen as otherwise there is apparently
      // something odd with the assignment of the leaf nodes (remember: you use the training
      // events for this determination of the error rate)
      if (dt->GetNNodes() == 1){
         Log() << kERROR << " YOUR tree has only 1 node... kind of a funny *tree*. I cannot "
               << "boost such a thing... if after 1 step the error rate is == 0.5,"
               << Endl
               << "please check why this happens; maybe too many events per node were requested?"
               << Endl;
         
      }else{
         Log() << kERROR << " The error rate in the BDT boosting is > 0.5 ("<< err
               << "). That should not happen, but is possible for regression trees, and"
               << " should trigger a stop for the boosting. Please check your code (i.e. the BDT code); I"
               << " stop boosting." <<  Endl;
         return -1;
      }
      err = 0.5;
   } else if (err < 0) {
      Log() << kERROR << " The error rate in the BDT boosting is < 0. That can happen"
            << " due to improper treatment of negative weights in a Monte Carlo (if you have"
            << " an idea on how to do it in a better way, please let me know: Helge.Voss@cern.ch)."
            << " For the time being I set it to its absolute value, just to continue." <<  Endl;
      err = TMath::Abs(err);
   }

   Double_t boostWeight = err / (1.-err);
   Double_t newSumw=0;

   Results* results = Data()->GetResults(GetMethodName(), Types::kTraining, Types::kMaxAnalysisType);

   for (std::vector<const TMVA::Event*>::const_iterator e=eventSample.begin(); e!=eventSample.end();e++) {
      Double_t boostfactor =  TMath::Power(boostWeight,(1.-TMath::Abs(dt->CheckEvent(*e,kFALSE) - (*e)->GetTarget(0) )/maxDev ) );
      results->GetHist("BoostWeights")->Fill(boostfactor);
      //      std::cout << "R2  " << boostfactor << "   " << boostWeight << "   " << (1.-TMath::Abs(dt->CheckEvent(*e,kFALSE) - (*e)->GetTarget(0) )/maxDev)  << std::endl;
      if ( (*e)->GetWeight() > 0 ){
         Float_t newBoostWeight = (*e)->GetBoostWeight() * boostfactor;
         Float_t newWeight = (*e)->GetWeight() * (*e)->GetBoostWeight() * boostfactor;
         if (newWeight == 0) {
            Log() << kINFO << "Weight=    "   <<   (*e)->GetWeight() << Endl;
            Log() << kINFO  << "BoostWeight= " <<   (*e)->GetBoostWeight() << Endl;
            Log() << kINFO  << "boostweight="<<boostWeight << "  err= " <<err << Endl; 
            Log() << kINFO  << "NewBoostWeight= " <<   newBoostWeight << Endl;
            Log() << kINFO  << "boostfactor= " <<  boostfactor << Endl;
            Log() << kINFO  << "maxDev     = " <<  maxDev << Endl;
            Log() << kINFO  << "tmpDev     = " <<  TMath::Abs(dt->CheckEvent(*e,kFALSE) - (*e)->GetTarget(0) ) << Endl;
            Log() << kINFO  << "target     = " <<  (*e)->GetTarget(0)  << Endl; 
            Log() << kINFO  << "estimate   = " <<  dt->CheckEvent(*e,kFALSE)  << Endl;
         }
         (*e)->SetBoostWeight( newBoostWeight );
         //         (*e)->SetBoostWeight( (*e)->GetBoostWeight() * boostfactor);
      } else {
         (*e)->SetBoostWeight( (*e)->GetBoostWeight() / boostfactor);
      }
      newSumw+=(*e)->GetWeight();
   }

   // re-normalise the weights
   Double_t normWeight =  sumw / newSumw;
   for (std::vector<const TMVA::Event*>::const_iterator e=eventSample.begin(); e!=eventSample.end();e++) {
      //Helge    (*e)->ScaleBoostWeight( sumw/newSumw);
      // (*e)->ScaleBoostWeight( normWeight);
      (*e)->SetBoostWeight( (*e)->GetBoostWeight() * normWeight );
   }


   results->GetHist("BoostWeightsVsTree")->SetBinContent(fForest.size(),1./boostWeight);
   results->GetHist("ErrorFrac")->SetBinContent(fForest.size(),err);

   fBoostWeight = boostWeight;
   fErrorFraction = err;

   return TMath::Log(1./boostWeight);
}
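
// Worked example (sketch): with err = 0.2 the weight factor is
// boostWeight = err/(1-err) = 0.25; an event predicted perfectly (deviation 0) gets
// boostfactor = 0.25^1 = 0.25 (down-weighted), an event at the maximum deviation
// gets boostfactor = 0.25^0 = 1 (unchanged), and the value returned as the tree
// weight for the forest is log(1/0.25) ~= 1.386.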

//_______________________________________________________________________
void TMVA::MethodBDT::AddWeightsXMLTo( void* parent ) const
{
   // write weights to XML
   void* wght = gTools().AddChild(parent, "Weights");

   if (fDoPreselection){
      for (UInt_t ivar=0; ivar<GetNvar(); ivar++){
         gTools().AddAttr( wght, Form("PreselectionLowBkgVar%d",ivar),      fIsLowBkgCut[ivar]);
         gTools().AddAttr( wght, Form("PreselectionLowBkgVar%dValue",ivar), fLowBkgCut[ivar]);
         gTools().AddAttr( wght, Form("PreselectionLowSigVar%d",ivar),      fIsLowSigCut[ivar]);
         gTools().AddAttr( wght, Form("PreselectionLowSigVar%dValue",ivar), fLowSigCut[ivar]);
         gTools().AddAttr( wght, Form("PreselectionHighBkgVar%d",ivar),     fIsHighBkgCut[ivar]);
         gTools().AddAttr( wght, Form("PreselectionHighBkgVar%dValue",ivar),fHighBkgCut[ivar]);
         gTools().AddAttr( wght, Form("PreselectionHighSigVar%d",ivar),     fIsHighSigCut[ivar]);
         gTools().AddAttr( wght, Form("PreselectionHighSigVar%dValue",ivar),fHighSigCut[ivar]);
      }
   }


   gTools().AddAttr( wght, "NTrees", fForest.size() );
   gTools().AddAttr( wght, "AnalysisType", fForest.back()->GetAnalysisType() );

   for (UInt_t i=0; i< fForest.size(); i++) {
      void* trxml = fForest[i]->AddXMLTo(wght);
      gTools().AddAttr( trxml, "boostWeight", fBoostWeights[i] );
      gTools().AddAttr( trxml, "itree", i );
   }
}

//_______________________________________________________________________
void TMVA::MethodBDT::ReadWeightsFromXML(void* parent) {
   // reads the BDT from the xml file

   UInt_t i;
   for (i=0; i<fForest.size(); i++) delete fForest[i];
   fForest.clear();
   fBoostWeights.clear();

   UInt_t ntrees;
   UInt_t analysisType;
   Float_t boostWeight;

   
   if (gTools().HasAttr( parent, Form("PreselectionLowBkgVar%d",0))) {
      fIsLowBkgCut.resize(GetNvar()); 
      fLowBkgCut.resize(GetNvar());   
      fIsLowSigCut.resize(GetNvar()); 
      fLowSigCut.resize(GetNvar());   
      fIsHighBkgCut.resize(GetNvar());
      fHighBkgCut.resize(GetNvar());  
      fIsHighSigCut.resize(GetNvar());
      fHighSigCut.resize(GetNvar());  

      Bool_t tmpBool;
      Double_t tmpDouble;
      for (UInt_t ivar=0; ivar<GetNvar(); ivar++){
         gTools().ReadAttr( parent, Form("PreselectionLowBkgVar%d",ivar), tmpBool);
         fIsLowBkgCut[ivar]=tmpBool;
         gTools().ReadAttr( parent, Form("PreselectionLowBkgVar%dValue",ivar), tmpDouble);
         fLowBkgCut[ivar]=tmpDouble;
         gTools().ReadAttr( parent, Form("PreselectionLowSigVar%d",ivar), tmpBool);    
         fIsLowSigCut[ivar]=tmpBool;
         gTools().ReadAttr( parent, Form("PreselectionLowSigVar%dValue",ivar), tmpDouble); 
         fLowSigCut[ivar]=tmpDouble;
         gTools().ReadAttr( parent, Form("PreselectionHighBkgVar%d",ivar), tmpBool);   
         fIsHighBkgCut[ivar]=tmpBool;
         gTools().ReadAttr( parent, Form("PreselectionHighBkgVar%dValue",ivar), tmpDouble);
         fHighBkgCut[ivar]=tmpDouble;
         gTools().ReadAttr( parent, Form("PreselectionHighSigVar%d",ivar),tmpBool);   
         fIsHighSigCut[ivar]=tmpBool;
         gTools().ReadAttr( parent, Form("PreselectionHighSigVar%dValue",ivar), tmpDouble);
         fHighSigCut[ivar]=tmpDouble;  
      }
   }


   gTools().ReadAttr( parent, "NTrees", ntrees );
   
   if(gTools().HasAttr(parent, "TreeType")) { // pre 4.1.0 version
      gTools().ReadAttr( parent, "TreeType", analysisType );
   } else {                                 // from 4.1.0 onwards
      gTools().ReadAttr( parent, "AnalysisType", analysisType );      
   }

   void* ch = gTools().GetChild(parent);
   i=0;
   while(ch) {
      fForest.push_back( dynamic_cast<DecisionTree*>( DecisionTree::CreateFromXML(ch, GetTrainingTMVAVersionCode()) ) );
      fForest.back()->SetAnalysisType(Types::EAnalysisType(analysisType));
      fForest.back()->SetTreeID(i++);
      gTools().ReadAttr(ch,"boostWeight",boostWeight);
      fBoostWeights.push_back(boostWeight);
      ch = gTools().GetNextChild(ch);
   }
}

//_______________________________________________________________________
void  TMVA::MethodBDT::ReadWeightsFromStream( std::istream& istr )
{
   // read the weights (BDT coefficients)
   TString dummy;
   //   Types::EAnalysisType analysisType;
   Int_t analysisType(0);

   // coverity[tainted_data_argument]
   istr >> dummy >> fNTrees;
   Log() << kINFO << "Read " << fNTrees << " Decision trees" << Endl;

   for (UInt_t i=0;i<fForest.size();i++) delete fForest[i];
   fForest.clear();
   fBoostWeights.clear();
   Int_t iTree;
   Double_t boostWeight;
   for (int i=0;i<fNTrees;i++) {
      istr >> dummy >> iTree >> dummy >> boostWeight;
      if (iTree != i) {
         fForest.back()->Print( std::cout );
         Log() << kFATAL << "Error while reading weight file; mismatch iTree="
               << iTree << " i=" << i
               << " dummy " << dummy
               << " boostweight " << boostWeight
               << Endl;
      }
      fForest.push_back( new DecisionTree() );
      fForest.back()->SetAnalysisType(Types::EAnalysisType(analysisType));
      fForest.back()->SetTreeID(i);
      fForest.back()->Read(istr, GetTrainingTMVAVersionCode());
      fBoostWeights.push_back(boostWeight);
   }
}

//_______________________________________________________________________
Double_t TMVA::MethodBDT::GetMvaValue( Double_t* err, Double_t* errUpper ){
   return this->GetMvaValue( err, errUpper, 0 );
}

//_______________________________________________________________________
Double_t TMVA::MethodBDT::GetMvaValue( Double_t* err, Double_t* errUpper, UInt_t useNTrees )
{
   // Return the MVA value (range [-1;1]) that classifies the
   // event according to the majority vote from the total number of
   // decision trees.
   const Event* ev = GetEvent();
   if (fDoPreselection) {
      Double_t val = ApplyPreselectionCuts(ev);
      if (TMath::Abs(val)>0.05) return val; 
   }
   return PrivateGetMvaValue(ev, err, errUpper, useNTrees);

}
//_______________________________________________________________________
Double_t TMVA::MethodBDT::PrivateGetMvaValue(const TMVA::Event* ev, Double_t* err, Double_t* errUpper, UInt_t useNTrees )
{
   // Return the MVA value (range [-1;1]) that classifies the
   // event according to the majority vote from the total number of
   // decision trees.

   // cannot determine error
   NoErrorCalc(err, errUpper);
   
   // allow for the possibility to use fewer trees in the actual MVA calculation
   // than were originally trained
   UInt_t nTrees = fForest.size();

   if (useNTrees > 0 ) nTrees = useNTrees;

   if (fBoostType=="Grad") return GetGradBoostMVA(ev,nTrees);
   
   Double_t myMVA = 0;
   Double_t norm  = 0;
   for (UInt_t itree=0; itree<nTrees; itree++) {
      //
      myMVA += fBoostWeights[itree] * fForest[itree]->CheckEvent(ev,fUseYesNoLeaf);
      norm  += fBoostWeights[itree];
   }
   return ( norm > std::numeric_limits<double>::epsilon() ) ? myMVA /= norm : 0 ;
}
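
// Note: for all non-gradient boost types the classification MVA value is the
// boost-weight-weighted average of the individual tree outputs, normalized by the
// sum of boost weights, so it stays within the range spanned by a single tree
// response.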


//_______________________________________________________________________
const std::vector<Float_t>& TMVA::MethodBDT::GetMulticlassValues()
{
   // get the multiclass MVA response for the BDT classifier

   const TMVA::Event *e = GetEvent();
   if (fMulticlassReturnVal == NULL) fMulticlassReturnVal = new std::vector<Float_t>();
   fMulticlassReturnVal->clear();

   std::vector<double> temp;

   UInt_t nClasses = DataInfo().GetNClasses();
   for(UInt_t iClass=0; iClass<nClasses; iClass++){
      temp.push_back(0.0);
      for(UInt_t itree = iClass; itree<fForest.size(); itree+=nClasses){
         temp[iClass] += fForest[itree]->CheckEvent(e,kFALSE);
      }
   }    

   for(UInt_t iClass=0; iClass<nClasses; iClass++){
      Double_t norm = 0.0;
      for(UInt_t j=0;j<nClasses;j++){
         if(iClass!=j)
            norm+=exp(temp[j]-temp[iClass]);
      }
      (*fMulticlassReturnVal).push_back(1.0/(1.0+norm));
   }

   
   return *fMulticlassReturnVal;
}
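
// Note: 1/(1 + sum_{j!=i} exp(F_j - F_i)) equals exp(F_i)/sum_j exp(F_j), i.e. the
// per-class values returned above form a softmax of the summed tree responses and
// add up to one over the classes.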




//_______________________________________________________________________
const std::vector<Float_t> & TMVA::MethodBDT::GetRegressionValues()
{
   // get the regression value generated by the BDTs


   if (fRegressionReturnVal == NULL) fRegressionReturnVal = new std::vector<Float_t>();
   fRegressionReturnVal->clear();

   const Event * ev = GetEvent();
   Event * evT = new Event(*ev);

   Double_t myMVA = 0;
   Double_t norm  = 0;
   if (fBoostType=="AdaBoostR2") {
      // rather than using the weighted average of the tree responses in the forest,
      // H. Drucker (1997) proposed to use the "weighted median":

      // sort all individual tree responses according to the prediction value 
      //   (keep the association to their tree weight)
      // then sum up all the associated weights (starting from the one whose tree
      //   yielded the smallest response) up to the tree "t" at which you've
      //   added enough tree weights to have more than half of the sum of all tree weights.
      // choose as the response of the forest the one which belongs to this tree "t"

      vector< Double_t > response(fForest.size());
      vector< Double_t > weight(fForest.size());
      Double_t           totalSumOfWeights = 0;

      for (UInt_t itree=0; itree<fForest.size(); itree++) {
         response[itree]    = fForest[itree]->CheckEvent(ev,kFALSE);
         weight[itree]      = fBoostWeights[itree];
         totalSumOfWeights += fBoostWeights[itree];
      }

      std::vector< std::vector<Double_t> > vtemp;
      vtemp.push_back( response ); // this is the vector that will get sorted
      vtemp.push_back( weight ); 
      gTools().UsefulSortAscending( vtemp );

      Int_t t=0;
      Double_t sumOfWeights = 0;
      while (sumOfWeights <= totalSumOfWeights/2.) {
         sumOfWeights += vtemp[1][t];
         t++;
      }

      Double_t rVal=0;
      Int_t    count=0;
      for (UInt_t i= TMath::Max(UInt_t(0),UInt_t(t-(fForest.size()/6)-0.5)); 
           i< TMath::Min(UInt_t(fForest.size()),UInt_t(t+(fForest.size()/6)+0.5)); i++) {
         count++;
         rVal+=vtemp[0][i];
      }
      //      fRegressionReturnVal->push_back( rVal/Double_t(count));
      evT->SetTarget(0, rVal/Double_t(count) );
   }
   else if(fBoostType=="Grad"){
      for (UInt_t itree=0; itree<fForest.size(); itree++) {
         myMVA += fForest[itree]->CheckEvent(ev,kFALSE);
      }
      //      fRegressionReturnVal->push_back( myMVA+fBoostWeights[0]);
      evT->SetTarget(0, myMVA+fBoostWeights[0] );
   }
   else{
      for (UInt_t itree=0; itree<fForest.size(); itree++) {
         //
         myMVA += fBoostWeights[itree] * fForest[itree]->CheckEvent(ev,kFALSE);
         norm  += fBoostWeights[itree];
      }
      //      fRegressionReturnVal->push_back( ( norm > std::numeric_limits<double>::epsilon() ) ? myMVA /= norm : 0 );
      evT->SetTarget(0, ( norm > std::numeric_limits<double>::epsilon() ) ? myMVA /= norm : 0 );
   }



   const Event* evT2 = GetTransformationHandler().InverseTransform( evT );
   fRegressionReturnVal->push_back( evT2->GetTarget(0) );

   delete evT;


   return *fRegressionReturnVal;
}

//_______________________________________________________________________
void  TMVA::MethodBDT::WriteMonitoringHistosToFile( void ) const
{
   // Here we could write some histograms created during the processing
   // to the output file.
   Log() << kINFO << "Write monitoring histograms to file: " << BaseDir()->GetPath() << Endl;

   //Results* results = Data()->GetResults(GetMethodName(), Types::kTraining, Types::kMaxAnalysisType);
   //results->GetStorage()->Write();
   fMonitorNtuple->Write();
}

//_______________________________________________________________________
vector< Double_t > TMVA::MethodBDT::GetVariableImportance()
{
   // Return the relative variable importance, normalized such that all
   // variables together have importance 1. The importance is
   // evaluated as the total separation gain that this variable had in
   // the decision trees (weighted by the number of events)

   fVariableImportance.resize(GetNvar());
   for (UInt_t ivar = 0; ivar < GetNvar(); ivar++) {
      fVariableImportance[ivar]=0;
   }
   Double_t  sum=0;
   for (UInt_t itree = 0; itree < GetNTrees(); itree++) {
      std::vector<Double_t> relativeImportance(fForest[itree]->GetVariableImportance());
      for (UInt_t i=0; i< relativeImportance.size(); i++) {
         fVariableImportance[i] +=  fBoostWeights[itree] * relativeImportance[i];
      }
   }
   
   for (UInt_t ivar=0; ivar< fVariableImportance.size(); ivar++){
      fVariableImportance[ivar] = TMath::Sqrt(fVariableImportance[ivar]);
      sum += fVariableImportance[ivar];
   }
   for (UInt_t ivar=0; ivar< fVariableImportance.size(); ivar++) fVariableImportance[ivar] /= sum;

   return fVariableImportance;
}

//_______________________________________________________________________
Double_t TMVA::MethodBDT::GetVariableImportance( UInt_t ivar )
{
   // Returns the measure for the variable importance of variable "ivar"
   // which is later used in GetVariableImportance() to calculate the
   // relative variable importances.

   std::vector<Double_t> relativeImportance = this->GetVariableImportance();
   if (ivar < (UInt_t)relativeImportance.size()) return relativeImportance[ivar];
   else Log() << kFATAL << "<GetVariableImportance> ivar = " << ivar << " is out of range " << Endl;

   return -1;
}

//_______________________________________________________________________
const TMVA::Ranking* TMVA::MethodBDT::CreateRanking()
{
   // Compute ranking of input variables

   // create the ranking object
   fRanking = new Ranking( GetName(), "Variable Importance" );
   vector< Double_t> importance(this->GetVariableImportance());

   for (UInt_t ivar=0; ivar<GetNvar(); ivar++) {

      fRanking->AddRank( Rank( GetInputLabel(ivar), importance[ivar] ) );
   }

   return fRanking;
}

//_______________________________________________________________________
void TMVA::MethodBDT::GetHelpMessage() const
{
   // Get help message text
   //
   // typical length of text line:
   //         "|--------------------------------------------------------------|"
   Log() << Endl;
   Log() << gTools().Color("bold") << "--- Short description:" << gTools().Color("reset") << Endl;
   Log() << Endl;
   Log() << "Boosted Decision Trees are a collection of individual decision" << Endl;
   Log() << "trees which form a multivariate classifier by (weighted) majority " << Endl;
   Log() << "vote of the individual trees. Consecutive decision trees are  " << Endl;
   Log() << "trained using the original training data set with re-weighted " << Endl;
   Log() << "events. By default, the AdaBoost method is employed, which gives " << Endl;
   Log() << "events that were misclassified in the previous tree a larger " << Endl;
   Log() << "weight in the training of the following tree." << Endl;
   Log() << Endl;
   Log() << "Decision trees are a sequence of binary splits of the data sample" << Endl;
   Log() << "using a single discriminant variable at a time. A test event " << Endl;
   Log() << "ending up after the sequence of left-right splits in a final " << Endl;
   Log() << "(\"leaf\") node is classified as either signal or background" << Endl;
   Log() << "depending on the majority type of training events in that node." << Endl;
   Log() << Endl;
   Log() << gTools().Color("bold") << "--- Performance optimisation:" << gTools().Color("reset") << Endl;
   Log() << Endl;
   Log() << "By the nature of the binary splits performed on the individual" << Endl;
   Log() << "variables, decision trees do not deal well with linear correlations" << Endl;
   Log() << "between variables (they need to approximate the linear split in" << Endl;
   Log() << "the two dimensional space by a sequence of splits on the two " << Endl;
   Log() << "variables individually). Hence decorrelation could be useful " << Endl;
   Log() << "to optimise the BDT performance." << Endl;
   Log() << Endl;
   Log() << gTools().Color("bold") << "--- Performance tuning via configuration options:" << gTools().Color("reset") << Endl;
   Log() << Endl;
   Log() << "The two most important parameters in the configuration are the  " << Endl;
   Log() << "minimal number of events requested by a leaf node as percentage of the " <<Endl;
   Log() << "   number of training events (option \"MinNodeSize\"  replacing the actual number " << Endl;
   Log() << " of events \"nEventsMin\" as given in earlier versions" << Endl;
   Log() << "If this number is too large, detailed features " << Endl;
   Log() << "in the parameter space are hard to be modelled. If it is too small, " << Endl;
   Log() << "the risk to overtrain rises and boosting seems to be less effective" << Endl;
   Log() << "  typical values from our current expericience for best performance  " << Endl;
   Log() << "  are between 0.5(%) and 10(%) " << Endl;
   Log() << Endl;
   Log() << "The default minimal number is currently set to " << Endl;
   Log() << "   max(20, (N_training_events / N_variables^2 / 10)) " << Endl;
   Log() << "and can be changed by the user." << Endl;
   Log() << Endl;
   Log() << "The other crucial parameter, the pruning strength (\"PruneStrength\")," << Endl;
   Log() << "is also related to overtraining. It is a regularisation parameter " << Endl;
   Log() << "that is used when determining after the training which splits " << Endl;
   Log() << "are considered statistically insignificant and are removed. The" << Endl;
   Log() << "user is advised to carefully watch the BDT screen output for" << Endl;
   Log() << "the comparison between efficiencies obtained on the training and" << Endl;
   Log() << "the independent test sample. They should be equal within statistical" << Endl;
   Log() << "errors, in order to minimize statistical fluctuations in different samples." << Endl;
}

//_______________________________________________________________________
void TMVA::MethodBDT::MakeClassSpecific( std::ostream& fout, const TString& className ) const
{
   // make ROOT-independent C++ class for classifier response (classifier-specific implementation)
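   // For orientation, the generated standalone response (written below) has
   // schematically this shape; "ReadMyBDT"/"MyBDTNode" stand for the actual
   // className/nodeName and the AdaBoost yes/no-leaf case is shown:
   //
   //    double ReadMyBDT::GetMvaValue__( const std::vector<double>& inputValues ) const
   //    {
   //       double myMVA = 0, norm = 0;
   //       for (unsigned int itree=0; itree<fForest.size(); itree++) {
   //          MyBDTNode *current = fForest[itree];
   //          while (current->GetNodeType() == 0)   // descend to a leaf
   //             current = current->GoesRight(inputValues) ? current->GetRight() : current->GetLeft();
   //          myMVA += fBoostWeights[itree] * current->GetNodeType();
   //          norm  += fBoostWeights[itree];
   //       }
   //       return myMVA/norm;   // for BoostType=Grad: 2.0/(1.0+exp(-2.0*myMVA))-1.0 instead
   //    }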

   TString nodeName = className;
   nodeName.ReplaceAll("Read","");
   nodeName.Append("Node");
   // write BDT-specific classifier response
   fout << "   std::vector<"<<nodeName<<"*> fForest;       // i.e. root nodes of decision trees" << std::endl;
   fout << "   std::vector<double>                fBoostWeights; // the weights applied in the individual boosts" << std::endl;
   fout << "};" << std::endl << std::endl;
   fout << "double " << className << "::GetMvaValue__( const std::vector<double>& inputValues ) const" << std::endl;
   fout << "{" << std::endl;
   fout << "   double myMVA = 0;" << std::endl;
   if (fDoPreselection){
      for (UInt_t ivar = 0; ivar< fIsLowBkgCut.size(); ivar++){
         if (fIsLowBkgCut[ivar]){
            fout << "   if (inputValues["<<ivar<<"] < " << fLowBkgCut[ivar] << ") return -1;  // is background preselection cut" << std::endl;
         }
         if (fIsLowSigCut[ivar]){
            fout << "   if (inputValues["<<ivar<<"] < "<< fLowSigCut[ivar] << ") return  1;  // is signal preselection cut" << std::endl;
         }
         if (fIsHighBkgCut[ivar]){
            fout << "   if (inputValues["<<ivar<<"] > "<<fHighBkgCut[ivar] <<")  return -1;  // is background preselection cut" << std::endl;
         }
         if (fIsHighSigCut[ivar]){
            fout << "   if (inputValues["<<ivar<<"] > "<<fHighSigCut[ivar]<<")  return  1;  // is signal preselection cut" << std::endl;
         }
      }
   }

   if (fBoostType!="Grad"){
      fout << "   double norm  = 0;" << std::endl;
   }
   fout << "   for (unsigned int itree=0; itree<fForest.size(); itree++){" << std::endl;
   fout << "      "<<nodeName<<" *current = fForest[itree];" << std::endl;
   fout << "      while (current->GetNodeType() == 0) { //intermediate node" << std::endl;
   fout << "         if (current->GoesRight(inputValues)) current=("<<nodeName<<"*)current->GetRight();" << std::endl;
   fout << "         else current=("<<nodeName<<"*)current->GetLeft();" << std::endl;
   fout << "      }" << std::endl;
   if (fBoostType=="Grad"){
      fout << "      myMVA += current->GetResponse();" << std::endl;
   }else{
      if (fUseYesNoLeaf) fout << "      myMVA += fBoostWeights[itree] *  current->GetNodeType();" << std::endl;
      else               fout << "      myMVA += fBoostWeights[itree] *  current->GetPurity();" << std::endl;
      fout << "      norm  += fBoostWeights[itree];" << std::endl;
   }
   fout << "   }" << std::endl;
   if (fBoostType=="Grad"){
      fout << "   return 2.0/(1.0+exp(-2.0*myMVA))-1.0;" << std::endl;
   }
   else fout << "   return myMVA /= norm;" << std::endl;
   fout << "};" << std::endl << std::endl;
   fout << "void " << className << "::Initialize()" << std::endl;
   fout << "{" << std::endl;
   //Now for each decision tree, write directly the constructors of the nodes in the tree structure
   for (UInt_t itree=0; itree<GetNTrees(); itree++) {
      fout << "  // itree = " << itree << std::endl;
      fout << "  fBoostWeights.push_back(" << fBoostWeights[itree] << ");" << std::endl;
      fout << "  fForest.push_back( " << std::endl;
      this->MakeClassInstantiateNode((DecisionTreeNode*)fForest[itree]->GetRoot(), fout, className);
      fout <<"   );" << std::endl;
   }
   fout << "   return;" << std::endl;
   fout << "};" << std::endl;
   fout << " " << std::endl;
   fout << "// Clean up" << std::endl;
   fout << "inline void " << className << "::Clear() " << std::endl;
   fout << "{" << std::endl;
   fout << "   for (unsigned int itree=0; itree<fForest.size(); itree++) { " << std::endl;
   fout << "      delete fForest[itree]; " << std::endl;
   fout << "   }" << std::endl;
   fout << "}" << std::endl;
}

//_______________________________________________________________________
void TMVA::MethodBDT::MakeClassSpecificHeader(  std::ostream& fout, const TString& className) const
{
   // specific class header
   TString nodeName = className;
   nodeName.ReplaceAll("Read","");
   nodeName.Append("Node");
   //fout << "#ifndef NN" << std::endl; commented out on purpose see next line
   fout << "#define NN new "<<nodeName << std::endl; // NN definition depends on individual methods. Important to have NO #ifndef if several BDT methods compile together
   //fout << "#endif" << std::endl; commented out on purpose see previous line
   fout << "   " << std::endl;
   fout << "#ifndef "<<nodeName<<"__def" << std::endl;
   fout << "#define "<<nodeName<<"__def" << std::endl;
   fout << "   " << std::endl;
   fout << "class "<<nodeName<<" {" << std::endl;
   fout << "   " << std::endl;
   fout << "public:" << std::endl;
   fout << "   " << std::endl;
   fout << "   // constructor of an essentially \"empty\" node floating in space" << std::endl;
   fout << "   "<<nodeName<<" ( "<<nodeName<<"* left,"<<nodeName<<"* right," << std::endl;
   if (fUseFisherCuts){
      fout << "                          int nFisherCoeff," << std::endl;
      for (UInt_t i=0;i<GetNVariables()+1;i++){
         fout << "                          double fisherCoeff"<<i<<"," << std::endl;
      }
   }
   fout << "                          int selector, double cutValue, bool cutType, " << std::endl;
   fout << "                          int nodeType, double purity, double response ) :" << std::endl;
   fout << "   fLeft         ( left         )," << std::endl;
   fout << "   fRight        ( right        )," << std::endl;
   if (fUseFisherCuts) fout << "   fNFisherCoeff ( nFisherCoeff )," << std::endl;
   fout << "   fSelector     ( selector     )," << std::endl;
   fout << "   fCutValue     ( cutValue     )," << std::endl;
   fout << "   fCutType      ( cutType      )," << std::endl;
   fout << "   fNodeType     ( nodeType     )," << std::endl;
   fout << "   fPurity       ( purity       )," << std::endl;
   fout << "   fResponse     ( response     ){" << std::endl;
   if (fUseFisherCuts){
      for (UInt_t i=0;i<GetNVariables()+1;i++){
         fout << "     fFisherCoeff.push_back(fisherCoeff"<<i<<");" << std::endl;
      }
   }
   fout << "   }" << std::endl << std::endl;
   fout << "   virtual ~"<<nodeName<<"();" << std::endl << std::endl;
   fout << "   // test event if it decends the tree at this node to the right" << std::endl;
   fout << "   virtual bool GoesRight( const std::vector<double>& inputValues ) const;" << std::endl;
   fout << "   "<<nodeName<<"* GetRight( void )  {return fRight; };" << std::endl << std::endl;
   fout << "   // test event if it decends the tree at this node to the left " << std::endl;
   fout << "   virtual bool GoesLeft ( const std::vector<double>& inputValues ) const;" << std::endl;
   fout << "   "<<nodeName<<"* GetLeft( void ) { return fLeft; };   " << std::endl << std::endl;
   fout << "   // return  S/(S+B) (purity) at this node (from  training)" << std::endl << std::endl;
   fout << "   double GetPurity( void ) const { return fPurity; } " << std::endl;
   fout << "   // return the node type" << std::endl;
   fout << "   int    GetNodeType( void ) const { return fNodeType; }" << std::endl;
   fout << "   double GetResponse(void) const {return fResponse;}" << std::endl << std::endl;
   fout << "private:" << std::endl << std::endl;
   fout << "   "<<nodeName<<"*   fLeft;     // pointer to the left daughter node" << std::endl;
   fout << "   "<<nodeName<<"*   fRight;    // pointer to the right daughter node" << std::endl;
   if (fUseFisherCuts){
      fout << "   int                     fNFisherCoeff; // =0 if this node doesn use fisher, else =nvar+1 " << std::endl;
      fout << "   std::vector<double>     fFisherCoeff;  // the fisher coeff (offset at the last element)" << std::endl;
   }
   fout << "   int                     fSelector; // index of variable used in node selection (decision tree)   " << std::endl;
   fout << "   double                  fCutValue; // cut value appplied on this node to discriminate bkg against sig" << std::endl;
   fout << "   bool                    fCutType;  // true: if event variable > cutValue ==> signal , false otherwise" << std::endl;
   fout << "   int                     fNodeType; // Type of node: -1 == Bkg-leaf, 1 == Signal-leaf, 0 = internal " << std::endl;
   fout << "   double                  fPurity;   // Purity of node from training"<< std::endl;
   fout << "   double                  fResponse; // Regression response value of node" << std::endl;
   fout << "}; " << std::endl;
   fout << "   " << std::endl;
   fout << "//_______________________________________________________________________" << std::endl;
   fout << "   "<<nodeName<<"::~"<<nodeName<<"()" << std::endl;
   fout << "{" << std::endl;
   fout << "   if (fLeft  != NULL) delete fLeft;" << std::endl;
   fout << "   if (fRight != NULL) delete fRight;" << std::endl;
   fout << "}; " << std::endl;
   fout << "   " << std::endl;
   fout << "//_______________________________________________________________________" << std::endl;
   fout << "bool "<<nodeName<<"::GoesRight( const std::vector<double>& inputValues ) const" << std::endl;
   fout << "{" << std::endl;
   fout << "   // test event if it decends the tree at this node to the right" << std::endl;
   fout << "   bool result;" << std::endl;
   if (fUseFisherCuts){
      fout << "   if (fNFisherCoeff == 0){" << std::endl;
      fout << "     result = (inputValues[fSelector] > fCutValue );" << std::endl;
      fout << "   }else{" << std::endl;
      fout << "     double fisher = fFisherCoeff.at(fFisherCoeff.size()-1);" << std::endl;
      fout << "     for (unsigned int ivar=0; ivar<fFisherCoeff.size()-1; ivar++)" << std::endl;
      fout << "       fisher += fFisherCoeff.at(ivar)*inputValues.at(ivar);" << std::endl;
      fout << "     result = fisher > fCutValue;" << std::endl;
      fout << "   }" << std::endl;
   }else{
      fout << "     result = (inputValues[fSelector] > fCutValue );" << std::endl;
   }
   fout << "   if (fCutType == true) return result; //the cuts are selecting Signal ;" << std::endl;
   fout << "   else return !result;" << std::endl;
   fout << "}" << std::endl;
   fout << "   " << std::endl;
   fout << "//_______________________________________________________________________" << std::endl;
   fout << "bool "<<nodeName<<"::GoesLeft( const std::vector<double>& inputValues ) const" << std::endl;
   fout << "{" << std::endl;
   fout << "   // test event if it decends the tree at this node to the left" << std::endl;
   fout << "   if (!this->GoesRight(inputValues)) return true;" << std::endl;
   fout << "   else return false;" << std::endl;
   fout << "}" << std::endl;
   fout << "   " << std::endl;
   fout << "#endif" << std::endl;
   fout << "   " << std::endl;
}

//_______________________________________________________________________
void TMVA::MethodBDT::MakeClassInstantiateNode( DecisionTreeNode *n, std::ostream& fout, const TString& className ) const
{
   // recursively descends a tree and writes the node instance to the output stream
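   // The emitted text is a nested constructor expression, schematically (for a
   // depth-1 tree without Fisher cuts; the values are placeholders):
   //    NN(
   //    NN( 0, 0, 1, -0.5, 1, -1, 0.1, 0.0 ) ,
   //    NN( 0, 0, 1, -0.5, 1,  1, 0.9, 0.0 ) ,
   //    0, 1.234, 1, 0, 0.5, 0.0 )
   // with arguments in the order of the node constructor written by
   // MakeClassSpecificHeader: left, right, [nFisherCoeff, fisherCoeff0..N,]
   // selector, cutValue, cutType, nodeType, purity, response.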
   if (n == NULL) {
      Log() << kFATAL << "MakeClassInstantiateNode: started with undefined node" <<Endl;
      return ;
   }
   fout << "NN("<<std::endl;
   if (n->GetLeft() != NULL){
      this->MakeClassInstantiateNode( (DecisionTreeNode*)n->GetLeft() , fout, className);
   }
   else {
      fout << "0";
   }
   fout << ", " <<std::endl;
   if (n->GetRight() != NULL){
      this->MakeClassInstantiateNode( (DecisionTreeNode*)n->GetRight(), fout, className );
   }
   else {
      fout << "0";
   }
   fout << ", " <<  std::endl
        << std::setprecision(6);
   if (fUseFisherCuts){
      fout << n->GetNFisherCoeff() << ", ";
      for (UInt_t i=0; i< GetNVariables()+1; i++) {
         if (n->GetNFisherCoeff() == 0 ){
            fout <<  "0, ";
         }else{
            fout << n->GetFisherCoeff(i) << ", ";
         }
      }
   }
   fout << n->GetSelector() << ", "
        << n->GetCutValue() << ", "
        << n->GetCutType() << ", "
        << n->GetNodeType() << ", "
        << n->GetPurity() << ","
        << n->GetResponse() << ") ";
}

//_______________________________________________________________________
void TMVA::MethodBDT::DeterminePreselectionCuts(const std::vector<const TMVA::Event*>& eventSample)
{
   // find useful preselection cuts that will be applied before
   // the decision tree training (and of course also applied
   // in the GetMVA value: -1 for background, +1 for signal)
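   // Schematically, four kinds of one-sided cuts are searched for per variable x
   // (a cut is only kept if it affects more than 5% of the respective class, see
   //  the effS/effB/rejS/rejB thresholds below):
   //    fLowBkgCut  : all training events with x < cut are background -> classify as bkg (-1)
   //    fLowSigCut  : all training events with x < cut are signal     -> classify as sig (+1)
   //    fHighBkgCut : all training events with x > cut are background -> classify as bkg (-1)
   //    fHighSigCut : all training events with x > cut are signal     -> classify as sig (+1)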
   Double_t nTotS = 0.0, nTotB = 0.0;
   Int_t nTotS_unWeighted = 0, nTotB_unWeighted = 0;  
  
   std::vector<TMVA::BDTEventWrapper> bdtEventSample;

   fIsLowSigCut.assign(GetNvar(),kFALSE); 
   fIsLowBkgCut.assign(GetNvar(),kFALSE); 
   fIsHighSigCut.assign(GetNvar(),kFALSE);
   fIsHighBkgCut.assign(GetNvar(),kFALSE);

   fLowSigCut.assign(GetNvar(),0.);   //  ---------------| -->  in var is signal (accept all above lower cut)
   fLowBkgCut.assign(GetNvar(),0.);   //  ---------------| -->  in var is bkg    (accept all above lower cut)
   fHighSigCut.assign(GetNvar(),0.);  //  <-- | --------------  in var is signal (accept all below cut)
   fHighBkgCut.assign(GetNvar(),0.);  //  <-- | --------------  in var is bkg    (accept all below cut)
   
  
   // Initialize (un)weighted counters for signal & background
   // Construct a list of event wrappers that point to the original data
   for( std::vector<const TMVA::Event*>::const_iterator it = eventSample.begin(); it != eventSample.end(); ++it ) {
      if (DataInfo().IsSignal(*it)){ 
         nTotS += (*it)->GetWeight();
         ++nTotS_unWeighted;
      }
      else {
         nTotB += (*it)->GetWeight();
         ++nTotB_unWeighted;
      }
      bdtEventSample.push_back(TMVA::BDTEventWrapper(*it));
   }
  
   for( UInt_t ivar = 0; ivar < GetNvar(); ivar++ ) { // loop over all discriminating variables
      TMVA::BDTEventWrapper::SetVarIndex(ivar); // select the variable to sort by
      std::sort( bdtEventSample.begin(),bdtEventSample.end() ); // sort the event data 
    
      Double_t bkgWeightCtr = 0.0, sigWeightCtr = 0.0;
      std::vector<TMVA::BDTEventWrapper>::iterator it = bdtEventSample.begin(), it_end = bdtEventSample.end();
      for( ; it != it_end; ++it ) {
         if (DataInfo().IsSignal(**it))
            sigWeightCtr += (**it)->GetWeight();
         else 
            bkgWeightCtr += (**it)->GetWeight(); 
         // Store the accumulated signal (background) weights
         it->SetCumulativeWeight(false,bkgWeightCtr); 
         it->SetCumulativeWeight(true,sigWeightCtr);
      }
      
      // variable that determines how "exactly" you cut on the preselection found in the training data. Here I chose
      // 1% of the variable range...
      Double_t dVal = (DataInfo().GetVariableInfo(ivar).GetMax() - DataInfo().GetVariableInfo(ivar).GetMin())/100. ;
      Double_t nSelS, nSelB, effS=0.05, effB=0.05, rejS=0.05, rejB=0.05;
      Double_t tmpEffS, tmpEffB, tmpRejS, tmpRejB;
      // Locate the optimal cut for this (ivar-th) variable
      
      
      
      for(UInt_t iev = 1; iev < bdtEventSample.size(); iev++) {
         //dVal = bdtEventSample[iev].GetVal() - bdtEventSample[iev-1].GetVal();
         
         nSelS = bdtEventSample[iev].GetCumulativeWeight(true);
         nSelB = bdtEventSample[iev].GetCumulativeWeight(false);
         // look for a 100% efficient pre-selection cut to remove background, i.e. nSelS==0 && nSelB > 5% of nTotB, or (nSelB==0 && nSelS > 5% of nTotS)
         tmpEffS=nSelS/nTotS;
         tmpEffB=nSelB/nTotB;
         tmpRejS=1-tmpEffS;
         tmpRejB=1-tmpEffB;
         if      (nSelS==0     && tmpEffB>effB)  {effB=tmpEffB; fLowBkgCut[ivar]  = bdtEventSample[iev].GetVal() - dVal; fIsLowBkgCut[ivar]=kTRUE;}
         else if (nSelB==0     && tmpEffS>effS)  {effS=tmpEffS; fLowSigCut[ivar]  = bdtEventSample[iev].GetVal() - dVal; fIsLowSigCut[ivar]=kTRUE;}
         else if (nSelB==nTotB && tmpRejS>rejS)  {rejS=tmpRejS; fHighSigCut[ivar] = bdtEventSample[iev].GetVal() + dVal; fIsHighSigCut[ivar]=kTRUE;}
         else if (nSelS==nTotS && tmpRejB>rejB)  {rejB=tmpRejB; fHighBkgCut[ivar] = bdtEventSample[iev].GetVal() + dVal; fIsHighBkgCut[ivar]=kTRUE;}
         
      }
   }

   Log() << kINFO << " found and suggest the following possible pre-selection cuts " << Endl;
   if (fDoPreselection) Log() << kINFO << "the training will be done after these cuts... and GetMVA value returns +1, (-1) for a signal (bkg) event that passes these cuts" << Endl;
   else  Log() << kINFO << "as option DoPreselection was not used, these cuts however will not be performed, but the training will see the full sample"<<Endl;
   for (UInt_t ivar=0; ivar < GetNvar(); ivar++ ) { // loop over all discriminating variables
      if (fIsLowBkgCut[ivar]){
         Log() << kINFO  << " found cut: Bkg if var " << ivar << " < "  << fLowBkgCut[ivar] << Endl;
      } 
      if (fIsLowSigCut[ivar]){
         Log() << kINFO  << " found cut: Sig if var " << ivar << " < "  << fLowSigCut[ivar] << Endl;
      }
      if (fIsHighBkgCut[ivar]){
         Log() << kINFO  << " found cut: Bkg if var " << ivar << " > "  << fHighBkgCut[ivar] << Endl;
      } 
      if (fIsHighSigCut[ivar]){
         Log() << kINFO  << " found cut: Sig if var " << ivar << " > "  << fHighSigCut[ivar] << Endl;
      }
   }
   
   return;
}

//_______________________________________________________________________
Double_t TMVA::MethodBDT::ApplyPreselectionCuts(const Event* ev)
{
   // apply the preselection cuts before even bothering about any
   // decision trees in the GetMVA value: -1 for background, +1 for signal
   
   Double_t result=0;

   for (UInt_t ivar=0; ivar < GetNvar(); ivar++ ) { // loop over all discriminating variables
      if (fIsLowBkgCut[ivar]){
         if (ev->GetValue(ivar) < fLowBkgCut[ivar]) result = -1;  // is background
      } 
      if (fIsLowSigCut[ivar]){
         if (ev->GetValue(ivar) < fLowSigCut[ivar]) result =  1;  // is signal
      } 
      if (fIsHighBkgCut[ivar]){
         if (ev->GetValue(ivar) > fHighBkgCut[ivar]) result = -1;  // is background
      } 
      if (fIsHighSigCut[ivar]){
         if (ev->GetValue(ivar) > fHighSigCut[ivar]) result =  1;  // is signal
      }
   }
   
   return result;
}

 MethodBDT.cxx:2023
 MethodBDT.cxx:2024
 MethodBDT.cxx:2025
 MethodBDT.cxx:2026
 MethodBDT.cxx:2027
 MethodBDT.cxx:2028
 MethodBDT.cxx:2029
 MethodBDT.cxx:2030
 MethodBDT.cxx:2031
 MethodBDT.cxx:2032
 MethodBDT.cxx:2033
 MethodBDT.cxx:2034
 MethodBDT.cxx:2035
 MethodBDT.cxx:2036
 MethodBDT.cxx:2037
 MethodBDT.cxx:2038
 MethodBDT.cxx:2039
 MethodBDT.cxx:2040
 MethodBDT.cxx:2041
 MethodBDT.cxx:2042
 MethodBDT.cxx:2043
 MethodBDT.cxx:2044
 MethodBDT.cxx:2045
 MethodBDT.cxx:2046
 MethodBDT.cxx:2047
 MethodBDT.cxx:2048
 MethodBDT.cxx:2049
 MethodBDT.cxx:2050
 MethodBDT.cxx:2051
 MethodBDT.cxx:2052
 MethodBDT.cxx:2053
 MethodBDT.cxx:2054
 MethodBDT.cxx:2055
 MethodBDT.cxx:2056
 MethodBDT.cxx:2057
 MethodBDT.cxx:2058
 MethodBDT.cxx:2059
 MethodBDT.cxx:2060
 MethodBDT.cxx:2061
 MethodBDT.cxx:2062
 MethodBDT.cxx:2063
 MethodBDT.cxx:2064
 MethodBDT.cxx:2065
 MethodBDT.cxx:2066
 MethodBDT.cxx:2067
 MethodBDT.cxx:2068
 MethodBDT.cxx:2069
 MethodBDT.cxx:2070
 MethodBDT.cxx:2071
 MethodBDT.cxx:2072
 MethodBDT.cxx:2073
 MethodBDT.cxx:2074
 MethodBDT.cxx:2075
 MethodBDT.cxx:2076
 MethodBDT.cxx:2077
 MethodBDT.cxx:2078
 MethodBDT.cxx:2079
 MethodBDT.cxx:2080
 MethodBDT.cxx:2081
 MethodBDT.cxx:2082
 MethodBDT.cxx:2083
 MethodBDT.cxx:2084
 MethodBDT.cxx:2085
 MethodBDT.cxx:2086
 MethodBDT.cxx:2087
 MethodBDT.cxx:2088
 MethodBDT.cxx:2089
 MethodBDT.cxx:2090
 MethodBDT.cxx:2091
 MethodBDT.cxx:2092
 MethodBDT.cxx:2093
 MethodBDT.cxx:2094
 MethodBDT.cxx:2095
 MethodBDT.cxx:2096
 MethodBDT.cxx:2097
 MethodBDT.cxx:2098
 MethodBDT.cxx:2099
 MethodBDT.cxx:2100
 MethodBDT.cxx:2101
 MethodBDT.cxx:2102
 MethodBDT.cxx:2103
 MethodBDT.cxx:2104
 MethodBDT.cxx:2105
 MethodBDT.cxx:2106
 MethodBDT.cxx:2107
 MethodBDT.cxx:2108
 MethodBDT.cxx:2109
 MethodBDT.cxx:2110
 MethodBDT.cxx:2111
 MethodBDT.cxx:2112
 MethodBDT.cxx:2113
 MethodBDT.cxx:2114
 MethodBDT.cxx:2115
 MethodBDT.cxx:2116
 MethodBDT.cxx:2117
 MethodBDT.cxx:2118
 MethodBDT.cxx:2119
 MethodBDT.cxx:2120
 MethodBDT.cxx:2121
 MethodBDT.cxx:2122
 MethodBDT.cxx:2123
 MethodBDT.cxx:2124
 MethodBDT.cxx:2125
 MethodBDT.cxx:2126
 MethodBDT.cxx:2127
 MethodBDT.cxx:2128
 MethodBDT.cxx:2129
 MethodBDT.cxx:2130
 MethodBDT.cxx:2131
 MethodBDT.cxx:2132
 MethodBDT.cxx:2133
 MethodBDT.cxx:2134
 MethodBDT.cxx:2135
 MethodBDT.cxx:2136
 MethodBDT.cxx:2137
 MethodBDT.cxx:2138
 MethodBDT.cxx:2139
 MethodBDT.cxx:2140
 MethodBDT.cxx:2141
 MethodBDT.cxx:2142
 MethodBDT.cxx:2143
 MethodBDT.cxx:2144
 MethodBDT.cxx:2145
 MethodBDT.cxx:2146
 MethodBDT.cxx:2147
 MethodBDT.cxx:2148
 MethodBDT.cxx:2149
 MethodBDT.cxx:2150
 MethodBDT.cxx:2151
 MethodBDT.cxx:2152
 MethodBDT.cxx:2153
 MethodBDT.cxx:2154
 MethodBDT.cxx:2155
 MethodBDT.cxx:2156
 MethodBDT.cxx:2157
 MethodBDT.cxx:2158
 MethodBDT.cxx:2159
 MethodBDT.cxx:2160
 MethodBDT.cxx:2161
 MethodBDT.cxx:2162
 MethodBDT.cxx:2163
 MethodBDT.cxx:2164
 MethodBDT.cxx:2165
 MethodBDT.cxx:2166
 MethodBDT.cxx:2167
 MethodBDT.cxx:2168
 MethodBDT.cxx:2169
 MethodBDT.cxx:2170
 MethodBDT.cxx:2171
 MethodBDT.cxx:2172
 MethodBDT.cxx:2173
 MethodBDT.cxx:2174
 MethodBDT.cxx:2175
 MethodBDT.cxx:2176
 MethodBDT.cxx:2177
 MethodBDT.cxx:2178
 MethodBDT.cxx:2179
 MethodBDT.cxx:2180
 MethodBDT.cxx:2181
 MethodBDT.cxx:2182
 MethodBDT.cxx:2183
 MethodBDT.cxx:2184
 MethodBDT.cxx:2185
 MethodBDT.cxx:2186
 MethodBDT.cxx:2187
 MethodBDT.cxx:2188
 MethodBDT.cxx:2189
 MethodBDT.cxx:2190
 MethodBDT.cxx:2191
 MethodBDT.cxx:2192
 MethodBDT.cxx:2193
 MethodBDT.cxx:2194
 MethodBDT.cxx:2195
 MethodBDT.cxx:2196
 MethodBDT.cxx:2197
 MethodBDT.cxx:2198
 MethodBDT.cxx:2199
 MethodBDT.cxx:2200
 MethodBDT.cxx:2201
 MethodBDT.cxx:2202
 MethodBDT.cxx:2203
 MethodBDT.cxx:2204
 MethodBDT.cxx:2205
 MethodBDT.cxx:2206
 MethodBDT.cxx:2207
 MethodBDT.cxx:2208
 MethodBDT.cxx:2209
 MethodBDT.cxx:2210
 MethodBDT.cxx:2211
 MethodBDT.cxx:2212
 MethodBDT.cxx:2213
 MethodBDT.cxx:2214
 MethodBDT.cxx:2215
 MethodBDT.cxx:2216
 MethodBDT.cxx:2217
 MethodBDT.cxx:2218
 MethodBDT.cxx:2219
 MethodBDT.cxx:2220
 MethodBDT.cxx:2221
 MethodBDT.cxx:2222
 MethodBDT.cxx:2223
 MethodBDT.cxx:2224
 MethodBDT.cxx:2225
 MethodBDT.cxx:2226
 MethodBDT.cxx:2227
 MethodBDT.cxx:2228
 MethodBDT.cxx:2229
 MethodBDT.cxx:2230
 MethodBDT.cxx:2231
 MethodBDT.cxx:2232
 MethodBDT.cxx:2233
 MethodBDT.cxx:2234
 MethodBDT.cxx:2235
 MethodBDT.cxx:2236
 MethodBDT.cxx:2237
 MethodBDT.cxx:2238
 MethodBDT.cxx:2239
 MethodBDT.cxx:2240
 MethodBDT.cxx:2241
 MethodBDT.cxx:2242
 MethodBDT.cxx:2243
 MethodBDT.cxx:2244
 MethodBDT.cxx:2245
 MethodBDT.cxx:2246
 MethodBDT.cxx:2247
 MethodBDT.cxx:2248
 MethodBDT.cxx:2249
 MethodBDT.cxx:2250
 MethodBDT.cxx:2251
 MethodBDT.cxx:2252
 MethodBDT.cxx:2253
 MethodBDT.cxx:2254
 MethodBDT.cxx:2255
 MethodBDT.cxx:2256
 MethodBDT.cxx:2257
 MethodBDT.cxx:2258
 MethodBDT.cxx:2259
 MethodBDT.cxx:2260
 MethodBDT.cxx:2261
 MethodBDT.cxx:2262
 MethodBDT.cxx:2263
 MethodBDT.cxx:2264
 MethodBDT.cxx:2265
 MethodBDT.cxx:2266
 MethodBDT.cxx:2267
 MethodBDT.cxx:2268
 MethodBDT.cxx:2269
 MethodBDT.cxx:2270
 MethodBDT.cxx:2271
 MethodBDT.cxx:2272
 MethodBDT.cxx:2273
 MethodBDT.cxx:2274
 MethodBDT.cxx:2275
 MethodBDT.cxx:2276
 MethodBDT.cxx:2277
 MethodBDT.cxx:2278
 MethodBDT.cxx:2279
 MethodBDT.cxx:2280
 MethodBDT.cxx:2281
 MethodBDT.cxx:2282
 MethodBDT.cxx:2283
 MethodBDT.cxx:2284
 MethodBDT.cxx:2285
 MethodBDT.cxx:2286
 MethodBDT.cxx:2287
 MethodBDT.cxx:2288
 MethodBDT.cxx:2289
 MethodBDT.cxx:2290
 MethodBDT.cxx:2291
 MethodBDT.cxx:2292
 MethodBDT.cxx:2293
 MethodBDT.cxx:2294
 MethodBDT.cxx:2295
 MethodBDT.cxx:2296
 MethodBDT.cxx:2297
 MethodBDT.cxx:2298
 MethodBDT.cxx:2299
 MethodBDT.cxx:2300
 MethodBDT.cxx:2301
 MethodBDT.cxx:2302
 MethodBDT.cxx:2303
 MethodBDT.cxx:2304
 MethodBDT.cxx:2305
 MethodBDT.cxx:2306
 MethodBDT.cxx:2307
 MethodBDT.cxx:2308
 MethodBDT.cxx:2309
 MethodBDT.cxx:2310
 MethodBDT.cxx:2311
 MethodBDT.cxx:2312
 MethodBDT.cxx:2313
 MethodBDT.cxx:2314
 MethodBDT.cxx:2315
 MethodBDT.cxx:2316
 MethodBDT.cxx:2317
 MethodBDT.cxx:2318
 MethodBDT.cxx:2319
 MethodBDT.cxx:2320
 MethodBDT.cxx:2321
 MethodBDT.cxx:2322
 MethodBDT.cxx:2323
 MethodBDT.cxx:2324
 MethodBDT.cxx:2325
 MethodBDT.cxx:2326
 MethodBDT.cxx:2327
 MethodBDT.cxx:2328
 MethodBDT.cxx:2329
 MethodBDT.cxx:2330
 MethodBDT.cxx:2331
 MethodBDT.cxx:2332
 MethodBDT.cxx:2333
 MethodBDT.cxx:2334
 MethodBDT.cxx:2335
 MethodBDT.cxx:2336
 MethodBDT.cxx:2337
 MethodBDT.cxx:2338
 MethodBDT.cxx:2339
 MethodBDT.cxx:2340
 MethodBDT.cxx:2341
 MethodBDT.cxx:2342
 MethodBDT.cxx:2343
 MethodBDT.cxx:2344
 MethodBDT.cxx:2345
 MethodBDT.cxx:2346
 MethodBDT.cxx:2347
 MethodBDT.cxx:2348
 MethodBDT.cxx:2349
 MethodBDT.cxx:2350
 MethodBDT.cxx:2351
 MethodBDT.cxx:2352
 MethodBDT.cxx:2353
 MethodBDT.cxx:2354
 MethodBDT.cxx:2355
 MethodBDT.cxx:2356
 MethodBDT.cxx:2357
 MethodBDT.cxx:2358
 MethodBDT.cxx:2359
 MethodBDT.cxx:2360
 MethodBDT.cxx:2361
 MethodBDT.cxx:2362
 MethodBDT.cxx:2363
 MethodBDT.cxx:2364
 MethodBDT.cxx:2365
 MethodBDT.cxx:2366
 MethodBDT.cxx:2367
 MethodBDT.cxx:2368
 MethodBDT.cxx:2369
 MethodBDT.cxx:2370
 MethodBDT.cxx:2371
 MethodBDT.cxx:2372
 MethodBDT.cxx:2373
 MethodBDT.cxx:2374
 MethodBDT.cxx:2375
 MethodBDT.cxx:2376
 MethodBDT.cxx:2377
 MethodBDT.cxx:2378
 MethodBDT.cxx:2379
 MethodBDT.cxx:2380
 MethodBDT.cxx:2381
 MethodBDT.cxx:2382
 MethodBDT.cxx:2383
 MethodBDT.cxx:2384
 MethodBDT.cxx:2385
 MethodBDT.cxx:2386
 MethodBDT.cxx:2387
 MethodBDT.cxx:2388
 MethodBDT.cxx:2389
 MethodBDT.cxx:2390
 MethodBDT.cxx:2391
 MethodBDT.cxx:2392
 MethodBDT.cxx:2393
 MethodBDT.cxx:2394
 MethodBDT.cxx:2395
 MethodBDT.cxx:2396
 MethodBDT.cxx:2397
 MethodBDT.cxx:2398
 MethodBDT.cxx:2399
 MethodBDT.cxx:2400
 MethodBDT.cxx:2401
 MethodBDT.cxx:2402
 MethodBDT.cxx:2403
 MethodBDT.cxx:2404
 MethodBDT.cxx:2405
 MethodBDT.cxx:2406
 MethodBDT.cxx:2407
 MethodBDT.cxx:2408
 MethodBDT.cxx:2409
 MethodBDT.cxx:2410
 MethodBDT.cxx:2411
 MethodBDT.cxx:2412
 MethodBDT.cxx:2413
 MethodBDT.cxx:2414
 MethodBDT.cxx:2415
 MethodBDT.cxx:2416
 MethodBDT.cxx:2417
 MethodBDT.cxx:2418
 MethodBDT.cxx:2419
 MethodBDT.cxx:2420
 MethodBDT.cxx:2421
 MethodBDT.cxx:2422
 MethodBDT.cxx:2423
 MethodBDT.cxx:2424
 MethodBDT.cxx:2425
 MethodBDT.cxx:2426
 MethodBDT.cxx:2427
 MethodBDT.cxx:2428
 MethodBDT.cxx:2429
 MethodBDT.cxx:2430
 MethodBDT.cxx:2431
 MethodBDT.cxx:2432
 MethodBDT.cxx:2433
 MethodBDT.cxx:2434
 MethodBDT.cxx:2435
 MethodBDT.cxx:2436
 MethodBDT.cxx:2437
 MethodBDT.cxx:2438
 MethodBDT.cxx:2439
 MethodBDT.cxx:2440
 MethodBDT.cxx:2441
 MethodBDT.cxx:2442
 MethodBDT.cxx:2443
 MethodBDT.cxx:2444
 MethodBDT.cxx:2445
 MethodBDT.cxx:2446
 MethodBDT.cxx:2447
 MethodBDT.cxx:2448
 MethodBDT.cxx:2449
 MethodBDT.cxx:2450
 MethodBDT.cxx:2451
 MethodBDT.cxx:2452
 MethodBDT.cxx:2453
 MethodBDT.cxx:2454
 MethodBDT.cxx:2455
 MethodBDT.cxx:2456
 MethodBDT.cxx:2457
 MethodBDT.cxx:2458
 MethodBDT.cxx:2459
 MethodBDT.cxx:2460
 MethodBDT.cxx:2461
 MethodBDT.cxx:2462
 MethodBDT.cxx:2463
 MethodBDT.cxx:2464
 MethodBDT.cxx:2465
 MethodBDT.cxx:2466
 MethodBDT.cxx:2467
 MethodBDT.cxx:2468
 MethodBDT.cxx:2469
 MethodBDT.cxx:2470
 MethodBDT.cxx:2471
 MethodBDT.cxx:2472
 MethodBDT.cxx:2473
 MethodBDT.cxx:2474
 MethodBDT.cxx:2475
 MethodBDT.cxx:2476
 MethodBDT.cxx:2477
 MethodBDT.cxx:2478
 MethodBDT.cxx:2479
 MethodBDT.cxx:2480
 MethodBDT.cxx:2481
 MethodBDT.cxx:2482
 MethodBDT.cxx:2483
 MethodBDT.cxx:2484
 MethodBDT.cxx:2485
 MethodBDT.cxx:2486
 MethodBDT.cxx:2487
 MethodBDT.cxx:2488
 MethodBDT.cxx:2489
 MethodBDT.cxx:2490
 MethodBDT.cxx:2491
 MethodBDT.cxx:2492
 MethodBDT.cxx:2493
 MethodBDT.cxx:2494
 MethodBDT.cxx:2495
 MethodBDT.cxx:2496
 MethodBDT.cxx:2497
 MethodBDT.cxx:2498
 MethodBDT.cxx:2499
 MethodBDT.cxx:2500
 MethodBDT.cxx:2501
 MethodBDT.cxx:2502
 MethodBDT.cxx:2503
 MethodBDT.cxx:2504
 MethodBDT.cxx:2505
 MethodBDT.cxx:2506
 MethodBDT.cxx:2507
 MethodBDT.cxx:2508
 MethodBDT.cxx:2509
 MethodBDT.cxx:2510
 MethodBDT.cxx:2511
 MethodBDT.cxx:2512
 MethodBDT.cxx:2513
 MethodBDT.cxx:2514
 MethodBDT.cxx:2515
 MethodBDT.cxx:2516
 MethodBDT.cxx:2517
 MethodBDT.cxx:2518
 MethodBDT.cxx:2519
 MethodBDT.cxx:2520
 MethodBDT.cxx:2521
 MethodBDT.cxx:2522
 MethodBDT.cxx:2523
 MethodBDT.cxx:2524
 MethodBDT.cxx:2525
 MethodBDT.cxx:2526
 MethodBDT.cxx:2527
 MethodBDT.cxx:2528
 MethodBDT.cxx:2529
 MethodBDT.cxx:2530
 MethodBDT.cxx:2531
 MethodBDT.cxx:2532
 MethodBDT.cxx:2533
 MethodBDT.cxx:2534
 MethodBDT.cxx:2535
 MethodBDT.cxx:2536
 MethodBDT.cxx:2537
 MethodBDT.cxx:2538
 MethodBDT.cxx:2539
 MethodBDT.cxx:2540
 MethodBDT.cxx:2541
 MethodBDT.cxx:2542
 MethodBDT.cxx:2543
 MethodBDT.cxx:2544
 MethodBDT.cxx:2545
 MethodBDT.cxx:2546
 MethodBDT.cxx:2547
 MethodBDT.cxx:2548
 MethodBDT.cxx:2549
 MethodBDT.cxx:2550
 MethodBDT.cxx:2551
 MethodBDT.cxx:2552
 MethodBDT.cxx:2553
 MethodBDT.cxx:2554
 MethodBDT.cxx:2555
 MethodBDT.cxx:2556
 MethodBDT.cxx:2557
 MethodBDT.cxx:2558
 MethodBDT.cxx:2559
 MethodBDT.cxx:2560
 MethodBDT.cxx:2561
 MethodBDT.cxx:2562
 MethodBDT.cxx:2563
 MethodBDT.cxx:2564
 MethodBDT.cxx:2565
 MethodBDT.cxx:2566
 MethodBDT.cxx:2567
 MethodBDT.cxx:2568
 MethodBDT.cxx:2569
 MethodBDT.cxx:2570
 MethodBDT.cxx:2571
 MethodBDT.cxx:2572
 MethodBDT.cxx:2573
 MethodBDT.cxx:2574
 MethodBDT.cxx:2575
 MethodBDT.cxx:2576
 MethodBDT.cxx:2577
 MethodBDT.cxx:2578
 MethodBDT.cxx:2579
 MethodBDT.cxx:2580
 MethodBDT.cxx:2581
 MethodBDT.cxx:2582
 MethodBDT.cxx:2583
 MethodBDT.cxx:2584
 MethodBDT.cxx:2585
 MethodBDT.cxx:2586
 MethodBDT.cxx:2587
 MethodBDT.cxx:2588
 MethodBDT.cxx:2589
 MethodBDT.cxx:2590
 MethodBDT.cxx:2591
 MethodBDT.cxx:2592
 MethodBDT.cxx:2593
 MethodBDT.cxx:2594
 MethodBDT.cxx:2595
 MethodBDT.cxx:2596
 MethodBDT.cxx:2597
 MethodBDT.cxx:2598
 MethodBDT.cxx:2599
 MethodBDT.cxx:2600
 MethodBDT.cxx:2601
 MethodBDT.cxx:2602
 MethodBDT.cxx:2603
 MethodBDT.cxx:2604
 MethodBDT.cxx:2605
 MethodBDT.cxx:2606
 MethodBDT.cxx:2607
 MethodBDT.cxx:2608
 MethodBDT.cxx:2609
 MethodBDT.cxx:2610
 MethodBDT.cxx:2611
 MethodBDT.cxx:2612
 MethodBDT.cxx:2613
 MethodBDT.cxx:2614
 MethodBDT.cxx:2615
 MethodBDT.cxx:2616
 MethodBDT.cxx:2617
 MethodBDT.cxx:2618
 MethodBDT.cxx:2619
 MethodBDT.cxx:2620
 MethodBDT.cxx:2621
 MethodBDT.cxx:2622
 MethodBDT.cxx:2623
 MethodBDT.cxx:2624
 MethodBDT.cxx:2625
 MethodBDT.cxx:2626
 MethodBDT.cxx:2627
 MethodBDT.cxx:2628
 MethodBDT.cxx:2629
 MethodBDT.cxx:2630
 MethodBDT.cxx:2631
 MethodBDT.cxx:2632
 MethodBDT.cxx:2633
 MethodBDT.cxx:2634
 MethodBDT.cxx:2635
 MethodBDT.cxx:2636
 MethodBDT.cxx:2637
 MethodBDT.cxx:2638
 MethodBDT.cxx:2639
 MethodBDT.cxx:2640
 MethodBDT.cxx:2641
 MethodBDT.cxx:2642
 MethodBDT.cxx:2643
 MethodBDT.cxx:2644
 MethodBDT.cxx:2645
 MethodBDT.cxx:2646
 MethodBDT.cxx:2647
 MethodBDT.cxx:2648
 MethodBDT.cxx:2649
 MethodBDT.cxx:2650
 MethodBDT.cxx:2651
 MethodBDT.cxx:2652
 MethodBDT.cxx:2653
 MethodBDT.cxx:2654
 MethodBDT.cxx:2655
 MethodBDT.cxx:2656
 MethodBDT.cxx:2657
 MethodBDT.cxx:2658
 MethodBDT.cxx:2659
 MethodBDT.cxx:2660
 MethodBDT.cxx:2661
 MethodBDT.cxx:2662
 MethodBDT.cxx:2663
 MethodBDT.cxx:2664
 MethodBDT.cxx:2665
 MethodBDT.cxx:2666
 MethodBDT.cxx:2667
 MethodBDT.cxx:2668
 MethodBDT.cxx:2669
 MethodBDT.cxx:2670
 MethodBDT.cxx:2671
 MethodBDT.cxx:2672
 MethodBDT.cxx:2673
 MethodBDT.cxx:2674
 MethodBDT.cxx:2675
 MethodBDT.cxx:2676
 MethodBDT.cxx:2677
 MethodBDT.cxx:2678
 MethodBDT.cxx:2679
 MethodBDT.cxx:2680
 MethodBDT.cxx:2681
 MethodBDT.cxx:2682
 MethodBDT.cxx:2683
 MethodBDT.cxx:2684
 MethodBDT.cxx:2685
 MethodBDT.cxx:2686
 MethodBDT.cxx:2687
 MethodBDT.cxx:2688
 MethodBDT.cxx:2689
 MethodBDT.cxx:2690
 MethodBDT.cxx:2691
 MethodBDT.cxx:2692
 MethodBDT.cxx:2693
 MethodBDT.cxx:2694
 MethodBDT.cxx:2695
 MethodBDT.cxx:2696
 MethodBDT.cxx:2697
 MethodBDT.cxx:2698
 MethodBDT.cxx:2699
 MethodBDT.cxx:2700
 MethodBDT.cxx:2701
 MethodBDT.cxx:2702
 MethodBDT.cxx:2703
 MethodBDT.cxx:2704
 MethodBDT.cxx:2705
 MethodBDT.cxx:2706
 MethodBDT.cxx:2707
 MethodBDT.cxx:2708
 MethodBDT.cxx:2709
 MethodBDT.cxx:2710
 MethodBDT.cxx:2711
 MethodBDT.cxx:2712
 MethodBDT.cxx:2713
 MethodBDT.cxx:2714
 MethodBDT.cxx:2715
 MethodBDT.cxx:2716
 MethodBDT.cxx:2717
 MethodBDT.cxx:2718
 MethodBDT.cxx:2719
 MethodBDT.cxx:2720
 MethodBDT.cxx:2721
 MethodBDT.cxx:2722
 MethodBDT.cxx:2723
 MethodBDT.cxx:2724
 MethodBDT.cxx:2725
 MethodBDT.cxx:2726
 MethodBDT.cxx:2727
 MethodBDT.cxx:2728
 MethodBDT.cxx:2729
 MethodBDT.cxx:2730
 MethodBDT.cxx:2731
 MethodBDT.cxx:2732
 MethodBDT.cxx:2733
 MethodBDT.cxx:2734
 MethodBDT.cxx:2735
 MethodBDT.cxx:2736
 MethodBDT.cxx:2737
 MethodBDT.cxx:2738
 MethodBDT.cxx:2739
 MethodBDT.cxx:2740
 MethodBDT.cxx:2741
 MethodBDT.cxx:2742
 MethodBDT.cxx:2743
 MethodBDT.cxx:2744
 MethodBDT.cxx:2745
 MethodBDT.cxx:2746
 MethodBDT.cxx:2747
 MethodBDT.cxx:2748
 MethodBDT.cxx:2749
 MethodBDT.cxx:2750
 MethodBDT.cxx:2751
 MethodBDT.cxx:2752
 MethodBDT.cxx:2753
 MethodBDT.cxx:2754
 MethodBDT.cxx:2755
 MethodBDT.cxx:2756
 MethodBDT.cxx:2757
 MethodBDT.cxx:2758
 MethodBDT.cxx:2759
 MethodBDT.cxx:2760
 MethodBDT.cxx:2761
 MethodBDT.cxx:2762
 MethodBDT.cxx:2763
 MethodBDT.cxx:2764
 MethodBDT.cxx:2765
 MethodBDT.cxx:2766
 MethodBDT.cxx:2767
 MethodBDT.cxx:2768
 MethodBDT.cxx:2769
 MethodBDT.cxx:2770
 MethodBDT.cxx:2771
 MethodBDT.cxx:2772
 MethodBDT.cxx:2773
 MethodBDT.cxx:2774
 MethodBDT.cxx:2775
 MethodBDT.cxx:2776
 MethodBDT.cxx:2777
 MethodBDT.cxx:2778
 MethodBDT.cxx:2779
 MethodBDT.cxx:2780
 MethodBDT.cxx:2781
 MethodBDT.cxx:2782
 MethodBDT.cxx:2783
 MethodBDT.cxx:2784
 MethodBDT.cxx:2785
 MethodBDT.cxx:2786
 MethodBDT.cxx:2787
 MethodBDT.cxx:2788
 MethodBDT.cxx:2789
 MethodBDT.cxx:2790
 MethodBDT.cxx:2791
 MethodBDT.cxx:2792
 MethodBDT.cxx:2793
 MethodBDT.cxx:2794
 MethodBDT.cxx:2795
 MethodBDT.cxx:2796
 MethodBDT.cxx:2797
 MethodBDT.cxx:2798
 MethodBDT.cxx:2799
 MethodBDT.cxx:2800
 MethodBDT.cxx:2801
 MethodBDT.cxx:2802
 MethodBDT.cxx:2803
 MethodBDT.cxx:2804
 MethodBDT.cxx:2805
 MethodBDT.cxx:2806
 MethodBDT.cxx:2807
 MethodBDT.cxx:2808
 MethodBDT.cxx:2809
 MethodBDT.cxx:2810
 MethodBDT.cxx:2811
 MethodBDT.cxx:2812
 MethodBDT.cxx:2813
 MethodBDT.cxx:2814
 MethodBDT.cxx:2815
 MethodBDT.cxx:2816
 MethodBDT.cxx:2817
 MethodBDT.cxx:2818
 MethodBDT.cxx:2819
 MethodBDT.cxx:2820
 MethodBDT.cxx:2821
 MethodBDT.cxx:2822
 MethodBDT.cxx:2823
 MethodBDT.cxx:2824
 MethodBDT.cxx:2825
 MethodBDT.cxx:2826
 MethodBDT.cxx:2827
 MethodBDT.cxx:2828
 MethodBDT.cxx:2829
 MethodBDT.cxx:2830
 MethodBDT.cxx:2831
 MethodBDT.cxx:2832
 MethodBDT.cxx:2833
 MethodBDT.cxx:2834
 MethodBDT.cxx:2835
 MethodBDT.cxx:2836
 MethodBDT.cxx:2837
 MethodBDT.cxx:2838
 MethodBDT.cxx:2839
 MethodBDT.cxx:2840
 MethodBDT.cxx:2841
 MethodBDT.cxx:2842
 MethodBDT.cxx:2843
 MethodBDT.cxx:2844
 MethodBDT.cxx:2845
 MethodBDT.cxx:2846
 MethodBDT.cxx:2847
 MethodBDT.cxx:2848
 MethodBDT.cxx:2849
 MethodBDT.cxx:2850
 MethodBDT.cxx:2851
 MethodBDT.cxx:2852
 MethodBDT.cxx:2853
 MethodBDT.cxx:2854
 MethodBDT.cxx:2855
 MethodBDT.cxx:2856
 MethodBDT.cxx:2857
 MethodBDT.cxx:2858
 MethodBDT.cxx:2859
 MethodBDT.cxx:2860
 MethodBDT.cxx:2861
 MethodBDT.cxx:2862
 MethodBDT.cxx:2863
 MethodBDT.cxx:2864
 MethodBDT.cxx:2865
 MethodBDT.cxx:2866
 MethodBDT.cxx:2867
 MethodBDT.cxx:2868
 MethodBDT.cxx:2869
 MethodBDT.cxx:2870
 MethodBDT.cxx:2871
 MethodBDT.cxx:2872
 MethodBDT.cxx:2873
 MethodBDT.cxx:2874
 MethodBDT.cxx:2875
 MethodBDT.cxx:2876
 MethodBDT.cxx:2877
 MethodBDT.cxx:2878
 MethodBDT.cxx:2879
 MethodBDT.cxx:2880
 MethodBDT.cxx:2881
 MethodBDT.cxx:2882
 MethodBDT.cxx:2883
 MethodBDT.cxx:2884
 MethodBDT.cxx:2885
 MethodBDT.cxx:2886
 MethodBDT.cxx:2887
 MethodBDT.cxx:2888
 MethodBDT.cxx:2889
 MethodBDT.cxx:2890
 MethodBDT.cxx:2891
 MethodBDT.cxx:2892
 MethodBDT.cxx:2893
 MethodBDT.cxx:2894
 MethodBDT.cxx:2895
 MethodBDT.cxx:2896
 MethodBDT.cxx:2897
 MethodBDT.cxx:2898
 MethodBDT.cxx:2899
 MethodBDT.cxx:2900
 MethodBDT.cxx:2901
 MethodBDT.cxx:2902
 MethodBDT.cxx:2903
 MethodBDT.cxx:2904
 MethodBDT.cxx:2905
 MethodBDT.cxx:2906
 MethodBDT.cxx:2907
 MethodBDT.cxx:2908
 MethodBDT.cxx:2909
 MethodBDT.cxx:2910
 MethodBDT.cxx:2911
 MethodBDT.cxx:2912
 MethodBDT.cxx:2913
 MethodBDT.cxx:2914
 MethodBDT.cxx:2915
 MethodBDT.cxx:2916
 MethodBDT.cxx:2917
 MethodBDT.cxx:2918
 MethodBDT.cxx:2919
 MethodBDT.cxx:2920
 MethodBDT.cxx:2921
 MethodBDT.cxx:2922
 MethodBDT.cxx:2923
 MethodBDT.cxx:2924
 MethodBDT.cxx:2925
 MethodBDT.cxx:2926
 MethodBDT.cxx:2927
 MethodBDT.cxx:2928
 MethodBDT.cxx:2929
 MethodBDT.cxx:2930
 MethodBDT.cxx:2931
 MethodBDT.cxx:2932
 MethodBDT.cxx:2933
 MethodBDT.cxx:2934
 MethodBDT.cxx:2935
 MethodBDT.cxx:2936
 MethodBDT.cxx:2937
 MethodBDT.cxx:2938
 MethodBDT.cxx:2939