ROOT  6.07/01
Reference Guide
MethodDT.cxx
Go to the documentation of this file.
1 // @(#)root/tmva $Id$
2 // Author: Andreas Hoecker, Joerg Stelzer, Helge Voss, Kai Voss
3 
4 /**********************************************************************************
5  * Project: TMVA - a Root-integrated toolkit for multivariate data analysis *
6  * Package: TMVA *
7  * Class : MethodDT (DT = Decision Trees) *
8  * Web : http://tmva.sourceforge.net *
9  * *
10  * Description: *
11  * Analysis of Boosted Decision Trees *
12  * *
13  * Authors (alphabetical): *
14  * Andreas Hoecker <Andreas.Hocker@cern.ch> - CERN, Switzerland *
15  * Helge Voss <Helge.Voss@cern.ch> - MPI-K Heidelberg, Germany *
16  * Or Cohen <orcohenor@gmail.com> - Weizmann Inst., Israel *
17  * *
18  * Copyright (c) 2005: *
19  * CERN, Switzerland *
20  * MPI-K Heidelberg, Germany *
21  * *
22  * Redistribution and use in source and binary forms, with or without *
23  * modification, are permitted according to the terms listed in LICENSE *
24  * (http://tmva.sourceforge.net/LICENSE) *
25  **********************************************************************************/
26 
27 //_______________________________________________________________________
28 //
29 // Analysis of Boosted Decision Trees
30 //
31 // Boosted decision trees have been successfully used in High Energy
32 // Physics analysis for example by the MiniBooNE experiment
33 // (Yang-Roe-Zhu, physics/0508045). In Boosted Decision Trees, the
34 // selection is done on a majority vote on the result of several decision
35 // trees, which are all derived from the same training sample by
36 // supplying different event weights during the training.
37 //
38 // Decision trees:
39 //
40 // Successive decision nodes are used to categorize the
41 // events out of the sample as either signal or background. Each node
42 // uses only a single discriminating variable to decide if the event is
43 // signal-like ("goes right") or background-like ("goes left"). This
44 // forms a tree-like structure with "baskets" at the end (leaf nodes),
45 // and an event is classified as either signal or background according to
46 // whether the basket where it ends up has been classified as signal or
47 // background during the training. Training of a decision tree is the
48 // process of defining the "cut criteria" for each node. The training
49 // starts with the root node. Here one takes the full training event
50 // sample and selects the variable and corresponding cut value that gives
51 // the best separation between signal and background at this stage. Using
52 // this cut criterion, the sample is then divided into two subsamples, a
53 // signal-like (right) and a background-like (left) sample. Two new nodes
54 // are then created for each of the two sub-samples and they are
55 // constructed using the same mechanism as described for the root
56 // node. The division is stopped once a node has reached either a
57 // minimum number of events or a minimum or maximum signal purity. These
58 // leaf nodes are then called "signal" or "background" if they contain
59 // more signal or background events, respectively, from the training sample.
60 //
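// As an illustration of such a cut criterion (GiniIndex is the default
// SeparationType of this method, see DeclareOptions() below): a node with
// signal purity p = S/(S+B) has, up to normalisation, the Gini index
//
//    Gini(p) = p * (1 - p)    // e.g. p = 0.5 gives 0.25 (worst case), p = 0 or 1 gives 0 (pure node)
//
// and the split chosen at a node is the one that most reduces the
// event-weighted sum of the indices of the two daughter nodes.
//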
61 // Boosting:
62 //
63 // The idea behind boosting is that signal events from the training
64 // sample that end up in a background node (and vice versa) are given a
65 // larger weight than events that are in the correct leaf node. This
66 // results in a re-weighted training event sample, with which a new
67 // decision tree can then be developed. The boosting can be applied several
68 // times (typically 100-500 times) and one ends up with a set of decision
69 // trees (a forest).
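//
// As a rough illustration, in the common AdaBoost prescription a tree with
// weighted misclassification rate err re-weights the wrongly classified
// events by
//
//    w  ->  w * exp(alpha),   with   alpha = log( (1 - err) / err )
//
// followed by a renormalisation of the total weight. The boosting of this
// class is steered externally (e.g. by MethodBoost), so the exact
// prescription depends on the boost method chosen there.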
70 //
71 // Bagging:
72 //
73 // In this particular variant of the Boosted Decision Trees, the boosting
74 // is not done on the basis of previous training results, but by a simple
75 // stochastic re-sampling of the initial training event sample.
76 //
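// (Re-sampling is commonly implemented by drawing, for each tree, a bootstrap
// sample of the training events with replacement, or equivalently by assigning
// each event a Poisson-distributed weight with mean one.)
//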
77 // Analysis:
78 //
79 // Applying an individual decision tree to a test event results in a
80 // classification of the event as either signal or background. For the
81 // boosted decision tree selection, an event is successively subjected to
82 // the whole set of decision trees, and depending on how often it is
83 // classified as signal, a "likelihood" estimator is constructed for the
84 // event being signal or background. The value of this estimator is
85 // then used to select the events from an event sample, and
86 // the cut value on this estimator defines the efficiency and purity of
87 // the selection.
88 //
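// Schematically, for a forest of N trees the estimator can be written as
//
//    y(event) = Sum_i w_i * T_i(event) / Sum_i w_i ,
//
// where T_i(event) is 1 (0) if tree i classifies the event as signal
// (background) and w_i is an optional boost weight of tree i (w_i = 1 for a
// plain majority vote, in which case y is simply the fraction of trees that
// vote "signal").
//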
89 //_______________________________________________________________________
90 
91 #include "TMVA/MethodDT.h"
92 
93 #include <algorithm>
94 #include "Riostream.h"
95 #include "TRandom3.h"
96 #include "TMath.h"
97 #include "TObjString.h"
98 
99 #include "TMVA/BinarySearchTree.h"
100 #include "TMVA/CCPruner.h"
101 #include "TMVA/ClassifierFactory.h"
102 #include "TMVA/CrossEntropy.h"
103 #include "TMVA/DataSet.h"
104 #include "TMVA/DecisionTree.h"
105 #include "TMVA/GiniIndex.h"
106 #include "TMVA/MethodBase.h"
107 #include "TMVA/MethodBoost.h"
108 #include "TMVA/MisClassificationError.h"
109 #include "TMVA/MsgLogger.h"
110 #include "TMVA/Ranking.h"
111 #include "TMVA/SdivSqrtSplusB.h"
112 #include "TMVA/SeparationBase.h"
113 #include "TMVA/Timer.h"
114 #include "TMVA/Tools.h"
115 #include "TMVA/Types.h"
116 
117 using std::vector;
118 
119 REGISTER_METHOD(DT)
120 
121 ClassImp(TMVA::MethodDT)
122 
123 ////////////////////////////////////////////////////////////////////////////////
124 /// the standard constructor for just an ordinary "decision tree"
125 
126  TMVA::MethodDT::MethodDT( const TString& jobName,
127  const TString& methodTitle,
128  DataSetInfo& theData,
129  const TString& theOption,
130  TDirectory* theTargetDir ) :
131  TMVA::MethodBase( jobName, Types::kDT, methodTitle, theData, theOption, theTargetDir )
132  , fTree(0)
133  , fSepType(0)
134  , fMinNodeEvents(0)
135  , fMinNodeSize(0)
136  , fNCuts(0)
137  , fUseYesNoLeaf(kFALSE)
138  , fNodePurityLimit(0)
139  , fMaxDepth(0)
140  , fErrorFraction(0)
141  , fPruneStrength(0)
142  , fPruneMethod(DecisionTree::kNoPruning)
143  , fAutomatic(kFALSE)
144  , fRandomisedTrees(kFALSE)
145  , fUseNvars(0)
146  , fUsePoissonNvars(0) // don't use this initialisation, only here to make Coverity happy. Is set in Init()
147  , fDeltaPruneStrength(0)
148 {
149 }
150 
151 ////////////////////////////////////////////////////////////////////////////////
152 /// constructor from Reader
153 
154 TMVA::MethodDT::MethodDT( DataSetInfo& dsi,
155  const TString& theWeightFile,
156  TDirectory* theTargetDir ) :
157  TMVA::MethodBase( Types::kDT, dsi, theWeightFile, theTargetDir )
158  , fTree(0)
159  , fSepType(0)
160  , fMinNodeEvents(0)
161  , fMinNodeSize(0)
162  , fNCuts(0)
163  , fUseYesNoLeaf(kFALSE)
164  , fNodePurityLimit(0)
165  , fMaxDepth(0)
166  , fErrorFraction(0)
167  , fPruneStrength(0)
168  , fPruneMethod(DecisionTree::kNoPruning)
169  , fAutomatic(kFALSE)
170  , fRandomisedTrees(kFALSE)
171  , fUseNvars(0)
172  , fDeltaPruneStrength(0)
173 {
174 }
175 
176 ////////////////////////////////////////////////////////////////////////////////
177 /// DT can handle classification with 2 classes only
178 
179 Bool_t TMVA::MethodDT::HasAnalysisType( Types::EAnalysisType type, UInt_t numberClasses, UInt_t /*numberTargets*/ )
180 {
181  if( type == Types::kClassification && numberClasses == 2 ) return kTRUE;
182  return kFALSE;
183 }
184 
185 
186 ////////////////////////////////////////////////////////////////////////////////
187 /// define the options (their key words) that can be set in the option string
188 /// UseRandomisedTrees  choose at each node splitting a random set of variables
189 /// UseNvars            use UseNvars variables in randomised trees
190 /// SeparationType      the separation criterion applied in the node splitting
191 ///                     known: GiniIndex
192 ///                            MisClassificationError
193 ///                            CrossEntropy
194 ///                            SDivSqrtSPlusB
195 /// nEventsMin:         the minimum number of events in a node (leaf criterion, stop splitting)
196 /// nCuts:              the number of steps in the optimisation of the cut for a node (if < 0, the
197 ///                     step size is determined by the events)
198 /// UseYesNoLeaf        decide if the classification is done simply by the node type, or by the S/B
199 ///                     (from the training) in the leaf node
200 /// NodePurityLimit     the minimum purity to classify a node as a signal node (used in pruning and boosting to determine
201 ///                     the misclassification error rate)
202 /// PruneMethod         the pruning method:
203 ///                     known: NoPruning (switch off pruning completely)
204 ///                            ExpectedError
205 ///                            CostComplexity
206 /// PruneStrength       a parameter to adjust the amount of pruning; should be large enough that overtraining is avoided
207 
208 void TMVA::MethodDT::DeclareOptions()
209 {
210  DeclareOptionRef(fRandomisedTrees,"UseRandomisedTrees","Choose at each node splitting a random set of variables and *bagging*");
211  DeclareOptionRef(fUseNvars,"UseNvars","Number of variables used if randomised Tree option is chosen");
212  DeclareOptionRef(fUsePoissonNvars,"UsePoissonNvars", "Interpret \"UseNvars\" not as a fixed number but as the mean of a Poisson distribution in each split with the RandomisedTree option");
213  DeclareOptionRef(fUseYesNoLeaf=kTRUE, "UseYesNoLeaf",
214  "Use Sig or Bkg node type or the ratio S/B as classification in the leaf node");
215  DeclareOptionRef(fNodePurityLimit=0.5, "NodePurityLimit", "In boosting/pruning, nodes with purity > NodePurityLimit are signal; background otherwise.");
216  DeclareOptionRef(fSepTypeS="GiniIndex", "SeparationType", "Separation criterion for node splitting");
217  AddPreDefVal(TString("MisClassificationError"));
218  AddPreDefVal(TString("GiniIndex"));
219  AddPreDefVal(TString("CrossEntropy"));
220  AddPreDefVal(TString("SDivSqrtSPlusB"));
221  DeclareOptionRef(fMinNodeEvents=-1, "nEventsMin", "deprecated !!! Minimum number of events required in a leaf node");
222  DeclareOptionRef(fMinNodeSizeS, "MinNodeSize", "Minimum percentage of training events required in a leaf node (default: Classification: 10%, Regression: 1%)");
223  DeclareOptionRef(fNCuts, "nCuts", "Number of steps during node cut optimisation");
224  DeclareOptionRef(fPruneStrength, "PruneStrength", "Pruning strength (negative value == automatic adjustment)");
225  DeclareOptionRef(fPruneMethodS="NoPruning", "PruneMethod", "Pruning method: NoPruning (switched off), ExpectedError or CostComplexity");
226 
227  AddPreDefVal(TString("NoPruning"));
228  AddPreDefVal(TString("ExpectedError"));
229  AddPreDefVal(TString("CostComplexity"));
230 
231  if (DoRegression()) {
232  DeclareOptionRef(fMaxDepth=50,"MaxDepth","Max depth of the decision tree allowed");
233  }else{
234  DeclareOptionRef(fMaxDepth=3,"MaxDepth","Max depth of the decision tree allowed");
235  }
236 }
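// A minimal usage sketch (not part of this class): the options declared above are
// passed as a single configuration string when the method is booked. The Factory /
// DataLoader setup is omitted here, "factory" is a placeholder, and the exact
// Factory::BookMethod signature depends on the ROOT version (newer versions take a
// TMVA::DataLoader* as first argument); the option values shown are simply the
// defaults set in DeclareOptions()/Init().
//
//    factory->BookMethod( TMVA::Types::kDT, "DT",
//                         "SeparationType=GiniIndex:nCuts=20:MaxDepth=3:"
//                         "MinNodeSize=5%:PruneMethod=NoPruning" );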
237 
238 void TMVA::MethodDT::DeclareCompatibilityOptions() {
239  // options that are used ONLY for the READER to ensure backward compatibility
240 
241  MethodBase::DeclareCompatibilityOptions();
242 
243  DeclareOptionRef(fPruneBeforeBoost=kFALSE, "PruneBeforeBoost",
244  "--> removed option .. only kept for reader backward compatibility");
245 }
246 
247 ////////////////////////////////////////////////////////////////////////////////
248 /// the option string is decoded, for available options see "DeclareOptions"
249 
250 void TMVA::MethodDT::ProcessOptions()
251 {
252  fSepTypeS.ToLower();
253  if (fSepTypeS == "misclassificationerror") fSepType = new MisClassificationError();
254  else if (fSepTypeS == "giniindex") fSepType = new GiniIndex();
255  else if (fSepTypeS == "crossentropy") fSepType = new CrossEntropy();
256  else if (fSepTypeS == "sdivsqrtsplusb") fSepType = new SdivSqrtSplusB();
257  else {
258  Log() << kINFO << GetOptions() << Endl;
259  Log() << kFATAL << "<ProcessOptions> unknown Separation Index option: " << fSepTypeS << " called" << Endl;
260  }
261 
262  // std::cout << "fSeptypes " << fSepTypeS << " fseptype " << fSepType << std::endl;
263 
264  fPruneMethodS.ToLower();
265  if (fPruneMethodS == "expectederror" ) fPruneMethod = DecisionTree::kExpectedErrorPruning;
266  else if (fPruneMethodS == "costcomplexity" ) fPruneMethod = DecisionTree::kCostComplexityPruning;
267  else if (fPruneMethodS == "nopruning" ) fPruneMethod = DecisionTree::kNoPruning;
268  else {
269  Log() << kINFO << GetOptions() << Endl;
270  Log() << kFATAL << "<ProcessOptions> unknown PruneMethod option:" << fPruneMethodS <<" called" << Endl;
271  }
272 
273  if (fPruneStrength < 0) fAutomatic = kTRUE;
274  else fAutomatic = kFALSE;
275  if (fAutomatic && fPruneMethod == DecisionTree::kExpectedErrorPruning){
276  Log() << kFATAL
277  << "Sorry, automatic pruning strength determination is not implemented yet for ExpectedErrorPruning" << Endl;
278  }
279 
280 
281  if (this->Data()->HasNegativeEventWeights()){
282  Log() << kINFO << " You are using a Monte Carlo sample that also contains negative weights. "
283  << "That should in principle be fine, as long as on average you end up with "
284  << "something positive. For this you have to make sure that the minimal number "
285  << "of (unweighted) events demanded for a tree node (currently you use: MinNodeSize="
286  <<fMinNodeSizeS
287  <<" or the deprecated equivalent nEventsMin; you can set this via the "
288  <<"MethodDT option string when booking the "
289  << "classifier) is large enough to allow for reasonable averaging! "
290  << " If this does not help, you may want to try the option IgnoreNegWeightsInTraining, "
291  << "which ignores events with negative weight in the training. " << Endl
292  << Endl << "Note: You'll get a WARNING message during the training if that should ever happen." << Endl;
293  }
294 
295  if (fRandomisedTrees){
296  Log() << kINFO << " Randomised trees should use *bagging* as the *boost* method. Did you set this in *MethodBoost*? Here I can only enforce *no pruning*" << Endl;
297  fPruneMethod = DecisionTree::kNoPruning;
298  // fBoostType = "Bagging";
299  }
300 
301  if (fMinNodeEvents > 0){
302  fMinNodeSize = 100.0 * fMinNodeEvents / Data()->GetNTrainingEvents(); // use floating-point arithmetic to avoid integer truncation
303  Log() << kWARNING << "You have explicitly set *nEventsMin*, the minimum absolute number \n"
304  << "of events in a leaf node. This is DEPRECATED, please use the option \n"
305  << "*MinNodeSize* giving the relative number as percentage of training \n"
306  << "events instead. \n"
307  << "nEventsMin="<<fMinNodeEvents<< "--> MinNodeSize="<<fMinNodeSize<<"%"
308  << Endl;
309  }else{
310  SetMinNodeSize(fMinNodeSizeS);
311  }
312 }
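// For example, with 20000 training events the deprecated setting nEventsMin=100
// corresponds to MinNodeSize = 100 * 100 / 20000 = 0.5%.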
313 
314 void TMVA::MethodDT::SetMinNodeSize(Double_t sizeInPercent){
315  if (sizeInPercent > 0 && sizeInPercent < 50){
316  fMinNodeSize=sizeInPercent;
317 
318  } else {
319  Log() << kERROR << "you have requested a minimal node size of "
320  << sizeInPercent << "% of the training events, "
321  << "which does not make sense (the value must be between 0 and 50%)" << Endl;
322  }
323 
324 }
325 void TMVA::MethodDT::SetMinNodeSize( TString sizeInPercent ){
326  sizeInPercent.ReplaceAll("%","");
327  if (sizeInPercent.IsFloat()) SetMinNodeSize(sizeInPercent.Atof()); // IsFloat also accepts values such as "2.5"
328  else {
329  Log() << kERROR << "I had problems reading the option MinNodeSize, which\n"
330  << "after removing a possible % sign now reads " << sizeInPercent << Endl;
331  }
332 }
333 
334 
335 
336 ////////////////////////////////////////////////////////////////////////////////
337 /// common initialisation with defaults for the DT-Method
338 
339 void TMVA::MethodDT::Init( void )
340 {
341  fMinNodeEvents = -1;
342  fMinNodeSize = 5;
343  fMinNodeSizeS = "5%";
344  fNCuts = 20;
345  fPruneMethod = DecisionTree::kNoPruning;
346  fPruneStrength = 5; // -1 means automatic determination of the prune strength using a validation sample
347  fDeltaPruneStrength=0.1;
348  fRandomisedTrees= kFALSE;
349  fUseNvars = GetNvar();
350  fUsePoissonNvars = kTRUE;
351 
352  // reference cut value to distinguish signal-like from background-like events
353  SetSignalReferenceCut( 0 );
354  if (fAnalysisType == Types::kClassification || fAnalysisType == Types::kMulticlass ) {
355  fMaxDepth = 3;
356  }else {
357  fMaxDepth = 50;
358  }
359 }
360 
361 ////////////////////////////////////////////////////////////////////////////////
362 ///destructor
363 
364 TMVA::MethodDT::~MethodDT( void )
365 {
366  delete fTree;
367 }
368 
369 ////////////////////////////////////////////////////////////////////////////////
370 
371 void TMVA::MethodDT::Train( void )
372 {
374  fTree = new DecisionTree( fSepType, fMinNodeSize, fNCuts, &(DataInfo()), 0,
375  fRandomisedTrees, fUseNvars, fUsePoissonNvars,fMaxDepth,0 );
376  fTree->SetNVars(GetNvar());
377  if (fRandomisedTrees) Log()<<kWARNING<<" randomised trees do not work yet in this framework,"
378  << " as I do not know how to give each tree a new random seed; for now they"
379  << " will all be the same, and that is not good " << Endl;
380  fTree->SetAnalysisType( GetAnalysisType() );
381 
382  //fTree->BuildTree(GetEventCollection(Types::kTraining));
383  Data()->SetCurrentType(Types::kTraining);
384  UInt_t nevents = Data()->GetNTrainingEvents();
385  std::vector<const TMVA::Event*> tmp;
386  for (Long64_t ievt=0; ievt<nevents; ievt++) {
387  const Event *event = GetEvent(ievt);
388  tmp.push_back(event);
389  }
390  fTree->BuildTree(tmp);
391  if (fPruneMethod != DecisionTree::kNoPruning) fTree->PruneTree();
392 
394 }
395 
396 ////////////////////////////////////////////////////////////////////////////////
397 /// prune the decision tree if requested: individual trees are best grown out and then
398 /// pruned back, whereas boosted decision trees are best started as 'small' trees. At least
399 /// the standard "optimal pruning" algorithms don't result in 'weak enough' classifiers !!
400 
401 Double_t TMVA::MethodDT::PruneTree( )
402 {
403  // remember the number of nodes beforehand (for monitoring purposes)
404 
405 
406  if (fAutomatic && fPruneMethod == DecisionTree::kCostComplexityPruning) { // automatic cost complexity pruning
407  CCPruner* pruneTool = new CCPruner(fTree, this->Data() , fSepType);
408  pruneTool->Optimize();
409  std::vector<DecisionTreeNode*> nodes = pruneTool->GetOptimalPruneSequence();
410  fPruneStrength = pruneTool->GetOptimalPruneStrength();
411  for(UInt_t i = 0; i < nodes.size(); i++)
412  fTree->PruneNode(nodes[i]);
413  delete pruneTool;
414  }
415  else if (fAutomatic && fPruneMethod != DecisionTree::kCostComplexityPruning){
416  /*
417 
418  Double_t alpha = 0;
419  Double_t delta = fDeltaPruneStrength;
420 
421  DecisionTree* dcopy;
422  std::vector<Double_t> q;
423  multimap<Double_t,Double_t> quality;
424  Int_t nnodes=fTree->GetNNodes();
425 
426  // find the maximum prune strength that still leaves some nodes
427  Bool_t forceStop = kFALSE;
428  Int_t troubleCount=0, previousNnodes=nnodes;
429 
430 
431  nnodes=fTree->GetNNodes();
432  while (nnodes > 3 && !forceStop) {
433  dcopy = new DecisionTree(*fTree);
434  dcopy->SetPruneStrength(alpha+=delta);
435  dcopy->PruneTree();
436  q.push_back(TestTreeQuality(dcopy));
437  quality.insert(std::pair<const Double_t,Double_t>(q.back(),alpha));
438  nnodes=dcopy->GetNNodes();
439  if (previousNnodes == nnodes) troubleCount++;
440  else {
441  troubleCount=0; // reset counter
442  if (nnodes < previousNnodes / 2 ) fDeltaPruneStrength /= 2.;
443  }
444  previousNnodes = nnodes;
445  if (troubleCount > 20) {
446  if (methodIndex == 0 && fPruneStrength <=0) {//maybe you need larger stepsize ??
447  fDeltaPruneStrength *= 5;
448  Log() << kINFO << "<PruneTree> trouble determining optimal prune strength"
449  << " for Tree " << methodIndex
450  << " --> first try to increase the step size"
451  << " currently PruneStrength= " << alpha
452  << " stepsize " << fDeltaPruneStrength << " " << Endl;
453  troubleCount = 0; // try again
454  fPruneStrength = 1; // if it was for the first time..
455  } else if (methodIndex == 0 && fPruneStrength <=2) {//maybe you need much larger stepsize ??
456  fDeltaPruneStrength *= 5;
457  Log() << kINFO << "<PruneTree> trouble determining optimal prune strength"
458  << " for Tree " << methodIndex
459  << " --> try to increase the step size even more.. "
460  << " if that still didn't work, TRY IT BY HAND"
461  << " currently PruneStrength= " << alpha
462  << " stepsize " << fDeltaPruneStrength << " " << Endl;
463  troubleCount = 0; // try again
464  fPruneStrength = 3; // if it was for the first time..
465  } else {
466  forceStop=kTRUE;
467  Log() << kINFO << "<PruneTree> trouble determining optimal prune strength"
468  << " for Tree " << methodIndex << " at tested prune strength: " << alpha << " --> abort forced, use same strength as for previous tree:"
469  << fPruneStrength << Endl;
470  }
471  }
472  if (fgDebugLevel==1) Log() << kINFO << "Pruned with ("<<alpha
473  << ") gives quality: " << q.back()
474  << " and #nodes: " << nnodes
475  << Endl;
476  delete dcopy;
477  }
478  if (!forceStop) {
479  multimap<Double_t,Double_t>::reverse_iterator it=quality.rend();
480  it++;
481  fPruneStrength = it->second;
482  // adjust the step size for the next tree.. think that 20 steps are sort of
483  // fine enough.. could become a tunable option later..
484  fDeltaPruneStrength *= Double_t(q.size())/20.;
485  }
486 
487  fTree->SetPruneStrength(fPruneStrength);
488  fTree->PruneTree();
489  */
490  }
491  else {
492  fTree->SetPruneStrength(fPruneStrength);
493  fTree->PruneTree();
494  }
495 
496  return fPruneStrength;
497 }
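// For reference, "cost complexity" pruning (the only method for which the automatic
// strength determination above is implemented) is based on the standard criterion
//
//    R_alpha(T) = R(T) + alpha * N_leaf(T) ,
//
// where R(T) is the misclassification cost of tree T on the training sample,
// N_leaf(T) is its number of leaf nodes, and alpha is the prune strength: a larger
// alpha penalises large trees more strongly and therefore prunes more aggressively.
// The scan over alpha is delegated to the CCPruner used above.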
498 
499 ////////////////////////////////////////////////////////////////////////////////
500 
501 Double_t TMVA::MethodDT::TestTreeQuality( DecisionTree *dt )
502 {
503  Data()->SetCurrentType(Types::kValidation);
504  // test the tree quality in terms of misclassification
505  Double_t SumCorrect=0,SumWrong=0;
506  for (Long64_t ievt=0; ievt<Data()->GetNEvents(); ievt++)
507  {
508  const Event * ev = Data()->GetEvent(ievt);
509  if ((dt->CheckEvent(ev) > dt->GetNodePurityLimit() ) == DataInfo().IsSignal(ev)) SumCorrect+=ev->GetWeight();
510  else SumWrong+=ev->GetWeight();
511  }
512  Data()->SetCurrentType(Types::kTraining);
513  return SumCorrect / (SumCorrect + SumWrong);
514 }
515 
516 ////////////////////////////////////////////////////////////////////////////////
517 
518 void TMVA::MethodDT::AddWeightsXMLTo( void* parent ) const
519 {
520  fTree->AddXMLTo(parent);
521  //Log() << kFATAL << "Please implement writing of weights as XML" << Endl;
522 }
523 
524 ////////////////////////////////////////////////////////////////////////////////
525 
526 void TMVA::MethodDT::ReadWeightsFromXML( void* wghtnode)
527 {
528  if(fTree)
529  delete fTree;
530  fTree = new DecisionTree();
531  fTree->ReadXML(wghtnode,GetTrainingTMVAVersionCode());
532 }
533 
534 ////////////////////////////////////////////////////////////////////////////////
535 
536 void TMVA::MethodDT::ReadWeightsFromStream( std::istream& istr )
537 {
538  delete fTree;
539  fTree = new DecisionTree();
540  fTree->Read(istr);
541 }
542 
543 ////////////////////////////////////////////////////////////////////////////////
544 /// returns MVA value
545 
546 Double_t TMVA::MethodDT::GetMvaValue( Double_t* err, Double_t* errUpper )
547 {
548  // cannot determine error
549  NoErrorCalc(err, errUpper);
550 
551  return fTree->CheckEvent(GetEvent(),fUseYesNoLeaf);
552 }
553 
554 ////////////////////////////////////////////////////////////////////////////////
555 
556 void TMVA::MethodDT::GetHelpMessage() const
557 {
558 }
559 ////////////////////////////////////////////////////////////////////////////////
560 
561 const TMVA::Ranking* TMVA::MethodDT::CreateRanking()
562 {
563  return 0;
564 }