Implementation of a Decision Tree.
In a decision tree, successive decision nodes are used to categorize the events of the sample as either signal or background. Each node uses only a single discriminating variable to decide whether the event is signal-like ("goes right") or background-like ("goes left"). This forms a tree-like structure with "baskets" at the end (leaf nodes), and an event is classified as signal or background according to whether the basket where it ends up was classified as signal or background during the training.
Training a decision tree is the process of defining the cut criteria for each node. The training starts with the root node: the full training event sample is used to select the variable and the corresponding cut value that give the best separation between signal and background at this stage. Using this cut criterion, the sample is then divided into two subsamples, a signal-like (right) and a background-like (left) sample. Two new nodes are then created, one for each of the two subsamples, and they are constructed by the same mechanism as described for the root node. The division stops once a node has reached either a minimum number of events or a minimum or maximum signal purity. These leaf nodes are then called "signal" or "background" depending on whether they contain more signal or more background events from the training sample.
Definition at line 65 of file DecisionTree.h.
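As an illustration of the workflow described above (this sketch is not part of the class documentation itself), the snippet below constructs a tree with the GiniIndex separation criterion, grows it with BuildTree, and classifies a single event with CheckEvent. The helper function name, the numeric values for minSize and nCuts, and the prepared event sample and test event are assumptions; in practice these objects are supplied by the surrounding TMVA code (e.g. MethodBDT).

```cpp
// Illustrative sketch (not from the class documentation): grow a decision
// tree on a prepared training sample and classify one event.
// "GrowAndClassify", the sample, the test event and the numeric values for
// minSize and nCuts are assumptions for illustration only.
#include <vector>
#include "TMVA/DecisionTree.h"
#include "TMVA/GiniIndex.h"
#include "TMVA/Event.h"

Double_t GrowAndClassify(const std::vector<const TMVA::Event*>& trainingSample,
                         const TMVA::Event* testEvent)
{
   TMVA::GiniIndex giniIndex;                        // separation criterion; owned by the caller here
   TMVA::DecisionTree tree(&giniIndex,
                           /*minSize=*/ 50,          // min number of events in a node still split further
                           /*nCuts=*/   20);         // grid points scanned per variable for the cut

   UInt_t nNodes = tree.BuildTree(trainingSample);   // recursive splitting, starting at the root node
   (void)nNodes;

   // With UseYesNoLeaf=kTRUE the result reflects the node type (signal or
   // background) of the leaf ("basket") the event ends up in.
   return tree.CheckEvent(testEvent, kTRUE);
}
```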
Public Types | |
enum | EPruneMethod { kExpectedErrorPruning =0 , kCostComplexityPruning , kNoPruning } |
typedef std::vector< const TMVA::Event * > | EventConstList |
typedef std::vector< TMVA::Event * > | EventList |
Public Member Functions | |
DecisionTree (const DecisionTree &d) | |
copy constructor that creates a true copy, i.e. | |
DecisionTree (SeparationBase *sepType, Float_t minSize, Int_t nCuts, DataSetInfo *=NULL, UInt_t cls=0, Bool_t randomisedTree=kFALSE, Int_t useNvars=0, Bool_t usePoissonNvars=kFALSE, UInt_t nMaxDepth=9999999, Int_t iSeed=fgRandomSeed, Float_t purityLimit=0.5, Int_t treeID=0) | |
constructor specifying the separation type, the min number of events in a node that is still subjected to further splitting, and the number of bins in the grid used in applying the cut for the node splitting. | |
DecisionTree (void) | |
default constructor using the GiniIndex as separation criterion, with no restrictions on the minimum number of events in a leaf node or on the separation gain in the node splitting | |
virtual | ~DecisionTree (void) |
destructor | |
void | ApplyValidationSample (const EventConstList *validationSample) const |
run the validation sample through the (pruned) tree and fill in the nodes the variables NSValidation and NBValidation (i.e. | |
UInt_t | BuildTree (const EventConstList &eventSample, DecisionTreeNode *node=NULL) |
building the decision tree by recursively calling the splitting of one (root-) node into two daughter nodes (returns the number of nodes) | |
Double_t | CheckEvent (const TMVA::Event *, Bool_t UseYesNoLeaf=kFALSE) const |
the event e is put into the decision tree (starting at the root node) and the output is NodeType (signal) or (background) of the final node (basket) in which the given event ends up. | |
void | CheckEventWithPrunedTree (const TMVA::Event *) const |
pass a single validation event through a pruned decision tree on the way down the tree, fill in all the "intermediate" information that would normally be there from training. | |
virtual const char * | ClassName () const |
UInt_t | CleanTree (DecisionTreeNode *node=NULL) |
remove those last splits that result in two leaf nodes that are both of the same type (i.e. | |
void | ClearTree () |
clear the tree nodes (their S/N, Nevents etc), just keep the structure of the tree | |
UInt_t | CountLeafNodes (TMVA::Node *n=NULL) |
return the number of terminal nodes in the sub-tree below Node n | |
virtual DecisionTreeNode * | CreateNode (UInt_t) const |
virtual BinaryTree * | CreateTree () const |
void | DescendTree (Node *n=NULL) |
descend a tree to find all its leaf nodes | |
Bool_t | DoRegression () const |
void | FillEvent (const TMVA::Event &event, TMVA::DecisionTreeNode *node) |
fill the existing decision tree structure by filling events in from the top node and seeing where they happen to end up | |
void | FillTree (const EventList &eventSample) |
fill the existing decision tree structure by filling events in from the top node and seeing where they happen to end up | |
Types::EAnalysisType | GetAnalysisType (void) |
TMVA::DecisionTreeNode * | GetEventNode (const TMVA::Event &e) const |
get the pointer to the leaf node where a particular event ends up in... (used in gradient boosting) | |
std::vector< Double_t > | GetFisherCoefficients (const EventConstList &eventSample, UInt_t nFisherVars, UInt_t *mapVarInFisher) |
calculate the fisher coefficients for the event sample and the variables used | |
Int_t | GetNNodesBeforePruning () |
Node * | GetNode (ULong_t sequence, UInt_t depth) |
retrieve node from the tree. | |
Double_t | GetNodePurityLimit () const |
Double_t | GetPruneStrength () const |
void | GetRandomisedVariables (Bool_t *useVariable, UInt_t *variableMap, UInt_t &nVars) |
virtual DecisionTreeNode * | GetRoot () const |
Double_t | GetSumWeights (const EventConstList *validationSample) const |
calculate the normalization factor for a pruning validation sample | |
Int_t | GetTreeID () |
std::vector< Double_t > | GetVariableImportance () |
Return the relative variable importance, normalized to all variables together having the importance 1. | |
Double_t | GetVariableImportance (UInt_t ivar) |
returns the relative importance of variable ivar | |
void | PruneNode (TMVA::DecisionTreeNode *node) |
prune away the subtree below the node | |
void | PruneNodeInPlace (TMVA::DecisionTreeNode *node) |
prune a node temporarily (without actually deleting its descendants), which allows testing the pruned-tree quality for many different pruning stages without "touching" the tree. | |
Double_t | PruneTree (const EventConstList *validationSample=NULL) |
prune the decision tree (get rid of internal nodes) to avoid overtraining; several different pruning methods can be applied, as selected by the variable "fPruneMethod". | |
void | SetAnalysisType (Types::EAnalysisType t) |
void | SetMinLinCorrForFisher (Double_t min) |
void | SetNodePurityLimit (Double_t p) |
void | SetNVars (Int_t n) |
void | SetParentTreeInNodes (Node *n=NULL) |
descend a tree to find all its leaf nodes and fill in the maximum depth reached in the tree at the same time. | |
void | SetPruneMethod (EPruneMethod m=kCostComplexityPruning) |
void | SetPruneStrength (Double_t p) |
void | SetTreeID (Int_t treeID) |
void | SetUseExclusiveVars (Bool_t t=kTRUE) |
void | SetUseFisherCuts (Bool_t t=kTRUE) |
Double_t | TestPrunedTreeQuality (const DecisionTreeNode *dt=NULL, Int_t mode=0) const |
return the misclassification rate of a pruned tree; a "pruned tree" may have the variable "IsTerminal" set to "arbitrary" at any node, hence this tree-quality test stops there and therefore tests the pruned tree (while the full tree is still in place for normal/later use) | |
Double_t | TrainNode (const EventConstList &eventSample, DecisionTreeNode *node) |
Double_t | TrainNodeFast (const EventConstList &eventSample, DecisionTreeNode *node) |
Decide how to split a node using one of the variables that gives the best separation of signal/background. | |
Double_t | TrainNodeFull (const EventConstList &eventSample, DecisionTreeNode *node) |
train a node by finding the single optimal cut for a single variable that best separates signal and background (maximizes the separation gain) | |
Public Member Functions inherited from TMVA::BinaryTree | |
BinaryTree (void) | |
constructor for a yet "empty" tree. Needs to be filled afterwards | |
virtual | ~BinaryTree () |
destructor (deletes the nodes and "events" if owned by the tree) | |
virtual void * | AddXMLTo (void *parent) const |
add attributes to XML | |
UInt_t | CountNodes (Node *n=NULL) |
return the number of nodes in the tree (performs a fresh count, which takes time) | |
Node * | GetLeftDaughter (Node *n) |
get the left daughter node of the current node "n" | |
UInt_t | GetNNodes () const |
Node * | GetRightDaughter (Node *n) |
get the right daughter node of the current node "n" | |
UInt_t | GetTotalTreeDepth () const |
virtual void | Print (std::ostream &os) const |
recursively print the tree | |
virtual void | Read (std::istream &istr, UInt_t tmva_Version_Code=TMVA_VERSION_CODE) |
Read the binary tree from an input stream. | |
virtual void | ReadXML (void *node, UInt_t tmva_Version_Code=TMVA_VERSION_CODE) |
read attributes from XML | |
void | SetRoot (Node *r) |
void | SetTotalTreeDepth (Int_t depth) |
void | SetTotalTreeDepth (Node *n=NULL) |
descend a tree to find all its leaf nodes and fill in the maximum depth reached in the tree at the same time. | |
Static Public Member Functions | |
static DecisionTree * | CreateFromXML (void *node, UInt_t tmva_Version_Code=TMVA_VERSION_CODE) |
re-create a new tree (decision tree or search tree) from XML | |
Private Member Functions | |
Double_t | SamplePurity (EventList eventSample) |
calculates the purity S/(S+B) of a given event sample | |
Static Private Attributes | |
static const Int_t | fgDebugLevel = 0 |
static const Int_t | fgRandomSeed = 0 |
Additional Inherited Members | |
Protected Member Functions inherited from TMVA::BinaryTree | |
void | DeleteNode (Node *) |
protected, recursive function used by the class destructor and when pruning | |
MsgLogger & | Log () const |
Protected Attributes inherited from TMVA::BinaryTree | |
UInt_t | fDepth |
UInt_t | fNNodes |
Node * | fRoot |
#include <TMVA/DecisionTree.h>
typedef std::vector<const TMVA::Event*> TMVA::DecisionTree::EventConstList |
Definition at line 74 of file DecisionTree.h.
typedef std::vector<TMVA::Event*> TMVA::DecisionTree::EventList |
Definition at line 73 of file DecisionTree.h.
Enumerator | |
---|---|
kExpectedErrorPruning | |
kCostComplexityPruning | |
kNoPruning |
Definition at line 139 of file DecisionTree.h.
TMVA::DecisionTree::DecisionTree | ( | void | ) |
default constructor using the GiniIndex as separation criterion, with no restrictions on the minimum number of events in a leaf node or on the separation gain in the node splitting
Definition at line 115 of file DecisionTree.cxx.
TMVA::DecisionTree::DecisionTree | ( | TMVA::SeparationBase * | sepType, |
Float_t | minSize, | ||
Int_t | nCuts, | ||
DataSetInfo * | dataInfo = NULL , |
||
UInt_t | cls = 0 , |
||
Bool_t | randomisedTree = kFALSE , |
||
Int_t | useNvars = 0 , |
||
Bool_t | usePoissonNvars = kFALSE , |
||
UInt_t | nMaxDepth = 9999999 , |
||
Int_t | iSeed = fgRandomSeed , |
||
Float_t | purityLimit = 0.5 , |
||
Int_t | treeID = 0 |
||
) |
constructor specifying the separation type, the min number of events in a node that is still subjected to further splitting, and the number of bins in the grid used in applying the cut for the node splitting.
Definition at line 150 of file DecisionTree.cxx.
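For reference, here is a hedged sketch of a call that spells out every parameter of this constructor with its documented default; the helper function name and the concrete values of minSize and nCuts are illustrative assumptions.

```cpp
// Illustrative call spelling out every constructor parameter with its
// documented default; the helper name and the values of minSize and nCuts
// are placeholders. fgRandomSeed is private, so its default value 0 is
// passed explicitly for iSeed.
#include "TMVA/DecisionTree.h"
#include "TMVA/GiniIndex.h"

TMVA::DecisionTree* MakeTree()
{
   TMVA::SeparationBase* sepType = new TMVA::GiniIndex();
   return new TMVA::DecisionTree(sepType,
                                 /*minSize=*/         50,
                                 /*nCuts=*/           20,
                                 /*dataInfo=*/        NULL,
                                 /*cls=*/             0,
                                 /*randomisedTree=*/  kFALSE,
                                 /*useNvars=*/        0,
                                 /*usePoissonNvars=*/ kFALSE,
                                 /*nMaxDepth=*/       9999999,
                                 /*iSeed=*/           0,
                                 /*purityLimit=*/     0.5,
                                 /*treeID=*/          0);
}
```

Ownership of sepType and of the returned tree is assumed to stay with the caller in this sketch.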
TMVA::DecisionTree::DecisionTree | ( | const DecisionTree & | d | ) |
copy constructor that creates a true copy, i.e.
a completely independent tree; the node copy will recursively copy all the nodes
Definition at line 200 of file DecisionTree.cxx.
TMVA::DecisionTree::~DecisionTree | ( | void | ) |
virtual
destructor
Definition at line 236 of file DecisionTree.cxx.
void TMVA::DecisionTree::ApplyValidationSample | ( | const EventConstList * | validationSample | ) | const |
run the validation sample through the (pruned) tree and fill in the nodes the variables NSValidation and NBValidation (i.e.
how many of the signal and background events from the validation sample end up in each node). This is then later used when asking for the "tree quality".
Definition at line 1029 of file DecisionTree.cxx.
UInt_t TMVA::DecisionTree::BuildTree | ( | const EventConstList & | eventSample, |
DecisionTreeNode * | node = NULL |
||
) |
building the decision tree by recursively calling the splitting of one (root-) node into two daughter nodes (returns the number of nodes)
Definition at line 377 of file DecisionTree.cxx.
Double_t TMVA::DecisionTree::CheckEvent | ( | const TMVA::Event * | e, |
Bool_t | UseYesNoLeaf = kFALSE |
||
) | const |
the event e is put into the decision tree (starting at the root node) and the output is NodeType (signal) or (background) of the final node (basket) in which the given event ends up.
I.e. the result of the classification of the event for this decision tree.
Definition at line 2690 of file DecisionTree.cxx.
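A small sketch of how the return value might be used; the interpretation of the UseYesNoLeaf=kFALSE case as the signal purity of the leaf is an assumption based on the node-purity machinery documented on this page, not something stated in the paragraph above, and the helper name is hypothetical.

```cpp
// Hypothetical helper contrasting the two modes of CheckEvent.
#include "TMVA/DecisionTree.h"
#include "TMVA/Event.h"

Bool_t ClassifyOne(const TMVA::DecisionTree& tree, const TMVA::Event* ev)
{
   Double_t type   = tree.CheckEvent(ev, kTRUE);    // node type of the final leaf
   Double_t purity = tree.CheckEvent(ev, kFALSE);   // assumed: signal purity of that leaf
   (void)type;
   return purity > tree.GetNodePurityLimit();       // default purity limit is 0.5
}
```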
void TMVA::DecisionTree::CheckEventWithPrunedTree | ( | const TMVA::Event * | e | ) | const |
pass a single validation event through a pruned decision tree on the way down the tree, fill in all the "intermediate" information that would normally be there from training.
Definition at line 1085 of file DecisionTree.cxx.
const char * TMVA::DecisionTree::ClassName | ( | ) | const
inlinevirtual
Implements TMVA::BinaryTree.
Definition at line 98 of file DecisionTree.h.
UInt_t TMVA::DecisionTree::CleanTree | ( | DecisionTreeNode * | node = NULL | ) |
remove those last splits that result in two leaf nodes that are both of the same type (i.e.
both signal or both background); this is of course only a reasonable thing to do when you use "YesOrNo" leaves, while it might lose something if you use the purity information in the nodes --> hence it is not called automatically in the tree building
Definition at line 937 of file DecisionTree.cxx.
void TMVA::DecisionTree::ClearTree | ( | ) |
clear the tree nodes (their S/N, Nevents etc), just keep the structure of the tree
Definition at line 923 of file DecisionTree.cxx.
UInt_t TMVA::DecisionTree::CountLeafNodes | ( | TMVA::Node * | n = NULL | ) |
return the number of terminal nodes in the sub-tree below Node n
Definition at line 1131 of file DecisionTree.cxx.
TMVA::DecisionTree * TMVA::DecisionTree::CreateFromXML | ( | void * | node,
UInt_t | tmva_Version_Code = TMVA_VERSION_CODE | ) |
static
re-create a new tree (decision tree or search tree) from XML
Definition at line 281 of file DecisionTree.cxx.
TMVA::DecisionTreeNode * TMVA::DecisionTree::CreateNode | ( | UInt_t | ) | const
inlinevirtual
Implements TMVA::BinaryTree.
Definition at line 95 of file DecisionTree.h.
TMVA::BinaryTree * TMVA::DecisionTree::CreateTree | ( | ) | const
inlinevirtual
Implements TMVA::BinaryTree.
Definition at line 96 of file DecisionTree.h.
descend a tree to find all its leaf nodes
Definition at line 1160 of file DecisionTree.cxx.
|
inline |
Definition at line 188 of file DecisionTree.h.
void TMVA::DecisionTree::FillEvent | ( | const TMVA::Event & | event, |
TMVA::DecisionTreeNode * | node | ||
) |
fill the existing decision tree structure by filling events in from the top node and seeing where they happen to end up
Definition at line 891 of file DecisionTree.cxx.
fill the existing decision tree structure by filling events in from the top node and seeing where they happen to end up
Definition at line 880 of file DecisionTree.cxx.
|
inline |
Definition at line 190 of file DecisionTree.h.
TMVA::DecisionTreeNode * TMVA::DecisionTree::GetEventNode | ( | const TMVA::Event & | e | ) | const |
get the pointer to the leaf node where a particular event ends up in... (used in gradient boosting)
Definition at line 2673 of file DecisionTree.cxx.
std::vector< Double_t > TMVA::DecisionTree::GetFisherCoefficients | ( | const EventConstList & | eventSample, |
UInt_t | nFisherVars, | ||
UInt_t * | mapVarInFisher | ||
) |
calculate the fisher coefficients for the event sample and the variables used
Definition at line 2342 of file DecisionTree.cxx.
|
inline |
Definition at line 180 of file DecisionTree.h.
TMVA::Node * TMVA::DecisionTree::GetNode | ( | ULong_t | sequence, |
UInt_t | depth | ||
) |
retrieve node from the tree.
Its position (up to a maximal tree depth of 64) is coded as a sequence of left-right moves starting from the root, coded as 0-1 bit patterns stored in the "long integer" (i.e. 0:left; 1:right)
Definition at line 1231 of file DecisionTree.cxx.
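A hedged illustration of this encoding; which end of the bit pattern corresponds to the move taken at the root is an assumption here and should be checked against the implementation before relying on it.

```cpp
// Illustrative only: fetch the node reached by the moves right, left, right.
// The bit order (which bit is the move at the root) is an assumption here.
#include "TMVA/DecisionTree.h"
#include "TMVA/Node.h"

TMVA::Node* NodeAtRightLeftRight(TMVA::DecisionTree& tree)
{
   ULong_t sequence = 0x5;   // binary 101 -> 1:right, 0:left, 1:right (assumed order)
   UInt_t  depth    = 3;     // number of moves encoded in "sequence"
   return tree.GetNode(sequence, depth);
}
```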
|
inline |
Definition at line 162 of file DecisionTree.h.
|
inline |
Definition at line 147 of file DecisionTree.h.
void TMVA::DecisionTree::GetRandomisedVariables | ( | Bool_t * | useVariable, |
UInt_t * | variableMap, | ||
UInt_t & | nVars | ||
) |
Definition at line 1247 of file DecisionTree.cxx.
|
inlinevirtual |
Reimplemented from TMVA::BinaryTree.
Definition at line 94 of file DecisionTree.h.
Double_t TMVA::DecisionTree::GetSumWeights | ( | const EventConstList * | validationSample | ) | const |
calculate the normalization factor for a pruning validation sample
Definition at line 1118 of file DecisionTree.cxx.
|
inline |
Definition at line 186 of file DecisionTree.h.
vector< Double_t > TMVA::DecisionTree::GetVariableImportance | ( | ) |
Return the relative variable importance, normalized to all variables together having the importance 1.
The importance is evaluated as the total separation gain that this variable obtained in the decision trees (weighted by the number of events)
Definition at line 2745 of file DecisionTree.cxx.
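A short illustrative loop over the returned vector (the helper name is hypothetical):

```cpp
// Hypothetical helper: print the relative importance of each input variable
// after training; the returned importances are normalized to sum to 1.
#include <cstdio>
#include <vector>
#include "TMVA/DecisionTree.h"

void PrintVariableImportance(TMVA::DecisionTree& tree)
{
   std::vector<Double_t> importance = tree.GetVariableImportance();
   for (UInt_t ivar = 0; ivar < importance.size(); ++ivar)
      std::printf("variable %u : relative importance %.3f\n", ivar, importance[ivar]);
}
```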
returns the relative importance of variable ivar
Definition at line 2766 of file DecisionTree.cxx.
void TMVA::DecisionTree::PruneNode | ( | TMVA::DecisionTreeNode * | node | ) |
prune away the subtree below the node
Definition at line 1194 of file DecisionTree.cxx.
void TMVA::DecisionTree::PruneNodeInPlace | ( | TMVA::DecisionTreeNode * | node | ) |
prune a node temporarily (without actually deleting its descendants), which allows testing the pruned-tree quality for many different pruning stages without "touching" the tree.
Definition at line 1217 of file DecisionTree.cxx.
Double_t TMVA::DecisionTree::PruneTree | ( | const EventConstList * | validationSample = NULL | ) |
prune the decision tree (get rid of internal nodes) to avoid overtraining; several different pruning methods can be applied, as selected by the variable "fPruneMethod".
Definition at line 964 of file DecisionTree.cxx.
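Below is a sketch of a possible pruning pass combining the pruning-related members documented on this page; the prune-strength value and the exact call order are illustrative assumptions, not the sequence used internally by TMVA::MethodBDT.

```cpp
// Sketch of a possible pruning pass; the prune-strength value and the call
// order are illustrative assumptions. The validation sample is assumed to
// be prepared elsewhere.
#include <vector>
#include "TMVA/DecisionTree.h"
#include "TMVA/Event.h"

Double_t PruneAndTest(TMVA::DecisionTree& tree,
                      const std::vector<const TMVA::Event*>& validationSample)
{
   tree.SetPruneMethod(TMVA::DecisionTree::kCostComplexityPruning);
   tree.SetPruneStrength(2.0);                      // placeholder value, to be tuned

   tree.PruneTree(&validationSample);               // remove internal nodes according to fPruneMethod
   tree.ApplyValidationSample(&validationSample);   // fill NSValidation/NBValidation in the nodes

   return tree.TestPrunedTreeQuality();             // misclassification rate of the pruned tree
}
```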
calculates the purity S/(S+B) of a given event sample
Definition at line 2722 of file DecisionTree.cxx.
|
inline |
Definition at line 189 of file DecisionTree.h.
Definition at line 192 of file DecisionTree.h.
Definition at line 161 of file DecisionTree.h.
Definition at line 194 of file DecisionTree.h.
descend a tree to find all its leaf nodes and fill in the maximum depth reached in the tree at the same time.
Definition at line 248 of file DecisionTree.cxx.
|
inline |
Definition at line 140 of file DecisionTree.h.
Definition at line 146 of file DecisionTree.h.
Definition at line 185 of file DecisionTree.h.
Definition at line 193 of file DecisionTree.h.
Definition at line 191 of file DecisionTree.h.
Double_t TMVA::DecisionTree::TestPrunedTreeQuality | ( | const DecisionTreeNode * | dt = NULL , |
Int_t | mode = 0 |
||
) | const |
return the misclassification rate of a pruned tree; a "pruned tree" may have the variable "IsTerminal" set to "arbitrary" at any node, hence this tree-quality test stops there and therefore tests the pruned tree (while the full tree is still in place for normal/later use)
Definition at line 1043 of file DecisionTree.cxx.
|
inline |
Definition at line 108 of file DecisionTree.h.
Double_t TMVA::DecisionTree::TrainNodeFast | ( | const EventConstList & | eventSample, |
TMVA::DecisionTreeNode * | node | ||
) |
Decide how to split a node using one of the variables that gives the best separation of signal/background.
In order to do this, for each variable a scan over the different cut values in a grid (with fNCuts grid points) is performed and the resulting separation gains are compared. In addition to the individual variables, one can also ask for a Fisher discriminant to be built out of (some of) the variables and used as a possible multivariate split.
Definition at line 1374 of file DecisionTree.cxx.
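To make the grid scan concrete, here is a small self-contained sketch of how such a scan could look for a single variable. It is purely illustrative: simplified types, a Gini-style index (equal to TMVA's Gini index only up to a constant factor), and none of the optimizations or the Fisher-discriminant option of the actual TrainNodeFast implementation.

```cpp
// Generic illustration of the grid scan described above -- NOT the actual
// TMVA implementation. For one variable, nCuts candidate cut values between
// the observed minimum and maximum are tried and the cut with the largest
// Gini-based separation gain is returned.
#include <vector>

struct WeightedPoint { double value; double weight; bool isSignal; };

static double Gini(double s, double b)
{
   double n = s + b;
   return n > 0 ? s * b / n : 0;   // impurity times size: small = pure node
}

double BestCutForVariable(const std::vector<WeightedPoint>& sample, int nCuts)
{
   double xmin = sample.front().value, xmax = sample.front().value;
   double sTot = 0, bTot = 0;
   for (const WeightedPoint& p : sample) {
      if (p.value < xmin) xmin = p.value;
      if (p.value > xmax) xmax = p.value;
      (p.isSignal ? sTot : bTot) += p.weight;
   }

   double bestCut = xmin, bestGain = -1;
   for (int i = 1; i <= nCuts; ++i) {
      double cut = xmin + (xmax - xmin) * i / (nCuts + 1);    // grid point
      double sR = 0, bR = 0;                                  // right ("signal-like") side
      for (const WeightedPoint& p : sample)
         if (p.value >= cut) (p.isSignal ? sR : bR) += p.weight;
      // separation gain = parent index minus the indices of the two daughters
      double gain = Gini(sTot, bTot) - Gini(sR, bR) - Gini(sTot - sR, bTot - bR);
      if (gain > bestGain) { bestGain = gain; bestCut = cut; }
   }
   return bestCut;
}
```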
Double_t TMVA::DecisionTree::TrainNodeFull | ( | const EventConstList & | eventSample, |
TMVA::DecisionTreeNode * | node | ||
) |
train a node by finding the single optimal cut for a single variable that best separates signal and background (maximizes the separation gain)
Definition at line 2536 of file DecisionTree.cxx.
The private data members of DecisionTree (including the static members fgDebugLevel and fgRandomSeed) are declared at lines 69 and 205-241 of file DecisionTree.h.