// @(#)root/tmva $Id: MethodFisher.cxx 37506 2010-12-10 14:00:41Z stelzer $
// Author: Andreas Hoecker, Xavier Prudent, Joerg Stelzer, Helge Voss, Kai Voss

/**********************************************************************************
 * Project: TMVA - a Root-integrated toolkit for multivariate Data analysis       *
 * Package: TMVA                                                                  *
 * Class  : MethodFisher                                                          *
 * Web    : http://tmva.sourceforge.net                                           *
 *                                                                                *
 * Description:                                                                   *
 *      Implementation (see header for description)                               *
 *                                                                                *
 * Original author of this Fisher-Discriminant implementation:                    *
 *      Andre Gaidot, CEA-France;                                                 *
 *      (Translation from FORTRAN)                                                *
 *                                                                                *
 * Authors (alphabetical):                                                        *
 *      Andreas Hoecker <Andreas.Hocker@cern.ch> - CERN, Switzerland              *
 *      Xavier Prudent  <prudent@lapp.in2p3.fr>  - LAPP, France                   *
 *      Helge Voss      <Helge.Voss@cern.ch>     - MPI-K Heidelberg, Germany      *
 *      Kai Voss        <Kai.Voss@cern.ch>       - U. of Victoria, Canada         *
 *                                                                                *
 * Copyright (c) 2005:                                                            *
 *      CERN, Switzerland                                                         *
 *      U. of Victoria, Canada                                                    *
 *      MPI-K Heidelberg, Germany                                                 *
 *      LAPP, Annecy, France                                                      *
 *                                                                                *
 * Redistribution and use in source and binary forms, with or without             *
 * modification, are permitted according to the terms listed in LICENSE           *
 * (http://tmva.sourceforge.net/LICENSE)                                          *
 **********************************************************************************/

//_______________________________________________________________________
/* Begin_Html
  Fisher and Mahalanobis Discriminants (Linear Discriminant Analysis)

  <p>
  In the method of Fisher discriminants event selection is performed
  in a transformed variable space with zero linear correlations, by
  distinguishing the mean values of the signal and background
  distributions.<br></p>

  <p>
  The linear discriminant analysis determines an axis in the (correlated)
  hyperspace of the input variables
  such that, when projecting the output classes (signal and background)
  upon this axis, they are pushed as far as possible away from each other,
  while events of a same class are confined in a close vicinity.
  The linearity property of this method is reflected in the metric with
  which "far apart" and "close vicinity" are determined: the covariance
  matrix of the discriminant variable space.
  </p>

  <p>
  The classification of the events in signal and background classes
  relies on the following characteristics (only): overall sample means,
  <i><my:o>x</my:o><sub>i</sub></i>, for each input variable, <i>i</i>,
  class-specific sample means, <i><my:o>x</my:o><sub>S(B),i</sub></i>,
  and total covariance matrix <i>T<sub>ij</sub></i>. The covariance matrix
  can be decomposed into the sum of a <i>within-</i> (<i>W<sub>ij</sub></i>)
  and a <i>between-class</i> matrix (<i>B<sub>ij</sub></i>). They describe
  the dispersion of events relative to the means of their own class (within-class
  matrix), and relative to the overall sample means (between-class matrix).
  The Fisher coefficients, <i>F<sub>i</sub></i>, are then given by <br>
  <center>
  <img vspace=6 src="gif/tmva_fisherC.gif" align="bottom" >
  </center>
  where TMVA sets <i>N<sub>S</sub>=N<sub>B</sub></i>, so that the factor
  in front of the sum simplifies to &frac12;.
  The Fisher discriminant then reads<br>
  <center>
  <img vspace=6 src="gif/tmva_fisherD.gif" align="bottom" >
  </center>
  The offset <i>F</i><sub>0</sub> centers the sample mean of <i>x</i><sub>Fi</sub>
  at zero. Instead of using the within-class matrix, the Mahalanobis variant
  determines the Fisher coefficients as follows:<br>
  <center>
  <img vspace=6 src="gif/tmva_mahaC.gif" align="bottom" >
  </center>
  with resulting <i>x</i><sub>Ma</sub> that are very similar to the
  <i>x</i><sub>Fi</sub>. <br></p>

  TMVA provides two outputs for the ranking of the input variables:<br><p></p>
  <ul>
  <li> <u>Fisher test:</u> the Fisher analysis aims at simultaneously maximising
  the between-class separation, while minimising the within-class dispersion.
  A useful measure of the discrimination power of a variable is hence given
  by the diagonal quantity: <i>B<sub>ii</sub>/W<sub>ii</sub></i>.
  </li>

  <li> <u>Discrimination power:</u> the value of the Fisher coefficient is a
  measure of the discriminating power of a variable. The discrimination power
  of a set of input variables can therefore be measured by the scalar
  <center>
  <img vspace=6 src="gif/tmva_discpower.gif" align="bottom" >
  </center>
  </li>
  </ul>
  The corresponding numbers are printed on standard output.
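
  <p>
  A typical booking of this classifier (a minimal sketch, assuming a configured
  TMVA::Factory object named <tt>factory</tt>; adapt it to your own setup):
  </p>
  <pre>
  factory->BookMethod( TMVA::Types::kFisher, "Fisher",     "Method=Fisher"      );
  factory->BookMethod( TMVA::Types::kFisher, "FisherMaha", "Method=Mahalanobis" );
  </pre>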
  End_Html */
//_______________________________________________________________________

#include <iomanip>
#include <cassert>

#include "TMath.h"
#include "Riostream.h"

#include "TMVA/VariableTransformBase.h"
#include "TMVA/MethodFisher.h"
#include "TMVA/Tools.h"
#include "TMatrix.h"
#include "TMVA/Ranking.h"
#include "TMVA/Types.h"
#include "TMVA/ClassifierFactory.h"

REGISTER_METHOD(Fisher)

ClassImp(TMVA::MethodFisher);

//_______________________________________________________________________
TMVA::MethodFisher::MethodFisher( const TString& jobName,
                                  const TString& methodTitle,
                                  DataSetInfo& dsi,
                                  const TString& theOption,
                                  TDirectory* theTargetDir ) :
   MethodBase( jobName, Types::kFisher, methodTitle, dsi, theOption, theTargetDir ),
   fMeanMatx     ( 0 ),
   fTheMethod    ( "Fisher" ),
   fFisherMethod ( kFisher ),
   fBetw         ( 0 ),
   fWith         ( 0 ),
   fCov          ( 0 ),
   fSumOfWeightsS( 0 ),
   fSumOfWeightsB( 0 ),
   fDiscrimPow   ( 0 ),
   fFisherCoeff  ( 0 ),
   fF0           ( 0 )
{
   // standard constructor for the "Fisher" method
}

//_______________________________________________________________________
TMVA::MethodFisher::MethodFisher( DataSetInfo& dsi,
                                  const TString& theWeightFile,
                                  TDirectory* theTargetDir ) :
   MethodBase( Types::kFisher, dsi, theWeightFile, theTargetDir ),
   fMeanMatx     ( 0 ),
   fTheMethod    ( "Fisher" ),
   fFisherMethod ( kFisher ),
   fBetw         ( 0 ),
   fWith         ( 0 ),
   fCov          ( 0 ),
   fSumOfWeightsS( 0 ),
   fSumOfWeightsB( 0 ),
   fDiscrimPow   ( 0 ),
   fFisherCoeff  ( 0 ),
   fF0           ( 0 )
{
   // constructor from weight file
}

//_______________________________________________________________________
void TMVA::MethodFisher::Init( void )
{
   // default initialization called by all constructors

   // allocate Fisher coefficients
   fFisherCoeff = new std::vector<Double_t>( GetNvar() );

   // the minimum requirement to declare an event signal-like
   SetSignalReferenceCut( 0.0 );

   // this is the preparation for training
   InitMatrices();
}

//_______________________________________________________________________
void TMVA::MethodFisher::DeclareOptions()
{
   //
   // MethodFisher options:
   // format and syntax of the option string: "Method=<type>",
   // where <type> is either "Fisher" or "Mahalanobis"
   //
   DeclareOptionRef( fTheMethod = "Fisher", "Method", "Discrimination method" );
   AddPreDefVal(TString("Fisher"));
   AddPreDefVal(TString("Mahalanobis"));
}

//_______________________________________________________________________
void TMVA::MethodFisher::ProcessOptions()
{
   // process user options
   if (fTheMethod ==  "Fisher" ) fFisherMethod = kFisher;
   else                          fFisherMethod = kMahalanobis;

   // this is the preparation for training
   InitMatrices();
}
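//
// Note on the option semantics (illustrative summary; the choice is used in
// GetFisherCoeff below):
//
//    "Method=Fisher"      -> fFisherMethod = kFisher      (inverts the within-class matrix fWith)
//    "Method=Mahalanobis" -> fFisherMethod = kMahalanobis (inverts the full covariance matrix fCov)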

//_______________________________________________________________________
TMVA::MethodFisher::~MethodFisher( void )
{
   // destructor
   if (fBetw       ) { delete fBetw; fBetw = 0; }
   if (fWith       ) { delete fWith; fWith = 0; }
   if (fCov        ) { delete fCov;  fCov = 0; }
   if (fDiscrimPow ) { delete fDiscrimPow; fDiscrimPow = 0; }
   if (fFisherCoeff) { delete fFisherCoeff; fFisherCoeff = 0; }
}

//_______________________________________________________________________
Bool_t TMVA::MethodFisher::HasAnalysisType( Types::EAnalysisType type, UInt_t numberClasses, UInt_t /*numberTargets*/ )
{
   // Fisher can only handle classification with 2 classes
   if (type == Types::kClassification && numberClasses == 2) return kTRUE;
   return kFALSE;
}

//_______________________________________________________________________
void TMVA::MethodFisher::Train( void )
{
   // computation of the Fisher coefficients by a series of matrix operations

   // get the mean value of each variable for signal, background and signal+background
   GetMean();

   // get the matrix of covariance 'within class'
   GetCov_WithinClass();

   // get the matrix of covariance 'between class'
   GetCov_BetweenClass();

   // get the full covariance matrix (sum of 'within class' and 'between class')
   GetCov_Full();

   //--------------------------------------------------------------

   // get the Fisher coefficients
   GetFisherCoeff();

   // get the discriminating power of each variable
   GetDiscrimPower();

   // nice output
   PrintCoefficients();
}

//_______________________________________________________________________
Double_t TMVA::MethodFisher::GetMvaValue( Double_t* err, Double_t* errUpper )
{
   // returns the Fisher value (no fixed range)
   const Event * ev = GetEvent();
   Double_t result = fF0;
   for (UInt_t ivar=0; ivar<GetNvar(); ivar++)
      result += (*fFisherCoeff)[ivar]*ev->GetValue(ivar);

   // cannot determine error
   NoErrorCalc(err, errUpper);

   return result;

}
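//
// Illustration (hypothetical standalone code mirroring the loop above, e.g. for
// applying exported coefficients outside of TMVA):
//
//    double fisher = F0;                        // F0 and F[] taken from the weight file
//    for (size_t i = 0; i < F.size(); ++i)
//       fisher += F[i]*x[i];                    // x[] holds the input values of one event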

//_______________________________________________________________________
void TMVA::MethodFisher::InitMatrices( void )
{
   // initialization method; creates global matrices and vectors

   // average value of each variable for S, B, S+B
   fMeanMatx = new TMatrixD( GetNvar(), 3 );

   // the covariance 'within class' and 'between class' matrices
   fBetw = new TMatrixD( GetNvar(), GetNvar() );
   fWith = new TMatrixD( GetNvar(), GetNvar() );
   fCov  = new TMatrixD( GetNvar(), GetNvar() );

   // discriminating power
   fDiscrimPow = new std::vector<Double_t>( GetNvar() );
}

//_______________________________________________________________________
void TMVA::MethodFisher::GetMean( void )
{
   // compute mean values of variables in each sample, and the overall means

   // initialize internal sum-of-weights variables
   fSumOfWeightsS = 0;
   fSumOfWeightsB = 0;

   const UInt_t nvar = DataInfo().GetNVariables();

   // init vectors
   Double_t* sumS = new Double_t[nvar];
   Double_t* sumB = new Double_t[nvar];
   for (UInt_t ivar=0; ivar<nvar; ivar++) { sumS[ivar] = sumB[ivar] = 0; }

   // compute sample means
   for (Int_t ievt=0; ievt<Data()->GetNEvents(); ievt++) {

      // read the Training Event into "event"
      const Event * ev = GetEvent(ievt);

      // sum of weights
      Double_t weight = GetTWeight(ev);
      if (DataInfo().IsSignal(ev)) fSumOfWeightsS += weight;
      else                         fSumOfWeightsB += weight;

      Double_t* sum = DataInfo().IsSignal(ev) ? sumS : sumB;

      for (UInt_t ivar=0; ivar<nvar; ivar++) sum[ivar] += ev->GetValue( ivar )*weight;
   }

   for (UInt_t ivar=0; ivar<nvar; ivar++) {
      (*fMeanMatx)( ivar, 2 ) = sumS[ivar];
      (*fMeanMatx)( ivar, 0 ) = sumS[ivar]/fSumOfWeightsS;

      (*fMeanMatx)( ivar, 2 ) += sumB[ivar];
      (*fMeanMatx)( ivar, 1 ) = sumB[ivar]/fSumOfWeightsB;

      // signal + background
      (*fMeanMatx)( ivar, 2 ) /= (fSumOfWeightsS + fSumOfWeightsB);
   }
   delete [] sumS;
   delete [] sumB;
}
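//
// In formulae, GetMean() fills (w_ev denoting the event weight):
//
//    (*fMeanMatx)(i,0) = sum_{signal ev} w_ev*x_{i,ev} / sum_{signal ev} w_ev   (signal mean)
//    (*fMeanMatx)(i,1) = sum_{backgd ev} w_ev*x_{i,ev} / sum_{backgd ev} w_ev   (background mean)
//    (*fMeanMatx)(i,2) = weighted mean of x_i over the full signal+background sample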

//_______________________________________________________________________
void TMVA::MethodFisher::GetCov_WithinClass( void )
{
   // the matrix of covariance 'within class' reflects the dispersion of the
   // events relative to the center of gravity of their own class  

   // assert required
   assert( fSumOfWeightsS > 0 && fSumOfWeightsB > 0 );

   // sums of the products (x-<x>)*(y-<y>), where x and y are input variables

   // init
   const Int_t nvar  = GetNvar();
   const Int_t nvar2 = nvar*nvar;
   Double_t *sumSig  = new Double_t[nvar2];
   Double_t *sumBgd  = new Double_t[nvar2];
   Double_t *xval    = new Double_t[nvar];
   memset(sumSig,0,nvar2*sizeof(Double_t));
   memset(sumBgd,0,nvar2*sizeof(Double_t));
   
   // 'within class' covariance
   for (Int_t ievt=0; ievt<Data()->GetNEvents(); ievt++) {

      // read the Training Event into "event"
      const Event* ev = GetEvent(ievt);

      Double_t weight = GetTWeight(ev); // may ignore events with negative weights

      for (Int_t x=0; x<nvar; x++) xval[x] = ev->GetValue( x );
      // subtract the mean of the event's own class: column 0 of fMeanMatx holds
      // the signal means, column 1 the background means
      const Int_t mIdx = DataInfo().IsSignal(ev) ? 0 : 1;
      Int_t k=0;
      for (Int_t x=0; x<nvar; x++) {
         for (Int_t y=0; y<nvar; y++) {
            Double_t v = ( (xval[x] - (*fMeanMatx)(x, mIdx))*(xval[y] - (*fMeanMatx)(y, mIdx)) )*weight;
            if (DataInfo().IsSignal(ev)) sumSig[k] += v;
            else                         sumBgd[k] += v;
            k++;
         }
      }
   }
   Int_t k=0;
   for (Int_t x=0; x<nvar; x++) {
      for (Int_t y=0; y<nvar; y++) {
         (*fWith)(x, y) = (sumSig[k] + sumBgd[k])/(fSumOfWeightsS + fSumOfWeightsB);
         k++;
      }
   }

   delete [] sumSig;
   delete [] sumBgd;
   delete [] xval;
}
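//
// In formulae (w_ev the event weight, N_S and N_B the summed signal and
// background weights, xbar_S and xbar_B the class means from fMeanMatx):
//
//    W_xy = [ sum_{signal ev} w_ev*(x - xbar_{S,x})*(y - xbar_{S,y})
//           + sum_{backgd ev} w_ev*(x - xbar_{B,x})*(y - xbar_{B,y}) ] / (N_S + N_B)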

//_______________________________________________________________________
void TMVA::MethodFisher::GetCov_BetweenClass( void )
{
   // the matrix of covariance 'between class' reflects the dispersion of the
   // events of a class relative to the global center of gravity of all classes,
   // and hence the separation between the classes

   // assert required
   assert( fSumOfWeightsS > 0 && fSumOfWeightsB > 0);

   Double_t prodSig, prodBgd;

   for (UInt_t x=0; x<GetNvar(); x++) {
      for (UInt_t y=0; y<GetNvar(); y++) {

         prodSig = ( ((*fMeanMatx)(x, 0) - (*fMeanMatx)(x, 2))*
                     ((*fMeanMatx)(y, 0) - (*fMeanMatx)(y, 2)) );
         prodBgd = ( ((*fMeanMatx)(x, 1) - (*fMeanMatx)(x, 2))*
                     ((*fMeanMatx)(y, 1) - (*fMeanMatx)(y, 2)) );

         (*fBetw)(x, y) = (fSumOfWeightsS*prodSig + fSumOfWeightsB*prodBgd) / (fSumOfWeightsS + fSumOfWeightsB);
      }
   }
}
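//
// In formulae (N_S and N_B the summed signal and background weights; xbar_S,
// xbar_B and xbar the signal, background and overall means from fMeanMatx):
//
//    B_xy = [ N_S*(xbar_{S,x} - xbar_x)*(xbar_{S,y} - xbar_y)
//           + N_B*(xbar_{B,x} - xbar_x)*(xbar_{B,y} - xbar_y) ] / (N_S + N_B)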

//_______________________________________________________________________
void TMVA::MethodFisher::GetCov_Full( void )
{
   // compute full covariance matrix from sum of within and between matrices
   for (UInt_t x=0; x<GetNvar(); x++) 
      for (UInt_t y=0; y<GetNvar(); y++) 
         (*fCov)(x, y) = (*fWith)(x, y) + (*fBetw)(x, y);
}
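//
// Cross-check sketch (illustrative only, not part of the training flow): the
// decomposition T = W + B can be verified with ROOT matrix algebra, e.g.
//
//    TMatrixD check( *fWith );
//    check += *fBetw;          // "check" should now be identical to *fCov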

//_______________________________________________________________________
void TMVA::MethodFisher::GetFisherCoeff( void )
{
   // Fisher = Sum { [coeff]*[variables] }
   //
   // let Xs be the array of the mean values of the variables for signal events,
   // let Xb be the array of the mean values of the variables for background events,
   // let InvWith be the inverse of the 'within class' covariance matrix;
   //
   // the array of Fisher coefficients is then
   // [coeff] = sqrt(fNsig*fNbgd)/fNevt * transpose{Xs-Xb} * InvWith

   // assert required
   assert( fSumOfWeightsS > 0 && fSumOfWeightsB > 0);

   // invert covariance matrix
   TMatrixD* theMat = 0;
   switch (GetFisherMethod()) {
   case kFisher:
      theMat = fWith;
      break;
   case kMahalanobis:
      theMat = fCov;
      break;
   default:
      Log() << kFATAL << "<GetFisherCoeff> undefined method: " << GetFisherMethod() << Endl;
   }

   TMatrixD invCov( *theMat );
   if ( TMath::Abs(invCov.Determinant()) < 10E-24 ) {
      Log() << kWARNING << "<GetFisherCoeff> matrix is almost singular with determinant="
              << TMath::Abs(invCov.Determinant()) 
              << " - did you use variables that are linear combinations of each other or highly correlated?" 
              << Endl;
   }
   if ( TMath::Abs(invCov.Determinant()) < 10E-120 ) {
      Log() << kFATAL << "<GetFisherCoeff> matrix is singular with determinant="
              << TMath::Abs(invCov.Determinant())  
              << " - did you use variables that are linear combinations of each other?" 
              << Endl;
   }

   invCov.Invert();
   
   // apply rescaling factor
   Double_t xfact = TMath::Sqrt( fSumOfWeightsS*fSumOfWeightsB ) / (fSumOfWeightsS + fSumOfWeightsB);

   // compute difference of mean values
   std::vector<Double_t> diffMeans( GetNvar() );
   UInt_t ivar, jvar;
   for (ivar=0; ivar<GetNvar(); ivar++) {
      (*fFisherCoeff)[ivar] = 0;

      for (jvar=0; jvar<GetNvar(); jvar++) {
         Double_t d = (*fMeanMatx)(jvar, 0) - (*fMeanMatx)(jvar, 1);
         (*fFisherCoeff)[ivar] += invCov(ivar, jvar)*d;
      }    
    
      // rescale
      (*fFisherCoeff)[ivar] *= xfact;
   }

   // offset correction
   fF0 = 0.0;
   for (ivar=0; ivar<GetNvar(); ivar++){ 
      fF0 += (*fFisherCoeff)[ivar]*((*fMeanMatx)(ivar, 0) + (*fMeanMatx)(ivar, 1));
   }
   fF0 /= -2.0;  
}
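//
// In formulae, GetFisherCoeff() implements (with M the matrix selected above,
// i.e. fWith for kFisher or fCov for kMahalanobis, and N_S, N_B the summed
// signal and background weights):
//
//    F_i = sqrt(N_S*N_B)/(N_S+N_B) * sum_j (M^-1)_{ij} * ( xbar_{S,j} - xbar_{B,j} )
//    F_0 = -1/2 * sum_i F_i * ( xbar_{S,i} + xbar_{B,i} )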

//_______________________________________________________________________
void TMVA::MethodFisher::GetDiscrimPower( void )
{
   // computation of the discrimination power indicator for each variable:
   // small values of "fWith" indicate compact signal and background classes,
   // large values of "fBetw" indicate a large separation between them;
   //
   // since we want the signal and background classes as compact and as well
   // separated as possible, the discriminating power is defined as the ratio
   // of the diagonal elements of "fBetw" and "fCov" (as computed below)
   for (UInt_t ivar=0; ivar<GetNvar(); ivar++) {
      if ((*fCov)(ivar, ivar) != 0) 
         (*fDiscrimPow)[ivar] = (*fBetw)(ivar, ivar)/(*fCov)(ivar, ivar);
      else
         (*fDiscrimPow)[ivar] = 0;
   }
}

//_______________________________________________________________________
const TMVA::Ranking* TMVA::MethodFisher::CreateRanking() 
{
   // computes ranking of input variables

   // create the ranking object
   fRanking = new Ranking( GetName(), "Discr. power" );

   for (UInt_t ivar=0; ivar<GetNvar(); ivar++) {
      fRanking->AddRank( Rank( GetInputLabel(ivar), (*fDiscrimPow)[ivar] ) );
   }

   return fRanking;
}

//_______________________________________________________________________
void TMVA::MethodFisher::PrintCoefficients( void ) 
{
   // display the Fisher coefficients and the discriminating power of each variable
   Log() << kINFO << "Results for Fisher coefficients:" << Endl;

   if (GetTransformationHandler().GetTransformationList().GetSize() != 0) {
      Log() << kINFO << "NOTE: The coefficients must be applied to TRANFORMED variables" << Endl;
      Log() << kINFO << "  List of the transformation: " << Endl;
      TListIter trIt(&GetTransformationHandler().GetTransformationList());
      while (VariableTransformBase *trf = (VariableTransformBase*) trIt()) {
         Log() << kINFO << "  -- " << trf->GetName() << Endl;
      }
   }
   std::vector<TString>  vars;
   std::vector<Double_t> coeffs;
   for (UInt_t ivar=0; ivar<GetNvar(); ivar++) {
      vars  .push_back( GetInputLabel(ivar) );
      coeffs.push_back(  (*fFisherCoeff)[ivar] );
   }
   vars  .push_back( "(offset)" );
   coeffs.push_back( fF0 );
   TMVA::gTools().FormattedOutput( coeffs, vars, "Variable" , "Coefficient", Log() );   

   if (IsNormalised()) {
      Log() << kINFO << "NOTE: You have chosen to use the \"Normalise\" booking option. Hence, the" << Endl;
      Log() << kINFO << "      coefficients must be applied to NORMALISED (') variables as follows:" << Endl;
      Int_t maxL = 0;
      for (UInt_t ivar=0; ivar<GetNvar(); ivar++) if (GetInputLabel(ivar).Length() > maxL) maxL = GetInputLabel(ivar).Length();

      // Print normalisation expression (see Tools.cxx): "2*(x - xmin)/(xmax - xmin) - 1.0"
      for (UInt_t ivar=0; ivar<GetNvar(); ivar++) {
         Log() << kINFO 
                 << setw(maxL+9) << TString("[") + GetInputLabel(ivar) + "]' = 2*(" 
                 << setw(maxL+2) << TString("[") + GetInputLabel(ivar) + "]"
                 << setw(3) << (GetXmin(ivar) > 0 ? " - " : " + ")
                 << setw(6) << TMath::Abs(GetXmin(ivar)) << setw(3) << ")/"
                 << setw(6) << (GetXmax(ivar) -  GetXmin(ivar) )
                 << setw(3) << " - 1"
                 << Endl;
      }
      Log() << kINFO << "The TMVA Reader will properly account for this normalisation, but if the" << Endl;
      Log() << kINFO << "Fisher classifier is applied outside the Reader, the transformation must be" << Endl;
      Log() << kINFO << "implemented -- or the \"Normalise\" option is removed and Fisher retrained." << Endl;
      Log() << kINFO << Endl;
   }   
}
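//
// Sketch (assumption: applying the printed normalisation by hand outside of
// TMVA, using the xmin/xmax values printed above):
//
//    double xPrime = 2.0*(x - xmin)/(xmax - xmin) - 1.0;   // normalised variable
//    // ... then multiply xPrime by the corresponding Fisher coefficient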
  
//_______________________________________________________________________
void TMVA::MethodFisher::ReadWeightsFromStream( istream& istr )
{
   // read Fisher coefficients from weight file
   istr >> fF0;
   for (UInt_t ivar=0; ivar<GetNvar(); ivar++) istr >> (*fFisherCoeff)[ivar];
}

//_______________________________________________________________________
void TMVA::MethodFisher::AddWeightsXMLTo( void* parent ) const 
{
   // create XML description of Fisher classifier

   void* wght = gTools().AddChild(parent, "Weights");
   gTools().AddAttr( wght, "NCoeff", GetNvar()+1 );
   void* coeffxml = gTools().AddChild(wght, "Coefficient");
   gTools().AddAttr( coeffxml, "Index", 0   );
   gTools().AddAttr( coeffxml, "Value", fF0 );
   for (UInt_t ivar=0; ivar<GetNvar(); ivar++) {
      coeffxml = gTools().AddChild( wght, "Coefficient" );
      gTools().AddAttr( coeffxml, "Index", ivar+1 );
      gTools().AddAttr( coeffxml, "Value", (*fFisherCoeff)[ivar] );
   }
}
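//
// The resulting weight-file fragment looks roughly as follows (the values are
// purely illustrative):
//
//    <Weights NCoeff="3">
//      <Coefficient Index="0" Value="-1.2345e-01"/>    (Index 0 holds the offset F0)
//      <Coefficient Index="1" Value="+8.7654e-01"/>
//      <Coefficient Index="2" Value="-4.3210e-01"/>
//    </Weights>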

//_______________________________________________________________________
void TMVA::MethodFisher::ReadWeightsFromXML( void* wghtnode ) 
{
   // read Fisher coefficients from xml weight file
   UInt_t ncoeff, coeffidx;
   gTools().ReadAttr( wghtnode, "NCoeff", ncoeff );
   fFisherCoeff->resize(ncoeff-1);

   void* ch = gTools().GetChild(wghtnode);
   Double_t coeff;
   while (ch) {
      gTools().ReadAttr( ch, "Index", coeffidx );
      gTools().ReadAttr( ch, "Value", coeff    );
      if (coeffidx==0) fF0 = coeff;
      else             (*fFisherCoeff)[coeffidx-1] = coeff;
      ch = gTools().GetNextChild(ch);
   }
}

//_______________________________________________________________________
void TMVA::MethodFisher::MakeClassSpecific( std::ostream& fout, const TString& className ) const
{
   // write Fisher-specific classifier response
   Int_t dp = fout.precision();
   fout << "   double              fFisher0;" << endl;
   fout << "   std::vector<double> fFisherCoefficients;" << endl;
   fout << "};" << endl;
   fout << "" << endl;
   fout << "inline void " << className << "::Initialize() " << endl;
   fout << "{" << endl;
   fout << "   fFisher0 = " << std::setprecision(12) << fF0 << ";" << endl;
   for (UInt_t ivar=0; ivar<GetNvar(); ivar++) {
      fout << "   fFisherCoefficients.push_back( " << std::setprecision(12) << (*fFisherCoeff)[ivar] << " );" << endl;
   }
   fout << endl;
   fout << "   // sanity check" << endl;
   fout << "   if (fFisherCoefficients.size() != fNvars) {" << endl;
   fout << "      std::cout << \"Problem in class \\\"\" << fClassName << \"\\\"::Initialize: mismatch in number of input values\"" << endl;
   fout << "                << fFisherCoefficients.size() << \" != \" << fNvars << std::endl;" << endl;
   fout << "      fStatusIsClean = false;" << endl;
   fout << "   }         " << endl;
   fout << "}" << endl;
   fout << endl;
   fout << "inline double " << className << "::GetMvaValue__( const std::vector<double>& inputValues ) const" << endl;
   fout << "{" << endl;
   fout << "   double retval = fFisher0;" << endl;
   fout << "   for (size_t ivar = 0; ivar < fNvars; ivar++) {" << endl;
   fout << "      retval += fFisherCoefficients[ivar]*inputValues[ivar];" << endl;
   fout << "   }" << endl;
   fout << endl;
   fout << "   return retval;" << endl;
   fout << "}" << endl;
   fout << endl;
   fout << "// Clean up" << endl;
   fout << "inline void " << className << "::Clear() " << endl;
   fout << "{" << endl;
   fout << "   // clear coefficients" << endl;
   fout << "   fFisherCoefficients.clear(); " << endl;
   fout << "}" << endl;
   fout << std::setprecision(dp);
}
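//
// Usage sketch for the generated standalone class (assuming the usual TMVA
// naming convention "Read<MethodTitle>", e.g. "ReadFisher"; purely
// illustrative, adapt the names to your own setup):
//
//    std::vector<std::string> names;                 // same order as used in the training
//    names.push_back("var1"); names.push_back("var2");
//    ReadFisher fisher( names );
//    std::vector<double> input(2);
//    input[0] = 1.2; input[1] = -0.4;
//    double mva = fisher.GetMvaValue( input );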

//_______________________________________________________________________
void TMVA::MethodFisher::GetHelpMessage() const
{
   // get help message text
   //
   // typical length of text line: 
   //         "|--------------------------------------------------------------|"
   Log() << Endl;
   Log() << gTools().Color("bold") << "--- Short description:" << gTools().Color("reset") << Endl;
   Log() << Endl;
   Log() << "Fisher discriminants select events by distinguishing the mean " << Endl;
   Log() << "values of the signal and background distributions in a trans- " << Endl;
   Log() << "formed variable space where linear correlations are removed." << Endl;
   Log() << Endl;
   Log() << "   (More precisely: the \"linear discriminator\" determines" << Endl;
   Log() << "    an axis in the (correlated) hyperspace of the input " << Endl;
   Log() << "    variables such that, when projecting the output classes " << Endl;
   Log() << "    (signal and background) upon this axis, they are pushed " << Endl;
   Log() << "    as far as possible away from each other, while events" << Endl;
   Log() << "    of a same class are confined in a close vicinity. The  " << Endl;
   Log() << "    linearity property of this classifier is reflected in the " << Endl;
   Log() << "    metric with which \"far apart\" and \"close vicinity\" are " << Endl;
   Log() << "    determined: the covariance matrix of the discriminating" << Endl;
   Log() << "    variable space.)" << Endl;
   Log() << Endl;
   Log() << gTools().Color("bold") << "--- Performance optimisation:" << gTools().Color("reset") << Endl;
   Log() << Endl;
   Log() << "Optimal performance for Fisher discriminants is obtained for " << Endl;
   Log() << "linearly correlated Gaussian-distributed variables. Any deviation" << Endl;
   Log() << "from this ideal reduces the achievable separation power. In " << Endl;
   Log() << "particular, no discrimination at all is achieved for a variable" << Endl;
   Log() << "that has the same sample mean for signal and background, even if " << Endl;
   Log() << "the shapes of the distributions are very different. Thus, Fisher " << Endl;
   Log() << "discriminants often benefit from suitable transformations of the " << Endl;
   Log() << "input variables. For example, if a variable x in [-1,1] has a " << Endl;
   Log() << "a parabolic signal distributions, and a uniform background" << Endl;
   Log() << "distributions, their mean value is zero in both cases, leading " << Endl;
   Log() << "to no separation. The simple transformation x -> |x| renders this " << Endl;
   Log() << "variable powerful for the use in a Fisher discriminant." << Endl;
   Log() << Endl;
   Log() << gTools().Color("bold") << "--- Performance tuning via configuration options:" << gTools().Color("reset") << Endl;
   Log() << Endl;
   Log() << "<None>" << Endl;
}
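//
// Sketch of the |x| example from the help text above (hypothetical user code;
// in TMVA such a transformation can be supplied as a formula when declaring
// the input variable):
//
//    factory->AddVariable( "abs(x)", 'F' );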