FourBinInstructional.py File Reference

Namespaces

namespace  FourBinInstructional
 

Detailed Description

This example is a generalization of the on/off problem.

It is a common setup for SUSY searches. Imagine that one has two variables "x" and "y" (e.g. missing ET and SumET); see the figure below. The signal region has high values of both of these variables (top right). Low values of "x" or "y" act as sidebands. If we just used "y" as a sideband, we would have the on/off problem.

  • In the signal region we observe non events and expect s+b events.
  • In the region with low values of "y" (bottom right) we observe noff events and expect tau*b events. Note the role of tau. In the background-only case:
tau ~ <expectation off> / <expectation on>

If tau is known, this model is sufficient, but often tau is not known exactly. So one can use low values of "x" as an additional constraint for tau. Note that this technique critically depends on the notion that the joint distribution of "x" and "y" can be factorized. Generally, these regions have many events, so the ratio can be measured very precisely there. So we extend the model to describe the two left boxes, denoted with "bar":

  • In the upper left we observe nonbar events and expect bbar events.
  • In the bottom left we observe noffbar events and expect tau*bbar events. Note again we have:
tau ~ <expectation off bar> / <expectation on bar>

One can further expand the model to account for the systematic uncertainty associated with assuming that the distribution of "x" and "y" factorizes (e.g. that tau is the same for off/on and offbar/onbar). This can be done in several ways, but here we introduce an additional parameter rho, so that one set of measurements uses tau and the other tau*rho. The choice is arbitrary, but it has consequences for the numerical stability of the algorithms. The "bar" measurements typically have more events (and smaller relative errors). If we choose

<expectation noffbar> = tau * rho * <expectation nonbar>

the product tau*rho will be known very precisely (~1/sqrt(bbar)) and the contour in those parameters will be narrow and have a non-trivial tau~1/rho shape. However, if we choose to put rho on the non/noff measurements (where the product will have an error ~1/sqrt(b)), the contours will be more amenable to numerical techniques. Thus, here we choose to define:

tau := <expectation off bar> / (<expectation on bar>)
rho := <expectation off> / (<expectation on> * tau)
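
As a rough illustration of the scale involved (using the nominal values b = 100 and bbar = 1000 quoted below): a product constrained by the "bar" measurements is known to roughly

1/sqrt(bbar) = 1/sqrt(1000) ≈ 3%

whereas a product constrained only by the on/off measurements would be known to roughly

1/sqrt(b) = 1/sqrt(100) = 10%.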
^ y
|
|---------------------------+
|               |           |
|     nonbar    |    non    |
|      bbar     |    s+b    |
|               |           |
|---------------+-----------|
|               |           |
|    noffbar    |    noff   |
|    tau bbar   | tau b rho |
|               |           |
+-----------------------------> x

Left in this way, the problem is under-constrained. However, one may have some auxiliary measurement (usually based on Monte Carlo) to constrain rho. Let us call this auxiliary measurement that gives the nominal value of rho "rhonom". Thus, there is a 'constraint' term in the full model: P(rhonom | rho). In this case, we consider a Gaussian constraint with standard deviation sigma.
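
Putting these pieces together, the full model built by the code below is, schematically,

P(non, noff, nonbar, noffbar, rhonom | s, b, tau, rho, bbar) =
    Pois(non     | s + b)
  * Pois(noff    | tau * rho * b)
  * Pois(nonbar  | bbar)
  * Pois(noffbar | tau * bbar)
  * Gauss(rhonom | rho, sigma)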

In the example, the initial values of the parameters are:

- s = 40
- b = 100
- tau = 5
- bbar = 1000
- rho = 1
(sigma for rho = 20%)

and in the toy dataset:

- non = 139
- noff = 528
- nonbar = 993
- noffbar = 4906
- rhonom = 1.27824
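
As a quick cross-check (a minimal sketch, not part of the tutorial code), the expected counts implied by the initial parameter values can be compared with the toy observations above:

# Minimal sanity check (hypothetical helper, not part of the tutorial):
# expected counts in each region for the initial parameter values.
s, b, tau, rho, bbar = 40.0, 100.0, 5.0, 1.0, 1000.0

expected = {
    "non":     s + b,          # 140   (observed 139)
    "noff":    tau * rho * b,  # 500   (observed 528)
    "nonbar":  bbar,           # 1000  (observed 993)
    "noffbar": tau * bbar,     # 5000  (observed 4906)
}
for region, value in expected.items():
    print(f"{region:8s} expected {value:7.1f}")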

Note that the covariance matrix of the parameters has large off-diagonal terms. Clearly s and b are anti-correlated. Similarly, since noffbar >> nonbar, one would expect bbar and tau to be anti-correlated.

This can be seen below.

       GLOBAL       b    bbar     rho       s     tau
b     0.96820   1.000   0.191  -0.942  -0.762  -0.209
bbar  0.91191   0.191   1.000   0.000  -0.146  -0.912
rho   0.96348  -0.942   0.000   1.000   0.718  -0.000
s     0.76250  -0.762  -0.146   0.718   1.000   0.160
tau   0.92084  -0.209  -0.912  -0.000   0.160   1.000

Similarly, since tau*rho appears as a product, we expect rho and tau to be anti-correlated. When the error on rho is significantly larger than 1/sqrt(bbar), tau is essentially known and the correlation is minimal (tau mainly cares about bbar, and rho about b and s). In the alternate parametrization (bbar * tau * rho) the correlation coefficient for rho and tau is large (and negative).
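
To make the comparison concrete, here is a hypothetical sketch (not part of this tutorial, and the workspace name wspace_alt is invented) of how the alternate parametrization could be written with the same workspace-factory syntax used in the code below, attaching rho to the offbar expectation instead of the off expectation:

import ROOT

# Hypothetical alternate parametrization (NOT the one used in this tutorial):
# rho multiplies the offbar expectation, so tau*rho is constrained at the
# ~1/sqrt(bbar) level and the (tau, rho) contour becomes narrow and banana-shaped.
wspace_alt = ROOT.RooWorkspace("wspace_alt")
wspace_alt.factory("Poisson::on(non[0,1000], sum::splusb(s[40,0,100],b[100,0,300]))")
wspace_alt.factory("Poisson::off(noff[0,5000], prod::taub(b,tau[5,3,7]))")
wspace_alt.factory("Poisson::onbar(nonbar[0,10000], bbar[1000,500,2000])")
wspace_alt.factory("Poisson::offbar(noffbar[0,1000000], prod::lambdaoffbar(bbar,tau,rho[1,0,2]))")
wspace_alt.factory("Gaussian::mcCons(rhonom[1.,0,2], rho, sigma[.2])")
wspace_alt.factory("PROD::model(on,off,onbar,offbar,mcCons)")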

The code below uses best practices for RooFit and RooStats as of June 2010.

It proceeds as follows:

  • create a workspace to hold the model
  • use workspace factory to quickly create the terms of the model
  • use workspace factory to define total model (a prod pdf)
  • create a RooStats ModelConfig to specify observables, parameters of interest
  • add to the ModelConfig a prior on the parameters for Bayesian techniques (note: the prior pdf is factorized into parameters of interest and nuisance parameters)
  • visualize the model
  • write the workspace to a file
  • use several of RooStats IntervalCalculators & compare results
[#0] WARNING:InputArguments -- The parameter 'sigma' with range [-1e+30, 1e+30] of the RooGaussian 'mcCons' exceeds the safe range of (0, inf). Advise to limit its range.
[#1] INFO:ObjectHandling -- RooWorkspace::import(wspace) importing dataset modelData
[#1] INFO:InputArguments -- The deprecated RooFit::CloneData(1) option passed to createNLL() is ignored.
[#0] PROGRESS:Minimization -- ProfileLikelihoodCalcultor::DoGLobalFit - find MLE
[#1] INFO:Minimization -- RooAbsMinimizerFcn::setOptimizeConst: activating const optimization
[#1] INFO:Minimization -- The following expressions will be evaluated in cache-and-track mode: (on,off,onbar,offbar,mcCons)
[#0] PROGRESS:Minimization -- ProfileLikelihoodCalcultor::DoMinimizeNLL - using Minuit / with strategy 1
[#1] INFO:Minimization --
RooFitResult: minimized FCN value: 16.2872, estimated distance to minimum: 1.21273e-07
covariance matrix quality: Full, accurate covariance matrix
Status : MINIMIZE=0
    Floating Parameter    FinalValue +/-  Error
  --------------------  --------------------------
                     b    8.3602e+01 +/-  1.39e+01
                  bbar    9.9301e+02 +/-  3.15e+01
                   rho    1.2783e+00 +/-  1.99e-01
                     s    5.5397e+01 +/-  1.78e+01
                   tau    4.9405e+00 +/-  1.72e-01
[#1] INFO:Minimization -- RooAbsMinimizerFcn::setOptimizeConst: activating const optimization
[#1] INFO:Minimization -- The following expressions will be evaluated in cache-and-track mode: (on,off,onbar,offbar,mcCons)
**********
** 1 **SET PRINT 1
**********
**********
** 2 **SET NOGRAD
**********
PARAMETER DEFINITIONS:
NO. NAME VALUE STEP SIZE LIMITS
1 b 1.17958e+02 1.38657e+01 0.00000e+00 3.00000e+02
2 bbar 1.00111e+03 3.15031e+01 5.00000e+02 2.00000e+03
3 rho 9.28979e-01 1.98664e-01 0.00000e+00 2.00000e+00
4 s 1.20959e+01 1.78108e+01 0.00000e+00 1.00000e+02
MINUIT WARNING IN PARAMETR
============== VARIABLE4 BROUGHT BACK INSIDE LIMITS.
5 tau 4.89226e+00 1.71715e-01 3.00000e+00 7.00000e+00
**********
** 3 **SET ERR 0.5
**********
**********
** 4 **SET PRINT 1
**********
**********
** 5 **SET STR 1
**********
NOW USING STRATEGY 1: TRY TO BALANCE SPEED AGAINST RELIABILITY
**********
** 6 **MIGRAD 2500 1
**********
FIRST CALL TO USER FUNCTION AT NEW START POINT, WITH IFLAG=4.
START MIGRAD MINIMIZATION. STRATEGY 1. CONVERGENCE WHEN EDM .LT. 1.00e-03
FCN=18.2144 FROM MIGRAD STATUS=INITIATE 20 CALLS 21 TOTAL
EDM= unknown STRATEGY= 1 NO ERROR MATRIX
EXT PARAMETER CURRENT GUESS STEP FIRST
NO. NAME VALUE ERROR SIZE DERIVATIVE
1 b 1.17958e+02 1.38657e+01 9.47845e-02 -1.99668e-02
2 bbar 1.00111e+03 3.15031e+01 4.45476e-02 -1.41126e-01
3 rho 9.28979e-01 1.98664e-01 2.00529e-01 -1.44935e-02
4 s 1.20959e+01 1.78108e+01 5.77645e-01 -2.24297e+00
5 tau 4.89226e+00 1.71715e-01 8.60896e-02 -8.62115e-02
ERR DEF= 0.5
MIGRAD MINIMIZATION HAS CONVERGED.
MIGRAD WILL VERIFY CONVERGENCE AND ERROR MATRIX.
COVARIANCE MATRIX CALCULATED SUCCESSFULLY
FCN=16.2872 FROM MIGRAD STATUS=CONVERGED 190 CALLS 191 TOTAL
EDM=3.60951e-06 STRATEGY= 1 ERROR MATRIX ACCURATE
EXT PARAMETER STEP FIRST
NO. NAME VALUE ERROR SIZE DERIVATIVE
1 b 8.35905e+01 1.38673e+01 7.30675e-05 2.40056e-02
2 bbar 9.93029e+02 3.15018e+01 5.19045e-05 -3.02691e-02
3 rho 1.27859e+00 1.98755e-01 1.58308e-04 1.62269e-02
4 s 5.54105e+01 1.78121e+01 6.70504e-04 3.43680e-04
5 tau 4.94037e+00 1.71699e-01 9.49080e-05 -2.33172e-02
ERR DEF= 0.5
EXTERNAL ERROR MATRIX. NDIM= 25 NPAR= 5 ERR DEF=0.5
1.930e+02 8.358e+01 -2.620e+00 -1.929e+02 -5.000e-01
8.358e+01 9.930e+02 1.248e-04 -8.355e+01 -4.940e+00
-2.620e+00 1.248e-04 4.008e-02 2.619e+00 -7.467e-07
-1.929e+02 -8.355e+01 2.619e+00 3.319e+02 4.998e-01
-5.000e-01 -4.940e+00 -7.467e-07 4.998e-01 2.955e-02
PARAMETER CORRELATION COEFFICIENTS
NO. GLOBAL 1 2 3 4 5
1 0.96819 1.000 0.191 -0.942 -0.762 -0.209
2 0.91195 0.191 1.000 0.000 -0.146 -0.912
3 0.96348 -0.942 0.000 1.000 0.718 -0.000
4 0.76233 -0.762 -0.146 0.718 1.000 0.160
5 0.92088 -0.209 -0.912 -0.000 0.160 1.000
**********
** 7 **SET ERR 0.5
**********
**********
** 8 **SET PRINT 1
**********
**********
** 9 **HESSE 2500
**********
COVARIANCE MATRIX CALCULATED SUCCESSFULLY
FCN=16.2872 FROM HESSE STATUS=OK 31 CALLS 222 TOTAL
EDM=3.6099e-06 STRATEGY= 1 ERROR MATRIX ACCURATE
EXT PARAMETER INTERNAL INTERNAL
NO. NAME VALUE ERROR STEP SIZE VALUE
1 b 8.35905e+01 1.38734e+01 1.46135e-05 -4.58641e-01
2 bbar 9.93029e+02 3.14977e+01 1.03809e-05 -3.49712e-01
3 rho 1.27859e+00 1.98839e-01 3.16616e-05 2.82321e-01
4 s 5.54105e+01 1.78190e+01 1.34101e-04 1.08422e-01
5 tau 4.94037e+00 1.71676e-01 1.89816e-05 -2.98215e-02
ERR DEF= 0.5
EXTERNAL ERROR MATRIX. NDIM= 25 NPAR= 5 ERR DEF=0.5
1.932e+02 8.358e+01 -2.623e+00 -1.931e+02 -5.000e-01
8.358e+01 9.928e+02 -2.730e-04 -8.358e+01 -4.939e+00
-2.623e+00 -2.730e-04 4.012e-02 2.623e+00 1.633e-06
-1.931e+02 -8.358e+01 2.623e+00 3.321e+02 5.000e-01
-5.000e-01 -4.939e+00 1.633e-06 5.000e-01 2.955e-02
PARAMETER CORRELATION COEFFICIENTS
NO. GLOBAL 1 2 3 4 5
1 0.96822 1.000 0.191 -0.942 -0.763 -0.209
2 0.91193 0.191 1.000 -0.000 -0.146 -0.912
3 0.96351 -0.942 -0.000 1.000 0.718 0.000
4 0.76256 -0.763 -0.146 0.718 1.000 0.160
5 0.92086 -0.209 -0.912 0.000 0.160 1.000
[#1] INFO:Minimization -- RooAbsMinimizerFcn::setOptimizeConst: deactivating const optimization
[#1] INFO:Minimization -- RooProfileLL::evaluate(nll_model_modelData_Profile[s]) Creating instance of MINUIT
[#1] INFO:Minimization -- RooProfileLL::evaluate(nll_model_modelData_Profile[s]) determining minimum likelihood for current configurations w.r.t all observable
[#1] INFO:Minimization -- RooProfileLL::evaluate(nll_model_modelData_Profile[s]) minimum found at (s=55.4077)
.
[#1] INFO:Minimization -- RooProfileLL::evaluate(nll_model_modelData_Profile[s]) Creating instance of MINUIT
[#1] INFO:Minimization -- RooProfileLL::evaluate(nll_model_modelData_Profile[s]) determining minimum likelihood for current configurations w.r.t all observable
[#0] ERROR:InputArguments -- RooArgSet::checkForDup: ERROR argument with name s is already in this set
[#1] INFO:Minimization -- RooProfileLL::evaluate(nll_model_modelData_Profile[s]) minimum found at (s=55.4105)
..........................................................................................................................................................................................................Real time 0:00:02, CP time 2.210
Bayesian Calc. only supports one parameter of interest
Profile Likelihood interval on s = [12.190173059698829, 88.68711581705273]
import ROOT
doBayesian = False
doFeldmanCousins = False
doMCMC = False
# let's time this challenging example
t = ROOT.TStopwatch()
t.Start()
# set RooFit random seed for reproducible results
ROOT.RooRandom.randomGenerator().SetSeed(4357)
# make model
wspace = ROOT.RooWorkspace("wspace")
wspace.factory("Poisson::on(non[0,1000], sum::splusb(s[40,0,100],b[100,0,300]))")
wspace.factory("Poisson::off(noff[0,5000], prod::taub(b,tau[5,3,7],rho[1,0,2]))")
wspace.factory("Poisson::onbar(nonbar[0,10000], bbar[1000,500,2000])")
wspace.factory("Poisson::offbar(noffbar[0,1000000], prod::lambdaoffbar(bbar, tau))")
wspace.factory("Gaussian::mcCons(rhonom[1.,0,2], rho, sigma[.2])")
wspace.factory("PROD::model(on,off,onbar,offbar,mcCons)")
wspace.defineSet("obs", "non,noff,nonbar,noffbar,rhonom")
wspace.factory("Uniform::prior_poi({s})")
wspace.factory("Uniform::prior_nuis({b,bbar,tau, rho})")
wspace.factory("PROD::prior(prior_poi,prior_nuis)")
# ----------------------------------
# Control some interesting variations
# define parameters of interest
# for 1-d plots
wspace.defineSet("poi", "s")
wspace.defineSet("nuis", "b,tau,rho,bbar")
# for 2-d plots to inspect correlations:
# wspace.defineSet("poi","s,rho")
# test simpler cases where parameters are known.
# wspace["tau"].setConstant()
# wspace["rho"].setConstant()
# wspace["b"].setConstant()
# wspace["bbar"].setConstant()
# inspect workspace
# wspace.Print()
# ----------------------------------------------------------
# Generate toy data
# generate toy data assuming current value of the parameters
# import into workspace.
# add Verbose() to see how it's being generated
data = wspace["model"].generate(wspace.set("obs"), 1)
# data.Print("v")
wspace.Import(data)
# ----------------------------------
# Now the statistical tests
# model config
modelConfig = ROOT.RooStats.ModelConfig("FourBins")
modelConfig.SetWorkspace(wspace)
modelConfig.SetPdf(wspace["model"])
modelConfig.SetPriorPdf(wspace["prior"])
modelConfig.SetParametersOfInterest(wspace.set("poi"))
modelConfig.SetNuisanceParameters(wspace.set("nuis"))
wspace.Import(modelConfig)
wspace.writeToFile("FourBin.root")
# -------------------------------------------------
# If you want to see the covariance matrix uncomment
# wspace["model"].fitTo(data)
# use ProfileLikelihood
plc = ROOT.RooStats.ProfileLikelihoodCalculator(data, modelConfig)
plc.SetConfidenceLevel(0.95)
plInt = plc.GetInterval()
msglevel = ROOT.RooMsgService.instance().globalKillBelow()
ROOT.RooMsgService.instance().setGlobalKillBelow(ROOT.RooFit.FATAL)
plInt.LowerLimit(wspace["s"]) # get ugly print out of the way. Fix.
ROOT.RooMsgService.instance().setGlobalKillBelow(msglevel)
# use FeldmanCousins (takes ~20 min)
fc = ROOT.RooStats.FeldmanCousins(data, modelConfig)
fc.SetConfidenceLevel(0.95)
# number counting: dataset always has 1 entry with N events observed
fc.FluctuateNumDataEntries(False)
fc.UseAdaptiveSampling(True)
fc.SetNBins(40)
fcInt = ROOT.RooStats.PointSetInterval()
if doFeldmanCousins:  # takes 7 minutes
    fcInt = fc.GetInterval()
# use BayesianCalculator (only 1-d parameter of interest, slow for this problem)
bc = ROOT.RooStats.BayesianCalculator(data, modelConfig)
bc.SetConfidenceLevel(0.95)
bInt = ROOT.RooStats.SimpleInterval()
if doBayesian and len(wspace.set("poi")) == 1:
    bInt = bc.GetInterval()
else:
    print("Bayesian Calc. only supports one parameter of interest")
# use MCMCCalculator (takes about 1 min)
# Want an efficient proposal function, so derive it from covariance
# matrix of fit
fit = wspace["model"].fitTo(data, Save=True)
ph = ROOT.RooStats.ProposalHelper()
ph.SetVariables(fit.floatParsFinal())
ph.SetCovMatrix(fit.covarianceMatrix())
ph.SetUpdateProposalParameters(ROOT.kTRUE) # auto-create mean vars and add mappings
ph.SetCacheSize(100)
pf = ph.GetProposalFunction()
mc = ROOT.RooStats.MCMCCalculator(data, modelConfig)
mc.SetConfidenceLevel(0.95)
mc.SetProposalFunction(pf)
mc.SetNumBurnInSteps(500) # first N steps to be ignored as burn-in
mc.SetNumIters(50000)
mc.SetLeftSideTailFraction(0.5) # make a central interval
mcInt = ROOT.RooStats.MCMCInterval()
if doMCMC:
    mcInt = mc.GetInterval()
# ----------------------------------
# Make some plots
c1 = ROOT.gROOT.Get("c1")
if not c1:
    c1 = ROOT.TCanvas("c1")
if doBayesian and doMCMC:
    c1.Divide(3)
    c1.cd(1)
elif doBayesian or doMCMC:
    c1.Divide(2)
    c1.cd(1)
lrplot = ROOT.RooStats.LikelihoodIntervalPlot(plInt)
lrplot.Draw()
if doBayesian and len(wspace.set("poi")) == 1:
    c1.cd(2)
    # the plot takes a long time and prints lots of errors;
    # using a scan is better
    bc.SetScanOfPosterior(20)
    bplot = bc.GetPosteriorPlot()
    bplot.Draw()
if doMCMC:
    if doBayesian and len(wspace.set("poi")) == 1:
        c1.cd(3)
    else:
        c1.cd(2)
    mcPlot = ROOT.RooStats.MCMCIntervalPlot(mcInt)
    mcPlot.Draw()
# ----------------------------------
# query intervals
print(
    "Profile Likelihood interval on s = [{}, {}]".format(plInt.LowerLimit(wspace["s"]), plInt.UpperLimit(wspace["s"]))
)
# Profile Likelihood interval on s = [12.1902, 88.6871]
if doBayesian and len(wspace.set("poi")) == 1:
    print("Bayesian interval on s = [{}, {}]".format(bInt.LowerLimit(), bInt.UpperLimit()))
if doFeldmanCousins:
    print(
        "Feldman Cousins interval on s = [{}, {}]".format(fcInt.LowerLimit(wspace["s"]), fcInt.UpperLimit(wspace["s"]))
    )
# Feldman Cousins interval on s = [18.75 +/- 2.45, 83.75 +/- 2.45]
if doMCMC:
    print("MCMC interval on s = [{}, {}]".format(mcInt.LowerLimit(wspace["s"]), mcInt.UpperLimit(wspace["s"])))
# MCMC interval on s = [15.7628, 84.7266]
t.Stop()
t.Print()
c1.SaveAs("FourBinInstructional.png")
# TODO: The calculators have to be destructed first. Otherwise, we can get
# segmentation faults depending on the destruction order, which is random in
# Python. Probably the issue is that some object has a non-owning pointer to
# another object, which it uses in its destructor. This should be fixed either
# in the design of RooStats in C++, or with pythonizations.
del plc
del bc
del mc
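
As a possible follow-up (a minimal sketch, not part of the tutorial; the names infile, wspace2 and modelConfig2 are invented), the workspace written to FourBin.root above can be re-opened and the ModelConfig retrieved for further studies:

import ROOT

# Hypothetical follow-up: re-open the workspace saved by the tutorial and
# retrieve the ModelConfig named "FourBins" that was imported into it.
infile = ROOT.TFile.Open("FourBin.root")
wspace2 = infile.Get("wspace")
modelConfig2 = wspace2.obj("FourBins")  # the ModelConfig imported above
wspace2.Print()
infile.Close()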
Date
July 2022
Authors
Artem Busorgin, Kyle Cranmer and Tanja Rommerskirchen (C++ version)

Definition in file FourBinInstructional.py.