Statistical Methods in Particle Physics Lecture 1: Bayesian methods

Statistical Methods in Particle Physics
Lecture 1: Bayesian methods

SUSSP65, St Andrews, 16-29 August 2009

Glen Cowan
Physics Department, Royal Holloway, University of London
g.cowan@rhul.ac.uk
www.pp.rhul.ac.uk/~cowan

Outline

Lecture #1: An introduction to Bayesian statistical methods
- Role of probability in data analysis (frequentist, Bayesian)
- A simple fitting problem: frequentist vs. Bayesian solution
- Bayesian computation, Markov Chain Monte Carlo

Lecture #2: Setting limits, making a discovery
- Frequentist vs. Bayesian approach, treatment of systematic uncertainties

Lecture #3: Multivariate methods for HEP
- Event selection as a statistical test
- Neyman-Pearson lemma and likelihood ratio test
- Some multivariate classifiers: NN, BDT, SVM, ...

Data analysis in particle physics

We observe events of a certain type and measure characteristics of each event (particle momenta, number of muons, energy of jets, ...). Theories (e.g. the Standard Model) predict the distributions of these properties up to free parameters, e.g., $\alpha$, $G_F$, $M_Z$, $\alpha_s$, $m_H$, ...

Some tasks of data analysis:
- Estimate (measure) the parameters;
- Quantify the uncertainty of the parameter estimates;
- Test the extent to which the predictions of a theory are in agreement with the data (presence of New Physics?)

Dealing with uncertainty

In particle physics there are various elements of uncertainty:
- the theory is not deterministic (quantum mechanics);
- random measurement errors, present even without quantum effects;
- things we could know in principle but don't, e.g., because of limitations of cost, time, ...

We can quantify the uncertainty using PROBABILITY.

A definition of probability

Consider a set $S$ with subsets $A, B, \ldots$

Kolmogorov axioms (1933):
1. For all $A \subset S$, $P(A) \ge 0$;
2. $P(S) = 1$;
3. If $A \cap B = \emptyset$, then $P(A \cup B) = P(A) + P(B)$.

Also define the conditional probability of $A$ given $B$:
$$P(A|B) = \frac{P(A \cap B)}{P(B)}$$

Interpretation of probability

I. Relative frequency: $A, B, \ldots$ are outcomes of a repeatable experiment (cf. quantum mechanics, particle scattering, radioactive decay, ...).

II. Subjective probability: $A, B, \ldots$ are hypotheses (statements that are true or false).

Both interpretations are consistent with the Kolmogorov axioms. In particle physics the frequency interpretation is often the most useful, but subjective probability can provide a more natural treatment of non-repeatable phenomena: systematic uncertainties, the probability that the Higgs boson exists, ...

Bayes' theorem

From the definition of conditional probability we have
$$P(A|B) = \frac{P(A \cap B)}{P(B)} \qquad \mbox{and} \qquad P(B|A) = \frac{P(B \cap A)}{P(A)},$$
but $P(A \cap B) = P(B \cap A)$, so
$$P(A|B) = \frac{P(B|A)\,P(A)}{P(B)} \qquad \mbox{(Bayes' theorem)}$$

First published (posthumously) by the Reverend Thomas Bayes (1702-1761): An essay towards solving a problem in the doctrine of chances, Philos. Trans. R. Soc. 53 (1763) 370; reprinted in Biometrika 45 (1958) 293.

The law of total probability

Consider a subset $B$ of the sample space $S$, with $S$ divided into disjoint subsets $A_i$ such that $\bigcup_i A_i = S$. Then
$$P(B) = \sum_i P(B \cap A_i) = \sum_i P(B|A_i)\,P(A_i) \qquad \mbox{(law of total probability)}$$
and Bayes' theorem becomes
$$P(A|B) = \frac{P(B|A)\,P(A)}{\sum_i P(B|A_i)\,P(A_i)}$$
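
As a concrete numerical illustration of these two formulas (not from the slides; the event fractions and selection efficiencies below are invented), consider asking for the probability that a selected event is signal:

```python
# Hypothetical worked example of Bayes' theorem with the law of
# total probability; all numbers are invented for illustration.

p_signal = 0.001              # prior probability P(signal)
p_background = 1 - p_signal   # prior probability P(background)

p_sel_given_sig = 0.95        # P(selected | signal)
p_sel_given_bkg = 0.01        # P(selected | background)

# Law of total probability: P(selected)
p_selected = (p_sel_given_sig * p_signal
              + p_sel_given_bkg * p_background)

# Bayes' theorem: P(signal | selected)
p_sig_given_sel = p_sel_given_sig * p_signal / p_selected
print(f"P(signal | selected) = {p_sig_given_sel:.3f}")  # ~0.087
```

Note that even with a highly efficient selection, the posterior probability of signal stays small because the prior fraction of signal is tiny: the prior matters.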

Frequentist statistics: general philosophy

In frequentist statistics, probabilities are associated only with the data, i.e., outcomes of repeatable observations. Probability = limiting frequency.

Probabilities such as P(Higgs boson exists) or P(0.117 < $\alpha_s$ < 0.121) are either 0 or 1, but we don't know which. The tools of frequentist statistics tell us what to expect, under the assumption of certain probabilities, about hypothetical repeated observations. The preferred theories (models, hypotheses, ...) are those for which our observations would be considered "usual".

Bayesian statistics: general philosophy

In Bayesian statistics the interpretation of probability is extended to degree of belief (subjective probability), which can be used for hypotheses:
$$P(H|x) = \frac{P(x|H)\,\pi(H)}{\sum_i P(x|H_i)\,\pi(H_i)}$$
Here $P(x|H)$ is the probability of the data assuming the hypothesis $H$ (the likelihood), $\pi(H)$ is the prior probability (i.e., before seeing the data), $P(H|x)$ is the posterior probability (i.e., after seeing the data), and the normalization involves a sum over all possible hypotheses.

Bayesian methods can provide a more natural treatment of non-repeatable phenomena: systematic uncertainties, the probability that the Higgs boson exists, ... But there is no golden rule for priors ("if-then" character of Bayes' theorem).

Statistical vs. systematic errors

Statistical errors: How much would the result fluctuate upon repetition of the measurement? This implies some set of assumptions to define the probability of the outcome of the measurement.

Systematic errors: What is the uncertainty in my result due to uncertainty in my assumptions? E.g., model (theoretical) uncertainty; modelling of the measurement apparatus. Usually taken to mean that the sources of error do not vary upon repetition of the measurement. They often result from the uncertain values of calibration constants, efficiencies, etc.

Systematic errors and nuisance parameters

The model prediction (including e.g. detector effects) is never the same as the "true prediction" of the theory. [Figure: sketch of y vs. x showing the model curve alongside the truth curve.]

The model can be made to approximate the truth better by including more free parameters:
systematic uncertainty <-> nuisance parameters

Example: fitting a straight line

Data: $(x_i, y_i)$, $i = 1, \ldots, n$.

Model: the measured $y_i$ are independent and Gaussian,
$$y_i \sim \mbox{Gauss}(\mu(x_i), \sigma_i), \qquad \mu(x; \theta_0, \theta_1) = \theta_0 + \theta_1 x,$$
where we assume the $x_i$ and $\sigma_i$ are known.

Goal: estimate $\theta_0$ (we don't care about $\theta_1$).

Frequentist approach

Minimize
$$\chi^2(\theta_0, \theta_1) = \sum_{i=1}^{n} \frac{(y_i - \mu(x_i; \theta_0, \theta_1))^2}{\sigma_i^2}$$
to obtain the estimators $\hat{\theta}_0$, $\hat{\theta}_1$. The standard deviations are found from the tangent lines to the contour $\chi^2 = \chi^2_{\rm min} + 1$. The correlation between $\hat{\theta}_0$ and $\hat{\theta}_1$ causes the errors to increase.
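
A minimal numerical sketch of this fit (not from the slides; the data values and uncertainties are invented for illustration). Since the model is linear in the parameters, the exact least-squares solution and covariance can be written in closed form:

```python
import numpy as np

# Invented toy data: x values, measured y values, and their known
# Gaussian standard deviations.
x = np.array([1.0, 2.0, 3.0, 4.0, 5.0])
y = np.array([1.7, 2.3, 3.4, 3.6, 4.9])
sigma = np.full_like(x, 0.3)

# Design matrix for mu(x) = theta0 + theta1 * x
A = np.column_stack([np.ones_like(x), x])
W = np.diag(1.0 / sigma**2)          # weight matrix V^{-1}

# Exact solution minimizing chi^2 = (y - A theta)^T W (y - A theta)
cov = np.linalg.inv(A.T @ W @ A)     # covariance of (theta0_hat, theta1_hat)
theta_hat = cov @ (A.T @ W @ y)

print("theta0_hat = %.3f +/- %.3f" % (theta_hat[0], np.sqrt(cov[0, 0])))
print("theta1_hat = %.3f +/- %.3f" % (theta_hat[1], np.sqrt(cov[1, 1])))
print("correlation =", cov[0, 1] / np.sqrt(cov[0, 0] * cov[1, 1]))
```

The (negative) correlation coefficient printed at the end is exactly what inflates the error on $\hat{\theta}_0$ relative to a fit with $\theta_1$ fixed.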

Frequentist case with a measurement t1 of θ1

Suppose we have an independent measurement $t_1$ of $\theta_1$, with $t_1 \sim \mbox{Gauss}(\theta_1, \sigma_{t_1})$. It can be included by adding a term to the chi-square:
$$\chi^2(\theta_0, \theta_1) = \sum_{i=1}^{n} \frac{(y_i - \mu(x_i; \theta_0, \theta_1))^2}{\sigma_i^2} + \frac{(\theta_1 - t_1)^2}{\sigma_{t_1}^2}$$
The information on $\theta_1$ improves the accuracy of $\hat{\theta}_0$.
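
A sketch of this constrained fit with a numerical minimizer (same invented toy data as above; the value of $t_1$ and its uncertainty are hypothetical):

```python
import numpy as np
from scipy.optimize import minimize

x = np.array([1.0, 2.0, 3.0, 4.0, 5.0])
y = np.array([1.7, 2.3, 3.4, 3.6, 4.9])
sigma = 0.3
t1, sigma_t1 = 0.8, 0.2      # hypothetical measurement of theta1

def chi2(theta):
    """chi^2 with the extra (theta1 - t1)^2 / sigma_t1^2 constraint term."""
    resid = (y - (theta[0] + theta[1] * x)) / sigma
    return (resid ** 2).sum() + ((theta[1] - t1) / sigma_t1) ** 2

res = minimize(chi2, x0=[1.0, 1.0])    # default BFGS
cov = 2.0 * res.hess_inv               # chi^2 = -2 ln L, so cov ~ 2 H^{-1} (approximate)
print("theta_hat =", res.x)
print("sigma(theta0_hat) ~", np.sqrt(cov[0, 0]))
```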

Bayesian method

We need to associate prior probabilities with $\theta_0$ and $\theta_1$, e.g.,
$$\pi(\theta_0, \theta_1) = \pi_0(\theta_0)\,\pi_1(\theta_1),$$
where $\pi_0(\theta_0)$ is constant (it reflects prior ignorance; in any case it is much broader than the likelihood) and $\pi_1(\theta_1) = \mbox{Gauss}(t_1, \sigma_{t_1})$ is based on the previous measurement.

Putting this into Bayes' theorem gives
$$p(\theta_0, \theta_1 | \vec{y}) \propto L(\vec{y} | \theta_0, \theta_1)\,\pi_0(\theta_0)\,\pi_1(\theta_1)$$
(posterior ∝ likelihood × prior).

Bayesian method (continued)

We then integrate (marginalize) $p(\theta_0, \theta_1 | \vec{y})$ to find $p(\theta_0 | \vec{y})$:
$$p(\theta_0 | \vec{y}) = \int p(\theta_0, \theta_1 | \vec{y})\, d\theta_1$$
In this example we can do the integral in closed form (rare), and the posterior for $\theta_0$ comes out Gaussian. Usually numerical methods (e.g. Markov Chain Monte Carlo) are needed to do the integral.
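
Before turning to MCMC, a minimal sketch of doing this marginalization by brute force on a grid (feasible here because there are only two parameters; toy data and the $t_1$ constraint are the invented values used above):

```python
import numpy as np

x = np.array([1.0, 2.0, 3.0, 4.0, 5.0])
y = np.array([1.7, 2.3, 3.4, 3.6, 4.9])
sigma = 0.3
t1, sigma_t1 = 0.8, 0.2      # hypothetical previous measurement of theta1

th0 = np.linspace(0.0, 2.0, 400)
th1 = np.linspace(0.0, 1.6, 400)
T0, T1 = np.meshgrid(th0, th1, indexing="ij")

# log posterior = log likelihood + log prior (flat in theta0,
# Gaussian in theta1), up to an additive constant
mu = T0[..., None] + T1[..., None] * x            # shape (400, 400, 5)
log_post = -0.5 * (((y - mu) / sigma) ** 2).sum(axis=-1)
log_post += -0.5 * ((T1 - t1) / sigma_t1) ** 2

post = np.exp(log_post - log_post.max())
p_th0 = post.sum(axis=1)                          # marginalize over theta1
p_th0 /= p_th0.sum() * (th0[1] - th0[0])          # normalize as a density

mean = (th0 * p_th0).sum() * (th0[1] - th0[0])
std = np.sqrt(((th0 - mean) ** 2 * p_th0).sum() * (th0[1] - th0[0]))
print(f"theta0: posterior mean = {mean:.3f}, std dev = {std:.3f}")
```

The grid approach scales exponentially with the number of parameters, which is exactly why MCMC is needed in realistic problems.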

Digression: marginalization with MCMC

Bayesian computations involve integrals like
$$p(\theta_0 | \vec{y}) = \int p(\theta_0, \theta_1 | \vec{y})\, d\theta_1,$$
often of high dimensionality and impossible in closed form, and also impossible with normal acceptance-rejection Monte Carlo. Markov Chain Monte Carlo (MCMC) has revolutionized Bayesian computation. MCMC (e.g., the Metropolis-Hastings algorithm) generates a correlated sequence of random numbers: it cannot be used for many applications, e.g., detector MC, and the effective statistical error is greater than for uncorrelated points. The basic idea: sample the full multidimensional distribution, then look, e.g., only at the distribution of the parameters of interest.

Example: posterior pdf from MCMC

Sample the posterior pdf from the previous example with MCMC, then summarize the pdf of the parameter of interest with, e.g., mean, median, standard deviation, etc. Although the numerical values of the answer here are the same as in the frequentist case, the interpretation is different (sometimes unimportant?).

MCMC basics: Metropolis-Hastings algorithm

Goal: given an $n$-dimensional pdf $p(\vec{\theta})$, generate a sequence of points $\vec{\theta}_1, \vec{\theta}_2, \vec{\theta}_3, \ldots$

1) Start at some point $\vec{\theta}_0$.
2) Generate a proposal $\vec{\theta}'$ from a proposal density $q(\vec{\theta}'; \vec{\theta}_0)$, e.g., a Gaussian centred about $\vec{\theta}_0$.
3) Form the Hastings test ratio
$$\alpha = \min\left[1,\; \frac{p(\vec{\theta}')\,q(\vec{\theta}_0; \vec{\theta}')}{p(\vec{\theta}_0)\,q(\vec{\theta}'; \vec{\theta}_0)}\right]$$
4) Generate $u \sim \mbox{Uniform}(0, 1)$.
5) If $u \le \alpha$, move to the proposed point, $\vec{\theta}_1 = \vec{\theta}'$; else stay at the old point, $\vec{\theta}_1 = \vec{\theta}_0$ (old point repeated).
6) Iterate.
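
A minimal, self-contained sketch of these steps in Python. It uses a symmetric Gaussian proposal, so the Hastings ratio reduces to $p(\vec{\theta}')/p(\vec{\theta})$ (see the next slide); the target density, step size, and chain length are chosen only for illustration:

```python
import numpy as np

def metropolis_hastings(log_p, theta0, n_steps, step_size, rng):
    """Sample from p(theta) given its log density, using a symmetric
    Gaussian proposal (Hastings ratio reduces to p'/p)."""
    theta = np.asarray(theta0, dtype=float)
    log_p_theta = log_p(theta)
    chain = np.empty((n_steps, len(theta)))
    for i in range(n_steps):
        proposal = theta + step_size * rng.standard_normal(len(theta))  # step 2
        log_p_prop = log_p(proposal)
        # steps 3-5: accept with probability min(1, p'/p)
        if np.log(rng.uniform()) <= log_p_prop - log_p_theta:
            theta, log_p_theta = proposal, log_p_prop
        chain[i] = theta            # if rejected, the old point is repeated
    return chain

# Illustrative target: a correlated 2-D Gaussian posterior.
cov = np.array([[1.0, 0.8], [0.8, 1.0]])
cov_inv = np.linalg.inv(cov)
log_p = lambda th: -0.5 * th @ cov_inv @ th

rng = np.random.default_rng(1)
chain = metropolis_hastings(log_p, [0.0, 0.0], 20000, 0.9, rng)
chain = chain[2000:]                # discard burn-in
print("means:", chain.mean(axis=0), " std devs:", chain.std(axis=0))
```

Marginalizing is then trivial: to study one parameter of interest, simply histogram the corresponding column of the chain.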

Metropolis-Hastings (continued)

This rule produces a correlated sequence of points (note how each new point depends on the previous one). For our purposes this correlation is not fatal, but the statistical errors are larger than they would be with uncorrelated points. The proposal density can be (almost) anything, but it should be chosen so as to minimize the autocorrelation. Often the proposal density is taken to be symmetric, $q(\vec{\theta}'; \vec{\theta}) = q(\vec{\theta}; \vec{\theta}')$, and the test ratio is then (Metropolis-Hastings)
$$\alpha = \min\left[1,\; \frac{p(\vec{\theta}')}{p(\vec{\theta})}\right]$$
I.e., if the proposed step is to a point of higher $p(\vec{\theta})$, take it; if not, take the step only with probability $p(\vec{\theta}')/p(\vec{\theta})$. If the proposed step is rejected, hop in place.

Metropolis-Hastings caveats

Actually one can only prove that the sequence of points follows the desired pdf in the limit where it runs forever. There may be a burn-in period where the sequence does not initially follow $p(\vec{\theta})$. Unfortunately there are few useful theorems to tell us when the sequence has converged. Look at trace plots and autocorrelation; check the result with a different proposal density. If you think it's converged, try starting from a different point and see if the result is similar.
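
A minimal sketch of the autocorrelation check mentioned above. The synthetic AR(1) chain below is a stand-in (not from the slides) for one parameter of an MCMC chain; the crude integrated autocorrelation time tells you roughly how many correlated points correspond to one effectively independent point:

```python
import numpy as np

def autocorr(x, max_lag):
    """Normalized autocorrelation of a 1-D chain, up to max_lag."""
    x = np.asarray(x, dtype=float) - np.mean(x)
    var = np.mean(x * x)
    return np.array([np.mean(x[:len(x) - k] * x[k:]) / var
                     for k in range(max_lag + 1)])

# Synthetic correlated chain (AR(1) process) standing in for one
# parameter of an MCMC chain.
rng = np.random.default_rng(2)
chain = np.empty(20000)
chain[0] = 0.0
for i in range(1, len(chain)):
    chain[i] = 0.9 * chain[i - 1] + rng.standard_normal()

rho = autocorr(chain, 100)
tau = 1.0 + 2.0 * rho[1:].sum()     # crude integrated autocorrelation time
print(f"tau ~ {tau:.1f}, effective sample size ~ {len(chain) / tau:.0f}")
```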

Bayesian method with alternative priors

Suppose we don't have a previous measurement of $\theta_1$ but rather, e.g., a theorist says it should be positive and not too much greater than 0.1 "or so", i.e., something like an exponential prior,
$$\pi_1(\theta_1) \propto e^{-\theta_1/0.1} \quad (\theta_1 \ge 0; \mbox{ zero otherwise}).$$
From this we obtain (numerically) the posterior pdf for $\theta_0$, which summarizes all knowledge about $\theta_0$. Look also at the result from a variety of priors.

A more general fit (symbolic)

Given measurements $y_i$, $i = 1, \ldots, n$, and (usually) covariances $V_{ij} = \mbox{cov}[y_i, y_j]$, the predicted value is
$$E[y_i] = \mu(x_i; \vec{\theta}) + b_i,$$
i.e., the expectation value with control variable $x$, parameters $\vec{\theta}$, and a bias $b_i$. Often the $y_i$ are taken to be Gaussian distributed; then minimize
$$\chi^2(\vec{\theta}) = (\vec{y} - \vec{\mu}(\vec{\theta}))^T V^{-1} (\vec{y} - \vec{\mu}(\vec{\theta}))$$
This is equivalent to maximizing $L(\vec{\theta}) \propto e^{-\chi^2(\vec{\theta})/2}$, i.e., least squares is the same as maximum likelihood using a Gaussian likelihood function.
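
A minimal sketch of evaluating this generalized chi-square with a full covariance matrix (the measurements, covariance, and linear model are invented for illustration):

```python
import numpy as np

# Invented measurements with correlated errors, and a linear model.
y = np.array([1.2, 1.9, 3.1])
V = np.array([[0.04, 0.01, 0.00],
              [0.01, 0.04, 0.01],
              [0.00, 0.01, 0.04]])
x = np.array([1.0, 2.0, 3.0])

def chi2(theta):
    """chi^2(theta) = r^T V^{-1} r with residuals r = y - mu(x; theta)."""
    r = y - (theta[0] + theta[1] * x)
    return r @ np.linalg.solve(V, r)   # avoids forming V^{-1} explicitly

print("chi2 at (0.2, 1.0):", chi2([0.2, 1.0]))
```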

Its Bayesian equivalent

Take a Gaussian prior for the biases, e.g., $\pi_b(\vec{b}) = \prod_i \mbox{Gauss}(b_i; 0, \sigma_{b_i})$, and a flat prior $\pi_\theta(\vec{\theta})$. Write the joint probability for all parameters and use Bayes' theorem:
$$p(\vec{\theta}, \vec{b} | \vec{y}) \propto L(\vec{y} | \vec{\theta}, \vec{b})\,\pi_\theta(\vec{\theta})\,\pi_b(\vec{b})$$
To get the desired probability for $\vec{\theta}$, integrate (marginalize) over $\vec{b}$:
$$p(\vec{\theta} | \vec{y}) = \int p(\vec{\theta}, \vec{b} | \vec{y})\, d\vec{b}$$
The posterior is Gaussian with mode the same as the least-squares estimator, and $\sigma_\theta$ the same as from $\chi^2 = \chi^2_{\rm min} + 1$. (Back where we started!)

Alternative priors for systematic errors

A Gaussian prior for the bias $b$ is often not realistic, especially if one considers the "error on the error". Incorporating this can give a prior $\pi_b(b)$ with longer tails, e.g., a Gaussian whose width is itself uncertain, averaged over a distribution $\pi_s(s)$ for a scale factor $s$; the standard deviation of $\pi_s(s)$, $\sigma_s$, represents the "error on the error". [Figure: $\pi_b(b)$ vs. $b$, showing the longer-tailed prior.]
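
A sketch (with invented parameters, and one possible assumed form for $\pi_s$) of building such a longer-tailed prior numerically, by averaging a Gaussian of width $s\,\sigma_b$ over $\pi_s(s)$:

```python
import numpy as np

sigma_b = 0.1     # nominal systematic error (invented)
sigma_s = 0.5     # "error on the error" (invented)

s = np.linspace(1e-3, 4.0, 2000)
# Assumed form for illustration: pi_s(s) Gaussian with mean 1 and
# std dev sigma_s, truncated to s > 0.
pi_s = np.exp(-0.5 * ((s - 1.0) / sigma_s) ** 2)
pi_s /= pi_s.sum() * (s[1] - s[0])

def pi_b(b):
    """Marginal prior: integral of Gauss(b; 0, s*sigma_b) over pi_s(s)."""
    gauss = (np.exp(-0.5 * (b / (s * sigma_b)) ** 2)
             / (np.sqrt(2 * np.pi) * s * sigma_b))
    return (gauss * pi_s).sum() * (s[1] - s[0])

# The tails are heavier than those of the nominal Gaussian:
for b in (0.0, 0.2, 0.4):
    nominal = np.exp(-0.5 * (b / sigma_b) ** 2) / (np.sqrt(2 * np.pi) * sigma_b)
    print(f"b={b:.1f}: pi_b={pi_b(b):.4g}, nominal Gauss={nominal:.4g}")
```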

A simple test

Suppose the fit effectively averages four measurements. Take σ_sys = σ_stat = 0.1, uncorrelated.

Case #1: the data appear compatible. [Figure: the four measurements vs. experiment number, and the posterior p(µ|y) vs. µ.]

Usually we summarize the posterior p(µ|y) with the mode and standard deviation.

Simple test with inconsistent data

Case #2: there is an outlier. [Figure: the four measurements, one an outlier, and the posterior p(µ|y) vs. µ.]

The Bayesian fit is less sensitive to the outlier. (See also D'Agostini 1999; Dose & von der Linden 1999.)

Goodness-of-fit vs. size of error

In the LS fit, the value of the minimized χ² does not affect the size of the error on the fitted parameter. In the Bayesian analysis with a non-Gaussian prior for the systematics, a high χ² corresponds to a larger error (and vice versa). [Figure: posterior σ_µ vs. σ_µ from least squares, for 2000 repetitions of the experiment with σ_s = 0.5 and no actual bias.]

Summary of lecture 1

The distinctive features of Bayesian statistics are:
- Subjective probability is used for hypotheses (e.g. for a parameter).
- Bayes' theorem relates the probability of the data given H (the likelihood) to the posterior probability of H given the data:
$$P(H|x) \propto P(x|H)\,\pi(H)$$
- This requires a prior probability for H.

Bayesian methods often yield answers that are close (or identical) to those of frequentist statistics, albeit with a different interpretation. This is not the case when the prior information is important relative to that contained in the data.

Extra slides

Some Bayesian references

- P. Gregory, Bayesian Logical Data Analysis for the Physical Sciences, CUP, 2005
- D. Sivia, Data Analysis: A Bayesian Tutorial, OUP, 2006
- S. Press, Subjective and Objective Bayesian Statistics: Principles, Models and Applications, 2nd ed., Wiley, 2003
- A. O'Hagan, Kendall's Advanced Theory of Statistics, Vol. 2B: Bayesian Inference, Arnold Publishers, 1994
- A. Gelman et al., Bayesian Data Analysis, 2nd ed., CRC, 2004
- W. Bolstad, Introduction to Bayesian Statistics, Wiley, 2004
- E.T. Jaynes, Probability Theory: The Logic of Science, CUP, 2003