Method of Feldman and Cousins for the construction of classical confidence belts
Ulrike Schnoor, IKTP TU Dresden, ATLAS Seminar
U. Schnoor (IKTP TU DD), Feldman-Cousins, ATLAS Seminar, 1 / 23

Feldman-Cousins method
A new method that solves two problems of small-signal analysis at the same time:
- flip-flopping (choosing the interval type based on the data), which spoils the coverage, so the reported intervals are no longer proper confidence intervals
- unphysical intervals (the empty set)
for both upper limits and central confidence intervals.

Outline
1. Reminder: confidence intervals
2. Flip-flopping and unphysical confidence intervals
3. Solution with the Feldman-Cousins method

Reminder: Confidence intervals. Bayesian confidence intervals (credible intervals)
For an observable x whose p.d.f. depends on the parameter θ, the Bayesian confidence interval (θ1, θ2) corresponding to C.L. α is constructed by
∫_{θ1}^{θ2} P(θ_t|x0) dθ_t = α
with the posterior p.d.f.
P(θ_t|x0) = L(x0|θ) P(θ_t) / P(x0)
where L(x0|θ) is the likelihood function, P(θ_t) the prior p.d.f., and P(x0) a normalization constant.
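The Bayesian construction can be sketched numerically. This is a minimal illustration, assuming a flat prior on µ ≥ 0 and the Poisson-with-background likelihood that appears later in the talk; the function names and the crude Riemann-sum integration are my own choices, not part of the slides:

```python
import math

def poisson_likelihood(n, mu, b):
    """L(n | mu): Poisson probability of n counts for signal mu, background b."""
    lam = mu + b
    return lam ** n * math.exp(-lam) / math.factorial(n)

def bayesian_upper_limit(n_obs, b, alpha=0.90, mu_max=50.0, steps=100_000):
    """Smallest mu_up with integral_0^mu_up P(mu | n_obs) dmu >= alpha,
    for a flat prior on mu >= 0 (posterior = normalized likelihood)."""
    dmu = mu_max / steps
    grid = [i * dmu for i in range(steps)]
    post = [poisson_likelihood(n_obs, mu, b) for mu in grid]
    norm = sum(post) * dmu
    acc = 0.0
    for mu, p in zip(grid, post):
        acc += p * dmu / norm
        if acc >= alpha:
            return mu
    return mu_max

# With a flat prior the background cancels for n = 0: the posterior is
# proportional to exp(-mu), so the 90% upper limit is -ln(0.1), about 2.30.
print(round(bayesian_upper_limit(0, 3.0), 2))
```

Note that, unlike the classical central interval discussed below, this Bayesian limit is never empty, at the price of a prior-dependent answer.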

Reminder: Confidence intervals. Classical confidence intervals
For an observable x whose p.d.f. depends on the parameter θ, a classical confidence interval (θ1, θ2), with θ1 and θ2 functions of x, corresponding to C.L. α is a member of a set with the property
P(θ ∈ [θ1, θ2]) = α    (1)
(1) is true for every allowed θ, in particular the true value θ_t. Such intervals are constructed according to Neyman's method.

Neyman's method of confidence belts
p.d.f. f(x; θ), where x is the outcome of the experiment (an estimator for θ) and θ is the parameter of the p.d.f.
- Acceptance interval [x1, x2] at given C.L. α for each value of θ:
  P(x1 < x < x2; θ) = α = ∫_{x1}^{x2} f(x; θ) dx
- Union of the intervals: the confidence belt D(α). For monotonic x1(θ, α), x2(θ, α) the inverse functions are θ1(x) = x1⁻¹(x) and θ2(x) = x2⁻¹(x), so that
  x > x1(θ) ⇔ θ1(x) > θ and x < x2(θ) ⇔ θ2(x) < θ
- Confidence interval (θ2(x0), θ1(x0)) for a measurement x0: θ lies between θ1(x) and θ2(x) exactly when x lies between x1(θ) and x2(θ), hence for every θ:
  P(θ2(x) < θ < θ1(x)) = α
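For a concrete case without boundaries, this construction can be carried out in closed form. A short sketch for a unit-width Gaussian with a central 90% belt, using only the standard library (the helper names are mine):

```python
from statistics import NormalDist

def central_acceptance_interval(theta, alpha=0.90):
    """Acceptance interval [x1, x2] for a unit Gaussian centred on theta:
    P(x < x1) = P(x > x2) = (1 - alpha)/2."""
    z = NormalDist().inv_cdf((1 + alpha) / 2)
    return theta - z, theta + z

def confidence_interval(x0, alpha=0.90):
    """Invert the belt: theta is in the confidence interval exactly when
    x0 lies inside its acceptance interval, giving [x0 - z, x0 + z]."""
    z = NormalDist().inv_cdf((1 + alpha) / 2)
    return x0 - z, x0 + z

lo, hi = confidence_interval(2.0)
print(round(lo, 2), round(hi, 2))  # 0.36 3.64
```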

Neyman's method of confidence belts: interval choices
- Upper confidence limits: P(x < x1|θ) = 1 − α, which satisfies P(θ > θ1) = 1 − α
- Central confidence intervals: P(x < x1|θ) = P(x > x2|θ) = (1 − α)/2
- Choices requiring an ordering principle, e.g. the Feldman-Cousins method

Coverage
Coverage: P(θ2(x) < θ < θ1(x)) = α is satisfied for every θ.
Neyman's exact construction gives confidence intervals with coverage probability α. For discrete x, or for approximate construction methods:
- Undercoverage: for some θ, P(θ ∈ [θ1, θ2]) < α
- Overcoverage: for some θ, P(θ ∈ [θ1, θ2]) > α
- Conservative intervals: overcoverage for some values of θ (loss of power)
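Coverage can be checked by brute force: repeat the experiment many times and count how often the interval contains the true value. A Monte Carlo sketch for the central interval of a unit Gaussian (the names and trial count are my own choices):

```python
import random
from statistics import NormalDist

def coverage(theta_true, alpha=0.90, n_trials=20_000, seed=1):
    """Fraction of repeated experiments in which the central interval
    [x - z, x + z] of a unit Gaussian contains theta_true."""
    z = NormalDist().inv_cdf((1 + alpha) / 2)
    rng = random.Random(seed)
    hits = 0
    for _ in range(n_trials):
        x = rng.gauss(theta_true, 1.0)
        if x - z < theta_true < x + z:
            hits += 1
    return hits / n_trials

print(coverage(2.0))  # close to the nominal 0.90
```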

Two major problems can arise from Neyman's construction of confidence intervals.

Example I: Gaussian with boundary at origin
Gaussian distribution (σ = 1) for physical values µ > 0:
P(x|µ) = (1/√(2π)) exp(−(x − µ)²/2)
Unphysical confidence interval: e.g. for x = −1.8 the confidence interval is the empty set! Reason: negative values of µ are unphysical.

Example I: Gaussian with boundary at origin, flip-flopping
Flip-flopping: the choice of interval type is based on the data:
- for x > 3σ: central interval
- for 0 ≤ x ≤ 3σ: upper limit
- for x < 0: assume x = 0
E.g. for µ = 2 the resulting belt contains only 85% of P(x|µ): undercoverage.
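The 85% figure can be reproduced by simulating the flip-flopping policy directly. A sketch assuming the standard 90% C.L. constants z = 1.645 (central) and z = 1.282 (upper limit); the function names are illustrative:

```python
import random
from statistics import NormalDist

Z_CENTRAL = NormalDist().inv_cdf(0.95)  # 1.645: two-sided 90% interval
Z_UPPER = NormalDist().inv_cdf(0.90)    # 1.282: one-sided 90% upper limit

def covers(mu_true, x):
    """Does the flip-flopping interval built from measurement x cover mu_true?"""
    if x >= 3.0:                              # central interval x +- 1.645
        return abs(x - mu_true) < Z_CENTRAL
    return mu_true < max(x, 0.0) + Z_UPPER    # upper limit, x clipped at 0

def flip_flop_coverage(mu_true, n_trials=200_000, seed=2):
    rng = random.Random(seed)
    hits = sum(covers(mu_true, rng.gauss(mu_true, 1.0)) for _ in range(n_trials))
    return hits / n_trials

print(flip_flop_coverage(2.0))  # about 0.85 instead of the nominal 0.90
```

For µ = 2 the covered x range is 0.718 < x < 3.645, with probability Φ(1.645) − Φ(−1.282) = 0.95 − 0.10 = 0.85, which matches the quoted undercoverage.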

Example II: Poisson with background
Poisson distribution with background b (b = 3.0 in the plots):
P(n|µ) = (µ + b)^n e^{−(µ+b)} / n!
Unphysical confidence interval: e.g. for n = 0 the confidence interval is the empty set! Reason: negative values of µ are unphysical.
Flip-flopping: interval choice based on the data leads to undercoverage.
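The empty-set problem can be made explicit for n = 0. In the central construction, the upper edge µ2 satisfies P(n ≤ 0 | µ2 + b) = (1 − α)/2 = 0.05, which has no non-negative solution for b = 3.0; a small sketch (the helper name is mine):

```python
import math

def upper_edge_central_n0(b, tail=0.05):
    """Solve P(n <= 0 | mu + b) = exp(-(mu + b)) = tail for mu:
    mu2 = -ln(tail) - b, which may come out negative (unphysical)."""
    return -math.log(tail) - b

mu2 = upper_edge_central_n0(3.0)
print(round(mu2, 3))  # just below zero: the physical interval is empty
```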

The Feldman-Cousins method solves both problems at the same time!

Feldman-Cousins ordering principle
The acceptance interval is chosen based on the likelihood ratio
λ = f(x; θ) / f(x; θ_best)
where f(x; θ) is the likelihood of θ given the data x, and θ_best is the value of θ that maximizes the likelihood for the given x. The integration step accepts the highest-ranked values of x until
∫_{λ > λ_min(α)} f(x; θ) dx = α
(see the examples).

Ex. II: Feldman-Cousins for Poisson with background
Poisson with background b = 3.0: P(n|µ) = (µ + b)^n exp(−(µ + b))/n!
Determine the acceptance interval for a value of the signal mean, e.g. µ = 0.5:
- for each n, determine P(n|µ) (here P(0|0.5) = 0.03) and the value µ_best that maximizes P(n|µ) (µ_best = 0 for n = 0)
- determine the ratio of likelihoods R = P(n|µ) / P(n|µ_best); rank the points n by decreasing R
- add points n to the acceptance interval in order of rank until Σ P(n|µ) ≥ α
- repeat for all µ and different values of b; determine the confidence interval (µ1, µ2) according to the standard procedure (vertical line)
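The steps above can be implemented in a few lines. A sketch of the Feldman-Cousins acceptance interval for the Poisson-with-background example, using only the standard library (the function names are mine); for µ = 0.5, b = 3.0 it reproduces the 90% acceptance interval [0, 6] from the Feldman-Cousins paper:

```python
import math

def poisson(n, lam):
    return lam ** n * math.exp(-lam) / math.factorial(n)

def fc_acceptance_interval(mu, b, alpha=0.90, n_max=50):
    """Feldman-Cousins acceptance interval for Poisson signal mu on top of a
    known background b: rank n by R = P(n|mu)/P(n|mu_best) and accumulate
    probability until it reaches alpha."""
    def rank(n):
        mu_best = max(0.0, n - b)            # mu_best maximizes P(n|mu)
        return poisson(n, mu + b) / poisson(n, mu_best + b)
    accepted, total = [], 0.0
    for n in sorted(range(n_max + 1), key=rank, reverse=True):
        accepted.append(n)
        total += poisson(n, mu + b)
        if total >= alpha:
            break
    return min(accepted), max(accepted)

print(fc_acceptance_interval(0.5, 3.0))  # (0, 6)
```

Scanning this over a grid of µ values and reading off, for the observed n, the smallest and largest accepted µ gives the confidence interval (µ1, µ2) (the vertical line of the standard procedure).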

Ex. II: Feldman-Cousins for Poisson with background, properties
- no empty sets
- a coherent set of intervals, no flip-flopping
- slightly conservative due to the discreteness of n

Ex. I: Feldman-Cousins for Gaussian with boundary at origin
For the Gaussian with boundary at origin, P(x|µ) = (1/√(2π)) exp(−(x − µ)²/2), determine the acceptance interval for signal mean µ:
- for each value of x find µ_best that maximizes P(x|µ): µ_best = max(0, x), so
  P(x|µ_best) = 1/√(2π) for x ≥ 0, and exp(−x²/2)/√(2π) for x < 0
- compute R:
  R(x) = P(x|µ)/P(x|µ_best) = exp(−(x − µ)²/2) for x ≥ 0, and exp(xµ − µ²/2) for x < 0
- integrate over the R-ranked intervals: find the interval [x1, x2] such that R(x1) = R(x2) and ∫_{x1}^{x2} P(x|µ) dx = α
- finding the acceptance interval for each µ yields the confidence interval [µ1, µ2] for each measurement x0
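A numerical sketch of the same construction on a grid: it finds the acceptance interval for one value of µ by accepting x values in order of decreasing R until the probability content reaches α. The grid spacing and names are my own choices, and the equality R(x1) = R(x2) holds only up to the grid resolution:

```python
import math
from statistics import NormalDist

def likelihood_ratio(x, mu):
    """R(x) from the slide, with mu_best = max(0, x)."""
    if x >= 0.0:
        return math.exp(-0.5 * (x - mu) ** 2)
    return math.exp(x * mu - 0.5 * mu * mu)

def fc_acceptance_gauss(mu, alpha=0.90, x_lo=-10.0, x_hi=10.0, steps=20_001):
    """Acceptance interval [x1, x2] for the boundary-at-origin Gaussian;
    R is unimodal, so the accepted grid points form one contiguous interval."""
    dx = (x_hi - x_lo) / (steps - 1)
    grid = [x_lo + i * dx for i in range(steps)]
    pdf = NormalDist(mu, 1.0).pdf
    accepted, total = [], 0.0
    for x in sorted(grid, key=lambda x: likelihood_ratio(x, mu), reverse=True):
        accepted.append(x)
        total += pdf(x) * dx
        if total >= alpha:
            break
    return min(accepted), max(accepted)

x1, x2 = fc_acceptance_gauss(0.5)
print(round(x1, 2), round(x2, 2))
```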

Ex. I: Feldman-Cousins for Gaussian with boundary at origin, results
Figure: Feldman-Cousins confidence belt. Figure: flip-flopping confidence belt.
- no empty sets
- no flip-flopping
- slightly conservative at the transition from upper limit to two-sided interval

Sources
- G. Cowan: Statistical Data Analysis (Oxford University Press 1998)
- G. Feldman, R. Cousins: A Unified Approach to the Classical Statistical Analysis of Small Signals
- F. James: Statistical Methods in Experimental Physics (World Scientific 2006)
- B. D. Yabsley: Neyman & Feldman-Cousins intervals for a simple problem with an unphysical region, and an analytic solution (arXiv:hep-ex/ v1, 2006)
- PDG Review, Ch. 33 (Statistics)

BACKUP

Reminder: Bayesian and classical confidence levels
Bayesian confidence level α: the degree of belief that θ_t ∈ [θ1, θ2]; inference about the true value θ_t; the experiment need not be repeatable; the prior contains all previous knowledge and beliefs.
Classical confidence level α: the probability that the interval [θ1, θ2] contains θ; no prior and no degree of belief; no inference about θ_t itself, only about (θ1, θ2); the experiment is repeatable: a fraction α of the measurements yields a confidence interval containing the true value θ_t.

Backup, Ex. II: Feldman-Cousins for Poisson with background
Construction of the confidence interval with the ordering principle based on likelihood ratios. Poisson with background b = 3.0: P(n|µ) = (µ + b)^n exp(−(µ + b))/n!
Determine the acceptance interval for signal mean µ = 0.5 over all points n:
- start with n = 0: P(0|0.5) = 0.03
- find the value of µ that maximizes P(n|µ): here µ_best = max(0, n − b), i.e. µ_best = 0 for n = 0
- calculate the ratio of likelihoods R = P(n|µ)/P(n|µ_best) and rank the points by decreasing R
- add points n to the acceptance interval in order of rank until Σ P(n|µ) ≥ α
- do this for all µ and different values of b
- determine the confidence interval (µ1, µ2) according to the standard procedure (vertical line)

Backup, Ex. II: Feldman-Cousins for Poisson with background, construction table
Table columns: P(n|µ) for given n and µ; µ_best maximizing P(n|µ) for given n; R = P(n|µ)/P(n|µ_best); rank by decreasing R; comparison to upper-limit and central intervals.

Mild pathologies arising from Feldman-Cousins in the case of Poisson with background
After determining the acceptance intervals, draw a vertical line to obtain the confidence interval. Two issues can arise:
- sometimes the set of intersected acceptance intervals is not connected; in that case define θ2 as the bottom-most and θ1 as the top-most intersected line
- background dependence: µ2(b) has to be monotonically decreasing; if it is not, some confidence intervals are lengthened
These compensations add slightly to the conservatism of the intervals, but the conservatism remains dominated by the discreteness of n.
Fundamentals CS 281A: Statistical Learning Theory Yangqing Jia Based on tutorial slides by Lester Mackey and Ariel Kleiner August, 2011 Outline 1 Probability 2 Statistics 3 Linear Algebra 4 Optimization
More informationCLASS NOTES Models, Algorithms and Data: Introduction to computing 2018
CLASS NOTES Models, Algorithms and Data: Introduction to computing 208 Petros Koumoutsakos, Jens Honore Walther (Last update: June, 208) IMPORTANT DISCLAIMERS. REFERENCES: Much of the material (ideas,
More informationRecent developments in statistical methods for particle physics
Recent developments in statistical methods for particle physics Particle Physics Seminar Warwick, 17 February 2011 Glen Cowan Physics Department Royal Holloway, University of London g.cowan@rhul.ac.uk
More informationComputational Cognitive Science
Computational Cognitive Science Lecture 8: Frank Keller School of Informatics University of Edinburgh keller@inf.ed.ac.uk Based on slides by Sharon Goldwater October 14, 2016 Frank Keller Computational
More informationNew Bayesian methods for model comparison
Back to the future New Bayesian methods for model comparison Murray Aitkin murray.aitkin@unimelb.edu.au Department of Mathematics and Statistics The University of Melbourne Australia Bayesian Model Comparison
More informationPatterns of Scalable Bayesian Inference Background (Session 1)
Patterns of Scalable Bayesian Inference Background (Session 1) Jerónimo Arenas-García Universidad Carlos III de Madrid jeronimo.arenas@gmail.com June 14, 2017 1 / 15 Motivation. Bayesian Learning principles
More information9 Bayesian inference. 9.1 Subjective probability
9 Bayesian inference 1702-1761 9.1 Subjective probability This is probability regarded as degree of belief. A subjective probability of an event A is assessed as p if you are prepared to stake pm to win
More informationBayesian RL Seminar. Chris Mansley September 9, 2008
Bayesian RL Seminar Chris Mansley September 9, 2008 Bayes Basic Probability One of the basic principles of probability theory, the chain rule, will allow us to derive most of the background material in
More informationStatistics for the LHC Lecture 1: Introduction
Statistics for the LHC Lecture 1: Introduction Academic Training Lectures CERN, 14 17 June, 2010 indico.cern.ch/conferencedisplay.py?confid=77830 Glen Cowan Physics Department Royal Holloway, University
More informationSearch Procedures in High Energy Physics
Version 2.20 July 1, 2009 Search Procedures in High Energy Physics Luc Demortier 1 Laboratory of Experimental High-Energy Physics The Rockefeller University Abstract The usual procedure for searching for
More informationComputational Biology Lecture #3: Probability and Statistics. Bud Mishra Professor of Computer Science, Mathematics, & Cell Biology Sept
Computational Biology Lecture #3: Probability and Statistics Bud Mishra Professor of Computer Science, Mathematics, & Cell Biology Sept 26 2005 L2-1 Basic Probabilities L2-2 1 Random Variables L2-3 Examples
More informationLecture 24. Introduction to Error Analysis. Experimental Nuclear Physics PHYS 741
Lecture 24 Introduction to Error Analysis Experimental Nuclear Physics PHYS 741 heeger@wisc.edu References and Figures from: - Bevington Data Reduction and Error Analysis 1 Seminar Announcement Friday,
More informationBayesian Econometrics
Bayesian Econometrics Christopher A. Sims Princeton University sims@princeton.edu September 20, 2016 Outline I. The difference between Bayesian and non-bayesian inference. II. Confidence sets and confidence
More informationSUFFICIENT STATISTICS
SUFFICIENT STATISTICS. Introduction Let X (X,..., X n ) be a random sample from f θ, where θ Θ is unknown. We are interested using X to estimate θ. In the simple case where X i Bern(p), we found that the
More informationLecture 5. G. Cowan Lectures on Statistical Data Analysis Lecture 5 page 1
Lecture 5 1 Probability (90 min.) Definition, Bayes theorem, probability densities and their properties, catalogue of pdfs, Monte Carlo 2 Statistical tests (90 min.) general concepts, test statistics,
More informationEstimation of reliability parameters from Experimental data (Parte 2) Prof. Enrico Zio
Estimation of reliability parameters from Experimental data (Parte 2) This lecture Life test (t 1,t 2,...,t n ) Estimate θ of f T t θ For example: λ of f T (t)= λe - λt Classical approach (frequentist
More informationRecommendations for presentation of error bars
Draft 0.00 ATLAS Statistics Forum 15 February, 2011 Recommendations for presentation of error bars 1 Introduction This note summarizes recommendations on how to present error bars on plots. It follows
More informationWhat are the Findings?
What are the Findings? James B. Rawlings Department of Chemical and Biological Engineering University of Wisconsin Madison Madison, Wisconsin April 2010 Rawlings (Wisconsin) Stating the findings 1 / 33
More informationBayesian Statistics Part III: Building Bayes Theorem Part IV: Prior Specification
Bayesian Statistics Part III: Building Bayes Theorem Part IV: Prior Specification Michael Anderson, PhD Hélène Carabin, DVM, PhD Department of Biostatistics and Epidemiology The University of Oklahoma
More information10. Composite Hypothesis Testing. ECE 830, Spring 2014
10. Composite Hypothesis Testing ECE 830, Spring 2014 1 / 25 In many real world problems, it is difficult to precisely specify probability distributions. Our models for data may involve unknown parameters
More informationIntroduction to Probabilistic Machine Learning
Introduction to Probabilistic Machine Learning Piyush Rai Dept. of CSE, IIT Kanpur (Mini-course 1) Nov 03, 2015 Piyush Rai (IIT Kanpur) Introduction to Probabilistic Machine Learning 1 Machine Learning
More informationConfidence Distribution
Confidence Distribution Xie and Singh (2013): Confidence distribution, the frequentist distribution estimator of a parameter: A Review Céline Cunen, 15/09/2014 Outline of Article Introduction The concept
More informationStatistical learning. Chapter 20, Sections 1 3 1
Statistical learning Chapter 20, Sections 1 3 Chapter 20, Sections 1 3 1 Outline Bayesian learning Maximum a posteriori and maximum likelihood learning Bayes net learning ML parameter learning with complete
More informationAccouncements. You should turn in a PDF and a python file(s) Figure for problem 9 should be in the PDF
Accouncements You should turn in a PDF and a python file(s) Figure for problem 9 should be in the PDF Please do not zip these files and submit (unless there are >5 files) 1 Bayesian Methods Machine Learning
More informationEmpirical Risk Minimization is an incomplete inductive principle Thomas P. Minka
Empirical Risk Minimization is an incomplete inductive principle Thomas P. Minka February 20, 2001 Abstract Empirical Risk Minimization (ERM) only utilizes the loss function defined for the task and is
More informationBayesian Learning. HT2015: SC4 Statistical Data Mining and Machine Learning. Maximum Likelihood Principle. The Bayesian Learning Framework
HT5: SC4 Statistical Data Mining and Machine Learning Dino Sejdinovic Department of Statistics Oxford http://www.stats.ox.ac.uk/~sejdinov/sdmml.html Maximum Likelihood Principle A generative model for
More informationBayesian Regression Linear and Logistic Regression
When we want more than point estimates Bayesian Regression Linear and Logistic Regression Nicole Beckage Ordinary Least Squares Regression and Lasso Regression return only point estimates But what if we
More informationThe Jeffreys-Lindley Paradox and Discovery Criteria in High Energy Physics
The Jeffreys-Lindley Paradox and Discovery Criteria in High Energy Physics Bob Cousins Univ. of California, Los Angeles Workshop on Evidence, Discovery, Proof: Measuring the Higgs Particle Univ. of South
More informationCSC321 Lecture 18: Learning Probabilistic Models
CSC321 Lecture 18: Learning Probabilistic Models Roger Grosse Roger Grosse CSC321 Lecture 18: Learning Probabilistic Models 1 / 25 Overview So far in this course: mainly supervised learning Language modeling
More informationA Very Brief Summary of Statistical Inference, and Examples
A Very Brief Summary of Statistical Inference, and Examples Trinity Term 2008 Prof. Gesine Reinert 1 Data x = x 1, x 2,..., x n, realisations of random variables X 1, X 2,..., X n with distribution (model)
More informationCOS513 LECTURE 8 STATISTICAL CONCEPTS
COS513 LECTURE 8 STATISTICAL CONCEPTS NIKOLAI SLAVOV AND ANKUR PARIKH 1. MAKING MEANINGFUL STATEMENTS FROM JOINT PROBABILITY DISTRIBUTIONS. A graphical model (GM) represents a family of probability distributions
More informationAdvanced Statistics Course Part I
Advanced Statistics Course Part I W. Verkerke (NIKHEF) Wouter Verkerke, NIKHEF Outline of this course Advances statistical methods Theory and practice Focus on limit setting and discovery for the LHC Part
More informationSystem Identification, Lecture 4
System Identification, Lecture 4 Kristiaan Pelckmans (IT/UU, 2338) Course code: 1RT880, Report code: 61800 - Spring 2012 F, FRI Uppsala University, Information Technology 30 Januari 2012 SI-2012 K. Pelckmans
More informationSystem Identification, Lecture 4
System Identification, Lecture 4 Kristiaan Pelckmans (IT/UU, 2338) Course code: 1RT880, Report code: 61800 - Spring 2016 F, FRI Uppsala University, Information Technology 13 April 2016 SI-2016 K. Pelckmans
More informationHypothesis testing (cont d)
Hypothesis testing (cont d) Ulrich Heintz Brown University 4/12/2016 Ulrich Heintz - PHYS 1560 Lecture 11 1 Hypothesis testing Is our hypothesis about the fundamental physics correct? We will not be able
More informationBayesian parameter estimation with weak data and when combining evidence: the case of climate sensitivity
Bayesian parameter estimation with weak data and when combining evidence: the case of climate sensitivity Nicholas Lewis Independent climate scientist CliMathNet 5 July 2016 Main areas to be discussed
More informationDavid Giles Bayesian Econometrics
David Giles Bayesian Econometrics 1. General Background 2. Constructing Prior Distributions 3. Properties of Bayes Estimators and Tests 4. Bayesian Analysis of the Multiple Regression Model 5. Bayesian
More informationChapter 7. Confidence Sets Lecture 30: Pivotal quantities and confidence sets
Chapter 7. Confidence Sets Lecture 30: Pivotal quantities and confidence sets Confidence sets X: a sample from a population P P. θ = θ(p): a functional from P to Θ R k for a fixed integer k. C(X): a confidence
More informationAnnouncements. Proposals graded
Announcements Proposals graded Kevin Jamieson 2018 1 Hypothesis testing Machine Learning CSE546 Kevin Jamieson University of Washington October 30, 2018 2018 Kevin Jamieson 2 Anomaly detection You are
More informationIntroductory Statistics Course Part II
Introductory Statistics Course Part II https://indico.cern.ch/event/735431/ PHYSTAT ν CERN 22-25 January 2019 Glen Cowan Physics Department Royal Holloway, University of London g.cowan@rhul.ac.uk www.pp.rhul.ac.uk/~cowan
More informationBayesian inference. Rasmus Waagepetersen Department of Mathematics Aalborg University Denmark. April 10, 2017
Bayesian inference Rasmus Waagepetersen Department of Mathematics Aalborg University Denmark April 10, 2017 1 / 22 Outline for today A genetic example Bayes theorem Examples Priors Posterior summaries
More informationCompute f(x θ)f(θ) dθ
Bayesian Updating: Continuous Priors 18.05 Spring 2014 b a Compute f(x θ)f(θ) dθ January 1, 2017 1 /26 Beta distribution Beta(a, b) has density (a + b 1)! f (θ) = θ a 1 (1 θ) b 1 (a 1)!(b 1)! http://mathlets.org/mathlets/beta-distribution/
More informationBEST TESTS. Abstract. We will discuss the Neymann-Pearson theorem and certain best test where the power function is optimized.
BEST TESTS Abstract. We will discuss the Neymann-Pearson theorem and certain best test where the power function is optimized. 1. Most powerful test Let {f θ } θ Θ be a family of pdfs. We will consider
More informationStatistical Methods for Particle Physics
Statistical Methods for Particle Physics Invisibles School 8-13 July 2014 Château de Button Glen Cowan Physics Department Royal Holloway, University of London g.cowan@rhul.ac.uk www.pp.rhul.ac.uk/~cowan
More informationBayes Factors for Discovery
Glen Cowan RHUL Physics 3 April, 22 Bayes Factors for Discovery The fundamental quantity one should use in the Bayesian framework to quantify the significance of a discovery is the posterior probability
More informationA Very Brief Summary of Bayesian Inference, and Examples
A Very Brief Summary of Bayesian Inference, and Examples Trinity Term 009 Prof Gesine Reinert Our starting point are data x = x 1, x,, x n, which we view as realisations of random variables X 1, X,, X
More information