Finite sample size optimality of GLR tests. George V. Moustakides, University of Patras, Greece
1 Finite sample size optimality of GLR tests. George V. Moustakides, University of Patras, Greece
2 Outline
Hypothesis testing and GLR
Randomized tests
Classical hypothesis testing with randomized tests
Alternative implementation of randomized tests
A combined hypothesis testing/isolation problem
Optimality of GLR
Randomized estimators
A combined hypothesis testing/estimation problem
Optimum GLR-type tests
3 Hypothesis testing and GLR
Binary hypothesis testing (fixed sample size N). Collection of observations X = [x_1, ..., x_N]:
H_0: X ~ f_0(X);  H_1: X ~ f_1(X).  Decide between H_0 and H_1.
Likelihood Ratio Test (LRT): f_1(X)/f_0(X) ≷ λ (decide H_1 if the ratio exceeds λ, otherwise H_0).
Optimum according to different criteria: Neyman-Pearson, Bayesian, Min-Max, Equalized error probabilities.
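As an illustration of the LRT, here is a minimal numerical sketch for two fully specified Gaussian hypotheses; the means, variance, threshold, and sample size are arbitrary choices for the example, not values from the slides.

```python
import numpy as np
from scipy.stats import norm

rng = np.random.default_rng(0)
N = 20
mu0, mu1, sigma = 0.0, 0.5, 1.0   # assumed f0 = N(mu0, sigma^2), f1 = N(mu1, sigma^2)
lam = 1.0                         # threshold (would normally come from a criterion, e.g. Neyman-Pearson)

X = rng.normal(mu1, sigma, size=N)   # data generated under H1 for the demo

# Likelihood ratio f1(X)/f0(X), computed in the log domain for numerical stability
log_LR = norm.logpdf(X, mu1, sigma).sum() - norm.logpdf(X, mu0, sigma).sum()
decision = 1 if log_LR > np.log(lam) else 0
print(f"log likelihood ratio = {log_LR:.2f} -> decide H{decision}")
```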
4 Composite binary hypothesis testing
Observations X = [x_1, ..., x_N]:
H_0: X ~ f_{01}(X) with prior π_{01}; f_{02}(X) with π_{02}; ...; f_{0K_0}(X) with π_{0K_0}
H_1: X ~ f_{11}(X) with prior π_{11}; f_{12}(X) with π_{12}; ...; f_{1K_1}(X) with π_{1K_1}
One subcase is responsible for the data.
5 If we knew beforehand the two subcases in effect, say f_{0i} and f_{1j}, then the LRT would be
f_{1j}(X)/f_{0i}(X) ≷ λ.
When this knowledge is not available (detection only):
f_1(X)/f_0(X) = [π_{11} f_{11}(X) + ... + π_{1K_1} f_{1K_1}(X)] / [π_{01} f_{01}(X) + ... + π_{0K_0} f_{0K_0}(X)] ≷ λ.
If, in addition to detection, we would like to isolate the subcase responsible for the data X, the GLR is
f_{1ĵ}(X)/f_{0î}(X) ≷ λ,  with  î = argmax_{1≤i≤K_0} f_{0i}(X),  ĵ = argmax_{1≤j≤K_1} f_{1j}(X)  (ML isolation).
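A sketch of the GLR with ML isolation over a finite set of subcases; the Gaussian subcase means and the threshold are hypothetical values chosen only to make the example run.

```python
import numpy as np
from scipy.stats import norm

rng = np.random.default_rng(1)
N = 20
means_H0 = [-1.0, 0.0]         # K0 = 2 candidate subcases under H0 (illustrative)
means_H1 = [0.5, 1.0, 1.5]     # K1 = 3 candidate subcases under H1 (illustrative)
lam = 1.0

X = rng.normal(1.0, 1.0, size=N)   # data drawn from one of the H1 subcases

def loglik(X, mu):
    return norm.logpdf(X, mu, 1.0).sum()

# ML isolation: pick the best-fitting subcase under each hypothesis
i_hat = int(np.argmax([loglik(X, m) for m in means_H0]))
j_hat = int(np.argmax([loglik(X, m) for m in means_H1]))

# GLR: compare the two maximized likelihoods against the threshold
log_glr = loglik(X, means_H1[j_hat]) - loglik(X, means_H0[i_hat])
if log_glr > np.log(lam):
    print(f"decide H1 and isolate subcase j = {j_hat + 1}")
else:
    print(f"decide H0 and isolate subcase i = {i_hat + 1}")
```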
6 Composite binary hypothesis testing (cont.)
Observations X = [x_1, ..., x_N]:
H_0: X ~ f_0(X|θ_0), prior π_0(θ_0);  H_1: X ~ f_1(X|θ_1), prior π_1(θ_1).
Detection only:
f_1(X)/f_0(X) = ∫ π_1(θ_1) f_1(X|θ_1) dθ_1 / ∫ π_0(θ_0) f_0(X|θ_0) dθ_0 ≷ λ.
If we would like, in addition to detection, to estimate the parameter vector θ_i, the GLR is
f_1(X|θ̂_1)/f_0(X|θ̂_0) ≷ λ,  with  θ̂_0 = argmax_{θ_0} f_0(X|θ_0),  θ̂_1 = argmax_{θ_1} f_1(X|θ_1)  (ML estimates).
7 Optimality of GLR
Wald (1943), Le Cam (1953), Lehmann (1959), Hoeffding (1965), Neyman (1965), ...
Asymptotic optimality under regularity conditions: as the number of samples increases without limit, the performance of the GLR approaches that of the optimum test (the test with the true parameters considered known).
8 Randomized tests
Binary hypothesis testing (fixed sample size N). Collection of observations X = [x_1, ..., x_N]:
H_0: X ~ f_0(X);  H_1: X ~ f_1(X).  Decide between H_0 and H_1.
Deterministic tests: X ∈ R^N. Two complementary sets A_0, A_1 in R^N with A_1 = A_0^c. When X ∈ A_i, decide in favor of H_i.
Randomized tests: Two complementary probabilities δ_0(X), δ_1(X) with δ_0(X) + δ_1(X) = 1. In a random game, select H_i with probability δ_i(X).
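To make the random-game idea concrete, here is a small sketch of how a randomized test is executed: the decision is drawn with probability δ_1(X), and a deterministic test is the special case δ_1(X) ∈ {0, 1}. The particular δ_1 used here is hypothetical.

```python
import numpy as np

rng = np.random.default_rng(2)

def randomized_decision(delta1, rng):
    """Select H1 with probability delta1 and H0 otherwise."""
    return 1 if rng.random() < delta1 else 0

def delta1_of_X(X):
    """A hypothetical test: deterministic away from a boundary region, randomized on it."""
    s = X.mean()
    if s > 0.6:
        return 1.0    # always decide H1
    if s < 0.4:
        return 0.0    # always decide H0
    return 0.3        # randomize near the boundary (illustrative value)

X = rng.normal(0.5, 0.2, size=10)
print(f"decision: H{randomized_decision(delta1_of_X(X), rng)}")
```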
9 Randomized tests are more general than deterministic ones: a deterministic test corresponds to δ_i(X) = 1_{A_i}(X).
When X ∈ A_1 we decide with probability 1 for H_1 and with probability 0 for H_0 (the equivalent of a deterministic decision).
Neyman-Pearson: max_{δ_0, δ_1} P[d = 1 | H_1]  subject to  P[d = 1 | H_0] ≤ α.
∫ δ_1(X) f_1(X) dX − λ ∫ δ_1(X) f_0(X) dX = ∫ δ_1(X) [f_1(X) − λ f_0(X)] dX ≤ ∫ max{f_1(X) − λ f_0(X), 0} dX.
10 The maximizing test is
δ_1(X) = 1 if f_1(X) − λ f_0(X) > 0;  0 if f_1(X) − λ f_0(X) < 0;  γ (arbitrary) if f_1(X) − λ f_0(X) = 0.
δ_0(X) = 1 − δ_1(X) = 0 if f_1(X) − λ f_0(X) > 0;  1 if f_1(X) − λ f_0(X) < 0;  1 − γ if f_1(X) − λ f_0(X) = 0.
This is the Likelihood Ratio Test (LRT): f_1(X)/f_0(X) ≷ λ.
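Randomization matters when the boundary event f_1(X) = λ f_0(X) has positive probability, which is typical for discrete data. Here is a sketch with Bernoulli observations (parameters chosen arbitrarily): the LR is monotone in S = Σ x_n, so the Neyman-Pearson test thresholds S and randomizes with probability γ at S = k so that the size equals α exactly.

```python
import numpy as np
from scipy.stats import binom

p0, p1, N, alpha = 0.3, 0.6, 10, 0.05   # illustrative values; LR is increasing in S since p1 > p0

# Find the smallest k with P[S > k | H0] <= alpha, then randomize at S = k.
k = 0
while binom.sf(k, N, p0) > alpha:
    k += 1
gamma = (alpha - binom.sf(k, N, p0)) / binom.pmf(k, N, p0)   # randomization probability at S = k

size = binom.sf(k, N, p0) + gamma * binom.pmf(k, N, p0)      # equals alpha by construction
power = binom.sf(k, N, p1) + gamma * binom.pmf(k, N, p1)
print(f"k = {k}, gamma = {gamma:.3f}, size = {size:.4f}, power = {power:.4f}")
```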
11 Composite binary hypothesis testing with randomized decisions
Observations X = [x_1, ..., x_N]:
H_0: X ~ f_{01}(X), π_{01}; f_{02}(X), π_{02}; ...; f_{0K_0}(X), π_{0K_0}  with decision probabilities δ_{01}(X), δ_{02}(X), ..., δ_{0K_0}(X)
H_1: X ~ f_{11}(X), π_{11}; f_{12}(X), π_{12}; ...; f_{1K_1}(X), π_{1K_1}  with decision probabilities δ_{11}(X), δ_{12}(X), ..., δ_{1K_1}(X)
12 The decision probabilities δ_{01}(X), ..., δ_{0K_0}(X), δ_{11}(X), ..., δ_{1K_1}(X) satisfy
Σ_{i=0,1} Σ_{j=1}^{K_i} δ_{ij}(X) = 1.
In a random game, with probability δ_{ij}(X) we decide for hypothesis H_i and subcase j.
We SIMULTANEOUSLY detect and isolate.
13 Alternative two-step implementation
δ_0(X) = δ_{01}(X) + δ_{02}(X) + ... + δ_{0K_0}(X),  δ_1(X) = δ_{11}(X) + δ_{12}(X) + ... + δ_{1K_1}(X),
q_{ij}(X) = δ_{ij}(X)/δ_i(X).
Then δ_0(X) + δ_1(X) = 1,  q_{01}(X) + ... + q_{0K_0}(X) = 1,  q_{11}(X) + ... + q_{1K_1}(X) = 1.
Equivalent description: δ_0(X), δ_1(X), [q_{01}(X), ..., q_{0K_0}(X)], [q_{11}(X), ..., q_{1K_1}(X)].
With probabilities δ_0(X), δ_1(X) decide between H_0, H_1. Given that we selected H_i, isolate among the subcases of H_i with probabilities q_{i1}(X), q_{i2}(X), ..., q_{iK_i}(X).
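A sketch of the two-step reparameterization for one fixed X: collapse the δ_ij into the marginal detection probabilities δ_i and the conditional isolation probabilities q_ij, then sample the decision in two stages. The numerical values of δ_ij below are arbitrary.

```python
import numpy as np

rng = np.random.default_rng(3)

# Hypothetical one-shot probabilities delta_ij(X) for some fixed X (they sum to 1 overall)
delta = {0: np.array([0.10, 0.15]),          # delta_01, delta_02
         1: np.array([0.30, 0.25, 0.20])}    # delta_11, delta_12, delta_13

delta_i = {i: d.sum() for i, d in delta.items()}   # delta_0(X), delta_1(X)
q = {i: d / d.sum() for i, d in delta.items()}     # q_ij(X) = delta_ij(X) / delta_i(X)

# Step 1: pick the hypothesis; Step 2: pick the subcase within it
i = rng.choice([0, 1], p=[delta_i[0], delta_i[1]])
j = rng.choice(len(q[i]), p=q[i])
print(f"decide H{i}, subcase {j + 1}")
```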
14 Optimality of GLR
H_0: X ~ f_{01}(X), π_{01}; f_{02}(X), π_{02}; ...; f_{0K_0}(X), π_{0K_0}
H_1: X ~ f_{11}(X), π_{11}; f_{12}(X), π_{12}; ...; f_{1K_1}(X), π_{1K_1}
A combined Detection/Isolation problem. Following a Neyman-Pearson-like approach, we propose
max P[Correct detection/isolation | H_1]  subject to the false alarm constraint  P[Miss detection/isolation | H_0] ≤ α.
15 THEOREM: The test that solves the combined detection/isolation problem is
max_{1≤j≤K_1} π_{1j} f_{1j}(X) / max_{1≤j≤K_0} π_{0j} f_{0j}(X) ≷ λ  (with randomization probability γ on the boundary).
The threshold λ and the randomization probability γ are selected to satisfy the false alarm constraint with equality.
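A sketch of the theorem's statistic max_j π_{1j} f_{1j}(X) / max_j π_{0j} f_{0j}(X) with Gaussian subcases; all densities, priors, and the constraint level are illustrative. For simplicity the threshold is calibrated here by Monte Carlo so that the probability of deciding H_1 under H_0 is about α, a surrogate for the slide's constraint; with continuous observations the randomization probability γ plays no role.

```python
import numpy as np
from scipy.stats import norm

rng = np.random.default_rng(4)
N, alpha = 20, 0.05
means_H0, pri_H0 = np.array([-0.5, 0.0]), np.array([0.5, 0.5])   # illustrative subcases/priors
means_H1, pri_H1 = np.array([0.5, 1.0]),  np.array([0.7, 0.3])

def stat(X):
    """log of max_j pi_1j f_1j(X) minus log of max_j pi_0j f_0j(X)."""
    l0 = np.array([norm.logpdf(X, m, 1.0).sum() for m in means_H0])
    l1 = np.array([norm.logpdf(X, m, 1.0).sum() for m in means_H1])
    return np.max(np.log(pri_H1) + l1) - np.max(np.log(pri_H0) + l0)

# Monte Carlo calibration of the (log-)threshold under H0
null_stats = []
for _ in range(10000):
    m = rng.choice(means_H0, p=pri_H0)                 # draw the H0 subcase from its prior
    null_stats.append(stat(rng.normal(m, 1.0, size=N)))
log_lam = np.quantile(null_stats, 1 - alpha)

X = rng.normal(1.0, 1.0, size=N)                       # a test record (generated under an H1 subcase)
print("decide H1" if stat(X) > log_lam else "decide H0")
```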
16 PROOF:
P[Correct detection/isolation | H_i] = Σ_{j=1}^{K_i} P[Correct detection/isolation | H_{ij}] π_{ij},
P[Correct detection/isolation | H_{ij}] = ∫ δ_i(X) q_{ij}(X) f_{ij}(X) dX.
Constrained optimization problem; with Lagrange multiplier λ:
P[Correct detection/isolation | H_1] − λ P[Miss detection/isolation | H_0] = P[Correct detection/isolation | H_1] + λ P[Correct detection/isolation | H_0] − λ (the constant −λ does not affect the maximization).
17 P[Correct detection/isolation | H_1] + λ P[Correct detection/isolation | H_0]
= ∫ δ_1(X) {Σ_{j=1}^{K_1} q_{1j}(X) π_{1j} f_{1j}(X)} dX + λ ∫ δ_0(X) {Σ_{j=1}^{K_0} q_{0j}(X) π_{0j} f_{0j}(X)} dX
≤ ∫ δ_1(X) max_{1≤j≤K_1} {π_{1j} f_{1j}(X)} dX + λ ∫ δ_0(X) max_{1≤j≤K_0} {π_{0j} f_{0j}(X)} dX.
18 = ∫ [δ_1(X) max_{1≤j≤K_1} {π_{1j} f_{1j}(X)} + λ δ_0(X) max_{1≤j≤K_0} {π_{0j} f_{0j}(X)}] dX
≤ ∫ max{ max_{1≤j≤K_1} {π_{1j} f_{1j}(X)},  λ max_{1≤j≤K_0} {π_{0j} f_{0j}(X)} } dX,
with equality attained by the test  max_{1≤j≤K_1} π_{1j} f_{1j}(X) / max_{1≤j≤K_0} π_{0j} f_{0j}(X) ≷ λ.
If the priors are not known, assume equiprobable subcases:
max_{1≤j≤K_1} K_1^{-1} f_{1j}(X) / max_{1≤j≤K_0} K_0^{-1} f_{0j}(X) ≷ λ  ⇒  max_{1≤j≤K_1} f_{1j}(X) / max_{1≤j≤K_0} f_{0j}(X) ≷ λ K_1/K_0,
which is the classical GLR.
19 Randomized estimators
Observations X = [x_1, ..., x_N],  X ~ f(X|θ), prior π(θ). We would like to obtain an estimate of θ.
Deterministic estimator: θ̂ = G(X).
Randomized estimator: δ(θ̂|X) with ∫ δ(θ̂|X) dθ̂ = 1, i.e. a PDF!
How do we obtain a randomized estimate? We generate a random variable distributed according to δ(θ̂|X).
A deterministic estimator corresponds to δ(θ̂|X) = Dirac{θ̂ − G(X)}.
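A sketch of how a randomized estimate is actually produced: discretize a density δ(θ̂|X) on a grid and draw from it; a deterministic estimator would put all of the mass at G(X). The density used here is hypothetical.

```python
import numpy as np

rng = np.random.default_rng(5)

grid = np.linspace(-3.0, 3.0, 601)            # grid of candidate estimates theta_hat
center = 0.8                                  # e.g. center = G(X) for some underlying estimator
dens = np.exp(-0.5 * (grid - center) ** 2)    # unnormalized delta(theta_hat | X), illustrative shape
probs = dens / dens.sum()

theta_hat = rng.choice(grid, p=probs)         # one randomized estimate
print(f"randomized estimate: {theta_hat:.3f}")
```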
20 Optimum Bayes estimation
If the true parameter is θ and the estimate is θ̂, the cost is C(θ̂, θ). Select the randomized estimator to minimize the average cost:
E[C(θ̂, θ)] = ∫∫∫ C(θ̂, θ) δ(θ̂|X) f(X|θ) π(θ) dX dθ̂ dθ = ∫ { ∫ δ(θ̂|X) [ ∫ C(θ̂, θ) f(X|θ) π(θ) dθ ] dθ̂ } dX,
where we define D(θ̂, X) = ∫ C(θ̂, θ) f(X|θ) π(θ) dθ. Since D(θ̂, X) ≥ min_U D(U, X),
E[C(θ̂, θ)] ≥ ∫ min_U D(U, X) dX, with equality for δ(θ̂|X) = Dirac{θ̂ − argmin_U D(U, X)},
so the optimum randomized estimator is in fact deterministic.
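A sketch of the conclusion of this slide: tabulate D(U, X) = ∫ C(U, θ) f(X|θ) π(θ) dθ on a grid of candidate estimates U and take the minimizer. With squared-error cost and a Gaussian prior/likelihood (both chosen just for the example) the minimizer should match the posterior mean, which gives a simple check.

```python
import numpy as np
from scipy.stats import norm

rng = np.random.default_rng(6)
N = 15
X = rng.normal(0.7, 1.0, size=N)                    # f(X|theta): i.i.d. N(theta, 1) samples

theta_grid = np.linspace(-3.0, 3.0, 1201)           # discretized parameter space
dtheta = theta_grid[1] - theta_grid[0]
prior = norm.pdf(theta_grid, 0.0, 1.0)              # pi(theta) = N(0, 1), illustrative
lik = np.exp(norm.logpdf(X[:, None], theta_grid, 1.0).sum(axis=0))
w = lik * prior                                     # pi(theta) f(X|theta) on the grid

def D(U):
    """D(U, X) with squared-error cost C(U, theta) = (U - theta)^2."""
    return np.sum((U - theta_grid) ** 2 * w) * dtheta

U_star = theta_grid[np.argmin([D(U) for U in theta_grid])]
post_mean = np.sum(theta_grid * w) / np.sum(w)
print(f"argmin_U D(U, X) = {U_star:.3f},  posterior mean = {post_mean:.3f}")
```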
21 Combined detection/estimation
Observations X = [x_1, ..., x_N]:
H_0: X ~ f(X|θ = 0);  H_1: X ~ f(X|θ), prior π(θ).  Decide between H_0, H_1.
Whenever we decide in favor of H_1 we would also like to provide an estimate θ̂ of θ.
Detection/estimation procedure: two-step strategy δ_0(X), δ_1(X), q(θ̂|X).
22 Following a Neyman-Pearson-like approach, we propose
min E[C(θ̂, θ) | H_1]  subject to  P[Miss detection | H_0] ≤ α,  where  P[Miss detection | H_0] = ∫ δ_1(X) f(X|0) dX.
Cost computation.
Correct hypothesis: ∫∫∫ C(θ̂, θ) δ_1(X) q(θ̂|X) π(θ) f(X|θ) dθ dθ̂ dX.
Wrong hypothesis: ∫∫ C(0, θ) δ_0(X) π(θ) f(X|θ) dθ dX.
23 The Lagrangian combination of cost and constraint is
∫∫∫ C(θ̂, θ) δ_1(X) q(θ̂|X) π(θ) f(X|θ) dθ dθ̂ dX + ∫∫ C(0, θ) δ_0(X) π(θ) f(X|θ) dθ dX + λ ∫ δ_1(X) f(X|0) dX
≥ ∫ δ_1(X) min_U { ∫ C(U, θ) π(θ) f(X|θ) dθ } dX + ∫ δ_0(X) { ∫ C(0, θ) π(θ) f(X|θ) dθ } dX + λ ∫ δ_1(X) f(X|0) dX,
where the first braced quantity is min_U D(U, X) and the second is D(0, X).
24 = ∫ { δ_1(X) [λ f(X|0) + min_U D(U, X)] + δ_0(X) D(0, X) } dX ≥ ∫ min{ λ f(X|0) + min_U D(U, X),  D(0, X) } dX.
We decide in favor of H_1 when  D(0, X) > λ f(X|0) + min_U D(U, X).
The final test is:
[D(0, X) − min_U D(U, X)] / f(X|0) ≷ λ,  where  D(U, X) = ∫ C(U, θ) f(X|θ) π(θ) dθ.
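A sketch of the final test with everything evaluated on a grid: compute D(0, X), min_U D(U, X), and f(X|0), then compare [D(0, X) − min_U D(U, X)] / f(X|0) against a threshold. The squared-error cost, Gaussian prior under H_1, and threshold value are illustrative choices.

```python
import numpy as np
from scipy.stats import norm

rng = np.random.default_rng(7)
N, lam = 15, 1.0                                     # lam: illustrative threshold
X = rng.normal(0.6, 1.0, size=N)                     # data (generated under H1 for the demo)

theta_grid = np.linspace(-3.0, 3.0, 1201)
dtheta = theta_grid[1] - theta_grid[0]
prior = norm.pdf(theta_grid, 0.0, 1.0)               # pi(theta) under H1, illustrative
w = np.exp(norm.logpdf(X[:, None], theta_grid, 1.0).sum(axis=0)) * prior

def D(U):                                            # D(U, X) = int C(U, theta) pi(theta) f(X|theta) dtheta
    return np.sum((U - theta_grid) ** 2 * w) * dtheta

D0 = D(0.0)
Dvals = np.array([D(U) for U in theta_grid])
theta_hat = theta_grid[Dvals.argmin()]               # estimate reported if H1 is declared
f_X_0 = np.exp(norm.logpdf(X, 0.0, 1.0).sum())       # f(X | theta = 0)

if (D0 - Dvals.min()) / f_X_0 > lam:
    print(f"decide H1, report theta_hat = {theta_hat:.3f}")
else:
    print("decide H0")
```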
25 Examples
MAP estimator: C(θ̂, θ) = 0 for ||θ̂ − θ|| ≤ ε (ε ≪ 1), 1 otherwise.
Optimum test:  max_θ π(θ) f(X|θ) / f(X|0) ≷ λ.
If the prior π(θ) is uniform we recover the classical GLR:  max_θ f(X|θ) / f(X|0) ≷ λ.
26 MMSE estimator: C(θ̂, θ) = ||θ̂ − θ||².
Optimum test:  ||θ̂_MMSE||² ∫ π(θ) f(X|θ) dθ / f(X|0) ≷ λ,
with  θ̂_MMSE = ∫ θ π(θ) f(X|θ) dθ / ∫ π(θ) f(X|θ) dθ = E[θ|X].
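A numerical check of the MMSE case (with an illustrative Gaussian prior and likelihood): compute θ̂_MMSE = E[θ|X] on a grid and verify that ||θ̂_MMSE||² ∫ π(θ) f(X|θ) dθ agrees with D(0, X) − min_U D(U, X) from the general test, up to discretization error.

```python
import numpy as np
from scipy.stats import norm

rng = np.random.default_rng(8)
N = 15
X = rng.normal(0.6, 1.0, size=N)

theta_grid = np.linspace(-4.0, 4.0, 2001)
dtheta = theta_grid[1] - theta_grid[0]
prior = norm.pdf(theta_grid, 0.0, 1.0)                         # illustrative prior
w = np.exp(norm.logpdf(X[:, None], theta_grid, 1.0).sum(axis=0)) * prior

Z = np.sum(w) * dtheta                                         # int pi(theta) f(X|theta) dtheta
theta_mmse = np.sum(theta_grid * w) * dtheta / Z               # E[theta | X]

lhs = theta_mmse ** 2 * Z                                      # numerator of the slide's test statistic
D0 = np.sum(theta_grid ** 2 * w) * dtheta                      # D(0, X) for squared-error cost
Dmin = np.sum((theta_mmse - theta_grid) ** 2 * w) * dtheta     # min_U D(U, X), attained at E[theta|X]
print(f"slide statistic numerator: {lhs:.6e},  D(0,X) - min_U D(U,X): {D0 - Dmin:.6e}")
```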
27 MEDIAN estimator: C(θ̂, θ) = |θ̂ − θ|.
Optimum test:  ∫_0^{θ̂_MEDIAN} θ π(θ) f(X|θ) dθ / f(X|0) ≷ λ,
with  θ̂_MEDIAN = arg{ θ̂ : ∫_{θ̂}^{∞} π(θ) f(X|θ) dθ / ∫ π(θ) f(X|θ) dθ = 0.5 } = arg{ θ̂ : P[θ ≥ θ̂ | X] = 0.5 }.