ECE531 Lecture 6: Detection of Discrete-Time Signals with Random Parameters

1 ECE531 Lecture 6: Detection of Discrete-Time Signals with Random Parameters. D. Richard Brown III, Worcester Polytechnic Institute, 26-February-2009.

2 Introduction. Today we consider parametric detection problems for signals with one or more unknown parameters. We focus on the case of an N-sample discrete-time observation Y ∈ Y with binary hypotheses
H_0 : Y ~ p_x(y) for x ∈ X_0
H_1 : Y ~ p_x(y) for x ∈ X_1
Our approach: absorb the unknown parameters into our notion of the state x and the associated state space X. In many textbooks, it is common to denote the random parameter(s) as Θ and a realization of these parameters as θ. We will stay consistent with our earlier notation, however, and use x here.

3 Example. Suppose we have a known signal s ∈ R^n and a communication system that transmits either s_0 = −s or s_1 = +s. The signal arrives with unknown amplitude a > 0 and is corrupted by zero-mean AWGN with variance σ². The hypotheses can be written as
H_0 : Y ~ N(a s_0, σ² I)
H_1 : Y ~ N(a s_1, σ² I)
for unknown a > 0. What is the state space X and what are the sets X_0 and X_1? An equivalent, but more clever, way to write these hypotheses is
H_0 : Y ~ N(a s, σ² I) with a < 0
H_1 : Y ~ N(a s, σ² I) with a > 0
for unknown a ∈ R \ {0}. What is the state space X and what are the sets X_0 and X_1?

4 Types of Detectors: What We Already Know. Problem: find a decision rule to decide between
H_0 : Y ~ N(a s, σ² I) with a < 0
H_1 : Y ~ N(a s, σ² I) with a > 0
for unknown a ∈ R \ {0}. We know that these types of problems are composite binary hypothesis testing problems. Bayes decision rules involve computation of two discriminant functions (or commodity costs) g_0(y, π) and g_1(y, π) and subsequent comparison. Under the N-P criterion with significance level (or size) α, our strategy is to reduce the composite HT problem to a simple one: check for the existence of a UMP decision rule (check the critical region and/or monotone likelihood ratio) that maximizes P_D,x for all x ∈ X_1 subject to P_fp,x ≤ α for all x ∈ X_0. If a UMP rule doesn't exist, we can usually find an LMP decision rule.

5 Example: Known Signal with Unknown Amplitude. [Block diagram: the known signal s_k is scaled by the unknown amplitude a, corrupted by additive noise W_k to form Y_k, and passed to an n-sample detector that produces decisions.] We would like to decide on the presence of a known signal s ∈ R^n (H_1) versus the absence of the signal (H_0). The known signal s arrives at the detector corrupted by noise W and with unknown amplitude a. We observe a realization y ∈ R^n of the random variable Y = a s + W, with a = 0 when no signal is present (H_0) and a > 0 when a signal is present (H_1). What kind of hypothesis testing problem is this? Binary, composite, one-sided. What is the state space here? x = a, X = [0, ∞).

6 Bayes Detector for this Example. To develop a Bayes detector for this problem, we need three things: (1) a conditional pdf or pmf statistically describing the observations y ∈ Y for each state x ∈ X, p_x(y); (2) a cost assignment for each state x ∈ X and each hypothesis H_i, C_0(x) and C_1(x); (3) a prior (pdf) on the states, π(x). The Bayes risk of decision rule ρ in this case can be written as
r(ρ, π) = ∫_X π(x) ∫_Y Σ_i ρ_i(y) C_i(x) p_x(y) dy dx
        = ∫_Y Σ_i ρ_i(y) [ ∫_X C_i(x) π(x) p_x(y) dx ] dy,
where the bracketed inner integral is the discriminant function g_i(y, π). We know how to solve this problem: for each y ∈ Y, the Bayes decision rule just compares g_0(y, π) to g_1(y, π) and sets δ_Bπ equal to the index of the smaller discriminant function (or commodity cost).

7 Bayes Detector for this Example. Suppose the noise is distributed as W ~ N(0, Σ). Conditioned on x = a, we can write the density of the vector observation as
p_{x=a}(y) = 1 / ( (2π)^{n/2} |Σ|^{1/2} ) · exp( −(y − a s)^T Σ^{−1} (y − a s) / 2 ).
Suppose also that we have the UCA and the prior
π(x) = π_0 δ(x) + (1 − π_0)( I_x(0) − I_x(1) ),
where I_x(t) = 1 for all x > t and I_x(t) = 0 otherwise. Note that the state space here is X = [0, 1].
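The decision regions shown on the next three slides can be reproduced numerically. Below is a minimal sketch (not the instructor's code; the noise variance and the test point are my own assumptions) that evaluates the two discriminant functions for this prior under the uniform cost assignment and applies the comparison rule from the previous slide.

```python
import numpy as np
from scipy.stats import multivariate_normal
from scipy.integrate import quad

# Hypothetical sketch: Bayes discriminant functions for slide 7, with UCA costs
# C_i(x) = 1{x in X_j, j != i}, prior pi(x) = pi0*delta(x) + (1-pi0)*Uniform(0,1],
# and W ~ N(0, sigma^2 I). sigma2 is an assumed value, not given on the slide.
s = np.array([1.5, 0.5])      # known signal (as on slide 8)
sigma2 = 1.0                  # assumed noise variance
pi0 = 0.5                     # prior probability of the H0 state (a = 0)

def p_xa(y, a):
    """Conditional density p_{x=a}(y) of the observation."""
    return multivariate_normal.pdf(y, mean=a * s, cov=sigma2 * np.eye(len(s)))

def g(y, i):
    """Discriminant g_i(y, pi) = integral of C_i(x) pi(x) p_x(y) dx under UCA."""
    if i == 0:
        # C_0(x) = 1 for x in X_1 = (0, 1]: integrate over the continuous part of the prior
        val, _ = quad(lambda a: p_xa(y, a), 0.0, 1.0)
        return (1 - pi0) * val
    else:
        # C_1(x) = 1 only for x = 0 (the H0 state): pick up the point mass at zero
        return pi0 * p_xa(y, 0.0)

def bayes_rule(y):
    """Decide the index of the smaller discriminant function."""
    return 0 if g(y, 0) <= g(y, 1) else 1

print(bayes_rule(np.array([1.0, 0.3])))   # example observation
```

Sweeping bayes_rule over a grid of observations traces out decision boundaries like those plotted on slides 8-10.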

8 Bayes Decision Rule: π_0 = 0.5 and s = [1.5, 0.5]. [Figure: decision regions of the Bayes rule plotted in the observation plane; axes labeled ybar0 and ybar.]

9 Bayes Decision Rule: π_0 = 0.4 and s = [1.5, 0.5]. [Figure: decision regions of the Bayes rule plotted in the observation plane; axes labeled ybar0 and ybar.]

10 Bayes Decision Rule: π_0 = 0.75 and s = [1.5, 0.5]. [Figure: decision regions of the Bayes rule plotted in the observation plane; axes labeled ybar0 and ybar.]


12 Neyman-Pearson Detector. To develop an N-P detector for this problem, we need two things: (1) a conditional pdf or pmf statistically describing the observations y ∈ Y for each state x ∈ X, p_x(y); (2) a significance level for the test, P_fp ≤ α. How should we approach the problem? Check to see if a UMP detector exists. If not, derive an LMP detector with good performance for a near zero. A new option: if we can specify a prior on the states, denoted as π(x), we can reduce each composite hypothesis to a simple one by taking a weighted average of the density with respect to the prior, i.e.
H_j : Y ~ p_j(y) = ∫_{X_j} p_x(y) π_j(x) dx,
where π_j(x) = π(x) / Prob(x ∈ X_j) when x ∈ X_j and is equal to zero otherwise. Then testing H_0 versus H_1 is a simple hypothesis testing problem for which the N-P lemma gives a unique optimal solution.

13 Neyman-Pearson Detector. Suppose the noise is distributed as W ~ N(0, σ² I). Conditioned on x = a, we can write the density of the vector observation as
p_{x=a}(y) = 1 / (2πσ²)^{n/2} · exp( −(y − a s)^T (y − a s) / (2σ²) )
           = ∏_{k=0}^{n−1} p_{x=a}(y_k)
           = ∏_{k=0}^{n−1} 1/√(2πσ²) · exp( −(y_k − a s_k)² / (2σ²) ).
Note that any results we derive for this case can also be applied to the case W ~ N(0, Σ) by using our coordinate-transformation (decorrelation) trick.
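As a reminder of how that trick works in practice, here is a minimal sketch (the covariance and observation values are my own examples, not from the slides): multiplying both the observation and the signal by Σ^{−1/2} whitens the noise, after which the i.i.d. results that follow apply directly.

```python
import numpy as np

# Hypothetical sketch of the decorrelation (whitening) trick: if W ~ N(0, Sigma),
# transform the observation and signal by Sigma^{-1/2} so the transformed noise is white.
def whiten(y, s, Sigma):
    # Sigma^{-1/2} via the eigendecomposition Sigma = U diag(lam) U^T
    lam, U = np.linalg.eigh(Sigma)
    Sigma_inv_sqrt = U @ np.diag(1.0 / np.sqrt(lam)) @ U.T
    return Sigma_inv_sqrt @ y, Sigma_inv_sqrt @ s

Sigma = np.array([[1.0, 0.4], [0.4, 2.0]])   # example covariance (my assumption)
y = np.array([0.7, -0.2])
s = np.array([1.5, 0.5])
y_w, s_w = whiten(y, s, Sigma)   # now the whitened model is Y_w = a*s_w + W_w with W_w ~ N(0, I)
```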

14 Neyman-Pearson Detector: UMP Decision Rule. Let's set up a simple HT test: H_0 : a = 0 versus H_1 : a = a_1 > 0. The N-P lemma says that, for this simple HT problem, we can find an α-level decision rule of the form
ρ(y) = 1 if ln L(y) > ln v; γ if ln L(y) = ln v; 0 if ln L(y) < ln v,
where ln L(y) := ln( p_1(y) / p_0(y) ), with v ≥ 0 and γ ∈ [0, 1] chosen such that P_fp = α.

15 Log-Likelihood Ratio. The log-likelihood ratio for this simple HT test:
ln L(y) = ln( p_1(y) / p_0(y) )
        = ln( ∏_{k=0}^{n−1} exp( −(y_k − a_1 s_k)² / (2σ²) ) / ∏_{k=0}^{n−1} exp( −y_k² / (2σ²) ) )
        = ln( ∏_{k=0}^{n−1} exp( (2 y_k a_1 s_k − a_1² s_k²) / (2σ²) ) )
        = (a_1/σ²) Σ_{k=0}^{n−1} ( y_k s_k − a_1 s_k²/2 )
        = (a_1/σ²) ( s^T y − a_1 ‖s‖²/2 ).
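As a quick sanity check of this simplification (a sketch with example values of my own choosing), one can compare a direct evaluation of ln(p_1(y)/p_0(y)) against the closed form above:

```python
import numpy as np
from scipy.stats import multivariate_normal

# Hypothetical numerical check of the log-likelihood ratio simplification above
# (signal, amplitude, noise variance, and observation are my own example values).
sigma2, a1 = 1.0, 0.8
s = np.array([1.5, 0.5])
y = np.array([0.9, -0.3])

# Direct evaluation of ln(p1(y)/p0(y))
llr_direct = (multivariate_normal.logpdf(y, mean=a1 * s, cov=sigma2 * np.eye(2))
              - multivariate_normal.logpdf(y, mean=np.zeros(2), cov=sigma2 * np.eye(2)))

# Closed form from the slide: (a1/sigma^2) * (s^T y - a1*||s||^2/2)
llr_closed = (a1 / sigma2) * (s @ y - a1 * (s @ s) / 2)

print(np.isclose(llr_direct, llr_closed))   # True
```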

16 False Positive Probability.
P_fp = Prob( ln L(Y) > ln v | a = 0 ) + γ Prob( ln L(Y) = ln v | a = 0 )
     = Prob( s^T Y > (σ²/a_1) ln v + a_1 ‖s‖²/2 | a = 0 ).
How is s^T Y distributed when a = 0? Since Y = W in that case, s^T Y ~ N(0, σ² ‖s‖²). Hence, v must be selected such that
P_fp = Q( ( (σ²/a_1) ln v + a_1 ‖s‖²/2 ) / ( ‖s‖ σ ) ) = α.

17 Neyman-Pearson Detector: UMP Decision Rule. Does a UMP decision rule exist? To answer this question, we need to determine whether the critical region Γ_1 = {y ∈ Y : ρ(y) decides H_1} depends on our choice of a_1. We decide H_1 for sure when
s^T y > (σ²/a_1) ln v + a_1 ‖s‖²/2
and we decide H_1 with probability γ when
s^T y = (σ²/a_1) ln v + a_1 ‖s‖²/2
(which happens with probability zero). Critical region depends on a_1 ⇒ no UMP decision rule. What should we do now?

18 Neyman-Pearson Detector: UMP Decision Rule. Hold on. Does the critical region
Γ_1 = { y ∈ R^n : s^T y > (σ²/a_1) ln v + a_1 ‖s‖²/2 }
really depend on a_1? Given 0 ≤ α ≤ 1 and t > 0, the unique solution to Q(z/t) = α is z = t Q^{−1}(α). Hence, the unique solution to
Q( ( (σ²/a_1) ln v + a_1 ‖s‖²/2 ) / ( ‖s‖ σ ) ) = α
is
(σ²/a_1) ln v + a_1 ‖s‖²/2 = ‖s‖ σ Q^{−1}(α).
Hence the critical region for a significance-level-α N-P decision rule can be written as
Γ_1 = { y ∈ R^n : s^T y > ‖s‖ σ Q^{−1}(α) }.
Does this depend on a_1? Does a UMP decision rule exist?

19 Neyman-Pearson Detector: UMP Decision Rule.
ρ_UMP(y) = 1 if s^T y > ‖s‖ σ Q^{−1}(α); γ if s^T y = ‖s‖ σ Q^{−1}(α); 0 if s^T y < ‖s‖ σ Q^{−1}(α).
How would this change if the noise were distributed as W ~ N(0, Σ)?
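A minimal implementation sketch of this rule (the signal, noise level, and α below are my own choices) with a Monte-Carlo check that its false-alarm rate under H_0 is close to α:

```python
import numpy as np
from scipy.stats import norm

# Hypothetical sketch: implement the alpha-level UMP rule s^T y > ||s||*sigma*Q^{-1}(alpha)
# and estimate its false-alarm rate by simulation under H0 (a = 0).
rng = np.random.default_rng(0)
s = np.array([1.5, 0.5])
sigma, alpha = 1.0, 0.05
tau = np.linalg.norm(s) * sigma * norm.isf(alpha)   # Q^{-1}(alpha) = norm.isf(alpha)

def ump_decide(y):
    # The randomization gamma is irrelevant here: s^T Y = tau has probability zero.
    return int(s @ y > tau)

# Under H0 the observation is pure noise; the empirical false-alarm rate
# should be close to alpha = 0.05.
Y0 = rng.normal(0.0, sigma, size=(100_000, 2))
print(np.mean([ump_decide(y) for y in Y0]))   # approximately 0.05
```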

20 Neyman-Pearson Detector: LMP Decision Rule. If the UMP detector did not exist or was too complicated, we could find an LMP detector for this example by comparing d/da L_a(y) |_{a=0} to a threshold. When the noise samples are i.i.d., the likelihood ratio can be written as
L_a(y) = p_{x=a}(y) / p_{x=0}(y) = ∏_{k=0}^{n−1} q(y_k − a s_k) / q(y_k),
where q(x) is the pmf/pdf of the kth noise sample. In our example, q(x) = 1/√(2πσ²) · e^{−x²/(2σ²)}. We'll continue our analysis here for i.i.d. noise with general distribution q(x)...

21 Neyman-Pearson Detector: LMP Decision Rule. Taking the derivative of L_a(y) with respect to a yields
d/da L_a(y) = Σ_{k=0}^{n−1} ( −s_k q'(y_k − a s_k) / q(y_k) ) ∏_{j≠k} q(y_j − a s_j) / q(y_j),
where q'(x) = d/dx q(x). Setting a = 0 yields
d/da L_a(y) |_{a=0} = Σ_{k=0}^{n−1} ( −s_k q'(y_k) / q(y_k) ) ∏_{j≠k} q(y_j) / q(y_j)
                    = Σ_{k=0}^{n−1} s_k ( −q'(y_k) / q(y_k) )
                    = Σ_{k=0}^{n−1} s_k h_LMP(y_k).

22 Neyman-Pearson Detector: LMP Decision Rule. The locally most powerful decision rule then takes the form
ρ_LMP(y) = 1 if Σ_{k=0}^{n−1} s_k h_LMP(y_k) > τ; γ if Σ_{k=0}^{n−1} s_k h_LMP(y_k) = τ; 0 if Σ_{k=0}^{n−1} s_k h_LMP(y_k) < τ,
where γ and τ are selected such that P_fp = α. In our example, h_LMP(y_k) := −q'(y_k)/q(y_k) = y_k/σ². Hence, with τ' := τσ², the LMP decision rule takes the form
ρ_LMP(y) = 1 if s^T y > τ'; γ if s^T y = τ'; 0 if s^T y < τ',
with τ' and γ selected so that P_fp = α. How does this compare to the UMP decision rule? ρ_LMP(y) = ρ_UMP(y) here.
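A quick finite-difference check of the derivative formula from the previous slide, specialized to the Gaussian q used here (a sketch with example values of my own choosing):

```python
import numpy as np

# Hypothetical check: d/da L_a(y) at a = 0 should match sum_k s_k * h_LMP(y_k)
# with h_LMP(y_k) = -q'(y_k)/q(y_k) = y_k / sigma^2 for Gaussian q.
sigma2 = 1.0
s = np.array([1.5, 0.5, -0.7])
y = np.array([0.4, -1.1, 0.3])

def q(x):
    return np.exp(-x**2 / (2 * sigma2)) / np.sqrt(2 * np.pi * sigma2)

def L(a):
    return np.prod(q(y - a * s) / q(y))

eps = 1e-6
deriv_fd = (L(eps) - L(-eps)) / (2 * eps)          # numerical derivative at a = 0
deriv_closed = np.sum(s * y / sigma2)              # sum_k s_k * h_LMP(y_k)
print(np.isclose(deriv_fd, deriv_closed))          # True
```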

23 LMP Decision Rule for i.i.d. Laplacian Noise. For b > 0, the Laplacian density is given as q(x) = (b/2) e^{−b|x|}. For Laplacian noise, h_LMP(y_k) = −q'(y_k)/q(y_k) = b sgn(y_k). Hence, the locally most powerful decision rule can be written as
ρ_LMP(y) = 1 if Σ_{k=0}^{n−1} s_k sgn(y_k) > τ; γ if Σ_{k=0}^{n−1} s_k sgn(y_k) = τ; 0 if Σ_{k=0}^{n−1} s_k sgn(y_k) < τ.
In this case, ρ_LMP(y) ≠ ρ_UMP(y).
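A sketch of this sign-correlator detector (the signal, Laplacian parameter, and α are my own assumptions, and the threshold is estimated empirically rather than derived in closed form):

```python
import numpy as np

# Hypothetical sketch: the locally most powerful (sign-correlator) statistic for
# i.i.d. Laplacian noise, with a threshold tau chosen by Monte Carlo under H0
# so that P_fp is approximately alpha.
rng = np.random.default_rng(1)
s = np.array([1.5, 0.5, -1.0, 0.2])
alpha = 0.05

def lmp_statistic(y):
    return np.sum(s * np.sign(y))

# Estimate the threshold from the H0 (noise-only) distribution of the statistic.
b = 2.0                                   # assumed Laplacian parameter
W = rng.laplace(scale=1.0 / b, size=(200_000, len(s)))
t0 = np.array([lmp_statistic(w) for w in W])
tau = np.quantile(t0, 1 - alpha)          # approximate alpha-level threshold

def lmp_decide(y):
    return int(lmp_statistic(y) > tau)    # randomization gamma ignored in this sketch
```

Because the statistic takes only finitely many values, hitting P_fp = α exactly would in general require the randomization γ, which the sketch above omits.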

24 The Third Alternative: Average Density Over a Prior. The approach followed in your textbook (as well as Example III.B.5) is to reduce the composite HT problem to a simple HT problem by using the prior π(x) to compute a single density as a weighted average of the family of densities associated with H_j, i.e.
p_j(y) = ∫_{X_j} p_x(y) π_j(x) dx,
where π_j(x) = π(x) / Prob(x ∈ X_j) when x ∈ X_j and is equal to zero otherwise. In this case, given a significance level α, the N-P lemma tells us that a unique optimal solution must exist based on a threshold test of the likelihood ratio
L(y) = p_1(y) / p_0(y) = ∫_{X_1} p_x(y) π_1(x) dx / ∫_{X_0} p_x(y) π_0(x) dx.
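Applied to the example of slide 7, this averaging can be carried out numerically. Below is a minimal sketch (my own code and values; the noise variance, test point, and threshold are assumptions, not from the slides): p_1(y) averages p_a(y) over the conditional prior π_1(a) = Uniform(0, 1], while X_0 = {0} makes p_0(y) just the noise-only density.

```python
import numpy as np
from scipy.stats import multivariate_normal
from scipy.integrate import quad

# Hypothetical sketch of the "third alternative" for the slide-7 example:
# average p_a(y) over pi_1(a) = Uniform(0,1] to get p_1(y), then threshold p_1/p_0.
s = np.array([1.5, 0.5])
sigma2 = 1.0                                      # assumed noise variance

def p_a(y, a):
    return multivariate_normal.pdf(y, mean=a * s, cov=sigma2 * np.eye(len(s)))

def averaged_lr(y):
    p1, _ = quad(lambda a: p_a(y, a), 0.0, 1.0)   # p_1(y) = int_0^1 p_a(y) da
    p0 = p_a(y, 0.0)                              # X_0 = {0}, so p_0(y) = p_{a=0}(y)
    return p1 / p0

y = np.array([1.0, 0.4])
print(averaged_lr(y) > 1.2)   # compare to a threshold chosen for the desired P_fp
```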

25 Remarks on the Third Alternative. If there is some uncertainty as to the prior, the usual approach is to choose π(x) such that it gives as little information about x as possible, i.e. such that the simple HT problem has maximum difficulty. Example III.B.5: Noncoherent Detection of a Modulated Sinusoidal Carrier. In this example, the unknown parameter is the phase of the received signal. What prior on this phase would give the least information? π(x) ~ U[0, 2π). This is exactly the prior used in the example in your textbook. Please read this example and, in particular, note the development of the test statistic for deciding between two different (but known) signals with unknown phase on page 71.
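To see, in rough terms, why the uniform phase prior leads to an envelope statistic (a sketch in my own notation, where a_k denotes the known modulation sequence, and assuming the signal-energy term is essentially independent of θ so it can be absorbed into the proportionality constant): averaging the θ-dependent factor of the likelihood over θ ~ U[0, 2π) gives
(1/2π) ∫_0^{2π} exp( (1/σ²) Σ_k y_k a_k cos(ω_c k + θ) ) dθ = I_0( (1/σ²) √(y_c² + y_s²) ),
where y_c = Σ_k y_k a_k cos(ω_c k), y_s = Σ_k y_k a_k sin(ω_c k), and I_0 is the zeroth-order modified Bessel function of the first kind. Since I_0 is monotone increasing, thresholding the averaged likelihood ratio amounts to thresholding the envelope √(y_c² + y_s²); the textbook example develops the analogous comparison between the envelopes of the two candidate signals.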

26 Conclusions. Detection of known signals in noise: simple hypothesis testing. Detection of signals with one or more unknown parameters in noise: composite hypothesis testing. You are encouraged to at least skim the material on stochastic signals in Section III.B (up to page 81). I also plan to discuss Section III.D, Sequential Detection, in the optional lecture after the final exam.

27 Midterm Exam: What You Need to Know.
Different types of hypothesis testing problems (binary, M-ary, simple, composite).
Mathematical model of hypothesis testing problems.
Intuition about good and bad decision rules.
Poor textbook, Chapter II (whole chapter): Bayesian hypothesis testing (binary and M-ary, simple and composite); minimax hypothesis testing (binary, simple); Neyman-Pearson hypothesis testing (binary, simple and composite).
Poor textbook, Chapter III (up to page 72): decorrelation of signals observed in correlated Gaussian noise; detection of known discrete-time signals (binary and M-ary); detection of discrete-time signals with random parameters (binary).
