Detection and Estimation Theory


ESE 524 Detection and Estimation Theory
Joseph A. O'Sullivan, Samuel C. Sachs Professor
Electronic Systems and Signals Research Laboratory
Electrical and Systems Engineering, Washington University
211 Urbauer Hall, 314-935-4173 (Lynda answers), jao@wustl.edu
J. A. O'Sullivan, ESE 524, Lecture 4, 1/22/09

Minimax Decision Rule
Find the decision rule that minimizes the maximum Bayes risk over all possible priors:
minimize over decision rules the maximum over P_1 of the risk R(P_1).

Criterion                                  Transition Densities   Priors P_0 & P_1   Costs C_ij
Bayes (minimize expected risk or cost)     Yes                    Yes                Yes
Minimum Probability of Error               Yes                    Yes                No
Minimax                                    Yes                    No                 Yes
Neyman-Pearson                             Yes                    No                 No

Minimax Decision Rule: Analysis
For any fixed decision rule, the risk is linear in P_1. The maximum over P_1 is therefore achieved at an end point. To make that end point as low as possible, the risk should be constant with respect to P_1. To minimize that constant value, the risk should achieve the minimum (Bayes) risk at some P_1*. At that value of the prior, the best decision rule is a likelihood ratio test.

R = C_M P_M P_1 + C_F P_F P_0, where
P_F = ∫_{Z_1} p_{r|H_0}(R|H_0) dR and P_M = ∫_{Z_0} p_{r|H_1}(R|H_1) dR.

Minimax Decision Rule: Analysis (cont'd)
The Bayes risk is concave in P_1 (it always lies below its tangent lines). The minimax prior is therefore achieved either at an end point or at an interior point of the Bayes risk curve where the tangent slope is zero.

[Figure: Bayes risk versus P_1, with the linear risks of fixed decision rules tangent to the concave Bayes risk curve]
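The geometry described above can be checked numerically. A minimal Python sketch (using the illustrative ROC P_D = P_F^0.2 from the exponential example, with C_M = C_F = 1; grid sizes are arbitrary choices): the risk of each fixed operating point is linear in P_1, the Bayes risk is their concave lower envelope, and at its maximum the minimizing rule satisfies the equalizer condition C_F P_F = C_M P_M.

```python
import numpy as np

# Illustrative ROC from the exponential example: P_D = P_F^0.2, with C_M = C_F = 1.
cm, cf = 1.0, 1.0
pf = np.linspace(1e-6, 1.0, 2001)           # family of operating points
pd = pf ** 0.2

p1 = np.linspace(0.0, 1.0, 1001)            # prior probability of H1
# Risk of each fixed operating point, linear in p1:
risk = cm * (1 - pd)[:, None] * p1 + cf * pf[:, None] * (1 - p1)
bayes = risk.min(axis=0)                    # concave lower envelope = Bayes risk

i = bayes.argmax()                          # minimax prior P1*
j = risk[:, i].argmin()                     # minimax operating point on the ROC
print(f"P1* ~ {p1[i]:.3f}, minimax risk ~ {bayes[i]:.3f}")
print(f"cf*Pf ~ {cf * pf[j]:.3f}, cm*Pm ~ {cm * (1 - pd[j]):.3f}")  # equalizer
```

For this ROC the minimax risk works out to about 0.245, with the false-alarm and miss terms equal there, matching the flat-tangent picture.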

Minimax Decision Rule: Example
H_0: p(x|H_0) = 5 e^{-5x}, x ≥ 0
H_1: p(x|H_1) = e^{-x}, x ≥ 0
Log-likelihood ratio: l(x) = 4x − ln 5
P_F = ∫_{γ'}^∞ 5 e^{-5x} dx = e^{-5γ'}
P_M = ∫_0^{γ'} e^{-x} dx = 1 − e^{-γ'}
P_D = 1 − P_M = e^{-γ'} = P_F^{0.2}

Matlab Code:
pf=0:.1:1;
pd=pf.^.2;
eta=.2*(pf.^(-.8)); % eta=dP_D/dP_F
figure
plot(pf,pd); xlabel('P_F'); ylabel('P_D')
cm=1; cf=1;
p1star=1./(1+cm*eta/cf);
riskoptimal=cm*(1-pd).*p1star+cf*pf.*(1-p1star);
figure
plot(p1star,riskoptimal,'b'), hold on
p1=0:.1:1;
r1=cm*(1-pd(1))*p1+cf*pf(1)*(1-p1);
plot(p1,r1,'r'), hold on
r2=cm*(1-pd(2))*p1+cf*pf(2)*(1-p1);
plot(p1,r2,'g'), hold on
r3=cm*(1-pd(3))*p1+cf*pf(3)*(1-p1);
plot(p1,r3,'c')
xlabel('P_1'); ylabel('Risk')

[Figures: ROC, P_D versus P_F; risk versus P_1 with tangent lines]
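The closed-form error probabilities of this example are easy to sanity-check by simulation. A minimal Python sketch (the threshold value, seed, and sample size are illustrative choices, not from the slides):

```python
import numpy as np

rng = np.random.default_rng(0)
gamma = 0.5                               # an illustrative threshold gamma' on x
n = 200_000

x0 = rng.exponential(scale=1/5, size=n)   # H0: p(x|H0) = 5 e^{-5x}
x1 = rng.exponential(scale=1.0, size=n)   # H1: p(x|H1) = e^{-x}

pf_mc = np.mean(x0 > gamma)               # false alarm: decide H1 under H0
pm_mc = np.mean(x1 <= gamma)              # miss: decide H0 under H1

pf = np.exp(-5 * gamma)                   # closed form: P_F = e^{-5 gamma'}
pm = 1 - np.exp(-gamma)                   # closed form: P_M = 1 - e^{-gamma'}
print(f"P_F: {pf_mc:.4f} vs {pf:.4f};  P_M: {pm_mc:.4f} vs {pm:.4f}")
print(f"P_D = P_F^0.2: {pf ** 0.2:.4f} = {1 - pm:.4f}")
```

The Monte Carlo rates agree with e^{-5γ'} and 1 − e^{-γ'}, and P_D = P_F^{0.2} holds identically, which is exactly the ROC the Matlab code plots.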

Neyman-Pearson Decision Rule
Minimize P_M subject to P_F ≤ α. Variational approach; the upper bound is usually achieved. The result is a likelihood ratio test; how is the threshold set?

F(α) = P_M + η (P_F − α)
     = ∫_{Z_0} p_{r|H_1}(R|H_1) dR + η [ ∫_{Z_1} p_{r|H_0}(R|H_0) dR − α ]

Neyman-Pearson Decision Rule (cont'd)
Plot the ROC: P_D versus P_F for the family of likelihood ratio tests. Draw a vertical line where P_F = α and find the corresponding P_D. At that point, the threshold equals the derivative of the ROC:

η = dP_D/dP_F = (dP_D/dη) / (dP_F/dη)
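For the exponential example, both the operating point and the threshold can be read off the ROC in closed form. A short Python sketch (the value of α is an illustrative choice): given P_F = α, the decision threshold on x follows from P_F = e^{-5γ'}, and the LRT threshold η equals the ROC slope at that point.

```python
import numpy as np

alpha = 0.1                        # size constraint: P_F = alpha
gamma = -np.log(alpha) / 5         # threshold on x, from P_F = e^{-5 gamma'}
pd = alpha ** 0.2                  # operating point on the ROC P_D = P_F^0.2
eta = 0.2 * alpha ** (-0.8)        # LRT threshold = ROC slope dP_D/dP_F at alpha

# Check eta against a finite-difference slope of the ROC at P_F = alpha
h = 1e-6
slope_fd = ((alpha + h) ** 0.2 - (alpha - h) ** 0.2) / (2 * h)
print(f"gamma' = {gamma:.4f}, P_D = {pd:.4f}, eta = {eta:.4f}, FD slope = {slope_fd:.4f}")
```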

Summary
Several decision rules were covered. The likelihood (and log-likelihood) ratio test is optimal. The receiver operating characteristic (ROC) plots probability of detection versus probability of false alarm with the threshold as a parameter; it captures all possible optimal performance.
Neyman-Pearson is a point on the ROC (P_F = α).
Minimax is a point on the ROC (C_F P_F = C_M P_M).
Minimum probability of error is a point on the ROC (slope η = (1 − P_1)/P_1).

(Somewhat) Practical Example
Given an image, find the parts of the image that are different. Model: Gaussian data under either hypothesis; under H_1, the variance is greater than under H_0. Example: image data, where the background represents the null hypothesis, modeled as Gaussian.

(Somewhat) Practical Example (cont'd)

normsq_ij = (x_ij − μ)^2 / (2σ^2)

I_ij = 1 if normsq_ij > γ, 0 otherwise (with γ = 3)

[Figures: Histogram of Image; Histogram of NormSq]

Matlab Code
threshold=3;
im1=imread('passengershudsonplaneap.jpg','jpg');
im1=sum(double(im1),3);
figure; imagesc(im1); colormap gray; axis off
[s1,s2]=size(im1);
x=im1(1:12,1:18);
size(x)
x=reshape(x,1,12*18);
[hx,ix]=hist(x,5);
figure, plot(ix,hx); title('Histogram of Image')
mu=mean(x); sigma=std(x);
normimage=(im1-mu).^2/(2*sigma^2);
normimage2=reshape(normimage,1,numel(normimage));
[hx,ix]=hist(normimage2,5);
figure, plot(ix,hx); title('Histogram of NormSq')
[xi,indexx]=find(normimage2>threshold);
im2=reshape(im1,1,numel(im1));
imagethresh=mu*ones(size(im2));
imagethresh(indexx)=im2(indexx);
imagethresh=reshape(imagethresh,s1,s2);
figure; imagesc(imagethresh); colormap gray; axis off
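The same variance-test pipeline can be mirrored in Python on synthetic data. A sketch under stated assumptions: since the original photograph is not included, a synthetic image is generated with a Gaussian background (H_0) and a higher-variance patch (H_1); the image size, patch location, and noise parameters are all illustrative choices.

```python
import numpy as np

rng = np.random.default_rng(1)
gamma = 3.0                                   # threshold on the normalized square

# Synthetic stand-in for the image: Gaussian background (H0)
# plus a higher-variance patch (H1), as in the slide's model.
img = rng.normal(100.0, 5.0, size=(64, 64))
img[20:30, 20:30] += rng.normal(0.0, 25.0, size=(10, 10))

# Estimate background statistics from a corner assumed to be all background
bg = img[:10, :10]
mu, sigma = bg.mean(), bg.std()

normsq = (img - mu) ** 2 / (2 * sigma ** 2)   # per-pixel test statistic
detected = normsq > gamma                     # declare H1 where it exceeds gamma

print(f"patch hit rate:  {detected[20:30, 20:30].mean():.2f}")
print(f"background rate: {detected[40:, 40:].mean():.3f}")
```

Most pixels inside the high-variance patch exceed the threshold, while only a small fraction of background pixels do, which is the behavior the Matlab thresholding step exploits.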