Detection Theory. Chapter 3. Statistical Decision Theory I. Isael Diaz Oct 26th 2010


Outline: Neyman-Pearson Theorem, Detector Performance, Irrelevant Data, Minimum Probability of Error, Bayes Risk, Multiple Hypothesis Testing

Chapter Goal: Lay the basic statistical groundwork for detector design. Address simple hypotheses, where the PDFs are known (Chapter 6 will cover the case of unknown PDFs). Introduce the classical approaches based on Neyman-Pearson and the Bayes risk: Bayesian methods employ prior knowledge of the hypotheses, whereas Neyman-Pearson assumes no prior knowledge.

Bayes vs Neyman-Pearson: The particular method employed depends upon our willingness to incorporate prior knowledge about the probabilities of occurrence of the various hypotheses. Typical cases. Bayes: communication systems, pattern recognition. NP: sonar, radar.

Nomenclature
N(μ, σ²): Gaussian PDF with mean μ and variance σ²
H_i: hypothesis i
p(x; H_i): probability of observing x under assumption H_i
P(H_i; H_j): probability of deciding H_i when H_j is true
P(H_i | H_j): probability of deciding H_i when H_j was observed
L(x): likelihood ratio of x

Nomenclature cont... Note that P(H_i; H_j) is the probability of deciding H_i if H_j is true, while P(H_i | H_j) assumes that the outcome of a probabilistic experiment is observed to be H_j, and the probability of deciding H_i is conditioned on that outcome.

Neyman-Pearson Theorem

Neyman-Pearson Theorem cont...
H_0: x[0] = w[0]
H_1: x[0] = s[0] + w[0]
It is not possible to reduce both error probabilities simultaneously. In Neyman-Pearson the approach is to hold one error probability fixed while minimizing the other.

Neyman-Pearson Theorem cont... We wish to maximize the probability of detection P_D = P(H_1; H_1) and constrain the probability of false alarm P_FA = P(H_1; H_0) by choosing the threshold γ.

Neyman-Pearson Theorem cont... To maximize the probability of detection for a given probability of false alarm, decide H_1 if
L(x) = p(x; H_1) / p(x; H_0) > γ
(the likelihood ratio test), with the threshold γ found from
P_FA = ∫_{x: L(x) > γ} p(x; H_0) dx.
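As an added illustration (not part of the original slides), a minimal numerical sketch of this likelihood ratio test for a single sample, assuming a DC level A = 1 in Gaussian noise with σ² = 1 and a hypothetical threshold:

import numpy as np
from scipy.stats import norm

A, sigma = 1.0, 1.0      # assumed signal level and noise std (hypothetical)
gamma = 2.0              # assumed LRT threshold (hypothetical)

def likelihood_ratio(x):
    # L(x) = p(x; H1) / p(x; H0) for x = A + w vs x = w, with w ~ N(0, sigma^2)
    return norm.pdf(x, loc=A, scale=sigma) / norm.pdf(x, loc=0.0, scale=sigma)

x = 1.3                  # a single observed sample
decide_H1 = likelihood_ratio(x) > gamma
print(decide_H1)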

Likelihood Ratio: With L(x) = p(x; H_1) / p(x; H_0), the LR test rejects the null hypothesis when this statistic is large (exceeds γ). Intuition: a large P_FA corresponds to a small γ, so H_1 is decided even when p(x; H_1) is not much larger than p(x; H_0). Limiting cases: P_FA → 1 corresponds to γ → 0 (always decide H_1); P_FA → 0 corresponds to γ → ∞ (never decide H_1).

Neyman-Pearson Theorem cont... Example 1: Maximize P_D = P(H_1; H_1). Calculate the threshold so that P_FA = 10^-3.

Neyman-Pearson Theorem Example 1: [worked derivation on the slide, equations (1)-(8), solving P_FA = 10^-3 for the threshold; the result is γ' ≈ 3]

Question: How can P_D be improved? Answer: only by allowing a larger P_FA; the NP detector is already optimal. By allowing a P_FA of 0.5, the P_D is increased to 0.85.
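The numbers quoted above can be reproduced with the standard Q-function relations; this added sketch assumes the single-sample example with A = 1 and σ² = 1, so that √(NA²/σ²) = 1:

import numpy as np
from scipy.stats import norm

Q, Qinv = norm.sf, norm.isf       # Q(x) = 1 - Phi(x); Q^-1 via the inverse survival function

d = 1.0                           # assumed sqrt(N A^2 / sigma^2) = 1 (hypothetical values)
for pfa in (1e-3, 0.5):
    pd = Q(Qinv(pfa) - d)         # P_D = Q( Q^-1(P_FA) - sqrt(d^2) )
    print(f"P_FA = {pfa:g}: threshold = {Qinv(pfa):.2f}, P_D = {pd:.3f}")
# P_FA = 1e-3 gives a threshold of about 3.09 (the slide's gamma' ~ 3);
# P_FA = 0.5 gives P_D ~ 0.84, the ~0.85 quoted above.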

System Model: In the general case
H_0: x[n] = w[n], n = 0, 1, ..., N-1
H_1: x[n] = s[n] + w[n], n = 0, 1, ..., N-1
where the signal is s[n] = A for A > 0 and w[n] is WGN with variance σ². It can be shown that the sum (equivalently the sample mean) of the x[n] is a sufficient statistic, so the NP detector compares it against a threshold.
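To make this concrete, here is a Monte Carlo sketch (added here, not part of the slides) of the resulting sample-mean detector; N = 10, A = 0.5, σ = 1 and the target P_FA are hypothetical choices:

import numpy as np
from scipy.stats import norm

rng = np.random.default_rng(0)
N, A, sigma = 10, 0.5, 1.0               # hypothetical parameters
pfa_target = 1e-2
# Under H0, T = mean(x) ~ N(0, sigma^2/N), so the NP threshold is:
gamma_prime = np.sqrt(sigma**2 / N) * norm.isf(pfa_target)

trials = 200_000
w = rng.normal(0, sigma, size=(trials, N))
T_h0 = w.mean(axis=1)                    # statistic under H0
T_h1 = (A + w).mean(axis=1)              # statistic under H1
print("empirical   P_FA:", np.mean(T_h0 > gamma_prime))
print("empirical   P_D :", np.mean(T_h1 > gamma_prime))
print("theoretical P_D :", norm.sf(norm.isf(pfa_target) - np.sqrt(N * A**2 / sigma**2)))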

Detector Performance: In a more general form, the performance of the NP detector is
P_D = Q( Q^-1(P_FA) - √(NA²/σ²) )
where NA²/σ² is the energy-to-noise ratio, or
P_D = Q( Q^-1(P_FA) - √(d²) )
where d² = (E(T; H_1) - E(T; H_0))² / var(T; H_0) is the deflection coefficient.

Detector Performance cont... An alternative way to present the detection performance is the Receiver Operating Characteristic (ROC): a plot of P_D versus P_FA traced out as the threshold γ' varies, with increasing γ' moving the operating point toward smaller P_FA (and smaller P_D).
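As an added illustration, a few ROC points can be generated directly from the Q-function relation above; the deflection value d = 1 is a hypothetical choice:

import numpy as np
from scipy.stats import norm

d = 1.0                                   # assumed sqrt(N A^2 / sigma^2)
pfa = np.logspace(-4, 0, 50)              # sweep of false-alarm probabilities
pd = norm.sf(norm.isf(pfa) - d)           # P_D along the ROC for each P_FA
for p, q in zip(pfa[::10], pd[::10]):
    print(f"P_FA = {p:.1e}  ->  P_D = {q:.3f}")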

Irrelevant data: Some data not belonging to the desired signal might actually be useful. The observed data set is {x[0], x[1], ..., x[N-1], w_R[0], w_R[1], ..., w_R[N-1]}. If
H_0: x[n] = w[n], with reference noise w_R[n] = w[n]
H_1: x[n] = A + w[n], with reference noise w_R[n] = w[n]
then the reference noise can be used to improve the detection. The perfect detector would become
T = x[0] - w_R[0] > A/2.

Irrelevant data cont... Generalizing, using the NP theorem, the likelihood ratio is L(x_1, x_2) = p(x_1, x_2; H_1) / p(x_1, x_2; H_0). If x_1 and x_2 are uncorrelated (and x_2 does not depend on the hypothesis), then x_2 is irrelevant and L(x_1, x_2) = L(x_1).
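A toy simulation (added, not from the slides) contrasting the two cases: a reference channel carrying the same noise realization as the data allows perfect cancellation, while an independent reference does not help; A = 1 and σ = 1 are hypothetical values:

import numpy as np

rng = np.random.default_rng(1)
A, sigma, trials = 1.0, 1.0, 100_000      # hypothetical values

w = rng.normal(0, sigma, trials)          # noise in the data channel
x_h0, x_h1 = w, A + w                     # single observation under H0 and H1

# Case 1: reference noise identical to w -> T = x - w_R is noise-free
T0, T1 = x_h0 - w, x_h1 - w
print("correlated ref,  P_FA and P_miss:", np.mean(T0 > A / 2), np.mean(T1 < A / 2))

# Case 2: reference independent of w -> subtracting it only adds noise
v = rng.normal(0, sigma, trials)
print("independent ref, P_FA and P_miss:", np.mean((x_h0 - v) > A / 2), np.mean((x_h1 - v) < A / 2))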

Minimum Probability of Error: In the Bayesian paradigm the probability of error is defined as
P_e = P(H_1 | H_0) P(H_0) + P(H_0 | H_1) P(H_1).
Minimizing the probability of error, we decide H_1 if
p(x | H_1) / p(x | H_0) > P(H_0) / P(H_1) = γ.
If both prior probabilities are equal we have γ = 1, i.e., decide H_1 if p(x | H_1) > p(x | H_0).

Minimum Probability of Error example: An on-off keyed communication system, where we transmit either s_0[n] = 0 or s_1[n] = A. The probability of transmitting 0 or A is the same, 1/2. Then, in order to minimize the probability of error, γ = 1.

Minimum Probability of Error example cont... Analogous to NP, we decide H_1 if the sample mean x̄ exceeds A/2. Then we calculate the error probability
P_e = Q( √(NA²/(4σ²)) ) = Q( √(d²)/2 )
where d² = NA²/σ² is the deflection coefficient. The probability of error decreases monotonically with the deflection coefficient.
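As a quick added check (the parameter values are hypothetical), the simulated error rate of the x̄ > A/2 rule can be compared with Q(√(NA²/(4σ²))):

import numpy as np
from scipy.stats import norm

rng = np.random.default_rng(2)
N, A, sigma, trials = 8, 1.0, 1.0, 200_000    # hypothetical parameters

bits = rng.integers(0, 2, trials)             # equally likely 0 / A symbols
w = rng.normal(0, sigma, size=(trials, N))
xbar = (bits[:, None] * A + w).mean(axis=1)   # sample mean of each received block
decisions = (xbar > A / 2).astype(int)        # minimum-P_e rule for equal priors

print("empirical   P_e:", np.mean(decisions != bits))
print("theoretical P_e:", norm.sf(np.sqrt(N * A**2 / (4 * sigma**2))))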

Bayes Risk: The Bayes risk assigns costs to the types of errors; C_ij denotes the cost if we decide H_i but H_j is true. If no error is made, no cost is assigned (C_00 = C_11 = 0). The detector that minimizes the Bayes risk R = Σ_i Σ_j C_ij P(H_i | H_j) P(H_j) decides H_1 if
p(x | H_1) / p(x | H_0) > (C_10 - C_00) P(H_0) / ((C_01 - C_11) P(H_1)) = γ.
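For concreteness (not part of the slides), a tiny sketch of how the Bayes-risk threshold follows from hypothetical costs and priors:

# Decide H1 when the likelihood ratio exceeds
#   gamma = (C10 - C00) * P(H0) / ((C01 - C11) * P(H1))
C10, C01, C00, C11 = 1.0, 10.0, 0.0, 0.0   # hypothetical: a miss costs 10x a false alarm
P_H0, P_H1 = 0.5, 0.5                      # hypothetical equal priors
gamma = (C10 - C00) * P_H0 / ((C01 - C11) * P_H1)
print(gamma)                               # 0.1: costly misses lower the threshold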

Multiple Hypothesis Testing: Suppose we have a communication system using M-ary PAM (pulse amplitude modulation) with M = 3.

Multiple Hypothesis Testing cont... The conditional PDF describing the communication system is given by
p(x | H_i) = (2πσ²)^(-N/2) exp( -(1/(2σ²)) Σ_n (x[n] - A_i)² )
where A_i is the amplitude level under H_i. In order to maximize p(x | H_i), we can minimize Σ_n (x[n] - A_i)².

Multiple Hypothesis Testing cont... Then
Σ_n (x[n] - A_i)² = Σ_n (x[n] - x̄)² + N (x̄ - A_i)²
where x̄ is the sample mean. Therefore we decide the hypothesis H_i whose amplitude level A_i is closest to x̄ (a minimum-distance receiver).

Multiple Hypothesis Testing cont... There are six types of errors P(H_i | H_j), i ≠ j, so instead of calculating P_e directly, we calculate P_c = 1 - P_e, the probability of a correct decision, and finally obtain P_e from it.

Multiple Hypothesis Testing cont...
Binary case: P_e = Q( √(NA²/(4σ²)) )
PAM case (M = 3): P_e = (4/3) Q( √(NA²/(4σ²)) )
In the digital communications world there is a difference in the energy per symbol, which is not considered in this example.
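A simulation sketch (added here; the three levels are assumed to be -A, 0, A with spacing A, and the other parameters are hypothetical) comparing the minimum-distance receiver's symbol error rate with the (4/3)Q(√(NA²/(4σ²))) expression:

import numpy as np
from scipy.stats import norm

rng = np.random.default_rng(3)
N, A, sigma, trials = 8, 1.0, 1.0, 200_000      # hypothetical parameters
levels = np.array([-A, 0.0, A])                 # assumed 3-PAM amplitudes, spacing A

symbols = rng.integers(0, 3, trials)
w = rng.normal(0, sigma, size=(trials, N))
xbar = (levels[symbols][:, None] + w).mean(axis=1)

# Minimum-distance receiver: pick the level closest to the sample mean
decisions = np.argmin(np.abs(xbar[:, None] - levels[None, :]), axis=1)

print("empirical   P_e:", np.mean(decisions != symbols))
print("theoretical P_e:", (4 / 3) * norm.sf(np.sqrt(N * A**2 / (4 * sigma**2))))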

Chapter Problems: Warm-up problems: 3.1, 3.2. Problems to be solved in class: 3.4, 3.8, 3.15, 3.18, 3.21.