Detection Theory Chapter 3. Statistical Decision Theory I. Isael Diaz Oct 26th 2010
Outline
- Neyman-Pearson Theorem
- Detector Performance
- Irrelevant Data
- Minimum Probability of Error
- Bayes Risk
- Multiple Hypothesis Testing
Chapter Goal
- State the basic statistical groundwork for detector design
- Address simple hypotheses, where the PDFs are fully known (Chapter 6 will cover the case of unknown PDFs)
- Introduce the classical approaches based on Neyman-Pearson and Bayes risk
  - Bayesian methods employ prior knowledge of the hypotheses
  - Neyman-Pearson assumes no prior knowledge
Bayes vs. Neyman-Pearson
The particular method employed depends upon our willingness to incorporate prior knowledge about the probabilities of occurrence of the various hypotheses.
Typical cases:
- Bayes: communication systems, pattern recognition
- NP: sonar, radar
Nomenclature
- N(μ, σ²): Gaussian PDF with mean μ and variance σ²
- H_i: hypothesis i
- p(x; H_i): probability of observing x under the assumption that H_i is true
- P(H_i; H_j): probability of deciding H_i when H_j is true
- P(H_i | H_j): probability of deciding H_i given that H_j was observed
- L(x): likelihood ratio of x
Nomenclature cont...
Note that P(H_i; H_j) is the probability of deciding H_i if H_j is true. In contrast, P(H_i | H_j) assumes that the outcome of a probabilistic experiment is observed to be H_j, and the probability of deciding H_i is conditioned on that outcome.
Neyman-Pearson Theorem
Neyman-Pearson Theorem cont...
H0: x[0] = w[0]
H1: x[0] = s[0] + w[0]
It is not possible to reduce both error probabilities simultaneously. In the Neyman-Pearson approach, one error probability is held fixed while the other is minimized.
Neyman-Pearson Theorem cont...
We wish to maximize the probability of detection
P_D = P(H1; H1)
while constraining the probability of false alarm
P_FA = P(H1; H0)
by choosing the threshold γ.
Neyman-Pearson Theorem cont...
To maximize the probability of detection for a given probability of false alarm, decide H1 if
L(x) = p(x; H1) / p(x; H0) > γ
(the likelihood ratio test), with the threshold γ found from
P_FA = ∫_{x : L(x) > γ} p(x; H0) dx = α
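As a minimal sketch of the test above (assuming a single-sample problem with H0: x ~ N(0, σ²) and H1: x ~ N(A, σ²); the values A = 1 and σ² = 1 are illustrative, not from the slides):

```python
import math

def likelihood_ratio(x, A=1.0, var=1.0):
    """L(x) = p(x; H1) / p(x; H0) for H0: N(0, var) vs H1: N(A, var)."""
    p1 = math.exp(-(x - A) ** 2 / (2 * var))
    p0 = math.exp(-x ** 2 / (2 * var))
    return p1 / p0  # the 1/sqrt(2*pi*var) normalization factors cancel

def decide_h1(x, gamma, A=1.0, var=1.0):
    """Neyman-Pearson rule: decide H1 when L(x) exceeds the threshold gamma."""
    return likelihood_ratio(x, A, var) > gamma
```

Because L(x) is monotonically increasing in x here, the test is equivalent to comparing x itself against a transformed threshold, x > var·ln(γ)/A + A/2.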
Likelihood Ratio
L(x) = p(x; H1) / p(x; H0)
The LR test rejects the null hypothesis H0 when this statistic is large (exceeds γ).
Intuition: a large P_FA corresponds to a small γ, so p(x; H1) need not be much larger than p(x; H0) to decide H1.
Limiting cases:
- P_FA → 1: γ → 0, and H1 is always decided
- P_FA → 0: γ → ∞, and H1 is never decided
Neyman-Pearson Theorem cont...
Example 1
Maximize P_D = P(H1; H1)
Calculate the threshold so that P_FA = 10⁻³
Neyman-Pearson Theorem Example 1
(Worked steps (1)-(8) on the slide solve the false-alarm constraint for the threshold; the result is γ' ≈ 3.)
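The threshold γ' ≈ 3 can be reproduced numerically. A sketch using only the standard library (Q(x) = 0.5·erfc(x/√2), inverted by bisection; assumes σ² = 1 and a single sample, as in the example):

```python
import math

def Q(x):
    """Right-tail probability of the standard normal."""
    return 0.5 * math.erfc(x / math.sqrt(2.0))

def Q_inv(p, lo=-10.0, hi=10.0):
    """Invert Q by bisection (Q is strictly decreasing)."""
    for _ in range(200):
        mid = 0.5 * (lo + hi)
        if Q(mid) > p:
            lo = mid
        else:
            hi = mid
    return 0.5 * (lo + hi)

gamma_prime = Q_inv(1e-3)  # threshold for P_FA = 10^-3
print(round(gamma_prime, 2))  # prints 3.09
```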
Question
How can P_D be improved? Answer: only by allowing a larger P_FA, since NP gives the optimal detector! By allowing a P_FA of 0.5, P_D is increased to 0.85.
System Model
In the general case,
H0: x[n] = w[n],           n = 0, 1, ..., N−1
H1: x[n] = s[n] + w[n],    n = 0, 1, ..., N−1
where the signal is s[n] = A for A > 0 and w[n] is WGN with variance σ². It can be shown that the sum of the x[n] is a sufficient statistic for this detection problem.
Detector Performance
In a more general setting, the detection performance is
P_D = Q( Q⁻¹(P_FA) − √(NA²/σ²) )
where NA²/σ² is the energy-to-noise ratio (ENR), or
P_D = Q( Q⁻¹(P_FA) − √(d²) )
where d² = NA²/σ² is the deflection coefficient.
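The performance expression can be evaluated directly. A stdlib-only sketch (the deflection value d ≈ 1.04 is an illustrative assumption chosen to match a P_FA = 0.5, P_D ≈ 0.85 operating point; it is not given on the slides):

```python
import math

def Q(x):
    """Right-tail probability of the standard normal."""
    return 0.5 * math.erfc(x / math.sqrt(2.0))

def Q_inv(p, lo=-10.0, hi=10.0):
    """Invert Q by bisection (Q is strictly decreasing)."""
    for _ in range(200):
        mid = 0.5 * (lo + hi)
        lo, hi = (mid, hi) if Q(mid) > p else (lo, mid)
    return 0.5 * (lo + hi)

def P_D(P_FA, d2):
    """P_D = Q(Q^-1(P_FA) - sqrt(d^2)), with d^2 = N*A^2/sigma^2."""
    return Q(Q_inv(P_FA) - math.sqrt(d2))
```

For d² = 0 the formula gives P_D = P_FA (the detector is guessing); increasing d² raises P_D for every fixed P_FA, which is exactly what the ROC curve shows.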
Detector Performance cont...
An alternative way to present the detection performance is the Receiver Operating Characteristic (ROC): P_D plotted versus P_FA, traced out as the threshold γ' varies (decreasing γ' increases both P_FA and P_D).
Irrelevant Data
Some data not belonging to the desired signal may actually be useful. The observed data set is
{ x[0], x[1], ..., x[N−1], w_R[0], w_R[1], ..., w_R[N−1] }
with
H0: x[n] = w[n], plus the reference noise samples w_R[n]
H1: x[n] = A + w[n], plus the reference noise samples w_R[n]
If the reference noise is correlated with w[n], it can be used to improve the detection. In the extreme case w_R[n] = w[n], the perfect detector becomes
T = x[0] − w_R[0]
which equals A under H1 and 0 under H0, so deciding H1 when T > A/2 makes no errors.
Irrelevant Data cont...
Generalizing, using the NP theorem, the likelihood ratio is L(x1, x2). If x1 and x2 are uncorrelated (and hence, being jointly Gaussian, independent), then x2 is irrelevant and
L(x1, x2) = L(x1)
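A small Monte Carlo sketch of the extreme case (assumes the reference noise equals the corrupting noise exactly, w_R[n] = w[n]; A = 1 and σ = 1 are illustrative):

```python
import random

random.seed(0)
A, sigma, trials = 1.0, 1.0, 2000
errors_plain = errors_ref = 0

for _ in range(trials):
    h1 = random.random() < 0.5           # equiprobable hypotheses
    w = random.gauss(0.0, sigma)         # corrupting noise
    x = (A if h1 else 0.0) + w           # observed sample
    w_ref = w                            # perfectly correlated reference noise
    errors_plain += (x > A / 2) != h1            # detector ignoring the reference
    errors_ref += ((x - w_ref) > A / 2) != h1    # reference-aided detector

print(errors_plain, errors_ref)
```

If instead w_R were drawn independently of w, the reference-aided detector would perform no better than the plain one: the reference data would be irrelevant, matching L(x1, x2) = L(x1).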
Minimum Probability of Error
In the Bayesian paradigm the probability of error is defined as
P_e = P(H1 | H0) P(H0) + P(H0 | H1) P(H1)
Minimizing the probability of error, we decide H1 if
p(x | H1) / p(x | H0) > P(H0) / P(H1) = γ
If both prior probabilities are equal, γ = 1 and we decide H1 if
p(x | H1) > p(x | H0)
Minimum Probability of Error example
An on-off keyed communication system transmits either
s0[n] = 0  or  s1[n] = A
The probability of transmitting 0 or A is the same, 1/2. Then, in order to minimize the probability of error, γ = 1.
Minimum Probability of Error example cont...
Analogous to NP, we decide H1 if the sample mean exceeds A/2:
x̄ > A/2
Then we calculate P_e:
P_e = Q( √(NA²/(4σ²)) ) = Q( √(d²)/2 )
where d² = NA²/σ² is the deflection coefficient. The probability of error decreases monotonically with the deflection coefficient.
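The error-probability expression can be checked numerically; a stdlib-only sketch (the parameter choices N = 4 or 16, A = 1, σ² = 1 in the usage note are illustrative assumptions):

```python
import math

def Q(x):
    """Right-tail probability of the standard normal."""
    return 0.5 * math.erfc(x / math.sqrt(2.0))

def P_e(N, A, var):
    """P_e = Q(sqrt(N*A^2/(4*var))) = Q(sqrt(d^2)/2) for OOK with equal priors."""
    d2 = N * A * A / var
    return Q(math.sqrt(d2) / 2.0)
```

For example, P_e(4, 1.0, 1.0) = Q(1) ≈ 0.159, and increasing N (hence d²) drives P_e monotonically toward zero.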
Bayes Risk
Bayes risk assigns costs to the types of errors: C_ij denotes the cost if we decide H_i when H_j is true. If no error is made, no cost is assigned. The detector that minimizes the Bayes risk decides H1 if
p(x | H1) / p(x | H0) > (C_10 − C_00) P(H0) / ( (C_01 − C_11) P(H1) ) = γ
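The minimum-risk threshold can be sketched as follows (the cost values in the usage note are illustrative assumptions; C_00 = C_11 = 0 with equal priors recovers the minimum-P_e detector):

```python
def bayes_threshold(C10, C01, P0, P1, C00=0.0, C11=0.0):
    """Threshold gamma for the LRT minimizing Bayes risk:
    decide H1 if p(x|H1)/p(x|H0) > (C10 - C00)*P0 / ((C01 - C11)*P1)."""
    return (C10 - C00) * P0 / ((C01 - C11) * P1)
```

With unit error costs and equal priors, bayes_threshold(1, 1, 0.5, 0.5) gives γ = 1, the minimum probability of error rule. If a false alarm costs ten times a miss, bayes_threshold(10, 1, 0.5, 0.5) gives γ = 10, raising the bar for deciding H1.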
Multiple Hypotheses Testing
Consider a communication system using M-ary PAM (pulse amplitude modulation) with M = 3.
Multiple Hypotheses Testing cont...
The conditional PDF describing the communication system is the Gaussian likelihood p(x | H_i). In order to maximize p(x | H_i), we can equivalently minimize the squared distance between the observed data and each candidate signal level.
Multiple Hypotheses Testing cont...
The minimization then reduces to choosing the hypothesis whose signal level is closest to the observed data. Therefore the decision regions are bounded by the midpoints between adjacent signal levels.
Multiple Hypotheses Testing cont...
There are six types of errors, so instead of calculating P_e directly, we calculate P_c = 1 − P_e, the probability of a correct decision, and finally obtain P_e = 1 − P_c.
Multiple Hypotheses Testing cont...
Binary case:
P_e = Q( √(NA²/(4σ²)) )
PAM case (M = 3):
P_e = (4/3) Q( √(NA²/(4σ²)) )
In the digital communications world, there is a difference in the energy per symbol, which is not considered in this example.
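The two expressions differ only by the factor 4/3 (the middle PAM level can err on both sides). A stdlib-only sketch comparing them (N = 10, A = 1, σ² = 1 in the usage note are illustrative assumptions):

```python
import math

def Q(x):
    """Right-tail probability of the standard normal."""
    return 0.5 * math.erfc(x / math.sqrt(2.0))

def pe_binary(N, A, var):
    """Binary case: P_e = Q(sqrt(N*A^2/(4*var)))."""
    return Q(math.sqrt(N * A * A / (4.0 * var)))

def pe_pam3(N, A, var):
    """M = 3 PAM case: the middle level errs on both sides, giving the 4/3 factor."""
    return (4.0 / 3.0) * pe_binary(N, A, var)
```

For any parameter choice the ratio pe_pam3 / pe_binary is exactly 4/3, so the 3-level constellation always pays a fixed error-rate penalty at the same per-level spacing.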
Chapter Problems Warm up problems: 3.1, 3.2 Problems to be solved in class: 3.4, 3.8, 3.15, 3.18, 3.21