ESE 524 Detection and Estimation Theory
Joseph A. O'Sullivan
Samuel C. Sachs Professor
Electronic Systems and Signals Research Laboratory
Electrical and Systems Engineering, Washington University
211 Urbauer Hall, 314-935-4173 (Lynda answers)
jao@wustl.edu
J. A. O'S., ESE 524, Lecture 4, 1/22/09
Minimax Decision Rule

Find the decision rule that minimizes the maximum Bayes risk over all possible priors:

$\min_{\text{rule}} \; \max_{P_1} \; \mathcal{R}(P_1)$

Criterion                      Transition Densities   Priors P_0 & P_1   Costs C_ij
Bayes                          Yes                    Yes                Yes
Minimum Probability of Error   Yes                    Yes                No
Minimax                        Yes                    No                 Yes
Neyman-Pearson                 Yes                    No                 No

(Bayes criterion: minimize the expected risk, or cost.)
Minimax Decision Rule: Analysis

For any fixed decision rule, the risk is linear in P_1.
The maximum over P_1 is therefore achieved at an end point.
To make that end point as low as possible, the risk should be constant with respect to P_1.
To minimize that constant value, the risk should achieve the minimum (Bayes) risk at some P_1^*. At that value of the prior, the best decision rule is a likelihood ratio test.

$\mathcal{R} = C_M P_M P_1 + C_F P_F P_0$

$P_M = \int_{Z_0} p_{\mathbf{r}|H_1}(\mathbf{R}\,|\,H_1)\, d\mathbf{R}, \qquad P_F = \int_{Z_1} p_{\mathbf{r}|H_0}(\mathbf{R}\,|\,H_0)\, d\mathbf{R}$
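The linearity claim above is easy to check numerically. A minimal sketch (not from the slides; the costs and operating point are assumed example values): for a fixed rule with given (P_F, P_M), the risk $\mathcal{R}(P_1) = C_M P_M P_1 + C_F P_F (1 - P_1)$ is affine in P_1, so its maximum over [0, 1] sits at an end point.

```python
# Sketch: risk of one FIXED decision rule as a function of the prior P1.
# C_M, C_F and the rule's (P_F, P_M) are assumed example values.
C_M, C_F = 2.0, 1.0
P_F, P_M = 0.1, 0.3

def risk(p1):
    """Bayes risk R(P1) = C_M * P_M * P1 + C_F * P_F * (1 - P1)."""
    return C_M * P_M * p1 + C_F * P_F * (1.0 - p1)

# Affine in p1: the midpoint value equals the average of the endpoint values.
assert abs(risk(0.5) - 0.5 * (risk(0.0) + risk(1.0))) < 1e-12

# Hence the maximum over p1 in [0, 1] is attained at an end point.
worst = max(risk(0.0), risk(1.0))
```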
Minimax Decision Rule: Analysis

The Bayes risk is concave in P_1 (it always lies below its tangents).
The minimax risk is achieved either at an end point or at an interior point of the Bayes risk curve where the tangent has zero slope.

[Figure: Bayes risk versus P_1, showing the concave optimal-risk curve.]
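The concavity follows because the optimal Bayes risk is the lower envelope (pointwise minimum) of the affine risks of all fixed rules, and a minimum of affine functions is concave. A minimal sketch, with an assumed family of (P_F, P_M) operating points standing in for a real ROC:

```python
# Sketch: optimal Bayes risk as the lower envelope of affine-in-P1 risks.
# The list of (P_F, P_M) operating points is an assumed toy ROC.
C_M, C_F = 1.0, 1.0
points = [(0.0, 1.0), (0.05, 0.6), (0.2, 0.3), (0.5, 0.1), (1.0, 0.0)]

def bayes_risk(p1):
    """Minimum over the rule family of C_M*P_M*p1 + C_F*P_F*(1-p1)."""
    return min(C_M * pm * p1 + C_F * pf * (1.0 - p1) for pf, pm in points)

# Concavity check: the midpoint value is at least the chord average.
a, b = 0.2, 0.8
assert bayes_risk(0.5 * (a + b)) >= 0.5 * (bayes_risk(a) + bayes_risk(b)) - 1e-12
```

At the end points the optimal risk is zero (always decide the certain hypothesis), which is why an interior maximum exists whenever the problem is nontrivial.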
Minimax Decision Rule: Example

$H_0:\ p_{x|H_0}(X\,|\,H_0) = 5 e^{-5X},\quad X \ge 0$
$H_1:\ p_{x|H_1}(X\,|\,H_1) = e^{-X},\quad X \ge 0$

Log-likelihood ratio test: $\ln \Lambda(X) = 4X - \ln 5$

$P_F = \int_{\gamma'}^{\infty} 5 e^{-5X}\, dX = e^{-5\gamma'}$
$P_M = \int_0^{\gamma'} e^{-X}\, dX = 1 - e^{-\gamma'}$
$P_D = 1 - P_M = e^{-\gamma'} = P_F^{0.2}$

Minimax condition: $C_M P_M = C_F P_F$

Matlab Code
pf=0:.01:1;
pd=pf.^0.2;
eta=0.2*(pf.^(-0.8));   % eta = dP_D/dP_F
figure
plot(pf,pd); xlabel('p_F'); ylabel('p_D')
cm=1; cf=1;
p1star=1./(1+cm*eta/cf);
riskoptimal=cm*(1-pd).*p1star+cf*pf.*(1-p1star);
figure
plot(p1star,riskoptimal,'b'), hold on
p1=0:.01:1;
r1=cm*(1-pd(1))*p1+cf*pf(1)*(1-p1);
plot(p1,r1,'r'), hold on
r2=cm*(1-pd(2))*p1+cf*pf(2)*(1-p1);
plot(p1,r2,'g'), hold on
r3=cm*(1-pd(3))*p1+cf*pf(3)*(1-p1);
plot(p1,r3,'c')
xlabel('p_1'); ylabel('risk')

[Figure: ROC, P_D versus P_F.]
[Figure: risk versus P_1, optimal-risk curve with tangent lines for fixed rules.]
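A quick numerical check of this example (a sketch, not the lecture's Matlab): the ROC obeys $P_D = P_F^{0.2}$, and with $C_M = C_F = 1$ the minimax threshold solves $P_F(\gamma') = P_M(\gamma')$, which a simple bisection finds.

```python
import math

# H0: p(x) = 5*exp(-5x), H1: p(x) = exp(-x), x >= 0 (the slide's example).
def pf(gamma):
    """False-alarm probability: integral of 5*exp(-5x) over x > gamma."""
    return math.exp(-5.0 * gamma)

def pd(gamma):
    """Detection probability: integral of exp(-x) over x > gamma."""
    return math.exp(-gamma)

# ROC identity: P_D = P_F^(1/5).
g = 0.7
assert abs(pd(g) - pf(g) ** 0.2) < 1e-12

# Minimax with C_M = C_F = 1: solve P_F(g) = P_M(g) = 1 - P_D(g) by bisection.
lo, hi = 0.0, 5.0
for _ in range(100):
    mid = 0.5 * (lo + hi)
    if pf(mid) > 1.0 - pd(mid):   # P_F still above P_M: threshold too small
        lo = mid
    else:
        hi = mid
# At convergence, pf(lo) and 1 - pd(lo) agree: the minimax operating point.
```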
Neyman-Pearson Decision Rule

Minimize P_M subject to P_F ≤ α.
Variational approach; the upper bound is usually achieved.
The result is a likelihood ratio test; how is the threshold set?

$F = P_M + \eta (P_F - \alpha) = \int_{Z_0} p_{\mathbf{r}|H_1}(\mathbf{R}\,|\,H_1)\, d\mathbf{R} + \eta \left[ \int_{Z_1} p_{\mathbf{r}|H_0}(\mathbf{R}\,|\,H_0)\, d\mathbf{R} - \alpha \right]$

Criterion                      Transition Densities   Priors P_0 & P_1   Costs C_ij
Bayes                          Yes                    Yes                Yes
Minimum Probability of Error   Yes                    Yes                No
Minimax                        Yes                    No                 Yes
Neyman-Pearson                 Yes                    No                 No
Neyman-Pearson Decision Rule

Minimize P_M subject to P_F ≤ α.

$F = P_M + \eta (P_F - \alpha) = \int_{Z_0} p_{\mathbf{r}|H_1}(\mathbf{R}\,|\,H_1)\, d\mathbf{R} + \eta \left[ \int_{Z_1} p_{\mathbf{r}|H_0}(\mathbf{R}\,|\,H_0)\, d\mathbf{R} - \alpha \right]$

Plot the ROC: P_D versus P_F for the family of likelihood ratio tests.
Draw a vertical line where P_F = α.
Find the corresponding P_D.
At that point, the threshold equals the slope of the ROC:

$\eta = \frac{dP_D}{dP_F} = \frac{dP_D/d\eta}{dP_F/d\eta}$
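The slope property can be verified on the earlier exponential example (a sketch; the densities are the ones from that example): with $P_D = P_F^{0.2}$, the ROC slope $0.2\,P_F^{-0.8}$ at threshold $\gamma$ equals the likelihood ratio evaluated at $\gamma$.

```python
import math

# Exponential example: H0: 5*exp(-5x), H1: exp(-x), x >= 0.
gamma = 0.5
pf = math.exp(-5.0 * gamma)                         # P_F at this threshold

# ROC slope dP_D/dP_F for P_D = P_F^0.2.
slope = 0.2 * pf ** (-0.8)

# Likelihood ratio Lambda(x) = exp(-x) / (5*exp(-5x)) evaluated at x = gamma.
eta = math.exp(-gamma) / (5.0 * math.exp(-5.0 * gamma))

# The Neyman-Pearson threshold equals the ROC slope at the operating point.
assert abs(slope - eta) < 1e-12
```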
Summary

Several decision rules.
The likelihood (and log-likelihood) ratio test is optimal.
The receiver operating characteristic (ROC) plots probability of detection versus probability of false alarm with the threshold as a parameter; it captures all possible optimal performance.
Neyman-Pearson is a point on the ROC ($P_F = \alpha$).
Minimax is a point on the ROC ($C_F P_F = C_M P_M$).
Minimum probability of error is a point on the ROC (slope $\eta = (1 - P_1)/P_1$).
(Somewhat) Practical Example

Given an image, find the parts of the image that are different.
Model: Gaussian data under either hypothesis. Under H_1, the variance is greater than under H_0.
Example: image data. The background represents the null hypothesis; model it as Gaussian.
(Somewhat) Practical Example

$\mathrm{normsq}_{ij} = \frac{(x_{ij} - \mu)^2}{2\sigma^2}$

$I_{ij} = \begin{cases} 1, & \mathrm{normsq}_{ij} > \gamma \\ 0, & \text{otherwise} \end{cases} \qquad (\gamma = 3)$

[Figure: Histogram of Image.]
[Figure: Histogram of NormSq.]
Matlab Code

threshold=3;
im1=imread('passengershudsonplaneap.jpg','jpg');
im1=sum(double(im1),3);
figure; imagesc(im1); colormap gray; axis off
[s1,s2]=size(im1);
x=im1(1:12,1:18);
size(x)
x=reshape(x,1,12*18);
[hx,ix]=hist(x,5);
figure, plot(ix,hx); title('Histogram of Image')
mu=mean(x); sigma=std(x);
normimage=(im1-mu).^2/(2*sigma^2);
normimage2=reshape(normimage,1,numel(normimage));
[hx,ix]=hist(normimage2,5);
figure, plot(ix,hx); title('Histogram of NormSq')
[xi,indexx]=find(normimage2>threshold);
im2=reshape(im1,1,numel(im1));
imagethresh=mu*ones(size(im2));
imagethresh(indexx)=im2(indexx);
imagethresh=reshape(imagethresh,s1,s2);
figure; imagesc(imagethresh); colormap gray; axis off
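The same pipeline can be sketched in Python on synthetic data (this is not the lecture's Matlab; the image, patch location, and background region are assumed): fit the null Gaussian model from a background patch, compute the normalized squared deviation, and keep only pixels exceeding the threshold.

```python
import numpy as np

# Synthetic "image": Gaussian background plus a brighter patch (assumed data).
rng = np.random.default_rng(0)
im = rng.normal(100.0, 10.0, size=(64, 64))
im[20:30, 20:30] += 80.0            # the "different" region to be detected

# Fit the null (background) model from a corner patch, as the slides do.
bg = im[:16, :16]
mu, sigma = bg.mean(), bg.std()

# Normalized squared deviation under H0; threshold at 3 as in the Matlab code.
normsq = (im - mu) ** 2 / (2.0 * sigma ** 2)
detect = normsq > 3

# Keep detected pixels; set everything else to the background mean.
out = np.where(detect, im, mu)
```

Thresholding normsq at 3 flags pixels more than about 2.45 standard deviations from the background mean, so a small false-alarm rate remains in the background while the bright patch is flagged almost entirely.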