ELEG 5633 Detection and Estimation Signal Detection: Deterministic Signals Jingxian Wu Department of Electrical Engineering University of Arkansas
Outline Matched Filter Generalized Matched Filter Signal Design Multiple Signals
Detect a Known Signal in WGN

Consider detecting a known deterministic signal in white Gaussian noise. Given a length-$N$ sequence $x[n]$, $n = 0, 1, \ldots, N-1$, generated according to
$$\mathcal{H}_0 : X[n] = W[n], \qquad \mathcal{H}_1 : X[n] = s[n] + W[n],$$
where $W[n] \sim \mathcal{N}(0, \sigma^2)$ and $s = \{s[0], \ldots, s[N-1]\}$ is a known deterministic signal.
NP Detector

Let $X = [X[0], X[1], \ldots, X[N-1]]^T$. Then
$$\mathcal{H}_0 : X \sim \mathcal{N}(0, \sigma^2 I_N), \qquad \mathcal{H}_1 : X \sim \mathcal{N}(s, \sigma^2 I_N).$$
The NP detector is
$$L(x) = \frac{p_1(x)}{p_0(x)} \underset{\mathcal{H}_0}{\overset{\mathcal{H}_1}{\gtrless}} \gamma \quad \Longleftrightarrow \quad x^T s \underset{\mathcal{H}_0}{\overset{\mathcal{H}_1}{\gtrless}} \underbrace{\sigma^2 \ln\gamma + \tfrac{1}{2}\|s\|^2}_{\gamma'},$$
or equivalently
$$T(x) = \sum_{n=0}^{N-1} x[n]s[n] \underset{\mathcal{H}_0}{\overset{\mathcal{H}_1}{\gtrless}} \gamma'.$$
Correlator and Matched Filter

Correlator: correlate the data with the known signal,
$$T(x) = \sum_{n=0}^{N-1} x[n]s[n] \underset{\mathcal{H}_0}{\overset{\mathcal{H}_1}{\gtrless}} \gamma'.$$
Matched filter: pass $x[n]$ through the FIR filter
$$h[n] = \begin{cases} s[N-1-n], & n = 0, 1, \ldots, N-1, \\ 0, & \text{otherwise,} \end{cases}$$
and sample the output at $n = N-1$; this yields the same statistic $T(x)$.
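The equivalence of the two implementations is easy to check numerically. A minimal sketch in NumPy (the signal and noise values are arbitrary illustrations, not from the slides):

```python
import numpy as np

rng = np.random.default_rng(0)
N = 8
s = rng.standard_normal(N)            # known deterministic signal s[n]
x = s + 0.5 * rng.standard_normal(N)  # one noisy observation under H1

# Correlator: T(x) = sum_n x[n] s[n]
T_corr = float(np.dot(x, s))

# Matched filter: h[n] = s[N-1-n]; convolve and sample the output at n = N-1
h = s[::-1]
y = np.convolve(x, h)                 # FIR filter output
T_mf = float(y[N - 1])                # sample at time n = N-1

assert np.isclose(T_corr, T_mf)       # same statistic either way
```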
Performance of Matched Filter

$$T(x) = \sum_{n=0}^{N-1} x[n]s[n] \gtrless \gamma'$$
Under either hypothesis, $T(X)$ is Gaussian. Let $\mathcal{E} := \sum_{n=0}^{N-1} s^2[n]$ denote the signal energy. Then
$$E[T \mid \mathcal{H}_0] = E\Big[\sum_n W[n]s[n]\Big] = 0, \qquad E[T \mid \mathcal{H}_1] = E\Big[\sum_n (s[n]+W[n])s[n]\Big] = \mathcal{E},$$
$$\mathrm{var}(T \mid \mathcal{H}_0) = \mathrm{var}(T \mid \mathcal{H}_1) = s^T \Sigma_W s = \sigma^2 \sum_n s^2[n] = \sigma^2 \mathcal{E}.$$
Performance of Matched Filter (Cont'd)

$$T \sim \begin{cases} \mathcal{N}(0, \sigma^2 \mathcal{E}) & \text{under } \mathcal{H}_0, \\ \mathcal{N}(\mathcal{E}, \sigma^2 \mathcal{E}) & \text{under } \mathcal{H}_1. \end{cases}$$
Thus,
$$P_{FA} = P[T > \gamma'; \mathcal{H}_0] = Q\left(\frac{\gamma'}{\sqrt{\sigma^2 \mathcal{E}}}\right),$$
$$P_D = P[T > \gamma'; \mathcal{H}_1] = Q\left(\frac{\gamma' - \mathcal{E}}{\sqrt{\sigma^2 \mathcal{E}}}\right) = Q\left(Q^{-1}(P_{FA}) - \sqrt{\frac{\mathcal{E}}{\sigma^2}}\right).$$
The key parameter is the SNR (energy-to-noise ratio) $\mathcal{E}/\sigma^2$: as SNR increases, $P_D$ increases. The shape of the signal does NOT affect the detection performance. This is not true for colored Gaussian noise.
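The formulas above can be checked by Monte Carlo. A sketch, assuming $\mathcal{E}/\sigma^2 = 10$ and $P_{FA} = 10^{-2}$ (arbitrary illustration values); the `Qinv` bisection helper is a hypothetical stand-in for a proper inverse-$Q$ routine:

```python
import math
import numpy as np

def Q(x):
    # Gaussian tail probability Q(x) = P[Z > x], Z ~ N(0,1)
    return 0.5 * math.erfc(x / math.sqrt(2.0))

def Qinv(p, lo=-10.0, hi=10.0):
    # crude bisection inverse of Q (illustrative helper, not from the slides)
    for _ in range(100):
        mid = 0.5 * (lo + hi)
        if Q(mid) > p:
            lo = mid
        else:
            hi = mid
    return 0.5 * (lo + hi)

sigma2 = 1.0
s = np.ones(10)              # any shape with the same energy gives the same P_D
E = float(s @ s)             # signal energy, E/sigma2 = 10
P_FA = 1e-2
P_D = Q(Qinv(P_FA) - math.sqrt(E / sigma2))

# Monte Carlo check of both operating-point formulas
rng = np.random.default_rng(1)
gamma = Qinv(P_FA) * math.sqrt(sigma2 * E)   # threshold gamma'
M = 200_000
W = rng.normal(0.0, math.sqrt(sigma2), size=(M, s.size))
T0 = W @ s                   # statistic under H0
T1 = (W + s) @ s             # statistic under H1
P_FA_mc = float(np.mean(T0 > gamma))
P_D_mc = float(np.mean(T1 > gamma))
```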
Generalized Matched Filter

Suppose the noise is not i.i.d. WGN but correlated: $W \sim \mathcal{N}(0, C)$, where $C$ is the covariance matrix. Then
$$X \sim \begin{cases} \mathcal{N}(0, C) & \text{under } \mathcal{H}_0, \\ \mathcal{N}(s, C) & \text{under } \mathcal{H}_1. \end{cases}$$
The optimal test is
$$l(x) = \ln\frac{p(x \mid \mathcal{H}_1)}{p(x \mid \mathcal{H}_0)} \underset{\mathcal{H}_0}{\overset{\mathcal{H}_1}{\gtrless}} \ln\gamma,$$
where
$$l(x) = -\tfrac{1}{2}\left[(x-s)^T C^{-1}(x-s) - x^T C^{-1}x\right] = -\tfrac{1}{2}\left[x^T C^{-1}x - 2x^T C^{-1}s + s^T C^{-1}s - x^T C^{-1}x\right] = x^T C^{-1}s - \tfrac{1}{2}s^T C^{-1}s.$$
Generalized Matched Filter (Cont'd)

Generalized matched filter:
$$T(x) = x^T C^{-1}s \underset{\mathcal{H}_0}{\overset{\mathcal{H}_1}{\gtrless}} \ln\gamma + \tfrac{1}{2}s^T C^{-1}s =: \gamma'.$$
Generalized Matched Filter: Prewhitening Matrix

Since $C = U^T A U$ is positive definite, $C^{-1} = U^T A^{-1} U$ exists and is also positive definite, where $A$ is a diagonal matrix. It can be factored as $C^{-1} = D^T D$, where $D = A^{-1/2}U$ is called a prewhitening matrix. Then
$$T(x) = x^T C^{-1}s = x^T D^T D s = x'^T s', \qquad x' = Dx, \quad s' = Ds,$$
i.e., the generalized matched filter is an ordinary correlator applied to the prewhitened data $x'[n]$ and prewhitened signal $s'[n]$.
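A quick numerical sketch of the factorization and of the prewhitened correlator (the covariance is a random positive definite matrix chosen for illustration; `np.linalg.eigh` returns $C = U\,\mathrm{diag}(\lambda)\,U^T$, so the prewhitener is $D = \mathrm{diag}(\lambda)^{-1/2}U^T$ in that convention):

```python
import numpy as np

rng = np.random.default_rng(2)
N = 5
A = rng.standard_normal((N, N))
C = A @ A.T + N * np.eye(N)        # an arbitrary positive definite covariance

# Eigendecomposition C = U diag(lam) U^T gives prewhitener D = diag(lam)^(-1/2) U^T
lam, U = np.linalg.eigh(C)
D = np.diag(lam ** -0.5) @ U.T
assert np.allclose(D.T @ D, np.linalg.inv(C))   # C^{-1} = D^T D

s = rng.standard_normal(N)
x = rng.standard_normal(N)
# Generalized matched filter = plain correlator on the whitened data
T_direct = float(x @ np.linalg.inv(C) @ s)
T_white = float((D @ x) @ (D @ s))
assert np.isclose(T_direct, T_white)
```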
Performance of Generalized Matched Filter

$$T(x) = x^T C^{-1}s > \gamma'$$
Under either hypothesis, $T(X)$ is Gaussian:
$$E[T \mid \mathcal{H}_0] = E[W^T C^{-1}s] = 0, \qquad E[T \mid \mathcal{H}_1] = E[(s+W)^T C^{-1}s] = s^T C^{-1}s,$$
$$\mathrm{var}(T \mid \mathcal{H}_0) = E[(W^T C^{-1}s)^2] = s^T C^{-1}s = \mathrm{var}(T \mid \mathcal{H}_1).$$
Thus,
$$P_{FA} = P[T > \gamma'; \mathcal{H}_0] = Q\left(\frac{\gamma'}{\sqrt{s^T C^{-1}s}}\right),$$
$$P_D = P[T > \gamma'; \mathcal{H}_1] = Q\left(\frac{\gamma' - s^T C^{-1}s}{\sqrt{s^T C^{-1}s}}\right) = Q\left(Q^{-1}(P_{FA}) - \sqrt{s^T C^{-1}s}\right).$$
$P_D$ increases with $s^T C^{-1}s$, not $\mathcal{E}/\sigma^2$: the shape of the signal DOES affect the detection performance.
Signal Design for Correlated Noise

Objective: design the signal $s$ to maximize $P_D$ for a given $P_{FA}$, subject to the energy constraint $s^T s = \mathcal{E}$:
$$\underset{s}{\text{maximize}} \quad s^T C^{-1}s \qquad \text{subject to} \quad s^T s = \mathcal{E}.$$
Signal Design for Correlated Noise (Cont'd)

Solution: using a Lagrange multiplier,
$$F = s^T C^{-1}s + \lambda(\mathcal{E} - s^T s).$$
Taking the derivative w.r.t. $s$ and setting it to zero gives $2C^{-1}s - 2\lambda s = 0$, i.e., $C^{-1}s = \lambda s$. Thus
$$s^T C^{-1}s = \lambda s^T s = \lambda\mathcal{E},$$
so $s$ is an eigenvector of $C^{-1}$ associated with eigenvalue $\lambda$. To maximize $s^T C^{-1}s$, $s$ should be associated with the maximum eigenvalue of $C^{-1}$. Since $Cs = (1/\lambda)s$, we should choose the signal as the (scaled) eigenvector of $C$ associated with its minimum eigenvalue.
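A sketch of the eigenvector recipe, with a randomly generated positive definite $C$ (an arbitrary illustration). The optimal signal achieves deflection $\mathcal{E}/\lambda_{\min}(C)$ and is compared against random signals of the same energy:

```python
import numpy as np

rng = np.random.default_rng(3)
N = 6
A = rng.standard_normal((N, N))
C = A @ A.T + 0.1 * np.eye(N)      # an arbitrary positive definite noise covariance

E = 1.0                             # energy constraint s^T s = E
lam, U = np.linalg.eigh(C)          # eigenvalues in ascending order
s_opt = np.sqrt(E) * U[:, 0]        # (scaled) eigenvector of the MINIMUM eigenvalue of C

# Achieved deflection s^T C^{-1} s equals E / lambda_min, and beats any other signal
Cinv = np.linalg.inv(C)
d_opt = float(s_opt @ Cinv @ s_opt)
assert np.isclose(d_opt, E / lam[0])
for _ in range(100):
    s = rng.standard_normal(N)
    s *= np.sqrt(E) / np.linalg.norm(s)   # enforce the same energy
    assert s @ Cinv @ s <= d_opt + 1e-9
```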
Example

Assume $W[n] \sim \mathcal{N}(0, \sigma_n^2)$, with the $W[n]$ uncorrelated across $n$. How should the signal $s[n]$, $n = 0, \ldots, N-1$, be designed to maximize $P_D$?
Example

Assume $C = \begin{bmatrix} 1 & \rho \\ \rho & 1 \end{bmatrix}$, where $0 < \rho < 1$. How should the signal $s[n]$, $n = 0, 1$, be designed to maximize $P_D$?
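The eigenvector recipe from the signal-design slide can be applied to this covariance numerically. A sketch with $\rho = 0.9$ (an arbitrary illustration value); the minimum eigenvalue of $C$ is $1-\rho$, so the optimal unit-energy signal alternates sign:

```python
import numpy as np

rho = 0.9
C = np.array([[1.0, rho], [rho, 1.0]])
lam, U = np.linalg.eigh(C)           # ascending: lam = [1 - rho, 1 + rho]
s = U[:, 0]                          # min-eigenvalue eigenvector, s^T s = 1
assert np.isclose(lam[0], 1 - rho)

# s is proportional to [1, -1]/sqrt(2): samples of equal magnitude, opposite sign
assert np.allclose(np.abs(s), 1 / np.sqrt(2))
assert np.isclose(s[0] * s[1], -0.5)

# achieved deflection: s^T C^{-1} s = 1 / (1 - rho)
d = float(s @ np.linalg.inv(C) @ s)
assert np.isclose(d, 1 / (1 - rho))
```

Intuitively, the two noise samples are positively correlated, so a signal with opposite-sign samples lands along the direction where the noise is weakest.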
Multiple Signals

Detection:
$$\mathcal{H}_0 : X = W, \qquad \mathcal{H}_1 : X = s + W.$$
Classification:
$$\mathcal{H}_0 : X = s_0 + W, \qquad \mathcal{H}_1 : X = s_1 + W.$$
Example

Additive White Gaussian Noise (AWGN) communication channel. Two messages $i \in \{0, 1\}$ with probabilities $\pi_0 = \pi_1$. Given $i$, the received signal is an $N \times 1$ random vector
$$X = s_i + W,$$
where $W \sim \mathcal{N}(0, \sigma^2 I_{N \times N})$ and $\{s_0, s_1\}$ are known to the receiver. Design the ML receiver.
Minimum Distance Receiver

With equal prior probabilities for the $\mathcal{H}_i$, minimizing $P_e$ reduces to the ML detector. Since
$$p(x \mid \mathcal{H}_i) = \frac{1}{(2\pi\sigma^2)^{N/2}} \exp\left[-\frac{1}{2\sigma^2}\sum_{n=0}^{N-1}(x[n]-s_i[n])^2\right],$$
we decide $\mathcal{H}_i$ for which
$$D_i^2 := \sum_{n=0}^{N-1}(x[n]-s_i[n])^2 = (x-s_i)^T(x-s_i) = \|x-s_i\|^2$$
is minimum: the minimum distance receiver. [Figure: decision regions $R_0$ and $R_1$, separated by the perpendicular bisector between $s_0$ and $s_1$.]
Minimum Distance Receiver (Cont'd)

Expanding the squared distance,
$$D_i^2 = \sum_{n=0}^{N-1}x^2[n] - 2\sum_{n=0}^{N-1}x[n]s_i[n] + \sum_{n=0}^{N-1}s_i^2[n].$$
The first term does not depend on $i$. Therefore, we decide $\mathcal{H}_i$ for which
$$T_i(x) := \sum_{n=0}^{N-1}x[n]s_i[n] - \frac{1}{2}\mathcal{E}_i, \qquad \mathcal{E}_i := \sum_{n=0}^{N-1}s_i^2[n],$$
is maximum.
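The equivalence between the minimum-distance rule and the correlator-minus-half-energy rule can be verified directly. A sketch with four randomly generated candidate signals (arbitrary illustration values):

```python
import numpy as np

rng = np.random.default_rng(4)
N = 16
sigs = [rng.standard_normal(N) for _ in range(4)]  # candidate signals s_i
x = sigs[2] + rng.standard_normal(N)               # noisy observation of s_2

# Decision via minimum distance: argmin_i ||x - s_i||^2
i_dist = int(np.argmin([np.sum((x - si) ** 2) for si in sigs]))

# Equivalent decision via T_i(x) = x^T s_i - E_i / 2: argmax_i
i_corr = int(np.argmax([x @ si - 0.5 * (si @ si) for si in sigs]))

assert i_dist == i_corr   # the two rules always agree
```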
Binary Case

ML detector:
$$T_1(x) - T_0(x) = \sum_{n=0}^{N-1}x[n](s_1[n]-s_0[n]) - \frac{1}{2}(\mathcal{E}_1 - \mathcal{E}_0) \underset{\mathcal{H}_0}{\overset{\mathcal{H}_1}{\gtrless}} 0,$$
or equivalently
$$x^T(s_1 - s_0) \underset{\mathcal{H}_0}{\overset{\mathcal{H}_1}{\gtrless}} \frac{1}{2}(\mathcal{E}_1 - \mathcal{E}_0).$$
Error probability:
$$P_e = Q\left(\frac{\|s_1 - s_0\|}{2\sigma}\right).$$
Performance for Binary Case

$$P_e = P[\hat{H} = \mathcal{H}_1 \mid \mathcal{H}_0]\pi_0 + P[\hat{H} = \mathcal{H}_0 \mid \mathcal{H}_1]\pi_1 = \frac{1}{2}\Big(P[T_1(X) - T_0(X) > 0 \mid \mathcal{H}_0] + P[T_1(X) - T_0(X) < 0 \mid \mathcal{H}_1]\Big).$$
Let
$$T(x) := T_1(x) - T_0(x) = \sum_{n=0}^{N-1}x[n](s_1[n]-s_0[n]) - \frac{1}{2}(\mathcal{E}_1 - \mathcal{E}_0).$$
$T$ is a Gaussian random variable conditioned on either hypothesis:
$$E[T \mid \mathcal{H}_0] = \sum_n s_0[n](s_1[n]-s_0[n]) - \frac{1}{2}(\mathcal{E}_1 - \mathcal{E}_0) = -\frac{1}{2}\|s_1-s_0\|^2,$$
$$E[T \mid \mathcal{H}_1] = -E[T \mid \mathcal{H}_0] = \frac{1}{2}\|s_1-s_0\|^2,$$
$$\mathrm{var}(T \mid \mathcal{H}_0) = \sum_n \mathrm{var}(X[n])(s_1[n]-s_0[n])^2 = \sigma^2\|s_1-s_0\|^2 = \mathrm{var}(T \mid \mathcal{H}_1).$$
Hence
$$T \sim \begin{cases} \mathcal{N}\!\left(-\tfrac{1}{2}\|s_1-s_0\|^2,\; \sigma^2\|s_1-s_0\|^2\right) & \mathcal{H}_0, \\[2pt] \mathcal{N}\!\left(+\tfrac{1}{2}\|s_1-s_0\|^2,\; \sigma^2\|s_1-s_0\|^2\right) & \mathcal{H}_1, \end{cases}$$
$$P_e = Q\left(\frac{\|s_1-s_0\|}{2\sigma}\right).$$
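The closed-form error probability can be checked against a Monte Carlo simulation of the minimum-distance receiver. A sketch with randomly generated signal vectors (arbitrary illustration values, not from the slides):

```python
import math
import numpy as np

def Q(x):
    # Gaussian tail probability Q(x) = P[Z > x], Z ~ N(0,1)
    return 0.5 * math.erfc(x / math.sqrt(2.0))

rng = np.random.default_rng(5)
N, sigma = 8, 1.0
s0 = rng.standard_normal(N)
s1 = rng.standard_normal(N)
d = float(np.linalg.norm(s1 - s0))
P_e_theory = Q(d / (2 * sigma))

M = 200_000
bits = rng.integers(0, 2, size=M)               # equally likely messages
S = np.where(bits[:, None] == 1, s1, s0)        # transmitted signal per trial
X = S + sigma * rng.standard_normal((M, N))     # AWGN channel

# ML / minimum-distance receiver in its correlator form
T = X @ (s1 - s0) - 0.5 * (s1 @ s1 - s0 @ s0)
decisions = (T > 0).astype(int)
P_e_mc = float(np.mean(decisions != bits))      # close to P_e_theory
```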