ELEG 5633 Detection and Estimation Signal Detection: Deterministic Signals


ELEG 5633 Detection and Estimation
Signal Detection: Deterministic Signals
Jingxian Wu
Department of Electrical Engineering, University of Arkansas

Outline
Matched Filter
Generalized Matched Filter
Signal Design
Multiple Signals

Detect a Known Signal in WGN

Consider detecting a known deterministic signal in white Gaussian noise. Given a length-N sequence x[n], n = 0, 1, ..., N-1, generated according to

H_0: X[n] = W[n]
H_1: X[n] = s[n] + W[n]

where W[n] ~ N(0, σ²) and s = {s[0], ..., s[N-1]} is a known, deterministic signal.

NP Detector

Let X = [X[0], X[1], ..., X[N-1]]^T. Then

H_0: X ~ N(0, σ² I_N)
H_1: X ~ N(s, σ² I_N)

The NP detector compares the likelihood ratio L(x) = p_1(x)/p_0(x) with a threshold γ and decides H_1 if L(x) > γ; otherwise it decides H_0. This simplifies to deciding H_1 if

x^T s > σ² ln γ + 1/2 ‖s‖² =: γ'

or equivalently, deciding H_1 if

T(x) = Σ_{n=0}^{N-1} x[n] s[n] > γ'
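As a concrete illustration, the following is a minimal NumPy sketch of this correlator test; the signal s, noise level σ, and threshold value are illustrative assumptions, not values from the slides.

```python
import numpy as np

rng = np.random.default_rng(0)

N = 64
sigma = 1.0
s = np.cos(2 * np.pi * 0.1 * np.arange(N))      # example known signal (assumed)

# Simulate one observation under H_1: x[n] = s[n] + w[n]
x = s + sigma * rng.standard_normal(N)

# Correlator test statistic T(x) = sum_n x[n] s[n]
T = x @ s

# Illustrative threshold gamma'; in practice it is set from a target P_FA
# (see the performance slide below).
gamma_prime = 2.0
decision = 1 if T > gamma_prime else 0
print(f"T(x) = {T:.2f} -> decide H{decision}")
```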

Correlator and Matched Filter

Correlator: multiply the observation by a replica of the known signal and sum,
T(x) = Σ_{n=0}^{N-1} x[n] s[n],
then compare T(x) with γ' to decide Ĥ.
[Block diagram: x[n] → multiply by s[n] → sum over N samples → T(x) → compare with γ' → Ĥ]

Matched Filter: let
h[n] = s[N-1-n] for n = 0, 1, ..., N-1, and h[n] = 0 otherwise.
Pass x[n] through the FIR filter h[n] and sample the output at n = N-1; this yields the same statistic T(x), which is compared with γ' to decide Ĥ.
[Block diagram: x[n] → FIR filter h[n] → sample at n = N-1 → T(x) → compare with γ' → Ĥ]
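The equivalence between the correlator and the matched filter sampled at n = N-1 is easy to verify numerically; this sketch is an assumed example (NumPy, with np.convolve doing the FIR filtering), not part of the slides.

```python
import numpy as np

rng = np.random.default_rng(1)
N = 64
s = np.sin(2 * np.pi * 0.05 * np.arange(N))   # known signal (assumed example)
x = s + rng.standard_normal(N)                # observation under H_1

# Correlator: T(x) = sum_n x[n] s[n]
T_corr = np.dot(x, s)

# Matched filter: h[n] = s[N-1-n]; filter x with h and sample the output at n = N-1
h = s[::-1]
y = np.convolve(x, h)                         # FIR filtering (full convolution)
T_mf = y[N - 1]                               # output sample at n = N-1

print(np.isclose(T_corr, T_mf))               # True: the two statistics coincide
```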

Performance of Matched Filter

T(x) = Σ_{n=0}^{N-1} x[n] s[n] > γ'

Under either hypothesis, T(X) is Gaussian. Let E := Σ_{n=0}^{N-1} s²[n] denote the signal energy. Then

E[T | H_0] = E[ Σ_n W[n] s[n] ] = 0
E[T | H_1] = E[ Σ_n (s[n] + W[n]) s[n] ] = E
var(T | H_0) = s^T Σ_w s = σ² Σ_n s²[n] = σ² E
var(T | H_1) = s^T Σ_w s = σ² Σ_n s²[n] = σ² E

Performance of Matched Filter (Cont'd)

T ~ N(0, σ² E) under H_0
T ~ N(E, σ² E) under H_1

Thus,

P_FA = P[T > γ'; H_0] = Q( γ' / √(σ² E) )
P_D  = P[T > γ'; H_1] = Q( (γ' - E) / √(σ² E) ) = Q( Q^{-1}(P_FA) - √(E/σ²) )

The key parameter is the SNR (or energy-to-noise ratio) E/σ². As the SNR increases, P_D increases. The shape of the signal does NOT affect the detection performance; this is not true for colored Gaussian noise.
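These expressions are straightforward to evaluate numerically: the Gaussian tail function Q corresponds to scipy.stats.norm.sf and Q^{-1} to scipy.stats.norm.isf. The P_FA and SNR values in this sketch are illustrative assumptions.

```python
import numpy as np
from scipy.stats import norm

def pd_matched_filter(pfa, snr):
    """P_D = Q(Q^{-1}(P_FA) - sqrt(E/sigma^2)) for the matched filter in WGN."""
    return norm.sf(norm.isf(pfa) - np.sqrt(snr))

# Illustrative example: P_D versus SNR = E / sigma^2 at P_FA = 1e-3
for snr_db in [0, 5, 10, 15]:
    snr = 10 ** (snr_db / 10)
    print(f"SNR = {snr_db:2d} dB -> P_D = {pd_matched_filter(1e-3, snr):.4f}")
```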

Generalized Matched Filter

Now the noise is not i.i.d. WGN but correlated: W ~ N(0, C), where C is the covariance matrix. Then

X ~ N(0, C) under H_0
X ~ N(s, C) under H_1

The optimal test decides H_1 if l(x) = ln[ p(x | H_1) / p(x | H_0) ] > ln γ, where

l(x) = -1/2 [ (x - s)^T C^{-1} (x - s) - x^T C^{-1} x ]
     = -1/2 [ x^T C^{-1} x - 2 x^T C^{-1} s + s^T C^{-1} s - x^T C^{-1} x ]
     = x^T C^{-1} s - 1/2 s^T C^{-1} s

Generalized Matched Filter

Generalized matched filter: T(x) = x^T C^{-1} s; decide H_1 if

T(x) > ln γ + 1/2 s^T C^{-1} s =: γ'

Generalized Matched Filter: Prewhitening Matrix

When C = U^T A U is positive definite, C^{-1} = U^T A^{-1} U exists and is also positive definite, where A is a diagonal matrix of eigenvalues and U is orthogonal. C^{-1} can be factored as C^{-1} = D^T D, where D = A^{-1/2} U is called a prewhitening matrix. Then

T(x) = x^T C^{-1} s = x^T D^T D s = x'^T s', where x' = Dx and s' = Ds.

[Block diagram: x[n] → prewhitener D → x'[n] → correlate with s'[n] over N samples → T(x) → compare with γ' → Ĥ; s[n] → D → s'[n]]
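In numerical work the prewhitening matrix is often taken from a Cholesky factorization rather than the eigendecomposition above: if C = L L^T, then D = L^{-1} gives D^T D = C^{-1}, so x^T C^{-1} s = (Dx)^T (Ds). The sketch below verifies this with an assumed example covariance (NumPy/SciPy); it is an alternative factorization serving the same purpose as the one on the slide.

```python
import numpy as np
from scipy.linalg import cholesky, solve_triangular

rng = np.random.default_rng(2)
N = 8

# Assumed example: exponentially correlated noise covariance C[i, j] = rho^|i - j|
rho = 0.9
idx = np.arange(N)
C = rho ** np.abs(idx[:, None] - idx[None, :])

s = rng.standard_normal(N)                         # example known signal (assumed)
x = s + rng.multivariate_normal(np.zeros(N), C)    # one observation under H_1

# Prewhitener D = L^{-1}, where C = L L^T, so that D^T D = C^{-1}
L = cholesky(C, lower=True)
x_w = solve_triangular(L, x, lower=True)           # x' = D x
s_w = solve_triangular(L, s, lower=True)           # s' = D s

T_direct = s @ np.linalg.solve(C, x)               # x^T C^{-1} s
T_white = x_w @ s_w                                # (D x)^T (D s)
print(np.isclose(T_direct, T_white))               # True
```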

Performance of Generalized Matched Filter

T(x) = x^T C^{-1} s > γ'

Under either hypothesis, T(X) is Gaussian.

E[T | H_0] = E[ W^T C^{-1} s ] = 0
E[T | H_1] = E[ (s + W)^T C^{-1} s ] = s^T C^{-1} s
var(T | H_0) = E[ (W^T C^{-1} s)² ] = s^T C^{-1} s
var(T | H_1) = var(T | H_0) = s^T C^{-1} s

Thus,

P_FA = P[T > γ'; H_0] = Q( γ' / √(s^T C^{-1} s) )
P_D  = P[T > γ'; H_1] = Q( (γ' - s^T C^{-1} s) / √(s^T C^{-1} s) ) = Q( Q^{-1}(P_FA) - √(s^T C^{-1} s) )

P_D increases with s^T C^{-1} s, not with E/σ². The shape of the signal DOES affect the detection performance.

Signal Design for Correlated Noise

Objective: design the signal s to maximize P_D for a given P_FA, subject to the energy constraint s^T s = E:

maximize over s:  s^T C^{-1} s
subject to:       s^T s = E

Signal Design for Correlated Noise (Cont'd)

Solution: using a Lagrange multiplier,

F = s^T C^{-1} s + λ(E - s^T s)

Setting the derivative with respect to s to zero gives

2 C^{-1} s - 2λ s = 0,  i.e.,  C^{-1} s = λ s

Thus s^T C^{-1} s = λ s^T s = λE, and s is an eigenvector of C^{-1} associated with eigenvalue λ. To maximize s^T C^{-1} s, s should be associated with the maximum eigenvalue of C^{-1}. Since Cs = (1/λ)s, we should choose the signal as the (scaled) eigenvector of C associated with its minimum eigenvalue.
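In code, this amounts to taking the eigenvector of C with the smallest eigenvalue and scaling it to energy E; the covariance model, energy, and target P_FA below are assumed values for illustration, and the resulting P_D uses the generalized-matched-filter formula from the performance slide.

```python
import numpy as np
from scipy.stats import norm

# Assumed example: exponentially correlated noise covariance C[i, j] = rho^|i - j|
rho = 0.9
N = 16
idx = np.arange(N)
C = rho ** np.abs(idx[:, None] - idx[None, :])

E = 1.0                                   # energy constraint s^T s = E (assumed)
eigvals, eigvecs = np.linalg.eigh(C)      # eigenvalues in ascending order
s_opt = np.sqrt(E) * eigvecs[:, 0]        # scaled eigenvector with minimum eigenvalue

# Compare the deflection s^T C^{-1} s and the resulting P_D (at P_FA = 1e-3)
# for the optimal signal and a flat signal of the same energy.
s_flat = np.sqrt(E / N) * np.ones(N)
for name, s in [("optimal", s_opt), ("flat", s_flat)]:
    d2 = s @ np.linalg.solve(C, s)
    pd = norm.sf(norm.isf(1e-3) - np.sqrt(d2))
    print(f"{name:7s}: s^T C^-1 s = {d2:7.3f}, P_D = {pd:.4f}")
```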

Example

Assume W[n] ~ N(0, σ_n²) and the W[n] are uncorrelated, so the noise is independent but not identically distributed. How should the signal s[n], n = 0, ..., N-1, be designed to maximize P_D?

Example

Assume C = [1 ρ; ρ 1], where 0 < ρ < 1. How should the signal s[n], n = 0, 1, be designed to maximize P_D?

Multiple Signals

Detection:
H_0: X = W
H_1: X = s + W

Classification:
H_0: X = s_0 + W
H_1: X = s_1 + W

Example

Additive White Gaussian Noise (AWGN) communication channel. Two messages i ∈ {0, 1} with equal probabilities π_0 = π_1. Given i, the received signal is an N × 1 random vector

X = s_i + W

where W ~ N(0, σ² I_{N×N}) and {s_0, s_1} are known to the receiver. Design the ML receiver.

Minimum Distance Receiver

We want to minimize P_e. With equal prior probabilities for the H_i, the optimal detector is the ML detector:

p(x | H_i) = 1/(2πσ²)^{N/2} exp[ -1/(2σ²) Σ_{n=0}^{N-1} (x[n] - s_i[n])² ]

Decide H_i for which

D_i² := Σ_{n=0}^{N-1} (x[n] - s_i[n])² = (x - s_i)^T (x - s_i) = ‖x - s_i‖²

is minimum: the minimum distance receiver.

[Figure: signal points s_0 and s_1 with decision regions R_0 and R_1; the observation x is assigned to the nearer signal point.]
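A minimal sketch of such a minimum-distance (ML) receiver, assuming NumPy; the two-signal set and noise level are illustrative, not from the slides.

```python
import numpy as np

def min_distance_receiver(x, signals):
    """Return the index i minimizing ||x - s_i||^2 (equal-prior ML decision in WGN)."""
    dists = [np.sum((x - s) ** 2) for s in signals]
    return int(np.argmin(dists))

# Illustrative two-signal example
rng = np.random.default_rng(3)
N, sigma = 32, 0.5
n = np.arange(N)
s0 = np.cos(2 * np.pi * 0.10 * n)
s1 = np.cos(2 * np.pi * 0.20 * n)

x = s1 + sigma * rng.standard_normal(N)       # message 1 was sent
print(min_distance_receiver(x, [s0, s1]))     # expected to print 1
```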

Minimum Distance Receiver

D_i² := Σ_{n=0}^{N-1} (x[n] - s_i[n])² = Σ_{n=0}^{N-1} x²[n] - 2 Σ_{n=0}^{N-1} x[n] s_i[n] + Σ_{n=0}^{N-1} s_i²[n]

Since the first term does not depend on i, we decide H_i for which

T_i(x) := Σ_{n=0}^{N-1} x[n] s_i[n] - 1/2 Σ_{n=0}^{N-1} s_i²[n] = Σ_{n=0}^{N-1} x[n] s_i[n] - 1/2 E_i

is maximum.

Binary Case

ML detector: decide H_1 if

T_1(x) - T_0(x) = Σ_{n=0}^{N-1} x[n](s_1[n] - s_0[n]) - 1/2 (E_1 - E_0) > 0

or equivalently if

x^T (s_1 - s_0) > 1/2 (E_1 - E_0)

Error probability:

P_e = Q( ‖s_1 - s_0‖ / (2σ) )

Performance for Binary Case

P_e = P[Ĥ = H_1 | H_0] π_0 + P[Ĥ = H_0 | H_1] π_1
    = 1/2 ( P[T_1(X) - T_0(X) > 0 | H_0] + P[T_1(X) - T_0(X) < 0 | H_1] )

Let T(x) := T_1(x) - T_0(x) = Σ_{n=0}^{N-1} x[n](s_1[n] - s_0[n]) - 1/2 (E_1 - E_0).

T is a Gaussian random variable conditioned on either hypothesis:

E[T | H_0] = Σ_{n=0}^{N-1} s_0[n](s_1[n] - s_0[n]) - 1/2 (E_1 - E_0) = -1/2 ‖s_1 - s_0‖²
E[T | H_1] = -E[T | H_0] = 1/2 ‖s_1 - s_0‖²
var(T | H_0) = Σ_{n=0}^{N-1} var(X[n]) (s_1[n] - s_0[n])² = σ² ‖s_1 - s_0‖² = var(T | H_1)

So

T ~ N(-1/2 ‖s_1 - s_0‖², σ² ‖s_1 - s_0‖²) under H_0
T ~ N(+1/2 ‖s_1 - s_0‖², σ² ‖s_1 - s_0‖²) under H_1

and

P_e = Q( ‖s_1 - s_0‖ / (2σ) )
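This closed-form error probability is easy to check against a quick Monte Carlo simulation; the antipodal signal pair, noise level, and trial count below are assumptions for illustration.

```python
import numpy as np
from scipy.stats import norm

rng = np.random.default_rng(4)
N, sigma, trials = 16, 1.0, 200_000
s0 = -np.ones(N) / np.sqrt(N)        # assumed antipodal, unit-energy signal pair
s1 = np.ones(N) / np.sqrt(N)

# Theoretical P_e = Q(||s1 - s0|| / (2 sigma))
pe_theory = norm.sf(np.linalg.norm(s1 - s0) / (2 * sigma))

# Monte Carlo: equally likely messages, ML decision via T_1(x) - T_0(x) > 0
bits = rng.integers(0, 2, trials)
S = np.vstack([s0, s1])
X = S[bits] + sigma * rng.standard_normal((trials, N))
T_diff = X @ (s1 - s0) - 0.5 * (s1 @ s1 - s0 @ s0)
decisions = (T_diff > 0).astype(int)
pe_sim = np.mean(decisions != bits)
print(f"P_e theory = {pe_theory:.4f}, simulation = {pe_sim:.4f}")
```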