Lecture 8: Signal Detection and Noise Assumption

ECE 830 Statistical Signal Processing, Instructor: R. Nowak

1 Signal Detection

Consider the binary hypothesis test

$$H_0 : X = W$$
$$H_1 : X = S + W$$

where $W \sim \mathcal{N}(0, \sigma^2 I_{n \times n})$ and $S = [s_1, s_2, \ldots, s_n]^T$ is the known signal waveform. The densities under the two hypotheses are

$$P_0(X) = \frac{1}{(2\pi\sigma^2)^{n/2}} \exp\left(-\frac{1}{2\sigma^2} X^T X\right)$$
$$P_1(X) = \frac{1}{(2\pi\sigma^2)^{n/2}} \exp\left[-\frac{1}{2\sigma^2} (X-S)^T (X-S)\right]$$

The second equation holds because under hypothesis $H_1$, $W = X - S$. The log-likelihood ratio is

$$\log \Lambda(X) = \log \frac{P_W(X-S)}{P_W(X)} = -\frac{1}{2\sigma^2}\left[(X-S)^T(X-S) - X^T X\right] = -\frac{1}{2\sigma^2}\left[-2 X^T S + S^T S\right]$$

After simplifying, we get the test

$$X^T S \underset{H_0}{\overset{H_1}{\gtrless}} \sigma^2 \gamma + \frac{S^T S}{2} =: \gamma'$$

In this case $X^T S$ is the sufficient statistic $t(X)$ for the parameter $\theta \in \{0, 1\}$. Note that $S^T S = \|S\|^2$ is the signal energy. The LR detector filters the data by projecting them onto the signal subspace.

1.1 Example 1

Suppose we want to control the probability of false alarm; for example, choose $\gamma'$ so that $P(X^T S > \gamma' \mid H_0) \leq 0.05$. The test statistic $X^T S$ is usually called the matched filter. The projection onto the subspace spanned by $S$ is

$$P_S = \frac{S S^T}{S^T S}, \qquad P_S X = \frac{S S^T}{S^T S} X = S \, \frac{X^T S}{S^T S}$$
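As an illustration (not part of the original notes), the matched-filter test with a threshold set for a target false-alarm rate can be sketched in Python. The waveform `S`, noise level `sigma`, and level `alpha` are arbitrary stand-ins, and `scipy.stats.norm.isf` plays the role of $Q^{-1}$:

```python
import numpy as np
from scipy.stats import norm

rng = np.random.default_rng(0)

n, sigma, alpha = 64, 1.0, 0.05
S = np.cos(2 * np.pi * 0.1 * np.arange(n))   # stand-in known waveform

# Under H0, t(X) = X^T S ~ N(0, sigma^2 ||S||^2), so the threshold giving
# P_FA = alpha is gamma = sigma * ||S|| * Q^{-1}(alpha).
gamma = sigma * np.linalg.norm(S) * norm.isf(alpha)

# Monte Carlo check of the false-alarm rate under H0: X = W.
W = rng.normal(0.0, sigma, size=(200_000, n))
pfa_hat = np.mean(W @ S > gamma)
print(round(pfa_hat, 3))   # should be close to alpha
```

The empirical false-alarm rate concentrates near `alpha`, confirming the Gaussian null distribution of the matched-filter statistic.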

Figure 1: Projection of X onto the subspace spanned by S

Here $\frac{X^T S}{S^T S}$ is just a scalar. Geometrically, if the horizontal line represents the subspace spanned by $S$ and $X$ is some other vector, the projection of $X$ onto that subspace is as depicted in Figure 1.

1.2 Example 2

Suppose the signal is a sinusoid,

$$S_k = \cos(2\pi f_0 k + \theta), \quad k = 1, \ldots, n.$$

The matched filter in this case computes the content of the data at the specific frequency $f_0$, so $P_S$ in this example acts as a bandpass filter.

1.3 Performance Analysis

Next we want to know the probability density of $X^T S$, the sufficient statistic of this test. The hypotheses are

$$H_0 : X \sim \mathcal{N}(0, \sigma^2 I)$$
$$H_1 : X \sim \mathcal{N}(S, \sigma^2 I)$$

$X^T S = \sum_{k=1}^n X_k S_k$ is also Gaussian distributed. Recall that if $X \sim \mathcal{N}(\mu, \Sigma)$, then $Y = AX \sim \mathcal{N}(A\mu, A\Sigma A^T)$ for any matrix $A$. Since $Y = X^T S = S^T X$, $Y$ is a scalar, so

$$H_0 : X^T S \sim \mathcal{N}(S^T 0, \; S^T \sigma^2 I S) = \mathcal{N}(0, \sigma^2 \|S\|^2)$$
$$H_1 : X^T S \sim \mathcal{N}(S^T S, \; S^T \sigma^2 I S) = \mathcal{N}(\|S\|^2, \sigma^2 \|S\|^2)$$

With threshold $\gamma'$ on $X^T S$, the probability of false alarm is $P_{FA} = Q\left(\frac{\gamma' - 0}{\sigma\|S\|}\right)$, and the probability of detection is $P_D = Q\left(\frac{\gamma' - \|S\|^2}{\sigma\|S\|}\right) = Q\left(\frac{\gamma'}{\sigma\|S\|} - \frac{\|S\|}{\sigma}\right)$. Since the $Q$ function is invertible, $\frac{\gamma'}{\sigma\|S\|} = Q^{-1}(P_{FA})$. Therefore

$$P_D = Q\left(Q^{-1}(P_{FA}) - \frac{\|S\|}{\sigma}\right)$$

In this equation $\frac{\|S\|}{\sigma} = \sqrt{S^T S / \sigma^2}$ is the square root of the signal-to-noise ratio (SNR).

2 AWGN Assumption

Is real-world noise really additive, white, and Gaussian? Here are a few observations. Noise in many applications (e.g., communications and radar) arises from several independent sources, all adding together
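The closed form $P_D = Q(Q^{-1}(P_{FA}) - \|S\|/\sigma)$ can be checked numerically. This sketch (waveform and parameters are arbitrary choices, not from the notes) compares the formula against a Monte Carlo estimate under $H_1$:

```python
import numpy as np
from scipy.stats import norm

rng = np.random.default_rng(1)

n, sigma, alpha = 64, 1.0, 0.05
S = 0.3 * np.ones(n)                 # stand-in waveform; SNR = ||S||^2 / sigma^2
snr = (S @ S) / sigma**2

# Closed form: P_D = Q(Q^{-1}(P_FA) - sqrt(SNR)), with Q = norm.sf.
pd_theory = norm.sf(norm.isf(alpha) - np.sqrt(snr))

# Monte Carlo estimate under H1: X = S + W, same threshold as under H0.
gamma = sigma * np.linalg.norm(S) * norm.isf(alpha)
X = S + rng.normal(0.0, sigma, size=(100_000, n))
pd_hat = np.mean(X @ S > gamma)
print(round(pd_theory, 3), round(pd_hat, 3))
```

The two numbers agree to Monte Carlo accuracy, illustrating that the ROC depends on the data only through the SNR.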

at the sensors and combining additively with the measurement. AWGN is Gaussian distributed:

$$W \sim \mathcal{N}(0, \sigma^2 I)$$

Figure 2: Distributions of the test statistic under P_0 and P_1
Figure 3: Relation between probability of detection and probability of false alarm

CLT (Central Limit Theorem): If $x_1, \ldots, x_n$ are independent random variables with means $\mu_i$ and variances $\sigma_i^2 < \infty$, then

$$Z_n = \frac{1}{\sqrt{n}} \sum_{i=1}^n \frac{x_i - \mu_i}{\sigma_i} \to \mathcal{N}(0, 1)$$

in distribution, and the convergence is quite quick. Thus it is quite reasonable to model noise as additive and Gaussian in many applications. However, whiteness is not always a good assumption.

2.1 Example 3

Suppose $W = S_1 + S_2 + \cdots + S_k$, where $S_1, S_2, \ldots, S_k$ are interfering signals that are not of interest, but each of them is structured/correlated in time. We therefore need a more general noise model: colored Gaussian noise.

3 Colored Gaussian Noise

$W \sim \mathcal{N}(0, \Sigma)$ is called correlated or colored noise, where $\Sigma$ is a structured covariance matrix. Consider the binary hypothesis test in this case.
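The speed of the CLT convergence is easy to see numerically. This sketch (uniform terms and sample sizes are arbitrary choices) standardizes sums of non-Gaussian terms and checks that $Z_n$ has approximately unit-normal moments:

```python
import numpy as np

rng = np.random.default_rng(2)

# Standardized sums of non-Gaussian (uniform) terms approach N(0, 1).
n, trials = 50, 200_000
x = rng.uniform(0.0, 1.0, size=(trials, n))   # each term: mean 1/2, variance 1/12
z = (x - 0.5) / np.sqrt(1.0 / 12.0)           # standardize each term
Z_n = z.sum(axis=1) / np.sqrt(n)              # Z_n from the CLT statement

print(round(Z_n.mean(), 2), round(Z_n.var(), 2))
```

Even with only $n = 50$ terms, the empirical mean and variance of $Z_n$ are already very close to 0 and 1.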

Figure 4: Relation between probability of detection and SNR

$$H_0 : X = S_0 + W$$
$$H_1 : X = S_1 + W$$

where $W \sim \mathcal{N}(0, \Sigma)$ and $S_0$ and $S_1$ are known signal waveforms. So we can rewrite the hypotheses as

$$H_0 : X \sim \mathcal{N}(S_0, \Sigma)$$
$$H_1 : X \sim \mathcal{N}(S_1, \Sigma)$$

The probability density under each hypothesis is

$$P_i(X) = \frac{1}{(2\pi)^{n/2} |\Sigma|^{1/2}} \exp\left[-\frac{1}{2} (X - S_i)^T \Sigma^{-1} (X - S_i)\right], \quad i = 0, 1$$

The log-likelihood ratio is

$$\log \frac{P_1(X)}{P_0(X)} = -\frac{1}{2}\left[(X - S_1)^T \Sigma^{-1} (X - S_1) - (X - S_0)^T \Sigma^{-1} (X - S_0)\right] = X^T \Sigma^{-1} (S_1 - S_0) - \frac{1}{2} S_1^T \Sigma^{-1} S_1 + \frac{1}{2} S_0^T \Sigma^{-1} S_0 \underset{H_0}{\overset{H_1}{\gtrless}} \gamma$$

Letting $t(X) = (S_1 - S_0)^T \Sigma^{-1} X$, we get

$$(S_1 - S_0)^T \Sigma^{-1} X \underset{H_0}{\overset{H_1}{\gtrless}} \gamma + \frac{1}{2} S_1^T \Sigma^{-1} S_1 - \frac{1}{2} S_0^T \Sigma^{-1} S_0 =: \gamma'$$

The distribution of $t$ under the two hypotheses is

$$H_0 : t \sim \mathcal{N}\left((S_1 - S_0)^T \Sigma^{-1} S_0, \; (S_1 - S_0)^T \Sigma^{-1} (S_1 - S_0)\right)$$
$$H_1 : t \sim \mathcal{N}\left((S_1 - S_0)^T \Sigma^{-1} S_1, \; (S_1 - S_0)^T \Sigma^{-1} (S_1 - S_0)\right)$$

The probability of false alarm is

$$P_{FA} = Q\left(\frac{\gamma' - (S_1 - S_0)^T \Sigma^{-1} S_0}{\left[(S_1 - S_0)^T \Sigma^{-1} (S_1 - S_0)\right]^{1/2}}\right)$$

In this case it is natural to define $\mathrm{SNR} = (S_1 - S_0)^T \Sigma^{-1} (S_1 - S_0)$.
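A short sketch of the colored-noise correlator, using an AR(1)-style covariance and sinusoidal waveform as stand-ins (these specific choices are not from the notes). It checks the identity that the mean separation of $t$ between the hypotheses equals the SNR, which is also the variance of $t$:

```python
import numpy as np

# Sketch of the correlator t(X) = (S1 - S0)^T Sigma^{-1} X for colored noise.
n, rho = 16, 0.8
idx = np.arange(n)
Sigma = rho ** np.abs(idx[:, None] - idx[None, :])   # AR(1)-style stand-in
Sigma_inv = np.linalg.inv(Sigma)

S0 = np.zeros(n)
S1 = np.sin(2 * np.pi * 0.05 * idx)

d = Sigma_inv @ (S1 - S0)                 # correlator weights
snr = (S1 - S0) @ Sigma_inv @ (S1 - S0)   # SNR as defined above

# The mean of t under H1 minus its mean under H0 is d^T (S1 - S0) = SNR.
sep = d @ S1 - d @ S0
print(np.isclose(sep, snr))
```

Because the mean separation and the standard deviation of $t$ are $\mathrm{SNR}$ and $\sqrt{\mathrm{SNR}}$ respectively, detection performance again depends only on this single scalar.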

3.1 Example 4

Let

$$S_1 = [1, 1]^T, \quad S_0 = [-1, -1]^T, \quad \Sigma = \begin{bmatrix} 1 & \rho \\ \rho & 1 \end{bmatrix}, \quad \Sigma^{-1} = \frac{1}{1 - \rho^2} \begin{bmatrix} 1 & -\rho \\ -\rho & 1 \end{bmatrix}$$

The test statistic is

$$y = (S_1 - S_0)^T \Sigma^{-1} X = [2, 2] \, \frac{1}{1 - \rho^2} \begin{bmatrix} 1 & -\rho \\ -\rho & 1 \end{bmatrix} \begin{bmatrix} x_1 \\ x_2 \end{bmatrix} = \frac{2}{1 + \rho} (x_1 + x_2)$$

Its distribution under the two hypotheses is

$$H_0 : y \sim \mathcal{N}\left(-\frac{4}{1 + \rho}, \; \frac{8}{1 + \rho}\right)$$
$$H_1 : y \sim \mathcal{N}\left(+\frac{4}{1 + \rho}, \; \frac{8}{1 + \rho}\right)$$

The probabilities of false alarm and detection are

$$P_{FA} = Q\left(\frac{\gamma' + \frac{4}{1+\rho}}{\sqrt{8/(1+\rho)}}\right), \qquad P_D = Q\left(\frac{\gamma' - \frac{4}{1+\rho}}{\sqrt{8/(1+\rho)}}\right)$$

Figure 5: ROC curves for different values of ρ
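The closed forms in this example can be verified numerically. This sketch assumes $S_1 = [1, 1]^T$, $S_0 = [-1, -1]^T$ as in the example, with an arbitrary value of $\rho$:

```python
import numpy as np

rho = 0.5                                 # arbitrary correlation for the check
Sigma = np.array([[1.0, rho], [rho, 1.0]])
S1 = np.array([1.0, 1.0])
S0 = np.array([-1.0, -1.0])

d = np.linalg.inv(Sigma) @ (S1 - S0)      # weights of y = d^T X

print(np.allclose(d, (2 / (1 + rho)) * np.ones(2)),   # y = 2/(1+rho)(x1 + x2)
      np.isclose(d @ S1, 4 / (1 + rho)),              # mean of y under H1
      np.isclose(d @ S0, -4 / (1 + rho)),             # mean of y under H0
      np.isclose(d @ (S1 - S0), 8 / (1 + rho)))       # variance of y (= SNR)
```

All four checks pass for any $|\rho| < 1$: as $\rho$ grows, the means move toward 0 faster than the standard deviation shrinks, which is why the ROC degrades with increasing correlation.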