EE 574 Detection and Estimation Theory Lecture Presentation 8

Lecture Presentation 8
Aykut HOCANIN
Dept. of Electrical and Electronic Engineering
1/14

Chapter 3: Representation of Random Processes

3.2 Deterministic Functions: Orthogonal Representations

For a finite-energy signal defined over [0, T],

E_x = \int_0^T x^2(t)\, dt < \infty.    (1)

The orthogonal expansion is given by

x(t) = \sum_{i=1}^{\infty} x_i \phi_i(t).    (2)

The coefficients x_i which minimize the mean-square approximation error for a given N are given by

x_i = \int_0^T x(t) \phi_i(t)\, dt.    (3)

As N \to \infty, the approximation error goes to zero. We say that φ_i(t), i = 1, 2, ... form a complete orthonormal (CON) set.
2/14
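A minimal numerical sketch of (1)-(3) follows (not from the slides; the sinusoidal basis and the example signal are assumptions for illustration): the coefficients are computed by the correlation in (3), and the mean-square approximation error decreases as more terms of the expansion (2) are kept.

import numpy as np

T, fs = 1.0, 10_000                  # interval length and sampling rate (assumed values)
dt = 1 / fs
t = np.arange(0, T, dt)

def phi(i):
    """Assumed complete orthonormal set on [0, T]: sqrt(2/T) sin(i*pi*t/T)."""
    return np.sqrt(2 / T) * np.sin(i * np.pi * t / T)

x = t / T                            # example finite-energy signal on [0, T]

for N in (1, 5, 25, 100):
    coeffs = [np.sum(x * phi(i)) * dt for i in range(1, N + 1)]      # eq. (3)
    x_hat = sum(c * phi(i) for i, c in enumerate(coeffs, start=1))   # partial sum of eq. (2)
    mse = np.sum((x - x_hat) ** 2) * dt                              # approximation error
    print(f"N = {N:3d}   mean-square error = {mse:.5f}")

The printed mean-square error shrinks monotonically as N grows, consistent with the CON-set claim above.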

Substituting the expansion (2) into (1) and using orthonormality, it is observed that

E_x = \sum_{i=1}^{\infty} x_i^2,    (4)

which is Parseval's theorem. The coefficients can be generated using two different approaches:

1. correlation operation (figure 1)
2. filter operation (figure 2)

3.3-3.8 Random Process Characterization

This topic was discussed in detail in the probability review at the beginning of the semester. Please see the textbook for alternative interpretations and examples.
3/14
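A short check of (4) and of the two generation methods (again a sketch under an assumed sine basis and example signal, not from the slides): the sum of squared coefficients approaches the signal energy, and each coefficient obtained by correlation matches the output of a filter with impulse response φ_i(T - τ) sampled at t = T.

import numpy as np

T, fs = 1.0, 2_000
dt = 1 / fs
t = np.arange(0, T, dt)
phi = lambda i: np.sqrt(2 / T) * np.sin(i * np.pi * t / T)   # assumed CON basis
x = t / T                                                     # example finite-energy signal

N = 200
# Correlation operation (figure 1): x_i = integral of x(t) phi_i(t) dt
corr = np.array([np.sum(x * phi(i)) * dt for i in range(1, N + 1)])
# Filter operation (figure 2): convolve with phi_i(T - tau) and sample the output at t = T
filt = np.array([np.convolve(x, phi(i)[::-1])[t.size - 1] * dt for i in range(1, N + 1)])

E_x = np.sum(x ** 2) * dt
print("signal energy E_x          :", E_x)
print(f"sum of x_i^2 (first {N})   :", np.sum(corr ** 2))    # Parseval, eq. (4)
print("max |correlation - filter| :", np.max(np.abs(corr - filt)))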

Chapter 4: Detection of Signals; Estimation of Signal Parameters

Detection

The classical theory is extended to observations that consist of continuous waveforms. Thermal noise can be modelled as a sample function from a Gaussian random process. In most systems the spectrum of the thermal noise is flat over the frequency range of interest (spectral height N_0/2 joules). Figure 3 shows a case of known signals in the presence of additive white Gaussian noise.

In digital communication systems:
The two types of error (deciding 1 when 0 was sent, and vice versa) are usually of equal importance.
A signal is present in both hypotheses.
4/14

The probability of error is sufficient for measuring system performance.
Error correction is possible.

In radar/sonar systems:
The two types of error have different importance.
A signal is present in only one hypothesis.
The ROC is needed to characterize performance.
Error correction is not possible.

Estimation

The estimation of signal parameters is encountered frequently in both the communications and radar/sonar areas. The purpose of the receiver is to estimate the values of the successive A_i and use these estimates to reconstruct the message (figure 4).
5/14
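As a hedged illustration of why the ROC, rather than a single error probability, characterizes the radar/sonar problem (not from the slides): for the known-signal-in-white-Gaussian-noise case the sufficient statistic is Gaussian under both hypotheses, and sweeping the decision threshold traces out P_D versus P_F for a given normalized distance d between the hypotheses.

import numpy as np
from math import erfc, sqrt

Q = lambda z: 0.5 * erfc(z / sqrt(2))   # Gaussian tail probability

# Normalized sufficient statistic (assumed): N(0, 1) under H0 and N(d, 1) under H1.
for d in (1.0, 2.0, 3.0):
    print(f"d = {d}")
    for gamma in np.linspace(-2.0, 5.0, 8):     # sweep the decision threshold
        PF = Q(gamma)                           # probability of false alarm
        PD = Q(gamma - d)                       # probability of detection
        print(f"  threshold {gamma:+.1f}:  P_F = {PF:.3f},  P_D = {PD:.3f}")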

The approach in this chapter involves:

1. The observation consists of a waveform r(t) and hence may be infinite dimensional. We therefore map the received signal into a convenient decision or estimation space.
2. In detection, decision regions are selected and the ROC or P(ε) is computed. In estimation, the variance or the mean-square error is computed.
3. The results are examined for possible improvement of the design.

Detection and Estimation in White Gaussian Noise

In the simple binary detection problem the following hypotheses are given:

r(t) = \sqrt{E}\, s(t) + w(t), \quad 0 \le t \le T : H_1
r(t) = w(t), \quad 0 \le t \le T : H_0    (5)

It is assumed that the signals have unit energy. The problem is to observe r(t) over the interval [0, T] and decide whether H_0 or H_1 is true.
6/14
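A hedged Monte Carlo sketch of (5) (not part of the slides; the signal shape, energy, and noise level are assumed values): each received waveform is reduced to the single correlation statistic r_1 = \int_0^T r(t) s(t) dt and compared to a threshold. With equally likely hypotheses and a minimum-probability-of-error criterion the threshold is \sqrt{E}/2, and the simulated error rate should track Q(\sqrt{E/(2 N_0)}).

import numpy as np
from math import erfc, sqrt

rng = np.random.default_rng(0)
T, fs = 1.0, 500                      # observation interval and sampling rate (assumed)
dt = 1 / fs
t = np.arange(0, T, dt)

E, N0 = 4.0, 1.0                      # assumed signal energy and noise spectral height
s = np.sqrt(2 / T) * np.sin(2 * np.pi * t / T)   # unit-energy signal (assumed shape)

n_trials = 20_000
h1 = rng.integers(0, 2, n_trials).astype(bool)   # true hypothesis, equally likely

# Discretized white noise: per-sample variance N0/(2*dt) approximates spectral height N0/2.
W = rng.normal(0.0, np.sqrt(N0 / (2 * dt)), size=(n_trials, t.size))
R = np.sqrt(E) * np.outer(h1, s) + W             # received waveforms, one per row

r1 = R @ s * dt                                  # sufficient statistic: correlate with s(t)
decide_h1 = r1 > np.sqrt(E) / 2                  # minimum-Pe threshold for equal priors

Q = lambda z: 0.5 * erfc(z / sqrt(2))
print("simulated P(error)            :", np.mean(decide_h1 != h1))
print("theoretical Q(sqrt(E/(2 N0))) :", Q(sqrt(E / (2 * N0))))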

The observation is a continuous-time random waveform, and hence the first step is to reduce it to a set of random variables (figure 5); the set may be countably infinite:

r(t) = \lim_{K \to \infty} \sum_{i=1}^{K} r_i \phi_i(t), \quad 0 \le t \le T.    (6)

The receiver may take the form of a correlation receiver or, equivalently, a matched filter receiver. The distance between the two signals in a general binary detection in Gaussian noise problem is given by

d^2 = \frac{2}{N_0} \left( E_1 + E_0 - 2\rho \sqrt{E_0 E_1} \right).    (7)

For fixed energies the best performance is obtained by making ρ = -1. Hence

s_0(t) = -s_1(t).    (8)

It should be noted that the signal shape is not important.
7/14
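The following short sketch (not from the slides; it assumes equally likely hypotheses and a minimum-probability-of-error receiver, so that P(ε) = Q(d/2) with d^2 from (7)) shows how the correlation coefficient ρ between the two signals drives performance: ρ = -1 (antipodal signals) gives the largest d and the smallest error probability, independent of the signal shape.

import numpy as np
from math import erfc, sqrt

Q = lambda z: 0.5 * erfc(z / sqrt(2))

E0 = E1 = 1.0        # assumed equal signal energies
N0 = 0.5             # assumed noise spectral height

for rho, label in [(1.0, "identical"), (0.0, "orthogonal"), (-1.0, "antipodal")]:
    d2 = (2 / N0) * (E1 + E0 - 2 * rho * np.sqrt(E0 * E1))   # eq. (7)
    pe = Q(np.sqrt(d2) / 2)                                  # assumed min-Pe criterion
    print(f"rho = {rho:+.1f} ({label:10s}):  d^2 = {d2:5.2f},  P(error) = {pe:.4f}")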

Linear Estimation

The received waveform in additive white noise is given by

r(t) = s(t, A) + w(t), \quad 0 \le t \le T,    (9)

where w(t) is a sample function from a white Gaussian noise process with spectral height N_0/2. We wish to estimate A:
If A is random, we assume that the a priori density is known and use a Bayesian estimation procedure.
If A is a nonrandom variable, we use ML estimation.

If s(t, A) is a linear mapping (superposition holds), the system is referred to as a linear signaling system, and the estimator will be linear for the various criteria of interest. For a linear system, equation (9) becomes

r(t) = A \sqrt{E}\, s(t) + w(t), \quad 0 \le t \le T.    (10)
8/14

Using the linearity property, the estimators are readily computed. The sufficient statistic is

r_1 = \int_0^T r(t) s(t)\, dt.    (11)

The probability density of r_1 given a = A is Gaussian, G(A\sqrt{E}, N_0/2).

\hat{a}_{ML}(R_1) = \frac{R_1}{\sqrt{E}}.    (12)

If A is a random variable with probability density p_a(A), then the MAP estimate is the value of A where

l_p(A) = -\frac{1}{2} \, \frac{(R_1 - A\sqrt{E})^2}{N_0/2} + \ln p_a(A)    (13)

is a maximum. For a Gaussian a priori density with variance σ_a^2,

\hat{a}_{MAP}(R_1) = \frac{2E/N_0}{2E/N_0 + 1/\sigma_a^2} \, \frac{R_1}{\sqrt{E}}.    (14)

It should be noted that the only difference between the two estimators is the
9/14

gain. The MAP estimate is also the Bayes estimate for a large class of other criteria (e.g. the squared-error cost function) as long as the a posteriori density is Gaussian.
10/14
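To make the "gain only" difference concrete, here is a hedged sketch (not from the slides; it assumes a zero-mean Gaussian prior p_a(A) with variance σ_a^2, which is what makes the a posteriori density Gaussian) comparing \hat{a}_{ML} = R_1/\sqrt{E} from (12) with \hat{a}_{MAP} from (14) on simulated data.

import numpy as np

rng = np.random.default_rng(1)
E, N0, sigma_a2 = 2.0, 1.0, 4.0      # assumed energy, noise level, and prior variance
n_trials = 200_000

A = rng.normal(0.0, np.sqrt(sigma_a2), n_trials)                    # A ~ N(0, sigma_a^2)
R1 = A * np.sqrt(E) + rng.normal(0.0, np.sqrt(N0 / 2), n_trials)    # R1 | A ~ N(A*sqrt(E), N0/2)

a_ml = R1 / np.sqrt(E)                                   # eq. (12)
gain = (2 * E / N0) / (2 * E / N0 + 1 / sigma_a2)        # eq. (14): MAP estimate = gain * ML estimate
a_map = gain * a_ml

print("MAP gain                :", gain)
print("mean-square error (ML)  :", np.mean((a_ml - A) ** 2))
print("mean-square error (MAP) :", np.mean((a_map - A) ** 2))

The MAP estimate shrinks the ML estimate toward the prior mean by the gain factor, which lowers the mean-square error when the prior is informative.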

Figure 1: Generation of expansion coefficients by correlation operation. (x(t) is multiplied by each φ_i(t), i = 1, ..., N, and integrated over [0, T] to produce the coefficients x_1, ..., x_N.)
11/14

Figure 2: Generation of expansion coefficients by filter operation. (x(t) is passed through filters with impulse responses φ_i(T - τ), i = 1, ..., N, and the outputs are sampled at t = T to produce the coefficients x_1, ..., x_N.)
12/14

Figure 3: A digital communications system. (Source → transmitter → noise n(t) added → receiver → user; the bit sequence 0, 1, 1, 0 is recovered from r(t).)

Figure 4: A parameter transmission system. (The message a(t) is sampled, the values A_i are transmitted as s(t, A_i) in noise n(t), and the receiver reconstructs the message.)
13/14

Figure 5: Generation of sufficient statistics. (r(t) → decomposition into coordinates → r, an infinite-dimensional vector → rotation of coordinates → l, the sufficient statistic → decision device.)
14/14