A Hilbert Space for Random Processes


Gaussian Basics · Random Processes · Filtering of Random Processes · Signal Space Concepts

A Hilbert Space for Random Processes

- A vector space for random processes $X_t$ that is analogous to $L^2(a, b)$ is of greatest interest to us.
- This vector space contains random processes that satisfy $\int_a^b E[X_t^2]\, dt < \infty$.
- Inner product (a cross-correlation): $\langle X_t, Y_t \rangle = E\left[ \int_a^b X_t Y_t \, dt \right]$.
- Fact: this vector space is separable; therefore, an orthonormal basis $\{\Phi_n\}$ exists.

© 2018, B.-P. Paris, ECE 630: Statistical Communication Theory (slide 114)

A Hilbert Space for Random Processes (cont'd)

- Representation: $X_t = \sum_{n=1}^{\infty} X_n \Phi_n(t)$ for $a \le t \le b$, with
  $X_n = \langle X_t, \Phi_n(t) \rangle = \int_a^b X_t \Phi_n(t)\, dt$.
- Note that $X_n$ is a random variable.
- For this to be a valid Hilbert space, we must interpret equality of processes $X_t$ and $Y_t$ in the mean-squared sense, i.e., $X_t = Y_t$ means $E[|X_t - Y_t|^2] = 0$.

Karhunen-Loève Expansion

- Goal: choose an orthonormal basis $\{\Phi_n\}$ such that the representation $\{X_n\}$ of the random process $X_t$ consists of uncorrelated random variables.
- The resulting representation is called the Karhunen-Loève expansion.
- Thus, we want $E[X_n X_m] = E[X_n] E[X_m]$ for $n \ne m$.

Karhunen-Loève Expansion (cont'd)

- It can be shown that for the representation $\{X_n\}$ to consist of uncorrelated random variables, the orthonormal basis functions $\{\Phi_n\}$ must satisfy
  $\int_a^b K_X(t, u)\, \Phi_n(u)\, du = \lambda_n \Phi_n(t)$,
  where $\lambda_n = \mathrm{Var}[X_n]$.
- $\{\Phi_n\}$ and $\{\lambda_n\}$ are the eigenfunctions and eigenvalues of the autocovariance function $K_X(t, u)$, respectively.
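The eigenvalue problem above can be explored numerically. The sketch below (not from the slides; the kernel $K(t,u) = e^{-|t-u|}$ and all grid parameters are arbitrary choices) discretizes the integral equation into a matrix eigenvalue problem, then checks by simulation that projecting sample paths onto the eigenfunctions yields approximately uncorrelated coefficients with $\mathrm{Var}[X_n] = \lambda_n$:

```python
import numpy as np

# Discretize  int_a^b K_X(t, u) Phi_n(u) du = lambda_n Phi_n(t)
# on an N-point grid; the operator becomes the matrix (K * dt).
rng = np.random.default_rng(0)
a, b, N, trials = 0.0, 1.0, 100, 20_000
t = np.linspace(a, b, N, endpoint=False) + (b - a) / (2 * N)  # midpoints
dt = (b - a) / N

K = np.exp(-np.abs(t[:, None] - t[None, :]))        # sample autocovariance kernel
lam, V = np.linalg.eigh(K * dt)                     # discretized operator
order = np.argsort(lam)[::-1]                       # sort eigenvalues descending
lam, Phi = lam[order], V[:, order] / np.sqrt(dt)    # Phi_n orthonormal in L2(a, b)

# Zero-mean Gaussian sample paths with covariance K, projected onto Phi:
X = rng.multivariate_normal(np.zeros(N), K, size=trials)
coeff = X @ Phi * dt                                # X_n = <X_t, Phi_n>
C = np.cov(coeff[:, :3], rowvar=False)
print(np.diag(C))                                   # close to lam[:3]
print(C[0, 1])                                      # cross-covariance near 0
```

The diagonal covariance of the coefficients is exactly the uncorrelatedness property that defines the K-L expansion.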

Example: Wiener Process

- For the Wiener process, the autocovariance function is
  $K_X(t, u) = R_X(t, u) = \sigma^2 \min(t, u)$.
- It can be shown that
  $\Phi_n(t) = \sqrt{\frac{2}{T}} \sin\left( \left(n - \tfrac{1}{2}\right) \pi \frac{t}{T} \right)$,
  $\lambda_n = \left( \frac{\sigma T}{\left(n - \tfrac{1}{2}\right) \pi} \right)^2$
  for $n = 1, 2, \ldots$
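The stated eigenvalues can be sanity-checked numerically (a sketch assuming $\sigma = 1$, $T = 1$ for concreteness): discretize the Wiener kernel $\min(t, u)$ and compare the matrix eigenvalues against $\lambda_n = \left(\sigma T / ((n - \tfrac{1}{2})\pi)\right)^2$.

```python
import numpy as np

# Eigenvalues of the discretized Wiener-process kernel sigma^2 * min(t, u)
# should match lambda_n = (sigma * T / ((n - 1/2) * pi))**2.
sigma, T, N = 1.0, 1.0, 1000
t = np.arange(1, N + 1) * (T / N)        # right-endpoint grid on (0, T]
dt = T / N
K = sigma**2 * np.minimum(t[:, None], t[None, :])
lam_num = np.sort(np.linalg.eigvalsh(K * dt))[::-1]

n = np.arange(1, 6)
lam_ana = (sigma * T / ((n - 0.5) * np.pi))**2
print(lam_num[:5])                        # numerical eigenvalues
print(lam_ana)                            # analytical lambda_1 .. lambda_5
```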

Properties of the K-L Expansion

- The eigenfunctions of the autocovariance function form a complete basis.
- If $X_t$ is Gaussian, then the representation $\{X_n\}$ is a vector of independent, Gaussian random variables.
- For white noise, $K_X(t, u) = \frac{N_0}{2} \delta(t - u)$. Then the eigenfunctions must satisfy
  $\int \frac{N_0}{2} \delta(t - u)\, \Phi(u)\, du = \lambda \Phi(t)$.
  - Any orthonormal set of basis functions $\{\Phi_n\}$ satisfies this condition!
  - The eigenvalues $\lambda$ are all equal to $\frac{N_0}{2}$.
- If $X_t$ is white Gaussian noise, then the representation $\{X_n\}$ consists of independent, identically distributed random variables with
  - zero mean,
  - variance $\frac{N_0}{2}$.
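The white-noise property — any orthonormal basis works, and every coefficient has variance $N_0/2$ — can be illustrated in discrete time. The sketch below (an assumption-laden illustration, not the slides' derivation; the Fourier-type basis functions are an arbitrary orthonormal pair) projects sampled white Gaussian noise of spectral height $N_0/2$ onto two basis functions and checks the coefficient statistics:

```python
import numpy as np

# Projecting white Gaussian noise (PSD N0/2) onto ANY orthonormal pair
# yields uncorrelated coefficients, each with variance N0/2.
rng = np.random.default_rng(2)
N0, T, N, trials = 2.0, 1.0, 64, 20_000
dt = T / N
t = np.arange(N) * dt
phi1 = np.sqrt(2 / T) * np.cos(2 * np.pi * t / T)   # orthonormal in L2(0, T)
phi2 = np.sqrt(2 / T) * np.sin(2 * np.pi * t / T)

# Discrete-time white noise: per-sample variance N0 / (2 dt)
noise = rng.normal(0.0, np.sqrt(N0 / (2 * dt)), (trials, N))
X1 = noise @ phi1 * dt                              # X_1 = <N_t, phi1>
X2 = noise @ phi2 * dt                              # X_2 = <N_t, phi2>
print(np.var(X1), np.var(X2))                       # both near N0/2 = 1
print(np.mean(X1 * X2))                             # near 0 (uncorrelated)
```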

Part III: Optimum Receivers in AWGN Channels

A Simple Communication System

Block diagram: the source emits $m \in \{0, 1\}$; the transmitter maps $m \to s(t)$; white noise $N_t$ is added to form $R_t$; the receiver maps $R_t \to \hat{m} \in \{0, 1\}$.

- Objectives: for the above system,
  - describe the optimum receiver, and
  - find the probability of error for that receiver.

Assumptions

- Noise: $N_t$ is a white Gaussian noise process with spectral height $\frac{N_0}{2}$:
  $R_N(\tau) = \frac{N_0}{2} \delta(\tau)$.
  - This is additive white Gaussian noise (AWGN).
- Source: characterized by the a priori probabilities
  $p_0 = \Pr\{m = 0\}$, $p_1 = \Pr\{m = 1\}$.
  - For this example, we will assume $p_0 = p_1 = \frac{1}{2}$.

Assumptions (cont'd)

- Transmitter: maps the information bit $m$ to signals:
  $m \to s(t) = \begin{cases} s_0(t) = \sqrt{\frac{E_b}{T}} & \text{if } m = 0 \\ s_1(t) = -\sqrt{\frac{E_b}{T}} & \text{if } m = 1 \end{cases}$
  for $0 \le t \le T$.
- Note that we are considering the transmission of a single bit.
  - In AWGN channels, each bit can be considered in isolation.

Objective

- In general, the objective is to find the receiver that minimizes the probability of error:
  $\Pr\{e\} = \Pr\{\hat{m} \ne m\} = p_0 \Pr\{\hat{m} = 1 \mid m = 0\} + p_1 \Pr\{\hat{m} = 0 \mid m = 1\}$.
- For this example, the optimal receiver will be given (next slide).
- We also compute the probability of error for the communication system.
  - That is the focus of this example.

Receiver

- We will see that the following receiver minimizes the probability of error for this communication system: multiply $R_t$ by $\sqrt{\frac{E_b}{T}}$, integrate over $[0, T]$ to obtain $R$, and decide $\hat{m} = 0$ if $R > 0$, $\hat{m} = 1$ if $R < 0$.
- The RX frontend computes
  $R = \int_0^T R_t \sqrt{\frac{E_b}{T}}\, dt = \langle R_t, s_0(t) \rangle$.
- The RX backend compares $R$ to a threshold to arrive at the decision $\hat{m}$.
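A minimal discrete-time sketch of this frontend/backend split (the parameter values $E_b = 3$, $N_0 = 2$, $T = 1$ and the grid size are arbitrary assumptions, and the Riemann-sum correlator only approximates the integral):

```python
import numpy as np

rng = np.random.default_rng(0)
Eb, N0, T, N = 3.0, 2.0, 1.0, 1000
dt = T / N
s0 = np.full(N, np.sqrt(Eb / T))        # rectangular antipodal signal s0(t)

def frontend(r):
    """R = <R_t, s0(t)>, approximated by a Riemann sum."""
    return np.sum(r * s0, axis=-1) * dt

def backend(R):
    """Threshold test: m-hat = 0 if R > 0, else 1."""
    return np.where(R > 0, 0, 1)

# Simulate m = 0 transmissions: R_t = s0(t) + N_t with PSD N0/2
trials = 10_000
noise = rng.normal(0.0, np.sqrt(N0 / (2 * dt)), (trials, N))
R = frontend(s0 + noise)
print(R.mean(), R.var())                # approx E_b and (N0/2) * E_b
```

Note that the noise-free output of the frontend is exactly $\langle s_0, s_0 \rangle = E_b$, matching the conditional mean derived on the following slides.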

Plan for Finding Pr{e}

- Analysis of the receiver proceeds in the following steps:
  1. Find the conditional distribution of the output $R$ of the receiver frontend.
     - Conditioning is with respect to each of the possibly transmitted signals.
     - This boils down to finding the conditional mean and variance of $R$.
  2. Find the conditional error probabilities $\Pr\{\hat{m} = 0 \mid m = 1\}$ and $\Pr\{\hat{m} = 1 \mid m = 0\}$.
     - This involves finding the probability that $R$ exceeds a threshold.
  3. Total probability of error:
     $\Pr\{e\} = p_0 \Pr\{\hat{m} = 1 \mid m = 0\} + p_1 \Pr\{\hat{m} = 0 \mid m = 1\}$.

Conditional Distribution of R

- Two random effects influence the received signal:
  - the additive white Gaussian noise $N_t$, and
  - the random information bit $m$.
- By conditioning on $m$ (and thus on $s(t)$), the only remaining randomness is caused by the noise.
- Conditional on $m$, the output $R$ of the receiver frontend is a Gaussian random variable:
  - $N_t$ is a Gaussian random process; for given $s(t)$, $R_t = s(t) + N_t$ is a Gaussian random process.
  - The frontend performs a linear transformation of $R_t$: $R = \langle R_t, s_0(t) \rangle$.
- We need to find the conditional means and variances.

Conditional Distribution of R (cont'd)

- The conditional means and variances of the frontend output $R$ are
  $E[R \mid m = 0] = E_b$, $\mathrm{Var}[R \mid m = 0] = \frac{N_0}{2} E_b$,
  $E[R \mid m = 1] = -E_b$, $\mathrm{Var}[R \mid m = 1] = \frac{N_0}{2} E_b$.
- Therefore, the conditional distributions of $R$ are
  $p_{R \mid m=0}(r) \sim N\left(E_b, \frac{N_0}{2} E_b\right)$,
  $p_{R \mid m=1}(r) \sim N\left(-E_b, \frac{N_0}{2} E_b\right)$.
- The two conditional distributions differ in their means and have equal variances.

Conditional Distribution of R (cont'd)

[Plot: the two conditional pdfs $p_{R \mid m=0}(r)$ and $p_{R \mid m=1}(r)$, with $E_b = 3$ and $\frac{N_0}{2} = 1$.]

Conditional Probability of Error

- The receiver backend decides
  $\hat{m} = \begin{cases} 0 & \text{if } R > 0 \\ 1 & \text{if } R < 0 \end{cases}$
- There are two conditional error probabilities: $\Pr\{\hat{m} = 0 \mid m = 1\}$ and $\Pr\{\hat{m} = 1 \mid m = 0\}$.

Error Probability Pr{m̂ = 0 | m = 1}

- The conditional error probability $\Pr\{\hat{m} = 0 \mid m = 1\}$ corresponds to the area under $p_{R \mid m=1}(r)$ to the right of the threshold (shaded in the plot):
  $\Pr\{\hat{m} = 0 \mid m = 1\} = \Pr\{R > 0 \mid m = 1\} = \int_0^\infty p_{R \mid m=1}(r)\, dr = Q\left( \sqrt{\frac{2 E_b}{N_0}} \right)$.

Error Probability Pr{m̂ = 1 | m = 0}

- The conditional error probability $\Pr\{\hat{m} = 1 \mid m = 0\}$ corresponds to the area under $p_{R \mid m=0}(r)$ to the left of the threshold (shaded in the plot):
  $\Pr\{\hat{m} = 1 \mid m = 0\} = \Pr\{R < 0 \mid m = 0\} = \int_{-\infty}^0 p_{R \mid m=0}(r)\, dr = Q\left( \sqrt{\frac{2 E_b}{N_0}} \right)$.

Average Probability of Error

- The (average) probability of error is the average of the two conditional probabilities of error, weighted by the a priori probabilities $p_0$ and $p_1$:
  $\Pr\{e\} = p_0 \Pr\{\hat{m} = 1 \mid m = 0\} + p_1 \Pr\{\hat{m} = 0 \mid m = 1\}$.
- With the above conditional error probabilities and equal priors $p_0 = p_1 = \frac{1}{2}$,
  $\Pr\{e\} = \frac{1}{2} Q\left( \sqrt{\frac{2 E_b}{N_0}} \right) + \frac{1}{2} Q\left( \sqrt{\frac{2 E_b}{N_0}} \right) = Q\left( \sqrt{\frac{2 E_b}{N_0}} \right)$.
- Note that the error probability depends on the ratio $E_b / N_0$,
  - where $E_b$ is the energy of the signals $s_0(t)$ and $s_1(t)$.
  - This ratio is referred to as the signal-to-noise ratio.
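The closed-form result can be checked by Monte Carlo simulation. The sketch below works directly with the sufficient statistic $R \sim N(\pm E_b, \frac{N_0}{2} E_b)$ (the values $E_b = N_0 = 1$ and the trial count are arbitrary choices):

```python
import numpy as np
from math import erfc, sqrt

# Q-function via the complementary error function
Q = lambda x: 0.5 * erfc(x / sqrt(2))

rng = np.random.default_rng(1)
Eb, N0, trials = 1.0, 1.0, 200_000
m = rng.integers(0, 2, trials)                      # equiprobable bits
mean = np.where(m == 0, Eb, -Eb)                    # conditional mean of R
R = mean + rng.normal(0.0, sqrt(N0 * Eb / 2), trials)
mhat = (R < 0).astype(int)                          # threshold decision
ber = np.mean(mhat != m)
print(ber, Q(sqrt(2 * Eb / N0)))                    # both near 0.0786
```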

Exercise: Compute the Probability of Error

- Compute the probability of error for the example system if the only change in the system is that the signals $s_0(t)$ and $s_1(t)$ are changed to triangular signals:
  $s_0(t) = \begin{cases} \frac{2A}{T} t & \text{for } 0 \le t \le \frac{T}{2} \\ 2A - \frac{2A}{T} t & \text{for } \frac{T}{2} \le t \le T \\ 0 & \text{else} \end{cases}$
  with $A = \sqrt{\frac{3 E_b}{T}}$, and $s_1(t) = -s_0(t)$.
- Answer:
  $\Pr\{e\} = Q\left( \sqrt{\frac{3 E_b}{2 N_0}} \right)$.

© 2018, B.-P. Paris, ECE 630: Statistical Communication Theory (slide 134)
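A numerical sketch of why the answer comes out this way (under the reading that "the only change" leaves the receiver frontend correlating against the original rectangular pulse $\sqrt{E_b/T}$; the values $E_b = N_0 = T = 1$ are arbitrary assumptions): the conditional mean of $R$ shrinks to $\langle s_0^{\mathrm{tri}}, \sqrt{E_b/T} \rangle = \frac{\sqrt{3}}{2} E_b$ while $\mathrm{Var}[R]$ stays $\frac{N_0}{2} E_b$, so the argument of the Q-function becomes $\sqrt{3 E_b / (2 N_0)}$.

```python
import numpy as np
from math import erfc, sqrt

Q = lambda x: 0.5 * erfc(x / sqrt(2))

Eb, N0, T, N = 1.0, 1.0, 1.0, 10_000
dt = T / N
t = np.arange(N) * dt + dt / 2                      # midpoint grid on (0, T)
A = sqrt(3 * Eb / T)
s0_tri = np.where(t <= T / 2, 2 * A / T * t, 2 * A - 2 * A / T * t)
ref = np.full(N, sqrt(Eb / T))                      # unchanged receiver reference

mean_R = np.sum(s0_tri * ref) * dt                  # = (sqrt(3)/2) * Eb
sigma_R = sqrt(N0 / 2 * Eb)                         # unchanged noise std
print(mean_R)
print(Q(mean_R / sigma_R), Q(sqrt(3 * Eb / (2 * N0))))   # identical
```

Note the triangular signal still has energy $E_b$ (so a matched receiver would recover $Q(\sqrt{2E_b/N_0})$); the loss comes from the mismatch between the transmitted pulse and the receiver's reference.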