Properties of the Autocorrelation Function


Properties of the Autocorrelation Function

The autocorrelation function of a (real-valued) random process satisfies the following properties:

1. R_X(t, t) ≥ 0
2. R_X(t, u) = R_X(u, t) (symmetry)
3. |R_X(t, u)| ≤ (1/2)(R_X(t, t) + R_X(u, u))
4. R_X(t, u)² ≤ R_X(t, t) R_X(u, u)

© 2017, B.-P. Paris, ECE 630: Statistical Communication Theory
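These properties can be sanity-checked numerically. The sketch below (a hypothetical example, not part of the notes) uses the Gaussian process X_t = A cos t + B sin t with independent A, B ~ N(0, 1), whose autocorrelation is R_X(t, u) = cos(t − u), and verifies all four properties on a Monte Carlo estimate of R_X:

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical example process: X_t = A*cos(t) + B*sin(t), A, B ~ N(0, 1),
# whose true autocorrelation is R_X(t, u) = cos(t - u).
n_paths = 20000
t = np.array([0.3, 1.1])            # two sample times t and u
A = rng.standard_normal(n_paths)
B = rng.standard_normal(n_paths)
X = A[:, None] * np.cos(t) + B[:, None] * np.sin(t)   # shape (paths, times)

# Monte Carlo estimate of R_X(t, u) = E[X_t X_u]
R = X.T @ X / n_paths               # 2x2 matrix of correlation estimates

Rtt, Ruu, Rtu, Rut = R[0, 0], R[1, 1], R[0, 1], R[1, 0]
print(Rtt >= 0)                                   # property 1
print(np.isclose(Rtu, Rut))                       # property 2 (symmetry)
print(abs(Rtu) <= 0.5 * (Rtt + Ruu) + 1e-9)       # property 3
print(Rtu**2 <= Rtt * Ruu + 1e-9)                 # property 4
```

Since the sample correlation matrix is positive semi-definite by construction, properties 3 and 4 hold exactly for the estimates, not just approximately.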

Stationarity

- The concept of stationarity is analogous to the idea of time-invariance in linear systems.
- Interpretation: for a stationary random process, the statistical properties of the process do not change with time.
- Definition: a random process X_t is strict-sense stationary (sss) to the n-th order if
  p_{X_{t_1}, ..., X_{t_n}}(x_1, ..., x_n) = p_{X_{t_1 + T}, ..., X_{t_n + T}}(x_1, ..., x_n) for all T.
- The statistics of X_t do not depend on absolute time but only on the differences between the sample times.

Wide-Sense Stationarity

- A simpler, more tractable notion of stationarity is based on the second-order description of a process.
- Definition: a random process X_t is wide-sense stationary (wss) if
  1. the mean function m_X(t) is constant, and
  2. the autocorrelation function R_X(t, u) depends on t and u only through the difference t − u, i.e., R_X(t, u) = R_X(t − u).
- Notation: for a wss random process, we write the autocorrelation function in terms of the single time parameter τ = t − u: R_X(t, u) = R_X(τ).
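The classic random-phase sinusoid makes the definition concrete. In the sketch below (an illustrative example, not from the notes), X_t = cos(2πt + Θ) with Θ uniform on [0, 2π) has constant mean 0 and R_X(τ) = cos(2πτ)/2, which we check empirically by comparing sample-time pairs with equal lag:

```python
import numpy as np

rng = np.random.default_rng(1)

# Random-phase sinusoid X_t = cos(2*pi*t + Theta), Theta ~ Uniform[0, 2*pi):
# a standard wss example with m_X = 0 and R_X(tau) = cos(2*pi*tau)/2.
n_paths = 100000
theta = rng.uniform(0, 2 * np.pi, n_paths)

def X(t):
    return np.cos(2 * np.pi * t + theta)

# the mean is (approximately) the same constant at several times
means = [X(t).mean() for t in (0.0, 0.4, 1.7)]

# R_X(t, u) depends only on t - u: compare (t, u) pairs with equal lag 0.4
R1 = np.mean(X(0.9) * X(0.5))
R2 = np.mean(X(2.3) * X(1.9))
print(np.allclose(means, 0, atol=0.02))
print(abs(R1 - R2) < 0.02)
print(abs(R1 - 0.5 * np.cos(2 * np.pi * 0.4)) < 0.02)
```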

Exercise: Stationarity

- True or False: every random process that is strict-sense stationary to the second order is also wide-sense stationary. Answer: True.
- True or False: every random process that is wide-sense stationary must be strict-sense stationary to the second order. Answer: False.
- True or False: the discrete phase process is strict-sense stationary. Answer: False; the first-order density depends on t, so the process is not even first-order sss.
- True or False: the discrete phase process is wide-sense stationary. Answer: True.

White Gaussian Noise

- Definition: a (real-valued) random process X_t is called white Gaussian noise if
  - X_t is Gaussian for each time instant t,
  - mean: m_X(t) = 0 for all t,
  - autocorrelation function: R_X(τ) = (N_0/2) δ(τ).
- White Gaussian noise is a good model for noise in communication systems.
- Note that the variance of X_t is infinite: Var(X_t) = E[X_t²] = R_X(0) = (N_0/2) δ(0) = ∞.
- Also, for t ≠ u: E[X_t X_u] = R_X(t, u) = R_X(t − u) = 0.
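Since the continuous-time process has infinite variance, simulations necessarily work with a sampled stand-in: i.i.d. Gaussian samples, whose sample autocorrelation is (approximately) a discrete delta. A minimal sketch:

```python
import numpy as np

rng = np.random.default_rng(2)

# Discrete-time stand-in for white noise: i.i.d. N(0, sigma2) samples.
# (The continuous-time process with R_X(tau) = (N0/2)*delta(tau) cannot be
# sampled directly; any simulation uses a finite-variance approximation.)
sigma2 = 1.5
n = 200000
x = rng.normal(0.0, np.sqrt(sigma2), n)

def acorr(x, k):
    """Sample autocorrelation E[x[n] x[n+k]] at integer lag k >= 0."""
    return np.mean(x[:len(x) - k] * x[k:])

print(abs(acorr(x, 0) - sigma2) < 0.05)   # lag 0: the variance
print(abs(acorr(x, 1)) < 0.02)            # nonzero lags: ~0 (uncorrelated)
print(abs(acorr(x, 10)) < 0.02)
```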

Integrals of Random Processes

- We will see that receivers always include a linear, time-invariant system, i.e., a filter.
- Linear, time-invariant systems convolve the input random process with the impulse response of the filter.
- Convolution is fundamentally an integration.
- We will establish conditions ensuring that an expression like
  Z(ω) = ∫_a^b X_t(ω) h(t) dt
  is well-behaved.
- The result of the (definite) integral is a random variable Z.
- Concern: does the above integral converge?

Mean Square Convergence

- There are different senses in which a sequence of random variables may converge: almost surely, in probability, in mean square, and in distribution.
- We will focus exclusively on mean square convergence.
- For our integral, mean square convergence means that the Riemann sums converge to the random variable Z in the following sense: given ε > 0, there exists a δ > 0 such that
  E[(Σ_{k=1}^n X_{τ_k} h(τ_k)(t_k − t_{k−1}) − Z)²] ≤ ε
  for every partition with:
  - a = t_0 < t_1 < … < t_n = b,
  - t_{k−1} ≤ τ_k ≤ t_k,
  - max_k (t_k − t_{k−1}) < δ.
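The definition can be illustrated with a process smooth enough that the limiting random variable is known in closed form. The hypothetical example below uses X_t = A cos t with A ~ N(0, 1) and h(t) = 1 on [0, 1], so the limit is Z = A sin(1), and shows the mean-square error of the Riemann sum shrinking as the partition is refined:

```python
import numpy as np

rng = np.random.default_rng(3)

# Hypothetical smooth process X_t = A*cos(t), A ~ N(0, 1), h(t) = 1 on [0, 1].
# The exact integral is Z = A*sin(1), so E[(S_n - Z)^2] is computable.
n_paths = 5000
A = rng.standard_normal(n_paths)
Z_exact = A * np.sin(1.0)

def riemann_ms_error(n):
    """E[(Riemann sum - Z)^2] for a uniform partition with n intervals."""
    tk = np.linspace(0.0, 1.0, n + 1)[1:]            # right endpoints tau_k
    S = (A[:, None] * np.cos(tk)).sum(axis=1) / n    # sum X_{tau_k} (t_k - t_{k-1})
    return np.mean((S - Z_exact) ** 2)

errs = [riemann_ms_error(n) for n in (4, 16, 64)]
print(errs[0] > errs[1] > errs[2])   # MS error shrinks as the mesh refines
```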

Mean Square Convergence: Why We Care

- It can be shown that the integral converges if
  ∫_a^b ∫_a^b R_X(t, u) h(t) h(u) dt du < ∞.
- Important: when the integral converges, the order of integration and expectation can be interchanged, e.g.,
  E[Z] = E[∫_a^b X_t h(t) dt] = ∫_a^b E[X_t] h(t) dt = ∫_a^b m_X(t) h(t) dt.
- Throughout this class, we will focus exclusively on cases where R_X(t, u) and h(t) are such that our integrals converge.

Exercise: Brownian Motion

- Definition: let N_t be white Gaussian noise with N_0/2 = σ². The random process
  W_t = ∫_0^t N_s ds, for t ≥ 0,
  is called Brownian motion or the Wiener process.
- Compute the mean and autocorrelation functions of W_t.
- Answer: m_W(t) = 0 and R_W(t, u) = σ² min(t, u).
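The answer can be checked by approximating W_t with a scaled random walk: increments drawn as N(0, σ² dt) mimic integrating white noise, and the sample statistics should match m_W(t) = 0 and R_W(t, u) = σ² min(t, u). A rough sketch:

```python
import numpy as np

rng = np.random.default_rng(4)

# Approximate W_t by a scaled random walk: increments ~ N(0, sigma2*dt), so
# the cumulative sum mimics integrating white noise with N0/2 = sigma2.
sigma2, dt = 2.0, 0.002
n_paths, n_steps = 10000, 500
dW = rng.normal(0.0, np.sqrt(sigma2 * dt), (n_paths, n_steps))
W = np.cumsum(dW, axis=1)                 # W[:, k] approximates W_t at t = (k+1)*dt
t_grid = dt * np.arange(1, n_steps + 1)

i, j = 150, 350                           # times t ~ 0.30 and u ~ 0.70
m_hat = W[:, j].mean()
R_hat = np.mean(W[:, i] * W[:, j])
print(abs(m_hat) < 0.05)                                       # m_W(t) ~ 0
print(abs(R_hat - sigma2 * min(t_grid[i], t_grid[j])) < 0.05)  # sigma2*min(t, u)
```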

Integrals of Gaussian Random Processes

- Let X_t denote a Gaussian random process with second-order description m_X(t) and R_X(t, u).
- Then the integral
  Z = ∫_a^b X_t h(t) dt
  is a Gaussian random variable.
- Moreover, its mean and variance are given by
  μ_Z = E[Z] = ∫_a^b m_X(t) h(t) dt
  Var[Z] = E[(Z − E[Z])²] = E[(∫_a^b (X_t − m_X(t)) h(t) dt)²] = ∫_a^b ∫_a^b C_X(t, u) h(t) h(u) dt du.
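A quick numerical sanity check of the variance formula, using the hypothetical Gaussian process X_t = A cos t + B sin t (A, B independent N(0, 1), so m_X = 0 and C_X(t, u) = cos(t − u)) and h(t) = 1 on [0, 1]; here the double integral also evaluates in closed form to sin²(1) + (1 − cos 1)²:

```python
import numpy as np

rng = np.random.default_rng(5)

# Hypothetical Gaussian process X_t = A*cos(t) + B*sin(t), m_X = 0,
# C_X(t, u) = cos(t - u). With h(t) = 1 on [0, 1], the formula predicts
# Var[Z] = double integral of cos(t - u) over [0, 1]^2.
n_paths, n_grid = 20000, 200
t = (np.arange(n_grid) + 0.5) / n_grid           # midpoint rule on [0, 1]
A, B = rng.standard_normal((2, n_paths))
X = A[:, None] * np.cos(t) + B[:, None] * np.sin(t)
Z = X.mean(axis=1)                               # Z = integral of X_t dt, per path

var_formula = np.mean(np.cos(t[:, None] - t[None, :]))   # the double integral
print(abs(var_formula - (np.sin(1)**2 + (1 - np.cos(1))**2)) < 1e-4)
print(abs(Z.var() - var_formula) < 0.03)
```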

Jointly Defined Random Processes

- Let X_t and Y_t be jointly defined random processes, e.g., the input and output of a filter.
- Then joint densities of the form p_{X_t, Y_u}(x, y) can be defined.
- Additionally, second-order descriptions that capture the correlation between samples of X_t and Y_t can be defined.

Crosscorrelation and Crosscovariance

- Definition: the crosscorrelation function R_XY(t, u) is defined as
  R_XY(t, u) = E[X_t Y_u] = ∫∫ x y p_{X_t, Y_u}(x, y) dx dy.
- Definition: the crosscovariance function C_XY(t, u) is defined as
  C_XY(t, u) = R_XY(t, u) − m_X(t) m_Y(u).
- Definition: the processes X_t and Y_t are called jointly wide-sense stationary if
  1. R_XY(t, u) = R_XY(t − u), and
  2. m_X(t) and m_Y(t) are constants.

Filtering of Random Processes

[Block diagram: the input process X_t passes through a filter with impulse response h(t), producing the output process Y_t.]

Filtering of Random Processes

- Clearly, X_t and Y_t are jointly defined random processes.
- Standard LTI convolution: Y_t = ∫ h(t − s) X_s ds = h(t) * X_t.
- Recall: this convolution is well-behaved if
  ∫∫ R_X(s, ν) h(t − s) h(t − ν) ds dν < ∞,
  e.g., if ∫∫ R_X(s, ν) ds dν < ∞ and h(t) is stable.

Second-Order Description of the Output: Mean

- The expected value of the filter's output Y_t is
  E[Y_t] = E[∫ h(t − s) X_s ds] = ∫ h(t − s) E[X_s] ds = ∫ h(t − s) m_X(s) ds.
- For a wss process X_t, m_X(t) is constant. Therefore,
  E[Y_t] = m_Y(t) = m_X ∫ h(s) ds
  is also constant.
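A discrete-time analogue is easy to verify: for a wss input with constant mean m_X passed through an FIR filter with (arbitrarily chosen, hypothetical) taps h, the output mean should be m_X Σ h[l], the discrete counterpart of m_X ∫ h(s) ds:

```python
import numpy as np

rng = np.random.default_rng(6)

# Discrete-time sketch: wss input with constant mean m_X through an FIR
# filter h; the output mean should be m_X * sum(h).
m_X = 2.0
x = m_X + rng.standard_normal(300000)    # i.i.d. noise around m_X (wss input)
h = np.array([0.5, 0.3, 0.2, -0.1])      # arbitrary hypothetical filter taps
y = np.convolve(x, h, mode="valid")
print(abs(y.mean() - m_X * h.sum()) < 0.01)
```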

Crosscorrelation of Input and Output

- The crosscorrelation between the input and output signals is
  R_XY(t, u) = E[X_t Y_u] = E[X_t ∫ h(u − s) X_s ds] = ∫ h(u − s) E[X_t X_s] ds = ∫ h(u − s) R_X(t, s) ds.
- For a wss input process, substituting ν = u − s gives
  R_XY(t, u) = ∫ h(u − s) R_X(t, s) ds = ∫ h(ν) R_X(t − u + ν) dν = R_XY(t − u).
- Input and output are jointly stationary.
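A discrete-time special case makes the formula tangible: for a unit-variance white input, R_X[k] = δ[k], so the crosscorrelation E[x[n] y[n + k]] collapses to the filter taps h[k]. A sketch with hypothetical taps:

```python
import numpy as np

rng = np.random.default_rng(9)

# For a unit-variance white input, R_X[k] = delta[k], so the input-output
# crosscorrelation E[x[n] y[n+k]] reproduces the filter taps: R_XY[k] = h[k].
n = 400000
x = rng.standard_normal(n)
h = np.array([0.8, 0.4, -0.2, 0.1])          # hypothetical filter taps
y = np.convolve(x, h, mode="full")[:n]       # causal filtering, aligned with x

R_xy = np.array([np.mean(x[:n - k] * y[k:]) for k in range(1, len(h))] if False
                else [np.mean(x[:n - k] * y[k:]) if k else np.mean(x * y)
                      for k in range(len(h))])
print(np.allclose(R_xy, h, atol=0.01))
```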

Autocorrelation of Output

- The autocorrelation of Y_t is given by
  R_Y(t, u) = E[Y_t Y_u] = E[∫ h(t − s) X_s ds ∫ h(u − ν) X_ν dν] = ∫∫ h(t − s) h(u − ν) R_X(s, ν) ds dν.
- For a wss input process:
  R_Y(t, u) = ∫∫ h(t − s) h(u − ν) R_X(s, ν) ds dν
            = ∫∫ h(λ) h(λ − γ) R_X(t − λ, u − λ + γ) dλ dγ
            = ∫∫ h(λ) h(λ − γ) R_X(t − u − γ) dλ dγ = R_Y(t − u).
- Define R_h(γ) = ∫ h(λ) h(λ − γ) dλ = (h(γ) * h(−γ)).
- Then R_Y(τ) = ∫ R_h(γ) R_X(τ − γ) dγ = (R_h * R_X)(τ).
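The identity R_Y = R_h * R_X is easy to test in discrete time: for a unit-variance white input, R_X[k] = δ[k], so the output autocorrelation should equal the deterministic autocorrelation R_h[k] = Σ_l h[l] h[l − k]. A sketch with hypothetical taps:

```python
import numpy as np

rng = np.random.default_rng(7)

# For unit-variance white input, R_X[k] = delta[k], so R_Y[k] should equal
# R_h[k] = sum_l h[l] h[l-k], the deterministic autocorrelation of h.
n = 400000
x = rng.standard_normal(n)
h = np.array([1.0, -0.6, 0.25])              # hypothetical filter taps
y = np.convolve(x, h, mode="valid")

R_h = np.correlate(h, h, mode="full")        # R_h at lags -2..2
lags = np.arange(-2, 3)
R_y = np.array([np.mean(y[:len(y) - abs(k)] * y[abs(k):]) if k
                else np.mean(y * y) for k in lags])
print(np.allclose(R_y, R_h, atol=0.02))
```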

Exercise: Filtered White Noise Process

- Let the white Gaussian noise process X_t be the input to a filter with impulse response
  h(t) = e^{−at} u(t) = { e^{−at} for t ≥ 0; 0 for t < 0 }.
- Compute the second-order description of the output process Y_t.
- Answers:
  - Mean: m_Y = 0.
  - Autocorrelation: R_Y(τ) = (N_0/2) e^{−a|τ|} / (2a).
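The answer can be checked by simulation (a rough sketch: the white noise is approximated by i.i.d. samples of variance (N_0/2)/dt, and h is truncated after several time constants):

```python
import numpy as np

rng = np.random.default_rng(8)

# Simulated check of R_Y(tau) = (N0/2) * exp(-a|tau|) / (2a): approximate
# white noise by samples N(0, (N0/2)/dt), convolve with sampled h(t) = e^{-at}u(t).
N0, a, dt = 2.0, 1.0, 0.01
n = 300000
x = rng.normal(0.0, np.sqrt((N0 / 2) / dt), n)
t_h = dt * np.arange(int(6 / (a * dt)))      # truncate h after 6 time constants
h = np.exp(-a * t_h)
y = np.convolve(x, h, mode="valid") * dt

def R_y(tau):
    """Sample autocorrelation of y at lag tau (a multiple of dt)."""
    k = int(round(tau / dt))
    return np.mean(y[:len(y) - k] * y[k:]) if k else np.mean(y * y)

for tau in (0.0, 0.5, 1.0):
    target = (N0 / (4 * a)) * np.exp(-a * tau)
    print(abs(R_y(tau) - target) < 0.05)
```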