Properties of the Autocorrelation Function

- The autocorrelation function of a (real-valued) random process satisfies the following properties:
  1. $R_X(t,t) \geq 0$
  2. $R_X(t,u) = R_X(u,t)$ (symmetry)
  3. $R_X(t,u) \leq \frac{1}{2}\left(R_X(t,t) + R_X(u,u)\right)$
  4. $R_X(t,u)^2 \leq R_X(t,t)\, R_X(u,u)$

2017, B.-P. Paris ECE 630: Statistical Communication Theory 51
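These properties can be illustrated numerically. The sketch below is a Monte Carlo check (an assumed example, not from the slides) using the process $X_t = A\cos t + B\sin t$ with $A, B \sim N(0,1)$ i.i.d., for which $R_X(t,u) = \cos(t-u)$:

```python
import numpy as np

# Assumed example process (not from the slides):
# X_t = A*cos(t) + B*sin(t), A, B ~ N(0,1) i.i.d., so R_X(t,u) = cos(t-u).
rng = np.random.default_rng(0)
n_trials = 100_000
A = rng.standard_normal(n_trials)
B = rng.standard_normal(n_trials)

def X(t):
    return A * np.cos(t) + B * np.sin(t)

def R_hat(t, u):
    """Monte Carlo estimate of R_X(t, u) = E[X_t X_u]."""
    return np.mean(X(t) * X(u))

t, u = 0.3, 1.1
r_tt, r_uu, r_tu = R_hat(t, t), R_hat(u, u), R_hat(t, u)
# Property 1: r_tt, r_uu >= 0 (sample means of squares).
# Property 2: R_hat(t, u) == R_hat(u, t) (same products averaged).
# Property 4: r_tu**2 <= r_tt * r_uu (Cauchy-Schwarz holds for sample means).
```

Note that the empirical estimates satisfy properties 1, 2, and 4 exactly, not just approximately, because sample averages of squares and products obey the same nonnegativity and Cauchy-Schwarz inequalities as expectations.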
Stationarity

- The concept of stationarity is analogous to the idea of time-invariance in linear systems.
- Interpretation: for a stationary random process, the statistical properties of the process do not change with time.
- Definition: a random process $X_t$ is strict-sense stationary (sss) to the $n$-th order if
  $$p_{X_{t_1},\ldots,X_{t_n}}(x_1,\ldots,x_n) = p_{X_{t_1+T},\ldots,X_{t_n+T}}(x_1,\ldots,x_n) \quad \text{for all } T.$$
- The statistics of $X_t$ do not depend on absolute time but only on the time differences between the sample times.
Wide-Sense Stationarity

- A simpler and more tractable notion of stationarity is based on the second-order description of a process.
- Definition: a random process $X_t$ is wide-sense stationary (wss) if
  1. the mean function $m_X(t)$ is constant, and
  2. the autocorrelation function $R_X(t,u)$ depends on $t$ and $u$ only through the difference $t-u$, i.e., $R_X(t,u) = R_X(t-u)$.
- Notation: for a wss random process, we write the autocorrelation function in terms of the single time parameter $\tau = t-u$:
  $$R_X(t,u) = R_X(t-u) = R_X(\tau).$$
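As a concrete illustration of the definition (an assumed example, not from the slides), the random-phase sinusoid $X_t = \cos(2\pi t + \Theta)$ with $\Theta$ uniform on $[0, 2\pi)$ is wss: its mean is constant (zero) and $R_X(t,u) = \frac{1}{2}\cos(2\pi(t-u))$. A quick Monte Carlo sketch:

```python
import numpy as np

# Assumed example process: X_t = cos(2*pi*t + Theta), Theta ~ Uniform[0, 2*pi).
# Then m_X(t) = 0 and R_X(t, u) = 0.5 * cos(2*pi*(t - u)).
rng = np.random.default_rng(1)
theta = rng.uniform(0.0, 2.0 * np.pi, 500_000)

def X(t):
    return np.cos(2.0 * np.pi * t + theta)

m1, m2 = np.mean(X(0.2)), np.mean(X(1.7))   # mean is constant (≈ 0)
r_a = np.mean(X(0.2) * X(0.5))              # lag t - u = -0.3
r_b = np.mean(X(1.2) * X(1.5))              # same lag at shifted times
# r_a ≈ r_b: the correlation depends only on the lag, not on absolute time.
```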
Exercise: Stationarity

- True or False: Every random process that is strict-sense stationary to the second order is also wide-sense stationary.
- Answer: True.
- True or False: Every random process that is wide-sense stationary must be strict-sense stationary to the second order.
- Answer: False.
- True or False: The discrete phase process is strict-sense stationary.
- Answer: False; the first-order density depends on $t$, so the process is not even first-order sss.
- True or False: The discrete phase process is wide-sense stationary.
- Answer: True.
White Gaussian Noise

- Definition: a (real-valued) random process $X_t$ is called white Gaussian noise if
  - $X_t$ is Gaussian for each time instance $t$,
  - Mean: $m_X(t) = 0$ for all $t$,
  - Autocorrelation function: $R_X(\tau) = \frac{N_0}{2}\,\delta(\tau)$.
- White Gaussian noise is a good model for noise in communication systems.
- Note that the variance of $X_t$ is infinite:
  $$\mathrm{Var}(X_t) = E[X_t^2] = R_X(0) = \frac{N_0}{2}\,\delta(0) = \infty.$$
- Also, for $t \neq u$:
  $$E[X_t X_u] = R_X(t,u) = R_X(t-u) = 0.$$
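In simulation, white noise is typically approximated in discrete time: with sampling interval $\Delta$, i.i.d. Gaussian samples of variance $(N_0/2)/\Delta$ reproduce $R_X(\tau) = \frac{N_0}{2}\delta(\tau)$ as $\Delta \to 0$. A sketch (the discretization and all numeric values are assumptions):

```python
import numpy as np

# Discrete-time stand-in for white Gaussian noise (assumed setup):
# samples of variance (N0/2)/dt approximate R_X(tau) = (N0/2)*delta(tau).
rng = np.random.default_rng(2)
N0, dt, n = 2.0, 1e-3, 200_000
X = rng.standard_normal(n) * np.sqrt((N0 / 2) / dt)

# The area under the sample autocorrelation at lag 0 recovers N0/2,
# and samples at distinct times are (approximately) uncorrelated.
r0 = np.mean(X * X) * dt           # ≈ N0/2 = 1.0
r1 = np.mean(X[:-1] * X[1:]) * dt  # ≈ 0
```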
Integrals of Random Processes

- We will see that receivers always include a linear, time-invariant system, i.e., a filter.
- Linear, time-invariant systems convolve the input random process with the impulse response of the filter.
- Convolution is fundamentally an integration.
- We will establish conditions that ensure that an expression like
  $$Z(\omega) = \int_a^b X_t(\omega)\, h(t)\, dt$$
  is well-behaved.
- The result of the (definite) integral is a random variable $Z$.
- Concern: does the above integral converge?
Mean Square Convergence

- There are different senses in which a sequence of random variables may converge: almost surely, in probability, mean square, and in distribution.
- We will focus exclusively on mean square convergence.
- For our integral, mean square convergence means that the Riemann sum and the random variable $Z$ satisfy: given $\epsilon > 0$, there exists a $\delta > 0$ so that
  $$E\Big[\Big(Z - \sum_{k=1}^{n} X_{\bar{t}_k}\, h(\bar{t}_k)\,(t_k - t_{k-1})\Big)^2\Big] \leq \epsilon,$$
  with:
  - $a = t_0 < t_1 < \cdots < t_n = b$,
  - $t_{k-1} \leq \bar{t}_k \leq t_k$,
  - $\delta = \max_k (t_k - t_{k-1})$.
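The definition can be made concrete numerically. The sketch below (an assumed example, not from the slides) uses the smooth Gaussian process $X_t = A\cos t + B\sin t$ and $h(t) = e^{-t}$ on $[0,1]$: the mean-square gap between a coarse Riemann sum and a much finer one shrinks as the partition is refined.

```python
import numpy as np

# Assumed example: X_t = A*cos(t) + B*sin(t), A, B ~ N(0,1) i.i.d.,
# h(t) = exp(-t). Midpoint Riemann sums of int_0^1 X_t h(t) dt converge
# in mean square as the partition is refined.
rng = np.random.default_rng(3)
n_paths = 10_000
A = rng.standard_normal(n_paths)
B = rng.standard_normal(n_paths)

def riemann(n):
    """Midpoint Riemann sum with n equal subintervals, one value per path."""
    t = (np.arange(n) + 0.5) / n
    Xt = A[:, None] * np.cos(t) + B[:, None] * np.sin(t)
    return np.sum(Xt * np.exp(-t), axis=1) / n

ref = riemann(512)                               # fine-partition reference
mse_coarse = np.mean((riemann(8) - ref) ** 2)
mse_fine = np.mean((riemann(64) - ref) ** 2)     # far smaller than mse_coarse
```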
Mean Square Convergence: Why We Care

- It can be shown that the integral converges if
  $$\int_a^b \int_a^b R_X(t,u)\, h(t)\, h(u)\, dt\, du < \infty.$$
- Important: when the integral converges, the order of integration and expectation can be interchanged, e.g.,
  $$E[Z] = E\Big[\int_a^b X_t\, h(t)\, dt\Big] = \int_a^b E[X_t]\, h(t)\, dt = \int_a^b m_X(t)\, h(t)\, dt.$$
- Throughout this class, we will focus exclusively on cases where $R_X(t,u)$ and $h(t)$ are such that our integrals converge.
Exercise: Brownian Motion

- Definition: let $N_t$ be white Gaussian noise with $\frac{N_0}{2} = \sigma^2$. The random process
  $$W_t = \int_0^t N_s\, ds \quad \text{for } t \geq 0$$
  is called Brownian motion or the Wiener process.
- Compute the mean and autocorrelation functions of $W_t$.
- Answer: $m_W(t) = 0$ and $R_W(t,u) = \sigma^2 \min(t,u)$.
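The answer can be checked by simulation. The sketch below (assumptions: $\sigma^2 = 1$, step $dt$, Gaussian increments of variance $\sigma^2\,dt$ standing in for $\int N_s\, ds$) estimates $R_W(t,u)$ at $t = 1$, $u = 2$:

```python
import numpy as np

# Sketch: approximate W_t = int_0^t N_s ds by cumulative sums of independent
# Gaussian increments with variance sigma2*dt (assumed discretization).
rng = np.random.default_rng(4)
sigma2, dt, n_steps, n_paths = 1.0, 1e-2, 200, 20_000
dW = rng.standard_normal((n_paths, n_steps)) * np.sqrt(sigma2 * dt)
W = np.cumsum(dW, axis=1)            # W[:, k] approximates W at t = (k+1)*dt

t_idx, u_idx = 99, 199               # t = 1.0, u = 2.0
R_tu = np.mean(W[:, t_idx] * W[:, u_idx])   # ≈ sigma2 * min(1.0, 2.0) = 1.0
R_tt = np.mean(W[:, t_idx] ** 2)            # ≈ sigma2 * 1.0 = 1.0
```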
Integrals of Gaussian Random Processes

- Let $X_t$ denote a Gaussian random process with second-order description $m_X(t)$ and $R_X(t,u)$.
- Then, the integral
  $$Z = \int_a^b X_t\, h(t)\, dt$$
  is a Gaussian random variable.
- Moreover, the mean and variance are given by
  $$\mu_Z = E[Z] = \int_a^b m_X(t)\, h(t)\, dt$$
  $$\mathrm{Var}[Z] = E[(Z - E[Z])^2] = E\Big[\Big(\int_a^b (X_t - m_X(t))\, h(t)\, dt\Big)^2\Big] = \int_a^b \int_a^b C_X(t,u)\, h(t)\, h(u)\, dt\, du.$$
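A numerical sanity check of the variance formula (assumed example process and filter, not from the slides): for $X_t = A\cos t + B\sin t$ with $A, B \sim N(0,1)$ i.i.d., $C_X(t,u) = \cos(t-u)$, and $h(t) = e^{-t}$ on $[0,1]$, a Monte Carlo estimate of $\mathrm{Var}[Z]$ matches the double integral.

```python
import numpy as np

# Assumed example: Gaussian process X_t = A*cos(t) + B*sin(t), A, B ~ N(0,1),
# so C_X(t, u) = cos(t - u); filter h(t) = exp(-t) on [0, 1].
rng = np.random.default_rng(5)
n_grid = 400
t = (np.arange(n_grid) + 0.5) / n_grid
h = np.exp(-t)

# Z = A*Ic + B*Is with Ic = int cos(t)h(t)dt, Is = int sin(t)h(t)dt,
# evaluated by midpoint sums on the grid.
Ic = np.cos(t) @ h / n_grid
Is = np.sin(t) @ h / n_grid
Z = rng.standard_normal(200_000) * Ic + rng.standard_normal(200_000) * Is
var_mc = np.var(Z)                     # Monte Carlo estimate of Var[Z]

# Double integral of C_X(t, u) h(t) h(u) over the same grid
C = np.cos(t[:, None] - t[None, :])
var_int = h @ C @ h / n_grid**2
```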
Jointly Defined Random Processes

- Let $X_t$ and $Y_t$ be jointly defined random processes, e.g., the input and output of a filter.
- Then, joint densities of the form $p_{X_t Y_u}(x,y)$ can be defined.
- Additionally, second-order descriptions that capture the correlation between samples of $X_t$ and $Y_t$ can be defined.
Crosscorrelation and Crosscovariance

- Definition: the crosscorrelation function $R_{XY}(t,u)$ is defined as
  $$R_{XY}(t,u) = E[X_t Y_u] = \int\!\!\int x\,y\; p_{X_t Y_u}(x,y)\, dx\, dy.$$
- Definition: the crosscovariance function $C_{XY}(t,u)$ is defined as
  $$C_{XY}(t,u) = R_{XY}(t,u) - m_X(t)\, m_Y(u).$$
- Definition: the processes $X_t$ and $Y_t$ are called jointly wide-sense stationary if
  1. $R_{XY}(t,u) = R_{XY}(t-u)$, and
  2. $m_X(t)$ and $m_Y(t)$ are constants.
Filtering of Random Processes

- Block diagram of a filtered random process: $X_t \to h(t) \to Y_t$.
Filtering of Random Processes

- Clearly, $X_t$ and $Y_t$ are jointly defined random processes.
- Standard LTI system convolution:
  $$Y_t = \int h(t-s)\, X_s\, ds = h(t) * X_t.$$
- Recall: this convolution is well-behaved if
  $$\int\!\!\int R_X(s,\nu)\, h(t-s)\, h(t-\nu)\, ds\, d\nu < \infty.$$
- E.g., $\int\!\!\int R_X(s,\nu)\, ds\, d\nu < \infty$ and $h(t)$ stable.
Second-Order Description of Output: Mean

- The expected value of the filter's output $Y_t$ is
  $$E[Y_t] = E\Big[\int h(t-s)\, X_s\, ds\Big] = \int h(t-s)\, E[X_s]\, ds = \int h(t-s)\, m_X(s)\, ds.$$
- For a wss process $X_t$, $m_X(t)$ is constant. Therefore,
  $$E[Y_t] = m_Y(t) = m_X \int h(s)\, ds$$
  is also constant.
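A discrete-time sketch of this result (all numeric values are assumptions): with a wss input of constant mean $m_X = 2$ and $h(t) = e^{-t}u(t)$, so that $\int h(s)\,ds = 1$, the output mean settles at $m_Y = m_X \cdot 1 = 2$ once the filter's transient has passed.

```python
import numpy as np

# Assumed setup: wss input with constant mean m_X = 2 plus unit-variance noise,
# filter h(t) = exp(-t) u(t), so int h(s) ds = 1 and m_Y = m_X * 1 = 2.
rng = np.random.default_rng(6)
m_X, dt, n = 2.0, 1e-2, 100_000
X = m_X + rng.standard_normal(n)
t_h = np.arange(0.0, 10.0, dt)
h = np.exp(-t_h)
Y = np.convolve(X, h)[:n] * dt        # Y_t = int h(t-s) X_s ds (Riemann sum)

m_Y = np.mean(Y[len(t_h):])           # skip the start-up transient; ≈ 2.0
```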
Crosscorrelation of Input and Output

- The crosscorrelation between input and output signals is
  $$R_{XY}(t,u) = E[X_t Y_u] = E\Big[X_t \int h(u-s)\, X_s\, ds\Big] = \int h(u-s)\, E[X_t X_s]\, ds = \int h(u-s)\, R_X(t,s)\, ds.$$
- For a wss input process, substituting $\nu = u - s$,
  $$R_{XY}(t,u) = \int h(u-s)\, R_X(t-s)\, ds = \int h(\nu)\, R_X(t-u+\nu)\, d\nu = R_{XY}(t-u).$$
- Input and output are jointly wide-sense stationary.
Autocorrelation of Output

- The autocorrelation of $Y_t$ is given by
  $$R_Y(t,u) = E[Y_t Y_u] = E\Big[\int h(t-s)\, X_s\, ds \int h(u-\nu)\, X_\nu\, d\nu\Big] = \int\!\!\int h(t-s)\, h(u-\nu)\, R_X(s,\nu)\, ds\, d\nu.$$
- For a wss input process (substituting $\lambda = t-s$ and $\gamma = \lambda - (u-\nu)$):
  $$R_Y(t,u) = \int\!\!\int h(\lambda)\, h(\lambda-\gamma)\, R_X(t-\lambda,\, u-\lambda+\gamma)\, d\lambda\, d\gamma = \int\!\!\int h(\lambda)\, h(\lambda-\gamma)\, R_X(t-u-\gamma)\, d\lambda\, d\gamma = R_Y(t-u).$$
- Define $R_h(\gamma) = \int h(\lambda)\, h(\lambda-\gamma)\, d\lambda = h(\gamma) * h(-\gamma)$.
- Then, $R_Y(\tau) = \int R_h(\gamma)\, R_X(\tau-\gamma)\, d\gamma = R_h(\tau) * R_X(\tau)$.
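The deterministic autocorrelation $R_h$ can be checked numerically. For the causal exponential $h(t) = e^{-at}u(t)$, the closed form is $R_h(\gamma) = \frac{e^{-a|\gamma|}}{2a}$; a sketch with assumed values $a = 1$:

```python
import numpy as np

# Check R_h(gamma) = int h(l) h(l - gamma) dl for h(t) = exp(-a*t) u(t);
# closed form: R_h(gamma) = exp(-a*|gamma|) / (2a). Values a, dt are assumptions.
a, dt = 1.0, 1e-3
t = np.arange(0.0, 20.0, dt)
h = np.exp(-a * t)

R_h0 = np.sum(h * h) * dt              # gamma = 0: ≈ 1/(2a) = 0.5
k = int(1.0 / dt)                      # gamma = 1
R_h1 = np.sum(h[:-k] * h[k:]) * dt     # ≈ exp(-a)/(2a) ≈ 0.184
```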
Exercise: Filtered White Noise Process

- Let the white Gaussian noise process $X_t$ be input to a filter with impulse response
  $$h(t) = e^{-at}\, u(t) = \begin{cases} e^{-at} & \text{for } t \geq 0 \\ 0 & \text{for } t < 0. \end{cases}$$
- Compute the second-order description of the output process $Y_t$.
- Answers:
  - Mean: $m_Y = 0$.
  - Autocorrelation: $R_Y(\tau) = \frac{N_0}{4a}\, e^{-a|\tau|}$.
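The answers can be verified by a discrete-time simulation (a sketch; the white-noise discretization and all numeric values are assumptions): filter discretized white noise with $h$ and compare the sample autocorrelation of the output to $\frac{N_0}{4a} e^{-a|\tau|}$.

```python
import numpy as np

# Sketch: discrete-time check of R_Y(tau) = (N0/(4a)) * exp(-a*|tau|).
# White noise is approximated by samples of variance (N0/2)/dt (assumption).
rng = np.random.default_rng(7)
N0, a, dt, n = 2.0, 1.0, 1e-2, 200_000
X = rng.standard_normal(n) * np.sqrt((N0 / 2) / dt)
t_h = np.arange(0.0, 10.0, dt)
h = np.exp(-a * t_h)                   # h(t) = exp(-a*t) u(t)
Y = np.convolve(X, h)[:n] * dt         # Y_t = int h(t-s) X_s ds

Ys = Y[len(t_h):]                      # discard the start-up transient
R0 = np.mean(Ys * Ys)                  # ≈ N0/(4a) = 0.5
lag = int(1.0 / dt)                    # tau = 1
R1 = np.mean(Ys[:-lag] * Ys[lag:])     # ≈ 0.5 * exp(-1) ≈ 0.184
```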