Lecture Notes 7
Stationary Random Processes

- Strict-Sense and Wide-Sense Stationarity
- Autocorrelation Function of a Stationary Process
- Power Spectral Density
- Continuity and Integration of Random Processes
- Stationary Ergodic Random Processes

EE 278B: Stationary Random Processes 7-1

Stationary Random Processes

Stationarity refers to time invariance of some, or all, of the statistics of a random process, such as its mean, autocorrelation, or n-th-order distribution.

We define two types of stationarity: strict sense (SSS) and wide sense (WSS).

A random process X(t) (or X_n) is said to be SSS if all its finite-order distributions are time invariant, i.e., the joint cdfs (pdfs, pmfs) of
X(t_1), X(t_2), ..., X(t_k)  and  X(t_1 + τ), X(t_2 + τ), ..., X(t_k + τ)
are the same for all k, all t_1, t_2, ..., t_k, and all time shifts τ.

So for a SSS process, the first-order distribution is independent of t, and the second-order distribution (the distribution of any two samples X(t_1) and X(t_2)) depends only on τ = t_2 − t_1. To see this, note that from the definition of stationarity, for any t, the joint distribution of X(t_1) and X(t_2) is the same as the joint distribution of X(t_1 + (t − t_1)) = X(t) and X(t_2 + (t − t_1)) = X(t + (t_2 − t_1)).
Example: The random phase signal X(t) = α cos(ωt + Θ), where Θ ~ U[0, 2π], is SSS.

We already know that the first-order pdf is
f_{X(t)}(x) = 1 / (πα √(1 − (x/α)²)),  −α < x < +α,
which is independent of t, and is therefore stationary.

To find the second-order pdf, note that if we are given the value of X(t) at one point, say t_1, there are (at most) two possible sample functions.

[Figure: two sample paths passing through the point (t_1, x_1), taking two possible values x̄_1 and x̄_2 at time t_2]

The second-order pdf can thus be written as
f_{X(t_1),X(t_2)}(x_1, x_2) = f_{X(t_1)}(x_1) f_{X(t_2)|X(t_1)}(x_2 | x_1)
= f_{X(t_1)}(x_1) ( (1/2) δ(x_2 − x̄_1) + (1/2) δ(x_2 − x̄_2) ),
which depends only on t_2 − t_1, and thus the second-order pdf is stationary.

Now if we know that X(t_1) = x_1 and X(t_2) = x_2, the sample path is totally determined (except when x_1 = x_2 = 0, where two paths may be possible), and thus all n-th-order pdfs are stationary.

IID processes are SSS.

Random walk and Poisson processes are not SSS.

The Gauss-Markov process (as we defined it) is not SSS. However, if we set X_1 to the steady-state distribution of X_n, it becomes SSS (see homework exercise).
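The time invariance of the first-order statistics can be spot-checked by simulation. The following sketch is not part of the notes: it is a Monte Carlo check, with α, ω, and the two sample times chosen arbitrarily, that the mean and average power of X(t) = α cos(ωt + Θ) do not depend on t.

```python
import math, random

random.seed(1)
alpha, omega = 2.0, 3.0  # arbitrary amplitude and frequency

def sample_X(t, n=200000):
    """Draw n realizations of X(t) = alpha*cos(omega*t + Theta), Theta ~ U[0, 2*pi]."""
    return [alpha * math.cos(omega * t + random.uniform(0.0, 2.0 * math.pi))
            for _ in range(n)]

for t in (0.0, 1.7):  # two arbitrary sample times
    xs = sample_X(t)
    mean = sum(xs) / len(xs)
    power = sum(x * x for x in xs) / len(xs)
    # Theory: E X(t) = 0 and E X(t)^2 = alpha^2 / 2, independent of t
    print(t, round(mean, 2), round(power, 2))
```

The estimates at both times should agree with the stationary values 0 and α²/2 up to Monte Carlo error.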
Wide-Sense Stationary Random Processes

A random process X(t) is said to be wide-sense stationary (WSS) if its mean and autocorrelation functions are time invariant, i.e.,
- E(X(t)) = µ, independent of t
- R_X(t_1, t_2) is a function only of the time difference t_2 − t_1
- E[X(t)²] < ∞ (technical condition)

Since R_X(t_1, t_2) = R_X(t_2, t_1), for any wide-sense stationary process X(t), R_X(t_1, t_2) is a function only of |t_2 − t_1|.

Clearly SSS implies WSS. The converse is not necessarily true.

Example: Let
X(t) = +sin t  with probability 1/4
     = −sin t  with probability 1/4
     = +cos t  with probability 1/4
     = −cos t  with probability 1/4
Then E(X(t)) = 0 and R_X(t_1, t_2) = (1/2) cos(t_2 − t_1), thus X(t) is WSS. But X(0) and X(π/4) do not have the same pmf (different ranges), so the first-order pmf is not stationary, and the process is not SSS.

For Gaussian random processes, WSS implies SSS, since the process is completely specified by its mean and autocorrelation functions.

Random walk is not WSS, since R_X(n_1, n_2) = min{n_1, n_2} is not time invariant; similarly, the Poisson process is not WSS.
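The WSS claims in this four-outcome example can be verified exactly by enumerating the equally likely sample functions (a quick sketch, not part of the notes; the test times are arbitrary).

```python
import math

# The four equally likely sample functions of the example
outcomes = [lambda t: math.sin(t), lambda t: -math.sin(t),
            lambda t: math.cos(t), lambda t: -math.cos(t)]

def mean(t):
    return sum(f(t) for f in outcomes) / 4

def R(t1, t2):
    return sum(f(t1) * f(t2) for f in outcomes) / 4

# E X(t) = 0 and R_X(t1, t2) = (1/2) cos(t2 - t1) at arbitrary times:
t1, t2 = 0.3, 1.1
print(mean(t1))                             # 0 up to float rounding
print(R(t1, t2), 0.5 * math.cos(t2 - t1))   # equal up to float rounding
```

The sin and cos terms cancel in the mean, while the product average collapses to (sin t_1 sin t_2 + cos t_1 cos t_2)/2 = (1/2) cos(t_2 − t_1), exactly as claimed.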
Autocorrelation Function of WSS Processes

Let X(t) be a WSS process. Relabel R_X(t_1, t_2) as R_X(τ), where τ = t_2 − t_1.

1. R_X(τ) is real and even, i.e., R_X(τ) = R_X(−τ) for every τ.

2. |R_X(τ)| ≤ R_X(0) = E[X²(t)], the average power of X(t). This can be shown as follows. For every t,
(R_X(τ))² = [E(X(t)X(t + τ))]²
≤ E[X²(t)] E[X²(t + τ)]   (by the Schwarz inequality)
= (R_X(0))²   (by stationarity).

3. If R_X(T) = R_X(0) for some T ≠ 0, then R_X(τ) is periodic with period T and so is X(t) (with probability 1). That is,
R_X(τ) = R_X(τ + T) for every τ,  and  X(t) = X(t + T) w.p.1 for every t.

Example: The autocorrelation function for the periodic signal with random phase X(t) = α cos(ωt + Θ) is R_X(τ) = (α²/2) cos ωτ (also periodic).

To prove property 3, we again use the Schwarz inequality: for every τ,
[R_X(τ) − R_X(τ + T)]² = [E(X(t)(X(t + τ) − X(t + τ + T)))]²
≤ E[X²(t)] E[(X(t + τ) − X(t + τ + T))²]
= R_X(0) (2R_X(0) − 2R_X(T))
= 2R_X(0) (R_X(0) − R_X(0)) = 0.
Thus R_X(τ) = R_X(τ + T) for all τ, i.e., R_X(τ) is periodic with period T.

The above properties of R_X(τ) are necessary but not sufficient for a function to qualify as an autocorrelation function for a WSS process.
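The random-phase autocorrelation R_X(τ) = (α²/2) cos ωτ can be checked by averaging over the phase directly. This sketch (not part of the notes; α, ω, and the reference time t are arbitrary choices) replaces the expectation over Θ with a quadrature over an equally spaced phase grid.

```python
import math

alpha, omega = 2.0, 3.0
M = 10000  # quadrature points for the average over Theta ~ U[0, 2*pi]

def R(tau, t=0.4):
    """E[X(t) X(t + tau)] averaged over the uniform random phase Theta."""
    total = 0.0
    for k in range(M):
        theta = 2.0 * math.pi * (k + 0.5) / M
        total += (alpha * math.cos(omega * t + theta)
                  * alpha * math.cos(omega * (t + tau) + theta))
    return total / M

for tau in (0.0, 0.5, 1.2):
    # Theory: R_X(tau) = (alpha^2 / 2) cos(omega * tau), independent of t
    print(round(R(tau), 6), round(alpha**2 / 2 * math.cos(omega * tau), 6))
```

Note that R(0) recovers the average power α²/2 and |R(τ)| never exceeds it, consistent with properties 1 and 2.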
The necessary and sufficient condition for a function to be an autocorrelation function for a WSS process is that it be real, even, and nonnegative definite.

By nonnegative definite we mean that for any n, any t_1, t_2, ..., t_n and any real vector a = (a_1, ..., a_n),
Σ_{i=1}^{n} Σ_{j=1}^{n} a_i a_j R(t_i − t_j) ≥ 0.

To see why this is necessary, recall that the correlation matrix for a random vector must be nonnegative definite, so if we take a set of n samples from the WSS random process, their correlation matrix must be nonnegative definite.

The condition is sufficient since such an R(τ) can specify a zero-mean stationary Gaussian random process.

The nonnegative definite condition may be difficult to verify directly. It turns out, however, to be equivalent to the condition that the Fourier transform of R_X(τ), which is called the power spectral density S_X(f), is nonnegative for all frequencies f.

Which Functions Can Be an R_X(τ)?

[Figure: four candidate functions, plots 1-4, including e^{−α|τ|} and sinc τ]
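Nonnegative definiteness can be spot-checked numerically by evaluating the quadratic form for many random coefficient vectors. A sketch, not from the notes, using the valid discrete autocorrelation R_X(n) = 2^{−|n|} (the grid size and number of trials are arbitrary):

```python
import random

random.seed(0)

def R(n):
    return 0.5 ** abs(n)  # candidate autocorrelation R_X(n) = 2^{-|n|}

n = 6  # sample times 0, 1, ..., n-1
for _ in range(1000):
    a = [random.uniform(-1.0, 1.0) for _ in range(n)]
    # quadratic form a^T C a over the correlation matrix C[i][j] = R(i - j)
    q = sum(a[i] * a[j] * R(i - j) for i in range(n) for j in range(n))
    assert q >= -1e-12, "quadratic form went negative"
print("all 1000 random quadratic forms were nonnegative")
```

A failed assertion would disqualify a candidate function; passing is only evidence, not a proof, of nonnegative definiteness. The psd test of the next slides (S_X(f) ≥ 0) is the conclusive check.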
Which Functions Can Be an R_X(τ)? (continued)

[Figure: four more candidate functions, plots 5-8]

Interpretation of Autocorrelation Function

Let X(t) be WSS with zero mean. If R_X(τ) drops quickly with τ, this means that samples become uncorrelated quickly as we increase τ. Conversely, if R_X(τ) drops slowly with τ, samples are highly correlated.

[Figure: a quickly decaying R_{X_1}(τ) versus a slowly decaying R_{X_2}(τ)]

So R_X(τ) is a measure of the rate of change of X(t) with time t, i.e., the frequency response of X(t).

It turns out that this is not just an intuitive interpretation: the Fourier transform of R_X(τ) (the power spectral density) is in fact the average power density of X(t) over frequency.
Power Spectral Density

The power spectral density (psd) of a WSS random process X(t) is the Fourier transform of R_X(τ):
S_X(f) = F(R_X(τ)) = ∫_{−∞}^{∞} R_X(τ) e^{−i2πfτ} dτ.

For a discrete-time process X_n, the power spectral density is the discrete-time Fourier transform (DTFT) of the sequence R_X(n):
S_X(f) = Σ_{n=−∞}^{∞} R_X(n) e^{−i2πnf},  |f| < 1/2.

R_X(τ) (or R_X(n)) can be recovered from S_X(f) by taking the inverse Fourier transform or inverse DTFT:
R_X(τ) = ∫_{−∞}^{∞} S_X(f) e^{i2πfτ} df
R_X(n) = ∫_{−1/2}^{1/2} S_X(f) e^{i2πnf} df.

Properties of the Power Spectral Density

1. S_X(f) is real and even, since the Fourier transform of the real and even function R_X(τ) is real and even.

2. ∫_{−∞}^{∞} S_X(f) df = R_X(0) = E(X²(t)), the average power of X(t), i.e., the area under S_X is the average power.

3. S_X(f) is the average power density, i.e., the average power of X(t) in the frequency band [f_1, f_2] is
∫_{−f_2}^{−f_1} S_X(f) df + ∫_{f_1}^{f_2} S_X(f) df = 2 ∫_{f_1}^{f_2} S_X(f) df
(we will show this soon).

From property 3, it follows that S_X(f) ≥ 0. Why?

In general, a function S(f) is a psd if and only if it is real, even, nonnegative, and ∫_{−∞}^{∞} S(f) df < ∞.
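The DTFT pair can be sanity-checked numerically: compute S_X(f) from a truncated sum, then recover R_X(n) by numerical integration of the inversion formula. A sketch, not from the notes, using the sequence R_X(n) = 2^{−|n|}; the truncation limit N and grid size M are arbitrary choices, and evenness of R_X lets everything be written with cosines.

```python
import math

def R(n):
    return 0.5 ** abs(n)  # example autocorrelation sequence

def S(f, N=60):
    """Truncated DTFT sum; the tail beyond |n| = N is negligible here."""
    return R(0) + 2.0 * sum(R(n) * math.cos(2.0 * math.pi * n * f)
                            for n in range(1, N + 1))

def R_inv(n, M=2000):
    """Inverse DTFT: midpoint-rule integral of S(f) e^{i 2 pi n f} over |f| < 1/2."""
    total = 0.0
    for k in range(M):
        f = -0.5 + (k + 0.5) / M
        total += S(f) * math.cos(2.0 * math.pi * n * f)
    return total / M

for n in (0, 1, 3):
    print(n, round(R_inv(n), 6), R(n))  # recovered value matches R(n)
```

Equally spaced quadrature over a full period integrates trigonometric polynomials exactly, so the recovery here is limited only by the tail truncation.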
Examples

1. R_X(τ) = e^{−α|τ|}  ⟷  S_X(f) = 2α / (α² + (2πf)²)

2. R_X(τ) = (α²/2) cos ωτ  ⟷  S_X(f) = (α²/4) [ δ(f + ω/2π) + δ(f − ω/2π) ]
   (two impulses, each of area α²/4, at f = ±ω/2π)

3. R_X(n) = 2^{−|n|}  ⟷  S_X(f) = 3 / (5 − 4 cos 2πf),  |f| < 1/2

4. Discrete-time white noise process: X_1, X_2, ..., X_n, ... zero mean, uncorrelated, with average power N:
R_X(n) = N for n = 0, and 0 otherwise  ⟷  S_X(f) = N,  |f| < 1/2.
If X_n is also a GRP, then we obtain a discrete-time WGN process.
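Example 1 can be checked by evaluating the Fourier integral of e^{−α|τ|} numerically. A sketch, not from the notes: α, the truncation length T, and the grid size M are arbitrary choices, and evenness of R_X reduces the integral to a cosine integral over [0, T].

```python
import math

alpha = 1.5  # arbitrary decay rate

def S_closed(f):
    return 2.0 * alpha / (alpha**2 + (2.0 * math.pi * f)**2)

def S_numeric(f, T=40.0, M=100000):
    """Midpoint-rule Fourier integral of R_X(tau) = e^{-alpha|tau|}.

    Evenness gives S(f) = 2 * int_0^T e^{-alpha*tau} cos(2 pi f tau) dtau,
    with a tail beyond T below e^{-alpha*T}.
    """
    h = T / M
    total = 0.0
    for k in range(M):
        tau = (k + 0.5) * h
        total += math.exp(-alpha * tau) * math.cos(2.0 * math.pi * f * tau)
    return 2.0 * h * total

for f in (0.0, 0.3, 1.0):
    print(f, round(S_numeric(f), 4), round(S_closed(f), 4))  # agree
```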
5. Bandlimited white noise process: WSS zero-mean process X(t) with
S_X(f) = N/2 for |f| ≤ B, and 0 otherwise  ⟷  R_X(τ) = NB sinc 2Bτ,
where sinc x = sin(πx)/(πx).
For any t, the samples X(t ± n/(2B)) for n = 0, 1, 2, ... are uncorrelated.

6. White noise process: If we let B → ∞ in the previous example, we obtain a white noise process, which has
S_X(f) = N/2 for all f  ⟷  R_X(τ) = (N/2) δ(τ).
If, in addition, X(t) is a GRP, then we obtain the famous white Gaussian noise (WGN) process.

Remarks on white noise:
- For a white noise process, all samples are uncorrelated.
- The process is not physically realizable, since it has infinite power. However, it plays a similar role in random processes to the point mass in physics and the delta function in linear systems.
- Thermal noise and shot noise are well modeled as white Gaussian noise, since they have a very flat psd over a very wide band (GHz).
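The claim that samples spaced 1/(2B) apart are uncorrelated follows from the zeros of the sinc: R_X(n/(2B)) = NB sinc(n) = 0 for n ≠ 0. A quick numeric sketch, not from the notes (N and B are arbitrary choices):

```python
import math

N, B = 2.0, 5.0  # arbitrary psd level and bandwidth

def R(tau):
    """R_X(tau) = N*B*sinc(2*B*tau), with sinc(x) = sin(pi x)/(pi x)."""
    x = 2.0 * B * tau
    if x == 0:
        return N * B
    return N * B * math.sin(math.pi * x) / (math.pi * x)

print(R(0.0))  # average power N*B = 10.0
# Samples spaced n/(2B) apart, n != 0, are uncorrelated:
for n in range(1, 6):
    assert abs(R(n / (2.0 * B))) < 1e-12
print("R vanishes at every nonzero multiple of 1/(2B)")
```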
Continuity and Integration of Random Processes

We are all familiar with the definitions of continuity and integration for deterministic functions as limits. Using the notions of convergence discussed in Lecture Notes 5, we can define these notions for random processes. We focus only on m.s. convergence.

Continuity: A process X(t) is said to be mean square continuous if for every t,
lim_{s→t} E[(X(s) − X(t))²] = 0.

The continuity of X(t) depends only on its autocorrelation function R_X(t_1, t_2). In fact, the following statements are all equivalent:
1. R_X(t_1, t_2) is continuous at all points of the form (t, t)
2. X(t) is m.s. continuous
3. R_X(t_1, t_2) is continuous in t_1, t_2

Proof:

1 implies 2: If R_X(t_1, t_2) is continuous at all points (t, t), then
E[(X(t) − X(s))²] = R_X(t, t) + R_X(s, s) − 2R_X(s, t) → 0 as s → t.

2 implies 3: Consider
R_X(s_1, s_2) = E[X(s_1)X(s_2)]
= E[(X(t_1) + (X(s_1) − X(t_1)))(X(t_2) + (X(s_2) − X(t_2)))]
= R_X(t_1, t_2) + E[X(t_1)(X(s_2) − X(t_2))] + E[X(t_2)(X(s_1) − X(t_1))] + E[(X(s_1) − X(t_1))(X(s_2) − X(t_2))]
≤ R_X(t_1, t_2) + √(E[X²(t_1)] E[(X(s_2) − X(t_2))²]) + √(E[X²(t_2)] E[(X(s_1) − X(t_1))²]) + √(E[(X(s_1) − X(t_1))²] E[(X(s_2) − X(t_2))²])   (Schwarz inequality)
→ R_X(t_1, t_2) as s_1 → t_1 and s_2 → t_2, since X(t) is m.s. continuous.

Since 3 implies 1, we are done.
Example: The Poisson process N(t) with rate λ > 0 is m.s. continuous, since its autocorrelation function
R_N(t_1, t_2) = λ min{t_1, t_2} + λ² t_1 t_2
is a continuous function.

Integration: Let X(t) be a random process and h(t) be a function. We can define the integral
∫_a^b h(t) X(t) dt
as the limit of a sum (as in the Riemann integral of a deterministic function) in m.s. Let Δ > 0 be such that b − a = nΔ, and let
a ≤ τ_1 ≤ a + Δ ≤ τ_2 ≤ a + 2Δ ≤ ... ≤ a + (n − 1)Δ ≤ τ_n ≤ a + nΔ = b;
then the corresponding Riemann sum is
Σ_{i=1}^{n} h(τ_i) X(τ_i) Δ.
The above integral then exists if this sum has a limit in m.s. as Δ → 0.

Moreover, if the random integral exists for all a, b, then we can define
∫_{−∞}^{∞} h(t) X(t) dt = lim_{a→−∞, b→∞} ∫_a^b h(t) X(t) dt  in m.s.

Fact: The existence of the m.s. integral depends only on R_X and h. More specifically, the above integral exists iff
∫_a^b ∫_a^b R_X(t_1, t_2) h(t_1) h(t_2) dt_1 dt_2
exists (in the normal sense).

Remark: We are skipping several mathematical details here. In what follows, we use the above fact to justify the existence of integrals involving random processes and in interchanging expectation and integration.
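For any fixed realization, the Riemann sum above is an ordinary one. This sketch, not from the notes, checks it against the exact integral for a single sample path of the random-phase signal X(t) = cos(t + Θ), with h(t) = 1 and the phase, interval, and grid chosen arbitrarily; the m.s. integral additionally requires this convergence to hold in mean square across realizations, which is what the R_X condition guarantees.

```python
import math

theta = 1.234  # one fixed realization of the random phase

def X(t):
    return math.cos(t + theta)  # the corresponding sample path

a, b, n = 0.0, 2.0, 100000
delta = (b - a) / n
# Riemann sum  sum_i h(tau_i) X(tau_i) * delta  with h(t) = 1, tau_i midpoints
riemann = sum(X(a + (i + 0.5) * delta) for i in range(n)) * delta
exact = math.sin(b + theta) - math.sin(a + theta)  # antiderivative of cos(t+theta)
print(round(riemann, 8), round(exact, 8))  # agree to quadrature accuracy
```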
Stationary Ergodic Random Processes

Let X(t) be SSS or only WSS. Ergodicity of X(t) means that certain time averages converge to their respective statistical averages.

Mean ergodic process: Let X(t) be a WSS and m.s. continuous random process with mean µ_X. To estimate the mean of X(t), we form the time average
X̄(t) = (1/t) ∫_0^t X(τ) dτ.
The random process X(t) is said to be mean ergodic if X̄(t) → µ_X as t → ∞ in m.s.

Similarly, for a discrete random process, the time average (same as the sample average) is
X̄_n = (1/n) Σ_{i=1}^{n} X_i,
and the random process is mean ergodic if X̄_n → µ_X in m.s.

Example: Let X_n be a WSS process with C_X(n) = 0 for n ≠ 0, i.e., the X_i's are uncorrelated; then X_n is mean ergodic. The process does not need to have uncorrelated samples for it to be mean ergodic, however.

Whether a WSS process is mean ergodic again depends only on its autocorrelation function. By definition, mean ergodicity means that
lim_{t→∞} E[(X̄(t) − µ_X)²] = 0.
Since E(X̄(t)) = µ_X, the condition for mean ergodicity is the same as
lim_{t→∞} Var(X̄(t)) = 0.
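The uncorrelated-samples example is easy to illustrate by simulation: for IID (hence uncorrelated) samples, Var(X̄_n) = σ²/n → 0, so the sample average closes in on µ_X. A hedged sketch, not from the notes (the Gaussian distribution, mean, and seed are arbitrary choices):

```python
import random

random.seed(2)
mu = 1.5  # true mean of the IID (hence uncorrelated) sequence

def sample_average(n):
    return sum(random.gauss(mu, 1.0) for _ in range(n)) / n

for n in (10, 1000, 100000):
    print(n, round(sample_average(n), 3))  # approaches mu = 1.5 as n grows
```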
Now consider
E(X̄²(t)) = E[ ( (1/t) ∫_0^t X(τ) dτ )² ]
= (1/t²) E[ ∫_0^t ∫_0^t X(τ_1) X(τ_2) dτ_1 dτ_2 ]
= (1/t²) ∫_0^t ∫_0^t R_X(τ_1, τ_2) dτ_1 dτ_2
= (1/t²) ∫_0^t ∫_0^t R_X(τ_2 − τ_1) dτ_1 dτ_2.

From the figure below, this double integral reduces to the single integral
E(X̄²(t)) = (2/t²) ∫_0^t (t − τ) R_X(τ) dτ.

[Figure: the square [0, t] × [0, t] in the (τ_1, τ_2) plane; R_X(τ_2 − τ_1) is constant along each diagonal τ_2 = τ_1 + τ, whose length within the square is proportional to t − |τ|]

Hence, a WSS process X(t) is mean ergodic iff
lim_{t→∞} (2/t²) ∫_0^t (t − τ) R_X(τ) dτ = µ_X².
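The reduction of the double integral to a single integral can be verified numerically for a concrete autocorrelation, e.g. R_X(τ) = e^{−|τ|}. A sketch, not from the notes; t and the grid size M are arbitrary choices.

```python
import math

def R(tau):
    return math.exp(-abs(tau))  # example autocorrelation

t, M = 3.0, 800
h = t / M

# Double integral (1/t^2) int_0^t int_0^t R(tau2 - tau1) dtau1 dtau2, midpoint rule
double = sum(R((j - i) * h) for i in range(M) for j in range(M)) * h * h / t**2

# Single integral (2/t^2) int_0^t (t - tau) R(tau) dtau, midpoint rule
single = 2.0 * sum((t - (k + 0.5) * h) * R((k + 0.5) * h)
                   for k in range(M)) * h / t**2

print(round(double, 3), round(single, 3))  # the two quadratures agree
```

Both quadratures also match the closed form (2/t²)(e^{−t} + t − 1) obtained by carrying out the single integral analytically.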
Example: Let X(t) be WSS with zero mean and R_X(τ) = e^{−|τ|}. Evaluating the condition on mean ergodicity, we obtain
(2/t²) ∫_0^t (t − τ) R_X(τ) dτ = (2/t²) (e^{−t} + t − 1),
which → 0 = µ_X² as t → ∞. Hence X(t) is mean ergodic.

Example: Consider the coin with random bias P example in Lecture Notes 5. The random process X_1, X_2, ... is stationary. However, it is not mean ergodic, since X̄_n → P in m.s., and P is random rather than equal to the constant mean µ_X = E(P).

Remarks:
- The process in the above example can be viewed as a mixture of IID Bernoulli(p) processes, each of which is stationary ergodic. (It turns out that every stationary process is a mixture of stationary ergodic processes.)
- Ergodicity can be defined for general (not necessarily stationary) processes; this is beyond the scope of this course, however.
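The non-ergodicity of the random-bias coin shows up clearly in simulation: conditioned on the draw of P, the sample average settles at P rather than at the ensemble mean E(P). A sketch, not from the notes, assuming for concreteness that P ~ U[0, 1] so that µ_X = E(P) = 1/2:

```python
import random

random.seed(3)
mu_X = 0.5  # ensemble mean E(X_n) = E(P) under the assumed P ~ U[0, 1]

for trial in range(3):
    p = random.random()  # the bias is drawn once per realization of the process
    n = 100000
    flips = sum(1 for _ in range(n) if random.random() < p)
    # The time average tracks the realized bias p, not the ensemble mean 0.5
    print(round(p, 3), round(flips / n, 3))
```

Each realization converges, but to a different (random) limit P, so the time average does not converge in m.s. to the constant µ_X.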