Lecture Notes 7 Stationary Random Processes. Strict-Sense and Wide-Sense Stationarity. Autocorrelation Function of a Stationary Process

Lecture Notes 7: Stationary Random Processes

- Strict-Sense and Wide-Sense Stationarity
- Autocorrelation Function of a Stationary Process
- Power Spectral Density
- Continuity and Integration of Random Processes
- Stationary Ergodic Random Processes

Stationary Random Processes

Stationarity refers to time invariance of some, or all, of the statistics of a random process, such as its mean, autocorrelation, and n-th-order distributions. We define two types of stationarity: strict sense (SSS) and wide sense (WSS).

A random process X(t) (or X_n) is said to be SSS if all of its finite-order distributions are time invariant, i.e., the joint cdfs (pdfs, pmfs) of

X(t_1), X(t_2), ..., X(t_k)   and   X(t_1 + τ), X(t_2 + τ), ..., X(t_k + τ)

are the same for all k, all t_1, t_2, ..., t_k, and all time shifts τ.

So for an SSS process, the first-order distribution is independent of t, and the second-order distribution, i.e., the distribution of any two samples X(t_1) and X(t_2), depends only on τ = t_2 - t_1. To see this, note that from the definition of stationarity, for any t, the joint distribution of X(t_1) and X(t_2) is the same as the joint distribution of X(t_1 + (t - t_1)) = X(t) and X(t_2 + (t - t_1)) = X(t + (t_2 - t_1)).

Example: The random phase signal X(t) = α cos(ωt + Θ), where Θ ~ U[0, 2π], is SSS.

We already know that the first-order pdf is

f_{X(t)}(x) = 1 / (π α √(1 - (x/α)^2)),   -α < x < +α,

which is independent of t, and is therefore stationary.

To find the second-order pdf, note that if we are given the value of X(t) at one point, say t_1, there are (at most) two possible sample functions.

[Figure: two sample functions passing through the value x_1 at time t_1, taking two possible values at time t_2.]

The second-order pdf can thus be written as

f_{X(t_1), X(t_2)}(x_1, x_2) = f_{X(t_1)}(x_1) f_{X(t_2)|X(t_1)}(x_2 | x_1)
                             = f_{X(t_1)}(x_1) ( (1/2) δ(x_2 - x̃_1) + (1/2) δ(x_2 - x̃_2) ),

where x̃_1 and x̃_2 are the two possible values of X(t_2). This depends only on t_2 - t_1, and thus the second-order pdf is stationary.

Now if we know that X(t_1) = x_1 and X(t_2) = x_2, the sample path is totally determined (except when x_1 = x_2 = 0, where two paths may be possible), and thus all n-th order pdfs are stationary.

IID processes are SSS.

Random walk and Poisson processes are not SSS.

The Gauss-Markov process (as we defined it) is not SSS. However, if we set X_1 according to the steady-state distribution of X_n, it becomes SSS (see homework exercise).
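
As a quick numerical sanity check of this example (a sketch; the sampling times, α, ω, and the histogram grid are arbitrary choices, not from the notes), we can draw many realizations of Θ ~ U[0, 2π], sample X(t) = α cos(ωt + Θ) at two different times, and compare the empirical histograms with the density 1/(πα√(1 - (x/α)^2)):

```python
import numpy as np

rng = np.random.default_rng(0)
alpha, omega = 2.0, 1.0
n = 200_000
theta = rng.uniform(0, 2 * np.pi, size=n)        # random phase, U[0, 2*pi]

for t in (0.3, 1.7):                             # two arbitrary sampling times
    x = alpha * np.cos(omega * t + theta)        # samples of X(t)
    hist, edges = np.histogram(x, bins=50, range=(-alpha, alpha), density=True)
    centers = 0.5 * (edges[:-1] + edges[1:])
    f_theory = 1.0 / (np.pi * alpha * np.sqrt(1 - (centers / alpha) ** 2))
    err = np.max(np.abs(hist - f_theory)[5:-5])  # ignore bins near x = +-alpha
    print(f"t = {t}: max histogram error away from the edges = {err:.3f}")
```

Both sampling times give essentially the same histogram, consistent with the first-order pdf being independent of t.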

Wide-Sense Stationary Random Processes

A random process X(t) is said to be wide-sense stationary (WSS) if its mean and autocorrelation functions are time invariant, i.e.,

- E(X(t)) = µ, independent of t
- R_X(t_1, t_2) is a function only of the time difference t_2 - t_1
- E[X(t)^2] < ∞ (technical condition)

Since R_X(t_1, t_2) = R_X(t_2, t_1), for any wide-sense stationary process X(t), R_X(t_1, t_2) is a function only of |t_2 - t_1|.

Clearly SSS ⇒ WSS. The converse is not necessarily true.

Example: Let

X(t) = +sin t   with probability 1/4
       -sin t   with probability 1/4
       +cos t   with probability 1/4
       -cos t   with probability 1/4

Then E(X(t)) = 0 and R_X(t_1, t_2) = (1/2) cos(t_2 - t_1), thus X(t) is WSS. But X(0) and X(π/4) do not have the same pmf (different ranges), so the first-order pmf is not stationary, and the process is not SSS.

For Gaussian random processes, WSS ⇒ SSS, since the process is completely specified by its mean and autocorrelation functions.

Random walk is not WSS, since R_X(n_1, n_2) = min{n_1, n_2} is not time invariant; similarly, the Poisson process is not WSS.
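
The WSS-but-not-SSS example above can be checked by direct enumeration over the four equally likely sample functions (a small sketch; the particular time points t_1, t_2 are arbitrary):

```python
import numpy as np

# The four equally likely sample functions of the example process
paths = [np.sin, lambda t: -np.sin(t), np.cos, lambda t: -np.cos(t)]

def mean(t):
    return np.mean([g(t) for g in paths])

def R(t1, t2):
    return np.mean([g(t1) * g(t2) for g in paths])

t1, t2 = 0.7, 1.9                         # arbitrary time points
print(mean(t1))                           # ~0: the mean is constant
print(R(t1, t2), 0.5 * np.cos(t2 - t1))   # R depends only on t2 - t1
print(R(t1 + 5, t2 + 5))                  # same value after a time shift

# But the first-order pmf is not time invariant:
print(sorted(g(0.0) for g in paths))        # values of X(0):    {-1, 0, 0, 1}
print(sorted(g(np.pi / 4) for g in paths))  # values of X(pi/4): +-1/sqrt(2), each twice
```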

Autocorrelation Function of WSS Processes

Let X(t) be a WSS process. Relabel R_X(t_1, t_2) as R_X(τ), where τ = t_1 - t_2.

1. R_X(τ) is real and even, i.e., R_X(τ) = R_X(-τ) for every τ.

2. |R_X(τ)| ≤ R_X(0) = E[X^2(t)], the average power of X(t).

   This can be shown as follows. For every t,

   (R_X(τ))^2 = [E(X(t) X(t + τ))]^2
              ≤ E[X^2(t)] E[X^2(t + τ)]    (by the Schwarz inequality)
              = (R_X(0))^2                 (by stationarity)

3. If R_X(T) = R_X(0) for some T ≠ 0, then R_X(τ) is periodic with period T, and so is X(t) (with probability 1). That is,

   R_X(τ) = R_X(τ + T) for every τ,  and  X(t) = X(t + T) w.p.1 for every t.

Example: The autocorrelation function for the periodic signal with random phase X(t) = α cos(ωt + Θ) is R_X(τ) = (α^2/2) cos ωτ (also periodic).

To prove property 3, we again use the Schwarz inequality: for every τ,

[R_X(τ) - R_X(τ + T)]^2 = [E(X(t) (X(t + τ) - X(t + τ + T)))]^2
                        ≤ E[X^2(t)] E[(X(t + τ) - X(t + τ + T))^2]
                        = R_X(0) (2R_X(0) - 2R_X(T))
                        = R_X(0) (2R_X(0) - 2R_X(0)) = 0

Thus R_X(τ) = R_X(τ + T) for all τ, i.e., R_X(τ) is periodic with period T.

The above properties of R_X(τ) are necessary but not sufficient for a function to qualify as an autocorrelation function for a WSS process.
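
As a Monte Carlo sketch (α, ω, the reference time t, and the lag grid are my own choices), the ensemble-average estimate of R_X(τ) for the random phase signal is even, peaks at τ = 0, and matches (α^2/2) cos ωτ, illustrating the properties above:

```python
import numpy as np

rng = np.random.default_rng(1)
alpha, omega = 1.5, 2.0
theta = rng.uniform(0, 2 * np.pi, size=100_000)

def R_hat(tau, t=0.4):
    """Ensemble-average estimate of E[X(t) X(t + tau)] for the random phase signal."""
    return np.mean(alpha * np.cos(omega * t + theta) *
                   alpha * np.cos(omega * (t + tau) + theta))

taus = np.linspace(-5, 5, 41)
R_est = np.array([R_hat(tau) for tau in taus])
R_true = 0.5 * alpha**2 * np.cos(omega * taus)

print(np.max(np.abs(R_est - R_true)))                # small estimation error
print(np.allclose(R_est, R_est[::-1], atol=0.05))    # even: R(tau) = R(-tau)
print(np.all(np.abs(R_est) <= R_hat(0.0) + 0.05))    # |R(tau)| <= R(0)
```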

The necessary and sufficient condition for a function to be an autocorrelation function for a WSS process is that it be real, even, and nonnegative definite. By nonnegative definite we mean that for any n, any t_1, t_2, ..., t_n, and any real vector a = (a_1, ..., a_n),

Σ_{i=1}^{n} Σ_{j=1}^{n} a_i a_j R(t_i - t_j) ≥ 0

To see why this is necessary, recall that the correlation matrix for a random vector must be nonnegative definite, so if we take a set of n samples from the WSS random process, their correlation matrix must be nonnegative definite.

The condition is sufficient since such an R(τ) can specify a zero-mean stationary Gaussian random process.

The nonnegative definite condition may be difficult to verify directly. It turns out, however, to be equivalent to the condition that the Fourier transform of R_X(τ), which is called the power spectral density S_X(f), is nonnegative for all frequencies f.

Which Functions Can Be an R_X(τ)?

[Figure: candidate functions 1-4, including e^{-α|τ|} and sinc τ.]
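
A small numerical illustration of the nonnegative-definiteness test (a sketch; the two candidate functions and the sampling grid are my own choices): build the matrix [R(t_i - t_j)] on a grid of sample times and inspect its smallest eigenvalue. The function e^{-|τ|} passes, while a rectangular pulse 1{|τ| ≤ 1}, whose Fourier transform (a sinc) takes negative values, fails:

```python
import numpy as np

def min_eig(R, times):
    """Smallest eigenvalue of the matrix [R(t_i - t_j)] built on the given times."""
    T1, T2 = np.meshgrid(times, times)
    M = R(T1 - T2)
    return np.linalg.eigvalsh(M).min()

times = np.linspace(0, 10, 60)                     # arbitrary sample times

R_exp = lambda tau: np.exp(-np.abs(tau))           # a valid autocorrelation function
R_rect = lambda tau: (np.abs(tau) <= 1.0) * 1.0    # NOT a valid autocorrelation function

print(min_eig(R_exp, times))    # >= 0 (up to round-off): nonnegative definite
print(min_eig(R_rect, times))   # clearly negative: fails the test
```

The smallest eigenvalue of a sampled correlation matrix is only a finite-dimensional proxy for the nonnegative-definiteness condition; the equivalent frequency-domain test is to check that the Fourier transform of the candidate function is nonnegative.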

Which Functions Can Be an R_X(τ)? (continued)

[Figure: candidate functions 5-8, plotted against τ and n.]

Interpretation of the Autocorrelation Function

Let X(t) be WSS with zero mean. If R_X(τ) drops quickly with τ, this means that samples become uncorrelated quickly as we increase τ. Conversely, if R_X(τ) drops slowly with τ, samples are highly correlated.

[Figure: R_{X_1}(τ) decaying quickly versus R_{X_2}(τ) decaying slowly.]

So R_X(τ) is a measure of the rate of change of X(t) with time t, i.e., of the frequency content of X(t).

It turns out that this is not just an intuitive interpretation: the Fourier transform of R_X(τ), the power spectral density, is in fact the average power density of X(t) over frequency.
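
As an illustration of fast versus slow autocorrelation decay (a sketch with made-up parameters, not an example from the notes), consider two first-order autoregressive processes X_k = a X_{k-1} + Z_k driven by unit-variance white Gaussian Z_k. Their normalized autocorrelation is a^{|m|}, so a = 0.3 decorrelates quickly (wideband) while a = 0.95 decorrelates slowly (narrowband):

```python
import numpy as np

rng = np.random.default_rng(2)

def ar1(a, n=100_000):
    """Simulate X_k = a X_{k-1} + Z_k with unit-variance white Gaussian Z_k."""
    z = rng.standard_normal(n)
    x = np.zeros(n)
    for k in range(1, n):
        x[k] = a * x[k - 1] + z[k]
    return x

def acorr(x, max_lag=10):
    """Sample autocorrelation R_X(m) for m = 0..max_lag (time average)."""
    n = len(x)
    return np.array([np.mean(x[:n - m] * x[m:]) for m in range(max_lag + 1)])

for a in (0.3, 0.95):
    R = acorr(ar1(a))
    # theory: R(m)/R(0) = a**m, so a = 0.3 decays fast, a = 0.95 slowly
    print(f"a = {a}: R(m)/R(0) =", np.round(R / R[0], 3))
```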

Power Spectral Density

The power spectral density (psd) of a WSS random process X(t) is the Fourier transform of R_X(τ):

S_X(f) = F{R_X(τ)} = ∫_{-∞}^{∞} R_X(τ) e^{-i2πfτ} dτ

For a discrete-time process X_n, the power spectral density is the discrete-time Fourier transform (DTFT) of the sequence R_X(n):

S_X(f) = Σ_{n=-∞}^{∞} R_X(n) e^{-i2πnf},   |f| < 1/2

R_X(τ) (or R_X(n)) can be recovered from S_X(f) by taking the inverse Fourier transform or inverse DTFT:

R_X(τ) = ∫_{-∞}^{∞} S_X(f) e^{i2πfτ} df

R_X(n) = ∫_{-1/2}^{1/2} S_X(f) e^{i2πnf} df

Properties of the Power Spectral Density

1. S_X(f) is real and even, since the Fourier transform of the real and even function R_X(τ) is real and even.

2. ∫_{-∞}^{∞} S_X(f) df = R_X(0) = E(X^2(t)), the average power of X(t), i.e., the area under S_X is the average power.

3. S_X(f) is the average power density, i.e., the average power of X(t) in the frequency band [f_1, f_2] is

   ∫_{-f_2}^{-f_1} S_X(f) df + ∫_{f_1}^{f_2} S_X(f) df = 2 ∫_{f_1}^{f_2} S_X(f) df

   (we will show this soon)

From property 3, it follows that S_X(f) ≥ 0. Why?

In general, a function S(f) is a psd if and only if it is real, even, nonnegative, and ∫_{-∞}^{∞} S(f) df < ∞.

Examples

1. R_X(τ) = e^{-α|τ|}  ⟷  S_X(f) = 2α / (α^2 + (2πf)^2)

2. R_X(τ) = (α^2/2) cos ωτ  ⟷  S_X(f) = (α^2/4) δ(f - ω/2π) + (α^2/4) δ(f + ω/2π)

   [Figure: two impulses of area α^2/4 at f = ±ω/2π.]

3. R_X(n) = 2^{-|n|}  ⟷  S_X(f) = 3 / (5 - 4 cos 2πf),   |f| < 1/2   (see the numerical check after this list)

4. Discrete-time white noise process: X_1, X_2, ..., X_n, ... zero mean, uncorrelated, with average power N:

   R_X(n) = N for n = 0, and 0 otherwise  ⟷  S_X(f) = N,   |f| < 1/2

   If X_n is also a GRP, then we obtain a discrete-time WGN process.
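
Example 3 can be verified numerically (a sketch; truncating the DTFT sum at |n| ≤ 50 is my choice): summing R_X(n) e^{-i2πnf} over lags reproduces 3/(5 - 4 cos 2πf), and integrating S_X(f) over (-1/2, 1/2) recovers R_X(0) = 1, the average power:

```python
import numpy as np

f = np.linspace(-0.5, 0.5, 1001)
n = np.arange(-50, 51)                       # truncate the DTFT sum at |n| <= 50
R = 2.0 ** (-np.abs(n))                      # R_X(n) = 2^{-|n|}

S_sum = (R[:, None] * np.exp(-1j * 2 * np.pi * np.outer(n, f))).sum(axis=0).real
S_formula = 3.0 / (5.0 - 4.0 * np.cos(2 * np.pi * f))

print(np.max(np.abs(S_sum - S_formula)))     # tiny truncation error
print(np.trapz(S_formula, f))                # ~1 = R_X(0), the average power
```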

5. Bandlimited white noise process: WSS zero-mean process X(t) with

   S_X(f) = N/2 for |f| < B, and 0 otherwise  ⟷  R_X(τ) = NB sinc(2Bτ)

   For any t, the samples X(t ± n/(2B)) for n = 0, 1, 2, ... are uncorrelated (a simulation sketch follows the remarks below).

6. White noise process: If we let B → ∞ in the previous example, we obtain a white noise process, which has

   S_X(f) = N/2 for all f  ⟷  R_X(τ) = (N/2) δ(τ)

   If, in addition, X(t) is a GRP, then we obtain the famous white Gaussian noise (WGN) process.

Remarks on white noise:

- For a white noise process, all samples are uncorrelated.
- The process is not physically realizable, since it has infinite power.
- However, it plays a role in random processes similar to that of the point mass in physics and the delta function in linear systems.
- Thermal noise and shot noise are well modeled as white Gaussian noise, since they have a very flat psd over a very wide band (GHz).
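
A simulation sketch of bandlimited white noise (the construction, ideal low-pass filtering of white Gaussian noise via the FFT, and all parameter values are my own choices, not from the notes): the estimated correlation between samples spaced 1/(2B) apart is close to zero, as the sinc autocorrelation predicts.

```python
import numpy as np

rng = np.random.default_rng(3)
fs, B, T = 1000.0, 50.0, 200.0              # sample rate, bandwidth, duration (arbitrary)
n = int(fs * T)

# Approximate bandlimited white noise: ideal low-pass filtering in the frequency domain
w = rng.standard_normal(n)
W = np.fft.rfft(w)
freqs = np.fft.rfftfreq(n, d=1 / fs)
W[freqs > B] = 0.0                          # keep only |f| <= B
x = np.fft.irfft(W, n)

step = int(fs / (2 * B))                    # samples spaced 1/(2B) apart
y = x[::step]
y = y - y.mean()
rho = np.mean(y[:-1] * y[1:]) / np.mean(y * y)
print(f"correlation of samples 1/(2B) apart = {rho:.3f}")   # ~0
```

Zeroing FFT coefficients above B is only a finite-record approximation to the ideal continuous-time low-pass model, so the estimated correlation is small but not exactly zero.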

Continuity and Integration of Random Processes

We are all familiar with the definitions of continuity and integration for deterministic functions as limits. Using the notions of convergence discussed in Lecture Notes 5, we can define these notions for random processes. We focus only on m.s. convergence.

Continuity: A process X(t) is said to be mean square continuous if for every t,

lim_{s→t} E[(X(s) - X(t))^2] = 0

The continuity of X(t) depends only on its autocorrelation function R_X(t_1, t_2). In fact, the following statements are all equivalent:

1. R_X(t_1, t_2) is continuous at all points of the form (t, t)
2. X(t) is m.s. continuous
3. R_X(t_1, t_2) is continuous in t_1, t_2

Proof:

1 implies 2: If R_X(t_1, t_2) is continuous at all points (t, t), then

E[(X(t) - X(s))^2] = R_X(t, t) + R_X(s, s) - 2R_X(s, t) → 0   as s → t

2 implies 3: Consider

R_X(s_1, s_2) = E[X(s_1) X(s_2)]
             = E[(X(t_1) + (X(s_1) - X(t_1))) (X(t_2) + (X(s_2) - X(t_2)))]
             = R_X(t_1, t_2) + E[X(t_1)(X(s_2) - X(t_2))] + E[X(t_2)(X(s_1) - X(t_1))]
               + E[(X(s_1) - X(t_1))(X(s_2) - X(t_2))]
             ≤ R_X(t_1, t_2) + √(E[X^2(t_1)] E[(X(s_2) - X(t_2))^2])
               + √(E[X^2(t_2)] E[(X(s_1) - X(t_1))^2])
               + √(E[(X(s_1) - X(t_1))^2] E[(X(s_2) - X(t_2))^2])    (Schwarz inequality)
             → R_X(t_1, t_2)   as s_1 → t_1 and s_2 → t_2, since X(t) is m.s. continuous

3 implies 1: Immediate, so we are done.

Example: The Poisson process N(t) with rate λ > 0 is m.s. continuous, since its autocorrelation function

R_N(t_1, t_2) = λ min{t_1, t_2} + λ^2 t_1 t_2

is a continuous function.

Integration: Let X(t) be a RP and h(t) a function. We can define the integral

∫_a^b h(t) X(t) dt

as the limit of a sum (as in the Riemann integral of a deterministic function) in m.s. Let Δ > 0 be such that b - a = nΔ, and pick points

a ≤ τ_1 ≤ a + Δ ≤ τ_2 ≤ a + 2Δ ≤ ... ≤ a + (n-1)Δ ≤ τ_n ≤ a + nΔ = b;

then the corresponding Riemann sum is

Σ_{i=1}^{n} h(τ_i) X(τ_i) Δ

The above integral exists if this sum has a limit in m.s. as Δ → 0.

Moreover, if the random integral exists for all a, b, then we can define

∫_{-∞}^{∞} h(t) X(t) dt = lim_{a→-∞, b→∞} ∫_a^b h(t) X(t) dt   in m.s.

Fact: The existence of the m.s. integral depends only on R_X and h. More specifically, the above integral exists iff

∫_a^b ∫_a^b R_X(t_1, t_2) h(t_1) h(t_2) dt_1 dt_2

exists (in the normal sense).

Remark: We are skipping several mathematical details here. In what follows, we use the above fact to justify the existence of integrals involving random processes and to justify interchanging expectation and integration.

Stationary Ergodic Random Processes

Let X(t) be SSS, or only WSS. Ergodicity of X(t) means that certain time averages converge to their respective statistical averages.

Mean ergodic process: Let X(t) be a WSS and m.s. continuous RP with mean µ_X. To estimate the mean of X(t), we form the time average

X̄(t) = (1/t) ∫_0^t X(τ) dτ

The RP X(t) is said to be mean ergodic if X̄(t) → µ_X as t → ∞ in m.s.

Similarly, for a discrete RP, the time average (same as the sample average) is

X̄_n = (1/n) Σ_{i=1}^{n} X_i

and the RP is mean ergodic if X̄_n → µ_X in m.s.

Example: Let X_n be a WSS process with C_X(n) = 0 for n ≠ 0, i.e., the X_i are uncorrelated; then X_n is mean ergodic (see the simulation sketch below). The process does not need to have uncorrelated samples for it to be mean ergodic, however.

Whether a WSS process is mean ergodic again depends only on its autocorrelation function. By definition, mean ergodicity means that

lim_{t→∞} E[(X̄(t) - µ_X)^2] = 0

Since E(X̄(t)) = µ_X, the condition for mean ergodicity is the same as

lim_{t→∞} Var(X̄(t)) = 0
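
A Monte Carlo sketch of the uncorrelated-samples example (the mean, distribution, and run counts are my own choices): for an IID sequence with mean µ_X = 2, the mean squared error E[(X̄_n - µ_X)^2] shrinks like 1/n, so X̄_n → µ_X in m.s.:

```python
import numpy as np

rng = np.random.default_rng(4)
mu = 2.0
for n in (10, 100, 1000, 10_000):
    X = mu + rng.standard_normal((1000, n))   # 1000 independent runs of length n
    Xbar = X.mean(axis=1)                     # time average of each run
    print(n, np.mean((Xbar - mu) ** 2))       # E[(Xbar - mu)^2] ~ 1/n -> 0
```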

Now, consider

E(X̄^2(t)) = E[ ( (1/t) ∫_0^t X(τ) dτ )^2 ]
          = (1/t^2) E[ ∫_0^t ∫_0^t X(τ_1) X(τ_2) dτ_1 dτ_2 ]
          = (1/t^2) ∫_0^t ∫_0^t R_X(τ_1, τ_2) dτ_1 dτ_2
          = (1/t^2) ∫_0^t ∫_0^t R_X(τ_2 - τ_1) dτ_1 dτ_2

From the figure below, this double integral reduces to the single integral

E(X̄^2(t)) = (2/t^2) ∫_0^t (t - τ) R_X(τ) dτ

[Figure: the square region of integration [0, t] × [0, t] in the (τ_1, τ_2) plane, with the lines τ_2 = τ_1 and τ_2 = τ_1 + τ along which R_X(τ_2 - τ_1) is constant.]

Hence, a WSS process X(t) is mean ergodic iff

lim_{t→∞} (2/t^2) ∫_0^t (t - τ) R_X(τ) dτ = µ_X^2
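
The reduction of the double integral to a single integral can be checked numerically (a sketch; R_X(τ) = e^{-|τ|} and t = 5 are arbitrary choices):

```python
import numpy as np

R = lambda tau: np.exp(-np.abs(tau))
t = 5.0
s = np.linspace(0, t, 2001)

# double integral (1/t^2) * integral of R(tau2 - tau1) over [0, t] x [0, t]
T1, T2 = np.meshgrid(s, s)
double = np.trapz(np.trapz(R(T2 - T1), s, axis=0), s) / t**2

# single integral (2/t^2) * integral of (t - tau) R(tau) over [0, t]
single = 2.0 / t**2 * np.trapz((t - s) * R(s), s)

print(double, single)     # the two values agree
```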

Example: Let X(t) be WSS with zero mean and R_X(τ) = e^{-|τ|}. Evaluating the condition for mean ergodicity, we obtain

(2/t^2) ∫_0^t (t - τ) R_X(τ) dτ = (2/t^2) (e^{-t} + t - 1),

which → 0 as t → ∞ (and µ_X^2 = 0). Hence X(t) is mean ergodic.

Example: Consider the coin with random bias P example in Lecture Notes 5. The random process X_1, X_2, ... is stationary. However, it is not mean ergodic, since X̄_n → P in m.s., not to the constant µ_X = E(P).

Remarks:

- The process in the above example can be viewed as a mixture of IID Bernoulli(p) processes, each of which is stationary ergodic. (It turns out that every stationary process is a mixture of stationary ergodic processes.)
- Ergodicity can be defined for general (not necessarily stationary) processes; this is beyond the scope of this course, however.
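
A quick simulation of the non-ergodic coin example (a sketch; taking the random bias P ~ U[0, 1], so µ_X = 1/2, is my own choice of distribution): across realizations, the time average X̄_n settles at the realized value of P rather than at µ_X, so Var(X̄_n) does not vanish as n grows:

```python
import numpy as np

rng = np.random.default_rng(5)
runs, n = 2000, 5000
P = rng.uniform(0, 1, size=runs)                   # one random bias per realization
flips = rng.uniform(size=(runs, n)) < P[:, None]   # Bernoulli(P) coin flips
Xbar = flips.mean(axis=1)                          # time average of each realization

print(np.mean((Xbar - 0.5) ** 2))   # ~Var(P) = 1/12: does NOT vanish with n
print(np.mean((Xbar - P) ** 2))     # ~0: the time average converges to P instead
```

The mean squared deviation from µ_X stays near Var(P) = 1/12 no matter how large n is, which is exactly the failure of mean ergodicity.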