Stochastic Processes. M. Sami Fadali, Professor of Electrical Engineering, University of Nevada, Reno


Outline
Stochastic (random) processes. Autocorrelation. Crosscorrelation. Spectral density function.

Deterministic vs. Random Signals
Deterministic signal: exactly predictable, e.g. V(t) = A cos(ωt + θ) with A, ω, t, θ ∈ R.
Random signal: associated with a chance occurrence.
a) Continuous or discrete (time series).
b) May have a deterministic structure: V(t) = A cos(ωt + θ) with ω, t ∈ R but A and θ random, e.g. A ~ U(0, 1), θ ~ N(0, 1); or with t ∈ Z (integer).

Example: No deterministic structure.
[Figure: a sample noise waveform X(t) versus t with no visible pattern.]

Random Processes
Map the elements of the sample space S to the set of continuous-time functions C. For a fixed time point, the result is a random variable. Example: measurement of any physical quantity (with additive noise) over time.
X: S → C
x(t, s_i) = ordinary time function
X(t_i, s) = random variable

Random Process
[Figure: sample functions x(t, s_i) plotted versus t for several outcomes s; for a fixed time t_i, X(t_i, s) is a random variable.]

Random Sequence
Map the elements of the sample space S to the set of discrete-time functions X(n, s). For a fixed time point, the result is a random variable. Example: samples of any physical quantity (with additive noise) over time. Also called a discrete random process or time series.

Example: Random Binary Signal
Random sequence of pulses such that:
1. Rectangular pulses of fixed duration T.
2. Amplitude +1 or -1, equally likely.
3. Statistically independent amplitudes.
4. Start time D of the sequence uniformly distributed in the interval [0, T].

Random Binary Signal
[Figure: pulse train of amplitude ±1 with pulse width T and random start time D ~ U[0, T].]

Mathematical Description
X(t) = Σ_{k=-∞}^{∞} A_k p(t - kT - D)
p(t) = unit-amplitude pulse of duration T.
A_k = binary r.v. in {-1, +1} = amplitude of the k-th pulse.
D = random start time, uniformly distributed in [0, T].
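As a sketch of this construction (T = 1 and the sampling grid below are illustrative assumptions, not part of the example), the signal can be simulated directly:

```python
import numpy as np

# Sketch: simulate X(t) = sum_k A_k p(t - kT - D) for the random binary
# signal; T = 1 and the time grid are illustrative assumptions.
def binary_signal(rng, t, T=1.0):
    """Sample-path values of the random binary signal at times t."""
    t = np.asarray(t, dtype=float)
    D = rng.uniform(0.0, T)                  # random start time, D ~ U[0, T]
    k = np.floor((t - D) / T).astype(int)    # index of the pulse covering each t
    amps = rng.choice([-1.0, 1.0], size=k.max() - k.min() + 1)
    return amps[k - k.min()]                 # amplitude A_k of the active pulse

rng = np.random.default_rng(0)
x = binary_signal(rng, np.arange(0.0, 20.0, 0.01))
print(np.unique(x))                          # only the amplitudes -1 and +1 occur
```

Averaging over many realizations at a fixed time reproduces the ensemble moments computed on the next slides.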

Moments of a Random Process
Fix time to obtain a random variable, then obtain moments as functions of time.
m_X(t) = E[X(t)] = ∫ x f_X(x, t) dx
k-th moment = E[X^k(t)] = ∫ x^k f_X(x, t) dx

Properties of the Binary Signal
m_X(t) = E[X(t)] = Σ_{i=1}^{2} X_i P_i = (+1)(1/2) + (-1)(1/2) = 0
2nd moment = E[X²(t)] = (+1)²(1/2) + (-1)²(1/2) = 1
The second moment equals the variance (zero mean).
Special case: constant moments (not functions of time).

Joint Densities
f_{X_i X_j}(x_i, x_j), i ≠ j, X_i = X(t_i)
Specify how fast X(t) changes with time. Later: related to spectral content.
Higher-order densities provide more information (but are hard to compute).

Statistically Independent Random Signals
f_{X_1 X_2 Y_1 Y_2} = f_{X_1 X_2} f_{Y_1 Y_2}
X_i = X(t_i), Y_j = Y(t_j), for any choice of t_i and t_j, possibly t_i ≠ t_j.

Gaussian Random Process
All density functions (of any order) are normal. The multivariate normal density is completely specified by the mean and covariance matrix:
f_X(x) = 1 / ((2π)^{n/2} √(det C_x)) exp(-(1/2)(x - m_x)^T C_x^{-1} (x - m_x))
X = [X_1 … X_i … X_j]^T, i ≠ j, X_i = X(t_i)

Autocorrelation
R_XX(t_1, t_2) = E[X(t_1) X(t_2)] = ∫∫ x_1 x_2 f_{X_1 X_2}(x_1, x_2) dx_1 dx_2
The autocorrelation is an ensemble average using the joint density function. Recall: for fixed t, a random process is a random variable. The autocovariance is defined similarly (and equals the autocorrelation for zero mean).

Autocorrelation of the Binary Signal
Choose two time points t_1 and t_2, 0 < t_1 < t_2 < T.
D = random start time, D ~ U[0, T].
Switching point D ∉ (t_1, t_2): t_1 and t_2 in the same pulse.
Switching point D ∈ (t_1, t_2): t_1 and t_2 not in the same pulse.

0 < D < t_1 or t_2 < D < T
D ∉ (t_1, t_2): t_1 and t_2 in the same pulse.
X(t_1) = X(t_2) = ±1, each value with probability 0.5.

t_1 < D < t_2
D ∈ (t_1, t_2): t_1 and t_2 not in the same pulse.
X(t_1) and X(t_2) are independently ±1, each value with probability 0.5.

Probabilities
D is uniformly distributed in [0, T] with density 1/T.
P(t_1 < D < t_2) = (t_2 - t_1)/T, 0 < t_1 < t_2 < T
P(0 < D < t_1 or t_2 < D < T) = 1 - P(t_1 < D < t_2) = 1 - (t_2 - t_1)/T

Product of Amplitudes
The sign of the amplitude is the same if t_1 and t_2 are in the same pulse (0 < D < t_1 or t_2 < D < T). The sign can be the same or different if they are in consecutive pulses (t_1 < D < t_2).

Autocorrelation, |t_2 - t_1| < T
Pr[X(t_1) X(t_2) = 1 | t_1 in same pulse as t_2] = 1.
Use |t_2 - t_1| < T to include the case t_2 < t_1.
R_XX(t_1, t_2) = E[X(t_1) X(t_2)] = Σ over 3 cases of X(t_1)X(t_2) · P[X(t_1)X(t_2)]
= (+1)(1 - |t_2 - t_1|/T) + (+1)(0.5)(|t_2 - t_1|/T) + (-1)(0.5)(|t_2 - t_1|/T)
= 1 - |t_2 - t_1|/T

Autocorrelation, |t_2 - t_1| > T
The amplitudes are independent for |t_2 - t_1| > T, and independence implies uncorrelatedness:
R_XX(t_1, t_2) = E[X(t_1) X(t_2)] = E[X(t_1)] E[X(t_2)] = 0

Autocorrelation of the Binary Signal
R_XX(t_1, t_2) = E[X(t_1) X(t_2)] = 1 - |t_2 - t_1|/T for |t_2 - t_1| < T, and 0 for |t_2 - t_1| > T.
A function of t_2 - t_1 only.
[Figure: triangular autocorrelation, peak 1 at t_2 - t_1 = 0, falling to 0 at |t_2 - t_1| = T.]
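The triangular result can be checked by a quick Monte Carlo average over realizations (T = 1 and the two test times below are illustrative assumptions):

```python
import numpy as np

# Sketch: estimate R_XX(t1, t2) = E[X(t1)X(t2)] for the random binary signal
# by averaging over realizations; T = 1, t1 = 2.3, t2 = 2.7 are assumptions.
T, t1, t2 = 1.0, 2.3, 2.7

def signal_at(rng, times):
    D = rng.uniform(0.0, T)                      # random start time D ~ U[0, T]
    k = np.floor((np.asarray(times) - D) / T).astype(int)
    amps = rng.choice([-1.0, 1.0], size=k.max() - k.min() + 1)
    return amps[k - k.min()]                     # amplitude of the active pulse

prods = [np.prod(signal_at(np.random.default_rng(i), [t1, t2]))
         for i in range(20000)]
R_hat = np.mean(prods)
print(R_hat, 1.0 - abs(t2 - t1) / T)             # estimate vs. 1 - |t2 - t1|/T
```

With τ = 0.4 the two points share a pulse with probability 0.6, so the estimate should be close to 0.6.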

Stationary Random Process
Two definitions:
Strictly stationary random process.
Wide-sense stationary (WSS) random process.
Strictly stationary implies wide-sense stationary.

Strictly Stationary
All pdfs describing the process are unaffected by any time shift: X_i = X(t_i), i = 1, 2, …, k, and X'_i = X(t_i + τ), i = 1, 2, …, k, are governed by the same pdfs f_{X_i}, f_{X_i X_j}, …, f_{X_i ⋯ X_k}.
Wide-sense stationary: constant mean, shift-invariant autocorrelation.

Wide-Sense Stationary Random Signal
Stationarity of the mean (constant): m = E[X(t)].
Stationarity of the autocorrelation: R_XX(τ) = E[X(t) X(t + τ)] (depends on the separation τ only).

Nonstationary Signal
Y(t) = X + cos(t), X ~ N(0, 9): the mean m_Y(t) = cos(t) varies with time, so the process is not stationary.

Example: WSS Only
Equally probable outcomes Y(t) ∈ {±5 sin t, ±5 cos t}.
E[Y(t)] = 0.25(+5 sin t - 5 sin t + 5 cos t - 5 cos t) = 0
Joint pdf: assume only four equally probable joint outcomes, with two sines or two cosines of the same sign.
R_Y(t + τ, t) = (1/4)[2(25) sin t sin(t + τ) + 2(25) cos t cos(t + τ)] = 12.5 cos τ = R_Y(τ)
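Since the ensemble here has only four equally likely members, the mean and autocorrelation can be checked by direct averaging (the test time and shift below are arbitrary choices):

```python
import numpy as np

# Sketch: average directly over the four equally likely outcomes of
# Y(t) in {+5 sin t, -5 sin t, +5 cos t, -5 cos t}.
def outcomes(t):
    return np.array([5*np.sin(t), -5*np.sin(t), 5*np.cos(t), -5*np.cos(t)])

t, tau = 0.7, 1.3                                # arbitrary time and shift
mean = outcomes(t).mean()                        # ensemble mean E[Y(t)]
R = np.mean(outcomes(t) * outcomes(t + tau))     # ensemble E[Y(t)Y(t + tau)]
print(mean, R, 12.5 * np.cos(tau))               # R matches 12.5 cos(tau)
```

The mean is 0 and the autocorrelation depends on τ only, confirming wide-sense stationarity.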

Not Strict-Sense Stationary
Y(t) ∈ {±5 sin t, ±5 cos t}. Take two time points, say 0 and π/4.
Values at t = 0: Y(t) ∈ {0, ±5}.
Values at t = π/4: Y(t) ∈ {±5/√2}.
The first-order distributions are different (even though the means are the same).

Ergodic Random Processes
A single realization is enough: time average = ensemble average. Ergodicity, like stationarity, is an idealization, but it can be approximately true in practice. Ergodic signals tend to look random.

Stationarity Necessary: Explanation
A single realization has a single average for any property: all moments, the autocorrelation, etc. A time average can only equal the expected value if that expected value is constant.

Example: Stationary but Not Ergodic
Random constant amplitude: X(t) = A, with A ~ N(0, σ²).
A sample realization with amplitude A_1 ≠ 0 has time average A_1 ≠ 0 = E[X(t)].
Stationarity is not sufficient for ergodicity.

Ergodicity in the Mean
Time average: m_{x_A} = lim_{T→∞} (1/T) ∫_0^T x_A(t) dt = E[X(t)]
Condition for ergodicity in the mean: Cov_XX(τ) → 0 as τ → ∞.
For zero mean: R(τ) → 0 as τ → ∞.

Ergodicity in Autocorrelation
Time autocorrelation: R_{x_A}(τ) = lim_{T→∞} (1/T) ∫_0^T x_A(t) x_A(t + τ) dt = E[X(t) X(t + τ)] = R_XX(τ)
Ergodicity in autocorrelation requires the 4th moment: Cov_ZZ(τ_1) → 0 as τ_1 → ∞, where Z = X(t) X(t + τ).

Example: Deterministic Structure
X(t) = A sin ωt, with A ~ N(0, σ²) a random constant.
Sample realization: x_A(t) = A_1 sin ωt with A_1 fixed.
Compare the time autocorrelation with the (ensemble) autocorrelation.

Time Autocorrelation
R_{x_A}(τ) = lim_{T→∞} (1/T) ∫_0^T x_A(t) x_A(t + τ) dt
= lim_{T→∞} (A_1²/T) ∫_0^T sin ωt sin ω(t + τ) dt
= lim_{T→∞} (A_1²/2T) ∫_0^T [cos ωτ - cos ω(2t + τ)] dt
= (A_1²/2) cos ωτ
since ∫_0^T cos ω(2t + τ) dt = ∫_0^T [cos ωτ cos 2ωt - sin ωτ sin 2ωt] dt is a finite integral, and (finite integral)/T → 0.

Autocorrelation
R_XX(t_1, t_2) = E[X(t_1) X(t_2)] = E[A²] sin ωt_1 sin ωt_2 = (σ²/2)[cos ω(t_1 - t_2) - cos ω(t_1 + t_2)]
Not a function of the time shift only. Not equal to the time autocorrelation. Not an ergodic process.
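The time autocorrelation of a single realization can be verified numerically (ω, A_1, τ, and the averaging window below are illustrative assumptions):

```python
import numpy as np

# Sketch: time-average autocorrelation of one realization x_A(t) = A1 sin(wt)
# over a long window, compared with (A1^2/2) cos(w tau); values assumed.
w, A1, tau = 2.0, 1.5, 0.4
t = np.linspace(0.0, 1000.0, 2_000_001)          # long averaging window
x = A1 * np.sin(w * t)
x_shift = A1 * np.sin(w * (t + tau))
R_time = np.mean(x * x_shift)                    # approximates (1/T) * integral
print(R_time, (A1**2 / 2) * np.cos(w * tau))
```

For any fixed A_1 this yields (A_1²/2) cos ωτ, which depends on the realization through A_1², so it cannot equal the ensemble autocorrelation: the process is not ergodic.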

Properties of Autocorrelation
Useful general properties used throughout the course. Several apply to the stationary case only. Assume real scalar processes.

Mean-Square Value
From the autocorrelation: R_XX(t, t) = E[X²(t)].
For X stationary: R_XX(t, t) = R_XX(0) for all t.

Even Function
R_XX(t_1, t_2) = E[X(t_1) X(t_2)] = E[X(t_2) X(t_1)] = R_XX(t_2, t_1)
Stationary X(t): R_XX(t_2 - t_1) = R_XX(t_1 - t_2), i.e. R_XX(τ) is an even function of τ = t_2 - t_1.

Peak Value
Consider a zero-mean process w.l.o.g. (subtract the mean if it is not zero). The correlation coefficient
ρ = E[X(t) X(t + τ)] / √(E[X²(t)] E[X²(t + τ)]) = R_X(τ)/R_X(0)
satisfies |ρ| ≤ 1, so |R_X(τ)| ≤ R_X(0).

Periodic Component
If X contains a periodic component, R_XX has a periodic component of the same period, but carries no information about its phase.
X = zero-mean X_np + uncorrelated X_p, with X_p = A cos(ωt + θ), θ ~ U(0, 2π).
R_XX(τ) = E[X(t) X(t + τ)] = E[(X_np(t) + X_p(t))(X_np(t + τ) + X_p(t + τ))]
= sum of uncorrelated terms:
R_XX(τ) = R_{X_np X_np}(τ) + R_{X_p X_p}(τ)

Periodic Component (cont.)
R_{X_p X_p}(τ) = E[X_p(t) X_p(t + τ)] = A² E[cos(ωt + θ) cos(ωt + ωτ + θ)]
= (A²/2) E[cos ωτ + cos(ω(2t + τ) + 2θ)]
cos(ω(2t + τ) + 2θ) = cos ω(2t + τ) cos 2θ - sin ω(2t + τ) sin 2θ
E[cos 2θ] = E[sin 2θ] = 0 for θ ~ U(0, 2π)
R_{X_p X_p}(τ) = (A²/2) cos ωτ

Periodic Component (cont.)
R_{X_p X_p}(τ) = (A²/2) cos ωτ
R_XX(τ) = R_{X_np X_np}(τ) + R_{X_p X_p}(τ) = R_{X_np X_np}(τ) + (A²/2) cos ωτ
R_XX has a periodic component of the same period, with no information about the phase.

Zero-mean
For a zero-mean, ergodic process: R_XX(τ) → 0 as τ → ∞.
For a process with mean m_x, write X = X_zm + m_x with X_zm zero mean:
R_XX(τ) = E[X(t) X(t + τ)] = E[(X_zm(t) + m_x)(X_zm(t + τ) + m_x)] = R_{X_zm X_zm}(τ) + m_x²
R_XX(τ) → m_x² as τ → ∞

Autocorrelation for Vector x
R_XX(t_1, t_2) = E[x(t_1) x*(t_2)] = (E[x(t_2) x*(t_1)])* = R*_XX(t_2, t_1)
* = conjugate transpose (transpose for real x).
Stationary: R_XX(t_2 - t_1) = R*_XX(t_1 - t_2)

Crosscorrelation Function
R_XY(t_1, t_2) = E[X(t_1) Y(t_2)]
Stationary (skew-symmetric in τ):
R_XY(τ) = E[X(t) Y(t + τ)]
R_YX(τ) = E[Y(t) X(t + τ)] = E[X(t) Y(t - τ)] = R_XY(-τ)

Properties of Crosscorrelation
R_XY(τ) = R_YX(-τ)
|R_XY(τ)| ≤ √(R_XX(0) R_YY(0))  (from |ρ| ≤ 1)
|R_XY(τ)| ≤ (1/2)[R_XX(0) + R_YY(0)]
R_XY(τ) = 0: orthogonal processes.
R_XY(τ) = m_X m_Y: uncorrelated processes.

Example: Cross-correlation
X(t) = A sin(ωt + θ), Y(t) = B cos(ωt + θ), θ ~ U(0, 2π)
R_XY(τ) = E[X(t) Y(t + τ)] = AB E[sin(ωt + θ) cos(ω(t + τ) + θ)]
= (AB/2) E[sin(ω(2t + τ) + 2θ) - sin ωτ] = -(AB/2) sin ωτ
The zero at τ = 0 is not a maximum. Zero mean, as shown earlier.

Combination of Random Processes
Sum of two random processes: Z(t) = X(t) + Y(t)
R_ZZ(τ) = E[(X(t) + Y(t))(X(t + τ) + Y(t + τ))] = R_XX(τ) + R_XY(τ) + R_YX(τ) + R_YY(τ)
If X and Y are uncorrelated and at least one is zero-mean:
R_ZZ(τ) = R_XX(τ) + R_YY(τ)

Time Delay Estimation
Transmit a signal to an object and receive the reflected signal. Find the cross-correlation of the two signals; the location of its peak gives the time delay.
R_XY(τ) = E[X(t)(X(t + τ - T_d) + n(t))] = R_XX(τ - T_d)
[Figure: cross-correlation with a triangular peak centered at τ = T_d.]
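A minimal sketch of the idea (the sampling rate, true delay, and noise level are illustrative assumptions): transmit white noise, delay and corrupt it, and locate the cross-correlation peak.

```python
import numpy as np

# Sketch: estimate a time delay by locating the peak of the cross-correlation
# between a transmitted noise signal and its delayed, noisy echo.
rng = np.random.default_rng(2)
fs = 1000.0                        # sampling rate, Hz (assumed)
delay = 0.050                      # true delay Td = 50 ms (assumed)
x = rng.standard_normal(5000)      # transmitted signal (white noise)
d = int(delay * fs)
y = np.concatenate([np.zeros(d), x[:-d]]) + 0.5 * rng.standard_normal(5000)

lags = np.arange(-len(x) + 1, len(y))
rxy = np.correlate(y, x, mode="full")     # cross-correlation estimate
Td_hat = lags[np.argmax(rxy)] / fs        # lag of the peak = delay estimate
print(Td_hat)                             # close to 0.050 s
```

The peak stands far above the off-peak correlation values, so the estimate is reliable even with substantial additive noise.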

Power Spectral Density Function (SDF)
For a stationary process, the PSD is the Fourier transform of the autocorrelation:
S_XX(jω) = F{R_XX(τ)} = ∫_{-∞}^{∞} R_XX(τ) e^{-jωτ} dτ
R_XX(τ) = (1/2π) ∫_{-∞}^{∞} S_XX(jω) e^{jωτ} dω
(Wiener-Khinchine relation)
In the s-domain (two-sided Laplace transform):
S_XX(s) = L₂{R_XX(τ)} = ∫_{-∞}^{∞} R_XX(τ) e^{-sτ} dτ

Example
R_XX(τ) = σ² e^{-β|τ|}
S_XX(jω) = F{R_XX(τ)} = 2σ²β / (ω² + β²)
S_XX(s) = L₂{R_XX(τ)} = 2σ²β / (β² - s²)
This is the same autocorrelation as a Gauss-Markov process, but the autocorrelation does not tell the whole story.
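This transform pair can be verified numerically (σ², β, and the evaluation frequency below are illustrative assumptions):

```python
import numpy as np

# Sketch: numerically Fourier-transform R_XX(tau) = sigma2 * exp(-beta|tau|)
# and compare with 2 sigma2 beta / (w^2 + beta^2); parameter values assumed.
sigma2, beta, w = 2.0, 0.5, 1.2
tau = np.linspace(-60.0, 60.0, 600_001)          # fine grid, long enough tails
dtau = tau[1] - tau[0]
R = sigma2 * np.exp(-beta * np.abs(tau))
S_num = np.sum(R * np.cos(w * tau)) * dtau       # R is even: cosine transform
S_ana = 2 * sigma2 * beta / (w**2 + beta**2)
print(S_num, S_ana)
```

Because R_XX(τ) is even, the imaginary part of the transform vanishes and only the cosine integral remains.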

Autocorrelation and PSD
[Figure: an autocorrelation R(τ) and the corresponding PSD S(jω) as a Fourier-transform pair.]

Mean-Square Value of Output
R_XX(τ) = (1/2π) ∫_{-∞}^{∞} S_XX(jω) e^{jωτ} dω
For τ = 0:
R_XX(0) = E[X²(t)] = (1/2π) ∫_{-∞}^{∞} S_XX(jω) dω = (1/2πj) ∫_{-j∞}^{j∞} S_XX(s) ds

Power in a Finite Band
The power spectral density gives the distribution of power over frequency.
Power for ω ∈ [ω_1, ω_2]:
(1/2π) ∫_{-ω_2}^{-ω_1} S_XX(jω) dω + (1/2π) ∫_{ω_1}^{ω_2} S_XX(jω) dω

Example
S_XX(jω) = 2σ²β / (ω² + β²)
Power for ω ∈ [ω_1, ω_2] (using integration tables):
(2/2π) ∫_{ω_1}^{ω_2} 2σ²β/(ω² + β²) dω = (2σ²β/π)(1/β)[tan⁻¹(ω/β)]_{ω_1}^{ω_2}
Power = (2σ²/π)[tan⁻¹(ω_2/β) - tan⁻¹(ω_1/β)]
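The arctangent expression can be cross-checked by direct numerical integration of the PSD over the band (parameter values below are illustrative assumptions):

```python
import numpy as np

# Sketch: band power from the PSD S_XX(jw) = 2 sigma2 beta/(w^2 + beta^2),
# integrating over both the positive and negative bands; values assumed.
sigma2, beta, w1, w2 = 2.0, 0.5, 0.3, 2.0
w = np.linspace(w1, w2, 200_001)
S = 2 * sigma2 * beta / (w**2 + beta**2)
P_num = 2 * np.sum(S) * (w[1] - w[0]) / (2*np.pi)   # factor 2: +/- bands
P_ana = (2*sigma2/np.pi) * (np.arctan(w2/beta) - np.arctan(w1/beta))
print(P_num, P_ana)
```

The factor of 2 accounts for the symmetric negative-frequency band [-ω_2, -ω_1], over which the even PSD contributes equally.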

Example: Mean Square
E[X²(t)] = (1/2π) ∫_{-∞}^{∞} S_XX(jω) dω = (1/2πj) ∫_{-j∞}^{j∞} S_XX(s) ds
= (1/2π) ∫_{-∞}^{∞} 2σ²β/(ω² + β²) dω = (σ²/π)[tan⁻¹(ω/β)]_{-∞}^{∞} = (σ²/π)(π) = σ²
Later: Table 3.1 for the s-domain integral.

Cross Spectral Density Function
S_XY(jω) = F{R_XY(τ)} = ∫_{-∞}^{∞} R_XY(τ) e^{-jωτ} dτ
Using the skew-symmetry of the cross-correlation, R_XY(τ) = R_YX(-τ), and the change of variable τ' = -τ:
S_XY(jω) = ∫_{-∞}^{∞} R_YX(τ') e^{jωτ'} dτ' = S_YX(-jω)

Properties of the SDF
The autocorrelation is real and even.
Real f(τ): F{f(τ)} = even + j·odd. Even f(τ): F{f(τ)} = even.
Hence S_XX(jω) = S_XX(ω²) is real and even.
Nonnegative: its integral over a band gives the power in that band.

Spectral Density of a Sum
Z(t) = X(t) + Y(t)
R_ZZ(τ) = R_XX(τ) + R_XY(τ) + R_YX(τ) + R_YY(τ)
S_ZZ(jω) = S_XX(jω) + S_XY(jω) + S_YX(jω) + S_YY(jω)
Zero cross-correlation (S_XY(jω) = 0), e.g. X and Y uncorrelated with X or Y zero mean:
S_ZZ(jω) = S_XX(jω) + S_YY(jω)

Coherence Function
γ²_XY(jω) = |S_XY(jω)|² / (S_XX(jω) S_YY(jω)) ≤ 1, for all ω
γ²_XY(jω) = 1 if X(t) = Y(t); γ²_XY(jω) = 0 if X(t) and Y(t) have zero cross-correlation.
Used in applications (e.g. acoustics).
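As an illustration of the two extremes (assuming SciPy is available; the signal lengths, seed, and segment size are arbitrary choices), `scipy.signal.coherence` estimates γ²_XY over frequency:

```python
import numpy as np
from scipy import signal

# Sketch: estimated magnitude-squared coherence for two extreme cases;
# lengths, seed, and nperseg are arbitrary assumptions.
rng = np.random.default_rng(3)
x = rng.standard_normal(4096)
n = rng.standard_normal(4096)

f, C_xx = signal.coherence(x, x, fs=1.0, nperseg=256)   # identical signals
f, C_xn = signal.coherence(x, n, fs=1.0, nperseg=256)   # independent signals
print(C_xx.min(), C_xn.mean())   # 1 for x with itself, small for independent noise
```

Estimated coherence between independent signals is small but not exactly zero, since the estimate is averaged over finitely many segments.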

Conclusion
Probabilistic description of random signals. Autocorrelation and crosscorrelation. Power spectral density function. Experimental determination: necessary but difficult.

References
R. G. Brown and P. Y. C. Hwang, Introduction to Random Signals and Applied Kalman Filtering, 3rd ed., J. Wiley, NY, 1997.
G. R. Cooper and C. D. McGillem, Probabilistic Methods of Signal and System Analysis, Oxford Univ. Press, NY, 1999.
The binary signal example is from K. S. Shanmugan and A. M. Breipohl, Detection, Estimation & Data Analysis, J. Wiley, NY, 1988, pp. 132-134.