EAS 305 Random Processes Viewgraph 1 of 10

Definitions: Random Processes

A random process is a family of random variables $\{X(t, \lambda)\}$ indexed by a parameter $t \in T$, where $T$ is called the index set.

For each experiment outcome $\lambda_i$, the result is a whole function of time, $X(t, \lambda_i) = x_i(t)$. This real-valued function is called a sample function. The set of all sample functions is an ensemble.

[Figure: three sample functions $x_1(t)$, $x_2(t)$, $x_3(t)$, one per outcome $\lambda_1, \lambda_2, \lambda_3$. At a fixed time $t_1$, the values $x_1(t_1), x_2(t_1), x_3(t_1)$ are realizations of the random variable $X_1 = X(t_1)$.]
EAS 305 Random Processes Viewgraph 2 of 10

Statistics of Random Processes

I. Distributions and Densities

Consider a random process $X(t)$. For a particular value of $t$, say $t_1$, we have a random variable $X(t_1) = X_1$.

The distribution function of this random variable, $F_X(x_1; t_1) = P\{X(t_1) \le x_1\}$, is called the first-order distribution of $X(t)$. The corresponding first-order density function is
$f_X(x_1; t_1) = \dfrac{\partial F_X(x_1; t_1)}{\partial x_1}$.

For $t_1$ and $t_2$, we get two random variables $X(t_1) = X_1$ and $X(t_2) = X_2$. Their joint distribution is called the second-order distribution:
$F_X(x_1, x_2; t_1, t_2) = P\{X(t_1) \le x_1,\; X(t_2) \le x_2\}$,
with corresponding second-order density function
$f_X(x_1, x_2; t_1, t_2) = \dfrac{\partial^2 F_X(x_1, x_2; t_1, t_2)}{\partial x_1\, \partial x_2}$.

The $n$th-order distribution and density functions are given by
$F_X(x_1, \dots, x_n; t_1, \dots, t_n) = P\{X(t_1) \le x_1, \dots, X(t_n) \le x_n\}$
and
$f_X(x_1, \dots, x_n; t_1, \dots, t_n) = \dfrac{\partial^n F_X(x_1, \dots, x_n; t_1, \dots, t_n)}{\partial x_1 \cdots \partial x_n}$.
EAS 305 Random Processes Viewgraph 3 of 10

II. Statistical Averages (ensemble averages)

The mean of $X(t)$ is defined by
$\bar{X}(t) = E[X(t)] = \int_{-\infty}^{\infty} x\, f_X(x; t)\, dx$.

The autocorrelation of $X(t)$ is defined by
$R_{XX}(t_1, t_2) = E[X(t_1) X(t_2)] = \int_{-\infty}^{\infty}\!\int_{-\infty}^{\infty} x_1 x_2\, f_X(x_1, x_2; t_1, t_2)\, dx_1\, dx_2$.

The autocovariance of $X(t)$ is defined by
$C_{XX}(t_1, t_2) = E\{[X(t_1) - \bar{X}(t_1)][X(t_2) - \bar{X}(t_2)]\} = R_{XX}(t_1, t_2) - \bar{X}(t_1)\bar{X}(t_2)$.

The $n$th joint moment of $X(t)$ is defined by
$E[X(t_1) X(t_2) \cdots X(t_n)] = \int_{-\infty}^{\infty} \cdots \int_{-\infty}^{\infty} x_1 x_2 \cdots x_n\, f_X(x_1, \dots, x_n; t_1, \dots, t_n)\, dx_1 \cdots dx_n$.

III. Stationarity

Strict-Sense Stationarity

A random process $X(t)$ is called strict-sense stationary (SSS) if its statistics are invariant with respect to any time shift, i.e.,
$f_X(x_1, \dots, x_n; t_1, \dots, t_n) = f_X(x_1, \dots, x_n; t_1 + c, \dots, t_n + c)$ for any $c$.

It follows that $f_X(x_1; t_1) = f_X(x_1; t_1 + c)$ for any $c$; hence the first-order density of an SSS $X(t)$ is independent of time $t$: $f_X(x_1; t) = f_X(x_1)$.

Similarly, $f_X(x_1, x_2; t_1, t_2) = f_X(x_1, x_2; t_1 + c, t_2 + c)$. By setting $c = -t_1$, we get $f_X(x_1, x_2; t_1, t_2) = f_X(x_1, x_2; t_2 - t_1)$. Thus, if $X(t)$ is SSS, the joint density of the random variables $X(t)$ and $X(t + \tau)$ is independent of $t$ and depends only on the time difference $\tau = t_2 - t_1$.
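The ensemble averages defined above can be approximated numerically by averaging across many sample functions at fixed times. A minimal sketch in Python/NumPy, using the illustrative process $X(t) = A\cos t$ with $A \sim N(0,1)$ (a hypothetical example, not from the slides), for which $E[X(t)] = 0$ and $R_{XX}(t_1, t_2) = \cos t_1 \cos t_2$:

```python
import numpy as np

rng = np.random.default_rng(0)

# Ensemble of sample functions for X(t) = A*cos(t), A ~ N(0, 1).
# Each row of X is one sample function x_i(t).
n_samples = 100_000
t = np.linspace(0.0, 2.0 * np.pi, 64)
A = rng.normal(0.0, 1.0, size=(n_samples, 1))
X = A * np.cos(t)

# Ensemble mean E[X(t)]: average across sample functions at each fixed t.
mean_est = X.mean(axis=0)                 # should be near 0 for every t

# Ensemble autocorrelation R_XX(t1, t2) = E[X(t1) X(t2)] = cos(t1) cos(t2).
i1, i2 = 5, 20
R_est = np.mean(X[:, i1] * X[:, i2])
R_true = np.cos(t[i1]) * np.cos(t[i2])
```

With $10^5$ sample functions the estimates agree with theory to a few parts in a hundred; this is exactly the "average across the ensemble at a fixed $t$" picture the definitions describe.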
EAS 305 Random Processes Viewgraph 4 of 10

Wide-Sense Stationarity

A random process $X(t)$ is said to be wide-sense stationary (WSS) if its mean is constant (independent of time),
$E[X(t)] = \bar{X}$,
and its autocorrelation depends only on the time difference $\tau$:
$E[X(t) X(t + \tau)] = R_{XX}(\tau)$.

As a result, the autocovariance of a WSS process also depends only on the time difference $\tau$:
$C_{XX}(\tau) = R_{XX}(\tau) - \bar{X}^2$.

Setting $\tau = 0$ in $E[X(t) X(t + \tau)] = R_{XX}(\tau)$ gives $E[X^2(t)] = R_{XX}(0)$. The average power of a WSS process is independent of time $t$ and equals $R_{XX}(0)$.

An SSS process is WSS, but a WSS process is not necessarily SSS.

Two processes $X(t)$ and $Y(t)$ are jointly wide-sense stationary (jointly WSS) if each is WSS and their cross-correlation depends only on the time difference $\tau$:
$R_{XY}(t, t + \tau) = E[X(t) Y(t + \tau)] = R_{XY}(\tau)$.
The cross-covariance of jointly WSS $X(t)$ and $Y(t)$ then also depends only on $\tau$:
$C_{XY}(\tau) = R_{XY}(\tau) - \bar{X}\bar{Y}$.

IV. Time Averages and Ergodicity

The time-averaged mean of a sample function $x(t)$ of a random process $X(t)$ is defined as
$\langle x(t) \rangle = \lim_{T \to \infty} \dfrac{1}{2T} \int_{-T}^{T} x(t)\, dt$,
where $\langle \cdot \rangle$ denotes time averaging. Similarly, the time-averaged autocorrelation of the sample function $x(t)$ is
EAS 305 Random Processes Viewgraph 5 of 10

$\langle x(t)\, x(t + \tau) \rangle = \lim_{T \to \infty} \dfrac{1}{2T} \int_{-T}^{T} x(t)\, x(t + \tau)\, dt$.

Both $\langle x(t) \rangle$ and $\langle x(t)\, x(t + \tau) \rangle$ are random variables, since they depend on which sample function resulted from the experiment. If $X(t)$ is stationary, then
$E[\langle x(t) \rangle] = \lim_{T \to \infty} \dfrac{1}{2T} \int_{-T}^{T} E[x(t)]\, dt = \bar{X}$:
the expected value of the time-averaged mean equals the ensemble mean. Also,
$E[\langle x(t)\, x(t + \tau) \rangle] = \lim_{T \to \infty} \dfrac{1}{2T} \int_{-T}^{T} E[x(t)\, x(t + \tau)]\, dt = R_{XX}(\tau)$:
the expected value of the time-averaged autocorrelation equals the ensemble autocorrelation.

A random process $X(t)$ is ergodic if its time averages are the same for all sample functions and are equal to the corresponding ensemble averages. In an ergodic process, all statistics can be obtained from a single sample function.

A stationary process $X(t)$ is called ergodic in the mean if $\langle x(t) \rangle = \bar{X}$, and ergodic in the autocorrelation if $\langle x(t)\, x(t + \tau) \rangle = R_{XX}(\tau)$.

Correlations and Power Spectral Densities

Assume all random processes are WSS.

I. Autocorrelation $R_{XX}(\tau)$:
$R_{XX}(\tau) = E[X(t) X(t + \tau)]$.
Properties of $R_{XX}(\tau)$: $R_{XX}(-\tau) = R_{XX}(\tau)$; $|R_{XX}(\tau)| \le R_{XX}(0)$; $R_{XX}(0) = E[X^2(t)]$.
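The ergodicity statements above can be checked concretely with the classic random-phase sinusoid $X(t) = \cos(t + \Theta)$, $\Theta$ uniform on $[0, 2\pi)$, which is WSS with $\bar{X} = 0$ and $R_{XX}(\tau) = \frac{1}{2}\cos\tau$, and is ergodic in the mean and the autocorrelation. A sketch (the choice of process is illustrative, not from the slides):

```python
import numpy as np

rng = np.random.default_rng(1)

# One experiment outcome -> one sample function of X(t) = cos(t + Theta).
dt = 0.01
t = np.arange(0.0, 2000.0, dt)          # long window approximating T -> infinity
theta = rng.uniform(0.0, 2.0 * np.pi)
x = np.cos(t + theta)

# Time-averaged mean <x(t)>: should approach the ensemble mean 0.
time_mean = x.mean()

# Time-averaged autocorrelation <x(t) x(t+tau)> at tau = 0.5.
tau = 0.5
k = int(round(tau / dt))
time_autocorr = np.mean(x[:-k] * x[k:])

R_true = 0.5 * np.cos(tau)              # ensemble autocorrelation R_XX(tau)
```

For this process the time averages computed from a single sample function match the ensemble averages no matter which phase was drawn, which is precisely what ergodicity asserts.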
EAS 305 Random Processes Viewgraph 6 of 10

II. Cross-Correlation $R_{XY}(\tau)$:
$R_{XY}(\tau) = E[X(t) Y(t + \tau)]$.
Properties of $R_{XY}(\tau)$: $R_{XY}(-\tau) = R_{YX}(\tau)$; $|R_{XY}(\tau)| \le \sqrt{R_{XX}(0)\, R_{YY}(0)}$; $|R_{XY}(\tau)| \le \frac{1}{2}[R_{XX}(0) + R_{YY}(0)]$.

III. Autocovariance $C_{XX}(\tau)$: $C_{XX}(\tau) = R_{XX}(\tau) - \bar{X}^2$.

IV. Cross-Covariance $C_{XY}(\tau)$: $C_{XY}(\tau) = R_{XY}(\tau) - \bar{X}\bar{Y}$.

Two processes are (mutually) orthogonal if $R_{XY}(\tau) = 0$, and uncorrelated if $C_{XY}(\tau) = 0$.

V. Power Spectral Density $S_{XX}(\omega)$:

The power spectral density $S_{XX}(\omega)$ is the Fourier transform of $R_{XX}(\tau)$:
$S_{XX}(\omega) = \int_{-\infty}^{\infty} R_{XX}(\tau)\, e^{-j\omega\tau}\, d\tau$,
so that
$R_{XX}(\tau) = \dfrac{1}{2\pi} \int_{-\infty}^{\infty} S_{XX}(\omega)\, e^{j\omega\tau}\, d\omega$.

Properties: $S_{XX}(\omega)$ is real; $S_{XX}(\omega) \ge 0$; it is an even function, $S_{XX}(-\omega) = S_{XX}(\omega)$; and (a Parseval-type relation)
$\dfrac{1}{2\pi} \int_{-\infty}^{\infty} S_{XX}(\omega)\, d\omega = R_{XX}(0) = E[X^2(t)]$.

VI. Cross Spectral Densities:
$S_{XY}(\omega) = \int_{-\infty}^{\infty} R_{XY}(\tau)\, e^{-j\omega\tau}\, d\tau$ and
$S_{YX}(\omega) = \int_{-\infty}^{\infty} R_{YX}(\tau)\, e^{-j\omega\tau}\, d\tau$.
Therefore
$R_{XY}(\tau) = \dfrac{1}{2\pi} \int_{-\infty}^{\infty} S_{XY}(\omega)\, e^{j\omega\tau}\, d\omega$ and
$R_{YX}(\tau) = \dfrac{1}{2\pi} \int_{-\infty}^{\infty} S_{YX}(\omega)\, e^{j\omega\tau}\, d\omega$.
Since $R_{XY}(\tau) = R_{YX}(-\tau)$, it follows that $S_{XY}(\omega) = S_{YX}(-\omega) = S_{YX}^{*}(\omega)$.
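The Fourier-pair relation between $R_{XX}$ and $S_{XX}$, and the Parseval-type property, can be verified numerically. A sketch using the standard transform pair $R_{XX}(\tau) = e^{-|\tau|} \leftrightarrow S_{XX}(\omega) = 2/(1+\omega^2)$ (an illustrative choice of autocorrelation, not from the slides), evaluating the defining integrals by direct quadrature:

```python
import numpy as np

# Autocorrelation R_XX(tau) = exp(-|tau|); its PSD is S_XX(w) = 2/(1 + w^2).
dtau = 0.001
tau = np.arange(-50.0, 50.0, dtau)      # e^{-50} makes truncation negligible
R = np.exp(-np.abs(tau))

# S_XX(w) = integral of R(tau) e^{-j w tau} d tau, by Riemann sum.
w = 3.0
S_num = (np.sum(R * np.exp(-1j * w * tau)) * dtau).real
S_true = 2.0 / (1.0 + w**2)

# Parseval-type check: (1/2pi) * integral of S_XX(w) dw = R_XX(0) = 1.
wgrid = np.arange(-2000.0, 2000.0, 0.01)
power = np.sum(2.0 / (1.0 + wgrid**2)) * 0.01 / (2.0 * np.pi)
```

Both checks agree with the closed forms to better than one part in a thousand, confirming the forward transform and the power relation on this example.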
EAS 305 Random Processes Viewgraph 7 of 10

Random Processes in Linear Systems

I. System Response: For an LTI system with impulse response $h(t)$, the output is
$Y(t) = L[X(t)] = h(t) * X(t) = \int_{-\infty}^{\infty} h(\zeta)\, X(t - \zeta)\, d\zeta$.

II. Mean and Autocorrelation of the Output:
$E[Y(t)] = \bar{Y}(t) = E\!\left[\int h(\zeta)\, X(t - \zeta)\, d\zeta\right] = \int h(\zeta)\, E[X(t - \zeta)]\, d\zeta = \int h(\zeta)\, \bar{X}(t - \zeta)\, d\zeta = h(t) * \bar{X}(t)$.

$R_{YY}(t_1, t_2) = E[Y(t_1) Y(t_2)] = E\!\left[\iint h(\zeta)\, h(\mu)\, X(t_1 - \zeta)\, X(t_2 - \mu)\, d\zeta\, d\mu\right] = \iint h(\zeta)\, h(\mu)\, E[X(t_1 - \zeta)\, X(t_2 - \mu)]\, d\zeta\, d\mu = \iint h(\zeta)\, h(\mu)\, R_{XX}(t_1 - \zeta,\, t_2 - \mu)\, d\zeta\, d\mu$.

If the input is WSS, then
$E[Y(t)] = \bar{Y} = \int h(\zeta)\, \bar{X}\, d\zeta = \bar{X} \int h(\zeta)\, d\zeta = \bar{X}\, H(0)$:
the mean of the output is a constant. Also,
$R_{YY}(t_1, t_2) = \iint h(\zeta)\, h(\mu)\, R_{XX}(t_2 - t_1 + \zeta - \mu)\, d\zeta\, d\mu$,
so $Y(t)$ is WSS with
$R_{YY}(\tau) = \iint h(\zeta)\, h(\mu)\, R_{XX}(\tau + \zeta - \mu)\, d\zeta\, d\mu$.

III. Power Spectral Density of the Output:
$S_{YY}(\omega) = \int R_{YY}(\tau)\, e^{-j\omega\tau}\, d\tau = \iiint h(\zeta)\, h(\mu)\, R_{XX}(\tau + \zeta - \mu)\, e^{-j\omega\tau}\, d\tau\, d\zeta\, d\mu = |H(\omega)|^2\, S_{XX}(\omega)$,
so
$R_{YY}(\tau) = \dfrac{1}{2\pi} \int S_{YY}(\omega)\, e^{j\omega\tau}\, d\omega = \dfrac{1}{2\pi} \int |H(\omega)|^2\, S_{XX}(\omega)\, e^{j\omega\tau}\, d\omega$.
The average power of the output is
$E[Y^2(t)] = R_{YY}(0) = \dfrac{1}{2\pi} \int |H(\omega)|^2\, S_{XX}(\omega)\, d\omega$.
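A discrete-time analogue makes the output-power result easy to check: feeding unit-variance white noise through the one-pole filter $y[n] = a\,y[n-1] + x[n]$ (an illustrative system, not from the slides) gives, by the discrete counterpart of $S_{YY} = |H|^2 S_{XX}$, an output power $R_{YY}(0) = 1/(1-a^2)$. A sketch:

```python
import numpy as np

rng = np.random.default_rng(2)

# White input with unit variance.
a = 0.9
n = 500_000
x = rng.normal(0.0, 1.0, n)

# One-pole LTI filter: y[n] = a*y[n-1] + x[n].
y = np.empty(n)
y[0] = x[0]
for i in range(1, n):
    y[i] = a * y[i - 1] + x[i]

# Output average power, discarding the start-up transient.
power_est = np.mean(y[n // 10:] ** 2)
power_true = 1.0 / (1.0 - a**2)        # predicted by S_YY = |H|^2 S_XX
```

The empirical output power agrees with the spectral prediction to within a few percent, even though the calculation never forms a spectrum explicitly.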
EAS 305 Random Processes Viewgraph 8 of 10

Special Classes of Random Processes

I. Gaussian Random Process

II. White Noise: A random process $X(t)$ is white noise if $S_{XX}(\omega) = \dfrac{\eta}{2}$. Its autocorrelation is $R_{XX}(\tau) = \dfrac{\eta}{2}\, \delta(\tau)$.

III. Band-Limited White Noise: A random process $X(t)$ is band-limited white noise if
$S_{XX}(\omega) = \begin{cases} \dfrac{\eta}{2}, & |\omega| \le \omega_B \\[4pt] 0, & |\omega| > \omega_B \end{cases}$
Then
$R_{XX}(\tau) = \dfrac{1}{2\pi} \int_{-\omega_B}^{\omega_B} \dfrac{\eta}{2}\, e^{j\omega\tau}\, d\omega = \dfrac{\eta\, \omega_B}{2\pi} \cdot \dfrac{\sin \omega_B \tau}{\omega_B \tau}$.
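The band-limited white-noise autocorrelation can be confirmed by integrating the flat spectrum directly. A sketch with the arbitrary illustrative values $\eta = 2$, $\omega_B = 10$ (not from the slides):

```python
import numpy as np

# Band-limited white noise: S_XX(w) = eta/2 for |w| <= wB, 0 otherwise.
eta, wB = 2.0, 10.0

# R_XX(tau) = (1/2pi) * integral_{-wB}^{wB} (eta/2) e^{j w tau} dw,
# evaluated by Riemann sum and compared with the closed form
# (eta*wB / 2pi) * sin(wB*tau) / (wB*tau).
dw = 0.001
w = np.arange(-wB, wB, dw)
tau = 0.37                              # arbitrary test lag
R_num = ((eta / 2.0) * np.sum(np.exp(1j * w * tau)) * dw).real / (2.0 * np.pi)
R_true = (eta * wB / (2.0 * np.pi)) * np.sin(wB * tau) / (wB * tau)
```

The numerical inverse transform matches the sinc-shaped closed form; note that $R_{XX}(\tau)$ vanishes at lags $\tau = k\pi/\omega_B$, so samples taken at that spacing are uncorrelated.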
EAS 305 Random Processes Viewgraph 9 of 10

IV. Narrowband Random Process: Let $X(t)$ be a WSS process with zero mean, and let its power spectral density $S_{XX}(\omega)$ be nonzero only in a narrow frequency band of width $W$ that is very small compared to a center frequency $\omega_c$. Then $X(t)$ is a narrowband random process.

When white or broadband noise is passed through a narrowband linear filter, narrowband noise results. When a sample function of the output is viewed on an oscilloscope, the observed waveform appears as a sinusoid of random amplitude and phase.

Narrowband noise is conveniently represented by
$X(t) = V(t) \cos[\omega_c t + \phi(t)]$,
where $V(t)$ and $\phi(t)$ are the envelope function and phase function, respectively.

From the trigonometric identity for the cosine of a sum we get the quadrature representation of the process:
$X(t) = V(t) \cos\phi(t) \cos\omega_c t - V(t) \sin\phi(t) \sin\omega_c t = X_c(t) \cos\omega_c t - X_s(t) \sin\omega_c t$,
where
$X_c(t) = V(t) \cos\phi(t)$ is the in-phase component,
$X_s(t) = V(t) \sin\phi(t)$ is the quadrature component, and
$V(t) = \sqrt{X_c^2(t) + X_s^2(t)}, \qquad \phi(t) = \tan^{-1}\dfrac{X_s(t)}{X_c(t)}$.

To detect the in-phase component $X_c(t)$ and the quadrature component $X_s(t)$ from $X(t)$, we use:
EAS 305 Random Processes Viewgraph 10 of 10

               +-----------------+
  X(t) --(x)-->| Low-pass filter |--> X_c(t)
        2 cos(w_c t)
               +-----------------+
  X(t) --(x)-->| Low-pass filter |--> X_s(t)
       -2 sin(w_c t)

$X(t)$ is multiplied by $2\cos\omega_c t$ and low-pass filtered to give $X_c(t)$; multiplying by $-2\sin\omega_c t$ and low-pass filtering gives $X_s(t)$.

Properties of $X_c(t)$ and $X_s(t)$:

1. Same power spectrum:
$S_{X_c}(\omega) = S_{X_s}(\omega) = \begin{cases} S_{XX}(\omega - \omega_c) + S_{XX}(\omega + \omega_c), & |\omega| \le W \\ 0, & \text{else} \end{cases}$

2. Same mean and variance as $X(t)$: $\bar{X}_c = \bar{X}_s = \bar{X} = 0$, and $\sigma_{X_c}^2 = \sigma_{X_s}^2 = \sigma_X^2$.

3. $X_c(t)$ and $X_s(t)$ are uncorrelated: $E[X_c(t)\, X_s(t)] = 0$.

4. If the input process is Gaussian, then so are the in-phase and quadrature components.

5. If $X(t)$ is Gaussian, then for a fixed $t$, $V(t)$ is a random variable with a Rayleigh distribution, and $\phi(t)$ is a random variable uniformly distributed over $[0, 2\pi]$.
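The detection scheme in the diagram can be simulated directly: mix the narrowband signal with $2\cos\omega_c t$ and $-2\sin\omega_c t$, then low-pass filter. The sketch below uses constant components and a crude moving-average low-pass filter (both illustrative choices, not from the slides):

```python
import numpy as np

# Narrowband signal X(t) = Xc(t) cos(wc t) - Xs(t) sin(wc t),
# here with constant in-phase/quadrature components for simplicity.
fc = 50.0                               # carrier frequency in Hz (illustrative)
dt = 1e-4
t = np.arange(0.0, 1.0, dt)
Xc, Xs = 0.8, -0.3
x = Xc * np.cos(2 * np.pi * fc * t) - Xs * np.sin(2 * np.pi * fc * t)

def lowpass(sig, width):
    """Crude low-pass filter: moving average over `width` samples."""
    return np.convolve(sig, np.ones(width) / width, mode="same")

# Averaging over one carrier period removes the 2*fc mixing products.
width = int(round(1.0 / (fc * dt)))
xc_hat = lowpass(2.0 * x * np.cos(2 * np.pi * fc * t), width)
xs_hat = lowpass(-2.0 * x * np.sin(2 * np.pi * fc * t), width)

mid = slice(len(t) // 4, 3 * len(t) // 4)   # ignore filter edge effects
```

The factor of 2 in the mixers compensates for the factor $\tfrac{1}{2}$ produced when $\cos^2$ and $\sin^2$ terms are averaged; without it the filter outputs would be $X_c/2$ and $X_s/2$.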