EAS 305 Random Processes Viewgraph 1 of 10. Random Processes


EAS 305 Random Processes Viewgraph 1 of 10

Definitions: Random Processes

A random process is a family of random variables indexed by a parameter $t \in T$, where $T$ is called the index set. Each experiment outcome $\lambda_i$ yields a whole function $X(t, \lambda_i) = x_i(t)$; this real-valued function of $t$ is called a sample function. The set of all sample functions is an ensemble.

[Figure: three sample functions $x_1(t)$, $x_2(t)$, $x_3(t)$ corresponding to outcomes $\lambda_1$, $\lambda_2$, $\lambda_3$; their values at the fixed times $t_1$ and $t_2$ form the random variables $X_1 = X(t_1)$ and $X_2 = X(t_2)$.]

EAS 305 Random Processes Viewgraph 2 of 10

Statistics of Random Processes

I. Distributions and Densities

Consider a random process $X(t)$. For a particular value of $t$, say $t_1$, we have a random variable $X(t_1) = X_1$. The distribution function of this random variable, $F_X(x_1; t_1) = P\{X(t_1) \le x_1\}$, is called the first-order distribution of $X(t)$. The corresponding first-order density function is $f_X(x_1; t_1) = \partial F_X(x_1; t_1)/\partial x_1$.

For $t_1$ and $t_2$, we get two random variables $X(t_1) = X_1$ and $X(t_2) = X_2$. Their joint distribution is called the second-order distribution: $F_X(x_1, x_2; t_1, t_2) = P\{X(t_1) \le x_1, X(t_2) \le x_2\}$, with corresponding second-order density function $f_X(x_1, x_2; t_1, t_2) = \partial^2 F_X(x_1, x_2; t_1, t_2)/\partial x_1\, \partial x_2$.

The $n$th-order distribution and density functions are given by $F_X(x_1, \ldots, x_n; t_1, \ldots, t_n) = P\{X(t_1) \le x_1, \ldots, X(t_n) \le x_n\}$ and $f_X(x_1, \ldots, x_n; t_1, \ldots, t_n) = \partial^n F_X(x_1, \ldots, x_n; t_1, \ldots, t_n)/\partial x_1 \cdots \partial x_n$.
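The first-order distribution can be estimated from an ensemble simply by counting. A minimal sketch, not from the notes: the random-phase cosine process and all parameter values below are assumptions chosen for illustration. By symmetry of the uniform phase, $F_X(0; t_1) = 1/2$ for every $t_1$.

```python
import math, random

random.seed(0)

def sample_function(theta):
    """One sample function of the illustrative process X(t) = cos(t + Theta)."""
    return lambda t: math.cos(t + theta)

def first_order_cdf(x1, t1, n=100_000):
    """Estimate F_X(x1; t1) = P{X(t1) <= x1} from an ensemble of n outcomes."""
    count = 0
    for _ in range(n):
        theta = random.uniform(0.0, 2.0 * math.pi)  # experiment outcome lambda_i
        if sample_function(theta)(t1) <= x1:
            count += 1
    return count / n

# F_X(0; t1) should be close to 1/2 regardless of the time t1 chosen.
print(abs(first_order_cdf(0.0, t1=0.3) - 0.5) < 0.01)
```

Repeating the estimate at a different $t_1$ gives the same value here, since this particular process happens to have a time-independent first-order density.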

EAS 305 Random Processes Viewgraph 3 of 10

II. Statistical Averages (Ensemble Averages)

The mean of $X(t)$ is defined by $E[X(t)] = \bar{X}(t) = \int_{-\infty}^{\infty} x\, f_X(x; t)\, dx$.

The autocorrelation of $X(t)$ is defined by $R_{XX}(t_1, t_2) = E[X(t_1)X(t_2)] = \int\!\!\int x_1 x_2\, f_X(x_1, x_2; t_1, t_2)\, dx_1\, dx_2$.

The autocovariance of $X(t)$ is defined by $C_{XX}(t_1, t_2) = E\{[X(t_1) - \bar{X}(t_1)][X(t_2) - \bar{X}(t_2)]\} = R_{XX}(t_1, t_2) - \bar{X}(t_1)\bar{X}(t_2)$.

The $n$th joint moment of $X(t)$ is defined by $E[X(t_1) X(t_2) \cdots X(t_n)] = \int \cdots \int x_1 x_2 \cdots x_n\, f_X(x_1, \ldots, x_n; t_1, \ldots, t_n)\, dx_1 \cdots dx_n$.

III. Stationarity

Strict-Sense Stationarity: a random process $X(t)$ is called strict-sense stationary (SSS) if its statistics are invariant with respect to any time shift, i.e., $f_X(x_1, \ldots, x_n; t_1, \ldots, t_n) = f_X(x_1, \ldots, x_n; t_1 + c, \ldots, t_n + c)$ for any $c$.

It follows that $f_X(x_1; t_1) = f_X(x_1; t_1 + c)$ for any $c$; hence the first-order density of an SSS $X(t)$ is independent of time $t$: $f_X(x_1; t) = f_X(x_1)$. Similarly, $f_X(x_1, x_2; t_1, t_2) = f_X(x_1, x_2; t_1 + c, t_2 + c)$. By setting $c = -t_1$, we get $f_X(x_1, x_2; t_1, t_2) = f_X(x_1, x_2; t_2 - t_1)$. Thus, if $X(t)$ is SSS, the joint density of the random variables $X(t)$ and $X(t + \tau)$ is independent of $t$ and depends only on the time difference $\tau$.

EAS 305 Random Processes Viewgraph 4 of 10

Wide-Sense Stationarity: a random process $X(t)$ is said to be wide-sense stationary (WSS) if its mean is constant (independent of time), $E[X(t)] = \bar{X}$, and its autocorrelation depends only on the time difference $\tau$: $E[X(t)X(t + \tau)] = R_{XX}(\tau)$.

As a result, the autocovariance of a WSS process also depends only on the time difference $\tau$: $C_{XX}(\tau) = R_{XX}(\tau) - \bar{X}^2$.

Setting $\tau = 0$ in $E[X(t)X(t + \tau)] = R_{XX}(\tau)$ results in $E[X^2(t)] = R_{XX}(0)$: the average power of a WSS process is independent of time $t$ and equals $R_{XX}(0)$.

An SSS process is WSS, but a WSS process is not necessarily SSS.

Two processes $X(t)$ and $Y(t)$ are jointly wide-sense stationary (jointly WSS) if each is WSS and their cross-correlation depends only on the time difference $\tau$: $R_{XY}(t, t + \tau) = E[X(t)Y(t + \tau)] = R_{XY}(\tau)$. The cross-covariance of jointly WSS $X(t)$ and $Y(t)$ then also depends only on $\tau$: $C_{XY}(\tau) = R_{XY}(\tau) - \bar{X}\bar{Y}$.

IV. Time Averages and Ergodicity

The time-averaged mean of a sample function $x(t)$ of a random process $X(t)$ is defined as $\langle x(t) \rangle = \lim_{T \to \infty} \frac{1}{2T} \int_{-T}^{T} x(t)\, dt$, where the symbol $\langle \cdot \rangle$ denotes time averaging. Similarly, the time-averaged autocorrelation of the sample function $x(t)$ is
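The WSS definitions can be checked numerically with the classic random-phase sinusoid $X(t) = A\cos(\omega t + \Theta)$, $\Theta$ uniform on $[0, 2\pi)$, whose autocorrelation is $R_{XX}(\tau) = (A^2/2)\cos(\omega\tau)$. This example and all numbers in it are illustrative assumptions, not from the notes:

```python
import math, random

random.seed(1)

A, OMEGA = 2.0, 5.0  # amplitude and angular frequency (arbitrary choices)
N = 200_000

def ensemble_corr(t1, t2):
    """Monte Carlo estimate of E[X(t1) X(t2)] for X(t) = A cos(OMEGA*t + Theta)."""
    acc = 0.0
    for _ in range(N):
        theta = random.uniform(0.0, 2.0 * math.pi)
        acc += A * math.cos(OMEGA * t1 + theta) * A * math.cos(OMEGA * t2 + theta)
    return acc / N

tau = 0.4
r_a = ensemble_corr(0.0, tau)        # pair (0, tau)
r_b = ensemble_corr(3.0, 3.0 + tau)  # a shifted pair with the same lag
r_theory = (A ** 2 / 2.0) * math.cos(OMEGA * tau)  # R_XX(tau) = (A^2/2) cos(omega*tau)
print(abs(r_a - r_b) < 0.05, abs(r_a - r_theory) < 0.05)
```

Shifting both time points by the same amount leaves the estimate unchanged to within Monte Carlo error, which is exactly the WSS property.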

EAS 305 Random Processes Viewgraph 5 of 10

$\langle x(t)x(t + \tau) \rangle = \lim_{T \to \infty} \frac{1}{2T} \int_{-T}^{T} x(t)x(t + \tau)\, dt$.

Both $\langle x(t) \rangle$ and $\langle x(t)x(t + \tau) \rangle$ are random variables, since they depend on which sample function resulted from the experiment. If $X(t)$ is stationary, then $E[\langle x(t) \rangle] = \lim_{T \to \infty} \frac{1}{2T} \int_{-T}^{T} E[x(t)]\, dt = \bar{X}$: the expected value of the time-averaged mean equals the ensemble mean. Also $E[\langle x(t)x(t + \tau) \rangle] = \lim_{T \to \infty} \frac{1}{2T} \int_{-T}^{T} E[x(t)x(t + \tau)]\, dt = R_{XX}(\tau)$: the expected value of the time-averaged autocorrelation equals the ensemble autocorrelation.

A random process $X(t)$ is ergodic if its time averages are the same for all sample functions and are equal to the corresponding ensemble averages. In an ergodic process, all statistics can be obtained from a single sample function. A stationary process $X(t)$ is called ergodic in the mean if $\langle x(t) \rangle = \bar{X}$, and ergodic in the autocorrelation if $\langle x(t)x(t + \tau) \rangle = R_{XX}(\tau)$.

Correlations and Power Spectral Densities

Assume all random processes are WSS.

I. Autocorrelation $R_{XX}(\tau)$: $R_{XX}(\tau) = E[X(t)X(t + \tau)]$. Properties of $R_{XX}(\tau)$: $R_{XX}(-\tau) = R_{XX}(\tau)$, $|R_{XX}(\tau)| \le R_{XX}(0)$, $R_{XX}(0) = E[X^2(t)]$.
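The random-phase sinusoid happens to be ergodic in the mean and in the autocorrelation, so one sample function suffices. A minimal sketch under that assumption (the process, its parameters, and the tolerances are illustrative, not from the notes): the time averages of a single sample function are compared against the ensemble values $\bar{X} = 0$ and $R_{XX}(\tau) = (A^2/2)\cos(\omega\tau)$.

```python
import math, random

random.seed(2)

A, OMEGA = 1.0, 3.0
theta = random.uniform(0.0, 2.0 * math.pi)  # one fixed experiment outcome

def x(t):
    """A single sample function of the random-phase sinusoid."""
    return A * math.cos(OMEGA * t + theta)

def time_avg(f, T=500.0, n=200_000):
    """Approximate <f> = lim (1/2T) * integral_{-T}^{T} f(t) dt by a Riemann sum."""
    dt = 2.0 * T / n
    return sum(f(-T + (k + 0.5) * dt) for k in range(n)) * dt / (2.0 * T)

tau = 0.7
mean_time = time_avg(x)                            # should approach X-bar = 0
corr_time = time_avg(lambda t: x(t) * x(t + tau))  # should approach R_XX(tau)
r_theory = (A ** 2 / 2.0) * math.cos(OMEGA * tau)
print(abs(mean_time) < 1e-2, abs(corr_time - r_theory) < 1e-2)
```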

EAS 305 Random Processes Viewgraph 6 of 10

II. Cross-Correlation $R_{XY}(\tau)$: $R_{XY}(\tau) = E[X(t)Y(t + \tau)]$. Properties of $R_{XY}(\tau)$: $R_{XY}(-\tau) = R_{YX}(\tau)$, $|R_{XY}(\tau)| \le \sqrt{R_{XX}(0) R_{YY}(0)}$, $|R_{XY}(\tau)| \le \frac{1}{2}[R_{XX}(0) + R_{YY}(0)]$.

III. Autocovariance $C_{XX}(\tau)$: $C_{XX}(\tau) = R_{XX}(\tau) - \bar{X}^2$.

IV. Cross-Covariance $C_{XY}(\tau)$: $C_{XY}(\tau) = R_{XY}(\tau) - \bar{X}\bar{Y}$. Two processes are (mutually) orthogonal if $R_{XY}(\tau) = 0$, and uncorrelated if $C_{XY}(\tau) = 0$.

V. Power Spectral Density $S_{XX}(\omega)$: the power spectral density $S_{XX}(\omega)$ is the Fourier transform of $R_{XX}(\tau)$: $S_{XX}(\omega) = \int_{-\infty}^{\infty} R_{XX}(\tau) e^{-j\omega\tau}\, d\tau$, so that $R_{XX}(\tau) = \frac{1}{2\pi} \int_{-\infty}^{\infty} S_{XX}(\omega) e^{j\omega\tau}\, d\omega$. Properties: $S_{XX}(\omega)$ is real, $S_{XX}(\omega) \ge 0$, and it is an even function, $S_{XX}(-\omega) = S_{XX}(\omega)$. Parseval-type relation: $\frac{1}{2\pi} \int_{-\infty}^{\infty} S_{XX}(\omega)\, d\omega = R_{XX}(0) = E[X^2(t)]$.

VI. Cross-Spectral Densities: $S_{XY}(\omega) = \int_{-\infty}^{\infty} R_{XY}(\tau) e^{-j\omega\tau}\, d\tau$ and $S_{YX}(\omega) = \int_{-\infty}^{\infty} R_{YX}(\tau) e^{-j\omega\tau}\, d\tau$. Therefore $R_{XY}(\tau) = \frac{1}{2\pi} \int_{-\infty}^{\infty} S_{XY}(\omega) e^{j\omega\tau}\, d\omega$ and $R_{YX}(\tau) = \frac{1}{2\pi} \int_{-\infty}^{\infty} S_{YX}(\omega) e^{j\omega\tau}\, d\omega$. Since $R_{XY}(\tau) = R_{YX}(-\tau)$, it follows that $S_{XY}(\omega) = S_{YX}(-\omega) = S_{YX}^*(\omega)$.
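The Fourier-pair relation between $R_{XX}$ and $S_{XX}$ and the listed PSD properties can be checked in discrete time, where $S(\omega) = \sum_k r[k] e^{-j\omega k}$. A sketch assuming the geometric autocorrelation sequence $r[k] = a^{|k|}$, an AR(1)-type example chosen because its spectrum has a known closed form (none of this is from the notes):

```python
import cmath, math

a = 0.6  # assumed decay rate of the autocorrelation sequence r[k] = a**|k|
K = 200  # truncation point; the geometric tail beyond |k| = K is negligible

def psd(omega):
    """S(omega) = sum_k r[k] e^{-j omega k}, truncated at |k| <= K."""
    s = sum((a ** abs(k)) * cmath.exp(-1j * omega * k) for k in range(-K, K + 1))
    return s.real  # imaginary parts cancel in pairs since r[k] is even in k

def psd_closed(omega):
    """Closed form for this r[k]: (1 - a^2) / (1 - 2 a cos(omega) + a^2)."""
    return (1.0 - a * a) / (1.0 - 2.0 * a * math.cos(omega) + a * a)

for w in (0.0, 0.5, 1.0, math.pi):
    assert abs(psd(w) - psd_closed(w)) < 1e-9   # Fourier pair checks out
    assert psd(w) >= 0.0                        # S(omega) >= 0
    assert abs(psd(w) - psd(-w)) < 1e-12        # even function
print("PSD properties verified")
```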

EAS 305 Random Processes Viewgraph 7 of 10

Random Processes in Linear Systems

I. System Response: for an LTI system with impulse response $h(t)$, the output is $Y(t) = L[X(t)] = h(t) * X(t) = \int_{-\infty}^{\infty} h(\zeta) X(t - \zeta)\, d\zeta$.

II. Mean and Autocorrelation of the Output:
$E[Y(t)] = \bar{Y}(t) = E\left[\int h(\zeta) X(t - \zeta)\, d\zeta\right] = \int h(\zeta) E[X(t - \zeta)]\, d\zeta = \int h(\zeta) \bar{X}(t - \zeta)\, d\zeta = h(t) * \bar{X}(t)$.
$R_{YY}(t_1, t_2) = E[Y(t_1)Y(t_2)] = E\left[\int\!\!\int h(\zeta)h(\mu) X(t_1 - \zeta) X(t_2 - \mu)\, d\zeta\, d\mu\right] = \int\!\!\int h(\zeta)h(\mu) E[X(t_1 - \zeta)X(t_2 - \mu)]\, d\zeta\, d\mu = \int\!\!\int h(\zeta)h(\mu) R_{XX}(t_1 - \zeta, t_2 - \mu)\, d\zeta\, d\mu$.

If the input is WSS, then $\bar{Y} = \bar{X} \int h(\zeta)\, d\zeta = \bar{X} H(0)$: the mean of the output is a constant. Also $R_{YY}(t_1, t_2) = \int\!\!\int h(\zeta)h(\mu) R_{XX}(t_2 - t_1 + \zeta - \mu)\, d\zeta\, d\mu$, so $R_{YY}(\tau) = \int\!\!\int h(\zeta)h(\mu) R_{XX}(\tau + \zeta - \mu)\, d\zeta\, d\mu$ and $Y(t)$ is WSS.

III. Power Spectral Density of the Output:
$S_{YY}(\omega) = \int R_{YY}(\tau) e^{-j\omega\tau}\, d\tau = \int\!\!\int\!\!\int h(\zeta)h(\mu) R_{XX}(\tau + \zeta - \mu) e^{-j\omega\tau}\, d\tau\, d\zeta\, d\mu = |H(\omega)|^2 S_{XX}(\omega)$,
so $R_{YY}(\tau) = \frac{1}{2\pi} \int S_{YY}(\omega) e^{j\omega\tau}\, d\omega = \frac{1}{2\pi} \int |H(\omega)|^2 S_{XX}(\omega) e^{j\omega\tau}\, d\omega$.
The average power of the output is $E[Y^2(t)] = R_{YY}(0) = \frac{1}{2\pi} \int |H(\omega)|^2 S_{XX}(\omega)\, d\omega$.
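The output-power relation $E[Y^2(t)] = \frac{1}{2\pi}\int |H(\omega)|^2 S_{XX}(\omega)\, d\omega$ has a simple discrete-time analogue: for unit-variance white input, the output power equals $\sum_k h[k]^2$ (by Parseval). A sketch with an arbitrary FIR filter; the taps, sample counts, and tolerance are assumptions for illustration:

```python
import random

random.seed(3)

h = [0.5, 0.3, -0.2]  # assumed FIR impulse response, chosen arbitrarily
N = 200_000

# Unit-variance discrete white noise input (zero-mean Gaussian samples).
x = [random.gauss(0.0, 1.0) for _ in range(N)]

# Convolution: y[n] = sum_k h[k] x[n-k].
y = [sum(h[k] * x[n - k] for k in range(len(h))) for n in range(len(h), N)]

power_est = sum(v * v for v in y) / len(y)  # estimate of E[Y^2] = R_YY(0)
power_theory = sum(hk * hk for hk in h)     # = (1/2pi) int |H|^2 S_XX dw when S_XX = 1
print(abs(power_est - power_theory) < 0.02)
```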

EAS 305 Random Processes Viewgraph 8 of 10

Special Classes of Random Processes

I. Gaussian Random Process

II. White Noise: a random process $X(t)$ is white noise if $S_{XX}(\omega) = \frac{\eta}{2}$. Its autocorrelation is $R_{XX}(\tau) = \frac{\eta}{2} \delta(\tau)$.

III. Band-Limited White Noise: a random process $X(t)$ is band-limited white noise if $S_{XX}(\omega) = \frac{\eta}{2}$ for $|\omega| \le \omega_B$, and $0$ for $|\omega| > \omega_B$. Then $R_{XX}(\tau) = \frac{1}{2\pi} \int_{-\omega_B}^{\omega_B} \frac{\eta}{2} e^{j\omega\tau}\, d\omega = \frac{\eta \omega_B}{2\pi} \cdot \frac{\sin \omega_B \tau}{\omega_B \tau}$.
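The band-limited white noise autocorrelation can be verified by evaluating the inverse-transform integral numerically and comparing against the sinc closed form. The values of $\eta$ and $\omega_B$ below are arbitrary assumptions:

```python
import math

ETA, W_B = 2.0, 10.0  # assumed noise density (eta/2 = 1) and band edge omega_B

def r_numeric(tau, n=20_000):
    """R_XX(tau) = (1/2pi) * integral_{-wB}^{wB} (eta/2) cos(w tau) dw (midpoint rule)."""
    dw = 2.0 * W_B / n
    acc = sum((ETA / 2.0) * math.cos((-W_B + (k + 0.5) * dw) * tau) for k in range(n))
    return acc * dw / (2.0 * math.pi)

def r_closed(tau):
    """Closed form (eta * wB / 2pi) * sin(wB tau) / (wB tau)."""
    if tau == 0.0:
        return ETA * W_B / (2.0 * math.pi)
    return (ETA * W_B / (2.0 * math.pi)) * math.sin(W_B * tau) / (W_B * tau)

for tau in (0.0, 0.1, 0.37, 1.0):
    assert abs(r_numeric(tau) - r_closed(tau)) < 1e-6
print("band-limited white noise autocorrelation verified")
```

Only the cosine part of $e^{j\omega\tau}$ is integrated, since the sine part cancels over the symmetric band.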

EAS 305 Random Processes Viewgraph 9 of 10

IV. Narrowband Random Process: let $X(t)$ be a WSS process with zero mean, and let its power spectral density $S_{XX}(\omega)$ be nonzero only in a narrow frequency band of width $W$ that is very small compared to a center frequency $\omega_c$; then we have a narrowband random process. When white or broadband noise is passed through a narrowband linear filter, narrowband noise results. When a sample function of the output is viewed on an oscilloscope, the observed waveform appears as a sinusoid of random amplitude and phase. Narrowband noise is conveniently represented by $X(t) = V(t) \cos[\omega_c t + \phi(t)]$, where $V(t)$ and $\phi(t)$ are the envelope function and phase function, respectively.

From the trigonometric identity for the cosine of a sum we get the quadrature representation of the process:
$X(t) = V(t)\cos\phi(t)\cos\omega_c t - V(t)\sin\phi(t)\sin\omega_c t = X_c(t)\cos\omega_c t - X_s(t)\sin\omega_c t$,
where $X_c(t) = V(t)\cos\phi(t)$ is the in-phase component and $X_s(t) = V(t)\sin\phi(t)$ is the quadrature component. Conversely, $V(t) = \sqrt{X_c^2(t) + X_s^2(t)}$ and $\phi(t) = \tan^{-1}[X_s(t)/X_c(t)]$.

To extract the in-phase component $X_c(t)$ and the quadrature component $X_s(t)$ from $X(t)$ we use:

EAS 305 Random Processes Viewgraph 10 of 10

[Block diagram: $X(t)$ is multiplied by $\cos\omega_c t$ and low-pass filtered to give $X_c(t)$; $X(t)$ is multiplied by $-\sin\omega_c t$ and low-pass filtered to give $X_s(t)$.]

Properties of $X_c(t)$ and $X_s(t)$:
1. Same power spectrum: $S_{X_c}(\omega) = S_{X_s}(\omega) = S_{XX}(\omega - \omega_c) + S_{XX}(\omega + \omega_c)$ for $|\omega| \le W$, and $0$ elsewhere.
2. Same mean and variance as $X(t)$: $\bar{X}_c = \bar{X}_s = \bar{X} = 0$, and $\sigma_{X_c}^2 = \sigma_{X_s}^2 = \sigma_X^2$.
3. $X_c(t)$ and $X_s(t)$ are uncorrelated: $E[X_c(t)X_s(t)] = 0$.
4. If the input process is Gaussian, then so are the in-phase and quadrature components.
5. If $X(t)$ is Gaussian, then for a fixed $t$, $V(t)$ is a random variable with a Rayleigh distribution, and $\phi(t)$ is a random variable uniformly distributed over $[0, 2\pi]$.
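The demodulator above can be sketched numerically. Multiplying $X(t)$ by $2\cos\omega_c t$ (or by $-2\sin\omega_c t$) and low-pass filtering leaves $X_c(t)$ (or $X_s(t)$); here the low-pass filter is approximated by averaging over an integer number of carrier periods, the factor of 2 absorbs the mixer loss, and the constant $X_c$, $X_s$ values are illustrative assumptions rather than a random process:

```python
import math

W_C = 50.0          # carrier angular frequency (assumed)
XC, XS = 1.3, -0.8  # slowly varying components, held constant for the demo

def x(t):
    """Narrowband signal X(t) = Xc cos(wc t) - Xs sin(wc t)."""
    return XC * math.cos(W_C * t) - XS * math.sin(W_C * t)

def lowpass_avg(f, T, n=100_000):
    """Crude low-pass filter: average f over [0, T], T an integer number of periods."""
    dt = T / n
    return sum(f((k + 0.5) * dt) for k in range(n)) * dt / T

T = 10 * (2.0 * math.pi / W_C)  # ten full carrier periods
xc_rec = lowpass_avg(lambda t: 2.0 * x(t) * math.cos(W_C * t), T)   # recovers Xc
xs_rec = lowpass_avg(lambda t: -2.0 * x(t) * math.sin(W_C * t), T)  # recovers Xs
print(abs(xc_rec - XC) < 1e-3, abs(xs_rec - XS) < 1e-3)
```

The mixer products at $2\omega_c$ average to zero over full periods, leaving only the baseband component, which is the same mechanism the low-pass filters in the diagram rely on.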