ELEG 3143 Probability & Stochastic Process Ch. 6 Stochastic Process

Department of Electrical Engineering University of Arkansas ELEG 3143 Probability & Stochastic Process Ch. 6 Stochastic Process Dr. Jingxian Wu wuj@uark.edu

OUTLINE 2 Definition of stochastic process (random process) Description of continuous-time random process Description of discrete-time random process Stationary and non-stationary Two random processes Ergodic and non-ergodic random process Power spectral density

DEFINITION 3 Stochastic process (or random process): a random variable that changes with respect to time. Example: the temperature in a room. At any given moment, the temperature is random (a random variable); the temperature (the value of the random variable) changes with respect to time.

DEFINITION 4 Stochastic process (random process) Recall: a random variable is a mapping from random events to real numbers, X(ξ), where ξ is a random event. A stochastic process is a random variable that changes with time, denoted X(t, ξ). It is a function of both the random event ξ and the time t. Recall: a random variable is a function of the random event only.

DEFINITION 5 Stochastic process (random process) Fix the time t = t_k: X(t_k, ξ) is a random variable (along the ensemble domain), with a pdf, CDF, mean, variance, moments, etc. Fix the event ξ = ξ_k: X(t, ξ_k) is a deterministic time function (along the time domain), defined as a realization, or a sample function, of the random process.

DEFINITION 6 Example Let ξ be a Gaussian random variable with 0 mean and unit variance; then X(t, ξ) = ξ cos(2πft) is a random process. If we fix t = t_0, we get a Gaussian random variable X(t_0, ξ) = ξ cos(2πft_0); its pdf is Gaussian with mean 0 and variance cos²(2πft_0). If we fix ξ = ξ_0, we get the sample function X(t, ξ_0) = ξ_0 cos(2πft), which is a deterministic function of time.

DEFINITION 7 Stochastic process: a random variable that changes with respect to time, X(t, ξ). Sample function: a deterministic time function X(t, ξ_k) associated with the outcome ξ_k. Ensemble: the set of all possible time functions (sample functions) of a stochastic process.

DEFINITION 8 Example Let Θ be a random variable uniformly distributed over [−π, π]; then X(t, ξ) = A cos(2πft + Θ) is a random process, where A is a constant. At t = t_0, find the mean and variance of X(t_0, ξ).
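
A brief worked sketch for this example (not part of the original slides), under the reconstructed assumption that Θ is uniform on [−π, π]:
\[
E[X(t_0)] = A\,E[\cos(2\pi f t_0 + \Theta)] = \frac{A}{2\pi}\int_{-\pi}^{\pi}\cos(2\pi f t_0 + \theta)\,d\theta = 0,
\]
\[
\operatorname{Var}[X(t_0)] = A^2 E[\cos^2(2\pi f t_0 + \Theta)] = \frac{A^2}{2}\bigl(1 + E[\cos(4\pi f t_0 + 2\Theta)]\bigr) = \frac{A^2}{2},
\]
where the last step uses the fact that 2Θ spans two full periods of the cosine, so its average is zero.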

DEFINITION 9 Example Let Θ be a random variable uniformly distributed over [−π, π], and let A be a Gaussian random variable with 0 mean and variance σ². Then X(t) = A cos(2πft + Θ) is a random process, where A and Θ are independent. Find the mean and variance of the random process at t = t_0.

DEFINITION 10 Continuous-time random process: the time index is continuous, t ∈ R. E.g. X(t, ξ) = ξ cos(2πft). Discrete-time random process: the time index is discrete, n = 0, 1, 2, .... E.g. X(n, ξ) = ξ cos(2πfn). Continuous random process: at any time, X(t_0, ξ) is a continuous random variable. E.g. X(t, ξ) = ξ cos(2πft) with ξ Gaussian distributed. Discrete random process: at any time, X(t_0, ξ) is a discrete random variable. E.g. X(t, ξ) = ξ cos(2πft) with ξ a Bernoulli RV.

DEFINITION 11 Classifications: continuous-time vs. discrete-time; continuous vs. discrete; stationary vs. non-stationary; wide-sense stationary (WSS) vs. non-WSS; ergodic vs. non-ergodic. Simple notation: a random variable X(ξ) is usually denoted as X; a random process X(t, ξ) is usually denoted as X(t).

OUTLINE 12 Definition of stochastic process (random process) Description of continuous-time random process Description of discrete-time random process Stationary and non-stationary Two random processes Ergodic and non-ergodic random process Power spectral density

DESCRIPTION 13 How do we describe a random process? A random variable is fully characterized by its pdf or CDF; how do we statistically describe a random process? We need to describe the random process in both the ensemble domain and the time domain. Descriptions of a random process: joint distribution of time samples (joint pdf, joint CDF); moments (mean, variance, correlation function, covariance function).

DESCRIPTION 14 Joint distribution of time samples Consider a random process X(t) sampled at times t_1, t_2, ..., t_n. Joint CDF: F(x_1, ..., x_n) = P[X(t_1) ≤ x_1, ..., X(t_n) ≤ x_n]. Joint pdf: f(x_1, ..., x_n), obtained by differentiating the joint CDF. A random process is fully specified by the collection of all the joint CDFs (or joint pdfs) for any n and any choice of sampling instants. For a continuous-time random process, there are infinitely many such joint CDFs.

DESCRIPTION 15 Moments of time samples provide a partial description of the random process; for most practical applications a partial description is sufficient. Mean function: m_X(t) = E[X(t)] = ∫ u f_{X(t)}(u) du. The mean function is a deterministic function of time. Variance function: σ_X²(t) = E[(X(t) − m_X(t))²] = ∫ (x − m_X(t))² f_{X(t)}(x) dx, also a deterministic function of time.

DESCRIPTION 16 Example For a random process X(t) = A cos(2πft), where A is a random variable with mean m_A and variance σ_A², find the mean function and variance function of X(t).
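
A brief worked sketch for this example (not part of the original slides):
\[
m_X(t) = E[A]\cos(2\pi f t) = m_A \cos(2\pi f t), \qquad
\sigma_X^2(t) = \operatorname{Var}[A]\cos^2(2\pi f t) = \sigma_A^2 \cos^2(2\pi f t).
\]
Both are deterministic functions of time, and neither is constant in general.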

DESCRIPTION 17 Autocorrelation function (ACF) The autocorrelation function of a random process X(t) is defined as R_X(t_1, t_2) = E[X(t_1)X(t_2)]. It describes the correlation of the random process in the time domain: how two values observed at different times are related to each other. E.g. if there is a strong correlation between the temperature today and the temperature tomorrow, then we can predict tomorrow's temperature by using today's observation. E.g. the text in a book can be considered as a discrete random process: given "Stochasxxx", we can easily guess the missing characters xxx by using the time correlation. Note that R_X(t_1, t_1) = E[X²(t_1)] is the second moment of X(t_1).
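
These ensemble averages can also be estimated numerically. Below is a minimal Python sketch (not from the slides; all names and parameters are illustrative) that estimates the mean function and R_X(t_1, t_2) for the earlier example X(t, ξ) = ξ cos(2πft) by averaging across many simulated realizations:

```python
import numpy as np

# Minimal sketch: estimate the mean function and R_X(t1, t2) = E[X(t1) X(t2)]
# by averaging over an ensemble of realizations of X(t, xi) = xi*cos(2*pi*f*t),
# with xi ~ N(0, 1) as in the slide-6 example.
rng = np.random.default_rng(0)
f = 1.0                                   # illustrative frequency (Hz)
t = np.linspace(0.0, 2.0, 201)            # time grid
xi = rng.standard_normal(20_000)          # one random outcome per realization
X = xi[:, None] * np.cos(2 * np.pi * f * t)[None, :]    # ensemble, shape (N, len(t))

mean_fn = X.mean(axis=0)                  # estimate of m_X(t), should be close to 0
R = (X.T @ X) / X.shape[0]                # estimate of R_X(t1, t2) over all grid pairs

# Analytical ACF for this process: R_X(t1, t2) = cos(2*pi*f*t1) * cos(2*pi*f*t2)
R_theory = np.outer(np.cos(2 * np.pi * f * t), np.cos(2 * np.pi * f * t))
print(np.abs(R - R_theory).max())         # small for a large ensemble
```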

DESCRIPTION 18 Autocovariance function The autocovariance function of a random process X(t) is defined as C_X(t_1, t_2) = E[(X(t_1) − m_X(t_1))(X(t_2) − m_X(t_2))] = E[X(t_1)X(t_2)] − m_X(t_1)m_X(t_2) = R_X(t_1, t_2) − m_X(t_1)m_X(t_2). In particular, C_X(t, t) = σ_X²(t).

DESCRIPTION 19 Example Consider a random process X(t) = A cos(2πft), where A is a random variable with mean m_A and variance σ_A². Find the autocorrelation function and the autocovariance function.

DESCRIPTION 20 Example Consider a random process X(t) = A cos(2πft + Θ), where A is a random variable with mean m_A and variance σ_A², and Θ is uniformly distributed over [−π, π]. A and Θ are independent. Find the autocorrelation function and the autocovariance function.
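
A brief worked sketch for this example (not part of the original slides), using E[A²] = σ_A² + m_A² and the independence of A and Θ:
\[
R_X(t_1, t_2) = E[A^2]\,E[\cos(2\pi f t_1 + \Theta)\cos(2\pi f t_2 + \Theta)]
= \frac{\sigma_A^2 + m_A^2}{2}\cos\bigl(2\pi f (t_1 - t_2)\bigr),
\]
since the term cos(2πf(t_1 + t_2) + 2Θ) averages to zero over Θ. Because m_X(t) = m_A E[cos(2πft + Θ)] = 0, the autocovariance equals the autocorrelation, C_X(t_1, t_2) = R_X(t_1, t_2).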

DESCRIPTION 21 Example

OUTLINE 22 Definition of stochastic process (random process) Description of continuous-time random process Description of discrete-time random process Stationary and non-stationary Two random processes Ergodic and non-ergodic random process Power spectral density

DESCRIPTION 23 Discrete-time random process (random sequence) A discrete-time random process X[n] is defined only at discrete time instants, n = 0, 1, 2, .... It is described by the joint PMF of its time samples, P[X[n_1] = x_1, ..., X[n_k] = x_k], or by the joint CDF, F(x_1, ..., x_k) = P[X[n_1] ≤ x_1, ..., X[n_k] ≤ x_k].

DESCRIPTION 24 Bernoulli process: a sequence of independent, identically distributed Bernoulli random variables X[n] ∈ {0, 1} with P[X[n] = 1] = p and P[X[n] = 0] = 1 − p.

DESCRIPTION 25 Moments of time samples provide a partial description of the random process; for most practical applications a partial description is sufficient. Mean function: m_X[n] = E[X[n]], a deterministic function of time. Variance function: σ_X²[n] = E[(X[n] − m_X[n])²], also a deterministic function of time.

DESCRIPTION 26 Example For a Bernoulli (p) process, find the mean function and variance function.
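
A brief worked sketch (not part of the original slides), assuming the Bernoulli(p) samples are independent across time:
\[
m_X[n] = E[X[n]] = p, \qquad
\sigma_X^2[n] = E[X^2[n]] - p^2 = p - p^2 = p(1-p),
\]
both constant in n. For completeness, R_X[n_1, n_2] = p² for n_1 ≠ n_2 and R_X[n, n] = p.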

DESCRIPTION 27 Autocorrelation function of a random sequence: R_X[n_1, n_2] = E[X[n_1]X[n_2]]. Autocovariance function of a random sequence: C_X[n_1, n_2] = E[(X[n_1] − m_X[n_1])(X[n_2] − m_X[n_2])] = R_X[n_1, n_2] − m_X[n_1]m_X[n_2].

DESCRIPTION 28 Example

OUTLINE 29 Definition of stochastic process (random process) Description of continuous-time random process Description of discrete-time random process Stationary and non-stationary Two random processes Ergodic and non-ergodic random process Power spectral density

STATIONARY RANDOM PROCESS 30 Stationary random process A random process X(t) is stationary if the joint distribution of any set of samples does not depend on the time origin: for any value of n and τ, and for any choice of sampling instants t_1, ..., t_n, F(x_1, ..., x_n) of the samples X(t_1), ..., X(t_n) equals that of the samples X(t_1 + τ), ..., X(t_n + τ).

STATIONARY RANDOM PROCESS 31 1st-order distribution If X(t) is a stationary random process, then the first-order CDF and pdf must be independent of time: F_X(x) = F_{X(t)}(x) = F_{X(t+τ)}(x) and f_X(x) = f_{X(t)}(x) = f_{X(t+τ)}(x), for all t and τ. The samples at different time instants have the same distribution. Mean: E[X(t)] = E[X(t+τ)] = m_X. For a stationary random process, the mean is independent of time.

STATIONARY RANDOM PROCESS 32 2nd-order distribution If X(t) is a stationary random process, then the 2nd-order CDF and pdf satisfy F_{X(t),X(t+τ)}(x_1, x_2) = F_{X(0),X(τ)}(x_1, x_2) and f_{X(t),X(t+τ)}(x_1, x_2) = f_{X(0),X(τ)}(x_1, x_2), for all t and τ. The 2nd-order distribution depends only on the time difference between the two samples. Autocorrelation function: R_X(t_1, t_2) = R_X(τ). Autocovariance function: C_X(t_1, t_2) = C_X(τ). For a stationary random process, the autocorrelation function and the autocovariance function depend only on the time difference τ = t_2 − t_1.

STATIONARY RANDOM PROCESS 33 Stationary random process In order to determine whether a random process is stationary, we need to find the joint distribution of any group of samples. Stationary random process: the joint distribution of any group of time samples is independent of the starting time. This is a very strict requirement, and it is sometimes difficult to determine whether a random process is stationary.

STATIONARY RANDOM PROCESS 34 Wide-sense stationary (WSS) random process A random process X(t) is wide-sense stationary (WSS) if the following two conditions are satisfied: the first moment is independent of time, E[X(t)] = E[X(t+τ)] = m_X; and the autocorrelation function depends only on the time difference, E[X(t)X(t+τ)] = R_X(τ). WSS only constrains the 1st-order and 2nd-order moments.

STATIONARY RANDOM PROCESS 35 Stationary vs. wide-sense stationary If X(t) is stationary, then X(t) is WSS; the converse is not true in general. Stationarity is a much stricter condition: it requires the joint distribution of any combination of samples to be independent of the absolute starting time. WSS only constrains the first moment (the mean is a constant, E[X(t)] = E[X(t+τ)] = m_X) and the second moment (the autocorrelation function depends only on the time difference, E[X(t)X(t+τ)] = R_X(τ)).

STATIONARY RANDOM PROCESS 36 Example A random process is described by X(t) = A + B cos(2πft + Θ), where A is a random variable uniformly distributed between [−3, 3], B is an RV with zero mean and variance 4, and Θ is a random variable uniformly distributed in [π/2, 3π/2]. A, B, and Θ are independent. Find the mean and autocorrelation function. Is X(t) WSS?
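
A brief worked sketch for this example (not part of the original slides). Because A, B, and Θ are independent and E[A] = E[B] = 0, the cross terms vanish:
\[
m_X(t) = E[A] + E[B]\,E[\cos(2\pi f t + \Theta)] = 0,
\]
\[
R_X(t, t+\tau) = E[A^2] + E[B^2]\,E[\cos(2\pi f t + \Theta)\cos(2\pi f (t+\tau) + \Theta)]
= 3 + 4\cdot\tfrac{1}{2}\cos(2\pi f \tau) = 3 + 2\cos(2\pi f \tau),
\]
using E[A²] = 6²/12 = 3 and the fact that 2Θ is uniform over an interval of length 2π, so cos(2πf(2t+τ) + 2Θ) averages to zero. The mean is constant and R_X depends only on τ, so X(t) is WSS.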

STATIONARY RANDOM PROCESS 37 Autocorrelation function of a WSS random process: R_X(τ) = E[X(t)X(t+τ)]. Properties: R_X(0) = E[X²(t)] is the average power of the signal X(t); R_X(τ) is an even function, since R_X(τ) = E[X(t)X(t+τ)] = E[X(t+τ)X(t)] = R_X(−τ); |R_X(τ)| ≤ R_X(0) (Cauchy-Schwarz inequality); if R_X(τ) is non-periodic, then lim_{|τ|→∞} R_X(τ) = m_X².

STATIONARY RANDOM PROCESS 38 Example Consider a random process X(t) with a given autocorrelation function R_X(τ). Find the mean and variance of X(t).
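
A general recipe for this kind of problem (not part of the original slides, but following the properties on slide 37, for a WSS process with a non-periodic ACF):
\[
m_X^2 = \lim_{|\tau|\to\infty} R_X(\tau), \qquad
\sigma_X^2 = E[X^2(t)] - m_X^2 = R_X(0) - m_X^2 .
\]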

STATIONARY RANDOM PROCESS 39 Example A random process Z(t) is Z(t) = X(t) − X(t − t_0), where X(t) is a WSS random process with autocorrelation function R_X(τ) = σ² exp(−|τ|). Find the autocorrelation function of Z(t). Is Z(t) WSS?

STATIONARY RANDOM PROCESS 40 Example A random process X(t) = At + B, where A is a Gaussian random variable with 0 mean and variance 16, and B is uniformly distributed between 0 and 6. A and B are independent. Find the mean and autocorrelation function. Is it WSS?
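
A brief worked sketch for this example (not part of the original slides), using E[A] = 0, E[A²] = 16, E[B] = 3, and E[B²] = Var[B] + E[B]² = 3 + 9 = 12:
\[
m_X(t) = E[A]\,t + E[B] = 3, \qquad
R_X(t_1, t_2) = E[A^2]\,t_1 t_2 + E[A]E[B](t_1 + t_2) + E[B^2] = 16\,t_1 t_2 + 12 .
\]
The mean is constant, but R_X depends on the product t_1 t_2 rather than only on t_2 − t_1, so X(t) is not WSS.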

OUTLINE 41 Definition of stochastic process (random process) Description of continuous-time random process Description of discrete-time random process Stationary and non-stationary Two random processes Ergodic and non-ergodic random process Power spectral density

TWO RANDOM PROCESSES 42 Two random processes X(t) and Y(t) Cross-correlation function The cross-correlation function between two random processes X(t) and Y(t) is defined as R_XY(t_1, t_2) = E[X(t_1)Y(t_2)]. Two random processes are said to be uncorrelated if E[X(t_1)Y(t_2)] = E[X(t_1)]E[Y(t_2)] for all t_1 and t_2. Two random processes are said to be orthogonal if R_XY(t_1, t_2) = 0 for all t_1 and t_2. Cross-covariance function The cross-covariance function between two random processes X(t) and Y(t) is defined as C_XY(t_1, t_2) = E[X(t_1)Y(t_2)] − m_X(t_1)m_Y(t_2).

TWO RANDOM PROCESSES 43 Example Consider two random processes X(t) = cos(2πft + Θ) and Y(t) = sin(2πft + Θ), where Θ is a random phase. Are they uncorrelated?

TWO RANDOM PROCESSES 44 Example Suppose the signal Y(t) consists of a desired signal X(t) plus noise N(t): Y(t) = X(t) + N(t). The autocorrelation functions of X(t) and N(t) are R_XX(t_1, t_2) and R_NN(t_1, t_2), respectively, and their mean functions are m_X(t) and m_N(t), respectively. Find the cross-correlation between X(t) and Y(t).
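
A brief worked sketch (not part of the original slides), under the additional assumption, commonly made but not stated on the slide, that X(t) and N(t) are uncorrelated:
\[
R_{XY}(t_1, t_2) = E\bigl[X(t_1)\bigl(X(t_2) + N(t_2)\bigr)\bigr]
= R_{XX}(t_1, t_2) + E[X(t_1)N(t_2)]
= R_{XX}(t_1, t_2) + m_X(t_1)\,m_N(t_2).
\]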

OUTLINE 45 Definition of stochastic process (random process) Description of continuous-time random process Description of discrete-time random process Stationary and non-stationary Two random processes Ergodic and non-ergodic random process Power spectral density

ERGODIC RANDOM PROCESS 46 Time average of a signal x(t) (along the time domain): ⟨x(t)⟩_T = (1/T) ∫_0^T x(t) dt. Time average of a sample function of the random process: ⟨X(t, ξ_0)⟩_T = (1/T) ∫_0^T X(t, ξ_0) dt. Ensemble average (mean) of a random process (along the ensemble domain): E[X(t, ξ)] = ∫ x f_{X(t)}(x) dx.

ERGODIC RANDOM PROCESS 47 Ergodic random process A stationary random process is ergodic if the n-th order ensemble average is the same as the n-th order time average: ⟨x^n(t)⟩ = E[X^n(t)], where ⟨x^n(t)⟩ = lim_{T→∞} (1/T) ∫_0^T x^n(t) dt and E[X^n(t)] = ∫ x^n f_{X(t)}(x) dx. If a random process is ergodic, we can find its moments by performing a time average over a single sample function. Ergodicity is defined only for stationary random processes: if a random process is ergodic, then it must be stationary, but the converse is not true.

ERGODIC RANDOM PROCESS 48 Example Consider a random process X(t) = A, where A is a Gaussian RV with 0 mean and variance 4. Is it ergodic?
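
A numerical illustration of why this process is not ergodic (a minimal sketch, not from the slides; names and parameters are illustrative): the time average of any single sample function equals that realization's value of A, while the ensemble average is 0.

```python
import numpy as np

# Minimal sketch: X(t) = A with A ~ N(0, 4).  Each sample function is constant
# in time, so its time average equals that realization's A, not the ensemble
# mean 0 -- the process is stationary but not ergodic.
rng = np.random.default_rng(1)
t = np.linspace(0.0, 100.0, 10_001)

A = 2.0 * rng.standard_normal()            # one outcome of A (std 2 => variance 4)
x_sample = np.full_like(t, A)              # one sample function X(t, xi_0)
time_avg = x_sample.mean()                 # time average over [0, 100] (uniform grid)

ensemble = 2.0 * rng.standard_normal(100_000)   # X(t0, xi) across the ensemble
ensemble_avg = ensemble.mean()

print(time_avg)       # equals A: a random, generally nonzero value
print(ensemble_avg)   # close to the true ensemble mean E[X(t)] = 0
```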

ERGODIC RANDOM PROCESS 49 Mean ergodic A WSS random process is mean ergodic if the ensemble average is the same as the time average: ⟨x(t)⟩ = E[X(t)]. Mean ergodic vs. ergodic Ergodic: ⟨x^n(t)⟩ = E[X^n(t)] for a stationary process. Mean ergodic: ⟨x(t)⟩ = E[X(t)] for a WSS process. If a random process is ergodic, then it must be mean ergodic; the converse is not true. If a process is mean ergodic, it must be WSS: mean ergodicity is defined for WSS processes only.

ERGODIC RANDOM PROCESS 50 Example Consider a random process X(t) = A cos(2πft + Θ), where A is a non-zero constant and Θ is uniformly distributed between 0 and 2π. Is it mean ergodic?
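
A brief worked sketch (not part of the original slides), under the reconstructed assumption that Θ is uniform on [0, 2π]:
\[
E[X(t)] = A\,E[\cos(2\pi f t + \Theta)] = 0, \qquad
\langle x(t)\rangle = \lim_{T\to\infty}\frac{1}{T}\int_0^T A\cos(2\pi f t + \theta_0)\,dt = 0
\]
for every fixed realization θ_0, so the time average equals the ensemble average and the process is mean ergodic.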

OUTLINE 51 Definition of stochastic process (random process) Description of continuous-time random process Description of discrete-time random process Stationary and non-stationary Two random processes Ergodic and non-ergodic random process Power spectral density

POWER SPECTRAL DENSITY 52 Power spectral density (PSD) The distribution of the power of the process over the frequency domain. For a WSS random process, the PSD is the Fourier transform of the autocorrelation function: S_X(f) = F[R_X(τ)] = ∫ R_X(τ) e^{−j2πfτ} dτ. It is the density of power in the frequency domain: the power contained in the frequency range [f_1, f_2] is P = ∫_{f_1}^{f_2} S_X(f) df.
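
A numerical sketch of the PSD idea (not from the slides; names and parameters are illustrative): for discrete-time white Gaussian noise with variance σ², the averaged periodogram is flat, and its total area equals the power R_X(0) = σ², consistent with the white-noise slide below.

```python
import numpy as np

# Minimal sketch: averaged periodogram of discrete white Gaussian noise with
# variance sigma2.  The estimate is roughly flat at sigma2, and the area under
# the PSD (over one period of normalized frequency) equals the power sigma2.
rng = np.random.default_rng(2)
sigma2 = 2.0
n, trials = 1024, 2000

psd = np.zeros(n)
for _ in range(trials):
    x = np.sqrt(sigma2) * rng.standard_normal(n)
    X = np.fft.fft(x)
    psd += np.abs(X) ** 2 / n             # periodogram (normalized frequency)
psd /= trials

print(psd.mean(), psd.std())              # roughly flat at sigma2 = 2.0
print(psd.sum() / n)                      # total power, close to R_X(0) = sigma2
```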

POWER SPECTRAL DENSITY 53 White noise Autocorrelation function R x any two samples in the time domain are uncorrelated. Power spectral density N0 ( ) ( ) 2 N 2 0 N0 S x ( f ) ( ) d 2 ( ) R N0/2 S f N0/2 f