Stochastic Processes. Monday, November 14, 11


Stochastic Processes

Definition and Classification

$X(\omega, t)$: stochastic process, $X : \Omega \times T \to \mathbb{R}$, $(\omega, t) \mapsto X(\omega, t)$, where $\Omega$ is a sample space and $T$ is time. $\{X(\omega, t)\}$ is a family of r.v. defined on $(\Omega, \mathcal{A}, P)$ and indexed by $t \in T$.

For a fixed $\omega_0$, $X(\omega_0, t)$ is a function of time, or a sample function.
For a fixed $t_0$, $X(\omega, t_0)$ is a random variable, or a function of $\omega$.

Discrete-time random process: $n \in T$, $X(\omega, n)$.
Continuous-time random process: $t \in T$, $X(\omega, t)$.
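A minimal Python sketch (not from the slides; the Gaussian example process and all names are illustrative) of the two views of $X(\omega, t)$: fixing $\omega_0$ gives a sample function of time, while fixing $t_0$ gives a random variable.

```python
# Minimal sketch: a discrete-time process X(omega, n) realized as an ensemble.
# Fixing a row gives one sample function of time; fixing a column gives the
# random variable X(., n0).
import numpy as np

rng = np.random.default_rng(0)
num_realizations = 5      # sampled outcomes omega
num_times = 10            # T = {0, 1, ..., 9}

# Illustrative process: X(omega, n) i.i.d. standard Gaussian over omega and n
X = rng.standard_normal((num_realizations, num_times))

sample_function = X[0, :]   # fixed omega_0: a function of time
random_variable = X[:, 3]   # fixed t_0 = 3: one draw per outcome omega
print(sample_function)
print(random_variable)
```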

Probabilistic characterization of $X(\omega, t)$

Since $\{X(\omega, t)\}$ is an infinite family of r.v., we need to know:

1. First-order distribution and p.d.f.
$F(x, t) = P[X(\omega, t) \le x]$ for all $t \in T$, and $f(x, t) = \dfrac{\partial F(x, t)}{\partial x}$.

2. Second-order distribution and p.d.f.
$F(x_1, x_2, t_1, t_2) = P[X(\omega, t_1) \le x_1,\ X(\omega, t_2) \le x_2]$ for all $t_1, t_2 \in T$, and $f(x_1, x_2, t_1, t_2) = \dfrac{\partial^2 F(x_1, x_2, t_1, t_2)}{\partial x_1\, \partial x_2}$.

3. $n$-th order distribution and p.d.f.
$F(x_1, \ldots, x_n, t_1, \ldots, t_n) = P[X(\omega, t_1) \le x_1, \ldots, X(\omega, t_n) \le x_n]$ for all $t_1, \ldots, t_n \in T$, and $f(x_1, \ldots, x_n, t_1, \ldots, t_n) = \dfrac{\partial^n F(x_1, \ldots, x_n, t_1, \ldots, t_n)}{\partial x_1 \cdots \partial x_n}$.

Remark. The $n$-th order distribution/p.d.f. must also satisfy:
Symmetry: $F(x_1, \ldots, x_n, t_1, \ldots, t_n) = F(x_{i_1}, \ldots, x_{i_n}, t_{i_1}, \ldots, t_{i_n})$ for any permutation $(i_1, \ldots, i_n)$ of $\{1, 2, \ldots, n\}$.
Consistency in marginals: $F(x_1, \ldots, x_{n-1}, t_1, \ldots, t_{n-1}) = F(x_1, \ldots, x_{n-1}, \infty, t_1, \ldots, t_n)$.
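As an illustration of the first-order distribution, a short Monte Carlo sketch (the Gaussian random-walk example process and all names are assumptions, not from the slides) that estimates $F(x, t_0) = P[X(\omega, t_0) \le x]$ from an ensemble of independent realizations:

```python
# Minimal sketch: estimate the first-order distribution F(x, t0) by the
# fraction of realizations with X(., t0) <= x.
import numpy as np

rng = np.random.default_rng(1)
M, N = 10_000, 50                                 # realizations, time points
X = rng.standard_normal((M, N)).cumsum(axis=1)    # example process (random walk)

t0, x = 20, 1.5
F_hat = np.mean(X[:, t0] <= x)                    # empirical CDF value at (x, t0)
print(f"F({x}, t0={t0}) ~= {F_hat:.3f}")
```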

Kolmogorov's Theorem

If $F(x_1, \ldots, x_n, t_1, \ldots, t_n)$, for all sets $\{t_i\} \subset T$, satisfies the symmetry and consistency conditions, then there corresponds a stochastic process.

Second-order Properties

$F(x, t), f(x, t)$ (first-order) and $F(x_1, x_2, t_1, t_2), f(x_1, x_2, t_1, t_2)$ (second-order) are not sufficient to characterize $X(\omega, t)$, but they provide important information:

Mean function $\mu(t)$:
$\mu_X(t) = E[X(t)] = \int_{-\infty}^{\infty} x f(x, t)\, dx$
a function of $t$ that depends on the first-order p.d.f.

Auto-covariance function $C_X(t_1, t_2)$:
$C_X(t_1, t_2) = E[(X(t_1) - \mu(t_1))(X(t_2) - \mu(t_2))] = E[X(t_1) X(t_2)] - \mu(t_1)\mu(t_2) = R_X(t_1, t_2) - \mu(t_1)\mu(t_2)$
$R_X(t_1, t_2)$ is the autocorrelation function, a function of $t_1$, $t_2$ and the second-order probability density:
$R_X(t_1, t_2) = E[X(t_1) X(t_2)] = \iint x_1 x_2 f(x_1, x_2, t_1, t_2)\, dx_1\, dx_2$
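The mean and autocorrelation functions can be estimated by replacing expectations with ensemble averages; a minimal sketch (the random-walk example and all variable names are assumptions, not from the slides):

```python
# Minimal sketch: estimate mu_X(t) = E[X(t)], R_X(t1, t2) = E[X(t1) X(t2)] and
# C_X(t1, t2) from an ensemble of independent realizations.
import numpy as np

rng = np.random.default_rng(2)
M, N = 20_000, 30
X = rng.standard_normal((M, N)).cumsum(axis=1)    # example nonstationary process

mu_hat = X.mean(axis=0)                    # mean function, one value per time t
R_hat = (X.T @ X) / M                      # R_hat[t1, t2] ~= E[X(t1) X(t2)]
C_hat = R_hat - np.outer(mu_hat, mu_hat)   # autocovariance C_X(t1, t2)
print(mu_hat[:5])
print(C_hat[0, 0], C_hat[10, 10])          # variance grows with time here
```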

Remarks

Note that $C_X(t, t) = E[(X(t) - \mu(t))^2] = \sigma_X^2(t)$, the variance of $X(t)$. $C_X(t_1, t_2)$ relates r.v.'s at times $t_1$ and $t_2$, i.e., in time, while $C_X(t, t)$ relates the r.v. $X(t)$ with itself, i.e., in space.

Correlation coefficient:
$r_X(t_1, t_2) = \dfrac{C_X(t_1, t_2)}{\sigma_X(t_1)\, \sigma_X(t_2)}$

Cross-covariance: for real processes $X(t), Y(t)$,
$C_{XY}(t_1, t_2) = E[(X(t_1) - \mu_X(t_1))(Y(t_2) - \mu_Y(t_2))] = E[X(t_1) Y(t_2)] - \mu_X(t_1)\mu_Y(t_2) = R_{XY}(t_1, t_2) - \mu_X(t_1)\mu_Y(t_2)$
$R_{XY}(t_1, t_2)$ is the cross-correlation function. $X(t)$ and $Y(t)$ are uncorrelated iff $C_{XY}(t_1, t_2) = 0$ for all $t_1, t_2 \in T$.

Remark

A zero-mean process $\varepsilon(t)$ is called strictly white noise if $\varepsilon(t_i)$ and $\varepsilon(t_j)$ are independent for $t_i \ne t_j$. It is called white noise if it is uncorrelated for different values of $t$, i.e.,
$C_\varepsilon(t_i, t_j) = \sigma^2 \delta(t_i - t_j) = \begin{cases} \sigma^2 & t_i = t_j \\ 0 & t_i \ne t_j \end{cases}$

Gaussian Processes

$X(t)$ is Gaussian if $\{X(t_1), \ldots, X(t_n)\}$ are jointly Gaussian for any value of $n$ and times $\{t_1, \ldots, t_n\}$.

Remark. The statistics of a Gaussian process are completely determined by the mean $\mu(t)$ and covariance $C_X(t_1, t_2)$ functions. Thus:
$f(x, t)$ requires $\mu(t)$ and $C(t, t) = \sigma_X^2(t)$;
$f(x_1, x_2, t_1, t_2)$ requires $\mu(t_1), \mu(t_2), C(t_1, t_2), C(t_1, t_1), C(t_2, t_2)$;
and in general the $n$-th order characteristic function is
$\Phi_X(\boldsymbol{\omega}, \mathbf{t}) = \exp\!\left[\, j \sum_i \omega_i\, \mu_X(t_i) \;-\; 0.5 \sum_{i,k} \omega_i \omega_k\, C_X(t_i, t_k) \right]$
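A small numerical check, under the assumption of discrete-time Gaussian white noise with variance $\sigma^2$ (parameter values are arbitrary): the sample covariance across time indices should be approximately $\sigma^2 I$, i.e., uncorrelated and, being Gaussian, independent samples.

```python
# Minimal sketch: discrete-time Gaussian white noise with variance sigma^2.
# Its sample covariance matrix should be close to sigma^2 * I.
import numpy as np

rng = np.random.default_rng(3)
sigma2, M, N = 2.0, 50_000, 5
W = rng.normal(0.0, np.sqrt(sigma2), size=(M, N))

C_hat = np.cov(W, rowvar=False)     # N x N sample covariance across time indices
print(np.round(C_hat, 2))           # ~ diag(sigma2, ..., sigma2)
```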

Stationarity

Strict Stationarity. $X(t)$ is strictly stationary iff
$F(x_1, \ldots, x_n, t_1, \ldots, t_n) = F(x_1, \ldots, x_n, t_1 + \tau, \ldots, t_n + \tau)$
for all $n$ and all values of $\tau$, i.e., the statistics do not change with time.

If $X(t)$ is strictly stationary then:
its $k$-th moments are constant, i.e., $E[X^k(t)] = \text{constant}$ for all $t$;
$R_X(t, t + \tau) = R_X(0, \tau)$, i.e., it depends only on the lag $\tau$.

Proof. Using strict stationarity,
$E[X^k(t)] = \int_{-\infty}^{\infty} x^k f(x, t)\, dx = \int_{-\infty}^{\infty} x^k f(x, t + \tau)\, dx = E[X^k(t + \tau)]$
for any $\tau$, which can only happen when it is a constant. By strict stationarity,
$R_X(t, t + \tau) = \iint x_1 x_2 f(x_1, x_2, t, t + \tau)\, dx_1\, dx_2 = \iint x_1 x_2 f(x_1, x_2, 0, \tau)\, dx_1\, dx_2 = R_X(0, \tau)$

Wide-sense Stationarity (wss). $X(t)$ is wss if
1. $E[X(t)] = \text{constant}$ and $\mathrm{Var}[X(t)] = \text{constant}$ for all $t$;
2. $R_X(t_1, t_2) = R_X(t_2 - t_1)$.
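The two wss conditions can be checked empirically by Monte Carlo; a sketch, assuming an illustrative moving-average of white noise (which is wss) rather than any process from the slides:

```python
# Minimal sketch: check the two w.s.s. conditions numerically -- constant mean
# and an autocorrelation that depends only on the lag t2 - t1.
import numpy as np

rng = np.random.default_rng(4)
M, N = 50_000, 40
W = rng.standard_normal((M, N + 1))
X = 0.5 * (W[:, 1:] + W[:, :-1])         # X(n) = 0.5*(W(n) + W(n-1)), a w.s.s. process

print(X.mean(axis=0)[:5])                # ~ constant (zero) mean
R = (X.T @ X) / M                        # R[t1, t2] ~= E[X(t1) X(t2)]
print(R[10, 11], R[20, 21], R[30, 31])   # same lag => approximately equal values
```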

Examples of random processes

Discrete-time Binomial process. Consider Bernoulli trials where
$X(n) = \begin{cases} 1 & \text{event occurs at time } n \\ 0 & \text{otherwise} \end{cases}$
with $P[X(n) = 1] = p$ and $P[X(n) = 0] = 1 - p = q$.

The discrete-time Binomial process counts the number of times the event occurred (successes) in a total of $n$ trials, or
$Y(n) = \sum_{i=1}^{n} X(i), \quad Y(0) = 0, \quad n \ge 0$
Since $Y(n) = \sum_{i=1}^{n-1} X(i) + X(n) = Y(n-1) + X(n)$, the process is also represented by the difference equation
$Y(n) = Y(n-1) + X(n), \quad Y(0) = 0, \quad n \ge 0$

Characterization of $Y(n)$

First-order p.d.f.:
$f(y, n) = \sum_{k=0}^{n} P[Y(n) = k]\, \delta(y - k) = \sum_{k=0}^{n} \binom{n}{k} p^k q^{n-k}\, \delta(y - k), \quad 0 \le k \le n$
Second- and higher-order p.d.f.'s are difficult to find given the dependence of the $Y(n)$'s.

Statistical averages:
$E[Y(n)] = \sum_{i=1}^{n} E[X(i)] = np$
$\mathrm{Var}[Y(n)] = E[(Y(n) - np)^2] = E\!\left[\Big(\sum_{i=1}^{n} (X(i) - p)\Big)^{\!2}\right] = \sum_{i,j} E[(X(i) - p)(X(j) - p)] = \sum_{i=1}^{n} E[(X(i) - p)^2] = npq$
since $\mathrm{Var}(X(i)) = (1^2\, p + 0^2\, q) - p^2 = pq$ and $E[X(i) - p]\, E[X(j) - p] = 0$ for $i \ne j$ by independence of the $X(i)$.

Notice that $E[Y(n)] = np$ depends on time $n$, and $\mathrm{Var}[Y(n)] = npq$ is also a function of time $n$, so the discrete binomial process is non-stationary.
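A short simulation sketch of the binomial counting process, verifying $E[Y(n)] = np$ and $\mathrm{Var}[Y(n)] = npq$ (the parameter values are arbitrary choices):

```python
# Minimal sketch: simulate Y(n) = sum X(i) and compare the ensemble mean and
# variance at time n with np and npq.
import numpy as np

rng = np.random.default_rng(5)
p, n, M = 0.3, 50, 100_000
X = rng.binomial(1, p, size=(M, n))       # Bernoulli trials X(1..n) per realization
Y = X.cumsum(axis=1)                      # Y(n), the counting process

print(Y[:, -1].mean(), n * p)             # ~ np
print(Y[:, -1].var(), n * p * (1 - p))    # ~ npq (grows with n: non-stationary)
```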

Random Walk Process

Let the discrete binomial process be defined by Bernoulli trials
$X(n) = \begin{cases} s & \text{event occurs at time } n \\ -s & \text{otherwise} \end{cases}$
so that
$Y(n) = \sum_{i=1}^{n} X(i), \quad Y(0) = 0, \quad n \ge 0$

Possible values at times bigger than zero (shown for $s = 1$):
$n = 1$: $Y(1) = 1,\ -1$
$n = 2$: $Y(2) = 2,\ 0,\ -2$
$n = 3$: $Y(3) = 3,\ 1,\ -1,\ -3$
...
In general, at step $n = n_0$, $Y(n_0)$ can take the values $\{2k - n_0,\ 0 \le k \le n_0\}$; for instance, for $n = 2$, $Y(2)$ can take the values $2k - 2$, $0 \le k \le 2$, or $-2,\ 0,\ 2$.

Characterization of $Y(n)$

First-order probability mass function: $P[Y(n_0) = 2k - n_0],\ 0 \le k \le n_0$.

Convert the random walk process into a binomial process by ($s = 1$)
$Z(n) = \dfrac{X(n) + 1}{2} = \begin{cases} 1 & \text{event occurs at time } n \\ 0 & \text{otherwise} \end{cases}$
$Y(n) = \sum_{i=1}^{n} X(i) = \sum_{i=1}^{n} (2Z(i) - 1) = 2\sum_{i=1}^{n} Z(i) - n = 2\tilde{Y}(n) - n$
where $\tilde{Y}(n)$ is the binomial process in the previous example. Letting $m = 2k - n_0$, $0 \le k \le n_0$, we have for $0 \le n_0 \le n$
$P[Y(n_0) = m] = P[2\tilde{Y}(n_0) - n_0 = 2k - n_0] = P[\tilde{Y}(n_0) = k] = \binom{n_0}{k} p^k q^{n_0 - k}$

Mean and variance:
$E[Y(n)] = E[2\tilde{Y}(n) - n] = 2E[\tilde{Y}(n)] - n = 2np - n$
$\mathrm{Var}[Y(n)] = 4\,\mathrm{Var}[\tilde{Y}(n)] = 4npq$
Both of which depend on $n$, so the process is non-stationary.
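A sketch of the same construction in code, with $s = 1$: the random walk is built as $Y(n) = 2\tilde{Y}(n) - n$ from a simulated binomial process, and its mean and variance are compared with $2np - n$ and $4npq$ (parameter values are arbitrary):

```python
# Minimal sketch (s = 1): random walk from the binomial process via
# Y(n) = 2*Y_tilde(n) - n; check E[Y(n)] = 2np - n and Var[Y(n)] = 4npq.
import numpy as np

rng = np.random.default_rng(6)
p, n, M = 0.5, 100, 100_000
Z = rng.binomial(1, p, size=(M, n))         # Z(n) in {0, 1}
Y_tilde = Z.cumsum(axis=1)                  # binomial process
Y = 2 * Y_tilde - np.arange(1, n + 1)       # random walk with +/-1 steps

print(Y[:, -1].mean(), 2 * n * p - n)       # ~ 2np - n
print(Y[:, -1].var(), 4 * n * p * (1 - p))  # ~ 4npq
```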

Sinusoidal processes

$X(t) = A \cos(\Omega t + \Theta)$, where $A$ is constant and the frequency $\Omega$ and phase $\Theta$ are random and independent. Assume $\Theta \sim U[-\pi, \pi]$. The sinusoidal process $X(t)$ is w.s.s.

The mean:
$E[X(t)] = A\, E[\cos(\Omega t + \Theta)] = A\, E[\cos(\Omega t)\cos(\Theta) - \sin(\Omega t)\sin(\Theta)] = A\big(E[\cos(\Omega t)]\, E[\cos(\Theta)] - E[\sin(\Omega t)]\, E[\sin(\Theta)]\big) = 0$
by independence, since
$E[\sin(\Theta)] = \int_{-\pi}^{\pi} \frac{1}{2\pi} \sin(\theta)\, d\theta = 0$
(the area under a period of the sinusoid, $T_0 = 2\pi$), and likewise $E[\cos(\Theta)] = 0$.

The autocorrelation:
$R_X(t + \tau, t) = E[X(t + \tau) X(t)] = A^2\, E[\cos(\Omega(t + \tau) + \Theta)\cos(\Omega t + \Theta)] = \frac{A^2}{2} E\big[\cos(\Omega \tau) + \cos(\Omega(2t + \tau) + 2\Theta)\big]$
The second term, $\frac{A^2}{2} E[\cos(\Omega(2t + \tau))\cos(2\Theta)] - \frac{A^2}{2} E[\sin(\Omega(2t + \tau))\sin(2\Theta)]$, is zero because of the independence and because $E[\sin(2\Theta)] = E[\cos(2\Theta)] = 0$ by similar reasons as above. Therefore
$R_X(t + \tau, t) = \frac{A^2}{2} E[\cos(\Omega \tau)]$
which depends only on the lag $\tau$, so together with the constant mean, $X(t)$ is w.s.s.
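A Monte Carlo sketch of the w.s.s. property, under the simplifying assumptions $A = 1$, a fixed (non-random) frequency, and $\Theta \sim U[-\pi, \pi]$; the parameter values and names are illustrative:

```python
# Minimal sketch: random-phase sinusoid has ~zero mean and an autocorrelation
# close to (A^2/2) cos(omega * tau), i.e., a function of the lag only.
import numpy as np

rng = np.random.default_rng(7)
A, omega, M = 1.0, 2 * np.pi * 0.05, 200_000
theta = rng.uniform(-np.pi, np.pi, size=M)

def X(t):
    return A * np.cos(omega * t + theta)    # one value per realization

t, tau = 3.0, 1.5
print(X(t).mean())                                                    # ~ 0
print((X(t + tau) * X(t)).mean(), 0.5 * A**2 * np.cos(omega * tau))   # ~ equal
```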

$X(t) = A\cos(\omega_0 t)$, with $A \sim U[0, 1]$ and $\omega_0$ constant. $X(t)$ is non-stationary.

At $t = 0$: $X(t) = A\cos(0) = A \sim U[0, 1]$
At $t = \pi/(4\omega_0)$: $X(t) = A\cos(\pi/4) = A\sqrt{2}/2 \sim U[0, \sqrt{2}/2]$
At $t = \pi/\omega_0$: $X(t) = A\cos(\pi) = -A \sim U[-1, 0]$

For each time the first-order p.d.f. is different, so the process is not strictly stationary.

The process can also be shown not to be wide-sense stationary:
$E[X(t)] = \cos(\omega_0 t)\, E[A] = 0.5\cos(\omega_0 t)$
$R_X(t + \tau, t) = E[A^2]\cos(\omega_0(t + \tau))\cos(\omega_0 t) = \frac{E[A^2]}{2}\big[\cos(\omega_0 \tau) + \cos(\omega_0(2t + \tau))\big]$
$R_X(t, t) = E[X^2(t)] = \frac{E[A^2]}{2}\big(1 + \cos(2\omega_0 t)\big)$
with $E[A^2] = \int_0^1 a^2\, da = 1/3$. The mean and the second moment (hence the variance) are not constant, and the autocorrelation is not a function of the lag alone; therefore the process is not wide-sense stationary.
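A short sketch of the non-stationarity: the ensemble mean $0.5\cos(\omega_0 t)$ clearly varies with $t$ (the value of $\omega_0$ is an arbitrary choice):

```python
# Minimal sketch: for X(t) = A cos(omega0 t) with A ~ U[0, 1], the ensemble
# mean 0.5*cos(omega0 t) changes with t, so the process is not w.s.s.
import numpy as np

rng = np.random.default_rng(8)
omega0, M = 1.0, 200_000
A = rng.uniform(0.0, 1.0, size=M)

for t in (0.0, np.pi / (4 * omega0), np.pi / omega0):
    Xt = A * np.cos(omega0 * t)
    print(t, Xt.mean(), 0.5 * np.cos(omega0 * t))   # empirical vs. theoretical mean
```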

Gaussian processes

Let $X(t) = A + Bt$, where $A$ and $B$ are jointly Gaussian. Determine if $X(t)$ is Gaussian. Consider
$\begin{bmatrix} X(t_1) \\ \vdots \\ X(t_n) \end{bmatrix} = \begin{bmatrix} 1 & t_1 \\ \vdots & \vdots \\ 1 & t_n \end{bmatrix} \begin{bmatrix} A \\ B \end{bmatrix}$
For any $n$ and times $\{t_k\}$, $1 \le k \le n$, the above is a linear combination of $A$ and $B$, which are jointly Gaussian; then $\{X(t_k)\}$ are jointly Gaussian, and so the process $X(t)$ is Gaussian.

Gaussian processes

Consider a Gaussian process $W(n)$, $-\infty < n < \infty$, such that $E[W(n)] = 0$ for all $n$ and
$R(k, \ell) = \sigma^2\delta(k - \ell) = \begin{cases} \sigma^2 & k = \ell \\ 0 & k \ne \ell \end{cases}$
Such a process is a discrete white noise; determine its $n$-th order p.d.f. The covariance is
$C(k, \ell) = R(k, \ell) - E[W(k)]\, E[W(\ell)] = R(k, \ell)$
which is zero when $k \ne \ell$, so $W(k)$ and $W(\ell)$ are uncorrelated and, being Gaussian, they are independent. So
$f(w_{n_1}, \ldots, w_{n_m}) = \prod_i f(w_{n_i})$
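A sketch of the first example, assuming a particular (illustrative) covariance $\Sigma$ for $(A, B)$: samples of $(X(t_1), \ldots, X(t_n))$ are generated through the linear map above, and their sample covariance is compared with $H \Sigma H^T$:

```python
# Minimal sketch: X(t) = A + B t with (A, B) jointly Gaussian. The vector
# (X(t_1), ..., X(t_n)) is H [A B]^T, hence jointly Gaussian with covariance
# H Sigma H^T; compare that with the sample covariance of simulated draws.
import numpy as np

rng = np.random.default_rng(9)
Sigma = np.array([[1.0, 0.3],
                  [0.3, 0.5]])             # covariance of (A, B), assumed zero mean
t = np.array([0.0, 1.0, 2.0, 3.0])
H = np.column_stack([np.ones_like(t), t])  # rows [1, t_k]

AB = rng.multivariate_normal([0.0, 0.0], Sigma, size=100_000)
X = AB @ H.T                               # X[:, k] = A + B * t_k

print(np.round(np.cov(X, rowvar=False), 2))
print(np.round(H @ Sigma @ H.T, 2))        # should agree
```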