MODULE 9: STATIONARY PROCESSES

Lecture 2: Autoregressive Processes

1 Moving Average Process

Pure Random Process (White Noise Process): a random process {X_t, t ≥ 0} which has:

E[X_t] = m (constant), i.e., the mean is independent of t;
E[X_t X_{t+k}] = σ² if k = 0, and 0 if k ≠ 0.

It can be verified that this process is covariance stationary.

Moving Average Process: A moving average process {X_t, t ≥ 0} is a stochastic process of the form

X_t = a_0 e_t + a_1 e_{t−1} + ... + a_h e_{t−h},

where the a_i are real constants and {e_t} is a pure random process with mean 0 and variance σ². When the constant a_h ≠ 0, the above process is called a moving average (MA, in short) process of order h.

Now let C_k = E[X_t X_{t+k}] be the covariance coefficient; then

C_k = (a_0 a_k + a_1 a_{k+1} + ... + a_{h−k} a_h) σ²  if k ≤ h,
C_k = 0  if k > h.
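The covariance formula above can be checked mechanically. Below is a minimal Python sketch; the helper name, the coefficient values, and σ² are illustrative choices, not from the lecture:

```python
import numpy as np

def ma_autocovariance(a, sigma2, k):
    """Covariance C_k of an MA(h) process with coefficients a_0, ..., a_h.

    C_k = sigma^2 * (a_0 a_k + a_1 a_{k+1} + ... + a_{h-k} a_h) for k <= h,
    and 0 for k > h.
    """
    a = np.asarray(a, dtype=float)
    k = abs(k)
    if k >= len(a):
        return 0.0  # covariance vanishes beyond the order h
    return sigma2 * float(np.dot(a[:len(a) - k], a[k:]))

# MA(2) process with illustrative coefficients (a_0, a_1, a_2) = (1, 0.5, 0.25)
a, sigma2 = [1.0, 0.5, 0.25], 1.0
C0 = ma_autocovariance(a, sigma2, 0)   # 1 + 0.25 + 0.0625 = 1.3125
C1 = ma_autocovariance(a, sigma2, 1)   # 1*0.5 + 0.5*0.25 = 0.625
C3 = ma_autocovariance(a, sigma2, 3)   # lag beyond the order h = 2, so 0
```

Note that C_k = 0 for every lag beyond the order h; this cutoff is the characteristic signature of an MA process.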
Now we define the correlation coefficient ρ_k:

ρ_k = C_k / C_0 = (a_0 a_k + a_1 a_{k+1} + ... + a_{h−k} a_h) / (a_0² + a_1² + ... + a_h²)  if k ≤ h,
ρ_k = 0  if k > h.

It can be verified that the moving average process is covariance stationary (wide-sense stationary).

First Order Markov Process

One very important process related to the moving average is the first order Markov process, which is defined as:

X_t + a X_{t−1} = e_t,  |a| < 1,

where {e_t, t ∈ T} is a pure random process with mean 0 and variance 1. Now X_t can be written as

X_t = Σ_{k=0}^∞ ρ_k e_{t−k},  where ρ_k = (−a)^k for all k.

Thus the Markov process of first order can be related to a moving average process of infinite order.

2 Autoregressive Process

Autoregressive Process (AR): A stochastic process {X_t, t ∈ T} of the form

X_t + b_1 X_{t−1} + b_2 X_{t−2} + ... + b_h X_{t−h} = e_t,

where {e_t} is a pure random process with mean 0 and the b_i are real constants with b_h ≠ 0, is called an AR process of order h.

AR process of infinite order: When X_t = Σ_{r=0}^∞ b_r e_{t−r}, the process {X_t, t ∈ T} is an AR process of infinite order.

Now we consider a special case of autoregressive processes, namely an autoregressive
process of order 2. An autoregressive process of order 2 will be of the form:

X_t + b_1 X_{t−1} + b_2 X_{t−2} = e_t.

This process is known as the Yule process.

Autoregressive Moving Average Process

A stochastic process is called an autoregressive moving average (ARMA) process if it is of the form:

Σ_{r=0}^p b_r X_{t−r} = Σ_{s=0}^q a_s e_{t−s},

where {e_t, t ∈ T} is a pure random process with mean 0 and the b_i are real constants with b_0 = 1. The above process is called an ARMA process of order (p, q). If p and q are infinite, then it is called an ARMA process of infinite order.

Power Spectrum

The autoregressive and moving average processes are useful in the study of the power spectrum. For a wide-sense stationary process we define:

Covariance function: C_k = E[X_t X_{t+k}] − E[X_t] E[X_{t+k}]

Correlation function: ρ_k = C_k / (√Var(X_t) √Var(X_{t+k})) = ∫_{−π}^{π} e^{ikw} dF_1(w),

where F_1(w) is a cdf, called the integrated spectrum; whenever it is absolutely continuous, its derivative f_1(w) = dF_1(w)/dw is the normalized spectral density function.

Inverse representation of ρ_k:

f_1(w) = (1/2π) Σ_{k=−∞}^{∞} ρ_k e^{−ikw},

i.e., knowing the correlation coefficients we can determine the density function.
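The ARMA recursion can be simulated directly by solving for X_t at each step. Below is a sketch under the assumption b_0 = 1 and zero initial conditions; the helper name simulate_arma and all numeric values are illustrative. As a check, the first order Markov process X_t + a X_{t−1} = e_t is compared against its infinite-order moving average expansion Σ_k (−a)^k e_{t−k}:

```python
import numpy as np

def simulate_arma(b, a, e):
    """Simulate sum_{r=0}^p b_r X_{t-r} = sum_{s=0}^q a_s e_{t-s}, with b_0 = 1.

    Values of X and e before time 0 are taken to be 0 (zero initial conditions).
    """
    b, a = np.asarray(b, float), np.asarray(a, float)
    x = np.zeros(len(e))
    for t in range(len(e)):
        ma = sum(a[s] * e[t - s] for s in range(len(a)) if t - s >= 0)
        ar = sum(b[r] * x[t - r] for r in range(1, len(b)) if t - r >= 0)
        x[t] = ma - ar  # solve the recursion for X_t (using b_0 = 1)
    return x

rng = np.random.default_rng(0)
e = rng.standard_normal(200)

# First order Markov process X_t + a X_{t-1} = e_t with a = 0.6 (|a| < 1)
alpha = 0.6
x_ar = simulate_arma([1.0, alpha], [1.0], e)

# Its MA(infinity) expansion, truncated at the available history:
x_ma = np.array([sum((-alpha) ** k * e[t - k] for k in range(t + 1))
                 for t in range(len(e))])

# Yule process (AR(2)): X_t + b_1 X_{t-1} + b_2 X_{t-2} = e_t
x_yule = simulate_arma([1.0, -0.5, 0.2], [1.0], e)
```

With zero initial conditions the two representations of the Markov process agree exactly, which is the sense in which it "is" a moving average process of infinite order.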
Representation of C_k:

C_k = ∫_{−π}^{π} e^{ikw} dF(w),

where dF(w) = C_0 dF_1(w); whenever F(w) is an absolutely continuous function, the inverse representation of C_k can be given by

f(w) = (1/2π) Σ_{k=−∞}^{∞} C_k e^{−ikw}.

Hence f(w) is called the spectral density function and f_1(w) is called the normalized spectral density function.

The advantage of studying the power spectrum is the following: for a stochastic process {X(t), t ∈ T} with the wide-sense stationary property, the time-domain quantities C_k and ρ_k can be found, but they are not always simple to evaluate. In the frequency domain, the same process has a spectral density and a normalized spectral density function, and using the inverse relations we can recover C_k and ρ_k from them. This is called the spectrum study.

EXAMPLE 1. Consider the white noise process {X(t), t ∈ T}. Assume E[X_t] = 0 and

E[X_r X_s] = σ² if r = s, and 0 if r ≠ s.

First we find a few quantities in the time domain:

C_k = 0 for k ≠ 0,  C_0 = σ²;
ρ_k = 0 for k ≠ 0,  ρ_0 = C_0/σ² = 1.

Now we study the same process in the frequency domain:

Spectral density: f(w) = (1/2π) Σ_{k=−∞}^{∞} C_k e^{−ikw} = σ²/(2π), for −π ≤ w ≤ π (and 0 otherwise).

Integrated spectrum: F(w) = ∫ f(w) dw = σ² w/(2π) for −π ≤ w ≤ π (up to an additive constant), so that

dF(w) = (σ²/(2π)) dw.
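The time-domain claims of the example (C_0 = σ², C_k = 0 for k ≠ 0) can also be checked empirically on simulated white noise; a small sketch with an illustrative sample size and seed:

```python
import numpy as np

rng = np.random.default_rng(42)
n = 100_000
x = rng.standard_normal(n)  # white noise with sigma^2 = 1

def sample_acvf(x, k):
    """Empirical C_k = (1/n) sum_t x_t x_{t+k} (the process mean is 0 here)."""
    n = len(x)
    return float(np.dot(x[: n - k], x[k:]) / n)

C0 = sample_acvf(x, 0)   # estimates sigma^2 = 1
C1 = sample_acvf(x, 1)   # estimates 0
C5 = sample_acvf(x, 5)   # estimates 0
```

The sampling error of these estimates shrinks like 1/√n, so for n = 100000 they sit within a few hundredths of the theoretical values.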
Now we show that, using the inverse relation, we can find C_k given f(w):

C_k = ∫_{−π}^{π} e^{ikw} dF(w) = ∫_{−π}^{π} e^{ikw} (σ²/(2π)) dw = σ² if k = 0, and 0 if k ≠ 0.
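This final computation can also be done numerically: integrating e^{ikw} against the flat density f(w) = σ²/(2π) over [−π, π] recovers C_0 = σ² and C_k = 0 for k ≠ 0. A sketch using a simple trapezoid rule (the grid size and the value of σ² are illustrative):

```python
import numpy as np

sigma2 = 2.0
w = np.linspace(-np.pi, np.pi, 20001)       # grid over [-pi, pi]
f = np.full_like(w, sigma2 / (2 * np.pi))   # flat spectral density of white noise
dw = w[1] - w[0]

def C(k):
    """C_k = integral of e^{ikw} f(w) dw; f is even, so the cosine part suffices."""
    g = np.cos(k * w) * f
    return dw * (g.sum() - 0.5 * (g[0] + g[-1]))  # trapezoid rule

C0, C1, C5 = C(0), C(1), C(5)
```

With this many grid points the quadrature reproduces the closed-form answer (C_0 = σ², C_k = 0 otherwise) essentially to machine precision.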