
MODULE 9: STATIONARY PROCESSES

Lecture 2: Autoregressive Processes

1 Moving Average Process

Pure Random Process (or White Noise Process): a random process {X_t, t ≥ 0} which has:

E[X_t] = m (constant), i.e., the mean is independent of t;
E[X_t X_{t+k}] = σ² if k = 0, and 0 if k ≠ 0.

It can be verified that the above process is a covariance stationary process.

Moving Average Process: A moving average process {X_t, t ≥ 0} is a stochastic process of the form

X_t = a_0 e_t + a_1 e_{t-1} + ... + a_h e_{t-h},

where the a_i are real constants and {e_t} is a pure random process with mean 0 and variance σ². When the constant a_h ≠ 0, the above process is called a moving average (MA for short) process of order h. Now let C_k = E[X_t X_{t+k}] be the covariance at lag k; then

C_k = (a_0 a_k + a_1 a_{k+1} + ... + a_{h-k} a_h) σ² if k ≤ h,
C_k = 0 if k > h.
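The formula above for C_k is easy to check numerically. The sketch below is a minimal illustration, assuming an MA(2) with arbitrarily chosen coefficients a = (1.0, 0.6, 0.3) and σ² = 1 (these values are not from the notes); it compares the theoretical C_k with sample autocovariances from a long simulated path.

```python
import numpy as np

# Illustrative MA(2): X_t = a_0 e_t + a_1 e_{t-1} + a_2 e_{t-2},
# with arbitrarily chosen coefficients (not from the notes).
rng = np.random.default_rng(0)
a = np.array([1.0, 0.6, 0.3])              # a_0, a_1, a_2  (so h = 2)
sigma2 = 1.0
e = rng.normal(0.0, np.sqrt(sigma2), size=200_000)
x = np.convolve(e, a, mode="valid")        # X_t = sum_i a_i e_{t-i}

def theoretical_C(k, a, sigma2):
    """C_k = (a_0 a_k + ... + a_{h-k} a_h) sigma^2 for k <= h, else 0."""
    h = len(a) - 1
    if k > h:
        return 0.0
    return sigma2 * sum(a[i] * a[i + k] for i in range(h - k + 1))

for k in range(4):
    sample_C = np.mean(x * x) if k == 0 else np.mean(x[:-k] * x[k:])
    print(k, theoretical_C(k, a, sigma2), round(float(sample_C), 3))
```

With 200,000 samples the empirical lags agree with the formula to about two decimal places; C_3 comes out zero because k > h.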

Now we define the correlation coefficient ρ_k:

ρ_k = C_k / C_0 = (a_0 a_k + a_1 a_{k+1} + ... + a_{h-k} a_h) / (a_0² + a_1² + ... + a_h²) for k ≤ h, and ρ_k = 0 for k > h.

It can be verified that the moving average process is a covariance stationary (wide sense stationary) process.

First Order Markov Process

One very important related process is the first order Markov process, defined as:

X_t + a X_{t-1} = e_t, |a| < 1,

where {e_t, t ∈ T} is a pure random process with mean 0 and variance 1. Iterating the recursion, X_t can be written as

X_t = Σ_{k=0}^∞ ρ_k e_{t-k}, where ρ_k = (-a)^k for all k.

Thus the first order Markov process can be related to a moving average process of infinite order.

2 Autoregressive Process

Autoregressive (AR) Process: A stochastic process {X_t, t ∈ T} of the form

X_t + b_1 X_{t-1} + b_2 X_{t-2} + ... + b_h X_{t-h} = e_t,

where {e_t} is a pure random process with mean 0 and the b_i are real constants with b_h ≠ 0, is an AR process of order h.

AR process of infinite order: when Σ_{r=0}^∞ b_r X_{t-r} = e_t, the process {X_t, t ∈ T} is an AR process of infinite order.

Now we consider a special case of autoregressive processes, i.e., an autoregressive

process of order 2. An autoregressive process of order 2 has the form:

X_t + b_1 X_{t-1} + b_2 X_{t-2} = e_t.

This process is known as the Yule process.

Autoregressive Moving Average Process

A stochastic process is called an autoregressive moving average (ARMA) process if it is of the form:

Σ_{r=0}^p b_r X_{t-r} = Σ_{s=0}^q a_s e_{t-s},

where {e_t, t ∈ T} is a pure random process with mean 0 and the b_i are real constants with b_0 = 1. The above process is called an ARMA process of order (p, q). If p and q are infinite, it is called an ARMA process of infinite order.

Power Spectrum

The autoregressive and moving average processes are useful in the study of the power spectrum. For a wide sense stationary process we define:

Covariance function: C_k = E[X_t X_{t+k}] - E[X_t] E[X_{t+k}].

Correlation function: ρ_k = C_k / (√Var(X_t) √Var(X_{t+k})) = ∫_{-π}^{π} e^{ikw} dF_1(w),

where F_1(w) is a cdf called the integrated spectrum; whenever F_1 is absolutely continuous, its derivative f_1(w) is the normalized spectral density.

Inverse representation of ρ_k: f_1(w) = (1/2π) Σ_{k=-∞}^{∞} ρ_k e^{-ikw}, i.e., knowing the correlation coefficients we can determine the density function.
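The inverse representation of ρ_k can be checked numerically. The sketch below assumes a hypothetical MA(1) process X_t = e_t + a e_{t-1} with an illustrative coefficient a = 0.5 (not taken from the notes), for which ρ_0 = 1, ρ_1 = a/(1 + a²), and ρ_k = 0 for k > 1; it builds f_1(w) from these correlations and integrates it back to recover them.

```python
import numpy as np

# Hypothetical MA(1): X_t = e_t + a e_{t-1}; then
#   rho_0 = 1, rho_1 = a/(1 + a^2), rho_k = 0 for |k| > 1,
# so f_1(w) = (1/2pi) sum_k rho_k e^{-ikw} truncates to
#   f_1(w) = (1 + 2 rho_1 cos w) / (2 pi).
a = 0.5                                   # illustrative coefficient
rho1 = a / (1 + a**2)                     # = 0.4

w = np.linspace(-np.pi, np.pi, 100_001)
f1 = (1 + 2 * rho1 * np.cos(w)) / (2 * np.pi)

def integrate(g, w):
    """Trapezoidal rule on a uniform grid."""
    return np.sum((g[:-1] + g[1:]) / 2) * (w[1] - w[0])

# Inverse relation rho_k = integral_{-pi}^{pi} e^{ikw} f_1(w) dw;
# the imaginary part vanishes by symmetry, so cos(kw) suffices.
rho0_rec = integrate(np.cos(0 * w) * f1, w)
rho1_rec = integrate(np.cos(1 * w) * f1, w)
print(rho0_rec, rho1_rec)   # ≈ 1.0 and 0.4
```

Integrating f_1 over [-π, π] returns ρ_0 = 1, which is one way to see why f_1 is called the *normalized* spectral density.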

Representation of C_k: C_k = ∫_{-π}^{π} e^{ikw} dF(w), where dF(w) = C_0 dF_1(w). Whenever F(w) is absolutely continuous, the inverse representation of C_k is

f(w) = (1/2π) Σ_{k=-∞}^{∞} C_k e^{-ikw}.

Here f(w) is called the spectral density function and f_1(w) is called the normalized spectral density function.

The advantage of studying the power spectrum is the following. For a stochastic process {X(t), t ∈ T} that is wide sense stationary, the time-domain quantities C_k and ρ_k can be found, but they are not always simple to evaluate. In the frequency domain, the same process has a spectral density and a normalized spectral density, and using the inverse relations we can find C_k and ρ_k. This is called the spectrum study.

EXAMPLE 1. Consider the white noise process {X(t), t ∈ T}. Assume E[X_t] = 0 and

E[X_r X_s] = σ² if r = s, and 0 if r ≠ s.

First we find a few quantities in the time domain:

C_k = 0 for k ≠ 0, C_0 = σ²;
ρ_k = 0 for k ≠ 0, ρ_0 = C_0/σ² = 1.

Now we study the same process in the frequency domain:

Spectral density: f(w) = (1/2π) Σ_{k=-∞}^{∞} C_k e^{-ikw} = σ²/(2π), for -π ≤ w ≤ π.

Integrated spectrum: F(w) = σ²(w + π)/(2π) for -π ≤ w ≤ π (0 for w < -π, σ² for w > π), so the derivative gives dF(w) = (σ²/(2π)) dw.

Now we show that, using the inverse relation, we can recover C_k given f(w):

C_k = ∫_{-π}^{π} e^{ikw} dF(w) = σ² if k = 0, and 0 if k ≠ 0.
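This recovery can also be verified numerically. The sketch below uses an arbitrary illustrative value σ² = 2 (any value works): it integrates e^{ikw} f(w) over [-π, π] for the white-noise density f(w) = σ²/(2π) and recovers C_0 = σ² and C_k = 0 for k ≠ 0.

```python
import numpy as np

# White-noise spectral density f(w) = sigma^2 / (2 pi) on [-pi, pi];
# sigma^2 = 2 is an arbitrary illustrative choice.
sigma2 = 2.0
w = np.linspace(-np.pi, np.pi, 100_001)
f = np.full_like(w, sigma2 / (2 * np.pi))

def C(k):
    """C_k = integral_{-pi}^{pi} e^{ikw} f(w) dw via the trapezoidal rule.
    The imaginary part vanishes by symmetry, so cos(kw) suffices."""
    g = np.cos(k * w) * f
    return np.sum((g[:-1] + g[1:]) / 2) * (w[1] - w[0])

print([round(float(C(k)), 6) for k in range(4)])   # ≈ [2.0, 0.0, 0.0, 0.0]
```

The flat density integrates to C_0 = σ² at lag 0, while every oscillating factor cos(kw) averages out over a full period, giving C_k = 0 for k ≠ 0, exactly as in the time-domain calculation.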