Some Time-Series Models


Outline

1. Stochastic processes and their properties
2. Stationary processes
3. Some properties of the autocorrelation function
4. Some useful models: purely random processes, random walks and MA processes; autoregressive processes; ARMA, ARIMA and the general linear model
5. The Wold decomposition theorem

Stochastic processes and their properties

A stochastic process can be described as a statistical phenomenon that evolves in time according to probabilistic laws. Mathematically, a stochastic process is a collection of random variables that are ordered in time and defined at a set of time points, which may be continuous or discrete. Most statistical problems are concerned with estimating the properties of a population from a sample. In time-series analysis, however, the order of observations is determined by time, and it is usually impossible to make more than one observation at any given time. We may regard the observed time series as just one example of the infinite set of time series that might have been observed. This infinite set of time series is called the ensemble, and every

member of the ensemble is a possible realization of the stochastic process.

A simple way of describing a stochastic process is to give the moments of the process. Denote the random variable at time t by X(t) if time is continuous, and by X_t if time is discrete.

Mean: the mean function is defined for all t by μ(t) = E[X(t)].

Variance: the variance function is defined for all t by σ²(t) = Var[X(t)].

Autocovariance: we define the acv.f. γ(t_1, t_2) to be the covariance of X(t_1) with X(t_2):
γ(t_1, t_2) = E{[X(t_1) − μ(t_1)][X(t_2) − μ(t_2)]}.

Stationary processes

A time series is said to be strictly stationary if the joint distribution of X(t_1), …, X(t_n) is the same as the joint distribution of X(t_1 + τ), …, X(t_n + τ) for all t_1, …, t_n and τ. Strict stationarity implies that μ(t) = μ and σ²(t) = σ² are constant, and that γ(t_1, t_2) depends only on the lag τ = t_2 − t_1, so we may write
γ(τ) = Cov[X(t), X(t + τ)],
which is called the autocovariance coefficient at lag τ. The size of γ(τ) depends on the units in which X(t) is measured. One usually standardizes the acv.f. to produce the autocorrelation function (ac.f.), which is defined by
ρ(τ) = γ(τ)/γ(0).

A process is called second-order stationary (or weakly stationary) if its mean is constant and its acv.f. depends only on the lag, so that
E[X(t)] = μ
and
Cov[X(t), X(t + τ)] = γ(τ).
This weaker definition of stationarity will generally be used from now on.

Some properties of the autocorrelation function

Suppose a stationary stochastic process X(t) has mean μ, variance σ², acv.f. γ(τ) and ac.f. ρ(τ). Then:

ρ(τ) = γ(τ)/γ(0) = γ(τ)/σ², with ρ(0) = 1 and |ρ(τ)| ≤ 1.

The ac.f. is an even function of lag, so that ρ(τ) = ρ(−τ).

The ac.f. does not uniquely identify the underlying model: although a given stochastic process has a unique covariance structure, the converse is not in general true.

Some useful models

Purely random processes

A discrete-time process is called a purely random process if it consists of a sequence of random variables {Z_t} which are mutually independent and identically distributed. We normally assume that Z_t ~ N(0, σ_Z²). The independence assumption means that

γ(k) = Cov(Z_t, Z_{t+k}) = σ_Z² if k = 0, and γ(k) = 0 if k ≠ 0,

so that ρ(k) = 1 for k = 0 and ρ(k) = 0 for k ≠ 0.

The process is strictly stationary, and hence weakly stationary. A purely random process is sometimes called white noise, particularly by engineers.

Example: Pfizer stock returns and their ACFs

Let P_t be the Pfizer stock price at the end of month t; then r_t = log(P_t) − log(P_{t−1}) is the monthly log return of the stock.

rets <- read.table("dlogret6stocks.txt", header = TRUE)
ser1 <- ts(rets[, 2], start = c(2000, 1), frequency = 12)
par(mfrow = c(2, 1), cex = 0.8)
plot(ser1, xlab = "Month", ylab = "Pfizer")
title("Example: Returns of monthly Pfizer stock price")
acf(rets[, 2])

Some useful models

Random walks

Suppose that {Z_t} is a discrete-time, purely random process with mean μ and variance σ_Z². A process {X_t} is said to be a random walk if
X_t = X_{t−1} + Z_t.
The process is customarily started at zero when t = 0, so that X_1 = Z_1 and
X_t = Z_1 + Z_2 + … + Z_t.
We find that E[X_t] = tμ and that Var(X_t) = tσ_Z². As the mean and the variance change with t, the process is nonstationary. However, the first differences of a random walk, ∇X_t = X_t − X_{t−1} = Z_t, form a purely random process, which is stationary. A good example of a time series that behaves like a random walk is the sequence of share prices on successive days.

Example: Pfizer's accumulated stock returns
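These properties can be illustrated numerically. Below is a short Python sketch (NumPy assumed; the seed, number of paths and horizon are arbitrary illustrative choices) showing that Var(X_t) grows linearly in t, while first differencing recovers the underlying purely random process:

```python
import numpy as np

rng = np.random.default_rng(1)
n_paths, n_steps = 2000, 400
z = rng.normal(0.0, 1.0, size=(n_paths, n_steps))  # purely random increments Z_t
x = np.cumsum(z, axis=1)                           # random walk X_t = Z_1 + ... + Z_t

# Var(X_t) = t * sigma_Z^2 grows linearly in t, so the walk is nonstationary
var_t = x.var(axis=0)
print(var_t[99] / var_t[399])   # roughly 100/400 = 0.25

# first differences recover the stationary purely random process exactly
np.testing.assert_allclose(np.diff(x, axis=1), z[:, 1:], atol=1e-9)
```

The ratio of estimated variances at t = 100 and t = 400 is close to 1/4, as the linear growth of Var(X_t) predicts.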

Some useful models

Moving average processes

A process {X_t} is said to be a moving average process of order q (or MA(q) process) if
X_t = β_0 Z_t + β_1 Z_{t−1} + … + β_q Z_{t−q},
where the β_i are constants and {Z_t} is a purely random process with mean 0 and variance σ_Z². The β's are usually scaled so that β_0 = 1. We can show that, since the Z's are independent,
E[X_t] = 0 and Var(X_t) = σ_Z² (β_0² + β_1² + … + β_q²).
Using Cov(Z_s, Z_t) = σ_Z² for s = t and 0 for s ≠ t, we have
γ(k) = σ_Z² Σ_{i=0}^{q−k} β_i β_{i+k} for k = 0, 1, …, q, and γ(k) = 0 for k > q.

Example
Consider the MA(1) series defined by the equation
X_t = Z_t + θ Z_{t−1},
where the Z_t are independent Normal random variables with mean 0 and variance σ_Z². We then have
γ(0) = (1 + θ²) σ_Z², γ(1) = θ σ_Z², and γ(k) = 0 for k > 1.
Hence X_t is stationary. The ac.f. of X_t is
ρ(0) = 1, ρ(1) = θ/(1 + θ²), and ρ(k) = 0 for k > 1.

Examples: MA series and their ACFs

Example
Consider the MA(2) series defined by the equation
X_t = Z_t + β_1 Z_{t−1} + β_2 Z_{t−2},
where the Z_t are independent Normal random variables with mean 0 and variance σ_Z². Compute its ACFs.
Hint: compute the autocovariances γ(0), γ(1) and γ(2), respectively.

We can see that the MA(q) process is second-order stationary for all values of the β_i. Furthermore, if the Z's are normally distributed, then so are the X's, and we have a strictly stationary normal process. The ac.f. of the above MA(q) process is given by

ρ(k) = 1 for k = 0,
ρ(k) = Σ_{i=0}^{q−k} β_i β_{i+k} / Σ_{i=0}^{q} β_i² for k = 1, …, q,
ρ(k) = 0 for k > q.

Note that the ac.f. cuts off at lag q, which is a special feature of MA processes. For instance, the MA(1) process with β_0 = 1 and β_1 = θ has an ac.f. given by
ρ(0) = 1, ρ(±1) = θ/(1 + θ²), and ρ(k) = 0 otherwise.
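The lag-q cutoff can be checked by simulation. A Python sketch (NumPy assumed; θ = 0.8 and the seed are arbitrary illustrative choices) comparing the sample ACF of a simulated MA(1) series with the theoretical ρ(1) = θ/(1 + θ²):

```python
import numpy as np

theta, n = 0.8, 200_000
rng = np.random.default_rng(2)
z = rng.normal(size=n + 1)
x = z[1:] + theta * z[:-1]          # MA(1): X_t = Z_t + theta * Z_{t-1}

def r(x, k):
    """Sample autocorrelation at lag k."""
    xc = x - x.mean()
    return np.dot(xc[:-k], xc[k:]) / np.dot(xc, xc)

rho1_theory = theta / (1 + theta**2)
print(r(x, 1), rho1_theory)         # sample value close to the theoretical one
print(r(x, 2), r(x, 3))             # near zero: the ac.f. cuts off after lag 1
```

With n large, the sample autocorrelations at lags 2 and 3 are within sampling error of zero, illustrating the cutoff.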

Although there are no restrictions on the β_i for a (finite-order) MA process to be stationary, restrictions are usually imposed on the β_i to ensure that the process satisfies a condition called invertibility.

Example: Consider the following MA(1) processes:
(A) X_t = Z_t + θ Z_{t−1};
(B) X_t = Z_t + (1/θ) Z_{t−1}.
We can show that (A) and (B) have exactly the same ac.f., since θ/(1 + θ²) = (1/θ)/(1 + 1/θ²); hence we cannot identify an MA process uniquely from a given ac.f. If we invert models (A) and (B) by expressing Z_t in terms of the X's, we have
(A) Z_t = X_t − θ X_{t−1} + θ² X_{t−2} − …;
(B) Z_t = X_t − (1/θ) X_{t−1} + (1/θ²) X_{t−2} − ….

Note that the series of coefficients of the X's for models (A) and (B) cannot both be convergent at the same time: one converges when |θ| < 1 and the other when |θ| > 1. In general, a process {X_t} is said to be invertible if the random disturbance Z_t at time t (the innovation) can be expressed as a convergent sum of present and past values of X_t in the form
Z_t = Σ_{j=0}^{∞} π_j X_{t−j}, where Σ_j |π_j| < ∞.
The definition above implies that an invertible process can be rewritten in the form of an autoregressive process (possibly of infinite order) whose coefficients form a convergent sum. From the definition above, model (A) is invertible (when |θ| < 1) whereas model (B) is not. The imposition of the invertibility condition ensures that there is a unique MA process for a given ac.f.

The invertibility condition for an MA process of any order can be expressed by using the backward shift operator, denoted by B, which is defined by
B^j X_t = X_{t−j} for all j.
Denote θ(B) = 1 + β_1 B + … + β_q B^q; then the MA(q) model may be written as
X_t = θ(B) Z_t.
It can be shown that an MA(q) process is invertible if the roots of the equation
θ(B) = 1 + β_1 B + β_2 B² + … + β_q B^q = 0
all lie outside the unit circle, where B is regarded as a complex variable instead of as an operator.

Example: In the MA(1) process X_t = Z_t + θ Z_{t−1}, we have θ(B) = 1 + θB, whose root is B = −1/θ, and the invertibility condition is |θ| < 1.
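The invertibility condition can be illustrated numerically: for |θ| < 1 the sum Σ_j (−θ)^j X_{t−j} converges to Z_t. A Python sketch (NumPy assumed; θ = 0.5, the seed and the truncation point are arbitrary illustrative choices):

```python
import numpy as np

theta, n, J = 0.5, 1000, 40
rng = np.random.default_rng(3)
z = rng.normal(size=n + 1)
x = z[1:] + theta * z[:-1]          # invertible MA(1): |theta| < 1

# reconstruct the innovation at time t from present and past X values:
# Z_t = sum_j (-theta)^j X_{t-j}, truncated after J terms
t = n - 1
z_hat = sum((-theta) ** j * x[t - j] for j in range(J))
print(abs(z_hat - z[t + 1]))        # truncation error is of order theta^J, i.e. tiny
```

For model (B), replacing θ by 1/θ makes the coefficients (−1/θ)^j diverge, so the same reconstruction fails: that is precisely the non-invertibility of (B).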

Autoregressive processes

A process {X_t} is said to be an autoregressive process of order p (or AR(p)) if
X_t = α_1 X_{t−1} + … + α_p X_{t−p} + Z_t,    (1)
where {Z_t} is a purely random process with mean 0 and variance σ_Z². The above AR(p) process can be written as
(1 − α_1 B − … − α_p B^p) X_t = Z_t.

Example: AR(1) process
Consider the first-order AR process
X_t = α X_{t−1} + Z_t.    (2)
By successive substitution, we obtain that, if |α| < 1,
X_t = Z_t + α Z_{t−1} + α² Z_{t−2} + ….
We can use the backward shift operator B to explore the duality between AR and MA processes. Note that (2) can be written as
(1 − αB) X_t = Z_t,
so that
X_t = (1 − αB)^{−1} Z_t = (1 + αB + α²B² + …) Z_t.
The above form implies that
E[X_t] = 0 and Var(X_t) = σ_Z² (1 + α² + α⁴ + …).
If |α| < 1, we have Var(X_t) = σ_X² = σ_Z²/(1 − α²), and the acv.f. is given by

γ(k) = E[X_t X_{t+k}] = E[(Σ_i α^i Z_{t−i})(Σ_j α^j Z_{t+k−j})] = α^k σ_Z²/(1 − α²).
We find that the process (2) is second-order stationary provided |α| < 1. Since γ(0) = σ_X² and γ(k) = α^k σ_X², we have
ρ(k) = α^k, k = 0, 1, 2, ….
The acv.f. and ac.f. can also be written recursively:
γ(k) = α γ(k − 1), ρ(k) = α ρ(k − 1).

Figure 1: Autocorrelation functions of the AR(1) process for different values of α.
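The geometric decay ρ(k) = α^k can be checked by simulation. A Python sketch (NumPy assumed; α = 0.7 and the seed are arbitrary illustrative choices):

```python
import numpy as np

alpha, n = 0.7, 100_000
rng = np.random.default_rng(4)
z = rng.normal(size=n)
x = np.empty(n)
x[0] = z[0]
for t in range(1, n):
    x[t] = alpha * x[t - 1] + z[t]      # AR(1): X_t = alpha X_{t-1} + Z_t

def r(x, k):
    """Sample autocorrelation at lag k."""
    xc = x - x.mean()
    return np.dot(xc[:-k], xc[k:]) / np.dot(xc, xc)

for k in (1, 2, 3):
    print(k, r(x, k), alpha ** k)       # sample ACF vs theoretical alpha^k
```

Unlike the MA case, the AR(1) ac.f. never cuts off: it decays geometrically at all lags.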

General AR(p) process

Consider the AR(p) process (1), or equivalently, written as a (possibly infinite-order) MA process,
X_t = (1 − α_1 B − … − α_p B^p)^{−1} Z_t = (1 + ψ_1 B + ψ_2 B² + …) Z_t,    (3)
where the relationship between the α's and the ψ's can be found by equating coefficients. The necessary condition for (3) to be stationary is that its variance,
Var(X_t) = σ_Z² (1 + ψ_1² + ψ_2² + …),
converges; the acv.f. is given by γ(k) = σ_Z² Σ_i ψ_i ψ_{i+k} for k ≥ 0. A sufficient condition for this to converge, and hence for stationarity, is that Σ_i |ψ_i| converges.

Since the ψ's might be algebraically hard to find, an alternative, simpler way is to assume the process is stationary, multiply through (1) by X_{t−k}, take expectations, and divide by Var(X_t), assuming that the variance is finite. Then, using the fact that ρ(−k) = ρ(k) for all k, we find the

Yule-Walker Equations: ρ(k) = α_1 ρ(k − 1) + … + α_p ρ(k − p) for all k > 0.    (4)

For p = 2, the Yule-Walker equations (4) are a set of difference equations and have the general solution
ρ(k) = A_1 y_1^{|k|} + A_2 y_2^{|k|},
where y_1, y_2 are the roots of the so-called auxiliary equation
y² − α_1 y − α_2 = 0.

The constants A_1, A_2 are chosen to satisfy the initial conditions, which depend on ρ(0) = 1: using ρ(0) = 1 we get A_1 + A_2 = 1, while the first Yule-Walker equation, ρ(1) = α_1 + α_2 ρ(1), provides a further restriction on the constants. An equivalent way of expressing the stationarity condition is to say that the roots of the equation
φ(B) = 1 − α_1 B − α_2 B² − … − α_p B^p = 0    (5)
must lie outside the unit circle.

Important Example: Computing the ACFs of the AR(2) process

Suppose y_1, y_2 are the roots of the quadratic auxiliary equation y² − α_1 y − α_2 = 0. The process is stationary if |y_1| < 1 and |y_2| < 1, from which we can show that the stationarity region is the triangular region satisfying
α_1 + α_2 < 1, α_2 − α_1 < 1, −1 < α_2 < 1.
When the roots are real, we have
ρ(k) = A_1 y_1^{|k|} + A_2 y_2^{|k|},
where the constants A_1, A_2 are also real and may be found as follows. Since ρ(0) = 1, we have A_1 + A_2 = 1, while the first Yule-Walker equation gives
ρ(1) = α_1/(1 − α_2) = A_1 y_1 + A_2 y_2.

Remark: We have only considered processes with mean zero here, but non-zero means can be dealt with by rewriting (1) in the form
X_t − μ = α_1 (X_{t−1} − μ) + … + α_p (X_{t−p} − μ) + Z_t;
this does not affect the ac.f.

Example
Consider the AR(2) process given by
X_t = −(1/6) X_{t−1} + (1/6) X_{t−2} + Z_t.
Is this process stationary? If so, what is its ac.f.?

Solution. The roots of φ(B) = 1 + (1/6)B − (1/6)B² = 0 are −2 and 3. They are outside the unit circle, hence X_t is stationary. Note that the roots of the auxiliary equation y² + (1/6)y − 1/6 = 0 are y_1 = 1/3 and y_2 = −1/2. The ACFs of this process are given by
ρ(k) = A_1 (1/3)^{|k|} + A_2 (−1/2)^{|k|}.
Since ρ(0) = 1 gives A_1 + A_2 = 1, and ρ(1) = α_1/(1 − α_2) = (−1/6)/(1 − 1/6) = −1/5, we have A_1 = 9/25 and A_2 = 16/25.

Example
Consider the AR(2) process given by
X_t = X_{t−1} − (1/2) X_{t−2} + Z_t.
Is this process stationary? If so, what is its ac.f.?

Solution. The roots of (5), which, in this case, is 1 − B + (1/2)B² = 0, are complex, namely B = 1 ± i, whose moduli (√2) both exceed one; hence the process is stationary.

To calculate the ac.f. of the process, we use the first Yule-Walker equation, ρ(1) = α_1 + α_2 ρ(1), which yields ρ(1) = 1/(1 + 1/2) = 2/3. For k ≥ 2, the Yule-Walker equations are
ρ(k) = ρ(k − 1) − (1/2) ρ(k − 2),
which indicates the auxiliary equation y² − y + 1/2 = 0, with complex roots y = (1 ± i)/2 of modulus 1/√2 and arguments ±π/4. Using ρ(0) = 1 and ρ(1) = 2/3, some algebra gives
ρ(k) = (1/√2)^k [cos(kπ/4) + (1/3) sin(kπ/4)], k ≥ 0.

Figure 2: Series (top) and autocorrelation functions (bottom) of the process.

Example
Consider the AR(2) process given by
X_t = −(1/12) X_{t−1} + (1/12) X_{t−2} + Z_t.
Is this process stationary? If so, what is its ac.f.?

Solutions: (1) The roots of φ(B) = 1 + (1/12)B − (1/12)B² = 0 are 4 and −3, which are outside the unit circle, hence X_t is stationary. (2) The ACFs of X_t are

ρ(k) = A_1 (1/4)^{|k|} + A_2 (−1/3)^{|k|},
where A_1 + A_2 = 1 and A_1/4 − A_2/3 = ρ(1) = α_1/(1 − α_2) = −1/11, giving A_1 = 32/77 and A_2 = 45/77.

Exercises

Suppose {Z_t} are independent normal random variables with mean 0 and variance σ_Z². Compute the ac.f. of the AR(2) process
X_t = α_1 X_{t−1} + α_2 X_{t−2} + Z_t.

Solutions:
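The Yule-Walker computation for the AR(2) example with characteristic roots 4 and −3 can be verified in exact rational arithmetic. A Python sketch (the coefficients α_1 = −1/12, α_2 = 1/12 are the ones consistent with those roots):

```python
from fractions import Fraction as F

# AR(2) coefficients (assumed): X_t = -(1/12) X_{t-1} + (1/12) X_{t-2} + Z_t
a1, a2 = F(-1, 12), F(1, 12)

# rho(0) = 1; first Yule-Walker equation gives rho(1) = a1 / (1 - a2)
rho = [F(1), a1 / (1 - a2)]
for k in range(2, 8):
    rho.append(a1 * rho[k - 1] + a2 * rho[k - 2])   # rho(k) = a1 rho(k-1) + a2 rho(k-2)

# closed form: rho(k) = A1 (1/4)^k + A2 (-1/3)^k with A1 + A2 = 1
y1, y2 = F(1, 4), F(-1, 3)
A1 = (rho[1] - y2) / (y1 - y2)
A2 = 1 - A1
print(A1, A2)                     # 32/77 45/77
for k in range(8):
    assert rho[k] == A1 * y1**k + A2 * y2**k   # recursion and closed form agree exactly
```

Working with fractions avoids any floating-point doubt: the recursion and the closed form agree term by term.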

Some useful models

ARMA models

A mixed autoregressive/moving-average process containing p AR terms and q MA terms is said to be an ARMA process of order (p, q). It is given by
X_t = α_1 X_{t−1} + … + α_p X_{t−p} + Z_t + β_1 Z_{t−1} + … + β_q Z_{t−q}.    (6)
Let φ(B) = 1 − α_1 B − … − α_p B^p and θ(B) = 1 + β_1 B + … + β_q B^q. Equation (6) may be written in the form
φ(B) X_t = θ(B) Z_t.
The conditions on the model parameters that make the process stationary and invertible are the same as for a pure AR or MA process: the roots of φ(B) = 0 and of θ(B) = 0, respectively, must lie outside the unit circle. The importance of ARMA processes lies in the fact that a stationary time series may often be adequately modelled by an ARMA model involving fewer parameters than a pure MA or AR process by itself.

An ARMA model can be expressed as a pure MA process in the form X_t = ψ(B) Z_t, where ψ(B) = 1 + ψ_1 B + ψ_2 B² + … is the MA operator, which may be of infinite order. By comparison, we see that ψ(B) = θ(B)/φ(B). An ARMA model can also be expressed as a pure AR process in the form π(B) X_t = Z_t, where π(B) = φ(B)/θ(B). By convention we write ψ(B) and π(B) with leading coefficient 1, so that the ψ and π weights are uniquely determined.

Example
Find the ψ weights and π weights for the ARMA(1,1) process given by
X_t = α X_{t−1} + Z_t + β Z_{t−1}.

Here φ(B) = 1 − αB and θ(B) = 1 + βB. If |α| < 1 and |β| < 1, the process is stationary and invertible. Then
ψ(B) = θ(B)/φ(B) = (1 + βB)(1 + αB + α²B² + …).
Hence ψ_0 = 1 and
ψ_j = α^{j−1}(α + β), for j ≥ 1.
Similarly, π(B) = φ(B)/θ(B) = (1 − αB)(1 − βB + β²B² − …), and we find π_0 = 1 and
π_j = −(α + β)(−β)^{j−1}, for j ≥ 1.

Figure 3: Series (top) and their autocorrelation functions (bottom) of the process.
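The ψ-weight formula can be checked by expanding θ(B)/φ(B) numerically. A Python sketch (NumPy assumed; the parameter values α = 0.6, β = 0.3 are arbitrary illustrative choices):

```python
import numpy as np

alpha, beta = 0.6, 0.3

# psi(B) = (1 + beta B) / (1 - alpha B): dividing through gives the recursion
# psi_0 = 1, psi_1 = alpha psi_0 + beta, psi_j = alpha psi_{j-1} for j >= 2
J = 10
psi = np.zeros(J + 1)
psi[0] = 1.0
for j in range(1, J + 1):
    psi[j] = alpha * psi[j - 1] + (beta if j == 1 else 0.0)

# closed form from the text: psi_j = alpha^(j-1) (alpha + beta) for j >= 1
closed = np.array([1.0] + [alpha ** (j - 1) * (alpha + beta) for j in range(1, J + 1)])
print(np.allclose(psi, closed))   # True
```

The same polynomial-division recursion, with the roles of α and −β exchanged, produces the π weights.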

Example
Show that the ac.f. of the ARMA(1,1) model X_t = α X_{t−1} + Z_t + β Z_{t−1}, where |α| < 1 and |β| < 1, is given by
ρ(1) = (1 + αβ)(α + β)/(1 + 2αβ + β²)
and
ρ(k) = α ρ(k − 1) = α^{k−1} ρ(1) for k ≥ 2.

Sketch of the proof: First we compute γ(0) = Var(X_t); using E[X_{t−1} Z_{t−1}] = σ_Z² and E[X_{t−1} Z_t] = 0, it gives us
γ(0) = σ_Z² (1 + 2αβ + β²)/(1 − α²).
Then we compute γ(1) = Cov(X_t, X_{t−1}); it gives us
γ(1) = α γ(0) + β σ_Z².
Solving for ρ(1) = γ(1)/γ(0) gives the result, as shown above. The second part of the ACFs can be obtained from the Yule-Walker equation ρ(k) = α ρ(k − 1) for k ≥ 2.
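The ac.f. formula can be checked by simulation. A Python sketch (NumPy assumed; α = 0.6, β = 0.3 and the seed are arbitrary illustrative choices):

```python
import numpy as np

alpha, beta, n = 0.6, 0.3, 200_000
rng = np.random.default_rng(6)
z = rng.normal(size=n + 1)
x = np.empty(n)
prev = 0.0
for t in range(n):
    prev = alpha * prev + z[t + 1] + beta * z[t]   # ARMA(1,1) recursion
    x[t] = prev

def r(x, k):
    """Sample autocorrelation at lag k."""
    xc = x - x.mean()
    return np.dot(xc[:-k], xc[k:]) / np.dot(xc, xc)

rho1 = (1 + alpha * beta) * (alpha + beta) / (1 + 2 * alpha * beta + beta**2)
print(r(x, 1), rho1)              # sample vs theoretical rho(1)
print(r(x, 2), alpha * rho1)      # rho(2) = alpha * rho(1)
```

The decay for k ≥ 2 is geometric with rate α, exactly as for an AR(1) process; only ρ(1) carries the extra MA term.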

Some useful models

ARIMA models

If the observed time series is nonstationary in the mean, then we can difference the series, as suggested in Chapter 2. If X_t is replaced by the differenced series ∇^d X_t in (6), then we have a model capable of describing certain types of nonstationary series. Such a model is called an integrated model.

Let W_t = ∇^d X_t = (1 − B)^d X_t. The general ARIMA (autoregressive integrated moving average) process is of the form
W_t = α_1 W_{t−1} + … + α_p W_{t−p} + Z_t + β_1 Z_{t−1} + … + β_q Z_{t−q},    (7)
or
φ(B) W_t = θ(B) Z_t,
where φ(B) and θ(B) are defined as in (6). We may write (7) in the form
φ(B)(1 − B)^d X_t = θ(B) Z_t;    (8)
the model above is said to be an ARIMA process of order (p, d, q). The model for X_t is clearly nonstationary, as the AR operator φ(B)(1 − B)^d has d roots on the unit circle. Note that the random walk can be regarded as an ARIMA(0, 1, 0) process. ARIMA models can be extended to include seasonal effects, as discussed later.
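Differencing as a cure for nonstationarity in the mean can be illustrated with a short Python sketch (NumPy assumed; the intercept, trend slope 0.5 and seed are arbitrary illustrative choices): first differencing turns a series with a linear trend plus integrated noise into a stationary one whose mean equals the slope.

```python
import numpy as np

rng = np.random.default_rng(5)
n = 500
t = np.arange(n)
z = rng.normal(size=n)
x = 2.0 + 0.5 * t + np.cumsum(z)   # linear trend + integrated noise: nonstationary in mean

w = np.diff(x)                     # W_t = (1 - B) X_t = 0.5 + Z_t
print(w.mean())                    # close to the trend slope 0.5
```

This is the d = 1 case of (1 − B)^d; applying np.diff twice (d = 2) would likewise remove a quadratic trend.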

Some useful models

The general linear process

A general class of processes may be written as an MA process, of possibly infinite order, in the form
X_t = Σ_{j=0}^{∞} ψ_j Z_{t−j},    (9)
where Σ_j |ψ_j| < ∞, so that the process is stationary. A stationary process defined by (9) is called a general linear process. Stationary AR and ARMA processes can also be expressed as general linear processes using the duality between AR and MA processes.

The Wold decomposition theorem

The Wold decomposition theorem: Any discrete-time stationary process can be expressed as the sum of two uncorrelated processes, one purely deterministic and one purely indeterministic, which are defined as follows. Regress X_t on (X_{t−q}, X_{t−q−1}, …) and denote the residual variance from the resulting linear regression model by τ_q². If lim_{q→∞} τ_q² = 0, then the process can be forecast exactly from its remote past, and we say that X_t is purely deterministic. If lim_{q→∞} τ_q² = Var(X_t), then linear regression on the remote past is useless for prediction purposes, and we say that X_t is purely indeterministic. The Wold decomposition theorem also says that the purely indeterministic component can be written as a linear sum of a sequence of uncorrelated random variables, say {Z_t}.