Lesson 15: Building ARMA models. Examples


Lesson 15: Building ARMA models. Examples. Umberto Triacca, Dipartimento di Ingegneria e Scienze dell'Informazione e Matematica, Università dell'Aquila. umberto.triacca@ec.univaq.it

Examples In this lesson, in order to illustrate the time series modelling methodology presented so far, we analyze some time series.

Example 1 Using a computer program, we have generated a time series x. The graph of the series is presented in the following figure. Figure: A simulated time series

Example 1 The objective is to build an ARMA model for this time series. The first step in developing a model is to determine whether the series is stationary.

Example 1 Our time series seems to be the realization of a stationary process with zero mean, so we can look at the sample autocorrelation and partial autocorrelation functions to establish the orders p and q of the ARMA model. Figure: Sample autocorrelation and sample partial autocorrelation

Example 1 Since the SACF cuts off after lag 2 and the SPACF follows a damped cycle, an MA(2) model

x_t = u_t + θ1 u_{t-1} + θ2 u_{t-2},  u_t ~ WN(0, σ²)

seems appropriate for the sample data.
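This identification step can be sketched in code. The following pure-Python illustration (not part of the original lesson; the coefficients 1.686 and 0.884 are borrowed from the model fitted later in this example) simulates an MA(2) process and computes its sample ACF, whose values beyond lag 2 should stay inside the approximate ±2/√n band:

```python
import math
import random

def simulate_ma2(n, theta1, theta2, sigma=1.0, seed=0):
    """Simulate x_t = u_t + theta1*u_{t-1} + theta2*u_{t-2}, u_t ~ WN(0, sigma^2)."""
    rng = random.Random(seed)
    u = [rng.gauss(0.0, sigma) for _ in range(n + 2)]
    return [u[t] + theta1 * u[t - 1] + theta2 * u[t - 2] for t in range(2, n + 2)]

def sample_acf(x, max_lag):
    """Sample autocorrelations r_1, ..., r_max_lag."""
    n = len(x)
    m = sum(x) / n
    c0 = sum((v - m) ** 2 for v in x)
    return [sum((x[t] - m) * (x[t - k] - m) for t in range(k, n)) / c0
            for k in range(1, max_lag + 1)]

x = simulate_ma2(300, 1.686, 0.884, seed=42)
r = sample_acf(x, 10)
# For an MA(2) the theoretical ACF is zero beyond lag 2, so sample
# autocorrelations at higher lags should fall within roughly +/- 2/sqrt(n).
band = 2 / math.sqrt(300)
```

In practice one compares each sample autocorrelation against this band, exactly as done visually with the correlogram above.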

Example 1 The table reports the results of the ML estimation. 300 observations, dependent variable: x.

Variable    Coefficient    Std. error    t statistic    p-value
θ1          1.68559        0.0456203     36.9481        0.0000
θ2          0.883683       0.0492842     17.9303        0.0000

Variance of innovations: 0.941107

Example 1 Now we consider the graph of the residuals. Figure: Residuals from the MA(2) model

Example 1 Figure : SACF and SPACF of residuals from MA(2) model

Example 1 Analysing the SACF and SPACF of the residuals presented in the figure, we note that no term is significant, and Q_25 = 16.4450 does not indicate any autocorrelation in the residuals. They can be assimilated to a white noise process.
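The portmanteau statistic Q quoted above can be computed directly from the residual autocorrelations. A minimal sketch (the residuals here are simulated white noise, not the lesson's actual residuals), using the Box-Pierce form Q_K = n Σ r_k² together with the Ljung-Box refinement that most software reports:

```python
import random

def acf(x, max_lag):
    """Sample autocorrelations r_1, ..., r_max_lag."""
    n = len(x)
    m = sum(x) / n
    c0 = sum((v - m) ** 2 for v in x)
    return [sum((x[t] - m) * (x[t - k] - m) for t in range(k, n)) / c0
            for k in range(1, max_lag + 1)]

def box_pierce(x, K):
    """Box-Pierce portmanteau statistic: Q_K = n * sum_{k=1}^K r_k^2."""
    n = len(x)
    return n * sum(r * r for r in acf(x, K))

def ljung_box(x, K):
    """Ljung-Box refinement: Q_K = n(n+2) * sum_{k=1}^K r_k^2 / (n-k)."""
    n = len(x)
    return n * (n + 2) * sum(r * r / (n - k)
                             for k, r in enumerate(acf(x, K), start=1))

# White-noise residuals should give a small Q relative to the chi-square
# critical value with K - p - q degrees of freedom.
rng = random.Random(1)
e = [rng.gauss(0.0, 1.0) for _ in range(300)]
q = box_pierce(e, 25)
```

Under the white-noise hypothesis Q_K is approximately chi-square distributed, which is how the value 16.4450 is judged insignificant.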

Example 1 We conclude that the MA(2) model

x_t = u_t + 1.686 u_{t-1} + 0.884 u_{t-2},  u_t ~ WN(0, 0.94)

appears to fit the data very well.

Example 2 Consider the monthly series of the foreign exchange rate Lira per US Dollar from January 1973 until October 1989 (202 observations).

Example 2 It can be observed that the series displays a nonstationary pattern with an upward trending behavior. Figure: Foreign exchange rate Lira per U.S. $ from January 1973 until October 1989

Example 2 The first difference of the series seems to have a constant mean, although inspection of the graph (see Figure) suggests the variance is an increasing function of time. Figure: First difference of the foreign exchange rate Lira per US Dollar

Example 2 As we can see in the figure, the first difference of the logarithm is the most likely candidate to be covariance stationary. Figure: First difference of the logarithm of the foreign exchange rate Lira per US Dollar.
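The transformation applied here, z_t = log(x_t) − log(x_{t−1}), stabilizes the variance and, for small changes, approximates the percentage change. A minimal sketch (the exchange-rate levels below are hypothetical, chosen only to show the approximation):

```python
import math

def diff_log(series):
    """First difference of the logarithm: z_t = log(x_t) - log(x_{t-1})."""
    logs = [math.log(v) for v in series]
    return [logs[t] - logs[t - 1] for t in range(1, len(logs))]

# For small relative changes, diff_log is close to the percentage change:
rates = [580.0, 585.8, 591.7]   # hypothetical Lira/USD levels, ~1% monthly growth
z = diff_log(rates)             # each element is approximately 0.01
```

This is why the log-differenced series is often called the "logarithmic change" of the exchange rate.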

Example 2 Now, we examine the autocorrelation and partial autocorrelation functions of the logarithmic change in the foreign exchange rate Lira per US Dollar. Figure : SACF and SPACF for the logarithmic change in the foreign exchange rate Lira per US Dollar.

Example 2 An AR(1) model is fitted by using the exact maximum likelihood estimation. The parameter estimates are summarized in the following table

Example 2 Sample 1973:02–1989:10. Dependent variable: first difference of the log of the foreign exchange rate Lira per US Dollar.

Variable    Coefficient    Std. error    t statistic    p-value
φ1          0.381412       0.0652368     5.8466         0.0000

Variance of innovations: 0.000543700

Example 2 The AR(1) model fit indicates a highly significant parameter φ1, with estimate φ̂1 = 0.381.

Example 2 The Q statistic (Q_20 = 16.5014) and the graphs of the SACF and SPACF of the residuals (see Figure) indicate that the autocorrelations of the residuals are not statistically significant. Figure: SACF and SPACF of residuals from the AR(1) model

Example 2 Thus we conclude that the AR(1) model

x_t = 0.381 x_{t-1} + u_t,  u_t ~ WN(0, 0.00054),

where x_t is the first difference of the log of the foreign exchange rate Lira per US Dollar, fits the data well.
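The estimate above comes from exact maximum likelihood. As a rough cross-check one can use the method-of-moments estimator, φ̂1 = r_1 (the lag-1 sample autocorrelation), which is close to ML for moderate sample sizes. A sketch on simulated data with the fitted value φ1 = 0.381 (the simulation is illustrative, not the lesson's exchange-rate data):

```python
import random

def fit_ar1(x):
    """Method-of-moments AR(1) estimate: phi_hat = r_1,
    the lag-1 sample autocorrelation."""
    n = len(x)
    m = sum(x) / n
    c0 = sum((v - m) ** 2 for v in x)
    c1 = sum((x[t] - m) * (x[t - 1] - m) for t in range(1, n))
    return c1 / c0

# Simulate an AR(1) with phi = 0.381 and recover the coefficient:
rng = random.Random(7)
x, prev = [], 0.0
for _ in range(5000):
    prev = 0.381 * prev + rng.gauss(0.0, 1.0)
    x.append(prev)
phi_hat = fit_ar1(x)   # should land near 0.381
```

With n = 5000 the sampling error of r_1 is roughly sqrt((1 − φ1²)/n) ≈ 0.013, so the estimate recovers the true coefficient closely.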

Example 3 We consider the log of GNP deflator series in USA observed on the period 1955:1-2000:4. The objective is to build an ARMA model for this time series.

Example 3 The time series graph is shown in the figure. We note that the mean is changing over time: the variable exhibits a strong trend. Thus the series cannot be considered a realization of a stationary process. Figure: Log of the GNP deflator series in the USA observed over the period 1955:1-2000:4

Example 3 In order to make the series stationary, we consider the first differences. Figure: Graphical plot of the first difference of the log of the GNP deflator series

Example 3 Visual inspection again gives a strong indication of nonstationarity. Thus we consider the second difference. Figure: Graphical plot of the second difference of the log of the GNP deflator series

Example 3 The second difference seems to be the realization of a stationary process with zero mean, so we can look at the sample autocorrelation and partial autocorrelation functions of the second difference to establish the orders p and q of the ARMA model. The figure shows the graphs of the SACF and SPACF.
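The second difference (1 − L)² log(y_t) is obtained by differencing the log series twice. A minimal sketch (the quadratic-trend series below is hypothetical, chosen because double differencing reduces it exactly to a constant):

```python
import math

def diff(series):
    """First difference: (1 - L) x_t = x_t - x_{t-1}."""
    return [series[t] - series[t - 1] for t in range(1, len(series))]

def second_diff_log(series):
    """(1 - L)^2 log(y_t): difference the log series twice."""
    return diff(diff([math.log(v) for v in series]))

# A series whose log follows a quadratic trend 0.01*t + 0.001*t^2
# is reduced to the constant 2 * 0.001 = 0.002 by double differencing:
y = [math.exp(0.01 * t + 0.001 * t * t) for t in range(6)]
x = second_diff_log(y)   # four elements, each equal to 0.002
```

This is why a series with a smoothly accelerating trend in the logs, like the GNP deflator, looks stationary only after the second difference.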

Example 3 Additional information can be obtained by inspecting the outcomes of the AIC and BIC criteria. Table: The information criteria AIC and BIC

Orders (p,q)    2,2        2,1        1,2        1,1        1,0        0,1
AIC             -1614.4    -1611.3    -1616.0    -1613.3    -1609.7    -1614.7
BIC             -1595.2    -1595.2    -1599.9    -1600.4    -1600.1    -1605.1

From the AIC values it is concluded that the ARMA(1,2) model is most suitable for our time series. From the BIC values, however, the MA(1) is judged to be better suited.
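The criteria in the table follow the usual definitions, AIC = −2ℓ + 2k and BIC = −2ℓ + k log(n), where ℓ is the maximized log-likelihood and k the number of estimated parameters. A sketch with hypothetical numbers (the log-likelihoods below are invented to reproduce the kind of disagreement seen in the table, where n = 182 usable observations after double differencing):

```python
import math

def aic(loglik, k):
    """Akaike information criterion: -2*loglik + 2*k."""
    return -2.0 * loglik + 2.0 * k

def bic(loglik, k, n):
    """Bayesian information criterion: -2*loglik + k*log(n)."""
    return -2.0 * loglik + k * math.log(n)

# Hypothetical fits: the richer ARMA(1,2) gains 3 log-likelihood points
# over MA(1) at the cost of 2 extra parameters. Since log(182) > 2,
# BIC penalizes the extra parameters more heavily than AIC does.
n = 182
ll_arma12, k_arma12 = 813.0, 5   # const, phi1, theta1, theta2, sigma^2
ll_ma1,    k_ma1    = 810.0, 3   # const, theta1, sigma^2
```

With these numbers AIC prefers the ARMA(1,2) while BIC prefers the MA(1), mirroring the disagreement in the table above.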

Example 3 The fact that AIC and BIC give different indications about the best-fitting model is not surprising, because BIC penalizes larger models more heavily than AIC. Thus BIC tends to produce more parsimonious best-fitting models than AIC.

Example 3 Since computing time is inexpensive we can estimate both models. Table 1 shows the results of fitting an ARMA(1,2) for the second difference of log of GNP deflator.

Example 3 Sample 1955:1-2000:4. Dependent variable: second difference of the log of the GNP deflator.

Variable    Coefficient     Std. error     t statistic    p-value
const       8.97976e-06     0.000122030    0.0736         0.9413
φ1          -0.923081       0.0694282      -13.2955       0.0000
θ1          0.438205        0.0999095      4.3860         0.0000
θ2          -0.349970       0.0785200      -4.4571        0.0000

Example 3 Now we can look at the sample autocorrelation and partial autocorrelation functions of the residuals of the ARMA(1,2) model to establish whether this is a good model for the data. These graphs are very similar to the correlograms of a white noise. Figure: SACF and SPACF of residuals from the ARMA(1,2) model

Example 3 The Q_K statistic computed with K = 20 lags is Q_20 = 16.2932, whereas the critical value is χ²_{1-0.05, 17} = 27.5871.

Example 3 These results indicate that the ARMA(1,2) model

(1 + 0.923L) x_t = (1 + 0.438L - 0.350L²) u_t,  u_t ~ WN(0, 0.85),

with x_t = (1 - L)² log(y_t), where y_t is the original series, fits the data well.

Example 3 Now we consider the MA(1) model. Sample 1955:1-2000:4. Dependent variable: second difference of the log of the GNP deflator.

Variable    Coefficient     Std. error     t statistic    p-value
const       8.79573e-06     0.000116929    0.0752         0.9400
θ1          -0.466072       0.0615454      -7.5728        0.0000

Example 3 The correlograms and the Box-Pierce statistic (Q_20 = 21.6819) indicate that the residuals behave as a white noise process. Figure: SACF and SPACF of residuals from the MA(1) model

Example 3 Thus we conclude that the model

x_t = (1 - 0.466L) u_t,  u_t ~ WN(0, 0.89),

also fits the data well.

Example 3 Let us look at the forecasting performance of the two models. In particular, we forecast the future values of our time series, log(y_t), from 2001:1 to 2001:4.

Example 3 First, we consider the model

(1 + 0.923L) x_t = (1 + 0.438L - 0.350L²) u_t,  u_t ~ WN(0, 0.85)

Example 3 The 1-step, 2-step, 3-step and 4-step forecasts are presented in the following table. Table: ARMA(1,2) forecasts

Step ahead    Actual      Forecast    Std. error
2001:1        4.499576    4.498700    0.002913
2001:2        4.506344    4.504065    0.005289
2001:3        4.509474    4.509945    0.007403
2001:4        4.512517    4.515367    0.009934

Example 3 The table presents the forecasts obtained by using the MA(1) model x_t = (1 - 0.466L) u_t, u_t ~ WN(0, 0.89). Table: MA(1) forecasts

Step ahead    Actual      Forecast    Std. error
2001:1        4.499576    4.498286    0.002956
2001:2        4.506344    4.503785    0.005413
2001:3        4.509474    4.509293    0.008166
2001:4        4.512517    4.514810    0.011218
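For the differenced series x_t itself, MA(1) forecast uncertainty has a simple closed form: the h-step forecast error variance is σ² for h = 1 and σ²(1 + θ1²) for every h ≥ 2, because an MA(1) carries no information beyond one step ahead. A sketch using the fitted θ1 = −0.466 (σ² = 1 here only for illustration; the table's standard errors keep growing with h because the forecasts of x_t are re-integrated twice to obtain log(y_t)):

```python
import math

def ma1_forecast_se(theta1, sigma2, H):
    """h-step forecast standard errors for x_t = u_t + theta1*u_{t-1}:
    variance = sigma^2 for h = 1, sigma^2*(1 + theta1^2) for h >= 2."""
    return [math.sqrt(sigma2 * (1.0 if h == 1 else 1.0 + theta1 ** 2))
            for h in range(1, H + 1)]

se = ma1_forecast_se(-0.466, 1.0, 4)
# se[0] < se[1] == se[2] == se[3]: for the differenced series, the
# forecast uncertainty stops growing once the MA order is exceeded.
```

This flattening after lag q is a general feature of pure MA(q) forecasts.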

Example 3 Given these results, we may conclude that the performance of the two models is very similar. Which model should be preferred?

Example 3 It is usual to choose a parsimonious model, that is, a model that describes all of the features of interest in the data using as few parameters as possible.

Example 3 Thus we choose the MA(1) model

x_t = (1 - 0.466L) u_t,  u_t ~ WN(0, 0.89).