Lecture 6: Forecasting of time series
Outline of lesson 6 (chapter 5):
- Forecasting
  - Very complicated and untidy in the book
  - A lot of theory has been developed (on which we will not dwell)
- Sketch of the theory
- Practical example

C:\Kyrre\studier\drgrad\Kurs\series\lecture doc, KL, , page 1 of 1
Forecasting

Given a time series {X_1, X_2, ..., X_n}, we want to forecast X_{n+k}. How do we do that? There are many methods; let's start with X_{n+1}. We want to find coefficients a_i such that

(X_{n+1} - a_0 - a_1 X_n - a_2 X_{n-1} - ... - a_n X_1)^2

is minimised. As {X_1, X_2, ..., X_n} is known, the a_i's can be found by differentiating and setting the derivatives equal to zero => a least squares procedure.

However, in the Box-Jenkins context the a_i's already have a known parametric form (the estimated coefficients). The procedure thus reduces to:
- Estimate the time series model.
- Use the coefficients to estimate X_{n+1}.

For k>1, the newly estimated values become part of the conditional expectation. Generalised for X_{n+k}, we have

P_n X_{n+k} = \sum_{i=1}^{p} \phi_i P_n X_{n+k-i} + \sum_{j=k}^{q} \theta_{n+k-1,j} (X_{n+k-j} - \hat{X}_{n+k-j}),

where P_n denotes the prediction based on the first n observations.
- We use the AR coefficients multiplied by the observations (and earlier predictions) up to p lags back in time.
- We add the innovations (X - X̂) multiplied by the MA coefficients as long as k is less than or equal to q.
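The recursion above can be sketched in a few lines. As a rough illustration (in Python rather than S-Plus, and with invented AR(2) coefficients), here is the pure autoregressive case — for lead times k greater than q the MA correction terms vanish, so this is also what the general formula reduces to far enough ahead:

```python
def ar_forecast(history, phi, k):
    """k-step recursive forecast for a zero-mean AR(p) process.

    Each step applies P_n X_{n+1} = sum_i phi_i X_{n+1-i}; for later
    steps the earlier forecasts stand in for the unobserved values,
    which is exactly the conditional-expectation recursion above."""
    xs = list(history)                 # observed X_1, ..., X_n
    forecasts = []
    for _ in range(k):
        pred = sum(p * xs[-i] for i, p in enumerate(phi, start=1))
        xs.append(pred)                # feed the forecast back in
        forecasts.append(pred)
    return forecasts

# Hypothetical AR(2) with phi = (0.6, 0.3), not estimated from any data:
print(ar_forecast([1.0, 0.5], [0.6, 0.3], 3))
```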
In S-Plus this is implemented in the arima.forecast routine. (The Kalman filter is the practical iteration procedure.)

Venables and Ripley (1994) suggest that the first (long) part of the time series can be used to forecast the latter (short) part of the time series. -> By this approach you quickly get an idea of how well your model performs, and not least of how long you can trust your forecast.
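Venables and Ripley's suggestion amounts to a simple holdout evaluation. A minimal sketch in Python, where the naive "repeat the last value" forecaster is only a hypothetical stand-in for whatever model was fitted on the first part:

```python
def holdout_rmse(series, n_test, forecaster):
    """Fit/forecast on the long first part, score on the short last part."""
    train, test = series[:-n_test], series[-n_test:]
    preds = forecaster(train, n_test)
    mse = sum((p - a) ** 2 for p, a in zip(preds, test)) / n_test
    return mse ** 0.5

def naive(train, k):
    # stand-in forecaster: repeat the last training value k times
    return [train[-1]] * k

print(holdout_rmse([1.0, 2.0, 3.0, 4.0, 5.0, 6.0], 2, naive))
```

Swapping in a real ARMA forecaster and varying n_test shows how far ahead the model can be trusted, which is exactly the point of the exercise.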
# Australian red wine
tmp <- read.table("e:\\programfiler\\itsm96\\wine_b.dat")
# tmp <- read.table("c:\\kyrre\\studier\\drgrad\\kurs\\series\\tsdata\\wine_b.dat")
vin <- ts(tmp[,1], frequency=12, start=c(1980,1))
vin89 <- window(vin, end=c(1989,12))

# Using first part of series to predict latter part
par(mfrow=c(2,1))
ts.plot(vin, main="Australian red wine consumption", xlab="Year", ylab="Litres")
ts.points(vin89, pch=28, col=8)
legend(locator(1), legend=c("Wine"), marks=28, col=8)

[Figure: Australian red wine consumption in litres, Jan 1980 - Jan 1992]

Special features?
- Season
- Trend
- Increasing variance
# Taking log to stabilise variance
vin89 <- log(vin89)
ts.plot(vin89, main="Australian red wine consumption", xlab="Year", ylab="ln(litres)")
ts.points(vin89, pch=28, col=8)

[Figure: log-transformed Australian red wine consumption, Jan 1980 - Jan 1990]
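Why the log helps: growth that is multiplicative on the original scale (level and spread increasing together) becomes additive and stable on the log scale. A toy Python illustration:

```python
import math

series = [10.0, 20.0, 40.0, 80.0]      # level (and spread) doubles each step
logged = [math.log(x) for x in series]
diffs = [b - a for a, b in zip(logged, logged[1:])]
# every difference equals log(2): constant steps on the log scale
print(diffs)
```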
# Deseasonalising the data
tsp(vin89)
# [1]
vin.ln.stl <- stl(vin89, "periodic")
par(mfrow=c(2,1))
ts.plot(vin.ln.stl$sea, main="Seasonal components", ylab="", xlab="")
ts.plot(vin.ln.stl$rem, main="Remainder")

[Figure: seasonal component and remainder, Jan 1980 - Jan 1990]
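stl() does something considerably more refined, but the core of a "periodic" seasonal fit — one effect per calendar month, centred to sum to zero — can be sketched as follows (a crude Python stand-in, not a reimplementation of stl):

```python
def deseasonalise(series, period=12):
    """Seasonal component = mean of each calendar position, centred to
    sum to zero; remainder = series minus its seasonal component."""
    sums = [0.0] * period
    counts = [0] * period
    for t, x in enumerate(series):
        sums[t % period] += x
        counts[t % period] += 1
    means = [s / c for s, c in zip(sums, counts)]
    level = sum(means) / period
    seasonal = [m - level for m in means]
    remainder = [x - seasonal[t % period] for t, x in enumerate(series)]
    return seasonal, remainder

# toy 'monthly' series with period 2: alternating low/high
print(deseasonalise([1.0, 3.0, 1.0, 3.0], period=2))
```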
# Removing the trend
y <- 1:length(vin.ln.stl$rem)
vin.trend.lin <- ts(lm(vin.ln.stl$rem ~ y)$fitted.values, frequency=12, start=c(1980,1))
par(mfrow=c(2,2))
ts.plot(vin.ln.stl$rem, vin.trend.lin, lty=c(1,1),
        main="Residual time series w/linear trend ( )", xlab="Year", ylab="ln(litres)")
vin.resid <- vin.ln.stl$rem - vin.trend.lin
ts.plot(vin.resid, main="Detrended red wine time series", xlab="Year", ylab="ln(litres)")
acf(vin.resid)
acf(vin.resid, type="p")

[Figure: remainder with fitted linear trend; detrended series; ACF and partial ACF of vin.resid]

We have arrived at a time series where the autocorrelation damps down reasonably fast.
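The lm() call above is plain least squares on the time index. Assuming the series is just a list of numbers, the same detrending step in Python:

```python
def detrend_linear(series):
    """Fit x_t = a + b*t by least squares and return the residuals
    (the detrended series)."""
    n = len(series)
    tbar = (n - 1) / 2.0
    xbar = sum(series) / n
    sxx = sum((t - tbar) ** 2 for t in range(n))
    b = sum((t - tbar) * (x - xbar) for t, x in enumerate(series)) / sxx
    a = xbar - b * tbar
    return [x - (a + b * t) for t, x in enumerate(series)]

# a perfectly linear series detrends to (numerically) zero
print(detrend_linear([2.0, 4.0, 6.0, 8.0]))
```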
# AR method
library(MASS)
par(mfrow=c(2,2))
cpgram(vin.resid)
ar(vin.resid)$aic
# [1] [5] [9] [13] [17] [21]
length(ar(vin.resid)$aic)
# [1] 21
plot(0:20, ar(vin.resid)$aic, xlab="Order", ylab="AIC", main="AIC for AR(p)")

[Figure: cumulative periodogram of vin.resid; AIC for AR(p) plotted against order]
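ar() reports one AIC value per candidate order 0, ..., 20, and we pick the order with the smallest value. A sketch of that selection step in Python, computing AIC as -2 log L + 2 (number of parameters); the log-likelihoods below are invented for illustration:

```python
def aic(loglik, n_params):
    return -2.0 * loglik + 2.0 * n_params

def best_order(aics):
    """Index (= AR order) of the smallest AIC."""
    return min(range(len(aics)), key=lambda p: aics[p])

# invented log-likelihoods for AR(0)..AR(3): a big gain from order 0 to 1,
# then improvements too small to justify the extra parameters
logliks = [-120.0, -100.5, -99.8, -99.6]
aics = [aic(l, p) for p, l in enumerate(logliks)]
print(best_order(aics))
```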
vin89.ar2 <- arima.mle(vin.resid, model=list(order=c(2,0,0)))
# vin89.ar2$model$ar
# [1]
vin89.fore <- arima.forecast(vin.resid, n=24, model=vin89.ar2$model)
par(mfrow=c(2,1))
ts.plot(vin89.fore$mean, main="Forecast")

# Adding seasonal component
vin89.fore$mean <- vin89.fore$mean + vin.ln.stl$sea[1:24]
par(mfrow=c(2,1))
ts.plot(vin89.fore$mean, main="Forecast w/seasonal pattern")

[Figure: forecast of the residual series; forecast with seasonal pattern added]
# Adding linear trend
lm(vin.ln.stl$rem ~ y)
# Call: lm(formula = vin.ln.stl$rem ~ y)
# Coefficients: (Intercept)  y
# Degrees of freedom: 120 total; 118 residual
# Residual standard error:
y2 <- seq( * , * , length=24)
vin89.fore$mean <- vin89.fore$mean + y2
par(mfrow=c(2,1))
ts.plot(window(log(vin), start=c(1990,1)), vin89.fore$mean,
        vin89.fore$mean + 1.96*vin89.fore$std.err,
        vin89.fore$mean - 1.96*vin89.fore$std.err,
        col=c(1,3,1,1), lty=c(1,1,6,6))
ts.points(window(log(vin), start=c(1990,1)), pch=4)
title("AR-method")
ts.plot(exp(window(log(vin), start=c(1990,1))), exp(vin89.fore$mean),
        exp(vin89.fore$mean + 1.96*vin89.fore$std.err),
        exp(vin89.fore$mean - 1.96*vin89.fore$std.err),
        col=c(1,3,1,1), lty=c(1,1,6,6))
ts.points(exp(window(log(vin), start=c(1990,1))), pch=4)
title("On a normal scale")

[Figure: AR-method forecast with standard-error limits and observed values, on the log scale ("AR-method") and back-transformed ("On a normal scale")]
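The last step exponentiates the forecast and its limits to get back from ln(litres) to litres. A Python sketch, assuming the usual 95% multiplier of 1.96 on the standard error (the exact multiplier is not legible in the notes); note that the interval, symmetric on the log scale, becomes asymmetric after back-transformation:

```python
import math

def backtransform(mean, stderr, z=1.96):
    """exp() of the log-scale forecast and its mean +/- z*stderr limits."""
    lower = [math.exp(m - z * s) for m, s in zip(mean, stderr)]
    point = [math.exp(m) for m in mean]
    upper = [math.exp(m + z * s) for m, s in zip(mean, stderr)]
    return lower, point, upper

lo, pt, up = backtransform([0.0, 0.1], [0.2, 0.3])
# on the original scale the upper gap exceeds the lower gap:
print(up[0] - pt[0], pt[0] - lo[0])
```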
# ARMA modelling
vin89.arma <- arima.mle(vin.resid, model=list(order=c(2,0,1)), n.cond=6)
vin89.arma$model
# $order: [1]
# $ar: [1]
# $ndiff: [1] 0
# $ma: [1]
length(vin.resid)
# [1] 120

# Comparing the AICc of candidate models:
vin89.arma <- arima.mle(vin.resid, model=list(order=c(2,0,0)), n.cond=6)
aicc(vin89.arma$loglik, 2, 0, 120)
vin89.arma <- arima.mle(vin.resid, model=list(order=c(2,0,1)), n.cond=6)
aicc(vin89.arma$loglik, 2, 1, 120)
vin89.arma <- arima.mle(vin.resid, model=list(order=c(2,0,2)), n.cond=6)
aicc(vin89.arma$loglik, 2, 2, 120)
vin89.arma <- arima.mle(vin.resid, model=list(order=c(1,0,1)), n.cond=6)
aicc(vin89.arma$loglik, 1, 1, 120)
vin89.arma <- arima.mle(vin.resid, model=list(order=c(1,0,2)), n.cond=6)
aicc(vin89.arma$loglik, 1, 2, 120)
vin89.arma <- arima.mle(vin.resid, model=list(order=c(0,0,1)), n.cond=6)
aicc(vin89.arma$loglik, 0, 1, 120)
vin89.arma <- arima.mle(vin.resid, model=list(order=c(0,0,2)), n.cond=6)
aicc(vin89.arma$loglik, 0, 2, 120)

# ARMA(1,0,1) chosen; forecasting as before
vin89.arma <- arima.mle(vin.resid, model=list(order=c(1,0,1)), n.cond=6)
vin89.fore <- arima.forecast(vin.resid, n=24, model=vin89.arma$model)

# Adding seasonal component
vin89.fore$mean <- vin89.fore$mean + vin.ln.stl$sea[1:24]

# Adding linear trend
vin89.fore$mean <- vin89.fore$mean + y2

ts.plot(window(log(vin), start=c(1990,1)), vin89.fore$mean,
        vin89.fore$mean + 1.96*vin89.fore$std.err,
        vin89.fore$mean - 1.96*vin89.fore$std.err,
        col=c(1,3,1,1), lty=c(1,1,6,6))
ts.points(window(log(vin), start=c(1990,1)), pch=4)
title("ARMA-method")
ts.plot(exp(window(log(vin), start=c(1990,1))), exp(vin89.fore$mean),
        exp(vin89.fore$mean + 1.96*vin89.fore$std.err),
        exp(vin89.fore$mean - 1.96*vin89.fore$std.err),
        col=c(1,3,1,1), lty=c(1,1,6,6))
ts.points(exp(window(log(vin), start=c(1990,1))), pch=4)
title("On a normal scale")
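The aicc() helper used above is not shown in the notes; it is presumably the bias-corrected AIC for an ARMA(p,q) fit on n observations, AICC = -2 log L + 2(p+q+1) n / (n-p-q-2) (the Brockwell and Davis definition). A Python sketch under that assumption, with invented log-likelihoods:

```python
def aicc(loglik, p, q, n):
    """Bias-corrected AIC for an ARMA(p, q) model fitted to n points."""
    k = p + q + 1                      # AR + MA coefficients + noise variance
    return -2.0 * loglik + 2.0 * k * n / (n - p - q - 2.0)

# with invented log-likelihoods, the extra MA term of the second model
# does not buy enough fit to beat the smaller one on n=120 observations:
print(aicc(-50.0, 1, 1, 120))
print(aicc(-50.2, 2, 1, 120))
```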
[Figure: ARMA-method forecast with standard-error limits and observed values, on the log scale ("ARMA-method") and back-transformed ("On a normal scale")]
Forecasting: future values are found from previous observations (by way of the estimated parameters). (In practice, a least squares approach by way of the Kalman filter is used in the estimation.)
- Use the first part of the time series to estimate the parameters of the model.
- Use the obtained coefficients to estimate X_{n+1}.
- For k>1 a recursive procedure is used.
- Remember to preprocess the data.
- The actual forecast (whatever the model) decays quickly to the mean of the series.
- "Back-transformation" must be done to display the forecast.
More informationIDENTIFICATION OF ARMA MODELS
IDENTIFICATION OF ARMA MODELS A stationary stochastic process can be characterised, equivalently, by its autocovariance function or its partial autocovariance function. It can also be characterised by
More informationThe log transformation produces a time series whose variance can be treated as constant over time.
TAT 520 Homework 6 Fall 2017 Note: Problem 5 is mandatory for graduate students and extra credit for undergraduates. 1) The quarterly earnings per share for 1960-1980 are in the object in the TA package.
More information9. Using Excel matrices functions to calculate partial autocorrelations
9 Using Excel matrices functions to calculate partial autocorrelations In order to use a more elegant way to calculate the partial autocorrelations we need to borrow some equations from stochastic modelling,
More informationModelling Monthly Rainfall Data of Port Harcourt, Nigeria by Seasonal Box-Jenkins Methods
International Journal of Sciences Research Article (ISSN 2305-3925) Volume 2, Issue July 2013 http://www.ijsciences.com Modelling Monthly Rainfall Data of Port Harcourt, Nigeria by Seasonal Box-Jenkins
More informationForecasting using R. Rob J Hyndman. 3.2 Dynamic regression. Forecasting using R 1
Forecasting using R Rob J Hyndman 3.2 Dynamic regression Forecasting using R 1 Outline 1 Regression with ARIMA errors 2 Stochastic and deterministic trends 3 Periodic seasonality 4 Lab session 14 5 Dynamic
More informationSTAT 520 FORECASTING AND TIME SERIES 2013 FALL Homework 05
STAT 520 FORECASTING AND TIME SERIES 2013 FALL Homework 05 1. ibm data: The random walk model of first differences is chosen to be the suggest model of ibm data. That is (1 B)Y t = e t where e t is a mean
More informationSTAT Financial Time Series
STAT 6104 - Financial Time Series Chapter 9 - Heteroskedasticity Chun Yip Yau (CUHK) STAT 6104:Financial Time Series 1 / 43 Agenda 1 Introduction 2 AutoRegressive Conditional Heteroskedastic Model (ARCH)
More informationHomework 5, Problem 1 Andrii Baryshpolets 6 April 2017
Homework 5, Problem 1 Andrii Baryshpolets 6 April 2017 Total Private Residential Construction Spending library(quandl) Warning: package 'Quandl' was built under R version 3.3.3 Loading required package:
More information