4. MA(2) + drift: y_t = µ + ε_t + θ_1 ε_{t-1} + θ_2 ε_{t-2}.

Mean: y_t = µ + θ(L)ε_t, where θ(L) = 1 + θ_1 L + θ_2 L^2. Therefore,

E(y_t) = µ + θ(L)E(ε_t) = µ.
Example: MA(q) Model: y_t = ε_t + θ_1 ε_{t-1} + θ_2 ε_{t-2} + ... + θ_q ε_{t-q}.

1. Mean of MA(q) Process:
E(y_t) = E(ε_t + θ_1 ε_{t-1} + θ_2 ε_{t-2} + ... + θ_q ε_{t-q}) = 0.

2. Autocovariance Function of MA(q) Process:
γ(τ) = σ_ε^2 (θ_0 θ_τ + θ_1 θ_{τ+1} + ... + θ_{q-τ} θ_q) = σ_ε^2 Σ_{i=0}^{q-τ} θ_i θ_{τ+i},  τ = 1, 2, ..., q,
γ(τ) = 0,  τ = q+1, q+2, ...,
where θ_0 = 1.
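The autocovariance formula above is easy to check numerically. A minimal sketch for an MA(2) case; the helper name and the values θ_1 = 0.5, θ_2 = 0.3, σ_ε^2 = 1 are illustrative, not from the notes:

```python
# Theoretical autocovariances of an MA(q) process:
# gamma(tau) = sigma2 * sum_{i=0}^{q-tau} theta_i * theta_{tau+i}, with theta_0 = 1,
# and gamma(tau) = 0 for tau > q.

def ma_autocovariance(theta, sigma2, tau):
    """theta = [theta_1, ..., theta_q]; returns gamma(tau)."""
    coefs = [1.0] + list(theta)          # prepend theta_0 = 1
    q = len(theta)
    if tau > q:
        return 0.0                       # the ACF cuts off after lag q
    return sigma2 * sum(coefs[i] * coefs[tau + i] for i in range(q - tau + 1))

# Illustrative MA(2) with theta_1 = 0.5, theta_2 = 0.3, sigma2 = 1:
gamma0 = ma_autocovariance([0.5, 0.3], 1.0, 0)   # 1 + 0.25 + 0.09 = 1.34
gamma1 = ma_autocovariance([0.5, 0.3], 1.0, 1)   # 0.5 + 0.5*0.3 = 0.65
gamma3 = ma_autocovariance([0.5, 0.3], 1.0, 3)   # 0: beyond lag q = 2
```

The cutoff at lag q is exactly the identification signature of an MA(q) process used later in the notes.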
3. MA(q) process is stationary.

4. MA(q) + drift: y_t = µ + ε_t + θ_1 ε_{t-1} + θ_2 ε_{t-2} + ... + θ_q ε_{t-q}.
Mean: y_t = µ + θ(L)ε_t, where θ(L) = 1 + θ_1 L + θ_2 L^2 + ... + θ_q L^q. Therefore, we have:
E(y_t) = µ + θ(L)E(ε_t) = µ.
ARMA Model

ARMA (Autoregressive Moving Average) Process

1. ARMA(p, q): y_t = φ_1 y_{t-1} + φ_2 y_{t-2} + ... + φ_p y_{t-p} + ε_t + θ_1 ε_{t-1} + θ_2 ε_{t-2} + ... + θ_q ε_{t-q},
which is rewritten as: φ(L)y_t = θ(L)ε_t,
where φ(L) = 1 - φ_1 L - φ_2 L^2 - ... - φ_p L^p and θ(L) = 1 + θ_1 L + θ_2 L^2 + ... + θ_q L^q.
2. Likelihood Function: The variance-covariance matrix of Y, denoted by V, has to be computed.

Example: ARMA(1,1) Process: y_t = φ_1 y_{t-1} + ε_t + θ_1 ε_{t-1}. Obtain the autocorrelation coefficient.

For the mean of y_t, take the expectation of both sides:
E(y_t) = φ_1 E(y_{t-1}) + E(ε_t) + θ_1 E(ε_{t-1}),
where the second and third terms are zero.
Therefore, we obtain: E(y_t) = 0.

For the autocovariance of y_t, multiply both sides by y_{t-τ} and take the expectation:
E(y_t y_{t-τ}) = φ_1 E(y_{t-1} y_{t-τ}) + E(ε_t y_{t-τ}) + θ_1 E(ε_{t-1} y_{t-τ}).
Each term is given by:
E(y_t y_{t-τ}) = γ(τ),  E(y_{t-1} y_{t-τ}) = γ(τ-1),
E(ε_t y_{t-τ}) = σ_ε^2,  τ = 0,
               = 0,      τ = 1, 2, ....

E(ε_{t-1} y_{t-τ}) = (φ_1 + θ_1)σ_ε^2,  τ = 0,
                   = σ_ε^2,             τ = 1,
                   = 0,                 τ = 2, 3, ....

Therefore, we obtain:
γ(0) = φ_1 γ(1) + (1 + φ_1 θ_1 + θ_1^2)σ_ε^2,
γ(1) = φ_1 γ(0) + θ_1 σ_ε^2,
γ(τ) = φ_1 γ(τ-1),  τ = 2, 3, ....
From the first two equations, γ(0) and γ(1) solve the linear system:

γ(0) - φ_1 γ(1) = (1 + φ_1 θ_1 + θ_1^2) σ_ε^2,
-φ_1 γ(0) + γ(1) = θ_1 σ_ε^2.

Inverting the 2×2 coefficient matrix (determinant 1 - φ_1^2) gives:

γ(0) = σ_ε^2 (1 + 2φ_1 θ_1 + θ_1^2) / (1 - φ_1^2),
γ(1) = σ_ε^2 (1 + φ_1 θ_1)(φ_1 + θ_1) / (1 - φ_1^2).
Thus, the initial value of the autocorrelation coefficient is given by:

ρ(1) = γ(1)/γ(0) = (1 + φ_1 θ_1)(φ_1 + θ_1) / (1 + 2φ_1 θ_1 + θ_1^2).

We have: ρ(τ) = φ_1 ρ(τ-1), τ = 2, 3, ....
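The closed forms for γ(0), γ(1) and ρ(1) can be verified against the two defining equations they were solved from. A quick numerical check; φ_1 = 0.5, θ_1 = 0.3, σ_ε^2 = 1 are arbitrary illustrative values:

```python
# ARMA(1,1): check that the solved gamma(0), gamma(1) satisfy
#   gamma(0) = phi*gamma(1) + (1 + phi*theta + theta^2)*sigma2
#   gamma(1) = phi*gamma(0) + theta*sigma2
phi, theta, sigma2 = 0.5, 0.3, 1.0

gamma0 = sigma2 * (1 + 2 * phi * theta + theta**2) / (1 - phi**2)
gamma1 = sigma2 * (1 + phi * theta) * (phi + theta) / (1 - phi**2)

assert abs(gamma0 - (phi * gamma1 + (1 + phi * theta + theta**2) * sigma2)) < 1e-12
assert abs(gamma1 - (phi * gamma0 + theta * sigma2)) < 1e-12

# Initial autocorrelation, then the geometric decay rho(tau) = phi*rho(tau-1):
rho1 = gamma1 / gamma0
rho2 = phi * rho1
```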
ARMA(p, q) + drift: y_t = µ + φ_1 y_{t-1} + φ_2 y_{t-2} + ... + φ_p y_{t-p} + ε_t + θ_1 ε_{t-1} + θ_2 ε_{t-2} + ... + θ_q ε_{t-q}.

Mean of ARMA(p, q) Process: φ(L)y_t = µ + θ(L)ε_t,
where φ(L) = 1 - φ_1 L - φ_2 L^2 - ... - φ_p L^p and θ(L) = 1 + θ_1 L + θ_2 L^2 + ... + θ_q L^q. Therefore,

y_t = φ(L)^{-1} µ + φ(L)^{-1} θ(L) ε_t,
E(y_t) = φ(L)^{-1} µ + φ(L)^{-1} θ(L) E(ε_t) = φ(1)^{-1} µ = µ / (1 - φ_1 - φ_2 - ... - φ_p).
ARIMA Model

Autoregressive Integrated Moving Average (ARIMA) Model

ARIMA(p, d, q) Process: φ(L)Δ^d y_t = θ(L)ε_t,
where Δ^d y_t = Δ^{d-1}(1 - L)y_t = Δ^{d-1} y_t - Δ^{d-1} y_{t-1} = (1 - L)^d y_t for d = 1, 2, ..., and Δ^0 y_t = y_t.
SARIMA Model

Seasonal ARIMA (SARIMA) Process:

1. SARIMA(p, d, q): φ(L)Δ^d Δ_s y_t = θ(L)ε_t,
where Δ_s y_t = (1 - L^s)y_t = y_t - y_{t-s}. s = 4 when y_t denotes quarterly data and s = 12 when y_t represents monthly data.
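The operators Δ and Δ_s act on a series in the same way, differing only in the lag. A small sketch; the helper name and the sample series are made up for illustration (a quarterly pattern plus a trend, so the double difference vanishes):

```python
def diff(y, s=1):
    """Apply (1 - L^s): returns y_t - y_{t-s}, shortening the series by s terms."""
    return [y[t] - y[t - s] for t in range(s, len(y))]

# Regular first difference (d = 1) followed by a seasonal difference with s = 4,
# as in a SARIMA model for quarterly data:
y = [10, 12, 9, 11, 13, 15, 12, 14, 16]      # illustrative quarterly series
dy = diff(y)           # (1 - L) y_t: removes the trend
sdy = diff(dy, s=4)    # (1 - L^4)(1 - L) y_t: removes the seasonal pattern
```

Because the illustrative series is an exact seasonal pattern shifted up by a constant each year, `sdy` is identically zero; on real data it would be a stationary remainder.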
Optimal Prediction

1. AR(p) Process: y_t = φ_1 y_{t-1} + ... + φ_p y_{t-p} + ε_t

(a) Define: E(y_{t+k} | Y_t) = y_{t+k|t}, where Y_t denotes all the information available at time t. Taking the conditional expectation of y_{t+k} = φ_1 y_{t+k-1} + ... + φ_p y_{t+k-p} + ε_{t+k} on both sides,
y_{t+k|t} = φ_1 y_{t+k-1|t} + ... + φ_p y_{t+k-p|t},
where y_{s|t} = y_s for s ≤ t.

(b) Optimal prediction is given by solving the above difference equation.

2. MA(q) Process: y_t = ε_t + θ_1 ε_{t-1} + ... + θ_q ε_{t-q}

(a) Let ε̂_t, ε̂_{t-1}, ..., ε̂_1 be the estimated errors.
(b) y_{t+k} = ε_{t+k} + θ_1 ε_{t+k-1} + ... + θ_q ε_{t+k-q}
(c) Therefore, y_{t+k|t} = ε_{t+k|t} + θ_1 ε_{t+k-1|t} + ... + θ_q ε_{t+k-q|t},
where ε_{s|t} = 0 for s > t and ε_{s|t} = ε̂_s for s ≤ t.
3. ARMA(p, q) Process: y_t = φ_1 y_{t-1} + ... + φ_p y_{t-p} + ε_t + θ_1 ε_{t-1} + ... + θ_q ε_{t-q}

(a) y_{t+k} = φ_1 y_{t+k-1} + ... + φ_p y_{t+k-p} + ε_{t+k} + θ_1 ε_{t+k-1} + ... + θ_q ε_{t+k-q}
(b) Optimal prediction is: y_{t+k|t} = φ_1 y_{t+k-1|t} + ... + φ_p y_{t+k-p|t} + ε_{t+k|t} + θ_1 ε_{t+k-1|t} + ... + θ_q ε_{t+k-q|t},
where y_{s|t} = y_s and ε_{s|t} = ε̂_s for s ≤ t, and ε_{s|t} = 0 for s > t.
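The prediction rules above translate directly into a recursion: past values and past estimated errors are used as-is, forecasts replace future y's, and zero replaces future ε's. A sketch covering the general ARMA(p, q) case; the helper name, parameter values, and presample data are illustrative, and at least p observations and q estimated errors are assumed to be supplied:

```python
def arma_forecast(y, eps_hat, phi, theta, k):
    """k-step-ahead forecasts y_{t+1|t}, ..., y_{t+k|t} for an ARMA(p, q):
    y_{s|t} = y_s and eps_{s|t} = eps_hat_s for s <= t; eps_{s|t} = 0 for s > t."""
    p, q = len(phi), len(theta)
    ys = list(y)                     # observed values, forecasts appended below
    eps = list(eps_hat) + [0.0] * k  # future errors are set to zero
    t = len(y)
    for step in range(k):
        s = t + step                 # index of the value being forecast
        ar = sum(phi[i] * ys[s - 1 - i] for i in range(p))
        ma = sum(theta[j] * eps[s - 1 - j] for j in range(q))
        ys.append(ar + ma)           # eps_{s|t} = 0 contributes nothing
    return ys[t:]

# ARMA(1,1) with phi_1 = 0.5, theta_1 = 0.3 (illustrative):
f = arma_forecast(y=[1.0, 2.0], eps_hat=[0.2, 0.4], phi=[0.5], theta=[0.3], k=3)
# 1-step forecast: 0.5*2.0 + 0.3*0.4 = 1.12; beyond the MA horizon the
# forecasts decay geometrically: y_{t+k|t} = 0.5 * y_{t+k-1|t}
```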
Identification

1. Based on AIC or SBIC given d, s, we obtain p, q.

(a) AIC (Akaike's Information Criterion): AIC = -2 log(likelihood) + 2k,
where k = p + q, which is the number of parameters estimated.
(b) SBIC (Schwarz's Bayesian Information Criterion): SBIC = -2 log(likelihood) + k log T,
where T denotes the number of observations.

2. From the sample autocorrelation coefficient function ρ̂(k) and the partial autocorrelation coefficient function φ̂_{k,k} for k = 1, 2, ..., we obtain p, d, q, s.

                                     AR(p) Process           MA(q) Process
Autocorrelation Function             Gradually decreasing    ρ(k) = 0, k = q+1, q+2, ...
Partial Autocorrelation Function     φ_{k,k} = 0,            Gradually decreasing
                                     k = p+1, p+2, ...

(a) Compute Δ_s y_t to remove seasonality.
Compute the autocovariance functions of Δ_s y_t. If the autocovariance functions still have period s, we take (1 - L^s) again.

(b) Determine the order of differencing. Compute the autocovariance functions after each differencing. If the autocovariance functions decrease as τ becomes large, go to the next step.
(c) Determine the order of the AR terms (i.e., p). Compute the partial autocovariance functions each time. If the partial autocovariance functions are close to zero after some τ, go to the next step.
(d) Determine the order of the MA terms (i.e., q). Compute the autocovariance functions each time. If the autocovariance functions scatter randomly around zero, the procedure ends.
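Steps (a)-(d) rest on the sample autocorrelation function ρ̂(k). A minimal sketch of how it is computed; the helper name and the demo series (a deterministic alternating sequence, strongly negatively autocorrelated at lag 1) are purely illustrative:

```python
def sample_acf(y, k):
    """Sample autocorrelation rho_hat(k) = c(k)/c(0), where
    c(k) = (1/T) * sum_{t=k+1}^{T} (y_t - ybar)(y_{t-k} - ybar)."""
    T = len(y)
    ybar = sum(y) / T
    def c(lag):
        return sum((y[t] - ybar) * (y[t - lag] - ybar) for t in range(lag, T)) / T
    return c(k) / c(0)

# An alternating series: adjacent values move in opposite directions,
# so rho_hat(1) is strongly negative and rho_hat(2) strongly positive.
y = [1.0, -1.0, 1.0, -1.0, 1.0, -1.0, 1.0, -1.0]
r1 = sample_acf(y, 1)
r2 = sample_acf(y, 2)
```

In practice one plots ρ̂(k) and φ̂_{k,k} over many lags (the correlogram, as `corrgram` does in the Stata example below) and reads off the cutoff and decay patterns from the table above.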
Example of SARIMA using Consumption Data

Construct a SARIMA model using monthly, seasonally unadjusted consumption expenditure data and Stata 12. Estimation Period: Jan. 1970 - Dec. 2012 (T = 516).

. gen time=_n
. tsset time
        time variable:  time, 1 to 516
                delta:  1 unit

. corrgram expend
LAG   AC   PAC   Q   Prob>Q   [Autocorrelation]   [Partial Autocor]
(output omitted)

. gen dexp=expend-l.expend
(1 missing value generated)
. corrgram dexp
LAG   AC   PAC   Q   Prob>Q   [Autocorrelation]   [Partial Autocor]
(output omitted)

. gen sdex=dexp-l12.dexp
(13 missing values generated)
. corrgram sdex
LAG   AC   PAC   Q   Prob>Q   [Autocorrelation]   [Partial Autocor]
(output omitted)
. arima sdex, ar(1,2) ma(1)

(setting optimization to BHHH)
(iteration log omitted)
(switching optimization to BFGS)
(iteration log omitted)

ARIMA regression

Sample: ...                          Number of obs = 503
                                     Wald chi2(3)  = ...
Log likelihood = ...                 Prob > chi2   = ...
                          OPG
       sdex       Coef.   Std. Err.     z    P>|z|   [95% Conf. Interval]

sdex
      _cons        ...
ARMA
         ar
        L1.        ...
        L2.        ...
         ma
        L1.        ...
     /sigma        ...

Note: The test of the variance against zero is one sided, and the two-sided confidence interval is truncated at zero.
. estat ic

Model    Obs    ll(null)    ll(model)    df    AIC    BIC
  ...

Note: N=Obs used in calculating BIC; see [R] BIC note
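The AIC and BIC reported by `estat ic` are the criteria defined in the Identification section, computed from the maximized log likelihood. A sketch of that computation; the log-likelihood values and parameter counts below are illustrative, not the lost Stata output:

```python
import math

def aic(loglik, k):
    # AIC = -2 log(likelihood) + 2k, k = number of estimated parameters
    return -2.0 * loglik + 2.0 * k

def sbic(loglik, k, T):
    # SBIC = -2 log(likelihood) + k log T, T = number of observations
    return -2.0 * loglik + k * math.log(T)

# Illustrative comparison of two candidate ARMA orders on T = 503 observations:
m1 = (aic(-2500.0, 3), sbic(-2500.0, 3, 503))   # 3-parameter model
m2 = (aic(-2499.0, 5), sbic(-2499.0, 5, 503))   # richer model, small loglik gain
# The model with the smaller criterion value is preferred; here the small
# improvement in fit does not pay for the two extra parameters.
```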
ARCH and GARCH Models

Autoregressive Conditional Heteroskedasticity (ARCH)
Generalized Autoregressive Conditional Heteroskedasticity (GARCH)

1. ARCH(p) Model: ε_t | ε_{t-1}, ε_{t-2}, ..., ε_1 ~ N(0, h_t),
where h_t = α_0 + α_1 ε_{t-1}^2 + ... + α_p ε_{t-p}^2.
The unconditional variance of ε_t is:
σ_ε^2 = α_0 / (1 - α_1 - α_2 - ... - α_p).

2. GARCH(p, q) Model: ε_t | ε_{t-1}, ε_{t-2}, ..., ε_1 ~ N(0, h_t),
where h_t = α_0 + α_1 ε_{t-1}^2 + ... + α_p ε_{t-p}^2 + β_1 h_{t-1} + ... + β_q h_{t-q}.
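The conditional-variance recursion is a few lines of code once the shock sequence is given. A sketch for the GARCH(1,1) case, which reduces to ARCH(1) when β_1 = 0; the helper name, the parameters (α_0 = 0.1, α_1 = 0.2, β_1 = 0.7), and the shocks are illustrative:

```python
def garch_variance_path(eps, alpha0, alpha1, beta1, h0):
    """Conditional variances h_1, ..., h_T from
    h_t = alpha0 + alpha1*eps_{t-1}^2 + beta1*h_{t-1}  (GARCH(1,1));
    setting beta1 = 0 gives ARCH(1). h0 is the starting variance h_1."""
    h = [h0]
    for e in eps[:-1]:                       # each shock feeds the next variance
        h.append(alpha0 + alpha1 * e**2 + beta1 * h[-1])
    return h

eps = [1.0, -2.0, 0.5, 0.0]                  # illustrative shock sequence
h = garch_variance_path(eps, alpha0=0.1, alpha1=0.2, beta1=0.7, h0=1.0)
# h_2 = 0.1 + 0.2*1.0 + 0.7*1.0 = 1.0; h_3 = 0.1 + 0.2*4.0 + 0.7*1.0 = 1.6:
# the large shock eps_2 = -2 raises the next period's variance.
```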
3. Application to OLS (Case of ARCH(1) Model):
y_t = x_t β + ε_t,  ε_t | ε_{t-1}, ε_{t-2}, ..., ε_1 ~ N(0, α_0 + α_1 ε_{t-1}^2).

The joint density of ε_1, ε_2, ..., ε_T is:

f(ε_1, ..., ε_T) = f(ε_1) Π_{t=2}^T f(ε_t | ε_{t-1}, ..., ε_1)
= (2π)^{-1/2} ( α_0 / (1 - α_1) )^{-1/2} exp( -ε_1^2 / (2α_0/(1 - α_1)) )
  × (2π)^{-(T-1)/2} Π_{t=2}^T (α_0 + α_1 ε_{t-1}^2)^{-1/2} exp( -(1/2) Σ_{t=2}^T ε_t^2 / (α_0 + α_1 ε_{t-1}^2) ).
The log-likelihood function is:

log L(β, α_0, α_1; y_1, ..., y_T)
= -(1/2) log(2π) - (1/2) log( α_0 / (1 - α_1) ) - (y_1 - x_1 β)^2 / (2α_0/(1 - α_1))
  - ((T - 1)/2) log(2π) - (1/2) Σ_{t=2}^T log( α_0 + α_1 (y_{t-1} - x_{t-1} β)^2 )
  - (1/2) Σ_{t=2}^T (y_t - x_t β)^2 / ( α_0 + α_1 (y_{t-1} - x_{t-1} β)^2 ).

Obtain α_0, α_1 and β such that the log-likelihood function is maximized. α_0 > 0 and α_1 > 0 have to be satisfied.
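The log-likelihood above is simply the sum of Gaussian log densities of the residuals, each with its ARCH(1) conditional variance. A sketch with a scalar regressor; the function name, data, and parameter values are all illustrative:

```python
import math

def arch1_loglik(y, x, beta, alpha0, alpha1):
    """Log-likelihood of the ARCH(1) regression model:
    eps_1 ~ N(0, alpha0/(1 - alpha1)); eps_t | past ~ N(0, alpha0 + alpha1*eps_{t-1}^2)."""
    eps = [yt - xt * beta for yt, xt in zip(y, x)]   # residuals y_t - x_t*beta

    def lognorm(e, v):   # log density of N(0, v) evaluated at e
        return -0.5 * math.log(2 * math.pi) - 0.5 * math.log(v) - e**2 / (2 * v)

    ll = lognorm(eps[0], alpha0 / (1 - alpha1))      # stationary variance for eps_1
    for t in range(1, len(eps)):
        ll += lognorm(eps[t], alpha0 + alpha1 * eps[t - 1]**2)
    return ll

y = [1.2, 0.8, 1.5, 0.9]
x = [1.0, 1.0, 1.0, 1.0]     # intercept-only "regression" for illustration
ll = arch1_loglik(y, x, beta=1.0, alpha0=0.2, alpha1=0.3)
```

Maximizing this function over (β, α_0, α_1), subject to α_0 > 0 and α_1 > 0, gives the ML estimates.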
These two conditions are explicitly included when the model is modified to:
E(ε_t^2 | ε_{t-1}, ε_{t-2}, ..., ε_1) = α_0^2 + α_1^2 ε_{t-1}^2.

Testing the ARCH(1) Effect:

(a) Estimate y_t = x_t β + u_t by OLS, and compute β̂ and û_t = y_t - x_t β̂.
(b) Estimate û_t^2 = α_0 + α_1 û_{t-1}^2 by OLS. If α̂_1 is significant, there is an ARCH(1) effect in the error term.

This test corresponds to the LM test.
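Step (b) of the test is a simple regression of the squared residuals on their own lag. A sketch with a hypothetical simple-OLS helper; the data are constructed so that û_t^2 = 0.5 + 0.2 û_{t-1}^2 holds exactly, purely for illustration:

```python
def ols_slope_intercept(xs, ys):
    """Simple OLS of ys on a constant and xs: returns (a_hat, b_hat)."""
    n = len(xs)
    xbar, ybar = sum(xs) / n, sum(ys) / n
    b = sum((x - xbar) * (y - ybar) for x, y in zip(xs, ys)) / \
        sum((x - xbar)**2 for x in xs)
    return ybar - b * xbar, b

# Step (b) of the ARCH(1) test: regress u_t^2 on u_{t-1}^2.
# Squared residuals constructed to lie exactly on u2_t = 0.5 + 0.2*u2_{t-1}:
u2 = [1.0]
for _ in range(9):
    u2.append(0.5 + 0.2 * u2[-1])

a_hat, b_hat = ols_slope_intercept(u2[:-1], u2[1:])
# b_hat recovers alpha_1 = 0.2; on real residuals, a significant b_hat
# signals an ARCH(1) effect in the error term.
```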
More informationModelling Monthly Rainfall Data of Port Harcourt, Nigeria by Seasonal Box-Jenkins Methods
International Journal of Sciences Research Article (ISSN 2305-3925) Volume 2, Issue July 2013 http://www.ijsciences.com Modelling Monthly Rainfall Data of Port Harcourt, Nigeria by Seasonal Box-Jenkins
More informationIntroduction to Markov-switching regression models using the mswitch command
Introduction to Markov-switching regression models using the mswitch command Gustavo Sánchez StataCorp May 18, 2016 Aguascalientes, México (StataCorp) Markov-switching regression in Stata May 18 1 / 1
More informationMEI Exam Review. June 7, 2002
MEI Exam Review June 7, 2002 1 Final Exam Revision Notes 1.1 Random Rules and Formulas Linear transformations of random variables. f y (Y ) = f x (X) dx. dg Inverse Proof. (AB)(AB) 1 = I. (B 1 A 1 )(AB)(AB)
More informationAutocorrelation. Think of autocorrelation as signifying a systematic relationship between the residuals measured at different points in time
Autocorrelation Given the model Y t = b 0 + b 1 X t + u t Think of autocorrelation as signifying a systematic relationship between the residuals measured at different points in time This could be caused
More informationCh. 15 Forecasting. 1.1 Forecasts Based on Conditional Expectations
Ch 15 Forecasting Having considered in Chapter 14 some of the properties of ARMA models, we now show how they may be used to forecast future values of an observed time series For the present we proceed
More informationMonday 7 th Febraury 2005
Monday 7 th Febraury 2 Analysis of Pigs data Data: Body weights of 48 pigs at 9 successive follow-up visits. This is an equally spaced data. It is always a good habit to reshape the data, so we can easily
More informationTime Series Analysis. James D. Hamilton PRINCETON UNIVERSITY PRESS PRINCETON, NEW JERSEY
Time Series Analysis James D. Hamilton PRINCETON UNIVERSITY PRESS PRINCETON, NEW JERSEY PREFACE xiii 1 Difference Equations 1.1. First-Order Difference Equations 1 1.2. pth-order Difference Equations 7
More informationEcon 423 Lecture Notes
Econ 423 Lecture Notes (hese notes are modified versions of lecture notes provided by Stock and Watson, 2007. hey are for instructional purposes only and are not to be distributed outside of the classroom.)
More information