Figure 29: AR model fit to the speech sample "ah" (top), the residual, and a random sample from the fitted model (bottom), together with the ACFs of the original, the residual and the generated sample.

Theorem 5.6. If $(X_i)$ is a stationary AR(p) with $(W_i) \sim \mathrm{WN}(\sigma^2)$, and $\hat\phi^{(n)}$ are the YW estimates based on $X_1,\ldots,X_n$, then
$$\sqrt{n}\,\big(\hat\phi^{(n)} - \phi\big) \xrightarrow[n\to\infty]{} N(0,\, \sigma^2 \Gamma^{-1})$$
in distribution. Moreover, $\hat\sigma^2 \to \sigma^2$ in probability.

For a proof, see Brockwell and Davis.

Example 5.7. The mobile phone (GSM) standard involves lossy speech compression based on "linear prediction coefficients", which is engineering terminology for AR coefficients. Figure 29 shows the beginning of a 0.5 second speech sample of "ah" at 8000 samples per second, the corresponding residual of an AR(100) fit to the sample, and a random sample of the fitted AR(100). (The compression includes both the AR coefficients and an additional residual compression, which involves a lot of fine tuning...)
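Theorem 5.6 can be checked numerically in R. The following sketch is illustrative (the simulated series and the AR(2) parameters are made up for the example); ar.yw returns as asy.var.coef the asymptotic covariance matrix of the coefficient estimates, which should correspond to $\sigma^2 \Gamma^{-1}/n$ of the theorem:

# Illustrative sketch: Yule-Walker estimates and their asymptotic standard errors
set.seed(1)
x <- arima.sim(model=list(ar=c(0.5, -0.3)), n=2000)  # simulated AR(2)
fit <- ar.yw(x, order.max=2, aic=FALSE)              # Yule-Walker fit
fit$ar                                               # estimates of (phi_1, phi_2)
fit$var.pred                                         # estimate of sigma^2
sqrt(diag(fit$asy.var.coef))                         # asymptotic standard errors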
In the AR model, we have
$$(X_i - \mu) - \sum_{j=1}^p \phi_j (X_{i-j} - \mu) = W_i,$$
and we could also think of the following estimator.

Definition 5.8 (Conditional sum of squares). The conditional sum of squares, or conditional least squares, estimator of $\phi = (\phi_1,\ldots,\phi_p)$ and $\mu$ is
$$(\hat\phi, \hat\mu) := \operatorname*{argmin}_{(\phi,\mu)}\, S_c(x_1,\ldots,x_n;\, \phi, \mu), \quad \text{where}$$
$$S_c(x_1,\ldots,x_n;\, \phi, \mu) := \sum_{i=p+1}^n \Big( x_i - \mu - \sum_{j=1}^p \phi_j (x_{i-j} - \mu) \Big)^2. \tag{9}$$

If we differentiate with respect to $\mu$ and set to zero, we get
$$\hat\mu = \frac{\sum_{i=p+1}^n \big( x_i - \sum_{j=1}^p \phi_j x_{i-j} \big)}{(n-p)\big(1 - \sum_{j=1}^p \phi_j\big)} = \frac{1}{n-p} \sum_{i=p+1}^n x_i + R_{n,p},$$
if $n$ is large, because $R_{n,p}$ is a residual term of order $O(p^2/n)$.

If we substitute $\mu = \bar{x}$ in (9), differentiate with respect to $\phi$ and set to zero, we get that the estimates $\hat\phi_j$ should satisfy, for each $k = 1,\ldots,p$,
$$\frac{1}{n} \sum_{i=p+1}^n (x_i - \bar{x})(x_{i-k} - \bar{x}) - \sum_{j=1}^p \hat\phi_j \Big( \frac{1}{n} \sum_{i=p+1}^n (x_{i-j} - \bar{x})(x_{i-k} - \bar{x}) \Big) = 0,$$
which is, for large $n$, close to
$$\hat\gamma_k = \sum_{j=1}^p \hat\phi_j\, \hat\gamma_{k-j},$$
which corresponds to the Yule-Walker estimate. This shows that for large $n$, the Yule-Walker estimates are going to be similar to the CSS estimates.
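To make Definition 5.8 concrete, $S_c$ can be written out and minimised numerically; the following sketch (again with an illustrative simulated AR(2) series) compares the resulting CSS estimates with the Yule-Walker ones:

# Illustrative sketch: minimising S_c numerically and comparing with Yule-Walker
Sc <- function(par, x, ord) {
  phi <- par[1:ord]; mu <- par[ord + 1]
  s <- 0
  for (i in (ord + 1):length(x))
    s <- s + ((x[i] - mu) - sum(phi * (x[i - (1:ord)] - mu)))^2
  s
}
set.seed(1)
x <- arima.sim(model=list(ar=c(0.5, -0.3)), n=2000)
opt <- optim(c(0, 0, mean(x)), Sc, x=x, ord=2)  # CSS estimates of (phi_1, phi_2, mu)
opt$par[1:2]
ar.yw(x, order.max=2, aic=FALSE)$ar             # close to the CSS estimates for large n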
Example 5.9. Assume next an AR(1) with Gaussian white noise $(W_n)$ i.i.d. $N(0,\sigma^2)$. We can write the likelihood now as
$$L(x_1,\ldots,x_n) = N\Big( x_1 - \mu;\; 0,\; \frac{\sigma^2}{1-\phi_1^2} \Big) \prod_{i=2}^n N\big( x_i - \mu;\; \phi_1 (x_{i-1} - \mu),\; \sigma^2 \big),$$
where $N(x; m, \sigma^2)$ stands for the Gaussian p.d.f. with mean $m$ and variance $\sigma^2$. Denoting $\hat{x}_i = x_i - \mu$, we can expand
$$\log L = -\frac{1-\phi_1^2}{2\sigma^2}\, \hat{x}_1^2 - \frac{1}{2\sigma^2} \sum_{i=2}^n (\hat{x}_i - \phi_1 \hat{x}_{i-1})^2 - \frac{n}{2} \log \sigma^2 + \frac{1}{2} \log(1-\phi_1^2) + c$$
$$= -\frac{1}{2\sigma^2}\, S_c(x_1,\ldots,x_n;\, \phi_1, \mu) + R_1(x_1, \sigma^2, \phi_1^2, \mu).$$

As the number of samples increases, the conditional sum of squares term will dominate, meaning that the maximum likelihood estimator will be similar to the CSS, and hence the YW, estimators.

Remark 5.10. Finding both the CSS and the ML estimates requires, in general, iterative numerical optimisation methods.

The story about the maximum likelihood estimates of a general AR(p) is similar: the first $p$ variables follow a multivariate Gaussian distribution, and the conditional sum of squares term will be the dominating one, so the maximum likelihood estimates will coincide with the YW and CSS estimates asymptotically. Keep in mind, however, that with any finite $n$, the Yule-Walker, the CSS and the ML estimates will all be generally different, and each may be preferable in specific applications. The R arima uses ML by default.

Question. Can you give some reasons why each of the three variants could be useful in certain applications?
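All three variants are readily available in R; a quick illustrative comparison on one simulated AR(1) series (parameters made up for the example):

# Illustrative sketch: YW, CSS and ML estimates differ slightly for finite n
set.seed(2)
x <- arima.sim(model=list(ar=0.75), n=200)
ar.yw(x, order.max=1, aic=FALSE)$ar             # Yule-Walker
coef(arima(x, order=c(1,0,0), method="CSS"))    # conditional sum of squares
coef(arima(x, order=c(1,0,0), method="ML"))     # maximum likelihood

Note that arima's default method "CSS-ML" also returns ML estimates; it only uses CSS to find starting values for the likelihood optimisation.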
5.3 General ARMA

We saw earlier that the Yule-Walker method of estimation for the AR models was straightforward and efficient, and coincides asymptotically with the CSS and the ML estimators. There are no equally simple and well-behaved methods for the estimation of parameters in the general ARMA model, and one has to resort to numerical optimisation.

If the time series is not too long, and if the fitted model does not have too many parameters, it is common to use maximum likelihood estimation in the ARIMA context, assuming Gaussian white noise. This is also what the standard R ARIMA fitting tool arima does. The ML estimation requires iterative numerical optimisation methods, and we shall not take a closer look at the methods now. Instead, we quote the following asymptotic normality result.

Theorem 5.11 (not examinable). Suppose $\beta = (\phi, \theta)$ are the parameters of an ARMA(p,q) process $(X_i)$ with Gaussian $(W_i) \sim \mathrm{WN}(\sigma^2)$, which satisfies the Condition stated earlier. Let $\hat\beta^{(n)} = (\hat\phi^{(n)}, \hat\theta^{(n)})$ stand for the ML estimator calculated from $(X_1,\ldots,X_n)$. Then
$$\sqrt{n}\,\big(\hat\beta^{(n)} - \beta\big) \xrightarrow[n\to\infty]{} N\big(0,\, V(\beta)\big)$$
in distribution, where the covariance matrix $V(\beta)$ can be expressed as
$$V(\beta) = \sigma^2 \begin{bmatrix} E[UU^T] & E[UV^T] \\ E[VU^T] & E[VV^T] \end{bmatrix}^{-1}, \qquad U = (U_p,\ldots,U_1), \quad V = (V_q,\ldots,V_1),$$
where $(U_i)$ and $(V_i)$ are AR(p) and AR(q) processes satisfying $\phi(B)U_i = W_i$ and $\theta(B)V_i = W_i$. In case $p = 0$ or $q = 0$, the corresponding matrices vanish. See Brockwell and Davis, Section 8.8.

We note that the confidence intervals of the estimated parameters may be (and are) calculated from the Hessian,
$$\frac{1}{n} V(\beta) \approx \bigg( \Big[ -\frac{\partial^2}{\partial \beta_i\, \partial \beta_j} \ell(\hat\beta) \Big]_{i,j} \bigg)^{-1},$$
where $\ell$ is the log-likelihood of $\beta$.
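In R, this Hessian-based covariance matrix of an arima fit is available as var.coef, so approximate standard errors and confidence intervals can be read off directly; an illustrative sketch (simulated series and orders made up for the example):

# Illustrative sketch: Hessian-based standard errors from an arima ML fit
set.seed(3)
x <- arima.sim(model=list(ar=0.6, ma=0.4), n=500)
fit <- arima(x, order=c(1,0,1), method="ML")
se <- sqrt(diag(fit$var.coef))                  # the standard errors printed by arima
cbind(estimate=coef(fit),
      lower=coef(fit) - 1.96*se,
      upper=coef(fit) + 1.96*se)                # approximate 95% confidence intervals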
6 Model selection tools

Perhaps the most challenging question in time series analysis is to find a satisfactory model class. In the case of real data, there rarely is a "correct" model, and what is satisfactory depends on what the model is used for. We will next look at some tools which may indicate which models might be appropriate.

The first thing to do is always to inspect the data. Plot the time series, or if it is long, look at shorter segments of the data at a time. Look for obvious trends or other behaviour suggesting non-stationarity. Inspect also cyclic or near-cyclic behaviour. Autocorrelation plots and the periodogram can be helpful; they can also suggest cyclic components. Remember to try also transformations, such as the log-transform or Box-Cox transforms.

It can be instructive to look at the data against lagged versions of itself, that is, the scatterplots of $(X_i, X_{i+k})$.

Example 6.1. Lag plots of the speech data in Example 5.7.

lag.plot(x, layout=c(2,3), set.lags=c(1:3, 32, 72, 104), pch=".")

Figure 30: Lag plots (lags 1, 2, 3, 32, 72, 104) of the speech data in Example 5.7.

6.1 Non-stationarity

If the data shows signs of non-stationarity, recall that you may inspect the differences. Note, however, that if removal of a trend makes the data seem stationary (the data is trend stationary), it is possible to consider the original data with an external regressor handling the trend.

Remark 6.2. Unit root tests were mentioned in Section 4.5. We shall not discuss unit root tests further, but only note that there are some implemented in the R library tseries: the Augmented Dickey-Fuller test adf.test and the Phillips-Perron test pp.test.
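As an illustrative usage sketch of these tests (my own, on a made-up random walk): adf.test should fail to reject the unit root for the walk itself, but reject it for the differenced series.

# Illustrative sketch: unit root tests on a random walk and its differences
library(tseries)
set.seed(4)
x <- cumsum(rnorm(300))   # random walk, i.e. a unit root process
adf.test(x)               # large p-value: unit root not rejected
adf.test(diff(x))         # small p-value: differences look stationary
pp.test(x)                # Phillips-Perron test, similar conclusion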
6.2 Autocorrelation

Theorem 1.17 stated asymptotic normality of the sample ACF in the case of white noise. The following generalises that to a large class of linear processes.

Theorem 6.3 (Not examinable). Suppose $(X_i)$ is a stationary process given by
$$X_i = \mu + \sum_{j=-\infty}^{\infty} c_j W_{i-j}, \qquad (W_i) \sim \mathrm{WN}(\sigma^2),$$
with coefficients satisfying $\sum_j |c_j| < \infty$ and $\sum_j |j|\, c_j^2 < \infty$, and let $(\rho_k)$ be its autocorrelation. Then, for any $h \in \mathbb{N}$, the sample autocorrelations $\hat\rho_k^{(n)}$ calculated from $X_1,\ldots,X_n$ satisfy
$$\sqrt{n}\,\big( \hat\rho_1^{(n)} - \rho_1,\; \ldots,\; \hat\rho_h^{(n)} - \rho_h \big) \xrightarrow[n\to\infty]{} N(0, W_h)$$
in distribution, where the limiting covariance matrix is given as
$$[W_h]_{ij} = \sum_{k=1}^{\infty} \big( \rho_{k+i} + \rho_{k-i} - 2\rho_i \rho_k \big) \big( \rho_{k+j} + \rho_{k-j} - 2\rho_j \rho_k \big).$$

For a proof, see Brockwell and Davis.

Remark 6.4. Theorem 6.3 justifies us in saying that $(\hat\rho_1,\ldots,\hat\rho_h)$ is approximately $N\big( (\rho_1,\ldots,\rho_h),\, W_h/n \big)$ for large $n$. Note that Theorem 6.3 holds for any stationary ARMA(p,q), because we know that it has a representation with $c_j = 0$ for $j < 0$ and $c_j$ decaying exponentially in $j$.

Corollary 6.5. Suppose $(X_i)$ is an MA(q). Then, for all $i > q$,
$$\sqrt{n}\,(\hat\rho_i - \rho_i) \to N(0, \sigma_i^2), \qquad \text{where} \quad \sigma_i^2 = 1 + 2 \sum_{j=1}^q \rho_j^2.$$

Proof. Left as an exercise.

The result of Corollary 6.5 is used by the R acf, when called with ci.type="ma". It displays confidence bounds based on
$$\hat\sigma_i^2 = 1 + 2 \sum_{j=1}^{i-1} \hat\rho_j^2.$$

Question. Can you explain what the rationale of this is? What about the possible caveats?

Example 6.6. Comparison of ACF with standard and MA confidence bounds.

x = arima.sim(model=list(ma=c(3/4, -1/5, -1/5)), 100)
par(mfrow=c(2, 1))
acf(x); acf(x, ci.type="ma")
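To see what the MA-type bounds are, they can be recomputed by hand from the sample ACF; an illustrative sketch, with the constant white-noise bound shown for comparison:

# Illustrative sketch: recomputing the lag-dependent bounds based on Corollary 6.5
set.seed(5)
x <- arima.sim(model=list(ma=c(3/4, -1/5, -1/5)), 100)
n <- length(x)
rho <- acf(x, plot=FALSE)$acf[-1]                      # sample ACF at lags 1, 2, ...
sigma2 <- 1 + 2 * c(0, cumsum(rho^2))[seq_along(rho)]  # hat sigma_i^2 = 1 + 2 sum_{j<i} rho_j^2
1.96 * sqrt(sigma2 / n)                                # lag-dependent bounds, as with ci.type="ma"
1.96 / sqrt(n)                                         # constant white-noise bound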
6.3 Partial autocorrelation function

Definition 6.7. The partial autocorrelation function (PACF) $(\alpha_k)_{k \ge 0}$ of a zero-mean finite-variance stationary process $(X_i)$ is defined through $\rho_0 = 1 = \alpha_0$ and $\rho_1 = \alpha_1$, and for $k \ge 2$ through
$$\alpha_k = \mathrm{Corr}\big( X_{k+1} - \hat{X}_{k+1|2:k},\; X_1 - \hat{X}_{1|2:k} \big),$$
where $\hat{X}_{k+1|2:k}$ and $\hat{X}_{1|2:k}$ are the best linear predictors (in the mean square sense)$^{12}$ of $X_{k+1}$ and $X_1$ given $X_2,\ldots,X_k$, respectively.

12. Recall that the best linear predictor $\hat{X}$ of a random variable $X$ given $Y_1,\ldots,Y_n$ in the mean square sense is $\hat{X} = \sum_{j=1}^n c_j Y_j$, where the constants $(c_1,\ldots,c_n)$ are chosen to minimise $E[(\hat{X} - X)^2]$. That is, the partial autocorrelation is the correlation of the residuals of $X_{k+1}$ and $X_1$ after regressing with $X_2,\ldots,X_k$.

Theorem 6.8. The partial autocorrelation of an AR(p) process satisfies $\alpha_k = 0$ for all $k > p$.

Proof. Let $k > p$; then we have
$$E[X_{k+1} \mid X_2,\ldots,X_k] = E\Big[ \sum_{j=1}^p \phi_j X_{k+1-j} + W_{k+1} \;\Big|\; X_2,\ldots,X_k \Big] = \sum_{j=1}^p \phi_j X_{k+1-j},$$
because $W_{k+1}$ is independent of $X_2,\ldots,X_k$, and each $X_{k+1-j}$ in the sum is one of the conditioning variables (since $2 \le k+1-j \le k$ for $1 \le j \le p$). Because the conditional expectation minimises the mean square error, we deduce that (in this specific case!) $\hat{X}_{k+1|2:k} = \sum_{j=1}^p \phi_j X_{k+1-j}$. We then conclude
$$\alpha_k = \mathrm{Corr}\big( W_{k+1},\; X_1 - \hat{X}_{1|2:k} \big) = 0.$$

Theorem 6.9. Let $(\rho_k)$ be the autocorrelation of a stationary process $(X_i)$, and assume $\beta^{(k)} = (\beta_1^{(k)},\ldots,\beta_k^{(k)})$ is the solution of $R^{(k)} \beta^{(k)} = \rho^{(k)}$, where $\rho^{(k)} = (\rho_1,\ldots,\rho_k)$ and $[R^{(k)}]_{ij} = \rho_{i-j}$ for $1 \le i,j \le k$. Then the partial autocorrelation is $\alpha_k = \beta_k^{(k)}$ for $k \ge 2$.

The proof can be found, for example, from Brockwell and Davis.

Definition 6.10. The sample PACF is $\hat\alpha_k = \hat\beta_k^{(k)}$, where $\hat\beta^{(k)}$ satisfies $\hat{R}^{(k)} \hat\beta^{(k)} = \hat\rho^{(k)}$, with $\hat{R}^{(k)}$ and $\hat\rho^{(k)}$ standing for $R^{(k)}$ and $\rho^{(k)}$ with the autocorrelations replaced by their corresponding sample quantities.
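Definition 6.10 translates into a few lines of R: solve $\hat{R}^{(k)} \hat\beta^{(k)} = \hat\rho^{(k)}$ for each $k$ and keep the last component. An illustrative sketch (simulated AR(1) made up for the example), comparing with the built-in pacf:

# Illustrative sketch: sample PACF via the linear systems of Definition 6.10
set.seed(6)
x <- arima.sim(model=list(ar=c(3/4)), n=300)
r <- acf(x, lag.max=10, plot=FALSE)$acf[, 1, 1]    # hat rho_0, ..., hat rho_10
alpha <- sapply(1:10, function(k)
  solve(toeplitz(r[1:k]), r[2:(k + 1)])[k])        # last component of hat beta^(k)
cbind(alpha, pacf(x, lag.max=10, plot=FALSE)$acf[, 1, 1])  # the two columns agree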
Remark 6.11. The partial autocorrelations can be calculated iteratively with the Levinson-Durbin algorithm, as discussed in Section 5.2. Note that each $\hat\alpha_p$ corresponds to the last estimated coefficient $\hat\phi_p$ of the AR(p) Yule-Walker estimates.

Example 6.12 (Not examinable). The PACF of MA(1) is (exercise)
$$\alpha_k = -\frac{(-\theta_1)^k \big(1 - \theta_1^2\big)}{1 - \theta_1^{2(k+1)}}.$$
The general story is similar (and very much examinable!): for MA(q), the PACF will not vanish but will tail off, just like the ACF for AR(p).
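The closed form is easy to sanity-check against the theoretical PACF computed by stats::ARMAacf; an illustrative sketch with $\theta_1 = 3/4$:

# Illustrative sketch: checking the MA(1) PACF formula of Example 6.12
theta <- 3/4
k <- 1:8
-(-theta)^k * (1 - theta^2) / (1 - theta^(2*(k + 1)))  # the closed form
ARMAacf(ma=theta, lag.max=8, pacf=TRUE)                # agrees with the above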
Theorem 6.13. Assume $(X_i)$ is AR(p). Then the partial autocorrelation coefficients $\hat\alpha_k^{(n)}$ calculated from $X_1,\ldots,X_n$ satisfy
$$\sqrt{n}\, \hat\alpha_k^{(n)} \xrightarrow[n\to\infty]{} N(0,1), \qquad \text{for } k > p.$$
This follows from the asymptotics of the Yule-Walker estimates stated in Theorem 5.6 (cf. Brockwell and Davis E.8.15). It is the basis of the PACF confidence intervals in R.

Example 6.14. Figure 31 shows the data, ACF and PACF of a simulated AR(1) with $\phi_1 = 3/4$, an MA(1) with $\theta_1 = 3/4$ and a random walk.

n = 300
x.ar <- arima.sim(model=list(ar=c(3/4)), n)
x.ma <- arima.sim(model=list(ma=c(3/4)), n)
x.rw <- cumsum(rnorm(n))
par(mfrow=c(3,3))
ts.plot(x.ar); acf(x.ar); pacf(x.ar)
ts.plot(x.ma); acf(x.ma); pacf(x.ma)
ts.plot(x.rw); acf(x.rw); pacf(x.rw)

Figure 31: Data (left), ACF (middle) and PACF (right) of the AR(1) (top), MA(1) (middle) and random walk (bottom) of Example 6.14.

6.4 Information criteria

Autocorrelation and partial autocorrelation plots can suggest certain low-order AR and MA models, respectively. For a mixed ARMA(p,q) with $p \ge 1$ and $q \ge 1$, neither the ACF nor the PACF vanishes, providing little help. Further, comparing AR, ARMA and MA models can be very difficult, because the model families are not ordered (how to compare AR(2) and ARMA(1,1), say?).

It is possible (and quite popular) to base the model choice on generic information criteria such as Akaike's AIC, the AIC with correction AICc, and the Bayesian BIC:
$$\mathrm{AIC} := -2 \ln L_{\mathrm{ML}} + 2k,$$
$$\mathrm{AICc} := -2 \ln L_{\mathrm{ML}} + \frac{2kn}{n-k-1} = \mathrm{AIC} + \frac{2k(k+1)}{n-k-1},$$
$$\mathrm{BIC} := -2 \ln L_{\mathrm{ML}} + k \ln n,$$
where $n$ is the length of the data, $L_{\mathrm{ML}}$ is the likelihood of the ML estimate and $k = p + q + 2$ is the number of parameters (including mean and variance).

Remark 6.15. Note that AICc converges to AIC as $n$ tends to infinity, but the penalty of BIC will be higher, leading it to favour models with fewer parameters.

fit <- arima(x, order=c(p,0,q))
n <- length(x); k <- q+p+2
fit.aic <- AIC(fit)
fit.aicc <- AIC(fit, k=2*n/(n-k-1))   # AICc via the penalty-per-parameter argument of AIC
fit.bic <- AIC(fit, k=log(n))         # BIC likewise

There is even an automatic ARIMA model fitting tool in R, which searches for the best (low-order) ARIMA in terms of some of the information criteria.
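The automatic tool referred to is presumably auto.arima from the forecast package; a minimal illustrative usage sketch (the simulated series is made up for the example):

# Illustrative sketch: automatic model search, presumably forecast::auto.arima
library(forecast)
set.seed(7)
x <- arima.sim(model=list(ar=0.6, ma=0.4), n=300)
auto.arima(x, ic="aicc", stationary=TRUE)   # searches low-order ARIMA models by AICc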
More information