Difference equations
1 Difference equations

Definitions:

A difference equation takes the general form
$x_t = f(x_{t-1}, x_{t-2}, \dots)$,
defining the current value of a variable $x$ as a function of previously generated values.

A finite order ($m$th order) difference equation takes the general form
$x_t = f(x_{t-1}, \dots, x_{t-m})$.

A linear difference equation takes the general form
$x_t = \beta_1 x_{t-1} + \beta_2 x_{t-2} + \cdots$

A stochastic difference equation takes the general form
$x_t = f(x_{t-1}, x_{t-2}, \dots, \varepsilon_t)$
where $\varepsilon_t$ is a random sequence (often i.i.d. in applications) called the forcing process or driving process.

(c) James Davidson /04/2014
2 A linear stochastic difference equation takes the general form
$x_t = \beta_1 x_{t-1} + \beta_2 x_{t-2} + \cdots + \varepsilon_t$.

The object is to solve these equations (determine the path $x_1, x_2, x_3, \dots$) given initial conditions $x_0, x_{-1}, \dots$ and (if present) the sequence $\varepsilon_t$.

Consider the non-stochastic case first, to establish methods and notation. The linear case is the simplest and best understood, and we focus on this.
3 Some useful formulae

A geometric series is
$A = 1 + a + a^2 + a^3 + \cdots = \sum_{i=0}^{\infty} a^i$.

Let the $t$th partial sum be denoted
$A_t = 1 + a + a^2 + \cdots + a^{t-2} + a^{t-1} = \sum_{i=0}^{t-1} a^i$.

Note that
$aA_t = a + a^2 + \cdots + a^{t-1} + a^t = A_t + a^t - 1$.

If $a \ne 1$, this has the closed-form solution
$A_t = \frac{1 - a^t}{1 - a}$.
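As a quick numerical check of the partial-sum formula (a sketch in Python, not part of the notes; the helper name is illustrative):

```python
# Check A_t = 1 + a + ... + a^(t-1) = (1 - a^t)/(1 - a) for a != 1,
# and that the partial sums approach 1/(1 - a) when |a| < 1.

def geometric_partial_sum(a, t):
    """Compute A_t = sum_{i=0}^{t-1} a^i by direct accumulation."""
    total = 0.0
    term = 1.0
    for _ in range(t):
        total += term
        term *= a
    return total

a, t = 0.9, 50
direct = geometric_partial_sum(a, t)
closed = (1 - a**t) / (1 - a)
assert abs(direct - closed) < 1e-9

# For |a| < 1 the partial sums converge to the limit 1/(1 - a).
assert abs(geometric_partial_sum(a, 2000) - 1 / (1 - a)) < 1e-9
```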
4 Cases:

1. If $|a| < 1$, the geometric series is convergent: $a^t \to 0$ as $t \to \infty$, and $A = \lim_{t\to\infty} A_t = \frac{1}{1-a}$.
2. If $a > 1$ it diverges: $A = +\infty$.
3. If $a = -1$, no solution: $A_t$ "flip-flops" between 0 and 1.
4. If $a < -1$, $A_t$ flip-flops between $\pm\infty$ in the limit!
5. Finally, if $a = 1$, $A_t = t$ and $A = +\infty$.
5 Also consider
$A_t' = a + 2a^2 + 3a^3 + \cdots + ta^t$.

By a similar argument,
$(1 - a)A_t' = (a + 2a^2 + 3a^3 + \cdots + ta^t) - (a^2 + 2a^3 + \cdots + ta^{t+1}) = a + a^2 + a^3 + \cdots + a^t - ta^{t+1}$.

Hence, with $a \ne 1$,
$A_t' = \frac{a}{1-a}(A_t - ta^t)$.

When $|a| < 1$, note that
$A' = \sum_{i=1}^{\infty} ia^i = \frac{a}{1-a}A = \frac{a}{(1-a)^2}$.
6 First order linear difference equation

$x_t = \beta_0 + \beta_1 x_{t-1}$

Given $x_0$, the solution path is found by iteration, as
$x_1 = \beta_0 + \beta_1 x_0$
$x_2 = \beta_0 + \beta_1 x_1 = \beta_0(1 + \beta_1) + \beta_1^2 x_0$
$\vdots$
$x_t = \beta_0(1 + \beta_1 + \cdots + \beta_1^{t-1}) + \beta_1^t x_0$.

If $|\beta_1| < 1$ the series is summable, and as $t \to \infty$,
$x_t \to \beta_0 \sum_{j=0}^{\infty} \beta_1^j = \frac{\beta_0}{1 - \beta_1}$.

This is called the stable solution, and is independent of $x_0$: $x_t$ approaches this point from any starting point. In the other cases of $\beta_1$, there is either no stable solution, or an infinite solution.

Note: we can guess the solution by putting $x_t = x_{t-1} = \bar{x}$ (say) and so solve
$\bar{x} = \frac{\beta_0}{1 - \beta_1}$.
This must be the stable solution if one exists, but otherwise it is irrelevant.
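The convergence to the fixed point can be watched numerically (a sketch; the parameter values are illustrative):

```python
# Iterate x_t = b0 + b1*x_{t-1} from several starting values and confirm
# that every path approaches the stable solution b0/(1 - b1) when |b1| < 1.

def iterate(b0, b1, x0, steps):
    x = x0
    for _ in range(steps):
        x = b0 + b1 * x
    return x

b0, b1 = 2.0, 0.5
fixed_point = b0 / (1 - b1)            # = 4.0, the stable solution
for x0 in (-10.0, 0.0, 25.0):          # independence of the initial condition
    assert abs(iterate(b0, b1, x0, 100) - fixed_point) < 1e-9
```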
7 Second Order Linear Difference Equation

$x_t = \beta_0 + \beta_1 x_{t-1} + \beta_2 x_{t-2}$

Our concern is to find conditions for a stable solution to this equation, as in the first-order case. If it exists, this must take the form
$\bar{x} = \frac{\beta_0}{1 - \beta_1 - \beta_2}$.

However, solution by iteration is obviously difficult:
$x_t = \beta_0 + \beta_1(\beta_0 + \beta_1 x_{t-2} + \beta_2 x_{t-3}) + \beta_2 x_{t-2} = \beta_0(1 + \beta_1) + (\beta_1^2 + \beta_2)x_{t-2} + \beta_1\beta_2 x_{t-3} = \cdots$?

How do we know if the solution converges?
8 Consider the pair of first-order equations
$x_t = \mu_1 x_{t-1} + y_t$
$y_t = \beta_0 + \mu_2 y_{t-1}$.

Write these in the form
$x_t = \mu_1 x_{t-1} + \beta_0 + \mu_2 y_{t-1} = \beta_0 + \mu_1 x_{t-1} + \mu_2(x_{t-1} - \mu_1 x_{t-2})$
$= \beta_0 + (\mu_1 + \mu_2)x_{t-1} - \mu_1\mu_2 x_{t-2}$
$= \beta_0 + \beta_1 x_{t-1} + \beta_2 x_{t-2}$.

It is intuitively clear that the stability conditions have the form
$|\mu_1| < 1$, $|\mu_2| < 1$
since then $y_t \to \beta_0/(1 - \mu_2)$ and $x_t \to \beta_0/((1 - \mu_1)(1 - \mu_2))$.

To apply these restrictions, it is necessary to invert the mapping $(\mu_1, \mu_2) \mapsto (\beta_1, \beta_2)$.
9 The Lag Operator

Let the operator $L$ (alternative notation, $B$) be defined by
$Lx_t = x_{t-1}$.
Then, for example, $L^2 x_t = L(Lx_t) = Lx_{t-1} = x_{t-2}$.

The second-order equation can now be written
$(1 - \mu_1 L)x_t = y_t$
or equivalently, since $(1 - \mu_2 L)y_t = \beta_0$,
$(1 - \mu_1 L)(1 - \mu_2 L)x_t = (1 - (\mu_1 + \mu_2)L + \mu_1\mu_2 L^2)x_t = (1 - \beta_1 L - \beta_2 L^2)x_t = \beta_0$.

The model therefore defines a quadratic equation in the lag operator.
10 Reminder: Roots of a Quadratic

Consider, for $z \in \mathbb{C}$, the quadratic equation
$z^2 - \beta_1 z - \beta_2 = (z - \mu_1)(z - \mu_2) = 0$
where $\mu_1 + \mu_2 = \beta_1$ and $\mu_1\mu_2 = -\beta_2$, and $\mu_1$ and $\mu_2$ are the roots (zeros) of this equation, given by
$\mu_1, \mu_2 = \frac{\beta_1 \pm \sqrt{\beta_1^2 + 4\beta_2}}{2}$.

When $\beta_2 < -\beta_1^2/4$ these solutions are complex numbers (a complex conjugate pair, since $\beta_1$ and $\beta_2$ are real).

Also note: the roots of the equation
$1 - \beta_1 z - \beta_2 z^2 = (1 - \mu_1 z)(1 - \mu_2 z) = 0$
are $1/\mu_1$ and $1/\mu_2$, similarly.

$y_t$ (above) can be complex-valued, although $x_t$ is real-valued by construction.
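The quadratic formula above can be checked directly (a sketch; `cmath` handles the conjugate-pair case, and the example values are illustrative):

```python
# Roots mu_1, mu_2 of z^2 - b1*z - b2 = 0; cmath covers the complex case.
import cmath

def roots(b1, b2):
    d = cmath.sqrt(b1**2 + 4 * b2)
    return (b1 + d) / 2, (b1 - d) / 2

# Real roots: mu1 + mu2 = b1 and mu1*mu2 = -b2, as stated in the notes.
m1, m2 = roots(1.1, -0.3)
assert abs((m1 + m2) - 1.1) < 1e-12 and abs(m1 * m2 - 0.3) < 1e-12

# Complex case: b2 < -b1^2/4 gives a conjugate pair.
c1, c2 = roots(1.0, -0.5)
assert abs(c1.imag) > 0 and abs(c1 - c2.conjugate()) < 1e-12
```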
11 Stability Analysis

The stable solution (finite, independent of initial conditions) evidently takes the form
$x_t = \frac{\beta_0}{1 - \beta_1 L - \beta_2 L^2}$
provided the inversion of the lag polynomial is a legitimate step.

Assume $|\mu_1| < 1$, $|\mu_2| < 1$ and consider, for $z \in \mathbb{C}$,
$\frac{1}{1 - \beta_1 z - \beta_2 z^2} = \frac{1}{(1 - \mu_1 z)(1 - \mu_2 z)}$.

Note that, for $|z| \le 1$,
$\frac{1}{1 - \mu_1 z} = \sum_{j=0}^{\infty} \mu_1^j z^j$.

1. Assume $\mu_1 \ne \mu_2$ (both real). Then, for $|z| \le 1$,
$\frac{1}{(1 - \mu_1 z)(1 - \mu_2 z)} = \sum_{j=0}^{\infty} \frac{\mu_1^{j+1} - \mu_2^{j+1}}{\mu_1 - \mu_2} z^j$.
12 2. Assume $\mu_1 = \mu_2$ (real). Write $\mu_2 = \mu_1 + \delta$ in the coefficients above. By L'Hôpital's rule*,
$\frac{\mu_1^{j+1} - \mu_2^{j+1}}{\mu_1 - \mu_2} \to (j+1)\mu_1^j$ as $\delta \to 0$,
and hence
$\frac{1}{(1 - \mu_1 z)^2} = \sum_{j=0}^{\infty} (j+1)\mu_1^j z^j$.
(Compare the sum $\sum_i ia^i$ on slide 5 above.)

*If $f \to 0$ and $g \to 0$ as $\delta \to 0$, the limit of $f/g$ is equal to that of $f'/g'$, when the latter is defined.
13 Complex Roots

Let
$\mu_1 = re^{i\theta}$, $\mu_2 = re^{-i\theta}$
... a complex conjugate pair in polar coordinates, where $r > 0$, $i = \sqrt{-1}$ and $0 \le \theta < 2\pi$. Their modulus (absolute value) is $r$.

Using the facts
$re^{i\theta} = r\cos\theta + ir\sin\theta$ (Euler's formula)
$\cos(-x) = \cos x$
$\sin(-x) = -\sin x$
we obtain
$\frac{\mu_1^{j+1} - \mu_2^{j+1}}{\mu_1 - \mu_2} = r^j\,\frac{e^{i\theta(j+1)} - e^{-i\theta(j+1)}}{e^{i\theta} - e^{-i\theta}} = r^j\,\frac{\sin((j+1)\theta)}{\sin\theta}$, $j = 1, 2, 3, \dots$
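The trigonometric identity at the end can be verified numerically (a sketch; the values of $r$ and $\theta$ are illustrative):

```python
# For a conjugate pair mu = r*exp(+/- i*theta), check that
# (mu1^(j+1) - mu2^(j+1)) / (mu1 - mu2) = r^j * sin((j+1)*theta) / sin(theta).
import cmath
import math

r, theta = 0.9, 0.7
mu1 = r * cmath.exp(1j * theta)
mu2 = r * cmath.exp(-1j * theta)
for j in range(10):
    lhs = (mu1**(j + 1) - mu2**(j + 1)) / (mu1 - mu2)
    rhs = r**j * math.sin((j + 1) * theta) / math.sin(theta)
    assert abs(lhs - rhs) < 1e-12      # lhs is real up to rounding
```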
14 Stability depends on the $\mu_k$ lying inside the unit circle, having $r < 1$.

[Figure: the unit circle in the complex plane, with the conjugate pair $\mu_1 = re^{i\theta}$ and $\mu_2 = re^{-i\theta}$ plotted at angles $\pm\theta$ and modulus $r < 1$.]
15 Finally..., replace $z$ by the operator $L$. Since $L^j \beta_0 = \beta_0$ for any $j$, we have the result

$x_t = \beta_0 \sum_{j=0}^{\infty} \frac{\mu_1^{j+1} - \mu_2^{j+1}}{\mu_1 - \mu_2}$  (real roots, $|\mu_1|, |\mu_2| < 1$)

$x_t = \beta_0 \sum_{j=0}^{\infty} (j+1)\mu_1^j$  (equal roots, $|\mu_1| < 1$)

$x_t = \beta_0 \sum_{j=0}^{\infty} r^j\,\frac{\sin((j+1)\theta)}{\sin\theta}$  (complex roots, $r < 1$)
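The distinct-real-roots coefficients can be cross-checked against the recursion they satisfy (a sketch; expanding $1/(1 - \beta_1 L - \beta_2 L^2)$ gives coefficients $\psi_j = \beta_1\psi_{j-1} + \beta_2\psi_{j-2}$, the parameter values are illustrative):

```python
# The coefficients of 1/(1 - b1*L - b2*L^2) satisfy
# psi_j = b1*psi_{j-1} + b2*psi_{j-2}; for distinct real roots they
# equal (mu1^(j+1) - mu2^(j+1))/(mu1 - mu2).
import math

b1, b2 = 1.1, -0.3
d = math.sqrt(b1**2 + 4 * b2)
mu1, mu2 = (b1 + d) / 2, (b1 - d) / 2     # 0.6 and 0.5, inside the unit circle

psi = [1.0, b1]
for j in range(2, 20):
    psi.append(b1 * psi[-1] + b2 * psi[-2])

for j in range(20):
    closed = (mu1**(j + 1) - mu2**(j + 1)) / (mu1 - mu2)
    assert abs(psi[j] - closed) < 1e-9
```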
16 Third Order Case

Exercise: verify that if $\mu_1 \ne \mu_2$, $\mu_2 \ne \mu_3$ and $\mu_1 \ne \mu_3$, then
$\frac{1}{(1 - \mu_1 z)(1 - \mu_2 z)(1 - \mu_3 z)} = \frac{\mu_1^2}{(\mu_1 - \mu_2)(\mu_1 - \mu_3)(1 - \mu_1 z)} + \frac{\mu_2^2}{(\mu_2 - \mu_1)(\mu_2 - \mu_3)(1 - \mu_2 z)} + \cdots$
etc., etc.

The General Case: Factorising the polynomial as
$1 - \beta_1 z - \cdots - \beta_p z^p = (1 - \mu_1 z)(1 - \mu_2 z)\cdots(1 - \mu_p z)$,
the rule is that the difference equation
$x_t = \beta_0 + \beta_1 x_{t-1} + \cdots + \beta_p x_{t-p}$
has a stable solution if and only if $|\mu_k| < 1$ for $k = 1, \dots, p$.

Since the roots of this polynomial are $1/\mu_1, \dots, 1/\mu_p$, we express the stability condition as: The roots of the lag polynomial lie strictly outside the unit circle.

Since the lag coefficients are real, the roots are either real, or in conjugate complex pairs.
17 Stochastic Linear Difference Equations

First Order Case:
$x_t = \beta_0 + \beta_1 x_{t-1} + \varepsilon_t$, $t = 1, 2, 3, \dots$

The iterative solution is
$x_t = \beta_0(1 + \beta_1 + \cdots + \beta_1^{t-1}) + \varepsilon_t + \beta_1\varepsilon_{t-1} + \cdots + \beta_1^{t-1}\varepsilon_1 + \beta_1^t x_0$.

Assume that $\varepsilon_t \sim \text{i.i.d.}(0, \sigma^2)$. The solution of this equation is interpreted in terms of the properties of the random variable $x_t$ when $t$ is large.

Case: $|\beta_1| < 1$. As $t \to \infty$, dependence on the starting value becomes negligible.
$Ex_t = \beta_0 \sum_{j=0}^{\infty}\beta_1^j + \sum_{j=0}^{\infty}\beta_1^j E\varepsilon_{t-j} \to \frac{\beta_0}{1 - \beta_1}$
$\text{Var}(x_t) = E\Big(\sum_{j=0}^{\infty}\beta_1^j\varepsilon_{t-j}\Big)^2 = \sum_{j=0}^{\infty}\beta_1^{2j}E\varepsilon_{t-j}^2 + \sum_{j=0}^{\infty}\sum_{k=0,\,k\ne j}^{\infty}\beta_1^j\beta_1^k E(\varepsilon_{t-j}\varepsilon_{t-k})$
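With the cross-product terms equal to zero, the variance sum has the limit $\sigma^2/(1 - \beta_1^2)$, which can be confirmed numerically (a sketch; the parameter values are illustrative):

```python
# With iid shocks of variance s2, Var(x_t) = s2 * sum_{j=0}^{t-1} b1^(2j),
# which converges to s2 / (1 - b1^2) as t grows.
b1, s2 = 0.7, 1.0
var_t = s2 * sum(b1**(2 * j) for j in range(500))
assert abs(var_t - s2 / (1 - b1**2)) < 1e-9
```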
18 In this calculation, note that
$\sum_{j=0}^{\infty}|\beta_1|^j = \frac{1}{1 - |\beta_1|} < \infty$.

Therefore, if the $E(\varepsilon_{t-j}\varepsilon_{t-k})$ were to take any finite fixed values such that, for all $t$, $j$ and $k$,
$|E(\varepsilon_{t-j}\varepsilon_{t-k})| \le B$ (*)
we can say that
$\Big|\sum_{j=0}^{\infty}\sum_{k=0,\,k\ne j}^{\infty}\beta_1^j\beta_1^k E(\varepsilon_{t-j}\varepsilon_{t-k})\Big| \le B\sum_{j=0}^{\infty}\sum_{k=0}^{\infty}|\beta_1|^j|\beta_1|^k < \infty$.

Hence, since $E(\varepsilon_{t-j}\varepsilon_{t-k}) = 0$ whenever $j \ne k$, the double sum of cross-product terms vanishes unambiguously.
19 Other Properties

Since the forcing process is i.i.d., and $x_t$ depends (in effect) on only a finite number of these terms, this is a stationary process when $t$ is large enough. Writing
$x_{t+m} = \frac{\beta_0}{1 - \beta_1} + \sum_{j=0}^{\infty}\beta_1^j\varepsilon_{t+m-j}$
and noting that $E(x_t\varepsilon_{t+m-j}) = 0$ for $j < m$, we have
$\text{Cov}(x_t, x_{t+m}) = \beta_1^m\,\text{Var}(x_t) = \frac{\sigma^2\beta_1^m}{1 - \beta_1^2}$.

Therefore, $x_t$ is a short memory process, since $\sum_{m=0}^{\infty}|\text{Cov}(x_t, x_{t+m})| < \infty$.

Note: we may define a stationary process with starting point $t = 0$, by letting $x_0$ be a drawing from the stationary distribution of $x_t$.
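The geometric decay of the autocovariances can be checked from the MA($\infty$) form (a sketch; the parameter values are illustrative):

```python
# Autocovariances of the stationary AR(1) from its MA(infinity) form:
# Cov(x_t, x_{t+m}) = s2 * sum_j b1^j * b1^(j+m) = b1^m * Var(x_t).
b1, s2 = 0.7, 1.0
gamma0 = s2 / (1 - b1**2)
for m in range(5):
    cov = s2 * sum(b1**j * b1**(j + m) for j in range(500))
    assert abs(cov - b1**m * gamma0) < 1e-9
```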
20 This stochastic process is called a first-order autoregression (AR(1)).

[Figure: realization of 100 observations from $x_t = 0.7x_{t-1} + \varepsilon_t$, $\varepsilon_t \sim N(0,1)$, $x_0 = 0$, together with the corresponding i.i.d. process $\varepsilon_t$.]
21 Second Order Case.
$x_t = \beta_1 x_{t-1} + \beta_2 x_{t-2} + \varepsilon_t$, $t = 1, 2, 3, \dots$

From previous results, we have the stationary second order solution (MA($\infty$) representation)
$x_t = \frac{\varepsilon_t}{1 - \beta_1 L - \beta_2 L^2} =$

$\sum_{j=0}^{\infty} \frac{\mu_1^{j+1} - \mu_2^{j+1}}{\mu_1 - \mu_2}\,\varepsilon_{t-j}$  (real roots, $|\mu_1|, |\mu_2| < 1$)

$\sum_{j=0}^{\infty} (j+1)\mu_1^j\,\varepsilon_{t-j}$  (equal roots, $|\mu_1| < 1$)

$\sum_{j=0}^{\infty} r^j\,\frac{\sin((j+1)\theta)}{\sin\theta}\,\varepsilon_{t-j}$  (complex roots, $r < 1$)

Complex roots imply sinusoidal lag distributions in the MA($\infty$) representation!
22 Generalization

The AR($p$) process is
$x_t = \beta_1 x_{t-1} + \cdots + \beta_p x_{t-p} + \varepsilon_t$.

Using the lag operator, this can be written in the form
$x_t - \beta_1 Lx_t - \beta_2 L^2 x_t - \cdots - \beta_p L^p x_t = \varepsilon_t$
or
$\beta(L)x_t = \varepsilon_t$
where
$\beta(L) = 1 - \beta_1 L - \beta_2 L^2 - \cdots - \beta_p L^p$.

The stationary solution of the model, when it exists, can be written in the form
$x_t = \frac{1}{\beta(L)}\varepsilon_t$
where $1/\beta(L)$ is a lag polynomial of infinite order, with summable coefficients.
23 Autocovariances of AR processes

These are conveniently found from the Yule-Walker equations. Multiply the equation by $x_{t-j}$ for $j = 0, 1, 2, \dots, p$ and take expected values to yield a system of $p+1$ equations in the unknowns $\gamma_0, \dots, \gamma_p$.

Consider the AR(2):
$\gamma_0 = \beta_1\gamma_1 + \beta_2\gamma_2 + \sigma^2$
$\gamma_1 = \beta_1\gamma_0 + \beta_2\gamma_1$
$\gamma_2 = \beta_1\gamma_1 + \beta_2\gamma_0$

These equations may be solved as
$\gamma_1 = \frac{\beta_1\gamma_0}{1 - \beta_2}$, $\gamma_2 = \frac{\beta_1^2\gamma_0}{1 - \beta_2} + \beta_2\gamma_0$, $\gamma_0 = \frac{(1 - \beta_2)\sigma^2}{(1 + \beta_2)\big((1 - \beta_2)^2 - \beta_1^2\big)}$.
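The substitution argument can be checked numerically against the closed form for $\gamma_0$ (a sketch; the parameter values are illustrative and satisfy the stability conditions):

```python
# Solve the AR(2) Yule-Walker system by substitution and compare with the
# closed-form gamma_0 = (1 - b2)*s2 / ((1 + b2)*((1 - b2)^2 - b1^2)).
b1, b2, s2 = 0.5, 0.3, 1.0
g1_over_g0 = b1 / (1 - b2)                    # from gamma_1 = b1*g0 + b2*g1
g2_over_g0 = b1 * g1_over_g0 + b2             # from gamma_2 = b1*g1 + b2*g0
g0 = s2 / (1 - b1 * g1_over_g0 - b2 * g2_over_g0)   # from the gamma_0 equation
closed = s2 * (1 - b2) / ((1 + b2) * ((1 - b2)**2 - b1**2))
assert abs(g0 - closed) < 1e-12
```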
24 For the higher order cases, solve the difference equation
$\gamma_j = \beta_1\gamma_{j-1} + \beta_2\gamma_{j-2}$
for $j = 3, 4, 5, \dots$

Obviously, the conditions for $\gamma_j \to 0$ are identical to the stability conditions for the process.

By rearranging the Yule-Walker equations, one can also solve for $\sigma^2$, $\beta_1$, $\beta_2$ from $\gamma_0$, $\gamma_1$, $\gamma_2$.
25 Moving Average Processes

Consider a process of the form
$x_t = \theta(L)\varepsilon_t$
where
$\theta(L) = 1 + \theta_1 L + \cdots + \theta_q L^q$
and $\varepsilon_t \sim \text{i.i.d.}(0, \sigma^2)$. This is the MA($q$) process.

The autocovariances are
$\gamma_0 = \sigma^2(1 + \theta_1^2 + \cdots + \theta_q^2)$
$\gamma_j = \sigma^2(\theta_j + \theta_1\theta_{j+1} + \cdots + \theta_{q-j}\theta_q)$, $j = 1, \dots, q$
with $\gamma_j = 0$ for $j > q$. Thus, the process is stationary for all choices of $\theta(L)$.

If $\theta(L)$ is invertible, the process can be expressed as a difference equation of infinite order,
$\theta(L)^{-1}x_t = \varepsilon_t$.

All MA($q$) processes can be written as AR($\infty$) except those having a root of unity (over-differenced processes).
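The autocovariance formulae can be implemented directly (a sketch; the helper name and MA(2) example are illustrative):

```python
# MA(q) autocovariances from the coefficients (theta_0 = 1):
# gamma_j = s2 * sum_i theta_i * theta_{i+j}, and gamma_j = 0 for j > q.
def ma_autocov(theta, s2, j):
    c = [1.0] + list(theta)
    return s2 * sum(c[i] * c[i + j] for i in range(max(0, len(c) - j)))

theta, s2 = [0.4, -0.2], 1.0                  # an MA(2) example
assert abs(ma_autocov(theta, s2, 0) - s2 * (1 + 0.4**2 + 0.2**2)) < 1e-12
assert abs(ma_autocov(theta, s2, 1) - s2 * (0.4 + 0.4 * -0.2)) < 1e-12
assert abs(ma_autocov(theta, s2, 2) - s2 * (-0.2)) < 1e-12
assert ma_autocov(theta, s2, 3) == 0.0        # zero beyond q lags
```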
26 Invertibility of MA Processes

Consider the MA(1) case first.
$v_t = \varepsilon_t + \theta_1\varepsilon_{t-1}$
is a process with the following properties:
$Ev_t = 0$, $\text{Var}(v_t) = \sigma^2(1 + \theta_1^2)$, $\text{Cov}(v_t, v_{t-1}) = \sigma^2\theta_1$, $\text{Cov}(v_t, v_{t-j}) = 0$ for $j > 1$.

But suppose $\varepsilon_t^* \sim \text{i.i.d.}(0, \sigma^{*2})$ where $\sigma^{*2} = \theta_1^2\sigma^2$. Then we can write equivalently
$v_t^* = \varepsilon_t^* + \theta_1^{-1}\varepsilon_{t-1}^*$.

The processes $v_t$ and $v_t^*$ have the same autocovariances, and on this basis are observationally equivalent. They are not distributed identically, and if $\varepsilon_t$ is i.i.d. then $\varepsilon_t^*$ is not, in general. However, we cannot distinguish them on the basis of first and second moments.

With this caveat, there is no loss of generality in choosing the representation with $|\theta_1| \le 1$.
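The observational-equivalence claim can be verified by computing both sets of autocovariances (a sketch; $\theta_1 = 2$ is an illustrative non-invertible choice):

```python
# The two MA(1) parameterizations (theta, s2) and (1/theta, theta^2 * s2)
# have identical autocovariances.
theta, s2 = 2.0, 1.0                      # non-invertible representation
theta_star, s2_star = 1 / theta, theta**2 * s2

def ma1_acov(th, var):
    """Return (gamma_0, gamma_1) of an MA(1) with coefficient th."""
    return var * (1 + th**2), var * th

g = ma1_acov(theta, s2)
g_star = ma1_acov(theta_star, s2_star)
assert all(abs(a - b) < 1e-12 for a, b in zip(g, g_star))
```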
27 The MA($q$) case: $v_t = \theta(L)\varepsilon_t$ and $v_t^* = \theta^*(L)\varepsilon_t^*$ are observationally equivalent processes where
$\theta(L) = (1 - \eta_1 L)(1 - \eta_2 L)\cdots(1 - \eta_q L)$
$\theta^*(L) = (1 - \eta_1^{-1}L)(1 - \eta_2 L)\cdots(1 - \eta_q L)$
and $E\varepsilon_t^{*2} = \eta_1^2 E\varepsilon_t^2$. In total, there are $2^q$ equivalent representations! Conventionally, we impose invertibility to identify the model.

Further caveat: We can also write, for example with the same $v_t$,
$\varepsilon_t = \frac{v_t}{1 + \theta_1 L}$, $\varepsilon_t^* = \frac{v_t}{1 + \theta_1^{-1}L}$
where by construction both $\varepsilon_t$ and $\varepsilon_t^*$ are uncorrelated processes with $E\varepsilon_t^2 \ne E\varepsilon_t^{*2}$, but at most one of these processes can be i.i.d., in general. The exception is the Gaussian case, where uncorrelatedness is equivalent to independence.
28 ARMA Processes

Combining AR and MA components yields the ARMA($p,q$) process
$\beta(L)x_t = \theta(L)\varepsilon_t$.
This represents a flexible class of linear models for stationary processes.

Subject to stability/invertibility, the ARMA can be viewed as a difference equation of infinite order (AR($\infty$)),
$\frac{\beta(L)}{\theta(L)}x_t = \varepsilon_t$
and as a moving average of infinite order (MA($\infty$)):
$x_t = \frac{\theta(L)}{\beta(L)}\varepsilon_t$.
29 Solving Rational Lag Expansions

Method of undetermined coefficients. Write
$\psi(z) = \frac{\theta(z)}{\beta(z)}$
and solve for the coefficients of $\psi(z)$. For simplicity set $q = p$, so we have
$1 + \theta_1 z + \cdots + \theta_p z^p = (1 - \beta_1 z - \cdots - \beta_p z^p)(1 + \psi_1 z + \psi_2 z^2 + \psi_3 z^3 + \cdots)$.

Equating coefficients of $z^j$ for $j = 1, 2, \dots$ and rearranging, we obtain
$\psi_1 = \theta_1 + \beta_1$
$\psi_2 = \theta_2 + \beta_1\psi_1 + \beta_2$
$\vdots$
$\psi_p = \theta_p + \beta_1\psi_{p-1} + \cdots + \beta_p$
and then, for $j > p$,
$\psi_j = \beta_1\psi_{j-1} + \cdots + \beta_p\psi_{j-p}$.

These equations can be solved as a recursion for as many steps as required. Stability of the polynomial $\beta(z)$ ensures that $\psi_j \to 0$ as $j \to \infty$.
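The recursion is easy to code (a sketch; the function name is illustrative, and the ARMA(1,1) check uses the standard result $\psi_j = (\beta_1 + \theta_1)\beta_1^{j-1}$ for $j \ge 1$):

```python
# Undetermined-coefficients recursion: generate the MA(infinity) weights
# psi_j of theta(L)/beta(L) for an ARMA(p, q).
def arma_to_ma(beta, theta, n):
    """beta = [b1..bp], theta = [t1..tq]; returns [psi_0, ..., psi_{n-1}]."""
    psi = [1.0]
    for j in range(1, n):
        t_j = theta[j - 1] if j <= len(theta) else 0.0
        psi.append(t_j + sum(beta[i] * psi[j - 1 - i]
                             for i in range(min(j, len(beta)))))
    return psi

b1, t1 = 0.6, 0.3
psi = arma_to_ma([b1], [t1], 10)
for j in range(1, 10):
    assert abs(psi[j] - (b1 + t1) * b1**(j - 1)) < 1e-12
```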
30 Vector Autoregressions

Let $x_t$ be a sequence of $m$-vectors. The VAR($p$) model takes the form
$x_t = a_0 + A_1 x_{t-1} + \cdots + A_p x_{t-p} + u_t$
where $a_0$ is an $m$-vector of intercepts, $A_k$, $k = 1, \dots, p$ are square $m \times m$ matrices, and $u_t$ ($m \times 1$) is a vector of disturbances with
$Eu_t = 0$ ($m \times 1$), $Eu_t u_t' = \Sigma$ ($m \times m$).

Also write
$A(L)x_t = a_0 + u_t$
where
$A(L) = I - A_1 L - \cdots - A_p L^p$.
31 Stability of the VAR

Start with the case $p = 1$. Solve the system by repeated substitution as
$x_t = a_0 + Ax_{t-1} + u_t$
$= a_0 + A(a_0 + Ax_{t-2} + u_{t-1}) + u_t$
$\vdots$
$= (I + A + A^2 + \cdots)a_0 + u_t + Au_{t-1} + A^2 u_{t-2} + \cdots$

Thus, we need to know the properties of $A^n$ as $n$ gets large.
32 Eigenvalues

The eigenvalues of $A$ are the solutions $\lambda$ (assumed distinct, possibly complex-valued) to the equation
$|A - \lambda I| = 0$.

A square matrix with distinct eigenvalues $\lambda_1, \lambda_2, \dots, \lambda_m$ has the diagonalization
$A = CMC^{-1}$
where $M = \text{diag}(\lambda_1, \dots, \lambda_m)$ and $C$ is the matrix of eigenvectors. Note that
$|A - \lambda I| = |CMC^{-1} - \lambda CC^{-1}| = |C(M - \lambda I)C^{-1}| = |C||C^{-1}||M - \lambda I| = (\lambda_1 - \lambda)(\lambda_2 - \lambda)\cdots(\lambda_m - \lambda)$.

Stability condition:
$A^n = CMC^{-1}\cdot CMC^{-1}\cdots CMC^{-1} = CM^n C^{-1}$.
The conditions for $A^n \to 0$ as $n \to \infty$ are that $|\lambda_i| < 1$, for each $i$.
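For a $2 \times 2$ example this can be verified by hand-rolled matrix arithmetic (a sketch; the matrix entries are illustrative, and the eigenvalues solve $\lambda^2 - \text{tr}(A)\lambda + |A| = 0$):

```python
# Check that when both eigenvalues of a 2x2 matrix lie inside the unit
# circle, the powers A^n die out.
import math

A = [[0.5, 0.2],
     [0.1, 0.4]]
tr = A[0][0] + A[1][1]
det = A[0][0] * A[1][1] - A[0][1] * A[1][0]
disc = math.sqrt(tr**2 - 4 * det)
lam1, lam2 = (tr + disc) / 2, (tr - disc) / 2
assert abs(lam1) < 1 and abs(lam2) < 1

def matmul(X, Y):
    return [[sum(X[i][k] * Y[k][j] for k in range(2)) for j in range(2)]
            for i in range(2)]

An = A
for _ in range(99):                        # compute A^100
    An = matmul(An, A)
assert all(abs(An[i][j]) < 1e-9 for i in range(2) for j in range(2))
```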
33 The General Case

Write the VAR($p$) model in companion form:

$\begin{pmatrix} x_t \\ x_{t-1} \\ x_{t-2} \\ \vdots \\ x_{t-p} \end{pmatrix} = \begin{pmatrix} A_1 & A_2 & \cdots & A_p & 0 \\ I & 0 & \cdots & 0 & 0 \\ 0 & I & \cdots & 0 & 0 \\ & & \ddots & & \\ 0 & 0 & \cdots & I & 0 \end{pmatrix} \begin{pmatrix} x_{t-1} \\ x_{t-2} \\ x_{t-3} \\ \vdots \\ x_{t-p-1} \end{pmatrix} + \begin{pmatrix} u_t \\ 0 \\ 0 \\ \vdots \\ 0 \end{pmatrix}$

or
$x_t^* = A^* x_{t-1}^* + u_t^*$ ($m(p+1) \times 1$).

Repeat the analysis of $p = 1$ on this model. $A^*$ ($m(p+1) \times m(p+1)$) has $m(p+1)$ eigenvalues, of which $m$ are 0. It can be shown that the remaining $mp$ eigenvalues match the inverted roots of $|A(z)| = 0$.
34 The Generalized Stability (Invertibility) Condition:

Generalizing the AR($p$) analysis, we can show that all the roots of the equation
$|A(z)| = |I - A_1 z - \cdots - A_p z^p| = 0$
(a polynomial of order $mp$) must lie outside the unit circle.

Note the case $p = 1$. Observe that
$|I - Az| = (-z)^m|A - \lambda I|$ where $\lambda = 1/z$,
so the eigenvalue and polynomial root conditions are equivalent.

Note the case $m = 1$. The companion form provides an alternative way to analyse the stability of the AR($p$).
35 The Final Form of a VAR

To solve $A(L)x_t = u_t$, note that
$A(L)^{-1} = \frac{1}{|A(L)|}\,\text{adj}\,A(L)$
where $|A(L)|$ is a lag polynomial of order $mp$ and the elements of $\text{adj}\,A(L)$ are lag polynomials of maximum order $p(m-1)$. The final form equations are
$|A(L)|\,x_t = \text{adj}\,A(L)\,u_t$.

Key facts:
The vector on the right-hand side is a sum of $m$ moving average terms in the elements of $u_t$.
A sum of MA($q$) terms has zero autocovariances beyond $q$ lags, hence itself has an MA representation.
Hence, the final form of a VAR is a vector of univariate ARMA processes!
Nominally the AR roots are the same for each equation. In practice there are frequently cancellations of common factors, element by element.
More informationTime Series Analysis. James D. Hamilton PRINCETON UNIVERSITY PRESS PRINCETON, NEW JERSEY
Time Series Analysis James D. Hamilton PRINCETON UNIVERSITY PRESS PRINCETON, NEW JERSEY & Contents PREFACE xiii 1 1.1. 1.2. Difference Equations First-Order Difference Equations 1 /?th-order Difference
More informationGaussian processes. Basic Properties VAG002-
Gaussian processes The class of Gaussian processes is one of the most widely used families of stochastic processes for modeling dependent data observed over time, or space, or time and space. The popularity
More informationWe will only present the general ideas on how to obtain. follow closely the AR(1) and AR(2) cases presented before.
ACF and PACF of an AR(p) We will only present the general ideas on how to obtain the ACF and PACF of an AR(p) model since the details follow closely the AR(1) and AR(2) cases presented before. Recall that
More informationNew Introduction to Multiple Time Series Analysis
Helmut Lütkepohl New Introduction to Multiple Time Series Analysis With 49 Figures and 36 Tables Springer Contents 1 Introduction 1 1.1 Objectives of Analyzing Multiple Time Series 1 1.2 Some Basics 2
More informationSome suggested repetition for the course MAA508
Some suggested repetition for the course MAA58 Linus Carlsson, Karl Lundengård, Johan Richter July, 14 Contents Introduction 1 1 Basic algebra and trigonometry Univariate calculus 5 3 Linear algebra 8
More informationPart III Example Sheet 1 - Solutions YC/Lent 2015 Comments and corrections should be ed to
TIME SERIES Part III Example Sheet 1 - Solutions YC/Lent 2015 Comments and corrections should be emailed to Y.Chen@statslab.cam.ac.uk. 1. Let {X t } be a weakly stationary process with mean zero and let
More informationNext tool is Partial ACF; mathematical tools first. The Multivariate Normal Distribution. e z2 /2. f Z (z) = 1 2π. e z2 i /2
Next tool is Partial ACF; mathematical tools first. The Multivariate Normal Distribution Defn: Z R 1 N(0,1) iff f Z (z) = 1 2π e z2 /2 Defn: Z R p MV N p (0, I) if and only if Z = (Z 1,..., Z p ) (a column
More information7. Forecasting with ARIMA models
7. Forecasting with ARIMA models 309 Outline: Introduction The prediction equation of an ARIMA model Interpreting the predictions Variance of the predictions Forecast updating Measuring predictability
More information1 Matrices and vector spaces
Matrices and vector spaces. Which of the following statements about linear vector spaces are true? Where a statement is false, give a counter-example to demonstrate this. (a) Non-singular N N matrices
More informationTime Series Analysis. James D. Hamilton PRINCETON UNIVERSITY PRESS PRINCETON, NEW JERSEY
Time Series Analysis James D. Hamilton PRINCETON UNIVERSITY PRESS PRINCETON, NEW JERSEY PREFACE xiii 1 Difference Equations 1.1. First-Order Difference Equations 1 1.2. pth-order Difference Equations 7
More informationj=1 u 1jv 1j. 1/ 2 Lemma 1. An orthogonal set of vectors must be linearly independent.
Lecture Notes: Orthogonal and Symmetric Matrices Yufei Tao Department of Computer Science and Engineering Chinese University of Hong Kong taoyf@cse.cuhk.edu.hk Orthogonal Matrix Definition. Let u = [u
More informationFor a stochastic process {Y t : t = 0, ±1, ±2, ±3, }, the mean function is defined by (2.2.1) ± 2..., γ t,
CHAPTER 2 FUNDAMENTAL CONCEPTS This chapter describes the fundamental concepts in the theory of time series models. In particular, we introduce the concepts of stochastic processes, mean and covariance
More information6.3 Forecasting ARMA processes
6.3. FORECASTING ARMA PROCESSES 123 6.3 Forecasting ARMA processes The purpose of forecasting is to predict future values of a TS based on the data collected to the present. In this section we will discuss
More informationTime Series Outlier Detection
Time Series Outlier Detection Tingyi Zhu July 28, 2016 Tingyi Zhu Time Series Outlier Detection July 28, 2016 1 / 42 Outline Time Series Basics Outliers Detection in Single Time Series Outlier Series Detection
More informationMTH4101 CALCULUS II REVISION NOTES. 1. COMPLEX NUMBERS (Thomas Appendix 7 + lecture notes) ax 2 + bx + c = 0. x = b ± b 2 4ac 2a. i = 1.
MTH4101 CALCULUS II REVISION NOTES 1. COMPLEX NUMBERS (Thomas Appendix 7 + lecture notes) 1.1 Introduction Types of numbers (natural, integers, rationals, reals) The need to solve quadratic equations:
More informationSolutions to Odd-Numbered End-of-Chapter Exercises: Chapter 14
Introduction to Econometrics (3 rd Updated Edition) by James H. Stock and Mark W. Watson Solutions to Odd-Numbered End-of-Chapter Exercises: Chapter 14 (This version July 0, 014) 015 Pearson Education,
More informationStructural Macroeconometrics. Chapter 4. Summarizing Time Series Behavior
Structural Macroeconometrics Chapter 4. Summarizing Time Series Behavior David N. DeJong Chetan Dave The sign of a truly educated man is to be deeply moved by statistics. George Bernard Shaw This chapter
More informationChapter 3. ARIMA Models. 3.1 Autoregressive Moving Average Models
Chapter 3 ARIMA Models Classical regression is often insu cient for explaining all of the interesting dynamics of a time series. For example, the ACF of the residuals of the simple linear regression fit
More informationNANYANG TECHNOLOGICAL UNIVERSITY SEMESTER II EXAMINATION MAS451/MTH451 Time Series Analysis TIME ALLOWED: 2 HOURS
NANYANG TECHNOLOGICAL UNIVERSITY SEMESTER II EXAMINATION 2012-2013 MAS451/MTH451 Time Series Analysis May 2013 TIME ALLOWED: 2 HOURS INSTRUCTIONS TO CANDIDATES 1. This examination paper contains FOUR (4)
More informationESSE Mid-Term Test 2017 Tuesday 17 October :30-09:45
ESSE 4020 3.0 - Mid-Term Test 207 Tuesday 7 October 207. 08:30-09:45 Symbols have their usual meanings. All questions are worth 0 marks, although some are more difficult than others. Answer as many questions
More informationCointegrated VAR s. Eduardo Rossi University of Pavia. November Rossi Cointegrated VAR s Financial Econometrics / 56
Cointegrated VAR s Eduardo Rossi University of Pavia November 2013 Rossi Cointegrated VAR s Financial Econometrics - 2013 1 / 56 VAR y t = (y 1t,..., y nt ) is (n 1) vector. y t VAR(p): Φ(L)y t = ɛ t The
More informationDynamic models. Dependent data The AR(p) model The MA(q) model Hidden Markov models. 6 Dynamic models
6 Dependent data The AR(p) model The MA(q) model Hidden Markov models Dependent data Dependent data Huge portion of real-life data involving dependent datapoints Example (Capture-recapture) capture histories
More informationGaussian, Markov and stationary processes
Gaussian, Markov and stationary processes Gonzalo Mateos Dept. of ECE and Goergen Institute for Data Science University of Rochester gmateosb@ece.rochester.edu http://www.ece.rochester.edu/~gmateosb/ November
More informationExponential decay rate of partial autocorrelation coefficients of ARMA and short-memory processes
Exponential decay rate of partial autocorrelation coefficients of ARMA and short-memory processes arxiv:1511.07091v2 [math.st] 4 Jan 2016 Akimichi Takemura January, 2016 Abstract We present a short proof
More informationAutoregressive Moving Average (ARMA) Models and their Practical Applications
Autoregressive Moving Average (ARMA) Models and their Practical Applications Massimo Guidolin February 2018 1 Essential Concepts in Time Series Analysis 1.1 Time Series and Their Properties Time series:
More informationFINANCIAL ECONOMETRICS AND EMPIRICAL FINANCE -MODULE2 Midterm Exam Solutions - March 2015
FINANCIAL ECONOMETRICS AND EMPIRICAL FINANCE -MODULE2 Midterm Exam Solutions - March 205 Time Allowed: 60 minutes Family Name (Surname) First Name Student Number (Matr.) Please answer all questions by
More informationNext topics: Solving systems of linear equations
Next topics: Solving systems of linear equations 1 Gaussian elimination (today) 2 Gaussian elimination with partial pivoting (Week 9) 3 The method of LU-decomposition (Week 10) 4 Iterative techniques:
More information