ARMA Handout
Jialin Yu

1 Linear Difference Equations

First-order systems. Let $\{\varepsilon_t\}_{t=1}^{\infty}$ denote an input sequence and let $\{y_t\}_{t=1}^{\infty}$ denote an output sequence generated by

$$y_t = \phi y_{t-1} + \varepsilon_t, \qquad t = 1, 2, \ldots$$

with $y_0$ given. This equation is a first-order difference equation. It is easily solved by recursive substitution:

$$y_t = \phi^t y_0 + \sum_{i=0}^{t-1} \phi^i \varepsilon_{t-i}.$$

$p$-th order systems. Consider a $p$-th order system

$$y_t = \phi_1 y_{t-1} + \phi_2 y_{t-2} + \cdots + \phi_p y_{t-p} + \varepsilon_t.$$

To solve this system it is useful to write it in first-order vector form (called companion form) as

$$\begin{pmatrix} y_t \\ y_{t-1} \\ y_{t-2} \\ \vdots \\ y_{t-p+1} \end{pmatrix}
= \begin{pmatrix} \phi_1 & \phi_2 & \phi_3 & \cdots & \phi_p \\ 1 & 0 & 0 & \cdots & 0 \\ 0 & 1 & 0 & \cdots & 0 \\ \vdots & & \ddots & & \vdots \\ 0 & 0 & \cdots & 1 & 0 \end{pmatrix}
\begin{pmatrix} y_{t-1} \\ y_{t-2} \\ y_{t-3} \\ \vdots \\ y_{t-p} \end{pmatrix}
+ \begin{pmatrix} \varepsilon_t \\ 0 \\ 0 \\ \vdots \\ 0 \end{pmatrix}$$

or

$$Z_t = \Phi Z_{t-1} + e_t$$

with $Z_t = (y_t, y_{t-1}, y_{t-2}, \ldots, y_{t-p+1})'$, etc.
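To make the recursion concrete, here is a minimal simulation sketch in Python/NumPy (the helper name `companion` and all parameter values are illustrative, not from the handout): it builds the companion matrix for a second-order system and iterates $Z_t = \Phi Z_{t-1} + e_t$.

```python
import numpy as np

def companion(phis):
    """Companion matrix for y_t = phi_1 y_{t-1} + ... + phi_p y_{t-p} + eps_t."""
    p = len(phis)
    Phi = np.zeros((p, p))
    Phi[0, :] = phis                # first row carries the AR coefficients
    Phi[1:, :-1] = np.eye(p - 1)    # subdiagonal of ones shifts the lags down
    return Phi

rng = np.random.default_rng(0)
Phi = companion([0.5, 0.3])         # p = 2, illustrative coefficients
Z = np.zeros(2)                     # Z_0 = (y_0, y_{-1})' = (0, 0)'
y = []
for _ in range(200):
    e = np.array([rng.normal(), 0.0])  # the shock enters the first row only
    Z = Phi @ Z + e
    y.append(Z[0])                     # y_t is the first element of Z_t
```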

By recursive substitution,

$$Z_t = \Phi^t Z_0 + \sum_{i=0}^{t-1} \Phi^i e_{t-i},$$

so that a solution is obtained given $Z_0 = (y_0, y_{-1}, \ldots, y_{-p+1})'$.

Stability: The linear difference equation $Z_t = \Phi Z_{t-1} + e_t$ is stable if, when $e_t = e$ for all $t$ and $Z_0$ is an arbitrary constant, then $\lim_{t \to \infty} Z_t = Z$, where $Z$ does not depend on $Z_0$.

In the first-order case, stability is achieved if $|\phi| < 1$. In the higher-order case, what is required is that $\lim_{t \to \infty} \Phi^t = 0$ and $\sum_{i=0}^{t-1} \Phi^i$ converges. These two conditions are equivalent to the condition that all eigenvalues of $\Phi$ are less than one in modulus (absolute value). To see this, assume for the moment that the eigenvalues of $\Phi$ are distinct. In this case we can decompose $\Phi = P \Lambda P^{-1}$, where the columns of $P$ are eigenvectors of $\Phi$ and $\Lambda$ is a diagonal matrix with the eigenvalues of $\Phi$ on the diagonal (see Hamilton). Thus

$$\Phi^2 = P \Lambda P^{-1} P \Lambda P^{-1} = P \Lambda^2 P^{-1} \quad \text{and} \quad \Phi^t = P \Lambda^t P^{-1},$$

so $\Phi^t \to 0$ exactly when every eigenvalue is less than one in modulus. When the eigenvalues of $\Phi$ are not distinct, a similar argument can be applied to the Jordan decomposition of $\Phi$ (see Hamilton).
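A quick numerical check of the eigenvalue condition, continuing the sketch above (values illustrative):

```python
import numpy as np

Phi = np.array([[0.5, 0.3],
                [1.0, 0.0]])              # companion matrix of the example above
eigvals = np.linalg.eigvals(Phi)
print(np.abs(eigvals))                    # all moduli < 1, so the system is stable
print(np.linalg.matrix_power(Phi, 50))    # Phi^t is negligible by t = 50
```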

2 Stochastic Processes

In stochastic models, sequences of random variables $\{Y_t\}_{t=1}^{\infty}$ are an example of a stochastic process. Some useful definitions:

Stochastic process: the probability law governing $\{Y_t\}_{t=1}^{\infty}$.

Realization: one draw from the process, i.e. $\{y_t\}_{t=1}^{\infty}$. Note that $\{y_t\}_{t=1}^{\infty}$ is only ONE observation instead of an infinite number of observations of the stochastic process. What is the implication, and how do we deal with this problem?

Strict stationarity: the process is strictly stationary if the probability distribution of $(Y_t, Y_{t+1}, \ldots, Y_{t+k})$ is identical to the probability distribution of $(Y_\tau, Y_{\tau+1}, \ldots, Y_{\tau+k})$ for all $t$, $\tau$, $k$. (Thus, all joint distributions are time invariant.)

Autocovariances: the autocovariances are $\lambda_{t,k} = \operatorname{cov}\!\left(Y_t, Y_{t+k}^T\right)$.

Autocorrelations: the autocorrelations are $\rho_{t,k} = \operatorname{corr}\!\left(Y_t, Y_{t+k}^T\right)$. If $Y_1$ is correlated with $Y_2$ and $Y_2$ is correlated with $Y_3$, is $Y_1$ correlated with $Y_3$?

Covariance stationarity: the process is covariance stationary if

$$\mu_t = E(Y_t) = \mu \quad \text{for all } t, \qquad \lambda_{t,k} = \lambda_k \quad \text{for all } t.$$

Thus the means and autocovariances do not depend on time. When $Y$ is scalar, covariance stationarity implies $\lambda_k = \lambda_{-k}$, and when $Y$ is a vector, covariance stationarity implies $\lambda_k = \lambda_{-k}^T$. Why? Consider the example

$$\underbrace{\begin{pmatrix} x_{t+1} \\ y_{t+1} \end{pmatrix}}_{Z_{t+1}} = \begin{pmatrix} a & b \\ c & d \end{pmatrix} \underbrace{\begin{pmatrix} x_t \\ y_t \end{pmatrix}}_{Z_t} + \varepsilon_{t+1},$$

assuming $Z_t$ is i.i.d. standard bivariate normal. In this case,

$$\lambda_1 = \operatorname{Cov}(Z_t, Z_{t+1}) = \begin{pmatrix} a & c \\ b & d \end{pmatrix}, \qquad \lambda_{-1} = \begin{pmatrix} a & b \\ c & d \end{pmatrix},$$

so that $\lambda_1 = \lambda_{-1}^T$.
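This covariance calculation can be checked by simulation; a minimal sketch (matrix entries illustrative):

```python
import numpy as np

rng = np.random.default_rng(0)
A = np.array([[0.4, 0.1],
              [0.3, 0.2]])                 # the matrix (a b; c d)
n = 200_000
Z_t = rng.standard_normal((n, 2))          # Z_t i.i.d. standard bivariate normal
eps = rng.standard_normal((n, 2))          # independent shock
Z_next = Z_t @ A.T + eps                   # Z_{t+1} = A Z_t + eps_{t+1}

lam1 = Z_t.T @ Z_next / n                  # estimates Cov(Z_t, Z_{t+1})
print(np.round(lam1, 2))                   # close to A' = (a c; b d)
```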

Trend stationarity: the process is trend stationary if $Y_t - f(t)$ is stationary for an appropriately chosen $f(\cdot)$. In many cases $f(t)$ is taken to be $f(t) = \alpha + \beta t$. In the case of a stock price, the trend can represent, for example, the equity premium.

Difference stationarity: a process is difference stationary if $Y_t - Y_{t-1}$ is stationary. For example, if $Y_t$ is a stock price, does difference stationarity mean the stock price follows a random walk? (No, there might be other variables that can predict stock returns; see the various notions of market efficiency.) What is the implication for proving stock market efficiency? It is really equivalent to proving a negative statement: there does not exist a variable that can predict stock returns.

Let $\Delta$ denote the differencing operator, so that $\Delta Y_t = Y_t - Y_{t-1}$. A series is sometimes said to be integrated of order $d$, written $I(d)$, if $\Delta^d Y_t$ is stationary but $\Delta^{d-1} Y_t$ is not.

White noise: a process is called white noise if it is covariance stationary with $\mu = 0$ and $\lambda_k = 0$ for $k > 0$.

Martingale process: $Y_t$ is a martingale if $E[Y_t \mid \Omega_{t-1}] = Y_{t-1}$, where $\Omega_{t-1} \subseteq \Omega_t$ is the time $t-1$ information set. Often $\Omega_t = \{Y_\tau\}_{\tau=0}^{t}$.

Martingale difference process: $Y_t$ is a martingale difference process if $E[Y_t \mid \Omega_{t-1}] = 0$.

Markov process: $\{X_t\}$ is Markov if, given $X_t$, the distribution of $X_s$ for $s > t$ does not depend on $X_u$ for $u < t$. Is a Markov process a martingale? Is a martingale a Markov process?

3 Autoregressive Processes

An example of a stochastic process is

$$Y_t = \phi_1 Y_{t-1} + \phi_2 Y_{t-2} + \cdots + \phi_p Y_{t-p} + \varepsilon_t,$$

where $\varepsilon_t \sim \text{iid}(0, \sigma^2)$ and $(Y_0, Y_{-1}, \ldots, Y_{-p+1})$ is independent of $\{\varepsilon_t\}_{t=1}^{\infty}$ with mean $\mu_0$ and variance $\sigma_0^2$. This process is called an autoregressive process of order $p$, abbreviated AR(p).

In the AR(1) model,

$$Y_t = \phi Y_{t-1} + \varepsilon_t \quad \text{implies} \quad Y_t = \phi^t Y_0 + \sum_{i=0}^{t-1} \phi^i \varepsilon_{t-i}.$$

It can be verified that

$$\mu_t = E Y_t = \phi^t \mu_0,$$

$$\lambda_{t,0} = \operatorname{Var} Y_t = \begin{cases} \phi^{2t} \sigma_0^2 + \dfrac{1 - \phi^{2t}}{1 - \phi^2}\, \sigma^2 & \text{for } |\phi| < 1, \\[1.5ex] \sigma_0^2 + t \sigma^2 & \text{for } \phi = 1. \end{cases}$$

And these are time invariant if

1. $|\phi| < 1$
2. $\mu_0 = 0$, $\sigma_0^2 = \dfrac{\sigma^2}{1 - \phi^2}$

It can then be verified that

$$\lambda_{t,k} = \phi^k \frac{\sigma^2}{1 - \phi^2} \overset{\text{def}}{=} \lambda_k,$$

which does not depend on $t$. Hence conditions 1 and 2 are necessary and sufficient for covariance stationarity. Often only condition 1 is mentioned because, given condition 1, $Y_0$ has only a transitory effect on the process. But if you want to simulate a truly stationary process, the initial observation of an AR(1) model needs to be drawn from its stationary distribution, as in the sketch below.

For the AR(p) model, there are similar restrictions that lead to covariance stationarity. As in the AR(1) case, there are two sets of restrictions. The first is that the process is stable, that is, all the eigenvalues of $\Phi$ are less than 1 in modulus. The second involves the mean and variance of the initial conditions. These are worked out in detail in Hamilton.
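A minimal sketch of that simulation (parameter values illustrative):

```python
import numpy as np

rng = np.random.default_rng(0)
phi, sigma, n = 0.8, 1.0, 1_000

y = np.empty(n)
# Draw Y_0 from the stationary distribution, which has mean 0 and variance
# sigma^2 / (1 - phi^2), so that every Y_t has the same unconditional moments.
y[0] = rng.normal(scale=sigma / np.sqrt(1 - phi**2))
for t in range(1, n):
    y[t] = phi * y[t - 1] + rng.normal(scale=sigma)
```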

4 The Lag Operator

A useful notational device is the lag operator, denoted by $L$ (some authors use $B$). In general, $L$ is an operator that maps a sequence $\{y_t\}_{t=-\infty}^{\infty}$ into another sequence $\{x_t\}_{t=-\infty}^{\infty}$. Specifically, $L$ lags the sequence one period. Thus

$$L y_t = y_{t-1}, \quad L^2 y_t = y_{t-2}, \quad \ldots, \quad L^p y_t = y_{t-p}.$$

If $b$ denotes a constant,

$$b L y_t = b y_{t-1} = L b y_t.$$

We can use this operator to write the AR(p) model as

$$Y_t = \phi_1 Y_{t-1} + \phi_2 Y_{t-2} + \cdots + \phi_p Y_{t-p} + \varepsilon_t = \phi_1 L Y_t + \phi_2 L^2 Y_t + \cdots + \phi_p L^p Y_t + \varepsilon_t$$

or

$$\left(1 - \phi_1 L - \phi_2 L^2 - \cdots - \phi_p L^p\right) Y_t = \varepsilon_t$$

or $\phi(L) Y_t = \varepsilon_t$ with $\phi(L) = 1 - \phi_1 L - \phi_2 L^2 - \cdots - \phi_p L^p$. Notice that the operator $\phi(L)$ is a $p$-th order polynomial in the lag operator $L$. This polynomial is called the autoregressive polynomial. As an exercise, you should show that the roots of the autoregressive polynomial are the reciprocals of the eigenvalues of the companion matrix $\Phi$ (the roots of the autoregressive polynomial are the values of $z$ that make $\phi(z) = 0$). Thus, the conditions for covariance stationarity are often stated as: the roots of the AR polynomial are greater than 1 in modulus.
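You can verify the root-eigenvalue relationship numerically; a minimal sketch (coefficients illustrative):

```python
import numpy as np

# AR(2) example: phi(z) = 1 - 0.5 z - 0.3 z^2.
Phi = np.array([[0.5, 0.3],
                [1.0, 0.0]])                   # companion matrix
eigvals = np.linalg.eigvals(Phi)

# np.roots expects coefficients from the highest power down.
roots = np.roots([-0.3, -0.5, 1.0])            # zeros of phi(z)

print(np.sort(np.abs(roots)))                  # moduli of the AR polynomial roots
print(np.sort(1.0 / np.abs(eigvals)))          # reciprocals of eigenvalue moduli: equal
```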

The lag operator is useful because it can be manipulated in familiar algebraic ways to simplify calculations. For example, in the AR(1) process we can write each observation as the sum of lagged shocks,

$$Y_t = \phi^t Y_0 + \sum_{i=0}^{t-1} \phi^i \varepsilon_{t-i}.$$

It is quite tedious to do this for the AR(p) process. With the lag operator, this task can be accomplished in a less tedious way. To begin, for the AR(1) process one can write

$$(1 - \phi L) Y_t = \varepsilon_t,$$

and we want to find the operator that maps $\{\varepsilon_t\}$ into $\{Y_t\}$; that is, we seek a $c(L)$ such that $Y_t = c(L) \varepsilon_t$. Since $(1 - \phi L) Y_t = \varepsilon_t$, it is natural to write $c(L) = (1 - \phi L)^{-1}$. What does this mean? Recall

$$(1 - x)^{-1} = 1 + x + x^2 + x^3 + \cdots \quad \text{when } |x| < 1.$$

This suggests writing

$$(1 - \phi L)^{-1} = 1 + \phi L + \phi^2 L^2 + \phi^3 L^3 + \cdots,$$

which yields the solution

$$Y_t = \sum_{i=0}^{\infty} \phi^i \varepsilon_{t-i},$$

the same one would obtain from infinite recursive substitution. Thus, inverting $(1 - \phi L)$ algebraically gives the right answer.

A word of warning, however: the algebraic inverse of the polynomial $(1 - \phi z)$, where $z$ is a variable, is not unique. To see this, note

$$1 - \phi z = -\phi z \left(1 - \phi^{-1} z^{-1}\right),$$

and thus

$$(1 - \phi z)^{-1} = -\phi^{-1} z^{-1} \left(1 - \phi^{-1} z^{-1}\right)^{-1} = -\phi^{-1} z^{-1} \left(1 + \phi^{-1} z^{-1} + \phi^{-2} z^{-2} + \cdots\right),$$

which suggests writing

$$(1 - \phi L)^{-1} = -\phi^{-1} L^{-1} \left(1 + \phi^{-1} L^{-1} + \phi^{-2} L^{-2} + \cdots\right),$$

which implies the solution

$$Y_t = -\sum_{i=1}^{\infty} \phi^{-i} \varepsilon_{t+i}.$$

Where does this come from? $Y_{t+1} = \phi Y_t + \varepsilon_{t+1}$, hence

$$Y_t = \phi^{-1} Y_{t+1} - \phi^{-1} \varepsilon_{t+1} = \phi^{-2} Y_{t+2} - \phi^{-2} \varepsilon_{t+2} - \phi^{-1} \varepsilon_{t+1} = \cdots$$

This is the forward solution to the difference equation, which could be deduced, for example, by forward recursive substitution.

How do we choose between the two solutions

$$Y_t = \sum_{i=0}^{\infty} \phi^i \varepsilon_{t-i} \quad \text{and} \quad Y_t = -\sum_{i=1}^{\infty} \phi^{-i} \varepsilon_{t+i}?$$

One approach is to impose a side condition on the solution: bounded input sequences $\{\varepsilon_t\}$ must lead to bounded output sequences. This rules out the forward solution if $|\phi| < 1$, since its coefficients $\phi^{-i}$ explode as $i$ grows large. Analogously, it rules out the backward solution if $|\phi| > 1$. This yields a solution rule: solve stable roots backwards and unstable roots forward.
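As a sanity check on the backward expansion, the truncated series $1 + \phi z + \cdots + \phi^K z^K$ should invert $(1 - \phi z)$ up to a negligible tail; a minimal sketch:

```python
import numpy as np

phi, K = 0.7, 30
a = np.array([1.0, -phi])          # coefficients of 1 - phi z, ascending powers
c = phi ** np.arange(K + 1)        # truncated expansion 1 + phi z + ... + phi^K z^K

prod = np.convolve(a, c)           # polynomial product, ascending powers
print(np.round(prod[:5], 6))       # ~ (1, 0, 0, 0, 0): c(z) inverts (1 - phi z) ...
print(prod[-1])                    # ... up to a tail -phi^(K+1), tiny when |phi| < 1
```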

5 Moving Average Models

We say that $Y_t$ follows a moving average process of order $q$ if

$$Y_t = \varepsilon_t - \theta_1 \varepsilon_{t-1} - \theta_2 \varepsilon_{t-2} - \cdots - \theta_q \varepsilon_{t-q},$$

where $\varepsilon_t \sim \text{iid}(0, \sigma^2)$.

Example: where might you see moving average models? Market microstructure models. E.g., assume the observed price equals the true fundamental price (which follows a random walk) plus a bid-ask bounce,

$$p_t = p_t^* + \varepsilon_t,$$

so that the return is

$$r_t = p_t - p_{t-1} = p_t^* - p_{t-1}^* + \varepsilon_t - \varepsilon_{t-1},$$

which is in the form of an MA model (for now, this is easy to see by assuming $p_t^* - p_{t-1}^* = 0$; later we will show that this assumption does not matter).

Some properties:

Covariance stationarity

MA(1) model: $Y_t = \varepsilon_t - \theta \varepsilon_{t-1}$

$$E(Y_t) = 0, \qquad \operatorname{Var}(Y_t) = \sigma^2 \left(1 + \theta^2\right), \qquad \operatorname{Cov}(Y_t, Y_{t-1}) = -\theta \sigma^2$$

(given this, is the stock return positively or negatively autocorrelated if there is a bid-ask bounce?), and

$$\operatorname{Cov}(Y_t, Y_{t-k}) = 0 \quad \text{for } k > 1.$$

Hence the process is covariance stationary.

MA(q) model: $Y_t = \varepsilon_t - \theta_1 \varepsilon_{t-1} - \theta_2 \varepsilon_{t-2} - \cdots - \theta_q \varepsilon_{t-q}$

$$E(Y_t) = 0, \qquad \operatorname{Var}(Y_t) = \sigma^2 \left(1 + \sum_{i=1}^{q} \theta_i^2\right),$$

$$\operatorname{Cov}(Y_t, Y_{t-k}) = \sigma^2 \left(-\theta_k + \sum_{j=1}^{q-k} \theta_j \theta_{k+j}\right) \quad \text{for } k \le q, \qquad \operatorname{Cov}(Y_t, Y_{t-k}) = 0 \quad \text{for } k > q.$$

Hence the process is covariance stationary.

Invertibility

Motivation: consider forecasting in the MA(1) model $Y_t = \varepsilon_t - \theta \varepsilon_{t-1}$. The forecast of $Y_t$ constructed at time $t-1$ would be $-\theta \varepsilon_{t-1}$. The problem is that $\varepsilon_{t-1}$ is not directly observed; it must be constructed from the lagged values of $Y_t$. How can this be done? Note

$$\varepsilon_t = \theta^t \varepsilon_0 + \sum_{i=0}^{t-1} \theta^i Y_{t-i}.$$

Thus the sequence of $\varepsilon$'s could be formed from present and lagged $Y$'s if $\varepsilon_0$ were known. But of course $\varepsilon_0$ is unknown. The process is invertible (meaning that the $\varepsilon$'s can be determined from lagged $Y$'s) if

$$\sum_{i=0}^{t-1} \theta^i Y_{t-i} \xrightarrow{\text{q.m.}} \varepsilon_t \quad \text{as } t \to \infty,$$

i.e., assuming $\varepsilon_0 = 0$ has no lasting effect on the quality of the forecasts. For the MA(1) model, invertibility requires $|\theta| < 1$. Equivalently, writing $Y_t = (1 - \theta L) \varepsilon_t$, invertibility requires that the root of $\theta(z) = 1 - \theta z$ be greater than 1 in modulus.

For the MA(q) model we get an analogous result, namely that the process is invertible if the roots of $\theta(z) = 1 - \theta_1 z - \theta_2 z^2 - \cdots - \theta_q z^q$ are greater than 1 in modulus.
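The transitory effect of setting $\varepsilon_0 = 0$ can be seen directly by simulation; a minimal sketch (values illustrative):

```python
import numpy as np

rng = np.random.default_rng(0)
theta, n = 0.5, 60
eps = rng.standard_normal(n)
y = eps.copy()
y[1:] -= theta * eps[:-1]            # Y_t = eps_t - theta eps_{t-1}

# Recover the shocks from eps_t = Y_t + theta eps_{t-1}, deliberately
# starting from the wrong value eps_0 = 0.
eps_hat = np.zeros(n)
for t in range(1, n):
    eps_hat[t] = y[t] + theta * eps_hat[t - 1]

err = np.abs(eps_hat - eps)
print(err[[1, 10, 40]])              # decays like |theta|^t since |theta| < 1
```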

Another motivation for the invertibility restriction comes from considering the autocovariances of the MA(1) model:

$$\lambda_0 = \sigma^2 \left(1 + \theta^2\right), \qquad \lambda_1 = -\theta \sigma^2.$$

Note we could also describe these two autocovariances as

$$\lambda_0 = \tilde{\sigma}^2 \left(1 + \tilde{\theta}^2\right), \qquad \lambda_1 = -\tilde{\theta} \tilde{\sigma}^2$$

with $\tilde{\theta} = \theta^{-1}$ and $\tilde{\sigma}^2 = \sigma^2 \left(1 + \theta^2\right)\left(1 + \tilde{\theta}^2\right)^{-1} = \theta^2 \sigma^2$. That is, the two MA(1) processes $Y_t = \varepsilon_t - \theta \varepsilon_{t-1}$ with $\operatorname{Var}(\varepsilon_t) = \sigma^2$ and $Y_t = \tilde{\varepsilon}_t - \tilde{\theta} \tilde{\varepsilon}_{t-1}$ with $\operatorname{Var}(\tilde{\varepsilon}_t) = \tilde{\sigma}^2$ have exactly the same autocovariances. Thus, given data on $Y_t$, we can't tell the processes apart, at least using the first two moments of the data. These two models are observationally equivalent for their first two moments. Since $\tilde{\theta} = \theta^{-1}$, one way to (arbitrarily) choose between them is to impose the restriction that the MA parameter is less than 1 in absolute value.
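Observational equivalence is easy to confirm by Monte Carlo; a minimal sketch (the helper `ma1_autocov` is illustrative):

```python
import numpy as np

def ma1_autocov(theta, sigma2, n=400_000, seed=0):
    """Simulate Y_t = eps_t - theta eps_{t-1}; estimate (lambda_0, lambda_1)."""
    rng = np.random.default_rng(seed)
    eps = rng.normal(scale=np.sqrt(sigma2), size=n)
    y = eps[1:] - theta * eps[:-1]
    return y.var(), np.mean(y[1:] * y[:-1])

theta, sigma2 = 0.5, 1.0
print(ma1_autocov(theta, sigma2))                  # roughly (1.25, -0.5)
print(ma1_autocov(1 / theta, theta**2 * sigma2))   # flipped root: same answer
```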

6 Autocovariance Generating Functions

The autocovariance generating function (ACGF) for a covariance stationary process is given by

$$\lambda(z) = \sum_{j=-\infty}^{\infty} \lambda_j z^j,$$

so that the autocovariances are given by the coefficients on the powers $z^j$. Its purpose (or one of its purposes) is the same as that of the moment generating function, namely it is a convenient way to store the autocovariances of a covariance stationary stochastic process. For MA processes, the ACGF is particularly easy to construct. Suppose

$$Y_t = \theta(L) \varepsilon_t;$$

then

$$\lambda(z) = \sigma^2 \theta(z) \theta\!\left(z^{-1}\right).$$

We will verify this formula for the MA(1) process; you should verify it for higher-order MA processes. For an MA(1) process $\theta(z) = 1 - \theta z$, so that

$$\theta(z) \theta\!\left(z^{-1}\right) = (1 - \theta z)\left(1 - \theta z^{-1}\right) = -\theta z^{-1} + \left(1 + \theta^2\right) - \theta z,$$

which implies autocovariances

$$\lambda_{-1} = -\theta \sigma^2, \qquad \lambda_0 = \sigma^2 \left(1 + \theta^2\right), \qquad \lambda_1 = -\theta \sigma^2,$$

with all other autocovariances equal to 0. Thus the formula yields the correct answer for the MA(1) process.
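Since the ACGF is just a polynomial product, it can be computed mechanically; a minimal sketch for the MA(1) case:

```python
import numpy as np

theta, sigma2 = 0.5, 1.0
th = np.array([1.0, -theta])     # theta(z) = 1 - theta z, ascending powers

# Convolving the coefficients with their reverse computes theta(z) theta(1/z),
# giving the coefficients on (z^{-1}, z^0, z^1).
acgf = sigma2 * np.convolve(th, th[::-1])
print(acgf)                      # [-0.5, 1.25, -0.5] = (lambda_{-1}, lambda_0, lambda_1)
```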

Now return to the issue of invertibility. Consider the MA polynomial for the MA(q) model,

$$\theta(z) = 1 - \theta_1 z - \theta_2 z^2 - \cdots - \theta_q z^q.$$

Suppose that this polynomial has zeros at $z = \gamma_1^{-1}, \gamma_2^{-1}, \ldots, \gamma_q^{-1}$. In this case we can factor the polynomial as

$$\theta(z) = (1 - \gamma_1 z)(1 - \gamma_2 z) \cdots (1 - \gamma_q z),$$

so that the ACGF is given by

$$\lambda(z) = \sigma^2 (1 - \gamma_1 z)(1 - \gamma_2 z) \cdots (1 - \gamma_q z)\left(1 - \gamma_1 z^{-1}\right)\left(1 - \gamma_2 z^{-1}\right) \cdots \left(1 - \gamma_q z^{-1}\right).$$

But since $(1 - \gamma z)\left(1 - \gamma z^{-1}\right)$ is proportional to $\left(1 - \gamma^{-1} z\right)\left(1 - \gamma^{-1} z^{-1}\right)$ (recall the discussion of invertibility in the MA(1) model), we can flip or invert the roots of the MA polynomial, change $\sigma^2$ to adjust for the factor of proportionality, and obtain the same ACGF, hence a model with the same autocovariances. Thus, for example, in the MA(2) model, if $\theta(z) = (1 - \gamma_1 z)(1 - \gamma_2 z)$, then the models with

$$\theta_1(z) = \left(1 - \gamma_1^{-1} z\right)(1 - \gamma_2 z), \qquad \theta_2(z) = (1 - \gamma_1 z)\left(1 - \gamma_2^{-1} z\right), \qquad \theta_3(z) = \left(1 - \gamma_1^{-1} z\right)\left(1 - \gamma_2^{-1} z\right)$$

are observationally equivalent to it.

7 Autoregressive-Moving Average (ARMA) Models

Autoregressive-moving average models combine the simple AR and MA models. The ARMA(p,q) model is

$$Y_t = \phi_1 Y_{t-1} + \phi_2 Y_{t-2} + \cdots + \phi_p Y_{t-p} + \varepsilon_t - \theta_1 \varepsilon_{t-1} - \theta_2 \varepsilon_{t-2} - \cdots - \theta_q \varepsilon_{t-q}$$

or

$$\phi(L) Y_t = \theta(L) \varepsilon_t$$

with

$$\phi(L) = 1 - \phi_1 L - \phi_2 L^2 - \cdots - \phi_p L^p \quad \text{and} \quad \theta(L) = 1 - \theta_1 L - \theta_2 L^2 - \cdots - \theta_q L^q.$$

The conditions for covariance stationarity and invertibility are just the same as in the simple models: the roots of $\phi(z)$ and $\theta(z)$ are greater than 1 in modulus.
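Both root conditions are easy to check numerically; a minimal sketch (the helper `arma_ok` and the coefficients are illustrative):

```python
import numpy as np

def arma_ok(phis, thetas):
    """Check stationarity and invertibility of an ARMA(p,q) model via the
    root conditions on phi(z) and theta(z)."""
    phi_poly = np.r_[1.0, -np.asarray(phis, dtype=float)]       # ascending powers
    th_poly = np.r_[1.0, -np.asarray(thetas, dtype=float)]
    stationary = np.all(np.abs(np.roots(phi_poly[::-1])) > 1)   # roots outside unit circle
    invertible = np.all(np.abs(np.roots(th_poly[::-1])) > 1)
    return stationary, invertible

print(arma_ok([0.5, 0.3], [0.4]))   # (True, True)
```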

The ACGF for the ARMA model can be derived as follows. Since $\phi(L) Y_t = \theta(L) \varepsilon_t$, we have $Y_t = c(L) \varepsilon_t$ with

$$c(L) = \phi(L)^{-1} \theta(L),$$

which is a well-defined (mean-square convergent) polynomial in positive powers of $L$ (i.e., backward looking) if the roots of $\phi(z)$ are greater than 1 in absolute value. Thus $Y_t$ has the MA representation $Y_t = c(L) \varepsilon_t$, so that

$$\lambda(z) = \sigma^2 c(z) c\!\left(z^{-1}\right) = \sigma^2 \phi(z)^{-1} \theta(z) \phi\!\left(z^{-1}\right)^{-1} \theta\!\left(z^{-1}\right).$$

8 ARIMA Models

Suppose $Y_t$ is integrated of order $d$, so that $Y_t$ must be differenced $d$ times to be stationary. Let $X_t = (1 - L)^d Y_t$ and suppose that $X_t$ follows an ARMA(p,q) model. Then we say that $Y_t$ follows an ARIMA(p,d,q) model. (The extra "I" in ARIMA stands for "integrated".)
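As a closing illustration, the coefficients of the MA representation $c(L) = \phi(L)^{-1} \theta(L)$ from Section 7 can be computed by matching coefficients in $\phi(L) c(L) = \theta(L)$; a minimal sketch (the helper `psi_weights` is illustrative):

```python
import numpy as np

def psi_weights(phis, thetas, K=20):
    """First K coefficients of c(L) = phi(L)^{-1} theta(L), found recursively
    from c_j = th_j + sum_i phi_i c_{j-i}, where th_0 = 1 and th_j = -theta_j
    for 1 <= j <= q under the sign convention of the handout."""
    phis = np.asarray(phis, dtype=float)
    th = np.r_[1.0, -np.asarray(thetas, dtype=float)]
    c = np.zeros(K)
    for j in range(K):
        cj = th[j] if j < len(th) else 0.0
        for i in range(1, min(j, len(phis)) + 1):
            cj += phis[i - 1] * c[j - i]
        c[j] = cj
    return c

print(psi_weights([0.5], [], K=6))   # AR(1): (1, 0.5, 0.25, ...) = phi^j, as in Section 4
```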
