Time Series 2. Robert Almgren. Sept. 21, 2009


This week we will talk about linear time series models: AR, MA, ARMA, ARIMA, etc. First we will talk about the theory, and afterward we will talk about fitting the models to real data.

Let $\{x_t\}_{t=-\infty}^{\infty}$ be a scalar time series. That means that each $x_t$ is a real-valued random variable on some probability space, and the time series is completely defined by the distribution of each $x_t$ and by all the joint distributions. The index variable $t$ has the interpretation of discrete time, and we consider the series to extend to infinity in both directions. This definition of a time series does not specify any rule by which the values $x_t$ are generated; that is the subject of this section.

The series $\{x_t\}$ is stationary if all joint distributions of $(x_t, \dots, x_{t-p})$ are independent of $t$, that is, of where we are in the sequence. It is weakly stationary if only the first and second moments are independent of $t$. The series is Gaussian if all joint distributions are Gaussian; if the series is Gaussian, then weak stationarity is equivalent to stationarity. Since for the linear theory we worry only about first and second moments, it will not really matter whether a series is stationary or only weakly so, or whether it is Gaussian.

People with a background in stochastic analysis may wonder: what about filtrations? Do we need a detailed construction to describe how information is revealed in time? Generally speaking, that concept stays in the background in time series analysis, because we are not doing optimal control. It is still present, and we would prefer that any real-time analysis method use only data observed to date rather than future data. But unlike trading strategies, here the payoff for looking forward is less dramatic, since we are only doing statistical modeling. Sometimes we even consider analysis methods, such as Fourier transforms, that use the entire series of observations, both future and past.

1 Partial auto-correlations

Last time, we defined the mean and the autocovariance at lag $l$:
$$\mu = E(x_t), \qquad \gamma_l = \mathrm{Cov}(x_t, x_{t-l}) = E\big((x_t - \mu)(x_{t-l} - \mu)\big).$$
Note that these formulas have a $t$ on the right side, meaning that we take the mean and covariances of some specific term in the sequence; there is no $t$ on the left side because of stationarity. The zero-order covariance $\gamma_0 = \mathrm{Var}(x_t)$ is the mean-square size of the individual terms. Then the autocorrelation is
$$\rho_l = \frac{\gamma_l}{\gamma_0},$$
which has $\rho_0 = 1$ and $|\rho_l| \le 1$. The autocovariance and autocorrelation are defined for all $l = 0, \pm 1, \pm 2, \dots$ and are even: $\rho_{-l} = \rho_l$.

Let us quickly review what correlation means. A correlation of $\rho_l$ between $x_t$ and $x_{t-l}$ tells you how much you learn about the value of $x_t$ by knowing the value of $x_{t-l}$, without knowing anything about any other values. As usual in stochastic modeling, we assume we know all the parameters of the process, $\mu$ and $\{\gamma_l\}_{l=-\infty}^{\infty}$; the only unknowns are the values $x_t$ themselves.

In general, suppose that $\xi$ and $\eta$ are two random variables with $E(\xi) = E(\eta) = 0$, $E(\xi^2) = \sigma_\xi^2$, $E(\eta^2) = \sigma_\eta^2$, and $E(\xi\eta) = \rho\,\sigma_\xi\sigma_\eta$. If we had no information about the value of $\xi$, our best predictor for the value of $\eta$ would be simply $\hat\eta = 0$. To predict the value of $\eta$ given $\xi$, we make a linear model $\hat\eta = \beta\xi$. We choose the value of $\beta$ by minimizing the squared error:
$$\min_\beta E\big[(\eta - \beta\xi)^2\big] = \min_\beta \big(\sigma_\eta^2 - 2\beta\rho\,\sigma_\xi\sigma_\eta + \beta^2\sigma_\xi^2\big) \quad\Longrightarrow\quad \beta = \rho\,\frac{\sigma_\eta}{\sigma_\xi}.$$
Applying this with $\eta = x_t - \mu$ and $\xi = x_{t-l} - \mu$, both of zero mean and with equal variances, the best predictor for $x_t$ given $x_{t-l}$ (and knowing no other values) is
$$\hat x_t = \mu + \rho_l\,(x_{t-l} - \mu).$$
The error in this prediction will be somewhat less than that of the zero-information prediction $\hat x_t = \mu$.
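This single-lag predictor is easy to check by simulation. A minimal R sketch (mine, not part of the notes; the AR(1) choice and the parameter values are illustrative assumptions):

    # For an AR(1) with phi = 0.6 we will see later that rho_l = phi^l.
    set.seed(1)
    n   <- 1e5
    phi <- 0.6
    x <- as.numeric(arima.sim(model = list(ar = phi), n = n))
    l <- 2
    beta_hat <- coef(lm(x[(l+1):n] ~ x[1:(n-l)] - 1))     # slope of x_t on x_{t-l}, no intercept
    c(estimated = unname(beta_hat), theoretical = phi^l)  # both near rho_2 = 0.36

Because the series has zero mean and equal variances at the two lags, the no-intercept regression slope is exactly the sample version of $\rho_l$.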

Now suppose we observe all the $l$ preceding values $x_{t-1}, \dots, x_{t-l}$, and let us set $\mu = 0$ for simplicity. We make a model of the form
$$\hat x_t = \beta_1 x_{t-1} + \cdots + \beta_l x_{t-l}.$$
We choose the coefficients $\beta_1, \dots, \beta_l$ to minimize the squared error:
$$\min_{\beta_1, \dots, \beta_l} E\Big[\big(x_t - (\beta_1 x_{t-1} + \cdots + \beta_l x_{t-l})\big)^2\Big].$$
Differentiating with respect to $\beta_j$ for each $j = 1, \dots, l$ and taking expectations, we obtain the set of $l$ linear conditions
$$\gamma_j = \beta_1 \gamma_{j-1} + \cdots + \beta_l \gamma_{j-l}, \qquad j = 1, \dots, l.$$
(Note that the sequence of subscripts $j-1, \dots, j-l$ crosses zero somewhere in the middle, so we use the definition of $\gamma_l$ for $l < 0$ as well as $l > 0$.) That is, we solve the linear system
$$\Gamma \begin{pmatrix} \beta_1 \\ \vdots \\ \beta_l \end{pmatrix} = \begin{pmatrix} \gamma_1 \\ \vdots \\ \gamma_l \end{pmatrix}, \qquad \Gamma_{ij} = \gamma_{i-j}.$$
The last coefficient $\beta_l$ is the $l$th partial autocorrelation of $x_t$ with $x_{t-l}$, sometimes denoted PACF($l$). It is the additional value, for predicting $x_t$, of knowing $x_{t-l}$ if you already know $x_{t-1}, \dots, x_{t-l+1}$. Note that to get the partial autocorrelation at each lag, you have to solve a different linear system for each $l$. More fully, we might denote the $l$th matrix as $\Gamma_l$, and the $l$ coefficients as $\beta_{l,j}$, the $j$th coefficient in the fit of order $l$. (A numerical sketch of this construction appears below, after the white-noise definition.)

2 White noise

A time series $\{w_t\}$ is a white noise if all finite collections of elements are identically distributed, uncorrelated random variables with zero mean. That is, the series $\{w_t\}$ is stationary, has mean zero ($E(w_t) = 0$), and has zero covariances at all nonzero lags: $E(w_t w_s) = 0$ if $s \neq t$.
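Returning to the partial autocorrelations above: here is a minimal R sketch (mine) that solves the system $\Gamma\beta = \gamma$ at each lag, using sample autocovariances from simulated data, and checks the result against R's built-in pacf. The AR(2) coefficients are an arbitrary illustrative choice.

    set.seed(2)
    x <- arima.sim(model = list(ar = c(0.5, 0.2)), n = 1e4)
    g <- drop(acf(x, type = "covariance", lag.max = 10, plot = FALSE)$acf)  # gamma_0..gamma_10
    pacf_manual <- function(l) {
      Gamma <- toeplitz(g[1:l])            # Gamma_ij = gamma_{|i-j|}
      beta  <- solve(Gamma, g[2:(l + 1)])  # solve Gamma beta = (gamma_1, ..., gamma_l)
      beta[l]                              # last coefficient = PACF(l)
    }
    sapply(1:5, pacf_manual)
    drop(pacf(x, lag.max = 5, plot = FALSE)$acf)   # should agree

Since the system is scale-invariant, it makes no difference whether it is set up with autocovariances or autocorrelations.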

The individual variance $\sigma^2 = E(w_t^2)$ must be constant if the series is stationary. The series is a strong white noise if $w_s$ and $w_t$ are independent for $s \neq t$, which is stronger than being uncorrelated. We do not require that $w_t$ be Gaussian; for example, a sequence of coin flips $w_t = \pm 1$ is white noise. That said, a Gaussian sequence is the most common example and the most useful to keep in mind, generated for example by Matlab's w = randn(1,n) or R's w <- rnorm(n).

If $r_t$ is a series of asset returns, then the theory of the efficient market suggests that the sign of $r_t$ should be very difficult to predict in terms of previous values $r_{t-1}, r_{t-2}, \dots$ (and any other data available at the time). But this does not necessarily mean that the return series is white noise, since we might be able to estimate the size of likely price changes. If $\sigma_t^2$ is our estimate for $E(r_t^2)$, then the series $w_t = r_t/\sigma_t$ might plausibly be a white noise (with unit variance).

We construct time series by applying various transformations to a white noise input, and we analyze data by finding transformations that reduce the given data set to something that passes statistical tests for white noise. That is, we investigate the relationship
$$\{x_t\} \longleftrightarrow \{w_t\} \qquad (1)$$
in both directions. The relationship is between the entire sequences on both sides, not between the individual values at a specific time.

In the leftward direction, $\{w_t\} \to \{x_t\}$, this is a model that specifies how the time series of interest might be generated from a white noise series. If you like, it is a specification for a program for generating sample realizations of the sequence. Just because this model exists does not mean that the original series really was generated in that way, just that this is one possible way to construct it.

In the rightward direction, $\{x_t\} \to \{w_t\}$, this is a technique for data analysis. You start with an empirical data set consisting of one observation of all the $x_t$, and apply a series of transformations to get another numerical series $w_t$. If your analysis is correct, this derived series should satisfy the empirical tests to be a white noise. This is the same as in ordinary regression, where you should keep improving your model until no observable structure is present in the residuals.

As an example, consider the random walk model $x_t = x_{t-1} + w_t$. Stated in this way, it is a constructive formula for generating realizations. It can also be inverted as $w_t = x_t - x_{t-1}$, and in this formulation it can be used to test the residuals and thus the hypothesis that the input data is a random walk.
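To make the two directions concrete, here is a minimal R sketch (mine; the Ljung-Box test is used as an illustrative stand-in for "statistical tests for white noise"):

    set.seed(3)
    w <- rnorm(1000)        # white noise input
    x <- cumsum(w)          # leftward direction: build the random walk x_t = x_{t-1} + w_t
    w_rec <- diff(x)        # rightward direction: recover w_t = x_t - x_{t-1}
    Box.test(w_rec, lag = 10, type = "Ljung-Box")  # large p-value, consistent with white noise
    Box.test(x,     lag = 10, type = "Ljung-Box")  # essentially zero: x itself is not white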

3 Moving average models

A moving average model of order $q \ge 0$, MA($q$), has the form
$$x_t = \mu + \sum_{i=0}^{q} \theta_i w_{t-i} = \mu + \theta_0 w_t + \theta_1 w_{t-1} + \cdots + \theta_q w_{t-q}, \qquad (2)$$
with $E(w_t^2) = \sigma^2$. In theory, $x_t$ might also depend on future values $w_{t+j}$ with $j > 0$, but we will assume our model is causal, depending only on past values. (For finite $q$, a two-sided model of order $q$ is equivalent to a one-sided model of order $2q$, and if you were clever you could probably define this equivalence in the limit $q \to \infty$.)

Clearly the mean is $E(x_t) = \mu$. Let us calculate the autocovariance function. Setting $\mu = 0$ for convenience,
$$\gamma_l = E(x_t x_{t-l}) = \sum_{i,j=0}^{q} \theta_i \theta_j\, E(w_{t-i} w_{t-l-j}) = \sigma^2 \sum_{j=0}^{q-l} \theta_j \theta_{j+l}.$$
The key ingredient is the white noise definition $E(w_i w_j) = \sigma^2 \delta_{ij}$. By convention, the sum is zero when the upper limit is strictly less than the lower limit, that is, when $l > q$. Thus an MA($q$) model has autocorrelation strictly zero for lags greater than the order of the model. This is apparent from looking at (2), since $x_t$ depends on $w_t, \dots, w_{t-q}$ and $x_s$ depends on $w_s, \dots, w_{s-q}$, and if $|s - t| > q$ these sets do not overlap, so there is no possible mechanism for $x_t$ and $x_s$ to have any relationship.

By the above, the variance is
$$E(x_t^2) = \gamma_0 = \sigma^2 \big(\theta_0^2 + \cdots + \theta_q^2\big),$$
which is certainly finite for any finite $q$. In the limit $q \to \infty$ we require that the sequence be summable ($\sum |\theta_j|$ exists), which is stronger than square-summable ($\sum \theta_j^2$ exists). Because the variance is finite, the series $x_t$ is stationary for any finite set of values $\theta_1, \dots, \theta_q$, with only the rather weak summability condition in the case $q = \infty$.

Non-example: The random walk $x_t = x_{t-1} + w_t$ can formally be written as $x_t = w_t + w_{t-1} + w_{t-2} + \cdots$. This looks like a moving-average model, but we do not count it as such, because the sequence of coefficients $\theta = (1, 1, 1, \dots)$ is not summable.
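The lag-$q$ cutoff is easy to see numerically. A sketch of mine using R's ARMAacf, whose MA sign convention matches (2) with $\theta_0 = 1$; the coefficients are arbitrary illustrative values:

    # Theoretical autocorrelations of an MA(2): exactly zero beyond lag 2.
    ARMAacf(ma = c(0.4, 0.3), lag.max = 5)
    # Hand check of rho_1 = (theta_0 theta_1 + theta_1 theta_2) / (theta_0^2 + theta_1^2 + theta_2^2):
    theta <- c(1, 0.4, 0.3)
    sum(theta[1:2] * theta[2:3]) / sum(theta^2)   # 0.416, matching the lag-1 value above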

[Figure 1: Sample path of the MA(1) model $x_t = \frac{1}{2}(w_t + w_{t-1})$. Small dots are $w_t$; large dots and heavy lines are $x_t$. The MA process smooths the jaggedness of the white noise. (We take $\mathrm{Var}(w_t) = 2$ so that $\mathrm{Var}(x_t) = 1$.)]

Example

For $q = 1$, let us consider
$$x_t = w_t + \alpha\, w_{t-1}, \qquad E(w_t^2) = \sigma^2.$$
The mean is zero, and the autocovariances are
$$\gamma_0 = (1 + \alpha^2)\,\sigma^2, \qquad \gamma_1 = \alpha\,\sigma^2,$$
so the correlation (the only nonzero value is $\rho = \rho_1$) is
$$\rho = \frac{\alpha}{1 + \alpha^2}.$$
Note that the maximum possible value is $\rho = \pm\frac{1}{2}$, attained when $\alpha = \pm 1$, although Cauchy-Schwarz requires only $|\rho| \le 1$. Values of $|\rho|$ larger than $\frac{1}{2}$ are certainly possible, but not within an MA(1) model.
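A path like the one in Figure 1 can be regenerated with a few lines of R (my sketch; stats::filter with sides = 1 implements the one-sided moving average, and the parameter values follow the caption):

    set.seed(4)
    w <- rnorm(201, sd = sqrt(2))                    # Var(w_t) = 2, as in the caption
    x <- stats::filter(w, c(0.5, 0.5), sides = 1)    # x_t = (w_t + w_{t-1})/2; x[1] is NA
    var(x, na.rm = TRUE)                             # close to 1
    acf(na.omit(x), lag.max = 2, plot = FALSE)       # lag-1 value near rho = 1/2, lag 2 near 0
    plot(x, type = "o")                              # visibly smoother than plot(w)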

Figure 1 shows such a sample path, with $\alpha = 1$. Thus (except for a factor of 2) each value $x_t$ is the average of two successive values of $w_t$, and is hence a smoothed version of $w_t$.

In general, the PACF of an MA($q$) model is not zero for $l > q$. For the above example, to compute the PACF at lag $l = 2$, the matrix (normalizing $\gamma_0 = 1$, which changes nothing since the system is scale-invariant) is
$$\Gamma = \begin{pmatrix} \gamma_0 & \gamma_1 \\ \gamma_1 & \gamma_0 \end{pmatrix} = \begin{pmatrix} 1 & \rho \\ \rho & 1 \end{pmatrix}, \qquad \Gamma^{-1} = \frac{1}{1 - \rho^2} \begin{pmatrix} 1 & -\rho \\ -\rho & 1 \end{pmatrix}.$$
Then for the fit $\hat x_t = \beta_1 x_{t-1} + \beta_2 x_{t-2}$, the optimal values $\beta_1$, $\beta_2$ are
$$\begin{pmatrix} \beta_1 \\ \beta_2 \end{pmatrix} = \Gamma^{-1} \begin{pmatrix} \gamma_1 \\ \gamma_2 \end{pmatrix} = \Gamma^{-1} \begin{pmatrix} \rho \\ 0 \end{pmatrix} = \frac{1}{1 - \rho^2} \begin{pmatrix} \rho \\ -\rho^2 \end{pmatrix},$$
so the partial auto-correlation at lag 2 is
$$\beta_2 = -\frac{\rho^2}{1 - \rho^2} = -\frac{\alpha^2}{1 + \alpha^2 + \alpha^4}.$$
You can easily check that the 3rd-order PACF is also not zero.

Here is what is going on. We want to predict $x_t$, a combination of $w_t$ and $w_{t-1}$, neither of which we can observe directly no matter what values of $x_s$ we are given. Knowing the value of $x_{t-1}$ is useful, because it gives us some information about $w_{t-1}$, though this is mixed in with $w_{t-2}$. But knowing the value of $x_{t-2}$ alone has no value, because it involves $w_{t-2}$ and $w_{t-3}$, which are independent of the two values we care about. However, if we already know $x_{t-1}$, then additional information about $x_{t-2}$ is useful, because it helps us distinguish the two components $w_{t-1}$ and $w_{t-2}$ in $x_{t-1}$.
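A quick numerical check of the lag-2 formula (my sketch; ARMAacf with pacf = TRUE returns the theoretical partial autocorrelations, and the value of alpha is arbitrary):

    alpha <- 0.5
    ARMAacf(ma = alpha, lag.max = 3, pacf = TRUE)   # theoretical PACF at lags 1, 2, 3
    -alpha^2 / (1 + alpha^2 + alpha^4)              # -0.1905..., matching the lag-2 entry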

Recovery from autocovariances

Suppose we are given a sequence of autocovariances $\gamma_0, \dots, \gamma_q$. Can we reconstruct a model that would have generated them? The answer is yes, partially. First, the autocovariances must be zero beyond a finite order $q$, if an MA($q$) model is to have any hope. Then we solve the above equations for $\theta_0, \dots, \theta_q$ in terms of $\gamma_0, \dots, \gamma_q$, but the answer will not be unique. As an example, consider $q = 1$ and fix $\sigma = 1$. The system is
$$\gamma_0 = \theta_0^2 + \theta_1^2, \qquad \gamma_1 = \theta_0\,\theta_1.$$
It is clear that there are no solutions if $|\gamma_1| > \frac{1}{2}\gamma_0$, although the standard conditions on autocovariances require only $|\gamma_1| \le \gamma_0$. And if $|\gamma_1| < \frac{1}{2}\gamma_0$, there are four distinct solutions, corresponding to changing the signs of $w_t$ and $w_{t-1}$ and to interchanging $w_t$ and $w_{t-1}$.

Normalization

Having both $\theta_0$ and $\sigma^2$ is redundant, and we may normalize $\theta_0 = 1$ (or we could require $E(w_t^2) = 1$, but $\theta_0 = 1$ is the convention). Thus
$$x_t = \mu + w_t + \theta_1 w_{t-1} + \cdots + \theta_q w_{t-q}, \qquad E(w_t^2) = \sigma^2.$$
This is the standard form of an MA($q$) model, although there are conflicting conventions for the sign of the $\theta_j$.

4 Auto-regressive models

Exploiting the symmetry of the relationship (1), we now consider models that can be written like (2) with the roles of $x_t$ and $w_t$ exchanged:
$$w_t = x_t - c - \phi_1 x_{t-1} - \cdots - \phi_p x_{t-p}$$
for some $p \ge 0$, eventually including the limit $p \to \infty$. Or, in more standard form,
$$x_t = c + \phi_1 x_{t-1} + \cdots + \phi_p x_{t-p} + w_t. \qquad (3)$$
This is called an autoregressive model of order $p$, or AR($p$). It is not clear that $x_t$ has a stationary solution at all, or (what is the same thing) that the influence of starting values will decay. Indeed, the random walk from last week is an AR(1) process with $\phi_1 = 1$ and $c = 0$, and we know that it is not stationary. Even worse, consider an AR(1) process $x_t = 2x_{t-1} + w_t$, whose solutions explode exponentially. Clearly, we need to figure out a general way to tell when solutions are well behaved.
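To see the three regimes, here is a small R sketch (mine) that iterates (3) directly; the helper sim_ar1 is a hypothetical name, and arima.sim is avoided because it rejects non-stationary coefficients:

    set.seed(5)
    sim_ar1 <- function(phi, n = 200) {     # iterate x_t = phi x_{t-1} + w_t, with c = 0, x_0 = 0
      w <- rnorm(n); x <- numeric(n)
      for (t in 2:n) x[t] <- phi * x[t - 1] + w[t]
      x
    }
    range(sim_ar1(0.5))   # stationary: fluctuates within a few units of zero
    range(sim_ar1(1.0))   # random walk: wanders without a fixed scale
    range(sim_ar1(2.0))   # explosive: grows like 2^t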

Linear dynamics

To understand the behavior of solutions to (3), let us use a little formal notation from the general theory of linear systems. Let us write (3) as
$$(Lx)_t = R_t \qquad\text{with}\qquad (Lx)_t = x_t - \phi_1 x_{t-1} - \cdots - \phi_p x_{t-p} \quad\text{and}\quad R_t = c + w_t.$$
Here $L$ is a homogeneous linear operator acting on an entire sequence $x = \{x_t\}_{t=-\infty}^{\infty}$, yielding another infinite sequence $y = Lx$ given by the above expression. $R$ is the inhomogeneous term on the right side, which has a constant part and a random part. As is usual for linear systems, the solution may be written
$$x_t = \sum_{s=-\infty}^{\infty} I_{t-s} R_s = \sum_{j=0}^{\infty} I_j R_{t-j},$$
where the influence function $I$ is the infinite sequence that satisfies¹
$$(LI)_t = \delta_t, \qquad I_t = 0 \ \text{for}\ t < 0.$$
This tells you how the system responds to a unit perturbation at $t = 0$. The entire system behavior is simply a linear combination of the responses to all these perturbations.

All of this says that it is enough to look for solutions of the homogeneous problem $Lx = 0$, or
$$x_t = \phi_1 x_{t-1} + \cdots + \phi_p x_{t-p},$$
with the same coefficients $\phi_1, \dots, \phi_p$ but with the constant $c$ set to zero and the noise term $w_t$ eliminated. We look for pure exponential solutions of this problem in the form $x_t = \lambda^t$, where $\lambda$ is a real or complex number. This is a solution if and only if $P(1/\lambda) = 0$, where the characteristic polynomial of degree $p$ is
$$P(z) = 1 - \phi_1 z - \cdots - \phi_p z^p.$$

¹ The Kronecker delta is $\delta_t = 1$ for $t = 0$ and $\delta_t = 0$ for $t \neq 0$.

Since $P(0) = 1$, the polynomial is not identically zero, and generically it has $p$ distinct roots. The general solution of $Lx = 0$ is
$$x_t = a_1 \lambda_1^t + \cdots + a_p \lambda_p^t,$$
where the coefficients $a_1, \dots, a_p$ are determined by the initial conditions. Typically, all of the $a_j$ will be nonzero, if only by a small amount, so any growing exponential will cause the whole solution to grow exponentially.

The behavior of solutions of the linear system is determined by the locations of the zeros of $P$ in the complex plane. There are three cases (a numerical root check appears at the end of this subsection):

1. All of the zeros $z$ have $|z| > 1$. In this case, all solutions of the form $x_t = \lambda^t$ have $|\lambda| < 1$; that is, they decay exponentially. Regardless of what the initial data is (the $\delta_t$ in the definition of $I$), the effect of each input decays exponentially, at worst at the rate set by the root closest to the unit circle. Solutions are bounded and the process is stationary.

2. At least one of the zeros $z$ has $|z| < 1$. Then there is at least one special solution with $|\lambda| > 1$, meaning that it grows exponentially. The typical solution will almost certainly contain a component of this growing solution, and hence it will grow. The process is not stationary, and no reasonable solution exists.

3. All of the zeros $z$ have $|z| \ge 1$, and at least one has $|z| = 1$. This is a special borderline case that we will discuss later.

Since $P(0) = 1$, the only way to get $P(z) = 0$ for $|z| < 1$ (Case 2) is for the coefficients $\phi_j$ to be large enough. For example, if $|\phi_1| + \cdots + |\phi_p| < 1$, then for $|z| \le 1$ we have
$$|\phi_1 z + \cdots + \phi_p z^p| \le |\phi_1| + \cdots + |\phi_p| < 1,$$
so it is impossible for $P(z)$ to be zero for any $|z| \le 1$.

Example

Take $p = 1$, so the model is $x_t = c + \phi x_{t-1} + w_t$. The characteristic polynomial is $P(z) = 1 - \phi z$, which has a single root at $z = 1/\phi$. This is strictly greater than one in absolute value if and only if $|\phi| < 1$, which is the condition we found last time. If this is true, then perturbations decay exponentially; if it is false, then they explode exponentially.
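The root test is easy to automate in R; a minimal sketch (mine; the helper ar_stationary is a hypothetical name):

    # Stationarity: all roots of P(z) = 1 - phi_1 z - ... - phi_p z^p outside the unit circle.
    # polyroot takes coefficients in increasing powers, so P is passed as c(1, -phi).
    ar_stationary <- function(phi) all(Mod(polyroot(c(1, -phi))) > 1)
    ar_stationary(0.5)           # TRUE:  single root at z = 2
    ar_stationary(1.0)           # FALSE: root on the unit circle (the random walk)
    ar_stationary(c(0.5, 0.4))   # TRUE for this AR(2)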

Reduction to MA model

Any stationary AR model can be converted to an MA model of infinite order. For example, consider the AR(1) model:
$$\begin{aligned}
x_t &= c + \phi\, x_{t-1} + w_t \\
    &= c + \phi\,\big(c + \phi x_{t-2} + w_{t-1}\big) + w_t \\
    &= (1 + \phi)\,c + \phi^2 \big(c + \phi x_{t-3} + w_{t-2}\big) + w_t + \phi\, w_{t-1} \\
    &= (1 + \phi + \phi^2 + \cdots)\,c + w_t + \phi\, w_{t-1} + \phi^2 w_{t-2} + \cdots \\
    &= \frac{c}{1 - \phi} + w_t + \phi\, w_{t-1} + \phi^2 w_{t-2} + \cdots
\end{aligned}$$
if $|\phi| < 1$, which we found to be exactly the condition for the model to have a stationary distribution at all. Thus the AR(1) model is equivalent to an MA($\infty$) model, in which the coefficients are given by an infinite sequence with a very simple exponential structure. You can see that the same procedure would work for any finite AR($p$) model, though the details would be more complicated.
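R's ARMAtoMA computes exactly these MA($\infty$) coefficients; a quick check of the exponential structure (my sketch, with φ = 0.6 as an arbitrary value):

    ARMAtoMA(ar = 0.6, lag.max = 6)   # MA(infinity) weights of an AR(1): 0.6, 0.36, 0.216, ...
    0.6^(1:6)                         # phi^j: identical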

Mean and autocovariances

Let us calculate the moments of an AR process. First, the mean $\mu = E(x_t)$ satisfies
$$\mu = c + \phi_1 \mu + \cdots + \phi_p \mu$$
by stationarity and since $E(w_t) = 0$, so
$$\mu = \frac{c}{P(1)} = \frac{c}{1 - \phi_1 - \cdots - \phi_p}.$$
This is finite as long as all the zeros of $P$ are outside the unit disk.

For the autocovariances, we start with an AR(1) process with $c = 0$. Multiplying the dynamic equation $x_t = \phi x_{t-1} + w_t$ by $x_t$, we have
$$x_t^2 = \phi\, x_t x_{t-1} + (\phi x_{t-1} + w_t)\, w_t.$$
Taking expectations and using $E(w_t x_{t-1}) = 0$, we find $\gamma_0 = \phi\gamma_1 + \sigma^2$. For any $l > 0$, multiplying the dynamic equation by $x_{t-l}$ and taking expectations gives
$$\gamma_l = \phi\,\gamma_{l-1}.$$
In particular, $\gamma_1 = \phi\gamma_0$, and so $\gamma_0 = \phi^2\gamma_0 + \sigma^2$, or
$$\gamma_0 = \frac{\sigma^2}{1 - \phi^2},$$
which is finite for $|\phi| < 1$. Furthermore, all autocovariances of positive order form a geometric series, $\gamma_l = \phi^l \gamma_0$. This suggests that an AR(1) model is a very natural fit for observed data whose autocorrelations decay exponentially: we recover the coefficient $\phi$ from the decay rate, and then the noise variance $\sigma^2$ from the overall variance.

For an AR(2) process, the same procedure gives
$$\begin{aligned}
\gamma_0 &= \phi_1 \gamma_1 + \phi_2 \gamma_2 + \sigma^2, \\
\gamma_1 &= \phi_1 \gamma_0 + \phi_2 \gamma_1, \\
\gamma_l &= \phi_1 \gamma_{l-1} + \phi_2 \gamma_{l-2} \quad\text{for } l \ge 2.
\end{aligned}$$
The first three of these (through $l = 2$) form an inhomogeneous linear system, the Yule-Walker equations, to be solved for $\gamma_0$, $\gamma_1$, and $\gamma_2$. Again, for $l \ge 2$, the autocovariances solve a linear recurrence relation in increasing lag that is the same recurrence relation as for the data series in increasing time; its solutions therefore decay exponentially if the original series is stationary. Conversely, if you knew the first few autocovariances of the series, you could solve these equations to determine $\phi_1$, $\phi_2$, and $\sigma^2$.

In general, an AR($p$) model gives an inhomogeneous linear system of size $p + 1$, the Yule-Walker equations, to be solved for the first $p + 1$ covariances $\gamma_0, \dots, \gamma_p$, followed by a recurrence relation with the same structure as the original problem for $\gamma_{p+1}, \gamma_{p+2}, \dots$. It is easy to see that the partial auto-correlations of an AR($p$) model are zero for $l > p$, since the dependence on previous values is explicitly included in the model.

5 ARMA models

It is natural to combine the MA($q$) model with the AR($p$) model to construct an ARMA($p,q$) model:
$$x_t - \phi_1 x_{t-1} - \cdots - \phi_p x_{t-p} = c + w_t + \theta_1 w_{t-1} + \cdots + \theta_q w_{t-q}.$$
You can get quite rich dynamics with fairly small values of $p$ and $q$.
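The dual cutoff structure of the two model families, and its disappearance for the combined model, can be seen directly from the theoretical functions. A closing R sketch (mine; all coefficient values are arbitrary illustrations):

    # AR(2): PACF cuts off after lag 2 while the ACF decays geometrically.
    round(ARMAacf(ar = c(0.5, 0.3), lag.max = 5, pacf = TRUE), 4)
    # MA(2): ACF cuts off after lag 2 while the PACF decays.
    round(ARMAacf(ma = c(0.5, 0.3), lag.max = 5), 4)
    # ARMA(1,1): neither function cuts off sharply.
    round(ARMAacf(ar = 0.5, ma = 0.4, lag.max = 5), 4)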
