Permanent Income Hypothesis (PIH) Instructor: Dmytro Hryshko


1 Permanent Income Hypothesis (PIH). Instructor: Dmytro Hryshko

2 The PIH

Utility is quadratic, $u(c_t) = -\frac{1}{2}(c_t - \bar{c})^2$; borrowing/saving is allowed using only the risk-free bond; $\beta(1+r) = 1$. The time-$t$ budget constraint (b.c.) is
$$\beta b_{t+1} + c_t = b_t + y_t.$$
The lifetime budget constraint (imposing $\lim_{T \to +\infty} \beta^T b_{T+1} = 0$) is
$$\sum_{j=0}^{\infty} \left(\frac{1}{1+r}\right)^j c_{t+j} = b_t + \sum_{j=0}^{\infty} \left(\frac{1}{1+r}\right)^j y_{t+j}.$$
At time $t$, the PIH consumer solves
$$\max_{\{c_{t+j},\, b_{t+j+1}\}_{j=0}^{\infty}} E_t \sum_{j=0}^{\infty} \beta^j u(c_{t+j})$$
subject to the sequence of budget constraints (b.c.).

3 Stochastic Euler equation:
$$E_t u'(c_{t+1}) = \beta(1+r)\, u'(c_t),$$
which, with quadratic utility and $\beta(1+r) = 1$, implies
$$E_t c_{t+1} = c_t, \quad \text{or} \quad E_t \Delta c_{t+1} = 0.$$
Consumption is said to be a martingale.

4 Excess sensitivity

A natural way to test the martingale hypothesis is to run the regression
$$\Delta c_{t+1} = \alpha + \beta X_t + \text{error},$$
and to test whether $\beta = 0$. $X_t$ is a variable known at $t$; for example, $X_t \in \{y_t, \Delta y_t\}$, lagged income or the lagged income change/growth. If $\beta$ is estimated to be statistically different from 0, consumption is said to be excessively sensitive to lagged information (income). Such tests are called excess sensitivity tests of consumption.
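A minimal sketch of the excess sensitivity test on simulated data. It assumes an AR(1) income process and uses the PIH response to an AR(1) shock, $r/(1+r-\phi)$, which is the special case of the MPC formula derived later in these notes; all parameter values are illustrative.

```python
import numpy as np

# Simulate AR(1) income and a PIH-consistent consumption change (which loads
# only on the current income innovation), then regress dc_{t+1} on dy_t.
rng = np.random.default_rng(0)
T, phi, r = 5_000, 0.6, 0.02
u = rng.normal(0, 1, T)                      # income innovations
y = np.zeros(T)
for t in range(1, T):
    y[t] = phi * y[t - 1] + u[t]             # AR(1) income
dy = np.diff(y)
mpc = r / (1 + r - phi)                      # PIH response to an AR(1) shock
dc = mpc * u[1:]                             # Delta c_t depends only on u_t

# OLS of dc_{t+1} on a constant and dy_t (lagged income change).
X = np.column_stack([np.ones(T - 2), dy[:-1]])
resp = dc[1:]
beta_hat, *_ = np.linalg.lstsq(X, resp, rcond=None)
resid = resp - X @ beta_hat
se = np.sqrt(np.diag(np.linalg.inv(X.T @ X)) * resid.var(ddof=2))
print("beta on lagged dy:", beta_hat[1], "t-stat:", beta_hat[1] / se[1])
# Under the PIH the estimated beta should be statistically indistinguishable from 0.
```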

5 Euler equations

The Euler equations for periods $t$ and $t+1$ are
$$E_t c_{t+1} = c_t, \qquad E_{t+1} c_{t+2} = c_{t+1}.$$
Apply the time-$t$ conditional expectation operator to both sides of the last equation and use the law of iterated expectations to obtain
$$E_t E_{t+1} c_{t+2} = E_t c_{t+1} \;\Rightarrow\; E_t c_{t+2} = c_t.$$
Following similar steps, we can show that $E_t c_{t+k} = c_t$ for all $k \geq 1$.

6 The lifetime budget constraint can be written as
$$E_t \sum_{j=0}^{\infty} \left(\frac{1}{1+r}\right)^j c_{t+j} = b_t + E_t \sum_{j=0}^{\infty} \left(\frac{1}{1+r}\right)^j y_{t+j},$$
or, since $E_t c_{t+j} = c_t$,
$$c_t \sum_{j=0}^{\infty} \left(\frac{1}{1+r}\right)^j = b_t + E_t \sum_{j=0}^{\infty} \left(\frac{1}{1+r}\right)^j y_{t+j}.$$
Thus,
$$c_t = \underbrace{\frac{r}{1+r}\left[\, b_t + E_t \sum_{j=0}^{\infty} \left(\frac{1}{1+r}\right)^j y_{t+j} \right]}_{\text{permanent income}}.$$

7 At time $t-1$,
$$c_{t-1} = \frac{r}{1+r} b_{t-1} + \frac{r}{1+r} E_{t-1}\left[ y_{t-1} + \frac{1}{1+r} y_t + \frac{1}{(1+r)^2} y_{t+1} + \ldots \right]$$
$$\phantom{c_{t-1}} = \frac{r}{1+r} b_{t-1} + \frac{r}{1+r} y_{t-1} + \frac{r}{(1+r)^2} E_{t-1} \sum_{j=0}^{\infty} \frac{1}{(1+r)^j}\, y_{t+j}.$$
Multiplying by $1+r$,
$$c_{t-1}(1+r) = r b_{t-1} + r y_{t-1} + \underbrace{\frac{r}{1+r}}_{=1-\beta} E_{t-1} \sum_{j=0}^{\infty} \frac{1}{(1+r)^j}\, y_{t+j},$$
so that
$$c_{t-1} = \underbrace{\left( r b_{t-1} + r y_{t-1} - r c_{t-1} \right)}_{= \frac{r}{1+r} b_t = (1-\beta) b_t} + (1-\beta)\, E_{t-1} \sum_{j=0}^{\infty} \beta^j y_{t+j}.$$
Subtracting the result from $c_t$ on the previous slide, we obtain
$$\Delta c_t = (1-\beta)\left[ E_t \sum_{j=0}^{\infty} \beta^j y_{t+j} - E_{t-1} \sum_{j=0}^{\infty} \beta^j y_{t+j} \right].$$

8 Equivalently,
$$\Delta c_t = \underbrace{\frac{r}{1+r} \sum_{j=0}^{\infty} \left(\frac{1}{1+r}\right)^j (E_t - E_{t-1})\, y_{t+j}}_{\text{innovation in the permanent income}},$$
where $(E_t - E_{t-1})\, x_{t+k} = E_t x_{t+k} - E_{t-1} x_{t+k}$. To understand more about the magnitude of consumption changes, we need to have some idea about the properties of the (stochastic) process for income, $\{y_t\}$.

9 Time series

A time series is a collection of observations $y_t$, each recorded at time $t$. We will talk of a time series as a collection of realizations of random variables $Y_t$. The time of recording an observation belongs to some set $T_0$. If $T_0$ is a discrete set, the time series is called discrete (daily, monthly, and annual time series are examples of discrete time series).

10 Sample path

In the data, we observe a sample path of $Y_t$: e.g., $y_1, y_2, \ldots, y_T$. We want to model the observed time series as a realization of a stochastic process $Y_t$, $t = 1, 2, \ldots, T$, keeping in mind that the process could have started before $t = 1$ and could run after $t = T$ (for example, $Y_t$ can be recorded at $t = 0, \pm 1, \pm 2, \ldots$).

11 Auto-covariance function

We want to construct a mathematical/statistical model that describes the data we observe. For a single time series, the dependence between $Y_t$ and $Y_t, Y_{t \pm 1}, Y_{t \pm 2}$, etc., is described by the auto-covariance function. The auto-covariance function $\gamma(\cdot, \cdot)$ for a stochastic process $\{Y_t, t \in T\}$ is defined by
$$\gamma(i, j) = E\left[(Y_i - EY_i)(Y_j - EY_j)\right], \quad i, j \in T.$$
$\gamma(i, i)$ is the variance of $Y$ at time $i$, $\gamma(i, i+1)$ is the auto-covariance between the $Y$'s recorded at times $i$ and $i+1$, etc. In general, these quantities can be time-dependent.

12 Weak stationarity

A stochastic process $\{Y_t, t = 0, \pm 1, \pm 2, \ldots\}$ is weakly stationary if
$$EY_t^2 < \infty \ \ \forall t, \qquad EY_t = \mu \ \ \forall t, \qquad \gamma(i, j) = \gamma(i+t, j+t) = \gamma(i-j) = \gamma(j-i), \ \ \forall i, j, t = 0, \pm 1, \pm 2, \ldots$$

13 If, e.g., $i - j = 1$, then for a weakly stationary process $\gamma(1)$ can be calculated as
$$\gamma(1) = E\left[(Y_t - EY_t)(Y_{t-1} - EY_{t-1})\right] = E\left[(Y_t - EY_t)(Y_{t+1} - EY_{t+1})\right], \quad \forall t.$$
Similarly,
$$\gamma(2) = E\left[(Y_t - EY_t)(Y_{t-2} - EY_{t-2})\right] = E\left[(Y_t - EY_t)(Y_{t+2} - EY_{t+2})\right],$$
and so on.

14 If you have a model for $Y_t$, the mean, variances, and auto-covariances can be estimated by simulating the model $S$ times for $t = 0, \pm 1, \pm 2, \ldots$ and averaging across the $S$ simulations. For example,
$$EY_t \approx \frac{1}{S} \sum_{s=1}^{S} y_t^s, \qquad EY_t^2 \approx \frac{1}{S} \sum_{s=1}^{S} (y_t^s)^2,$$
where $y_t^s$ is the value assumed by $Y$ at time $t$ in simulation $s$. In real data, we do not have the luxury of observing the process repeatedly, but we can infer the mean, variances, and auto-covariances of the process by calculating sample analogs of the population moments.
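A minimal sketch of this Monte Carlo approach, assuming an AR(1) model for $Y_t$; all parameter values are illustrative.

```python
import numpy as np

# Estimate E[Y_t] and E[Y_t^2] for an assumed AR(1) model by simulating S
# independent sample paths and averaging across them at a fixed date.
rng = np.random.default_rng(1)
S, T, phi, sigma = 10_000, 200, 0.8, 1.0
y = np.zeros((S, T))
for t in range(1, T):
    y[:, t] = phi * y[:, t - 1] + rng.normal(0, sigma, S)

t0 = T - 1                                            # pick one date
print("mean at t0    :", y[:, t0].mean())             # approx 0
print("second moment :", (y[:, t0] ** 2).mean())      # approx sigma^2/(1-phi^2)
print("theory        :", sigma**2 / (1 - phi**2))
```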

15 Random walk

Most observed time series are not stationary. An example is a random walk, which can be described by the process
$$Y_t = Y_{t-1} + u_t, \qquad u_t \sim iid(0, \sigma^2), \quad t = 1, \ldots, T, \quad Y_0 = 0.$$
Note that $Y_t = \sum_{j=1}^{t} u_j$ and $EY_t^2 = t\sigma^2$. A random walk is not covariance-stationary: its variance, $t\sigma^2$, depends on $t$ and grows without bound, violating the time-invariance required by weak stationarity. $\Delta Y_t\ (= Y_t - Y_{t-1})$, however, is covariance-stationary.
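A short sketch that verifies both claims by simulation; the parameter values are illustrative.

```python
import numpy as np

# Simulate many random-walk paths and check that Var(Y_t) grows linearly in t
# while Var(Delta Y_t) stays constant.
rng = np.random.default_rng(2)
S, T, sigma = 20_000, 100, 1.0
u = rng.normal(0, sigma, (S, T))
y = u.cumsum(axis=1)                                  # Y_t = sum of shocks, Y_0 = 0

for t in (10, 50, 100):
    print(f"Var(Y_{t}) = {y[:, t - 1].var():.2f}  (theory {t * sigma**2:.2f})")
dy = np.diff(y, axis=1)
print("Var(Delta Y_t) at t=50:", dy[:, 49].var())     # approx sigma^2 for any t
```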

16 For $y_1, y_2, \ldots, y_T$, a sample path of a stationary process $Y_t$, we can estimate the sample auto-covariance function, defined by
$$\hat{\gamma}(k) = \frac{1}{T} \sum_{j=1}^{T-k} (y_{j+k} - \bar{y})(y_j - \bar{y}), \quad 0 \leq k < T,$$
with $\hat{\gamma}(k) = \hat{\gamma}(-k)$ for $-T < k \leq 0$, and $\bar{y} = \frac{1}{T} \sum_{t=1}^{T} y_t$.
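A direct implementation of $\hat{\gamma}(k)$ as defined above, checked on simulated white noise; the variance used in the check is an illustrative value.

```python
import numpy as np

def sample_autocov(y, k):
    """Sample auto-covariance: (1/T) * sum_{j=1}^{T-k} (y_{j+k} - ybar)(y_j - ybar)."""
    y = np.asarray(y, dtype=float)
    T, k = len(y), abs(k)                    # gamma_hat(-k) = gamma_hat(k)
    ybar = y.mean()
    return np.sum((y[k:] - ybar) * (y[: T - k] - ybar)) / T

# Quick check on white noise: gamma_hat(0) near sigma^2, gamma_hat(k > 0) near 0.
rng = np.random.default_rng(3)
u = rng.normal(0, 2.0, 100_000)
print([round(sample_autocov(u, k), 3) for k in range(4)])
```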

17 ARMA processes

Stationary ARMA processes specify $Y_t$ as a function of current and past realizations of white noise. A stochastic process $U_t$ is called white noise ($WN$) with mean zero and variance $\sigma^2$ if
$$\gamma(0) = \sigma^2, \qquad \gamma(k) = 0, \quad k \neq 0.$$

18 An ARMA(p,q) process is described, for each $t = 0, \pm 1, \pm 2, \ldots$, by the following equation:
$$Y_t - \phi_1 Y_{t-1} - \phi_2 Y_{t-2} - \ldots - \phi_p Y_{t-p} = U_t + \theta_1 U_{t-1} + \theta_2 U_{t-2} + \ldots + \theta_q U_{t-q},$$
where $U_t \sim WN(0, \sigma^2)$.
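A small sketch that simulates an ARMA(p,q) process by direct recursion on this equation; the burn-in length and the ARMA(1,1) parameters are illustrative choices.

```python
import numpy as np

def simulate_arma(phi, theta, sigma, T, burn=500, seed=0):
    """Simulate Y_t = sum_i phi_i Y_{t-i} + U_t + sum_j theta_j U_{t-j}, U_t ~ N(0, sigma^2)."""
    rng = np.random.default_rng(seed)
    p, q = len(phi), len(theta)
    n = T + burn
    u = rng.normal(0, sigma, n)
    y = np.zeros(n)
    for t in range(max(p, q), n):
        ar = sum(phi[i] * y[t - 1 - i] for i in range(p))
        ma = sum(theta[j] * u[t - 1 - j] for j in range(q))
        y[t] = ar + u[t] + ma
    return y[burn:]                          # drop burn-in so start-up values do not matter

# Example: an ARMA(1,1) with phi = 0.7 and theta = 0.3.
y = simulate_arma(phi=[0.7], theta=[0.3], sigma=1.0, T=1_000)
print(y[:5])
```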

19 In more compact notation, a mean-zero ARMA(p,q) process is defined by
$$\phi(L) Y_t = \theta(L) U_t,$$
where $\phi(L) = \phi_0 L^0 - \phi_1 L - \phi_2 L^2 - \ldots - \phi_p L^p$, $\theta(L) = \theta_0 L^0 + \theta_1 L + \theta_2 L^2 + \ldots + \theta_q L^q$, $\theta_0 \equiv 1$, $\phi_0 \equiv 1$, and $L$ is the lag operator, so that $L^k x_t = x_{t-k}$, $k = 0, \pm 1, \pm 2, \ldots$

20 An MA(q) process is obtained by setting $\phi(L) \equiv 1$:
$$Y_t = \theta(L) U_t, \quad \text{where } \theta(L) = 1 + \theta_1 L + \theta_2 L^2 + \ldots + \theta_q L^q.$$
Similarly, an AR(p) process is obtained by setting $\theta(L) \equiv 1$:
$$\phi(L) Y_t = U_t.$$

21 An ARMA(p,q) process is said to be stationary if the roots of the AR polynomial,
$$1 - \phi_1 z - \phi_2 z^2 - \ldots - \phi_p z^p = 0,$$
are greater than 1 in modulus, i.e., lie outside the unit circle. E.g., consider an AR(1) process, $(1 - \phi L) Y_t = U_t$. It is stationary if the root of $1 - \phi z = 0$ is greater than 1 in absolute value. This happens if $|z| = |\phi^{-1}| > 1$, i.e., if $|\phi| < 1$. We already know that if $\phi$ is equal to 1, the process is not covariance-stationary; the same applies to all AR(1) processes with $|\phi| > 1$.
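A quick sketch of this root condition using numpy's polynomial root finder; the example coefficient vectors are illustrative.

```python
import numpy as np

def is_stationary(phi):
    """Check whether 1 - phi_1 z - ... - phi_p z^p has all roots outside the unit circle."""
    # np.roots expects coefficients ordered from the highest power down to the constant.
    coeffs = [-p for p in reversed(phi)] + [1.0]
    roots = np.roots(coeffs)
    return bool(np.all(np.abs(roots) > 1.0))

print(is_stationary([0.5]))          # AR(1) with phi = 0.5  -> True
print(is_stationary([1.0]))          # random walk (phi = 1) -> False
print(is_stationary([1.2, -0.3]))    # AR(2) example, checked via its two roots
```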

22 We can express an AR(1) process as
$$Y_t = \frac{U_t}{1 - \phi L} = (1 + \phi L + \phi^2 L^2 + \phi^3 L^3 + \ldots) U_t = U_t + \phi U_{t-1} + \phi^2 U_{t-2} + \phi^3 U_{t-3} + \ldots$$
Thus, an AR(1) process can be represented by an MA process of infinite order with particular restrictions on the moving-average coefficients.
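A quick numerical check that the implied MA($\infty$) coefficients (the impulse responses) of an AR(1) are $\psi_j = \phi^j$; $\phi = 0.7$ is an illustrative value.

```python
import numpy as np

# Feed a unit impulse through the AR(1) recursion and record the responses.
phi, J = 0.7, 8
psi = np.zeros(J)
y_prev = 0.0
for j in range(J):
    shock = 1.0 if j == 0 else 0.0           # unit shock at j = 0, zero afterwards
    y = phi * y_prev + shock
    psi[j] = y
    y_prev = y

print(np.allclose(psi, phi ** np.arange(J)))  # True: impulse response equals phi^j
```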

23 An ARMA(p,q) process is said to be invertible if the roots of the MA polynomial,
$$1 + \theta_1 z + \theta_2 z^2 + \ldots + \theta_q z^q = 0,$$
are greater than 1 in modulus, i.e., lie outside the unit circle. For an MA(1) process, this means that $|\theta| < 1$.

24 Note also that an MA(1) process can be expressed as
$$Y_t = (1 + \theta L) U_t = (1 - (-\theta) L) U_t,$$
or
$$\frac{Y_t}{1 - (-\theta) L} = U_t, \quad \text{i.e.,} \quad \left(1 + (-\theta) L + (-\theta)^2 L^2 + (-\theta)^3 L^3 + \ldots\right) Y_t = U_t,$$
or
$$Y_t = \theta Y_{t-1} - \theta^2 Y_{t-2} + \theta^3 Y_{t-3} - \ldots + U_t = -\sum_{j=1}^{\infty} (-\theta)^j Y_{t-j} + U_t.$$
That is, an invertible MA(1) process can be represented by an AR process of infinite order.

25 Auto-covariance function for an AR(1) process

If $(1 - \phi L) Y_t = U_t$, then $\psi_j = \phi^j$, $j = 0, 1, \ldots$, and
$$\gamma(0) = \sigma^2 (1 + \phi^2 + \phi^4 + \phi^6 + \ldots) = \frac{\sigma^2}{1 - \phi^2},$$
$$\gamma(1) = \sigma^2 \phi\, (1 + \phi^2 + \phi^4 + \phi^6 + \ldots) = \frac{\phi}{1 - \phi^2}\, \sigma^2,$$
$$\gamma(k) = \phi\, \gamma(k-1), \quad k \geq 1.$$
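A sketch comparing these theoretical auto-covariances with sample estimates from one long simulated AR(1) path; $\phi$, $\sigma$, and the sample length are illustrative.

```python
import numpy as np

rng = np.random.default_rng(4)
phi, sigma, T = 0.7, 1.0, 500_000
u = rng.normal(0, sigma, T)
y = np.zeros(T)
for t in range(1, T):
    y[t] = phi * y[t - 1] + u[t]             # simulate the AR(1)

ybar = y.mean()
gamma_hat = lambda k: np.mean((y[k:] - ybar) * (y[:T - k] - ybar))  # sample estimate
print("gamma(0):", gamma_hat(0), "theory:", sigma**2 / (1 - phi**2))
print("gamma(1):", gamma_hat(1), "theory:", phi * sigma**2 / (1 - phi**2))
print("gamma(2):", gamma_hat(2), "theory:", phi**2 * sigma**2 / (1 - phi**2))
```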

26 Auto-covariance function for an MA(1) process

If $Y_t = (1 + \theta L) U_t$, then $\psi_0 = 1$, $\psi_1 = \theta$, $\psi_j = 0$ for $j > 1$, and
$$\gamma(0) = \sigma^2 (1 + \theta^2), \qquad \gamma(1) = \theta \sigma^2, \qquad \gamma(k) = 0, \quad k > 1.$$
If the process is an MA(q), the auto-covariance function is zero for $k > q$.

27 Back to the PIH

$$\Delta c_t = \underbrace{\frac{r}{1+r} \sum_{j=0}^{\infty} \left(\frac{1}{1+r}\right)^j (E_t - E_{t-1})\, y_{t+j}}_{\text{innovation in the permanent income}}.$$
We need to know $(E_t - E_{t-1})\, y_t$, $(E_t - E_{t-1})\, y_{t+1}$, $(E_t - E_{t-1})\, y_{t+2}$, and so on.

28 Assume that the income process is a covariance-stationary MA($\infty$) process:
$$Y_t = \theta(L) U_t = U_t + \theta_1 U_{t-1} + \theta_2 U_{t-2} + \theta_3 U_{t-3} + \ldots$$
Then
$$(E_t - E_{t-1})\, y_t = u_t, \quad (E_t - E_{t-1})\, y_{t+1} = \theta_1 u_t, \quad (E_t - E_{t-1})\, y_{t+2} = \theta_2 u_t, \quad (E_t - E_{t-1})\, y_{t+3} = \theta_3 u_t, \; \ldots$$

29 For this income process,
$$\Delta c_t = \frac{r}{1+r}\left[ u_t + \frac{\theta_1}{1+r}\, u_t + \frac{\theta_2}{(1+r)^2}\, u_t + \frac{\theta_3}{(1+r)^3}\, u_t + \ldots \right] = \frac{r}{1+r}\left[ 1 + \frac{\theta_1}{1+r} + \frac{\theta_2}{(1+r)^2} + \frac{\theta_3}{(1+r)^3} + \ldots \right] u_t.$$
Note that $1 + \frac{\theta_1}{1+r} + \frac{\theta_2}{(1+r)^2} + \frac{\theta_3}{(1+r)^3} + \ldots = \theta(L)\big|_{L = \frac{1}{1+r}}$. Thus,
$$\Delta c_t = \underbrace{\left[ \frac{r}{1+r}\, \theta\!\left(\frac{1}{1+r}\right) \right]}_{\text{MPC out of the shock}} u_t.$$
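A small sketch that evaluates this MPC, $\frac{r}{1+r}\,\theta\!\left(\frac{1}{1+r}\right)$, for a given set of MA coefficients; the interest rate and the coefficients below are illustrative assumptions, not values from the slides.

```python
def mpc_ma(theta, r):
    """MPC out of an income shock when income is MA(infinity).

    theta is the list [theta_1, theta_2, ...]; theta_0 = 1 is implicit.
    """
    beta = 1.0 / (1.0 + r)
    theta_at_beta = 1.0 + sum(th * beta ** (j + 1) for j, th in enumerate(theta))
    return (r / (1.0 + r)) * theta_at_beta

print(mpc_ma(theta=[], r=0.02))            # iid income: MPC = r/(1+r), very small
print(mpc_ma(theta=[0.5, 0.25], r=0.02))   # more persistent income: a larger MPC
```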

30 If $Y_t$ is some general ARMA(p,q) process, $\phi(L) Y_t = \theta(L) U_t$, so that $Y_t = \frac{\theta(L)}{\phi(L)} U_t$, it can be shown that
$$\Delta c_t = \frac{r}{1+r}\, \frac{\theta\!\left(\frac{1}{1+r}\right)}{\phi\!\left(\frac{1}{1+r}\right)}\, u_t.$$
Note that the polynomial $\phi(L)$ may have a unit root.

31 Excess smoothness

Aggregate income in macro data is well fit by the following model:
$$\Delta y_t = \mu + \alpha \Delta y_{t-1} + u_t$$
$$(1 - L) y_t = \mu + \alpha (1 - L) L y_t + u_t$$
$$(1 - L)(1 - \alpha L) y_t = \mu + u_t$$
$$y_t = \frac{\mu}{(1 - L)(1 - \alpha L)} + \frac{u_t}{(1 - L)(1 - \alpha L)}.$$
Utilizing our formula for consumption changes in accordance with the PIH,
$$\Delta c_t = \frac{r}{1+r}\, \frac{1}{\left(1 - \frac{1}{1+r}\right)\left(1 - \frac{\alpha}{1+r}\right)}\, u_t = \left(\frac{1+r}{1+r-\alpha}\right) u_t.$$
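A quick computation of the implied aggregate MPC, $(1+r)/(1+r-\alpha)$, for a few values of $\alpha$; both $\alpha$ and $r$ below are illustrative, not the estimates referred to on the next slide.

```python
# PIH-implied MPC out of an aggregate income shock when
# (1 - L)(1 - alpha*L) y_t = mu + u_t.
r = 0.02
for alpha in (0.0, 0.2, 0.4):
    mpc = (1 + r) / (1 + r - alpha)
    print(f"alpha = {alpha:.1f}  ->  PIH MPC = {mpc:.2f}")
# Any alpha > 0 gives an MPC above 1, so predicted consumption changes are more
# volatile than income innovations: the excess-smoothness puzzle.
```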

32 Excess smoothness, contd.

Note that $\alpha$ is estimated to be positive in the aggregate data. Thus, the MPC out of the shock to aggregate income should be greater than 1. The implication for the variances is that
$$\text{var}^{PIH}(\Delta c_t) = \left(\frac{1+r}{1+r-\alpha}\right)^2 \sigma_u^2 > \sigma_u^2.$$
The variance of the innovation to the consumption change should be larger than the variance of the innovation to income. In the data, the reverse is true. This result is known as the excess smoothness of consumption.

33 PIH and structural income processes

Let's assume that the income process consists of two components, a permanent component and a transitory component:
$$y_t = \tau_t + w_t, \qquad \tau_t = \mu + \tau_{t-1} + u_t^P, \qquad w_t = \theta(L)\, u_t^T,$$
where $\tau_t$ is the permanent (in macro: "trend") component and $u_t^P$ is the permanent shock; $w_t$ is the mean-reverting, stationary (in macro: "cycle") component and $u_t^T$ is the transitory shock. $u_t^P \sim iid(0, \sigma^2_{u^P})$, $u_t^T \sim iid(0, \sigma^2_{u^T})$, and $u_t^T$, $u_t^P$ are uncorrelated at all leads and lags.

34 We assume that the consumer is able to differentiate between the permanent shocks (e.g., due to such events as promotion/demotion, permanent disability, etc.) and the transitory shocks (e.g., those emanating from temporary sickness, short spells of unemployment, bonuses, overtime, etc.) to his income. We want to predict the consumer's reaction to these distinct shocks, assuming the PIH is true.

35 We can express income in terms of current and past shocks as
$$(1 - L) y_t = \Delta \tau_t + \Delta w_t = \mu + u_t^P + (1 - L)\, \theta(L)\, u_t^T,$$
or
$$y_t = \tilde{\mu} + (1 - L)^{-1} u_t^P + \theta(L)\, u_t^T,$$
where $\tilde{\mu} = (1 - L)^{-1} \mu$. The PIH implies
$$\Delta c_t = \frac{r}{1+r}\left[ \left(1 - \frac{1}{1+r}\right)^{-1} u_t^P + \theta\!\left(\frac{1}{1+r}\right) u_t^T \right],$$
where $\theta(L) = 1 + \theta_1 L + \theta_2 L^2 + \ldots$, and $\theta\!\left(\frac{1}{1+r}\right) = 1 + \frac{\theta_1}{1+r} + \frac{\theta_2}{(1+r)^2} + \ldots$

36 In micro data, $\Delta y_t \sim$ MA(2) (e.g., Abowd and Card 1989). Thus, $w_t \sim$ MA(1) and $\theta(L) = 1 + \theta L$. It follows that
$$\Delta c_t = u_t^P + \frac{r}{1+r}\left(1 + \frac{\theta}{1+r}\right) u_t^T = u_t^P + \frac{r}{1+r}\left(\frac{1+r+\theta}{1+r}\right) u_t^T.$$
Meghir and Pistaferri (2004): $\hat{\theta} \in [0.17, \ldots]$. Assume $r = \ldots$; then $\frac{r}{1+r}\left(\frac{1+r+\theta}{1+r}\right) \approx 0.023$ for $\theta = \ldots$, and for $\theta = 0.17$ it is a very tiny fraction. Thus, the MPC out of the permanent shock is 1 and the MPC out of the transitory shock is at most 0.03 (3 cents per dollar).
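A small numerical sketch of the two MPCs; $r = 0.02$ is an assumed illustrative interest rate (not taken from the slides), and $\theta = 0.17$ is the lower end of the Meghir-Pistaferri range quoted above.

```python
# MPC out of permanent vs. transitory shocks when w_t is MA(1):
# MPC_P = (r/(1+r)) * (1 - 1/(1+r))^(-1) = 1,  MPC_T = (r/(1+r)) * (1+r+theta)/(1+r).
r, theta = 0.02, 0.17                        # r is an assumed illustrative value
mpc_permanent = (r / (1 + r)) * (1 / (1 - 1 / (1 + r)))          # equals 1 exactly
mpc_transitory = (r / (1 + r)) * ((1 + r + theta) / (1 + r))
print("MPC out of permanent shock :", mpc_permanent)             # 1.0
print("MPC out of transitory shock:", round(mpc_transitory, 4))  # a few cents per dollar
```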
