Stationary Time Series, Conditional Heteroscedasticity, Random Walk, Test for a Unit Root, Endogeneity, Causality and IV Estimation


1 / 67 Stationary Time Series, Conditional Heteroscedasticity, Random Walk, Test for a Unit Root, Endogeneity, Causality and IV Estimation
Chapter 1, Financial Econometrics, Michael Hauser, WS18/19

2 / 67 Content
Notation
Conditional heteroscedasticity, GARCH, estimation, forecasting
Random walk, test for unit roots
Granger causality
Assumptions of the regression model
Endogeneity, exogeneity
Instrumental variable (IV) estimator
Testing for exogeneity: Hausman test and Hausman-Wu test
Appendix: OLS, weak stationarity, Wold representation, ACF, AR(1), MA(1), tests for WN, prediction of an AR(1)

3 / 67 Notation
X, Π ... matrices
x, β, ɛ ... column vectors
x, β, ɛ ... real values, single variables
i, j, t ... indices
L ... lag operator (we do not use the backward shift operator B)
WN, RW ... white noise, random walk
asy ... asymptotically
df ... degrees of freedom
e.g. ... exempli gratia, for example
i.e. ... id est, that is
i.g. ... in general
lhs, rhs ... left, right hand side
rv ... random variable
wrt ... with respect to

4 / 67 Conditional heteroscedasticity, GARCH models

5 / 67 Conditional heteroscedasticity
A stylized fact of stock returns and other financial return series is that periods of high volatility are followed by periods of low volatility. This rise and fall of volatility can be captured by GARCH models. The idea is that the arrival of new information increases the uncertainty in the market and so the variance; after some while the market participants settle on a new consensus and the variance decreases.
We assume, for the beginning, that the (log) returns $r_t$ are WN:
$r_t = \mu + \epsilon_t$ with $\epsilon_t \sim N(0, \sigma_t^2)$, $\sigma_t^2 = E(\epsilon_t^2 \mid I_{t-1})$
$\sigma_t^2$ is the predictor for $\epsilon_t^2$ given the information at period $(t-1)$, i.e. the conditional variance of $\epsilon_t$ given $I_{t-1}$, where $I_{t-1} = \{\epsilon_{t-1}^2, \epsilon_{t-2}^2, \ldots, \sigma_{t-1}^2, \sigma_{t-2}^2, \ldots\}$.

6 / 67 Conditional heteroscedasticity
Volatility is commonly measured either by $\epsilon_t^2$, the local variance ($r_t^2$ contains the same information), or by $|\epsilon_t|$, the modulus of $\epsilon_t$.

7 / 67 Generalized autoregressive conditional heteroscedasticity, GARCH
The generalized autoregressive conditional heteroscedasticity (GARCH) model of order (1,1), GARCH(1,1), uses the information set $I_{t-1} = \{\epsilon_{t-1}^2, \sigma_{t-1}^2\}$:
$\sigma_t^2 = a_0 + a_1 \epsilon_{t-1}^2 + b_1 \sigma_{t-1}^2$
The unconditional variance of $\epsilon_t$ is then
$E(\epsilon_t^2) = \dfrac{a_0}{1 - (a_1 + b_1)}$
[We set $E(\sigma_t^2) = E(\epsilon_t^2) = \sigma^2$, so $\sigma^2 = a_0 + a_1\sigma^2 + b_1\sigma^2$, and solve wrt $\sigma^2$.]
The unconditional variance is constant and exists if $a_1 + b_1 < 1$.
Further, as $\sigma_t^2$ is a variance and so positive, $a_0$, $a_1$ and $b_1$ have to be positive: $a_0, a_1, b_1 > 0$.
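A minimal sketch in base R of how the GARCH(1,1) recursion above generates data: the parameter values (a0 = 0.1, a1 = 0.1, b1 = 0.8, mu = 0.05) are purely illustrative assumptions, chosen so that a1 + b1 < 1.

# Simulate a GARCH(1,1) process with assumed parameters a0, a1, b1 (a1 + b1 < 1)
set.seed(1)
n  <- 2000
a0 <- 0.1; a1 <- 0.1; b1 <- 0.8
eps  <- numeric(n)
sig2 <- numeric(n)
sig2[1] <- a0 / (1 - a1 - b1)          # start at the unconditional variance
eps[1]  <- sqrt(sig2[1]) * rnorm(1)
for (t in 2:n) {
  sig2[t] <- a0 + a1 * eps[t - 1]^2 + b1 * sig2[t - 1]
  eps[t]  <- sqrt(sig2[t]) * rnorm(1)  # eps_t = sigma_t * xi_t, xi_t ~ N(0,1)
}
r <- 0.05 + eps                        # returns r_t = mu + eps_t, mu assumed to be 0.05
c(sample = var(r), unconditional = a0 / (1 - a1 - b1))

The simulated series r is reused in several of the sketches below.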

8 / 67 GARCH(r, s)
A GARCH model of order (r, s), GARCH(r, s), $r \ge 0$, $s > 0$, is
$\sigma_t^2 = a_0 + \sum_{j=1}^{s} a_j \epsilon_{t-j}^2 + \sum_{j=1}^{r} b_j \sigma_{t-j}^2$
with $\sum_{j=1}^{s} a_j + \sum_{j=1}^{r} b_j < 1$, $a_j, b_j > 0$.
An autoregressive conditional heteroscedasticity (ARCH) model of order s is a GARCH(0, s). E.g. ARCH(1) is
$\sigma_t^2 = a_0 + a_1 \epsilon_{t-1}^2$ with $a_1 < 1$, $a_0, a_1 > 0$.
Comment 1: A GARCH(1,0), $\sigma_t^2 = a_0 + b_1 \sigma_{t-1}^2$, is not useful as it describes a deterministic decay once a starting value for some past $\sigma_{t-j}^2$ is given.
Comment 2: The order s refers to the number of $a_j$ coefficients, r to the $b_j$ coefficients.

9 / 67 IGARCH(1,1)
An empirically relevant version is the integrated GARCH, IGARCH(1,1), model
$\sigma_t^2 = a_0 + a_1 \epsilon_{t-1}^2 + b_1 \sigma_{t-1}^2$ with $a_1 + b_1 = 1$, $a_0, a_1, b_1 > 0$
where the conditional variance exists, but not the unconditional one.

10 / 67 GARCH(1,1): ARMA and ARCH representations
We can define an innovation in the variance as
$\nu_t = \epsilon_t^2 - E(\epsilon_t^2 \mid I_{t-1}) = \epsilon_t^2 - \sigma_t^2$
Replacing $\sigma_t^2$ by $\sigma_t^2 = \epsilon_t^2 - \nu_t$ in the GARCH(1,1) model we obtain
$\epsilon_t^2 = a_0 + (a_1 + b_1)\epsilon_{t-1}^2 + \nu_t - b_1 \nu_{t-1}$
This model is an ARMA(1,1) for $\epsilon_t^2$. So the ACF of $r_t^2$ can be inspected for getting an impression of the dynamics. The model is stationary if $(a_1 + b_1) < 1$.
Recursive substitution of lags of $\sigma_t^2 = a_0 + a_1 \epsilon_{t-1}^2 + b_1 \sigma_{t-1}^2$ gives an infinite ARCH representation with a geometric decay:
$\sigma_t^2 = \dfrac{a_0}{1 - b_1} + a_1 \sum_{j=1}^{\infty} b_1^{j-1} \epsilon_{t-j}^2$
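Since the squared innovations follow an ARMA(1,1), their ACF should show persistent autocorrelation. A minimal base-R check, assuming the simulated series r from the sketch above is available:

# Inspect the ACF of the squared (demeaned) returns for ARCH effects
e <- r - mean(r)
acf(e^2, lag.max = 20)                         # slowly decaying autocorrelations
Box.test(e^2, lag = 12, type = "Ljung-Box")    # significant -> conditional heteroscedasticity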

11 / 67 Variants of the GARCH model
t-GARCH: As the unconditional distribution of $r_t$ can be seen as a mixture of normal distributions with different variances, GARCH is able to model fat tails. Empirically, however, the reduction in the kurtosis of $r_t$ achieved by a normal GARCH model is often not sufficient. A simple solution is to use an already fat-tailed distribution for $\epsilon_t$ instead of a normal one. Candidates are e.g. the t-distribution with df > 2, or the GED (generalized error distribution) with tail parameter $\kappa > 0$.
ARCH-in-mean, ARCH-M: If market participants are risk averse, they demand a higher average return in uncertain periods than in normal periods. So the mean return should be higher when $\sigma_t^2$ is high:
$r_t = \mu_0 + \mu_1 \sigma_t^2 + \epsilon_t$ with $\mu_1 > 0$, $\epsilon_t \sim N(0, \sigma_t^2)$

12 / 67 Variants of the GARCH model
Asymmetric GARCH, threshold GARCH, GJR model: The assumption that good and bad news have the same absolute (symmetric) effect might not hold. A useful variant is
$\sigma_t^2 = a_0 + a_1 \epsilon_{t-1}^2 + b_1 \sigma_{t-1}^2 + \gamma D_{t-1} \epsilon_{t-1}^2$
where the dummy variable D indicates a positive shock:
$D_{t-1} = 1$ if $\epsilon_{t-1} > 0$, $D_{t-1} = 0$ if $\epsilon_{t-1} \le 0$
So the contribution of $\epsilon_{t-1}^2$ to $\sigma_t^2$ is $(a_1 + \gamma)$ if $\epsilon_{t-1} > 0$, and $a_1$ if $\epsilon_{t-1} \le 0$.
If $\gamma < 0$, negative shocks have a larger impact on future volatility than positive shocks. (GJR stands for Glosten, Jagannathan, Runkle.)

13 / 67 Variants of the GARCH model
Exponential GARCH, EGARCH: By modeling the log of the conditional variance, $\log(\sigma_t^2)$, the EGARCH guarantees positive variances independently of the choice of the parameters. It can also be formulated in an asymmetric way ($\gamma \ne 0$):
$\log(\sigma_t^2) = a_0 + a_1 \left| \dfrac{\epsilon_{t-1}}{\sigma_{t-1}} \right| + b_1 \log(\sigma_{t-1}^2) + \gamma\, \dfrac{\epsilon_{t-1}}{\sigma_{t-1}}$
If $\gamma < 0$, positive shocks generate less volatility than negative shocks.
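A minimal sketch of how the variants above can be specified in R, assuming the rugarch package (not part of the lecture material); the series r is the simulated one from the earlier sketch. Note that rugarch's gjrGARCH puts the dummy on negative shocks, i.e. the opposite sign convention of the slide.

# Specifications for some GARCH variants (rugarch assumed installed)
library(rugarch)
spec_garch  <- ugarchspec(variance.model = list(model = "sGARCH",   garchOrder = c(1, 1)),
                          mean.model = list(armaOrder = c(0, 0)))
spec_tgarch <- ugarchspec(variance.model = list(model = "sGARCH",   garchOrder = c(1, 1)),
                          mean.model = list(armaOrder = c(0, 0)),
                          distribution.model = "std")                 # t-distributed errors
spec_gjr    <- ugarchspec(variance.model = list(model = "gjrGARCH", garchOrder = c(1, 1)),
                          mean.model = list(armaOrder = c(0, 0)))     # asymmetric / threshold
spec_egarch <- ugarchspec(variance.model = list(model = "eGARCH",   garchOrder = c(1, 1)),
                          mean.model = list(armaOrder = c(0, 0)))
spec_archm  <- ugarchspec(variance.model = list(model = "sGARCH",   garchOrder = c(1, 1)),
                          mean.model = list(armaOrder = c(0, 0), archm = TRUE))  # ARCH-in-mean
fit_gjr <- ugarchfit(spec_gjr, data = r)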

14 / 67 Estimation of the GARCH(r, s) model
Say $r_t$ can be described more generally as
$r_t = x_t'\theta + \epsilon_t$
with a vector $x_t$ of some other variables, $x_t = (x_{1t}, \ldots, x_{mt})$, possibly including lagged $r_t$'s, seasonal dummies, etc.
$\epsilon_t$ with conditional heteroscedasticity can be written as
$\epsilon_t = \sigma_t \xi_t$ with $\xi_t$ iid $N(0, 1)$
where $\sigma_t^2$ has the form $\sigma_t^2 = a_0 + a_1 \epsilon_{t-1}^2 + \ldots + a_s \epsilon_{t-s}^2 + b_1 \sigma_{t-1}^2 + \ldots + b_r \sigma_{t-r}^2$.
So the conditional density of $r_t \mid x_t, I_{t-1}$ is given by
$f_N(r_t \mid x_t, I_{t-1}) = \dfrac{1}{\sqrt{2\pi\sigma_t^2}} \exp\left(-\dfrac{1}{2}\dfrac{\epsilon_t^2}{\sigma_t^2}\right), \quad t = \max(r, s) + 1, \ldots, n$
Comments: $V(\sigma_t \xi_t \mid x_t, I_{t-1}) = \sigma_t^2 V(\xi_t \mid x_t, I_{t-1}) = \sigma_t^2$; starting values $\sigma_1^2, \ldots, \sigma_r^2$ are chosen appropriately; $\epsilon_t = \epsilon_t(\theta)$, $\sigma_t^2 = \sigma_t^2(a_0, a_1, \ldots, a_s, b_1, \ldots, b_r)$.

15 / 67 Estimation of the GARCH model
The ML (maximum likelihood) estimator of the parameter vector $\theta, a_0, a_1, \ldots, a_s, b_1, \ldots, b_r$ is given by
$\max_{\theta, a_0, a_1, \ldots, a_s, b_1, \ldots, b_r} \;\prod_{t=\max(r,s)+1}^{n} f_N(r_t \mid x_t, I_{t-1})$
as the $\epsilon$'s are uncorrelated. The estimates are asy normally distributed; standard t-tests, etc. apply.
Note the requirement of uncorrelated $\epsilon$'s. If autocorrelation in the returns $r_t$ is ignored, the model is misspecified, and you may detect ARCH effects although they do not exist. So in a first step model the level of $r_t$, e.g. by an ARMA, and then fit GARCH models to the residuals.
Remark: If $r_t$ is autocorrelated, then i.g. $r_t^2$ is autocorrelated as well.
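A minimal sketch of the joint ML estimation in R, again assuming the rugarch package; here the level of r_t is modeled by an AR(1) mean and the variance by a GARCH(1,1), estimated in one step rather than the two-step route described above.

# ARMA(1,0)-GARCH(1,1) fitted by ML (rugarch assumed; r from the simulation sketch)
library(rugarch)
spec <- ugarchspec(variance.model = list(model = "sGARCH", garchOrder = c(1, 1)),
                   mean.model     = list(armaOrder = c(1, 0), include.mean = TRUE))
fit  <- ugarchfit(spec, data = r)
coef(fit)   # mu, ar1, omega (= a0), alpha1 (= a1), beta1 (= b1)
fit         # full output: standard errors, t-tests, Ljung-Box tests on residuals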

16 / 67 Forecasting a GARCH(1,1) process
Using $\sigma^2 = a_0/(1 - (a_1 + b_1))$ in $\sigma_t^2 = a_0 + a_1\epsilon_{t-1}^2 + b_1\sigma_{t-1}^2$ we rewrite the model as
$\sigma_t^2 - \sigma^2 = a_1[\epsilon_{t-1}^2 - \sigma^2] + b_1[\sigma_{t-1}^2 - \sigma^2]$
So the 1-step ahead forecast $E(\epsilon_{t+1}^2 \mid I_t)$ is
$\sigma_{t+1|t}^2 = E(\epsilon_{t+1}^2 \mid I_t) = \sigma^2 + a_1[\epsilon_t^2 - \sigma^2] + b_1[\sigma_t^2 - \sigma^2]$
with the 1-step ahead forecast error $\epsilon_{t+1}^2 - \sigma_{t+1|t}^2 = \nu_{t+1}$.
The h-step ahead forecast, $h \ge 2$, is [replacing both $\epsilon_t^2$ and $\sigma_t^2$ by their $(h-1)$-step ahead forecast $\sigma_{t+h-1|t}^2$]
$\sigma_{t+h|t}^2 = \sigma^2 + (a_1 + b_1)[\sigma_{t+h-1|t}^2 - \sigma^2] = \ldots = \sigma^2 + (a_1 + b_1)^{h-1}\left[a_1(\epsilon_t^2 - \sigma^2) + b_1(\sigma_t^2 - \sigma^2)\right]$
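A minimal sketch of the h-step forecast formula above, computed by hand from the fitted GARCH(1,1) of the previous sketch and compared to the package forecast; the coefficient names (omega, alpha1, beta1) and accessor functions are rugarch conventions, assumed here.

# h-step variance forecasts from the closed-form recursion vs. ugarchforecast
p      <- coef(fit)
a0     <- p["omega"]; a1 <- p["alpha1"]; b1 <- p["beta1"]
sig2u  <- a0 / (1 - a1 - b1)                        # unconditional variance
eps_n  <- as.numeric(tail(residuals(fit), 1))       # last in-sample residual
sig2_n <- as.numeric(tail(sigma(fit), 1))^2         # last conditional variance
h      <- 1:10
fc     <- sig2u + (a1 + b1)^(h - 1) * (a1 * (eps_n^2 - sig2u) + b1 * (sig2_n - sig2u))
cbind(manual = fc, rugarch = sigma(ugarchforecast(fit, n.ahead = 10))^2)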

17 / 67 Random walk and unit root test
Random walk, I(1), Dickey-Fuller test

18 / 67 Random walk, RW
A process $y_t = \alpha y_{t-1} + \epsilon_t$ with $\alpha = 1$ is called a random walk:
$y_t = y_{t-1} + \epsilon_t$ with $\epsilon_t$ WN
Taking the variances on both sides gives $V(y_t) = V(y_{t-1}) + \sigma^2$. This has a solution $V(y)$ only if $\sigma^2 = 0$. So no unconditional variance of $y_t$ exists.
Starting the process at $t = 0$, its explicit form is $y_t = y_0 + \sum_{j=1}^{t} \epsilon_j$. The conditional expectation and conditional variance are
$E(y_t \mid y_0) = y_0, \quad V(y_t \mid y_0) = t\,\sigma^2$
The conditional variance of a RW increases with t. The process is nonstationary.
The associated characteristic polynomial $1 - z = 0$ [as $(1 - L)y_t = \epsilon_t$] has a root on the unit circle, $|z| = 1$. Moreover, the root is $z = 1$: a unit root.

19 / 67 Random walks
There are 3 standard types of a random walk; $\epsilon_t$ is WN.
Random walk (without a drift): $y_t = y_{t-1} + \epsilon_t$
Random walk with a drift (c is the drift parameter): $y_t = c + y_{t-1} + \epsilon_t$, or $y_t = y_0 + c\,t + \sum_{j=1}^{t} \epsilon_j$
Random walk with drift and trend: $y_t = c + b\,t + y_{t-1} + \epsilon_t$

20 / 67 ARIMA(1,1,1)
Say we have a process
$y_t = 1.2 y_{t-1} - 0.2 y_{t-2} + \epsilon_t - 0.5 \epsilon_{t-1}$
written in lag polynomials as $(1 - 1.2L + 0.2L^2) y_t = (1 - 0.5L)\epsilon_t$.
The characteristic AR polynomial $1 - 1.2z + 0.2z^2 = 0$ has the roots $z_1 = 1/0.2 = 5$ and $z_2 = 1$. So we can factorize as
$(1 - 0.2L)(1 - L) y_t = (1 - 0.5L)\epsilon_t$
Replacing $(1 - L)y_t$ by $\Delta y_t = y_t - y_{t-1}$, the process may be written as
$(1 - 0.2L)\Delta y_t = (1 - 0.5L)\epsilon_t$ or $\Delta y_t = 0.2\,\Delta y_{t-1} + \epsilon_t - 0.5\epsilon_{t-1}$
So $\Delta y_t$, the differenced $y_t$, is a stationary ARMA(1,1) process. The original process $y_t$ (the level) is integrated of order 1, an ARIMA(1,1,1).
Taking simple differences removes the nonstationarity and brings us into the world of stationary processes, where elaborate techniques are available. Sometimes differencing twice is necessary.
Root of $(1 - \alpha L)$: $1 - \alpha z = 0$ with $z_1 = 1/\alpha$, so $1 - \alpha z = 1 - (1/z_1)z$.
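A minimal base-R sketch of the example above: simulate the ARIMA(1,1,1) and recover the ARMA(1,1) structure after differencing (parameter values from the slide, sample size illustrative).

# Simulate the ARIMA(1,1,1) of the slide and fit it on the level and on the differences
set.seed(2)
n   <- 500
eps <- rnorm(n)
y   <- numeric(n)
y[1:2] <- eps[1:2]
for (t in 3:n) y[t] <- 1.2 * y[t - 1] - 0.2 * y[t - 2] + eps[t] - 0.5 * eps[t - 1]
arima(y,       order = c(1, 1, 1))                        # d = 1 handles the unit root
arima(diff(y), order = c(1, 0, 1), include.mean = FALSE)  # both give ar1 ~ 0.2, ma1 ~ -0.5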

21 / 67 Test for a unit root, Dickey-Fuller test
We start with the model
$y_t = c + \alpha y_{t-1} + \epsilon_t, \quad \epsilon_t \sim WN$
and want to know whether $\alpha = 1$ (the process is a random walk) or $\alpha < 1$ (the process is stationary).
Estimation of $\alpha$ by OLS gives consistent estimates in both cases. For RWs they are even super-consistent and converge faster than at the usual $\sqrt{n}$ rate.
For the Dickey-Fuller test we subtract $y_{t-1}$ on both sides and get
$\Delta y_t = c + (\alpha - 1) y_{t-1} + \epsilon_t$
We estimate by OLS and calculate the standard t-statistic for $(\alpha - 1)$:
$\tau = \dfrac{\hat\alpha - 1}{se(\hat\alpha)}$

22 / 67 Dickey-Fuller test, DF test
Under the null hypothesis $H_0: \alpha = 1$, or $(\alpha - 1) = 0$, the $\tau$-statistic is distributed according to the distribution tabulated in Dickey-Fuller (1976). It has fatter tails than the t-distribution and is somewhat skewed to the left.
The distribution depends on the sample size and on the type of the RW model: without drift, with drift, with a linear trend:
A: $y_t = y_{t-1} + \epsilon_t$
B: $y_t = c + y_{t-1} + \epsilon_t$
C: $y_t = c + b\,t + y_{t-1} + \epsilon_t$
The alternative hypothesis is $H_A: \alpha < 1$, or $(\alpha - 1) < 0$.

23 / 67 Dickey-Fuller test, deterministic trend
Remark: Model C, $y_t = c + b\,t + y_{t-1} + \epsilon_t$, is used to distinguish between a RW and a deterministic (linear) trend. The test regression is
$\Delta y_t = c + b\,t + (\alpha - 1) y_{t-1} + \epsilon_t$
$H_0$: If the DF test leads to acceptance of the $H_0$ of a RW, then $y_t$ is a RW and not a deterministic trend. (It might be that the conditional mean has a quadratic trend.)
$H_A$: If the DF test rejects the $H_0$ of a RW, then $y_t$ does not follow a RW. If b is significant, $y_t$ might be modeled more adequately by a deterministic trend.

24 / 67 The augmented Dickey-Fuller test
In case $\epsilon_t$ is not WN, but (stationary and) autocorrelated, the Dickey-Fuller test is not reliable. Therefore we estimate the model augmented with lagged $\Delta y_t$'s to capture this autocorrelation and obtain approximately uncorrelated residuals:
$\Delta y_t = c + (\alpha - 1) y_{t-1} + \sum_{j=1}^{p} \phi_j \Delta y_{t-j} + \epsilon_t$
The maximal lag order p is chosen automatically by an information criterion of your choice, e.g. AIC, SBC.
Remark: Tests for bubbles ($\alpha > 1$) are found in Phillips, Shi and Yu (2015).
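A minimal sketch of the ADF test in R with tseries::adf.test (the lag order there follows the package's default rule rather than AIC/SBC); the two simulated series are illustrative.

# ADF test on a random walk vs. a stationary AR(1)
library(tseries)
set.seed(3)
rw <- cumsum(rnorm(300))                 # random walk: unit root
ar <- arima.sim(list(ar = 0.5), n = 300) # stationary AR(1)
adf.test(rw)   # large p-value: H0 of a unit root is not rejected
adf.test(ar)   # small p-value: unit root rejected in favour of stationarity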

25 / 67 Granger causality

26 / 67 Granger causality
Granger causality is actually a concept for the classification of predictability in stationary dynamic (AR(p)-type) models.
Restricted model, model 0:
$y_t = \alpha_0 + \alpha_1 y_{t-1} + \ldots + \alpha_p y_{t-p} + \epsilon_{0,t}$
We ask whether a stationary $x_t$ can help to forecast $y_t$, and consider the model augmented by lagged x's.
Unrestricted model, model 1:
$y_t = \alpha_0 + \alpha_1 y_{t-1} + \ldots + \alpha_p y_{t-p} + \beta_1 x_{t-1} + \ldots + \beta_q x_{t-q} + \epsilon_{1,t}$
X Granger-causes Y if at least one of the $\beta_j \ne 0$.

27 / 67 Testing via LR-test
The hypothesis of no Granger causation of X wrt Y can be tested for fixed p, q via an LR-test:
$H_0: \beta_1 = \ldots = \beta_q = 0$
$n \log\left(\dfrac{s_0^2}{s_1^2}\right) \sim \chi^2(q)$
The test statistic is distributed $\chi^2$ with q degrees of freedom. $\log(\sigma_0^2/\sigma_1^2)$ may be interpreted as the strength of the causality.

28 / 67 Testing via F-test, instantaneous causality
Or we test via a Wald F-statistic
$\dfrac{(RSS_0 - RSS_1)/q}{RSS_1/(n - p - q)} \;\sim\; F(q, n - p - q)$
If in the augmented model the current $x_t$ also plays a role, then this is called instantaneous causality between X and Y.
Both "Y Granger-causes X" and "X Granger-causes Y" might hold at the same time. This is called a feedback relationship.
[RSS ... residual sum of squares]
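A minimal sketch of the F-test version in R with lmtest::grangertest, on two simulated series where x leads y by one period (the data-generating process is an illustrative assumption).

# Granger-causality F-tests in both directions
library(lmtest)
set.seed(4)
n <- 500
x <- rnorm(n)
y <- 0.4 * c(0, head(x, -1)) + rnorm(n)   # y_t depends on x_{t-1}
grangertest(y ~ x, order = 2)             # H0: x does not Granger-cause y (rejected here)
grangertest(x ~ y, order = 2)             # reverse direction, to check for feedback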

29 / 67 Example, deficiencies
Example: Daily returns of various European stock indices were slightly Granger-caused by the New York stock exchange before 9/11. During the months after, however, adjustment speeded up, instantaneous causality rose and the lagged reaction vanished.
Granger causation is subject to the same deficiencies as the bivariate correlation coefficient:
Causation may vanish when including a third variable (spurious correlation), or
causation may turn out only after adding a third variable.
The space within which causation is searched for is relevant.

30 / 67 Assumptions of the classical regression model

31 / 67 The linear regression model
$y = X\beta + \epsilon$
The dependent variable y is $(n \times 1)$, a column of length n. The regressor matrix X is $(n \times K)$, $X = [x_{tk}]_{n \times K}$. The parameter vector $\beta$ is $(K \times 1)$ and the error term $\epsilon$ is $(n \times 1)$.
More explicitly,
$y = x_1 \beta_1 + \ldots + x_K \beta_K + \epsilon$
where $x_k$ is the k-th column of X, $x_k = [x_{tk}]_{n \times 1}$. In terms of single time points or individuals,
$y_t = x_t'\beta + \epsilon_t$
where $x_t$ ($x_i$) is the t-th (i-th) row of X.

32 / 67 Assumptions: review
A0. True model: We know the true model, and the model in question is specified correctly in the given form.
A1. Linear model: The model is linear (i.e. linear in the parameters).
A2. Full rank of X: No variable can be expressed as a linear combination of the others. All parameters are identified. No multicollinearity. As the regressors are stochastic, the condition is: $\text{plim}(X'X/n) = Q$ is finite, constant and nonsingular. If the x-variables all have mean zero, Q is the asymptotic covariance matrix. In the special case of a deterministic X, A2 reduces to the full rank (nonsingularity) of $(X'X)$.

33 / 67 Assumptions: review
A3. Exogeneity of the explanatory variables: The explanatory variables at any point of time, i, are independent of the current and all past and future errors: $E[\epsilon_i \mid x_{j1}, \ldots, x_{jK}] = 0$, $i, j = 1, \ldots, n$.
A4. Homoscedasticity and uncorrelated errors: The disturbances $\epsilon_t$ have finite and constant variance, $\sigma^2$. They are uncorrelated with each other.
A5. Exogenously generated data: The generating process of the x's is outside the model. The analysis is done conditionally on the observed X.
A6. Normal distribution: The disturbances are (approximately) normally distributed (for convenience). y is multivariate normal conditional on X, $y \mid X \sim N(\cdot, \cdot)$.

34 / 67 Endogeneity

35 / 67 Endogeneity: definition
Endogeneity exists if explanatory variables are correlated with the disturbance term.
We start with $y = X\beta + \epsilon$ and estimate by OLS, with $b = \hat\beta = (X'X)^{-1}X'y$. Then b may be written as
$b = \beta + (X'X)^{-1}X'\epsilon$
If $E(X'\epsilon)$ or $\text{plim}(X'\epsilon/n)$ is not zero, then the estimates are biased or inconsistent.
There are several models where endogeneity arises:
the errors-in-the-variables model,
simultaneity,
dynamic regression.

36 / 67 Endogeneity: the errors-in-the-(explanatory-)variables model
The true model is: $y = \beta x + \epsilon$
However, we observe only $x^o$, i.e. x with a (random) measurement error $\eta$ which we cannot control for: $x^o = x + \eta$
The model we estimate is: $y = \beta x^o + \epsilon^o$
So $\epsilon^o = \epsilon - \beta\eta$, and $Cov(x^o, \epsilon^o) \ne 0$:
$Cov(x^o, \epsilon^o) = Cov(x + \eta, \epsilon) - \beta\, Cov(x + \eta, \eta) \ne 0$
The first term can be assumed to be zero, $Corr(\epsilon, \eta) = 0$; the second one is different from zero. The measurement error causes inconsistency of the estimates. The bias in $\hat\beta$ does not vanish even as $n \to \infty$.
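A minimal base-R sketch of this attenuation bias: the true beta is 1, and the estimate based on the mismeasured regressor stays biased towards zero even in a large sample (all parameter values are illustrative assumptions).

# Errors-in-variables: OLS on the observed regressor is inconsistent
set.seed(5)
n  <- 10000
x  <- rnorm(n)
y  <- 1 * x + rnorm(n)          # true model: beta = 1
xo <- x + rnorm(n)              # observed regressor with measurement error
coef(lm(y ~ x))["x"]            # close to 1
coef(lm(y ~ xo))["xo"]          # biased towards 0 (roughly 0.5 here), despite large n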

37 / 67 Endogeneity: simultaneity
Simultaneity is a general phenomenon. It says that individual decisions in (economic) systems are interdependent; single independent decisions can hardly be separated out.
Example: In macroeconomics the simple consumption function describes the relationship between private consumption, C, and disposable income, Y:
$C = \alpha + \beta Y + \epsilon$
However, income is generated by consumption expenditure, and investment and government consumption, etc. Say the latter are aggregated and denoted by Z:
$Y = C + Z$
Both equations together give a simultaneous equation system.

38 / 67 Endogeneity: simultaneity
LS of the consumption function yields
$(a\;\; b)' = (\alpha\;\; \beta)' + [(1\;Y)'(1\;Y)]^{-1}(1\;Y)'\epsilon$
Since $Y = C + Z$,
$E[(1\;Y)'\epsilon] = E[(1\;(C + Z))'\epsilon] \ne 0$, as $E(C'\epsilon) \ne 0$ because $C = \alpha + \beta Y + \epsilon$,
and so $(a\;\; b)'$ is biased. This is a simple example of the simultaneous equation bias.
Example: It is difficult to distinguish empirically between pure univariate expectations and expectations influenced by the overall economic state. Further, consider expectations about the expectations of other subjects, ...

39 / 67 Strict / weak exogeneity
Assumption A3 says that current, past and future disturbances are uncorrelated with current explanatory variables:
$E[\epsilon_i \mid x_{j1}, \ldots, x_{jK}] = 0, \quad i, j = 1, \ldots, n$
or
$E[x_{jk}\,\epsilon_i] = 0, \quad i, j = 1, \ldots, n$, for all k.
We say the x variables are strictly exogenous. For weak exogeneity only
$E[x_{ik}\,\epsilon_i] = 0, \quad i = 1, \ldots, n$, for all k
is required.

40 / 67 Predetermined variables, dynamic regression
A variable x is called predetermined if
$E[x_t\,\epsilon_{t+s}] = 0, \quad s > 0$
Current $x_t$ is uncorrelated with all future $\epsilon$'s, but possibly correlated with past ones.
Consider the simple dynamic regression with $|\gamma| < 1$:
$y_t = \gamma y_{t-1} + \epsilon_t$
Repeated substitution gives
$y_t = \sum_{i=0}^{t-1} \gamma^i \epsilon_{t-i} + \gamma^t y_0 = \epsilon_t + \gamma\epsilon_{t-1} + \ldots + \gamma^{t-1}\epsilon_1 + \gamma^t y_0$
$y_{t-1}$ does not depend on $\epsilon_t, \epsilon_{t+1}, \ldots$. So $y_{t-1}$ is predetermined, and weakly exogenous. The lack of contemporaneous correlation of the regressor $y_{t-1}$ and $\epsilon_t$ is essential.
If $E(x_t\epsilon_t) = 0$, the consequences of the dynamic dependencies vanish at least asymptotically and we obtain consistent estimates, which might, however, be biased in small samples.

41 / 67 Instrumental variables

42 / 67 Instrumental variables, IV
A solution for the endogeneity problem (biases/inconsistencies) are IV estimators. Instrumental variables, short instruments, are essentially exogenous variables: by assumption they do not correlate with the disturbances, and they are used as proxies for endogenous explanatory variables.
The art is to find variables which are good proxies (and do not show endogeneity), so that the fit of the regression remains acceptable. Sometimes the loss of information due to the use of IVs (bad proxies) is larger than the bias induced by endogeneity. Then it is preferable to stay with the simple LS estimator.

43 / 67 IV estimation: a 2-step procedure
Say the model is
$y = X_1\beta_1 + X_2\beta_2 + \epsilon$
where $X_1$ contains the variables leading to the endogeneity problem. We choose at least as many instruments, the columns of Z, as there are variables in $X_1$, and regress (LS) $X_1$ on Z:
$X_1 = Z\gamma + \eta$
We get $\hat X_1 = Z\hat\gamma$, and replace $X_1$ in the original model by $\hat X_1$:
$y = \hat X_1\beta_1 + X_2\beta_2 + \epsilon$
The resulting LS estimate $b = b_{IV}$ is the instrumental variable estimator.
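A minimal sketch of the 2-step procedure in R, both by hand and via AER::ivreg (the AER package and the data-generating process with instrument z are assumptions for illustration). The same simulated variables are reused in the Hausman sketches further below.

# Two-step IV (2SLS) by hand and via ivreg on simulated data
library(AER)
set.seed(6)
n   <- 5000
z   <- rnorm(n)
u   <- rnorm(n)                             # common shock creating endogeneity
x1  <- 0.8 * z + u + rnorm(n)               # endogenous regressor (correlated with eps via u)
x2  <- rnorm(n)                             # exogenous regressor
eps <- u + rnorm(n)
y   <- 1 + 1.5 * x1 + 0.5 * x2 + eps
coef(lm(y ~ x1 + x2))                       # OLS: biased coefficient on x1
x1_hat <- fitted(lm(x1 ~ z + x2))           # step 1: regress X1 on the instruments
coef(lm(y ~ x1_hat + x2))                   # step 2: replace X1 by its fitted values
coef(ivreg(y ~ x1 + x2 | z + x2))           # same point estimates, correct standard errors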

44 / 67 IV estimator: more formally
Formally we relax assumption A3 to AI3 (weak exogeneity):
AI3. $E(\epsilon_i \mid x_{i,1}, \ldots, x_{i,K}) = 0, \quad i = 1, \ldots, n$
Only the contemporaneous correlation has to vanish.
We show that the estimate $b_{IV}$ is asymptotically normal. For notational convenience the model is the following:
$y = X\beta + \epsilon$, with X $(n \times K)$ and Z $(n \times K)$
There are, say, K instruments in Z, so that $X'Z$ has rank K. LS of X on Z gives, with $X = Z\gamma + \eta$, $\gamma$ $(K \times K)$ and $\eta$ $(n \times K)$,
$\hat\gamma = (Z'Z)^{-1}Z'X$ and $\hat X = Z\hat\gamma = Z(Z'Z)^{-1}Z'X$

45 / 67 IV estimator: more formally
Now we estimate by LS
$y = \hat X\beta + \epsilon$
$b = b_{IV} = (\hat X'\hat X)^{-1}\hat X' y = [X'Z(Z'Z)^{-1}Z'X]^{-1}\,X'Z(Z'Z)^{-1}Z' y = (Z'X)^{-1}Z'y$

46 / 67 IV estimator: unbiasedness, consistency
We replace y by $y = X\beta + \epsilon$. So
$b_{IV} = \beta + (Z'X)^{-1}Z'\epsilon$
For consistency we need that
$\text{plim}(Z'Z/n) = Q_{ZZ}$, a positive definite matrix,
$\text{plim}(Z'X/n) = Q_{ZX} = Q_{XZ}'$, a nonzero matrix,
$\text{plim}(X'X/n) = Q_X$, a positive definite matrix, and
$\text{plim}(Z'\epsilon/n) = 0$, as Z are valid instruments.
Then
$\text{plim}\, b_{IV} = \beta + \text{plim}(Z'X/n)^{-1}(Z'\epsilon/n) = \beta + Q_{ZX}^{-1}\,\text{plim}(Z'\epsilon/n)$
and so $\text{plim}\, b_{IV} = \beta$. The IV estimator is consistent.

47 / 67 IV estimator: asy normality
Application of a CLT (central limit theorem) gives the asymptotic distribution of $b_{IV}$. The expectation is obtained by using (weak) AI3, $E(\epsilon_t z_t) = 0$, the variance by considering $(Z'\epsilon/\sqrt{n})$.
Under very general conditions (moments up to order $(2 + \epsilon)$, $\epsilon > 0$, exist and the errors are weakly dependent) it holds that
$Z'\epsilon/\sqrt{n} \;\overset{d}{\to}\; N[0, \sigma^2 Q_{ZZ}]$
Remark: If the disturbances are heteroscedastic with covariance matrix $\Omega \ne \sigma^2 I$, the asymptotic variance of $(Z'\epsilon/\sqrt{n})$ would be $\text{plim}(Z'\Omega Z/n)$ instead.
The IV estimator is asymptotically distributed as
$b_{IV} \;\overset{a}{\sim}\; N[\beta,\ (\sigma^2/n)\, Q_{ZX}^{-1} Q_{ZZ} Q_{XZ}^{-1}]$
Remark: $\text{asyVar}(Z'\epsilon/\sqrt{n}) = \text{plim}(Z'\epsilon/\sqrt{n})(Z'\epsilon/\sqrt{n})' = \text{plim}(Z'\epsilon\epsilon'Z)/n = \sigma^2 Q_{ZZ}$

48 / 67 Hausman test: testing for endogeneity
If there is a suspicion of endogeneity about the X's, we want to check for it. The Hausman test offers a possibility. We compare the estimates of
$y = X b_{LS} + \epsilon_{LS}$ and $y = X b_{IV} + \epsilon_{IV}$

49 / 67 Hausman test: testing for endogeneity
The idea is that under the null hypothesis of no endogeneity both OLS and IV are consistent, though IV is less efficient (because we use proxies as instruments):
$\text{Asy.Var}[b_{IV}] - \text{Asy.Var}[b_{LS}]$ ... nonnegative definite
Further, under the null it should hold that
$E(d) = E(b_{IV} - b_{LS}) = 0$
However, under the alternative of endogeneity LS is biased/inconsistent, and the estimates differ: $E(d) \ne 0$. A Wald statistic based thereon is
$H = d'\{\text{Est.Asy.Var}[d]\}^{-1} d$
with Est.Asy.Var[d] the estimated asymptotic variance of d.

50 / 67 Hausman test: testing for endogeneity
Est.Asy.Var[d] depends on the variances of $b_{IV}$, $b_{LS}$ and their covariance. However, the covariance between an efficient estimator, $b_E$, and its difference from an inefficient one, $b_I$, is zero (without proof):
$Cov(b_E, b_I - b_E) = Cov(b_E, b_I) - Cov(b_E, b_E) = 0$
So $Cov(b_E, b_I) = Var(b_E)$. Applied to $b_{IV}$ and $b_{LS}$:
$\text{Asy.Var}(b_{IV} - b_{LS}) = \text{Asy.Var}(b_{IV}) - \text{Asy.Var}(b_{LS})$
This reduces H under the null of no endogeneity to
$H = \dfrac{1}{\hat\sigma^2}\, d'\left[(\hat X'\hat X)^{-1} - (X'X)^{-1}\right]^{-1} d \;\sim\; \chi^2(K^*)$
where $K^* = K - K_0$, and $K_0$ is the number of explanatory variables which are not under consideration wrt endogeneity; $K^*$ is the number of possibly endogenous variables.
[For $y = X\beta + \epsilon$ the variance of $b_{LS}$ is $V(b) = \sigma^2(X'X)^{-1}$.]
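A minimal sketch of the Hausman contrast for the single suspected-endogenous coefficient (x1), using the OLS and IV fits on the simulated data from the IV example above; AER::ivreg and the chi-square with 1 df (one endogenous regressor) are assumptions of this illustration.

# Hausman contrast d = b_IV - b_LS for the coefficient on x1
library(AER)
ols <- lm(y ~ x1 + x2)
iv  <- ivreg(y ~ x1 + x2 | z + x2)
d   <- coef(iv)["x1"] - coef(ols)["x1"]
vd  <- vcov(iv)["x1", "x1"] - vcov(ols)["x1", "x1"]   # Var(b_IV) - Var(b_LS)
H   <- as.numeric(d^2 / vd)                           # chi-square with 1 df under H0
pchisq(H, df = 1, lower.tail = FALSE)                 # small p-value -> endogeneity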

51 / 67 Hausman test
The Hausman test is more generally applicable: we compare an efficient and an inefficient estimator which are both consistent under the null, while under the alternative the efficient one becomes inconsistent and the inefficient one remains consistent.

52 / 67 Hausman-Wu test: testing for endogeneity
A simple related test for endogeneity is the Hausman-Wu test. Under the null we consider the model
$y = X\beta + \epsilon$
Under the alternative we consider the augmented model
$y = X\beta + \hat X^*\gamma + \epsilon$
where $\hat X^*$ are the $K^*$ explanatory variables under suspicion of causing endogeneity, approximated by their estimates using the instruments.
The idea is that under the null hypothesis of no endogeneity, $\hat X^*$ represents an irrelevant additional variable, so $\gamma = 0$.

53 / 67 Hausman-Wu test: testing for endogeneity
Under the alternative the null model yields biased estimates.
The test statistic is a common F-test with $K^*$ and $(n - K - K^*)$ degrees of freedom, where the restricted model ($\gamma = 0$) is tested against the unrestricted one ($\gamma \ne 0$).
Important to note: The test depends essentially on the choice of appropriate instruments.
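A minimal sketch of the regression-based Hausman-Wu test on the same simulated IV data: the first-stage fitted values of x1 are added to the regression and their joint significance is F-tested (here only one suspected regressor, so one extra column).

# Hausman-Wu F-test: restricted vs. augmented model
x1_hat <- fitted(lm(x1 ~ z + x2))       # first stage using the instruments
m0 <- lm(y ~ x1 + x2)                   # restricted model (gamma = 0)
m1 <- lm(y ~ x1 + x2 + x1_hat)          # augmented model
anova(m0, m1)                           # F-test; rejection indicates endogeneity of x1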

54 / 67 Exercises, references and appendix

55 / 67 Exercises
Choose 2 out of (2, 3, 4, 5) and 2 out of (1G, 6G).
1G Use EViews and Ex1_1_gspc_garch.wf1. Find an appropriate EGARCH model for the return series of the S&P 500, for the period ... to ... . First test for a unit root in the stock prices, then specify and estimate the ARMA and GARCH models for the returns separately. Compare standard GARCH, GARCH-M, asymmetric GARCH, t-GARCH and EGARCH. [For estimating ARMA-GARCH models in R see Ex1_gspc_garch_R.txt and Ex1_interest_rates_R.txt.]
2 Derive (a) the ARMA(1,1) and (b) the ARCH(∞) representation of a GARCH(1,1) model.

56 / 67 Exercises
3 Does the model $y_t = 1.7 y_{t-1} - 0.8 y_{t-2} + \epsilon_t$ have a unit root?
4 Derive the IV estimator and its asymptotic distribution for homoscedastic and uncorrelated errors.
5 Write down the F-statistic for the Hausman-Wu test.
6G Investigate the type and strength of causality between the US and European stock index returns around 9/11. Test for various types of causality: X causes Y, instantaneous causality, and feedback effects (X causes Y, and Y causes X). Use Ex1_6_Granger_R.txt.

57 / 67 References
Greene 13.2
Ramanathan 10
Verbeek 8
Phillips, Shi and Yu (2015): Testing for Multiple Bubbles: Historical Episodes of Exuberance and Collapse in the S&P 500, International Economic Review.

58 / 67 Appendix
Useful formulas for OLS
White noise, Wold representation, ACF, AR(1), MA(1), lag polynomial, roots

59 / 67 Some useful formulas for OLS
Model: $y = X\beta + \epsilon$
The normal equations give the first order conditions for $\min_\beta(\epsilon'\epsilon)$:
$(X'X)b = X'y$
LS solution for $\beta$:
$b = (X'X)^{-1}X'y$
In order to see the (potential) unbiasedness of b:
$b = \beta + (X'X)^{-1}X'\epsilon$
The variance-covariance matrix of the estimate, with $Var(\epsilon) = \sigma^2 I$:
$Var(b) = E[(b - \beta)(b - \beta)'] = \sigma^2 (X'X)^{-1}$
The residual vector e:
$e = y - Xb = y - X(X'X)^{-1}X'y = [I - X(X'X)^{-1}X']y = My$
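A minimal base-R sketch computing the formulas above directly with matrix algebra on fresh simulated data (all names and values illustrative).

# OLS by matrix algebra: b, residuals, Var(b)
set.seed(7)
n <- 200
X <- cbind(1, rnorm(n), rnorm(n))                 # constant plus two regressors
beta <- c(1, 2, -1)
y <- X %*% beta + rnorm(n)
b  <- solve(crossprod(X), crossprod(X, y))        # b = (X'X)^{-1} X'y
e  <- y - X %*% b                                 # residuals e = My
s2 <- sum(e^2) / (n - ncol(X))                    # estimate of sigma^2
Vb <- s2 * solve(crossprod(X))                    # Var(b) = sigma^2 (X'X)^{-1}
cbind(b, sqrt(diag(Vb)))                          # matches coef/se of lm(y ~ X[, -1])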

60 / 67 White noise, WN
A stochastic process $\{x_t\}$, $t = \ldots, -1, 0, 1, \ldots$, is a sequence of rv's, where the rv's have common properties, e.g. for strict stationarity finite dimensional common distributions.
A stochastic process $\{x_t\}$ is a white noise, WN, $x_t = \epsilon_t$, if for $x_t$ (and so also for $\epsilon_t$)
the (unconditional) mean $E(x_t) = E(\epsilon_t) = 0$ for all t,
the (unconditional) variance $V(x_t) = V(\epsilon_t) = \sigma^2$ is fixed (independent of t), and
the autocorrelations $Corr(x_t, x_{t-j}) = Corr(\epsilon_t, \epsilon_{t-j}) = 0$ for $j \ne 0$.
More generally, $x_t$ may have a mean which is different from zero: $x_t = \mu + \epsilon_t$.

61 / 67 Weak stationarity, ACF
A process $\{x_t\}$ is weakly stationary if
$E(x_t) = \mu$, fixed, independent of t,
$V(x_t) = \gamma_0 = \sigma_x^2$, fixed, independent of t,
$Cov(x_t, x_{t-j}) = \gamma_j$, independent of t, depending only on the time distance j between the two periods t and $(t-j)$.
Weak stationarity refers only to the first two moments of the process. A common distribution is not required.
$\{\gamma_j\}$, $j \ge 0$, is called the autocovariance function, ACF, of $\{x_t\}$. The sample autocovariances are called the correlogram.

62 / 67 Wold representation theorem
Every weakly stationary process may be represented as a (one-sided) weighted sum of a WN process $\{\epsilon_t\}$:
$x_t = \mu + \sum_{j=0}^{\infty} \delta_j \epsilon_{t-j} = \mu + \epsilon_t + \delta_1\epsilon_{t-1} + \delta_2\epsilon_{t-2} + \ldots$
with $\sum_{j=0}^{\infty} \delta_j^2 < \infty$ (square summable) and normalized with $\delta_0 = 1$.
As for all t $E(\epsilon_t) = 0$, $V(\epsilon_t) = \sigma^2$ and $Corr(\epsilon_t, \epsilon_{t-j}) = 0$ for $j \ne 0$:
$E(x_t) = \mu + \sum_{j=0}^{\infty} \delta_j E(\epsilon_{t-j}) = \mu$
$V(x_t) = \sum_{j=0}^{\infty} \delta_j^2 V(\epsilon_{t-j}) = \sigma^2 \sum_{j=0}^{\infty} \delta_j^2 < \infty$
finite, fixed and independent of t.
Remark: $V(X + Y) = V(X) + V(Y)$ if $Corr(X, Y) = 0$.

63 / 67 AR(1)
An autoregressive process of order 1, AR(1), is given by
$(x_t - \mu) - \alpha(x_{t-1} - \mu) = \epsilon_t$
with $\epsilon_t$ WN and $|\alpha| < 1$. Written as a regression model:
$x_t = (1 - \alpha)\mu + \alpha x_{t-1} + \epsilon_t = c + \alpha x_{t-1} + \epsilon_t$
Its Wold representation is
$x_t = \mu + \sum_{j=0}^{\infty} \alpha^j \epsilon_{t-j}$
with
$V(x_t) = \gamma_0 = \sigma_x^2 = \sigma^2 \sum_{j=0}^{\infty} \alpha^{2j} = \dfrac{\sigma^2}{1 - \alpha^2}$
Its ACF decays geometrically:
$Corr(x_t, x_{t-j}) = \gamma_j/\gamma_0 = \alpha^j$

64 / 67 Forecasting with AR(1)
For forecasting we take the conditional expectation of $x_t$ based on the information available at $(t-1)$, $I_{t-1} = \{x_{t-1}, x_{t-2}, \ldots, \epsilon_{t-1}, \epsilon_{t-2}, \ldots\}$.
The predictor of an AR(1) process $x_t$ for period t is
$E(x_t \mid I_{t-1}) = (1 - \alpha)\mu + \alpha x_{t-1}$
The useful information reduces to $I_{t-1} = \{x_{t-1}\}$. In contrast, the unconditional expectation is $E(x_t) = \mu$.
The innovation (the stochastic part) $\epsilon_t$ driving the AR(1), $x_t = (1 - \alpha)\mu + \alpha x_{t-1} + \epsilon_t$, is
$\epsilon_t = x_t - E(x_t \mid I_{t-1})$
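A minimal base-R sketch of the AR(1) predictor: fit an AR(1) and compare predict() with the formula $(1-\alpha)\mu + \alpha x_{t-1}$ (simulated data; note that arima() labels the estimated mean "intercept").

# One-step AR(1) forecast via predict() and by hand
set.seed(8)
x   <- arima.sim(list(ar = 0.6), n = 300) + 5            # AR(1) around mean 5
fit <- arima(x, order = c(1, 0, 0))
alpha <- coef(fit)["ar1"]; mu <- coef(fit)["intercept"]  # "intercept" is the mean here
predict(fit, n.ahead = 1)$pred
(1 - alpha) * mu + alpha * tail(x, 1)                    # same value by hand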

65 / 67 AR(1), roots of the characteristic polynomial
The AR(1) can be written by means of the lag polynomial $(1 - \alpha L)$ as
$(1 - \alpha L)(x_t - \mu) = \epsilon_t$
The roots of the characteristic polynomial are given by $1 - \alpha z = 0$ with the solution $z = 1/\alpha$. The stationarity condition $|\alpha| < 1$ translates for the root to $|z| = |1/\alpha| > 1$: the root lies outside the unit circle.
This stationarity condition generalizes to AR(p) processes of higher order p:
$(1 - \alpha_1 L - \ldots - \alpha_p L^p)(x_t - \mu) = \epsilon_t$
$1 - \alpha_1 z - \ldots - \alpha_p z^p = 0, \quad |z| > 1$
The roots have to lie outside the unit circle.

66 / 67 MA(1)
A moving average process of order 1, MA(1), is given by
$(x_t - \mu) = \epsilon_t + \beta\epsilon_{t-1}$
with $\epsilon_t$ WN and $|\beta| < 1$ for invertibility. (Invertibility guarantees a unique AR representation. MAs are stationary independently of the choice of $\beta$.)
Its Wold representation is a finite sum of the $\epsilon$'s and is the model itself. Its ACF vanishes after lag 1:
$V(x_t) = \gamma_0 = \sigma_x^2 = (1 + \beta^2)\sigma^2$
$Corr(x_t, x_{t-1}) = \gamma_1/\gamma_0 = \beta/(1 + \beta^2) < 1/2$
$Corr(x_t, x_{t-j}) = \gamma_j/\gamma_0 = 0$ for $j > 1$.

67 / 67 Tests for white noise
The sample autocorrelations are denoted by $\hat\rho_j = \hat\gamma_j/\hat\gamma_0$. Under WN the sample autocorrelations (for any $j > 0$) are asy distributed as
$\hat\rho_j \;\overset{a}{\sim}\; N\!\left(-\dfrac{1}{n}, \dfrac{1}{n}\right)$
So a 95% coverage interval for $\hat\rho_j$ under WN is approximately $\left[-\dfrac{1}{n} - \dfrac{1.96}{\sqrt{n}},\; -\dfrac{1}{n} + \dfrac{1.96}{\sqrt{n}}\right]$.
A standard test for zero autocorrelation up to a maximal order m, $1 \le j \le m$, is the Box-Pierce test. Using a correction for the df's it is called the Ljung-Box test:
$Q = n(n+2)\sum_{j=1}^{m} \dfrac{\hat\rho_j^2}{n - j} \;\sim\; \chi^2_m$
Under $H_0: \rho_1 = \ldots = \rho_m = 0$, Q follows a chi-square distribution with m degrees of freedom.
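A minimal base-R sketch of these white-noise diagnostics on a WN series and on an autocorrelated AR(1) series (both simulated for illustration).

# ACF bounds and the Ljung-Box test
set.seed(9)
wn  <- rnorm(300)
ar1 <- arima.sim(list(ar = 0.6), n = 300)
acf(wn)                                              # dashed lines approx. +/- 1.96/sqrt(n)
Box.test(wn,  lag = 12, type = "Ljung-Box")          # large p-value: WN not rejected
Box.test(ar1, lag = 12, type = "Ljung-Box")          # small p-value: autocorrelation present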


More information

Regression with time series

Regression with time series Regression with time series Class Notes Manuel Arellano February 22, 2018 1 Classical regression model with time series Model and assumptions The basic assumption is E y t x 1,, x T = E y t x t = x tβ

More information

Lecture 4a: ARMA Model

Lecture 4a: ARMA Model Lecture 4a: ARMA Model 1 2 Big Picture Most often our goal is to find a statistical model to describe real time series (estimation), and then predict the future (forecasting) One particularly popular model

More information

GARCH Models Estimation and Inference. Eduardo Rossi University of Pavia

GARCH Models Estimation and Inference. Eduardo Rossi University of Pavia GARCH Models Estimation and Inference Eduardo Rossi University of Pavia Likelihood function The procedure most often used in estimating θ 0 in ARCH models involves the maximization of a likelihood function

More information

ARIMA Models. Jamie Monogan. January 25, University of Georgia. Jamie Monogan (UGA) ARIMA Models January 25, / 38

ARIMA Models. Jamie Monogan. January 25, University of Georgia. Jamie Monogan (UGA) ARIMA Models January 25, / 38 ARIMA Models Jamie Monogan University of Georgia January 25, 2012 Jamie Monogan (UGA) ARIMA Models January 25, 2012 1 / 38 Objectives By the end of this meeting, participants should be able to: Describe

More information

Problem Set 1 Solution Sketches Time Series Analysis Spring 2010

Problem Set 1 Solution Sketches Time Series Analysis Spring 2010 Problem Set 1 Solution Sketches Time Series Analysis Spring 2010 1. Construct a martingale difference process that is not weakly stationary. Simplest e.g.: Let Y t be a sequence of independent, non-identically

More information

Econometrics of Panel Data

Econometrics of Panel Data Econometrics of Panel Data Jakub Mućk Meeting # 6 Jakub Mućk Econometrics of Panel Data Meeting # 6 1 / 36 Outline 1 The First-Difference (FD) estimator 2 Dynamic panel data models 3 The Anderson and Hsiao

More information

Introduction to Eco n o m et rics

Introduction to Eco n o m et rics 2008 AGI-Information Management Consultants May be used for personal purporses only or by libraries associated to dandelon.com network. Introduction to Eco n o m et rics Third Edition G.S. Maddala Formerly

More information

Testing for non-stationarity

Testing for non-stationarity 20 November, 2009 Overview The tests for investigating the non-stationary of a time series falls into four types: 1 Check the null that there is a unit root against stationarity. Within these, there are

More information

388 Index Differencing test ,232 Distributed lags , 147 arithmetic lag.

388 Index Differencing test ,232 Distributed lags , 147 arithmetic lag. INDEX Aggregation... 104 Almon lag... 135-140,149 AR(1) process... 114-130,240,246,324-325,366,370,374 ARCH... 376-379 ARlMA... 365 Asymptotically unbiased... 13,50 Autocorrelation... 113-130, 142-150,324-325,365-369

More information

7 Introduction to Time Series

7 Introduction to Time Series Econ 495 - Econometric Review 1 7 Introduction to Time Series 7.1 Time Series vs. Cross-Sectional Data Time series data has a temporal ordering, unlike cross-section data, we will need to changes some

More information

Linear Regression with Time Series Data

Linear Regression with Time Series Data u n i v e r s i t y o f c o p e n h a g e n d e p a r t m e n t o f e c o n o m i c s Econometrics II Linear Regression with Time Series Data Morten Nyboe Tabor u n i v e r s i t y o f c o p e n h a g

More information

Christopher Dougherty London School of Economics and Political Science

Christopher Dougherty London School of Economics and Political Science Introduction to Econometrics FIFTH EDITION Christopher Dougherty London School of Economics and Political Science OXFORD UNIVERSITY PRESS Contents INTRODU CTION 1 Why study econometrics? 1 Aim of this

More information