GENERALISED LEAST SQUARES AND RELATED TOPICS


Haris Psaradakis
Birkbeck, University of London

1 Nonspherical Errors

Consider the model

    y = Xβ + u,  E(u) = 0,  E(uu') = σ²Ω,

where Ω is a symmetric and positive definite n × n matrix (and, for simplicity, X is nonstochastic). This model allows the errors to be heteroskedastic and/or autocorrelated, and is often referred to as the generalised linear regression model.

If β̂_OLS = (X'X)⁻¹X'y is the OLS estimator of β, it can be shown that:

- E(β̂_OLS) = β + (X'X)⁻¹X'E(u) = β.
- var(β̂_OLS) = E[(X'X)⁻¹X'uu'X(X'X)⁻¹] = σ²(X'X)⁻¹X'ΩX(X'X)⁻¹. Crucially, since var(β̂_OLS) ≠ σ²(X'X)⁻¹, statistical inference based on σ̂²_OLS(X'X)⁻¹, with σ̂²_OLS = (y − Xβ̂_OLS)'(y − Xβ̂_OLS)/(n − k), is invalid and likely to be unreliable.
- plim β̂_OLS = β if the largest eigenvalue of Ω is bounded for all n and the largest eigenvalue of (X'X)⁻¹ tends to zero as n → ∞.
- σ̂²_OLS is, in general, a biased and inconsistent estimator of σ².

2 GLS Estimator

Assume that Ω is known. Since Ω is a (real) symmetric matrix with eigenvalues λ_1, ..., λ_n (say), it can be diagonalised as S'ΩS = Λ, where Λ = diag(λ_1, ..., λ_n) and S is an orthogonal matrix (S' = S⁻¹) with the eigenvectors of Ω as columns. The diagonal matrix Λ is positive definite and may be factored as Λ = Λ^{1/2}Λ^{1/2}, where Λ^{1/2} = diag(√λ_1, ..., √λ_n). Letting P = SΛ^{-1/2}, we have

    S'ΩS = Λ  ⟹  Ω = SΛS'  ⟹  Ω⁻¹ = (S')⁻¹Λ⁻¹S⁻¹ = SΛ^{-1/2}Λ^{-1/2}S' = PP',

and Ω = (P')⁻¹P⁻¹, so that P'ΩP = I_n.

By premultiplying the model by P', we can transform it into a specification which satisfies the usual Gauss-Markov assumptions. More specifically, we have P'y = P'Xβ + P'u, or

    y* = X*β + u*.

Notice that E(u*) = E(P'u) = 0 and E(u*u*') = E(P'uu'P) = σ²P'ΩP = σ²I_n. Since the transformed model satisfies the classical assumptions, efficient estimation of β can be achieved by OLS:

    β̂ = (X*'X*)⁻¹X*'y* = (X'PP'X)⁻¹X'PP'y = (X'Ω⁻¹X)⁻¹X'Ω⁻¹y.

This estimator is the generalised least squares (GLS) estimator of β. It can also be obtained as the solution to the problem

    min_β (y − Xβ)'Ω⁻¹(y − Xβ).
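The construction above can be checked numerically. The sketch below is not part of the original notes; the simulated design and all variable names are illustrative. It builds P from the eigendecomposition of a known Ω, runs OLS on the transformed data, and confirms that the result coincides with the closed-form expression (X'Ω⁻¹X)⁻¹X'Ω⁻¹y.

```python
import numpy as np

rng = np.random.default_rng(0)
n, k = 200, 3

# Design matrix and true coefficients (illustrative values)
X = np.column_stack([np.ones(n), rng.normal(size=(n, k - 1))])
beta = np.array([1.0, 2.0, -0.5])

# A known symmetric positive definite Omega (here an AR(1)-type correlation matrix)
Omega = 0.8 ** np.abs(np.subtract.outer(np.arange(n), np.arange(n)))

# Draw u with E(uu') = sigma^2 * Omega and generate y
sigma2 = 1.5
u = rng.multivariate_normal(np.zeros(n), sigma2 * Omega)
y = X @ beta + u

# P = S Lambda^{-1/2}, so that Omega^{-1} = P P' and P'Omega P = I
lam, S = np.linalg.eigh(Omega)              # Omega = S Lambda S'
P = S @ np.diag(lam ** -0.5)

# OLS on the transformed model y* = X* beta + u*
y_star, X_star = P.T @ y, P.T @ X
beta_gls_transformed = np.linalg.solve(X_star.T @ X_star, X_star.T @ y_star)

# Closed-form GLS: (X' Omega^{-1} X)^{-1} X' Omega^{-1} y
Omega_inv = np.linalg.inv(Omega)
beta_gls_direct = np.linalg.solve(X.T @ Omega_inv @ X, X.T @ Omega_inv @ y)

print(np.allclose(beta_gls_transformed, beta_gls_direct))   # True
print(beta_gls_direct)
```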

An unbiased estimator of σ² is

    σ̂² = (n − k)⁻¹(y* − X*β̂)'(y* − X*β̂)
        = (n − k)⁻¹(P'y − P'Xβ̂)'(P'y − P'Xβ̂)
        = (n − k)⁻¹(y − Xβ̂)'PP'(y − Xβ̂)
        = (n − k)⁻¹(y − Xβ̂)'Ω⁻¹(y − Xβ̂).

Properties of the GLS estimator:

- E(β̂) = β + (X*'X*)⁻¹X*'E(u*) = β.
- var(β̂) = σ²(X*'X*)⁻¹ = σ²(X'PP'X)⁻¹ = σ²(X'Ω⁻¹X)⁻¹.
- Since the transformed model satisfies the assumptions of the Gauss-Markov theorem, the GLS estimator (which is obtained by applying OLS to the transformed model) is the BLUE of β. This result is known as Aitken's theorem.
- If u ~ N(0, σ²Ω), then β̂ ~ N(β, σ²(X'Ω⁻¹X)⁻¹).
- Linear restrictions of the form H_0: Rβ − r = 0 (R being q × k with full rank q ≤ k) can be tested using the F-statistic

      F = (Rβ̂ − r)'[R(X'Ω⁻¹X)⁻¹R']⁻¹(Rβ̂ − r) / (q σ̂²),

  which is distributed as F(q, n − k) under H_0.
- If lim(n⁻¹X*'X*) = lim(n⁻¹X'Ω⁻¹X) = Q* is finite and positive definite, then β̂ →_p β; if n^{-1/2}X*'u* = n^{-1/2}X'Ω⁻¹u →_d N(0, σ²Q*), then √n(β̂ − β) →_d N(0, σ²Q*⁻¹) as n → ∞.
- σ̂² is unbiased and consistent.

3 Feasible GLS

In practice Ω is typically unknown, but it is often assumed to depend in a known way on a vector of unknown parameters α.

If α̂ is a consistent estimator of α, we can use Ω̂ = Ω(α̂) in lieu of Ω to obtain the feasible GLS (FGLS) estimator

    β̃ = (X'Ω̂⁻¹X)⁻¹X'Ω̂⁻¹y.

The FGLS estimator β̃ is consistent for β if plim(n⁻¹X'Ω̂⁻¹X) is finite and nonsingular and plim(n⁻¹X'Ω̂⁻¹u) = 0.

The FGLS estimator β̃ is asymptotically equivalent to the GLS estimator β̂ (so β̃ and β̂ have the same asymptotic distribution) if

    plim(n⁻¹X'Ω̂⁻¹X) = plim(n⁻¹X'Ω⁻¹X)  and  plim(n^{-1/2}X'Ω̂⁻¹u) = plim(n^{-1/2}X'Ω⁻¹u).

Except for some simple cases, the finite-sample properties and exact distribution of FGLS estimators are unknown.

We note that FGLS is possible only when some structure is imposed on Ω. With an unrestricted Ω, there are n(n + 1)/2 parameters in σ²Ω, which cannot be estimated from a sample of size n.
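As an illustration of FGLS when Ω depends on a low-dimensional parameter, the following sketch assumes a multiplicative-heteroskedasticity specification ω_t² = exp(z_t'α); this particular skedastic function, the simulated design and all names are assumptions made for the example, not something prescribed by the notes. Here α is estimated from an auxiliary regression of log û_t² on z_t, and FGLS then reduces to weighted least squares.

```python
import numpy as np

rng = np.random.default_rng(1)
n = 500

# Regressor x; z drives the error variance through the assumed skedastic function
x = rng.normal(size=n)
z = rng.uniform(size=n)
X = np.column_stack([np.ones(n), x])
Z = np.column_stack([np.ones(n), z])

alpha_true = np.array([-0.5, 2.0])
omega = np.exp(Z @ alpha_true)                  # Omega = diag(omega_t)
y = X @ np.array([1.0, 0.7]) + np.sqrt(omega) * rng.normal(size=n)

# Step 1: OLS and residuals
b_ols = np.linalg.solve(X.T @ X, X.T @ y)
u_hat = y - X @ b_ols

# Step 2: estimate alpha from an auxiliary regression of log(u_hat^2) on Z
a_hat = np.linalg.solve(Z.T @ Z, Z.T @ np.log(u_hat ** 2))
omega_hat = np.exp(Z @ a_hat)

# Step 3: FGLS = weighted least squares with weights 1 / omega_hat
W = 1.0 / omega_hat
b_fgls = np.linalg.solve(X.T @ (W[:, None] * X), X.T @ (W * y))
print(b_ols, b_fgls)
```

Note that the intercept of the auxiliary regression only rescales all weights by a constant, which leaves the FGLS coefficients unchanged.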

4 Maximum Likelihood

In the generalised linear regression model with u ~ N(0, σ²Ω) and known Ω, the GLS estimator is the MLE. To verify this, note that y ~ N(Xβ, σ²Ω), and so the likelihood function is

    L(β, σ²) = (2π)^{-n/2} [det(σ²Ω)]^{-1/2} exp{−(1/2)(y − Xβ)'(σ²Ω)⁻¹(y − Xβ)},

with

    log L = −(n/2) log 2π − (1/2) log det(σ²Ω) − (1/2)(y − Xβ)'(σ²Ω)⁻¹(y − Xβ)
          = −(n/2) log 2π − (n/2) log σ² − (1/2) log det Ω − (1/(2σ²))(y − Xβ)'Ω⁻¹(y − Xβ).

Noting that

    (y − Xβ)'Ω⁻¹(y − Xβ) = y'Ω⁻¹y − 2β'X'Ω⁻¹y + β'X'Ω⁻¹Xβ,

the necessary conditions for maximisation of log L are:

    ∂log L/∂β = −(1/(2σ²))(−2X'Ω⁻¹y + 2X'Ω⁻¹Xβ) = 0,
    ∂log L/∂σ² = −n/(2σ²) + (1/(2σ⁴))(y − Xβ)'Ω⁻¹(y − Xβ) = 0.

The solution of the likelihood equations is:

    β̂_ML = (X'Ω⁻¹X)⁻¹X'Ω⁻¹y = β̂,
    σ̂²_ML = n⁻¹(y − Xβ̂)'Ω⁻¹(y − Xβ̂).

We also have

    ∂²log L/∂β∂β' = −(1/σ²)X'Ω⁻¹X,
    ∂²log L/∂β∂σ² = −(1/σ⁴)X'Ω⁻¹(y − Xβ),
    ∂²log L/∂(σ²)² = n/(2σ⁴) − (1/σ⁶)(y − Xβ)'Ω⁻¹(y − Xβ),

and so the Fisher information matrix for (β, σ²) can be shown to be block diagonal with blocks σ⁻²X'Ω⁻¹X and n/(2σ⁴). Finally,

    [√n(β̂_ML − β), √n(σ̂²_ML − σ²)]' →_d N(0, diag(σ²Q*⁻¹, 2σ⁴)),

where Q* = lim(n⁻¹X'Ω⁻¹X).

5 Heteroskedasticity

Consider the following model with heteroskedastic errors:

    y = Xβ + u,  E(u) = 0,  E(uu') = σ²Ω,  Ω = diag(ω_1², ..., ω_n²).

This is a special case of the generalised linear regression model.

5.1 Heteroskedasticity-Robust Inference

As discussed earlier, the OLS estimator β̂_OLS = (X'X)⁻¹X'y of β is still unbiased and consistent (although not BLUE) in the presence of heteroskedasticity. Furthermore, it can be shown that, under fairly mild regularity conditions,

    β̂_OLS ~a N(β, n⁻¹σ²Q⁻¹MQ⁻¹),

where Q = plim(n⁻¹X'X) and M = plim(n⁻¹X'ΩX), and Q and M are assumed to be finite and positive definite.

If Ω is known, efficient estimates can be obtained by means of GLS or ML. If Ω is unknown, one may specify a model for ω_t², estimate this model (and hence Ω), and apply FGLS. However, efficiency gains from FGLS are guaranteed only if the model for the error variances is correct. For this reason, it has become popular to estimate β by OLS even when heteroskedasticity is suspected, but to adjust the standard errors and related test statistics so that they are valid in the presence of arbitrary heteroskedasticity.

Halbert White showed that it is possible to construct an estimator of the asymptotic covariance matrix of β̂_OLS which is consistent in the presence of heteroskedasticity of unknown form. Such an estimator is called a heteroskedasticity-consistent covariance matrix estimator. Recall that the asymptotic covariance matrix of β̂_OLS is

    n⁻¹ [plim(n⁻¹X'X)]⁻¹ [σ² plim(n⁻¹X'ΩX)] [plim(n⁻¹X'X)]⁻¹.

In general, it is impossible to estimate Ω consistently since it has n unknown elements. However, n⁻¹σ²X'ΩX = n⁻¹σ² Σ_{t=1}^n ω_t² x_t x_t' (x_t' being the t-th row of X) has only k² elements and can be estimated consistently by n⁻¹X'Ω̂X, where Ω̂ may be one of several inconsistent estimators of σ²Ω. The heteroskedasticity-consistent (or heteroskedasticity-robust) estimator of the asymptotic covariance matrix of β̂_OLS is then given by

    Ĥ = (X'X)⁻¹(X'Ω̂X)(X'X)⁻¹.

White proposed using Ω̂ = diag(û_1², ..., û_n²), where the û_t are OLS residuals. Alternative estimators can be obtained by replacing û_t² in the main diagonal of Ω̂ with one of the following:

    [n/(n − k)] û_t²,   û_t²/(1 − h_t),   or   û_t²/(1 − h_t)²,

where h_t is the t-th diagonal element of X(X'X)⁻¹X'. Such estimators tend to have better finite-sample properties than the original White estimator that uses û_t². The square roots of the elements on the main diagonal of Ĥ are often referred to as heteroskedasticity-consistent (or Eicker-White) standard errors.

When testing hypotheses about β, tests which are asymptotically robust to heteroskedasticity of unknown form can be constructed by using a heteroskedasticity-consistent estimator of the asymptotic covariance matrix of β̂_OLS instead of the usual OLS covariance matrix estimator.
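A minimal implementation of the sandwich estimator Ĥ and the alternative diagonal choices (often labelled HC0 through HC3 in software) might look as follows; the labels, the simulated design and all names are illustrative, not part of the notes.

```python
import numpy as np

def hc_cov(X, u_hat, kind="HC0"):
    """Heteroskedasticity-consistent covariance of the OLS estimator.

    kind: "HC0" (White), "HC1" (df correction), "HC2", "HC3".
    """
    n, k = X.shape
    XtX_inv = np.linalg.inv(X.T @ X)
    h = np.einsum("ij,jk,ik->i", X, XtX_inv, X)   # leverages h_t = x_t'(X'X)^{-1}x_t
    if kind == "HC0":
        w = u_hat ** 2
    elif kind == "HC1":
        w = u_hat ** 2 * n / (n - k)
    elif kind == "HC2":
        w = u_hat ** 2 / (1.0 - h)
    elif kind == "HC3":
        w = u_hat ** 2 / (1.0 - h) ** 2
    else:
        raise ValueError(kind)
    meat = (X * w[:, None]).T @ X                 # X' diag(w) X
    return XtX_inv @ meat @ XtX_inv               # sandwich (X'X)^{-1} X'Ω̂X (X'X)^{-1}

# Example: robust standard errors for an OLS fit on heteroskedastic data
rng = np.random.default_rng(2)
n = 300
X = np.column_stack([np.ones(n), rng.normal(size=n)])
u = rng.normal(size=n) * (0.5 + np.abs(X[:, 1]))   # heteroskedastic errors
y = X @ np.array([1.0, 2.0]) + u
b = np.linalg.solve(X.T @ X, X.T @ y)
se_robust = np.sqrt(np.diag(hc_cov(X, y - X @ b, "HC3")))
print(b, se_robust)
```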

5.2 Testing for Heteroskedasticity

5.2.1 The Breusch-Pagan-Godfrey-Koenker Test

The Breusch-Pagan-Godfrey-Koenker LM test for heteroskedasticity tests H_0: E(u_t²) = σ² against H_1: E(u_t²) = σ²h(z_t'α), where h(·) is an arbitrary positive function with h(0) = 1 and z_t is a vector of independent variables (in Koenker's version of the test, z_t = x_t). The null hypothesis of homoskedasticity is equivalent to α = 0. An LM test of this hypothesis can be carried out by obtaining the R² in the auxiliary regression of the squared OLS residuals from the model (û_t²) on a constant term and the variables in z_t. The Breusch-Pagan-Godfrey-Koenker test statistic is H_BPK = nR², which is asymptotically χ²-distributed under H_0 with degrees of freedom equal to the dimension of z_t.

5.2.2 The White Test

The White LM test tests the null hypothesis of homoskedasticity against the alternative of heteroskedasticity of unspecified form. The test can be carried out by obtaining the R² in the auxiliary regression of the squared OLS residuals (û_t²) on a constant, all the original regressors in X, their squares and their cross-products. The White LM test statistic is H_W = nR², which has an asymptotic χ²(p) distribution under the null, where p is the number of regressors in the auxiliary regression, excluding the intercept. An alternative version of the test is based on the standard F-test for the hypothesis that all coefficients in the auxiliary regression, except the constant, are equal to zero.

We note that rejection of the null may be due to genuine heteroskedasticity, but it may also be due to some other type of specification error (such as an incorrect functional form or omitted explanatory variables).
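Both tests amount to an nR² statistic from an auxiliary regression of û_t² on a set of variables, so a single helper covers them. The sketch below is illustrative and not part of the notes; the choice of regressors passed to the helper determines whether it behaves as the Breusch-Pagan-Godfrey-Koenker test or the White test.

```python
import numpy as np
from scipy import stats

def lm_het_test(u_hat, Z):
    """nR^2 statistic from the auxiliary regression of squared residuals on Z.

    Z should contain a constant column; degrees of freedom = columns of Z minus 1.
    """
    n = len(u_hat)
    v = u_hat ** 2
    g = np.linalg.solve(Z.T @ Z, Z.T @ v)
    r2 = 1.0 - np.sum((v - Z @ g) ** 2) / np.sum((v - v.mean()) ** 2)
    stat = n * r2
    df = Z.shape[1] - 1
    return stat, stats.chi2.sf(stat, df)

rng = np.random.default_rng(3)
n = 400
x = rng.normal(size=n)
X = np.column_stack([np.ones(n), x])
y = X @ np.array([1.0, 1.0]) + rng.normal(size=n) * (1.0 + x ** 2)  # heteroskedastic
u_hat = y - X @ np.linalg.solve(X.T @ X, X.T @ y)

# Breusch-Pagan-Godfrey-Koenker version with z_t = x_t
print(lm_het_test(u_hat, X))
# White version: constant, regressor and its square (no cross-products with one regressor)
print(lm_het_test(u_hat, np.column_stack([np.ones(n), x, x ** 2])))
```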

6 Autocorrelation

Consider the linear regression model with AR(1) errors:

    y_t = x_t'β + u_t,  u_t = φu_{t-1} + ε_t,

where |φ| < 1 and ε_t ~ iid(0, σ²). It can be shown that E(u_t) = 0, E(u_t²) = σ²/(1 − φ²), and E(u_t u_{t-s}) = φ^s σ²/(1 − φ²) for s ≥ 0. Consequently, in the model y = Xβ + u, we have E(uu') = σ²Ω with

    Ω = [1/(1 − φ²)] ×
        [ 1        φ        φ²       ...  φ^{n-1}
          φ        1        φ        ...  φ^{n-2}
          φ²       φ        1        ...  φ^{n-3}
          ...      ...      ...      ...  ...
          φ^{n-1}  φ^{n-2}  φ^{n-3}  ...  1       ].

6.1 OLS Estimation

The OLS estimator β̂_OLS = (X'X)⁻¹X'y is consistent for β provided that

    plim(n⁻¹X'u) = 0  and  plim(n⁻¹X'X) = Q,

with Q finite and positive definite. Furthermore, under appropriate regularity conditions,

    β̂_OLS ~a N(β, n⁻¹σ²Q⁻¹MQ⁻¹),  where M = plim(n⁻¹X'ΩX).

As in the case of heteroskedasticity, we may use the OLS estimator (in spite of its inefficiency) and adjust the standard errors and related test statistics so that they are valid in the presence of autocorrelation of unknown form. The asymptotic covariance matrix of β̂_OLS can be consistently estimated by (X'X)⁻¹Ψ̂(X'X)⁻¹, where

    Ψ̂ = Σ_{t=1}^n û_t² x_t x_t' + Σ_{j=1}^ℓ Σ_{t=j+1}^n [1 − j/(ℓ + 1)] û_t û_{t-j} (x_t x_{t-j}' + x_{t-j} x_t'),

û_t = y_t − x_t'β̂_OLS, and ℓ (< n) is a so-called bandwidth (or truncation) parameter. Standard errors computed from this estimator are referred to as heteroskedasticity-and-autocorrelation-consistent (HAC) standard errors (or Newey-West standard errors). They can be used to construct tests which are asymptotically valid in the presence of heteroskedasticity and/or autocorrelation of unspecified form.
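A sketch of the HAC covariance estimator with Bartlett-kernel weights (the usual Newey-West choice) is given below; it is not part of the original notes, and the bandwidth rule used in the example is just a common rule of thumb.

```python
import numpy as np

def newey_west_cov(X, u_hat, lags):
    """HAC (Newey-West) covariance for the OLS estimator with Bartlett weights."""
    XtX_inv = np.linalg.inv(X.T @ X)
    scores = X * u_hat[:, None]                     # rows are x_t * u_t
    Psi = scores.T @ scores                         # j = 0 term
    for j in range(1, lags + 1):
        w = 1.0 - j / (lags + 1.0)                  # Bartlett kernel weight
        Gamma_j = scores[j:].T @ scores[:-j]        # sum_t u_t u_{t-j} x_t x_{t-j}'
        Psi += w * (Gamma_j + Gamma_j.T)
    return XtX_inv @ Psi @ XtX_inv

# Example with AR(1) errors
rng = np.random.default_rng(4)
n = 500
x = rng.normal(size=n)
X = np.column_stack([np.ones(n), x])
u = np.zeros(n)
for t in range(1, n):
    u[t] = 0.6 * u[t - 1] + rng.normal()
y = X @ np.array([1.0, 0.5]) + u
b = np.linalg.solve(X.T @ X, X.T @ y)
lags = int(np.floor(4 * (n / 100) ** (2 / 9)))      # a common rule-of-thumb bandwidth
se_hac = np.sqrt(np.diag(newey_west_cov(X, y - X @ b, lags)))
print(b, se_hac)
```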

6.2 FGLS Estimation

Under the AR(1) specification for the errors, E(uu') = σ²Ω and Ω⁻¹ = PP', where

    P' = [ √(1 − φ²)   0    0   ...   0    0
           −φ          1    0   ...   0    0
           0           −φ   1   ...   0    0
           ...                  ...
           0           0    0   ...   −φ   1 ].

The efficient GLS estimator of β can be obtained via OLS in the transformed model y* = X*β + u*, where

    y* = P'y = (√(1 − φ²) y_1,  y_2 − φy_1,  ...,  y_n − φy_{n-1})',

the rows of X* = P'X are √(1 − φ²) x_1', (x_2 − φx_1)', ..., (x_n − φx_{n-1})', and u* = P'u.

In practice, φ (and thus Ω) is unknown. FGLS estimation of β therefore requires a preliminary estimate of φ. A natural consistent estimator of φ is

    φ̂ = ( Σ_{t=2}^n û_t û_{t-1} ) / ( Σ_{t=2}^n û_{t-1}² ),

where û_t = y_t − x_t'β̂_OLS. The resulting FGLS estimator of β is

    β̂_F = (X'P̂P̂'X)⁻¹X'P̂P̂'y,

where P̂' is the matrix P' with φ̂ in place of φ. The FGLS estimator β̂_F is sometimes known as the Prais-Winsten estimator (or the Cochrane-Orcutt estimator, if the first observation (y_1, x_1) is omitted from the calculations).
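The following sketch implements one-step Prais-Winsten FGLS along the lines above; the simulated data and names are illustrative, and dropping the first transformed observation would give the Cochrane-Orcutt variant.

```python
import numpy as np

def prais_winsten(y, X):
    """One-step Prais-Winsten FGLS for a regression with AR(1) errors (a sketch)."""
    b_ols = np.linalg.solve(X.T @ X, X.T @ y)
    u = y - X @ b_ols
    phi = (u[1:] @ u[:-1]) / (u[:-1] @ u[:-1])      # regression of u_t on u_{t-1}
    # First observation scaled by sqrt(1 - phi^2), the rest quasi-differenced
    c = np.sqrt(1.0 - phi ** 2)
    y_star = np.concatenate([[c * y[0]], y[1:] - phi * y[:-1]])
    X_star = np.vstack([c * X[0], X[1:] - phi * X[:-1]])
    b_pw = np.linalg.solve(X_star.T @ X_star, X_star.T @ y_star)
    return b_pw, phi

rng = np.random.default_rng(5)
n = 400
x = rng.normal(size=n)
X = np.column_stack([np.ones(n), x])
u = np.zeros(n)
for t in range(1, n):
    u[t] = 0.7 * u[t - 1] + rng.normal()
y = X @ np.array([2.0, 1.0]) + u
print(prais_winsten(y, X))
```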

It is possible to iterate such estimators to convergence. Since the estimators are asymptotically efficient at every iteration, nothing is gained asymptotically by doing so.

6.3 ML Estimation

Since y_1, ..., y_n are not mutually independent, the joint density f(y_1, y_2, ..., y_n) is not equal to the product of the marginal densities Π_{t=1}^n f(y_t). However, the likelihood function can be constructed by noticing that

    f(y_1, y_2) = f(y_2 | y_1) f(y_1),
    f(y_1, y_2, y_3) = f(y_3 | y_2, y_1) f(y_2, y_1) = f(y_3 | y_2, y_1) f(y_2 | y_1) f(y_1),
    f(y_1, y_2, ..., y_n) = f(y_1) Π_{t=2}^n f(y_t | y_{t-1}, ..., y_1).

From the transformation of the model obtained through premultiplication by P', we can see that

    y_1 = x_1'β + (1 − φ²)^{-1/2} ε_1,
    y_t = φy_{t-1} + (x_t − φx_{t-1})'β + ε_t,  t = 2, ..., n.

Hence, if ε_t ~ N(0, σ²), then

    y_1 ~ N(x_1'β, σ²/(1 − φ²)),
    y_t | y_{t-1}, ..., y_1 ~ N(φy_{t-1} + (x_t − φx_{t-1})'β, σ²),  t = 2, ..., n.

Consequently, the log-likelihood function is

    log L(β, φ, σ²) = log f(y_1) + Σ_{t=2}^n log f(y_t | y_{t-1}, ..., y_1)
                    = −(1/2){log 2π + log σ² − log(1 − φ²)} − [(1 − φ²)/(2σ²)](y_1 − x_1'β)²
                      − [(n − 1)/2]{log 2π + log σ²} − (1/(2σ²)) Σ_{t=2}^n (y_t − φy_{t-1} − x_t'β + φx_{t-1}'β)².

Maximisation of log L with respect to (β, φ, σ²) yields the MLE.
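For a concrete illustration, the exact log-likelihood above can be coded directly and maximised numerically. The sketch below uses a generic optimiser rather than the analytical first-order conditions; the simulated design, starting values and names are illustrative assumptions.

```python
import numpy as np
from scipy.optimize import minimize

def neg_loglik(params, y, X):
    """Negative exact log-likelihood for y_t = x_t'b + u_t, u_t = phi*u_{t-1} + e_t."""
    k = X.shape[1]
    b, phi, log_s2 = params[:k], params[k], params[k + 1]
    s2 = np.exp(log_s2)                              # enforce sigma^2 > 0
    if abs(phi) >= 1:
        return np.inf
    e = y - X @ b
    n = len(y)
    # First observation: N(0, s2 / (1 - phi^2)); remaining: conditionally N(0, s2)
    ll = -0.5 * (np.log(2 * np.pi) + log_s2 - np.log(1 - phi ** 2)) \
         - (1 - phi ** 2) / (2 * s2) * e[0] ** 2
    eps = e[1:] - phi * e[:-1]
    ll += -0.5 * (n - 1) * (np.log(2 * np.pi) + log_s2) - np.sum(eps ** 2) / (2 * s2)
    return -ll

rng = np.random.default_rng(6)
n = 300
X = np.column_stack([np.ones(n), rng.normal(size=n)])
u = np.zeros(n)
for t in range(1, n):
    u[t] = 0.5 * u[t - 1] + rng.normal()
y = X @ np.array([1.0, -1.0]) + u

b0 = np.linalg.solve(X.T @ X, X.T @ y)               # OLS starting values
start = np.concatenate([b0, [0.0, 0.0]])             # (beta, phi, log sigma^2)
res = minimize(neg_loglik, start, args=(y, X), method="Nelder-Mead")
print(res.x)
```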

6.4 Testing for Autocorrelation

6.4.1 The Durbin-Watson Test

The Durbin-Watson test statistic is

    d = Σ_{t=2}^n (û_t − û_{t-1})² / Σ_{t=1}^n û_t²,

where û_t = y_t − x_t'β̂_OLS. Note that d ≈ 2(1 − φ̂). The distribution of d depends on X. However, it is possible to compute upper and lower limits for the critical values of d that depend only upon n and k. We will denote these upper and lower limits by d_U and d_L, respectively. If d < d_L, the null hypothesis H_0: φ = 0 is rejected in favour of H_1: φ > 0; if d > d_U, H_0: φ = 0 is not rejected; if d_L < d < d_U, the test is inconclusive. When testing against negative autocorrelation, H_0: φ = 0 is rejected in favour of H_1: φ < 0 if 4 − d < d_L; H_0: φ = 0 is not rejected if 4 − d > d_U; if d_L < 4 − d < d_U, the test is inconclusive. The Durbin-Watson test is valid only if a constant term is included in the model and the regressors are nonstochastic.

6.4.2 The Breusch-Godfrey Test

The Breusch-Godfrey LM test can be used to test for autocorrelation of order higher than 1. Examples of models which allow for autocorrelation of order p are:

    AR(p):  u_t = Σ_{j=1}^p φ_j u_{t-j} + ε_t,
    MA(p):  u_t = ε_t + Σ_{j=1}^p θ_j ε_{t-j}.

The LM test of the null hypothesis H_0: no autocorrelation against the alternative H_1: u_t ~ AR(p) or H_1: u_t ~ MA(p) can be implemented by regressing the OLS residuals û_t on x_t, û_{t-1}, ..., û_{t-p} to obtain the uncentred R². The LM statistic is BG = (n − p)R², and it has a χ²(p) asymptotic distribution under the null. (An alternative version of the test is based on the standard F-test for the hypothesis that the coefficients on the p lagged residuals in the auxiliary regression are jointly equal to zero.)

The Breusch-Godfrey test essentially assesses the significance of the covariance of the residuals with their lagged values, controlling for the intervening effect of the explanatory variables. Importantly, it does not suffer from any of the shortcomings of the Durbin-Watson test: it is valid regardless of whether X is stochastic or nonstochastic, and it does not have an inconclusive region.
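Both statistics are straightforward to compute from OLS residuals, as in the sketch below. It is not part of the notes; in the Breusch-Godfrey helper the first p observations are dropped when forming the lagged residuals, which is one common convention, and the simulated example and names are illustrative.

```python
import numpy as np
from scipy import stats

def durbin_watson(u_hat):
    """Durbin-Watson statistic from a vector of residuals."""
    return np.sum(np.diff(u_hat) ** 2) / np.sum(u_hat ** 2)

def breusch_godfrey(y, X, p):
    """LM test: regress OLS residuals on X and p lagged residuals, BG = (n - p) R^2."""
    n = len(y)
    u = y - X @ np.linalg.solve(X.T @ X, X.T @ y)
    # Lagged-residual regressors; the first p observations are dropped
    lags = np.column_stack([u[p - j - 1: n - j - 1] for j in range(p)])
    Z = np.column_stack([X[p:], lags])
    v = u[p:]
    g = np.linalg.lstsq(Z, v, rcond=None)[0]
    r2 = 1.0 - np.sum((v - Z @ g) ** 2) / np.sum(v ** 2)    # uncentred R^2
    stat = (n - p) * r2
    return stat, stats.chi2.sf(stat, p)

rng = np.random.default_rng(7)
n = 200
x = rng.normal(size=n)
X = np.column_stack([np.ones(n), x])
u = np.zeros(n)
for t in range(1, n):
    u[t] = 0.4 * u[t - 1] + rng.normal()
y = X @ np.array([1.0, 1.0]) + u
u_hat = y - X @ np.linalg.solve(X.T @ X, X.T @ y)
print(durbin_watson(u_hat))          # well below 2 -> positive autocorrelation
print(breusch_godfrey(y, X, p=2))
```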

6.5 Common Factor Restrictions

It is not uncommon for linear regression models to suffer from dynamic misspecification. The simplest example is failure to include a lagged dependent variable among the regressors. In such cases, the residuals may display autocorrelation even when the errors are in fact serially uncorrelated.

Consider a model with AR(1) errors:

    H_1:  y_t = x_t'β + u_t,  u_t = φu_{t-1} + ε_t,  ε_t ~ iid(0, σ²).

It is easy to see that the model can be rewritten as

    H_1:  y_t = x_t'β + φu_{t-1} + ε_t = x_t'β + φ(y_{t-1} − x_{t-1}'β) + ε_t.

If we relax the nonlinear restriction that is implicit in this specification, we have

    H_2:  y_t = x_t'β + φy_{t-1} + x_{t-1}'γ + ε_t.

In other words, H_1 is a special case of the unrestricted dynamic model H_2, obtained by imposing the so-called common factor (COMFAC) restriction γ = −φβ. To see why the restriction is called COMFAC, rewrite H_1 as

    (1 − φL)y_t = (1 − φL)x_t'β + ε_t,

where L is the lag operator (so that Lz_t = z_{t-1}). The common factor (1 − φL) appears on both sides of the equation.

Tests for autocorrelation test whether φ = 0 in H_1. Such tests are meaningful only if the COMFAC restriction is valid. If the COMFAC restriction is valid, GLS in H_1 will yield consistent and efficient estimates. If, however, the COMFAC restriction is invalid, GLS in H_1 will yield biased and inconsistent estimates. The COMFAC restriction can be tested using a Wald, LR or LM test.

7 Seemingly Unrelated Regressions

7.1 Model

Zellner's seemingly unrelated regression equations (SURE) model consists of m linear regression equations, each of which satisfies the assumptions of the classical linear regression model:

    y_i = X_i β_i + u_i,  i = 1, ..., m,

with y_i (n × 1), X_i (n × k_i), u_i (n × 1), β_i (k_i × 1), and rank(X_i) = k_i < n. Stacking the m equations, we have

    [ y_1 ]   [ X_1  0    ...  0   ] [ β_1 ]   [ u_1 ]
    [ y_2 ] = [ 0    X_2  ...  0   ] [ β_2 ] + [ u_2 ]
    [ ... ]   [ ...  ...  ...  ... ] [ ... ]   [ ... ]
    [ y_m ]   [ 0    0    ...  X_m ] [ β_m ]   [ u_m ]

(of dimensions (mn × 1) = (mn × k)(k × 1) + (mn × 1)), or, more compactly, y = Xβ + u, with k = Σ_{i=1}^m k_i. If u_it is the t-th element of u_i, we assume that E(u_it) = 0 and

    E(u_it u_js) = σ_ij if t = s,  and  E(u_it u_js) = 0 if t ≠ s;

i.e., the errors are homoskedastic and serially uncorrelated across observations but are contemporaneously correlated across equations. If Σ = (σ_ij) is the m × m contemporaneous covariance matrix, we have

    V ≡ E(uu') = [ σ_11 I_n   σ_12 I_n   ...  σ_1m I_n
                   σ_21 I_n   σ_22 I_n   ...  σ_2m I_n
                   ...        ...        ...  ...
                   σ_m1 I_n   σ_m2 I_n   ...  σ_mm I_n ]  = Σ ⊗ I_n,

an (mn × mn) matrix. V is positive definite whenever Σ is.

7.2 FGLS Estimation

Each equation of the model is, by itself, a classical linear regression. Therefore, its parameters could be estimated consistently, if not efficiently, one equation at a time by OLS. The model as a whole is a generalised linear regression model. Therefore, the BLUE of β is the GLS estimator

    β̂ = (X'V⁻¹X)⁻¹X'V⁻¹y = [X'(Σ⁻¹ ⊗ I_n)X]⁻¹X'(Σ⁻¹ ⊗ I_n)y,

with var(β̂) = (X'V⁻¹X)⁻¹ = [X'(Σ⁻¹ ⊗ I_n)X]⁻¹. If u ~ N(0, Σ ⊗ I_n), then β̂ is also the MLE.

When Σ is unknown (which is likely to be the case in practice), we can use an FGLS estimator instead of GLS. Let û_i be the n × 1 vector of OLS residuals from equation i. (Note that system OLS of the SURE model is equivalent to OLS equation by equation.) Since the elements of Σ are consistently estimated by

    σ̂_ij = û_i'û_j / n,  i, j = 1, ..., m,

the FGLS estimator can be obtained as

    β̂_F = [X'(Σ̂⁻¹ ⊗ I_n)X]⁻¹X'(Σ̂⁻¹ ⊗ I_n)y,

where Σ̂ = (σ̂_ij).
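A compact sketch of SUR FGLS, following the steps above (equation-by-equation OLS, Σ̂ from the residuals, then GLS on the stacked system), is given below; it is not part of the notes, and the two-equation simulated example is purely illustrative.

```python
import numpy as np
from scipy.linalg import block_diag

def sur_fgls(ys, Xs):
    """Feasible GLS for a SUR system; ys and Xs are lists of per-equation data."""
    m, n = len(ys), len(ys[0])
    # Equation-by-equation OLS and residuals
    resid = []
    for y_i, X_i in zip(ys, Xs):
        b_i = np.linalg.solve(X_i.T @ X_i, X_i.T @ y_i)
        resid.append(y_i - X_i @ b_i)
    U = np.column_stack(resid)                      # n x m matrix of residuals
    Sigma_hat = U.T @ U / n                         # contemporaneous covariance
    # Stacked system: y = X beta + u, with V = Sigma kron I_n
    y = np.concatenate(ys)
    X = block_diag(*Xs)
    V_inv = np.kron(np.linalg.inv(Sigma_hat), np.eye(n))
    XtVi = X.T @ V_inv
    return np.linalg.solve(XtVi @ X, XtVi @ y), Sigma_hat

# Two-equation example with contemporaneously correlated errors
rng = np.random.default_rng(8)
n = 200
X1 = np.column_stack([np.ones(n), rng.normal(size=n)])
X2 = np.column_stack([np.ones(n), rng.normal(size=n)])
E = rng.multivariate_normal([0.0, 0.0], [[1.0, 0.7], [0.7, 1.0]], size=n)
y1 = X1 @ np.array([1.0, 2.0]) + E[:, 0]
y2 = X2 @ np.array([-1.0, 0.5]) + E[:, 1]
beta_fgls, Sig = sur_fgls([y1, y2], [X1, X2])
print(beta_fgls)
```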

Under general conditions on u and X, the GLS and FGLS estimators of β are consistent and have the same asymptotic distribution:

    √n(β̂_F − β) →_d N(0, plim(n⁻¹X'V⁻¹X)⁻¹)  as n → ∞.

The asymptotic covariance matrix of β̂_F is consistently estimated by [X'(Σ̂⁻¹ ⊗ I_n)X]⁻¹. Hence, the hypothesis Rβ − r = 0 can be tested using the Wald statistic

    W = (Rβ̂_F − r)' {R[X'(Σ̂⁻¹ ⊗ I_n)X]⁻¹R'}⁻¹ (Rβ̂_F − r),

which has a χ²(q) asymptotic distribution under the null, with q = rank(R) ≤ k.

7.3 Comparison Between OLS and GLS

There are two cases where OLS is algebraically equivalent to GLS and, hence, BLUE.

1. If σ_ij = 0 for i ≠ j, then β̂ = β̂_OLS. To see why, note that with a diagonal Σ both X'(Σ⁻¹ ⊗ I_n)X and X'(Σ⁻¹ ⊗ I_n)y are block diagonal and stacked equation by equation, so that (with semicolons denoting vertical stacking)

    β̂ = [X'(Σ⁻¹ ⊗ I_n)X]⁻¹ X'(Σ⁻¹ ⊗ I_n)y
       = diag(σ_11⁻¹X_1'X_1, ..., σ_mm⁻¹X_m'X_m)⁻¹ (σ_11⁻¹X_1'y_1; ...; σ_mm⁻¹X_m'y_m)
       = diag(σ_11(X_1'X_1)⁻¹, ..., σ_mm(X_m'X_m)⁻¹) (σ_11⁻¹X_1'y_1; ...; σ_mm⁻¹X_m'y_m)
       = ((X_1'X_1)⁻¹X_1'y_1; ...; (X_m'X_m)⁻¹X_m'y_m) = β̂_OLS.

2. If X_1 = X_2 = ... = X_m, then β̂ = β̂_OLS. To see why, put X_1 = X_2 = ... = X_m = X_0, so that X = I_m ⊗ X_0. Then we have:

    β̂ = [X'(Σ⁻¹ ⊗ I_n)X]⁻¹ X'(Σ⁻¹ ⊗ I_n)y
       = [(I_m ⊗ X_0')(Σ⁻¹ ⊗ I_n)(I_m ⊗ X_0)]⁻¹ (I_m ⊗ X_0')(Σ⁻¹ ⊗ I_n)y
       = [(Σ⁻¹ ⊗ X_0')(I_m ⊗ X_0)]⁻¹ (Σ⁻¹ ⊗ X_0')y
       = (Σ⁻¹ ⊗ X_0'X_0)⁻¹ (Σ⁻¹ ⊗ X_0')y
       = [Σ ⊗ (X_0'X_0)⁻¹](Σ⁻¹ ⊗ X_0')y
       = [I_m ⊗ (X_0'X_0)⁻¹X_0']y = β̂_OLS.

7.4 Kronecker Products

Let A = (a_ij) and B = (b_ij) be m × n and p × q matrices, respectively. The mp × nq matrix

    A ⊗ B = [ a_11 B   a_12 B   ...  a_1n B
              a_21 B   a_22 B   ...  a_2n B
              ...      ...      ...  ...
              a_m1 B   a_m2 B   ...  a_mn B ]

is the Kronecker product of A and B. Some useful properties:

- (A ⊗ B)(C ⊗ D) = AC ⊗ BD, for conformable C and D;
- (A ⊗ B)' = A' ⊗ B';
- (A ⊗ B)⁻¹ = A⁻¹ ⊗ B⁻¹, if A and B are invertible;
- tr(A ⊗ B) = tr(A) tr(B), if A and B are square;
- det(A ⊗ B) = (det A)^n (det B)^m, for A (m × m) and B (n × n).
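These properties are easy to verify numerically for small random matrices, as in the purely illustrative sketch below.

```python
import numpy as np

rng = np.random.default_rng(9)
m, n = 3, 4
A = rng.normal(size=(m, m))
B = rng.normal(size=(n, n))

# (A kron B)^{-1} = A^{-1} kron B^{-1}
lhs = np.linalg.inv(np.kron(A, B))
rhs = np.kron(np.linalg.inv(A), np.linalg.inv(B))
print(np.allclose(lhs, rhs))

# det(A kron B) = det(A)^n det(B)^m, for A (m x m) and B (n x n)
print(np.isclose(np.linalg.det(np.kron(A, B)),
                 np.linalg.det(A) ** n * np.linalg.det(B) ** m))

# tr(A kron B) = tr(A) tr(B)
print(np.isclose(np.trace(np.kron(A, B)), np.trace(A) * np.trace(B)))
```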
