Autocorrelation
- Tamsyn Stokes
1 Motivation

Autocorrelation occurs when what happens today has an impact on what happens tomorrow, and perhaps further into the future. It is a phenomenon mainly found in time-series applications, typically in financial data, macro data, and wage data. Autocorrelation can only run into the past, not into the future:

cov(ɛ_i, ɛ_j) ≠ 0, i ≠ j

2 AR(1) Errors

AR(1) errors occur when y_i = X_i β + ɛ_i and ɛ_i = ρɛ_{i-1} + u_i, where ρ is the autocorrelation coefficient, |ρ| < 1, and u_i ~ N(0, σ_u²).

Note: In general we can have AR(p) errors, which implies p lagged terms in the error structure, i.e., ɛ_i = ρ_1 ɛ_{i-1} + ρ_2 ɛ_{i-2} + ... + ρ_p ɛ_{i-p}.

Note: We will need |ρ| < 1 for stability and stationarity. If |ρ| < 1 happens to fail, then we have the following cases:

1. ρ = 0: No serial correlation present.
2. ρ > 1: The process explodes.
3. ρ = 1: The process follows a random walk.
4. ρ = -1: The process is oscillatory.
5. ρ < -1: The process explodes in an oscillatory fashion.

The consequences for OLS: β̂ is unbiased and consistent, but it is no longer efficient and the usual statistical inference is rendered invalid.
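To see these cases concretely, here is a minimal Python sketch (not part of the original notes; the function name and parameter choices are mine) that simulates the AR(1) error recursion for a stationary and an explosive ρ:

```python
import numpy as np

def simulate_ar1(rho, n, sigma_u=1.0, seed=0):
    """Simulate an AR(1) error process e_t = rho * e_{t-1} + u_t, u_t ~ N(0, sigma_u^2)."""
    rng = np.random.default_rng(seed)
    u = rng.normal(0.0, sigma_u, n)
    e = np.zeros(n)
    for t in range(1, n):
        e[t] = rho * e[t - 1] + u[t]
    return e

stable = simulate_ar1(0.5, 500)      # |rho| < 1: the series keeps reverting toward zero
explosive = simulate_ar1(1.1, 500)   # rho > 1: the process explodes
```

Plotting the two series makes the distinction obvious: the stable path wanders around zero, while the explosive path diverges geometrically.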
Lemma: ɛ_i = Σ_{j=0}^∞ ρ^j u_{i-j}.

The expectation of ɛ_i is

E[ɛ_i] = E[Σ_{j=0}^∞ ρ^j u_{i-j}] = Σ_{j=0}^∞ ρ^j E[u_{i-j}] = Σ_{j=0}^∞ ρ^j · 0 = 0.

The variance of ɛ_i is

var(ɛ_i) = E[ɛ_i²] = E[(u_i + ρu_{i-1} + ρ²u_{i-2} + ...)²]
         = E[u_i² + ρ²u_{i-1}² + ρ⁴u_{i-2}² + ...]   (the cross terms have zero expectation)
         = σ_u² + ρ²σ_u² + ρ⁴σ_u² + ...
         = σ_u² + ρ² var(ɛ_{i-1}).

But, assuming homoscedasticity, var(ɛ_i) = var(ɛ_{i-1}), so that

var(ɛ_i) = σ_u² + ρ² var(ɛ_i)   ⟹   var(ɛ_i) = σ_u² / (1 - ρ²) ≡ σ².

Note: This is why we need |ρ| < 1 for stability in the process. If |ρ| > 1 then the denominator is negative, and var(ɛ_i) cannot be negative.

We note the correlation between ɛ_i and ɛ_{i-1}:

corr(ɛ_i, ɛ_{i-1}) = cov(ɛ_i, ɛ_{i-1}) / √(var(ɛ_i) var(ɛ_{i-1})) = [ρ σ_u²/(1 - ρ²)] / [σ_u²/(1 - ρ²)] = ρ,

which is the correlation coefficient.

At this point the following results hold:

1. The OLS estimate s² is biased but consistent.
2. s² is usually biased downward, because we usually find ρ > 0 in economic data. This implies that σ²(X'X)⁻¹ tends to be less than σ²(X'X)⁻¹X'ΩX(X'X)⁻¹ if ρ > 0 and the variables in X are positively correlated over time. This implies that t-statistics are over-stated and we may introduce Type I errors in our inferences.

How do we know if we have autocorrelation or not?
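The closed-form variance σ_u²/(1 - ρ²) and the result corr(ɛ_i, ɛ_{i-1}) = ρ can be checked numerically. This is an illustrative Python simulation with arbitrary parameter choices, not part of the notes:

```python
import numpy as np

# Check var(e) = sigma_u^2 / (1 - rho^2) and lag-1 correlation = rho by simulation.
rho, sigma_u, n = 0.6, 1.0, 200_000
rng = np.random.default_rng(42)
u = rng.normal(0.0, sigma_u, n)
e = np.zeros(n)
for t in range(1, n):
    e[t] = rho * e[t - 1] + u[t]
e = e[1_000:]                                    # discard start-up observations

theory_var = sigma_u**2 / (1 - rho**2)           # = 1.5625 for these choices
sample_var = e.var()
lag1_corr = np.corrcoef(e[1:], e[:-1])[0, 1]     # should be close to rho
```

With 200,000 draws the sample variance and lag-1 correlation land very close to the theoretical values; shrinking n shows how noisy the estimates become.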
3 Tests for Autocorrelation

1. Plot the residuals ɛ̂_i against time.
2. Plot the residuals ɛ̂_i against ɛ̂_{i-1}.

Durbin-Watson Test

This is the most popular test. Assumptions:

(a) The regression has a constant term.
(b) No lagged dependent variables.
(c) No missing values.
(d) AR(1) error structure.

The null hypothesis is that ρ = 0, i.e., that there is no serial correlation. The test statistic is calculated as

d = Σ_{i=2}^N (ɛ̂_i - ɛ̂_{i-1})² / Σ_{i=1}^N ɛ̂_i²,

which is equivalent to ɛ̂'Aɛ̂ / ɛ̂'ɛ̂, where A is the tridiagonal differencing matrix. An equivalent test is d = 2(1 - ρ̂), where ρ̂ comes from ɛ̂_i = ρɛ̂_{i-1} + u_i. Note that -1 ≤ ρ ≤ 1, so that d ∈ [0, 4], where

(a) d = 0 indicates perfect positive serial correlation,
(b) d = 4 indicates perfect negative serial correlation,
(c) d = 2 indicates no serial correlation.

The test relies on tabulated upper and lower bounds. The reason for this is that the DW statistic does not follow a standard distribution: the distribution of the statistic depends on the ɛ̂_i, which are themselves dependent on the X's in the model. Further, there are different degrees of freedom that must be controlled for. For example, let N = 25 and k = 3; then DW_L = 0.906 and DW_U = 1.409. If d = 1.78 then d > DW_U but d < 4 - DW_U, and we fail to reject the null. Graphically the decision zones look like:

Reject H_0 (positive correlation) | Inconclusive zone | Fail to reject H_0 | Inconclusive zone | Reject H_0 (negative correlation)
0 ............................. DW_L .............. DW_U ...... 2 ...... 4 - DW_U ........ 4 - DW_L ........................... 4
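The d statistic is straightforward to compute from a residual vector. A minimal Python sketch (the function name is mine, not from the notes):

```python
import numpy as np

def durbin_watson(resid):
    """d = sum_{i=2}^N (e_i - e_{i-1})^2 / sum_{i=1}^N e_i^2."""
    resid = np.asarray(resid, dtype=float)
    return np.sum(np.diff(resid) ** 2) / np.sum(resid ** 2)

# Alternating residuals (perfect negative serial correlation) push d toward 4;
# independent residuals put d near 2.
d_neg = durbin_watson([1.0, -1.0] * 50)
d_iid = durbin_watson(np.random.default_rng(0).normal(size=10_000))
```

For the length-100 alternating series, d = 396/100 = 3.96, approaching 4 as the series lengthens; the iid series gives d close to 2.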
Let's look a little closer at our DW statistic:

DW = [Σ_{i=2}^N ɛ̂_i² - 2Σ_{i=2}^N ɛ̂_iɛ̂_{i-1} + Σ_{i=2}^N ɛ̂_{i-1}²] / ɛ̂'ɛ̂.

Note the following:

Σ_{i=2}^N ɛ̂_i² = ɛ̂'ɛ̂ - ɛ̂_1²   and   Σ_{i=2}^N ɛ̂_{i-1}² = ɛ̂'ɛ̂ - ɛ̂_N².

Therefore we have simply added and subtracted ɛ̂_1² and ɛ̂_N². Therefore,

DW = [2ɛ̂'ɛ̂ - 2Σ_{i=2}^N ɛ̂_iɛ̂_{i-1} - ɛ̂_1² - ɛ̂_N²] / ɛ̂'ɛ̂ = 2 - 2γ_1 ρ̂ - γ_2,

where γ_1 = Σ_{i=2}^N ɛ̂_{i-1}² / ɛ̂'ɛ̂ and γ_2 = (ɛ̂_1² + ɛ̂_N²) / ɛ̂'ɛ̂. Note that as N → ∞, γ_1 → 1 and γ_2 → 0, so that DW → 2 - 2ρ̂. Under H_0: ρ = 0, and thus DW = 2.

Note: We can calculate ρ̂ as ρ̂ = 1 - 0.5 DW.

Durbin's h-test on the lagged dependent variable

The Durbin-Watson test assumes that X is non-stochastic. This may not always be the case, e.g., if we include lagged dependent variables on the right-hand side. Durbin offers an alternative test in this case. Under the null hypothesis that ρ = 0, the test statistic becomes

h = (1 - d/2) √[N / (1 - N·var(α̂))],

where α is the coefficient on the lagged dependent variable. Note: If N·var(α̂) > 1 then we have a problem, because we cannot take the square root of a negative number. Durbin's h statistic is approximately distributed as a normal with unit variance.
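Both the back-of-the-envelope ρ̂ = 1 - 0.5 DW and Durbin's h are one-liners. A hedged Python sketch (function names and the demo numbers are mine), including the failure case where N·var(α̂) ≥ 1:

```python
import math

def rho_from_dw(d):
    """Back out rho_hat from the large-sample relation d ≈ 2(1 - rho_hat)."""
    return 1.0 - 0.5 * d

def durbin_h(d, n, var_alpha):
    """Durbin's h for models with a lagged dependent variable.

    Undefined when n * var_alpha >= 1 (negative number under the root).
    """
    if n * var_alpha >= 1:
        raise ValueError("h is undefined: N * var(alpha_hat) >= 1")
    return (1.0 - d / 2.0) * math.sqrt(n / (1.0 - n * var_alpha))

print(rho_from_dw(2.0))   # no serial correlation implies rho_hat = 0.0
h = durbin_h(1.6, 100, 0.005)   # hypothetical d, N, and var(alpha_hat)
```

With d = 1.6, N = 100, and var(α̂) = 0.005, h = 0.2·√200 ≈ 2.83, which would reject ρ = 0 at conventional levels against a standard normal.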
Breusch-Godfrey Test

This is basically a Lagrange multiplier test of H_0: no autocorrelation versus H_a: the errors are AR(p). Regress ɛ̂_i on X_i, ɛ̂_{i-1}, ..., ɛ̂_{i-p} and obtain NR² ~ χ²_p, where p is the number of lagged values that contribute to the correlation. The intuition behind this test is rather straightforward: we know that X'ɛ̂ = 0, so any R² > 0 must be caused by correlation between the current and the lagged residuals.

4 Correcting an AR(1) Process

One way to fix the problem is to get the error term of the estimated equation to satisfy the full ideal conditions. One way to do this might be through substitution. Consider the model we estimate:

y_t = β_0 + β_1 X_t + ɛ_t, where ɛ_t = ρɛ_{t-1} + u_t and u_t ~ (0, σ_u²).

It is possible to rewrite the original model as

y_t = β_0 + β_1 X_t + ρɛ_{t-1} + u_t,

but ɛ_{t-1} = y_{t-1} - β_0 - β_1 X_{t-1}, thus

y_t = β_0 + β_1 X_t + ρ(y_{t-1} - β_0 - β_1 X_{t-1}) + u_t   (via substitution)
y_t - ρy_{t-1} = β_0(1 - ρ) + β_1(X_t - ρX_{t-1}) + u_t   (via gathering terms)
y*_t = β*_0 + β_1 X*_t + u_t.

We can estimate the transformed model, which satisfies the full ideal conditions as long as u_t satisfies the full ideal conditions. One downside is the loss of the first observation, which can be a considerable sacrifice in degrees of freedom.

What if ρ is unknown? We seek a consistent estimator of ρ so as to run feasible GLS.

Methods of estimating ρ

1. Cochrane-Orcutt: Throw out the first observation. We assume an AR(1) process, which implies ɛ_i = ρɛ_{i-1} + u_i. So we run OLS on ɛ̂_i = ρɛ̂_{i-1} + u_i and obtain

ρ̂ = Σ_{i=2}^N ɛ̂_iɛ̂_{i-1} / Σ_{i=2}^N ɛ̂_{i-1}²,

which is the OLS estimator of ρ.
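A single Cochrane-Orcutt pass (estimate ρ from lagged residuals, then OLS on the quasi-differenced data) can be sketched in a few lines of Python. This is an illustration on simulated data with parameter values I chose, not the notes' Stata implementation; in practice the two steps are iterated to convergence:

```python
import numpy as np

def ols(X, y):
    """OLS coefficients via least squares."""
    return np.linalg.lstsq(X, y, rcond=None)[0]

def cochrane_orcutt(X, y):
    """One Cochrane-Orcutt pass: estimate rho from the OLS residuals,
    then run OLS on the quasi-differenced data (first observation dropped)."""
    beta = ols(X, y)
    e = y - X @ beta
    rho = (e[1:] @ e[:-1]) / (e[:-1] @ e[:-1])   # OLS of e_i on e_{i-1}
    y_star = y[1:] - rho * y[:-1]                # quasi-differenced data
    X_star = X[1:] - rho * X[:-1]                # constant column becomes (1 - rho)
    return rho, ols(X_star, y_star)

# Hypothetical demo: y = 2 + 3x with AR(1) errors, rho = 0.7
rng = np.random.default_rng(1)
n = 2000
x = rng.normal(size=n)
u = rng.normal(size=n)
e = np.zeros(n)
for t in range(1, n):
    e[t] = 0.7 * e[t - 1] + u[t]
y = 2.0 + 3.0 * x + e
X = np.column_stack([np.ones(n), x])
rho_hat, beta_star = cochrane_orcutt(X, y)
```

Because the constant column is quasi-differenced along with everything else, the second-stage coefficients estimate β_0 and β_1 directly; ρ̂ should land near 0.7.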
2. Durbin's Method: After substituting for ɛ_i we see that

y_i = β_0 + β_1 X_i1 + β_2 X_i2 + ... + β_k X_ik + ρɛ_{i-1} + u_i
    = β_0 + β_1 X_i1 + ... + β_k X_ik + ρ(y_{i-1} - β_0 - β_1 X_{i-1,1} - ... - β_k X_{i-1,k}) + u_i.

So we run OLS on

y_i = ρy_{i-1} + (1 - ρ)β_0 + β_1 X_i1 - ρβ_1 X_{i-1,1} + ... + β_k X_ik - ρβ_k X_{i-1,k} + u_i.

From this we obtain ρ̂, which is the coefficient on y_{i-1}. This parameter estimate is biased but consistent. Note: When k is large, we may have a problem with the degrees of freedom. To preserve the degrees of freedom, we must have N > 2k + 1 observations to employ this method; in small samples, this method may not be feasible.

3. Newey-West Covariance Matrix: We can correct the covariance matrix of β̂ much like we did in the case of heteroscedasticity. This extension of White (1980) was offered by Newey and West. We seek a consistent estimator of X'ΩX, which then leads to

cov(β̂) = σ²(X'X)⁻¹X'ΩX(X'X)⁻¹,

where

X'ΩX = (1/N) Σ_{i=1}^N ɛ̂_i² X_iX_i' + (1/N) Σ_{l=1}^L Σ_{j=l+1}^N ω_l ɛ̂_jɛ̂_{j-l}(X_jX_{j-l}' + X_{j-l}X_j')

with weights ω_l = 1 - l/(L + 1). A possible problem in this approach is determining L, i.e., how far back into the past to go to correct the covariance matrix for autocorrelation.
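The Newey-West sandwich can be sketched directly in numpy. This is an illustrative implementation of the formula above, not the notes' code; following common practice I fold the σ² and 1/N scaling into the meat matrix S, so the result is the usual sandwich (X'X)⁻¹S(X'X)⁻¹ with Bartlett weights:

```python
import numpy as np

def newey_west_cov(X, e, L):
    """HAC covariance of beta_hat with Bartlett weights w_l = 1 - l/(L+1).

    S = sum_i e_i^2 x_i x_i'
        + sum_{l=1}^{L} w_l sum_{j=l+1}^{N} e_j e_{j-l} (x_j x_{j-l}' + x_{j-l} x_j')
    cov(beta_hat) = (X'X)^{-1} S (X'X)^{-1}
    """
    N, k = X.shape
    S = (X * (e ** 2)[:, None]).T @ X            # White (heteroscedasticity) term
    for l in range(1, L + 1):
        w = 1.0 - l / (L + 1.0)                  # Bartlett weight
        G = (X[l:] * (e[l:] * e[:-l])[:, None]).T @ X[:-l]
        S += w * (G + G.T)                       # symmetric lag-l correction
    XtX_inv = np.linalg.inv(X.T @ X)
    return XtX_inv @ S @ XtX_inv
```

Setting L = 0 drops the lag corrections entirely, so the estimator collapses to White's heteroscedasticity-consistent covariance, which is a handy sanity check.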
5 Forecasting in the AR(1) Environment

Having estimated β_GLS, we know that β_GLS is BLUE when cov(ɛ) = σ²Ω with Ω ≠ I. With an AR(1) process, we know that tomorrow's output is dependent upon today's output and today's random error. We estimate

y_t = X_t β + ɛ_t, where ɛ_t = ρɛ_{t-1} + u_t.

The forecast becomes

y_{t+1} = X_{t+1} β + ɛ_{t+1} = X_{t+1} β + ρɛ_t + u_{t+1}.

To finish the forecast, we need ρ̂ from our previous estimation techniques, and we recognize that ɛ̂_t = y_t - X_t β̂ from the GLS estimation. We assume that u_{t+1} has a zero mean. Then we see that

ŷ_{t+1} = X_{t+1} β̂ + ρ̂ ɛ̂_t.

Example: Gasoline Retail Prices

In this example we look at the relationship between the U.S. average retail price of gasoline and the wholesale price of gasoline from January 1985 through February 2006. As an initial step, we plot the two series over time and notice a highly correlated pair of series.

[Figure: time-series plot of allgradesprice (retail) and wprice (wholesale) against obs]
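The one-step-ahead forecast formula above is a single line of arithmetic. A minimal Python sketch with hypothetical numbers (the β̂, ρ̂, and data values are made up for illustration):

```python
import numpy as np

def ar1_forecast(beta, rho, x_next, y_last, x_last):
    """One-step-ahead forecast under AR(1) errors:
    y_hat_{t+1} = x_{t+1}'beta + rho * e_t, where e_t = y_t - x_t'beta."""
    e_t = y_last - x_last @ beta
    return x_next @ beta + rho * e_t

beta = np.array([2.0, 3.0])    # hypothetical GLS estimates [b0, b1]
rho = 0.8                      # hypothetical rho_hat
x_last = np.array([1.0, 1.5])  # [constant, regressor] at time t
x_next = np.array([1.0, 2.0])  # [constant, regressor] at time t+1
y_last = 7.0                   # x_last'beta = 6.5, so e_t = 0.5
y_hat = ar1_forecast(beta, rho, x_next, y_last, x_last)  # 8.0 + 0.8*0.5 = 8.4
```

Ignoring the residual carry-forward term ρ̂ɛ̂_t (i.e., forecasting with X_{t+1}β̂ alone) would give 8.0 here; the AR(1) correction shifts the forecast toward the most recent error.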
A simple OLS regression model produces:

. reg allgradesprice wprice

[Stata output: ANOVA table and coefficient table for wprice and _cons; numerical values lost in transcription]

The results suggest that for every penny increase in the wholesale price, there is a 1.21 penny increase in the average retail price of gasoline. The constant term suggests that, on average, there is approximately a 32 cent difference between retail and wholesale prices, comprised of profits and state and federal taxes. A Durbin-Watson statistic calculated after the regression yields

Durbin-Watson d-statistic(2, 254) = [value lost in transcription]

The DW statistic suggests that the data suffer from significant autocorrelation. Reversing out an estimate via ρ̂ = 1 - d/2 suggests that ρ̂ = 0.904. Here is a picture of the fitted residuals against time:

[Figure: fitted residuals plotted against obs]
Here are robust-regression results:

. reg allgradesprice wprice, r

[Stata output: regression with robust standard errors, 254 observations; numerical values lost in transcription]

The robust regression results suggest that naive OLS over-states the variance of the parameter estimate on wprice, but the positive value of ρ suggests the opposite is likely true. Various fixes are possible.

First, Newey-West standard errors:

. newey allgradesprice wprice, lag(1)

[Stata output: regression with Newey-West standard errors, maximum lag 1, 254 observations; numerical values lost in transcription]

The Newey-West corrected standard errors, assuming AR(1) errors, are significantly higher than the robust OLS standard errors but are only slightly lower than those in naive OLS.

Second, a Cochrane-Orcutt AR(1) regression (iterated estimates):

[Stata output: ANOVA table, coefficient table for wprice and _cons, the estimate of rho, and the original and transformed Durbin-Watson statistics; numerical values lost in transcription]
More informationGreene, Econometric Analysis (7th ed, 2012)
EC771: Econometrics, Spring 2012 Greene, Econometric Analysis (7th ed, 2012) Chapters 2 3: Classical Linear Regression The classical linear regression model is the single most useful tool in econometrics.
More informationWarwick Economics Summer School Topics in Microeconometrics Instrumental Variables Estimation
Warwick Economics Summer School Topics in Microeconometrics Instrumental Variables Estimation Michele Aquaro University of Warwick This version: July 21, 2016 1 / 31 Reading material Textbook: Introductory
More information7. Integrated Processes
7. Integrated Processes Up to now: Analysis of stationary processes (stationary ARMA(p, q) processes) Problem: Many economic time series exhibit non-stationary patterns over time 226 Example: We consider
More informationFinite Sample Performance of A Minimum Distance Estimator Under Weak Instruments
Finite Sample Performance of A Minimum Distance Estimator Under Weak Instruments Tak Wai Chau February 20, 2014 Abstract This paper investigates the nite sample performance of a minimum distance estimator
More informationRegression #8: Loose Ends
Regression #8: Loose Ends Econ 671 Purdue University Justin L. Tobias (Purdue) Regression #8 1 / 30 In this lecture we investigate a variety of topics that you are probably familiar with, but need to touch
More informationEmpirical Application of Panel Data Regression
Empirical Application of Panel Data Regression 1. We use Fatality data, and we are interested in whether rising beer tax rate can help lower traffic death. So the dependent variable is traffic death, while
More informationEconometrics - 30C00200
Econometrics - 30C00200 Lecture 11: Heteroskedasticity Antti Saastamoinen VATT Institute for Economic Research Fall 2015 30C00200 Lecture 11: Heteroskedasticity 12.10.2015 Aalto University School of Business
More informationEconometrics. Week 11. Fall Institute of Economic Studies Faculty of Social Sciences Charles University in Prague
Econometrics Week 11 Institute of Economic Studies Faculty of Social Sciences Charles University in Prague Fall 2012 1 / 30 Recommended Reading For the today Advanced Time Series Topics Selected topics
More informationCHAPTER 4: Forecasting by Regression
CHAPTER 4: Forecasting by Regression Prof. Alan Wan 1 / 57 Table of contents 1. Revision of Linear Regression 3.1 First-order Autocorrelation and the Durbin-Watson Test 3.2 Correction for Autocorrelation
More informationPanel Data Models. Chapter 5. Financial Econometrics. Michael Hauser WS17/18 1 / 63
1 / 63 Panel Data Models Chapter 5 Financial Econometrics Michael Hauser WS17/18 2 / 63 Content Data structures: Times series, cross sectional, panel data, pooled data Static linear panel data models:
More informationECON3150/4150 Spring 2016
ECON3150/4150 Spring 2016 Lecture 4 - The linear regression model Siv-Elisabeth Skjelbred University of Oslo Last updated: January 26, 2016 1 / 49 Overview These lecture slides covers: The linear regression
More informationThe Classical Linear Regression Model
The Classical Linear Regression Model ME104: Linear Regression Analysis Kenneth Benoit August 14, 2012 CLRM: Basic Assumptions 1. Specification: Relationship between X and Y in the population is linear:
More informationNonstationary Time Series:
Nonstationary Time Series: Unit Roots Egon Zakrajšek Division of Monetary Affairs Federal Reserve Board Summer School in Financial Mathematics Faculty of Mathematics & Physics University of Ljubljana September
More information1: a b c d e 2: a b c d e 3: a b c d e 4: a b c d e 5: a b c d e. 6: a b c d e 7: a b c d e 8: a b c d e 9: a b c d e 10: a b c d e
Economics 102: Analysis of Economic Data Cameron Spring 2016 Department of Economics, U.C.-Davis Final Exam (A) Tuesday June 7 Compulsory. Closed book. Total of 58 points and worth 45% of course grade.
More informationIntroduction to Econometrics
Introduction to Econometrics STAT-S-301 Introduction to Time Series Regression and Forecasting (2016/2017) Lecturer: Yves Dominicy Teaching Assistant: Elise Petit 1 Introduction to Time Series Regression
More informationFinal Exam. Question 1 (20 points) 2 (25 points) 3 (30 points) 4 (25 points) 5 (10 points) 6 (40 points) Total (150 points) Bonus question (10)
Name Economics 170 Spring 2004 Honor pledge: I have neither given nor received aid on this exam including the preparation of my one page formula list and the preparation of the Stata assignment for the
More informationGLS and related issues
GLS and related issues Bernt Arne Ødegaard 27 April 208 Contents Problems in multivariate regressions 2. Problems with assumed i.i.d. errors...................................... 2 2 NON-iid errors 2 2.
More informationEcon 300/QAC 201: Quantitative Methods in Economics/Applied Data Analysis. 17th Class 7/1/10
Econ 300/QAC 201: Quantitative Methods in Economics/Applied Data Analysis 17th Class 7/1/10 The only function of economic forecasting is to make astrology look respectable. --John Kenneth Galbraith show
More informationProblem set 1 - Solutions
EMPIRICAL FINANCE AND FINANCIAL ECONOMETRICS - MODULE (8448) Problem set 1 - Solutions Exercise 1 -Solutions 1. The correct answer is (a). In fact, the process generating daily prices is usually assumed
More information