Section 2 NABE ASTEF 65
- Vanessa Lynch
- 6 years ago

Transcription
1 Section 2 NABE ASTEF 65
2 Econometric (Structural) Models 66
4 The Multiple Regression Model 68
6 Assumptions 70
7 Components of the Model
Endogenous variables -- dependent variables, whose values are determined within the system
Exogenous variables -- variables determined outside the system that influence it by affecting the values of the endogenous variables
Structural parameters -- estimated using econometric techniques and relevant data
Lagged endogenous variables
Lagged exogenous variables
Predetermined variables 71
9 Model Selection Criteria
MSE = Σ_{t=1}^{T} e_t^2 / (T - K)
s^2 = Σ_{t=1}^{T} e_t^2 / T, so MSE = [T / (T - K)] s^2, where T / (T - K) is the "penalty factor"
Akaike Information Criterion (AIC): AIC = e^{2K/T} * MSE 73
10 Schwarz or Bayesian Information Criterion (SIC or BIC): SIC = BIC = T^{K/T} * MSE 74
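As a sketch of these criteria (plain Python, assuming only the definitions above, with T observations and K estimated parameters), one might compute all three from a residual vector:

```python
import math

def selection_criteria(residuals, k):
    """MSE, AIC, and SIC/BIC as defined on the slides (T = len(residuals), K = k)."""
    t = len(residuals)
    sse = sum(e * e for e in residuals)  # sum of squared residuals
    s2 = sse / t                         # s^2 = SSE / T
    mse = sse / (t - k)                  # MSE = SSE / (T - K) = [T/(T-K)] * s^2
    aic = math.exp(2 * k / t) * mse      # AIC = e^(2K/T) * MSE
    sic = t ** (k / t) * mse             # SIC (BIC) = T^(K/T) * MSE
    return mse, aic, sic
```

Both criteria penalize added parameters; since T^{K/T} = e^{(K/T) ln T}, the SIC penalty exceeds the AIC penalty once T > e^2 ≈ 7.4, so SIC favors more parsimonious models.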
12 Elasticity interpretation: an elasticity of -.4 means a 1 percent change in the exogenous variable leads to a -.4 percent change in the dependent variable. Suppose we wish to see whether the dependent variable and one independent variable are related after netting out the effect of the other independent variables in the model. The partial correlation coefficient between Y and X_j measures the effect of X_j on Y not accounted for by the other variables in the model. 76
14 Tests of Hypotheses About Single Coefficients 78
15 Joint Tests on Several Regression Coefficients 79
17 Diagnostics Serial Correlation Collinearity Influence Points 81
18 Serial Correlation
Definition
Tests:
- Durbin-Watson Test
- Nonparametric Runs Test
- Durbin h Test
- Lagrange Multiplier (LM) Test
- Box-Pierce Q Statistic
- Ljung-Box Q* Statistic (small-sample modification of the Box-Pierce Q statistic)
Generalized Least Squares 82
19 Serial Correlation
Y_i = β_0 + β_1 X_{1i} + β_2 X_{2i} + ... + β_k X_{ki} + ε_i,  i = 1, 2, ..., n
E(ε_i ε_j) ≠ 0 for i ≠ j
Disturbance terms are not independent. The correlation between ε_t and ε_{t-k} is called an autocorrelation of order k. 83
20 Formal Definition of Autocorrelation or Serial Correlation Autocorrelation or serial correlation refers to the lack of independence of error (or disturbance) terms. Autocorrelation and serial correlation refer to the same phenomenon. Simply put, a systematic pattern exists in the residuals of the econometric model. Ideally, the residuals, which represent a composite of all factors not embedded in the model, should exhibit no pattern. That is to say, the residuals should follow a white-noise (or random) pattern. 84
21 Prevalence of Serial Correlation With the use of time-series data in econometric applications, serial correlation is public enemy number one. Systematic patterns in the error terms commonly arise due to the (inadvertent) omission of explanatory variables in econometric models. These variables may come from disciplines other than economics, finance, or business, for example, psychology and sociology. Or, these variables may represent factors that simply are difficult to quantify, such as tastes and preferences of consumers or technological innovation on the part of producers. 85
22 Consequences of Serial Correlation (Bishop, 1981)
- Errors contaminated with autocorrelation or serial correlation
- Potential for discovering spurious relationships due to problems with autocorrelated errors (Granger and Newbold, 1974)
- Difficulties with structural analysis and forecasting
If the error structure is autoregressive, then OLS estimates of the regression parameters are (1) unbiased and (2) consistent, but (3) inefficient in small and in large samples. 86
23 The estimates of the standard errors of the coefficients in any econometric model are biased downward if the residuals are positively autocorrelated, and biased upward if the residuals are negatively autocorrelated. Therefore, the calculated t-statistic is biased upward or downward in the direction opposite the bias in the estimated standard error of that coefficient. Granger and Newbold (1974) further suggest that the econometric results can be regarded as nonsense if R^2 > DW(d). 87
24 Positive autocorrelation of the errors generally tends to make the estimate of the error variance too small, so confidence intervals are too narrow and null hypotheses are rejected with a higher probability than the stated significance level. Negative autocorrelation of the errors generally tends to make the estimate of the error variance too large, so confidence intervals are too wide; as well, the power of significance tests is reduced. With either positive or negative autocorrelation, least-squares parameter estimates usually are not as efficient as generalized least-squares parameter estimates. 88
25 Regression with Autocorrelated Errors Ordinary regression analysis is based on several statistical assumptions. One key assumption is that the errors are independent of each other. However, with time series data, the ordinary regression residuals usually are correlated over time. Violation of the independent errors assumption has three important consequences for ordinary regression. First, statistical tests of the significance of the parameters and the confidence limits for the predicted values are not correct. Second, the estimates of the regression coefficients are not as efficient as they would be if the autocorrelation were taken into account. Third, since the ordinary regression residuals are not independent, they contain information that can be used to improve the prediction of future values. 89
26 Solution to the Serial Correlation Problem: Generalized Least Squares (GLS)
The AUTOREG procedure solves this problem by augmenting the regression model with an autoregressive model for the random error, thereby accounting for the systematic pattern of the errors. Instead of the usual regression model, the following autoregressive error model is used:
y_t = x_t'β + ε_t
ε_t = -φ_1 ε_{t-1} - φ_2 ε_{t-2} - ... - φ_m ε_{t-m} + ν_t
ν_t ~ IN(0, σ^2)
The notation indicates that each ν_t is normally and independently distributed with mean 0 and variance σ^2. 90
27 By simultaneously estimating the regression coefficients β and the autoregressive error model parameters φ i, the AUTOREG procedure corrects the regression estimates for autocorrelation. Thus, this kind of regression analysis is often called autoregressive error correction or serial correlation correction. This technique also is called the use of generalized least squares (GLS). 91
28 Predicted Values and Residuals The AUTOREG procedure can produce two kinds of predicted values and corresponding residuals and confidence limits. The first kind of predicted value is obtained from only the structural part of the model; this predicted value is an estimate of the unconditional mean of the dependent variable at time t. The second kind of predicted value includes both the structural part of the model and the predicted value of the autoregressive error process. Both the structural part and autoregressive error process of the model (termed the full model) are used to forecast future values. 92
29 Tests for Serial Correlation: The Durbin-Watson Test
H_0: ρ = 0    H_1: ρ ≠ 0
DW = Σ_{t=2}^{n} (ê_t - ê_{t-1})^2 / Σ_{t=1}^{n} ê_t^2  (the d statistic), with 0 ≤ d ≤ 4
DW(d) ≈ 2(1 - ρ̂)  (approximation good only for large samples)
If ρ = 0, then d = 2; if ρ = 1, then d = 0; if ρ = -1, then d = 4 93
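A minimal sketch of the d statistic as defined above (plain Python; no particular econometrics package assumed):

```python
def durbin_watson(resid):
    """DW = sum_{t=2}^n (e_t - e_{t-1})^2 / sum_{t=1}^n e_t^2, with 0 <= d <= 4."""
    num = sum((resid[t] - resid[t - 1]) ** 2 for t in range(1, len(resid)))
    den = sum(e * e for e in resid)
    return num / den

# Perfectly positively autocorrelated residuals (no sign changes): d -> 0.
print(durbin_watson([1.0] * 50))           # 0.0
# Perfectly negatively autocorrelated residuals (alternating signs): d -> 4.
print(durbin_watson([1.0, -1.0] * 50))     # 3.96
```

Inverting the large-sample approximation above, d ≈ 2(1 - ρ̂) recovers ρ̂ ≈ 1 - d/2.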
30 The distribution of DW (d) is reported by Durbin and Watson (1950, 1951). d_L and d_U depend on α, k, and n. The DW test is invalid for models that contain no intercept and for models that contain lagged dependent variables. 94
31 The sampling distribution of d depends on the values of the exogenous variables, and hence Durbin and Watson derived upper (d_U) and lower (d_L) limits for the significance levels of d. Tables of the distribution are found in most econometrics textbooks. The Durbin-Watson test is perhaps the most used procedure in econometric applications. 95
33 Appendix G, Statistical Table, cont. 97
34 Although the DW test is the most commonly used test for serial correlation, it has limitations: (1) it tests only for first-order serial correlation; (2) the test may be inconclusive; (3) it cannot be applied in models with lagged dependent variables; (4) it cannot be applied in models without intercepts. 98
35 There are other tables for the DW test that have been prepared to take care of special situations. Some of these are: 1. R.W. Farebrother (1980) provides tables for regression models with no intercept term. 2. Savin and White (1977) present tables for the DW test for samples with 6 to 200 observations and for as many as 20 regressors. 99
36 3. Wallis (1972) gives tables for regression models with quarterly data. Here one would like to test for fourth-order rather than first-order autocorrelation. In this case, the DW statistic is:
d_4 = Σ_{t=5}^{n} (û_t - û_{t-4})^2 / Σ_{t=1}^{n} û_t^2
Wallis provides 5% critical values d_L and d_U for two situations: one where the k regressors include an intercept (but not a full set of seasonal dummy variables) and another where the regressors include four quarterly seasonal dummy variables. In each case the critical values are for testing H_0: ρ = 0 against H_1: ρ > 0. For the hypothesis H_1: ρ < 0, Wallis suggests that the appropriate critical values are (4 - d_U) and (4 - d_L). King and Giles (1978) give further significance points for these tests. 100
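The fourth-order statistic above can be sketched the same way as the ordinary DW statistic, simply differencing at lag 4 (the helper name is illustrative):

```python
def wallis_d4(resid):
    """d4 = sum_{t=5}^n (u_t - u_{t-4})^2 / sum_{t=1}^n u_t^2, for quarterly data."""
    num = sum((resid[t] - resid[t - 4]) ** 2 for t in range(4, len(resid)))
    den = sum(u * u for u in resid)
    return num / den

# A residual pattern that repeats exactly every four quarters gives d4 = 0,
# the fourth-order analogue of perfect positive first-order autocorrelation.
print(wallis_d4([1.0, 2.0, -1.0, -2.0] * 6))   # 0.0
```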
37 4. King (1981) gives the 5% points for d_L and d_U for quarterly time-series data with trend and/or seasonal dummy variables. These tables are for testing first-order autocorrelation. 5. King (1983) gives tables for the DW test for monthly data. With monthly data, we may wish to test for twelfth-order autocorrelation. 101
38 Nonparametric Runs Test (Gujarati, 1978)
More general than the DW test. Interest is in H_0: ρ = 0, a test of an AR(1) process in the error terms.
N+ = number of positive residuals
N- = number of negative residuals
N = number of observations
N_r = number of runs
Test statistic:
E[N_r] = 2 N+ N- / N + 1
VAR[N_r] = 2 N+ N- (2 N+ N- - N) / [N^2 (N - 1)]
Z = (N_r - E[N_r]) / sqrt(VAR[N_r]) ~ N(0, 1)
Reject H_0 (non-autocorrelation) if the test statistic is too large in absolute value. 102
40 In this example, sample evidence suggests the presence of positive serial correlation, the more common pattern in the residuals of models estimated with economic or financial data. 104
42 In the Greene problem for gasoline, DW = 0.786, so ρ̂ ≈ 1 - DW/2 = 0.607.
Use of the Nonparametric Runs Test:
N = 36, N+ = 19, N- = 17, N_r = 11
E[N_r] = 2 N+ N- / N + 1 = 2(19)(17)/36 + 1 = 18.94
VAR[N_r] = 2 N+ N- (2 N+ N- - N) / [N^2 (N - 1)] = 8.69
Z = (N_r - E[N_r]) / sqrt(VAR[N_r]) = (11 - 18.94) / 2.95 = -2.70
Z_crit = 1.96 at α = .05, so reject H_0: ρ = 0. 106
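The runs-test arithmetic can be checked with a short sketch, using the counts from the slide's gasoline example:

```python
import math

def runs_test_z(n_pos, n_neg, n_runs):
    """Z statistic for the nonparametric runs test on residual signs."""
    n = n_pos + n_neg
    e_r = 2 * n_pos * n_neg / n + 1                       # E[N_r]
    v_r = (2 * n_pos * n_neg * (2 * n_pos * n_neg - n)
           / (n ** 2 * (n - 1)))                          # VAR[N_r]
    return (n_runs - e_r) / math.sqrt(v_r)

z = runs_test_z(19, 17, 11)     # N+ = 19, N- = 17, N_r = 11
print(f"{z:.2f}")               # -2.70: |Z| > 1.96, so reject H0 at alpha = .05
```

Too few runs (here 11 against an expectation of about 19) means long stretches of same-signed residuals, the signature of positive serial correlation.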
43 Analysts must recognize that a good Durbin-Watson statistic is insufficient evidence upon which to conclude that the error structure is free of autocorrelation. The Durbin-Watson test applies only to first-order autocorrelation, and there is little reason to suppose that the correct model for the residuals is AR(1); a mixed autoregressive moving-average (ARMA) structure is much more likely to be correct, especially with quarterly, monthly, and weekly time-series data. The residuals can be modeled following the methodology of Box and Jenkins (1976). Owing to the higher frequencies of time-series data used in applied econometrics in recent years, the pattern of the error structure generally is more complex than the common AR(1) pattern. 107
44 Durbin h-Test
A large-sample test for autocorrelation when lagged dependent variables are present.
h = ρ̂ sqrt(n / (1 - n V(β̂))) ~ N(0, 1), where ρ̂ = 1 - (1/2)d, d is the DW statistic, and V(β̂) is the estimated variance of the coefficient associated with Y_{t-1}.
The test breaks down if n V(β̂) ≥ 1.
If the Durbin h-test breaks down, compute the OLS residuals û_t, then regress û_t on û_{t-1}, y_{t-1}, and the set of exogenous variables. The test for ρ = 0 is carried out by testing the significance of the coefficient on û_{t-1}. 108
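A sketch of the h statistic, with the breakdown condition made explicit (argument names are illustrative):

```python
import math

def durbin_h(d, n, v_beta):
    """h = rho_hat * sqrt(n / (1 - n*V(beta))), with rho_hat = 1 - d/2.

    d: Durbin-Watson statistic; n: sample size; v_beta: estimated variance
    of the coefficient on the lagged dependent variable Y_{t-1}.
    """
    if n * v_beta >= 1:
        # Test breaks down: fall back to regressing u_t on u_{t-1},
        # y_{t-1}, and the exogenous variables, as described above.
        raise ValueError("n*V(beta) >= 1: Durbin h is undefined")
    return (1 - d / 2) * math.sqrt(n / (1 - n * v_beta))
```

With d = 2 (no first-order autocorrelation), h is exactly 0 regardless of n and V(β̂).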
45 OLS estimates Presence of a lagged dependent variable 109
46 Calculation of the Durbin h-Test for the Greene Problem
In the Greene problem for gasoline demand:
ρ̂ = 1 - (1.639/2) = 0.1805, n = 35, V(β̂) = ( )^2
h = ρ̂ sqrt(n / (1 - n V(β̂))) 110
47 A General Test for Higher-Order Serial Correlation: The LM Test (Breusch and Pagan, 1980)
LM = Lagrange multiplier
y_t = β_1 + β_2 X_{2t} + ... + β_k X_{kt} + u_t,  t = 1, 2, ..., n
u_t = ρ_1 u_{t-1} + ρ_2 u_{t-2} + ... + ρ_p u_{t-p} + e_t,  e_t ~ IN(0, σ^2)
H_0: ρ_1 = ρ_2 = ... = ρ_p = 0
The X's may or may not include lagged dependent variables.
First: estimate by OLS and obtain the least squares residuals û_t.
Second: estimate û_t = γ_0 + γ_1 X_{1t} + ... + γ_k X_{kt} + Σ_{i=1}^{p} ρ_i û_{t-i} + v_t.
Third: test whether the coefficients of the û_{t-i} are all zero, using the conventional F-statistic. 111
48 Box-Pierce and Ljung-Box Tests
Check the serial correlation pattern of the residuals; we need to be sure that there is no serial correlation (we desire white noise).
Box and Pierce (1970) suggest looking at not just the first-order autocorrelation but the autocorrelations of all orders of the residuals.
Calculate Q = N Σ_{k=1}^{m} r_k^2, where r_k is the autocorrelation at lag k and N is the number of observations in the series.
If the fitted model is appropriate, Q ~ χ^2_{m-p}, where p is the number of estimated parameters.
Ljung and Box (1978) suggest a modification of the Q-statistic for moderate sample sizes:
Q* = N(N + 2) Σ_{k=1}^{m} (N - k)^{-1} r_k^2 112
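Both Q statistics can be sketched directly from the formulas, with r_k computed as the ordinary lag-k sample autocorrelation:

```python
def acf(x, k):
    """Lag-k sample autocorrelation r_k."""
    n = len(x)
    m = sum(x) / n
    num = sum((x[t] - m) * (x[t - k] - m) for t in range(k, n))
    den = sum((v - m) ** 2 for v in x)
    return num / den

def box_pierce_q(x, max_lag):
    """Q = N * sum_{k=1}^m r_k^2."""
    return len(x) * sum(acf(x, k) ** 2 for k in range(1, max_lag + 1))

def ljung_box_q(x, max_lag):
    """Q* = N(N+2) * sum_{k=1}^m (N-k)^{-1} r_k^2, the small-sample modification."""
    n = len(x)
    return n * (n + 2) * sum(acf(x, k) ** 2 / (n - k) for k in range(1, max_lag + 1))
```

Q* simply re-weights each r_k^2 by (N + 2)/(N - k) > 1, which matters most at the longest lags and in moderate samples.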
49 With the Box-Pierce or Ljung-Box tests, we examine the interface of structural models with time-series models, using the correlations and partial correlations of the residuals over time. The idea is to determine the appropriate pattern in the error structure from the autocorrelation and partial autocorrelation functions associated with the residuals. Autocorrelation functions tell us about moving-average (MA) patterns; partial autocorrelation functions tell us about autoregressive (AR) patterns. Anticipate ARMA error structures, particularly higher-order AR patterns, in the residuals of econometric models. 113
51 The test can be used for different specifications of the error process. For example, for
u_t = ρ_4 u_{t-4} + e_t,
estimate û_t = γ_0 + γ_1 X_{1t} + ... + γ_k X_{kt} + ρ_4 û_{t-4} + v_t
and test H_0: ρ_4 = 0. 115
57 With time-series data, in most cases serial correlation problems will surface, so analysts must examine the error structure carefully. Minimally:
- Graph the residuals over time
- Consider the significance of the Durbin-Watson statistic
- Consider higher-order autocorrelation structure via PROC ARIMA
- Consider the Godfrey LM test
- Consider the Box-Pierce or Ljung-Box tests (Q-statistics)
- Re-estimate econometric models with AR(p) error structures 121
58 Collinearity - Nature of Problem - Consequences - Introduction - Belsley, Kuh, Welsch Diagnostics Variance inflation factors Condition indices Variance-decomposition proportions - Circumvention of Problem: Ridge Regression 122
59 Collinearity Diagnostics
Multicollinearity refers to the presence of highly intercorrelated explanatory variables in regression models. It is not surprising that it is considered one of the most ubiquitous, significant, and difficult problems in applied econometrics, often referred to by modelers as "the familiar curse." Collinearity diagnostics measure how closely regressors are related to other regressors and how this relationship affects the stability and variance of the regression estimates. 123
60 Signs of Multicollinearity
Signs of multicollinearity in a regression analysis include:
(1) Large standard errors on the regression coefficients, so that estimates of the true model parameters become unstable and low t-values prevail.
(2) Parameter estimates that vary considerably from sample to sample.
(3) Drastic changes in the regression estimates after only minor data revisions.
(4) Conflicting conclusions from the usual tests of significance (such as the wrong sign for a parameter). 124
(5) Extreme correlations between pairs of variables.
(6) Omitting a variable from the equation results in smaller regression standard errors.
(7) A good fit that does not provide good forecasts. 125
62 We use multicollinearity diagnostics to: (1) produce a set of condition indices that signal the presence of one or more near dependencies among the variables (linear dependency, an extreme form of multicollinearity, occurs when there is an exact linear relationship among the variables); and (2) uncover the variables involved in particular near dependencies and assess the degree to which the estimated regression coefficients are degraded by their presence. In practice, if one exogenous variable has a high squared multiple correlation (R-squared) with the other independent variables, it is extremely unlikely that the exogenous variable in question contributes significantly to the prediction equation. When that R-squared is too high, the variable is, in essence, redundant. 126
63 The Ballentine 127
64 Definition of Collinearity 128
65 Orthogonal Variables Non-orthogonal Variables Key Points: (1) Sampling variances of estimated OLS coefficients increase sharply. (2) Greater sampling covariances for the OLS coefficients. 129
66 Ill-Conditioning, Multicollinearity, Collinearity
Deals with specific characteristics of the data matrix X -- a data problem, not a statistical problem.
Speak in terms of its severity rather than its existence or nonexistence.
Affects the structural integrity of econometric models.
The opposite of collinear is orthogonal. 130
Constitutes a threat to proper specification and effective estimation of a structural relationship:
- Larger variances (standard errors) of regression coefficients -- indistinguishable from the consequences of inadequate variability in the regressors
- Covariances among parameter estimates often large and of the wrong sign
- Difficulties in interpretation
- Wide confidence regions for parameters
- Increase in Type II error (accepting H0 when H0 is false); decreased power of tests 131
Variance Inflation Factor: VIF_i = 1 / (1 - R_i^2), where R_i^2 is the squared multiple correlation from regressing X_i on the other regressors. 132
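A one-line sketch of the VIF as defined above (the flagging threshold of 10 is a common rule of thumb, not something stated on the slides):

```python
def vif(r2_i):
    """VIF_i = 1 / (1 - R_i^2), where R_i^2 is the R-squared from
    regressing X_i on the remaining regressors."""
    return 1.0 / (1.0 - r2_i)

print(vif(0.75))   # 4.0 -- the variance of beta_i is inflated fourfold
```

A frequently used cutoff flags VIF_i > 10, i.e. R_i^2 > 0.9, as evidence of degrading collinearity.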
69 Condition Indices 133
74 Three possible cases of degrading collinearity. 138
77 Ridge Regression 141
81 Influence Diagnostics and Robust Regression
Concern: outliers -- particular observations exert undue influence on the regression results.
Influential observations may be a legitimate part of the data set.
Influential observations may cast doubt either on the validity of the observation or on the general adequacy of the model. 145
82 Diagnostics 146
93 Influence on the Variance of Regression Coefficients 157
95 Robust Regression
OLS suffers in performance in the presence of outliers and certain non-normal error distributions (heavy tails). Robust estimators are not sensitive to outliers; essentially, they down-weight data points that produce residuals large in magnitude. 159
97 Cutoffs for the influence diagnostics:
DFFITS: 2 sqrt(p/n) = .89
DFBETAS: 2 / sqrt(n) = .44
H (leverage): 2p/n = .40
COVRATIO: outside 1 ± 3p/n, i.e., above 1.59 or below .41 161
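All of these cutoffs are simple functions of p (number of parameters) and n (number of observations); a sketch follows. The slide's values (.89, .44, .40, 1.59, .41) correspond to one particular p and n from that example, which is not assumed here:

```python
import math

def influence_cutoffs(p, n):
    """Rule-of-thumb cutoffs for the influence diagnostics above."""
    return {
        "DFFITS":   2 * math.sqrt(p / n),   # flag |DFFITS| > 2*sqrt(p/n)
        "DFBETAS":  2 / math.sqrt(n),       # flag |DFBETAS| > 2/sqrt(n)
        "leverage": 2 * p / n,              # flag h_ii > 2p/n
        "COVRATIO": (1 - 3 * p / n,         # flag COVRATIO outside 1 +/- 3p/n
                     1 + 3 * p / n),
    }

print(influence_cutoffs(4, 100))
```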
More informationTesting for Unit Roots with Cointegrated Data
Discussion Paper No. 2015-57 August 19, 2015 http://www.economics-ejournal.org/economics/discussionpapers/2015-57 Testing for Unit Roots with Cointegrated Data W. Robert Reed Abstract This paper demonstrates
More informationEconomics 536 Lecture 7. Introduction to Specification Testing in Dynamic Econometric Models
University of Illinois Fall 2016 Department of Economics Roger Koenker Economics 536 Lecture 7 Introduction to Specification Testing in Dynamic Econometric Models In this lecture I want to briefly describe
More informationINTRODUCTORY REGRESSION ANALYSIS
;»»>? INTRODUCTORY REGRESSION ANALYSIS With Computer Application for Business and Economics Allen Webster Routledge Taylor & Francis Croup NEW YORK AND LONDON TABLE OF CONTENT IN DETAIL INTRODUCTORY REGRESSION
More informationCircle a single answer for each multiple choice question. Your choice should be made clearly.
TEST #1 STA 4853 March 4, 215 Name: Please read the following directions. DO NOT TURN THE PAGE UNTIL INSTRUCTED TO DO SO Directions This exam is closed book and closed notes. There are 31 questions. Circle
More informationEmpirical Market Microstructure Analysis (EMMA)
Empirical Market Microstructure Analysis (EMMA) Lecture 3: Statistical Building Blocks and Econometric Basics Prof. Dr. Michael Stein michael.stein@vwl.uni-freiburg.de Albert-Ludwigs-University of Freiburg
More informationRegression of Time Series
Mahlerʼs Guide to Regression of Time Series CAS Exam S prepared by Howard C. Mahler, FCAS Copyright 2016 by Howard C. Mahler. Study Aid 2016F-S-9Supplement Howard Mahler hmahler@mac.com www.howardmahler.com/teaching
More informationat least 50 and preferably 100 observations should be available to build a proper model
III Box-Jenkins Methods 1. Pros and Cons of ARIMA Forecasting a) need for data at least 50 and preferably 100 observations should be available to build a proper model used most frequently for hourly or
More informationUnivariate ARIMA Models
Univariate ARIMA Models ARIMA Model Building Steps: Identification: Using graphs, statistics, ACFs and PACFs, transformations, etc. to achieve stationary and tentatively identify patterns and model components.
More informationFinQuiz Notes
Reading 9 A time series is any series of data that varies over time e.g. the quarterly sales for a company during the past five years or daily returns of a security. When assumptions of the regression
More informationECONOMETRICS HONOR S EXAM REVIEW SESSION
ECONOMETRICS HONOR S EXAM REVIEW SESSION Eunice Han ehan@fas.harvard.edu March 26 th, 2013 Harvard University Information 2 Exam: April 3 rd 3-6pm @ Emerson 105 Bring a calculator and extra pens. Notes
More informationPhD/MA Econometrics Examination. January, 2015 PART A. (Answer any TWO from Part A)
PhD/MA Econometrics Examination January, 2015 Total Time: 8 hours MA students are required to answer from A and B. PhD students are required to answer from A, B, and C. PART A (Answer any TWO from Part
More informationFinancial Econometrics
Financial Econometrics Multivariate Time Series Analysis: VAR Gerald P. Dwyer Trinity College, Dublin January 2013 GPD (TCD) VAR 01/13 1 / 25 Structural equations Suppose have simultaneous system for supply
More informationEconometrics. 9) Heteroscedasticity and autocorrelation
30C00200 Econometrics 9) Heteroscedasticity and autocorrelation Timo Kuosmanen Professor, Ph.D. http://nomepre.net/index.php/timokuosmanen Today s topics Heteroscedasticity Possible causes Testing for
More informationAuto correlation 2. Note: In general we can have AR(p) errors which implies p lagged terms in the error structure, i.e.,
1 Motivation Auto correlation 2 Autocorrelation occurs when what happens today has an impact on what happens tomorrow, and perhaps further into the future This is a phenomena mainly found in time-series
More informationIntroductory Econometrics
Based on the textbook by Wooldridge: : A Modern Approach Robert M. Kunst robert.kunst@univie.ac.at University of Vienna and Institute for Advanced Studies Vienna November 23, 2013 Outline Introduction
More informationAUTOCORRELATION. Phung Thanh Binh
AUTOCORRELATION Phung Thanh Binh OUTLINE Time series Gauss-Markov conditions The nature of autocorrelation Causes of autocorrelation Consequences of autocorrelation Detecting autocorrelation Remedial measures
More informationAutocorrelation. Think of autocorrelation as signifying a systematic relationship between the residuals measured at different points in time
Autocorrelation Given the model Y t = b 0 + b 1 X t + u t Think of autocorrelation as signifying a systematic relationship between the residuals measured at different points in time This could be caused
More informationARIMA Modelling and Forecasting
ARIMA Modelling and Forecasting Economic time series often appear nonstationary, because of trends, seasonal patterns, cycles, etc. However, the differences may appear stationary. Δx t x t x t 1 (first
More informationACE 564 Spring Lecture 8. Violations of Basic Assumptions I: Multicollinearity and Non-Sample Information. by Professor Scott H.
ACE 564 Spring 2006 Lecture 8 Violations of Basic Assumptions I: Multicollinearity and Non-Sample Information by Professor Scott H. Irwin Readings: Griffiths, Hill and Judge. "Collinear Economic Variables,
More informationLinear Regression with Time Series Data
Econometrics 2 Linear Regression with Time Series Data Heino Bohn Nielsen 1of21 Outline (1) The linear regression model, identification and estimation. (2) Assumptions and results: (a) Consistency. (b)
More informationECON 4230 Intermediate Econometric Theory Exam
ECON 4230 Intermediate Econometric Theory Exam Multiple Choice (20 pts). Circle the best answer. 1. The Classical assumption of mean zero errors is satisfied if the regression model a) is linear in the
More informationEconometrics for Policy Analysis A Train The Trainer Workshop Oct 22-28, 2016 Organized by African Heritage Institution
Econometrics for Policy Analysis A Train The Trainer Workshop Oct 22-28, 2016 Organized by African Heritage Institution Delivered by Dr. Nathaniel E. Urama Department of Economics, University of Nigeria,
More informationUnivariate linear models
Univariate linear models The specification process of an univariate ARIMA model is based on the theoretical properties of the different processes and it is also important the observation and interpretation
More informationFORECASTING SUGARCANE PRODUCTION IN INDIA WITH ARIMA MODEL
FORECASTING SUGARCANE PRODUCTION IN INDIA WITH ARIMA MODEL B. N. MANDAL Abstract: Yearly sugarcane production data for the period of - to - of India were analyzed by time-series methods. Autocorrelation
More informationLab: Box-Jenkins Methodology - US Wholesale Price Indicator
Lab: Box-Jenkins Methodology - US Wholesale Price Indicator In this lab we explore the Box-Jenkins methodology by applying it to a time-series data set comprising quarterly observations of the US Wholesale
More informationCovers Chapter 10-12, some of 16, some of 18 in Wooldridge. Regression Analysis with Time Series Data
Covers Chapter 10-12, some of 16, some of 18 in Wooldridge Regression Analysis with Time Series Data Obviously time series data different from cross section in terms of source of variation in x and y temporal
More informationAutoregressive Moving Average (ARMA) Models and their Practical Applications
Autoregressive Moving Average (ARMA) Models and their Practical Applications Massimo Guidolin February 2018 1 Essential Concepts in Time Series Analysis 1.1 Time Series and Their Properties Time series:
More informationCHAPTER 4: Forecasting by Regression
CHAPTER 4: Forecasting by Regression Prof. Alan Wan 1 / 57 Table of contents 1. Revision of Linear Regression 3.1 First-order Autocorrelation and the Durbin-Watson Test 3.2 Correction for Autocorrelation
More informationLeonor Ayyangar, Health Economics Resource Center VA Palo Alto Health Care System Menlo Park, CA
Skewness, Multicollinearity, Heteroskedasticity - You Name It, Cost Data Have It! Solutions to Violations of Assumptions of Ordinary Least Squares Regression Models Using SAS Leonor Ayyangar, Health Economics
More informationEconometrics Part Three
!1 I. Heteroskedasticity A. Definition 1. The variance of the error term is correlated with one of the explanatory variables 2. Example -- the variance of actual spending around the consumption line increases
More informationOkun's Law Testing Using Modern Statistical Data. Ekaterina Kabanova, Ilona V. Tregub
Okun's Law Testing Using Modern Statistical Data Ekaterina Kabanova, Ilona V. Tregub The Finance University under the Government of the Russian Federation International Finance Faculty, Moscow, Russia
More informationThe Multiple Regression Model Estimation
Lesson 5 The Multiple Regression Model Estimation Pilar González and Susan Orbe Dpt Applied Econometrics III (Econometrics and Statistics) Pilar González and Susan Orbe OCW 2014 Lesson 5 Regression model:
More informationA Bootstrap Test for Causality with Endogenous Lag Length Choice. - theory and application in finance
CESIS Electronic Working Paper Series Paper No. 223 A Bootstrap Test for Causality with Endogenous Lag Length Choice - theory and application in finance R. Scott Hacker and Abdulnasser Hatemi-J April 200
More informationStationarity and Cointegration analysis. Tinashe Bvirindi
Stationarity and Cointegration analysis By Tinashe Bvirindi tbvirindi@gmail.com layout Unit root testing Cointegration Vector Auto-regressions Cointegration in Multivariate systems Introduction Stationarity
More informationin the time series. The relation between y and x is contemporaneous.
9 Regression with Time Series 9.1 Some Basic Concepts Static Models (1) y t = β 0 + β 1 x t + u t t = 1, 2,..., T, where T is the number of observation in the time series. The relation between y and x
More informationTesting for Regime Switching in Singaporean Business Cycles
Testing for Regime Switching in Singaporean Business Cycles Robert Breunig School of Economics Faculty of Economics and Commerce Australian National University and Alison Stegman Research School of Pacific
More informationFinancial Time Series Analysis: Part II
Department of Mathematics and Statistics, University of Vaasa, Finland Spring 2017 1 Unit root Deterministic trend Stochastic trend Testing for unit root ADF-test (Augmented Dickey-Fuller test) Testing
More informationMulticollinearity and A Ridge Parameter Estimation Approach
Journal of Modern Applied Statistical Methods Volume 15 Issue Article 5 11-1-016 Multicollinearity and A Ridge Parameter Estimation Approach Ghadban Khalaf King Khalid University, albadran50@yahoo.com
More informationArma-Arch Modeling Of The Returns Of First Bank Of Nigeria
Arma-Arch Modeling Of The Returns Of First Bank Of Nigeria Emmanuel Alphonsus Akpan Imoh Udo Moffat Department of Mathematics and Statistics University of Uyo, Nigeria Ntiedo Bassey Ekpo Department of
More informationAre Forecast Updates Progressive?
CIRJE-F-736 Are Forecast Updates Progressive? Chia-Lin Chang National Chung Hsing University Philip Hans Franses Erasmus University Rotterdam Michael McAleer Erasmus University Rotterdam and Tinbergen
More informationProblem Set 2: Box-Jenkins methodology
Problem Set : Box-Jenkins methodology 1) For an AR1) process we have: γ0) = σ ε 1 φ σ ε γ0) = 1 φ Hence, For a MA1) process, p lim R = φ γ0) = 1 + θ )σ ε σ ε 1 = γ0) 1 + θ Therefore, p lim R = 1 1 1 +
More informationEconometrics I. Professor William Greene Stern School of Business Department of Economics 25-1/25. Part 25: Time Series
Econometrics I Professor William Greene Stern School of Business Department of Economics 25-1/25 Econometrics I Part 25 Time Series 25-2/25 Modeling an Economic Time Series Observed y 0, y 1,, y t, What
More information2. Linear regression with multiple regressors
2. Linear regression with multiple regressors Aim of this section: Introduction of the multiple regression model OLS estimation in multiple regression Measures-of-fit in multiple regression Assumptions
More informationChapter 5. Classical linear regression model assumptions and diagnostics. Introductory Econometrics for Finance c Chris Brooks
Chapter 5 Classical linear regression model assumptions and diagnostics Introductory Econometrics for Finance c Chris Brooks 2013 1 Violation of the Assumptions of the CLRM Recall that we assumed of the
More information1 The Multiple Regression Model: Freeing Up the Classical Assumptions
1 The Multiple Regression Model: Freeing Up the Classical Assumptions Some or all of classical assumptions were crucial for many of the derivations of the previous chapters. Derivation of the OLS estimator
More informationAre Forecast Updates Progressive?
MPRA Munich Personal RePEc Archive Are Forecast Updates Progressive? Chia-Lin Chang and Philip Hans Franses and Michael McAleer National Chung Hsing University, Erasmus University Rotterdam, Erasmus University
More informationSTAT Financial Time Series
STAT 6104 - Financial Time Series Chapter 4 - Estimation in the time Domain Chun Yip Yau (CUHK) STAT 6104:Financial Time Series 1 / 46 Agenda 1 Introduction 2 Moment Estimates 3 Autoregressive Models (AR
More information