Autocorrelation or Serial Correlation

1 Chapter 6 Autocorrelation or Serial Correlation

2 Section 6.1 Introduction 2

3 Evaluating Econometric Work How does an analyst know when the econometric work is completed? 3

4 4 Evaluating Econometric Work "Econometric results: indeed a term that for many economists conjures up horrifying visions of well-meaning but perhaps marginally skilled, likely nocturnal, individuals sorting through endless piles of computer print-outs. One final print-out is then chosen for seemingly mysterious reasons and the rest discarded to be recycled through a local paper processor and another computer printer for other econometricians to repeat the process ad infinitum. Besides supplying a lucrative business for the paper recyclers, what useful output, if any, results from such a process? This question lies at the heart of the so-called science of econometrics as currently applied, a practice which has been called data-mining, number crunching, model sifting, data grubbing, fishing, data massaging, and even alchemy, among other less palatable terms. All of these euphemisms describe, basically, the same process: choosing an econometric model based on repeated experimentation with available sample data." - Ziemer (1984)

5 5 Evaluating Econometric Work Econometrics may not have the everlasting charm of Holmesian characters and adventures, or even a famous resident of Baker Street, but there is much in his methodological approach to the solving of criminal cases that is of relevance to applied econometric modeling. Holmesian detection may be interpreted as accommodating the relationship between data theory, modeling procedures, deductions and inferences, analysis of biases, testing of theories, re-evaluation and reformulation of theories, and finally reaching a solution to the problem at hand. With this in mind, can applied econometricians learn anything from the master of detection? - McAleer (1994)

6 Key Diagnostics (see Beggs, 1988)
Diagnostic -- Component of Econometric Model Under Consideration
Serial Correlation (Autocorrelation) -- Error (or Disturbance) Term
Heteroscedasticity -- Error (or Disturbance) Term
Collinearity Diagnostics -- Explanatory Variables
Influence Diagnostics -- Observations
Structural Change -- Structural Parameters 6

7 Section 6.2 Autocorrelation or Serial Correlation

8 8 Autocorrelation or Serial Correlation
Definition
Consequences
Formal Tests: Durbin-Watson Test; Nonparametric Runs Test; Durbin's h-test; Durbin's m-test; Lagrange Multiplier (LM) Test; Box-Pierce Test (Q Statistic); Ljung-Box Test (Q* Statistic, a small-sample modification of the Box-Pierce Q Statistic)
Solution: Generalized Least Squares

9 Formal Definition of Autocorrelation or Serial Correlation Autocorrelation or serial correlation refers to the lack of independence of error (or disturbance) terms. Autocorrelation and serial correlation refer to the same phenomenon. Simply put, a systematic pattern exists in the residuals of the econometric model. Ideally, the residuals, which represent a composite of all factors not embedded in the model, should exhibit no pattern. That is to say, the residuals should follow a white-noise (or random) pattern. 9
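The contrast between white-noise and serially correlated disturbances is easy to visualize with a small simulation. The sketch below is illustrative only; the AR(1) coefficient of 0.8, the seed, and the data set name are assumptions, not part of the course example.

/* Illustrative sketch: simulate white-noise and AR(1) disturbances */
data errsim;
   call streaminit(12345);               /* arbitrary seed                   */
   e_ar1 = 0;
   do t = 1 to 100;
      v     = rand('normal', 0, 1);      /* independent innovation           */
      e_wn  = v;                         /* white-noise error: no pattern    */
      e_ar1 = 0.8*e_ar1 + v;             /* AR(1) error: depends on its past */
      output;
   end;
run;

proc sgplot data=errsim;
   series x=t y=e_wn;
   series x=t y=e_ar1;
   refline 0 / axis=y;
run;

Plotted over t, the AR(1) series drifts in long swings while the white-noise series shows no systematic pattern.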

10 Prevalence of Serial Correlation With the use of time-series data in econometric applications, serial correlation is public enemy number one. Systematic patterns in the error terms commonly arise due to the (inadvertent) omission of explanatory variables in econometric models. These variables might come from disciplines other than economics, finance, or business; for example, psychology and sociology. Alternatively, these variables might represent factors that simply are difficult to quantify, such as tastes and preferences of consumers or technological innovation on the part of producers. 10

11 Consequences of Serial Correlation (Bishop, 1981)
Errors contaminated with autocorrelation or serial correlation
Potential for discovering spurious relationships due to problems with autocorrelated errors (Granger and Newbold, 1974)
Difficulties with structural analysis and forecasting
If the error structure is autoregressive, then OLS estimates of the regression parameters are (1) unbiased and (2) consistent, but (3) inefficient in small and in large samples. 11 continued...

12 Consequences of Serial Correlation The estimates of the standard errors of the coefficients in any econometric model are biased downward if the residuals are positively autocorrelated. They are biased upward if the residuals are negatively autocorrelated. Therefore, the calculated t statistic is biased upward or downward in the direction opposite to the bias in the estimated standard error of that coefficient. Granger and Newbold (1974) further suggest that the econometric results can be defined as nonsense if R² > DW(d). 12 continued...

13 Consequences of Serial Correlation Positive autocorrelation of the errors generally tends to make the estimate of the error variance too small, so confidence intervals are too narrow and null hypotheses are rejected with a higher probability than the stated significance level. Negative autocorrelation of the errors generally tends to make the estimate of the error variance too large, so confidence intervals are too wide; also, the power of significance tests is reduced. With either positive or negative autocorrelation, least-squares parameter estimates usually are not as efficient as generalized least-squares parameter estimates. 13

14 Regression with Autocorrelated Errors 14 Ordinary regression analysis is based on several statistical assumptions. One key assumption is that the errors are independent of each other. However, with time series data, the ordinary regression residuals usually are correlated over time. Violation of the independent errors assumption has three important consequences for ordinary regression. First, statistical tests of the significance of the parameters and the confidence limits for the predicted values are not correct. Second, the estimates of the regression coefficients are not as efficient as they would be if the autocorrelation were taken into account. Third, because the ordinary regression residuals are not independent, they contain information that can be used to improve the prediction of future values.

15 Solution to the Serial Correlation Problem Generalized Least Squares (GLS) The AUTOREG procedure solves this problem by augmenting the regression model with an autoregressive model for the random error, thereby accounting for the systematic pattern of the errors. Instead of the usual regression model, the following autoregressive error model is used:
y_t = x_t'β + ε_t
ε_t = -φ_1 ε_{t-1} - φ_2 ε_{t-2} - ... - φ_m ε_{t-m} + v_t
v_t ~ IN(0, σ²)
The notation indicates that each v_t is normally and independently distributed with mean 0 and variance σ². continued...

16 Solution to the Serial Correlation Problem Generalized Least Squares (GLS) By simultaneously estimating the regression coefficients β and the autoregressive error model parameters φ i, the AUTOREG procedure corrects the regression estimates for autocorrelation. Thus, this kind of regression analysis is often called autoregressive error correction or serial correlation correction. This technique is also called the use of generalized least squares (GLS). 16
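A minimal PROC AUTOREG call of this form fits the regression and the autoregressive error model jointly; the data set and variable names (mydata, y, x1, x2) and the choice of m = 2 are placeholders, not part of the course example.

/* Sketch: regression with an AR(2) error model (GLS / autoregressive error correction) */
proc autoreg data=mydata;
   model y = x1 x2 / nlag=2        /* order m of the autoregressive error model       */
                     method=ml     /* maximum likelihood; method=yw gives Yule-Walker */
                     dwprob;       /* Durbin-Watson statistic with p-values           */
run;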

17 Predicted Values and Residuals The AUTOREG procedure can produce two kinds of predicted values and corresponding residuals and confidence limits. The first kind of predicted value is obtained from only the structural part of the model. This predicted value is an estimate of the unconditional mean of the dependent variable at time t. The second kind of predicted value includes both the structural part of the model and the predicted value of the autoregressive error process. Both the structural part and autoregressive error process of the model (termed the full model) are used to forecast future values. 17 continued...

18 Predicted Values and Residuals Use the OUTPUT statement to store predicted values and residuals in a SAS data set and to output other values such as confidence limits and variance estimates. The P= option specifies an output variable to contain the full model predicted values. The PM= option names an output variable for the predicted (unconditional) mean. The R= and RM= options specify output variables for the corresponding residuals, computed as the actual value minus the predicted value. 18
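A sketch of the OUTPUT statement, continuing the hypothetical model above; only the options named on the slide are shown.

/* Sketch: store both kinds of predicted values and residuals */
proc autoreg data=mydata;
   model y = x1 x2 / nlag=2 method=ml;
   output out=pred p=yhat_full  r=resid_full    /* full model (structural part + AR error)    */
                   pm=yhat_mean rm=resid_mean;  /* structural (unconditional mean) part only   */
   /* further OUTPUT options supply confidence limits and variance estimates */
run;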

19 Serial Correlation Disturbance terms are not independent:
Y_t = β_0 + β_1 X_1t + β_2 X_2t + ... + β_k X_kt + ε_t,  t = 1, 2, ..., T
E(ε_i ε_j) ≠ 0 for i ≠ j
The correlation between ε_t and ε_{t-k} is called an autocorrelation of order k. continued...

20 Serial Correlation A graphical analysis is recommended: plot the residuals over time to determine whether a non-random or systematic pattern exists. (Figures: residuals plotted against time, illustrating positive correlation and negative correlation.) 20...
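One way to produce such a plot in SAS, assuming the residuals are first saved from PROC REG (all data set and variable names here are placeholders):

/* Sketch: save OLS residuals and plot them over time */
proc reg data=mydata noprint;
   model y = x1 x2;
   output out=olsout r=resid;      /* OLS residuals */
run;

proc sgplot data=olsout;
   series x=year y=resid / markers;
   refline 0 / axis=y;             /* residuals should scatter randomly about zero */
run;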

21 Section 6.3 Tests for Serial Correlation

22 The Durbin-Watson Test AR(1) process: ε_t = ρ ε_{t-1} + v_t
H_0: ρ = 0    H_1: ρ ≠ 0
DW (d) statistic: DW(d) = Σ_{t=2}^{n} (ê_t - ê_{t-1})² / Σ_{t=1}^{n} ê_t²
DW(d) ≈ 2(1 - ρ̂), where ρ̂ = Σ_{t=2}^{n} ê_t ê_{t-1} / Σ_{t=1}^{n} ê_t²
so that if ρ = 0, then d ≈ 2; if ρ = 1, then d ≈ 0; if ρ = -1, then d ≈ 4. 22 continued...

23 The Durbin-Watson Test d_L and d_U depend on α, k, and n. DW is invalid with models that contain no intercept and models that contain lagged dependent variables. The distribution of DW(d) is reported by Durbin and Watson (1950, 1951). 23 continued...

24 (Figure: sampling distribution f(d) of the Durbin-Watson statistic with its decision regions.) Reject H_0 (evidence of positive autocorrelation) if d < d_L; zone of indecision if d_L ≤ d ≤ d_U; accept H_0 if d_U < d < 4 - d_U; zone of indecision if 4 - d_U ≤ d ≤ 4 - d_L; reject H_0 (evidence of negative autocorrelation) if d > 4 - d_L. 24 continued...

25 The Durbin-Watson Test The sampling distribution of d depends on the values of the exogenous variables and hence Durbin and Watson derived upper (d U ) limits and lower (d L ) limits for the significance levels for d. Tables of the distribution are found in most econometric textbooks. The Durbin-Watson test perhaps is the most used procedure in econometric applications. 25
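In SAS the statistic and its p-values are requested on the MODEL statement of PROC REG; a sketch with placeholder names:

/* Sketch: Durbin-Watson test after OLS */
proc reg data=mydata;
   model y = x1 x2 / dw dwprob;    /* prints DW(d) along with Pr<DW and Pr>DW */
run;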

26 The Durbin-Watson Statistic 26 continued...

27 27 Appendix G: Statistical Table

28 Limitations of the Durbin-Watson Test Although the Durbin-Watson test is the most commonly used test for serial correlation, there are limitations: 1. The test is for first-order serial correlation only. 2. The test might be inconclusive. 3. The test cannot be applied in models with lagged dependent variables. 4. The test cannot be applied in models without intercepts. 28

29 Additional Tables There are other tables for the DW test that have been prepared to take care of special situations. Some of these are: 1. R.W. Farebrother (1980) provides tables for regression models with no intercept term. 2. Savin and White (1977) present tables for the DW test for samples with 6 to 200 observations and for as many as 20 regressors. 29 continued...

30 Additional Tables 3. Wallis (1972) gives tables for regression models with quarterly data. Here you want to test for fourth-order autocorrelation rather than first-order autocorrelation. In this case, the DW statistic is
d_4 = Σ_{t=5}^{n} (û_t - û_{t-4})² / Σ_{t=1}^{n} û_t²
Wallis provides 5% critical values d_L and d_U for two situations: one where the k regressors include an intercept (but not a full set of seasonal dummy variables) and another where the regressors include four quarterly seasonal dummy variables. In each case the critical values are for testing H_0: ρ = 0 against H_1: ρ > 0. For the hypothesis H_1: ρ < 0, Wallis suggests that the appropriate critical values are (4 - d_U) and (4 - d_L). King and Giles (1978) give further significance points for the Wallis test. 30 continued...

31 Additional Tables 4. King (1981) gives the 5% points for d_L and d_U for quarterly time-series data with trend and/or seasonal dummy variables. These tables are for testing first-order autocorrelation. 5. King (1983) gives tables for the DW test for monthly data. In the case of monthly data, you want to test for twelfth-order autocorrelation. 31
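Rather than consulting these special tables, PROC AUTOREG will report generalized Durbin-Watson statistics up to a chosen order together with p-values; the sketch below, with placeholder names, requests orders 1 through 4 for quarterly data.

/* Sketch: generalized Durbin-Watson statistics up to order 4 */
proc autoreg data=quarterly;
   model y = x1 x2 / dw=4 dwprob;  /* DW statistics of orders 1-4 with p-values */
run;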

32 Nonparametric Runs Test (Gujarati, 1978) More general than the DW test. Interest in H_0: ρ = 0, a test of an AR(1) process in the error terms.
N+ = number of positive residuals; N- = number of negative residuals; N = number of observations; N_r = number of runs
E[N_r] = 2 N+ N- / N + 1
VAR[N_r] = 2 N+ N- (2 N+ N- - N) / (N²(N - 1))
Test statistic: Z = (N_r - E[N_r]) / sqrt(VAR[N_r]) ~ N(0,1)
Reject H_0 (non-autocorrelation) if the test statistic is too large in absolute value. 32
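The runs test is not printed by PROC REG or PROC AUTOREG, but it can be computed directly from the saved residuals. The DATA step below is a sketch that assumes the OLS residuals are stored in a data set olsout under the variable name resid, as in the earlier plotting sketch.

/* Sketch: nonparametric runs test on saved OLS residuals */
data _null_;
   retain nplus 0 nminus 0 nruns 0 lastsign .;
   set olsout end=eof;
   sign = (resid >= 0);                      /* 1 = nonnegative residual, 0 = negative */
   if sign then nplus + 1; else nminus + 1;
   if sign ne lastsign then nruns + 1;       /* a new run begins when the sign changes */
   lastsign = sign;
   if eof then do;
      n   = nplus + nminus;
      ENr = 2*nplus*nminus/n + 1;                                  /* E[Nr]    */
      VNr = 2*nplus*nminus*(2*nplus*nminus - n) / (n*n*(n - 1));   /* VAR[Nr]  */
      z   = (nruns - ENr) / sqrt(VNr);                             /* ~ N(0,1) */
      put 'Runs test: N+=' nplus 'N-=' nminus 'runs=' nruns 'Z=' z;
   end;
run;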

33 The REG Procedure Model: MODEL1 Dependent Variable: lnpcg Number of Observations Read 36 Number of Observations Used 36 Analysis of Variance Sum of Mean Source DF Squares Square F Value Pr > F Model <.0001 Error Corrected Total Root MSE R-Square Dependent Mean Adj R-Sq Coeff Var continued...

34 Parameter Estimates Parameter Standard Variable DF Estimate Error t Value Pr > t Intercept <.0001 lny <.0001 lnpg lnpnc lnpuc lnppt t

35 The REG Procedure Model: MODEL1 Dependent Variable: lnpcg Durbin-Watson D Pr < DW <.0001 Pr > DW Number of Observations 36 1st Order Autocorrelation NOTE: Pr<DW is the p-value for testing positive autocorrelation, and Pr>DW is the p-value for testing negative autocorrelation. In this example, sample evidence exists to suggest the presence of positive serial correlation, which is the more common form of pattern in the residuals in regard to the use of economic or financial data. 35

36 (Listing of the OLS residuals, lnpcgres, by year and observation number.)

37 The Greene Problem In the Greene problem for gasoline, the DW statistic and ρ̂ are reported on the preceding output. Use of the Nonparametric Runs test: N = 36, N+ = 19, N- = 17, N_r = 11.
E[N_r] = 2 N+ N- / N + 1 = 2(19)(17)/36 + 1 ≈ 18.94
VAR[N_r] = 2 N+ N- (2 N+ N- - N) / (N²(N - 1)) = 646(610)/45360 ≈ 8.69
Z = (N_r - E[N_r]) / sqrt(VAR[N_r]) = (11 - 18.94) / 2.95 ≈ -2.69
Since |Z| = 2.69 > Z_crit = 1.96 at α = .05, reject H_0: ρ = 0. 37

38 38 Analysis Limitations Analysts must recognize that a good Durbin-Watson statistic is insufficient evidence upon which to conclude that the error structure is contamination-free in terms of autocorrelation. The Durbin-Watson test is applicable only to first-order autocorrelation. There is little reason to suppose that the correct model for the residuals is AR(1). A mixed autoregressive, moving-average (ARMA) structure is much more likely to be correct, especially with quarterly, monthly, and weekly frequencies of time-series data. Modeling of the residuals can be employed following the methodology of Box and Jenkins (1976). Owing to the higher frequencies of time-series data used in applied econometrics in recent years, the pattern of the error structure generally is more complex than the common AR(1) pattern.

39 A General Test for Higher-Order Serial Correlation The LM Test (Breusch and Pagan, 1980) LM - Lagrange multiplier
y_t = β_0 + β_1 X_1t + ... + β_k X_kt + u_t,  t = 1, 2, ..., n
u_t = ρ_1 u_{t-1} + ρ_2 u_{t-2} + ... + ρ_p u_{t-p} + e_t,  e_t ~ IN(0, σ²)
H_0: ρ_1 = ρ_2 = ... = ρ_p = 0
The Xs might or might not include lagged dependent variables.
1. Estimate by OLS and obtain the least squares residuals û_t.
2. Estimate û_t = γ_0 + γ_1 X_1t + ... + γ_k X_kt + Σ_{i=1}^{p} ρ_i û_{t-i} + v_t.
3. Test whether the coefficients of the û_{t-i} are all zero. Use the conventional F statistic. 39
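PROC AUTOREG reports this LM test through the GODFREY= option; the order p = 4 and all names below are assumptions.

/* Sketch: Breusch-Godfrey LM tests against AR(1) through AR(4) alternatives */
proc autoreg data=mydata;
   model y = x1 x2 / godfrey=4;
run;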

40 40 Box-Pierce or Ljung-Box Tests Check the serial correlation pattern of the residuals. You must be sure that there is no serial correlation (desire white noise). H_0: no pattern in the residuals (the residuals are white noise). Box and Pierce (1970) suggest looking at not only the first-order autocorrelation but the autocorrelations of all orders of the residuals. Calculate
Q = N Σ_{k=1}^{m} r_k²
where r_k is the autocorrelation at lag k and N is the number of observations in the series. If the fitted model is appropriate, Q is approximately distributed as χ² with m degrees of freedom. Ljung and Box (1978) suggest a modification of the Q statistic for moderate sample sizes:
Q* = N(N + 2) Σ_{k=1}^{m} r_k² / (N - k) 40

41 Box-Pierce or Ljung-Box Tests With the Box-Pierce or Ljung-Box tests, you examine the interface of structural models with time-series models. Use the correlations and partial correlations of the residuals over time. The idea is to determine the appropriate pattern in the error structure from the autocorrelation and partial autocorrelation functions associated with the residuals. Autocorrelation functions tell you about moving average (MA) patterns. Partial autocorrelation functions tell you about autoregressive (AR) patterns. Anticipate ARMA error structures, particularly higher-order AR patterns in residuals of econometric models. 41
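In SAS, the residual ACF, PACF, and the Ljung-Box white-noise check come from the IDENTIFY statement of PROC ARIMA applied to the saved residual series (data set and variable names assumed, as before):

/* Sketch: ACF, PACF, and white-noise (Q*) check on the OLS residuals */
proc arima data=olsout;
   identify var=resid nlag=24;    /* prints autocorrelations, partial autocorrelations,
                                     and the Autocorrelation Check for White Noise */
run;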

42 Section 6.4 Sample Problem: The Demand for Shrimp

43 The REG Procedure Model: MODEL1 Dependent Variable: QSHRIMP Number of Observations Read 97 Number of Observations Used 97 Analysis of Variance Sum of Mean Source DF Squares Square F Value Pr > F Model <.0001 Error Corrected Total Root MSE R-Square Dependent Mean Adj R-Sq Coeff Var continued...

44 Parameter Estimates Parameter Standard Variable DF Estimate Error t Value Pr > t Intercept PSHRIMP <.0001 PFIN PSHELL ADSHRIMP ADFIN ADSHELL

45 The REG Procedure Output The REG Procedure Model: MODEL1 Dependent Variable: QSHRIMP Durbin-Watson D Pr < DW Pr > DW Number of Observations 97 1st Order Autocorrelation NOTE: Pr<DW is the p-value for testing positive autocorrelation, and Pr>DW is the p-value for testing negative autocorrelation. Conclusion: No AR(1) pattern in the residuals 45

46 (Figure: plot of the OLS residuals, RESID.)

47 The ARIMA Procedure Output Name of Variable = resqshrimp Mean of Working Series -12E-16 Number of Observations 97 The autocorrelation function (ACF): a plot of the correlation of the residuals at various lags, Corr(e_t, e_{t-k}), k = 0, 1, 2, ..., 24. The ACF suggests an MA(3) pattern. (Autocorrelations table: lag, covariance, correlation, standard error, with bar plot.) 47

48 Partial Autocorrelations The partial autocorrelation function (PACF): a plot of the correlation of the residuals at various lags after netting out the intermediate lags. The PACF suggests an AR(3) pattern. (Partial autocorrelations table: lag, correlation, with bar plot.) 48

49 PROC ARIMA Output (Remaining partial autocorrelations and the Autocorrelation Check for White Noise table: to-lag, chi-square, DF, Pr > ChiSq, autocorrelations.) The Ljung-Box Q* statistic reveals that the residual series is not white noise. 49

50 Correlogram of RESID Presence of MA(3), AR(3) pattern 50

51 Durbin-Watson Statistics Order DW Pr < DW Pr > DW NOTE: Pr<DW is the p-value for testing positive autocorrelation, and Pr>DW is the p-value for testing negative autocorrelation. Godfrey's Serial Correlation Test Alternative LM Pr > LM 51 AR(1) AR(2) AR(3) Presence of AR(3) pattern

52 OLS Estimates Standard Approx Variable DF Estimate Error t Value Pr > t Intercept PSHRIMP <.0001 PFIN PSHELL ADSHRIMP ADFIN ADSHELL

53 Breusch-Godfrey Serial Correlation LM Test: F-statistic Prob. F(3,87) Obs*R-squared Prob. Chi-Square(3) Test Equation: Dependent Variable: RESID Method: Least Squares Sample: 1 97 Included observations: 97 Presample missing value lagged residuals set to zero. Coefficient Std. Error t-statistic Prob. C PSHRIMP PFIN PSHELL ADSHRIMP ADSHELL ADFIN RESID(-1) RESID(-2) RESID(-3) Presence of AR(3) Pattern 53 R-squared Mean dependent var -3.13E-15 Adjusted R-squared S.D. dependent var S.E. of regression Akaike info criterion Sum squared resid Schwarz criterion Log likelihood Hannan-Quinn criter F-statistic Durbin-Watson stat Prob(F-statistic)

54 Correcting for serial correlation through the use of PROC AUTOREG Use of Yule- Walker estimates of φ 1, φ 2, and φ 3 Estimates of Autoregressive Parameters Standard Lag Coefficient Error t Value Yule-Walker Estimates SSE DFE 87 MSE Root MSE SBC AIC Regress R-Square Total R-Square Log Likelihood Observations 97 54
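The Yule-Walker results shown here come from a PROC AUTOREG call of roughly this form; the data set name shrimp is an assumption, while the regressors follow the earlier slides. Replacing method=yw with method=ml reproduces the Maximum Likelihood fit reported later.

/* Sketch: shrimp demand with an AR(3) error structure, Yule-Walker estimation */
proc autoreg data=shrimp;
   model qshrimp = pshrimp pfin pshell adshrimp adfin adshell
         / nlag=3 method=yw dwprob godfrey=3;
run;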

55 Durbin-Watson Statistics Order DW Pr < DW Pr > DW NOTE: Pr<DW is the p-value for testing positive autocorrelation, and Pr>DW is the p-value for testing negative autocorrelation. Now, no serial correlation exists in the residuals. 55

56 The AUTOREG Procedure Godfrey's Serial Correlation Test Alternative LM Pr > LM AR(1) AR(2) AR(3) GLS Estimates Standard Approx Variable DF Estimate Error t Value Pr > t Intercept <.0001 PSHRIMP <.0001 PFIN PSHELL ADSHRIMP ADFIN ADSHELL Statistically significant estimated coefficients for PSHRIMP and ADFIN

57 Partial Autocorrelations Preliminary MSE Starting values of estimates of φ 1, φ 2, and φ 3 in the ML procedure. Estimates of Autoregressive Parameters Standard Lag Coefficient Error t Value Algorithm converged. 57

58 Maximum Likelihood Estimates Use of the Maximum Likelihood procedure to produce estimates of φ_1, φ_2, and φ_3. SSE DFE 87 MSE Root MSE SBC AIC Regress R-Square Total R-Square Log Likelihood Observations 97 Durbin-Watson Statistics Order DW Pr < DW Pr > DW NOTE: Pr<DW is the p-value for testing positive autocorrelation, and Pr>DW is the p-value for testing negative autocorrelation.

59 Godfrey's Serial Correlation Test Alternative LM Pr > LM AR(1) AR(2) AR(3) ML Estimates Standard Approx Variable DF Estimate Error t Value Pr > t Intercept <.0001 PSHRIMP <.0001 PFIN PSHELL ADSHRIMP ADFIN ADSHELL AR AR AR continued...

60 Autoregressive parameters assumed given. Standard Approx Variable DF Estimate Error t Value Pr > t Intercept <.0001 PSHRIMP <.0001 PFIN PSHELL ADSHRIMP ADFIN ADSHELL

61 Depiction of the Estimated Model for the Qshrimp Problem The estimated model is based on the Maximum Likelihood estimates:
Qshrimp_t = b_0 + b_1*Pshrimp_t + b_2*Pfin_t + b_3*Pshell1_t + b_4*Adshrimp_t + b_5*Adfin_t + b_6*Adshell1_t + v_t
v_t = φ_1*v_{t-1} + φ_2*v_{t-2} + φ_3*v_{t-3} + ε_t
with the estimated coefficients taken from the ML output. MSE is the estimate of the residual variance; this estimate is smaller than the OLS estimate. The Total R-Square statistic computed from the residuals of the autoregressive model reflects the improved fit from the use of past residuals to help predict the next value of Qshrimp_t. The Regress R-Square value is the R-square statistic for a regression of transformed variables adjusted for the estimated autocorrelation.

62 Comparison of Diagnostic Statistics and Parameter Estimates from the Qshrimp Problem Explanatory Variables OLS Yule-Walker Maximum Likelihood Parameter Estimate Standard Error Parameter Estimate Standard Error Parameter Estimate Standard Error Intercept Pshrimp Pfin Pshell Adshrimp Adfin Adshell continued...

63 Comparison of Diagnostic Statistics and Parameter Estimates Parameter Estimate OLS Yule-Walker Maximum Likelihood Standard Error Parameter Estimate Standard Error Parameter Estimate MSE Root MSE SBC AIC Regress R- Square Total R-Square Log Likelihood AR 1 NA AR 2 NA AR 3 NA Standard Error 63 DW order DW order DW order

64 Section 6.5 Sample Problem: The Demand for Gasoline

65 Model: MODEL1 Dependent Variable: lnpcg Number of Observations Read 36 Number of Observations Used 36 Analysis of Variance Sum of Mean Source DF Squares Square F Value Pr > F Model <.0001 Error Corrected Total Root MSE R-Square Dependent Mean Adj R-Sq Coeff Var continued...

66 OLS Estimates Parameter Estimates Parameter Standard Variable DF Estimate Error t Value Pr > t Intercept <.0001 lny <.0001 lnpg lnpnc lnpuc lnppt t

67 The REG Procedure Output Model: MODEL1 Dependent Variable: lnpcg Durbin-Watson D Pr < DW <.0001 Pr > DW Number of Observations 36 1st Order Autocorrelation NOTE: Pr<DW is the p-value for testing positive autocorrelation, and Pr>DW is the p-value for testing negative autocorrelation. What is the conclusion regarding serial correlation based on the Durbin-Watson statistic? 67

68 (Figure: plot of the OLS residuals, RESID.)

69 Name of Variable = lnpcgres Mean of Working Series 9.68E-16 Number of Observations 36 (Autocorrelations table: lag, covariance, correlation, standard error, with bar plot; the pattern suggests MA(1).) Statistically significant autocorrelation and partial autocorrelation coefficients. 69

70 Partial Autocorrelations (Partial autocorrelations table: lag, correlation, with bar plot; the pattern suggests AR(1), AR(2).) 70 continued...

71 Autocorrelation Check for White Noise To Chi- Pr > Lag Square DF ChiSq Autocorrelations Statistically significant Ljung-Box Q* statistic. Thus, the pattern in the residuals is not random or white noise. 71


73 The AUTOREG Procedure Dependent Variable lnpcg Ordinary Least Squares Estimates SSE DFE 29 MSE Root MSE SBC AIC Regress R-Square Total R-Square Normal Test Pr > ChiSq Log Likelihood Observations 36 Durbin-Watson Statistics Order DW Pr < DW Pr > DW < NOTE: Pr<DW is the p-value for testing positive autocorrelation, and Pr>DW is the p-value for testing negative autocorrelation. 73 continued...

74 AR(1), AR(2) Terms Godfrey's Serial Correlation Test Alternative LM Pr > LM AR(1) AR(2) <.0001 Standard Approx Variable DF Estimate Error t Value Pr > t Intercept <.0001 lny <.0001 lnpg lnpnc lnpuc lnppt t

75 The AUTOREG Procedure (Estimates of Autocorrelations table: lag, covariance, correlation, with bar plot. Partial Autocorrelations and Preliminary MSE, giving estimates of φ_1, φ_2. Estimates of Autoregressive Parameters table: lag, coefficient, standard error, t value.) 75 continued...

76 Yule-Walker Estimates SSE DFE 27 MSE Root MSE SBC AIC Regress R-Square Total R-Square Log Likelihood Observations 36 Durbin-Watson Statistics Order DW Pr < DW Pr > DW After you take into account this pattern in the residuals, the serial correlation problem no longer is evident. NOTE: Pr<DW is the p-value for testing positive autocorrelation, and Pr>DW is the p-value for testing negative autocorrelation. 76

77 The AUTOREG Procedure Output Godfrey's Serial Correlation Test Alternative LM Pr > LM AR(1) AR(2) Standard Approx Variable DF Estimate Error t Value Pr > t Intercept <.0001 lny <.0001 lnpg lnpnc lnpuc lnppt t

78 (Estimates of Autocorrelations table: lag, covariance, correlation, with bar plot. Partial Autocorrelations and Preliminary MSE: starting values in the ML procedure to obtain estimates of φ_1, φ_2. Estimates of Autoregressive Parameters table: lag, coefficient, standard error, t value.) 78 continued...

79 Maximum Likelihood Estimates SSE DFE 27 MSE Root MSE SBC AIC Regress R-Square Total R-Square Log Likelihood Observations 36 Durbin-Watson Statistics Order DW Pr < DW Pr > DW NOTE: Pr<DW is the p-value for testing positive autocorrelation, and Pr>DW is the p-value for testing negative autocorrelation. 79

80 Godfrey's Serial Correlation Test Alternative LM Pr > LM AR(1) AR(2) ML Estimates Standard Approx Variable DF Estimate Error t Value Pr > t Intercept <.0001 lny <.0001 lnpg lnpnc lnpuc lnppt t AR <.0001 AR continued...

81 Autoregressive parameters assumed given. Standard Approx Variable DF Estimate Error t Value Pr > t Intercept <.0001 lny <.0001 lnpg lnpnc lnpuc lnppt t

82 Depiction of the Estimated Model for the Demand for Gasoline Problem (Greene) The estimated model is based on the Maximum Likelihood estimates:
ln pcg_t = b_0 + b_1*ln y_t + b_2*ln pg_t + b_3*ln pnc_t + b_4*ln puc_t + b_5*ln ppt_t + b_6*t + v_t
v_t = φ_1*v_{t-1} + φ_2*v_{t-2} + ε_t
with the estimated coefficients taken from the ML output. The Regress R-Square, Total R-Square, MSE = σ̂² (the estimate of the residual variance), DW order 1, and DW order 2 are reported with the output. 82

83 Section 6.6 A Test for Serial Correlation in the Presence of a Lagged Dependent Variable

84 Durbin's h-test A large-sample test for autocorrelation when lagged dependent variables are present. H_0: ρ = 0, an AR(1) process in the error terms.
h = ρ̂ sqrt( n / (1 - n v(β̂)) ) ~ N(0,1), where ρ̂ ≈ 1 - (1/2)d, d is the DW statistic, and v(β̂) is the estimated variance of the coefficient associated with Y_{t-1}. 84
The test breaks down if n v(β̂) ≥ 1. If Durbin's h-test breaks down, compute the OLS residuals û_t. Then regress û_t on û_{t-1}, Y_{t-1}, and the set of exogenous variables. The test for ρ = 0 is carried out by testing the significance of the coefficient on û_{t-1}. (Durbin's m-test)
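PROC AUTOREG computes Durbin's h automatically when told which regressor is the lagged dependent variable; a sketch using the gasoline variable names from the slides that follow (the data set name is an assumption):

/* Sketch: Durbin's h-test in a model with a lagged dependent variable */
proc autoreg data=gas;
   model lnpcg = laglnpcg lny lnpg lnpnc lnpuc lnppt t
         / lagdep=laglnpcg godfrey=1;   /* LAGDEP= requests Durbin's h */
run;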

85 The REG Procedure Output Model: MODEL1 Dependent Variable: lnpcg Number of Observations Read 36 Number of Observations Used 35 Number of Observations with Missing Values 1 Analysis of Variance Sum of Mean Source DF Squares Square F Value Pr > F Model <.0001 Error Corrected Total Root MSE R-Square Dependent Mean Adj R-Sq Coeff Var continued...

86 Parameter Estimates OLS Estimates Parameter Standard Variable DF Estimate Error t Value Pr > t Intercept laglnpcg <.0001 lny lnpg <.0001 lnpnc lnpuc lnppt t Presence of a lagged dependent variable 86

87 The REG Procedure Output Model: MODEL1 Dependent Variable: lnpcg Durbin-Watson D Pr < DW Pr > DW Number of Observations 35 1st Order Autocorrelation NOTE: Pr<DW is the p-value for testing positive autocorrelation, and Pr>DW is the p-value for testing negative autocorrelation. The DW statistic reveals positive serial correlation, AR(1) pattern. 87

88 Name of Variable = lnpcgres Mean of Working Series -66E-17 Number of Observations 35 (Autocorrelations table: lag, covariance, correlation, standard error, with bar plot.) 88

89 From the ACF and PACF plots, no pattern in the residuals is revealed. (Partial autocorrelations table with bar plot; the ARIMA Procedure Autocorrelation Check for White Noise table.) The Ljung-Box Q* statistic suggests a white-noise residual series.

90 The AUTOREG Procedure Dependent Variable lnpcg Durbin's h-test, at least at the 0.10 level of significance, suggests a non-random residual series. Ordinary Least Squares Estimates SSE DFE 27 MSE Root MSE SBC AIC Regress R-Square Total R-Square Durbin h Pr > h Normal Test Pr > ChiSq Log Likelihood Observations 35 Durbin-Watson Godfrey's Serial Correlation Test Alternative LM Pr > LM AR(1) continued...

91 Standard Approx Variable DF Estimate Error t Value Pr > t Intercept laglnpcg <.0001 lny lnpg <.0001 lnpnc lnpuc lnppt t

92 The AUTOREG Procedure Output Partial Autocorrelations Starting value of φ in the ML procedure Preliminary MSE Estimates of Autoregressive Parameters Standard Lag Coefficient Error t Value Algorithm converged.

93 Maximum Likelihood Estimates SSE DFE 26 MSE Root MSE SBC AIC Regress R-Square Total R-Square Log Likelihood Observations 35 Durbin-Watson Godfrey's Serial Correlation Test Alternative LM Pr > LM AR(1) GLS (ML) Estimates 93 continued...

94 Standard Approx Variable DF Estimate Error t Value Pr > t Intercept laglnpcg lny lnpg <.0001 lnpnc lnpuc lnppt t AR <

95 The AUTOREG Procedure Output Autoregressive parameters assumed given. Standard Approx Variable DF Estimate Error t Value Pr > t Intercept laglnpcg lny lnpg <.0001 lnpnc lnpuc lnppt t

96 Calculation of Durbin's h-test for the Greene Problem In the Greene Problem for gasoline demand: ρ̂ = 1 - (1.639/2) = 0.1805, n = 35, v(β̂) = (standard error of the coefficient on laglnpcg)², and h = ρ̂ sqrt( n / (1 - n v(β̂)) ).

97 Analysis of Variance Sum of Mean Source DF Squares Square F Value Pr > F Model Error Corrected Total Root MSE R-Square Dependent Mean Adj R-Sq Coeff Var Parameter Estimates Parameter Standard Variable Label DF Estimate Error t Value Pr > t Intercept Intercept lnpcgres laglnpcg lny lnpg lnpnc lnppt t
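The auxiliary regression above, Durbin's m-test, can be reproduced along these lines; the data set names, the residual variable, and its constructed lag are assumptions.

/* Sketch: Durbin's m-test -- regress the OLS residuals on their lag, the lagged
   dependent variable, and the exogenous variables */
data mtest;
   set olsout;                   /* olsout holds the OLS residuals (resid) */
   lagresid = lag(resid);        /* lagged residual */
run;

proc reg data=mtest;
   model resid = lagresid laglnpcg lny lnpg lnpnc lnpuc lnppt t;
run;

The m-test rejects H_0: ρ = 0 when the coefficient on the lagged residual is statistically significant.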

98 Section 6.7 Summary Remarks about the Issue of Serial Correlation

99 Final Considerations With time-series data, in most cases this problem will surface. Analysts must examine the error structure carefully. Minimally, do the following (sketched in the code below):
Graph the residuals over time.
Consider the significance of the Durbin-Watson statistic.
Consider higher-order autocorrelation structure via PROC ARIMA.
Consider the Godfrey LM test.
Consider the Box-Pierce or Ljung-Box tests (Q statistics).
Re-estimate econometric models with AR(p) error structures via PROC AUTOREG.
Use the Yule-Walker or Maximum Likelihood method to obtain estimates of the AR(p) error structure. A preference exists for the Maximum Likelihood method. 99
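A consolidated sketch of that workflow for a generic model; every data set and variable name below is a placeholder.

/* Sketch: a minimal serial-correlation workflow */
proc reg data=mydata;                       /* OLS fit with the Durbin-Watson statistic */
   model y = x1 x2 / dw dwprob;
   output out=olsout r=resid;
run;

proc sgplot data=olsout;                    /* graph the residuals over time */
   series x=year y=resid;
   refline 0 / axis=y;
run;

proc arima data=olsout;                     /* higher-order structure: ACF, PACF, Q* */
   identify var=resid nlag=24;
run;

proc autoreg data=mydata;                   /* re-estimate with an AR(p) error structure */
   model y = x1 x2 / nlag=3 method=ml dwprob godfrey=3;
run;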


More information

FORECASTING SUGARCANE PRODUCTION IN INDIA WITH ARIMA MODEL

FORECASTING SUGARCANE PRODUCTION IN INDIA WITH ARIMA MODEL FORECASTING SUGARCANE PRODUCTION IN INDIA WITH ARIMA MODEL B. N. MANDAL Abstract: Yearly sugarcane production data for the period of - to - of India were analyzed by time-series methods. Autocorrelation

More information

Univariate ARIMA Models

Univariate ARIMA Models Univariate ARIMA Models ARIMA Model Building Steps: Identification: Using graphs, statistics, ACFs and PACFs, transformations, etc. to achieve stationary and tentatively identify patterns and model components.

More information

Box-Jenkins ARIMA Advanced Time Series

Box-Jenkins ARIMA Advanced Time Series Box-Jenkins ARIMA Advanced Time Series www.realoptionsvaluation.com ROV Technical Papers Series: Volume 25 Theory In This Issue 1. Learn about Risk Simulator s ARIMA and Auto ARIMA modules. 2. Find out

More information

Week 11 Heteroskedasticity and Autocorrelation

Week 11 Heteroskedasticity and Autocorrelation Week 11 Heteroskedasticity and Autocorrelation İnsan TUNALI Econ 511 Econometrics I Koç University 27 November 2018 Lecture outline 1. OLS and assumptions on V(ε) 2. Violations of V(ε) σ 2 I: 1. Heteroskedasticity

More information

Empirical Economic Research, Part II

Empirical Economic Research, Part II Based on the text book by Ramanathan: Introductory Econometrics Robert M. Kunst robert.kunst@univie.ac.at University of Vienna and Institute for Advanced Studies Vienna December 7, 2011 Outline Introduction

More information

CORRELATION, ASSOCIATION, CAUSATION, AND GRANGER CAUSATION IN ACCOUNTING RESEARCH

CORRELATION, ASSOCIATION, CAUSATION, AND GRANGER CAUSATION IN ACCOUNTING RESEARCH CORRELATION, ASSOCIATION, CAUSATION, AND GRANGER CAUSATION IN ACCOUNTING RESEARCH Alireza Dorestani, Northeastern Illinois University Sara Aliabadi, Northeastern Illinois University ABSTRACT In this paper

More information

Bristol Business School

Bristol Business School Bristol Business School Academic Year: 10/11 Examination Period: January Module Leader: Module Code: Title of Module: John Paul Dunne Econometrics UMEN3P-15-M Examination Date: 12 January 2011 Examination

More information

APPLIED MACROECONOMETRICS Licenciatura Universidade Nova de Lisboa Faculdade de Economia. FINAL EXAM JUNE 3, 2004 Starts at 14:00 Ends at 16:30

APPLIED MACROECONOMETRICS Licenciatura Universidade Nova de Lisboa Faculdade de Economia. FINAL EXAM JUNE 3, 2004 Starts at 14:00 Ends at 16:30 APPLIED MACROECONOMETRICS Licenciatura Universidade Nova de Lisboa Faculdade de Economia FINAL EXAM JUNE 3, 2004 Starts at 14:00 Ends at 16:30 I In Figure I.1 you can find a quarterly inflation rate series

More information

Financial Econometrics

Financial Econometrics Financial Econometrics Multivariate Time Series Analysis: VAR Gerald P. Dwyer Trinity College, Dublin January 2013 GPD (TCD) VAR 01/13 1 / 25 Structural equations Suppose have simultaneous system for supply

More information

Answer all questions from part I. Answer two question from part II.a, and one question from part II.b.

Answer all questions from part I. Answer two question from part II.a, and one question from part II.b. B203: Quantitative Methods Answer all questions from part I. Answer two question from part II.a, and one question from part II.b. Part I: Compulsory Questions. Answer all questions. Each question carries

More information

ECON 312 FINAL PROJECT

ECON 312 FINAL PROJECT ECON 312 FINAL PROJECT JACOB MENICK 1. Introduction When doing statistics with cross-sectional data, it is common to encounter heteroskedasticity. The cross-sectional econometrician can detect heteroskedasticity

More information

at least 50 and preferably 100 observations should be available to build a proper model

at least 50 and preferably 100 observations should be available to build a proper model III Box-Jenkins Methods 1. Pros and Cons of ARIMA Forecasting a) need for data at least 50 and preferably 100 observations should be available to build a proper model used most frequently for hourly or

More information

Economics 620, Lecture 13: Time Series I

Economics 620, Lecture 13: Time Series I Economics 620, Lecture 13: Time Series I Nicholas M. Kiefer Cornell University Professor N. M. Kiefer (Cornell University) Lecture 13: Time Series I 1 / 19 AUTOCORRELATION Consider y = X + u where y is

More information

ECON 366: ECONOMETRICS II. SPRING TERM 2005: LAB EXERCISE #10 Nonspherical Errors Continued. Brief Suggested Solutions

ECON 366: ECONOMETRICS II. SPRING TERM 2005: LAB EXERCISE #10 Nonspherical Errors Continued. Brief Suggested Solutions DEPARTMENT OF ECONOMICS UNIVERSITY OF VICTORIA ECON 366: ECONOMETRICS II SPRING TERM 2005: LAB EXERCISE #10 Nonspherical Errors Continued Brief Suggested Solutions 1. In Lab 8 we considered the following

More information

Stationary and nonstationary variables

Stationary and nonstationary variables Stationary and nonstationary variables Stationary variable: 1. Finite and constant in time expected value: E (y t ) = µ < 2. Finite and constant in time variance: Var (y t ) = σ 2 < 3. Covariance dependent

More information

LECTURE 13: TIME SERIES I

LECTURE 13: TIME SERIES I 1 LECTURE 13: TIME SERIES I AUTOCORRELATION: Consider y = X + u where y is T 1, X is T K, is K 1 and u is T 1. We are using T and not N for sample size to emphasize that this is a time series. The natural

More information

FinQuiz Notes

FinQuiz Notes Reading 9 A time series is any series of data that varies over time e.g. the quarterly sales for a company during the past five years or daily returns of a security. When assumptions of the regression

More information