10. Time series regression and forecasting
1 10. Time series regression and forecasting Key feature of this section: Analysis of data on a single entity observed at multiple points in time (time series data) Typical research questions: What is the causal effect on a variable of interest, Y, of a change in another variable, X, over time? What is the best forecast of the value of an economic variable (e.g. the stock price) at some future date? 282
2 Remarks: The analysis of time series data requires knowledge of several specific concepts. Some of these key concepts (to be explained) are: forecasting, estimation of dynamic causal effects, stationarity (non-stationarity). Aim of this section: Description of these basic concepts (for detailed information, refer to the many specialized lectures on this topic) 283
3 10.1. Time series data and serial correlation Point of departure: We consider an economic/financial variable over time (for example, the inflation rate or the unemployment rate). We denote the observation on that variable at date t by Y_t and the number of all observations by T. The period of time between observations (i.e. between observation t and t+1) is some unit of time (a day, week, month, quarter, ...). In a first step, we always plot the observations on the variable over time 284
4 Inflation and unemployment in the US (quarterly data) 285
5 Definition 10.1: (Lags, first differences) We consider a time series variable Y_t observed through time (t = 1, ..., T). We use the following notation: The first lag of a time series Y_t is Y_{t-1}; the jth lag is Y_{t-j}. The first difference of a series, ΔY_t, is its change between periods t-1 and t: ΔY_t = Y_t - Y_{t-1}. The first difference of the logarithm of Y_t is Δln(Y_t) = ln(Y_t) - ln(Y_{t-1}). 286
6 Remarks: The percentage change of a time series Y_t between periods t-1 and t is approximately 100·Δln(Y_t). This approximation is most accurate when the percentage change is small (see Slide 92) 287
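As a quick numerical check of this remark, the approximation 100·Δln(Y_t) can be compared with the exact percentage change on a few hypothetical index values (a minimal sketch; the numbers are invented for illustration):

```python
import numpy as np

# Hypothetical levels of a quarterly price index
Y = np.array([100.0, 101.0, 101.5, 103.0])

# Exact percentage change between t-1 and t
pct_exact = 100 * (Y[1:] - Y[:-1]) / Y[:-1]

# Log-difference approximation: 100 * (ln Y_t - ln Y_{t-1})
pct_approx = 100 * np.diff(np.log(Y))

print(pct_exact)   # ~ [1.000, 0.495, 1.478]
print(pct_approx)  # ~ [0.995, 0.494, 1.467]
```

The two series agree to within a few hundredths of a percentage point, and the gap widens as the change gets larger, consistent with the remark above.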
7 Autocorrelation: An important issue in the analysis of time series data is whether and how a variable Y_t is related to its own past values. As a measure of this relation, we use the covariance and the correlation of Y_t with its own past values: autocovariance and autocorrelation (serial correlation). Definition 10.2: (Autocovariance, autocorrelation) The jth autocovariance (autocorrelation) of a series Y_t is the covariance (correlation coefficient) between Y_t and its jth lag: jth autocovariance: γ_j = Cov(Y_t, Y_{t-j}); jth autocorrelation: ρ_j = Cov(Y_t, Y_{t-j}) / sqrt(Var(Y_t)·Var(Y_{t-j})). 288
8 Remarks: The formulas in Definition 10.2 represent the probabilistic autocovariances and autocorrelations when we think of the series Y_t as a sequence of random variables (jth population autocovariances and autocorrelations). They implicitly assume that the population autocovariances γ_j and autocorrelations ρ_j remain constant over time, for example in the case of j = 1: γ_1 = Cov(Y_2, Y_1) = Cov(Y_3, Y_2) = ..., ρ_1 = Corr(Y_2, Y_1) = Corr(Y_3, Y_2) = ... This is the assumption of stationarity (to be discussed later) 289
9 Question: How can we estimate the theoretical population autocovariances γ_j and autocorrelations ρ_j on the basis of the time series observations Y_1, Y_2, ..., Y_T? Definition 10.3: (Estimating autocovariances, autocorrelations) The conventional estimators of the jth autocovariance γ_j and the jth autocorrelation ρ_j based on the observations Y_1, Y_2, ..., Y_T are defined as follows: γ̂_j = (1/T) Σ_{t=j+1}^{T} (Y_t - Ȳ)(Y_{t-j} - Ȳ), (10.1) ρ̂_j = γ̂_j / γ̂_0, (10.2) where Ȳ denotes the sample mean of the Y_t. 290
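The estimators (10.1) and (10.2) are simple to code directly. The sketch below (with an invented AR(1)-type series whose coefficient 0.8 is purely illustrative) divides by T rather than T - j, exactly as in Eq. (10.1):

```python
import numpy as np

def autocovariance(y, j):
    """jth sample autocovariance, Eq. (10.1): note the divisor T, not T - j."""
    T = len(y)
    ybar = y.mean()
    return np.sum((y[j:] - ybar) * (y[:T - j] - ybar)) / T

def autocorrelation(y, j):
    """jth sample autocorrelation, Eq. (10.2)."""
    return autocovariance(y, j) / autocovariance(y, 0)

# A persistent simulated series: rho_1_hat should come out near 0.8
rng = np.random.default_rng(0)
y = np.zeros(500)
for t in range(1, 500):
    y[t] = 0.8 * y[t - 1] + rng.standard_normal()

print(autocorrelation(y, 1))
```

Dividing by T instead of T - j keeps the sequence of estimated autocovariances well behaved; for j small relative to T the difference is negligible.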
10 Remarks: The estimators (10.1) and (10.2) are consistent. Both estimators implicitly assume stationarity. Example: Sample autocorrelations of the U.S. CPI inflation rate and its changes up to lag 10 (see next slides). We denote the U.S. CPI inflation rate in EViews by INF_t. INF_t itself exhibits a strongly positive autocorrelation; its first difference, ΔINF_t, exhibits a strongly negative autocorrelation 291
11 U.S. CPI inflation rate and its changes [Figure: two panels showing the U.S. CPI inflation rate and the first differences in the U.S. CPI inflation rate]
12 Sample autocorrelations of the U.S. CPI inflation rate and its changes [EViews correlogram output for INF (sample 1957Q1 2005Q1, 192 included observations) and for D(INF) (191 included observations); the AC, PAC, Q-statistic, and p-value columns are not recoverable from this transcription]
13 Four economic time series 294
14 10.2. Autoregressions Aim of this section: Forecasts made using a regression model that relates a time series variable to its own past values. Definition 10.4: (First order autoregressive model) We consider an economic time series variable Y_t and define the first order autoregressive population model (the AR(1) model) for Y_t as Y_t = β_0 + β_1 Y_{t-1} + u_t, (10.3) where u_t is an error term. 295
15 Example: Fit of an AR(1) model to the change in the U.S. CPI inflation rate EViews output: Fit of an AR(1) model to ΔINF_t [EViews least-squares output: dependent variable D(INF), sample (adjusted) 1957Q4 2004Q4, 189 included observations after adjustments; coefficient estimates and summary statistics not recoverable] 296
16 Forecasting: Consider the AR(1) model in Eq. (10.3). We estimate the unknown parameters β_0 and β_1 by OLS using the data Y_1, ..., Y_T and obtain the OLS estimates β̂_0 and β̂_1. We aim at forecasting the future value Y_{T+1} based on the observed value Y_T and the estimated coefficients β̂_0 and β̂_1. We denote this forecast by Ŷ_{T+1|T} and it is given by Ŷ_{T+1|T} = β̂_0 + β̂_1 Y_T. (10.4) The forecast error is the difference between the value of Y_{T+1} that actually occurs and its forecasted value based on Y_T: Forecast error = Y_{T+1} - Ŷ_{T+1|T}. (10.5) 297
17 Forecasting: [continued] Forecasts versus predicted values: forecasts pertain to out-of-sample observations; predicted values pertain to in-sample observations. The root mean squared forecast error (RMSFE) is a measure of the magnitude of a typical mistake made using a forecasting model and is defined by RMSFE = sqrt(E[(Y_{T+1} - Ŷ_{T+1|T})²]). (10.6) The RMSFE has two sources: the unknown future values of u_t, and the errors in the estimates β̂_0 and β̂_1. 298
18 Forecasting: [continued] The RMSFE can be estimated by the standard error of the regression (to be discussed in Section 10.3.). Example: We consider the estimation output on Slide 296. The estimated AR(1) model for the changes in INF_t is ΔINF_t = … + … ΔINF_{t-1}. (10.7) The estimation period is 1957:Q1 2004:Q4. We aim at forecasting the inflation rate for 2005:Q1 299
19 Example: [continued] In general, we have ÎNF_{T+1|T} = INF_T + (ÎNF_{T+1|T} - INF_T) = INF_T + ΔÎNF_{T+1|T}. (10.8) Setting T = 2004:Q4, we find from the data set INF_{2004:Q4} = … %. (10.9) From Eq. (10.7), we have ΔINF_{2004:Q4} = … % (10.10) and ΔÎNF_{2005:Q1|2004:Q4} = … + … · ΔINF_{2004:Q4} = … %. (10.11) 300
20 Example: [continued] It follows from Eqs. (10.8)-(10.11) that ÎNF_{2005:Q1|2004:Q4} = … + … = … %. Accuracy of the forecast: From the data set we find that INF_{2005:Q1} = … %. The forecast error is INF_{2005:Q1} - ÎNF_{2005:Q1|2004:Q4} = … - … = … % 301
21 Next: Extension of the AR(1) model by including potentially useful information contained in more distant past values of the time series. Definition 10.5: (pth-order autoregressive model) The pth-order autoregressive model (the AR(p) model) represents Y_t as a linear function of p of its lagged values: Y_t = β_0 + β_1 Y_{t-1} + β_2 Y_{t-2} + ... + β_p Y_{t-p} + u_t, (10.12) where E(u_t | Y_{t-1}, Y_{t-2}, ...) = 0. The number of lags p is called the order, or the lag length, of the autoregression. 302
22 Implications of the assumption E(u_t | Y_{t-1}, Y_{t-2}, ...) = 0: 1. The best forecast of Y_{T+1} based on its entire history depends only on the most recent p past values. It can be shown that if Y_t follows an AR(p) model, then the best forecast (in the sense of having smallest RMSFE) of Y_{T+1} based on Y_T, Y_{T-1}, ... is Y_{T+1|T} = β_0 + β_1 Y_T + β_2 Y_{T-1} + ... + β_p Y_{T-p+1}. (10.13) Since the coefficients β_0, ..., β_p are unknown, we use the forecast from Eq. (10.13) with estimated coefficients. 2. The errors u_t are serially uncorrelated 303
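The AR(p) fit and the forecast rule (10.13) generalize the AR(1) case directly; a sketch with a simulated AR(2) series (the coefficients 0.4 and 0.3 are invented for illustration):

```python
import numpy as np

def fit_ar(y, p):
    """OLS estimates (b0, b1, ..., bp) of the AR(p) model, Eq. (10.12)."""
    T = len(y)
    X = np.column_stack([np.ones(T - p)]
                        + [y[p - j:T - j] for j in range(1, p + 1)])
    return np.linalg.lstsq(X, y[p:], rcond=None)[0]

def ar_forecast(y, coef):
    """One-step forecast, Eq. (10.13): b0 + b1*Y_T + ... + bp*Y_{T-p+1}."""
    p = len(coef) - 1
    return coef[0] + coef[1:] @ y[-1:-p - 1:-1]

rng = np.random.default_rng(2)
y = np.zeros(400)
for t in range(2, 400):
    y[t] = 0.4 * y[t - 1] + 0.3 * y[t - 2] + rng.standard_normal()

coef = fit_ar(y, 2)
print(coef, ar_forecast(y, coef))
```

Note how the forecast applies the estimated coefficients to the last p observations in reverse order, exactly mirroring Eq. (10.13).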
23 Fit of an AR(4) model to ΔINF_t [EViews least-squares output: dependent variable D(INF), sample (adjusted) 1958Q3 2004Q4, 186 included observations after adjustments; coefficient estimates and summary statistics not recoverable] 304
24 Estimation results: Estimated equation: ΔINF_t = … + … ΔINF_{t-1} + … ΔINF_{t-2} + … ΔINF_{t-3} + … ΔINF_{t-4}, (10.14) with standard errors 0.0758, 0.0741, 0.0762, 0.0764, 0.0744 (in the order of the coefficients). The coefficients on ΔINF_{t-2}, ΔINF_{t-3}, ΔINF_{t-4} are jointly statistically different from zero (F-statistic = …, p-value < …). The R² improves from … for the AR(1) model on Slide 296 to …. The SER improves from … for the AR(1) model on Slide 296 to … 305
25 Inflation forecast for 2005:Q1: Recall Eq. (10.8) on Slide 300 with T = 2004:Q4: ÎNF_{T+1|T} = INF_T + ΔÎNF_{T+1|T}. From the data set we have INF_{2004:Q4} = …, ΔINF_{2004:Q4} = …, ΔINF_{2004:Q3} = …, ΔINF_{2004:Q2} = …, ΔINF_{2004:Q1} = …. Using the estimates in Eq. (10.14) on Slide 305, we obtain ΔÎNF_{2005:Q1|2004:Q4} = … 306
26 Inflation forecast for 2005:Q1: [continued] From Eq. (10.8) we thus have ÎNF_{2005:Q1|2004:Q4} = … + … = … %. Accuracy of the forecast: From the data set we find that INF_{2005:Q1} = … %. The forecast error is INF_{2005:Q1} - ÎNF_{2005:Q1|2004:Q4} = … - … = … 307
27 Surprisingly, the AR(4) forecast error (-1.441) is larger in absolute value than the AR(1) forecast error (…) (to be explained in Section 10.3.) 308
28 10.3. Time series regression with additional predictors and the autoregressive distributed lag model Next: Variables other than past values of Y may help to forecast the variable of interest. These variables (called predictors) should be included on the right-hand side of the autoregression Eq. (10.12) on Slide 302: autoregressive distributed lag models. Example: Forecasting changes in the inflation rate using past unemployment rates (short-run Phillips curve). We denote the unemployment rate in EViews by UNEMP 309
29 Change in the U.S. CPI inflation rate between year t and year t + 1 versus the unemployment rate in year t 310
30 Fit of an AR(4) model plus UNEMP_{t-1} to ΔINF_t [EViews least-squares output: dependent variable D(INF), sample (adjusted) 1958Q3 2004Q4, 186 included observations after adjustments; coefficient estimates and summary statistics not recoverable] 311
31 Estimation results: The lagged predictor UNEMP_{t-1} is significant at the 5% level. The R² improves from … for the pure AR(4) model to …. Using the estimates from Slide 311 and the data set including the observations for UNEMP, we compute the inflation forecast for 2005:Q1 as ÎNF_{2005:Q1|2004:Q4} = … %. The forecast error is INF_{2005:Q1} - ÎNF_{2005:Q1|2004:Q4} = … - … = … 312
32 Fit of an AR(4) model plus (UNEMP_{t-1}, ..., UNEMP_{t-4}) to ΔINF_t [EViews least-squares output: dependent variable D(INF), sample (adjusted) 1959Q1 2004Q4, 184 included observations after adjustments; coefficient estimates and summary statistics not recoverable] 313
33 Estimation results: The predictor lags UNEMP_{t-1}, UNEMP_{t-2}, UNEMP_{t-3} are individually significant at the 1% level, UNEMP_{t-4} at the 10% level. Substantial improvement of the R², from … to …. Using the estimates from Slide 313 and the data set including the observations for UNEMP, we compute the inflation forecast for 2005:Q1 as ÎNF_{2005:Q1|2004:Q4} = … %. The forecast error is INF_{2005:Q1} - ÎNF_{2005:Q1|2004:Q4} = … - … = … 314
34 Now: Formal definition of this autoregressive model including one additional predictor. Definition 10.6: (Autoregressive distributed lag model) The autoregressive distributed lag model with p lags of Y_t and q lags of the predictor X_t, denoted by ADL(p, q), is Y_t = β_0 + β_1 Y_{t-1} + β_2 Y_{t-2} + ... + β_p Y_{t-p} + δ_1 X_{t-1} + δ_2 X_{t-2} + ... + δ_q X_{t-q} + u_t, (10.15) where β_0, β_1, ..., β_p, δ_1, ..., δ_q are unknown coefficients and u_t is the error term with E(u_t | Y_{t-1}, Y_{t-2}, ..., X_{t-1}, X_{t-2}, ...) = 0. 315
35 Remarks: The assumption E(u_t | Y_{t-1}, Y_{t-2}, ..., X_{t-1}, X_{t-2}, ...) = 0 implies that no additional lags of either Y or X belong in the ADL model (the lag lengths p and q are the true lag lengths). The ADL model contains lags of the dependent variable (the autoregressive component) and a distributed lag of a single additional predictor X. In general, forecasts can be improved by using multiple predictors 316
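A sketch of an ADL(1, 1) regression and one-step forecast on simulated data (the data-generating process and all coefficient values, β_1 = 0.5 and δ_1 = -0.4, are invented for illustration):

```python
import numpy as np

rng = np.random.default_rng(3)
T = 500

# Hypothetical DGP: X is an AR(1); Y follows an ADL(1, 1) as in Eq. (10.15)
x = np.zeros(T)
y = np.zeros(T)
for t in range(1, T):
    x[t] = 0.7 * x[t - 1] + rng.standard_normal()
    y[t] = 1.0 + 0.5 * y[t - 1] - 0.4 * x[t - 1] + rng.standard_normal()

# OLS regression of Y_t on (1, Y_{t-1}, X_{t-1})
Z = np.column_stack([np.ones(T - 1), y[:-1], x[:-1]])
b0, b1, d1 = np.linalg.lstsq(Z, y[1:], rcond=None)[0]

# The one-step forecast uses only values dated T and earlier
y_forecast = b0 + b1 * y[-1] + d1 * x[-1]
print(b1, d1, y_forecast)
```

Because only lagged regressors enter the model, the forecast for period T+1 needs no future values of X, which is what makes the ADL form directly usable for forecasting.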
36 Stationarity: Forecasting future values of a time series Y_t based on past relationships implicitly requires that those relationships remain stable over time: the concept of stationarity. Definition 10.7: (Stationarity) A time series Y_t is stationary if its probability distribution does not change over time, that is, if the joint distribution of (Y_{s+1}, Y_{s+2}, ..., Y_{s+T}) does not depend on s regardless of the value of T; otherwise, Y_t is said to be nonstationary. A pair of time series, X_t and Y_t, are said to be jointly stationary if the joint distribution of (X_{s+1}, Y_{s+1}, X_{s+2}, Y_{s+2}, ..., X_{s+T}, Y_{s+T}) does not depend on s regardless of the value of T. Stationarity requires the future to be like the past, at least in a probabilistic sense. 317
37 Definition 10.8: (Time series regression with multiple predictors) The general time series regression model allows for k additional predictors X_1, ..., X_k with q_1 included lags of X_1, q_2 included lags of X_2, and so forth: Y_t = β_0 + β_1 Y_{t-1} + β_2 Y_{t-2} + ... + β_p Y_{t-p} + δ_{11} X_{1,t-1} + δ_{12} X_{1,t-2} + ... + δ_{1q_1} X_{1,t-q_1} + ... + δ_{k1} X_{k,t-1} + δ_{k2} X_{k,t-2} + ... + δ_{kq_k} X_{k,t-q_k} + u_t, (10.16) where 1. E(u_t | Y_{t-1}, Y_{t-2}, ..., X_{1,t-1}, X_{1,t-2}, ..., X_{k,t-1}, X_{k,t-2}, ...) = 0. 2. The random variables (Y_t, X_{1t}, ..., X_{kt}) have a stationary distribution, and (Y_t, X_{1t}, ..., X_{kt}) and (Y_{t-j}, X_{1,t-j}, ..., X_{k,t-j}) become independent as j gets large. 318
38 Definition 10.8: [continued] 3. Large outliers are unlikely: X_{1t}, ..., X_{kt} and Y_t have nonzero, finite fourth moments. 4. There is no perfect multicollinearity. Remarks: The first part of Assumption #2 requires that the distribution of the data today is the same as its distribution in the past. The second part of Assumption #2 requires that the random variables become independently distributed when the amount of time separating them becomes large. Both parts replace the cross-sectional OLS Assumption #2 on Slide 112 319
39 Statistical inference: Given the assumptions in Definition 10.8, we can apply OLS in the usual way to make inference on the regression coefficients. We can use the F-statistic to test whether the lags of one of the included regressors have useful predictive content: Granger causality tests. Definition 10.9: (Granger causality test) The Granger causality statistic is the F-statistic testing the hypothesis that the coefficients on all the lags of one of the variables in Eq. (10.16) are simultaneously equal to zero (for example, the coefficients on X_{1,t-1}, X_{1,t-2}, ..., X_{1,t-q_1}). This null hypothesis implies that these regressors have no predictive content for Y_t beyond that contained in the other regressors. 320
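The Granger causality statistic is an ordinary F-test of exclusion restrictions: compare the sum of squared residuals of the model with and without the lags of the candidate predictor. A minimal sketch on simulated data (series and coefficient values invented):

```python
import numpy as np
from scipy.stats import f as f_dist

rng = np.random.default_rng(4)
T = 400

# Simulated data in which lagged x genuinely helps predict y
x = rng.standard_normal(T)
y = np.zeros(T)
for t in range(1, T):
    y[t] = 0.3 * y[t - 1] + 0.5 * x[t - 1] + rng.standard_normal()

def ssr(X, z):
    """Sum of squared OLS residuals from regressing z on X."""
    resid = z - X @ np.linalg.lstsq(X, z, rcond=None)[0]
    return resid @ resid

# Unrestricted: y_t on (1, y_{t-1}, x_{t-1}); restricted: x_{t-1} excluded
X_u = np.column_stack([np.ones(T - 1), y[:-1], x[:-1]])
X_r = X_u[:, :2]
ssr_u, ssr_r = ssr(X_u, y[1:]), ssr(X_r, y[1:])

q = 1                        # number of restrictions (lags of x tested)
dof = (T - 1) - X_u.shape[1] # residual degrees of freedom, unrestricted model
F = ((ssr_r - ssr_u) / q) / (ssr_u / dof)
p_value = f_dist.sf(F, q, dof)
print(F, p_value)
```

A large F (small p-value) rejects the null that the x lags have no predictive content, i.e. x Granger-causes y in this simulated example.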
40 Remarks: Granger causality means that if X Granger-causes Y, then X is a useful predictor of Y, given the other variables in the regression. A more accurate phrasing than Granger causality would be Granger predictability. Example: Consider the relationship between ΔINF_t and its past values and past values of UNEMP on Slide 313. We test the null hypothesis on the UNEMP coefficients H_0: δ_{11} = 0, δ_{12} = 0, δ_{13} = 0, δ_{14} = 0 321
41 Example: [continued] F-statistic: …, p-value < …. UNEMP appears to contain information that is useful for forecasting the change in the inflation rate. Forecast uncertainty: We consider the RMSFE defined in Eq. (10.6) on Slide 298 as a measure of the uncertainty of a forecast. In general, the RMSFE consists of two components: uncertainty arising from the estimation of the regression coefficients, and uncertainty about the future unknown value of u_t 322
42 Example: Consider an ADL(1, 1) model with a single predictor: Y_t = β_0 + β_1 Y_{t-1} + δ_1 X_{t-1} + u_t. Assume further that u_t is homoskedastic. The forecast of Y_{T+1} is Ŷ_{T+1|T} = β̂_0 + β̂_1 Y_T + δ̂_1 X_T. The forecast error is Y_{T+1} - Ŷ_{T+1|T} = u_{T+1} - [(β̂_0 - β_0) + (β̂_1 - β_1) Y_T + (δ̂_1 - δ_1) X_T]. (10.17) 323
43 Example: [continued] Since u_t is homoskedastic, u_{T+1} has variance σ_u² and it can be shown that MSFE = E[(Y_{T+1} - Ŷ_{T+1|T})²] = σ_u² + Var[(β̂_0 - β_0) + (β̂_1 - β_1) Y_T + (δ̂_1 - δ_1) X_T], (10.18) so that RMSFE = sqrt(MSFE). Forecast uncertainty: [continued] The term σ_u² appearing in Eq. (10.18) can be estimated by the squared standard error of the regression defined on Slide 21: σ̂_u² = SER² = (1/(T-3)) Σ_{t=1}^{T} û_t² 324
44 Forecast uncertainty: [continued] The second term appearing on the right-hand side of Eq. (10.18) can be estimated by specific statistical techniques (not to be discussed here). Adding the two estimates yields an estimate of the MSFE, and its square root estimates the RMSFE. In practice, a 95% forecast interval for Y_{T+1} is (approximately) given by Ŷ_{T+1|T} ± 1.96 × estimated RMSFE. Eq. (10.18) illuminates why highly parameterized models may produce larger forecast errors than parsimonious models (see our previous examples) 325
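Ignoring the coefficient-estimation term in Eq. (10.18) and using the SER alone as a stand-in for the estimated RMSFE (a rough simplification, not the full procedure described above), the 95% forecast interval can be sketched on simulated AR(1) data (parameter values invented):

```python
import numpy as np

rng = np.random.default_rng(5)
T = 300
y = np.zeros(T)
for t in range(1, T):
    y[t] = 0.5 + 0.6 * y[t - 1] + rng.standard_normal()

# AR(1) by OLS
X = np.column_stack([np.ones(T - 1), y[:-1]])
b = np.linalg.lstsq(X, y[1:], rcond=None)[0]
resid = y[1:] - X @ b

# SER with T - 2 degrees of freedom (two estimated coefficients)
ser = np.sqrt(resid @ resid / (len(resid) - 2))

# Point forecast and approximate 95% interval: forecast +/- 1.96 * SER
forecast = b[0] + b[1] * y[-1]
lo, hi = forecast - 1.96 * ser, forecast + 1.96 * ser
print(lo, forecast, hi)
```

Because the coefficient-uncertainty term is dropped, this interval is slightly too narrow; the omission is minor when T is large relative to the number of estimated coefficients.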
45 10.4. Lag length selection using information criteria Important practical issue: How many lags should be included in a time series regression? Aim of this section: Presentation of statistical methods for choosing the number of lags in an autoregression and in a time series regression with multiple predictors 326
46 Determining the order of an autoregression Potential trade-off: If the order p of an estimated autoregression is too low, we omit potentially valuable information contained in the more distant lagged values; if it is too high, we estimate more coefficients than necessary, thus introducing additional estimation error into our forecasts. Two approaches: the t-statistic approach, and the use of information criteria 327
47 t-statistic approach: Consider a model with many lags (that is, with a high value of p) and perform hypothesis tests on the final lag. Example: Start by estimating an AR(6) model and test whether the coefficient on the sixth lag is significant at the 5% level; if not, drop it and estimate an AR(5) model, test the significance of the fifth lag, and so forth. Remarks: The t-statistic approach has the tendency to produce too large a model 328
48 Remarks: [continued] Reasoning: Even if the true AR order is 5 (that is, β_6 = 0), a t-test of H_0: β_6 = 0 at the 5% level will incorrectly reject the null hypothesis 5% of the time by chance. When the true value of p is five, this approach will therefore estimate p to be six 5% of the time. Some textbooks suggest a t-statistic approach starting with the order p = 0 and then successively including AR terms whenever the t-statistic indicates significance at the 5% level (modeling from small to large). This is not a recommended procedure because of potential omitted-variable bias 329
49 Information criteria: Information criteria are measures reflecting the trade-off described on Slide 327 in the selection of the order p of an autoregression. We can estimate the order p of an autoregression by minimizing such an information criterion. The most popular criteria are the Schwarz (SIC) and the Akaike (AIC) information criteria. Both the SIC and the AIC are based on the sum of squared residuals (SSR) of the AR(p) model estimated by OLS (see Slide 22): SSR(p) = Σ_{t=1}^{T} û_t² = Σ_{t=1}^{T} (Y_t - β̂_0 - β̂_1 Y_{t-1} - ... - β̂_p Y_{t-p})² 330
50 Definition 10.10: (Schwarz, Akaike information criteria) The Schwarz (SIC) and the Akaike (AIC) information criteria of an AR(p) model estimated by OLS are respectively defined as SIC(p) = ln[SSR(p)/T] + (p+1) · ln(T)/T, (10.19) AIC(p) = ln[SSR(p)/T] + (p+1) · 2/T. (10.20) The SIC and the AIC estimators of p are the values that respectively minimize SIC(p) and AIC(p) among the possible choices p = 0, 1, ..., p_max, where p_max is the largest value of p considered and p = 0 corresponds to the model that contains only an intercept. 331
51 Information criteria: [continued] The first term on the right-hand side of Eqs. (10.19) and (10.20) necessarily decreases when adding a lag to the autoregression. The respective second terms, (p+1)·ln(T)/T and (p+1)·2/T, necessarily increase when adding a lag to the autoregression (terms punishing the inclusion of further lags). The two terms in Eqs. (10.19) and (10.20) thus reflect the trade-off. Both information criteria are routinely computed by EViews (see the outputs on Slides 296, 304) 332
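A sketch of SIC/AIC lag selection on a simulated AR(2) series (coefficients invented); note that every candidate AR(p) is estimated over the same sample of T - p_max observations, as the remark on Slide 334 requires:

```python
import numpy as np

def info_criteria(y, p, p_max):
    """SIC(p) and AIC(p), Eqs. (10.19)-(10.20); all candidate models use
    the same estimation sample of len(y) - p_max observations."""
    T = len(y) - p_max
    Y = y[p_max:]
    X = np.ones((T, 1))
    if p > 0:
        X = np.column_stack([X] + [y[p_max - j:len(y) - j]
                                   for j in range(1, p + 1)])
    resid = Y - X @ np.linalg.lstsq(X, Y, rcond=None)[0]
    ssr = resid @ resid
    sic = np.log(ssr / T) + (p + 1) * np.log(T) / T
    aic = np.log(ssr / T) + (p + 1) * 2 / T
    return sic, aic

# Simulated AR(2) data: the SIC should tend to select p near the true order
rng = np.random.default_rng(6)
y = np.zeros(600)
for t in range(2, 600):
    y[t] = 0.5 * y[t - 1] + 0.25 * y[t - 2] + rng.standard_normal()

p_max = 6
sic_vals = [info_criteria(y, p, p_max)[0] for p in range(p_max + 1)]
p_hat = int(np.argmin(sic_vals))
print(p_hat)
```

Fixing the estimation sample at T - p_max observations keeps the SSR terms comparable across p; otherwise models with fewer lags would mechanically use more data.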
52 Information criteria: [continued] It can be proved that the SIC estimator of p is consistent, whereas the AIC estimator of p is not consistent (the AIC overestimates p with nonzero probability). The Schwarz (SIC) and Akaike (AIC) information criteria of distinct AR(p) models for ΔINF_t, 1958:Q4 2005:Q1 (T = 186): [table of SIC(p) and AIC(p) values by AR order p; numerical values not recoverable] 333
53 Remark: The SIC and AIC estimates of p should be determined by running all autoregressions involved over the same sampling period (thus using the same number of observations). Example: Our U.S. inflation dataset originally covers the sampling period 1957:Q1 2005:Q1 (T = 193). Since we estimate, inter alia, an AR(5) model involving the highest lag ΔINF_{t-5}, the feasible sampling period adjusts to 1958:Q4 2005:Q1 (T = 186). It is this period, 1958:Q4 2005:Q1 with T = 186, that should also be used to compute the SIC and AIC values in all other autoregressions with p ≤ 4 334
54 Lag length selection in time series regression with multiple predictors Potential trade-off here: The choice of the number of predictors plus the corresponding lag lengths must balance the benefit of using additional information against the cost of estimating additional coefficients. Two approaches: the F-statistic approach, and information criteria 335
55 F-statistic approach: Use the F-statistic to test joint hypotheses that sets of coefficients are simultaneously equal to zero. Example: Consider Eq. (10.16) on Slide 318 and use the F-statistic to test the null hypothesis H_0: δ_{11} = 0, δ_{12} = 0, ..., δ_{1q_1} = 0 (the predictor X_1 has no predictive content). Similar to the t-statistic approach, the F-statistic approach has the tendency to produce too large models 336
56 Information criteria: Consider the general time series regression model with multiple predictors defined in Eq. (10.16) on Slide 318 and denote the number of all coefficients (including the intercept) by K. The SIC and AIC information criteria are then modified as SIC(K) = ln[SSR(K)/T] + K · ln(T)/T, AIC(K) = ln[SSR(K)/T] + K · 2/T. Evaluate the SIC (or AIC) for each candidate model; the model with the minimal value of the SIC (or AIC) is the preferred model 337
57 Information criteria: [continued] Two important practical considerations: Again, all candidate models must be estimated over the same sampling period (see Slide 334). When there are multiple predictors, the approach gets computationally demanding, since it requires computing many different models (many combinations of the lag parameters). Convenient shortcut: Require all the regressors to have the same number of lags, that is, require that p = q_1 = ... = q_k, so that only p_max + 1 models need to be compared (corresponding to p = 0, 1, ..., p_max) 338
58 Thank you for your attention! 339
More informationThe Multiple Regression Model Estimation
Lesson 5 The Multiple Regression Model Estimation Pilar González and Susan Orbe Dpt Applied Econometrics III (Econometrics and Statistics) Pilar González and Susan Orbe OCW 2014 Lesson 5 Regression model:
More informationPractice Questions for the Final Exam. Theoretical Part
Brooklyn College Econometrics 7020X Spring 2016 Instructor: G. Koimisis Name: Date: Practice Questions for the Final Exam Theoretical Part 1. Define dummy variable and give two examples. 2. Analyze the
More information11. Simultaneous-Equation Models
11. Simultaneous-Equation Models Up to now: Estimation and inference in single-equation models Now: Modeling and estimation of a system of equations 328 Example: [I] Analysis of the impact of advertisement
More informationLecture#17. Time series III
Lecture#17 Time series III 1 Dynamic causal effects Think of macroeconomic data. Difficult to think of an RCT. Substitute: different treatments to the same (observation unit) at different points in time.
More informationThe general linear regression with k explanatory variables is just an extension of the simple regression as follows
3. Multiple Regression Analysis The general linear regression with k explanatory variables is just an extension of the simple regression as follows (1) y i = β 0 + β 1 x i1 + + β k x ik + u i. Because
More informationExercise Sheet 6: Solutions
Exercise Sheet 6: Solutions R.G. Pierse 1. (a) Regression yields: Dependent Variable: LC Date: 10/29/02 Time: 18:37 Sample(adjusted): 1950 1985 Included observations: 36 after adjusting endpoints C 0.244716
More information1 Introduction. 2 AIC versus SBIC. Erik Swanson Cori Saviano Li Zha Final Project
Erik Swanson Cori Saviano Li Zha Final Project 1 Introduction In analyzing time series data, we are posed with the question of how past events influences the current situation. In order to determine this,
More informationCovers Chapter 10-12, some of 16, some of 18 in Wooldridge. Regression Analysis with Time Series Data
Covers Chapter 10-12, some of 16, some of 18 in Wooldridge Regression Analysis with Time Series Data Obviously time series data different from cross section in terms of source of variation in x and y temporal
More informationLATVIAN GDP: TIME SERIES FORECASTING USING VECTOR AUTO REGRESSION
LATVIAN GDP: TIME SERIES FORECASTING USING VECTOR AUTO REGRESSION BEZRUCKO Aleksandrs, (LV) Abstract: The target goal of this work is to develop a methodology of forecasting Latvian GDP using ARMA (AutoRegressive-Moving-Average)
More informationPractical Econometrics. for. Finance and Economics. (Econometrics 2)
Practical Econometrics for Finance and Economics (Econometrics 2) Seppo Pynnönen and Bernd Pape Department of Mathematics and Statistics, University of Vaasa 1. Introduction 1.1 Econometrics Econometrics
More informationCORRELATION, ASSOCIATION, CAUSATION, AND GRANGER CAUSATION IN ACCOUNTING RESEARCH
CORRELATION, ASSOCIATION, CAUSATION, AND GRANGER CAUSATION IN ACCOUNTING RESEARCH Alireza Dorestani, Northeastern Illinois University Sara Aliabadi, Northeastern Illinois University ABSTRACT In this paper
More informationAPPLIED MACROECONOMETRICS Licenciatura Universidade Nova de Lisboa Faculdade de Economia. FINAL EXAM JUNE 3, 2004 Starts at 14:00 Ends at 16:30
APPLIED MACROECONOMETRICS Licenciatura Universidade Nova de Lisboa Faculdade de Economia FINAL EXAM JUNE 3, 2004 Starts at 14:00 Ends at 16:30 I In Figure I.1 you can find a quarterly inflation rate series
More information7 Introduction to Time Series
Econ 495 - Econometric Review 1 7 Introduction to Time Series 7.1 Time Series vs. Cross-Sectional Data Time series data has a temporal ordering, unlike cross-section data, we will need to changes some
More informationThe GARCH Analysis of YU EBAO Annual Yields Weiwei Guo1,a
2nd Workshop on Advanced Research and Technology in Industry Applications (WARTIA 2016) The GARCH Analysis of YU EBAO Annual Yields Weiwei Guo1,a 1 Longdong University,Qingyang,Gansu province,745000 a
More informationFinQuiz Notes
Reading 9 A time series is any series of data that varies over time e.g. the quarterly sales for a company during the past five years or daily returns of a security. When assumptions of the regression
More informationApplied Statistics and Econometrics
Applied Statistics and Econometrics Lecture 6 Saul Lach September 2017 Saul Lach () Applied Statistics and Econometrics September 2017 1 / 53 Outline of Lecture 6 1 Omitted variable bias (SW 6.1) 2 Multiple
More informationRomanian Economic and Business Review Vol. 3, No. 3 THE EVOLUTION OF SNP PETROM STOCK LIST - STUDY THROUGH AUTOREGRESSIVE MODELS
THE EVOLUTION OF SNP PETROM STOCK LIST - STUDY THROUGH AUTOREGRESSIVE MODELS Marian Zaharia, Ioana Zaheu, and Elena Roxana Stan Abstract Stock exchange market is one of the most dynamic and unpredictable
More informationEmpirical Market Microstructure Analysis (EMMA)
Empirical Market Microstructure Analysis (EMMA) Lecture 3: Statistical Building Blocks and Econometric Basics Prof. Dr. Michael Stein michael.stein@vwl.uni-freiburg.de Albert-Ludwigs-University of Freiburg
More informationMeasures of Fit from AR(p)
Measures of Fit from AR(p) Residual Sum of Squared Errors Residual Mean Squared Error Root MSE (Standard Error of Regression) R-squared R-bar-squared = = T t e t SSR 1 2 ˆ = = T t e t p T s 1 2 2 ˆ 1 1
More informationEconometrics I: Univariate Time Series Econometrics (1)
Econometrics I: Dipartimento di Economia Politica e Metodi Quantitativi University of Pavia Overview of the Lecture 1 st EViews Session VI: Some Theoretical Premises 2 Overview of the Lecture 1 st EViews
More informationReview of Econometrics
Review of Econometrics Zheng Tian June 5th, 2017 1 The Essence of the OLS Estimation Multiple regression model involves the models as follows Y i = β 0 + β 1 X 1i + β 2 X 2i + + β k X ki + u i, i = 1,...,
More informationEmpirical Economic Research, Part II
Based on the text book by Ramanathan: Introductory Econometrics Robert M. Kunst robert.kunst@univie.ac.at University of Vienna and Institute for Advanced Studies Vienna December 7, 2011 Outline Introduction
More informationMultiple Regression Analysis. Part III. Multiple Regression Analysis
Part III Multiple Regression Analysis As of Sep 26, 2017 1 Multiple Regression Analysis Estimation Matrix form Goodness-of-Fit R-square Adjusted R-square Expected values of the OLS estimators Irrelevant
More informationFrequency Forecasting using Time Series ARIMA model
Frequency Forecasting using Time Series ARIMA model Manish Kumar Tikariha DGM(O) NSPCL Bhilai Abstract In view of stringent regulatory stance and recent tariff guidelines, Deviation Settlement mechanism
More informationHeteroskedasticity. Part VII. Heteroskedasticity
Part VII Heteroskedasticity As of Oct 15, 2015 1 Heteroskedasticity Consequences Heteroskedasticity-robust inference Testing for Heteroskedasticity Weighted Least Squares (WLS) Feasible generalized Least
More informationFinal Exam Financial Data Analysis at the University of Freiburg (Winter Semester 2008/2009) Friday, November 14, 2008,
Professor Dr. Roman Liesenfeld Final Exam Financial Data Analysis at the University of Freiburg (Winter Semester 2008/2009) Friday, November 14, 2008, 10.00 11.30am 1 Part 1 (38 Points) Consider the following
More informationEconometrics II Heij et al. Chapter 7.1
Chapter 7.1 p. 1/2 Econometrics II Heij et al. Chapter 7.1 Linear Time Series Models for Stationary data Marius Ooms Tinbergen Institute Amsterdam Chapter 7.1 p. 2/2 Program Introduction Modelling philosophy
More information2) For a normal distribution, the skewness and kurtosis measures are as follows: A) 1.96 and 4 B) 1 and 2 C) 0 and 3 D) 0 and 0
Introduction to Econometrics Midterm April 26, 2011 Name Student ID MULTIPLE CHOICE. Choose the one alternative that best completes the statement or answers the question. (5,000 credit for each correct
More informationVector Autoregressive Model. Vector Autoregressions II. Estimation of Vector Autoregressions II. Estimation of Vector Autoregressions I.
Vector Autoregressive Model Vector Autoregressions II Empirical Macroeconomics - Lect 2 Dr. Ana Beatriz Galvao Queen Mary University of London January 2012 A VAR(p) model of the m 1 vector of time series
More informationExercise Sheet 5: Solutions
Exercise Sheet 5: Solutions R.G. Pierse 2. Estimation of Model M1 yields the following results: Date: 10/24/02 Time: 18:06 C -1.448432 0.696587-2.079327 0.0395 LPC -0.306051 0.272836-1.121740 0.2640 LPF
More informationAbout the seasonal effects on the potential liquid consumption
About the seasonal effects on the potential liquid consumption Lucie Ravelojaona Guillaume Perrez Clément Cousin ENAC 14/01/2013 Consumption raw data Figure : Evolution during one year of different family
More informationECON3150/4150 Spring 2015
ECON3150/4150 Spring 2015 Lecture 3&4 - The linear regression model Siv-Elisabeth Skjelbred University of Oslo January 29, 2015 1 / 67 Chapter 4 in S&W Section 17.1 in S&W (extended OLS assumptions) 2
More informationLECTURE 11. Introduction to Econometrics. Autocorrelation
LECTURE 11 Introduction to Econometrics Autocorrelation November 29, 2016 1 / 24 ON PREVIOUS LECTURES We discussed the specification of a regression equation Specification consists of choosing: 1. correct
More informationECON 366: ECONOMETRICS II. SPRING TERM 2005: LAB EXERCISE #10 Nonspherical Errors Continued. Brief Suggested Solutions
DEPARTMENT OF ECONOMICS UNIVERSITY OF VICTORIA ECON 366: ECONOMETRICS II SPRING TERM 2005: LAB EXERCISE #10 Nonspherical Errors Continued Brief Suggested Solutions 1. In Lab 8 we considered the following
More informationEC408 Topics in Applied Econometrics. B Fingleton, Dept of Economics, Strathclyde University
EC408 Topics in Applied Econometrics B Fingleton, Dept of Economics, Strathclyde University Applied Econometrics What is spurious regression? How do we check for stochastic trends? Cointegration and Error
More informationFinancial Econometrics
Financial Econometrics Multivariate Time Series Analysis: VAR Gerald P. Dwyer Trinity College, Dublin January 2013 GPD (TCD) VAR 01/13 1 / 25 Structural equations Suppose have simultaneous system for supply
More informationECON3327: Financial Econometrics, Spring 2016
ECON3327: Financial Econometrics, Spring 2016 Wooldridge, Introductory Econometrics (5th ed, 2012) Chapter 11: OLS with time series data Stationary and weakly dependent time series The notion of a stationary
More informationEastern Mediterranean University Department of Economics ECON 503: ECONOMETRICS I. M. Balcilar. Midterm Exam Fall 2007, 11 December 2007.
Eastern Mediterranean University Department of Economics ECON 503: ECONOMETRICS I M. Balcilar Midterm Exam Fall 2007, 11 December 2007 Duration: 120 minutes Questions Q1. In order to estimate the demand
More informationG. S. Maddala Kajal Lahiri. WILEY A John Wiley and Sons, Ltd., Publication
G. S. Maddala Kajal Lahiri WILEY A John Wiley and Sons, Ltd., Publication TEMT Foreword Preface to the Fourth Edition xvii xix Part I Introduction and the Linear Regression Model 1 CHAPTER 1 What is Econometrics?
More informationF9 F10: Autocorrelation
F9 F10: Autocorrelation Feng Li Department of Statistics, Stockholm University Introduction In the classic regression model we assume cov(u i, u j x i, x k ) = E(u i, u j ) = 0 What if we break the assumption?
More informationEconometrics I. Professor William Greene Stern School of Business Department of Economics 25-1/25. Part 25: Time Series
Econometrics I Professor William Greene Stern School of Business Department of Economics 25-1/25 Econometrics I Part 25 Time Series 25-2/25 Modeling an Economic Time Series Observed y 0, y 1,, y t, What
More informationEconometrics Honor s Exam Review Session. Spring 2012 Eunice Han
Econometrics Honor s Exam Review Session Spring 2012 Eunice Han Topics 1. OLS The Assumptions Omitted Variable Bias Conditional Mean Independence Hypothesis Testing and Confidence Intervals Homoskedasticity
More informationEconometrics. Week 8. Fall Institute of Economic Studies Faculty of Social Sciences Charles University in Prague
Econometrics Week 8 Institute of Economic Studies Faculty of Social Sciences Charles University in Prague Fall 2012 1 / 25 Recommended Reading For the today Instrumental Variables Estimation and Two Stage
More information8. Instrumental variables regression
8. Instrumental variables regression Recall: In Section 5 we analyzed five sources of estimation bias arising because the regressor is correlated with the error term Violation of the first OLS assumption
More informationOutline. Nature of the Problem. Nature of the Problem. Basic Econometrics in Transportation. Autocorrelation
1/30 Outline Basic Econometrics in Transportation Autocorrelation Amir Samimi What is the nature of autocorrelation? What are the theoretical and practical consequences of autocorrelation? Since the assumption
More informationECONOMETRICS HONOR S EXAM REVIEW SESSION
ECONOMETRICS HONOR S EXAM REVIEW SESSION Eunice Han ehan@fas.harvard.edu March 26 th, 2013 Harvard University Information 2 Exam: April 3 rd 3-6pm @ Emerson 105 Bring a calculator and extra pens. Notes
More information11.1 Gujarati(2003): Chapter 12
11.1 Gujarati(2003): Chapter 12 Time Series Data 11.2 Time series process of economic variables e.g., GDP, M1, interest rate, echange rate, imports, eports, inflation rate, etc. Realization An observed
More informationSolution to Exercise E6.
Solution to Exercise E6. The Multiple Regression Model. Inference Exercise E6.1 Beach umbrella rental Part I. Simple Linear Regression Model. a. Regression model: U t = α + β T t + u t t = 1,..., 22 Model
More informationApplied Econometrics. Applied Econometrics Second edition. Dimitrios Asteriou and Stephen G. Hall
Applied Econometrics Second edition Dimitrios Asteriou and Stephen G. Hall MULTICOLLINEARITY 1. Perfect Multicollinearity 2. Consequences of Perfect Multicollinearity 3. Imperfect Multicollinearity 4.
More informationEconometrics and Structural
Introduction to Time Series Econometrics and Structural Breaks Ziyodullo Parpiev, PhD Outline 1. Stochastic processes 2. Stationary processes 3. Purely random processes 4. Nonstationary processes 5. Integrated
More informationTime Series. Chapter Time Series Data
Chapter 10 Time Series 10.1 Time Series Data The main difference between time series data and cross-sectional data is the temporal ordering. To emphasize the proper ordering of the observations, Table
More informationTime Series Analysis. James D. Hamilton PRINCETON UNIVERSITY PRESS PRINCETON, NEW JERSEY
Time Series Analysis James D. Hamilton PRINCETON UNIVERSITY PRESS PRINCETON, NEW JERSEY & Contents PREFACE xiii 1 1.1. 1.2. Difference Equations First-Order Difference Equations 1 /?th-order Difference
More informationForecasting Seasonal Time Series 1. Introduction. Philip Hans Franses Econometric Institute Erasmus University Rotterdam
Forecasting Seasonal Time Series 1. Introduction Philip Hans Franses Econometric Institute Erasmus University Rotterdam SMU and NUS, Singapore, April-May 2004 1 Outline of tutorial lectures 1 Introduction
More informationAUTOCORRELATION. Phung Thanh Binh
AUTOCORRELATION Phung Thanh Binh OUTLINE Time series Gauss-Markov conditions The nature of autocorrelation Causes of autocorrelation Consequences of autocorrelation Detecting autocorrelation Remedial measures
More informationEconomtrics of money and finance Lecture six: spurious regression and cointegration
Economtrics of money and finance Lecture six: spurious regression and cointegration Zongxin Qian School of Finance, Renmin University of China October 21, 2014 Table of Contents Overview Spurious regression
More informationApplied Econometrics. Professor Bernard Fingleton
Applied Econometrics Professor Bernard Fingleton 1 Causation & Prediction 2 Causation One of the main difficulties in the social sciences is estimating whether a variable has a true causal effect Data
More informationMultivariate Time Series Analysis and Its Applications [Tsay (2005), chapter 8]
1 Multivariate Time Series Analysis and Its Applications [Tsay (2005), chapter 8] Insights: Price movements in one market can spread easily and instantly to another market [economic globalization and internet
More informationSTAT Regression Methods
STAT 501 - Regression Methods Unit 9 Examples Example 1: Quake Data Let y t = the annual number of worldwide earthquakes with magnitude greater than 7 on the Richter scale for n = 99 years. Figure 1 gives
More informationAdvanced Econometrics
Advanced Econometrics Marco Sunder Nov 04 2010 Marco Sunder Advanced Econometrics 1/ 25 Contents 1 2 3 Marco Sunder Advanced Econometrics 2/ 25 Music Marco Sunder Advanced Econometrics 3/ 25 Music Marco
More informationAutoregressive approaches to import export time series I: basic techniques
Modern Stochastics: Theory and Applications 2 (2015) 51 65 DOI: 10.15559/15-VMSTA22 Autoregressive approaches to import export time series I: basic techniques Luca Di Persio a a Dept. Informatics, University
More informationModel Specification and Data Problems. Part VIII
Part VIII Model Specification and Data Problems As of Oct 24, 2017 1 Model Specification and Data Problems RESET test Non-nested alternatives Outliers A functional form misspecification generally means
More informationEconometría 2: Análisis de series de Tiempo
Econometría 2: Análisis de series de Tiempo Karoll GOMEZ kgomezp@unal.edu.co http://karollgomez.wordpress.com Segundo semestre 2016 IX. Vector Time Series Models VARMA Models A. 1. Motivation: The vector
More informationE 4160 Autumn term Lecture 9: Deterministic trends vs integrated series; Spurious regression; Dickey-Fuller distribution and test
E 4160 Autumn term 2016. Lecture 9: Deterministic trends vs integrated series; Spurious regression; Dickey-Fuller distribution and test Ragnar Nymoen Department of Economics, University of Oslo 24 October
More informationE 4101/5101 Lecture 9: Non-stationarity
E 4101/5101 Lecture 9: Non-stationarity Ragnar Nymoen 30 March 2011 Introduction I Main references: Hamilton Ch 15,16 and 17. Davidson and MacKinnon Ch 14.3 and 14.4 Also read Ch 2.4 and Ch 2.5 in Davidson
More information1 Regression with Time Series Variables
1 Regression with Time Series Variables With time series regression, Y might not only depend on X, but also lags of Y and lags of X Autoregressive Distributed lag (or ADL(p; q)) model has these features:
More information