Forecasting. Francis X. Diebold, University of Pennsylvania.
1 Forecasting Francis X. Diebold University of Pennsylvania August 11, / 323
2 Copyright c 2013 onward, by Francis X. Diebold. These materials are freely available for your use, but be warned: they are highly preliminary, significantly incomplete, and rapidly evolving. All are licensed under the Creative Commons Attribution-NonCommercial-NoDerivatives 4.0 International License. (Briefly: I retain copyright, but you can use, copy and distribute non-commercially, so long as you give me attribution and do not modify. To view a copy of the license, visit In return I ask that you please cite the books whenever appropriate, as: Diebold, F.X. (year here), Book Title Here, Department of Economics, University of Pennsylvania, fdiebold/textbooks.html. The painting is Enigma, by Glen Josselsohn, from Wikimedia Commons. 2 / 323
3 Elements of Forecasting in Business, Finance, Economics and Government 1. Forecasting in Action 1.1 Operations planning and control 1.2 Marketing 1.3 Economics 1.4 Financial speculation 1.5 Financial risk management 1.6 Capacity planning 1.7 Business and government budgeting 1.8 Demography 1.9 Crisis management 3 / 323
4 Forecasting Methods: An Overview Review of probability, statistics and regression Six Considerations Basic to Successful Forecasting 1. Forecasts and decisions 2. The object to be forecast 3. Forecast types 4. The forecast horizon 5. The information set 6. Methods and complexity 6.1 The parsimony principle 6.2 The shrinkage principle 4 / 323
5 Statistical Graphics for Forecasting Why graphical analysis is important Simple graphical techniques Elements of graphical style Application: graphing four components of real GNP 5 / 323
6 Modeling and Forecasting Trend Modeling trend Estimating trend models Forecasting trend Selecting forecasting models using the Akaike and Schwarz criteria Application: forecasting retail sales 6 / 323
7 Modeling and Forecasting Seasonality The nature and sources of seasonality Modeling seasonality Forecasting seasonal series Application: forecasting housing starts 7 / 323
8 Characterizing Cycles Covariance stationary time series White noise The lag operator Wold s theorem, the general linear process, and rational distributed lags Estimation and inference for the mean, autocorrelation and partial autocorrelation functions Application: characterizing Canadian employment dynamics 8 / 323
9 Modeling Cycles: MA, AR and ARMA Models Moving-average (MA) models Autoregressive (AR) models Autoregressive moving average (ARMA) models Application: specifying and estimating models for forecasting employment 9 / 323
10 Forecasting Cycles Optimal forecasts Forecasting moving average processes Forecasting infinite-ordered moving averages Making the forecasts operational The chain rule of forecasting Application: forecasting employment 10 / 323
11 Putting it all Together: A Forecasting Model with Trend, Seasonal and Cyclical Components Assembling what we ve learned Application: forecasting liquor sales Recursive estimation procedures for diagnosing and selecting forecasting models 11 / 323
12 Forecasting with Regression Models Conditional forecasting models and scenario analysis Accounting for parameter uncertainty in confidence intervals for conditional forecasts Unconditional forecasting models Distributed lags, polynomial distributed lags, and rational distributed lags Regressions with lagged dependent variables, regressions with ARMA disturbances, and transfer function models Vector autoregressions Predictive causality Impulse-response functions and variance decomposition Application: housing starts and completions 12 / 323
13 Evaluating and Combining Forecasts Evaluating a single forecast Evaluating two or more forecasts: comparing forecast accuracy Forecast encompassing and forecast combination Application: OverSea shipping volume on the Atlantic East trade lane 13 / 323
14 Unit Roots, Stochastic Trends, ARIMA Forecasting Models, and Smoothing Stochastic trends and forecasting Unit roots: estimation and testing Application: modeling and forecasting the yen/dollar exchange rate Smoothing Exchange rates, continued 14 / 323
15 Volatility Measurement, Modeling and Forecasting The basic ARCH process The GARCH process Extensions of ARCH and GARCH models Estimating, forecasting and diagnosing GARCH models Application: stock market volatility 15 / 323
16 Useful Books, Journals and Software Books Statistics review, etc.: Wonnacott, T.H. and Wonnacott, R.J. (1990), Introductory Statistics, Fifth Edition. New York: John Wiley and Sons. Pindyck, R.S. and Rubinfeld, D.L. (1997),Econometric Models and Economic Forecasts, Fourth Edition. New York: McGraw-Hill. Maddala, G.S. (2001), Introduction to Econometrics, Third Edition. New York: Macmillan. Kennedy, P. (1998), A Guide to Econometrics, Fourth Edition. Cambridge, Mass.: MIT Press. 16 / 323
17 Useful Books, Journals and Software cont. Time series analysis: Chatfield, C. (1996), The Analysis of Time Series: An Introduction, Fifth Edition. London: Chapman and Hall. Granger, C.W.J. and Newbold, P. (1986), Forecasting Economic Time Series, Second Edition. Orlando, Florida: Academic Press. Harvey, A.C. (1993), Time Series Models, Second Edition. Cambridge, Mass.: MIT Press. Hamilton, J.D. (1994), Time Series Analysis, Princeton: Princeton University Press. 17 / 323
18 Useful Books, Journals and Software cont. Special insights: Armstrong, J.S. (Ed.) (1999), The Principles of Forecasting. Norwell, Mass.: Kluwer Academic Forecasting. Makridakis, S. and Wheelwright S.C. (1997), Forecasting: Methods and Applications, Third Edition. New York: John Wiley. Bails, D.G. and Peppers, L.C. (1997), Business Fluctuations. Englewood Cliffs: Prentice Hall. Taylor, S. (1996), Modeling Financial Time Series, Second Edition. New York: Wiley. 18 / 323
19 Useful Books, Journals and Software cont. Journals Journal of Forecasting Journal of Business Forecasting Methods and Systems Journal of Business and Economic Statistics Review of Economics and Statistics Journal of Applied Econometrics 19 / 323
20 Useful Books, Journals and Software cont. Software General: Eviews S+ Minitab SAS R Python Many more... Cross-section: Stata Open-ended: Matlab 20 / 323
21 Useful Books, Journals and Software cont. Online Information Resources for Economists: 21 / 323
22 A Brief Review of Probability, Statistics, and Regression for Forecasting Topics Discrete Random Variable Discrete Probability Distribution Continuous Random Variable Probability Density Function Moment Mean, or Expected Value Location, or Central Tendency Variance Dispersion, or Scale Standard Deviation Skewness Asymmetry Kurtosis Leptokurtosis 22 / 323
23 A Brief Review of Probability, Statistics, and Regression for Forecasting Topics cont. Skewness Asymmetry Kurtosis Leptokurtosis Normal, or Gaussian, Distribution Marginal Distribution Joint Distribution Covariance Correlation Conditional Distribution Conditional Moment Conditional Mean Conditional Variance 23 / 323
24 A Brief Review of Probability, Statistics, and Regression for Forecasting cont. Topics cont. Population Distribution Sample Estimator Statistic, or Sample Statistic Sample Mean Sample Variance Sample Standard Deviation Sample Skewness Sample Kurtosis χ 2 Distribution t Distribution F Distribution Jarque-Bera Test 24 / 323
25 Regression as Curve Fitting Least-squares estimation: min_β Σ_{t=1}^T [y_t − β_0 − β_1 x_t]². Fitted values: ŷ_t = β̂_0 + β̂_1 x_t. Residuals: e_t = y_t − ŷ_t 25 / 323
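A minimal numerical sketch of the curve-fitting view, using only NumPy (the simulated y and x series are placeholders; any two series of equal length work the same way):

```python
import numpy as np

# Simulated data standing in for a real (y, x) pair
rng = np.random.default_rng(0)
x = rng.normal(size=200)
y = 1.0 + 0.5 * x + rng.normal(size=200)

# Least-squares estimation: regress y on a constant and x
X = np.column_stack([np.ones_like(x), x])
beta_hat, *_ = np.linalg.lstsq(X, y, rcond=None)

y_fit = X @ beta_hat          # fitted values
e = y - y_fit                 # residuals
print(beta_hat, e.mean())
```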
26 Regression as a probabilistic model Simple regression: y t = β 0 + β 1 x t + ε t ε t iid (0, σ 2 ) Multiple regression: y t = β 0 + β 1 x t + β 2 z t + ε t ε t iid (0, σ 2 ) 26 / 323
27 Regression as a probabilistic model cont. Mean dependent var: ȳ = (1/T) Σ_{t=1}^T y_t. S.D. dependent var: SD = sqrt( Σ_{t=1}^T (y_t − ȳ)² / (T − 1) ). Sum squared resid: SSR = Σ_{t=1}^T e_t² 27 / 323
28 Regression as a probabilistic model cont. F statistic: F = [(SSR_res − SSR)/(k − 1)] / [SSR/(T − k)]. S.E. of regression: s² = Σ_{t=1}^T e_t² / (T − k), SER = √s² 28 / 323
29 Regression as a probabilistic model cont. R squared: R² = 1 − Σ_{t=1}^T e_t² / Σ_{t=1}^T (y_t − ȳ)². Adjusted R squared: R̄² = 1 − [Σ_{t=1}^T e_t² / (T − k)] / [Σ_{t=1}^T (y_t − ȳ)² / (T − 1)] 29 / 323
30 Regression as a probabilistic model cont. Akaike info criterion: AIC = e^(2k/T) Σ_{t=1}^T e_t² / T. Schwarz criterion: SIC = T^(k/T) Σ_{t=1}^T e_t² / T 30 / 323
31 Regression as a probabilistic model cont. Durbin-Watson stat: for y_t = β_0 + β_1 x_t + ε_t with ε_t = φε_{t−1} + v_t, v_t ~ iid N(0, σ²), DW = Σ_{t=2}^T (e_t − e_{t−1})² / Σ_{t=1}^T e_t² 31 / 323
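The regression summary statistics above can be reproduced directly from the residuals. A hedged sketch (the function and argument names are mine; AIC and SIC are in the forms defined on the slides rather than the log-likelihood forms some packages report):

```python
import numpy as np

def regression_diagnostics(y, e, k):
    """Summary statistics as defined on the slides.
    y: dependent variable (NumPy array), e: OLS residuals, k: number of regressors."""
    T = len(y)
    ssr = np.sum(e**2)
    tss = np.sum((y - y.mean())**2)
    r2 = 1 - ssr / tss
    r2_adj = 1 - (ssr / (T - k)) / (tss / (T - 1))
    s2 = ssr / (T - k)                      # s^2
    ser = np.sqrt(s2)                       # standard error of regression
    aic = np.exp(2 * k / T) * ssr / T       # AIC as defined on the slide
    sic = T ** (k / T) * ssr / T            # SIC as defined on the slide
    dw = np.sum(np.diff(e)**2) / ssr        # Durbin-Watson
    return dict(R2=r2, R2_adj=r2_adj, s2=s2, SER=ser, AIC=aic, SIC=sic, DW=dw)
```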
32 Regression of y on x and z 32 / 323
33 Scatterplot of y versus x 33 / 323
34 Scatterplot of y versus x Regression Line Superimposed 34 / 323
35 Scatterplot of y versus z Regression Line Superimposed 35 / 323
36 Residual Plot Regression of y on x and z 36 / 323
37 Six Considerations Basic to Successful Forecasting 1. The Decision Environment and Loss Function: L(e) = e², L(e) = |e| 2. The Forecast Object Event outcome, event timing, time series. 3. The Forecast Statement Point forecast, interval forecast, density forecast, probability forecast 37 / 323
38 Six Considerations Basic to Successful Forecasting cont. 4. The Forecast Horizon h-step ahead forecast h-step-ahead extrapolation forecast 5. The Information Set Ω univariate T = {y T, y T 1,..., y 1 } Ω multivariate T = {y T, x T, y T 1, x T 1,..., y 1, x 1 } 6. Methods and Complexity, the Parsimony Principle, and the Shrinkage Principle Signal vs. noise Smaller is often better Even incorrect restrictions can help 38 / 323
39 Six Considerations Basic to Successful Forecasting cont. Decision Making with Symmetric Loss (columns: Demand High / Demand Low): Build Inventory: 0 / $10,000; Reduce Inventory: $10,000 / 0. Decision Making with Asymmetric Loss: Build Inventory: 0 / $10,000; Reduce Inventory: $20,000 / 0. 39 / 323
40 Six Considerations Basic to Successful Forecasting cont. Forecasting with Symmetric Loss (columns: High Actual Sales / Low Actual Sales): High Forecasted Sales: 0 / $10,000; Low Forecasted Sales: $10,000 / 0. Forecasting with Asymmetric Loss: High Forecasted Sales: 0 / $10,000; Low Forecasted Sales: $20,000 / 0. 40 / 323
41 Quadratic Loss 41 / 323
42 Absolute Loss 42 / 323
43 Asymmetric Loss 43 / 323
44 Forecast Statement 44 / 323
45 Forecast Statement cont. 45 / 323
46 Extrapolation Forecast 46 / 323
47 Extrapolation Forecast cont. 47 / 323
48 Statistical Graphics For Forecasting 1. Why Graphical Analysis is Important Graphics helps us summarize and reveal patterns in data Graphics helps us identify anomalies in data Graphics facilitates and encourages comparison of different pieces of data Graphics enables us to present a huge amount of data in a small space, and it enables us to make huge data sets coherent 2. Simple Graphical Techniques Univariate, multivariate Time series vs. distributional shape Relational graphics 3. Elements of Graphical Style Know your audience, and know your goals. Show the data, and appeal to the viewer. Revise and edit, again and again. 4. Application: Graphing Four Components of Real GNP 48 / 323
49 Anscombe's Quartet: data table of the four x-y data sets (1)-(4): x1, y1, x2, y2, x3, y3, x4, y4 49 / 323
50 Anscombe s Quartet 50 / 323
51 Anscombe s Quartet Bivariate Scatterplot 51 / 323
52 1-Year Treasury Bond Rates 52 / 323
53 Change in 1-Year Treasury Bond Rates 53 / 323
54 Liquor Sales 54 / 323
55 Histogram and Descriptive Statistics Change in 1-Year Treasury Bond Rates 55 / 323
56 Scatterplot 1-Year vs. 10-year Treasury Bond Rates 56 / 323
57 Scatterplot Matrix 1-, 10-, 20-, and 30-Year Treasury Bond Rates 57 / 323
58 Time Series Plot Aspect Ratio 1: / 323
59 Time Series Plot Banked to 45 Degrees 59 / 323
60 Time Series Plot Aspect Ratio 1: / 323
61 Graph 61 / 323
62 Graph 62 / 323
63 Graph 63 / 323
64 Graph 64 / 323
65 Graph 65 / 323
66 Components of Real GDP (Millions of Current Dollars, Annual) 66 / 323
67 Modeling and Forecasting Trend 1. Modeling Trend T_t = β_0 + β_1 TIME_t; T_t = β_0 + β_1 TIME_t + β_2 TIME_t²; T_t = β_0 e^{β_1 TIME_t}, i.e. ln(T_t) = ln(β_0) + β_1 TIME_t 67 / 323
68 Modeling and Forecasting Trend 2. Estimating Trend Models (β̂_0, β̂_1) = argmin_{β_0,β_1} Σ_{t=1}^T [y_t − β_0 − β_1 TIME_t]²; (β̂_0, β̂_1, β̂_2) = argmin_{β_0,β_1,β_2} Σ_{t=1}^T [y_t − β_0 − β_1 TIME_t − β_2 TIME_t²]²; (β̂_0, β̂_1) = argmin_{β_0,β_1} Σ_{t=1}^T [y_t − β_0 e^{β_1 TIME_t}]²; or, in log-linear form, (β̂_0, β̂_1) = argmin_{β_0,β_1} Σ_{t=1}^T [ln y_t − ln β_0 − β_1 TIME_t]² 68 / 323
69 Modeling and Forecasting Trend 3. Forecasting Trend y t = β 0 + β 1 TIME t + ε t y T+h = β 0 + β 1 TIME T+h + ε T+h y T+h,T = β 0 + β 1 TIME T+h ŷ T+h,T = ˆβ 0 + ˆβ 1 TIME T+h 69 / 323
70 Modeling and Forecasting Trend 3. Forecasting Trend cont. y T+h,T ± 1.96σ ŷ T+h,T ± 1.96ˆσ N(y T+h,T, σ 2 ) N(ŷ T+h,T, ˆσ 2 ) 70 / 323
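A sketch of fitting linear or quadratic trends by least squares and producing the point and 95% interval forecasts just described, assuming normal errors as on the slides (variable names and the simulated series are illustrative; parameter-estimation uncertainty is ignored):

```python
import numpy as np

def trend_forecast(y, h, degree=1):
    """Fit T_t = b0 + b1*TIME_t (+ b2*TIME_t^2) by OLS; return h-step forecasts and 95% bands."""
    T = len(y)
    time = np.arange(1, T + 1)
    X = np.column_stack([time**d for d in range(degree + 1)])   # 1, t, t^2, ...
    beta, *_ = np.linalg.lstsq(X, y, rcond=None)
    e = y - X @ beta
    sigma2 = np.sum(e**2) / (T - (degree + 1))                  # s^2
    t_future = np.arange(T + 1, T + h + 1)
    Xf = np.column_stack([t_future**d for d in range(degree + 1)])
    point = Xf @ beta                                           # y_hat_{T+h,T}
    half = 1.96 * np.sqrt(sigma2)                               # ignores parameter uncertainty
    return point, point - half, point + half

# Example with a simulated linear-trend series
rng = np.random.default_rng(1)
y = 10 + 0.3 * np.arange(1, 201) + rng.normal(scale=2, size=200)
print(trend_forecast(y, h=12, degree=1)[0][:3])
```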
71 Modeling and Forecasting Trend 4. Selecting Forecasting Models MSE = Σ_{t=1}^T e_t² / T; R² = 1 − Σ_{t=1}^T e_t² / Σ_{t=1}^T (y_t − ȳ)² 71 / 323
72 Modeling and Forecasting Trend 4. Selecting Forecasting Models cont. s² = Σ_{t=1}^T e_t² / (T − k) = (T/(T − k)) Σ_{t=1}^T e_t² / T; R̄² = 1 − [Σ_{t=1}^T e_t² / (T − k)] / [Σ_{t=1}^T (y_t − ȳ)² / (T − 1)] = 1 − s² / [Σ_{t=1}^T (y_t − ȳ)² / (T − 1)] 72 / 323
73 Modeling and Forecasting Trend 4. Selecting Forecasting Models cont. AIC = e^(2k/T) Σ_{t=1}^T e_t² / T; SIC = T^(k/T) Σ_{t=1}^T e_t² / T. Consistency. Efficiency. 73 / 323
74 Labor Force Participation Rate 74 / 323
75 Increasing and Decreasing Labor Trends 75 / 323
76 Labor Force Participation Rate 76 / 323
77 Linear Trend Female Labor Force Participation Rate 77 / 323
78 Linear Trend Male Labor Force Participation Rate 78 / 323
79 Volume on the New York Stock Exchange 79 / 323
80 Various Shapes of Quadratic Trends 80 / 323
81 Quadratic Trend Volume on the New York Stock Exchange 81 / 323
82 Log Volume on the New York Stock Exchange 82 / 323
83 Various Shapes of Exponential Trends 83 / 323
84 Linear Trend Log Volume on the New York Stock Exchange 84 / 323
85 Exponential Trend Volume on the New York Stock Exchange 85 / 323
86 Degree-of-Freedom Penalties Various Model Selection Criteria 86 / 323
87 Retail Sales 87 / 323
88 Retail Sales Linear Trend Regression 88 / 323
89 Retail Sales Linear Trend Residual Plot 89 / 323
90 Retail Sales Quadratic Trend Regression 90 / 323
91 Retail Sales Quadratic Trend Residual Plot 91 / 323
92 Retail Sales Log Linear Trend Regression 92 / 323
93 Retail Sales Log Linear Trend Residual Plot 93 / 323
94 Retail Sales Exponential Trend Regression 94 / 323
95 Retail Sales Exponential Trend Residual Plot 95 / 323
96 Model Selection Criteria Linear, Quadratic and Exponential Trend Models Linear Trend Quadratic Trend Exponential Trend AIC SIC / 323
97 Retail Sales History January, 1990 December, / 323
98 Retail Sales History January, 1990 December, / 323
99 Retail Sales History January, 1990 December, / 323
100 Retail Sales History January, 1990 December, / 323
101 Modeling and Forecasting Seasonality 1. The Nature and Sources of Seasonality 2. Modeling Seasonality D_1 = (1, 0, 0, 0, 1, 0, 0, 0, 1, 0, 0, 0, ...), D_2 = (0, 1, 0, 0, 0, 1, 0, 0, 0, 1, 0, 0, ...), D_3 = (0, 0, 1, 0, 0, 0, 1, 0, 0, 0, 1, 0, ...), D_4 = (0, 0, 0, 1, 0, 0, 0, 1, 0, 0, 0, 1, ...); y_t = Σ_{i=1}^s γ_i D_it + ε_t; y_t = β_1 TIME_t + Σ_{i=1}^s γ_i D_it + ε_t; y_t = β_1 TIME_t + Σ_{i=1}^s γ_i D_it + Σ_{i=1}^{v1} δ_i^HD HDV_it + Σ_{i=1}^{v2} δ_i^TD TDV_it + ε_t 101 / 323
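A sketch of constructing the seasonal dummies and estimating the trend-plus-seasonals regression for a quarterly series (the data here are simulated placeholders; with a full set of dummies no separate intercept is included):

```python
import numpy as np

rng = np.random.default_rng(2)
T, s = 120, 4                                   # 30 years of quarterly data
time = np.arange(1, T + 1)
season = np.arange(T) % s                       # 0,1,2,3,0,1,2,3,...
gamma_true = np.array([5.0, 7.0, 6.0, 9.0])
y = 0.2 * time + gamma_true[season] + rng.normal(size=T)

# Full set of seasonal dummies D_1,...,D_s
D = np.zeros((T, s))
D[np.arange(T), season] = 1.0

X = np.column_stack([time, D])                  # y_t = b1*TIME_t + sum_i g_i D_it + e_t
coef, *_ = np.linalg.lstsq(X, y, rcond=None)
print("trend slope:", coef[0], "seasonal factors:", coef[1:])
```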
102 Modeling and Forecasting Seasonality 3. Forecasting Seasonal Series y_t = β_1 TIME_t + Σ_{i=1}^s γ_i D_it + Σ_{i=1}^{v1} δ_i^HD HDV_it + Σ_{i=1}^{v2} δ_i^TD TDV_it + ε_t; y_{T+h} = β_1 TIME_{T+h} + Σ_{i=1}^s γ_i D_{i,T+h} + Σ_{i=1}^{v1} δ_i^HD HDV_{i,T+h} + Σ_{i=1}^{v2} δ_i^TD TDV_{i,T+h} + ε_{T+h}; y_{T+h,T} = β_1 TIME_{T+h} + Σ_{i=1}^s γ_i D_{i,T+h} + Σ_{i=1}^{v1} δ_i^HD HDV_{i,T+h} + Σ_{i=1}^{v2} δ_i^TD TDV_{i,T+h}; ŷ_{T+h,T} = β̂_1 TIME_{T+h} + Σ_{i=1}^s γ̂_i D_{i,T+h} + Σ_{i=1}^{v1} δ̂_i^HD HDV_{i,T+h} + Σ_{i=1}^{v2} δ̂_i^TD TDV_{i,T+h} 102 / 323
103 Modeling and Forecasting Seasonality 3. Forecasting Seasonal Series cont. y T+h,T ± 1.96σ ŷ T+h,T ± 1.96ˆσ N(y T+h,T, σ 2 ) N(ŷ T+h,T, ˆσ 2 ) 103 / 323
104 Gasoline Sales 104 / 323
105 Liquor Sales 105 / 323
106 Durable Goods Sales 106 / 323
107 Housing Starts, January, 1946 November, / 323
108 Housing Starts, January, 1990 November, / 323
109 Housing Starts Regression Results - Seasonal Dummy Variable Model 109 / 323
110 Residual Plot 110 / 323
111 Housing Starts Estimated Seasonal Factors 111 / 323
112 Housing Starts 112 / 323
113 Housing Starts 113 / 323
114 Characterizing Cycles 1. Covariance Stationary Time Series Realization, Sample Path. Covariance stationarity: E y_t = µ_t becomes E y_t = µ for all t; γ(t, τ) = cov(y_t, y_{t−τ}) = E(y_t − µ)(y_{t−τ} − µ) becomes γ(t, τ) = γ(τ); ρ(τ) = cov(y_t, y_{t−τ}) / sqrt(var(y_t) var(y_{t−τ})) = γ(τ)/γ(0) 114 / 323
115 1. Characterizing Cycles cont. corr(x, y) = cov(x, y) / (σ_x σ_y); ρ(τ) = γ(τ)/γ(0), τ = 0, 1, 2, ...; ρ(τ) = cov(y_t, y_{t−τ}) / sqrt(var(y_t) var(y_{t−τ})) = γ(τ)/γ(0); p(τ) is the coefficient on y_{t−τ} in the population regression of y_t on y_{t−1}, ..., y_{t−τ} 115 / 323
116 2.White Noise y t WN(0, σ 2 ) y t iid (0, σ 2 ) y t iid N(0, σ 2 ) 116 / 323
117 2. White Noise cont. E(y_t) = 0; var(y_t) = σ²; E(y_t | Ω_{t−1}) = 0; var(y_t | Ω_{t−1}) = E[(y_t − E(y_t | Ω_{t−1}))² | Ω_{t−1}] = σ² 117 / 323
118 3. The Lag Operator L y_t = y_{t−1}; L² y_t = L(L(y_t)) = L(y_{t−1}) = y_{t−2}; L^m y_t = y_{t−m}; B(L) = b_0 + b_1 L + b_2 L² + ... + b_m L^m; Δy_t = (1 − L)y_t = y_t − y_{t−1}; (1 + .9L + .6L²)y_t = y_t + .9y_{t−1} + .6y_{t−2}; B(L) = b_0 + b_1 L + b_2 L² + ... = Σ_{i=0}^∞ b_i L^i; B(L)ε_t = b_0 ε_t + b_1 ε_{t−1} + b_2 ε_{t−2} + ... = Σ_{i=0}^∞ b_i ε_{t−i} 118 / 323
119 4.Wold s Theorem, the General Linear Process, and Rational Distributed Lags Wold s Theorem Let {y t } be any zero-mean covariance-stationary process. Then: y t = B(L)ε t = b i ε t i i=0 ε t WN(0, σ 2 ) where b 0 = 1 and i=0 b 2 i < 119 / 323
120 The General Linear Process y_t = B(L)ε_t = Σ_{i=0}^∞ b_i ε_{t−i}, ε_t ~ WN(0, σ²), where b_0 = 1 and Σ_{i=0}^∞ b_i² < ∞ 120 / 323
121 The General Linear Process cont. E(y_t) = E(Σ_{i=0}^∞ b_i ε_{t−i}) = Σ_{i=0}^∞ b_i E(ε_{t−i}) = Σ_{i=0}^∞ b_i · 0 = 0; var(y_t) = var(Σ_{i=0}^∞ b_i ε_{t−i}) = Σ_{i=0}^∞ b_i² var(ε_{t−i}) = σ² Σ_{i=0}^∞ b_i²; E(y_t | Ω_{t−1}) = E(ε_t | Ω_{t−1}) + b_1 E(ε_{t−1} | Ω_{t−1}) + b_2 E(ε_{t−2} | Ω_{t−1}) + ... = Σ_{i=1}^∞ b_i ε_{t−i}; var(y_t | Ω_{t−1}) = E[(y_t − E(y_t | Ω_{t−1}))² | Ω_{t−1}] = E(ε_t² | Ω_{t−1}) = E(ε_t²) 121 / 323
122 Rational Distributed Lags B(L) = Θ(L)/Φ(L), where Θ(L) = Σ_{i=0}^q θ_i L^i and Φ(L) = Σ_{i=0}^p φ_i L^i; a rational lag of low order can approximate a general B(L): B(L) ≈ Θ(L)/Φ(L) 122 / 323
123 5. Estimation and Inference for the Mean, Autocorrelation and Partial Autocorrelation Functions ȳ = (1/T) Σ_{t=1}^T y_t; ρ(τ) = E[(y_t − µ)(y_{t−τ} − µ)] / E[(y_t − µ)²]; ρ̂(τ) = Σ_{t=τ+1}^T [(y_t − ȳ)(y_{t−τ} − ȳ)] / Σ_{t=1}^T (y_t − ȳ)²; p̂(τ) ≡ β̂_τ from the regression ŷ_t = ĉ + β̂_1 y_{t−1} + ... + β̂_τ y_{t−τ}; ρ̂(τ), p̂(τ) ~ N(0, 1/T) 123 / 323
124 5. Estimation and Inference for the Mean, Autocorrelation and Partial Autocorrelation Functions cont. ρ̂(τ) ~ N(0, 1/T); √T ρ̂(τ) ~ N(0, 1); T ρ̂²(τ) ~ χ²_1; Q_BP = T Σ_{τ=1}^m ρ̂²(τ); Q_LB = T(T + 2) Σ_{τ=1}^m (1/(T − τ)) ρ̂²(τ) 124 / 323
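A sketch of the sample autocorrelations and the Box-Pierce and Ljung-Box statistics exactly as defined above, in pure NumPy (for applied work, statsmodels' acf and acorr_ljungbox compute the same quantities):

```python
import numpy as np

def sample_acf(y, max_lag):
    """rho_hat(tau) for tau = 1..max_lag, using the estimator on the slide."""
    y = np.asarray(y, dtype=float)
    ybar = y.mean()
    denom = np.sum((y - ybar) ** 2)
    return np.array([np.sum((y[tau:] - ybar) * (y[:-tau] - ybar)) / denom
                     for tau in range(1, max_lag + 1)])

def q_stats(y, m):
    """Box-Pierce and Ljung-Box statistics using m autocorrelations."""
    T = len(y)
    rho = sample_acf(y, m)
    q_bp = T * np.sum(rho ** 2)
    q_lb = T * (T + 2) * np.sum(rho ** 2 / (T - np.arange(1, m + 1)))
    return q_bp, q_lb   # compare with chi-square(m) critical values

rng = np.random.default_rng(3)
print(q_stats(rng.normal(size=200), m=12))   # white noise: both should be "small"
```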
125 A Rigid Cycle Pattern 125 / 323
126 Autocorrelation Function, One-Sided Gradual Damping 126 / 323
127 Autocorrelation Function, Non-Damping 127 / 323
128 Autocorrelation Function, Gradual Damped Oscillation 128 / 323
129 Autocorrelation Function, Sharp Cutoff 129 / 323
130 Realization of White Noise Process 130 / 323
131 Population Autocorrelation Function of White Noise Process 131 / 323
132 Population Partial Autocorrelation Function of White Noise Process 132 / 323
133 Canadian Employment Index 133 / 323
134 Canadian Employment Index Correlogram Sample: 1962:1 1993:4 Included observations: 128 Acorr. P. Acorr. Std. Error Ljung-Box p-v v / 323
135 Canadian Employment Index, Sample Autocorrelation and Partial Autocorrelation Functions 135 / 323
136 Modeling Cycles: MA, AR and ARMA Models The MA(1) Process y_t = ε_t + θε_{t−1} = (1 + θL)ε_t, ε_t ~ WN(0, σ²). If invertible: y_t = ε_t + θy_{t−1} − θ²y_{t−2} + θ³y_{t−3} − ... 136 / 323
137 Modeling Cycles: MA, AR and ARMA Models cont. E y_t = E(ε_t) + θE(ε_{t−1}) = 0; var(y_t) = var(ε_t) + θ² var(ε_{t−1}) = σ² + θ²σ² = σ²(1 + θ²); E(y_t | Ω_{t−1}) = E((ε_t + θε_{t−1}) | Ω_{t−1}) = E(ε_t | Ω_{t−1}) + θE(ε_{t−1} | Ω_{t−1}) = θε_{t−1}; var(y_t | Ω_{t−1}) = E[(y_t − E(y_t | Ω_{t−1}))² | Ω_{t−1}] = E(ε_t² | Ω_{t−1}) = E(ε_t²) = σ² 137 / 323
138 The MA(q) Process y_t = ε_t + θ_1 ε_{t−1} + ... + θ_q ε_{t−q} = Θ(L)ε_t, where ε_t ~ WN(0, σ²) and Θ(L) = 1 + θ_1 L + ... + θ_q L^q 138 / 323
139 The AR(1) Process y_t = φy_{t−1} + ε_t, ε_t ~ WN(0, σ²). If covariance stationary: y_t = ε_t + φε_{t−1} + φ²ε_{t−2} + ... 139 / 323
140 Moment Structure E(y_t) = E(ε_t + φε_{t−1} + φ²ε_{t−2} + ...) = E(ε_t) + φE(ε_{t−1}) + φ²E(ε_{t−2}) + ... = 0; var(y_t) = var(ε_t + φε_{t−1} + φ²ε_{t−2} + ...) = σ² + φ²σ² + φ⁴σ² + ... = σ² Σ_{i=0}^∞ φ^{2i} = σ² / (1 − φ²) 140 / 323
141 Moment Structure cont. E(y_t | y_{t−1}) = E((φy_{t−1} + ε_t) | y_{t−1}) = φE(y_{t−1} | y_{t−1}) + E(ε_t | y_{t−1}) = φy_{t−1} + 0 = φy_{t−1}; var(y_t | y_{t−1}) = var((φy_{t−1} + ε_t) | y_{t−1}) = φ² var(y_{t−1} | y_{t−1}) + var(ε_t | y_{t−1}) = 0 + σ² = σ² 141 / 323
142 Moment Structure cont. Autocovariances and autocorrelations: y_t = φy_{t−1} + ε_t, so y_t y_{t−τ} = φy_{t−1} y_{t−τ} + ε_t y_{t−τ}. For τ ≥ 1, γ(τ) = φγ(τ − 1) (Yule-Walker equation). But γ(0) = σ² / (1 − φ²). 142 / 323
143 Moment Structure cont. Thus γ(τ) = φ^τ σ² / (1 − φ²), τ = 0, 1, 2, ..., and ρ(τ) = φ^τ, τ = 0, 1, 2, .... Partial autocorrelations: p(1) = φ; p(τ) = 0 for τ > 1 143 / 323
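A quick simulation check of these moment results: simulate an AR(1) and compare the sample variance and autocorrelations with σ²/(1 − φ²) and φ^τ (parameter values are illustrative):

```python
import numpy as np

rng = np.random.default_rng(4)
phi, sigma, T = 0.8, 1.0, 5000
y = np.zeros(T)
for t in range(1, T):                       # y_t = phi*y_{t-1} + eps_t
    y[t] = phi * y[t - 1] + rng.normal(scale=sigma)

print("var(y):", y.var(), "theory:", sigma**2 / (1 - phi**2))
for tau in (1, 2, 3):
    rho_hat = np.corrcoef(y[tau:], y[:-tau])[0, 1]
    print(f"rho({tau}): sample {rho_hat:.3f}  theory {phi**tau:.3f}")
```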
144 The AR(p) Process y t = φ 1 y t 1 + φ 2 y t φ p y t p + ε t ε t WN(0, σ 2 ) 144 / 323
145 The ARMA(1,1) Process y_t = φy_{t−1} + ε_t + θε_{t−1}, ε_t ~ WN(0, σ²). MA representation if invertible: y_t = [(1 + θL)/(1 − φL)] ε_t. AR representation if covariance stationary: [(1 − φL)/(1 + θL)] y_t = ε_t 145 / 323
146 The ARMA(p,q) Process y t = φ 1 y t φ p y t p + ε t + θ 1 ε t θ q ε t q ε t WN(0, σ 2 ) Φ(L)y t = Θ(L)ε t 146 / 323
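For estimation, the slides use econometric-software output; an equivalent sketch in Python uses statsmodels' ARIMA class, assuming it is installed (order=(p, 0, q) gives an ARMA(p, q) with constant; the data are simulated in place of a real series such as employment):

```python
import numpy as np
from statsmodels.tsa.arima.model import ARIMA

# Simulate an ARMA(1,1) series
rng = np.random.default_rng(5)
eps = rng.normal(size=500)
y = np.zeros(500)
for t in range(1, 500):
    y[t] = 0.7 * y[t - 1] + eps[t] + 0.4 * eps[t - 1]

res = ARIMA(y, order=(1, 0, 1)).fit()   # ARMA(1,1) with constant
print(res.params)                        # const, ar.L1, ma.L1, sigma2
print(res.aic, res.bic)                  # for comparing candidate (p, q) choices
```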
147 Realization of Two MA(1) Processes 147 / 323
148 Population Autocorrelation Function MA(1) Process θ = / 323
149 Population Autocorrelation Function MA(1) Process θ = / 323
150 Population Partial Autocorrelation Function MA(1) Process θ = / 323
151 Population Partial Autocorrelation Function MA(1) Process θ = / 323
152 Realization of Two AR(1) Processes 152 / 323
153 Population Autocorrelation Function AR(1) Process φ = / 323
154 Population Autocorrelation Function AR(1) Process φ = / 323
155 Population Partial Autocorrelation Function AR(1) Process φ = / 323
156 Population Partial Autocorrelation Function AR(1) Process φ = / 323
157 Population Autocorrelation Function AR(2) Process with Complex Roots 157 / 323
158 Employment: MA(4) Model Sample: 1962:1 1993:4 Included observations: / 323
159 Employment MA(4) Residual Plot 159 / 323
160 Employment: MA(4) Model Residual Correlogram Sample: 1962:1 1993:4 Included observations: 128 Q statistic probabilities adjusted for 4 ARMA term(s) Acorr. P. Acorr. Std. Error Ljung-Box p-v / 323
161 Employment: MA(4) Model Residual Sample Autocorrelation and Partial Autocorrelation Functions, With Plus or Minus Two Standard Error Bands 161 / 323
162 Employment: AR(2) Model LS // Dependent Variable is CANEMP Sample: 1962:1 1993:4 Included observations: 128 Convergence achieved after 3 iterations Variable Coefficient Std. Error t Statistic Prob. 162 / 323
163 Employment AR(2) Model Residual Plot 163 / 323
164 Employment AIC Values of Various ARMA Models MA Order AR Order / 323
165 Employment SIC Values of Various ARMA Models MA Order AR Order / 323
166 Employment: ARMA(3,1) Model LS // Dependent Variable is CANEMP Sample: 1962:1 1993:4 Included observations: 128 Convergence achieved after 17 iterations Variable Coefficient Std. Error t Statistic Prob. 166 / 323
167 Employment ARMA(3) Model Residual Plot 167 / 323
168 Employment: ARMA(3,1) Model Residual Correlogram Sample: 1962:1 1993:4 Included observations: 128 Q statistic probabilities adjusted for 4 ARMA term(s) Acorr. P. Acorr. Std. Error Ljung-Box p-v / 323
169 Employment: ARMA(3) Model Residual Sample Autocorrelation and Partial Autocorrelation Functions, With Plus or Minus Two Standard Error Bands 169 / 323
170 Forecasting Cycles Ω_T = {y_T, y_{T−1}, y_{T−2}, ...} or Ω_T = {ε_T, ε_{T−1}, ε_{T−2}, ...}. Optimal Point Forecasts for Infinite-Order Moving Averages y_t = Σ_{i=0}^∞ b_i ε_{t−i}, where ε_t ~ WN(0, σ²), 170 / 323
171 Forecasting Cycles cont. b_0 = 1 and Σ_{i=0}^∞ b_i² < ∞. y_{T+h} = ε_{T+h} + b_1 ε_{T+h−1} + ... + b_h ε_T + b_{h+1} ε_{T−1} + ...; y_{T+h,T} = b_h ε_T + b_{h+1} ε_{T−1} + ...; e_{T+h,T} = y_{T+h} − y_{T+h,T} = Σ_{i=0}^{h−1} b_i ε_{T+h−i}; σ_h² = σ² Σ_{i=0}^{h−1} b_i² 171 / 323
172 Interval and Density Forecasts y T+h = y T+h,T + e T+h,T. 95% h-step-ahead interval forecast: y T+h,T ± 1.96σ h h-step-ahead density forecast: N(y T+h,T, σ 2 h ) Making the Forecasts Operational The Chain Rule of Forecasting 172 / 323
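The chain rule of forecasting can be illustrated for an AR(p): h-step point forecasts are built recursively, replacing unknown future values with their own forecasts. A sketch assuming known AR coefficients (in practice the estimated ones are plugged in):

```python
import numpy as np

def ar_chain_forecast(history, phi, h):
    """Chain-rule point forecasts for y_t = phi_1*y_{t-1} + ... + phi_p*y_{t-p} + eps_t."""
    phi = np.asarray(phi, dtype=float)
    p = len(phi)
    path = list(history[-p:])                 # last p observed values
    forecasts = []
    for _ in range(h):
        f = np.dot(phi, path[::-1][:p])       # phi_1*y_T + ... + phi_p*y_{T-p+1}
        forecasts.append(f)
        path.append(f)                        # treat the forecast as data for the next step
    return np.array(forecasts)

print(ar_chain_forecast(history=[1.0, 1.2, 0.9], phi=[0.5, 0.3], h=5))
```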
173 Employment History and Forecast MA(4) Model 173 / 323
174 Employment History and Long-Horizon Forecast MA(4) Model 174 / 323
175 Employment History, Forecast and Realization MA(4) Model 175 / 323
176 Employment History and Forecast AR(2) Model 176 / 323
177 Employment History and Long-Horizon Forecast AR(2) Model 177 / 323
178 Employment History and Very Long-Horizon Forecast AR(2) Model 178 / 323
179 Employment History, Forecast and Realization AR(2) Model 179 / 323
180 Putting it all Together A Forecast Model with Trend, Seasonal and Cyclical Components The full model: y_t = T_t(θ) + Σ_{i=1}^s γ_i D_it + Σ_{i=1}^{v1} δ_i^HD HDV_it + Σ_{i=1}^{v2} δ_i^TD TDV_it + ε_t; Φ(L)ε_t = Θ(L)v_t; Φ(L) = 1 − φ_1 L − ... − φ_p L^p; Θ(L) = 1 + θ_1 L + ... + θ_q L^q; v_t ~ WN(0, σ²). 180 / 323
181 Point Forecasting y_{T+h} = T_{T+h}(θ) + Σ_{i=1}^s γ_i D_{i,T+h} + Σ_{i=1}^{v1} δ_i^HD HDV_{i,T+h} + Σ_{i=1}^{v2} δ_i^TD TDV_{i,T+h} + ε_{T+h}; y_{T+h,T} = T_{T+h}(θ) + Σ_{i=1}^s γ_i D_{i,T+h} + Σ_{i=1}^{v1} δ_i^HD HDV_{i,T+h} + Σ_{i=1}^{v2} δ_i^TD TDV_{i,T+h}; ŷ_{T+h,T} = T_{T+h}(θ̂) + Σ_{i=1}^s γ̂_i D_{i,T+h} + Σ_{i=1}^{v1} δ̂_i^HD HDV_{i,T+h} + Σ_{i=1}^{v2} δ̂_i^TD TDV_{i,T+h} 181 / 323
182 Interval Forecasting and Density Forecasting Interval Forecasting: ŷ T+h,T ± z α/2ˆσ h e.g.: (95% interval) ŷ T+h,T ± 1.96ˆσ h Density Forecasting: N(ŷ T+h,T, ˆσ 2 h ) 182 / 323
183 Recursive Estimation y t = K β k x kt + ε t k=1 ε t iidn(0, σ 2 ), t = 1,..., T. OLS estimation uses the full sample, t = 1,..., T. Recursive least squares uses an expanding sample. Begin with the first K observations and estimate the model. Then estimate using the first K + 1 observations, and so on. At the end we have a set of recursive parameter estimates: ˆβ k,t, for k = 1,..., K and t = K,..., T. 183 / 323
184 Recursive Residuals At each t, t = K,..., T 1, compute a 1-step forecast, K ŷ t+1,t = ˆβ kt x k,t+1. k=1 The corresponding forecast errors, or recursive residuals, are ê t+1,t = y t+1 ŷ t+1,t. ê t+1,t N(0, σ 2 r t ) where r t > 1 for all t 184 / 323
185 Standardized Recursive Residuals and CUSUM w_{t+1,t} = ê_{t+1,t} / (σ√r_t), t = K, ..., T − 1. Under the maintained assumptions, w_{t+1,t} ~ iid N(0, 1). Then CUSUM_t = Σ_{τ=K}^{t} w_{τ+1,τ}, t = K, ..., T − 1, is just a sum of iid N(0,1)'s. 185 / 323
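A sketch of recursive estimation, standardized recursive residuals, and the CUSUM following the definitions above (σ is estimated from the recursive residuals here, so this is an illustrative approximation rather than the exact finite-sample CUSUM procedure):

```python
import numpy as np

def recursive_cusum(y, X):
    """Recursive residuals and CUSUM for y = X beta + eps. X includes the constant; K = X.shape[1]."""
    y, X = np.asarray(y, float), np.asarray(X, float)
    T, K = X.shape
    w = []
    for t in range(K, T):                                   # estimate on obs 1..t, predict obs t+1
        Xt, yt = X[:t], y[:t]
        XtX_inv = np.linalg.inv(Xt.T @ Xt)
        beta_t = XtX_inv @ Xt.T @ yt
        x_next = X[t]
        e_next = y[t] - x_next @ beta_t                     # recursive residual e_{t+1,t}
        r_t = 1.0 + x_next @ XtX_inv @ x_next               # forecast-variance factor (> 1)
        w.append(e_next / np.sqrt(r_t))
    w = np.array(w)
    sigma_hat = w.std(ddof=1)                               # estimate of sigma
    cusum = np.cumsum(w / sigma_hat)                        # wanders like a random walk only if parameters break
    return w, cusum

rng = np.random.default_rng(6)
x = rng.normal(size=200)
y = 1 + 0.5 * x + rng.normal(size=200)                      # stable-parameter example
_, cusum = recursive_cusum(y, np.column_stack([np.ones(200), x]))
print(cusum[-1])
```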
186 Liquor Sales, / 323
187 Log Liquor Sales, / 323
188 Log Liquor Sales: Quadratic Trend Regression 188 / 323
189 Liquor Sales Quadratic Trend Regression Residual Plot 189 / 323
190 Liquor Sales Quadratic Trend Regression Residual Correlogram Acorr. P. Acorr. Std. Error Ljung-Box / 323
191 Liquor Sales Quadratic Trend Regression Residual Sample Autocorrelation Functions 191 / 323
192 Liquor Sales Quadratic Trend Regression Residual Partial Autocorrelation Functions 192 / 323
193 Log Liquor Sales: Quadratic Trend Regression With Seasonal Dummies and AR(3) Disturbances 193 / 323
194 Liquor Sales Quadratic Trend Regression with Seasonal Dummies Residual Plot 194 / 323
195 Liquor Sales Quadratic Trend Regression with Seasonal Dummies Residual Correlogram Acorr. P. Acorr.Std.Error Ljung-Box p-value / 323
196 Liquor Sales Quadratic Trend Regression with Seasonal Dummies Residual Sample Autocorrelation Functions 196 / 323
197 Liquor Sales Quadratic Trend Regression with Seasonal Dummies Residual Sample Partial Autocorrelation Functions 197 / 323
198 Liquor Sales Quadratic Trend Regression with Seasonal Dummies and AR(3) Disturbances Residual Plot 198 / 323
199 Liquor Sales Quadratic Trend Regression with Seasonal Dummies and AR(3) Disturbances Residual Correlogram Acorr. P. Acorr. Std. Error Ljung-Box p-v / 323
200 Liquor Sales Quadratic Trend Regression with Seasonal Dummies and AR(3) Disturbances Residual Sample Autocorrelation Functions 200 / 323
201 Liquor Sales Quadratic Trend Regression with Seasonal Dummies and AR(3) Disturbances Residual Sample Partial Autocorrelation Functions 201 / 323
202 Liquor Sales Quadratic Trend Regression with Seasonal Dummies and AR(3) Disturbances Residual Histogram and Normality Test 202 / 323
203 Log Liquor Sales History and 12-Month-Ahead Forecast 203 / 323
204 Log Liquor Sales History, 12-Month-Ahead Forecast, and Realization 204 / 323
205 Log Liquor Sales History and 60-Month-Ahead Forecast 205 / 323
206 Log Liquor Sales Long History and 60-Month-Ahead Forecast 206 / 323
207 Liquor Sales Long History and 60-Month-Ahead Forecast 207 / 323
208 Recursive Analysis Constant Parameter Model 208 / 323
209 Recursive Analysis Breaking Parameter Model 209 / 323
210 Log Liquor Sales: Quadratic Trend Regression with Seasonal Dummies and AR(3) Residuals and Two Standard Errors Bands 210 / 323
211 Log Liquor Sales: Quadratic Trend Regression with Seasonal Dummies and AR(3) Disturbances Recursive Parameter Estimates 211 / 323
212 Log Liquor Sales: Quadratic Trend Regression with Seasonal Dummies and AR(3) Disturbances CUSUM Analysis 212 / 323
213 Forecasting with Regression Models Conditional Forecasting Models and Scenario Analysis y_t = β_0 + β_1 x_t + ε_t, ε_t ~ N(0, σ²). Conditional point forecast: y_{T+h,T} | x*_{T+h} = β_0 + β_1 x*_{T+h}. Density forecast: N(y_{T+h,T} | x*_{T+h}, σ²). Scenario analysis, contingency analysis. No "forecasting the RHS variables" problem 213 / 323
214 Unconditional Forecasting Models y T+h,T = β 0 + β 1 x T+h,T Forecasting the RHS variables problem Could fit a model to x (e.g., an autoregressive model) Preferably, regress y on x t h, x t h 1,... No problem in trend and seasonal models 214 / 323
215 Distributed Lags Start with the unconditional forecast model: y_t = β_0 + δx_{t−1} + ε_t. Generalize to the distributed lag model: y_t = β_0 + Σ_{i=1}^{N_x} δ_i x_{t−i} + ε_t (the δ_i are the lag weights; together they form the lag distribution) 215 / 323
216 Polynomial Distributed Lags min_{β_0, δ_i} Σ_{t=N_x+1}^{T} [ y_t − β_0 − Σ_{i=1}^{N_x} δ_i x_{t−i} ]², subject to δ_i = P(i) = a + bi + ci², i = 1, ..., N_x. Lag weights constrained to lie on a low-order polynomial: smooth lag distribution, parsimonious. Additional constraints can be imposed, such as P(N_x) = 0 216 / 323
217 Rational Distributed Lags y_t = [A(L)/B(L)] x_t + ε_t. Equivalently, B(L)y_t = A(L)x_t + B(L)ε_t. Lags of x and y included. Important to allow for lags of y, one way or another 217 / 323
218 Another way: distributed lag regression with lagged dependent variables y_t = β_0 + Σ_{i=1}^{N_y} α_i y_{t−i} + Σ_{j=1}^{N_x} δ_j x_{t−j} + ε_t. Another way: distributed lag regression with ARMA disturbances y_t = β_0 + Σ_{i=1}^{N_x} δ_i x_{t−i} + ε_t, ε_t = [Θ(L)/Φ(L)] v_t, v_t ~ WN(0, σ²) 218 / 323
219 Another Way: The Transfer Function Model and Various Special Cases Transfer function model: y_t = [A(L)/B(L)] x_t + [C(L)/D(L)] ε_t. Special cases: univariate ARMA, y_t = [C(L)/D(L)] ε_t; distributed lag with lagged dependent variables, B(L)y_t = A(L)x_t + ε_t, i.e. y_t = [A(L)/B(L)] x_t + [1/B(L)] ε_t; distributed lag with ARMA disturbances 219 / 323
220 Vector Autoregressions e.g., bivariate VAR(1) Estimation by OLS y 1,t = φ 11 y 1,t 1 + φ 12 y 2,t 1 + ε 1,t y 2,t = φ 21 y 1,t 1 + φ 22 y 2,t 1 + ε 2,t ε 1,t WN(0, σ 2 1) ε 2,t WN(0, σ 2 2) cov(ε 1,t, ε 2,t ) = σ 12 Order selection by information criteria Impulse-response functions, variance decompositions, predictive causality Forecasts via Wold s chain rule 220 / 323
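A sketch of estimating a bivariate VAR(1) equation by equation with OLS and forming the 1-step forecast via Wold's chain rule, using simulated data in place of starts and completions (statsmodels' VAR class does the same with built-in order selection and impulse responses):

```python
import numpy as np

# Simulate a bivariate VAR(1): y_t = Phi y_{t-1} + eps_t
rng = np.random.default_rng(7)
Phi = np.array([[0.5, 0.2],
                [0.3, 0.4]])
T = 500
Y = np.zeros((T, 2))
for t in range(1, T):
    Y[t] = Phi @ Y[t - 1] + rng.normal(scale=0.5, size=2)

# OLS equation by equation: regress each variable on a constant and both lagged variables
X = np.column_stack([np.ones(T - 1), Y[:-1]])
Phi_hat = np.vstack([np.linalg.lstsq(X, Y[1:, i], rcond=None)[0] for i in range(2)])
print(Phi_hat)   # each row: (const, phi_i1, phi_i2)

# 1-step-ahead point forecast
y_T = Y[-1]
print("forecast:", Phi_hat[:, 0] + Phi_hat[:, 1:] @ y_T)
```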
221 Point and Interval Forecast 221 / 323
222 U.S. Housing Starts and Completions, / 323
223 Starts Correlogram Sample: 1968: :12 Included observations: 288 Acorr. P. Acorr. Std. Error Ljung-Box p-v / 323
224 Starts Sample Autocorrelations 224 / 323
225 Starts Sample Partial Autocorrelations 225 / 323
226 Completions Correlogram Completions Correlogram Sample: 1968: :12 Included observations: 288 Acorr. P. Acorr. Std. Error Ljung-Box p-v / 323
227 Completions Sample Autocorrelations 227 / 323
228 Completions Partial Autocorrelations 228 / 323
229 Starts and Completions: Sample Cross Correlations 229 / 323
230 VAR Order Selection with AIC and SIC 230 / 323
231 VAR Starts Equation 231 / 323
232 VAR Start Equation Residual Plot 232 / 323
233 VAR Starts Equation Residual Correlogram Sample: 1968: :12 Included observations: 284 Acorr. P. Acorr. Std. Error Ljung-Box p-v / 323
234 VAR Starts Equation Residual Sample Autocorrelations 234 / 323
235 Var Starts Equation Residual Sample Partial Autocorrelations 235 / 323
236 Evaluating and Combining Forecasts Evaluating a single forecast Process: y_t = µ + ε_t + b_1 ε_{t−1} + b_2 ε_{t−2} + ..., ε_t ~ WN(0, σ²). h-step-ahead linear least-squares forecast: y_{t+h,t} = µ + b_h ε_t + b_{h+1} ε_{t−1} + .... Corresponding h-step-ahead forecast error: e_{t+h,t} = y_{t+h} − y_{t+h,t} = ε_{t+h} + b_1 ε_{t+h−1} + ... + b_{h−1} ε_{t+1}, with variance σ_h² = σ²(1 + Σ_{i=1}^{h−1} b_i²) 236 / 323
237 Evaluating and Combining Forecasts So, four key properties of optimal forecasts: a. Optimal forecasts are unbiased b. Optimal forecasts have 1-step-ahead errors that are white noise c. Optimal forecasts have h-step-ahead errors that are at most MA(h-1) d. Optimal forecasts have h-step-ahead errors with variances that are non-decreasing in h and that converge to the unconditional variance of the process 1. All are easily checked. How? 237 / 323
238 Assessing optimality with respect to an information set Unforecastability principle: the errors from good forecasts are not forecastable! Regression: e_{t+h,t} = α_0 + Σ_{i=1}^{k−1} α_i x_{it} + u_t; test whether α_0, ..., α_{k−1} are 0. Important case: e_{t+h,t} = α_0 + α_1 y_{t+h,t} + u_t; test whether (α_0, α_1) = (0, 0). Equivalently, y_{t+h} = β_0 + β_1 y_{t+h,t} + u_t; test whether (β_0, β_1) = (0, 1) 238 / 323
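A sketch of the last evaluation regression (a Mincer-Zarnowitz-style regression of realizations on forecasts) with a joint test of (β_0, β_1) = (0, 1); forecasts and realizations below are simulated placeholders, and statsmodels is assumed available:

```python
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(8)
y_hat = rng.normal(size=200)                      # stand-in forecasts
y = y_hat + rng.normal(scale=0.5, size=200)       # realizations (these forecasts are unbiased)

res = sm.OLS(y, sm.add_constant(y_hat)).fit()
print(res.params)                                 # should be close to (0, 1)

R = np.eye(2)                                     # restrictions: intercept = 0, slope = 1
q = np.array([0.0, 1.0])
print(res.f_test((R, q)))                         # joint test of forecast optimality
```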
239 Evaluating multiple forecasts: comparing forecast accuracy Forecast errors: e_{t+h,t} = y_{t+h} − y_{t+h,t}. Forecast percent errors: p_{t+h,t} = (y_{t+h} − y_{t+h,t}) / y_{t+h}. ME = (1/T) Σ_{t=1}^T e_{t+h,t}; EV = (1/T) Σ_{t=1}^T (e_{t+h,t} − ME)²; MSE = (1/T) Σ_{t=1}^T e²_{t+h,t}; MSPE = (1/T) Σ_{t=1}^T p²_{t+h,t}; RMSE = sqrt( (1/T) Σ_{t=1}^T e²_{t+h,t} ) 239 / 323
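A sketch computing these accuracy measures for a forecast over a holdout sample (the realization and forecast arrays are placeholders):

```python
import numpy as np

def accuracy(y, y_hat):
    """ME, EV, MSE, RMSE and MSPE as defined on the slide."""
    e = y - y_hat
    p = e / y                      # percent errors (assumes y has no zeros)
    me = e.mean()
    return {"ME": me,
            "EV": np.mean((e - me) ** 2),
            "MSE": np.mean(e ** 2),
            "RMSE": np.sqrt(np.mean(e ** 2)),
            "MSPE": np.mean(p ** 2)}

rng = np.random.default_rng(9)
y = 10 + rng.normal(size=100)
print(accuracy(y, y_hat=np.full(100, 10.0)))      # a naive "always 10" forecast
```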
240 Forecast encompassing y_{t+h} = β_a y^a_{t+h,t} + β_b y^b_{t+h,t} + ε_{t+h,t} 1. If (β_a, β_b) = (1, 0), model a forecast-encompasses model b 2. If (β_a, β_b) = (0, 1), model b forecast-encompasses model a 3. Otherwise, neither model encompasses the other Alternative approach: (y_{t+h} − y_t) = β_a(y^a_{t+h,t} − y_t) + β_b(y^b_{t+h,t} − y_t) + ε_{t+h,t}. Useful in I(1) situations 240 / 323
241 Variance-covariance forecast combination Composite formed from two unbiased forecasts: y^c_{t+h} = ω y^a_{t+h,t} + (1 − ω) y^b_{t+h,t}; e^c_{t+h} = ω e^a_{t+h,t} + (1 − ω) e^b_{t+h,t}; σ²_c = ω² σ²_aa + (1 − ω)² σ²_bb + 2ω(1 − ω) σ²_ab; ω* = (σ²_bb − σ²_ab) / (σ²_bb + σ²_aa − 2σ²_ab); ω̂* = (σ̂²_bb − σ̂²_ab) / (σ̂²_bb + σ̂²_aa − 2σ̂²_ab) 241 / 323
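A sketch of the estimated variance-covariance combining weight, computed from two forecast-error series (the error arrays are placeholders):

```python
import numpy as np

def combining_weight(e_a, e_b):
    """Estimated optimal weight on forecast a in y_c = w*y_a + (1-w)*y_b."""
    var_a, var_b = np.var(e_a, ddof=1), np.var(e_b, ddof=1)
    cov_ab = np.cov(e_a, e_b, ddof=1)[0, 1]
    return (var_b - cov_ab) / (var_a + var_b - 2 * cov_ab)

rng = np.random.default_rng(10)
e_a = rng.normal(scale=1.0, size=200)              # errors from forecast a
e_b = 0.3 * e_a + rng.normal(scale=2.0, size=200)  # errors from forecast b (worse, correlated)
print("weight on forecast a:", combining_weight(e_a, e_b))   # loads heavily on the better forecast
```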
242 Regression-based forecast combination y t+h = β 0 + β 1 y a t+h,t + β 2y b t+h,t + ε t+h,t 1. Equivalent to variance-covariance combination if weights sum to unity and intercept is excluded 2. Easy extension to include more than two forecasts 3. Time-varying combining weights 4. Dynamic combining regressions 5. Shrinkage of combining weights toward equality 6. Nonlinear combining regressions 242 / 323
243 Unit Roots, Stochastic Trends, ARIMA Forecasting Models, and Smoothing 1. Stochastic Trends and Forecasting Φ(L)y_t = Θ(L)ε_t. If Φ(L) = Φ'(L)(1 − L), then Φ'(L)(1 − L)y_t = Θ(L)ε_t, i.e. Φ'(L) Δy_t = Θ(L)ε_t. I(0) vs I(1) processes 243 / 323
244 Unit Roots, Stochastic Trends, ARIMA Forecasting Models, and Smoothing Cont. Random Walk y t = y t 1 + ε t ε t WN(0, σ 2 ), Random walk with drift y t = δ + y t 1 + ε t ε t WN(0, σ 2 ), Stochastic trend vs deterministic trend 244 / 323
245 Properties of random walks y_t = y_{t−1} + ε_t, ε_t ~ WN(0, σ²). With time 0 value y_0: y_t = y_0 + Σ_{i=1}^t ε_i; E(y_t) = y_0; var(y_t) = tσ²; lim_{t→∞} var(y_t) = ∞ 245 / 323
246 Random walk with drift y_t = δ + y_{t−1} + ε_t, ε_t ~ WN(0, σ²). Assuming time 0 value y_0: y_t = tδ + y_0 + Σ_{i=1}^t ε_i; E(y_t) = y_0 + tδ; var(y_t) = tσ²; lim_{t→∞} var(y_t) = ∞ 246 / 323
247 ARIMA(p,1,q) model or Φ(L)(1 L)y t = c + Θ(L)ε t (1 L)y t = cφ 1 (L) + Φ 1 (L)Θ(L)ε t where Φ(L) = 1 Φ 1 L... Φ p L p Θ(L) = 1 Θ 1 L... Θ q L q and all the roots of both lag operator polynomials are outside the unit circle. 247 / 323
248 ARIMA(p,d,q) model or Φ(L)(1 L) d y t = c + Θ(L)ε t (1 L) d y t = cφ 1 (L) + Φ 1 (L)Θ(L)ε t where Φ(L) = 1 Φ 1 L... Φ p L p Θ(L) = 1 Θ 1 L... Θ q L q and all the roots of both lag operator polynomials are outside the unit circle. 248 / 323
249 Properties of ARIMA(p,1,q) processes Appropriately made stationary by differencing Shocks have permanent effects Forecasts don t revert to a mean Variance grows without bound as time progresses Interval forecasts widen without bound as horizon grows 249 / 323
250 Random walk example Point forecast. Recall that for the AR(1) process y_t = φy_{t−1} + ε_t, ε_t ~ WN(0, σ²), the optimal forecast is y_{T+h,T} = φ^h y_T. Thus in the random walk case, y_{T+h,T} = y_T, for all h 250 / 323
251 Random walk example cont. Interval and density forecasts. Recall the error associated with the optimal AR(1) forecast: e_{T+h,T} = y_{T+h} − y_{T+h,T} = ε_{T+h} + φε_{T+h−1} + ... + φ^{h−1}ε_{T+1}, with variance σ_h² = σ² Σ_{i=0}^{h−1} φ^{2i}. Thus in the random walk case, e_{T+h,T} = Σ_{i=0}^{h−1} ε_{T+h−i} and σ_h² = hσ². h-step-ahead 95% interval: y_T ± 1.96σ_h. h-step-ahead density forecast: N(y_T, hσ²) 251 / 323
252 Effects of Unit Roots Sample autocorrelation function fails to damp Sample partial autocorrelation function near 1 for τ = 1, and then damps quickly Properties of estimators change e.g., least-squares autoregression with unit roots True process: y t = y t 1 + ε t Estimated model: y t = φy t 1 + ε t Superconsistency:T ( ˆφ LS 1) stabilizes as sample size grows Bias: E(ˆφ LS ) < 1 Offsetting effects of bias and superconsistency 252 / 323
253 Unit Root Tests y_t = φy_{t−1} + ε_t, ε_t ~ iid N(0, σ²). τ̂ = (φ̂ − 1) / [ s (Σ_{t=2}^T y²_{t−1})^{−1/2} ]; Dickey-Fuller τ̂ distribution. Trick regression: y_t − y_{t−1} = (φ − 1)y_{t−1} + ε_t 253 / 323
254 Allowing for nonzero mean under the alternative Basic model: (y t µ) = φ(y t 1 µ) + ε t which we rewrite as y t = α + φy t 1 + ε t where α = µ(1 φ) α vanishes when φ = 1 (null) α is nevertheless present under the alternative, so we include an intercept in the regression Dickey-Fuller τ ˆ µ distribution 254 / 323
255 Allowing for deterministic linear trend under the alternative Basic model: (y t a btime t ) = φ(y t 1 a b TIME t 1 ) + ε t or y t = α + βtime t + φy t 1 + ε t where α = a(1 φ) + bφ and β = b(1 φ). Under the null hypothesis we have a random walk with drift, y t = b + y t 1 + ε t Under the deterministic-trend alternative hypothesis both the intercept and the trend enter and so are included in the regression. 255 / 323
256 Allowing for higher-order autoregressive dynamics AR(p) process: y_t + Σ_{j=1}^p φ_j y_{t−j} = ε_t. Rewrite: y_t = ρ_1 y_{t−1} + Σ_{j=2}^p ρ_j (y_{t−j+1} − y_{t−j}) + ε_t, where p ≥ 2, ρ_1 = −Σ_{j=1}^p φ_j, and ρ_i = Σ_{j=i}^p φ_j, i = 2, ..., p. Unit root: ρ_1 = 1 (AR(p−1) in first differences). τ̂ distribution holds asymptotically. 256 / 323
257 Allowing for a nonzero mean in the AR(p) case or (y t µ) + p φ j (y t j µ) = ε t j=1 y t = α + ρy t 1 + p ρ j (y t j+1 y t j ) + ε t, j=2 where α = µ(1 + p j=1 φ j), and the other parameters are as above. In the unit root case, the intercept vanishes, because p j=1 φ j = 1. ˆτ µ distribution holds asymptotically. 257 / 323
258 Allowing for trend under the alternative or where and (y t a btime t ) + p φ j (y t j a btime t j ) = ε t j=1 y t = k 1 + k 2 TIME t + ρ 1 y t 1 + k 1 = a(1 + p ρ j (y t j+1 y t j ) + ε t j=2 p φ i ) b i=1 k 2 = btime t (1 + p iφ i i=1 p φ i ) i=1 In the unit root case, k 1 = b p i=1 iφ i and k 2 = 0. ˆτ τ distribution holds asymptotically. 258 / 323
259 General ARMA representations: augmented Dickey-Fuller tests k 1 y t = ρ 1 y t 1 + ρ j (y t j+1 y t j ) + ε t j=2 k 1 y t = α + ρ 1 y t 1 + ρ j (y t j+1 y t j ) + ε t j=2 k 1 y t = k 1 + k 2 TIME t + ρ 1 y t 1 + ρ j (y t j+1 y t j ) + ε t j=2 k-1 augmentation lags have been included ˆτ, ˆτ µ, and ˆτ τ hold asymptotically under the null 259 / 323
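In practice the three augmented Dickey-Fuller variants correspond (under the stated assumptions) to the regression='n' (or 'nc' in older versions), 'c', and 'ct' options of statsmodels' adfuller; a sketch on simulated data:

```python
import numpy as np
from statsmodels.tsa.stattools import adfuller

rng = np.random.default_rng(11)
y_rw = np.cumsum(rng.normal(size=500))            # random walk: should NOT reject a unit root
y_ar = np.zeros(500)
for t in range(1, 500):                           # stationary AR(1): should reject
    y_ar[t] = 0.5 * y_ar[t - 1] + rng.normal()

for name, y in [("random walk", y_rw), ("AR(1)", y_ar)]:
    stat, pval, *_ = adfuller(y, regression="c")  # tau_mu case: intercept, no trend
    print(f"{name}: ADF stat {stat:.2f}, p-value {pval:.3f}")
```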
260 Simple moving average smoothing 1. Original data: {y t } T t=1 2. Smoothed data: {ȳ t } 3. Two-sided moving average is ȳ t = (2m + 1) 1 m i= m y t i 4. One-sided moving average is ȳ t = (m + 1) 1 m i=0 y t i 5. One-sided weighted moving average is ȳ t = m i=0 w iy t i Must choose smoothing parameter, m 260 / 323
261 Exponential Smoothing Local level model: y t = c 0t + ε t c 0t = c 0,t 1 + η t η t WN(0, σ 2 η) Exponential smoothing can construct the optimal estimate of c 0 - and hence the optimal forecast of any future value of y - on the basis of current and past y What if the model is misspecified? 261 / 323
262 Exponential smoothing algorithm Observed series, {y t } T t=1 Smoothed series, {ȳ t } T t=1 (estimate of the local level) Forecasts, ŷ T +h,t 1. Initialize at t = 1: ȳ 1 = y 1 2. Update: ȳ t = αy t + (1 α)ȳ t 1, t = 2,...T 3. Forecast: ŷ T + h, T = ȳ T Smoothing parameter α [0, 1] 262 / 323
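A direct implementation of the smoothing recursion above (a sketch; α would normally be chosen by minimizing in-sample squared one-step errors or taken from convention):

```python
import numpy as np

def exponential_smoothing(y, alpha, h=1):
    """Simple exponential smoothing: returns the smoothed series and the h-step forecast."""
    y = np.asarray(y, dtype=float)
    smoothed = np.empty_like(y)
    smoothed[0] = y[0]                               # initialize at t = 1
    for t in range(1, len(y)):
        smoothed[t] = alpha * y[t] + (1 - alpha) * smoothed[t - 1]
    return smoothed, np.full(h, smoothed[-1])        # flat forecast at the final level

rng = np.random.default_rng(12)
level = np.cumsum(rng.normal(scale=0.1, size=300))   # slowly drifting local level
y = level + rng.normal(scale=0.5, size=300)
_, forecast = exponential_smoothing(y, alpha=0.3, h=12)
print(forecast[:3])
```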
263 Demonstration that the weights are exponential Start: ȳ t = αy t + (1 α)ȳ t 1 Substitute backward for ȳ: ȳ t = t 1 w j y t j j=0 where w j = α(1 α) j Exponential weighting, as claimed Convenient recursive structure 263 / 323
264 Holt-Winters Smoothing y t = c 0t + c 1t TIME t + ε t c 0t = c 0,t 1 + η t c 1t = c 1,t 1 + ν t Local level and slope model Holt-Winters smoothing can construct optimal estimates of c 0 and c 1 - hence the optimal forecast of any future value of y by extrapolating the trend - on the basis of current and past y 264 / 323
265 Holt-Winters smoothing algorithm 1. Initialize at t = 2: ȳ 2 = y 2 F 2 = y 2 y 1 2. Update: ȳ t = αy t + (1 α)(ȳ t 1 + F t 1 ), 0 < α < 1 F t = β(ȳ t ȳ t 1 ) + (1 β)f t 1, 0 < β < 1 t = 3, 4,..., T. 3. Forecast: ŷ T +h,t = ȳ T + hf T ȳ t is the estimated level at time t F t is the estimated slope at time t 265 / 323
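The Holt-Winters (local level and slope) recursions in the same style (a sketch; statsmodels' ExponentialSmoothing with trend='add' implements the same idea with estimated smoothing parameters):

```python
import numpy as np

def holt_winters(y, alpha, beta, h=1):
    """Level-and-slope smoothing with linear extrapolative forecasts y_hat_{T+h,T} = level_T + h*slope_T."""
    y = np.asarray(y, dtype=float)
    level, slope = y[1], y[1] - y[0]                 # initialize at t = 2
    for t in range(2, len(y)):
        prev_level = level
        level = alpha * y[t] + (1 - alpha) * (level + slope)
        slope = beta * (level - prev_level) + (1 - beta) * slope
    return level + slope * np.arange(1, h + 1)

rng = np.random.default_rng(13)
y = 2.0 + 0.1 * np.arange(300) + rng.normal(scale=0.5, size=300)
print(holt_winters(y, alpha=0.3, beta=0.1, h=5))
```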
266 Random Walk Level and Change 266 / 323
267 Random Walk With Drift Level and Change 267 / 323
268 U.S. Per Capita GNP History and Two Forecasts 268 / 323
269 U.S. Per Capita GNP History, Two Forecasts, and Realization 269 / 323
270 Random Walk, Levels Sample Autocorrelation Function (Top Panel) and Sample Partial Autocorrelation Function (Bottom Panel) 270 / 323
271 Random Walk, First Differences Sample Autocorrelation Function (Top Panel) and Sample Partial Autocorrelation Function (Bottom Panel) 271 / 323
272 Log Yen / Dollar Exchange Rate (Top Panel) and Change in Log Yen / Dollar Exchange Rate (Bottom Panel) 272 / 323
273 Log Yen / Dollar Exchange Rate Sample Autocorrelations (Top Panel) and Sample Partial Autocorrelations (Bottom Panel) 273 / 323
274 Log Yen / Dollar Exchange Rate, First Differences Sample Autocorrelations (Top Panel) and Sample Partial Autocorrelations (Bottom Panel) 274 / 323
275 Log Yen / Dollar Rate, Levels AIC and SIC Values of Various ARMA Models 275 / 323
276 Log Yen / Dollar Exchange Rate Best-Fitting Deterministic-Trend Model 276 / 323
277 Log Yen / Dollar Exchange Rate Best-Fitting Deterministic-Trend Model : Residual Plot 277 / 323
278 Log Yen / Dollar Rate History and Forecast : AR(2) in Levels with Linear Trend 278 / 323
279 Log Yen / Dollar Rate History and Long-Horizon Forecast : AR(2) in Levels with Linear Trend 279 / 323
280 Log Yen / Dollar Rate History, Forecast and Realization : AR(2) in Levels with Linear Trend 280 / 323
281 Log Yen / Dollar Exchange Rate Augmented Dickey-Fuller Unit Root Test 281 / 323
282 Log Yen / Dollar Rate, Changes AIC and SIC Values of Various ARMA Models 282 / 323
283 Log Yen / Dollar Exchange Rate Best-Fitting Stochastic-Trend Model 283 / 323
284 Log Yen / Dollar Exchange Rate Best-Fitting Stochastic-Trend Model : Residual Plot 284 / 323
285 Log Yen / Dollar Rate History and Forecast : AR(1) in Differences with Intercept 285 / 323
286 Log Yen / Dollar Rate History and Long-Horizon Forecast : AR(1) in Differences with Intercept 286 / 323
287 Log Yen / Dollar Rate History, Forecast and Realization : AR(1) in Differences with Intercept 287 / 323
288 Log Yen / Dollar Exchange Rate Holt-Winters Smoothing 288 / 323
289 Log Yen / Dollar Rate History and Forecast : Holt-Winters Smoothing 289 / 323
290 Log Yen / Dollar Rate History and Long-Horizon Forecast : Holt-Winters Smoothing 290 / 323
291 Log Yen / Dollar Rate History, Forecast and Realization : Holt-Winters Smoothing 291 / 323
292 Volatility Measurement, Modeling and Forecasting The main idea: ε t Ω t 1 (0, σ 2 t ) Ω t 1 = {ε t 1, ε t 2,...} We ll look at: Basic Structure and properties Time variation in volatility and prediction-error variance ARMA representation in squares GARCH(1,1) and exponential smoothing Unconditional symmetry and leptokurtosis Convergence to normality under temporal aggregation Estimation and testing 292 / 323
293 Basic Structure and Properties Standard models (e.g., ARMA): Unconditional mean: constant Unconditional variance: constant Conditional mean: varies Conditional variance: constant (unfortunately) k-step-ahead forecast error variance: depends only on k, not on Ω t (again unfortunately) 293 / 323
294 The Basic ARCH Process y_t = B(L)ε_t, B(L) = Σ_{i=0}^∞ b_i L^i, Σ_{i=0}^∞ b_i² < ∞, b_0 = 1; ε_t | Ω_{t−1} ~ N(0, σ_t²); σ_t² = ω + γ(L)ε_t², ω > 0, γ(L) = Σ_{i=1}^p γ_i L^i, γ_i ≥ 0 for all i, Σγ_i < 1 294 / 323
295 The Basic ARCH Process cont. ARCH(1) process: r_t | Ω_{t−1} ~ (0, σ_t²), σ_t² = ω + αr²_{t−1}. Unconditional mean: E(r_t) = 0. Unconditional variance: E(r_t − E(r_t))² = ω/(1 − α). Conditional mean: E(r_t | Ω_{t−1}) = 0. Conditional variance: E([r_t − E(r_t | Ω_{t−1})]² | Ω_{t−1}) = ω + αr²_{t−1} 295 / 323
296 The GARCH Process y_t = ε_t, ε_t | Ω_{t−1} ~ N(0, σ_t²), σ_t² = ω + α(L)ε_t² + β(L)σ_t², α(L) = Σ_{i=1}^p α_i L^i, β(L) = Σ_{i=1}^q β_i L^i, ω > 0, α_i ≥ 0, β_i ≥ 0, Σα_i + Σβ_i < 1 296 / 323
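A sketch simulating the GARCH(1,1) special case of this process from its recursion, which makes volatility clustering and unconditional leptokurtosis visible (parameter values are illustrative):

```python
import numpy as np

rng = np.random.default_rng(14)
omega, alpha, beta = 0.05, 0.10, 0.85            # omega > 0, alpha, beta >= 0, alpha + beta < 1
T = 2000
r = np.zeros(T)
sigma2 = np.zeros(T)
sigma2[0] = omega / (1 - alpha - beta)           # start at the unconditional variance

for t in range(1, T):
    sigma2[t] = omega + alpha * r[t - 1] ** 2 + beta * sigma2[t - 1]
    r[t] = np.sqrt(sigma2[t]) * rng.normal()     # r_t | Omega_{t-1} ~ N(0, sigma_t^2)

print("sample kurtosis (should exceed 3):",
      np.mean((r - r.mean()) ** 4) / np.var(r) ** 2)
```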
297 Time Variation in Volatility and Prediction Error Variance Prediction error variance depends on Ω_{t−1}; e.g. the 1-step-ahead prediction error variance is now σ_t² = ω + αr²_{t−1} + βσ²_{t−1}. Conditional variance is a serially correlated RV. Again, this follows immediately from σ_t² = ω + αr²_{t−1} + βσ²_{t−1} 297 / 323
298 ARMA Representation in Squares r_t² has the ARMA(1,1) representation: r_t² = ω + (α + β)r²_{t−1} − βν_{t−1} + ν_t, where ν_t = r_t² − σ_t². Important result: the above equation is simply r_t² = (ω + (α + β)r²_{t−1} − βν_{t−1}) + ν_t = σ_t² + ν_t. Thus r_t² is a noisy indicator of σ_t² 298 / 323
299 GARCH(1,1) and Exponential Smoothing Exponential smoothing recursion: r̄_t² = γr_t² + (1 − γ)r̄²_{t−1}. Back substitution yields: r̄_t² = Σ_j w_j r²_{t−j}, where w_j = γ(1 − γ)^j. GARCH(1,1): σ_t² = ω + αr²_{t−1} + βσ²_{t−1}. Back substitution yields: σ_t² = ω/(1 − β) + α Σ_j β^{j−1} r²_{t−j} 299 / 323
300 Unconditional Symmetry and Leptokurtosis Volatility clustering produces unconditional leptokurtosis Conditional symmetry translates into unconditional symmetry Unexpected agreement with the facts! Convergence to Normality under Temporal Aggregation Temporal aggregation of covariance stationary GARCH processes produces convergence to normality. Again, unexpected agreement with the facts! 300 / 323
301 Estimation and Testing Estimation: easy! Maximum Likelihood Estimation L(θ; r_1, ..., r_T) = f(r_T | Ω_{T−1}; θ) f(r_{T−1} | Ω_{T−2}; θ) .... If the conditional densities are Gaussian, f(r_t | Ω_{t−1}; θ) = (1/√(2π)) σ_t²(θ)^{−1/2} exp( −(1/2) r_t² / σ_t²(θ) ). We can ignore the f(r_p, ..., r_1; θ) term, yielding the log-likelihood: −((T − p)/2) ln(2π) − (1/2) Σ_{t=p+1}^T ln σ_t²(θ) − (1/2) Σ_{t=p+1}^T r_t² / σ_t²(θ). Testing: likelihood ratio tests. Graphical diagnostics: correlogram of squares, correlogram of squared standardized residuals 301 / 323
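A compact numerical sketch of maximizing this Gaussian (quasi-)likelihood for a GARCH(1,1), using scipy's optimizer (in practice a dedicated package such as arch is more robust; starting values, bounds, and the placeholder return series are illustrative):

```python
import numpy as np
from scipy.optimize import minimize

def neg_loglik(params, r):
    """Negative Gaussian log-likelihood of a GARCH(1,1) for a return series r."""
    omega, alpha, beta = params
    T = len(r)
    sigma2 = np.empty(T)
    sigma2[0] = r.var()                              # simple initialization
    for t in range(1, T):
        sigma2[t] = omega + alpha * r[t - 1] ** 2 + beta * sigma2[t - 1]
    return 0.5 * np.sum(np.log(2 * np.pi) + np.log(sigma2) + r ** 2 / sigma2)

rng = np.random.default_rng(15)
r = rng.standard_t(df=6, size=1500) * 0.01           # placeholder returns (fat-tailed)

result = minimize(neg_loglik, x0=np.array([1e-6, 0.05, 0.90]), args=(r,),
                  bounds=[(1e-8, None), (0.0, 1.0), (0.0, 1.0)],
                  method="L-BFGS-B")
print(result.x)                                      # omega_hat, alpha_hat, beta_hat
```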
302 Variations on Volatility Models We will look at: Asymmetric response and the leverage effect Exogenous variables GARCH-M and time-varying risk premia 302 / 323
303 Asymmetric Response and the Leverage Effect: TGARCH and EGARCH Asymmetric response I: TARCH Standard GARCH: σ_t² = ω + αr²_{t−1} + βσ²_{t−1}. TARCH: σ_t² = ω + αr²_{t−1} + γr²_{t−1}D_{t−1} + βσ²_{t−1}, where positive return (good news): α effect on volatility; negative return (bad news): α + γ effect on volatility. γ ≠ 0: asymmetric news response. γ > 0: leverage effect 303 / 323
304 Asymmetric Response and the Leverage Effect cont. Asymmetric Response II: EGARCH ln(σ_t²) = ω + α |r_{t−1}/σ_{t−1}| + γ (r_{t−1}/σ_{t−1}) + β ln(σ²_{t−1}). Log specification ensures that the conditional variance is positive. Volatility driven by both size and sign of shocks. Leverage effect when γ < 0 304 / 323
305 Introducing Exogenous Variables r t Ω t 1 N(0, σ 2 t ) σ 2 t = ω + αr 2 t 1 + βσ 2 t 1 + γx t where: γ is a parameter vector X is a set of positive exogenous variables. 305 / 323
306 Component GARCH Standard GARCH: (σ 2 t ω) = α(r 2 t 1 ω) + β(σ 2 t 1 ω), for constant long-run volatility ω. Component GARCH: (σ 2 t q t ) = α(r 2 t 1 q t 1 ) + β(σ 2 t 1 q t 1 ), for time-varying long-run volatility q t, where q t = ω + ρ(q t 1 ω) + φ(r 2 t 1 σ 2 t 1) 306 / 323
307 Component GARCH Cont. Transitory dynamics governed by α + β Persistent dynamics governed by ρ Equivalent to nonlinearly restricted GARCH(2,2) Exogenous variables and asymmetry can be allowed: (σt 2 q t ) = α(r 2 t 1 q t 1 ) + γ(r 2 t 1 q t 1 )D t 1 + β(σt 1 2 q t 1 ) + θ 307 / 323
308 Regression with GARCH Disturbances y t = x tβ + ε t ε t Ω t 1 N(0, σ 2 t ) 308 / 323
309 GARCH-M and Time-Varying Risk Premia Standard GARCH regression model: y t = x tβ + ε t ε t Ω t 1 N(0, σ 2 t ) GARCH-M model is a special case: y t = x tβ + γσ 2 t + ε t ε t Ω t 1 N(0, σ 2 t ) Time-varying risk premia in excess returns 309 / 323
310 Time Series Plot NYSE Returns 310 / 323
311 Histogram and Related Diagnostic Statistics NYSE Returns 311 / 323
312 Correlogram NYSE Returns 312 / 323
313 Time Series Plot Squared NYSE Returns 313 / 323
314 Correlogram Squared NYSE Returns 314 / 323
315 AR(5) Model Squared NYSE Returns 315 / 323
316 ARCH(5) Model NYSE Returns 316 / 323
317 Correlogram Standardized ARCH(5) Residuals : NYSE Returns 317 / 323
318 GARCH(1,1) Model NYSE Returns 318 / 323
319 Correlogram Standardized GARCH(1,1) Residuals : NYSE Returns 319 / 323
320 Estimated Conditional Standard Deviation GARCH(1,1) Model : NYSE Returns 320 / 323
321 Estimated Conditional Standard Deviation Exponential Smoothing : NYSE Returns 321 / 323
322 Conditional Standard Deviation History and Forecast : GARCH(1,1) Model 322 / 323
323 Conditional Standard Deviation Extended History and Extended Forecast : GARCH(1,1) Model 323 / 323
Week 5 Quantitative Analysis of Financial Markets Characterizing Cycles
Week 5 Quantitative Analysis of Financial Markets Characterizing Cycles Christopher Ting http://www.mysmu.edu/faculty/christophert/ Christopher Ting : christopherting@smu.edu.sg : 6828 0364 : LKCSB 5036
More informationMidterm Suggested Solutions
CUHK Dept. of Economics Spring 2011 ECON 4120 Sung Y. Park Midterm Suggested Solutions Q1 (a) In time series, autocorrelation measures the correlation between y t and its lag y t τ. It is defined as. ρ(τ)
More informationHeteroskedasticity in Time Series
Heteroskedasticity in Time Series Figure: Time Series of Daily NYSE Returns. 206 / 285 Key Fact 1: Stock Returns are Approximately Serially Uncorrelated Figure: Correlogram of Daily Stock Market Returns.
More informationProf. Dr. Roland Füss Lecture Series in Applied Econometrics Summer Term Introduction to Time Series Analysis
Introduction to Time Series Analysis 1 Contents: I. Basics of Time Series Analysis... 4 I.1 Stationarity... 5 I.2 Autocorrelation Function... 9 I.3 Partial Autocorrelation Function (PACF)... 14 I.4 Transformation
More informationAutoregressive Moving Average (ARMA) Models and their Practical Applications
Autoregressive Moving Average (ARMA) Models and their Practical Applications Massimo Guidolin February 2018 1 Essential Concepts in Time Series Analysis 1.1 Time Series and Their Properties Time series:
More informationEconometrics II Heij et al. Chapter 7.1
Chapter 7.1 p. 1/2 Econometrics II Heij et al. Chapter 7.1 Linear Time Series Models for Stationary data Marius Ooms Tinbergen Institute Amsterdam Chapter 7.1 p. 2/2 Program Introduction Modelling philosophy
More informationAutoregressive and Moving-Average Models
Chapter 3 Autoregressive and Moving-Average Models 3.1 Introduction Let y be a random variable. We consider the elements of an observed time series {y 0,y 1,y2,...,y t } as being realizations of this randoms
More informationCovariance Stationary Time Series. Example: Independent White Noise (IWN(0,σ 2 )) Y t = ε t, ε t iid N(0,σ 2 )
Covariance Stationary Time Series Stochastic Process: sequence of rv s ordered by time {Y t } {...,Y 1,Y 0,Y 1,...} Defn: {Y t } is covariance stationary if E[Y t ]μ for all t cov(y t,y t j )E[(Y t μ)(y
More information10. Time series regression and forecasting
10. Time series regression and forecasting Key feature of this section: Analysis of data on a single entity observed at multiple points in time (time series data) Typical research questions: What is the
More informationProblem Set 2: Box-Jenkins methodology
Problem Set : Box-Jenkins methodology 1) For an AR1) process we have: γ0) = σ ε 1 φ σ ε γ0) = 1 φ Hence, For a MA1) process, p lim R = φ γ0) = 1 + θ )σ ε σ ε 1 = γ0) 1 + θ Therefore, p lim R = 1 1 1 +
More informationAPPLIED ECONOMETRIC TIME SERIES 4TH EDITION
APPLIED ECONOMETRIC TIME SERIES 4TH EDITION Chapter 2: STATIONARY TIME-SERIES MODELS WALTER ENDERS, UNIVERSITY OF ALABAMA Copyright 2015 John Wiley & Sons, Inc. Section 1 STOCHASTIC DIFFERENCE EQUATION
More informationEmpirical Market Microstructure Analysis (EMMA)
Empirical Market Microstructure Analysis (EMMA) Lecture 3: Statistical Building Blocks and Econometric Basics Prof. Dr. Michael Stein michael.stein@vwl.uni-freiburg.de Albert-Ludwigs-University of Freiburg
More informationECON/FIN 250: Forecasting in Finance and Economics: Section 7: Unit Roots & Dickey-Fuller Tests
ECON/FIN 250: Forecasting in Finance and Economics: Section 7: Unit Roots & Dickey-Fuller Tests Patrick Herb Brandeis University Spring 2016 Patrick Herb (Brandeis University) Unit Root Tests ECON/FIN
More information7. Forecasting with ARIMA models
7. Forecasting with ARIMA models 309 Outline: Introduction The prediction equation of an ARIMA model Interpreting the predictions Variance of the predictions Forecast updating Measuring predictability
More informationVolatility. Gerald P. Dwyer. February Clemson University
Volatility Gerald P. Dwyer Clemson University February 2016 Outline 1 Volatility Characteristics of Time Series Heteroskedasticity Simpler Estimation Strategies Exponentially Weighted Moving Average Use
More informationClass 1: Stationary Time Series Analysis
Class 1: Stationary Time Series Analysis Macroeconometrics - Fall 2009 Jacek Suda, BdF and PSE February 28, 2011 Outline Outline: 1 Covariance-Stationary Processes 2 Wold Decomposition Theorem 3 ARMA Models
More informationCh. 15 Forecasting. 1.1 Forecasts Based on Conditional Expectations
Ch 15 Forecasting Having considered in Chapter 14 some of the properties of ARMA models, we now show how they may be used to forecast future values of an observed time series For the present we proceed
More informationRomanian Economic and Business Review Vol. 3, No. 3 THE EVOLUTION OF SNP PETROM STOCK LIST - STUDY THROUGH AUTOREGRESSIVE MODELS
THE EVOLUTION OF SNP PETROM STOCK LIST - STUDY THROUGH AUTOREGRESSIVE MODELS Marian Zaharia, Ioana Zaheu, and Elena Roxana Stan Abstract Stock exchange market is one of the most dynamic and unpredictable
More informationLecture on ARMA model
Lecture on ARMA model Robert M. de Jong Ohio State University Columbus, OH 43210 USA Chien-Ho Wang National Taipei University Taipei City, 104 Taiwan ROC October 19, 2006 (Very Preliminary edition, Comment
More informationTopic 4 Unit Roots. Gerald P. Dwyer. February Clemson University
Topic 4 Unit Roots Gerald P. Dwyer Clemson University February 2016 Outline 1 Unit Roots Introduction Trend and Difference Stationary Autocorrelations of Series That Have Deterministic or Stochastic Trends
More informationEcon 4120 Applied Forecasting Methods L10: Forecasting with Regression Models. Sung Y. Park CUHK
Econ 4120 Applied Forecasting Methods L10: Forecasting with Regression Models Sung Y. Park CUHK Conditional forecasting model Forecast a variable conditional on assumptions about other variables. (scenario
More informationUnivariate linear models
Univariate linear models The specification process of an univariate ARIMA model is based on the theoretical properties of the different processes and it is also important the observation and interpretation
More informationE 4101/5101 Lecture 9: Non-stationarity
E 4101/5101 Lecture 9: Non-stationarity Ragnar Nymoen 30 March 2011 Introduction I Main references: Hamilton Ch 15,16 and 17. Davidson and MacKinnon Ch 14.3 and 14.4 Also read Ch 2.4 and Ch 2.5 in Davidson
More informationE 4160 Autumn term Lecture 9: Deterministic trends vs integrated series; Spurious regression; Dickey-Fuller distribution and test
E 4160 Autumn term 2016. Lecture 9: Deterministic trends vs integrated series; Spurious regression; Dickey-Fuller distribution and test Ragnar Nymoen Department of Economics, University of Oslo 24 October
More informationApplied time-series analysis
Robert M. Kunst robert.kunst@univie.ac.at University of Vienna and Institute for Advanced Studies Vienna October 18, 2011 Outline Introduction and overview Econometric Time-Series Analysis In principle,
ECON/FIN 250: Forecasting in Finance and Economics: Section 6: Standard Univariate Models Patrick Herb Brandeis University Spring 2016 Patrick Herb (Brandeis University) Standard Univariate Models ECON/FIN
ARMA Handout Jialin Yu 1 Linear Difference Equations First order systems Let {ε_t}_{t=1} denote an input sequence and {y_t}_{t=1} denote an output sequence generated by y_t = φ y_{t−1} + ε_t, t = 1, 2, ..., with
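That first-order difference equation is easy to simulate. The sketch below uses an assumed φ = 0.8 and unit-variance shocks (my choices, not from the handout) and compares the sample variance with the stationary AR(1) variance σ²/(1 − φ²).

```python
# Minimal sketch: simulate y_t = phi * y_{t-1} + eps_t with assumed phi = 0.8.
import numpy as np

rng = np.random.default_rng(6)
phi, T = 0.8, 250
eps = rng.standard_normal(T)

y = np.zeros(T)
for t in range(1, T):
    y[t] = phi * y[t - 1] + eps[t]

# For |phi| < 1 the process is stationary with variance sigma^2 / (1 - phi^2).
print("sample variance:", y.var(), " theoretical:", 1 / (1 - phi ** 2))
```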
Vector Auto-Regressive Models Laurent Ferrara 1 1 University of Paris Nanterre M2 Oct. 2018 Overview of the presentation 1. Vector Auto-Regressions Definition Estimation Testing 2. Impulse responses functions
VAR Models and Applications Laurent Ferrara 1 1 University of Paris West M2 EIPMC Oct. 2016 Overview of the presentation 1. Vector Auto-Regressions Definition Estimation Testing 2. Impulse responses functions
Lecture 2: Univariate Time Series Analysis: Conditional and Unconditional Densities, Stationarity, ARMA Processes Prof. Massimo Guidolin 20192 Financial Econometrics Spring/Winter 2017 Overview Motivation:
1 Introduction Econ 623 Econometrics II Topic 2: Stationary Time Series In the regression model we can model the error term as an autoregression AR(1) process. That is, we can use the past value of the
Econometrics Francis X. Diebold University of Pennsylvania April 17, 2018 1 / 259 Copyright c 2013-2018, by Francis X. Diebold. All rights reserved. All materials are freely available for your use, but
Introduction to ARMA and GARCH processes Fulvio Corsi SNS Pisa 3 March 2010 Fulvio Corsi Introduction to ARMA () and GARCH processes SNS Pisa 3 March 2010 1 / 24 Stationarity Strict stationarity: (X 1,
Univariate ARIMA Models ARIMA Model Building Steps: Identification: Using graphs, statistics, ACFs and PACFs, transformations, etc. to achieve stationary and tentatively identify patterns and model components.
Time Series Analysis James D. Hamilton PRINCETON UNIVERSITY PRESS PRINCETON, NEW JERSEY Contents PREFACE xiii 1 Difference Equations 1.1. First-Order Difference Equations 1 1.2. pth-order Difference
Advanced Econometrics Marco Sunder Nov 04 2010 Contents 1 2 3 Music
III Box-Jenkins Methods 1. Pros and Cons of ARIMA Forecasting a) need for data at least 50 and preferably 100 observations should be available to build a proper model used most frequently for hourly or
Time Series Analysis, Lecture 1, 2018 1 1 Class Organization Course Description Prerequisite Homework and Grading Readings and Lecture Notes Course Website: http://www.nanlifinance.org/teaching.html wechat
ARIMA Modelling and Forecasting Economic time series often appear nonstationary, because of trends, seasonal patterns, cycles, etc. However, the differences may appear stationary. Δx_t = x_t − x_{t−1} (first
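First differencing is a one-line operation in code. The sketch below is illustrative only: it differences a simulated random walk (my example, not from the excerpted notes), after which the series is stationary white noise.

```python
# Minimal sketch: first differencing, dx_t = x_t - x_{t-1}, of a random walk.
import numpy as np

rng = np.random.default_rng(7)
x = np.cumsum(rng.standard_normal(300))      # nonstationary level series

dx = np.diff(x)                              # first difference
print("level variance:", x.var(), " first-difference variance:", dx.var())
```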
Econ 423 Lecture Notes: Additional Topics in Time Series 1 John C. Chao April 25, 2017 1 These notes are based in large part on Chapter 16 of Stock and Watson (2011). They are for instructional purposes
Trend-Cycle Decompositions Eric Zivot April 22, 2005 1 Introduction A convenient way of representing an economic time series y t is through the so-called trend-cycle decomposition y t = TD t + Z t (1)
Econometric Forecasting. Robert M. Kunst robert.kunst@univie.ac.at University of Vienna and Institute for Advanced Studies Vienna October 1, 2014 Outline Introduction Model-free extrapolation Univariate time-series models Trend
Econometrics II Non-Stationary Time Series and Unit Root Testing Morten Nyboe Tabor Course Outline: Non-Stationary Time Series and Unit Root Testing 1 Stationarity and Deviation from Stationarity Trend-Stationarity
30C00200 Econometrics 9) Time series econometrics Timo Kuosmanen Professor Management Science http://nomepre.net/index.php/timokuosmanen 1 Macroeconomic data: GDP Inflation rate Examples of time series
Ch. 14 Stationary ARMA Process. A general linear stochastic model is described that supposes a time series to be generated by a linear aggregation of random shocks. For practical representation it is desirable
Econometrics 2 Fall 24 Univariate Time Series Analysis; ARIMA Models Heino Bohn Nielsen Outline of the Lecture (1) Introduction to univariate time series analysis. (2) Stationarity. (3) Characterizing
Time Series Analysis James D. Hamilton PRINCETON UNIVERSITY PRESS PRINCETON, NEW JERSEY PREFACE xiii 1 Difference Equations 1.1. First-Order Difference Equations 1 1.2. pth-order Difference Equations 7
Financial Time Series Analysis: Part II. Department of Mathematics and Statistics, University of Vaasa, Finland Spring 2017 1 Unit root Deterministic trend Stochastic trend Testing for unit root ADF-test (Augmented Dickey-Fuller test) Testing
Econometrics of financial markets, solutions to seminar 1. Problem 1 a) Estimate with OLS. For any regression y_i = α + βx_i + u_i, for OLS to be unbiased we need cov(u_i, x_j) = 0 ∀ i, j. For the autoregressive
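The OLS estimation step referred to there can be sketched on simulated data. The example below is my own illustration (true autoregressive coefficient 0.6 assumed): it regresses y_t on a constant and its own lag by least squares.

```python
# Minimal sketch: OLS estimation of y_t = alpha + beta * y_{t-1} + u_t
# on simulated AR(1) data with an assumed true beta = 0.6.
import numpy as np

rng = np.random.default_rng(8)
T, beta = 1000, 0.6
y = np.zeros(T)
for t in range(1, T):
    y[t] = beta * y[t - 1] + rng.standard_normal()

X = np.column_stack([np.ones(T - 1), y[:-1]])    # regressors: constant, lagged y
coef, *_ = np.linalg.lstsq(X, y[1:], rcond=None)
print("OLS estimates (alpha, beta):", coef)
```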
Discrete time processes. Predictions are difficult, especially about the future (Mark Twain). Florian Herzog 2013. Modeling observed data: When we model observed (realized) data, we usually encounter the following
Cointegration, Stationarity and Error Correction Models. STATIONARITY Wold's decomposition theorem states that a stationary time series process with no deterministic components has an infinite moving average
Week 5 Quantitative Analysis of Financial Markets Modeling and Forecasting Trend Christopher Ting http://www.mysmu.edu/faculty/christophert/ Christopher Ting : christopherting@smu.edu.sg : 6828 0364 :
STAT 6104 - Financial Time Series Chapter 4 - Estimation in the time Domain Chun Yip Yau (CUHK) STAT 6104:Financial Time Series 1 / 46 Agenda 1 Introduction 2 Moment Estimates 3 Autoregressive Models (AR
Syllabus Stationarity ARMA AR MA Model Selection Estimation Lecture 1: Stationary Time Series Analysis 222061-1617: Time Series Econometrics Spring 2018 Jacek Suda Syllabus Stationarity ARMA AR MA Model
30C00200 Econometrics 10) Time series econometrics Timo Kuosmanen Professor, Ph.D. 1 Topics today Static vs. dynamic time series model Spurious regression Stationary and nonstationary time series Unit
TIME SERIES AND FORECASTING Luca Gambetti UAB, Barcelona GSE 2014-2015 Master in Macroeconomic Policy and Financial Markets 1 Contacts Prof.: Luca Gambetti Office: B3-1130 Edifici B Office hours: email:
Università di Pavia Introduction to Stochastic processes Eduardo Rossi Stochastic Process Stochastic Process: A stochastic process is an ordered sequence of random variables defined on a probability space
Time Series Models and Inference James L. Powell Department of Economics University of California, Berkeley Overview In contrast to the classical linear regression model, in which the components of the
White Noise Processes (Section 6.2). Recall that covariance stationary processes are time series y_t such that: 1. E(y_t) = µ for all t; 2. Var(y_t) = σ² < ∞ for all t; 3. Cov(y_t, y_{t−τ}) = γ(τ) for all t and τ
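Those covariance-stationarity conditions can be checked informally on simulated data. The sketch below is illustrative only: it generates Gaussian white noise and, assuming statsmodels is available, reports the sample mean, variance, and low-order autocorrelations (which should be near zero beyond lag 0).

```python
# Minimal sketch: sample moments and autocorrelations of simulated white noise.
import numpy as np
from statsmodels.tsa.stattools import acf

rng = np.random.default_rng(9)
y = rng.standard_normal(2000)

print("sample mean:", y.mean())
print("sample variance:", y.var())
print("sample autocorrelations, lags 1-5:", acf(y, nlags=5, fft=True)[1:])
```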
Review Session: Econometrics - CLEFIN (20192) Part II: Univariate time series analysis Daniele Bianchi March 20, 2013 Fundamentals Stationarity A time series is a sequence of random variables x t, t =
7. Integrated Processes Up to now: Analysis of stationary processes (stationary ARMA(p, q) processes) Problem: Many economic time series exhibit non-stationary patterns over time 226 Example: We consider
TIME SERIES ANALYSIS Forecasting and Control Fifth Edition GEORGE E. P. BOX GWILYM M. JENKINS GREGORY C. REINSEL GRETA M. LJUNG Wiley CONTENTS PREFACE TO THE FIFTH EDITION PREFACE TO THE FOURTH EDITION
Forecasting. Let {y_t} be a covariance stationary and ergodic process, e.g. an ARMA(p, q) process, with Wold representation y_t = μ + Σ_{j=0}^∞ ψ_j ε_{t−j} = μ + ε_t + ψ_1 ε_{t−1} + ψ_2 ε_{t−2} + ..., where ε_t ~ WN(0, σ²). Define y_{t+h|t} as the forecast of y_{t+h} based on I_t and known parameters; the forecast error is
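The ψ-weights in that Wold representation, and the implied h-step forecast error variance σ²Σ_{j<h}ψ_j², can be computed numerically. The sketch below assumes an ARMA(1,1) with made-up parameters and uses statsmodels' arma2ma helper; it is an illustration, not the excerpted notes' own calculation.

```python
# Minimal sketch: psi-weights (MA(infinity) coefficients) of an assumed
# ARMA(1,1) and the implied h-step forecast error variance.
import numpy as np
from statsmodels.tsa.arima_process import arma2ma

phi, theta, sigma2 = 0.7, 0.3, 1.0               # assumed ARMA(1,1) parameters
psi = arma2ma(np.array([1.0, -phi]), np.array([1.0, theta]), lags=20)

h = 4
fe_var = sigma2 * np.sum(psi[:h] ** 2)           # h-step forecast error variance
print("first psi-weights:", psi[:5])
print(f"{h}-step forecast error variance:", fe_var)
```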
ECON 616: Lecture 1: Time Series Basics ED HERBST August 30, 2017 References Overview: Chapters 1-3 from Hamilton (1994). Technical Details: Chapters 2-3 from Brockwell and Davis (1987). Intuition: Chapters
G. S. Maddala Kajal Lahiri WILEY A John Wiley and Sons, Ltd., Publication TEMT Foreword Preface to the Fourth Edition xvii xix Part I Introduction and the Linear Regression Model 1 CHAPTER 1 What is Econometrics?
Stationary Stochastic Time Series Models When modeling time series it is useful to regard an observed time series, (x_1, x_2, ..., x_n), as the realisation of a stochastic process. In general a stochastic
Time Series Econometrics 4 Vijayamohanan Pillai N Autoregressive Moving Average Process: ARMA(p, q) 1 Autoregressive Moving
Econ 427, Spring 2010 Problem Set 3 suggested answers (with minor corrections) Ch 6. Problems and Complements: 1. (page 132) In each case, the idea is to write these out in general form (without the lag
Ch 6. Model Specification. Time Series Analysis. We start to build ARIMA(p,d,q) models. The subjects include: 1 how to determine p, d, q for a given series (Chapter 6); 2 how to estimate the parameters (φ's and θ's) of a specific ARIMA(p,d,q) model (Chapter
Syllabus Stationarity ARMA AR MA Model Selection Estimation Forecasting Lecture 1: Stationary Time Series Analysis 222061-1617: Time Series Econometrics Spring 2018 Jacek Suda Syllabus Stationarity ARMA
Stochastic processes Time series are an example of a stochastic or random process Models for time series A stochastic process is 'a statistical phenomenon that evolves in time according to probabilistic
5 Time series A time series is a set of observations recorded over time. You can think for example of the GDP of a country over the years (or quarters) or the hourly measurements of temperature over a
Notes on Time Series Modeling Garey Ramey University of California, San Diego January 17 1 Stationary processes Definition: A stochastic process is any set of random variables y_t indexed by t ∈ T: {y_t}
Lecture 3: Autoregressive Moving Average (ARMA) Models and their Practical Applications Prof. Massimo Guidolin 20192 Financial Econometrics Winter/Spring 2018 Overview Moving average processes Autoregressive
Permanent Income Hypothesis (PIH) Instructor: Dmytro Hryshko 1 / 36 The PIH Utility function is quadratic, u(c_t) = −(1/2)(c_t − c̄)²; borrowing/saving is allowed using only the risk-free bond; β(1 + r)
Introduction to Modern Time Series Analysis Gebhard Kirchgässner, Jürgen Wolters and Uwe Hassler Second Edition Springer 3 Teaching Material The following figures and tables are from the above book. They
EASTERN MEDITERRANEAN UNIVERSITY ECON 604, FALL 2007 DEPARTMENT OF ECONOMICS MEHMET BALCILAR. ARIMA MODELS: IDENTIFICATION A. Autocorrelations and Partial Autocorrelations 1. Summary of What We Know So Far: a) Series y_t is to be modeled by Box-Jenkins methods. The first step was to convert y_t
Econometrics Summary Algebraic and Statistical Preliminaries Elasticity: The point elasticity of Y with respect to L is given by α = (∂Y/∂L)/(Y/L). The arc elasticity is given by (ΔY/ΔL)/(Y/L), when L
MSc Further Time Series Analysis 5 Transfer function modelling 5.1 The model Consider the construction of a model for a time series (Y t ) whose values are influenced by the earlier values of a series
ECON 5101 ADVANCED ECONOMETRICS TIME SERIES Lecture note no. 1 (EB) Erik Biørn, Department of Economics Version of February 1, 2011 Basic concepts and terminology: AR, MA and ARMA processes This lecture
Chapter 2: Unit Roots 1 Contents: Lehrstuhl für Empirische Wirtschaftsforschung / Department of Empirical Research and Econometrics II. Unit Roots... 3 II.1 Integration Level... 3 II.2 Nonstationarity
TEST #1 STA 4853 March 6, 2017 Name: Please read the following directions. DO NOT TURN THE PAGE UNTIL INSTRUCTED TO DO SO Directions This exam is closed book and closed notes. There are 32 multiple choice
Univariate Nonstationary Time Series 1 Sebastian Fossati University of Alberta 1 These slides are based on Eric Zivot's time series notes available at: http://faculty.washington.edu/ezivot Introduction
ECON/FIN 250: Forecasting in Finance and Economics: Section 8: Forecast Examples: Part 1. Patrick Herb, Brandeis University, Spring 2016.
Chapter 12: An introduction to Time Series Analysis Introduction In this chapter, we will discuss forecasting with single-series (univariate) Box-Jenkins models. The common name of the models is Auto-Regressive
Chapter 9: Forecasting One of the critical goals of time series analysis is to forecast (predict) the values of the time series at times in the future. When forecasting, we ideally should evaluate the
Stat 5100 Handout #12.e Notes: ARIMA Models (Unit 7) Key here: after stationary, identify dependence structure (and use for forecasting) (overshort example) White noise H 0 : Let Z t be the stationary
Forecasting using R Rob J Hyndman 2.4 Non-seasonal ARIMA models Forecasting using R 1 Outline 1 Autoregressive models 2 Moving average models 3 Non-seasonal ARIMA models 4 Partial autocorrelations 5 Estimation
Vector autoregressions, VAR. Chapter 2, Financial Econometrics, Michael Hauser, WS17/18. Content: Cross-correlations, VAR model in standard/reduced form, Properties of VAR(1), VAR(p), Structural VAR,
9. AUTOCORRELATION [1] Definition of Autocorrelation (AUTO) 1) Model: y_t = x_t β + ε_t. We say that AUTO exists if cov(ε_t, ε_s) ≠ 0, t ≠ s. 2) Assumptions: All of SIC except SIC.3 (the random sample assumption).
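A quick diagnostic for autocorrelation of that kind is the Durbin-Watson statistic. The sketch below is an illustration on simulated data (the AR(1) error coefficient 0.7 is assumed) using statsmodels; values well below 2 suggest positive autocorrelation in the residuals.

```python
# Minimal sketch: Durbin-Watson check for autocorrelated regression errors.
import numpy as np
import statsmodels.api as sm
from statsmodels.stats.stattools import durbin_watson

rng = np.random.default_rng(10)
T = 300
x = rng.standard_normal(T)
u = np.zeros(T)
for t in range(1, T):
    u[t] = 0.7 * u[t - 1] + rng.standard_normal()   # AR(1) errors (assumed)
y = 1.0 + 0.5 * x + u

res = sm.OLS(y, sm.add_constant(x)).fit()
print("Durbin-Watson statistic:", durbin_watson(res.resid))
```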
Some Time-Series Models Outline 1. Stochastic processes and their properties 2. Stationary processes 3. Some properties of the autocorrelation function 4. Some useful models Purely random processes, random
ECON3327: Financial Econometrics, Spring 2016 Wooldridge, Introductory Econometrics (5th ed, 2012) Chapter 11: OLS with time series data Stationary and weakly dependent time series The notion of a stationary
AR, MA and ARMA models, by Hedibert Lopes. Based on Tsay's Analysis of Financial Time Series (3rd edition). Outline: Stationarity; Linear Time Series Analysis and Its Applications. For
1 Multivariate Time Series Analysis and Its Applications [Tsay (2005), chapter 8] Insights: Price movements in one market can spread easily and instantly to another market [economic globalization and internet
9 Regression with Time Series 9.1 Some Basic Concepts Static Models: y_t = β_0 + β_1 x_t + u_t, t = 1, 2, ..., T, (1) where T is the number of observations in the time series. The relation between y and x
Review of Statistics Topics Descriptive Statistics Mean, Variance Probability Union event, joint event Random Variables Discrete and Continuous Distributions, Moments Two Random Variables Covariance and
Econometrics of Panel Data Jakub Mućk Meeting # 9 Jakub Mućk Econometrics of Panel Data Meeting # 9 1 / 22 Outline 1 Time series analysis Stationarity Unit Root Tests for Nonstationarity 2 Panel Unit Root