Rob J Hyndman
Forecasting: Principles and Practice
12. Advanced methods
OTexts.com/fpp/9/2/
OTexts.com/fpp/9/3/

Outline
1 Vector autoregressions
2 Neural network models
3 Time series with complex seasonality

Vector autoregressions
Dynamic regression assumes a unidirectional relationship: the forecast variable is influenced by the predictor variables, but not vice versa.
Vector autoregressions (VARs) allow for feedback relationships. All variables are treated symmetrically, i.e., all variables are now treated as endogenous.
For example, personal consumption may be affected by disposable income, and vice versa: the government stimulus package in December 2008 increased Christmas spending, which in turn increased incomes.

Vector autoregressions
VAR(1):
  y_{1,t} = c_1 + \phi_{11,1} y_{1,t-1} + \phi_{12,1} y_{2,t-1} + e_{1,t}
  y_{2,t} = c_2 + \phi_{21,1} y_{1,t-1} + \phi_{22,1} y_{2,t-1} + e_{2,t}

Forecasts:
  \hat{y}_{1,T+1|T} = \hat{c}_1 + \hat{\phi}_{11,1} y_{1,T} + \hat{\phi}_{12,1} y_{2,T}
  \hat{y}_{2,T+1|T} = \hat{c}_2 + \hat{\phi}_{21,1} y_{1,T} + \hat{\phi}_{22,1} y_{2,T}
  \hat{y}_{1,T+2|T} = \hat{c}_1 + \hat{\phi}_{11,1} \hat{y}_{1,T+1|T} + \hat{\phi}_{12,1} \hat{y}_{2,T+1|T}
  \hat{y}_{2,T+2|T} = \hat{c}_2 + \hat{\phi}_{21,1} \hat{y}_{1,T+1|T} + \hat{\phi}_{22,1} \hat{y}_{2,T+1|T}
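A minimal R sketch (not from the slides) of this two-step recursion, using the vars package and the usconsumption data from the fpp package; the forecast() call mirrors the one used later in these slides.

library(fpp)     # usconsumption: quarterly % changes in US consumption and income
library(vars)
fit <- VAR(usconsumption, p=1, type="const")   # VAR(1) with intercepts c_1 and c_2
summary(fit)            # estimated intercepts and phi coefficients for each equation
forecast(fit, h=2)      # produces the T+1|T and T+2|T forecasts defined above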

VARs are useful when
forecasting a collection of related variables where no explicit interpretation is required;
testing whether one variable is useful in forecasting another (the basis of Granger causality tests);
impulse response analysis, where the response of one variable to a sudden but temporary change in another variable is analysed;
forecast error variance decomposition, where the proportion of the forecast variance of one variable is attributed to the effect of other variables.
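A minimal R sketch (not from the slides) of the last three uses, applied to the VAR(3) model for usconsumption fitted later in these slides; it relies only on functions in the vars package.

library(fpp)
library(vars)
var <- VAR(usconsumption, p=3, type="const")
causality(var, cause="income")    # Granger causality: does income help forecast consumption?
plot(irf(var, impulse="income", response="consumption"))   # impulse response analysis
fevd(var, n.ahead=8)              # forecast error variance decomposition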

VAR example
> ar(usconsumption, order=3)
[R output: the $ar component contains the 2x2 coefficient matrices for lags 1, 2 and 3 (rows and columns: consumption, income), and $var.pred contains the residual covariance matrix; the numerical values were not preserved in this transcript.]

VAR example
> library(vars)
> VARselect(usconsumption, lag.max=8, type="const")$selection
AIC(n)  HQ(n)  SC(n) FPE(n)
[lag orders selected by each criterion not preserved in this transcript]
> var <- VAR(usconsumption, p=3, type="const")
> serial.test(var, lags.pt=10, type="PT.asymptotic")

        Portmanteau Test (asymptotic)

data:  Residuals of VAR object var
Chi-squared = , df = 28, p-value =

VAR example
> summary(var)

VAR Estimation Results:
=========================
Endogenous variables: consumption, income
Deterministic variables: const
Sample size: 161

Estimation results for equation consumption:
============================================
[Coefficient table for consumption.l1 (*), income.l1, consumption.l2 (*), income.l2, consumption.l3 (**), income.l3 and const (***); estimates, standard errors, t values and p-values were not preserved in this transcript.]

VAR example

Estimation results for equation income:
=======================================
[Coefficient table for consumption.l1 (***), income.l1 (**), consumption.l2, income.l2, consumption.l3 (***), income.l3 and const (**); numerical values were not preserved in this transcript.]
---
Signif. codes:  0 '***' 0.001 '**' 0.01 '*' 0.05 '.' 0.1 ' ' 1

Correlation matrix of residuals:
[2 x 2 matrix over consumption and income; values not preserved in this transcript.]

VAR example
fcst <- forecast(var)
plot(fcst, xlab="Year")

VAR example
[Figure: Forecasts from VAR(3), showing consumption and income panels plotted against Year.]

Outline
1 Vector autoregressions
2 Neural network models
3 Time series with complex seasonality

Neural network models
Simplest version: linear regression
[Diagram: an input layer (Input #1 to Input #4) connected directly to a single output node.]
Coefficients attached to predictors are called weights.
Forecasts are obtained by a linear combination of inputs.
Weights selected using a learning algorithm that minimises a cost function.

Neural network models
Nonlinear model with one hidden layer
[Diagram: inputs #1 to #4 feed into a hidden layer of nodes, which in turn feeds a single output node.]
A multilayer feed-forward network where each layer of nodes receives inputs from the previous layers.
Inputs to each node combined using linear combination.
Result modified by nonlinear function before being output.

Neural network models
Inputs to hidden neuron j are linearly combined:
  z_j = b_j + \sum_{i=1}^{4} w_{i,j} x_i.
The result is then modified using a nonlinear function such as a sigmoid,
  s(z) = \frac{1}{1 + e^{-z}},
before being output. This tends to reduce the effect of extreme input values, thus making the network somewhat robust to outliers.
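A minimal R sketch (not from the slides) of this computation for a single hidden neuron; the inputs, weights and bias below are made-up illustrative values.

sigmoid <- function(z) 1 / (1 + exp(-z))
x <- c(0.5, -1.2, 3.0, 0.8)    # the four inputs x_1, ..., x_4
w <- c(0.1, 0.4, -0.3, 0.2)    # weights w_{1,j}, ..., w_{4,j} into neuron j
b <- 0.6                       # bias b_j
z <- b + sum(w * x)            # z_j = b_j + sum_i w_{i,j} x_i
sigmoid(z)                     # output of hidden neuron j, squashed into (0, 1)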

Neural network models
Weights take random values to begin with, and are then updated using the observed data.
So there is an element of randomness in the predictions. The network is therefore usually trained several times using different random starting points, and the results are averaged (see the sketch below).
The number of hidden layers, and the number of nodes in each hidden layer, must be specified in advance.
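A minimal R sketch (not from the slides): the nnetar() function introduced later handles this retraining itself; its repeats argument sets how many networks with different random starting weights are fitted and averaged.

library(forecast)
set.seed(2015)                     # make the random starting weights reproducible
fit <- nnetar(lynx, repeats=25)    # average over 25 randomly initialised networks
forecast(fit, h=10)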

NNAR models
Lagged values of the time series can be used as inputs to a neural network.
NNAR(p, k): p lagged inputs and k nodes in the single hidden layer.
An NNAR(p, 0) model is equivalent to an ARIMA(p, 0, 0) model but without stationarity restrictions.
Seasonal NNAR(p, P, k)_m: inputs (y_{t-1}, y_{t-2}, ..., y_{t-p}, y_{t-m}, y_{t-2m}, ..., y_{t-Pm}) and k neurons in the hidden layer.
An NNAR(p, P, 0)_m model is equivalent to an ARIMA(p, 0, 0)(P, 0, 0)_m model but without stationarity restrictions.

NNAR models in R
The nnetar() function fits an NNAR(p, P, k)_m model.
If p and P are not specified, they are selected automatically.
For non-seasonal time series, the default p is the optimal number of lags (according to the AIC) for a linear AR(p) model.
For seasonal time series, the defaults are P = 1, with p chosen from the optimal linear model fitted to the seasonally adjusted data.
Default k = (p + P + 1)/2 (rounded to the nearest integer).
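A minimal R sketch (not from the slides) contrasting the automatic defaults with an explicit specification, using the monthly a10 drug-sales series from the fpp package.

library(fpp)                     # loads the forecast package; provides a10
fit1 <- nnetar(a10)              # p, P and k chosen automatically as described above
fit1                             # printing shows the selected NNAR(p,P,k)[12] model
fit2 <- nnetar(a10, p=3, P=1, size=2)   # an explicit NNAR(3,1,2)[12]
plot(forecast(fit2, h=24))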

Sunspots
The surface of the sun contains magnetic regions that appear as dark spots.
These affect the propagation of radio waves, so telecommunication companies like to predict sunspot activity in order to plan for any future difficulties.
Sunspots follow a cycle of length between 9 and 14 years.

NNAR(9,5) model for sunspots
[Figure: Forecasts from NNAR(9) fitted to the annual sunspot area series.]

NNAR model for sunspots
fit <- nnetar(sunspotarea)
plot(forecast(fit,h=20))

To restrict the forecasts to positive values:
fit <- nnetar(sunspotarea, lambda=0)
plot(forecast(fit,h=20))

Setting lambda=0 applies a Box-Cox log transformation, so the network is fitted to the logged series and the back-transformed forecasts are guaranteed to be positive.

Outline
1 Vector autoregressions
2 Neural network models
3 Time series with complex seasonality

Examples
[Figure: US finished motor gasoline products, in thousands of barrels per day, plotted weekly.]

Examples
[Figure: Number of call arrivals to a large American bank (7am-9pm), at 5-minute intervals, from early March to mid May.]

Examples
[Figure: Turkish electricity demand in GW, plotted daily.]

TBATS model
TBATS:
  Trigonometric terms for seasonality
  Box-Cox transformations for heterogeneity
  ARMA errors for short-term dynamics
  Trend (possibly damped)
  Seasonal (including multiple and non-integer periods)

TBATS model

y_t = observation at time t

Box-Cox transformation:
  y_t^{(\omega)} = \begin{cases} (y_t^{\omega} - 1)/\omega & \text{if } \omega \neq 0; \\ \log y_t & \text{if } \omega = 0. \end{cases}

Measurement equation (M seasonal periods):
  y_t^{(\omega)} = \ell_{t-1} + \phi b_{t-1} + \sum_{i=1}^{M} s_{t-m_i}^{(i)} + d_t

Global and local trend:
  \ell_t = \ell_{t-1} + \phi b_{t-1} + \alpha d_t
  b_t = (1-\phi) b + \phi b_{t-1} + \beta d_t

ARMA(p, q) error:
  d_t = \sum_{i=1}^{p} \phi_i d_{t-i} + \sum_{j=1}^{q} \theta_j \varepsilon_{t-j} + \varepsilon_t

Fourier-like (trigonometric) seasonal terms:
  s_t^{(i)} = \sum_{j=1}^{k_i} s_{j,t}^{(i)}
  s_{j,t}^{(i)} = s_{j,t-1}^{(i)} \cos \lambda_j^{(i)} + s_{j,t-1}^{*(i)} \sin \lambda_j^{(i)} + \gamma_1^{(i)} d_t
  s_{j,t}^{*(i)} = -s_{j,t-1}^{(i)} \sin \lambda_j^{(i)} + s_{j,t-1}^{*(i)} \cos \lambda_j^{(i)} + \gamma_2^{(i)} d_t

The acronym TBATS labels these components: Trigonometric seasonality, Box-Cox transformation, ARMA errors, Trend, and Seasonal components.

Examples
[Figure: Forecasts from TBATS(0.999, {2,2}, 1, {< ,8>}) for US gasoline production, in thousands of barrels per day, by week.]
fit <- tbats(gasoline)
fcast <- forecast(fit)
plot(fcast)

Examples
[Figure: Forecasts from TBATS(1, {3,1}, 0.987, {<169,5>, <845,3>}) for the call-centre arrivals series, at 5-minute intervals, March to June.]
fit <- tbats(callcentre)
fcast <- forecast(fit)
plot(fcast)

Examples
[Figure: Forecasts from TBATS(0, {5,3}, 0.997, {<7,3>, <354.37,12>, <365.25,4>}) for Turkish electricity demand (GW), by day.]
fit <- tbats(turk)
fcast <- forecast(fit)
plot(fcast)
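A minimal R sketch (not from the slides) of the same workflow on another multiple-seasonality series: taylor (half-hourly UK electricity demand) ships with the forecast package, and msts() declares its daily (48) and weekly (336) periods before tbats() is called.

library(forecast)
y <- msts(taylor, seasonal.periods=c(48, 336))   # half-hourly data: daily and weekly seasonality
fit <- tbats(y)          # can take several minutes on a long series
fcast <- forecast(fit)
plot(fcast)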
