Decision 411: Class 9. HW#3 issues


Decision 411: Class 9

Presentation/discussion of HW#3
Introduction to ARIMA models
Rules for fitting nonseasonal models
Differencing and stationarity
"Reading the tea leaves": ACF and PACF plots
Unit roots
Estimation issues
Example: the UNITS series

HW#3 issues

Data transformations needed?
How to model seasonality?
Outliers?
Lagged effects?
Comparison of the economic impact of promotion variables (sums of coefficients)
Forecast for the next period

Transformations?

If the dependent variable has only a slight trend, it may be hard to tell whether growth is exponential or seasonality is multiplicative: a log transformation may be unnecessary.
Effects of promotion variables on sales are probably additive if promotional activities are carried out on a per-package basis (rather than through mass media).
Variables cannot be logged if they contain zeroes.

Seasonal adjustment?

A priori seasonal adjustment of the dependent variable in a regression model may be problematic due to effects of independent variables (seasonal adjustment may distort the effects of other variables if those effects do not naturally vary with the season).
Seasonal adjustment can be performed inside the regression model by using an externally supplied seasonal index, and perhaps also the product of the seasonal index and the time index, as a separate regressor (or else by using seasonal dummies).
If the dependent variable is transformed via seasonal adjustment or logging, it will be necessary to untransform the final forecasts. Economic interpretations of coefficients are also affected.

Outliers?

In general, outliers in data need to be carefully considered for their effects on model estimation and prediction: they should not be removed high-handedly.
Outliers in random variables may or may not be expected to occur again in the future.
Outliers in decision variables may be the most informative data points (experiments!).
Do effects remain linear for large values of independent variables? (Residual plots may shed light on this.)

Lagged effects?

When fitting regression models to time series data, cross-correlation plots may help to reveal lags of independent variables that are useful regressors.
However, lack of significant cross-correlations does not prove that lagged variables won't be helpful in the context of a multiple regression model, where other variables are also present.
If there are a priori reasons for believing that an independent variable may have lagged effects, then those lags probably should be included in a preliminary "all likely suspects" model.
Usually only low-order lags (e.g., lag 1 and lag 2, in addition to lag 0) are important.

ARIMA models

Auto-Regressive Integrated Moving Average models:
Are an adaptation of discrete-time filtering methods developed in the 1930s and 1940s by electrical engineers (Norbert Wiener et al.).
Statisticians George Box and Gwilym Jenkins developed systematic methods for applying them to business & economic data in the 1970s (hence the name "Box-Jenkins models").

ARIMA models are...

Generalized random walk models fine-tuned to eliminate all residual autocorrelation.
Generalized exponential smoothing models that can incorporate long-term trends and seasonality.
Stationarized regression models that use lags of the dependent variables and/or lags of the forecast errors as regressors.
The most general class of forecasting models for time series that can be stationarized* by transformations such as differencing, logging, and/or deflating.

* A time series is stationary if all of its statistical properties (mean, variance, autocorrelations, etc.) are constant in time. Thus, it has no trend, no heteroscedasticity, and a constant degree of "wiggliness".

Why use ARIMA models?

ARIMA model-fitting tools allow you to systematically explore the autocorrelation pattern in a time series in order to pick the optimal statistical model, assuming there really is a stable pattern in the data!
ARIMA options let you fine-tune model types such as RW, SES, and multiple regression (add trend to SES, add an autocorrelation correction to RW or multiple regression).

Construction of an ARIMA model

1. Stationarize the series, if necessary, by differencing (& perhaps also logging, deflating, etc.).
2. Regress the stationarized series on lags of itself and/or lags of the forecast errors as needed to remove all traces of autocorrelation from the residuals.

What ARIMA stands for

A series which needs to be differenced to be made stationary is an "integrated" (I) series.
Lags of the stationarized series are called auto-regressive (AR) terms.
Lags of the forecast errors are called moving average (MA) terms.

ARIMA terminology

A non-seasonal ARIMA model can be (almost) completely summarized by three numbers:
p = the number of autoregressive terms
d = the number of nonseasonal differences
q = the number of moving-average terms
This is called an ARIMA(p,d,q) model. The model may also include a constant term (or not).

ARIMA models we've already met

ARIMA(0,0,0)+c = mean (constant) model
ARIMA(0,1,0) = RW model
ARIMA(0,1,0)+c = RW with drift model
ARIMA(1,0,0)+c = 1st-order AR model
ARIMA(1,1,0)+c = differenced 1st-order AR model
ARIMA(0,1,1) = SES model
ARIMA(0,1,1)+c = SES + trend = RW + correction for lag-1 autocorrelation
ARIMA(1,1,2) = LES w/ damped trend (leveling off)
ARIMA(0,2,2) = generalized LES (including Holt's)

The ARIMA "filtering box": a time series goes in, the knobs are p, d, and q, and out come the signal (forecasts) and the noise (residuals). Objective: adjust the knobs until the residuals are white noise (uncorrelated).

ARIMA forecasting equation

Let Y denote the original series.
Let y denote the differenced (stationarized) series:
No difference (d=0): y_t = Y_t
First difference (d=1): y_t = Y_t − Y_t-1
Second difference (d=2): y_t = (Y_t − Y_t-1) − (Y_t-1 − Y_t-2) = Y_t − 2Y_t-1 + Y_t-2

Forecasting equation for y:

ŷ_t = μ + φ_1 y_t-1 + ... + φ_p y_t-p − θ_1 e_t-1 − ... − θ_q e_t-q

i.e., a constant, plus AR terms (lagged values of y), plus MA terms (lagged errors). By convention, the AR terms are "+" and the MA terms are "−".
Not as bad as it looks! Usually p+q ≤ 2 and either p=0 or q=0 (pure AR or pure MA model).
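The forecasting equation above can be evaluated directly by hand. The sketch below is purely illustrative (the function name and the lag-list convention are mine, not part of the course software):

```python
# Illustrative sketch of the non-seasonal ARIMA forecasting equation:
#   yhat_t = mu + phi_1*y_{t-1} + ... + phi_p*y_{t-p}
#               - theta_1*e_{t-1} - ... - theta_q*e_{t-q}
def arima_one_step(mu, phi, theta, y_lags, e_lags):
    """One-step forecast of the stationarized series y.
    y_lags[0] is y_{t-1}, e_lags[0] is e_{t-1} (a convention chosen here)."""
    ar_part = sum(p * y for p, y in zip(phi, y_lags))   # + AR terms
    ma_part = sum(t * e for t, e in zip(theta, e_lags))  # - MA terms
    return mu + ar_part - ma_part

# Example with made-up numbers: mu=0.5, phi_1=0.8, theta_1=0.3
print(round(arima_one_step(0.5, [0.8], [0.3], [2.0], [0.1]), 2))  # 2.07
```

Note how the sign convention matches the slide: AR coefficients enter with a plus sign and MA coefficients are subtracted.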

Undifferencing the forecast

The differencing (if any) must be reversed to obtain a forecast for the original series:
If d=0: Ŷ_t = ŷ_t
If d=1: Ŷ_t = ŷ_t + Y_t-1
If d=2: Ŷ_t = ŷ_t + 2Y_t-1 − Y_t-2
Fortunately, your software will do all of this automatically!

Why do you need both AR and MA terms?

In general, you don't: usually it suffices to use only one type or the other.
Some series are better fitted by AR terms, others are better fitted by MA terms (at a given level of differencing).
Rough rule of thumb: if the stationarized series has positive autocorrelation at lag 1, AR terms often work best. If it has negative autocorrelation at lag 1, MA terms often work best.
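The three undifferencing cases above can be written as one small helper; this is an illustrative sketch (your software performs this step internally):

```python
def undifference(y_hat, d, y_prev, y_prev2=None):
    """Reverse d orders of differencing to forecast the original series Y.
    y_prev is Y_{t-1}; y_prev2 is Y_{t-2} (needed only when d=2)."""
    if d == 0:
        return y_hat                          # Yhat_t = yhat_t
    if d == 1:
        return y_hat + y_prev                 # Yhat_t = yhat_t + Y_{t-1}
    if d == 2:
        return y_hat + 2 * y_prev - y_prev2   # Yhat_t = yhat_t + 2Y_{t-1} - Y_{t-2}
    raise ValueError("only d <= 2 handled in this sketch")

print(round(undifference(0.4, 1, 10.0), 2))       # 10.4
print(round(undifference(0.0, 2, 10.0, 9.5), 2))  # 10.5
```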

Interpretation of AR terms

A series displays autoregressive (AR) behavior if it apparently feels a "restoring force" that tends to pull it back toward its mean.
In an AR(1) model, the AR(1) coefficient determines how fast the series tends to return to its mean. If the coefficient is near zero, the series returns to its mean quickly; if the coefficient is near 1, the series returns to its mean slowly.
In a model with 2 or more AR coefficients, the sum of the coefficients determines the speed of mean reversion, and the series may also show an oscillatory pattern.

Interpretation of MA terms

A series displays moving-average (MA) behavior if it apparently undergoes random "shocks" whose effects are felt in two or more consecutive periods.
The MA(1) coefficient is (minus) the fraction of last period's shock that is still felt in the current period.
The MA(2) coefficient, if any, is (minus) the fraction of the shock two periods ago that is still felt in the current period, and so on.
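The speed of AR(1) mean reversion can be made concrete: in expectation, the deviation from the mean shrinks by a factor of φ each period. A hypothetical sketch (function name is mine):

```python
def ar1_expected_path(phi, deviation, steps):
    """Expected deviation from the mean of an AR(1) series over time:
    each period the deviation shrinks by the factor phi (in expectation)."""
    path = [deviation]
    for _ in range(steps):
        path.append(phi * path[-1])
    return path

# phi near 1: slow return to the mean
print([round(v, 3) for v in ar1_expected_path(0.9, 1.0, 3)])  # [1.0, 0.9, 0.81, 0.729]
# phi near 0: fast return to the mean
print([round(v, 3) for v in ar1_expected_path(0.1, 1.0, 3)])  # [1.0, 0.1, 0.01, 0.001]
```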

Tools for identifying ARIMA models: ACF and PACF plots

The autocorrelation function (ACF) plot shows the correlation of the series with itself at different lags:
The autocorrelation of Y at lag k is the correlation between Y and LAG(Y,k).
The partial autocorrelation function (PACF) plot shows the amount of autocorrelation at lag k that is not explained by lower-order autocorrelations:
The partial autocorrelation at lag k is the coefficient of LAG(Y,k) in an AR(k) model, i.e., in a regression of Y on LAG(Y,1), LAG(Y,2), up to LAG(Y,k).

AR and MA "signatures"

ACF that dies out gradually and PACF that cuts off sharply after a few lags: AR signature. An AR series is usually positively autocorrelated at lag 1 (or even borderline nonstationary).
ACF that cuts off sharply after a few lags and PACF that dies out more gradually: MA signature. An MA series is usually negatively autocorrelated at lag 1 (or even mildly overdifferenced).
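The ACF definition above can be computed directly with the standard sample-autocorrelation formula; this is an illustrative pure-Python sketch, not the plotting tool the course uses:

```python
def acf(y, k):
    """Sample autocorrelation of y at lag k: the correlation of y with LAG(y, k),
    using the usual convention of a common mean and variance for both copies."""
    n = len(y)
    mean = sum(y) / n
    num = sum((y[t] - mean) * (y[t - k] - mean) for t in range(k, n))
    den = sum((v - mean) ** 2 for v in y)
    return num / den

# A strongly trended series has lag-1 autocorrelation close to 1,
# the classic sign that differencing is needed
series = [float(t) for t in range(50)]
print(round(acf(series, 1), 3))  # 0.94
```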

AR signature: mean-reverting behavior, slow decay in ACF (usually positive at lag 1), sharp cutoff after a few lags in PACF. Here the signature is AR(2) because of 2 spikes in the PACF.

MA signature: noisy pattern, sharp cutoff in ACF (usually negative at lag 1), gradual decay in PACF. Here the signature is MA(1) because of 1 spike in the ACF.

AR or MA? It depends!

Whether a series displays AR or MA behavior often depends on the extent to which it has been differenced:
An "underdifferenced" series has an AR signature (positive autocorrelation).
After one or more orders of differencing, the autocorrelation will become more negative and an MA signature will emerge.

The autocorrelation "spectrum"

Positive autocorrelation <--------------------------------> Negative autocorrelation
Nonstationary -- Auto-Regressive -- White Noise -- Moving-Average -- Overdifferenced
Adding a difference (add DIFF) moves you to the right; removing a difference (remove DIFF) moves you to the left. Adding AR terms (from the positive side) or MA terms (from the negative side) moves you toward the center: white noise.

Model-fitting steps

1. Determine the order of differencing.
2. Determine the numbers of AR & MA terms.
3. Fit the model and check: see whether the residuals are white noise, the highest-order coefficients are significant (with no "unit roots"), and the forecasts look reasonable. If not, return to step 1 or 2.
In other words, move right or left in the autocorrelation "spectrum" by appropriate choices of differencing and AR/MA terms, until you reach the center (white noise).

1. Determine the order of differencing

If the series apparently (or logically) has a stable long-term mean, then perhaps no differencing is needed.
If the series has a consistent trend, you MUST use at least one order of differencing (or else include a trended regressor); otherwise the long-term forecasts will revert to the mean of the historical sample.
If two orders of differencing are used, you should suppress the constant (otherwise you will get a quadratic trend in the forecasts).

How much differencing is needed?

Usually the best order of differencing is the least amount needed to stationarize the series.
When in doubt, you can also try the next higher order of differencing & compare models.
If the lag-1 autocorrelation is already zero or negative, you don't need more differencing.
Beware of overdifferencing: if there is more autocorrelation and/or higher variance after differencing, you may have gone too far!

2. Determine # of AR & MA terms

After differencing, inspect the ACF and PACF at low-order lags (1, 2, 3, ...):
Positive decay pattern in ACF at lags 1, 2, 3, ... and a single* positive spike in PACF at lag 1: AR=1.
Negative spike in ACF at lag 1 and a negative decay pattern in PACF at lags 1, 2, 3, ...: MA=1.
MA terms often work well with differences: ARIMA(0,1,1) = SES, ARIMA(0,2,2) = LES.
*Two positive spikes in PACF: AR=2; two negative spikes in ACF: MA=2; etc.
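One quick way to apply the variance warning above is to compare the standard deviation of the series after 0, 1, and 2 differences. A hypothetical sketch (helper name is mine):

```python
def sd_after_diff(y, d):
    """Sample standard deviation of y after d orders of differencing."""
    for _ in range(d):
        y = [b - a for a, b in zip(y, y[1:])]  # one order of differencing
    m = len(y)
    mean = sum(y) / m
    return (sum((v - mean) ** 2 for v in y) / (m - 1)) ** 0.5

# A straight trend line: one difference makes it a constant (sd drops to 0);
# a second difference adds nothing, so d=1 is the least amount needed
trend = [2.0 * t for t in range(20)]
print([round(sd_after_diff(trend, d), 2) for d in (0, 1, 2)])  # [11.83, 0.0, 0.0]
```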

3. Fit the model & check

Residual time series plot looks stationary?
No residual autocorrelation? (Or is there a signature of AR or MA terms that still need to be added?)
Highest-order coefficient(s) significant?
Sum of coefficients of the same type < 1?

UNITS series: begin with an ARIMA(0,0,0)+c model, i.e., the constant model. Residuals = original data with the mean subtracted, showing a strong upward trend, lag-1 autocorrelation close to 1.0, and a slowly declining pattern in the ACF: differencing is obviously needed.

After 1 nonseasonal difference: ARIMA(0,1,0)+c (random walk with drift) model. The lag-1 residual autocorrelation is still positive, the ACF still declines slowly, and there are 2 technically significant spikes in the PACF. Is it stationary (e.g., mean-reverting)? If we think so, try adding AR=2.

After adding AR=2: ARIMA(2,1,0)+c model. No significant spikes in the ACF or PACF; the residuals appear to be stationary. The mean is the estimated constant long-term trend (0.416 per period). The AR(2) coefficient is significant (t=2.49). The AR coefficients sum to less than 1.0 (no "unit root", i.e., not seriously underdifferenced). White noise std. dev. (RMSE) = 1.35, versus 1.44 for RW.

What's the difference between MEAN and CONSTANT in the ARIMA output?

MEAN = mean of the stationarized series.
If no differencing was used, this is just the mean of the original series. If one order of differencing was used, this is the long-term average trend.
CONSTANT = the intercept term (μ) in the forecasting equation.
Connection (a mathematical identity): CONSTANT = MEAN × (1 − sum of AR coefficients).

Estimated white noise standard deviation?

This is just the RMSE of the model in transformed units, adjusted for the # of coefficients estimated. I.e., it is the same statistic that is called the "standard error of the estimate" in regression.
If a natural log transform was used, it is the RMS error in percentage terms.
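The MEAN/CONSTANT identity above is a one-liner to verify; the numbers below are made-up placeholders, not values from the UNITS output:

```python
def constant_from_mean(mean, ar_coeffs):
    """The identity quoted above: CONSTANT = MEAN * (1 - sum of AR coefficients)."""
    return mean * (1 - sum(ar_coeffs))

# Hypothetical example: mean 10 with AR coefficients 0.3 and 0.2
print(round(constant_from_mean(10.0, [0.3, 0.2]), 2))  # 5.0
# With no AR terms, CONSTANT and MEAN coincide
print(round(constant_from_mean(10.0, []), 2))          # 10.0
```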

[Time sequence plot for UNITS, ARIMA(2,1,0) with constant: actual, forecast, 95.0% limits]

The plot of data and forecasts shows that the trend in the forecasts equals the historical average trend. A good model?? What if we had gone to 2 orders of differencing instead?

With 2 nonseasonal differences and no constant: ARIMA(0,2,0) model. Residuals are definitely stationary, but now strongly negatively autocorrelated at lag 1, with a single negative spike in the ACF. Overdifferenced, or MA=1?? Note that the RMSE is larger than it was with one difference (1.69 vs. 1.44): not a good sign.

After adding MA=1: ARIMA(0,2,1) model, essentially a linear exponential smoothing model. Residuals look fine, no autocorrelation. The estimated MA(1) coefficient is only 0.75 (not dangerously close to 1.0, so not seriously overdifferenced). White noise std. dev. = 1.37, about the same as for the (2,1,0)+c model.

[Time sequence plot for UNITS, ARIMA(0,2,1): actual, forecast, 95.0% limits]

The forecast plot shows that the trend in this model is the local trend estimated at the end of the series, flatter than in the previous model. The confidence limits widen more rapidly than in the previous model because a time-varying trend is assumed, as in LES.

Equation of the first model

The ARIMA(2,1,0)+c forecasting equation is:

Ŷ_t − Y_t-1 = μ + φ_1 (Y_t-1 − Y_t-2) + φ_2 (Y_t-2 − Y_t-3)

where μ = 0.228, φ_1 = 0.250, and φ_2 = ... Thus, the predicted change is a constant plus multiples of the last two changes ("momentum").

Equation of the second model

The ARIMA(0,2,1) forecasting equation is:

Ŷ_t − Y_t-1 = (Y_t-1 − Y_t-2) − θ_1 e_t-1

where the MA(1) coefficient is θ_1 = 0.75. Thus, the predicted change equals the last change minus a multiple of the last forecast error.
An ARIMA(0,2,1) model is almost the same as Brown's LES model, with the correspondence α ≈ 1 − ½θ_1. Hence this model is similar to an LES model with α ≈ 0.625.
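The ARIMA(2,1,0)+c equation can be turned directly into a one-step forecast. The sketch below uses μ and φ_1 from the slide, but the φ_2 value and the three Y observations are placeholders of my own, since the actual φ_2 is truncated in the transcription:

```python
def forecast_210c(y3, y2, y1, mu, phi1, phi2):
    """One-step ARIMA(2,1,0)+c forecast: the predicted change is a constant
    plus multiples of the last two observed changes ("momentum").
    y1 is Y_{t-1}, y2 is Y_{t-2}, y3 is Y_{t-3}."""
    change = mu + phi1 * (y1 - y2) + phi2 * (y2 - y3)
    return y1 + change

# mu and phi1 from the slide; phi2 = 0.20 and the Y values are hypothetical
print(round(forecast_210c(97.0, 98.0, 99.5, 0.228, 0.250, 0.20), 3))  # 100.303
```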

ARIMA(0,2,2) vs. LES

General ARIMA(0,2,2) model (w/o constant):

Ŷ_t − Y_t-1 = (Y_t-1 − Y_t-2) − θ_1 e_t-1 − θ_2 e_t-2

Brown's LES is exactly equivalent to (0,2,2) with θ_1 = 2 − 2α, θ_2 = −(1 − α)².
Holt's LES is exactly equivalent to (0,2,2) with θ_1 = 2 − α − αβ, θ_2 = −(1 − α).
The ARIMA model is somewhat more general because it allows α and β outside (0,1).

Model comparison shows that the two best ARIMA models (B and D) perform about equally well for one-period-ahead forecasts. They differ mainly in their assumptions about trend and the width of confidence limits for longer-term forecasts. An LES model has been included for comparison; it is essentially the same as ARIMA(0,2,1). Note that α ≈ 0.625.
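The smoothing-to-ARIMA correspondences above are simple formulas; a sketch (function names are mine) maps the smoothing parameters to the MA coefficients:

```python
def brown_thetas(alpha):
    """Brown's LES as ARIMA(0,2,2): theta1 = 2 - 2*alpha, theta2 = -(1-alpha)^2."""
    return 2 - 2 * alpha, -((1 - alpha) ** 2)

def holt_thetas(alpha, beta):
    """Holt's LES as ARIMA(0,2,2): theta1 = 2 - alpha - alpha*beta,
    theta2 = -(1 - alpha)."""
    return 2 - alpha - alpha * beta, -(1 - alpha)

print(tuple(round(v, 2) for v in brown_thetas(0.3)))      # (1.4, -0.49)
print(tuple(round(v, 2) for v in holt_thetas(0.3, 0.1)))  # (1.67, -0.7)
```

Notice that Brown's correspondence implies α ≈ 1 − ½θ_1 when θ_2 is small, matching the ARIMA(0,2,1) approximation on the previous slide.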

Estimation of ARIMA models

Like SES and LES models, ARIMA models face a "start-up" problem at the beginning of the series: prior lagged values are needed.
Naïve approach:
AR models can be initialized by assuming that prior values of the differenced series are equal to the mean.
MA models can be initialized by assuming that prior errors were zero.

Backforecasting

A better approach: use "backforecasting" to estimate the prior values of the differenced series and the forecast errors.
The key idea: a stationary time series looks the same (statistically speaking) whether it is traversed forward or backward in time. Hence the same model that forecasts the future of a series can also forecast its past.

How backforecasting works

On each iteration of the estimation algorithm, the algorithm first makes a backward pass through the series and forecasts into the past until the forecasts decay to the mean. Then it turns around and starts forecasting in the forward direction.
Squared-error calculations are based on all the errors in the forward pass, even those in the "prior" period.
Theoretically, this yields the maximally efficient estimates of the coefficients.

Backforecasting caveats

If the model is badly mis-specified (e.g., under- or over-differenced, or over-fitted with too many AR or MA terms), backforecasting may drive the coefficient estimates to perverse values and/or yield too-optimistic error statistics.
If in doubt, try estimating the model without backforecasting until you are fairly sure it is properly identified. Turning on backforecasting should yield slight changes in coefficients & improved error stats, but not a radical change.

Backforecasting is an estimation option for ARIMA models, as well as for SES, LES, Holt, and Winters models. (The stopping criteria are thresholds for the change in the residual sum of squares and in the parameter estimates below which the estimation will stop.)

Number of iterations

ARIMA models are estimated by an iterative, nonlinear optimization algorithm (like Solver in Excel). The precision of the estimation is controlled by two stopping criteria (normally you don't need to change these).
A large number of iterations (≥10) indicates that the optimal coefficients were hard to find, possibly a problem with the model such as redundant AR/MA parameters or a unit root.

What is a unit root?

Suppose that a time series has the true "equation of motion":

Y_t = φ Y_t-1 + ε_t

...where ε_t is a stationary process (mean-reverting, constant variance & autocorrelation, etc.).
If the true value of φ is 1, then Y is said to have an (autoregressive) "unit root". Its true equation is then:

Y_t − Y_t-1 = ε_t

...which is just a random walk (perhaps with drift and correlated steps).

Why are unit roots important?

If a series has an AR unit root, then a jump (discontinuity) in the series permanently raises its expected future level, as in a random walk.
If it doesn't have an AR unit root, then it eventually reverts to a long-term mean or trend line after a jump.
A series that has a unit root is much less predictable at long forecast horizons than one that does not.
It can be hard to tell whether a trended series has a unit root or whether it is reverting to a trend line.

Unit root tests

A crude unit root test is to fit an AR model and see whether the AR coefficients sum to almost exactly 1.
A more sensitive test is to regress DIFF(Y) on LAG(Y,1), i.e., fit the model

Y_t − Y_t-1 = α + β Y_t-1 + ε_t

and test whether β is significantly different from zero, using a one-sided t-test with 1.75 times the usual critical value (the "Dickey-Fuller test"). If not, then you can't reject the unit root hypothesis.
Fancier unit root and cointegration tests regress DIFF(Y) on LAG(Y,1) plus a bunch of lags of DIFF(Y) (the "augmented DF test") and/or other variables (e.g., a time trend), with more complicated critical values.

Does UNITS have a unit root?

With (effectively infinite) degrees of freedom for error, the usual critical t value (.05, 1-tailed) is 1.645, so the critical Dickey-Fuller value (= t × 1.75) is about 2.88. By the DF test, we can't reject the unit root hypothesis for the UNITS series.
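The crude DF regression described above is just an OLS fit of DIFF(Y) on LAG(Y,1). This pure-Python sketch (function name and the simulated series are mine, and it omits the augmented-DF refinements) returns the t-statistic of β:

```python
import random

def dickey_fuller_stat(y):
    """Regress DIFF(Y) on LAG(Y,1) (with intercept) and return the
    t-statistic of the slope, to be compared with 1.75x the usual
    one-sided critical value, as in the crude DF test above."""
    x = y[:-1]                                  # LAG(Y,1)
    dy = [b - a for a, b in zip(y, y[1:])]      # DIFF(Y)
    n = len(dy)
    mx, my = sum(x) / n, sum(dy) / n
    sxx = sum((v - mx) ** 2 for v in x)
    sxy = sum((a - mx) * (b - my) for a, b in zip(x, dy))
    beta = sxy / sxx
    alpha = my - beta * mx
    resid = [b - (alpha + beta * a) for a, b in zip(x, dy)]
    s2 = sum(r * r for r in resid) / (n - 2)    # residual variance
    return beta / (s2 / sxx) ** 0.5

# A strongly mean-reverting AR(1) series: t should be large and negative,
# rejecting the unit-root hypothesis
random.seed(0)
y = [0.0]
for _ in range(200):
    y.append(0.2 * y[-1] + random.gauss(0, 1))
print(dickey_fuller_stat(y) < -1.75 * 1.645)  # prints True for this example
```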

But we can reject the unit root hypothesis for DIFF(UNITS), suggesting that 1 order of differencing is sufficient.

Practical implications (for us)

In regression analysis, if both Y and X have unit roots (i.e., are random walks), then the regression of Y on X may yield a spurious estimate of the relation of Y to X (better to regress DIFF(Y) on DIFF(X) in that case).
In ARIMA analysis, if a unit root exists on either the AR or the MA side of the model, the model can be simplified by using a higher or lower order of differencing.
This brings us to...

Over- or under-differencing

What if we had identified a grossly wrong order of differencing? For example, suppose we hadn't differenced the UNITS series at all, or suppose we had differenced it more than twice?

Here is the original UNITS series again. If we ignore the obvious trend in the time series plot, we see that the ACF and PACF display a (suspiciously!) strong AR(1) signature. Suppose we just add an AR(1) term...

The estimated AR(1) coefficient is essentially 1.0, which is equivalent to an order of differencing. An AR factor mimics an order of differencing when the sum of the AR coefficients is 1.0 (a sign of a unit root), so the model is really telling us that a higher order of differencing is needed.

Adding an AR(2) term does not help. Now the sum of the AR coefficients is almost exactly equal to 1.0. Again, this means that there is an AR unit root: a higher order of differencing is needed.

On the other hand, suppose we grossly overdifferenced the series. Here 3 orders of differencing have been used (not recommended!). The series now displays a suspiciously strong MA(1) or MA(2) signature.

If we add an MA(1) term, its estimated coefficient is essentially 1.0, which is exactly canceling one order of differencing. An MA factor cancels an order of differencing if the sum of the MA coefficients is 1.0 (indicating an MA unit root, also called a "non-invertible" model).

Adding an MA(2) term does not really improve the results, and the sum of the MA coefficients is now almost exactly equal to 1.0. Again, this tells us less differencing was needed.

Conclusions

If the series is grossly under- or over-differenced, you will see what appears to be a suspiciously strong AR or MA signature. This is not good!!
The correct order of differencing usually reduces rather than increases the patterns in the ACF and PACF.
When coefficients are estimated, if the coefficients of the same type sum to almost exactly 1.0, this indicates that a higher or lower order of differencing should have been used.

The rules of the ARIMA game

The preceding guidelines for fitting ARIMA models to data can be summed up in the following rules.

ARIMA rules: differencing

Rule 1: If the series has positive autocorrelations out to a high number of lags and/or a non-zero trend, then it probably needs a higher order of differencing.
Rule 2: If the lag-1 autocorrelation is zero or negative, the series does not need a higher order of differencing. If the lag-1 autocorrelation is -0.5 or more negative, the series may already be overdifferenced.
Rule 3: The optimal order of differencing is often (not always, but often) the order of differencing at which the standard deviation is lowest.

ARIMA rules: differencing, continued

Rule 4:
(a) A model with no orders of differencing assumes that the original series is stationary (among other things, mean-reverting).
(b) A model with one order of differencing assumes that the original series has a constant average trend (e.g., a random walk or SES-type model, with or without drift).
(c) A model with two orders of total differencing assumes that the original series has a time-varying trend (e.g., an LES-type model).

ARIMA rules: constant term or not?

Rule 5:
(a) A model with no orders of differencing normally includes a constant term. The estimated mean is the average level.
(b) In a model with one order of differencing, a constant should be included if there is a long-term trend. The estimated mean is the average trend.
(c) A model with two orders of total differencing normally does not include a constant term.

ARIMA rules: AR signature

Rule 6: If the partial autocorrelation function (PACF) of the differenced series displays a sharp cutoff and/or the lag-1 autocorrelation is positive, i.e., if the series appears slightly "underdifferenced", then consider adding one or more AR terms to the model. The lag beyond which the PACF cuts off is the indicated number of AR terms.

ARIMA rules: MA signature

Rule 7: If the autocorrelation function (ACF) of the differenced series displays a sharp cutoff and/or the lag-1 autocorrelation is negative, i.e., if the series appears slightly "overdifferenced", then consider adding an MA term to the model. The lag beyond which the ACF cuts off is the indicated number of MA terms.

ARIMA rules: residual signature

Rule 8: If an important AR or MA term has been omitted, its signature will usually show up in the residuals. For example, if an AR(1) term (only) has been included in the model, and the residual PACF clearly shows a spike at lag 2, this suggests that an AR(2) term may also be needed.

ARIMA rules: cancellation

Rule 9: It is possible for an AR term and an MA term to cancel each other's effects, so if a mixed AR-MA model seems to fit the data, also try a model with one fewer AR term and one fewer MA term, particularly if the parameter estimates in the original model need more than 10 iterations to converge.

ARIMA rules: unit roots

Rule 10: If there is a unit root in the AR part of the model, i.e., if the sum of the AR coefficients is almost exactly 1, you should try reducing the number of AR terms by one and increasing the order of differencing by one.
Rule 11: If there is a unit root in the MA part of the model, i.e., if the sum of the MA coefficients is almost exactly 1, you should try reducing the number of MA terms by one and reducing the order of differencing by one.
Rule 12: If the long-term forecasts appear erratic or unstable, and/or the model takes 10 or more iterations to estimate, there may be a unit root in the AR or MA coefficients.


More information

ARIMA Models. Jamie Monogan. January 25, University of Georgia. Jamie Monogan (UGA) ARIMA Models January 25, / 38

ARIMA Models. Jamie Monogan. January 25, University of Georgia. Jamie Monogan (UGA) ARIMA Models January 25, / 38 ARIMA Models Jamie Monogan University of Georgia January 25, 2012 Jamie Monogan (UGA) ARIMA Models January 25, 2012 1 / 38 Objectives By the end of this meeting, participants should be able to: Describe

More information

FORECASTING SUGARCANE PRODUCTION IN INDIA WITH ARIMA MODEL

FORECASTING SUGARCANE PRODUCTION IN INDIA WITH ARIMA MODEL FORECASTING SUGARCANE PRODUCTION IN INDIA WITH ARIMA MODEL B. N. MANDAL Abstract: Yearly sugarcane production data for the period of - to - of India were analyzed by time-series methods. Autocorrelation

More information

MODELING INFLATION RATES IN NIGERIA: BOX-JENKINS APPROACH. I. U. Moffat and A. E. David Department of Mathematics & Statistics, University of Uyo, Uyo

MODELING INFLATION RATES IN NIGERIA: BOX-JENKINS APPROACH. I. U. Moffat and A. E. David Department of Mathematics & Statistics, University of Uyo, Uyo Vol.4, No.2, pp.2-27, April 216 MODELING INFLATION RATES IN NIGERIA: BOX-JENKINS APPROACH I. U. Moffat and A. E. David Department of Mathematics & Statistics, University of Uyo, Uyo ABSTRACT: This study

More information

Forecasting. Simon Shaw 2005/06 Semester II

Forecasting. Simon Shaw 2005/06 Semester II Forecasting Simon Shaw s.c.shaw@maths.bath.ac.uk 2005/06 Semester II 1 Introduction A critical aspect of managing any business is planning for the future. events is called forecasting. Predicting future

More information

Ch 6. Model Specification. Time Series Analysis

Ch 6. Model Specification. Time Series Analysis We start to build ARIMA(p,d,q) models. The subjects include: 1 how to determine p, d, q for a given series (Chapter 6); 2 how to estimate the parameters (φ s and θ s) of a specific ARIMA(p,d,q) model (Chapter

More information

Volatility. Gerald P. Dwyer. February Clemson University

Volatility. Gerald P. Dwyer. February Clemson University Volatility Gerald P. Dwyer Clemson University February 2016 Outline 1 Volatility Characteristics of Time Series Heteroskedasticity Simpler Estimation Strategies Exponentially Weighted Moving Average Use

More information

Minitab Project Report - Assignment 6

Minitab Project Report - Assignment 6 .. Sunspot data Minitab Project Report - Assignment Time Series Plot of y Time Series Plot of X y X 7 9 7 9 The data have a wavy pattern. However, they do not show any seasonality. There seem to be an

More information

Univariate linear models

Univariate linear models Univariate linear models The specification process of an univariate ARIMA model is based on the theoretical properties of the different processes and it is also important the observation and interpretation

More information

2. An Introduction to Moving Average Models and ARMA Models

2. An Introduction to Moving Average Models and ARMA Models . An Introduction to Moving Average Models and ARMA Models.1 White Noise. The MA(1) model.3 The MA(q) model..4 Estimation and forecasting of MA models..5 ARMA(p,q) models. The Moving Average (MA) models

More information

7 Introduction to Time Series Time Series vs. Cross-Sectional Data Detrending Time Series... 15

7 Introduction to Time Series Time Series vs. Cross-Sectional Data Detrending Time Series... 15 Econ 495 - Econometric Review 1 Contents 7 Introduction to Time Series 3 7.1 Time Series vs. Cross-Sectional Data............ 3 7.2 Detrending Time Series................... 15 7.3 Types of Stochastic

More information

SOME BASICS OF TIME-SERIES ANALYSIS

SOME BASICS OF TIME-SERIES ANALYSIS SOME BASICS OF TIME-SERIES ANALYSIS John E. Floyd University of Toronto December 8, 26 An excellent place to learn about time series analysis is from Walter Enders textbook. For a basic understanding of

More information

Lecture 5: Unit Roots, Cointegration and Error Correction Models The Spurious Regression Problem

Lecture 5: Unit Roots, Cointegration and Error Correction Models The Spurious Regression Problem Lecture 5: Unit Roots, Cointegration and Error Correction Models The Spurious Regression Problem Prof. Massimo Guidolin 20192 Financial Econometrics Winter/Spring 2018 Overview Stochastic vs. deterministic

More information

7 Introduction to Time Series

7 Introduction to Time Series Econ 495 - Econometric Review 1 7 Introduction to Time Series 7.1 Time Series vs. Cross-Sectional Data Time series data has a temporal ordering, unlike cross-section data, we will need to changes some

More information

ECON/FIN 250: Forecasting in Finance and Economics: Section 7: Unit Roots & Dickey-Fuller Tests

ECON/FIN 250: Forecasting in Finance and Economics: Section 7: Unit Roots & Dickey-Fuller Tests ECON/FIN 250: Forecasting in Finance and Economics: Section 7: Unit Roots & Dickey-Fuller Tests Patrick Herb Brandeis University Spring 2016 Patrick Herb (Brandeis University) Unit Root Tests ECON/FIN

More information

Read Section 1.1, Examples of time series, on pages 1-8. These example introduce the book; you are not tested on them.

Read Section 1.1, Examples of time series, on pages 1-8. These example introduce the book; you are not tested on them. TS Module 1 Time series overview (The attached PDF file has better formatting.)! Model building! Time series plots Read Section 1.1, Examples of time series, on pages 1-8. These example introduce the book;

More information

APPLIED ECONOMETRIC TIME SERIES 4TH EDITION

APPLIED ECONOMETRIC TIME SERIES 4TH EDITION APPLIED ECONOMETRIC TIME SERIES 4TH EDITION Chapter 2: STATIONARY TIME-SERIES MODELS WALTER ENDERS, UNIVERSITY OF ALABAMA Copyright 2015 John Wiley & Sons, Inc. Section 1 STOCHASTIC DIFFERENCE EQUATION

More information

10) Time series econometrics

10) Time series econometrics 30C00200 Econometrics 10) Time series econometrics Timo Kuosmanen Professor, Ph.D. 1 Topics today Static vs. dynamic time series model Suprious regression Stationary and nonstationary time series Unit

More information

Time Series and Forecasting

Time Series and Forecasting Time Series and Forecasting Introduction to Forecasting n What is forecasting? n Primary Function is to Predict the Future using (time series related or other) data we have in hand n Why are we interested?

More information

FIN822 project 2 Project 2 contains part I and part II. (Due on November 10, 2008)

FIN822 project 2 Project 2 contains part I and part II. (Due on November 10, 2008) FIN822 project 2 Project 2 contains part I and part II. (Due on November 10, 2008) Part I Logit Model in Bankruptcy Prediction You do not believe in Altman and you decide to estimate the bankruptcy prediction

More information

Non-Stationary Time Series and Unit Root Testing

Non-Stationary Time Series and Unit Root Testing Econometrics II Non-Stationary Time Series and Unit Root Testing Morten Nyboe Tabor Course Outline: Non-Stationary Time Series and Unit Root Testing 1 Stationarity and Deviation from Stationarity Trend-Stationarity

More information

Non-Stationary Time Series and Unit Root Testing

Non-Stationary Time Series and Unit Root Testing Econometrics II Non-Stationary Time Series and Unit Root Testing Morten Nyboe Tabor Course Outline: Non-Stationary Time Series and Unit Root Testing 1 Stationarity and Deviation from Stationarity Trend-Stationarity

More information

Lecture 2: Univariate Time Series

Lecture 2: Univariate Time Series Lecture 2: Univariate Time Series Analysis: Conditional and Unconditional Densities, Stationarity, ARMA Processes Prof. Massimo Guidolin 20192 Financial Econometrics Spring/Winter 2017 Overview Motivation:

More information

Lab: Box-Jenkins Methodology - US Wholesale Price Indicator

Lab: Box-Jenkins Methodology - US Wholesale Price Indicator Lab: Box-Jenkins Methodology - US Wholesale Price Indicator In this lab we explore the Box-Jenkins methodology by applying it to a time-series data set comprising quarterly observations of the US Wholesale

More information

Estimation and application of best ARIMA model for forecasting the uranium price.

Estimation and application of best ARIMA model for forecasting the uranium price. Estimation and application of best ARIMA model for forecasting the uranium price. Medeu Amangeldi May 13, 2018 Capstone Project Superviser: Dongming Wei Second reader: Zhenisbek Assylbekov Abstract This

More information

Modeling and forecasting global mean temperature time series

Modeling and forecasting global mean temperature time series Modeling and forecasting global mean temperature time series April 22, 2018 Abstract: An ARIMA time series model was developed to analyze the yearly records of the change in global annual mean surface

More information

Problem Set 2: Box-Jenkins methodology

Problem Set 2: Box-Jenkins methodology Problem Set : Box-Jenkins methodology 1) For an AR1) process we have: γ0) = σ ε 1 φ σ ε γ0) = 1 φ Hence, For a MA1) process, p lim R = φ γ0) = 1 + θ )σ ε σ ε 1 = γ0) 1 + θ Therefore, p lim R = 1 1 1 +

More information

Nonstationary Time Series:

Nonstationary Time Series: Nonstationary Time Series: Unit Roots Egon Zakrajšek Division of Monetary Affairs Federal Reserve Board Summer School in Financial Mathematics Faculty of Mathematics & Physics University of Ljubljana September

More information

Time Series and Forecasting

Time Series and Forecasting Time Series and Forecasting Introduction to Forecasting n What is forecasting? n Primary Function is to Predict the Future using (time series related or other) data we have in hand n Why are we interested?

More information

INTRODUCTORY REGRESSION ANALYSIS

INTRODUCTORY REGRESSION ANALYSIS ;»»>? INTRODUCTORY REGRESSION ANALYSIS With Computer Application for Business and Economics Allen Webster Routledge Taylor & Francis Croup NEW YORK AND LONDON TABLE OF CONTENT IN DETAIL INTRODUCTORY REGRESSION

More information

Time Series Analysis -- An Introduction -- AMS 586

Time Series Analysis -- An Introduction -- AMS 586 Time Series Analysis -- An Introduction -- AMS 586 1 Objectives of time series analysis Data description Data interpretation Modeling Control Prediction & Forecasting 2 Time-Series Data Numerical data

More information

TESTING FOR CO-INTEGRATION

TESTING FOR CO-INTEGRATION Bo Sjö 2010-12-05 TESTING FOR CO-INTEGRATION To be used in combination with Sjö (2008) Testing for Unit Roots and Cointegration A Guide. Instructions: Use the Johansen method to test for Purchasing Power

More information

TIME SERIES ANALYSIS. Forecasting and Control. Wiley. Fifth Edition GWILYM M. JENKINS GEORGE E. P. BOX GREGORY C. REINSEL GRETA M.

TIME SERIES ANALYSIS. Forecasting and Control. Wiley. Fifth Edition GWILYM M. JENKINS GEORGE E. P. BOX GREGORY C. REINSEL GRETA M. TIME SERIES ANALYSIS Forecasting and Control Fifth Edition GEORGE E. P. BOX GWILYM M. JENKINS GREGORY C. REINSEL GRETA M. LJUNG Wiley CONTENTS PREFACE TO THE FIFTH EDITION PREFACE TO THE FOURTH EDITION

More information

Part 1. Multiple Choice (50 questions, 1 point each) Part 2. Problems/Short Answer (10 questions, 5 points each)

Part 1. Multiple Choice (50 questions, 1 point each) Part 2. Problems/Short Answer (10 questions, 5 points each) GROUND RULES: This exam contains two parts: Part 1. Multiple Choice (50 questions, 1 point each) Part 2. Problems/Short Answer (10 questions, 5 points each) The maximum number of points on this exam is

More information

The Identification of ARIMA Models

The Identification of ARIMA Models APPENDIX 4 The Identification of ARIMA Models As we have established in a previous lecture, there is a one-to-one correspondence between the parameters of an ARMA(p, q) model, including the variance of

More information

Theoretical and Simulation-guided Exploration of the AR(1) Model

Theoretical and Simulation-guided Exploration of the AR(1) Model Theoretical and Simulation-guided Exploration of the AR() Model Overview: Section : Motivation Section : Expectation A: Theory B: Simulation Section : Variance A: Theory B: Simulation Section : ACF A:

More information

5 Autoregressive-Moving-Average Modeling

5 Autoregressive-Moving-Average Modeling 5 Autoregressive-Moving-Average Modeling 5. Purpose. Autoregressive-moving-average (ARMA models are mathematical models of the persistence, or autocorrelation, in a time series. ARMA models are widely

More information

Comparing the Univariate Modeling Techniques, Box-Jenkins and Artificial Neural Network (ANN) for Measuring of Climate Index

Comparing the Univariate Modeling Techniques, Box-Jenkins and Artificial Neural Network (ANN) for Measuring of Climate Index Applied Mathematical Sciences, Vol. 8, 2014, no. 32, 1557-1568 HIKARI Ltd, www.m-hikari.com http://dx.doi.org/10.12988/ams.2014.4150 Comparing the Univariate Modeling Techniques, Box-Jenkins and Artificial

More information

Lecture 19 Box-Jenkins Seasonal Models

Lecture 19 Box-Jenkins Seasonal Models Lecture 19 Box-Jenkins Seasonal Models If the time series is nonstationary with respect to its variance, then we can stabilize the variance of the time series by using a pre-differencing transformation.

More information

ECONOMETRIA II. CURSO 2009/2010 LAB # 3

ECONOMETRIA II. CURSO 2009/2010 LAB # 3 ECONOMETRIA II. CURSO 2009/2010 LAB # 3 BOX-JENKINS METHODOLOGY The Box Jenkins approach combines the moving average and the autorregresive models. Although both models were already known, the contribution

More information

Ross Bettinger, Analytical Consultant, Seattle, WA

Ross Bettinger, Analytical Consultant, Seattle, WA ABSTRACT USING PROC ARIMA TO MODEL TRENDS IN US HOME PRICES Ross Bettinger, Analytical Consultant, Seattle, WA We demonstrate the use of the Box-Jenkins time series modeling methodology to analyze US home

More information

BCT Lecture 3. Lukas Vacha.

BCT Lecture 3. Lukas Vacha. BCT Lecture 3 Lukas Vacha vachal@utia.cas.cz Stationarity and Unit Root Testing Why do we need to test for Non-Stationarity? The stationarity or otherwise of a series can strongly influence its behaviour

More information

Lesson 13: Box-Jenkins Modeling Strategy for building ARMA models

Lesson 13: Box-Jenkins Modeling Strategy for building ARMA models Lesson 13: Box-Jenkins Modeling Strategy for building ARMA models Facoltà di Economia Università dell Aquila umberto.triacca@gmail.com Introduction In this lesson we present a method to construct an ARMA(p,

More information

E 4160 Autumn term Lecture 9: Deterministic trends vs integrated series; Spurious regression; Dickey-Fuller distribution and test

E 4160 Autumn term Lecture 9: Deterministic trends vs integrated series; Spurious regression; Dickey-Fuller distribution and test E 4160 Autumn term 2016. Lecture 9: Deterministic trends vs integrated series; Spurious regression; Dickey-Fuller distribution and test Ragnar Nymoen Department of Economics, University of Oslo 24 October

More information

Chapter 9: Forecasting

Chapter 9: Forecasting Chapter 9: Forecasting One of the critical goals of time series analysis is to forecast (predict) the values of the time series at times in the future. When forecasting, we ideally should evaluate the

More information

Chapter 6: Model Specification for Time Series

Chapter 6: Model Specification for Time Series Chapter 6: Model Specification for Time Series The ARIMA(p, d, q) class of models as a broad class can describe many real time series. Model specification for ARIMA(p, d, q) models involves 1. Choosing

More information

Frequency Forecasting using Time Series ARIMA model

Frequency Forecasting using Time Series ARIMA model Frequency Forecasting using Time Series ARIMA model Manish Kumar Tikariha DGM(O) NSPCL Bhilai Abstract In view of stringent regulatory stance and recent tariff guidelines, Deviation Settlement mechanism

More information

Automatic Forecasting

Automatic Forecasting Automatic Forecasting Summary The Automatic Forecasting procedure is designed to forecast future values of time series data. A time series consists of a set of sequential numeric data taken at equally

More information

Stationarity Revisited, With a Twist. David G. Tucek Value Economics, LLC

Stationarity Revisited, With a Twist. David G. Tucek Value Economics, LLC Stationarity Revisited, With a Twist David G. Tucek Value Economics, LLC david.tucek@valueeconomics.com 314 434 8633 2016 Tucek - October 7, 2016 FEW Durango, CO 1 Why This Topic Three Types of FEs Those

More information

Non-Stationary Time Series and Unit Root Testing

Non-Stationary Time Series and Unit Root Testing Econometrics II Non-Stationary Time Series and Unit Root Testing Morten Nyboe Tabor Course Outline: Non-Stationary Time Series and Unit Root Testing 1 Stationarity and Deviation from Stationarity Trend-Stationarity

More information

Automatic seasonal auto regressive moving average models and unit root test detection

Automatic seasonal auto regressive moving average models and unit root test detection ISSN 1750-9653, England, UK International Journal of Management Science and Engineering Management Vol. 3 (2008) No. 4, pp. 266-274 Automatic seasonal auto regressive moving average models and unit root

More information

Econometrics. Week 11. Fall Institute of Economic Studies Faculty of Social Sciences Charles University in Prague

Econometrics. Week 11. Fall Institute of Economic Studies Faculty of Social Sciences Charles University in Prague Econometrics Week 11 Institute of Economic Studies Faculty of Social Sciences Charles University in Prague Fall 2012 1 / 30 Recommended Reading For the today Advanced Time Series Topics Selected topics

More information

A stochastic modeling for paddy production in Tamilnadu

A stochastic modeling for paddy production in Tamilnadu 2017; 2(5): 14-21 ISSN: 2456-1452 Maths 2017; 2(5): 14-21 2017 Stats & Maths www.mathsjournal.com Received: 04-07-2017 Accepted: 05-08-2017 M Saranyadevi Assistant Professor (GUEST), Department of Statistics,

More information

EC408 Topics in Applied Econometrics. B Fingleton, Dept of Economics, Strathclyde University

EC408 Topics in Applied Econometrics. B Fingleton, Dept of Economics, Strathclyde University EC408 Topics in Applied Econometrics B Fingleton, Dept of Economics, Strathclyde University Applied Econometrics What is spurious regression? How do we check for stochastic trends? Cointegration and Error

More information

Time Series Analysis of United States of America Crude Oil and Petroleum Products Importations from Saudi Arabia

Time Series Analysis of United States of America Crude Oil and Petroleum Products Importations from Saudi Arabia International Journal of Applied Science and Technology Vol. 5, No. 5; October 2015 Time Series Analysis of United States of America Crude Oil and Petroleum Products Importations from Saudi Arabia Olayan

More information

Econ 423 Lecture Notes: Additional Topics in Time Series 1

Econ 423 Lecture Notes: Additional Topics in Time Series 1 Econ 423 Lecture Notes: Additional Topics in Time Series 1 John C. Chao April 25, 2017 1 These notes are based in large part on Chapter 16 of Stock and Watson (2011). They are for instructional purposes

More information

Decision 411: Class 3

Decision 411: Class 3 Decision 411: Class 3 Discussion of HW#1 Introduction to seasonal models Seasonal decomposition Seasonal adjustment on a spreadsheet Forecasting with seasonal adjustment Forecasting inflation Poor man

More information

Chapter 3: Regression Methods for Trends

Chapter 3: Regression Methods for Trends Chapter 3: Regression Methods for Trends Time series exhibiting trends over time have a mean function that is some simple function (not necessarily constant) of time. The example random walk graph from

More information

Analysis of Violent Crime in Los Angeles County

Analysis of Violent Crime in Los Angeles County Analysis of Violent Crime in Los Angeles County Xiaohong Huang UID: 004693375 March 20, 2017 Abstract Violent crime can have a negative impact to the victims and the neighborhoods. It can affect people

More information

CHAPTER 8 MODEL DIAGNOSTICS. 8.1 Residual Analysis

CHAPTER 8 MODEL DIAGNOSTICS. 8.1 Residual Analysis CHAPTER 8 MODEL DIAGNOSTICS We have now discussed methods for specifying models and for efficiently estimating the parameters in those models. Model diagnostics, or model criticism, is concerned with testing

More information

Autoregressive Integrated Moving Average Model to Predict Graduate Unemployment in Indonesia

Autoregressive Integrated Moving Average Model to Predict Graduate Unemployment in Indonesia DOI 10.1515/ptse-2017-0005 PTSE 12 (1): 43-50 Autoregressive Integrated Moving Average Model to Predict Graduate Unemployment in Indonesia Umi MAHMUDAH u_mudah@yahoo.com (State Islamic University of Pekalongan,

More information

Decision 411: Class 3

Decision 411: Class 3 Decision 411: Class 3 Discussion of HW#1 Introduction to seasonal models Seasonal decomposition Seasonal adjustment on a spreadsheet Forecasting with seasonal adjustment Forecasting inflation Poor man

More information

Homework 2. For the homework, be sure to give full explanations where required and to turn in any relevant plots.

Homework 2. For the homework, be sure to give full explanations where required and to turn in any relevant plots. Homework 2 1 Data analysis problems For the homework, be sure to give full explanations where required and to turn in any relevant plots. 1. The file berkeley.dat contains average yearly temperatures for

More information

Economics 618B: Time Series Analysis Department of Economics State University of New York at Binghamton

Economics 618B: Time Series Analysis Department of Economics State University of New York at Binghamton Problem Set #1 1. Generate n =500random numbers from both the uniform 1 (U [0, 1], uniformbetween zero and one) and exponential λ exp ( λx) (set λ =2and let x U [0, 1]) b a distributions. Plot the histograms

More information

9) Time series econometrics

9) Time series econometrics 30C00200 Econometrics 9) Time series econometrics Timo Kuosmanen Professor Management Science http://nomepre.net/index.php/timokuosmanen 1 Macroeconomic data: GDP Inflation rate Examples of time series

More information

11. Further Issues in Using OLS with TS Data

11. Further Issues in Using OLS with TS Data 11. Further Issues in Using OLS with TS Data With TS, including lags of the dependent variable often allow us to fit much better the variation in y Exact distribution theory is rarely available in TS applications,

More information

TAKEHOME FINAL EXAM e iω e 2iω e iω e 2iω

TAKEHOME FINAL EXAM e iω e 2iω e iω e 2iω ECO 513 Spring 2015 TAKEHOME FINAL EXAM (1) Suppose the univariate stochastic process y is ARMA(2,2) of the following form: y t = 1.6974y t 1.9604y t 2 + ε t 1.6628ε t 1 +.9216ε t 2, (1) where ε is i.i.d.

More information

Paper SA-08. Are Sales Figures in Line With Expectations? Using PROC ARIMA in SAS to Forecast Company Revenue

Paper SA-08. Are Sales Figures in Line With Expectations? Using PROC ARIMA in SAS to Forecast Company Revenue Paper SA-08 Are Sales Figures in Line With Expectations? Using PROC ARIMA in SAS to Forecast Company Revenue Saveth Ho and Brian Van Dorn, Deluxe Corporation, Shoreview, MN ABSTRACT The distribution of

More information

FE570 Financial Markets and Trading. Stevens Institute of Technology

FE570 Financial Markets and Trading. Stevens Institute of Technology FE570 Financial Markets and Trading Lecture 5. Linear Time Series Analysis and Its Applications (Ref. Joel Hasbrouck - Empirical Market Microstructure ) Steve Yang Stevens Institute of Technology 9/25/2012

More information

Statistical Methods for Forecasting

Statistical Methods for Forecasting Statistical Methods for Forecasting BOVAS ABRAHAM University of Waterloo JOHANNES LEDOLTER University of Iowa John Wiley & Sons New York Chichester Brisbane Toronto Singapore Contents 1 INTRODUCTION AND

More information

Decision 411: Class 3

Decision 411: Class 3 Decision 411: Class 3 Discussion of HW#1 Introduction to seasonal models Seasonal decomposition Seasonal adjustment on a spreadsheet Forecasting with seasonal adjustment Forecasting inflation Log transformation

More information

Chapter 8: Model Diagnostics

Chapter 8: Model Diagnostics Chapter 8: Model Diagnostics Model diagnostics involve checking how well the model fits. If the model fits poorly, we consider changing the specification of the model. A major tool of model diagnostics

More information

Prediction of Grain Products in Turkey

Prediction of Grain Products in Turkey Journal of Mathematics and Statistics Original Research Paper Prediction of Grain Products in Turkey Özlem Akay, Gökmen Bozkurt and Güzin Yüksel Department of Statistics, The Faculty of Science and Letters,

More information

Time Series Econometrics 4 Vijayamohanan Pillai N

Time Series Econometrics 4 Vijayamohanan Pillai N Time Series Econometrics 4 Vijayamohanan Pillai N Vijayamohan: CDS MPhil: Time Series 5 1 Autoregressive Moving Average Process: ARMA(p, q) Vijayamohan: CDS MPhil: Time Series 5 2 1 Autoregressive Moving

More information

Financial Econometrics

Financial Econometrics Financial Econometrics Long-run Relationships in Finance Gerald P. Dwyer Trinity College, Dublin January 2016 Outline 1 Long-Run Relationships Review of Nonstationarity in Mean Cointegration Vector Error

More information

Ch 8. MODEL DIAGNOSTICS. Time Series Analysis

Ch 8. MODEL DIAGNOSTICS. Time Series Analysis Model diagnostics is concerned with testing the goodness of fit of a model and, if the fit is poor, suggesting appropriate modifications. We shall present two complementary approaches: analysis of residuals

More information

Firstly, the dataset is cleaned and the years and months are separated to provide better distinction (sample below).

Firstly, the dataset is cleaned and the years and months are separated to provide better distinction (sample below). Project: Forecasting Sales Step 1: Plan Your Analysis Answer the following questions to help you plan out your analysis: 1. Does the dataset meet the criteria of a time series dataset? Make sure to explore

More information

Time Series Analysis. Smoothing Time Series. 2) assessment of/accounting for seasonality. 3) assessment of/exploiting "serial correlation"

Time Series Analysis. Smoothing Time Series. 2) assessment of/accounting for seasonality. 3) assessment of/exploiting serial correlation Time Series Analysis 2) assessment of/accounting for seasonality This (not surprisingly) concerns the analysis of data collected over time... weekly values, monthly values, quarterly values, yearly values,

More information

Empirical Market Microstructure Analysis (EMMA)

Empirical Market Microstructure Analysis (EMMA) Empirical Market Microstructure Analysis (EMMA) Lecture 3: Statistical Building Blocks and Econometric Basics Prof. Dr. Michael Stein michael.stein@vwl.uni-freiburg.de Albert-Ludwigs-University of Freiburg

More information