Decision 411: Class 4


Non-seasonal averaging & smoothing models
- Simple moving average (SMA) model
- Simple exponential smoothing (SES) model
- Linear exponential smoothing (LES) model
- Combining seasonal adjustment with non-seasonal smoothing
- Winters seasonal smoothing model

Guidelines for future HW writeups
- Presentation should stand on its own (SG files are mainly just for audit trail)
- What's the bottom line? (forecast, trend, key drivers?)
- Clearly define the variables (units, dates, transformations, etc.) used in the analysis
- Use bullet points for key observations & findings
- Use tables to present key numbers (forecasts & CIs)
- Embed the most important chart(s), with annotations
- Show where the numbers came from
- Explain your model's assumptions in layman's terms

Averaging & smoothing models
- Today's topics: averaging & smoothing models
- Later: ARIMA models

We'll meet ARIMA later in the course, but briefly, an ARIMA(p,d,q) model is like a regression model in which the dependent variable is a d-order difference of the input variable, and the independent variables are p lagged values of the dependent variable (AR terms) and/or q lagged values of the forecast errors (MA terms), plus an optional constant term. Many of the averaging & smoothing models are special cases; e.g., an ARIMA(0,1,1) model is an SES model.
- p = # of AR terms (lags of the dependent variable)
- d = order of differencing of the input variable
- q = # of MA terms (lags of the errors)

Averaging & smoothing models
The problem: sometimes nonseasonal (or seasonally adjusted) data appears to be locally stationary with a time-varying mean.
- The mean (constant) model doesn't track changes in the mean and has positively autocorrelated errors.
- The random walk model may not perform well either in this situation: it oversteers, picks up too much noise in the data, and yields negatively correlated errors.

Example: series X
The mean (constant) model yields positively autocorrelated errors... it doesn't react to changes in the local mean.
[Charts: time series plot of X with constant-mean forecasts, showing no reaction to local changes in the data; residual autocorrelations for X (constant mean), showing strong positive autocorrelation at lag 1.]

Example, continued
The random walk model for series X yields negatively autocorrelated errors... it overreacts to changes... RMSE = 122, not any better!
[Charts: time series plot of X with random walk forecasts, showing over-reaction to local changes in the data (always 1 period too late); residual autocorrelations for X (random walk), showing strong negative autocorrelation at lag 1.]

A solution: use a model that averages or smooths the recent data to filter out some of the noise and estimate the local mean, such as the simple moving average (SMA) model:

$$\hat{Y}_t = \frac{Y_{t-1} + Y_{t-2} + \cdots + Y_{t-m}}{m}$$

i.e., just average the last m observed values.
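As a concrete illustration, here is a minimal Python sketch of the SMA forecast just described; the demo series and window length are hypothetical placeholders, not the class's series X.

```python
import numpy as np

def sma_forecast(y, m):
    """One-step-ahead simple moving average forecasts.

    The forecast for period t is the average of the m most recent
    observed values, Y[t-1], ..., Y[t-m]; the first m periods have
    no forecast because not enough history is available yet.
    """
    y = np.asarray(y, dtype=float)
    fcst = np.full(len(y), np.nan)
    for t in range(m, len(y)):
        fcst[t] = y[t - m:t].mean()
    return fcst

# Hypothetical demo series (not the class's series X)
rng = np.random.default_rng(0)
y = 100 + np.cumsum(rng.normal(0, 2, 60)) * 0.3 + rng.normal(0, 5, 60)
f = sma_forecast(y, m=3)
errors = y[3:] - f[3:]
print("RMSE:", np.sqrt(np.mean(errors**2)))
```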

Properties of SMA model
- Average age of the data in the forecast is (m+1)/2:

$$\hat{Y}_t = \frac{Y_{t-1} + Y_{t-2} + \cdots + Y_{t-m}}{m}$$

  m=3: avg. age = 2; m=5: avg. age = 3; m=9: avg. age = 5, etc.
- (m+1)/2 is midway between 1 period old and m periods old, hence the forecast lags behind turning points by about (m+1)/2 periods.

Properties of SMA, continued
- Long-term forecasts = horizontal straight line (= simple average of the last few values)
- Confidence limits??? No theory!!
- Works well on highly irregular data: no data point receives more weight than the others, so it is relatively robust against outliers.
- Can also be tapered for even greater robustness.

Example, continued
SMA with m=3 (average age = 2) yields RMSE = 104 (significantly better!) and less negative autocorrelation. (50% confidence limits are shown here, but don't trust them: they are based on the assumption of the mean remaining fixed at its latest value.)
[Charts: time series plot of X with simple-moving-average-of-3-terms forecasts and 50% confidence limits, with forecasts lagging behind the turning point by about 2 periods; residual autocorrelations for X (simple moving average of 3 terms), showing no autocorrelation at lag 1.]

Example, continued
SMA with m=5 (average age = 3) yields RMSE = 102 (very slightly better), smoother forecasts, and slight positive autocorrelation in the errors. (50% confidence limits shown.)
[Charts: time series plot of X with simple-moving-average-of-5-terms forecasts, with forecasts lagging behind the turning point by about 3 periods; residual autocorrelations for X (simple moving average of 5 terms), showing slight positive autocorrelation at lag 1.]

Example, continued
SMA with m=9 (average age = 5) yields RMSE = 104 (slightly worse) and more positive autocorrelation in the errors.
[Charts: time series plot of X with simple-moving-average-of-9-terms forecasts, with forecasts lagging behind the turning point by about 5 periods; residual autocorrelations for X (simple moving average of 9 terms), showing more positive autocorrelation at lag 1.]

Example, continued
SMA with m=19 (average age = 10) yields RMSE = 118 (significantly worse), very smooth forecasts, and much more positive autocorrelation.
[Charts: time series plot of X with simple-moving-average-of-19-terms forecasts, with forecasts lagging behind the turning point by about 10 periods; residual autocorrelations for X (simple moving average of 19 terms), showing strong positive autocorrelation at lag 1.]

Smoothness vs. responsiveness
- The more we smooth the data, the more clearly the signal stands out.
- But... greater clarity comes at the expense of getting the news later.
- If we want our forecasting model to respond quickly to changes, it will also pick up false alarms due to noise in the data.

Conclusions
- For a time series with a randomly varying local mean, the SMA model may outperform both the mean model and the random walk model.
- It allows us to strike a balance between averaging over too much past data and averaging over too little.
- However...

Shortcomings of SMA model
- It's hard to optimize the number of terms (m), because it is a discrete parameter... you must use trial and error.
- Intuitively, you should not weight the last m observations equally when computing the average... it would be better to discount the older data in a gradual fashion.
These observations motivate...

Brown's Simple Exponential Smoothing
Let α = smoothing constant and S_t = smoothed series at period t.
Recursive smoothing formula:

$$S_t = \alpha Y_t + (1-\alpha) S_{t-1}$$

Forecast for next period = current smoothed value:

$$\hat{Y}_{t+1} = S_t$$
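Here is a minimal Python sketch of the recursive SES formula above, assuming a common start-up convention (initialize the smoothed series at the first observation); the variable names and demo values are illustrative, not taken from Statgraphics.

```python
import numpy as np

def ses_forecast(y, alpha):
    """One-step-ahead simple exponential smoothing forecasts.

    S[t] = alpha * Y[t] + (1 - alpha) * S[t-1], with S[0] = Y[0],
    and the forecast for period t+1 is S[t].
    """
    y = np.asarray(y, dtype=float)
    s = np.empty(len(y))
    s[0] = y[0]                       # common start-up convention (an assumption)
    for t in range(1, len(y)):
        s[t] = alpha * y[t] + (1 - alpha) * s[t - 1]
    # fcst[t] is the forecast of y[t] made at period t-1
    fcst = np.empty(len(y))
    fcst[0] = y[0]
    fcst[1:] = s[:-1]
    return fcst

# Hypothetical demo
y = [100, 103, 101, 106, 108, 107, 111]
print(np.round(ses_forecast(y, alpha=0.3), 2))
```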

Mathematically equivalent formulas for SES forecasts

$$\hat{Y}_{t+1} = \alpha Y_t + (1-\alpha)\hat{Y}_t$$
forecast = interpolation between the previous observation and the previous forecast

$$\hat{Y}_{t+1} = \hat{Y}_t + \alpha e_t, \quad \text{where } e_t = Y_t - \hat{Y}_t$$
forecast = previous forecast plus a fraction α of the previous error

$$\hat{Y}_{t+1} = Y_t - (1-\alpha) e_t$$
forecast = previous observation minus a fraction 1-α of the previous error

Mathematically equivalent formulas for SES forecasts, continued
Last but not least:

$$\hat{Y}_{t+1} = \alpha\left[Y_t + (1-\alpha)Y_{t-1} + (1-\alpha)^2 Y_{t-2} + (1-\alpha)^3 Y_{t-3} + \cdots\right]$$

forecast = exponentially weighted moving average of all past observations, or in other words, a discounted moving average with a discount factor of 1-α per period

Properties of SES model
- SES uses a smoothing parameter (α) which is continuously variable, so it is easily optimized by least squares.
- If α = 1, SES reduces to the random walk model; if α = 0, SES reduces to the constant (mean) model.
- Average age of the data in the SES forecast is 1/α. Examples: α = 0.5: avg. age = 2; α = 0.2: avg. age = 5; α = 0.1: avg. age = 10, etc.

Properties of SES, continued
- For a given average age, SES is somewhat superior to SMA because it places relatively more weight on the most recent observation.
- Hence it is slightly more "responsive" to changes occurring in the recent past.
- Caveat: it is also more sensitive to recent outliers than the SMA model -- not so good for messy data.

SMA (m=9) vs. SES (α=0.2)
- Average age = 5 for both models.
- SES weights are larger than the SMA weights at the first few lags, then gradually decline toward zero.
- SMA weights are 1/9 on the first 9 lags of Y, zero afterward.
- Average age is the center of mass ("balancing point") of the weight distribution.
[Table/plot: SMA weight and SES weight by lag.]

Properties of SES, continued
- Long-term forecasts from the basic SES model are a horizontal straight line (no trend, as in the random walk and SMA models).
- SES = ARIMA(0,1,1), i.e., a random walk model (without drift) plus an MA(1) term, which adds a multiple of the lag-1 forecast error:

$$\hat{Y}_{t+1} = \underbrace{Y_t}_{\text{random walk}} - \underbrace{(1-\alpha)\, e_t}_{\text{lag-1 error}}$$
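A short Python sketch of the weight comparison described above; the weight values are implied by the SMA and SES formulas, not copied from the slide's table.

```python
import numpy as np

m, alpha = 9, 0.2            # SMA window and SES smoothing constant from the slide
lags = np.arange(1, 16)

sma_w = np.where(lags <= m, 1.0 / m, 0.0)          # equal weights on the last m lags
ses_w = alpha * (1 - alpha) ** (lags - 1)          # geometrically declining weights

for k, w1, w2 in zip(lags, sma_w, ses_w):
    print(f"lag {k:2d}:  SMA {w1:.3f}   SES {w2:.3f}")

# "Average age" = center of mass of each weight distribution
print("SMA average age:", np.sum(lags * sma_w))                       # (m+1)/2 = 5
print("SES average age (truncated sum):", np.sum(lags * ses_w) / np.sum(ses_w))
```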

Properties of SES, continued
The exact k-step-ahead forecast standard error can be computed using ARIMA theory:

$$SE_{fcst}(k) = \sqrt{1 + (k-1)\alpha^2}\; SE_{fcst}(1)$$

Note that it increases with k more slowly than for the random walk model, which is the special case α = 1:

$$SE_{fcst}(k) = \sqrt{k}\; SE_{fcst}(1)$$

Hence the SES model assumes the series is more predictable than a random walk.

Example, continued
SES with the optimal α = 0.3 (average age = 3.3) yields RMSE = 99 (best yet, by a small margin) and no significant residual autocorrelations. (Don't worry about an isolated spike at an oddball lag like lag 9; it is probably just due to a pair of large errors separated by 9 periods.)
[Charts: time series plot of X with simple exponential smoothing forecasts (alpha = 0.3) and confidence limits; residual autocorrelations for X (simple exponential smoothing).]
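To make the standard-error formula concrete, here is a small Python sketch that widens the limits with the forecast horizon; the point forecast, α, and one-step RMSE are placeholder values, and the limits assume normally distributed forecast errors (an assumption consistent with the probability-plot check later in this class).

```python
import math

def ses_interval(point_fcst, k, alpha, se1, coverage=0.50):
    """Approximate k-step-ahead interval for an SES forecast.

    SE(k) = sqrt(1 + (k-1)*alpha**2) * SE(1); the limits assume
    normally distributed forecast errors (z ~= 0.674 for 50% coverage).
    """
    z = {0.50: 0.674, 0.80: 1.282, 0.95: 1.960}[coverage]
    se_k = math.sqrt(1 + (k - 1) * alpha ** 2) * se1
    return point_fcst - z * se_k, point_fcst + z * se_k

# Placeholder values (alpha and RMSE roughly like the class example)
for k in (1, 2, 5, 10):
    lo, hi = ses_interval(point_fcst=500.0, k=k, alpha=0.3, se1=99.0)
    print(f"k={k:2d}:  50% limits ({lo:.0f}, {hi:.0f})")
```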

SES with constant trend
- A constant linear trend can be added to an SES model by fitting it as an ARIMA(0,1,1) model with constant.
- Alas, the ARIMA implementation of SES models can't be combined with seasonal adjustment in the Forecasting procedure in Statgraphics (although you could seasonally adjust and then fit an ARIMA model in two steps).

SES with constant trend, continued
- A constant exponential trend can be added to SES by using the inflation adjustment option in Statgraphics.
- The average percentage growth per period can be estimated from the slope coefficient of a linear trend model or an ARIMA(0,1,0)+constant model fitted with a natural log transformation.
- See video clip #10 for examples.

What if the series has a time-varying trend, as well as a time-varying mean?
- Evidently what is needed is an estimate of the local trend as well as the local mean.
- This is the motivating idea behind Brown's Linear Exponential Smoothing (LES) model.
- It's also sometimes called double exponential smoothing, because it involves a double application of exponential smoothing.

How LES works
- Apply SES once to get a singly-smoothed series S'_t that lags behind the current value by 1/α - 1 periods.*
- Smooth the smoothed series (using the same α) to get an even smoother series S''_t that lags behind by 2(1/α - 1) periods.
- To forecast the future, extrapolate a line through the two points (t - (1/α - 1), S'_t) and (t - 2(1/α - 1), S''_t).
*The average age relative to the next value is 1/α, so the age relative to the current value is 1/α - 1.

LES forecasts from t = 90, α = 0.1*
1. Draw a horizontal line extending 9 periods back in time from the current value of the singly-smoothed series.
2. Draw a horizontal line extending 18 periods back in time from the current value of the doubly-smoothed series.
3. Extrapolate a line into the future through the left endpoints of the two horizontal lines.
*1/α = 10, so 1/α - 1 = 9
[Chart: series X with the singly-smoothed series S' and doubly-smoothed series S'', illustrating the construction above.]

How LES works
- There are two equivalent sets of mathematical formulas for implementing the logic of the LES model.
- One set of formulas (I) explicitly computes the current estimates of level and trend in each period.
- The other set of formulas (II) merely computes the next forecast from the observed data and forecast errors in the last two periods.

LES formulas: I
1. Compute the singly smoothed series at period t: $S'_t = \alpha Y_t + (1-\alpha) S'_{t-1}$
2. Compute the doubly smoothed series: $S''_t = \alpha S'_t + (1-\alpha) S''_{t-1}$
   (Start-up: $S'_1 = S''_1 = Y_1$)
3. Compute the estimated level at period t: $L_t = 2S'_t - S''_t$
4. Compute the estimated trend at period t: $T_t = \big(\alpha/(1-\alpha)\big)(S'_t - S''_t)$
5. Finally, the k-step-ahead forecast is given by: $\hat{Y}_{t+k} = L_t + k\,T_t$

LES formulas: II
Mathematically equivalent formula (requires fewer columns on a spreadsheet):

$$\hat{Y}_{t+1} = 2Y_t - Y_{t-1} - 2(1-\alpha)e_t + (1-\alpha)^2 e_{t-1}$$

Very important start-up values: $\hat{Y}_2 = \hat{Y}_1 = Y_1$, hence $e_1 = 0$ and $e_2 = Y_2 - Y_1$. (If you don't use these start-up values, the early forecasts will gyrate wildly!)
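A minimal Python sketch of formula set I above, with the slide's start-up convention S'_1 = S''_1 = Y_1; the demo series is hypothetical.

```python
import numpy as np

def brown_les_forecast(y, alpha, k=1):
    """Brown's linear exponential smoothing, formula set I.

    Returns the k-step-ahead forecast made at the last observed period,
    along with the fitted level and trend at that period.
    """
    y = np.asarray(y, dtype=float)
    s1 = np.empty(len(y))
    s2 = np.empty(len(y))
    s1[0] = s2[0] = y[0]                       # start-up: S'_1 = S''_1 = Y_1
    for t in range(1, len(y)):
        s1[t] = alpha * y[t] + (1 - alpha) * s1[t - 1]
        s2[t] = alpha * s1[t] + (1 - alpha) * s2[t - 1]
    level = 2 * s1[-1] - s2[-1]                           # L_t = 2S'_t - S''_t
    trend = (alpha / (1 - alpha)) * (s1[-1] - s2[-1])     # T_t
    return level + k * trend, level, trend

# Hypothetical demo series with a mild upward drift
y = [50, 52, 51, 55, 57, 60, 59, 63]
print(brown_les_forecast(y, alpha=0.16, k=3))
```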

Example, continued
The LES model is optimized at α = 0.16, yielding RMSE = 102 (about the same as SES), but the forecast plot shows a decreasing trend due to the local downward trend at the end of the series; the confidence intervals also widen more rapidly, due to the assumption that the trend may be varying.
[Charts: time series plot of X with Brown's linear exponential smoothing forecasts (alpha = 0.16) and confidence limits; residual autocorrelations for X (Brown's linear exp. smoothing).]

LES vs. SES
- SES assumes only a time-varying level (i.e., a local mean), while LES assumes a time-varying level and trend.
- SES assumes that the series is more predictable than a random walk, while LES assumes it is less predictable.
- The LES model is relatively unstable, hence it may be dangerous to extrapolate the local trend very far. There are fancier versions of LES that include a trend-dampening factor.

LES vs. SES, continued
- In both SES and LES, the smaller the value of α, the more smoothing (i.e., less response to the most recent observation).
- Remember that the average age is 1/α in the SES model (the amount of lag behind turning points). In the LES model, the forecast is based on what was happening between 1/α and 2/α periods ago.
- When fitted to the same series, LES usually has a smaller optimal α than SES.

LES vs. SES, continued
- SES is the most widely used non-seasonal forecasting model. It has a sounder underlying theory than the SMA model, and it is computationally convenient to use on hundreds or thousands of parallel time series (e.g., for SKU-level forecasting).
- Its assumption of no trend is often unrealistic, but it is surprisingly robust in practice for short-term forecasts--often better than LES even for series that have trends.
- You can add an exponential trend via the inflation adjustment option.
- You can add a linear trend to an SES model by fitting it as an ARIMA(0,1,1) model with constant--but you can't combine ARIMA with seasonal adjustment in the Forecasting procedure.

Estimation issues
- Optimization of α is performed by nonlinear least squares (like Excel's nonlinear solver).
- Nonlinear estimation requires a search process whose solution is inexact and may depend on the starting value.
- In Statgraphics, you may notice that the optimal α varies slightly when the model is revisited, because it restarts the estimation from the previous optimum.

Estimation issues, continued
- α is constrained to lie strictly between 0 and 1 for SES and LES models.
- If the best SES model is actually a random walk model (α = 1), then the estimation algorithm will converge to the upper bound. This will often happen if the series has a significant trend.
- Once α hits its upper bound (0.9999), the estimation may get stuck there. Try manually changing the initial value to (say) 0.5 before refitting the model if the data sample is changed.
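As an illustration of what "optimizing α by nonlinear least squares" means, here is a Python sketch that minimizes the sum of squared one-step-ahead SES errors over α with scipy; this mimics the idea, not Statgraphics' actual algorithm, and the bounds and demo series are assumptions.

```python
import numpy as np
from scipy.optimize import minimize_scalar

def ses_sse(alpha, y):
    """Sum of squared one-step-ahead SES errors for a given alpha."""
    s, sse = y[0], 0.0
    for t in range(1, len(y)):
        e = y[t] - s                 # s is the forecast of y[t] made at t-1
        sse += e * e
        s = alpha * y[t] + (1 - alpha) * s
    return sse

# Hypothetical demo series
rng = np.random.default_rng(1)
y = 100 + np.cumsum(rng.normal(0, 1, 80)) + rng.normal(0, 3, 80)

res = minimize_scalar(ses_sse, args=(y,), bounds=(0.0001, 0.9999), method="bounded")
rmse = np.sqrt(res.fun / (len(y) - 1))
print(f"optimal alpha = {res.x:.3f}, RMSE = {rmse:.2f}")
```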

Estimation issues, continued
- Because LES and SES use recursive formulas in which each forecast depends on prior errors, their estimation also depends on how they are initialized (i.e., on the prior errors that are assumed at the very beginning).
- The usual approach is to just assume that the first error is zero.
- A more sophisticated approach, available as an estimation option in Statgraphics, is to use "backforecasting"* to start up the model.
*We'll discuss this in more detail later in the course.

Holt's linear exponential smoothing
- Holt's model improves on LES by introducing separate smoothing constants for level and trend ("alpha" and "beta").
- In theory, this allows it to perform more stable trend estimation while adapting to sudden jumps in level.

Holt's model formulas
1. The updated level L_t is an interpolation between the most recent data point and the previous forecast of the level:

$$L_t = \alpha Y_t + (1-\alpha)(L_{t-1} + T_{t-1})$$

where $Y_t$ is the most recent data point and $L_{t-1} + T_{t-1}$ is the forecast of $L_t$ made at period t-1.

Holt's model formulas
2. The updated trend T_t is an interpolation between the change in the estimated level and the previous estimate of the trend:

$$T_t = \beta (L_t - L_{t-1}) + (1-\beta) T_{t-1}$$

where $L_t - L_{t-1}$ is the just-observed change in the level and $T_{t-1}$ is the previous trend estimate.

Holt's model formulas
3. The k-step-ahead forecast from period t is an extrapolation of the level and trend from period t:

$$\hat{Y}_{t+k} = L_t + k\,T_t$$

Example, continued
Holt's model is optimized at α = 0.306, β = 0.007, yielding RMSE = 100 (essentially the same as SES & LES), but the forecast plot shows a slightly increasing local trend at the end of the series, due to relatively heavy smoothing of the trend!
[Charts: time series plot of X with Holt's linear exponential smoothing forecasts and confidence limits; residual autocorrelations for X (Holt's linear exp. smoothing).]
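A minimal Python sketch of Holt's recursions above; the initialization (level = first observation, trend = first difference) is a common convention and an assumption here, not something specified on the slides.

```python
import numpy as np

def holt_forecast(y, alpha, beta, k=1):
    """Holt's linear exponential smoothing.

    L[t] = alpha*Y[t] + (1-alpha)*(L[t-1] + T[t-1])
    T[t] = beta*(L[t] - L[t-1]) + (1-beta)*T[t-1]
    Forecast from the last period: L[t] + k*T[t].
    """
    y = np.asarray(y, dtype=float)
    level, trend = y[0], y[1] - y[0]     # assumed initialization
    for t in range(1, len(y)):
        prev_level = level
        level = alpha * y[t] + (1 - alpha) * (prev_level + trend)
        trend = beta * (level - prev_level) + (1 - beta) * trend
    return level + k * trend

# Hypothetical demo
y = [10, 12, 13, 15, 18, 19, 22, 24]
print(holt_forecast(y, alpha=0.3, beta=0.1, k=2))
```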

Model comparisons
- Models B-C-D-E hardly differ on their error measures.
- Model choice should also depend on theoretical considerations, such as the reasonableness of the trend assumptions.

A cautionary word about trend extrapolation
- If you are forecasting more than one period ahead, it is especially important to estimate the trend correctly.
- In general, trend assumptions and estimation should be based on everything you know about a time series, not just error statistics of one-period-ahead forecasts or t-stats of slope coefficients.

A cautionary word about trend extrapolation
- Extrapolation of time-varying trends estimated by double smoothing can be dangerous.
- Hence SES (perhaps with a fixed trend) often works better in practice.
- A trend dampening factor (0 < φ < 1) is often used in conjunction with LES or Holt's (a short numerical sketch follows below):

$$\hat{Y}_{t+k} = L_t + (\phi + \phi^2 + \cdots + \phi^k)\, T_t$$

Combining seasonal adjustment with a non-seasonal smoothing model
- Often a seasonally adjusted series looks like a good candidate for fitting with a smoothing or averaging model.
- Hence, you can forecast a seasonal series by a combination of seasonal adjustment and non-seasonal smoothing (or other non-seasonal model).
- This hybrid approach allows you to model the seasonal pattern explicitly, but it does not have a solid underlying statistical theory--confidence limits may be dubious.
- There is also some danger of overfitting the seasonal pattern if you don't have enough seasons of data.
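To illustrate the damped-trend formula above, here is a small Python sketch; the level, trend, and φ values are made up for illustration.

```python
def damped_trend_forecast(level, trend, phi, k):
    """k-step-ahead forecast with a damped trend:
    Y_hat(t+k) = L_t + (phi + phi**2 + ... + phi**k) * T_t
    """
    damp = sum(phi ** j for j in range(1, k + 1))
    return level + damp * trend

# Made-up level/trend estimates; note how the trend contribution levels off
for k in (1, 4, 12, 24):
    print(k, round(damped_trend_forecast(level=100.0, trend=2.0, phi=0.8, k=k), 2))
```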

Example of LES + seasonal adjustment on a spreadsheet
- The single-equation form of the LES model is easily implemented on a spreadsheet, and Solver can be used to find the value of α that minimizes RMSE.

LES out-of-sample forecasts
- The LES model, like any other one-step-ahead forecasting model, can extrapolate its forecasts into the future by bootstrapping itself, i.e., by substituting the one-step-ahead forecast for the next data point and then forecasting the next period from there, and so on.
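A Python sketch of this bootstrapping idea using the single-equation LES form (formula set II, with the start-up values from the earlier slide); the demo series is hypothetical.

```python
import numpy as np

def les_bootstrap_forecasts(y, alpha, horizon):
    """Single-equation Brown's LES with out-of-sample bootstrapping.

    Y_hat[t+1] = 2*Y[t] - Y[t-1] - 2*(1-alpha)*e[t] + (1-alpha)**2 * e[t-1]
    Beyond the data, each one-step forecast is fed back in as if it were
    the next observation (so its "error" is zero).
    """
    y = list(map(float, y))
    yhat = [y[0], y[0]]                    # start-up (slide notation): Y_hat_1 = Y_hat_2 = Y_1
    e = [0.0, y[1] - y[0]]                 # e_1 = 0, e_2 = Y_2 - Y_1
    for t in range(2, len(y) + horizon):
        f = (2 * y[t - 1] - y[t - 2]
             - 2 * (1 - alpha) * e[t - 1]
             + (1 - alpha) ** 2 * e[t - 2])
        yhat.append(f)
        if t < len(y):
            e.append(y[t] - f)             # in-sample: real forecast error
        else:
            y.append(f)                    # out-of-sample: bootstrap the forecast
            e.append(0.0)
    return yhat[-horizon:]                 # the out-of-sample forecasts

print(np.round(les_bootstrap_forecasts([20, 22, 21, 25, 27, 30], alpha=0.2, horizon=4), 2))
```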

LES forecasts for seasonally adjusted data
- Note that LES lags behind turning points, like all smoothing models...
- ...but it tracks the data pretty well during stretches where the trend is consistent, and its out-of-sample forecasts extrapolate the most recent trend.
[Chart: seasonally adjusted series with LES forecasts, Dec-83 through Dec-94.]

Re-seasonalized LES forecasts
[Chart: original series with the reseasonalized LES forecasts, Dec-83 through Jun-95.]

Example: housing starts
- The series displays strong seasonality as well as cyclicality.
- Original data (not seasonally adjusted): new residential construction.
[Chart: time series plot for HousesNSA, 1/83 through 1/03. Note the last observation.]

Seasonally adjusted data
- In seasonally adjusted terms, the last observation is abnormally large! How will different models react to it? (This abnormality was not so apparent on the unadjusted graph!)
- After seasonal adjustment, variations in level and trend are clearer.
[Chart: time series plot for SADJUSTED, 1/83 through 1/03.]

Nonseasonal forecasting model fitted to adjusted data: RW+drift
- Depending on the kind of long-term trend assumptions we feel are appropriate, we could fit the seasonally adjusted series with a non-seasonal model, such as a random walk with drift...
- This model extrapolates the long-term trend from the most recent (higher) level.
[Chart: time sequence plot for SADJUSTED, random walk with drift, showing actual values, forecasts, and 50.0% limits, 1/83 through 1/08.]

Nonseasonal forecasting model fitted to adjusted data: SES
- ...or a simple exponential smoothing model...
- This model extrapolates a flat trend from an exponentially weighted average of recent levels.
[Chart: time sequence plot for SADJUSTED, simple exponential smoothing, showing actual values, forecasts, and 50.0% limits, 1/83 through 1/08.]

Nonseasonal forecasting model fitted to adjusted data: Brown's LES
- ...or Brown's linear exponential smoothing model...
- This model tries to extrapolate the recent trend, which is jerked upward by the last observation.
[Chart: time sequence plot for SADJUSTED, Brown's linear exp. smoothing, showing actual values, forecasts, and 50.0% limits, 1/83 through 1/08.]

Nonseasonal forecasting model fitted to adjusted data: Holt's LES
- ...or Holt's linear exponential smoothing model...
- This model also tries to extrapolate the recent trend, but the trend estimate is more conservative due to the small beta (heavy smoothing of the trend).
[Chart: time sequence plot for SADJUSTED, Holt's linear exp. smoothing, showing actual values, forecasts, and 50.0% limits, 1/83 through 1/08.]

Hybrid seasonal models in SG
- You can fit hybrid models in the Forecasting procedure in Statgraphics by selecting multiplicative seasonal adjustment in conjunction with a RW, SES, or LES model type.
- The forecasts are automatically reseasonalized in the plots and model comparison statistics.
- Be on guard against overfitting: seasonal adjustment adds many parameters to the model, and estimation-period statistics may not be fully adjusted to correct for the additional parameters.
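A simplified Python sketch of the hybrid idea (compute multiplicative seasonal indices, smooth the adjusted series with SES, then reseasonalize the forecasts); this illustrates the general approach, not Statgraphics' exact procedure, and the demo data and crude index calculation are assumptions.

```python
import numpy as np

def hybrid_ses_forecast(y, period, alpha, horizon):
    """Multiplicative seasonal adjustment + SES, then reseasonalize.

    Seasonal indices here are simple ratio-to-overall-mean averages by
    season -- a crude stand-in for a proper seasonal decomposition.
    """
    y = np.asarray(y, dtype=float)
    seasons = np.arange(len(y)) % period
    idx = np.array([y[seasons == s].mean() for s in range(period)])
    idx = idx / idx.mean()                      # normalize indices to average 1
    adjusted = y / idx[seasons]                 # seasonally adjusted series

    s = adjusted[0]                             # SES on the adjusted series
    for t in range(1, len(adjusted)):
        s = alpha * adjusted[t] + (1 - alpha) * s

    future_seasons = np.arange(len(y), len(y) + horizon) % period
    return s * idx[future_seasons]              # reseasonalized forecasts

# Invented data with a seasonal period of 4, for brevity
y = [80, 120, 150, 90, 85, 126, 158, 95, 90, 132, 165, 99]
print(np.round(hybrid_ses_forecast(y, period=4, alpha=0.3, horizon=4), 1))
```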

Hybrid seasonal models: RW + seasonal adjustment
- Here's the result of fitting the RW-with-drift model with multiplicative seasonal adjustment.
- Note the sharply raised forecasts, driven by the unusual seasonally adjusted value of the last data point.
[Chart: time sequence plot for HousesNSA, random walk with drift, showing actual values, forecasts, and 50.0% limits, 1/83 through 1/08.]

SES + seasonal adjustment
- Here's the result of fitting the SES model with multiplicative seasonal adjustment.
- More conservative (though still raised) forecasts, tighter confidence limits.
[Chart: time sequence plot for HousesNSA, simple exponential smoothing, showing actual values, forecasts, and 50.0% limits, 1/83 through 1/08.]

Brown's LES + seasonal adjustment
- Here's the result of fitting the LES model with multiplicative seasonal adjustment.
- Forecasts march steeply upward, and the confidence limits are rather wide.
[Chart: time sequence plot for HousesNSA, Brown's linear exp. smoothing, showing actual values, forecasts, and 50.0% limits, 1/83 through 1/08.]

Holt's LES + seasonal adjustment
- Here's the result of fitting Holt's model with multiplicative seasonal adjustment.
- Forecasts start from a higher level but with a flatter trend than LES, but the confidence limits are rather optimistic.
[Chart: time sequence plot for HousesNSA, Holt's linear exp. smoothing, showing actual values, forecasts, and 50.0% limits, 1/83 through 1/08.]

Linear trend + seasonal adjustment (?)
- Just for fun, here's a linear trend model with multiplicative seasonal adjustment.
- Obviously not appropriate!
[Chart: time sequence plot for HousesNSA, linear trend, showing actual values, forecasts, and 50.0% limits, 1/83 through 1/08.]

The model comparison report shows that SES and Holt's do the best in the estimation period, although the RW model is slightly luckier in the validation period (the last 4 years of data were held out).

Residual plots for the SES model show stable variance and no significant autocorrelation; the model appears OK.
[Charts: residual plot for adjusted HousesNSA (simple exponential smoothing), 1/83 through 1/03; residual autocorrelations for adjusted HousesNSA (simple exponential smoothing).]

Even the (vertical) probability plot of the residuals looks good.* This is a pane option behind the residual plots.
[Chart: probability plot (proportion vs. residual) for adjusted HousesNSA, simple exponential smoothing.]
*This result validates the use of normal distribution theory to compute the confidence intervals from the forecast standard errors.

What's the best forecast?
- The main issue here is what to infer from the recent jump in seasonally adjusted housing starts.
- Our modeling results do not really answer this question for us; they merely show the consequences of different assumptions we may wish to make.
- Ideally, domain knowledge should shed additional light on the appropriateness of the assumptions.
- The SES model is clearly the most conservative choice, because its forecasts are less radically affected by one recent observation.

Winters Seasonal Smoothing
- The logic of Holt's model can be extended to recursively estimate time-varying seasonal indices as well as level and trend.
- Let L_t, T_t, and S_t denote the estimated level, trend, and seasonal index at period t.
- Let s denote the number of periods in a season.
- Let α, β, and γ denote separate smoothing constants* for level, trend, and seasonality.
*numbers between 0 and 1: smaller values mean more smoothing

Winters model formulas
1. The updated level L_t is an interpolation between the seasonally adjusted value of the most recent data point and the previous forecast of the level:

$$L_t = \alpha \frac{Y_t}{S_{t-s}} + (1-\alpha)(L_{t-1} + T_{t-1})$$

where $Y_t / S_{t-s}$ is the seasonally adjusted value of $Y_t$ and $L_{t-1} + T_{t-1}$ is the forecast of $L_t$ made at period t-1.

Winters model formulas
2. The updated trend T_t is an interpolation between the change in the estimated level and the previous estimate of the trend:

$$T_t = \beta (L_t - L_{t-1}) + (1-\beta) T_{t-1}$$

where $L_t - L_{t-1}$ is the just-observed change in the level and $T_{t-1}$ is the previous trend estimate.

Winters model formulas
3. The updated seasonal index S_t is an interpolation between the ratio of the data point to the estimated level and the previous estimate of the seasonal index:

$$S_t = \gamma \frac{Y_t}{L_t} + (1-\gamma) S_{t-s}$$

where $Y_t / L_t$ is the ratio-to-moving-average of the current data point and $S_{t-s}$ is the last estimate of the seasonal index in the same season.

Winters model formulas
4. The k-step-ahead forecast from period t:

$$\hat{Y}_{t+k} = (L_t + k\,T_t)\, S_{t-s+k}$$

where $L_t + k\,T_t$ is the extrapolation of the level and trend from period t, and $S_{t-s+k}$ is the most recent estimate of the seasonal index for the k-th period in the future.

Estimation issues
- Estimation of the Winters model is tricky, and not all software does it well: sometimes you get crazy results.
- There are three separate smoothing constants to be jointly estimated by nonlinear least squares (α, β, γ).
- Initialization is also tricky, especially for the seasonal indices.
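A minimal Python sketch of the Winters recursions (formulas 1-4 above), using the naive initialization described among the common schemes below (level = first data point, trend = 0, all seasonal indices = 1.0); a hypothetical illustration, not Statgraphics' implementation.

```python
import numpy as np

def winters_forecast(y, s, alpha, beta, gamma, horizon):
    """Multiplicative Winters smoothing with naive initialization.

    L[t] = alpha*(Y[t]/S[t-s]) + (1-alpha)*(L[t-1] + T[t-1])
    T[t] = beta*(L[t] - L[t-1]) + (1-beta)*T[t-1]
    S[t] = gamma*(Y[t]/L[t]) + (1-gamma)*S[t-s]
    Forecast: (L[t] + k*T[t]) * most recent index for that future season.
    """
    y = np.asarray(y, dtype=float)
    level, trend = y[0], 0.0
    seas = [1.0] * s                       # most recent index for each season
    for t in range(len(y)):
        prev_level = level
        level = alpha * (y[t] / seas[t % s]) + (1 - alpha) * (prev_level + trend)
        trend = beta * (level - prev_level) + (1 - beta) * trend
        seas[t % s] = gamma * (y[t] / level) + (1 - gamma) * seas[t % s]
    n = len(y)
    return [(level + k * trend) * seas[(n + k - 1) % s] for k in range(1, horizon + 1)]

# Invented data with a seasonal period of 4
y = [80, 120, 150, 90, 85, 126, 158, 95, 90, 132, 165, 99]
print(np.round(winters_forecast(y, s=4, alpha=0.3, beta=0.1, gamma=0.2, horizon=4), 1))
```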

Estimation issues
Some common initialization schemes:
- Naive approach: set initial level = 1st data point, trend = 0, seasonal indices = 1.0.
- More sophisticated: perform a seasonal decomposition to obtain initial seasonal indices & fit a trend line to obtain the initial trend.
- Even more sophisticated: use backforecasting.
Calculation of confidence intervals is also complicated & not always done correctly.

Winters model fitted to housing starts
- Results of fitting the Winters model: in this case, the Winters forecasts & confidence intervals look similar to those of Holt's model with seasonal adjustment (alpha and beta are very similar, as should be expected).
[Chart: time sequence plot for HousesNSA, Winters exponential smoothing, showing actual values, forecasts, and 50.0% limits, 1/83 through 1/08.]

The model comparison report shows that Winters fits a little less well than the SES or Holt's models, but is otherwise OK.

Winters model in practice
- The Winters model is popular in automatic forecasting software, because it has a little of everything (level, trend, seasonality).
- Sometimes it works well, but difficulties in initialization & estimation can lead to strange results in other cases.
- In principle it is similar to linear exponential smoothing and can produce similarly unstable long-term trend projections.

What really happened in the last 5 years?
[Chart/table: forecasts from the RW+drift, SES, LES, Holt, and Winters models plotted against the actual values by date.]
- All models overpredicted housing starts for the rest of 2002 and 2003, over-responding to the Feb. '02 jump, but later values were in the middle range of the predictions until the recent plunge.

Class 4 recap
- Averaging and smoothing models enable you to estimate time-varying levels and trends.
- SMA, SES, and LES models can be combined with seasonal adjustment to forecast seasonal data (...but beware of changing seasonal patterns and the possibility of overfitting).
- Winters estimates time-varying seasonal indices.
- You need to exercise judgment in model selection in order to make appropriate assumptions about changing levels and trends & unusual events.
