3.3. LOCALLY CONSTANT MEAN MODEL AND SIMPLE EXPONENTIAL SMOOTHING


…lated error terms. (It should be kept in mind, however, that this standard error is a large-sample approximation, which may not be suitable for small n.) We conclude that the constant mean model gives an adequate description of the historic data.

Conditional on the assumption that the structure of the model does not change in the near future, we can compute forecasts for future lumber production. The forecasts from the constant mean model are the same for all forecast lead times l and are given by

\hat{z}_{1976}(l) = \bar{z} = 35,652

The standard error of these forecasts is 2071; a 95 percent prediction interval is [35,652 ± (2.045)(2071)], or [31,417; 39,887]. This prediction interval is quite large, since there is considerable variability in the historic data. If new observations become available, the forecasts are easily updated. Lumber production in 1977, for example, was 37,520 million board feet. Then the revised forecast is the updated sample mean,

\hat{z}_{1977}(l) = 35,652 + (1/31)[37,520 - 35,652] = 35,712

3.3. LOCALLY CONSTANT MEAN MODEL AND SIMPLE EXPONENTIAL SMOOTHING

The model in Equation (3.1) assumes that the mean is constant over all time periods. As a consequence, each observation carries the same weight in the forecast computations. In many instances, however, the assumption of a time-constant mean is restrictive, and it would be more reasonable to allow for a mean that moves slowly over time. Heuristically, in such a case it would be reasonable to give more weight to the most recent observations and less to observations in the distant past. If one chooses weights that decrease geometrically with the age of the observations, the forecast of the future observation at time n + l can be calculated from

\hat{z}_n(l) = c \sum_{j=0}^{n-1} w^j z_{n-j} = c[z_n + w z_{n-1} + ... + w^{n-1} z_1]    (3.8)

The constant w (|w| < 1) is a discount coefficient. This coefficient, which

should depend on how fast the mean level changes, is usually chosen between .7 and .95; in many applications a value of .9 is suggested [see Brown (1962)]. The factor c = (1 - w)/(1 - w^n) is needed to normalize the sum of the weights to 1. Since

\sum_{j=0}^{n-1} w^j = (1 - w^n)/(1 - w)

it follows that

c \sum_{j=0}^{n-1} w^j = 1

If n is large, then the term w^n in the normalizing constant c goes to zero, and the exponentially weighted forecasts can be written as

\hat{z}_n(l) = (1 - w) \sum_{j \ge 0} w^j z_{n-j} = (1 - w)[z_n + w z_{n-1} + w^2 z_{n-2} + ...]    (3.9)

The forecasts are the same for all lead times l. The coefficient \alpha = 1 - w is called the smoothing constant and is usually chosen between .05 and .30. The expression

S_n = S_n^{[1]} = (1 - w)[z_n + w z_{n-1} + w^2 z_{n-2} + ...] = \alpha[z_n + (1 - \alpha) z_{n-1} + (1 - \alpha)^2 z_{n-2} + ...]    (3.10)

is called the smoothed statistic or the smoothed value. The last available smoothed statistic S_n serves as forecast for all future observations, \hat{z}_n(l) = S_n. Since it is an exponentially weighted average of previous observations, this method is called simple exponential smoothing.

Updating Forecasts

The forecast in (3.9), or equivalently the smoothed statistic in (3.10), can be updated in several alternative ways. By simple substitution it can be shown that

S_n = (1 - w) z_n + w S_{n-1},   or equivalently   \hat{z}_n(1) = (1 - w) z_n + w \hat{z}_{n-1}(1)    (3.11)
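The smoothing recursion (3.11) and the resulting forecast are simple to program. The sketch below is our own illustration (function names and data are hypothetical), not part of the text:

```python
def simple_exp_smooth(z, alpha, s0):
    """Simple exponential smoothing, Eq. (3.11):
    S_t = alpha * z_t + (1 - alpha) * S_{t-1}, with discount w = 1 - alpha.
    Returns the smoothed statistics S_1, ..., S_n."""
    s, out = s0, []
    for zt in z:
        s = alpha * zt + (1 - alpha) * s   # new smoothed statistic
        out.append(s)
    return out

# S_0 is taken as the arithmetic mean of the historical data;
# the last smoothed statistic S_n forecasts all future observations.
z = [10.0, 12.0, 11.0, 13.0]
s = simple_exp_smooth(z, alpha=0.1, s0=sum(z) / len(z))
forecast = s[-1]
```

Note that only the previous smoothed statistic and the new observation enter each update, which is exactly what makes the method attractive when many series must be forecast.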

or

S_n = S_{n-1} + (1 - w)[z_n - S_{n-1}],   or equivalently   \hat{z}_n(1) = \hat{z}_{n-1}(1) + (1 - w)[z_n - \hat{z}_{n-1}(1)]    (3.12)

Expressions (3.11) and (3.12) show how the forecasts can be updated after a new observation has become available. Equation (3.11) expresses the new forecast as a combination of the old forecast and the most recent observation. If w is small, more weight is given to the last observation and the information from previous periods is heavily discounted. If w is close to 1, a new observation will change the old forecast only very little. Equation (3.12) expresses the new forecast as the previous forecast corrected by a fraction \alpha = 1 - w of the last forecast error.

Actual Implementation of Simple Exponential Smoothing

The recurrence equation in (3.11) can be used to update the smoothed statistic at any time period t. In practice, one starts the recursion with the first observation z_1 and calculates S_1 = (1 - w)z_1 + wS_0. This is then substituted into (3.11) to calculate S_2 = (1 - w)z_2 + wS_1, and the smoothing is continued until S_n is reached. This procedure is somewhat simpler than the one in (3.8) and is usually adopted in practice. To carry out these operations we need to know (1) a starting value S_0, and (2) a smoothing constant \alpha = 1 - w.

Initial Value for S_0

Through repeated application of Equation (3.11), it can be shown that

S_n = (1 - w)[z_n + w z_{n-1} + ... + w^{n-1} z_1] + w^n S_0

Thus the influence of S_0 on S_n is negligible, provided n is moderately large and w smaller than 1. For example, if n = 30 and w = .9 (or \alpha = .1), the weight given to the initial smoothed value S_0 (i.e., w^n = .042) is very small compared with the combined weight that is given to z_1, ..., z_30 (i.e., 1 - w^n = .958). We take the simple arithmetic average of the available historical data (z_1, z_2, ..., z_n) as the initial estimate of S_0. Such a choice has also been suggested by Brown (1962) and Montgomery and Johnson (1976).
The arithmetic average will perform well, provided that the mean level changes only slowly (small \alpha, or w close to 1).

Alternative solutions to choosing the initial value S_0 have been suggested in the literature. Makridakis and Wheelwright (1978), for example, use the first observation as initial smoothed statistic; S_0 = z_1, which implies that S_1 = (1 - w)z_1 + wS_0 = z_1. Such a choice will be preferable if the level changes rapidly (\alpha close to 1, or w close to 0). Another slightly different solution to the initialization in exponential smoothing is to choose S_0 as the backforecast value of z_0. This is achieved by reversing the time order and smoothing backwards; S_t^* = (1 - w)z_t + wS_{t+1}^*, where S_{n+1}^* = z_n. The backforecast of z_0 can then be used as initial value S_0.

Choice of the Smoothing Constant

The smoothing constant \alpha = 1 - w determines the extent to which past observations influence the forecast. A small \alpha results in a slow response to changes in the level; a large \alpha results in a rapid response, which, however, will also make the forecast respond to irregular movements in the time series. The smoothing constant \alpha is frequently determined by simulation. Forecasts are generated for various values of \alpha (usually over the range .05 to .30) and are then compared with the actual observations z_1, z_2, ..., z_n. For each \alpha, the one-step-ahead forecast errors

e_{t-1}(1) = z_t - \hat{z}_{t-1}(1)

and the sum of the squared one-step-ahead forecast errors

SSE(\alpha) = \sum_{t=1}^{n} e_{t-1}^2(1)

are calculated. The smoothing constant that minimizes the sum of the squared forecast errors is then used in the derivation of future forecasts. The notation e_{t-1}(1) expresses the fact that it is the one-step-ahead forecast error of the forecast that is calculated from past data up to and including time t - 1. In general, e_t(l) = z_{t+l} - \hat{z}_t(l) is the l-step-ahead forecast error corresponding to the l-step-ahead forecast made at time t. The smoothing constant that is obtained by simulation depends on the value of S_0.
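This simulation is straightforward to program. The sketch below (our own, with hypothetical data) evaluates SSE(\alpha) over a grid of smoothing constants and returns the minimizer, using the sample mean as starting value S_0:

```python
def sse_one_step(z, alpha, s0):
    """SSE(alpha) = sum of squared one-step-ahead forecast errors
    e_{t-1}(1) = z_t - S_{t-1}, with S_t = alpha*z_t + (1-alpha)*S_{t-1}."""
    s, sse = s0, 0.0
    for zt in z:
        e = zt - s                        # one-step-ahead forecast error
        sse += e * e
        s = alpha * zt + (1 - alpha) * s  # update the smoothed statistic
    return sse

def best_alpha(z, grid):
    """Smoothing constant on the grid that minimizes SSE(alpha)."""
    s0 = sum(z) / len(z)                  # arithmetic mean as starting value S_0
    return min(grid, key=lambda a: sse_one_step(z, a, s0))

grid = [a / 100 for a in range(5, 31)]    # alpha = .05, .06, ..., .30
z = [1.2, 0.8, 1.1, 1.5, 0.9, 1.3, 1.0, 1.4]
a_star = best_alpha(z, grid)
```

Because SSE(\alpha) depends on S_0, the same starting value should be used for every candidate \alpha, as in the sketch.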
A poorly chosen starting value S_0 will lead to an increase in the smoothing constant. Ideally, since the choice of \alpha depends on S_0, one should choose \alpha and S_0 jointly. For example, if \alpha = 0, one should choose the sample mean as starting value; if \alpha = 1, one should choose S_0 = z_1. If

0 < \alpha < 1, one could choose S_0 as the "backforecast" value,

S_0 = \alpha[z_1 + (1 - \alpha)z_2 + ... + (1 - \alpha)^{n-2} z_{n-1}] + (1 - \alpha)^{n-1} z_n

Further discussion of this point can be found in Ledolter and Abraham (1983). Once the smoothing constant has been determined from past data, the forecasts are easily updated as each new observation becomes available. It is usually assumed that the smoothing constant \alpha stays fixed over time. Thus it is not necessary to reestimate the value of \alpha as each new observation becomes available. How to detect whether the smoothing constant (or, in general, any parameter of a model) has changed is discussed in Chapter 8. There we discuss tracking signals and also adaptive smoothing methods, in which the smoothing parameter adapts itself to changes in the underlying time series.

Additional Comments and Example

Forecasts from simple exponential smoothing were introduced without reference to a particular model. In Chapter 7 we discuss the model under which these forecasts are minimum mean square error (MMSE) forecasts. There it will be shown that simple exponential smoothing leads to optimal (MMSE) forecasts if the mean \beta_t in z_t = \beta_t + \epsilon_t changes according to a random walk model \beta_t = \beta_{t-1} + a_t or, equivalently, if the observations are generated from a particular time series model. The main reason for the widespread use of simple exponential smoothing comes from the updating equations (3.11), since they make the calculation of new forecasts computationally very convenient. Only the previous forecast and the most recent observation have to be stored when updating the forecast. This is especially important if a large number of different items have to be predicted. Another reason exponential smoothing techniques have received broad attention in the business literature is that they are fully automatic.
Once a computer program has been written and a discount coefficient w (or equivalently a smoothing constant \alpha = 1 - w) has been chosen, forecasts for any time series can be derived without manual intervention of the forecaster. The fact that they are fully automatic has been put forward as an advantage of the scheme. However, it can equally well be argued that this is a great disadvantage, since every time series is treated identically. Thus it is very important to perform diagnostic checks to see whether this forecasting technique is in fact adequate.

Example 3.2: Quarterly Iowa Nonfarm Income

As an example we consider the quarterly Iowa nonfarm income for 1948 to 1979. The 128 observations are listed in series 2 of the Data Appendix; a plot of the data is given in Figure 3.2. The data exhibit exponential growth, a pattern typical of many economic series of this time period. Instead of analyzing and forecasting the original series, we first model the quarterly growth rates of nonfarm income. The quarterly growth rates or percentage changes

z_t = 100 (I_{t+1} - I_t)/I_t,   t = 1, 2, ..., 127

are also given in the Data Appendix and are plotted in Figure 3.3. Analyzing percentage changes is approximately equivalent to analyzing successive differences of logarithmically transformed data, since

log I_{t+1} - log I_t = log(I_{t+1}/I_t) = log(1 + z_t^*) \approx z_t^*   for small z_t^* = z_t/100

The plot of the growth rates indicates that the mean level is not constant but changes slowly over time. Especially in the middle part of the series

[Figure 3.2. Iowa nonfarm income, first quarter 1948 to fourth quarter 1979]

[Figure 3.3. Growth rates of Iowa nonfarm income, second quarter 1948 to fourth quarter 1979]

(during the 1960s), the mean level increases toward a slightly higher new level. However, it does not appear to grow as a linear function of time, nor do we expect such a pattern in the near future. The constant mean model would be clearly inappropriate. This can also be seen from the sample autocorrelations of the growth rates, which are given in Table 3.2. Compared with their standard error 1/\sqrt{127} = .089, most autocorrelations are significantly different from zero.

Since the mean is slowly changing, simple exponential smoothing appears to be an appropriate method. Smoothing constants somewhere between .05 and .30 are usually recommended. To illustrate the calculations we use a smoothing constant \alpha = .11. This particular choice will be justified in the next section. The average of all 127 observations, \bar{z} = 1.829, is used as initial smoothed statistic S_0. We could also have chosen the average of a subset of the observations [let's say the first 6 or the first 10 observations; see Bowerman and O'Connell (1979)]. The value S_0 = 1.829 is used to predict the first observation. With z_1 = .50, the first one-step-ahead forecast error is given by e_0(1) = z_1 - S_0 = .50 - 1.829 = -1.329. The updated smoothed statistic S_1 = \alpha z_1 + (1 - \alpha)S_0 = (.11)(.50) + (.89)(1.829) = 1.683 is used to predict z_2. With z_2 = 2.65, the one-step-ahead forecast error is e_1(1) = z_2 - S_1 = 2.65 - 1.683 = .967. The next smoothed statistic S_2 = \alpha z_2 + (1 - \alpha)S_1 = (.11)(2.65) + (.89)(1.683) = 1.789 is the prediction for z_3; since z_3 = .97, the one-step-ahead forecast error is e_2(1) = z_3 - S_2 = -.819. From z_3 we can calculate S_3 = 1.699; e_3(1) = .701; and so on. The observations z_t, the smoothed statistics S_t, and the one-step-ahead forecast errors are given in Table 3.3. Through repeated updating, S_t = (.11)z_t + (.89)S_{t-1}, we eventually find that S_127 = (.11)(2.35) + (.89)(2.692) = 2.654. The last smoothed statistic is then used to predict all future growth rates, \hat{z}_{127}(l) = 2.654. This implies that our prediction of the Iowa nonfarm income for the first quarter of 1980 is given by \hat{I}_{127}(1) = (1.02654)(5965) = 6123, for the second quarter by \hat{I}_{127}(2) = (1.02654)^2(5965) = 6286, and in general \hat{I}_{127}(l) = (1.02654)^l(5965).

Table 3.2. Sample Autocorrelations r_k of Growth Rates of Iowa Nonfarm Income (n = 127)

Lag k     1     2     3     4
r_k      .18   .35   .18   .22

Table 3.3. Simple Exponential Smoothing -- Growth Rates of Iowa Nonfarm Income: observations z_t, smoothed statistics S_t, and one-step-ahead forecast errors e_{t-1}(1) = z_t - S_{t-1}, for \alpha = .11 and \alpha = .40
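The first few updating steps quoted above are easy to verify directly. This check (our own) uses the values S_0 = 1.829, z_1 = .50, and z_2 = 2.65 given in the text:

```python
alpha = 0.11
s0 = 1.829                            # average of all 127 growth rates
z1, z2 = 0.50, 2.65

e0 = z1 - s0                          # e_0(1) = .50 - 1.829 = -1.329
s1 = alpha * z1 + (1 - alpha) * s0    # (.11)(.50) + (.89)(1.829), about 1.683
e1 = z2 - s1                          # about .967
s2 = alpha * z2 + (1 - alpha) * s1    # about 1.789

# Income forecast for the first quarter of 1980, using the final
# smoothed growth rate 2.654 and the last income value 5965:
income_forecast = (1 + 2.654 / 100) * 5965   # about 6123
```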

In our example the sum of the squared one-step-ahead forecast errors for the smoothing constant \alpha = .11 is given by

SSE(.11) = \sum_{t=1}^{n} e_{t-1}^2(1) = (-1.329)^2 + (.967)^2 + ... + (.458)^2 + (-.342)^2

For illustration we have also calculated the smoothed statistics S_t and the one-step-ahead forecast errors e_{t-1}(1) = z_t - S_{t-1} for the smoothing constant \alpha = .40. The results are given in the last two columns of Table 3.3. There it is found that

SSE(.40) = (-1.329)^2 + (1.353)^2 + ... + (.464)^2 + (-.471)^2

Changing the smoothing constant from .01 to .30 in increments of .01 leads to the sums of squared one-step-ahead forecast errors in Table 3.4; they are plotted in Figure 3.4. The minimum is achieved for \alpha = .11, which explains our previous choice of \alpha.

Before the model is used for forecasting, we must investigate whether it gives an adequate description of the historical data. If the forecast model is appropriate, then the one-step-ahead forecast errors should be uncorrelated. Correlation among the one-step-ahead forecast errors would imply that the current forecast error could be used to improve the next forecast. In such a case, we would incorporate this information into the next forecast and would thus use a different forecast model.

Table 3.4. Sums of Squared One-Step-Ahead Forecast Errors SSE(\alpha) for Different Values of \alpha; Simple Exponential Smoothing -- Growth Rates of Iowa Nonfarm Income

[Figure 3.4. Plot of SSE(\alpha), for simple exponential smoothing -- growth rates of Iowa nonfarm income]

As a diagnostic check, we calculate the sample autocorrelations of the one-step-ahead forecast errors,

r_k = \sum_{t=k}^{n-1} [e_t(1) - \bar{e}][e_{t-k}(1) - \bar{e}] / \sum_{t=0}^{n-1} [e_t(1) - \bar{e}]^2

If the forecast errors are uncorrelated, the sample autocorrelations should vary around mean zero with standard error 1/n^{1/2}. The sample autocorrelations of the one-step-ahead forecast errors are given in Table 3.5. The autocorrelations of the forecast errors, when using the optimal smoothing constant .11, are compared with those for a smoothing constant of .40. Comparing the autocorrelations in the first column (\alpha = .11) with their standard error 1/\sqrt{127} = .089, we find that they are within ±2 standard errors. For this smoothing constant, successive one-step-ahead forecast errors are uncorrelated. On the other hand, one-step-ahead forecast errors for exponential smoothing with \alpha = .40 are correlated. The significant lag 1 correlation indicates that forecast errors one step apart are still correlated. This shows that for this data set the smoothing constant \alpha = .40 is inappropriate.

In addition to checking for correlation among the forecast errors, one should also check for possible bias in the forecasts. A mean of the forecast

errors that is significantly larger (smaller) than zero indicates that the forecast procedure underpredicts (overpredicts) the observations. To assess the significance of the mean of the forecast errors, we compare it with its standard error s/n^{1/2}, where

s^2 = (1/(n - 1)) \sum_{t=0}^{n-1} [e_t(1) - \bar{e}]^2

Means and standard errors of the one-step-ahead forecast errors are also given in Table 3.5. Since the mean .059 lies within ±2 standard errors, we conclude that the mean of the forecast errors is not significantly different from zero. Thus, exponential smoothing with a smoothing constant \alpha = .11 leads to unbiased forecasts.

Table 3.5. Means, Standard Errors, and Sample Autocorrelations of the One-Step-Ahead Forecast Errors from Exponential Smoothing (with \alpha = .11 and \alpha = .40) -- Growth Rates of Iowa Nonfarm Income

3.4. REGRESSION MODELS WITH TIME AS INDEPENDENT VARIABLE

We now extend the constant mean model to the class of regression models in which certain functions of time are taken as independent variables. We consider models of the form

z_{n+j} = \sum_{i=1}^{m} \beta_i f_i(j) + \epsilon_{n+j} = \beta'f(j) + \epsilon_{n+j}    (3.13)

where \beta = (\beta_1, \beta_2, ..., \beta_m)' is a vector of parameters and f(j) = [f_1(j), ..., f_m(j)]' is a vector of specified fitting or forecast functions; {\epsilon_{n+j}; j = 0, ±1, ±2, ...} is a sequence of independent N(0, \sigma^2) random variables. Furthermore, we assume that the fitting functions f(j) satisfy the difference equation

f(j + 1) = L f(j)    (3.14)

where L is an m × m nonsingular fixed transition matrix. For example, exponentials, polynomials, sines, cosines, and linear combinations of these functions can be shown to satisfy these equations.

In Equation (3.13) the m fitting or forecast functions f_i(j) (i = 1, ..., m) are defined relative to time origin n. Equivalently, we could express the model as an ordinary regression in which the forecast functions are defined in relation to the time origin zero:

z_{n+j} = \beta^{*'} f(n + j) + \epsilon_{n+j}    (3.15)

Since we assume that the fitting functions satisfy (3.14), we can write f(n + j) = L f(n + j - 1) = ... = L^n f(j) and express (3.15) as

z_{n+j} = \beta^{*'} L^n f(j) + \epsilon_{n+j}

The coefficients in the representation (3.13) are then given by \beta = (L^n)'\beta^*. The parameterizations in Equations (3.13) and (3.15) lead to equivalent representations. However, when updating the parameter estimates and forecasts, the parameterization (3.13) turns out to be more convenient.

Examples

To familiarize ourselves with this class of forecast models, we consider several important special cases. To be consistent with the regression notation in Chapter 2, we use \beta_0 as the parameter corresponding to a constant fitting function. This amounts to a relabeling of the parameters in model (3.13).

1. Constant mean model:

z_{n+j} = \beta_0 + \epsilon_{n+j}

This model is obtained by choosing a single constant fitting function f_1(j) = 1. In this case, L = f(0) = 1.

2. Linear trend model:

z_{n+j} = \beta_0 + \beta_1 j + \epsilon_{n+j}

Here the two fitting functions are f_1(j) = 1 and f_2(j) = j. The transition matrix in (3.14) is given by

L = [1 0; 1 1]   and   f(0) = [1; 0]

3. Quadratic trend model:

z_{n+j} = \beta_0 + \beta_1 j + \beta_2 (j^2/2) + \epsilon_{n+j}

In this case we have three fitting functions f_1(j) = 1, f_2(j) = j, and f_3(j) = j^2/2. Choosing j^2/2 instead of the quadratic term j^2 simplifies the difference equation in (3.14). The transition matrix and initial vector are

L = [1 0 0; 1 1 0; 1/2 1 1]   and   f(0) = [1; 0; 0]

4. General kth-order polynomial model:

z_{n+j} = \sum_{i=0}^{k} \beta_i (j^i/i!) + \epsilon_{n+j}

This is obtained by specifying k + 1 fitting functions

f_{i+1}(j) = j^i/i!,   i = 0, 1, ..., k

The transition matrix and the initial vector f(0) are given by

L = [1; 1 1; 1/2 1 1; ...; 1/k! 1/(k-1)! ... 1]   and   f(0) = [1; 0; ...; 0]

L is a lower triangular matrix with elements l_{ij} = 1/(i - j)! for i ≥ j.

5. 12-point sinusoidal model:

z_{n+j} = \beta_0 + \beta_1 sin((2\pi/12) j) + \beta_2 cos((2\pi/12) j) + \epsilon_{n+j}

In this case the fitting functions are f_1(j) = 1, f_2(j) = sin((2\pi/12) j), and f_3(j) = cos((2\pi/12) j). Furthermore,

L = [1 0 0; 0 \sqrt{3}/2 1/2; 0 -1/2 \sqrt{3}/2]   and   f(0) = [1; 0; 1]

The sinusoidal model can be used to represent seasonal time series. Additional seasonal models are introduced in Chapter 4.

Forecasts

If the parameters \beta in model (3.13) are known, the forecast of a future observation at time n + l is given by

\hat{z}_n(l) = \beta'f(l) = \sum_{i=1}^{m} \beta_i f_i(l)    (3.16)
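The transition property (3.14) and the forecast formula (3.16) can be checked numerically; the small sketch below (our own) does so for the linear trend model, where f(j) = (1, j)':

```python
# Linear trend model: fitting functions f(j) = (1, j)'.
L = [[1, 0],
     [1, 1]]          # transition matrix, Eq. (3.14)

def f(j):
    return [1, j]

def matvec(M, v):
    """Matrix-vector product for small lists."""
    return [sum(M[i][k] * v[k] for k in range(len(v))) for i in range(len(M))]

# The difference equation f(j + 1) = L f(j) holds for every j:
for j in range(-5, 6):
    assert matvec(L, f(j)) == f(j + 1)

# Forecast (3.16) with known coefficients beta = (beta_0, beta_1)':
beta = [2.0, 0.5]
forecasts = [sum(b * fi for b, fi in zip(beta, f(l))) for l in (1, 2, 3)]
```

The same check applies to the polynomial and sinusoidal transition matrices listed above, with the corresponding L and f.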

The forecast is unbiased, with mean square error \sigma^2. A 100(1 - \lambda) percent prediction interval for the future observation z_{n+l} is given by

[\hat{z}_n(l) - u_{\lambda/2} \sigma; \hat{z}_n(l) + u_{\lambda/2} \sigma]

where u_{\lambda/2} is the 100(1 - \lambda/2) percentage point of the standard normal distribution. If the coefficients \beta in (3.13) are unknown, we have to estimate them from past data z_1, z_2, ..., z_n. The least squares estimates minimize

\sum_{j=0}^{n-1} [z_{n-j} - \beta'f(-j)]^2

In the general least squares formulation of Section 2.3,

X' = [f(-n+1), ..., f(0)]

X'X = f(0)f'(0) + ... + f(-n+1)f'(-n+1) = \sum_{j=0}^{n-1} f(-j)f'(-j)

and

X'y = \sum_{j=0}^{n-1} f(-j) z_{n-j}

The least squares estimates are given by

\hat{\beta}_n = F_n^{-1} h_n    (3.18)

where

F_n = \sum_{j=0}^{n-1} f(-j)f'(-j)   and   h_n = \sum_{j=0}^{n-1} f(-j) z_{n-j}

The subscript n in the above notation expresses the fact that the estimates are calculated from observations up to and including time n. Substituting these estimates into Equation (3.16), we forecast a future observation z_{n+l} from

\hat{z}_n(l) = f'(l) \hat{\beta}_n    (3.19)

It follows from results in Section 2.6 that \hat{z}_n(l) is an unbiased forecast. The variance of the forecast error e_n(l) = z_{n+l} - \hat{z}_n(l) is given by

V[e_n(l)] = \sigma^2 [1 + f'(l) F_n^{-1} f(l)]    (3.20)

The variance \sigma^2 can be estimated from

\hat{\sigma}^2 = (1/(n - m)) \sum_{j=0}^{n-1} [z_{n-j} - \hat{\beta}_n' f(-j)]^2    (3.21)

and a 100(1 - \lambda) percent prediction interval for a future realization is given by

\hat{z}_n(l) ± t_{\lambda/2}(n - m) \hat{\sigma} [1 + f'(l) F_n^{-1} f(l)]^{1/2}    (3.22)

Updating Parameter Estimates and Forecasts

If an additional observation z_{n+1} becomes available, the estimate \hat{\beta}_{n+1} can be written as

\hat{\beta}_{n+1} = F_{n+1}^{-1} h_{n+1}    (3.23)

where

F_{n+1} = \sum_{j=0}^{n} f(-j)f'(-j) = F_n + f(-n)f'(-n)

and

h_{n+1} = \sum_{j=0}^{n} f(-j) z_{n+1-j} = f(0) z_{n+1} + L^{-1} h_n

In the above derivation we used the fact that f(-j) = L f(-j-1), or equivalently f(-j-1) = L^{-1} f(-j); see (3.14). The new forecast is given by

\hat{z}_{n+1}(l) = f'(l) \hat{\beta}_{n+1}

The result in (3.23) can be used to update the parameter estimate \hat{\beta}_{n+1} recursively. For the recursive calculation, only F_n and h_n have to be stored.
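The recursions for F_{n+1} and h_{n+1} can be verified against the direct least squares computation. This sketch is our own (linear trend fitting functions, hypothetical data):

```python
import numpy as np

L = np.array([[1.0, 0.0], [1.0, 1.0]])   # transition matrix for f(j) = (1, j)'
Linv = np.linalg.inv(L)
f = lambda j: np.array([1.0, float(j)])

z = np.array([2.1, 2.4, 2.2, 2.8, 3.0, 3.1])
n = len(z)

# Direct computation relative to time origin n:
F = sum(np.outer(f(-j), f(-j)) for j in range(n))
h = sum(f(-j) * z[n - 1 - j] for j in range(n))
beta_direct = np.linalg.solve(F, h)

# Recursive computation, Eq. (3.23):
#   F_{m+1} = F_m + f(-m) f'(-m)
#   h_{m+1} = f(0) z_{m+1} + L^{-1} h_m
F_rec, h_rec = np.zeros((2, 2)), np.zeros(2)
for m, zt in enumerate(z):
    F_rec += np.outer(f(-m), f(-m))
    h_rec = f(0) * zt + Linv @ h_rec
beta_rec = np.linalg.solve(F_rec, h_rec)
```

Both routes give the same estimate; between updates only F and h need to be stored, and the coefficient vector is recomputed on demand.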

3.5. DISCOUNTED LEAST SQUARES AND GENERAL EXPONENTIAL SMOOTHING

In model (3.13) we have assumed that the parameters \beta are constant. Estimating the coefficients by ordinary least squares implies that each observation (recent or past) has the same importance in determining the estimates. If we assume that the model in (3.13) is only locally constant and if we wish to guard against possible parameter changes, we could give more weight to recent observations and discount past observations. This reasoning motivated Brown (1962) to consider a discount factor in the least squares criterion. In discounted least squares or general exponential smoothing, the parameter estimates are determined by minimizing

\sum_{j=0}^{n-1} w^j [z_{n-j} - f'(-j)\beta]^2    (3.24)

The constant w (|w| < 1) is a discount factor that discounts past observations exponentially. Brown suggests choosing w such that .70 < w^m < .95, where m is the number of estimated parameters in the model. It should be emphasized that at this point no model for possible changes in the coefficients is assumed, and exponential discounting is introduced as an ad hoc procedure. In a later chapter (Chap. 7) we will show that this procedure leads to minimum mean square error forecasts, but only if the underlying process generating the data is of a particular form.

Weighted least squares procedures can be used to find the solution to the minimization problem in (3.24). In the notation of Section 2.13 we can write the model as y = X\beta + \epsilon, where y' = (z_1, ..., z_{n-1}, z_n); X' = [f(-n+1), ..., f(-1), f(0)]; and V(\epsilon) = \sigma^2 \Omega^{-1}, where \Omega = diag(w^{n-1}, ..., w, 1) and

X'\Omega X = \sum_{j=0}^{n-1} w^j f(-j)f'(-j)

The estimate of \beta is thus given by

\hat{\beta}_n = F_n^{-1} h_n    (3.25)

where now

F_n = \sum_{j=0}^{n-1} w^j f(-j)f'(-j)   and   h_n = \sum_{j=0}^{n-1} w^j f(-j) z_{n-j}

The forecasts are given by

\hat{z}_n(l) = f'(l) \hat{\beta}_n    (3.26)

Updating Parameter Estimates and Forecasts

Using the observations up to period n + 1, the estimated coefficients \hat{\beta}_{n+1} are given by \hat{\beta}_{n+1} = F_{n+1}^{-1} h_{n+1}, where

h_{n+1} = \sum_{j=0}^{n} w^j f(-j) z_{n+1-j} = f(0) z_{n+1} + w \sum_{j=1}^{n} w^{j-1} L^{-1} f(-j+1) z_{n+1-j} = f(0) z_{n+1} + w L^{-1} h_n    (3.27)

In the derivation of (3.27) we have used the fact that f(j + 1) = L f(j). Furthermore,

F_{n+1} = \sum_{j=0}^{n} w^j f(-j)f'(-j) = w^n f(-n)f'(-n) + F_n

For polynomial and sinusoidal fitting functions, the elements of w^n f(-n)f'(-n) go to zero as n -> \infty, since w^n tends to zero faster than f(-n) can grow. For example, in the linear trend model with f_1(j) = 1 and f_2(j) = j, the elements of the matrix

w^n f(-n)f'(-n) = w^n [1 -n; -n n^2]

go to zero provided |w| < 1. For a decreasing exponential fitting function f(j) = \phi^j (|\phi| < 1), we have to assume that |w| < \phi^2; in this case, w^n f(-n)f'(-n) = (w\phi^{-2})^n goes to zero.

Under the above assumption, the elements of the matrix w^n f(-n)f'(-n) go to zero, and F_{n+1} reaches a limit (or steady state) that we denote by F:

lim_{n -> \infty} F_{n+1} = F = \sum_{j \ge 0} w^j f(-j)f'(-j)    (3.28)

Then the steady-state solution can be written as

\hat{\beta}_{n+1} = F^{-1} h_{n+1} = F^{-1}[f(0) z_{n+1} + w L^{-1} h_n] = F^{-1} f(0) z_{n+1} + w F^{-1} L^{-1} F \hat{\beta}_n

Furthermore, since F = f(0)f'(0) + w L^{-1} F (L^{-1})', we have w L^{-1} F = [F - f(0)f'(0)] L', and therefore

w F^{-1} L^{-1} F = F^{-1}[F - f(0)f'(0)] L' = L' - F^{-1} f(0)f'(0) L' = L' - F^{-1} f(0) f'(1)

where the last step uses f'(0)L' = [L f(0)]' = f'(1). Thus

\hat{\beta}_{n+1} = F^{-1} f(0) z_{n+1} + [L' - F^{-1} f(0) f'(1)] \hat{\beta}_n = L'\hat{\beta}_n + F^{-1} f(0)[z_{n+1} - f'(1)\hat{\beta}_n]    (3.29)

The new parameter estimate in Equation (3.29) is a linear combination of the previous parameter estimate and the one-step-ahead forecast error. Since L, F, and f(0) are fixed for each model and each discount coefficient w, only the last parameter estimate and the last forecast error have to be stored for parameter and forecast updating.
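The steady-state update (3.29) can be checked against the direct discounted least squares estimate (3.25). The sketch below is our own, for the linear trend model with hypothetical data; the infinite sum for F is truncated at a large horizon:

```python
import numpy as np

w = 0.9
L = np.array([[1.0, 0.0], [1.0, 1.0]])
f = lambda j: np.array([1.0, float(j)])

# Steady-state F of Eq. (3.28), truncated:
F = sum(w**j * np.outer(f(-j), f(-j)) for j in range(2000))
gain = np.linalg.solve(F, f(0))          # F^{-1} f(0); here (1 - w^2, (1 - w)^2)'

rng = np.random.default_rng(1)
z = np.cumsum(rng.normal(0.1, 0.2, size=400))   # a slowly trending series

# Recursive update (3.29), started from beta = 0 (i.e., an all-zero past):
beta = np.zeros(2)
for zt in z:
    err = zt - f(1) @ beta               # one-step-ahead forecast error
    beta = L.T @ beta + gain * err

# Direct discounted least squares, Eqs. (3.25)-(3.27):
n = len(z)
Fd = sum(w**j * np.outer(f(-j), f(-j)) for j in range(n))
hd = sum(w**j * f(-j) * z[n - 1 - j] for j in range(n))
beta_direct = np.linalg.solve(Fd, hd)
```

After enough observations the discounting makes the starting value irrelevant, and the recursion reproduces the direct estimate to numerical precision.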

We now consider several special cases. One special case, the locally constant mean model or simple exponential smoothing, was discussed previously (Sec. 3.3). In terms of the general notation (3.13), the mean model is described by a single constant fitting function, f_1(j) = 1. It follows from Equation (3.25) that for this special case

\hat{z}_n(l) = \hat{\beta}_n = ((1 - w)/(1 - w^n)) \sum_{j=0}^{n-1} w^j z_{n-j}    (3.30)

For fixed n, as w -> 1, we find from L'Hospital's rule on the evaluation of limits that Equation (3.30) reduces to

\hat{z}_n(l) = (1/n) \sum_{j=0}^{n-1} z_{n-j}

This is the minimum mean square error forecast for the globally constant mean model (see Sec. 3.2). In the steady-state case (n -> \infty) and for fixed w, the forecast is given by

\hat{z}_n(l) = (1 - w) \sum_{j \ge 0} w^j z_{n-j} = S_n    (3.31)

This corresponds to simple exponential smoothing; the forecast of a future value is given by the last smoothed statistic S_n. Equation (3.29) simplifies in this special case to

S_{n+1} = S_n + (1 - w)[z_{n+1} - S_n]    (3.32)

Other special cases of discounted least squares are discussed in the next two sections. In Section 3.6 we discuss the locally constant trend model or double exponential smoothing. In Section 3.7 we discuss triple exponential smoothing.

3.6. LOCALLY CONSTANT LINEAR TREND MODEL AND DOUBLE EXPONENTIAL SMOOTHING

The linear trend model z_{n+j} = \beta_0 + \beta_1 j + \epsilon_{n+j} is described by m = 2 fitting functions f_1(j) = 1 and f_2(j) = j, transition matrix

L = [1 0; 1 1]

and initial vector f(0) = [1; 0]. The expressions in Equation (3.25) simplify to

F_n = \sum_{j=0}^{n-1} w^j [1 -j; -j j^2]   and   h_n = \sum_{j=0}^{n-1} w^j [z_{n-j}; -j z_{n-j}]

Furthermore, since

lim_{n -> \infty} \sum_{j=0}^{n-1} j w^j = w/(1 - w)^2   and   lim_{n -> \infty} \sum_{j=0}^{n-1} j^2 w^j = w(1 + w)/(1 - w)^3

the steady-state value of F_n is given by

F = [1/(1 - w)   -w/(1 - w)^2;   -w/(1 - w)^2   w(1 + w)/(1 - w)^3]

with inverse

F^{-1} = [1 - w^2   (1 - w)^2;   (1 - w)^2   (1 - w)^3/w]

Thus, for large n, the estimates \hat{\beta}_n = (\hat{\beta}_{0,n}, \hat{\beta}_{1,n})' = F^{-1} h_n are given by

\hat{\beta}_{0,n} = (1 - w^2) \sum w^j z_{n-j} - (1 - w)^2 \sum j w^j z_{n-j}

\hat{\beta}_{1,n} = (1 - w)^2 \sum w^j z_{n-j} - ((1 - w)^3/w) \sum j w^j z_{n-j}    (3.33)

We now relate these estimates to statistics that can be obtained from an extension of simple exponential smoothing. For this purpose, we introduce smoothed statistics of higher orders. The single smoothed statistic S_n^{[1]} = S_n was defined previously,

S_n^{[1]} = (1 - w) z_n + w S_{n-1}^{[1]} = (1 - w) \sum_{j \ge 0} w^j z_{n-j}    (3.34)

We now define the double smoothed statistic S_n^{[2]} as

S_n^{[2]} = (1 - w) S_n^{[1]} + w S_{n-1}^{[2]} = (1 - w)^2 \sum_{j \ge 0} (j + 1) w^j z_{n-j}    (3.35)

and in general the smoothed statistic of order k,

S_n^{[k]} = (1 - w) S_n^{[k-1]} + w S_{n-1}^{[k]}    (3.36)

The constant w is the discount coefficient; \alpha = 1 - w is called the smoothing constant. S_n^{[2]} is called the double smoothed statistic because it is derived by smoothing the single smoothed statistic S_n^{[1]}. Similarly, S_n^{[k]} is derived by k successive smoothing operations. By simple substitution it can be shown that the estimates \hat{\beta}_{0,n} and \hat{\beta}_{1,n} in (3.33) can be expressed as a function of the single and double smoothed statistics:

\hat{\beta}_{0,n} = 2 S_n^{[1]} - S_n^{[2]}

\hat{\beta}_{1,n} = ((1 - w)/w)(S_n^{[1]} - S_n^{[2]})    (3.37)

The forecasts can then be computed from

\hat{z}_n(l) = \hat{\beta}_{0,n} + \hat{\beta}_{1,n} l = (2 + ((1 - w)/w) l) S_n^{[1]} - (1 + ((1 - w)/w) l) S_n^{[2]}    (3.38)

Or, in terms of the smoothing constant \alpha,

\hat{z}_n(l) = (2 + (\alpha/(1 - \alpha)) l) S_n^{[1]} - (1 + (\alpha/(1 - \alpha)) l) S_n^{[2]}

Since the forecasts can be expressed as a function of the single and double smoothed statistics, this forecast procedure is known as double exponential smoothing.
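Equations (3.34), (3.35), and (3.38) translate directly into code. The sketch below is our own; it also demonstrates that, on exactly linear data with initial smoothed statistics chosen to match the true coefficients (S_0^{[1]} = \beta_0 - (w/(1-w))\beta_1 and S_0^{[2]} = \beta_0 - (2w/(1-w))\beta_1), the forecast extrapolates the line exactly:

```python
def double_smooth_forecast(z, alpha, s1, s2, lead=1):
    """Double exponential smoothing.
    Recursions (3.34)-(3.35):  S1_t = alpha*z_t + (1-alpha)*S1_{t-1},
                               S2_t = alpha*S1_t + (1-alpha)*S2_{t-1}.
    Forecast (3.38):  (2 + r*lead)*S1_n - (1 + r*lead)*S2_n,
    with r = alpha/(1-alpha)."""
    for zt in z:
        s1 = alpha * zt + (1 - alpha) * s1
        s2 = alpha * s1 + (1 - alpha) * s2
    r = alpha / (1 - alpha)
    return (2 + r * lead) * s1 - (1 + r * lead) * s2

# Exactly linear data z_t = b0 + b1*t, with matching initial values:
alpha, w = 0.2, 0.8
b0, b1 = 2.0, 0.5
s1_0 = b0 - (w / (1 - w)) * b1
s2_0 = b0 - 2 * (w / (1 - w)) * b1
z = [b0 + b1 * t for t in range(1, 5)]                    # 2.5, 3.0, 3.5, 4.0
fc = double_smooth_forecast(z, alpha, s1_0, s2_0, lead=1)  # extrapolates the line
```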

Updating Coefficient Estimates

Using the updating relation in Equation (3.29), with L' = [1 1; 0 1] and F^{-1} f(0) = (1 - w^2, (1 - w)^2)', we can write

\hat{\beta}_{0,n+1} = \hat{\beta}_{0,n} + \hat{\beta}_{1,n} + (1 - w^2)[z_{n+1} - \hat{z}_n(1)]

\hat{\beta}_{1,n+1} = \hat{\beta}_{1,n} + (1 - w)^2 [z_{n+1} - \hat{z}_n(1)]    (3.39)

By substitution it can be shown that these updates are equivalent to the smoothing recursions

S_{n+1}^{[1]} = (1 - w) z_{n+1} + w S_n^{[1]}

S_{n+1}^{[2]} = (1 - w) S_{n+1}^{[1]} + w S_n^{[2]}    (3.40)

The first equation follows by substituting \hat{z}_n(1) = \hat{\beta}_{0,n} + \hat{\beta}_{1,n} into the first equation of (3.39); the second is derived by substituting the result into the second equation of (3.39).

Another Interpretation of Double Exponential Smoothing

A slightly different approach will aid in the interpretation of the updating equations in (3.40). We follow an approach originally used by Holt (1957). We assume a linear trend model for the mean \mu_t and write it in slightly different form: \mu_t = \mu_n + (t - n)\beta. In this representation, \beta is the slope parameter and \mu_n is the level at time n. Then the mean at time n + 1 is \mu_{n+1} = \mu_n + \beta. An estimate of this mean can be found from two different sources: (1) from z_{n+1}, which represents the present estimate of \mu_{n+1}, and (2) from \hat{\mu}_n + \hat{\beta}_n, which is the estimate of \mu_{n+1} from observations up to and including time n. Note that \hat{\beta}_n is the estimate of the slope at time n. Holt considers a linear combination of these estimates,

\hat{\mu}_{n+1} = (1 - w_1) z_{n+1} + w_1 (\hat{\mu}_n + \hat{\beta}_n)

where \alpha_1 = 1 - w_1 (0 < \alpha_1 < 1) is a smoothing constant that determines how quickly past information is discounted. Similarly, information about the slope comes from two sources: (1) from the difference of the mean estimates \hat{\mu}_{n+1} - \hat{\mu}_n (i.e., the most recent

estimate of \mu_{n+1}) and (2) from the previous estimate of the slope, \hat{\beta}_n. Again, these estimates are linearly weighted to give

\hat{\beta}_{n+1} = (1 - w_2)(\hat{\mu}_{n+1} - \hat{\mu}_n) + w_2 \hat{\beta}_n

The coefficient \alpha_2 = 1 - w_2 (0 < \alpha_2 < 1) is another smoothing constant. Thus, the parameter estimates are updated according to

\hat{\mu}_{n+1} = (1 - w_1) z_{n+1} + w_1 (\hat{\mu}_n + \hat{\beta}_n)

\hat{\beta}_{n+1} = (1 - w_2)(\hat{\mu}_{n+1} - \hat{\mu}_n) + w_2 \hat{\beta}_n    (3.41)

and the forecasts of future observations are given by

\hat{z}_{n+1}(l) = \hat{\mu}_{n+1} + \hat{\beta}_{n+1} l    (3.42)

Equations (3.41) are more general than those of double exponential smoothing in (3.40), since two different discount coefficients are used. However, if w_1 = w^2 and w_2 = 2w/(1 + w), Holt's procedure is equivalent to double exponential smoothing with discount coefficient w (or smoothing constant \alpha = 1 - w).

Actual Implementation of Double Exponential Smoothing

The forecasts in double exponential smoothing can be computed from Equation (3.38), which implies that it is not necessary to calculate the discounted least squares estimates \hat{\beta}_{0,n} and \hat{\beta}_{1,n} directly. However, we need to know (1) the smoothing constant \alpha = 1 - w and (2) the most recent smoothed statistics S_n^{[1]} and S_n^{[2]}. To calculate these smoothed statistics, we can start the recursions (3.34) and (3.35) from the first observation. For this, we need to specify initial values S_0^{[1]} and S_0^{[2]}; Equations (3.34) and (3.35) can then be used to update the smoothed statistics until S_n^{[1]} and S_n^{[2]} are reached.

Initial Values for S_0^{[1]} and S_0^{[2]}

Equations (3.37) express the estimates of the model coefficients as a function of the smoothed statistics. These equations can be solved for S_n^{[1]} and S_n^{[2]}. For n = 0, it follows that

S_0^{[1]} = \hat{\beta}_{0,0} - (w/(1 - w)) \hat{\beta}_{1,0}

S_0^{[2]} = \hat{\beta}_{0,0} - (2w/(1 - w)) \hat{\beta}_{1,0}    (3.43)
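The stated equivalence (Holt's recursions with w_1 = w^2 and w_2 = 2w/(1 + w) versus double exponential smoothing with discount coefficient w) can be verified numerically. The sketch below is our own, comparing (3.41) with the coefficient updates (3.39):

```python
def holt_forecast(z, a1, a2, mu, beta):
    """Holt's updating equations (3.41) with smoothing constants
    a1 = 1 - w1 and a2 = 1 - w2; returns the one-step-ahead forecast (3.42)."""
    for zt in z:
        mu_new = a1 * zt + (1 - a1) * (mu + beta)
        beta = a2 * (mu_new - mu) + (1 - a2) * beta
        mu = mu_new
    return mu + beta

def double_forecast(z, w, b0, b1):
    """Coefficient updating (3.39) for double exponential smoothing."""
    for zt in z:
        err = zt - (b0 + b1)                 # one-step-ahead forecast error
        b0, b1 = b0 + b1 + (1 - w**2) * err, b1 + (1 - w)**2 * err
    return b0 + b1

w = 0.9
z = [1.0, 1.4, 1.1, 1.7, 2.0, 1.8]           # hypothetical data
# w1 = w^2 gives a1 = 1 - w^2; w2 = 2w/(1+w) gives a2 = (1-w)/(1+w):
f_holt = holt_forecast(z, a1=1 - w**2, a2=(1 - w) / (1 + w), mu=1.0, beta=0.1)
f_double = double_forecast(z, w, b0=1.0, b1=0.1)
```

The two procedures produce identical forecasts for any data and starting values, which is the point of the equivalence.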

The estimates of the coefficients at time zero, β̂0,0 and β̂1,0, are usually found by fitting the constant linear trend model z_t = β0 + β1t + ε_t either to all available observations z_1, z_2, ..., z_n, or to a subset consisting of the first n_1 observations. This approach is similar to the one in simple exponential smoothing, where the estimate of the constant mean model (i.e., the sample mean) is used as starting value for S_0. Ordinary least squares [see Eq. (2.7)] leads to the following estimates:

β̂1 = Σ(t − t̄)z_t / Σ(t − t̄)²     β̂0 = z̄ − β̂1(n + 1)/2     (3.44)

These estimates are used for β̂0,0 and β̂1,0 (i.e., β̂0,0 = β̂0 and β̂1,0 = β̂1), and the initial values S[1]_0 and S[2]_0 are derived from Equations (3.43).

Choice of the Smoothing Constant

As in simple exponential smoothing, the discount coefficient ω (or equivalently the smoothing constant α = 1 − ω) is chosen by generating one-step-ahead forecast errors for several values of the smoothing constant α. The sum of the squared one-step-ahead forecast errors is calculated for each chosen α. The smoothing constant that minimizes

SSE(α) = Σ[z_t − ẑ_{t−1}(1)]² = Σ[z_t − (2 + α/ω)S[1]_{t−1} + (1 + α/ω)S[2]_{t−1}]²     (3.45)

is the coefficient that is used for future forecasting.

In the forecasting literature it is usually suggested that the discount coefficient ω be chosen close to 1 (or equivalently the smoothing constant α = 1 − ω close to zero). Brown (1962), for example, suggests that the discount factor ω in double exponential smoothing be chosen between .84 and .97. However, in many applications of simple and double exponential smoothing it is found that ω lies outside the suggested range. The closer ω is to zero, the less likely it is that the data are described by a locally constant mean model (as in simple exponential smoothing) or a locally constant linear trend model (as in double exponential smoothing).
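The search over α in (3.45) can be sketched as a grid minimization of the sum of squared one-step-ahead forecast errors. The function names are hypothetical; for brevity the sketch holds the starting values S[1]_0 and S[2]_0 fixed across the grid, although, as the text notes later, they would ideally be recomputed as a function of each α.

```python
def sse_one_step(z, alpha, s1_0, s2_0):
    """Sum of squared one-step-ahead forecast errors (3.45) for
    double exponential smoothing with smoothing constant alpha."""
    w = 1.0 - alpha
    s1, s2 = s1_0, s2_0
    sse = 0.0
    for zt in z:
        fcst = (2.0 + alpha / w) * s1 - (1.0 + alpha / w) * s2  # (3.38), l = 1
        sse += (zt - fcst) ** 2
        s1 = alpha * zt + w * s1                                 # (3.34)
        s2 = alpha * s1 + w * s2                                 # (3.35)
    return sse

def best_alpha(z, s1_0, s2_0, grid):
    """Return the grid value of alpha with the smallest SSE."""
    return min(grid, key=lambda a: sse_one_step(z, a, s1_0, s2_0))
```

For the thermostat series analyzed below, evaluating such a grid over α = .02, .03, ..., .30 is exactly the computation summarized in Table 3.9.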

For example, let us consider the extreme case when ω → 0 (or equivalently α → 1). In simple exponential smoothing the smoothed value becomes S_n = (1)z_n + (0)S_{n−1} = z_n. Thus the last recorded value is the best forecast of all future observations: ẑ_n(l) = z_n. Information from all other past observations is ignored.

In double exponential smoothing, the smoothed statistics in (3.34) and (3.35) approach S[1]_n = S[2]_n = z_n. From (3.38) it follows that the forecasts, for example for l = 1, are

ẑ_n(1) = 2S[1]_n − S[2]_n + (α/ω)(S[1]_n − S[2]_n)

Since S[1]_n − S[2]_n = ω(S[1]_n − S[2]_{n−1}), we find, as ω → 0,

lim ẑ_n(1) = 2z_n − z_n + lim α(S[1]_n − S[2]_{n−1}) = 2z_n − z_n + z_n − z_{n−1} = 2z_n − z_{n−1}

This result implies that double exponential smoothing with a discount factor ω → 0 corresponds to fitting a linear trend model to only the last two observations. No other observations contribute to the forecast. This choice of ω is very different from the case ω = 1. There the best forecast is achieved by fitting a constant linear trend model to all available observations and giving each observation the same weight.

Examples

Example 3.3: Weekly Thermostat Sales

As an example for double exponential smoothing, we analyze a sequence of 52 weekly sales observations. The data, which are listed in Table 3.6 and plotted in Figure 3.5, were originally analyzed by Brown (1962). The plot of
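The limiting behavior ẑ_n(1) → 2z_n − z_{n−1} as ω → 0 is easy to check numerically: with α = .999 the recursion forecast is already within a few thousandths of the two-point linear extrapolation. The helper name is ours; the recursions are started at the first observation.

```python
def des_forecast(zs, alpha):
    """One-step-ahead forecast z_n(1) from double exponential smoothing,
    starting both smoothed statistics at the first observation."""
    w = 1.0 - alpha
    s1 = s2 = zs[0]
    for zt in zs[1:]:
        s1 = alpha * zt + w * s1        # (3.34)
        s2 = alpha * s1 + w * s2        # (3.35)
    return (2.0 + alpha / w) * s1 - (1.0 + alpha / w) * s2   # (3.38)

# with alpha near 1 the forecast approaches 2 * z_n - z_{n-1} = 2*5 - 3 = 7
approx = des_forecast([1.0, 2.0, 3.0, 5.0], 0.999)
```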

Table 3.6. Weekly Thermostat Sales, 52 Observations

[The 52 weekly sales figures, read downwards, left to right, are not reproduced here.]

Source: Reprinted by permission of Prentice-Hall, Inc., from R. G. Brown (1962), Smoothing, Forecasting and Prediction of Discrete Time Series, p. 431.

the data indicates an upward trend in the thermostat sales. This trend, however, does not appear to be constant but seems to change over time. A constant linear trend model would therefore not be appropriate. Nevertheless, we first fit a constant linear trend model z_t = β0 + β1t + ε_t to all 52 observations. The time origin in our constant linear trend model is at time zero. This representation is somewhat different from the models in Section 3.4, where the time origin is always at time period n. However, we choose the current representation because we need the estimates of β0 and β1 to calculate the initial values for S[1]_0 and S[2]_0. Another reason for fitting a constant linear trend model is to illustrate how a residual analysis can detect model inadequacies.

The least squares estimates of β0 and β1 are calculated from (3.44):

β̂1 = Σ(t − 26.5)z_t / Σ(t − 26.5)² = 2.325     β̂0 = z̄ − β̂1(26.5) = 166.40

If the constant linear trend model were appropriate, the residuals

e_t = z_t − (β̂0 + β̂1t) for t = 1, 2, ..., 52 should be uncorrelated. The sample autocorrelations of the residuals,

r_k = Σ_{t=k+1}^{52} e_t e_{t−k} / Σ_{t=1}^{52} e_t²

are given in Table 3.7. The mean ē can be omitted in the calculation of r_k, since the least squares fit introduces restrictions among the residuals [see Eq. (2.15)]. Since an intercept is included in the model, one of the restrictions forces the residuals to add to zero. Comparing the sample autocorrelations with their standard error 1/√52 = .14, we find that the residuals are correlated and that the constant linear trend model is certainly not appropriate for this particular data set.

To model the time-changing trend, we now consider double exponential smoothing. We choose a smoothing constant α = .14 (ω = .86); later we will explain why we choose this particular value. To derive the initial values
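The residual autocorrelations r_k used in this check need no mean correction, since residuals from a model with an intercept sum to zero. A minimal sketch (the function name is ours):

```python
def autocorr(e, k):
    """Lag-k sample autocorrelation r_k of least squares residuals.

    No mean correction: the residuals sum to zero when the fitted
    model contains an intercept."""
    num = sum(e[t] * e[t - k] for t in range(k, len(e)))
    den = sum(et ** 2 for et in e)
    return num / den

# compare each r_k against the approximate standard error 1/sqrt(n);
# for n = 52 this is 1/sqrt(52) = .14
```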

Table 3.7. Sample Autocorrelations of the Residuals from the Constant Linear Trend Model - Thermostat Sales

[The autocorrelations r_k for lags k = 1, ..., 10 are not reproduced here.]

for S[1]_0 and S[2]_0, we substitute the least squares estimates β̂0 and β̂1 into Equations (3.43):

S[1]_0 = 166.40 − (.86/.14)(2.325) = 152.12
S[2]_0 = 166.40 − 2(.86/.14)(2.325) = 137.84

From these initial smoothed statistics we can calculate the forecasts ẑ0(l). For example, for l = 1, it follows from (3.38) that

ẑ0(1) = (2 + .14/.86)(152.12) − (1 + .14/.86)(137.84) = 168.72

Similarly, for l = 2 we find ẑ0(2) = (2 + .28/.86)(152.12) − (1 + .28/.86)(137.84) = 171.05. Since z_1 = 206, we can calculate the first one-step-ahead forecast error:

e_0(1) = z_1 − ẑ0(1) = 206 − 168.72 = 37.28

The smoothed statistics can be updated [Eqs. (3.34), (3.35)]:

S[1]_1 = (1 − ω)z_1 + ωS[1]_0 = (.14)(206) + (.86)(152.12) = 159.66
S[2]_1 = (1 − ω)S[1]_1 + ωS[2]_0 = (.14)(159.66) + (.86)(137.84) = 140.89

The forecast ẑ1(1) can then be calculated from (3.38):

ẑ1(1) = (2 + .14/.86)(159.66) − (1 + .14/.86)(140.89) = 181.49
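The arithmetic of this step is easy to reproduce. Starting from the initial statistics S[1]_0 = 152.12 and S[2]_0 = 137.84 and the first observation z_1 = 206, a few lines recover the forecast error 37.28 and the updated statistics 159.66 and 140.89, up to rounding:

```python
alpha, w = 0.14, 0.86
s1, s2 = 152.12, 137.84          # initial smoothed statistics from (3.43)
z1 = 206.0                       # first thermostat sales observation

# one-step-ahead forecast made at time 0, Eq. (3.38) with l = 1
f0 = (2.0 + alpha / w) * s1 - (1.0 + alpha / w) * s2
e0 = z1 - f0                     # one-step-ahead forecast error, about 37.28

# update the smoothed statistics, Eqs. (3.34) and (3.35)
s1 = alpha * z1 + w * s1         # about 159.66
s2 = alpha * s1 + w * s2         # about 140.89
```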

and the next one-step-ahead forecast error is given by

e_1(1) = z_2 − ẑ1(1) = 63.52

With the new observation z_2 we can update the smoothed statistics and calculate S[1]_2 and S[2]_2. The observations z_t; the smoothed statistics S[1]_t, S[2]_t; the forecasts ẑ_t(1), ẑ_t(2); and the one-step-ahead forecast errors e_{t−1}(1) = z_t − ẑ_{t−1}(1) are listed in Table 3.8. Eventually, through repeated application of (3.34) and (3.35), we reach the smoothed statistics at time n = 52, S[1]_52 and S[2]_52. Forecasts for the next l periods are then given by

ẑ52(l) = (2 + .14l/.86)S[1]_52 − (1 + .14l/.86)S[2]_52

For example, the forecast for the next period (l = 1) is ẑ52(1) = 319.29; for two periods ahead it is ẑ52(2) = 323.51. All other forecasts lie on a straight line determined by ẑ52(1) and ẑ52(2).

Choice of the smoothing constant. For the smoothing constant α = .14, the sum of the squared one-step-ahead forecast errors is given by

SSE(.14) = (37.28)² + (63.52)² + ··· + (40.48)² = 41,469

Table 3.8. Double Exponential Smoothing with Smoothing Constant α = .14 - Thermostat Sales

[The table lists z_t, S[1]_t, S[2]_t, ẑ_t(1), ẑ_t(2), and e_{t−1}(1) for t = 1, ..., 52; its entries are not reproduced here. SSE(.14) = 41,469.]

Table 3.9. Sums of Squared One-Step-Ahead Forecast Errors for Different Values of α; Double Exponential Smoothing - Thermostat Sales

[SSE(α) for α = .02, .03, ..., .30; the individual entries are not reproduced here.]

Changing the smoothing constant from .02 to .30 in increments of .01 leads to the sums of squared one-step-ahead forecast errors that are listed in Table 3.9 and plotted in Figure 3.6. It is found that the minimum of SSE(α) occurs at α = .14, which explains our previous choice. As in simple exponential smoothing, the choice of S[1]_0 and S[2]_0 will influence the value of α obtained by simulation. Ideally, one should choose the starting values as a function of α as well as the data z_1, ..., z_n.

[Figure 3.6. Sum of squared one-step-ahead forecast errors SSE(α) as a function of α.]

Table 3.10. Mean, Standard Error, and Sample Autocorrelations of the One-Step-Ahead Forecast Errors from Double Exponential Smoothing (α = .14) - Thermostat Sales

[The sample autocorrelations r_k for lags k = 1, ..., 10 are not reproduced here.]
Mean of historical forecast errors: 1.86
Standard error of mean: 3.95

Checking the adequacy of double exponential smoothing. To check whether double exponential smoothing is an appropriate forecast procedure for the thermostat sales data, we calculate the mean, the standard error, and the autocorrelations of the one-step-ahead forecast errors e_{t−1}(1) = z_t − ẑ_{t−1}(1). The results in Table 3.10 indicate that double exponential smoothing is an appropriate forecasting procedure for the thermostat sales data, since (1) it leads to unbiased forecasts and (2) the forecast errors are uncorrelated. The standard error of r_k is 1/√52 = .14; all sample autocorrelations are well within two standard errors.

Example 3.4: University of Iowa Student Enrollments

As another example, we consider the annual student enrollments (fall and spring semesters combined) at the University of Iowa. Observations for the

Table 3.11. Total Annual Student Enrollment at the University of Iowa, 1951/52 to 1979/80

14,348  14,307  15,197  16,715  18,476  19,404  20,173  20,645  20,937  21,501
22,788  23,579  25,319  28,250  32,191  34,584  36,366  37,865  39,173  40,119
39,626  39,107  39,796  41,567  43,646  43,534  44,157  44,551  45,572

Read across, top to bottom.

[Figure 3.7. University of Iowa student enrollment, 1951/52 to 1979/80.]

last 29 years (1951/52 through 1979/80) are summarized in Table 3.11. A plot of the observations is given in Figure 3.7. Because of the growing trend pattern of the series, we decided to use double exponential smoothing.

To decide on the smoothing constant α, we simulated the forecast errors for several different smoothing coefficients. Sums of the squared one-step-ahead forecast errors,

SSE(α) = Σ_{t=1}^{n} [z_t − ẑ_{t−1}(1)]² = Σ[z_t − (2 + α/ω)S[1]_{t−1} + (1 + α/ω)S[2]_{t−1}]²

are given in Table 3.12 and plotted in Figure 3.8. The optimal smoothing constant is given by α = .87. We notice that it is not in the range usually suggested for exponential smoothing. The function SSE(α) is very flat from .80 to 1.00, indicating that the smoothing constant α is close to 1. A smoothing constant near 1 implies that the linear trend depends mostly on the last two observations and can change very quickly. The trend is not of a deterministic nature; it is stochastic, changing rapidly with each new observation.

Table 3.12. Sums of Squared One-Step-Ahead Forecast Errors for Different Values of α; Double Exponential Smoothing - University of Iowa Student Enrollment

[SSE(α), in thousands, for a grid of smoothing constants α; the individual entries are not reproduced here.]

[Figure 3.8. Sum of squared one-step-ahead forecast errors SSE(α) as a function of α.]

The observations, the first- and second-order smoothed statistics, and the one-step-ahead forecast errors for the smoothing constant α = .87 are given in Table 3.13. The initial smoothed statistics are calculated from (3.43) and (3.44); all 29 observations are used to determine the estimates β̂0 and β̂1. Substituting the last smoothed statistics S[1]_29 = 45,431 and S[2]_29 = 45,300 into the forecast equation (3.38), we can calculate the enrollments for future

Table 3.13. Double Exponential Smoothing (α = .87) - University of Iowa Student Enrollment

[The table lists z_t, S[1]_t, S[2]_t, ẑ_{t−1}(1), and e_{t−1}(1) for t = 1, ..., 29; its entries are not reproduced here. The last smoothed statistics are S[1]_29 = 45,431 and S[2]_29 = 45,300; SSE(.87) = 28,679,000.]

academic years. The forecast for 1980/81 (l = 1 year ahead) is given by ẑ29(1) = 46,438; for 1981/82 we predict an enrollment of ẑ29(2) = 47,314; etc.

Table 3.14. Mean, Standard Error, and Sample Autocorrelations of the One-Step-Ahead Forecast Errors from Double Exponential Smoothing (α = .87) - University of Iowa Student Enrollment

[The sample autocorrelations r_k, the mean of the historical forecast errors, and the standard error of the mean are not reproduced here.]

The diagnostics in Table 3.14 indicate that double exponential smoothing with a smoothing constant α = .87 leads to unbiased forecasts and uncorrelated forecast errors. The standard error of r_k is 1/√29 = .19.

It should be emphasized that a smoothing constant of α = .87 implies a rapidly changing linear trend model. As we have shown earlier, for ω → 0 (or α → 1), all forecasts fall on a straight line that is determined by the last two observations. The optimal forecast system for University of Iowa student enrollments essentially remembers only the last two observations; all previous observations are ignored.

3.7. LOCALLY QUADRATIC TREND MODEL AND TRIPLE EXPONENTIAL SMOOTHING

The quadratic model is characterized by m = 3 fitting functions f_1(j) = 1, f_2(j) = j, f_3(j) = j²/2. The vector of fitting functions f(j) = [f_1(j), f_2(j), f_3(j)]' satisfies the difference equation f(j + 1) = Lf(j) [see Eq. (3.14)], where the transition
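Substituting the rounded smoothed statistics S[1]_29 = 45,431 and S[2]_29 = 45,300 into (3.38) reproduces these enrollment forecasts to within a few students; the small discrepancy comes from rounding the smoothed statistics. The helper name is ours:

```python
alpha, w = 0.87, 0.13
s1, s2 = 45431.0, 45300.0        # last smoothed statistics at n = 29

def enrollment_forecast(lead):
    """l-step-ahead forecast from Eq. (3.38)."""
    factor = alpha * lead / w
    return (2.0 + factor) * s1 - (1.0 + factor) * s2

f1 = enrollment_forecast(1)      # close to the published 46,438
f2 = enrollment_forecast(2)      # close to the published 47,314
```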

matrix is

L = [ 1    0  0
      1    1  0
      1/2  1  1 ]

and f(0) = (1, 0, 0)'.

The matrix F_n = Σ_{j=0}^{n−1} ω^j f(−j)f'(−j) in (3.25) has elements that involve the discounted sums Σω^j, Σjω^j, Σj²ω^j, Σj³ω^j, and Σj⁴ω^j. For large n, it can be shown [see Brown (1962)] that

lim_{n→∞} Σ_{j=0}^{n−1} j³ω^j = ω(1 + 4ω + ω²)/(1 − ω)⁴

lim_{n→∞} Σ_{j=0}^{n−1} j⁴ω^j = ω(1 + 11ω + 11ω² + ω³)/(1 − ω)⁵

The limits of Σω^j, Σjω^j, and Σj²ω^j were already discussed in Section 3.6. Thus the steady-state value of F_n is the symmetric matrix

F = [ 1/(1−ω)              −ω/(1−ω)²               ω(1+ω)/(2(1−ω)³)
      −ω/(1−ω)²            ω(1+ω)/(1−ω)³           −ω(1+4ω+ω²)/(2(1−ω)⁴)
      ω(1+ω)/(2(1−ω)³)     −ω(1+4ω+ω²)/(2(1−ω)⁴)   ω(1+11ω+11ω²+ω³)/(4(1−ω)⁵) ]

For large n, the estimates β̂_n = (β̂0,n, β̂1,n, β̂2,n)' are then given by β̂_n = F⁻¹h_n, where the elements of F⁻¹ are rational functions of ω, and the vector h_n consists of the discounted sums

h_n = ( Σ ω^j z_{n−j},  −Σ jω^j z_{n−j},  (1/2)Σ j²ω^j z_{n−j} )'

The elements of h_n, and thus the estimates β̂_n, can be expressed in terms of the first three smoothed statistics. We have shown earlier that

S[1]_n = (1 − ω) Σ_{j≥0} ω^j z_{n−j}

By substitution it can be shown that

S[2]_n = (1 − ω)² Σ_{j≥0} (j + 1)ω^j z_{n−j}

and in general, for k ≥ 2,

S[k]_n = (1 − ω)^k Σ_{j≥0} C(j + k − 1, k − 1) ω^j z_{n−j}

Thus the elements in h_n can be written as functions of the first three smoothed statistics.

The coefficient estimates β̂_n, expressed in terms of the smoothed statistics, are given by

β̂0,n = 3S[1]_n − 3S[2]_n + S[3]_n
β̂1,n = (α/(2ω²))[(6 − 5α)S[1]_n − 2(5 − 4α)S[2]_n + (4 − 3α)S[3]_n]     (3.47)
β̂2,n = (α/ω)²[S[1]_n − 2S[2]_n + S[3]_n]

where α = 1 − ω. The l-step-ahead forecast can then be calculated from

ẑ_n(l) = β̂0,n + β̂1,n l + β̂2,n l²/2     (3.48)

Since the forecasts can be expressed as a function of the first three smoothed statistics, this procedure is known under the name triple exponential smoothing.

Implementation of Triple Exponential Smoothing

Equation (3.48) implies that the forecasts from triple exponential smoothing can be obtained from the smoothed statistics S[1]_n, S[2]_n, and S[3]_n. Thus it is not necessary to compute the discounted least squares estimates β̂0,n, β̂1,n, β̂2,n. The smoothed statistics can be calculated recursively from period 1 onwards. This requires initial values S[1]_0, S[2]_0, and S[3]_0.

Initial Values for the Smoothed Statistics

Initial values for the smoothed statistics can be found by first fitting a constant quadratic model

z_t = β0 + β1t + (β2/2)t² + ε_t

to all past data (z_1, z_2, ..., z_n). The quadratic model is parameterized such that β'_0 = (β0, β1, β2) represents the coefficients at time origin zero. The least squares estimates are given by β̂_0 = (XX')⁻¹Xz, where X is the 3 × n matrix X = [f(1), ..., f(n)] and z = (z_1, ..., z_n)'. The least squares estimates β̂'_0 = (β̂0,0, β̂1,0, β̂2,0) can then be substituted into Equations (3.47) and solved for S[1]_0, S[2]_0, and S[3]_0. The solutions are given by

S[1]_0 = β̂0,0 − (ω/α)β̂1,0 + (ω(2 − α)/(2α²))β̂2,0
S[2]_0 = β̂0,0 − (2ω/α)β̂1,0 + (ω(3 − 2α)/α²)β̂2,0
S[3]_0 = β̂0,0 − (3ω/α)β̂1,0 + (3ω(4 − 3α)/(2α²))β̂2,0

Choice of the Smoothing Constant

Brown (1962) suggests choosing ω such that .70 < ω³ < .95, which implies a value of ω between .89 and .98 (or α = 1 − ω between .02 and .11). If past data are available, the smoothing constant can be found by simulation. The sum of squared one-step-ahead forecast errors can be calculated for various values of α (or ω = 1 − α). The value that minimizes this sum is used for future smoothing and forecasting.

There are very few genuine data sets that are best forecast by smoothing methods of orders higher than 2. The reason for this will become clear when we investigate the models that imply these procedures (Chap. 7). We will find that time series data do not usually follow these models.

Model Checking

Again, to check the adequacy of triple exponential smoothing, we recommend calculating past one-step-ahead forecast errors. If triple exponential smoothing is adequate, the forecast errors should have mean zero and furthermore be uncorrelated.

Extension to the General Polynomial Model and Higher Order Exponential Smoothing

Exponential smoothing in the context of the general polynomial model
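The recursions for the three smoothed statistics are the same single-exponential update applied three times in cascade. A minimal sketch (our own function name; the starting values must be supplied from the fitted quadratic as described above):

```python
def triple_smooth(z, alpha, s1_0, s2_0, s3_0):
    """Cascaded recursions for the first three smoothed statistics:
    each statistic smooths the output of the previous one."""
    w = 1.0 - alpha
    s1, s2, s3 = s1_0, s2_0, s3_0
    for zt in z:
        s1 = alpha * zt + w * s1
        s2 = alpha * s1 + w * s2
        s3 = alpha * s2 + w * s3
    return s1, s2, s3
```

A quick sanity check: a constant series with matching starting values leaves all three statistics unchanged, as it should, since the implied quadratic has zero slope and curvature.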

is discussed by Brown and Meyer (1961) and Brown (1962). They show that discounted least squares leads to coefficient estimates and forecasts that can be expressed as linear combinations of the first k + 1 smoothed statistics S[1]_n, ..., S[k+1]_n. Explicit expressions for β̂_n and the forecasts, however, quickly become cumbersome and are not pursued further.

3.8. PREDICTION INTERVALS FOR FUTURE VALUES

We have shown in Section 3.5 that the discounted least squares estimate of β in the general model z_{n+j} = f'(j)β + ε_{n+j} is given by

β̂_n = F_n⁻¹h_n

where

F_n = Σ_{j=0}^{n−1} ω^j f(−j)f'(−j)     and     h_n = Σ_{j=0}^{n−1} ω^j f(−j)z_{n−j}

[see Eq. (3.25)]. From this estimate we can calculate the l-step-ahead forecast of a future observation z_{n+l},

ẑ_n(l) = f'(l)β̂_n

To assess the uncertainty associated with this forecast, we have to derive the variance of the l-step-ahead forecast error e_n(l) = z_{n+l} − ẑ_n(l). The variance is given by

V[e_n(l)] = V(z_{n+l}) + V[ẑ_n(l)] = σ² + V[f'(l)β̂_n]     (3.50)

since ẑ_n(l) is a function of the random variables z_1, ..., z_n, which in our model are assumed uncorrelated with z_{n+l}. To calculate this variance we have to derive the variance of the discounted least squares estimate β̂_n. The m × m covariance matrix of β̂_n is given by

V(β̂_n) = V(F_n⁻¹h_n) = F_n⁻¹V(h_n)F_n⁻¹     (3.51)

In the variance derivations given in the literature [Brown (1962), Montgomery and Johnson (1976)], it is assumed that the observations z_t are uncorrelated and have the same variance σ² (see the comment given below). Then it follows that

V(h_n) = σ² Σ_{j=0}^{n−1} ω^{2j} f(−j)f'(−j)

and

V(β̂_n) = σ² F_n⁻¹ [ Σ_{j=0}^{n−1} ω^{2j} f(−j)f'(−j) ] F_n⁻¹     (3.52)

For large n, F_n and Σ_{j=0}^{n−1} ω^{2j}f(−j)f'(−j) approach steady-state values F and F*, respectively, and

V(β̂) = σ² F⁻¹F*F⁻¹     (3.53)

Substitution of (3.53) into (3.50) leads to the variance of the l-step-ahead forecast error

V[e_n(l)] = σ²c_l²     (3.54)

where c_l² = 1 + f'(l)F⁻¹F*F⁻¹f(l). Then a 100(1 − λ) percent prediction interval for the future observation z_{n+l} is

ẑ_n(l) ± u_{λ/2}σc_l     (3.55)

where u_{λ/2} is the 100(1 − λ/2) percentage point of the standard normal distribution.

Comment. If we assume a model in which all observations have equal variance, we should calculate ordinary, and not discounted, least squares estimates. In (3.54) we have derived the variance of the discounted least squares estimates under the assumption that the observations have equal variance. Such an approach leads to correct standard errors only if the model (mean, trend, etc.) stays constant over time. For a large smoothing constant α (or small ω), which indicates rapid changes in the model, these variance approximations will be rather poor.

Prediction Intervals for Sums of Future Observations

Sometimes the forecaster is interested not only in a forecast of one single observation but also in forecasts of a sum of K future realizations. For example, if sales are recorded monthly, the forecaster might be interested in the forecast of next year's total sales. The forecast of a sum of K future observations, Σ_{l=1}^{K} z_{n+l}, is given by the sum of their respective forecasts:

Σ_{l=1}^{K} ẑ_n(l) = Σ_{l=1}^{K} f'(l)β̂_n = [ Σ_{l=1}^{K} f(l) ]' β̂_n     (3.56)

The variance of the cumulative forecast error, Σ_{l=1}^{K} z_{n+l} − Σ_{l=1}^{K} ẑ_n(l), can be written as

V[ Σ_{l=1}^{K} e_n(l) ] = σ² { K + [ Σ_{l=1}^{K} f(l) ]' F⁻¹F*F⁻¹ [ Σ_{l=1}^{K} f(l) ] }     (3.57)

A 100(1 − λ) percent prediction interval for the sum of K future observations, Σ_{l=1}^{K} z_{n+l}, can be calculated from

Σ_{l=1}^{K} ẑ_n(l) ± u_{λ/2} σ { K + [ Σ f(l) ]' F⁻¹F*F⁻¹ [ Σ f(l) ] }^{1/2}     (3.58)

Examples

To illustrate these general expressions we consider now several special cases.

1. Locally constant mean model: z_{n+j} = β + ε_{n+j}

In this model we have only one fitting function: f_1(j) = 1 for all j. This implies F = Σω^j = 1/(1 − ω), so F⁻¹ = 1 − ω, and F* = Σω^{2j} = 1/(1 − ω²). Substituting these expressions into Equation (3.53), we find that the variance of β̂0,n = S_n = (1 − ω)Σω^j z_{n−j} is

V(S_n) = σ²(1 − ω)²/(1 − ω²) = σ²(1 − ω)/(1 + ω)

From (3.54) it follows that the variance of the l-step-ahead forecast error is given by

V[e_n(l)] = σ²[1 + (1 − ω)/(1 + ω)] = σ² · 2α/(1 − ω²)     (3.59)

which is the same for all l. A 100(1 − λ) percent prediction interval for a future observation z_{n+l} is given by

S_n ± u_{λ/2} σ [2α/(1 − ω²)]^{1/2}     (3.60)

The prediction intervals are constant for all l. This implies, for example, that the uncertainty associated with a 10-step-ahead forecast is the same as that of a one-step-ahead forecast. For models with rapidly changing mean (or ω close to zero), this will be a very poor approximation. In this case the observations are described by a random walk z_n = z_{n−1} + ε_n, and the l-step-ahead forecast is ẑ_n(l) = z_n. Then the variance of the l-step-ahead forecast error z_{n+l} − ẑ_n(l) = z_{n+l} − z_n = ε_{n+1} + ··· + ε_{n+l} is given by V[z_{n+l} − ẑ_n(l)] = lσ². The forecast error variance increases as a linear function of the forecast lead time l. A detailed discussion of forecast error variances will be given in Chapter 5.

Similarly, it can be shown from Equation (3.58) that a 100(1 − λ) prediction interval for the sum of K future observations is given by

KS_n ± u_{λ/2} σ [K + K²(1 − ω)/(1 + ω)]^{1/2}     (3.61)

2. Locally constant linear trend model: z_{n+j} = β0 + β1 j + ε_{n+j}

In this model we have m = 2 fitting functions, f_1(j) = 1 and f_2(j) = j. It
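For the locally constant mean model the interval (3.60) has the same half-width at every lead time, since c_l² = 2α/(1 − ω²) = 2/(1 + ω) does not involve l. A sketch, with σ treated as known and our own function name:

```python
import math

def constant_mean_interval(s_n, sigma, w, u=1.96):
    """Prediction interval (3.60) for the locally constant mean model.

    The half-width u * sigma * sqrt(2 * (1 - w) / (1 - w**2)) is the
    same for every forecast lead time l."""
    c = math.sqrt(2.0 * (1.0 - w) / (1.0 - w * w))   # c_l, free of l
    half = u * sigma * c
    return s_n - half, s_n + half
```

Note the contrast drawn in the text: under a random walk the correct error variance grows like l * sigma**2, so a constant-width interval understates the uncertainty badly at long lead times when ω is near zero.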

was shown earlier (see Sec. 3.6) that

F = [ 1/(1−ω)      −ω/(1−ω)²
      −ω/(1−ω)²    ω(1+ω)/(1−ω)³ ]          F⁻¹ = [ 1−ω²      (1−ω)²
                                                    (1−ω)²    (1−ω)³/ω ]

Furthermore,

F* = Σ_{j≥0} ω^{2j} f(−j)f'(−j) = [ 1/(1−ω²)       −ω²/(1−ω²)²
                                    −ω²/(1−ω²)²    ω²(1+ω²)/(1−ω²)³ ]

After some algebra one finds

c_l² = 1 + ((1 − ω)/(1 + ω)³)[(1 + 4ω + 5ω²) + 2l(1 − ω)(1 + 3ω) + 2l²(1 − ω)²]

A 100(1 − λ) percent prediction interval for z_{n+l} is therefore given by

ẑ_n(l) ± u_{λ/2} σ c_l

Estimation of the Variance

The variance of the forecast errors in (3.54) and the prediction intervals (3.55) include σ², the unknown population variance of the errors ε_t. This

unknown variance can be estimated from the sample variance of the one-step-ahead forecast errors. It was shown earlier [Eq. (3.54)] that the variance of the (l = 1)-step-ahead forecast error is given by

σ_e² = V[e_n(1)] = σ²[1 + f'(1)F⁻¹F*F⁻¹f(1)] = σ²c_1²

Then it follows that

σ² = σ_e²/c_1²     (3.64)

where c_1² = 1 + f'(1)F⁻¹F*F⁻¹f(1). The observed forecast errors e_{t−1}(1) = z_t − ẑ_{t−1}(1) (t = 1, 2, ..., n) can be used to estimate the variance of the one-step-ahead forecast errors. If the model is correct, these errors have mean zero, and hence the variance estimate is given by

σ̂_e² = (1/n) Σ_{t=1}^{n} [e_{t−1}(1)]²     (3.65)

A mean correction in (3.65) is not needed, since a correct model will lead to unbiased forecasts. This expression is substituted into Equation (3.64) to get an estimate of σ²:

σ̂² = σ̂_e²/c_1²     (3.66)

Substituting (3.66) into (3.55), we find that an estimated 100(1 − λ) percent prediction interval for a future observation z_{n+l} is given by

ẑ_n(l) ± u_{λ/2} σ̂_e c_l/c_1     (3.67)

Similarly, it follows from (3.58) that the prediction interval for a sum of future observations Σ_{l=1}^{K} z_{n+l} is

Σ_{l=1}^{K} ẑ_n(l) ± u_{λ/2} (σ̂_e/c_1) { K + [ Σ f(l) ]' F⁻¹F*F⁻¹ [ Σ f(l) ] }^{1/2}     (3.68)

Example 1: Simple Exponential Smoothing

In the case of simple exponential smoothing, it follows from (3.59) that c_l² = 2α/(1 − ω²). Thus, for any l the estimated prediction interval for z_{n+l} is

S_n ± u_{λ/2} σ̂_e     (3.69)

From Section 3.3, we recall that the Iowa nonfarm income growth rates were best predicted by simple exponential smoothing with a smoothing constant α = .11. The prediction for all future growth rates was ẑ127(l) = S_127 = 2.654. The sum of the squared one-step-ahead forecast errors was

Σ_{t=1}^{127} (z_t − S_{t−1})² = 118.19

Thus σ̂_e² = 118.19/127 = .9306, and σ̂_e = .965. A 95 percent prediction interval for all future growth rates z_{127+l} is given by 2.654 ± (1.96)(.965), or (.763, 4.545).

Example 2: Double Exponential Smoothing

For double exponential smoothing, the estimated prediction interval for z_{n+l} is given by

ẑ_n(l) ± u_{λ/2} σ̂_e c_l/c_1

where

c_l² = 1 + ((1 − ω)/(1 + ω)³)[(1 + 4ω + 5ω²) + 2l(1 − ω)(1 + 3ω) + 2l²(1 − ω)²]

We illustrate the calculation of the prediction intervals using the thermostat sales series that was analyzed in Section 3.6. With a smoothing constant α = .14, the double exponential smoothing forecasts were ẑ52(1) = 319.29, ẑ52(2) = 323.51, ẑ52(3) = 327.73, and so on. Furthermore, the sum of the squared one-step-ahead forecast errors was 41,469. Therefore, σ̂_e² = 41,469/52 = 797.48 and σ̂_e = 28.24. Evaluating c_l for l = 1, 2, 3 and α = .14 (ω = .86) leads to c_1 = 1.095, c_2 = 1.106, c_3 = 1.118. Thus the 95 percent
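The interval computation for double exponential smoothing can be sketched as follows. Note that σ̂_e estimates σc_1, so the half-width for lead l carries the ratio c_l/c_1, exactly as in the worked thermostat intervals. Function names are ours:

```python
import math

def c_l(w, lead):
    """c_l for the locally constant linear trend model."""
    bracket = ((1 + 4 * w + 5 * w * w)
               + 2 * lead * (1 - w) * (1 + 3 * w)
               + 2 * lead * lead * (1 - w) ** 2)
    return math.sqrt(1.0 + (1 - w) / (1 + w) ** 3 * bracket)

def interval(forecast, sigma_e, w, lead, u=1.96):
    """Prediction interval (3.67): sigma_e estimates sigma * c_1,
    so leads beyond 1 are scaled by c_l / c_1."""
    half = u * sigma_e * c_l(w, lead) / c_l(w, 1)
    return forecast - half, forecast + half

# thermostat sales, one step ahead (alpha = .14, i.e. w = .86)
lo, hi = interval(319.29, 28.24, 0.86, 1)
```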

prediction intervals are:

One step ahead (z_53): 319.29 ± (1.96)(28.24), or (263.94, 374.64)
Two steps ahead (z_54): 323.51 ± (1.96)(28.24)(1.106/1.095), or (267.60, 379.42)
Three steps ahead (z_55): 327.73 ± (1.96)(28.24)(1.118/1.095), or (271.23, 384.23)

Updating Probability Intervals

The probability intervals for future observations are easily updated as each new observation becomes available. The estimated variance of the one-step-ahead forecast errors at time n + 1, σ̂_e²(n + 1), can be expressed as a function of the previous variance estimate and the most recent one-step-ahead forecast error:

σ̂_e²(n + 1) = [n σ̂_e²(n) + e_n(1)²]/(n + 1)     (3.70)

An Alternative Variance Estimate

Equation (3.65) is used to estimate the variance of the one-step-ahead forecast errors from the corresponding sample variance. An alternative estimate that is more convenient for updating can be calculated from the mean absolute deviation of the one-step-ahead forecast errors [see Montgomery and Johnson (1976)]. It is given by

σ̂_e = 1.25Δ_e     (3.71)

where

Δ_e = (1/n) Σ_{t=1}^{n} |e_{t−1}(1)| = (1/n) Σ_{t=1}^{n} |z_t − ẑ_{t−1}(1)|

Substituting this estimate into Equations (3.55) and (3.58) leads to alternative prediction intervals.
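Both updating formulas, the variance recursion (3.70) and the analogous mean absolute deviation recursion discussed next, are running averages that fold the newest one-step-ahead forecast error into the previous estimate. A sketch with our own function names:

```python
def update_var(var_prev, e_new, n):
    """Variance recursion (3.70): average the newest squared
    one-step-ahead forecast error into the previous n squared errors."""
    return (n * var_prev + e_new ** 2) / (n + 1)

def update_mad(mad_prev, e_new, n):
    """Analogous recursion for the mean absolute deviation Delta_e."""
    return (n * mad_prev + abs(e_new)) / (n + 1)

def sigma_from_mad(mad):
    """Normal-theory approximation (3.71): sigma_e is about 1.25 * Delta_e."""
    return 1.25 * mad
```

Neither recursion requires storing the individual past errors, which is what makes these estimates convenient for routine forecasting systems.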

The prediction intervals are easily updated as each new observation becomes available, since the mean absolute deviation at time n + 1, Δ_e(n + 1), can be expressed in terms of the previous estimate and the most recent forecast error:

Δ_e(n + 1) = [n Δ_e(n) + |e_n(1)|]/(n + 1)

The justification for this alternative variance estimate is given below. Let us assume that a random variable X follows a normal distribution with mean μ and variance σ². Then the expected value of the absolute deviation (mean absolute deviation) is given by

Δ = E|X − μ| = ∫ |x − μ| f(x) dx = (2/π)^{1/2} σ ≈ .8σ

If we assume that the one-step-ahead forecast errors are normally distributed, then it follows that Δ_e ≈ .8σ_e, or σ_e = (π/2)^{1/2} Δ_e ≈ 1.25Δ_e. This is a reasonable approximation, even if the errors are nonnormal.

3.9. FURTHER COMMENTS

Exponential smoothing procedures have received considerable attention in the business forecasting literature. Two main reasons for the popularity of these techniques among routine business forecasters are:

1. Easy updating relationships allow the forecaster to update the forecasts without storing all past observations; only the most recent smoothed statistics have to be stored.
2. These procedures are said to be automatic and easy to use.

The claim that exponential smoothing procedures are automatic is true only if one has already decided on the order of smoothing and on the value of the smoothing constant. The order of smoothing is usually decided after an ad hoc visual inspection of the data. For observations with slowly changing mean, first-order (or simple) exponential smoothing is suggested. Observations that increase (decrease) linearly over time are usually predicted by double exponential smoothing. A value of α between .05 and .30 is usually suggested for the smoothing constant; a value of .1 seems to be preferred in many forecasting textbooks.

It should be pointed out that a visual inspection of the data alone can lead to incorrect conclusions about the order of exponential smoothing (or equivalently the order of the locally constant polynomial trend model). For example, stock price data, which are known to follow random walks (z_t = z_{t−1} + ε_t), can sometimes give the impression of local linear and quadratic trends. Furthermore, it should be emphasized that a smoothing constant α = .1 (or discount coefficient ω = .9) will not always lead to good forecasts. For the University of Iowa student enrollments, for example, it was found that the estimated ω in double exponential smoothing is close to zero. This essentially implies a stochastic trend model that remembers only the two most recent observations.

In Chapter 7 we take another look at exponential smoothing methods and gain additional insights into why these procedures work well in some instances but perform poorly in other cases. There we show that it depends on the underlying stochastic process whether exponential smoothing will lead to good forecasts. For exponential smoothing to perform well, the stochastic process has to be from a special, particularly restricted, subclass of the stochastic models discussed in Chapter 5.
After learning more about this subclass, we can ask whether these restricted models are more likely to occur than others. We will learn that there are no good reasons why real series should follow these restricted models. In particular, models that imply higher order exponential smoothing are rarely found in practice.

Source: Bovas Abraham and Johannes Ledolter, Statistical Methods for Forecasting, John Wiley & Sons, New York.

More information

Decision 411: Class 3

Decision 411: Class 3 Decision 411: Class 3 Discussion of HW#1 Introduction to seasonal models Seasonal decomposition Seasonal adjustment on a spreadsheet Forecasting with seasonal adjustment Forecasting inflation Log transformation

More information

The Art of Forecasting

The Art of Forecasting Time Series The Art of Forecasting Learning Objectives Describe what forecasting is Explain time series & its components Smooth a data series Moving average Exponential smoothing Forecast using trend models

More information

TIME SERIES ANALYSIS. Forecasting and Control. Wiley. Fifth Edition GWILYM M. JENKINS GEORGE E. P. BOX GREGORY C. REINSEL GRETA M.

TIME SERIES ANALYSIS. Forecasting and Control. Wiley. Fifth Edition GWILYM M. JENKINS GEORGE E. P. BOX GREGORY C. REINSEL GRETA M. TIME SERIES ANALYSIS Forecasting and Control Fifth Edition GEORGE E. P. BOX GWILYM M. JENKINS GREGORY C. REINSEL GRETA M. LJUNG Wiley CONTENTS PREFACE TO THE FIFTH EDITION PREFACE TO THE FOURTH EDITION

More information

Decision 411: Class 3

Decision 411: Class 3 Decision 411: Class 3 Discussion of HW#1 Introduction to seasonal models Seasonal decomposition Seasonal adjustment on a spreadsheet Forecasting with seasonal adjustment Forecasting inflation Poor man

More information

Decision 411: Class 3

Decision 411: Class 3 Decision 411: Class 3 Discussion of HW#1 Introduction to seasonal models Seasonal decomposition Seasonal adjustment on a spreadsheet Forecasting with seasonal adjustment Forecasting inflation Poor man

More information

Regression of Time Series

Regression of Time Series Mahlerʼs Guide to Regression of Time Series CAS Exam S prepared by Howard C. Mahler, FCAS Copyright 2016 by Howard C. Mahler. Study Aid 2016F-S-9Supplement Howard Mahler hmahler@mac.com www.howardmahler.com/teaching

More information

Forecasting. Chapter Copyright 2010 Pearson Education, Inc. Publishing as Prentice Hall

Forecasting. Chapter Copyright 2010 Pearson Education, Inc. Publishing as Prentice Hall Forecasting Chapter 15 15-1 Chapter Topics Forecasting Components Time Series Methods Forecast Accuracy Time Series Forecasting Using Excel Time Series Forecasting Using QM for Windows Regression Methods

More information

Exponential Smoothing. INSR 260, Spring 2009 Bob Stine

Exponential Smoothing. INSR 260, Spring 2009 Bob Stine Exponential Smoothing INSR 260, Spring 2009 Bob Stine 1 Overview Smoothing Exponential smoothing Model behind exponential smoothing Forecasts and estimates Hidden state model Diagnostic: residual plots

More information

The log transformation produces a time series whose variance can be treated as constant over time.

The log transformation produces a time series whose variance can be treated as constant over time. TAT 520 Homework 6 Fall 2017 Note: Problem 5 is mandatory for graduate students and extra credit for undergraduates. 1) The quarterly earnings per share for 1960-1980 are in the object in the TA package.

More information

Chapter 2 Wiener Filtering

Chapter 2 Wiener Filtering Chapter 2 Wiener Filtering Abstract Before moving to the actual adaptive filtering problem, we need to solve the optimum linear filtering problem (particularly, in the mean-square-error sense). We start

More information

FinQuiz Notes

FinQuiz Notes Reading 9 A time series is any series of data that varies over time e.g. the quarterly sales for a company during the past five years or daily returns of a security. When assumptions of the regression

More information

Improved Holt Method for Irregular Time Series

Improved Holt Method for Irregular Time Series WDS'08 Proceedings of Contributed Papers, Part I, 62 67, 2008. ISBN 978-80-7378-065-4 MATFYZPRESS Improved Holt Method for Irregular Time Series T. Hanzák Charles University, Faculty of Mathematics and

More information

Statistical Methods. for Forecasting

Statistical Methods. for Forecasting Statistical Methods for Forecasting Statistical Methods for Forecasting BOVAS ABRAHAM JOHANNES LEDOLTER WILEY- INTERSCI ENCE A JOHN WILEY & SONS, INC., PUBLICA'TION Copyright 0 1983.2005 by John Wiley

More information

Forecasting: The First Step in Demand Planning

Forecasting: The First Step in Demand Planning Forecasting: The First Step in Demand Planning Jayant Rajgopal, Ph.D., P.E. University of Pittsburgh Pittsburgh, PA 15261 In a supply chain context, forecasting is the estimation of future demand General

More information

Mathematics for Economics MA course

Mathematics for Economics MA course Mathematics for Economics MA course Simple Linear Regression Dr. Seetha Bandara Simple Regression Simple linear regression is a statistical method that allows us to summarize and study relationships between

More information

A robust Hansen Sargent prediction formula

A robust Hansen Sargent prediction formula Economics Letters 71 (001) 43 48 www.elsevier.com/ locate/ econbase A robust Hansen Sargent prediction formula Kenneth Kasa* Research Department, Federal Reserve Bank of San Francisco, P.O. Box 770, San

More information

Time Series Analysis -- An Introduction -- AMS 586

Time Series Analysis -- An Introduction -- AMS 586 Time Series Analysis -- An Introduction -- AMS 586 1 Objectives of time series analysis Data description Data interpretation Modeling Control Prediction & Forecasting 2 Time-Series Data Numerical data

More information

TIME SERIES ANALYSIS AND FORECASTING USING THE STATISTICAL MODEL ARIMA

TIME SERIES ANALYSIS AND FORECASTING USING THE STATISTICAL MODEL ARIMA CHAPTER 6 TIME SERIES ANALYSIS AND FORECASTING USING THE STATISTICAL MODEL ARIMA 6.1. Introduction A time series is a sequence of observations ordered in time. A basic assumption in the time series analysis

More information

Chapter 3: Regression Methods for Trends

Chapter 3: Regression Methods for Trends Chapter 3: Regression Methods for Trends Time series exhibiting trends over time have a mean function that is some simple function (not necessarily constant) of time. The example random walk graph from

More information

INTRODUCTION TO FORECASTING (PART 2) AMAT 167

INTRODUCTION TO FORECASTING (PART 2) AMAT 167 INTRODUCTION TO FORECASTING (PART 2) AMAT 167 Techniques for Trend EXAMPLE OF TRENDS In our discussion, we will focus on linear trend but here are examples of nonlinear trends: EXAMPLE OF TRENDS If you

More information

7. Forecasting with ARIMA models

7. Forecasting with ARIMA models 7. Forecasting with ARIMA models 309 Outline: Introduction The prediction equation of an ARIMA model Interpreting the predictions Variance of the predictions Forecast updating Measuring predictability

More information

Product and Inventory Management (35E00300) Forecasting Models Trend analysis

Product and Inventory Management (35E00300) Forecasting Models Trend analysis Product and Inventory Management (35E00300) Forecasting Models Trend analysis Exponential Smoothing Data Storage Shed Sales Period Actual Value(Y t ) Ŷ t-1 α Y t-1 Ŷ t-1 Ŷ t January 10 = 10 0.1 February

More information

Time-Series Analysis. Dr. Seetha Bandara Dept. of Economics MA_ECON

Time-Series Analysis. Dr. Seetha Bandara Dept. of Economics MA_ECON Time-Series Analysis Dr. Seetha Bandara Dept. of Economics MA_ECON Time Series Patterns A time series is a sequence of observations on a variable measured at successive points in time or over successive

More information

Lecture Prepared By: Mohammad Kamrul Arefin Lecturer, School of Business, North South University

Lecture Prepared By: Mohammad Kamrul Arefin Lecturer, School of Business, North South University Lecture 15 20 Prepared By: Mohammad Kamrul Arefin Lecturer, School of Business, North South University Modeling for Time Series Forecasting Forecasting is a necessary input to planning, whether in business,

More information

ADSP ADSP ADSP ADSP. Advanced Digital Signal Processing (18-792) Spring Fall Semester, Department of Electrical and Computer Engineering

ADSP ADSP ADSP ADSP. Advanced Digital Signal Processing (18-792) Spring Fall Semester, Department of Electrical and Computer Engineering Advanced Digital Signal rocessing (18-792) Spring Fall Semester, 201 2012 Department of Electrical and Computer Engineering ROBLEM SET 8 Issued: 10/26/18 Due: 11/2/18 Note: This problem set is due Friday,

More information

Supplementary Information

Supplementary Information If f - - x R f z (F ) w () () f F >, f E jj E, V G >, G >, E G,, f ff f FILY f jj ff LO_ N_ j:rer_ N_ Y_ fg LO_; LO_ N_; N_ j:rer_; j:rer_ N_ Y_ f LO_ N_ j:rer_; j:rer_; N_ j:rer_ Y_ fn LO_ N_ - N_ Y_

More information

BOOLEAN MAfRICES AND GRAPH THEORY. R. S. Ledley. NBR Report No /3265. March 1968

BOOLEAN MAfRICES AND GRAPH THEORY. R. S. Ledley. NBR Report No /3265. March 1968 ?0 00 BOOLEAN MAfRICES AND GRAPH THEORY by R. S. Ledley NBR Report No. 68032/3265 March 1968 Chirac! WoAr- lzds(o0j National Biomedical Research Foundation 11200 Lockwood Drive Silver Spring, Maryland

More information

Econ 510 B. Brown Spring 2014 Final Exam Answers

Econ 510 B. Brown Spring 2014 Final Exam Answers Econ 510 B. Brown Spring 2014 Final Exam Answers Answer five of the following questions. You must answer question 7. The question are weighted equally. You have 2.5 hours. You may use a calculator. Brevity

More information

15 yaş üstü istihdam ( )

15 yaş üstü istihdam ( ) Forecasting 1-2 Forecasting 23 000 15 yaş üstü istihdam (2005-2008) 22 000 21 000 20 000 19 000 18 000 17 000 - What can we say about this data? - Can you guess the employement level for July 2013? 1-3

More information

Future Self-Guides. E,.?, :0-..-.,0 Q., 5...q ',D5', 4,] 1-}., d-'.4.., _. ZoltAn Dbrnyei Introduction. u u rt 5,4) ,-,4, a. a aci,, u 4.

Future Self-Guides. E,.?, :0-..-.,0 Q., 5...q ',D5', 4,] 1-}., d-'.4.., _. ZoltAn Dbrnyei Introduction. u u rt 5,4) ,-,4, a. a aci,, u 4. te SelfGi ZltAn Dbnyei Intdtin ; ) Q) 4 t? ) t _ 4 73 y S _ E _ p p 4 t t 4) 1_ ::_ J 1 `i () L VI O I4 " " 1 D 4 L e Q) 1 k) QJ 7 j ZS _Le t 1 ej!2 i1 L 77 7 G (4) 4 6 t (1 ;7 bb F) t f; n (i M Q) 7S

More information

Cyclical Effect, and Measuring Irregular Effect

Cyclical Effect, and Measuring Irregular Effect Paper:15, Quantitative Techniques for Management Decisions Module- 37 Forecasting & Time series Analysis: Measuring- Seasonal Effect, Cyclical Effect, and Measuring Irregular Effect Principal Investigator

More information

Marcia Gumpertz and Sastry G. Pantula Department of Statistics North Carolina State University Raleigh, NC

Marcia Gumpertz and Sastry G. Pantula Department of Statistics North Carolina State University Raleigh, NC A Simple Approach to Inference in Random Coefficient Models March 8, 1988 Marcia Gumpertz and Sastry G. Pantula Department of Statistics North Carolina State University Raleigh, NC 27695-8203 Key Words

More information

Chapter 14 Student Lecture Notes Department of Quantitative Methods & Information Systems. Business Statistics. Chapter 14 Multiple Regression

Chapter 14 Student Lecture Notes Department of Quantitative Methods & Information Systems. Business Statistics. Chapter 14 Multiple Regression Chapter 14 Student Lecture Notes 14-1 Department of Quantitative Methods & Information Systems Business Statistics Chapter 14 Multiple Regression QMIS 0 Dr. Mohammad Zainal Chapter Goals After completing

More information

Final Exam - Solutions

Final Exam - Solutions Ecn 102 - Analysis of Economic Data University of California - Davis March 17, 2010 Instructor: John Parman Final Exam - Solutions You have until 12:30pm to complete this exam. Please remember to put your

More information

Forecasting Chapter 3

Forecasting Chapter 3 Forecasting Chapter 3 Introduction Current factors and conditions Past experience in a similar situation 2 Accounting. New product/process cost estimates, profit projections, cash management. Finance.

More information

Regression Analysis V... More Model Building: Including Qualitative Predictors, Model Searching, Model "Checking"/Diagnostics

Regression Analysis V... More Model Building: Including Qualitative Predictors, Model Searching, Model Checking/Diagnostics Regression Analysis V... More Model Building: Including Qualitative Predictors, Model Searching, Model "Checking"/Diagnostics The session is a continuation of a version of Section 11.3 of MMD&S. It concerns

More information

Regression Analysis V... More Model Building: Including Qualitative Predictors, Model Searching, Model "Checking"/Diagnostics

Regression Analysis V... More Model Building: Including Qualitative Predictors, Model Searching, Model Checking/Diagnostics Regression Analysis V... More Model Building: Including Qualitative Predictors, Model Searching, Model "Checking"/Diagnostics The session is a continuation of a version of Section 11.3 of MMD&S. It concerns

More information

Econ 300/QAC 201: Quantitative Methods in Economics/Applied Data Analysis. 17th Class 7/1/10

Econ 300/QAC 201: Quantitative Methods in Economics/Applied Data Analysis. 17th Class 7/1/10 Econ 300/QAC 201: Quantitative Methods in Economics/Applied Data Analysis 17th Class 7/1/10 The only function of economic forecasting is to make astrology look respectable. --John Kenneth Galbraith show

More information

P a g e 5 1 of R e p o r t P B 4 / 0 9

P a g e 5 1 of R e p o r t P B 4 / 0 9 P a g e 5 1 of R e p o r t P B 4 / 0 9 J A R T a l s o c o n c l u d e d t h a t a l t h o u g h t h e i n t e n t o f N e l s o n s r e h a b i l i t a t i o n p l a n i s t o e n h a n c e c o n n e

More information

TAKEHOME FINAL EXAM e iω e 2iω e iω e 2iω

TAKEHOME FINAL EXAM e iω e 2iω e iω e 2iω ECO 513 Spring 2015 TAKEHOME FINAL EXAM (1) Suppose the univariate stochastic process y is ARMA(2,2) of the following form: y t = 1.6974y t 1.9604y t 2 + ε t 1.6628ε t 1 +.9216ε t 2, (1) where ε is i.i.d.

More information

Using regression to study economic relationships is called econometrics. econo = of or pertaining to the economy. metrics = measurement

Using regression to study economic relationships is called econometrics. econo = of or pertaining to the economy. metrics = measurement EconS 450 Forecasting part 3 Forecasting with Regression Using regression to study economic relationships is called econometrics econo = of or pertaining to the economy metrics = measurement Econometrics

More information

Ch3. TRENDS. Time Series Analysis

Ch3. TRENDS. Time Series Analysis 3.1 Deterministic Versus Stochastic Trends The simulated random walk in Exhibit 2.1 shows a upward trend. However, it is caused by a strong correlation between the series at nearby time points. The true

More information

Unit 3: Linear and Exponential Functions

Unit 3: Linear and Exponential Functions Unit 3: Linear and Exponential Functions In Unit 3, students will learn function notation and develop the concepts of domain and range. They will discover that functions can be combined in ways similar

More information

INTRODUCTORY REGRESSION ANALYSIS

INTRODUCTORY REGRESSION ANALYSIS ;»»>? INTRODUCTORY REGRESSION ANALYSIS With Computer Application for Business and Economics Allen Webster Routledge Taylor & Francis Croup NEW YORK AND LONDON TABLE OF CONTENT IN DETAIL INTRODUCTORY REGRESSION

More information

THE ROYAL STATISTICAL SOCIETY 2009 EXAMINATIONS SOLUTIONS GRADUATE DIPLOMA MODULAR FORMAT MODULE 3 STOCHASTIC PROCESSES AND TIME SERIES

THE ROYAL STATISTICAL SOCIETY 2009 EXAMINATIONS SOLUTIONS GRADUATE DIPLOMA MODULAR FORMAT MODULE 3 STOCHASTIC PROCESSES AND TIME SERIES THE ROYAL STATISTICAL SOCIETY 9 EXAMINATIONS SOLUTIONS GRADUATE DIPLOMA MODULAR FORMAT MODULE 3 STOCHASTIC PROCESSES AND TIME SERIES The Society provides these solutions to assist candidates preparing

More information

Notes on Time Series Modeling

Notes on Time Series Modeling Notes on Time Series Modeling Garey Ramey University of California, San Diego January 17 1 Stationary processes De nition A stochastic process is any set of random variables y t indexed by t T : fy t g

More information

Regression Analysis. BUS 735: Business Decision Making and Research. Learn how to detect relationships between ordinal and categorical variables.

Regression Analysis. BUS 735: Business Decision Making and Research. Learn how to detect relationships between ordinal and categorical variables. Regression Analysis BUS 735: Business Decision Making and Research 1 Goals of this section Specific goals Learn how to detect relationships between ordinal and categorical variables. Learn how to estimate

More information

3.1. Determine the z-transform, including the region of convergence, for each of the following sequences: N, N::: n.

3.1. Determine the z-transform, including the region of convergence, for each of the following sequences: N, N::: n. Chap. 3 Problems 27 versa. Specifically, we showed that the defining power series of the z-transform may converge when the Fourier transform does not. We explored in detail the dependence of the shape

More information

A Second Course in Statistics: Regression Analysis

A Second Course in Statistics: Regression Analysis FIFTH E D I T I 0 N A Second Course in Statistics: Regression Analysis WILLIAM MENDENHALL University of Florida TERRY SINCICH University of South Florida PRENTICE HALL Upper Saddle River, New Jersey 07458

More information

TIMES SERIES INTRODUCTION INTRODUCTION. Page 1. A time series is a set of observations made sequentially through time

TIMES SERIES INTRODUCTION INTRODUCTION. Page 1. A time series is a set of observations made sequentially through time TIMES SERIES INTRODUCTION A time series is a set of observations made sequentially through time A time series is said to be continuous when observations are taken continuously through time, or discrete

More information

Chapter 7 Forecasting Demand

Chapter 7 Forecasting Demand Chapter 7 Forecasting Demand Aims of the Chapter After reading this chapter you should be able to do the following: discuss the role of forecasting in inventory management; review different approaches

More information

d. = (1 + Xi(X. 1X, 1) X.) 2)

d. = (1 + Xi(X. 1X, 1) X.) 2) INTERFACING SAS SOFTWARE WITH THE B34S SYSTEM RECURSIVE RESIDUAL OPTION: A BRIEF LOOK AT THEORY AND AN EXAMPLE Houston H. Stokes University of Illinois at Chicago Abstract The recursive residual technique,

More information

Any of 27 linear and nonlinear models may be fit. The output parallels that of the Simple Regression procedure.

Any of 27 linear and nonlinear models may be fit. The output parallels that of the Simple Regression procedure. STATGRAPHICS Rev. 9/13/213 Calibration Models Summary... 1 Data Input... 3 Analysis Summary... 5 Analysis Options... 7 Plot of Fitted Model... 9 Predicted Values... 1 Confidence Intervals... 11 Observed

More information

~,. :'lr. H ~ j. l' ", ...,~l. 0 '" ~ bl '!; 1'1. :<! f'~.., I,," r: t,... r':l G. t r,. 1'1 [<, ."" f'" 1n. t.1 ~- n I'>' 1:1 , I. <1 ~'..

~,. :'lr. H ~ j. l' , ...,~l. 0 ' ~ bl '!; 1'1. :<! f'~.., I,, r: t,... r':l G. t r,. 1'1 [<, . f' 1n. t.1 ~- n I'>' 1:1 , I. <1 ~'.. ,, 'l t (.) :;,/.I I n ri' ' r l ' rt ( n :' (I : d! n t, :?rj I),.. fl.),. f!..,,., til, ID f-i... j I. 't' r' t II!:t () (l r El,, (fl lj J4 ([) f., () :. -,,.,.I :i l:'!, :I J.A.. t,.. p, - ' I I I

More information

ECON 343 Lecture 4 : Smoothing and Extrapolation of Time Series. Jad Chaaban Spring

ECON 343 Lecture 4 : Smoothing and Extrapolation of Time Series. Jad Chaaban Spring ECON 343 Lecture 4 : Smoothing and Extrapolation of Time Series Jad Chaaban Spring 2005-2006 Outline Lecture 4 1. Simple extrapolation models 2. Moving-average models 3. Single Exponential smoothing 4.

More information

Computer Science, Informatik 4 Communication and Distributed Systems. Simulation. Discrete-Event System Simulation. Dr.

Computer Science, Informatik 4 Communication and Distributed Systems. Simulation. Discrete-Event System Simulation. Dr. Simulation Discrete-Event System Simulation Chapter 0 Output Analysis for a Single Model Purpose Objective: Estimate system performance via simulation If θ is the system performance, the precision of the

More information

Decision 411: Class 4

Decision 411: Class 4 Decision 411: Class 4 Non-seasonal averaging & smoothing models Simple moving average (SMA) model Simple exponential smoothing (SES) model Linear exponential smoothing (LES) model Combining seasonal adjustment

More information

Applications of Mathematics

Applications of Mathematics Applications of Mathematics Tomáš Cipra Exponential smoothing for irregular data Applications of Mathematics Vol. 5 2006) No. 6 597--604 Persistent URL: http://dml.cz/dmlcz/34655 Terms of use: Institute

More information

Part 1. Multiple Choice (50 questions, 1 point each) Part 2. Problems/Short Answer (10 questions, 5 points each)

Part 1. Multiple Choice (50 questions, 1 point each) Part 2. Problems/Short Answer (10 questions, 5 points each) GROUND RULES: This exam contains two parts: Part 1. Multiple Choice (50 questions, 1 point each) Part 2. Problems/Short Answer (10 questions, 5 points each) The maximum number of points on this exam is

More information

Final Exam - Solutions

Final Exam - Solutions Ecn 102 - Analysis of Economic Data University of California - Davis March 19, 2010 Instructor: John Parman Final Exam - Solutions You have until 5:30pm to complete this exam. Please remember to put your

More information

Chapter 14 Student Lecture Notes 14-1

Chapter 14 Student Lecture Notes 14-1 Chapter 14 Student Lecture Notes 14-1 Business Statistics: A Decision-Making Approach 6 th Edition Chapter 14 Multiple Regression Analysis and Model Building Chap 14-1 Chapter Goals After completing this

More information

Departamento de Estadfstica y Econometrfa Statistics and Econometrics Series 27. Universidad Carlos III de Madrid December 1993 Calle Madrid, 126

Departamento de Estadfstica y Econometrfa Statistics and Econometrics Series 27. Universidad Carlos III de Madrid December 1993 Calle Madrid, 126 -------_._-- Working Paper 93-45 Departamento de Estadfstica y Econometrfa Statistics and Econometrics Series 27 Universidad Carlos III de Madrid December 1993 Calle Madrid, 126 28903 Getafe (Spain) Fax

More information

176 5 t h Fl oo r. 337 P o ly me r Ma te ri al s

176 5 t h Fl oo r. 337 P o ly me r Ma te ri al s A g la di ou s F. L. 462 E l ec tr on ic D ev el op me nt A i ng er A.W.S. 371 C. A. M. A l ex an de r 236 A d mi ni st ra ti on R. H. (M rs ) A n dr ew s P. V. 326 O p ti ca l Tr an sm is si on A p ps

More information

Name (print, please) ID

Name (print, please) ID Name (print, please) ID Operations Management I 7- Winter 00 Odette School of Business University of Windsor Midterm Exam I Solution Wednesday, ebruary, 0:00 :0 pm Last Name A-S: Odette B0 Last Name T-Z:

More information

Statistics 910, #5 1. Regression Methods

Statistics 910, #5 1. Regression Methods Statistics 910, #5 1 Overview Regression Methods 1. Idea: effects of dependence 2. Examples of estimation (in R) 3. Review of regression 4. Comparisons and relative efficiencies Idea Decomposition Well-known

More information

Based on the original slides from Levine, et. all, First Edition, Prentice Hall, Inc

Based on the original slides from Levine, et. all, First Edition, Prentice Hall, Inc Based on the original slides from Levine, et. all, First Edition, Prentice Hall, Inc Process of predicting a future event Underlying basis of all business decisions Production Inventory Personnel Facilities

More information

Business Statistics. Chapter 14 Introduction to Linear Regression and Correlation Analysis QMIS 220. Dr. Mohammad Zainal

Business Statistics. Chapter 14 Introduction to Linear Regression and Correlation Analysis QMIS 220. Dr. Mohammad Zainal Department of Quantitative Methods & Information Systems Business Statistics Chapter 14 Introduction to Linear Regression and Correlation Analysis QMIS 220 Dr. Mohammad Zainal Chapter Goals After completing

More information

State-space Model. Eduardo Rossi University of Pavia. November Rossi State-space Model Fin. Econometrics / 53

State-space Model. Eduardo Rossi University of Pavia. November Rossi State-space Model Fin. Econometrics / 53 State-space Model Eduardo Rossi University of Pavia November 2014 Rossi State-space Model Fin. Econometrics - 2014 1 / 53 Outline 1 Motivation 2 Introduction 3 The Kalman filter 4 Forecast errors 5 State

More information

Chapter 11. Output Analysis for a Single Model Prof. Dr. Mesut Güneş Ch. 11 Output Analysis for a Single Model

Chapter 11. Output Analysis for a Single Model Prof. Dr. Mesut Güneş Ch. 11 Output Analysis for a Single Model Chapter Output Analysis for a Single Model. Contents Types of Simulation Stochastic Nature of Output Data Measures of Performance Output Analysis for Terminating Simulations Output Analysis for Steady-state

More information

5.5 Recurrence Relations and Clenshaw s Recurrence Formula

5.5 Recurrence Relations and Clenshaw s Recurrence Formula 178 Chapter 5. Evaluation of Functions Then the answer is 0 ( ) w =0 d w + i w 0, c 0 2w c + id = d + iw w 0, c

More information

Forecasting. Operations Analysis and Improvement Spring

Forecasting. Operations Analysis and Improvement Spring Forecasting Operations Analysis and Improvement 2015 Spring Dr. Tai-Yue Wang Industrial and Information Management Department National Cheng Kung University 1-2 Outline Introduction to Forecasting Subjective

More information

Chapter 8 - Forecasting

Chapter 8 - Forecasting Chapter 8 - Forecasting Operations Management by R. Dan Reid & Nada R. Sanders 4th Edition Wiley 2010 Wiley 2010 1 Learning Objectives Identify Principles of Forecasting Explain the steps in the forecasting

More information

TREND ESTIMATION AND THE HODRICK-PRESCOTT FILTER

TREND ESTIMATION AND THE HODRICK-PRESCOTT FILTER J. Japan Statist. Soc. Vol. 38 No. 1 2008 41 49 TREND ESTIMATION AND THE HODRICK-PRESCOTT FILTER Andrew Harvey* and Thomas Trimbur** The article analyses the relationship between unobserved component trend-cycle

More information

Lecture # 31. Questions of Marks 3. Question: Solution:

Lecture # 31. Questions of Marks 3. Question: Solution: Lecture # 31 Given XY = 400, X = 5, Y = 4, S = 4, S = 3, n = 15. Compute the coefficient of correlation between XX and YY. r =0.55 X Y Determine whether two variables XX and YY are correlated or uncorrelated

More information

Lecture 12. Functional form

Lecture 12. Functional form Lecture 12. Functional form Multiple linear regression model β1 + β2 2 + L+ β K K + u Interpretation of regression coefficient k Change in if k is changed by 1 unit and the other variables are held constant.

More information

Chapter 5: Forecasting

Chapter 5: Forecasting 1 Textbook: pp. 165-202 Chapter 5: Forecasting Every day, managers make decisions without knowing what will happen in the future 2 Learning Objectives After completing this chapter, students will be able

More information

STAT 212 Business Statistics II 1

STAT 212 Business Statistics II 1 STAT 1 Business Statistics II 1 KING FAHD UNIVERSITY OF PETROLEUM & MINERALS DEPARTMENT OF MATHEMATICAL SCIENCES DHAHRAN, SAUDI ARABIA STAT 1: BUSINESS STATISTICS II Semester 091 Final Exam Thursday Feb

More information

The Blue Chip Survey: Moving Beyond the Consensus

The Blue Chip Survey: Moving Beyond the Consensus The Useful Role of Forecast Surveys Sponsored by NABE ASSA Annual Meeting, January 7, 2005 The Blue Chip Survey: Moving Beyond the Consensus Kevin L. Kliesen, Economist Work in Progress Not to be quoted

More information

data lam=36.9 lam=6.69 lam=4.18 lam=2.92 lam=2.21 time max wavelength modulus of max wavelength cycle

data lam=36.9 lam=6.69 lam=4.18 lam=2.92 lam=2.21 time max wavelength modulus of max wavelength cycle AUTOREGRESSIVE LINEAR MODELS AR(1) MODELS The zero-mean AR(1) model x t = x t,1 + t is a linear regression of the current value of the time series on the previous value. For > 0 it generates positively

More information

UNIVERSITY OF SWAZILAND MAIN EXAMINATION PAPER 2016 PROBABILITY AND STATISTICS ANSWER ANY FIVE QUESTIONS.

UNIVERSITY OF SWAZILAND MAIN EXAMINATION PAPER 2016 PROBABILITY AND STATISTICS ANSWER ANY FIVE QUESTIONS. UNIVERSITY OF SWAZILAND MAIN EXAMINATION PAPER 2016 TITLE OF PAPER PROBABILITY AND STATISTICS COURSE CODE EE301 TIME ALLOWED 3 HOURS INSTRUCTIONS ANSWER ANY FIVE QUESTIONS. REQUIREMENTS SCIENTIFIC CALCULATOR

More information

ECON2228 Notes 2. Christopher F Baum. Boston College Economics. cfb (BC Econ) ECON2228 Notes / 47

ECON2228 Notes 2. Christopher F Baum. Boston College Economics. cfb (BC Econ) ECON2228 Notes / 47 ECON2228 Notes 2 Christopher F Baum Boston College Economics 2014 2015 cfb (BC Econ) ECON2228 Notes 2 2014 2015 1 / 47 Chapter 2: The simple regression model Most of this course will be concerned with

More information

Ron Heck, Fall Week 3: Notes Building a Two-Level Model

Ron Heck, Fall Week 3: Notes Building a Two-Level Model Ron Heck, Fall 2011 1 EDEP 768E: Seminar on Multilevel Modeling rev. 9/6/2011@11:27pm Week 3: Notes Building a Two-Level Model We will build a model to explain student math achievement using student-level

More information

Mathematics Standards for High School Algebra II

Mathematics Standards for High School Algebra II Mathematics Standards for High School Algebra II Algebra II is a course required for graduation and is aligned with the College and Career Ready Standards for Mathematics in High School. Throughout the

More information

T h e C S E T I P r o j e c t

T h e C S E T I P r o j e c t T h e P r o j e c t T H E P R O J E C T T A B L E O F C O N T E N T S A r t i c l e P a g e C o m p r e h e n s i v e A s s es s m e n t o f t h e U F O / E T I P h e n o m e n o n M a y 1 9 9 1 1 E T

More information

MATHEMATICS AND GAMES* Nagayoshi lwahori University of Tokyo

MATHEMATICS AND GAMES* Nagayoshi lwahori University of Tokyo MATHEMATICS AND GAMES* Nagayoshi lwahori University of Tokyo When one looks at mathematical phenomena or theories from various points of view instead of from only one angle, one usually gets some unexpected

More information

The ARIMA Procedure: The ARIMA Procedure

The ARIMA Procedure: The ARIMA Procedure Page 1 of 120 Overview: ARIMA Procedure Getting Started: ARIMA Procedure The Three Stages of ARIMA Modeling Identification Stage Estimation and Diagnostic Checking Stage Forecasting Stage Using ARIMA Procedure

More information

Unit 3: Linear and Exponential Functions

Unit 3: Linear and Exponential Functions Unit 3: Linear and Exponential Functions In Unit 3, students will learn function notation and develop the concepts of domain and range. They will discover that functions can be combined in ways similar

More information

Chapter 1. Linear Regression with One Predictor Variable

Chapter 1. Linear Regression with One Predictor Variable Chapter 1. Linear Regression with One Predictor Variable 1.1 Statistical Relation Between Two Variables To motivate statistical relationships, let us consider a mathematical relation between two mathematical

More information

1. Fundamental concepts

1. Fundamental concepts . Fundamental concepts A time series is a sequence of data points, measured typically at successive times spaced at uniform intervals. Time series are used in such fields as statistics, signal processing

More information