Chapter 12: An introduction to Time Series Analysis



Introduction

In this chapter we discuss forecasting with single-series (univariate) Box-Jenkins models, commonly known as Auto-Regressive Integrated Moving Average (ARIMA) models. Models of this kind that can also handle seasonal patterns are called SARIMA models. Time-domain time series analysis (ARIMA modelling) is based on the autocorrelation function. Alternatively, a function called the spectral density, which accounts for the variation in a time series through cyclic components at different frequencies, may be taken as the basis of the analysis; in that case we speak of frequency-domain time series analysis. In this short treatment we describe only the time-domain approach to inference about time series. ARIMA models attempt to forecast future values of a particular series based on its past values.

Time series data consist of pairs of observations (t, z_t), t = 1, ..., n, taken over equally spaced, discrete time intervals. Since the variable is measured only at discrete time intervals, the data are said to be discrete, even though the variable observed may be continuous. A mechanical pen continuously recording the temperature at every instant of time on a roll of paper would be an example of continuous data. Examples of discrete time series are: (1) Rainfall on successive days; (2) Total sales figures each week for a number of successive weeks; (3) Yearly population values for a country or town; (4) Numbers of women unemployed each week in a city; (5) Daily stock prices; (6) Number of sunspots each year over a number of years;

(7) Dow Jones indexes for various goods; (8) Number of airline passengers per year; (9) Average monthly air temperature in a given locale over a number of months; (10) Bank interest as recorded on each day; (11) Coal production per year; (12) Number of housing permits issued in a certain locale per year; (13) Amount of real-estate loans issued per month in a given locale; (14) Weekly profit margin for a particular firm; (15) College enrolment in a province per year; (16) Total yearly exports.

In regression analysis and many other types of statistical analysis we assume individual observations are independent; in time series data, however, we assume that individual observations are dependent, that is, correlated. Often we assume that the current observation depends strongly on the previous observation in the series, so that pairs of observations separated by one time unit are highly correlated, i.e. have strong first-order autocorrelation. It is common to see a time series model that relates the current z value to the two most recent observations. ARIMA models are thus often used to make short-term forecasts. To build a time series model and make forecasts we usually need at least 50 observations, because a large sample size is needed to estimate the autocorrelations accurately.

Stationary Time Series

ARIMA methods are useful for stationary time series. A time series is said to be stationary if there is no systematic change in the mean and the variance, and if strictly periodic variations have been removed. That is, a stationary time series has a mean, variance, and autocorrelation function that are essentially constant through time. (There is a proper probabilistic definition of stationarity in most books on time series, and the mathematically inclined student should consult one.) Given n time series observations, the usual formulas may be used to estimate the mean and variance:

z̄ = (1/n) Σ_{i=1}^n z_i and s_z² = (1/n) Σ_{i=1}^n (z_i − z̄)².

Basically an ARIMA model is a simple, or perhaps complicated, equation relating a current z value to its past values.
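The two estimates above translate directly into code; a minimal Python sketch (the function name series_mean_var is ours, not from any package):

```python
# Sample mean and variance of a time series, using the 1/n divisor
# given in the text (not the 1/(n-1) divisor of the usual sample variance).
def series_mean_var(z):
    n = len(z)
    zbar = sum(z) / n
    s2 = sum((zi - zbar) ** 2 for zi in z) / n
    return zbar, s2

print(series_mean_var([1.0, 2.0, 3.0, 4.0]))  # (2.5, 1.25)
```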

Example 12.1 The following equation (sometimes called an autoregressive process of the first order) is an example of an ARIMA model:

z_t = C + φ_1 z_{t−1} + α_t.

In this equation C is a constant term, α_t is the current random shock to the system, and φ_1 is a parameter (like a regression coefficient) that relates the current value z_t to the immediately preceding value z_{t−1}. This is an autoregressive model of order one, denoted ARIMA(1,0,0), a special case of the general ARIMA model. To fit this model to a time series data set in R we use arima(data, order = c(1, 0, 0)).
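To get a feel for this model one can simulate it. The following Python sketch (simulate_ar1 is our own illustrative helper; Gaussian shocks are an assumption) generates a realization of z_t = C + φ_1 z_{t−1} + α_t:

```python
import random

# Simulate n values of the AR(1) model z_t = C + phi1 * z_{t-1} + a_t,
# with independent Gaussian random shocks a_t of standard deviation sigma.
def simulate_ar1(n, C, phi1, sigma=1.0, seed=0):
    rng = random.Random(seed)
    z = [C / (1 - phi1)]  # start the series at its stationary mean
    for _ in range(n - 1):
        z.append(C + phi1 * z[-1] + rng.gauss(0, sigma))
    return z

z = simulate_ar1(500, C=2.0, phi1=0.5)  # stationary mean is C/(1-phi1) = 4
```

Such a simulated series could then be handed to arima(data, order = c(1, 0, 0)) in R, as above, to see how well the parameters are recovered.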

Example 12.2 Consider the following:

z_t = C − θ_1 α_{t−1} + α_t.

Again C is a constant term, and this model uses the parameter θ_1 to relate the immediately preceding shock to the current value of z_t, so that z_t now depends on a past shock α_{t−1} and a current shock α_t. This is a moving average model of order one, denoted ARIMA(0,0,1), also a special case of the general ARIMA model. To fit this model to a time series data set in R we use arima(data, order = c(0, 0, 1)).

Example 12.3 Consider the following:

z_t = C + φ_1 z_{t−1} − θ_1 α_{t−1} + α_t.

This is a mixed model: it contains both AR (autoregressive) and MA (moving average) terms. It is an ARIMA(1,0,1) process because the AR order is one and the MA order is one. To fit this model to a time series data set in R we use arima(data, order = c(1, 0, 1)).

Paradigm for the Box-Jenkins Modelling Procedure

To fit Box-Jenkins models we follow a four-step procedure. (1) Identify an appropriate model. There are infinitely many candidate models, and the first step is to select a plausible one; we will shortly discuss two major tools for doing so. (2) Fit the model chosen in (1) using the available time series data. The estimation procedure is usually based on maximum likelihood or least squares; under certain normality assumptions (random shocks normally distributed) the least squares estimators are equivalent to the maximum likelihood estimators. (3) Check the fitted model from (2) for adequacy, for example by examining the residuals. If the model is not adequate, go back to step (1). (4) If the model fitted in (2) proves adequate in (3), we may use it to make forecasts.

There are two functions we calculate that help us identify which model to fit: (1) the ACF, the autocorrelation function, and (2) the PACF, the partial autocorrelation function. We may use R to calculate both; the corresponding commands are (1) acf(data) and (2) pacf(data). These commands automatically produce the ACF and PACF graphs.

[Figure: ACF and PACF plots]

How do these graphs help? Associated with each ARIMA model there is a theoretical ACF and a theoretical PACF. The procedure is to estimate the ACF and PACF from the time series data; the estimates are called the empirical ACF and empirical PACF. We then compare the empirical functions to the theoretical ones, and pick the model whose theoretical ACF and PACF agree as closely as we can judge with the empirical ACF and PACF. This tentative model is then fitted and, if adequate, used for forecasting; otherwise we try again to find a better model.

The Definition of the Empirical ACF and Empirical PACF

For a stationary process, the empirical ACF and empirical PACF are defined as follows. (1) The empirical ACF: given a discrete time series (t, z_t), t = 1, ..., n, the autocorrelation coefficient at lag 1 is defined as

r_1 = Σ_{t=1}^{n−1} (z_t − z̄)(z_{t+1} − z̄) / Σ_{t=1}^{n} (z_t − z̄)².

Similarly we can define the autocorrelation coefficient at lag 2, lag 3, and so on. The autocorrelation coefficient at lag k is defined as

r_k = Σ_{t=1}^{n−k} (z_t − z̄)(z_{t+k} − z̄) / Σ_{t=1}^{n} (z_t − z̄)².

A plot of r_k against k is called the empirical correlogram.
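The formula for r_k translates directly into code; a minimal Python sketch (acf_hat is our own name for the helper):

```python
# Empirical autocorrelation coefficient r_k as defined above:
# r_k = sum_{t=1}^{n-k} (z_t - zbar)(z_{t+k} - zbar) / sum_{t=1}^{n} (z_t - zbar)^2
def acf_hat(z, k):
    n = len(z)
    zbar = sum(z) / n
    num = sum((z[t] - zbar) * (z[t + k] - zbar) for t in range(n - k))
    den = sum((zt - zbar) ** 2 for zt in z)
    return num / den

# An alternating series has strong negative lag-1 autocorrelation
# and positive lag-2 autocorrelation.
z = [1.0, -1.0] * 5
print(acf_hat(z, 1), acf_hat(z, 2))  # -0.9 0.8
```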

(2) The empirical PACF: the PACF is an attempt to measure the correlation between points in the time series separated by a lag of, say, k, in such a way that the effects of the intervening values between z_t and z_{t+k} are accounted for. The symbol for the theoretical PACF is φ_kk and the symbol for the empirical (estimated) PACF is φ̂_kk. Note that φ_11 = ρ_1, since with lag one there are no intervening points. The calculation of the PACF is difficult, but one way an estimate can be made is in terms of the autocorrelation matrix. For a series of length n we may write a matrix of all the possible autocorrelations. For example, for a series of length two (note that we could not really estimate a correlation from a series of length 2; in fact, it is recommended that a series have length at least 50 before we try to estimate correlations) we would have one autocorrelation ρ_1 and the matrix

( 1    ρ_1 )
( ρ_1  1   )

For a series of length three we would have

( 1    ρ_1  ρ_2 )
( ρ_1  1    ρ_1 )
( ρ_2  ρ_1  1   )

Clearly for a series of length n we would have an n × n matrix containing n − 1 different autocorrelations, one or more of which could of course be zero. Now the first PACF is defined by

φ_11 = ρ_1,

and the second by the ratio of determinants

φ_22 = |1 ρ_1; ρ_1 ρ_2| / |1 ρ_1; ρ_1 1| = (ρ_2 − ρ_1²) / (1 − ρ_1²).

The same pattern is repeated for the third and any higher-order PACF. We can estimate these PACFs by replacing the autocorrelations by their estimated values r_k. That is, the matrix in the numerator is formed

from the autocorrelation matrix in the denominator by replacing just the last column by the vector of autocorrelations (ρ_1, ρ_2, ..., ρ_k). Note that: (1) Stationary autoregressive (AR) processes have theoretical ACFs that decay or damp out toward zero (the r_k decrease in magnitude as k increases), but theoretical PACFs that cut off sharply to zero after a few spikes. The lag of the last PACF spike equals the AR order (p) of the process; for example the AR(1) process has only one PACF spike, all other φ_kk being zero. (2) Moving average (MA) processes have theoretical ACFs that cut off to zero after a certain number of spikes; the lag of the last ACF spike equals the MA order (q) of the process. Their theoretical PACFs decay or die out toward zero (the φ_kk decrease in magnitude as k increases). Properties of a good time series model: (1) It uses the smallest number of coefficients needed to explain the data.

(2) It is stationary (that is, it has AR coefficients which satisfy certain mathematical inequalities). (3) It is invertible (it has MA coefficients which satisfy certain mathematical inequalities). (4) It has coefficients that are significant. (5) It has uncorrelated residuals. (6) It makes forecasts that are at least sensible.

Further Notations

We need some further notation to represent complicated models, and because most books and software packages use this additional notation; to read most works on time series we must understand it. The Box-Jenkins approach uses two operators that permit compact representation of the models: (1) The backshift operator, B, defined by B(z_t) = z_{t−1}.

(2) The difference operator, ∇, defined by ∇(z_t) = z_t − z_{t−1}. Note that we have the relationship ∇ = 1 − B, that is, ∇(z_t) = z_t − B(z_t). Furthermore, ∇^d = (1 − B)^d. Some examples:

B²(z_t) = z_{t−2}, and B^k(z_t) = z_{t−k};

(1 − B)z_t = z_t − B(z_t) = z_t − z_{t−1}.

Using the backshift operator, we can write the AR(2) time series model

z_t = C + φ_1 z_{t−1} + φ_2 z_{t−2} + α_t

as

(z_t − φ_1 z_{t−1} − φ_2 z_{t−2}) = C + α_t,

which is equivalent to

(1 − φ_1 B − φ_2 B²) z_t = C + α_t.

Letting Φ_2(B) = (1 − φ_1 B − φ_2 B²), we can further compress the notation to Φ_2(B) z_t = C + α_t. Similarly, we can represent the MA(2) time series model

z_t = C + α_t − θ_1 α_{t−1} − θ_2 α_{t−2}

by

z_t = C + (1 − θ_1 B − θ_2 B²) α_t = C + Θ_2(B) α_t,

where Θ_2(B) = (1 − θ_1 B − θ_2 B²). Now we can write an ARIMA(2,0,2) model as Φ_2(B) z_t = C + Θ_2(B) α_t.
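As a concrete illustration of the operator notation, applying Θ_2(B) to a shock series just combines lagged shocks. The following Python sketch (ma2_apply is our own helper, with made-up parameter values) computes z_t = C + α_t − θ_1 α_{t−1} − θ_2 α_{t−2} for each t at which both lags exist:

```python
# Apply the MA(2) operator Theta2(B) = 1 - theta1*B - theta2*B^2 to a
# list of shocks alpha; B acts by shifting the series back one step.
def ma2_apply(C, theta1, theta2, alpha):
    return [C + alpha[t] - theta1 * alpha[t - 1] - theta2 * alpha[t - 2]
            for t in range(2, len(alpha))]

print(ma2_apply(1.0, 0.5, 0.25, [0.0, 4.0, 8.0, 0.0]))  # [7.0, -4.0]
```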

Stationarity and Invertibility

In general, we can write an ARIMA(p,0,q) model as

Φ_p(B) z_t = C + Θ_q(B) α_t,

where Φ_p(B) = (1 − φ_1 B − φ_2 B² − ... − φ_p B^p) and Θ_q(B) = (1 − θ_1 B − θ_2 B² − ... − θ_q B^q). We will discuss the general ARIMA(p,d,q) model later. In this section we introduce two important characteristics of a time series. Stationarity implies that the AR coefficients must satisfy certain conditions; usually after estimating a model you check that the AR coefficients satisfy the following stationarity conditions. (1) Stationarity condition for AR(1): the requirement is that the coefficient be less than one in absolute value, that is, |φ_1| < 1.
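These inequality checks are easy to automate. A small Python sketch covering the AR(1) condition above and the AR(2) inequalities discussed next (function names are ours):

```python
# Stationarity checks on estimated AR coefficients.
def ar1_is_stationary(phi1):
    return abs(phi1) < 1

def ar2_is_stationary(phi1, phi2):
    # All three AR(2) inequalities must hold simultaneously.
    return abs(phi2) < 1 and phi2 + phi1 < 1 and phi2 - phi1 < 1

print(ar1_is_stationary(0.8))        # True
print(ar2_is_stationary(0.5, 0.3))   # True
print(ar2_is_stationary(0.5, 0.6))   # False: phi2 + phi1 = 1.1
```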

In fact this is the requirement for all ARIMA(1,0,q) models. In practice we do not know φ_1, so we check the estimate φ̂_1. (2) Stationarity condition for AR(2) or ARIMA(2,0,q) models: the requirements are

|φ_2| < 1, φ_2 + φ_1 < 1, φ_2 − φ_1 < 1.

All three conditions must hold for an ARIMA(2,0,q) model to be stationary; we apply them to the estimated parameters φ̂_1 and φ̂_2. Note that checking AR processes of order p > 2 is difficult, but a necessary condition for stationarity is that φ_1 + φ_2 + ... + φ_p < 1.

Why do we need stationarity? Stationarity is necessary because a realization of a time series represents only one sample observation at each time point. If the mean

changes with time then we would have only one observation at time point t with which to estimate the mean at that time point. We cannot start history again to get two realizations of the series at each point in time. If we had two observations at a point in time we could estimate the mean there by averaging them; however, we have only one observation at any point in time, and consequently we require stationarity, that is, that the mean value is not changing over time. If the mean is not changing over time then we can estimate it by averaging observations over time. The same applies to estimation of the variance and of the covariance. Consequently we require that neither the mean, nor the variance, nor the covariance change over time, i.e. that the time series be stationary in these parameters. This is usually called second-order stationarity, or weak stationarity. For normal processes second-order stationarity implies strict stationarity, since the multivariate normal distribution is completely characterized by its first

and second moments. To check the stationarity of a time series in practice, we follow these steps: (1) Check the time series plot to see whether either the mean or the variance appears to be changing dramatically over time. (2) Check the ACF: it should decrease rapidly to zero, reaching zero at roughly lag 5 or 6; that is, the t-value for r_5 or r_6 should indicate that the autocorrelation is not significantly different from zero. (3) Check that the AR coefficients satisfy the inequality restrictions discussed above. Note that the above stationarity inequalities apply only to the AR coefficients of the general ARIMA(p,d,q) model.

Invertibility

Invertibility implies that the MA coefficients must satisfy certain conditions; usually after estimating a model you check that the MA coefficients satisfy the following invertibility conditions. (1) Invertibility for MA(1) processes: for an MA(1) or ARIMA(p,0,1) process, invertibility requires that the absolute value of θ_1 be less

than one, i.e. |θ_1| < 1. (2) Invertibility for MA(2) processes: for an MA(2) or ARIMA(p,0,2) process to be invertible we must have

|θ_2| < 1, θ_2 + θ_1 < 1, θ_2 − θ_1 < 1.

Again, for higher-order MA processes (q > 2) it is complicated to check invertibility, but a necessary condition is that θ_1 + θ_2 + ... + θ_q < 1. Note that the invertibility inequalities apply only to the MA coefficients of the general ARIMA(p,d,q) model. Common sense tells us that in general more recent events (shocks) should have larger coefficients and events further in the past should have reduced weights; invertibility guarantees that the weights on past observations decline as we move further into the past.

Non-stationarity in the mean

The general ARIMA process is denoted ARIMA(p,d,q). In this notation we now understand the meaning of AR and of p, the order of the AR process, and of MA and q, the order of the MA process. It remains to explain the meaning of the letter I and of the letter d; we will tackle this in the next lecture. The models discussed previously require that the series be stationary. It may be that the series is stationary except with respect to the mean, which may be, for example, increasing or decreasing with time. We now look at how to handle a series that is non-stationary in the mean.

Differencing

If different segments of a series behave much like the rest of the series after we allow for changes in level and/or slope, then the series may be transformed into a stationary series simply by differencing. Differencing is a type of filtering which can be used to remove trend.

Basically we difference a given series until it becomes stationary. Suppose we have a non-seasonal series and denote the original observations by {y_t, t = 1, 2, ..., n}; then a new differenced series {z_t} can be formed by

z_t = ∇y_t = y_t − y_{t−1}.

A series z_t created in this way is called the series of first differences of y_t. We hope that this series of first differences will have a constant mean, and it is surprising how often simple first differences produce a series with constant mean. Recall again that ∇ = (1 − B). If first differences do not work, we try second differences, obtained by taking first differences of the first-differenced series z_t. Calling the series of second differences w_t, we have

w_t = z_t − z_{t−1} = (y_t − y_{t−1}) − (y_{t−1} − y_{t−2}) = y_t − 2y_{t−1} + y_{t−2};

thus w_t is the original series differenced twice. Note that this can be done using the difference operator or the backshift operator:

w_t = (1 − B)² y_t = (1 − 2B + B²) y_t = y_t − 2y_{t−1} + y_{t−2}

or

w_t = ∇² y_t = ∇(∇y_t) = ∇(y_t − y_{t−1}) = (y_t − y_{t−1}) − (y_{t−1} − y_{t−2}) = y_t − 2y_{t−1} + y_{t−2}.

Recall that

Φ(B) = (1 − φ_1 B − φ_2 B² − ... − φ_p B^p),
Θ(B) = (1 − θ_1 B − θ_2 B² − ... − θ_q B^q).

We are now in a position to write the general nonseasonal ARIMA model; we can express the ARIMA(p,d,q) model as

Φ(B) ∇^d y_t = C + Θ(B) α_t.
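In code, differencing is just repeated adjacent subtraction; a minimal Python sketch (diff is our own helper, shortening the series by one each time it is applied):

```python
# First differences: z_t = y_t - y_{t-1}. Applying diff twice gives the
# second differences w_t = y_t - 2*y_{t-1} + y_{t-2} derived above.
def diff(y):
    return [y[t] - y[t - 1] for t in range(1, len(y))]

y = [1.0, 3.0, 6.0, 10.0, 15.0]   # series with a quadratic trend
print(diff(y))        # [2.0, 3.0, 4.0, 5.0]
print(diff(diff(y)))  # [1.0, 1.0, 1.0] -- constant: trend removed
```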

For example, the ARIMA(1,1,1) model is

y_t − y_{t−1} = C + φ_1 (y_{t−1} − y_{t−2}) − θ_1 α_{t−1} + α_t.

In compact notation we can write this model as Φ(B) ∇ y_t = C + Θ(B) α_t, with Φ(B) = (1 − φ_1 B) and Θ(B) = (1 − θ_1 B). We can see that d stands for the number of times a realization of a series must be differenced to achieve a stationary mean; usually not more than twice, i.e. usually d ≤ 2. Remarks on d: (1) If a series is nonstationary, its estimated ACF will decay very slowly, and we should try differencing the series to produce one that is stationary. (2) If the estimated AR coefficients do not satisfy the AR stationarity conditions, we should try differencing.

The I in the term ARIMA

(3) If we look at a time series plot and see that segments of the series differ only in level, we should try d = 1. (4) If we look at a time series plot and see that both the level and the slope of segments of the series appear to be changing through time, we should try d = 2. Fitting an ARIMA(p,d,q) model to an original series y_t consists of first differencing y_t d times, yielding a differenced series z_t, to which we then fit an ARMA(p,q) (i.e. ARIMA(p,0,q)) model. Since we differenced the original series d times, to get back to the original series we must sum (often called integrate) the differenced series d times: while the z's are differences of the y's, the y's are sums of the z's. The I in the term ARIMA refers to this summing or integrating. For example, consider a differenced series z_t,

z_t = (1 − B) y_t;

then in terms of y_t we have

y_t = (1 − B)^{-1} z_t = (1 + B + B² + ···) z_t = z_t + z_{t−1} + z_{t−2} + ···,

which shows that the original series is a sum of the differenced series.

Estimation

The coefficients of an ARIMA model must be obtained via some criterion. The most common criteria are to choose the values of C, φ and θ that maximize the likelihood, or to choose the estimates that give the smallest sum of squared residuals (least squares). We will discuss the LS procedure briefly. Consider the AR(1) model

(1 − φ_1 B) z_t = C + α_t, or z_t = C + φ_1 z_{t−1} + α_t.

If we take expectations on both sides, then because of the stationarity of the time series we have µ = C + φ_1 µ + 0, so that C = µ(1 − φ_1),

and we can rewrite the model as

z_t = µ(1 − φ_1) + φ_1 z_{t−1} + α_t.

Now if we had a good estimate φ̂_1 of φ_1 and µ̂ of µ, we could estimate the residual by α̂_t = z_t − ẑ_t, where ẑ_t = µ̂(1 − φ̂_1) + φ̂_1 z_{t−1}. Thus we select the parameter values µ̂ and φ̂_1 that minimize the sum of squared residuals,

SSR = Σ_t α̂_t² = Σ_t [z_t − µ̂(1 − φ̂_1) − φ̂_1 z_{t−1}]².

If µ̂ and φ̂_1 are such that SSR is a minimum, we have the least squares estimates.

Root-Mean-Squared Error

The following quantity is called the root-mean-squared error:

σ̂_α = √(SSR / (n − m)),

where n is the number of available time series observations and m is the number of estimated parameters. If we are comparing two models, we select the one with the smaller σ̂_α, which is called the root-mean-squared error, or the estimated standard deviation of the random shocks.
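For the AR(1) case the least-squares idea can be sketched with a simple regression of z_t on z_{t−1} (conditional least squares; fit_ar1 is our own illustrative helper, not a library function):

```python
import math

# Conditional least-squares fit of z_t = c + phi1 * z_{t-1} + a_t by
# ordinary regression of z_t on z_{t-1}, followed by the
# root-mean-squared error sqrt(SSR / (n - m)) with m = 2 parameters.
def fit_ar1(z):
    x, y = z[:-1], z[1:]
    k = len(y)
    xbar, ybar = sum(x) / k, sum(y) / k
    phi1 = (sum((xi - xbar) * (yi - ybar) for xi, yi in zip(x, y))
            / sum((xi - xbar) ** 2 for xi in x))
    c = ybar - phi1 * xbar
    ssr = sum((yi - c - phi1 * xi) ** 2 for xi, yi in zip(x, y))
    return phi1, c, math.sqrt(ssr / (len(z) - 2))

# A noiseless AR(1) path z_t = 0.5 * z_{t-1} is recovered exactly,
# with zero residual error.
phi1, c, rmse = fit_ar1([8.0, 4.0, 2.0, 1.0, 0.5])
print(phi1, c, rmse)  # 0.5 0.0 0.0
```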

The Ljung-Box χ² Test Statistic

An ARIMA model assumes that the random shocks are independent. A successful model should therefore have residual random shocks that are uncorrelated; if the model is not adequate to explain the data, the residuals will contain effects not accounted for by the AR and MA terms, so the residual shocks will be correlated and will not have zero autocorrelations. We may therefore construct a goodness-of-fit test by testing whether a set of autocorrelations at lags, say, 1 to k are all zero. The null hypothesis for the shocks α is

H_0: ρ_1(α) = ρ_2(α) = ··· = ρ_k(α) = 0,

with test statistic

Q = n(n + 2) Σ_{i=1}^{k} (n − i)^{-1} r_i²(α̂),

where n is the number of observations used to estimate the model.
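Given the residual autocorrelations r_1, ..., r_k, the statistic is a one-liner; a Python sketch (ljung_box_q is our own name; in practice one compares Q to a χ² quantile, as explained next):

```python
# Ljung-Box statistic Q = n(n+2) * sum_{i=1}^{k} r_i^2 / (n - i),
# where r_i are the autocorrelations of the residual shocks and n is
# the number of observations used to estimate the model.
def ljung_box_q(r, n):
    return n * (n + 2) * sum(ri ** 2 / (n - i) for i, ri in enumerate(r, start=1))

q = ljung_box_q([0.1, 0.2], n=100)
print(round(q, 4))  # 5.1936
```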

This statistic has approximately a χ²-distribution with (k − m) degrees of freedom, where m is the number of parameters estimated. Basically we reject for large Q; if we do not reject the null, this is evidence that the shocks estimated from the model have the required property of being uncorrelated. Note: some programs use the Box-Pierce statistic, which performs the same test. For large sample sizes the two are equivalent; for small sample sizes the Ljung-Box statistic follows more closely the χ²-distribution used to calculate the p-value of the test.

Forecasting

Suppose we have fitted the ARIMA(1,0,1) model

ẑ_t = µ̂(1 − φ̂_1) + φ̂_1 z_{t−1} − θ̂_1 α̂_{t−1},

and now we want to forecast ahead, say, l time points, i.e. we want to forecast (estimate) z_{t+l}. Now the theoretical ARIMA(1,0,1) model states that

z_t = µ(1 − φ_1) + φ_1 z_{t−1} − θ_1 α_{t−1} + α_t.

Now the best estimate of z_{t+1} (which is a random variable) is its mathematical expectation given our knowledge of past values as specified by the model, in this case the ARIMA(1,0,1) model. Let I_t be the set of past values used in the model; then

E(z_{t+1} | I_t) = µ(1 − φ_1) + φ_1 z_t − θ_1 α_t,

where we use E(α_{t+1}) = 0. We do not know the parameter values, so we replace them by the estimates from the fitted ARIMA(1,0,1) model above. Thus we estimate z_{t+1} by ẑ_t(1), where

ẑ_t(1) = µ̂(1 − φ̂_1) + φ̂_1 z_t − θ̂_1 α̂_t.

We then estimate z_{t+2} by ẑ_t(2), where

ẑ_t(2) = µ̂(1 − φ̂_1) + φ̂_1 ẑ_t(1) − θ̂_1 α_{t+1};

since we do not know α_{t+1}, we replace it by its expected value of zero, and the estimate becomes

ẑ_t(2) = µ̂(1 − φ̂_1) + φ̂_1 ẑ_t(1).
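This recursion is easy to code; a Python sketch with made-up illustrative parameter values (forecast_arima101 is our own helper):

```python
# l-step-ahead forecasts from a fitted ARIMA(1,0,1). The MA term enters
# only the one-step forecast; for later steps the unknown future shocks
# are replaced by their expected value of zero, so each forecast feeds
# the previous one back through the AR part.
def forecast_arima101(mu, phi1, theta1, z_t, alpha_t, steps):
    forecasts = [mu * (1 - phi1) + phi1 * z_t - theta1 * alpha_t]
    for _ in range(steps - 1):
        forecasts.append(mu * (1 - phi1) + phi1 * forecasts[-1])
    return forecasts

f = forecast_arima101(mu=10.0, phi1=0.5, theta1=0.3, z_t=14.0, alpha_t=2.0, steps=3)
# f is approximately [11.4, 10.7, 10.35], converging toward mu = 10
```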

Similarly we can build up a forecast for any future value. Of course, the further we try to forecast into the future, the larger the standard error of the forecast will be. Note that for stationary models the forecasts converge toward the mean of the series. Computer programs have been written to produce these forecasts and their standard errors; in particular, R contains time series packages for time series analysis, one for the Box-Jenkins approach and one for the spectral analysis approach.

Models for Seasonal Data

Sometimes a time series has a pattern that repeats, say, every s time periods; when this happens we call the data periodic. An example of periodic data is data with seasonal variation. Generally, to remove the seasonal variation we must difference the observations by length s; that is, we must difference observations separated by s time units, i.e. at lags s, 2s, 3s, etc. The basic idea is that observations s time periods apart, {z_t, z_{t−s}, z_{t−2s}, ...}, are similar. Observations that are similar every s time points are correlated, so we expect nonzero correlations between observations

separated by s time points, or by 2s time points, and so on. Thus if we construct an ACF or a PACF we expect to find nonzero correlations at lags s, 2s, and so on. If the series lacks correlations at all lags except those that are multiples of s, we have a purely seasonal process. An autoregressive model for such data can be written using one seasonal autoregressive parameter Φ:

z_t = C + Φ z_{t−s} + α_t, or (1 − Φ B^s) z_t = C + α_t.

We may have to perform first-degree, second-degree, or D-th degree differencing of length s to obtain a stationary series, that is,

∇_s^D z_t = (1 − B^s)^D z_t.

For example, if D = 1,

∇_s z_t = (1 − B^s) z_t = z_t − z_{t−s}.

Recall that the general model for the ARIMA(p,d,q) process is

Φ_p(B) ∇^d z_t = C + Θ_q(B) α_t.

The general model for the seasonal model is

Φ_P(B^s) ∇_s^D z_t = C + Θ_Q(B^s) α_t.

One way to combine the seasonal model with the local ARIMA model is to multiply them; the general ARIMA(p,d,q)(P,D,Q)_s model is then

Φ_p(B) Φ_P(B^s) ∇^d ∇_s^D z_t = C + Θ_Q(B^s) Θ_q(B) α_t.

For example, ARIMA(0,0,1)(0,1,1)_4 is a model which assumes that the process being modelled has periodicity 4: since s = 4 and D = 1, z_t is differenced once by length four. Since d = 0, there is no nonseasonal differencing for the local effect. Since Q = 1 we have one seasonal MA term at lag 4. Again, since

q = 1, we have one nonseasonal MA term. Thus the model is

z_t − z_{t−4} = C + (1 − Θ_1 B^4)(1 − θ_1 B) α_t
= C + (1 − Θ_1 B^4)(α_t − θ_1 α_{t−1})
= C + (α_t − θ_1 α_{t−1}) − Θ_1 B^4 (α_t − θ_1 α_{t−1})
= C + α_t − θ_1 α_{t−1} − Θ_1 α_{t−4} + Θ_1 θ_1 α_{t−5},

so that

z_t = C + z_{t−4} + α_t − θ_1 α_{t−1} − Θ_1 α_{t−4} + Θ_1 θ_1 α_{t−5}.

All these models are easily fitted using R; however, finding the right model takes time and a lot of experience. Most of this chapter is based on the following two books.

References

(1) Time Series Analysis and Forecasting, by O. D. Anderson.
(2) Forecasting with Univariate Box-Jenkins Models, by A. Pankratz.


More information

Stochastic Modelling Solutions to Exercises on Time Series

Stochastic Modelling Solutions to Exercises on Time Series Stochastic Modelling Solutions to Exercises on Time Series Dr. Iqbal Owadally March 3, 2003 Solutions to Elementary Problems Q1. (i) (1 0.5B)X t = Z t. The characteristic equation 1 0.5z = 0 does not have

More information

Autoregressive Moving Average (ARMA) Models and their Practical Applications

Autoregressive Moving Average (ARMA) Models and their Practical Applications Autoregressive Moving Average (ARMA) Models and their Practical Applications Massimo Guidolin February 2018 1 Essential Concepts in Time Series Analysis 1.1 Time Series and Their Properties Time series:

More information

6 NONSEASONAL BOX-JENKINS MODELS

6 NONSEASONAL BOX-JENKINS MODELS 6 NONSEASONAL BOX-JENKINS MODELS In this section, we will discuss a class of models for describing time series commonly referred to as Box-Jenkins models. There are two types of Box-Jenkins models, seasonal

More information

Forecasting using R. Rob J Hyndman. 2.3 Stationarity and differencing. Forecasting using R 1

Forecasting using R. Rob J Hyndman. 2.3 Stationarity and differencing. Forecasting using R 1 Forecasting using R Rob J Hyndman 2.3 Stationarity and differencing Forecasting using R 1 Outline 1 Stationarity 2 Differencing 3 Unit root tests 4 Lab session 10 5 Backshift notation Forecasting using

More information

Forecasting using R. Rob J Hyndman. 2.4 Non-seasonal ARIMA models. Forecasting using R 1

Forecasting using R. Rob J Hyndman. 2.4 Non-seasonal ARIMA models. Forecasting using R 1 Forecasting using R Rob J Hyndman 2.4 Non-seasonal ARIMA models Forecasting using R 1 Outline 1 Autoregressive models 2 Moving average models 3 Non-seasonal ARIMA models 4 Partial autocorrelations 5 Estimation

More information

Basics: Definitions and Notation. Stationarity. A More Formal Definition

Basics: Definitions and Notation. Stationarity. A More Formal Definition Basics: Definitions and Notation A Univariate is a sequence of measurements of the same variable collected over (usually regular intervals of) time. Usual assumption in many time series techniques is that

More information

A SEASONAL TIME SERIES MODEL FOR NIGERIAN MONTHLY AIR TRAFFIC DATA

A SEASONAL TIME SERIES MODEL FOR NIGERIAN MONTHLY AIR TRAFFIC DATA www.arpapress.com/volumes/vol14issue3/ijrras_14_3_14.pdf A SEASONAL TIME SERIES MODEL FOR NIGERIAN MONTHLY AIR TRAFFIC DATA Ette Harrison Etuk Department of Mathematics/Computer Science, Rivers State University

More information

Lecture 19 Box-Jenkins Seasonal Models

Lecture 19 Box-Jenkins Seasonal Models Lecture 19 Box-Jenkins Seasonal Models If the time series is nonstationary with respect to its variance, then we can stabilize the variance of the time series by using a pre-differencing transformation.

More information

Lesson 13: Box-Jenkins Modeling Strategy for building ARMA models

Lesson 13: Box-Jenkins Modeling Strategy for building ARMA models Lesson 13: Box-Jenkins Modeling Strategy for building ARMA models Facoltà di Economia Università dell Aquila umberto.triacca@gmail.com Introduction In this lesson we present a method to construct an ARMA(p,

More information

Ch 8. MODEL DIAGNOSTICS. Time Series Analysis

Ch 8. MODEL DIAGNOSTICS. Time Series Analysis Model diagnostics is concerned with testing the goodness of fit of a model and, if the fit is poor, suggesting appropriate modifications. We shall present two complementary approaches: analysis of residuals

More information

MODELING INFLATION RATES IN NIGERIA: BOX-JENKINS APPROACH. I. U. Moffat and A. E. David Department of Mathematics & Statistics, University of Uyo, Uyo

MODELING INFLATION RATES IN NIGERIA: BOX-JENKINS APPROACH. I. U. Moffat and A. E. David Department of Mathematics & Statistics, University of Uyo, Uyo Vol.4, No.2, pp.2-27, April 216 MODELING INFLATION RATES IN NIGERIA: BOX-JENKINS APPROACH I. U. Moffat and A. E. David Department of Mathematics & Statistics, University of Uyo, Uyo ABSTRACT: This study

More information

Part 1. Multiple Choice (50 questions, 1 point each) Part 2. Problems/Short Answer (10 questions, 5 points each)

Part 1. Multiple Choice (50 questions, 1 point each) Part 2. Problems/Short Answer (10 questions, 5 points each) GROUND RULES: This exam contains two parts: Part 1. Multiple Choice (50 questions, 1 point each) Part 2. Problems/Short Answer (10 questions, 5 points each) The maximum number of points on this exam is

More information

Modelling Monthly Rainfall Data of Port Harcourt, Nigeria by Seasonal Box-Jenkins Methods

Modelling Monthly Rainfall Data of Port Harcourt, Nigeria by Seasonal Box-Jenkins Methods International Journal of Sciences Research Article (ISSN 2305-3925) Volume 2, Issue July 2013 http://www.ijsciences.com Modelling Monthly Rainfall Data of Port Harcourt, Nigeria by Seasonal Box-Jenkins

More information

FORECASTING SUGARCANE PRODUCTION IN INDIA WITH ARIMA MODEL

FORECASTING SUGARCANE PRODUCTION IN INDIA WITH ARIMA MODEL FORECASTING SUGARCANE PRODUCTION IN INDIA WITH ARIMA MODEL B. N. MANDAL Abstract: Yearly sugarcane production data for the period of - to - of India were analyzed by time-series methods. Autocorrelation

More information

The ARIMA Procedure: The ARIMA Procedure

The ARIMA Procedure: The ARIMA Procedure Page 1 of 120 Overview: ARIMA Procedure Getting Started: ARIMA Procedure The Three Stages of ARIMA Modeling Identification Stage Estimation and Diagnostic Checking Stage Forecasting Stage Using ARIMA Procedure

More information

Chapter 8: Model Diagnostics

Chapter 8: Model Diagnostics Chapter 8: Model Diagnostics Model diagnostics involve checking how well the model fits. If the model fits poorly, we consider changing the specification of the model. A major tool of model diagnostics

More information

Univariate Time Series Analysis; ARIMA Models

Univariate Time Series Analysis; ARIMA Models Econometrics 2 Fall 24 Univariate Time Series Analysis; ARIMA Models Heino Bohn Nielsen of4 Outline of the Lecture () Introduction to univariate time series analysis. (2) Stationarity. (3) Characterizing

More information

Forecasting: principles and practice 1

Forecasting: principles and practice 1 Forecasting: principles and practice Rob J Hyndman 2.3 Stationarity and differencing Forecasting: principles and practice 1 Outline 1 Stationarity 2 Differencing 3 Unit root tests 4 Lab session 10 5 Backshift

More information

Some Time-Series Models

Some Time-Series Models Some Time-Series Models Outline 1. Stochastic processes and their properties 2. Stationary processes 3. Some properties of the autocorrelation function 4. Some useful models Purely random processes, random

More information

1. Fundamental concepts

1. Fundamental concepts . Fundamental concepts A time series is a sequence of data points, measured typically at successive times spaced at uniform intervals. Time series are used in such fields as statistics, signal processing

More information

ECONOMETRIA II. CURSO 2009/2010 LAB # 3

ECONOMETRIA II. CURSO 2009/2010 LAB # 3 ECONOMETRIA II. CURSO 2009/2010 LAB # 3 BOX-JENKINS METHODOLOGY The Box Jenkins approach combines the moving average and the autorregresive models. Although both models were already known, the contribution

More information

Empirical Market Microstructure Analysis (EMMA)

Empirical Market Microstructure Analysis (EMMA) Empirical Market Microstructure Analysis (EMMA) Lecture 3: Statistical Building Blocks and Econometric Basics Prof. Dr. Michael Stein michael.stein@vwl.uni-freiburg.de Albert-Ludwigs-University of Freiburg

More information

arxiv: v1 [stat.me] 5 Nov 2008

arxiv: v1 [stat.me] 5 Nov 2008 arxiv:0811.0659v1 [stat.me] 5 Nov 2008 Estimation of missing data by using the filtering process in a time series modeling Ahmad Mahir R. and Al-khazaleh A. M. H. School of Mathematical Sciences Faculty

More information

Prof. Dr. Roland Füss Lecture Series in Applied Econometrics Summer Term Introduction to Time Series Analysis

Prof. Dr. Roland Füss Lecture Series in Applied Econometrics Summer Term Introduction to Time Series Analysis Introduction to Time Series Analysis 1 Contents: I. Basics of Time Series Analysis... 4 I.1 Stationarity... 5 I.2 Autocorrelation Function... 9 I.3 Partial Autocorrelation Function (PACF)... 14 I.4 Transformation

More information

Module 3. Descriptive Time Series Statistics and Introduction to Time Series Models

Module 3. Descriptive Time Series Statistics and Introduction to Time Series Models Module 3 Descriptive Time Series Statistics and Introduction to Time Series Models Class notes for Statistics 451: Applied Time Series Iowa State University Copyright 2015 W Q Meeker November 11, 2015

More information

Time Series Outlier Detection

Time Series Outlier Detection Time Series Outlier Detection Tingyi Zhu July 28, 2016 Tingyi Zhu Time Series Outlier Detection July 28, 2016 1 / 42 Outline Time Series Basics Outliers Detection in Single Time Series Outlier Series Detection

More information

Suan Sunandha Rajabhat University

Suan Sunandha Rajabhat University Forecasting Exchange Rate between Thai Baht and the US Dollar Using Time Series Analysis Kunya Bowornchockchai Suan Sunandha Rajabhat University INTRODUCTION The objective of this research is to forecast

More information

Seasonal Autoregressive Integrated Moving Average Model for Precipitation Time Series

Seasonal Autoregressive Integrated Moving Average Model for Precipitation Time Series Journal of Mathematics and Statistics 8 (4): 500-505, 2012 ISSN 1549-3644 2012 doi:10.3844/jmssp.2012.500.505 Published Online 8 (4) 2012 (http://www.thescipub.com/jmss.toc) Seasonal Autoregressive Integrated

More information

Ch 5. Models for Nonstationary Time Series. Time Series Analysis

Ch 5. Models for Nonstationary Time Series. Time Series Analysis We have studied some deterministic and some stationary trend models. However, many time series data cannot be modeled in either way. Ex. The data set oil.price displays an increasing variation from the

More information

Classic Time Series Analysis

Classic Time Series Analysis Classic Time Series Analysis Concepts and Definitions Let Y be a random number with PDF f Y t ~f,t Define t =E[Y t ] m(t) is known as the trend Define the autocovariance t, s =COV [Y t,y s ] =E[ Y t t

More information

Time Series 4. Robert Almgren. Oct. 5, 2009

Time Series 4. Robert Almgren. Oct. 5, 2009 Time Series 4 Robert Almgren Oct. 5, 2009 1 Nonstationarity How should you model a process that has drift? ARMA models are intrinsically stationary, that is, they are mean-reverting: when the value of

More information

5 Autoregressive-Moving-Average Modeling

5 Autoregressive-Moving-Average Modeling 5 Autoregressive-Moving-Average Modeling 5. Purpose. Autoregressive-moving-average (ARMA models are mathematical models of the persistence, or autocorrelation, in a time series. ARMA models are widely

More information

CHAPTER 8 MODEL DIAGNOSTICS. 8.1 Residual Analysis

CHAPTER 8 MODEL DIAGNOSTICS. 8.1 Residual Analysis CHAPTER 8 MODEL DIAGNOSTICS We have now discussed methods for specifying models and for efficiently estimating the parameters in those models. Model diagnostics, or model criticism, is concerned with testing

More information

STAT 443 Final Exam Review. 1 Basic Definitions. 2 Statistical Tests. L A TEXer: W. Kong

STAT 443 Final Exam Review. 1 Basic Definitions. 2 Statistical Tests. L A TEXer: W. Kong STAT 443 Final Exam Review L A TEXer: W Kong 1 Basic Definitions Definition 11 The time series {X t } with E[X 2 t ] < is said to be weakly stationary if: 1 µ X (t) = E[X t ] is independent of t 2 γ X

More information

Lecture 2: Univariate Time Series

Lecture 2: Univariate Time Series Lecture 2: Univariate Time Series Analysis: Conditional and Unconditional Densities, Stationarity, ARMA Processes Prof. Massimo Guidolin 20192 Financial Econometrics Spring/Winter 2017 Overview Motivation:

More information

Estimation and application of best ARIMA model for forecasting the uranium price.

Estimation and application of best ARIMA model for forecasting the uranium price. Estimation and application of best ARIMA model for forecasting the uranium price. Medeu Amangeldi May 13, 2018 Capstone Project Superviser: Dongming Wei Second reader: Zhenisbek Assylbekov Abstract This

More information

Stochastic Processes

Stochastic Processes Stochastic Processes Stochastic Process Non Formal Definition: Non formal: A stochastic process (random process) is the opposite of a deterministic process such as one defined by a differential equation.

More information

White Noise Processes (Section 6.2)

White Noise Processes (Section 6.2) White Noise Processes (Section 6.) Recall that covariance stationary processes are time series, y t, such. E(y t ) = µ for all t. Var(y t ) = σ for all t, σ < 3. Cov(y t,y t-τ ) = γ(τ) for all t and τ

More information

2. An Introduction to Moving Average Models and ARMA Models

2. An Introduction to Moving Average Models and ARMA Models . An Introduction to Moving Average Models and ARMA Models.1 White Noise. The MA(1) model.3 The MA(q) model..4 Estimation and forecasting of MA models..5 ARMA(p,q) models. The Moving Average (MA) models

More information

Decision 411: Class 9. HW#3 issues

Decision 411: Class 9. HW#3 issues Decision 411: Class 9 Presentation/discussion of HW#3 Introduction to ARIMA models Rules for fitting nonseasonal models Differencing and stationarity Reading the tea leaves : : ACF and PACF plots Unit

More information

AR(p) + I(d) + MA(q) = ARIMA(p, d, q)

AR(p) + I(d) + MA(q) = ARIMA(p, d, q) AR(p) + I(d) + MA(q) = ARIMA(p, d, q) Outline 1 4.1: Nonstationarity in the Mean 2 ARIMA Arthur Berg AR(p) + I(d)+ MA(q) = ARIMA(p, d, q) 2/ 19 Deterministic Trend Models Polynomial Trend Consider the

More information

{ } Stochastic processes. Models for time series. Specification of a process. Specification of a process. , X t3. ,...X tn }

{ } Stochastic processes. Models for time series. Specification of a process. Specification of a process. , X t3. ,...X tn } Stochastic processes Time series are an example of a stochastic or random process Models for time series A stochastic process is 'a statistical phenomenon that evolves in time according to probabilistic

More information

Chapter 3: Regression Methods for Trends

Chapter 3: Regression Methods for Trends Chapter 3: Regression Methods for Trends Time series exhibiting trends over time have a mean function that is some simple function (not necessarily constant) of time. The example random walk graph from

More information

7. Forecasting with ARIMA models

7. Forecasting with ARIMA models 7. Forecasting with ARIMA models 309 Outline: Introduction The prediction equation of an ARIMA model Interpreting the predictions Variance of the predictions Forecast updating Measuring predictability

More information

Asian Economic and Financial Review. SEASONAL ARIMA MODELLING OF NIGERIAN MONTHLY CRUDE OIL PRICES Ette Harrison Etuk

Asian Economic and Financial Review. SEASONAL ARIMA MODELLING OF NIGERIAN MONTHLY CRUDE OIL PRICES Ette Harrison Etuk Asian Economic and Financial Review journal homepage: http://aessweb.com/journal-detail.php?id=5002 SEASONAL ARIMA MODELLING OF NIGERIAN MONTHLY CRUDE OIL PRICES Ette Harrison Etuk Department of Mathematics/Computer

More information

MCMC analysis of classical time series algorithms.

MCMC analysis of classical time series algorithms. MCMC analysis of classical time series algorithms. mbalawata@yahoo.com Lappeenranta University of Technology Lappeenranta, 19.03.2009 Outline Introduction 1 Introduction 2 3 Series generation Box-Jenkins

More information

Multiplicative Sarima Modelling Of Nigerian Monthly Crude Oil Domestic Production

Multiplicative Sarima Modelling Of Nigerian Monthly Crude Oil Domestic Production Journal of Applied Mathematics & Bioinformatics, vol.3, no.3, 2013, 103-112 ISSN: 1792-6602 (print), 1792-6939 (online) Scienpress Ltd, 2013 Multiplicative Sarima Modelling Of Nigerian Monthly Crude Oil

More information

Applied time-series analysis

Applied time-series analysis Robert M. Kunst robert.kunst@univie.ac.at University of Vienna and Institute for Advanced Studies Vienna October 18, 2011 Outline Introduction and overview Econometric Time-Series Analysis In principle,

More information

STAT Financial Time Series

STAT Financial Time Series STAT 6104 - Financial Time Series Chapter 4 - Estimation in the time Domain Chun Yip Yau (CUHK) STAT 6104:Financial Time Series 1 / 46 Agenda 1 Introduction 2 Moment Estimates 3 Autoregressive Models (AR

More information

AR, MA and ARMA models

AR, MA and ARMA models AR, MA and AR by Hedibert Lopes P Based on Tsay s Analysis of Financial Time Series (3rd edition) P 1 Stationarity 2 3 4 5 6 7 P 8 9 10 11 Outline P Linear Time Series Analysis and Its Applications For

More information

The Identification of ARIMA Models

The Identification of ARIMA Models APPENDIX 4 The Identification of ARIMA Models As we have established in a previous lecture, there is a one-to-one correspondence between the parameters of an ARMA(p, q) model, including the variance of

More information

Read Section 1.1, Examples of time series, on pages 1-8. These example introduce the book; you are not tested on them.

Read Section 1.1, Examples of time series, on pages 1-8. These example introduce the book; you are not tested on them. TS Module 1 Time series overview (The attached PDF file has better formatting.)! Model building! Time series plots Read Section 1.1, Examples of time series, on pages 1-8. These example introduce the book;

More information

Multivariate Time Series Analysis and Its Applications [Tsay (2005), chapter 8]

Multivariate Time Series Analysis and Its Applications [Tsay (2005), chapter 8] 1 Multivariate Time Series Analysis and Its Applications [Tsay (2005), chapter 8] Insights: Price movements in one market can spread easily and instantly to another market [economic globalization and internet

More information

3 Theory of stationary random processes

3 Theory of stationary random processes 3 Theory of stationary random processes 3.1 Linear filters and the General linear process A filter is a transformation of one random sequence {U t } into another, {Y t }. A linear filter is a transformation

More information

SOME BASICS OF TIME-SERIES ANALYSIS

SOME BASICS OF TIME-SERIES ANALYSIS SOME BASICS OF TIME-SERIES ANALYSIS John E. Floyd University of Toronto December 8, 26 An excellent place to learn about time series analysis is from Walter Enders textbook. For a basic understanding of

More information

Part 1. Multiple Choice (40 questions, 1 point each) Part 2. Problems/Short Answer (10 questions, 6 points each)

Part 1. Multiple Choice (40 questions, 1 point each) Part 2. Problems/Short Answer (10 questions, 6 points each) GROUND RULES: This exam contains two parts: Part 1. Multiple Choice (40 questions, 1 point each) Part 2. Problems/Short Answer (10 questions, 6 points each) The maximum number of points on this exam is

More information

Sugarcane Productivity in Bihar- A Forecast through ARIMA Model

Sugarcane Productivity in Bihar- A Forecast through ARIMA Model Available online at www.ijpab.com Kumar et al Int. J. Pure App. Biosci. 5 (6): 1042-1051 (2017) ISSN: 2320 7051 DOI: http://dx.doi.org/10.18782/2320-7051.5838 ISSN: 2320 7051 Int. J. Pure App. Biosci.

More information

Statistics of stochastic processes

Statistics of stochastic processes Introduction Statistics of stochastic processes Generally statistics is performed on observations y 1,..., y n assumed to be realizations of independent random variables Y 1,..., Y n. 14 settembre 2014

More information

arxiv: v1 [stat.co] 11 Dec 2012

arxiv: v1 [stat.co] 11 Dec 2012 Simulating the Continuation of a Time Series in R December 12, 2012 arxiv:1212.2393v1 [stat.co] 11 Dec 2012 Halis Sak 1 Department of Industrial and Systems Engineering, Yeditepe University, Kayışdağı,

More information

Chapter 4: Models for Stationary Time Series

Chapter 4: Models for Stationary Time Series Chapter 4: Models for Stationary Time Series Now we will introduce some useful parametric models for time series that are stationary processes. We begin by defining the General Linear Process. Let {Y t

More information

Week 5 Quantitative Analysis of Financial Markets Characterizing Cycles

Week 5 Quantitative Analysis of Financial Markets Characterizing Cycles Week 5 Quantitative Analysis of Financial Markets Characterizing Cycles Christopher Ting http://www.mysmu.edu/faculty/christophert/ Christopher Ting : christopherting@smu.edu.sg : 6828 0364 : LKCSB 5036

More information

STAT 436 / Lecture 16: Key

STAT 436 / Lecture 16: Key STAT 436 / 536 - Lecture 16: Key Modeling Non-Stationary Time Series Many time series models are non-stationary. Recall a time series is stationary if the mean and variance are constant in time and the

More information

Part II. Time Series

Part II. Time Series Part II Time Series 12 Introduction This Part is mainly a summary of the book of Brockwell and Davis (2002). Additionally the textbook Shumway and Stoffer (2010) can be recommended. 1 Our purpose is to

More information

Stat 565. (S)Arima & Forecasting. Charlotte Wickham. stat565.cwick.co.nz. Feb

Stat 565. (S)Arima & Forecasting. Charlotte Wickham. stat565.cwick.co.nz. Feb Stat 565 (S)Arima & Forecasting Feb 2 2016 Charlotte Wickham stat565.cwick.co.nz Today A note from HW #3 Pick up with ARIMA processes Introduction to forecasting HW #3 The sample autocorrelation coefficients

More information

Analysis of Violent Crime in Los Angeles County

Analysis of Violent Crime in Los Angeles County Analysis of Violent Crime in Los Angeles County Xiaohong Huang UID: 004693375 March 20, 2017 Abstract Violent crime can have a negative impact to the victims and the neighborhoods. It can affect people

More information

Time Series 2. Robert Almgren. Sept. 21, 2009

Time Series 2. Robert Almgren. Sept. 21, 2009 Time Series 2 Robert Almgren Sept. 21, 2009 This week we will talk about linear time series models: AR, MA, ARMA, ARIMA, etc. First we will talk about theory and after we will talk about fitting the models

More information

UNIVARIATE TIME SERIES ANALYSIS BRIEFING 1970

UNIVARIATE TIME SERIES ANALYSIS BRIEFING 1970 UNIVARIATE TIME SERIES ANALYSIS BRIEFING 1970 Joseph George Caldwell, PhD (Statistics) 1432 N Camino Mateo, Tucson, AZ 85745-3311 USA Tel. (001)(520)222-3446, E-mail jcaldwell9@yahoo.com (File converted

More information

Exercises - Time series analysis

Exercises - Time series analysis Descriptive analysis of a time series (1) Estimate the trend of the series of gasoline consumption in Spain using a straight line in the period from 1945 to 1995 and generate forecasts for 24 months. Compare

More information

Acta Universitatis Carolinae. Mathematica et Physica

Acta Universitatis Carolinae. Mathematica et Physica Acta Universitatis Carolinae. Mathematica et Physica Jitka Zichová Some applications of time series models to financial data Acta Universitatis Carolinae. Mathematica et Physica, Vol. 52 (2011), No. 1,

More information

Volatility. Gerald P. Dwyer. February Clemson University

Volatility. Gerald P. Dwyer. February Clemson University Volatility Gerald P. Dwyer Clemson University February 2016 Outline 1 Volatility Characteristics of Time Series Heteroskedasticity Simpler Estimation Strategies Exponentially Weighted Moving Average Use

More information

Forecasting Area, Production and Yield of Cotton in India using ARIMA Model

Forecasting Area, Production and Yield of Cotton in India using ARIMA Model Forecasting Area, Production and Yield of Cotton in India using ARIMA Model M. K. Debnath 1, Kartic Bera 2 *, P. Mishra 1 1 Department of Agricultural Statistics, Bidhan Chanda Krishi Vishwavidyalaya,

More information

Ross Bettinger, Analytical Consultant, Seattle, WA

Ross Bettinger, Analytical Consultant, Seattle, WA ABSTRACT DYNAMIC REGRESSION IN ARIMA MODELING Ross Bettinger, Analytical Consultant, Seattle, WA Box-Jenkins time series models that contain exogenous predictor variables are called dynamic regression

More information

Quantitative Finance I

Quantitative Finance I Quantitative Finance I Linear AR and MA Models (Lecture 4) Winter Semester 01/013 by Lukas Vacha * If viewed in.pdf format - for full functionality use Mathematica 7 (or higher) notebook (.nb) version

More information

NANYANG TECHNOLOGICAL UNIVERSITY SEMESTER II EXAMINATION MAS451/MTH451 Time Series Analysis TIME ALLOWED: 2 HOURS

NANYANG TECHNOLOGICAL UNIVERSITY SEMESTER II EXAMINATION MAS451/MTH451 Time Series Analysis TIME ALLOWED: 2 HOURS NANYANG TECHNOLOGICAL UNIVERSITY SEMESTER II EXAMINATION 2012-2013 MAS451/MTH451 Time Series Analysis May 2013 TIME ALLOWED: 2 HOURS INSTRUCTIONS TO CANDIDATES 1. This examination paper contains FOUR (4)

More information

A Data-Driven Model for Software Reliability Prediction

A Data-Driven Model for Software Reliability Prediction A Data-Driven Model for Software Reliability Prediction Author: Jung-Hua Lo IEEE International Conference on Granular Computing (2012) Young Taek Kim KAIST SE Lab. 9/4/2013 Contents Introduction Background

More information

Firstly, the dataset is cleaned and the years and months are separated to provide better distinction (sample below).

Firstly, the dataset is cleaned and the years and months are separated to provide better distinction (sample below). Project: Forecasting Sales Step 1: Plan Your Analysis Answer the following questions to help you plan out your analysis: 1. Does the dataset meet the criteria of a time series dataset? Make sure to explore

More information

6. The econometrics of Financial Markets: Empirical Analysis of Financial Time Series. MA6622, Ernesto Mordecki, CityU, HK, 2006.

6. The econometrics of Financial Markets: Empirical Analysis of Financial Time Series. MA6622, Ernesto Mordecki, CityU, HK, 2006. 6. The econometrics of Financial Markets: Empirical Analysis of Financial Time Series MA6622, Ernesto Mordecki, CityU, HK, 2006. References for Lecture 5: Quantitative Risk Management. A. McNeil, R. Frey,

More information

Multivariate Time Series

Multivariate Time Series Multivariate Time Series Notation: I do not use boldface (or anything else) to distinguish vectors from scalars. Tsay (and many other writers) do. I denote a multivariate stochastic process in the form

More information

Ch3. TRENDS. Time Series Analysis

Ch3. TRENDS. Time Series Analysis 3.1 Deterministic Versus Stochastic Trends The simulated random walk in Exhibit 2.1 shows a upward trend. However, it is caused by a strong correlation between the series at nearby time points. The true

More information

ARIMA modeling to forecast area and production of rice in West Bengal

ARIMA modeling to forecast area and production of rice in West Bengal Journal of Crop and Weed, 9(2):26-31(2013) ARIMA modeling to forecast area and production of rice in West Bengal R. BISWAS AND B. BHATTACHARYYA Department of Agricultural Statistics Bidhan Chandra Krishi

More information

Module 4. Stationary Time Series Models Part 1 MA Models and Their Properties

Module 4. Stationary Time Series Models Part 1 MA Models and Their Properties Module 4 Stationary Time Series Models Part 1 MA Models and Their Properties Class notes for Statistics 451: Applied Time Series Iowa State University Copyright 2015 W. Q. Meeker. February 14, 2016 20h

More information

Multivariate Time Series: VAR(p) Processes and Models

Multivariate Time Series: VAR(p) Processes and Models Multivariate Time Series: VAR(p) Processes and Models A VAR(p) model, for p > 0 is X t = φ 0 + Φ 1 X t 1 + + Φ p X t p + A t, where X t, φ 0, and X t i are k-vectors, Φ 1,..., Φ p are k k matrices, with

More information