Electricity Consumption Forecasting in the Khan Younis Province Using Exponential Smoothing and Box-Jenkins Methods: A Modeling Viewpoint


The Islamic University of Gaza
Faculty of Science
Department of Mathematics

Electricity Consumption Forecasting in the Khan Younis Province Using Exponential Smoothing and Box-Jenkins Methods: A Modeling Viewpoint

Submitted By
RANA MAHMOUED ABU AL RISH

Supervised By
Dr. Bisher M. Iqelan

A THESIS SUBMITTED IN PARTIAL FULFILMENT OF THE REQUIREMENTS FOR THE DEGREE OF MASTER OF MATHEMATICS

June, 2015


To my parents... To my son Ryad... To my husband Ahmed... And to all knowledge seekers...

Contents

Acknowledgments
Abbreviation
Abstract
Literature Review
Introduction

Part I

1 Introduction
   1.1 Examples of Time Series
   1.2 Properties of Time Series
   1.3 Stationary Time Series

2 Box-Jenkins Methodology
   2.1 Models for Stationary Time Series
      General Linear Processes
      Autoregressive Process
      Moving Average Processes
      Autoregressive Moving Average Model
   2.2 Models for Non-Stationary Time Series
      Multiplicative Seasonal ARIMA Models
   2.3 Forecasting
      Model Identification
      Parameter Estimation of the SARIMA Model
      Diagnostics Checking of the Fitted Model
      Forecasting the Study Variable

3 Exponential Smoothing
   Introduction
   Classification of Exponential Smoothing Methods
   Point Forecasts for the Best-Known Methods
      Simple Exponential Smoothing (N,N Method)
      Holt's Linear Method (A,N Method)
      Damped Trend Method (A_d,A Method)
      Additive Damped Trend
      Holt-Winters Trend and Seasonality Method
      Additive Seasonality (A,A Method)
   General Point Forecasting Equations
   Innovations State Space Models for Exponential Smoothing
      ETS(A,N,N): Simple Exponential Smoothing with Additive Errors
      ETS(M,N,N): Simple Exponential Smoothing with Multiplicative Errors
      State Space Models for Holt's Linear Method
      State Space Models for All Exponential Smoothing Methods
   Initialization and Estimation
      Initialization
      Estimation and Model Selection
   Measure Error

Part II: Case Study

4 Analysis of Data Using the Box-Jenkins Method
   Data Description
   The Box-Jenkins Approach to Fitting an ARIMA Model: Model Specification

5 Analysis of Data Using Exponential Smoothing Methods
   Exponential Smoothing Model
   First Method: Simple Exponential Smoothing Model
   Second Method: Holt's Linear Trend Method
   Damped Trend Methods
   Holt-Winters Seasonal Method
   Summary

Conclusions
Recommendations
References
Appendix

List of Figures

1.1 Average Monthly Temperatures, Dubuque, Iowa
1.2 Monthly Temperatures, Dubuque, Iowa
2.1 Simulated AR(1) process with φ = 0.9
2.2 Simulated MA(1) process with θ = 0.9
Oil production in Saudi Arabia from 1996 to
Time series plot of electricity consumption in province Khan Younis monthly symbols
ACF for monthly electricity consumption
PACF for monthly electricity consumption
First difference of monthly electricity consumption
Residuals from the fitted ARIMA(2, 1, 2)(1, 0, 1)_12 model
Forecasts for monthly electricity consumption
Simple exponential smoothing applied to electricity consumption in province Khan Younis
Forecasts from simple exponential smoothing
Forecasts from Holt's linear method
Forecasts from damped Holt's method with exponential trend
Forecasting electricity data using the Holt-Winters method with both additive and multiplicative seasonality
Estimated components for the Holt-Winters method with additive and multiplicative seasonal components
5.7 Forecasting data using multiplicative seasonal components

List of Tables

2.1 Behavior of the ACF and the PACF for ARMA models
Descriptive statistics
P-values of the Augmented Dickey-Fuller (ADF) test and the Kwiatkowski-Phillips-Schmidt-Shin (KPSS) test for monthly electricity consumption
SARIMA model criteria for the monthly electricity consumption
Comparison between predicted data for 2011 using ARIMA(2, 1, 2)(1, 0, 1)_12 and actual data
Predicted data for the year 2011 using simple exponential smoothing with three different values of the smoothing parameter α
Error measures for simple exponential smoothing models
Comparison between actual data and predicted data for the year 2011 using Holt's linear method
Error measures for the Holt trend model
Comparison between actual data and predicted data for the year 2011 using the damped trend method
Error measures for damped trend models
Predicted data by month for the year 2011 using the Holt-Winters method with both additive and multiplicative seasonality
Error measures for the additive and multiplicative seasonal models
Error measures for the fitted models of all methods
5.10 Comparison between actual data and predicted data from ARIMA(2, 1, 2)(1, 0, 1)_12 and ETS(M, A, M)
5.11 Error measures for the ETS(M, A, M) and ARIMA(2, 1, 2)(1, 0, 1)_12 models

Acknowledgments

Praise be to Almighty ALLAH, who has always helped and guided me to bring this work to light. I am grateful to my supervisor, Dr. Bisher M. Iqelan, for suggesting the topic of the thesis and for his tremendous support and healthy ideas; it has been a privilege to work with him. My special thanks go to all members of the Mathematics Department at the Islamic University of Gaza for their help and teaching. Thanks also to my parents, my son, my husband, and my family members, who have always shadowed me with love and fortitude.

Abbreviation

ACF: Autocorrelation function
ACVF: Autocovariance function
ADF: Augmented Dickey-Fuller
AR(p): Autoregressive model of order p
ARMA(p, q): Autoregressive moving average model of order (p, q)
ARIMA(p, d, q): Integrated autoregressive moving average model of order (p, d, q)
AIC: Akaike's Information Criterion
AICc: AIC, bias corrected
BIC: Bayesian Information Criterion
KPSS: Kwiatkowski-Phillips-Schmidt-Shin
SARIMA: Seasonal autoregressive integrated moving average model
SES: Simple exponential smoothing
SIC: Schwarz's Information Criterion
MA(q): Moving average model of order q
MAE: Mean absolute error
MAPE: Mean absolute percentage error
MSE: Mean squared error
NID: Normally and independently distributed
PACF: Partial autocorrelation function
RMSE: Root mean squared error
RSS: Residual sum of squares
WN: White noise

Abstract

Time series analysis can be used to extract information hidden in data. The classical techniques for time series analysis are the linear time series models, including the moving average (MA), autoregressive (AR), autoregressive moving average (ARMA), and seasonal autoregressive integrated moving average (SARIMA) models. We present these models in detail, showing their important characteristics and the methods for finding their parameters, autocovariance, autocorrelation, and partial autocorrelation functions. We also present the exponential smoothing model in detail, together with its methods, such as the simple exponential smoothing model, Holt's linear method, the damped trend method, and the Holt-Winters trend and seasonality method. In this thesis we use Box-Jenkins models and exponential smoothing models to analyze the electricity data of the Khan Younis province over the study period, and we compare the two approaches in order to choose the best-fitting model for forecasting the data from Jan 2011 to Dec 2011. After the comparison, the best model is the exponential smoothing model. All computations use the R program.

Literature Review

Many scientists and researchers have studied time series. The mathematician Fourier first touched on the subject in 1807, when he represented time series as infinite series of the functions sin and cos; this representation has been named the Fourier series, and it was adopted by Schuster and Stokes (1906) and Beveridge (1922). The theory and practice of time series analysis have developed rapidly since the appearance in 1970 of the seminal work of George E. P. Box and Gwilym M. Jenkins, Time Series Analysis: Forecasting and Control, now available in its third edition (1994) with co-author Gregory C. Reinsel. Many books on time series have appeared since then, but some of them give too little practical application, while others give too little theoretical background. This thesis attempts to present both application and theory at a level accessible to a wide variety of students and practitioners; our approach is to mix application and theory throughout as they are naturally needed. Shumway and David (1999) presented examples of time series and Box-Jenkins models. Peter J. Brockwell and Richard A. Davis (2001) studied time series and forecasting. In 2003, Degerine, S. and Lambert-Lacroix studied concepts of time series. In 1998, the researchers Makridakis, S., et al. studied the exponential smoothing method for time series; in 2002, the researchers Celia F., Balaji V., Les S., Asish G., and Amar R. studied the simple exponential smoothing method and Holt-Winters and applied them to sales of women's clothing. In 2003, Simon studied the exponential smoothing method for time series and prepared formulas to calculate the average and σ. In 2008, Rob J. Hyndman, Anne B. Koehler, J. Keith Ord, and Ralph D. Snyder studied the exponential smoothing method and forecasting.

Introduction

The prediction of the future behavior of a time series is an important issue in the statistical sciences, since it is needed in all areas of life, such as forecasting weather conditions and air temperatures. Most countries base their plans and development programs on such advanced methods in order to reach more effective results, and censuses play a key role in building these plans and programs. These studies have produced a range of statistical and mathematical methods for prediction. One of the most important problems facing researchers when analyzing a time series is whether or not the series is stationary, which can affect the mathematical model.

In this thesis, we study the number of consumers of electricity in Khan Younis over the study period and treat the data using two classes of models:

1. Box-Jenkins models
2. Exponential smoothing models

in order to choose the best model for predicting the number of consumers.

This thesis is organized as follows. We start by recalling the background of time series and their properties in Chapter 1, which contains three sections: Examples of Time Series, Properties of Time Series, and Stationary Time Series. Chapter 2 introduces the Box-Jenkins models and contains three sections: models for stationary time series, covering the moving average (MA), autoregressive (AR), and autoregressive moving average (ARMA) models; models for non-stationary time series, covering the seasonal autoregressive integrated moving average (SARIMA) models; and forecasting. Chapter 3 discusses the exponential smoothing (ETS) models and the important methods of this class, such as the simple exponential smoothing model, Holt's linear method, the damped trend method, and the Holt-Winters trend and seasonality method, and we study the properties of all these methods. Chapters 4 and 5 give the results of applying the two classes of models to the electricity data.

Part I

Chapter 1
Introduction

In this chapter, we introduce some basic ideas of time series analysis, and we study some properties of time series in Section 1.2. The purposes of time series analysis are generally:
1. To understand or model the stochastic mechanism that gives rise to an observed series.
2. To predict or forecast the future values of a series based on the history of that series and, possibly, other related series or factors.
3. To describe the characteristics of these oscillations.
This chapter contains three sections: Examples of Time Series, Properties of Time Series, and Stationary Time Series.

1.1 Examples of Time Series

Definition. A time series is a set of observations $Y_t$, each one being recorded at a specific time t; equivalently, it is a sequence of data points, measured typically at successive time instants at uniform time intervals.

Example (Average Monthly Temperatures, Dubuque, Iowa). Figure 1.1 shows the average monthly temperatures in Dubuque, Iowa, from 1964 to 1976. The climate is warm during summer, when temperatures tend to be in the 70s, and very cold during winter, when temperatures tend to be in the 20s. The warmest month of the year is July, with an average maximum temperature of 82.8 degrees Fahrenheit, while the coldest month of the year is January, with an average minimum temperature of 16.9 degrees Fahrenheit. This time series displays a very regular pattern called seasonality. Seasonality for monthly values occurs when observations twelve months apart are related in some manner or another. All Januaries and Februaries are quite cold; they are similar in value and different from the temperatures of the warmer months of June, July, and August, for example. There is still variation among the January values and variation among the June values. Models for such series must accommodate this variation while preserving the similarities. Here the reason for the seasonality is well understood: the Northern Hemisphere's changing inclination toward the sun. For more details see [7].

Figure 1.1: Average Monthly Temperatures, Dubuque, Iowa

Figure 1.2: Monthly Temperatures, Dubuque, Iowa

1.2 Properties of Time Series

This section describes the fundamental concepts in the theory of time series models. In particular, we introduce the concepts of the mean, variance, and covariance functions, stationary processes, autocorrelation functions, and partial autocorrelation functions. For more details see [15] and [16].

Definition (Mean Function). For any time series $\{Y_t\}$ the mean function, denoted by $\mu_t$, is defined as
$$\mu_t = E(Y_t). \quad (1.2.1)$$

Definition (The Autocovariance Function). For any time series $\{Y_t\}$ the autocovariance function (ACVF), denoted by $\gamma_Y(t,s)$, is defined as the second moment product
$$\gamma_Y(t,s) = \operatorname{Cov}(Y_t, Y_s) = E[(Y_t - \mu_t)(Y_s - \mu_s)] = E(Y_t Y_s) - \mu_t \mu_s \quad (1.2.2\text{--}1.2.4)$$

for all time points s and t. When no confusion is possible about which time series we are referring to, we drop the subscript and write $\gamma_Y(t,s)$ as $\gamma_{t,s}$. It is clear that, for $s = t$, the autocovariance reduces to the variance, because
$$\gamma_Y(t,t) = \operatorname{Var}(Y_t) = E[Y_t - E(Y_t)]^2. \quad (1.2.5\text{--}1.2.6)$$
Note that $\gamma_Y(t,s) = \gamma_Y(s,t)$ and $|\gamma_{t,s}| \le \sqrt{\gamma_{t,t}\,\gamma_{s,s}}$.

Definition (The Autocorrelation Function). The autocorrelation function (ACF) of a time series $\{Y_t\}$, denoted by $\rho_{t,s}$, is defined as
$$\rho_{t,s} = \operatorname{Corr}(Y_t, Y_s) = \frac{\operatorname{Cov}(Y_t, Y_s)}{\sqrt{\operatorname{Var}(Y_t)\operatorname{Var}(Y_s)}}. \quad (1.2.7\text{--}1.2.8)$$
The ACF measures the linear predictability of the series at time t, say $Y_t$, using only the value $Y_s$. We note that $-1 \le \rho_{t,s} \le 1$; values of $\rho_{t,s}$ near $\pm 1$ indicate strong linear dependence, whereas values near zero indicate weak linear dependence, and if $\rho_{t,s} = 0$ we say that $Y_t$ and $Y_s$ are uncorrelated.

1.3 Stationary Time Series

The preceding definitions of the mean and autocovariance functions are completely general. Although we have not made any special assumptions about the behavior of the time series, many of the preceding examples have hinted that a sort of regularity may exist over time in the behavior of a time series. We introduce the notion of regularity using a concept called stationarity. [1]

Definition (Strict Stationarity). A time series $\{Y_t\}$ is said to be strictly stationary if the joint distribution of $\{Y_{t_1}, Y_{t_2}, \dots, Y_{t_n}\}$ is the same as the joint distribution of $\{Y_{t_1+h}, Y_{t_2+h}, \dots, Y_{t_n+h}\}$.

Definition (Weak Stationarity). A time series $\{Y_t\}$ is said to be weakly stationary if
1. the mean function $\mu_t$ is constant, and
2. the covariance function $\gamma_{s,t}$ depends on s and t only through their difference $|s - t|$.
In the literature, stationarity usually means weak stationarity unless otherwise specified. One important case where weak stationarity implies strict stationarity is when the time series is Gaussian, which means that the distributions of $\{Y_t\}$ are all multivariate Gaussian, i.e. the joint distribution $F_{Y_t, Y_{t+j_1}, \dots, Y_{t+j_n}}(y_t, y_{t+j_1}, \dots, y_{t+j_n})$ is Gaussian.

Example (Random walk). Let $\{S_t : t = 1, 2, \dots\}$ be a sequence of independent, identically distributed random variables, each with zero mean and variance $\sigma^2$. The observed time series $\{Y_t : t = 1, 2, \dots\}$ is constructed as follows:
$$Y_1 = S_1, \quad Y_2 = S_1 + S_2, \quad \dots, \quad Y_t = S_1 + S_2 + \dots + S_t.$$
Then $E(Y_t) = 0$ and $\operatorname{Var}(Y_t) = E(Y_t^2) = t\sigma^2$ for all t, and for $h \ge 0$
$$\gamma_Y(t+h, t) = \operatorname{Cov}(Y_{t+h}, Y_t) = \operatorname{Cov}(Y_t + S_{t+1} + \dots + S_{t+h}, Y_t) = \operatorname{Cov}(Y_t, Y_t) = \operatorname{Var}(Y_t) = t\sigma^2.$$
Since $\gamma_Y(t+h, t)$ depends on t, the series $\{Y_t\}$ is not stationary.
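The growing variance of the random walk can also be checked by simulation. The thesis's own computations use R; the following Python sketch is purely an illustrative aside (the seed, number of paths, and time points are arbitrary assumptions). It estimates $\operatorname{Cov}(Y_{t+h}, Y_t)$ across many simulated paths and confirms that it is close to $t\sigma^2$:

```python
import numpy as np

# Illustrative simulation (not thesis code): the autocovariance of a
# random walk depends on t, so the series is not stationary.
rng = np.random.default_rng(0)          # arbitrary seed
sigma2 = 1.0
n_paths, n_steps = 20000, 50
steps = rng.normal(0.0, np.sqrt(sigma2), size=(n_paths, n_steps))
Y = steps.cumsum(axis=1)                # Y_t = S_1 + ... + S_t

t, h = 10, 5                            # 1-based time t and lag h
cov_est = np.cov(Y[:, t + h - 1], Y[:, t - 1])[0, 1]
print(cov_est)                          # close to t * sigma2 = 10
```

Repeating this for several values of t shows the estimated autocovariance growing linearly in t, which is exactly the failure of weak stationarity noted above.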

Notation. Because the mean function $\mu_t = E(Y_t)$ of a stationary time series is independent of time t, we write $\mu_t = \mu$. Also, because the covariance function $\gamma_{s,t}$ of a stationary time series depends on s and t only through their difference, we may simplify the notation. Let $s = t + h$, where h represents the time shift or lag; then
$$\gamma(t+h, t) = \operatorname{Cov}(Y_{t+h}, Y_t) = E[(Y_{t+h} - \mu)(Y_t - \mu)] = E[(Y_h - \mu)(Y_0 - \mu)] = \gamma(h, 0)$$
does not depend on the time argument t. We have assumed that $\operatorname{Var}(Y_t) = \gamma(0,0) < \infty$. Henceforth, for convenience, we drop the second argument of $\gamma(h, 0)$.

Definition (The Autocovariance Function (ACVF)). The autocovariance function of a stationary time series will be written as
$$\gamma(h) = \operatorname{Cov}(Y_{t+h}, Y_t) = E[(Y_{t+h} - \mu)(Y_t - \mu)]. \quad (1.3.1\text{--}1.3.2)$$
A final useful property is that the autocovariance function of a stationary series is symmetric around the origin, that is,
$$\gamma_{-h} = \gamma_h. \quad (1.3.3)$$

Proposition (Properties of the Autocovariance Function (ACVF)). The autocovariance function of a stationary time series $\{Y_t\}$ has the following properties:

- Nonnegativity: $\gamma_0 \ge 0$.
- Boundedness: $|\gamma_h| \le \gamma_0$ for any $h \in \mathbb{Z}$.
- Symmetry: $\gamma_{-h} = \gamma_h$, i.e. $\gamma(t,s) = \gamma(0, |s-t|)$.

Proof. See [16].

Definition (The Autocorrelation Function (ACF)). The autocorrelation function of a stationary time series will be written as
$$\rho_h = \frac{\gamma(t+h, t)}{\sqrt{\gamma(t+h, t+h)\,\gamma(t, t)}} = \frac{\gamma_h}{\gamma_0}. \quad (1.3.4\text{--}1.3.5)$$

Proposition (Properties of the Autocorrelation Function (ACF)). The autocorrelation function $\rho_h$ of a stationary time series $\{Y_t\}$ has the following properties:
- $\rho_0 = 1$;
- $|\rho_h| \le 1$ for all $h \in \mathbb{Z}$;
- $\rho_{-h} = \rho_h$.

Proof. See [16].

Definition (The Partial Autocorrelation Function (PACF)). The partial autocorrelation function of a time series $\{Y_t\}$, denoted by $\phi_{kk}$, is
$$\phi_{kk} = \operatorname{Corr}(Y_t, Y_{t-k} \mid Y_{t-1}, Y_{t-2}, \dots, Y_{t-k+1}). \quad (1.3.6)$$

In this chapter we have studied the basic notions of time series and their properties, such as the variance, covariance, autocovariance, correlation, and autocorrelation, and we have studied types of time series. In the second chapter we study the Box-Jenkins models: the moving average, autoregressive, and autoregressive moving average models.
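In practice these functions are estimated from data, and the case-study chapters rely on such sample ACF plots. A minimal Python sketch of the sample ACF, $\hat\rho_h = \hat\gamma_h/\hat\gamma_0$ (the helper function and the white-noise input are illustrative assumptions, not thesis code; the thesis itself works in R):

```python
import numpy as np

# Sample ACF: rho_hat_h = gamma_hat_h / gamma_hat_0.
def sample_acf(y, max_lag):
    y = np.asarray(y, dtype=float)
    n, mu = len(y), y.mean()
    gamma0 = ((y - mu) ** 2).sum() / n
    return [((y[h:] - mu) * (y[:n - h] - mu)).sum() / (n * gamma0)
            for h in range(max_lag + 1)]

rng = np.random.default_rng(1)          # arbitrary seed
acf = sample_acf(rng.normal(size=500), max_lag=5)
print(acf[0])   # rho_0 = 1 by construction; higher lags are near 0
```

For white noise the higher-lag values hover near zero, matching the property $\rho_h = 0$ for $h \ne 0$.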

Chapter 2
Box-Jenkins Methodology

2.1 Models for Stationary Time Series

This chapter discusses the basic concepts of a broad class of parametric time series models, the autoregressive moving average (ARMA) models. These models have assumed great importance in modeling real-world processes. For more details see [7].

2.1.1 General Linear Processes

We study a class of linear models, called linear time series models, that are designed specifically for modeling the dynamic behavior of time series. These include the moving average (MA), autoregressive (AR), and autoregressive moving average (ARMA) models.

Definition. A time series $\{Y_t\}$ is a linear process if it has the representation
$$Y_t = e_t + \psi_1 e_{t-1} + \psi_2 e_{t-2} + \dots, \quad (2.1.1)$$
or $Y_t = \sum_{j=0}^{\infty} \psi_j e_{t-j}$ for all t, where the $e_t$ have zero mean and variance $\sigma^2$, and $\{\psi_j\}$ is a sequence of constants with $\psi_0 = 1$ and $\sum_{j=1}^{\infty} \psi_j^2 < \infty$.

Definition (White Noise). A time series $\{Y_t\}$ is said to be white noise with mean zero and variance $\sigma^2$, written $Y_t \sim WN(0, \sigma^2)$, if and only if $\{Y_t\}$ has zero mean and covariance function
$$\gamma_h = \begin{cases} \sigma^2 & \text{if } h = 0,\\ 0 & \text{if } h \ne 0.\end{cases}$$
It is clear that a white noise process is stationary.

Definition (Backshift Operator). For any time series $\{Y_t\}$ the backshift operator B is defined by $BY_t = Y_{t-1}$ and extends to powers: $B^2 Y_t = B(BY_t) = BY_{t-1} = Y_{t-2}$, and so on. Thus
$$B^k Y_t = Y_{t-k}. \quad (2.1.2)$$

An important part of time series analysis is the selection of a suitable model for the data; such models are a very important tool for forecasting. We consider three famous models: the autoregressive (AR) model, the moving average (MA) model, and the autoregressive moving average (ARMA) model. These models are very important in modeling real-world processes. Using the backshift operator B, we can rewrite the time series models in a simplified and useful form.

2.1.2 Autoregressive Process

Definition (Autoregressive Process). The autoregressive process of order p, denoted by AR(p), is defined as
$$Y_t = \phi_1 Y_{t-1} + \phi_2 Y_{t-2} + \dots + \phi_p Y_{t-p} + \varepsilon_t, \quad (2.1.3)$$
where $\phi_1, \phi_2, \dots, \phi_p$ are the parameters of the model and $\varepsilon_t \sim WN(0, \sigma^2)$.

The mean of $Y_t$ in (2.1.3) is zero. If the mean $\mu$ of $Y_t$ is not zero, replace $Y_t$ by $Y_t - \mu$ in (2.1.3), i.e.
$$Y_t - \mu = \phi_1(Y_{t-1} - \mu) + \phi_2(Y_{t-2} - \mu) + \dots + \phi_p(Y_{t-p} - \mu) + \varepsilon_t.$$
By using the backshift operator we can write the AR(p) model as
$$(1 - \phi_1 B - \phi_2 B^2 - \dots - \phi_p B^p) Y_t = \varepsilon_t, \quad (2.1.4)$$
or even more concisely as
$$\phi(B) Y_t = \varepsilon_t, \quad (2.1.5)$$
where $\phi(B) = 1 - \phi_1 B - \phi_2 B^2 - \dots - \phi_p B^p$ is called the characteristic polynomial. Figure 2.1 displays the time plot of a simulated AR(1) process with $\phi = 0.9$.

Figure 2.1: Simulated AR(1) process with $\phi = 0.9$
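A series like the one in Figure 2.1 is easy to reproduce. A hedged Python sketch (the seed and series length are arbitrary choices, and this is an illustration rather than the thesis's R code): simulate $Y_t = 0.9\,Y_{t-1} + \varepsilon_t$ and check that the sample lag-1 autocorrelation is near $\phi$, since $\rho_1 = \phi$ for a stationary AR(1).

```python
import numpy as np

# Simulate an AR(1) with phi = 0.9 and verify rho_1 is approximately phi.
rng = np.random.default_rng(42)         # arbitrary seed
phi, n = 0.9, 5000
eps = rng.normal(size=n)
y = np.empty(n)
y[0] = eps[0]
for t in range(1, n):
    y[t] = phi * y[t - 1] + eps[t]      # Y_t = phi*Y_{t-1} + eps_t

yc = y - y.mean()
rho1 = (yc[1:] * yc[:-1]).sum() / (yc ** 2).sum()
print(rho1)                             # close to 0.9
```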

Definition (Causality). A linear process $\{Y_t\}$ is causal in terms of $\{W_t\}$ if there is a power series $\psi(B) = \psi_0 + \psi_1 B + \psi_2 B^2 + \dots$ with $\sum_{j=0}^{\infty} |\psi_j| < \infty$ such that
$$Y_t = \psi(B) W_t.$$

2.1.3 Moving Average Processes

Definition (Moving Average). The moving average model of order q, denoted MA(q), is defined as
$$Y_t = \varepsilon_t - \theta_1 \varepsilon_{t-1} - \theta_2 \varepsilon_{t-2} - \dots - \theta_q \varepsilon_{t-q}, \quad (2.1.6)$$
where $\theta_1, \theta_2, \dots, \theta_q$ are parameters. Some texts and software packages write the MA model with the signs of the coefficients reversed, that is,
$$Y_t = \varepsilon_t + \theta_1 \varepsilon_{t-1} + \theta_2 \varepsilon_{t-2} + \dots + \theta_q \varepsilon_{t-q}.$$
By using the backshift operator we can write the MA(q) model as
$$Y_t = (1 - \theta_1 B - \theta_2 B^2 - \dots - \theta_q B^q)\varepsilon_t, \quad (2.1.7)$$
or even more concisely as
$$Y_t = \theta(B)\varepsilon_t, \quad (2.1.8)$$

where $\theta(B) = 1 - \theta_1 B - \theta_2 B^2 - \dots - \theta_q B^q$ is called the characteristic polynomial. Figure 2.2 shows a time plot of a simulated MA(1) series with $\theta = 0.9$.

Figure 2.2: Simulated MA(1) process with $\theta = 0.9$

Definition (Invertibility). A linear process $\{Y_t\}$ is invertible in terms of $\{W_t\}$ if there is a power series $\pi(B) = \pi_0 + \pi_1 B + \pi_2 B^2 + \dots$ with $\sum_{j=0}^{\infty} |\pi_j| < \infty$ such that
$$W_t = \pi(B) Y_t.$$

2.1.4 Autoregressive Moving Average Model

Definition (Autoregressive Moving Average Model). The autoregressive moving average model, denoted ARMA(p, q), is defined as
$$Y_t = \phi_1 Y_{t-1} + \phi_2 Y_{t-2} + \dots + \phi_p Y_{t-p} + \varepsilon_t - \theta_1 \varepsilon_{t-1} - \theta_2 \varepsilon_{t-2} - \dots - \theta_q \varepsilon_{t-q}, \quad (2.1.9)$$

with $\phi_p \ne 0$, $\theta_q \ne 0$, and $\sigma_\varepsilon^2 > 0$; the parameters p and q are called the autoregressive and moving average orders, respectively. By using the backshift operator, we can write the ARMA(p, q) model as
$$(1 - \phi_1 B - \phi_2 B^2 - \dots - \phi_p B^p) Y_t = (1 - \theta_1 B - \theta_2 B^2 - \dots - \theta_q B^q)\varepsilon_t, \quad (2.1.10)$$
or
$$\phi(B) Y_t = \theta(B)\varepsilon_t. \quad (2.1.11)$$

Definition (Characteristic Polynomials). The AR(p) and MA(q) polynomials are defined as
$$\phi(x) = 1 - \phi_1 x - \phi_2 x^2 - \dots - \phi_p x^p \quad (2.1.12)$$
and
$$\theta(x) = 1 + \theta_1 x + \theta_2 x^2 + \dots + \theta_q x^q, \quad (2.1.13)$$
respectively, where x is a complex number.

Table 2.1: Behavior of the ACF and the PACF for ARMA models

      | AR(p)                 | MA(q)                 | ARMA(p, q)
ACF   | Tails off             | Cuts off after lag q  | Tails off
PACF  | Cuts off after lag p  | Tails off             | Tails off

Remark. We say that $\phi(B) Y_t = \theta(B)\varepsilon_t$ is an ARMA(p, q) model only if there is no common factor between $\phi(x)$ and $\theta(x)$.

Example. Consider the process
$$Y_t = 0.75 Y_{t-1} - 0.125 Y_{t-2} + \varepsilon_t - 0.5 \varepsilon_{t-1},$$

or, in operator form,
$$(1 - 0.75B + 0.125B^2) Y_t = (1 - 0.5B)\varepsilon_t.$$
At first sight $Y_t$ appears to be an ARMA(2, 1) process. But the associated polynomials
$$\phi(z) = 1 - 0.75z + 0.125z^2 = (1 - 0.5z)(1 - 0.25z), \qquad \theta(z) = 1 - 0.5z$$
have a common factor that can be canceled. After cancelation, the model is reduced to
$$Y_t = 0.25 Y_{t-1} + \varepsilon_t,$$
so the model is an AR(1).

Definition. An ARMA(p, q) model $\phi(B) Y_t = \theta(B)\varepsilon_t$ is said to be causal if the time series $\{Y_t\}$ can be written as a one-sided linear process
$$Y_t = \sum_{j=0}^{\infty} \psi_j \varepsilon_{t-j} = \psi(B)\varepsilon_t, \quad (2.1.14)$$
where $\psi(B) = \sum_{j=0}^{\infty} \psi_j B^j$ and $\sum_{j=0}^{\infty} |\psi_j| < \infty$.

Causality of an ARMA(p, q) process. An ARMA(p, q) model is causal if and only if $\phi(z) \ne 0$ for $|z| \le 1$. The coefficients of the linear process can be determined by solving
$$\psi(z) = \sum_{j=0}^{\infty} \psi_j z^j = \frac{\theta(z)}{\phi(z)}, \quad |z| \le 1.$$
Another way of expressing this is that an ARMA process is causal only when the roots of $\phi(z)$ lie outside the unit circle; that is, $\phi(z) = 0$ only when $|z| > 1$.

Definition (Invertibility of an ARMA Model). An ARMA(p, q) model $\phi(B) Y_t = \theta(B)\varepsilon_t$ is said to be invertible if the time series $\{Y_t\}$ can be written as

$$\pi(B) Y_t = \sum_{j=0}^{\infty} \pi_j Y_{t-j} = \varepsilon_t, \quad (2.1.15)$$
where $\pi(B) = \sum_{j=0}^{\infty} \pi_j B^j$ and $\sum_{j=0}^{\infty} |\pi_j| < \infty$. See [16].

Invertibility of an ARMA(p, q) process. An ARMA(p, q) model is invertible if and only if $\theta(z) \ne 0$ for $|z| \le 1$. The coefficients $\pi_j$ of $\pi(B)$ can be determined by solving
$$\pi(z) = \sum_{j=0}^{\infty} \pi_j z^j = \frac{\phi(z)}{\theta(z)}, \quad |z| \le 1.$$
Another way of expressing this is that an ARMA process is invertible only when the roots of $\theta(z)$ lie outside the unit circle; that is, $\theta(z) = 0$ only when $|z| > 1$.

Example. Consider the process
$$Y_t = 0.4 Y_{t-1} + 0.45 Y_{t-2} + \varepsilon_t + \varepsilon_{t-1} + 0.25 \varepsilon_{t-2},$$
or, in operator form,
$$(1 - 0.4B - 0.45B^2) Y_t = (1 + B + 0.25B^2)\varepsilon_t.$$
At first sight, $Y_t$ appears to be an ARMA(2, 2) process. But the associated polynomials
$$\phi(z) = 1 - 0.4z - 0.45z^2 = (1 + 0.5z)(1 - 0.9z), \qquad \theta(z) = 1 + z + 0.25z^2 = (1 + 0.5z)^2$$
have a common factor that can be canceled. After cancelation, the polynomials become $\phi(z) = 1 - 0.9z$ and $\theta(z) = 1 + 0.5z$, so the model is an ARMA(1, 1) model, $(1 - 0.9B) Y_t = (1 + 0.5B)\varepsilon_t$, or
$$Y_t = 0.9 Y_{t-1} + 0.5\varepsilon_{t-1} + \varepsilon_t.$$
The model is causal because $\phi(z) = 1 - 0.9z = 0$ when $z = 10/9$, which is outside the unit circle. The model is also invertible because the root of $\theta(z) = 1 + 0.5z$ is $z = -2$, which is outside the unit circle.
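The root conditions in this example can be verified numerically. A Python sketch (illustrative only, not thesis code; numpy lists polynomial coefficients from the highest power down):

```python
import numpy as np

# phi(z) = 1 - 0.4z - 0.45z^2 -> coefficients [-0.45, -0.4, 1.0]
# theta(z) = 1 + z + 0.25z^2  -> coefficients [0.25, 1.0, 1.0]
phi_roots = np.roots([-0.45, -0.4, 1.0])
theta_roots = np.roots([0.25, 1.0, 1.0])
print(sorted(phi_roots.real))     # roots -2 and 10/9; shared factor is (1 + 0.5z)
print(sorted(theta_roots.real))   # double root at (about) -2

# psi-weights of the reduced causal ARMA(1,1), (1 - 0.9B)Y_t = (1 + 0.5B)eps_t:
# psi_0 = 1, psi_1 = phi + theta, psi_j = phi * psi_{j-1} for j >= 2.
phi_, theta_ = 0.9, 0.5
psi = [1.0, phi_ + theta_]
for _ in range(4):
    psi.append(phi_ * psi[-1])
print(psi[:3])                    # geometric decay at rate 0.9 after psi_1
```

The remaining roots (10/9 for $\phi$, $-2$ for $\theta$) exceed 1 in modulus, so the reduced model is causal and invertible, and the geometrically decaying psi-weights are absolutely summable, confirming the one-sided representation (2.1.14).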

2.2 Models for Non-Stationary Time Series

In statistics, an autoregressive integrated moving average (ARIMA) model is a generalization of an autoregressive moving average (ARMA) model. These models are fitted to time series data either to better understand the data or to predict future points in the series. The ARIMA model is applied in cases where the data show evidence of non-stationarity, where an initial differencing step (corresponding to the "integrated" part of the model) can be applied to remove the non-stationarity. The model is generally referred to as an ARIMA(p, d, q) model, where p, d, and q are integers greater than or equal to zero that refer to the orders of the autoregressive, integrated, and moving average parts of the model, respectively. The first parameter p refers to the number of autoregressive lags (not counting the unit roots), the second parameter d refers to the order of integration, and the third parameter q gives the number of moving average lags. For more details see [7] and [12].

Definition (Integrated Autoregressive Moving Average Model, ARIMA). A process $Y_t$ is said to be an integrated autoregressive moving average process, abbreviated ARIMA(p, d, q), if
$$\nabla^d Y_t = (1 - B)^d Y_t \quad (2.2.1)$$
is a stationary ARMA(p, q) process. In general, we write the model as
$$\phi(B)(1 - B)^d Y_t = \theta(B) e_t. \quad (2.2.2)$$
If $E(\nabla^d Y_t) = \mu$, we write the model as
$$\phi(B)(1 - B)^d Y_t = \alpha + \theta(B) e_t, \quad (2.2.3)$$
where $\alpha = \mu(1 - \phi_1 - \phi_2 - \dots - \phi_p)$.
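The effect of the differencing operator is easiest to see on a deterministic trend. A small Python illustration (the series here is made up, not thesis data): applying $(1 - B)$ once turns a linear trend into a constant, which is why $d = 1$ often suffices for linearly trending data.

```python
import numpy as np

# First differencing (1 - B)Y_t = Y_t - Y_{t-1} removes a linear trend.
t = np.arange(100)
trend = 2.0 + 0.5 * t        # deterministic linear trend (illustrative)
dy = np.diff(trend)          # one application of (1 - B)
print(dy[:3])                # every difference equals the slope 0.5
```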

2.2.1 Multiplicative Seasonal ARIMA Models

In this section, we introduce several modifications made to the ARIMA model to account for seasonal and non-stationary behavior. Often, the dependence on the past tends to occur most strongly at multiples of some underlying seasonal lag s.

Definition (Seasonal Time Series). Seasonal variation is a component of a time series defined as the repetitive and predictable movement around the trend line within one year or less. Some examples of seasonal time series:
- Monthly carbon dioxide levels at Alert, Canada, from January 1994 through December.
- Monthly U.S. retail and food service sales from January 1992 to August 2008, in millions of dollars.
- Electricity consumption of an industrial sector of the U.S.

Definition (Seasonal MA(Q) Model). A seasonal MA model of order Q with seasonal period s is defined by
$$Y_t = e_t - \Theta_1 e_{t-s} - \Theta_2 e_{t-2s} - \dots - \Theta_Q e_{t-Qs}, \quad (2.2.4)$$
with seasonal MA characteristic polynomial
$$\Theta(x) = 1 - \Theta_1 x^s - \Theta_2 x^{2s} - \dots - \Theta_Q x^{Qs}.$$

Definition (Seasonal AR(P) Model). A seasonal AR model of order P and seasonal period s is defined by
$$Y_t = \Phi_1 Y_{t-s} + \Phi_2 Y_{t-2s} + \dots + \Phi_P Y_{t-Ps} + e_t, \quad (2.2.5)$$
with seasonal AR characteristic polynomial
$$\Phi(x) = 1 - \Phi_1 x^s - \Phi_2 x^{2s} - \dots - \Phi_P x^{Ps}.$$

Definition (Multiplicative Seasonal ARIMA Model, SARIMA). A purely seasonal ARMA model takes the form
$$\Phi_P(B^s) Y_t = \Theta_Q(B^s) e_t, \quad (2.2.6)$$
with operators
$$\Phi(B^s) = 1 - \Phi_1 B^s - \Phi_2 B^{2s} - \dots - \Phi_P B^{Ps},$$
$$\Theta(B^s) = 1 - \Theta_1 B^s - \Theta_2 B^{2s} - \dots - \Theta_Q B^{Qs}.$$
The multiplicative seasonal autoregressive integrated moving average model, or SARIMA model, is given by
$$\phi(B)\,\Phi(B^s)\,\nabla^d \nabla_s^D Y_t = \theta(B)\,\Theta(B^s)\, e_t. \quad (2.2.7)$$
The general model is denoted ARIMA(p, d, q)(P, D, Q)_s.

Example. Consider the following model, which often provides a reasonable representation for seasonal, non-stationary, economic time series. We display the equations for the model denoted by ARIMA(0, 1, 1)(0, 1, 1)_12:
$$(1 - B^{12})(1 - B) y_t = (1 + \Theta B^{12})(1 + \theta B) e_t.$$
Expanding both sides gives (see [16])
$$(1 - B - B^{12} + B^{13}) y_t = (1 + \theta B + \Theta B^{12} + \Theta\theta B^{13}) e_t.$$
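The expansion above can be double-checked by multiplying the backshift polynomials, for example with numpy convolution (an illustrative aside, not thesis code; coefficients are ordered by increasing power of B):

```python
import numpy as np

# (1 - B)(1 - B^12): the differencing part of ARIMA(0,1,1)(0,1,1)_12.
d = np.array([1.0, -1.0])                       # 1 - B
D12 = np.zeros(13)
D12[0], D12[12] = 1.0, -1.0                     # 1 - B^12
combined = np.convolve(d, D12)
nonzero = {i: c for i, c in enumerate(combined) if c != 0}
print(nonzero)   # powers 0, 1, 12, 13 -> 1 - B - B^12 + B^13
```

The same product applied to $(1 + \theta B)$ and $(1 + \Theta B^{12})$ reproduces the right-hand side, including the cross term $\Theta\theta B^{13}$.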

2.3 Forecasting

In this section, we consider the calculation of forecasts and their properties for both deterministic trend models and ARIMA models. Based on the available history of the series up to time t, namely $Y_1, Y_2, Y_3, \dots, Y_t$, we would like to forecast the value $Y_{t+L}$ that will occur L time units into the future. For more details see [12] and [15].

Definition (Minimum Mean Square Error Forecast). The minimum mean square error forecast is given by
$$\hat{Y}_t(L) = E(Y_{t+L} \mid Y_1, Y_2, \dots, Y_t). \quad (2.3.1)$$
For ARIMA models, the forecasts can be expressed in several different ways. Each expression contributes to our understanding of the overall forecasting procedure with respect to computing, updating, assessing precision, or long-term forecasting behavior.

Definition (Akaike's Information Criterion (AIC)).
$$AIC = -2\log(\text{maximum likelihood}) + 2k, \quad (2.3.2)$$
where k is the number of parameters in the model. Akaike's Information Criterion has another definition:

Definition (Akaike's Information Criterion (AIC)).
$$AIC = \ln \hat{\sigma}_k^2 + \frac{n + 2k}{n}, \quad (2.3.3)$$
where k is the number of parameters in the model and $\hat{\sigma}_k^2 = RSS_k / n$,

where $RSS_k$ denotes the residual sum of squares under the model with k regression coefficients.

Definition (AIC, Bias Corrected (AICc)).
$$AICc = \ln \hat{\sigma}_k^2 + \frac{n + k}{n - k - 2}. \quad (2.3.4)$$

Definition (Schwarz's Information Criterion (SIC)).
$$SIC = \ln \hat{\sigma}_k^2 + \frac{k \ln n}{n}. \quad (2.3.5)$$
SIC is also called the Bayesian Information Criterion (BIC). For more details see [16].

Example (AR(1)). Consider the AR(1) model with a nonzero mean, which satisfies
$$Y_t - \mu = \phi(Y_{t-1} - \mu) + e_t.$$
Replacing t by t + 1 in the last equation, we have
$$Y_{t+1} - \mu = \phi(Y_t - \mu) + e_{t+1}. \quad (2.3.6)$$
Taking the conditional expectation of both sides of Equation (2.3.6) gives
$$\hat{Y}_t(1) - \mu = \phi[E(Y_t \mid Y_1, Y_2, \dots, Y_t) - \mu] + E(e_{t+1} \mid Y_1, Y_2, \dots, Y_t). \quad (2.3.7)$$
Since $E(Y_t \mid Y_1, Y_2, \dots, Y_t) = Y_t$ and $e_{t+1}$ is independent of $Y_1, Y_2, \dots, Y_t$, we have $E(e_{t+1} \mid Y_1, Y_2, \dots, Y_t) = E(e_{t+1}) = 0$. Thus Equation (2.3.7) can be written as
$$\hat{Y}_t(1) = \mu + \phi(Y_t - \mu).$$
Now consider a general lead time L. Replacing t by t + L in the last equation and taking conditional expectations of both sides produces
$$\hat{Y}_t(L) = \mu + \phi^L (Y_t - \mu), \quad L \ge 1.$$
Since $|\phi| < 1$, we have simply $\hat{Y}_t(L) \approx \mu$ for large L.
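The closed-form AR(1) forecast can be traced numerically. A short Python sketch (the values of $\mu$, $\phi$, and $Y_t$ are made up for illustration, and this is not thesis code): the forecasts decay geometrically from the last observation toward the mean.

```python
# AR(1) L-step forecast: Yhat_t(L) = mu + phi**L * (Y_t - mu).
mu, phi, y_t = 50.0, 0.8, 60.0          # illustrative values
forecasts = [mu + phi ** L * (y_t - mu) for L in range(1, 6)]
print([round(f, 4) for f in forecasts]) # approaches mu = 50 as L grows
```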

2.3.1 Model Identification

Definition (Identification). Identification means finding appropriate values of p, q, d, P, Q, and D, the orders of the general SARIMA model; we use the ACF and PACF to find these values.

Stationarity and seasonality. The first step in developing a Box-Jenkins model is to determine whether the series is stationary and whether there is any significant seasonality that needs to be modeled.

Detecting seasonality. Seasonality (or periodicity) can usually be assessed from an autocorrelation plot, a seasonal subseries plot, or a spectral plot.

Differencing to achieve stationarity. Box and Jenkins recommend the differencing approach to achieve stationarity. However, fitting a curve and subtracting the fitted values from the original data can also be used in the context of Box-Jenkins models.

Seasonal differencing. At the model identification stage, our goal is to detect seasonality, if it exists, and to identify the order of the seasonal autoregressive and seasonal moving average terms. For many series, the period is known and a single seasonality term is sufficient. For example, for monthly data we would typically include either a seasonal AR term or a seasonal MA term. For Box-Jenkins models, we do not explicitly remove seasonality before fitting the model. Instead, we include the order of the seasonal terms in the model specification to the ARIMA estimation software. However, it may be helpful to apply a seasonal difference to the data and regenerate the autocorrelation and partial autocorrelation plots. This may help in the identification of the non-seasonal component of the model. In some cases, the seasonal differencing may remove most or all of the seasonality effect.

Identifying p and q. Once stationarity and seasonality have been addressed, the next step is to identify the orders of the autoregressive and moving average terms.

Detecting stationarity. Stationarity can be assessed from a run sequence plot. The run sequence plot should

show constant location and scale. It can also be detected from an autocorrelation plot. Specifically, non-stationarity is often indicated by an autocorrelation plot with very slow decay.

Order of the autoregressive process (p)
For an AR(1) process, the sample autocorrelation function should have an exponentially decreasing appearance. However, higher-order AR processes are often a mixture of exponentially decreasing and damped sinusoidal components. For higher-order autoregressive processes, the sample autocorrelation needs to be supplemented with a partial autocorrelation plot. The partial autocorrelation of an AR(p) process becomes zero at lag p + 1 and greater, so we examine the sample partial autocorrelation function to see if there is evidence of a departure from zero. This is usually determined by placing a 95% confidence interval on the sample partial autocorrelation plot (most software programs that generate sample autocorrelation plots will also plot this confidence interval). If the software program does not generate the confidence band, it is approximately ±2/√N, with N denoting the sample size.

Order of the moving average process (q)
The autocorrelation function of an MA(q) process becomes zero at lag q + 1 and greater, so we examine the sample autocorrelation function to see where it essentially becomes zero. We do this by placing the 95% confidence interval for the sample autocorrelation function on the sample autocorrelation plot. Most software that can generate the autocorrelation plot can also generate this confidence interval.
The sample partial autocorrelation function is generally not helpful for identifying the order of the moving average process.

2.3.2 Parameter Estimation of the SARIMA Model
After obtaining appropriate values of p, d, q, P, D and Q, the next stage is to estimate the values of θ, φ, Θ and Φ.

2.3.3 Diagnostics Checking of the Fitted Model
Diagnostic tests are applied to check whether the estimated parameters and residuals of the fitted SARIMA model are significant.

ACF and PACF of residuals
We hope

that these will show the WN (white noise) pattern.

2.3.4 Forecasting the study variable
When the model is complete, it will be used to forecast the future behavior of the study variable.
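The ±2/√N identification band described above can be illustrated with a small sketch (illustrative code with names of my own, not code from the thesis):

```python
import numpy as np

# Compare sample autocorrelations against the approximate 95% limits
# +/- 2/sqrt(N) used in the identification stage above.
def sample_acf(x, max_lag):
    x = np.asarray(x, dtype=float) - np.mean(x)
    denom = np.dot(x, x)
    return np.array([np.dot(x[k:], x[:-k]) / denom for k in range(1, max_lag + 1)])

rng = np.random.default_rng(42)
wn = rng.normal(size=200)            # white noise: no true autocorrelation
r = sample_acf(wn, max_lag=10)
band = 2 / np.sqrt(len(wn))          # approximate 95% band
flagged = np.flatnonzero(np.abs(r) > band) + 1
print(band, flagged)                 # for white noise, almost no lags should be flagged
```

Lags whose sample autocorrelation lies outside the band are candidates for AR or MA terms; for a white-noise series essentially none should be flagged.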

Chapter 3
Exponential Smoothing

3.1 Introduction
Exponential smoothing is probably the most widely used class of procedures for smoothing discrete time series in order to forecast the immediate future. This popularity can be attributed to its simplicity, its computational efficiency, the ease of adjusting its responsiveness to changes in the process being forecast, and its reasonable accuracy. The idea of exponential smoothing is to smooth the original series the way the moving average does and to use the smoothed series in forecasting future values of the variable of interest. In exponential smoothing, however, we want to allow the more recent values of the series to have greater influence on the forecast of future values than the more distant observations. Exponential smoothing is a simple and pragmatic approach to forecasting, whereby the forecast is constructed from an exponentially weighted average of past observations. The largest weight is given to the present observation, less weight to the immediately preceding observation, even less weight to the observation before that, and so on: an exponential decay of the influence of past data. Historically, exponential smoothing describes a class of forecasting methods. In fact, some of the most successful forecasting methods are based on the concept of exponential smoothing. There are a variety of methods that fall into the exponential smoothing family, each having the property that forecasts are weighted combinations of past observations, with recent observations given relatively more weight than older observations.

The name exponential smoothing reflects the fact that the weights decrease exponentially as the observations get older. [17]

Exponential smoothing is a statistical technique for detecting significant changes in data by ignoring the fluctuations irrelevant to the purpose at hand. In exponential smoothing (as opposed to moving-average smoothing), older data are given progressively less relative weight (importance), whereas newer data are given progressively greater weight. Also called averaging, it is employed in making short-term forecasts. The wait-and-see attitude to changes around them is the intuitive way people employ exponential smoothing in their daily lives.

3.1.1 Classification of Exponential Smoothing Methods
In exponential smoothing, we always start with the trend component, which is itself a combination of a level term (l) and a growth term (b). The level and growth can be combined in a number of ways, giving five future trend types. Let T_h denote the forecast trend over the next h time periods, and let φ denote a damping parameter (0 < φ < 1). Then the five trend types or growth patterns are as follows:

    None:                   T_h = l
    Additive:               T_h = l + bh
    Additive damped:        T_h = l + (φ + φ² + ... + φ^h)b
    Multiplicative:         T_h = l b^h
    Multiplicative damped:  T_h = l b^(φ + φ² + ... + φ^h)

If the error component is ignored, then we have the fifteen exponential smoothing methods given in the following table. Some of these methods are better known under other names. For example, cell (N, N) describes the simple exponential smoothing (or SES) method, cell (A, N) describes Holt's linear method, and cell (A_d, N) describes the damped trend method. The Holt-Winters additive method is given by cell (A, A), and the Holt-Winters multiplicative method is given by cell (A, M). The other cells correspond to less commonly used but analogous methods.
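The five growth patterns above transcribe directly into code; the function below is an illustrative sketch (the names are my own):

```python
# Sketch of the five trend/growth patterns listed above; l, b, phi and h
# play the same roles as in the text.
def trend_forecast(kind, l, b, phi, h):
    phi_h = sum(phi**j for j in range(1, h + 1))   # phi + phi^2 + ... + phi^h
    return {
        "none":                  l,
        "additive":              l + b * h,
        "additive_damped":       l + phi_h * b,
        "multiplicative":        l * b**h,
        "multiplicative_damped": l * b**phi_h,
    }[kind]

# The damped additive trend approaches l + b*phi/(1 - phi) as h grows:
print(trend_forecast("additive_damped", l=100.0, b=2.0, phi=0.9, h=50))
```

Note how damping bounds the long-run forecast, while the undamped additive and multiplicative trends grow without limit.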

3.1.2 Point Forecasts for the Best-Known Methods
In this section, a simple introduction is provided to some of the best-known exponential smoothing methods: simple exponential smoothing (N, N), Holt's linear method (A, N), the damped trend method (A_d, N) and the Holt-Winters seasonal methods (A, A) and (A, M).

3.2 Simple Exponential Smoothing (N, N Method)
The simplest of the exponential smoothing methods is naturally called simple exponential smoothing (SES). (In some books [8], it is called single exponential smoothing.) This method is used for short-range forecasting, usually just one month into the future. The model assumes that the data fluctuate around a reasonably stable mean (no trend or consistent pattern of growth). For more details see [11] and [17].

Definition (Simple exponential smoothing). The simple exponential smoothing equation is defined as
    ŷ_{t+1} = ŷ_t + α(y_t - ŷ_t)   (3.2.1)
where α is a constant between 0 and 1. Another way of writing the last equation is
    ŷ_{t+1} = α y_t + (1 - α) ŷ_t   (3.2.2)

If this substitution process is repeated by replacing ŷ_t with its components, then ŷ_{t-1} with its components, and so on, the result is
    ŷ_{t+1} = α y_t + α(1 - α) y_{t-1} + α(1 - α)² y_{t-2} + ... + α(1 - α)^{t-1} y_1 + (1 - α)^t ŷ_1   (3.2.3)
So ŷ_{t+1} represents a weighted moving average of all past observations with the weights decreasing exponentially, hence the name exponential smoothing. We note that the weight of ŷ_1 may be quite large when α is small and the time series is relatively short. For longer-range forecasts, it is assumed that the forecast function is flat. That is,
    ŷ_{t+h|t} = ŷ_{t+1},   h = 2, 3, ...
A flat forecast function is used because simple exponential smoothing works best for data that have no trend, seasonality, or other underlying patterns. Another way of writing this is to let l_t = ŷ_{t+1}; then (3.2.2) becomes
    l_t = α y_t + (1 - α) l_{t-1}
The value of l_t is a measure of the level of the series at time t.

Initial value
The initial value ŷ_1 plays an important role in computing all the subsequent values. Setting it to y_1 is one method of initialization. Another possibility would be to average the first four or five observations. The smaller the value of α, the more important is the selection of the initial value.
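The recursive form l_t = α y_t + (1 - α) l_{t-1}, initialized at the first observation as suggested above, can be sketched as follows (illustrative code, not part of the thesis):

```python
# Minimal sketch of simple exponential smoothing, initialized at y_1.
def ses(y, alpha, level0=None):
    level = y[0] if level0 is None else level0
    fitted = []                      # one-step-ahead forecasts: yhat_t = l_{t-1}
    for obs in y:
        fitted.append(level)
        level = alpha * obs + (1 - alpha) * level   # error-correction update
    return level, fitted             # `level` is the flat forecast for all h >= 1

series = [3.0, 5.0, 9.0, 20.0, 12.0, 17.0, 22.0, 23.0, 51.0, 41.0]
forecast, fitted = ses(series, alpha=0.3)
print(round(forecast, 3))
```

With α = 1 the method reduces to the naive "last observation" forecast, and with α = 0 it never moves from the initial level, matching the weighting interpretation of (3.2.3).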

Component form
An alternative representation is the component form. For simple exponential smoothing the only component included is the level, l_t. (Other methods considered later in this chapter may also include a trend b_t and a seasonal component s_t.) Component form representations of exponential smoothing methods comprise a forecast equation and a smoothing equation for each of the components included in the method. The component form of simple exponential smoothing is given by:
    Forecast equation:   ŷ_{t+1|t} = l_t
    Smoothing equation:  l_t = α y_t + (1 - α) l_{t-1}
where l_t is the level (or the smoothed value) of the series at time t. The forecast equation shows that the forecast value at time t + 1 is the estimated level at time t. The smoothing equation for the level (usually referred to as the level equation) gives the estimated level of the series at each period t. Applying the forecast equation at time T gives ŷ_{T+1|T} = l_T, the most recent estimated level. If we replace l_t by ŷ_{t+1|t} and l_{t-1} by ŷ_{t|t-1} in the smoothing equation, we recover the weighted average form of simple exponential smoothing.

Error correction form
The third form of simple exponential smoothing is obtained by re-arranging the level equation in the component form to get what we refer to as the error correction form
    l_t = l_{t-1} + α(y_t - l_{t-1}) = l_{t-1} + α e_t
where e_t = y_t - l_{t-1} = y_t - ŷ_{t|t-1} for t = 1, ..., T. That is, e_t is the one-step within-sample forecast error at time t. The within-sample forecast errors lead to the adjustment/correction of the estimated level throughout the smoothing process for t = 1, ..., T. For more details see [11] and [17].

Example. The data in Figure 3.1 do not display any clear trending behavior or any seasonality, although the mean of the data may be changing slowly over time.

Figure 3.1: Oil production in Saudi Arabia from 1996 to

3.3 Holt's Linear Method (A, N Method)
Holt (1957) extended simple exponential smoothing to linear exponential smoothing to allow forecasting of data with trends. The forecast for Holt's linear exponential smoothing method is found using two smoothing constants, α and β* (with values between 0 and 1), and three equations:

Definition (Holt's linear method). The equations are defined as:
    Level:    l_t = α y_t + (1 - α)(l_{t-1} + b_{t-1})   (3.3.1)
    Growth:   b_t = β*(l_t - l_{t-1}) + (1 - β*) b_{t-1}   (3.3.2)
    Forecast: ŷ_{t+h|t} = l_t + b_t h   (3.3.3)

Here l_t denotes an estimate of the level of the series at time t, and b_t denotes an estimate of the slope (or growth) of the series at time t. One interesting special case of this method occurs when β* = 0. Then
    Level:    l_t = α y_t + (1 - α)(l_{t-1} + b_{t-1})
    Forecast: ŷ_{t+h|t} = l_t + b_t h
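Holt's recursions (3.3.1)-(3.3.3) can be sketched as follows; the initialization used here (level from the second observation, growth from the first difference) is a simplifying assumption, not the only possibility:

```python
# Minimal sketch of Holt's linear method (3.3.1)-(3.3.3).
def holt_linear(y, alpha, beta_star):
    level, growth = y[1], y[1] - y[0]          # simple heuristic initialization
    for obs in y[2:]:
        prev_level = level
        level = alpha * obs + (1 - alpha) * (prev_level + growth)
        growth = beta_star * (level - prev_level) + (1 - beta_star) * growth
    return lambda h: level + growth * h        # forecast h steps ahead

# On an exactly linear series the method reproduces the line:
f = holt_linear([1.0, 2.0, 3.0, 4.0, 5.0], alpha=0.5, beta_star=0.25)
print(f(1), f(3))    # prints 6.0 8.0
```

Unlike SES, the forecast function is a trending straight line rather than a flat level.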

As with simple exponential smoothing, the level equation here shows that l_t is a weighted average of the observation y_t and the within-sample one-step-ahead forecast for time t, here given by l_{t-1} + b_{t-1}. The trend equation shows that b_t is a weighted average of the estimated trend at time t based on l_t - l_{t-1}, and b_{t-1}, the previous estimate of the trend. The forecast function is no longer flat but trending.

Error correction form
The error correction form of the level and the trend equations shows the adjustments in terms of the within-sample one-step forecast errors:
    l_t = l_{t-1} + b_{t-1} + α e_t
    b_t = b_{t-1} + αβ* e_t
where e_t = y_t - (l_{t-1} + b_{t-1}) = y_t - ŷ_{t|t-1}.

3.4 Damped Trend Method (A_d, N Method)
The forecasts generated by Holt's linear method display a constant trend (increasing or decreasing) indefinitely into the future. Even more extreme are the forecasts generated by the exponential trend method, which include exponential growth or decline. Empirical evidence indicates that these methods tend to over-forecast, especially for longer forecast horizons. Motivated by this observation, Gardner and McKenzie (1985) introduced a parameter that dampens the trend to a flat line some time in the future. Methods that include a damped trend have proven to be very successful and are arguably the most popular individual methods when forecasts are required automatically for many series. [9]

3.4.1 Additive damped trend
Definition. The additive damped trend method equations are defined as

    Level:    l_t = α y_t + (1 - α)(l_{t-1} + φ b_{t-1})   (3.4.1)
    Growth:   b_t = β*(l_t - l_{t-1}) + (1 - β*) φ b_{t-1}   (3.4.2)
    Forecast: ŷ_{t+h|t} = l_t + (φ + φ² + ... + φ^h) b_t   (3.4.3)

Thus, the growth for the one-step forecast of y_{t+1} is φ b_t, and the growth is dampened by a factor of φ for each additional future time period.

Notes. If φ = 1, this method gives the same forecasts as Holt's linear method. For 0 < φ < 1, as h → ∞ the forecasts approach an asymptote given by l_t + φ b_t/(1 - φ). We usually restrict φ > 0 to avoid a negative coefficient being applied to b_{t-1} in (3.4.2), and φ < 1 to avoid b_t increasing exponentially.

Error correction form
The error correction form of the smoothing equations is
    l_t = l_{t-1} + φ b_{t-1} + α e_t
    b_t = φ b_{t-1} + αβ* e_t

3.5 Holt-Winters Trend and Seasonality Method
If the data have no trend or seasonal patterns, then simple exponential smoothing is appropriate. If the data display a linear trend, Holt's linear method is appropriate. But if the data are seasonal, these methods, on their own, cannot handle the problem well. Holt's method was extended by Winters (1960) to capture seasonality directly. The Holt-Winters method is based on three smoothing equations, one for the level, one for trend and one for seasonality, with smoothing parameters α, β* and γ. We use m to denote the period of the seasonality. It is similar to Holt's linear method, with one additional equation for dealing with seasonality. There are two variations to this method that differ in the nature of the seasonal component. The additive method is preferred when the seasonal variations are

roughly constant through the series, while the multiplicative method is preferred when the seasonal variations change in proportion to the level of the series. With the additive method, the seasonal component is expressed in absolute terms in the scale of the observed series, and in the level equation the series is seasonally adjusted by subtracting the seasonal component. Within each year, the seasonal component will add up to approximately zero. With the multiplicative method, the seasonal component is expressed in relative terms (percentages) and the series is seasonally adjusted by dividing through by the seasonal component. Within each year, the seasonal component will sum to approximately m. See [8] and [14].

3.5.1 Additive Seasonality (A, A Method)
The seasonal component in the Holt-Winters method may also be treated additively, although this is less common.

Definition. The basic equations for the Holt-Winters additive method are as follows:
    Level:    l_t = α(y_t - s_{t-m}) + (1 - α)(l_{t-1} + b_{t-1})   (3.5.1)
    Growth:   b_t = β*(l_t - l_{t-1}) + (1 - β*) b_{t-1}   (3.5.2)
    Seasonal: s_t = γ(y_t - l_{t-1} - b_{t-1}) + (1 - γ) s_{t-m}   (3.5.3)
    Forecast: ŷ_{t+h|t} = l_t + b_t h + s_{t-m+h_m^+}   (3.5.4)
where h_m^+ = [(h - 1) mod m] + 1.

The equation for the seasonal component is often expressed as
    s_t = γ*(y_t - l_t) + (1 - γ*) s_{t-m}
If we substitute l_t from the smoothing equation for the level of the component form above, we get
    s_t = γ*(1 - α)(y_t - l_{t-1} - b_{t-1}) + (1 - γ*(1 - α)) s_{t-m}

which is identical to the smoothing equation for the seasonal component specified here, with γ = γ*(1 - α). The usual parameter restriction is 0 ≤ γ* ≤ 1, which translates to 0 ≤ γ ≤ 1 - α.

The error correction form of the smoothing equations is
    l_t = l_{t-1} + b_{t-1} + α e_t
    b_t = b_{t-1} + αβ* e_t
    s_t = s_{t-m} + γ e_t
where e_t = y_t - (l_{t-1} + b_{t-1} + s_{t-m}) = y_t - ŷ_{t|t-1} are the one-step training forecast errors.

3.6 General Point Forecasting Equations
Table 3.2 gives recursive formulae for computing point forecasts h periods ahead for all of the exponential smoothing methods. In each case, l_t denotes the series level at time t, b_t denotes the slope at time t, s_t denotes the seasonal component of the series at time t, and m denotes the number of seasons in a year; α, β*, γ and φ are constants, φ_h = φ + φ² + ... + φ^h, and h_m^+ = [(h - 1) mod m] + 1.
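The additive Holt-Winters recursions (3.5.1)-(3.5.3) can be sketched compactly. The initialization used here (first-season averages, zero growth) is a simplification of the heuristic scheme described later in Section 3.8, and the code is illustrative only:

```python
# Compact sketch of the additive Holt-Winters method (3.5.1)-(3.5.3).
def holt_winters_additive(y, m, alpha, beta_star, gamma):
    base = sum(y[:m]) / m
    season = [y[i] - base for i in range(m)]   # initial seasonal indices (sum ~ 0)
    level, growth = base, 0.0
    for t in range(m, len(y)):
        prev_level, prev_growth = level, growth
        s = season[t % m]
        level = alpha * (y[t] - s) + (1 - alpha) * (prev_level + prev_growth)
        growth = beta_star * (level - prev_level) + (1 - beta_star) * prev_growth
        season[t % m] = gamma * (y[t] - prev_level - prev_growth) + (1 - gamma) * s

    def forecast(h):
        return level + growth * h + season[(len(y) + h - 1) % m]
    return forecast

# Two years of a purely seasonal quarterly pattern around a level of 10:
y = [12.0, 8.0, 11.0, 9.0] * 2
f = holt_winters_additive(y, m=4, alpha=0.5, beta_star=0.1, gamma=0.3)
print([round(f(h), 2) for h in (1, 2, 3, 4)])
```

On exactly repeating seasonal data, the forecasts reproduce the seasonal pattern; the indexing `(len(y) + h - 1) % m` plays the role of s_{t-m+h_m^+} in (3.5.4).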

3.7 Innovations state space models for exponential smoothing
We now introduce the state space models that underlie exponential smoothing methods. For each method, there are two models:
a model with additive errors, and
a model with multiplicative errors.
The point forecasts for the two models are identical (provided the same parameter values are used), but their prediction intervals will differ. To distinguish the models with additive and multiplicative errors, we add an extra letter to the front of the method notation. The triplet (E, T, S) refers to the three components: error, trend and seasonality. So the model ETS(A, A, N) has additive errors, additive trend and no seasonality; in other words, this is Holt's linear method with additive errors. Similarly, ETS(M, M_d, M) refers to a model with multiplicative errors, a damped multiplicative

trend and multiplicative seasonality. The notation ETS(·, ·, ·) helps in remembering the order in which the components are specified. ETS can also be considered an abbreviation of Exponential Smoothing. [6]

3.7.1 ETS(A, N, N): simple exponential smoothing with additive errors
As discussed in Section 3.2, the error correction form of simple exponential smoothing is given by
    l_t = l_{t-1} + α e_t
where e_t = y_t - l_{t-1} and ŷ_{t|t-1} = l_{t-1}. Thus e_t = y_t - ŷ_{t|t-1} represents a one-step forecast error, and we can write
    y_t = l_{t-1} + e_t
To make this into an innovations state space model, all we need to do is specify the probability distribution for e_t. For a model with additive errors, we assume that the one-step forecast errors e_t are normally distributed white noise with mean 0 and variance σ², i.e. e_t = ε_t ~ NID(0, σ²). Then the equations of the model can be written as
    y_t = l_{t-1} + ε_t   (3.7.1)
    l_t = l_{t-1} + α ε_t   (3.7.2)

3.7.2 ETS(M, N, N): simple exponential smoothing with multiplicative errors
In a similar fashion, we can specify models with multiplicative errors by writing the one-step random errors as relative errors:

    ε_t = (y_t - ŷ_{t|t-1}) / ŷ_{t|t-1}
Substituting ŷ_{t|t-1} = l_{t-1} gives y_t = l_{t-1} + l_{t-1} ε_t and e_t = y_t - ŷ_{t|t-1} = l_{t-1} ε_t. Then we can write the multiplicative form of the state space model as
    y_t = l_{t-1}(1 + ε_t)
    l_t = l_{t-1}(1 + α ε_t)

3.7.3 State Space Models for Holt's Linear Method
We can now explain the ideas using Holt's linear method.

Additive error model: ETS(A, A, N)
Let µ_t = ŷ_t = l_{t-1} + b_{t-1} denote the one-step forecast of y_t, assuming we know the values of all parameters. Also let ε_t = y_t - µ_t denote the one-step forecast error at time t. From (3.3.3), and using (3.3.1) and (3.3.2), we can write
    y_t = l_{t-1} + b_{t-1} + ε_t   (3.7.3)
    l_t = l_{t-1} + b_{t-1} + α ε_t   (3.7.4)
    b_t = b_{t-1} + β*(l_t - l_{t-1} - b_{t-1})   (3.7.5)
        = b_{t-1} + αβ* ε_t   (3.7.6)
We simplify the last expression by setting β = αβ*.

Multiplicative error model: ETS(M, A, N)
A model with multiplicative errors can be derived similarly, by first setting ε_t = (y_t - µ_t)/µ_t, so

that ε_t is a relative error. Then, following a similar approach to that for additive errors, we find
    y_t = (l_{t-1} + b_{t-1})(1 + ε_t)
    l_t = (l_{t-1} + b_{t-1})(1 + α ε_t)
    b_t = b_{t-1} + β(l_{t-1} + b_{t-1}) ε_t

3.7.4 State Space Models for All Exponential Smoothing Methods
The underlying equations for the additive error models are given in Table 3.3. We use β = αβ* to simplify the notation. Multiplicative error models are obtained by replacing ε_t with ε_t µ_t in the equations of Table 3.3; the resulting multiplicative error equations are given in Table 3.4.

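The ETS(A, N, N) and ETS(M, N, N) level recursions above can be compared numerically. This illustrative sketch confirms that, with the same α and data, the two models give identical point forecasts; only their prediction intervals (not computed here) would differ:

```python
# Level recursions of ETS(A,N,N) and ETS(M,N,N) from Sections 3.7.1-3.7.2.
def ets_ann_level(y, alpha, level0):
    level = level0
    for obs in y:
        eps = obs - level                 # additive error
        level = level + alpha * eps
    return level

def ets_mnn_level(y, alpha, level0):
    level = level0
    for obs in y:
        eps = (obs - level) / level       # relative (multiplicative) error
        level = level * (1 + alpha * eps)
    return level

y = [102.0, 98.0, 105.0, 110.0, 104.0]
a = ets_ann_level(y, 0.4, level0=100.0)
m = ets_mnn_level(y, 0.4, level0=100.0)
print(a, m)                               # identical point forecasts
```

Algebraically, l_{t-1}(1 + α(y_t - l_{t-1})/l_{t-1}) = l_{t-1} + α(y_t - l_{t-1}), which is exactly why the two point forecasts coincide.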

3.8 Initialization and Estimation
In order to use these models for forecasting, we need to specify the type of model to be used (model selection), the value of x_0 (initialization), and the values of the parameters α, β, γ and φ (estimation). In this section, we discuss initialization and estimation, leaving model selection to be discussed later.

3.8.1 Initialization
The non-linear optimization requires some initial values. We use α = β = γ = 0.5 and φ = 0.9. The initial values of l_0, b_0 and s_k (k = -m + 1, ..., 0) are obtained using the following heuristic scheme.

Initial seasonal component.
1. For seasonal data, compute a 2 × m moving average through the first few years of data (we use up to four years if the data are available). Denote this by {f_t}.
2. For additive seasonality, we detrend the data to obtain y_t - f_t. For multiplicative seasonality, we detrend the data to obtain y_t/f_t. Then compute initial seasonal indices s_{-m+1}, ..., s_0 by averaging the detrended data for each season. Normalize these seasonal indices so that they add to zero for additive seasonality, and add to m for multiplicative seasonality.

Initial level component.
1. For seasonal data, compute a linear trend using linear regression on the first ten seasonally adjusted values (using the seasonal indices obtained above) against a time variable t = 1, ..., 10.
2. For nonseasonal data, compute a linear trend on the first ten observations against a time variable t = 1, ..., 10.
Then set l_0 to be the intercept of the trend.

Initial growth component.
1. For additive trend, set b_0 to be the slope of the trend.
2. For multiplicative trend, set b_0 = 1 + b/a, where a denotes the intercept and b denotes the slope of the fitted trend.
These initial states are then refined by estimating them along with the parameters, as described below.

3.8.2 Estimation and Model Selection
Let
    L*(θ, x_0) = n log( Σ_{t=1}^{n} e_t² / k²(x_{t-1}) ) + 2 Σ_{t=1}^{n} log|k(x_{t-1})|   (3.8.1)
Then L* is equal to twice the negative logarithm of the conditional likelihood function of the state space model (with constant terms eliminated). An alternative to estimating the parameters by minimizing the sum of squared errors is to maximize the likelihood. The likelihood is the probability of the data arising from the specified model, so a large likelihood is associated with a good model. For an additive error model, maximizing the likelihood gives the same results as minimizing the sum of squared errors. However, different results will be obtained for multiplicative error models. In this section, we estimate the smoothing parameters θ = (α, β, γ, φ) and initial states x_0 = (l_0, b_0, s_0, s_{-1}, ..., s_{-m+1}) by maximizing the likelihood. The possible values that the smoothing parameters can take are restricted. Traditionally, the parameters have been constrained to lie between 0 and 1 so that the equations can be interpreted as weighted averages; that is, 0 < α, β*, γ*, φ < 1. For the state space models, we have set β = αβ* and γ = (1 - α)γ*. Therefore the traditional restrictions translate to 0 < α < 1, 0 < β < α and 0 < γ < 1 - α. In practice, the damping parameter φ is usually constrained further to prevent numerical difficulties in estimating the model. A common constraint is 0.8 < φ < 0.98. Another way to view the parameters is through a consideration of the mathematical properties of the state space models. Then the parameters are constrained to prevent observations in the distant past having a continuing effect on current forecasts.
This leads to some admissibility constraints on the parameters, which are usually (but not always) less restrictive than the usual region. [11]

3.9 Measures of Error
Due to the fundamental importance of time series forecasting in many practical situations, proper care should be taken while selecting a particular model, estimating forecast accuracy, and comparing different models. Each of these measures is a function of the actual and forecast values of the time series. In each of the forthcoming definitions, y_t is the actual value, f_t is the forecast value, e_t = y_t - f_t is the forecast error, and n is the size of the test set.

Definition (Mean Absolute Error (MAE)). The Mean Absolute Error is defined as
    MAE = (1/n) Σ_{t=1}^{n} |e_t|
It measures the average absolute deviation of the forecast values from the original ones. In the MAE, the effects of positive and negative errors do not cancel out.

Definition (Mean Absolute Percentage Error (MAPE)). The Mean Absolute Percentage Error is defined as
    MAPE = (1/n) Σ_{t=1}^{n} |e_t / y_t| × 100   (3.9.1)
This measure represents the percentage of average absolute error incurred. It is independent of the scale of measurement, but affected by data transformations.

Definition (Mean Squared Error (MSE)). The Mean Squared Error is defined as

    MSE = (1/n) Σ_{t=1}^{n} e_t²
The MSE gives an overall idea of the error incurred during forecasting.

Definition (Root Mean Squared Error (RMSE)). The Root Mean Squared Error is defined as
    RMSE = √( (1/n) Σ_{t=1}^{n} e_t² )
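The four accuracy measures defined above transcribe directly into code (an illustrative sketch):

```python
import math

# Direct transcriptions of the accuracy measures defined above.
def mae(y, f):  return sum(abs(a - b) for a, b in zip(y, f)) / len(y)
def mape(y, f): return 100 * sum(abs((a - b) / a) for a, b in zip(y, f)) / len(y)
def mse(y, f):  return sum((a - b) ** 2 for a, b in zip(y, f)) / len(y)
def rmse(y, f): return math.sqrt(mse(y, f))

actual   = [10.0, 20.0, 30.0, 40.0]
forecast = [12.0, 18.0, 33.0, 39.0]
print(mae(actual, forecast), mape(actual, forecast), rmse(actual, forecast))
```

Note that the MAPE is undefined when any actual value y_t is zero, which is one reason several measures are usually reported together.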


Part II
Case Study

Chapter 4
Data Analysis Using the Box-Jenkins Method

4.1 Data Description
The data of our study are monthly observations of electricity consumption in the Khan Younis province during the period from January 2000 to December 2010. The data were taken at the end of every month. The total number of observations is 132. An overview of the data from January 2000 to December 2010 is plotted in Figure 4.1.

Figure 4.1: Time series plot of monthly electricity consumption in the Khan Younis province

Let us get a general idea about the data. Some descriptive statistics of the time series are shown in Table 4.1.

Table 4.1: Descriptive Statistics
Statistic   Value
Min
Median      11.9
Mean
Max

4.1.2 The Box-Jenkins Approach to Fitting an ARIMA Model
We can see from Figure 4.1 that there seems to be seasonal variation in the consumption figures. One way to determine more objectively whether differencing is required is to use a unit root test. These are statistical hypothesis tests of stationarity that are designed to determine whether differencing is required. A number of unit root tests are available; they are based on different assumptions and may lead to conflicting answers. One of the most popular tests is the Augmented Dickey-Fuller (ADF) test. The null hypothesis of the ADF test is that the data are non-stationary, so large p-values are indicative of non-stationarity and small p-values suggest stationarity. Another popular unit root test is the Kwiatkowski-Phillips-Schmidt-Shin (KPSS) test. This reverses the hypotheses, so the null hypothesis is that the data are stationary; in this case, small p-values (e.g., less than 0.05) suggest that differencing is required. In summary:
For the ADF test, if the p-value is less than or equal to 0.05, the process is stationary.
For the KPSS test, if the p-value is greater than 0.05, the process is stationary.

The results are shown in Table 4.2. For the KPSS test, the p-value is 0.01, which is less than 0.05; for the ADF test, the p-value equals 0.3, which is greater than 0.05.

Table 4.2: p-values of the Augmented Dickey-Fuller (ADF) and Kwiatkowski-Phillips-Schmidt-Shin (KPSS) tests for monthly electricity consumption
Test    Electricity consumption
ADF     0.3
KPSS    0.01

This result indicates that the time series of monthly electricity consumption is not stationary. Investigation is also done by examining the autocorrelation and partial autocorrelation functions, shown in Figures 4.2 and 4.3 respectively.

Figure 4.2: ACF of monthly electricity consumption

Figure 4.3: PACF of monthly electricity consumption

4.2 Model Specification
The data are clearly non-stationary, with some seasonality, so we will first take a seasonal difference. The seasonally differenced data are shown in Figure 4.4. This illustrates one way to make a time series stationary: computing the differences between consecutive observations, known as differencing. Our aim now is to find an appropriate ARIMA model based on the ACF and PACF shown in Figure 4.5.
Non-seasonal behavior: the significant spikes at lags 2 and 3 in the ACF suggest a non-seasonal MA(1) component; there are also significant spikes at lags 11, 12 and 13.
Seasonal behavior: we look at what is going on around lags 12, 24 and so on. The ACF has significant spikes at these seasonal lags, which leads to a seasonal MA(1) component.
Consequently, this initial analysis suggests that a possible model for these data is an ARIMA(0, 1, 1)(0, 1, 1)_{12}. The

AICc of the model was computed, but the residuals of this model show a significant spike at lag 36, which indicates some additional seasonal term; the AICc of the ARIMA(0, 1, 1)(0, 1, 0) model was computed as well. We tried other models with AR terms too, but none gave a smaller AICc value. Consequently, we choose the ARIMA(2, 1, 2)(1, 0, 1)_{12} model; its residual ACF and PACF are shown in Figure 4.6, and Table 4.3 displays the fitted model.

Figure 4.4: First difference of monthly electricity consumption
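The transformation used in the identification stage above (a lag-12 seasonal difference, optionally followed by a first difference) can be sketched with a toy series; this is illustrative code, not the thesis data:

```python
import numpy as np

# Sketch of the differencing implied by d = D = 1 with m = 12:
# (1 - B)(1 - B^12) y_t, i.e. a seasonal difference then a first difference.
def seasonal_then_first_difference(y, m=12):
    y = np.asarray(y, dtype=float)
    seasonal = y[m:] - y[:-m]          # (1 - B^m) y_t
    return np.diff(seasonal)           # (1 - B)(1 - B^m) y_t

# A toy series with a linear trend plus an exact monthly cycle is reduced
# to (numerically) zero, since both components are removed entirely:
t = np.arange(48)
toy = 0.5 * t + 3 * np.sin(2 * np.pi * t / 12)
d = seasonal_then_first_difference(toy, m=12)
print(np.max(np.abs(d)))
```

On real data the result is of course not zero, but differencing in this way typically removes the trend and most of the seasonal pattern, which is why the differenced ACF/PACF are used to pick the remaining MA and AR orders.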


More information

Time Series Analysis -- An Introduction -- AMS 586

Time Series Analysis -- An Introduction -- AMS 586 Time Series Analysis -- An Introduction -- AMS 586 1 Objectives of time series analysis Data description Data interpretation Modeling Control Prediction & Forecasting 2 Time-Series Data Numerical data

More information

Time Series Forecasting: A Tool for Out - Sample Model Selection and Evaluation

Time Series Forecasting: A Tool for Out - Sample Model Selection and Evaluation AMERICAN JOURNAL OF SCIENTIFIC AND INDUSTRIAL RESEARCH 214, Science Huβ, http://www.scihub.org/ajsir ISSN: 2153-649X, doi:1.5251/ajsir.214.5.6.185.194 Time Series Forecasting: A Tool for Out - Sample Model

More information

Prof. Dr. Roland Füss Lecture Series in Applied Econometrics Summer Term Introduction to Time Series Analysis

Prof. Dr. Roland Füss Lecture Series in Applied Econometrics Summer Term Introduction to Time Series Analysis Introduction to Time Series Analysis 1 Contents: I. Basics of Time Series Analysis... 4 I.1 Stationarity... 5 I.2 Autocorrelation Function... 9 I.3 Partial Autocorrelation Function (PACF)... 14 I.4 Transformation

More information

7. Forecasting with ARIMA models

7. Forecasting with ARIMA models 7. Forecasting with ARIMA models 309 Outline: Introduction The prediction equation of an ARIMA model Interpreting the predictions Variance of the predictions Forecast updating Measuring predictability

More information

Ch 6. Model Specification. Time Series Analysis

Ch 6. Model Specification. Time Series Analysis We start to build ARIMA(p,d,q) models. The subjects include: 1 how to determine p, d, q for a given series (Chapter 6); 2 how to estimate the parameters (φ s and θ s) of a specific ARIMA(p,d,q) model (Chapter

More information

Advanced Econometrics

Advanced Econometrics Advanced Econometrics Marco Sunder Nov 04 2010 Marco Sunder Advanced Econometrics 1/ 25 Contents 1 2 3 Marco Sunder Advanced Econometrics 2/ 25 Music Marco Sunder Advanced Econometrics 3/ 25 Music Marco

More information

FORECASTING SUGARCANE PRODUCTION IN INDIA WITH ARIMA MODEL

FORECASTING SUGARCANE PRODUCTION IN INDIA WITH ARIMA MODEL FORECASTING SUGARCANE PRODUCTION IN INDIA WITH ARIMA MODEL B. N. MANDAL Abstract: Yearly sugarcane production data for the period of - to - of India were analyzed by time-series methods. Autocorrelation

More information

Time Series I Time Domain Methods

Time Series I Time Domain Methods Astrostatistics Summer School Penn State University University Park, PA 16802 May 21, 2007 Overview Filtering and the Likelihood Function Time series is the study of data consisting of a sequence of DEPENDENT

More information

Forecasting using R. Rob J Hyndman. 3.2 Dynamic regression. Forecasting using R 1

Forecasting using R. Rob J Hyndman. 3.2 Dynamic regression. Forecasting using R 1 Forecasting using R Rob J Hyndman 3.2 Dynamic regression Forecasting using R 1 Outline 1 Regression with ARIMA errors 2 Stochastic and deterministic trends 3 Periodic seasonality 4 Lab session 14 5 Dynamic

More information

Applied time-series analysis

Applied time-series analysis Robert M. Kunst robert.kunst@univie.ac.at University of Vienna and Institute for Advanced Studies Vienna October 18, 2011 Outline Introduction and overview Econometric Time-Series Analysis In principle,

More information

Autoregressive Moving Average (ARMA) Models and their Practical Applications

Autoregressive Moving Average (ARMA) Models and their Practical Applications Autoregressive Moving Average (ARMA) Models and their Practical Applications Massimo Guidolin February 2018 1 Essential Concepts in Time Series Analysis 1.1 Time Series and Their Properties Time series:

More information

Circle a single answer for each multiple choice question. Your choice should be made clearly.

Circle a single answer for each multiple choice question. Your choice should be made clearly. TEST #1 STA 4853 March 4, 215 Name: Please read the following directions. DO NOT TURN THE PAGE UNTIL INSTRUCTED TO DO SO Directions This exam is closed book and closed notes. There are 31 questions. Circle

More information

Ch 9. FORECASTING. Time Series Analysis

Ch 9. FORECASTING. Time Series Analysis In this chapter, we assume the model is known exactly, and consider the calculation of forecasts and their properties for both deterministic trend models and ARIMA models. 9.1 Minimum Mean Square Error

More information

Lecture 2: Univariate Time Series

Lecture 2: Univariate Time Series Lecture 2: Univariate Time Series Analysis: Conditional and Unconditional Densities, Stationarity, ARMA Processes Prof. Massimo Guidolin 20192 Financial Econometrics Spring/Winter 2017 Overview Motivation:

More information

Dynamic Time Series Regression: A Panacea for Spurious Correlations

Dynamic Time Series Regression: A Panacea for Spurious Correlations International Journal of Scientific and Research Publications, Volume 6, Issue 10, October 2016 337 Dynamic Time Series Regression: A Panacea for Spurious Correlations Emmanuel Alphonsus Akpan *, Imoh

More information

Econometrics I: Univariate Time Series Econometrics (1)

Econometrics I: Univariate Time Series Econometrics (1) Econometrics I: Dipartimento di Economia Politica e Metodi Quantitativi University of Pavia Overview of the Lecture 1 st EViews Session VI: Some Theoretical Premises 2 Overview of the Lecture 1 st EViews

More information

Financial Time Series Analysis: Part II

Financial Time Series Analysis: Part II Department of Mathematics and Statistics, University of Vaasa, Finland Spring 2017 1 Unit root Deterministic trend Stochastic trend Testing for unit root ADF-test (Augmented Dickey-Fuller test) Testing

More information

Econometrics II Heij et al. Chapter 7.1

Econometrics II Heij et al. Chapter 7.1 Chapter 7.1 p. 1/2 Econometrics II Heij et al. Chapter 7.1 Linear Time Series Models for Stationary data Marius Ooms Tinbergen Institute Amsterdam Chapter 7.1 p. 2/2 Program Introduction Modelling philosophy

More information

MODELING INFLATION RATES IN NIGERIA: BOX-JENKINS APPROACH. I. U. Moffat and A. E. David Department of Mathematics & Statistics, University of Uyo, Uyo

MODELING INFLATION RATES IN NIGERIA: BOX-JENKINS APPROACH. I. U. Moffat and A. E. David Department of Mathematics & Statistics, University of Uyo, Uyo Vol.4, No.2, pp.2-27, April 216 MODELING INFLATION RATES IN NIGERIA: BOX-JENKINS APPROACH I. U. Moffat and A. E. David Department of Mathematics & Statistics, University of Uyo, Uyo ABSTRACT: This study

More information

Lesson 13: Box-Jenkins Modeling Strategy for building ARMA models

Lesson 13: Box-Jenkins Modeling Strategy for building ARMA models Lesson 13: Box-Jenkins Modeling Strategy for building ARMA models Facoltà di Economia Università dell Aquila umberto.triacca@gmail.com Introduction In this lesson we present a method to construct an ARMA(p,

More information

Univariate Time Series Analysis; ARIMA Models

Univariate Time Series Analysis; ARIMA Models Econometrics 2 Fall 24 Univariate Time Series Analysis; ARIMA Models Heino Bohn Nielsen of4 Outline of the Lecture () Introduction to univariate time series analysis. (2) Stationarity. (3) Characterizing

More information

STAT 443 (Winter ) Forecasting

STAT 443 (Winter ) Forecasting Winter 2014 TABLE OF CONTENTS STAT 443 (Winter 2014-1141) Forecasting Prof R Ramezan University of Waterloo L A TEXer: W KONG http://wwkonggithubio Last Revision: September 3, 2014 Table of Contents 1

More information

Time Series Analysis. James D. Hamilton PRINCETON UNIVERSITY PRESS PRINCETON, NEW JERSEY

Time Series Analysis. James D. Hamilton PRINCETON UNIVERSITY PRESS PRINCETON, NEW JERSEY Time Series Analysis James D. Hamilton PRINCETON UNIVERSITY PRESS PRINCETON, NEW JERSEY & Contents PREFACE xiii 1 1.1. 1.2. Difference Equations First-Order Difference Equations 1 /?th-order Difference

More information

Time Series: Theory and Methods

Time Series: Theory and Methods Peter J. Brockwell Richard A. Davis Time Series: Theory and Methods Second Edition With 124 Illustrations Springer Contents Preface to the Second Edition Preface to the First Edition vn ix CHAPTER 1 Stationary

More information

Topic 4 Unit Roots. Gerald P. Dwyer. February Clemson University

Topic 4 Unit Roots. Gerald P. Dwyer. February Clemson University Topic 4 Unit Roots Gerald P. Dwyer Clemson University February 2016 Outline 1 Unit Roots Introduction Trend and Difference Stationary Autocorrelations of Series That Have Deterministic or Stochastic Trends

More information

Lesson 2: Analysis of time series

Lesson 2: Analysis of time series Lesson 2: Analysis of time series Time series Main aims of time series analysis choosing right model statistical testing forecast driving and optimalisation Problems in analysis of time series time problems

More information

Part 1. Multiple Choice (50 questions, 1 point each) Part 2. Problems/Short Answer (10 questions, 5 points each)

Part 1. Multiple Choice (50 questions, 1 point each) Part 2. Problems/Short Answer (10 questions, 5 points each) GROUND RULES: This exam contains two parts: Part 1. Multiple Choice (50 questions, 1 point each) Part 2. Problems/Short Answer (10 questions, 5 points each) The maximum number of points on this exam is

More information

STAT Financial Time Series

STAT Financial Time Series STAT 6104 - Financial Time Series Chapter 4 - Estimation in the time Domain Chun Yip Yau (CUHK) STAT 6104:Financial Time Series 1 / 46 Agenda 1 Introduction 2 Moment Estimates 3 Autoregressive Models (AR

More information

Ch 5. Models for Nonstationary Time Series. Time Series Analysis

Ch 5. Models for Nonstationary Time Series. Time Series Analysis We have studied some deterministic and some stationary trend models. However, many time series data cannot be modeled in either way. Ex. The data set oil.price displays an increasing variation from the

More information

Some Time-Series Models

Some Time-Series Models Some Time-Series Models Outline 1. Stochastic processes and their properties 2. Stationary processes 3. Some properties of the autocorrelation function 4. Some useful models Purely random processes, random

More information

Ross Bettinger, Analytical Consultant, Seattle, WA

Ross Bettinger, Analytical Consultant, Seattle, WA ABSTRACT DYNAMIC REGRESSION IN ARIMA MODELING Ross Bettinger, Analytical Consultant, Seattle, WA Box-Jenkins time series models that contain exogenous predictor variables are called dynamic regression

More information

Time Series 2. Robert Almgren. Sept. 21, 2009

Time Series 2. Robert Almgren. Sept. 21, 2009 Time Series 2 Robert Almgren Sept. 21, 2009 This week we will talk about linear time series models: AR, MA, ARMA, ARIMA, etc. First we will talk about theory and after we will talk about fitting the models

More information

Covariances of ARMA Processes

Covariances of ARMA Processes Statistics 910, #10 1 Overview Covariances of ARMA Processes 1. Review ARMA models: causality and invertibility 2. AR covariance functions 3. MA and ARMA covariance functions 4. Partial autocorrelation

More information

Time Series Analysis. James D. Hamilton PRINCETON UNIVERSITY PRESS PRINCETON, NEW JERSEY

Time Series Analysis. James D. Hamilton PRINCETON UNIVERSITY PRESS PRINCETON, NEW JERSEY Time Series Analysis James D. Hamilton PRINCETON UNIVERSITY PRESS PRINCETON, NEW JERSEY PREFACE xiii 1 Difference Equations 1.1. First-Order Difference Equations 1 1.2. pth-order Difference Equations 7

More information

6 NONSEASONAL BOX-JENKINS MODELS

6 NONSEASONAL BOX-JENKINS MODELS 6 NONSEASONAL BOX-JENKINS MODELS In this section, we will discuss a class of models for describing time series commonly referred to as Box-Jenkins models. There are two types of Box-Jenkins models, seasonal

More information

Introduction to ARMA and GARCH processes

Introduction to ARMA and GARCH processes Introduction to ARMA and GARCH processes Fulvio Corsi SNS Pisa 3 March 2010 Fulvio Corsi Introduction to ARMA () and GARCH processes SNS Pisa 3 March 2010 1 / 24 Stationarity Strict stationarity: (X 1,

More information

STAT 520: Forecasting and Time Series. David B. Hitchcock University of South Carolina Department of Statistics

STAT 520: Forecasting and Time Series. David B. Hitchcock University of South Carolina Department of Statistics David B. University of South Carolina Department of Statistics What are Time Series Data? Time series data are collected sequentially over time. Some common examples include: 1. Meteorological data (temperatures,

More information

A stochastic modeling for paddy production in Tamilnadu

A stochastic modeling for paddy production in Tamilnadu 2017; 2(5): 14-21 ISSN: 2456-1452 Maths 2017; 2(5): 14-21 2017 Stats & Maths www.mathsjournal.com Received: 04-07-2017 Accepted: 05-08-2017 M Saranyadevi Assistant Professor (GUEST), Department of Statistics,

More information

Part III Example Sheet 1 - Solutions YC/Lent 2015 Comments and corrections should be ed to

Part III Example Sheet 1 - Solutions YC/Lent 2015 Comments and corrections should be  ed to TIME SERIES Part III Example Sheet 1 - Solutions YC/Lent 2015 Comments and corrections should be emailed to Y.Chen@statslab.cam.ac.uk. 1. Let {X t } be a weakly stationary process with mean zero and let

More information

EASTERN MEDITERRANEAN UNIVERSITY ECON 604, FALL 2007 DEPARTMENT OF ECONOMICS MEHMET BALCILAR ARIMA MODELS: IDENTIFICATION

EASTERN MEDITERRANEAN UNIVERSITY ECON 604, FALL 2007 DEPARTMENT OF ECONOMICS MEHMET BALCILAR ARIMA MODELS: IDENTIFICATION ARIMA MODELS: IDENTIFICATION A. Autocorrelations and Partial Autocorrelations 1. Summary of What We Know So Far: a) Series y t is to be modeled by Box-Jenkins methods. The first step was to convert y t

More information

Using Analysis of Time Series to Forecast numbers of The Patients with Malignant Tumors in Anbar Provinc

Using Analysis of Time Series to Forecast numbers of The Patients with Malignant Tumors in Anbar Provinc Using Analysis of Time Series to Forecast numbers of The Patients with Malignant Tumors in Anbar Provinc /. ) ( ) / (Box & Jenkins).(.(2010-2006) ARIMA(2,1,0). Abstract: The aim of this research is to

More information

Ch. 14 Stationary ARMA Process

Ch. 14 Stationary ARMA Process Ch. 14 Stationary ARMA Process A general linear stochastic model is described that suppose a time series to be generated by a linear aggregation of random shock. For practical representation it is desirable

More information

Time Series Outlier Detection

Time Series Outlier Detection Time Series Outlier Detection Tingyi Zhu July 28, 2016 Tingyi Zhu Time Series Outlier Detection July 28, 2016 1 / 42 Outline Time Series Basics Outliers Detection in Single Time Series Outlier Series Detection

More information

The Identification of ARIMA Models

The Identification of ARIMA Models APPENDIX 4 The Identification of ARIMA Models As we have established in a previous lecture, there is a one-to-one correspondence between the parameters of an ARMA(p, q) model, including the variance of

More information

Econometrics of financial markets, -solutions to seminar 1. Problem 1

Econometrics of financial markets, -solutions to seminar 1. Problem 1 Econometrics of financial markets, -solutions to seminar 1. Problem 1 a) Estimate with OLS. For any regression y i α + βx i + u i for OLS to be unbiased we need cov (u i,x j )0 i, j. For the autoregressive

More information

Autoregressive Integrated Moving Average Model to Predict Graduate Unemployment in Indonesia

Autoregressive Integrated Moving Average Model to Predict Graduate Unemployment in Indonesia DOI 10.1515/ptse-2017-0005 PTSE 12 (1): 43-50 Autoregressive Integrated Moving Average Model to Predict Graduate Unemployment in Indonesia Umi MAHMUDAH u_mudah@yahoo.com (State Islamic University of Pekalongan,

More information

Automatic seasonal auto regressive moving average models and unit root test detection

Automatic seasonal auto regressive moving average models and unit root test detection ISSN 1750-9653, England, UK International Journal of Management Science and Engineering Management Vol. 3 (2008) No. 4, pp. 266-274 Automatic seasonal auto regressive moving average models and unit root

More information

Ch 4. Models For Stationary Time Series. Time Series Analysis

Ch 4. Models For Stationary Time Series. Time Series Analysis This chapter discusses the basic concept of a broad class of stationary parametric time series models the autoregressive moving average (ARMA) models. Let {Y t } denote the observed time series, and {e

More information

E 4101/5101 Lecture 6: Spectral analysis

E 4101/5101 Lecture 6: Spectral analysis E 4101/5101 Lecture 6: Spectral analysis Ragnar Nymoen 3 March 2011 References to this lecture Hamilton Ch 6 Lecture note (on web page) For stationary variables/processes there is a close correspondence

More information

Forecasting using R. Rob J Hyndman. 2.5 Seasonal ARIMA models. Forecasting using R 1

Forecasting using R. Rob J Hyndman. 2.5 Seasonal ARIMA models. Forecasting using R 1 Forecasting using R Rob J Hyndman 2.5 Seasonal ARIMA models Forecasting using R 1 Outline 1 Backshift notation reviewed 2 Seasonal ARIMA models 3 ARIMA vs ETS 4 Lab session 12 Forecasting using R Backshift

More information

Circle the single best answer for each multiple choice question. Your choice should be made clearly.

Circle the single best answer for each multiple choice question. Your choice should be made clearly. TEST #1 STA 4853 March 6, 2017 Name: Please read the following directions. DO NOT TURN THE PAGE UNTIL INSTRUCTED TO DO SO Directions This exam is closed book and closed notes. There are 32 multiple choice

More information

Exercises - Time series analysis

Exercises - Time series analysis Descriptive analysis of a time series (1) Estimate the trend of the series of gasoline consumption in Spain using a straight line in the period from 1945 to 1995 and generate forecasts for 24 months. Compare

More information

STOR 356: Summary Course Notes

STOR 356: Summary Course Notes STOR 356: Summary Course Notes Richard L. Smith Department of Statistics and Operations Research University of North Carolina Chapel Hill, NC 7599-360 rls@email.unc.edu February 19, 008 Course text: Introduction

More information

IDENTIFICATION OF ARMA MODELS

IDENTIFICATION OF ARMA MODELS IDENTIFICATION OF ARMA MODELS A stationary stochastic process can be characterised, equivalently, by its autocovariance function or its partial autocovariance function. It can also be characterised by

More information

Time Series Analysis of United States of America Crude Oil and Petroleum Products Importations from Saudi Arabia

Time Series Analysis of United States of America Crude Oil and Petroleum Products Importations from Saudi Arabia International Journal of Applied Science and Technology Vol. 5, No. 5; October 2015 Time Series Analysis of United States of America Crude Oil and Petroleum Products Importations from Saudi Arabia Olayan

More information

ESSE Mid-Term Test 2017 Tuesday 17 October :30-09:45

ESSE Mid-Term Test 2017 Tuesday 17 October :30-09:45 ESSE 4020 3.0 - Mid-Term Test 207 Tuesday 7 October 207. 08:30-09:45 Symbols have their usual meanings. All questions are worth 0 marks, although some are more difficult than others. Answer as many questions

More information

TIME SERIES ANALYSIS. Forecasting and Control. Wiley. Fifth Edition GWILYM M. JENKINS GEORGE E. P. BOX GREGORY C. REINSEL GRETA M.

TIME SERIES ANALYSIS. Forecasting and Control. Wiley. Fifth Edition GWILYM M. JENKINS GEORGE E. P. BOX GREGORY C. REINSEL GRETA M. TIME SERIES ANALYSIS Forecasting and Control Fifth Edition GEORGE E. P. BOX GWILYM M. JENKINS GREGORY C. REINSEL GRETA M. LJUNG Wiley CONTENTS PREFACE TO THE FIFTH EDITION PREFACE TO THE FOURTH EDITION

More information

Econ 424 Time Series Concepts

Econ 424 Time Series Concepts Econ 424 Time Series Concepts Eric Zivot January 20 2015 Time Series Processes Stochastic (Random) Process { 1 2 +1 } = { } = sequence of random variables indexed by time Observed time series of length

More information

Non-Stationary Time Series and Unit Root Testing

Non-Stationary Time Series and Unit Root Testing Econometrics II Non-Stationary Time Series and Unit Root Testing Morten Nyboe Tabor Course Outline: Non-Stationary Time Series and Unit Root Testing 1 Stationarity and Deviation from Stationarity Trend-Stationarity

More information

3 Theory of stationary random processes

3 Theory of stationary random processes 3 Theory of stationary random processes 3.1 Linear filters and the General linear process A filter is a transformation of one random sequence {U t } into another, {Y t }. A linear filter is a transformation

More information

1 Linear Difference Equations

1 Linear Difference Equations ARMA Handout Jialin Yu 1 Linear Difference Equations First order systems Let {ε t } t=1 denote an input sequence and {y t} t=1 sequence generated by denote an output y t = φy t 1 + ε t t = 1, 2,... with

More information

Non-Stationary Time Series and Unit Root Testing

Non-Stationary Time Series and Unit Root Testing Econometrics II Non-Stationary Time Series and Unit Root Testing Morten Nyboe Tabor Course Outline: Non-Stationary Time Series and Unit Root Testing 1 Stationarity and Deviation from Stationarity Trend-Stationarity

More information

Marcel Dettling. Applied Time Series Analysis SS 2013 Week 05. ETH Zürich, March 18, Institute for Data Analysis and Process Design

Marcel Dettling. Applied Time Series Analysis SS 2013 Week 05. ETH Zürich, March 18, Institute for Data Analysis and Process Design Marcel Dettling Institute for Data Analysis and Process Design Zurich University of Applied Sciences marcel.dettling@zhaw.ch http://stat.ethz.ch/~dettling ETH Zürich, March 18, 2013 1 Basics of Modeling

More information

Class 1: Stationary Time Series Analysis

Class 1: Stationary Time Series Analysis Class 1: Stationary Time Series Analysis Macroeconometrics - Fall 2009 Jacek Suda, BdF and PSE February 28, 2011 Outline Outline: 1 Covariance-Stationary Processes 2 Wold Decomposition Theorem 3 ARMA Models

More information

Comparing the Univariate Modeling Techniques, Box-Jenkins and Artificial Neural Network (ANN) for Measuring of Climate Index

Comparing the Univariate Modeling Techniques, Box-Jenkins and Artificial Neural Network (ANN) for Measuring of Climate Index Applied Mathematical Sciences, Vol. 8, 2014, no. 32, 1557-1568 HIKARI Ltd, www.m-hikari.com http://dx.doi.org/10.12988/ams.2014.4150 Comparing the Univariate Modeling Techniques, Box-Jenkins and Artificial

More information

Improved Holt Method for Irregular Time Series

Improved Holt Method for Irregular Time Series WDS'08 Proceedings of Contributed Papers, Part I, 62 67, 2008. ISBN 978-80-7378-065-4 MATFYZPRESS Improved Holt Method for Irregular Time Series T. Hanzák Charles University, Faculty of Mathematics and

More information

Time Series Models and Inference. James L. Powell Department of Economics University of California, Berkeley

Time Series Models and Inference. James L. Powell Department of Economics University of California, Berkeley Time Series Models and Inference James L. Powell Department of Economics University of California, Berkeley Overview In contrast to the classical linear regression model, in which the components of the

More information

Statistics of stochastic processes

Statistics of stochastic processes Introduction Statistics of stochastic processes Generally statistics is performed on observations y 1,..., y n assumed to be realizations of independent random variables Y 1,..., Y n. 14 settembre 2014

More information

AR, MA and ARMA models

AR, MA and ARMA models AR, MA and AR by Hedibert Lopes P Based on Tsay s Analysis of Financial Time Series (3rd edition) P 1 Stationarity 2 3 4 5 6 7 P 8 9 10 11 Outline P Linear Time Series Analysis and Its Applications For

More information

Seasonal Autoregressive Integrated Moving Average Model for Precipitation Time Series

Seasonal Autoregressive Integrated Moving Average Model for Precipitation Time Series Journal of Mathematics and Statistics 8 (4): 500-505, 2012 ISSN 1549-3644 2012 doi:10.3844/jmssp.2012.500.505 Published Online 8 (4) 2012 (http://www.thescipub.com/jmss.toc) Seasonal Autoregressive Integrated

More information

Econometría 2: Análisis de series de Tiempo

Econometría 2: Análisis de series de Tiempo Econometría 2: Análisis de series de Tiempo Karoll GOMEZ kgomezp@unal.edu.co http://karollgomez.wordpress.com Segundo semestre 2016 IX. Vector Time Series Models VARMA Models A. 1. Motivation: The vector

More information

This note introduces some key concepts in time series econometrics. First, we

This note introduces some key concepts in time series econometrics. First, we INTRODUCTION TO TIME SERIES Econometrics 2 Heino Bohn Nielsen September, 2005 This note introduces some key concepts in time series econometrics. First, we present by means of examples some characteristic

More information

Evaluation of Some Techniques for Forecasting of Electricity Demand in Sri Lanka

Evaluation of Some Techniques for Forecasting of Electricity Demand in Sri Lanka Appeared in Sri Lankan Journal of Applied Statistics (Volume 3) 00 Evaluation of Some echniques for Forecasting of Electricity Demand in Sri Lanka.M.J. A. Cooray and M.Indralingam Department of Mathematics

More information

Read Section 1.1, Examples of time series, on pages 1-8. These example introduce the book; you are not tested on them.

Read Section 1.1, Examples of time series, on pages 1-8. These example introduce the book; you are not tested on them. TS Module 1 Time series overview (The attached PDF file has better formatting.)! Model building! Time series plots Read Section 1.1, Examples of time series, on pages 1-8. These example introduce the book;

More information

A SEASONAL TIME SERIES MODEL FOR NIGERIAN MONTHLY AIR TRAFFIC DATA

A SEASONAL TIME SERIES MODEL FOR NIGERIAN MONTHLY AIR TRAFFIC DATA www.arpapress.com/volumes/vol14issue3/ijrras_14_3_14.pdf A SEASONAL TIME SERIES MODEL FOR NIGERIAN MONTHLY AIR TRAFFIC DATA Ette Harrison Etuk Department of Mathematics/Computer Science, Rivers State University

More information

Classical Decomposition Model Revisited: I

Classical Decomposition Model Revisited: I Classical Decomposition Model Revisited: I recall classical decomposition model for time series Y t, namely, Y t = m t + s t + W t, where m t is trend; s t is periodic with known period s (i.e., s t s

More information

Ch 8. MODEL DIAGNOSTICS. Time Series Analysis

Ch 8. MODEL DIAGNOSTICS. Time Series Analysis Model diagnostics is concerned with testing the goodness of fit of a model and, if the fit is poor, suggesting appropriate modifications. We shall present two complementary approaches: analysis of residuals

More information

EXAMINATIONS OF THE ROYAL STATISTICAL SOCIETY

EXAMINATIONS OF THE ROYAL STATISTICAL SOCIETY EXAMINATIONS OF THE ROYAL STATISTICAL SOCIETY GRADUATE DIPLOMA, 011 MODULE 3 : Stochastic processes and time series Time allowed: Three Hours Candidates should answer FIVE questions. All questions carry

More information

Time-series Forecasting - Exponential Smoothing

Time-series Forecasting - Exponential Smoothing Time-series Forecasting - Exponential Smoothing Dr. Josif Grabocka ISMLL, University of Hildesheim Business Analytics Business Analytics 1 / 26 Motivation Extreme recentivism: The Naive forecast considers

More information

Chapter 9: Forecasting

Chapter 9: Forecasting Chapter 9: Forecasting One of the critical goals of time series analysis is to forecast (predict) the values of the time series at times in the future. When forecasting, we ideally should evaluate the

More information

Econometric Forecasting

Econometric Forecasting Robert M. Kunst robert.kunst@univie.ac.at University of Vienna and Institute for Advanced Studies Vienna October 1, 2014 Outline Introduction Model-free extrapolation Univariate time-series models Trend

More information

Minitab Project Report - Assignment 6

Minitab Project Report - Assignment 6 .. Sunspot data Minitab Project Report - Assignment Time Series Plot of y Time Series Plot of X y X 7 9 7 9 The data have a wavy pattern. However, they do not show any seasonality. There seem to be an

More information

3. ARMA Modeling. Now: Important class of stationary processes

3. ARMA Modeling. Now: Important class of stationary processes 3. ARMA Modeling Now: Important class of stationary processes Definition 3.1: (ARMA(p, q) process) Let {ɛ t } t Z WN(0, σ 2 ) be a white noise process. The process {X t } t Z is called AutoRegressive-Moving-Average

More information

Problem Set 2: Box-Jenkins methodology

Problem Set 2: Box-Jenkins methodology Problem Set : Box-Jenkins methodology 1) For an AR1) process we have: γ0) = σ ε 1 φ σ ε γ0) = 1 φ Hence, For a MA1) process, p lim R = φ γ0) = 1 + θ )σ ε σ ε 1 = γ0) 1 + θ Therefore, p lim R = 1 1 1 +

More information

FORECASTING COARSE RICE PRICES IN BANGLADESH

FORECASTING COARSE RICE PRICES IN BANGLADESH Progress. Agric. 22(1 & 2): 193 201, 2011 ISSN 1017-8139 FORECASTING COARSE RICE PRICES IN BANGLADESH M. F. Hassan*, M. A. Islam 1, M. F. Imam 2 and S. M. Sayem 3 Department of Agricultural Statistics,

More information