5 Time series

A time series is a set of observations recorded over time. Think, for example, of the GDP of a country over the years (or quarters), or of hourly temperature measurements over a month. Time series present a challenge for statistical analysis because of the correlation introduced by sampling adjacent points in time.

For ease of exposition, we assume that the time series is observed over a discrete and equally spaced set of times $T = \{t_1, \dots, t_n\}$, and that $Y_t$ is the random variable that generates the observation $y_t$ at time $t$. The objective of time series analysis is to understand the process that generates the sample time series and to predict (forecast) the response variable at future times.

A complete description of the time series $\{Y_t, t \in T\}$ (intended as the random process generating the data) is given by the joint distribution function of $(Y_{t_1}, \dots, Y_{t_n})$. In practice, some modelling assumptions on the form of this joint distribution will be needed.

An important object in time series analysis is the autocovariance function
\[
\gamma_Y(s, t) = \operatorname{cov}(Y_s, Y_t) = E[(Y_s - E[Y_s])(Y_t - E[Y_t])].
\]
The autocovariance measures the linear dependence between the response at two different times. It is often more convenient to consider instead the autocorrelation function (ACF)
\[
\rho_Y(s, t) = \frac{\gamma_Y(s, t)}{\sqrt{\gamma_Y(s, s)\,\gamma_Y(t, t)}},
\]
which is the correlation between $Y_s$ and $Y_t$. Note that this quantity is well defined only if both $Y_s$ and $Y_t$ have finite variance.

5.1 Stationary time series

Without any assumptions on the process generating the time series, it would be impossible to carry out any statistical analysis, because we observe only one replicate of the random vector $(Y_{t_1}, \dots, Y_{t_n})$. However, in many applications there is some regularity (or smoothness) in the underlying process, which allows us to borrow information across the time series to investigate the characteristics of the process. An important class of such processes are stationary time series.

A time series is called strictly stationary if the joint distribution of every collection $(Y_{t_1}, \dots, Y_{t_k})$ is equal to the joint distribution of the time-shifted collection $(Y_{t_1+h}, \dots, Y_{t_k+h})$, for any $h \in \mathbb{Z}$ such that $t_1 + h \geq t_1$ and $t_k + h \leq t_n$.
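As an aside on these definitions: the autocovariance $\gamma_Y(s, t)$ is a property of the generating process, not of a single observed series, so without further assumptions estimating it would require many independent replicates of the whole series. A minimal numpy sketch of this idea (the toy process, seed and sample sizes are illustrative choices, not from the notes):

```python
import numpy as np

rng = np.random.default_rng(0)

def simulate_replicates(n_rep=10000, n_time=50):
    """Many independent replicates of a toy process
    Y_t = cos(2*pi*t/10) + W_t, with W_t standard Gaussian noise."""
    t = np.arange(n_time)
    return np.cos(2 * np.pi * t / 10) + rng.standard_normal((n_rep, n_time))

def autocov(Y, s, t):
    """Monte Carlo estimate of gamma_Y(s, t), averaging over replicates."""
    return np.mean((Y[:, s] - Y[:, s].mean()) * (Y[:, t] - Y[:, t].mean()))

Y = simulate_replicates()
print(autocov(Y, 3, 3))  # close to Var(Y_3) = 1
print(autocov(Y, 3, 8))  # close to 0: the noise is uncorrelated across times
```

With only one replicate per time point this estimator is unavailable; stationarity is what lets us average over time instead.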

Strict stationarity implies that $Y_s$ and $Y_t$ are identically distributed for any $s, t$. Moreover, if the means $E[Y_s]$ and $E[Y_t]$ exist, they are the same, and the autocovariance function satisfies $\gamma_Y(s, t) = \gamma_Y(s + h, t + h)$ (check this!). Therefore the covariance between two time points depends only on their time shift $h$.

Since checking strict stationarity for a time series is very difficult, and it is often too strong an assumption, we formulate a weaker definition. A time series is called weakly stationary (or simply stationary) if $Y_t$ has finite variance for all $t$ and

1. the mean $E[Y_t]$ is constant for all $t$, and
2. the autocovariance function $\gamma_Y(s, t)$ depends on $s$ and $t$ only through their difference $h = s - t$.

Strict stationarity implies (weak) stationarity, but the converse is not true.

We may also simplify the notation for the autocovariance function, since it depends only on the time shift (or lag) $h$. Let $s = t + h$; then
\[
\gamma_Y(t + h, t) = \operatorname{cov}(Y_{t+h}, Y_t) = \operatorname{cov}(Y_h, Y_0) = \gamma_Y(h, 0) = \gamma_Y(h),
\]
with a little abuse of notation in the last equality.

Example 5.1. (White noise) A simple kind of generating process for a time series is a collection of uncorrelated random variables $W_t$, all with mean 0 and finite variance $\sigma^2$. In engineering, this process is usually called white noise. White noise is a (weakly) stationary process, because $\operatorname{Var}(W_t) = \sigma^2$ and $E[W_t] = 0$ for all $t$, and
\[
\gamma_W(s, t) = \gamma_W(h) =
\begin{cases}
\sigma^2 & h = 0 \\
0 & h \neq 0.
\end{cases}
\]
If the $W_t$ are Gaussian random variables for all $t$ (Gaussian white noise), then the series is also strictly stationary.

Example 5.2. (Signal plus noise) In some cases, data can be modelled as an underlying deterministic signal corrupted by additive noise, for example
\[
Y_t = \cos(2\pi t / 10) + W_t,
\]
with $t = 1, 2, \dots, 100$ and $W_t$ the white noise described in Example 5.1. In this case $Y_t$ is not stationary, because $E[Y_t] = \cos(2\pi t / 10)$ is not constant in time.

If the time series is stationary, we can estimate the constant mean $\mu = E[Y_t]$ with the sample mean
\[
\hat{\mu} = \frac{1}{n} \sum_{t=t_1}^{t_n} Y_t,
\]
which is an unbiased estimator for $\mu$.
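A quick simulation of Examples 5.1 and 5.2 illustrates why the sample mean is only meaningful under stationarity (a sketch; $\sigma^2 = 1$ and the seed are arbitrary choices):

```python
import numpy as np

rng = np.random.default_rng(1)
t = np.arange(1, 101)

W = rng.standard_normal(100)            # white noise (Example 5.1), sigma^2 = 1
Y = np.cos(2 * np.pi * t / 10) + W      # signal plus noise (Example 5.2)

print(W.mean())   # close to the constant mean E[W_t] = 0
# For Y the mean E[Y_t] = cos(2*pi*t/10) changes with t, so the overall
# sample mean averages over different means and estimates no single mu:
print(Y.mean())
```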

The autocovariance function at lag $h$ can similarly be estimated by the sample autocovariance
\[
\hat{\gamma}_Y(h) = \frac{1}{n} \sum_{t=t_1}^{t_n - h} (Y_{t+h} - \hat{\mu})(Y_t - \hat{\mu}),
\]
with $\hat{\gamma}_Y(-h) = \hat{\gamma}_Y(h)$ and $h = 0, 1, \dots, n-1$. Note that the larger the lag, the fewer observations are available to estimate the autocovariance. The sample autocorrelation function is then defined as
\[
\hat{\rho}_Y(h) = \frac{\hat{\gamma}_Y(h)}{\hat{\gamma}_Y(0)}.
\]

Remark 5.3. When the time series is a white noise, the sample ACF $\hat{\rho}_W(h)$ is asymptotically normal with zero mean and variance $1/n$, for $h = 1, 2, \dots, H$. This is useful to test whether the time series at hand (or the residuals after some modelling) is indeed generated by a white noise.
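The sample ACF and the white-noise band implied by Remark 5.3 can be computed directly. A minimal numpy sketch (the function name and the simulated example are our own; the autocovariance uses the $1/n$ convention of the notes):

```python
import numpy as np

def sample_acf(y, max_lag):
    """Sample ACF rho_hat(h) for h = 0, ..., max_lag."""
    y = np.asarray(y, dtype=float)
    n = len(y)
    mu = y.mean()
    gamma = np.array([np.sum((y[h:] - mu) * (y[:n - h] - mu)) / n
                      for h in range(max_lag + 1)])
    return gamma / gamma[0]

rng = np.random.default_rng(2)
w = rng.standard_normal(500)
rho = sample_acf(w, 20)

# Under the white-noise hypothesis, rho_hat(h) ~ N(0, 1/n) for h >= 1, so
# values outside +/- 1.96/sqrt(n) are evidence against white noise.
band = 1.96 / np.sqrt(len(w))
print(np.sum(np.abs(rho[1:]) > band))  # expect roughly 5% of the 20 lags
```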

5.1.1 AR models

In classical linear models, the response variable is influenced only by the corresponding independent variables (predictors), plus independent errors. In the time series case, we may need to allow the response variable at time $t$ to depend also on its past values. Autoregressive (AR) models are based on the idea that the current value of the series, $Y_t$, can be expressed as a function of the $p$ past values $Y_{t-1}, \dots, Y_{t-p}$.

An autoregressive model of order $p$, also called AR($p$), is of the form
\[
Y_t = \alpha + \sum_{h=1}^{p} \beta_h Y_{t-h} + W_t,
\]
where $\alpha$ and $\beta_h$, $h = 1, \dots, p$, are unknown coefficients, $W_t$ is a white noise process with variance $\sigma^2$ and $Y_t$ is a stationary process.

Let us consider for simplicity an AR(1) process with $\alpha = 0$,
\[
Y_t = \beta_1 Y_{t-1} + W_t,
\]
and investigate its properties. We can rewrite it as
\[
Y_t = \beta_1 Y_{t-1} + W_t = \beta_1(\beta_1 Y_{t-2} + W_{t-1}) + W_t = \beta_1^2 Y_{t-2} + \beta_1 W_{t-1} + W_t = \dots = \beta_1^k Y_{t-k} + \sum_{j=0}^{k-1} \beta_1^j W_{t-j}.
\]
We can therefore represent the AR(1) process as
\[
Y_t = \sum_{j=0}^{+\infty} \beta_1^j W_{t-j},
\]
with the right-hand side well defined when $|\beta_1| < 1$. If $|\beta_1| \geq 1$, it is possible to write an equivalent expression with a combination of future values of the white noise. For this reason, the case $|\beta_1| \geq 1$ is called non-causal, and it is not usually relevant for real-life time series.

When $|\beta_1| < 1$, the AR(1) process is causal, with mean
\[
E[Y_t] = \sum_{j=0}^{+\infty} \beta_1^j E[W_{t-j}] = 0
\]
and, for $h \geq 0$, autocovariance function
\[
\gamma_Y(h) = \operatorname{cov}(Y_t, Y_{t+h}) = E\Big[\Big(\sum_{j=0}^{\infty} \beta_1^j W_{t-j}\Big)\Big(\sum_{k=0}^{\infty} \beta_1^k W_{t+h-k}\Big)\Big] = \sigma^2 \sum_{j=0}^{\infty} \beta_1^j \beta_1^{h+j} = \sigma^2 \beta_1^h \sum_{j=0}^{\infty} \beta_1^{2j} = \frac{\sigma^2 \beta_1^h}{1 - \beta_1^2}.
\]
As a consequence, the ACF of an AR(1) process is
\[
\rho_Y(h) = \gamma_Y(h) / \gamma_Y(0) = \beta_1^h, \qquad h \geq 0.
\]
We can then inspect the sample ACF of a time series to see whether it is consistent with that of an AR(1) model. You can check that, if $Y_t$ is a general AR($p$) process, $E[Y_t] = \alpha / (1 - \sum_{j=1}^{p} \beta_j)$.
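To see the formula $\rho_Y(h) = \beta_1^h$ at work, one can simulate a causal AR(1) and compare the sample ACF with the theoretical one. A self-contained sketch ($\beta_1 = 0.7$, the burn-in length and the seed are arbitrary choices):

```python
import numpy as np

rng = np.random.default_rng(3)
beta1, n, burn = 0.7, 2000, 200

# Simulate Y_t = beta1 * Y_{t-1} + W_t, discarding a burn-in so the
# series is approximately a draw from the stationary process.
w = rng.standard_normal(n + burn)
y = np.zeros(n + burn)
for t in range(1, n + burn):
    y[t] = beta1 * y[t - 1] + w[t]
y = y[burn:]

mu = y.mean()
gamma = np.array([np.sum((y[h:] - mu) * (y[:n - h] - mu)) / n
                  for h in range(6)])
print(gamma / gamma[0])          # sample ACF at lags 0..5
print(beta1 ** np.arange(6))     # theoretical ACF beta1^h
```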

5.1.2 MA models

Moving average (MA) models assume instead that the time series is generated by a combination of white noise terms.

A moving average model of order $q$, also called MA($q$), is of the form
\[
Y_t = W_t + \sum_{h=1}^{q} \theta_h W_{t-h},
\]
where $\theta_h$, $h = 1, \dots, q$, are unknown coefficients and $W_t$ is a white noise process with variance $\sigma^2$.

Since $Y_t$ is a finite linear combination of white noise terms, the process is stationary with zero mean. The autocovariance function is
\[
\gamma_Y(h) = \operatorname{cov}(Y_t, Y_{t+h}) =
\begin{cases}
\sigma^2 \sum_{j=0}^{q-h} \theta_j \theta_{j+h} & 0 \leq h \leq q \\
0 & h > q,
\end{cases}
\]
where $\theta_0 = 1$, and the ACF is
\[
\rho_Y(h) =
\begin{cases}
\dfrac{\sum_{j=0}^{q-h} \theta_j \theta_{j+h}}{\sum_{j=0}^{q} \theta_j^2} & 0 \leq h \leq q \\
0 & h > q.
\end{cases}
\]
An important feature of the MA($q$) process is precisely that the autocorrelation is zero for lags larger than $q$.

It is also relevant to note that moving average processes do not have a unique representation. For example, the processes $Y_t = W_t + 0.5 W_{t-1}$, with $W_t$ white noise of variance 4, and $Z_t = V_t + 2 V_{t-1}$, with $V_t$ white noise of variance 1, are equivalent (you can check that they are both stationary processes with the same mean and autocovariance function).
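The claimed equivalence of the two MA(1) parametrizations can be verified by plugging them into the MA($q$) autocovariance formula above. A small sketch (the helper function is our own):

```python
import numpy as np

def ma_autocov(theta, sigma2, max_lag):
    """Autocovariance of an MA(q) process Y_t = W_t + sum_h theta[h-1] W_{t-h}:
    gamma(h) = sigma2 * sum_{j=0}^{q-h} theta_j theta_{j+h}, with theta_0 = 1."""
    th = np.concatenate(([1.0], np.asarray(theta, dtype=float)))
    q = len(th) - 1
    return np.array([sigma2 * np.sum(th[:q - h + 1] * th[h:]) if h <= q else 0.0
                     for h in range(max_lag + 1)])

print(ma_autocov([0.5], 4.0, 3))  # [5. 2. 0. 0.]
print(ma_autocov([2.0], 1.0, 3))  # [5. 2. 0. 0.]  -- the same process
```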

5.1.3 ARMA models

Mixed autoregressive moving average (ARMA) processes model the time series as the sum of an autoregressive part and a moving average part.

A mixed autoregressive and moving average model of autoregressive order $p$ and moving average order $q$, also called ARMA($p, q$), is of the form
\[
Y_t = \alpha + \sum_{j=1}^{p} \beta_j Y_{t-j} + W_t + \sum_{h=1}^{q} \theta_h W_{t-h},
\]
where $\alpha$, $\beta_j$ and $\theta_h$, $j = 1, \dots, p$ and $h = 1, \dots, q$, are unknown coefficients and $W_t$ is a white noise process with variance $\sigma^2$.

Remark 5.4. (Parameter redundancy) The same ARMA process can be parametrized in multiple ways. For example, let $Y_t = W_t$ be a white noise. Then $Y_{t-1} = W_{t-1}$ is the same (shifted) time series. If we now take a linear combination of the two, we get
\[
Y_t - \beta Y_{t-1} = W_t - \beta W_{t-1},
\]
or
\[
Y_t = \beta Y_{t-1} + W_t - \beta W_{t-1}, \qquad (1)
\]
which looks like an ARMA(1, 1) process. However, we know that $Y_t$ is a white noise! We have therefore masked the white noise behind an overparametrization (a range of admissible values of $\beta$ leads to the same process).

To overcome this problem (as well as the non-uniqueness of the moving average representation), additional conditions may be imposed on the ARMA process. Let us first define the autoregressive (AR) and moving average (MA) polynomials as
\[
\beta(z) = 1 - \beta_1 z - \dots - \beta_p z^p
\]
and
\[
\theta(z) = 1 + \theta_1 z + \dots + \theta_q z^q,
\]
respectively. To avoid parameter redundancy, we can require that the two polynomials have no common factors, and include this requirement in the definition of the ARMA model. This is why the process (1) is not usually called ARMA(1, 1): it can be reduced to a white noise. Moreover, the form of the AR polynomial is linked to the causality of the process by the following proposition.

Proposition 5.5. An ARMA($p, q$) process is causal if and only if $\beta(z) \neq 0$ for $|z| \leq 1$, i.e. if all the roots of $\beta(z)$ lie outside the unit circle.

The parameters of ARMA processes can be estimated from the time series data by maximum likelihood. The validity of the model assumptions can then be checked by looking at the residuals.
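Proposition 5.5 suggests a practical check: compute the roots of $\beta(z)$ and verify that they all lie outside the unit circle. A sketch using numpy.roots, which expects polynomial coefficients ordered from the highest power down (the function name and examples are our own):

```python
import numpy as np

def is_causal(beta):
    """Causality check for AR coefficients beta = [beta_1, ..., beta_p],
    i.e. for the AR polynomial beta(z) = 1 - beta_1 z - ... - beta_p z^p."""
    coeffs = np.concatenate((-np.asarray(beta, dtype=float)[::-1], [1.0]))
    roots = np.roots(coeffs)
    return np.all(np.abs(roots) > 1.0)

print(is_causal([0.7]))        # True: AR(1) with |beta_1| < 1 is causal
print(is_causal([1.1]))        # False: root 1/1.1 lies inside the unit circle
print(is_causal([0.5, 0.3]))   # True: an AR(2) example
```

For an AR(1), this reduces to the condition $|\beta_1| < 1$ seen earlier, since the only root of $1 - \beta_1 z$ is $1/\beta_1$.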

5.2 Non stationary time series: Seasonality and Trend

In many cases of interest the time series is not stationary, either because the mean changes over time or because the autocovariance function is not a function of the distance between two time points. The latter problem is more difficult to address and is outside the scope of the course. The change in the mean, however, can be included in the model using tools we have seen in the previous parts of the course.

A seasonal pattern is present in the series when the data are influenced by seasonal factors (e.g., the quarter of the year, the month, the day of the week, or the hour of the day). The period $P$ of the seasonal effect is known, and we can assume that the mean change follows the same periodic dynamic:
\[
Y_t = \mu_{t \bmod P} + E_t,
\]
where $t \bmod P$ is the remainder of $t$ divided by the period $P$ and $E_t$ is a stationary time series. The term $\mu_{t \bmod P}$ is called the seasonal component. In practice, we can estimate the mean by averaging across repetitions of the same period. For example, if the data span multiple years and we think that a yearly seasonal pattern is present, we can estimate the mean for January as the average of all the January values over the years.

In time series analysis, any long-term, non-seasonal increase or decrease in the data is usually called a trend, i.e. a function $f(t)$ such that
\[
Y_t = f(t) + E_t,
\]
with $E_t$ a stationary time series. The trend $f(t)$ can be estimated using the parametric (or non-parametric) regression methods we have seen in the previous part of the course. In the case of a linear trend, an alternative is to difference the time series to remove the trend.

Therefore, we can decompose a non-stationary time series as the sum of a seasonal term, a trend and a stationary residual process:
\[
Y_t = S_t + T_t + E_t.
\]
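A minimal sketch of this decomposition: the trend is estimated by a least squares fit and the seasonal component by averaging over repetitions of the same period position, as described above (numpy; monthly data with $P = 12$ and the synthetic series are assumed examples):

```python
import numpy as np

rng = np.random.default_rng(4)
P, years = 12, 10
t = np.arange(P * years)

# Synthetic monthly series: linear trend + seasonal pattern + stationary noise.
y = 0.05 * t + 2 * np.sin(2 * np.pi * t / P) + rng.standard_normal(len(t))

# Estimate a linear trend by least squares, then remove it.
trend_coef = np.polyfit(t, y, 1)
detrended = y - np.polyval(trend_coef, t)

# Estimate mu_{t mod P} by averaging detrended values at the same phase.
seasonal = np.array([detrended[t % P == k].mean() for k in range(P)])

residual = detrended - seasonal[t % P]   # should look stationary
print(seasonal.round(2))
```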

5.3 Forecasting

The goal is now to predict the future values of a time series, $Y_{t_n+m}$, $m = 1, \dots, M$. We denote by $\hat{Y}_{t_n+m \mid t_n}$ the prediction (or forecast) of $Y_{t_n+m}$ based on the knowledge of the time series up to time $t_n$. The accuracy of the forecast is usually measured by the mean square error, i.e. we want $E[(Y_{t_n+m} - \hat{Y}_{t_n+m \mid t_n})^2]$ to be small. We briefly discuss a gallery of basic methods that are often quite effective in practice (when used wisely).

5.3.1 Average method

The average method simply predicts all future values of the time series with the average of the observed series, i.e.
\[
\hat{Y}_{t_n+m \mid t_n} = \frac{1}{n} \sum_{t=t_1}^{t_n} Y_t.
\]
If the time series is stationary, we are predicting future observations with an estimate of their common mean. Moreover, when $m$ becomes large, the future observations are less and less influenced by what happened in the observed window of time, and therefore the best we can expect to do is to predict them with their expected value.

5.3.2 Naive method

The naive method forecasts the future observations using the last available value, i.e.
\[
\hat{Y}_{t_n+m \mid t_n} = Y_{t_n}.
\]
The idea here is that, if adjacent observations are highly correlated, the future observations will be similar to the last one, at least for small values of $m$.

5.3.3 Simple exponential smoothing

In place of the naive method, where all forecasts of the future are equal to the last observed value of the series, or the average method, where all future forecasts are equal to a simple average of the observed data, we may want something in between that takes the whole observed time series into account while giving more weight to the most recent observations. This approach is called simple exponential smoothing: forecasts are obtained as weighted averages whose weights decrease exponentially as the observations come from further in the past, i.e.
\[
\hat{Y}_{t_n+1 \mid t_n} = \alpha Y_{t_n} + \alpha(1-\alpha) Y_{t_n-1} + \alpha(1-\alpha)^2 Y_{t_n-2} + \cdots,
\]
where $0 < \alpha \leq 1$ plays the role of a smoothing parameter: as $\alpha$ goes to 1 we get back the naive predictor, while for small $\alpha$ more and more weight is given to observations from the past. Alternatively, the simple exponential smoothing forecast can be written recursively as
\[
\hat{Y}_{t_n+1 \mid t_n} = \alpha Y_{t_n} + (1-\alpha) \hat{Y}_{t_n \mid t_n-1}.
\]
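The recursive form translates directly into code. A sketch ($\alpha = 0.3$ and the simulated series are arbitrary choices; the forecast is initialized at the first observation, the common choice discussed just below):

```python
import numpy as np

def ses_forecast(y, alpha):
    """One-step-ahead simple exponential smoothing: fitted[i] forecasts y[i]
    given y[0..i-1], via yhat_{t+1} = alpha*y_t + (1-alpha)*yhat_t.
    Returns the fitted values and the forecast for the next (unseen) point."""
    y = np.asarray(y, dtype=float)
    fitted = np.empty(len(y))
    fitted[0] = y[0]                      # initial value Yhat_{t1|t0} = Y_{t1}
    for i in range(1, len(y)):
        fitted[i] = alpha * y[i - 1] + (1 - alpha) * fitted[i - 1]
    next_forecast = alpha * y[-1] + (1 - alpha) * fitted[-1]
    return fitted, next_forecast

rng = np.random.default_rng(5)
y = 5 + 0.1 * np.cumsum(rng.standard_normal(100))
fitted, f = ses_forecast(y, alpha=0.3)
print(f)   # used as the forecast for every horizon m = 1, 2, ...
```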

The recursion needs an initial value for the forecast, which is usually taken to be $\hat{Y}_{t_1 \mid t_0} = Y_{t_1}$. Note that, if the sample size of the time series is large enough, we expect this initial value to have very little weight in the forecast. There is also the issue of how to choose the smoothing parameter $\alpha$. This can be done in a subjective way, perhaps by looking at the autocorrelation function, or $\alpha$ can be chosen by minimising the sum of squared errors
\[
\mathrm{SSE} = \sum_{t=t_1}^{t_n} (Y_t - \hat{Y}_{t \mid t-1})^2.
\]

5.3.4 Model based forecasting

In general, the best forecast in terms of minimizing the MSE is the conditional expectation
\[
\hat{Y}_{t_n+m \mid t_n} = E[Y_{t_n+m} \mid Y_{t_1}, \dots, Y_{t_n}],
\]
but to compute this we need to know the joint distribution of $(Y_{t_n+m}, Y_{t_1}, \dots, Y_{t_n})$. This is possible only for relatively simple models, and for this reason much of the theory of prediction restricts attention to linear predictors of the form
\[
\hat{Y}_{t_n+m \mid t_n} = \sum_{t=t_1}^{t_n} \omega_t Y_t,
\]
where the $\omega_t$ are suitable weights chosen to minimise the MSE (we will discuss this approach further in the context of spatial statistics).

In the case of ARMA models, the minimum MSE prediction is obtained with the following recursive procedure: (1) future values of the white noise are set to zero; (2) future values of the process are taken equal to their conditional expectation; (3) present and past values of $W_t$ and $Y_t$ are taken equal to their observed values. For example, for the ARMA(1, 1) process, the minimum MSE forecasts up to time $t_n + m$ are
\[
\hat{Y}_{t_n+h \mid t_n} =
\begin{cases}
\beta_1 Y_{t_n} + \theta_1 W_{t_n} & h = 1 \\
\beta_1 \hat{Y}_{t_n+h-1 \mid t_n} & h = 2, \dots, m.
\end{cases}
\]
In practice, however, we need to plug in the maximum likelihood estimates in place of the true parameters, and the residuals in place of the past and present white noise. The resulting predictor is therefore not guaranteed to minimize the mean square error.
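A sketch of the recursive minimum-MSE forecasts for the ARMA(1, 1) case above, assuming $\beta_1$, $\theta_1$ and the last white-noise value have already been estimated (here they are simply made-up values for illustration):

```python
import numpy as np

def arma11_forecast(y, w_last, beta1, theta1, m):
    """Minimum-MSE forecasts Yhat_{t_n+h|t_n}, h = 1..m, for an ARMA(1,1)
    process Y_t = beta1*Y_{t-1} + W_t + theta1*W_{t-1}: h = 1 uses the last
    observation and the last noise term; for h >= 2 future noise has
    conditional mean zero, so the forecast only propagates the AR part."""
    forecasts = np.empty(m)
    forecasts[0] = beta1 * y[-1] + theta1 * w_last
    for h in range(1, m):
        forecasts[h] = beta1 * forecasts[h - 1]
    return forecasts

# Illustration (in practice: ML estimates and the last model residual).
y = np.array([0.3, -0.1, 0.4, 0.8])
print(arma11_forecast(y, w_last=0.2, beta1=0.6, theta1=0.4, m=5))
```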

5.3.5 Series decomposition

If the time series is not stationary, we may first want to decompose the series into its seasonal, trend and stationary components and forecast each component separately. Each component may also need a different forecasting method. For example, a parametric trend term can be forecast with the prediction from the linear model fitted by generalized least squares. We have already seen the expression of the generalized least squares estimator of the coefficients of a linear model when estimating the coefficients of mixed effects models with known variance parameters (equation (3) of Section 4 of the lecture notes). The same idea applies here, but the covariances between the observations are given by the time dependence. For the seasonal component, an average or naive method can be used for forecasting, while an ARMA model may be fitted to the stationary residuals.
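Putting the pieces together, a minimal decompose-and-forecast sketch (ordinary rather than generalized least squares for the trend, the average method for the residuals, and the synthetic series are simplifying assumptions):

```python
import numpy as np

def decompose_forecast(y, P, m):
    """Forecast y for m future steps by decomposing it into a linear trend,
    a seasonal component with known period P, and a stationary residual
    (forecast here by the average method)."""
    y = np.asarray(y, dtype=float)
    t = np.arange(len(y))
    t_new = np.arange(len(y), len(y) + m)

    coef = np.polyfit(t, y, 1)                 # linear trend via OLS
    detr = y - np.polyval(coef, t)
    seas = np.array([detr[t % P == k].mean() for k in range(P)])
    resid = detr - seas[t % P]

    return np.polyval(coef, t_new) + seas[t_new % P] + resid.mean()

rng = np.random.default_rng(6)
t = np.arange(120)
y = 0.05 * t + 2 * np.sin(2 * np.pi * t / 12) + rng.standard_normal(120)
print(decompose_forecast(y, P=12, m=6).round(2))
```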
