Time series and spectral analysis. Peter F. Craigmile


1 Time series and spectral analysis
Peter F. Craigmile
Summer School on Extreme Value Modeling and Water Resources, Université Lyon 1, France, June 2016.
Thank you to the Camille Jordan Institute, the MultiRisk LEFE project, and the Labex Milyon. My research is partially supported by the US National Science Foundation under grants NSF-DMS and NSF-SES.

2 Analyzing time series
A time series is a set of observations made sequentially in time. R. A. Fisher: "One damned thing after another."
Time series analysis is the area of statistics that deals with analyzing dependencies between different observations in time. If we ignore the dependencies that we observe in time series data, then we can be led to incorrect statistical inferences.
References: Good introductory text: Brockwell and Davis [2002]. The theory version: Brockwell and Davis [1991]. Other popular textbooks: Cryer and Chan [2008], Shumway and Stoffer [2010].

3 Time series processes and data
A time series process (model) is a stochastic process {X_t : t ∈ T}, where the index set T is usually either a subset of the integers, Z (discrete-time time series), or a subset of the reals, R (continuous-time time series). For these lectures we will focus on discrete-time time series.
A given realization or sample path of the process, {x_t}, yields our time series data, or just the time series. Sometimes a time series model will specify only summaries of the process (for example, the means and covariances of the RVs).

4 Application 1: Global temperature series from the Berkeley Earth project [Rohde et al., 2013]
[Figure: (a) Global annual mean temperature anomalies (°C) with respect to the climatology, by year. The gray shaded region denotes simultaneous 95% confidence intervals for the mean, using estimated standard errors from Rohde et al. [2013]. (b) A plot of the estimated standard errors (°C) for the mean, by year.]

5 Application 2: River discharge series
[Figure: Average monthly discharge (in cubic feet per second), and its log10, plotted by year, at Alum Creek, Columbus, Ohio, USA. Source: United States Geological Survey.]

6 Exploratory data analysis (EDA) for time series
Start with a time series plot of {x_t : t = 1, ..., n} versus t. Carefully think about the time scale and the axes. Look for the following:
1. Are there changes in behavior over time? (e.g., are there shifts in mean and/or variance?) This is an indicator of nonstationarity. Often we try to model the nonstationarity, remove it, and then model the remainder.
2. Are there outliers (unusual values with respect to the rest of the data)?
3. Are some data missing in time? (This complicates EDA and analysis.)
4. Are the data non-Gaussian? Sometimes we can just transform the data, or consider non-Gaussian time series models (e.g., Poisson or binomial time series models).
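Point 1 above, looking for shifts in mean and/or variance, can be explored numerically with rolling summary statistics. A minimal sketch in Python (the simulated data, window length, and size of the mean shift are all hypothetical; pandas is assumed to be available):

```python
import numpy as np
import pandas as pd

rng = np.random.default_rng(1)
# Toy series with a mean shift halfway through (hypothetical data).
x = np.concatenate([rng.normal(0.0, 1.0, 100), rng.normal(3.0, 1.0, 100)])
s = pd.Series(x)

# Rolling mean and standard deviation help flag changes in behavior over time.
roll_mean = s.rolling(window=25).mean()
roll_sd = s.rolling(window=25).std()

# A large gap between early and late rolling means hints at nonstationarity.
shift = roll_mean.iloc[-1] - roll_mean.dropna().iloc[0]
```

In practice one would plot `roll_mean` and `roll_sd` against time alongside the series itself.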

7 Classifying features of time series
1. Trend
2. Seasonal or periodic components
3. The noise (also called the residual, irregular, error, or random term)
Detecting and modeling changepoints are currently in vogue.

8 Combining trend, seasonality, and noise
Most common: the additive model, x_t = μ_t + s_t + η_t.
Also common: the multiplicative model, x_t = μ_t s_t η_t.
If all the components are positive then we obtain an additive model by taking logarithms: log x_t = log μ_t + log s_t + log η_t.
We often want to transform back so that we can interpret on the original measurement scale.
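The log-transform relationship can be checked numerically. A small sketch (the trend, seasonal, and noise components below are hypothetical, chosen only so that everything is positive):

```python
import numpy as np

rng = np.random.default_rng(0)
n = 50
# Hypothetical positive components of a multiplicative model x_t = mu_t * s_t * eta_t.
mu = np.exp(0.01 * np.arange(n))                          # trend
s = np.exp(0.5 * np.sin(2 * np.pi * np.arange(n) / 12))   # seasonal component
eta = np.exp(rng.normal(0.0, 0.1, n))                     # positive noise
x = mu * s * eta

# Taking logarithms turns the multiplicative model into an additive one.
log_additive = np.log(mu) + np.log(s) + np.log(eta)
```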

9 Objectives of time series analysis
1. Analysis and modeling
2. Forecasting and prediction
3. Adjustment/transformation
4. Simulation/emulation
5. Control

10 Gaussian processes
Gaussian processes are time series processes for which all finite-dimensional distributions are multivariate normal. The joint probability density function of X = (X_1, ..., X_n)^T at x = (x_1, ..., x_n)^T is
    f_X(x) = (2π)^{−n/2} det(Σ)^{−1/2} exp( −(1/2) (x − μ)^T Σ^{−1} (x − μ) ),
where μ = (μ_1, ..., μ_n)^T and the (t, t′) element of the covariance matrix Σ is cov(X_t, X_{t′}).

11 Strictly stationary processes
Processes in which the joint distribution of the process at any collection of time points is unaffected by time shifts. Formally, {X_t : t ∈ T} is strictly stationary (also called narrow-sense stationary) when (X_{t_1}, ..., X_{t_n}) is equal in distribution to (X_{t_1 + h}, ..., X_{t_n + h}), for all integers h, all n ≥ 1, and all time points {t_k ∈ T}.

12 (Weakly) stationary processes
{X_t : t ∈ T} is (weakly) stationary if:
(i) E(X_t) = μ_X for some constant μ_X which does not depend on t; and
(ii) cov(X_t, X_{t+h}) = γ_X(h), a finite constant that can depend on the (time) lag h but not on t.
Other terms for weakly stationary include second-order, covariance, and wide-sense stationary.
γ_X(·) is called the autocovariance function (ACVF) of the stationary process. Also useful is the unitless autocorrelation function (ACF) of the stationary process,
    ρ_X(h) = corr(X_t, X_{t+h}) = γ_X(h) / γ_X(0),  h ∈ Z.
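The ACVF and ACF have natural sample analogues. A minimal sketch of the usual (biased) sample ACF estimator, applied here to simulated white noise, for which the true ACF is zero at all nonzero lags (sample size and seed are arbitrary):

```python
import numpy as np

def sample_acf(x, max_lag):
    """Sample ACF rho_hat(h) = gamma_hat(h) / gamma_hat(0), using the biased
    ACVF estimate gamma_hat(h) = (1/n) sum_t (x_t - xbar)(x_{t+h} - xbar)."""
    x = np.asarray(x, dtype=float)
    n = len(x)
    xc = x - x.mean()
    gamma = np.array([np.dot(xc[: n - h], xc[h:]) / n for h in range(max_lag + 1)])
    return gamma / gamma[0]

rng = np.random.default_rng(42)
acf_wn = sample_acf(rng.normal(size=2000), max_lag=5)
```

For a series of length n = 2000, the sample ACF of white noise at nonzero lags should be small (of order 1/√n).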

13 Properties of the ACVF
Any ACVF of a stationary process has the following properties:
(i) The ACVF is even: γ_X(h) = γ_X(−h) for all h;
(ii) The ACVF is nonnegative definite (nnd): ∑_{j=1}^{n} ∑_{k=1}^{n} a_j γ_X(j − k) a_k ≥ 0 for all positive integers n and real-valued constants {a_j : j = 1, ..., n}.
Bochner's theorem [Bochner, 1933, 1955] states that a real-valued function γ_X(·) defined on the integers is the ACVF of a stationary process if and only if it is even and nnd.
Rather than using Bochner's theorem, it is more common to create processes from measurable functions of other processes; linear filtering is the most popular approach.
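Nonnegative definiteness can be probed numerically for a candidate ACVF by forming the covariance matrix Σ with entries Σ_{jk} = γ(j − k) and checking that its eigenvalues are nonnegative. A sketch, using the MA(1) ACVF (a known valid ACVF, derived later in these notes) with hypothetical values θ = 0.5 and σ² = 1:

```python
import numpy as np

def acvf_ma1(h, theta=0.5, sigma2=1.0):
    """ACVF of an MA(1) process: a known even, nonnegative definite function."""
    h = abs(h)
    if h == 0:
        return sigma2 * (1 + theta**2)
    if h == 1:
        return sigma2 * theta
    return 0.0

n = 20
# Covariance (Toeplitz) matrix Sigma_{jk} = gamma(j - k); nnd means all
# eigenvalues of Sigma are nonnegative for every n.
Sigma = np.array([[acvf_ma1(j - k) for k in range(n)] for j in range(n)])
eigvals = np.linalg.eigvalsh(Sigma)
```

Checking finitely many n does not prove nonnegative definiteness, but it quickly exposes an invalid candidate ACVF.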

14 Linear filtering preserves stationarity
A linear time invariant (LTI) filter is an operation that takes a time series process {W_t : t ∈ T} and creates a new process {X_t} via
    X_t = ∑_{j=−∞}^{∞} ψ_j W_{t−j},  t ∈ T,
where {ψ_j : j ∈ Z} is a sequence of filter coefficients that do not depend on time and that are absolutely summable: ∑_j |ψ_j| < ∞.
Then, if {W_t} is a mean zero stationary process with ACVF γ_W(h), {X_t} is a mean zero stationary process with ACVF
    γ_X(h) = ∑_{j=−∞}^{∞} ∑_{k=−∞}^{∞} ψ_j ψ_k γ_W(j − k + h),  h ∈ Z.
This is the linear filtering preserves stationarity result. We will use this result to define useful classes of stationary processes.
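For a finite filter and a white-noise input, the double-sum ACVF formula can be evaluated exactly. A sketch with hypothetical filter coefficients, checking that the double sum collapses to σ² ∑_j ψ_j ψ_{j+h} and that the resulting ACVF is even in h:

```python
import numpy as np

psi = np.array([1.0, 0.4, -0.2])   # a short filter (hypothetical coefficients)
sigma2 = 2.0

def gamma_w(h):
    """ACVF of the WN(0, sigma2) input: sigma2 at lag 0, zero elsewhere."""
    return sigma2 if h == 0 else 0.0

def gamma_x(h):
    """Double-sum ACVF formula for the filtered process."""
    J = len(psi)
    return sum(psi[j] * psi[k] * gamma_w(j - k + h) for j in range(J) for k in range(J))

# For white-noise input the double sum collapses to sigma2 * sum_j psi_j psi_{j+h}.
direct = [sigma2 * sum(psi[j] * psi[j + h] for j in range(len(psi) - h))
          for h in range(3)]
```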

15 IID noise and white noise
The simplest strictly stationary process is IID noise: {X_t : t ∈ T} is a set of independent and identically distributed RVs. There are no dependencies across time for this process.
The analogous no-correlation model that is weakly stationary is white noise: {X_t} in this case satisfies E(X_t) = μ, a constant independent of time for all t, with
    γ_X(h) = σ², h = 0;  γ_X(h) = 0, otherwise.
We write WN(μ, σ²) for a white noise process with mean μ and variance σ² > 0. Neither process is a good model for dependent time series, but both are good building blocks!

16 Linear processes
Suppose that {ψ_j : j ∈ Z} is an absolutely summable sequence that is independent of time and {Z_t : t ∈ T} is a WN(0, σ²) process. By the filtering preserves stationarity result, the linear process {X_t : t ∈ T}, defined by
    X_t = ∑_{j=−∞}^{∞} ψ_j Z_{t−j},  t ∈ T,
is a stationary process with zero mean and an ACVF given by
    γ_X(h) = σ² ∑_{j=−∞}^{∞} ψ_j ψ_{j+h},  h ∈ Z.
This class is useful for theoretical studies of linear stationary processes, but in practice for statistical modeling we use linear filters of finite time support (i.e., ψ_j ≠ 0 for only a finite collection of indexes j).

17 Example: Moving average process of order q, MA(q)
Let {Z_t} be a WN(0, σ²) process. For an integer q > 0 and constants θ_1, ..., θ_q with θ_q ≠ 0, define
    X_t = Z_t + θ_1 Z_{t−1} + ... + θ_q Z_{t−q}
        = θ_0 Z_t + θ_1 Z_{t−1} + ... + θ_q Z_{t−q}
        = ∑_{j=0}^{q} θ_j Z_{t−j},
where we let θ_0 = 1.
{X_t} is known as the moving average process of order q, or the MA(q) process. Clearly, by definition, the MA(q) process is a linear process.
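An MA(q) realization is just a finite convolution of white noise with the coefficients (1, θ_1, ..., θ_q), so it is easy to simulate. A sketch with hypothetical values q = 2 and θ = (0.6, 0.3); the sample variance should be close to σ² ∑_{j=0}^{q} θ_j² = 1.45:

```python
import numpy as np

rng = np.random.default_rng(7)
q, n = 2, 10000
theta = np.array([1.0, 0.6, 0.3])   # (theta_0, theta_1, theta_2), theta_0 = 1
sigma = 1.0

# Simulate Z_{1-q}, ..., Z_n and form X_t = sum_{j=0}^{q} theta_j Z_{t-j}
# as a finite convolution.
z = rng.normal(0.0, sigma, n + q)
x = np.convolve(z, theta, mode="valid")   # length n
```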

18 Defining linear processes with backward shifts
The backward shift operator, B, is defined by B X_t = X_{t−1}. We can represent a linear process using the backward shift operator as
    X_t = ψ(B) Z_t,
where we let ψ(B) = ∑_{j=−∞}^{∞} ψ_j B^j.
ψ(B) is a polynomial with the backward shift operator as the argument (not a number). We will think of it as a type of linear filter.
Example: a mean zero MA(1) process can be written as X_t = ψ(B) Z_t, where ψ(B) = 1 + θB.

19 Example: The MA(q) process is stationary
By the filtering preserves stationarity result, the MA(q) process defined previously is a stationary process with mean zero and ACVF
    γ_X(h) = σ² ∑_{j=0}^{q} θ_j θ_{j+h}
(let θ_j = 0 for j < 0 and j > q for this to make sense).
Let I(A) denote the indicator function defined by I(A) = 1 if the event A is true, and I(A) = 0 if the event A is false. Then
    γ_X(h) = ∑_{j=0}^{q} ∑_{k=0}^{q} θ_j θ_k γ_Z(j − k + h)
           = σ² ∑_{j=0}^{q} ∑_{k=0}^{q} θ_j θ_k I(k = j + h)
           = σ² ∑_{j=0}^{q} θ_j θ_{j+h}.
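The closed-form ACVF σ² ∑_j θ_j θ_{j+h} is easy to evaluate directly. A sketch, again using the hypothetical coefficients θ = (0.6, 0.3) with σ² = 1, so that γ(0) = 1.45, γ(1) = 0.78, γ(2) = 0.3, and γ(h) = 0 for |h| > 2:

```python
import numpy as np

def ma_acvf(h, theta, sigma2=1.0):
    """gamma_X(h) = sigma2 * sum_j theta_j theta_{j+|h|}, with theta_0 = 1
    prepended and theta_j = 0 outside 0 <= j <= q."""
    th = np.concatenate(([1.0], np.asarray(theta, dtype=float)))
    h = abs(h)
    if h >= len(th):
        return 0.0
    return sigma2 * float(np.dot(th[: len(th) - h], th[h:]))

gammas = [ma_acvf(h, [0.6, 0.3]) for h in range(4)]
```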

20 Exercise: Differencing an MA(1) process
Suppose {Y_t} is an MA(1) process defined in the usual way. Let X_t = Y_t − Y_{t−1} denote the differenced MA(1) process.
1. Is {X_t} stationary?
2. What are the ACVF and ACF of {X_t}?

21 What is spectral analysis?
In probability theory we usually work with cumulative distribution functions (or probability density/mass functions, when they exist). But we can also work with the Fourier transform of a distribution: the characteristic function.

22 What is spectral analysis, continued?
Similarly, for stationary processes we have used the autocovariance function (ACVF). We now consider its Fourier transform, the spectrum, which provides a frequency decomposition of the variance of a process.
Good references: Priestley [1981a], Priestley [1981b], Brillinger [1981], Brockwell and Davis [1991, Section 4], and Percival and Walden [1993].

23 Applications of spectral methods
1. Examining and modeling the dependence of a time series in the spectral domain.
2. Testing for significant periodicities [e.g., Fisher, 1929].
3. Building valid time series models [e.g., Bloomfield, 1973].
4. Fast simulation of stationary processes [e.g., Davies and Harte, 1987].
5. Approximate maximum likelihood estimation methods [e.g., Whittle, 1953, Dahlhaus, 1989] that are fully efficient as N → ∞.

24 Example: The harmonic process
Consider the harmonic process defined by
    X_t = ∑_{l=1}^{L} [A_l cos(2πf_l t) + B_l sin(2πf_l t)].
Assume that {f_l} are sorted in increasing order, and that {A_l} and {B_l} are sequences of (mutually) uncorrelated mean zero RVs with var(A_l) = var(B_l) = σ_l² for each l.
It can be shown that {X_t} is a stationary mean zero process, with ACVF
    γ_X(h) = ∑_{l=1}^{L} σ_l² cos(2πf_l h).
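At h = 0 the stated ACVF gives var(X_t) = ∑_l σ_l², which can be checked by Monte Carlo across many independent realizations. A sketch with hypothetical frequencies and variances; Gaussian draws are used only for convenience (the result requires only uncorrelated mean zero A_l and B_l):

```python
import numpy as np

rng = np.random.default_rng(3)
freqs = np.array([0.1, 0.25])   # hypothetical frequencies f_l
sig2 = np.array([1.0, 0.5])     # var(A_l) = var(B_l) = sigma_l^2
t = np.arange(100)

# Draw many independent realizations of the harmonic process.
reps = 5000
A = rng.normal(0.0, np.sqrt(sig2), size=(reps, 2))
B = rng.normal(0.0, np.sqrt(sig2), size=(reps, 2))
X = (A[:, None, :] * np.cos(2 * np.pi * freqs * t[:, None])
     + B[:, None, :] * np.sin(2 * np.pi * freqs * t[:, None])).sum(axis=2)

# Across realizations, the variance at each time t should be
# sum_l sigma_l^2 = 1.5, since cos^2 + sin^2 = 1.
var_t = X.var(axis=0)
```

Note that the variance is computed across realizations at each fixed t, not along a single realization.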

25 An analysis of variance for the harmonic process
This implies that the variance is
    var(X_t) = γ_X(0) = ∑_{l=1}^{L} σ_l².
This is the first example of a spectral decomposition: we have decomposed the variance of the process {X_t} in terms of the variances of sinusoids. This is an example of an analysis of variance.

26 Relating sinusoids to complex exponentials
The process {X_t} can be rewritten as
    X_t = ∑_{l=1}^{L} C_l cos(2πf_l t + φ_l).
Using complex numbers, let i = √−1. Employing the identity cos(z) = (e^{iz} + e^{−iz})/2, it can be shown that
    X_t = ∑_{l=−L}^{L} D_l e^{i2πf_l t},
where D_0 = 0, D_l = (C_l/2) e^{iφ_l} for l = 1, ..., L, D_{−l} = D_l^* (the complex conjugate), and f_{−l} = −f_l. Then the ACVF is
    γ_X(h) = ∑_{l=−L}^{L} S_l e^{i2πf_l h},
where S_{−l} = S_l = σ_l²/2 for l = 1, ..., L and S_0 = 0. We call S_l the integrated spectrum at frequency f_l.

27 A discussion about notation and frequencies
I will follow the notation of Percival and Walden [1993]. In particular I let the frequency be f ∈ [−1/2, 1/2]. Many use ω = 2πf ∈ [−π, π]. The use of f matches the frequencies I used in the harmonic regression model.
We start by assuming that the time series {X_t} is collected with a sampling interval of Δ = 1 (one time unit between observations). In practice, Δ can vary. When Δ changes, the range of frequencies changes to [−F, F], where F = 1/(2Δ) is called the Nyquist frequency.

28 Herglotz's theorem (the spectrum is the Fourier transform of the ACVF)
A real-valued function γ_X(·) defined on the integers is non-negative definite if and only if
    γ_X(h) = ∫_{−1/2}^{1/2} e^{i2πfh} dS_X^I(f), for all h ∈ Z,
where S_X^I(·) is a right-continuous, non-decreasing, and bounded function on [−1/2, 1/2] with S_X^I(−1/2) = 0.
S_X^I(·) is called the integrated spectrum or the spectral distribution function. The derivative of S_X^I(·), when it exists, is called the spectral density function (SDF), S(·).

29 Combining with Bochner's theorem
A real-valued function γ_X(·) defined on the integers is the ACVF of a stationary process {X_t} if and only if:
1. γ_X(h) = ∫_{−1/2}^{1/2} e^{i2πfh} dS_X^I(f), for all h ∈ Z, where S_X^I(·) is a right-continuous, non-decreasing, and bounded function on [−1/2, 1/2] with S_X^I(−1/2) = 0.
2. γ_X(·) is even and non-negative definite.
We say that S_X^I(·) is the spectrum/spectral distribution function of {X_t} and of γ_X(·). (Whenever necessary we write S_X^I(·) to indicate it is the spectrum of {X_t}.)

30 So what does the spectrum measure?
For a stationary process {X_t}, taking
    γ_X(h) = ∫_{−1/2}^{1/2} e^{i2πfh} dS_X^I(f),
we let h = 0 to get
    var(X_t) = γ_X(0) = ∫_{−1/2}^{1/2} dS_X^I(f).
When the derivative exists, we have
    γ_X(0) = ∫_{−1/2}^{1/2} S_X(f) df.
Thus we can think of the increments of the spectrum as decomposing the variance of a time series into a collection of pieces, each of which can be associated with a different frequency f. Different stationary time series models have different decompositions by frequency. Before we look at some examples let us discuss some further properties of the spectrum.

31 Existence of the spectral density function
If γ_X(·) is any real-valued function defined on the integers that is absolutely summable (i.e., ∑_h |γ_X(h)| < ∞), then
    γ_X(h) = ∫_{−1/2}^{1/2} e^{i2πfh} S_X(f) df,
with
    S_X(f) = ∑_{h ∈ Z} e^{−i2πfh} γ_X(h).
We say that γ_X(·) and S_X(·) are Fourier transform pairs, and write {γ_X(·)} ←→ S_X(·).
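This Fourier pair can be verified numerically for an absolutely summable ACVF. A sketch for the MA(1) ACVF (θ and σ² are hypothetical): summing the ACVF gives the SDF, and a Riemann sum over frequency recovers the ACVF.

```python
import numpy as np

theta, sigma2 = 0.5, 1.0
# MA(1) ACVF: gamma(0) = sigma2 (1 + theta^2), gamma(+-1) = sigma2 * theta,
# and zero at all other lags.
gamma = {0: sigma2 * (1 + theta**2), 1: sigma2 * theta, -1: sigma2 * theta}

# Forward transform: S_X(f) = sum_h e^{-i 2 pi f h} gamma(h).
f = np.linspace(-0.5, 0.5, 2000, endpoint=False)
S = sum(g * np.exp(-1j * 2 * np.pi * f * h) for h, g in gamma.items()).real

def gamma_back(h):
    """Inverse transform gamma(h) = int e^{i 2 pi f h} S_X(f) df, by a
    Riemann sum over the uniform frequency grid."""
    return float(((S * np.exp(1j * 2 * np.pi * f * h)).sum() * (f[1] - f[0])).real)
```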

32 For real-valued stationary processes
A real-valued function S_X(·) defined on |f| ≤ 1/2 is the SDF of a real-valued stationary process {X_t} if and only if:
1. S_X(f) = S_X(−f) for all |f| ≤ 1/2 (the SDF is an even function).
2. S_X(f) ≥ 0 for all |f| ≤ 1/2.
3. ∫_{−1/2}^{1/2} S_X(f) df < ∞.
Note that this result is if and only if: in particular it gives a way to create a stationary process without ever having to check that the ACVF is nonnegative definite. Once we have an SDF that satisfies these conditions then
    γ_X(h) = ∫_{−1/2}^{1/2} e^{i2πfh} S_X(f) df.

33 Changing the sampling interval
Let {X_t} be a stationary process, but suppose that we observe the process at sampling interval Δ (not necessarily of value one). Remember that F = 1/(2Δ) is the Nyquist frequency. With an absolutely summable ACVF we have
    S_X(f) = Δ ∑_{h ∈ Z} e^{−i2πfhΔ} γ_X(h),
with
    γ_X(h) = ∫_{−F}^{F} e^{i2πfhΔ} S_X(f) df.
As Δ decreases we get more information about the SDF. We are less affected by aliasing of frequencies.

34 Aliasing: sinusoids
[Figure: an example of two aliased sinusoids, each sampled with Δ = 1 at two different frequencies.] The amplitudes of the two waves are the same at the sample times. Thus, the spectrum is the same.

35 Aliasing, in general
Let {X_t} be a stationary process, observed with sampling interval Δ. If the ACVF is absolutely summable, then the SDF of the sampled process satisfies
    S_X^{(Δ)}(f) = ∑_{k ∈ Z} S_X(f + k/Δ).
This leads us to imagine that the SDF, observed in any given interval [−F, F], contains the accumulation of all spectral densities observed at the frequencies {f + k/Δ : k ∈ Z}.

36 The SDF of a white noise process
Let {Z_t} be a WN(0, σ²) process. Then for all |f| ≤ 1/2,
    S_Z(f) = ∑_{h=−∞}^{∞} e^{−i2πfh} γ_Z(h) = γ_Z(0) = σ².
The SDF of white noise is a constant function. Indeed, this is why it is called white noise: the SDF (and hence variance) of a WN process is an equal mix of all frequencies ("colors").

37 Linear time-invariant filtering of stationary processes
Suppose {X_t} is a stationary mean zero process with ACVF γ_X(·), and let {a_j} be absolutely summable. Remember that {Y_t}, defined as a linear filtering of {X_t} by
    Y_t = ∑_{j=−∞}^{∞} a_j X_{t−j},
is a stationary mean zero process with ACVF
    γ_Y(h) = ∑_{j=−∞}^{∞} ∑_{k=−∞}^{∞} a_j a_k γ_X(j − k + h).
Now suppose that the SDF of {X_t} is S_X(·). What happens to the SDF under filtering? I.e., what is S_Y(·)?

38 First, some definitions
The transfer function of a filter {a_j} is its Fourier transform:
    A(f) = ∑_{j=−∞}^{∞} a_j e^{−i2πfj}.
The gain function of {a_j} is the modulus of the transfer function, |A(f)|.
The square or squared gain function of {a_j} is the modulus squared of the transfer function:
    𝒜(f) = |A(f)|² = A(f) A^*(f),
where ^* denotes, as usual, the complex conjugate.

39 What is the SDF when linear filtering?
We have that
    ∫_{−1/2}^{1/2} e^{i2πfh} S_Y(f) df = γ_Y(h)
    = ∑_{j=−∞}^{∞} ∑_{k=−∞}^{∞} a_j a_k γ_X(j − k + h)
    = ∑_{j=−∞}^{∞} ∑_{k=−∞}^{∞} a_j a_k ∫_{−1/2}^{1/2} e^{i2πf(j − k + h)} S_X(f) df
    = ∫_{−1/2}^{1/2} e^{i2πfh} S_X(f) ( ∑_{j=−∞}^{∞} a_j e^{i2πfj} ) ( ∑_{k=−∞}^{∞} a_k e^{−i2πfk} ) df
    = ∫_{−1/2}^{1/2} e^{i2πfh} S_X(f) A^*(f) A(f) df
    = ∫_{−1/2}^{1/2} e^{i2πfh} |A(f)|² S_X(f) df.
By uniqueness of Fourier transforms, S_Y(f) = |A(f)|² S_X(f).

40 Example 1: The SDF of an MA(1) process
For {Z_t} a WN(0, σ²) process, let
    X_t = Z_t + θ Z_{t−1} = ∑_{j=−∞}^{∞} a_j Z_{t−j},
where a_0 = 1, a_1 = θ, and a_j = 0 otherwise.
The transfer function is
    A(f) = ∑_{j=−∞}^{∞} a_j e^{−i2πfj} = 1 + θ e^{−i2πf}.

41 Example 1: The SDF of an MA(1) process, cont.
The squared gain function is
    |A(f)|² = A(f) A^*(f) = (1 + θ e^{−i2πf})(1 + θ e^{i2πf}) = 1 + θ (e^{i2πf} + e^{−i2πf}) + θ² = 1 + 2θ cos(2πf) + θ².
Thus the SDF of an MA(1) process is
    S_X(f) = |A(f)|² S_Z(f) = (1 + 2θ cos(2πf) + θ²) σ².
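The algebra above can be confirmed numerically by comparing the squared gain computed from the complex transfer function with the closed-form expression (θ and σ² are hypothetical values):

```python
import numpy as np

theta, sigma2 = 0.5, 1.0
f = np.linspace(-0.5, 0.5, 101)

# Squared gain via the complex transfer function A(f) = 1 + theta e^{-i 2 pi f}.
A = 1 + theta * np.exp(-1j * 2 * np.pi * f)
sdf_from_gain = np.abs(A) ** 2 * sigma2

# Closed form derived on the slide.
sdf_closed = (1 + 2 * theta * np.cos(2 * np.pi * f) + theta**2) * sigma2
```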

42 Example 2: ARMA processes
Suppose {Z_t} is a WN(0, σ²) process. {X_t} is an autoregressive moving average process of autoregressive order p and moving average order q, or an ARMA(p, q) process for short, if there exists a stationary solution to the following equation:
    X_t − ∑_{j=1}^{p} φ_j X_{t−j} = Z_t + ∑_{k=1}^{q} θ_k Z_{t−k},  t ∈ T,
assuming that φ_p ≠ 0 and θ_q ≠ 0.
By "there exists a stationary solution" we mean that we can represent {X_t} as a (stationary) linear process.

43 Special cases of the ARMA process
With p = 0, the moving average process of order q (the MA(q) process) satisfies
    X_t = Z_t + ∑_{k=1}^{q} θ_k Z_{t−k},  t ∈ T.
When q = 0 we obtain the autoregressive process of order p (the AR(p) process; Yule [1921, 1927]):
    X_t − ∑_{j=1}^{p} φ_j X_{t−j} = Z_t,  t ∈ T.

44 The SDF of an ARMA process
We have that
    |1 − ∑_{j=1}^{p} φ_j e^{−i2πfj}|² S_X(f) = |1 + ∑_{k=1}^{q} θ_k e^{−i2πfk}|² S_Z(f).
Thus
    S_X(f) = σ² |1 + ∑_{k=1}^{q} θ_k e^{−i2πfk}|² / |1 − ∑_{j=1}^{p} φ_j e^{−i2πfj}|²,  |f| ≤ 1/2.
This is often called the rational SDF.
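The rational SDF is straightforward to evaluate numerically. A sketch of a generic evaluator (the function name and the AR(1) coefficient φ = 0.6 are hypothetical); as a sanity check, integrating the AR(1) SDF over [−1/2, 1/2] should recover the known variance σ²/(1 − φ²):

```python
import numpy as np

def arma_sdf(f, phi=(), theta=(), sigma2=1.0):
    """Rational SDF of an ARMA(p, q) process evaluated at frequencies f."""
    f = np.asarray(f, dtype=float)
    num = np.ones_like(f, dtype=complex)   # 1 + sum_k theta_k e^{-i 2 pi f k}
    den = np.ones_like(f, dtype=complex)   # 1 - sum_j phi_j e^{-i 2 pi f j}
    for k, th in enumerate(theta, start=1):
        num += th * np.exp(-1j * 2 * np.pi * f * k)
    for j, ph in enumerate(phi, start=1):
        den -= ph * np.exp(-1j * 2 * np.pi * f * j)
    return sigma2 * np.abs(num) ** 2 / np.abs(den) ** 2

# For an AR(1) process, integrating the SDF recovers var(X_t) = sigma2/(1 - phi^2).
phi = 0.6
f = np.linspace(-0.5, 0.5, 4000, endpoint=False)
var_numeric = arma_sdf(f, phi=(phi,)).sum() * (f[1] - f[0])
```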

45 Example SDFs for different AR (ARMA(p, 0)) processes
[Figure: example SDFs plotted against frequency for an AR(1) process with φ_1 = 0.6, an AR(1) process with φ_1 = 0.2, an AR(2) process with φ = (0.75, 0.5), and an AR(4) process with φ = (2.7607, ...).]
Usually we change the y-scale to decibels (dB), a 10 log_10 transformation.

46 Example simulations of the AR(2) and AR(4) processes
[Figure: simulated realizations of the AR(2) and AR(4) processes plotted against time, with their sample ACFs plotted against lag.]
Exercise: Compare these realizations with the global temperature and water discharge series we examined previously.

47 Approximation theorems involving spectral densities
Brockwell and Davis [1991, Corollary 4.4.2]: if S_Y(·) is a symmetric continuous SDF and ε > 0, then there exists an AR(p) process {X_t} such that |S_Y(f) − S_X(f)| < ε for all f ∈ [−1/2, 1/2].
Brockwell and Davis [1991, Corollary 4.4.1] gives a similar result for MA(q) processes.
An AR approximation for time series is used more often than the MA approximation in the literature. Combining both gives us a motivation for why ARMA models can outperform AR and MA models, in terms of the ability to approximate the SDF (equivalently, the ACVF).

48 Wavelet transformations
We use spectral analysis to view a time series in terms of a sum of sinusoids at different frequencies. Unless we do something sneaky, we lose all information about time. A wavelet transformation uses wavelets ("small waves") to decompose a time series into:
1. averages over a specified (typically long) time scale, and
2. changes of averages over a range of shorter time scales.
It is a time-scale (some say frequency) decomposition. Thus wavelet methods are a natural way to explore and model non-stationarity. See Vidakovic [1998] and Percival and Walden [2000]; we will discuss wavelets at further points in these lectures.

49 References
P. Bloomfield. An exponential model for the spectrum of a scalar time series. Biometrika, 60, 1973.
S. Bochner. Monotone Funktionen, Stieltjessche Integrale und harmonische Analyse. Mathematische Annalen, 108, 1933.
S. Bochner. Harmonic analysis and probability theory. University of California Press, Berkeley, CA, 1955.
D. R. Brillinger. Time Series: Data Analysis and Theory. Holt, New York, NY, 1981.
P. J. Brockwell and R. A. Davis. Introduction to Time Series and Forecasting (Second Edition). Springer, New York, NY, 2002.
P. J. Brockwell and R. A. Davis. Time Series: Theory and Methods (Second Edition). Springer, New York, 1991.
J. D. Cryer and K.-S. Chan. Time Series Analysis, with Applications in R. Springer, New York, NY, 2008.
R. Dahlhaus. Efficient parameter estimation for self-similar processes. The Annals of Statistics, 17, 1989.
R. B. Davies and D. S. Harte. Tests for Hurst effect. Biometrika, 74:95–101, 1987.
R. A. Fisher. Tests of significance in harmonic analysis. Philosophical Transactions of the Royal Society of London, Series A, Containing Papers of a Mathematical or Physical Character, 125:54–59, 1929.
D. Percival and A. Walden. Spectral Analysis for Physical Applications. Cambridge University Press, Cambridge, 1993.
D. Percival and A. Walden. Wavelet Methods for Time Series Analysis. Cambridge University Press, Cambridge, England, 2000.
M. B. Priestley. Spectral Analysis and Time Series. (Vol. 1): Univariate Series. Academic Press, London, UK, 1981a.
M. B. Priestley. Spectral Analysis and Time Series. (Vol. 2): Multivariate Series, Prediction and Control. Academic Press, London, UK, 1981b.
R. Rohde, R. A. Muller, R. Jacobsen, E. Muller, S. Perlmutter, A. Rosenfeld, J. Wurtele, D. Groom, and C. Wickham. A new estimate of the average earth surface land temperature spanning 1753 to . Geoinfor Geostat: An Overview, 1, 2013.
R. H. Shumway and D. S. Stoffer. Time Series Analysis and Its Applications: With R Examples. Springer, New York, NY, 2010.
B. Vidakovic. Statistical Modeling by Wavelets. Wiley, New York, NY, 1998.
P. Whittle. Estimation and information in stationary time series. Arkiv för matematik, 2, 1953.
G. U. Yule. On the time-correlation problem, with special reference to the variate difference correlation method. Journal of the Royal Statistical Society, 84, 1921.
G. U. Yule. On a method of investigating periodicities in disturbed series, with special reference to Wolfer's sunspot numbers. Philosophical Transactions of the Royal Society of London, Series A, Containing Papers of a Mathematical or Physical Character, 226, 1927.


More information

Time Series Outlier Detection

Time Series Outlier Detection Time Series Outlier Detection Tingyi Zhu July 28, 2016 Tingyi Zhu Time Series Outlier Detection July 28, 2016 1 / 42 Outline Time Series Basics Outliers Detection in Single Time Series Outlier Series Detection

More information

18.S096 Problem Set 4 Fall 2013 Time Series Due Date: 10/15/2013

18.S096 Problem Set 4 Fall 2013 Time Series Due Date: 10/15/2013 18.S096 Problem Set 4 Fall 2013 Time Series Due Date: 10/15/2013 1. Covariance Stationary AR(2) Processes Suppose the discrete-time stochastic process {X t } follows a secondorder auto-regressive process

More information

covariance function, 174 probability structure of; Yule-Walker equations, 174 Moving average process, fluctuations, 5-6, 175 probability structure of

covariance function, 174 probability structure of; Yule-Walker equations, 174 Moving average process, fluctuations, 5-6, 175 probability structure of Index* The Statistical Analysis of Time Series by T. W. Anderson Copyright 1971 John Wiley & Sons, Inc. Aliasing, 387-388 Autoregressive {continued) Amplitude, 4, 94 case of first-order, 174 Associated

More information

Week 5 Quantitative Analysis of Financial Markets Characterizing Cycles

Week 5 Quantitative Analysis of Financial Markets Characterizing Cycles Week 5 Quantitative Analysis of Financial Markets Characterizing Cycles Christopher Ting http://www.mysmu.edu/faculty/christophert/ Christopher Ting : christopherting@smu.edu.sg : 6828 0364 : LKCSB 5036

More information

Dynamic Time Series Regression: A Panacea for Spurious Correlations

Dynamic Time Series Regression: A Panacea for Spurious Correlations International Journal of Scientific and Research Publications, Volume 6, Issue 10, October 2016 337 Dynamic Time Series Regression: A Panacea for Spurious Correlations Emmanuel Alphonsus Akpan *, Imoh

More information

Nonlinear time series

Nonlinear time series Based on the book by Fan/Yao: Nonlinear Time Series Robert M. Kunst robert.kunst@univie.ac.at University of Vienna and Institute for Advanced Studies Vienna October 27, 2009 Outline Characteristics of

More information

Gaussian processes. Basic Properties VAG002-

Gaussian processes. Basic Properties VAG002- Gaussian processes The class of Gaussian processes is one of the most widely used families of stochastic processes for modeling dependent data observed over time, or space, or time and space. The popularity

More information

Computational Data Analysis!

Computational Data Analysis! 12.714 Computational Data Analysis! Alan Chave (alan@whoi.edu)! Thomas Herring (tah@mit.edu),! http://geoweb.mit.edu/~tah/12.714! Introduction to Spectral Analysis! Topics Today! Aspects of Time series

More information

Statistics of Stochastic Processes

Statistics of Stochastic Processes Prof. Dr. J. Franke All of Statistics 4.1 Statistics of Stochastic Processes discrete time: sequence of r.v...., X 1, X 0, X 1, X 2,... X t R d in general. Here: d = 1. continuous time: random function

More information

Lecture 3: Autoregressive Moving Average (ARMA) Models and their Practical Applications

Lecture 3: Autoregressive Moving Average (ARMA) Models and their Practical Applications Lecture 3: Autoregressive Moving Average (ARMA) Models and their Practical Applications Prof. Massimo Guidolin 20192 Financial Econometrics Winter/Spring 2018 Overview Moving average processes Autoregressive

More information

Problem Set 2 Solution Sketches Time Series Analysis Spring 2010

Problem Set 2 Solution Sketches Time Series Analysis Spring 2010 Problem Set 2 Solution Sketches Time Series Analysis Spring 2010 Forecasting 1. Let X and Y be two random variables such that E(X 2 ) < and E(Y 2 )

More information

Prof. Dr. Roland Füss Lecture Series in Applied Econometrics Summer Term Introduction to Time Series Analysis

Prof. Dr. Roland Füss Lecture Series in Applied Econometrics Summer Term Introduction to Time Series Analysis Introduction to Time Series Analysis 1 Contents: I. Basics of Time Series Analysis... 4 I.1 Stationarity... 5 I.2 Autocorrelation Function... 9 I.3 Partial Autocorrelation Function (PACF)... 14 I.4 Transformation

More information

If we want to analyze experimental or simulated data we might encounter the following tasks:

If we want to analyze experimental or simulated data we might encounter the following tasks: Chapter 1 Introduction If we want to analyze experimental or simulated data we might encounter the following tasks: Characterization of the source of the signal and diagnosis Studying dependencies Prediction

More information

Covariances of ARMA Processes

Covariances of ARMA Processes Statistics 910, #10 1 Overview Covariances of ARMA Processes 1. Review ARMA models: causality and invertibility 2. AR covariance functions 3. MA and ARMA covariance functions 4. Partial autocorrelation

More information

The autocorrelation and autocovariance functions - helpful tools in the modelling problem

The autocorrelation and autocovariance functions - helpful tools in the modelling problem The autocorrelation and autocovariance functions - helpful tools in the modelling problem J. Nowicka-Zagrajek A. Wy lomańska Institute of Mathematics and Computer Science Wroc law University of Technology,

More information

γ 0 = Var(X i ) = Var(φ 1 X i 1 +W i ) = φ 2 1γ 0 +σ 2, which implies that we must have φ 1 < 1, and γ 0 = σ2 . 1 φ 2 1 We may also calculate for j 1

γ 0 = Var(X i ) = Var(φ 1 X i 1 +W i ) = φ 2 1γ 0 +σ 2, which implies that we must have φ 1 < 1, and γ 0 = σ2 . 1 φ 2 1 We may also calculate for j 1 4.2 Autoregressive (AR) Moving average models are causal linear processes by definition. There is another class of models, based on a recursive formulation similar to the exponentially weighted moving

More information

Statistics 910, #5 1. Regression Methods

Statistics 910, #5 1. Regression Methods Statistics 910, #5 1 Overview Regression Methods 1. Idea: effects of dependence 2. Examples of estimation (in R) 3. Review of regression 4. Comparisons and relative efficiencies Idea Decomposition Well-known

More information

Exercises - Time series analysis

Exercises - Time series analysis Descriptive analysis of a time series (1) Estimate the trend of the series of gasoline consumption in Spain using a straight line in the period from 1945 to 1995 and generate forecasts for 24 months. Compare

More information

Problem Set 1 Solution Sketches Time Series Analysis Spring 2010

Problem Set 1 Solution Sketches Time Series Analysis Spring 2010 Problem Set 1 Solution Sketches Time Series Analysis Spring 2010 1. Construct a martingale difference process that is not weakly stationary. Simplest e.g.: Let Y t be a sequence of independent, non-identically

More information

ECON 616: Lecture 1: Time Series Basics

ECON 616: Lecture 1: Time Series Basics ECON 616: Lecture 1: Time Series Basics ED HERBST August 30, 2017 References Overview: Chapters 1-3 from Hamilton (1994). Technical Details: Chapters 2-3 from Brockwell and Davis (1987). Intuition: Chapters

More information

Ch 6. Model Specification. Time Series Analysis

Ch 6. Model Specification. Time Series Analysis We start to build ARIMA(p,d,q) models. The subjects include: 1 how to determine p, d, q for a given series (Chapter 6); 2 how to estimate the parameters (φ s and θ s) of a specific ARIMA(p,d,q) model (Chapter

More information

Elements of Multivariate Time Series Analysis

Elements of Multivariate Time Series Analysis Gregory C. Reinsel Elements of Multivariate Time Series Analysis Second Edition With 14 Figures Springer Contents Preface to the Second Edition Preface to the First Edition vii ix 1. Vector Time Series

More information

Reliability and Risk Analysis. Time Series, Types of Trend Functions and Estimates of Trends

Reliability and Risk Analysis. Time Series, Types of Trend Functions and Estimates of Trends Reliability and Risk Analysis Stochastic process The sequence of random variables {Y t, t = 0, ±1, ±2 } is called the stochastic process The mean function of a stochastic process {Y t} is the function

More information

Difference equations. Definitions: A difference equation takes the general form. x t f x t 1,,x t m.

Difference equations. Definitions: A difference equation takes the general form. x t f x t 1,,x t m. Difference equations Definitions: A difference equation takes the general form x t fx t 1,x t 2, defining the current value of a variable x as a function of previously generated values. A finite order

More information

Lecture 2: ARMA(p,q) models (part 2)

Lecture 2: ARMA(p,q) models (part 2) Lecture 2: ARMA(p,q) models (part 2) Florian Pelgrin University of Lausanne, École des HEC Department of mathematics (IMEA-Nice) Sept. 2011 - Jan. 2012 Florian Pelgrin (HEC) Univariate time series Sept.

More information

TIME SERIES ANALYSIS. Forecasting and Control. Wiley. Fifth Edition GWILYM M. JENKINS GEORGE E. P. BOX GREGORY C. REINSEL GRETA M.

TIME SERIES ANALYSIS. Forecasting and Control. Wiley. Fifth Edition GWILYM M. JENKINS GEORGE E. P. BOX GREGORY C. REINSEL GRETA M. TIME SERIES ANALYSIS Forecasting and Control Fifth Edition GEORGE E. P. BOX GWILYM M. JENKINS GREGORY C. REINSEL GRETA M. LJUNG Wiley CONTENTS PREFACE TO THE FIFTH EDITION PREFACE TO THE FOURTH EDITION

More information

Time Series Analysis. James D. Hamilton PRINCETON UNIVERSITY PRESS PRINCETON, NEW JERSEY

Time Series Analysis. James D. Hamilton PRINCETON UNIVERSITY PRESS PRINCETON, NEW JERSEY Time Series Analysis James D. Hamilton PRINCETON UNIVERSITY PRESS PRINCETON, NEW JERSEY & Contents PREFACE xiii 1 1.1. 1.2. Difference Equations First-Order Difference Equations 1 /?th-order Difference

More information

1 Linear Difference Equations

1 Linear Difference Equations ARMA Handout Jialin Yu 1 Linear Difference Equations First order systems Let {ε t } t=1 denote an input sequence and {y t} t=1 sequence generated by denote an output y t = φy t 1 + ε t t = 1, 2,... with

More information

Basics: Definitions and Notation. Stationarity. A More Formal Definition

Basics: Definitions and Notation. Stationarity. A More Formal Definition Basics: Definitions and Notation A Univariate is a sequence of measurements of the same variable collected over (usually regular intervals of) time. Usual assumption in many time series techniques is that

More information

Econ 623 Econometrics II Topic 2: Stationary Time Series

Econ 623 Econometrics II Topic 2: Stationary Time Series 1 Introduction Econ 623 Econometrics II Topic 2: Stationary Time Series In the regression model we can model the error term as an autoregression AR(1) process. That is, we can use the past value of the

More information

Time Series Analysis. James D. Hamilton PRINCETON UNIVERSITY PRESS PRINCETON, NEW JERSEY

Time Series Analysis. James D. Hamilton PRINCETON UNIVERSITY PRESS PRINCETON, NEW JERSEY Time Series Analysis James D. Hamilton PRINCETON UNIVERSITY PRESS PRINCETON, NEW JERSEY PREFACE xiii 1 Difference Equations 1.1. First-Order Difference Equations 1 1.2. pth-order Difference Equations 7

More information

1. Fundamental concepts

1. Fundamental concepts . Fundamental concepts A time series is a sequence of data points, measured typically at successive times spaced at uniform intervals. Time series are used in such fields as statistics, signal processing

More information

CHANGE DETECTION IN TIME SERIES

CHANGE DETECTION IN TIME SERIES CHANGE DETECTION IN TIME SERIES Edit Gombay TIES - 2008 University of British Columbia, Kelowna June 8-13, 2008 Outline Introduction Results Examples References Introduction sunspot.year 0 50 100 150 1700

More information

Forecasting. Simon Shaw 2005/06 Semester II

Forecasting. Simon Shaw 2005/06 Semester II Forecasting Simon Shaw s.c.shaw@maths.bath.ac.uk 2005/06 Semester II 1 Introduction A critical aspect of managing any business is planning for the future. events is called forecasting. Predicting future

More information

ITSM-R Reference Manual

ITSM-R Reference Manual ITSM-R Reference Manual George Weigt February 11, 2018 1 Contents 1 Introduction 3 1.1 Time series analysis in a nutshell............................... 3 1.2 White Noise Variance.....................................

More information

Lecture 3 Stationary Processes and the Ergodic LLN (Reference Section 2.2, Hayashi)

Lecture 3 Stationary Processes and the Ergodic LLN (Reference Section 2.2, Hayashi) Lecture 3 Stationary Processes and the Ergodic LLN (Reference Section 2.2, Hayashi) Our immediate goal is to formulate an LLN and a CLT which can be applied to establish sufficient conditions for the consistency

More information

Chapter 8: Model Diagnostics

Chapter 8: Model Diagnostics Chapter 8: Model Diagnostics Model diagnostics involve checking how well the model fits. If the model fits poorly, we consider changing the specification of the model. A major tool of model diagnostics

More information

Wavelet Methods for Time Series Analysis. Motivating Question

Wavelet Methods for Time Series Analysis. Motivating Question Wavelet Methods for Time Series Analysis Part VII: Wavelet-Based Bootstrapping start with some background on bootstrapping and its rationale describe adjustments to the bootstrap that allow it to work

More information

Stochastic processes: basic notions

Stochastic processes: basic notions Stochastic processes: basic notions Jean-Marie Dufour McGill University First version: March 2002 Revised: September 2002, April 2004, September 2004, January 2005, July 2011, May 2016, July 2016 This

More information

Akaike criterion: Kullback-Leibler discrepancy

Akaike criterion: Kullback-Leibler discrepancy Model choice. Akaike s criterion Akaike criterion: Kullback-Leibler discrepancy Given a family of probability densities {f ( ; ψ), ψ Ψ}, Kullback-Leibler s index of f ( ; ψ) relative to f ( ; θ) is (ψ

More information

STAT 520: Forecasting and Time Series. David B. Hitchcock University of South Carolina Department of Statistics

STAT 520: Forecasting and Time Series. David B. Hitchcock University of South Carolina Department of Statistics David B. University of South Carolina Department of Statistics What are Time Series Data? Time series data are collected sequentially over time. Some common examples include: 1. Meteorological data (temperatures,

More information

Identifiability, Invertibility

Identifiability, Invertibility Identifiability, Invertibility Defn: If {ǫ t } is a white noise series and µ and b 0,..., b p are constants then X t = µ + b 0 ǫ t + b ǫ t + + b p ǫ t p is a moving average of order p; write MA(p). Q:

More information

MGR-815. Notes for the MGR-815 course. 12 June School of Superior Technology. Professor Zbigniew Dziong

MGR-815. Notes for the MGR-815 course. 12 June School of Superior Technology. Professor Zbigniew Dziong Modeling, Estimation and Control, for Telecommunication Networks Notes for the MGR-815 course 12 June 2010 School of Superior Technology Professor Zbigniew Dziong 1 Table of Contents Preface 5 1. Example

More information

University of Oxford. Statistical Methods Autocorrelation. Identification and Estimation

University of Oxford. Statistical Methods Autocorrelation. Identification and Estimation University of Oxford Statistical Methods Autocorrelation Identification and Estimation Dr. Órlaith Burke Michaelmas Term, 2011 Department of Statistics, 1 South Parks Road, Oxford OX1 3TG Contents 1 Model

More information

STA 6857 Autocorrelation and Cross-Correlation & Stationary Time Series ( 1.4, 1.5)

STA 6857 Autocorrelation and Cross-Correlation & Stationary Time Series ( 1.4, 1.5) STA 6857 Autocorrelation and Cross-Correlation & Stationary Time Series ( 1.4, 1.5) Outline 1 Announcements 2 Autocorrelation and Cross-Correlation 3 Stationary Time Series 4 Homework 1c Arthur Berg STA

More information

Lecture 16: State Space Model and Kalman Filter Bus 41910, Time Series Analysis, Mr. R. Tsay

Lecture 16: State Space Model and Kalman Filter Bus 41910, Time Series Analysis, Mr. R. Tsay Lecture 6: State Space Model and Kalman Filter Bus 490, Time Series Analysis, Mr R Tsay A state space model consists of two equations: S t+ F S t + Ge t+, () Z t HS t + ɛ t (2) where S t is a state vector

More information

Univariate Time Series Analysis; ARIMA Models

Univariate Time Series Analysis; ARIMA Models Econometrics 2 Fall 24 Univariate Time Series Analysis; ARIMA Models Heino Bohn Nielsen of4 Outline of the Lecture () Introduction to univariate time series analysis. (2) Stationarity. (3) Characterizing

More information

TIME SERIES AND FORECASTING. Luca Gambetti UAB, Barcelona GSE Master in Macroeconomic Policy and Financial Markets

TIME SERIES AND FORECASTING. Luca Gambetti UAB, Barcelona GSE Master in Macroeconomic Policy and Financial Markets TIME SERIES AND FORECASTING Luca Gambetti UAB, Barcelona GSE 2014-2015 Master in Macroeconomic Policy and Financial Markets 1 Contacts Prof.: Luca Gambetti Office: B3-1130 Edifici B Office hours: email:

More information

Minitab Project Report - Assignment 6

Minitab Project Report - Assignment 6 .. Sunspot data Minitab Project Report - Assignment Time Series Plot of y Time Series Plot of X y X 7 9 7 9 The data have a wavy pattern. However, they do not show any seasonality. There seem to be an

More information

Stochastic Processes

Stochastic Processes Stochastic Processes Stochastic Process Non Formal Definition: Non formal: A stochastic process (random process) is the opposite of a deterministic process such as one defined by a differential equation.

More information

Time Series Examples Sheet

Time Series Examples Sheet Lent Term 2001 Richard Weber Time Series Examples Sheet This is the examples sheet for the M. Phil. course in Time Series. A copy can be found at: http://www.statslab.cam.ac.uk/~rrw1/timeseries/ Throughout,

More information

Time Series Analysis

Time Series Analysis Time Series Analysis hm@imm.dtu.dk Informatics and Mathematical Modelling Technical University of Denmark DK-2800 Kgs. Lyngby 1 Outline of the lecture Chapter 9 Multivariate time series 2 Transfer function

More information

FORECASTING SUGARCANE PRODUCTION IN INDIA WITH ARIMA MODEL

FORECASTING SUGARCANE PRODUCTION IN INDIA WITH ARIMA MODEL FORECASTING SUGARCANE PRODUCTION IN INDIA WITH ARIMA MODEL B. N. MANDAL Abstract: Yearly sugarcane production data for the period of - to - of India were analyzed by time-series methods. Autocorrelation

More information

Applied Time. Series Analysis. Wayne A. Woodward. Henry L. Gray. Alan C. Elliott. Dallas, Texas, USA

Applied Time. Series Analysis. Wayne A. Woodward. Henry L. Gray. Alan C. Elliott. Dallas, Texas, USA Applied Time Series Analysis Wayne A. Woodward Southern Methodist University Dallas, Texas, USA Henry L. Gray Southern Methodist University Dallas, Texas, USA Alan C. Elliott University of Texas Southwestern

More information

Lecture 4 - Spectral Estimation

Lecture 4 - Spectral Estimation Lecture 4 - Spectral Estimation The Discrete Fourier Transform The Discrete Fourier Transform (DFT) is the equivalent of the continuous Fourier Transform for signals known only at N instants separated

More information

Generalised AR and MA Models and Applications

Generalised AR and MA Models and Applications Chapter 3 Generalised AR and MA Models and Applications 3.1 Generalised Autoregressive Processes Consider an AR1) process given by 1 αb)x t = Z t ; α < 1. In this case, the acf is, ρ k = α k for k 0 and

More information

Stochastic Processes. M. Sami Fadali Professor of Electrical Engineering University of Nevada, Reno

Stochastic Processes. M. Sami Fadali Professor of Electrical Engineering University of Nevada, Reno Stochastic Processes M. Sami Fadali Professor of Electrical Engineering University of Nevada, Reno 1 Outline Stochastic (random) processes. Autocorrelation. Crosscorrelation. Spectral density function.

More information

Time Series Examples Sheet

Time Series Examples Sheet Lent Term 2001 Richard Weber Time Series Examples Sheet This is the examples sheet for the M. Phil. course in Time Series. A copy can be found at: http://www.statslab.cam.ac.uk/~rrw1/timeseries/ Throughout,

More information

Lesson 9: Autoregressive-Moving Average (ARMA) models

Lesson 9: Autoregressive-Moving Average (ARMA) models Lesson 9: Autoregressive-Moving Average (ARMA) models Dipartimento di Ingegneria e Scienze dell Informazione e Matematica Università dell Aquila, umberto.triacca@ec.univaq.it Introduction We have seen

More information

EC402: Serial Correlation. Danny Quah Economics Department, LSE Lent 2015

EC402: Serial Correlation. Danny Quah Economics Department, LSE Lent 2015 EC402: Serial Correlation Danny Quah Economics Department, LSE Lent 2015 OUTLINE 1. Stationarity 1.1 Covariance stationarity 1.2 Explicit Models. Special cases: ARMA processes 2. Some complex numbers.

More information

Empirical Market Microstructure Analysis (EMMA)

Empirical Market Microstructure Analysis (EMMA) Empirical Market Microstructure Analysis (EMMA) Lecture 3: Statistical Building Blocks and Econometric Basics Prof. Dr. Michael Stein michael.stein@vwl.uni-freiburg.de Albert-Ludwigs-University of Freiburg

More information

Wavelet Methods for Time Series Analysis. Part IX: Wavelet-Based Bootstrapping

Wavelet Methods for Time Series Analysis. Part IX: Wavelet-Based Bootstrapping Wavelet Methods for Time Series Analysis Part IX: Wavelet-Based Bootstrapping start with some background on bootstrapping and its rationale describe adjustments to the bootstrap that allow it to work with

More information

Spectral Analysis. Jesús Fernández-Villaverde University of Pennsylvania

Spectral Analysis. Jesús Fernández-Villaverde University of Pennsylvania Spectral Analysis Jesús Fernández-Villaverde University of Pennsylvania 1 Why Spectral Analysis? We want to develop a theory to obtain the business cycle properties of the data. Burns and Mitchell (1946).

More information

Classic Time Series Analysis

Classic Time Series Analysis Classic Time Series Analysis Concepts and Definitions Let Y be a random number with PDF f Y t ~f,t Define t =E[Y t ] m(t) is known as the trend Define the autocovariance t, s =COV [Y t,y s ] =E[ Y t t

More information

Econometric Forecasting

Econometric Forecasting Robert M. Kunst robert.kunst@univie.ac.at University of Vienna and Institute for Advanced Studies Vienna October 1, 2014 Outline Introduction Model-free extrapolation Univariate time-series models Trend

More information