Chapter 13: Functional Autoregressive Models

1 Chapter 13: Functional Autoregressive Models

Jakub Černý
Department of Probability and Mathematical Statistics
Stochastic Modelling in Economics and Finance, December 9

2 Contents

1 Introduction

3 Causal model

Definition. Let $\{\varepsilon_n,\, n \in \mathbb{Z}\}$ be a white noise and let $\{c_j,\, j \in \mathbb{N}_0\}$ be a sequence of constants such that $\sum_{j=0}^{\infty} |c_j| < \infty$. A random sequence $\{X_n,\, n \in \mathbb{Z}\}$ defined as
$$X_n = \sum_{j=0}^{\infty} c_j \varepsilon_{n-j}, \qquad n \in \mathbb{Z},$$
is called a causal linear process.

Briefly, the definition says that a causal linear process depends only on the present and past errors, not on the future ones.
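As a quick illustration (my own sketch, not part of the slides), the following Python snippet simulates a truncated causal linear process; the coefficients $c_j = 0.7^j$ and the truncation point are arbitrary choices.

```python
import numpy as np

rng = np.random.default_rng(0)

def causal_linear_process(n, coeffs, rng):
    """Simulate X_t = sum_j c_j * eps_{t-j}, truncating the series at len(coeffs) terms."""
    J = len(coeffs)
    eps = rng.standard_normal(n + J)                      # white noise, with J extra past errors
    return np.array([coeffs @ eps[t + J:t:-1] for t in range(n)])

c = 0.7 ** np.arange(50)                                  # absolutely summable coefficients
X = causal_linear_process(500, c, rng)
print(X[:5])
```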

4 AR model

Definition. A random sequence $\{X_n,\, n \in \mathbb{Z}\}$ is called an autoregressive model of order p, AR(p), if
$$X_n = \sum_{i=1}^{p} \varphi_i X_{n-i} + \varepsilon_n,$$
where $\varphi_1, \dots, \varphi_p$ are real constants, $\varphi_p \neq 0$, and $\{\varepsilon_n,\, n \in \mathbb{Z}\}$ is a white noise.

In general, the definition of the AR model is based on white noise, but [Horváth, Kokoszka (2011)] assume that $\{\varepsilon_n,\, n \in \mathbb{Z}\}$ is a sequence of iid mean-zero random variables, which is a narrower class of processes.

5 Causal AR model

If $c_j = \psi^j$ and $|\psi| < 1$, the AR(1) model $\{X_n,\, n \in \mathbb{Z}\}$ is causal and has a unique solution of the form
$$X_n = \sum_{j=0}^{\infty} \psi^j \varepsilon_{n-j}.$$
Our goal is to take the AR(1) condition $|\psi| < 1$ and establish an analogous condition for the functional model.
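A minimal numerical check (again my own sketch, with the arbitrary choice $\psi = 0.6$): simulate the AR(1) recursion and compare it with the truncated causal series built from the same errors.

```python
import numpy as np

rng = np.random.default_rng(1)
psi, n, burn = 0.6, 300, 200
eps = rng.standard_normal(n + burn)

# AR(1) by recursion, with a burn-in so the effect of X_0 = 0 dies out
X = np.zeros(n + burn)
for t in range(1, n + burn):
    X[t] = psi * X[t - 1] + eps[t]

# causal-series representation, truncated at J terms
J = 100
X_series = np.array([sum(psi**j * eps[t - j] for j in range(J)) for t in range(burn, n + burn)])

print(np.max(np.abs(X[burn:] - X_series)))   # essentially zero: the two representations agree
```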

6 FAR(1) model

To lighten the notation, we write L for the space of bounded linear operators on $L^2$ and $\|\cdot\|$ for the corresponding operator norm.

Definition. We say that a sequence $\{X_n,\, n \in \mathbb{Z}\}$ of mean-zero elements of $L^2$ follows a functional AR(1) (FAR(1)) model if
$$X_n = \Psi(X_{n-1}) + \varepsilon_n, \qquad (1.1)$$
where $\Psi \in L$ and $\{\varepsilon_n,\, n \in \mathbb{Z}\}$ is a sequence of iid mean-zero errors in $L^2$ satisfying $E\|\varepsilon_n\|^2 < \infty$.

The following theory is developed under the assumption of iid errors; the general case can be found in [Bosq (2000)].
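A discretised simulation sketch (entirely my own construction; the grid size, the Gaussian-type kernel and the Brownian-bridge errors are arbitrary choices, the latter two echoing the example later in the slides):

```python
import numpy as np

rng = np.random.default_rng(2)
m = 101                                       # grid resolution on [0, 1]
t = np.linspace(0.0, 1.0, m)
dt = 1.0 / (m - 1)

# Gaussian-type kernel scaled so its Hilbert-Schmidt norm is 0.5 (comfortably a contraction)
K = np.exp(-(t[:, None] ** 2 + t[None, :] ** 2) / 2.0)
K *= 0.5 / np.sqrt(np.sum(K**2) * dt * dt)

def Psi(x):
    """Discretised integral operator: (Psi x)(t) = int psi(t, s) x(s) ds."""
    return K @ x * dt

def brownian_bridge(rng):
    """Error curve: Brownian motion on the grid, pinned to 0 at both ends."""
    w = np.concatenate(([0.0], np.cumsum(rng.standard_normal(m - 1)) * np.sqrt(dt)))
    return w - t * w[-1]

# FAR(1) recursion X_n = Psi(X_{n-1}) + eps_n, with burn-in
N, burn = 100, 50
X = np.zeros((N + burn, m))
for n in range(1, N + burn):
    X[n] = Psi(X[n - 1]) + brownian_bridge(rng)
X = X[burn:]
print(X.shape)                                # (100, 101): N curves observed on the grid
```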

7 Existence - Lemma

Lemma. For any $\Psi \in L$, the following two conditions are equivalent:
C1: There exists an integer $j_0 \ge 0$ such that $\|\Psi^{j_0}\| < 1$.
C2: There exist $a > 0$ and $0 < b < 1$ such that $\|\Psi^{j}\| \le a\, b^{j}$ for every $j \ge 0$.

Proof. Condition C2 clearly implies C1, so we only have to show that C1 implies C2. Write $j = q j_0 + r$, where $q \ge 0$ and $0 \le r < j_0$. Then
$$\|\Psi^{j}\| = \|\Psi^{q j_0 + r}\| \le \|\Psi^{j_0}\|^{q}\, \|\Psi^{r}\|.$$
If $\|\Psi^{j_0}\| = 0$, then C2 holds for any $a > 0$ and $0 < b < 1$, so we assume in the following that $\|\Psi^{j_0}\| > 0$.

8 Existence - Lemma cont.

Since $q > j/j_0 - 1$ and $\|\Psi^{j_0}\| < 1$, we get
$$\|\Psi^{j}\| \le \|\Psi^{j_0}\|^{j/j_0 - 1}\, \|\Psi^{r}\| \le \big(\|\Psi^{j_0}\|^{1/j_0}\big)^{j}\, \|\Psi^{j_0}\|^{-1} \max_{0 \le r < j_0} \|\Psi^{r}\|,$$
so C2 holds with
$$a = \|\Psi^{j_0}\|^{-1} \max_{0 \le r < j_0} \|\Psi^{r}\|, \qquad b = \|\Psi^{j_0}\|^{1/j_0}.$$

Note that condition C1 is weaker than the condition $\|\Psi\| < 1$. Nevertheless, C2 is a sufficiently strong condition to ensure the convergence of the series $\sum_j \Psi^{j}(\varepsilon_{n-j})$ and the existence of a stationary causal solution of the FAR(1) equations, as stated in the following theorem.
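A small finite-dimensional illustration (my own example, not from the slides) of why C1 is weaker than $\|\Psi\| < 1$: the matrix below has operator norm larger than 1, yet some power already has norm below 1, and from then on $\|\Psi^{j}\|$ decays geometrically, as C2 asserts.

```python
import numpy as np

# operator norm > 1, but the spectral radius is 0.5 < 1, so condition C1 still holds
Psi = np.array([[0.5, 2.0],
                [0.0, 0.5]])

def opnorm(A):
    return np.linalg.norm(A, 2)                       # spectral (operator) norm

print(opnorm(Psi))                                    # about 2.1, so ||Psi|| < 1 fails
norms = [opnorm(np.linalg.matrix_power(Psi, j)) for j in range(1, 21)]
print([j for j, v in zip(range(1, 21), norms) if v < 1][0])   # first power with norm below 1
print([n2 / n1 for n1, n2 in zip(norms, norms[1:])][-3:])     # ratios approach the spectral radius 0.5
```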

9 Existence and uniqueness - Theorem

Theorem. If condition C1 holds, then there is a unique strictly stationary causal solution to (1.1). This solution is given by
$$X_n = \sum_{j=0}^{\infty} \Psi^{j}(\varepsilon_{n-j}). \qquad (2.1)$$
The series converges almost surely and in the $L^2$ norm, i.e.
$$E\Big\| X_n - \sum_{j=0}^{m} \Psi^{j}(\varepsilon_{n-j}) \Big\|^2 \to 0, \qquad m \to \infty.$$

10 Existence and uniqueness - Proof

Proof. Recall that we work with $L^2(\Omega, L^2([0,1]))$, which is a Hilbert space with the inner product $E\langle X, Y\rangle$, $X, Y \in L^2([0,1])$. To show that the sequence $X_n^{(m)} = \sum_{j=0}^{m} \Psi^{j}(\varepsilon_{n-j})$ has a limit in $L^2(\Omega, L^2([0,1]))$, it suffices to check that, for every fixed $n$, it is a Cauchy sequence in $m$. For $m' > m$,
$$E\Big\| \sum_{j=m+1}^{m'} \Psi^{j}(\varepsilon_{n-j}) \Big\|^2 = \sum_{j,k=m+1}^{m'} E\big\langle \Psi^{j}(\varepsilon_{n-j}), \Psi^{k}(\varepsilon_{n-k}) \big\rangle = \sum_{j=m+1}^{m'} E\big\| \Psi^{j}(\varepsilon_{n-j}) \big\|^2,$$
because the cross terms vanish by the independence and zero mean of the errors.

11 Existence and uniqueness - Proof cont.

Therefore, by the previous lemma,
$$E\Big\| \sum_{j=m+1}^{m'} \Psi^{j}(\varepsilon_{n-j}) \Big\|^2 \le \sum_{j=m+1}^{m'} \|\Psi^{j}\|^2\, E\|\varepsilon_0\|^2 \le E\|\varepsilon_0\|^2\, a^2 \sum_{j=m+1}^{m'} b^{2j}.$$
Thus $X_n^{(m)}$ converges in $L^2(\Omega, L^2([0,1]))$.

For the almost sure convergence it is enough to verify that $\sum_{j=0}^{\infty} \|\Psi^{j}(\varepsilon_{n-j})\| < \infty$ a.s. By condition C2,
$$E\Big( \sum_{j=0}^{\infty} \|\Psi^{j}(\varepsilon_{n-j})\| \Big)^2 \le \sum_{j,k=0}^{\infty} \|\Psi^{j}\|\, \|\Psi^{k}\|\, E\|\varepsilon_0\|^2 \le E\|\varepsilon_0\|^2 \Big( a \sum_{j=0}^{\infty} b^{j} \Big)^2,$$
which is finite, and so $\sum_{j=0}^{\infty} \|\Psi^{j}(\varepsilon_{n-j})\| < \infty$ a.s.

12 Existence and uniqueness - Proof cont. 2

The series $X_n$ is strictly stationary and it satisfies (1.1). Suppose $\{X'_n\}$ is another strictly stationary causal sequence satisfying (1.1). Then, iterating (1.1), we obtain, for any $m \ge 1$,
$$X'_n = \sum_{j=0}^{m} \Psi^{j}(\varepsilon_{n-j}) + \Psi^{m+1}(X'_{n-m-1}).$$
Therefore
$$E\| X'_n - X_n^{(m)} \| \le \|\Psi^{m+1}\|\, E\|X'_{n-m-1}\| = \|\Psi^{m+1}\|\, E\|X'_0\| \le E\|X'_0\|\, a\, b^{m+1}.$$
Hence $X'_n$ is equal a.s. to the limit of $X_n^{(m)}$, i.e. to $X_n$.

13 Brief example

Consider an integral Hilbert–Schmidt operator on $L^2$ defined by
$$\Psi(x)(t) = \int \psi(t, s)\, x(s)\, ds, \qquad x \in L^2,$$
which satisfies
$$\int\!\!\int \psi^2(t, s)\, dt\, ds < 1. \qquad (2.2)$$
From the introduction ([Horváth, Kokoszka (2011), chapter 2.2]) we know that the left-hand side of this inequality equals $\|\Psi\|_S^2$. Since $\|\Psi\| \le \|\Psi\|_S$, we see that (2.2) implies condition C1 with $j_0 = 1$.
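A numerical counterpart of this example (a sketch under the same discretisation as the FAR(1) simulation above): for a Gaussian kernel scaled so that $\|\Psi\|_S = 1/2$, the operator norm of the discretised operator is bounded by the Hilbert–Schmidt norm, so C1 holds with $j_0 = 1$.

```python
import numpy as np

m = 201
s = np.linspace(0.0, 1.0, m)
ds = 1.0 / (m - 1)

# Gaussian kernel scaled so that the Hilbert-Schmidt norm equals 1/2
K = np.exp(-(s[:, None] ** 2 + s[None, :] ** 2) / 2.0)
K *= 0.5 / np.sqrt(np.sum(K**2) * ds * ds)

hs_norm = np.sqrt(np.sum(K**2) * ds * ds)        # ||Psi||_S = sqrt(int int psi^2 dt ds)
op_norm = np.linalg.norm(K, 2) * ds              # operator norm of the discretised Psi

print(hs_norm)                                   # 0.5 by construction
print(op_norm)                                   # <= 0.5 (equal here, since this kernel factorises as f(t)f(s))
```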

14 Ψ - Idea

First, we focus on the AR(1) model $X_n = \psi X_{n-1} + \varepsilon_n$ and assume $|\psi| < 1$, so that a stationary solution exists. Multiplying the equation by $X_{n-1}$ and taking the expected value, we get $\gamma_1 = \psi \gamma_0$, where $\gamma_k = \mathrm{Cov}(X_n, X_{n-k}) = E(X_n X_{n-k})$. The usual estimator of $\psi$ is $\hat\psi = \hat\gamma_1 / \hat\gamma_0$, where $\hat\gamma_k$ are the sample autocovariances. This approach is known as Yule–Walker estimation.
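In a scalar sketch (mine, with the arbitrary value $\psi = 0.6$), the Yule–Walker estimator is just a ratio of sample autocovariances:

```python
import numpy as np

rng = np.random.default_rng(3)
psi, n = 0.6, 2000
eps = rng.standard_normal(n)
X = np.zeros(n)
for t in range(1, n):
    X[t] = psi * X[t - 1] + eps[t]

# sample autocovariances and the Yule-Walker estimate psi_hat = gamma_1 / gamma_0
Xc = X - X.mean()
gamma0 = np.mean(Xc * Xc)
gamma1 = np.mean(Xc[1:] * Xc[:-1])
print(gamma1 / gamma0)                           # close to 0.6
```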

15 Ψ

We want to apply this technique to the functional model. By (1.1) and under condition C1,
$$E[\langle X_n, x\rangle X_{n-1}] = E[\langle \Psi(X_{n-1}), x\rangle X_{n-1}], \qquad x \in L^2.$$
The lag-1 autocovariance operator is defined by
$$C_1(x) = E[\langle X_n, x\rangle X_{n+1}].$$
Denoting by $C(x) = E[\langle X_n, x\rangle X_n]$ the covariance operator and by $\Psi^{T}$ the adjoint of $\Psi$, the identity above gives $C_1^{T} = C\,\Psi^{T}$, i.e. $C_1 = \Psi C$. This is analogous to the relation $\gamma_1 = \psi\gamma_0$ for the AR(1) process, so we would like to obtain an estimate of $\Psi$ by using a finite-sample version of the relation
$$\Psi = C_1 C^{-1}.$$

16 Ψ - operator C

However, the operator $C$ does not have a bounded inverse on the whole of $H$. Recall that $C^{-1}(C(x)) = x$, where
$$C^{-1}(y) = \sum_{j=1}^{\infty} \lambda_j^{-1} \langle y, v_j\rangle v_j$$
and $(\lambda_j, v_j)$ are the eigenvalue–eigenfunction pairs of $C$. The operator $C^{-1}$ is defined if all $\lambda_j$ are positive. Since $\|C^{-1}(v_n)\| = \lambda_n^{-1} \to \infty$ as $n \to \infty$, it is unbounded. This makes the estimation of the bounded operator $\Psi$ via the relation $\Psi = C_1 C^{-1}$ difficult.

Note that if $\lambda_1 \ge \lambda_2 \ge \dots \ge \lambda_p > \lambda_{p+k} = 0$ for all $k = 1, 2, \dots$, then $\{X_n\}$ lies in the space spanned by $\{v_1, \dots, v_p\}$. On this subspace we define
$$C^{-1}(y) = \sum_{j=1}^{p} \lambda_j^{-1} \langle y, v_j\rangle v_j.$$

17 Ψ - EFPC

A practical solution is to use only the $p$ most important EFPC's $\hat v_j$ and to define
$$\widehat{IC}_p(x) = \sum_{j=1}^{p} \hat\lambda_j^{-1} \langle x, \hat v_j\rangle \hat v_j.$$
The operator $\widehat{IC}_p$ is defined on the whole of $L^2$, and it is bounded if $\hat\lambda_j > 0$ for $j \le p$. A reasonable choice of $p$ balances retaining the relevant information in the sample against the danger of working with the small eigenvalues.
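A discretised sketch of the EFPC's and of the truncated inverse (my own helper functions, for curves stored as the rows of an N x m array on a uniform grid; the slides themselves work directly in $L^2$):

```python
import numpy as np

def efpc(X, dt, p):
    """EFPC's of curves X (N x m, rows = curves on a uniform grid with spacing dt)."""
    Xc = X - X.mean(axis=0)
    Chat = (Xc.T @ Xc) / len(X)                  # empirical covariance kernel c_hat(t_i, t_j)
    lam, V = np.linalg.eigh(Chat * dt)           # discretised eigenproblem: int c(t,s) v(s) ds = lam v(t)
    lam, V = lam[::-1][:p], V[:, ::-1][:, :p]    # keep the p largest eigenvalues
    V = V / np.sqrt(dt)                          # normalise so that int v_j(t)^2 dt = 1
    return lam, V

def IC_p(x, lam, V, dt):
    """Truncated inverse of the covariance operator applied to a single curve x."""
    scores = V.T @ x * dt                        # <x, v_j>
    return V @ (scores / lam)                    # sum_j lam_j^{-1} <x, v_j> v_j

# e.g. with the simulated FAR(1) curves X and grid spacing dt from the earlier sketch:
# lam, V = efpc(X, dt, p=3)
# y = IC_p(X[0], lam, V, dt)
```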

18 Ψ - Estimator

To derive a computable estimator of $\Psi$, we use an empirical version of $C_1 = \Psi C$. The operator $C_1$ is estimated by
$$\widehat{C}_1(x) = \frac{1}{N-1} \sum_{k=1}^{N-1} \langle X_k, x\rangle X_{k+1},$$
so we get, for any $x \in L^2$,
$$\widehat{C}_1 \widehat{IC}_p(x) = \widehat{C}_1\Big( \sum_{j=1}^{p} \hat\lambda_j^{-1} \langle x, \hat v_j\rangle \hat v_j \Big) = \frac{1}{N-1} \sum_{k=1}^{N-1} \Big\langle X_k, \sum_{j=1}^{p} \hat\lambda_j^{-1} \langle x, \hat v_j\rangle \hat v_j \Big\rangle X_{k+1} = \frac{1}{N-1} \sum_{k=1}^{N-1} \sum_{j=1}^{p} \hat\lambda_j^{-1} \langle x, \hat v_j\rangle \langle X_k, \hat v_j\rangle X_{k+1}.$$

19 Ψ - Estimator cont.

The estimator $\widehat{C}_1 \widehat{IC}_p$ can be used in principle, but typically an additional smoothing step is introduced by using the approximation
$$X_{k+1} \approx \sum_{i=1}^{p} \langle X_{k+1}, \hat v_i\rangle \hat v_i.$$
From this follows
$$\widehat{\Psi}_p(x) = \frac{1}{N-1} \sum_{k=1}^{N-1} \sum_{j=1}^{p} \sum_{i=1}^{p} \hat\lambda_j^{-1} \langle x, \hat v_j\rangle \langle X_k, \hat v_j\rangle \langle X_{k+1}, \hat v_i\rangle \hat v_i.$$
The estimator is consistent if $p = p_N$ grows with the sample size $N$ at a suitable rate. Sufficient conditions for $\|\widehat{\Psi}_p - \Psi\| \to 0$ can be found in [Bosq (2000)].

20 Ψ - Estimator cont. 2

The estimator is a kernel operator with kernel $\hat\psi_p$ such that
$$\widehat{\Psi}_p(x)(t) = \int \hat\psi_p(t, s)\, x(s)\, ds,$$
from which it follows that
$$\hat\psi_p(t, s) = \frac{1}{N-1} \sum_{k=1}^{N-1} \sum_{j=1}^{p} \sum_{i=1}^{p} \hat\lambda_j^{-1} \langle X_k, \hat v_j\rangle \langle X_{k+1}, \hat v_i\rangle \hat v_j(s)\, \hat v_i(t).$$
The EFPC's needed for this kernel estimate can be obtained with the R function pca.fd from the fda package (Functional Data Analysis). More about this package can be found at
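Sticking with Python for consistency with the earlier sketches (rather than pca.fd), the kernel formula above can be evaluated on the discretised sample as follows; the function reuses the hypothetical efpc helper defined after the EFPC slide, and all names are my own.

```python
import numpy as np

def far1_kernel_estimate(X, dt, p):
    """Estimate psi_hat_p(t, s) from curves X (N x m) via the EFPC formula above."""
    lam, V = efpc(X, dt, p)                      # hat-lambda_j and hat-v_j(t) on the grid
    Xc = X - X.mean(axis=0)                      # centre the curves (the model assumes mean zero)
    N = len(X)
    scores = Xc @ V * dt                         # scores[k, j] = <X_{k+1}, v_j> (0-based k)
    # B[j, i] = (N-1)^{-1} * lam_j^{-1} * sum_k <X_k, v_j> <X_{k+1}, v_i>
    B = (scores[:-1].T @ scores[1:]) / (N - 1) / lam[:, None]
    # psi_hat_p(t, s) = sum_{j,i} B[j, i] v_j(s) v_i(t); rows of the result index t, columns index s
    return V @ B.T @ V.T

# e.g. with the simulated FAR(1) curves X and grid spacing dt from the earlier sketch:
# psi_hat = far1_kernel_estimate(X, dt, p=3)
# Psi_hat = lambda x: psi_hat @ x * dt           # apply the estimated operator to a curve x
```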

21 Ψ - Example

Figure: The kernel surface $\psi(t, s)$ (top left) and its estimates $\hat\psi(t, s)$ for p = 2, 3, 4 (Source: [Horváth, Kokoszka (2011), p. 241]).

22 Ψ - Example cont.

The previous figure shows the Gaussian kernel $\psi(t, s) = \alpha \exp\{-(t^2 + s^2)/2\}$, with $\alpha$ chosen so that the Hilbert–Schmidt norm equals 1/2, and three estimates (series length N = 100) for p = 2, 3, 4, where the errors $\varepsilon_n$ were generated as Brownian bridges. We see very large discrepancies in magnitude and in shape (which were also observed for other kernels and innovations). Moreover, by any reasonable distance measure between two surfaces, the distance between $\psi$ and $\hat\psi_p$ increases as $p$ increases. This is counterintuitive, because by using more EFPC's $\hat v_j$ we would expect the approximation to improve.

23 Ψ - Example cont. 2

The sums $\sum_{j=1}^{p} \hat\lambda_j$ explain 74, 83 and 87 % of the variance for p = 2, 3, 4, but the absolute deviation distances between $\psi$ and $\hat\psi_p$ are 0.4 and 0.44 for p = 2 and 3, with a still larger value for p = 4. The same pattern was obtained for the RMSE distance $\|\hat\psi - \psi\|_S$. As N increases, these distances decrease, but their tendency to increase with p remains.

This problem is partly due to the fact that, for many FAR(1) models, the estimated eigenvalues $\hat\lambda_j$ are very small except for $\hat\lambda_1$ and $\hat\lambda_2$, so a small error in their estimation means a large error in the reciprocals $\hat\lambda_j^{-1}$. A partial solution, proposed in [Kokoszka, Zhang (2010)], is to add a positive baseline to the $\hat\lambda_j$. However, as we shall see, precise estimation of the kernel $\psi$ is not necessary to obtain satisfactory predictions.

24 References

Bosq, D.: Linear Processes in Function Spaces, Springer, New York, 2000.
Horváth, L., Kokoszka, P.: Inference for Functional Data with Applications, Preprint, 2011.
Kokoszka, P., Zhang, X.: Improved estimation of the kernel of the functional autoregressive process, Technical Report, Utah State University, 2010.
Prášková, Z.: Základy náhodných procesů II, Karolinum, Praha.

25 Thank you for your attention!

Jakub Černý
