Applications of Distance Correlation to Time Series
Slide 1. Applications of Distance Correlation to Time Series. Richard A. Davis (Columbia University), Muneya Matsui (Nanzan University), Thomas Mikosch (University of Copenhagen), Phyllis Wan (Columbia University). May 2-6, 2016, Workshop on Dependence, Stability, and Extremes, The Fields Institute.
Slides 2-4. Warm-up example: Amazon returns.
[Figure: time series plot of the Amazon returns; sample ACFs of the returns, the squared returns, and the absolute returns; ADCF of the Amazon returns, each against lag.]
Slides 5-6. Example: Kilkenny wind speed time series, bonus (teaser). Kilkenny: 1/1/61 to 1/17/78.
[Figure: time series plot and sample ACF of the Kilkenny series; ACF of the AR(9) residuals and their ADCF, each against lag.]
Figure: Auto-distance correlation function (ADCF) of residuals from an AR(9) model applied to the Kilkenny daily wind speed.
Slides 7-9. Distance covariance: a measure of dependence.

Distance covariance: for random vectors X \in R^p and Y \in R^q,

    T(X, Y; \mu) = \int_{R^{p+q}} |\varphi_{X,Y}(s,t) - \varphi_X(s)\varphi_Y(t)|^2 \, \mu(ds, dt),

where \varphi_{X,Y}, \varphi_X, \varphi_Y denote the respective characteristic functions of (X, Y), X, Y, and \mu is a measure.

Distance correlation:

    R(X, Y; \mu) = T(X, Y; \mu) / [T(X, X; \mu) \, T(Y, Y; \mu)]^{1/2}.

Sample distance covariance: given observations (X_1, Y_1), ..., (X_n, Y_n) from a stationary time series (X_t, Y_t),

    T_n^{X,Y}(0) = \int_{R^{p+q}} |\hat\varphi_{X,Y}(s,t) - \hat\varphi_X(s)\hat\varphi_Y(t)|^2 \, \mu(ds, dt),

where \hat\varphi denotes the empirical characteristic function, e.g.

    \hat\varphi_{X,Y}(s,t) = n^{-1} \sum_{j=1}^n e^{i<s, X_j> + i<t, Y_j>}.
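As an illustration (not from the slides), the sample statistic T_n^{X,Y}(0) can be approximated directly from this definition by Monte Carlo integration of the squared empirical-characteristic-function difference against a finite weight measure; here we take \mu = N(0,1) x N(0,1) for scalar series, and the function name is our own.

```python
import numpy as np

def t_n_mc(x, y, n_mc=4000, seed=0):
    """Monte Carlo approximation of
    T_n(0) = int |phi_hat_{X,Y}(s,t) - phi_hat_X(s) phi_hat_Y(t)|^2 mu(ds,dt)
    for scalar series, with weight mu = N(0,1) x N(0,1) (a finite measure)."""
    x = np.asarray(x, dtype=float)
    y = np.asarray(y, dtype=float)
    rng = np.random.default_rng(seed)
    s = rng.standard_normal(n_mc)    # draws (s_j, t_j) ~ mu
    t = rng.standard_normal(n_mc)
    # empirical characteristic functions evaluated at the mu-draws
    phi_x = np.exp(1j * np.outer(s, x)).mean(axis=1)
    phi_y = np.exp(1j * np.outer(t, y)).mean(axis=1)
    phi_xy = np.exp(1j * (np.outer(s, x) + np.outer(t, y))).mean(axis=1)
    # average of the squared modulus over the draws approximates the integral
    return np.mean(np.abs(phi_xy - phi_x * phi_y) ** 2)
```

For dependent pairs (e.g. y = x) the statistic stays bounded away from zero, while for independent samples it is of order 1/n.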
Slide 10. Background.
- Feuerverger and Mureika (1977), Feuerverger et al. (1981, ...): applications of the empirical characteristic function, inference using Fourier methods, etc.
- Feuerverger (1993): a bivariate test for independence, cor(cos(sX), cos(tY)), etc.
- S. Csörgő (1981a,b,c): limit behavior of empirical characteristic functions.
- Meintanis et al. (2008, 2015): Fourier methods for testing multivariate independence.
- Székely, Rizzo, and Bakirov (2007), Székely and Rizzo (2009, 2014): special choice of weight function, Brownian distance covariance.
- Dueck et al. (2014): affinely invariant distance correlation.
- Zhou (2012): application to time series.
- Fokianos and Pitsillou (2016): testing pairwise dependence in time series.
Slides 11-12. Distance covariance: choice of weight function.

    T^{X,Y}(0) = \int_{R^{p+q}} |\varphi_{X,Y}(s,t) - \varphi_X(s)\varphi_Y(t)|^2 \, \mu(ds, dt)

If \mu = \mu_1 \times \mu_2, let \hat\mu_i be the Fourier transform of \mu_i, i.e.

    \hat\mu_1(x) = \int_{R^p} e^{i<s,x>} \, d\mu_1(s)   (and analogously \hat\mu_2 on R^q),

and, assuming Fubini's theorem applies,

    T(X, Y; \mu) = E[\hat\mu_1(X - X') \hat\mu_2(Y - Y')] + E[\hat\mu_1(X - X')] E[\hat\mu_2(Y - Y')]
                   - 2 E[\hat\mu_1(X - X') \hat\mu_2(Y - Y'')],

where (X, Y), (X', Y'), (X'', Y'') are iid copies.

Choose the \mu_i so that the \hat\mu_i have an explicit, easy-to-compute form.
Slide 13. Distance covariance: computing T_n.

Given the Fourier transforms \hat\mu_i, the sample statistic can be computed as

    T_n^{X,Y}(0) = n^{-2} \sum_{s,t=1}^n \hat\mu_1(X_s - X_t) \hat\mu_2(Y_s - Y_t)
                   + n^{-2} \sum_{s,t=1}^n \hat\mu_1(X_s - X_t) \cdot n^{-2} \sum_{s,t=1}^n \hat\mu_2(Y_s - Y_t)
                   - 2 n^{-3} \sum_{s,t,u=1}^n \hat\mu_1(X_s - X_t) \hat\mu_2(Y_s - Y_u).
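A minimal sketch of this computation (our own illustration, not the authors' code): for scalar series with the Székely-Rizzo kernel \hat\mu_1(x) = \hat\mu_2(x) = |x| (the \alpha = 1 case of the weight function discussed on the next slides), the three sums reduce to operations on pairwise-distance matrices. Function names are ours.

```python
import numpy as np

def distance_covariance(x, y):
    """T_n(0) for scalar series with kernel mu_hat(x) = |x|
    (Szekely-Rizzo weight, alpha = 1)."""
    x = np.asarray(x, dtype=float)
    y = np.asarray(y, dtype=float)
    n = len(x)
    a = np.abs(x[:, None] - x[None, :])      # mu1_hat(X_s - X_t)
    b = np.abs(y[:, None] - y[None, :])      # mu2_hat(Y_s - Y_t)
    term1 = (a * b).mean()                   # n^-2 sum_{s,t} a*b
    term2 = a.mean() * b.mean()              # (n^-2 sum a)(n^-2 sum b)
    # n^-3 sum_{s,t,u} a[s,t] b[s,u] via the row sums of each matrix
    term3 = np.dot(a.sum(axis=1), b.sum(axis=1)) / n**3
    return term1 + term2 - 2.0 * term3

def distance_correlation(x, y):
    """R_n = T_n(X, Y) / sqrt(T_n(X, X) T_n(Y, Y))."""
    txx, tyy = distance_covariance(x, x), distance_covariance(y, y)
    if txx <= 0.0 or tyy <= 0.0:
        return 0.0
    return distance_covariance(x, y) / np.sqrt(txx * tyy)
```

For example, distance_covariance([0, 1], [0, 1]) evaluates to 0.25, and distance_correlation(x, x) is 1 for any non-constant x.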
Slides 14-16. Distance covariance: choice of weight function.

Finite (probability) measures:
- normal density: \hat\mu(x) = \exp\{ -\tfrac{1}{2} x' \Sigma x \};
- sub-Gaussian \alpha/2-stable: \hat\mu(x, y) = \exp\{ -((x, y)' \Sigma (x, y))^{\alpha/2} \}, \alpha \in (0, 2).

Infinite measures: the Lévy measure corresponding to an infinitely divisible random vector. Székely and Rizzo (2009):

    w(s, t) = \frac{1}{c_p c_q \, |s|_p^{\alpha+p} \, |t|_q^{\alpha+q}}

with \alpha \in (0, 2), giving \hat\mu(x) = |x|^\alpha.

Distance correlation is scale and rotation invariant relative to w(s, t).
Slides 17-18. Results: consistency.

Existence of

    T(X, Y; \mu) = \int_{R^{p+q}} |\varphi_{X,Y}(s,t) - \varphi_X(s)\varphi_Y(t)|^2 \, \mu(ds, dt)

holds under either of:
1. \mu is a finite measure;
2. \mu is infinite in a neighborhood of the origin and, for some \alpha \in (0, 2], E[|X|^\alpha] + E[|Y|^\alpha] < \infty and \int_{R^{p+q}} (1 \wedge |(s, t)|^\alpha) \, \mu(ds, dt) < \infty.

Consistency: if (X_t, Y_t) is a stationary ergodic sequence satisfying 1 or 2 above, then T_n^{X,Y}(h) \to T^{X,Y}(h) a.s.
Slide 19. Results: weak convergence.

Assume that X_0 and Y_0 are independent, that the series is \alpha-mixing with \sum_{h=1}^\infty \alpha_h^{1/r} < \infty for some r > 1, and that, with u = 2r/(r-1) and \tilde\alpha = \min(2, \alpha), the moment conditions

    E[|X|^\alpha + |Y|^\alpha] < \infty,  E[\prod_{l=1}^p |X^{(l)}|^\alpha] < \infty,  E[\prod_{l=1}^q |Y^{(l)}|^\alpha] < \infty,   (1)

and

    \int_{R^{p+q}} (1 \wedge |s|^{\tilde\alpha (1+\epsilon)/u}) (1 \wedge |t|^{\tilde\alpha (1+\epsilon)/u}) \, \mu(ds, dt) < \infty   (2)

hold. Then

    n \, T_n(X, Y; \mu) \to_d \|G\|_\mu^2 = \int_{R^{p+q}} |G(s, t)|^2 \, \mu(ds, dt),   (3)

where G is a complex-valued mean-zero Gaussian process.
Slide 20. Results: weak convergence.

Assume that X_0 and Y_0 are dependent and that, for some \alpha \in (u/2, u] and \tilde\alpha = \min(2, \alpha), the following hold:

    E[|X|^{2\alpha} + |Y|^{2\alpha}] < \infty,  E\Big[ \prod_{l=1}^p (1 \wedge |X^{(l)}|^\alpha) \prod_{k=1}^q (1 \wedge |Y^{(k)}|^\alpha) \Big] < \infty,   (4)

and

    \int_{R^{p+q}} (1 \wedge |s|^{\tilde\alpha/u}) (1 \wedge |t|^{\tilde\alpha/u}) \, \mu(ds, dt) < \infty.   (5)

Then

    \sqrt{n} \, (T_n(X, Y; \mu) - T(X, Y; \mu)) \to_d \|G'\|_\mu = \int_{R^{p+q}} G'(s, t) \, \mu(ds, dt),   (6)

where G'(s, t) = 2 \mathrm{Re}\{G(s, t) c(s, t)\} is a mean-zero Gaussian process.
Slides 21-22. Distance correlation and AR(p) models.

Let (X_t) be the causal AR(p) process given by

    X_t = \sum_{k=1}^p \phi_k X_{t-k} + Z_t,   (Z_t) ~ IID(0, \sigma^2).

Least squares estimate from observations X_1, ..., X_n, with X_{t-1} = (X_{t-1}, ..., X_{t-p})':

    \hat\phi - \phi = \hat\Gamma_{n,p}^{-1} \, n^{-1} \sum_{t=p+1}^n X_{t-1} Z_t,   \hat\Gamma_{n,p} = n^{-1} \sum_{t=p+1}^n X_{t-1} X_{t-1}'.

Asymptotics:

    n^{1/2} (\hat\phi - \phi) \to_d Q ~ N(0, \sigma^2 \Sigma_p^{-1}).
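The least-squares fit above is straightforward to sketch; the following is our own illustration on a simulated causal AR(2) (coefficients and names are ours, not from the slides).

```python
import numpy as np

def fit_ar_ls(x, p):
    """Least-squares estimate of (phi_1, ..., phi_p):
    regress X_t on X_{t-1} = (X_{t-1}, ..., X_{t-p})."""
    x = np.asarray(x, dtype=float)
    # design row for time t is (X_{t-1}, ..., X_{t-p}), t = p+1, ..., n
    design = np.column_stack([x[p - k: len(x) - k] for k in range(1, p + 1)])
    target = x[p:]
    phi_hat, *_ = np.linalg.lstsq(design, target, rcond=None)
    return phi_hat

# simulate a causal AR(2): X_t = 0.5 X_{t-1} - 0.3 X_{t-2} + Z_t, Z_t ~ N(0,1)
rng = np.random.default_rng(42)
n, phi = 5000, np.array([0.5, -0.3])
x = np.zeros(n)
for t in range(2, n):
    x[t] = phi[0] * x[t - 1] + phi[1] * x[t - 2] + rng.standard_normal()
phi_hat = fit_ar_ls(x, 2)   # close to (0.5, -0.3) for n this large
```

By the asymptotic normality above, the estimation error is of order n^{-1/2}, so with n = 5000 the estimate lands within a few hundredths of the true coefficients.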
Slides 23-27. Distance correlation and AR(p) residuals.

Fitted residuals:

    \hat Z_k = X_k - \hat\phi' X_{k-1} = Z_k + (\phi - \hat\phi)' X_{k-1},   k = p+1, ..., n.

Difference in empirical characteristic functions between noise and residuals:

    n^{-1/2} \sum_{k=p+1}^n e^{i s \hat Z_k + i t \hat Z_{k+h}} - n^{-1/2} \sum_{k=p+1}^n e^{i s Z_k + i t Z_{k+h}}
      = n^{-1/2} \sum_{k=p+1}^n e^{i s Z_k + i t Z_{k+h}} \big( e^{i s (\phi - \hat\phi)' X_{k-1} + i t (\phi - \hat\phi)' X_{k+h-1}} - 1 \big)
      \approx n^{-1/2} \sum_{k=p+1}^n e^{i s Z_k + i t Z_{k+h}} \big( i s (\phi - \hat\phi)' X_{k-1} + i t (\phi - \hat\phi)' X_{k+h-1} \big)
      = n^{1/2} (\phi - \hat\phi)' \, n^{-1} \sum_{k=p+1}^n e^{i s Z_k + i t Z_{k+h}} (i s X_{k-1} + i t X_{k+h-1})
      \to_d -Q' \, E\big[ e^{i s Z_1 + i t Z_{h+1}} (i s X_0 + i t X_h) \big].
Slides 28-30. ADCF of AR(p) residuals: Z_t with finite and infinite variance.

Let \hat Z_t = X_t - \sum_{k=1}^p \hat\phi_k X_{t-k} and write \tilde R_n(h) := R_n(\hat Z_1, \hat Z_{h+1}; \mu) for the residual ADCF at lag h.

Theorem. Assume

    \int \big[ (1 \wedge s^2)(1 \wedge t^2) + (s^2 + t^2) 1_{\{|st| > 1\}} \big] \, \mu(ds, dt) < \infty.

1. If E Z_t^2 < \infty, then

    n \tilde R_n(h) \to_d \frac{1}{T(0)} \int |G_Z(s, t) + \xi_h(s, t)|^2 \, \mu(ds, dt),

where \xi_h(s, t) = t \varphi_Z(t) \varphi_Z(s) \psi_h' Q.

2. If Z_t \in DOA(\alpha) with index \alpha \in (0, 2), then

    n \tilde R_n(h) \to_d \frac{1}{T(0)} \int |G_Z(s, t)|^2 \, \mu(ds, dt),

i.e. in the infinite-variance case the correction term \xi_h vanishes and the limit coincides with that for iid noise.
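The phenomenon the theorem describes, that residuals of a fitted AR model look nearly independent through the ADCF while the raw series does not, can be probed with a compact simulation. This is our own sketch (scalar series, \alpha = 1 kernel for T_n; all names are ours), not the authors' code.

```python
import numpy as np

def dcov(x, y):
    # T_n(0) with kernel |.| (alpha = 1), scalar series
    x, y = np.asarray(x, float), np.asarray(y, float)
    n = len(x)
    a = np.abs(x[:, None] - x[None, :])
    b = np.abs(y[:, None] - y[None, :])
    return (a * b).mean() + a.mean() * b.mean() \
        - 2.0 * np.dot(a.sum(axis=1), b.sum(axis=1)) / n**3

def adcf(z, h):
    """Lag-h auto-distance correlation of a scalar series."""
    x, y = z[:-h], z[h:]
    return dcov(x, y) / np.sqrt(dcov(x, x) * dcov(y, y))

def ar_residuals(x, p):
    # least-squares AR(p) fit, returning the fitted residuals Z_hat
    design = np.column_stack([x[p - k: len(x) - k] for k in range(1, p + 1)])
    phi_hat, *_ = np.linalg.lstsq(design, x[p:], rcond=None)
    return x[p:] - design @ phi_hat

# simulate an AR(1) with N(0,1) noise and compare ADCFs at lag 1
rng = np.random.default_rng(7)
n, phi = 1000, 0.6
x = np.zeros(n)
for t in range(1, n):
    x[t] = phi * x[t - 1] + rng.standard_normal()
res = ar_residuals(x, 1)
# adcf(x, 1) is substantial; adcf(res, 1) is near the iid-noise level
```

Repeating the simulation many times and recording quantiles of adcf(res, h) over a range of lags reproduces, in outline, the residual-versus-iid-noise comparison shown in the figures that follow.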
Slides 31-35. Example: ADCF of AR(10) residuals, Z_t ~ N(0, 1) vs. Z_t ~ t(1.5).
[Figure: empirical 5%, 50%, 95% quantiles of the ADCF against lag for the fitted residuals, for iid noise, and for a bootstrap approximation; Z_t ~ N(0, 1) in the upper panel and Z_t ~ t(1.5) in the lower panel.]
Slide 36. Example: ADCF of AR(10) residuals with symmetric Gamma(0.2, 0.5) noise. The Székely-Rizzo weight function

    w(s, t) = \frac{1}{c_p c_q \, |s|_p^{1+p} \, |t|_q^{1+q}}

need not work.
[Figure: left panel, box-plots from 500 independent replications; right panel, empirical 5%, 50%, 95% quantiles from simulated residuals and from iid noise.]
Slides 37-38. Example: Kilkenny wind speed time series.
[Figure: ADCF of the AR(9) residuals against lag, with bootstrap and iid-noise reference quantiles.]
Figure: Auto-distance correlation function (ADCF) of residuals from an AR(9) model applied to the Kilkenny daily wind speed.
Slide 39. Example: Kilkenny wind speed time series.
[Figure: sample ACF of the GARCH residuals, of their squares, and of their absolute values, together with their ADCF, each against lag.]
Figure: Auto-distance correlation function (ADCF) of residuals from an AR(9) model followed by a GARCH(1,1) model applied to the Kilkenny daily wind speed.
L30-1 EEL 5544 Noise in Linear Systems Lecture 30 OTHER TRANSFORMS For a continuous, nonnegative RV X, the Laplace transform of X is X (s) = E [ e sx] = 0 f X (x)e sx dx. For a nonnegative RV, the Laplace
More informationLecture 4: Introduction to stochastic processes and stochastic calculus
Lecture 4: Introduction to stochastic processes and stochastic calculus Cédric Archambeau Centre for Computational Statistics and Machine Learning Department of Computer Science University College London
More information7. The Multivariate Normal Distribution
of 5 7/6/2009 5:56 AM Virtual Laboratories > 5. Special Distributions > 2 3 4 5 6 7 8 9 0 2 3 4 5 7. The Multivariate Normal Distribution The Bivariate Normal Distribution Definition Suppose that U and
More informationModule 9: Stationary Processes
Module 9: Stationary Processes Lecture 1 Stationary Processes 1 Introduction A stationary process is a stochastic process whose joint probability distribution does not change when shifted in time or space.
More informationBootstrapping high dimensional vector: interplay between dependence and dimensionality
Bootstrapping high dimensional vector: interplay between dependence and dimensionality Xianyang Zhang Joint work with Guang Cheng University of Missouri-Columbia LDHD: Transition Workshop, 2014 Xianyang
More informationApplied Time Series Topics
Applied Time Series Topics Ivan Medovikov Brock University April 16, 2013 Ivan Medovikov, Brock University Applied Time Series Topics 1/34 Overview 1. Non-stationary data and consequences 2. Trends and
More informationUnivariate Time Series Analysis; ARIMA Models
Econometrics 2 Fall 24 Univariate Time Series Analysis; ARIMA Models Heino Bohn Nielsen of4 Outline of the Lecture () Introduction to univariate time series analysis. (2) Stationarity. (3) Characterizing
More informationModeling and testing long memory in random fields
Modeling and testing long memory in random fields Frédéric Lavancier lavancier@math.univ-lille1.fr Université Lille 1 LS-CREST Paris 24 janvier 6 1 Introduction Long memory random fields Motivations Previous
More informationSTAT 443 Final Exam Review. 1 Basic Definitions. 2 Statistical Tests. L A TEXer: W. Kong
STAT 443 Final Exam Review L A TEXer: W Kong 1 Basic Definitions Definition 11 The time series {X t } with E[X 2 t ] < is said to be weakly stationary if: 1 µ X (t) = E[X t ] is independent of t 2 γ X
More informationwhite noise Time moving average
1.3 Time Series Statistical Models 13 white noise w 3 1 0 1 0 100 00 300 400 500 Time moving average v 1.5 0.5 0.5 1.5 0 100 00 300 400 500 Fig. 1.8. Gaussian white noise series (top) and three-point moving
More informationWaseda International Symposium on Stable Process, Semimartingale, Finance & Pension Mathematics
! Waseda International Symposium on Stable Process, Semimartingale, Finance & Pension Mathematics Organizers: Masanobu Taniguchi (Waseda Univ.), Dou Xiaoling (ISM) and Kenta Hamada (Waseda Univ.) Waseda
More informationTime Series Analysis
Time Series Analysis hm@imm.dtu.dk Informatics and Mathematical Modelling Technical University of Denmark DK-2800 Kgs. Lyngby 1 Outline of the lecture Chapter 9 Multivariate time series 2 Transfer function
More informationLesson 4: Stationary stochastic processes
Dipartimento di Ingegneria e Scienze dell Informazione e Matematica Università dell Aquila, umberto.triacca@univaq.it Stationary stochastic processes Stationarity is a rather intuitive concept, it means
More informationWinter 2019 Math 106 Topics in Applied Mathematics. Lecture 9: Markov Chain Monte Carlo
Winter 2019 Math 106 Topics in Applied Mathematics Data-driven Uncertainty Quantification Yoonsang Lee (yoonsang.lee@dartmouth.edu) Lecture 9: Markov Chain Monte Carlo 9.1 Markov Chain A Markov Chain Monte
More informationReview Session: Econometrics - CLEFIN (20192)
Review Session: Econometrics - CLEFIN (20192) Part II: Univariate time series analysis Daniele Bianchi March 20, 2013 Fundamentals Stationarity A time series is a sequence of random variables x t, t =
More informationLecture 2: ARMA(p,q) models (part 2)
Lecture 2: ARMA(p,q) models (part 2) Florian Pelgrin University of Lausanne, École des HEC Department of mathematics (IMEA-Nice) Sept. 2011 - Jan. 2012 Florian Pelgrin (HEC) Univariate time series Sept.
More informationTime Series Examples Sheet
Lent Term 2001 Richard Weber Time Series Examples Sheet This is the examples sheet for the M. Phil. course in Time Series. A copy can be found at: http://www.statslab.cam.ac.uk/~rrw1/timeseries/ Throughout,
More informationEconometría 2: Análisis de series de Tiempo
Econometría 2: Análisis de series de Tiempo Karoll GOMEZ kgomezp@unal.edu.co http://karollgomez.wordpress.com Segundo semestre 2016 II. Basic definitions A time series is a set of observations X t, each
More informationE 4101/5101 Lecture 6: Spectral analysis
E 4101/5101 Lecture 6: Spectral analysis Ragnar Nymoen 3 March 2011 References to this lecture Hamilton Ch 6 Lecture note (on web page) For stationary variables/processes there is a close correspondence
More informationThe Distance Standard Deviation
The Distance Standard Deviation Dominic Edelmann, Donald Richards, and Daniel Vogel arxiv:1705.05777v1 [math.st] 16 May 2017 May 17, 2017 Abstract The distance standard deviation, which arises in distance
More informationEconometric Forecasting
Robert M. Kunst robert.kunst@univie.ac.at University of Vienna and Institute for Advanced Studies Vienna October 1, 2014 Outline Introduction Model-free extrapolation Univariate time-series models Trend
More informationPoisson Cluster process as a model for teletraffic arrivals and its extremes
Poisson Cluster process as a model for teletraffic arrivals and its extremes Barbara González-Arévalo, University of Louisiana Thomas Mikosch, University of Copenhagen Gennady Samorodnitsky, Cornell University
More information
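The sample distance covariance defined above need not be computed by numerical integration: for the specific weight measure µ proposed by Székely, Rizzo, and Bakirov, the characteristic-function integral reduces to an average of products of double-centered pairwise-distance matrices. A minimal sketch of that equivalent formula (function names are mine, not from the slides):

```python
import numpy as np

def dist_cov(X, Y):
    """Sample distance covariance of paired observations.

    Uses the double-centering formula of Szekely, Rizzo, and Bakirov,
    which equals the characteristic-function integral for their
    specific choice of the weight measure mu.
    X, Y: arrays of shape (n, p) and (n, q) (1-d inputs are reshaped).
    """
    X = np.asarray(X, dtype=float).reshape(len(X), -1)
    Y = np.asarray(Y, dtype=float).reshape(len(Y), -1)

    def double_centered(Z):
        # Pairwise Euclidean distance matrix, then subtract row means,
        # column means, and add back the grand mean.
        D = np.linalg.norm(Z[:, None, :] - Z[None, :, :], axis=2)
        return D - D.mean(axis=0) - D.mean(axis=1)[:, None] + D.mean()

    A, B = double_centered(X), double_centered(Y)
    # The V-statistic (1/n^2) * sum(A * B) is nonnegative; the guard
    # protects against tiny negative values from floating-point error.
    return np.sqrt(max((A * B).mean(), 0.0))

def dist_corr(X, Y):
    """Sample distance correlation R(X, Y) in [0, 1]."""
    num = dist_cov(X, Y)
    den = np.sqrt(dist_cov(X, X) * dist_cov(Y, Y))
    return num / den if den > 0 else 0.0
```

For the auto-distance correlation function (ADCF) used in the Amazon and Kilkenny examples, one would call `dist_corr` on lagged copies of the same series, e.g. `dist_corr(x[:-h], x[h:])` for lag h. Distance correlation equals zero exactly when the arguments are independent, unlike ordinary autocorrelation.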