Modeling and testing long memory in random fields


Frédéric Lavancier (lavancier@math.univ-lille1.fr), Université Lille 1 / LS-CREST. Paris, 24 January 2006.

1. Introduction
- Long memory random fields
- Motivations
- Previous studies: isotropic long memory

Random fields on the lattice Z^d. A discrete random field is a random process indexed by the lattice Z^d. (Figure: a realisation in dimension 2; the value at each point of the lattice is represented by a colour, on a scale from red through green to blue.) Stationary random field: the colours' generating law is invariant under translations.

A particular stationary random field: the white noise. (Figure: a simulated realisation.)

A "weakly dependent" (short memory) random field. (Figure: a simulated realisation.)

A "strongly dependent" (long memory) random field. (Figure: a simulated realisation.)

Quantification of the dependence of (X_n)_{n∈Z^d}. The dependence between two points of X at lag h ∈ Z^d can be measured by the covariance function

r(h) = cov(X_k, X_{k+h}) = cov(X_0, X_h).

An alternative tool to study dependence is the spectral density f, defined on [−π, π]^d by

r(h) = ∫_{[−π,π]^d} e^{i⟨λ,h⟩} f(λ) dλ.

Remark: the two points of view are closely related but not equivalent.
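
To make these two tools concrete, here is a minimal numerical sketch (not part of the talk; the NumPy helpers and the biased 1/(n1·n2) normalisation are my own illustrative choices) computing the empirical covariance r̂(h) and the periodogram, the empirical counterpart of f, for a field stored as a 2-D array:

```python
import numpy as np

def empirical_cov(X, h1, h2):
    """Empirical covariance r^(h) of a 2-D field at a signed lag h = (h1, h2)."""
    Xc = X - X.mean()
    n1, n2 = Xc.shape
    i0, i1 = max(0, -h1), n1 - max(0, h1)
    j0, j1 = max(0, -h2), n2 - max(0, h2)
    # average of X_k * X_{k+h} over all positions where both points lie in the grid
    return (Xc[i0:i1, j0:j1] * Xc[i0 + h1:i1 + h1, j0 + h2:j1 + h2]).sum() / (n1 * n2)

def periodogram(X):
    """Periodogram |sum_k X_k e^{-i<k,lambda>}|^2 / ((2 pi)^2 n1 n2),
    the empirical counterpart of the spectral density f."""
    n1, n2 = X.shape
    Xc = X - X.mean()
    return np.abs(np.fft.fft2(Xc)) ** 2 / ((2 * np.pi) ** 2 * n1 * n2)

# On a white-noise field, r^(h) is close to 0 for h != 0 and the periodogram is flat.
rng = np.random.default_rng(0)
X = rng.standard_normal((200, 200))
print(empirical_cov(X, 0, 0), empirical_cov(X, 3, -2))
```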

Long memory. Definition: a random field X exhibits long memory when its covariance function is not summable, i.e.

∑_{h∈Z^d} |r(h)| = ∞.

Proposition: if the spectral density f of X is unbounded, then X is long-range dependent.

Examples in dimension d = 2:
- f(x, y) = |x|^{−α} |y|^{−β}, 0 < α < 1, 0 < β < 1;
- f(x, y) = (x² + y²)^{−α}, 0 < α < 1;
- f(x, y) = |x − y|^{−α}, 0 < α < 1.

Example: a short memory random field. (Figures: a simulated realisation, its covariance function, and its periodogram.)

Example: an isotropic long memory random field. (Figures: a simulated realisation, its covariance function, and its periodogram.)

Example: a product-type long memory random field. (Figures: its covariance function and its periodogram.)

Example: a non-isotropic long memory random field. (Figures: its covariance function and its periodogram.)

Motivations
- Are there natural mathematical models leading to long memory random fields?
- Given a realisation of a long memory random field:
  - can we test the strong dependence property?
  - can we estimate this long memory (direction, intensity, ...)?
- Can we use the usual statistical methods (in regression, in forecasting, ...)?
- What is the asymptotic behaviour of the main tools (partial sums, empirical process, ...)?

Previous studies
- In image analysis: some texture models (Kashyap and Lapsa, 1984; Bennett and Khotanzad, 1998; Eom, 2001).
- Asymptotic results under isotropic long memory:
  - partial sums (Dobrushin and Major, 1979, and Surgailis, 1982);
  - empirical process (Doukhan, Lang and Surgailis, 2002);
  - quadratic forms (Heyde and Gay, 1993, and Doukhan, Leon and Soulier, 1996);
  - local times (Doukhan and Leon, 1996).

Isotropic long memory. Definition: a stationary random field exhibits isotropic long memory if one of the two following conditions is fulfilled:
- its covariance function behaves as r(n) = |n|^{−α} L(|n|) b(n/|n|), 0 < α < d;
- its spectral density is continuous everywhere except at zero, where f(x) ∼ |x|^{α−d} L(1/|x|) b(x/|x|), 0 < α < d;
where L is a slowly varying function at infinity and b is a continuous function on the unit sphere of R^d.

2. Modeling
- Filtering
- Aggregation
- Long memory in statistical mechanics

Filtering: autoregressive fields. Let (ε_n)_{n∈Z^d} be a white noise. The equation

P(L_1, ..., L_d) X_{n_1,...,n_d} = ε_{n_1,...,n_d}

admits a unique stationary solution iff

∫_{[−π,π]^d} |P(e^{iλ_1}, ..., e^{iλ_d})|^{−2} dλ < ∞.

If d = 1, this integral is finite iff P(e^{iλ}) ≠ 0 for all λ. If d ≥ 2, P(e^{iλ_1}, ..., e^{iλ_d}) may vanish at some λ while ∫_{[−π,π]^d} |P(e^{iλ_1}, ..., e^{iλ_d})|^{−2} dλ < ∞ still holds.

Example: in dimension 5,

X_{n_1,...,n_5} − (1/5)(X_{n_1−1,n_2,...,n_5} + ... + X_{n_1,...,n_5−1}) = ε_{n_1,...,n_5}

admits a stationary solution with spectral density

f_X(λ_1, ..., λ_5) ∝ |1 − (1/5)(e^{iλ_1} + ... + e^{iλ_5})|^{−2}.

Filtering: general case. Let (ε_n)_{n∈Z^d} be a white noise with the spectral representation

ε_n = ∫_{[−π,π]^d} e^{i⟨n,λ⟩} dW(λ),

where W is the random spectral measure of ε.

Property: the field constructed by filtering ε through a, defined by

X_n = ∫_{[−π,π]^d} e^{i⟨n,λ⟩} a(λ) dW(λ) = ∑_{j_1,...,j_d} â_{j_1,...,j_d} ε_{n_1−j_1,...,n_d−j_d},

where the â are the Fourier coefficients of a, has the spectral density f_X(λ) ∝ |a(λ)|². X is long-range dependent if the filter a is unbounded.
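
This property suggests a direct way to simulate such fields: multiply the discrete spectral measure of a white noise by the filter a. Below is a minimal sketch under my own conventions (the grid size, function names, and the truncation of the singular zero frequency are illustrative choices; it only approximates the target spectral density):

```python
import numpy as np

def filtered_field(n, a, rng=None):
    """Approximate X_n = int e^{i<n,lambda>} a(lambda) dW(lambda) on an n x n
    grid by filtering the FFT of a Gaussian white noise by a."""
    rng = rng or np.random.default_rng()
    lam = 2 * np.pi * np.fft.fftfreq(n)            # Fourier frequencies in [-pi, pi)
    lam1, lam2 = np.meshgrid(lam, lam, indexing="ij")
    W = np.fft.fft2(rng.standard_normal((n, n)))   # spectral measure of the noise
    return np.fft.ifft2(a(lam1, lam2) * W).real

# Product-type long memory filter a(l1, l2) = |l1 l2|^{-alpha}, 0 < alpha < 1/2;
# the non-integrable value at the zero frequency is simply truncated to 0 here.
alpha = 0.3

def a(l1, l2):
    m = np.abs(l1 * l2)
    out = np.zeros_like(m)
    nz = m > 0
    out[nz] = m[nz] ** (-alpha)
    return out

X = filtered_field(256, a)
```

Swapping in another unbounded filter, for instance a directional one of the form |λ_1 + θλ_2|^{−α}, reproduces the other examples discussed below.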

Some examples in dimension d = 2.

Example (non-isotropic long memory): a(λ_1, λ_2) = |λ_1 − λ_2|^{−α}, where 0 < α < 1/2.

Example (long memory depending on the directions):

X_{n_1,n_2} = (1 − (L_1 + L_2)/2)^{−α} ε_{n_1,n_2}, 0 < α < 1/2,

admits the spectral density

f(λ_1, λ_2) = (σ²/(4π²)) [4 sin²(λ_1/2) sin²(λ_2/2) + sin²((λ_1 + λ_2)/2)]^{−α}.

Aggregation. Construct N independent copies X^{(k)} of the autoregressive field X:

P(L_1, L_2) X_{n_1,n_2} = ε_{n_1,n_2},

where P is a polynomial function with random coefficients, independent of ε. The aggregated field is obtained thanks to the classical CLT:

Z_{n,m} = lim_{N→∞} (1/√N) ∑_{k=1}^{N} X^{(k)}_{n,m}.

The Gaussian field Z has the spectral density

f(λ) = (σ²/(2π)^d) E |P(e^{iλ_1}, e^{iλ_2})|^{−2}.

If one chooses properly the law generating the coefficients of P, then Z is a long memory Gaussian random field.
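
A rough simulation sketch of this construction (illustrative, not from the talk: each AR copy is built by causal recursive filtering with edge effects ignored, and the limit in N is replaced by a finite sum):

```python
import numpy as np

def ar_copy(n, b1, b2, rng):
    """One copy of (1 - b1 L1)(1 - b2 L2) X = eps, by recursive filtering
    along each axis (no burn-in, so edges are only approximate)."""
    X = rng.standard_normal((n, n))
    for i in range(1, n):
        X[i, :] += b1 * X[i - 1, :]      # invert (1 - b1 L1)
    for j in range(1, n):
        X[:, j] += b2 * X[:, j - 1]      # invert (1 - b2 L2)
    return X

def aggregated_field(n, N, draw_beta, rng=None):
    """Z ~ (1/sqrt(N)) * sum of N independent AR copies with random coefficients."""
    rng = rng or np.random.default_rng()
    Z = np.zeros((n, n))
    for _ in range(N):
        Z += ar_copy(n, draw_beta(rng), draw_beta(rng), rng)
    return Z / np.sqrt(N)

# Coefficients concentrating near 1, with density proportional to sqrt(1 - x)
# as in the product-type example on the next slide; by inversion of the CDF,
# beta = 1 - (1 - U)^{2/3} with U uniform has this law.
draw_beta = lambda rng: 1 - (1 - rng.uniform()) ** (2 / 3)
Z = aggregated_field(128, 500, draw_beta)
```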

Example (product-type long memory): P(L_1, L_2) = (1 − β_1 L_1)(1 − β_2 L_2), where β_1 and β_2 are generated by the same law with density proportional to √(1 − x). Then

f(λ_1, λ_2) ∼ 1/√(λ_1 λ_2) at (λ_1, λ_2) = (0, 0),

and r(h, l) is non-summable.

Example (long memory in one particular direction): P(L_1, L_2) = 1 − β L_1 L_2, where β is generated by a density proportional to √(1 − x). Then

f(λ_1, λ_2) ∼ 1/√(λ_1 + λ_2) at (λ_1, λ_2) = (0, 0),

r(h, l) = 0 if h ≠ l, and r(h, h) is non-summable.

Long memory in statistical mechanics. The Ising model: the spins take values in {−1, 1} and the interaction potential is

Φ_{i,j}(x_i, x_j) = −β x_i x_j if ∑_{k=1}^{d} |i_k − j_k| = 1, and 0 otherwise,

where β > 0 represents the inverse of the temperature. When d ≥ 2, there is a phase transition if β ≥ β_c, and at β = β_c,

r(h) ∼ |h|^{−(d−2+µ)} as |h| → ∞,

where µ ∈ [0, 2] is a parameter depending on d.

Long memory in statistical mechanics. The homogeneous systems: x_i ∈ R and

Φ_{i,j}(x_i, x_j) = (1/2) J(0) x_i² if i = j, and J(i − j) x_i x_j if i ≠ j,

where (J(i))_{i∈Z^d} is an even positive-definite sequence of ℓ¹(Z^d).

Theorem (Dobrushin, Künsch): if ∫_{[−π,π]^d} Ĵ^{−1}(λ) dλ < ∞, the pure phases are Gaussian with spectral measure Ĵ^{−1}. Moreover, there is a phase transition iff Ĵ admits a root.

For homogeneous systems: system in phase transition ⟺ long memory random field.

Example (the harmonic potential in dimension d ≥ 3):

J(n) = −1/(2d) if |n| = 1; 1 if n = 0; 0 otherwise.

One has

Ĵ(λ) ∼ (1/(2d)) ∑_{k=1}^{d} λ_k², when λ → 0.

Example (in dimension d = 2, a particular direction of long memory):

J(k, l) = ∑_{j≥1} a_j^{α} a_{j+k}^{α} if l = θk, k ≥ 1; 1 if k = l = 0; 0 otherwise,

where α ∈ (0, 1/2), θ ∈ R, and the a_j^{α} denote the coefficients of the fractional filter (1 − L)^{α}. One can prove that

Ĵ(λ_1, λ_2) ∼ |2 sin((λ_1 + θλ_2)/2)|^{2α}.

3. Testing long memory

Expected test. Null hypothesis: X exhibits short memory. Alternative hypothesis: X exhibits long memory. The test statistic is based on the empirical variance of the partial sums

S_j = ∑_{i_1=1}^{j_1} ... ∑_{i_d=1}^{j_d} (X_{i_1,...,i_d} − X̄_n).

Why? The asymptotic behaviour of the partial sums allows one to distinguish between weak dependence and strong dependence.

4. Asymptotic study of the partial sums
- Partial sums under weak dependence
- Previous results under isotropic long memory
- Spectral scheme
- Application to partial sums

Partial sums under weak dependence. Theorem (Wichura (1971), Dedecker (2001)): let X be a second-order stationary random field such that ∑_{j∈Z^d} |r(j)| < ∞. Under moment hypotheses on X, and assuming σ² = ∑_{j∈Z^d} r(j) ≠ 0,

(1/(σ n^{d/2})) ∑_{k_1=1}^{[nt_1]} ... ∑_{k_d=1}^{[nt_d]} X_{k_1,...,k_d} → B(t_1, ..., t_d) in D([0,1]^d),

where B is the Brownian sheet on [0,1]^d, i.e. the Gaussian field with covariance function γ(s, t) = ∏_{i=1}^{d} s_i ∧ t_i.

Results under isotropic long memory. Let (ξ_k)_{k∈Z^d} be a strong white noise and X_k = ∑_{j∈Z^d} a_j ξ_{k−j}, where

a_j = |j|^{−β} L(|j|) b(j/|j|), d/2 < β < d,

L is a slowly varying function at infinity and b a continuous function on the unit sphere of R^d. The field X exhibits isotropic long memory.

Theorem (Dobrushin and Major (1979), Surgailis (1982), Avram and Taqqu (1987)):

(1/(n^{d − m(β − d/2)} L^m(n))) ∑_{k_1=1}^{[nt_1]} ... ∑_{k_d=1}^{[nt_d]} P_m(X_{k_1,...,k_d}) → Z_m(t) (fidi),

where P_m is the Appell polynomial of degree m and Z_m is the Hermite process of order m.

Spectral scheme for linear fields. Let the linear field

X_{k_1,...,k_d} = ∑_{j_1,...,j_d} â_{j_1,...,j_d} ξ_{k_1−j_1,...,k_d−j_d} = ∫_{[−π,π]^d} e^{i⟨k,λ⟩} a(λ) dW(λ),

where â is the Fourier transform of the filter a and ξ is a noise with spectral representation ξ_k = ∫ e^{i⟨k,λ⟩} dW(λ). The partial sums of X,

S_n^X(t) = S_n^X(t_1, ..., t_d) = ∑_{k_1=0}^{[nt_1]−1} ... ∑_{k_d=0}^{[nt_d]−1} X_{k_1,...,k_d},

can be rewritten

S_n^X(t) = n^{d/2} ∫_{[−nπ,nπ]^d} a(λ/n) ∏_{j=1}^{d} ((e^{iλ_j[nt_j]/n} − 1)/(n(e^{iλ_j/n} − 1))) dW_n(λ),

where W_n(A) = n^{d/2} W(n^{−1}A).

Theorem (Lang and Soulier, when d = 1; Lavancier, when d ≥ 1): let W_n(A) = n^{d/2} W(n^{−1}A), where W is the random spectral measure of an i.i.d. white noise. If Φ_n → Φ in L²(R^d), then

∫ Φ_n dW_n →^L ∫ Φ dW,

where W is the Gaussian white noise spectral field.

Remark: the result remains true when W comes from a non-i.i.d. white noise, provided its spectral density is bounded above and it satisfies a functional CLT.

Application to partial sums when d ≥ 2. Recall:

X_k = ∑_j â_j ξ_{k−j} = ∫_{[−π,π]^d} e^{i⟨k,λ⟩} a(λ) dW(λ), therefore f_X(λ) ∝ |a(λ)|².

Proposition (Lavancier, CLT): if a is continuous at 0 and a(0) ≠ 0,

n^{−d/2} ∑_{k_1=0}^{[nt_1]−1} ... ∑_{k_d=0}^{[nt_d]−1} X_{k_1,...,k_d} → a(0) B(t) (fidi),

where B is the Brownian sheet.

Remark: the result covers
- weakly dependent random fields (when a is continuous everywhere);
- long memory random fields involving non-zero spectral singularities (e.g. a(λ_1, λ_2) = |λ_1 − λ_2 − 1|^{−α}, 0 < α < 1/2).

Application to partial sums when d ≥ 2. Proposition (Lavancier, non-CLT): if a(λ) ∼ ã(λ) at 0, with ã(cλ) = c^{−α} ã(λ) (0 < α < 1), then

n^{−(d/2+α)} ∑_{k_1=0}^{[nt_1]−1} ... ∑_{k_d=0}^{[nt_d]−1} X_{k_1,...,k_d} → ∫ ã(λ) ∏_{j=1}^{d} ((e^{it_jλ_j} − 1)/(iλ_j)) dW(λ) (fidi).

Remark: this result covers long memory random fields:
- isotropic l.m., e.g. ã(λ_1, λ_2) = (λ_1² + λ_2²)^{−α/2}, 0 < α < 1;
- non-isotropic l.m., e.g. ã(λ_1, λ_2) = |λ_1 + θλ_2|^{−α}, 0 < α < 1/2, θ ∈ R.

Recall: X_n = ∫_{[−π,π]^d} e^{i⟨n,λ⟩} a(λ) dW(λ).

Remark (convergence to the fractional Brownian sheet): if a(λ) = ∏_{j=1}^{d} a_j(λ_j), where, for all j, a_j(λ_j) ∼ |λ_j|^{−α_j} with 0 < α_j < 1/2, then

n^{−(d/2 + ∑_j α_j)} ∑_{k_1=0}^{[nt_1]} ... ∑_{k_d=0}^{[nt_d]} X_{k_1,...,k_d} → ∫_{R^d} ∏_{j=1}^{d} ((e^{it_jλ_j} − 1)/(iλ_j)) |λ_j|^{−α_j} dW(λ) (fidi).

5. Testing procedure and simulations
- Test hypothesis and test statistic
- Consistency
- Simulations in dimension d = 2

Test hypothesis. Null hypothesis: weak dependence, i.e.

∑_{j∈Z^d} |r(j)| < ∞ and (1/(σ n^{d/2})) ∑_{k_1=1}^{[nt_1]} ... ∑_{k_d=1}^{[nt_d]} X_k → B(t) in D([0,1]^d),

where σ² := ∑_{j∈Z^d} r(j) and B is the Brownian sheet on [0,1]^d, together with moment hypotheses.

Alternative hypothesis: long memory, i.e.

(1/(n^γ L(n))) ∑_{k_1=1}^{[nt_1]} ... ∑_{k_d=1}^{[nt_d]} X_k → Y(t) in D([0,1]^d),

where γ > d/2, L is a slowly varying function at infinity and Y is a measurable random field.

Test statistic. Let q_n be an integer in [1, n]. An estimator of σ² := ∑_{j∈Z^d} r(j) is

ŝ²_{q_n,n} = ∑_{j∈{−q_n,...,q_n}^d} ω_{q_n,j} r̂(j), where ω_{q_n,j} = ∏_{i=1}^{d} (1 − |j_i|/q_n)

and r̂ is the empirical covariance function. Let S_j = ∑_{i_1=1}^{j_1} ... ∑_{i_d=1}^{j_d} (X_{i_1,...,i_d} − X̄_n).

Definition (extension of the V/S statistic to d > 1): let A_n = {1, ..., n}^d; the V/S statistic is defined by

M_n = (n^{−d}/ŝ²_{q_n,n}) Var(S_j, j ∈ A_n),

hence

M_n = (1/(n^{2d} ŝ²_{q_n,n})) [ ∑_{j∈A_n} S_j² − n^{−d} ( ∑_{j∈A_n} S_j )² ].
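
For d = 2 this statistic is straightforward to compute; the sketch below is my own rendering (helper names are mine, and the empirical covariance uses the biased 1/n² normalisation):

```python
import numpy as np

def emp_cov(Xc, h1, h2):
    """Empirical covariance r^(h) of the centred field Xc at a signed lag."""
    n1, n2 = Xc.shape
    i0, i1 = max(0, -h1), n1 - max(0, h1)
    j0, j1 = max(0, -h2), n2 - max(0, h2)
    return (Xc[i0:i1, j0:j1] * Xc[i0 + h1:i1 + h1, j0 + h2:j1 + h2]).sum() / (n1 * n2)

def vs_statistic(X, q):
    """V/S statistic M_n for an n x n field X with bandwidth q = q_n."""
    n = X.shape[0]
    Xc = X - X.mean()
    # partial sums S_j over the rectangles [1, j1] x [1, j2]
    S = Xc.cumsum(axis=0).cumsum(axis=1)
    # Bartlett-type estimator of sigma^2 with weights prod_i (1 - |j_i|/q)
    s2 = sum((1 - abs(h1) / q) * (1 - abs(h2) / q) * emp_cov(Xc, h1, h2)
             for h1 in range(-q, q + 1) for h2 in range(-q, q + 1))
    # M_n = n^{-d} Var(S_j, j in A_n) / s^2 with d = 2
    return S.var() / (n**2 * s2)
```

On a long memory field such as those simulated earlier, vs_statistic(X, 8) should typically land well above the 95% critical value reported on the next slide, while on white noise it should stay below it.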

Consistency. Proposition (Lavancier, under the null hypothesis): under the null hypothesis, choose q_n such that lim_n q_n = ∞ and lim_n q_n/n = 0; then

M_n →^L ∫_{[0,1]^d} (B(t) − ∏_{i=1}^{d} t_i B(1))² dt − [ ∫_{[0,1]^d} (B(t) − ∏_{i=1}^{d} t_i B(1)) dt ]²,

where B is the Brownian sheet on [0,1]^d.

Proposition (Lavancier, under the alternative hypothesis): under the alternative hypothesis, choose q_n such that lim_n q_n = ∞ and, for all δ > 0, lim_n q_n/n^δ = 0; then M_n →^P ∞.

Simulation of the limit law under the null hypothesis, when d = 2: mean = 0.0897, variance = 0.0018, c_{90%} = 0.1448, c_{95%} = 0.1692. (Figure: simulated density of the limit law.)
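
These quantities can be reproduced by Monte Carlo, approximating the Brownian sheet by normalised partial sums of a Gaussian white noise. The sketch below is illustrative (grid size and replication count are arbitrary choices of mine), and its output should match the values above only up to discretisation and sampling error:

```python
import numpy as np

def limit_draw(n, rng):
    """One draw of the limiting functional of M_n for d = 2."""
    # Brownian sheet on the grid: B(i/n, j/n) ~ cumulated N(0, 1/n^2) increments
    B = (rng.standard_normal((n, n)) / n).cumsum(axis=0).cumsum(axis=1)
    t1 = np.arange(1, n + 1)[:, None] / n
    t2 = np.arange(1, n + 1)[None, :] / n
    D = B - t1 * t2 * B[-1, -1]           # B(t) - t1 t2 B(1)
    return (D**2).mean() - D.mean()**2    # the two integrals as grid averages

rng = np.random.default_rng(0)
draws = np.array([limit_draw(100, rng) for _ in range(5000)])
print(draws.mean(), draws.var(), np.quantile(draws, [0.90, 0.95]))
```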

Simulations based on random fields, with q = 8 (p̂: empirical frequency of rejection of the null hypothesis; figures show a simulated realisation of each field).

f(λ_1, λ_2) ∝ |λ_1|^{−γ} |λ_2|^{−γ}, 0 < γ < 1:

  γ:   0.75    0.5     0.25
  p̂:   81%    75.4%   62.4%

f(λ_1, λ_2) ∝ (λ_1² + λ_2²)^{−γ}, 0 < γ < 1:

  γ:   0.96    0.7     0.4
  p̂:   75%    58.1%   32.7%

f(λ_1, λ_2) ∝ |λ_1 − λ_2|^{−γ}, 0 < γ < 1:

  γ:   0.75    0.5     0.25
  p̂:   27%    27%     24.5%