Beyond the color of the noise: what is memory in random phenomena?

Gennady Samorodnitsky, Cornell University. September 19, 2014

Randomness means lack of pattern or predictability in events, according to Wikipedia. [Two plots: sample paths of processes x and y against t = 0, ..., 1000.] However: certain different patterns are present in the two plots.

The two plots show two stationary stochastic processes with the same marginals. The second one has memory, while the first one does not. Traditionally, in probability the notion of memory applies to stationary stochastic processes $(X_n,\ n = 0, 1, 2, \ldots)$: for every $h = 1, 2, \ldots$,
$$(X_n,\ n = 0, 1, 2, \ldots) \stackrel{d}{=} (X_{n+h},\ n = 0, 1, 2, \ldots).$$

The memory in a stochastic process: how observations far away in time affect each other. How does one measure memory? It is obvious: use correlations! Let
$$\rho_n = \operatorname{Corr}(X_k, X_{k+n}), \qquad n = 0, 1, 2, \ldots.$$
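
In practice $\rho_n$ has to be estimated from a single realization. Below is a minimal numpy sketch of the standard sample-autocorrelation estimator; the function name and the AR(1) test case are my illustration, not part of the talk.

```python
import numpy as np

def sample_autocorr(x, max_lag):
    """Estimate rho_n = Corr(X_k, X_{k+n}) for n = 0, ..., max_lag."""
    x = np.asarray(x, dtype=float) - np.mean(x)
    denom = np.dot(x, x)  # n times the sample variance
    return np.array([np.dot(x[:len(x) - lag], x[lag:]) / denom
                     for lag in range(max_lag + 1)])

# Sanity check on an AR(1) series, whose true autocorrelation is rho_n = phi^n:
rng = np.random.default_rng(0)
phi, n = 0.7, 100_000
x = np.zeros(n)
for t in range(1, n):
    x[t] = phi * x[t - 1] + rng.standard_normal()
print(sample_autocorr(x, 5))  # should be close to 0.7 ** np.arange(6)
```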

Four different correlation functions. [Four plots for $n = 0, \ldots, 20$: $\rho_n = 2^{-n}$; $\rho_n = 1.5^{-n}$; MA(5); AR($\infty$) with $a_j = j^{-0.75}$.] What do we see in these plots?

Covariances and correlations of a second-order stationary process can be expressed through the spectral measure of the process:
$$\rho_n = \frac{1}{\operatorname{Var} X_0} \int_{(-\pi, \pi]} e^{inx}\, F(dx), \qquad n = 0, 1, 2, \ldots;$$
$F$ is a finite symmetric measure on $(-\pi, \pi]$. If $F$ has a density with respect to the Lebesgue measure on $(-\pi, \pi]$, the density $f$ is called the power spectral density of the process.
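
The classical nonparametric estimate of the power spectral density is the raw periodogram. A small sketch under the usual conventions (normalization constants vary between texts; this one uses $1/(2\pi n)$):

```python
import numpy as np

def periodogram(x):
    """Raw periodogram I(lambda_j) = |sum_t x_t e^{-i t lambda_j}|^2 / (2 pi n)
    at the Fourier frequencies lambda_j = 2 pi j / n, j = 1, ..., n // 2."""
    x = np.asarray(x, dtype=float) - np.mean(x)
    n = len(x)
    dft = np.fft.rfft(x)
    freqs = 2 * np.pi * np.arange(len(dft)) / n
    return freqs[1:], np.abs(dft[1:]) ** 2 / (2 * np.pi * n)
```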

One can view the process as the sum of waves of different frequencies with random and uncorrelated weights:
$$X_n = \int_{(-\pi, \pi]} e^{inx}\, M(dx), \qquad n = 0, 1, 2, \ldots,$$
where $M$ is a random measure governed by the spectral measure (density). Such a process is also called a noise.

If the spectral density is constant, the noise is white. If some frequencies carry a larger weight than others, the noise is colored. The common colors of noise: pink, brown, blue, violet, grey. The different colors describe which frequencies are preferred and by how much.
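
One common way to manufacture noise of a given color is to start from white noise and reweight its frequencies in the Fourier domain. A hedged numpy sketch (the function and the exponent convention are illustrative, not from the talk):

```python
import numpy as np

def colored_noise(n, exponent, rng):
    """Gaussian noise with spectral density proportional to |frequency|^exponent:
    exponent = 0 is white, -1 pink, -2 brown, +1 blue, +2 violet."""
    spectrum = np.fft.rfft(rng.standard_normal(n))
    freqs = np.fft.rfftfreq(n)
    freqs[0] = freqs[1]                    # avoid 0^negative at frequency zero
    spectrum *= freqs ** (exponent / 2.0)  # amplitude = sqrt of target density
    return np.fft.irfft(spectrum, n)

pink = colored_noise(4096, -1.0, np.random.default_rng(1))
```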

Long range dependence (long memory)

Long memory in a stochastic process: when the common wisdom goes wrong, for example:
- the square root of the sample size rule is no longer valid;
- objects that used to be approximately normal are no longer so.

In the second-order language: either slowly decaying correlations,
$$\rho_n \sim n^{-d} L_n, \quad n \to \infty, \quad 0 < d < 1, \quad L_n \text{ slowly varying},$$
or a spectral density with a pole at zero,
$$f(x) \sim x^{d-1} L(1/x), \quad x \to 0.$$
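
The canonical example of slowly decaying correlations is fractional Gaussian noise, the increment process of the fractional Brownian motion appearing later in the talk: for Hurst exponent $1/2 < H < 1$ its correlations decay like $n^{2H-2}$, i.e. $d = 2 - 2H$. A quick numerical check (my own illustration):

```python
import numpy as np

def fgn_autocorr(lags, H):
    """Autocorrelation of fractional Gaussian noise:
    rho_n = ((n+1)^{2H} - 2 n^{2H} + |n-1|^{2H}) / 2."""
    n = np.asarray(lags, dtype=float)
    return 0.5 * ((n + 1) ** (2 * H) - 2 * n ** (2 * H) + np.abs(n - 1) ** (2 * H))

H = 0.9
lags = np.array([10, 100, 1000, 10000])
print(fgn_autocorr(lags, H))                  # slow, hyperbolic decay ...
print(H * (2 * H - 1) * lags ** (2 * H - 2))  # ... matching H(2H-1) n^{2H-2}
```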

Four different spectral densities. [Four plots on $(-\pi, \pi)$: white noise; colored noise, short memory; colored noise, long memory; colored noise, negative memory.] How much information is in the color of the noise?

Very different processes, the same color. [Two sample paths of length 1000: an ARCH(1) series and an i.i.d. series.] Both processes are white noises, with equally heavy tails.
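
The ARCH(1) case is easy to reproduce: its values are uncorrelated (white) yet strongly dependent, and the dependence surfaces in the squares. A small simulation sketch (the parameter values are mine; $a_1$ is kept below $1/\sqrt{3}$ so that fourth moments, and hence the correlation of the squares, exist):

```python
import numpy as np

rng = np.random.default_rng(2)
n, a0, a1 = 200_000, 1.0, 0.5
x = np.zeros(n)
for t in range(1, n):  # X_t = sigma_t Z_t with sigma_t^2 = a0 + a1 X_{t-1}^2
    x[t] = np.sqrt(a0 + a1 * x[t - 1] ** 2) * rng.standard_normal()

def lag1_corr(y):
    y = y - y.mean()
    return np.dot(y[:-1], y[1:]) / np.dot(y, y)

print(lag1_corr(x))       # approximately 0: the ARCH(1) series is white
print(lag1_corr(x ** 2))  # clearly positive: the series is far from independent
```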

Conclusions:
- the second-order characteristics do not provide enough information about the length of the memory;
- long range dependence is far from being determined by correlations;
- with sufficiently heavy tails correlations do not even exist;
- long memory as a phenomenon is a phase transition.

Examples of long memory as a phase transition

1. Suppose that $(X_n,\ n = 0, 1, 2, \ldots)$ is a stationary stochastic process with a finite variance. Let
$$S_n = \sum_{i=1}^{n} X_i, \qquad n = 1, 2, \ldots.$$
We may say that the process has short memory if, as $n \to \infty$,
$$\frac{S_n - n\,EX_0}{n^{1/2}} \Rightarrow N(0, \sigma^2), \qquad 0 < \sigma^2 < \infty.$$

We may require that
$$\left( \frac{S_{[nt]} - [nt]\,EX_0}{n^{1/2}},\ t \ge 0 \right) \Rightarrow \big( \sigma B(t),\ t \ge 0 \big)$$
in $D[0, \infty)$ in the Skorohod topology. The limit $(B(t),\ t \ge 0)$ is the Brownian motion. It is a Gaussian process, has stationary increments, and is self-similar with $H = 1/2$.

We may say that the process has long memory if, as $n \to \infty$, the normalization $n^{1/2}$ is wrong: one needs a different sequence $a_n$ for which $(S_n - n\,EX_0)/a_n$ has a finite nonzero limit. When $a_n \gg n^{1/2}$, the memory is long and positive; when $a_n \ll n^{1/2}$, the memory is long and negative (also called medium memory).
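
One can watch the square-root rule fail in simulation. The sketch below generates exact fractional Gaussian noise with $H = 0.8$ (via a Cholesky factorization of its covariance matrix, fine at this small size) and shows the standard deviation of $S_n$ growing like $n^{0.8}$ rather than $n^{1/2}$; the construction is my illustration, not from the talk:

```python
import numpy as np

def fgn_paths(n, H, n_paths, rng):
    """Exact fractional Gaussian noise via Cholesky (O(n^3); fine for small n)."""
    k = np.abs(np.subtract.outer(np.arange(n), np.arange(n))).astype(float)
    cov = 0.5 * ((k + 1) ** (2 * H) - 2 * k ** (2 * H) + np.abs(k - 1) ** (2 * H))
    return rng.standard_normal((n_paths, n)) @ np.linalg.cholesky(cov).T

rng = np.random.default_rng(3)
paths = fgn_paths(512, 0.8, 2000, rng)
for n in (64, 128, 256, 512):
    print(n, paths[:, :n].sum(axis=1).std())  # doubles by ~2^0.8, not ~2^0.5
```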

Under long memory the limit $Y$ in
$$\left( \frac{S_{[nt]} - [nt]\,EX_0}{a_n},\ t \ge 0 \right) \Rightarrow \big( Y(t),\ t \ge 0 \big)$$
is a self-similar process with stationary increments, but not a Brownian motion. It could be: a fractional Brownian motion (a Gaussian process); a process in a higher-order Wiener chaos:
$$Y(t) = \int_{\mathbb{R}} \cdots \int_{\mathbb{R}} Q_t(x_1, \ldots, x_k)\, dW(x_1) \cdots dW(x_k), \qquad t \ge 0.$$

2. Suppose that $X = (X_n,\ n = 0, 1, 2, \ldots)$ is a stationary stochastic process with infinite variance. For simplicity, assume the process to be symmetric: $X \stackrel{d}{=} -X$. Assume that the observations have regularly varying tails:
$$P(|X_n| > x) = x^{-\alpha} L(x), \qquad x > 0, \quad 0 < \alpha < 2,$$
where $L$ is a slowly varying at infinity function.

If the memory is short, then
$$\left( \frac{S_{[nt]}}{a_n},\ t \ge 0 \right) \Rightarrow \big( L_\alpha(t),\ t \ge 0 \big);$$
here $a_n = n^{1/\alpha} L_1(n)$, $L_1$ is slowly varying; $L_\alpha$ is a symmetric $\alpha$-stable Lévy motion; $L_\alpha$ is self-similar with $H = 1/\alpha$.
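
Sample paths of $L_\alpha$ are easy to simulate with the Chambers-Mallows-Stuck transformation for symmetric $\alpha$-stable variates; the random-walk approximation below uses the $H = 1/\alpha$ self-similar scaling (a sketch, with my choice of parameters):

```python
import numpy as np

def sas_increments(alpha, size, rng):
    """Standard symmetric alpha-stable variates (char. function e^{-|u|^alpha})
    via the Chambers-Mallows-Stuck transformation."""
    V = rng.uniform(-np.pi / 2, np.pi / 2, size)
    E = rng.exponential(1.0, size)
    return (np.sin(alpha * V) / np.cos(V) ** (1 / alpha)
            * (np.cos((1 - alpha) * V) / E) ** ((1 - alpha) / alpha))

# Random-walk approximation of a SaS Levy motion on [0, 1]:
rng = np.random.default_rng(4)
alpha, n = 1.5, 10_000
path = np.cumsum(sas_increments(alpha, n, rng)) / n ** (1 / alpha)
```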

If the memory is long, one starts getting, as limits, processes such as linear fractional stable motions:
$$Y(t) = \int_{\mathbb{R}} \left( (t - x)_+^{H - 1/\alpha} - (-x)_+^{H - 1/\alpha} \right) L_\alpha(dx), \qquad t \ge 0.$$
The range of $H$: $0 < H < 1$; the process $Y$ is $H$-self-similar, and has stationary increments. The memory can be even longer.
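
For intuition, the integral defining a linear fractional stable motion can be approximated by a Riemann sum driven by discretized S$\alpha$S increments. A crude sketch, reusing sas_increments from the previous block (the truncation point, mesh, and parameters are my choices; with $H > 1/\alpha$ the kernel stays bounded):

```python
import numpy as np

def lfsm(ts, H, alpha, rng, x_min=-50.0, dx=0.01):
    """Riemann-sum approximation of
    Y(t) = int ((t-x)_+^{H-1/alpha} - (-x)_+^{H-1/alpha}) L_alpha(dx),
    truncating the integration domain at x_min."""
    x = np.arange(x_min, max(ts), dx)
    dL = dx ** (1 / alpha) * sas_increments(alpha, len(x), rng)  # ~ L_alpha(dx)
    expo = H - 1 / alpha

    def plus_pow(u):  # u_+^{H - 1/alpha}, zero where u <= 0
        out = np.zeros_like(u)
        out[u > 0] = u[u > 0] ** expo
        return out

    return np.array([np.sum((plus_pow(t - x) - plus_pow(-x)) * dL) for t in ts])

y = lfsm(np.linspace(0.1, 1.0, 10), H=0.8, alpha=1.5,
         rng=np.random.default_rng(5))
```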

The stationary process X

We consider infinitely divisible processes of the form
$$X_n = \int_E f_n(x)\, dM(x), \qquad n = 1, 2, \ldots.$$
$M$ is a homogeneous symmetric infinitely divisible random measure on a measurable space $(E, \mathcal{E})$. $M$ has an infinite, $\sigma$-finite control measure $\mu$ and local Lévy measure $\rho$: for every $A \in \mathcal{E}$ with $\mu(A) < \infty$ and $u \in \mathbb{R}$,
$$E e^{iuM(A)} = \exp\left\{ -\mu(A) \int_{\mathbb{R}} \big( 1 - \cos(ux) \big)\, \rho(dx) \right\}.$$

The functions $f_n$, $n = 1, 2, \ldots$ are deterministic functions of the form
$$f_n(x) = f \circ T^n(x) = f(T^n x), \qquad x \in E,\ n = 1, 2, \ldots:$$
$f : E \to \mathbb{R}$ is a measurable function, satisfying certain integrability assumptions; $T : E \to E$ is a measurable map preserving the measure $\mu$.

We assume that the local Lévy measure $\rho$ has a regularly varying tail with index $-\alpha$, $0 < \alpha < 2$: $\rho(\,\cdot\,, \infty) \in RV_{-\alpha}$ at infinity. With a proper integrability assumption on the function $f$, the process $X$ has regularly varying finite-dimensional distributions, with the same tail exponent $\alpha$.

The key assumption: the map $T$ is conservative and pointwise dual ergodic: there is a sequence of positive constants $a_n$ such that
$$\frac{1}{a_n} \sum_{k=1}^{n} \widehat{T}^k f \to \int_E f\, d\mu \quad \text{a.e.}$$
for every $f \in L^1(\mu)$. The dual operator $\widehat{T}$ satisfies the relation
$$\int_E \widehat{T} f \cdot g\, d\mu = \int_E f \cdot (g \circ T)\, d\mu \qquad \text{for } f \in L^1(\mu),\ g \in L^\infty(\mu).$$

Theorem. Assume that the normalizing sequence $(a_n)$ in the pointwise dual ergodicity is regularly varying with exponent $0 < \beta < 1$ and that $\mu(f) = \int f\, d\mu \ne 0$. Then for some sequence $(c_n)$ that is regularly varying with exponent $\beta + (1 - \beta)/\alpha$,
$$\left( \frac{1}{c_n} \sum_{k=1}^{[nt]} X_k,\ t \ge 0 \right) \Rightarrow \big( \mu(f)\, Y_{\alpha,\beta}(t),\ t \ge 0 \big) \quad \text{in } D[0, \infty).$$

The limiting process

Let $0 < \beta < 1$. We start with the inverse process
$$M_\beta(t) = S_\beta^{\leftarrow}(t) = \inf\{ u \ge 0 : S_\beta(u) \ge t \}, \qquad t \ge 0.$$
$(S_\beta(t),\ t \ge 0)$ is a (strictly) $\beta$-stable subordinator. $(M_\beta(t),\ t \ge 0)$ is called the Mittag-Leffler process.

The Mittag-Leffler process has a continuous and non-decreasing version. It is self-similar with exponent $\beta$. Its increments are neither stationary nor independent. All of its moments are finite, and
$$E \exp\{ \theta M_\beta(t) \} = \sum_{n=0}^{\infty} \frac{(\theta t^\beta)^n}{\Gamma(1 + n\beta)}, \qquad \theta \in \mathbb{R}.$$
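
A Mittag-Leffler sample path can be simulated by first simulating the $\beta$-stable subordinator on a fine grid (positive stable variates via Kanter's representation) and then inverting it, following the definition above. A sketch under those conventions (grid size and parameters are my choices):

```python
import numpy as np

def positive_stable(beta, size, rng):
    """Standard positive beta-stable variates (Laplace transform e^{-s^beta}),
    via Kanter's representation."""
    U = rng.uniform(0.0, np.pi, size)
    E = rng.exponential(1.0, size)
    return (np.sin(beta * U) / np.sin(U) ** (1 / beta)
            * (np.sin((1 - beta) * U) / E) ** ((1 - beta) / beta))

def mittag_leffler_path(ts, beta, rng, dt=1e-4, horizon=10.0):
    """M_beta(t) = inf{u >= 0 : S_beta(u) >= t}: simulate S_beta on a grid of
    mesh dt up to time `horizon`, then invert by counting grid points below t."""
    n = int(horizon / dt)
    S = np.cumsum(dt ** (1 / beta) * positive_stable(beta, n, rng))
    assert S[-1] >= max(ts), "increase horizon"
    return dt * np.searchsorted(S, ts)

m = mittag_leffler_path(np.linspace(0.0, 1.0, 11), beta=0.7,
                        rng=np.random.default_rng(6))
```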

Define
$$Y_{\alpha,\beta}(t) = \int_{\Omega \times [0,\infty)} M_\beta\big( (t - x)_+,\, \omega \big)\, dZ_{\alpha,\beta}(\omega, x), \qquad t \ge 0.$$
$Z_{\alpha,\beta}$ is a S$\alpha$S random measure on $\Omega \times [0, \infty)$ with control measure $P \times \nu$; $\nu$ is a measure on $[0, \infty)$ given by $\nu(dx) = (1 - \beta) x^{-\beta}\, dx$; $M_\beta$ is a Mittag-Leffler process defined on $(\Omega, \mathcal{F}, P)$.

Conclusions

- The length of memory should not be measured by correlations or a similarly simple measure.
- Deeper features of the process affect the limiting distributions of the partial sums, partial maxima, etc.
- In each given application it is important to use a model that fits, and not only the spectrum of the noise.