Modeling and testing long memory in random fields
1 Modeling and testing long memory in random fields. Frédéric Lavancier, Université Lille 1 and LS-CREST Paris. 24 January 2006.
2 1 Introduction: long memory random fields; motivations; previous studies (isotropic long memory).
3 Random fields on the lattice Z^d. A discrete random field is a random process indexed by the lattice Z^d. In dimension 2, the value at each point of the lattice is represented by a color (from red through green to blue). Stationary random field: the law generating the colors is invariant under translations.
4 A particular stationary random field: the white noise.
5 A weakly dependent (short memory) random field.
6 A strongly dependent (long memory) random field.
7 Quantification of the dependence of (X_n)_{n∈Z^d}. The dependence between two points of X at lag h ∈ Z^d can be measured by the covariance function r(h) = cov(X_k, X_{k+h}) = cov(X_0, X_h). An alternative tool to study dependence: the spectral density f, defined on [−π, π]^d by r(h) = ∫_{[−π,π]^d} e^{i⟨λ,h⟩} f(λ) dλ. Remark: the two points of view are closely related but not equivalent.
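The covariance/spectral-density duality above can be checked numerically. The following sketch uses dimension d = 1 and an AR(1) model for simplicity (both choices are mine, not from the talk), where f and r are known in closed form.

```python
import numpy as np

# Check r(h) = ∫_{[-π,π]} e^{iλh} f(λ) dλ on an AR(1) process
# X_k = φ X_{k-1} + ε_k with Var(ε) = 1, for which
#   f(λ) = (1/2π) |1 - φ e^{-iλ}|^{-2}   and   r(h) = φ^{|h|} / (1 - φ²).
phi = 0.6
m = 200_000
lam = -np.pi + 2 * np.pi * np.arange(m) / m        # uniform grid on [-π, π)
f = 1.0 / (2 * np.pi * np.abs(1 - phi * np.exp(-1j * lam)) ** 2)
dlam = 2 * np.pi / m

def r_from_f(h):
    # Riemann sum of the integral; spectrally accurate for a smooth
    # periodic integrand, so the error here is essentially machine precision.
    return float((np.exp(1j * lam * h) * f).sum().real * dlam)

def r_exact(h):
    return phi ** abs(h) / (1 - phi ** 2)

for h in range(5):
    assert abs(r_from_f(h) - r_exact(h)) < 1e-8
```

The same duality holds coordinate-wise on [−π, π]^d; only the quadrature grid grows with d.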
8 Long memory. Definition: a random field X exhibits long memory when its covariance function is not summable, i.e. ∑_{h∈Z^d} |r(h)| = ∞. Proposition: if the spectral density f of X is unbounded, then X is long-range dependent. Examples in dimension d = 2: f(x, y) = |x|^{−α} |y|^{−β}, 0 < α < 1, 0 < β < 1; f(x, y) = (x² + y²)^{−α}, 0 < α < 1; f(x, y) = |x − y|^{−α}, 0 < α < 1.
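The summability dichotomy in the definition is easy to visualize. This illustrative sketch (decay rates assumed, not from the talk) compares partial sums of a covariance decaying like (1 + |h|)^{−a} over growing squares of Z², with d = 2: the sums stabilize when a > d and keep growing when a < d.

```python
import numpy as np

def partial_sum(a, R):
    # sum of (1 + |h|)^{-a} over the square {-R,...,R}² of Z²
    i = np.arange(-R, R + 1)
    h1, h2 = np.meshgrid(i, i, indexing="ij")
    return float(((1.0 + np.hypot(h1, h2)) ** (-a)).sum())

short = [partial_sum(3.0, R) for R in (20, 40, 80)]   # a = 3 > d: summable
long_ = [partial_sum(1.0, R) for R in (20, 40, 80)]   # a = 1 < d: long memory

# the convergent series adds less and less; the divergent one adds more and more
assert short[2] - short[1] < short[1] - short[0]
assert long_[2] - long_[1] > long_[1] - long_[0]
```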
9 Example: a short memory random field.
10 Its covariance function.
11 Its periodogram.
12 Example: an isotropic long memory random field.
13 Its covariance function.
14 Its periodogram.
15 Example: a product-type long memory random field (figure: sample, covariance function, periodogram).
16 Example: a non-isotropic long memory random field (figure: sample, covariance function, periodogram).
17 Motivations. Are there natural mathematical models leading to long memory random fields? Given a realisation of a long memory random field, can we test the strong dependence property? Can we estimate this long memory (direction, intensity, ...)? Can we use the usual statistical methods (regression, forecasting, ...)? What is the asymptotic behaviour of the main tools (partial sums, empirical process, ...)?
18 Previous studies. In image analysis: some texture models (Kashyap and Lapsa, 1984; Bennett and Khotanzad, 1998; Eom). Asymptotic results under isotropic long memory: partial sums (Dobrushin and Major, 1979, and Surgailis, 1982); empirical process (Doukhan, Lang and Surgailis); quadratic forms (Heyde and Gay, 1993, and Doukhan, Leon and Soulier, 1996); local times (Doukhan and Leon, 1996).
19 Isotropic long memory. Definition: a stationary random field exhibits isotropic long memory if one of the two following conditions is fulfilled: its covariance function behaves as r(n) = |n|^{−α} b(n/|n|) L(|n|), 0 < α < d; or its spectral density is continuous everywhere except at zero, where f(x) ∼ |x|^{α−d} b(x/|x|) L(1/|x|), 0 < α < d. Here L is a slowly varying function at infinity and b is a continuous function on the unit sphere of R^d.
20 2 Modeling: filtering; aggregation; long memory in statistical mechanics.
21 Filtering: autoregressive fields. Let (ε_n)_{n∈Z^d} be a white noise. The equation P(L_1, ..., L_d) X_{n_1,...,n_d} = ε_{n_1,...,n_d} admits a unique stationary solution iff ∫_{[−π,π]^d} |P(e^{iλ_1}, ..., e^{iλ_d})|^{−2} dλ < ∞. If d = 1, this holds iff P(e^{iλ}) ≠ 0 for all λ. If d ≥ 2, P(e^{iλ_1}, ..., e^{iλ_d}) may vanish at some λ while ∫_{[−π,π]^d} |P(e^{iλ_1}, ..., e^{iλ_d})|^{−2} dλ < ∞. Example: X_{n_1,...,n_5} − (1/5)(X_{n_1−1,n_2,...,n_5} + ··· + X_{n_1,...,n_5−1}) = ε_{n_1,...,n_5} admits a stationary solution with spectral density f_X(λ_1, ..., λ_5) ∝ |1 − (1/5)(e^{iλ_1} + ··· + e^{iλ_5})|^{−2}.
22 Filtering: general case. Let (ε_n)_{n∈Z^d} be a white noise with spectral representation ε_n = ∫_{[−π,π]^d} e^{i⟨n,λ⟩} dW(λ), where W is the random spectral measure of ε. Property: the field constructed by filtering ε through a, defined by X_n = ∫_{[−π,π]^d} e^{i⟨n,λ⟩} a(λ) dW(λ) = ∑_{j_1,...,j_d} â_{j_1,...,j_d} ε_{n_1−j_1,...,n_d−j_d}, where the â are the Fourier coefficients of a, has spectral density f_X(λ) ∝ |a(λ)|². X is long-range dependent if the filter a is unbounded.
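The filtering construction suggests a simple way to draw approximate samples: multiply the discrete Fourier transform of a Gaussian white noise by a(λ) on a finite grid. This is a rough discretization of my own (grid size, α and the product filter |λ_1 λ_2|^{−α} are illustrative choices, not the talk's exact construction).

```python
import numpy as np

def filtered_field(n=128, alpha=0.3, seed=1):
    """Approximate sample of a product-type long memory field: filter a
    Gaussian white noise in the Fourier domain by a(λ1,λ2) = |λ1 λ2|^{-α},
    so the resulting field has spectral density f_X ∝ |a|²."""
    rng = np.random.default_rng(seed)
    lam = 2 * np.pi * np.fft.fftfreq(n)            # frequencies in [-π, π)
    l1, l2 = np.meshgrid(lam, lam, indexing="ij")
    a = np.zeros((n, n))
    mask = (l1 != 0) & (l2 != 0)
    a[mask] = np.abs(l1[mask] * l2[mask]) ** (-alpha)   # filter unbounded at 0
    eps = rng.standard_normal((n, n))
    return np.fft.ifft2(a * np.fft.fft2(eps)).real

X = filtered_field()
assert X.shape == (128, 128) and np.isfinite(X).all()
```

Zeroing the singular frequencies is a crude truncation; exact simulation would use circulant embedding of the covariance instead.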
23 Some examples in dimension d = 2. Example (non-isotropic long memory): a(λ_1, λ_2) = |λ_1 λ_2|^{−α}, 0 < α < 1/2. Example (long memory depending on the direction): X_{n_1,n_2} = (1 − (L_1 + L_2)/2)^{−α} ε_{n_1,n_2}, 0 < α < 1/2, admits the spectral density f(λ_1, λ_2) = (σ²/4π²) (4 sin²(λ_1/2) sin²(λ_2/2) + sin²((λ_1 + λ_2)/2))^{−α}.
24 Aggregation. Construct N independent copies of the autoregressive field X: P(L_1, L_2) X_{n_1,n_2} = ε_{n_1,n_2}, where P is a polynomial with random coefficients, independent of ε. The aggregated field is obtained through the classical CLT: Z_{n,m} = lim_{N→∞} (1/√N) ∑_{k=1}^N X^{(k)}_{n,m}. The Gaussian field Z has spectral density f(λ) = (σ²/(2π)^d) E[|P(e^{iλ_1}, e^{iλ_2})|^{−2}]. If one chooses properly the law generating the coefficients of P, then Z is a long memory Gaussian random field.
25 Example (product-type long memory): P(L_1, L_2) = (1 − β_1 L_1)(1 − β_2 L_2), where β_1 and β_2 are generated by the same law, with a density behaving like √(1 − x) near 1. Then f(λ_1, λ_2) ∼ 1/√(|λ_1 λ_2|) at (λ_1, λ_2) = (0, 0), and r(h, l) is non-summable. Example (long memory in one particular direction): P(L_1, L_2) = 1 − β L_1 L_2, with β generated by the same kind of density. Then f(λ_1, λ_2) ∼ 1/√(|λ_1 + λ_2|) at (λ_1, λ_2) = (0, 0); r(h, l) = 0 if h ≠ l, and r(h, h) is non-summable.
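A one-dimensional analogue makes the aggregation mechanism concrete (this is Granger-style aggregation of AR(1) paths, a simplification of the slide's 2-d construction; the √(1 − x) mixing density is my assumption carried over from above): averaging independent AR(1) paths whose coefficients concentrate near 1 produces an aggregate with slowly decaying correlations.

```python
import numpy as np

def aggregate(N=200, T=500, seed=2):
    """Average N independent AR(1) paths with random coefficients β drawn
    from the density c·sqrt(1-x) on (0,1), sampled by inverse transform:
    CDF F(x) = 1 - (1-x)^{3/2}  =>  x = 1 - (1-u)^{2/3}."""
    rng = np.random.default_rng(seed)
    beta = 1 - (1 - rng.random(N)) ** (2 / 3)
    Z = np.zeros(T)
    for b in beta:
        eps = rng.standard_normal(T)
        x = np.zeros(T)
        for t in range(1, T):
            x[t] = b * x[t - 1] + eps[t]   # one AR(1) copy
        Z += x
    return Z / np.sqrt(N)                  # CLT normalization

Z = aggregate()
assert Z.shape == (500,) and np.isfinite(Z).all()
```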
26 Long memory in statistical mechanics. The Ising model: the spins take values in {−1, 1} and the interaction potential is Φ_{i,j}(x_i, x_j) = −β x_i x_j if ∑_{k=1}^d |i_k − j_k| = 1, and 0 otherwise, where β > 0 represents the inverse of the temperature. When d ≥ 2, there is phase transition if β ≥ β_c, and r(h) ∼ |h|^{−(d−2+µ)} as |h| → ∞ at β = β_c, where µ ∈ [0, 2] is a parameter depending on d.
27 Long memory in statistical mechanics — the homogeneous systems: x_i ∈ R and Φ_{i,j}(x_i, x_j) = (1/2) J(0) x_i² if i = j, −J(i − j) x_i x_j if i ≠ j, where (J(i))_{i∈Z^d} is an even positive definite sequence of ℓ¹(Z^d). Theorem (Dobrushin, Künsch): if ∫_{[−π,π]^d} Ĵ^{−1}(λ) dλ < ∞, the pure phases are Gaussian with spectral measure Ĵ^{−1}. Moreover, there is phase transition iff Ĵ admits a root. For homogeneous systems: system in phase transition ⟺ long memory random field.
28 Example (the harmonic potential in dimension d ≥ 3): J(n) = −1/(2d) if |n| = 1, J(0) = 1, and J(n) = 0 otherwise. One has Ĵ(λ) ∼ (1/(2d)) ∑_{k=1}^d λ_k² when λ → 0. Example (in dimension d = 2, long memory in a particular direction): J(k, l) is supported near the direction l = θk, with coefficients decaying at a rate governed by α ∈ (0, 1/2), θ ∈ R. One can prove that Ĵ(λ_1, λ_2) ∼ |2 sin((λ_1 + θλ_2)/2)|^{2α}.
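The harmonic-potential claim is a one-line Taylor expansion, which a quick numerical check confirms (d = 3 and the test frequencies are arbitrary choices of mine).

```python
import numpy as np

# For J(0) = 1 and J(n) = -1/(2d) on nearest neighbours, the Fourier
# transform is  Ĵ(λ) = 1 - (1/d) Σ_k cos(λ_k),  which vanishes only at
# λ = 0 and behaves like  Σ_k λ_k² / (2d)  there — so Ĵ^{-1} has an
# integrable singularity only when d ≥ 3, matching the slide's condition.
d = 3
lam = np.array([0.01, -0.02, 0.015])       # a small frequency near 0
Jhat = 1 - np.cos(lam).sum() / d
approx = (lam ** 2).sum() / (2 * d)
assert abs(Jhat - approx) / approx < 1e-3  # quadratic approximation holds
```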
29 3 Testing long memory
30 Expected test. Null hypothesis: X exhibits short memory. Alternative hypothesis: X exhibits long memory. The test statistic is based on the empirical variance of the partial sums S_j = ∑_{i_1=1}^{j_1} ··· ∑_{i_d=1}^{j_d} (X_{i_1,...,i_d} − X̄_n). Why? The asymptotic behaviour of the partial sums makes it possible to distinguish between weak dependence and strong dependence.
31 4 Asymptotic study of the partial sums Partial sums under weak dependence Previous results under isotropic long memory Spectral scheme Application to partial sums
32 Partial sums under weak dependence. Theorem (Wichura (1971), Dedecker): let X be a second order stationary random field such that ∑_{j∈Z^d} |r(j)| < ∞. Under moment hypotheses on X, and setting σ² = ∑_{j∈Z^d} r(j), (1/(σ n^{d/2})) ∑_{k_1=1}^{[nt_1]} ··· ∑_{k_d=1}^{[nt_d]} X_{k_1,...,k_d} → B(t_1, ..., t_d) in D([0,1]^d), where B is the Brownian sheet on [0,1]^d, i.e. the Gaussian field with covariance function γ(s, t) = ∏_{i=1}^d min(s_i, t_i).
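The n^{d/2} normalization can be checked by Monte Carlo in the simplest case (an i.i.d. Gaussian field with d = 2; grid size and replication count are illustrative): the full partial sum over {1,...,n}² should have variance close to σ² n^d with σ² = 1.

```python
import numpy as np

# Under short memory, Var(S_n) ≈ σ² n^d.  For an i.i.d. standard Gaussian
# field, σ² = Σ_j r(j) = r(0) = 1, so Var(S_n)/n² should be near 1 for d = 2.
rng = np.random.default_rng(3)
n, reps = 30, 2000
S = np.array([rng.standard_normal((n, n)).sum() for _ in range(reps)])
ratio = S.var() / n ** 2
assert abs(ratio - 1) < 0.15   # ~5 standard errors of the variance estimate
```

Under the long memory alternative of the next slides, the same ratio would diverge with n, which is exactly what the test statistic exploits.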
33 Results under isotropic long memory. Let (ξ_k)_{k∈Z^d} be a strong white noise and X_k = ∑_{j∈Z^d} a_j ξ_{k−j}, where a_j = |j|^{−β} L(|j|) b(j/|j|), d/2 < β < d; L is a slowly varying function at infinity and b a continuous function on the unit sphere of R^d. The field X exhibits isotropic long memory. Theorem (Dobrushin and Major (1979), Surgailis (1982), Avram and Taqqu (1987)): (1/(n^{d−m(β−d/2)} L(n)^m)) ∑_{k_1=1}^{[nt_1]} ··· ∑_{k_d=1}^{[nt_d]} P_m(X_{k_1,...,k_d}) → Z_m(t) in the sense of finite-dimensional distributions, where P_m is the Appell polynomial of degree m and Z_m is the Hermite process of order m.
34 Spectral scheme for linear fields. Consider the linear field X_{k_1,...,k_d} = ∑_{j_1,...,j_d} â_{j_1,...,j_d} ξ_{k_1−j_1,...,k_d−j_d} = ∫_{[−π,π]^d} e^{i⟨k,λ⟩} a(λ) dW(λ), where â is the sequence of Fourier coefficients of the filter a and ξ is a noise with spectral representation ξ_k = ∫ e^{i⟨k,λ⟩} dW(λ). The partial sums of X, S_n^X(t_1, ..., t_d) = ∑_{k_1=0}^{[nt_1]−1} ··· ∑_{k_d=0}^{[nt_d]−1} X_{k_1,...,k_d}, can be rewritten S_n^X(t) = n^{d/2} ∫_{[−nπ,nπ]^d} a(λ/n) ∏_{j=1}^d (e^{iλ_j[nt_j]/n} − 1)/(n(e^{iλ_j/n} − 1)) dW_n(λ), where W_n(A) = n^{d/2} W(n^{−1}A).
35 Theorem (Lang and Soulier when d = 1, Lavancier when d ≥ 1): let W_n(A) = n^{d/2} W(n^{−1}A), where W is the random spectral measure of an i.i.d. white noise. If Φ_n → Φ in L²(R^d), then ∫ Φ_n dW_n →_L ∫ Φ dW, where W is the Gaussian white noise spectral field. Remark: the result still holds when W comes from a non-i.i.d. white noise, provided its spectral density is bounded above and it satisfies a functional CLT.
36 Application to partial sums when d ≥ 2. Recall: X_k = ∑_j â_j ξ_{k−j} = ∫_{[−π,π]^d} e^{i⟨k,λ⟩} a(λ) dW(λ), hence f_X(λ) ∝ |a(λ)|². Proposition (Lavancier, CLT): if a is continuous at 0 and a(0) ≠ 0, then n^{−d/2} ∑_{k_1=0}^{[nt_1]} ··· ∑_{k_d=0}^{[nt_d]} X_{k_1,...,k_d} → a(0) B(t) in the sense of finite-dimensional distributions, where B is the Brownian sheet. Remark: the result covers weakly dependent random fields (when a is continuous everywhere) and long memory random fields whose spectral singularities are away from zero (e.g. a(λ_1, λ_2) = |λ_1 λ_2 − 1|^{−α}, 0 < α < 1/2).
37 Application to partial sums when d ≥ 2. Proposition (Lavancier, non-CLT): if a(λ) ∼ ã(λ) at 0 with ã(cλ) = c^{−α} ã(λ) (0 < α < 1), then n^{−(d/2+α)} ∑_{k_1=0}^{[nt_1]} ··· ∑_{k_d=0}^{[nt_d]} X_{k_1,...,k_d} → ∫ ã(λ) ∏_{j=1}^d (e^{it_jλ_j} − 1)/(iλ_j) dW(λ) in the sense of finite-dimensional distributions. Remark: this result covers long memory random fields: isotropic long memory, e.g. ã(λ_1, λ_2) = (λ_1² + λ_2²)^{−α/2}, 0 < α < 1; non-isotropic long memory, e.g. ã(λ_1, λ_2) = |λ_1 + θλ_2|^{−α}, 0 < α < 1/2, θ ∈ R.
38 Recall: X_n = ∫_{[−π,π]^d} e^{i⟨n,λ⟩} a(λ) dW(λ). Remark (convergence to the fractional Brownian sheet): if a(λ) = ∏_{j=1}^d a_j(λ_j), where for all j, a_j(λ_j) ∼ |λ_j|^{−α_j}, 0 < α_j < 1/2, then (1/n^{d/2+∑_j α_j}) ∑_{k_1=0}^{[nt_1]} ··· ∑_{k_d=0}^{[nt_d]} X_{k_1,...,k_d} → ∫_{R^d} ∏_{j=1}^d ((e^{it_jλ_j} − 1)/(iλ_j)) |λ_j|^{−α_j} dW(λ) in the sense of finite-dimensional distributions.
39 5 Testing procedure and simulations Test hypothesis and test statistic Consistency Simulations in dimension d = 2
40 Test hypotheses. Null hypothesis (weak dependence): ∑_{j∈Z^d} |r(j)| < ∞ and (1/(σ n^{d/2})) ∑_{k_1=1}^{[nt_1]} ··· ∑_{k_d=1}^{[nt_d]} X_k → B(t) in D([0,1]^d), where σ² := ∑_{j∈Z^d} r(j) and B is the Brownian sheet on [0,1]^d; plus moment hypotheses. Alternative hypothesis (long memory): (1/(n^γ L(n))) ∑_{k_1=1}^{[nt_1]} ··· ∑_{k_d=1}^{[nt_d]} X_k → Y(t) in D([0,1]^d), where γ > d/2, L is a slowly varying function at infinity and Y is a measurable random field.
41 Test statistic. Let q_n be an integer in [1, n]. An estimator of σ² := ∑_{j∈Z^d} r(j) is ŝ²_{q_n,n} = ∑_{j∈{−q_n,...,q_n}^d} ω_{q_n,j} r̂(j), where ω_{q_n,j} = ∏_{i=1}^d (1 − |j_i|/q_n) and r̂ is the empirical covariance function. Let S_j = ∑_{i_1=1}^{j_1} ··· ∑_{i_d=1}^{j_d} (X_{i_1,...,i_d} − X̄_n). Definition (extension of the V/S statistic to d > 1): let A_n = {1, ..., n}^d; the V/S statistic is M_n = n^{−d} Var(S_j, j ∈ A_n)/ŝ²_{q_n,n}, hence M_n = (1/(n^{2d} ŝ²_{q_n,n})) [∑_{j∈A_n} S_j² − n^{−d} (∑_{j∈A_n} S_j)²].
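The two displayed formulas translate directly into code. This is a hedged implementation sketch for d = 2 (the array conventions and parameter values are mine; the talk only states the formulas): rectangular partial sums via cumulative sums, a Bartlett-weighted covariance estimator for ŝ², and the ratio M_n.

```python
import numpy as np

def vs_statistic(X, q):
    """Extended V/S statistic for a 2-d array X:
    M_n = [Σ_j S_j² - n^{-d} (Σ_j S_j)²] / (n^{2d} ŝ²_{q,n})."""
    n1, n2 = X.shape
    N = n1 * n2                                    # plays the role of n^d
    Xc = X - X.mean()

    def rhat(j1, j2):
        # empirical covariance at lag (j1, j2), normalized by N
        a = Xc[max(j1, 0):n1 + min(j1, 0), max(j2, 0):n2 + min(j2, 0)]
        b = Xc[max(-j1, 0):n1 + min(-j1, 0), max(-j2, 0):n2 + min(-j2, 0)]
        return (a * b).sum() / N

    # Bartlett (triangular) weights ω = Π (1 - |j_i|/q); zero at |j_i| = q
    s2 = sum((1 - abs(j1) / q) * (1 - abs(j2) / q) * rhat(j1, j2)
             for j1 in range(-q + 1, q) for j2 in range(-q + 1, q))
    S = Xc.cumsum(axis=0).cumsum(axis=1)           # rectangular partial sums S_j
    return ((S ** 2).sum() - S.sum() ** 2 / N) / (N ** 2 * s2)

rng = np.random.default_rng(4)
M = vs_statistic(rng.standard_normal((50, 50)), q=5)
assert 0 < M < 1   # under short memory, M_n stays of order 10^{-1}
```

Under a long memory alternative the partial sums blow up faster than ŝ², so M_n drifts to large values, matching the consistency result on the next slide.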
42 Consistency. Proposition (Lavancier, under the null hypothesis): under the null hypothesis, choose q_n such that lim q_n = ∞ and lim q_n/n = 0; then M_n →_L ∫_{[0,1]^d} (B(t) − ∏_{i=1}^d t_i B(1))² dt − [∫_{[0,1]^d} (B(t) − ∏_{i=1}^d t_i B(1)) dt]², where B is the Brownian sheet on [0,1]^d. Proposition (Lavancier, under the alternative hypothesis): under the alternative hypothesis, choose q_n such that lim q_n = ∞ and, for all δ > 0, lim q_n/n^δ = 0; then M_n →_P ∞.
43 Simulation of the limit law under the null hypothesis, when d = 2 (table: empirical average and variance of M_n, and the 90% and 95% critical values).
44–46 Simulations based on random fields with q = 8.
For f(λ_1, λ_2) ∝ |λ_1|^{−γ} |λ_2|^{−γ}, 0 < γ < 1: γ = 0.75, 0.5, 0.25 gives empirical power p̂ = 81%, 75.4%, 62.4%.
For f(λ_1, λ_2) ∝ (λ_1² + λ_2²)^{−γ}, 0 < γ < 1: γ = 0.96, 0.7, 0.4 gives p̂ = 75%, 58.1%, 32.7%.
For f(λ_1, λ_2) ∝ |λ_1 − λ_2|^{−γ}, 0 < γ < 1: γ = 0.75, 0.5, 0.25 gives p̂ = 27%, 27%, 24.5%.
MARCH 29, 26 LECTURE 2 UNIT ROOT, WEAK CONVERGENCE, FUNCTIONAL CLT (Davidson (2), Chapter 4; Phillips Lectures on Unit Roots, Cointegration and Nonstationarity; White (999), Chapter 7) Unit root processes
More informationExtreme inference in stationary time series
Extreme inference in stationary time series Moritz Jirak FOR 1735 February 8, 2013 1 / 30 Outline 1 Outline 2 Motivation The multivariate CLT Measuring discrepancies 3 Some theory and problems The problem
More informationA Study on "Spurious Long Memory in Nonlinear Time Series Models"
A Study on "Spurious Long Memory in Nonlinear Time Series Models" Heri Kuswanto and Philipp Sibbertsen Leibniz University Hanover, Department of Economics, Germany Discussion Paper No. 410 November 2008
More informationPROBABILITY: LIMIT THEOREMS II, SPRING HOMEWORK PROBLEMS
PROBABILITY: LIMIT THEOREMS II, SPRING 218. HOMEWORK PROBLEMS PROF. YURI BAKHTIN Instructions. You are allowed to work on solutions in groups, but you are required to write up solutions on your own. Please
More informationARIMA Modelling and Forecasting
ARIMA Modelling and Forecasting Economic time series often appear nonstationary, because of trends, seasonal patterns, cycles, etc. However, the differences may appear stationary. Δx t x t x t 1 (first
More informationTail bound inequalities and empirical likelihood for the mean
Tail bound inequalities and empirical likelihood for the mean Sandra Vucane 1 1 University of Latvia, Riga 29 th of September, 2011 Sandra Vucane (LU) Tail bound inequalities and EL for the mean 29.09.2011
More informationStochastic Processes: I. consider bowl of worms model for oscilloscope experiment:
Stochastic Processes: I consider bowl of worms model for oscilloscope experiment: SAPAscope 2.0 / 0 1 RESET SAPA2e 22, 23 II 1 stochastic process is: Stochastic Processes: II informally: bowl + drawing
More informationExtremogram and Ex-Periodogram for heavy-tailed time series
Extremogram and Ex-Periodogram for heavy-tailed time series 1 Thomas Mikosch University of Copenhagen Joint work with Richard A. Davis (Columbia) and Yuwei Zhao (Ulm) 1 Jussieu, April 9, 2014 1 2 Extremal
More informationFrom Fractional Brownian Motion to Multifractional Brownian Motion
From Fractional Brownian Motion to Multifractional Brownian Motion Antoine Ayache USTL (Lille) Antoine.Ayache@math.univ-lille1.fr Cassino December 2010 A.Ayache (USTL) From FBM to MBM Cassino December
More informationConsistent estimation of the memory parameter for nonlinear time series
Consistent estimation of the memory parameter for nonlinear time series Violetta Dalla, Liudas Giraitis and Javier Hidalgo London School of Economics, University of ork, London School of Economics 18 August,
More informationA Short Course in Basic Statistics
A Short Course in Basic Statistics Ian Schindler November 5, 2017 Creative commons license share and share alike BY: C 1 Descriptive Statistics 1.1 Presenting statistical data Definition 1 A statistical
More informationHomework # , Spring Due 14 May Convergence of the empirical CDF, uniform samples
Homework #3 36-754, Spring 27 Due 14 May 27 1 Convergence of the empirical CDF, uniform samples In this problem and the next, X i are IID samples on the real line, with cumulative distribution function
More informationTime Series Models and Inference. James L. Powell Department of Economics University of California, Berkeley
Time Series Models and Inference James L. Powell Department of Economics University of California, Berkeley Overview In contrast to the classical linear regression model, in which the components of the
More informationCONVERGENCE IN DISTRIBUTION OF THE PERIODOGRAM OF CHAOTIC PROCESSES
Stochastics and Dynamics c World Scientific Publishing Company COVERGECE I DISTRIBUTIO OF THE PERIODOGRAM OF CHAOTIC PROCESSES ARTUR O. LOPES and SÍLVIA R. C. LOPES Instituto de Matemática, UFRGS, Av.
More informationWavelet domain test for long range dependence in the presence of a trend
Wavelet domain test for long range dependence in the presence of a trend Agnieszka Jach and Piotr Kokoszka Utah State University July 24, 27 Abstract We propose a test to distinguish a weakly dependent
More informationAdjusted Empirical Likelihood for Long-memory Time Series Models
Adjusted Empirical Likelihood for Long-memory Time Series Models arxiv:1604.06170v1 [stat.me] 21 Apr 2016 Ramadha D. Piyadi Gamage, Wei Ning and Arjun K. Gupta Department of Mathematics and Statistics
More informationSELF-SIMILARITY PARAMETER ESTIMATION AND REPRODUCTION PROPERTY FOR NON-GAUSSIAN HERMITE PROCESSES
Communications on Stochastic Analysis Vol. 5, o. 1 11 161-185 Serials Publications www.serialspublications.com SELF-SIMILARITY PARAMETER ESTIMATIO AD REPRODUCTIO PROPERTY FOR O-GAUSSIA HERMITE PROCESSES
More informationTime Series Examples Sheet
Lent Term 2001 Richard Weber Time Series Examples Sheet This is the examples sheet for the M. Phil. course in Time Series. A copy can be found at: http://www.statslab.cam.ac.uk/~rrw1/timeseries/ Throughout,
More informationEcon 424 Time Series Concepts
Econ 424 Time Series Concepts Eric Zivot January 20 2015 Time Series Processes Stochastic (Random) Process { 1 2 +1 } = { } = sequence of random variables indexed by time Observed time series of length
More informationStein s Method and Characteristic Functions
Stein s Method and Characteristic Functions Alexander Tikhomirov Komi Science Center of Ural Division of RAS, Syktyvkar, Russia; Singapore, NUS, 18-29 May 2015 Workshop New Directions in Stein s method
More informationCHANGE DETECTION IN TIME SERIES
CHANGE DETECTION IN TIME SERIES Edit Gombay TIES - 2008 University of British Columbia, Kelowna June 8-13, 2008 Outline Introduction Results Examples References Introduction sunspot.year 0 50 100 150 1700
More informationHigh quantile estimation for some Stochastic Volatility models
High quantile estimation for some Stochastic Volatility models Ling Luo Thesis submitted to the Faculty of Graduate and Postdoctoral Studies in partial fulfillment of the requirements for the degree of
More informationMathematical Methods for Neurosciences. ENS - Master MVA Paris 6 - Master Maths-Bio ( )
Mathematical Methods for Neurosciences. ENS - Master MVA Paris 6 - Master Maths-Bio (2014-2015) Etienne Tanré - Olivier Faugeras INRIA - Team Tosca November 26th, 2014 E. Tanré (INRIA - Team Tosca) Mathematical
More information