Long-range dependence
1 Long-range dependence
Kechagias Stefanos, University of North Carolina at Chapel Hill
May 23, 2013
2 Outline
1. Introduction to time series: stationarity; AR(1), MA(1), ARMA(p,q) time series; spectral density and periodogram.
2. Long-range dependence (LRD): definitions of LRD in the time and spectral domains; examples of time series with long memory; an application in finance (a long-memory volatility model).
3. Multivariate extension: definitions; causal and non-causal representations; examples.
3 PART A: INTRODUCTION TO TIME SERIES
4 Introduction to time series
A time series is a set of observations x_t, each one recorded at a specified time t. A discrete time series is one for which the set T_0 of times at which observations are made is discrete.
Example (population of the U.S.): a table of census years t and population counts x_t.
5 Mathematical model
Question: what mathematical model should we use in time series analysis? We can think of each observation x_t as a realized value of a certain random variable X_t. The time series {x_t, t ∈ T_0} is then a realization of the family of random variables {X_t, t ∈ T_0}.
Definition (stochastic process): a stochastic process is a family of random variables {X_t, t ∈ T} defined on a probability space (Ω, F, P). We will take T = Z.
6 Weak stationarity
Definition (stationarity): a stochastic process {X_t}_{t∈Z} is (weakly, or second-order) stationary if for any t, s ∈ Z:
E|X_t|² < ∞, E X_t = E X_0, and Cov(X_t, X_s) = Cov(X_{t−s}, X_0).
The time difference t − s is called the time lag.
Time-domain perspective: for a (weakly) stationary time series {X_t}_{t∈Z}, one focuses in the time domain on the functions
γ_X(h) = Cov(X_h, X_0) = Cov(X_{t+h}, X_t) for all t, h ∈ Z, and ρ_X(h) = γ_X(h)/γ_X(0),
7 ACVF, ACF
These are called the autocovariance function (ACVF) and the autocorrelation function (ACF); both are measures of dependence in a time series. Their sample counterparts are
γ̂_X(h) = (1/N) Σ_{t=1}^{N−|h|} (X_{t+|h|} − X̄)(X_t − X̄), ρ̂_X(h) = γ̂_X(h)/γ̂_X(0), |h| ≤ N − 1.
Basic properties:
1. (Symmetry) ρ_X(h) = ρ_X(−h), h ∈ Z.
2. (Range) |ρ_X(h)| ≤ 1, h ∈ Z.
3. (Interpretation) ρ_X(h) close to −1, 0 and 1 corresponds to strong negative, weak and strong positive correlation, respectively, in the time series at lag h.
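The sample estimators above can be sketched in a few lines; this is a minimal numpy version (numpy assumed available), using the biased 1/N normalization from the slide.

```python
import numpy as np

def sample_acvf(x, h):
    """Biased sample autocovariance at lag |h| (1/N normalization)."""
    x = np.asarray(x, dtype=float)
    N, h = len(x), abs(h)
    xbar = x.mean()
    return float(np.sum((x[h:] - xbar) * (x[: N - h] - xbar)) / N)

def sample_acf(x, h):
    return sample_acvf(x, h) / sample_acvf(x, 0)

rng = np.random.default_rng(0)
z = rng.standard_normal(10_000)   # a white noise realization
rho0 = sample_acf(z, 0)           # exactly 1 by construction
rho1 = sample_acf(z, 1)           # near 0 for white noise
```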
8 Time series examples
Example 1 (white noise). A time series X_n = Z_n, n ∈ Z, is called white noise, denoted WN(0, σ²), if E Z_n = 0 and
γ_Z(h) = σ² if h = 0, and 0 if h ≠ 0; ρ_Z(h) = 1 if h = 0, and 0 if h ≠ 0.
Example 2 (MA(1)). A time series is called a moving average of order 1 (MA(1)) if it is given by X_n = Z_n + θ Z_{n−1}, n ∈ Z, where {Z_n} ~ WN(0, σ²). Observe that
γ_X(h) = E X_h X_0 = E (Z_h + θ Z_{h−1})(Z_0 + θ Z_{−1}) = σ²(1 + θ²) if h = 0, σ²θ if |h| = 1, and 0 if |h| ≥ 2,
9 Time series examples
and hence ρ_X(h) = 1 if h = 0, θ/(1 + θ²) if |h| = 1, and 0 if |h| ≥ 2.
Example 3 (AR(1)). A (weakly) stationary time series {X_n}_{n∈Z} is called autoregressive of order 1 (AR(1)) if it satisfies the AR(1) equation X_n = φ X_{n−1} + Z_n, n ∈ Z, where {Z_n} ~ WN(0, σ²). To see that an AR(1) time series exists, suppose |φ| < 1 and iterate:
X_n = φ² X_{n−2} + φ Z_{n−1} + Z_n = φ^m X_{n−m} + φ^{m−1} Z_{n−(m−1)} + ... + Z_n → Σ_{m=0}^{∞} φ^m Z_{n−m}.
The time series is well defined in the L²(Ω) sense because
10 Time series examples
E ( Σ_{m=n_1}^{n_2} φ^m Z_{n−m} )² = Σ_{m=n_1}^{n_2} φ^{2m} σ² → 0, as n_1, n_2 → ∞, for |φ| < 1. One also calculates easily
γ_X(h) = σ² φ^{|h|}/(1 − φ²), ρ_X(h) = φ^{|h|}.
Example 4 (ARMA(p,q)). A (weakly) stationary time series {X_n}_{n∈Z} is called an autoregressive moving average of orders p and q (ARMA(p, q)) if it satisfies the equation
X_n − φ_1 X_{n−1} − ... − φ_p X_{n−p} = Z_n + θ_1 Z_{n−1} + ... + θ_q Z_{n−q},
where {Z_n} ~ WN(0, σ²). It is convenient to express such equations in terms of the so-called backshift operator B, defined by B^k X_n = X_{n−k}, B⁰ = I, n ∈ Z.
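As a sanity check (not from the slides; φ = 0.6 is an arbitrary choice), one can simulate an AR(1) path and compare the sample ACF with the theoretical ρ_X(h) = φ^{|h|}:

```python
import numpy as np

rng = np.random.default_rng(1)
phi, N = 0.6, 200_000
z = rng.standard_normal(N)
x = np.empty(N)
x[0] = z[0]
for n in range(1, N):
    x[n] = phi * x[n - 1] + z[n]   # AR(1) recursion X_n = phi X_{n-1} + Z_n

def acf(x, h):
    xc = x - x.mean()
    return float(np.dot(xc[h:], xc[: len(xc) - h]) / np.dot(xc, xc))

rho1, rho2 = acf(x, 1), acf(x, 2)   # theory: phi = 0.6 and phi^2 = 0.36
```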
11 Spectral domain
With this notation, the ARMA(p, q) equation becomes φ(B) X_n = θ(B) Z_n, where
φ(z) = 1 − φ_1 z − ... − φ_p z^p, θ(z) = 1 + θ_1 z + ... + θ_q z^q
are the so-called characteristic polynomials.
Spectral-domain perspective: in the spectral domain the focus is on the function
f_X(λ) = (1/2π) Σ_{h=−∞}^{∞} e^{−ihλ} γ_X(h), λ ∈ (−π, π],
called the spectral density. Observe that f_X is well defined pointwise when γ_X ∈ L¹(Z).
12 Spectral domain
Example 1 (white noise, cont'd). If {Z_n} ~ WN(0, σ²), then f_Z(λ) = σ²/(2π), λ ∈ (−π, π].
Example 2 (AR(1), cont'd). If {X_n}_{n∈Z} is an AR(1) time series with |φ| < 1, so that γ_X(h) = σ² φ^{|h|}/(1 − φ²), then
f_X(λ) = (σ²/(2π(1 − φ²))) (1 + Σ_{h=1}^{∞} (e^{−ihλ} + e^{ihλ}) φ^h)
       = (σ²/(2π(1 − φ²))) (1 + φe^{−iλ}/(1 − φe^{−iλ}) + φe^{iλ}/(1 − φe^{iλ}))
       = (σ²/2π) · 1/|1 − φe^{−iλ}|².
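The AR(1) closed form can be verified numerically: summing the Fourier series of the ACVF (truncated at a large lag) should reproduce σ²/(2π|1 − φe^{−iλ}|²). The values φ = 0.5 and λ = 1.2 below are arbitrary choices for the check.

```python
import numpy as np

phi, sigma, lam = 0.5, 1.0, 1.2
H = 200                                   # truncation lag; phi^200 is negligible
h = np.arange(-H, H + 1)
gamma = sigma**2 * phi ** np.abs(h) / (1 - phi**2)   # AR(1) ACVF
f_sum = float(np.sum(np.exp(-1j * h * lam) * gamma).real) / (2 * np.pi)
f_closed = sigma**2 / (2 * np.pi * abs(1 - phi * np.exp(-1j * lam)) ** 2)
```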
13 Spectral domain
Basic properties:
1. (Symmetry) f_X(λ) = f_X(−λ).
2. (Non-negativity) f_X(λ) ≥ 0.
3. (Inverse discrete Fourier transform) γ_X(h) = ∫_{−π}^{π} e^{ihλ} f_X(λ) dλ.
A sample counterpart of the spectral density is defined by
(1/2π) Σ_{|h|<N} γ̂_X(h) e^{−ihλ} = (1/2πN) |Σ_{n=1}^{N} X_n e^{−inλ}|² =: I_{X,N}(λ),
with the first equality holding only at the so-called Fourier frequencies λ = λ_k = 2πk/N, k = −[(N−1)/2], ..., [N/2].
14 Spectral domain
I_{X,N} is known as the periodogram and has the following properties:
1. (Computational speed) I_{X,N}(λ_k) can be computed by the fast Fourier transform (FFT) in O(N log N) steps, provided N is a product of many small factors.
2. (Statistical properties) I_{X,N}(λ) is not a consistent estimator of f_X(λ), but it is asymptotically unbiased; the periodogram needs to be smoothed to become consistent.
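A short illustration of both points, assuming numpy's FFT and Gaussian white noise as the test series: the periodogram at the Fourier frequencies comes straight out of one FFT call, single ordinates fluctuate around f(λ) = σ²/(2π) without converging, and a crude smoother (an average over frequencies) recovers f.

```python
import numpy as np

rng = np.random.default_rng(2)
sigma, N = 1.0, 4096
x = rng.normal(0.0, sigma, N)

# periodogram ordinates I_{X,N}(lambda_k), lambda_k = 2*pi*k/N, via the FFT
I = np.abs(np.fft.fft(x)) ** 2 / (2 * np.pi * N)

# for white noise f(lambda) = sigma^2/(2*pi); averaging over frequencies
# (a crude smoothing) gets close to it, unlike any single ordinate
f_wn = sigma**2 / (2 * np.pi)
avg = float(I[1:].mean())
```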
15 PART B: LONG-RANGE DEPENDENCE
16 Slowly varying functions
Definition 1: a function L is slowly varying at infinity if it is positive and, for any a > 0,
lim_{u→∞} L(au)/L(u) = 1.
A function L is slowly varying at zero if u ↦ L(1/u) is slowly varying at infinity.
Example: L(u) = log(u) is slowly varying at infinity.
We are now ready to introduce time series with long-range dependence. These will involve a parameter d ∈ (0, 1/2) and a slowly varying function L.
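The log example is easy to see numerically: log(au)/log(u) = 1 + log(a)/log(u) → 1 for any fixed a > 0 (a = 7 below is an arbitrary choice).

```python
import math

# slow variation of L(u) = log u: the ratio decreases toward 1 as u grows
a = 7.0
ratios = [math.log(a * u) / math.log(u) for u in (1e2, 1e6, 1e12)]
```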
17 Definitions of long-range dependence
Let d ∈ (0, 1/2), let L_1, L_2 be slowly varying at infinity, and let L_3 be slowly varying at zero.
Condition I. X_n = μ + Σ_{k=0}^{∞} ψ_k ε_{n−k} with {ε_n}_{n∈Z} ~ WN and ψ_k = L_1(k) k^{d−1}, as k → ∞.
Condition II. γ(k) = L_2(k) k^{2d−1}, as k → ∞.
Condition III. Σ_{k=−∞}^{∞} |γ(k)| = ∞.
Condition IV. f(λ) = L_3(λ) λ^{−2d}, as λ → 0.
In statistical inference, all slowly varying functions are replaced by constants (in the asymptotic sense, e.g. L_2(u) ~ c_2).
18 Definitions of long-range dependence
Definition 2: a second-order stationary time series X = {X_n}_{n∈Z} is called long-range dependent (LRD, in short) if one of the non-equivalent Conditions I-IV above holds. The parameter d ∈ (0, 1/2) is called a long-range dependence (LRD) parameter.
Other names: long memory, strongly dependent, 1/f (1/λ) noise, colored noise, burstiness, ...
Definition 3: a second-order stationary time series X = {X_n}_{n∈Z} is called short-range dependent (SRD, in short) if Σ_{k=−∞}^{∞} |γ(k)| < ∞.
Other names: short memory, weakly dependent, ...
19-22 Some real examples of LRD series (four slides of figures)
23 Cond. IV ⇒ III
As mentioned earlier, Conditions I-IV are not equivalent in general. We will examine briefly how they are related.
Fact: if Σ_{k=−∞}^{∞} |γ(k)| < ∞, then the series X = {X_n}_{n∈Z} has a continuous spectral density f(λ) = (1/2π) Σ_{k=−∞}^{∞} e^{−ikλ} γ(k).
If Condition III fails, that is, if Σ_{k=−∞}^{∞} |γ(k)| < ∞, the fact above implies that f is continuous with f(0) = (1/2π) Σ_{k=−∞}^{∞} γ(k) < ∞. But by Condition IV, lim_{λ→0} f(λ) = lim_{λ→0} L_3(λ) λ^{−2d} = ∞. This is a contradiction, showing that Condition IV implies Condition III.
Definition: a slowly varying function L on [0, ∞) is called quasi-monotone if it is of bounded variation on every compact interval of [0, ∞) and if, for all δ > 0,
∫_0^x u^δ |dL(u)| = O(x^δ L(x)), as x → ∞.
24 Comparing Cond. II and IV
Cond. IV ⇒ II: informally,
γ(k) = ∫_{−π}^{π} e^{ikλ} f(λ) dλ = ∫_{−π}^{π} e^{ikλ} L_3(|λ|) |λ|^{−2d} dλ
     = k^{2d−1} L_3(1/k) ∫_{−kπ}^{kπ} e^{iz} |z|^{−2d} (L_3(|z|/k)/L_3(1/k)) dz
     ~ k^{2d−1} L_3(1/k) ∫_{−∞}^{∞} e^{iz} |z|^{−2d} dz
     = k^{2d−1} L_3(1/k) · 2 cos(π(1 − 2d)/2) Γ(1 − 2d),
and hence L_2(u) ~ L_3(1/u) · 2 cos(π(1 − 2d)/2) Γ(1 − 2d). These calculations can be justified assuming that L_3 is quasi-monotone; otherwise the result does not hold in general, even in the case L_3(λ) ~ c_3.
25 Comparing Cond. II and IV
Cond. II ⇒ IV: informally,
f(λ) = (1/2π) Σ_{k=−∞}^{∞} e^{−ikλ} γ(k) = (1/2π) Σ_{k=−∞}^{∞} e^{−ikλ} L_2(|k|) |k|^{2d−1}
     = λ^{−2d} L_2(1/λ) (1/2π) Σ_{k=−∞}^{∞} e^{−ikλ} (L_2(|k|)/L_2(1/λ)) |kλ|^{2d−1} λ
     ~ λ^{−2d} L_2(1/λ) (1/2π) ∫_{−∞}^{∞} e^{−iz} |z|^{2d−1} dz
     = λ^{−2d} L_2(1/λ) · (1/π) Γ(2d) cos(πd),
thus L_3(λ) ~ L_2(1/λ) · (1/π) Γ(2d) cos(πd). These calculations can be justified assuming that L_2 is quasi-monotone; otherwise the result does not hold in general, even in the case L_2(k) ~ c_2.
26 Some real examples of LRD series (figure)
27 FARIMA(0, d, 0)
Example: if d is a non-negative integer, then X_n is said to be an ARIMA(0, d, 0) process if (I − B)^d X_n = Z_n, where {Z_n} ~ WN(0, σ²) and B is the backshift operator. In the simple cases d = 1, 2, ..., this equation reads:
d = 1: (I − B) X_n = X_n − X_{n−1} = Z_n,
d = 2: (I − B)² X_n = (X_n − X_{n−1}) − (X_{n−1} − X_{n−2}) = Z_n,
and so on for d ≥ 3.
Remark: in many applications we want to difference the observed time series in order to achieve approximate stationarity. However, even when differencing seems appropriate, taking the first or second difference may be too strong!
28 FARIMA(0, d, 0)
Example (cont'd): for other values of d we would like to interpret the solution of the ARIMA(0, d, 0) equation as
X_n = (I − B)^{−d} Z_n = Σ_{j=0}^{∞} b_j B^j Z_n = Σ_{j=0}^{∞} b_j Z_{n−j},
where the b_j are the coefficients in the Taylor expansion
(1 − z)^{−d} = 1 + dz + (d(d+1)/2!) z² + (d(d+1)(d+2)/3!) z³ + ... = Σ_{j=0}^{∞} ( Π_{k=1}^{j} (k − 1 + d)/k ) z^j = Σ_{j=0}^{∞} (Γ(j + d)/(Γ(j + 1) Γ(d))) z^j =: Σ_{j=0}^{∞} b_j z^j.
The time series is well defined when Σ_{j=0}^{∞} b_j² < ∞, which depends on the behavior of b_j as j → ∞.
29 FARIMA(0, d, 0)
Example (cont'd): by Stirling's formula, Γ(p) ~ √(2π) e^{−(p−1)} (p−1)^{p−1/2} as p → ∞, so that Γ(j + a)/Γ(j + b) ~ j^{a−b} and hence
b_j = Γ(j + d)/(Γ(j + 1) Γ(d)) ~ j^{d−1}/Γ(d), as j → ∞.
The time series is then well defined if 2(d − 1) + 1 = 2d − 1 < 0, that is, if d < 1/2.
Definition: the time series X_n = (I − B)^{−d} Z_n = Σ_{j=0}^{∞} b_j Z_{n−j} is called FARIMA(0, d, 0) when d < 1/2. Since b_j ~ j^{d−1}/Γ(d), a FARIMA(0, d, 0) series is LRD when 0 < d < 1/2, in the sense of Condition I, and hence Conditions II and III.
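The asymptotics b_j ~ j^{d−1}/Γ(d) can be checked directly; computing the Gamma ratio in log space (via `math.lgamma`) avoids overflow for large j. The value d = 0.3 is an arbitrary choice.

```python
import math

d = 0.3

def b(j):
    # b_j = Gamma(j + d) / (Gamma(j + 1) * Gamma(d)), evaluated in log space
    return math.exp(math.lgamma(j + d) - math.lgamma(j + 1) - math.lgamma(d))

j = 10_000
ratio = b(j) / (j ** (d - 1) / math.gamma(d))   # tends to 1 as j grows
```

Note also b_1 = Γ(1 + d)/(Γ(2)Γ(d)) = d, matching the Taylor expansion above.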
30 FARIMA(0, d, 0)
Example (cont'd):
Fact 1: a FARIMA(0, d, 0) series has spectral density
f(λ) = (σ²/2π) (2 sin(λ/2))^{−2d} = (σ²/2π) |1 − e^{−iλ}|^{−2d} ~ (σ²/2π) λ^{−2d}, as λ → 0.
Basic idea: a series X_n = Σ_{j=0}^{∞} b_j Z_{n−j} has spectral density f(λ) = (σ²/2π) |Σ_{j=0}^{∞} b_j e^{−ijλ}|². Here, Σ_{j=0}^{∞} b_j e^{−ijλ} = (1 − e^{−iλ})^{−d}.
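Both equalities in Fact 1 are easy to confirm numerically (d = 0.2 and the frequencies below are arbitrary choices for the check): |1 − e^{−iλ}| = 2 sin(λ/2) on (0, π], and near zero the density behaves like λ^{−2d}.

```python
import numpy as np

d, sigma = 0.2, 1.0
lam = 0.7
f1 = sigma**2 / (2 * np.pi) * (2 * np.sin(lam / 2)) ** (-2 * d)
f2 = sigma**2 / (2 * np.pi) * abs(1 - np.exp(-1j * lam)) ** (-2 * d)

lam_small = 1e-4
f_small = sigma**2 / (2 * np.pi) * (2 * np.sin(lam_small / 2)) ** (-2 * d)
approx = sigma**2 / (2 * np.pi) * lam_small ** (-2 * d)   # the lambda^{-2d} blow-up
```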
31 FARIMA(0, d, 0)
Fact 2: a FARIMA(0, d, 0) series has autocovariances
γ(k) = σ² (−1)^k Γ(1 − 2d) / (Γ(1 + k − d) Γ(1 − k − d)) ~ (Γ(1 − 2d) sin(πd)/π) σ² k^{2d−1}, as k → ∞.
Basic idea: γ(k) = ∫_{−π}^{π} e^{ikλ} f(λ) dλ = (σ²/π) ∫_0^π cos(kλ) (2 sin(λ/2))^{−2d} dλ, and the last integral has a known closed form.
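The asymptotic constant can be checked numerically. For the computation it is convenient to use the equivalent Gamma-ratio form γ(k) = σ² Γ(1 − 2d) Γ(k + d) / (Γ(d) Γ(1 − d) Γ(k + 1 − d)) (γ(0)ρ(k) with ρ(k) = Γ(1 − d)Γ(k + d)/(Γ(d)Γ(k + 1 − d))), which involves only positive Gamma arguments and so works with `lgamma`; d = 0.25 is an arbitrary choice.

```python
import math

d, sigma = 0.25, 1.0

def gamma_farima(k):
    # gamma(k) = sigma^2 Gamma(1-2d) Gamma(k+d) / (Gamma(d) Gamma(1-d) Gamma(k+1-d))
    return sigma**2 * math.exp(
        math.lgamma(1 - 2 * d) + math.lgamma(k + d)
        - math.lgamma(d) - math.lgamma(1 - d) - math.lgamma(k + 1 - d)
    )

k = 10_000
asym = sigma**2 * math.gamma(1 - 2 * d) * math.sin(math.pi * d) / math.pi * k ** (2 * d - 1)
ratio = gamma_farima(k) / asym   # tends to 1 as k grows
```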
32 FARIMA(0, d, 0) (figure)
33 Long memory in volatility
Long-memory stochastic volatility model (LMSV): consider the latent-variable model for the return series r_t,
r_t = σ_t v_t,
where {v_t} is an independent identically distributed series with mean zero and finite variance, and the volatility is given by σ_t = exp(h_t/2), where {h_t} is a Gaussian long-memory series independent of {v_t}. Using the moment generating function of the Gaussian distribution, it can be shown for the LMSV model that
ρ_{r_t²}(j) ~ C j^{2d−1}, as j → ∞.
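A simulation sketch of the construction; all parameter choices here (d = 0.3, the truncation M of the FARIMA filter used to approximate h_t, the 0.7 scaling of h) are illustrative assumptions, not from the talk. The returns themselves come out essentially uncorrelated while the squared returns are positively autocorrelated, as the model intends.

```python
import numpy as np

rng = np.random.default_rng(3)
d, N, M = 0.3, 50_000, 2_000

# FARIMA(0,d,0) coefficients b_j = Gamma(j+d)/(Gamma(j+1)Gamma(d)), via recursion
b = np.empty(M)
b[0] = 1.0
for j in range(1, M):
    b[j] = b[j - 1] * (j - 1 + d) / j

eps = rng.standard_normal(N + M - 1)
h = 0.7 * np.convolve(eps, b, mode="valid")  # truncated Gaussian long-memory series
v = rng.standard_normal(N)
r = np.exp(h / 2) * v                        # LMSV returns r_t = sigma_t v_t

def acf(x, lag):
    xc = x - x.mean()
    return float(np.dot(xc[lag:], xc[: len(xc) - lag]) / np.dot(xc, xc))

acf_r = acf(r, 1)        # returns: essentially uncorrelated
acf_r2 = acf(r**2, 1)    # squared returns: positively autocorrelated
```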
34 PART C: MULTIVARIATE LRD
35 Multivariate LRD
We focus here on vector-valued (R^p-valued), second-order stationary time series X = {X_n}_{n∈Z}.
Autocovariance matrix function: γ(h) = (γ_jk(h))_{j,k=1,...,p} = E X_0 X_h' − E X_0 E X_h', h ∈ Z.
Spectral density matrix function (if it exists): f(λ) = (f_jk(λ))_{j,k=1,...,p}, λ ∈ (−π, π]. It satisfies
∫_{−π}^{π} e^{inλ} f(λ) dλ = γ(n), n ∈ Z.
36 Multivariate LRD
Remark 1: the cross-covariances γ_jk(n) may not equal γ_jk(−n) when j ≠ k. Therefore, in contrast to the univariate case, it is not true in general that
γ(n) = γ(−n), n ∈ Z. (1)
If (1) holds, the time series X is called time reversible.
Remark 2: since (1) may not hold, f(λ) is in general complex-valued; f(λ) is Hermitian symmetric, non-negative definite and satisfies f(−λ) = conj(f(λ)), λ ∈ [−π, π).
We are now ready to extend Conditions II and IV to the multivariate case. Let D = diag(d_1, ..., d_p) with d_j ∈ (0, 1/2), j = 1, ..., p.
37 Multivariate LRD
Condition II-v. The autocovariance matrix function of the time series X = {X_n}_{n∈Z} satisfies
γ(n) = n^{D−(1/2)I} L_2(n) n^{D−(1/2)I}, (2)
where L_2 is an R^{p×p}-valued function satisfying L_2(u) → R, as u → +∞, for some p×p matrix R. Equivalently, entrywise,
γ_jk(n) = L_{2,jk}(n) n^{(d_j+d_k)−1} ~ R_jk n^{(d_j+d_k)−1}, as n → ∞.
Condition IV-v. The spectral density matrix function satisfies
f(λ) = λ^{−D} L_3(λ) λ^{−D}, (3)
where L_3 is a C^{p×p}-valued, Hermitian symmetric, non-negative definite matrix function satisfying L_3(λ) → G, as λ → 0, for some p×p, Hermitian symmetric, non-negative definite matrix G. Equivalently,
f_jk(λ) = L_{3,jk}(λ) λ^{−(d_j+d_k)} ~ G_jk λ^{−(d_j+d_k)} =: g_jk e^{iφ_jk} λ^{−(d_j+d_k)}, as λ → 0,
38 Multivariate LRD
where g_jk ∈ R and φ_jk ∈ [−π, π).
Remarks:
1. The individual component series {X_n^j}_{n∈Z}, j = 1, ..., p, of a multivariate LRD series are LRD with parameters d_j, j = 1, ..., p.
2. Note from (3) that f(λ) is Hermitian, non-negative definite. The entries φ_jk are referred to as phase parameters.
3. We supposed for simplicity that all slowly varying functions behave as constants.
4. The squared coherence function H_jk²(λ) = |f_jk(λ)|²/(f_jj(λ) f_kk(λ)) satisfies 0 ≤ H_jk²(λ) ≤ 1. As λ → 0, this translates into
0 ≤ lim_{λ→0} |G_jk|² λ^{−2(d_j+d_k)} / (G_jj λ^{−2d_j} G_kk λ^{−2d_k}) = |G_jk|²/(G_jj G_kk) ≤ 1, (4)
which also explains why the choice of λ^{−(d_j+d_k)} is natural for the cross-spectral density f_jk(λ).
39 Comparing Cond. II-v and IV-v
Suppose that the L_{2,jk} are quasi-monotone. Then Cond. II-v implies Cond. IV-v, with G_jk = g_jk e^{iφ_jk} given by
φ_jk = arctan( ((R_jk − R_kj)/(R_jk + R_kj)) tan(π(d_j + d_k)/2) ),
g_jk = Γ(d_j + d_k) (R_jk + R_kj) cos(π(d_j + d_k)/2) / (2π cos(φ_jk)).
Suppose that Re L_{3,jk} and Im L_{3,jk} are quasi-monotone. Then Cond. IV-v implies Cond. II-v, with
R_jk = 2 Γ(1 − (d_j + d_k)) ( Re(G_jk) sin(π(d_j + d_k)/2) − Im(G_jk) cos(π(d_j + d_k)/2) ).
40 Linear representation
Suppose that the series X_n has a linear representation of the form
X_n = Σ_{m=−∞}^{∞} Ψ_m Z_{n−m},
where {Ψ_m = (ψ_{m,jk})_{j,k=1,...,p}}_{m∈Z} is a sequence of real matrices such that
ψ_{m,jk} = L_jk(m) |m|^{d_j−1},
where L(m) = (L_jk(m))_{j,k=1,...,p} is an R^{p×p}-valued function satisfying L(m) → A_+ as m → +∞ and L(m) → A_− as m → −∞, for some p×p real matrices A_+ = (a⁺_jk)_{j,k=1,...,p}, A_− = (a⁻_jk)_{j,k=1,...,p}. Then X_n is LRD.
41 Example
Consider the one-sided series
X_n = Σ_{m=0}^{∞} Ψ_m Z_{n−m},
with {Z_n}_{n∈Z} and Ψ_m as earlier. Then the phase parameters appearing in the (j, k) entry of the spectral density matrix of X_n have the form
φ_jk = (d_j − d_k) π/2.
Question: what behavior should one consider for Ψ_m to get a causal representation with general phase? Linear combinations of c_m^{a,b} = m^{−b} cos(2π m^a) and s_m^{a,b} = m^{−b} sin(2π m^a) (?)
42 FARIMA(0, D, 0)
Let M_+ = (m⁺_ij), M_− = (m⁻_ij) ∈ R^{p×p}, and let {Z_n}_{n∈Z} be such that E Z_n = 0 and E Z_n Z_m' = σ² I if n = m, and = 0 if n ≠ m. Define a multivariate FARIMA(0, D, 0) series as
X_n = (I − B)^{−D} M_+ Z_n + (I − B^{−1})^{−D} M_− Z_n.
For the componentwise spectral density one has
f_jk(λ) ~ c_jk λ^{−(d_j+d_k)}, as λ → 0,
where
c_jk = (σ²/2π) ( e^{−i(d_j−d_k)π/2} A_1 + e^{−i(d_j+d_k)π/2} A_2 + e^{i(d_j+d_k)π/2} A_3 + e^{i(d_j−d_k)π/2} A_4 )
and A_1 = Σ_{t=1}^{p} m⁺_jt m⁺_kt, A_2 = Σ_{t=1}^{p} m⁺_jt m⁻_kt, A_3 = Σ_{t=1}^{p} m⁻_jt m⁺_kt, A_4 = Σ_{t=1}^{p} m⁻_jt m⁻_kt.
43 Lots and lots of fun stuff
- Self-similar processes
- Ising model in two dimensions
- Infinite source Poisson model with heavy tails
44 References
The results in Part C are joint work with my advisor, Vladas Pipiras.
1. Brockwell, P. J. & Davis, R. A. (2009), Time Series: Theory and Methods, Springer.
2. Pipiras, V. & Taqqu, M. S. (forthcoming 2014), Long-Range Dependence and Self-Similarity.
3. Doukhan, P., Oppenheim, G. & Taqqu, M. S. (2003), Theory and Applications of Long-Range Dependence.
45 Thank you!
Time Series 3 Robert Almgren Sept. 28, 2009 Last time we discussed two main categories of linear models, and their combination. Here w t denotes a white noise: a stationary process with E w t ) = 0, E
More informationCh. 19 Models of Nonstationary Time Series
Ch. 19 Models of Nonstationary Time Series In time series analysis we do not confine ourselves to the analysis of stationary time series. In fact, most of the time series we encounter are non stationary.
More informationPart II. Time Series
Part II Time Series 12 Introduction This Part is mainly a summary of the book of Brockwell and Davis (2002). Additionally the textbook Shumway and Stoffer (2010) can be recommended. 1 Our purpose is to
More informationTHE K-FACTOR GARMA PROCESS WITH INFINITE VARIANCE INNOVATIONS. 1 Introduction. Mor Ndongo 1 & Abdou Kâ Diongue 2
THE K-FACTOR GARMA PROCESS WITH INFINITE VARIANCE INNOVATIONS Mor Ndongo & Abdou Kâ Diongue UFR SAT, Universit Gaston Berger, BP 34 Saint-Louis, Sénégal (morndongo000@yahoo.fr) UFR SAT, Universit Gaston
More informationDetection of structural breaks in multivariate time series
Detection of structural breaks in multivariate time series Holger Dette, Ruhr-Universität Bochum Philip Preuß, Ruhr-Universität Bochum Ruprecht Puchstein, Ruhr-Universität Bochum January 14, 2014 Outline
More informationTime Series Analysis. Solutions to problems in Chapter 5 IMM
Time Series Analysis Solutions to problems in Chapter 5 IMM Solution 5.1 Question 1. [ ] V [X t ] = V [ǫ t + c(ǫ t 1 + ǫ t + )] = 1 + c 1 σǫ = The variance of {X t } is not limited and therefore {X t }
More information1 Linear Difference Equations
ARMA Handout Jialin Yu 1 Linear Difference Equations First order systems Let {ε t } t=1 denote an input sequence and {y t} t=1 sequence generated by denote an output y t = φy t 1 + ε t t = 1, 2,... with
More informationFor a stochastic process {Y t : t = 0, ±1, ±2, ±3, }, the mean function is defined by (2.2.1) ± 2..., γ t,
CHAPTER 2 FUNDAMENTAL CONCEPTS This chapter describes the fundamental concepts in the theory of time series models. In particular, we introduce the concepts of stochastic processes, mean and covariance
More informationDiscrete time processes
Discrete time processes Predictions are difficult. Especially about the future Mark Twain. Florian Herzog 2013 Modeling observed data When we model observed (realized) data, we encounter usually the following
More informationStochastic volatility models: tails and memory
: tails and memory Rafa l Kulik and Philippe Soulier Conference in honour of Prof. Murad Taqqu 19 April 2012 Rafa l Kulik and Philippe Soulier Plan Model assumptions; Limit theorems for partial sums and
More informationNonlinear Time Series Modeling
Nonlinear Time Series Modeling Part II: Time Series Models in Finance Richard A. Davis Colorado State University (http://www.stat.colostate.edu/~rdavis/lectures) MaPhySto Workshop Copenhagen September
More informationChapter 12: An introduction to Time Series Analysis. Chapter 12: An introduction to Time Series Analysis
Chapter 12: An introduction to Time Series Analysis Introduction In this chapter, we will discuss forecasting with single-series (univariate) Box-Jenkins models. The common name of the models is Auto-Regressive
More informationSTAT 248: EDA & Stationarity Handout 3
STAT 248: EDA & Stationarity Handout 3 GSI: Gido van de Ven September 17th, 2010 1 Introduction Today s section we will deal with the following topics: the mean function, the auto- and crosscovariance
More informationARMA models with time-varying coefficients. Periodic case.
ARMA models with time-varying coefficients. Periodic case. Agnieszka Wy lomańska Hugo Steinhaus Center Wroc law University of Technology ARMA models with time-varying coefficients. Periodic case. 1 Some
More informationReview Session: Econometrics - CLEFIN (20192)
Review Session: Econometrics - CLEFIN (20192) Part II: Univariate time series analysis Daniele Bianchi March 20, 2013 Fundamentals Stationarity A time series is a sequence of random variables x t, t =
More informationAdaptive wavelet decompositions of stochastic processes and some applications
Adaptive wavelet decompositions of stochastic processes and some applications Vladas Pipiras University of North Carolina at Chapel Hill SCAM meeting, June 1, 2012 (joint work with G. Didier, P. Abry)
More informationA note on a Marčenko-Pastur type theorem for time series. Jianfeng. Yao
A note on a Marčenko-Pastur type theorem for time series Jianfeng Yao Workshop on High-dimensional statistics The University of Hong Kong, October 2011 Overview 1 High-dimensional data and the sample covariance
More informationARMA (and ARIMA) models are often expressed in backshift notation.
Backshift Notation ARMA (and ARIMA) models are often expressed in backshift notation. B is the backshift operator (also called the lag operator ). It operates on time series, and means back up by one time
More informationSTA 6857 Autocorrelation and Cross-Correlation & Stationary Time Series ( 1.4, 1.5)
STA 6857 Autocorrelation and Cross-Correlation & Stationary Time Series ( 1.4, 1.5) Outline 1 Announcements 2 Autocorrelation and Cross-Correlation 3 Stationary Time Series 4 Homework 1c Arthur Berg STA
More informationAR, MA and ARMA models
AR, MA and AR by Hedibert Lopes P Based on Tsay s Analysis of Financial Time Series (3rd edition) P 1 Stationarity 2 3 4 5 6 7 P 8 9 10 11 Outline P Linear Time Series Analysis and Its Applications For
More informationEcon 424 Time Series Concepts
Econ 424 Time Series Concepts Eric Zivot January 20 2015 Time Series Processes Stochastic (Random) Process { 1 2 +1 } = { } = sequence of random variables indexed by time Observed time series of length
More informationA Diagnostic for Seasonality Based Upon Autoregressive Roots
A Diagnostic for Seasonality Based Upon Autoregressive Roots Tucker McElroy (U.S. Census Bureau) 2018 Seasonal Adjustment Practitioners Workshop April 26, 2018 1 / 33 Disclaimer This presentation is released
More informationEcon 623 Econometrics II Topic 2: Stationary Time Series
1 Introduction Econ 623 Econometrics II Topic 2: Stationary Time Series In the regression model we can model the error term as an autoregression AR(1) process. That is, we can use the past value of the
More informationTime Series Outlier Detection
Time Series Outlier Detection Tingyi Zhu July 28, 2016 Tingyi Zhu Time Series Outlier Detection July 28, 2016 1 / 42 Outline Time Series Basics Outliers Detection in Single Time Series Outlier Series Detection
More information7. MULTIVARATE STATIONARY PROCESSES
7. MULTIVARATE STATIONARY PROCESSES 1 1 Some Preliminary Definitions and Concepts Random Vector: A vector X = (X 1,..., X n ) whose components are scalar-valued random variables on the same probability
More informationSome Time-Series Models
Some Time-Series Models Outline 1. Stochastic processes and their properties 2. Stationary processes 3. Some properties of the autocorrelation function 4. Some useful models Purely random processes, random
More informationParametric Signal Modeling and Linear Prediction Theory 1. Discrete-time Stochastic Processes
Parametric Signal Modeling and Linear Prediction Theory 1. Discrete-time Stochastic Processes Electrical & Computer Engineering North Carolina State University Acknowledgment: ECE792-41 slides were adapted
More information6.3 Forecasting ARMA processes
6.3. FORECASTING ARMA PROCESSES 123 6.3 Forecasting ARMA processes The purpose of forecasting is to predict future values of a TS based on the data collected to the present. In this section we will discuss
More informationTIME SERIES AND FORECASTING. Luca Gambetti UAB, Barcelona GSE Master in Macroeconomic Policy and Financial Markets
TIME SERIES AND FORECASTING Luca Gambetti UAB, Barcelona GSE 2014-2015 Master in Macroeconomic Policy and Financial Markets 1 Contacts Prof.: Luca Gambetti Office: B3-1130 Edifici B Office hours: email:
More informationSTAD57 Time Series Analysis. Lecture 23
STAD57 Time Series Analysis Lecture 23 1 Spectral Representation Spectral representation of stationary {X t } is: 12 i2t Xt e du 12 1/2 1/2 for U( ) a stochastic process with independent increments du(ω)=
More informationEC402: Serial Correlation. Danny Quah Economics Department, LSE Lent 2015
EC402: Serial Correlation Danny Quah Economics Department, LSE Lent 2015 OUTLINE 1. Stationarity 1.1 Covariance stationarity 1.2 Explicit Models. Special cases: ARMA processes 2. Some complex numbers.
More informationAssessing the dependence of high-dimensional time series via sample autocovariances and correlations
Assessing the dependence of high-dimensional time series via sample autocovariances and correlations Johannes Heiny University of Aarhus Joint work with Thomas Mikosch (Copenhagen), Richard Davis (Columbia),
More informationTime series and spectral analysis. Peter F. Craigmile
Time series and spectral analysis Peter F. Craigmile http://www.stat.osu.edu/~pfc/ Summer School on Extreme Value Modeling and Water Resources Universite Lyon 1, France. 13-24 Jun 2016 Thank you to The
More informationTime Series Analysis
Time Series Analysis hm@imm.dtu.dk Informatics and Mathematical Modelling Technical University of Denmark DK-2800 Kgs. Lyngby 1 Outline of the lecture Chapter 9 Multivariate time series 2 Transfer function
More informationE 4101/5101 Lecture 6: Spectral analysis
E 4101/5101 Lecture 6: Spectral analysis Ragnar Nymoen 3 March 2011 References to this lecture Hamilton Ch 6 Lecture note (on web page) For stationary variables/processes there is a close correspondence
More informationIntroduction to ARMA and GARCH processes
Introduction to ARMA and GARCH processes Fulvio Corsi SNS Pisa 3 March 2010 Fulvio Corsi Introduction to ARMA () and GARCH processes SNS Pisa 3 March 2010 1 / 24 Stationarity Strict stationarity: (X 1,
More informationExercises - Time series analysis
Descriptive analysis of a time series (1) Estimate the trend of the series of gasoline consumption in Spain using a straight line in the period from 1945 to 1995 and generate forecasts for 24 months. Compare
More informationA time series is a set of observations made sequentially in time.
Time series and spectral analysis Peter F. Craigmile Analyzing time series A time series is a set of observations made sequentially in time. R. A. Fisher: One damned thing after another. Time series analysis
More informationTemporal aggregation of stationary and nonstationary discrete-time processes
Temporal aggregation of stationary and nonstationary discrete-time processes HENGHSIU TSAI Institute of Statistical Science, Academia Sinica, Taipei, Taiwan 115, R.O.C. htsai@stat.sinica.edu.tw K. S. CHAN
More informationIdentifiability, Invertibility
Identifiability, Invertibility Defn: If {ǫ t } is a white noise series and µ and b 0,..., b p are constants then X t = µ + b 0 ǫ t + b ǫ t + + b p ǫ t p is a moving average of order p; write MA(p). Q:
More informationWavelet domain test for long range dependence in the presence of a trend
Wavelet domain test for long range dependence in the presence of a trend Agnieszka Jach and Piotr Kokoszka Utah State University July 24, 27 Abstract We propose a test to distinguish a weakly dependent
More informationTime Series Models and Inference. James L. Powell Department of Economics University of California, Berkeley
Time Series Models and Inference James L. Powell Department of Economics University of California, Berkeley Overview In contrast to the classical linear regression model, in which the components of the
More informationLecture 3 Stationary Processes and the Ergodic LLN (Reference Section 2.2, Hayashi)
Lecture 3 Stationary Processes and the Ergodic LLN (Reference Section 2.2, Hayashi) Our immediate goal is to formulate an LLN and a CLT which can be applied to establish sufficient conditions for the consistency
More informationHeavy Tailed Time Series with Extremal Independence
Heavy Tailed Time Series with Extremal Independence Rafa l Kulik and Philippe Soulier Conference in honour of Prof. Herold Dehling Bochum January 16, 2015 Rafa l Kulik and Philippe Soulier Regular variation
More informationThe sample autocorrelations of financial time series models
The sample autocorrelations of financial time series models Richard A. Davis 1 Colorado State University Thomas Mikosch University of Groningen and EURANDOM Abstract In this chapter we review some of the
More information