Detection of structural breaks in multivariate time series
Detection of structural breaks in multivariate time series
Holger Dette, Ruhr-Universität Bochum
Philip Preuß, Ruhr-Universität Bochum
Ruprecht Puchstein, Ruhr-Universität Bochum
January 14, 2014
Outline
1. Motivation
2. Piecewise stationary processes
3. Testing for stationarity
4. Detecting structural breaks
5. Finite sample properties
Switch in variance
Figure: 1024 realizations of an independent series, where the variance switches from 1 to 4.
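A series like the one in the figure can be simulated in a few lines (an illustrative sketch; the Gaussian noise and the seed are our own choices):

```python
import numpy as np

rng = np.random.default_rng(0)
T = 1024
# independent Gaussian noise whose variance jumps from 1 to 4 at mid-sample
x = np.concatenate([rng.normal(0.0, 1.0, T // 2),      # standard deviation 1
                    rng.normal(0.0, 2.0, T - T // 2)])  # standard deviation 2
```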
AR(1) model with structural break
Figure: realizations of an AR(1) process where the parameter switches from 0.9 to
A less obvious example
Figure: realizations of an AR(1) process where the parameter switches from 0.7 to
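Such a coefficient switch can be generated as follows (a sketch; the post-break value 0.5 is a placeholder, since the actual value is cut off in the transcript, and a longer sample is used so the change in autocorrelation is visible):

```python
import numpy as np

def ar1_with_break(T, phi1, phi2, rng):
    """Simulate an AR(1) process whose coefficient switches from phi1 to phi2
    at mid-sample: X_t = phi(t) * X_{t-1} + Z_t with standard normal Z_t."""
    x = np.zeros(T)
    for t in range(1, T):
        phi = phi1 if t < T // 2 else phi2
        x[t] = phi * x[t - 1] + rng.normal()
    return x

x = ar1_with_break(4096, 0.7, 0.5, np.random.default_rng(1))
```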
The goal of this talk
(1) Do there exist structural breaks (in the autocovariance structure)?
(2) If yes, how many of them are present?
(3) Where are the break points located?
(4) In the multivariate setting: in which components do breaks occur?
This talk gives answers to these problems:
- The number of change points is unknown.
- Structural breaks can occur in different components of the time series (these will be identified).
Some references in the one-dimensional case
Detecting change points in the second-order structure [like the variance or the parameters in an AR(p) model] has found considerable interest.
- Inclan and Tiao (1994, JASA), Chen and Gupta (1997, JASA) and Lee and Park (2001, SJoS) test for one change point in the variance.
- Lee et al. (2003, SJoS) try to detect one change in specific parameters [like the AR(1) parameter].
- Binary segmentation is used to detect multiple change points; it works, but is far from optimal.
- Davis et al. (2006, JASA) propose an algorithm to fit piecewise AR(p) processes.
Multivariate case
There is much less literature in the multivariate case.
- Aue et al. (2009, AoS) propose a test for detecting one structural break in the variance matrix.
- Cho and Fryzlewicz (2013) develop an algorithm to segment parts with different second-order characteristics.
Stationarity
Consider a centered R^d-valued time series {X_t}_{t∈Z}.
Question: Does
Γ(t, h) := E[X_{t+h} X_t^H] = [γ_{ij}(t, h)]_{i,j=1,...,d}
change over time for some h ∈ Z?
Move to the frequency domain: does
f(t, λ) = (1/2π) Σ_{h∈Z} Γ(t, h) exp(−iλh)
change over time?
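As a small illustration of this time-domain/frequency-domain link, the spectral density of a univariate MA(1) process X_t = Z_t + θ Z_{t−1} can be computed either from the MA polynomial or from the autocovariance sum (a sketch with unit-variance noise; θ = 0.5 is an arbitrary choice):

```python
import numpy as np

theta = 0.5
lam = np.linspace(0.0, np.pi, 7)

# transfer-function form: f(lambda) = |1 + theta e^{-i lambda}|^2 / (2 pi)
f_transfer = np.abs(1.0 + theta * np.exp(-1j * lam)) ** 2 / (2 * np.pi)

# autocovariance form: gamma(0) = 1 + theta^2, gamma(+-1) = theta, rest zero,
# so the sum over h collapses to gamma(0) + 2 * gamma(1) * cos(lambda)
f_acov = (1.0 + theta**2 + 2.0 * theta * np.cos(lam)) / (2 * np.pi)
```

Both expressions agree pointwise, which is exactly the identity behind moving between Γ(t, h) and f(t, λ).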
Piecewise stationarity
Model: triangular scheme X_{t,T}, t = 1, ..., T, with a piecewise stationary representation on K+1 intervals, i.e.
X_{t,T} = Σ_{l=0}^∞ Ψ_l^{(1)} Z_{t−l}      if 0 = ⌊b_0 T⌋ < t ≤ ⌊b_1 T⌋,
X_{t,T} = Σ_{l=0}^∞ Ψ_l^{(2)} Z_{t−l}      if ⌊b_1 T⌋ < t ≤ ⌊b_2 T⌋,
...
X_{t,T} = Σ_{l=0}^∞ Ψ_l^{(K+1)} Z_{t−l}    if ⌊b_K T⌋ < t ≤ ⌊b_{K+1} T⌋ = T,
where
- {Z_t}_{t∈Z} ~ N(0, I_d) is white noise (the normality assumption is not necessary),
- Ψ_l^{(1)}, ..., Ψ_l^{(K+1)} ∈ R^{d×d} (l = 0, 1, ...),
- 0 = b_0 < b_1 < ... < b_K < b_{K+1} = 1.
The local spectral density matrix
Compact notation:
X_{t,T} = Σ_{l=0}^∞ Ψ_l(t/T) Z_{t−l},   t = 1, ..., T,
where Ψ_l : [0, 1] → R^{d×d} are piecewise constant functions (on the (same) K+1 intervals (b_0, b_1], (b_1, b_2], ..., (b_K, b_{K+1}]).
Null hypothesis (no change in the second-order characteristics):
H_0 : K = 0.
Local spectral density matrix at time u ∈ [0, 1]:
f(u, λ) = (1/2π) Σ_{l,m=0}^∞ Ψ_l(u) Ψ_m(u)^H exp(−iλ(l−m)).
Motivating the procedure
If K ≥ 1, the spectral density f(u, λ) has points of discontinuity in the u direction. For v ∈ [0, 1] consider
D(v, ω) := lim_{ε→0} ( ∫_0^{ωπ} f(v−ε, λ) dλ − ∫_0^{ωπ} f(v+ε, λ) dλ ),
D(v) := sup_{ω∈[0,1]} ‖D(v, ω)‖ := sup_{ω∈[0,1]} sup_{a,b=1,...,d} |[D(v, ω)]_{a,b}|.
If there is no structural break at time v, then D(v) = 0. For testing the hypothesis H_0 : K = 0, we consider
D := sup_{v∈[0,1]} D(v) = sup_{v,ω∈[0,1]} ‖D(v, ω)‖.
Next steps
In order to test the hypothesis H_0 : K = 0, we
1) construct an empirical version D̂_T(v, ω) of D(v, ω),
2) use D̂_T := sup_{v,ω∈[0,1]} ‖D̂_T(v, ω)‖ as an estimator of D,
3) use the AR(∞) bootstrap to estimate the (1−α)-quantile q_{1−α} of D̂_T under H_0 and reject H_0 if D̂_T > q̂_{1−α}.
Estimating D = sup_{v,ω∈[0,1]} ‖D(v, ω)‖
We estimate f(u, λ) by the local periodogram
I_N(u, λ) := (1/2πN) Σ_{r,s=0}^{N−1} X_{⌊uT⌋−N/2+1+r,T} X^H_{⌊uT⌋−N/2+1+s,T} exp(−iλ(r−s)).
Note: I_N(u, λ) is not consistent for f(u, λ) (but for D(v, ω) we only need averages)!
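A direct implementation for the univariate case d = 1 might look as follows (an illustrative sketch; the window covers N observations centred at ⌊uT⌋, a slight simplification of the slide's indexing):

```python
import numpy as np

def local_periodogram(x, u, N):
    """I_N(u, lambda_k) at the Fourier frequencies lambda_k = 2*pi*k/N,
    k = 0, ..., N-1, computed from the N observations centred at floor(u*T);
    x is a univariate series of length T."""
    T = len(x)
    start = int(u * T) - N // 2          # 0-based index of the window start
    seg = x[start:start + N]
    lam = 2 * np.pi * np.arange(N) / N
    # DFT of the window: J(lambda_k) = sum_r seg[r] * exp(-i lambda_k r)
    J = seg @ np.exp(-1j * np.outer(np.arange(N), lam))
    return np.abs(J) ** 2 / (2 * np.pi * N)

rng = np.random.default_rng(0)
x = rng.normal(size=512)
I = local_periodogram(x, 0.5, 128)
```

By Parseval's identity, (2π/N) Σ_k I_N(u, λ_k) equals the mean square of the windowed observations, which is the kind of average the procedure relies on.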
We estimate
D(v, ω) = lim_{ε→0} ( ∫_0^{ωπ} f(v−ε, λ) dλ − ∫_0^{ωπ} f(v+ε, λ) dλ )
by Riemann sums, that is
D̂_T(v, ω) := (2π/N) Σ_{k=1}^{⌊ωN/2⌋} ( I_N(v − N/(2T), λ_k) − I_N(v + N/(2T), λ_k) ),
where λ_k := 2πk/N.
Note: this estimate is consistent!
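Putting the pieces together for d = 1 (an illustrative sketch; the two windows sit immediately left and right of ⌊vT⌋, matching the v ∓ N/(2T) convention above):

```python
import numpy as np

def periodogram(seg):
    # periodogram of one window at lambda_k = 2*pi*k/N, k = 1, ..., N/2
    N = len(seg)
    k = np.arange(1, N // 2 + 1)
    J = np.exp(-2j * np.pi * np.outer(k, np.arange(N)) / N) @ seg
    return np.abs(J) ** 2 / (2 * np.pi * N)

def D_hat(x, v, omega, N):
    """Riemann-sum estimate of D(v, omega): cumulated spectral mass on
    [0, omega*pi] just left of v minus the same quantity just right of v."""
    c = int(v * len(x))
    left, right = periodogram(x[c - N:c]), periodogram(x[c:c + N])
    kmax = int(omega * N / 2)
    return 2 * np.pi / N * np.sum(left[:kmax] - right[:kmax])

rng = np.random.default_rng(0)
x = np.concatenate([rng.normal(0, 1, 1024), rng.normal(0, 2, 1024)])
d = D_hat(x, 0.5, 1.0, 256)   # roughly (1 - 4)/2 = -1.5 for this variance break
```

At the break point the statistic is close to the integrated spectral difference (1 − 4)/2 = −1.5, and it fluctuates around 0 at time points away from the break.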
The statistic
Figure: v ↦ N^γ sup_{ω∈[0,1]} |[D̂_T(v, ω)]_{a,b}| (N = T/8, γ = 0.4).
Asymptotic properties of D̂_T = sup_{v,ω∈[0,1]} ‖D̂_T(v, ω)‖
Theorem. Assume N → ∞ and N/T → c ≥ 0:
a) If K = 0, then N^γ D̂_T = o_P(1) for any 0 < γ < 1/2. If c > 0, then N^{1/2} D̂_T converges weakly to a centered Gaussian process.
b) If K ≥ 1, then there exist constants C ∈ R^+ such that
lim_{T→∞} P( sup_{ω∈[0,1]} |[D̂_T(b_r, ω)]_{a,b}| > C ) = 1
for all (r, a, b) ∈ {1, ..., K} × {1, ..., d}² with sup_{ω∈[0,1]} |[D(b_r, ω)]_{a,b}| > 0.
Bootstrapping D̂_T
Note: If c = 0, then N^{1/2} D̂_T does not converge weakly! N^{1/2} D̂_T(v_1, ω) and N^{1/2} D̂_T(v_2, ω) are asymptotically uncorrelated!
We estimate the quantiles of D̂_T under the null hypothesis H_0 : K = 0 using an AR(∞) bootstrap [Berg et al. (2011, JSPI), Kreiss et al. (2011, AoS)]. This procedure exploits the fact that every stationary process can be accurately approximated by AR(p) models if p is sufficiently large.
Main idea of the AR(p) bootstrap
(1) We generate bootstrap replicates of a process {X*_t}_{t∈Z} with spectral density
g(λ) = ∫_0^1 f(u, λ) du.
(2) Note that g(λ) is the best approximation of f(u, λ) with respect to the L²-distance
∫_0^1 ∫_{−π}^{π} tr[ (f(u, λ) − g(λ)) (f(u, λ) − g(λ))^H ] dλ du.
(3) The process {X*_t}_{t∈Z} is approximated by an AR(p) process, where p increases with the sample size.
(4) We can prove consistency!
Algorithm for generating replicates D̂*_T of D̂_T
1) Choose p ∈ N and compute an estimator (â_{1,p}, ..., â_{p,p}) of
(a_{1,p}, ..., a_{p,p}) := argmin_{b_{1,p},...,b_{p,p}} tr E[ (X_{t,T} − Σ_{j=1}^p b_{j,p} X_{t−j,T}) (X_{t,T} − Σ_{j=1}^p b_{j,p} X_{t−j,T})^H ].
2) Set X*_{t,T} = X_{t,T} for t = 1, ..., p.
3) Calculate
X*_{t,T} = Σ_{j=1}^p â_{j,p} X*_{t−j,T} + Σ̂_p^{1/2} Z*_t   for t > p,
where the Z*_t are independent N(0, I_d) distributed and
Σ̂_p = (1/(T−p)) Σ_{j=p+1}^T (ẑ_j − z̄_T)(ẑ_j − z̄_T)^H,   z̄_T := (1/(T−p)) Σ_{j=p+1}^T ẑ_j,
ẑ_j := X_{j,T} − Σ_{i=1}^p â_{i,p} X_{j−i,T}   for j = p+1, ..., T.
4) Define D̂*_T(v, ω) as D̂_T(v, ω), but with the X_{t,T} replaced by their bootstrap replicates X*_{t,T}.
5) Define D̂*_T := sup_{(v,ω)∈[0,1]²} ‖D̂*_T(v, ω)‖.
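Steps 1)–3) can be sketched for the univariate case with a Yule–Walker fit (our own choice of estimator; the slides only require a least-squares-type fit of the AR(p) coefficients):

```python
import numpy as np

def yule_walker(x, p):
    """Step 1 (univariate): AR(p) coefficients and innovation variance
    from the sample autocovariances."""
    n = len(x)
    xc = x - x.mean()
    g = np.array([xc[:n - h] @ xc[h:] / n for h in range(p + 1)])
    R = np.array([[g[abs(i - j)] for j in range(p)] for i in range(p)])
    a = np.linalg.solve(R, g[1:])
    return a, g[0] - a @ g[1:]

def bootstrap_replicate(x, p, rng):
    """Steps 2-3: keep the first p observations, then recurse through the
    fitted AR(p) model with fresh Gaussian innovations."""
    a, s2 = yule_walker(x, p)
    xb = np.empty(len(x))
    xb[:p] = x[:p]
    for t in range(p, len(x)):
        xb[t] = a @ xb[t - p:t][::-1] + np.sqrt(s2) * rng.normal()
    return xb

rng = np.random.default_rng(0)
x = np.zeros(4000)
for t in range(1, 4000):
    x[t] = 0.6 * x[t - 1] + rng.normal()
a, s2 = yule_walker(x, 1)
xb = bootstrap_replicate(x, 1, rng)
```

Applying the test statistic to many such replicates yields the bootstrap sample from which the critical value is taken.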
Algorithm for testing H_0
1) Calculate the test statistic D̂_T from the observed data {X_{1,T}, ..., X_{T,T}}.
2) Choose p ∈ N and determine estimates (â_{1,p}, ..., â_{p,p}, Σ̂_p) which fit an AR(p) model to the observed data.
3) Generate B ∈ N replicates D̂*_{T,i}, i = 1, ..., B.
4) Estimate the (1−α)-quantile of D̂_T by the corresponding empirical quantile (D̂*_T)_{(⌈(1−α)B⌉)} of the sample {D̂*_{T,1}, ..., D̂*_{T,B}}.
5) Reject H_0 if D̂_T > (D̂*_T)_{(⌈(1−α)B⌉)}.
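The decision in steps 4)–5) uses the ⌈(1−α)B⌉-th order statistic of the bootstrap sample as the critical value; a sketch with hypothetical inputs:

```python
import numpy as np

def bootstrap_test(stat, boot_stats, alpha=0.05):
    """Reject H0 iff the observed statistic exceeds the empirical
    (1 - alpha)-quantile, i.e. the ceil((1 - alpha) * B)-th order statistic
    of the bootstrap replicates."""
    B = len(boot_stats)
    k = int(np.ceil((1 - alpha) * B))
    q = np.sort(boot_stats)[k - 1]
    return stat > q, q

# toy bootstrap sample of B = 100 replicate values
reject, q = bootstrap_test(2.3, np.arange(1, 101) / 50.0, alpha=0.05)
```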
Summary
This yields an asymptotic level-α test for the null hypothesis of no structural breaks. We have to choose:
- the window length N,
- the AR dimension p.
Take p as the minimizer of the AIC criterion. How to choose N? In just a few minutes (this problem is partially open)! Finite sample properties? At the end of this talk!
Follow-up questions
If H_0 has been rejected, the following questions arise:
1) How many break points are there? Construction of an estimator K̂ for K.
2) Where are the break points located? Construction of an estimator (b̂_1, ..., b̂_K̂). How can the data {X_{t,T}}_{t=1,...,T} be subdivided into stationary segments?
3) In which components do the breaks occur?
Main idea
Under the null hypothesis of no structural breaks,
N^γ sup_{v,ω∈[0,1]} ‖D̂_T(v, ω)‖ = o_P(1)
holds (for 0 < γ < 1/2), while
N^γ sup_{ω∈[0,1]} ‖D̂_T(b_i, ω)‖ → ∞
for all b_i, i = 1, ..., K, if K ≥ 1.
Identify a (possible) structural break in component (a, b) at time point v if
N^γ sup_{ω∈[0,1]} |[D̂_T(v, ω)]_{a,b}| > ε_{T,a,b}(v)   (1)
for some thresholding sequence ε_{T,a,b}(v) = o(N^γ) satisfying
lim inf_{T→∞} ε_{T,a,b}(v) ≥ C > 0.
Choice of ε_{T,a,b}
Identify a (possible) structural break in component (a, b) at time point v if
N^γ sup_{ω∈[0,1]} |[D̂_T(v, ω)]_{a,b}| > ε_{T,a,b}(v)
(otherwise too many break points are flagged!). We choose
ε_{T,a,b}(v) = 2 M_{T,a,b}(v, 1) √( log( d(d+1)T / (2N) ) ),
where
M_{T,a,b}(v, ω) = (1/N) Σ_{k=1}^{⌊ωN⌋} √( [I_{2N}(v, λ_{k,2N})]_{aa} [I_{2N}(v, λ_{k,2N})]_{bb} ).
Example
Figure: v ↦ N^γ sup_{ω∈[0,1]} |[D̂_T(v, ω)]_{a,b}| with γ = 0.4, N = T/8.
Localization of structural breaks
(1) B̂_P = {b̂_1, ..., b̂_K̂}: potential break points in {N/T, (N+1)/T, ..., (T−N)/T}; B̂_D = ∅: detected break points.
(2) Add the element b̃ ∈ B̂_P to the set B̂_D for which
sup_{(a,b)∈{1,...,d}²} sup_{ω∈[0,1]} N^γ |[D̂_T(b̃, ω)]_{a,b}|
is maximal, and replace the set B̂_P by B̂_P \ [b̃ − N/T, b̃ + N/T].
(3) Repeat step (2) until B̂_P = ∅.
K̂ := |B̂_D|, and (b̂_1, ..., b̂_K̂) are the different elements of B̂_D.
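The greedy selection in steps (2)–(3) can be sketched as follows, taking the candidate time points and their detection scores N^γ sup_ω |[D̂_T]_{a,b}| as given inputs:

```python
def localize_breaks(times, scores, N_over_T):
    """Repeatedly pick the candidate with the largest detection score and
    drop every remaining candidate within N/T of it (steps 2-3)."""
    cand = list(zip(times, scores))
    detected = []
    while cand:
        b, _ = max(cand, key=lambda c: c[1])          # strongest candidate
        detected.append(b)
        cand = [c for c in cand if abs(c[0] - b) > N_over_T]  # exclusion window
    return sorted(detected)

# hypothetical candidates: two clusters around 0.2 and 0.5, window N/T = 0.1
breaks = localize_breaks([0.20, 0.25, 0.50, 0.55], [3.0, 1.0, 5.0, 2.0], 0.1)
```

The exclusion window of width N/T on each side prevents one physical break from being reported several times by neighbouring candidates.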
Asymptotic properties (informal)
Theorem. Assume lim inf_{T→∞} ε_{T,a,b}(v) ≥ C > 0 and ε_{T,a,b}(v) = o(N^γ). Then:
(a) The probability that the decision rule indicates a structural break although there is none vanishes asymptotically.
(b) The probability that the procedure detects all structural breaks (and the corresponding components) converges to 1.
Regularization
The detection algorithm requires the choice of γ and N. We choose γ = 0.4. How to choose N (this work is not finished)?
- If there are only a few structural breaks, N should be rather large.
- If there exist many break points with small distances, N should be small.
Good choice for the test: N = T/2.
Regularization
One more example:
Good choice for the test: N = T/4.
Good choice for the detection procedure: N = T/8.
Data-driven choice of N
1) Use a set of even integers satisfying √T ≤ N_1 < N_2 < ... < N_n ≤ T^{5/6}.
2) Estimate for each N_i the number K̂_T(N_i) of break points.
3) Define i* := sup{ i ∈ {2, ..., n} : K̂_T(N_{i−1}) ≠ K̂_T(N_i) } (here sup ∅ := −∞) and
N* = N_{i*} if i* ≠ −∞,   N* = N_n if i* = −∞.
4) Choose
N = 2N* for the test for structural breaks,
N = N* for the estimation of the number and location of break points.
5) In the applications (FFT): N_i = 2^{⌊log_2(√T)⌋+i}.
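Steps 2)–4) can be sketched as follows; the fallback to the largest N_i when the estimated break count never changes is our reading of the transcript (an assumption), and the break counts K̂_T(N_i) are taken as given inputs:

```python
def choose_window(Ns, K_hats):
    """Pick N* = N_{i*}, where i* is the largest index at which the estimated
    break count changes between successive window sizes; fall back to the
    largest window if it never changes (assumed convention).
    Returns (N for the test, N for the detection step)."""
    changes = [i for i in range(1, len(Ns)) if K_hats[i] != K_hats[i - 1]]
    N_star = Ns[changes[-1]] if changes else Ns[-1]
    return 2 * N_star, N_star

# hypothetical counts: the estimate stabilises at 2 breaks from N = 64 on
N_test, N_detect = choose_window([16, 32, 64, 128], [5, 3, 2, 2])
```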
Size of the test (univariate)
X_t = Z_t + 0.5 Z_{t−1}   (2)
X_t = 0.5 X_{t−1} + Z_t   (3)
Table: empirical rejection frequencies of the bootstrap test in models (2) and (3) at the 5% and 10% level for different T.
Size of the test (multivariate)
X_t = Z_t + [[θ_1, θ_2], [θ_2, θ_1]] Z_{t−1}   (4)
X_t = [[φ_1, φ_2], [φ_2, φ_1]] X_{t−1} + Z_t   (5)
Table: empirical rejection frequencies of the bootstrap test in models (4) and (5) at the 5% and 10% level, for θ = (0.3, 0.1), θ = (−0.5, 0.1), φ = (0.3, 0.1), φ = (−0.5, 0.1) and different T.
Power of the test
X_{t,T} = Σ_{l=0}^K 1_{(⌊b_l T⌋, ⌊b_{l+1} T⌋]}(t) [[θ_l, 0.2], [0.2, θ_l]] Z_{t−1} + Z_t   (6)
X_{t,T} = Σ_{l=0}^K 1_{(⌊b_l T⌋, ⌊b_{l+1} T⌋]}(t) [[φ_l, 0.2], [0.2, φ_l]] X_{t−1,T} + Z_t   (7)
X_{t,T} = Σ_{l=0}^K 1_{(⌊b_l T⌋, ⌊b_{l+1} T⌋]}(t) [[σ_l, 0.2], [0.2, σ_l]] Z_t   (8)
Table: empirical rejection frequencies of the new test and the test of Aue et al. (2009) in models (6)–(8) for T = 128, 256, 512 and various break configurations b and parameter values.
Performance of the detection rule – white noise model
X_{t,T} = Σ_{j=1}^4 1_{((j−1)T/4, jT/4]}(t) Θ_j Z_t,
with 2×2 matrices Θ_1, ..., Θ_4, where {Z_t}_{t∈Z} is a two-dimensional Gaussian white noise process. Three changes, at T/4, T/2 and 3T/4.
Figure: empirical distribution of b̂ = (b̂_1, ..., b̂_K̂) based on 100 simulation runs for sample sizes T ∈ {512, 1024, 2048}. Left: new procedure. Right: Davis et al. (2006).
Performance of the detection rule – AR(2) model
X_{t,T} = A_1(t) X_{t−1,T} + A_2(t) X_{t−2,T} + Z_t,
with 2×2 coefficient matrices A_1(t), A_2(t) that change at t = T/2 and t = 3T/4, where {Z_t}_{t∈Z} is a two-dimensional centered Gaussian process. Two changes, at T/2 and 3T/4.
Figure: empirical distribution of b̂ = (b̂_1, ..., b̂_K̂) based on 100 simulation runs for sample sizes T ∈ {512, 1024, 2048}. Left: new procedure. Right: Davis et al. (2006).
Two locally stationary MA(1) models
Figure: empirical distribution of b̂ = (b̂_1, ..., b̂_K̂) based on 100 simulation runs for sample sizes T ∈ {512, 1024, 2048}. Left: new procedure. Right: Davis et al. (2006).
Multivariate data example
We consider sector ETFs:
Symbol | Name | Sector
1 XLB | Materials Select Sector SPDR | Commodities
2 XLP | Consumer Staples Select Sector SPDR | Consumer Staples
3 XLU | Utilities Select Sector SPDR | Utilities
4 XLF | Financial Select Sector SPDR | Financials
5 XLE | Energy Select Sector SPDR | Energy
and their log returns X_{t,j} := log(Y_{t,j} / Y_{t−1,j}), with Y_{t,j} denoting the adjusted closing price at time t of sector ETF j.
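Computing the log returns from a price matrix Y (rows = days, columns = the five ETFs) is a one-liner; the prices below are made-up illustration values:

```python
import numpy as np

def log_returns(Y):
    """X_{t,j} = log(Y_{t,j} / Y_{t-1,j}) for a (T x d) matrix of prices Y."""
    return np.diff(np.log(np.asarray(Y, dtype=float)), axis=0)

X = log_returns([[100.0, 50.0],
                 [110.0, 45.0],
                 [105.0, 45.0]])
```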
Multivariate data example
Figure: plot of the functions v ↦ N^γ sup_{ω∈[0,1]} |[D̂_T(v, ω)]_{a,b}| for a, b = 1, ..., 5.
Multivariate data example
Figure: plot of the log returns for the sector ETFs.
Summary
- A test for the presence of structural breaks in multivariate time series.
- Estimation of the number of structural breaks.
- Estimation of the locations and components where structural breaks occur.
- Outperforms the common binary segmentation approach if more than one break point is present.
- The methodology is based on the assumption of a piecewise stationary multivariate process, but it can be generalized to processes with different locally stationary behavior on different segments, using
D(v, ω) := lim_{ε→0} (1/ε) ( ∫_0^{ωπ} ∫_v^{v+ε} f(u, λ) du dλ − ∫_0^{ωπ} ∫_{v−ε}^v f(u, λ) du dλ ).
Future work:
- Investigate the choice of regularization parameters (in particular N).
- How to adjust the procedure if the dimension d is large (growing)?
- Multiscale inference?
Based on: Philip Preuß, Ruprecht Puchstein, Holger Dette, "Detection of multiple structural breaks in multivariate time series", arXiv:1309.1309 [math.ST], Ruhr-Universität Bochum, Fakultät für Mathematik, 44780 Bochum, Germany.
More informationSingle Equation Linear GMM with Serially Correlated Moment Conditions
Single Equation Linear GMM with Serially Correlated Moment Conditions Eric Zivot November 2, 2011 Univariate Time Series Let {y t } be an ergodic-stationary time series with E[y t ]=μ and var(y t )
More informationProf. Dr. Roland Füss Lecture Series in Applied Econometrics Summer Term Introduction to Time Series Analysis
Introduction to Time Series Analysis 1 Contents: I. Basics of Time Series Analysis... 4 I.1 Stationarity... 5 I.2 Autocorrelation Function... 9 I.3 Partial Autocorrelation Function (PACF)... 14 I.4 Transformation
More informationBickel Rosenblatt test
University of Latvia 28.05.2011. A classical Let X 1,..., X n be i.i.d. random variables with a continuous probability density function f. Consider a simple hypothesis H 0 : f = f 0 with a significance
More informationBooth School of Business, University of Chicago Business 41914, Spring Quarter 2017, Mr. Ruey S. Tsay. Solutions to Midterm
Booth School of Business, University of Chicago Business 41914, Spring Quarter 017, Mr Ruey S Tsay Solutions to Midterm Problem A: (51 points; 3 points per question) Answer briefly the following questions
More informationChapter 12: An introduction to Time Series Analysis. Chapter 12: An introduction to Time Series Analysis
Chapter 12: An introduction to Time Series Analysis Introduction In this chapter, we will discuss forecasting with single-series (univariate) Box-Jenkins models. The common name of the models is Auto-Regressive
More informationDiscussion of High-dimensional autocovariance matrices and optimal linear prediction,
Electronic Journal of Statistics Vol. 9 (2015) 1 10 ISSN: 1935-7524 DOI: 10.1214/15-EJS1007 Discussion of High-dimensional autocovariance matrices and optimal linear prediction, Xiaohui Chen University
More informationEXTENDED GLRT DETECTORS OF CORRELATION AND SPHERICITY: THE UNDERSAMPLED REGIME. Xavier Mestre 1, Pascal Vallet 2
EXTENDED GLRT DETECTORS OF CORRELATION AND SPHERICITY: THE UNDERSAMPLED REGIME Xavier Mestre, Pascal Vallet 2 Centre Tecnològic de Telecomunicacions de Catalunya, Castelldefels, Barcelona (Spain) 2 Institut
More informationOrthogonal samples for estimators in time series
Orthogonal samples for estimators in time series Suhasini Subba Rao Department of Statistics, Texas A&M University College Station, TX, U.S.A. suhasini@stat.tamu.edu May 10, 2017 Abstract Inference for
More informationDefine y t+h t as the forecast of y t+h based on I t known parameters. The forecast error is. Forecasting
Forecasting Let {y t } be a covariance stationary are ergodic process, eg an ARMA(p, q) process with Wold representation y t = X μ + ψ j ε t j, ε t ~WN(0,σ 2 ) j=0 = μ + ε t + ψ 1 ε t 1 + ψ 2 ε t 2 + Let
More informationA Modified Fractionally Co-integrated VAR for Predicting Returns
A Modified Fractionally Co-integrated VAR for Predicting Returns Xingzhi Yao Marwan Izzeldin Department of Economics, Lancaster University 13 December 215 Yao & Izzeldin (Lancaster University) CFE (215)
More informationRegime switching models
Regime switching models Structural change and nonlinearities Matthieu Stigler Matthieu.Stigler at gmail.com April 30, 2009 Version 1.1 This document is released under the Creative Commons Attribution-Noncommercial
More informationDiscrete time processes
Discrete time processes Predictions are difficult. Especially about the future Mark Twain. Florian Herzog 2013 Modeling observed data When we model observed (realized) data, we encounter usually the following
More informationVast Volatility Matrix Estimation for High Frequency Data
Vast Volatility Matrix Estimation for High Frequency Data Yazhen Wang National Science Foundation Yale Workshop, May 14-17, 2009 Disclaimer: My opinion, not the views of NSF Y. Wang (at NSF) 1 / 36 Outline
More informationEcon 583 Final Exam Fall 2008
Econ 583 Final Exam Fall 2008 Eric Zivot December 11, 2008 Exam is due at 9:00 am in my office on Friday, December 12. 1 Maximum Likelihood Estimation and Asymptotic Theory Let X 1,...,X n be iid random
More informationFall 2017 STAT 532 Homework Peter Hoff. 1. Let P be a probability measure on a collection of sets A.
1. Let P be a probability measure on a collection of sets A. (a) For each n N, let H n be a set in A such that H n H n+1. Show that P (H n ) monotonically converges to P ( k=1 H k) as n. (b) For each n
More informationA Conditional Approach to Modeling Multivariate Extremes
A Approach to ing Multivariate Extremes By Heffernan & Tawn Department of Statistics Purdue University s April 30, 2014 Outline s s Multivariate Extremes s A central aim of multivariate extremes is trying
More informationStatistics of Stochastic Processes
Prof. Dr. J. Franke All of Statistics 4.1 Statistics of Stochastic Processes discrete time: sequence of r.v...., X 1, X 0, X 1, X 2,... X t R d in general. Here: d = 1. continuous time: random function
More informationPermanent Income Hypothesis (PIH) Instructor: Dmytro Hryshko
Permanent Income Hypothesis (PIH) Instructor: Dmytro Hryshko 1 / 36 The PIH Utility function is quadratic, u(c t ) = 1 2 (c t c) 2 ; borrowing/saving is allowed using only the risk-free bond; β(1 + r)
More informationSTAT Financial Time Series
STAT 6104 - Financial Time Series Chapter 4 - Estimation in the time Domain Chun Yip Yau (CUHK) STAT 6104:Financial Time Series 1 / 46 Agenda 1 Introduction 2 Moment Estimates 3 Autoregressive Models (AR
More informationLong memory and changing persistence
Long memory and changing persistence Robinson Kruse and Philipp Sibbertsen August 010 Abstract We study the empirical behaviour of semi-parametric log-periodogram estimation for long memory models when
More informationHeavy Tailed Time Series with Extremal Independence
Heavy Tailed Time Series with Extremal Independence Rafa l Kulik and Philippe Soulier Conference in honour of Prof. Herold Dehling Bochum January 16, 2015 Rafa l Kulik and Philippe Soulier Regular variation
More informationMidterm Suggested Solutions
CUHK Dept. of Economics Spring 2011 ECON 4120 Sung Y. Park Midterm Suggested Solutions Q1 (a) In time series, autocorrelation measures the correlation between y t and its lag y t τ. It is defined as. ρ(τ)
More informationOn detection of unit roots generalizing the classic Dickey-Fuller approach
On detection of unit roots generalizing the classic Dickey-Fuller approach A. Steland Ruhr-Universität Bochum Fakultät für Mathematik Building NA 3/71 D-4478 Bochum, Germany February 18, 25 1 Abstract
More informationModel Selection and Geometry
Model Selection and Geometry Pascal Massart Université Paris-Sud, Orsay Leipzig, February Purpose of the talk! Concentration of measure plays a fundamental role in the theory of model selection! Model
More informationDuration-Based Volatility Estimation
A Dual Approach to RV Torben G. Andersen, Northwestern University Dobrislav Dobrev, Federal Reserve Board of Governors Ernst Schaumburg, Northwestern Univeristy CHICAGO-ARGONNE INSTITUTE ON COMPUTATIONAL
More informationSTT 843 Key to Homework 1 Spring 2018
STT 843 Key to Homework Spring 208 Due date: Feb 4, 208 42 (a Because σ = 2, σ 22 = and ρ 2 = 05, we have σ 2 = ρ 2 σ σ22 = 2/2 Then, the mean and covariance of the bivariate normal is µ = ( 0 2 and Σ
More informationStochastic Processes
Stochastic Processes Stochastic Process Non Formal Definition: Non formal: A stochastic process (random process) is the opposite of a deterministic process such as one defined by a differential equation.
More informationMultivariate Time Series: VAR(p) Processes and Models
Multivariate Time Series: VAR(p) Processes and Models A VAR(p) model, for p > 0 is X t = φ 0 + Φ 1 X t 1 + + Φ p X t p + A t, where X t, φ 0, and X t i are k-vectors, Φ 1,..., Φ p are k k matrices, with
More informationEmpirical likelihood and self-weighting approach for hypothesis testing of infinite variance processes and its applications
Empirical likelihood and self-weighting approach for hypothesis testing of infinite variance processes and its applications Fumiya Akashi Research Associate Department of Applied Mathematics Waseda University
More informationAssociation studies and regression
Association studies and regression CM226: Machine Learning for Bioinformatics. Fall 2016 Sriram Sankararaman Acknowledgments: Fei Sha, Ameet Talwalkar Association studies and regression 1 / 104 Administration
More informationAdjusted Empirical Likelihood for Long-memory Time Series Models
Adjusted Empirical Likelihood for Long-memory Time Series Models arxiv:1604.06170v1 [stat.me] 21 Apr 2016 Ramadha D. Piyadi Gamage, Wei Ning and Arjun K. Gupta Department of Mathematics and Statistics
More informationTime Series Analysis
Time Series Analysis Christopher Ting http://mysmu.edu.sg/faculty/christophert/ christopherting@smu.edu.sg Quantitative Finance Singapore Management University March 3, 2017 Christopher Ting Week 9 March
More informationIf we want to analyze experimental or simulated data we might encounter the following tasks:
Chapter 1 Introduction If we want to analyze experimental or simulated data we might encounter the following tasks: Characterization of the source of the signal and diagnosis Studying dependencies Prediction
More informationQuick Review on Linear Multiple Regression
Quick Review on Linear Multiple Regression Mei-Yuan Chen Department of Finance National Chung Hsing University March 6, 2007 Introduction for Conditional Mean Modeling Suppose random variables Y, X 1,
More informationNon-Stationary Time Series and Unit Root Testing
Econometrics II Non-Stationary Time Series and Unit Root Testing Morten Nyboe Tabor Course Outline: Non-Stationary Time Series and Unit Root Testing 1 Stationarity and Deviation from Stationarity Trend-Stationarity
More informationEconometric Forecasting
Robert M. Kunst robert.kunst@univie.ac.at University of Vienna and Institute for Advanced Studies Vienna October 1, 2014 Outline Introduction Model-free extrapolation Univariate time-series models Trend
More informationECON 616: Lecture 1: Time Series Basics
ECON 616: Lecture 1: Time Series Basics ED HERBST August 30, 2017 References Overview: Chapters 1-3 from Hamilton (1994). Technical Details: Chapters 2-3 from Brockwell and Davis (1987). Intuition: Chapters
More informationNotes on Time Series Modeling
Notes on Time Series Modeling Garey Ramey University of California, San Diego January 17 1 Stationary processes De nition A stochastic process is any set of random variables y t indexed by t T : fy t g
More informationTest for Parameter Change in ARIMA Models
Test for Parameter Change in ARIMA Models Sangyeol Lee 1 Siyun Park 2 Koichi Maekawa 3 and Ken-ichi Kawai 4 Abstract In this paper we consider the problem of testing for parameter changes in ARIMA models
More informationStochastic Processes: I. consider bowl of worms model for oscilloscope experiment:
Stochastic Processes: I consider bowl of worms model for oscilloscope experiment: SAPAscope 2.0 / 0 1 RESET SAPA2e 22, 23 II 1 stochastic process is: Stochastic Processes: II informally: bowl + drawing
More informationFinancial Econometrics and Quantitative Risk Managenent Return Properties
Financial Econometrics and Quantitative Risk Managenent Return Properties Eric Zivot Updated: April 1, 2013 Lecture Outline Course introduction Return definitions Empirical properties of returns Reading
More informationWild Binary Segmentation for multiple change-point detection
for multiple change-point detection Piotr Fryzlewicz p.fryzlewicz@lse.ac.uk Department of Statistics, London School of Economics, UK Isaac Newton Institute, 14 January 2014 Segmentation in a simple function
More informationHETEROSCEDASTICITY AND AUTOCORRELATION ROBUST STRUCTURAL CHANGE DETECTION. By Zhou Zhou 1 University of Toronto February 1, 2013.
HETEROSCEDASTICITY AND AUTOCORRELATION ROBUST STRUCTURAL CHANGE DETECTION By Zhou Zhou 1 University of Toronto February 1, 2013 Abstract The assumption of (weak) stationarity is crucial for the validity
More informationYou must continuously work on this project over the course of four weeks.
The project Five project topics are described below. You should choose one the projects. Maximum of two people per project is allowed. If two people are working on a topic they are expected to do double
More informationA time series is called strictly stationary if the joint distribution of every collection (Y t
5 Time series A time series is a set of observations recorded over time. You can think for example at the GDP of a country over the years (or quarters) or the hourly measurements of temperature over a
More information