Tests for separability in nonparametric covariance operators of random surfaces


1 Tests for separability in nonparametric covariance operators of random surfaces Shahin Tavakoli (joint with John Aston and Davide Pigoli) April 19, 2016

2 Analysis of Multidimensional Functional Data Shahin Tavakoli (Cambridge) Testing Separability April 19, / 48

3 Functional Data Analysis / Spatial Statistics: Typical Framework
1. We want to probe the law of X, a random function in a Hilbert space (X ∈ L²([0,1]^d, ℝ)).
2. We observe independent and identically distributed realizations X_1, …, X_n.
First-order structure: the mean function µ(t) = E[X(t)], t ∈ [0,1]^d; t ↦ X(t) is the parametrization.
Second-order structure: the covariance function, resp. covariance operator,
    c(t, s) = cov(X(t), X(s)),  t, s ∈ [0,1]^d,
    (Cf)(t) = ∫_{[0,1]^d} c(t, s) f(s) ds,  f ∈ L²([0,1]^d, ℝ).

4 FDA: Covariance is Crucial
Karhunen-Loève expansion (basis for fPCA): let X be a random element of L²([0,1]^d, ℝ), assume E‖X‖² < ∞, and let
    C = Σ_{n≥1} λ_n φ_n ⊗ φ_n
be the spectral decomposition of its covariance operator. The Karhunen-Loève (KL) expansion of X is
    X − µ = Σ_{n=1}^∞ ξ_n φ_n,  in L²,
where Eξ_n = 0, E[ξ_n ξ_m] = δ_{n,m} λ_n, and ξ_n = ⟨X − µ, φ_n⟩. The truncation error satisfies
    E‖X − µ − Σ_{n=1}^q ξ_n φ_n‖² = Σ_{n>q} λ_n.
- Yields a separation of variables: {stochastic} + {functional}.
- Uncorrelates: the ξ_n are uncorrelated, and the φ_n are orthogonal.
- Optimal linear finite-dimensional approximations (fPCA: µ + Σ_{n=1}^q ξ_n φ_n); technology transfer from multivariate analysis (e.g. modeling, inference, classification).
- In inference problems, the eigenfunctions form the natural basis for regularization.
- Each φ_j accounts for a fraction λ_j / Σ_{n≥1} λ_n of the total variation.
- Consistent estimation of the eigenstructure of the covariance operator (Dauxois, Pousse & Romain, 1982).
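The separation of variables and the truncation identity E‖X − µ − Σ_{n≤q} ξ_n φ_n‖² = Σ_{n>q} λ_n can be checked numerically on a discretized toy sample; a minimal numpy sketch (the grid size and spectrum below are made up):

```python
import numpy as np

# toy discretized sample with a fast-decaying spectrum; sizes are illustrative
rng = np.random.default_rng(6)
n, m, k = 2000, 20, 5
evals = 1.0 / np.arange(1, m + 1) ** 2
X = rng.standard_normal((n, m)) * np.sqrt(evals)

Xc = X - X.mean(axis=0)
C = Xc.T @ Xc / n                       # sample covariance
lam, Phi = np.linalg.eigh(C)            # eigh returns ascending order
lam, Phi = lam[::-1], Phi[:, ::-1]      # flip so lam[0] is the largest

scores = Xc @ Phi[:, :k]                # KL scores xi_1, ..., xi_k
Xk = scores @ Phi[:, :k].T              # rank-k (fPCA) reconstruction

# truncation error equals the tail eigenvalue sum, exactly as in the KL expansion
mse = np.mean(np.sum((Xc - Xk) ** 2, axis=1))
assert np.isclose(mse, lam[k:].sum())
# scores are uncorrelated, with variances lam_1, ..., lam_k
assert np.allclose(scores.T @ scores / n, np.diag(lam[:k]))
```

For the empirical eigenpairs the two identities hold exactly (up to floating point), which is a convenient sanity check when implementing fPCA.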

5 FDA: Estimation of the Covariance
In practice the covariance is unknown, but it can be estimated by the sample covariance
    Ĉ_n = n⁻¹ Σ_{i=1}^n (X_i − X̄) ⊗ (X_i − X̄),
given an i.i.d. sample X_1, …, X_n.
Asymptotics (as n → ∞):
- Consistency: Ĉ_n → C almost surely, in trace norm.
- CLT: √n (Ĉ_n − C) → N(0, Γ) in distribution, w.r.t. the Hilbert-Schmidt norm, where
    Γ = E[ ((X − µ) ⊗ (X − µ) − C) ⊗ ((X − µ) ⊗ (X − µ) − C) ].
- The eigenstructure of C is consistently estimated as a by-product!

6 FDA: Estimation of the Covariance
In practice the covariance is unknown, but it can be estimated by the sample covariance
    Ĉ_n = n⁻¹ Σ_{i=1}^n (X_i − X̄) ⊗ (X_i − X̄),
given an i.i.d. sample X_1, …, X_n.
Issues:
- Computational complexity: if X ∈ L²([0,1]², ℝ) is represented by m = p · q basis functions (think m-dimensional vector), then the covariance is O(m²)-dimensional.
- Statistical accuracy: compared to the number of unknown parameters, the sample size is orders of magnitude smaller. E.g. p = q = 100 gives m(m+1)/2 ≈ 5 × 10⁷ unknown parameters, but usually n is of the order of 100.

7 Workaround: the Separability Assumption
Definition: if X ∈ L²([−S,S]^d × [0,T], ℝ) is random, with covariance
    C(s, t, s′, t′) = cov(X(s,t), X(s′,t′)),
then the covariance is separable if
    C(s, t, s′, t′) = C₁(s, s′) C₂(t, t′).
Under the separability assumption:
- Estimation of C is much faster and more accurate.
- The eigenfunctions of C are given by φ_i(s) ψ_j(t), where the φ_i, ψ_j are the eigenfunctions of C₁, resp. C₂.
Even if separability does not hold, working under this assumption provides:
- a biased estimator of the covariance, but with lower variance;
- still a valid basis, though not the optimal one.
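On a finite grid a separable covariance is a Kronecker product, so the product structure of its eigenvalues and eigenfunctions can be verified directly; a small numpy sketch with arbitrary toy marginal covariances:

```python
import numpy as np

# on a grid, a separable covariance is a Kronecker product, and its
# eigenvalues are products of the marginal eigenvalues (likewise for
# eigenvectors); C1, C2 below are arbitrary toy marginal covariances
rng = np.random.default_rng(1)
p, q = 4, 3
A = rng.standard_normal((p, p)); C1 = A @ A.T
B = rng.standard_normal((q, q)); C2 = B @ B.T

C = np.kron(C1, C2)                     # separable covariance on the p*q grid

lam = np.linalg.eigvalsh(C1)            # marginal eigenvalues lambda_i
gam = np.linalg.eigvalsh(C2)            # marginal eigenvalues gamma_j
prod = np.sort(np.outer(lam, gam).ravel())
assert np.allclose(np.sort(np.linalg.eigvalsh(C)), prod)
```

This is the discrete counterpart of the statement that the eigenfunctions of C are the products φ_i(s) ψ_j(t).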

8 Covariance: Memory Size (in R)
If X is represented by m = p × p basis functions (think m-dimensional vector), then the covariance is O(m²)-dimensional.
[Figure: memory footprint (GB, log scale) of the full covariance as a function of p, with reference lines for a slice of fMRI, full fMRI, structural MRI, 16 GB of RAM, a 1 TB hard drive, and the computational limit.]

9 Covariance: Memory Size (in R)
If X is represented by m = p × p basis functions (think m-dimensional vector), then the covariance is O(m²)-dimensional.
[Figure: as on the previous slide, with the memory footprint of the separable covariance added for comparison.]
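The storage counts behind these figures are plain arithmetic; a sketch assuming 8-byte floats (the particular p values are illustrative):

```python
# storage of the covariance for a p x p surface grid, assuming 8-byte floats
def full_gb(p):
    m = p * p                   # X lives on a p x p grid, i.e. an m-dimensional vector
    return m * m * 8 / 1e9      # m^2 covariance entries

def separable_gb(p):
    return 2 * p * p * 8 / 1e9  # two p x p marginal covariances

print(full_gb(100))       # 0.8 GB
print(full_gb(300))       # 64.8 GB, already beyond 16 GB of RAM
print(separable_gb(300))  # 0.00144 GB
```

The full covariance grows like p⁴ while the separable one grows like p², which is why the two curves in the figure diverge so quickly.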

10 Example
X_1, …, X_n i.i.d. copies of X ∈ ℝ^100, e.g. a surface on a 10 × 10 grid. In general, the covariance has 100 · 101 / 2 = 5050 degrees of freedom. Under separability, this reduces to 2 · (10 · 11 / 2) = 110 degrees of freedom!
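The two counts on this slide follow from the symmetric-matrix parameter count m(m+1)/2, assuming (as the 5050 and 110 suggest) that the example is a 10 × 10 grid:

```python
# degrees-of-freedom count behind the slide's numbers, assuming the example
# is a 10 x 10 grid (so X is a vector in R^100)
def sym_dof(m):
    return m * (m + 1) // 2   # free parameters of a symmetric m x m matrix

full = sym_dof(100)           # full covariance of X in R^100
separable = 2 * sym_dof(10)   # two 10 x 10 marginal covariances

print(full, separable)        # 5050 110
```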

11 What this talk is about
The eigenstructure of the covariance is crucial in functional data analysis, but in some cases it is not computable. Separability can help.

12 What this talk is about
The eigenstructure of the covariance is crucial in functional data analysis, but in some cases it is not computable. Separability can help.
Is separability a valid assumption? How can we test the separability assumption,
- without computing the full sample covariance,
- without parametric assumptions on the covariance (or the marginal covariances)?

13

14 Mathematical Framework
Tensor product of Hilbert spaces:
    L²([−S,S]^d × [0,T], ℝ) = L²([−S,S]^d, ℝ) ⊗ L²([0,T], ℝ),
which corresponds to saying that X(s,t) = Σ_{i≥1} f_i(s) g_i(t).
General formulation of the problem: X ∈ H₁ ⊗ H₂ is a random element. If E‖X‖² < ∞, then its covariance operator C is trace-class: C ∈ S₁(H₁ ⊗ H₂).
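On a finite grid, the expansion X(s,t) = Σ_i f_i(s) g_i(t) is just the singular value decomposition of the surface viewed as a matrix; a toy numpy illustration (the surface is made up):

```python
import numpy as np

# a surface sampled on a p x q grid, viewed as a matrix X(s_i, t_j)
rng = np.random.default_rng(7)
p, q = 6, 4
Xmat = rng.standard_normal((p, q))

# SVD gives the finite-dimensional tensor-product expansion
U, sv, Vt = np.linalg.svd(Xmat, full_matrices=False)
F = U * sv                              # columns play the role of the f_i
G = Vt                                  # rows play the role of the g_i

assert np.allclose(F @ G, Xmat)         # sum_i f_i(s) g_i(t) reconstructs X exactly
```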

15 Separability
Given operators C₁ ∈ S₁(H₁) and C₂ ∈ S₁(H₂), we define C₁ ⊗ C₂ ∈ S₁(H₁ ⊗ H₂) on simple tensors:
    (C₁ ⊗ C₂)(u ⊗ v) = C₁u ⊗ C₂v,  u ∈ H₁, v ∈ H₂.
Hence
    (C₁ ⊗ C₂)(X(s,t)) = Σ_{i≥1} C₁(f_i)(s) · C₂(g_i)(t).
Separability (think "independence"): an operator C ∈ S₁(H₁ ⊗ H₂) is called separable if
    C = C₁ ⊗ C₂,  for some C₁ ∈ S₁(H₁), C₂ ∈ S₁(H₂).

16 Separability
Given operators C₁ ∈ S₁(H₁) and C₂ ∈ S₁(H₂), we define C₁ ⊗ C₂ ∈ S₁(H₁ ⊗ H₂) on simple tensors:
    (C₁ ⊗ C₂)(u ⊗ v) = C₁u ⊗ C₂v,  u ∈ H₁, v ∈ H₂.
Separability (think "independence"): an operator C ∈ S₁(H₁ ⊗ H₂) is called separable if C = C₁ ⊗ C₂ for some C₁ ∈ S₁(H₁), C₂ ∈ S₁(H₂).
How could we approximate an arbitrary C ∈ S₁(H₁ ⊗ H₂) by a separable operator?

17 Partial Traces
Partial trace (think "marginal distribution"): we denote by Tr₁ : S₁(H₁ ⊗ H₂) → S₁(H₂) the unique continuous linear operator satisfying
    Tr₁(A₁ ⊗ A₂) = Tr(A₁) A₂,  A₁ ∈ S₁(H₁), A₂ ∈ S₁(H₂).
Likewise, Tr₂ : S₁(H₁ ⊗ H₂) → S₁(H₁).
L² version: if H₁ ⊗ H₂ = L²([−S,S]^d × [0,T], ℝ) = L²([−S,S]^d, ℝ) ⊗ L²([0,T], ℝ), and C is a trace-class operator on that space with kernel C(s, t, s′, t′), then
    Tr₁(C)(t, t′) = ∫_{[−S,S]^d} C(s, t, s, t′) ds.
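For a covariance stored as a 4-way array C[i,j,k,l] = Cov(X(s_i,t_j), X(s_k,t_l)), the partial traces are sums over a repeated index; an illustrative numpy version (grid sizes made up):

```python
import numpy as np

# partial traces of a covariance stored as a 4-way array
# C[i, j, k, l] = Cov(X(s_i, t_j), X(s_k, t_l)); the covariance is a toy example
rng = np.random.default_rng(2)
p, q = 5, 4
M = rng.standard_normal((p * q, p * q))
C = (M @ M.T).reshape(p, q, p, q)       # a valid (PSD) covariance on a p x q grid

Tr1 = np.einsum('ijil->jl', C)          # trace out the first argument -> q x q
Tr2 = np.einsum('ijkj->ik', C)          # trace out the second argument -> p x p

# both partial traces preserve the full trace of C
full_trace = np.einsum('ijij->', C)
assert np.isclose(np.trace(Tr1), full_trace)
assert np.isclose(np.trace(Tr2), full_trace)
```

In practice one would compute Tr₁ and Tr₂ directly from the data, without ever forming the 4-way array; the einsum form above simply mirrors the integral defining Tr₁(C)(t,t′).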

18 Partial Traces and Separable Approximation
Characterization of separability: an operator C ∈ S₁(H₁ ⊗ H₂) is separable if and only if
    Tr(C) · C = Tr₂(C) ⊗ Tr₁(C).
Separable approximation: we define the separable approximation of C ∈ S₁(H₁ ⊗ H₂) by
    Sep(C) = Tr₂(C) ⊗ Tr₁(C) / Tr(C) = (Tr₂(C)/√Tr(C)) ⊗ (Tr₁(C)/√Tr(C)),
provided Tr(C) ≠ 0. Notice that Sep is not linear, but it is continuous at any C with Tr(C) ≠ 0.
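The characterization can be checked numerically: for a separable C, the separable approximation recovers C exactly. A small sketch on a grid (the marginal covariances are toy examples):

```python
import numpy as np

def sep_approx(C, p, q):
    """Sep(C) = Tr2(C) (x) Tr1(C) / Tr(C), with C a (p*q) x (p*q) covariance
    on a p x q grid (row-major vectorization)."""
    C4 = C.reshape(p, q, p, q)
    Tr1 = np.einsum('ijil->jl', C4)     # partial trace over the first factor
    Tr2 = np.einsum('ijkj->ik', C4)     # partial trace over the second factor
    return np.kron(Tr2, Tr1) / np.trace(C)

# sanity check of the characterization: for a separable C, Sep(C) = C
rng = np.random.default_rng(3)
p, q = 4, 3
A = rng.standard_normal((p, p)); C1 = A @ A.T   # toy marginal covariances
B = rng.standard_normal((q, q)); C2 = B @ B.T
C = np.kron(C1, C2)

assert np.allclose(sep_approx(C, p, q), C)
```

Note that Sep always preserves the trace: Tr(Sep(C)) = Tr(Tr₂(C)) · Tr(Tr₁(C)) / Tr(C) = Tr(C).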

19 Testing Separability
Let H₀ : C = C₁ ⊗ C₂, and let Ĉ_n be the sample covariance operator based on X_1, …, X_n.
Full test: base a test on
    D_n = √n (Ĉ_n − Sep(Ĉ_n)).
But this requires computing the full sample covariance!

20 Testing Separability
Let H₀ : C = C₁ ⊗ C₂, and let Ĉ_n be the sample covariance operator based on X_1, …, X_n.
Full test: base a test on
    D_n = √n (Ĉ_n − Sep(Ĉ_n)).
But this requires computing the full sample covariance!
This can be avoided by looking at projections along the eigenfunctions of Sep(Ĉ_n)!

21 Testing Separability
Let H₀ : C = C₁ ⊗ C₂, and let Ĉ_n be the sample covariance operator based on X_1, …, X_n.
Projected tests: for integers r, s ≥ 1, let
    T_n(r, s) = ⟨D_n(û_r ⊗ v̂_s), û_r ⊗ v̂_s⟩ = √n ( (1/n) Σ_{k=1}^n ⟨X_k − X̄_n, û_r ⊗ v̂_s⟩² − λ̂_r γ̂_s / Tr(Ĉ_n) ),
where (λ̂_r, û_r) is the r-th eigenvalue/eigenfunction pair of Tr₂(Ĉ_n) and (γ̂_s, v̂_s) is the s-th eigenvalue/eigenfunction pair of Tr₁(Ĉ_n).
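Since Sep(Ĉ_n) = Tr₂(Ĉ_n) ⊗ Tr₁(Ĉ_n) / Tr(Ĉ_n), its quadratic form along û_r ⊗ v̂_s is λ̂_r γ̂_s / Tr(Ĉ_n), which is what the projected statistic subtracts. A toy numerical sketch, computing everything from marginal quantities without ever forming the full (pq) × (pq) sample covariance (sizes and data are made up):

```python
import numpy as np

rng = np.random.default_rng(4)
n, p, q = 500, 6, 5
X = rng.standard_normal((n, p, q))           # toy i.i.d. surfaces on a p x q grid
Xc = X - X.mean(axis=0)

Tr2 = np.einsum('aij,akj->ik', Xc, Xc) / n   # partial trace: p x p, eigenpairs (lam_r, u_r)
Tr1 = np.einsum('aij,ail->jl', Xc, Xc) / n   # partial trace: q x q, eigenpairs (gam_s, v_s)
trC = np.einsum('aij,aij->', Xc, Xc) / n     # Tr(Chat_n)

lam, U = np.linalg.eigh(Tr2)
gam, V = np.linalg.eigh(Tr1)
lam, U = lam[::-1], U[:, ::-1]               # descending order
gam, V = gam[::-1], V[:, ::-1]

def T_stat(r, s):
    # sqrt(n) * ( mean_k <X_k - Xbar, u_r (x) v_s>^2  -  lam_r * gam_s / Tr(Chat_n) )
    proj = np.einsum('aij,i,j->a', Xc, U[:, r], V[:, s])
    return np.sqrt(n) * (np.mean(proj ** 2) - lam[r] * gam[s] / trC)

print(T_stat(0, 0))   # the toy data above are separable, so H0 holds here
```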

22 Asymptotics
Assumption: X ∈ H, and there exists an orthonormal basis (e_i)_{i≥1} of H such that
    Σ_{i=1}^∞ (E⟨X, e_i⟩⁴)^{1/4} < ∞.
(If X is Gaussian, this is equivalent to summability of the square roots of the eigenvalues of C.)
Theorem: let X ∈ H = H₁ ⊗ H₂ be a random element satisfying the Assumption. If the covariance operator C of X is separable and Tr(C) ≠ 0, then
    D_n = √n (Ĉ_n − Sep(Ĉ_n)) → Z in distribution, as n → ∞,
where Z is a Gaussian random element of S₁(H₁ ⊗ H₂) with mean 0 and covariance …

23 Asymptotic Covariance of D_n
The covariance of Z is characterized by E[ Tr((A₁ ⊗ A₂) Z) · Tr((B₁ ⊗ B₂) Z) ] for test operators A₁ ⊗ A₂ and B₁ ⊗ B₂: an explicit, but lengthy, closed-form expression in Γ, C, Tr₁(C) and Tr₂(C). [Full display omitted; see Aston, Pigoli & Tavakoli (2015).]

24 Asymptotics of the Projections
Corollary: under the conditions of the previous Theorem, if I ⊂ {(i,j) : i, j ≥ 1} is a finite set of indices such that λ_r γ_s > 0 for each (r,s) ∈ I, then
    (T_N(r,s))_{(r,s)∈I} → N(0, Σ) in distribution, as N → ∞,
where Σ = (Σ_{(r,s),(r′,s′)})_{(r,s),(r′,s′)∈I} is an explicit expression in Tr(C), Tr(C₁), Tr(C₂) and the moments
    α_{rs} = λ_r γ_s,  β_{ijkl} = E[ ⟨X − µ, u_i ⊗ v_j⟩² ⟨X − µ, u_k ⊗ v_l⟩² ],
with µ = E[X], and where a dot "·" denotes summation over the corresponding index, e.g. β_{·rjk} = Σ_{i≥1} β_{irjk}. [Full display omitted; see Aston, Pigoli & Tavakoli (2015).]

25 Gaussian Setting
Corollary: let T_N(r,s) = ⟨D_N(û_r ⊗ v̂_s), û_r ⊗ v̂_s⟩, where r, s ≥ 1 are fixed. Assume the conditions of the Theorem hold, and that X is Gaussian. If I ⊂ {(i,j) : i, j ≥ 1} is a finite set of indices such that λ_r γ_s > 0 for each (r,s) ∈ I, then (T_N(r,s))_{(r,s)∈I} → N(0, Σ) in distribution, as N → ∞, where
    Σ_{(r,s),(r′,s′)} = [2 λ_r λ_{r′} γ_s γ_{s′} / Tr(C)²] · (δ_{rr′} Tr(C₁)² + ‖C₁‖₂² − (λ_r + λ_{r′}) Tr(C₁)) · (δ_{ss′} Tr(C₂)² + ‖C₂‖₂² − (γ_s + γ_{s′}) Tr(C₂)),
and δ_{ij} = 1 if i = j, and zero otherwise. In particular, notice that Σ itself is separable. Each Σ_{(r,s),(r′,s′)} can be consistently estimated, so we can construct an asymptotically χ² test for separability.

26 Testing Separability in Practice
Base tests on functionals of D_n = √n (Ĉ_n − Sep(Ĉ_n)) and T_n(r,s) = ⟨D_n(û_r ⊗ v̂_s), û_r ⊗ v̂_s⟩:
    G_n(I) = Σ_{(r,s)∈I} T_n²(r,s),
    G_n^a(I) = Σ_{(r,s)∈I} T_n²(r,s) / σ̂²(r,s),
    G̃_n(I) = ‖Σ̂_{L,I}^{−1/2} T_n(I) Σ̂_{R,I}^{−T/2}‖².
Recall that ‖D_n‖₂² = Σ_{i,j≥1} ‖D_n(e_i ⊗ f_j)‖², where (e_i)_{i≥1} is an orthonormal basis of H₁ and (f_j)_{j≥1} is an orthonormal basis of H₂.

27 Testing Separability in Practice
Same statistics as on the previous slide: G_n(I), G_n^a(I), G̃_n(I), and ‖D_n‖₂².
Which approximations of the null distribution are available for which statistic:
                 Asymptotics   Parametric bootstrap   Empirical bootstrap
    G_n(I)            -                 X                      X
    G_n^a(I)          -                 X                      X
    G̃_n(I)            X                 X                      X
    ‖D_n‖₂²           -                 X                      X

28 Testing Separability in Practice
Same statistics and approximations as on the previous slides.
R package covsep available on CRAN & GitHub.

29 Gaussian Scenario; I = {(1, 1)}
[Figure: empirical power vs. γ, for N = 10, 25, 50, 100.]

30 Gaussian Scenario; I = {(r, s) : r, s = 1, 2}
[Figure: empirical power vs. γ, for N = 10, 25, 50, 100.]

31 Application: Acoustic Phonetic Data
Data: log-spectrograms of recorded speech for 23 speakers (k) of 5 different Romance languages (L), pronouncing 10 different words (i) (the numbers from 1 to 10):
    R_{ik}^L(ω, t) = S_{ik}^L(ω, t) − (1/n_i) Σ_{k=1}^{n_i} S_{ik}^L(ω, t).
We consider the residuals of the same language as replicates from the same "acoustic process": R^L(ω, t), on 81 frequency points × 100 time points, i.e. a full covariance structure with about 3.3 × 10⁷ degrees of freedom (8100 · 8101 / 2).

32 Separability of Covariance for Acoustic Phonetic Data
P-value for separability of the covariance of each language,
    cov(R^L(ω, t), R^L(ω′, t′)) = C₁^L(ω, ω′) C₂^L(t, t′).
[Table: p-values by projection set I and language (French, Italian, Portuguese, American Spanish, Iberian Spanish); most entries are reported as "<" a small threshold.]

33 Conclusions
- Tools for testing separability without computing the full covariance operator.
- No parametric assumptions!
- Hilbert-space setup: applicable to multidimensional or multivariate functional data!
- Implemented in the R package covsep (CRAN & GitHub).
- Can be generalized to higher dimensions, i.e. a random element of H₁ ⊗ H₂ ⊗ H₃ with covariance C = C₁ ⊗ C₂ ⊗ C₃ … (have fun!)

34 Thank You for Your Attention
References
[1] Gneiting, T. (2002), Nonseparable, stationary covariance functions for space-time data, Journal of the American Statistical Association 97(458).
[2] Lu, N. & Zimmerman, D. L. (2005), The likelihood ratio test for a separable covariance matrix, Statistics & Probability Letters 73(4).
[3] Fuentes, M. (2006), Testing for separability of spatial-temporal covariance functions, Journal of Statistical Planning and Inference 136(2).
[4] Mitchell, M. W., Genton, M. G. & Gumpertz, M. L. (2005), Testing for separability of space-time covariances, Environmetrics 16(8).
[5] Aston, J., Pigoli, D. & Tavakoli, S. (2015), Tests for separability in nonparametric covariance operators of random surfaces, arXiv preprint, under revision.
[6] Tavakoli, S. (2016), covsep: Tests for Determining if the Covariance Structure of 2-Dimensional Data is Separable, R package.

35

36 References (Fourier Analysis of Functional Time Series)
- Panaretos, V. M. & Tavakoli, S. (2013a), Cramér-Karhunen-Loève Representation and Harmonic Principal Component Analysis of Functional Time Series, Stochastic Processes and their Applications 123(7).
- Panaretos, V. M. & Tavakoli, S. (2013b), Fourier Analysis of Stationary Processes in Function Space, Annals of Statistics 41(2).
- Tavakoli, S. & Panaretos, V. M. (2015), Detecting and Localizing Differences in Functional Time Series Dynamics: A Case Study in Molecular Biophysics, to appear in JASA Applications & Case Studies.
- Tavakoli, S. (2015), ftsspec: Spectral Density Estimation and Comparison for Functional Time Series, R package.

37 Testing Separability in Practice
Recall that T_n(r,s) = ⟨D_n(û_r ⊗ v̂_s), û_r ⊗ v̂_s⟩. Base tests on functionals of D_n = √n (Ĉ_n − Sep(Ĉ_n)):
    G_n(r,s) = T_n²(r,s),    G_n^a(r,s) = T_n²(r,s) / σ̂²(r,s),
    G_n(I) = Σ_{(r,s)∈I} G_n(r,s),    G_n^a(I) = Σ_{(r,s)∈I} G_n^a(r,s).

38 Testing Separability in Practice
Same statistics as on the previous slide.
Devil's advocate: what if the departure from separability is outside of this subspace?

39 Testing Separability in Practice
Devil's advocate: what if the departure from separability is outside of the projection subspace? Then use
    ‖D_n‖₂² = Σ_{i,j≥1} ‖D_n(e_i ⊗ f_j)‖²,
where (e_i)_{i≥1} is an orthonormal basis of H₁ and (f_j)_{j≥1} is an orthonormal basis of H₂: the Hilbert-Schmidt (HS) test.

40 Parametric Bootstrap Approximations
Given X_1, …, X_n:
I. Compute X̄, Sep(Ĉ_n), and H_n = H_n(X_1, …, X_n).
II. For b = 1, …, B:
  1. Create a bootstrap sample X^b = {X_1^b, …, X_n^b}, where the X_i^b are i.i.d. ~ F(X̄, Sep(Ĉ_n)).
  2. Compute H_n^b = H_n(X^b).
III. Compute the estimated bootstrap p-value
    p̂ = (1/B) Σ_{b=1}^B 1{H_n^b > H_n},
where 1{A} = 1 if A is true, and zero otherwise.
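The three steps above can be sketched numerically. This is a hedged illustration, not the paper's implementation (that is the covsep package): it takes the null model F to be Gaussian with mean X̄ and covariance Sep(Ĉ_n), uses the Hilbert-Schmidt statistic n‖Ĉ_n − Sep(Ĉ_n)‖² for H_n, and all sizes and names are made up:

```python
import numpy as np

rng = np.random.default_rng(5)
n, p, q, B = 100, 4, 3, 200
X = rng.standard_normal((n, p * q))          # each row: a vectorized p x q surface

def sep_cov(X):
    # Sep(Chat) = Tr2 (x) Tr1 / Tr, computed from data on a p x q grid
    Xc = (X - X.mean(axis=0)).reshape(-1, p, q)
    Tr2 = np.einsum('aij,akj->ik', Xc, Xc) / len(X)
    Tr1 = np.einsum('aij,ail->jl', Xc, Xc) / len(X)
    return np.kron(Tr2, Tr1) / np.trace(Tr2)  # Tr(Chat) = Tr(Tr2)

def H(X):
    # Hilbert-Schmidt statistic n * ||Chat - Sep(Chat)||^2
    Xc = X - X.mean(axis=0)
    Chat = Xc.T @ Xc / len(X)
    return len(X) * np.sum((Chat - sep_cov(X)) ** 2)

# I. statistic on the data, and the fitted Gaussian null model N(Xbar, Sep(Chat_n))
H_obs = H(X)
L = np.linalg.cholesky(sep_cov(X) + 1e-10 * np.eye(p * q))
# II. resample from the null model and recompute the statistic B times
H_boot = np.array([H(X.mean(axis=0) + rng.standard_normal((n, p * q)) @ L.T)
                   for _ in range(B)])
# III. estimated bootstrap p-value
p_value = np.mean(H_boot > H_obs)
print(p_value)
```

Since the toy data here are generated under a separable (identity) covariance, the p-value should typically not be extreme.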

41 Empirical Bootstrap Approximations
Given X = {X_1, …, X_n}:
I. Compute Sep(Ĉ_n) and H_n = H_n(X_1, …, X_n).
II. For b = 1, …, B:
  1. Create the bootstrap sample X^b = {X_1^b, …, X_n^b} by drawing with repetition from X_1, …, X_n.
  2. For each bootstrap sample, compute H̃_n^b = H̃_n(X, X^b).
III. Compute the estimated bootstrap p-value
    p̂ = (1/B) Σ_{b=1}^B 1{H̃_n^b > H_n},
where 1{A} = 1 if A is true, and zero otherwise.
Recentered statistics H̃_n(X, X^b) corresponding to each H_n:
    G_n(I)   →  Σ_{(i,j)∈I} (T_n^b(i,j) − T_n(i,j))²
    G_n^a(I) →  Σ_{(i,j)∈I} (T_n^b(i,j) − T_n(i,j))² / σ̂²(i,j)
    ‖D_n‖₂²  →  ‖D_n^b − D_n‖₂²

42 Simulation Studies
Generate X_1, …, X_n ∈ ℝ^{32×7} with mean zero and covariance C = C^(γ), where
    C^(γ)(i₁, j₁, i₂, j₂) = (1 − γ) c₁(i₁, i₂) c₂(j₁, j₂) + [γ / (1 + |j₁ − j₂|)] exp{ −((i₁ − i₂)/32)² / (1 + |j₁ − j₂|) },
γ ∈ [0, 1]; i₁, i₂ = 1, …, 32; j₁, j₂ = 1, …, 7.
- γ = 0, 0.01, …, 0.1; n = 10, 25, 50, 100; B = 1000.
- The X_i were simulated with a Gaussian distribution (Gaussian scenario) and a multivariate Student t distribution with 6 degrees of freedom (non-Gaussian scenario).
- Compare G_n(I), G̃_n(I), ‖D_n‖₂² for different sets of indices I.
- 1000 replications for each combination of parameters; level α = 5%.
- About 280 CPU-days of simulations!
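The simulation covariance blends a separable part with a nonseparable, Gneiting-type term. A sketch of the construction, hedged in two ways: the marginals c₁, c₂ are illustrative stand-ins (the slides show them only as figures), and the exact nonseparable form is an assumption reconstructed from the garbled display:

```python
import numpy as np

p, q = 32, 7
i = np.arange(1, p + 1.0)   # i = 1, ..., 32
j = np.arange(1, q + 1.0)   # j = 1, ..., 7

def c1(i1, i2):             # illustrative marginal covariance in the first index
    return np.exp(-((i1 - i2) / 32.0) ** 2)

def c2(j1, j2):             # illustrative marginal covariance in the second index
    return np.exp(-np.abs(j1 - j2) / 7.0)

def C_gamma(gamma):
    I1, J1, I2, J2 = np.meshgrid(i, j, i, j, indexing='ij')
    sep = c1(I1, I2) * c2(J1, J2)
    # Gneiting-type nonseparable term (assumed form)
    nonsep = (np.exp(-((I1 - I2) / 32.0) ** 2 / (1 + np.abs(J1 - J2)))
              / (1 + np.abs(J1 - J2)))
    return (1 - gamma) * sep + gamma * nonsep

C0 = C_gamma(0.0)   # gamma = 0: exactly separable, so Sep(C0) recovers C0
```

At γ = 0 the covariance is exactly separable, and γ smoothly tunes the departure from separability, which is what the power curves are plotted against.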

43 Simulation Studies
c₁, c₂ are given by:
[Figure: the marginal covariances c₁ and c₂ used in the simulations.]

44 Gaussian Scenario; I = {(1, 1)}, N = 10
[Figure: empirical size/power of the CLT, ParBoot, ParBoot*, EmpBoot, EmpBoot*, HS ParBoot and HS EmpBoot tests.]

45 Gaussian Scenario; I = {(1, 1)}, N = 25
[Figure: as above, for N = 25.]

46 Gaussian Scenario; I = {(1, 1)}, N = 50
[Figure: as above, for N = 50.]

47 Gaussian Scenario; I = {(1, 1)}, N = 100
[Figure: as above, for N = 100.]

48 Non-Gaussian Scenario; I = {(1, 1)}, N = 10
[Figure: empirical size/power of the CLT, ParBoot, ParBoot*, EmpBoot, EmpBoot*, HS ParBoot and HS EmpBoot tests.]

49 Non-Gaussian Scenario; I = {(1, 1)}, N = 25
[Figure: as above, for N = 25.]

50 Non-Gaussian Scenario; I = {(1, 1)}, N = 50
[Figure: as above, for N = 50.]

51 Non-Gaussian Scenario; I = {(1, 1)}, N = 100
[Figure: as above, for N = 100.]

52 Increasing Projection Subspaces (Gaussian Scenario)
I₁ = {(1,1)}, I₂ = {(r,s) : r, s = 1, 2}, I₃ = {(r,s) : r = 1, …, 4; s = 1, …, 10}.
[Figure: level/power vs. γ for I₁, I₂, I₃; N = 10.]

53 Increasing Projection Subspaces (Gaussian Scenario)
[Figure: as above; N = 25.]

54 Increasing Projection Subspaces (Gaussian Scenario)
[Figure: as above; N = 50.]

55 Increasing Projection Subspaces (Gaussian Scenario)
[Figure: as above; N = 100.]

56 Increasing Projection Subspaces (non-Gaussian Scenario)
I₁ = {(1,1)}, I₂ = {(r,s) : r, s = 1, 2}, I₃ = {(r,s) : r = 1, …, 4; s = 1, …, 10}.
[Figure: level/power vs. γ for I₁, I₂, I₃; N = 10.]

57 Increasing Projection Subspaces (non-Gaussian Scenario)
[Figure: as above; N = 25.]

58 Increasing Projection Subspaces (non-Gaussian Scenario)
[Figure: as above; N = 50.]

59 Increasing Projection Subspaces (non-Gaussian Scenario)
[Figure: as above; N = 100.]

Published as: Tests for Separability in Nonparametric Covariance Operators of Random Surfaces, The Annals of Statistics 2017, Vol. 45, No. 4, 1431-1461. DOI: 10.1214/16-AOS1495. © Institute of Mathematical Statistics, 2017.


Linear Algebra and Dirac Notation, Pt. 3 Linear Algebra and Dirac Notation, Pt. 3 PHYS 500 - Southern Illinois University February 1, 2017 PHYS 500 - Southern Illinois University Linear Algebra and Dirac Notation, Pt. 3 February 1, 2017 1 / 16

More information

Lecture 11. Multivariate Normal theory

Lecture 11. Multivariate Normal theory 10. Lecture 11. Multivariate Normal theory Lecture 11. Multivariate Normal theory 1 (1 1) 11. Multivariate Normal theory 11.1. Properties of means and covariances of vectors Properties of means and covariances

More information

Model Specification Testing in Nonparametric and Semiparametric Time Series Econometrics. Jiti Gao

Model Specification Testing in Nonparametric and Semiparametric Time Series Econometrics. Jiti Gao Model Specification Testing in Nonparametric and Semiparametric Time Series Econometrics Jiti Gao Department of Statistics School of Mathematics and Statistics The University of Western Australia Crawley

More information

A Note on Hilbertian Elliptically Contoured Distributions

A Note on Hilbertian Elliptically Contoured Distributions A Note on Hilbertian Elliptically Contoured Distributions Yehua Li Department of Statistics, University of Georgia, Athens, GA 30602, USA Abstract. In this paper, we discuss elliptically contoured distribution

More information

Massachusetts Institute of Technology Department of Economics Time Series Lecture 6: Additional Results for VAR s

Massachusetts Institute of Technology Department of Economics Time Series Lecture 6: Additional Results for VAR s Massachusetts Institute of Technology Department of Economics Time Series 14.384 Guido Kuersteiner Lecture 6: Additional Results for VAR s 6.1. Confidence Intervals for Impulse Response Functions There

More information

PCA & ICA. CE-717: Machine Learning Sharif University of Technology Spring Soleymani

PCA & ICA. CE-717: Machine Learning Sharif University of Technology Spring Soleymani PCA & ICA CE-717: Machine Learning Sharif University of Technology Spring 2015 Soleymani Dimensionality Reduction: Feature Selection vs. Feature Extraction Feature selection Select a subset of a given

More information

MIMO Capacities : Eigenvalue Computation through Representation Theory

MIMO Capacities : Eigenvalue Computation through Representation Theory MIMO Capacities : Eigenvalue Computation through Representation Theory Jayanta Kumar Pal, Donald Richards SAMSI Multivariate distributions working group Outline 1 Introduction 2 MIMO working model 3 Eigenvalue

More information

Testing the Equality of Covariance Operators in Functional Samples

Testing the Equality of Covariance Operators in Functional Samples Scandinavian Journal of Statistics, Vol. 4: 38 5, 3 doi:./j.467-9469..796.x Board of the Foundation of the Scandinavian Journal of Statistics. Published by Blackwell Publishing Ltd. Testing the Equality

More information

Detecting Parametric Signals in Noise Having Exactly Known Pdf/Pmf

Detecting Parametric Signals in Noise Having Exactly Known Pdf/Pmf Detecting Parametric Signals in Noise Having Exactly Known Pdf/Pmf Reading: Ch. 5 in Kay-II. (Part of) Ch. III.B in Poor. EE 527, Detection and Estimation Theory, # 5c Detecting Parametric Signals in Noise

More information

Module 9: Stationary Processes

Module 9: Stationary Processes Module 9: Stationary Processes Lecture 1 Stationary Processes 1 Introduction A stationary process is a stochastic process whose joint probability distribution does not change when shifted in time or space.

More information

Vector spaces. DS-GA 1013 / MATH-GA 2824 Optimization-based Data Analysis.

Vector spaces. DS-GA 1013 / MATH-GA 2824 Optimization-based Data Analysis. Vector spaces DS-GA 1013 / MATH-GA 2824 Optimization-based Data Analysis http://www.cims.nyu.edu/~cfgranda/pages/obda_fall17/index.html Carlos Fernandez-Granda Vector space Consists of: A set V A scalar

More information

Distances and inference for covariance operators

Distances and inference for covariance operators Biometrika (2014), 101,2,pp. 409 422 doi: 10.1093/biomet/asu008 Printed in Great Britain Advance Access publication 17 April 2014 Distances and inference for covariance operators BY DAVIDE PIGOLI Department

More information

Independent component analysis for functional data

Independent component analysis for functional data Independent component analysis for functional data Hannu Oja Department of Mathematics and Statistics University of Turku Version 12.8.216 August 216 Oja (UTU) FICA Date bottom 1 / 38 Outline 1 Probability

More information

Basic Operator Theory with KL Expansion

Basic Operator Theory with KL Expansion Basic Operator Theory with Karhunen Loéve Expansion Wafa Abu Zarqa, Maryam Abu Shamala, Samiha Al Aga, Khould Al Qitieti, Reem Al Rashedi Department of Mathematical Sciences College of Science United Arab

More information

Stochastic Spectral Approaches to Bayesian Inference

Stochastic Spectral Approaches to Bayesian Inference Stochastic Spectral Approaches to Bayesian Inference Prof. Nathan L. Gibson Department of Mathematics Applied Mathematics and Computation Seminar March 4, 2011 Prof. Gibson (OSU) Spectral Approaches to

More information

On corrections of classical multivariate tests for high-dimensional data

On corrections of classical multivariate tests for high-dimensional data On corrections of classical multivariate tests for high-dimensional data Jian-feng Yao with Zhidong Bai, Dandan Jiang, Shurong Zheng Overview Introduction High-dimensional data and new challenge in statistics

More information

A Statistical Look at Spectral Graph Analysis. Deep Mukhopadhyay

A Statistical Look at Spectral Graph Analysis. Deep Mukhopadhyay A Statistical Look at Spectral Graph Analysis Deep Mukhopadhyay Department of Statistics, Temple University Office: Speakman 335 deep@temple.edu http://sites.temple.edu/deepstat/ Graph Signal Processing

More information

Modeling Repeated Functional Observations

Modeling Repeated Functional Observations Modeling Repeated Functional Observations Kehui Chen Department of Statistics, University of Pittsburgh, Hans-Georg Müller Department of Statistics, University of California, Davis Supplemental Material

More information

Supplementary Material for Nonparametric Operator-Regularized Covariance Function Estimation for Functional Data

Supplementary Material for Nonparametric Operator-Regularized Covariance Function Estimation for Functional Data Supplementary Material for Nonparametric Operator-Regularized Covariance Function Estimation for Functional Data Raymond K. W. Wong Department of Statistics, Texas A&M University Xiaoke Zhang Department

More information

High Dimensional Covariance and Precision Matrix Estimation

High Dimensional Covariance and Precision Matrix Estimation High Dimensional Covariance and Precision Matrix Estimation Wei Wang Washington University in St. Louis Thursday 23 rd February, 2017 Wei Wang (Washington University in St. Louis) High Dimensional Covariance

More information

Statistical signal processing

Statistical signal processing Statistical signal processing Short overview of the fundamentals Outline Random variables Random processes Stationarity Ergodicity Spectral analysis Random variable and processes Intuition: A random variable

More information

Implementation of Sparse Wavelet-Galerkin FEM for Stochastic PDEs

Implementation of Sparse Wavelet-Galerkin FEM for Stochastic PDEs Implementation of Sparse Wavelet-Galerkin FEM for Stochastic PDEs Roman Andreev ETH ZÜRICH / 29 JAN 29 TOC of the Talk Motivation & Set-Up Model Problem Stochastic Galerkin FEM Conclusions & Outlook Motivation

More information

Learning gradients: prescriptive models

Learning gradients: prescriptive models Department of Statistical Science Institute for Genome Sciences & Policy Department of Computer Science Duke University May 11, 2007 Relevant papers Learning Coordinate Covariances via Gradients. Sayan

More information

Continuous Probability Distributions from Finite Data. Abstract

Continuous Probability Distributions from Finite Data. Abstract LA-UR-98-3087 Continuous Probability Distributions from Finite Data David M. Schmidt Biophysics Group, Los Alamos National Laboratory, Los Alamos, New Mexico 87545 (August 5, 1998) Abstract Recent approaches

More information

Complex random vectors and Hermitian quadratic forms

Complex random vectors and Hermitian quadratic forms Complex random vectors and Hermitian quadratic forms Gilles Ducharme*, Pierre Lafaye de Micheaux** and Bastien Marchina* * Université Montpellier II, I3M - EPS ** Université de Montréal, DMS 26 march 2013

More information

Sampling and Low-Rank Tensor Approximations

Sampling and Low-Rank Tensor Approximations Sampling and Low-Rank Tensor Approximations Hermann G. Matthies Alexander Litvinenko, Tarek A. El-Moshely +, Brunswick, Germany + MIT, Cambridge, MA, USA wire@tu-bs.de http://www.wire.tu-bs.de $Id: 2_Sydney-MCQMC.tex,v.3

More information

Polynomial Chaos and Karhunen-Loeve Expansion

Polynomial Chaos and Karhunen-Loeve Expansion Polynomial Chaos and Karhunen-Loeve Expansion 1) Random Variables Consider a system that is modeled by R = M(x, t, X) where X is a random variable. We are interested in determining the probability of the

More information

Lecture 3: Central Limit Theorem

Lecture 3: Central Limit Theorem Lecture 3: Central Limit Theorem Scribe: Jacy Bird (Division of Engineering and Applied Sciences, Harvard) February 8, 003 The goal of today s lecture is to investigate the asymptotic behavior of P N (εx)

More information

Likelihood Ratio Tests. that Certain Variance Components Are Zero. Ciprian M. Crainiceanu. Department of Statistical Science

Likelihood Ratio Tests. that Certain Variance Components Are Zero. Ciprian M. Crainiceanu. Department of Statistical Science 1 Likelihood Ratio Tests that Certain Variance Components Are Zero Ciprian M. Crainiceanu Department of Statistical Science www.people.cornell.edu/pages/cmc59 Work done jointly with David Ruppert, School

More information

[y i α βx i ] 2 (2) Q = i=1

[y i α βx i ] 2 (2) Q = i=1 Least squares fits This section has no probability in it. There are no random variables. We are given n points (x i, y i ) and want to find the equation of the line that best fits them. We take the equation

More information

Announcements (repeat) Principal Components Analysis

Announcements (repeat) Principal Components Analysis 4/7/7 Announcements repeat Principal Components Analysis CS 5 Lecture #9 April 4 th, 7 PA4 is due Monday, April 7 th Test # will be Wednesday, April 9 th Test #3 is Monday, May 8 th at 8AM Just hour long

More information

Maximum variance formulation

Maximum variance formulation 12.1. Principal Component Analysis 561 Figure 12.2 Principal component analysis seeks a space of lower dimensionality, known as the principal subspace and denoted by the magenta line, such that the orthogonal

More information

Supplementary Material for Testing separability of space-time functional processes

Supplementary Material for Testing separability of space-time functional processes Biometrika (year), volume, number, pp. 20 Printed in Great Britain Advance Access publication on date Supplementary Material for Testing separability of space-time functional processes BY P. CONSTANTINOU

More information

Nonparametric Inference In Functional Data

Nonparametric Inference In Functional Data Nonparametric Inference In Functional Data Zuofeng Shang Purdue University Joint work with Guang Cheng from Purdue Univ. An Example Consider the functional linear model: Y = α + where 1 0 X(t)β(t)dt +

More information

Hierarchical Modeling for Univariate Spatial Data

Hierarchical Modeling for Univariate Spatial Data Hierarchical Modeling for Univariate Spatial Data Geography 890, Hierarchical Bayesian Models for Environmental Spatial Data Analysis February 15, 2011 1 Spatial Domain 2 Geography 890 Spatial Domain This

More information

A Framework for Daily Spatio-Temporal Stochastic Weather Simulation

A Framework for Daily Spatio-Temporal Stochastic Weather Simulation A Framework for Daily Spatio-Temporal Stochastic Weather Simulation, Rick Katz, Balaji Rajagopalan Geophysical Statistics Project Institute for Mathematics Applied to Geosciences National Center for Atmospheric

More information

Chapter 2: Fundamentals of Statistics Lecture 15: Models and statistics

Chapter 2: Fundamentals of Statistics Lecture 15: Models and statistics Chapter 2: Fundamentals of Statistics Lecture 15: Models and statistics Data from one or a series of random experiments are collected. Planning experiments and collecting data (not discussed here). Analysis:

More information

Joint Factor Analysis for Speaker Verification

Joint Factor Analysis for Speaker Verification Joint Factor Analysis for Speaker Verification Mengke HU ASPITRG Group, ECE Department Drexel University mengke.hu@gmail.com October 12, 2012 1/37 Outline 1 Speaker Verification Baseline System Session

More information

Functional Data Analysis

Functional Data Analysis FDA 1-1 Functional Data Analysis Michal Benko Institut für Statistik und Ökonometrie Humboldt-Universität zu Berlin email:benko@wiwi.hu-berlin.de FDA 1-2 Outline of this talk: Introduction Turning discrete

More information

Large sample covariance matrices and the T 2 statistic

Large sample covariance matrices and the T 2 statistic Large sample covariance matrices and the T 2 statistic EURANDOM, the Netherlands Joint work with W. Zhou Outline 1 2 Basic setting Let {X ij }, i, j =, be i.i.d. r.v. Write n s j = (X 1j,, X pj ) T and

More information

LECTURE 2 LINEAR REGRESSION MODEL AND OLS

LECTURE 2 LINEAR REGRESSION MODEL AND OLS SEPTEMBER 29, 2014 LECTURE 2 LINEAR REGRESSION MODEL AND OLS Definitions A common question in econometrics is to study the effect of one group of variables X i, usually called the regressors, on another

More information

Lecture 1: Center for Uncertainty Quantification. Alexander Litvinenko. Computation of Karhunen-Loeve Expansion:

Lecture 1: Center for Uncertainty Quantification. Alexander Litvinenko. Computation of Karhunen-Loeve Expansion: tifica Lecture 1: Computation of Karhunen-Loeve Expansion: Alexander Litvinenko http://sri-uq.kaust.edu.sa/ Stochastic PDEs We consider div(κ(x, ω) u) = f (x, ω) in G, u = 0 on G, with stochastic coefficients

More information

Asymptotic Multivariate Kriging Using Estimated Parameters with Bayesian Prediction Methods for Non-linear Predictands

Asymptotic Multivariate Kriging Using Estimated Parameters with Bayesian Prediction Methods for Non-linear Predictands Asymptotic Multivariate Kriging Using Estimated Parameters with Bayesian Prediction Methods for Non-linear Predictands Elizabeth C. Mannshardt-Shamseldin Advisor: Richard L. Smith Duke University Department

More information

Geostatistical Modeling for Large Data Sets: Low-rank methods

Geostatistical Modeling for Large Data Sets: Low-rank methods Geostatistical Modeling for Large Data Sets: Low-rank methods Whitney Huang, Kelly-Ann Dixon Hamil, and Zizhuang Wu Department of Statistics Purdue University February 22, 2016 Outline Motivation Low-rank

More information

Symmetry and Separability In Spatial-Temporal Processes

Symmetry and Separability In Spatial-Temporal Processes Symmetry and Separability In Spatial-Temporal Processes Man Sik Park, Montserrat Fuentes Symmetry and Separability In Spatial-Temporal Processes 1 Motivation In general, environmental data have very complex

More information

Karhunen-Loève decomposition of Gaussian measures on Banach spaces

Karhunen-Loève decomposition of Gaussian measures on Banach spaces Karhunen-Loève decomposition of Gaussian measures on Banach spaces Jean-Charles Croix GT APSSE - April 2017, the 13th joint work with Xavier Bay. 1 / 29 Sommaire 1 Preliminaries on Gaussian processes 2

More information

Random Matrices and Multivariate Statistical Analysis

Random Matrices and Multivariate Statistical Analysis Random Matrices and Multivariate Statistical Analysis Iain Johnstone, Statistics, Stanford imj@stanford.edu SEA 06@MIT p.1 Agenda Classical multivariate techniques Principal Component Analysis Canonical

More information

Asymptotic Distribution of the Largest Eigenvalue via Geometric Representations of High-Dimension, Low-Sample-Size Data

Asymptotic Distribution of the Largest Eigenvalue via Geometric Representations of High-Dimension, Low-Sample-Size Data Sri Lankan Journal of Applied Statistics (Special Issue) Modern Statistical Methodologies in the Cutting Edge of Science Asymptotic Distribution of the Largest Eigenvalue via Geometric Representations

More information

High Dimensional Empirical Likelihood for Generalized Estimating Equations with Dependent Data

High Dimensional Empirical Likelihood for Generalized Estimating Equations with Dependent Data High Dimensional Empirical Likelihood for Generalized Estimating Equations with Dependent Data Song Xi CHEN Guanghua School of Management and Center for Statistical Science, Peking University Department

More information

Lecture 3: Central Limit Theorem

Lecture 3: Central Limit Theorem Lecture 3: Central Limit Theorem Scribe: Jacy Bird (Division of Engineering and Applied Sciences, Harvard) February 8, 003 The goal of today s lecture is to investigate the asymptotic behavior of P N (

More information

Applications of Randomized Methods for Decomposing and Simulating from Large Covariance Matrices

Applications of Randomized Methods for Decomposing and Simulating from Large Covariance Matrices Applications of Randomized Methods for Decomposing and Simulating from Large Covariance Matrices Vahid Dehdari and Clayton V. Deutsch Geostatistical modeling involves many variables and many locations.

More information

Solving the steady state diffusion equation with uncertainty Final Presentation

Solving the steady state diffusion equation with uncertainty Final Presentation Solving the steady state diffusion equation with uncertainty Final Presentation Virginia Forstall vhfors@gmail.com Advisor: Howard Elman elman@cs.umd.edu Department of Computer Science May 6, 2012 Problem

More information

Gaussian Processes. 1. Basic Notions

Gaussian Processes. 1. Basic Notions Gaussian Processes 1. Basic Notions Let T be a set, and X : {X } T a stochastic process, defined on a suitable probability space (Ω P), that is indexed by T. Definition 1.1. We say that X is a Gaussian

More information

A test of weak separability for multi-way functional data, with application to brain connectivity studies

A test of weak separability for multi-way functional data, with application to brain connectivity studies Biometrika (-), -, -, pp. 1 16 Printed in Great Britain - A test of weak separability for multi-way functional data, with application to brain connectivity studies BY BRIAN LYNCH AND KEHUI CHEN Department

More information

Probabilistic Graphical Models

Probabilistic Graphical Models Probabilistic Graphical Models Brown University CSCI 2950-P, Spring 2013 Prof. Erik Sudderth Lecture 12: Gaussian Belief Propagation, State Space Models and Kalman Filters Guest Kalman Filter Lecture by

More information

Lecture 3. Inference about multivariate normal distribution

Lecture 3. Inference about multivariate normal distribution Lecture 3. Inference about multivariate normal distribution 3.1 Point and Interval Estimation Let X 1,..., X n be i.i.d. N p (µ, Σ). We are interested in evaluation of the maximum likelihood estimates

More information

arxiv: v2 [math.pr] 27 Oct 2015

arxiv: v2 [math.pr] 27 Oct 2015 A brief note on the Karhunen-Loève expansion Alen Alexanderian arxiv:1509.07526v2 [math.pr] 27 Oct 2015 October 28, 2015 Abstract We provide a detailed derivation of the Karhunen Loève expansion of a stochastic

More information

arxiv: v1 [stat.me] 20 Jan 2017

arxiv: v1 [stat.me] 20 Jan 2017 Permutation tests for the equality of covariance operators of functional data with applications to evolutionary biology Alessandra Cabassi 1, Davide Pigoli 2, Piercesare Secchi 3, and Patrick A. Carter

More information

MP 472 Quantum Information and Computation

MP 472 Quantum Information and Computation MP 472 Quantum Information and Computation http://www.thphys.may.ie/staff/jvala/mp472.htm Outline Open quantum systems The density operator ensemble of quantum states general properties the reduced density

More information

Modeling Function-Valued Stochastic Processes, With Applications to Fertility Dynamics

Modeling Function-Valued Stochastic Processes, With Applications to Fertility Dynamics Modeling Function-Valued Stochastic Processes, With Applications to Fertility Dynamics Kehui Chen 1, Pedro Delicado 2 and Hans-Georg Müller 3 1 Dept. of Statistics, University of Pittsburgh, Pittsburgh,

More information

Practical unbiased Monte Carlo for Uncertainty Quantification

Practical unbiased Monte Carlo for Uncertainty Quantification Practical unbiased Monte Carlo for Uncertainty Quantification Sergios Agapiou Department of Statistics, University of Warwick MiR@W day: Uncertainty in Complex Computer Models, 2nd February 2015, University

More information

MIXED EFFECTS MODELS FOR TIME SERIES

MIXED EFFECTS MODELS FOR TIME SERIES Outline MIXED EFFECTS MODELS FOR TIME SERIES Cristina Gorrostieta Hakmook Kang Hernando Ombao Brown University Biostatistics Section February 16, 2011 Outline OUTLINE OF TALK 1 SCIENTIFIC MOTIVATION 2

More information

On the prediction of stationary functional time series

On the prediction of stationary functional time series On the prediction of stationary functional time series Alexander Aue Diogo Dubart Norinho Siegfried Hörmann Abstract This paper addresses the prediction of stationary functional time series. Existing contributions

More information

Parametric Inference on Strong Dependence

Parametric Inference on Strong Dependence Parametric Inference on Strong Dependence Peter M. Robinson London School of Economics Based on joint work with Javier Hualde: Javier Hualde and Peter M. Robinson: Gaussian Pseudo-Maximum Likelihood Estimation

More information

Curve alignment and functional PCA

Curve alignment and functional PCA Curve alignment and functional PCA Juhyun Par* Department of Mathematics and Statistics, Lancaster University, Lancaster, U.K. juhyun.par@lancaster.ac.u Abstract When dealing with multiple curves as functional

More information

9.1 Orthogonal factor model.

9.1 Orthogonal factor model. 36 Chapter 9 Factor Analysis Factor analysis may be viewed as a refinement of the principal component analysis The objective is, like the PC analysis, to describe the relevant variables in study in terms

More information

Multivariate Regression Analysis

Multivariate Regression Analysis Matrices and vectors The model from the sample is: Y = Xβ +u with n individuals, l response variable, k regressors Y is a n 1 vector or a n l matrix with the notation Y T = (y 1,y 2,...,y n ) 1 x 11 x

More information

Bare minimum on matrix algebra. Psychology 588: Covariance structure and factor models

Bare minimum on matrix algebra. Psychology 588: Covariance structure and factor models Bare minimum on matrix algebra Psychology 588: Covariance structure and factor models Matrix multiplication 2 Consider three notations for linear combinations y11 y1 m x11 x 1p b11 b 1m y y x x b b n1

More information

The Effects of Monetary Policy on Stock Market Bubbles: Some Evidence

The Effects of Monetary Policy on Stock Market Bubbles: Some Evidence The Effects of Monetary Policy on Stock Market Bubbles: Some Evidence Jordi Gali Luca Gambetti ONLINE APPENDIX The appendix describes the estimation of the time-varying coefficients VAR model. The model

More information

Supervised Dimension Reduction:

Supervised Dimension Reduction: Supervised Dimension Reduction: A Tale of Two Manifolds S. Mukherjee, K. Mao, F. Liang, Q. Wu, M. Maggioni, D-X. Zhou Department of Statistical Science Institute for Genome Sciences & Policy Department

More information