Karhunen-Loève decomposition of Gaussian measures on Banach spaces


1 Karhunen-Loève decomposition of Gaussian measures on Banach spaces

Jean-Charles Croix, Génie Mathématique et Industriel (GMI). First workshop on Gaussian processes at Saint-Etienne, November 7th, 2017. 1 / 42

2 Contents

1 Motivation
2 Karhunen-Loève decomposition in Hilbert spaces: Gaussian vectors in R^n; Gaussian elements in Hilbert spaces
3 Karhunen-Loève decomposition in Banach spaces
4 Particular case of (R^n, ‖·‖)
5 Particular case of C(K), K metric and compact: continuous Gaussian processes; illustration on Brownian motion; summability of (λ_k)
6 Conclusion and open questions
7 References

3 Motivation

4 Motivation

The representation of Gaussian elements as linear series is used in:
1 Simulation (e.g. truncated Karhunen-Loève series),
2 Approximation and dimension reduction (e.g. PCA or POD),
3 Optimal quantization,
4 Bayesian inverse problems.
While the existence of an optimal basis is well known in Hilbert spaces, it is not always explicit (an eigenvalue problem must be solved), and no analogue is known in Banach spaces. What if we are interested in non-Hilbertian norms?

5 Motivation

In the case of a continuous Gaussian process on [0, 1], we may inject it into L²([0, 1], dx) and use Hilbertian geometry.
Figure: Karhunen-Loève basis of Brownian motion in L²([0, 1], dx) (basis functions f_0, ..., f_4).

6 Motivation

...or tackle the problem directly in C([0, 1]).
Figure: Brownian motion basis functions in C([0, 1]) (Paul Lévy's construction, f_0, ..., f_7).
...in the hope of better approximation w.r.t. the supremum norm!

7 Karhunen-Loève decomposition in Hilbert spaces


9 Gaussian vectors in Euclidean spaces

Consider the Euclidean space R^n with canonical inner product ⟨·,·⟩.

Gaussian random vector. Let (Ω, F, P) be a probability space and X : (Ω, F, P) → (R^n, B(R^n)) a measurable mapping; then X is a Gaussian vector if and only if for all y ∈ R^n, ⟨X, y⟩ is a Gaussian random variable.

Covariance. Given a (centered) Gaussian random vector X, define the bilinear form
Cov(x, y) = E[⟨X, x⟩ ⟨X, y⟩] for all x, y ∈ R^n, (1)
which uniquely defines Σ : R^n → R^n such that
⟨Σx, y⟩ = Cov(x, y) for all x, y ∈ R^n. (2)

10 Cameron-Martin and Gaussian spaces

The Gaussian vector X comes with a fundamental structure:
Gaussian space: span(⟨X, e_k⟩, k ∈ [1, n]) ⊂ L²(P),
Cameron-Martin space: H_X = Range(Σ), equipped with ⟨·,·⟩_X = ⟨Σ⁻¹·,·⟩.

Loève isometry. The map
⟨X, x⟩ ∈ L²(P) ↦ Σx ∈ H_X (3)
is an isometry.


15 Representation of Gaussian vectors in R^n

For any orthonormal basis (e_k) of R^n, we can write
X(ω) = Σ_{k=1}^{n} ⟨X(ω), e_k⟩ e_k a.s. (4)
⟨X(ω), e_i⟩ and ⟨X(ω), e_j⟩ are independent if and only if Cov(e_i, e_j) = ⟨Σe_i, e_j⟩ = ⟨Σe_i, Σe_j⟩_X = 0.
For any orthonormal basis (x_k) of H_X:
X(ω) = Σ_{k=1}^{dim(H_X)} ⟨X(ω), x_k⟩_X x_k (5)
The spectral theorem (applied to the covariance operator) exhibits a particular Karhunen-Loève basis (bi-orthogonal, with Σh_k = λ_k h_k):
X(ω) = Σ_{k=1}^{dim(H_X)} √λ_k ξ_k(ω) h_k, with (ξ_k) i.i.d. N(0, 1). (6)
Writing P_k for the orthogonal projector onto the eigenvectors of the k largest eigenvalues, for all k ≤ n:
min_{rank(P)=k} E[‖X − PX‖²] = E[‖X − P_k X‖²] = λ_{k+1} + … + λ_{dim(H_X)} (7)
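The finite-dimensional Karhunen-Loève construction — eigendecomposition of Σ plus the Eckart-Young optimality — can be checked numerically. A minimal numpy sketch; the covariance matrix and sample size are arbitrary illustrative choices:

```python
import numpy as np

rng = np.random.default_rng(0)

# A toy covariance matrix (symmetric positive semi-definite).
A = rng.standard_normal((4, 4))
Sigma = A @ A.T

# Spectral theorem: Sigma h_k = lambda_k h_k, eigenvalues sorted decreasingly.
lam, H = np.linalg.eigh(Sigma)
lam, H = lam[::-1], H[:, ::-1]

# Eckart-Young: the best rank-k projection keeps the top-k eigenvectors,
# and the expected squared error is the sum of the discarded eigenvalues.
k = 2
P_k = H[:, :k] @ H[:, :k].T
X = rng.multivariate_normal(np.zeros(4), Sigma, size=100_000)
err = np.mean(np.sum((X - X @ P_k) ** 2, axis=1))
print(err, lam[k:].sum())  # the two values agree up to Monte-Carlo error
```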

16 Representation of Gaussian vectors in R^n

Figure: Karhunen-Loève basis in dimension 2 (unit balls B_{H_X}(0, 1) and B_E(0, 1), eigenvalues λ_1, λ_2, axes e_1, e_2).

17 Gaussian elements in Hilbert spaces

Consider a (real) Hilbert space H with inner product ⟨·,·⟩.

Gaussian random element. Let (Ω, F, P) be a probability space and X : (Ω, F, P) → (H, B(H)) a measurable mapping; then X is a Gaussian element if and only if for all y ∈ H, ⟨X, y⟩ is a Gaussian random variable.

Covariance. Given a Gaussian random element X, define the bilinear form
Cov(x, y) = E[⟨X, x⟩ ⟨X, y⟩] for all x, y ∈ H, (8)
and the associated covariance operator C : H → H such that
⟨Cx, y⟩ = Cov(x, y) for all x, y ∈ H. (9)


23 Representation of Gaussian elements in Hilbert spaces

The covariance operator C : H → H is positive, symmetric and trace-class (see Vakhania 1987).
The Cameron-Martin space H_X is the completion of Range(C) w.r.t. ⟨x, y⟩_C = ⟨C⁻¹x, y⟩ (it is a proper subspace of H, H_X ⊂ H, see Vakhania 1987).
For any Hilbert basis (e_k) of H we can write
X(ω) = Σ_{k≥0} ⟨X(ω), e_k⟩ e_k a.s. (10)
For any basis (x_k) of H_X and (ξ_k) i.i.d. N(0, 1):
X =_d Σ_{k≥0} ξ_k(ω) x_k (11)
The spectral theorem applies and exhibits a (Karhunen-Loève) bi-orthogonal basis (h_k):
X(ω) = Σ_{k≥0} √λ_k ξ_k(ω) h_k a.s.
The Eckart-Young theorem is still valid (functional PCA, ...): for all k > 0,
min_{rank(P)=k} E[‖X − PX‖²_H] = E[‖X − P_k X‖²_H] = Σ_{i>k} λ_i (12)
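For a concrete Hilbert-space example, the Brownian covariance k(s, t) = min(s, t) on L²([0, 1]) has the known Karhunen-Loève eigenvalues λ_k = 1/((k + 1/2)²π²). A small sketch, assuming a simple midpoint discretization of the integral operator:

```python
import numpy as np

# Discretize the Brownian covariance k(s, t) = min(s, t) on a midpoint grid
# of [0, 1] and solve the discretized eigenvalue problem C h = lambda h.
n = 1000
t = (np.arange(n) + 0.5) / n
K = np.minimum.outer(t, t)
lam = np.linalg.eigvalsh(K / n)[::-1]   # quadrature weight 1/n

# Known closed form for Brownian motion in L^2([0, 1]):
# lambda_k = 1 / ((k + 1/2)^2 pi^2), eigenfunctions sqrt(2) sin((k + 1/2) pi t).
k = np.arange(5)
exact = 1.0 / ((k + 0.5) ** 2 * np.pi ** 2)
print(lam[:5], exact)  # numerical and exact eigenvalues agree closely
```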

24 Karhunen-Loève decomposition in Banach spaces

25 Gaussian elements in Banach spaces

Consider a (real) Banach space E with duality pairing ⟨·,·⟩_{E,E*}.

Gaussian random element. Let (Ω, F, P) be a probability space and X : (Ω, F, P) → (E, B(E)) a measurable mapping; then X is a Gaussian element if and only if for all f ∈ E*, ⟨X, f⟩_{E,E*} is a Gaussian random variable.

Covariance. Given a Gaussian random element X, define the bilinear form
Cov(f, g) = E[⟨X, f⟩_{E,E*} ⟨X, g⟩_{E,E*}] for all f, g ∈ E*, (13)
and the associated covariance operator C : E* → E (see Vakhania 1987 or Bogachev 1998) such that
⟨Cf, g⟩_{E,E*} = Cov(f, g) for all f, g ∈ E*. (14)


31 Representation of Gaussian elements in Banach spaces

The covariance operator C : E* → E is positive, symmetric and nuclear (see Vakhania 1987).
The space H_X is the completion of Range(C) w.r.t. ⟨x, y⟩_X = ⟨y, C⁻¹x⟩_{E,E*} (it is a proper subspace of E, H_X ⊂ E, see Vakhania 1987, Bogachev 1998).
For any Hilbert basis (x_k) of H_X and (ξ_k) i.i.d. N(0, 1), we have
X =_d Σ_{k≥1} ξ_k x_k (15)
C : E* → E: no spectral theorem.
We know that (Bogachev 1998, Vakhania 1991):
- there exists a Hilbert basis (x_k) of H_X with x_k ∈ Range(C),
- there exists a basis (x_k) of H_X such that Σ_{k≥0} ‖x_k‖²_E < +∞.
If the basis (x_k) is in Range(C), with x_k = C x*_k for some x*_k ∈ E*, then:
X(ω) = Σ_{k≥0} ⟨X(ω), x*_k⟩_{E,E*} x_k (16)


34 Representation of Gaussian elements in Banach spaces

Factorization of covariance operators (Vakhania 1987, Bogachev 1998). Let C be the covariance operator of a Gaussian element; then
C = SS*, (17)
where S : H → E is a bounded operator and H a Hilbert space.
Examples:
- In the Hilbert case, S = C^{1/2},
- S : H_X → E the inclusion map,
- S : f ∈ L²(P) ↦ E[f(X)X] ∈ E (S* is then the injection from E* to L²(P)).
(Luschgy and Pagès 2009) Let C = SS* with S : H → E. Then for any Hilbert basis (h_n) of H:
X =_d Σ_{k≥0} ξ_k S h_k, (18)
where (ξ_k) are i.i.d. N(0, 1).
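The factorization viewpoint explains why such expansions are non-unique: replacing S by SQ for any orthogonal Q leaves SS* unchanged, so every rotated basis gives another valid series with the same law. A minimal numpy illustration in the Hilbert case S = C^{1/2} (matrix sizes are arbitrary):

```python
import numpy as np

rng = np.random.default_rng(1)
A = rng.standard_normal((4, 4))
Sigma = A @ A.T

# One factorization C = S S*: S = C^{1/2} (the Hilbert-case example).
w, V = np.linalg.eigh(Sigma)
S = V @ np.diag(np.sqrt(np.clip(w, 0, None))) @ V.T

# Any orthonormal basis (h_k) of H yields X = sum xi_k S h_k: replacing S by
# S Q with Q orthogonal leaves the covariance (S Q)(S Q)^T = S S^T unchanged.
Q, _ = np.linalg.qr(rng.standard_normal((4, 4)))
print(np.allclose((S @ Q) @ (S @ Q).T, Sigma))  # → True
```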

35 Representation of Gaussian elements in Banach spaces

This methodology has been widely used: when E = C([0, 1]), S : f ∈ L²([0, 1], dx) ↦ ...
We now give a different methodology, mimicking the Hilbert case, to construct a Hilbert basis of H_X. We proceed as follows:
1 Find directions of maximum variance,
2 Choose a right notion of orthogonality to iterate,
3 Study the asymptotics.


42 Decomposition of the Cameron-Martin space

1 For any (Gaussian) covariance operator, the map f ∈ E* ↦ Cov(f, f) ∈ R⁺ may be interpreted as a Rayleigh quotient; it is weakly-* sequentially continuous and quadratic.
2 There exists f_0 ∈ B_{E*}(0, 1) (non-unique) such that Cov(f_0, f_0) = max_{‖f‖_{E*}≤1} Cov(f, f) = λ_0 (Banach-Alaoglu).
3 If λ_0 > 0, let x_0 ∈ E be such that Cf_0 = λ_0 x_0; then P_0 : x ∈ E ↦ ⟨x, f_0⟩_{E,E*} x_0 is a projector of unit norm.
4 X = (P_0 X, (I − P_0)X).
5 Iterate on X_1 = (I − P_0)X.

Bay & Croix 2017. (λ_n) is non-increasing, λ_n → 0, and (√λ_k x_k) is a Hilbert basis of H_X.

We write x*_0 = f_0 and x*_n = f_n − Σ_{k=0}^{n−1} ⟨x_k, f_n⟩_{E,E*} x*_k, so that P_n = Σ_{k=0}^{n} x_k ⊗ x*_k (similar to Gram-Schmidt).

43 Properties of the decomposition

A few comments on the previous construction.

Properties. The construction gives:
- X(ω) = Σ_{k≥0} ⟨X(ω), x*_k⟩_{E,E*} x_k a.s.
- C = SS* with S : f ∈ E* ↦ Σ_{k≥0} √λ_k ⟨x_k, f⟩_{E,E*} x_k,
- with C_n = Σ_{k=0}^{n} λ_k x_k ⊗ x*_k, we have ‖C − C_n‖_{L(E*,E)} = λ_{n+1},
- it recovers the Karhunen-Loève basis in the Hilbert case,
- (x_k, x*_k) ⊂ E × E* is bi-orthogonal,
- the closure of span(x_k, k ≥ 0) in E equals the closure of H_X in E,
- for all k ∈ N, ‖f_k‖_{E*} = ‖x_k‖_E = 1.

Remarks.
- Σ_{k≥0} λ_k need not be finite (this is not a nuclear representation of C).
- Optimality seems out of reach (rate optimality?).

44 Particular case of (R^n, ‖·‖)

45 Application in R^n

Here E = (R^n, ‖·‖_∞), so E* = (R^n, ‖·‖_1) and ⟨x, f⟩_{E,E*} = Σ_{i=1}^{n} x_i f_i; the extreme points of B_{E*}(0, 1) are the ±e_i, so the maximal variance is reached on a coordinate:
- Suppose a centered dataset (y_i)_{i∈[1,p]} ∈ (R^n)^p,
- Form the usual covariance matrix Σ ∈ M_n(R),
- Find the direction of maximal variance f_0 ∈ R^n:
λ_0 = f_0^T Σ f_0 = max_{i∈[1,n]} Σ_{(i,i)}, with f_0 = (0, …, 0, 1, 0, …, 0), (19)
- If λ_0 > 0, then x_0 = Σf_0/λ_0 and Σ_1 = Σ − λ_0 x_0 x_0^T,
- Iterate.
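The iteration above can be sketched in a few lines. A hedged numpy sketch, assuming the sup-norm reading of the slide (maximal-variance functional = coordinate of largest diagonal entry); `banach_pca` is a hypothetical helper name, not the authors' code:

```python
import numpy as np

def banach_pca(Sigma, m):
    """Greedy decomposition: pick the coordinate of maximal variance
    (extreme point of the dual l1 ball), set x = Sigma f / lambda,
    then deflate the covariance matrix (a pivoted-Cholesky-style step)."""
    Sigma = Sigma.copy()
    lams, xs = [], []
    for _ in range(m):
        i = int(np.argmax(np.diag(Sigma)))
        lam = Sigma[i, i]
        if lam <= 0:
            break
        f = np.zeros(len(Sigma)); f[i] = 1.0
        x = Sigma @ f / lam
        Sigma = Sigma - lam * np.outer(x, x)
        lams.append(lam); xs.append(x)
    return np.array(lams), np.array(xs)

rng = np.random.default_rng(2)
A = rng.standard_normal((5, 5))
Sigma = A @ A.T
lams, xs = banach_pca(Sigma, 5)
print(lams)  # non-increasing variances
print(np.allclose(sum(l * np.outer(x, x) for l, x in zip(lams, xs)), Sigma))  # → True
```

After n steps on a full-rank n × n matrix the deflation exhausts the covariance, so the rank-one terms reconstruct Σ exactly.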

46 Application in R^n

Define the residual norms r_k = (1/n) Σ_{i=1}^{n} ‖x^i − Σ_{j=0}^{k} ⟨x^i, x*_j⟩_{E,E*} x_j‖ and the projection norms p_k = (1/n) Σ_{i=1}^{n} ‖Σ_{j=0}^{k} ⟨x^i, x*_j⟩_{E,E*} x_j‖.
Figure: residual (decreasing) and projection (increasing) norms (blue for L²-PCA, red for Banach PCA).

47 Particular case of C(K), K metric and compact.

48 E = C(K) with K metric compact

Let X be a Gaussian element such that X ∈ C(K) almost surely, and suppose that
k : (s, t) ∈ K² ↦ ⟨Cδ_s, δ_t⟩_{E,E*} = Cov(X_s, X_t) ∈ R (20)
is continuous. The Cameron-Martin space H_X coincides here with the Reproducing Kernel Hilbert Space (RKHS). The previous decomposition becomes:
1 Set n = 0 and k_n = k,
2 Find x_n ∈ K such that k_n(x_n, x_n) = max_{x∈K} k_n(x, x),
3 If λ_n = k_n(x_n, x_n) > 0, then k_{n+1}(s, t) = k_n(s, t) − k_n(s, x_n) k_n(x_n, t)/λ_n,
4 n ← n + 1, go to 2.
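The four steps above amount to a greedy kernel deflation (closely related to pivoted Cholesky). A minimal sketch on a finite grid approximating K, with `kernel_greedy` a hypothetical helper name; on the Brownian kernel min(s, t) it recovers the points shown on the next slides, first t = 1, then t = 0.5:

```python
import numpy as np

def kernel_greedy(k, grid, steps):
    """Greedy deflation on a finite grid: pick the point of maximal residual
    variance, record x_n(t) = k_n(t, x_n)/lambda_n, then deflate the kernel."""
    Kmat = k(grid[:, None], grid[None, :])
    points, lams, basis = [], [], []
    for _ in range(steps):
        i = int(np.argmax(np.diag(Kmat)))
        lam = Kmat[i, i]
        if lam <= 0:
            break
        points.append(grid[i]); lams.append(lam)
        basis.append(Kmat[:, i] / lam)   # x_n, e.g. x_0(t) = min(t, 1)/1 = t
        Kmat = Kmat - np.outer(Kmat[:, i], Kmat[i, :]) / lam
    return np.array(points), np.array(lams), np.array(basis)

# Brownian motion: k(s, t) = min(s, t) on [0, 1].
grid = np.linspace(0, 1, 1025)
pts, lams, _ = kernel_greedy(np.minimum, grid, 4)
print(pts[:2], lams[:2])  # first t = 1 (lambda = 1), then t = 0.5 (lambda = 1/4)
```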

49 Illustration on Brownian motion: Step 1

Figure: Left: Brownian motion samples and point-wise (symmetrized) variance. Right: first basis function x_0(t) = t, associated with f_0 = δ_1.

50 Illustration on Brownian motion: Step 2

Figure: Left: Brownian bridge samples and point-wise (symmetrized) variance. Right: second basis function x_1(t) = min(t, 0.5) − 0.5t, associated with f_1 = δ_{0.5}.

51 Illustration on Brownian motion: Step 3

Figure: Left: conditional Brownian motion samples and point-wise (symmetrized) variance. Right: third and fourth basis functions.

52 A word on summability

The proposed decomposition need not be a nuclear representation of C, that is, Σ_{k≥0} λ_k ∈ [0, +∞]. Indeed, the Brownian case shows
Σ_{k≥0} λ_k = 1 + 1/4 + 1/8 + 1/8 + … = +∞, (21)
since each dyadic level contributes a constant total of 1/4. However, a simple transformation of the functionals gives a nuclear representation in this case: combining the two second-level functionals yields transformed covariances both equal to 1/8, and doing similar transformations at each step gives
Σ_{k≥0} λ_k = 2. (24)
Remark that E[‖Σ_{k≥n} ξ_k h_k‖²] ≤ Σ_{k≥n} λ_k, so the approximation error decreases exponentially!
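The divergence in (21) can be checked level by level, using the standard fact that a Brownian bridge over an interval of length h has residual variance h/4 at its midpoint:

```latex
% Level j >= 1 of the dyadic construction has 2^{j-1} intervals of length
% h = 2^{-(j-1)}; the residual (bridge) variance at each midpoint is h/4.
\sum_{k \ge 0} \lambda_k
  = 1 + \sum_{j \ge 1} 2^{\,j-1} \cdot \frac{2^{-(j-1)}}{4}
  = 1 + \sum_{j \ge 1} \frac{1}{4}
  = +\infty
```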

53 Conclusion and open questions

54 Conclusion and open questions

Conclusions
- A new practical method to represent Gaussian elements, based on the covariance operator.
- Numerical solution in the case E = C(K) (K metric and compact).
- Rediscovers Paul Lévy's construction of Brownian motion.

Open questions & future work
- Other properties of (x_k, x*_k) ⊂ E × E*?
- Rate optimality of finite-dimensional approximations in C(K)?
- Multiplicity of decomposition elements?
- Links with approximation numbers (e.g. l(X) = E[‖X‖²]^{1/2}), approximation theory (s-numbers), properties of Banach spaces?
- Optimality for the projection norm? Dual problem?
- Interpretation of Σ_{k≥0} λ_k when it is finite? Evolution of x_k in E?

55 References

56 References

X. Bay, J.-C. Croix. Karhunen-Loève decomposition of Gaussian measures on Banach spaces. arXiv, 2017.
H. Luschgy, G. Pagès. Expansions for Gaussian processes and Parseval frames. Electronic Journal of Probability, 14, 2009.
N. Vakhania. Canonical factorization of Gaussian covariance operators and some of its applications. Theory of Probability and Its Applications, 1991.
V. Bogachev. Gaussian Measures. Mathematical Surveys and Monographs vol. 62, AMS, 1998.
N. Vakhania, V. Tarieladze, S. Chobanyan. Probability Distributions on Banach Spaces. Mathematics and Its Applications, Springer Netherlands, 1987.

57 Appendix

58 Examples with different kernels

Figure: Wiener measure basis functions (f_0, ..., f_7).

59 Examples with different kernels

Figure: Wiener measure basis variances.

60 Examples with different kernels

Figure: k(s, t) = exp(−(s − t)²) basis functions (f_0, ..., f_7).

61 Examples with different kernels

Figure: k(s, t) = exp(−(s − t)²) basis variances.

62 Examples with different kernels

Figure: Matérn 3/2 kernel basis functions (x_0, ..., x_11).

63 Examples with different kernels

Figure: Matérn 3/2 kernel basis variances.

64 Examples with different kernels

Figure: fractional Brownian motion (H = 0.75) basis functions (f_0, ..., f_7).

65 Examples with different kernels

Figure: fractional Brownian motion (H = 0.25) basis functions (f_0, ..., f_7).


More information

Functional Analysis. Franck Sueur Metric spaces Definitions Completeness Compactness Separability...

Functional Analysis. Franck Sueur Metric spaces Definitions Completeness Compactness Separability... Functional Analysis Franck Sueur 2018-2019 Contents 1 Metric spaces 1 1.1 Definitions........................................ 1 1.2 Completeness...................................... 3 1.3 Compactness......................................

More information

Learning gradients: prescriptive models

Learning gradients: prescriptive models Department of Statistical Science Institute for Genome Sciences & Policy Department of Computer Science Duke University May 11, 2007 Relevant papers Learning Coordinate Covariances via Gradients. Sayan

More information

1 Singular Value Decomposition and Principal Component

1 Singular Value Decomposition and Principal Component Singular Value Decomposition and Principal Component Analysis In these lectures we discuss the SVD and the PCA, two of the most widely used tools in machine learning. Principal Component Analysis (PCA)

More information

Maximum variance formulation

Maximum variance formulation 12.1. Principal Component Analysis 561 Figure 12.2 Principal component analysis seeks a space of lower dimensionality, known as the principal subspace and denoted by the magenta line, such that the orthogonal

More information

Statistical Convergence of Kernel CCA

Statistical Convergence of Kernel CCA Statistical Convergence of Kernel CCA Kenji Fukumizu Institute of Statistical Mathematics Tokyo 106-8569 Japan fukumizu@ism.ac.jp Francis R. Bach Centre de Morphologie Mathematique Ecole des Mines de Paris,

More information

MIT 9.520/6.860, Fall 2017 Statistical Learning Theory and Applications. Class 19: Data Representation by Design

MIT 9.520/6.860, Fall 2017 Statistical Learning Theory and Applications. Class 19: Data Representation by Design MIT 9.520/6.860, Fall 2017 Statistical Learning Theory and Applications Class 19: Data Representation by Design What is data representation? Let X be a data-space X M (M) F (M) X A data representation

More information

E.7 Alaoglu s Theorem

E.7 Alaoglu s Theorem E.7 Alaoglu s Theorem 359 E.7 Alaoglu s Theorem If X is a normed space, then the closed unit ball in X or X is compact if and only if X is finite-dimensional (Problem A.25). Even so, Alaoglu s Theorem

More information

Tests for separability in nonparametric covariance operators of random surfaces

Tests for separability in nonparametric covariance operators of random surfaces Tests for separability in nonparametric covariance operators of random surfaces Shahin Tavakoli (joint with John Aston and Davide Pigoli) April 19, 2016 Analysis of Multidimensional Functional Data Shahin

More information

CS 7140: Advanced Machine Learning

CS 7140: Advanced Machine Learning Instructor CS 714: Advanced Machine Learning Lecture 3: Gaussian Processes (17 Jan, 218) Jan-Willem van de Meent (j.vandemeent@northeastern.edu) Scribes Mo Han (han.m@husky.neu.edu) Guillem Reus Muns (reusmuns.g@husky.neu.edu)

More information

2 Garrett: `A Good Spectral Theorem' 1. von Neumann algebras, density theorem The commutant of a subring S of a ring R is S 0 = fr 2 R : rs = sr; 8s 2

2 Garrett: `A Good Spectral Theorem' 1. von Neumann algebras, density theorem The commutant of a subring S of a ring R is S 0 = fr 2 R : rs = sr; 8s 2 1 A Good Spectral Theorem c1996, Paul Garrett, garrett@math.umn.edu version February 12, 1996 1 Measurable Hilbert bundles Measurable Banach bundles Direct integrals of Hilbert spaces Trivializing Hilbert

More information

Contents. Preface for the Instructor. Preface for the Student. xvii. Acknowledgments. 1 Vector Spaces 1 1.A R n and C n 2

Contents. Preface for the Instructor. Preface for the Student. xvii. Acknowledgments. 1 Vector Spaces 1 1.A R n and C n 2 Contents Preface for the Instructor xi Preface for the Student xv Acknowledgments xvii 1 Vector Spaces 1 1.A R n and C n 2 Complex Numbers 2 Lists 5 F n 6 Digression on Fields 10 Exercises 1.A 11 1.B Definition

More information

Intertibility and spectrum of the multiplication operator on the space of square-summable sequences

Intertibility and spectrum of the multiplication operator on the space of square-summable sequences Intertibility and spectrum of the multiplication operator on the space of square-summable sequences Objectives Establish an invertibility criterion and calculate the spectrum of the multiplication operator

More information

Lecture 3: Review of Linear Algebra

Lecture 3: Review of Linear Algebra ECE 83 Fall 2 Statistical Signal Processing instructor: R Nowak Lecture 3: Review of Linear Algebra Very often in this course we will represent signals as vectors and operators (eg, filters, transforms,

More information

Introduction to Functional Analysis With Applications

Introduction to Functional Analysis With Applications Introduction to Functional Analysis With Applications A.H. Siddiqi Khalil Ahmad P. Manchanda Tunbridge Wells, UK Anamaya Publishers New Delhi Contents Preface vii List of Symbols.: ' - ix 1. Normed and

More information

EE613 Machine Learning for Engineers. Kernel methods Support Vector Machines. jean-marc odobez 2015

EE613 Machine Learning for Engineers. Kernel methods Support Vector Machines. jean-marc odobez 2015 EE613 Machine Learning for Engineers Kernel methods Support Vector Machines jean-marc odobez 2015 overview Kernel methods introductions and main elements defining kernels Kernelization of k-nn, K-Means,

More information

MAT 578 FUNCTIONAL ANALYSIS EXERCISES

MAT 578 FUNCTIONAL ANALYSIS EXERCISES MAT 578 FUNCTIONAL ANALYSIS EXERCISES JOHN QUIGG Exercise 1. Prove that if A is bounded in a topological vector space, then for every neighborhood V of 0 there exists c > 0 such that tv A for all t > c.

More information

Functional Analysis I

Functional Analysis I Functional Analysis I Course Notes by Stefan Richter Transcribed and Annotated by Gregory Zitelli Polar Decomposition Definition. An operator W B(H) is called a partial isometry if W x = X for all x (ker

More information

08a. Operators on Hilbert spaces. 1. Boundedness, continuity, operator norms

08a. Operators on Hilbert spaces. 1. Boundedness, continuity, operator norms (February 24, 2017) 08a. Operators on Hilbert spaces Paul Garrett garrett@math.umn.edu http://www.math.umn.edu/ garrett/ [This document is http://www.math.umn.edu/ garrett/m/real/notes 2016-17/08a-ops

More information

An introduction to some aspects of functional analysis

An introduction to some aspects of functional analysis An introduction to some aspects of functional analysis Stephen Semmes Rice University Abstract These informal notes deal with some very basic objects in functional analysis, including norms and seminorms

More information

5 Compact linear operators

5 Compact linear operators 5 Compact linear operators One of the most important results of Linear Algebra is that for every selfadjoint linear map A on a finite-dimensional space, there exists a basis consisting of eigenvectors.

More information

Functional Analysis F3/F4/NVP (2005) Homework assignment 3

Functional Analysis F3/F4/NVP (2005) Homework assignment 3 Functional Analysis F3/F4/NVP (005 Homework assignment 3 All students should solve the following problems: 1. Section 4.8: Problem 8.. Section 4.9: Problem 4. 3. Let T : l l be the operator defined by

More information

The Laplacian PDF Distance: A Cost Function for Clustering in a Kernel Feature Space

The Laplacian PDF Distance: A Cost Function for Clustering in a Kernel Feature Space The Laplacian PDF Distance: A Cost Function for Clustering in a Kernel Feature Space Robert Jenssen, Deniz Erdogmus 2, Jose Principe 2, Torbjørn Eltoft Department of Physics, University of Tromsø, Norway

More information

Measure, Integration & Real Analysis

Measure, Integration & Real Analysis v Measure, Integration & Real Analysis preliminary edition 10 August 2018 Sheldon Axler Dedicated to Paul Halmos, Don Sarason, and Allen Shields, the three mathematicians who most helped me become a mathematician.

More information

HILBERT SPACES AND THE RADON-NIKODYM THEOREM. where the bar in the first equation denotes complex conjugation. In either case, for any x V define

HILBERT SPACES AND THE RADON-NIKODYM THEOREM. where the bar in the first equation denotes complex conjugation. In either case, for any x V define HILBERT SPACES AND THE RADON-NIKODYM THEOREM STEVEN P. LALLEY 1. DEFINITIONS Definition 1. A real inner product space is a real vector space V together with a symmetric, bilinear, positive-definite mapping,

More information

The Dirichlet-to-Neumann operator

The Dirichlet-to-Neumann operator Lecture 8 The Dirichlet-to-Neumann operator The Dirichlet-to-Neumann operator plays an important role in the theory of inverse problems. In fact, from measurements of electrical currents at the surface

More information

A Hilbert Space for Random Processes

A Hilbert Space for Random Processes Gaussian Basics Random Processes Filtering of Random Processes Signal Space Concepts A Hilbert Space for Random Processes I A vector space for random processes X t that is analogous to L 2 (a, b) is of

More information

1 Continuity Classes C m (Ω)

1 Continuity Classes C m (Ω) 0.1 Norms 0.1 Norms A norm on a linear space X is a function : X R with the properties: Positive Definite: x 0 x X (nonnegative) x = 0 x = 0 (strictly positive) λx = λ x x X, λ C(homogeneous) x + y x +

More information

The Dual of a Hilbert Space

The Dual of a Hilbert Space The Dual of a Hilbert Space Let V denote a real Hilbert space with inner product, u,v V. Let V denote the space of bounded linear functionals on V, equipped with the norm, F V sup F v : v V 1. Then for

More information

RKHS, Mercer s theorem, Unbounded domains, Frames and Wavelets Class 22, 2004 Tomaso Poggio and Sayan Mukherjee

RKHS, Mercer s theorem, Unbounded domains, Frames and Wavelets Class 22, 2004 Tomaso Poggio and Sayan Mukherjee RKHS, Mercer s theorem, Unbounded domains, Frames and Wavelets 9.520 Class 22, 2004 Tomaso Poggio and Sayan Mukherjee About this class Goal To introduce an alternate perspective of RKHS via integral operators

More information

Kernel-Based Contrast Functions for Sufficient Dimension Reduction

Kernel-Based Contrast Functions for Sufficient Dimension Reduction Kernel-Based Contrast Functions for Sufficient Dimension Reduction Michael I. Jordan Departments of Statistics and EECS University of California, Berkeley Joint work with Kenji Fukumizu and Francis Bach

More information

A Limit Theorem for the Squared Norm of Empirical Distribution Functions

A Limit Theorem for the Squared Norm of Empirical Distribution Functions University of Wisconsin Milwaukee UWM Digital Commons Theses and Dissertations May 2014 A Limit Theorem for the Squared Norm of Empirical Distribution Functions Alexander Nerlich University of Wisconsin-Milwaukee

More information

GENERALIZED COVARIATION FOR BANACH SPACE VALUED PROCESSES, ITÔ FORMULA AND APPLICATIONS

GENERALIZED COVARIATION FOR BANACH SPACE VALUED PROCESSES, ITÔ FORMULA AND APPLICATIONS Di Girolami, C. and Russo, F. Osaka J. Math. 51 (214), 729 783 GENERALIZED COVARIATION FOR BANACH SPACE VALUED PROCESSES, ITÔ FORMULA AND APPLICATIONS CRISTINA DI GIROLAMI and FRANCESCO RUSSO (Received

More information

Representations of Gaussian measures that are equivalent to Wiener measure

Representations of Gaussian measures that are equivalent to Wiener measure Representations of Gaussian measures that are equivalent to Wiener measure Patrick Cheridito Departement für Mathematik, ETHZ, 89 Zürich, Switzerland. E-mail: dito@math.ethz.ch Summary. We summarize results

More information

MATHEMATICS. Course Syllabus. Section A: Linear Algebra. Subject Code: MA. Course Structure. Ordinary Differential Equations

MATHEMATICS. Course Syllabus. Section A: Linear Algebra. Subject Code: MA. Course Structure. Ordinary Differential Equations MATHEMATICS Subject Code: MA Course Structure Sections/Units Section A Section B Section C Linear Algebra Complex Analysis Real Analysis Topics Section D Section E Section F Section G Section H Section

More information

BERGMAN KERNEL ON COMPACT KÄHLER MANIFOLDS

BERGMAN KERNEL ON COMPACT KÄHLER MANIFOLDS BERGMAN KERNEL ON COMPACT KÄHLER MANIFOLDS SHOO SETO Abstract. These are the notes to an expository talk I plan to give at MGSC on Kähler Geometry aimed for beginning graduate students in hopes to motivate

More information

Analysis of Fractals, Image Compression and Entropy Encoding

Analysis of Fractals, Image Compression and Entropy Encoding Analysis of Fractals, Image Compression and Entropy Encoding Myung-Sin Song Southern Illinois University Edwardsville Jul 10, 2009 Joint work with Palle Jorgensen. Outline 1. Signal and Image processing,

More information

Kernel Methods. Machine Learning A W VO

Kernel Methods. Machine Learning A W VO Kernel Methods Machine Learning A 708.063 07W VO Outline 1. Dual representation 2. The kernel concept 3. Properties of kernels 4. Examples of kernel machines Kernel PCA Support vector regression (Relevance

More information

STUDY PLAN MASTER IN (MATHEMATICS) (Thesis Track)

STUDY PLAN MASTER IN (MATHEMATICS) (Thesis Track) STUDY PLAN MASTER IN (MATHEMATICS) (Thesis Track) I. GENERAL RULES AND CONDITIONS: 1- This plan conforms to the regulations of the general frame of the Master programs. 2- Areas of specialty of admission

More information

Stable Process. 2. Multivariate Stable Distributions. July, 2006

Stable Process. 2. Multivariate Stable Distributions. July, 2006 Stable Process 2. Multivariate Stable Distributions July, 2006 1. Stable random vectors. 2. Characteristic functions. 3. Strictly stable and symmetric stable random vectors. 4. Sub-Gaussian random vectors.

More information

FINITE-DIMENSIONAL LINEAR ALGEBRA

FINITE-DIMENSIONAL LINEAR ALGEBRA DISCRETE MATHEMATICS AND ITS APPLICATIONS Series Editor KENNETH H ROSEN FINITE-DIMENSIONAL LINEAR ALGEBRA Mark S Gockenbach Michigan Technological University Houghton, USA CRC Press Taylor & Francis Croup

More information

Independent component analysis for functional data

Independent component analysis for functional data Independent component analysis for functional data Hannu Oja Department of Mathematics and Statistics University of Turku Version 12.8.216 August 216 Oja (UTU) FICA Date bottom 1 / 38 Outline 1 Probability

More information

Convergence of the Ensemble Kalman Filter in Hilbert Space

Convergence of the Ensemble Kalman Filter in Hilbert Space Convergence of the Ensemble Kalman Filter in Hilbert Space Jan Mandel Center for Computational Mathematics Department of Mathematical and Statistical Sciences University of Colorado Denver Parts based

More information

On compact operators

On compact operators On compact operators Alen Aleanderian * Abstract In this basic note, we consider some basic properties of compact operators. We also consider the spectrum of compact operators on Hilbert spaces. A basic

More information

Stochastic Processes

Stochastic Processes Introduction and Techniques Lecture 4 in Financial Mathematics UiO-STK4510 Autumn 2015 Teacher: S. Ortiz-Latorre Stochastic Processes 1 Stochastic Processes De nition 1 Let (E; E) be a measurable space

More information

A brief introduction to trace class operators

A brief introduction to trace class operators A brief introduction to trace class operators Christopher Hawthorne December 2015 Contents 1 Introduction 1 2 Preliminaries 1 3 Trace class operators 2 4 Duals 8 1 Introduction The trace is a useful number

More information

Learning Eigenfunctions: Links with Spectral Clustering and Kernel PCA

Learning Eigenfunctions: Links with Spectral Clustering and Kernel PCA Learning Eigenfunctions: Links with Spectral Clustering and Kernel PCA Yoshua Bengio Pascal Vincent Jean-François Paiement University of Montreal April 2, Snowbird Learning 2003 Learning Modal Structures

More information

Tools of stochastic calculus

Tools of stochastic calculus slides for the course Interest rate theory, University of Ljubljana, 212-13/I, part III József Gáll University of Debrecen Nov. 212 Jan. 213, Ljubljana Itô integral, summary of main facts Notations, basic

More information

Your first day at work MATH 806 (Fall 2015)

Your first day at work MATH 806 (Fall 2015) Your first day at work MATH 806 (Fall 2015) 1. Let X be a set (with no particular algebraic structure). A function d : X X R is called a metric on X (and then X is called a metric space) when d satisfies

More information

Math Real Analysis II

Math Real Analysis II Math 4 - Real Analysis II Solutions to Homework due May Recall that a function f is called even if f( x) = f(x) and called odd if f( x) = f(x) for all x. We saw that these classes of functions had a particularly

More information

Gaussian process regression for Sensitivity analysis

Gaussian process regression for Sensitivity analysis Gaussian process regression for Sensitivity analysis GPSS Workshop on UQ, Sheffield, September 2016 Nicolas Durrande, Mines St-Étienne, durrande@emse.fr GPSS workshop on UQ GPs for sensitivity analysis

More information

Applied Analysis (APPM 5440): Final exam 1:30pm 4:00pm, Dec. 14, Closed books.

Applied Analysis (APPM 5440): Final exam 1:30pm 4:00pm, Dec. 14, Closed books. Applied Analysis APPM 44: Final exam 1:3pm 4:pm, Dec. 14, 29. Closed books. Problem 1: 2p Set I = [, 1]. Prove that there is a continuous function u on I such that 1 ux 1 x sin ut 2 dt = cosx, x I. Define

More information

Gaussian vectors and central limit theorem

Gaussian vectors and central limit theorem Gaussian vectors and central limit theorem Samy Tindel Purdue University Probability Theory 2 - MA 539 Samy T. Gaussian vectors & CLT Probability Theory 1 / 86 Outline 1 Real Gaussian random variables

More information

Spectral Theory, with an Introduction to Operator Means. William L. Green

Spectral Theory, with an Introduction to Operator Means. William L. Green Spectral Theory, with an Introduction to Operator Means William L. Green January 30, 2008 Contents Introduction............................... 1 Hilbert Space.............................. 4 Linear Maps

More information

Five Mini-Courses on Analysis

Five Mini-Courses on Analysis Christopher Heil Five Mini-Courses on Analysis Metrics, Norms, Inner Products, and Topology Lebesgue Measure and Integral Operator Theory and Functional Analysis Borel and Radon Measures Topological Vector

More information

Diffeomorphic Warping. Ben Recht August 17, 2006 Joint work with Ali Rahimi (Intel)

Diffeomorphic Warping. Ben Recht August 17, 2006 Joint work with Ali Rahimi (Intel) Diffeomorphic Warping Ben Recht August 17, 2006 Joint work with Ali Rahimi (Intel) What Manifold Learning Isn t Common features of Manifold Learning Algorithms: 1-1 charting Dense sampling Geometric Assumptions

More information

SYMPLECTIC GEOMETRY: LECTURE 1

SYMPLECTIC GEOMETRY: LECTURE 1 SYMPLECTIC GEOMETRY: LECTURE 1 LIAT KESSLER 1. Symplectic Linear Algebra Symplectic vector spaces. Let V be a finite dimensional real vector space and ω 2 V, i.e., ω is a bilinear antisymmetric 2-form:

More information

Machine Learning. B. Unsupervised Learning B.2 Dimensionality Reduction. Lars Schmidt-Thieme, Nicolas Schilling

Machine Learning. B. Unsupervised Learning B.2 Dimensionality Reduction. Lars Schmidt-Thieme, Nicolas Schilling Machine Learning B. Unsupervised Learning B.2 Dimensionality Reduction Lars Schmidt-Thieme, Nicolas Schilling Information Systems and Machine Learning Lab (ISMLL) Institute for Computer Science University

More information

Support Vector Machines

Support Vector Machines Wien, June, 2010 Paul Hofmarcher, Stefan Theussl, WU Wien Hofmarcher/Theussl SVM 1/21 Linear Separable Separating Hyperplanes Non-Linear Separable Soft-Margin Hyperplanes Hofmarcher/Theussl SVM 2/21 (SVM)

More information

Paradigms of Probabilistic Modelling

Paradigms of Probabilistic Modelling Paradigms of Probabilistic Modelling Hermann G. Matthies Brunswick, Germany wire@tu-bs.de http://www.wire.tu-bs.de abstract RV-measure.tex,v 4.5 2017/07/06 01:56:46 hgm Exp Overview 2 1. Motivation challenges

More information

About Grupo the Mathematical de Investigación Foundation of Quantum Mechanics

About Grupo the Mathematical de Investigación Foundation of Quantum Mechanics About Grupo the Mathematical de Investigación Foundation of Quantum Mechanics M. Victoria Velasco Collado Departamento de Análisis Matemático Universidad de Granada (Spain) Operator Theory and The Principles

More information

Lecture 22: Variance and Covariance

Lecture 22: Variance and Covariance EE5110 : Probability Foundations for Electrical Engineers July-November 2015 Lecture 22: Variance and Covariance Lecturer: Dr. Krishna Jagannathan Scribes: R.Ravi Kiran In this lecture we will introduce

More information