Positive Definite Kernels: Opportunities and Challenges


Michael McCourt
Department of Mathematical and Statistical Sciences, University of Colorado Denver
CUNY Mathematics Seminar, CUNY Graduate College, April 10, 2014

Acknowledgements

I would like to extend my thanks to John Loustau and Marcello Lucia.

Solving problems numerically

Opportunity: As applications continue to increase in complexity, and computer hardware becomes more powerful and inexpensive, better numerical methods will be needed to conduct the required simulations.

My Strategy: Kernel-based methods are a useful tool for computations including:
- Function approximation and interpolation,
- Integration,
- Partial differential equations (BVPs),
- Machine and statistical learning, and
- Optimization and surrogate modeling.

Topics of Discussion

- Introduce positive definite kernels and their properties
- Discuss the scattered data interpolation problem
- Explain how a kernel basis can be used to solve that problem
- Explore the impact of different kernel choices on the quality of the approximation
- Explore various kernel-based schemes for solving PDEs:
  - Method of fundamental solutions
  - Kernel collocation
  - RBF-based finite differences

What is a kernel?

Kernel functions: essentially, any function of two variables $K(x, z)$ is a kernel. I only care about symmetric kernels: $K(x, z) = K(z, x)$.

Positive definite kernels can be interpreted as functional analysis analogs of positive definite matrices:
$$v^T \mathsf{K} v > 0, \quad \text{or} \quad \int_{\mathbb{R}^n}\int_{\mathbb{R}^n} K(x, z)\,v(x)\,v(z)\, dx\, dz > 0.$$

A reproducing kernel Hilbert space $\mathcal{H}_K$ is a space of functions which are in some way generated by the kernel $K$:
$$\langle K(x, \cdot), f \rangle_{\mathcal{H}_K} = f(x), \quad \text{if } f \in \mathcal{H}_K.$$

Note: this term $K(x, z)$ is unrelated to the kernel (null space) of a matrix.

Positive Definite Kernels

"Positive definite kernel" and "radial basis function" (RBF) are used interchangeably... but:
- Not all kernels are radial: $K(x, z) = \min(x, z) - xz$
- Not all RBFs are positive definite: $\varphi(r) = \sqrt{r^2 + c^2}$

Even so, I may switch back and forth between the terms; thus, unless otherwise specified, kernel = RBF.

Sample Kernels

Because all kernels (for now) are radial, we define $r = \|x - z\|$.

- Gaussian: $K(x, z) = \exp(-\varepsilon^2 r^2)$
- Inverse multiquadric: $K(x, z) = \dfrac{1}{\sqrt{1 + \varepsilon^2 r^2}}$
- $C^2$ Matérn: $K(x, z) = (1 + \varepsilon r)\exp(-\varepsilon r)$
- $C^4$ Wendland: $K(x, z) = (1 - \varepsilon r)_+^6 \left(35(\varepsilon r)^2 + 18\varepsilon r + 3\right)$
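As a quick reference, here is a sketch of these four radial kernels as plain NumPy functions of $r = \|x - z\|$ (my own illustration; the talk itself shows no code):

```python
import numpy as np

# The four sample kernels as functions of r = ||x - z||, with shape parameter ep.
gauss   = lambda r, ep: np.exp(-(ep * r) ** 2)                # Gaussian
imq     = lambda r, ep: 1.0 / np.sqrt(1 + (ep * r) ** 2)      # inverse multiquadric
matern2 = lambda r, ep: (1 + ep * r) * np.exp(-ep * r)        # C^2 Matern
wend4   = lambda r, ep: np.where(ep * r < 1,                  # C^4 Wendland (compact support)
                                 (1 - ep * r) ** 6 * (35 * (ep * r) ** 2 + 18 * ep * r + 3),
                                 0.0)
```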

[figure: plots of the Gaussian, inverse multiquadric, $C^2$ Matérn, and $C^4$ Wendland kernels above] These RBFs may be thought of as a similarity measure.

Fundamental Application (Scattered Data Fitting)

Given data $(x_j, y_j)$, $j = 1, \ldots, N$, with $x_j \in \mathbb{R}^d$, $y_j \in \mathbb{R}$, find a (continuous) function $s$ such that $s(x_j) = y_j$, $j = 1, \ldots, N$.

Fundamental Application (Scattered Data Fitting)

Given data $(x_j, y_j)$, $j = 1, \ldots, N$, with $x_j \in \mathbb{R}^d$, $y_j \in \mathbb{R}$, find a (continuous) function $s$ such that $s(x_j) = y_j$, $j = 1, \ldots, N$.

Consider kernel-based interpolation: the approximation is a linear combination of RBFs (not polynomials),
$$s(x) = \sum_{j=1}^N c_j K(x, x_j), \qquad x \in \Omega \subseteq \mathbb{R}^d,$$
with $K : \Omega \times \Omega \to \mathbb{R}$ a positive definite kernel. To find the $c_j$, solve the interpolation equations
$$s(x_i) = f(x_i) = y_i, \qquad i = 1, \ldots, N,$$
which gives a linear system $\mathsf{K}c = y$; $\mathsf{K}$ is symmetric positive definite.
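A minimal NumPy sketch of this procedure with the Gaussian kernel (the test data and parameter choices are illustrative, not from the talk):

```python
import numpy as np

def gauss_kernel(x, z, ep=3.0):
    """Gaussian kernel matrix K_ij = exp(-ep^2 ||x_i - z_j||^2)."""
    r2 = ((x[:, None, :] - z[None, :, :]) ** 2).sum(-1)
    return np.exp(-ep**2 * r2)

rng = np.random.default_rng(0)
x = np.sort(rng.uniform(-1, 1, 20))[:, None]   # scattered 1D data sites
y = np.cos(3 * x[:, 0])                        # data to interpolate

K = gauss_kernel(x, x)           # symmetric positive definite for distinct points
c = np.linalg.solve(K, y)        # solve Kc = y for the coefficients

xe = np.linspace(-1, 1, 200)[:, None]
s = gauss_kernel(xe, x) @ c      # s(x) = sum_j c_j K(x, x_j)
print(np.max(np.abs(s - np.cos(3 * xe[:, 0]))))   # pointwise interpolation error
```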


Kernels vs. Polynomials

A polynomial interpolation scheme (in 1D) looks like:
$$p_f(x) = a_1 + a_2 x + a_3 x^2 + \cdots + a_n x^{n-1}.$$
The kernel-based scheme takes the form
$$s(x) = c_1 K(x, x_1) + c_2 K(x, x_2) + c_3 K(x, x_3) + \cdots + c_n K(x, x_n).$$
Why is this kernel (RBF) approach preferable?

- Uniqueness is guaranteed in higher dimensions.
- RBF methods are often referred to as meshfree.
- Optimality of these methods is provable in some circumstances.
- Properties such as smoothness are variable.
- The locality of the basis functions can be parameterized.

RBFs in applications: Why aren't there more?

For many problems, kernel-based approximations are appropriate; sometimes they would be optimal. Why are they not more prevalent?

Possible barriers to widespread RBF usage:
1. Mathematical complexity
2. Stability concerns
3. Computational cost
4. Presence of one or more free parameters

Gaussian Kernel Interpolation

Probably the most common kernel in applications is the Gaussian,
$$K(x, z) = \exp\left(-\varepsilon^2 \|x - z\|^2\right).$$
The $\varepsilon$ value determines the locality of the basis functions.

Opportunity: Gaussians are great choices for interpolating very smooth functions.

Problem: for some (potentially very accurate) choices of $\varepsilon$, the interpolation matrix $\mathsf{K}$ may be very ill-conditioned.

Scattered Data Fitting with Gaussians

Gaussians can be spectrally accurate, allowing for high accuracy with a smaller $N$ (number of input points). For some applications, the cost of producing a data point is significant, so this is useful. The choice of shape parameter $\varepsilon$ can have a great effect on the accuracy and conditioning of the interpolant.

Naive Gaussian Interpolation Results

As $\varepsilon \to 0$, the basis functions $K(\cdot, x_i)$ and $K(\cdot, x_j)$ start to resemble each other, producing an interpolation matrix which looks increasingly like a matrix of all ones.
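This effect is easy to observe numerically; a small experiment (my own, with arbitrary parameter choices) shows the condition number of $\mathsf{K}$ exploding as $\varepsilon$ shrinks:

```python
import numpy as np

x = np.linspace(-1, 1, 15)
r2 = (x[:, None] - x[None, :]) ** 2
for ep in [10.0, 1.0, 0.1, 0.01]:
    K = np.exp(-ep**2 * r2)      # entries approach 1 as ep -> 0
    print(f"ep = {ep:5}   cond(K) = {np.linalg.cond(K):.2e}")
```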

Better Gaussian Interpolation Results

This is what we want to see:
- No instability for smaller $\varepsilon$
- Recovering the $\varepsilon \to 0$ polynomial limit [Fornberg, 2002]

How can we produce this result?

Eigenfunction (Mercer) Series

Using Hilbert-Schmidt theory,
$$K(x, z) = e^{-\varepsilon^2 (x - z)^2} = \sum_{n=0}^{\infty} \lambda_n \varphi_n(x) \varphi_n(z)$$
where
$$\lambda_n = \sqrt{\frac{\alpha^2}{\alpha^2 + \delta^2 + \varepsilon^2}} \left(\frac{\varepsilon^2}{\alpha^2 + \delta^2 + \varepsilon^2}\right)^n, \qquad \varphi_n(x) = \gamma_n e^{-\delta^2 x^2} H_n(\alpha\beta x)$$
with $H_n$ the Hermite polynomials,
$$\beta = \left(1 + \left(\frac{2\varepsilon}{\alpha}\right)^2\right)^{1/4}, \qquad \gamma_n = \sqrt{\frac{\beta}{2^n\,\Gamma(n+1)}}, \qquad \delta^2 = \frac{\alpha^2}{2}\left(\beta^2 - 1\right).$$

These eigenpairs solve the Hilbert-Schmidt eigenvalue problem,
$$\int_{\mathbb{R}} e^{-\varepsilon^2(x - z)^2} \varphi_n(z)\,\rho(z)\, dz = \lambda_n \varphi_n(x), \qquad \rho(x) = \frac{\alpha}{\sqrt{\pi}} e^{-\alpha^2 x^2}.$$
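These formulas translate directly into code. Below is a sketch (using SciPy's Hermite polynomials; $\alpha = 1$ and the truncation level are my arbitrary choices) that evaluates $(\lambda_n, \varphi_n)$ and checks the truncated Mercer series against the kernel itself:

```python
import numpy as np
from scipy.special import eval_hermite, gammaln

def gauss_eigenpair(n, x, ep, alpha=1.0):
    """Eigenvalue lambda_n and eigenfunction phi_n(x) of the Gaussian kernel."""
    beta = (1 + (2 * ep / alpha) ** 2) ** 0.25
    delta2 = 0.5 * alpha**2 * (beta**2 - 1)
    denom = alpha**2 + delta2 + ep**2
    lam = np.sqrt(alpha**2 / denom) * (ep**2 / denom) ** n
    # gamma_n = sqrt(beta / (2^n n!)), computed in log space to avoid overflow
    log_gam = 0.5 * (np.log(beta) - n * np.log(2) - gammaln(n + 1))
    phi = np.exp(log_gam - delta2 * x**2) * eval_hermite(n, alpha * beta * x)
    return lam, phi

# sanity check: sum_n lambda_n phi_n(x) phi_n(z) should recover exp(-ep^2 (x-z)^2)
ep, x, z = 1.0, 0.3, -0.2
series = sum(lam * phi[0] * phi[1]
             for lam, phi in (gauss_eigenpair(n, np.array([x, z]), ep) for n in range(40)))
print(series, np.exp(-ep**2 * (x - z) ** 2))
```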

Eigenfunctions in the $\varepsilon \to 0$ limit

Studying the limits as $\varepsilon \to 0$ shows
$$\lim_{\varepsilon\to 0} \beta = \lim_{\varepsilon\to 0}\left(1 + \left(\frac{2\varepsilon}{\alpha}\right)^2\right)^{1/4} = 1, \qquad \lim_{\varepsilon\to 0} \delta^2 = \lim_{\varepsilon\to 0} \frac{\alpha^2}{2}\left(\beta^2 - 1\right) = 0,$$
which in turn allows us to conclude that
$$\lim_{\varepsilon\to 0} \varphi_n(x) = \lim_{\varepsilon\to 0} \gamma_n e^{-\delta^2 x^2} H_n(\alpha\beta x) = \gamma_n H_n(\alpha x).$$
This supports the polynomial limit proved more than 10 years ago.

Infinite smoothness only: this limit is true only for analytic kernels; kernels with finite smoothness (often) have a piecewise polynomial spline limit.

Eigenvalues in the $\varepsilon \to 0$ limit

As $\varepsilon \to 0$, the eigenvalues $\lambda_n$ decay increasingly fast:
$$\lim_{\varepsilon\to 0} \lambda_n = \lim_{\varepsilon\to 0} \sqrt{\frac{\alpha^2}{\alpha^2 + \delta^2 + \varepsilon^2}} \left(\frac{\varepsilon^2}{\alpha^2 + \delta^2 + \varepsilon^2}\right)^n = \left(\frac{1}{\alpha^2}\right)^n \lim_{\varepsilon\to 0} \varepsilon^{2n}.$$
This decay is a major cause of the ill-conditioning in the kernel interpolation problem (we will see this).

Avoiding instability: we can use the eigenexpansion of the Gaussians to create a stable basis for them.
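The geometric decay rate $\lambda_{n+1}/\lambda_n = \varepsilon^2/(\alpha^2 + \delta^2 + \varepsilon^2)$ can be tabulated directly ($\alpha = 1$ here, an arbitrary choice), making the accelerating decay as $\varepsilon \to 0$ visible:

```python
import numpy as np

alpha = 1.0
for ep in [1.0, 0.5, 0.1, 0.01]:
    beta = (1 + (2 * ep / alpha) ** 2) ** 0.25
    delta2 = 0.5 * alpha**2 * (beta**2 - 1)
    ratio = ep**2 / (alpha**2 + delta2 + ep**2)   # lambda_{n+1} / lambda_n
    print(f"ep = {ep:5}   lambda ratio = {ratio:.2e}")
```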

Developing the Hilbert-Schmidt SVD (GaussQR)

The eigenfunction structure is
$$K(x, z) = \sum_{m=0}^{\infty} \lambda_m \varphi_m(x)\varphi_m(z) = \begin{pmatrix}\varphi_1(x) & \cdots & \varphi_N(x) & \cdots\end{pmatrix}\begin{pmatrix}\lambda_1 & & & \\ & \ddots & & \\ & & \lambda_N & \\ & & & \ddots\end{pmatrix}\begin{pmatrix}\varphi_1(z) \\ \vdots \\ \varphi_N(z) \\ \vdots\end{pmatrix}.$$
In nice, compressed vector notation, $K(x, z) = \phi(x)^T \Lambda\, \phi(z)$.

Developing the Hilbert-Schmidt SVD (GaussQR)

The scattered data fitting problem asks us to find an interpolant
$$s(x) = \sum_{k=1}^N c_k K(x, x_k) = k(x)^T c.$$
The coefficient vector $c$ is defined by $\mathsf{K}c = y$:
$$\begin{pmatrix} K(x_1, x_1) & \cdots & K(x_1, x_N) \\ \vdots & & \vdots \\ K(x_N, x_1) & \cdots & K(x_N, x_N)\end{pmatrix}\begin{pmatrix} c_1 \\ \vdots \\ c_N \end{pmatrix} = \begin{pmatrix} y_1 \\ \vdots \\ y_N \end{pmatrix},$$
whose rows are $k(x_j)^T c = y_j$, which is just $s(x_j) = y_j$, $1 \le j \le N$. We can write
$$s(x) = k(x)^T \mathsf{K}^{-1} y.$$

Developing the Hilbert-Schmidt SVD (GaussQR)

We can rewrite each row of $\mathsf{K}$ (our basis functions) as
$$k(x)^T = \begin{pmatrix} K(x, x_1) & \cdots & K(x, x_N)\end{pmatrix} = \begin{pmatrix}\phi(x)^T\Lambda\phi(x_1) & \cdots & \phi(x)^T\Lambda\phi(x_N)\end{pmatrix} = \phi(x)^T \Lambda \begin{pmatrix}\phi(x_1) & \cdots & \phi(x_N)\end{pmatrix}.$$
Stacking these gives the $\mathsf{K}$ matrix
$$\mathsf{K} = \begin{pmatrix} k(x_1)^T \\ \vdots \\ k(x_N)^T \end{pmatrix} = \begin{pmatrix} \phi(x_1)^T \\ \vdots \\ \phi(x_N)^T \end{pmatrix} \Lambda \begin{pmatrix}\phi(x_1) & \cdots & \phi(x_N)\end{pmatrix},$$
which I will write as a matrix eigenexpansion
$$\mathsf{K} = \Phi \Lambda \Phi^T, \qquad k(x)^T = \phi(x)^T \Lambda \Phi^T.$$

Developing the Hilbert-Schmidt SVD (GaussQR)

$\mathsf{K} = \Phi\Lambda\Phi^T$ involves infinite sized matrices... Break $\Phi$ and $\Lambda$ into blocks,
$$\Phi = \begin{pmatrix}\Phi_1 & \Phi_2\end{pmatrix}, \qquad \Lambda = \begin{pmatrix}\Lambda_1 & \\ & \Lambda_2\end{pmatrix},$$
so that $\Phi_1$ and $\Lambda_1$ are $N \times N$. Let's rewrite our basis using this structure:
$$k(x)^T = \phi(x)^T \Lambda \Phi^T = \phi(x)^T \begin{pmatrix}\Lambda_1\Phi_1^T \\ \Lambda_2\Phi_2^T\end{pmatrix} = \phi(x)^T \begin{pmatrix} I_N \\ \Lambda_2\Phi_2^T\Phi_1^{-T}\Lambda_1^{-1}\end{pmatrix}\Lambda_1\Phi_1^T.$$
Define a new function to write
$$\psi(x)^T = \phi(x)^T \begin{pmatrix} I_N \\ \Lambda_2\Phi_2^T\Phi_1^{-T}\Lambda_1^{-1}\end{pmatrix}, \qquad k(x)^T = \psi(x)^T\Lambda_1\Phi_1^T.$$

Developing the Hilbert-Schmidt SVD (GaussQR)

Use this to rewrite $\mathsf{K}$, which was just rows of $k$:
$$\mathsf{K} = \begin{pmatrix} k(x_1)^T \\ \vdots \\ k(x_N)^T\end{pmatrix} = \begin{pmatrix}\psi(x_1)^T \\ \vdots \\ \psi(x_N)^T\end{pmatrix}\Lambda_1\Phi_1^T = \Psi\Lambda_1\Phi_1^T.$$
This $\mathsf{K} = \Psi\Lambda_1\Phi_1^T$ is called the Hilbert-Schmidt SVD. Substituting this into $s$,
$$s(x) = k(x)^T\mathsf{K}^{-1}y = \psi(x)^T\Lambda_1\Phi_1^T\,\Phi_1^{-T}\Lambda_1^{-1}\Psi^{-1}y = \psi(x)^T\Psi^{-1}y,$$
and all the ill-conditioning from $\Lambda_1$ which would normally have appeared in $\mathsf{K}^{-1}$ has been resolved. Our new stable basis is $\psi$.
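Here is a bare-bones 1D sketch of this construction (loosely following the GaussQR idea; the truncation $M = 2N$, $\alpha = 1$, and all names are my choices, and the production GaussQR library makes these choices far more carefully, including a QR factorization of $\Phi$). The key point is that $\Lambda_2\Phi_2^T\Phi_1^{-T}\Lambda_1^{-1}$ only involves eigenvalue ratios $\lambda_m/\lambda_n$ with $m > n$, so no tiny eigenvalue is ever inverted:

```python
import numpy as np
from scipy.special import eval_hermite, gammaln

def hs_svd_interp(x, y, xe, ep, alpha=1.0, M=None):
    """HS-SVD (stable basis) interpolation sketch in 1D: s(xe) = psi(xe)^T Psi^{-1} y."""
    x, xe = np.asarray(x), np.asarray(xe)
    N = len(x)
    M = 2 * N if M is None else M
    beta = (1 + (2 * ep / alpha) ** 2) ** 0.25
    delta2 = 0.5 * alpha**2 * (beta**2 - 1)
    denom = alpha**2 + delta2 + ep**2

    def Phi(pts):  # eigenfunction matrix, columns n = 0..M-1
        n = np.arange(M)
        log_gam = 0.5 * (np.log(beta) - n * np.log(2) - gammaln(n + 1))
        H = np.array([eval_hermite(k, alpha * beta * pts) for k in range(M)]).T
        return np.exp(log_gam[None, :] - delta2 * pts[:, None] ** 2) * H

    P = Phi(x)
    Phi1, Phi2 = P[:, :N], P[:, N:]
    # Lambda2 Phi2^T Phi1^{-T} Lambda1^{-1}: only ratios lambda_{n2}/lambda_{n1} appear,
    # all with positive exponents, so they shrink rather than blow up.
    n1, n2 = np.arange(N), np.arange(N, M)
    ratio = (ep**2 / denom) ** (n2[:, None] - n1[None, :])
    Corr = ratio * np.linalg.solve(Phi1, Phi2).T

    Psi = Phi1 + Phi2 @ Corr             # psi basis evaluated at the data sites
    b = np.linalg.solve(Psi, y)
    Pe = Phi(xe)
    return (Pe[:, :N] + Pe[:, N:] @ Corr) @ b

# small-epsilon interpolation that would wreck the naive K c = y approach
x = np.linspace(-1, 1, 20)
xe = np.linspace(-1, 1, 200)
print(np.max(np.abs(hs_svd_interp(x, np.cos(3 * x), xe, ep=0.1) - np.cos(3 * xe))))
```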

A New, More Stable, Basis

This stable basis $\psi$ allows us to evaluate the interpolant as
$$s(x) = \psi(x)^T\Psi^{-1}y, \quad \text{rather than} \quad s(x) = k(x)^T\left(\Psi\Lambda_1\Phi_1^T\right)^{-1}y.$$

Solving PDEs with Kernels

We have used a linear combination of kernels to solve a scattered data approximation problem. A similar strategy can be used to solve boundary value problems of the form
$$\mathcal{L}s = f \text{ on the interior}, \qquad \mathcal{B}s = g \text{ on the boundary},$$
for $\mathcal{L}$ a linear differential operator.

Note: nonlinear and time-stepping problems can also be solved, though we will not talk about them here.

Solving PDEs numerically

Question: what does it mean to solve a PDE numerically?
$$\mathcal{L}u = f \text{ on the interior}, \qquad \mathcal{B}u = g \text{ on the boundary}.$$

Answer: different communities view different results as solutions:
- Approximate solution values at chosen points.
- Approximate quantities of interest (QoI) within larger simulations.
- An approximate solution function which can be evaluated as needed.

Solving PDEs numerically: approximate solution

Our focus in this talk will be on constructing a function which approximates the true solution.

Goal: create a function
$$s(x) = \sum_{k=1}^N c_k K(x, z_k)$$
which approximately solves the PDE in some sense we deem appropriate.

Appropriate approximation: how do we enforce the PDE and BC on our $s$? What are these $z_k$ centers?

Method of Fundamental Solutions

The first (1950s?) meshfree method for solving PDEs is called the Method of Fundamental Solutions (MFS).

Assumption: our $\mathcal{L}$ operator and our domain support an analytically known Green's function $G(x, z)$, such that $\mathcal{L}G(\cdot, z) = \delta(\cdot - z)$.

Strategy:
- $\mathcal{L} = \nabla^2$: $G(x, z) = 1/\|x - z\|$ (up to a constant, in 3D)
- $\mathcal{L} = \nabla^2 - \lambda^2 I$: $G(x, z)$ is a Bessel function (second kind)

Use $K(x, z_k) = G(x, z_k)$ as the basis functions.

Method of Fundamental Solutions

MFS is appropriate for homogeneous PDEs:
$$\mathcal{L}s = 0 \text{ on the interior}, \qquad \mathcal{B}s = g \text{ on the boundary}.$$
By choosing
$$s(x) = \sum_{k=1}^M c_k G(x, z_k)$$
we see the interior condition is immediately satisfied:
$$\mathcal{L}s(x) = \mathcal{L}\sum_{k=1}^M c_k G(x, z_k) = \sum_{k=1}^M c_k \mathcal{L}G(x, z_k) = \sum_{k=1}^M c_k \delta(x - z_k) = 0,$$
so long as no $z_k$ is in the interior.

Method of Fundamental Solutions

Because of this design, only the $c_k$ are chosen to satisfy the BC. To do this, we fix $N$ collocation points on the boundary, at which we enforce $\mathcal{B}s = g$:
$$\begin{pmatrix} \mathcal{B}G(x_1, z_1) & \cdots & \mathcal{B}G(x_1, z_M) \\ \vdots & & \vdots \\ \mathcal{B}G(x_N, z_1) & \cdots & \mathcal{B}G(x_N, z_M)\end{pmatrix}\begin{pmatrix} c_1 \\ \vdots \\ c_M\end{pmatrix} = \begin{pmatrix} g(x_1) \\ \vdots \\ g(x_N)\end{pmatrix}$$
The shorthand $\mathcal{B}G(x_i, z_j)$ means $\mathcal{B}G(x, z_j)\big|_{x = x_i}$. If $M = N$ this is a square system; usually, we set $M < N$ and solve the least-squares problem.

Method of Fundamental Solutions: Overview

1. Choose $N$ collocation points on the boundary.
2. Form a solution which is a linear combination of Green's functions; there should be $M \le N$ such functions.
3. Center the Green's functions outside of the solution domain.
4. Solve the appropriate linear system (least-squares system) to correctly form the solution.

A theorem exists which guarantees convergence for $M = N$ as $N \to \infty$, subject to possible ill-conditioning. A toy computation following these steps is sketched below.
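A toy MFS computation for the 2D Laplace equation on the unit disk (this particular problem and all parameter choices are mine, not from the talk):

```python
import numpy as np

N, M = 80, 40                                  # boundary collocation points, sources
tb = 2 * np.pi * np.arange(N) / N
ts = 2 * np.pi * np.arange(M) / M
xb = np.c_[np.cos(tb), np.sin(tb)]             # collocation points on the unit circle
zs = 2.0 * np.c_[np.cos(ts), np.sin(ts)]       # Green's function centers outside the domain

def G(x, z):
    """Fundamental solution of the 2D Laplacian: log||x - z|| / (2 pi)."""
    r = np.linalg.norm(x[:, None, :] - z[None, :, :], axis=-1)
    return np.log(r) / (2 * np.pi)

g = xb[:, 0]**2 - xb[:, 1]**2                  # Dirichlet data from the harmonic u = x^2 - y^2
c, *_ = np.linalg.lstsq(G(xb, zs), g, rcond=None)   # M < N least-squares fit

xi = np.array([[0.3, 0.4]])
print(G(xi, zs) @ c, 0.3**2 - 0.4**2)          # interior value vs. exact solution
```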

Method of Fundamental Solutions - Point Distribution

- Collocation points are located on the boundary.
- $G(\cdot, z_k)$ is centered outside of the domain.
- No collocation points are needed in the interior.

MFS Convergence - Modified Helmholtz Problem [figure: convergence of MFS for a modified Helmholtz problem]

Method of Fundamental Solutions - Review

MFS benefits:
- An $\mathbb{R}^d$ PDE is solved as an $\mathbb{R}^{d-1}$ interpolation: cheaper, and easier analysis.
- Choosing $M < N$ is cheaper and (often) more stable.
- No specific point distribution is required.

MFS problems:
- Collocation and source points must be chosen; there may be bad choices.
- The collocation matrix is probably dense.
- Only appropriate for homogeneous PDEs. The Method of Particular Solutions exists for nonhomogeneous problems.

Kernel Collocation

Instead of Green's functions as our solution basis, we will choose other positive definite kernels (Gaussians). Unlike MFS, the collocation points must exist on the interior as well as the boundary, because $\mathcal{L}s = f$ will not be automatically satisfied. Again, two sets of points must be determined:
- $x_i$: the collocation points (interior/boundary)
- $z_j$: the kernel centers

Kernel Collocation - Point Selection

Historical note: almost all papers use $x_k = z_k$; that is, the centers and collocation points are the same, which matches the interpolation setting. Recently, researchers have challenged this strategy, but it is still the standard and the only setting we consider here.

Matching the centers and collocation points ($N = N_L + N_B$) yields:
$$\begin{pmatrix} \mathcal{L}K(x_1, x_1) & \cdots & \mathcal{L}K(x_1, x_N) \\ \vdots & & \vdots \\ \mathcal{L}K(x_{N_L}, x_1) & \cdots & \mathcal{L}K(x_{N_L}, x_N) \\ \mathcal{B}K(x_{N_L+1}, x_1) & \cdots & \mathcal{B}K(x_{N_L+1}, x_N) \\ \vdots & & \vdots \\ \mathcal{B}K(x_{N_L+N_B}, x_1) & \cdots & \mathcal{B}K(x_{N_L+N_B}, x_N)\end{pmatrix}\begin{pmatrix} c_1 \\ \vdots \\ c_N \end{pmatrix} = \begin{pmatrix} f(x_1) \\ \vdots \\ f(x_{N_L}) \\ g(x_{N_L+1}) \\ \vdots \\ g(x_{N_L+N_B})\end{pmatrix}$$
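A minimal instance of this system, for the 1D problem $u'' = f$ on $(-1, 1)$ with $u(\pm 1) = 0$, Gaussian kernels, and centers equal to collocation points (the test problem and parameters are my own):

```python
import numpy as np

ep = 3.0
pts = np.linspace(-1, 1, 25)
xi, xb = pts[1:-1], pts[[0, -1]]     # interior and boundary collocation points
z = pts                              # centers = collocation points

def K(x, z):
    return np.exp(-ep**2 * (x[:, None] - z[None, :]) ** 2)

def LK(x, z):                        # L = d^2/dx^2 applied to the Gaussian kernel
    d = x[:, None] - z[None, :]
    return (4 * ep**4 * d**2 - 2 * ep**2) * np.exp(-ep**2 * d**2)

A = np.vstack([LK(xi, z), K(xb, z)])                       # stacked [LK; BK] system
rhs = np.concatenate([-np.pi**2 * np.sin(np.pi * xi), [0.0, 0.0]])
c = np.linalg.solve(A, rhs)          # may be ill-conditioned; cf. the stable basis

xe = np.linspace(-1, 1, 200)
print(np.max(np.abs(K(xe, z) @ c - np.sin(np.pi * xe))))   # error vs. exact u = sin(pi x)
```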

Kernel Collocation - Stable Basis

Using appropriate shorthand, we can rewrite this system as
$$\begin{pmatrix} \mathsf{K}_L \\ \mathsf{K}_B \end{pmatrix} c = \begin{pmatrix} f \\ g \end{pmatrix}.$$
The HS-SVD allows us to write
$$\mathsf{K}_L = \Psi_L\Lambda_1\Phi_1^T, \qquad \mathsf{K}_B = \Psi_B\Lambda_1\Phi_1^T,$$
because $\Lambda_1\Phi_1^T$ is defined by the centers, not the collocation points. This produces
$$\begin{pmatrix}\Psi_L \\ \Psi_B\end{pmatrix}\Lambda_1\Phi_1^T c = \begin{pmatrix} f \\ g \end{pmatrix}$$
and the solution $s$ is evaluated as
$$s(x) = \psi(x)^T\begin{pmatrix}\Psi_L \\ \Psi_B\end{pmatrix}^{-1}\begin{pmatrix} f \\ g \end{pmatrix} = \psi(x)^T\Psi_{L,B}^{-1}y_{f,g}.$$

Stable Gaussians for Boundary Value Problems

Implementing the stable basis for this system avoids the ill-conditioning which would otherwise plague the collocation solution. This example is a 2D Helmholtz problem, comparing stable (HS-SVD) Gaussian collocation to Trefethen's polynomial method and the standard Gaussian kernel collocation method.

Evaluating the Collocation Solution

Suppose that we wanted to evaluate our PDE solution at our collocation points $x_1, \ldots, x_N$, similarly to a finite difference method:
$$\bar{s} = \begin{pmatrix} s(x_1) \\ \vdots \\ s(x_N)\end{pmatrix} = \begin{pmatrix}\psi(x_1)^T\Psi_{L,B}^{-1}y_{f,g} \\ \vdots \\ \psi(x_N)^T\Psi_{L,B}^{-1}y_{f,g}\end{pmatrix} = \begin{pmatrix}\psi(x_1)^T \\ \vdots \\ \psi(x_N)^T\end{pmatrix}\Psi_{L,B}^{-1}y_{f,g} = \Psi\Psi_{L,B}^{-1}y_{f,g}.$$
This relationship $\bar{s} = \Psi\Psi_{L,B}^{-1}y_{f,g}$ suggests that $\Psi\Psi_{L,B}^{-1}$ plays a similar role as the standard finite difference matrix (plus boundary conditions).

RBF Differentiation Matrices

Observation: applying $\Psi\Psi_{L,B}^{-1}$ to $y_{f,g}$ evaluates a function (the approximate solution $s$) when given data (the PDE's right-hand side $y_{f,g}$).

Idea: could we reverse the process? That is, given function values, could we evaluate derivatives of that function?

Given data $y$ at points $x_1, \ldots, x_N$, we can form an associated interpolant $s = \psi(\cdot)^T\Psi^{-1}y$. Applying a derivative operator $D$ to this produces $Ds = D\psi(\cdot)^T\Psi^{-1}y$.

RBF Differentiation Matrices

Evaluating the derivative of our interpolant at the interpolation points:
$$\bar{s}_D = \begin{pmatrix} Ds(x_1) \\ \vdots \\ Ds(x_N)\end{pmatrix} = \begin{pmatrix} D\psi(x_1)^T\Psi^{-1}y \\ \vdots \\ D\psi(x_N)^T\Psi^{-1}y\end{pmatrix} = \begin{pmatrix} D\psi(x_1)^T \\ \vdots \\ D\psi(x_N)^T\end{pmatrix}\Psi^{-1}y = \Psi_D\Psi^{-1}y.$$
Our differentiation matrix $\Psi_D\Psi^{-1}$ takes in data from a function $f$ (evaluated at $x_1, \ldots, x_N$ and stored in $y$) and returns an approximation of $Df$ at the same locations.
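The same construction written in the plain kernel basis rather than the stable $\psi$ basis (simpler to show, at the cost of conditioning; the algebra $D = \mathsf{K}_D\mathsf{K}^{-1}$ is identical, and all parameters here are illustrative):

```python
import numpy as np

ep = 3.0
x = np.linspace(-1, 1, 15)
d = x[:, None] - x[None, :]

K = np.exp(-ep**2 * d**2)
KD = -2 * ep**2 * d * K                  # d/dx applied to each Gaussian basis function
D = np.linalg.solve(K.T, KD.T).T         # D = KD K^{-1} without forming an inverse

y = np.sin(2 * x)
print(np.max(np.abs(D @ y - 2 * np.cos(2 * x))))   # D y approximates y'
```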

RBF Pseudospectral Solver

We can expand our notation to evaluate $Ds$ at any location, not just the input data locations:
- $\Psi_{\mathcal{L}}^{I}$ evaluates $\mathcal{L}\psi$ at the interior collocation points;
- $\Psi_{\mathcal{B}}^{B}$ evaluates $\mathcal{B}\psi$ at the boundary collocation points.

Using this, the PDE in pseudospectral form could be written as
$$\begin{pmatrix}\Psi_{\mathcal{L}}^{I} \\ \Psi_{\mathcal{B}}^{B}\end{pmatrix}\Psi^{-1}\bar{s} = \begin{pmatrix} f \\ g \end{pmatrix}, \quad \text{because} \quad \mathcal{L}s = f, \ \mathcal{B}s = g.$$
This is the approach used by Trefethen to describe the differentiation matrix structure of polynomial collocation methods.

RBF Finite Differences

This design of RBF differentiation matrices motivates one of the most promising new developments in meshfree PDE solvers: RBF-FD.

Strategy: define a localized differential operator (analogous to finite differences) which can be used to approximate derivatives.

Benefits:
- No prescribed point distribution is required (e.g., uniform points for finite differences, Chebyshev nodes for polynomials).
- The locality of the operator is a free parameter, and could even vary within a single PDE.
- Unlike collocation-based operators, RBF-FD matrices would be sparse.

RBF Finite Differences - Point Selection

Each point uses nearby points to approximate a differential operator.

RBF Finite Differences - Structure

For each $x_k$, nearby points $X_k$ are chosen to approximate the derivatives, and a one-row differentiation matrix is computed for each point:
$$\Psi_{\mathcal{L}}^{x_k}\left(\Psi^{X_k}\right)^{-1}.$$
This matrix has the effect of taking in nearby function values and producing a linear combination of them to approximate the $\mathcal{L}$ operator at $x_k$.
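One stencil row can be computed as follows: the weights $w$ solve $K(X_k, X_k)\,w = \mathcal{L}K(\cdot, X_k)\big|_{x_k}$, so that $w^T u(X_k) \approx (\mathcal{L}u)(x_k)$. A 1D sketch with $\mathcal{L} = d^2/dx^2$ and no polynomial augmentation (which practical RBF-FD codes usually add); the stencil and parameters are my own:

```python
import numpy as np

ep = 5.0
Xk = np.array([-0.1, -0.05, 0.0, 0.05, 0.1])   # x_k = 0 and its four neighbors
xk = 0.0

d = Xk[:, None] - Xk[None, :]
K = np.exp(-ep**2 * d**2)
dk = xk - Xk
LK = (4 * ep**4 * dk**2 - 2 * ep**2) * np.exp(-ep**2 * dk**2)
w = np.linalg.solve(K, LK)                     # one row of the sparse RBF-FD operator

u = np.cos(3 * Xk)
print(w @ u, -9 * np.cos(3 * xk))              # w^T u approximates u''(0) = -9
```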

RBF Finite Differences - Sparsity Pattern

This seems like a really crummy structure which will be much worse than finite differences...

RBF Finite Differences - Improved Sparsity Pattern

But a simple reordering of our indices produces the familiar banded structure.

RBF Finite Differences - Convergence

Multiple meanings:
- As the number of points used in computing the finite difference stencil increases, this method should converge to traditional collocation.
- For a fixed number of neighbors, but an increasingly large number of collocation points, convergence should occur until saturation. Eventually the quality will be bounded by the limited domain over which points are used to approximate the derivatives.

A problem of scale: for increasingly dense points, different $\varepsilon$ shape parameters (which control the kernel locality) may be needed to maintain convergence order.

RBF Finite Differences - Overview

1. A neighborhood is defined around each collocation point.
2. A linear combination of the function values in that neighborhood is used to approximate the derivative at that point.
3. This produces a sparse operator which can be used to solve boundary value problems.
4. Choosing a neighborhood optimally, and then choosing the kernel to use for the approximation in that neighborhood ($\varepsilon$), is still an open problem.

Review

- Positive definite kernels (RBFs) can be used as a basis for scattered data interpolation and PDE collocation. Some kernels see severe ill-conditioning.
- A change of basis using the Hilbert-Schmidt SVD can circumvent the traditional ill-conditioning; mostly for Gaussians, for now...
- The Method of Fundamental Solutions is the original kernel-based PDE solver.
- Collocation schemes can be constructed using the same strategy as interpolation. Less theory exists in this setting.
- Using this concept, we can define RBF differentiation matrices.
- RBF-FD produces localized differential operators, comparable to standard PDE solvers, but in a meshfree setting.
