Optimal data-independent point locations for RBF interpolation

S. De Marchi, R. Schaback and H. Wendland
Università di Verona (Italy), Universität Göttingen (Germany)

Metodi di Approssimazione: lecture of 11 May 2004 – p. 1/26
Preliminaries

Given distinct data sites $X = \{x_1, \dots, x_N\} \subset \Omega \subseteq \mathbb{R}^d$ and data values $f(x_1), \dots, f(x_N)$ to be interpolated.

RBF interpolation (easiest case): fix a symmetric positive definite kernel $\Phi : \Omega \times \Omega \to \mathbb{R}$ and form

$$s_{f,X} = \sum_{j=1}^{N} a_j \Phi(\cdot, x_j), \qquad s_{f,X}(x_k) = f(x_k), \quad 1 \le k \le N. \quad (1)$$

If the interpolation matrix $A = (\Phi(x_j, x_k))_{j,k}$ is positive definite for every choice of distinct data sites, then it is invertible and $\Phi$ is called a positive definite (PD) kernel. $\Phi$ is often radial, $\Phi(x, y) = \phi(\|x - y\|_2)$, and therefore defined on all of $\mathbb{R}^d$. Every conditionally positive definite (CPD) kernel has an associated normalized PD kernel.
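The system (1) can be sketched in a few lines of NumPy. This is an illustration, not the authors' code; it assumes the Gaussian kernel $\Phi(x, y) = \exp(-\|x - y\|_2^2)$ that appears in the examples later.

```python
import numpy as np

def gaussian_kernel(X, Y, scale=1.0):
    """Phi(x, y) = exp(-||x - y||^2 / scale^2), evaluated pairwise."""
    d2 = ((X[:, None, :] - Y[None, :, :]) ** 2).sum(-1)
    return np.exp(-d2 / scale**2)

def rbf_interpolant(X, f_vals, scale=1.0):
    """Solve the system (1): A a = f with A_jk = Phi(x_j, x_k)."""
    A = gaussian_kernel(X, X, scale)        # symmetric PD interpolation matrix
    a = np.linalg.solve(A, f_vals)
    return lambda Y: gaussian_kernel(Y, X, scale) @ a

# interpolate f(x) = sin(pi x) at 5 data sites in [0, 1]
X = np.linspace(0.0, 1.0, 5)[:, None]
f_vals = np.sin(np.pi * X[:, 0])
s = rbf_interpolant(X, f_vals)
residual = np.max(np.abs(s(X) - f_vals))    # interpolation conditions hold
```

For production use, a library routine such as `scipy.interpolate.RBFInterpolator` would be the natural choice; the direct solve above is only meant to mirror (1).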
Some useful notation

Take $V_X = \mathrm{span}\{\Phi(\cdot, x_j) : x_j \in X\}$. The interpolant can be written in terms of cardinal functions $u_j \in V_X$, $u_j(x_k) = \delta_{jk}$, i.e.

$$s_{f,X} = \sum_{j=1}^{N} f(x_j)\, u_j.$$

For the purpose of stability and error analysis the following quantities are important:

separation distance: $q_X = \frac{1}{2} \min_{j \ne k} \|x_j - x_k\|_2$,
fill distance: $h_{X,\Omega} = \sup_{x \in \Omega} \min_{x_j \in X} \|x - x_j\|_2$,
uniformity: $\rho_{X,\Omega} = q_X / h_{X,\Omega}$.
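All three quantities are easy to compute for a finite point set; a small sketch, under the assumption that $\Omega$ is represented by a dense evaluation grid:

```python
import numpy as np

def separation_distance(X):
    """q_X = (1/2) * min_{j != k} ||x_j - x_k||_2."""
    D = np.linalg.norm(X[:, None, :] - X[None, :, :], axis=-1)
    np.fill_diagonal(D, np.inf)             # ignore the zero diagonal
    return 0.5 * D.min()

def fill_distance(X, Omega):
    """h_{X,Omega} = sup_{x in Omega} min_j ||x - x_j||_2,
    with the sup taken over a discretization Omega."""
    D = np.linalg.norm(Omega[:, None, :] - X[None, :, :], axis=-1)
    return D.min(axis=1).max()

X = np.array([[0.0], [0.5], [1.0]])         # three sites in [0, 1]
Omega = np.linspace(0.0, 1.0, 101)[:, None]
q = separation_distance(X)                  # 0.25
h = fill_distance(X, Omega)                 # 0.25
uniformity = q / h                          # 1.0: a perfectly uniform set
```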
The problem

Are there any good or even optimal point sets for the interpolation problem?
Literature

1. BEYER, A. Optimale Centerverteilung bei Interpolation mit radialen Basisfunktionen. Diplomarbeit, Universität Göttingen, 1994.

2. BOS, L. P., AND MAIER, U. On the asymptotics of points which maximize determinants of the form det$(g(|x_i - x_j|))$. In Advances in Multivariate Approximation (Berlin, 1999), W. Haussmann, K. Jetter, and M. Reimer, Eds., vol. 107 of Math. Res., Wiley-VCH, pp. 1–3.

3. ISKE, A. Optimal distribution of centers for radial basis function methods. Tech. Rep. M0004, Technische Universität München, 2000.
Literature

1. BEYER considered numerical aspects of the problem.

2. BOS AND MAIER investigated Fekete-type points for univariate RBFs for a broad class of functions, proving that equally spaced points give asymptotically largest determinants for the interpolation matrix.

3. ISKE constructed and characterized admissible sets by varying the centers for stability and quality of approximation by RBFs, proving that uniformly distributed points give better results. He also provided a bound for the uniformity $\rho_{X,\Omega}$ in terms of the space dimension $d$.
Our approach

(I) Power function estimates
(II) Geometric arguments
Power function estimates

The kernel $\Phi$ defines on the space $V_X = \mathrm{span}\{\Phi(\cdot, x_j)\}$ an inner product via

$$(\Phi(\cdot, x), \Phi(\cdot, y))_\Phi = \Phi(x, y),$$

so that $\Phi$ is a reproducing kernel: $(f, \Phi(\cdot, x))_\Phi = f(x)$ for all $f$ in the native Hilbert space $\mathcal{N}_\Phi$. If $s_{f,X}$ interpolates $f \in \mathcal{N}_\Phi$ on $X$, then

$$f(x) - s_{f,X}(x) = \Big(f,\ \Phi(\cdot, x) - \sum_{j=1}^{N} u_j(x)\, \Phi(\cdot, x_j)\Big)_\Phi,$$

and by the Cauchy–Schwarz inequality

$$|f(x) - s_{f,X}(x)| \le P_X(x)\, \|f\|_\Phi, \quad (2)$$

where $P_X(x) = \big\|\Phi(\cdot, x) - \sum_j u_j(x)\, \Phi(\cdot, x_j)\big\|_\Phi$ is the power function.
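For a PD kernel the power function also has the closed form $P_X(x)^2 = \Phi(x, x) - k(x)^{\mathsf T} A^{-1} k(x)$ with $k(x) = (\Phi(x, x_j))_j$, which follows from the cardinal-function representation. A sketch, assuming the Gaussian kernel (so $\Phi(x, x) = 1$):

```python
import numpy as np

def gauss(X, Y):
    """Gaussian kernel with scale 1, evaluated pairwise."""
    return np.exp(-((X[:, None, :] - Y[None, :, :]) ** 2).sum(-1))

def power_function(x_eval, X, kernel=gauss):
    """P_X(x)^2 = Phi(x, x) - k(x)^T A^{-1} k(x): the norm of the
    pointwise error functional at each evaluation point."""
    A = kernel(X, X)                        # interpolation matrix
    K = kernel(x_eval, X)                   # rows are k(x)^T
    quad = np.einsum('ij,ij->i', K, np.linalg.solve(A, K.T).T)
    P2 = 1.0 - quad                         # Phi(x, x) = 1 for the Gaussian
    return np.sqrt(np.maximum(P2, 0.0))     # clip tiny negative round-off

rng = np.random.default_rng(0)
X = rng.uniform(-1, 1, (10, 2))             # 10 data sites in [-1, 1]^2
grid = rng.uniform(-1, 1, (500, 2))         # evaluation points
P = power_function(grid, X)                 # 0 <= P <= sqrt(Phi(x, x)) = 1
```

By construction $P_X$ vanishes at the data sites and is largest in the holes of $X$, which is what both greedy methods below exploit.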
Some properties of the power function

1. $P_X(x)$ is the norm of the pointwise error functional at $x$;
2. Error estimates bound $P_X$ in terms of the fill distance $h = h_{X,\Omega}$;
3. If $\Phi$ is translation invariant, integrable and has a Fourier transform $\hat\Phi$ such that

$$c_1 (1 + \|\omega\|_2^2)^{-\tau} \le \hat\Phi(\omega) \le c_2 (1 + \|\omega\|_2^2)^{-\tau}, \qquad \tau > d/2,$$

then $\mathcal{N}_\Phi$ is norm-equivalent to the Sobolev space $H^\tau(\mathbb{R}^d)$, and therefore

$$P_X(x) \le C\, h^{\tau - d/2}. \quad (3)$$
Main result

The hypotheses on $\Phi$ and $\Omega$ are as before.

Theorem 1. For every $\tau > d/2$ there exists a constant $c > 0$ with the following property: if $\varepsilon > 0$ and $X = \{x_1, \dots, x_N\} \subset \Omega$ are given such that

$$P_X(x) \le \varepsilon \quad \text{for all } x \in \Omega, \quad (4)$$

then the fill distance of $X$ satisfies

$$h_{X,\Omega} \le c\, \varepsilon^{1/(\tau - d/2)}. \quad (5)$$

Comment: optimally distributed data sites are sets that cannot leave a large region of $\Omega$ without centers, i.e. $h_{X,\Omega}$ is small whenever $\varepsilon$ is sufficiently small.
Quasi-uniformity and fill distance

The previous theorem fails in two situations: when $\hat\Phi$ decays faster than any polynomial, e.g. when $\Phi$ is the Gaussian, and when $\hat\Phi$ has compact support (cf. Paley–Wiener theory); in both cases we do not get a bound of the form (3).

Now, assuming that $X$ is already quasi-uniform, i.e. $\rho_{X,\Omega}$ is bounded, we can define for every $x \in \Omega$ a function $f_x \in \mathcal{N}_\Phi$ for which there is equality in (2). Hence, the assumption on the approximation properties of the set gives lower bounds on the power function, and the desired results follow.
The Greedy Method (GM)

Idea: we generate larger and larger data sets by adding the maxima of the power function w.r.t. the preceding set. This method produces well-distributed point sets.

Greedy Algorithm (GA):
starting step: $X_1 = \{x_1\}$, $x_1 \in \Omega$ arbitrary;
iteration step: $X_{n+1} = X_n \cup \{x_{n+1}\}$ with $P_{X_n}(x_{n+1}) = \|P_{X_n}\|_{L_\infty(\Omega)}$.

Convergence: we hope that $\|P_{X_n}\|_{L_\infty(\Omega)} \to 0$ as $n \to \infty$, e.g. when $\Omega$ is compact and convex.
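On a discretized domain the GA is straightforward; a sketch, under the assumption that the sup over $\Omega$ is replaced by a max over a candidate grid, with the Gaussian kernel of scale 1:

```python
import numpy as np

def gauss(X, Y):
    return np.exp(-((X[:, None, :] - Y[None, :, :]) ** 2).sum(-1))

def greedy_points(Omega, n_points, kernel=gauss):
    """GA sketch: X_{n+1} = X_n + {argmax of the power function},
    where Omega is an (M, d) array of candidate points."""
    idx = [0]                               # arbitrary starting site x_1
    maxima = []                             # ||P_{X_n}||_inf along the run
    for _ in range(n_points - 1):
        X = Omega[idx]
        A = kernel(X, X)
        K = kernel(Omega, X)
        # P_X(x)^2 = 1 - k(x)^T A^{-1} k(x) for the scale-1 Gaussian
        P2 = 1.0 - np.einsum('ij,ij->i', K, np.linalg.solve(A, K.T).T)
        j = int(np.argmax(P2))
        maxima.append(np.sqrt(max(P2[j], 0.0)))
        idx.append(j)
    return Omega[idx], np.array(maxima)

g = np.linspace(-1, 1, 21)                  # candidate grid on [-1, 1]^2
Omega = np.stack(np.meshgrid(g, g), -1).reshape(-1, 2)
pts, maxima = greedy_points(Omega, 20)
```

Since adding a point can only decrease the power function pointwise, the recorded maxima form a non-increasing sequence; this is the quantity whose decay Theorem 2 below controls.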
The greedy algorithm converges

Theorem 2. Suppose $\Omega \subset \mathbb{R}^d$ is convex and compact and satisfies an interior cone condition, and suppose $\Phi$ is a positive definite kernel on $\Omega$. Then the greedy algorithm converges at least like

$$\|P_{X_n}\|_{L_\infty(\Omega)} \le C\, n^{-1/(2d)}$$

with a constant $C > 0$ independent of $n$.
Geometric Greedy Method (GGM)

Notice: practical experiments show that the greedy minimization of the power function fills the currently largest hole in the data set, placing the new point close to the center of the hole.

Geometric Greedy Algorithm (GGA):
starting step: $X_1 = \{x_1\}$, $x_1 \in \Omega$ arbitrary;
iteration step: given $X_n$, form $X_{n+1} = X_n \cup \{x_{n+1}\}$, where $x_{n+1} \in \Omega$ satisfies $\mathrm{dist}(x_{n+1}, X_n) = \max_{x \in \Omega} \mathrm{dist}(x, X_n)$.

Remark: the algorithm works very well, producing subsets of $\Omega$ with small fill distance and large separation distance.
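Unlike the GA, the GGA needs only distances, no kernel. A minimal sketch on a discretized $\Omega$, together with a numerical check that the resulting set is quasi-uniform in the sense of Lemma 1 ($q_n \ge h_n/2$):

```python
import numpy as np

def geometric_greedy(Omega, n_points):
    """GGA sketch: each new point maximizes the distance to the current
    set, i.e. it lands in the center of the largest hole."""
    idx = [0]                               # arbitrary starting point
    dist = np.linalg.norm(Omega - Omega[0], axis=1)
    for _ in range(n_points - 1):
        j = int(np.argmax(dist))            # center of the largest hole
        idx.append(j)
        dist = np.minimum(dist, np.linalg.norm(Omega - Omega[j], axis=1))
    return Omega[idx]

g = np.linspace(-1, 1, 41)                  # discretization of [-1, 1]^2
Omega = np.stack(np.meshgrid(g, g), -1).reshape(-1, 2)
X = geometric_greedy(Omega, 30)

# quasi-uniformity check on the discretized domain
D = np.linalg.norm(X[:, None, :] - X[None, :, :], axis=-1)
np.fill_diagonal(D, np.inf)
q = 0.5 * D.min()                           # separation distance q_n
h = np.linalg.norm(Omega[:, None, :] - X[None, :, :],
                   axis=-1).min(1).max()    # fill distance h_n
```

Keeping a running vector of distances to the current set makes each step $O(M)$ in the number of candidates, so the whole run costs $O(nM)$.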
Convergence of the GGA

Define $h_n = h_{X_n,\Omega}$ and $q_n = q_{X_n}$.

Lemma 1. The GGA produces point sets which are quasi-uniform. To be more precise,

$$q_n \ge \tfrac{1}{2}\, h_{n-1} \ge \tfrac{1}{2}\, h_n \quad \text{for all } n \ge 2.$$
Remarks

If $\Omega$ is a bounded region in $\mathbb{R}^d$, the GGA constructs asymptotically uniformly distributed data sets that cover $\Omega$ in an asymptotically optimal way: since the balls $B(x_j, q_n)$, $x_j \in X_n$, are disjoint, we find

$$n \cdot \mathrm{vol}(B(0, q_n)) \le \mathrm{vol}(\Omega^*)$$

for a slightly enlarged $\Omega^* \supseteq \Omega$, while the balls $B(x_j, h_n)$ cover $\Omega$, so that

$$\mathrm{vol}(\Omega) \le n \cdot \mathrm{vol}(B(0, h_n)).$$

Together with Lemma 1, this shows that both $q_n$ and $h_n$ decay asymptotically like $n^{-1/d}$.
Examples

The candidate points are discretized on a regular grid of $[-1, 1]^2$. The kernels are: the Gaussian (with scale 1) and Wendland's function (with scale 1.5).

Greedy method (GM): executed until the maximum of the power function falls below a prescribed tolerance.

Geometric greedy method (GGM): the sets are computed by the GGA, while the error is evaluated on this point set.
GM and GGM: Gaussian I

Figure 1: Gaussian: (left) the N = 48 optimal points; (right) the error as a function of N.
GM and GGM: Gaussian II

Figure 2: Gaussian: (left) the N = 13 optimal points; (right) the power function, with the locations where its maxima are taken.
GM and GGM: Gaussian III

Figure 3: Gaussian: (left) the geometric greedy data; (right) the error, which is larger by a factor 4.
GM and GGM: Wendland's function I

Figure 4: Wendland's function: (left) the N = 100 optimal points; (right) the error as a function of N.
GM and GGM: Wendland's function II

Figure 5: Wendland's function: (left) the geometric greedy data; (right) the error, which is larger by a factor 14.
Distances

Figure 6: Gaussian: separation distances and fill distances for the geometric and greedy methods.
Distances

Figure 7: Inverse multiquadrics: separation distances and fill distances for the geometric and greedy methods.
Distances

Figure 8: Wendland's function: separation distances and fill distances for the geometric and greedy methods.
Final remarks

The GGA is independent of the kernel, and we proved that it generates asymptotically optimal sequences.

It is still inferior to the GA, which takes the maxima of the power function.

So far, we have no proof that the GGA generates a sequence with the asymptotic optimality required by Theorem 1.