On Nonrigid Shape Similarity and Correspondence


Alon Shtern and Ron Kimmel

February 19, 2018

arXiv: v1 [cs.cv] 18 Nov 2013

Figure 1: Use of correspondence for symmetry detection and texture transfer. The two intrinsically symmetric halves of a human face were found by mapping the shape (left) to itself. Textures from two faces (middle) were transferred to each half (right).

Abstract

An important operation in geometry processing is finding the correspondences between pairs of shapes. The Gromov-Hausdorff distance, a measure of dissimilarity between metric spaces, has been found to be highly useful for nonrigid shape comparison. Here, we explore the applicability of related shape similarity measures to the problem of shape correspondence, adopting spectral-type distances. We propose to evaluate the spectral kernel distance, the spectral embedding distance, and the novel spectral quasi-conformal distance, comparing the manifolds from different viewpoints. By matching the shapes in the spectral domain, important attributes of surface structure are aligned. For the purpose of testing our ideas, we introduce a fully automatic framework for finding the intrinsic correspondence between two shapes. The proposed method achieves state-of-the-art results on the Princeton isometric shape matching protocol applied, as usual, to the TOSCA and SCAPE benchmarks.

1 Introduction

Correspondence detection between pairs of shapes lies at the heart of many operations in the field of geometry processing. The problem of acquiring correspondence between rigid shapes has been widely addressed in the literature. As for non-rigid shapes, this problem remains difficult even when the space of deformations is narrowed to nearly isometric surfaces, which approximately preserve the geodesic distances between corresponding points on each shape.

A common approach for shape matching is to define a measure of dissimilarity between shapes modeled as 2-manifolds. The well-established Gromov-Hausdorff distance measures the maximum geodesic discrepancy between pairs of corresponding points of the two given shapes [MS05, BBK06]. The point-wise map can be inferred as a byproduct of the evaluation of the Gromov-Hausdorff distance. This approach was embraced by the Generalized Multi-Dimensional Scaling (GMDS) framework [BBK06], and by its recently proposed Spectral GMDS implicit alternative, in which the distance matching problem is treated in the dual intrinsic spectral domains of the given shapes [ADK13]. Within the Gromov-Hausdorff framework, Bronstein et al. [BBM+10] suggested replacing the geodesic distance by the diffusion distance [CLL+05], exploiting the apparent stability of diffusion distances to local changes in the topology of the shape. Despite its generality and theoretical beauty, it has been a challenge to apply the Gromov-Hausdorff framework in a straightforward manner to shape matching, mainly due to its intrinsically combinatorial nature.

Bérard et al. [BBG94] exhibited the spectral embedding of Riemannian manifolds by their heat kernel. They embedded the manifolds into a compatible common Euclidean space and proved that the Hausdorff distance in that space is a metric between isometry classes of Riemannian manifolds, which means, in particular, that two manifolds are at zero distance if and only if they are isometric.

Lipman et al. introduced the conformal Wasserstein distance between surfaces with restricted disk or sphere topology [LD11, LF09], utilizing the Möbius transform. The definition of the conformal Wasserstein distance is based on the well-known class of transportation distances [Mum91, RTG00] (a.k.a. Wasserstein-Kantorovich-Rubinstein or Earth Mover's Distance). Several authors have suggested generalizing the class of Möbius transforms, e.g. using least squares conformal maps [WWJ+07], bounding the conformal factor [AKZ13], or controlling the conformal distortion [Lip12, ZG11, Ahl66]. Zhu et al. suggested applying post-process flattening and imposing area preserving constraints [ZHT03]. However, it is known that in the general case, a mapping cannot be both angle-preserving and area-preserving.

Kasue and Kumura [KK94] extended the Gromov-Hausdorff distance framework to the class of spectral methods. The spectral kernel distance was constructed by replacing the metric defined on the manifolds with the heat kernel. Recently, Mémoli [Mém09] introduced the spectral Gromov-Wasserstein distance, applying the theory of mass transportation. The spectral Gromov-Wasserstein distance via the comparison of heat kernels satisfies all properties of a metric on the class of isometric manifolds.

Here, we address the problem of shape correspondence in the context of shape similarity. We find that comparing quasi-conformal quantities provides a solid initial set of matched points. Finer point-wise correspondence between shapes is detected by matching their respective spectral kernels.

Finally, accurate dense correspondence can be achieved by the simultaneous evaluation of the spectral kernel and the spectral embedding distances.

At a high level, the spectral kernel based approach substitutes the reality of geodesic distances on the surface with the notion of spectral connectivity. Many options for spectral connectivity can be considered, depending on the underlying application. One possibility is to use the heat kernel, which provides a natural notion of scale, useful for multiscale shape comparison. Spectral distances, such as the diffusion distance [BBM+10] or the interpolated geodesic distances [AK13b], are alternative options. In that perspective, the spectral kernel approach can be interpreted as a generalization of the spectral GMDS [ADK13] framework. For the task of point-wise correspondence, our empirical observations indicate that the commute time kernel (a.k.a. GPS kernel) [QH07, Rus07] makes a prominent choice, as it adapts well to the large isometric deformations exhibited in the data sets we analyzed.

Moreover, we try to bridge the gap between area-preserving and angle-preserving mappings. It is well known that conformal mappings are "similarities in the small", in the sense that over small regions the image differs from the original by a scale factor alone [GWC+04]. However, non-infinitesimal areas of the surface may be considerably enlarged or reduced. Here, we try to account for these seemingly conflicting constraints, endorsing a spectral embedding design that includes a natural balance between area-preserving and conformal requirements.

1.1 Contribution

Our main contribution is a novel formulation and practical evaluation of shape similarity measures. We contemplate different objective functions, based on the following types of spectral distances.

- The spectral kernel distance.
- The spectral embedding distance.
- The spectral quasi-conformal distance.

We introduce three optimization procedures for evaluating shape similarity. First, the Spectral Quasi-Conformal Maps, which uses matched eigenfunctions for the estimation of the spectral quasi-conformal distance. Second, the Spectral Kernel Maps, which evaluates the spectral kernel distance. Third, the functional Spectral Kernel Maps, which uses a functional representation for the evaluation of the spectral embedding distance and the spectral kernel distance. Furthermore, we present a fully automatic framework for correspondence detection that is robust to isometric deformations and achieves state-of-the-art performance.

We emphasize that the spectral embedding metric and the spectral kernel distance express different attributes of shape structure. At one end, spectral embedding presents excellent point identification qualities.

At the other end, comparing spectral kernels reflects the geometric connectivity of points on the surfaces. The functional spectral kernel maps algorithm alternates between kernel distance evaluation and spectral embedding, and thus combines the merits of both.

1.2 Related work

1.3 Spectral kernels

1.3.1 Laplace-Beltrami eigendecomposition and the heat kernel

Let X be a compact two-dimensional Riemannian manifold. The divergence of the gradient of a function f over the manifold,

$\Delta f \triangleq -\mathrm{div}(\mathrm{grad}\, f),$

is called the Laplace-Beltrami operator (LBO) of f, and can be considered as a generalization of the standard notion of the Laplace operator to compact Riemannian manifolds [Bel64, Tau95, Lév06]. The Laplace-Beltrami eigenvalue problem is given by $\Delta\varphi_i = \lambda_i\varphi_i$. By definition, $\Delta$ is a positive semidefinite operator. Therefore, its eigendecomposition consists of non-negative eigenvalues $\lambda_i$. We can order the eigenvalues as $0 = \lambda_0 < \lambda_1 < \cdots < \lambda_i < \cdots$. The set of corresponding eigenfunctions $\{\varphi_0, \varphi_1, \dots, \varphi_i, \dots\}$ forms an orthonormal basis, such that $\int_X \varphi_i(x)\varphi_j(x)\, dV_X = \delta_{ij}$, where $dV_X$ is the volume element on the manifold X.

The solution of the heat equation $\frac{\partial u}{\partial t} = -\Delta u$, with a point heat source at $x \in X$ (the heat value at point $x' \in X$ after time $t > 0$), is the heat kernel $K_t(x, x')$. The heat kernel can be represented in terms of the Laplace-Beltrami eigenbasis as [MR51]

$K_t(x, x') = \sum_i e^{-\lambda_i t}\, \varphi_i(x)\, \varphi_i(x').$

The heat kernel embedding $\Phi_t : x \to \ell^2$ maps the shape into the infinite dimensional spectral space

$\Phi_t(x) \triangleq \Phi(x)\, e^{-\Lambda t/2},$

where $\Phi(x) \triangleq (\varphi_1(x), \varphi_2(x), \dots, \varphi_i(x), \dots)$ and $\Lambda \triangleq \mathrm{diag}(\lambda_1, \lambda_2, \dots, \lambda_i, \dots)$. Therefore, the heat kernel can be represented as the inner product of heat kernel embeddings,

$K_t(x, x') = \langle \Phi_t(x), \Phi_t(x') \rangle.$
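To make the eigendecomposition concrete, the following is a minimal sketch that assembles a cotangent Laplace-Beltrami operator on a triangle mesh and evaluates the truncated heat kernel from its lowest eigenpairs. It is an illustration under our own assumptions (NumPy/SciPy, lumped barycentric mass matrix, truncation to a few eigenpairs), not the implementation used in the paper; all function names are ours.

```python
import numpy as np
import scipy.sparse as sp
from scipy.sparse.linalg import eigsh

def cotangent_laplacian(verts, faces):
    """Sparse cotangent stiffness matrix W and lumped (barycentric) mass
    matrix A for a triangle mesh. verts: |V| x 3, faces: |F| x 3."""
    n = len(verts)
    I, J, V = [], [], []
    area = np.zeros(n)
    for f in faces:
        for k in range(3):
            i, j, l = f[k], f[(k + 1) % 3], f[(k + 2) % 3]
            u, v = verts[i] - verts[l], verts[j] - verts[l]
            # cotangent of the angle at vertex l, opposite edge (i, j)
            cot = np.dot(u, v) / (np.linalg.norm(np.cross(u, v)) + 1e-12)
            I += [i, j, i, j]; J += [j, i, i, j]
            V += [-0.5 * cot, -0.5 * cot, 0.5 * cot, 0.5 * cot]
        p = verts[f]
        a = 0.5 * np.linalg.norm(np.cross(p[1] - p[0], p[2] - p[0]))
        area[f] += a / 3.0          # one third of the face area per vertex
    W = sp.csr_matrix((V, (I, J)), shape=(n, n))
    return W, sp.diags(area)

def heat_kernel(verts, faces, t, n_eigs=120):
    """Truncated heat kernel K_t = Phi diag(exp(-lambda t)) Phi^T."""
    W, A = cotangent_laplacian(verts, faces)
    # generalized eigenproblem W phi = lambda A phi; shift-invert
    # around a tiny negative sigma retrieves the smallest eigenvalues
    lam, phi = eigsh(W, k=n_eigs, M=A, sigma=-1e-8)
    return (phi * np.exp(-lam * t)) @ phi.T, lam, phi
```

The eigenvectors returned this way are A-orthonormal, which is the discrete counterpart of the condition $\int_X \varphi_i\varphi_j\, dV_X = \delta_{ij}$ above.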

1.3.2 The GPS kernel

Similarly to the heat kernel embedding, Rustamov [Rus07] proposed the Global Point Signature (GPS) embedding

$\Phi_{GPS}(x) \triangleq \left( \frac{1}{\sqrt{\lambda_1}}\varphi_1(x), \frac{1}{\sqrt{\lambda_2}}\varphi_2(x), \dots \right) = \Phi(x)\,\Lambda^{-\frac12},$

with the respective GPS kernel

$K_{GPS}(x, x') \triangleq \langle \Phi_{GPS}(x), \Phi_{GPS}(x') \rangle = \sum_i \frac{1}{\lambda_i}\,\varphi_i(x)\,\varphi_i(x').$

The choice of the GPS kernel for embedding is motivated in a number of ways. Rustamov [Rus07] argued that the GPS kernel coincides with the Green's function, and that the Green's function, in some sense, measures the extent to which two points are geometrically connected. It is easy to see that $K_{GPS}(x, x') = \int_{t=0}^{\infty} K_t(x, x')\, dt$, i.e. the GPS kernel is the integration of the heat kernel through all scales [BB11]. The GPS kernel is equivalent to the commute time kernel [QH07]. The commute time distance can be thought of as the average time of a random walk starting at x, passing through x', and returning back to x.

If $\tilde X$ is obtained by uniformly scaling X by a factor $\alpha > 0$, the eigenvalues of the Laplace-Beltrami operator are scaled according to $\tilde\lambda_i = \alpha^{-2}\lambda_i$, and the corresponding $L^2$-normalized eigenfunctions become $\tilde\varphi_i(x) = \alpha^{-1}\varphi_i(x)$. Then, it is easy to see that the GPS kernel $\sum_i \tilde\lambda_i^{-1}\tilde\varphi_i(x)\tilde\varphi_i(x')$ is globally scale invariant [BB11].

The gradients of the GPS coordinates are normalized. By definition, $\Delta\varphi_i(x) = \lambda_i\varphi_i(x)$ and $\int_X \varphi_i^2(x)\, dV_X = 1$. Applying Green's first identity on Riemannian manifolds,

$\int_X \left\| \nabla\frac{\varphi_i(x)}{\sqrt{\lambda_i}} \right\|^2 dV_X = \int_X \frac{\varphi_i(x)}{\sqrt{\lambda_i}}\, \lambda_i\, \frac{\varphi_i(x)}{\sqrt{\lambda_i}}\, dV_X = 1.$
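In the discrete setting the GPS kernel is a one-liner on top of the truncated eigendecomposition. A minimal sketch, assuming the constant eigenpair $\lambda_0 = 0$ has already been dropped (the helper name is ours):

```python
import numpy as np

def gps_kernel(phi, lam):
    """GPS (commute-time) kernel from truncated eigenpairs:
    K_GPS(x, x') = sum_i phi_i(x) phi_i(x') / lambda_i.
    phi: |V| x k eigenfunctions, lam: k strictly positive eigenvalues
    (the constant eigenfunction with lambda_0 = 0 must be excluded)."""
    gps = phi / np.sqrt(lam)      # the GPS embedding Phi(x) Lambda^{-1/2}
    return gps @ gps.T            # dense |V| x |V| kernel matrix
```

Dividing each eigenfunction by $\sqrt{\lambda_i}$ realizes $\Phi(x)\Lambda^{-1/2}$, so the inner product of rows reproduces $K_{GPS}$ directly.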

1.3.3 Functional maps

Ovsjanikov et al. [OBCS+12] presented the idea of functional representation. Let $\varphi : X \to Y$ be a bijective mapping between shapes X and Y. If we are given a scalar function $f^X : X \to \mathbb{R}$, then we can obtain a corresponding function $f^Y : Y \to \mathbb{R}$ by the composition $f^Y = f^X \circ \varphi^{-1}$. Let us denote the induced transformation by $\varphi_F : \mathcal{F}(X, \mathbb{R}) \to \mathcal{F}(Y, \mathbb{R})$, where $\mathcal{F}(\cdot, \mathbb{R})$ denotes a generic space of real-valued functions. We call $\varphi_F$ the functional representation of the mapping $\varphi$. Such a representation is linear, since for every pair of functions $f_1^X, f_2^X$ and scalars $\alpha_1, \alpha_2$,

$\varphi_F(\alpha_1 f_1^X + \alpha_2 f_2^X) = (\alpha_1 f_1^X + \alpha_2 f_2^X)\circ\varphi^{-1} = \alpha_1 f_1^X\circ\varphi^{-1} + \alpha_2 f_2^X\circ\varphi^{-1} = \alpha_1\varphi_F(f_1^X) + \alpha_2\varphi_F(f_2^X).$

Next, suppose that the function space of X is equipped with a basis $\Phi^X = \{\varphi_i^X\}$, so that any function $f^X : X \to \mathbb{R}$ can be represented as a linear combination of basis functions, $f^X = \sum_i a_i^X\varphi_i^X$. If Y is equipped with a basis $\Phi^Y = \{\varphi_i^Y\}$, then $\varphi_F(\varphi_i^X) = \sum_j (C^X)_{ij}\varphi_j^Y$ for some coefficients $(C^X)_{ij}$, and

$\varphi_F(f^X) = \varphi_F\Big(\sum_i a_i^X\varphi_i^X\Big) = \sum_j\Big(\sum_i a_i^X (C^X)_{ij}\Big)\varphi_j^Y.$

Equivalently, for $\varphi_F(f^X) = f^Y = \sum_i a_i^Y\varphi_i^Y$, we have

$\varphi_F^{-1}(f^Y) = \varphi_F^{-1}\Big(\sum_i a_i^Y\varphi_i^Y\Big) = \sum_j\Big(\sum_i a_i^Y (C^Y)_{ij}\Big)\varphi_j^X.$

Therefore, we can represent $f^X$ as a row vector $a^X$ with coefficients $a_i^X$ and, equivalently, $f^Y$ as a row vector $a^Y$ with coefficients $a_i^Y$. Then, we have $a^Y = a^X C^X$ and $a^X = a^Y C^Y$, where the matrices $C^X$ and $C^Y$ are independent of $f^X$ and are completely determined by the bases $\Phi^X$, $\Phi^Y$ and the map $\varphi$.

Here, we restrict ourselves to using the first N Laplace-Beltrami eigenfunctions as the basis functions for the functional representations (see Section 1.3.1). This basis is well suited for representing nearly isometric shapes [AK13a, OBCS+12].

Point-to-point correspondences assume that each point $x \in X$ corresponds to some point $y \in Y$ by the mapping $y = \varphi(x)$. In this case, the delta function $f^X(\tilde x) = \delta_x(\tilde x)$ at point x corresponds to the delta function $f^Y(\tilde y) = \delta_y(\tilde y)$ at point y. We emphasize that $\tilde x$ and $\tilde y$ are the variables of the functions, while x and y are the respective parameters. It is easy to show that for $\delta_x(\tilde x)$,

$a_x^X = \Phi^X(x) = (\varphi_1^X(x), \varphi_2^X(x), \dots, \varphi_i^X(x), \dots),$

and equivalently, for $\delta_y(\tilde y)$,

$a_y^Y = \Phi^Y(y) = (\varphi_1^Y(y), \varphi_2^Y(y), \dots, \varphi_i^Y(y), \dots).$

Then, we can construct the function preservation constraints $A^X = A^Y C^Y$, where the matrices $A^X = \Phi^X$ and $A^Y = \Phi^Y$ are built by stacking the row vectors $a_x^X$ and $a_y^Y$, respectively. Thus, given some initial correspondence $y = \hat\varphi_0(x)$, a post-process iterative refinement algorithm can be obtained via [OBCS+12]:

Algorithm 1: Post-process iterative refinement
for l = 1 to L do
  1. Construct the constraint matrices $A_l^X$, $A_l^Y$ by stacking $a_x^X$, $a_y^Y$, utilizing the correspondence provided by the previous iteration, $y = \hat\varphi_{l-1}(x)$.
  2. Find the optimal $C_l^Y$ minimizing $\|A_l^X - A_l^Y C^Y\|_F^2$.
  3. For each row $\Phi^X(x)$ of $\Phi^X$, find the new correspondence $\hat\varphi_l(x)$ by searching for the closest row $\Phi^Y(y)$ in $\Phi^Y C_l^Y$.
end for

The output of this algorithm is both a functional matrix $C^Y$ and a point-wise correspondence. This procedure is similar to the well-known Iterative Closest Point (ICP) algorithm [YM92, BM92] in N dimensions, except that it is performed in the spectral domain rather than in the standard Euclidean space.
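A compact sketch of Algorithm 1, under the assumption that the truncated eigenbases are stored as matrices whose rows are $\Phi^X(x)$ and $\Phi^Y(y)$, and that corr0 is any rough initial map from vertices of X to vertices of Y; the function name is ours:

```python
import numpy as np
from scipy.spatial import cKDTree

def iterative_refinement(phiX, phiY, corr0, n_iters=10):
    """Spectral-domain ICP (a sketch of Algorithm 1).
    phiX: |X| x N, phiY: |Y| x N truncated eigenbases;
    corr0: integer array, initial correspondence x -> y."""
    corr = np.asarray(corr0).copy()
    for _ in range(n_iters):
        # steps 1-2: least-squares fit of C^Y from the current map,
        # minimizing || phiX - phiY[corr] C ||_F^2
        C, *_ = np.linalg.lstsq(phiY[corr], phiX, rcond=None)
        # step 3: re-map every point to the nearest row of phiY @ C
        _, corr = cKDTree(phiY @ C).query(phiX)
    return corr, C
```

Each iteration alternates a linear solve (the "alignment" of the spectral embeddings) with a nearest-neighbor projection, exactly the ICP structure noted above.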

2 Shape Similarity

2.1 Metric spaces and spectral distances

In this section we first review known metrics between shapes modeled as 2-Riemannian manifolds. We then introduce a novel distance between manifolds, which we denote the spectral quasi-conformal distance.

2.1.1 Metric space

A metric space is a pair $(X, d_X)$, where the set X is equipped with a distance $d_X : X \times X \to \mathbb{R}_{\ge 0}$ satisfying the properties

P1: Symmetry - $d_X(x, x') = d_X(x', x)$.
P2: Triangle inequality - $d_X(x, x'') \le d_X(x, x') + d_X(x', x'')$.
P3: Identity - $d_X(x, x') = 0$ if and only if $x = x'$.

2.1.2 Gromov-Hausdorff distance

Let X and Y be two compact metric spaces which are subsets of a common metric space $(Z, d_Z)$, and suppose we want to compare X to Y. A common approach is that of computing the Hausdorff distance between them [Hau14],

$d_H^Z(X, Y) \triangleq \max\Big(\sup_{x\in X} d_Z(x, Y),\ \sup_{y\in Y} d_Z(y, X)\Big).$

Now, if we allow the shapes to undergo isometric transformations, comparison is possible using the Gromov-Hausdorff distance [Gro81],

$d_{GH}(X, Y) \triangleq \inf_{Z, f, g} d_H^Z(f(X), g(Y)),$

where $f : X \to Z$ and $g : Y \to Z$ are isometric (distance preserving) embeddings into the metric space Z. The Gromov-Hausdorff distance can equivalently be defined as [BBI01]

$d_{GH}(X, Y) \triangleq \frac12 \inf_{\varphi : X\to Y,\ \psi : Y\to X} \max\big(\mathrm{dis}(\varphi),\ \mathrm{dis}(\psi),\ \mathrm{dis}(\varphi, \psi)\big),$

where

$\mathrm{dis}(\varphi) \triangleq \sup_{x, x'\in X} \big| d_X(x, x') - d_Y(\varphi(x), \varphi(x')) \big|,$
$\mathrm{dis}(\psi) \triangleq \sup_{y, y'\in Y} \big| d_X(\psi(y), \psi(y')) - d_Y(y, y') \big|,$
$\mathrm{dis}(\varphi, \psi) \triangleq \sup_{x\in X,\ y\in Y} \big| d_X(x, \psi(y)) - d_Y(\varphi(x), y) \big|.$
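For discrete point sets already embedded in a common Euclidean space — the setting used by the spectral embedding distances below — the Hausdorff distance $d_H^Z$ can be evaluated directly. A minimal sketch (our own helper):

```python
import numpy as np
from scipy.spatial import cKDTree

def hausdorff(X, Y):
    """Discrete Hausdorff distance between two point sets in a common
    Euclidean space; rows of X and Y are points."""
    dXY = cKDTree(Y).query(X)[0].max()   # sup_x d(x, Y)
    dYX = cKDTree(X).query(Y)[0].max()   # sup_y d(y, X)
    return max(dXY, dYX)
```

The Gromov-Hausdorff distance itself adds the infimum over all isometric embeddings (equivalently, over the correspondences $\varphi, \psi$), which is what makes its direct evaluation combinatorial, as discussed in the introduction.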

2.1.3 Spectral embedding distance

From the point of view of spectral geometry, Bérard et al. [BBG94] used the spectral properties of the heat operator $e^{-t\Delta}$ to define a metric between two Riemannian manifolds $X, Y \in \mathcal{M}$. Here, $\mathcal{M}$ is the set of all closed (i.e. compact without boundary) Riemannian manifolds of dimension n. Given a Riemannian manifold $X \in \mathcal{M}$ with volume $\mathrm{Vol}(X)$ and some $t > 0$, they based their metric on the eigendecomposition of the heat kernel $K_t(x, x')$ and defined the spectral embedding

$I_t^{\Phi^X}(x) = \Big\{ \sqrt{\mathrm{Vol}(X)}\, e^{-\lambda_i^X t/2}\, \varphi_i^X(x) \Big\}_i,$

utilizing the eigenfunctions $\varphi_i^X$ and eigenvalues $\lambda_i^X$ of the Laplace-Beltrami operator $\Delta_X$. Given a pair of eigenbases $\Phi^X$, $\Phi^Y$, they embedded the two Riemannian manifolds into their respective heat kernel embeddings $I_t^{\Phi^X}$, $I_t^{\Phi^Y}$, and measured the Hausdorff distance $d_H^{EMB}$ between the manifolds in the common Euclidean space,

$d_H^{EMB}(I_t^{\Phi^X}, I_t^{\Phi^Y}) \triangleq \inf_{\varphi : X\to Y,\ \psi : Y\to X} \max\big(\mathrm{dis}_t^{EMB}(\varphi),\ \mathrm{dis}_t^{EMB}(\psi)\big),$
$\mathrm{dis}_t^{EMB}(\varphi) \triangleq \sup_{x\in X} d\big(I_t^{\Phi^X}(x),\ I_t^{\Phi^Y}(\varphi(x))\big),$
$\mathrm{dis}_t^{EMB}(\psi) \triangleq \sup_{y\in Y} d\big(I_t^{\Phi^X}(\psi(y)),\ I_t^{\Phi^Y}(y)\big).$

The distance between a point $x\in X$ and a point $y\in Y$ is the usual Euclidean distance in the embedded space, $d(I_t^{\Phi^X}(x), I_t^{\Phi^Y}(y)) = \| I_t^{\Phi^X}(x) - I_t^{\Phi^Y}(y) \|_{L^2}$. Bérard et al. defined the distance between the manifolds X, Y as the upper bound of the Hausdorff distance evaluated over all compatible pairs of eigenbases,

$d_t^{EMB}(X, Y) \triangleq \max\big( d_{I_t}^{EMB}(X, Y),\ d_{I_t}^{EMB}(Y, X) \big),$
$d_{I_t}^{EMB}(X, Y) \triangleq \sup_{\{\Phi^X\}} \inf_{\{\Phi^Y\}} d_H^{EMB}(I_t^{\Phi^X}, I_t^{\Phi^Y}),$
$d_{I_t}^{EMB}(Y, X) \triangleq \sup_{\{\Phi^Y\}} \inf_{\{\Phi^X\}} d_H^{EMB}(I_t^{\Phi^X}, I_t^{\Phi^Y}).$   (1)

We denote $d_t^{EMB}(X, Y)$ as the spectral embedding distance. They showed that for any fixed $t > 0$, the spectral embedding distance $d_t^{EMB}$ is a metric between isometry classes of Riemannian manifolds. In particular, $d_t^{EMB}(X, Y) = 0$ if and only if the Riemannian manifolds X and Y are isometric. We note that the theory holds for any kernel of the form $K(x, x') = \sum_i k(\lambda_i)\varphi_i(x)\varphi_i(x')$, with $k(\cdot)$ injective and decreasing sufficiently fast at infinity.

2.1.4 Spectral kernel distance

Kasue and Kumura [KK94] defined a metric between Riemannian manifolds by comparing their respective heat kernels,

$d^{SPEC}(X, Y) \triangleq \inf_{\varphi : X\to Y,\ \psi : Y\to X} \max\big(\mathrm{dis}^{SPEC}(\varphi),\ \mathrm{dis}^{SPEC}(\psi)\big),$   (2)

taking the supremum of the kernel distortion over all $t > 0$,

$\mathrm{dis}^{SPEC}(\varphi) \triangleq \sup_{x, x'\in X,\ t>0} u(t)\, d_t^{SPEC}(x, x', \varphi(x), \varphi(x')),$
$\mathrm{dis}^{SPEC}(\psi) \triangleq \sup_{y, y'\in Y,\ t>0} u(t)\, d_t^{SPEC}(\psi(y), \psi(y'), y, y'),$
$d_t^{SPEC}(x, x', y, y') \triangleq \big| \mathrm{Vol}(X)\, K_t^X(x, x') - \mathrm{Vol}(Y)\, K_t^Y(y, y') \big|.$

The function $u(t) \triangleq e^{-(t + 1/t)}$ is used to normalize the kernels for different values of t. We denote $d^{SPEC}(X, Y)$ as the spectral kernel distance.

2.1.5 Spectral Gromov-Wasserstein distance

Recently, Mémoli [Mém09] introduced a spectral version of the Gromov-Wasserstein distance, similar to the one proposed by Kasue and Kumura. He applied the well-established transportation distances, also known as the Earth Mover's Distance (EMD). Given importance probability measures $\mu^X(x)$, $\mu^Y(y)$, such that $\int_X \mu^X(x)\,dx = 1$ and $\int_Y \mu^Y(y)\,dy = 1$, respectively, and a coupling measure $\mu(x, y)$ satisfying $\int_X \mu(x, y)\,dx = \mu^Y(y)$ and $\int_Y \mu(x, y)\,dy = \mu^X(x)$, the spectral Gromov-Wasserstein distance $d_{GW,p}^{SPEC}(X, Y)$ is the EMD worst-case $L^p$ norm of the spectral kernel distortion over all scales $t > 0$,

$d_{GW,p}^{SPEC}(X, Y) \triangleq \inf_\mu \sup_{t>0} c(t)\, \big\| \Gamma_{t,X,Y}^{SPEC} \big\|_{L^p(\mu\otimes\mu)},$
$\big\| \Gamma_{t,X,Y}^{SPEC} \big\|_{L^p(\mu\otimes\mu)}^p \triangleq \int_{X\times Y}\int_{X\times Y} d_t^{SPEC}(x, x', y, y')^p\, d\mu(x, y)\, d\mu(x', y'),$

where $c(t) \triangleq e^{-a/t}$ for some $a > 0$.

2.1.6 Spectral quasi-conformal distance

Motivated by conformal-type distances [GWC+04, LD11, AKZ13] and by the spectral embedding [BBG94], we propose the quasi-conformal distance. The key point in our semi-conformal approach is that of replacing the angles between curves by analogous point-wise signatures. To that end, we regard the isometry invariant quantities

$\omega_{i,j} \triangleq \frac{\langle \nabla\varphi_i, \nabla\varphi_j \rangle}{\sqrt{\lambda_i \lambda_j}},$

which relate both to the angles and to the areas of the triangles constructed by the gradients of the eigenfunctions $\varphi_i$, $\varphi_j$. Now, similarly to the spectral embedding, we define the embedding

$\tilde I_t^{\Phi^X}(x) \triangleq \Big\{ \mathrm{Vol}(X)\, e^{-(\lambda_i^X + \lambda_j^X)t/2}\, \omega_{i,j}^X(x) \Big\}_{i,j}.$

Theorem 1. The embedding $\tilde I_t^{\Phi^X}$ separates points on the Riemannian manifold $X\in\mathcal{M}$, i.e. $x \ne x' \Rightarrow \tilde I_t^{\Phi^X}(x) \ne \tilde I_t^{\Phi^X}(x')$.

Proof. Suppose that for two distinct points $x, x'\in X$ we have $\tilde I_t^{\Phi^X}(x) = \tilde I_t^{\Phi^X}(x')$. This means that for all i, j we have

$\langle \nabla_X\varphi_i^X(x), \nabla_X\varphi_j^X(x) \rangle = \langle \nabla_X\varphi_i^X(x'), \nabla_X\varphi_j^X(x') \rangle.$

Any scalar function $f^X : X\to\mathbb{R}$ can be represented as a linear combination of the eigenbasis, $f^X(x) = \sum_i a_i\varphi_i^X(x)$. We can write the norm of the gradient as

$\|\nabla_X f^X(x)\|^2 = \sum_{i,j} a_i a_j \langle \nabla_X\varphi_i^X(x), \nabla_X\varphi_j^X(x)\rangle = \sum_{i,j} a_i a_j \langle \nabla_X\varphi_i^X(x'), \nabla_X\varphi_j^X(x')\rangle = \|\nabla_X f^X(x')\|^2.$

However, one can easily imagine a function $f^X$ such that $\|\nabla_X f^X\|$ takes distinct values at those two points - a contradiction, meaning that the embeddings $\tilde I_t^{\Phi^X}$ should have been different.

Let X and Y be the two closed Riemannian manifolds we would like to compare, and let us be given some $t > 0$. We define the spectral quasi-conformal embedding as the concatenation

$J_t^{\Phi^X}(x) \triangleq \Big( \tilde I_t^{\Phi^X}(x),\ I_t^{\Phi^X}(x) \Big).$

Given a pair of eigenbases $\Phi^X$, $\Phi^Y$, we embed the two Riemannian manifolds into $J_t^{\Phi^X}$ and $J_t^{\Phi^Y}$, and measure the Hausdorff distance $d_H^{QC}$ between the manifolds in the common Euclidean space,

$d_H^{QC}(J_t^{\Phi^X}, J_t^{\Phi^Y}) \triangleq \inf_{\varphi : X\to Y,\ \psi : Y\to X} \max\big(\mathrm{dis}_t^{QC}(\varphi),\ \mathrm{dis}_t^{QC}(\psi)\big),$
$\mathrm{dis}_t^{QC}(\varphi) \triangleq \sup_{x\in X} d\big(J_t^{\Phi^X}(x),\ J_t^{\Phi^Y}(\varphi(x))\big),$
$\mathrm{dis}_t^{QC}(\psi) \triangleq \sup_{y\in Y} d\big(J_t^{\Phi^X}(\psi(y)),\ J_t^{\Phi^Y}(y)\big).$

The distance between a point $x\in X$ and a point $y\in Y$ is the usual Euclidean distance in the embedded space, $d(J_t^{\Phi^X}(x), J_t^{\Phi^Y}(y)) = \| J_t^{\Phi^X}(x) - J_t^{\Phi^Y}(y) \|_{L^2}$. We define the spectral quasi-conformal distance as the supremum of the Hausdorff distance evaluated over all compatible pairs of eigenbases,

$d_t^{QC}(X, Y) \triangleq \max\big( d_{J_t}^{QC}(X, Y),\ d_{J_t}^{QC}(Y, X) \big),$
$d_{J_t}^{QC}(X, Y) \triangleq \sup_{\{\Phi^X\}} \inf_{\{\Phi^Y\}} d_H^{QC}(J_t^{\Phi^X}, J_t^{\Phi^Y}),$
$d_{J_t}^{QC}(Y, X) \triangleq \sup_{\{\Phi^Y\}} \inf_{\{\Phi^X\}} d_H^{QC}(J_t^{\Phi^X}, J_t^{\Phi^Y}).$   (3)

Theorem 2. The family $\{J_t^{\Phi^X}\}_t$ is invariant to global scaling of the metric.

Proof. If the Riemannian manifold $\tilde X$ is obtained by uniformly scaling the metric of the Riemannian manifold X by a factor $\alpha > 0$, then the eigenvalues of the Laplace-Beltrami operator are scaled according to $\tilde\lambda_i^X = \alpha^{-2}\lambda_i^X$, the corresponding $L^2$-normalized

eigenfunctions become $\tilde\varphi_i^X = \alpha^{-1}\varphi_i^X$, and the gradient operator reads $\nabla_{\tilde X} = \alpha^{-1}\nabla_X$. We have

$\tilde\omega_{i,j}^X = \frac{\langle \nabla_{\tilde X}\tilde\varphi_i^X,\ \nabla_{\tilde X}\tilde\varphi_j^X \rangle}{\sqrt{\tilde\lambda_i^X\,\tilde\lambda_j^X}} = \frac{\langle \alpha^{-1}\nabla_X \alpha^{-1}\varphi_i^X,\ \alpha^{-1}\nabla_X \alpha^{-1}\varphi_j^X \rangle}{\sqrt{\alpha^{-2}\lambda_i^X\,\alpha^{-2}\lambda_j^X}} = \frac{\alpha^{-4}}{\alpha^{-2}}\,\frac{\langle \nabla_X\varphi_i^X,\ \nabla_X\varphi_j^X \rangle}{\sqrt{\lambda_i^X\lambda_j^X}} = \alpha^{-2}\,\omega_{i,j}^X.$

Now, if we set $\tilde t = \alpha^2 t$,

$(\tilde I_{\tilde t}^{\Phi^{\tilde X}})_{i,j} = \mathrm{Vol}(\tilde X)\, e^{-(\tilde\lambda_i^X + \tilde\lambda_j^X)\tilde t/2}\, \tilde\omega_{i,j}^X = \alpha^2\,\mathrm{Vol}(X)\, e^{-(\alpha^{-2}\lambda_i^X + \alpha^{-2}\lambda_j^X)\alpha^2 t/2}\, \alpha^{-2}\omega_{i,j}^X = \mathrm{Vol}(X)\, e^{-(\lambda_i^X + \lambda_j^X)t/2}\, \omega_{i,j}^X = (\tilde I_t^{\Phi^X})_{i,j},$

$(I_{\tilde t}^{\Phi^{\tilde X}})_i = \mathrm{Vol}(\tilde X)^{\frac12}\, \tilde\varphi_i^X\, e^{-\tilde\lambda_i^X \tilde t/2} = (\alpha^2\,\mathrm{Vol}(X))^{\frac12}\, e^{-\alpha^{-2}\lambda_i^X \alpha^2 t/2}\, \alpha^{-1}\varphi_i^X = \mathrm{Vol}(X)^{\frac12}\, e^{-\lambda_i^X t/2}\, \varphi_i^X = (I_t^{\Phi^X})_i.$

Theorem 3. For any fixed $t > 0$, the spectral quasi-conformal distance $d_t^{QC}$ is a metric between isometry classes of Riemannian manifolds.

A complete proof is given in Appendix A.

2.2 Shape similarity functionals

First, we wish to define a more tractable version of the formal spectral kernel distance that was presented in Eq. (2). In general, we want to find a direct mapping that preserves the spectral connectivity between points. Let us denote by X and Y the two shapes we compare. The correspondence between X and Y is represented by a bijective mapping $\varphi : X\to Y$, such that for each point $x\in X$, the corresponding point $y\in Y$ is obtained by the mapping $y = \varphi(x)$. An importance measure $\rho^X(x)$ is specified for each point $x\in X$, such that $\int_X \rho^X(x)\,dx = 1$. Based on the rationale of the spectral kernel distance, we wish to compare the kernels $K^X$ and $K^Y$. We define the spectral kernel similarity functional

$d_p^{SPEC}(X, Y) \triangleq \min_{\varphi : X\to Y} \big\| \Gamma_{X,Y}^{SPEC} \big\|_{L^p},$   (4)

which minimizes the $L^p$ norm of the spectral kernel distortion

$\big\| \Gamma_{X,Y}^{SPEC} \big\|_{L^p}^p \triangleq \int_X\int_X d_K^{SPEC}(x, x', \varphi(x), \varphi(x'))^p\, d\rho^X(x)\, d\rho^X(x'),$

where

$d_K^{SPEC}(x, x', y, y') \triangleq \big| K^X(x, x') - K^Y(y, y') \big|.$

We remark that the definition of the spectral kernel similarity functional can easily be extended to comparing multiple kernels. Here, we choose the kernel to be the GPS one.

Next, we would like to reformulate the spectral embedding distance, given in Eq. (1), as the objective function of the iterative refinement algorithm defined in Section 1.3.3. The iterative algorithm tries to fit the shapes embedded in a common Euclidean space in such a way that a point on one shape matches the closest point on the other shape. Recall that the correspondence $\varphi(x)$ of a point $x\in X$ is found by searching for the nearest neighbor of $\Phi^X(x)$ in $\Phi^Y C^Y$. We can think of $\Phi^X$ as an embedding of shape X into the N-dimensional spectral space. Similarly, we can think of $\Phi^Y C^Y$ as a rigid transformation of the embedding $\Phi^Y$ of shape Y into the same spectral space. Hence, we can present the spectral embedding similarity functional

$d_p^{EMB}(X, Y) \triangleq \min_{C^Y, \varphi} \big\| \Gamma_{X,Y}^{EMB} \big\|_{L^p},$   (5)

which minimizes the $L^p$ norm of the spectral distance

$\big\| \Gamma_{X,Y}^{EMB} \big\|_{L^p}^p \triangleq \int_X d^{EMB}\big(\Phi^X(x),\ \Phi^Y(\varphi(x))\,C^Y\big)^p\, d\rho^X(x).$

The metric $d^{EMB}$ defines the spectral distance between points in the common space. Here, we choose the globally scale invariant distance

$d^{EMB}\big(\Phi^X(x),\ \Phi^Y(y)\,C^Y\big) \triangleq \big\| \big(\Phi^X(x) - \Phi^Y(y)\,C^Y\big)\,\Lambda^{-\frac12} \big\|.$

Finally, we derive an objective function from the spectral quasi-conformal distance of Eq. (3). In addition to the angle and area-preserving quantity $\omega_{i,j}$, we incorporate the orientation of the shapes by introducing

$\nu_{i,j} \triangleq \frac{\langle n,\ \nabla\varphi_i \times \nabla\varphi_j \rangle}{\sqrt{\lambda_i\lambda_j}},$

where n is the outward pointing normal to the surface. The point-wise signatures $\omega_{i,j}$, $\nu_{i,j}$ can be related to the areas of the triangles constructed by the norms of the gradients $\|\nabla\varphi_i\|$, $\|\nabla\varphi_j\|$ and the angles $\angle(\nabla\varphi_i, \nabla\varphi_j)$ between them. Using $\omega_{i,j}$ and $\nu_{i,j}$, we can formulate the spectral quasi-conformal similarity functional

$d_p^{QC}(X, Y) \triangleq \sup_{\{\Phi^X\}} \inf_{\{\Phi^Y\}} \inf_\varphi \big\| \Gamma_{X,Y}^{QC} \big\|_{L^p},$   (6)

which optimizes the $L^p$ norm of the quasi-conformal distortion

$\big\| \Gamma_{X,Y}^{QC} \big\|_{L^p}^p \triangleq \int_X d^{QC}(x, \varphi(x))^p\, d\rho^X(x).$

The quasi-conformal distortion $d^{QC}(x, y)$ between a point $x\in X$ and a point $y\in Y$ is defined using $\omega_{i,j}$ and $\nu_{i,j}$. Here, we choose the $L^2$ distortion

$d^{QC}(x, y)^2 = \sum_{i,j} \big(\omega_{i,j}^X(x) - \omega_{i,j}^Y(y)\big)^2 + \big(\nu_{i,j}^X(x) - \nu_{i,j}^Y(y)\big)^2.$   (7)

Let us summarize. In this section we formulated three objective functions:

- The spectral kernel similarity functional $d_p^{SPEC}(X, Y)$.
- The spectral embedding similarity functional $d_p^{EMB}(X, Y)$.
- The spectral quasi-conformal similarity functional $d_p^{QC}(X, Y)$.

A discrete sketch of the quasi-conformal quantities $\omega_{i,j}$, $\nu_{i,j}$ of Eq. (7) follows below.
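As an illustration of how $\omega_{i,j}$ and $\nu_{i,j}$ can be discretized, the sketch below evaluates them per triangle with standard linear-FEM gradients of the eigenfunctions. This is one possible discretization, not necessarily the paper's (which treats the signatures point-wise; averaging face values to vertices is one option), and all helper names are ours.

```python
import numpy as np

def face_gradients(verts, faces, f):
    """Per-face gradient of a vertex function f (linear FEM).
    Returns |F| x 3 gradient vectors and |F| x 3 unit face normals."""
    v0, v1, v2 = (verts[faces[:, k]] for k in range(3))
    n = np.cross(v1 - v0, v2 - v0)                     # scaled face normals
    dblA = np.linalg.norm(n, axis=1, keepdims=True) + 1e-12
    n_hat = n / dblA
    # gradient of the hat basis: rotate the opposite edge into the plane
    g = (f[faces[:, 0], None] * np.cross(n_hat, v2 - v1)
       + f[faces[:, 1], None] * np.cross(n_hat, v0 - v2)
       + f[faces[:, 2], None] * np.cross(n_hat, v1 - v0)) / dblA
    return g, n_hat

def qc_quantities(verts, faces, phi, lam):
    """Per-face omega_ij and nu_ij for the first k = len(lam) eigenpairs
    (lambda_0 = 0 assumed dropped). Shapes: k x k x |F|."""
    k = len(lam)
    grads = [face_gradients(verts, faces, phi[:, i])[0] for i in range(k)]
    n_hat = face_gradients(verts, faces, phi[:, 0])[1]
    omega = np.stack([[(grads[i] * grads[j]).sum(1) / np.sqrt(lam[i]*lam[j])
                       for j in range(k)] for i in range(k)])
    nu = np.stack([[(n_hat * np.cross(grads[i], grads[j])).sum(1)
                    / np.sqrt(lam[i]*lam[j])
                    for j in range(k)] for i in range(k)])
    return omega, nu
```

The $\omega$ entries are the normalized inner products of the eigenfunction gradients, and the $\nu$ entries project their cross product onto the outward normal, giving the orientation-sensitive term of Eq. (7).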

3 Shape similarity evaluation

3.1 Spectral kernel maps

We now address the issue of minimizing the spectral kernel similarity functional $d_p^{SPEC}(X, Y)$. For that goal we present a randomized iterative algorithm. The key idea is that previously obtained correspondences are used to find a new mapping, and that at each iteration, M newly randomly selected points are matched.

Let X and Y be the discrete shapes we wish to compare. We estimate the correspondence $\hat\varphi(x)$ by minimizing the $L^2$ discrete version of the spectral kernel similarity functional of Eq. (4),

$\hat\varphi = \arg\min_\varphi \sum_{x, x'\in X} d_K^{SPEC}(x, x', \varphi(x), \varphi(x'))^2\, \rho^X(x)\,\rho^X(x').$

Suppose we are given a subset of points $X_0 \subset X$ and an initial correspondence $\hat\varphi_0(x)$, $x\in X_0$. The Spectral Kernel Maps algorithm is given by:

Algorithm 2: Spectral kernel maps
for l = 1 to L do
  1. Select a subset of points $X_l$ by drawing M points from X with probability $P(x) = \rho^X(x)$.
  2. For each $x\in X_l$, find the new correspondence $\hat\varphi_l(x)$ by minimizing
     $\hat\varphi_l(x) = \arg\min_{y\in Y} \sum_{x'\in X_{l-1}} \big( K^X(x, x') - K^Y(y, \hat\varphi_{l-1}(x')) \big)^2.$
end for

The algorithm provides as an output the kernel signatures $K^X(x, x')$, $K^Y(y, \hat\varphi_L(x'))$, $x'\in X_L$, and the mapping

$\hat\varphi(x) = \arg\min_{y\in Y} \sum_{x'\in X_L} \big( K^X(x, x') - K^Y(y, \hat\varphi_L(x')) \big)^2 \quad \forall x\in X.$

We note that the spectral kernel maps are equivalent in structure to the Heat Kernel Maps [OMMG10], with the important difference that, unlike the heat kernel maps, our design does not depend on an accurate initial matching of feature points. Moreover, the spectral kernel maps method can scale up the number of points being used for generating the isometric signatures, which consequently increases the robustness and the accuracy of the mapping.
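A minimal sketch of Algorithm 2, assuming dense precomputed kernel matrices (feasible only for illustration; a practical implementation would evaluate kernel rows on demand). The function name and the KD-tree shortcut for the nearest-signature search are ours.

```python
import numpy as np
from scipy.spatial import cKDTree

def spectral_kernel_maps(KX, KY, X0, corr0, rho, M=1000, n_iters=15, seed=0):
    """Sketch of Algorithm 2. KX: |X| x |X| and KY: |Y| x |Y| (GPS)
    kernel matrices; X0/corr0: initial matched subset of X and its map
    into Y; rho: importance (sampling) probabilities over X."""
    rng = np.random.default_rng(seed)
    Xl, corr = np.asarray(X0), np.asarray(corr0)
    for _ in range(n_iters):
        Xn = rng.choice(KX.shape[0], size=M, p=rho)       # step 1
        # step 2: match each new x to the y whose signature row
        # K^Y(y, corr(x')), x' in X_{l-1}, is L2-closest to K^X(x, x')
        _, corr_new = cKDTree(KY[:, corr]).query(KX[np.ix_(Xn, Xl)])
        Xl, corr = Xn, corr_new
    return Xl, corr
```

Because the step-2 minimization is a plain squared-distance comparison of signature vectors, the nearest-neighbor query realizes the argmin exactly.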

3.2 Functional spectral kernel maps

We further extend the idea put forward in the previous section. In addition to optimizing the spectral kernel similarity objective function $d_p^{SPEC}(X, Y)$, we also want to minimize the spectral embedding similarity functional $d_p^{EMB}(X, Y)$. Recall that the post-process iterative refinement procedure presented in Algorithm 1 already optimizes $d_2^{EMB}(X, Y)$. We are looking for a way to modify Algorithm 1 in a manner that will simultaneously minimize $d_2^{SPEC}(X, Y)$, by matching $K^X(x, x')$ to $K^Y(y, y')$. For that goal, we impose the spectral kernel constraints on the delta functions that represent correspondence. Accordingly, the function

$f_{x,x'}^X(\tilde x) = \big( K^X(x, x') / \| K^X(x, \cdot) \| \big)\, \delta_x(\tilde x)$

should correspond to

$f_{y,y'}^Y(\tilde y) = \big( K^Y(y, y') / \| K^Y(y, \cdot) \| \big)\, \delta_y(\tilde y).$

For clarity we note that $\tilde x$ and $\tilde y$ are the variables of the functions, while x, x' and y, y' are the respective parameters. Therefore, we have

$a_{x,x'}^X = \big( K^X(x, x') / \| K^X(x, \cdot) \| \big)\, \Phi^X(x),$

and equivalently,

$a_{y,y'}^Y = \big( K^Y(y, y') / \| K^Y(y, \cdot) \| \big)\, \Phi^Y(y).$

Then, we can construct the function preservation constraints $A^X = A^Y C^Y$, where the matrices $A^X$, $A^Y$ are built by stacking the row vectors $a_{x,x'}^X$ and $a_{y,y'}^Y$, respectively. Notice that we normalize the kernels, so that the impact of all constraints $a_{x,x'}^X$, $a_{y,y'}^Y$ is similar for all $x\in X$, $y\in Y$.

Recalling that for nearly isometric shapes the correspondence we are looking for should be represented by a nearly-diagonal $C^Y$ [KBB+13], we can add an element-wise off-diagonal penalty W and formulate the following problem,

$\arg\min_{C^Y} \big\| A^X - A^Y C^Y \big\|_F^2 + \beta\, \big\| W \odot C^Y \big\|_F^2,$   (8)

where $\beta$ is a tuning parameter. The minimization of (8) can be obtained separately for each column of $C^Y$ by the least squares method, as sketched below.
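Since the Hadamard penalty decouples the columns of $C^Y$, each column solves an independent regularized normal equation. A minimal sketch (our own helper; a real implementation would factor $A^{Y\top}A^Y$ once):

```python
import numpy as np

def fit_fmap_offdiag(AX, AY, W, beta=0.1):
    """Sketch of Eq. (8): argmin_C ||AX - AY C||_F^2 + beta ||W * C||_F^2,
    with * the element-wise (Hadamard) product, solved column by column."""
    N = AX.shape[1]
    C = np.zeros((N, N))
    gram = AY.T @ AY
    for j in range(N):
        # column j: (AY^T AY + beta diag(W[:, j]^2)) c_j = AY^T AX[:, j]
        lhs = gram + beta * np.diag(W[:, j] ** 2)
        C[:, j] = np.linalg.solve(lhs, AY.T @ AX[:, j])
    return C
```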

Suppose we are given a subset of points $X_0 \subset X$ and an initial correspondence $\hat\varphi_0(x)$, $x\in X_0$. We are ready to present the functional Spectral Kernel Maps algorithm:

Algorithm 3: Functional spectral kernel maps
for l = 1 to L do
  1. Construct the constraint matrices $A_l^X$, $A_l^Y$ by stacking $a_{x,x'}^X$, $a_{y,y'}^Y$, utilizing the correspondence provided by the previous iteration, $y = \hat\varphi_{l-1}(x)$, $y' = \hat\varphi_{l-1}(x')$.
  2. Find the optimal $C_l^Y$ minimizing $\|A_l^X - A_l^Y C_l^Y\|_F^2 + \beta\,\|W \odot C_l^Y\|_F^2$.
  3. Select a subset of points $X_l$ by drawing M points from X with probability $P(x) = \rho^X(x)$.
  4. For each row $x\in X_l$ of $\Phi^X$, find the new correspondence $\hat\varphi_l(x)$ by searching for the closest row $\Phi^Y(y)$ in $\Phi^Y C_l^Y$, applying $\hat\varphi_l(x) = \arg\min_{y\in Y} d^{EMB}(\Phi^X(x), \Phi^Y(y)\,C_l^Y)$.
end for

The algorithm provides as an output both the functional matrix $C^Y$ computed in Step 2 and the point-wise correspondence $\hat\varphi(x)$ found in Step 4.

We note that the functional spectral kernel maps algorithm is closely related to the spectral GMDS procedure [ADK13], as both try to match the connectivity between points on each surface by a functional representation. It can also be viewed as a generalization of the post-process iterative refinement algorithm. This is noticed by setting the kernel $K(x, x')$ to be the heat kernel $K_t(x, x')$. As $t\to 0$, the only constraints that remain are $a_{x,x}^X = \Phi^X(x)$ and $a_{y,y}^Y = \Phi^Y(y)$. The advantage of the functional spectral kernel maps algorithm is in the combination of the two methods, resulting in highly accurate correspondence maps.

3.3 Spectral quasi-conformal maps

We now turn to approximating correspondence via the spectral quasi-conformal similarity functional $d_p^{QC}(X, Y)$, given in Eq. (6). Let $\{\varphi_i^X\}_{i=1}^{N_0}$, $\{\varphi_i^Y\}_{i=1}^{N_0}$ be the first $N_0$ eigenfunctions of the shapes X and Y, respectively. We suggest finding the correspondence for each point $x\in X$ by minimizing the quasi-conformal distortion of Eq. (7), introducing the Spectral Quasi-Conformal Maps

$\hat\varphi(x) = \arg\min_{y\in Y_x} \sum_{i,j=1}^{N_0} \big(\omega_{i,j}^X(x) - \omega_{i,j}^Y(y)\big)^2 + \big(\nu_{i,j}^X(x) - \nu_{i,j}^Y(y)\big)^2.$

The correspondence $\hat\varphi(x)$ is chosen from a limited set of points $y\in Y_x$. The reduced subset of potential points $Y_x$ is obtained by comparing the point-wise signatures $\Psi^X$ to $\Psi^Y$. In Subsection 3.3.2 we explain how to obtain the point-wise signature $\Psi$. For each point $x\in X$, we find the $|Y_x|$ points $y\in Y$ that are closest to x in the sense of the $L^2$ distance $\|\Psi^X - \Psi^Y\|_{L^2}$. We note that this simple preprocessing step boosts the performance of the quasi-conformal maps.

We point out that the quantities $\omega_{i,j}^X$ and $\nu_{i,j}^X$ are meaningful only if we have sets of corresponding eigenfunctions. In the next section we describe how this condition is fulfilled by a robust eigenfunction matching technique.
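Given matched eigenfunctions (next section) and per-point quantities $\omega$, $\nu$, the spectral quasi-conformal map reduces to a nearest-signature search over the candidate sets $Y_x$. A minimal sketch, assuming the quantities are stored as arrays of shape (points, $N_0$, $N_0$) and that cand[x] lists the candidate indices of $Y_x$ (names are ours):

```python
import numpy as np

def sqc_maps(omegaX, nuX, omegaY, nuY, cand):
    """Spectral quasi-conformal maps (a sketch): for each x, pick the
    candidate y in cand[x] minimizing the distortion of Eq. (7)."""
    sigX = np.concatenate([omegaX, nuX], axis=1).reshape(len(omegaX), -1)
    sigY = np.concatenate([omegaY, nuY], axis=1).reshape(len(omegaY), -1)
    corr = np.empty(len(sigX), dtype=int)
    for x, ys in enumerate(cand):
        ys = np.asarray(ys)
        d2 = ((sigY[ys] - sigX[x]) ** 2).sum(1)   # Eq. (7) over candidates
        corr[x] = ys[d2.argmin()]
    return corr
```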

3.3.1 Robust eigenfunction matching

The need to optimize cost functions over all orthonormal pairs of eigenbases arises regularly while evaluating the spectral similarity of shapes, as in the spectral embedding distance and the spectral quasi-conformal distance. Computationally, this exhaustive search can be relaxed by an indirect approach that first finds the orthonormal eigenbasis $\Phi^Y$ that best corresponds to $\Phi^X$. Given orthonormal eigenfunctions $\tilde\Phi^Y = \{\tilde\varphi_i^Y\}$ of a generic shape (i.e. one without repeated eigenvalues), we can derive the family of orthonormal eigenbases $\Phi^Y = \{s_i\,\tilde\varphi_{\pi_i}^Y\}$ by modifying the signs $s_i \in \{+1, -1\}$ of the eigenfunctions $\tilde\varphi_i^Y$ and reordering them by $\pi_i \in \mathbb{Z}_{>0}$. For approximately isometric generic shapes X and Y, we would like to find the optimal signs $s = \{s_i\}$ and permutations $\pi = \{\pi_i\}$ such that $\varphi_i^X(x) \approx s_i\,\tilde\varphi_{\pi_i}^Y(\varphi(x))$. To guarantee with high probability that we do not encounter eigenvalue multiplicity, we limit the search to a small number of compatible eigenfunctions. Next, we show how the matching of the first $N_0$ eigenfunctions can be achieved, under the assumption that their respective eigenvalues are non-repeating.

Matching via high order statistics

We adopt the approach presented in [SK13], which showed that high order statistics can resolve the matching parameters. Here, we limit our method to comparing the third order moments

$\xi_{i,j,k}^X \triangleq \int_X \varphi_i\,\varphi_j\,\varphi_k\, dA_X, \quad i, j, k \in \{1, 2, \dots, N_0\},$

where $dA_X$ is the area element of surface X. Given two isometric shapes X and Y, we can find the matching parameters s and $\pi$ by minimizing the difference of the third order moments on the two shapes,

$\hat s, \hat\pi = \arg\min_{s, \pi} \sum_{i,j,k} \big( \xi_{i,j,k}^X - s_i s_j s_k\, \xi_{\pi_i, \pi_j, \pi_k}^Y \big)^2.$   (9)

This method considerably reduces the search space of possible corresponding eigenfunctions. Alas, in the presence of intrinsic symmetry, the third order moments of the antisymmetric eigenfunctions are ambiguous and cannot be compared effectively. Therefore, the signs of the antisymmetric eigenfunctions cannot be determined completely. In the next section we remedy this situation, further evaluating the reduced search space of compatible eigenbases.
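A minimal sketch of the third order moments and a brute-force sign search for Eq. (9). For brevity it fixes the permutation to the identity; the full objective of Eq. (9) also searches over $\pi$. The lumped-area quadrature and the helper names are our own assumptions.

```python
import numpy as np
from itertools import product

def third_order_moments(phi, areas, N0=6):
    """xi_{ijk} = integral of phi_i phi_j phi_k dA, approximated with
    per-vertex lumped areas. phi: |V| x (>=N0), areas: |V|."""
    P = phi[:, :N0] * areas[:, None]            # weight one factor by dA
    return np.einsum('vi,vj,vk->ijk', P, phi[:, :N0], phi[:, :N0])

def match_signs(xiX, xiY):
    """Brute-force sign search of Eq. (9), permutation fixed to identity
    (a simplification of the full search over s and pi)."""
    N0 = xiX.shape[0]
    best, best_s = np.inf, None
    for s in product([1.0, -1.0], repeat=N0):
        s = np.array(s)
        ss = s[:, None, None] * s[None, :, None] * s[None, None, :]
        err = ((xiX - ss * xiY) ** 2).sum()
        if err < best:
            best, best_s = err, s
    return best_s
```

With $N_0 = 6$ the sign search visits only $2^6 = 64$ combinations, which is why restricting the matching to a few low-order eigenfunctions keeps the problem tractable.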

Matching via correspondence quality analysis

Having a pair of compatible eigenbases is important for the initialization of the quasi-conformal maps procedure. After determining the permutation $\pi$ and excluding some sign sequences, as described in the previous section, we further compare the eigenbases by estimating the sign sequence s, so that the eigenbasis $\Phi^Y(s) = \{s_i\,\tilde\varphi_{\pi_i}^Y\}$ best matches the eigenbasis $\Phi^X$. Assuming, for now, that we have a function $Q(\hat\varphi)$ which is maximized by the best mapping $\hat\varphi(\hat s)$, we can follow Algorithm 4.

Algorithm 4: Matching via correspondence quality analysis
- For a possible sign sequence s, estimate the mapping $\hat\varphi(s)$ by applying a correspondence framework that uses s for initialization.
- Evaluate the correspondence quality by the measure $Q(\hat\varphi(s))$.
- Search for the optimal sign sequence $\hat s$ that maximizes $Q(\hat\varphi(\hat s))$.
- Compute the compatible eigenbasis $\Phi^Y(\hat s)$.

Next, we seek a criterion $Q(\hat\varphi)$ that captures the consistency of the correspondence $\hat\varphi(s)$. For this purpose many heuristics can be considered. We have selected to maximize the orientable area correlation.

Orientable area correlation

In this section we evaluate the quality of the correspondence $\varphi : X\to Y$ from the point of view of an area preservation criterion. Intuitively, for nearly isometric shapes and a good correspondence $\varphi$, the image of any triangle and the triangle itself should have approximately the same area.

Let $\tau^Y$ be a triangle with vertices $y_i y_j y_k$ on shape Y, having coordinates $V^{y_i} V^{y_j} V^{y_k}$ in $\mathbb{R}^3$. The normal to the triangle, $n^Y(\tau^Y)$, can be estimated in two ways:

- $\hat n_1^Y(\tau^Y)$ - the normalized cross product of the vectors $(V^{y_j} - V^{y_i})$ and $(V^{y_k} - V^{y_i})$,
  $\tilde n^Y(\tau^Y) \triangleq (V^{y_j} - V^{y_i}) \times (V^{y_k} - V^{y_i}), \quad \hat n_1^Y(\tau^Y) \triangleq \tilde n^Y(\tau^Y) / \|\tilde n^Y(\tau^Y)\|.$
- $\hat n_2^Y(\tau^Y)$ - the average of the normals at the vertices,
  $\hat n_2^Y(\tau^Y) \triangleq \frac13\big( n^Y(y_i) + n^Y(y_j) + n^Y(y_k) \big),$
  where $n^Y(y)$ is the usual normal to the surface at the point $y\in Y$.

We note that $\hat n_1^Y(\tau^Y) \approx \hat n_2^Y(\tau^Y)$ for a small enough triangle $\tau^Y$.

Let $T^X$ be a triangulation of the surface X. We order the vertices $x_i x_j x_k$ of each triangle $\tau^X \in T^X$ in such a way that $\hat n_1^X(\tau^X)$ points outward. The triangle $\tau^X$ is mapped by $y_i = \varphi(x_i)$, $y_j = \varphi(x_j)$, $y_k = \varphi(x_k)$. We denote by $\tau_\varphi^Y$ the imaged triangle with vertices $y_i y_j y_k$. For the special case where the shapes are isometric, and for a good map $\varphi(x)$, we expect that the area $A(\tau^X)$ of an infinitesimal triangle $\tau^X$ will be approximately equal to the area $A(\tau_\varphi^Y)$ of its image $\tau_\varphi^Y$, and that $\langle \hat n_1^Y(\tau_\varphi^Y), \hat n_2^Y(\tau_\varphi^Y) \rangle \approx 1$, as the two versions of the normal point in the same direction. In general, $\hat n_1^Y(\tau_\varphi^Y)$ does not coincide with $\hat n_2^Y(\tau_\varphi^Y)$; for example, if the orientation is somehow flipped by the mapping $\varphi(x)$, then the normals will have different directions and $\langle \hat n_1^Y(\tau_\varphi^Y), \hat n_2^Y(\tau_\varphi^Y) \rangle \approx -1$. The normal $\hat n_1^Y(\tau_\varphi^Y)$ is in fact the image of the normal $\hat n^X(\tau^X)$.

The above discussion motivates us to measure the correlation between the area of $\tau^X$, oriented by $\hat n_1^Y(\tau_\varphi^Y)$, and the area of $\tau_\varphi^Y$, oriented by $\hat n_2^Y(\tau_\varphi^Y)$. We hereby define the orientable area-preservation quality $Q(\varphi)$ by

$Q(\varphi) \triangleq \frac{ \sum_{\tau^X\in T^X} \big\langle A(\tau^X)\,\hat n_1^Y(\tau_\varphi^Y),\ A(\tau_\varphi^Y)\,\hat n_2^Y(\tau_\varphi^Y) \big\rangle }{ \sqrt{\sum_{\tau^X\in T^X} A^2(\tau^X)}\ \sqrt{\sum_{\tau^X\in T^X} A^2(\tau_\varphi^Y)} },$   (10)

where the imaged triangle $\tau_\varphi^Y$ is constructed by the mapping $\varphi$ applied to the triangle $\tau^X$.
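A minimal sketch of Eq. (10), assuming per-vertex normals on Y are available and that corr maps vertex indices of X to vertex indices of Y (helper names are ours):

```python
import numpy as np

def tri_areas_normals(verts, faces):
    """Areas and unit cross-product normals of the triangles."""
    n = np.cross(verts[faces[:, 1]] - verts[faces[:, 0]],
                 verts[faces[:, 2]] - verts[faces[:, 0]])
    dblA = np.linalg.norm(n, axis=1) + 1e-12
    return 0.5 * dblA, n / dblA[:, None]

def area_correlation(vertsX, facesX, vertsY, vnormY, corr):
    """Orientable area correlation Q of Eq. (10) - a sketch."""
    AX, _ = tri_areas_normals(vertsX, facesX)
    facesY = corr[facesX]                        # imaged triangles tau_phi
    AY, n1 = tri_areas_normals(vertsY, facesY)   # n1: cross-product normal
    n2 = vnormY[facesY].mean(axis=1)             # n2: averaged vertex normals
    n2 /= np.linalg.norm(n2, axis=1, keepdims=True) + 1e-12
    num = ((AX[:, None] * n1) * (AY[:, None] * n2)).sum()
    return num / np.sqrt((AX ** 2).sum() * (AY ** 2).sum())
```

For a consistent, orientation-preserving map the result approaches 1, while flipped patches contribute negative terms, which is what makes Q useful for resolving the sign ambiguities left by the third order moments.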

3.3.2 Matched eigenbasis signature

As a preprocessing step for the spectral quasi-conformal maps computation, we would like to find an informative signature $\Psi$ that can be compared between the two shapes. Having a matched pair of eigenbases comprised of $N_0$ compatible eigenfunctions, shapes can be compared by a mix of global and local quantities:

1. The matched $N_0$ low order eigenfunctions $\varphi_i$, which capture the global structure of the shapes.
2. The pseudo Wave Kernel Signature [ASC11] bandpass filters

   $\gamma_t(x) \triangleq \sum_{i=1}^{N} \lambda_i\, e^{-\lambda_i t}\, \varphi_i^2(x),$

   controlled by the parameter t, which better describe local features [SK13].

Thus, we construct the matched eigenbasis signature

$\Psi(x) \triangleq \big\{ \varphi_1(x), \varphi_2(x), \dots, \varphi_{N_0}(x),\ \bar\gamma_{t_1}(x), \bar\gamma_{t_2}(x), \dots, \bar\gamma_{t_B}(x) \big\},$

combining the $N_0$ low order eigenfunctions with the normalized bandpass filters

$\bar\gamma_t(x) \triangleq \frac{\gamma_t(x)}{\sqrt{\int_X \gamma_t^2(x)\, dA_X}},$

sampled B times at $t = \{t_1, t_2, \dots, t_B\}$.
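A minimal sketch of the signature, assuming the constant eigenpair has been dropped so that lam[0] is $\lambda_1$, and using the logarithmic sampling range given in Section 4.1; the helper name is ours:

```python
import numpy as np

def matched_basis_signature(phi, lam, areas, N0=6, B=6):
    """Matched eigenbasis signature Psi: N0 matched eigenfunctions plus
    B normalized band-pass (pseudo-WKS) filters, log-sampled in
    [1/(50 lambda_1), 1/lambda_1]. areas: per-vertex lumped areas."""
    ts = np.geomspace(1.0 / (50 * lam[0]), 1.0 / lam[0], B)
    bands = []
    for t in ts:
        g = (phi ** 2) @ (lam * np.exp(-lam * t))   # gamma_t(x)
        g /= np.sqrt((g ** 2 * areas).sum())         # L2-normalize over X
        bands.append(g)
    return np.column_stack([phi[:, :N0]] + bands)   # |V| x (N0 + B)
```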

4 Correspondence framework

In this section we convert the observations outlined in the previous sections into a complete and applicable framework for correspondence detection. The framework consists of the following sequential steps:

- Initialize a coarse correspondence using the spectral quasi-conformal maps, see Section 3.3.
- Refine the correspondence by applying the spectral kernel maps algorithm, see Section 3.1.
- Achieve dense correspondence with the functional spectral kernel maps with the off-diagonal penalty, see Section 3.2.

We note that a matched pair of eigenbases is found by using a combination of high order statistics and correspondence quality analysis, see Section 3.3.1.

4.1 Implementation

The proposed correspondence framework is fully automatic. As such, in all our experiments we used the same choice of parameters. In Section 1.3.2 we explained some interesting properties of the GPS kernel. Our empirical evidence suggests that the GPS kernel, that is, the normalization of the eigenfunctions $\varphi_i$ by the factor $(\sqrt{\lambda_i})^{-1}$, provides superior qualities for correspondence detection compared to the other kernels we tested. In general, we chose our parameters for achieving the most accurate results in a reasonable time. To that end, we used N = 120 eigenfunctions of the Laplace-Beltrami operator. For analysis, we break down the correspondence framework into different modules, with the following parameters.

- Robust eigenfunction matching - see Section 3.3.1, with the triangulation simplified to a fixed size $|T^X|$. Mesh simplification is performed by the QSlim software [GH97].
- Matched eigenbasis signature - see Section 3.3.2, with $N_0 = 6$ matched eigenfunctions and B = 6 logarithmically sampled bandpass filters, with $t_1 = \frac{1}{50\lambda_1}$ and $t_B = \frac{1}{\lambda_1}$.
- Spectral quasi-conformal maps - see Section 3.3, with $N_0 = 6$ matched eigenfunctions and $|Y_x| = 100$.
- Spectral kernel maps - see Section 3.1; K is the GPS kernel, M = 1000 points and L = 15 iterations. P(x) uniformly selects points from the mesh.
- Functional spectral kernel maps - see Section 3.2; K is the GPS kernel, M = 2000 points and L = 15 iterations. The off-diagonal penalty $W_{ij} = |\lambda_i^Y - \lambda_j^X|\, U_i$ was set to be proportional to the difference of the eigenvalues $\lambda_i^Y$ and $\lambda_j^X$, scaled by the i-th entry of the diagonal $U = \mathrm{diag}((A^Y)^T A^Y)$; the tuning parameter $\beta$ was fixed across all experiments.

4.2 Results

We tested the proposed method on pairs of shapes represented by triangulated meshes from both the TOSCA database [BBK08] and the SCAPE database [ASP+05]. The TOSCA dataset contains densely sampled synthetic human and animal surfaces, divided into several classes with given ground-truth point-to-point correspondences between the shapes within each class. The SCAPE dataset contains scans of real human bodies in different poses.

We compare our work to several correspondence detection methods.

- Spectral Kernel Maps - the method proposed in this paper.
- Functional Maps + Blended (TOSCA only) - the functional maps based post-process iterative refinement algorithm. We use the results shown in [OBCS+12]. There, the post-process procedure refines the correspondence provided by the Blended method [KLF11].

- Blended - the method proposed by Kim et al., which uses a weighted combination of isometric maps [KLF11].
- Möbius Voting - the method proposed by Lipman et al., which counts votes on the conformal Möbius transformations [LF09].
- Permuted Sparse Coding + MSER (SCAPE only) - the approach proposed by Pokrass et al., which finds correspondence by using methods from the field of sparse modeling. We note that this method depends on the ability to detect repeatable regions between shapes. We use the results shown in [PBB+13]. There, maximally stable extremal regions (MSER) are used as a preprocessing step [LBB11].
- Functional Maps + MSER (SCAPE only) - the functional maps based post-process iterative refinement algorithm. We use the results shown in [PBB+13].

Figure 2 compares our correspondence framework with existing methods on the TOSCA benchmark, using the evaluation protocol proposed in [KLF11]. The distortion curves describe the percentage of surface points falling within a relative geodesic distance from what is assumed to be their true locations. For each shape, the geodesic distance is normalized by the square root of the shape's area. As is evident from the benchmark, the proposed method significantly outperforms existing ones.

Figure 3 compares the proposed correspondence framework with existing methods on the SCAPE database, again using the evaluation protocol proposed in [KLF11], allowing symmetries. Remark: in the evaluation, we do not allow per-point symmetry selection (as other methods do); rather, the correct symmetry is automatically chosen for the shape as a whole.

[Table 1: Percentage of surface points falling within a relative geodesic error for different methods - Spectral Kernel Maps, F. Maps + Blended, Blended, Möbius Voting.]

Table 1 displays the percentage of correspondences that fall within different values of relative geodesic distances. It is interesting to focus on large geodesic errors. Unlike other methods, in the proposed approach more than 99% of the correspondences have a geodesic error of less than 0.1.

Next, we analyze the contribution of each component of the proposed framework. Figure 4 compares different combinations of the modules of the correspondence framework. We notice that the functional spectral kernel maps part is the most dominant module. Still, it needs a good starting point, which is provided by the spectral quasi-conformal maps.

We illustrate how our method is able to find the intrinsic reflective symmetry axis of nonrigid shapes. Intrinsic symmetry detection can be viewed as finding a correspondence from a shape to itself [RBBK07]. Following this approach, we simply map a shape to itself and select the sign sequence that flips the orientation of the shape by maximizing $Q(\varphi(s))$.

Figure 2: Evaluation of the spectral kernel maps algorithm applied to shapes from the TOSCA database (TOSCA correspondence), using the protocol of [KLF11].

In Figure 5 we visualize the distance between a point and its image for several shapes from the TOSCA database.

Finally, we demonstrate the proposed approach in texture transfer experiments on shapes of scanned human faces taken from the BFM database [PKA+09]. The intrinsic correspondence gives us the possibility to perform texture manipulation and mapping [ZKK02, BBK07]. In Figure 6, we used the maps obtained by the proposed correspondence framework to transfer the textures of the reference shapes to the target shapes. Figure 7 shows a gradual morphing from one shape (left) to another shape (right), obtained by linear interpolation of the extrinsic geometry and the texture, for different values of the interpolation factor γ [BBK07]. Figure 1 depicts how the textures from the two middle human faces are transferred to the intrinsically symmetric halves of the target face.

Figure 3: Evaluation of the spectral kernel maps algorithm applied to shapes from the SCAPE database (SCAPE correspondence), using the protocol of [KLF11] with allowed symmetries.

5 Conclusions

A new method for comparing nonrigid shapes was introduced. The theoretical components are based on spectral distances between manifolds, which have been integrated into a holistic shape matching framework. We have demonstrated the effectiveness of the proposed approach by achieving state-of-the-art results on shape matching benchmarks. In the future, we intend to study the characteristics of other types of distances between deformable shapes, and to incorporate them in the proposed correspondence framework.

Acknowledgment

This work has been supported by grant agreement no. of the European Community's FP7-ERC program.

Figure 4: Evaluation of different combinations of the modules of the correspondence framework applied to shapes from the TOSCA database (spectral kernel maps module analysis). The modules: Robust Eigenfunction Matching (REM), Matched Eigenbasis Signature (MES), Spectral Quasi-Conformal Maps (SQCM), Spectral Kernel Maps (SKM), Functional Spectral Kernel Maps (FSKM).

A Proof of Theorem 3

We would like to show that the quasi-conformal distance $d_t^{QC} : \mathcal{M}\times\mathcal{M} \to \mathbb{R}_{\ge 0}$ satisfies the following properties.

P1: Symmetry - $d_t^{QC}(X, Y) = d_t^{QC}(Y, X)$ for every $X, Y \in \mathcal{M}$.
P2: Triangle inequality - for every three Riemannian manifolds $X, Y, Z \in \mathcal{M}$,
    $d_t^{QC}(X, Z) \le d_t^{QC}(X, Y) + d_t^{QC}(Y, Z).$
P3: Identity -
    (a) If $X, Y \in \mathcal{M}$ are isometric Riemannian manifolds, then $d_t^{QC}(X, Y) = 0$.
    (b) If $d_t^{QC}(X, Y) = 0$, then X, Y are isometric.

Figure 5: Symmetry axis of several shapes from the TOSCA database.

The proof of the theorem is based on the work of Bérard et al. [BBG94], which showed that the spectral embedding distance is a metric.

Lemma 1. Let $X, Y \in \mathcal{M}$, and let $\varphi : X\to Y$ and $\psi : Y\to X$ be smooth bijective mappings, $y = \varphi(\psi(y))$. Let us denote by $J_\varphi(x)$, $J_\psi(y)$ the Jacobians of the maps $\varphi$, $\psi$, respectively. Then, $d^{QC}(X, Y) = 0$ implies that $J_\varphi(x)$ (and equivalently $J_\psi(y)$) is constant.

Proof. By the definition of the spectral quasi-conformal distance, if $d^{QC}(X, Y) = 0$ and if $x = \psi(y)$ or $y = \varphi(x)$, then

$\sqrt{\mathrm{Vol}(X)}\, e^{-\lambda_i^X t/2}\, \varphi_i^X(x) = \sqrt{\mathrm{Vol}(Y)}\, e^{-\lambda_i^Y t/2}\, \varphi_i^Y(y) \quad \forall i > 0.$   (11)

Integrating Eq. (11), we get

$0 = \sqrt{\mathrm{Vol}(X)}\, e^{-\lambda_i^X t/2} \int_X \varphi_i^X(x)\, dV_X = \sqrt{\mathrm{Vol}(Y)}\, e^{-\lambda_i^Y t/2} \int_X \varphi_i^Y(\varphi(x))\, dV_X = \sqrt{\mathrm{Vol}(Y)}\, e^{-\lambda_i^Y t/2} \int_Y \varphi_i^Y(y)\, J_\psi(y)\, dV_Y.$   (12)

Hence, $J_\psi(y)$ is orthogonal to $\varphi_i^Y(y)$ for all $i > 0$. We conclude that $J_\psi(y)$ (and equivalently $J_\varphi(x)$) must be a constant.

Figure 6: Texture transfer - front and side views. In each row, the same shape is shown with different textures transferred from reference faces (diagonal).

Figure 7: Shape morphing, for γ = 0, 0.1, 0.3, 0.5, 0.7, 0.8, 1. The correspondence is used to transform the texture and the extrinsic geometry, creating a morphing effect.

Proof 2. Under the condition that $\{\langle \nabla_Y\varphi_i^Y, \nabla_Y\varphi_j^Y \rangle,\ i\ne j,\ \text{constant}\}$ spans the space of real scalar functions, we can prove the claim using only the coordinates of $\tilde I_t^\Phi$. By the definition of the spectral quasi-conformal distance, if $d^{QC}(X, Y) = 0$ and if $x = \psi(y)$ or $y = \varphi(x)$, then

$z_{i,j}^X \langle \nabla_X\varphi_i^X, \nabla_X\varphi_j^X \rangle = z_{i,j}^Y \langle \nabla_Y\varphi_i^Y, \nabla_Y\varphi_j^Y \rangle,$   (13)

where

$z_{i,j}^X \triangleq \mathrm{Vol}(X)\, \frac{e^{-(\lambda_i^X + \lambda_j^X)t/2}}{\sqrt{\lambda_i^X\lambda_j^X}}, \quad z_{i,j}^Y \triangleq \mathrm{Vol}(Y)\, \frac{e^{-(\lambda_i^Y + \lambda_j^Y)t/2}}{\sqrt{\lambda_i^Y\lambda_j^Y}}.$

Integrating Eq. (13), we get

$z_{i,j}^X \int_X \langle \nabla_X\varphi_i^X(x), \nabla_X\varphi_j^X(x) \rangle\, dV_X = z_{i,j}^Y \int_X \langle \nabla_Y\varphi_i^Y(\varphi(x)), \nabla_Y\varphi_j^Y(\varphi(x)) \rangle\, dV_X = z_{i,j}^Y \int_Y \langle \nabla_Y\varphi_i^Y(y), \nabla_Y\varphi_j^Y(y) \rangle\, J_\psi(y)\, dV_Y.$   (14)

The gradients of the eigenfunctions are orthogonal,

$\int_X \langle \nabla_X\varphi_i^X(x), \nabla_X\varphi_j^X(x) \rangle\, dV_X = 0 \quad \forall i\ne j, \qquad \int_Y \langle \nabla_Y\varphi_i^Y(y), \nabla_Y\varphi_j^Y(y) \rangle\, dV_Y = 0 \quad \forall i\ne j,$   (15)

and using Eq. (14) and Eq. (15), we have

$\int_Y \langle \nabla_Y\varphi_i^Y(y), \nabla_Y\varphi_j^Y(y) \rangle\, J_\psi(y)\, dV_Y = 0.$

Hence, $J_\psi(y)$ is orthogonal to $\langle \nabla_Y\varphi_i^Y(y), \nabla_Y\varphi_j^Y(y) \rangle$ for all $i\ne j$. Again, we conclude that $J_\psi(y)$ (and equivalently $J_\varphi(x)$) is a constant.

Because the Jacobian $J_\psi(y) = J_\psi$ is a constant, the following relation holds:

$\mathrm{Vol}(X) = \int_X dV_X = \int_Y J_\psi\, dV_Y = J_\psi\, \mathrm{Vol}(Y).$

Therefore, the Jacobian can be expressed as the ratio of the volumes of the Riemannian manifolds, $J_\psi = \mathrm{Vol}(X)/\mathrm{Vol}(Y)$.

Lemma 2. For $X, Y \in \mathcal{M}$, $d^{QC}(X, Y) = 0$ implies that X, Y are isospectral, i.e. $\lambda_i^X = \lambda_i^Y$ for all i.

Proof. Integrating the square of Eq. (11), we get

$\int_X \Big( \sqrt{\mathrm{Vol}(X)}\, e^{-\lambda_i^X t/2}\, \varphi_i^X(x) \Big)^2 dV_X = \int_Y \Big( \sqrt{\mathrm{Vol}(Y)}\, e^{-\lambda_i^Y t/2}\, \varphi_i^Y(y) \Big)^2 J_\psi(y)\, dV_Y.$


More information

08a. Operators on Hilbert spaces. 1. Boundedness, continuity, operator norms

08a. Operators on Hilbert spaces. 1. Boundedness, continuity, operator norms (February 24, 2017) 08a. Operators on Hilbert spaces Paul Garrett garrett@math.umn.edu http://www.math.umn.edu/ garrett/ [This document is http://www.math.umn.edu/ garrett/m/real/notes 2016-17/08a-ops

More information

Matching shapes by eigendecomposition of the Laplace-Beltrami operator

Matching shapes by eigendecomposition of the Laplace-Beltrami operator Matching shapes by eigendecomposition of the Laplace-Beltrami operator Anastasia Dubrovina Technion - Israel Institute of Technology nastyad@tx.technion.ac.il Ron Kimmel Technion - Israel Institute of

More information

An Introduction to Laplacian Spectral Distances and Kernels: Theory, Computation, and Applications. Giuseppe Patané, CNR-IMATI - Italy

An Introduction to Laplacian Spectral Distances and Kernels: Theory, Computation, and Applications. Giuseppe Patané, CNR-IMATI - Italy i An Introduction to Laplacian Spectral Distances and Kernels: Theory, Computation, and Applications Giuseppe Patané, CNR-IMATI - Italy ii Contents List of Figures...1 List of Tables...7 1 Introduction...9

More information

fy (X(g)) Y (f)x(g) gy (X(f)) Y (g)x(f)) = fx(y (g)) + gx(y (f)) fy (X(g)) gy (X(f))

fy (X(g)) Y (f)x(g) gy (X(f)) Y (g)x(f)) = fx(y (g)) + gx(y (f)) fy (X(g)) gy (X(f)) 1. Basic algebra of vector fields Let V be a finite dimensional vector space over R. Recall that V = {L : V R} is defined to be the set of all linear maps to R. V is isomorphic to V, but there is no canonical

More information

Applied and Computational Harmonic Analysis

Applied and Computational Harmonic Analysis Appl. Comput. Harmon. Anal. 30 2011) 363 401 Contents lists available at ScienceDirect Applied and Computational Harmonic Analysis www.elsevier.com/locate/acha A spectral notion of Gromov Wasserstein distance

More information

Global (ISOMAP) versus Local (LLE) Methods in Nonlinear Dimensionality Reduction

Global (ISOMAP) versus Local (LLE) Methods in Nonlinear Dimensionality Reduction Global (ISOMAP) versus Local (LLE) Methods in Nonlinear Dimensionality Reduction A presentation by Evan Ettinger on a Paper by Vin de Silva and Joshua B. Tenenbaum May 12, 2005 Outline Introduction The

More information

6/9/2010. Feature-based methods and shape retrieval problems. Structure. Combining local and global structures. Photometric stress

6/9/2010. Feature-based methods and shape retrieval problems. Structure. Combining local and global structures. Photometric stress 1 2 Structure Feature-based methods and shape retrieval problems Global Metric Local Feature descriptors Alexander & Michael Bronstein, 2006-2009 Michael Bronstein, 2010 tosca.cs.technion.ac.il/book 048921

More information

SPECTRAL THEORY EVAN JENKINS

SPECTRAL THEORY EVAN JENKINS SPECTRAL THEORY EVAN JENKINS Abstract. These are notes from two lectures given in MATH 27200, Basic Functional Analysis, at the University of Chicago in March 2010. The proof of the spectral theorem for

More information

U.C. Berkeley CS294: Spectral Methods and Expanders Handout 11 Luca Trevisan February 29, 2016

U.C. Berkeley CS294: Spectral Methods and Expanders Handout 11 Luca Trevisan February 29, 2016 U.C. Berkeley CS294: Spectral Methods and Expanders Handout Luca Trevisan February 29, 206 Lecture : ARV In which we introduce semi-definite programming and a semi-definite programming relaxation of sparsest

More information

Data dependent operators for the spatial-spectral fusion problem

Data dependent operators for the spatial-spectral fusion problem Data dependent operators for the spatial-spectral fusion problem Wien, December 3, 2012 Joint work with: University of Maryland: J. J. Benedetto, J. A. Dobrosotskaya, T. Doster, K. W. Duke, M. Ehler, A.

More information

Chapter 4 Euclid Space

Chapter 4 Euclid Space Chapter 4 Euclid Space Inner Product Spaces Definition.. Let V be a real vector space over IR. A real inner product on V is a real valued function on V V, denoted by (, ), which satisfies () (x, y) = (y,

More information

Reproducing Kernel Hilbert Spaces

Reproducing Kernel Hilbert Spaces 9.520: Statistical Learning Theory and Applications February 10th, 2010 Reproducing Kernel Hilbert Spaces Lecturer: Lorenzo Rosasco Scribe: Greg Durrett 1 Introduction In the previous two lectures, we

More information

Spectral Algorithms I. Slides based on Spectral Mesh Processing Siggraph 2010 course

Spectral Algorithms I. Slides based on Spectral Mesh Processing Siggraph 2010 course Spectral Algorithms I Slides based on Spectral Mesh Processing Siggraph 2010 course Why Spectral? A different way to look at functions on a domain Why Spectral? Better representations lead to simpler solutions

More information

Mostow Rigidity. W. Dison June 17, (a) semi-simple Lie groups with trivial centre and no compact factors and

Mostow Rigidity. W. Dison June 17, (a) semi-simple Lie groups with trivial centre and no compact factors and Mostow Rigidity W. Dison June 17, 2005 0 Introduction Lie Groups and Symmetric Spaces We will be concerned with (a) semi-simple Lie groups with trivial centre and no compact factors and (b) simply connected,

More information

Best approximations in normed vector spaces

Best approximations in normed vector spaces Best approximations in normed vector spaces Mike de Vries 5699703 a thesis submitted to the Department of Mathematics at Utrecht University in partial fulfillment of the requirements for the degree of

More information

The Symmetric Space for SL n (R)

The Symmetric Space for SL n (R) The Symmetric Space for SL n (R) Rich Schwartz November 27, 2013 The purpose of these notes is to discuss the symmetric space X on which SL n (R) acts. Here, as usual, SL n (R) denotes the group of n n

More information

Spectral Geometry of Riemann Surfaces

Spectral Geometry of Riemann Surfaces Spectral Geometry of Riemann Surfaces These are rough notes on Spectral Geometry and their application to hyperbolic riemann surfaces. They are based on Buser s text Geometry and Spectra of Compact Riemann

More information

Data-dependent representations: Laplacian Eigenmaps

Data-dependent representations: Laplacian Eigenmaps Data-dependent representations: Laplacian Eigenmaps November 4, 2015 Data Organization and Manifold Learning There are many techniques for Data Organization and Manifold Learning, e.g., Principal Component

More information

Lecture 3: Review of Linear Algebra

Lecture 3: Review of Linear Algebra ECE 83 Fall 2 Statistical Signal Processing instructor: R Nowak Lecture 3: Review of Linear Algebra Very often in this course we will represent signals as vectors and operators (eg, filters, transforms,

More information

Grothendieck s Inequality

Grothendieck s Inequality Grothendieck s Inequality Leqi Zhu 1 Introduction Let A = (A ij ) R m n be an m n matrix. Then A defines a linear operator between normed spaces (R m, p ) and (R n, q ), for 1 p, q. The (p q)-norm of A

More information

Lecture 3: Review of Linear Algebra

Lecture 3: Review of Linear Algebra ECE 83 Fall 2 Statistical Signal Processing instructor: R Nowak, scribe: R Nowak Lecture 3: Review of Linear Algebra Very often in this course we will represent signals as vectors and operators (eg, filters,

More information

Lecture Notes 1: Vector spaces

Lecture Notes 1: Vector spaces Optimization-based data analysis Fall 2017 Lecture Notes 1: Vector spaces In this chapter we review certain basic concepts of linear algebra, highlighting their application to signal processing. 1 Vector

More information

CALCULUS ON MANIFOLDS. 1. Riemannian manifolds Recall that for any smooth manifold M, dim M = n, the union T M =

CALCULUS ON MANIFOLDS. 1. Riemannian manifolds Recall that for any smooth manifold M, dim M = n, the union T M = CALCULUS ON MANIFOLDS 1. Riemannian manifolds Recall that for any smooth manifold M, dim M = n, the union T M = a M T am, called the tangent bundle, is itself a smooth manifold, dim T M = 2n. Example 1.

More information

2) Let X be a compact space. Prove that the space C(X) of continuous real-valued functions is a complete metric space.

2) Let X be a compact space. Prove that the space C(X) of continuous real-valued functions is a complete metric space. University of Bergen General Functional Analysis Problems with solutions 6 ) Prove that is unique in any normed space. Solution of ) Let us suppose that there are 2 zeros and 2. Then = + 2 = 2 + = 2. 2)

More information

Functional Analysis Review

Functional Analysis Review Outline 9.520: Statistical Learning Theory and Applications February 8, 2010 Outline 1 2 3 4 Vector Space Outline A vector space is a set V with binary operations +: V V V and : R V V such that for all

More information

Assignment 1: From the Definition of Convexity to Helley Theorem

Assignment 1: From the Definition of Convexity to Helley Theorem Assignment 1: From the Definition of Convexity to Helley Theorem Exercise 1 Mark in the following list the sets which are convex: 1. {x R 2 : x 1 + i 2 x 2 1, i = 1,..., 10} 2. {x R 2 : x 2 1 + 2ix 1x

More information

Discriminative Direction for Kernel Classifiers

Discriminative Direction for Kernel Classifiers Discriminative Direction for Kernel Classifiers Polina Golland Artificial Intelligence Lab Massachusetts Institute of Technology Cambridge, MA 02139 polina@ai.mit.edu Abstract In many scientific and engineering

More information

Manifold Regularization

Manifold Regularization 9.520: Statistical Learning Theory and Applications arch 3rd, 200 anifold Regularization Lecturer: Lorenzo Rosasco Scribe: Hooyoung Chung Introduction In this lecture we introduce a class of learning algorithms,

More information

Computing and Processing Correspondences with Functional Maps

Computing and Processing Correspondences with Functional Maps Computing and Processing Correspondences with Functional Maps Maks Ovsjanikov 1 Etienne Corman 2 Michael Bronstein 3,4,5 Emanuele Rodolà 3 Mirela Ben-Chen 6 Leonidas Guibas 7 Frédéric Chazal 8 Alexander

More information

Math 350 Fall 2011 Notes about inner product spaces. In this notes we state and prove some important properties of inner product spaces.

Math 350 Fall 2011 Notes about inner product spaces. In this notes we state and prove some important properties of inner product spaces. Math 350 Fall 2011 Notes about inner product spaces In this notes we state and prove some important properties of inner product spaces. First, recall the dot product on R n : if x, y R n, say x = (x 1,...,

More information

Optimal Transport: A Crash Course

Optimal Transport: A Crash Course Optimal Transport: A Crash Course Soheil Kolouri and Gustavo K. Rohde HRL Laboratories, University of Virginia Introduction What is Optimal Transport? The optimal transport problem seeks the most efficient

More information

Duke University, Department of Electrical and Computer Engineering Optimization for Scientists and Engineers c Alex Bronstein, 2014

Duke University, Department of Electrical and Computer Engineering Optimization for Scientists and Engineers c Alex Bronstein, 2014 Duke University, Department of Electrical and Computer Engineering Optimization for Scientists and Engineers c Alex Bronstein, 2014 Linear Algebra A Brief Reminder Purpose. The purpose of this document

More information

Implicitely and Densely Discrete Black-Box Optimization Problems

Implicitely and Densely Discrete Black-Box Optimization Problems Implicitely and Densely Discrete Black-Box Optimization Problems L. N. Vicente September 26, 2008 Abstract This paper addresses derivative-free optimization problems where the variables lie implicitly

More information

Linear Algebra: Matrix Eigenvalue Problems

Linear Algebra: Matrix Eigenvalue Problems CHAPTER8 Linear Algebra: Matrix Eigenvalue Problems Chapter 8 p1 A matrix eigenvalue problem considers the vector equation (1) Ax = λx. 8.0 Linear Algebra: Matrix Eigenvalue Problems Here A is a given

More information

Chapter 3. Riemannian Manifolds - I. The subject of this thesis is to extend the combinatorial curve reconstruction approach to curves

Chapter 3. Riemannian Manifolds - I. The subject of this thesis is to extend the combinatorial curve reconstruction approach to curves Chapter 3 Riemannian Manifolds - I The subject of this thesis is to extend the combinatorial curve reconstruction approach to curves embedded in Riemannian manifolds. A Riemannian manifold is an abstraction

More information

March 13, Paper: R.R. Coifman, S. Lafon, Diffusion maps ([Coifman06]) Seminar: Learning with Graphs, Prof. Hein, Saarland University

March 13, Paper: R.R. Coifman, S. Lafon, Diffusion maps ([Coifman06]) Seminar: Learning with Graphs, Prof. Hein, Saarland University Kernels March 13, 2008 Paper: R.R. Coifman, S. Lafon, maps ([Coifman06]) Seminar: Learning with Graphs, Prof. Hein, Saarland University Kernels Figure: Example Application from [LafonWWW] meaningful geometric

More information

Welcome to Copenhagen!

Welcome to Copenhagen! Welcome to Copenhagen! Schedule: Monday Tuesday Wednesday Thursday Friday 8 Registration and welcome 9 Crash course on Crash course on Introduction to Differential and Differential and Information Geometry

More information

LECTURE Itineraries We start with a simple example of a dynamical system obtained by iterating the quadratic polynomial

LECTURE Itineraries We start with a simple example of a dynamical system obtained by iterating the quadratic polynomial LECTURE. Itineraries We start with a simple example of a dynamical system obtained by iterating the quadratic polynomial f λ : R R x λx( x), where λ [, 4). Starting with the critical point x 0 := /2, we

More information

Metric Spaces and Topology

Metric Spaces and Topology Chapter 2 Metric Spaces and Topology From an engineering perspective, the most important way to construct a topology on a set is to define the topology in terms of a metric on the set. This approach underlies

More information

Shape Matching: A Metric Geometry Approach. Facundo Mémoli. CS 468, Stanford University, Fall 2008.

Shape Matching: A Metric Geometry Approach. Facundo Mémoli. CS 468, Stanford University, Fall 2008. Shape Matching: A Metric Geometry Approach Facundo Mémoli. CS 468, Stanford University, Fall 2008. 1 The Problem of Shape/Object Matching databases of objects objects can be many things: proteins molecules

More information

1 Directional Derivatives and Differentiability

1 Directional Derivatives and Differentiability Wednesday, January 18, 2012 1 Directional Derivatives and Differentiability Let E R N, let f : E R and let x 0 E. Given a direction v R N, let L be the line through x 0 in the direction v, that is, L :=

More information

Methods for sparse analysis of high-dimensional data, II

Methods for sparse analysis of high-dimensional data, II Methods for sparse analysis of high-dimensional data, II Rachel Ward May 23, 2011 High dimensional data with low-dimensional structure 300 by 300 pixel images = 90, 000 dimensions 2 / 47 High dimensional

More information

Spaces with Ricci curvature bounded from below

Spaces with Ricci curvature bounded from below Spaces with Ricci curvature bounded from below Nicola Gigli February 23, 2015 Topics 1) On the definition of spaces with Ricci curvature bounded from below 2) Analytic properties of RCD(K, N) spaces 3)

More information

are Banach algebras. f(x)g(x) max Example 7.4. Similarly, A = L and A = l with the pointwise multiplication

are Banach algebras. f(x)g(x) max Example 7.4. Similarly, A = L and A = l with the pointwise multiplication 7. Banach algebras Definition 7.1. A is called a Banach algebra (with unit) if: (1) A is a Banach space; (2) There is a multiplication A A A that has the following properties: (xy)z = x(yz), (x + y)z =

More information

Elements of Positive Definite Kernel and Reproducing Kernel Hilbert Space

Elements of Positive Definite Kernel and Reproducing Kernel Hilbert Space Elements of Positive Definite Kernel and Reproducing Kernel Hilbert Space Statistical Inference with Reproducing Kernel Hilbert Space Kenji Fukumizu Institute of Statistical Mathematics, ROIS Department

More information

1 Euclidean geometry. 1.1 The metric on R n

1 Euclidean geometry. 1.1 The metric on R n 1 Euclidean geometry This chapter discusses the geometry of n-dimensional Euclidean space E n, together with its distance function. The distance gives rise to other notions such as angles and congruent

More information

Optimization Theory. A Concise Introduction. Jiongmin Yong

Optimization Theory. A Concise Introduction. Jiongmin Yong October 11, 017 16:5 ws-book9x6 Book Title Optimization Theory 017-08-Lecture Notes page 1 1 Optimization Theory A Concise Introduction Jiongmin Yong Optimization Theory 017-08-Lecture Notes page Optimization

More information

Outline of the course

Outline of the course School of Mathematical Sciences PURE MTH 3022 Geometry of Surfaces III, Semester 2, 20 Outline of the course Contents. Review 2. Differentiation in R n. 3 2.. Functions of class C k 4 2.2. Mean Value Theorem

More information

The query register and working memory together form the accessible memory, denoted H A. Thus the state of the algorithm is described by a vector

The query register and working memory together form the accessible memory, denoted H A. Thus the state of the algorithm is described by a vector 1 Query model In the quantum query model we wish to compute some function f and we access the input through queries. The complexity of f is the number of queries needed to compute f on a worst-case input

More information

Discrete Differential Geometry. Discrete Laplace-Beltrami operator and discrete conformal mappings.

Discrete Differential Geometry. Discrete Laplace-Beltrami operator and discrete conformal mappings. Discrete differential geometry. Discrete Laplace-Beltrami operator and discrete conformal mappings. Technische Universität Berlin Geometric Methods in Classical and Quantum Lattice Systems, Caputh, September

More information

The Brownian map A continuous limit for large random planar maps

The Brownian map A continuous limit for large random planar maps The Brownian map A continuous limit for large random planar maps Jean-François Le Gall Université Paris-Sud Orsay and Institut universitaire de France Seminar on Stochastic Processes 0 Jean-François Le

More information

Eigenvalues and Eigenvectors

Eigenvalues and Eigenvectors /88 Chia-Ping Chen Department of Computer Science and Engineering National Sun Yat-sen University Linear Algebra Eigenvalue Problem /88 Eigenvalue Equation By definition, the eigenvalue equation for matrix

More information

LECTURE NOTE #11 PROF. ALAN YUILLE

LECTURE NOTE #11 PROF. ALAN YUILLE LECTURE NOTE #11 PROF. ALAN YUILLE 1. NonLinear Dimension Reduction Spectral Methods. The basic idea is to assume that the data lies on a manifold/surface in D-dimensional space, see figure (1) Perform

More information

4.5 The critical BGW tree

4.5 The critical BGW tree 4.5. THE CRITICAL BGW TREE 61 4.5 The critical BGW tree 4.5.1 The rooted BGW tree as a metric space We begin by recalling that a BGW tree T T with root is a graph in which the vertices are a subset of

More information

Definition and basic properties of heat kernels I, An introduction

Definition and basic properties of heat kernels I, An introduction Definition and basic properties of heat kernels I, An introduction Zhiqin Lu, Department of Mathematics, UC Irvine, Irvine CA 92697 April 23, 2010 In this lecture, we will answer the following questions:

More information

Spectral Theory, with an Introduction to Operator Means. William L. Green

Spectral Theory, with an Introduction to Operator Means. William L. Green Spectral Theory, with an Introduction to Operator Means William L. Green January 30, 2008 Contents Introduction............................... 1 Hilbert Space.............................. 4 Linear Maps

More information

154 Chapter 9 Hints, Answers, and Solutions The particular trajectories are highlighted in the phase portraits below.

154 Chapter 9 Hints, Answers, and Solutions The particular trajectories are highlighted in the phase portraits below. 54 Chapter 9 Hints, Answers, and Solutions 9. The Phase Plane 9.. 4. The particular trajectories are highlighted in the phase portraits below... 3. 4. 9..5. Shown below is one possibility with x(t) and

More information

Local semiconvexity of Kantorovich potentials on non-compact manifolds

Local semiconvexity of Kantorovich potentials on non-compact manifolds Local semiconvexity of Kantorovich potentials on non-compact manifolds Alessio Figalli, Nicola Gigli Abstract We prove that any Kantorovich potential for the cost function c = d / on a Riemannian manifold

More information

Commutative Banach algebras 79

Commutative Banach algebras 79 8. Commutative Banach algebras In this chapter, we analyze commutative Banach algebras in greater detail. So we always assume that xy = yx for all x, y A here. Definition 8.1. Let A be a (commutative)

More information

Methods for sparse analysis of high-dimensional data, II

Methods for sparse analysis of high-dimensional data, II Methods for sparse analysis of high-dimensional data, II Rachel Ward May 26, 2011 High dimensional data with low-dimensional structure 300 by 300 pixel images = 90, 000 dimensions 2 / 55 High dimensional

More information

Manifold Learning: Theory and Applications to HRI

Manifold Learning: Theory and Applications to HRI Manifold Learning: Theory and Applications to HRI Seungjin Choi Department of Computer Science Pohang University of Science and Technology, Korea seungjin@postech.ac.kr August 19, 2008 1 / 46 Greek Philosopher

More information

1 Math 241A-B Homework Problem List for F2015 and W2016

1 Math 241A-B Homework Problem List for F2015 and W2016 1 Math 241A-B Homework Problem List for F2015 W2016 1.1 Homework 1. Due Wednesday, October 7, 2015 Notation 1.1 Let U be any set, g be a positive function on U, Y be a normed space. For any f : U Y let

More information

A metric approach to shape comparison via multidimensional persistence

A metric approach to shape comparison via multidimensional persistence A metric approach to shape comparison via multidimensional persistence Patrizio Frosini 1,2 1 Department of Mathematics, University of Bologna, Italy 2 ARCES - Vision Mathematics Group, University of Bologna,

More information

Throughout these notes we assume V, W are finite dimensional inner product spaces over C.

Throughout these notes we assume V, W are finite dimensional inner product spaces over C. Math 342 - Linear Algebra II Notes Throughout these notes we assume V, W are finite dimensional inner product spaces over C 1 Upper Triangular Representation Proposition: Let T L(V ) There exists an orthonormal

More information

Metric spaces and metrizability

Metric spaces and metrizability 1 Motivation Metric spaces and metrizability By this point in the course, this section should not need much in the way of motivation. From the very beginning, we have talked about R n usual and how relatively

More information

Course Summary Math 211

Course Summary Math 211 Course Summary Math 211 table of contents I. Functions of several variables. II. R n. III. Derivatives. IV. Taylor s Theorem. V. Differential Geometry. VI. Applications. 1. Best affine approximations.

More information

Self-equilibrated Functions in Dual Vector Spaces: a Boundedness Criterion

Self-equilibrated Functions in Dual Vector Spaces: a Boundedness Criterion Self-equilibrated Functions in Dual Vector Spaces: a Boundedness Criterion Michel Théra LACO, UMR-CNRS 6090, Université de Limoges michel.thera@unilim.fr reporting joint work with E. Ernst and M. Volle

More information

Topological properties

Topological properties CHAPTER 4 Topological properties 1. Connectedness Definitions and examples Basic properties Connected components Connected versus path connected, again 2. Compactness Definition and first examples Topological

More information

CONVERGENCE THEORY. G. ALLAIRE CMAP, Ecole Polytechnique. 1. Maximum principle. 2. Oscillating test function. 3. Two-scale convergence

CONVERGENCE THEORY. G. ALLAIRE CMAP, Ecole Polytechnique. 1. Maximum principle. 2. Oscillating test function. 3. Two-scale convergence 1 CONVERGENCE THEOR G. ALLAIRE CMAP, Ecole Polytechnique 1. Maximum principle 2. Oscillating test function 3. Two-scale convergence 4. Application to homogenization 5. General theory H-convergence) 6.

More information

DECOMPOSING DIFFEOMORPHISMS OF THE SPHERE. inf{d Y (f(x), f(y)) : d X (x, y) r} H

DECOMPOSING DIFFEOMORPHISMS OF THE SPHERE. inf{d Y (f(x), f(y)) : d X (x, y) r} H DECOMPOSING DIFFEOMORPHISMS OF THE SPHERE ALASTAIR FLETCHER, VLADIMIR MARKOVIC 1. Introduction 1.1. Background. A bi-lipschitz homeomorphism f : X Y between metric spaces is a mapping f such that f and

More information

MA 575 Linear Models: Cedric E. Ginestet, Boston University Regularization: Ridge Regression and Lasso Week 14, Lecture 2

MA 575 Linear Models: Cedric E. Ginestet, Boston University Regularization: Ridge Regression and Lasso Week 14, Lecture 2 MA 575 Linear Models: Cedric E. Ginestet, Boston University Regularization: Ridge Regression and Lasso Week 14, Lecture 2 1 Ridge Regression Ridge regression and the Lasso are two forms of regularized

More information