Spectral Algorithms I. Slides based on the Spectral Mesh Processing SIGGRAPH 2010 course


Why Spectral? A different way to look at functions on a domain

Why Spectral? Better representations lead to simpler solutions

A Motivating Application: Shape Correspondence. Rigid alignment is easy; what about a different pose? The spectral transform normalizes shape pose before rigid alignment.

Spectral Geometry Processing. Use the eigenstructure of well-behaved linear operators for geometry processing.

Eigenstructure. Eigenvectors and eigenvalues: Au = λu, u ≠ 0. Diagonalization, or eigendecomposition: A = UΛU^T. Projection into an eigensubspace: ỹ = U^(k) U^(k)T y. DFT-like spectral transform: ŷ = U^T y.

Eigenstructure. Eigendecomposition A = UΛU^T. Given mesh geometry y and a positive definite matrix A, the subspace projection is ỹ = U^(3) U^(3)T y (illustrated on a mesh).

Eigenstructure. Eigendecomposition A = UΛU^T. Given mesh geometry y and a positive definite matrix A, the spectral transform is ŷ = U^T y.
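To make these three operations concrete, here is a minimal NumPy sketch (not from the course; the test matrix and signal are arbitrary) of the eigendecomposition, the eigensubspace projection, and the DFT-like spectral transform:

```python
import numpy as np

# An arbitrary symmetric positive definite test operator.
rng = np.random.default_rng(0)
B = rng.standard_normal((10, 10))
A = B @ B.T + 10 * np.eye(10)

# Eigendecomposition A = U Λ U^T (eigh returns ascending eigenvalues).
lam, U = np.linalg.eigh(A)

y = rng.standard_normal(10)      # a signal on the domain

# Projection into a k-dimensional eigensubspace: ỹ = U^(k) U^(k)T y.
k = 3
Uk = U[:, :k]
y_tilde = Uk @ (Uk.T @ y)

# DFT-like spectral transform: ŷ = U^T y.
y_hat = U.T @ y
```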

Classification of Applications. Eigenstructure(s) used: eigenvalues (signature for shape characterization), eigenvectors (form a spectral embedding, a transform), or eigenprojection (also a transform, DFT-like). Dimensionality of spectral embeddings: 1D (mesh sequencing), 2D or 3D (graph drawing or mesh parameterization), higher-D (clustering, segmentation, correspondence). Mesh operator used: Laplace-Beltrami, distance matrix, or other; combinatorial vs. geometric; 1st-order vs. higher-order; normalized vs. unnormalized.

Operators? Best: a symmetric positive definite operator, x^T Ax > 0 for any x ≠ 0. Can live with: positive semi-definite (x^T Ax ≥ 0 for any x), or non-symmetric as long as the eigenvalues are real and positive, e.g., L = DW, where W is SPD and D is diagonal. Beware of: non-square operators, complex eigenvalues, negative eigenvalues.

Spectral Processing Perspectives. Signal processing: filtering and compression; relation to the discrete Fourier transform (DFT). Geometric: global and intrinsic. Machine learning: dimensionality reduction.

The smoothing problem Smooth out rough features of a contour (2D shape)

Laplacian smoothing. Move each vertex towards the centroid of its neighbours. Here: centroid = midpoint; move halfway.

Laplacian smoothing and the Laplacian. Local averaging is governed by the 1D discrete Laplacian: L(x_i) = (x_{i-1} + x_{i+1})/2 − x_i, so one smoothing step moves each vertex halfway toward the midpoint of its neighbours: x_i ← x_i + ½ L(x_i).

Smoothing result Obtained by 10 steps of Laplacian smoothing
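As an illustration (a sketch, not the course's code), this smoothing step on a closed 2D contour can be written in a few lines of NumPy:

```python
import numpy as np

def laplacian_smooth(points, steps=10, lam=0.5):
    """Laplacian smoothing of a closed (periodic) 2D contour.

    points: (n, 2) array of contour vertices.
    Each step moves every vertex halfway (lam = 0.5) toward the
    midpoint of its two neighbours: x_i <- x_i + lam * L(x_i).
    """
    p = np.asarray(points, dtype=float).copy()
    for _ in range(steps):
        midpoints = 0.5 * (np.roll(p, 1, axis=0) + np.roll(p, -1, axis=0))
        p += lam * (midpoints - p)
    return p
```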

Signal representation. Represent a contour as a discrete periodic 2D signal; plotted: the x coordinates of the seahorse contour.

Laplacian smoothing in matrix form. The smoothing step becomes a smoothing operator S applied to the coordinate vector: X' = SX (x component only; y is treated the same way).

1D discrete Laplacian operator. The smoothing operator and the Laplacian operator are related by S = I + ½L.
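A sketch of both matrices under the conventions above (the periodic Laplacian has rows (…, ½, −1, ½, …)):

```python
import numpy as np

def periodic_laplacian(n):
    """1D discrete Laplacian of a periodic signal:
    (Lx)_i = (x_{i-1} + x_{i+1})/2 - x_i."""
    L = -np.eye(n)
    idx = np.arange(n)
    L[idx, (idx - 1) % n] = 0.5
    L[idx, (idx + 1) % n] = 0.5
    return L

n = 8
L = periodic_laplacian(n)
S = np.eye(n) + 0.5 * L      # one smoothing step: X' = S X
```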

Spectral analysis of signal/geometry. Express the signal X as a linear sum of eigenvectors: X = Σ_i x̂_i e_i. Project X along each eigenvector: x̂_i = e_i^T X. DFT-like spectral transform: X̂ = E^T X, mapping the spatial domain to the spectral domain.

Plot of eigenvectors First 8 eigenvectors of the 1D periodic Laplacian More oscillation as eigenvalues (frequencies) increase

Relation to the Discrete Fourier Transform. The smallest eigenvalue of L is zero. Each remaining eigenvalue (except the last one when n is even) has multiplicity 2, so the plotted real eigenvectors are not unique to L. One particular set of eigenvectors of L is the DFT basis; both sets exhibit similar oscillatory behaviour with respect to frequency.
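This is easy to check numerically (an illustrative sketch): because the periodic Laplacian is circulant, every complex DFT basis vector is an eigenvector of L.

```python
import numpy as np

n = 16
L = -np.eye(n)
idx = np.arange(n)
L[idx, (idx - 1) % n] = 0.5
L[idx, (idx + 1) % n] = 0.5

# k-th DFT basis vector; its eigenvalue under L is cos(2πk/n) - 1.
k = 3
f = np.exp(2j * np.pi * k * np.arange(n) / n) / np.sqrt(n)
lam = np.cos(2 * np.pi * k / n) - 1.0
assert np.allclose(L @ f, lam * f)
```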

Reconstruction and compression. Reconstruction using the k leading coefficients: X̃ = Σ_{i=1..k} x̂_i e_i. This is a form of spectral compression, with information loss given by the L_2 error ‖X − X̃‖_2.

Plot of spectral transform coefficients Fairly fast decay as eigenvalue increases

Reconstruction examples n = 401 n = 75
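A compact sketch of this compression for a periodic signal (illustrative; it reuses the periodic Laplacian defined earlier):

```python
import numpy as np

def spectral_compress(x, k):
    """Reconstruct a periodic signal x from its k lowest-frequency
    Laplacian spectral coefficients."""
    n = len(x)
    L = -np.eye(n)
    idx = np.arange(n)
    L[idx, (idx - 1) % n] = 0.5
    L[idx, (idx + 1) % n] = 0.5
    # Eigenvectors of -L (positive semi-definite), in ascending frequency.
    lam, E = np.linalg.eigh(-L)
    Ek = E[:, :k]                # k leading (low-frequency) eigenvectors
    x_hat = Ek.T @ x             # spectral coefficients
    return Ek @ x_hat            # reconstruction; loss is ||x - x_tilde||_2
```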

Laplacian smoothing as filtering. Recall the Laplacian smoothing operator S = I + ½L. Repeated application, X' = S^m X, is a filter applied to the spectral coefficients: coefficient i is scaled by (1 + λ_i/2)^m, where λ_i is the i-th eigenvalue of L.

Examples: the filter for m = 1, 5, 10, 50 smoothing steps.

Computational issues. There is no need to compute spectral coefficients for filtering: a polynomial filter (e.g., Laplacian smoothing) reduces to matrix-vector multiplication. Spectral compression does need the explicit spectral transform; efficient computation is possible [Levy et al. 08].
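The following sketch illustrates that equivalence (assumptions as above: S = I + ½L on a periodic signal): m smoothing steps by matrix-vector products match scaling each spectral coefficient by (1 + λ_i/2)^m.

```python
import numpy as np

n = 32
L = -np.eye(n)
idx = np.arange(n)
L[idx, (idx - 1) % n] = 0.5
L[idx, (idx + 1) % n] = 0.5
S = np.eye(n) + 0.5 * L

rng = np.random.default_rng(1)
x = rng.standard_normal(n)

# Filtering by repeated smoothing: no eigendecomposition needed.
m = 10
y = x.copy()
for _ in range(m):
    y = S @ y

# The same filter applied in the spectral domain.
lam, E = np.linalg.eigh(L)
y_spec = E @ ((1 + lam / 2) ** m * (E.T @ x))
assert np.allclose(y, y_spec)
```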

Towards a spectral mesh transform. Signal representation: vectors of the x, y, z vertex coordinates. Laplacian operator for meshes: encodes connectivity and geometry; either combinatorial (graph Laplacians and variants) or a discretization of the continuous Laplace-Beltrami operator. The same kind of spectral transform and analysis then applies.

Spectral Mesh Compression

Spectral Processing Perspectives. Signal processing: filtering and compression; relation to the discrete Fourier transform (DFT). Geometric: global and intrinsic. Machine learning: dimensionality reduction.

A geometric perspective: classical. Classical Euclidean geometry: primitives are not represented in coordinates; geometric relationships are deduced in a pure and self-contained manner through the use of axioms.

A geometric perspective: analytic. Descartes' analytic geometry introduced algebraic analysis tools; primitives are referenced in a global frame, an extrinsic approach.

Intrinsic approach. Riemann's intrinsic view of geometry: geometry is viewed purely from the surface perspective. Metric: distance between points on the surface. Many spaces (shapes) can be treated simultaneously: isometry.

Spectral methods: intrinsic view. The spectral approach takes the intrinsic view: intrinsic geometric/mesh information is captured via a linear mesh operator, and the eigenstructures of the operator present that information in an organized manner. We rarely need all eigenstructures; the dominant ones often suffice.

Capture of global information (Courant-Fischer). Let S ∈ R^{n×n} be a symmetric matrix with eigenvalues λ_1 ≤ λ_2 ≤ … ≤ λ_n. Then

λ_i = min { v^T S v : ‖v‖ = 1, v^T v_k = 0 for 1 ≤ k ≤ i−1 },

where v_1, v_2, …, v_{i−1} are eigenvectors of S corresponding to the smallest eigenvalues λ_1, λ_2, …, λ_{i−1}, respectively.

Interpretation. The quantity v^T S v / v^T v is the Rayleigh quotient. The eigenvector with the smallest eigenvalue minimizes the Rayleigh quotient; the k-th smallest eigenvector minimizes it among all vectors orthogonal to the previous eigenvectors. Eigenvectors are thus solutions to global optimization problems.
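A quick numerical illustration (a sketch with an arbitrary symmetric test matrix): no random unit-norm direction attains a smaller Rayleigh quotient than the smallest eigenvector.

```python
import numpy as np

rng = np.random.default_rng(2)
B = rng.standard_normal((6, 6))
S = (B + B.T) / 2                  # an arbitrary symmetric matrix

lam, V = np.linalg.eigh(S)         # ascending eigenvalues

def rayleigh(S, v):
    return (v @ S @ v) / (v @ v)

# The smallest eigenvector attains the minimum (Courant-Fischer, i = 1).
assert np.isclose(rayleigh(S, V[:, 0]), lam[0])
samples = rng.standard_normal((10000, 6))
assert all(rayleigh(S, v) >= lam[0] - 1e-12 for v in samples)
```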

Use of eigenstructures. Eigenvalues: spectral graph theory shows that graph eigenvalues are closely related to almost all major global graph invariants; they have been adopted as compact global shape descriptors. Eigenvectors: useful extremal properties, e.g., heuristics for NP-hard problems (normalized cuts, sequencing); spectral embeddings capture global information, e.g., for clustering.

Example: clustering problem

Spectral clustering. Encode information about pairwise point affinities: A_ij = exp(−‖p_i − p_j‖² / (2σ²)). Pipeline: input data → affinity operator A → spectral embedding via the leading eigenvectors.

Spectral clustering. The leading eigenvectors map the points into the spectral domain; perform any clustering (e.g., k-means) there.
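A minimal sketch of this pipeline (illustrative; sigma, k, and the use of SciPy's k-means are choices made here, not prescribed by the course):

```python
import numpy as np
from scipy.cluster.vq import kmeans2

def spectral_cluster(points, k, sigma=1.0):
    """Cluster points using the leading eigenvectors of a Gaussian
    affinity matrix as the spectral embedding."""
    d2 = np.sum((points[:, None, :] - points[None, :, :]) ** 2, axis=-1)
    A = np.exp(-d2 / (2 * sigma ** 2))     # affinity operator
    lam, U = np.linalg.eigh(A)             # ascending eigenvalues
    embedding = U[:, -k:]                  # k leading eigenvectors
    _, labels = kmeans2(embedding, k, minit='++')
    return labels
```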

Why does it work this way? Linkage-based clustering uses only local information; spectral clustering works in the spectral domain.

Local vs. global distances. A good distance: points in the same cluster are closer in the transformed domain. Looking at the set of shortest paths is more global. It would be nice to cluster according to c_ij, the commute-time distance: c_ij = expected time for a random walk to go from i to j and then back to i.

Local vs. global distances In spectral domain

Commute time and spectral embedding. Eigendecompose the graph Laplacian K: K = UΛU^T. Let K⁺ be the generalized inverse of K: K⁺ = UΛ⁺U^T, where Λ⁺_ii = 1/Λ_ii if Λ_ii ≠ 0 and Λ⁺_ii = 0 otherwise. Note: the Laplacian is singular, hence the generalized inverse.

Commute time and spectral embedding. Let z_i be the i-th row of U(Λ⁺)^{1/2}, the spectral embedding. Scaling each eigenvector by the inverse square root of its eigenvalue yields the commute-time distance [Klein & Randic 93, Fouss et al. 06]: ‖z_i − z_j‖² = c_ij. The full set of eigenvectors is used, but in practice one selects the first k.
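A sketch of that construction (illustrative; W is any symmetric affinity matrix, and the tolerance for zero eigenvalues is a choice made here):

```python
import numpy as np

def commute_time_embedding(W):
    """Spectral embedding whose squared pairwise distances are
    commute-time distances: ||z_i - z_j||^2 = c_ij."""
    D = np.diag(W.sum(axis=1))
    K = D - W                                  # graph Laplacian (singular)
    lam, U = np.linalg.eigh(K)
    lam_pinv = np.where(lam > 1e-10, 1.0 / lam, 0.0)   # diagonal of Λ⁺
    return U * np.sqrt(lam_pinv)               # rows z_i of U (Λ⁺)^{1/2}
```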

Example: intrinsic geometry. Our first example: correspondence. The spectral transform handles shape pose; rigid alignment then follows.

Spectral Processing Perspectives. Signal processing: filtering and compression; relation to the discrete Fourier transform (DFT). Geometric: global and intrinsic. Machine learning: dimensionality reduction.

Spectral embedding. Spectral decomposition A = UΛU^T. The full spectral embedding W = UΛ^{1/2}, given by the eigenvectors each scaled by the square root of its eigenvalue, completely captures the operator: WW^T = UΛU^T = A.
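Numerically (a sketch for an arbitrary positive semi-definite operator A):

```python
import numpy as np

rng = np.random.default_rng(3)
B = rng.standard_normal((5, 5))
A = B @ B.T                             # positive semi-definite operator

lam, U = np.linalg.eigh(A)
W = U * np.sqrt(np.maximum(lam, 0))     # full spectral embedding W = U Λ^{1/2}
assert np.allclose(W @ W.T, A)          # the embedding captures the operator
```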

Dimensionality reduction. The full spectral embedding is high-dimensional; using a few dominant eigenvectors gives a dimensionality reduction that is information-preserving and structure-enhancing (Polarization Theorem). The low-dimensional representation simplifies solutions.

Eckart & Young: info preserving. Let A ∈ R^{n×n} be symmetric and positive semi-definite, and let U^(k) ∈ R^{n×k} hold the leading eigenvectors of A, each scaled by the square root of its eigenvalue. Then U^(k) U^(k)T is the best rank-k approximation of A in the Frobenius norm.
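A quick numerical check (illustrative): the Frobenius error of this rank-k approximation equals the energy in the discarded eigenvalues, which is the minimum possible by Eckart-Young.

```python
import numpy as np

rng = np.random.default_rng(4)
B = rng.standard_normal((8, 8))
A = B @ B.T                            # symmetric positive semi-definite

lam, U = np.linalg.eigh(A)             # ascending eigenvalues
k = 3
Uk = U[:, -k:] * np.sqrt(lam[-k:])     # leading eigenvectors, scaled
A_k = Uk @ Uk.T                        # best rank-k approximation of A

err = np.linalg.norm(A - A_k, 'fro')
assert np.isclose(err, np.sqrt(np.sum(lam[:-k] ** 2)))
```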

Brand & Huang: Polarization Theorem

Low dimensions, simpler problems. A mesh projected into the eigenspace formed by the first two eigenvectors of a mesh Laplacian: 3D analysis is reduced to contour analysis [Liu & Zhang 07].

Challenges: not quite the DFT. The DFT basis is fixed given n, e.g., regular and easy to compare (Fourier descriptors). The spectral mesh transform is operator-dependent: which operator to use? Eigenfunctions can behave differently even on the same sphere.

Challenges: no free lunch. No mesh Laplacian on general meshes can satisfy a list of all desirable properties. Remedy: use nice meshes, e.g., Delaunay or non-obtuse.

Additional issues. Computational cost: FFT vs. eigendecomposition. The regularity of vibration patterns is lost, and eigenvectors are difficult to characterize (the eigenvalue alone is not enough). It is non-trivial to compare two sets of eigenvectors: how should they be paired up?

Conclusion. Use the eigenstructure of well-behaved linear operators for geometry processing: solve the problem in a different domain via a spectral transform, a Fourier analysis on meshes. This captures global and intrinsic shape characteristics, and dimensionality reduction is effective and simplifying.