A Random Walk Through Spectral Graph Theory: letting a random walker do the hard work of analysing graphs


1 A Random Walk Through Spectral Graph Theory letting a random walker do the hard work of analysing graphs Edwin Hancock Department of Computer Science University of York Supported by a Royal Society Wolfson Research Merit Award

2 In collaboration with Fan Zhang Huaijun Qiu Hongfang Wang Bai Xiao David Emms Richard Wilson

3 Pattern analysis and learning with graphs is difficult Graphs are not vectors: there is no natural ordering of nodes and edges, so correspondences must be used to establish order. Structural variations: the numbers of nodes and edges are not fixed; they can vary due to segmentation error. Not easily summarised: since graphs do not reside in a vector space, the mean and covariance are hard to characterise.

4 Structural Variations

5 Problem studied How can we find efficient means of characterising graph structure that do not involve exhaustive search? Enumerate properties of graph structure without explicit search, e.g. count cycles, path-length frequencies, etc. Can we analyse the structure of sets of graphs without solving the graph-matching problem? Inexact graph matching is the computational bottleneck for most problems involving graphs. Answer: let a random walker do the work.

6 Graph Spectral Methods Use eigenvalues and eigenvectors of the adjacency matrix (or Laplacian matrix): Biggs, Cvetković, Fan Chung. Singular-value methods for exact graph matching and point-set alignment (Umeyama; Scott and Longuet-Higgins; Shapiro and Brady). Use of eigenvectors for image segmentation (Shi and Malik) and for perceptual grouping (Freeman and Perona, Sarkar and Boyer). Graph-spectral methods for indexing shock trees (Dickinson and Shokoufandeh).

7 Random walks on graphs Determined by the Laplacian spectrum (and, in the continuous-time case, by the heat kernel). Can be used to interpret and analyse spectral methods, since they can be understood intuitively as path-based.

8 Outline Some spectral graph theory and links to random walks. Characterisations based on commute time, the heat kernel and the zeta function. Quantum walks, and how interference leads to some powerful alternative algorithms. Conclusions and future work.

9 Graph spectra and random walks Use spectrum of Laplacian matrix to compute hitting and commute times for random walk on a graph

10 Laplacian Matrix Weighted adjacency matrix: W(u,v) = w(u,v) if (u,v) ∈ E, and 0 otherwise. Degree matrix: D(u,u) = Σ_{v ∈ V} W(u,v). Laplacian matrix: L = D - W.
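As a concrete sketch of these definitions (plain numpy, with a small hypothetical graph; the weights here are just 0/1 for illustration):

```python
import numpy as np

# Hypothetical weighted adjacency matrix W for a 4-node graph.
W = np.array([[0., 1., 1., 0.],
              [1., 0., 1., 0.],
              [1., 1., 0., 1.],
              [0., 0., 1., 0.]])

D = np.diag(W.sum(axis=1))   # degree matrix: D(u,u) = sum_v W(u,v)
L = D - W                    # combinatorial Laplacian L = D - W

# Every row of L sums to zero, so the all-ones vector is in its null space.
print(L @ np.ones(4))
```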

11 Laplacian spectrum Spectral decomposition of the Laplacian: L = ΦΛΦ^T, where Λ = diag(λ_1, ..., λ_|V|), Φ = (φ_1 | φ_2 | ... | φ_|V|) and 0 = λ_1 ≤ λ_2 ≤ ... ≤ λ_|V|. Element-wise: L(u,v) = Σ_k λ_k φ_k(u) φ_k(v).

12 Properties of the Laplacian Eigenvalues are non-negative and the smallest eigenvalue is zero: 0 = λ_1 ≤ λ_2 ≤ ... ≤ λ_|V|. The multiplicity of the zero eigenvalue is the number of connected components of the graph. The zero eigenvalue is associated with the all-ones vector. The eigenvector associated with the second smallest eigenvalue is the Fiedler vector.

13 Discrete-time random walk State vector: p_t = T p_{t-1} = T^t p_0. Transition matrix: T = D^{-1} A. Steady state: p_s = T p_s, determined by the leading eigenvector of the adjacency matrix A, or equivalently the second (Fiedler) eigenvector of the Laplacian.
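A minimal numpy sketch of the discrete-time walk, on a hypothetical non-bipartite graph so that the walk actually converges; note that when the state is stored as a column vector, probability mass propagates with T^T:

```python
import numpy as np

# Hypothetical graph: a triangle 0-1-2 with a pendant node 3 attached to 2.
A = np.array([[0., 1., 1., 0.],
              [1., 0., 1., 0.],
              [1., 1., 0., 1.],
              [0., 0., 1., 0.]])
d = A.sum(axis=1)
T = np.diag(1.0 / d) @ A          # transition matrix T = D^{-1} A

p = np.array([1., 0., 0., 0.])    # walker starts at node 0
for _ in range(200):
    p = T.T @ p                   # propagate the probability distribution

# The walk is aperiodic (odd cycle present), so p converges to the
# degree-proportional stationary distribution d / vol.
print(p, d / d.sum())
```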

14 Continuous time random walk

15 Heat Kernels Solution of the heat equation; measures information flow across the edges of the graph with time: ∂h_t/∂t = -L h_t, with L = D - W = ΦΛΦ^T. The solution is found by exponentiating the Laplacian eigensystem: h_t = Σ_k exp[-λ_k t] φ_k φ_k^T = Φ exp[-Λt] Φ^T.
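A sketch of the heat kernel computed from the Laplacian eigensystem (numpy, on a hypothetical 4-node path graph):

```python
import numpy as np

# Hypothetical path graph on 4 nodes.
W = np.array([[0., 1., 0., 0.],
              [1., 0., 1., 0.],
              [0., 1., 0., 1.],
              [0., 0., 1., 0.]])
L = np.diag(W.sum(axis=1)) - W

lam, Phi = np.linalg.eigh(L)          # L = Phi diag(lam) Phi^T

def heat_kernel(t):
    """h_t = Phi exp(-Lambda t) Phi^T."""
    return Phi @ np.diag(np.exp(-lam * t)) @ Phi.T

h = heat_kernel(1.0)
# Rows of h_t sum to one: heat is conserved as it flows along the edges.
print(h.sum(axis=1))
```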

16 Heat kernel and random walk The state vector of the continuous-time random walk satisfies the differential equation ∂p_t/∂t = -L p_t. Solution: p_t = exp[-Lt] p_0 = h_t p_0.

17 Heat kernel and path lengths In terms of the number of paths of length k from node u to node v: h_t(u,v) = exp[-t] Σ_k P_k(u,v) t^k / k!, where P_k(u,v) = Σ_{i=1}^{|V|} (1 - λ_i)^k φ_i(u) φ_i(v).

18 Example The graph shows the spanning tree of the heat kernel; here the weights of the graph are the elements of the heat kernel. As t increases, the spanning tree evolves from a tree rooted near the centre of the graph to a string (with ligatures). Low-t behaviour is dominated by the Laplacian; high-t behaviour is dominated by the Fiedler vector.

19 Diffusion smoothing Use heat-kernel to smooth noisy data. Zhang and Hancock, Pattern Recognition 2008 and 2010 (in press)

20 Literature General diffusion-based partial differential equation (PDE): ∂u/∂t = div(D ∇u). If D = 1: isotropic linear diffusion (Gaussian filter). If D is a scalar function: inhomogeneous diffusion, e.g. Perona-Malik diffusion, D = g(|∇u|). If D is a 2×2 diffusion tensor: anisotropic diffusion, e.g. D = diag(g(|∇u|), 1) (Weickert, etc.).

21 Motivation: Why Graph Diffusion? Assumption for continuous PDEs: an image is a continuous function on R², and discretisation is needed for numerical implementation. Weakness: noisy images may not be sufficiently smooth to give reliable derivatives, and a fast, accurate and stable implementation is difficult to achieve. Solution: diffusion on graphs. An image is a smooth function on a graph, and purely combinatorial operators that require no discretisation are used.

22 Steps Set up the diffusion process as a problem involving a weighted graph, where anisotropic smoothing is modelled by an edge-weight matrix. Diffusion is heat flow on the associated graph structure: nodes are pixels, and grey-scale information diffuses along edges with time. The solution is given by exponentiating the spectrum of the associated Laplacian matrix.

23 Graph Representation of Images I An image is represented using a weighted graph Γ = (V, E). The nodes V are the pixels. An edge e_ij ∈ E is formed between two nodes v_i and v_j. The weight of an edge e_ij is denoted by w(i,j).

24 Graph Edge Weight Characterise each pixel i by an n×n window N_i of neighbours instead of using a single pixel alone. The similarity between nodes v_i and v_j is measured by the Gaussian-weighted Euclidean distance between windows, i.e. d_σ(i,j) = ||N_i - N_j||_{2,σ} = ||G_σ * (N_i - N_j)||_2. Thus, the edge weight is computed as w(i,j) = exp(-d_σ(i,j)² / (2r²)) if ||X(i) - X(j)|| ≤ κ, and 0 otherwise.
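A rough sketch of this weighting scheme. The window size, the parameters r and κ, and the Gaussian σ are all hypothetical choices here (the slides do not fix them):

```python
import numpy as np

rng = np.random.default_rng(0)
img = rng.random((8, 8))                  # hypothetical grey-scale image

def patch(img, i, j, half=1):
    """n x n window of neighbours around pixel (i, j) (clamped at borders)."""
    return img[max(i - half, 0):i + half + 1, max(j - half, 0):j + half + 1]

def gaussian_window(shape, sigma=1.0):
    """Gaussian weights G_sigma over a window."""
    ys, xs = np.indices(shape)
    c = ((shape[0] - 1) / 2.0, (shape[1] - 1) / 2.0)
    return np.exp(-((ys - c[0])**2 + (xs - c[1])**2) / (2 * sigma**2))

def edge_weight(img, p, q, r=0.1, kappa=2):
    """w(i,j) = exp(-d_sigma(i,j)^2 / (2 r^2)) for spatially close pixels."""
    if abs(p[0] - q[0]) + abs(p[1] - q[1]) > kappa:   # beyond kappa: weight 0
        return 0.0
    Np, Nq = patch(img, *p), patch(img, *q)
    if Np.shape != Nq.shape:                          # clamped at the border
        return 0.0
    G = gaussian_window(Np.shape)
    d2 = np.sum(G * (Np - Nq)**2)    # Gaussian-weighted squared distance
    return float(np.exp(-d2 / (2 * r**2)))

w = edge_weight(img, (3, 3), (3, 4))
print(0.0 <= w <= 1.0)
```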

25 Anisotropic diffusion as heat flow on a graph Vertices: pixel values. Edge weight: c.f. thermal conductivity. Diffusion equation for pixel values: ∂I_t/∂t = -L I_t, with pixel values stored as a vector of stacked image columns and the connectivity structure encoded by the Laplacian. Solution: I_t = exp[-Lt] I_0 = h_t I_0. The heat kernel is the smoothing filter.
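A toy sketch of the smoothing step, assuming uniform edge weights on a 4-neighbour grid rather than the anisotropic patch-based weights; the grid size and diffusion time are hypothetical:

```python
import numpy as np

# Build the Laplacian of a small 4-neighbour grid graph (pixels as nodes).
h_img, w_img = 6, 6
n = h_img * w_img
W = np.zeros((n, n))
for i in range(h_img):
    for j in range(w_img):
        u = i * w_img + j
        if j + 1 < w_img:
            W[u, u + 1] = W[u + 1, u] = 1.0          # right neighbour
        if i + 1 < h_img:
            W[u, u + w_img] = W[u + w_img, u] = 1.0  # neighbour below
L = np.diag(W.sum(axis=1)) - W
lam, Phi = np.linalg.eigh(L)

rng = np.random.default_rng(1)
I0 = np.ones(n) + 0.3 * rng.standard_normal(n)   # noisy flat image (stacked)
It = Phi @ (np.exp(-lam * 2.0) * (Phi.T @ I0))   # I_t = h_t I_0, here t = 2

# Diffusion shrinks the variance of the noise but preserves the mean.
print(I0.std(), It.std(), I0.mean(), It.mean())
```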

26 Results It takes 3-6 seconds to process a 256×256 image.

27 Root-Mean-Square Error comparison

28 Root-Mean-Square Error comparison

29

30 Smoothing in Diffusion Tensor MRI Tensor characterisation of the water diffusion direction at each voxel location. Assign tensors to prolate or oblate classes using the eigenvalues of the tensors (FA). For pairs of prolate tensors, the weight is determined by the angle between the principal eigenvectors. Otherwise the weight is determined by the geodesic distance between tensors on the half-cone.

31 Raw image

32 Add noise

33 Graph-based diffusion smoothing

34 Pennec Riemannian smoothing of tensors

35 Root mean square error

36 Root mean square error

37 Effect of smoothing

38 Details

39 Details

40

41 Raw and smoothed

42 Tracked fibers (raw and smoothed)

43 Details

44 Principal direction field, FA map, smoothed FA, and smoothed principal direction field.

45 Consistent labelling Run the probability diffusion backwards using the Fokker-Planck equation to restore label consistency.

46 Relaxation Labelling Aims at assigning consistent and unambiguous labels to a given set of objects. Relies on contextual information provided by the topology of the object arrangement and sets of compatibility relations on object configurations. Involves evidence combination to update label probabilities and propagate label-consistency constraints. Requires an initial label-probability assignment.

47 Relaxation Labelling Support functions for evidence combination (Hummel & Zucker; Kittler & Hancock): S_i^(k)(ω_j) = Σ_{i' ∈ N_i} Σ_{ω ∈ Ω} R_{i,i'}(ω_j, ω) P_{i'}^(k)(ω). Probability update formula: P_i^(k+1)(ω_j) = P_i^(k)(ω_j) S_i^(k)(ω_j) / Σ_{ω ∈ Ω} P_i^(k)(ω) S_i^(k)(ω).
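One round of a support-and-update step of this general form can be sketched as follows, with a hypothetical random compatibility matrix R and uniform initial probabilities:

```python
import numpy as np

n_obj, n_lab = 2, 2
rng = np.random.default_rng(0)

# R[i, j, a, b]: compatibility of label a at object i with label b at object j
# (hypothetical values; a real problem would supply these).
R = rng.random((n_obj, n_obj, n_lab, n_lab))
P = np.full((n_obj, n_lab), 1.0 / n_lab)     # initial label probabilities

# Support: S_i(a) = sum_{j != i} sum_b R[i,j,a,b] P[j,b]
S = np.einsum('ijab,jb->ia', R, P) - np.einsum('iiab,ib->ia', R, P)

# Update: P_i(a) <- P_i(a) S_i(a) / sum_b P_i(b) S_i(b)
P_new = P * S
P_new /= P_new.sum(axis=1, keepdims=True)
print(P_new.sum(axis=1))   # each object's probabilities still sum to one
```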

48 Graph spectral relaxation labelling Set up a support graph where nodes represent possible object-label assignments and edges represent label compatibility. Set up a random walk on the graph: the probability of visiting a node is the probability of the corresponding object-label assignment (state vector). Evolve the state vector of the random walk with time to update probabilities and combine evidence.

49 Method Exploit the similarities between diffusion processes and relaxation labelling to develop an iterative process for consistent labelling. A compositional graph structure is used to represent the labelling: each node on the graph is an object-label assignment, and edge weights encode label-compatibility information. Run a diffusion process to update the label probabilities.

50 Label-probability diffusion process Given label-set M and object-set O, the nodes of the graph are the Cartesian pairs V = O × M. Edge weights are label compatibilities, with weighted Laplacian L = D - W. Probability update formula: run the diffusion process backwards in time (Fokker-Planck equation): p_t = exp[t(D - W)] p_0 = Φ exp[Λt] Φ^T p_0.

51 Experiments Five synthetic data-sets: I. Two rings (R2). II. Ring-Gaussian (RG). III. Three Gaussian (G3). IV. Four Gaussian (G4_1). V. Four Gaussian (G4_2).

52 Experimental Results Results on the synthetic data-sets: Ring-Gaussian data (RG), Three Gaussian data (G3) and Four Gaussian data (G4_1).

53 Experimental Results Real-world data (from the UCI machine learning repository): results on the Iris and Wine data-sets.

54 Embedding using commute time Commute time is robust to edge deletion or edge weight errors. Qiu and Hancock, IEEE TPAMI 2007.

55 Hitting time Q(u,v): Expected number of steps of a random walk before node v is visited commencing from node u.

56 Commute time CT(u,v): Expected time taken for a random walk to travel from node u to node v and then return again CT(u,v)=Q(u,v)+Q(v,u)
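Commute time can be computed from the pseudo-inverse of the Laplacian; a minimal numpy sketch on a single-edge graph, where the walk must step across and back, so CT = 2:

```python
import numpy as np

# Commute time from the Laplacian pseudo-inverse G:
# CT(u, v) = vol * (G(u,u) + G(v,v) - 2 G(u,v)).
W = np.array([[0., 1.],
              [1., 0.]])                 # single-edge graph
L = np.diag(W.sum(axis=1)) - W
G = np.linalg.pinv(L)                    # Green's function (pseudo-inverse)
vol = W.sum()                            # volume = sum of degrees

def commute_time(u, v):
    return vol * (G[u, u] + G[v, v] - 2 * G[u, v])

print(commute_time(0, 1))                # one step across, one step back
```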

57 Idea: compute an edge attribute that is robust to modifications in edge structure. Commute time averages over all paths connecting a pair of nodes, so the effects of edge deletion are reduced.

58 Illustration Spanning tree vs commute-time spanning tree.

59 Green's function Spectral representation: G(u,v) = Σ_{i=2}^{|V|} (1/λ_i) φ_i(u) φ_i(v). Meaning: the pseudo-inverse of the Laplacian.

60 Commute Time Hitting time in terms of the Green's function: Q(u,v) = (vol/d_v) G(v,v) - (vol/d_u) G(u,v). Commute time: CT(u,v) = Q(u,v) + Q(v,u) = (vol/d_u) G(u,u) + (vol/d_v) G(v,v) - (vol/d_u) G(u,v) - (vol/d_v) G(v,u). Commute time and the Laplacian eigenspectrum: CT(u,v) = vol Σ_{i=2}^{|V|} (1/λ_i) (φ_i(u) - φ_i(v))².

61 Commute Time Embedding The embedding that preserves commute time has co-ordinate matrix (vectors of co-ordinates are columns): Y = √vol Λ^{-1/2} Φ^T.
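A sketch checking the preservation property, assuming the co-ordinate matrix Y = √vol Λ^{-1/2} Φ^T restricted to the non-zero eigenpairs; squared Euclidean distances between columns then reproduce commute times:

```python
import numpy as np

# Hypothetical 4-node graph: triangle 0-1-2 plus a pendant node 3 on 2.
W = np.array([[0., 1., 1., 0.],
              [1., 0., 1., 0.],
              [1., 1., 0., 1.],
              [0., 0., 1., 0.]])
L = np.diag(W.sum(axis=1)) - W
vol = W.sum()
lam, Phi = np.linalg.eigh(L)
nz = lam > 1e-10                               # drop the zero eigenvalue
Y = np.sqrt(vol) * np.diag(lam[nz]**-0.5) @ Phi[:, nz].T

# Compare against commute time from the Laplacian pseudo-inverse.
G = np.linalg.pinv(L)
ct = vol * (G[0, 0] + G[3, 3] - 2 * G[0, 3])   # commute time CT(0, 3)
dist2 = np.sum((Y[:, 0] - Y[:, 3])**2)         # squared embedding distance
print(ct, dist2)                                # the two agree
```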

62 Diffusion map Transition probability matrix: P = D^{-1} W. Eigensystem: PΨ = ΨΛ. Diffusion time: D_t²(u,v) = Σ_{i=1}^{m} λ_i^{2t} [ψ_i(u) - ψ_i(v)]². Embedding: Y = Λ^t Ψ^T. Commute time is the sum of diffusion time over all time steps t.

63 Example embedding

64 Examples

65 Multi-body Motion Analysis Use the Costeira and Kanade factorisation method to decompose the movement of a set of points over a series of frames into a shape matrix and a shape-interaction matrix: [X/Y] = M S. The shape-interaction matrix is used as the adjacency weight matrix, W = S, and the Laplacian is computed. Points are embedded in a vector space using the commute-time embedding. Independently moving objects correspond to clusters in the embedding space.

66 Multibody motion analysis

67 Videos

68 Videos

69 Videos

70 Videos

71 Videos

72 Motions

73 Embeddings

74 Separation accuracy

75 Invariants from the heat kernel Extract permutation-invariant characteristics from the heat-kernel trace. Links with the Ihara zeta function and the path-length distribution on the graph. Xiao, Wilson and Hancock, Pattern Recognition 2009.

76 Setting the scene: probe the structure of a graph without explicit search.

77 Moments of the heat-kernel trace: can we characterise a graph by the shape of its heat-kernel trace function?

78 Heat Kernel Trace Tr[h_t] = Σ_i exp[-λ_i t]. The shape of the heat-kernel trace distinguishes graphs; can we characterise its shape using moments?

79 Rosenberg Zeta function Definition of the zeta function: ζ(s) = Σ_{λ_k ≠ 0} λ_k^{-s}.

80 Heat-kernel moments Mellin transform: λ_i^{-s} = (1/Γ(s)) ∫_0^∞ t^{s-1} exp[-λ_i t] dt, where Γ(s) = ∫_0^∞ t^{s-1} exp[-t] dt. Trace and number of connected components: Tr[h_t] = C + Σ_{λ_i ≠ 0} exp[-λ_i t], where C is the multiplicity of the zero eigenvalue, i.e. the number of connected components of the graph. Zeta function: ζ(s) = Σ_{λ_i ≠ 0} λ_i^{-s} = (1/Γ(s)) ∫_0^∞ t^{s-1} (Tr[h_t] - C) dt. The zeta function is therefore related to the moments of the heat-kernel trace.
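A sketch of the zeta-function features computed directly from the Laplacian spectrum (numpy, on a triangle graph, whose non-zero Laplacian eigenvalues are 3 and 3):

```python
import numpy as np

# zeta(s) = sum over non-zero eigenvalues of lam_k^{-s}; its values at
# s = 1, 2, 3, 4 serve as moment-style features of the heat-kernel trace.
W = np.array([[0., 1., 1.],
              [1., 0., 1.],
              [1., 1., 0.]])               # triangle graph K3
L = np.diag(W.sum(axis=1)) - W
lam = np.linalg.eigvalsh(L)
nz = lam[lam > 1e-10]                      # non-zero eigenvalues: 3, 3

def zeta(s):
    return np.sum(nz ** (-float(s)))

print([zeta(s) for s in (1, 2, 3, 4)])     # [2/3, 2/9, 2/27, 2/81]
```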

81 Zeta-function behavior

82 Objects 72 views of each object, taken at 5-degree intervals as the camera moves in a circle around the object. Feature points are extracted using a corner detector. Construct a Voronoi tessellation of the image plane using the corner points as seeds. The Delaunay graph is the region adjacency graph of the Voronoi regions.

83 Heat kernel moments (zeta(s), s=1,2,3,4)

84 PCA using zeta(s), s=1,2,3,4

85 PCA on Laplace spectrum

86

87

88 Ox-Caltech database

89 Line-patterns Use the Huet and Hancock representation (TPAMI 1999). Extract straight line segments from the Canny edge map. Weights are computed using continuity and proximity. Captures arrangement Gestalts.

90

91 Zeta function derivative Zeta function in terms of the natural exponential: ζ(s) = Σ_{λ_k ≠ 0} λ_k^{-s} = Σ_{λ_k ≠ 0} exp[-s ln λ_k]. Derivative: ζ'(s) = -Σ_{λ_k ≠ 0} ln λ_k exp[-s ln λ_k]. Derivative at the origin: ζ'(0) = -Σ_{λ_k ≠ 0} ln λ_k = ln Π_{λ_k ≠ 0} (1/λ_k).

92 Meaning Number of spanning trees in the graph: τ(G) = (Π_{u ∈ V} d_u / Σ_{u ∈ V} d_u) exp[-ζ'(0)].
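A sketch of this relation, assuming the zeta function here is built on the normalized Laplacian spectrum (the form under which the degree terms appear in the tree-count formula); the result is cross-checked against the matrix-tree theorem:

```python
import numpy as np

# tau(G) = (prod_u d_u / sum_u d_u) * exp[-zeta'(0)], where
# zeta'(0) = -sum ln lam_k over the non-zero normalized-Laplacian eigenvalues,
# so exp[-zeta'(0)] = prod lam_k.
W = np.array([[0., 1., 1.],
              [1., 0., 1.],
              [1., 1., 0.]])               # triangle: 3 spanning trees
d = W.sum(axis=1)
Ln = np.eye(3) - np.diag(d**-0.5) @ W @ np.diag(d**-0.5)  # normalized Laplacian
lam = np.linalg.eigvalsh(Ln)
nz = lam[lam > 1e-10]

zeta_prime_0 = -np.sum(np.log(nz))         # zeta'(0) = -sum ln lam_k
tau = np.prod(d) / np.sum(d) * np.exp(-zeta_prime_0)
print(round(tau))                          # 3 spanning trees
```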

93 COIL

94 Ox-Cal

95 Eigenvalue polynomials (COIL) Determinant Trace

96 Eigenvalue Polynomials (Ox- Cal)

97 Spectral polynomials (COIL)

98 Spectral Polynomials (Ox-Cal)

99 COIL: node and edge frequency

100 Ox-Cal: node+edge frequency

101 Performance Rand index = correct/(correct + wrong), comparing the zeta function, symmetric polynomials of the eigenvalues, symmetric polynomials of the matrix, and the Laplacian spectrum.

102 Quantum Walks Have both discrete-time (DQW) and continuous-time (CQW) variants. Use qubits to represent the state of the walk. The state vector is composed of complex amplitudes rather than probabilities. Governed by unitary matrices rather than stochastic matrices. Admit interference between different feasible walks. Reversible, non-ergodic, with no limiting distribution.

103 Literature Farhi and Gutmann: the CQW can penetrate a family of trees in polynomial time, whereas the classical walk cannot. Childs: the CQW hits exponentially faster than the classical walk. Kempe: exploits polynomial hitting time to solve the 2-SAT problem and suggests a solution to the routing problem. Shenvi, Kempe and Whaley: search of an unordered database.

104 Quantum walks have richer structure due to interference and the complex representation. What can a quantum walker discover about a graph that a classical walker cannot?

105 New possibilities Interference in quantum walks can be used to develop alternatives to classical random-walk algorithms.

106 Consequences of interference Quantum walks hit exponentially faster than classical ones. Local symmetry is reflected in the cancellation of complex amplitudes. This can be exploited to lift the co-spectrality of strongly regular graphs. A quantum version of the commute-time embedding can be used to detect symmetries.

107 Strongly Regular Graphs Two SRGs with parameters (16,6,2,2) A B

108 Is the spectrum S+(U^3) able to distinguish all SRGs? There is no method proven to decide whether two SRGs are isomorphic in polynomial time, but there are large families of strongly regular graphs on which we can test the method. MDS embeddings of the SRGs with parameters (25,12,5,6) (red), (26,10,3,4) (blue), (29,14,6,7) (black) and (40,12,2,4) (green), using the adjacency spectrum (top) and the spectrum of S+(U^3) (bottom).

109 Testing The algorithm takes time O(|E|³). We have tested it on all pairs of graphs from the table opposite and found that it successfully determines whether the graphs are isomorphic in all cases.

110 Continuous Time Quantum Walk Evolution governed by Schrödinger's equation: d/dt |ψ_t⟩ = -iL |ψ_t⟩. Solution: |ψ_t⟩ = exp[-iLt] |ψ_0⟩.
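A sketch of the CQW evolution via the Laplacian eigensystem (numpy, hypothetical 4-cycle); note the complex amplitudes and the unitary, norm-preserving evolution:

```python
import numpy as np

# Hypothetical 4-cycle 0-1-2-3-0.
W = np.array([[0., 1., 0., 1.],
              [1., 0., 1., 0.],
              [0., 1., 0., 1.],
              [1., 0., 1., 0.]])
L = np.diag(W.sum(axis=1)) - W
lam, Phi = np.linalg.eigh(L)

psi0 = np.zeros(4, dtype=complex)
psi0[0] = 1.0                              # walker starts at node 0

def evolve(t):
    """|psi_t> = Phi exp(-i Lambda t) Phi^T |psi_0> (unitary evolution)."""
    U = Phi @ np.diag(np.exp(-1j * lam * t)) @ Phi.T
    return U @ psi0

psi = evolve(1.5)
prob = np.abs(psi) ** 2                    # P(X_t = v) = |<v|exp(-iLt)|u>|^2
print(prob, prob.sum())                    # probabilities sum to 1
```

Because the 4-cycle is symmetric about the start node, the amplitudes at nodes 1 and 3 stay equal for all t, a small instance of the symmetry-induced cancellation discussed later.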

111 Walks compared Classical walk Quantum walk

112 Quantum hitting time Probability that the walk is at node v at time t: P(X_t = v) = |⟨v| exp[-iLt] |u⟩|² = |⟨v| Φ exp[-iΛt] Φ^T |u⟩|². The hazard function R(t) satisfies dR(t)/dt = P(X_t = v)[1 - R(t)]. Hitting time is the expectation of time weighted by the hazard function: Q(u,v) = ∫_0^∞ t (dR(t)/dt) dt.

113 Average commute vs path length The quantum walk reduces the effect of path length (plots: quantum and classical).

114 Example Embedding

115 Wheel with spokes

116 Symmetries

117 Symmetries of 3x3 Grid

118 8x8 Grid

119 Conclusions Interference of quantum walks can be exploited in algorithm design. It leads to a method for resolving the co-spectrality of graphs and trees, and can also be used to design an algorithm for inexact graph matching. The quantum commute time differs from its classical counterpart, and leads to embeddings that can be used to characterise symmetries in graphs.

120 Conclusions Random walks provide a powerful framework for path-based or flow-based analysis in vision and pattern recognition. They are convenient since they rely on the Laplacian eigensystem, and although eigendecomposition is time-consuming, implementation is easy. They allow both low-level and high-level problems to be addressed. Quantum walks offer a fascinating alternative which allows for interference between alternative walks.


SPECTRAL CLUSTERING AND KERNEL PRINCIPAL COMPONENT ANALYSIS ARE PURSUING GOOD PROJECTIONS

SPECTRAL CLUSTERING AND KERNEL PRINCIPAL COMPONENT ANALYSIS ARE PURSUING GOOD PROJECTIONS SPECTRAL CLUSTERING AND KERNEL PRINCIPAL COMPONENT ANALYSIS ARE PURSUING GOOD PROJECTIONS VIKAS CHANDRAKANT RAYKAR DECEMBER 5, 24 Abstract. We interpret spectral clustering algorithms in the light of unsupervised

More information

Data Analysis and Manifold Learning Lecture 7: Spectral Clustering

Data Analysis and Manifold Learning Lecture 7: Spectral Clustering Data Analysis and Manifold Learning Lecture 7: Spectral Clustering Radu Horaud INRIA Grenoble Rhone-Alpes, France Radu.Horaud@inrialpes.fr http://perception.inrialpes.fr/ Outline of Lecture 7 What is spectral

More information

THE analysis of relational patterns or graphs has proven to

THE analysis of relational patterns or graphs has proven to IEEE TRANSACTIONS ON PATTERN ANALYSIS AND MACHINE INTELLIGENCE, VOL 27, NO 7, JULY 2005 1 Pattern Vectors from Algebraic Graph Theory Richard C Wilson, Member, IEEE Computer Society, Edwin R Hancock, and

More information

THE HIDDEN CONVEXITY OF SPECTRAL CLUSTERING

THE HIDDEN CONVEXITY OF SPECTRAL CLUSTERING THE HIDDEN CONVEXITY OF SPECTRAL CLUSTERING Luis Rademacher, Ohio State University, Computer Science and Engineering. Joint work with Mikhail Belkin and James Voss This talk A new approach to multi-way

More information

arxiv:quant-ph/ v1 15 Apr 2005

arxiv:quant-ph/ v1 15 Apr 2005 Quantum walks on directed graphs Ashley Montanaro arxiv:quant-ph/0504116v1 15 Apr 2005 February 1, 2008 Abstract We consider the definition of quantum walks on directed graphs. Call a directed graph reversible

More information

Learning Eigenfunctions: Links with Spectral Clustering and Kernel PCA

Learning Eigenfunctions: Links with Spectral Clustering and Kernel PCA Learning Eigenfunctions: Links with Spectral Clustering and Kernel PCA Yoshua Bengio Pascal Vincent Jean-François Paiement University of Montreal April 2, Snowbird Learning 2003 Learning Modal Structures

More information

Diffusion and random walks on graphs

Diffusion and random walks on graphs Diffusion and random walks on graphs Leonid E. Zhukov School of Data Analysis and Artificial Intelligence Department of Computer Science National Research University Higher School of Economics Structural

More information

Spectral Feature Vectors for Graph Clustering

Spectral Feature Vectors for Graph Clustering Spectral Feature Vectors for Graph Clustering Bin Luo,, Richard C. Wilson,andEdwinR.Hancock Department of Computer Science, University of York York YO DD, UK Anhui University, P.R. China {luo,wilson,erh}@cs.york.ac.uk

More information

Point Pattern Matching Based on Line Graph Spectral Context and Descriptor Embedding

Point Pattern Matching Based on Line Graph Spectral Context and Descriptor Embedding Point Pattern Matching Based on Line Graph Spectral Context and Descriptor Embedding Jun Tang Key Lab of IC&SP, Ministry of Education Anhui University Hefei, China tangjunahu@gmail.com Ling Shao, Simon

More information

Spectral Graph Theory

Spectral Graph Theory Spectral Graph Theory Aaron Mishtal April 27, 2016 1 / 36 Outline Overview Linear Algebra Primer History Theory Applications Open Problems Homework Problems References 2 / 36 Outline Overview Linear Algebra

More information

Math 307 Learning Goals. March 23, 2010

Math 307 Learning Goals. March 23, 2010 Math 307 Learning Goals March 23, 2010 Course Description The course presents core concepts of linear algebra by focusing on applications in Science and Engineering. Examples of applications from recent

More information

A New Spectral Technique Using Normalized Adjacency Matrices for Graph Matching 1

A New Spectral Technique Using Normalized Adjacency Matrices for Graph Matching 1 CHAPTER-3 A New Spectral Technique Using Normalized Adjacency Matrices for Graph Matching Graph matching problem has found many applications in areas as diverse as chemical structure analysis, pattern

More information

Exploiting Sparse Non-Linear Structure in Astronomical Data

Exploiting Sparse Non-Linear Structure in Astronomical Data Exploiting Sparse Non-Linear Structure in Astronomical Data Ann B. Lee Department of Statistics and Department of Machine Learning, Carnegie Mellon University Joint work with P. Freeman, C. Schafer, and

More information

CMPSCI 791BB: Advanced ML: Laplacian Learning

CMPSCI 791BB: Advanced ML: Laplacian Learning CMPSCI 791BB: Advanced ML: Laplacian Learning Sridhar Mahadevan Outline! Spectral graph operators! Combinatorial graph Laplacian! Normalized graph Laplacian! Random walks! Machine learning on graphs! Clustering!

More information

Linear Diffusion and Image Processing. Outline

Linear Diffusion and Image Processing. Outline Outline Linear Diffusion and Image Processing Fourier Transform Convolution Image Restoration: Linear Filtering Diffusion Processes for Noise Filtering linear scale space theory Gauss-Laplace pyramid for

More information

Linear Algebra for Machine Learning. Sargur N. Srihari

Linear Algebra for Machine Learning. Sargur N. Srihari Linear Algebra for Machine Learning Sargur N. srihari@cedar.buffalo.edu 1 Overview Linear Algebra is based on continuous math rather than discrete math Computer scientists have little experience with it

More information

Discrete quantum random walks

Discrete quantum random walks Quantum Information and Computation: Report Edin Husić edin.husic@ens-lyon.fr Discrete quantum random walks Abstract In this report, we present the ideas behind the notion of quantum random walks. We further

More information

Graph Metrics and Dimension Reduction

Graph Metrics and Dimension Reduction Graph Metrics and Dimension Reduction Minh Tang 1 Michael Trosset 2 1 Applied Mathematics and Statistics The Johns Hopkins University 2 Department of Statistics Indiana University, Bloomington November

More information

Data Analysis and Manifold Learning Lecture 3: Graphs, Graph Matrices, and Graph Embeddings

Data Analysis and Manifold Learning Lecture 3: Graphs, Graph Matrices, and Graph Embeddings Data Analysis and Manifold Learning Lecture 3: Graphs, Graph Matrices, and Graph Embeddings Radu Horaud INRIA Grenoble Rhone-Alpes, France Radu.Horaud@inrialpes.fr http://perception.inrialpes.fr/ Outline

More information

Manifold Coarse Graining for Online Semi-supervised Learning

Manifold Coarse Graining for Online Semi-supervised Learning for Online Semi-supervised Learning Mehrdad Farajtabar, Amirreza Shaban, Hamid R. Rabiee, Mohammad H. Rohban Digital Media Lab, Department of Computer Engineering, Sharif University of Technology, Tehran,

More information

Low-level Image Processing

Low-level Image Processing Low-level Image Processing In-Place Covariance Operators for Computer Vision Terry Caelli and Mark Ollila School of Computing, Curtin University of Technology, Perth, Western Australia, Box U 1987, Emaihtmc@cs.mu.oz.au

More information

A New Space for Comparing Graphs

A New Space for Comparing Graphs A New Space for Comparing Graphs Anshumali Shrivastava and Ping Li Cornell University and Rutgers University August 18th 2014 Anshumali Shrivastava and Ping Li ASONAM 2014 August 18th 2014 1 / 38 Main

More information

Online Kernel PCA with Entropic Matrix Updates

Online Kernel PCA with Entropic Matrix Updates Dima Kuzmin Manfred K. Warmuth Computer Science Department, University of California - Santa Cruz dima@cse.ucsc.edu manfred@cse.ucsc.edu Abstract A number of updates for density matrices have been developed

More information

Nonlinear Dimensionality Reduction

Nonlinear Dimensionality Reduction Outline Hong Chang Institute of Computing Technology, Chinese Academy of Sciences Machine Learning Methods (Fall 2012) Outline Outline I 1 Kernel PCA 2 Isomap 3 Locally Linear Embedding 4 Laplacian Eigenmap

More information

A quantum walk based search algorithm, and its optical realisation

A quantum walk based search algorithm, and its optical realisation A quantum walk based search algorithm, and its optical realisation Aurél Gábris FJFI, Czech Technical University in Prague with Tamás Kiss and Igor Jex Prague, Budapest Student Colloquium and School on

More information

Consistent Positive Directional Splitting of Anisotropic Diffusion

Consistent Positive Directional Splitting of Anisotropic Diffusion In: Boštjan Likar (ed.): Proc. of Computer Vision Winter Workshop, Bled, Slovenia, Feb. 7-9, 2001, pp. 37-48. Consistent Positive Directional Splitting of Anisotropic Diffusion Pavel Mrázek and Mirko Navara

More information

LECTURE NOTE #11 PROF. ALAN YUILLE

LECTURE NOTE #11 PROF. ALAN YUILLE LECTURE NOTE #11 PROF. ALAN YUILLE 1. NonLinear Dimension Reduction Spectral Methods. The basic idea is to assume that the data lies on a manifold/surface in D-dimensional space, see figure (1) Perform

More information

Spectral Clustering. Guokun Lai 2016/10

Spectral Clustering. Guokun Lai 2016/10 Spectral Clustering Guokun Lai 2016/10 1 / 37 Organization Graph Cut Fundamental Limitations of Spectral Clustering Ng 2002 paper (if we have time) 2 / 37 Notation We define a undirected weighted graph

More information

MACHINE LEARNING. Methods for feature extraction and reduction of dimensionality: Probabilistic PCA and kernel PCA

MACHINE LEARNING. Methods for feature extraction and reduction of dimensionality: Probabilistic PCA and kernel PCA 1 MACHINE LEARNING Methods for feature extraction and reduction of dimensionality: Probabilistic PCA and kernel PCA 2 Practicals Next Week Next Week, Practical Session on Computer Takes Place in Room GR

More information

SURF Features. Jacky Baltes Dept. of Computer Science University of Manitoba WWW:

SURF Features. Jacky Baltes Dept. of Computer Science University of Manitoba   WWW: SURF Features Jacky Baltes Dept. of Computer Science University of Manitoba Email: jacky@cs.umanitoba.ca WWW: http://www.cs.umanitoba.ca/~jacky Salient Spatial Features Trying to find interest points Points

More information

Sparse Approximation: from Image Restoration to High Dimensional Classification

Sparse Approximation: from Image Restoration to High Dimensional Classification Sparse Approximation: from Image Restoration to High Dimensional Classification Bin Dong Beijing International Center for Mathematical Research Beijing Institute of Big Data Research Peking University

More information

Regression. Goal: Learn a mapping from observations (features) to continuous labels given a training set (supervised learning)

Regression. Goal: Learn a mapping from observations (features) to continuous labels given a training set (supervised learning) Linear Regression Regression Goal: Learn a mapping from observations (features) to continuous labels given a training set (supervised learning) Example: Height, Gender, Weight Shoe Size Audio features

More information

Regression. Goal: Learn a mapping from observations (features) to continuous labels given a training set (supervised learning)

Regression. Goal: Learn a mapping from observations (features) to continuous labels given a training set (supervised learning) Linear Regression Regression Goal: Learn a mapping from observations (features) to continuous labels given a training set (supervised learning) Example: Height, Gender, Weight Shoe Size Audio features

More information

Consensus Algorithms for Camera Sensor Networks. Roberto Tron Vision, Dynamics and Learning Lab Johns Hopkins University

Consensus Algorithms for Camera Sensor Networks. Roberto Tron Vision, Dynamics and Learning Lab Johns Hopkins University Consensus Algorithms for Camera Sensor Networks Roberto Tron Vision, Dynamics and Learning Lab Johns Hopkins University Camera Sensor Networks Motes Small, battery powered Embedded camera Wireless interface

More information

CS 556: Computer Vision. Lecture 13

CS 556: Computer Vision. Lecture 13 CS 556: Computer Vision Lecture 1 Prof. Sinisa Todorovic sinisa@eecs.oregonstate.edu 1 Outline Perceptual grouping Low-level segmentation Ncuts Perceptual Grouping What do you see? 4 What do you see? Rorschach

More information

Quantum walk algorithms

Quantum walk algorithms Quantum walk algorithms Andrew Childs Institute for Quantum Computing University of Waterloo 28 September 2011 Randomized algorithms Randomness is an important tool in computer science Black-box problems

More information

CSE 554 Lecture 7: Alignment

CSE 554 Lecture 7: Alignment CSE 554 Lecture 7: Alignment Fall 2012 CSE554 Alignment Slide 1 Review Fairing (smoothing) Relocating vertices to achieve a smoother appearance Method: centroid averaging Simplification Reducing vertex

More information

Introduction to Machine Learning. PCA and Spectral Clustering. Introduction to Machine Learning, Slides: Eran Halperin

Introduction to Machine Learning. PCA and Spectral Clustering. Introduction to Machine Learning, Slides: Eran Halperin 1 Introduction to Machine Learning PCA and Spectral Clustering Introduction to Machine Learning, 2013-14 Slides: Eran Halperin Singular Value Decomposition (SVD) The singular value decomposition (SVD)

More information

Efficient Beltrami Filtering of Color Images via Vector Extrapolation

Efficient Beltrami Filtering of Color Images via Vector Extrapolation Efficient Beltrami Filtering of Color Images via Vector Extrapolation Lorina Dascal, Guy Rosman, and Ron Kimmel Computer Science Department, Technion, Institute of Technology, Haifa 32000, Israel Abstract.

More information

Fast and Accurate HARDI and its Application to Neurological Diagnosis

Fast and Accurate HARDI and its Application to Neurological Diagnosis Fast and Accurate HARDI and its Application to Neurological Diagnosis Dr. Oleg Michailovich Department of Electrical and Computer Engineering University of Waterloo June 21, 2011 Outline 1 Diffusion imaging

More information

Medical Image Analysis

Medical Image Analysis Medical Image Analysis CS 593 / 791 Computer Science and Electrical Engineering Dept. West Virginia University 23rd January 2006 Outline 1 Recap 2 Edge Enhancement 3 Experimental Results 4 The rest of

More information

Face Recognition Using Laplacianfaces He et al. (IEEE Trans PAMI, 2005) presented by Hassan A. Kingravi

Face Recognition Using Laplacianfaces He et al. (IEEE Trans PAMI, 2005) presented by Hassan A. Kingravi Face Recognition Using Laplacianfaces He et al. (IEEE Trans PAMI, 2005) presented by Hassan A. Kingravi Overview Introduction Linear Methods for Dimensionality Reduction Nonlinear Methods and Manifold

More information

Filtering via a Reference Set. A.Haddad, D. Kushnir, R.R. Coifman Technical Report YALEU/DCS/TR-1441 February 21, 2011

Filtering via a Reference Set. A.Haddad, D. Kushnir, R.R. Coifman Technical Report YALEU/DCS/TR-1441 February 21, 2011 Patch-based de-noising algorithms and patch manifold smoothing have emerged as efficient de-noising methods. This paper provides a new insight on these methods, such as the Non Local Means or the image

More information

ITK Filters. Thresholding Edge Detection Gradients Second Order Derivatives Neighborhood Filters Smoothing Filters Distance Map Image Transforms

ITK Filters. Thresholding Edge Detection Gradients Second Order Derivatives Neighborhood Filters Smoothing Filters Distance Map Image Transforms ITK Filters Thresholding Edge Detection Gradients Second Order Derivatives Neighborhood Filters Smoothing Filters Distance Map Image Transforms ITCS 6010:Biomedical Imaging and Visualization 1 ITK Filters:

More information

arxiv: v1 [cs.dm] 14 Feb 2013

arxiv: v1 [cs.dm] 14 Feb 2013 Eigenfunctions of the Edge-Based Laplacian on a Graph arxiv:1302.3433v1 [cs.dm] 14 Feb 2013 Richard C. Wilson, Furqan Aziz, Edwin R. Hancock Dept. of Computer Science University of York, York, UK March

More information

Connection of Local Linear Embedding, ISOMAP, and Kernel Principal Component Analysis

Connection of Local Linear Embedding, ISOMAP, and Kernel Principal Component Analysis Connection of Local Linear Embedding, ISOMAP, and Kernel Principal Component Analysis Alvina Goh Vision Reading Group 13 October 2005 Connection of Local Linear Embedding, ISOMAP, and Kernel Principal

More information

Doubly Stochastic Normalization for Spectral Clustering

Doubly Stochastic Normalization for Spectral Clustering Doubly Stochastic Normalization for Spectral Clustering Ron Zass and Amnon Shashua Abstract In this paper we focus on the issue of normalization of the affinity matrix in spectral clustering. We show that

More information

Laplacian Eigenmaps for Dimensionality Reduction and Data Representation

Laplacian Eigenmaps for Dimensionality Reduction and Data Representation Introduction and Data Representation Mikhail Belkin & Partha Niyogi Department of Electrical Engieering University of Minnesota Mar 21, 2017 1/22 Outline Introduction 1 Introduction 2 3 4 Connections to

More information

A Factorization Method for 3D Multi-body Motion Estimation and Segmentation

A Factorization Method for 3D Multi-body Motion Estimation and Segmentation 1 A Factorization Method for 3D Multi-body Motion Estimation and Segmentation René Vidal Department of EECS University of California Berkeley CA 94710 rvidal@eecs.berkeley.edu Stefano Soatto Dept. of Computer

More information

Spectral Clustering on Handwritten Digits Database

Spectral Clustering on Handwritten Digits Database University of Maryland-College Park Advance Scientific Computing I,II Spectral Clustering on Handwritten Digits Database Author: Danielle Middlebrooks Dmiddle1@math.umd.edu Second year AMSC Student Advisor:

More information

Spectral Clustering. Zitao Liu

Spectral Clustering. Zitao Liu Spectral Clustering Zitao Liu Agenda Brief Clustering Review Similarity Graph Graph Laplacian Spectral Clustering Algorithm Graph Cut Point of View Random Walk Point of View Perturbation Theory Point of

More information

CS 664 Segmentation (2) Daniel Huttenlocher

CS 664 Segmentation (2) Daniel Huttenlocher CS 664 Segmentation (2) Daniel Huttenlocher Recap Last time covered perceptual organization more broadly, focused in on pixel-wise segmentation Covered local graph-based methods such as MST and Felzenszwalb-Huttenlocher

More information

Convex Optimization of Graph Laplacian Eigenvalues

Convex Optimization of Graph Laplacian Eigenvalues Convex Optimization of Graph Laplacian Eigenvalues Stephen Boyd Stanford University (Joint work with Persi Diaconis, Arpita Ghosh, Seung-Jean Kim, Sanjay Lall, Pablo Parrilo, Amin Saberi, Jun Sun, Lin

More information

2.1 Laplacian Variants

2.1 Laplacian Variants -3 MS&E 337: Spectral Graph heory and Algorithmic Applications Spring 2015 Lecturer: Prof. Amin Saberi Lecture 2-3: 4/7/2015 Scribe: Simon Anastasiadis and Nolan Skochdopole Disclaimer: hese notes have

More information

Robust Laplacian Eigenmaps Using Global Information

Robust Laplacian Eigenmaps Using Global Information Manifold Learning and its Applications: Papers from the AAAI Fall Symposium (FS-9-) Robust Laplacian Eigenmaps Using Global Information Shounak Roychowdhury ECE University of Texas at Austin, Austin, TX

More information

Linear Diffusion. E9 242 STIP- R. Venkatesh Babu IISc

Linear Diffusion. E9 242 STIP- R. Venkatesh Babu IISc Linear Diffusion Derivation of Heat equation Consider a 2D hot plate with Initial temperature profile I 0 (x, y) Uniform (isotropic) conduction coefficient c Unit thickness (along z) Problem: What is temperature

More information