Characterising Structural Variations in Graphs


1 Characterising Structural Variations in Graphs Edwin Hancock Department of Computer Science University of York Supported by a Royal Society Wolfson Research Merit Award

2 In collaboration with Fan Zhang Tatjiana Aleksic Bin Luo Huaijun Qiu Peng Ren Bai Xiao David Emms Richard Wilson

3 Learning with graph data Problems based on graphs arise in areas such as language processing, proteomics/chemoinformatics, data mining, computer vision and complex systems. Relatively little methodology available, and vectorial methods from statistical machine learning not easily applied since there is no canonical ordering of the nodes in a graph. Can make considerable progress if we develop permutation invariant characterisations of variations in graph structure.


6 Structural Variations

7 Problem In computer vision, graph structures are used to abstract image structure. However, the algorithms used to segment the image primitives are not reliable. As a result there are both additional and missing nodes (due to segmentation error) and variations in edge structure. Hence image matching and recognition cannot be reduced to a graph isomorphism or even a subgraph isomorphism problem; instead, inexact graph matching methods are needed.


10 Learning is difficult because Graphs are not vectors: There is no natural ordering of nodes and edges. Correspondences must be used to establish order. Structural variations: Numbers of nodes and edges are not fixed. They can vary due to segmentation error. Not easily summarised: Since they do not reside in a vector space, mean and covariance hard to characterise.

11 Measuring similarity of graphs Early work on graph matching in vision (Barrow and Popplestone) introduced the association graph and showed how it could be used to locate the maximum common subgraph. Work on syntactic and structural pattern recognition in the 1980s unearthed problems with inexact matching (Sanfeliu, Eshera and Fu, Haralick and Shapiro, Wong, etc.) and extended the concept of edit distance from strings to graphs. Recent work has aimed to develop probability distributions for graph matching (Christmas, Kittler and Petrou; Wilson and Hancock; Serratosa and Sanfeliu) and to match using advanced optimisation methods (Simic, Gold and Rangarajan). Renewed interest in placing classical methods such as edit distance (Bunke) and max-clique (Pelillo) on a more rigorous footing.

12 Learning with graphs Work with (dis)similarities: can perform pairwise clustering, or embed sets of graphs in a vector space using multidimensional scaling on similarities. Non-metricity of similarities may pose problems. Embed individual graphs in a low-dimensional space: characterise structural variations in terms of statistical variation in a point pattern. Learn modes of structural variation: understand how edge (connectivity) structure varies for graphs belonging to the same class. Requires correspondences of raw structure or alignment of an embedded one. Can also be effected using permutation-invariant characteristics (path length, commute times, cycle frequencies). Construct generative models: borrow ideas from graphical models to construct a model for raw structures, or a point distribution model for embedded graphs.

13 Approaches Work with permutation invariant graph characteristics, which can relate to the topological structure. Embed graphs in a vector space and manipulate as point patterns (align, characterise statistically). Establish correspondences and learn variations in edge structure. Determine likelihood that particular nodes and edges co-occur.

14 Methods Graph spectra lead to straightforward methods for characterising structure and embedding, which do not require correspondences and are permutation invariant. Random walks provide an intuitive way of understanding spectral methods in terms of distributions of path and cycle length. When correspondences are to hand, detailed generative models can be constructed using information theory.

15 Our contributions IJCV 2007 (Torsello, Robles-Kelly, Hancock) shape classes from edit distance using pairwise clustering. PAMI 06 and Pattern Recognition 05 (Wilson, Luo and Hancock) graph clustering using spectral features and polynomials. PAMI 07 (Torsello and Hancock) generative model for variations in tree structure using description length. CVIU09 (Xiao, Wilson and Hancock) generative model from heat-kernel embedding of graphs.

16 Delaunay Graph

17 Synthetic Sequence

18 PCA on Adjacency matrices Convert adjacency matrices into long-vectors and perform PCA on sequence. Left column: Eigenspaces Right column: Graph distances in eigenspace Top row: Unweighted graph Middle row: Weighted graph proximity weights Bottom row: Fully connected weighted graph (point proximity matrix)

19 Image Sequence

20 PCA+Eigenvalues

21 ICA+Eigenvalues

22 Shock graphs Type 1 shock (monotonically increasing radius) Type 2 shock (minimum radius) Type 3 shock (constant radius) Type 4 shock (maximum radius)

23 Pairwise clustering Edit distances Shape comparison Multidimensional scaling on distances

24 Outline Some spectral graph theory and links to random walks. Characterisations based on commute time, heat kernel and zeta function. More detailed structure from quantum walks. Conclusions and future work.

25 Problem studied How can we find efficient means of characterising graph structure which do not involve exhaustive search? Enumerate properties of graph structure without explicit search, e.g. count cycles, path-length frequencies, etc. Can we analyse the structure of sets of graphs without solving the graph-matching problem? Inexact graph matching is the computational bottleneck for most problems involving graphs. Answer: let a random walker do the work.

26 Graph Spectral Methods Use eigenvalues and eigenvectors of the adjacency matrix (or Laplacian matrix) - Biggs, Cvetković, Fan Chung. Singular value methods for exact graph matching and point-set alignment (Umeyama, Scott and Longuet-Higgins, Shapiro and Brady). Use of eigenvectors for image segmentation (Shi and Malik) and for perceptual grouping (Freeman and Perona, Sarkar and Boyer). Graph-spectral methods for indexing shock trees (Dickinson and Shokoufandeh).

27 Random walks on graphs Determined by the Laplacian spectrum (and, in the continuous-time case, by the heat kernel). Can be used to interpret and analyse spectral methods, since they can be understood intuitively as path-based.

28 Graph spectra and random walks Use the spectrum of the Laplacian matrix to compute hitting and commute times for a random walk on a graph.

29 Laplacian Matrix Weighted adjacency matrix: W(u,v) = w(u,v) if (u,v) ∈ E, and 0 otherwise. Degree matrix: D(u,u) = Σ_{v∈V} W(u,v). Laplacian matrix: L = D − W.
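A minimal numpy sketch of these three matrices (the 4-node path graph here is a hypothetical example):

```python
import numpy as np

# Hypothetical example: unit-weight path graph 0-1-2-3.
W = np.array([[0., 1., 0., 0.],
              [1., 0., 1., 0.],
              [0., 1., 0., 1.],
              [0., 0., 1., 0.]])

D = np.diag(W.sum(axis=1))  # degree matrix: D(u,u) = sum_v W(u,v)
L = D - W                   # Laplacian matrix: L = D - W
```

Note that L is symmetric and its rows sum to zero, since each diagonal degree equals the corresponding row sum of W.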

30 Laplacian spectrum Spectral decomposition of the Laplacian: L = ΦΛΦ^T, with Λ = diag(λ_1, ..., λ_|V|) and Φ = (φ_1 | φ_2 | ... | φ_|V|). Element-wise: L(u,v) = Σ_k λ_k φ_k(u) φ_k(v). Eigenvalue ordering: 0 = λ_1 ≤ λ_2 ≤ ... ≤ λ_|V|.

31 Properties of the Laplacian Eigenvalues are non-negative and the smallest eigenvalue is zero: 0 = λ_1 ≤ λ_2 ≤ ... ≤ λ_|V|. The multiplicity of the zero eigenvalue is the number of connected components of the graph. The zero eigenvalue is associated with the all-ones vector. The eigenvector associated with the second smallest eigenvalue is the Fiedler vector.

32 Discrete-time random walk State vector: p_t = T p_{t-1} = T^t p_0. Transition matrix: T = D^{-1} A. Steady state: p_s = T p_s. Determined by the leading eigenvector of the adjacency matrix A, or equivalently the second (Fiedler) eigenvector of the Laplacian.
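A short sketch of this iteration (the graph and iteration count are illustrative assumptions): on a connected, non-bipartite graph, repeatedly applying the transition matrix drives any starting distribution to the steady state, which is proportional to the node degrees.

```python
import numpy as np

# Hypothetical example: triangle 0-1-2 with a pendant node 3 (connected, non-bipartite).
W = np.array([[0., 1., 1., 0.],
              [1., 0., 1., 0.],
              [1., 1., 0., 1.],
              [0., 0., 1., 0.]])
D = np.diag(W.sum(axis=1))
T = np.linalg.inv(D) @ W           # transition matrix T = D^{-1} W (row-stochastic)

p = np.full(4, 0.25)               # uniform starting distribution
for _ in range(500):
    p = T.T @ p                    # propagate the distribution: p_t^T = p_{t-1}^T T

p_s = W.sum(axis=1) / W.sum()      # steady state is proportional to degree
```

For this graph the degrees are (2, 2, 3, 1) with volume 8, so the walk settles on (0.25, 0.25, 0.375, 0.125).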

33 Continuous time random walk

34 Heat Kernels Solution of the heat equation; measures information flow across the edges of the graph with time: ∂h_t/∂t = −L h_t, where L = D − W = ΦΛΦ^T. Solution found by exponentiating the Laplacian eigensystem: h_t = Σ_k exp[−λ_k t] φ_k φ_k^T = Φ exp[−Λt] Φ^T.
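A numpy sketch of the eigendecomposition solution (the path graph and the value of t are illustrative assumptions):

```python
import numpy as np

# Hypothetical example: unit-weight path graph 0-1-2-3.
W = np.array([[0., 1., 0., 0.],
              [1., 0., 1., 0.],
              [0., 1., 0., 1.],
              [0., 0., 1., 0.]])
L = np.diag(W.sum(axis=1)) - W
lam, Phi = np.linalg.eigh(L)                   # L = Phi diag(lam) Phi^T

t = 0.5
h_t = Phi @ np.diag(np.exp(-lam * t)) @ Phi.T  # h_t = Phi exp(-Lambda t) Phi^T
```

Since L annihilates the all-ones vector, each row of h_t sums to one, so h_t acts as a transition kernel for the continuous-time walk; on a connected graph all its entries are strictly positive for t > 0.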

35 Heat kernel and random walk State vector of the continuous-time random walk satisfies the differential equation ∂p_t/∂t = −L p_t. Solution: p_t = exp[−Lt] p_0 = h_t p_0.

36 Heat kernel and path lengths In terms of the number of paths of length k from node u to node v: h_t(u,v) = exp[−t] Σ_k P_k(u,v) t^k / k!, where P_k(u,v) = Σ_{i=1}^{|V|} (1 − λ_i)^k φ_i(u) φ_i(v).

37 Example. Graph shows spanning tree of heat-kernel. Here weights of graph are elements of heat kernel. As t increases, then spanning tree evolves from a tree rooted near centre of graph to a string (with ligatures). Low t behaviour dominated by Laplacian, high t behaviour dominated by Fiedler-vector.

38 Embedding using commute time Commute time is robust to edge deletion or edge weight errors. Qiu and Hancock, IEEE TPAMI 2007.

39 Hitting time Q(u,v): Expected number of steps of a random walk before node v is visited commencing from node u.

40 Commute time CT(u,v): Expected time taken for a random walk to travel from node u to node v and then return again CT(u,v)=Q(u,v)+Q(v,u)

41 Idea: compute an edge attribute that is robust to modifications in edge structure. Commute time averages over all paths connecting a pair of nodes, so the effects of edge deletion are reduced.

42 Illustration Spanning tree vs commute-time spanning tree.

43 Green's function Spectral representation: G(u,v) = Σ_{i=2}^{|V|} λ_i^{−1} φ_i(u) φ_i(v). Meaning: pseudo-inverse of the Laplacian.

44 Commute Time Commute time: CT(u,v) = Q(u,v) + Q(v,u). In terms of the Green's function: CT(u,v) = vol [G(u,u) + G(v,v) − G(u,v) − G(v,u)]. In terms of the Laplacian eigenspectrum: CT(u,v) = vol Σ_{i=2}^{|V|} (1/λ_i) (φ_i(u) − φ_i(v))^2.
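As a sketch (the path graph is a hypothetical example), commute time can be computed both from the Green's function, CT(u,v) = vol[G(u,u) + G(v,v) − G(u,v) − G(v,u)], and from the Laplacian eigenspectrum. On a tree, CT(u,v) = 2|E| times the path distance, which gives an independent check:

```python
import numpy as np

W = np.array([[0., 1., 0., 0.],
              [1., 0., 1., 0.],
              [0., 1., 0., 1.],
              [0., 0., 1., 0.]])      # path graph: a tree with |E| = 3
L = np.diag(W.sum(axis=1)) - W
vol = W.sum()                         # vol = sum of degrees = 2|E|
G = np.linalg.pinv(L)                 # Green's function: pseudo-inverse of L
lam, Phi = np.linalg.eigh(L)

def ct_green(u, v):
    # CT(u,v) = vol [G(u,u) + G(v,v) - G(u,v) - G(v,u)]
    return vol * (G[u, u] + G[v, v] - G[u, v] - G[v, u])

def ct_spectral(u, v):
    # CT(u,v) = vol sum_{i: lam_i > 0} (phi_i(u) - phi_i(v))^2 / lam_i
    idx = lam > 1e-10
    return vol * np.sum((Phi[u, idx] - Phi[v, idx]) ** 2 / lam[idx])
```

For this tree, CT(0,1) = 2|E|·1 = 6 and CT(0,3) = 2|E|·3 = 18.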

45 Commute Time Embedding Embedding that preserves commute time has coordinate matrix (vectors of coordinates are columns): Y = √vol Λ^{−1/2} Φ^T.
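A sketch of the commute-time embedding Y = √vol Λ^{−1/2} Φ^T (path graph assumed as example data); squared Euclidean distances between the embedded points reproduce commute times:

```python
import numpy as np

W = np.array([[0., 1., 0., 0.],
              [1., 0., 1., 0.],
              [0., 1., 0., 1.],
              [0., 0., 1., 0.]])       # path graph, a tree with |E| = 3
L = np.diag(W.sum(axis=1)) - W
vol = W.sum()
lam, Phi = np.linalg.eigh(L)
idx = lam > 1e-10                      # drop the zero eigenvalue

# Y = sqrt(vol) Lambda^{-1/2} Phi^T: columns of Y are node coordinates
Y = np.sqrt(vol) * np.diag(lam[idx] ** -0.5) @ Phi[:, idx].T

ct_01 = np.sum((Y[:, 0] - Y[:, 1]) ** 2)   # squared distance = commute time CT(0,1)
```

For a tree, the commute time between adjacent nodes is 2|E| = 6, so ct_01 recovers that value.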

46 Diffusion map Transition probability matrix: P = D^{−1} W. Eigensystem: P Ψ = Ψ Λ. Diffusion time: D_t^2(u,v) = Σ_{i=1}^{m} λ_i^{2t} [ψ_i(u) − ψ_i(v)]^2. Embedding: Y = Λ^t Ψ^T. Commute time is the sum of diffusion time over all time steps t.

47 Example embedding

48 Examples

49 Multi-body Motion Analysis Use the Costeira and Kanade factorisation method to decompose the movement of a set of points over a series of frames into a shape matrix and a shape-interaction matrix: [X/Y] = M S. The shape-interaction matrix is used as the adjacency weight matrix and the Laplacian computed: W = S. Points are embedded in a vector space using the commute-time embedding. Independently moving objects correspond to clusters in the embedding space.

50 Multibody motion analysis

51 Videos

52 Videos

53 Videos

54 Videos

55 Videos

56 Motions

57 Embeddings

58 Separation accuracy

59 Invariants from the heat-kernel Extract permutation invariant characteristics from heat-kernel trace. Links with Ihara zeta-function and path-length distribution on graph. Xiao, Wilson and Hancock, Pattern Recognition 2009.

60 Moments of the heat-kernel trace Can we characterise a graph by the shape of its heat-kernel trace function?

61 Heat Kernel Trace Tr[h_t] = Σ_i exp[−λ_i t]. The shape of the heat-kernel trace distinguishes graphs; can we characterise its shape using moments? (Plot: trace against time t.)

62 Rosenberg Zeta function Definition of the zeta function: ζ(s) = Σ_{λ_k ≠ 0} λ_k^{−s}.

63 Heat-kernel moments Mellin transform: λ^{−s} = (1/Γ(s)) ∫_0^∞ t^{s−1} exp[−λt] dt, where Γ(s) = ∫_0^∞ t^{s−1} exp[−t] dt. Trace and number of connected components: Tr[h_t] = C + Σ_{λ_i ≠ 0} exp[−λ_i t], where C is the multiplicity of the zero eigenvalue, i.e. the number of connected components of the graph. Zeta function: ζ(s) = Σ_{λ_i ≠ 0} λ_i^{−s} = (1/Γ(s)) ∫_0^∞ t^{s−1} (Tr[h_t] − C) dt. The zeta function is therefore related to the moments of the heat-kernel trace.
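A sketch of the resulting pattern vector ζ(s) for s = 1, 2, 3, 4 (the graph is an illustrative example). Since ζ(s) depends only on the Laplacian spectrum, it is invariant to node relabelling, which the last lines demonstrate by permuting the nodes and recomputing:

```python
import numpy as np

# Hypothetical example: triangle 0-1-2 with a pendant node 3.
W = np.array([[0., 1., 1., 0.],
              [1., 0., 1., 0.],
              [1., 1., 0., 1.],
              [0., 0., 1., 0.]])
L = np.diag(W.sum(axis=1)) - W
lam = np.linalg.eigvalsh(L)
nz = lam[lam > 1e-10]
zeta = np.array([np.sum(nz ** -s) for s in (1, 2, 3, 4)])   # pattern vector

# Permutation invariance: relabel the nodes and recompute.
perm = [2, 0, 3, 1]
W2 = W[np.ix_(perm, perm)]
L2 = np.diag(W2.sum(axis=1)) - W2
lam2 = np.linalg.eigvalsh(L2)
nz2 = lam2[lam2 > 1e-10]
zeta2 = np.array([np.sum(nz2 ** -s) for s in (1, 2, 3, 4)])
```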

64 Zeta-function behavior

65 Objects 72 views of each object, taken at 5 degree intervals as the camera moves in a circle around the object. Feature points extracted using a corner detector. Construct the Voronoi tessellation of the image plane using the corner points as seeds. The Delaunay graph is the region adjacency graph for the Voronoi regions.

66 Heat kernel moments (zeta(s), s=1,2,3,4)

67 PCA using zeta(s), s=1,2,3,4

68 PCA on Laplace spectrum

69

70

71 Ox-Caltech database

72 Line-patterns Use the Huet and Hancock representation (TPAMI-99). Extract straight line segments from the Canny edge map. Weight computed using continuity and proximity. Captures arrangement Gestalts.

73

74 Zeta function derivative Zeta function in terms of the natural exponential: ζ(s) = Σ_{λ_k ≠ 0} λ_k^{−s} = Σ_{λ_k ≠ 0} exp[−s ln λ_k]. Derivative: ζ'(s) = −Σ_{λ_k ≠ 0} ln λ_k exp[−s ln λ_k]. Derivative at the origin: ζ'(0) = −Σ_{λ_k ≠ 0} ln λ_k.

75 Meaning Number of spanning trees in the graph: τ(G) = (Π_{u∈V} d_u / Σ_{u∈V} d_u) exp[−ζ'(0)].
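A numeric sketch of this formula, with the zeta function built from the normalised Laplacian spectrum (the 4-cycle is a hypothetical example, chosen because its spanning-tree count is known to be 4 by the matrix-tree theorem):

```python
import numpy as np

W = np.array([[0., 1., 0., 1.],
              [1., 0., 1., 0.],
              [0., 1., 0., 1.],
              [1., 0., 1., 0.]])               # 4-cycle
d = W.sum(axis=1)
Dih = np.diag(d ** -0.5)
Lhat = np.eye(4) - Dih @ W @ Dih               # normalised Laplacian
lam = np.linalg.eigvalsh(Lhat)
nz = lam[lam > 1e-10]

zeta_prime_0 = -np.sum(np.log(nz))             # zeta'(0) = -sum_k ln(lambda_k)
tau = np.prod(d) / d.sum() * np.exp(-zeta_prime_0)   # number of spanning trees
```

For the 4-cycle the nonzero normalised eigenvalues are 1, 1, 2, so exp[−ζ'(0)] = 2 and τ = (16/8)·2 = 4.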

76 COIL

77 Ox-Cal

78 Eigenvalue polynomials (COIL) Determinant Trace

79 Eigenvalue Polynomials (Ox-Cal)

80 Spectral polynomials (COIL)

81 Spectral Polynomials (Ox-Cal)

82 COIL: node and edge frequency

83 Ox-Cal: node+edge frequency

84 Performance Rand index = correct/(correct+wrong). Methods compared: zeta function; symmetric polynomials (eigenvalues); symmetric polynomials (matrix); Laplacian spectrum.

85 Deeper probes of structure Ihara zeta function

86 Zeta functions Used in number theory to characterise distribution of prime numbers. Can be extended to graphs by replacing notion of prime number with that of a prime cycle.

87 Characterising graphs Topological: e.g. average degree, degree distribution, edge density, diameter, cycle frequencies, etc. Spectral: use eigenvalues of the adjacency matrix or Laplacian. Algebraic: coefficients of the characteristic polynomial.

88 Ihara Zeta function Determined by the distribution of prime cycles. Transform the graph into an oriented line graph (OLG), with edges as nodes and edges indicating incidence at a common vertex. The zeta function is the reciprocal of the characteristic polynomial of the OLG adjacency matrix. The coefficients of the polynomial are determined by the eigenvalues of the OLG adjacency matrix, and are linked to topological quantities such as cycle frequencies and the number of spanning trees.

89 Oriented Line Graph

90 Ihara Zeta Function Ihara zeta function for a graph G(V,E), defined over the prime cycles of the graph: ζ_G(u) = Π_{[c] prime} (1 − u^{|c|})^{−1}. Rational expression in terms of the characteristic polynomial of the oriented line graph: ζ_G(u)^{−1} = (1 − u^2)^{|E|−|V|} det(I − Au + Qu^2), where A is the adjacency matrix of the line digraph's underlying graph and Q = D − I (degree matrix minus identity). Equivalently, ζ_G(u)^{−1} = det(I − uT), with T the adjacency matrix of the oriented line graph.
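A sketch verifying that the two determinant expressions agree on a small example (K4 is a hypothetical example with minimum degree ≥ 2, as the theory requires): build the oriented line graph whose nodes are the directed arcs, with an arc (i,j) → (j,k) whenever k ≠ i (no backtracking), and compare det(I − uT) with Bass's formula:

```python
import numpy as np

A = np.ones((4, 4)) - np.eye(4)       # K4 adjacency matrix
n = A.shape[0]

arcs = [(i, j) for i in range(n) for j in range(n) if A[i, j]]
m = len(arcs)                         # 2|E| directed arcs
T = np.zeros((m, m))
for a, (i, j) in enumerate(arcs):     # oriented line graph: (i,j) -> (j,l), l != i
    for b, (k, l) in enumerate(arcs):
        if k == j and l != i:
            T[a, b] = 1.0

u = 0.1
Q = np.diag(A.sum(axis=1)) - np.eye(n)
bass = (1 - u**2) ** (m // 2 - n) * np.linalg.det(np.eye(n) - u * A + u**2 * Q)
olg = np.linalg.det(np.eye(m) - u * T)
```

Both quantities evaluate ζ_G(u)^{−1} at u = 0.1, so they should coincide to numerical precision.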

91 Characteristic Polynomials from IZF The Perron-Frobenius operator is the adjacency matrix T of the oriented line graph. Determinant expression of the IZF: ζ_G(u)^{−1} = det(I − uT). Each coefficient of this polynomial, i.e. each Ihara coefficient, can be derived from the elementary symmetric polynomials of the eigenvalue set of T. Pattern vector in terms of the Ihara coefficients.

92 Analysis of determinant From matrix logarithms: ζ(s) = 1/det[I − Ts] = exp[ Σ_{k≥1} Tr[T^k] s^k / k ]. Tr[T^k] is the k-th power-sum symmetric polynomial of the eigenvalues of T: Tr[T] = λ_1 + ... + λ_N, Tr[T^2] = λ_1^2 + λ_2^2 + ... + λ_N^2, ..., Tr[T^k] = λ_1^k + ... + λ_N^k.

93 Distribution of prime cycles Frequency distribution for cycles of length l: s (d/ds) ln ζ(s) = Σ_l N_l s^l. Cycle frequencies: N_l = (1/(l−1)!) (d^l/ds^l) ln ζ(s) |_{s=0} = Tr[T^l].

94 Experiments: Edge-weighted Graphs Feature Distance & Edit Distance Three Classes of Randomly Generated Graphs

95 Experiments: Hypergraphs

96 Complexity

97 Von Neumann Entropy Given in terms of the normalised Laplacian eigenvalues: H = −Σ_k λ_k ln λ_k. Quadratic approximation: H ≈ Σ_k λ_k (1 − λ_k), related to the variance of the elements of the weighted adjacency matrix.
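A sketch under the convention that the normalised Laplacian spectrum is scaled by 1/|V| so that it sums to one (an assumption made here so H is the entropy of a probability distribution, as for a density matrix; the example graph is illustrative):

```python
import numpy as np

# Hypothetical example: triangle 0-1-2 with a pendant node 3.
W = np.array([[0., 1., 1., 0.],
              [1., 0., 1., 0.],
              [1., 1., 0., 1.],
              [0., 0., 1., 0.]])
n = W.shape[0]
d = W.sum(axis=1)
Dih = np.diag(d ** -0.5)
Lhat = np.eye(n) - Dih @ W @ Dih             # normalised Laplacian
mu = np.linalg.eigvalsh(Lhat) / n            # spectrum scaled to sum to one

H = -sum(x * np.log(x) for x in mu if x > 1e-12)   # von Neumann entropy
H_quad = float(np.sum(mu * (1 - mu)))              # quadratic approximation
```

Since −x ln x ≥ x(1 − x) on [0, 1], the quadratic form lower-bounds the entropy while avoiding the logarithm.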

98 Birkhoff polytopes Representation of the permutation group using stochastic basis matrices. A doubly stochastic matrix can be represented as a convex combination of permutation matrices: K = Σ_k p_k P_k. Complexity: C = −Σ_{k=1}^{n!} p_k ln p_k.

99 Jensen-Shannon Divergence In terms of the Shannon entropy: J = H(∪_k G_k) − Σ_k H(G_k). Related to edit distance on trees (Torsello and Hancock). Can lead to a family of kernels.

100 Behaviour

101

102

103

104

105

106

107 Quantum Walks Have both discrete-time (DQW) and continuous-time (CQW) variants. Use qubits to represent the state of the walk. The state vector is composed of complex amplitudes rather than probabilities. Governed by unitary matrices rather than stochastic matrices. Admit interference between different feasible walks. Reversible, non-ergodic, no limiting distribution.

108 Literature Farhi and Gutmann: the CQW can penetrate a family of trees in polynomial time, whereas the classical walk cannot. Childs: the CQW hits exponentially faster than the classical walk. Kempe: exploits polynomial hitting time to solve the 2-SAT problem and suggests a solution to the routing problem. Shenvi, Kempe and Whaley: search of an unordered database.

109 Quantum walks Have richer structure due to interference and the complex representation. What can a quantum walker discover about a graph that a classical walker cannot?

110 New possibilities Interference in quantum walks can be used to develop alternatives to classical random-walk algorithms.

111 Consequences of interference Quantum walks hit exponentially faster than classical ones. Local symmetry reflected in cancellation of complex amplitude. Can be exploited to lift co-spectrality of strongly regular graphs. Quantum version of commute time embedding can be used to detect symmetries.

112 Strongly Regular Graphs Two SRGs with parameters (16,6,2,2) A B

113 Is the spectrum of S+(U^3) able to distinguish all SRGs? There is no method proven able to decide whether two SRGs are isomorphic in polynomial time, but there are large families of strongly regular graphs on which the method can be tested. MDS embeddings of the SRGs with parameters (25,12,5,6) (red), (26,10,3,4) (blue), (29,14,6,7) (black), (40,12,2,4) (green), using the adjacency spectrum (top) and the spectrum of S+(U^3) (bottom).

114 Testing The algorithm takes time O(|E|^3). We have tested it on all pairs of graphs from the table opposite and found that it successfully determines whether the graphs are isomorphic in all cases.

115 Continuous Time Quantum Walk Evolution governed by Schrödinger's equation: (d/dt)|ψ_t⟩ = −iL|ψ_t⟩. Solution: |ψ_t⟩ = exp[−iLt]|ψ_0⟩.
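A sketch of this unitary evolution (the graph, start state and value of t are illustrative): exp[−iLt] is computed from the Laplacian eigensystem, and unlike the classical kernel it is unitary, so the probability mass |ψ|² is conserved without converging to a limiting distribution.

```python
import numpy as np

W = np.array([[0., 1., 0., 0.],
              [1., 0., 1., 0.],
              [0., 1., 0., 1.],
              [0., 0., 1., 0.]])
L = np.diag(W.sum(axis=1)) - W
lam, Phi = np.linalg.eigh(L)

t = 1.0
U = Phi @ np.diag(np.exp(-1j * lam * t)) @ Phi.T   # U = exp(-iLt), unitary

psi0 = np.zeros(4, dtype=complex)
psi0[0] = 1.0                                      # walk starts at node 0
psi_t = U @ psi0
prob = np.abs(psi_t) ** 2                          # node occupation probabilities
```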

116 Walks compared Classical walk Quantum walk

117 Quantum hitting time Probability that the walk is at node v at time t: P(X_t = v) = |⟨v| exp[−iLt] |u⟩|^2 = |⟨v| Φ exp[−iΛt] Φ^T |u⟩|^2. The hazard function R(t) satisfies (d/dt)R(t) = P(X_t = v)[1 − R(t)]. Hitting time is the expectation of time weighted by the hazard function: Q(u,v) = ∫_0^∞ t (dR(t)/dt) dt.

118 Average commute vs path length Quantum walk reduces effect of path length. Quantum Classical

119 Example Embedding

120 Wheel with spokes

121 Symmetries

122 Symmetries of 3x3 Grid

123 8x8 Grid

124 Conclusions Interference of quantum walks can be exploited in algorithm design. Leads to a method for resolving the cospectrality of graphs and trees. Can also be used to design algorithms for inexact graph matching. The quantum commute time differs from its classical counterpart, and leads to embeddings that can be used to characterise symmetries in graphs.

125 Conclusions Random walks provide a powerful framework for path-based or flow-based analysis in vision and pattern recognition. Convenient since they rely on the Laplacian eigensystem; although eigendecomposition is time-consuming, implementation is easy. Allow both low-level and high-level problems to be addressed. Quantum walks offer a fascinating alternative which allows for interference between alternative walks.


More information

Dimension Reduction Techniques. Presented by Jie (Jerry) Yu

Dimension Reduction Techniques. Presented by Jie (Jerry) Yu Dimension Reduction Techniques Presented by Jie (Jerry) Yu Outline Problem Modeling Review of PCA and MDS Isomap Local Linear Embedding (LLE) Charting Background Advances in data collection and storage

More information

Focus was on solving matrix inversion problems Now we look at other properties of matrices Useful when A represents a transformations.

Focus was on solving matrix inversion problems Now we look at other properties of matrices Useful when A represents a transformations. Previously Focus was on solving matrix inversion problems Now we look at other properties of matrices Useful when A represents a transformations y = Ax Or A simply represents data Notion of eigenvectors,

More information

Point Pattern Matching Based on Line Graph Spectral Context and Descriptor Embedding

Point Pattern Matching Based on Line Graph Spectral Context and Descriptor Embedding Point Pattern Matching Based on Line Graph Spectral Context and Descriptor Embedding Jun Tang Key Lab of IC&SP, Ministry of Education Anhui University Hefei, China tangjunahu@gmail.com Ling Shao, Simon

More information

CMPSCI 791BB: Advanced ML: Laplacian Learning

CMPSCI 791BB: Advanced ML: Laplacian Learning CMPSCI 791BB: Advanced ML: Laplacian Learning Sridhar Mahadevan Outline! Spectral graph operators! Combinatorial graph Laplacian! Normalized graph Laplacian! Random walks! Machine learning on graphs! Clustering!

More information

Quantum Walks on Strongly Regular Graphs

Quantum Walks on Strongly Regular Graphs Quantum Walks on Strongly Regular Graphs by Krystal J. Guo A thesis presented to the University of Waterloo in fulfillment of the thesis requirement for the degree of Master of Mathematics in Combinatorics

More information

Spectral Graph Theory

Spectral Graph Theory Spectral Graph Theory Aaron Mishtal April 27, 2016 1 / 36 Outline Overview Linear Algebra Primer History Theory Applications Open Problems Homework Problems References 2 / 36 Outline Overview Linear Algebra

More information

Spectra of Adjacency and Laplacian Matrices

Spectra of Adjacency and Laplacian Matrices Spectra of Adjacency and Laplacian Matrices Definition: University of Alicante (Spain) Matrix Computing (subject 3168 Degree in Maths) 30 hours (theory)) + 15 hours (practical assignment) Contents 1. Spectra

More information

Random walks and anisotropic interpolation on graphs. Filip Malmberg

Random walks and anisotropic interpolation on graphs. Filip Malmberg Random walks and anisotropic interpolation on graphs. Filip Malmberg Interpolation of missing data Assume that we have a graph where we have defined some (real) values for a subset of the nodes, and that

More information

Unsupervised Learning Techniques Class 07, 1 March 2006 Andrea Caponnetto

Unsupervised Learning Techniques Class 07, 1 March 2006 Andrea Caponnetto Unsupervised Learning Techniques 9.520 Class 07, 1 March 2006 Andrea Caponnetto About this class Goal To introduce some methods for unsupervised learning: Gaussian Mixtures, K-Means, ISOMAP, HLLE, Laplacian

More information

Linear algebra and applications to graphs Part 1

Linear algebra and applications to graphs Part 1 Linear algebra and applications to graphs Part 1 Written up by Mikhail Belkin and Moon Duchin Instructor: Laszlo Babai June 17, 2001 1 Basic Linear Algebra Exercise 1.1 Let V and W be linear subspaces

More information

Networks as vectors of their motif frequencies and 2-norm distance as a measure of similarity

Networks as vectors of their motif frequencies and 2-norm distance as a measure of similarity Networks as vectors of their motif frequencies and 2-norm distance as a measure of similarity CS322 Project Writeup Semih Salihoglu Stanford University 353 Serra Street Stanford, CA semih@stanford.edu

More information

Periodicity & State Transfer Some Results Some Questions. Periodic Graphs. Chris Godsil. St John s, June 7, Chris Godsil Periodic Graphs

Periodicity & State Transfer Some Results Some Questions. Periodic Graphs. Chris Godsil. St John s, June 7, Chris Godsil Periodic Graphs St John s, June 7, 2009 Outline 1 Periodicity & State Transfer 2 Some Results 3 Some Questions Unitary Operators Suppose X is a graph with adjacency matrix A. Definition We define the operator H X (t)

More information

Math 307 Learning Goals

Math 307 Learning Goals Math 307 Learning Goals May 14, 2018 Chapter 1 Linear Equations 1.1 Solving Linear Equations Write a system of linear equations using matrix notation. Use Gaussian elimination to bring a system of linear

More information

Non-bayesian Graph Matching without Explicit Compatibility Calculations

Non-bayesian Graph Matching without Explicit Compatibility Calculations Non-bayesian Graph Matching without Explicit Compatibility Calculations Barend Jacobus van Wyk 1,2 and Michaël Antonie van Wyk 3 1 Kentron, a division of Denel, Centurion, South Africa 2 University of

More information

18.312: Algebraic Combinatorics Lionel Levine. Lecture 19

18.312: Algebraic Combinatorics Lionel Levine. Lecture 19 832: Algebraic Combinatorics Lionel Levine Lecture date: April 2, 20 Lecture 9 Notes by: David Witmer Matrix-Tree Theorem Undirected Graphs Let G = (V, E) be a connected, undirected graph with n vertices,

More information

Spectral Clustering. by HU Pili. June 16, 2013

Spectral Clustering. by HU Pili. June 16, 2013 Spectral Clustering by HU Pili June 16, 2013 Outline Clustering Problem Spectral Clustering Demo Preliminaries Clustering: K-means Algorithm Dimensionality Reduction: PCA, KPCA. Spectral Clustering Framework

More information

Chapter 2 Spectra of Finite Graphs

Chapter 2 Spectra of Finite Graphs Chapter 2 Spectra of Finite Graphs 2.1 Characteristic Polynomials Let G = (V, E) be a finite graph on n = V vertices. Numbering the vertices, we write down its adjacency matrix in an explicit form of n

More information

Space-Variant Computer Vision: A Graph Theoretic Approach

Space-Variant Computer Vision: A Graph Theoretic Approach p.1/65 Space-Variant Computer Vision: A Graph Theoretic Approach Leo Grady Cognitive and Neural Systems Boston University p.2/65 Outline of talk Space-variant vision - Why and how of graph theory Anisotropic

More information

Quantum algorithms (CO 781, Winter 2008) Prof. Andrew Childs, University of Waterloo LECTURE 11: From random walk to quantum walk

Quantum algorithms (CO 781, Winter 2008) Prof. Andrew Childs, University of Waterloo LECTURE 11: From random walk to quantum walk Quantum algorithms (CO 781, Winter 2008) Prof. Andrew Childs, University of Waterloo LECTURE 11: From random walk to quantum walk We now turn to a second major topic in quantum algorithms, the concept

More information

Two-View Segmentation of Dynamic Scenes from the Multibody Fundamental Matrix

Two-View Segmentation of Dynamic Scenes from the Multibody Fundamental Matrix Two-View Segmentation of Dynamic Scenes from the Multibody Fundamental Matrix René Vidal Stefano Soatto Shankar Sastry Department of EECS, UC Berkeley Department of Computer Sciences, UCLA 30 Cory Hall,

More information

Spectral Clustering on Handwritten Digits Database

Spectral Clustering on Handwritten Digits Database University of Maryland-College Park Advance Scientific Computing I,II Spectral Clustering on Handwritten Digits Database Author: Danielle Middlebrooks Dmiddle1@math.umd.edu Second year AMSC Student Advisor:

More information

Lecture: Some Practical Considerations (3 of 4)

Lecture: Some Practical Considerations (3 of 4) Stat260/CS294: Spectral Graph Methods Lecture 14-03/10/2015 Lecture: Some Practical Considerations (3 of 4) Lecturer: Michael Mahoney Scribe: Michael Mahoney Warning: these notes are still very rough.

More information

Graph Metrics and Dimension Reduction

Graph Metrics and Dimension Reduction Graph Metrics and Dimension Reduction Minh Tang 1 Michael Trosset 2 1 Applied Mathematics and Statistics The Johns Hopkins University 2 Department of Statistics Indiana University, Bloomington November

More information

Spectral Clustering. Zitao Liu

Spectral Clustering. Zitao Liu Spectral Clustering Zitao Liu Agenda Brief Clustering Review Similarity Graph Graph Laplacian Spectral Clustering Algorithm Graph Cut Point of View Random Walk Point of View Perturbation Theory Point of

More information

Eugene Wigner [4]: in the natural sciences, Communications in Pure and Applied Mathematics, XIII, (1960), 1 14.

Eugene Wigner [4]: in the natural sciences, Communications in Pure and Applied Mathematics, XIII, (1960), 1 14. Introduction Eugene Wigner [4]: The miracle of the appropriateness of the language of mathematics for the formulation of the laws of physics is a wonderful gift which we neither understand nor deserve.

More information

Directional Field. Xiao-Ming Fu

Directional Field. Xiao-Ming Fu Directional Field Xiao-Ming Fu Outlines Introduction Discretization Representation Objectives and Constraints Outlines Introduction Discretization Representation Objectives and Constraints Definition Spatially-varying

More information

Fundamentals of Matrices

Fundamentals of Matrices Maschinelles Lernen II Fundamentals of Matrices Christoph Sawade/Niels Landwehr/Blaine Nelson Tobias Scheffer Matrix Examples Recap: Data Linear Model: f i x = w i T x Let X = x x n be the data matrix

More information

Statistical Machine Learning

Statistical Machine Learning Statistical Machine Learning Christoph Lampert Spring Semester 2015/2016 // Lecture 12 1 / 36 Unsupervised Learning Dimensionality Reduction 2 / 36 Dimensionality Reduction Given: data X = {x 1,..., x

More information

ORIE 4741: Learning with Big Messy Data. Spectral Graph Theory

ORIE 4741: Learning with Big Messy Data. Spectral Graph Theory ORIE 4741: Learning with Big Messy Data Spectral Graph Theory Mika Sumida Operations Research and Information Engineering Cornell September 15, 2017 1 / 32 Outline Graph Theory Spectral Graph Theory Laplacian

More information

Inverse Perron values and connectivity of a uniform hypergraph

Inverse Perron values and connectivity of a uniform hypergraph Inverse Perron values and connectivity of a uniform hypergraph Changjiang Bu College of Automation College of Science Harbin Engineering University Harbin, PR China buchangjiang@hrbeu.edu.cn Jiang Zhou

More information

Quantum walk algorithms

Quantum walk algorithms Quantum walk algorithms Andrew Childs Institute for Quantum Computing University of Waterloo 28 September 2011 Randomized algorithms Randomness is an important tool in computer science Black-box problems

More information

Lecture: Local Spectral Methods (3 of 4) 20 An optimization perspective on local spectral methods

Lecture: Local Spectral Methods (3 of 4) 20 An optimization perspective on local spectral methods Stat260/CS294: Spectral Graph Methods Lecture 20-04/07/205 Lecture: Local Spectral Methods (3 of 4) Lecturer: Michael Mahoney Scribe: Michael Mahoney Warning: these notes are still very rough. They provide

More information

Discrete quantum random walks

Discrete quantum random walks Quantum Information and Computation: Report Edin Husić edin.husic@ens-lyon.fr Discrete quantum random walks Abstract In this report, we present the ideas behind the notion of quantum random walks. We further

More information

The Hierarchical Product of Graphs

The Hierarchical Product of Graphs The Hierarchical Product of Graphs Lali Barrière Francesc Comellas Cristina Dalfó Miquel Àngel Fiol Universitat Politècnica de Catalunya - DMA4 April 8, 2008 Outline 1 Introduction 2 Graphs and matrices

More information

Message-Passing Algorithms for GMRFs and Non-Linear Optimization

Message-Passing Algorithms for GMRFs and Non-Linear Optimization Message-Passing Algorithms for GMRFs and Non-Linear Optimization Jason Johnson Joint Work with Dmitry Malioutov, Venkat Chandrasekaran and Alan Willsky Stochastic Systems Group, MIT NIPS Workshop: Approximate

More information

LECTURE NOTE #11 PROF. ALAN YUILLE

LECTURE NOTE #11 PROF. ALAN YUILLE LECTURE NOTE #11 PROF. ALAN YUILLE 1. NonLinear Dimension Reduction Spectral Methods. The basic idea is to assume that the data lies on a manifold/surface in D-dimensional space, see figure (1) Perform

More information

Lecture 3: graph theory

Lecture 3: graph theory CONTENTS 1 BASIC NOTIONS Lecture 3: graph theory Sonia Martínez October 15, 2014 Abstract The notion of graph is at the core of cooperative control. Essentially, it allows us to model the interaction topology

More information

Lecture 2: From Classical to Quantum Model of Computation

Lecture 2: From Classical to Quantum Model of Computation CS 880: Quantum Information Processing 9/7/10 Lecture : From Classical to Quantum Model of Computation Instructor: Dieter van Melkebeek Scribe: Tyson Williams Last class we introduced two models for deterministic

More information

HYPERGRAPH BASED SEMI-SUPERVISED LEARNING ALGORITHMS APPLIED TO SPEECH RECOGNITION PROBLEM: A NOVEL APPROACH

HYPERGRAPH BASED SEMI-SUPERVISED LEARNING ALGORITHMS APPLIED TO SPEECH RECOGNITION PROBLEM: A NOVEL APPROACH HYPERGRAPH BASED SEMI-SUPERVISED LEARNING ALGORITHMS APPLIED TO SPEECH RECOGNITION PROBLEM: A NOVEL APPROACH Hoang Trang 1, Tran Hoang Loc 1 1 Ho Chi Minh City University of Technology-VNU HCM, Ho Chi

More information

Majorizations for the Eigenvectors of Graph-Adjacency Matrices: A Tool for Complex Network Design

Majorizations for the Eigenvectors of Graph-Adjacency Matrices: A Tool for Complex Network Design Majorizations for the Eigenvectors of Graph-Adjacency Matrices: A Tool for Complex Network Design Rahul Dhal Electrical Engineering and Computer Science Washington State University Pullman, WA rdhal@eecs.wsu.edu

More information

An Introduction to Spectral Graph Theory

An Introduction to Spectral Graph Theory An Introduction to Spectral Graph Theory Mackenzie Wheeler Supervisor: Dr. Gary MacGillivray University of Victoria WheelerM@uvic.ca Outline Outline 1. How many walks are there from vertices v i to v j

More information

State Controllability, Output Controllability and Stabilizability of Networks: A Symmetry Perspective

State Controllability, Output Controllability and Stabilizability of Networks: A Symmetry Perspective State Controllability, Output Controllability and Stabilizability of Networks: A Symmetry Perspective Airlie Chapman and Mehran Mesbahi Robotics, Aerospace, and Information Networks Lab (RAIN) University

More information

A Markov Chain Inspired View of Hamiltonian Cycle Problem

A Markov Chain Inspired View of Hamiltonian Cycle Problem A Markov Chain Inspired View of Hamiltonian Cycle Problem December 12, 2012 I hereby acknowledge valuable collaboration with: W. Murray (Stanford University) V.S. Borkar (Tata Inst. for Fundamental Research)

More information

Lecture: Local Spectral Methods (1 of 4)

Lecture: Local Spectral Methods (1 of 4) Stat260/CS294: Spectral Graph Methods Lecture 18-03/31/2015 Lecture: Local Spectral Methods (1 of 4) Lecturer: Michael Mahoney Scribe: Michael Mahoney Warning: these notes are still very rough. They provide

More information

How to count - an exposition of Polya s theory of enumeration

How to count - an exposition of Polya s theory of enumeration How to count - an exposition of Polya s theory of enumeration Shriya Anand Published in Resonance, September 2002 P.19-35. Shriya Anand is a BA Honours Mathematics III year student from St. Stephens College,

More information

Intensity Analysis of Spatial Point Patterns Geog 210C Introduction to Spatial Data Analysis

Intensity Analysis of Spatial Point Patterns Geog 210C Introduction to Spatial Data Analysis Intensity Analysis of Spatial Point Patterns Geog 210C Introduction to Spatial Data Analysis Chris Funk Lecture 5 Topic Overview 1) Introduction/Unvariate Statistics 2) Bootstrapping/Monte Carlo Simulation/Kernel

More information

Eigenvectors Via Graph Theory

Eigenvectors Via Graph Theory Eigenvectors Via Graph Theory Jennifer Harris Advisor: Dr. David Garth October 3, 2009 Introduction There is no problem in all mathematics that cannot be solved by direct counting. -Ernst Mach The goal

More information

Face Recognition Using Laplacianfaces He et al. (IEEE Trans PAMI, 2005) presented by Hassan A. Kingravi

Face Recognition Using Laplacianfaces He et al. (IEEE Trans PAMI, 2005) presented by Hassan A. Kingravi Face Recognition Using Laplacianfaces He et al. (IEEE Trans PAMI, 2005) presented by Hassan A. Kingravi Overview Introduction Linear Methods for Dimensionality Reduction Nonlinear Methods and Manifold

More information

Doubly Stochastic Normalization for Spectral Clustering

Doubly Stochastic Normalization for Spectral Clustering Doubly Stochastic Normalization for Spectral Clustering Ron Zass and Amnon Shashua Abstract In this paper we focus on the issue of normalization of the affinity matrix in spectral clustering. We show that

More information

A Statistical Look at Spectral Graph Analysis. Deep Mukhopadhyay

A Statistical Look at Spectral Graph Analysis. Deep Mukhopadhyay A Statistical Look at Spectral Graph Analysis Deep Mukhopadhyay Department of Statistics, Temple University Office: Speakman 335 deep@temple.edu http://sites.temple.edu/deepstat/ Graph Signal Processing

More information

Data dependent operators for the spatial-spectral fusion problem

Data dependent operators for the spatial-spectral fusion problem Data dependent operators for the spatial-spectral fusion problem Wien, December 3, 2012 Joint work with: University of Maryland: J. J. Benedetto, J. A. Dobrosotskaya, T. Doster, K. W. Duke, M. Ehler, A.

More information

Mic ael Flohr Representation theory of semi-simple Lie algebras: Example su(3) 6. and 20. June 2003

Mic ael Flohr Representation theory of semi-simple Lie algebras: Example su(3) 6. and 20. June 2003 Handout V for the course GROUP THEORY IN PHYSICS Mic ael Flohr Representation theory of semi-simple Lie algebras: Example su(3) 6. and 20. June 2003 GENERALIZING THE HIGHEST WEIGHT PROCEDURE FROM su(2)

More information

Linear Algebra. Min Yan

Linear Algebra. Min Yan Linear Algebra Min Yan January 2, 2018 2 Contents 1 Vector Space 7 1.1 Definition................................. 7 1.1.1 Axioms of Vector Space..................... 7 1.1.2 Consequence of Axiom......................

More information

Spectral characterization of Signed Graphs

Spectral characterization of Signed Graphs University of Primorska, Koper May 2015 Signed Graphs Basic notions on Signed Graphs Matrices of Signed Graphs A signed graph Γ is an ordered pair (G, σ), where G = (V (G), E(G)) is a graph and σ : E(G)

More information

Algebraic Representation of Networks

Algebraic Representation of Networks Algebraic Representation of Networks 0 1 2 1 1 0 0 1 2 0 0 1 1 1 1 1 Hiroki Sayama sayama@binghamton.edu Describing networks with matrices (1) Adjacency matrix A matrix with rows and columns labeled by

More information

A Bound for Non-Subgraph Isomorphism

A Bound for Non-Subgraph Isomorphism A Bound for Non-Sub Isomorphism Christian Schellewald School of Computing, Dublin City University, Dublin 9, Ireland Christian.Schellewald@computing.dcu.ie, http://www.computing.dcu.ie/ cschellewald/ Abstract.

More information

Diffusion and random walks on graphs

Diffusion and random walks on graphs Diffusion and random walks on graphs Leonid E. Zhukov School of Data Analysis and Artificial Intelligence Department of Computer Science National Research University Higher School of Economics Structural

More information

COUNTING AND ENUMERATING SPANNING TREES IN (di-) GRAPHS

COUNTING AND ENUMERATING SPANNING TREES IN (di-) GRAPHS COUNTING AND ENUMERATING SPANNING TREES IN (di-) GRAPHS Xuerong Yong Version of Feb 5, 9 pm, 06: New York Time 1 1 Introduction A directed graph (digraph) D is a pair (V, E): V is the vertex set of D,

More information

4 Matrix product states

4 Matrix product states Physics 3b Lecture 5 Caltech, 05//7 4 Matrix product states Matrix product state (MPS) is a highly useful tool in the study of interacting quantum systems in one dimension, both analytically and numerically.

More information

Chapter 19 Clifford and the Number of Holes

Chapter 19 Clifford and the Number of Holes Chapter 19 Clifford and the Number of Holes We saw that Riemann denoted the genus by p, a notation which is still frequently used today, in particular for the generalizations of this notion in higher dimensions.

More information

1 ** The performance objectives highlighted in italics have been identified as core to an Algebra II course.

1 ** The performance objectives highlighted in italics have been identified as core to an Algebra II course. Strand One: Number Sense and Operations Every student should understand and use all concepts and skills from the pervious grade levels. The standards are designed so that new learning builds on preceding

More information

Spectral Clustering. Spectral Clustering? Two Moons Data. Spectral Clustering Algorithm: Bipartioning. Spectral methods

Spectral Clustering. Spectral Clustering? Two Moons Data. Spectral Clustering Algorithm: Bipartioning. Spectral methods Spectral Clustering Seungjin Choi Department of Computer Science POSTECH, Korea seungjin@postech.ac.kr 1 Spectral methods Spectral Clustering? Methods using eigenvectors of some matrices Involve eigen-decomposition

More information

Notes on Linear Algebra and Matrix Theory

Notes on Linear Algebra and Matrix Theory Massimo Franceschet featuring Enrico Bozzo Scalar product The scalar product (a.k.a. dot product or inner product) of two real vectors x = (x 1,..., x n ) and y = (y 1,..., y n ) is not a vector but a

More information