Graphs in Machine Learning

Graphs in Machine Learning
Michal Valko, Inria Lille - Nord Europe, France
TA: Pierre Perrault
Partially based on material by: Ulrike von Luxburg, Gary Miller, Doyle & Snell, Daniel Spielman
October 10th, 2018, MVA 2018/2019

Previous lecture
- where do the graphs come from? social, information, utility, and biological networks; we create them from flat data; random graph models
- specific applications and concepts:
  - maximizing influence on a graph: gossip propagation, submodularity, proof of the approximation guarantee
  - Google PageRank: random surfer process, steady-state vector, sparsity
  - online semi-supervised learning: label propagation, backbone graph, online learning, combinatorial sparsification, stability analysis
  - Erdős number project, real-world graphs, heavy tails, small world (when did this happen?)
- similarity graphs: different types, construction, practical considerations

This lecture
- Laplacians and their properties
- spectral graph theory
- random walks
- recommendation on a bipartite graph
- resistive networks: recommendation score as a resistance?
- Laplacian and resistive networks
- resistance distance and random walks
PS: some students have started working on their projects already

Similarity Graphs: ε or k-NN?
DEMO IN CLASS (tutorial.pdf)

Generic Similarity Functions
Typical kernels:
- Gaussian similarity function / heat kernel / RBF: s_ij = exp(−‖x_i − x_j‖^2 / (2σ^2))
- Cosine similarity function: s_ij = cos(θ) = (x_i^T x_j) / (‖x_i‖ ‖x_j‖)
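As a quick illustration of these two kernels, here is a minimal NumPy sketch (not part of the original slides; the data and the choice of σ are arbitrary):

```python
import numpy as np

def gaussian_similarity(X, sigma=1.0):
    """s_ij = exp(-||x_i - x_j||^2 / (2 sigma^2)) for the rows of X."""
    sq_dists = np.sum((X[:, None, :] - X[None, :, :]) ** 2, axis=-1)
    return np.exp(-sq_dists / (2 * sigma ** 2))

def cosine_similarity(X):
    """s_ij = x_i^T x_j / (||x_i|| ||x_j||) for the rows of X."""
    Xn = X / np.linalg.norm(X, axis=1, keepdims=True)
    return Xn @ Xn.T

X = np.random.RandomState(0).randn(5, 3)   # 5 points in R^3 (illustrative data)
S_gauss = gaussian_similarity(X, sigma=1.0)
S_cos = cosine_similarity(X)
```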

Similarity Graphs
G = (V, E), with a set of nodes V and a set of edges E

Sources of Real Networks
default.htm

L = D − W
the graph Laplacian: the only matrix that matters

Graph Laplacian
G = (V, E), with a set of nodes V and a set of edges E
A: adjacency matrix
W: weight matrix
D: (diagonal) degree matrix
L = D − W: graph Laplacian matrix
L is SDD (symmetric, diagonally dominant)!
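A minimal sketch of this construction (an assumed toy example, not from the slides): build D and L = D − W from a weight matrix and check that L is symmetric and diagonally dominant.

```python
import numpy as np

# illustrative symmetric weight matrix of a 4-node graph (assumed example)
W = np.array([[0., 1., 1., 0.],
              [1., 0., 1., 0.],
              [1., 1., 0., 2.],
              [0., 0., 2., 0.]])

D = np.diag(W.sum(axis=1))   # degree matrix
L = D - W                    # graph Laplacian

# L is symmetric and diagonally dominant: L_ii >= sum_{j != i} |L_ij|
assert np.allclose(L, L.T)
off_diag = np.abs(L - np.diag(np.diag(L))).sum(axis=1)
assert np.all(np.diag(L) >= off_diag - 1e-12)
```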

Properties of Graph Laplacian
Graph function: a vector f ∈ R^N assigning values to the nodes, f : V(G) → R.
f^T L f = (1/2) Σ_{i,j ≤ N} w_ij (f_i − f_j)^2 = S_G(f)
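A quick numerical check of this identity on an arbitrary random weighted graph (an assumed sketch, not from the slides):

```python
import numpy as np

rng = np.random.RandomState(0)
W = rng.rand(6, 6)
W = (W + W.T) / 2            # symmetric weights
np.fill_diagonal(W, 0.0)     # no self-loops

L = np.diag(W.sum(axis=1)) - W
f = rng.randn(6)             # an arbitrary graph function

quad_form = f @ L @ f
smoothness = 0.5 * np.sum(W * (f[:, None] - f[None, :]) ** 2)
assert np.isclose(quad_form, smoothness)   # f^T L f = (1/2) sum_ij w_ij (f_i - f_j)^2
```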

Recap: Eigenvalues and Eigenvectors
A vector v is an eigenvector of a matrix M with eigenvalue λ if Mv = λv.
If (λ_1, v_1) and (λ_2, v_2) are eigenpairs of a symmetric M with λ_1 ≠ λ_2, then v_1 ⊥ v_2, i.e., v_1^T v_2 = 0.
If (λ, v_1) and (λ, v_2) are eigenpairs of M, then (λ, v_1 + v_2) is one as well.
For symmetric M, the multiplicity of λ is the dimension of the space of eigenvectors corresponding to λ.
An N × N symmetric matrix has N eigenvalues (with multiplicities).

Eigenvalues, Eigenvectors, and Eigendecomposition
A vector v is an eigenvector of a matrix M with eigenvalue λ if Mv = λv.
For symmetric M, the eigenvectors {v_i}_i form an orthonormal basis, with λ_1 ≤ λ_2 ≤ ... ≤ λ_N.
∀i: Mv_i = λ_i v_i, i.e., MQ = QΛ, where Q has the eigenvectors in its columns and Λ has the eigenvalues on its diagonal.
Right-multiplying MQ = QΛ by Q^T, we get the eigendecomposition of M:
M = MQQ^T = QΛQ^T = Σ_i λ_i v_i v_i^T

M = L: Properties of Graph Laplacian
We can assume non-negative weights: w_ij ≥ 0.
L is symmetric.
L is positive semi-definite: f^T L f = (1/2) Σ_{i,j ≤ N} w_ij (f_i − f_j)^2 ≥ 0.
Recall: if Lf = λf, then λ is an eigenvalue (of the Laplacian).
The smallest eigenvalue of L is 0. Corresponding eigenvector: 1_N.
All eigenvalues are non-negative reals: 0 = λ_1 ≤ λ_2 ≤ ... ≤ λ_N.
Self-edges do not change the value of L.

Properties of Graph Laplacian
The multiplicity of the eigenvalue 0 of L equals the number of connected components. The eigenspace of 0 is spanned by the indicators of the components.
Proof: If (0, f) is an eigenpair, then 0 = (1/2) Σ_{i,j ≤ N} w_ij (f_i − f_j)^2. Therefore, f is constant on each connected component. If there are k components, then L is k-block-diagonal:
L = diag(L_1, L_2, ..., L_k)
For block-diagonal matrices, the spectrum is the union of the spectra of the L_i (eigenvectors of L_i padded with zeros elsewhere). For each L_i, (0, 1_{V_i}) is an eigenpair, hence the claim.
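A small numerical illustration of this property (an assumed toy example): a graph made of two disconnected triangles has eigenvalue 0 with multiplicity 2.

```python
import numpy as np

# two disconnected triangles: 6 nodes, 2 connected components (illustrative)
W = np.zeros((6, 6))
for i, j in [(0, 1), (1, 2), (0, 2), (3, 4), (4, 5), (3, 5)]:
    W[i, j] = W[j, i] = 1.0

L = np.diag(W.sum(axis=1)) - W
eigvals = np.linalg.eigvalsh(L)
num_zero = np.sum(np.isclose(eigvals, 0.0, atol=1e-10))
print(num_zero)   # 2 = number of connected components
```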

Smoothness of the Function and the Laplacian
f = (f_1, ..., f_N)^T: graph function
Let L = QΛQ^T be the eigendecomposition of the Laplacian.
Λ: diagonal matrix whose diagonal entries are the eigenvalues of L.
Columns of Q are the eigenvectors of L; they form a basis.
α: the unique vector such that Qα = f. Note: Q^T f = α.
Smoothness of a graph function:
S_G(f) = f^T L f = f^T QΛQ^T f = α^T Λ α = ‖α‖_Λ^2 = Σ_{i=1}^N λ_i α_i^2
Smoothness and regularization: a small value of (a) S_G(f), equivalently (b) the Λ-norm of α, equivalently (c) the coefficients α_i for large λ_i.
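A short sketch (assumed example) verifying S_G(f) = Σ_i λ_i α_i^2 with the spectral coefficients α = Q^T f:

```python
import numpy as np

rng = np.random.RandomState(1)
W = rng.rand(7, 7); W = (W + W.T) / 2; np.fill_diagonal(W, 0.0)
L = np.diag(W.sum(axis=1)) - W

lam, Q = np.linalg.eigh(L)       # eigenvalues (ascending) and orthonormal eigenvectors
f = rng.randn(7)
alpha = Q.T @ f                  # spectral coefficients of f

assert np.isclose(f @ L @ f, np.sum(lam * alpha ** 2))   # S_G(f) = sum_i lambda_i alpha_i^2
```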

Smoothness of the Function and the Laplacian
S_G(f) = f^T L f = f^T QΛQ^T f = α^T Λ α = ‖α‖_Λ^2 = Σ_{i=1}^N λ_i α_i^2
Eigenvectors are graph functions too! What is the smoothness of an eigenvector?
Spectral coordinates of the eigenvector v_k: Q^T v_k = e_k.
S_G(v_k) = v_k^T L v_k = v_k^T QΛQ^T v_k = e_k^T Λ e_k = ‖e_k‖_Λ^2 = Σ_{i=1}^N λ_i (e_k)_i^2 = λ_k
The smoothness of the k-th eigenvector is the k-th eigenvalue.

Laplacian of the Complete Graph K_N
What is the eigenspectrum of L_{K_N}?
L_{K_N} has N − 1 on the diagonal and −1 everywhere else.
From before: we know that (0, 1_N) is an eigenpair.
If v ≠ 0_N and v ⊥ 1_N, then Σ_i v_i = 0.
To get the other eigenvalues, we compute (L_{K_N} v)_1 and divide by v_1 (w.l.o.g. v_1 ≠ 0):
(L_{K_N} v)_1 = (N − 1)v_1 − Σ_{i=2}^N v_i = Nv_1.
What are the remaining eigenvalues/vectors?
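The answer suggested by the computation above can be checked numerically; a minimal sketch (assumed example) showing that L_{K_N} has eigenvalue 0 once and eigenvalue N with multiplicity N − 1:

```python
import numpy as np

N = 5
W = np.ones((N, N)) - np.eye(N)          # complete graph K_N with unit weights
L = np.diag(W.sum(axis=1)) - W           # = N*I - ones((N, N))

eigvals = np.sort(np.linalg.eigvalsh(L))
print(np.round(eigvals, 6))              # [0, N, N, ..., N]
```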

Normalized Laplacians
L_un = D − W
L_sym = D^{−1/2} L D^{−1/2} = I − D^{−1/2} W D^{−1/2}
L_rw = D^{−1} L = I − D^{−1} W
f^T L_sym f = (1/2) Σ_{i,j ≤ N} w_ij (f_i/√d_i − f_j/√d_j)^2
(λ, u) is an eigenpair for L_rw iff (λ, D^{1/2} u) is an eigenpair for L_sym.
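A brief sketch (assumed example) constructing L_sym and L_rw and checking the eigenpair correspondence stated above:

```python
import numpy as np

rng = np.random.RandomState(2)
W = rng.rand(5, 5); W = (W + W.T) / 2; np.fill_diagonal(W, 0.0)
d = W.sum(axis=1)
L = np.diag(d) - W

D_inv_sqrt = np.diag(1.0 / np.sqrt(d))
L_sym = D_inv_sqrt @ L @ D_inv_sqrt       # I - D^{-1/2} W D^{-1/2}
L_rw = np.diag(1.0 / d) @ L               # I - D^{-1} W

# (lambda, u) eigenpair of L_rw  <=>  (lambda, D^{1/2} u) eigenpair of L_sym
lam, U_sym = np.linalg.eigh(L_sym)
u = D_inv_sqrt @ U_sym[:, 1]              # map an eigenvector of L_sym back: u = D^{-1/2} (D^{1/2} u)
assert np.allclose(L_rw @ u, lam[1] * u)
```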

Normalized Laplacians
L_sym and L_rw are PSD with non-negative real eigenvalues 0 = λ_1 ≤ λ_2 ≤ λ_3 ≤ ... ≤ λ_N.
(λ, u) is an eigenpair for L_rw iff (λ, u) solves the generalized eigenproblem Lu = λDu.
(0, 1_N) is an eigenpair for L_rw. (0, D^{1/2} 1_N) is an eigenpair for L_sym.
The multiplicity of the eigenvalue 0 of L_rw or L_sym equals the number of connected components.
Proof: as for L.

Laplacian and Random Walks on Undirected Graphs
- stochastic process: vertex-to-vertex jumping
- transition probability from v_i to v_j is p_ij = w_ij / d_i, with d_i := Σ_j w_ij
- transition matrix P = (p_ij)_ij = D^{−1} W (notice L_rw = I − P)
- if G is connected and non-bipartite: unique stationary distribution π = (π_1, π_2, π_3, ..., π_N), where π_i = d_i / vol(V)
- vol(G) = vol(V) = vol(W) := Σ_i d_i = Σ_{i,j} w_ij
- π = 1^T W / vol(W) verifies πP = π, since
  πP = 1^T W P / vol(W) = 1^T D P / vol(W) = 1^T D D^{−1} W / vol(W) = 1^T W / vol(W) = π
What's the difference from PageRank™?
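A small sketch (assumed example) building P = D^{−1}W and checking that π = d / vol(V) is stationary:

```python
import numpy as np

rng = np.random.RandomState(3)
W = rng.rand(5, 5); W = (W + W.T) / 2; np.fill_diagonal(W, 0.0)
d = W.sum(axis=1)

P = W / d[:, None]                 # transition matrix D^{-1} W (rows sum to 1)
pi = d / d.sum()                   # candidate stationary distribution d_i / vol(V)

assert np.allclose(pi @ P, pi)     # pi P = pi
```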

score(v, m)
recommendation on a bipartite graph with the graph Laplacian!

Use of Laplacians: Movie Recommendation
How to do movie recommendation on a bipartite graph?
(figure: bipartite graph with viewers Adam, Barbara, and Céline (viewers 1-3) connected by ranking edges to the movies Blade Runner 2049, Cars 3, and Capitaine Superslip (movies A-C))
Question: Do we recommend Capitaine Superslip to Adam?
Let's compute some score(v, m)!

Use of Laplacians: Movie Recommendation
How to compute score(v, m)? Using some graph distance!
Idea 1: maximally weighted path
score(v, m) = max_{P: v→m} weight(P) = max_{P: v→m} Σ_{e∈P} ranking(e)
Idea 2: change the path weight
score_2(v, m) = max_{P: v→m} weight_2(P) = max_{P: v→m} min_{e∈P} ranking(e)
Idea 3: consider everything
score_3(v, m) = max flow from m to v
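For Idea 2, the bottleneck score can be computed with a best-first search that maximizes the minimum edge ranking along the path. The sketch below (hypothetical helper widest_path_score and toy adjacency structure, not from the slides) is one way to do it:

```python
import heapq

def widest_path_score(adj, source, target):
    """Idea 2: score_2(v, m) = max over paths of the minimum edge ranking.
    adj: dict mapping a node to a list of (neighbor, ranking) pairs."""
    best = {source: float("inf")}
    heap = [(-best[source], source)]
    while heap:
        neg_width, u = heapq.heappop(heap)
        width = -neg_width
        if u == target:
            return width
        if width < best.get(u, 0.0):
            continue                      # stale heap entry
        for v, ranking in adj.get(u, []):
            w = min(width, ranking)
            if w > best.get(v, 0.0):
                best[v] = w
                heapq.heappush(heap, (-w, v))
    return 0.0                            # no path found

# toy viewer/movie bipartite graph with ranking-weighted edges (illustrative)
adj = {
    "Adam":    [("movie A", 5), ("movie B", 2)],
    "movie A": [("Adam", 5), ("Barbara", 4)],
    "Barbara": [("movie A", 4), ("movie C", 3)],
    "movie B": [("Adam", 2)],
    "movie C": [("Barbara", 3)],
}
print(widest_path_score(adj, "Adam", "movie C"))   # 3
```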

Laplacians and Resistive Networks
How to compute score(v, m)?
Idea 4: view edges as conductors
score_4(v, m) = effective resistance between m and v
C: conductance, R: resistance, i: current, V: voltage
C = 1/R, i = CV = V/R

Resistive Networks: Some high-school physics

Resistive Networks
Resistors in series: R = R_1 + · · · + R_n, 1/C = 1/C_1 + · · · + 1/C_N, i = V/R
Conductors in parallel: C = C_1 + · · · + C_N, i = VC
Effective resistance on a graph:
Take two nodes, a ≠ b. Let V_ab be the voltage between them and i_ab the current between them. Define R_ab = V_ab / i_ab and C_ab = 1/R_ab.
We treat the entire graph as a resistor!

Resistive Networks: Optional Homework (ungraded)
Show that R_ab is a metric:
1. R_ab ≥ 0
2. R_ab = 0 iff a = b
3. R_ab = R_ba
4. R_ac ≤ R_ab + R_bc
The effective resistance is a distance!

How to compute effective resistance?
Kirchhoff's Law: flow in = flow out.
Consider a node at voltage V connected to nodes at voltages V_1, V_2, V_3 through conductances C_1, C_2, C_3, and let C = C_1 + C_2 + C_3:
V = (C_1/C) V_1 + (C_2/C) V_2 + (C_3/C) V_3 (a convex combination)
residual current = CV − C_1V_1 − C_2V_2 − C_3V_3
Kirchhoff says: this is zero! There is no residual current!

Resistors: Where is the link with the Laplacian?
General case of the previous:
d_i = Σ_j c_ij = sum of the conductances at node i
L_ij = d_i if i = j; −c_ij if (i, j) ∈ E; 0 otherwise.
v = voltage setting of the nodes on the graph.
(Lv)_i = residual current at v_i, as we derived.
Use: setting the voltages and getting the currents. Inverting: injecting currents and getting the voltages.
The net injected current has to be zero: Kirchhoff's Law.

Resistors and the Laplacian: Finding R_ab
Let's calculate R_1N to get the movie recommendation score!
Fix the boundary voltages v_1 = 1 and v_N = 0 and solve
L (1, v_2, ..., v_{N−1}, 0)^T = (i, 0, ..., 0, −i)^T.
Since i = V/R and V = 1, we have R = 1/i. Return R_1N = 1/i.
Doyle and Snell: Random Walks and Electric Networks

Resistors and the Laplacian: Finding R_1N
Lv = (i, 0, ..., 0, −i)^T is a boundary-value problem.
For R_1N, V_1 and V_N are the boundary.
(v_1, v_2, ..., v_N) is harmonic: every V_i in the interior (not on the boundary) is a convex combination of its neighbors.

Resistors and the Laplacian: Finding R_1N
From the properties of electric networks (cf. Doyle and Snell) we inherit the useful properties of the Laplacians!
Example: Semi-Supervised Learning Using Gaussian Fields and Harmonic Functions (later in the course)
Maximum Principle: if f = v is harmonic, then its min and max are attained on the boundary.
Uniqueness Principle: if f and g are harmonic with the same boundary values, then f = g.

Resistors and the Laplacian: Finding R_1N
Alternative method to calculate R_1N:
Solve Lv = (1, 0, ..., 0, −1)^T =: i_ext and return R_1N = v_1 − v_N.
Why?
Question: Does v exist? L does not have an inverse :(
Not unique: 1 is in the nullspace of L: L(v + c1) = Lv + cL1 = Lv.
The Moore-Penrose pseudo-inverse solves a least-squares problem.
Solution: instead of v = L^{−1} i_ext we take v = L^+ i_ext.
We get: R_1N = v_1 − v_N = i_ext^T v = i_ext^T L^+ i_ext.
Notice: we can reuse L^+ to get the resistances for any pair of nodes!

What? A pseudo-inverse?
Eigendecomposition of the Laplacian:
L = QΛQ^T = Σ_{i=1}^N λ_i q_i q_i^T = Σ_{i=2}^N λ_i q_i q_i^T
Pseudo-inverse of the Laplacian:
L^+ = QΛ^+Q^T = Σ_{i=2}^N (1/λ_i) q_i q_i^T
The Moore-Penrose pseudo-inverse solves a least-squares problem:
v = arg min_x ‖Lx − i_ext‖_2 = L^+ i_ext
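A short sketch (assumed example) computing effective resistances with the pseudo-inverse, using R_ab = i_ext^T L^+ i_ext; on a path of two unit resistors the end-to-end resistance should be 2:

```python
import numpy as np

# path graph 1 - 2 - 3 with unit conductances (two unit resistors in series)
W = np.array([[0., 1., 0.],
              [1., 0., 1.],
              [0., 1., 0.]])
L = np.diag(W.sum(axis=1)) - W
L_pinv = np.linalg.pinv(L)               # Moore-Penrose pseudo-inverse L^+

def effective_resistance(a, b):
    i_ext = np.zeros(len(W))
    i_ext[a], i_ext[b] = 1.0, -1.0       # inject one unit of current at a, extract it at b
    return i_ext @ L_pinv @ i_ext        # R_ab = i_ext^T L^+ i_ext

print(effective_resistance(0, 2))        # 2.0: two unit resistors in series
print(effective_resistance(0, 1))        # 1.0
```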

35 cut(a, B) = 1 2 ft Lf spectral clustering with connectivity beyond compactness

How to rule the world?
Let's make France great again!

How to rule the world?

How to rule the world: AI is here
1d71fe2e-d391-11e2-b05f-3ea3f0e7bb5a_story.html
Talk of Rayid Ghani:

Application of Graphs for ML: Clustering

Application: Clustering - Recap
What do we know about clustering in general?
- ill-defined problem (different tasks, different paradigms): "I know it when I see it"
- inconsistent (w.r.t. Kleinberg's axioms)
- the number of clusters k often needs to be known
- difficult to evaluate
What do we know about k-means?
- hard version of EM clustering
- sensitive to initialization
- optimizes for compactness
- yet: the algorithm-to-go

Spectral Clustering: Cuts on graphs
(figure: an example graph with nodes A through K)

Spectral Clustering: Cuts on graphs
(figure: the same example graph with nodes A through K)
Defining the cut objective, we get the clustering!

Spectral Clustering: Cuts on graphs
(figure: the same example graph with nodes A through K)
MinCut: cut(A, B) = Σ_{i∈A, j∈B} w_ij
Are we done? MinCut can be solved efficiently, but it is maybe not what we want...
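A tiny numerical sketch (assumed toy graph, using a 0/1 cluster indicator f) showing that the Laplacian quadratic form recovers this cut value:

```python
import numpy as np

# assumed toy graph: two dense triangles {0,1,2} and {3,4,5} joined by one weak edge
W = np.zeros((6, 6))
for i, j, w in [(0, 1, 1.), (1, 2, 1.), (0, 2, 1.),
                (3, 4, 1.), (4, 5, 1.), (3, 5, 1.),
                (2, 3, 0.1)]:
    W[i, j] = W[j, i] = w
L = np.diag(W.sum(axis=1)) - W

f = np.array([1., 1., 1., 0., 0., 0.])     # 0/1 indicator of cluster A = {0, 1, 2}
cut_value = f @ L @ f                      # sum of the weights of the edges crossing the cut
print(cut_value)                           # 0.1
```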

Next class on Wednesday, October 17th at 14:00!
Rooms: S. des Conférences, S. Visio DSI, S. Renaudeau, C518 Bretécher, Amphi Tocqueville, Uderzo, Amphi Marie Curie, S. des Comm., Amphi 121 Condorcet, FCD Fonteneau, 131 bis, Amphi 109, Amphi e-media

Michal Valko
ENS Paris-Saclay, MVA 2018/2019
SequeL team, Inria Lille - Nord Europe
