Talk 2: Graph Mining Tools - SVD, ranking, proximity. Christos Faloutsos CMU

Outline Introduction Motivation Task 1: Node importance Task 2: Recommendations Task 3: Connection sub-graphs Conclusions Lipari 2010 (C) 2010, C. Faloutsos 2

Node importance - Motivation: Given a graph (e.g., web pages containing the desirable query word) Q: Which node is the most important?

Node importance - Motivation: Given a graph (e.g., web pages containing the desirable query word) Q: Which node is the most important? A1: HITS (SVD = Singular Value Decomposition) A2: eigenvector (PageRank)

Node importance - Motivation: SVD and eigenvector analysis are very closely related

SVD - Detailed outline: Motivation; Definition - properties; Interpretation; Complexity; Case studies

SVD - Motivation: problem #1: text - LSI: find "concepts"; problem #2: compression / dim. reduction

SVD - Motivation: problem #1: text - LSI: find "concepts"

SVD - Motivation: customer-product matrix, for recommendation systems: vegetarians vs. meat eaters

SVD - Motivation: problem #2: compress / reduce dimensionality

Problem - specs: ~10^6 rows; ~10^3 columns; no updates; random access to any cell(s); small error: OK

SVD - Motivation (figure only)

SVD - Detailed outline: Motivation; Definition - properties; Interpretation; Complexity; Case studies; Additional properties

SVD - Definition (reminder: matrix multiplication: a [3 x 2] matrix times a [2 x 1] vector gives a [3 x 1] vector - figure)

SVD - Definition: A [n x m] = U [n x r] Λ [r x r] (V [m x r])^T. A: n x m matrix (e.g., n documents, m terms); U: n x r matrix (n documents, r concepts); Λ: r x r diagonal matrix (strength of each "concept"); r: rank of the matrix; V: m x r matrix (m terms, r concepts)

SVD - Definition: A = U Λ V^T - example (figure)

SVD - Properties: THEOREM [Press+92]: it is always possible to decompose a matrix A into A = U Λ V^T, where U, Λ, V are unique (*); U, V are column-orthonormal (i.e., columns are unit vectors, orthogonal to each other): U^T U = I; V^T V = I (I: identity matrix); Λ: singular values are positive, and sorted in decreasing order
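These properties are easy to sanity-check numerically. A minimal sketch, assuming numpy and a random 5 x 3 matrix as a stand-in for a document-term matrix:

```python
import numpy as np

# Illustrative stand-in for a document-term matrix A (not from the slides).
rng = np.random.default_rng(0)
A = rng.random((5, 3))

U, s, Vt = np.linalg.svd(A, full_matrices=False)  # A = U diag(s) Vt

# U, V are column-orthonormal: U^T U = I and V^T V = I
assert np.allclose(U.T @ U, np.eye(3))
assert np.allclose(Vt @ Vt.T, np.eye(3))

# singular values are non-negative and sorted in decreasing order
assert np.all(s >= 0) and np.all(np.diff(s) <= 0)

# the factorization reconstructs A exactly
assert np.allclose(U @ np.diag(s) @ Vt, A)
```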

SVD - Example: A = U Λ V^T, on a document-term matrix with terms {data, inf. retrieval, brain, lung} and CS / MD documents (figure): U holds the doc-to-concept similarity matrix (CS-concept and MD-concept columns), Λ holds the strength of each concept (e.g., the strength of the CS-concept), and V^T holds the term-to-concept similarity matrix

SVD - Detailed outline: Motivation; Definition - properties; Interpretation; Complexity; Case studies; Additional properties

SVD - Interpretation #1: "documents", "terms" and "concepts": U: document-to-concept similarity matrix; V: term-to-concept similarity matrix; Λ: its diagonal elements give the strength of each concept

SVD - Interpretation #1: "documents", "terms" and "concepts": Q: if A is the document-to-term matrix, what is A^T A? A: the term-to-term ([m x m]) similarity matrix. Q: and A A^T? A: the document-to-document ([n x n]) similarity matrix
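A quick numeric illustration of the two products, on a hypothetical 2-document, 3-term count matrix:

```python
import numpy as np

# Hypothetical doc-term counts: 2 documents, 3 terms (illustrative only).
A = np.array([[1.0, 1.0, 0.0],
              [0.0, 1.0, 1.0]])

term_term = A.T @ A   # [m x m] term-to-term similarity
doc_doc = A @ A.T     # [n x n] document-to-document similarity

assert term_term.shape == (3, 3) and doc_doc.shape == (2, 2)
# both are symmetric, as similarity matrices should be
assert np.allclose(term_term, term_term.T)
assert np.allclose(doc_doc, doc_doc.T)
# the two documents share exactly one term (term 1)
assert doc_doc[0, 1] == 1.0
```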

SVD properties: V are the eigenvectors of the covariance matrix A^T A; U are the eigenvectors of the Gram (inner-product) matrix A A^T. Further reading: 1. Ian T. Jolliffe, Principal Component Analysis (2nd ed.), Springer, 2002. 2. Gilbert Strang, Linear Algebra and Its Applications (4th ed.), Brooks Cole, 2005.

SVD - Interpretation #2: best axis to project on ("best" = minimizing the sum of squares of the projection errors)

SVD - Interpretation #2: SVD gives the best axis to project on: the first singular vector v1 gives minimum RMS error (figure)

SVD - Interpretation #2: A = U Λ V^T - example (figure): the first right singular vector v1 is the best projection axis; the singular value gives the variance ("spread") on the v1 axis; U Λ gives the coordinates of the points on the projection axes

SVD - Interpretation #2: more details. Q: how exactly is dim. reduction done? A: set the smallest singular values to zero
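The zeroing step can be sketched in a few lines of numpy (the 6 x 4 random matrix and k = 2 are arbitrary illustrative choices):

```python
import numpy as np

rng = np.random.default_rng(1)
A = rng.random((6, 4))                      # toy data matrix
U, s, Vt = np.linalg.svd(A, full_matrices=False)

k = 2                                       # keep the 2 strongest concepts
s_trunc = s.copy()
s_trunc[k:] = 0.0                           # set the smallest singular values to zero
A_k = U @ np.diag(s_trunc) @ Vt             # rank-k reconstruction

# the reconstruction has rank k, and its Frobenius error is exactly
# the energy of the discarded singular values
assert np.linalg.matrix_rank(A_k) == k
assert np.isclose(np.linalg.norm(A - A_k), np.sqrt(np.sum(s[k:] ** 2)))
```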

SVD - Interpretation #2: the truncated reconstruction approximates the original matrix (figure)

SVD - Interpretation #2: exactly equivalent: the spectral decomposition of the matrix: A = λ1 u1 v1^T + λ2 u2 v2^T + ... (r terms, each u_i an [n x 1] column vector and each v_i^T a [1 x m] row vector)

SVD - Interpretation #2: approximation / dim. reduction: keep only the first few terms (Q: how many?), assuming λ1 >= λ2 >= ...

SVD - Interpretation #2: A (heuristic - [Fukunaga]): keep 80-90% of the "energy" (= the sum of squares of the λ_i's), assuming λ1 >= λ2 >= ...
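The energy heuristic translates directly into code; a small sketch (the example values 10, 3, 1 are made up):

```python
import numpy as np

def rank_for_energy(singular_values, energy=0.9):
    """Smallest k such that the first k singular values hold `energy`
    of the total energy (sum of squares), per the Fukunaga heuristic."""
    sq = np.sort(np.asarray(singular_values, dtype=float))[::-1] ** 2
    cum = np.cumsum(sq) / sq.sum()          # cumulative energy fractions
    return int(np.searchsorted(cum, energy) + 1)

# energies are 100, 9, 1: the first value alone holds 100/110 ~ 91%
assert rank_for_energy([10, 3, 1], energy=0.9) == 1
assert rank_for_energy([10, 3, 1], energy=0.99) == 2
```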

SVD - Detailed outline: Motivation; Definition - properties; Interpretation (#1: documents/terms/concepts; #2: dim. reduction; #3: picking non-zero, rectangular blobs); Complexity; Case studies; Additional properties

SVD - Interpretation #3: finds non-zero "blobs" in a data matrix (figure)

SVD - Interpretation #3: the blobs correspond to communities (bi-partite cores, here): e.g., rows 1, 4, 5, 7 with columns 1, 3, 4

SVD - Detailed outline: Motivation; Definition - properties; Interpretation; Complexity; Case studies; Additional properties

SVD - Complexity: O(n * m * m) or O(n * n * m) (whichever is less); less work if we just want the singular values, or if we only want the first k singular vectors, or if the matrix is sparse [Berry]. Implemented in any linear algebra package (LINPACK, Matlab, S-Plus, Mathematica, ...)

SVD - conclusions so far: SVD: A = U Λ V^T: unique (*); U: document-to-concept similarities; V: term-to-concept similarities; Λ: strength of each concept; dim. reduction: keep the first few strongest singular values (80-90% of the "energy"); SVD picks up linear correlations; SVD picks up non-zero "blobs"

SVD - Detailed outline: Motivation; Definition - properties; Interpretation; Complexity; SVD properties; Case studies; Conclusions

SVD - Other properties - summary: can produce an orthogonal basis (obvious) (who cares?); can solve over- and under-determined linear problems (see the C(1) property); can compute "fixed points" (= "steady state" probabilities in Markov chains) (see the C(4) property)

SVD - outline of properties: (A): obvious; (B): less obvious; (C): least obvious (and most powerful!)

Properties - by definition: A(0): A [n x m] = U [n x r] Λ [r x r] V^T [r x m]; A(1): U^T [r x n] U [n x r] = I [r x r] (identity matrix); A(2): V^T [r x m] V [m x r] = I [r x r]; A(3): Λ^k = diag(λ1^k, λ2^k, ..., λr^k) (k: ANY real number); A(4): A^T = V Λ U^T

Less obvious properties: A(0): A [n x m] = U [n x r] Λ [r x r] V^T [r x m]; B(1): A [n x m] (A^T) [m x n] = U Λ^2 U^T (symmetric; intuition? the document-to-document similarity matrix); B(2): symmetrically, for V: (A^T) [m x n] A [n x m] = V Λ^2 V^T (intuition? the term-to-term similarity matrix)

Less obvious properties: B(3): ((A^T) [m x n] A [n x m])^k = V Λ^2k V^T; and B(4): (A^T A)^k ~ v1 λ1^2k v1^T for k >> 1, where v1: [m x 1] first column (singular vector) of V, and λ1: strongest singular value

Less obvious properties: B(4): (A^T A)^k ~ v1 λ1^2k v1^T for k >> 1; B(5): (A^T A)^k v' ~ (constant) v1, i.e., for (almost) any v', it converges to a vector parallel to v1. Thus, useful to compute the first singular vector/value (as well as the next ones, too...)
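Property B(5) is exactly why the power method works. A self-contained sketch (the 3 x 2 matrix is an arbitrary example, not from the slides):

```python
import numpy as np

def first_singular_vector(A, iters=100):
    """Power method: repeatedly apply A^T A to a random vector;
    by property B(5) the result converges to v1 (up to sign)."""
    rng = np.random.default_rng(42)
    v = rng.random(A.shape[1])
    for _ in range(iters):
        v = A.T @ (A @ v)            # one application of A^T A
        v /= np.linalg.norm(v)       # renormalize so it doesn't blow up
    return v

A = np.array([[3.0, 1.0], [1.0, 2.0], [0.0, 1.0]])
v1 = first_singular_vector(A)
_, _, Vt = np.linalg.svd(A)
# agrees with the exact v1, up to sign
assert min(np.linalg.norm(v1 - Vt[0]), np.linalg.norm(v1 + Vt[0])) < 1e-6
```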

Less obvious properties - repeated: A(0): A [n x m] = U [n x r] Λ [r x r] V^T [r x m]; B(1): A A^T = U Λ^2 U^T; B(2): A^T A = V Λ^2 V^T; B(3): (A^T A)^k = V Λ^2k V^T; B(4): (A^T A)^k ~ v1 λ1^2k v1^T; B(5): (A^T A)^k v' ~ (constant) v1

Least obvious properties - cont'd: A(0): A [n x m] = U [n x r] Λ [r x r] V^T [r x m]; C(2): A [n x m] v1 [m x 1] = λ1 u1 [n x 1], where v1, u1 are the first (column) vectors of V, U (v1 == right singular vector); C(3): symmetrically: u1^T A = λ1 v1^T (u1 == left singular vector). Therefore:

Least obvious properties - cont'd: C(4): A^T A v1 = λ1^2 v1 (a fixed point - the definition of an eigenvector, for a symmetric matrix)

Least obvious properties - altogether: A(0): A [n x m] = U [n x r] Λ [r x r] V^T [r x m]; C(1): if A [n x m] x [m x 1] = b [n x 1], then x0 = V Λ^(-1) U^T b is the shortest, actual or least-squares solution; C(2): A v1 = λ1 u1; C(3): u1^T A = λ1 v1^T; C(4): A^T A v1 = λ1^2 v1
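Property C(1) is the SVD route to least squares; a sketch on a made-up over-determined system (A and b are arbitrary):

```python
import numpy as np

# Toy over-determined system: 3 equations, 2 unknowns.
A = np.array([[1.0, 0.0], [0.0, 1.0], [1.0, 1.0]])
b = np.array([1.0, 2.0, 2.0])

# C(1): x0 = V Lambda^(-1) U^T b (i.e., apply the pseudo-inverse)
U, s, Vt = np.linalg.svd(A, full_matrices=False)
x0 = Vt.T @ np.diag(1.0 / s) @ U.T @ b

# matches numpy's least-squares solver
x_ls, *_ = np.linalg.lstsq(A, b, rcond=None)
assert np.allclose(x0, x_ls)
```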

Properties - conclusions: A(0): A = U Λ V^T; B(5): (A^T A)^k v' ~ (constant) v1; C(1): if A x = b, then x0 = V Λ^(-1) U^T b is the shortest, actual or least-squares solution; C(4): A^T A v1 = λ1^2 v1

SVD - detailed outline: ... SVD properties; case studies: Kleinberg's algorithm, Google's algorithm; conclusions

Kleinberg's algo (HITS): Kleinberg, Jon (1998). Authoritative sources in a hyperlinked environment. Proc. 9th ACM-SIAM Symposium on Discrete Algorithms.

Recall: problem dfn: Given a graph (e.g., web pages containing the desirable query word) Q: Which node is the most important?

Kleinberg's algorithm: Problem dfn: given the web and a query, find the most "authoritative" web pages for this query. Step 0: find all pages containing the query terms. Step 1: expand by one move forward and backward

Kleinberg's algorithm: Step 1: expand by one move forward and backward (figure)

Kleinberg's algorithm: on the resulting graph, give high score (= "authorities") to nodes that many important nodes point to; give high importance score ("hubs") to nodes that point to good "authorities" (figure: hubs, authorities)

Kleinberg's algorithm - observations: recursive definition! Each node (say, the i-th node) has both an authoritativeness score a_i and a hubness score h_i

Kleinberg's algorithm: Let E be the set of edges and A be the adjacency matrix: the (i,j) entry is 1 if the edge from i to j exists. Let h and a be [n x 1] vectors with the "hubness" and "authoritativeness" scores. Then:

Kleinberg's algorithm: a_i = h_k + h_l + h_m (for nodes k, l, m pointing to node i); that is, a_i = Sum(h_j) over all j such that the edge (j,i) exists; or a = A^T h

Kleinberg's algorithm: symmetrically, for the "hubness": h_i = a_n + a_p + a_q (for nodes n, p, q that node i points to); that is, h_i = Sum(a_j) over all j such that the edge (i,j) exists; or h = A a

Kleinberg's algorithm: In conclusion, we want vectors h and a such that: h = A a; a = A^T h. Recall the properties: C(2): A [n x m] v1 [m x 1] = λ1 u1 [n x 1]; C(3): u1^T A = λ1 v1^T

Kleinberg's algorithm: In short, the solutions to h = A a, a = A^T h are the left- and right- singular vectors of the adjacency matrix A. Starting from a random a and iterating, we will eventually converge (Q: to which of all the singular vectors? why?)

Kleinberg's algorithm: (Q: to which of all the singular vectors? why?) A: to the ones of the strongest singular value, because of property B(5): (A^T A)^k v' ~ (constant) v1
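The iterate-and-normalize view of HITS can be sketched directly; the 3-node graph below is an illustrative assumption (two pages pointing at one popular page):

```python
import numpy as np

def hits(A, iters=50):
    """HITS sketch: iterate h = A a, a = A^T h with normalization;
    converges to the strongest left/right singular vectors of A."""
    a = np.ones(A.shape[0])
    for _ in range(iters):
        h = A @ a
        h /= np.linalg.norm(h)
        a = A.T @ h
        a /= np.linalg.norm(a)
    return h, a

# toy graph: nodes 0 and 1 both point to node 2
A = np.array([[0.0, 0.0, 1.0],
              [0.0, 0.0, 1.0],
              [0.0, 0.0, 0.0]])
h, a = hits(A)
assert np.argmax(a) == 2              # node 2 is the top authority
assert h[0] > h[2] and h[1] > h[2]    # nodes 0 and 1 are the hubs
```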

Kleinberg's algorithm - results: e.g., for the query "java": 0.328 www.gamelan.com; 0.251 java.sun.com; 0.190 www.digitalfocus.com ("the java developer")

Kleinberg's algorithm - discussion: the "authority" score can be used to find "similar pages" (how?)

SVD - detailed outline: ... Complexity; SVD properties; case studies: Kleinberg's algorithm (HITS), Google's algorithm; conclusions

PageRank (Google): Brin, Sergey and Lawrence Page (1998). Anatomy of a Large-Scale Hypertextual Web Search Engine. 7th Intl World Wide Web Conf. (photos: Larry Page, Sergey Brin)

Problem: PageRank: Given a directed graph, find its most "interesting"/central node. A node is important, if it is connected with important nodes (recursive, but OK!)

Problem: PageRank - solution: Given a directed graph, find its most "interesting"/central node. Proposed solution: random walk; spot the most "popular" node (-> steady state probability (ssp)). A node has high ssp, if it is connected with high-ssp nodes (recursive, but OK!)

(Simplified) PageRank algorithm: Let A be the adjacency matrix; let B be the transition matrix: the transpose of A, column-normalized (to-from) (figure: a 5-node example)

(Simplified) PageRank algorithm: B p = p (figure: the 5-node example)

Definitions: A: adjacency matrix (from-to); D: degree matrix = diag(d1, d2, ..., dn); B: transition matrix: to-from, column-normalized: B = A^T D^(-1)

(Simplified) PageRank algorithm: B p = 1 * p; thus, p is the eigenvector that corresponds to the highest eigenvalue (= 1, since the matrix is column-normalized). Why does such a p exist? p exists if B is nxn, nonnegative, irreducible [Perron-Frobenius theorem]

(Simplified) PageRank algorithm: In short: imagine a particle randomly moving along the edges; compute its steady-state probabilities (ssp). Full version of the algorithm: with occasional random jumps. Why? To make the matrix irreducible

Full Algorithm: With probability 1-c, fly out to a random node. Then, we have p = c B p + (1-c)/n 1 => p = (1-c)/n [I - c B]^(-1) 1
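A power-iteration sketch of the full algorithm, using the definitions above (B = A^T D^(-1)); the 3-node graph is a toy example, and every node is assumed to have at least one out-link:

```python
import numpy as np

def pagerank(A, c=0.85, iters=100):
    """PageRank sketch: B = A^T D^(-1) (to-from, column-normalized),
    then iterate p = c B p + (1-c)/n * 1."""
    n = A.shape[0]
    d = A.sum(axis=1)                  # out-degrees (assumed nonzero)
    B = A.T / d                        # each column of B sums to 1
    p = np.ones(n) / n
    for _ in range(iters):
        p = c * (B @ p) + (1 - c) / n  # step + random jump
    return p

# toy graph: 0 -> 1, 1 -> 2, 2 -> 0 and 2 -> 1
A = np.array([[0, 1, 0],
              [0, 0, 1],
              [1, 1, 0]], dtype=float)
p = pagerank(A)
assert np.isclose(p.sum(), 1.0)   # p stays a probability vector
assert np.argmin(p) == 0          # node 0 has only one in-link: least popular
```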

Alternative notation: M: "modified" transition matrix: M = c B + (1-c)/n 1 1^T; then p = M p. That is: the steady state probabilities = PageRank scores form the first eigenvector of the "modified" transition matrix

Parenthesis: intuition behind eigenvectors

Formal definition: If A is an (n x n) square matrix, (λ, x) is an eigenvalue/eigenvector pair of A if A x = λ x. CLOSELY related to singular values:

Property #1: Eigen- vs singular-values: if B [n x m] = U [n x r] Λ [r x r] (V [m x r])^T, then A = (B^T B) is symmetric and C(4): B^T B v_i = λ_i^2 v_i; i.e., v1, v2, ...: eigenvectors of A = (B^T B)
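This correspondence is easy to verify numerically; a sketch with a random 4 x 3 matrix standing in for B:

```python
import numpy as np

rng = np.random.default_rng(3)
B = rng.random((4, 3))

_, s, _ = np.linalg.svd(B, full_matrices=False)
w, V = np.linalg.eigh(B.T @ B)   # eigen-decomposition of the symmetric B^T B

# eigenvalues of B^T B = squared singular values of B
# (eigh returns ascending order, so flip before comparing)
assert np.allclose(np.sort(w)[::-1], s ** 2)
```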

Property #2: If A [n x n] is a real, symmetric matrix, then it has n real eigenvalues (if A is not symmetric, some eigenvalues may be complex)

Property #3: If A [n x n] is a real, symmetric matrix, then it has n real eigenvalues, and they agree with its n singular values, except possibly for the sign

Intuition: A as a vector transformation: the matrix A maps a vector x to a new vector x' = A x (figure: a 2 x 2 example)

Intuition: By definition, eigenvectors remain parallel to themselves ("fixed points"): A v1 = λ1 v1 (here λ1 = 3.62) (figure)

Convergence: usually fast; it depends on the ratio λ1 : λ2 (figure)

Kleinberg/Google - conclusions: SVD helps in graph analysis: hub/authority scores: strongest left- and right- singular vectors of the adjacency matrix; random walk on a graph: steady state probabilities are given by the strongest eigenvector of the ("modified") transition matrix

Conclusions: SVD: a valuable tool; given a document-term matrix, it finds "concepts" (LSI); ... and can find fixed points or steady-state probabilities (Google / Kleinberg / Markov chains)

Conclusions cont'd: (We didn't discuss/elaborate, but) SVD ... can reduce dimensionality (KL); ... and can find rules (PCA; RatioRules); ... and can solve optimally over- and under-constrained linear systems (least squares / query feedbacks)

References: Berry, Michael: http://www.cs.utk.edu/~lsi/ ; Brin, S. and L. Page (1998). Anatomy of a Large-Scale Hypertextual Web Search Engine. 7th Intl World Wide Web Conf.

References: Christos Faloutsos, Searching Multimedia Databases by Content, Springer, 1996 (App. D); Fukunaga, K. (1990). Introduction to Statistical Pattern Recognition, Academic Press; I.T. Jolliffe, Principal Component Analysis (2nd ed.), Springer, 2002

References cont'd: Kleinberg, J. (1998). Authoritative sources in a hyperlinked environment. Proc. 9th ACM-SIAM Symposium on Discrete Algorithms; Press, W. H., S. A. Teukolsky, et al. (1992). Numerical Recipes in C, Cambridge University Press. www.nr.com

Outline: Introduction - Motivation; Task 1: Node importance; Task 2: Recommendations & proximity; Task 3: Connection sub-graphs; Conclusions

Acknowledgement: Most of the foils in Task 2 are by Hanghang Tong: www.cs.cmu.edu/~htong

Detailed outline: Problem dfn and motivation; Solution: random walk with restarts; Efficient computation; Case study: image auto-captioning; Extensions: bi-partite graphs, tracking; Conclusions

Motivation: Link prediction: Should we introduce Mr. A to Mr. B? (figure: A ? B)

Motivation - recommendations: customers vs. products/movies: e.g., should we recommend "Terminator 2" to "Smith"? (figure)

Answer: proximity: "yes", if A and B are "close"; "yes", if "Smith" and "Terminator 2" are "close". QUESTIONS in this part: How to measure "closeness"/proximity? How to do it quickly? What else can we do, given proximity scores?

How close is A to B? (a.k.a. relevance, closeness, similarity)

Why is it useful? Recommendation; and many more: image captioning [Pan+]; connection / center-piece subgraphs [Faloutsos+], [Tong+], [Koren+]; link prediction [Liben-Nowell+], [Tong+]; ranking [Haveliwala], [Chakrabarti+]; email management [Minkov+]; neighborhood formulation [Sun+]; pattern matching [Tong+]; collaborative filtering [Fouss+]

Automatic image captioning: given a test image and its regions, Q: how to assign keywords (sea, sun, sky, wave, cat, forest, tiger, grass) to the test image? A: proximity! [Pan+ 2004]

Center-piece subgraph (CePS): input: the original graph with query (black) nodes; output: the CePS with its "CePS guy". Q: How to find a hub for the black nodes? A: proximity! [Tong+ KDD 2006]

Detailed outline: Problem dfn and motivation; Solution: random walk with restarts; Efficient computation; Case study: image auto-captioning; Extensions: bi-partite graphs, tracking; Conclusions

How close is A to B? They should be close if they have many, short, "heavy" paths between them

Why not shortest path? Some "bad" proximities: the "pizza delivery guy" problem

Why not max. netflow? Some "bad" proximities: no penalty for long paths

What is a "good" proximity? Multiple connections; quality of connection; direct & indirect connections; length, degree, weight

Random walk with restart [Haveliwala 02] (figure: a 12-node example graph, query node 4)

Random walk with restart: nearby nodes get higher scores ("more red, more relevant"): e.g., for query node 4, the ranking vector gives node 4 itself 0.22, its neighbors (nodes 1, 3, 5) 0.13 each, and faraway nodes (e.g., node 12) as little as 0.02

Why is RWR a good score? With the adjacency matrix and the damping factor c, the RWR score expands into a weighted sum over all paths from i to j: all paths of length 1, plus all paths of length 2, plus all paths of length 3, ..., with longer paths geometrically down-weighted

Detailed outline: Problem dfn and motivation; Solution: random walk with restarts + variants; Efficient computation; Case study: image auto-captioning; Extensions: bi-partite graphs, tracking; Conclusions

Variant: escape probability: Define a random walk (RW) on the graph. Esc_Prob(CMU -> Paris) = Prob(starting at CMU, the walk reaches Paris before returning to CMU) (figure: CMU, the remaining graph, Paris). Esc_Prob = Pr(smile before cry)

Other variants: other measures by RWs: commute time / hitting time [Fouss+]; SimRank [Jeh+]; equivalence of random walks; electric networks: EC [Doyle+], SAEC [Faloutsos+], CFEC [Koren+]; spring systems; Katz [Katz], [Huang+], [Scholkopf+]; matrix-forest-based alg. [Chebotarev+]

Other variants: all of the above are related or similar to random walk with restart!

Map of proximity measurements RWR Norma lize 4 ssp decides 1 esc_prob Katz Regularized Un-constrained Quad Opt. Esc_Prob + Sink Hitting Time/ Commute Time relax X out-degree Effective Conductance voltage = position Harmonic Func. Constrained Quad Opt. String System Physical Models Mathematic Tools Lipari 2010 (C) 2010, C. Faloutsos 138

Notice: asymmetry (even in undirected graphs). [Example graph on nodes A-E:] proximity C -> A is high, while A -> C is low.

Summary of proximity definitions. Goal: summarize multiple relationships. Solutions — basic: Random Walk with Restarts [Haveliwala 02] [Pan+ 2004] [Sun+ 2006] [Tong+ 2006]; properties: asymmetry [Koren+ 2006] [Tong+ 2007] [Tong+ 2008]; variants: Esc_Prob and many others [Faloutsos+ 2004] [Koren+ 2006] [Tong+ 2007].

Detailed outline: problem definition and motivation; solution: random walk with restarts; efficient computation; case study: image auto-captioning; extensions: bi-partite graphs, tracking; conclusions.

Reminder: PageRank. With probability 1-c, fly out to a random node. Then we have p = c B p + ((1-c)/n) 1, i.e., p = ((1-c)/n) (I - c B)^{-1} 1.

p = c B p + ((1-c)/n) 1 — here p is the ranking vector, B the adjacency matrix, and (1-c)/n · 1 the restart term. The only difference in RWR is the starting vector.

Computing RWR: p = c B p + (1-c) e, where p is the n x 1 ranking vector, B the n x n adjacency matrix, and e the n x 1 starting (restart) vector of the query node. [Figure: the 12-node example graph with the restart vector placing all mass on the query node.]
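The fixed-point equation above suggests a simple power iteration; a minimal sketch, assuming a column-normalizable adjacency matrix with no dangling nodes (the function name and toy graph are illustrative):

```python
import numpy as np

# Random walk with restart by fixed-point iteration: p = c*B*p + (1-c)*e.
def rwr(A, i, c=0.9, tol=1e-10, max_iter=1000):
    n = A.shape[0]
    B = A / A.sum(axis=0)              # column-normalize (assumes no zero-degree nodes)
    e = np.zeros(n)
    e[i] = 1.0                         # restart vector: all mass on query node i
    p = e.copy()
    for _ in range(max_iter):
        p_new = c * (B @ p) + (1 - c) * e
        if np.abs(p_new - p).sum() < tol:
            break
        p = p_new
    return p_new

A = np.array([[0, 1, 1, 0],
              [1, 0, 1, 0],
              [1, 1, 0, 1],
              [0, 0, 1, 0]], dtype=float)
p = rwr(A, i=0)
print(p.sum())   # the ranking vector sums to 1 (B is column-stochastic)
```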

Q: Given query node i, how do we solve for its ranking vector? (Known: the adjacency matrix and the starting vector; unknown: the ranking vector.)

On-the-fly: iterate until convergence for each query. [Figure: the 12-node example with its RWR scores.] No pre-computation / light storage, but slow on-line response: O(mE).

Pre-compute: store the proximity matrix Q once, off-line. [Figure: the 12-node example with its RWR scores.] Each on-line query then just reads off the pre-computed scores.

Pre-compute: fast on-line response, but heavy pre-computation (O(n^3) time) and storage (O(n^2)) cost.
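The pre-compute strategy, as a sketch (toy graph, illustrative only): invert once off-line, then each query is a single column lookup:

```python
import numpy as np

c = 0.9
A = np.array([[0, 1, 1, 0],
              [1, 0, 1, 0],
              [1, 1, 0, 1],
              [0, 0, 1, 0]], dtype=float)
B = A / A.sum(axis=0)                               # column-normalize

# Off-line: O(n^3) inversion, O(n^2) storage.
Q = (1 - c) * np.linalg.inv(np.eye(4) - c * B)

# On-line: the RWR ranking vector for query node i is just column i of Q.
p = Q[:, 0]
print(p)
```

Each column of Q is itself a valid RWR ranking vector, so the columns sum to 1.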

Q: How to balance off-line and on-line cost?

How to balance? Idea (B-Lin): break the graph into communities; pre-compute everything within each community; adjust (with Sherman-Morrison) for the bridge edges. H. Tong, C. Faloutsos, & J.Y. Pan. Fast Random Walk with Restart and Its Applications. ICDM, 613-622, 2006.

Detailed outline: problem definition and motivation; solution: random walk with restarts; efficient computation; case study: image auto-captioning; extensions: bi-partite graphs, tracking; conclusions.

GCap: automatic image captioning. Q: given captioned training images — {sea, sun, sky, wave}, {cat, forest, grass, tiger} — what caption {?, ?, ?} for a new image? A: proximity! [Pan+ KDD 2004]

[Figure: three-layer graph — image nodes, region nodes, and keyword nodes (sea, sun, sky, wave, cat, forest, tiger, grass) — with the test image linked through its regions.]

[Same graph: RWR from the test image ranks the keyword nodes; the top keywords {grass, forest, cat, tiger} become its caption.]

C-DEM (screen-shot)

C-DEM: Multi-Modal Query System for Drosophila Embryo Databases [Fan+ VLDB 2008]

Detailed outline: problem definition and motivation; solution: random walk with restarts; efficient computation; case study: image auto-captioning; extensions: bi-partite graphs, tracking; conclusions.

Problem: update. E edges change in a bipartite graph with n authors and m conferences — how to refresh the proximity scores cheaply?

Solution: Use the Sherman-Morrison lemma to quickly update the inverse matrix.
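A hedged sketch of the Sherman-Morrison lemma itself, on generic matrices (not the bipartite bookkeeping of the paper): a cached inverse can absorb a rank-1 change in O(n^2) instead of a fresh O(n^3) inversion:

```python
import numpy as np

# Sherman-Morrison: (M + u v^T)^{-1} = M^{-1} - (M^{-1} u v^T M^{-1}) / (1 + v^T M^{-1} u)
def sm_update(M_inv, u, v):
    Mu = M_inv @ u                   # O(n^2)
    vM = v @ M_inv                   # O(n^2)
    return M_inv - np.outer(Mu, vM) / (1.0 + v @ Mu)

n = 5
rng = np.random.default_rng(0)
M = np.eye(n) + 0.1 * rng.standard_normal((n, n))
M_inv = np.linalg.inv(M)             # cached inverse, computed once

# A single changed edge weight = a rank-1 perturbation (here: entry (1, 3) += 0.2).
u = np.zeros(n); u[1] = 0.2
v = np.zeros(n); v[3] = 1.0

fast = sm_update(M_inv, u, v)                     # O(n^2) update
slow = np.linalg.inv(M + np.outer(u, v))          # O(n^3) recomputation
print(np.allclose(fast, slow))                    # True
```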

Fast-Single-Update: [plot of log(time), in seconds, across datasets — our method achieves 176x and 40x speedups over recomputation.]

pTrack: Philip S. Yu's top-5 conferences up to each year. 1992: ICDE, ICDCS, SIGMETRICS, PDIS, VLDB (databases, performance, distributed systems). 1997: CIKM, ICDCS, ICDE, SIGMETRICS, ICMCS. 2002: KDD, SIGMOD, ICDM, CIKM, ICDCS. 2007: ICDM, KDD, ICDE, SDM, VLDB (databases, data mining). Data: DBLP (author x conference) — ~400k authors, ~3.5k conferences, 20 years.


[Plot: KDD's proximity rank w.r.t. VLDB over the years — data mining and databases are getting closer and closer.]

cTrack: 10 most influential authors in the NIPS community up to each year (e.g., T. Sejnowski, M. Jordan). Data: author x paper bipartite graph from NIPS 1987-1999 — 1740 papers, 2037 authors, spanning 13 years.

Conclusions — take-home messages. Proximity: definitions — RWR and a lot of variants. Computation: Sherman-Morrison lemma; fast incremental computation. Applications: recommendations; auto-captioning; tracking; center-piece subgraphs (next); e-mail management; anomaly detection, ...

References: L. Page, S. Brin, R. Motwani & T. Winograd. (1998) The PageRank Citation Ranking: Bringing Order to the Web. Technical report, Stanford Library. T.H. Haveliwala. (2002) Topic-Sensitive PageRank. In WWW, 517-526, 2002. J.Y. Pan, H.J. Yang, C. Faloutsos & P. Duygulu. (2004) Automatic multimedia cross-modal correlation discovery. In KDD, 653-658, 2004.

References: C. Faloutsos, K.S. McCurley & A. Tomkins. (2004) Fast discovery of connection subgraphs. In KDD, 118-127, 2004. J. Sun, H. Qu, D. Chakrabarti & C. Faloutsos. (2005) Neighborhood Formation and Anomaly Detection in Bipartite Graphs. In ICDM, 418-425, 2005. W. Cohen. (2007) Graph Walks and Graphical Models. Draft.

References: P. Doyle & J. Snell. (1984) Random Walks and Electric Networks, volume 22. Mathematical Association of America, New York. Y. Koren, S.C. North & C. Volinsky. (2006) Measuring and extracting proximity in networks. In KDD, 245-255, 2006. A. Agarwal, S. Chakrabarti & S. Aggarwal. (2006) Learning to rank networked entities. In KDD, 14-23, 2006.

References: S. Chakrabarti. (2007) Dynamic personalized PageRank in entity-relation graphs. In WWW, 571-580, 2007. F. Fouss, A. Pirotte, J.-M. Renders & M. Saerens. (2007) Random-Walk Computation of Similarities between Nodes of a Graph with Application to Collaborative Recommendation. IEEE Trans. Knowl. Data Eng. 19(3), 355-369, 2007.

References: H. Tong & C. Faloutsos. (2006) Center-piece subgraphs: problem definition and fast solutions. In KDD, 404-413, 2006. H. Tong, C. Faloutsos & J.Y. Pan. (2006) Fast Random Walk with Restart and Its Applications. In ICDM, 613-622, 2006. H. Tong, Y. Koren & C. Faloutsos. (2007) Fast direction-aware proximity for graph mining. In KDD, 747-756, 2007.

References: H. Tong, B. Gallagher, C. Faloutsos & T. Eliassi-Rad. (2007) Fast best-effort pattern matching in large attributed graphs. In KDD, 737-746, 2007. H. Tong, S. Papadimitriou, P.S. Yu & C. Faloutsos. (2008) Proximity Tracking on Time-Evolving Bipartite Graphs. In SDM, 2008.

References: B. Gallagher, H. Tong, T. Eliassi-Rad & C. Faloutsos. (2008) Using Ghost Edges for Classification in Sparsely Labeled Networks. In KDD, 2008. H. Tong, Y. Sakurai, T. Eliassi-Rad & C. Faloutsos. (2008) Fast Mining of Complex Time-Stamped Events. In CIKM, 2008. H. Tong, H. Qu & H. Jamjoom. (2008) Measuring Proximity on Graphs with Side Information. In ICDM, 2008.

Resources: www.cs.cmu.edu/~htong/soft.htm for software, papers, and ppt of presentations; www.cs.cmu.edu/~htong/tut/cikm2008/cikm_tutorial.html for the CIKM'08 tutorial on graphs and proximity. Again, thanks to Hanghang Tong for permission to use his foils in this part.

Outline: introduction and motivation; Task 1: node importance; Task 2: recommendations & proximity; Task 3: connection sub-graphs; conclusions.

Detailed outline: problem definition; solution; results. H. Tong & C. Faloutsos. Center-piece subgraphs: problem definition and fast solutions. In KDD, 404-413, 2006.

Center-Piece Subgraph (CePS). Given Q query nodes, find the center-piece subgraph. Input: the Q query nodes, a budget b, and the k_softAND number. Applications: social networks, law enforcement, gene networks. [Figure: query nodes A, B, C and the extracted center-piece connecting them.]

Challenges in CePS. Q1: How to measure importance? (Q2: How to extract the connection subgraph? Q3: How to do it efficiently?)

Challenges in CePS. Q1: How to measure importance? A: proximity! But how to combine scores? (Q2: How to extract the connection subgraph? Q3: How to do it efficiently?)

AND: Combine Scores. Q: How to combine scores?

AND: Combine Scores. Q: How to combine scores? A: multiply — the product equals the probability that 3 random particles (one per query node) coincide on node j.
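The multiply-and-rank idea can be sketched as follows (the score vectors are toy values, illustrative only):

```python
import numpy as np

# AND semantics: one RWR score vector per query node; the element-wise product
# is the probability that independent random particles coincide on each node.
def and_scores(score_vectors):
    combined = np.ones_like(score_vectors[0])
    for r in score_vectors:
        combined *= r          # element-wise product
    return combined

# toy RWR scores from three hypothetical query nodes, over three candidate nodes
r1 = np.array([0.5, 0.3, 0.2])
r2 = np.array([0.2, 0.5, 0.3])
r3 = np.array([0.1, 0.4, 0.5])
print(and_scores([r1, r2, r3]))   # node index 1 gets the highest combined score
```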

Detailed outline: problem definition; solution; results.

Case Study: AND query

Case Study: AND query (cont'd)

Conclusions: proximity (e.g., with RWR) helps answer AND and k_softAND queries.

Overall conclusions. SVD: a powerful tool — HITS/PageRank, dimensionality reduction. Proximity via random walk with restarts — recommendation systems, auto-captioning, center-piece subgraphs.