Preserving sparsity in dynamic network computations
Francesca Arrigo and Desmond J. Higham
Network Science meets Matrix Functions, Oxford, Sept. 1, 2016
This work was funded by the Engineering and Physical Sciences Research Council under grant EP/M00158X/1.
Overview
- Temporal networks
- Dynamic walks
- Dynamic communicability matrix
- Dynamic centralities
- Our goal
- One step at a time
- Less is more
- A little twist
- Cost comparison
Temporal networks
Evolving networks: a fixed set of nodes, with edges that appear and disappear as time goes by.
Ordered sequence of time points and unweighted graphs: $t_0 < t_1 < \cdots < t_M$ and $\{G^{[k]}\}_{k=0}^{M} = \{(V, E^{[k]})\}$.
The associated adjacency matrices of order $n$ are $\{A^{[k]}\}_{k=0}^{M} = \{(a^{[k]}_{ij})\}$, where
$$a^{[k]}_{ij} = \begin{cases} 1 & \text{if } (i,j) \in E^{[k]} \\ 0 & \text{otherwise.} \end{cases}$$
Dynamic walks
This extends to the temporal case the well-known concept of a walk of length $p$ in static networks.
Definition: A dynamic walk of length $p$ from node $i_1$ to node $i_{p+1}$ consists of a sequence of $p+1$ nodes $i_1, i_2, \ldots, i_{p+1}$ and a sequence of times $t_{r_1} \le t_{r_2} \le \cdots \le t_{r_p}$ such that $a^{[r_m]}_{i_m i_{m+1}} \ne 0$ for $m = 1, 2, \ldots, p$.
Time slots must be ordered, but need not be consecutive!
P. Grindrod, M. C. Parsons, D. J. Higham, and E. Estrada, Communicability across evolving networks, Phys. Rev. E 83 (2011).
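The definition can be checked mechanically. A minimal sketch (the helper name is ours, not from the talk) that tests whether a node sequence and a non-decreasing time sequence form a dynamic walk:

```python
import numpy as np

def is_dynamic_walk(nodes, times, adjacencies):
    """Check whether nodes i_1..i_{p+1} and time indices r_1, ..., r_p
    form a dynamic walk: times non-decreasing and a^{[r_m]}_{i_m i_{m+1}} != 0."""
    if any(t2 < t1 for t1, t2 in zip(times, times[1:])):
        return False  # time slots must be ordered
    return all(adjacencies[r][i, j] != 0
               for r, (i, j) in zip(times, zip(nodes, nodes[1:])))
```

For example, with an edge (0,1) present only at time 0 and an edge (1,2) present only at time 1, the walk 0 → 1 → 2 is dynamic, but 2 → 1 → 0 is not, because it would need to traverse the later edge first.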
Dynamic communicability matrix
The dynamic communicability matrix $Q^{[M]}$ is defined as
$$Q^{[M]} = (I - \alpha A^{[0]})^{-1} (I - \alpha A^{[1]})^{-1} \cdots (I - \alpha A^{[M]})^{-1},$$
where $0 < \alpha < 1/\rho$, with $\rho = \max_k \{\rho(A^{[k]})\}$ the largest among the spectral radii of the adjacency matrices.
It can also be defined iteratively as
$$Q^{[k]} = Q^{[k-1]} (I - \alpha A^{[k]})^{-1}, \qquad k = 0, 1, \ldots, M,$$
where $Q^{[-1]} = I$.
Its entries provide a weighted count of the dynamic walks between any two nodes in a temporal network.
P. Grindrod, M. C. Parsons, D. J. Higham, and E. Estrada, Communicability across evolving networks, Phys. Rev. E 83 (2011).
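As a concrete sketch of the iteration, using dense NumPy for clarity (the function name is ours; in practice the resolvents are never formed explicitly):

```python
import numpy as np

def dynamic_communicability(adjacencies, alpha):
    """Q^{[-1]} = I;  Q^{[k]} = Q^{[k-1]} (I - alpha A^{[k]})^{-1}."""
    n = adjacencies[0].shape[0]
    Q = np.eye(n)
    for A in adjacencies:
        # right-multiplying by the resolvent = solving (I - alpha A)^T X^T = Q^T
        Q = np.linalg.solve((np.eye(n) - alpha * A).T, Q.T).T
    return Q
```

Entry $(i,j)$ weights the dynamic walks from $i$ to $j$ that respect time ordering, so it is nonzero only when such a walk exists.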
[Spy plot of a time slice; nz = 151. Caption: average number of nonzeros in the adjacency matrices.]
Dynamic centralities
The dynamic receive centrality of node $i$ is defined as $r^{[k]}_i = \mathbf{1}^T Q^{[k]} e_i$. It takes large values for nodes that are effective at gathering information.
It satisfies a lower-dimensional, vector-valued iteration:
$$r^{[k]} := \mathbf{1}^T Q^{[k]} = r^{[k-1]} (I - \alpha A^{[k]})^{-1}, \qquad k = 0, 1, \ldots, M,$$
where $r^{[-1]} = \mathbf{1}^T$.
Why so? Because given a summary of how much information is flowing into each node, we can propagate this information forward when new edges emerge: receive centrality cares about where the information terminates.
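The vector iteration never forms $Q^{[k]}$. A minimal sketch (our own function name), keeping only a length-$n$ row vector:

```python
import numpy as np

def receive_centrality(adjacencies, alpha):
    """Row-vector iteration r^{[k]} = r^{[k-1]} (I - alpha A^{[k]})^{-1}, r^{[-1]} = 1^T."""
    n = adjacencies[0].shape[0]
    r = np.ones(n)
    for A in adjacencies:
        # r_new (I - alpha A) = r  <=>  (I - alpha A)^T r_new^T = r^T
        r = np.linalg.solve((np.eye(n) - alpha * A).T, r)
    return r
```

Each step is a single linear solve with a sparse, well-conditioned matrix, so the cost per time slice is far below that of a full matrix update.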
The dynamic broadcast centrality of node $i$ is defined as $b^{[k]}_i = e_i^T Q^{[k]} \mathbf{1}$. It takes large values for nodes that are effective at distributing information.
We need access to the current dynamic communicability matrix at each step to update $b^{[k]}$: expensive in both storage and computation.
Why so? Because a summary of how much information is flowing out of each node cannot be straightforwardly updated when new edges emerge: broadcast centrality cares about where the information originates.
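For contrast, a direct computation of $b^{[M]}$ must carry the full matrix (a sketch with our own function name; this is exactly the cost the talk sets out to avoid):

```python
import numpy as np

def broadcast_centrality(adjacencies, alpha):
    """b^{[M]} = Q^{[M]} 1 via the full matrix iteration:
    O(n^2) storage and O(n^3) work per time step, since Q fills in."""
    n = adjacencies[0].shape[0]
    Q = np.eye(n)
    for A in adjacencies:
        Q = Q @ np.linalg.inv(np.eye(n) - alpha * A)
    return Q @ np.ones(n)
```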
Our goal
Address this issue by deriving a new algorithm that delivers good approximations to the original dynamic broadcast centrality measure whilst retaining the benefits of the sparsity present in the time slices.
One step at a time
In describing $Q^{[M]}$ we are allowing some traversals that are not really meaningful, such as $i \to j \to i \to j \to i$ within a single time step.
First: use an "at most one step per time point" alternative to the original iteration,
$$\widehat{Q}^{[k]} = \frac{\widehat{Q}^{[k-1]} (I + \alpha A^{[k]})}{\left\| \widehat{Q}^{[k-1]} (I + \alpha A^{[k]}) \right\|_2}, \qquad k = 0, 1, \ldots, M,$$
with $\widehat{Q}^{[-1]} = I$ and $0 < \alpha < 1/\rho$ as before.
This matrix will still fill in as $k$ increases.
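A sketch of this first modification (our own function name; we read the normalization as the matrix 2-norm, as the slide layout suggests):

```python
import numpy as np

def one_step_communicability(adjacencies, alpha):
    """hat-Q^{[k]} = hat-Q^{[k-1]} (I + alpha A^{[k]}) / || ... ||_2, hat-Q^{[-1]} = I."""
    n = adjacencies[0].shape[0]
    Q = np.eye(n)
    for A in adjacencies:
        Q = Q @ (np.eye(n) + alpha * A)  # at most one step per time point
        Q /= np.linalg.norm(Q, 2)        # normalization keeps entries from blowing up
    return Q
```

Replacing the resolvent $(I - \alpha A^{[k]})^{-1}$ with $I + \alpha A^{[k]}$ discards repeated back-and-forth traversals within one time slice, while time-respecting walk information is preserved.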
Less is more
We need to remove all those entries that are too small to be meaningful.
Second: threshold the matrix at level $\theta_k$,
$$\widehat{Q}^{[k]} = \frac{\left\lfloor \widehat{Q}^{[k-1]} (I + \alpha A^{[k]}) \right\rfloor_{\theta_k}}{\left\| \left\lfloor \widehat{Q}^{[k-1]} (I + \alpha A^{[k]}) \right\rfloor_{\theta_k} \right\|_2},$$
with $\widehat{Q}^{[-1]} = I$ and $0 < \alpha < 1/\rho$ as before.
The function $\lfloor C \rfloor_{\theta_k}$ sets to zero all the entries of the matrix $C \ge 0$ that are smaller than $\theta_k$.
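A sketch of the thresholded iteration (our own function names; the talk chooses $\theta_k$ adaptively at each step, whereas here a single fixed level is used for simplicity):

```python
import numpy as np

def threshold(C, theta):
    """Floor-bracket operation: zero out entries of C >= 0 smaller than theta."""
    C = C.copy()
    C[C < theta] = 0.0
    return C

def sparsified_communicability(adjacencies, alpha, theta):
    """Thresholded, normalized one-step update; keeps hat-Q sparse."""
    n = adjacencies[0].shape[0]
    Q = np.eye(n)
    for A in adjacencies:
        Q = threshold(Q @ (np.eye(n) + alpha * A), theta)
        Q /= np.linalg.norm(Q, 2)
    return Q
```

Since all quantities are nonnegative, every nonzero of the thresholded iterate is also a nonzero of the unthresholded one, so the sparsity pattern can only shrink.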
Thresholding parameters $\theta_k$: selected at each step in order to achieve the desired sparsity in $\widehat{Q}^{[k]}$.
There are two key circumstances where the thresholding has an effect:
- the value of $\alpha^p$ dominates the contribution given by the products of the adjacency matrices, i.e., there are not too many walks of length $p$ between the two nodes under consideration;
- the information has not moved from a certain node for a long time and the normalization step has made the corresponding contribution smaller than the other entries.
We are dismissing information that has little potential.
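One simple way to select $\theta_k$ for a target sparsity (our own helper, not necessarily the authors' exact rule) is to keep only the largest entries up to a nonzero budget:

```python
import numpy as np

def theta_for_budget(C, max_nnz):
    """Smallest threshold leaving at most ~max_nnz nonzeros in C >= 0
    (ties at the cutoff value may keep a few extra entries)."""
    vals = np.sort(C[C > 0].ravel())[::-1]  # nonzero entries, largest first
    if vals.size <= max_nnz:
        return 0.0
    # zeroing entries strictly below the max_nnz-th largest keeps that many
    return vals[max_nnz - 1]
```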
We might zero out rows that become important only later in time.
A little twist
We need to keep track of the activity of nodes that have been inactive for a long time or have not started sending out information yet.
Third: add a little information,
$$\widehat{Q}^{[k]} = \frac{\left\lfloor \widehat{Q}^{[k-1]} (I + \alpha A^{[k]}) \right\rfloor_{\theta_k} + m_k \widehat{A}^{[k]}}{\left\| \left\lfloor \widehat{Q}^{[k-1]} (I + \alpha A^{[k]}) \right\rfloor_{\theta_k} + m_k \widehat{A}^{[k]} \right\|_2},$$
with $\widehat{Q}^{[-1]} = I$ and $0 < \alpha < 1/\rho$ as before. Here $m_k$ is the smallest nonzero entry of $\lfloor \widehat{Q}^{[k-1]} (I + \alpha A^{[k]}) \rfloor_{\theta_k}$, $\widehat{A}^{[k]} = \alpha W^{[k]} A^{[k]}$, and $W^{[k]} = \mathrm{diag}(w_1, w_2, \ldots, w_n) \in \mathbb{R}^{n \times n}$ with
$$w_i = \begin{cases} 1 & \text{if } e_i^T \lfloor \widehat{Q}^{[k-1]} (I + \alpha A^{[k]}) \rfloor_{\theta_k} \mathbf{1} = 0 \\ 0 & \text{otherwise.} \end{cases}$$
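One step of this update can be sketched as follows (our own function name; for simplicity it assumes the thresholded matrix is not identically zero):

```python
import numpy as np

def twist_update(Q_prev, A, alpha, theta):
    """One step of the sparsified update with zero-row reinjection:
    rows silenced by thresholding get their current edges back, scaled by m_k."""
    n = Q_prev.shape[0]
    P = Q_prev @ (np.eye(n) + alpha * A)
    P[P < theta] = 0.0                       # floor-bracket at theta
    m = P[P > 0].min()                       # m_k: smallest surviving nonzero entry
    w = (P.sum(axis=1) == 0).astype(float)   # w_i = 1 iff row i is all zero
    P = P + m * alpha * (w[:, None] * A)     # hat-A^{[k]} = alpha W^{[k]} A^{[k]}
    return P / np.linalg.norm(P, 2)
```

A node whose row was wiped out (or that has only just become active) thus re-enters with a small but nonzero footprint, so it can still accumulate broadcast weight later.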
Cost comparison
Assume there is a bounded number of nonzeros per row in each $A^{[k]}$. Computational benefits of using $\widehat{Q}^{[k]}$ instead of $Q^{[k]}$ to compute the dynamic broadcast centrality:
1) storage requirements are reduced by a factor of $n$ at each time step;
2) the dominant computational task at each time step is reduced by a factor of $n$.
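To see why, a small scipy.sparse sketch (illustrative sizes, not from the talk): when each time slice has a bounded number of nonzeros per row, the sparse update touches only the stored entries, so per-step storage growth is proportional to nnz rather than to $n^2$:

```python
import scipy.sparse as sp

n = 1000
# ~3 nonzeros per row, mimicking a sparse time slice
A = sp.random(n, n, density=3.0 / n, format="csr", random_state=1)
I = sp.identity(n, format="csr")

Q = I.copy()
Q = Q @ (I + 0.1 * A)        # sparse-sparse product: work proportional to nnz
assert Q.nnz <= n + A.nnz    # storage grows by at most nnz(A), not by n^2
```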
n = 151, M + 1 = 1138, α =
Average number of nonzeros per time stamp:
Maximum number of nonzeros allowed in $\widehat{Q}^{[k]}$: nz =
$Q^{[M]}$ is 92.5% full.
J. Leskovec, SNAP: Network datasets.
[Plots: evolution of $\theta_k$ and of $\mathrm{nnz}(\widehat{Q}^{[k]})$ over time.]
[Spy plot; nz = 1676.] $\widehat{Q}^{[M]}$ is 7.4% full.
[Table: top 10 ranked nodes under $Q^{[M]}$ and $\widehat{Q}^{[M]}$; entries not recovered.]
n = 106, M + 1 = 365, α =
Undirected layers.
Average number of nonzeros per time stamp:
Maximum number of nonzeros allowed in $\widehat{Q}^{[k]}$: nz =
$Q^{[M]}$ is 100% full.
N. Eagle and A. S. Pentland, Reality mining: sensing complex social systems, Personal and Ubiquitous Computing, 10 (2006).
[Plots: evolution of $\theta_k$ and of $\mathrm{nnz}(\widehat{Q}^{[k]})$ over time.]
[Spy plot; nz = 2583.] $\widehat{Q}^{[M]}$ is 23% full.
[Table: top 10 ranked nodes under $Q^{[M]}$ and $\widehat{Q}^{[M]}$; entries not recovered.]
We derived a sparsification technique that delivers accurate approximations to the full-matrix centrality rankings, while retaining the level of sparsity present in the network time slices.
With this new algorithm, as we move forward in time the storage cost remains fixed and the computational cost scales linearly.
Thank you. Questions?
Math/Phys/Engr 428, Math 529/Phys 528 Numerical Methods - Summer 28. (Vector and Matrix Norms) Homework 3 Due: Tuesday, July 3, 28 Show that the l vector norm satisfies the three properties (a) x for x
More information12. Cholesky factorization
L. Vandenberghe ECE133A (Winter 2018) 12. Cholesky factorization positive definite matrices examples Cholesky factorization complex positive definite matrices kernel methods 12-1 Definitions a symmetric
More informationA New Space for Comparing Graphs
A New Space for Comparing Graphs Anshumali Shrivastava and Ping Li Cornell University and Rutgers University August 18th 2014 Anshumali Shrivastava and Ping Li ASONAM 2014 August 18th 2014 1 / 38 Main
More informationOnline Social Networks and Media. Link Analysis and Web Search
Online Social Networks and Media Link Analysis and Web Search How to Organize the Web First try: Human curated Web directories Yahoo, DMOZ, LookSmart How to organize the web Second try: Web Search Information
More informationRobust Principal Component Analysis
ELE 538B: Mathematics of High-Dimensional Data Robust Principal Component Analysis Yuxin Chen Princeton University, Fall 2018 Disentangling sparse and low-rank matrices Suppose we are given a matrix M
More informationNon-backtracking PageRank. Arrigo, Francesca and Higham, Desmond J. and Noferini, Vanni. MIMS EPrint:
Non-backtracking PageRank Arrigo, Francesca and Higham, Desmond J. and Noferini, Vanni 2018 MIMS EPrint: 2018.29 Manchester Institute for Mathematical Sciences School of Mathematics The University of Manchester
More informationLecture: Local Spectral Methods (1 of 4)
Stat260/CS294: Spectral Graph Methods Lecture 18-03/31/2015 Lecture: Local Spectral Methods (1 of 4) Lecturer: Michael Mahoney Scribe: Michael Mahoney Warning: these notes are still very rough. They provide
More informationApplications of Matrix Functions to Network Analysis and Quantum Chemistry Part I: Complex Networks
Applications of Matrix Functions to Network Analysis and Quantum Chemistry Part I: Complex Networks Michele Benzi Department of Mathematics and Computer Science Emory University Atlanta, Georgia, USA SSMF2013,
More informationDistributed Consensus and Linear Functional Calculation in Networks: An Observability Perspective
Distributed Consensus and Linear Functional Calculation in Networks: An Observability Perspective ABSTRACT Shreyas Sundaram Coordinated Science Laboratory and Dept of Electrical and Computer Engineering
More informationECEN 689 Special Topics in Data Science for Communications Networks
ECEN 689 Special Topics in Data Science for Communications Networks Nick Duffield Department of Electrical & Computer Engineering Texas A&M University Lecture 8 Random Walks, Matrices and PageRank Graphs
More informationEdge-Weighted Personalized PageRank: Breaking a Decade-Old Performance Barrier
Edge-Weighted Personalized PageRank: Breaking a Decade-Old Performance Barrier W. Xie D. Bindel A. Demers J. Gehrke 12 Aug 2015 W. Xie, D. Bindel, A. Demers, J. Gehrke KDD2015 12 Aug 2015 1 / 1 PageRank
More informationGraph Sparsification III: Ramanujan Graphs, Lifts, and Interlacing Families
Graph Sparsification III: Ramanujan Graphs, Lifts, and Interlacing Families Nikhil Srivastava Microsoft Research India Simons Institute, August 27, 2014 The Last Two Lectures Lecture 1. Every weighted
More informationACO Comprehensive Exam March 20 and 21, Computability, Complexity and Algorithms
1. Computability, Complexity and Algorithms Part a: You are given a graph G = (V,E) with edge weights w(e) > 0 for e E. You are also given a minimum cost spanning tree (MST) T. For one particular edge
More informationA Note on Google s PageRank
A Note on Google s PageRank According to Google, google-search on a given topic results in a listing of most relevant web pages related to the topic. Google ranks the importance of webpages according to
More informationFaloutsos, Tong ICDE, 2009
Large Graph Mining: Patterns, Tools and Case Studies Christos Faloutsos Hanghang Tong CMU Copyright: Faloutsos, Tong (29) 2-1 Outline Part 1: Patterns Part 2: Matrix and Tensor Tools Part 3: Proximity
More informationMulti-Robotic Systems
CHAPTER 9 Multi-Robotic Systems The topic of multi-robotic systems is quite popular now. It is believed that such systems can have the following benefits: Improved performance ( winning by numbers ) Distributed
More informationLeveraging Big Data: Lecture 13
Leveraging Big Data: Lecture 13 http://www.cohenwang.com/edith/bigdataclass2013 Instructors: Edith Cohen Amos Fiat Haim Kaplan Tova Milo What are Linear Sketches? Linear Transformations of the input vector
More informationMarkov Chains, Random Walks on Graphs, and the Laplacian
Markov Chains, Random Walks on Graphs, and the Laplacian CMPSCI 791BB: Advanced ML Sridhar Mahadevan Random Walks! There is significant interest in the problem of random walks! Markov chain analysis! Computer
More informationUncertainty and Randomization
Uncertainty and Randomization The PageRank Computation in Google Roberto Tempo IEIIT-CNR Politecnico di Torino tempo@polito.it 1993: Robustness of Linear Systems 1993: Robustness of Linear Systems 16 Years
More informationc 2016 Society for Industrial and Applied Mathematics
SIAM J. MATRIX ANAL. APPL. Vol. 37, No., pp. 443 468 c 06 Society for Industrial and Applied Mathematics EDGE MODIFIATION RITERIA FOR ENHANING THE OMMUNIABILITY OF DIGRAPHS FRANESA ARRIGO AND MIHELE BENZI
More informationIntroduction to Compressed Sensing
Introduction to Compressed Sensing Alejandro Parada, Gonzalo Arce University of Delaware August 25, 2016 Motivation: Classical Sampling 1 Motivation: Classical Sampling Issues Some applications Radar Spectral
More information8.1 Concentration inequality for Gaussian random matrix (cont d)
MGMT 69: Topics in High-dimensional Data Analysis Falll 26 Lecture 8: Spectral clustering and Laplacian matrices Lecturer: Jiaming Xu Scribe: Hyun-Ju Oh and Taotao He, October 4, 26 Outline Concentration
More information