Part 1: Symmetric rendezvous (Les rendez-vous symétriques)
1 Part 1: Symmetric rendezvous (Les rendez-vous symétriques). Cris Moore, Santa Fe Institute. Joint work with Alex Russell, University of Connecticut.
2 A walk in the park. There are n locations in a park; you and I each call them 1,...,n, but our labels differ by a uniformly random permutation π. At each step, we can each move wherever we like (a complete graph). Our initial positions are uniformly random, and we can't signal or leave messages. How can we minimize the expected time for us to rendezvous?
3 Wait for Mommy! It's optimal if you stay put and I visit 1,...,n (in any order): the expected time is (n+1)/2. At time t, you visit x_t and I visit y_t; then t_rend = min {t : x_t = π(y_t)}. No matter what, Pr_π[x_t = π(y_t)] = 1/n, and here the union bound is tight:
E[t_rend] = \sum_{t=0}^{\infty} \Pr[t_rend > t] = \sum_{t=0}^{\infty} \Pr[x_1 \ne \pi(y_1) \wedge \cdots \wedge x_t \ne \pi(y_t)] = \sum_{t=0}^{n} \left(1 - \frac{t}{n}\right) = \frac{n+1}{2}.
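The (n+1)/2 claim is easy to check numerically. This is a small Monte Carlo sketch (not from the slides; the function name and trial count are mine): one player waits at a uniform location while the other sweeps all n locations in some order.

```python
import random

def wait_for_mommy(n, trials=20000, rng=random.Random(1)):
    """Estimate E[t_rend] when one player stays put and the other sweeps
    all n locations. By symmetry, the waiter's location (in the sweeper's
    labels) is simply uniform, so the sweep order doesn't matter."""
    total = 0
    for _ in range(trials):
        waiter = rng.randrange(n)          # uniform hidden location
        sweep = list(range(n))
        rng.shuffle(sweep)                 # any sweep order works
        total += sweep.index(waiter) + 1   # steps until we meet
    return total / trials

n = 10
est = wait_for_mommy(n)
print(est)  # should be close to (n+1)/2 = 5.5
```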
4 Symmetric strategies. What if we haven't agreed on who should wander and who should wait? Then we must each follow the same randomized strategy. E.g. if we both wander, visiting uniform locations, at each step we meet with probability 1/n; these events are independent, so E[t_rend] = n. Can we do better? We need negative correlations:
\Pr\left[x_{t+1} = \pi(y_{t+1}) \mid x_1 \ne \pi(y_1) \wedge \cdots \wedge x_t \ne \pi(y_t)\right] > \frac{1}{n}.
5 Anderson and Weber [1990]. Flip a biased coin, and either wait or wander in each round of n steps: with probability θ, choose a uniform location and stay there; with probability 1 − θ, choose a uniform permutation σ and visit σ(1),...,σ(n). Then optimize over θ.
6 Analyzing Anderson and Weber. If one of us wanders and the other waits, we meet in expected time n/2. If we both wander, we meet at the first t such that σ_x(t) = π(σ_y(t)): the first fixed point (if any) of σ_x^{-1} π σ_y. When n is large, the number q of fixed points of a random permutation is Poisson with mean 1. If there are q fixed points, we'll find the first one in expected time n/(q+1); so if q ≥ 1, we'll meet in expected time
n \, \frac{1}{1 - 1/e} \sum_{q=1}^{\infty} \frac{e^{-1}}{q!} \, \frac{1}{q+1} = n \, \frac{e-2}{e-1}.
7 Analyzing Anderson and Weber. Putting all this together, we have
E[t_rend] = 2\theta(1-\theta) \, \frac{n}{2} + (1-\theta)^2 \, \frac{e-2}{e} \, n + \left(\theta^2 + \frac{(1-\theta)^2}{e}\right)(n + E[t_rend]),
optimized over θ, giving E[t_rend] = 0.829n. Is this optimal? We don't know... but we can prove that there is a β > 1/2 such that for any symmetric strategy, E[t_rend] > βn.
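A direct simulation of the Anderson–Weber strategy agrees with the 0.829n figure for moderate n. This sketch is mine (the function name, seed, and trial count are assumptions), and it simulates the actual dynamics round by round:

```python
import random

def anderson_weber(n, theta, trials=4000, rng=random.Random(2)):
    """Estimate E[t_rend] for the Anderson-Weber strategy: in each round
    of n steps, each player independently waits at a uniform location
    (with probability theta) or sweeps a uniform random permutation."""
    total = 0
    for _ in range(trials):
        pi = list(range(n))
        rng.shuffle(pi)            # your labels, as seen through mine
        t = 0
        met = False
        while not met:
            plans = []
            for _ in range(2):     # one plan per player, per round
                if rng.random() < theta:
                    plans.append([rng.randrange(n)] * n)   # wait
                else:
                    sweep = list(range(n))
                    rng.shuffle(sweep)                     # wander
                    plans.append(sweep)
            for step in range(n):
                t += 1
                if plans[0][step] == pi[plans[1][step]]:
                    met = True
                    break
        total += t
    return total / trials

n = 20
est = anderson_weber(n, theta=0.25)
print(est / n)  # near the asymptotic 0.829, up to finite-size effects
```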
8 A clever failure. You and I independently choose random primes p, q = O(√n). We choose random subsets of {1,...,n} of size p and q respectively, and cycle through them deterministically. With high probability p and q are distinct, so we visit all pq = O(n) pairs, and rendezvous with constant probability (birthdays!). The expected time for this to work is... wait for it... n. The events that π hits each of these pq pairs are nearly independent: we failed to create correlations so that the conditional probability of meeting increases.
9 Rendezvous and permanents. The permanent of a 0/1 matrix is the number of permutations in its support:
\mathrm{perm}\, A = \sum_{\pi} \prod_i A_{i,\pi(i)}.
Let S_t ⊆ {1,...,n} × {1,...,n} be the set of pairs we visited in the first t steps, and let J_t be the matrix
(J_t)_{ij} = 0 if (i,j) ∈ S_t, and 1 otherwise.
The probability we haven't met yet is the probability that π is in the support of J_t:
E[t_rend] = \sum_{t=0}^{\infty} \Pr[t_rend > t] = \sum_{t=0}^{\infty} \frac{\mathrm{perm}\, J_t}{n!}.
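For tiny n the permanent can be checked by brute force. This small sketch (my own illustration, not from the slides) verifies the identity on the "wait for mommy" strategy, where after t steps one row of J_t has t zeros:

```python
from itertools import permutations
from math import factorial

def perm01(J):
    """Permanent of a 0/1 matrix by brute force (fine for tiny n):
    the number of permutations p with J[i][p[i]] = 1 for all i."""
    n = len(J)
    return sum(all(J[i][p[i]] for i in range(n)) for p in permutations(range(n)))

# "Wait for mommy": you sit at row y while I sweep columns 0,...,t-1,
# so row y of J_t has zeros in its first t columns,
# and perm(J_t)/n! should equal 1 - t/n exactly.
n, t, y = 5, 3, 2
J = [[0 if (i == y and j < t) else 1 for j in range(n)] for i in range(n)]
frac = perm01(J) / factorial(n)
print(frac)  # 1 - t/n = 0.4
```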
10 How fast can we drive down the permanent? When you wait for mommy, perm J_t/n! = 1 − t/n, and the union bound is tight.
22 How fast can we drive down the permanent? When we both wander, perm J_t/n! ≈ (1 − 1/n)^t ≈ e^{−t/n}, and each step is independent. After n steps, all we know is that π is a derangement: perm J_t/n! ≈ 1/e.
23 Inclusion and exclusion. An arrangement is a subset of S_t in which no two pairs share a row or column. There are (n−k)! permutations π that include a given arrangement of size k. Let A_k = the number of arrangements of size k. The fraction of permutations that avoid all of them is
\frac{\mathrm{perm}\, J_t}{n!} = \sum_{k=0}^{n} (-1)^k \frac{(n-k)!}{n!} A_k = \sum_{k=0}^{n} \frac{(-1)^k}{k!} \frac{A_k}{\binom{n}{k}} = 1 - \frac{A_1}{n} + \frac{A_2}{n(n-1)} - \frac{A_3}{n(n-1)(n-2)} + \cdots
For instance, if t = n and S_t is the diagonal, then A_k = \binom{n}{k} and \mathrm{perm}\, J_t/n! = \sum_{k=0}^{n} \frac{(-1)^k}{k!} \approx \frac{1}{e}.
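The diagonal example is exactly the derangement count, which is easy to verify for small n. A quick check of my own (both quantities are computed independently):

```python
from math import comb, factorial
from itertools import permutations

# Diagonal check: t = n, S_t = {(i,i)}, so A_k = C(n,k) and the
# inclusion-exclusion sum is the fraction of derangements, near 1/e.
n = 6
ie_sum = sum((-1) ** k * comb(n, k) * factorial(n - k) for k in range(n + 1)) / factorial(n)

# Count derangements directly: permutations with no fixed point.
derangements = sum(all(p[i] != i for i in range(n)) for p in permutations(range(n)))

print(ie_sum, derangements / factorial(n))  # both 265/720 = 0.36805...
```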
24 A lower bound. Cut inclusion-exclusion off at third order:
\frac{\mathrm{perm}\, J_t}{n!} \ge 1 - \frac{A_1}{n} + \frac{A_2}{n(n-1)} - \frac{A_3}{n(n-1)(n-2)} \ge 1 - \frac{|S_t|}{n} + \frac{A_2}{n(n-1)} \left(1 - \frac{|S_t|}{3(n-2)}\right) \approx 1 - \frac{|S_t|}{n} + \frac{A_2}{n^2} \left(1 - \frac{t}{3n}\right),
where we used A_1 = |S_t| \le t and A_3 \le |S_t| A_2 / 3. Hope: either S_t is small, or A_2 is large (in expectation); then in either case, we do better than 1 − t/n.
25 Arrangements in space and time. A_2 = the number of pairs of elements of S_t that aren't in the same row or column. Let T_2 = the number of pairs of times t < t' at which both players visit distinct places, and let X_{t,t'} and Y_{t,t'} be the indicator random variables for the events x_t ≠ x_{t'} and y_t ≠ y_{t'}. Then
T_2 = \sum_{t < t'} X_{t,t'} Y_{t,t'}.
Let Q_{t,t'} = E[X_{t,t'}] = \Pr[x_t \ne x_{t'}]. Since the players are identical and independent,
E[T_2] = \sum_{t < t'} Q_{t,t'}^2.
A crude bound on how much T_2 overcounts A_2: T_2 - A_2 \le \binom{t}{2} - \binom{|S_t|}{2}.
26 Arrangements in space and time. T_2 = the number of pairs of times t < t' at which both players visit distinct places, so
E[T_2] = \sum_{t < t'} Q_{t,t'}^2, \quad \text{and} \quad E\binom{|S_t|}{2} \le \sum_{t < t'} \left(2 Q_{t,t'} - Q_{t,t'}^2\right).
We can relate these with Cauchy–Schwarz, giving
E[T_2] = \sum_{t < t'} Q_{t,t'}^2 \ge \frac{1}{\binom{t}{2}} \left(\sum_{t < t'} Q_{t,t'}\right)^2 \ge \frac{1}{\binom{t}{2}} \left(\frac{1}{2} E\binom{|S_t|}{2}\right)^2.
If S_t is large, then T_2 is too.
27 Putting it all together. Combining all these bounds and writing E|S_t| = αt and t = τn gives a lower bound on E[perm J_t/n!] as a function of α and τ; the union bound gives E[perm J_t/n!] ≥ 1 − τ. For each τ, we optimize over α and take whichever bound is better. Integrating over τ gives E[t_rend] ≥ βn + o(n), where β is the area under the curve: we get β = 0.548 (vs. 0.829 for Anderson–Weber).
28 What I think of this proof
29 A better parametrization? The Anderson–Weber strategy has two extremes: wander or stay put. The probability we don't meet after t steps is either 1 − t/n or e^{−t/n} (or 1). How can we show that we often have e^{−t/n}? Lemma: let |S_t| = t and τ = t/n, and suppose S_t has the property that two pairs (i,j), chosen uniformly without replacement, collide on a row or column with probability c. Then perm J_t/n! obeys a bound in terms of c and τ that interpolates between the two extremes: a good bound when c is small, while when c = 1 the union bound 1 − τ is tight. Proof: use
\frac{\mathrm{perm}\, J_t}{n!} = \sum_{k=0}^{t} \frac{(-1)^k}{k!} \frac{A_k}{\binom{n}{k}}
and a lower bound on A_k for even k. Goal: prove that it's one or the other, so that Anderson–Weber is optimal.
30 Variations on a theme. We can put a graph structure on the locations. Suppose you and I are walking on a cycle: our initial positions are uniform, and so are our orientations. Conjecture [Alpern]: the optimal strategy is to flip a coin to choose a direction, travel halfway around the cycle, then try again. Lattices? Hypercubes? Cayley graphs?
31 Part 2: Spectral clustering for sparse graphs. Joint work with Florent Krzakala, Elchanan Mossel, Joe Neeman, Allan Sly, Lenka Zdeborová, and Pan Zhang.
32 The stochastic block model. Suppose a graph G is generated as follows: divide n vertices into two groups of n/2 vertices each; for each pair (u,v), connect them with probability c_in/n if they are in the same group, and with probability c_out/n if they are in different groups. The average degree is c = (c_in + c_out)/2. Given just the graph, label the vertices according to their group!
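The generative model above takes only a few lines of code. A minimal sampler of my own (the function name and seed are assumptions), returning an edge list:

```python
import random

def sbm(n, c_in, c_out, rng=random.Random(0)):
    """Sample a two-group stochastic block model: groups are
    {0,...,n/2-1} and {n/2,...,n-1}; edge probability is c_in/n
    within a group and c_out/n across groups."""
    edges = []
    for u in range(n):
        for v in range(u + 1, n):
            same = (u < n // 2) == (v < n // 2)
            p = (c_in if same else c_out) / n
            if rng.random() < p:
                edges.append((u, v))
    return edges

edges = sbm(1000, 5.0, 1.0)
# average degree should be near c = (c_in + c_out)/2 = 3
print(2 * len(edges) / 1000)
```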
33 Warning: overfitting! The most likely group assignment, or MAP (maximum a posteriori) estimate, maximizes the probability of the graph, but this overfits: e.g. random 3-regular graphs have bisections with only 11% of the edges crossing the cut [Zdeborová & Boettcher]. Indeed, there are many such bisections, and they have nothing in common! Can we even distinguish G from an Erdős–Rényi graph?
34 A phase transition. Theorem: we can label the vertices better than chance if and only if c_in − c_out > 2√c. Conjectured by Decelle, Krzakala, Moore, and Zdeborová [2011] based on the cavity method; proved on the negative side by Mossel, Neeman, and Sly [2012], and on the positive side by Massoulié and MNS [2013]. [figure: overlap vs. c_out/c_in for q = 4, c = 16, from belief propagation and Monte Carlo at several sizes (N = 100k BP, N = 70k MC, N = 128 MC, N = 128 full BP); overlap drops to chance in the undetectable regime, between the Erdős–Rényi point and strong communities]
35 Clustering nodes with eigenvalues. Take your favorite linear operator associated with a graph: the adjacency matrix, the graph Laplacian, the stochastic transition matrix. If there are 2 groups, label nodes according to the sign of the 2nd eigenvector; if there are k groups, look at the first k eigenvectors, and use your favorite clustering algorithm in R^k.
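For a dense enough block model, the two-group recipe above works directly on the adjacency matrix. A sketch of my own using numpy (the sizes and parameters are chosen for illustration, well above the transition so the second eigenvector is clearly informative):

```python
import numpy as np

rng = np.random.default_rng(0)
n, c_in, c_out = 200, 40.0, 10.0      # dense enough for the adjacency matrix
group = np.array([0] * (n // 2) + [1] * (n // 2))

# Sample a symmetric SBM adjacency matrix.
p = np.where(group[:, None] == group[None, :], c_in / n, c_out / n)
upper = np.triu(rng.random((n, n)) < p, k=1)
A = (upper | upper.T).astype(float)

# Label nodes by the sign of the eigenvector of the 2nd largest eigenvalue.
vals, vecs = np.linalg.eigh(A)        # eigenvalues in ascending order
v2 = vecs[:, -2]
labels = (v2 > 0).astype(int)

# Agreement with the truth, up to swapping the two group names.
agree = max((labels == group).mean(), (labels != group).mean())
print(agree)  # well above the 0.5 chance level
```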
36 When does this work? Using tools from random matrix theory, we can compute the typical spectrum of the adjacency matrix of a graph generated by the stochastic block model. The bulk ρ(λ) follows the Wigner semicircle law on [−2√c, 2√c]. The communities are detectable as long as λ_2 lies outside this bulk... and λ_2 crosses into the bulk exactly at the detectability transition! [Nadakuditi and Newman, Phys. Rev. Lett. 2012]
37 But in the sparse case... If v has degree d, a walk of two steps can leave v and return in d ways, so A has an eigenvector localized at v with eigenvalue at least √d. These localized eigenvalues deviate from the semicircle law: informative eigenvectors get lost in the bulk.
38 The non-backtracking operator. B is a walk on the directed edges of the network, with backtracking prohibited: this prevents paths from returning to a high-degree vertex. It appears that the bulk of B's spectrum is confined to a disk of radius √c, and spectral clustering with B works all the way down to the detectability transition.
39
40 Comparing with standard spectral methods. [figure: overlap vs. c_in − c_out for spectral algorithms based on the non-backtracking matrix, modularity matrix, random-walk matrix, adjacency matrix, and Laplacian, and for belief propagation, at n = 10^5 with average degree c = 3; each point is averaged over 20 instances; the detectability transition occurs at c_in − c_out = 2√c] [Krzakala, Moore, Mossel, Neeman, Sly, Zdeborová, Zhang, PNAS 2013]
41 A sharp boundary for the bulk? Conjecture: let G = G(n, c/n). For any ε > 0, with high probability, except for the leading eigenvalue λ_1 ≈ c, all eigenvalues of B have absolute value at most √c + ε. A simple argument shows that almost all eigenvalues are in this disk: since eigenvalues are at most singular values in absolute value,
\sum_i |\lambda_i|^{2r} \le \mathrm{tr}\, B^r (B^r)^T.
Since G is locally treelike, for small r each diagonal entry (B^r (B^r)^T)_{vv} is the number of vertices r steps from v; in expectation this is c^r, so the moments are bounded:
E \sum_i |\lambda_i|^{2r} \le 2m \, c^r.
But this only limits the number of bad eigenvalues to n^α for some α = α(c) < 1; proving the conjecture requires r ~ diam(G), far beyond treelike neighborhoods.
42 A more compact form. For a graph with n vertices and m edges, B is a 2m × 2m matrix, but except for ±1 eigenvalues, its spectrum is the same as that of the 2n × 2n matrix
B' = \begin{pmatrix} 0 & D - 1 \\ -1 & A \end{pmatrix}.
Think of a directed edge as having a head and a tail: each neighbor of the head gains a head (A), the tail becomes an antihead (−1), and the head becomes d − 1 tails (D − 1).
43 A more compact form. With
B' = \begin{pmatrix} 0 & D - 1 \\ -1 & A \end{pmatrix},
we get a quadratic eigenvalue equation:
B' \begin{pmatrix} u \\ v \end{pmatrix} = \lambda \begin{pmatrix} u \\ v \end{pmatrix} \;\Rightarrow\; \left(\lambda A - D - (\lambda^2 - 1) \, 1\right) v = 0,
so B has an eigenvalue λ if and only if λA − D has an eigenvalue λ² − 1. The spectral density of λA − D is the fixed point of a recursive distributional equation [Bordenave and Lelarge; Khorunzhy, Shcherbina, and Vengerovsky; Saade, Krzakala, Zdeborová]. But proving zero density doesn't prove a gap...
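The relation between the 2m × 2m matrix B and the 2n × 2n matrix B' can be checked numerically on a small graph. This sketch of mine compares characteristic polynomials: by the Ihara–Bass identity, B's should equal B''s times (λ² − 1)^{m−n}.

```python
import numpy as np

# Small test graph: a 4-cycle plus a chord (n = 4 vertices, m = 5 edges).
edges = [(0, 1), (1, 2), (2, 3), (3, 0), (0, 2)]
n, m = 4, len(edges)
A = np.zeros((n, n))
for u, v in edges:
    A[u, v] = A[v, u] = 1.0
D = np.diag(A.sum(axis=1))

# Non-backtracking matrix on the 2m directed edges:
# B[(u->v), (w->x)] = 1 iff w = v and x != u (no backtracking).
darts = edges + [(v, u) for u, v in edges]
B = np.zeros((2 * m, 2 * m))
for i, (u, v) in enumerate(darts):
    for j, (w, x) in enumerate(darts):
        if w == v and x != u:
            B[i, j] = 1.0

# Compact 2n x 2n form B' = [[0, D - 1], [-1, A]].
I = np.eye(n)
Bp = np.block([[np.zeros((n, n)), D - I], [-I, A]])

# char(B) should equal char(B') times (lambda^2 - 1)^(m - n).
lhs = np.poly(B)
rhs = np.poly(Bp)
for _ in range(m - n):
    rhs = np.polymul(rhs, [1.0, 0.0, -1.0])
print(np.allclose(lhs, rhs, atol=1e-6))  # True
```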
44 Shameless Plug. "To put it bluntly: this book rocks! It somehow manages to combine the fun of a popular book with the intellectual heft of a textbook." — Scott Aaronson, MIT. "This is, simply put, the best-written book on the theory of computation I have ever read; one of the best-written mathematical books I have ever read, period." — Cosma Shalizi, Carnegie Mellon.
Spectral Graph Theory Lecture 2 The Laplacian Daniel A. Spielman September 4, 2015 Disclaimer These notes are not necessarily an accurate representation of what happened in class. The notes written before
More information25.1 Markov Chain Monte Carlo (MCMC)
CS880: Approximations Algorithms Scribe: Dave Andrzejewski Lecturer: Shuchi Chawla Topic: Approx counting/sampling, MCMC methods Date: 4/4/07 The previous lecture showed that, for self-reducible problems,
More informationarxiv: v1 [stat.me] 6 Nov 2014
Network Cross-Validation for Determining the Number of Communities in Network Data Kehui Chen 1 and Jing Lei arxiv:1411.1715v1 [stat.me] 6 Nov 014 1 Department of Statistics, University of Pittsburgh Department
More informationDissertation Defense
Clustering Algorithms for Random and Pseudo-random Structures Dissertation Defense Pradipta Mitra 1 1 Department of Computer Science Yale University April 23, 2008 Mitra (Yale University) Dissertation
More informationCS249: ADVANCED DATA MINING
CS249: ADVANCED DATA MINING Graph and Network Instructor: Yizhou Sun yzsun@cs.ucla.edu May 31, 2017 Methods Learnt Classification Clustering Vector Data Text Data Recommender System Decision Tree; Naïve
More informationCS 6820 Fall 2014 Lectures, October 3-20, 2014
Analysis of Algorithms Linear Programming Notes CS 6820 Fall 2014 Lectures, October 3-20, 2014 1 Linear programming The linear programming (LP) problem is the following optimization problem. We are given
More information6.207/14.15: Networks Lecture 12: Generalized Random Graphs
6.207/14.15: Networks Lecture 12: Generalized Random Graphs 1 Outline Small-world model Growing random networks Power-law degree distributions: Rich-Get-Richer effects Models: Uniform attachment model
More informationConsensus Problems on Small World Graphs: A Structural Study
Consensus Problems on Small World Graphs: A Structural Study Pedram Hovareshti and John S. Baras 1 Department of Electrical and Computer Engineering and the Institute for Systems Research, University of
More informationSummary: A Random Walks View of Spectral Segmentation, by Marina Meila (University of Washington) and Jianbo Shi (Carnegie Mellon University)
Summary: A Random Walks View of Spectral Segmentation, by Marina Meila (University of Washington) and Jianbo Shi (Carnegie Mellon University) The authors explain how the NCut algorithm for graph bisection
More informationCLUSTERING over graphs is a classical problem with
Maximum Likelihood Latent Space Embedding of Logistic Random Dot Product Graphs Luke O Connor, Muriel Médard and Soheil Feizi ariv:5.85v3 [stat.ml] 3 Aug 27 Abstract A latent space model for a family of
More informationBefore we show how languages can be proven not regular, first, how would we show a language is regular?
CS35 Proving Languages not to be Regular Before we show how languages can be proven not regular, first, how would we show a language is regular? Although regular languages and automata are quite powerful
More informationLecture 13: Spectral Graph Theory
CSE 521: Design and Analysis of Algorithms I Winter 2017 Lecture 13: Spectral Graph Theory Lecturer: Shayan Oveis Gharan 11/14/18 Disclaimer: These notes have not been subjected to the usual scrutiny reserved
More informationCS6220: DATA MINING TECHNIQUES
CS6220: DATA MINING TECHNIQUES Mining Graph/Network Data Instructor: Yizhou Sun yzsun@ccs.neu.edu March 16, 2016 Methods to Learn Classification Clustering Frequent Pattern Mining Matrix Data Decision
More informationCommunities Via Laplacian Matrices. Degree, Adjacency, and Laplacian Matrices Eigenvectors of Laplacian Matrices
Communities Via Laplacian Matrices Degree, Adjacency, and Laplacian Matrices Eigenvectors of Laplacian Matrices The Laplacian Approach As with betweenness approach, we want to divide a social graph into
More informationKernels of Directed Graph Laplacians. J. S. Caughman and J.J.P. Veerman
Kernels of Directed Graph Laplacians J. S. Caughman and J.J.P. Veerman Department of Mathematics and Statistics Portland State University PO Box 751, Portland, OR 97207. caughman@pdx.edu, veerman@pdx.edu
More informationRings, Paths, and Paley Graphs
Spectral Graph Theory Lecture 5 Rings, Paths, and Paley Graphs Daniel A. Spielman September 12, 2012 5.1 About these notes These notes are not necessarily an accurate representation of what happened in
More informationarxiv: v1 [cs.ds] 11 Oct 2018
Path matrix and path energy of graphs arxiv:1810.04870v1 [cs.ds] 11 Oct 018 Aleksandar Ilić Facebook Inc, Menlo Park, California, USA e-mail: aleksandari@gmail.com Milan Bašić Faculty of Sciences and Mathematics,
More informationPage Max. Possible Points Total 100
Math 3215 Exam 2 Summer 2014 Instructor: Sal Barone Name: GT username: 1. No books or notes are allowed. 2. You may use ONLY NON-GRAPHING and NON-PROGRAMABLE scientific calculators. All other electronic
More informationPermutations and Combinations
Permutations and Combinations Permutations Definition: Let S be a set with n elements A permutation of S is an ordered list (arrangement) of its elements For r = 1,..., n an r-permutation of S is an ordered
More informationSpectral Graph Theory Lecture 3. Fundamental Graphs. Daniel A. Spielman September 5, 2018
Spectral Graph Theory Lecture 3 Fundamental Graphs Daniel A. Spielman September 5, 2018 3.1 Overview We will bound and derive the eigenvalues of the Laplacian matrices of some fundamental graphs, including
More informationStrongly Regular Graphs, part 1
Spectral Graph Theory Lecture 23 Strongly Regular Graphs, part 1 Daniel A. Spielman November 18, 2009 23.1 Introduction In this and the next lecture, I will discuss strongly regular graphs. Strongly regular
More informationEigenvalues, random walks and Ramanujan graphs
Eigenvalues, random walks and Ramanujan graphs David Ellis 1 The Expander Mixing lemma We have seen that a bounded-degree graph is a good edge-expander if and only if if has large spectral gap If G = (V,
More informationUniversity of Chicago Autumn 2003 CS Markov Chain Monte Carlo Methods
University of Chicago Autumn 2003 CS37101-1 Markov Chain Monte Carlo Methods Lecture 4: October 21, 2003 Bounding the mixing time via coupling Eric Vigoda 4.1 Introduction In this lecture we ll use the
More information1 Ways to Describe a Stochastic Process
purdue university cs 59000-nmc networks & matrix computations LECTURE NOTES David F. Gleich September 22, 2011 Scribe Notes: Debbie Perouli 1 Ways to Describe a Stochastic Process We will use the biased
More informationPRGs for space-bounded computation: INW, Nisan
0368-4283: Space-Bounded Computation 15/5/2018 Lecture 9 PRGs for space-bounded computation: INW, Nisan Amnon Ta-Shma and Dean Doron 1 PRGs Definition 1. Let C be a collection of functions C : Σ n {0,
More informationNotes 6 : First and second moment methods
Notes 6 : First and second moment methods Math 733-734: Theory of Probability Lecturer: Sebastien Roch References: [Roc, Sections 2.1-2.3]. Recall: THM 6.1 (Markov s inequality) Let X be a non-negative
More information18.312: Algebraic Combinatorics Lionel Levine. Lecture 19
832: Algebraic Combinatorics Lionel Levine Lecture date: April 2, 20 Lecture 9 Notes by: David Witmer Matrix-Tree Theorem Undirected Graphs Let G = (V, E) be a connected, undirected graph with n vertices,
More informationCSC 5170: Theory of Computational Complexity Lecture 4 The Chinese University of Hong Kong 1 February 2010
CSC 5170: Theory of Computational Complexity Lecture 4 The Chinese University of Hong Kong 1 February 2010 Computational complexity studies the amount of resources necessary to perform given computations.
More informationCS6999 Probabilistic Methods in Integer Programming Randomized Rounding Andrew D. Smith April 2003
CS6999 Probabilistic Methods in Integer Programming Randomized Rounding April 2003 Overview 2 Background Randomized Rounding Handling Feasibility Derandomization Advanced Techniques Integer Programming
More informationClustering using Mixture Models
Clustering using Mixture Models The full posterior of the Gaussian Mixture Model is p(x, Z, µ,, ) =p(x Z, µ, )p(z )p( )p(µ, ) data likelihood (Gaussian) correspondence prob. (Multinomial) mixture prior
More informationPowerful tool for sampling from complicated distributions. Many use Markov chains to model events that arise in nature.
Markov Chains Markov chains: 2SAT: Powerful tool for sampling from complicated distributions rely only on local moves to explore state space. Many use Markov chains to model events that arise in nature.
More informationSymmetric Rendezvous Search
Symmetric Rendezvous Search Richard Weber Talk to the Adams Society, 1 February, 007 Statistical Laboratory, Centre for Mathematical Sciences, University of Cambridge Aisle miles (006) Two people lose
More information