Lecture 4: Phase Transition in Random Graphs

Radu Balan, February 21, 2017

Distributions

Today we discuss the phase transition in random graphs. Recall that in the Erdős-Rényi class $G_{n,p}$ of random graphs, the probability mass function on $\mathcal{G}$, $P : \mathcal{G} \to [0,1]$, is obtained by assuming that, as random variables, edges are independent from one another, and each edge occurs with probability $p \in [0,1]$. Thus a graph $G \in \mathcal{G}$ with $m$ edges has probability
$$P(G) = p^m (1-p)^{\binom{n}{2} - m}.$$
Recall the expected number of $q$-cliques $X_q$ is
$$E[X_q] = \binom{n}{q} p^{q(q-1)/2}.$$
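As a sanity check on these formulas, here is a small brute-force sketch (our illustration, not from the lecture): it enumerates all labeled graphs on a few vertices, verifies that the probabilities $P(G)$ sum to 1, and compares the averaged $q$-clique count with $\binom{n}{q} p^{q(q-1)/2}$.

```python
from itertools import combinations, product
from math import comb

n, p, q = 4, 0.3, 3
pairs = list(combinations(range(n), 2))            # the C(n,2) potential edges

total_prob = 0.0
expected_cliques = 0.0
for bits in product([0, 1], repeat=len(pairs)):    # enumerate all 2^C(n,2) labeled graphs
    m = sum(bits)
    P = p**m * (1 - p)**(len(pairs) - m)           # P(G) = p^m (1-p)^(C(n,2)-m)
    edges = {e for e, b in zip(pairs, bits) if b}
    cliques = sum(all(f in edges for f in combinations(S, 2))
                  for S in combinations(range(n), q))
    total_prob += P
    expected_cliques += P * cliques

print(total_prob)                                       # -> 1.0 (up to rounding)
print(expected_cliques, comb(n, q) * p**(q*(q-1)//2))   # both ~ 0.108
```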

Distributions

We shall also use $\Gamma_{n,m}$, the set of all graphs on $n$ vertices with $m$ edges. The set $\Gamma_{n,m}$ has cardinality $\binom{\binom{n}{2}}{m}$. In $\Gamma_{n,m}$ each graph is equally probable.
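For experiments, both models are easy to sample. A minimal sketch, assuming NumPy is available and graphs are stored as dense adjacency matrices (the function names are ours, not from the lecture):

```python
import numpy as np

rng = np.random.default_rng(0)

def sample_gnp(n, p):
    """Sample from G_{n,p}: each of the C(n,2) edges appears independently w.p. p."""
    iu = np.triu_indices(n, k=1)                   # the C(n,2) potential edges
    A = np.zeros((n, n), dtype=int)
    A[iu] = rng.random(len(iu[0])) < p             # independent Bernoulli(p) flips
    return A + A.T                                 # symmetrize: undirected graph

def sample_gnm(n, m):
    """Sample from Gamma_{n,m}: a uniformly random m-subset of the C(n,2) edges."""
    iu = np.triu_indices(n, k=1)
    chosen = rng.choice(len(iu[0]), size=m, replace=False)
    A = np.zeros((n, n), dtype=int)
    A[iu[0][chosen], iu[1][chosen]] = 1
    return A + A.T

print(sample_gnp(5, 0.5))
print(sample_gnm(5, 4))
```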

Cliques

The case of 3-cliques: $E[X_3] = \theta n^3 p^3$ ($\theta \to \frac{1}{6}$). The case of 4-cliques: $E[X_4] = \theta n^4 p^6$ ($\theta \to \frac{1}{24}$). The first problem we consider is the size of the largest clique of a random graph. Note that finding the size of the largest clique (called the clique number) is an NP-hard problem.

Idea: Analyze $p$ so that $E[X_q] \approx 1$. For $p \gg \frac{1}{n}$ and large $n$ we expect that graphs will have a 3-clique; for $p \gg \frac{1}{n^{2/3}}$ and large $n$ we expect that graphs will have a 4-clique (see the sketch below).

Question: How sharp are these thresholds?
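The 3-clique claim is easy to probe empirically. A minimal Monte Carlo sketch, assuming NumPy (it uses the identity $X_3 = \mathrm{tr}(A^3)/6$, since $(A^3)_{ii}$ counts closed walks of length 3 and each triangle yields six such walks):

```python
import numpy as np

rng = np.random.default_rng(1)

def has_3clique(n, p):
    iu = np.triu_indices(n, k=1)
    A = np.zeros((n, n), dtype=int)
    A[iu] = rng.random(len(iu[0])) < p
    A = A + A.T
    return np.trace(A @ A @ A) // 6 > 0            # X_3 = tr(A^3)/6 triangles

n, trials = 200, 200
for c in [0.1, 0.5, 1.0, 2.0, 10.0]:               # p = c/n sweeps across the threshold
    freq = sum(has_3clique(n, c / n) for _ in range(trials)) / trials
    print(f"p = {c}/n: empirical Prob[3-clique] ~ {freq:.2f}")
```

For $c$ well below 1 the empirical frequency is near 0, and for large $c$ it is near 1, matching the heuristic $E[X_3] \approx 1$ at $p \approx 1/n$.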

3-Cliques

Theorem. Let $p = p(n)$ be the edge probability in $G_{n,p}$.
1. If $p \gg \frac{1}{n}$ (i.e. $\lim_{n\to\infty} np = \infty$) then $\lim_{n\to\infty} \mathrm{Prob}[G \in G_{n,p} \text{ has a 3-clique}] = 1$.
2. If $p \ll \frac{1}{n}$ (i.e. $\lim_{n\to\infty} np = 0$) then $\lim_{n\to\infty} \mathrm{Prob}[G \in G_{n,p} \text{ has a 3-clique}] = 0$.

Theorem. Let $m = m(n)$ be the number of edges in $\Gamma_{n,m}$.
1. If $m \gg n$ (i.e. $\lim_{n\to\infty} \frac{m}{n} = \infty$) then $\lim_{n\to\infty} \mathrm{Prob}[G \in \Gamma_{n,m} \text{ has a 3-clique}] = 1$.
2. If $m \ll n$ (i.e. $\lim_{n\to\infty} \frac{m}{n} = 0$) then $\lim_{n\to\infty} \mathrm{Prob}[G \in \Gamma_{n,m} \text{ has a 3-clique}] = 0$.

4-Cliques

Theorem. Let $p = p(n)$ be the edge probability in $G_{n,p}$.
1. If $p \gg \frac{1}{n^{2/3}}$ (i.e. $\lim_{n\to\infty} n^{2/3} p = \infty$) then $\lim_{n\to\infty} \mathrm{Prob}[G \in G_{n,p} \text{ has a 4-clique}] = 1$.
2. If $p \ll \frac{1}{n^{2/3}}$ (i.e. $\lim_{n\to\infty} n^{2/3} p = 0$) then $\lim_{n\to\infty} \mathrm{Prob}[G \in G_{n,p} \text{ has a 4-clique}] = 0$.

Theorem. Let $m = m(n)$ be the number of edges in $\Gamma_{n,m}$.
1. If $m \gg n^{4/3}$ (i.e. $\lim_{n\to\infty} \frac{m}{n^{4/3}} = \infty$) then $\lim_{n\to\infty} \mathrm{Prob}[G \in \Gamma_{n,m} \text{ has a 4-clique}] = 1$.
2. If $m \ll n^{4/3}$ (i.e. $\lim_{n\to\infty} \frac{m}{n^{4/3}} = 0$) then $\lim_{n\to\infty} \mathrm{Prob}[G \in \Gamma_{n,m} \text{ has a 4-clique}] = 0$.

q-Cliques

Theorem. Let $p = p(n)$ be the edge probability in $G_{n,p}$. Let $q \ge 3$ be an integer.
1. If $p \gg \frac{1}{n^{2/(q-1)}}$ (i.e. $\lim_{n\to\infty} n^{2/(q-1)} p = \infty$) then $\lim_{n\to\infty} \mathrm{Prob}[G \in G_{n,p} \text{ has a } q\text{-clique}] = 1$.
2. If $p \ll \frac{1}{n^{2/(q-1)}}$ (i.e. $\lim_{n\to\infty} n^{2/(q-1)} p = 0$) then $\lim_{n\to\infty} \mathrm{Prob}[G \in G_{n,p} \text{ has a } q\text{-clique}] = 0$.

Theorem. Let $m = m(n)$ be the number of edges in $\Gamma_{n,m}$. Let $q \ge 3$ be an integer.
1. If $m \gg n^{2(q-2)/(q-1)}$ (i.e. $\lim_{n\to\infty} \frac{m}{n^{2(q-2)/(q-1)}} = \infty$) then $\lim_{n\to\infty} \mathrm{Prob}[G \in \Gamma_{n,m} \text{ has a } q\text{-clique}] = 1$.
2. If $m \ll n^{2(q-2)/(q-1)}$ (i.e. $\lim_{n\to\infty} \frac{m}{n^{2(q-2)/(q-1)}} = 0$) then $\lim_{n\to\infty} \mathrm{Prob}[G \in \Gamma_{n,m} \text{ has a } q\text{-clique}] = 0$.

Markov and Chebyshev Inequalities

We want to control probabilities of the random event $X_3(G) > 0$. Two important tools:
1. (Markov's Inequality) Assume $X$ is a non-negative random variable. Then $\mathrm{Prob}[X \ge t] \le \frac{E[X]}{t}$.
2. (Chebyshev's Inequality) For any random variable $X$, $\mathrm{Prob}[|X - E[X]| \ge t] \le \frac{\mathrm{Var}[X]}{t^2}$,
where $E[X]$ is the mean of $X$, and $\mathrm{Var}[X] = E[X^2] - E[X]^2$ is the variance of $X$.

Quick Proof:
$$\mathrm{Prob}[X \ge t] = \int_t^\infty p_X(x)\,dx \le \frac{1}{t} \int_t^\infty x\, p_X(x)\,dx \le \frac{E[X]}{t}.$$
$$\mathrm{Prob}[|X - E[X]| \ge t] = \mathrm{Prob}[|X - E[X]|^2 \ge t^2] \le \frac{E[|X - E[X]|^2]}{t^2} = \frac{\mathrm{Var}[X]}{t^2}.$$
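A quick numerical illustration of both bounds (a sketch assuming NumPy; the binomial example and the threshold $t = 45$ are our choices, not from the lecture):

```python
import numpy as np

rng = np.random.default_rng(2)
X = rng.binomial(n=100, p=0.3, size=100_000)       # nonnegative, mean 30, variance 21

t = 45.0
s = t - X.mean()                                    # deviation threshold for Chebyshev
print("Prob[X >= 45]           ~", (X >= t).mean())
print("Markov bound E[X]/t     =", X.mean() / t)
print("Prob[|X - 30| >= 15]    ~", (np.abs(X - X.mean()) >= s).mean())
print("Chebyshev bound Var/s^2 =", X.var() / s**2)
```

Both bounds hold but are loose here; their value is that they require only the mean (Markov) or the variance (Chebyshev), which is exactly what we can compute for clique counts.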

Proofs for the 3-clique case

For small probability: We use Markov's inequality to show $\mathrm{Prob}[X_3 > 0] \to 0$ when $p \ll \frac{1}{n}$:
$$\mathrm{Prob}[X_3 > 0] = \mathrm{Prob}[X_3 \ge 1] \le \frac{E[X_3]}{1} = \frac{n(n-1)(n-2)}{6}\, p^3 = \theta n^3 p^3 \to 0.$$

For large probability: Since $E[X_3] \to \infty$ it follows that $\mathrm{Prob}[X_3 > 0] > 0$. We need to show that $\mathrm{Prob}[X_3 = 0] \to 0$. By Chebyshev's inequality:
$$\mathrm{Prob}[X_3 = 0] \le \mathrm{Prob}[|X_3 - E[X_3]| \ge E[X_3]] \le \frac{\mathrm{Var}[X_3]}{E[X_3]^2}.$$
We need the variance of $X_3 = \sum_{(i,j,k) \in S_3} 1_{i,j,k}$:
$$X_3^2 = \sum_{(i,j,k) \in S_3} \sum_{(i',j',k') \in S_3} 1_{i,j,k}\, 1_{i',j',k'}.$$

Proofs for the 3-clique case X3 2 = 1 i,j,k + (1 i,j,k 1 i,j,l +1 i,j,k 1 j,k,l +1 i,j,k 1 k,i,l )+ (i,j,k) S 3 (n) + (i,j,k) S 3 (n) u,v S 2 (n 3) + (i,j,k) S 3 (n) l S 1 (n 3) (1 i,j,k 1 i,u,v + 1 i,j,k 1 j,u,v 1 i,j,k 1 k,u,v )+ (i,j,k) S 3 (n) (i,j,k ) S 3 (n 3) 1 i,j,k 1 i,j,k

Proofs for the 3-clique case X 2 3 = (i,j,k) S 3 (n) + 1 i,j,k + (i,j,k) S 3 (n) u,v S 2 (n 3) + (i,j,k) S 3 (n) l S 1 (n 3) (1 i,j,k 1 i,j,l +1 i,j,k 1 j,k,l +1 i,j,k 1 k,i,l )+ (1 i,j,k 1 i,u,v + 1 i,j,k 1 j,u,v 1 i,j,k 1 k,u,v )+ (i,j,k) S 3 (n) (i,j,k ) S 3 (n 3) E[X 2 3 ] = S 3 p 3 + 3 S 3 (n 3)p 5 + 3 S 3 Thus ( n 3 2 1 i,j,k 1 i,j,k ) p 6 + S 3 ( n 3 3 Var[X 3 ] = E[X 2 3 ] E[X 3 ] 2 =... = θ(n 3 p 3 + n 4 p 5 + n 5 p 6 ). ) p 6.

Proofs for the 3-clique case

and:
$$\mathrm{Prob}[X_3 = 0] \le \frac{\theta( n^3 p^3 + n^4 p^5 + n^5 p^6 )}{\theta( n^6 p^6 )} = \theta\left( \frac{1}{(np)^3} + \frac{1}{n^2 p} + \frac{1}{n} \right) \to 0,$$
since $np \to \infty$. Similar proofs work for the other cases (4-cliques and $q$-cliques).

Behavior at the threshold

In general we obtain a coarse threshold. Recall a Poisson random variable $X$ with parameter $\lambda$ has p.m.f. $\mathrm{Prob}[X = k] = e^{-\lambda} \frac{\lambda^k}{k!}$.

Theorem. In $G_{n,p}$:
1. For $p = \frac{c}{n}$, $X_3$ is asymptotically Poisson with parameter $\lambda = c^3/6$.
2. For $p = \frac{c}{n^{2/3}}$, $X_4$ is asymptotically Poisson with parameter $\lambda = c^6/24$.

Theorem. In $\Gamma_{n,m}$:
1. For $m = cn$, $X_3$ is asymptotically Poisson with parameter $\lambda = 4c^3/3$.
2. For $m = cn^{4/3}$, $X_4$ is asymptotically Poisson with parameter $\lambda = 8c^6/3$.
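This limit is easy to observe numerically. A sketch (assuming NumPy; the parameters $n$, $c$, and the trial count are our choices) comparing the empirical distribution of $X_3$ at $p = c/n$ with the Poisson p.m.f.:

```python
import numpy as np
from math import exp, factorial

rng = np.random.default_rng(3)

def triangle_count(n, p):
    iu = np.triu_indices(n, k=1)
    A = np.zeros((n, n), dtype=int)
    A[iu] = rng.random(len(iu[0])) < p
    A = A + A.T
    return int(np.trace(A @ A @ A)) // 6           # X_3 = tr(A^3)/6

n, c, trials = 200, 2.0, 1000
lam = c**3 / 6                                     # predicted Poisson parameter
counts = [triangle_count(n, c / n) for _ in range(trials)]
for k in range(5):
    empirical = sum(x == k for x in counts) / trials
    poisson = exp(-lam) * lam**k / factorial(k)
    print(f"k={k}: empirical {empirical:.3f} vs Poisson {poisson:.3f}")
```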

Connected Components

The class $G_{n,p}$ of random graphs has a remarkable property with regard to its largest connected component. We shall express the result in the class $\Gamma_{n,m}$.

Connected Components

Theorem.
1. Let $m = m(n)$ satisfy $m \ll \frac{1}{2} n \log(n)$. Then $\lim_{n\to\infty} \mathrm{Prob}[G \in \Gamma_{n,m} \text{ is connected}] = 0$.
2. Let $m = m(n)$ satisfy $m \gg \frac{1}{2} n \log(n)$. Then $\lim_{n\to\infty} \mathrm{Prob}[G \in \Gamma_{n,m} \text{ is connected}] = 1$.
3. Assume $m = \frac{1}{2} n \log(n) + t n + o(n)$, where $\frac{o(n)}{n} \to 0$. Then $\lim_{n\to\infty} \mathrm{Prob}[G \in \Gamma_{n,m} \text{ is connected}] = e^{-e^{-2t}}$.

In this case $\frac{1}{2} n \log(n)$ is known as a strong threshold.
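The strong threshold, including the double-exponential limit in part 3, can also be seen in simulation. A sketch assuming NumPy and SciPy are available (the sampler and the parameter choices are ours):

```python
import numpy as np
from scipy.sparse import coo_matrix
from scipy.sparse.csgraph import connected_components

rng = np.random.default_rng(4)

def is_connected_gnm(n, m):
    iu = np.triu_indices(n, k=1)
    chosen = rng.choice(len(iu[0]), size=m, replace=False)
    A = coo_matrix((np.ones(m), (iu[0][chosen], iu[1][chosen])), shape=(n, n))
    ncomp, _ = connected_components(A, directed=False)
    return ncomp == 1

n, trials = 1000, 200
for t in [-1.0, 0.0, 1.0]:
    m = int(0.5 * n * np.log(n) + t * n)           # m = (1/2) n log n + t n
    freq = sum(is_connected_gnm(n, m) for _ in range(trials)) / trials
    print(f"t = {t:+.1f}: empirical {freq:.2f} vs limit {np.exp(-np.exp(-2*t)):.2f}")
```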

3-cliques & Connectivity results

Results for $n = 1000$ vertices.
1. 3-cliques. Recall $E[X_3] \approx \frac{4}{3} \left( \frac{m}{n} \right)^3$ in $\Gamma_{n,m}$.
2. Connectivity. Recall the connectivity threshold is $\frac{1}{2} n \log(n) = 3454$.

[Figures: simulation plots titled "3-cliques" (two slides) and "Connectivity" (four slides) for n = 1000; the plot content did not survive transcription.]

References

B. Bollobás, Graph Theory: An Introductory Course, Springer-Verlag, 1979.
F. Chung, Spectral Graph Theory, AMS, 1997.
F. Chung, L. Lu, The average distances in random graphs with given expected degrees, Proc. Nat. Acad. Sci. 99(25), 15879-15882, 2002.
R. Diestel, Graph Theory, 3rd Edition, Springer-Verlag, 2005.
P. Erdős, A. Rényi, On the Evolution of Random Graphs, 1960.
G. Grimmett, Probability on Graphs: Random Processes on Graphs and Lattices, Cambridge University Press, 2010.
J. Leskovec, J. Kleinberg, C. Faloutsos, Graph Evolution: Densification and Shrinking Diameters, ACM Trans. on Knowl. Disc. Data, 1(1), 2007.