Coupling of Scale-Free and Classical Random Graphs

Coupling of Scale-Free and Classical Random Graphs April 18, 2007

Introduction Consider a graph where we delete some nodes and look at the size of the largest component remaining. Just how robust are scale-free graphs to random failures, to deliberate attacks, or to an adversary with unlimited time?

Introduction If we take the adversary out of the picture, we can phrase the question as: how many vertices do we need to remove so that the graph breaks up into small pieces?

The Model Consider the preferential attachment model. Assume $m$ is fixed and $G[k] \sim G_m^{(k)}$. We create $G[k+1]$ by adding a new vertex $k+1$ with $m$ edges $(k+1, t_1), \dots, (k+1, t_m)$, where $$\Pr[t_i = j] = \frac{d_{G[k],i}(j)}{2mk + 2i - 1}$$ and $d_{G[k],i}(j)$ denotes the degree of $j$ once the first $i-1$ of the new edges are in place.
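This growth rule can be sketched in code via the usual stub-list trick: keep one list entry per edge endpoint, so a uniform draw picks a vertex with probability proportional to its degree. This is a minimal sketch under my own conventions; the function name `pref_attach` and the choice of initializing vertex 1 with $m$ self-loops are illustrative, not from the slides.

```python
import random

def pref_attach(n, m, rng):
    """Grow a preferential-attachment multigraph on vertices 1..n.

    A uniform draw from `stubs` (one entry per edge endpoint) picks vertex j
    with probability proportional to its current degree. Appending the new
    vertex's pending half-edge before drawing makes the stub list have length
    2mk + 2i - 1 at the i-th draw for vertex k+1, matching the slide's rule
    Pr[t_i = j] = d_{G[k],i}(j) / (2mk + 2i - 1).
    """
    stubs = [1] * (2 * m)              # vertex 1 starts with m self-loops (a convention)
    edges = []
    for k in range(1, n):
        v = k + 1
        for _ in range(m):
            stubs.append(v)            # the pending half-edge counts toward degrees
            t = rng.choice(stubs)      # degree-proportional target
            edges.append((v, t))
            stubs.append(t)
    return edges
```

Just before the $i$-th draw for vertex $k+1$ the stub list has length $2mk + 2i - 1$, so the uniform draw reproduces the slide's denominator exactly.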

Main Result Theorem 1 There exist $c > 0$ and $m_0$ s.t. for $m \ge m_0$, whp $G_m^{(n)}$ satisfies the following: (a) every induced subgraph on $10n\log(m)/m$ vertices contains a component of size at least $2n\log(m)/m$; (b) the graph contains an independent set of size $cn\log(m)/m$.

Proof Technique Proof by coupling: we create two graphs $(G_1, G_2)$ with $G_1 \sim G_m^{(n)}$ and $G_2 \sim G(n, p)$. The graphs $G_1$ and $G_2$ are not independent; on the contrary, they are heavily correlated.

What is coupling? Let $X \sim D_X$ and $Y \sim D_Y$. A coupling is a joint distribution $D_{X,Y}$ s.t. the marginals are correct, i.e. $D_{X,\cdot} = D_X$ and $D_{\cdot,Y} = D_Y$. For example, suppose $X, Y$ are uniform on $[0,1]$ and we let $Y = 1 - X$; then both marginals are still uniform, but the joint distribution is far from independent.
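The uniform example above can be checked numerically. This tiny demo is my own illustration (not from the slides): both coordinates of $(X, 1-X)$ have uniform marginals, yet the pair is completely dependent.

```python
import random

# Sample the coupling (X, Y) with X ~ U[0,1] and Y = 1 - X.
# Each marginal is uniform on [0,1], but X + Y = 1 always,
# so the joint distribution is as far from independent as possible.
rng = random.Random(0)
samples = [(x, 1.0 - x) for x in (rng.random() for _ in range(100_000))]
mean_x = sum(x for x, _ in samples) / len(samples)
mean_y = sum(y for _, y in samples) / len(samples)
```

Both empirical means come out near $1/2$, as they must for uniform marginals, even though knowing $X$ determines $Y$ exactly.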

Chernoff Bounds Let $X \sim \mathrm{Bin}(n, p)$. Then $$\Pr[X > (1+\delta)np] < \left(\frac{e^{\delta}}{(1+\delta)^{1+\delta}}\right)^{np}, \qquad \Pr[X < (1-\delta)np] < \left(\frac{e^{-\delta}}{(1-\delta)^{1-\delta}}\right)^{np}.$$
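As a sanity check (mine, not from the talk), the upper-tail bound can be compared against a Monte Carlo estimate; the parameters below are arbitrary illustrative values.

```python
import math
import random

def chernoff_upper_tail(n, p, delta):
    """The slide's bound: Pr[X > (1+delta)np] < (e^d / (1+d)^(1+d))^(np)."""
    return (math.exp(delta) / (1 + delta) ** (1 + delta)) ** (n * p)

rng = random.Random(1)
n, p, delta = 1000, 0.3, 0.2
trials = 5_000
threshold = (1 + delta) * n * p
# Empirical frequency of the upper-tail event for X ~ Bin(n, p),
# sampling each binomial as a sum of Bernoulli coin flips.
hits = sum(
    sum(rng.random() < p for _ in range(n)) > threshold
    for _ in range(trials)
)
empirical = hits / trials
```

For these parameters the bound evaluates to roughly $3.6 \times 10^{-3}$, while the empirical tail frequency is far smaller, so the inequality holds with plenty of room.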

The Coupling Theorem 2 Fix $\eta < 1/2$. There exist constants $A, c > 0$ s.t. for fixed $m$ we can construct $G_1 \sim G_m^{(n)}$ and $G_2 \sim G(n, \eta m/n)$ s.t. whp $e(G_2 \setminus G_1) \le A e^{-cm} n$.

The Coupling Sidenote We can create $G(n, p)$ one node at a time: $G[1]$ contains a single node; given $G[k]$, generate $Y \sim \mathrm{Bin}(k, p)$, let $S$ be a uniformly random subset of $[k]$ of size $Y$, and create edges between vertex $k+1$ and $S$.
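The sidenote's incremental construction can be sketched as follows; the function name `gnp_incremental` is mine, and this is one straightforward way to realize the description, not a definitive implementation.

```python
import random

def gnp_incremental(n, p, rng):
    """Build G(n, p) one vertex at a time, per the sidenote: when vertex k+1
    arrives, draw Y ~ Bin(k, p), take a uniform random Y-subset S of [k],
    and join k+1 to S. This has the same distribution as flipping an
    independent p-coin for every pair of vertices."""
    edges = []
    for k in range(1, n):
        y = sum(rng.random() < p for _ in range(k))   # Y ~ Bin(k, p)
        s = rng.sample(range(1, k + 1), y)            # uniform size-Y subset of [k]
        edges.extend((k + 1, j) for j in s)
    return edges
```

The equivalence holds because, conditioned on its size, a binomial random subset of $[k]$ is uniform; this node-by-node view is exactly what the coupling proof exploits.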

The Coupling Proof We grow $G_1, G_2$ one node at a time. Start with $G_1[k_0] \sim G_m^{(k_0)}$ and $G_2[k_0] \sim G(k_0, \eta m/n)$, where $k_0$ grows slowly with $n$. Let $t_1, \dots, t_m$ be the vertices picked when growing $G_1[k]$. Since every vertex has degree at least $m$, for $j \le k$ we have $$\Pr(t_i = j) = \frac{d_{G[k],i}(j)}{2mk + 2i - 1} \ge \frac{m}{2km + 2m} = \frac{1}{2k+2}.$$

The Coupling Proof [Figure: the vertices $1, 2, 3, \dots, k, k+1$, each assigned probability mass $1/(2k+2)$.] Construct independent $s_1, \dots, s_m$, each taking values in $[k]$ or left undefined, s.t. $\Pr(s_i = j) = \frac{1}{2k+2}$ for each $j \in [k]$, and $s_i = j$ implies $t_i = j$.

The Coupling Proof Let $X = |\{i : s_i \text{ is defined}\}|$; then $X \sim \mathrm{Bin}(m, \frac{k}{2k+2})$. Let $Y \sim \mathrm{Bin}(k, \eta m/n)$; then $$\mathbb{E}[X] = \frac{mk}{2k+2} \ge (1+\epsilon)\eta\frac{mk}{n} = (1+\epsilon)\mathbb{E}[Y],$$ so $\Pr[X < Y] \le A e^{-cm}$ for some constants $A, c > 0$.
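A quick simulation of this comparison (the concrete values of $n, k, \eta, m$ are my own, purely illustrative): when $\mathbb{E}[X]$ exceeds $\mathbb{E}[Y]$ by a constant factor, the event $X < Y$ is rare, consistent with the exponential bound.

```python
import random

rng = random.Random(2)
n, k, eta, m = 2_000, 1_000, 0.4, 100   # illustrative values with eta < 1/2

def binom_sample(trials, p):
    # Direct Bernoulli-sum sampler for Bin(trials, p)
    return sum(rng.random() < p for _ in range(trials))

reps = 2_000
# X ~ Bin(m, k/(2k+2)) has mean ~ m/2 = 50 here;
# Y ~ Bin(k, eta*m/n) has mean eta*m*k/n = 20.
losses = sum(
    binom_sample(m, k / (2 * k + 2)) < binom_sample(k, eta * m / n)
    for _ in range(reps)
)
frac = losses / reps
```

With these parameters the gap between the means is several standard deviations, so $X < Y$ essentially never occurs in the simulation.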

The Coupling Proof All the $s_i$'s are distinct w.p. $1 - O(m^2/k)$; given this and $X$, the set $S_1 = \{s_i : s_i \text{ is defined}\}$ is a uniformly random subset of $[k]$ of size $X$. If $Y \le X$, we pick $S_2$ as a random subset of size $Y$ from $S_1$. Otherwise we pick $S_2$ as a random subset of size $Y$ from $[k]$; this happens w.p. at most $A e^{-cm} + o(1)$.

The Coupling Proof Set $D_{k+1}$ = the number of edges added to $G_2$ and not to $G_1$. Then $\Pr[D_{k+1} > 0] \le A e^{-cm} + o(1)$, and $D_{k+1} \le Y$, so $\mathbb{E}[D_{k+1}] \le A m e^{-cm}$.

The Coupling Proof The number of edges in $G_2 \setminus G_1$ is at most $$\binom{k_0}{2} + \sum_{k=k_0+1}^{n} D_k.$$ Taking $k_0 = n^{1/4}$, we see that the sum has expected value at most $A n e^{-cm}$ and is concentrated around its mean. QED

Proof of Theorem 1(a) Let $G_1, G_2$ be as described in Theorem 2. Suppose $G_1$ has an induced subgraph on a set $V$ with $|V| = 10n\log(m)/m$ in which every component has size less than $2n\log(m)/m$. Then we can partition $V$ into two sets $V_1, V_2$ s.t. $|V_1|, |V_2| \ge 4n\log(m)/m$ and $G_1$ has no $V_1$-$V_2$ edges.

Proof of Theorem 1(a) Let $x = A e^{-cm} n$. If the coupling works, then $G_2$ has at most $x$ $V_1$-$V_2$ edges. The number of $V_1$-$V_2$ edges in $G_2$ is binomial with mean $$\mu = |V_1||V_2|\,\eta m/n \ge 24\eta(\log m)^2 n/m.$$ Pick $m$ large enough s.t. $x < \mu/100$.

Proof of Theorem 1(a) By Chernoff bounds, the probability that fewer than $x$ edges cross a fixed partition $V_1, V_2$ is at most $e^{-11\mu/12}$. Now $$\Pr[G_2 \text{ has at most } x\ V_1\text{-}V_2 \text{ edges}] \le \binom{n}{|V|} 2^{|V|} e^{-22\eta(\log m)^2 n/m} \le \left(\frac{en}{|V|}\right)^{|V|} 2^{|V|} e^{-22\eta(\log m)^2 n/m} \le \left(\frac{2em}{10\log m}\right)^{|V|} e^{-22\eta(\log m)^2 n/m} \le e^{(10-22\eta)(\log m)^2 n/m},$$ which tends to $0$ once $\eta$ is fixed in $(5/11, 1/2)$, so that $22\eta > 10$. QED

Overview of the proof The idea of the proof was this: we create $G_1, G_2$ simultaneously s.t. a bad event in $G_1$ implies, whp, another bad event in $G_2$, which is easy to show is rare in $G_2$. But we are working in the joint probability space, so this implies that the bad event is rare in $G_1$ as well.

Another Coupling Theorem 3 Let $\epsilon > 0$ be given. There is a constant $C$ s.t. for fixed $m$ we can couple $G_1, G_2$ s.t. $G_1 \sim G_m^{(n)}$, $G_2 \sim G(n, Cm/n)$, and whp $G_2$ contains $G_1 \setminus V$ for a set of vertices $V$ with $|\{i \in V : i \ge \epsilon n\}| \le \epsilon n/m$.

Another Coupling Proof of Theorem 3 We start with $G_1[k_0] \sim G_m^{(k_0)}$ and $G_2[k_0] \sim G(k_0, Cm/n)$ independent, where here $k_0 = \epsilon n$. A vertex is bad at time $k$ if it has degree at least $Am$ in $G_1[k]$.

Another Coupling Proof of Theorem 3 We need the following result from Bollobás et al.: vertex $i$ has degree $d + m$ in $G_m^{(n)}$ with probability $$o(n^{-1}) + (1 + o(1))\binom{d+m-1}{m-1}\left(\frac{i}{n}\right)^{m/2}\left(1 - \sqrt{i/n}\right)^{d},$$ i.e. for $i \ge \epsilon n$ it is exponentially small in $d$, for large $d$. If we choose $A$ large enough, the expected number of bad vertices is at most $e^{-Bm} n$, so whp there are at most $\epsilon n/m$ bad vertices.

Another Coupling Proof of Theorem 3 Let $t_1, \dots, t_m$ be defined as in the previous proof. Let $V_k$ be the set of good vertices $j$ with $k_0 \le j \le k$. Now for $j \in V_k$, $$\Pr[t_i = j] \le \frac{Am + m}{2mk} \le \frac{A}{k} \le \frac{A\epsilon^{-1}}{n},$$ and $\Pr[t_i \notin V_k] \ge k_0 m/(2mn) = \epsilon/2$, so $$\Pr[t_i = j_r \mid t_i \notin \{j_1, \dots, j_{r-1}\}] \le 2A\epsilon^{-2}/n = p.$$

Another Coupling Proof of Theorem 3 Let $T_i$ be a random subset of $V_k$ in which every vertex is picked independently with probability $p$. Since $p$ dominates every conditional probability $\Pr[t_i = j_r \mid t_i \notin \{j_1, \dots, j_{r-1}\}]$, we can couple $T_i$ and $t_i$ s.t. $t_i \in T_i$ whenever $t_i$ is good.

Another Coupling Proof of Theorem 3 Let $S_1$ be the set of good vertices adjacent to $k+1$ in $G_1$, and let $S_2 = T_1 \cup T_2 \cup \dots \cup T_m$. Then $S_2$ is a random subset of $V_k$ in which every vertex is picked w.p. $1 - (1-p)^m \le mp \le Cm/n$ for some large $C$. This completes the proof, since we've constructed $S_1 \subseteq S_2$.

Proof of Theorem 1(b) We're almost there. For a constant $a > 1$ we know that $G(n, a/n)$ whp has an independent set of size roughly $\frac{\log(a)}{a} n$ as $a \to \infty$. Now couple $G_1$ and $G_2$ as in Theorem 3, with $\epsilon = 1/2$, and assume $C > 2$. Look at the subgraph of $G_2$ induced by the vertices that arrive after time $n/2$. It has distribution $G(n/2, Cm/n) = G(n/2, \frac{Cm/2}{n/2})$, so it has an independent set of size about $\frac{n\log(m)}{Cm}$. Removing the bad vertices from this set, we see that what remains is also independent in $G_1$. So $G_1$ has an independent set of size $\frac{n\log(m)}{Cm} - \frac{n}{2m}$. By picking $m$ large enough we get the result.
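The independent-set fact used here can be eyeballed numerically. This is my own check, not part of the talk: even a single greedy pass over $G(n, a/n)$ finds an independent set on the order of $n\log(a)/a$ vertices; `greedy_independent_set` and the parameter values are illustrative.

```python
import random

def greedy_independent_set(n, p, rng):
    """Greedy pass over G(n, p): scan vertices in order and keep a vertex
    iff it has no edge to an already-kept vertex. The p-coins are flipped
    lazily, only for the pairs the greedy rule actually inspects; each pair
    is inspected at most once, so this matches generating the graph first."""
    chosen = []
    for v in range(n):
        if all(rng.random() >= p for _ in chosen):   # no edge into `chosen`
            chosen.append(v)
    return chosen

rng = random.Random(3)
n, a = 2_000, 8.0
indep = greedy_independent_set(n, a / n, rng)
# Heuristically the greedy set has on the order of n*log(a)/a vertices here.
```

This only illustrates the order of magnitude; the talk's argument needs the sharper statement that the maximum independent set reaches $\log(a)n/a$ for large $a$.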

The End Thank you! Questions?