Probabilistic Combinatorics. Jeong Han Kim



Tentative Plans

1. Basics for probability theory
   Coin tossing, expectation, linearity of expectation, probability vs. expectation, Boole's inequality
2. First moment (or expectation) method and Markov's inequality
   Applications to problems regarding: Property B, arithmetic progressions, covering hypercubes, Ramsey numbers, independence numbers of graphs, occupancy problems

3. Methods of conditional events
   Property B and recoloring (Beck's theorem), independence numbers of sparse graphs (AKS* theorem), covering hypercubes (K & Roche's theorem)
   *Ajtai, Komlós and Szemerédi
4. Second moment method (or Chebyshev's inequality)
   Applications to problems regarding: arithmetic progressions, random graphs, perfect matchings in random uniform hypergraphs, covering hypercubes, occupancy problems and Poisson approximation

5. Laws of large numbers
   Chernoff bounds, martingale inequalities, Talagrand's inequality
   Applications to problems regarding: Ramsey numbers, chromatic number of G(n, 1/2), incremental random method, cut-off line algorithm: matching of uniform random numbers in [0, 1]

6. Lovász Local Lemma
7. Incremental random methods
8. Branching processes
9. Poisson cloning model
10. Random regular graphs and contiguity

1 Basics for Probability Theory

1.1 Probabilities, Events and Random Variables

Tossing one coin:
X = 1 if HEADS, X = 0 if TAILS.
Thus Pr[X = 0] = Pr[X = 1] = 1/2.

Tossing two coins X_1, X_2:
Pr[X_1 = 0, X_2 = 0] = Pr[X_1 = 1, X_2 = 0] = Pr[X_1 = 0, X_2 = 1] = Pr[X_1 = 1, X_2 = 1] = 1/4.

Tossing n coins X_1, X_2, ..., X_n: for any (x_i) ∈ {0, 1}^n,

Pr[X_1 = x_1, X_2 = x_2, ..., X_n = x_n] = 1/2^n.

We say that X_1, ..., X_n are independent and identically distributed, or simply i.i.d.

Events

Event A: the 10th coin is HEADS, i.e. A = {X_10 = 1}.
Event B: the number of heads is k, i.e. B = {X_1 + ... + X_n = k}.

Probabilities of events

Pr[X_10 = 1] = 1/2, i.e. Pr[A] = 1/2.

Pr[X_1 + ... + X_n = k] = C(n, k) 2^{-n}, i.e. Pr[B] = C(n, k) 2^{-n},

where C(n, k) ("n choose k") is the number of ways to take k objects out of n objects:

C(n, k) = n! / (k! (n-k)!).

Tossing n biased coins: X_1, ..., X_n i.i.d. with

Pr[X_i = 0] = 1 - p and Pr[X_i = 1] = p.

Events and probabilities:

Pr[X_10 = 1] = p,
Pr[X_1 + ... + X_n = k] = C(n, k) p^k (1-p)^{n-k}.

Properties

For any two events A, B:

(a) If A ⊆ B, then Pr[A] ≤ Pr[B].
E.g. A = {X_5 = 1, X_11 = 0}, B = {X_11 = 0}. Then A ⊆ B and
p(1-p) = Pr[A] ≤ Pr[B] = 1 - p.

(b) Pr[A ∪ B] = Pr[A] + Pr[B] - Pr[A ∩ B].

In particular, Pr[A ∪ B] ≤ Pr[A] + Pr[B]. More generally (Boole's inequality), for events A_1, ..., A_m,

Pr[A_1 ∪ ... ∪ A_m] ≤ Σ_{i=1}^m Pr[A_i].

Random variables (RVs)

E.g. S = X_1 + ... + X_n, where X_1, ..., X_n are i.i.d. with Pr[X_1 = 0] = 1 - p and Pr[X_1 = 1] = p.

E.g. k (indistinguishable) balls and n bins: each ball is placed uniformly at random, so that

Pr[the i-th ball is in the j-th bin] = 1/n.

Define the RV X = # of balls in the first 10 bins.

Expectation of a nonnegative integer-valued RV X:

E[X] := Σ_{k≥0} k Pr[X = k].

E.g. for S = X_1 + ... + X_n,

E[S] = Σ_{k≥1} k Pr[S = k] = Σ_{k=1}^n k C(n, k) p^k (1-p)^{n-k} = pn.

Easy: Pr[X > 0] ≤ E[X], since

Pr[X > 0] = Σ_{k≥1} Pr[X = k] ≤ Σ_{k≥1} k Pr[X = k] = E[X].

Linearity of Expectation

For any RVs X and Y, E[X + Y] = E[X] + E[Y]. More generally, for RVs X_1, ..., X_n,

E[X_1 + ... + X_n] = E[X_1] + ... + E[X_n].

E.g.

E[S] = E[X_1 + ... + X_n] = E[X_1] + ... + E[X_n] = p + ... + p = pn.
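A quick numerical sanity check of E[S] = pn (a minimal sketch; n, p and the number of trials are arbitrary illustrative choices):

```python
import random

# Estimate E[S] for S = X_1 + ... + X_n with i.i.d. Bernoulli(p) coins,
# and compare with the value p*n given by linearity of expectation.
n, p, trials = 100, 0.3, 20000
total = 0
for _ in range(trials):
    total += sum(1 for _ in range(n) if random.random() < p)  # one sample of S
print("empirical E[S]:", total / trials, " linearity of expectation:", n * p)
```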

2 First Moment (Expectation) Method

For a nonnegative integer-valued RV X,

Pr[X > 0] = E[1(X > 0)] ≤ E[X · 1(X > 0)] = E[X];

in particular, if E[X] < 1, then Pr[X = 0] ≥ 1 - E[X] > 0.

On the other hand, if E[X] ≥ k then Pr[X ≥ k] > 0, so that there is an instance which makes X ≥ k.

2.1 Property B

A hypergraph H = (V, E) has Property B, or is 2-colorable, if there is a 2-coloring of V such that no edge of H is monochromatic.

Theorem. If a k-uniform hypergraph H has fewer than 2^{k-1} edges, then H has Property B.

Theorem. If a k-uniform hypergraph H has fewer than 2^{k-1} edges, then H has Property B.

Proof. Color each vertex, randomly and independently, either B (blue) or R (red) with equal probability. For each e ∈ H,

Pr[e is monochromatic] = 2^{-k+1},

which yields

E[# of monochromatic edges] = |H| · 2^{-k+1} < 1,

and hence Pr[no monochromatic edge] > 0. Thus there is a 2-coloring with no monochromatic edge.
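A small sketch of the argument in code. The hypergraph below and the retry loop are only for illustration (the theorem itself needs just one random coloring and the first moment bound); the input format, a list of k-element vertex sets, is an arbitrary choice.

```python
import random

def random_two_coloring(vertices, edges, max_tries=1000):
    """Try uniform random 2-colorings; return one with no monochromatic edge, if found."""
    for _ in range(max_tries):
        color = {v: random.randrange(2) for v in vertices}
        if all(len({color[v] for v in e}) == 2 for e in edges):
            return color
    return None

# A 3-uniform hypergraph with 3 < 2^(3-1) = 4 edges, so Property B is guaranteed.
vertices = range(5)
edges = [{0, 1, 2}, {1, 2, 3}, {2, 3, 4}]
print(random_two_coloring(vertices, edges))
```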

2.2 Arithmetic Progressions and the van der Waerden Number W(k)

An arithmetic progression (AP) with k terms in {1, ..., n}:

a, a + d, a + 2d, ..., a + (k-1)d ∈ {1, ..., n}.

Let W(k) be the least n such that, whenever {1, ..., n} is two-colored, there is a monochromatic AP with k terms.

van der Waerden ('27): W(3) = 9, W(4) = 35, W(5) = 178, ...; W(k) is FINITE for every k.

Theorem. W(k) ≥ 2^{k/2}.

Theorem. W(k) ≥ 2^{k/2}.

Proof. Two-color {1, ..., n} randomly, say Red and Blue, so that

Pr[i is colored Red] = Pr[i is colored Blue] = 1/2,

independently of all j ≠ i. For each k-term AP S in {1, ..., n},

Pr[S is monochromatic] = 2^{1-k}.

Since there are at most n^2/2 such S (WHY?), if n < 2^{k/2} then

E[# of monochromatic S] ≤ (n^2/2) · 2^{1-k} < 1,

and hence Pr[no monochromatic S] > 0.
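A minimal sketch that makes the counting explicit (k and n are arbitrary illustrative values): it enumerates all k-term APs in {1, ..., n}, checks that there are at most n^2/2 of them, and evaluates the expected number of monochromatic APs under a uniform random 2-coloring.

```python
def k_term_aps(n, k):
    """All k-term arithmetic progressions contained in {1, ..., n}."""
    aps = []
    for a in range(1, n + 1):
        for d in range(1, n):
            if a + (k - 1) * d > n:
                break
            aps.append([a + i * d for i in range(k)])
    return aps

k = 8
n = 2 ** (k // 2) - 1                 # any n < 2^(k/2)
aps = k_term_aps(n, k)
expected = len(aps) * 2 ** (1 - k)    # E[# of monochromatic k-term APs]
print(len(aps), "APs; bound n^2/2 =", n * n / 2, "; E[# monochromatic] =", expected)
```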

2.3 Covering the n-Cube

The n-cube Q^n: {-1, 1}^n = {(x_i) : x_i = 1 or -1, i = 1, ..., n}.

Let X_1, ..., X_m be (mutually) independent uniform random vectors in Q^n; in particular, Pr[X_j = u] = 2^{-n} for any u ∈ Q^n.

Theorem. If m = (1 + ε)n for ε > 0, then

Pr[∃ w ∈ Q^n with w · X_j > 0 for all j = 1, ..., m] ≤ 2^{-εn} → 0

as n → ∞.

Theorem. If m = (1 + ε)n for ε > 0, then

Pr[∃ w ∈ Q^n with w · X_j > 0 for all j = 1, ..., m] ≤ 2^{-εn} → 0

as n → ∞.

Proof. For w ∈ Q^n, let Y_w be the indicator RV of the event A_w that w · X_j > 0 for all j = 1, ..., m. Then

E[# of w with w · X_j > 0 for all j = 1, ..., m] = E[Σ_{w ∈ Q^n} Y_w] = Σ_{w ∈ Q^n} Pr[A_w].

As the X_j's are mutually independent,

Pr[A_w] = Pr[w · X_j > 0 for all j = 1, ..., m] = Π_{j=1}^m Pr[w · X_j > 0] ≤ (1/2)^m,

since w · X_j and -(w · X_j) have the same distribution. Therefore the probability in question is bounded by 2^{n-m} = 2^{-εn}.
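A small numerical illustration (a sketch only: n, ε and the number of trials are arbitrary, and the exhaustive search over all w ∈ {-1, 1}^n is feasible only for small n). It estimates the probability that some w has positive inner product with every X_j and compares it with the bound 2^{-εn}.

```python
import itertools, random

def exists_positive_witness(n, m):
    """Sample X_1, ..., X_m uniformly from {-1,1}^n; check whether some w in {-1,1}^n
    satisfies w . X_j > 0 for every j."""
    xs = [[random.choice((-1, 1)) for _ in range(n)] for _ in range(m)]
    for w in itertools.product((-1, 1), repeat=n):
        if all(sum(wi * xi for wi, xi in zip(w, x)) > 0 for x in xs):
            return True
    return False

n, eps, trials = 12, 0.25, 100
m = int((1 + eps) * n)
hits = sum(exists_positive_witness(n, m) for _ in range(trials))
print("empirical probability:", hits / trials, " bound 2^(-eps*n) =", 2 ** (-eps * n))
```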

2.4 Ramsey Numbers R(s, t)

Recall, for a graph G:

ω(G): the clique number of G (the size of a largest clique),
α(G): the independence number of G (the size of a largest independent set).

R(s, t) := min{n : for every graph G on n vertices, ω(G) ≥ s or α(G) ≥ t}.

EASY: R(s, t) = R(t, s), R(2, t) = t.

Greenwood & Gleason ('55): R(3, 3) = 6, R(3, 4) = 9, R(3, 5) = 14, R(4, 4) = 18.

MORE: R(3, 6) = 18, R(3, 7) = 23, R(3, 8) = 28, R(3, 9) = 36, R(4, 5) = 25, 43 ≤ R(5, 5) ≤ 49.

There are 2^{C(n,2)} graphs on n vertices, where C(n, 2) = n(n-1)/2. For example, if n = 28,

2^{C(28,2)} = 2^{378} ≈ 0.6156563648 × 10^{114}.

Ramsey ('30): R(s, t) is FINITE.

Skolem ('33), Erdős and Szekeres ('35):

R(s, t) ≤ R(s-1, t) + R(s, t-1)   and   R(s, t) ≤ C(s+t-2, s-1).

Theorem. If

C(n, t) · 2^{1 - C(t,2)} < 1,

then R(t, t) > n.

Proof. Random graph G = G(n, 1/2): each of the C(n, 2) edges of K_n is in G with probability 1/2, independently of all other edges.

For each vertex subset T of size t, let A_T be the event that T is a clique in G. Then

Pr[ω(G) ≥ t] ≤ Pr[∪_T A_T] ≤ Σ_T Pr[A_T].

Since T has C(t, 2) edges in it,

Pr[A_T] = 2^{-C(t,2)}

and hence

Pr[ω(G) ≥ t] ≤ C(n, t) · 2^{-C(t,2)}.

Similarly,

Pr[α(G) ≥ t] ≤ C(n, t) · 2^{-C(t,2)}.

Thus

Pr[ω(G) ≥ t or α(G) ≥ t] ≤ Pr[ω(G) ≥ t] + Pr[α(G) ≥ t] ≤ C(n, t) · 2^{1 - C(t,2)} < 1,

which implies that Pr[ω(G) < t and α(G) < t] > 0; in particular, there is at least one such graph.
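A short computation of the lower bound this theorem certifies (a sketch; t = 20 is an arbitrary example). It finds the largest n with C(n, t) · 2^{1 - C(t,2)} < 1, so that R(t, t) > n.

```python
from math import comb

def ramsey_diagonal_lower_bound(t):
    """Largest n with C(n, t) * 2^(1 - C(t,2)) < 1, giving R(t, t) > n."""
    n = t
    while comb(n + 1, t) < 2 ** (comb(t, 2) - 1):  # same inequality, kept in integers
        n += 1
    return n

t = 20
print(f"R({t},{t}) >", ramsey_diagonal_lower_bound(t))
```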

Using Stirling's formula,

n! = √(2πn) · e^{ε_n} · (n/e)^n,   where 1/(12n+1) < ε_n < 1/(12n),

this gives

R(t, t) > n = (1 + o(1)) (t/e) 2^{(t-1)/2} = (1 + o(1)) (t/(e√2)) 2^{t/2}.

Hence

(1 + o(1)) (t/(e√2)) 2^{t/2} ≤ R(t, t) ≤ C(2t-2, t-1) ≤ c · 4^t/√t.

BIG open problem: lim_{t→∞} R(t, t)^{1/t} = ???
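A worked sketch of where the asymptotic value of n comes from, using the elementary bound C(n, t) ≤ (en/t)^t in place of the full Stirling computation (the constant in front is then slightly weaker, but the leading term is the same):

```latex
\binom{n}{t}\, 2^{\,1-\binom{t}{2}}
  \le \Bigl(\tfrac{en}{t}\Bigr)^{t} 2^{\,1-t(t-1)/2} < 1
  \quad\text{as soon as}\quad
  \tfrac{en}{t} < 2^{(t-1)/2 - 1/t},
  \quad\text{i.e. for all } n \le (1-o(1))\,\tfrac{t}{e}\, 2^{(t-1)/2}.
```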

Even the existence of the limit is not known; we only know

√2 ≤ lim inf R(s, s)^{1/s} ≤ lim sup R(s, s)^{1/s} ≤ 4.

Theorem. If, for some 0 ≤ p ≤ 1,

C(n, s) p^{C(s,2)} + C(n, t) (1-p)^{C(t,2)} < 1,

then R(s, t) > n.

Proof. Exercise (take each edge independently with probability p).

For fixed s, this gives

R(s, t) ≥ c_s (t/log t)^{(s-1)/2}.
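A sketch of how this bound can be evaluated numerically (s, t and the grid of p values are arbitrary choices; logarithms are used to keep the terms manageable):

```python
from math import comb, log, exp

def condition_holds(n, s, t, p):
    """Check C(n,s) p^C(s,2) + C(n,t) (1-p)^C(t,2) < 1 via logarithms."""
    term1 = log(comb(n, s)) + comb(s, 2) * log(p)
    term2 = log(comb(n, t)) + comb(t, 2) * log(1 - p)
    return exp(term1) + exp(term2) < 1

def best_n(s, t):
    """Largest n for which some p on a coarse grid certifies R(s, t) > n."""
    n = max(s, t)
    while any(condition_holds(n + 1, s, t, p / 1000) for p in range(1, 1000)):
        n += 1
    return n

print("R(3, 20) >", best_n(3, 20))
```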

2.5 Independence Numbers of Graphs

Theorem (Turán). For a graph G = (V, E) with |V| = n and average degree t(G) = (1/n) Σ_{v ∈ V} d(v),

α(G) ≥ n/(t(G) + 1).

(A probabilistic) Proof. Randomly order all vertices of G. Take the first vertex v_1 and delete all vertices in its neighborhood N(v_1). Take the next undeleted vertex and do the same, and so on.

Let I be the independent set obtained. It is enough to show that

E[|I|] ≥ n/(t + 1).

Note that

E[|I|] = E[Σ_v 1(v ∈ I)] = Σ_v Pr[v ∈ I],

where 1(v ∈ I) = 1 if v ∈ I and 0 if v ∉ I.

If a vertex v precedes all vertices in N(v) in the random order, then v ∈ I. The corresponding probability is 1/(d(v) + 1), so

Pr[v ∈ I] ≥ 1/(d(v) + 1).

Thus

E[|I|] ≥ Σ_v 1/(d(v) + 1) ≥ n/((1/n) Σ_v d(v) + 1) = n/(t + 1)

by Jensen's inequality.
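A minimal sketch of the random-order greedy procedure from the proof (the random graph used as input is just an arbitrary test case):

```python
import random

def greedy_independent_set(adj):
    """Scan the vertices in a uniformly random order; keep v if no neighbor of v
    has been kept yet. The kept set is always independent."""
    order = list(adj)
    random.shuffle(order)
    kept = set()
    for v in order:
        if not any(u in kept for u in adj[v]):
            kept.add(v)
    return kept

# Compare the greedy set with the Turan bound n/(t+1) on a random graph G(n, p).
n, p = 200, 0.05
adj = {v: set() for v in range(n)}
for u in range(n):
    for v in range(u + 1, n):
        if random.random() < p:
            adj[u].add(v); adj[v].add(u)
t = sum(len(adj[v]) for v in adj) / n                       # average degree
sizes = [len(greedy_independent_set(adj)) for _ in range(50)]
print("avg |I| =", sum(sizes) / len(sizes), " Turan bound n/(t+1) =", n / (t + 1))
```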

2.6 Markov's Inequality

Theorem 1 (Markov's inequality). For a random variable X ≥ 0 and λ > 0,

Pr[X ≥ λ] ≤ E[X]/λ.

Proof.

Pr[X ≥ λ] = E[1(X ≥ λ)] ≤ E[(X/λ) 1(X ≥ λ)] ≤ E[X/λ] = E[X]/λ,

where 1(X ≥ λ) = 1 if X ≥ λ and 0 otherwise. Note that E[1(X ≥ λ)] = Pr[X ≥ λ].
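A tiny numerical illustration (a sketch with an arbitrary binomial distribution): compare Pr[X ≥ λ] with the Markov bound E[X]/λ.

```python
from math import comb

n, p, lam = 30, 0.2, 12
pmf = [comb(n, k) * p**k * (1 - p)**(n - k) for k in range(n + 1)]
tail = sum(pmf[lam:])          # Pr[X >= lambda]
mean = n * p                   # E[X]
print("Pr[X >= %d] = %.4f  <=  E[X]/lambda = %.4f" % (lam, tail, mean / lam))
```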

2.7 Occupancy Problems

Occupancy problems: insert each of m balls into one of n distinct bins uniformly at random. Two questions:

What is the maximum number of balls in any bin?
What is the expected number of bins containing k balls?

Consider the case m = n. For i = 1, 2, ..., n, let X_i = # of balls in the i-th bin. Then E[X_i] = ??

Note that

Pr[X_i = 1] = C(n, 1) (1/n) (1 - 1/n)^{n-1},

and in general,

Pr[X_i = j] = C(n, j) (1/n)^j (1 - 1/n)^{n-j}.

Thus

E[X_i] = Σ_{j=1}^n j · C(n, j) (1/n)^j (1 - 1/n)^{n-j} = 1.

Or: since Σ_{i=1}^n X_i = n, in particular Σ_{i=1}^n E[X_i] = n, and all the E[X_i]'s are the same, so E[X_i] = 1.

Recall m = n and X_i is the number of balls in the i-th bin, i = 1, 2, ..., n.

For which value k does no bin receive more than k balls, with high probability?

Since

Pr[X_i = j] = C(n, j) (1/n)^j (1 - 1/n)^{n-j},

and Stirling's formula gives

j! = (1 + O(1/j)) √(2πj) (j/e)^j,

we have

Pr[X_i ≥ l] ≤ (1 + O(1/l)) Σ_{j≥l} (e/j)^j ≤ c (e/l)^l.

Thus (e/l)^l ≤ n^{-2} implies that

Pr[∃ i : X_i ≥ l] ≤ Σ_i Pr[X_i ≥ l] ≤ n · c n^{-2} = c/n.

Theorem. With probability at least 1 - O(1/n), no bin has more than (e ln n)/ln ln n balls in it.
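A quick simulation sketch (n is an arbitrary choice) comparing the observed maximum load with the (e ln n)/ln ln n threshold:

```python
import random
from math import e, log

n = 100000
bins = [0] * n
for _ in range(n):                       # throw n balls into n bins uniformly at random
    bins[random.randrange(n)] += 1
print("max load:", max(bins), " threshold e*ln(n)/ln(ln(n)) =", e * log(n) / log(log(n)))
```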

Exercise. For m = n log n, show that with probability 1 - o(1) every bin contains O(log n) balls.

3 Methods of Conditional Events

3.1 Property B and Recoloring

Recall that a hypergraph H = (V, E) has Property B, or is 2-colorable, if there is a 2-coloring of V such that no edge of H is monochromatic.

We have proved: any k-uniform hypergraph H with |H| < 2^{k-1} has Property B.

Theorem (Beck '78). If a k-uniform hypergraph H has at most c k^{1/3} (ln k)^{-1/2} 2^{k-1} edges, then H has Property B, for any constant c < √3.

Remark (Radhakrishnan & Srinivasan '99): If

|H| ≤ 0.7 (k/ln k)^{1/2} 2^k,

then H has Property B.

On the other hand, there is a non-2-colorable k-uniform hypergraph with c k^2 2^k edges.

Proof. Exercise. (Take m = c k^2 2^k random k-subsets of {1, ..., k^2} and use the first moment method.)

Let f(k) = min{m : there is a non-2-colorable k-uniform hypergraph with m edges}. Then

0.7 (k/ln k)^{1/2} 2^k ≤ f(k) ≤ c k^2 2^k.

Proof (of Beck's theorem). Let p = ln k/(3k) and t = c k^{1/3} (ln k)^{-1/2}, where c is a constant < √3. Then

2t e^{-pk} + t^2 p e^{pk} < 1.

We need to show that if

|H| ≤ t 2^{k-1},

then H has Property B.

Color each vertex, randomly and independently, either B (blue) or R (red) with equal probability (as before). Call this the first coloring.

Let W be the set of all vertices which belong to at least one monochromatic edge. Independently change the color of each v ∈ W with probability p. Call this the recoloring.

An edge e can be monochromatic after the recoloring for one of two reasons.

1. Event A_e: e is monochromatic in both the first coloring and the recoloring.

Pr[A_e] = Pr[e monochromatic in the first coloring] × (Pr[no color in e is changed (in the recoloring)] + Pr[all colors in e are changed (in the recoloring)])
= 2^{1-k} ((1-p)^k + p^k) ≤ 2^{1-k} · 2(1-p)^k ≤ 2^{2-k} e^{-pk},

and

Pr[∃ such e] ≤ |H| · 2^{2-k} e^{-pk} ≤ 2t e^{-pk}.

2. Event B_e: there is ∅ ≠ U_0 ⊆ e such that

in the first coloring, e \ U_0 was red and U_0 was blue, and
in the recoloring, all colors in e \ U_0 remain the same and all colors in U_0 are changed,

or the same with red/blue reversed.

Notice that this case also requires at least one edge, say f, with f ≠ e, f ∩ e ⊆ U_0, and f blue monochromatic (in the first coloring).

Let U = U_0 \ f. Then the case implies that there exist f and U (possibly ∅) with e ∩ f ≠ ∅ and U ⊆ e \ f such that the event B_efU occurs, namely:

in the first coloring, e \ (U ∪ f) was red and U ∪ f was blue, and
in the recoloring, all colors in e \ (U ∪ f) remain the same and all colors in U ∪ (e ∩ f) are changed,

or the same with red/blue reversed.

Since Pr[B efu ] 2 1 2k+ e f p e f + U, we have Pr[ efu B efu ] e Pr[B efu ]. f:e f U:U e\f It follows that Pr[B efu ] U:U e\f k e f u=0 ( ) k e f 2 1 2k+ e f p e f +u u = 2 1 2k+ e f p e f (1 + p) k e f 2 1 2k e pk( 2p 1 + p ) e f. 50

Thus

Σ_{U ⊆ e\f} Pr[B_efU] ≤ 2^{1-2k} e^{pk} (2p/(1+p))^{|e∩f|} ≤ 2^{2-2k} p e^{pk}

(since |e∩f| ≥ 1), which yields

Pr[∪_e B_e] ≤ Pr[∪_{e,f,U} B_efU] ≤ Σ_e Σ_{f: e∩f≠∅} 2^{2-2k} p e^{pk} ≤ |H|^2 · 2^{2-2k} p e^{pk} ≤ t^2 p e^{pk},

and hence

Pr[∪_e A_e ∪ ∪_{e,f,U} B_efU] ≤ Pr[∪_e A_e] + Pr[∪_{e,f,U} B_efU] ≤ 2t e^{-pk} + t^2 p e^{pk} < 1.
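A sketch of the two-phase coloring procedure behind the proof (purely illustrative: the tiny hypergraph, k = 4, and the retry loop are arbitrary choices; p = ln k/(3k) follows the proof):

```python
import random
from math import log

def two_phase_coloring(edges, k):
    """One round: uniform random 2-coloring, then blind recoloring (with prob. p)
    of every vertex lying in some monochromatic edge. Returns the final coloring."""
    p = log(k) / (3 * k)
    vertices = set().union(*edges)
    color = {v: random.randrange(2) for v in vertices}
    bad = [e for e in edges if len({color[v] for v in e}) == 1]   # monochromatic edges
    W = set().union(*bad) if bad else set()
    for v in W:
        if random.random() < p:
            color[v] ^= 1                                         # flip the color
    return color

# Tiny 4-uniform example; repeat rounds until no edge is monochromatic.
edges = [frozenset(s) for s in ([0,1,2,3], [2,3,4,5], [4,5,6,7], [1,3,5,7], [0,2,4,6])]
while True:
    c = two_phase_coloring(edges, 4)
    if all(len({c[v] for v in e}) == 2 for e in edges):
        print(c)
        break
```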

3.2 Independence Numbers of Sparse Graphs

Theorem (Ajtai, Komlós and Szemerédi '81). For a triangle-free graph G,

α(G) ≥ E[|I|] ≥ c n (log t)/t,

where I is the random independent set defined below and t = t(G) is the average degree.

Ex. This implies that R(3, t) ≤ c t^2/log t.

For a graph G, let I(G) denote the family of all independent sets of G. Take an independent set I ∈ I(G) uniformly at random, so that

Pr[I = J] = 1/|I(G)| for each J ∈ I(G).

I: a (uniformly) random independent set.

Theorem (Shearer). For a K_r-free graph G,

E[|I|] ≥ c_r n (log t)/(t log log t).

Proof of the AKS Theorem (Alon's version of Shearer's proof). Assume maxdeg(G) ≤ 2t (WHY?). For v ∈ V(G), let

X_v = t if v ∈ I, and X_v = |N(v) ∩ I| otherwise,

or equivalently,

X_v := t · 1(v ∈ I) + Σ_{w ~ v} 1(w ∈ I).

Note that

Σ_{v ∈ V(G)} X_v = t|I| + Σ_{v ∈ V(G)} Σ_{w ~ v} 1(w ∈ I) ≤ t|I| + 2t|I| = 3t|I|,

since each w ∈ I is counted d(w) ≤ 2t times in the double sum.

It is enough to show that E[X_v] ≥ c log t, for this implies

3t E[|I|] ≥ E[Σ_v X_v] ≥ c n log t.

Let J = I \ ({v} ∪ N(v)); we will show that E[X_v | J] ≥ c log t.

Conditioned on J, the possible values of I ∩ ({v} ∪ N(v)) are {v} and the subsets of H, where H is the set of vertices of N(v) with no neighbor in J (as G is triangle-free, N(v), and hence H, is an independent set). Writing r = |H|, each of these 1 + 2^r possibilities is equally likely, so

E[X_v | J] = t/(1 + 2^r) + (r/2) · 2^r/(1 + 2^r).

If 2^r ≤ t/log t - 1, then t/(1 + 2^r) ≥ log t. If 2^r ≥ t/log t - 1, then

(r/2) · 2^r/(1 + 2^r) ≥ (1 - o(1)) r/2 ≥ (1 - o(1)) (log t - log log t)/2.

In either case E[X_v | J] ≥ c log t, as required.