Stanford University, Spring 2016
Math 233: Non-constructive methods in combinatorics
Instructor: Jan Vondrák
Lecture date: April 6, 2016
Scribe: László Miklós Lovász

Lecture 5. Shearer's Lemma

5.1 Introduction and motivation

Suppose we are given a dependency graph $G$ with vertex set $V(G) = [n]$, and probabilities $p_1, p_2, \ldots, p_n$. The Lovász Local Lemma gives a sufficient condition ensuring that a set of events $E_1, E_2, \ldots, E_n$ with dependencies consistent with $G$ and satisfying $\mathbb{P}(E_i) \le p_i$ does not cover the entire probability space, that is,

$$\mathbb{P}\Big[ \bigcap_{i=1}^{n} \overline{E_i} \Big] > 0.$$

This theorem is not sharp, however. Suppose, for example, that $G = K_n$, the complete graph on $n$ vertices, and all the $p_i$ are equal to $p$. Then the condition given by the LLL is $p \le \frac{1}{en}$, whereas the union bound already shows that $p < \frac{1}{n}$ is sufficient. Shearer's Lemma gives a necessary and sufficient condition. The downside is that, unlike the Lovász Local Lemma, the conditions of Shearer's Lemma are often very difficult to check.

5.2 Definitions and Statement of the Lemma

In order to state Shearer's Lemma, we need to define some multivariate polynomials related to the independent sets of a graph. We think of the graph $G$ as fixed, and we denote by $p$ the vector $(p_i : i \in V(G))$.

Definition 5.1 Given a graph $G$, we denote by $\mathrm{Ind}(G)$ the family of independent sets of $G$, that is,

$$\mathrm{Ind}(G) = \{ I \subseteq V(G) : I \text{ contains no edges} \}.$$

Definition 5.2 Given a graph $G$, we define a polynomial $q_I$ over $\mathbb{R}^{V(G)}$ for any $I \subseteq V(G)$ as

$$q_I(p) = \sum_{J \in \mathrm{Ind}(G),\; J \supseteq I} (-1)^{|J \setminus I|} \prod_{j \in J} p_j.$$

Note that $q_I = 0$ unless $I \in \mathrm{Ind}(G)$.

Definition 5.3 Given a graph $G$, we define a polynomial $q_S$ over $\mathbb{R}^{V(G)}$ for any $S \subseteq V(G)$ as

$$q_S(p) = \sum_{J \in \mathrm{Ind}(G),\; J \subseteq S} (-1)^{|J|} \prod_{j \in J} p_j.$$

Note that Definitions 5.2 and 5.3 give two different families of polynomials; which one is meant will always be clear from the role of the index (an independent set $I$ for Definition 5.2, an arbitrary vertex subset $S$ for Definition 5.3). In particular, $q_\emptyset$ in the sense of Definition 5.2 coincides with $q_{V(G)}$ in the sense of Definition 5.3, since both equal $\sum_{J \in \mathrm{Ind}(G)} (-1)^{|J|} \prod_{j \in J} p_j$.
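For concreteness, here is a small worked example that is not part of the original notes. Let $G$ be the path on $\{1,2,3\}$ with edges $\{1,2\}$ and $\{2,3\}$, so $\mathrm{Ind}(G) = \{\emptyset, \{1\}, \{2\}, \{3\}, \{1,3\}\}$. Definition 5.2 gives

$$q_\emptyset(p) = 1 - p_1 - p_2 - p_3 + p_1 p_3, \qquad q_{\{1\}}(p) = p_1 - p_1 p_3, \qquad q_{\{1,3\}}(p) = p_1 p_3,$$

while Definition 5.3 gives, for instance, $q_{\{1,2\}}(p) = 1 - p_1 - p_2$ and $q_{V(G)}(p) = 1 - p_1 - p_2 - p_3 + p_1 p_3$, which indeed equals $q_\emptyset(p)$ above.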

We are now ready to state Shearer's Lemma.

Lemma 5.4 (Shearer's Lemma) Given a graph $G$ on $n$ vertices and $p \in (0,1)^n$, the following are equivalent.

1. For every $I \in \mathrm{Ind}(G)$: $q_I(p) > 0$.
2. For every $S \subseteq V(G)$: $q_S(p) > 0$.
3. For any events $E_1, E_2, \ldots, E_n$ for which $G$ is a dependency graph, if $\mathbb{P}[E_i] \le p_i$ for each $i$, then
$$\mathbb{P}\Big[ \bigcap_{i=1}^{n} \overline{E_i} \Big] > 0.$$

Moreover, if these conditions are satisfied, then

$$\mathbb{P}\Big[ \bigcap_{i=1}^{n} \overline{E_i} \Big] \ge q_\emptyset(p) = q_{V(G)}(p).$$

Example 5.5 Suppose again that $G = K_n$. Then the independent sets are the empty set and the singletons. The probability of avoiding $\bigcup_{i=1}^{n} E_i$ is at least

$$q_\emptyset(p) = \sum_{I \in \mathrm{Ind}(K_n)} (-1)^{|I|} \prod_{i \in I} p_i = 1 - \sum_{i=1}^{n} p_i,$$

which is exactly the union bound. (For a singleton $\{i\}$, $q_{\{i\}}(p) = p_i > 0$ by assumption.)

Example 5.6 If $G$ is the empty graph, then we have $q_\emptyset(p) = \prod_{i=1}^{n} (1 - p_i) > 0$, which is indeed the probability of avoiding $n$ independent events.
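As a quick numerical sanity check of Definition 5.2 and of the two examples above (this sketch is not part of the original notes, and the helper names are my own), one can enumerate $\mathrm{Ind}(G)$ by brute force:

```python
from itertools import combinations
from math import prod

def independent_sets(n, edges):
    """All independent sets of the graph ([n], edges), as frozensets."""
    return [frozenset(J) for r in range(n + 1) for J in combinations(range(n), r)
            if not any(u in J and v in J for (u, v) in edges)]

def q_ind(I, n, edges, p):
    """Definition 5.2: sum over independent J containing I of (-1)^{|J \\ I|} * prod_{j in J} p_j."""
    return sum((-1) ** len(J - I) * prod(p[j] for j in J)
               for J in independent_sets(n, edges) if I <= J)

n, p = 4, [0.2, 0.1, 0.15, 0.05]
K_n = [(i, j) for i in range(n) for j in range(i + 1, n)]   # complete graph K_4
empty = []                                                   # empty graph on 4 vertices

# Example 5.5: on K_n, q_emptyset(p) = 1 - sum(p_i), the union bound.
print(q_ind(frozenset(), n, K_n, p), 1 - sum(p))
# Example 5.6: on the empty graph, q_emptyset(p) = prod(1 - p_i).
print(q_ind(frozenset(), n, empty, p), prod(1 - x for x in p))
```

Both printed pairs agree, matching the two examples.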

5.3 Proof of Shearer's Lemma

As before, we denote

$$P_S = \mathbb{P}\Big[ \bigcap_{i \in S} \overline{E_i} \Big].$$

We will also omit the argument $p$ in the polynomials. Further, we use the following shorthand notation: $S - a = S \setminus \{a\}$, $S + a = S \cup \{a\}$, and $p_I = \prod_{i \in I} p_i$.

First, we show that Property 2 implies Property 3 (which is the main implication, analogous to the LLL).

Proof: [2 ⇒ 3] We will prove that in fact $P_S \ge q_S$ for all $S \subseteq [n]$. For $S = \emptyset$, both of these are clearly equal to $1$. Inductively, we want to prove that for any $a \in S \subseteq [n]$, we have

$$\frac{P_S}{P_{S-a}} \ge \frac{q_S}{q_{S-a}}. \qquad (5.1)$$

By a telescoping product over the elements of $S$, this implies that $P_S \ge q_S$, and the desired implication follows. (Note that by Property 2 all the $q_S$ are positive, so the ratios are well defined.)

We will prove (5.1) by induction on the size of $S$. Pick any $a \in S$. As in the proof of the LLL, we know that

$$P_S \ge P_{S-a} - p_a P_{S \setminus \Gamma^+(a)}. \qquad (5.2)$$

Note that each independent set containing $a$ consists of $\{a\}$ and an independent subset of $V(G) \setminus \Gamma^+(a)$. By separating independent sets based on whether they contain $a$ or not, we have

$$q_S(p) = \sum_{J \in \mathrm{Ind}(G),\; J \subseteq S} (-1)^{|J|} p_J = \sum_{J \in \mathrm{Ind}(G),\; J \subseteq S-a} (-1)^{|J|} p_J + \sum_{J \in \mathrm{Ind}(G),\; J \subseteq S \setminus \Gamma^+(a)} (-1)^{|J+a|} p_{J+a} = q_{S-a}(p) - p_a\, q_{S \setminus \Gamma^+(a)}(p). \qquad (5.3)$$

Note the similarity between (5.2) and (5.3). By the inductive hypothesis, we can assume (5.1) for each element in $(S-a) \setminus (S \setminus \Gamma^+(a))$, and by a telescoping product we obtain

$$\frac{P_{S \setminus \Gamma^+(a)}}{P_{S-a}} \le \frac{q_{S \setminus \Gamma^+(a)}}{q_{S-a}}.$$

Combining this with (5.2) and (5.3), we deduce that

$$\frac{P_S}{P_{S-a}} \ge 1 - p_a \frac{P_{S \setminus \Gamma^+(a)}}{P_{S-a}} \ge 1 - p_a \frac{q_{S \setminus \Gamma^+(a)}}{q_{S-a}} = \frac{q_S}{q_{S-a}},$$

which completes the proof.
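The recurrence (5.3) is the combinatorial core of the argument. The short self-contained Python sketch below (my own illustration, not from the notes) verifies it for every pair $(S, a)$ on a small random graph:

```python
import random
from itertools import combinations
from math import prod, isclose

def q_sub(S, n, edges, p):
    """Definition 5.3: sum over independent J contained in S of (-1)^{|J|} * prod_{j in J} p_j."""
    ind = [set(J) for r in range(n + 1) for J in combinations(range(n), r)
           if not any(u in J and v in J for (u, v) in edges)]
    return sum((-1) ** len(J) * prod(p[j] for j in J) for J in ind if J <= S)

random.seed(0)
n = 5
edges = [(i, j) for i in range(n) for j in range(i + 1, n) if random.random() < 0.5]
p = [random.uniform(0.05, 0.3) for _ in range(n)]

# Check (5.3):  q_S = q_{S-a} - p_a * q_{S \ Gamma^+(a)}  for every S and every a in S.
for r in range(1, n + 1):
    for S in map(set, combinations(range(n), r)):
        for a in S:
            gamma_plus = {a} | {v for e in edges if a in e for v in e}   # closed neighborhood of a
            lhs = q_sub(S, n, edges, p)
            rhs = q_sub(S - {a}, n, edges, p) - p[a] * q_sub(S - gamma_plus, n, edges, p)
            assert isclose(lhs, rhs, abs_tol=1e-12)
print("recurrence (5.3) verified on a random graph")
```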

Proof: [1 ⇔ 2] Next, we show that Properties 1 and 2 are equivalent. To show that 2 implies 1, note that the independent sets containing an independent set $I$ are exactly the sets $I \cup L$ with $L \in \mathrm{Ind}(G)$, $L \subseteq V \setminus \Gamma^+(I)$, so

$$q_I(p) = \sum_{L \in \mathrm{Ind}(G),\; L \subseteq V \setminus \Gamma^+(I)} (-1)^{|L|} p_I\, p_L = p_I\, q_{V \setminus \Gamma^+(I)}(p). \qquad (5.4)$$

This is clearly positive if each $p_i$ and each $q_S$ is positive. For the reverse, we claim that

$$q_S = \sum_{I \in \mathrm{Ind}(G),\; I \cap S = \emptyset} q_I. \qquad (5.5)$$

This is clearly positive if $q_I$ is positive for every $I \in \mathrm{Ind}(G)$ (including $I = \emptyset$, which is always present in the sum). To prove (5.5), we use an inclusion-exclusion calculation:

$$\sum_{I \cap S = \emptyset} q_I(p) = \sum_{I \cap S = \emptyset} \; \sum_{J \in \mathrm{Ind}(G),\; J \supseteq I} (-1)^{|J \setminus I|} p_J = \sum_{J \in \mathrm{Ind}(G)} (-1)^{|J|} p_J \sum_{I \subseteq J \setminus S} (-1)^{|I|} = \sum_{J \in \mathrm{Ind}(G),\; J \setminus S = \emptyset} (-1)^{|J|} p_J = q_S(p),$$

since the inner summation $\sum_{I \subseteq J \setminus S} (-1)^{|I|}$ is zero unless $J \setminus S = \emptyset$. We have thus proven that Properties 1 and 2 are equivalent.

Last, we need to show that Property 3 implies the other two. In other words, we want to show that if there exists an independent set $I$ such that $q_I(p) \le 0$, then we can find events $E_i$, $i \in V$, consistent with $G$, with $\mathbb{P}[E_i] \le p_i$, such that the events cover the entire probability space. We also show that if Shearer's conditions are satisfied, then there is an instance such that $\mathbb{P}[\bigcap_{i=1}^{n} \overline{E_i}] = q_\emptyset(p)$; i.e., Shearer's lemma is tight.

Note that if $p = 0$, then $q_S(p) = 1$ for each $S$. This means that, since we have a finite set of continuous functions (polynomials), there exists an $\epsilon > 0$ such that if each $p_i \le \epsilon$, then each $q_S(p) > 0$. If we further assume that for each $i$ we have $0 < p_i < \epsilon$, then we know from (5.4) that we must in fact have $q_I(p) > 0$ for each independent set $I$. Let the Shearer region be

$$\mathcal{S} = \{ p \in (0,1)^n : \forall I \in \mathrm{Ind}(G),\ q_I(p) > 0 \}.$$

Look at $\lambda p$ for $\lambda \in (0,1)$. We know that there is $\delta > 0$ such that if $0 < \lambda < \delta$, then $\lambda p \in \mathcal{S}$. Also, in the case we are interested in, $\lambda p \notin \mathcal{S}$ for some $\lambda \in (0,1]$ (for $p \notin \mathcal{S}$, we can take $\lambda = 1$). Let

$$\lambda^* = \sup\{ \lambda \in (0,1] : \lambda p \in \mathcal{S} \}, \qquad p^* = \lambda^* p.$$

Then, since $\mathcal{S}$ is clearly an open set, $p^*$ is not in it. However, by continuity of the polynomials, we must have $q_I(p^*) \ge 0$ for all $I$ (and in addition, by (5.5), $q_S(p^*) \ge 0$ for all $S$). From (5.3), we can see that for any $a \in S \subseteq V$, we have $0 \le q_S(p^*) \le q_{S-a}(p^*)$, so $q_{V(G)}(p^*) \le q_S(p^*)$ for every $S$. Since $p^* \notin \mathcal{S}$, we have $q_I(p^*) = 0$ for some independent set $I$; by (5.4) this forces $q_{V \setminus \Gamma^+(I)}(p^*) = 0$ as well, and hence we must have $q_{V(G)}(p^*) = q_\emptyset(p^*) = 0$. Hence, for any point $p$, either $p \in \mathcal{S}$, in which case $q_\emptyset(\lambda p) > 0$ for all $\lambda \in [0,1]$, or $p \notin \mathcal{S}$, in which case $q_\emptyset(\lambda p) = 0$ for some $\lambda \in [0,1]$.

Remark 5.7 Considering the discussion above, we can now add a Property 4, equivalent to the conditions of Shearer's lemma:

4. For all $\lambda \in [0,1]$: $q_\emptyset(\lambda p) > 0$.

Clearly Property 3 is downward closed (a family of events with $\mathbb{P}[E_i] \le p_i^* \le p_i$ is also a valid family for $p$), so if $p \notin \mathcal{S}$ and we construct a set of events for $p^* = \lambda^* p$ that disproves Property 3, then we have also constructed a set of events for $p$ that disproves Property 3. We will thus replace $p$ with $p^*$ in what follows, and we will assume that each $q_I(p) \ge 0$ and each $q_S(p) \ge 0$.
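To make the discussion of the ray $\lambda p$ and Remark 5.7 concrete, here is a small Python sketch (again my own illustration; the graphs and numbers are arbitrary choices) that scans $\lambda \in (0,1]$ and reports the first point at which $q_{V(G)}(\lambda p)$ drops to $0$ or below, i.e., where the ray exits the Shearer region:

```python
from itertools import combinations
from math import prod

def q_V(n, edges, p):
    """q_{V(G)}(p) = q_emptyset(p): alternating sum over all independent sets of G."""
    ind = [J for r in range(n + 1) for J in combinations(range(n), r)
           if not any(u in J and v in J for (u, v) in edges)]
    return sum((-1) ** len(J) * prod(p[j] for j in J) for J in ind)

def first_exit(n, edges, p, steps=10000):
    """First lambda on a grid of (0,1] with q_V(lambda * p) <= 0, or None if there is none.
    By Property 4 (Remark 5.7), no such lambda should exist exactly when p satisfies
    Shearer's conditions (up to the grid resolution)."""
    for k in range(1, steps + 1):
        lam = k / steps
        if q_V(n, edges, [lam * x for x in p]) <= 0:
            return lam
    return None

# On K_3 with p = (0.4, 0.4, 0.4): q_V(lambda*p) = 1 - 1.2*lambda, so the ray exits at lambda = 5/6.
print(first_exit(3, [(0, 1), (0, 2), (1, 2)], [0.4, 0.4, 0.4]))   # ~0.8334
# On the empty graph the same p stays inside the Shearer region for every lambda in (0,1].
print(first_exit(3, [], [0.4, 0.4, 0.4]))                          # None
```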

Shearer's tight instance. By the above, we may assume that $q_I(p) \ge 0$ for every independent set $I$ and $q_S(p) \ge 0$ for all $S \subseteq V$. If we plug $S = \emptyset$ into (5.5), the left-hand side equals $1$ (in Definition 5.3, only $J = \emptyset$ contributes), so we obtain the equation

$$1 = \sum_{I \in \mathrm{Ind}(G)} q_I(p) = \sum_{S \subseteq V} q_S(p),$$

where in the last sum $q_S$ is the polynomial of Definition 5.2, which vanishes unless $S \in \mathrm{Ind}(G)$. Since each $q_S(p)$ is nonnegative, we can define a collection of events $E_1, \ldots, E_n$ such that

$$\mathbb{P}\Big[ \bigcap_{i \in S} E_i \cap \bigcap_{j \in V \setminus S} \overline{E_j} \Big] = q_S(p)$$

for every $S \subseteq V$. Note that this fully specifies a probability space with events $E_1, \ldots, E_n$. We have to prove that this is a probability distribution that is consistent with $G$, and that each event $E_i$ has probability $p_i$. First, we claim the following.

Claim 5.8 For any independent set $I$, we have

$$\mathbb{P}[I] := \mathbb{P}\Big[ \bigcap_{i \in I} E_i \Big] = \prod_{i \in I} p_i.$$

Proof: Let $I$ be an independent set. Then

$$\mathbb{P}[I] = \sum_{J \supseteq I} q_J = \sum_{J \supseteq I} \; \sum_{J' \in \mathrm{Ind}(G),\; J' \supseteq J} (-1)^{|J' \setminus J|} p_{J'} = \sum_{J' \in \mathrm{Ind}(G),\; J' \supseteq I} p_{J'} \sum_{J : I \subseteq J \subseteq J'} (-1)^{|J' \setminus J|} = p_I,$$

by the same inclusion-exclusion argument as before (the inner sum vanishes unless $J' = I$).

In particular, we see that $\mathbb{P}[E_i] = p_i$. We also clearly have the following, since $q_J(p) = 0$ for every $J \supseteq S$ when $S \notin \mathrm{Ind}(G)$.

Claim 5.9 For any $S \notin \mathrm{Ind}(G)$, we have

$$\mathbb{P}[S] := \mathbb{P}\Big[ \bigcap_{i \in S} E_i \Big] = 0.$$

In particular, this shows that $E_i \cap E_j = \emptyset$ for every $\{i,j\} \in E(G)$. Thus, the tight instance can be summarized by saying that neighboring events are disjoint and non-neighboring events are independent.
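The construction above is easy to simulate. The following self-contained Python sketch (my own illustration; the graph and probabilities are an arbitrary choice inside the Shearer region) builds the tight instance for the path on three vertices and numerically checks Claims 5.8 and 5.9, as well as the independence of the two non-neighboring events:

```python
from itertools import combinations
from math import prod, isclose

n, edges = 3, [(0, 1), (1, 2)]          # the path on 3 vertices
p = [0.2, 0.25, 0.2]                     # small enough to lie in the Shearer region

subsets = [frozenset(S) for r in range(n + 1) for S in combinations(range(n), r)]
ind = [S for S in subsets if not any(u in S and v in S for (u, v) in edges)]

def q(I):
    """Definition 5.2: q_I(p) = sum over independent J containing I of (-1)^{|J \\ I|} * p_J."""
    return sum((-1) ** len(J - I) * prod(p[j] for j in J) for J in ind if I <= J)

# The atom "exactly the events {E_i : i in S} occur" gets probability q_S(p).
mass = {S: q(S) for S in subsets}
assert all(m >= 0 for m in mass.values()) and isclose(sum(mass.values()), 1.0)

def prob_all(T):
    """P[ intersection of E_i over i in T ] in the tight instance."""
    return sum(mass[S] for S in subsets if T <= S)

assert all(isclose(prob_all(frozenset({i})), p[i]) for i in range(n))            # P[E_i] = p_i
assert all(isclose(prob_all(frozenset(e)), 0.0, abs_tol=1e-12) for e in edges)   # neighbors are disjoint
assert isclose(prob_all(frozenset({0, 2})), p[0] * p[2])                         # non-neighbors independent
print("tight instance checks out")
```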

Next, we show that $G$ is indeed a dependency graph for these events.

Claim 5.10 For any $i \in V$, $E_i$ is mutually independent of $\{E_j : j \in V \setminus \Gamma^+(i)\}$.

Proof: Take any $J \subseteq V \setminus \Gamma^+(i)$. If $J$ is an independent set, then $J \cup \{i\}$ is also independent. In this case, Claim 5.8 implies

$$\mathbb{P}\Big[ E_i \cap \bigcap_{j \in J} E_j \Big] = \mathbb{P}\Big[ \bigcap_{j \in J \cup \{i\}} E_j \Big] = p_i \prod_{j \in J} p_j = \mathbb{P}(E_i)\, \mathbb{P}\Big[ \bigcap_{j \in J} E_j \Big].$$

If $J$ is not independent, then of course $J \cup \{i\}$ is not independent either. In this case, Claim 5.9 gives

$$\mathbb{P}\Big[ E_i \cap \bigcap_{j \in J} E_j \Big] = 0 = \mathbb{P}(E_i) \cdot 0 = \mathbb{P}(E_i)\, \mathbb{P}\Big[ \bigcap_{j \in J} E_j \Big].$$

The proof is completed by applying inclusion-exclusion: if $J, K \subseteq [n] \setminus \Gamma^+(i)$ are disjoint, then

$$\mathbb{P}\Big[ E_i \cap \bigcap_{j \in J} E_j \cap \bigcap_{k \in K} \overline{E_k} \Big] = \sum_{L \subseteq K} (-1)^{|L|}\, \mathbb{P}\Big[ E_i \cap \bigcap_{j \in J \cup L} E_j \Big] = \mathbb{P}[E_i] \sum_{L \subseteq K} (-1)^{|L|}\, \mathbb{P}\Big[ \bigcap_{j \in J \cup L} E_j \Big] = \mathbb{P}[E_i]\, \mathbb{P}\Big[ \bigcap_{j \in J} E_j \cap \bigcap_{k \in K} \overline{E_k} \Big].$$

We conclude that $E_i$ is mutually independent of $\{E_j : j \in V \setminus \Gamma^+(i)\}$.

Finally, from Claims 5.8 and 5.9 we get, by inclusion-exclusion,

$$\mathbb{P}\Big[ \bigcap_{j \in S} \overline{E_j} \Big] = \sum_{J \subseteq S} (-1)^{|J|}\, \mathbb{P}\Big[ \bigcap_{j \in J} E_j \Big] = \sum_{J \subseteq S,\; J \in \mathrm{Ind}(G)} (-1)^{|J|} p_J = q_S(p),$$

since the terms with $J \notin \mathrm{Ind}(G)$ vanish. Thus the tight instance shows that the lower bounds $\mathbb{P}[\bigcap_{i \in S} \overline{E_i}] \ge q_S(p)$ are tight. In particular, the tight instance satisfies $\mathbb{P}[\bigcap_{i=1}^{n} \overline{E_i}] = q_{V(G)}(p) = q_\emptyset(p)$, which is $0$ if Shearer's conditions are violated; in that case the events cover the entire probability space, which completes the proof that Property 3 implies Properties 1 and 2.
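As a final illustration (not part of the original notes), consider a single edge $G = K_2$ with $p = (p_1, p_2)$ and $p_1 + p_2 \le 1$. The tight instance has atoms

$$\mathbb{P}[\overline{E_1} \cap \overline{E_2}] = 1 - p_1 - p_2, \quad \mathbb{P}[E_1 \cap \overline{E_2}] = p_1, \quad \mathbb{P}[\overline{E_1} \cap E_2] = p_2, \quad \mathbb{P}[E_1 \cap E_2] = 0.$$

Indeed $\mathbb{P}[E_i] = p_i$, the two neighboring events are disjoint, and $\mathbb{P}[\overline{E_1} \cap \overline{E_2}] = 1 - p_1 - p_2 = q_{V(G)}(p)$, which reaches $0$ exactly on the boundary $p_1 + p_2 = 1$ of the Shearer region for $K_2$.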