Lecture 10 Algorithmic version of the local lemma


Uriel Feige
Department of Computer Science and Applied Mathematics
The Weizmann Institute, Rehovot 76100, Israel
uriel.feige@weizmann.ac.il
June 9, 2014

1 The local lemma

Let A be a collection of random events A_1, ..., A_m. For an event A_i, let Γ(A_i) be a minimal set of events that A_i depends on, in the sense that A_i is independent of all events in A \ (Γ(A_i) ∪ {A_i}) (information on events not in Γ(A_i) ∪ {A_i} does not affect the probability of event A_i happening). The general version of the Lovász local lemma [4] (see also [1]) is as follows.

Theorem 1 If there is an assignment 0 < x_i < 1 satisfying for every i

    Pr[A_i] ≤ x_i ∏_{j : A_j ∈ Γ(A_i)} (1 − x_j),

then the probability that no event A_i happens is at least ∏_{i=1}^{m} (1 − x_i) > 0.

Observe that if the events were independent, the probability that no bad event happens would be exactly ∏_{i=1}^{m} (1 − Pr[A_i]). Hence, in a sense, replacing the Pr[A_i] values by the larger x_i values is the penalty paid in the conclusion of the local lemma for the events not being truly independent.

In many applications, the following uniform version of the local lemma suffices.

Theorem 2 Let A be a collection of random events, each happening with probability at most p. Assume that every event depends on at most d events (including itself), namely |Γ(A_i)| + 1 ≤ d. If pde < 1, then with positive probability no event happens.

Proof: The uniform version follows from the general version by picking x_i = pe for every i. Then we have

    x_i ∏_{j : A_j ∈ Γ(A_i)} (1 − x_j) ≥ pe(1 − pe)^{d−1} ≥ pe(1 − 1/d)^{d−1} ≥ p ≥ Pr[A_i],

where the second inequality uses pe < 1/d and the third uses (1 − 1/d)^{d−1} ≥ 1/e.

Here is an application of the local lemma.
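The chain of inequalities in the proof of Theorem 2 is easy to sanity-check numerically. The sketch below (an illustration, not part of the original notes) evaluates x_i (1 − x_i)^{d−1} at the boundary case pde = 1 for a few values of d.

```python
import math

# Sanity check for the proof of Theorem 2: with the choice x_i = p*e,
# the condition p*d*e <= 1 gives x_i * (1 - x_i)^(d-1) >= p, so the
# hypothesis of Theorem 1 is met.  We test the boundary case p*d*e = 1.
for d in (2, 10, 100, 1000):
    p = 1.0 / (d * math.e)   # boundary: p*d*e = 1
    x = p * math.e           # the choice x_i = p*e, here equal to 1/d
    lhs = x * (1 - x) ** (d - 1)
    # (1 - 1/d)^(d-1) >= 1/e for every d >= 2, hence lhs >= p
    assert lhs >= p, (d, lhs, p)
```

The margin shrinks as d grows, since (1 − 1/d)^{d−1} decreases towards 1/e; the inequality holds for every d.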

Theorem 3 Every k-CNF formula with n variables in which each variable appears in fewer than 2^k/(ek) clauses is satisfiable.

Proof: Consider a random assignment to the variables, and let A_i be the event that clause C_i is not satisfied. Then Pr[A_i] = 2^{−k}. Note that removing all events that correspond to clauses sharing variables with the clause corresponding to event A_i leaves A_i independent of all remaining events. Each clause shares variables with fewer than 1 + k(2^k/(ek) − 1) < 2^k/e clauses, including itself. Hence the uniform version of the local lemma implies that with positive probability the formula is satisfied by the random assignment, and hence some assignment is satisfying.

Note that the randomized algorithm implicit in the proof of Theorem 3 has exponentially small success probability, and hence would need to be repeated an exponential number of times before it is expected to produce a satisfying assignment. Finding a polynomial time ("constructive") version of the local lemma was an open question for many years. The first breakthrough in this respect was by Joseph Beck [2], who gave a constructive version (with weaker parameters, and applying only in some cases). A more recent breakthrough by Robin Moser (extended further in [5]) produced a constructive version that, among other things, provides a polynomial time algorithm for the problem in Theorem 3.

2 An algorithmic version

The algorithmic versions of the local lemma do not always apply, but they do apply in most interesting cases (see [3] for additional algorithmic versions). We shall show a randomized polynomial time algorithm for satisfying sparse k-CNF formulas as in Theorem 3. This algorithm can be derandomized and generalized to other applications, but this will not be discussed here. The analysis (of the expected running time) that we provide is not the tightest possible, since we try to keep it simple and intuitive. For stronger results, see [5].
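To see that the parameters of Theorem 3 indeed satisfy the hypothesis pde < 1 of Theorem 2, one can evaluate the condition numerically. The sketch below is illustrative (the value k = 10 is an arbitrary choice, not from the notes): it takes p = 2^{−k} and d = 1 + k(occ − 1), where occ is the number of clauses each variable may appear in.

```python
import math

def lll_condition(k, occ):
    """Uniform LLL condition p*d*e < 1 for a k-CNF formula in which each
    variable appears in at most `occ` clauses: a clause then shares
    variables with at most d = 1 + k*(occ - 1) clauses (itself included)."""
    p = 2.0 ** (-k)
    d = 1 + k * (occ - 1)
    return p * d * math.e < 1

# Theorem 3 allows fewer than 2^k/(e*k) occurrences per variable.
k = 10
occ = math.floor(2 ** k / (math.e * k))  # largest integer below the threshold
assert lll_condition(k, occ)
assert not lll_condition(k, 10 * occ)    # well above the threshold it fails
```

For k = 10 the threshold is about 37.7 occurrences per variable, giving d = 361 and pde ≈ 0.96 < 1, just as the counting in the proof promises.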
Let x_1, ..., x_n be the variables of a k-CNF formula and let m be the number of clauses. Assume that each clause shares variables with at most d clauses (including itself). Let p = 2^{−k} be the probability that a clause is not satisfied by a random assignment to the variables. We assume that pde < 1 and show a simple randomized algorithm for satisfying the k-CNF formula.

1. Pick a random assignment for the variables, where each variable is set to true independently with probability 1/2.

2. As long as there is some clause that is not satisfied, pick an arbitrary clause that is not satisfied, and assign fresh random values to all its variables.

We shall show that the above algorithm terminates with a satisfying assignment in expected polynomial time. The expectation is taken over the random coin tosses of the algorithm. These are produced dynamically throughout the algorithm. It will be more convenient for us to view them in an equivalent way, as if they are produced statically before the algorithm begins. For this we imagine that there is a table R with r rows (r can be thought of as infinite or very large) and n columns. The columns are indexed by the variables. Initially, one fills the table with random and independent 0/1 values. Thereafter,

whenever the algorithm needs a fresh random value for a variable, it considers the column corresponding to that variable, and uses the first as yet unused entry in that column. All probabilistic statements that we make are with respect to the random initial contents of R.

Consider now the sequence of unsatisfied clauses visited by the algorithm, C_1, C_2, .... Note that C_i and C_j for i ≠ j may refer to the same clause. Consider a clause C_t that is visited in step t of the algorithm. How do we determine which k entries of the table R gave the values of its variables that made it unsatisfied at time step t? We know which columns they belong to, but the row that each value belongs to depends on the history of the execution of the algorithm. In particular, different variables might take their values from different rows.

The part of the execution of the algorithm that is relevant to C_t can be viewed as a tree T_t labeled by clauses. It is built by backward induction from step t towards step 1. At step i, for t ≥ i ≥ 1, we have the tree T_t^i (where T_t^t is simply the root C_t). The tree T_t^i is derived from the tree T_t^{i+1} as follows. If clause C_i does not intersect any of the clauses in T_t^{i+1}, then T_t^i = T_t^{i+1}. Else, C_i is appended as a child of the clause C_j deepest in the tree T_t^{i+1} that shares a variable with C_i.

The tree T_t^1 determines uniquely from which locations in R the random values for the variables of each of the clauses C_i are taken, for all clauses C_i in the tree (and not just for C_t). Simply start with the deepest node in the tree (breaking ties arbitrarily, which is justified by Proposition 4), allocate to the respective clause the values from the respective columns in the first row of R, erase these values and shift the respective columns down by one row, and repeat.

Proposition 4 If two clauses C_i and C_j with i < j are at the same depth in T_t^1, then they are disjoint.

Proof: Otherwise, C_i could have been placed at greater depth, as a descendant of C_j.
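The backward-induction construction of T_t^1 can be made concrete in code. The sketch below is an illustration (representing clauses as Python sets of variable indices is my own convention, not from the notes): it scans the visited sequence from C_t backwards, attaching each earlier clause below the deepest node it intersects.

```python
def build_witness_tree(visited):
    """Backward induction for T_t^1: the last visited clause C_t is the
    root (depth 0); scanning i = t-1, ..., 1, clause C_i is attached as a
    child of the deepest node already in the tree that shares a variable
    with it, and is skipped if it intersects no node in the tree.
    `visited` is a list of clauses, each a set of variable indices."""
    nodes = [(visited[-1], 0)]            # (clause, depth), root first
    parent = {0: None}
    for clause in reversed(visited[:-1]):
        hits = [j for j, (c, _) in enumerate(nodes) if c & clause]
        if hits:
            j = max(hits, key=lambda j: nodes[j][1])   # deepest intersecting node
            nodes.append((clause, nodes[j][1] + 1))
            parent[len(nodes) - 1] = j
    return nodes, parent
```

For example, with visited = [{1, 2}, {2, 3}, {4, 5}, {1, 2}], the clause {4, 5} is dropped (it does not intersect the root {1, 2}), {2, 3} hangs at depth 1, and the earlier copy of {1, 2} ends up at depth 2 below {2, 3}, illustrating that a clause can be a descendant of itself.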
For the random values in R to make the generation of the tree T_t^1 feasible, it must be the case that all clauses in the tree are not satisfied by the respective values in R. Let q be the number of clauses in T_t^1. The probability of the event that the tree T_t^1 is feasible is exactly p^q. Note that even if, given R, the tree T_t^1 is feasible, this does not imply that the tree is actually generated by the algorithm. For example, in step t there might be two overlapping unsatisfied clauses, and the algorithm might choose either one of them, avoiding the creation of the other respective feasible tree.

Lemma 5 Given the table R and q ≥ 1, for the algorithm to run for qm steps it must be the case that R contains a feasible tree of size at least q.

Proof: Among the first qm clauses visited by the algorithm, at least q of them represent the same clause. All q copies of the same clause must belong to the tree associated with the last appearance of this clause in the sequence of visited clauses.

It follows that if R is such that no feasible tree of size q or more exists, the algorithm must terminate after at most qm steps. (Observe that it might be the case that no feasible tree of size exactly q exists, but a feasible tree of larger size does exist, since each step of the algorithm adds a root to a tree, and not a leaf, and this root might have degree larger than 1.)
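Before counting trees, it may help to see the resampling algorithm of this section in full as a short program. This is a minimal sketch under assumed conventions that are my own (clauses as lists of signed integers, +v for the literal x_v and −v for its negation); it is not code from the notes, and it loops forever on an unsatisfiable input.

```python
import random

def satisfied(clause, assignment):
    # A clause is a list of nonzero ints: +v stands for x_v, -v for its
    # negation.  The clause holds if some literal evaluates to true.
    return any(assignment[abs(lit)] == (lit > 0) for lit in clause)

def resample_algorithm(clauses, n, seed=0):
    """Step 1: pick a uniformly random assignment.  Step 2: while some
    clause is violated, assign fresh random values to all variables of
    an arbitrary violated clause.  Returns the assignment and the number
    of resampling steps performed."""
    rng = random.Random(seed)
    assignment = {v: rng.random() < 0.5 for v in range(1, n + 1)}
    steps = 0
    while True:
        violated = next((c for c in clauses if not satisfied(c, assignment)), None)
        if violated is None:
            return assignment, steps
        for lit in violated:
            assignment[abs(lit)] = rng.random() < 0.5
        steps += 1
```

On sparse instances satisfying pde < 1, the tree-counting argument shows that the expected number of steps is polynomial in m.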

Let us count how many legally labeled trees of size q there are at all. A tree is legally labeled if it is a rooted tree labeled by clauses, every two adjacent nodes (namely, a node and its direct child) are labeled by clauses that share a variable, and the clauses corresponding to any two nodes in the same level of the tree do not intersect. As every clause intersects at most d different clauses (including itself; in a legally labeled tree a clause can be a direct child of itself), any node in the tree has a choice of at most d labels for its children. (The number of children of a node is then at most d, and also at most k, because clauses labeling children do not intersect, though this last fact is not needed for the analysis.)

Pick an arbitrary order among the m clauses. There are m ways of choosing the root. We represent the remaining tree as a Boolean vector of length d(q − 1) with q − 1 entries that are 1. For every clause chosen into the tree, we append to the end of the vector d coordinates, representing the (at most) d neighbors of the clause (some of the coordinates might not represent any clause at all if there are fewer than d neighbors). Their initial value is 0. Scan the coordinates one by one, and for each coordinate that represents a clause that is in the tree, set the corresponding entry in the vector to 1 (and append d new coordinates, unless this is the qth node in the tree). It can readily be seen that any legal tree can be represented in such a fashion, and that any vector corresponds to at most one legal tree. Hence the number of legal trees with q nodes is at most

    m · C(d(q − 1), q − 1) ≤ m(de)^q.

Hence the probability over R that there is a feasible legal tree of size Q or more is at most

    ∑_{q ≥ Q} m(de)^q p^q = m(dpe)^Q / (1 − dpe).

For large enough Q (e.g., Q = Θ(log m) suffices if dpe is bounded away from 1), the probability that a feasible legal tree of size at least Q exists tends to 0.
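The final bound is easy to evaluate. The sketch below uses hypothetical parameter values (m, d, p chosen only for illustration) to show that Q proportional to log m already drives the bound m(dpe)^Q / (1 − dpe) far below 1.

```python
import math

def feasible_tree_tail(m, d, p, Q):
    """Union bound m*(d*p*e)^Q / (1 - d*p*e) on the probability that R
    contains a feasible legally labeled tree with at least Q nodes."""
    x = d * p * math.e
    assert x < 1, "the bound requires d*p*e < 1"
    return m * x ** Q / (1 - x)

# Hypothetical parameters: m = 10^6 clauses, d = 100, p = 2^-10,
# so d*p*e is about 0.27, safely bounded away from 1.
m, d, p = 10 ** 6, 100, 2.0 ** -10
Q = math.ceil(12 * math.log(m))       # Q = Theta(log m)
assert feasible_tree_tail(m, d, p, Q) < 1e-6
```

With these numbers Q is a few hundred, while for small constant Q the bound exceeds 1 and says nothing; the logarithmic dependence on m is what makes the expected running time O(mQ) polynomial.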
This implies both that the formula is satisfiable, and that the randomized algorithm finds a satisfying assignment in expected O(mQ) steps.

2.1 Comments

For the purpose of applying the local lemma, the notion of events being independent can be subtle, so some care is needed. Here is a simple example to illustrate this point. Suppose that there are three 0/1 variables, and two equations x_1 = x_2 and x_2 = x_3. The variables are given unbiased independent random values, and the events of interest are for each equation to hold. Are the two events dependent? Formally no, because each equation is satisfied with probability exactly 1/2, regardless of the other equation. Hence for the existential form of the local lemma (Theorem 1) the dependency sets are empty. However, the algorithmic version of the local lemma presented in this lecture needs to treat them as if they were dependent, because reassigning random values to the variables of one equation may cause the other equation to stop holding.

References

[1] Noga Alon and Joel Spencer. The Probabilistic Method. Wiley-Interscience Series in Discrete Mathematics and Optimization.

[2] Joseph Beck. An algorithmic approach to the Lovász Local Lemma. Random Structures and Algorithms, 2 (1991), pp. 343-365.

[3] Bernhard Haeupler, Barna Saha, and Aravind Srinivasan. New Constructive Aspects of the Lovász Local Lemma. J. ACM 58(6): 28 (2011).

[4] Paul Erdős and László Lovász. Problems and results on 3-chromatic hypergraphs and some related questions. In Infinite and Finite Sets, volume 11 of Colloq. Math. Soc. J. Bolyai, pages 609-627. North-Holland, 1975.

[5] Robin A. Moser and Gábor Tardos. A constructive proof of the general Lovász Local Lemma. J. ACM 57(2) (2010).