Algebraic Methods


Motivation: Systems like this:

v1 ⊕ v2 ⊕ v3 ⊕ v4 = 1
v1 ⊕ v2 ⊕ v3 ⊕ v4 = 0
v2 ⊕ v4 = 0

are very difficult for CNF SAT solvers, although they can be solved using simple algebraic manipulations.

Let c_0, c_1, ..., c_{2^n - 1} be a 0-1 vector of 2^n coefficients. For 0 ≤ j < n, let b_{i,j} be the j-th bit in the binary representation of the number i. Our proof system targets sets of equations of the following form:

Σ_{i=0}^{2^n - 1} c_i · v1^{b_{i,0}} · v2^{b_{i,1}} · ... · vn^{b_{i,n-1}} = 0

where the variables vi take values from {0, 1}. Note: the term for i = 0 is a constant (all its exponents are 0). Each equation in the set is regarded as a fact. A term of this form is said to be multi-linear. Example:

v1 v2 v3 + v2 v3 + v1 v3 + v1 + v2 + 1 = 0
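As an illustration (not from the original slides), such a multi-linear polynomial can be held as a set of terms, each term a set of variable indices, with the empty term standing for the constant 1. The sketch below, whose representation and helper names are my own choices, evaluates the example equation over all assignments.

    from itertools import product

    # A multi-linear GF(2) polynomial is a set of terms; each term is a
    # frozenset of variable indices, and frozenset() is the constant term 1.
    example = {frozenset({1, 2, 3}), frozenset({2, 3}), frozenset({1, 3}),
               frozenset({1}), frozenset({2}), frozenset()}

    def evaluate(poly, assignment):
        # A term evaluates to 1 iff all its variables are 1 (the empty term
        # is always 1); the polynomial is the mod-2 sum of its terms.
        return sum(all(assignment[v] for v in term) for term in poly) % 2

    # The equation "example = 0" is a fact satisfied by exactly those
    # assignments on which the polynomial evaluates to 0.
    for bits in product((0, 1), repeat=3):
        a = dict(zip((1, 2, 3), bits))
        print(bits, "satisfies" if evaluate(example, a) == 0 else "violates")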

New facts are derived from old facts using the following rules (a small sketch implementing them follows the list):

1. Any even sum of like terms in an equation may be replaced by 0. E.g.: v1 v2 + v1 v2 → 0 and 1 + 1 → 0. Needed to eliminate terms when adding equations.
2. A factor v^2 may be replaced by v. Needed to ensure terms remain multi-linear after multiplication.
3. An equation may be multiplied by a term; the resulting equation may be reduced by the rules above. E.g.: v3 v4 · (v1 + v3 = 0) gives v1 v3 v4 + v3 v4 = 0. The new equation is said to be a new, derived fact.
4. Two equations may be added, using mod-2 arithmetic. The new equation is said to be a new, derived fact.
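Under the set-of-terms representation above, the rules can be sketched as follows (an illustration of mine, not the slides' own code): addition of equations is a symmetric difference of term sets, which makes Rule 1 automatic, and multiplication by a term uses set union, which makes Rule 2 automatic.

    # Terms are frozensets of variable indices; a polynomial (the left side
    # of an "= 0" fact) is a set of terms.

    def add(p, q):
        # Rule 4 (with Rule 1): add two equations mod 2.
        return p ^ q

    def multiply_by_term(term, p):
        # Rule 3 (with Rules 1 and 2): multiply an equation by a term and
        # reduce; the union collapses v*v to v, the XOR cancels even sums.
        result = set()
        for t in p:
            result ^= {t | term}
        return result

    # The Rule 3 example: v3*v4 * (v1 + v3 = 0) gives v1*v3*v4 + v3*v4 = 0.
    p = {frozenset({1}), frozenset({3})}
    print(multiply_by_term(frozenset({3, 4}), p))   # terms {1,3,4} and {3,4}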

Observe:

Solution spaces of two equations are complementary if they differ only in their c_0 term. Example: v1 v2 v3 + v1 v2 + v2 v3 + v1 + 1 = 0 and v1 v2 v3 + v1 v2 + v2 v3 + v1 = 0 are complementary.

A derived equation has a solution space at least as large as that of the original set. Example: v1 = 0 added to v1 = 0 gives 0 = 0, which allows v1 = 1.

All derived equations together with the old equations have the same solution space as the original set.
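The first observation can be checked by brute force under the same illustrative representation (the helper and variable names below are mine): the two example equations split the eight assignments into complementary solution sets.

    from itertools import product

    def evaluate(poly, a):
        return sum(all(a[v] for v in t) for t in poly) % 2

    # The two example equations differ only in the constant (c_0) term.
    p = {frozenset({1, 2, 3}), frozenset({1, 2}), frozenset({2, 3}),
         frozenset({1}), frozenset()}
    q = p ^ {frozenset()}          # the same polynomial without the constant

    sols_p = {bits for bits in product((0, 1), repeat=3)
              if evaluate(p, dict(zip((1, 2, 3), bits))) == 0}
    sols_q = {bits for bits in product((0, 1), repeat=3)
              if evaluate(q, dict(zip((1, 2, 3), bits))) == 0}
    # Disjoint and jointly exhaustive: the solution spaces are complementary.
    print((sols_p & sols_q == set()) and (len(sols_p) + len(sols_q) == 8))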

Theorem: Using the rules stated earlier, the equation 1 = 0 is always derivable from an inconsistent set of multi-linear equations and is never derivable from a consistent set.

Examples:

The clause (v1 ∨ v2 ∨ v3) is represented by: v1(1 + v2)(1 + v3) + v2(1 + v3) + v3 + 1 = 0, which reduces to v1 v2 v3 + v1 v2 + v1 v3 + v2 v3 + v1 + v2 + v3 + 1 = 0.

The clause (¬v1 ∨ v2 ∨ v3) is represented by: (1 + v1)(1 + v2)(1 + v3) + v2(1 + v3) + v3 + 1 = 0, which reduces to v1 v2 v3 + v1 v2 + v1 v3 + v1 = 0.

The expression (¬v1 ∨ v2 ∨ v3) ∧ (v1 ∨ v2 ∨ v3) is represented by: 2 v1 v2 v3 + 2 v1 v2 + 2 v1 v3 + v2 v3 + 2 v1 + v2 + v3 + 1 = 0, which reduces to v2 v3 + v2 + v3 + 1 = 0 (same as (v2 ∨ v3)).

Observe: A BDD is an equation.
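A clause is falsified only by the assignment that makes every literal false, so its "= 0" polynomial is the product of the complemented literals: v for a negative literal, 1 + v for a positive one. The sketch below (mine, with hypothetical helper names) reproduces the reductions above.

    def poly_multiply(p, q):
        result = set()
        for s in p:
            for t in q:
                result ^= {s | t}        # v*v -> v, even sums cancel
        return result

    def clause_to_poly(clause):
        # A literal is a (variable, positive?) pair.
        poly = {frozenset()}             # the constant 1
        for var, positive in clause:
            factor = {frozenset({var}), frozenset()} if positive else {frozenset({var})}
            poly = poly_multiply(poly, factor)
        return poly

    c1 = clause_to_poly([(1, True), (2, True), (3, True)])    # (v1 or v2 or v3)
    c2 = clause_to_poly([(1, False), (2, True), (3, True)])   # (not v1 or v2 or v3)
    print(c1)        # the eight-term polynomial from the first example
    print(c2)        # v1v2v3 + v1v2 + v1v3 + v1
    print(c1 ^ c2)   # v2v3 + v2 + v3 + 1, i.e. the clause (v2 or v3)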

Example: Solve the expression (v1 ∨ ¬v2) ∧ (v2 ∨ ¬v3) ∧ (v3 ∨ ¬v1):

v1 v2 + v2 = 0          (1)
v2 v3 + v3 = 0          (2)
v1 v3 + v1 = 0          (3)
v1 v2 v3 + v2 v3 = 0    (4)    v3 · (1)
v1 v2 v3 + v3 = 0       (5)    (4) + (2)
v1 v2 v3 + v1 v3 = 0    (6)    v1 · (2)
v1 v2 v3 + v1 = 0       (7)    (6) + (3)
v1 v2 v3 + v1 v2 = 0    (8)    v2 · (3)
v1 v2 v3 + v2 = 0       (9)    (8) + (1)
v1 + v2 = 0             (10)   (9) + (7)
v1 + v3 = 0             (11)   (5) + (7)

Solution: v1 = v2 = v3.
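The derivation can be replayed mechanically with the two rule helpers from before; the sketch below (an illustration of mine) reproduces steps (4) through (11).

    def add(p, q):
        return p ^ q

    def mul(term, p):
        r = set()
        for t in p:
            r ^= {t | term}
        return r

    def T(*xs):
        return frozenset(xs)

    e1 = {T(1, 2), T(2)}          # v1v2 + v2 = 0
    e2 = {T(2, 3), T(3)}          # v2v3 + v3 = 0
    e3 = {T(1, 3), T(1)}          # v1v3 + v1 = 0

    e4 = mul(T(3), e1)            # v1v2v3 + v2v3 = 0
    e5 = add(e4, e2)              # v1v2v3 + v3 = 0
    e6 = mul(T(1), e2)            # v1v2v3 + v1v3 = 0
    e7 = add(e6, e3)              # v1v2v3 + v1 = 0
    e8 = mul(T(2), e3)            # v1v2v3 + v1v2 = 0
    e9 = add(e8, e1)              # v1v2v3 + v2 = 0
    print(add(e9, e7))            # terms {1} and {2}: v1 + v2 = 0
    print(add(e5, e7))            # terms {1} and {3}: v1 + v3 = 0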

Motivation, revisited: Systems like this:

v1 ⊕ v2 ⊕ v3 ⊕ v4 = 1
v1 ⊕ v2 ⊕ v3 ⊕ v4 = 0
v2 ⊕ v4 = 0

are very difficult for CNF SAT solvers, although they can be solved using simple algebraic manipulations:

v1 + v2 + v3 + v4 + 1 = 0
v1 + v2 + v3 + v4 = 0
v2 + v4 = 0
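Because these translated equations are linear, plain Gaussian elimination over GF(2) decides them. The sketch below is my own illustration (the helper solve_xor is hypothetical); it keeps each equation as a set of variable indices plus a constant bit, adds rows by symmetric difference, and reports a contradiction as soon as 1 = 0 appears, which for the system above happens after combining the first two equations.

    def solve_xor(equations):
        # Each equation is (set of variable indices, constant bit).
        basis = []                        # reduced rows, one pivot variable each
        for vs, c in equations:
            vs = set(vs)
            for pivot, pvs, pc in basis:
                if pivot in vs:
                    vs, c = vs ^ pvs, c ^ pc
            if not vs:
                if c:                     # derived the contradiction 1 = 0
                    return "unsatisfiable"
                continue
            basis.append((min(vs), vs, c))
        return "satisfiable"

    # v1+v2+v3+v4 = 1, v1+v2+v3+v4 = 0, v2+v4 = 0
    print(solve_xor([({1, 2, 3, 4}, 1), ({1, 2, 3, 4}, 0), ({2, 4}, 0)]))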

An Algebraic Solver(ψ, d)
/* Input: list of equations ψ = e1, ..., em; integer degree bound d */
/* Output: satisfiable or unsatisfiable */
/* Locals: set B of equations */

Set B ← ∅.
Repeat while ψ ≠ ∅:
    Pop e from ψ.
    Repeat while there is an e' ∈ B with first-non-zero(e) = first-non-zero(e'):
        Set e ← reduce(e + e').                                   /* Rule 4 */
    If e is 1 = 0: Output unsatisfiable.
    If e is not 0 = 0:
        Set B ← B ∪ {e}.
        If degree(e) < d:
            Repeat for all variables v:
                If reduce(v · e) has not been in ψ: Append reduce(v · e) to ψ.   /* Rule 3 */
Output satisfiable.
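A Python rendering of this solver under the set-of-terms representation used earlier (my own sketch; treating "first non-zero" as the largest term in a fixed term order, and the helper names, are assumptions of mine).

    from collections import deque

    def leading_term(e):
        # Largest term under a fixed total order on terms.
        return tuple(max(sorted(t) for t in e)) if e else None

    def reduce_mul(term, e):
        r = set()
        for t in e:
            r ^= {t | term}
        return frozenset(r)

    def algebraic_solver(equations, d, variables):
        psi = deque(frozenset(e) for e in equations)
        seen = set(psi)
        B = {}                                   # leading term -> equation
        while psi:
            e = psi.popleft()
            while e and leading_term(e) in B:
                e = e ^ B[leading_term(e)]                       # Rule 4
            if e == frozenset({frozenset()}):
                return "unsatisfiable"                           # derived 1 = 0
            if e:
                B[leading_term(e)] = e
                if max(len(t) for t in e) < d:                   # degree(e) < d
                    for v in variables:
                        g = reduce_mul(frozenset({v}), e)        # Rule 3
                        if g and g not in seen:
                            seen.add(g)
                            psi.append(g)
        return "satisfiable"

    # The implication cycle (v1 or not v2)(v2 or not v3)(v3 or not v1):
    eqs = [{frozenset({1, 2}), frozenset({2})},
           {frozenset({2, 3}), frozenset({3})},
           {frozenset({1, 3}), frozenset({1})}]
    print(algebraic_solver(eqs, d=3, variables=[1, 2, 3]))       # satisfiable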

Theorem: The number of derivations used by the algebraic solver is within a polynomial factor of the minimum number possible.

Theorem: The minimum number of derivations used by the algebraic solver cannot be much greater than, and may sometimes be far less than, the minimum number needed by resolution.

Comparison with BDD operations (restrict(f, c)):

f = (v1 ∨ ¬v2) ∧ (¬v1 ∨ ¬v3),   c = (v2 ∨ ¬v3)

[BDD diagrams for f and c.]

Algebra: f : v1 v3 + v2 + v1 v2 = 0 and c : v2 v3 + v3 = 0.

(v2 v3) · (v1 v3 + v2 + v1 v2 = 0) gives v2 v3 = 0; then (v2 v3 = 0) + (v2 v3 + v3 = 0) gives v3 = 0.

As BDDs: (v2 v3 = 0) means the rows with v2 = v3 = 1 are 0 and all other rows are 1. All 1-rows of f are 1-rows of (v2 v3 = 0), so the product can be added as a fact. But when conjoined with c, the inference v3 = 0 is obtained.
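The two-step inference can be checked with the earlier rule helpers (this sketch, and the helper mul, are illustrations of mine): multiply f's equation by the term v2 v3, then add the result to c's equation.

    def mul(term, p):
        r = set()
        for t in p:
            r ^= {t | term}
        return r

    f = {frozenset({1, 3}), frozenset({2}), frozenset({1, 2})}   # v1v3 + v2 + v1v2 = 0
    c = {frozenset({2, 3}), frozenset({3})}                      # v2v3 + v3 = 0

    prod = mul(frozenset({2, 3}), f)
    print(prod)            # the single term {2, 3}: the fact v2*v3 = 0
    print(prod ^ c)        # the single term {3}: the inference v3 = 0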

Comparison with BDD operations (gcf(f, c), the generalized cofactor, and existential quantification):

On BDDs, gcf(f, c) depends on the variable ordering. But gcf(f, c) may replace f; not so for algebra.

In algebra, existentially quantifying a variable away means multiplying two equations. Example: consider g = v1 v2 v3 + v1 v3 + v1 + 1 = 0. To existentially quantify v2 away from g, form the equations:

v1 + 1 = 0            (v2 = 1)
v1 v3 + v1 + 1 = 0    (v2 = 0)

Then multiply: (v1 + 1) · (v1 v3 + v1 + 1) = v1 v3 + v1 v3 + v1 + 1 = v1 + 1, so the result is v1 + 1 = 0. But the quantified-away variable can be in just one equation.
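A sketch of this quantification step (my own illustration, with hypothetical helper names): substitute v2 = 1 and v2 = 0 into g, then multiply the two cofactor equations.

    def substitute(poly, var, value):
        r = set()
        for t in poly:
            if var not in t:
                r ^= {t}
            elif value == 1:
                r ^= {t - {var}}
            # value == 0: the term vanishes
        return r

    def multiply(p, q):
        r = set()
        for s in p:
            for t in q:
                r ^= {s | t}
        return r

    g = {frozenset({1, 2, 3}), frozenset({1, 3}), frozenset({1}), frozenset()}
    g1 = substitute(g, 2, 1)       # v1 + 1
    g0 = substitute(g, 2, 0)       # v1*v3 + v1 + 1
    print(multiply(g1, g0))        # terms {1} and {}: v1 + 1 = 0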

Integer Programming: Solve M_ψ · α + b ≥ Z, α_i ∈ {0, 1} for all 0 ≤ i < n, where M_ψ is the (0, ±1) matrix representation of CNF formula ψ, b is an integer vector whose entry b_i is the number of -1 entries in row i of M_ψ, and Z is a vector of 1s. A solution certifies that ψ is satisfiable; no solution certifies that it is unsatisfiable.

Example: (¬v0 ∨ v1 ∨ v7)(¬v1 ∨ v2)(¬v2 ∨ v3)(¬v0 ∨ v3 ∨ v8)(v0 ∨ v6 ∨ v7)(¬v5 ∨ v6)(¬v4 ∨ v5 ∨ v9)(v0 ∨ v4)

[8 × 10 matrix M_ψ, offset vector b, and all-ones right-hand side Z for this formula.]
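A sketch of the encoding (my own illustration on a small made-up three-clause formula, with hypothetical helper names): build M_ψ, b, and Z from a clause list, then test whether a candidate 0-1 vector α satisfies every row.

    # Row i of M has +1 for a positive literal, -1 for a negative one;
    # b_i counts the -1 entries in row i; Z is all ones.  A 0-1 vector
    # alpha satisfies psi iff M*alpha + b >= Z holds row by row.

    def build_system(clauses, n_vars):
        M, b = [], []
        for clause in clauses:                  # clause: list of (var, positive?)
            row = [0] * n_vars
            for var, positive in clause:
                row[var] = 1 if positive else -1
            M.append(row)
            b.append(sum(1 for _, positive in clause if not positive))
        return M, b, [1] * len(clauses)

    def satisfies(M, b, Z, alpha):
        return all(sum(m * a for m, a in zip(row, alpha)) + bi >= zi
                   for row, bi, zi in zip(M, b, Z))

    # (v0 or not v1)(v1 or v2)(not v0 or not v2)
    clauses = [[(0, True), (1, False)], [(1, True), (2, True)], [(0, False), (2, False)]]
    M, b, Z = build_system(clauses, 3)
    print(satisfies(M, b, Z, [1, 1, 0]))        # True: this alpha satisfies psi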