CSC242: Intro to AI Lecture 11
Propositional Inference
Factored Representation Splits a state into variables (factors, attributes, features, things you know) that can have values. Factored states can be more or less similar (unlike atomic states). Can also represent uncertainty (don't know the value of some attribute).
Constraint Satisfaction Problem (CSP) X: Set of variables { X1,..., Xn } D: Set of domains { D1,..., Dn } Each domain Di = set of values { v1,..., vk } C: Set of constraints { C1,..., Cm }
[Figure: 4×4 Wumpus World grid showing pits, breezes, stenches, the gold, and the START square at (1,1)]
Partially Observable Don't know everything about the state of the world. Have to represent uncertainty in our knowledge. Have to explore.
Representation
[Figure: the 4×4 Wumpus World grid]
Boolean[][] P = new Boolean[n][n];
Boolean[][] G = new Boolean[n][n];
Boolean[][] W = new Boolean[n][n];
Boolean[][] At = new Boolean[n][n];
Boolean FacingN, FacingS, FacingE, FacingW;
Boolean[][] S = new Boolean[n][n];
Boolean[][] B = new Boolean[n][n];
A Programming Language for Knowledge
if At[i,j] and S[i,j] then (W[i-1,j] or W[i+1,j] or W[i,j-1] or W[i,j+1])
if At[i,j] and (W[i-1,j] or W[i+1,j] or W[i,j-1] or W[i,j+1]) then S[i,j]
Warning: not a real programming language
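The pseudocode above can be written as real Java. A minimal sketch (class and method names are my own, not from the lecture): the bounds checks stand in for the out-of-range references like W[i-1,j] at the edges of the grid.

```java
public class WumpusRules {
    // The stench rule as real Java: a stench at (i, j) means the Wumpus
    // is in some orthogonally adjacent square. Bounds checks handle the
    // border squares that the pseudocode silently ignores.
    static boolean wumpusAdjacent(boolean[][] W, int i, int j) {
        int n = W.length;
        return (i > 0 && W[i - 1][j]) || (i + 1 < n && W[i + 1][j])
            || (j > 0 && W[i][j - 1]) || (j + 1 < n && W[i][j + 1]);
    }
}
```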
Propositional Logic Aristotle (384 BC - 322 BC) George Boole (1815-1864)
Syntax Propositional (boolean) variables. Connectives: Negation (not, ¬), Conjunction (and, ∧), Disjunction (or, ∨), Implication (if-then, ⇒), Biconditional (if and only if, ⇔). Precedence rules and parentheses.
Semantics
P     Q     | ¬P    | P ∧ Q | P ∨ Q | P ⇒ Q | P ⇔ Q
true  true  | false | true  | true  | true  | true
false true  | true  | false | true  | true  | false
true  false | false | false | true  | false | false
false false | true  | false | false | true  | true
Hungry: true   false  true   false
Cranky: false  true   true   false
Possible Worlds Hungry=true, Cranky=false Hungry=false, Cranky=true Hungry=true, Cranky=true Hungry=false, Cranky=false
Possible Worlds
Hungry ∨ Cranky
Hungry=true, Cranky=false
Hungry=false, Cranky=true
Hungry=true, Cranky=true
Hungry=false, Cranky=false
A sentence or set of sentences in propositional logic reduces the number of possible worlds by ruling out some of them.
Satisfaction
B1,1   P1,2   P2,1  | P1,2 ∨ P2,1 | B1,1 ⇔ (P1,2 ∨ P2,1)
true   true   true  | true  | true
false  true   true  | true  | false
true   false  true  | true  | true
false  false  true  | true  | false
true   true   false | true  | true
false  true   false | true  | false
true   false  false | false | false
false  false  false | false | true
Model An assignment that satisfies a sentence or set of sentences A possible world where those sentences are true, and so is not ruled out by them May be how the world actually is
Unsatisfiable No complete, consistent assignment of truth values to the propositions that makes the sentence or set of sentences true Rules out all possible worlds Cannot describe the actual world
2^n Truth Table
B1,1   P1,2   P2,1  | P1,2 ∨ P2,1 | B1,1 ⇔ (P1,2 ∨ P2,1)
true   true   true  | true  | true
false  true   true  | true  | false
true   false  true  | true  | true
false  false  true  | true  | false
true   true   false | true  | true
false  true   false | true  | false
true   false  false | false | false
false  false  false | false | true
DFS for Boolean CSP Time complexity: O(2^n). Space complexity: O(n).
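A minimal sketch of that DFS in Java (the class and the Predicate representation of a sentence are my own, not from the lecture): the recursion is at most n deep, giving O(n) space, while the worst case tries all 2^n complete assignments.

```java
import java.util.function.Predicate;

public class BoolDFS {
    // Depth-first search over all 2^n truth assignments to n variables.
    // The "sentence" is any test over a complete assignment.
    static boolean satisfiable(int n, Predicate<boolean[]> sentence) {
        return dfs(new boolean[n], 0, sentence);
    }

    static boolean dfs(boolean[] assign, int i, Predicate<boolean[]> sentence) {
        if (i == assign.length) return sentence.test(assign); // complete assignment
        assign[i] = true;                   // try variable i = true
        if (dfs(assign, i + 1, sentence)) return true;
        assign[i] = false;                  // backtrack: try variable i = false
        return dfs(assign, i + 1, sentence);
    }
}
```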
Exploration Strategy Choose Update Act Observe
Inference Use our knowledge of the world to infer facts that we hadn't directly observed
Hungry ∨ Cranky
¬Hungry
∴ Cranky
[Figure: Wumpus World after visiting (1,1) and (2,1): breeze in (2,1), stench in (1,2); Wumpus located at (1,3), pit at (3,1). A = Agent, B = Breeze, G = Glitter/Gold, OK = Safe square, P = Pit, S = Stench, V = Visited, W = Wumpus]
P[1][1] = false; W[1][1] = false; B[1][1] = false; S[1][1] = false;
W[1][2] = false; P[1][2] = false; W[2][1] = false; P[2][1] = false;
B[2][1] = true; S[2][1] = false; B[1][2] = false; S[1][2] = true;
W[1][3] = true; P[2][2] = false; P[3][1] = true;
Inference What other things are we justified in believing, assuming our background knowledge and perceptions are accurate? What other sentences are true, given our background knowledge and perceptions?
Inference Infinite number of sentences of propositional logic, even over a finite set of propositional variables So ask whether a given sentence or set of sentences follows from our knowledge
What does it mean to follow from our knowledge?
[Figure: diagram of possible worlds, i.e. truth assignments to P[1,1], B[1,2], ...; the worlds satisfying α form one region and the worlds satisfying β another, and α entails β when the α-region lies entirely inside the β-region]
Entailment α entails β when: β is true in every world considered possible by α. Every model of α is also a model of β. Models(α) ⊆ Models(β)
Entailment is conservative Only accept a conclusion that is guaranteed to be true whenever the premises are true
Language for expressing knowledge: propositional logic. Definition of entailment ("follows from"). Need to be able to compute whether α entails β.
Model Checking
P     Q     ... | α     | ... | β
true  true  ... | true  | ... | true
false true  ... | false | ... | false
true  false ... | true  | ... | true
false false ... | false | ... | false
true  true  ... | false | ... | true
false true  ... | true  | ... | true
true  false ... | false | ... | false
false false ... | false | ... | true
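Model checking can be sketched directly in Java (hypothetical names; α and β are represented as tests over a complete assignment): enumerate all 2^n assignments and reject as soon as some model of α fails to be a model of β.

```java
import java.util.function.Predicate;

public class TTEntails {
    // alpha entails beta iff beta is true in every model of alpha.
    // Enumerate all 2^n assignments; assignment bits come from the counter m.
    static boolean entails(int n, Predicate<boolean[]> alpha, Predicate<boolean[]> beta) {
        boolean[] a = new boolean[n];
        for (long m = 0; m < (1L << n); m++) {
            for (int i = 0; i < n; i++) a[i] = ((m >> i) & 1) == 1;
            if (alpha.test(a) && !beta.test(a)) return false; // counterexample world
        }
        return true;
    }
}
```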
Propositional Inference
Intuition: Math
Intuition: Math 123 +456 579
Intuition: Math
x + 3 = 7
x + 3 - 3 = 7 - 3
x = 7 - 3
x = 4
Mathematical Identities Allow us to rewrite equations Truth-preserving: If the original equation holds, then so does the rewritten one
Inference Rules Look for rules that allow us to rewrite sentences in a truth-preserving way. We'll call these inference rules, since they will allow us to do inference (draw conclusions, make implicit knowledge explicit).
P ⇒ Q ≡ ¬Q ⇒ ¬P
P     Q     | P ⇒ Q
true  true  | true
false true  | true
true  false | false
false false | true
Entailment α entails β when: β is true in every world considered possible by α Every model of α is also a model of β Models(α) Models(β)
{ P ⇒ Q, P } ⊨ Q
{ α ⇒ β, α } ⊨ β
Modus Ponens
α ⇒ β,   α
----------
β
Derivation α ⊢ β: β can be derived from α using inference rules
Properties of Inference Rules
Soundness Derives only logically entailed sentences. Truth-preserving: if α ⊢ β then α ⊨ β
Completeness Derives all logically entailed sentences: if α ⊨ β then α ⊢ β
Inference Rules
And-elimination: from α ∧ β infer α
Double negation: ¬¬α ≡ α
De Morgan's Laws: ¬(α ∧ β) ≡ (¬α ∨ ¬β),  ¬(α ∨ β) ≡ (¬α ∧ ¬β)
Modus Ponens: from α ⇒ β and α infer β
Definition of biconditional: (α ⇔ β) ≡ ((α ⇒ β) ∧ (β ⇒ α))
Want to compute whether KB ⊨ α. For a sound inference rule: if KB ⊢ α then KB ⊨ α. Chains of sound rules preserve this: if KB ⊢ β and β ⊢ α, then KB ⊨ α.
Proof Sequence of sound inference rule applications that lead from the premises to the desired conclusion
Background knowledge:
R1: ¬P1,1
R2: B1,1 ⇔ (P1,2 ∨ P2,1)
R3: B2,1 ⇔ (P1,1 ∨ P2,2 ∨ P3,1)
Perceptions:
R4: ¬B1,1
R5: B2,1
Biconditional elimination on R2:
R6: (B1,1 ⇒ (P1,2 ∨ P2,1)) ∧ ((P1,2 ∨ P2,1) ⇒ B1,1)
And-elimination on R6:
R7: (P1,2 ∨ P2,1) ⇒ B1,1
Logical equivalence for contrapositives:
R8: ¬B1,1 ⇒ ¬(P1,2 ∨ P2,1)
Modus Ponens on R8 and R4:
R9: ¬(P1,2 ∨ P2,1)
De Morgan's Rule:
R10: ¬P1,2 ∧ ¬P2,1
And-elimination on R10:
R11: ¬P1,2
Propositional Inference As Search Initial state: Set of facts (initial knowledge base) Actions: Apply an inference rule to the sentences that match their premises Result: Add conclusions of inference rule to knowledge base Goal: The knowledge base contains the sentence we want to prove
Theorem Proving Searching for proofs is an alternative to enumerating models In many practical cases, finding a proof can be more efficient because the proof can ignore irrelevant propositions, no matter how many of them there are.
Propositional Inference Entailment: follows from our knowledge Model checking Inference rules: soundness, completeness Proof: search for sequence of sound inference rules from premises to conclusion
Propositional Inference As Search Initial state: Set of facts (initial knowledge base) Actions: Apply an inference rule to the sentences that match their premises Result: Add conclusions of inference rule to knowledge base Goal: The knowledge base contains the sentence we want to prove
Inference Rules
And-elimination: from α ∧ β infer α
Double negation: ¬¬α ≡ α
De Morgan's Laws: ¬(α ∧ β) ≡ (¬α ∨ ¬β),  ¬(α ∨ β) ≡ (¬α ∧ ¬β)
Modus Ponens: from α ⇒ β and α infer β
Definition of biconditional: (α ⇔ β) ≡ ((α ⇒ β) ∧ (β ⇒ α))
Hungry ∨ Cranky
¬Hungry
∴ Cranky
[Figure: Wumpus World after visiting (1,1), (2,1), (1,2): breeze in (2,1), stench in (1,2); Wumpus at (1,3), possible pits at (2,2) and (3,1). A = Agent, B = Breeze, G = Glitter/Gold, OK = Safe square, P = Pit, S = Stench, V = Visited, W = Wumpus]
P1,1 ∨ P2,2 ∨ P3,1
¬P1,1
P2,2 ∨ P3,1
¬P2,2
P3,1
If A or B is true and you know it's not A, then it must be B
Literals Literal: a propositional variable (P) or the negation of a propositional variable (¬Q). Complementary literals: one literal is the negation of the other (P, ¬P).
Clauses Clause: a disjunction of literals, e.g. P ∨ Q ∨ ¬R ∨ ¬P. Unit clause: a single literal, e.g. P or ¬Q.
Unit Resolution
l1 ∨ ... ∨ li ∨ ... ∨ lk,    m
------------------------------
l1 ∨ ... ∨ li-1 ∨ li+1 ∨ ... ∨ lk
where l1, ..., lk and m are literals, and li and m are complementary.
Complementary literals: Positive literal: P. Negative literal: ¬P.
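Unit resolution is easy to sketch in code. The encoding below is my own (not from the slides): a literal is a nonzero integer, with v the positive literal for variable v and -v its negation, and a clause is a set of literals.

```java
import java.util.*;

public class UnitResolution {
    // Resolve a clause with a unit clause m: if the clause contains the
    // complement of m, drop that literal and return the remainder;
    // otherwise the rule does not apply and we return null.
    static Set<Integer> unitResolve(Set<Integer> clause, int unit) {
        if (!clause.contains(-unit)) return null; // no complementary literal
        Set<Integer> result = new HashSet<>(clause);
        result.remove(-unit);
        return result;
    }
}
```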
Hungry ∨ Cranky
¬Hungry
∴ Cranky
[Figure: Wumpus World after visiting (1,1), (2,1), (1,2): breeze in (2,1), stench in (1,2); Wumpus at (1,3), possible pits at (2,2) and (3,1). A = Agent, B = Breeze, G = Glitter/Gold, OK = Safe square, P = Pit, S = Stench, V = Visited, W = Wumpus]
P1,1 ∨ P2,2 ∨ P3,1
¬P1,1
P2,2 ∨ P3,1
¬P2,2
P3,1
Unit Resolution Sound: if α ⊢ β then α ⊨ β. Not complete: there are entailments (α ⊨ β) it cannot derive (α ⊬ β).
Resolution
l1 ∨ ... ∨ li ∨ ... ∨ lk,    m1 ∨ ... ∨ mj ∨ ... ∨ mn
-----------------------------------------------------
l1 ∨ ... ∨ li-1 ∨ li+1 ∨ ... ∨ lk ∨ m1 ∨ ... ∨ mj-1 ∨ mj+1 ∨ ... ∨ mn
where l1, ..., lk, m1, ..., mn are literals, and li and mj are complementary.
Technical note: the resulting clause must be factored to contain only one copy of each literal.
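The full rule is a one-liner over clause sets. A sketch, in the same hypothetical integer-literal encoding (v = variable v, -v = its negation); representing clauses as sets makes the factoring step automatic, since a set keeps only one copy of each literal.

```java
import java.util.*;

public class Resolution {
    // Resolve clauses c1 and c2 on literal l, where l occurs in c1
    // and its complement -l occurs in c2. The union of the remaining
    // literals is the resolvent; the set representation factors duplicates.
    static Set<Integer> resolve(Set<Integer> c1, Set<Integer> c2, int l) {
        Set<Integer> result = new HashSet<>(c1);
        result.remove(l);
        for (int m : c2) if (m != -l) result.add(m);
        return result;
    }
}
```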
P1,1 ∨ P3,1,    ¬P1,1 ∨ P2,2
----------------------------
P3,1 ∨ P2,2
1. Hungry ∨ Cranky
2. ¬Hungry ∨ Sleepy
3. ¬Sleepy ∨ Cranky
4. Sleepy ∨ Cranky (from 1, 2)
5. Cranky ∨ Cranky (from 3, 4)
6. Cranky (factoring)
Resolution Sound: if α ⊢ β then α ⊨ β (easy to show). Complete: if α ⊨ β then α ⊢ β (proof by contradiction; see book).
Resolution
l1 ∨ ... ∨ li ∨ ... ∨ lk,    m1 ∨ ... ∨ mj ∨ ... ∨ mn
-----------------------------------------------------
l1 ∨ ... ∨ li-1 ∨ li+1 ∨ ... ∨ lk ∨ m1 ∨ ... ∨ mj-1 ∨ mj+1 ∨ ... ∨ mn
where l1, ..., lk, m1, ..., mn are literals, and li and mj are complementary.
Technical note: the resulting clause must be factored to contain only one copy of each literal.
A ∧ B    P ⇒ Q    (A ∨ ¬B) ⇒ (B ∧ C)
Conjunctive Normal Form (CNF)
Eliminate ⇔: α ⇔ β ≡ (α ⇒ β) ∧ (β ⇒ α)
Eliminate ⇒: α ⇒ β ≡ ¬α ∨ β
Move negation in: ¬¬α ≡ α; ¬(α ∧ β) ≡ (¬α ∨ ¬β); ¬(α ∨ β) ≡ (¬α ∧ ¬β)
Distribute ∨ over ∧: (α ∨ (β ∧ γ)) ≡ ((α ∨ β) ∧ (α ∨ γ))
A ∧ B    P ⇒ Q    (A ∨ ¬B) ⇒ (B ∧ C)
A ∧ B    ¬P ∨ Q    ¬(A ∨ ¬B) ∨ (B ∧ C)
A, B    ¬P ∨ Q    (¬A ∧ B) ∨ (B ∧ C)
(¬A ∨ B) ∧ (¬A ∨ C) ∧ (B ∨ B) ∧ (B ∨ C)
(¬A ∨ B) ∧ (¬A ∨ C) ∧ B ∧ (B ∨ C)
(¬A ∨ B), (¬A ∨ C), B, (B ∨ C)
Inference Using Resolution Convert sentences (KB) to CNF (set of clauses) Apply resolution inference rule to pairs of clauses with complementary literals Add resulting clause to set of clauses Until...
Proof by Contradiction KB ⊨ α if and only if (KB ∧ ¬α) is unsatisfiable. If the negation of the goal is inconsistent with our knowledge, then the goal itself is entailed by our knowledge.
Resolution Refutation Convert (KB ∧ ¬α) to CNF. Apply the resolution rule until: No new clauses can be added: KB does not entail α. Two clauses resolve to yield the empty clause: KB entails α.
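The refutation loop fits in a few lines of Java. A sketch in the same hypothetical encoding (clauses as sets of signed integers): to ask whether KB entails α, pass in the clauses of KB together with the clauses of ¬α, and look for the empty clause.

```java
import java.util.*;

public class PLResolution {
    // Saturate the clause set under resolution. The input is unsatisfiable
    // iff the empty clause is eventually derived; otherwise the set of
    // derivable clauses is finite, so the loop terminates with "satisfiable".
    static boolean unsatisfiable(Set<Set<Integer>> clauses) {
        Set<Set<Integer>> all = new HashSet<>(clauses);
        while (true) {
            Set<Set<Integer>> fresh = new HashSet<>();
            for (Set<Integer> c1 : all)
                for (Set<Integer> c2 : all)
                    for (int l : c1)
                        if (c2.contains(-l)) {
                            Set<Integer> r = new HashSet<>(c1);
                            r.remove(l);
                            for (int m : c2) if (m != -l) r.add(m);
                            if (r.isEmpty()) return true;   // derived empty clause
                            boolean taut = false;           // discard tautologies
                            for (int m : r) if (r.contains(-m)) { taut = true; break; }
                            if (!taut) fresh.add(r);
                        }
            if (all.containsAll(fresh)) return false;       // no new clauses
            all.addAll(fresh);
        }
    }
}
```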
Proof Using Resolution Proof by contradiction Derive empty clause from (KB α) (converted to CNF, of course)
Effective Resolution Definite clauses Disjunction of literals with exactly one positive literal Horn clauses Disjunction of literals with at most one positive literal Natural reading as facts and if-then rules
Forward Chaining Knowledge base of definite clauses: facts and rules If premises of a rule (conjunction of literals) are known Add its conclusion (single literal) to set of known facts Until either query is added or no further inferences can be made
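The forward-chaining loop above can be sketched directly (class, names, and the string encoding of symbols are my own): keep firing any rule whose premises are all known until the query appears or nothing changes.

```java
import java.util.*;

public class ForwardChaining {
    // A definite clause read as a rule: premises (a conjunction of
    // proposition symbols) imply a single head symbol.
    static class Rule {
        final Set<String> premises; final String head;
        Rule(Set<String> premises, String head) { this.premises = premises; this.head = head; }
    }

    // Fire rules whose premises are all known, adding their heads,
    // until the query is derived or no further inferences can be made.
    static boolean entails(List<Rule> rules, Set<String> facts, String query) {
        Set<String> known = new HashSet<>(facts);
        boolean changed = true;
        while (changed) {
            if (known.contains(query)) return true;
            changed = false;
            for (Rule r : rules)
                if (!known.contains(r.head) && known.containsAll(r.premises)) {
                    known.add(r.head);  // rule fires: conclusion becomes a fact
                    changed = true;
                }
        }
        return known.contains(query);
    }
}
```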
Backward Chaining Work backward from query q If q is known to be true, we are done Otherwise find all rules whose conclusion (head) is q If all the premises (body) of one of those rules can be proven true, then q is true
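The same knowledge base can be queried goal-first. A sketch (hypothetical names, same string encoding of symbols as any definite-clause KB): recurse from the query back through rule bodies, tracking goals in progress so cyclic rules do not loop forever.

```java
import java.util.*;

public class BackwardChaining {
    // A definite clause as a rule: body (a conjunction of symbols) implies head.
    static class Rule {
        final Set<String> body; final String head;
        Rule(Set<String> body, String head) { this.body = body; this.head = head; }
    }

    // Prove q: either q is a known fact, or some rule with head q
    // has a body whose goals can all be proven recursively.
    static boolean prove(List<Rule> rules, Set<String> facts, String q, Set<String> inProgress) {
        if (facts.contains(q)) return true;
        if (!inProgress.add(q)) return false; // already trying q: avoid cycles
        for (Rule r : rules)
            if (r.head.equals(q)) {
                boolean ok = true;
                for (String g : r.body)
                    if (!prove(rules, facts, g, inProgress)) { ok = false; break; }
                if (ok) { inProgress.remove(q); return true; }
            }
        inProgress.remove(q);
        return false;
    }
}
```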
Effective Resolution Definite clauses Disjunction of literals with exactly one positive literal Horn clauses Disjunction of literals with at most one positive literal Natural reading as facts and if-then rules
Propositional Inference Entailment: "follows from" our knowledge. Model checking: intractable in general, but see also AIMA 7.6 & Project 2.
Propositional Inference Inference rules: soundness, completeness Searching for proofs is an alternative to enumerating models May be faster in practice
Propositional Inference Resolution is a sound and complete inference rule Works on clauses (CNF) Special cases: Definite & Horn clauses Forward and backward chaining
Project 2: Logical Sudoku Formulate sudoku using logic. Write a program that inputs a sudoku and outputs its logical representation. Solve the puzzle using an inference engine. Which you don't have to write... Write a program that outputs the solution of the sudoku nicely. Due: Mon 4 Mar 23:00
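One piece of the encoding can be sketched as follows. This is only an illustration, not the project's required encoding: the variable numbering var(r, c, v) and the class name are hypothetical, and only one family of constraints ("each cell holds at least one digit") is generated; the "at most one digit", row, column, and box constraints would be built the same way.

```java
import java.util.*;

public class SudokuCNF {
    // Hypothetical encoding: the proposition "cell (r, c) holds digit v"
    // gets the DIMACS-style variable number 81*r + 9*c + v,
    // for r, c in 0..8 and v in 1..9.
    static int var(int r, int c, int v) { return 81 * r + 9 * c + v; }

    // One clause per cell: the cell holds at least one of the nine digits.
    static List<int[]> atLeastOneDigitClauses() {
        List<int[]> clauses = new ArrayList<>();
        for (int r = 0; r < 9; r++)
            for (int c = 0; c < 9; c++) {
                int[] clause = new int[9];
                for (int v = 1; v <= 9; v++) clause[v - 1] = var(r, c, v);
                clauses.add(clause);
            }
        return clauses;
    }
}
```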
For next time: AIMA 8.0-8.3; 8.1.1-8.1.2. FYI: Quiz on PL