Inference in first-order logic


CS 2710 Foundations of AI, Lecture 15: Inference in first-order logic
Milos Hauskrecht, milos@cs.pitt.edu, 5329 Sennott Square

Logical inference in FOL

Logical inference problem: given a knowledge base KB (a set of sentences) and a sentence α, does the KB semantically entail α (KB ⊨ α)?

The logical inference problem in first-order logic is undecidable: there is no procedure that can decide the entailment for all possible input sentences in a finite number of steps.

Methods:
- Inference rule approach
- Resolution refutation

Inference rules for quantifiers

Universal elimination:
  ∀x φ(x) ⊢ φ(a), where a is a constant symbol.
  Substitutes a variable with a constant symbol.
  Example: ∀x Likes(x, IceCream) ⊢ Likes(Ben, IceCream)

Existential elimination:
  ∃x φ(x) ⊢ φ(a), where a is a constant symbol that does not appear elsewhere in the KB.
  Example: ∃x Kill(x, Victim) ⊢ Kill(Murderer, Victim)

Universal introduction:
  φ ⊢ ∀x φ, where x is not free in φ.
  Introduces a universal variable that does not affect φ or its assumptions.
  Example: Sister(Amy, Jane) ⊢ ∀x Sister(Amy, Jane)

Existential introduction:
  φ(a) ⊢ ∃x φ(x), where a is a ground term in φ(a) and x is not free in φ(a).
  Substitutes a ground term in the sentence with a variable and an existential quantifier.
  Example: Likes(Ben, IceCream) ⊢ ∃x Likes(x, IceCream)
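As a concrete illustration of substituting a variable with a constant (the SUBST operation these rules rely on), here is a minimal Python sketch; the tuple encoding of atoms and the `?`-prefix convention for variables are assumptions of this sketch, not part of the lecture:

```python
# Encoding (an assumption of this sketch): an atom such as Likes(x, IceCream)
# is the tuple ('Likes', '?x', 'IceCream'); strings starting with '?' are
# variables, all other strings are constant symbols.

def subst(theta, term):
    """Apply the substitution theta (a dict mapping variables to terms)."""
    if isinstance(term, tuple):                 # atom or compound term
        return tuple(subst(theta, t) for t in term)
    if term.startswith('?'):                    # variable
        return theta.get(term, term)
    return term                                 # constant symbol

# Universal elimination: from forall x Likes(x, IceCream) with {x/Ben}
print(subst({'?x': 'Ben'}, ('Likes', '?x', 'IceCream')))
# -> ('Likes', 'Ben', 'IceCream')
```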

Unification

Problem in inference: universal elimination gives us many opportunities for substituting variables with ground terms:
  ∀x φ(x) ⊢ φ(a), where a is a constant symbol.

Solution: avoid making blind substitutions of ground terms.
- Make substitutions that help to advance inferences; use substitutions matching similar sentences in the KB.
- Make inferences at the variable level; do not substitute ground terms if not necessary.

Unification takes two similar sentences and computes the substitution that makes them look the same, if it exists:
  UNIFY(p, q) = σ such that SUBST(σ, p) = SUBST(σ, q)

Generalized inference rules use substitutions that let us make inferences.

Example: Generalized Modus Ponens. If there exists a substitution σ such that SUBST(σ, A_i) = SUBST(σ, A_i') for all i = 1, 2, …, n:

  A1 ∧ A2 ∧ … ∧ An ⇒ B,  A1', A2', …, An'
  ----------------------------------------
                SUBST(σ, B)

A substitution σ that satisfies the generalized inference rule can be built via the unification process. Advantage of the generalized rules: they are focused; only substitutions that allow the inferences to proceed are tried.
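The UNIFY operation above can be sketched as the standard recursive unification algorithm; this minimal version (the occurs check is omitted for brevity) uses the same tuple-and-`?`-variable encoding as an assumption:

```python
def is_var(t):
    return isinstance(t, str) and t.startswith('?')

def unify(x, y, theta):
    """Return a most general unifier of x and y extending theta, or None."""
    if theta is None:
        return None
    if x == y:
        return theta
    if is_var(x):
        return unify_var(x, y, theta)
    if is_var(y):
        return unify_var(y, x, theta)
    if isinstance(x, tuple) and isinstance(y, tuple) and len(x) == len(y):
        for xi, yi in zip(x, y):
            theta = unify(xi, yi, theta)   # None propagates through the guard
        return theta
    return None

def unify_var(v, x, theta):
    if v in theta:
        return unify(theta[v], x, theta)
    if is_var(x) and x in theta:
        return unify(v, theta[x], theta)
    return {**theta, v: x}                 # occurs check omitted in this sketch

print(unify(('P', '?x', 'B'), ('P', 'A', '?y'), {}))
# -> {'?x': 'A', '?y': 'B'}
print(unify(('P', 'A'), ('P', 'B'), {}))
# -> None
```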

Resolution inference rule

Recall: the resolution inference rule is sound and complete (refutation-complete) for propositional logic in CNF:

  A ∨ B,  ¬A ∨ C
  ---------------
      B ∨ C

The generalized resolution rule is sound and refutation-complete for first-order logic in CNF without equalities (if the set is unsatisfiable, resolution will find the contradiction). For clauses φ1 ∨ … ∨ φk and ψ1 ∨ … ∨ ψn with σ = UNIFY(φi, ¬ψj) ≠ fail:

  φ1 ∨ … ∨ φk,  ψ1 ∨ … ∨ ψn
  --------------------------------------------------------------------------
  SUBST(σ, φ1 ∨ … ∨ φi-1 ∨ φi+1 ∨ … ∨ φk ∨ ψ1 ∨ … ∨ ψj-1 ∨ ψj+1 ∨ … ∨ ψn)

Example:
  P(x) ∨ Q(x),  ¬P(John)
  -----------------------
         Q(John)

Inference with the resolution rule
- The KB must be in CNF.
- Proof by refutation: prove that KB ∧ ¬α is unsatisfiable (resolution is refutation-complete).

Main procedure (steps):
1. Convert KB, ¬α to CNF with ground terms and universally quantified variables only.
2. Apply the resolution rule repeatedly, while keeping track of the substitutions and their consistency.
3. Stop when the empty clause (a contradiction) is derived, or when no more new resolvents (conclusions) follow.

Conversion to CNF

1. Eliminate implications and equivalences:
   (p ⇒ q) becomes (¬p ∨ q)

2. Move negations inside (De Morgan's laws, double negation):
   ¬(p ∧ q) becomes ¬p ∨ ¬q
   ¬(p ∨ q) becomes ¬p ∧ ¬q
   ¬∀x p becomes ∃x ¬p
   ¬∃x p becomes ∀x ¬p
   ¬¬p becomes p

3. Standardize variables (rename duplicate variables):
   (∀x P(x)) ∨ (∃x Q(x)) becomes (∀x P(x)) ∨ (∃y Q(y))

4. Move all quantifiers to the left (after step 3, no invalid capture is possible):
   (∀x P(x)) ∨ (∃y Q(y)) becomes ∀x ∃y (P(x) ∨ Q(y))

5. Skolemization (removal of existential quantifiers through elimination):
   If no universal quantifier occurs before the existential quantifier, replace the variable with a new constant symbol, also called a Skolem constant:
     ∃y P(A, y) becomes P(A, B)
   If a universal quantifier precedes the existential quantifier, replace the variable with a function of the universally quantified variable, called a Skolem function:
     ∀x ∃y P(x, y) becomes ∀x P(x, F(x))

Conversion to CNF (continued)

6. Drop universal quantifiers (all remaining variables are implicitly universally quantified):
   ∀x P(x, F(x)) becomes P(x, F(x))

7. Convert to CNF using the distributive laws:
   p ∨ (q ∧ r) becomes (p ∨ q) ∧ (p ∨ r)

The result is a CNF formula over variables, constants, and functions.

Resolution example

KB (in CNF):
  ¬P(w) ∨ Q(w)
  P(x) ∨ R(x)
  ¬Q(y) ∨ S(y)
  ¬R(z) ∨ S(z)
Query α: S(A). Negated query: ¬S(A).

Derivation:
  ¬P(w) ∨ Q(w) and ¬Q(y) ∨ S(y) resolve with {y/w} to ¬P(w) ∨ S(w)
  ¬P(w) ∨ S(w) and P(x) ∨ R(x) resolve with {x/w} to S(w) ∨ R(w)
  S(w) ∨ R(w) and ¬R(z) ∨ S(z) resolve with {z/w} to S(w)
  S(w) and ¬S(A) resolve with {w/A} to the empty clause: a contradiction.

Therefore KB ⊨ S(A).
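Steps 1, 2, and 7 (the purely propositional steps of the conversion) can be sketched in Python; the tuple encoding of formulas with operators '=>', '&', '|', '~' is an assumption of this sketch:

```python
def elim_imp(f):                  # step 1: eliminate implications
    if isinstance(f, str):
        return f
    op, *args = f
    args = [elim_imp(a) for a in args]
    if op == '=>':                # p => q  becomes  ~p | q
        return ('|', ('~', args[0]), args[1])
    return (op, *args)

def nnf(f):                       # step 2: move negations inward
    if isinstance(f, str):
        return f
    op, *args = f
    if op == '~':
        g = args[0]
        if isinstance(g, str):
            return f              # negated atom: already in NNF
        if g[0] == '~':
            return nnf(g[1])      # double negation
        if g[0] == '&':           # De Morgan's laws
            return ('|', nnf(('~', g[1])), nnf(('~', g[2])))
        if g[0] == '|':
            return ('&', nnf(('~', g[1])), nnf(('~', g[2])))
    return (op, *[nnf(a) for a in args])

def distribute(f):                # step 7: distribute | over &
    if isinstance(f, str) or f[0] == '~':
        return f
    op, a, b = f
    a, b = distribute(a), distribute(b)
    if op == '|' and isinstance(a, tuple) and a[0] == '&':
        return ('&', distribute(('|', a[1], b)), distribute(('|', a[2], b)))
    if op == '|' and isinstance(b, tuple) and b[0] == '&':
        return ('&', distribute(('|', a, b[1])), distribute(('|', a, b[2])))
    return (op, a, b)

def to_cnf(f):
    return distribute(nnf(elim_imp(f)))

print(to_cnf(('=>', 'p', ('&', 'q', 'r'))))
# -> ('&', ('|', ('~', 'p'), 'q'), ('|', ('~', 'p'), 'r'))
```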



Dealing with equality

Resolution works for first-order logic without equalities. To incorporate equalities we need an additional inference rule.

Demodulation rule: given a clause φ1 ∨ … ∨ φk and an equality t1 = t2, if σ = UNIFY(zi, t1) ≠ fail for some term zi occurring in some φi, then:

  φ1 ∨ … ∨ φk,  t1 = t2
  ---------------------------------------------
  SUB(SUBST(σ, t1), SUBST(σ, t2), φ1 ∨ … ∨ φk)

that is, the matched occurrence of t1 is replaced by t2.

Example:
  P(f(a)),  f(x) = x
  ------------------
        P(a)

Paramodulation rule: a more powerful variant. Resolution plus paramodulation gives a refutation-complete proof theory for first-order logic with equality.
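The refutation procedure can be sketched at the propositional (ground) level. This minimal Python version treats the example KB with its clauses instantiated at the constant A, an assumption made to keep the sketch short; full first-order resolution would carry unification and substitutions as described above:

```python
from itertools import combinations

def resolve(c1, c2):
    """All resolvents of two clauses. A clause is a frozenset of literals;
    a literal is a string, with negation marked by a leading '~'."""
    out = []
    for lit in c1:
        comp = lit[1:] if lit.startswith('~') else '~' + lit
        if comp in c2:
            out.append(frozenset((c1 - {lit}) | (c2 - {comp})))
    return out

def resolution_refutation(clauses):
    """Return True iff the clause set is unsatisfiable (derives the empty clause)."""
    clauses = set(clauses)
    while True:
        new = set()
        for c1, c2 in combinations(clauses, 2):
            for r in resolve(c1, c2):
                if not r:              # empty clause: contradiction found
                    return True
                new.add(r)
        if new <= clauses:             # no new resolvents: satisfiable
            return False
        clauses |= new

# The example KB ground at A, plus the negated query ~S(A)
kb = [frozenset(c) for c in
      [{'~P', 'Q'}, {'P', 'R'}, {'~Q', 'S'}, {'~R', 'S'}, {'~S'}]]
print(resolution_refutation(kb))   # -> True: KB entails S(A)
```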

Sentences in Horn normal form

Horn normal form (HNF) in propositional logic: a conjunction of clauses, each a special type of clause with at most one positive literal, e.g.
  (¬A ∨ B) ∧ (¬A ∨ ¬C ∨ D)
Typically written as implications:
  (A ⇒ B) ∧ ((A ∧ C) ⇒ D)

A clause with one literal, e.g. A, is also called a fact. A clause representing an implication (with a conjunction of positive literals in the antecedent and one positive literal in the consequent) is also called a rule.

The resolution rule and modus ponens are both complete inference rules for unit inferences over KBs in Horn normal form. Recall: not all KBs are convertible to HNF.

Horn normal form in FOL

First-order logic adds variables and quantifiers and works with terms; in HNF for FOL, the primitive sentences (propositions) are formed by predicates.

Generalized modus ponens rule: if there exists a substitution σ such that SUBST(σ, φi') = SUBST(σ, φi) for all i:

  φ1', φ2', …, φn',  φ1 ∧ φ2 ∧ … ∧ φn ⇒ ψ
  ------------------------------------------
                SUBST(σ, ψ)

Generalized resolution and generalized modus ponens are complete for unit inferences over KBs in HNF. Not all first-order logic sentences can be expressed in HNF.

Forward and backward chaining

Two inference procedures based on modus ponens for Horn KBs:

Forward chaining
  Idea: whenever the premises of a rule are satisfied, infer the conclusion; continue with rules that have become satisfied.
  Typical usage: when we want to infer all sentences entailed by the existing KB.

Backward chaining (goal reduction)
  Idea: to prove a fact that appears in the conclusion of a rule, prove the premises of the rule; continue recursively.
  Typical usage: when we want to prove that the target (goal) sentence is entailed by the existing KB.

Both procedures are complete for KBs in Horn form.

Forward chaining example

Assume the KB with the following rules and facts:
  R1: Steamboat(x) ∧ Sailboat(y) ⇒ Faster(x, y)
  R2: Sailboat(y) ∧ RowBoat(z) ⇒ Faster(y, z)
  R3: Faster(x, y) ∧ Faster(y, z) ⇒ Faster(x, z)
  F1: Steamboat(Titanic)
  F2: Sailboat(Mistral)
  F3: RowBoat(PondArrow)

Theorem: Faster(Titanic, PondArrow)?

Rule R1 is satisfied by F1: Steamboat(Titanic) and F2: Sailboat(Mistral); infer
  F4: Faster(Titanic, Mistral)

Rule R2 is satisfied by F2: Sailboat(Mistral) and F3: RowBoat(PondArrow); infer
  F5: Faster(Mistral, PondArrow)

Rule R3 is satisfied by F4 and F5; infer
  F6: Faster(Titanic, PondArrow)

The theorem is proved.
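The forward-chaining loop just traced can be sketched in Python; the tuple encoding of facts and the `?`-variable convention are assumptions of this sketch:

```python
def is_var(t):
    return t.startswith('?')

def match(pattern, fact, theta):
    """Extend theta so that pattern equals the ground fact, or return None."""
    if len(pattern) != len(fact):
        return None
    theta = dict(theta)
    for p, f in zip(pattern, fact):
        if is_var(p):
            if theta.get(p, f) != f:
                return None
            theta[p] = f
        elif p != f:
            return None
    return theta

def all_matches(premises, facts, theta):
    """Yield every substitution matching all premises against the facts."""
    if not premises:
        yield theta
        return
    for fact in facts:
        t2 = match(premises[0], fact, theta)
        if t2 is not None:
            yield from all_matches(premises[1:], facts, t2)

def forward_chain(rules, facts):
    """Naive fixed-point forward chaining for a Horn KB without functions."""
    facts = set(facts)
    changed = True
    while changed:
        changed = False
        for premises, conclusion in rules:
            for theta in list(all_matches(premises, facts, {})):
                new = tuple(theta.get(t, t) for t in conclusion)
                if new not in facts:
                    facts.add(new)
                    changed = True
    return facts

rules = [
    ((('Steamboat', '?x'), ('Sailboat', '?y')), ('Faster', '?x', '?y')),  # R1
    ((('Sailboat', '?y'), ('RowBoat', '?z')), ('Faster', '?y', '?z')),    # R2
    ((('Faster', '?x', '?y'), ('Faster', '?y', '?z')),
     ('Faster', '?x', '?z')),                                             # R3
]
facts = [('Steamboat', 'Titanic'), ('Sailboat', 'Mistral'),
         ('RowBoat', 'PondArrow')]
print(('Faster', 'Titanic', 'PondArrow') in forward_chain(rules, facts))
# -> True
```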

Backward chaining example

Backward chaining (goal reduction): to prove a fact that appears in the conclusion of a rule, prove the antecedents (the if-part) of the rule, and repeat recursively.

KB:
  R1: Steamboat(x) ∧ Sailboat(y) ⇒ Faster(x, y)
  R2: Sailboat(y) ∧ RowBoat(z) ⇒ Faster(y, z)
  R3: Faster(x, y) ∧ Faster(y, z) ⇒ Faster(x, z)
  F1: Steamboat(Titanic)
  F2: Sailboat(Mistral)
  F3: RowBoat(PondArrow)

Theorem: Faster(Titanic, PondArrow)?

Try R1 for the goal Faster(Titanic, PondArrow) with {x/Titanic, y/PondArrow}: the subgoals are Steamboat(Titanic) and Sailboat(PondArrow). Steamboat(Titanic) matches F1, but Sailboat(PondArrow) cannot be proved; this branch fails.

Try R2 with {y/Titanic, z/PondArrow}: the subgoals are Sailboat(Titanic) and RowBoat(PondArrow). Sailboat(Titanic) cannot be proved; this branch fails.

Try R3 with {x/Titanic, z/PondArrow}: the subgoals are Faster(Titanic, y) and Faster(y, PondArrow).

To prove the subgoal Faster(Titanic, y), use R1 with {x/Titanic, y/Mistral}: Steamboat(Titanic) matches F1 and Sailboat(Mistral) matches F2, so Faster(Titanic, Mistral) holds.

Note that y must be bound to the same term in both subgoals of R3, so the remaining subgoal becomes Faster(Mistral, PondArrow).

To prove Faster(Mistral, PondArrow), use R2 with {y/Mistral, z/PondArrow}: Sailboat(Mistral) matches F2 and RowBoat(PondArrow) matches F3. Both subgoals of R3 now hold, so Faster(Titanic, PondArrow) is proved.
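The goal-reduction search just traced can be sketched in Python; the encoding of atoms as tuples and the variable-renaming scheme (standardizing rule variables apart on each use) are assumptions of this sketch:

```python
import itertools

def is_var(t):
    return t.startswith('?')

def walk(t, theta):
    """Follow variable bindings in theta to their value."""
    while is_var(t) and t in theta:
        t = theta[t]
    return t

def unify_atom(a, b, theta):
    """Unify two atoms (a predicate name followed by constants/variables)."""
    if len(a) != len(b) or a[0] != b[0]:
        return None
    theta = dict(theta)
    for x, y in zip(a[1:], b[1:]):
        x, y = walk(x, theta), walk(y, theta)
        if x == y:
            continue
        if is_var(x):
            theta[x] = y
        elif is_var(y):
            theta[y] = x
        else:
            return None
    return theta

fresh = itertools.count()

def rename(rule):
    """Standardize the rule's variables apart from all earlier uses."""
    premises, conclusion = rule
    n = str(next(fresh))
    r = lambda t: t + n if is_var(t) else t
    return ([tuple(r(t) for t in p) for p in premises],
            tuple(r(t) for t in conclusion))

def prove(goals, rules, facts, theta):
    """Backward chaining: yield substitutions under which all goals hold."""
    if not goals:
        yield theta
        return
    goal, rest = goals[0], goals[1:]
    for fact in facts:                      # try known facts first
        t2 = unify_atom(goal, fact, theta)
        if t2 is not None:
            yield from prove(rest, rules, facts, t2)
    for rule in rules:                      # then reduce via rules
        premises, conclusion = rename(rule)
        t2 = unify_atom(goal, conclusion, theta)
        if t2 is not None:
            yield from prove(premises + rest, rules, facts, t2)

rules = [
    ([('Steamboat', '?x'), ('Sailboat', '?y')], ('Faster', '?x', '?y')),  # R1
    ([('Sailboat', '?y'), ('RowBoat', '?z')], ('Faster', '?y', '?z')),    # R2
    ([('Faster', '?x', '?y'), ('Faster', '?y', '?z')],
     ('Faster', '?x', '?z')),                                             # R3
]
facts = [('Steamboat', 'Titanic'), ('Sailboat', 'Mistral'),
         ('RowBoat', 'PondArrow')]

# Take only the first proof: exhaustively enumerating all proofs would loop
# forever on the transitivity rule R3.
theta = next(prove([('Faster', 'Titanic', 'PondArrow')], rules, facts, {}), None)
print(theta is not None)   # -> True: the goal is proved
```

Because `prove` is a generator, taking only the first solution avoids the infinite regress that the transitivity rule R3 would cause under exhaustive depth-first search.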