The Core Method. The Very Idea. The Propositional CORE Method. Human Reasoning
1 The Core Method
International Center for Computational Logic, Technische Universität Dresden, Germany
Overview: The Very Idea, The Propositional CORE Method, Human Reasoning
The Core Method 1
2 The Very Idea
Various semantics for logic programs coincide with fixed points of associated immediate consequence operators (Apt, van Emden: Contributions to the Theory of Logic Programming. Journal of the ACM 29(3): 841-862, 1982).
Banach Contraction Mapping Theorem: a contraction mapping f defined on a complete metric space (X, d) has a unique fixed point. The sequence y, f(y), f(f(y)), ... converges to this fixed point for any y ∈ X.
Consider programs whose immediate consequence operator is a contraction (Fitting: Metric Methods: Three Examples and a Theorem. Journal of Logic Programming 21(3): 113-127, 1994).
Every continuous function on the reals can be uniformly approximated by feed-forward connectionist networks (Funahashi: On the Approximate Realization of Continuous Mappings by Neural Networks. Neural Networks 2(3): 183-192, 1989).
Consider programs whose immediate consequence operator is continuous (H., Kalinke, Störr: Approximating the Semantics of Logic Programs by Recurrent Neural Networks. Applied Intelligence 11: 45-59, 1999).
3 First Ideas
H., Kalinke: Towards a New Massively Parallel Computational Model for Logic Programming. In: Proceedings of the ECAI'94 Workshop on Combining Symbolic and Connectionist Processing, 68-77, 1994.
4 Interpretations
Let L be a propositional language and {⊤, ⊥} the set of truth values. An interpretation I is a mapping L → {⊤, ⊥}.
For a given program P, an interpretation I can be represented by the set of atoms occurring in P which are mapped to ⊤ under I, i.e., I ⊆ R_P, where R_P is the set of atoms occurring in P. 2^{R_P} is the set of all interpretations for P, and (2^{R_P}, ⊆) is a complete lattice.
An interpretation I for P is a model for P iff I(P) = ⊤.
5 Immediate Consequence Operator
The immediate consequence operator T_P : 2^{R_P} → 2^{R_P} is defined by
T_P(I) = {A | there is a clause A ← L_1 ∧ ... ∧ L_n ∈ P such that I ⊨ L_1 ∧ ... ∧ L_n}.
I is a supported model iff T_P(I) = I.
Let lfp T_P be the least fixed point of T_P, if it exists.
6 The Propositional CORE Method
Let L be a propositional logic language. Given a logic program P together with its immediate consequence operator T_P, let |R_P| = m and let 2^{R_P} be the set of interpretations for P.
Find a mapping rep : 2^{R_P} → R^m.
Construct a feed-forward network computing f_P : R^m → R^m, called the core, such that the following holds:
If T_P(I) = J then f_P(rep(I)) = rep(J), where I, J ∈ 2^{R_P}.
If f_P(s) = t then T_P(rep⁻¹(s)) = rep⁻¹(t), where s, t ∈ R^m.
Connect the units in the output layer recursively to the units in the input layer.
Show that the following holds: I = lfp T_P iff the recurrent network converges to rep(I), i.e., it reaches a stable state with input and output layer representing rep(I).
Connectionist model generation using recurrent networks with a feed-forward core.
7 3-Layer Recurrent Networks
[Diagram: a 3-layer recurrent network; the feed-forward core consists of input layer, hidden layer, and output layer, with the output layer connected back to the input layer.]
At each point in time all units do: apply the activation function to obtain the potential, apply the output function to obtain the output.
8 Propositional CORE Method using Binary Threshold Units
Let L be the language of propositional logic. Let P be a propositional logic program, e.g., P = {p ←, r ← p ∧ q, r ← p ∧ ¬q}.
T_P(I) = {A | A ← L_1 ∧ ... ∧ L_m ∈ P such that I ⊨ L_1 ∧ ... ∧ L_m}.
T_P(∅) = {p}
T_P({p}) = {p, r}
T_P({p, r}) = {p, r} = lfp T_P
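The iteration on this slide can be reproduced directly. A minimal sketch (the clause encoding as head/positive-body/negative-body triples is mine, chosen for illustration):

```python
# Immediate consequence operator T_P for propositional programs.
# A clause A <- L_1 /\ ... /\ L_n is encoded as (head, positives, negatives).

def t_p(program, interpretation):
    """One application of T_P: collect all heads whose body is true in I."""
    result = set()
    for head, pos, neg in program:
        if pos <= interpretation and not (neg & interpretation):
            result.add(head)
    return result

def lfp(program):
    """Iterate T_P starting from the empty interpretation until a fixed point."""
    current = set()
    while True:
        nxt = t_p(program, current)
        if nxt == current:
            return current
        current = nxt

# P = {p <- , r <- p /\ q, r <- p /\ not q}
program = [
    ("p", set(), set()),
    ("r", {"p", "q"}, set()),
    ("r", {"p"}, {"q"}),
]
```

With this encoding, `t_p(program, set())` yields `{"p"}` and the iteration stabilizes at `{"p", "r"}`, matching the trace above.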
9 Representing Interpretations
Let m = |R_P| and identify R_P with {1, ..., m}. Define rep : 2^{R_P} → R^m such that for all 1 ≤ j ≤ m:
rep(I)[j] = 1 if j ∈ I, and rep(I)[j] = 0 if j ∉ I.
E.g., if R_P = {p, q, r} = {1, 2, 3} and I = {p, r} then rep(I) = (1, 0, 1).
Other encodings are possible, e.g.:
rep'(I)[j] = 1 if j ∈ I, and rep'(I)[j] = -1 if j ∉ I.
We can represent interpretations by arrays of binary or bipolar threshold units.
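Both encodings can be sketched in a few lines (function names are mine):

```python
# Binary and bipolar encodings of an interpretation I over an ordered
# list of atoms, as on the slide above.

def rep_binary(atoms, interpretation):
    """rep(I)[j] = 1 if atom j is in I, 0 otherwise."""
    return [1 if a in interpretation else 0 for a in atoms]

def rep_bipolar(atoms, interpretation):
    """rep'(I)[j] = 1 if atom j is in I, -1 otherwise."""
    return [1 if a in interpretation else -1 for a in atoms]

atoms = ["p", "q", "r"]
```

For I = {p, r} this gives (1, 0, 1) in the binary and (1, -1, 1) in the bipolar encoding.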
10 Computing the Core
Theorem: For each program P, there exists a core of logical threshold units computing T_P.
Proof: Let P be a program, m = |R_P|, and ω ∈ R⁺. Wlog we assume that all occurrences of ⊤ in P have been eliminated.
Translation Algorithm
1. Input and output layer: vectors of length m of binary threshold units, with thresholds 0.5 and ω/2 in the input and output layer, respectively.
2. For each clause of the form A ← L_1 ∧ ... ∧ L_k ∈ P, k ≥ 0, do:
2.1 Add a binary threshold unit u_h to the hidden layer.
2.2 Connect u_h to the unit representing A in the output layer with weight ω.
2.3 For each literal L_j, 1 ≤ j ≤ k, connect the unit representing L_j in the input layer to u_h; if L_j is an atom then set the weight to ω, otherwise set the weight to -ω.
2.4 Set the threshold of u_h to ω(p - 0.5), where p is the number of positive literals occurring in L_1 ∧ ... ∧ L_k.
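The translation algorithm can be sketched as follows; this is a reconstruction under the weight and threshold scheme stated above (ω for positive literals, -ω for negative ones, hidden thresholds ω(p - 0.5), output thresholds ω/2), with helper names of my choosing:

```python
# Sketch of the translation algorithm: build the hidden layer of a core
# of binary threshold units for a program, then propagate input vectors.
# Clauses are (head, positives, negatives); omega > 0 is arbitrary.

def build_core(atoms, program, omega=1.0):
    hidden = []  # one unit per clause: (head, weights per atom, threshold)
    for head, pos, neg in program:
        weights = {a: (omega if a in pos else -omega if a in neg else 0.0)
                   for a in atoms}
        hidden.append((head, weights, omega * (len(pos) - 0.5)))
    return hidden

def f_p(atoms, hidden, vec, omega=1.0):
    """Propagate an input vector through the hidden and output layer."""
    inputs = dict(zip(atoms, vec))
    # hidden layer: unit for a clause fires iff its potential reaches
    # the threshold omega * (p - 0.5)
    active_heads = [head for head, w, theta in hidden
                    if sum(w[a] * inputs[a] for a in atoms) >= theta]
    # output layer: the unit for atom A fires iff its potential,
    # omega times the number of active clause units for A, reaches omega/2
    return [1 if omega * active_heads.count(a) >= omega / 2 else 0
            for a in atoms]

atoms = ["p", "q", "r"]
program = [("p", set(), set()),
           ("r", {"p", "q"}, set()),
           ("r", {"p"}, {"q"})]
core = build_core(atoms, program)
```

Feeding rep(∅) = (0, 0, 0) into the core yields rep({p}) = (1, 0, 0), then rep({p, r}) = (1, 0, 1), which is a fixed point, i.e., f_P(rep(I)) = rep(T_P(I)) on this example.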
11 Computing the Core (Continued)
Theorem: For each program P, there exists a core of logical threshold units computing T_P.
Proof (Continued): Some observations:
u_h becomes active at time t + 1 iff L_1 ∧ ... ∧ L_k is mapped to ⊤ by the interpretation represented by the state of the input layer at time t.
The unit representing A in the output layer becomes active at time t + 2 iff there is a rule of the form A ← L_1 ∧ ... ∧ L_k ∈ P and the unit u_h in the hidden layer corresponding to this rule is active at time t + 1.
The result follows immediately from these observations. qed
12 Computing the Core (Example)
Consider again P = {p ←, r ← p ∧ q, r ← p ∧ ¬q}. The translation algorithm yields:
[Diagram: a 3-layer network with input units p, q, r, one hidden unit per clause, and output units p, q, r; the connection for the negative literal ¬q carries a negative weight.]
13 Hidden Layers are Needed
The XOR can be represented by the program {r ← p ∧ ¬q, r ← ¬p ∧ q}.
Proposition: 2-layer networks cannot compute T_P for definite P.
Proof: Suppose there exist 2-layer networks computing T_P for definite P. Consider P = {p ← q, p ← r ∧ s, p ← t ∧ u}.
[Diagram: a 2-layer network with input units 1-6 for p, q, r, s, t, u and output units 7-12; the output unit for p has threshold θ_7 and incoming weights w_7j.]
Let v be the state of the input layer; v represents an interpretation.
14 Hidden Layers are Needed (Continued)
Proposition: 2-layer networks cannot compute T_P for definite P.
Proof (Continued): Consider P = {p ← q, p ← r ∧ s, p ← t ∧ u}.
We have to find θ_7 and w_7j, 2 ≤ j ≤ 6, such that p ∈ T_P(v) iff w_72 v_2 + w_73 v_3 + w_74 v_4 + w_75 v_5 + w_76 v_6 ≥ θ_7.
Because conjunction is commutative we find p ∈ T_P(v) iff w_72 v_2 + w_74 v_3 + w_73 v_4 + w_76 v_5 + w_75 v_6 ≥ θ_7.
Consequently, p ∈ T_P(v) iff w_72 v_2 + w_1 (v_3 + v_4) + w_2 (v_5 + v_6) ≥ θ_7, where w_1 = (w_73 + w_74)/2 and w_2 = (w_75 + w_76)/2.
15 Hidden Layers are Needed (Continued)
Proposition: 2-layer networks cannot compute T_P for definite P.
Proof (Continued): Consider P = {p ← q, p ← r ∧ s, p ← t ∧ u}.
Likewise, because disjunction is commutative we find p ∈ T_P(v) iff w·x ≥ θ_7, where w = (w_72 + w_1 + w_2)/3 and x = Σ_{j=2}^{6} v_j.
For the network to compute T_P the following must hold:
If x = 0 (v_2 = ... = v_6 = 0) then w·x - θ_7 < 0.
If x = 1 (v_2 = 1, v_3 = ... = v_6 = 0) then w·x - θ_7 ≥ 0.
If x = 2 (v_2 = v_4 = v_6 = 0, v_3 = v_5 = 1) then w·x - θ_7 < 0.
However, the derivative d(w·x - θ_7)/dx = w cannot change its sign; contradiction.
Consequently, 2-layer feed-forward networks cannot compute T_P. qed
16 Adding Recurrent Connections
Recall P = {p ←, r ← p ∧ q, r ← p ∧ ¬q}.
[Diagram: the core from the example with each output unit connected back to the corresponding input unit.]
17 On the Existence of Least Fixed Points
Theorem: For definite programs P, T_P has a least fixed point, which can be obtained by iterating T_P starting with the empty interpretation. (Apt, van Emden: Contributions to the Theory of Logic Programming. Journal of the ACM 29(3): 841-862, 1982.)
In general, however, least fixed points do not always exist. Consider P = {p ← ¬q, q ← ¬p}.
[Diagram: the recurrent network for P; the connections for the negative literals carry negative weights.]
The corresponding recurrent network does not reach a stable state if initialized with the empty interpretation. It has two stable states, corresponding to the interpretations {p} and {q}.
18 Metrics for Logic Programs
Let P be a program and l a level mapping for P. For interpretations I, J ∈ 2^{R_P} we define
d_P(I, J) = 0 if I = J, and d_P(I, J) = 2^{-n} if n is the smallest level on which I and J differ.
Proposition 1: (2^{R_P}, d_P) is a complete metric space.
Proposition 2: If P is acceptable, then there exists a metric space such that T_P is a contraction on it.
For proofs of both propositions see Fitting: Metric Methods: Three Examples and a Theorem. Journal of Logic Programming 21(3): 113-127, 1994.
Corollary: If P is acceptable, then there exists a 3-layer recurrent network of logical threshold units such that the computation starting with an arbitrary initial input converges and yields the unique fixed point of T_P.
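The distance d_P is easy to compute once a level mapping is fixed; a sketch (the level mapping here is a plain dict chosen for illustration):

```python
# The distance d_P induced by a level mapping l: d_P(I, J) = 2^(-n),
# where n is the smallest level on which I and J differ, and 0 if I = J.

def d_p(level, interp_i, interp_j):
    diff = interp_i ^ interp_j          # atoms on which I and J differ
    if not diff:
        return 0.0
    n = min(level[a] for a in diff)     # smallest level with a difference
    return 2.0 ** (-n)

level = {"p": 1, "q": 2, "r": 3}
```

For instance, {p} and ∅ differ already on level 1, giving distance 1/2, while {p, q} and {p, r} first differ on level 2, giving distance 1/4.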
19 Time and Space Complexity
Let n = |P| be the number of clauses and m = |R_P| be the number of propositional variables occurring in P.
The core has 2m + n units and at most 2mn connections. T_P(I) is computed in 2 steps.
The parallel computational model to compute T_P(I) is optimal. (A parallel computational model requiring p(n) processors and t(n) time to solve a problem of size n is optimal if p(n)·t(n) = O(T(n)), where T(n) is the sequential time to solve this problem; see Karp, Ramachandran: Parallel Algorithms for Shared-Memory Machines. In: Handbook of Theoretical Computer Science, Elsevier, 1990.)
The recurrent network settles down in 3n steps in the worst case. (Dowling, Gallier: Linear-Time Algorithms for Testing the Satisfiability of Propositional Horn Formulae. Journal of Logic Programming 1(3): 267-284, 1984; Scutellà: A Note on Dowling and Gallier's Top-Down Algorithm for Propositional Horn Satisfiability. Journal of Logic Programming 8(3): 265-273, 1990.)
Exercise: Give an example of a program with worst-case time behavior.
20 Reasoning wrt Least Fixed Points
Let P be a program and assume that T_P admits a least fixed point; let lfp T_P denote it. It can be shown that lfp T_P is the least model of P.
We define P ⊨_lm F iff (lfp T_P)(F) = ⊤.
Observe that ⊨_lm differs from ⊨. Consider P = {p ←, q ← r}. Then lfp T_P = {p} and P ⊨_lm p ∧ ¬q ∧ ¬r, but P ⊭ ¬q and P ⊭ ¬r.
If we consider ⊨_lm, then negation is not classical negation. This is the reason for using ∼ instead of ¬; ∼ is often called negation by failure.
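The difference between the two consequence relations can be checked mechanically on the example program; a minimal sketch (same illustrative clause encoding as before):

```python
# For P = {p <- , q <- r}: lfp T_P = {p}, so reasoning wrt the least
# fixed point derives "not q" and "not r" by failure, although neither
# follows classically from P.

def t_p(program, interpretation):
    return {head for head, pos, neg in program
            if pos <= interpretation and not (neg & interpretation)}

def lfp(program):
    current = set()
    while t_p(program, current) != current:
        current = t_p(program, current)
    return current

program = [("p", set(), set()), ("q", {"r"}, set())]
least_model = lfp(program)
```

The least model contains p but neither q nor r, so p, ¬q, and ¬r hold under ⊨_lm.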
21 Extensions
The approach has been extended to many-valued logic programs (Kalinke 1994; Seda, Lane 2004), extended logic programs (d'Avila Garcez, Broda, Gabbay 2002), modal logic programs (d'Avila Garcez, Lamb, Gabbay 2002), intuitionistic logic programs (d'Avila Garcez, Lamb, Gabbay 2003), and first-order logic programs (H., Kalinke, Störr 1999; Bader, Hitzler, H., Witzel 2007).
22 KBANN Knowledge-Based Artificial Neural Networks
Towell, Shavlik: Extracting Refined Rules from Knowledge-Based Neural Networks. Machine Learning 13(1): 71-101, 1993.
Can we do better than empirical learning? Consider acyclic logic programs, e.g., P = {a ← b ∧ c ∧ d, a ← d ∧ e, h ← f ∧ g, k ← a ∧ h}.
[Diagram: the KBANN network for P with input units b, c, d, e, f, g, hidden units for a and h, and an output unit for k.]
23 KBANN Learning
Given hierarchical sets of propositional rules as background knowledge:
Map the rules into multi-layer feed-forward networks with sigmoidal units.
Add hidden units (optional).
Add units for known input features that are not referenced in the rules.
Fully connect the layers.
Add near-zero random numbers to all links and thresholds.
Apply backpropagation.
Empirical evaluation: the system performs better than purely empirical and purely hand-built classifiers.
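The initialization steps above can be sketched compactly. This is a KBANN-style sketch, not Towell and Shavlik's exact procedure: the weight magnitude ω, the noise range, and all names are assumptions of mine.

```python
import random

# Sketch of KBANN-style initialization: map each rule into a sigmoidal
# unit with weight omega per positive antecedent and -omega per negated
# one, fully connect by giving every input at least a near-zero weight,
# then perturb all weights and biases so backpropagation can refine them.

def kbann_init(atoms, rules, omega=4.0, noise=0.01, seed=0):
    rng = random.Random(seed)
    units = []  # one unit per rule: (head, weights per atom, bias)
    for head, pos, neg in rules:
        weights = {a: (omega if a in pos else -omega if a in neg else 0.0)
                   for a in atoms}
        bias = -omega * (len(pos) - 0.5)   # bias = negated threshold
        perturbed = {a: w + rng.uniform(-noise, noise)
                     for a, w in weights.items()}
        units.append((head, perturbed, bias + rng.uniform(-noise, noise)))
    return units
```

After initialization every rule antecedent carries a large weight (≈ ±ω), unreferenced input features carry near-zero weights, and training starts from the encoded knowledge rather than from scratch.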
24 KBANN A Problem
Towell, Shavlik 1993: KBANN works if rules have few conditions and there are few rules with the same head.
[Diagram: units whose rules have many conditions (q_1, ..., q_19) or many rules with the same head; with sigmoidal output functions and β = 1 the resulting activations (≈ 0.46 and ≈ 0.6) lie between the values for active and passive, i.e., the units are undecided.]
25 Solving the Problem
d'Avila Garcez, Zaverucha, Carvalho: Logic Programming and Inductive Learning in Artificial Neural Networks. In: Knowledge Representation in Neural Networks (Herrmann, Strohmaier, eds.), Logos, Berlin, 33-46.
Can we combine the ideas of the propositional CORE method and KBANN while avoiding the above-mentioned problem?
The approach has been generalized in Bader: Neural-Symbolic Integration. PhD thesis, TU Dresden, Fakultät Informatik, 2009.
26 Propositional CORE Method using Squashing Units
Let u be a squashing unit with output function Ψ. Let Ψ⁻ = lim_{p→-∞} Ψ(p) and Ψ⁺ = lim_{p→∞} Ψ(p). Let a⁻, a₀, a⁺ ∈ R such that Ψ⁻ < a⁻ < a₀ < a⁺ < Ψ⁺.
u is active iff v ≥ a⁺.
u is passive iff v ≤ a⁻.
u is in the null-state iff v = a₀.
u is undecided iff a⁻ < v < a⁺ and v ≠ a₀.
p⁺ = Ψ⁻¹(a⁺) is called the minimal activation potential.
p⁻ = Ψ⁻¹(a⁻) is called the maximal inactivation potential.
p₀ = Ψ⁻¹(a₀) is called the null-state potential.
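The four states can be sketched as a small classifier; the threshold values below are taken from the tanh example on a later slide, and the function name is mine:

```python
import math

# Classify the state of a squashing unit with output v, given
# a_minus < a_0 < a_plus (here -0.8 < 0.0 < 0.8, with Psi = tanh).

def unit_state(v, a_minus=-0.8, a_zero=0.0, a_plus=0.8):
    if v >= a_plus:
        return "active"
    if v <= a_minus:
        return "passive"
    if v == a_zero:
        return "null-state"
    return "undecided"

# the three distinguished potentials for Psi = tanh
p_plus = math.atanh(0.8)    # minimal activation potential
p_minus = math.atanh(-0.8)  # maximal inactivation potential
p_zero = math.atanh(0.0)    # null-state potential
```

For Ψ = tanh these potentials are approximately ±1.0986 and 0.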
27 The Task
How can we guarantee that a unit is either active, passive, or in the null-state?
Suppose input layer units output only finitely many values. Let u be a hidden layer unit. If the input layer is finite, then the potential of u may only take finitely many different values. Let P = {p_1, ..., p_n} be the set of possible values for the potential of u. Let P⁺ = {p ∈ P | p > p₀} and P⁻ = {p ∈ P | p < p₀}.
Let m = max(m⁻, m⁺), where
m⁺ = 0 if P⁺ = ∅, and m⁺ = p⁺ / (min(P⁺) - p₀) otherwise,
m⁻ = 0 if P⁻ = ∅, and m⁻ = p⁻ / (max(P⁻) - p₀) otherwise.
Observations: If the weights on the connections to u and the threshold of u are multiplied by m, then u is either active, passive, or in the null-state. Moreover, u produces only finitely many different output values. The transformation can be applied to output layer units as well.
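The scaling factor can be sketched directly from these formulas (a reconstruction under the definitions above; Ψ = tanh as in the example that follows, function name mine):

```python
import math

# Sketch of the scaling factor m = max(m-, m+) that pushes every
# reachable potential out of the undecided range.

def scaling_factor(potentials, a_minus, a_zero, a_plus, inv=math.atanh):
    p_plus, p_minus, p_zero = inv(a_plus), inv(a_minus), inv(a_zero)
    pos = [p for p in potentials if p > p_zero]
    neg = [p for p in potentials if p < p_zero]
    m_plus = 0.0 if not pos else p_plus / (min(pos) - p_zero)
    m_minus = 0.0 if not neg else p_minus / (max(neg) - p_zero)
    return max(m_minus, m_plus)

potentials = [-0.9, -0.3, 0.0, 0.2, 0.4, 0.8]
m = scaling_factor(potentials, -0.8, 0.0, 0.8)
```

After multiplying by m ≈ 5.493, every positive potential yields tanh(m·p) ≥ 0.8 and every negative one tanh(m·p) ≤ -0.8, so the unit is never undecided.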
28 Example
Consider a bipolar sigmoidal unit u with Ψ(p) = tanh(p). Let P = {-0.9, -0.3, 0.0, 0.2, 0.4, 0.8} and a⁻ = -0.8, a₀ = 0.0, a⁺ = 0.8. Then p⁻ = Ψ⁻¹(a⁻) ≈ -1.1, p₀ = Ψ⁻¹(a₀) = 0.0, p⁺ = Ψ⁻¹(a⁺) ≈ 1.1.
Hence, m⁻ = -1.1 / -0.3 ≈ 3.66 and m⁺ = 1.1 / 0.2 ≈ 5.493. Thus, m = max(m⁻, m⁺) = max(3.66, 5.493) = 5.493.
[Table: p, tanh(p), and tanh(m·p) for each p ∈ P; after scaling, each value tanh(m·p) is active, passive, or in the null-state.]
Exercise: Specify a core of bipolar sigmoidal units for P = {p ←, r ← p ∧ q, r ← p ∧ ¬q} with a⁻ = -0.9, a₀ = 0.0, a⁺ = 0.9.
29 Results
The relation to logic programs is preserved:
For each program P, there exists a core of squashing units computing T_P.
If P is acceptable, then there exists a 3-layer recurrent network of squashing units such that the computation starting with an arbitrary initial input converges and yields the unique fixed point of T_P.
Likewise for consistent acceptable extended logic programs.
The core is trainable by backpropagation.
The neural-symbolic cycle: a symbolic system is embedded into a connectionist system (writable), the connectionist system is trainable, and rules are extracted back into the symbolic system (readable).
30 Cores for Three-Valued Logic Programs
Consider {p ← ¬q}. A translation algorithm translates programs into a core; recurrent connections connect the output to the input layer. (Kalinke 1995; Seda, Lane 2004.)
[Table: the truth values of p and q under the Fitting operator Φ_F and the Stenning-van Lambalgen operator Φ_SvL; the operators differ on this program.]
31 A CORE Method for Human Reasoning
Consider three-layer feed-forward networks of binary threshold units; the input as well as the output layer shall represent interpretations.
Theorem: For each program P there exists a feed-forward core computing Φ_SvL,P.
Add recurrent connections between corresponding units in the output and the input layer.
Corollary: The recurrent network reaches a stable state representing lfp Φ_SvL,P if initialized with ⟨∅, ∅⟩.
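The operator Φ_SvL,P itself can be sketched on three-valued interpretations ⟨I⊤, I⊥⟩. The clause encoding is mine: a fact A ← ⊤ uses the special body atom TOP, an assumption A ← ⊥ uses BOT.

```python
# Sketch of the Stenning-van Lambalgen operator on three-valued
# interpretations <I_top, I_bot>; clauses are (head, positives, negatives).

def phi_svl(program, i_top, i_bot):
    true_atoms, false_atoms = i_top | {"TOP"}, i_bot | {"BOT"}
    # an atom becomes true if some clause for it has a true body ...
    j_top = {h for h, pos, neg in program
             if pos <= true_atoms and neg <= false_atoms}
    # ... and false if it has clauses and all their bodies are false
    heads = {h for h, _, _ in program}
    j_bot = {h for h in heads
             if all(pos & false_atoms or neg & true_atoms
                    for h2, pos, neg in program if h2 == h)}
    return j_top, j_bot

def lfp_phi(program):
    """Iterate Phi starting from the empty interpretation <{}, {}>."""
    i = (set(), set())
    while phi_svl(program, *i) != i:
        i = phi_svl(program, *i)
    return i

# P_4 = {l <- e and not ab, e <- TRUE, ab <- FALSE} (see the next slide)
p4 = [("l", {"e"}, {"ab"}), ("e", {"TOP"}, set()), ("ab", {"BOT"}, set())]
```

Starting from ⟨∅, ∅⟩ the iteration reaches ⟨{e, l}, {ab}⟩, the least fixed point used in the suppression-task slides.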
32 The Suppression Task: Modus Ponens
If she has an essay to write, she will study late in the library. She has an essay to write.
96% of subjects conclude that she will study late in the library.
P_4 = {l ← e ∧ ¬ab, e ← ⊤, ab ← ⊥}.
[Diagram: the core for P_4 with recurrent connections; units for e, l, and ab in the input and output layer.]
lfp Φ_SvL,P_4 = lm_3Ł wc P_4 = ⟨{l, e}, {ab}⟩. From ⟨{l, e}, {ab}⟩ it follows that she will study late in the library.
33 The Suppression Task: Alternative Arguments
If she has an essay to write, she will study late in the library. She has an essay to write. If she has some textbooks to read, she will study late in the library.
96% of subjects conclude that she will study late in the library.
P_5 = {l ← e ∧ ¬ab_1, e ← ⊤, ab_1 ← ⊥, l ← t ∧ ¬ab_2, ab_2 ← ⊥}.
[Diagram: the core for P_5 with recurrent connections.]
lfp Φ_SvL,P_5 = lm_3Ł wc P_5 = ⟨{e, l}, {ab_1, ab_2}⟩. From ⟨{e, l}, {ab_1, ab_2}⟩ it follows that she will study late in the library.
34 The Suppression Task: Additional Argument
If she has an essay to write, she will study late in the library. She has an essay to write. If the library stays open, she will study late in the library.
38% of subjects conclude that she will study late in the library.
P_6 = {l ← e ∧ ¬ab_1, e ← ⊤, l ← o ∧ ¬ab_2, ab_1 ← ¬o, ab_2 ← ¬e}.
[Diagram: the core for P_6 with recurrent connections.]
lfp Φ_SvL,P_6 = lm_3Ł wc P_6 = ⟨{e}, {ab_2}⟩. From ⟨{e}, {ab_2}⟩ it follows that it is unknown whether she will study late in the library.
35 The Suppression Task: Denial of the Antecedent (DA)
If she has an essay to write, she will study late in the library. She does not have an essay to write.
46% of subjects conclude that she will not study late in the library.
P_7 = {l ← e ∧ ¬ab, e ← ⊥, ab ← ⊥}.
[Diagram: the core for P_7 with recurrent connections.]
lfp Φ_SvL,P_7 = lm_3Ł wc P_7 = ⟨∅, {ab, e, l}⟩. From ⟨∅, {ab, e, l}⟩ it follows that she will not study late in the library.
36 The Suppression Task: Alternative Argument and DA
If she has an essay to write, she will study late in the library. She does not have an essay to write. If she has textbooks to read, she will study late in the library.
4% of subjects conclude that she will not study late in the library.
P_8 = {l ← e ∧ ¬ab_1, e ← ⊥, ab_1 ← ⊥, l ← t ∧ ¬ab_2, ab_2 ← ⊥}.
[Diagram: the core for P_8 with recurrent connections.]
lfp Φ_SvL,P_8 = lm_3Ł wc P_8 = ⟨∅, {ab_1, ab_2, e}⟩. From ⟨∅, {ab_1, ab_2, e}⟩ it follows that it is unknown whether she will study late in the library.
37 The Suppression Task: Additional Argument and DA
If she has an essay to write, she will study late in the library. She does not have an essay to write. If the library is open, she will study late in the library.
63% of subjects conclude that she will not study late in the library.
P_9 = {l ← e ∧ ¬ab_1, e ← ⊥, l ← o ∧ ¬ab_2, ab_1 ← ¬o, ab_2 ← ¬e}.
[Diagram: the core for P_9 with recurrent connections.]
lfp Φ_SvL,P_9 = lm_3Ł wc P_9 = ⟨{ab_2}, {e, l}⟩. From ⟨{ab_2}, {e, l}⟩ it follows that she will not study late in the library.
38 Summary
Under Łukasiewicz semantics we obtain:

Byrne 1989                          | Program | lm_3Ł wc P_i (l)
Modus Ponens: l (96%)               | P_4     | ⊤
Alternative Arguments: l (96%)      | P_5     | ⊤
Additional Arguments: l (38%)       | P_6     | U
Modus Ponens and DA: ¬l (46%)       | P_7     | ⊥
Alternative Arguments and DA: ¬l (4%)  | P_8  | U
Additional Arguments and DA: ¬l (63%)  | P_9  | ⊥

The approach appears to be adequate. Fitting semantics or completion is inadequate.
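The whole table can be checked mechanically by computing the least fixed point of the Stenning-van Lambalgen operator for each program and reading off the value of l. A self-contained sketch (clause encoding with the special body atoms TOP for facts and BOT for assumptions is mine):

```python
# Compute lm_3Ł wc P_i for P_4 ... P_9 and read off the truth value of l.

def phi_svl(program, i_top, i_bot):
    true_atoms, false_atoms = i_top | {"TOP"}, i_bot | {"BOT"}
    heads = {h for h, _, _ in program}
    j_top = {h for h, pos, neg in program
             if pos <= true_atoms and neg <= false_atoms}
    j_bot = {h for h in heads
             if all(pos & false_atoms or neg & true_atoms
                    for h2, pos, neg in program if h2 == h)}
    return j_top, j_bot

def value_of_l(program):
    """Iterate Phi from <{}, {}> and classify l as T, F, or U (unknown)."""
    i = (set(), set())
    while phi_svl(program, *i) != i:
        i = phi_svl(program, *i)
    return "T" if "l" in i[0] else "F" if "l" in i[1] else "U"

fact, assumption = {"TOP"}, {"BOT"}
p4 = [("l", {"e"}, {"ab"}), ("e", fact, set()), ("ab", assumption, set())]
p5 = [("l", {"e"}, {"ab1"}), ("e", fact, set()), ("ab1", assumption, set()),
      ("l", {"t"}, {"ab2"}), ("ab2", assumption, set())]
p6 = [("l", {"e"}, {"ab1"}), ("e", fact, set()),
      ("l", {"o"}, {"ab2"}), ("ab1", set(), {"o"}), ("ab2", set(), {"e"})]
p7 = [("l", {"e"}, {"ab"}), ("e", assumption, set()),
      ("ab", assumption, set())]
p8 = [("l", {"e"}, {"ab1"}), ("e", assumption, set()),
      ("ab1", assumption, set()),
      ("l", {"t"}, {"ab2"}), ("ab2", assumption, set())]
p9 = [("l", {"e"}, {"ab1"}), ("e", assumption, set()),
      ("l", {"o"}, {"ab2"}), ("ab1", set(), {"o"}), ("ab2", set(), {"e"})]

results = [value_of_l(p) for p in (p4, p5, p6, p7, p8, p9)]
```

The computed sequence ⊤, ⊤, U, ⊥, U, ⊥ for P_4 through P_9 matches the rightmost column of the table.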
semantics for Logic Nicolás Madrid & Universidad de Málaga September 23, 2009 Aims of this paper We are studying the introduction of two kinds of negations into residuated : Default negation: This negation
More informationPropositional Logic: Models and Proofs
Propositional Logic: Models and Proofs C. R. Ramakrishnan CSE 505 1 Syntax 2 Model Theory 3 Proof Theory and Resolution Compiled at 11:51 on 2016/11/02 Computing with Logic Propositional Logic CSE 505
More informationPropositional Logics and their Algebraic Equivalents
Propositional Logics and their Algebraic Equivalents Kyle Brooks April 18, 2012 Contents 1 Introduction 1 2 Formal Logic Systems 1 2.1 Consequence Relations......................... 2 3 Propositional Logic
More informationUnsupervised Neural-Symbolic Integration
Unsupervised Neural-Symbolic Integration Son N. Tran The Australian E-Health Research Center, CSIRO son.tran@csiro.au Abstract Symbolic has been long considered as a language of human intelligence while
More information02 Propositional Logic
SE 2F03 Fall 2005 02 Propositional Logic Instructor: W. M. Farmer Revised: 25 September 2005 1 What is Propositional Logic? Propositional logic is the study of the truth or falsehood of propositions or
More informationIntelligent Agents. First Order Logic. Ute Schmid. Cognitive Systems, Applied Computer Science, Bamberg University. last change: 19.
Intelligent Agents First Order Logic Ute Schmid Cognitive Systems, Applied Computer Science, Bamberg University last change: 19. Mai 2015 U. Schmid (CogSys) Intelligent Agents last change: 19. Mai 2015
More informationChapter 2 Algorithms and Computation
Chapter 2 Algorithms and Computation In this chapter, we first discuss the principles of algorithm and computation in general framework, common both in classical and quantum computers, then we go to the
More informationInference Methods In Propositional Logic
Lecture Notes, Artificial Intelligence ((ENCS434)) University of Birzeit 1 st Semester, 2011 Artificial Intelligence (ENCS434) Inference Methods In Propositional Logic Dr. Mustafa Jarrar University of
More informationInference in Propositional Logic
Inference in Propositional Logic Deepak Kumar November 2017 Propositional Logic A language for symbolic reasoning Proposition a statement that is either True or False. E.g. Bryn Mawr College is located
More information2. The Logic of Compound Statements Summary. Aaron Tan August 2017
2. The Logic of Compound Statements Summary Aaron Tan 21 25 August 2017 1 2. The Logic of Compound Statements 2.1 Logical Form and Logical Equivalence Statements; Compound Statements; Statement Form (Propositional
More informationTrichotomy Results on the Complexity of Reasoning with Disjunctive Logic Programs
Trichotomy Results on the Complexity of Reasoning with Disjunctive Logic Programs Mirosław Truszczyński Department of Computer Science, University of Kentucky, Lexington, KY 40506, USA Abstract. We present
More informationA Connectionist Cognitive Model for Temporal Synchronisation and Learning
A Connectionist Cognitive Model for Temporal Synchronisation and Learning Luís C. Lamb and Rafael V. Borges Institute of Informatics Federal University of Rio Grande do Sul orto Alegre, RS, 91501-970,
More informationCS 380: ARTIFICIAL INTELLIGENCE PREDICATE LOGICS. Santiago Ontañón
CS 380: RTIFICIL INTELLIGENCE PREDICTE LOGICS Santiago Ontañón so367@drexeledu Summary of last day: Logical gents: The can reason from the knowledge they have They can make deductions from their perceptions,
More informationTitle: Logical Agents AIMA: Chapter 7 (Sections 7.4 and 7.5)
B.Y. Choueiry 1 Instructor s notes #12 Title: Logical Agents AIMA: Chapter 7 (Sections 7.4 and 7.5) Introduction to Artificial Intelligence CSCE 476-876, Fall 2018 URL: www.cse.unl.edu/ choueiry/f18-476-876
More informationStrong AI vs. Weak AI Automated Reasoning
Strong AI vs. Weak AI Automated Reasoning George F Luger ARTIFICIAL INTELLIGENCE 6th edition Structures and Strategies for Complex Problem Solving Artificial intelligence can be classified into two categories:
More informationDismatching and Local Disunification in EL
Dismatching and Local Disunification in EL (Extended Abstract) Franz Baader, Stefan Borgwardt, and Barbara Morawska Theoretical Computer Science, TU Dresden, Germany {baader,stefborg,morawska}@tcs.inf.tu-dresden.de
More informationLogical Agents. Chapter 7
Logical Agents Chapter 7 Outline Knowledge-based agents Wumpus world Logic in general - models and entailment Propositional (Boolean) logic Equivalence, validity, satisfiability Inference rules and theorem
More information7 LOGICAL AGENTS. OHJ-2556 Artificial Intelligence, Spring OHJ-2556 Artificial Intelligence, Spring
109 7 LOGICAL AGENS We now turn to knowledge-based agents that have a knowledge base KB at their disposal With the help of the KB the agent aims at maintaining knowledge of its partially-observable environment
More informationKnowledge Extraction from DBNs for Images
Knowledge Extraction from DBNs for Images Son N. Tran and Artur d Avila Garcez Department of Computer Science City University London Contents 1 Introduction 2 Knowledge Extraction from DBNs 3 Experimental
More informationPropositional Logic: Syntax
4 Propositional Logic: Syntax Reading: Metalogic Part II, 22-26 Contents 4.1 The System PS: Syntax....................... 49 4.1.1 Axioms and Rules of Inference................ 49 4.1.2 Definitions.................................
More informationComp487/587 - Boolean Formulas
Comp487/587 - Boolean Formulas 1 Logic and SAT 1.1 What is a Boolean Formula Logic is a way through which we can analyze and reason about simple or complicated events. In particular, we are interested
More informationTwo-Valued Logic Programs
Two-Valued Logic Programs Vladimir Lifschitz University of Texas at Austin, USA Abstract We define a nonmonotonic formalism that shares some features with three other systems of nonmonotonic reasoning
More informationProof Theoretical Studies on Semilattice Relevant Logics
Proof Theoretical Studies on Semilattice Relevant Logics Ryo Kashima Department of Mathematical and Computing Sciences Tokyo Institute of Technology Ookayama, Meguro, Tokyo 152-8552, Japan. e-mail: kashima@is.titech.ac.jp
More informationPropositional and Predicate Logic - II
Propositional and Predicate Logic - II Petr Gregor KTIML MFF UK WS 2016/2017 Petr Gregor (KTIML MFF UK) Propositional and Predicate Logic - II WS 2016/2017 1 / 16 Basic syntax Language Propositional logic
More informationKnowledge based Agents
Knowledge based Agents Shobhanjana Kalita Dept. of Computer Science & Engineering Tezpur University Slides prepared from Artificial Intelligence A Modern approach by Russell & Norvig Knowledge Based Agents
More informationLOGIC PROPOSITIONAL REASONING
LOGIC PROPOSITIONAL REASONING WS 2017/2018 (342.208) Armin Biere Martina Seidl biere@jku.at martina.seidl@jku.at Institute for Formal Models and Verification Johannes Kepler Universität Linz Version 2018.1
More informationPart 1: Propositional Logic
Part 1: Propositional Logic Literature (also for first-order logic) Schöning: Logik für Informatiker, Spektrum Fitting: First-Order Logic and Automated Theorem Proving, Springer 1 Last time 1.1 Syntax
More informationA New 3-CNF Transformation by Parallel-Serial Graphs 1
A New 3-CNF Transformation by Parallel-Serial Graphs 1 Uwe Bubeck, Hans Kleine Büning University of Paderborn, Computer Science Institute, 33098 Paderborn, Germany Abstract For propositional formulas we
More informationA Little Logic. Propositional Logic. Satisfiability Problems. Solving Sudokus. First Order Logic. Logic Programming
A Little Logic International Center for Computational Logic Technische Universität Dresden Germany Propositional Logic Satisfiability Problems Solving Sudokus First Order Logic Logic Programming A Little
More informationIntroduction to Metalogic
Philosophy 135 Spring 2008 Tony Martin Introduction to Metalogic 1 The semantics of sentential logic. The language L of sentential logic. Symbols of L: Remarks: (i) sentence letters p 0, p 1, p 2,... (ii)
More informationArtificial Intelligence
Artificial Intelligence Propositional Logic Marc Toussaint University of Stuttgart Winter 2015/16 (slides based on Stuart Russell s AI course) Outline Knowledge-based agents Wumpus world Logic in general
More informationLogical agents. Chapter 7. Chapter 7 1
Logical agents Chapter 7 Chapter 7 1 Outline Knowledge-based agents Logic in general models and entailment Propositional (oolean) logic Equivalence, validity, satisfiability Inference rules and theorem
More informationPropositional Logic. Fall () Propositional Logic Fall / 30
Propositional Logic Fall 2013 () Propositional Logic Fall 2013 1 / 30 1 Introduction Learning Outcomes for this Presentation 2 Definitions Statements Logical connectives Interpretations, contexts,... Logically
More informationDisplay calculi in non-classical logics
Display calculi in non-classical logics Revantha Ramanayake Vienna University of Technology (TU Wien) Prague seminar of substructural logics March 28 29, 2014 Revantha Ramanayake (TU Wien) Display calculi
More informationInference Methods In Propositional Logic
Lecture Notes, Advanced Artificial Intelligence (SCOM7341) Sina Institute, University of Birzeit 2 nd Semester, 2012 Advanced Artificial Intelligence (SCOM7341) Inference Methods In Propositional Logic
More informationHypersequent Calculi for some Intermediate Logics with Bounded Kripke Models
Hypersequent Calculi for some Intermediate Logics with Bounded Kripke Models Agata Ciabattoni Mauro Ferrari Abstract In this paper we define cut-free hypersequent calculi for some intermediate logics semantically
More informationLogical Agent & Propositional Logic
Logical Agent & Propositional Logic Berlin Chen 2005 References: 1. S. Russell and P. Norvig. Artificial Intelligence: A Modern Approach. Chapter 7 2. S. Russell s teaching materials Introduction The representation
More information22c:145 Artificial Intelligence
22c:145 Artificial Intelligence Fall 2005 Propositional Logic Cesare Tinelli The University of Iowa Copyright 2001-05 Cesare Tinelli and Hantao Zhang. a a These notes are copyrighted material and may not
More informationTR : Binding Modalities
City University of New York (CUNY) CUNY Academic Works Computer Science Technical Reports Graduate Center 2012 TR-2012011: Binding Modalities Sergei N. Artemov Tatiana Yavorskaya (Sidon) Follow this and
More informationComputational Intelligence Winter Term 2017/18
Computational Intelligence Winter Term 207/8 Prof. Dr. Günter Rudolph Lehrstuhl für Algorithm Engineering (LS ) Fakultät für Informatik TU Dortmund Plan for Today Single-Layer Perceptron Accelerated Learning
More informationArtificial Intelligence
Artificial Intelligence Propositional Logic Marc Toussaint University of Stuttgart Winter 2016/17 (slides based on Stuart Russell s AI course) Motivation: Most students will have learnt about propositional
More informationOutline. Formale Methoden der Informatik First-Order Logic for Forgetters. Why PL1? Why PL1? Cont d. Motivation
Outline Formale Methoden der Informatik First-Order Logic for Forgetters Uwe Egly Vienna University of Technology Institute of Information Systems Knowledge-Based Systems Group Motivation Syntax of PL1
More informationA brief introduction to Logic. (slides from
A brief introduction to Logic (slides from http://www.decision-procedures.org/) 1 A Brief Introduction to Logic - Outline Propositional Logic :Syntax Propositional Logic :Semantics Satisfiability and validity
More information