Selected solutions to exercises Knowledge Representation and Reasoning
December 2010

Logic & Resolution (Appendix A)

Exercise 2.1 (iv) We are looking for a substitution θ such that:

    P(x,z,y)θ = P(x,z,x)θ = P(a,x,x)θ

To equate the first arguments, x has to be replaced by a. We then have:

    P(x,z,y){a/x} = P(a,z,y)
    P(x,z,x){a/x} = P(a,z,a)
    P(a,x,x){a/x} = P(a,a,a)

Now, to equate the second and third arguments, both z and y also have to be replaced by a. Hence θ = {a/x, a/y, a/z}.

Exercise 2.2 (v) Following the steps in the Lecture Notes:

    F = ∃x(P(a,x) ∨ ∀y(Q(x,y) ∧ ∃z R(z)))
    ⇒ P(a,b) ∨ ∀y(Q(b,y) ∧ R(f(y)))                 (step 4)
    ⇒ ∀y(P(a,b) ∨ (Q(b,y) ∧ R(f(y))))               (step 5)
    ⇒ ∀y((P(a,b) ∨ Q(b,y)) ∧ (P(a,b) ∨ R(f(y))))    (step 6)
    ⇒ (P(a,b) ∨ Q(b,y)) ∧ (P(a,b) ∨ R(f(y)))        (step 7)
    F′ = {P(a,b) ∨ Q(b,y), P(a,b) ∨ R(f(y))}         (step 8)

Step 4 is the skolemisation step, where each existentially quantified variable is replaced by a function term over the universally quantified variables in whose scope it occurs, or by a constant if the function symbol of the term has no arguments. In general equivalence is lost in step 4 (skolemisation), and that also holds in this case. Compare ∃x P(a,x) with P(a,b): it holds that P(a,b) ⊨ ∃x P(a,x), but not ∃x P(a,x) ⊨ P(a,b) (the reason is that the x may be interpreted as another constant, such as a or c). What is important for resolution is that F is satisfiable if and only if F′ is satisfiable, so here the loss of equivalence does not matter.
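The hand-computed unifier can be checked mechanically. Below is a minimal sketch of syntactic unification (the representation and helper names are my own, not from the Lecture Notes; the occurs check is omitted for brevity):

```python
VARS = {"x", "y", "z"}  # variable symbols; every other string is a constant

def walk(t, s):
    # Follow variable bindings in substitution s to the final value.
    while isinstance(t, str) and t in VARS and t in s:
        t = s[t]
    return t

def unify(t1, t2, s):
    # Return an extended substitution unifying t1 and t2, or None on clash.
    t1, t2 = walk(t1, s), walk(t2, s)
    if t1 == t2:
        return s
    if isinstance(t1, str) and t1 in VARS:
        return {**s, t1: t2}
    if isinstance(t2, str) and t2 in VARS:
        return {**s, t2: t1}
    if (isinstance(t1, tuple) and isinstance(t2, tuple)
            and t1[0] == t2[0] and len(t1) == len(t2)):
        for u, v in zip(t1[1:], t2[1:]):
            s = unify(u, v, s)
            if s is None:
                return None
        return s
    return None  # constant or functor clash

# The three atoms of Exercise 2.1 (iv):
theta = unify(("P", "x", "z", "y"), ("P", "x", "z", "x"), {})
theta = unify(("P", "x", "z", "x"), ("P", "a", "x", "x"), theta)
resolved = {v: walk(v, theta) for v in sorted(VARS)}
print(resolved)  # {'x': 'a', 'y': 'a', 'z': 'a'}, i.e. θ = {a/x, a/y, a/z}
```

Chaining the two calls unifies all three atoms simultaneously, which is exactly the simultaneous substitution computed above.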
Logic programming and Prolog (Chapter 2)

Exercise 1.1 (ii) The SLD derivation is as follows (goals on the left, the input clause and unifier used on the right):

    ← P(x) ∧ Q(y) ∧ L(x,y)
    ← Q(y) ∧ L(a,y)          θ1 = {a/x}, using P(a)
    ← L(a,b)                 θ2 = {b/y}, using Q(b)
    ← D(b)                   θ3 = {b/y}, using L(a,y) ← D(y)
    □                        θ4 = {},    using D(b)

The corresponding substitutions are:

    θ1 = {a/x}
    θ2 = {b/y}
    θ3 = {b/y}
    θ4 = {}

Description Logics & Frames (Chapter 3)

Exercise 1
a. Employee ⊑ Human
b. Mother ≡ Woman ⊓ ∃haschild.⊤
c. Parent ≡ Mother ⊔ Father
d. Grandmother ≡ Mother ⊓ ∃haschild.Parent
e. ∃haschild.Human ⊑ Human

Exercise 2 a. This frame taxonomy summarises some AI and computing science history, which is briefly explained below. MYCIN and PROSPECTOR were two of the early knowledge-based systems, developed at the end of the 1970s, which included methods for reasoning with uncertainty. At the time two reasoning methods were distinguished: top-down or goal-driven inference, similar to backtracking in Prolog, where the reasoning process is controlled by goals, and bottom-up or data-driven inference, where the reasoning process is controlled by data or facts. DEC, an acronym for Digital Equipment Corporation, was a very innovative US computer company that lost impact because of the popularity of the simple, non-networked personal computer with Microsoft's DOS as operating system. DEC's strategy focussed on the development of workstations and network-based minicomputers, very much in line with modern ideas around the internet, cloud computing, and network-based workstations. DEC was the first company to develop a knowledge system, called XCON, that was able to generate the specifications of minicomputers from customer requirements. This made
sense at the time because there were many different options available for most of the computer systems sold by DEC. Although visionary and very successful until about 1987, DEC's strategy no longer worked at the end of the 1980s, when Microsoft, like most consumers, did not believe in network-based computing and favoured the stand-alone PC. DEC was bought by Compaq, which later merged with HP. In the take-over process almost all of the innovative capacity of the company was lost; HP is surely not the major player in AI and computing science innovation that DEC once was. The following frames offer an adequate solution:

    class knowledge-based-system is
        superclass nil
        contents = domain-knowledge
        inference-type : {top-down, bottom-up}
    end

    class diagnostic-system is
        superclass knowledge-based-system
        inference-type = top-down
    end

    class configuration-system is
        superclass knowledge-based-system
        inference-type = bottom-up
    end

    instance MYCIN is
        instance-of diagnostic-system
        developer = Shortliffe
    end

    instance PROSPECTOR is
        instance-of diagnostic-system
        domain = geology
    end

    instance XCON is
        instance-of configuration-system
        company = DEC
    end

b. The taxonomy is represented in Figure 1. The set of inheritance chains is as follows: Ω_T = {y1, y2,
(Figure 1: Taxonomy — y1 at the bottom, y2 [a = c1] and y3 above it, y4 [a = c2] at the top.)

y3, y4, y2[a = c1], y4[a = c2],
y1 y2, y1 y3, y2 y3, y2 y4, y3 y4,
y1 y2 y3, y1 y2 y4, y1 y3 y4, y2 y3 y4, y1 y2 y3 y4,
y1 y2[a = c1], y2 y4[a = c2], y3 y4[a = c2],
y1 y2 y4[a = c2], y1 y3 y4[a = c2], y2 y3 y4[a = c2], y1 y2 y3 y4[a = c2]}

Note that we start with the classes, such as y1, and the attribute-value specifications, such as y4[a = c2], add the subclass relationships, such as y1 y3 and y3 y4, and then concatenate the chains, yielding, for example, y1 y3 y4 and y1 y3 y4[a = c2]. The conclusion set is obtained by propagating the attributes with their values, e.g., a = c2, to the beginning of a chain. For example, y1 y2 y4[a = c2] yields y1[a = c2] as a conclusion. The entire conclusion set is as follows: C(Ω_T) =
{y1[a = c1], y1[a = c2], y2[a = c1], y2[a = c2], y3[a = c2], y4[a = c2]}

C(Ω_T) is not consistent, as we have two different values of attribute a for class y1 and also for class y2. The following inheritance chains are precluded or blocked (see the definition of preclusion, Definition 3.3 on page 53 of the Lecture Notes):

    {y2 y4[a = c2],          (by y2[a = c1])
     y1 y2 y4[a = c2],       (by y1 y2[a = c1])
     y1 y3 y4[a = c2],       (by y1 y2[a = c1])
     y2 y3 y4[a = c2],       (by y2[a = c1])
     y1 y2 y3 y4[a = c2]}    (by y1 y2[a = c1])

The precluded inheritance chains are ignored when computing the inheritable conclusion set. Therefore, the inheritable conclusion set becomes

    H(Ω_T) = {y2[a = c1], y4[a = c2], y1[a = c1], y3[a = c2]}

T is consistent, as there are now no longer cases with two different values of attribute a for any of the classes yi, i = 1,...,4.

Exercise 3 a.

    car ⊑ ∃wheels.{4}
    car ⊑ ∃seats.{4}
    sportscar ⊑ car
    sportscar ⊑ ∃seats.{2}
    Rolls-Royce : car
    (Rolls-Royce, sufficient) : max-speed

If a sportscar is given, then the set is inconsistent. If the set is empty, then it is consistent.

b. An incorrect implementation would have resulted in the situation represented in Figure 2. The algorithm is non-deterministic, because the order is not specified. If the order is y3, y2, then attribute a gets the value c1 (which happens to be correct). If the order is y2, y3, then attribute a gets the value c2 (which is incorrect).
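The non-determinism described in 3b can be made concrete with a toy version of such a flawed inheritance routine (a sketch with invented names; values found at later superclasses simply overwrite values found earlier):

```python
def inherit(cls, taxonomy, acc=None):
    # taxonomy: class -> (list of superclasses, dict of local attribute values).
    # Flawed scheme: visit superclasses in the given order and let values
    # found later overwrite values found earlier.
    acc = {} if acc is None else acc
    supers, local = taxonomy[cls]
    for sup in supers:
        inherit(sup, taxonomy, acc)
    acc.update(local)  # the class's own values override inherited ones
    return acc

# Figure 2: y1 below y2 [a = c1] and y3 [a = c2], in two visiting orders.
order_32 = {"y1": (["y3", "y2"], {}),
            "y2": ([], {"a": "c1"}), "y3": ([], {"a": "c2"})}
order_23 = {"y1": (["y2", "y3"], {}),
            "y2": ([], {"a": "c1"}), "y3": ([], {"a": "c2"})}
print(inherit("y1", order_32)["a"])  # c1 (happens to be correct)
print(inherit("y1", order_23)["a"])  # c2 (incorrect)
```

The result depends entirely on the order of the superclass list, which is exactly the non-determinism the exercise points out.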
(Figure 2: Multiple inheritance with exceptions — y1 with superclasses y2 [a = c1] and y3 [a = c2].)

(Figure 3: Taxonomy — instance x and class y1 at the bottom, y2 [a = c1] and y3 above them, y4 [a = c2] at the top.)
c. The taxonomy is represented in Figure 3. The set of inheritance chains:

    Ω_T = {y1, y2, y3, y4,
    y2[a = c1], y4[a = c2],
    y1 y2, y2 y3, y3 y4, y1 y3,
    y1 y2 y3, y2 y3 y4, y1 y3 y4, y1 y2 y3 y4,
    y1 y2[a = c1], y3 y4[a = c2],
    y2 y3 y4[a = c2], y1 y3 y4[a = c2], y1 y2 y3 y4[a = c2]}

The conclusion set:

    C(Ω_T) = {y1[a = c1], y1[a = c2], y2[a = c1], y2[a = c2], y3[a = c2], y4[a = c2]}

C(Ω_T) is not consistent. The following inheritance chains are precluded:

    {y2 y3 y4[a = c2],       (by y2[a = c1])
     y1 y3 y4[a = c2],       (by y1 y2[a = c1])
     y1 y2 y3 y4[a = c2]}    (by y1 y2[a = c1])

Therefore, the inheritable conclusion set

    H(Ω_T) = {y2[a = c1], y4[a = c2], y1[a = c1], y3[a = c2]}

T is consistent.
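Computing H(Ω_T) by hand follows the pattern "the nearest class carrying a value for a wins; values reached only through precluded chains are ignored". A brute-force sketch of that simplification (it agrees with the taxonomies of Exercises 2b and 3c, but is not a full implementation of Definition 3.3):

```python
def inheritable(cls, supers, carries, attr):
    # Breadth-first search upward through the superclass relation: the
    # first level containing a class that carries attr supplies the
    # inheritable value(s); anything found farther up is ignored.
    seen, level = {cls}, [cls]
    while level:
        found = {carries.get(c, {}).get(attr) for c in level} - {None}
        if found:
            return found
        nxt = []
        for c in level:
            for s in supers.get(c, []):
                if s not in seen:
                    seen.add(s)
                    nxt.append(s)
        level = nxt
    return set()

# Figure 3 taxonomy (Exercise 3c): y1 ⊑ y2, y1 ⊑ y3, y2 ⊑ y3, y3 ⊑ y4.
supers = {"y1": ["y2", "y3"], "y2": ["y3"], "y3": ["y4"]}
carries = {"y2": {"a": "c1"}, "y4": {"a": "c2"}}
for c in ["y1", "y2", "y3", "y4"]:
    print(c, inheritable(c, supers, carries, "a"))
# y1 and y2 get c1, y3 and y4 get c2 — matching H(Ω_T) above
```

For y1 the nearer carrier y2 [a = c1] wins over y4 [a = c2], which is the effect of preclusion in this example.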
(Figure 4: Taxonomy — p1 below p2 [a = c1] and p3 [a = c2].)

Exercise 4 a. We get the following set (conjunction) of formulas in first-order predicate logic:

    Φ = {∀x(F1(x) → F2(x)),
         ∀x(F1(x) → a(x,c1)),
         ∀x(F1(x) → a(x,c2)),
         ∀x(F1(x) → a(x,c3))}

For the inheritance algorithms, consult the slides about Description Logics and Frames. The resulting algorithm now looks as follows. In the Inherit function, replace

    attr-value-pairs ← attr-value-pairs ∪ NewAttributes(pairs, attr-value-pairs)

by

    attr-value-pairs ← MergeAttributes(pairs, attr-value-pairs)

in which MergeAttributes adds newly found values to an attribute which already has values, in contrast to NewAttributes, which discards newly found values if the attribute already has a value.

b. Note: all attributes are single-valued, i.e., we use a function representation with equality (=), i.e., a(x) = c, rather than a predicate (relation) representation, i.e., a(x,c), as in the multi-valued case. The taxonomy is represented in Figure 4. Translation to first-order predicate logic:

    φ = {∀x(p1(x) → p2(x)),
         ∀x(p1(x) → p3(x)),
         ∀x(p2(x) → a(x) = c1),
         ∀x(p3(x) → a(x) = c2),
         c1 ≠ c2}

No inconsistency can be derived. This would have been possible if there were instances, e.g., p1(i), because then a(i) = c1 ∧ a(i) = c2 would be derived, which cannot be satisfied. The conclusion set

    C(Ω_T) = {p1[a = c1], p1[a = c2], p2[a = c1], p3[a = c2]}
(Figure 5: Inheritance relation — root, superframe, frame; facets: value, if-needed-demon, default.)

This conclusion set is inconsistent. Meanings will correspond only if instances are added to the logical formulas.

c. Correct values:

    edge = 2      value facet of cube1    (not: default of cube)
    base = 4      demon of cube           (not: default of prism)
    height = 2    demon of cube           (not: value of prism)
    volume = 8    demon of prism          (not: default of cube1)

The inheritance relation is represented in Figure 5.

Exercise 6 a. One possibility is to translate the formula and to prove the validity of the translation. The equivalent formula in predicate logic is

    ∀x(∀y(r(x,y) → A(y) ∧ B(y)) → (∀y(r(x,y) → A(y)) ∧ ∀y(r(x,y) → B(y))))

Proof: Take some x. Suppose that (1) ∀y(r(x,y) → A(y) ∧ B(y)). Take y such that r(x,y). Then A(y) follows directly from (1). Hence ∀y(r(x,y) → A(y)). The same reasoning applies to ∀y(r(x,y) → B(y)).

b. DIY!

c. In predicate logic:

    ∀x(∀y(r(x,y) → A(y) ∨ B(y)) → (∀y(r(x,y) → A(y)) ∨ ∀y(r(x,y) → B(y))))
Consider the structure with domain D = {d1, d2, d3} and interpretation I such that:

    I(r) = {(d1,d2), (d1,d3)}
    I(A) = {d2}
    I(B) = {d3}

Choose x = d1. Then ∀y(r(x,y) → A(y) ∨ B(y)) holds, but (e.g.) ∀y(r(x,y) → A(y)) does not. Hence, the formula is not true.

Model-based Reasoning (Chapter 4)

Exercise 4a (Circuit schematic omitted in transcription: OR gates O1, O2 and AND gates A1, A2, with inputs 1, 0, 0, 1 and observed output 0.)

Conflict sets are sets of components for which the assumption that they all behave normally is inconsistent with the observations. In this case, there are 5 conflict sets:

    CS1 = {O1, A1}
    CS2 = {O1, O2, A2}
    CS3 = {O1, A1, A2}
    CS4 = {O1, A1, O2}
    CS5 = {O1, O2, A1, A2}

The first two conflict sets are (subset) minimal; adding a component to a conflict set again yields a conflict set. Different hitting set trees are possible, depending on the order in which conflict sets are chosen. For example, the start of a hitting set tree can look as follows:

          {O1, O2, A2}
         /      |      \
       O1       O2      A2
        ✓    {O1, A1}   etc.
              /    \
            O1      A1
             ✓       ✓

Minimal diagnoses correspond to the minimal hitting sets, for example, {O1} and {O2, A1}.
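The minimal hitting sets of the two minimal conflict sets can also be enumerated by brute force (a sketch; for the handful of components here, checking candidates in order of increasing size is enough):

```python
from itertools import combinations

def minimal_hitting_sets(conflicts):
    # Enumerate component sets by increasing size; keep those that hit
    # (intersect) every conflict set and contain no smaller hitting set.
    components = sorted(set().union(*conflicts))
    hits = []
    for r in range(1, len(components) + 1):
        for cand in combinations(components, r):
            cs = set(cand)
            if all(cs & c for c in conflicts) and not any(h <= cs for h in hits):
                hits.append(cs)
    return hits

conflicts = [{"O1", "A1"}, {"O1", "O2", "A2"}]  # the minimal conflict sets
print(minimal_hitting_sets(conflicts))
# three minimal hitting sets: {O1}, {A1, O2} and {A1, A2}
```

Besides {O1} and {O2, A1} mentioned above, this also surfaces the third minimal diagnosis {A1, A2}, which hits CS1 via A1 and CS2 via A2.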
(Figure 6: Causal graph — defects d1, d2, d3 connected via α1 and α2 to findings f1, f2, f3.)

Exercise 7a Draw a causal graph, shown in Figure 6, because that helps in solving the problem. A hypothesis D is called an abductive diagnosis of P = (Σ, F), with observed findings F and causal model Σ = (∆, Φ, R) (see the slides and Lecture Notes for definitions), if:

    (1) R ∪ D ⊨ F
    (2) R ∪ D ∪ C ⊭ ⊥, where C = {¬f | f ∈ Φ, f ∉ F, f is a positive literal}

Note that if we omit part (2), then this definition corresponds to a prediction of F. Thus, a diagnosis D is a prediction of what has been observed, i.e., F, that is consistent with the constraints C. In this case we have F = {f1, f3}, so C = {¬f2}. This gives three possible diagnoses:

    D1 = {d1, d2, d3}
    D2 = {d1, d2, d3, α1}
    D3 = {d1, α1, d3}

Note that if d2 and α2 are included, then we have an inconsistency with ¬f2.

Uncertainty Reasoning (Chapter 5)

Exercise 1c
a. CF(a,e′) = 0.8; CF(b,e′) = 0.4; CF(c,e′) = 0.7.

    CF(a or b or c, e′) = max{max{CF(a,e′), CF(b,e′)}, CF(c,e′)} = max{0.8, 0.7} = 0.8
    CF(f,e1′) = 1 · max{0, CF(e,e′)} = 1 · 0.8 = 0.8

b. CF(c and d, e′) = min{CF(c,e′), CF(d,e′)} = 0.6; CF(f,e2′) = 0.5 · 0.6 = 0.3.

c. CF(f,e3′) = 0.6.

d. Apply the combination function for co-concluding rules to (1), (2), and (3):

    CF(f, e1′ co e2′) = 0.8 + 0.3 · (1 − 0.8) = 0.86
    CF(f, (e1′ co e2′) co e3′) = 0.86 + 0.6 · (1 − 0.86) = 0.944
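The arithmetic in (d) uses the combination function for co-concluding rules with non-negative CFs, CF1 + CF2 · (1 − CF1). A quick check (function name invented):

```python
def cf_co(cf1, cf2):
    # Combination for two co-concluding rules; valid as written only
    # for non-negative certainty factors.
    assert cf1 >= 0 and cf2 >= 0
    return cf1 + cf2 * (1 - cf1)

cf12 = cf_co(0.8, 0.3)    # 0.8  + 0.3 * 0.2  = 0.86
cf123 = cf_co(cf12, 0.6)  # 0.86 + 0.6 * 0.14 = 0.944
print(round(cf12, 3), round(cf123, 3))  # 0.86 0.944
```

The function is commutative and associative on this range, so the order in which the three rules are combined does not matter.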
Exercise 1d To see what it means that such a rule is idempotent, take a value for y, for example c; then fco(x,c) is an operator on the argument x (we could call it o(x)). Idempotence then means that fco(x,c) = fco(fco(x,c),c). This is not the case for the rule mentioned. For example, take x = 0.5 and c = 0.4. Then fco(x,c) = 0.5 + 0.4 · (1 − 0.5) = 0.7 and fco(fco(x,c),c) = 0.7 + 0.4 · (1 − 0.7) = 0.82.

Advantage of idempotence: two or more identical production rules change the CF only once. Disadvantage of idempotence: of different rules that result in an equal CF, only one contributes to the final CF.

Exercise 2b Use the definition of conditional probability and the rules for marginalisation and factorisation of a Bayesian network:

    P(v3|v1) = P(v1,v3) / P(v1)
             = Σ_{V2} P(v1,V2,v3) / P(v1)
             = Σ_{V2} P(v3|V2) P(V2|v1) P(v1) / P(v1)
             = Σ_{V2} P(v3|V2) P(V2|v1)
             = P(v3|v2) P(v2|v1) + P(v3|¬v2) P(¬v2|v1)
             = 0.28

Exercise 3 a. See slide 26 of the slides about uncertainty reasoning. In the certainty factor model, CFs are propagated using the following rule:

    CF(h,e′) = CF(h,e) · max{0, CF(e,e′)}

which we could interpret as the probabilistic statement:

    P(h|e′) = P(h|e) · max{0, P(e|e′)} = P(h|e) P(e|e′)

According to this interpretation of CF rules, we can model the distribution using a Bayesian network E′ → E → H. Then it holds that:

    P(h|e′) = P(h|e) P(e|e′) + P(h|¬e) P(¬e|e′)

This is close to the CF model, but we still need to make sure that P(h|¬e) P(¬e|e′) = 0. If P(¬e|e′) = 0, then P(e|e′) = 1, which is (usually) inconsistent with the CF model, so this is a bad solution. So apparently we also need to require that P(h|¬e) = 0.

b. We can bring the CF model closer to the probabilistic model by adding (besides CF(h,e)) a statement CF(h,¬e) to a rule. Different definitions are possible, for example:

    CF(h,e′) = CF(h,e) · max{0, CF(e,e′)} + CF(h,¬e) · max{0, −CF(e,e′)}
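The marginalisation pattern of Exercise 2b above, P(v3|v1) = Σ_{V2} P(v3|V2) P(V2|v1), can be sketched directly. The CPT numbers below are invented for illustration (the Lecture Notes' actual values are not reproduced in this transcription); they merely happen to reproduce the final 0.28:

```python
# Chain V1 -> V2 -> V3; hypothetical CPT entries.
p_v2_given_v1 = 0.4       # P(v2 | v1)   (invented)
p_v3_given_v2 = 0.4       # P(v3 | v2)   (invented)
p_v3_given_not_v2 = 0.2   # P(v3 | ~v2)  (invented)

# Marginalise V2 out: P(v3|v1) = P(v3|v2) P(v2|v1) + P(v3|~v2) P(~v2|v1)
p_v3_given_v1 = (p_v3_given_v2 * p_v2_given_v1
                 + p_v3_given_not_v2 * (1 - p_v2_given_v1))
print(round(p_v3_given_v1, 2))  # 0.28
```

Because V2 is binary, the sum over V2 has exactly the two terms shown in the derivation.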
c. In the noisy-AND model it holds that:

    P(e|C1,C2) = Σ_{I1,I2} P(e|I1,I2) Π_{k=1,2} P(Ik|Ck) = P(i1|C1) P(i2|C2)

A corresponding CF definition could look as follows:

    CF(h, e1′ co e2′) = CF(h,e1′) · CF(h,e2′)

Exercise 4 a. The Markov networks generated from the logical formula consist of two components:

    R(1,1)  R(2,1)  Q(1,1)  Q(1,2)
    R(1,2)  R(2,2)  Q(2,1)  Q(2,2)

b. Consider the joint distribution P(R(1,1), R(2,1), Q(1,1), Q(1,2)). There are no dependencies with the remaining random variables, so we can compute probabilities of this distribution without considering the rest of the variables. We consider all the states of this distribution and count the number of true instantiations of F. For example, in the state {R(1,1) = false, R(2,1) = false, Q(1,1) = false, Q(1,2) = true}, the true instantiations are:

    R(1,1) ← Q(1,1)
    R(2,1) ← Q(1,1)

For example, R(1,1) ← Q(1,2) is not a true instantiation. A list with the number of true instantiations n(x) of each state x, where P(x) · Z = e^{n(x)·w}:

    R(1,1)  R(2,1)  Q(1,1)  Q(1,2)  n(x)  P(x)·Z
    T       T       T       T       4     e^{4w}
    T       T       T       F       4     e^{4w}
    T       T       F       T       4     e^{4w}
    T       T       F       F       4     e^{4w}
    T       F       T       T       2     e^{2w}
    T       F       T       F       3     e^{3w}
    T       F       F       T       3     e^{3w}
    T       F       F       F       4     e^{4w}
    F       T       T       T       2     e^{2w}
    F       T       T       F       3     e^{3w}
    F       T       F       T       3     e^{3w}
    F       T       F       F       4     e^{4w}
    F       F       T       T       0     e^{0}
    F       F       T       F       2     e^{2w}
    F       F       F       T       2     e^{2w}
    F       F       F       F       4     e^{4w}
(The same can be done for the other component of the model.)

c. The normalisation constant can be computed by summing the last column (because Z = Σ_x P(x) · Z):

    Z = 7e^{4w} + 4e^{3w} + 4e^{2w} + 1

So it holds that:

    P(R(2,1), Q(1,2)) = (1/Z)(e^{4w} + e^{4w} + e^{2w} + e^{3w}) ≈ 0.27

d. Similar to (c):

    P(R(2,1)) = (1/Z)(5e^{4w} + 2e^{3w} + e^{2w})
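The counting in (b)–(d) can be replayed by brute-force enumeration. The sketch below assumes the ground instantiations of F in this component are the four implications from a Q-atom to an R-atom (an assumption reconstructed from the counts in the solution, not stated explicitly in this transcription), with weight w = 1, which is consistent with the ≈ 0.27 value:

```python
import math
from itertools import product

w = 1.0
atoms = ["R11", "R21", "Q11", "Q12"]
# Assumed ground instantiations: Q -> R pairs, true unless Q holds and R does not.
pairs = [("R11", "Q11"), ("R11", "Q12"), ("R21", "Q11"), ("R21", "Q12")]

def n_true(state):
    # Number of true ground instantiations of F in a state.
    return sum(1 for r, q in pairs if not (state[q] and not state[r]))

states = [dict(zip(atoms, bits)) for bits in product([True, False], repeat=4)]
Z = sum(math.exp(w * n_true(s)) for s in states)
p_c = sum(math.exp(w * n_true(s)) for s in states
          if s["R21"] and s["Q12"]) / Z                      # answer (c)
p_d = sum(math.exp(w * n_true(s)) for s in states if s["R21"]) / Z  # answer (d)
print(round(p_c, 3), round(p_d, 3))  # 0.277 0.65
```

The enumeration reproduces the symbolic sums above: Z collects 7 states with n = 4, 4 with n = 3, 4 with n = 2 and one with n = 0, and the (c) marginal matches the stated ≈ 0.27.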
Probabilistic representation and reasoning Applied artificial intelligence (EDAF70) Lecture 04 2019-02-01 Elin A. Topp Material based on course book, chapter 13, 14.1-3 1 Show time! Two boxes of chocolates,
More informationRecall from last time: Conditional probabilities. Lecture 2: Belief (Bayesian) networks. Bayes ball. Example (continued) Example: Inference problem
Recall from last time: Conditional probabilities Our probabilistic models will compute and manipulate conditional probabilities. Given two random variables X, Y, we denote by Lecture 2: Belief (Bayesian)
More informationDefinite Logic Programs
Chapter 2 Definite Logic Programs 2.1 Definite Clauses The idea of logic programming is to use a computer for drawing conclusions from declarative descriptions. Such descriptions called logic programs
More informationUnit I LOGIC AND PROOFS. B. Thilaka Applied Mathematics
Unit I LOGIC AND PROOFS B. Thilaka Applied Mathematics UNIT I LOGIC AND PROOFS Propositional Logic Propositional equivalences Predicates and Quantifiers Nested Quantifiers Rules of inference Introduction
More informationDiscrete Mathematics and Probability Theory Spring 2014 Anant Sahai Note 10
EECS 70 Discrete Mathematics and Probability Theory Spring 2014 Anant Sahai Note 10 Introduction to Basic Discrete Probability In the last note we considered the probabilistic experiment where we flipped
More informationToward Computing Conflict-Based Diagnoses in Probabilistic Logic Programming
Toward Computing Conflict-Based Diagnoses in Probabilistic Logic Programming Arjen Hommersom 1,2 and Marcos L.P. Bueno 2 1 Open University of the Netherlands 2 Radboud University Nijmegen, The Netherlands
More informationProbabilistic Reasoning; Network-based reasoning
Probabilistic Reasoning; Network-based reasoning COMPSCI 276, Spring 2013 Set 1: Introduction and Background Rina Dechter (Reading: Pearl chapter 1-2, Darwiche chapters 1,3) 1 Class Description n Instructor:
More informationDept. of Linguistics, Indiana University Fall 2015
L645 Dept. of Linguistics, Indiana University Fall 2015 1 / 34 To start out the course, we need to know something about statistics and This is only an introduction; for a fuller understanding, you would
More informationDiscrete Probability and State Estimation
6.01, Fall Semester, 2007 Lecture 12 Notes 1 MASSACHVSETTS INSTITVTE OF TECHNOLOGY Department of Electrical Engineering and Computer Science 6.01 Introduction to EECS I Fall Semester, 2007 Lecture 12 Notes
More informationReasoning Under Uncertainty
Reasoning Under Uncertainty Chapter 14&15 Part Kostas (1) Certainty Kontogiannis Factors E&CE 457 Objectives This unit aims to investigate techniques that allow for an algorithmic process to deduce new
More informationPropositional Logic Not Enough
Section 1.4 Propositional Logic Not Enough If we have: All men are mortal. Socrates is a man. Does it follow that Socrates is mortal? Can t be represented in propositional logic. Need a language that talks
More informationCOMP538: Introduction to Bayesian Networks
COMP538: Introduction to Bayesian Networks Lecture 2: Bayesian Networks Nevin L. Zhang lzhang@cse.ust.hk Department of Computer Science and Engineering Hong Kong University of Science and Technology Fall
More informationIntroduction to Artificial Intelligence Belief networks
Introduction to Artificial Intelligence Belief networks Chapter 15.1 2 Dieter Fox Based on AIMA Slides c S. Russell and P. Norvig, 1998 Chapter 15.1 2 0-0 Outline Bayesian networks: syntax and semantics
More informationSection 2.1: Introduction to the Logic of Quantified Statements
Section 2.1: Introduction to the Logic of Quantified Statements In the previous chapter, we studied a branch of logic called propositional logic or propositional calculus. Loosely speaking, propositional
More informationICS141: Discrete Mathematics for Computer Science I
ICS141: Discrete Mathematics for Computer Science I Dept. Information & Computer Sci., Originals slides by Dr. Baek and Dr. Still, adapted by J. Stelovsky Based on slides Dr. M. P. Frank and Dr. J.L. Gross
More informationPredicate Logic & Quantification
Predicate Logic & Quantification Things you should do Homework 1 due today at 3pm Via gradescope. Directions posted on the website. Group homework 1 posted, due Tuesday. Groups of 1-3. We suggest 3. In
More informationLecture Lecture 5
Lecture 4 --- Lecture 5 A. Basic Concepts (4.1-4.2) 1. Experiment: A process of observing a phenomenon that has variation in its outcome. Examples: (E1). Rolling a die, (E2). Drawing a card form a shuffled
More informationLecture 1: Probability Fundamentals
Lecture 1: Probability Fundamentals IB Paper 7: Probability and Statistics Carl Edward Rasmussen Department of Engineering, University of Cambridge January 22nd, 2008 Rasmussen (CUED) Lecture 1: Probability
More informationPredicate Logic. 1 Predicate Logic Symbolization
1 Predicate Logic Symbolization innovation of predicate logic: analysis of simple statements into two parts: the subject and the predicate. E.g. 1: John is a giant. subject = John predicate =... is a giant
More informationMathematical Logic. Reasoning in First Order Logic. Chiara Ghidini. FBK-IRST, Trento, Italy
Reasoning in First Order Logic FBK-IRST, Trento, Italy April 12, 2013 Reasoning tasks in FOL Model checking Question: Is φ true in the interpretation I with the assignment a? Answer: Yes if I = φ[a]. No
More informationIntroduction to Bayesian Learning
Course Information Introduction Introduction to Bayesian Learning Davide Bacciu Dipartimento di Informatica Università di Pisa bacciu@di.unipi.it Apprendimento Automatico: Fondamenti - A.A. 2016/2017 Outline
More informationPredicate Logic. Predicates. Math 173 February 9, 2010
Math 173 February 9, 2010 Predicate Logic We have now seen two ways to translate English sentences into mathematical symbols. We can capture the logical form of a sentence using propositional logic: variables
More informationDirected Graphical Models
CS 2750: Machine Learning Directed Graphical Models Prof. Adriana Kovashka University of Pittsburgh March 28, 2017 Graphical Models If no assumption of independence is made, must estimate an exponential
More informationBayesian belief networks. Inference.
Lecture 13 Bayesian belief networks. Inference. Milos Hauskrecht milos@cs.pitt.edu 5329 Sennott Square Midterm exam Monday, March 17, 2003 In class Closed book Material covered by Wednesday, March 12 Last
More informationMarkov localization uses an explicit, discrete representation for the probability of all position in the state space.
Markov Kalman Filter Localization Markov localization localization starting from any unknown position recovers from ambiguous situation. However, to update the probability of all positions within the whole
More information2/2/2018. CS 103 Discrete Structures. Chapter 1. Propositional Logic. Chapter 1.1. Propositional Logic
CS 103 Discrete Structures Chapter 1 Propositional Logic Chapter 1.1 Propositional Logic 1 1.1 Propositional Logic Definition: A proposition :is a declarative sentence (that is, a sentence that declares
More informationIt rains now. (true) The followings are not propositions.
Chapter 8 Fuzzy Logic Formal language is a language in which the syntax is precisely given and thus is different from informal language like English and French. The study of the formal languages is the
More informationIntroduction to Probabilistic Reasoning. Image credit: NASA. Assignment
Introduction to Probabilistic Reasoning Brian C. Williams 16.410/16.413 November 17 th, 2010 11/17/10 copyright Brian Williams, 2005-10 1 Brian C. Williams, copyright 2000-09 Image credit: NASA. Assignment
More informationINSTITIÚID TEICNEOLAÍOCHTA CHEATHARLACH INSTITUTE OF TECHNOLOGY CARLOW PREDICATE LOGIC
1 CHAPTER 7. PREDICATE LOGIC 1 INSTITIÚID TEICNEOLAÍOCHTA CHEATHARLACH INSTITUTE OF TECHNOLOGY CARLOW PREDICATE LOGIC 1 Predicate Logic 1.1 Introduction There are various arguments which cannot be dealt
More informationPart Possible Score Base 5 5 MC Total 50
Stat 220 Final Exam December 16, 2004 Schafer NAME: ANDREW ID: Read This First: You have three hours to work on the exam. The other questions require you to work out answers to the questions; be sure to
More informationSampling Algorithms for Probabilistic Graphical models
Sampling Algorithms for Probabilistic Graphical models Vibhav Gogate University of Washington References: Chapter 12 of Probabilistic Graphical models: Principles and Techniques by Daphne Koller and Nir
More informationDecomposition. Rudolf Kruse, Alexander Dockhorn Bayesian Networks 81
Decomposition Rudolf Kruse, Alexander Dockhorn Bayesian Networks 81 Object Representation Property Car family body Property Hatchback Motor Radio Doors Seat cover 2.8 L Type 4 Leather, 150kW alpha Type
More informationReasoning Under Uncertainty: Belief Network Inference
Reasoning Under Uncertainty: Belief Network Inference CPSC 322 Uncertainty 5 Textbook 10.4 Reasoning Under Uncertainty: Belief Network Inference CPSC 322 Uncertainty 5, Slide 1 Lecture Overview 1 Recap
More informationExpert Systems! Knowledge Based Systems!
Expert Systems Knowledge Based Systems ES-1 Medical diagnosis» Disease identification Example Areas of Use ES-2 Example Areas of Use 2 Medical diagnosis» Disease identification Natural resource exploration»
More informationExpert Systems! Knowledge Based Systems!
Expert Systems Knowledge Based Systems ES-1 Medical diagnosis» Disease identification Example Areas of Use ES-2 Example Areas of Use 2 Medical diagnosis» Disease identification Natural resource exploration»
More informationPart 2: First-Order Logic
Part 2: First-Order Logic First-order logic formalizes fundamental mathematical concepts is expressive (Turing-complete) is not too expressive (e. g. not axiomatizable: natural numbers, uncountable sets)
More information