Selected solutions to exercises Knowledge Representation and Reasoning


2010–2011, 15th December 2010

Logic & Resolution (Appendix A)

Exercise 2.1 (iv)

We are looking for a substitution θ such that:

P(x,z,y)θ = P(x,z,x)θ = P(a,x,x)θ

To equate the first arguments, x has to be replaced by a. We then have:

P(x,z,y){a/x} = P(a,z,y)
P(x,z,x){a/x} = P(a,z,a)
P(a,x,x){a/x} = P(a,a,a)

Now, to equate the second and third arguments, both z and y also have to be replaced by a. Hence θ = {a/x, a/y, a/z}.

Exercise 2.2 (v)

Following the steps on pages 96–97 of the Lecture Notes:

F = ∃x(P(a,x) ∨ ∀y(Q(x,y) ∧ ∃z R(z)))
  ⇒ P(a,b) ∨ ∀y(Q(b,y) ∧ R(f(y)))               (step 4)
  ⇒ ∀y(P(a,b) ∨ (Q(b,y) ∧ R(f(y))))              (step 5)
  ⇒ ∀y((P(a,b) ∨ Q(b,y)) ∧ (P(a,b) ∨ R(f(y))))   (step 6)
  ⇒ (P(a,b) ∨ Q(b,y)) ∧ (P(a,b) ∨ R(f(y)))       (step 7)
F′ = {P(a,b) ∨ Q(b,y), P(a,b) ∨ R(f(y))}         (step 8)

Step 4 is the skolemisation step, where each existential quantifier is replaced by a (function) term over the universally quantified variables in whose scope it occurs, or by a constant if the function symbol of the term has no arguments. In general, equivalence is lost in step 4 (skolemisation); this also holds in this case. Compare ∃x P(a,x) with P(a,b): it holds that P(a,b) ⊨ ∃x P(a,x), but not ∃x P(a,x) ⊨ P(a,b) (the reason is that x may be interpreted as another constant, such as a or c). Important for resolution is that F′ is satisfiable if and only if F is satisfiable, so here it does not really matter.
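The substitution found in Exercise 2.1 (iv) can be checked mechanically. Below is a minimal Python sketch (not part of the course material); terms are encoded as tuples (functor, arguments...) and variables and constants as strings:

```python
# Minimal sketch (not the lecture-notes implementation): a term is either a
# string (variable or constant) or a tuple (functor, arg1, arg2, ...).
def apply_subst(term, theta):
    """Apply a substitution theta (dict: variable -> term) to a term."""
    if isinstance(term, tuple):                     # compound term: recurse
        functor, *args = term
        return (functor, *(apply_subst(a, theta) for a in args))
    return theta.get(term, term)                    # variable or constant

theta = {"x": "a", "y": "a", "z": "a"}              # the unifier found above
t1, t2, t3 = ("P", "x", "z", "y"), ("P", "x", "z", "x"), ("P", "a", "x", "x")
print(apply_subst(t1, theta))                       # ('P', 'a', 'a', 'a')
print(apply_subst(t1, theta) == apply_subst(t2, theta) == apply_subst(t3, theta))  # True
```

Applying θ to all three terms indeed yields the same term P(a,a,a).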

Logic programming and Prolog (Chapter 2)

Exercise 1.1 (ii)

The refutation resolves the goal ← P(a) against the program clauses P(x) ← Q(y) ∧ L(x,y), Q(b), L(a,y) ← D(y) and D(b):

← P(a)              with P(x) ← Q(y) ∧ L(x,y) and θ1:   ← Q(y) ∧ L(a,y)
← Q(y) ∧ L(a,y)     with Q(b) and θ2:                   ← L(a,b)
← L(a,b)            with L(a,y) ← D(y) and θ3:          ← D(b)
← D(b)              with D(b) and θ4:                   □

The corresponding substitutions are:

θ1 = {a/x}
θ2 = {b/y}
θ3 = {b/y}
θ4 = {}

Description Logics & Frames (Chapter 3)

Exercise 1

a. Employee ⊑ Human
b. Mother ≡ Woman ⊓ ∃hasChild.⊤
c. Parent ≡ Mother ⊔ Father
d. Grandmother ≡ Mother ⊓ ∃hasChild.Parent
e. ∃hasChild.Human ⊑ Human

Exercise 2

a. This frame taxonomy summarises some AI and computing science history, which is briefly explained below.

MYCIN and PROSPECTOR were two of the early knowledge-based systems, developed at the end of the 1970s, which included methods for reasoning with uncertainty. At the time two reasoning methods were distinguished: top-down or goal-driven inference, similar to backtracking in Prolog, where the reasoning process is controlled by goals, and bottom-up or data-driven inference, where the reasoning process is controlled by data or facts.

DEC, an acronym which stands for Digital Equipment Corporation, was a US computer company that was very innovative, but lost impact because of the popularity of the non-network-based, simple personal computer with Microsoft's DOS as operating system. Its strategy was focussed on the development of workstations and network-based minicomputers, very much in line with modern ideas around the internet, cloud computing, and network-based workstations. DEC was the first company to develop a knowledge system, called XCON, that was able to generate the specifications of minicomputers from the requirements of customers. This made

sense at the time because there were many different options available for most of the computer systems sold by DEC. Although visionary and very successful until about 1987, DEC's strategy no longer worked at the end of the 1980s, when Microsoft, like most consumers, did not believe in network-based computing and favoured the stand-alone PC. DEC was bought by Compaq, which later merged with HP. In the take-over process, almost all of the innovative capacity of the company was lost. HP is surely not the major player in AI and computing science innovation that DEC once was.

The following frames offer an adequate solution:

class knowledge-based-system is
    superclass nil
    contents = domain-knowledge
    inference-type : {top-down, bottom-up}
end

class diagnostic-system is
    superclass knowledge-based-system
    inference-type = top-down
end

class configuration-system is
    superclass knowledge-based-system
    inference-type = bottom-up
end

instance MYCIN is
    instance-of diagnostic-system
    developer = Shortliffe
end

instance PROSPECTOR is
    instance-of diagnostic-system
    domain = geology
end

instance XCON is
    instance-of configuration-system
    company = DEC
end

b. The taxonomy is represented in Figure 1. The set of inheritance chains is as follows:

Ω_T = {y1, y2,

(Figure 1: Taxonomy. Class y1 at the bottom, subclass links y1 → y2, y1 → y3, y2 → y3, y2 → y4, y3 → y4, and attribute values y2[a = c1] and y4[a = c2].)

y3, y4, y2[a = c1], y4[a = c2],
y1 → y2, y1 → y3, y2 → y3, y2 → y4, y3 → y4,
y1 → y2 → y3, y1 → y2 → y4, y1 → y3 → y4, y2 → y3 → y4,
y1 → y2 → y3 → y4,
y1 → y2[a = c1], y2 → y4[a = c2], y3 → y4[a = c2],
y1 → y2 → y4[a = c2], y1 → y3 → y4[a = c2], y2 → y3 → y4[a = c2],
y1 → y2 → y3 → y4[a = c2]}

Note that we start with the classes, such as y1, and the attribute-value specifications, such as y4[a = c2], add the subclass relationships, such as y1 → y3 and y3 → y4, and then concatenate the chains, yielding, for example, y1 → y3 → y4 and y1 → y3 → y4[a = c2]. The conclusion set is obtained by propagating the attributes with their values, e.g., a = c2, to the beginning of a chain. For example, y1 → y2 → y4[a = c2] yields y1[a = c2] as a conclusion. The entire conclusion set is as follows:

C(Ω_T) =

{y1[a = c1], y1[a = c2], y2[a = c1], y2[a = c2], y3[a = c2], y4[a = c2]}

C(Ω_T) is not consistent, as we have two different values of attribute a for class y1, and also for class y2. The following inheritance chains are precluded, or blocked (see the definition of preclusion on page 53, Definition 3.3, of the Lecture Notes):

{y2 → y4[a = c2],              (by y2[a = c1])
 y1 → y2 → y4[a = c2],         (by y1 → y2[a = c1])
 y1 → y3 → y4[a = c2],         (by y1 → y2[a = c1])
 y2 → y3 → y4[a = c2],         (by y2[a = c1])
 y1 → y2 → y3 → y4[a = c2]}    (by y1 → y2[a = c1])

The precluded inheritance chains are ignored when computing the inheritable conclusion set. Therefore, the inheritable conclusion set becomes

H(Ω_T) = {y2[a = c1], y4[a = c2], y1[a = c1], y3[a = c2]}

T is consistent, as there are now no cases with two different values for attribute a in any of the classes yi, i = 1,…,4.

Exercise 3

a. car ⊑ ∀wheels.{4}
   car ⊑ ∀seats.{4}
   sportscar ⊑ car
   sportscar ⊑ ∀seats.{2}
   Rolls-Royce : car
   (Rolls-Royce, sufficient) : max-speed

If a sportscar is given, then the set is inconsistent. If the set is empty, then it is consistent.

b. An incorrect implementation would result in the situation represented in Figure 2. The algorithm is non-deterministic, because the order is not specified. If the order is y3, y2, then the attribute a gets the value c1 (which happens to be correct). If the order is y2, y3, then the attribute a gets the value c2 (which is incorrect).
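The propagation of attribute values along chains, and the filtering of precluded chains, can be sketched in a few lines of Python. The encoding of a chain as a (path, value) pair is hypothetical, and the precluded chains are taken from the solution above rather than computed from Definition 3.3:

```python
# Hypothetical encoding (not the lecture-notes data structures): a chain is
# (path, value), where path is the tuple of classes in the chain and value is
# the attribute value attached to its final class; chains without a value
# contribute no conclusions and are omitted.
chains = [
    (("y2",), "c1"), (("y4",), "c2"),
    (("y1", "y2"), "c1"),
    (("y2", "y4"), "c2"), (("y3", "y4"), "c2"),
    (("y1", "y2", "y4"), "c2"), (("y1", "y3", "y4"), "c2"),
    (("y2", "y3", "y4"), "c2"),
    (("y1", "y2", "y3", "y4"), "c2"),
]

def conclusions(chains):
    """Propagate each chain's attribute value to the first class of the chain."""
    return sorted({(path[0], value) for path, value in chains})

print(conclusions(chains))
# [('y1', 'c1'), ('y1', 'c2'), ('y2', 'c1'), ('y2', 'c2'), ('y3', 'c2'), ('y4', 'c2')]

# The precluded chains are copied from the solution above; dropping them
# yields the inheritable conclusion set H(Omega_T).
precluded = {
    ("y2", "y4"), ("y1", "y2", "y4"), ("y1", "y3", "y4"),
    ("y2", "y3", "y4"), ("y1", "y2", "y3", "y4"),
}
inheritable = [(p, v) for p, v in chains if p not in precluded]
print(conclusions(inheritable))
# [('y1', 'c1'), ('y2', 'c1'), ('y3', 'c2'), ('y4', 'c2')]
```

The first print reproduces C(Ω_T) (inconsistent: two values for y1 and y2), and the second reproduces H(Ω_T).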

(Figure 2: Multiple inheritance with exceptions. Class y1 with superclasses y2[a = c1] and y3[a = c2].)

(Figure 3: Taxonomy. An element x below y1, subclass links y1 → y2, y2 → y3, y3 → y4, y1 → y3, and attribute values y2[a = c1] and y4[a = c2].)

c. The taxonomy is represented in Figure 3. The inheritance chains:

Ω_T = {y1, y2, y3, y4, y2[a = c1], y4[a = c2],
 y1 → y2, y2 → y3, y3 → y4, y1 → y3,
 y1 → y2 → y3, y2 → y3 → y4, y1 → y3 → y4, y1 → y2 → y3 → y4,
 y1 → y2[a = c1], y3 → y4[a = c2],
 y2 → y3 → y4[a = c2], y1 → y3 → y4[a = c2], y1 → y2 → y3 → y4[a = c2]}

The conclusion set:

C(Ω_T) = {y1[a = c1], y1[a = c2], y2[a = c1], y2[a = c2], y3[a = c2], y4[a = c2]}

C(Ω_T) is not consistent. The following inheritance chains are precluded:

{y2 → y3 → y4[a = c2],         (by y2[a = c1])
 y1 → y3 → y4[a = c2],         (by y1 → y2[a = c1])
 y1 → y2 → y3 → y4[a = c2]}    (by y1 → y2[a = c1])

Therefore, the inheritable conclusion set

H(Ω_T) = {y2[a = c1], y4[a = c2], y1[a = c1], y3[a = c2]}

T is consistent.

(Figure 4: Taxonomy. Class p1 with superclasses p2[a = c1] and p3[a = c2].)

Exercise 4

a. We get the following set (conjunction) of formulas in first-order predicate logic:

Φ = { ∀x(F1(x) → F2(x)),
      ∀x(F1(x) → a(x,c1)),
      ∀x(F1(x) → a(x,c2)),
      ∀x(F1(x) → a(x,c3)) }

For the inheritance algorithms, consult the slides about Description Logics and Frames. The resulting algorithm now looks as follows. Replace, in the Inherit function,

attr-value-pairs ← attr-value-pairs ∪ NewAttributes(pairs, attr-value-pairs)

by

attr-value-pairs ← MergeAttributes(pairs, attr-value-pairs)

in which MergeAttributes adds newly found values to an attribute that already has values. This in contrast to NewAttributes, which discards newly found values if the attribute already has a value.

b. Note: all attributes are single-valued, i.e., we use a function-with-equality representation, a(x) = c, rather than a predicate (relation) representation, a(x,c), as in the multi-valued case. The taxonomy is represented in Figure 4. Translation to first-order predicate logic:

φ = { ∀x(p1(x) → p2(x)),
      ∀x(p1(x) → p3(x)),
      ∀x(p2(x) → a(x) = c1),
      ∀x(p3(x) → a(x) = c2),
      c1 ≠ c2 }

No inconsistency can be derived. This would have been possible if there were instances, e.g., p1(i), because then a(i) = c1 ∧ a(i) = c2 would be derived, which cannot be satisfied. The conclusion set:

C(Ω_T) = {p1[a = c1], p1[a = c2], p2[a = c1], p3[a = c2]}
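The difference between NewAttributes and MergeAttributes can be illustrated with a small, hypothetical Python sketch (the actual Inherit function is in the course slides and is not reproduced here):

```python
# Hypothetical sketch of the two update policies; attribute values are sets.
def new_attributes(pairs, acc):
    """Single-valued policy: keep the old value; only unset attributes get one."""
    out = {a: set(v) for a, v in acc.items()}
    for attr, value in pairs:
        if attr not in out:
            out[attr] = {value}
    return out

def merge_attributes(pairs, acc):
    """Multi-valued policy: add newly found values to the existing value set."""
    out = {a: set(v) for a, v in acc.items()}
    for attr, value in pairs:
        out.setdefault(attr, set()).add(value)
    return out

acc = new_attributes([("a", "c1")], {})       # most specific value, found first
later = [("a", "c2"), ("a", "c3")]            # values found higher in the taxonomy
print(new_attributes(later, acc))             # {'a': {'c1'}} (old value kept)
print(sorted(merge_attributes(later, acc)["a"]))  # ['c1', 'c2', 'c3'] (merged)
```

With NewAttributes the most specific value shadows the rest; with MergeAttributes all values accumulate, as required for multi-valued attributes.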

(Figure 5: Inheritance relation. A frame hierarchy root, superframe, frame, with value and default facets and an if-needed demon.)

This conclusion set is inconsistent. Meanings will correspond only if instances are added to the logical formulas.

c. Correct values:

edge   = 2    value    cube1    not: default  cube
base   = 4    demon    cube     not: default  prism
height = 2    demon    cube     not: value    prism
volume = 8    demon    prism    not: default  cube1

The inheritance relation is represented in Figure 5.

Exercise 6

a. One possibility is to translate the formula and to prove the validity of the translation. The equivalent formula in predicate logic is

∀x((∀y(r(x,y) → A(y) ∧ B(y))) → ((∀y(r(x,y) → A(y))) ∧ (∀y(r(x,y) → B(y)))))

Proof: Take some x. Suppose that (1) ∀y(r(x,y) → A(y) ∧ B(y)). Take y such that r(x,y). Then A(y) follows directly from (1). Hence ∀y(r(x,y) → A(y)). The same reasoning applies to ∀y(r(x,y) → B(y)).

b. DIY!

c. In predicate logic:

∀x((∀y(r(x,y) → A(y) ∨ B(y))) → ((∀y(r(x,y) → A(y))) ∨ (∀y(r(x,y) → B(y)))))
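Both translations can also be checked mechanically by enumerating all structures over a small domain. A failure disproves validity, while success over a two-element domain does not prove validity in general; here it simply agrees with the proof in part (a). A brute-force sketch:

```python
from itertools import product

D = [0, 1]
pairs = [(x, y) for x in D for y in D]

def forall_r(r, x, pred):
    """Bounded quantifier: for all y, r(x,y) implies pred(y)."""
    return all(pred(y) for y in D if (x, y) in r)

def valid_on_small_models(formula):
    """Evaluate the closed formula in every structure over the 2-element domain."""
    for bits in product([0, 1], repeat=4):
        r = {p for p, b in zip(pairs, bits) if b}
        for a_bits in product([0, 1], repeat=2):
            for b_bits in product([0, 1], repeat=2):
                A = {d for d, b in zip(D, a_bits) if b}
                B = {d for d, b in zip(D, b_bits) if b}
                if not formula(r, A, B):
                    return False
    return True

# Part (a): the conjunctive formula (valid)
f_a = lambda r, A, B: all(
    (not forall_r(r, x, lambda y: y in A and y in B))
    or (forall_r(r, x, lambda y: y in A) and forall_r(r, x, lambda y: y in B))
    for x in D)

# Part (c): the disjunctive formula (falsified by some structure)
f_c = lambda r, A, B: all(
    (not forall_r(r, x, lambda y: y in A or y in B))
    or forall_r(r, x, lambda y: y in A) or forall_r(r, x, lambda y: y in B)
    for x in D)

print(valid_on_small_models(f_a), valid_on_small_models(f_c))  # True False
```

The check finds no countermodel for (a) and does find one for (c), matching the concrete three-element countermodel given in part (c).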

Consider the structure with domain D = {d1, d2, d3} and interpretation I such that:

I(r) = {(d1,d2), (d1,d3)}
I(A) = {d2}
I(B) = {d3}

Choose x = d1. Then ∀y(r(x,y) → A(y) ∨ B(y)) holds, but (e.g.) ∀y(r(x,y) → A(y)) does not. Hence, the formula is not valid.

Model-based Reasoning (Chapter 4)

Exercise 4a

A schematic view of the circuit is as follows:

(Circuit diagram: OR gates O1 and O2 and AND gates A1 and A2, with inputs 1, 0, 0, 1 and observed output 0.)

Conflict sets are sets of components for which the assumption that they all work normally is inconsistent with the observations. In this case, there are 5 conflict sets:

CS1 = {O1, A1}
CS2 = {O1, O2, A2}
CS3 = {O1, A1, A2}
CS4 = {O1, A1, O2}
CS5 = {O1, O2, A1, A2}

The first two conflict sets are (subset) minimal. If we add a component to a conflict set, the result is again a conflict set. Different hitting-set trees are possible, depending on the order in which conflict sets are chosen. For example, the start of a hitting-set tree can look as follows:

            {O1, O2, A2}
           /      |      \
         O1       O2      A2
          √    {O1, A1}   etc.
               /      \
             O1        A1
              √         √

Minimal diagnoses correspond to the minimal hitting sets, for example {O1} and {O2, A1}.
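Instead of drawing the hitting-set tree by hand, the minimal hitting sets over the two minimal conflict sets can be enumerated directly. A brute-force sketch (the enumeration also surfaces the third minimal diagnosis, {A1, A2}):

```python
from itertools import combinations

# The two (subset-)minimal conflict sets found above
conflict_sets = [{"O1", "A1"}, {"O1", "O2", "A2"}]
components = set().union(*conflict_sets)

def hits_all(candidate):
    """A hitting set intersects every conflict set."""
    return all(candidate & cs for cs in conflict_sets)

# Enumerate candidate sets by increasing size; keep those that hit every
# conflict set and contain no previously found (smaller) hitting set.
minimal = []
for size in range(1, len(components) + 1):
    for cand in map(set, combinations(sorted(components), size)):
        if hits_all(cand) and not any(m <= cand for m in minimal):
            minimal.append(cand)

print(sorted(map(sorted, minimal)))  # [['A1', 'A2'], ['A1', 'O2'], ['O1']]
```

Only the minimal conflict sets need to be hit, since every other conflict set is a superset of one of them.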

(Figure 6: Causal graph. Defects d1, d2, d3 and hypotheses α1, α2, causing the findings f1, f2, f3.)

Exercise 7a

Draw a causal graph, shown in Figure 6, because that helps in solving the problem. A hypothesis D is called an abductive diagnosis of P = (Σ, F), with F the observed findings and Σ = (Δ, Φ, R) a causal model (see the slides and Lecture Notes for the definitions), if:

(1) R ∪ D ⊨ F
(2) R ∪ D ∪ C ⊭ ⊥, where C = {¬f | f ∈ Φ, f ∉ F, f is a positive literal}

Note that if we omit part (2), then this definition corresponds to a prediction of F. Thus, a diagnosis D is a prediction of what has been observed, i.e., F, that is consistent with the constraints C. In this case, we have F = {f1, f3}, so C = {¬f2}. This gives three possible diagnoses:

D1 = {d1, d2, d3}
D2 = {d1, d2, d3, α1}
D3 = {d1, α1, d3}

Note that, if d2 and α2 are both included, then we have an inconsistency with ¬f2.

Uncertainty Reasoning (Chapter 5)

Exercise 1c

a. CF(a,e′) = 0.8; CF(b,e′) = 0.4; CF(c,e′) = 0.7.

   CF(a or b or c, e′) = max{max{CF(a,e′), CF(b,e′)}, CF(c,e′)} = max{0.8, 0.7} = 0.8
   CF(f,e′1) = 1 · max{0, CF(e,e′)} = 1 · 0.8 = 0.8

b. CF(c and d, e′) = min{CF(c,e′), CF(d,e′)} = 0.6; CF(f,e′2) = 0.5 · 0.6 = 0.3.

c. CF(f,e′3) = 1 · 0.6 = 0.6.

d. Apply the combination function for co-concluding rules to (1), (2), and (3):

   CF(f, e′1 co e′2) = 0.8 + 0.3 · (1 − 0.8) = 0.86
   CF(f, (e′1 co e′2) co e′3) = 0.86 + 0.6 · (1 − 0.86) ≈ 0.94
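The combination of the three co-concluding rules in part (d) can be reproduced with a short sketch of the combination function for positive CFs:

```python
def cf_co(x, y):
    """Combination function for co-concluding rules (both CFs positive)."""
    return x + y * (1 - x)

cf1, cf2, cf3 = 0.8, 0.3, 0.6        # CFs for f from parts (a), (b) and (c)
total = cf_co(cf_co(cf1, cf2), cf3)
print(round(total, 2))               # 0.94
```

Note that cf_co is commutative and associative for positive arguments, so the order in which the rules are combined does not matter.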

Exercise 1d

To see what it means that such a rule is idempotent, take a value for y, for example c; then fco(x,c) is an operator on the argument x (we could call it o(x)). Idempotence then means that fco(x,c) = fco(fco(x,c),c). This is not the case for the rule mentioned. For example, take x = 0.5 and c = 0.4. Then fco(x,c) = 0.5 + 0.4 · (1 − 0.5) = 0.7, whereas fco(fco(x,c),c) = 0.7 + 0.4 · (1 − 0.7) = 0.82.

Advantage of idempotence: two or more identical production rules change the CF only once. Disadvantage of idempotence: of different rules that result in an equal CF, only one of them contributes to the final CF.

Exercise 2b

Use the definition of conditional probability and the rules for marginalisation and factorisation of a Bayesian network:

P(v3 | v1) = P(v1,v3) / P(v1)
           = (Σ_{V2} P(v1,V2,v3)) / P(v1)
           = (Σ_{V2} P(v3 | V2) P(V2 | v1) P(v1)) / P(v1)
           = Σ_{V2} P(v3 | V2) P(V2 | v1)
           = P(v3 | v2) P(v2 | v1) + P(v3 | ¬v2) P(¬v2 | v1)
           = 0.7 · 0.3 + 0.1 · 0.7 = 0.28

Exercise 3

a. See slide 26 of the slides about uncertainty reasoning. In the certainty-factor model, CFs are propagated using the following rule:

   CF(h,e′) = CF(h,e) · max{0, CF(e,e′)}

which we could interpret as the probabilistic statement:

   P(h | e′) = P(h | e) · max{0, P(e | e′)} = P(h | e) P(e | e′)

According to this interpretation of CF rules, we can model the distribution using a Bayesian network E′ → E → H. Then it holds that:

   P(h | e′) = P(h | e) P(e | e′) + P(h | ¬e) P(¬e | e′)

This is close to the CF model, but we still need to make sure that P(h | ¬e) P(¬e | e′) = 0. If P(¬e | e′) = 0, then P(e | e′) = 1, which is (usually) inconsistent with the CF model, so this is a bad solution. So apparently we also need to require that P(h | ¬e) = 0.

b. We can bring the CF model closer to the probabilistic model by including (besides CF(h,e)) a statement CF(h,¬e) in a rule. Different definitions are possible, for example:

   CF(h,e′) = CF(h,e) · max{0, CF(e,e′)} + CF(h,¬e) · max{0, −CF(e,e′)}
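The marginalisation over V2 in Exercise 2b can be reproduced numerically, using the CPT entries that appear in the derivation:

```python
# Chain network V1 -> V2 -> V3, with the CPT entries used in the derivation:
# P(v2 | v1) = 0.3 (hence P(~v2 | v1) = 0.7), P(v3 | v2) = 0.7, P(v3 | ~v2) = 0.1
p_v2_given_v1 = {True: 0.3, False: 0.7}
p_v3_given_v2 = {True: 0.7, False: 0.1}

# Sum out V2: P(v3 | v1) = sum over V2 of P(v3 | V2) * P(V2 | v1)
p_v3_given_v1 = sum(p_v3_given_v2[v2] * p_v2_given_v1[v2] for v2 in (True, False))
print(round(p_v3_given_v1, 2))   # 0.28
```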

c. In the noisy-AND model it holds that:

   P(e | C1,C2) = Σ_{I1 ∧ I2 = e} P(e | I1,I2) Π_{k=1,2} P(Ik | Ck) = P(i1 | C1) P(i2 | C2)

A corresponding CF definition could look as follows:

   CF(h, e′1 co e′2) = CF(h,e′1) · CF(h,e′2)

Exercise 4

a. The Markov networks generated from the logical formula consist of two separate networks: one over R(1,1), R(2,1), Q(1,1), Q(1,2), and one over R(1,2), R(2,2), Q(2,1), Q(2,2).

b. Consider the joint distribution P(R(1,1), R(2,1), Q(1,1), Q(1,2)). There are no dependencies with the remaining random variables, so we can compute probabilities for this distribution without considering the rest of the variables. We consider all states of this distribution and count the number of true instantiations of F. For example, in the state {R(1,1) = false, R(2,1) = false, Q(1,1) = false, Q(1,2) = true}, the true instantiations are:

   R(1,1) ← Q(1,1)
   R(2,1) ← Q(1,1)

For example, R(1,1) ← Q(1,2) is not a true instantiation. The numbers of true instantiations n(x) are as follows:

R(1,1)  R(2,1)  Q(1,1)  Q(1,2)   n(x)   P(x) · Z
  1       1       1       1       4      e^{4w}
  1       1       1       0       4      e^{4w}
  1       1       0       1       4      e^{4w}
  1       1       0       0       4      e^{4w}
  1       0       1       1       2      e^{2w}
  1       0       1       0       3      e^{3w}
  1       0       0       1       3      e^{3w}
  1       0       0       0       4      e^{4w}
  0       1       1       1       2      e^{2w}
  0       1       1       0       3      e^{3w}
  0       1       0       1       3      e^{3w}
  0       1       0       0       4      e^{4w}
  0       0       1       1       0      e^{0}
  0       0       1       0       2      e^{2w}
  0       0       0       1       2      e^{2w}
  0       0       0       0       4      e^{4w}
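The table of true instantiations can be regenerated by brute force. The exact form of F is stated in the exercise, not reproduced here, so the four groundings below (R(i,1) ← Q(1,j) for i, j ∈ {1,2}) are an assumption; it is consistent with the example instantiations above and with every row of the table:

```python
from itertools import product

# Assumed groundings of F over this component: R(i,1) <- Q(1,j), i, j in {1,2}
def n_true(r11, r21, q11, q12):
    """Count the true ground instances of F in the given state."""
    groundings = [(q11, r11), (q11, r21), (q12, r11), (q12, r21)]
    return sum(1 for q, r in groundings if (not q) or r)   # q -> r

for state in product([1, 0], repeat=4):    # order: R(1,1), R(2,1), Q(1,1), Q(1,2)
    print(state, n_true(*state))           # first line: (1, 1, 1, 1) 4
```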

(The same can be done for the other part of the model.)

c. The normalisation constant can be computed by summing the last column (because Z = Σ_x P(x) · Z):

   Z = 7e^{4w} + 4e^{3w} + 4e^{2w} + 1 ≈ 81.5

So it holds that:

   P(R(2,1), Q(1,2)) = (1/Z) · (e^{4w} + e^{4w} + e^{2w} + e^{3w}) ≈ 0.27

d. Similar to (c):

   P(R(2,1)) = (1/Z) · (5e^{4w} + 2e^{3w} + e^{2w}) ≈ 0.6
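The normalisation constant and the two marginals can be checked numerically. The weight w is not stated in this solution; w = 0.5 is an assumption that reproduces Z ≈ 81.5:

```python
from math import exp

w = 0.5   # assumed weight; this value reproduces Z ~ 81.5 from part (c)
Z = 7 * exp(4 * w) + 4 * exp(3 * w) + 4 * exp(2 * w) + 1
p_joint = (2 * exp(4 * w) + exp(3 * w) + exp(2 * w)) / Z     # P(R(2,1), Q(1,2))
p_marg = (5 * exp(4 * w) + 2 * exp(3 * w) + exp(2 * w)) / Z  # P(R(2,1))
print(round(Z, 1), round(p_joint, 2), round(p_marg, 2))      # 81.5 0.27 0.6
```

Each numerator sums the unnormalised weights P(x) · Z of exactly those rows of the table in which the queried variables are true.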