Measuring Consistency Gain and Information Loss in Stepwise Inconsistency Resolution


John Grant
Department of Mathematics, Towson University, Towson, MD 21252, USA
and Department of Computer Science, University of Maryland, College Park, MD 20742, USA

Anthony Hunter
Department of Computer Science, University College London, Gower Street, London WC1E 6BT, UK

August 2, 2010

Abstract

Inconsistency is a feature of many kinds of data and knowledge. Furthermore, it is often difficult to know what information is the cause of the inconsistency, and therefore it is difficult to know what information to delete or change. Hence, changes to the information with the aim of gaining in the level of consistency can result in what can be described as a loss of information. In this paper, we are concerned with analyzing this trade-off. We review some existing proposals, and make some new proposals, for measures of inconsistency and measures of information. We compare these proposals with respect to each other and also with respect to a framework of constraints on measures. We then present a novel proposal for measuring the trade-off of consistency gain and information loss in the context of stepwise resolution.

1 Introduction

Inconsistency, and deciding how to deal with it, is a well-recognized problem in many areas of computer science including data and knowledge engineering, software engineering, robotics, and natural language. Often it is not possible to determine with high confidence which items of data or knowledge are incorrect. It might be that finding this out would cost more than the information is actually worth, or it might be that it is simply not possible to acquire this information. In these situations, it may nonetheless be useful to delete or update items of information that are involved in inconsistencies, based on the nature of those inconsistencies.
But since it is often unclear which items of information are indeed incorrect, the process of inconsistency resolution can result in a gain in the degree of consistency, but at the price of a loss of information. The aim of this paper is to investigate the trade-off between gain in consistency and loss of information. We do this by considering measures of inconsistency and measures of information. We will start by investigating

what the essential properties of inconsistency and information measures are. We propose three requirements in both cases and consider various definitions, mostly ones previously proposed. Each proposal has some rationale, so it is worthwhile to investigate their compatibility with one another. We will show that in a well-defined sense each measure is incompatible with every other measure, both for inconsistency measures and for information measures. These results suggest that there does not exist a single inconsistency measure or information measure that coincides with intuition in general. Nonetheless, the framework for inconsistency and information measures is potentially useful for choosing measures according to specific applications.

To further investigate the trade-offs that arise between inconsistency and information, we will consider the applicability of measures of inconsistency and information in stepwise inconsistency resolution, that is, inconsistency resolution proceeding one formula at a time. We will show how measures of inconsistency and measures of information can help decide to which formula to apply an inconsistency resolution operation at each step.

To illustrate some of the key issues in stepwise inconsistency resolution, we consider the following example. Let K = {a, ¬a ∧ ¬b ∧ c, b, d}. K has two minimal inconsistent subsets, M_1 = {a, ¬a ∧ ¬b ∧ c} and M_2 = {¬a ∧ ¬b ∧ c, b}, and two maximal consistent subsets, N_1 = {a, b, d} and N_2 = {¬a ∧ ¬b ∧ c, d}. As we want to show how to reduce the inconsistency of K in a stepwise fashion, one formula at a time, we will apply three inconsistency resolution functions: delete a formula, weaken a formula, and split a formula. Let us consider these possibilities for K.

Deletion. We delete a formula that is in a minimal inconsistent subset. (There is no point in deleting d.) Thus we can delete either ¬a ∧ ¬b ∧ c or a or b. In the first case, since ¬a ∧ ¬b ∧ c is in both minimal inconsistent subsets, the result is consistent.
This is the most drastic of the three options because it loses the most information.

Weakening. We change a formula to another formula logically implied by it. Typically, we add a disjunct or change a conjunction to a disjunction. For instance, we can weaken ¬a ∧ ¬b ∧ c to (¬a ∨ ¬b) ∧ c or to ¬a ∨ ¬b ∨ c. We can weaken a to a ∨ b or even to a ∨ ¬a, and so on. While this operation may reduce the number of minimal inconsistent subsets, the size of the minimal inconsistent subsets may rise, as seen here, where the first weakening results in one minimal inconsistent subset {a, (¬a ∨ ¬b) ∧ c, b}.

Splitting. We split a formula into its conjuncts. This may isolate the really problematic conjuncts. For instance, we can split ¬a ∧ ¬b ∧ c into ¬a, ¬b, and c. In this case, we get a new knowledgebase {a, ¬a, b, ¬b, c, d} that is still inconsistent, though by some inconsistency measures it is less inconsistent. Also, this allows us at a later step to delete just the portion of the conjunction that causes the inconsistency.

In an inconsistent knowledgebase, any one of the formulae in the knowledgebase can be selected for one of the resolution operations (deletion, weakening, or splitting). So there is the question of how to choose a formula and which operation to apply. In general, inconsistency and information measures offer possible answers to this question. Our guiding principle is to minimize information loss while reducing inconsistency.

Stepwise resolution may be useful in many diverse situations. For example, when designing a complex IT system, requirements are collated from the stakeholders. Often this results in substantial inconsistencies. Furthermore, these inconsistencies cannot be arbitrarily resolved. Rather, there needs to be a series of discussions and negotiations leading to stepwise refinement of the requirements. As another example of stepwise resolution, consider how inconsistencies arise in knowledge about a biological system during biomedical research.
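The minimal inconsistent and maximal consistent subsets quoted for this example can be checked mechanically. The following Python sketch is our own illustration, not part of the paper: it encodes each formula in CNF as a list of clauses (the name phi stands for ¬a ∧ ¬b ∧ c) and enumerates subsets by brute force.

```python
from itertools import chain, combinations, product

# CNF encoding: a formula is a list of clauses; a clause is a list of
# literals written as 'x' or '~x'.  phi stands for ¬a ∧ ¬b ∧ c.
phi = [['~a'], ['~b'], ['c']]
K = {'a': [['a']], 'phi': phi, 'b': [['b']], 'd': [['d']]}

def atoms(kb):
    return sorted({lit.lstrip('~') for f in kb for cl in f for lit in cl})

def consistent(kb):
    ats = atoms(kb)
    for vals in product([True, False], repeat=len(ats)):
        m = dict(zip(ats, vals))
        if all(all(any(m[l.lstrip('~')] != l.startswith('~') for l in cl)
                   for cl in f) for f in kb):
            return True
    return False

def subsets(names):
    return chain.from_iterable(combinations(names, r) for r in range(len(names) + 1))

# Minimal inconsistent subsets: inconsistent, every proper subset consistent.
MI = [set(s) for s in subsets(K)
      if not consistent([K[n] for n in s])
      and all(consistent([K[n] for n in t]) for t in combinations(s, len(s) - 1))]

# Maximal consistent subsets: consistent, no strict superset is consistent.
MC = [set(s) for s in subsets(K)
      if consistent([K[n] for n in s])
      and all(not consistent([K[n] for n in s] + [K[x]]) for x in K if x not in s)]

print(sorted(map(sorted, MI)))  # [['a', 'phi'], ['b', 'phi']]
print(sorted(map(sorted, MC)))  # [['a', 'b', 'd'], ['d', 'phi']]
```

Minimality is checked only against subsets with one fewer element, which suffices because inconsistency is monotonic over supersets.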
Identification of an inconsistency may be a spur to undertake further research to investigate the knowledge that appears in the inconsistency. However, it might not be possible to investigate all the knowledge because of a lack of experimental resources, or perhaps because of a lack of techniques,

and so only some of the knowledge can be subject to revision. This therefore leads to a stepwise resolution of inconsistency: as new knowledge becomes available, resolution steps can be undertaken.

The main contributions of our paper can be summarized as follows: (1) We consider a range of measures for inconsistency, mainly from the literature, and show that they are pairwise incompatible in the sense that they measure inconsistency in different ways (and so one cannot be substituted for another); (2) We consider a range of measures for information, some from the literature and some new, and show that they are pairwise incompatible in the sense that they measure information in different ways (and so one cannot be substituted for another); (3) We provide a range of constraints on inconsistency and information measures, some of which are new, and give the first comprehensive evaluation of the range of inconsistency and information measures with respect to this range of constraints; and (4) We provide a novel means by which the trade-off of consistency gain and information loss can be evaluated in stepwise inconsistency resolution using the range of inconsistency and information measures.

2 Preliminary definitions

We assume a propositional language L of formulae composed from a set of atoms A and the logical connectives ¬, ∧, ∨. We use φ and ψ for arbitrary formulae and α and β for atoms. All formulae are assumed to be in conjunctive normal form. Hence every formula φ has the form ψ_1 ∧ ... ∧ ψ_n, where each ψ_i, 1 ≤ i ≤ n, has the form β_i1 ∨ ... ∨ β_im, where each β_ik, 1 ≤ k ≤ m, is a literal (an atom or negated atom). A knowledgebase K is a finite set of formulae. We let ⊢ denote the classical consequence relation, and write K ⊢ ⊥ to denote that K is inconsistent. Logical equivalence is defined in the usual way: K ≡ K′ iff K ⊢ K′ and K′ ⊢ K. We also find it useful to define a stronger notion of equivalence that we call b(ijection)-equivalence, as follows.
Knowledgebase K is b(ijection)-equivalent to knowledgebase K′, denoted K ≡_f K′, iff there is a bijection f : K → K′ such that for all φ ∈ K, φ is logically equivalent to f(φ). For example, {a, b} is logically equivalent but not b(ijection)-equivalent to {a ∧ b}. We write R≥0 for the set of nonnegative real numbers and 𝒦 for the set of all knowledgebases (in some presumed language L).

For a knowledgebase K, MI(K) is the set of minimal inconsistent subsets of K, and MC(K) is the set of maximal consistent subsets of K. Also, if MI(K) = {M_1, ..., M_n} then Problematic(K) = M_1 ∪ ... ∪ M_n, and Free(K) = K \ Problematic(K). So Free(K) contains the formulae in K that are not involved in any inconsistency and Problematic(K) contains the formulae in K that are involved in at least one inconsistency. We also require the set of formulae in K that are individually inconsistent: Selfcontradictions(K) = {φ ∈ K : {φ} ⊢ ⊥}. In the next section we will use these functions in definitions for syntactic measures of inconsistency.

The corresponding semantics uses Priest's three-valued logic (3VL) [P79], with the classical two-valued semantics augmented by a third truth value, B, denoting inconsistency. The truth values for the connectives are defined in Figure 1. An interpretation i is a function that assigns to each atom that appears in K one of three truth values: i : Atoms(K) → {F, B, T}. For an interpretation i it is convenient to separate the atoms into two groups, namely the ones that are assigned a classical truth value and the ones that are assigned B:

Binarybase(i) = {α : i(α) = T or i(α) = F}
Conflictbase(i) = {α : i(α) = B}

For a knowledgebase K we define the models as the set of interpretations where no formula in K is assigned the truth value F:

Models(K) = {i : for all φ ∈ K, i(φ) = T or i(φ) = B}

Then, as a measure of inconsistency for K, we define

Contension(K) = Min{|Conflictbase(i)| : i ∈ Models(K)}

So the contension gives the minimum number of atoms that are assigned B by the 3VL models of K.
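The contension computation can be sketched directly from these definitions. The encoding below is ours (not the paper's): it evaluates CNF formulae under Priest's 3VL with the ordering F < B < T, where ∧ is minimum, ∨ is maximum, and ¬v = 2 − v, and minimizes |Conflictbase| over the 3VL models.

```python
from itertools import product

F, B, T = 0, 1, 2   # Priest's 3VL with F < B < T; ∧ = min, ∨ = max, ¬v = 2 − v

# K = {a, ¬a, a ∨ b, ¬b}; a formula is a list of clauses of literals
K = [[['a']], [['~a']], [['a', 'b']], [['~b']]]
ats = sorted({l.lstrip('~') for f in K for cl in f for l in cl})

def val(f, i):
    # a CNF formula is the min over its clauses; a clause the max over literals
    return min(max(2 - i[l[1:]] if l[0] == '~' else i[l] for l in cl) for cl in f)

def contension(K):
    conflicts = [sum(1 for a in ats if i[a] == B)
                 for vs in product([F, B, T], repeat=len(ats))
                 for i in [dict(zip(ats, vs))]
                 if all(val(f, i) >= B for f in K)]   # i is a model: nothing is F
    return min(conflicts)

print(contension(K))  # 1  (e.g. i(a) = B, i(b) = F)
```

This is the knowledgebase of Example 1 below, for which the contension is 1.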

α  β | α ∨ β  α ∧ β  ¬α
T  T |   T      T     F
T  B |   T      B     F
T  F |   T      F     F
B  T |   T      B     B
B  B |   B      B     B
B  F |   B      F     B
F  T |   T      F     T
F  B |   B      F     T
F  F |   F      F     T

Figure 1: Truth table for three-valued logic (3VL). This semantics extends the classical semantics with a third truth value, B, denoting contradictory. Rows 1, 3, 7, and 9 give the classical semantics, and the other rows give the extended semantics.

Example 1. For K = {a, ¬a, a ∨ b, ¬b}, there are two models of K, i_1 and i_2, where i_1(a) = B, i_1(b) = B, i_2(a) = B, and i_2(b) = F. Therefore, |Conflictbase(i_1)| = 2 and |Conflictbase(i_2)| = 1. Hence, Contension(K) = 1.

Finally, we consider some useful definitions based on the notion of implicants. A set of literals X is an implicant for a knowledgebase K iff for each φ ∈ K, X ⊢ φ. A minimal implicant is called a prime implicant. For example, for K = {a, b ∨ c}, the prime implicants are X_1 = {a, b} and X_2 = {a, c}. A proxy for K is a set of literals X such that X is a prime implicant of a maximal consistent subset of K. Let the set of proxies for K (denoted Proxies(K)) be defined as follows:

Proxies(K) = {X : X is a prime implicant of some K′ ∈ MC(K)}

For example, for K = {a, ¬a, b ∨ c}, Proxies(K) = {{a, b}, {¬a, b}, {a, c}, {¬a, c}}. We see that each proxy represents an interpretation of the possible literals that hold, and so the number of proxies rises by increasing the number of disjuncts in any formula, and by increasing the number of conflicting formulae. The cardinality of each proxy rises with the amount of information in each alternative, and so adding conjuncts to a formula will increase the size of one or more proxies (as long as the conjunction is consistent).

3 Inconsistency and information measures

In this section, we study inconsistency and information measures. We consider both existing and new proposals. Our main result is that for both inconsistency measures and information measures, the various measures are incompatible with one another.
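The proxies of the example K = {a, ¬a, b ∨ c} can be enumerated by brute force under the definitions above. The sketch below is our own illustration: it represents literals as strings, finds the maximal consistent subsets, and collects the prime implicants of each.

```python
from itertools import chain, combinations, product

# K = {a, ¬a, b ∨ c}; a formula is a list of clauses; a clause a list of literals
K = [[['a']], [['~a']], [['b', 'c']]]
ats = sorted({l.lstrip('~') for f in K for cl in f for l in cl})

def sat_lit(l, m): return m[l.lstrip('~')] != l.startswith('~')
def sat(f, m):     return all(any(sat_lit(l, m) for l in cl) for cl in f)

def models(fs):
    all_ms = (dict(zip(ats, v)) for v in product([True, False], repeat=len(ats)))
    return [m for m in all_ms if all(sat(f, m) for f in fs)]

def entails(X, f):                      # X a set of literals: does X ⊢ f hold?
    ms = [m for m in models([]) if all(sat_lit(l, m) for l in X)]
    return bool(ms) and all(sat(f, m) for m in ms)

def subsets(xs):
    return chain.from_iterable(combinations(xs, r) for r in range(len(xs) + 1))

lits = ats + ['~' + a for a in ats]

def prime_implicants(fs):
    imps = [set(X) for X in subsets(lits)
            if not any('~' + l in X for l in X)          # no complementary pair
            and all(entails(set(X), f) for f in fs)]
    return [X for X in imps if not any(Y < X for Y in imps)]

# maximal consistent subsets of K, by index
MCs = [s for s in subsets(range(len(K)))
       if models([K[i] for i in s])
       and all(not models([K[i] for i in s] + [K[j]])
               for j in range(len(K)) if j not in s)]

proxies = []
for s in MCs:
    for X in prime_implicants([K[i] for i in s]):
        if X not in proxies:
            proxies.append(X)

print(sorted(map(sorted, proxies)))  # [['a', 'b'], ['a', 'c'], ['b', '~a'], ['c', '~a']]
```

The four proxies printed correspond to {a, b}, {a, c}, {¬a, b}, and {¬a, c}, as in the text.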
This result strongly suggests that, unlike some other intuitive concepts, such as the concept of effective computability, where different definitions using recursion, λ-calculus, and Turing machines are equivalent, both inconsistency measures and information measures are too elusive to be captured by a single definition. Additionally, for information measures we also consider various plausible constraints and investigate which measures satisfy them.

3.1 Inconsistency measures for knowledgebases

An inconsistency measure assigns a nonnegative real value to every knowledgebase. We make three requirements for inconsistency measures. The constraints ensure that all and only consistent knowledgebases

get measure 0, the measure is monotonic for subsets, and the removal of a formula that does not participate in an inconsistency leaves the measure unchanged.

Definition 1. An inconsistency measure I : 𝒦 → R≥0 is a function such that the following three conditions hold:
1. I(K) = 0 iff K is consistent,
2. If K ⊆ K′, then I(K) ≤ I(K′),
3. For all α ∈ Free(K), I(K) = I(K \ {α}).

The above requirements are taken from [HK06], where (1) is called consistency, (2) is called monotony, and (3) is called free formula independence. Next we introduce five inconsistency measures, I_C, I_M, I_P, I_B, and I_Q; the rationale for each is given below.

Definition 2. For a knowledgebase K, the inconsistency measures I_C, I_M, I_P, I_B, and I_Q are such that

I_C(K) = |MI(K)|
I_M(K) = (|MC(K)| + |Selfcontradictions(K)|) − 1
I_P(K) = |Problematic(K)|
I_B(K) = Contension(K)
I_Q(K) = 0 if K is consistent, and otherwise Σ_{X ∈ MI(K)} 1/|X|

We explain the measures as follows: I_C(K) counts the number of minimal inconsistent subsets of K; I_M(K) counts the sum of the number of maximal consistent subsets together with the number of self-contradictory formulae, where 1 must be subtracted to make I_M(K) = 0 when K is consistent; I_P(K) counts the number of formulae in minimal inconsistent subsets of K; I_B(K) counts the minimum number of atoms that need to be assigned B amongst the 3VL models of K; and I_Q(K) computes the weighted sum of the minimal inconsistent subsets of K, where the weight is the inverse of the size of the minimal inconsistent subset (and hence smaller minimal inconsistent subsets are regarded as more inconsistent than larger ones). Clearly, each of these measures satisfies the definition of an inconsistency measure (i.e. Definition 1).

There is a rationale for each inconsistency measure. We cannot require these differently defined measures to have identical values, but it would be reasonable to hope that at least some of them place the knowledgebases in the same order with respect to inconsistency.
Define I_x and I_y to be order-equivalent if for all knowledgebases K_1 and K_2, I_x(K_1) < I_x(K_2) iff I_y(K_1) < I_y(K_2), and order-independent otherwise. The next theorem shows that order-equivalence does not hold for any pair of the inconsistency measures we have defined, leading us to conclude that inconsistency is too elusive a concept to be captured by a single measure. Since order-equivalence is expressed as a universal statement, order-independence, its negation, is proved by a counterexample.

Theorem 1. I_C, I_M, I_P, I_B, and I_Q are pairwise order-independent.

Proof. We use the following four knowledgebases to show independence. Let K_1 = {a, ¬a ∨ b, ¬b ∨ c, ¬c}, K_2 = {¬a, ¬b, a ∧ b}, K_3 = {a, ¬a ∨ b, ¬b ∨ c, ¬c, c ∨ d, ¬d}, and K_4 = {¬a, ¬b, a ∨ b, c}. Figure 2 gives the computation of these measures on the knowledgebases. The following equalities and inequalities show pairwise independence.

K    | I_C(K)  I_M(K)  I_P(K)  I_B(K)  I_Q(K)
K_1  |   1       3       4       1      1/4
K_2  |   2       1       3       2       1
K_3  |   2       6       6       1      7/12
K_4  |   1       2       3       1      1/3

Figure 2: The inconsistency measures for the knowledgebases K_1 = {a, ¬a ∨ b, ¬b ∨ c, ¬c}, K_2 = {¬a, ¬b, a ∧ b}, K_3 = {a, ¬a ∨ b, ¬b ∨ c, ¬c, c ∨ d, ¬d}, and K_4 = {¬a, ¬b, a ∨ b, c}.

I_C(K_1) < I_C(K_2) but I_M(K_1) > I_M(K_2) and I_P(K_1) > I_P(K_2); I_C(K_1) < I_C(K_3) but I_B(K_1) = I_B(K_3); and I_C(K_1) = I_C(K_4) but I_Q(K_1) < I_Q(K_4). Next, I_M(K_1) > I_M(K_2) but I_B(K_1) < I_B(K_2) and I_Q(K_1) < I_Q(K_2); and I_M(K_2) < I_M(K_4) but I_P(K_2) = I_P(K_4). Continuing, I_P(K_1) > I_P(K_2) but I_B(K_1) < I_B(K_2) and I_Q(K_1) < I_Q(K_2). Finally, I_B(K_1) = I_B(K_3) but I_Q(K_1) < I_Q(K_3).

Although the five inconsistency measures are quite different, four of them give identical results on bijection-equivalent knowledgebases.

Proposition 1. If K ≡_f K′ then I_C(K) = I_C(K′), I_M(K) = I_M(K′), I_P(K) = I_P(K′), and I_Q(K) = I_Q(K′).

Proof. Bijection-equivalence, K ≡_f K′, means that there is a bijection f : K → K′ such that for all φ ∈ K, φ is logically equivalent to f(φ). We use the notation f(X) = {f(φ) : φ ∈ X} where X ⊆ K (hence K′ = f(K)). Then X ∈ MI(K) iff f(X) ∈ MI(K′), and Y ∈ MC(K) iff f(Y) ∈ MC(K′). Additionally, φ ∈ Selfcontradictions(K) iff f(φ) ∈ Selfcontradictions(K′). The result then follows.

Interestingly, b-equivalence does not guarantee equality for I_B. The problem is with self-contradictions. For instance, if K = {a ∧ ¬a} and K′ = {a ∧ ¬a ∧ b ∧ ¬b}, then K ≡_f K′, but I_B(K) = 1 ≠ I_B(K′) = 2.

The use of minimal inconsistent subsets, as in I_C, I_P, and I_Q, and the use of maximal consistent subsets, as in I_M, have been proposed previously for measures of inconsistency [HK04, HK08]. The idea of a measure that is sensitive to the number of formulae needed to produce an inconsistency emanates from Knight [Kni01], in which the more formulae needed to produce the inconsistency, the less inconsistent the set. As explored in [HK08], this sensitivity is obtained with I_Q.
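As a spot check on the K_2 row of Figure 2 (under our reconstruction of K_2 as {¬a, ¬b, a ∧ b}), the measures I_C, I_M, I_P, and I_Q can be computed by brute force; I_B needs the 3VL machinery of Section 2 and is omitted here. The encoding and names below are ours.

```python
from itertools import chain, combinations, product

# K_2 = {¬a, ¬b, a ∧ b}; CNF: a formula is a list of clauses of literals
K = [[['~a']], [['~b']], [['a'], ['b']]]
ats = sorted({l.lstrip('~') for f in K for cl in f for l in cl})

def sat(f, m):
    return all(any(m[l.lstrip('~')] != l.startswith('~') for l in cl) for cl in f)

def consistent(fs):
    return any(all(sat(f, dict(zip(ats, v))) for f in fs)
               for v in product([True, False], repeat=len(ats)))

idx = range(len(K))
subs = list(chain.from_iterable(combinations(idx, r) for r in range(len(K) + 1)))

MI = [s for s in subs if not consistent([K[i] for i in s])
      and all(consistent([K[i] for i in t]) for t in combinations(s, len(s) - 1))]
MC = [s for s in subs if consistent([K[i] for i in s])
      and all(not consistent([K[i] for i in s] + [K[j]]) for j in idx if j not in s)]

selfcontradictions = [i for i in idx if not consistent([K[i]])]
problematic = sorted({i for s in MI for i in s})

I_C = len(MI)
I_M = len(MC) + len(selfcontradictions) - 1
I_P = len(problematic)
I_Q = sum(1 / len(s) for s in MI)

print(I_C, I_M, I_P, I_Q)  # 2 1 3 1.0
```

The two minimal inconsistent subsets are {¬a, a ∧ b} and {¬b, a ∧ b}, each of size 2, giving I_Q = 1/2 + 1/2 = 1.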
Another approach involves looking at the proportion of the language that is touched by the inconsistency, as in I_B. This allows us to look inside the formulae [Hun02, KLM03, GH06]. This means that two formulae viewed as two whole belief bases (singleton sets) can have different inconsistency measures. But in these approaches one can identify a set of formulae with its conjunction (i.e. the set {φ, φ′} has the same inconsistency measure as the set {φ ∧ φ′}). Whilst model-based techniques have been proposed before for measures of inconsistency, I_B is a novel proposal since it is based on three-valued logic, and as such is simpler than the ones based on four-valued logic (e.g. [Hun02]).

3.2 Information measures for knowledgebases

Another dimension to analysing inconsistency is to ascertain the amount of information in a knowledgebase. The following proposal for information measures assigns a nonnegative real number to every knowledgebase. The constraints ensure that the empty set has measure 0, the measure is subset-monotonic

for consistent knowledgebases, and a consistent knowledgebase that does not contain only tautologies has nonzero measure.

Definition 3. An information measure J : ℘(L) → R≥0 is a function such that the following three conditions hold:
1. If K = ∅ then J(K) = 0.
2. If K′ ⊆ K and K is consistent, then J(K′) ≤ J(K).
3. If K is consistent and there is a φ ∈ K such that φ is not a tautology, then J(K) > 0.

The above definition is a general definition that allows for a range of possible measures to be defined. Next we introduce seven information measures; the rationale for each is given below.

Definition 4. For a knowledgebase K, the information measures J_A, J_S, J_F, J_C, J_B, J_P, and J_L are such that

J_A(K) = |Atoms(K)|
J_S(K) = |K|
J_F(K) = |Free(K)|
J_C(K) = Max{|M| : M ∈ MC(K)}
J_B(K) = Max{|Binarybase(i)| : i ∈ Models(K)}
J_P(K) = Max{|X| : X ∈ Proxies(K)}
J_L(K) = log_2( 2^n / |∪{Models(K′) : K′ ∈ MC(K)}| ), where n = |Atoms(K)| and Models(K′) here denotes the classical models of K′ over the n atoms of K

The first two measures do not actually deal with inconsistency at all: J_A counts the number of atoms and J_S counts the number of formulae. For the other five measures: J_F counts the number of free formulae; J_C finds the size of the largest maximal consistent subset; J_B finds the maximum number of atoms that need not be assigned B in the 3VL models; J_P finds the size of the largest proxy; and J_L uses an information-theoretic approach that is discussed further at the end of this section. All seven measures are information measures according to Definition 3.

In analogy to inconsistency measures, we can define order-equivalence and order-independence for information measures. Similarly, we find that order-equivalence does not hold for any pair of information measures, leading us to conclude that information, like inconsistency, is too elusive a concept to be captured by a single measure.

Theorem 2. J_A, J_S, J_F, J_C, J_B, J_P, and J_L are pairwise order-independent.

Proof.
We use the following knowledgebases, two from before: K_1 = {a, ¬a ∨ b, ¬b ∨ c, ¬c}, K_2 = {¬a, ¬b, a ∧ b}, K_5 = {a, ¬b, ¬a ∨ b, b, ¬a ∨ ¬b}, and K_6 = {a ∨ b ∨ c}. Figure 3 gives the computation of these measures on the knowledgebases. The following equalities and inequalities show independence. J_A(K_1) = J_A(K_6) but J_S(K_1) > J_S(K_6), J_F(K_1) < J_F(K_6), J_C(K_1) > J_C(K_6), J_B(K_1) < J_B(K_6), J_P(K_1) > J_P(K_6), and J_L(K_1) > J_L(K_6). Next, J_S(K_1) > J_S(K_6) but J_F(K_1) < J_F(K_6) and J_B(K_1) < J_B(K_6); and J_S(K_1) < J_S(K_5) but J_C(K_1) = J_C(K_5), J_P(K_1) > J_P(K_5), and J_L(K_1) > J_L(K_5). Continuing, J_F(K_1) = J_F(K_5) but J_B(K_1) > J_B(K_5), J_P(K_1) > J_P(K_5), and J_L(K_1) > J_L(K_5);

K    | J_A(K)  J_S(K)  J_F(K)  J_C(K)  J_B(K)  J_P(K)  J_L(K)
K_1  |   3       4       0       3       2       3        1
K_2  |   2       3       0       2       0       2        1
K_5  |   2       5       0       3       1       2        0
K_6  |   3       1       1       1       3       1     log_2(8/7)

Figure 3: The information measures for the knowledgebases K_1 = {a, ¬a ∨ b, ¬b ∨ c, ¬c}, K_2 = {¬a, ¬b, a ∧ b}, K_5 = {a, ¬b, ¬a ∨ b, b, ¬a ∨ ¬b}, and K_6 = {a ∨ b ∨ c}.

and J_F(K_1) < J_F(K_6) but J_C(K_1) > J_C(K_6). Next, J_C(K_1) = J_C(K_5) but J_B(K_1) > J_B(K_5), J_P(K_1) > J_P(K_5), and J_L(K_1) > J_L(K_5). Then, J_B(K_1) < J_B(K_6) but J_P(K_1) > J_P(K_6) and J_L(K_1) > J_L(K_6). Finally, J_P(K_2) = J_P(K_5) but J_L(K_2) > J_L(K_5).

Next we prove some results concerning information measures, followed by some that relate information measures to inconsistency measures.

Proposition 2. If K is consistent, then J_S(K) = J_F(K) = J_C(K).

Proof. In a consistent knowledgebase all formulae are free.

Proposition 3. If K is a set of literals, then J_A(K) = J_C(K) = J_P(K).

Proof. Let J_A(K) = n. K may contain more than n literals because it may contain both α and ¬α for some atom α. Each X ∈ MC(K) is obtained by deleting one of α or ¬α for each such atom. Any set obtained this way is also a maximal proxy. Hence J_C(K) = J_P(K) = n.

Proposition 4. For any knowledgebase K, J_S(K) − J_F(K) = I_P(K).

Proof. J_S(K) − J_F(K) = |K| − |Free(K)| = |Problematic(K)| = I_P(K).

Proposition 5. For any knowledgebase K, J_A(K) − J_B(K) = I_B(K).

Proof. For every interpretation i and atom α, α ∈ Binarybase(i) ∪ Conflictbase(i), where Binarybase(i) ∩ Conflictbase(i) = ∅. But then, if i is an interpretation for which |Binarybase(i)| is maximal, |Conflictbase(i)| must be minimal, and vice versa. For any such i, |Atoms(K)| − |Binarybase(i)| = |Conflictbase(i)|.

Proposition 6. No information measure is also an inconsistency measure.

Proof. Let K = {a}. Then I(K) = 0 by the first constraint for inconsistency measures and J(K) > 0 by the third constraint for information measures.

Since our definition of an information measure (i.e.
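Proposition 5 can be spot-checked on the knowledgebase of Example 1. The sketch below is our own encoding: it enumerates the 3VL models and computes J_A, J_B, and I_B from them.

```python
from itertools import product

F, B, T = 0, 1, 2   # Priest's 3VL with F < B < T; ∧ = min, ∨ = max, ¬v = 2 − v

K = [[['a']], [['~a']], [['a', 'b']], [['~b']]]   # {a, ¬a, a ∨ b, ¬b}
ats = sorted({l.lstrip('~') for f in K for cl in f for l in cl})

def val(f, i):
    return min(max(2 - i[l[1:]] if l[0] == '~' else i[l] for l in cl) for cl in f)

# 3VL models: interpretations under which no formula evaluates to F
models = [i for vs in product([F, B, T], repeat=len(ats))
          for i in [dict(zip(ats, vs))]
          if all(val(f, i) >= B for f in K)]

I_B = min(sum(1 for a in ats if i[a] == B) for i in models)   # Contension
J_A = len(ats)
J_B = max(sum(1 for a in ats if i[a] != B) for i in models)   # max |Binarybase|

print(J_A - J_B == I_B, I_B)  # True 1
```

Here J_A = 2, the largest Binarybase among the models has size 1, and the contension is 1, so J_A − J_B = I_B as the proposition states.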
Definition 3) is rather weak, we consider additional constraints that can be useful for comparing information measures. For an information measure J, and for any knowledgebases K, K′ ⊆ L, we call J:

(Monotonic) If K ⊆ K′, then J(K) ≤ J(K′).
(Clarity) For all φ ∈ K, J(K) ≥ J(K ∪ {¬φ}).
(Equivalence) If K is consistent and K ≡ K′, then J(K) = J(K′).

(Bijection-Equivalence) If K ≡_f K′, then J(K) = J(K′).
(Closed) If K is consistent and K ⊢ φ, then J(K) = J(K ∪ {φ}).
(Cumulative) If K ∪ {φ} is consistent and K ⊬ φ, then J(K) < J(K ∪ {φ}).

A monotonic information measure is monotonic even for inconsistent knowledgebases. A clarity measure does not increase when the negation of a formula in the knowledgebase is added. An equivalence measure assigns the same value to logically equivalent consistent knowledgebases. A bijection-equivalence measure (first proposed in [Kni01]) has the same value for a pair of knowledgebases whose formulae are pairwise equivalent. A closed measure (first proposed in [Loz94b]) does not have increased information for a consistent knowledgebase when entailed formulae are added. A cumulative measure (first proposed in [Loz94b]) has increased information for a consistent knowledgebase when a non-entailed formula is added that is consistent with it.

Theorem 3. Figure 4 indicates the constraints that hold for each of the information measures J_A, J_S, J_F, J_C, J_B, J_P, and J_L.

Proof. (1) J_A. It is clear from the definitions that J_A has the monotonic and clarity properties. Equivalence and b-equivalence do not hold: consider K = {a ∨ ¬a} and K′ = {a ∨ ¬a ∨ b}. K is consistent, K ≡ K′, and K ≡_f K′, but J_A(K) = 1 and J_A(K′) = 2. J_A is not closed: let K_1 = {a}; K_1 is consistent and K_1 ⊢ a ∨ b, but J_A(K_1) = 1 and J_A({a, a ∨ b}) = 2. J_A is also not cumulative: let K_2 = {a ∨ b}; then K_2′ = {a ∨ b, a} is consistent and K_2 ⊬ a, but J_A(K_2) = J_A(K_2′). (2) J_S. It is clear from the definitions that J_S has the monotonic, b-equivalence, and cumulative properties, and does not have the clarity and closed properties. To show that equivalence does not hold, let K = {a, b} and K′ = {a ∧ b}. Here K ≡ K′ but J_S(K) = 2 ≠ J_S(K′) = 1. (3) J_F. It is clear from the definitions that J_F has the clarity and cumulative properties.
To show that J_F is not monotonic, let K = {a} and K′ = {a, ¬a}. Then K ⊆ K′, but J_F(K) = 1 and J_F(K′) = 0. Equivalence and closed also do not hold: consider K_1 = {a ∧ b} and K_1′ = {a ∧ b, a}. Then K_1 ≡ K_1′ but J_F(K_1) = 1 and J_F(K_1′) = 2. However, b-equivalence holds because for two equivalent formulae either both or neither are free. (4) J_C. J_C(K) = J_S(K) for consistent K. The properties are the same as for J_S and the proofs similar. (5) J_B. Clarity is clear from the definitions. The example used for J_F shows that J_B is not monotonic. The other properties fail as they do for J_A. (6) J_P. It is clear from the definitions that J_P is monotonic, has clarity, and satisfies both equivalence and b-equivalence. The closed property also follows because the addition of an entailed φ does not change the proxies. Finally, cumulative also holds because Proxies(K ∪ {φ}) contains a proxy entailing φ but Proxies(K) does not contain it (as K ∪ {φ} is consistent, there is only one maximal consistent subset). (7) J_L. As J_L has a more complicated definition, none of the six properties is immediately clear, and we proceed one by one. J_L is not monotonic: consider K = {a, b} and K′ = {a, b, ¬a}. Now K ⊆ K′ but J_L(K) = log_2 4 > J_L(K′) = log_2 2. On the other hand, J_L has clarity because the addition of ¬φ leaves the numerator the same and increases the denominator in the definition. Both equivalence and b-equivalence hold because in general both the numerator and denominator of the (b-)equivalent knowledgebases are the same. The only way the numerators can differ is in the case of tautologies. For instance, a ∨ ¬a in K is logically equivalent to a ∨ ¬a ∨ b in K′; suppose that otherwise K and K′ are the same and b does not appear in K. Then, comparing J_L(K′) to J_L(K), we find that the numerator (resp. denominator) of the former is twice the numerator (resp. denominator) of the latter, hence J_L(K) = J_L(K′). This argument works in a similar way no matter how many different atoms are in K and K′. Next we show that J_L is not closed.
For this purpose let K = {a}. Now K ⊢ a ∨ b, but J_L(K) ≠ J_L({a, a ∨ b}). Finally, we show that J_L is cumulative. So consider what happens when K is extended with φ as in the definition. Consider first the case where all the atoms in φ are in K. Then the extension causes the number of models to decrease, thereby increasing the J_L value. In case φ has one or more atoms not in K, the numerator of J_L will increase without a proportional increase in the denominator. Hence the cumulative condition holds.
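Under the reading of J_L used above (2^n divided by the number of classical models of the maximal consistent subsets, collected together), the non-monotonicity counterexample can be checked mechanically. The function name and encoding below are ours, and the reading of the garbled original definition is an assumption.

```python
from itertools import chain, combinations, product
from math import log2

def J_L(K):
    # K: list of CNF formulas (formula = list of clauses of 'x' / '~x' literals)
    ats = sorted({l.lstrip('~') for f in K for cl in f for l in cl})
    def sat(f, m):
        return all(any(m[l.lstrip('~')] != l.startswith('~') for l in cl) for cl in f)
    def models(fs):
        return {v for v in product([True, False], repeat=len(ats))
                if all(sat(f, dict(zip(ats, v))) for f in fs)}
    idx = range(len(K))
    subs = chain.from_iterable(combinations(idx, r) for r in range(len(K) + 1))
    MC = [s for s in subs if models([K[i] for i in s])
          and all(not models([K[i] for i in s] + [K[j]]) for j in idx if j not in s)]
    union = set().union(*(models([K[i] for i in s]) for s in MC))
    return log2(2 ** len(ats) / len(union))

K1 = [[['a']], [['b']]]              # {a, b}:      J_L = log2(4/1) = 2
K2 = [[['a']], [['b']], [['~a']]]    # {a, b, ¬a}:  J_L = log2(4/2) = 1
print(J_L(K1), J_L(K2))  # 2.0 1.0
```

Adding ¬a enlarges the set of models contributed by the maximal consistent subsets, so the measure drops, confirming that J_L is not monotonic.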

Constraint      | J_A  J_S  J_F  J_C  J_B  J_P  J_L
Monotonic       |  ✓    ✓         ✓         ✓
Clarity         |  ✓         ✓         ✓    ✓    ✓
Equivalence     |                           ✓    ✓
B-Equivalence   |       ✓    ✓    ✓         ✓    ✓
Closed          |                           ✓
Cumulative      |       ✓    ✓    ✓         ✓    ✓

Figure 4: Summary of the constraints that hold (indicated by ✓) for particular information measures.

Depending on which constraints one considers important, one may choose from those measures that satisfy them. In particular, J_P satisfies all six constraints. The J_A, J_S, J_F, and J_C measures are simple syntactic measures that have been considered in some form before (see for example [HK04] for a discussion). However, J_B and J_P are novel proposals for information measures. There have also been proposals for measures of information for propositional logic based on Shannon's information theory (see for example [Kem53]). Essentially, these measures consider the number of models of the set of formulae (the fewer models, the more informative the set), and in case the set of formulae is consistent, the result is intuitive. However, when the set is inconsistent, it is regarded as having null information content. To address the need to consider inconsistent information, Lozinskii proposed a generalization of the information-theoretic approach to measuring information [Loz94b], which we called J_L earlier.

4 Stepwise inconsistency resolution

Generally, when a knowledgebase is inconsistent, we would like to reduce its inconsistency value, preferably to 0. The problem is that a reduction in inconsistency may lead to a corresponding reduction in information. Consider, for instance, J_S. This measure counts the number of formulae in the knowledgebase; hence any deletion reduces it. Our goal is to reduce inconsistency with as little information loss as possible, a task that depends on the choice of both the inconsistency measure and the information measure. We start by formally defining the three functions that we allow in the process of inconsistency resolution.
We will assume, unless specified otherwise, that these operations are applied to an inconsistent knowledgebase and that the formulae in the knowledgebase are in conjunctive normal form.

Definition 5. An inconsistency resolution function irf is one of the following three functions applied to a knowledgebase K:

(Deletion) d(φ) = K \ {φ}.
(Weakening) w(φ, ψ) = (K \ {φ}) ∪ {ψ}, where φ ⊢ ψ.
(Splitting) s(φ) = (K \ {φ}) ∪ {φ_1, ..., φ_n}, where φ_1, ..., φ_n are the conjuncts of φ.

We write irf(K) for the knowledgebase obtained by applying irf to K. Also, irf(K) = K in case irf is not applicable to K. In the stepwise inconsistency resolution process we will usually have multiple applications of such functions. A stepwise resolution function sequence (abbr. function sequence) F = irf_1, ..., irf_n is a sequence of such functions. A stepwise inconsistency resolution knowledgebase sequence (abbr. knowledgebase sequence) K_F = K_0, ..., K_n is a sequence of knowledgebases obtained by using F such that K_0 is the
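On the CNF encoding used in our earlier sketches (a formula as a list of clauses), the three functions of Definition 5 are immediate to write down. This is our illustrative rendering; weakening leaves it to the caller to supply a logically implied ψ, and splitting treats each clause as one conjunct.

```python
# Sketch of the three resolution functions of Definition 5.
# A knowledgebase is a list of CNF formulas; a formula is a list of clauses.
def delete(K, f):
    return [g for g in K if g != f]

def weaken(K, f, g):          # caller must ensure f logically implies g
    return delete(K, f) + [g]

def split(K, f):              # one singleton formula per conjunct (clause)
    return delete(K, f) + [[cl] for cl in f]

phi = [['~a'], ['~b'], ['c']]                 # ¬a ∧ ¬b ∧ c from Section 1
K = [[['a']], phi, [['b']], [['d']]]
print(split(K, phi))
# [[['a']], [['b']], [['d']], [['~a']], [['~b']], [['c']]]
```

Applying split to ¬a ∧ ¬b ∧ c reproduces the knowledgebase {a, b, d, ¬a, ¬b, c} from the splitting example in Section 1.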

α                 |   a   ¬a ∧ b   ¬b ∨ c   ¬c   c ∨ d   ¬d
R_IC(K, K \ {α})  |   1     2        1       2     1      1
R_IM(K, K \ {α})  |   1     3        0       4     3      3
R_IP(K, K \ {α})  |   1     3        1       4     2      2
R_IB(K, K \ {α})  |   1     1        0       1     0      0
R_IQ(K, K \ {α})  |  3/6   5/6      2/6     4/6   2/6    2/6

Figure 5: Illustration of resolution measures applied to knowledgebases obtained by deleting a formula from the knowledgebase K = {a, ¬a ∧ b, ¬b ∨ c, ¬c, c ∨ d, ¬d}. Here we see that according to I_P, ¬c is the optimal choice for deletion, while for I_Q it is ¬a ∧ b.

initial knowledgebase and irf_i(K_{i−1}) = K_i for 1 ≤ i ≤ n. We also write F(K_0) = K_n and observe that K_n = irf_n(... irf_1(K_0) ...). The goal of stepwise inconsistency resolution is to reduce the inconsistency of the knowledgebase. Next we define a simple way to measure the reduction. We will be interested in applying this definition to the case where F(K) = K′ for some function sequence F.

Definition 6. Given an inconsistency measure I, an inconsistency resolution measure R_I : 𝒦 × 𝒦 → R is defined as follows:

R_I(K, K′) = I(K) − I(K′)

For illustration we give two examples. The example given in Figure 5 corresponds to deletion, and the following example corresponds to splitting a formula.

Example 2. Let K = {a, ¬a ∧ ¬b, b}. Splitting K by applying s(¬a ∧ ¬b) we obtain K′ = {a, ¬a, ¬b, b}. Here we see that splitting does not reduce inconsistency according to any of the five inconsistency measures. Indeed, for several measures it causes an increase in inconsistency:

R_IC(K, K′) = 0
R_IM(K, K′) = −2
R_IP(K, K′) = −1
R_IB(K, K′) = 0
R_IQ(K, K′) = 0

Some simple observations concerning the R_I measure are the following: (1) if φ ∉ K, then R_I(K, K \ {φ}) = 0; and (2) if φ ∈ Free(K), then R_I(K, K \ {φ}) = 0. In the stepwise resolution process we try to minimize the loss of information as well. For this reason we now define a way to measure the loss of information.

Definition 7. Given an information measure J, an information loss measure R_J : 𝒦 × 𝒦 → R is defined as follows.
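The R_IQ row of Figure 5 (under our reconstruction of its knowledgebase) can be reproduced by scoring each candidate deletion; the names and encoding below are ours. Exact fractions are used so the weights 1/|X| add up without rounding.

```python
from itertools import chain, combinations, product
from fractions import Fraction

# K = {a, ¬a ∧ b, ¬b ∨ c, ¬c, c ∨ d, ¬d}; formula = list of clauses of literals
K = {'a': [['a']], '~a&b': [['~a'], ['b']], '~b|c': [['~b', 'c']],
     '~c': [['~c']], 'c|d': [['c', 'd']], '~d': [['~d']]}
ats = sorted({l.lstrip('~') for f in K.values() for cl in f for l in cl})

def sat(f, m):
    return all(any(m[l.lstrip('~')] != l.startswith('~') for l in cl) for cl in f)

def consistent(fs):
    return any(all(sat(f, dict(zip(ats, v))) for f in fs)
               for v in product([True, False], repeat=len(ats)))

def I_Q(names):
    subs = chain.from_iterable(combinations(names, r) for r in range(len(names) + 1))
    MI = [s for s in subs if not consistent([K[n] for n in s])
          and all(consistent([K[n] for n in t]) for t in combinations(s, len(s) - 1))]
    return sum(Fraction(1, len(s)) for s in MI)

base = I_Q(list(K))   # 1/2 + 1/3 + 1/3 = 7/6 over the three minimal inconsistent subsets
for name in K:
    print(name, base - I_Q([n for n in K if n != name]))
# Deleting ¬a ∧ b gives the largest gain: R_IQ = 5/6
```

Deleting ¬a ∧ b removes the two minimal inconsistent subsets it occurs in (of sizes 2 and 3), which is why it is optimal for I_Q even though ¬c is optimal for I_P.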
R J (K, K ) = J(K) J(K ) Our general goal is to simultaneously maximize R I and minimize R J. In the following subsections we consider some of the issues for each of the options we have (i.e. for deletion, for weakening, and for splitting). 11
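These definitions translate directly into code. Below is a minimal sketch (the encoding and function names are ours, not the paper's): a CNF formula is a frozenset of clauses, each clause a frozenset of literal strings such as "a" or "~a", and a knowledgebase is a set of formulae. The entailment precondition φ ⊨ ψ of weakening is assumed rather than checked.

```python
# Illustrative encoding (not from the paper): a clause is a frozenset of
# literal strings ("a" or "~a"), a CNF formula is a frozenset of clauses,
# and a knowledgebase K is a set of formulae.

def delete(kb, phi):
    """d(phi) = K \\ {phi}."""
    return kb - {phi}

def weaken(kb, phi, psi):
    """w(phi, psi) = (K \\ {phi}) | {psi}; the precondition phi |= psi
    is assumed here, not checked."""
    return (kb - {phi}) | {psi}

def split(kb, phi):
    """s(phi): replace phi by its conjuncts, one formula per clause."""
    return (kb - {phi}) | {frozenset([clause]) for clause in phi}

def resolution_measure(measure, kb, kb2):
    """R(K, K') = measure(K) - measure(K'), for any I or J measure."""
    return measure(kb) - measure(kb2)

# Example 2: K = {a, ~a & ~b, b}; splitting ~a & ~b yields {a, ~a, ~b, b}.
A = frozenset({frozenset({"a"})})
B = frozenset({frozenset({"b"})})
NA_NB = frozenset({frozenset({"~a"}), frozenset({"~b"})})
K = {A, NA_NB, B}
K2 = split(K, NA_NB)
```

Taking the number of formulae (the measure J_S) as the measure argument, resolution_measure(len, K, K2) returns 3 − 4 = −1: splitting strictly increases J_S, so the corresponding R_JS value is negative.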

4.1 Inconsistency resolution by deletion

Deletion is the simplest, and yet most drastic, of the options we have for dealing with inconsistency. In terms of deciding how to proceed, if deletion is the only function used, it is just a matter of choosing a formula to delete at each step. The following result describes the possibilities for both R_I and R_J when K′ is obtained from K by a single deletion.

Theorem 4. Let K′ be obtained from an inconsistent K by deleting a single formula. (a) For all 5 inconsistency measures R_I(K, K′) ≥ 0. (b) For the information measures J_F, J_B and J_L, R_J(K, K′) may be negative; in the other cases R_J(K, K′) is a nonnegative integer.

Proof. (a) Immediate from property (2) of an inconsistency measure. (b) Clearly, when a formula is deleted from an inconsistent knowledgebase K, the number of atoms and the number of formulae cannot increase. Consider now J_C which measures the size of the biggest maximal consistent subset. No formula deletion can increase this size. For a similar reason, as the proxies are taken over maximal consistent subsets, R_J cannot be negative for J_P. For the other cases let K = {a, ¬a ∧ ¬b, b} and K′ = {a, b}. Then R_JF(K, K′) = R_JB(K, K′) = 0 − 2 = −2 and R_JL(K, K′) = 1 − 2 = −1.

The following result follows immediately from the second constraint of an information measure and will be useful in narrowing the knowledgebases that need to be considered for minimal information loss when using deletions only for inconsistency resolution.

Proposition 7. If K is consistent then R_J(K, K \ {φ}) ≥ 0.

This result shows that once we delete enough formulae from an inconsistent knowledgebase to make it consistent (and thereby make any inconsistency measure 0), we might as well stop because additional deletions may only cause information loss. This gives the following result.

Corollary 1. Suppose that stepwise inconsistency resolution is done by deletions only.
To find a consistent knowledgebase with minimal information loss (i.e. where R_J(K, K′) is minimal) it suffices to consider only those function sequences F where F(K) ∈ MC(K).

Now we present two examples where we check all the maximal consistent subsets of a knowledgebase for information loss.

Example 3. Let K = {¬a ∧ b, a, ¬b ∨ c, ¬c}, K₁ = {a, ¬b ∨ c, ¬c}, K₂ = {¬a ∧ b, ¬c}, and K₃ = {¬a ∧ b, ¬b ∨ c}. K₁, K₂, and K₃ are the maximal consistent subsets. K₁ minimizes information loss for all Js.

[The table of J_A, J_S, J_F, J_C, J_B, J_P, and J_L values for K, K₁, K₂, and K₃ did not survive extraction.]
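Corollary 1 suggests a brute-force procedure: enumerate MC(K) and compare the information loss of each member. The sketch below does this by exhaustive search, using the reconstructed reading K = {¬a ∧ b, a, ¬b ∨ c, ¬c} of Example 3 (the connectives did not survive extraction); the encoding and names are illustrative, not the paper's.

```python
from itertools import combinations, product

# Illustrative CNF encoding: clause = frozenset of literal strings ("a"/"~a"),
# formula = frozenset of clauses, knowledgebase = set of formulae.

def atoms(kb):
    return sorted({lit.lstrip("~") for f in kb for clause in f for lit in clause})

def satisfiable(kb):
    # Exhaustively try every truth assignment over the atoms of kb.
    ats = atoms(kb)
    for values in product([False, True], repeat=len(ats)):
        world = {a for a, v in zip(ats, values) if v}
        if all(any((lit.lstrip("~") in world) != lit.startswith("~")
                   for lit in clause)
               for f in kb for clause in f):
            return True
    return False

def maximal_consistent(kb):
    """All maximal consistent subsets MC(K), found largest-first."""
    mcs = []
    for r in range(len(kb), 0, -1):
        for sub in combinations(kb, r):
            if not any(set(sub) < m for m in mcs) and satisfiable(sub):
                mcs.append(set(sub))
    return mcs

# Example 3, assuming K = {~a & b, a, ~b | c, ~c}.
NA_B = frozenset({frozenset({"~a"}), frozenset({"b"})})
A = frozenset({frozenset({"a"})})
NB_C = frozenset({frozenset({"~b", "c"})})
NC = frozenset({frozenset({"~c"})})
K = {NA_B, A, NB_C, NC}
mcs = maximal_consistent(K)
```

Under this reading the search returns exactly the three maximal consistent subsets K₁, K₂, and K₃ of Example 3, so a deletion-only resolution need only compare these three candidates for information loss.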

Example 4. Let K = {¬a ∧ b ∧ ¬c, a, c}. The maximal consistent subsets are K₁ = {¬a ∧ b ∧ ¬c} and K₂ = {a, c}. In this case K₁ is the better choice for J_A, J_B, J_P, and J_L, while K₂ is the better choice for J_S, J_F, and J_C.

[The table of J values for K, K₁, and K₂ did not survive extraction.]

4.2 Inconsistency resolution by weakening

In this subsection we investigate the case where the inconsistency of a knowledgebase is resolved by using weakenings only. Thus we start with an inconsistent knowledgebase K and by applying one or more weakenings we obtain a consistent K′. Our concern here is what happens to the information measure during this process. In order to analyze this situation we will exclude the case where a formula is weakened by using an atom not in K, such as by applying a disjunction with such an atom. We do this because it does not seem reasonable to change the language of the knowledgebase when our purpose is to weaken it for consistency. Also, by excluding this case we make sure that the information measure cannot become arbitrarily large by simply taking bigger and bigger disjunctions with new atoms. Our result is summarized in the following theorem.

Theorem 5. Let K be an inconsistent knowledgebase that is transformed to a consistent knowledgebase K′ by one or more weakenings without introducing any atom not already in K. Then
1. J_A(K′) ≤ J_A(K).
2. J_S(K′) ≤ J_S(K).
3. J_F(K′) ≥ J_F(K).
4. J_C(K′) ≥ J_C(K).
5. No inequality holds between J_B(K′) and J_B(K).
6. J_P(K′) ≤ J_P(K).
7. J_L(K′) ≥ J_L(K).

Proof. 1. follows from the assumption that no new atom is added. However, an atom may be lost, for example, by w(a ∧ b, a). 2. follows from the fact that weakening substitutes one formula for another; hence the number of formulae cannot increase. A decrease is possible if the new formula is already in the knowledgebase. The reason for 3. is that weakenings may cause the new formulae to be free. Equality is possible in case all weakened formulae ψ are already in K.
A similar argument works for 4. except that here the maximal size of a consistent subset may increase. To show 5. we note that in the example following this proof J_B(K′) > J_B(K). But let K = {a ∧ b ∧ c ∧ d, ¬a} and apply w(a ∧ b ∧ c ∧ d, b) to get K′ = {b, ¬a}. Then 3 = J_B(K) > J_B(K′) = 2. We just note here that if no atoms are lost during the weakenings, as additional atoms may be added to the binarybase, J_B(K′) ≥ J_B(K). During the process of weakening, the maximum size of a proxy cannot increase, but may decrease as shown in the following

example. Let K = {a ∧ b ∧ c, ¬b ∨ ¬c}. Here J_P(K) = 3. Consider the weakening w(a ∧ b ∧ c, a ∧ b), so K′ = {a ∧ b, ¬b ∨ ¬c}. Then J_P(K′) = 1. This gives 6.

To show 7. we consider two cases. First, if Atoms(K) = Atoms(K′) then the numerator remains fixed. The denominator cannot increase because now there is only one maximal consistent set. In the second case, where Atoms(K) > Atoms(K′), the numerator decreases by some power of 2 but the number of models for the denominator must decrease by at least the same power of 2.

We end this subsection by revisiting Example 3 but here we apply weakening instead of deletion.

Example 5. We start with K = {¬a ∧ b, a, ¬b ∨ c, ¬c}. Applying F₁ = ⟨w(a, a ∨ c), w(¬c, b ∨ ¬c)⟩ we obtain K₁ = {¬a ∧ b, a ∨ c, ¬b ∨ c, b ∨ ¬c}. If we apply F₂ = ⟨w(¬a ∧ b, ¬b ∨ ¬a)⟩, we get K₂ = {¬b ∨ ¬a, a, ¬b ∨ c, ¬c}. Both weakenings result in identical J values.

[The table of J values for K, K₁, and K₂ did not survive extraction.]

4.3 Inconsistency resolution using splitting

Here we consider what happens when splitting is applied. First we note that unlike deletion and weakening, splitting by itself cannot resolve inconsistencies. Hence splitting must be used in conjunction with deletion or weakening. We start by considering what happens when just splitting is applied. Just as in the case of deletions and weakenings, we split only formulae in Problematic(K).

Theorem 6. Let K′ be obtained from an inconsistent knowledgebase K by splitting a single formula in Problematic(K). Then

(a)
1. I_C(K′) ≥ I_C(K),
2. I_M(K′) ≥ I_M(K),
3. I_P(K′) ≥ I_P(K),
4. I_B(K′) = I_B(K),
5. No inequality holds between I_Q(K′) and I_Q(K).

(b)
1. J_A(K′) = J_A(K),
2. J_S(K′) > J_S(K),
3. J_F(K′) ≥ J_F(K),
4. J_C(K′) ≥ J_C(K),
5. J_B(K′) = J_B(K),
6. J_P(K′) = J_P(K),
7. No inequality holds between J_L(K′) and J_L(K).

Proof. Suppose that the formula φ was split to φ₁, ..., φₙ.

(a) Consider a minimal inconsistent subset S = {φ, ψ₁, ..., ψ_k}. Then S′ = {φ₁, ..., φₙ, ψ₁, ..., ψ_k} is an inconsistent subset of K′. So there must be at least one (and possibly several) minimal inconsistent subsets of S′. This shows that I_C(K′) ≥ I_C(K). For example, if K₁ = {a ∧ b, ¬a ∧ ¬b} and s(a ∧ b) is applied, K₁′ = {a, b, ¬a ∧ ¬b}, giving I_C(K₁) = 1 and I_C(K₁′) = 2.

Next, consider a maximal consistent subset T of K. If T does not contain φ then either T is still a maximal consistent subset of K′ or there may be several maximal consistent subsets of K′ obtained by adding one or more elements from {φ₁, ..., φₙ}. On the other hand, if T contains φ, say T = {φ, ψ₁, ..., ψ_k}, then T′ = {φ₁, ..., φₙ, ψ₁, ..., ψ_k} is a maximal consistent subset of K′. It may also be possible that splitting creates a new maximal consistent subset. So I_M(K′) ≥ I_M(K). For example, let K₂ = {a ∧ b ∧ c, ¬a, ¬c ∨ ¬d, d} and apply s(a ∧ b ∧ c) to get K₂′ = {a, b, c, ¬a, ¬c ∨ ¬d, d}. Then MC(K₂) = {{a ∧ b ∧ c, ¬c ∨ ¬d}, {a ∧ b ∧ c, d}, {¬a, ¬c ∨ ¬d, d}} and hence I_M(K₂) = 2. Also, MC(K₂′) = {{a, b, c, ¬c ∨ ¬d}, {a, b, c, d}, {a, b, ¬c ∨ ¬d, d}, {b, c, ¬a, ¬c ∨ ¬d}, {b, c, ¬a, d}, {b, ¬a, ¬c ∨ ¬d, d}} and hence I_M(K₂′) = 5.

Since φ ∈ Problematic(K), at least one φᵢ ∈ Problematic(K′) for 1 ≤ i ≤ n. Hence I_P(K′) ≥ I_P(K). Since splitting does not change the number of atoms in the conflictbase of an interpretation and I_B counts the minimal number of atoms in the conflictbase, I_B(K′) = I_B(K).

To show why there is no inequality for I_Q, we consider two examples. First, let K₃ = {a ∧ b, ¬a ∧ ¬b} and apply s(a ∧ b) to get K₃′ = {a, b, ¬a ∧ ¬b}. So MI(K₃) = {{a ∧ b, ¬a ∧ ¬b}} and MI(K₃′) = {{a, ¬a ∧ ¬b}, {b, ¬a ∧ ¬b}}. Therefore I_Q(K₃) = 1/2 and I_Q(K₃′) = 1/2 + 1/2 = 1, and hence I_Q(K₃′) > I_Q(K₃). Second, let K₄ = {a ∧ b, ¬a ∨ ¬b}, and again apply s(a ∧ b) to get K₄′ = {a, b, ¬a ∨ ¬b}. So MI(K₄) = {{a ∧ b, ¬a ∨ ¬b}} and MI(K₄′) = {{a, b, ¬a ∨ ¬b}}.
Therefore I_Q(K₄) = 1/2 and I_Q(K₄′) = 1/3, and hence I_Q(K₄′) < I_Q(K₄).

(b) The number of atoms in K and K′ are the same, hence J_A(K) = J_A(K′). But there are more formulae in K′, hence J_S(K′) > J_S(K). With more formulae in K′ there may be more free formulae. For example, J_F(K₂) = 0 but J_F(K₂′) = 1 because b ∈ Free(K₂′). So, in general, J_F(K′) ≥ J_F(K). Splitting may also increase the size of a maximal consistent subset, giving J_C(K′) ≥ J_C(K). For example, J_C(K₂) = 3 and J_C(K₂′) = 4. We found earlier that splitting does not change the conflictbase for an interpretation, so it also does not change the binarybase, giving J_B(K′) = J_B(K).

As far as the proxies are concerned, we need to consider the maximal consistent subsets. First, consider a maximal consistent subset of K′ that corresponds to a maximal consistent subset of K, such as {a, b, c, ¬c ∨ ¬d} corresponding to {a ∧ b ∧ c, ¬c ∨ ¬d} for K₂. The proxies are the same. But as we showed for that example, there may be maximal consistent subsets of K′ that are not directly obtainable from a maximal consistent subset of K, such as {a, b, ¬c ∨ ¬d, d}. But, in fact, such a set must be obtainable from a combination, in this case from the combination of {a ∧ b ∧ c, ¬c ∨ ¬d} and {a ∧ b ∧ c, d}. Thus the maximum proxy size cannot increase. This gives J_P(K′) = J_P(K).

Finally, we show why there is no inequality for J_L. In the case of K₂ and K₂′ above, we get J_L(K₂) > J_L(K₂′) because for K₂ there are four models, namely {a, b, c, ¬d}, {a, b, c, d}, {¬a, b, ¬c, d}, and {¬a, ¬b, ¬c, d}, of which the first three are also models for K₂′, but for K₂′ there are three additional models: {a, b, ¬c, d}, {¬a, b, c, ¬d}, and {¬a, b, c, d}. Here, as the numerators are the same and the denominator becomes larger, splitting reduces J_L.
But in the case where K₅ = {a ∧ b, ¬a} and s(a ∧ b) is applied to get K₅′ = {a, b, ¬a}, there are three models for K₅, namely {a, b}, {¬a, b}, and {¬a, ¬b}, of which only the first two are models of K₅′, and there are no others. Hence J_L(K₅) < J_L(K₅′).

This theorem shows that splitting decreases neither inconsistency nor information (except possibly for

I_Q and J_L), and for some measures it increases both. Anyway, as pointed out earlier, splitting must be combined with another operation to eliminate inconsistency. The following example shows what happens when splitting is done first followed by deletion, versus just doing deletion, versus just weakening.

Example 6. Let K = K₂ from the proof of the above theorem, i.e. K = {a ∧ b ∧ c, ¬a, ¬c ∨ ¬d, d}. The only way to make K consistent by deleting a single formula is by applying d(a ∧ b ∧ c), yielding K′ = {¬a, ¬c ∨ ¬d, d}. Then I(K′) = 0 and J(K′) = 3 for all inconsistency and information measures. Next apply F = ⟨s(a ∧ b ∧ c), d(a), d(c)⟩ to obtain K′′ = {b, ¬a, ¬c ∨ ¬d, d}. Again, I(K′′) = 0 for all inconsistency measures, but now J(K′′) = 4 for all information measures. Finally we consider a case of weakening. Apply w(a ∧ b ∧ c, a ∨ b ∨ c) to get K′′′ = {a ∨ b ∨ c, ¬a, ¬c ∨ ¬d, d}. Here also I(K′′′) = 0 and J(K′′′) = 4 for all measures I and J, the same result as for splitting followed by deletion.

5 Managing inconsistency resolution

In general, inconsistency resolution should be guided by the aim of decreasing inconsistency without excessive loss of information. However, there is a trade-off between the amount to which inconsistency is decreased and the amount of information loss that can be accepted. Furthermore, there can be numerous choices over what resolution steps to take at any state of the knowledgebase. We explore some of these issues in the following example.

Example 7. Consider a couple of witnesses who see a suspect escape in a car. Imagine that the police collect the following statements: (Witness 1) It was a red luxury Jeep. It was expensive since luxury cars are expensive. Anyway, Jeeps are off-road cars, and off-road cars are expensive. (Witness 2) It was a red Honda. It was not an expensive car because it was a Honda. The police can assume a constraint that a car cannot be both a Jeep and a Honda. These statements can be represented by the following propositional formulae (in CNF).
luxury ∧ jeep ∧ red
¬luxury ∨ expensive
(¬jeep ∨ offroad) ∧ (¬offroad ∨ expensive)
honda ∧ red
¬honda ∨ ¬expensive
¬jeep ∨ ¬honda

This is K. The minimal inconsistent subsets, MI(K), are

{luxury ∧ jeep ∧ red, (¬jeep ∨ offroad) ∧ (¬offroad ∨ expensive), honda ∧ red, ¬honda ∨ ¬expensive}
{luxury ∧ jeep ∧ red, ¬luxury ∨ expensive, honda ∧ red, ¬honda ∨ ¬expensive}
{luxury ∧ jeep ∧ red, honda ∧ red, ¬jeep ∨ ¬honda}

The maximal consistent subsets, MC(K), are

{luxury ∧ jeep ∧ red, ¬luxury ∨ expensive, (¬jeep ∨ offroad) ∧ (¬offroad ∨ expensive), honda ∧ red}
{luxury ∧ jeep ∧ red, ¬luxury ∨ expensive, (¬jeep ∨ offroad) ∧ (¬offroad ∨ expensive), ¬honda ∨ ¬expensive, ¬jeep ∨ ¬honda}
{¬luxury ∨ expensive, (¬jeep ∨ offroad) ∧ (¬offroad ∨ expensive), honda ∧ red, ¬honda ∨ ¬expensive, ¬jeep ∨ ¬honda}
{luxury ∧ jeep ∧ red, honda ∧ red, ¬honda ∨ ¬expensive}
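The MI(K) listing above can be checked mechanically. The sketch below enumerates minimal inconsistent subsets by exhaustive search over the six witness formulae; the encoding and names are ours, and the connectives follow the reconstruction above (they did not survive extraction).

```python
from itertools import combinations, product

# Illustrative CNF encoding: clause = frozenset of literal strings ("a"/"~a"),
# formula = frozenset of clauses, knowledgebase = set of formulae.

def atoms(kb):
    return sorted({lit.lstrip("~") for f in kb for clause in f for lit in clause})

def satisfiable(kb):
    # Exhaustively try every truth assignment over the atoms of kb.
    ats = atoms(kb)
    for values in product([False, True], repeat=len(ats)):
        world = {a for a, v in zip(ats, values) if v}
        if all(any((lit.lstrip("~") in world) != lit.startswith("~")
                   for lit in clause)
               for f in kb for clause in f):
            return True
    return False

def minimal_inconsistent(kb):
    """All minimal inconsistent subsets MI(K), found smallest-first."""
    mis = []
    for r in range(1, len(kb) + 1):
        for sub in combinations(kb, r):
            if not any(m < set(sub) for m in mis) and not satisfiable(sub):
                mis.append(set(sub))
    return mis

# The witness knowledgebase K, one frozenset per CNF formula.
F1 = frozenset({frozenset({"luxury"}), frozenset({"jeep"}), frozenset({"red"})})
F2 = frozenset({frozenset({"~luxury", "expensive"})})
F3 = frozenset({frozenset({"~jeep", "offroad"}),
                frozenset({"~offroad", "expensive"})})
F4 = frozenset({frozenset({"honda"}), frozenset({"red"})})
F5 = frozenset({frozenset({"~honda", "~expensive"})})
F6 = frozenset({frozenset({"~jeep", "~honda"})})
K = {F1, F2, F3, F4, F5, F6}
mis = minimal_inconsistent(K)
```

The search returns exactly the three minimal inconsistent subsets listed above, which is also the value I_C(K) = 3 reported below.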

For K, we get the following measures:

I_C(K) = 3, I_M(K) = 3, I_P(K) = 6, I_B(K) = 2, I_Q(K) = 5/6
J_A(K) = 6, J_S(K) = 6, J_F(K) = 0, J_C(K) = 5, J_B(K) = 4, J_P(K) = 6, J_L(K) = 3.68

Now suppose the police consider what would be gained and lost if a witness could revise their statements in light of the other witness's statement. For example, if witness 1 withdrew the information that Jeeps are off-road cars and off-road cars are expensive, i.e. using d((¬jeep ∨ offroad) ∧ (¬offroad ∨ expensive)), K₁ = K \ {(¬jeep ∨ offroad) ∧ (¬offroad ∨ expensive)} would result by most measures in some decrease in inconsistency, and except for J_F and J_L, some loss of information (as follows).

I_C(K₁) = 2, I_M(K₁) = 3, I_P(K₁) = 5, I_B(K₁) = 2, I_Q(K₁) = 7/12
J_A(K₁) = 5, J_S(K₁) = 5, J_F(K₁) = 0, J_C(K₁) = 4, J_B(K₁) = 3, J_P(K₁) = 5, J_L(K₁) = 4

Whereas, if witness 2 withdraws the information that Hondas are not expensive, i.e. d(¬honda ∨ ¬expensive), K₂ = K \ {¬honda ∨ ¬expensive} would result in a greater decrease in inconsistency for less information loss (as follows).

I_C(K₂) = 1, I_M(K₂) = 2, I_P(K₂) = 3, I_B(K₂) = 1, I_Q(K₂) = 1/3
J_A(K₂) = 6, J_S(K₂) = 5, J_F(K₂) = 2, J_C(K₂) = 4, J_B(K₂) = 5, J_P(K₂) = 6, J_L(K₂) = 4

So if only deletion is considered, we see that the second option is better than the first since the degree of inconsistency resolution is greater, and the degree of information loss is less (except for J_S and J_C, where it is the same), in the second case. The police may be reluctant for the witnesses to withdraw factual statements like "It was a red luxury Jeep" or "It was a red Honda" since these may be more valuable than the knowledge of the witnesses as manifested by statements such as "Hondas are not expensive". However, the police may consider whether relaxation of the facts may be useful in decreasing inconsistency.
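The comparison between the two candidate deletions can be replayed for I_C, taking I_C(K) = |MI(K)|: deleting the witness-1 rule removes one minimal inconsistent subset, while deleting the witness-2 rule removes two. A brute-force sketch, using our illustrative encoding and the reconstructed connectives (not the paper's own code):

```python
from itertools import combinations, product

# Illustrative CNF encoding: clause = frozenset of literal strings ("a"/"~a"),
# formula = frozenset of clauses, knowledgebase = set of formulae.

def atoms(kb):
    return sorted({lit.lstrip("~") for f in kb for clause in f for lit in clause})

def satisfiable(kb):
    ats = atoms(kb)
    for values in product([False, True], repeat=len(ats)):
        world = {a for a, v in zip(ats, values) if v}
        if all(any((lit.lstrip("~") in world) != lit.startswith("~")
                   for lit in clause)
               for f in kb for clause in f):
            return True
    return False

def minimal_inconsistent(kb):
    mis = []
    for r in range(1, len(kb) + 1):
        for sub in combinations(kb, r):
            if not any(m < set(sub) for m in mis) and not satisfiable(sub):
                mis.append(set(sub))
    return mis

def i_c(kb):
    # I_C(K) = number of minimal inconsistent subsets of K
    return len(minimal_inconsistent(kb))

F1 = frozenset({frozenset({"luxury"}), frozenset({"jeep"}), frozenset({"red"})})
F2 = frozenset({frozenset({"~luxury", "expensive"})})
F3 = frozenset({frozenset({"~jeep", "offroad"}),
                frozenset({"~offroad", "expensive"})})
F4 = frozenset({frozenset({"honda"}), frozenset({"red"})})
F5 = frozenset({frozenset({"~honda", "~expensive"})})
F6 = frozenset({frozenset({"~jeep", "~honda"})})
K = {F1, F2, F3, F4, F5, F6}

gain_witness1 = i_c(K) - i_c(K - {F3})   # R_IC(K, K1) for d(F3)
gain_witness2 = i_c(K) - i_c(K - {F5})   # R_IC(K, K2) for d(F5)
```

This reproduces the I_C rows above: I_C(K) = 3, with consistency gains of 1 for K₁ and 2 for K₂, so by this measure the second deletion is the better single step.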
So suppose the function sequence F = ⟨s(luxury ∧ jeep ∧ red), d(luxury)⟩ is applied. The result is

K₃ = (K \ {luxury ∧ jeep ∧ red}) ∪ {jeep, red}

I_C(K₃) = 2, I_M(K₃) = 3, I_P(K₃) = 5, I_B(K₃) = 2, I_Q(K₃) = 7/12
J_A(K₃) = 6, J_S(K₃) = 7, J_F(K₃) = 2, J_C(K₃) = 6, J_B(K₃) = 4, J_P(K₃) = 6, J_L(K₃) = 2.83

After further meetings with the witnesses, the police determine that witness 1 is not so confident the car was a luxury car, and witness 2 is not committed to the knowledge that Hondas are not expensive. On this basis, using the function sequence F = ⟨s(luxury ∧ jeep ∧ red), d(luxury), d(¬honda ∨ ¬expensive)⟩, the police get the revised knowledgebase

K₄ = (K \ {luxury ∧ jeep ∧ red, ¬honda ∨ ¬expensive}) ∪ {jeep, red}

This results in the following, by which we can see that the reduction in inconsistency has not been at the price of excessive information loss.

I_C(K₄) = 1, I_M(K₄) = 2, I_P(K₄) = 3, I_B(K₄) = 1, I_Q(K₄) = 1/3
J_A(K₄) = 6, J_S(K₄) = 6, J_F(K₄) = 3, J_C(K₄) = 5, J_B(K₄) = 5, J_P(K₄) = 5, J_L(K₄) = 2.4

Out of the options considered, K₂ and K₄ reduce the inconsistency the most. K₄ is slightly better for four information measures, while K₂ is better for one information measure. Furthermore, unless either witness wishes to withdraw their claim that it was a Jeep or a Honda, there will remain some inconsistency in the knowledge. In conclusion, we see that by an analysis of some of the options for revising the information from the witnesses, the police can direct their queries to the witnesses.
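The stepwise sequence behind K₄ can be traced mechanically, recording I_C after each resolution step: splitting alone leaves I_C unchanged, and the two deletions then lower it. A brute-force sketch under the same illustrative encoding and reconstructed connectives as before (not the paper's own code):

```python
from itertools import combinations, product

# Illustrative CNF encoding: clause = frozenset of literal strings ("a"/"~a"),
# formula = frozenset of clauses, knowledgebase = set of formulae.

def atoms(kb):
    return sorted({lit.lstrip("~") for f in kb for clause in f for lit in clause})

def satisfiable(kb):
    ats = atoms(kb)
    for values in product([False, True], repeat=len(ats)):
        world = {a for a, v in zip(ats, values) if v}
        if all(any((lit.lstrip("~") in world) != lit.startswith("~")
                   for lit in clause)
               for f in kb for clause in f):
            return True
    return False

def minimal_inconsistent(kb):
    mis = []
    for r in range(1, len(kb) + 1):
        for sub in combinations(kb, r):
            if not any(m < set(sub) for m in mis) and not satisfiable(sub):
                mis.append(set(sub))
    return mis

def i_c(kb):
    return len(minimal_inconsistent(kb))

def split(kb, phi):
    # s(phi): replace phi by its conjuncts, one formula per clause
    return (kb - {phi}) | {frozenset([clause]) for clause in phi}

F1 = frozenset({frozenset({"luxury"}), frozenset({"jeep"}), frozenset({"red"})})
F2 = frozenset({frozenset({"~luxury", "expensive"})})
F3 = frozenset({frozenset({"~jeep", "offroad"}),
                frozenset({"~offroad", "expensive"})})
F4 = frozenset({frozenset({"honda"}), frozenset({"red"})})
F5 = frozenset({frozenset({"~honda", "~expensive"})})
F6 = frozenset({frozenset({"~jeep", "~honda"})})
K = {F1, F2, F3, F4, F5, F6}
LUX = frozenset({frozenset({"luxury"})})

step1 = split(K, F1)       # s(luxury & jeep & red)
step2 = step1 - {LUX}      # d(luxury): yields K3
step3 = step2 - {F5}       # d(~honda | ~expensive): yields K4
history = [i_c(s) for s in (K, step1, step2, step3)]
```

The recorded history is [3, 3, 2, 1]: splitting by itself does not reduce I_C, the deletion of luxury brings it to I_C(K₃) = 2, and the final deletion to I_C(K₄) = 1, matching the measures reported in Example 7.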


More information

Foundations of Artificial Intelligence

Foundations of Artificial Intelligence Foundations of Artificial Intelligence 7. Propositional Logic Rational Thinking, Logic, Resolution Joschka Boedecker and Wolfram Burgard and Frank Hutter and Bernhard Nebel Albert-Ludwigs-Universität Freiburg

More information

Chapter 2. Mathematical Reasoning. 2.1 Mathematical Models

Chapter 2. Mathematical Reasoning. 2.1 Mathematical Models Contents Mathematical Reasoning 3.1 Mathematical Models........................... 3. Mathematical Proof............................ 4..1 Structure of Proofs........................ 4.. Direct Method..........................

More information

PL: Truth Trees. Handout Truth Trees: The Setup

PL: Truth Trees. Handout Truth Trees: The Setup Handout 4 PL: Truth Trees Truth tables provide a mechanical method for determining whether a proposition, set of propositions, or argument has a particular logical property. For example, we can show that

More information

A Little Deductive Logic

A Little Deductive Logic A Little Deductive Logic In propositional or sentential deductive logic, we begin by specifying that we will use capital letters (like A, B, C, D, and so on) to stand in for sentences, and we assume that

More information

Pairing Transitive Closure and Reduction to Efficiently Reason about Partially Ordered Events

Pairing Transitive Closure and Reduction to Efficiently Reason about Partially Ordered Events Pairing Transitive Closure and Reduction to Efficiently Reason about Partially Ordered Events Massimo Franceschet Angelo Montanari Dipartimento di Matematica e Informatica, Università di Udine Via delle

More information

ECE473 Lecture 15: Propositional Logic

ECE473 Lecture 15: Propositional Logic ECE473 Lecture 15: Propositional Logic Jeffrey Mark Siskind School of Electrical and Computer Engineering Spring 2018 Siskind (Purdue ECE) ECE473 Lecture 15: Propositional Logic Spring 2018 1 / 23 What

More information

Introduction to Metalogic 1

Introduction to Metalogic 1 Philosophy 135 Spring 2012 Tony Martin Introduction to Metalogic 1 1 The semantics of sentential logic. The language L of sentential logic. Symbols of L: (i) sentence letters p 0, p 1, p 2,... (ii) connectives,

More information

5 Set Operations, Functions, and Counting

5 Set Operations, Functions, and Counting 5 Set Operations, Functions, and Counting Let N denote the positive integers, N 0 := N {0} be the non-negative integers and Z = N 0 ( N) the positive and negative integers including 0, Q the rational numbers,

More information

Propositional and Predicate Logic - V

Propositional and Predicate Logic - V Propositional and Predicate Logic - V Petr Gregor KTIML MFF UK WS 2016/2017 Petr Gregor (KTIML MFF UK) Propositional and Predicate Logic - V WS 2016/2017 1 / 21 Formal proof systems Hilbert s calculus

More information

Tecniche di Verifica. Introduction to Propositional Logic

Tecniche di Verifica. Introduction to Propositional Logic Tecniche di Verifica Introduction to Propositional Logic 1 Logic A formal logic is defined by its syntax and semantics. Syntax An alphabet is a set of symbols. A finite sequence of these symbols is called

More information

Encoding Deductive Argumentation in Quantified Boolean Formulae

Encoding Deductive Argumentation in Quantified Boolean Formulae Encoding Deductive Argumentation in Quantified Boolean Formulae Philippe Besnard IRIT-CNRS, Universitè Paul Sabatier, 118 rte de Narbonne, 31062 Toulouse, France Anthony Hunter Department of Computer Science,

More information

Spanning and Independence Properties of Finite Frames

Spanning and Independence Properties of Finite Frames Chapter 1 Spanning and Independence Properties of Finite Frames Peter G. Casazza and Darrin Speegle Abstract The fundamental notion of frame theory is redundancy. It is this property which makes frames

More information

Přednáška 12. Důkazové kalkuly Kalkul Hilbertova typu. 11/29/2006 Hilbertův kalkul 1

Přednáška 12. Důkazové kalkuly Kalkul Hilbertova typu. 11/29/2006 Hilbertův kalkul 1 Přednáška 12 Důkazové kalkuly Kalkul Hilbertova typu 11/29/2006 Hilbertův kalkul 1 Formal systems, Proof calculi A proof calculus (of a theory) is given by: A. a language B. a set of axioms C. a set of

More information

CS156: The Calculus of Computation

CS156: The Calculus of Computation CS156: The Calculus of Computation Zohar Manna Winter 2010 It is reasonable to hope that the relationship between computation and mathematical logic will be as fruitful in the next century as that between

More information

Characterization of Semantics for Argument Systems

Characterization of Semantics for Argument Systems Characterization of Semantics for Argument Systems Philippe Besnard and Sylvie Doutre IRIT Université Paul Sabatier 118, route de Narbonne 31062 Toulouse Cedex 4 France besnard, doutre}@irit.fr Abstract

More information

CONTENTS. Appendix C: Gothic Alphabet 109

CONTENTS. Appendix C: Gothic Alphabet 109 Contents 1 Sentential Logic 1 1.1 Introduction............................ 1 1.2 Sentences of Sentential Logic................... 2 1.3 Truth Assignments........................ 7 1.4 Logical Consequence.......................

More information

Conciliation through Iterated Belief Merging

Conciliation through Iterated Belief Merging Conciliation through Iterated Belief Merging Olivier Gauwin Sébastien Konieczny Pierre Marquis CRIL CNRS Université d Artois, Lens, France {gauwin, konieczny, marquis}@cril.fr Abstract Two families of

More information

Theoretical Computer Science. The loop formula based semantics of description logic programs

Theoretical Computer Science. The loop formula based semantics of description logic programs Theoretical Computer Science 415 (2012) 60 85 Contents lists available at SciVerse ScienceDirect Theoretical Computer Science journal homepage: www.elsevier.com/locate/tcs The loop formula based semantics

More information

7. Propositional Logic. Wolfram Burgard and Bernhard Nebel

7. Propositional Logic. Wolfram Burgard and Bernhard Nebel Foundations of AI 7. Propositional Logic Rational Thinking, Logic, Resolution Wolfram Burgard and Bernhard Nebel Contents Agents that think rationally The wumpus world Propositional logic: syntax and semantics

More information

Introduction to Structured Argumentation

Introduction to Structured Argumentation Introduction to Structured Argumentation Anthony Hunter Department of Computer Science, University College London, UK April 15, 2016 1 / 42 Computational models of argument Abstract argumentation Structured

More information

KRIPKE S THEORY OF TRUTH 1. INTRODUCTION

KRIPKE S THEORY OF TRUTH 1. INTRODUCTION KRIPKE S THEORY OF TRUTH RICHARD G HECK, JR 1. INTRODUCTION The purpose of this note is to give a simple, easily accessible proof of the existence of the minimal fixed point, and of various maximal fixed

More information

Introduction to Intelligent Systems

Introduction to Intelligent Systems Logical Agents Objectives Inference and entailment Sound and complete inference algorithms Inference by model checking Inference by proof Resolution Forward and backward chaining Reference Russel/Norvig:

More information

Contribution of Problems

Contribution of Problems Exam topics 1. Basic structures: sets, lists, functions (a) Sets { }: write all elements, or define by condition (b) Set operations: A B, A B, A\B, A c (c) Lists ( ): Cartesian product A B (d) Functions

More information

Description Logics. Deduction in Propositional Logic. franconi. Enrico Franconi

Description Logics. Deduction in Propositional Logic.   franconi. Enrico Franconi (1/20) Description Logics Deduction in Propositional Logic Enrico Franconi franconi@cs.man.ac.uk http://www.cs.man.ac.uk/ franconi Department of Computer Science, University of Manchester (2/20) Decision

More information

CHAPTER 2 INTRODUCTION TO CLASSICAL PROPOSITIONAL LOGIC

CHAPTER 2 INTRODUCTION TO CLASSICAL PROPOSITIONAL LOGIC CHAPTER 2 INTRODUCTION TO CLASSICAL PROPOSITIONAL LOGIC 1 Motivation and History The origins of the classical propositional logic, classical propositional calculus, as it was, and still often is called,

More information

A Weak Post s Theorem and the Deduction Theorem Retold

A Weak Post s Theorem and the Deduction Theorem Retold Chapter I A Weak Post s Theorem and the Deduction Theorem Retold This note retells (1) A weak form of Post s theorem: If Γ is finite and Γ = taut A, then Γ A and derives as a corollary the Deduction Theorem:

More information

Symbolic Logic 3. For an inference to be deductively valid it is impossible for the conclusion to be false if the premises are true.

Symbolic Logic 3. For an inference to be deductively valid it is impossible for the conclusion to be false if the premises are true. Symbolic Logic 3 Testing deductive validity with truth tables For an inference to be deductively valid it is impossible for the conclusion to be false if the premises are true. So, given that truth tables

More information

A An Overview of Complexity Theory for the Algorithm Designer

A An Overview of Complexity Theory for the Algorithm Designer A An Overview of Complexity Theory for the Algorithm Designer A.1 Certificates and the class NP A decision problem is one whose answer is either yes or no. Two examples are: SAT: Given a Boolean formula

More information

Tree sets. Reinhard Diestel

Tree sets. Reinhard Diestel 1 Tree sets Reinhard Diestel Abstract We study an abstract notion of tree structure which generalizes treedecompositions of graphs and matroids. Unlike tree-decompositions, which are too closely linked

More information

Introducing Proof 1. hsn.uk.net. Contents

Introducing Proof 1. hsn.uk.net. Contents Contents 1 1 Introduction 1 What is proof? 1 Statements, Definitions and Euler Diagrams 1 Statements 1 Definitions Our first proof Euler diagrams 4 3 Logical Connectives 5 Negation 6 Conjunction 7 Disjunction

More information

CS364B: Frontiers in Mechanism Design Lecture #3: The Crawford-Knoer Auction

CS364B: Frontiers in Mechanism Design Lecture #3: The Crawford-Knoer Auction CS364B: Frontiers in Mechanism Design Lecture #3: The Crawford-Knoer Auction Tim Roughgarden January 15, 2014 1 The Story So Far Our current theme is the design of ex post incentive compatible (EPIC) ascending

More information

Stochastic Histories. Chapter Introduction

Stochastic Histories. Chapter Introduction Chapter 8 Stochastic Histories 8.1 Introduction Despite the fact that classical mechanics employs deterministic dynamical laws, random dynamical processes often arise in classical physics, as well as in

More information

KB Agents and Propositional Logic

KB Agents and Propositional Logic Plan Knowledge-Based Agents Logics Propositional Logic KB Agents and Propositional Logic Announcements Assignment2 mailed out last week. Questions? Knowledge-Based Agents So far, what we ve done is look

More information

Intelligent Agents. First Order Logic. Ute Schmid. Cognitive Systems, Applied Computer Science, Bamberg University. last change: 19.

Intelligent Agents. First Order Logic. Ute Schmid. Cognitive Systems, Applied Computer Science, Bamberg University. last change: 19. Intelligent Agents First Order Logic Ute Schmid Cognitive Systems, Applied Computer Science, Bamberg University last change: 19. Mai 2015 U. Schmid (CogSys) Intelligent Agents last change: 19. Mai 2015

More information

Comp487/587 - Boolean Formulas

Comp487/587 - Boolean Formulas Comp487/587 - Boolean Formulas 1 Logic and SAT 1.1 What is a Boolean Formula Logic is a way through which we can analyze and reason about simple or complicated events. In particular, we are interested

More information

6. Logical Inference

6. Logical Inference Artificial Intelligence 6. Logical Inference Prof. Bojana Dalbelo Bašić Assoc. Prof. Jan Šnajder University of Zagreb Faculty of Electrical Engineering and Computing Academic Year 2016/2017 Creative Commons

More information

Propositional logic. Programming and Modal Logic

Propositional logic. Programming and Modal Logic Propositional logic Programming and Modal Logic 2006-2007 4 Contents Syntax of propositional logic Semantics of propositional logic Semantic entailment Natural deduction proof system Soundness and completeness

More information

1.1 Statements and Compound Statements

1.1 Statements and Compound Statements Chapter 1 Propositional Logic 1.1 Statements and Compound Statements A statement or proposition is an assertion which is either true or false, though you may not know which. That is, a statement is something

More information

CHAPTER 10. Gentzen Style Proof Systems for Classical Logic

CHAPTER 10. Gentzen Style Proof Systems for Classical Logic CHAPTER 10 Gentzen Style Proof Systems for Classical Logic Hilbert style systems are easy to define and admit a simple proof of the Completeness Theorem but they are difficult to use. By humans, not mentioning

More information

Preference, Choice and Utility

Preference, Choice and Utility Preference, Choice and Utility Eric Pacuit January 2, 205 Relations Suppose that X is a non-empty set. The set X X is the cross-product of X with itself. That is, it is the set of all pairs of elements

More information

Measuring Inconsistency in Knowledge via Quasi-classical Models

Measuring Inconsistency in Knowledge via Quasi-classical Models From: AAAI-02 Proceedings. Copyright 2002, AAAI (www.aaai.org). All rights reserved. Measuring Inconsistency in Knowledge via Quasi-classical Models Anthony Hunter Department of Computer Science University

More information

NP-Completeness and Boolean Satisfiability

NP-Completeness and Boolean Satisfiability NP-Completeness and Boolean Satisfiability Mridul Aanjaneya Stanford University August 14, 2012 Mridul Aanjaneya Automata Theory 1/ 49 Time-Bounded Turing Machines A Turing Machine that, given an input

More information

Critical Reading of Optimization Methods for Logical Inference [1]

Critical Reading of Optimization Methods for Logical Inference [1] Critical Reading of Optimization Methods for Logical Inference [1] Undergraduate Research Internship Department of Management Sciences Winter 2008 Supervisor: Dr. Miguel Anjos UNIVERSITY OF WATERLOO Rajesh

More information

Logic for Computer Science - Week 4 Natural Deduction

Logic for Computer Science - Week 4 Natural Deduction Logic for Computer Science - Week 4 Natural Deduction 1 Introduction In the previous lecture we have discussed some important notions about the semantics of propositional logic. 1. the truth value of a

More information

A Unifying Semantics for Belief Change

A Unifying Semantics for Belief Change A Unifying Semantics for Belief Change C0300 Abstract. Many belief change formalisms employ plausibility orderings over the set of possible worlds to determine how the beliefs of an agent ought to be modified

More information

Natural Deduction for Propositional Logic

Natural Deduction for Propositional Logic Natural Deduction for Propositional Logic Bow-Yaw Wang Institute of Information Science Academia Sinica, Taiwan September 10, 2018 Bow-Yaw Wang (Academia Sinica) Natural Deduction for Propositional Logic

More information

Formal (natural) deduction in propositional logic

Formal (natural) deduction in propositional logic Formal (natural) deduction in propositional logic Lila Kari University of Waterloo Formal (natural) deduction in propositional logic CS245, Logic and Computation 1 / 67 I know what you re thinking about,

More information

On the Complexity of Input/Output Logic

On the Complexity of Input/Output Logic On the Complexity of Input/Output Logic Xin Sun 1 and Diego Agustín Ambrossio 12 1 Faculty of Science, Technology and Communication, University of Luxembourg, Luxembourg xin.sun@uni.lu 2 Interdisciplinary

More information

Supplementary exercises in propositional logic

Supplementary exercises in propositional logic Supplementary exercises in propositional logic The purpose of these exercises is to train your ability to manipulate and analyze logical formulas. Familiarize yourself with chapter 7.3-7.5 in the course

More information

COMP9414: Artificial Intelligence Propositional Logic: Automated Reasoning

COMP9414: Artificial Intelligence Propositional Logic: Automated Reasoning COMP9414, Monday 26 March, 2012 Propositional Logic 2 COMP9414: Artificial Intelligence Propositional Logic: Automated Reasoning Overview Proof systems (including soundness and completeness) Normal Forms

More information