Unifying Version Space Representations: Part II


E.N. Smirnov, I.G. Sprinkhuizen-Kuyper, and H.J. van den Herik
IKAT, Department of Computer Science, Maastricht University, P.O. Box 616, 6200 MD Maastricht, The Netherlands

Abstract

In this paper we continue our research on the unification of version-space representations. We consider the open question how to unify one-sided version-space representations. To answer the question we introduce a new family of version-space representations that can be adapted during the learning process. The family consists of two dual representations: adaptable one-sided maximal boundary sets (AOMBS) and adaptable one-sided minimal boundary sets (AOmBS). Without loss of generality the article covers in detail only the first representation. AOMBS are defined by the positive training set and the set of maximal boundary sets indexed by the elements of a particular covering of the negative training set. We show that a version space has a space of AOMBS representations, each having its own applicability requirements. This space includes the basic one-sided version-space representations: the one-sided maximal boundary sets (Hirsh, 1992) and the one-sided instance-based maximal boundary sets (Smirnov et al., 2002). So, we may conclude that AOMBS unify these representations. The task of learning AOMBS is viewed as a task of identifying a proper representation within the space of AOMBS representations depending on the applicability requirements given. This is demonstrated in an example where AOMBS are used to overcome the complexity problem of the one-sided maximal boundary sets.

1 Introduction

Version spaces are nowadays an established approach to the concept-learning task (Mitchell, 1997). They are defined as sets of descriptions in concept languages that are consistent with the training data. Version-space learning is an incremental process (Mitchell, 1997): if a new training instance is given, the version space is revised so that it consists of all the concept descriptions consistent with the processed training data plus the instance.

To learn version spaces they have to be represented. For that purpose several version-space representations were proposed in (Hirsh, 1992; Hirsh et al., 1997; Idestam-Almquist, 1990; Sablon et al., 1994; Sebag and Rouveirol, 2000; Smirnov, 2001; Smirnov et al., 2002; Smith and Rosenbloom, 1990). Besides their diverse nature, the representations differ in terms of applicability. Although this is a sign of the maturity of the field of version spaces, deep knowledge of the representations is required when they are chosen and applied. To facilitate the potential user we proposed in (Smirnov et al., 2004) to unify version-space representations into representations with broader scopes of applicability. The first two representations unified were the boundary sets (Mitchell, 1997) and the instance-based boundary sets (Smirnov, 2001). The result of the unification was a version-space representation called adaptable boundary sets (Smirnov et al., 2004). We showed that it has a broader scope of applicability than both the boundary sets and the instance-based boundary sets.

In this paper we address the open question how to unify the families of one-sided version-space representations. We focus on the unification of the family of one-sided boundary sets (Hirsh, 1992) and the family of one-sided instance-based boundary sets (Smirnov et al., 2002).
The motivation for this research question is twofold: (1) the one-sided boundary sets have a broader scope of applicability than the boundary sets (Hirsh, 1992); and (2) the one-sided instance-based boundary sets have a broader scope of applicability than the instance-based boundary sets (Smirnov et al., 2002). The result of the unification is a new family of unifying version-space representations called adaptable one-sided boundary sets. In this paper we show that the adaptable one-sided boundary sets are the most applicable version-space representations.

The family of adaptable one-sided boundary sets consists of two dual representations: adaptable one-sided maximal boundary sets (AOMBS) and adaptable one-sided minimal boundary sets (AOmBS). We cover in detail only the AOMBS representation.

The paper starts in section 2 with the necessary formalisation. The AOMBS are introduced in section 3. We show that a version space can have a space of AOMBS, each having its own applicability requirements. We prove that this space includes the one-sided maximal boundary sets (Hirsh, 1992) and the one-sided instance-based maximal boundary sets (Smirnov et al., 2002). Thus, the AOMBS unify these two representations. The task of learning AOMBS is considered as identifying a proper representation in the space of AOMBS representations depending on the applicability requirements given. For that purpose a general learning algorithm, two merging algorithms, and other useful algorithms for the AOMBS are given in sections 4, 5, and 6. Section 7 provides an example how AOMBS overcome the complexity problem of the one-sided maximal boundary sets. The dual representation of the AOMBS, the adaptable one-sided minimal boundary sets (AOmBS), is given in section 8. In section 9 conclusions are given.

2 Formalisation

Let $I$ be the universe of all the instances. Concepts $C$ are defined as subsets of $I$. They are represented in a concept language $L_c$. The language $L_c$ is a set of descriptions $c$, each representing exactly one concept. Instances are related to concept descriptions by a cover relation $M$. The relation $M(c, i)$ holds for $c \in L_c$, $i \in I$ iff the instance $i$ is a member of the concept given by $c$. A description $c \in L_c$ is said to cover an instance $i \in I$ iff the relation $M(c, i)$ holds.

As a rule any target concept $C$ is incompletely defined by training sets $I^+ \subseteq I$ and $I^- \subseteq I$ of positive and negative instances such that $I^+ \subseteq C$ and $I^- \cap C = \emptyset$. The concept-learning task in this case is to find descriptions of $C$ in $L_c$. To find the descriptions of a target concept, we specify them by the consistency criterion: a description $c$ is consistent iff $c$ correctly classifies the training data. Mitchell (1997) defined the set of all consistent descriptions w.r.t. the training sets $I^+$ and $I^-$ of a target concept as the version space $VS(I^+, I^-) = \{c \in L_c \mid (\forall i \in I^+)\,M(c, i) \wedge (\forall i \in I^-)\,\neg M(c, i)\}$.

To learn version spaces, they have to be compactly represented. This is usually possible if concept languages are ordered by a relation more general ($\geq$). The relation $c_1 \geq c_2$ holds for descriptions $c_1, c_2 \in L_c$ iff $(\forall i \in I)(M(c_2, i) \rightarrow M(c_1, i))$. A concept language $L_c$ with the relation $\geq$ is partially ordered. In our study we are interested in sets $C \subseteq L_c$ that have a maximal set $MAX(C) = \{c \in C \mid (\forall c' \in C)((c' \geq c) \rightarrow (c' = c))\}$ and a minimal set $MIN(C) = \{c \in C \mid (\forall c' \in C)((c \geq c') \rightarrow (c' = c))\}$. The maximal and minimal sets of version spaces are known as maximal and minimal boundary sets (Mitchell, 1997). Concept languages for which the boundary sets can be defined for each version space are called admissible. More formally, a concept language $L_c$ is admissible iff each subset $C \subseteq L_c$ is bounded; i.e., for each $c \in C$ there exist $g \in MAX(C)$ and $s \in MIN(C)$ s.t. $g \geq c$ and $c \geq s$.
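
As a concrete illustration of this formalisation, here is a minimal Python sketch, assuming a small conjunctive concept language in the style of the 1-CNF Boolean language used in section 7; the encoding and function names are ours, and the version space is enumerated by brute force, which is feasible only for toy languages.

```python
from itertools import product

# Toy instantiation: instances are Boolean 4-tuples; descriptions are
# tuples over {0, 1, '?'}, where '?' leaves an attribute unconstrained.
N = 4

def covers(c, i):
    # the cover relation M(c, i): every constrained attribute of c agrees with i
    return all(a == '?' or a == b for a, b in zip(c, i))

def consistent(c, pos, neg):
    # the consistency criterion: c covers all of I+ and nothing in I-
    return (all(covers(c, p) for p in pos)
            and not any(covers(c, n) for n in neg))

def version_space(pos, neg):
    # VS(I+, I-): all consistent descriptions, by brute-force enumeration of Lc
    return [c for c in product((0, 1, '?'), repeat=N)
            if consistent(c, pos, neg)]

vs = version_space(pos=[(1, 1, 1, 1)], neg=[(0, 0, 1, 1)])
print(len(vs), vs[:3])  # every member covers the positive, rejects the negative
```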
3 Adaptable One-Sided Maximal Boundary Sets

Adaptable one-sided maximal boundary sets (AOMBS) are a version-space representation that consists of (1) the training set $I^+$ of positive instances and (2) the indexed set of maximal boundary sets based on the elements of a particular covering of the training set $I^-$ of negative instances. (A covering of a nonempty set is a set consisting of nonempty subsets of that set such that the union of those subsets is the whole set.)

AOMBS are formally defined in definition 1. The definition uses the notion of the set of all the coverings of the set $I^-$, denoted as $SP(I^-)$.

Definition 1 (AOMBS) Consider an admissible concept language $L_c$ and a nonempty training set $I^- \subseteq I$. Then AOMBS of the version space $VS(I^+, I^-)$ are an ordered pair $\langle I^+, \{G(I^+, I_n^-)\}_{I_n^- \in P} \rangle$, where $P \in SP(I^-)$ and $G(I^+, I_n^-) = MAX(VS(I^+, I_n^-))$.

Theorem 1 (Correctness of AOMBS) Let $VS(I^+, I^-)$ be a version space with AOMBS $\langle I^+, \{G(I^+, I_n^-)\}_{I_n^- \in P} \rangle$. If the concept language $L_c$ is admissible, then a concept description $c \in L_c$ is a member of the version space $VS(I^+, I^-)$ iff $(\forall p \in I^+)\,M(c, p)$ and $(\forall I_n^- \in P)(\exists g \in G(I^+, I_n^-))(g \geq c)$.

By theorem 1, given the AOMBS of a version space $VS(I^+, I^-)$, the descriptions in $VS(I^+, I^-)$ are those that are (1) consistent with the set $I^+$, and (2) more specific than an element of each set $G(I^+, I_n^-)$. Thus, the AOMBS are a compact version-space representation.

A version space has a space of AOMBS representations. Below this space is formally defined.

Definition 2 (Space of AOMBS representations) Given $I^-$, the space $R(I^+, I^-)$ of AOMBS representations of the version space $VS(I^+, I^-)$ is defined as $\{\langle I^+, \{G(I^+, I_n^-)\}_{I_n^- \in P} \rangle \mid P \in SP(I^-)\}$.

The space $R(I^+, I^-)$ is partially ordered by a relation more homogeneous: AOMBS$_1$ $\langle I^+, \{G(I^+, I_n^-)\}_{I_n^- \in P_1} \rangle$ are more homogeneous than AOMBS$_2$ $\langle I^+, \{G(I^+, I_n^-)\}_{I_n^- \in P_2} \rangle$ iff each set $I_n^- \in P_1$ is a union of one or more sets $I_n^- \in P_2$. The least element of $R(I^+, I^-)$ is the representation $\langle I^+, \{G(I^+, \{i\})\}_{i \in I^-} \rangle$ defined for $P = \bigcup_{i \in I^-} \{\{i\}\}$. According to (Smirnov et al., 2002) this AOMBS representation equals the instance-based maximal boundary sets. The greatest element of $R(I^+, I^-)$ is the representation $\langle I^+, \{G(I^+, I^-)\} \rangle$ defined for $P = \{I^-\}$. According to (Hirsh, 1992) this AOMBS representation equals the one-sided maximal boundary sets. Since the space of AOMBS representations includes the instance-based maximal boundary sets and the one-sided maximal boundary sets, the AOMBS unify these two representations.

More homogeneous AOMBS are not always more comprehensible. For example, AOMBS whose covering $P$ is the power set of the set $I^-$ are more homogeneous than the AOMBS equal to the instance-based maximal boundary sets, but they are not more comprehensible. Thus, we say that AOMBS$_1$ are more comprehensible than AOMBS$_2$ if the covering $P_1$ contains fewer elements than the covering $P_2$.

The AOMBS in the space $R(I^+, I^-)$ also differ in terms of the conditions under which they are finite, (efficiently) computable, and allow (efficiently) implementing the basic version-space operations (Hirsh, 1992; Hirsh et al., 1997; Smirnov, 2001). For example, we proved in (Smirnov et al., 2002) that the conditions for finiteness and (efficient) computation of the one-sided maximal boundary sets are more restrictive than those for finiteness and efficient computation of the instance-based maximal boundary sets. In addition, we showed that both representations can be used for implementing most version-space operations. Nevertheless, there are operations that can be implemented only with one-sided maximal boundary sets or only with instance-based maximal boundary sets.

From the above we conclude that learning AOMBS requires setting the applicability requirements in advance. The requirements can be the levels of homogeneity and comprehensibility, and the conditions under which the representations are finite, (efficiently) computable, and allow (efficiently) implementing the version-space operations. Once the requirements are set, the task of learning AOMBS is to identify a proper representation in the space of AOMBS representations. In the next two sections we present learning and merging algorithms designed for this purpose.
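
To make definitions 1 and 2 and theorem 1 concrete, here is a minimal sketch in the same toy conjunctive language as before (four attributes, so $MAX(VS(I^+, I_n^-))$ can be found by brute force); the list-of-blocks layout and all function names are ours, not a prescribed data structure.

```python
from itertools import product

N = 4  # Boolean attributes; descriptions are tuples over {0, 1, '?'}

def covers(c, i):
    # the cover relation M(c, i) for conjunctive descriptions
    return all(a == '?' or a == b for a, b in zip(c, i))

def more_general(g, c):
    # g >= c: g places no constraint that c does not also satisfy
    return all(a == '?' or a == b for a, b in zip(g, c))

def max_set(C):
    # MAX(C): elements of C with no strictly more general element in C
    return [c for c in C if not any(d != c and more_general(d, c) for d in C)]

def g_set(pos, neg_block):
    # G(I+, In-) = MAX(VS(I+, In-)), by brute-force enumeration of Lc
    vs = [c for c in product((0, 1, '?'), repeat=N)
          if all(covers(c, p) for p in pos)
          and not any(covers(c, n) for n in neg_block)]
    return max_set(vs)

def member(c, pos, G_sets):
    # theorem 1: c is in VS(I+, I-) iff it covers every positive instance
    # and is more specific than some element of each set G(I+, In-)
    return (all(covers(c, p) for p in pos)
            and all(any(more_general(g, c) for g in G) for G in G_sets))

pos = [(1, 1, 1, 1)]
P = [[(0, 0, 1, 1)], [(1, 1, 0, 0)]]          # a covering of I- with two blocks
G_sets = [g_set(pos, block) for block in P]   # the indexed maximal boundary sets
print(member((1, '?', 1, '?'), pos, G_sets))  # True: consistent with I+ and I-
```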
4 General Learning Algorithm

The general learning algorithm updates the AOMBS of a version space w.r.t. a training instance. It is given for admissible concept languages and has two parts.

The first part of the algorithm is applied only if the new training instance $i$ is negative. The instance is added to the training set $I^-$. The covering $P'$ of the updated set $I^{-\prime} = I^- \cup \{i\}$ is formed from the covering $P$ of the set $I^-$ s.t. $i$ is added to some elements of $P$ and/or the set $\{i\}$ is added to $P'$ as a new element.

The covering $P'$ is called an extension of the covering $P$. Formally, this relation is defined for $P'$ and $P$ iff for each $I_n^{-\prime} \in P'$: $(\exists I_n^- \in P)(I_n^{-\prime} = I_n^-) \vee (\exists I_n^- \in P)(I_n^{-\prime} = I_n^- \cup \{i\}) \vee (I_n^{-\prime} = \{i\})$. We note that the first part of the algorithm is not completely specified. It is determined depending on the applicability requirements imposed on the AOMBS to be learned.

The second part of the general learning algorithm is applied for both positive and negative training instances. If the instance $i$ is positive, given the AOMBS $\langle I^+, \{G(I^+, I_n^-)\}_{I_n^- \in P} \rangle$ of the version space $VS(I^+, I^-)$ and the set $I^{+\prime} = I^+ \cup \{i\}$, the algorithm computes the AOMBS $\langle I^{+\prime}, \{G(I^{+\prime}, I_n^-)\}_{I_n^- \in P} \rangle$ of the version space $VS(I^{+\prime}, I^-)$. If the instance $i$ is negative, given the AOMBS $\langle I^+, \{G(I^+, I_n^-)\}_{I_n^- \in P} \rangle$ of the version space $VS(I^+, I^-)$ and the covering $P'$ of the training set $I^{-\prime} = I^- \cup \{i\}$, the algorithm computes the AOMBS $\langle I^+, \{G(I^+, I_n^{-\prime})\}_{I_n^{-\prime} \in P'} \rangle$ of the version space $VS(I^+, I^{-\prime})$. The second part of the algorithm is based on theorems 2 and 3 given below.

Theorem 2 Let $VS(I^+, I^-)$ be a version space represented by AOMBS $\langle I^+, \{G(I^+, I_n^-)\}_{I_n^- \in P} \rangle$, and let $VS(I^{+\prime}, I^-)$ be a version space represented by AOMBS $\langle I^{+\prime}, \{G(I^{+\prime}, I_n^-)\}_{I_n^- \in P} \rangle$, where $I^{+\prime} = I^+ \cup \{i\}$. If the concept language $L_c$ is admissible, then for each $I_n^- \in P$: $G(I^{+\prime}, I_n^-) = \{g \in G(I^+, I_n^-) \mid M(g, i)\}$.

Theorem 3 Let $VS(I^+, I^-)$ be a version space represented by AOMBS $\langle I^+, \{G(I^+, I_n^-)\}_{I_n^- \in P} \rangle$, and let $VS(I^+, I^{-\prime})$ be a version space represented by AOMBS $\langle I^+, \{G(I^+, I_n^{-\prime})\}_{I_n^{-\prime} \in P'} \rangle$, where $I^{-\prime} = I^- \cup \{i\}$. If the concept language $L_c$ is admissible, then:
- for $I_n^{-\prime} \in P'$ s.t. $(\exists I_n^- \in P)(I_n^{-\prime} = I_n^-)$: $G(I^+, I_n^{-\prime}) = G(I^+, I_n^-)$;
- for $I_n^{-\prime} \in P'$ s.t. $(\exists I_n^- \in P)(I_n^{-\prime} = I_n^- \cup \{i\})$: $G(I^+, I_n^{-\prime}) = MAX(\{c \in VS(I^+, I_n^-) \mid \neg M(c, i)\})$;
- if $\{i\} \in P'$: $G(I^+, \{i\}) = MAX(VS(I^+, \{i\}))$.

To facilitate implementation we formulate the second part of the algorithm in detailed steps. Given a new training instance $i \in I$ and the AOMBS $\langle I^+, \{G(I^+, I_n^-)\}_{I_n^- \in P} \rangle$ of a version space $VS(I^+, I^-)$, this part of the algorithm is as follows (a sketch is given after this list):

if the instance $i$ is positive then
- $I^{+\prime} = I^+ \cup \{i\}$;
- for each $I_n^- \in P$: form the set $G(I^{+\prime}, I_n^-)$ from the elements of the set $G(I^+, I_n^-)$ that cover $i$;
- return AOMBS $\langle I^{+\prime}, \{G(I^{+\prime}, I_n^-)\}_{I_n^- \in P} \rangle$ of $VS(I^{+\prime}, I^-)$.

if the instance $i$ is negative and a covering $P'$ of the training set $I^{-\prime} = I^- \cup \{i\}$ is created, then
- for $I_n^{-\prime} \in P'$ s.t. $(\exists I_n^- \in P)(I_n^{-\prime} = I_n^-)$: take the set $G(I^+, I_n^-)$ from the AOMBS $\langle I^+, \{G(I^+, I_n^-)\}_{I_n^- \in P} \rangle$;
- for $I_n^{-\prime} \in P'$ s.t. $(\exists I_n^- \in P)(I_n^{-\prime} = I_n^- \cup \{i\})$: form the set $G(I^+, I_n^{-\prime})$ as the maximal set of the subset of the version space $VS(I^+, I_n^-)$ whose elements do not cover the instance $i$;
- if $\{i\} \in P'$ then generate the set $G(I^+, \{i\})$;
- return AOMBS $\langle I^+, \{G(I^+, I_n^{-\prime})\}_{I_n^{-\prime} \in P'} \rangle$ of $VS(I^+, I^{-\prime})$.

The algorithm can be tuned to learn different AOMBS. If the covering $P'$ is formed s.t. $P' = \bigcup_{i \in I^{-\prime}} \{\{i\}\}$, then instance-based maximal boundary sets are learned. If the covering $P'$ is formed s.t. $P' = \{I^{-\prime}\}$, then one-sided maximal boundary sets are learned. In all other cases the algorithm learns some AOMBS in the space $R(I^+, I^{-\prime})$ that are different from the least and greatest elements.
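
A minimal sketch of the second part of the algorithm, in the same toy 1-CNF setting as the earlier sketches. The helper `g_set` recomputes $MAX(VS(I^+, I_n^-))$ by brute force, which for the changed blocks coincides with the incremental computation of theorem 3; the covering extension (the unspecified first part of the algorithm) is chosen by the caller. All names are ours.

```python
from itertools import product

N = 4

def covers(c, i):
    return all(a == '?' or a == b for a, b in zip(c, i))

def more_general(g, c):
    return all(a == '?' or a == b for a, b in zip(g, c))

def g_set(pos, neg_block):
    # G(I+, In-) = MAX(VS(I+, In-)), recomputed by brute force
    vs = [c for c in product((0, 1, '?'), repeat=N)
          if all(covers(c, p) for p in pos)
          and not any(covers(c, n) for n in neg_block)]
    return [c for c in vs
            if not any(d != c and more_general(d, c) for d in vs)]

def update_positive(pos, aombs, i):
    # theorem 2: keep only the elements of each G set that cover i
    return pos + [i], {blk: [g for g in G if covers(g, i)]
                       for blk, G in aombs.items()}

def update_negative(pos, aombs, extended_covering):
    # theorem 3: reuse unchanged blocks, recompute the blocks that changed
    return pos, {blk: (aombs[blk] if blk in aombs else g_set(pos, list(blk)))
                 for blk in extended_covering}

pos, aombs = [(1, 1, 1, 1)], {}
# the caller's "first part": put the new negative instance in its own block
pos, aombs = update_negative(pos, aombs, [frozenset({(0, 0, 1, 1)})])
pos, aombs = update_positive(pos, aombs, (1, 0, 1, 0))
print(aombs)  # each block of the covering maps to its maximal boundary set
```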

5 Merging Algorithms

Given the AOMBS of a version space, merging algorithms form more homogeneous AOMBS of the same version space. In this section we propose two such algorithms that differ w.r.t. the requirements the concept language satisfies.

5.1 General Merging Algorithm

The general merging algorithm is given for admissible concept languages. It consists of two parts. To describe the first part of the algorithm we introduce the notion of merged regroupment. Let $P$ and $P'$ be coverings of the set $I^-$. If $P \neq P'$, then $P'$ is a merged regroupment of $P$ iff $(\forall I_n^{-\prime} \in P')(\exists Q \subseteq P)(I_n^{-\prime} = \bigcup_{I_n^- \in Q} I_n^-)$. Given the AOMBS $\langle I^+, \{G(I^+, I_n^-)\}_{I_n^- \in P} \rangle$ of a version space $VS(I^+, I^-)$, the first part of the algorithm computes a merged regroupment $P'$ of the covering $P$. This computation is specified depending on the applicability requirements imposed on the AOMBS to be merged.

The second part of the algorithm is based on theorem 4 below.

Theorem 4 Let $VS(I^+, I^-)$ be a version space given by two AOMBS $\langle I^+, \{G(I^+, I_n^-)\}_{I_n^- \in P} \rangle$ and $\langle I^+, \{G(I^+, I_n^{-\prime})\}_{I_n^{-\prime} \in P'} \rangle$ s.t. $P'$ is a merged regroupment of $P$. If the concept language $L_c$ is admissible, then for $I_n^{-\prime} \in P'$ the set $G(I^+, I_n^{-\prime})$ is equal to $\{c \in MS(\{G(I^+, I_n^-)\}_{I_n^- \in Q}) \mid (\forall p \in I^+)\,M(c, p)\}$, where $Q \in SP(I_n^{-\prime})$, $Q \subseteq P$, and $MS(\{G(I^+, I_n^-)\}_{I_n^- \in Q})$ is the maximal set of $\{c \in L_c \mid (\forall I_n^- \in Q)(\exists g \in G(I^+, I_n^-))(g \geq c)\}$.

Below we describe the second part of the general merging algorithm. Given the AOMBS $\langle I^+, \{G(I^+, I_n^-)\}_{I_n^- \in P} \rangle$ of a version space $VS(I^+, I^-)$ and the covering $P'$ (computed by the first part), the second part of the algorithm computes the merged AOMBS $\langle I^+, \{G(I^+, I_n^{-\prime})\}_{I_n^{-\prime} \in P'} \rangle$ of the same version space $VS(I^+, I^-)$. This is done in the following steps:
- for each set $I_n^{-\prime} \in P'$ s.t. $I_n^{-\prime} \in P$: take the set $G(I^+, I_n^{-\prime})$ from the AOMBS $\langle I^+, \{G(I^+, I_n^-)\}_{I_n^- \in P} \rangle$;
- for each set $I_n^{-\prime} \in P'$ s.t. $I_n^{-\prime} \notin P$: compute the set $MS(\{G(I^+, I_n^-)\}_{I_n^- \in Q})$, where $Q \subseteq P$ s.t. $I_n^{-\prime} = \bigcup_{I_n^- \in Q} I_n^-$, and form the maximal boundary set $G(I^+, I_n^{-\prime})$ from those elements of the set $MS(\{G(I^+, I_n^-)\}_{I_n^- \in Q})$ that cover all the instances in $I^+$;
- return the resulting merged AOMBS $\langle I^+, \{G(I^+, I_n^{-\prime})\}_{I_n^{-\prime} \in P'} \rangle$ of $VS(I^+, I^-)$.
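
Theorem 4's construction can be sketched in the same toy setting: the merged boundary set for a union block is obtained from the component G sets alone, by brute-forcing the maximal set $MS$ and then filtering by the positive instances. The helper names are ours; a real implementation would compute $MS$ with minimal specialisations rather than enumeration.

```python
from itertools import product

N = 4

def covers(c, i):
    return all(a == '?' or a == b for a, b in zip(c, i))

def more_general(g, c):
    return all(a == '?' or a == b for a, b in zip(g, c))

def merge_blocks(pos, g_sets):
    # theorem 4: the merged G set for the union of the blocks behind g_sets.
    # MS(...) is the maximal set of descriptions lying below some element of
    # every component G set; its I+-consistent elements form the merged set.
    below_all = [c for c in product((0, 1, '?'), repeat=N)
                 if all(any(more_general(g, c) for g in G) for G in g_sets)]
    ms = [c for c in below_all
          if not any(d != c and more_general(d, c) for d in below_all)]
    return [c for c in ms if all(covers(c, p) for p in pos)]

pos = [(1, 1, 1, 1)]
G1 = [(1, '?', '?', '?'), ('?', 1, '?', '?')]  # G(I+, {(0,0,1,1)})
G2 = [('?', '?', 1, '?'), ('?', '?', '?', 1)]  # G(I+, {(1,1,0,0)})
print(merge_blocks(pos, [G1, G2]))             # G(I+, {(0,0,1,1), (1,1,0,0)})
```

Note how merging the two two-element G sets yields a four-element set for the union block, mirroring the multiplicative growth seen in the section-7 example.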

5.2 Specialised Merging Algorithm

The specialised merging algorithm is proposed for computing more homogeneous AOMBS of which the sizes are reduced. It is applicable for admissible concept languages if the intersection-preserving property holds (Smirnov, 2001).

Definition 3 (Intersection-Preserving Property (IP)) An admissible concept language has the intersection-preserving property when for each nonempty set $C \subseteq L_c$ there exists the greatest lower bound $glb(C) \in L_c$ s.t. an instance $i \in I$ is covered by all the elements of $C$ iff the instance $i$ is covered by $glb(C)$.

The specialised merging algorithm is based on theorem 5 below.

Theorem 5 Consider a set $I^-$, a covering $P \in SP(I^-)$, and a nonempty set $Q \subseteq P$ s.t. $(\forall I_n^- \in Q)\,|G(I^+, I_n^-)| = 1$. If the concept language $L_c$ is admissible and the property IP holds, then $G(I^+, \bigcup_{I_n^- \in Q} I_n^-) = \{glb(\bigcup_{I_n^- \in Q} G(I^+, I_n^-))\}$.

Below we describe the specialised merging algorithm. The input is an AOMBS $\langle I^+, \{G(I^+, I_n^-)\}_{I_n^- \in P} \rangle$ of a version space $VS(I^+, I^-)$. The output is a more homogeneous AOMBS of $VS(I^+, I^-)$. The first part of the algorithm determines the new regroupment $P'$ of the covering $P$. This is done in four steps:
- initialise a set $Q$ to be empty;
- determine each maximal boundary set $G(I^+, I_n^-)$ in the AOMBS s.t. $|G(I^+, I_n^-)| = 1$ and add $I_n^-$ to $Q$;
- form the set $I_n^{-\prime} = \bigcup_{I_n^- \in Q} I_n^-$;
- $P' = (P \setminus Q) \cup \{I_n^{-\prime}\}$.

If $P' \neq P$ the second part of the algorithm is executed in three steps:
- compute $glb(\bigcup_{I_n^- \in Q} G(I^+, I_n^-))$;
- $G(I^+, I_n^{-\prime}) = \{glb(\bigcup_{I_n^- \in Q} G(I^+, I_n^-))\}$ (according to theorem 5);
- return AOMBS $\langle I^+, \{G(I^+, I_n^{-\prime})\}_{I_n^{-\prime} \in P'} \rangle$ of $VS(I^+, I^-)$.

According to step 2 the size of the set $G(I^+, I_n^{-\prime})$ is equal to one. Thus, the algorithm guarantees that the size of the merged AOMBS is reduced.

6 Other Useful Algorithms

In this section we propose two useful algorithms for the AOMBS. The first one is an algorithm for version-space collapse. It checks whether version spaces are empty. The second algorithm is a classification algorithm.

6.1 Algorithm for Version-Space Collapse

The algorithm for version-space collapse is proposed for admissible languages with the property IP. It is based on theorem 6, showing how to use the version spaces $VS(I^+, I_n^-)$ for $I_n^- \in P$ to check whether the version space $VS(I^+, I^-)$ is empty.

Theorem 6 Consider an admissible concept language $L_c$ s.t. the property IP holds. If the set $I^-$ is nonempty, then $(VS(I^+, I^-) \neq \emptyset) \leftrightarrow (\forall I_n^- \in P)(VS(I^+, I_n^-) \neq \emptyset)$.

If the property IP holds, then according to theorem 6, in order to check the version space $VS(I^+, I^-)$ for collapse we need to check for collapse the version spaces $VS(I^+, I_n^-)$ for $I_n^- \in P$. Since the $VS(I^+, I_n^-)$ are represented by their maximal boundary sets $G(I^+, I_n^-)$ in the AOMBS of $VS(I^+, I^-)$, we have to determine a relation between the maximal boundary sets $G(I^+, I_n^-)$ and the version spaces $VS(I^+, I_n^-)$. The desired relation is given in theorem 7, taken from (Smirnov, 2001). Theorem 7 states that a version space $VS(I^+, I^-)$ is not empty if and only if the maximal boundary set $G(I^+, I^-)$ is not empty.

Theorem 7 If the concept language $L_c$ is admissible, then $(VS(I^+, I^-) \neq \emptyset) \leftrightarrow (G(I^+, I^-) \neq \emptyset)$.

Theorems 6 and 7 imply corollary 1. It states that if the property IP holds and the training set $I^-$ is not empty, then the version space $VS(I^+, I^-)$ is not empty iff for each $I_n^- \in P$ the maximal boundary set $G(I^+, I_n^-)$ is not empty.

Corollary 1 Consider an admissible concept language $L_c$ s.t. the property IP holds and a training set $I^-$ with a covering $P \in SP(I^-)$. If $I^- \neq \emptyset$, then $(VS(I^+, I^-) \neq \emptyset) \leftrightarrow (\forall I_n^- \in P)(G(I^+, I_n^-) \neq \emptyset)$.

Below we describe the algorithm for version-space collapse. The input is an AOMBS $\langle I^+, \{G(I^+, I_n^-)\}_{I_n^- \in P} \rangle$ of a version space $VS(I^+, I^-)$. The output is true if $VS(I^+, I^-) = \emptyset$; otherwise, it is false. The algorithm is as follows:
- for all $I_n^- \in P$: if $G(I^+, I_n^-) = \emptyset$ then return true (by corollary 1);
- return false.
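
Given corollary 1, the collapse test reduces to inspecting the G sets already stored in the AOMBS. A minimal sketch (the dictionary layout is ours, as in the earlier sketches):

```python
def collapses(aombs):
    """Version-space collapse test (corollary 1): assuming an admissible
    language with property IP and a nonempty I-, VS(I+, I-) is empty iff
    some maximal boundary set G(I+, In-) in the AOMBS is empty."""
    return any(not G for G in aombs.values())

# a toy AOMBS whose second block has an empty maximal boundary set
aombs = {frozenset({(0, 0, 1, 1)}): [(1, '?', '?', '?')],
         frozenset({(1, 1, 1, 1)}): []}
print(collapses(aombs))  # True: the version space is empty
```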

6.2 Classification Algorithm

The classification algorithm is based on unanimous voting (Mitchell, 1997): an instance $i$ is classified if all the descriptions in a version space $VS(I^+, I^-)$ agree on a classification. The algorithm is given for admissible languages with the property IP. It is based on the dual theorems 8 and 9 from (Smirnov, 2001). Theorem 8 states that the subset of $VS(I^+, I^-)$ of which the elements do not cover the instance $i$ is empty iff all the descriptions of $VS(I^+, I^-)$ cover the instance $i$. Theorem 9 can be explained by duality.

Theorem 8 For $i \in I$: $VS(I^+, I^- \cup \{i\}) = \emptyset$ iff $(\forall c \in VS(I^+, I^-))\,M(c, i)$.

Theorem 9 For $i \in I$: $VS(I^+ \cup \{i\}, I^-) = \emptyset$ iff $(\forall c \in VS(I^+, I^-))\,\neg M(c, i)$.

The classification algorithm is described as follows. Its input is an instance $i$ and the AOMBS of a version space $VS(I^+, I^-)$. The algorithm outputs: "+" if all the descriptions in $VS(I^+, I^-)$ cover $i$; "-" if no descriptions in $VS(I^+, I^-)$ cover $i$; or "?" otherwise. The algorithm consists of three steps:
- form the AOMBS of the version space $VS(I^+, I^- \cup \{i\})$; if $VS(I^+, I^- \cup \{i\}) = \emptyset$ then return "+";
- form the AOMBS of the version space $VS(I^+ \cup \{i\}, I^-)$; if $VS(I^+ \cup \{i\}, I^-) = \emptyset$ then return "-";
- return "?".

We note that the classification algorithm is applicable only if the property IP holds, because of its use of the algorithm for version-space collapse.
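
A minimal sketch of the classification algorithm in the same toy setting: each collapse test is performed by rebuilding the affected G sets by brute force, with the new negative instance placed in its own block; in the paper's scheme the AOMBS would instead be updated by the general learning algorithm. All names are ours.

```python
from itertools import product

N = 4

def covers(c, i):
    return all(a == '?' or a == b for a, b in zip(c, i))

def g_set(pos, neg_block):
    # G(I+, In-) by brute force over the toy conjunctive language
    vs = [c for c in product((0, 1, '?'), repeat=N)
          if all(covers(c, p) for p in pos)
          and not any(covers(c, n) for n in neg_block)]
    return [c for c in vs
            if not any(d != c and
                       all(a == '?' or a == b for a, b in zip(d, c))
                       for d in vs)]

def classify(i, pos, blocks):
    # unanimous voting via two collapse tests (theorems 8 and 9);
    # the extra instance gets its own block / is added to I+
    if any(not g_set(pos, blk) for blk in blocks + [[i]]):
        return '+'   # VS(I+, I- u {i}) collapsed: every description covers i
    if any(not g_set(pos + [i], blk) for blk in blocks):
        return '-'   # VS(I+ u {i}, I-) collapsed: no description covers i
    return '?'

pos, blocks = [(1, 1, 1, 1)], [[(0, 0, 0, 0)]]
print(classify((1, 1, 1, 0), pos, blocks))  # '?': the version space disagrees
```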
7 Example of AOMBS

Consider the concept-learning task from Table 1: the instance universe $I$ and the concept language $L_c$ are 1-CNF languages with 8 Boolean attributes s.t. $I \subseteq L_c$ (note that the concept language $L_c$ is admissible and the property IP holds). The training instances are chosen s.t. $|G(\{i_1^+\}, \{i_2^-, i_3^-, i_4^-, i_5^-\})| = 2^{|\{i_2^-, i_3^-, i_4^-, i_5^-\}|} = 16$; i.e., the size of the one-sided boundary sets is exponential in the number of negative instances. Our goal is to learn AOMBS of which the size is polynomial in the size of the training data. The goal is achievable since the space of AOMBS representations contains the instance-based maximal boundary sets, of which the size is polynomial in the number of training instances (Smirnov et al., 2002). Therefore, we tune the general learning algorithm to form the new covering $P'$ s.t. the upper size limit of the desired AOMBS is the size of the instance-based maximal boundary sets. (In reality the worst-case size of the instance-based maximal boundary sets is used; for 1-CNF languages with Boolean attributes it equals $(|I^+| + |I^-|)A$, where $A$ is the number of attributes, so the size can be computed without computing the representation. It is not used in our example because of space limits.)

The initialisation and the trace of the general learning algorithm are given in the second column of Table 1. They are explained as follows. Before the instance $i_4^-$ the AOMBS are equal to the one-sided maximal boundary sets, since the sizes of the one-sided maximal boundary sets and the instance-based maximal boundary sets are equal. When the instance $i_4^-$ is given, the algorithm has two choices: (1) to update the set $G(\{i_1^+\}, \{i_2^-, i_3^-\})$; or (2) to create the set $G(\{i_1^+\}, \{i_4^-\})$. If the first choice is made the size of the AOMBS becomes 9. If the second choice is made the size of the AOMBS becomes 7; i.e., it is equal to the size of the instance-based maximal boundary sets. Thus, the algorithm makes the second choice. When the instance $i_5^-$ is given, the algorithm can be explained analogously. When the positive instance $i_6^+$ is processed, the sets $G(\{i_1^+\}, \{i_2^-, i_3^-\})$ and $G(\{i_1^+\}, \{i_4^-, i_5^-\})$ are updated, and the sets $G(\{i_1^+, i_6^+\}, \{i_2^-, i_3^-\})$ and $G(\{i_1^+, i_6^+\}, \{i_4^-, i_5^-\})$ are created and added to the AOMBS. The resulting AOMBS are processed by the specialised merging algorithm, because the property IP holds and the G sets each contain just one element. Thus, the G part of the final AOMBS is given by $G(\{i_1^+, i_6^+\}, \{i_2^-, i_3^-, i_4^-, i_5^-\}) = \{(1,?,1,?,1,?,1,?)\}$; i.e., the final AOMBS equal the one-sided boundary sets.
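
The final merging step of the example can be reproduced in a few lines: for compatible 1-CNF conjunctions the greatest lower bound of theorem 5 is computed attribute-wise (the encoding and the function name are ours).

```python
def glb(descs):
    """Greatest lower bound of compatible conjunctions: keep '?' only where
    every description is unconstrained (the IP property of the language)."""
    out = []
    for column in zip(*descs):
        vals = {v for v in column if v != '?'}
        assert len(vals) <= 1, "conflicting constraints: no glb in 1-CNF"
        out.append(vals.pop() if vals else '?')
    return tuple(out)

g1 = (1, '?', 1, '?', '?', '?', '?', '?')  # G({i1+, i6+}, {i2-, i3-})
g2 = ('?', '?', '?', '?', 1, '?', 1, '?')  # G({i1+, i6+}, {i4-, i5-})
print(glb([g1, g2]))  # (1,?,1,?,1,?,1,?), the final G set of Table 1
```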

i1+ = (1,1,1,1,1,1,1,1) (initialisation):
  I+ = {(1,1,1,1,1,1,1,1)}
  (S_A = 1 | S_B = 1 | S_I = 1)

i2- = (0,0,1,1,1,1,1,1):
  G({i1+}, {i2-}) = {(1,?,?,?,?,?,?,?), (?,1,?,?,?,?,?,?)}
  (S_A = 3 | S_B = 3 | S_I = 3)

i3- = (1,1,0,0,1,1,1,1):
  G({i1+}, {i2-, i3-}) = {(1,?,1,?,?,?,?,?), (1,?,?,1,?,?,?,?), (?,1,1,?,?,?,?,?), (?,1,?,1,?,?,?,?)}
  (S_A = 5 | S_B = 5 | S_I = 5)

i4- = (1,1,1,1,0,0,1,1):
  G({i1+}, {i2-, i3-}) as above; G({i1+}, {i4-}) = {(?,?,?,?,1,?,?,?), (?,?,?,?,?,1,?,?)}
  (S_A = 7 | S_B = 9 | S_I = 7)

i5- = (1,1,1,1,1,1,0,0):
  G({i1+}, {i2-, i3-}) as above; G({i1+}, {i4-, i5-}) = {(?,?,?,?,1,?,1,?), (?,?,?,?,1,?,?,1), (?,?,?,?,?,1,1,?), (?,?,?,?,?,1,?,1)}
  (S_A = 9 | S_B = 17 | S_I = 9)

i6+ = (1,0,1,0,1,0,1,0):
  I+ = {(1,1,1,1,1,1,1,1), (1,0,1,0,1,0,1,0)}
  G({i1+, i6+}, {i2-, i3-}) = {(1,?,1,?,?,?,?,?)}; G({i1+, i6+}, {i4-, i5-}) = {(?,?,?,?,1,?,1,?)}
  (S_A = 4 | S_B = 3 | S_I = 6)

Table 1: Execution of the General Learning Algorithm. Each row shows a training instance, the state of the AOMBS after the instance is processed, and the sizes S_A, S_B, and S_I of the AOMBS, the one-sided maximal boundary sets, and the instance-based maximal boundary sets, respectively.

8 Adaptable One-Sided Minimal Boundary Sets

Adaptable one-sided minimal boundary sets (AOmBS) can be derived by duality from the previous sections. The AOmBS unify the one-sided minimal boundary sets (Hirsh, 1992) and the one-sided instance-based minimal boundary sets (Smirnov et al., 2002). We refrain from providing further details.

9 Conclusions

This paper considered the open question how to unify one-sided version-space representations. To answer the question we introduced a family of new version-space representations called adaptable one-sided boundary sets. We showed that the new representations unify the family of one-sided boundary sets (Hirsh, 1992) and the family of one-sided instance-based boundary sets (Smirnov et al., 2002). In this context we note that the latter families consist of the most applicable version-space representations known till now. Thus, we conclude that currently the adaptable one-sided boundary sets are the most applicable version-space representations.

References

H. Hirsh, N. Mishra, and L. Pitt. 1997. Version spaces without boundary sets. In Proceedings of the Fourteenth National Conference on Artificial Intelligence (AAAI-97), Menlo Park, CA. AAAI Press.

H. Hirsh. 1992. Polynomial-time learning with version spaces. In Proceedings of the Tenth National Conference on Artificial Intelligence (AAAI-92), Menlo Park, CA. AAAI Press.

P. Idestam-Almquist. 1990. Demand networks: an alternative representation of version spaces. Master's thesis, Department of Computer Science and Systems Sciences, Stockholm University, Stockholm, Sweden.

T.M. Mitchell. 1997. Machine Learning. McGraw-Hill, New York, NY.

G. Sablon, L. De Raedt, and M. Bruynooghe. 1994. Iterative versionspaces. Artificial Intelligence, 69(1-2).

M. Sebag and C. Rouveirol. 2000. Resource-bounded relational reasoning: induction and deduction through stochastic matching. Machine Learning, 38(1-2).

E.N. Smirnov, I.G. Sprinkhuizen-Kuyper, and H.J. van den Herik. 2002. New version-space representations for efficient instance retraction. In Proceedings of the First International Workshop on Knowledge Discovery in Inductive Databases (KDID-2002).

E.N. Smirnov, I.G. Sprinkhuizen-Kuyper, and H.J. van den Herik. 2004. A unifying version-space representation. Annals of Mathematics and Artificial Intelligence (in print).

E.N. Smirnov. 2001. Conjunctive and Disjunctive Version Spaces with Instance-Based Boundary Sets. Ph.D. thesis, Department of Computer Science, Maastricht University, Maastricht, The Netherlands.

B.D. Smith and P.S. Rosenbloom. 1990. Incremental non-backtracking focusing: a polynomially bounded generalization algorithm for version spaces.
In Proceedings of the Eighth National Conference on Artificial Intelligence (AAAI-90). MIT Press.
