An objective definition of subjective probability


Nico Roos¹

Abstract. Several attempts have been made to give an objective definition of subjective probability. These attempts can be divided into two approaches. The first approach uses an a priori probability distribution over the set of interpretations of the language that we are using to describe information. The idea is to define such an a priori probability distribution using some general principles, such as the insufficient reason principle of Bernoulli and Laplace. The second approach does not start from a set of interpretations among which we try to find the one describing the world, but instead tries to build a partial model of the world. Uncertainty in the available information results in several possible partial models, each presenting a different view of the world. Using the insufficient reason principle, a probability is assigned to each view. This paper will present arguments for using the second approach instead of the first. Furthermore, a new formalization of the second approach, solving the problems of earlier attempts, will be given.

1 Introduction

Several attempts have been made to give an objective definition of subjective probability. These attempts can be divided into two approaches. The first approach uses an a priori probability distribution on the set of interpretations of the language that we are using to describe information [1, 4, 2, 6]. The idea is to define such an a priori probability distribution using some general principles. The second approach does not start from a set of interpretations among which we try to find the one describing the world, but instead tries to build a partial model of the world [7, 8, 9]. Uncertainty in the available information results in several possible partial models, each presenting a different view of the world. Using the insufficient reason principle of Bernoulli and Laplace, a probability is assigned to each view.
The two approaches can be characterized by two children's games. The approach in which we start with a probability distribution over the set of interpretations can be characterized by the game "Who is it?". The goal of this game is to identify a person from a set of candidates by asking questions. Based on the answers, we eliminate some of the candidates. If we would define an a priori probability distribution over the set of candidates, we could answer questions such as: "What is the chance that the person we try to identify has blue eyes?". The other approach can be characterized as solving a jigsaw puzzle. When we receive some information, this information may represent two or more pieces of which one belongs to the puzzle. Each of these pieces results in a different view on how to complete the puzzle. We may also receive information representing a piece of the puzzle of which we do not know where to put it into the puzzle. There might be more than one position where it could fit. This again can result in different views on how to complete the puzzle. If we have no information to prefer one of these views, then, using the insufficient reason principle of Bernoulli and Laplace, we might consider all these views on how to complete the puzzle as being equally likely. So the uncertainty expresses our lack of knowledge. For the first approach, one also uses the insufficient reason principle. It is used in the definition of the a priori probability distribution. Can we argue, however, that the candidates should be equally likely? Suppose, for example, that we have the following information about the person to be identified: the person has blue or green eyes; if this person has green eyes, his/her hair is blond. If there are as many candidates with blue as with green eyes, what should be the probability that the person to be identified has green eyes?

¹ Maastricht University, Department of Computer Science, P.O. Box 616, 6200 MD Maastricht, The Netherlands, roos@cs.unimaas.nl
If all candidates are equally likely, and if there are candidates for each combination of eye color and hair color, then the probability that the person to be identified has green eyes will be less than 0.5. This outcome is perfectly valid for this game, but is it for an agent receiving information about the world? The fact that we do not know the color of the person's hair if he/she has blue eyes can hardly be a reason to consider having blue eyes to be more likely. It would imply that the likelihood of a possible situation is proportional to the lack of information about this situation. The heart of the problem is that in the first approach, we assume that all worlds are really possible. The only thing that we do not know is which world has been selected for today. The name of the game is, however, that there is one fixed world of which we have to determine what it looks like. Therefore, we should view information as pieces of a puzzle. Some information, such as the color of the person's eyes, represents two pieces, one of which belongs to the puzzle. So, we can have different views on how to complete the puzzle. As a result, uncertainty arises. Furthermore, information stating that the person has blond hair if s/he has green eyes is a piece that we can put into the puzzle in one view, but which is not a piece of the puzzle in the other view. So, we have more information about one possible view than we have about the other. This should not influence the likelihood of the two views.

2 The probability distribution

Since there is only one fixed world, we cannot define a probability distribution over a set of possible worlds. As was pointed out in the introduction, we can only define a probability distribution over the set of views we have about the world. Therefore, we must now address the question concerning the requirements such a probability distribution must meet. First of all, views must be mutually exclusive.
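The contrast between the two approaches can be made concrete with a small enumeration. The sketch below is a hypothetical encoding (the predicates and the Python representation are not from the paper): counting equally likely interpretations makes green eyes less likely than 0.5, while counting equally likely views does not.

```python
from itertools import product

# Premises: eyes are blue or green; green eyes imply blond hair.
def satisfies(eye, hair):
    return eye in ("blue", "green") and (eye != "green" or hair == "blond")

# First approach: an a priori distribution over full interpretations.
interps = [(e, h) for e, h in product(("blue", "green"), ("blond", "brown"))
           if satisfies(e, h)]
p_green_interp = sum(1 for e, _ in interps if e == "green") / len(interps)

# Second approach: equally likely views (partial models of the one world).
views = [{"eye": "blue"}, {"eye": "green", "hair": "blond"}]
p_green_views = sum(1 for v in views if v["eye"] == "green") / len(views)

print(p_green_interp)  # 1/3: lack of information about hair penalizes green
print(p_green_views)   # 1/2: the two views are equally likely
```

Adding a third hair color to the vocabulary pushes the interpretation-based probability down further, which is exactly the dependence on the vocabulary criticized later in this paper.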
Since views are in fact epistemic states, it might be possible to combine the information of two different views in one more informative view. In other words,

© 1998 N. Roos. ECAI 98, 13th European Conference on Artificial Intelligence. Edited by Henri Prade. Published in 1998 by John Wiley & Sons, Ltd.

two different views need not give incompatible descriptions of the world. Since such views cannot be considered as being mutually exclusive, we must exclude them. If views are mutually exclusive and if we have no reason to prefer one description of the world to another, all views should be equally likely. Note that the information content of a view cannot be an issue, since we cannot talk about the set of worlds described by a view. There is only one world. New information described by an implication may change the likelihood of a view. If the consequent of the implication consists of a disjunction, a view may be replaced by two or more new views. For example: "If the person has blue eyes, s/he has black or blond hair." The above implication describes two possible ways to extend the information content of a view. So, one view is replaced by two new views. In our example, we go from two views to three views. Since we still have no reason to prefer one view to another, the probability distribution changes. Now consider the following slightly modified example: "A man or a woman is standing in front of the house. If it is a woman, her eyes are blue or green. If it is a man, his eyes are blue." This example also describes three views of the world. Our intuition tells us here that the probability that the person in front of the house is a man should be the same as the probability that this person is a woman. Since this example appears to be similar to the previous example, why are the views not equally likely? The difference between the two examples is that in the first example we have one person to be identified, while in the last example we have two persons, a man and a woman. If the person in front of the house is a woman, then there is uncertainty concerning the color of her eyes. How do we represent this information in one or more views? In a view of the world, the woman will be represented by some object.
Since an object can, in principle, denote any entity in the world, we need a way to indicate that an object in one view, representing a woman having blue eyes, denotes the same entity as an object in another view, representing the same woman having green eyes. We can realize this by introducing the object denoting the woman in a common parent-view of the two views. Naturally, objects that are present in both parent- and child-view should denote the same entity in the world. Since a parent-view can be interpreted as the intersection of its child-views, two of the three views described by our example must be grouped in order to represent that we are talking about one woman. The resulting hierarchy represents the dependencies between the views. For the same reason that describing a single entity can result in a hierarchy of views, so can describing a group of entities. Suppose, for example, that 10 or 15 persons are waiting in a room. If we also know that each person has brown or blond hair, then we again get a hierarchy of views, where the root view consists of two sub-views, one containing 10 persons and one containing 15 persons. Each sub-view is further divided into sub-views which describe the possible properties of the objects. Now suppose that we know that 80% of the persons have brown hair. Then there will be respectively 8 or 12 persons having brown hair. So the view with 10 persons will be divided into sub-views specifying which 8 persons have brown hair, and the view with 15 persons will be divided into sub-views specifying which 12 persons have brown hair. Although there are more views with 15 persons, the presence of 15 persons in the room is not more likely than the presence of 10 persons. This result corresponds with our intuitions.

3 The language

We assume a slightly modified first order language L. This language is recursively defined from a set of atomic predicates P, a set of constants C, a set of variables X, a set of objects 𝒪, the operators ¬, ∧, ∨ and →, and the quantified descriptions [Q x ψ].
The quantified description corresponds with a quantifier and a variable in standard first order logic, e.g. ∀x. In the quantified description [Q x ψ], Q represents one of the quantifiers ∀, ∃, (%p) or (#n), where p ∈ [0, 1] and n ∈ ℕ; x is a variable; and ψ ∈ L is a formula denoting the reference class. The reference class is optional. If absent, all objects belong to the reference class. The premises used in the reasoning process are a finite set of formulas without free variables or objects.

4 Handling uncertainty through partial model construction

The general idea is to approximate the model of the world by constructing a partial model using information about the world. In this way we try to solve the puzzle, which is represented by a complete model of the world. Unfortunately, some kinds of information, such as disjunctions and statistical information, allow us to construct different partial models, resulting in uncertainty about what holds in the world.

Definition 1 Let 𝒪 be the set of all possible objects. A partial model M is a tuple ⟨O, F⟩ where O ⊆ C ∪ 𝒪 is a set of objects and F ⊆ L is a set of formulas. Furthermore, any object or constant occurring in a formula of F is an element of O.

Notice that we allow the set of objects O to also contain constants. This choice avoids the need to specify a denotation of the constants. As a consequence, we will not be able to represent that two constants denote the same object. This may seem worse than it is. Constants are used by an agent for names of objects and for values of sensors. If we say that Mary has blue eyes, we mean that an object named Mary has eye color blue. The person named Mary may have other names and there may even be other persons having the same name. In all circumstances Mary is different from Julia, although one person may have both names. How do we actually construct a partial model? As we have seen above, some information may not uniquely describe a part of the world.
This results in different views of how the world may look. To make things more complicated, statistical information can divide a view into sub-views. Therefore, the partial models must be organized into a tree. In this tree, each node represents a view and its children represent the sub-views. The leaves of the tree consist of the actual partial models. We will call this tree a hierarchical model. The root of a hierarchical model must, of course, satisfy the premises. Furthermore, a formula satisfied by a node (a view) must also be satisfied by its children.

Definition 2 Let V be a view. V is either a partial model M or a set of views {W₁, …, W_k}. O(V) = ⋂_{⟨O,F⟩ ∈ leaves(V)} O. F(V) = ⋂_{⟨O,F⟩ ∈ leaves(V)} F.

Definition 3 Let φ be a formula and let V be a view. V satisfies φ, V ⊨ φ, is defined in the following way.

Reasoning under Uncertainty 596 N. Roos

V ⊨ φ if for some ψ ∈ L, V ⊨ ψ and V ⊨ ¬ψ.
V ⊨ φ if φ ∈ F(V).
V ⊨ φ₁ ∧ φ₂ if V ⊨ φ₁ and V ⊨ φ₂.
V ⊨ ¬(φ₁ ∧ φ₂) if V ⊨ ¬φ₁ or V ⊨ ¬φ₂.
V ⊨ φ₁ ∨ φ₂ if V ⊨ φ₁ or V ⊨ φ₂.
V ⊨ ¬(φ₁ ∨ φ₂) if V ⊨ ¬φ₁ and V ⊨ ¬φ₂.
V ⊨ [∃ x ψ]φ if V = {W₁, …, W_k}, R = {o ∈ O(V) | for each 1 ≤ i ≤ k: W_i ⊨ ψ[x := o]} (R = O(V) if ψ is absent) and for some o ∈ R: W_i ⊨ φ[x := o] for each 1 ≤ i ≤ k.
V ⊨ ¬[Q x ψ]φ if V = {W₁, …, W_k}, R = {o ∈ O(V) | for each 1 ≤ i ≤ k: W_i ⊨ ψ[x := o]} (R = O(V) if ψ is absent) and: if Q = #n, then |{o ∈ R | for each 1 ≤ i ≤ k: W_i ⊨ φ[x := o]}| > n; if Q = ∀, then for some o ∈ R: W_i ⊨ ¬φ[x := o] for each 1 ≤ i ≤ k.
V ⊨ φ if V = {W₁, …, W_k} and for each W ∈ V: W ⊨ φ.

Partial models can be ordered with respect to the amount of information that they contain. Here, we will use an information ordering according to which a partial model N contains at least the same information as a partial model M if and only if N satisfies every formula of M. There is, however, a small problem that we have to deal with. Objects that are not represented by constants are just arbitrary names for entities in the world. Therefore, different partial models can use different names to denote the same entity. To relate the objects of one partial model to the objects of another partial model, we need a mapping of the objects of one model to the objects of the other model.

Definition 4 Let M = ⟨O, F⟩ and N = ⟨O′, F′⟩ be two partial models. Furthermore, let f : 𝒪 → (𝒪 ∪ C) be a mapping of the objects of M to the objects of N. N contains at least the same information as M given f, M ⊑_f N, if and only if for each φ ∈ F there holds that N ⊨ φ[f].

We can extend the above defined information ordering to an information ordering on views. In this information ordering, we must take into account that sub-views can be deleted because of new information. Furthermore, views with fewer hierarchical levels are considered to contain less information.
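As an illustration, the ordering of Definition 4 can be sketched for the special case where all formulas are ground literals. The encoding below is hypothetical (the literal syntax and function names are not from the paper, and testing membership in N's formula set is a simplification of N ⊨ φ[f]): it checks M ⊑_f N by renaming M's objects with f.

```python
# Hypothetical sketch of Definition 4, restricted to ground literals of the
# form "pred(obj)" or "-pred(obj)". A partial model is (objects, formulas).
def leq(M, N, f):
    """M <=_f N: every formula of M, with objects renamed by f, holds in N."""
    _, F_M = M
    _, F_N = N

    def rename(lit):
        neg = lit.startswith("-")
        pred, arg = lit.lstrip("-").rstrip(")").split("(")
        return ("-" if neg else "") + pred + "(" + f.get(arg, arg) + ")"

    return all(rename(phi) in F_N for phi in F_M)

M = ({"o1"}, {"woman(o1)", "blue(o1)"})
N = ({"o2"}, {"woman(o2)", "blue(o2)", "tall(o2)"})
print(leq(M, N, {"o1": "o2"}))  # True: N contains at least M's information
print(leq(N, M, {"o2": "o1"}))  # False: M lacks tall(o1)
```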
Definition 5 Let V and W be two views and let f : 𝒪 → (𝒪 ∪ C) be a mapping of the objects. W contains at least the same information as V given f, V ⊑_f W, if and only if
V and W are partial models and V ⊑_f W; or
for each W′ ∈ W, either V ⊑_f W′ or there is a V′ ∈ V such that V′ ⊑_f W′.

Definition 6 Let V and W be two hierarchical models. W contains at least the same information as V, V ⊑ W, if and only if for some mapping of the objects f : 𝒪 → (𝒪 ∪ C), V ⊑_f W.

A view of a hierarchical model may contain redundant sub-views. We prefer hierarchical models without redundant views, since redundant views provide no relevant information and redundant views are not mutually exclusive. Different views may use different objects to denote the same entity in the world. Therefore, to determine redundancy, we need a mapping of the objects of one view to another view. The objects of a common parent view, however, denote the same entity in every child view.

Definition 7 Let H be a hierarchical model. H is redundant-free if and only if for no two sub-views V and W of a view U in the hierarchical model H there is a function f : 𝒪 → (𝒪 ∪ C) such that for each o ∈ O(U): f(o) = o and V ⊑_f W.

The above defined partial models and views do not guarantee that the truth value of a formula will be defined in terms of its constitutive parts. So, how do we guarantee this? We could try to guarantee this by demanding that a view still satisfies a non-atomic formula after removing this formula from the leaves of the view. Unfortunately, this approach does not guarantee that the partial models and views give a good approximation of the world. We cannot represent the information that there are 5 houses in a street in terms of its constitutive parts in a partial model. The semantics of the formula is incomplete with respect to a partial model. Therefore, another approach is needed.
To assure that a partial model gives the best possible representation of the world given the available information, the premises, we will introduce a representation relation. This representation relation tells us whether a view accurately represents a formula.

Definition 8 Let φ be a formula and let V be a view. V represents φ, V ⊩ φ, is defined in the following way.
V ⊩ φ if φ is a literal and φ ∈ F(V).
V ⊩ φ₁ ∧ φ₂ if V ⊨ φ₁ and V ⊨ φ₂.
V ⊩ ¬(φ₁ ∧ φ₂) if V ⊨ ¬φ₁ or V ⊨ ¬φ₂, and V ⊨ φ_i or V ⊨ ¬φ_i for i ∈ {1, 2}.
V ⊩ φ₁ ∨ φ₂ if V ⊨ φ₁ or V ⊨ φ₂, and V ⊨ φ_i or V ⊨ ¬φ_i for i ∈ {1, 2}.
V ⊩ ¬(φ₁ ∨ φ₂) if V ⊨ ¬φ₁ and V ⊨ ¬φ₂.
V ⊩ ψ → φ if V ⊭ ψ or V ⊨ φ.
V ⊩ [Q x ψ]φ if V = {W₁, …, W_k}, R = {o ∈ O(V) | for each 1 ≤ i ≤ k: W_i ⊨ ψ[x := o]} (R = O(V) if ψ is absent), |R| = k and:
if Q = ∃, then for some o ∈ R: W_i ⊨ φ[x := o] for each 1 ≤ i ≤ k;
if Q = #n, then |{o ∈ R | for each 1 ≤ i ≤ k: W_i ⊨ φ[x := o]}| = n and |{o ∈ R | for each 1 ≤ i ≤ k: W_i ⊨ ¬φ[x := o]}| = k − n;
if Q = %p, then |{o ∈ R | for each 1 ≤ i ≤ k: W_i ⊨ φ[x := o]}| = p·k and |{o ∈ R | for each 1 ≤ i ≤ k: W_i ⊨ ¬φ[x := o]}| = (1 − p)·k;
if Q = ∀, then for each o ∈ R: W_i ⊨ φ[x := o] for each 1 ≤ i ≤ k.

Definition 9 Let V be a view. V is complete if and only if for each formula φ ∈ F(V): V ⊩ φ, and for each W ∈ V such that W ≠ {M} for some partial model M, there holds that W is complete.

Given the above defined views, we can define the least informative complete hierarchical model satisfying the premises.

Definition 10 Let Σ be a set of premises and let V be a view. The view V is a hierarchical model of Σ if and only if V is the least informative, redundant-free and complete view such that V ⊨ Σ.

By applying the insufficient reason principle, we can now define the probability measure.

Definition 11 Let V be a view and let φ be a formula. If the truth value of φ is defined in every leaf of V, then

P_V(φ) = 1, if V ⊨ φ;
P_V(φ) = 0, if V ⊨ ¬φ;
P_V(φ) = (1/k) · Σ_{W ∈ V} P_W(φ), if V = {W₁, …, W_k}.

Otherwise, P_V(φ) is undefined.

Notice that the elimination of sub-views after receiving new information need not influence the probability of a view. A sub-view only denotes a possible way to extend its parent-view. It does not denote a situation that occurs in a proportion of the worlds. Therefore, the sub-views bear no influence on the parent-view. So, the probability of a formula holding in the parent-view should not change when some sub-views are eliminated after receiving new information. Only the elimination of views on the same level after receiving new information can have this effect. This behavior corresponds more or less with the transferable belief model of Smets and Robert [10]. The example of Mr. Jones's murder case discussed in [10] can be reformulated in the approach proposed here, leading to the same results as the transferable belief model.

5 Validity

To verify the validity of the proposed approach, we must verify whether the derivable results correspond with our intuitions. First, however, we will look at three general requirements that a subjective probability measure should meet. The probability measure must be uniquely determined by the premises. The probability measure should not depend on the vocabulary. The probability measure should not depend on the number of objects in the world. The reason for the last two requirements is that every day new concepts are introduced and new objects are created or invented. When, for example, some advertiser introduces some new property of washing powder, this should not influence the chances of rain. Only information relating this new property to the climate can have this effect.

Theorem 1 Let Σ be a set of premises.
For each two hierarchical models V and W determined by the premises Σ, there holds: V ⊑ W and W ⊑ V.

This theorem implies that the views V and W are identical except for the names of the objects. So, the probability measure is uniquely determined by the premises. The other two requirements are also met. Since the defined probability measure depends on the current epistemic states, as follows from the above theorem, the vocabulary and the number of objects in the world have no influence on the probability measure. Now we will look at the probability that an object possesses some property, given statistical information about a group of objects to which the object belongs.

Theorem 2 Let the premises consist of φ(a) and [(%p) x φ(x)] ψ(x), and let V be the corresponding hierarchical model. Then P_V(ψ(a)) = p.

Specificity is the principle by which properties of a smaller, more specific group of objects override the properties of a larger group of objects.

Theorem 3 Let the premises consist of φ(a), [(%p) x φ(x)] ψ(x), [(%q) x χ(x)] ψ(x), and [∀ x φ(x)] χ(x), and let V be the corresponding hierarchical model. Then P_V(ψ(a)) = p.

6 Related work

Carnap [3] makes a distinction between a probability that describes the relation between a hypothesis (a formula) and evidence (the premises), and a probability that says something about the facts of nature. The former, which he calls a logical probability, describes a purely logical relation between evidence and a hypothesis. The latter, which describes statistics, is a mathematical theory about elementary statements based on an empirical procedure. Clearly, the probability measure proposed here defines a logical probability. It describes a logical relation between a formula and the premises. Carnap's main interest is in inductive logic; for example, the derivation of the relative frequency p in the formula [(%p) x χ(x)] φ(x) on the basis of evidence, the premises. The probability defined here is unsuited for this purpose.
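The direct-inference result of Theorem 2 can be checked by brute force for a small finite reference class. The encoding below is hypothetical (it enumerates selections directly rather than building the paper's hierarchical model): each view fixes which p·k members of the reference class have the property ψ, and the fraction of views in which the named object a has ψ is exactly p.

```python
from itertools import combinations

# Hypothetical check of Theorem 2 for a reference class of k objects,
# with premises phi(a) and [(%p) x phi(x)] psi(x).
k, p = 5, 0.6
objects = range(k)
a = 0                                   # the object named by the constant a
# Each view selects the p*k objects that have psi; by insufficient reason,
# all selections are equally likely.
views = list(combinations(objects, round(p * k)))
p_psi_a = sum(1 for psi_objs in views if a in psi_objs) / len(views)
print(p_psi_a)  # 0.6 = p, as Theorem 2 predicts
```

The same count works symbolically: a occurs in C(k−1, pk−1) of the C(k, pk) selections, and the ratio simplifies to p for any k with pk integral.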
Suppose, for example, that there are 10 houses in a street and that we wish to determine whether 50% or 80% of these houses have a basement. After inspecting the first 7 houses, we may have found 5 houses with a basement. One can easily verify that the two hypotheses are still equally likely given this evidence. Hence, the probability measure defined here does not confirm either of the hypotheses. Roos [7, 8, 9] proposes to establish a relation between information and subjective probability through the construction of a partial model. In the reasoning process that he proposes, a partial model consisting of mutually exclusive views is constructed. To these views the insufficient reason argument is applied. Hence, the probability of a formula is proportional to the number of views that satisfy the formula. So, like the approach proposed above, the probability depends on the number of epistemic states. In [7], Roos implicitly assumes that a reference class consists of an infinite number of objects. Furthermore, he does not require that views are mutually exclusive in the way defined in this paper. In [9], he solves the latter problem. Here, he shows that reasoning through partial model construction can be used for handling uncertainty and for reason maintenance. In [8], Roos no longer assumes that a reference class contains an infinite number of objects. Since he does not take into account the dependency of statistical information on the reference class, this approach leads to counter-intuitive results if we do not exactly know the number of objects in a reference class. Furthermore, since Roos requires that all views have the same objects, it also leads to conceptual difficulties. Knowing that there are 4 or 5 houses in a street, what does the fifth object denote in the view where we only know of 4 houses? Bacchus, Grove, Halpern and Koller [1, 4, 2] propose to define a probability distribution over the set of interpretations.
To assure that the interpretations are mutually exclusive, they introduce an important restriction on the set of interpretations. They assume that all interpretations possess the same set of objects. Furthermore, they assume that identical objects in different interpretations denote the same entity in the world. Though similar assumptions have been made in this paper, there is an important difference. The partial models proposed here introduce objects to denote specific entities in the world. By doing so, we implicitly assume that each entity in the world about

which we receive information can somehow be distinguished. When we receive information stating that there are 5 houses in a street, we assume that we will be able to identify 5 houses when we enter the street. The objects of a semantic interpretation do not play the same role as the objects of a partial model. In an interpretation, objects do not by themselves denote entities in the world. They only identify entities in the world through their relations with other objects. In this light, a set of objects shared by all interpretations plays an odd role. It implies that we know of the existence of each object in the set of objects, but we do not know which entity in the world is represented by the object. So, an object can denote a person, a virus, a dragon or whatever in different interpretations. Another odd assumption that Bacchus et al. make is that the number of objects approximates infinity. Though there are of course many objects, we cannot claim that there are infinitely many objects. Assuming a finite number of objects, however, raises problems, since every day new objects are created while others disappear. As pointed out in the previous section, the probability measure should not depend on the vocabulary. The random worlds approach described in [1, 4, 2] does not meet this requirement if finite numbers of objects are allowed for. Another requirement formulated in the previous section is that the probability measure may not depend on the number of objects. Extending the set of objects of an interpretation with a new object may not influence the probability measure. This should particularly hold for the probability that n objects possess a property φ, i.e. that φ(o) holds for n objects o, if a new object o′ does not possess this property. From this requirement, we can derive the following result.
Theorem 4 If the probability that n objects possess a property φ does not change after adding a new object o′ for which ¬φ(o′) holds, then for each n ∈ ℕ there is a p ∈ [0, 1] such that for each k ∈ ℕ, k ≥ n, there holds:

P([(#n) x] φ(x) | [(#k) x] true) = p

Notice that [(#k) x] true denotes that the world consists of k objects. If we would restrict ourselves to monadic predicate logic and apply the insufficient reason argument to the number of objects possessing the property φ, we would get the random propensity approach proposed in [1]. How to extend the random propensity approach to non-monadic predicate logic is, however, unclear. Kyburg [6] uses an approach based on counting models to give a semantics for direct inference. Direct inference is "inference from a general statistical or analogical premise to a conclusion that concerns the case" [6]. In his semantic definition, Kyburg does not start from the models of the premises. Instead, he considers several broader sets of models. The reason is that, according to Kyburg, logical constraints can give misleading precision. He illustrates this with the following two formulas: the proportion of 50-year-old females in the USA that live for more than two years lies in the interval [0.945, 0.965], and the proportion of 50-year-old females in Rochester (USA) that live for more than two years lies in the interval [0.930, 0.975]. Kyburg argues that the statistical information about the 50-year-old females in the USA is better than the statistical information about the 50-year-old females in Rochester. Therefore, the latter information should be ignored. This is, however, one possible interpretation of the provided statistical information. We could also argue that, for the 50-year-old females in Rochester, a proportion outside the interval given for the USA is possible. Therefore, the 50-year-old females in Rochester may represent an exception to the 50-year-old females in the USA. Smets and Robert [10] propose the transferable belief model as a way to handle an agent's beliefs.
As we have noted at the end of Section 4, the hierarchy of views can exhibit the same behavior as the transferable belief model after receiving new information. The causes of these behaviors are different, however. In the transferable belief model, the transfer of belief is the result of assigning belief masses to sets of worlds, while here it is the result of dependencies between formulas.

7 Conclusion

In this paper a subjective probability measure has been defined by constructing epistemic states using the premises and by assigning probabilities to these states using the insufficient reason principle. In this way subjective probability has been defined in an objective way. There are still some issues that are open for further research. Firstly, the way a group of objects is represented in a partial model (actually, the way it is not represented) can be improved. Secondly, a hierarchy of views has been used to represent uncertainty about the properties of objects. Is the proposed hierarchy the best way to represent this uncertainty? Thirdly, after eliminating a sub-view of a view, e.g. because of new information, the likelihood of the view does not change as long as there remain other sub-views. Is this behavior correct in all circumstances? Fourthly, on the pragmatic side, improvements are possible and even required if we wish to use this approach for practical purposes. Humans cannot handle much more than 7 views at the same time [5]. This means that we can consider selecting 1 of 7 objects, or 2 of 4 objects. Computers can perform slightly better. Since selecting 10 of 20 objects can already result in 184,756 different views, a computer will soon reach its limits. To cope with this problem, we might assign weights to views. A weighted view should summarize a large number of views. A last issue for further research is the incorporation of frequency information, such as "the bus is often too late", in the proposed approach.

REFERENCES

[1] F. Bacchus, A. J. Grove, J. Y. Halpern, D. Koller, From statistics to beliefs, AAAI-92 (1992).
[2] F. Bacchus, A. J. Grove, J. Y. Halpern, D. Koller, From statistical knowledge bases to degrees of belief, Artificial Intelligence 87 (1996).
[3] R. Carnap, Logical foundations of probability, 2nd edition, The University of Chicago Press (1962).
[4] A. J. Grove, J. Y. Halpern, D. Koller, Random worlds and maximum entropy, Journal of Artificial Intelligence Research 2 (1994).
[5] P. N. Johnson-Laird, Mental models: Toward a cognitive science of language, inference and consciousness, Cambridge University Press, Cambridge (1983).
[6] H. E. Kyburg, Combinatorial semantics: semantics for frequent validity, Computational Intelligence 13 (1997).
[7] N. Roos, How to reason with uncertain knowledge, IPMU '90, in: B. Bouchon-Meunier, R. R. Yager, L. A. Zadeh (eds), Uncertainty in knowledge bases, Springer-Verlag (1991).
[8] N. Roos, Reasoning with partial models: construction of partial models and management of uncertainty, in: W. van der Hoek, J.-J. Ch. Meyer, Y. H. Tan, C. Witteveen (eds), Non-monotonic reasoning and partial semantics, Ellis Horwood (1992).
[9] N. Roos, Uncertain, inconsistent and default knowledge: reasoning through the construction of a partial model, Dutch/German workshop on non-monotonic reasoning techniques and their applications (1993).
[10] Ph. Smets, K. Robert, The transferable belief model, Artificial Intelligence 66 (1994).


Preliminaries. Introduction to EF-games. Inexpressivity results for first-order logic. Normal forms for first-order logic Introduction to EF-games Inexpressivity results for first-order logic Normal forms for first-order logic Algorithms and complexity for specific classes of structures General complexity bounds Preliminaries

More information