Reasoning Under Uncertainty

Reasoning Under Uncertainty
Chapters 14 & 15, Part (1): Certainty Factors
Kostas Kontogiannis, E&CE 457

Objectives

This unit investigates techniques that allow an algorithmic process to deduce new facts from a knowledge base with a level of confidence, or measure of belief. These techniques are of particular importance when:
1. The rules in the knowledge base do not produce a conclusion that is certain, even though the rule premises are known to be certain, and/or
2. The premises of the rules are not themselves known to be certain.

The three parts of this unit deal with:
1. Techniques related to certainty factors and their application in rule-based systems
2. Techniques related to measures of belief, their relationship to probabilistic reasoning, and their application in rule-based systems
3. The Dempster-Shafer model for reasoning under uncertainty in rule-based systems

Uncertainty and Evidential Support

In its simplest case, a knowledge base contains rules of the form:

  A & B & C => D

where facts A, B, C are considered to be true (that is, these facts hold with probability 1), and D is asserted in the knowledge base as being true (also with probability 1). For realistic cases, however, domain knowledge has to be modeled in a way that accommodates uncertainty. In other words, we would like to encode domain knowledge using rules of the form:

  A & B & C => D (CF: x1)

where A, B, C are not necessarily certain (i.e. do not necessarily hold with CF = 1).

Issues in Rule-Based Reasoning Under Uncertainty

Many rules may support the same conclusion with various degrees of certainty:

  A1 & A2 & A3 => H (CF = 0.5)
  B1 & B2 & B3 => H (CF = 0.6)

If we assume all of A1, A2, A3, B1, B2, B3 hold, then H is supported with CF(H) = CFcombine(0.5, 0.6). In addition, the premises of a rule to be applied may not hold with absolute certainty (the CF, or probability, associated with a premise is not equal to 1):

  Rule: A1 => H (CF = 0.5)

If during a consultation A1 holds with CF(A1) = 0.3, then H holds with CF(H) = 0.5 * 0.3 = 0.15.
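The two mechanisms just described can be sketched in a few lines of Python. This is an illustrative sketch, not MYCIN code, and the function names are invented here:

```python
# Sketch of the two ideas above: a rule's CF scales the CF of its
# premise, and two positive CFs supporting the same hypothesis are
# combined with CFcombine(x, y) = x + y*(1 - x).

def propagate(rule_cf, premise_cf):
    """CF of the conclusion when the rule fires with an uncertain premise."""
    return rule_cf * premise_cf

def combine_positive(x, y):
    """Combine two positive CFs that support the same hypothesis H."""
    return x + y * (1 - x)

# Rule A1 => H (CF=0.5) firing with CF(A1) = 0.3:
print(propagate(0.5, 0.3))         # 0.15
# Two rules support H with CF 0.5 and 0.6:
print(combine_positive(0.5, 0.6))  # 0.8
```

Note that `combine_positive` is commutative and never exceeds 1, which is why several weak confirmations accumulate gracefully.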

The Certainty Factor Model

Design goals:
- The potential for a single piece of negative evidence should not overwhelm several pieces of positive evidence, and vice versa.
- The computational expense of storing MBs and MDs should be avoided; instead, a cumulative CF value is maintained.

Simple model:

  CF = MB - MD
  CFcombine(X, Y) = X + Y(1 - X)

The problem is that a single piece of negative evidence overwhelms several pieces of positive evidence.

The Revised CF Model

  CF = (MB - MD) / (1 - min(MB, MD))

  CFcombine(X, Y) =
    X + Y(1 - X)                      if X, Y > 0
    (X + Y) / (1 - min(|X|, |Y|))     if one of X, Y < 0
    -CFcombine(-X, -Y)                if X, Y < 0
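The revised combination function above can be written directly as a small recursive function. This is a minimal sketch assuming CFs lie in [-1, 1], not MYCIN's actual implementation:

```python
# Sketch of the revised CFcombine: the mixed-sign case divides by
# 1 - min(|X|, |Y|), so one negative piece of evidence cannot
# overwhelm several positive ones.

def cf_combine(x, y):
    if x > 0 and y > 0:              # both pieces of evidence confirm
        return x + y * (1 - x)
    if x < 0 and y < 0:              # both disconfirm: mirror of the first case
        return -cf_combine(-x, -y)
    # mixed signs (or a zero): damped sum
    return (x + y) / (1 - min(abs(x), abs(y)))

print(cf_combine(0.6, 0.4))    # ≈ 0.76
print(cf_combine(0.6, -0.4))   # (0.6 - 0.4) / (1 - 0.4) ≈ 0.333
print(cf_combine(-0.6, -0.4))  # ≈ -0.76
```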

Additional Uses of CFs

CFs also provide methods for search termination. Consider rules R1, R2, R3, R4 linking the facts A, B, C, D, E in an inference chain. In the case of branching, the inference sequencing paths should be kept distinct.

Cutoff in Complex Inferences

Suppose rules R1 through R4 form the inference path (E, D, C, B, A), and rule R5 contributes a second path through F into C. We should then maintain two paths for the cutoff threshold (0.2): one being (E, D, C, B, A) and the other (F, C, B, A). If we kept only one path, then E, D, C would drop to 0.19 and make C unusable later in the path (F, C, B, A).

Reasoning Under Uncertainty
Part (2): Measures of Belief
Kostas Kontogiannis, E&CE 457

Terminology

The units of belief are the same as in probability theory. If the sum of all evidence is represented by e, and d is the diagnosis (hypothesis) under consideration, then the probability P(d|e) is interpreted as the probabilistic measure of belief, or strength, that the hypothesis d holds given the evidence e. In this context:

  P(d):   the a-priori probability that hypothesis d occurs
  P(e|d): the probability that the evidence represented by e is present, given that the hypothesis (i.e. disease) holds

Analyzing and Using Sequential Evidence

Let e1 be the set of observations to date, and s1 be some new piece of data. Furthermore, let e be the new set of observations once s1 has been added to e1. Then:

  P(di|e) = P(s1|di & e1) P(di|e1) / Sum_j [ P(s1|dj & e1) P(dj|e1) ]

P(d|e) = x is interpreted as: IF you observe symptom e THEN conclude hypothesis d with probability x.

Requirements

It is practically impossible to obtain measurements for P(sk|dj) for each of the pieces of data sk in e, and for the inter-relationships of the sk within each possible hypothesis dj. Instead, we would like to obtain a measurement of P(di|e) in terms of P(di|sk), where e is the composite of all the observed sk.
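The sequential update above is just Bayes' theorem with a normalizing sum over all hypotheses. A minimal sketch, assuming we have the priors P(dj|e1) and the likelihoods P(s1|dj & e1) for every hypothesis dj:

```python
# Sketch of the sequential-evidence update: `priors` holds P(d_j | e1)
# for each hypothesis d_j, `likelihoods` holds P(s1 | d_j & e1); the
# result is the list of posteriors P(d_j | e) after observing s1.

def sequential_update(priors, likelihoods):
    joint = [p * l for p, l in zip(priors, likelihoods)]
    total = sum(joint)          # normalizing sum over all hypotheses d_j
    return [j / total for j in joint]

# Two competing diagnoses; the new datum s1 favors the second one:
posteriors = sequential_update([0.7, 0.3], [0.2, 0.9])
print(posteriors)               # ≈ [0.341, 0.659]
```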

Advantages of Using Rules in Uncertainty Reasoning

- The use of general knowledge and abstractions in the problem domain
- The use of judgmental knowledge
- Ease of modification and fine-tuning
- Facilitated search for potential inconsistencies and contradictions in the knowledge base
- Straightforward mechanisms for explaining decisions
- An augmented instructional capability

Measuring Uncertainty

- Probability theory
- Confirmation
  - Classificatory: the evidence e confirms the hypothesis h
  - Comparative: e1 confirms h more strongly than e2 confirms h, or e confirms h1 more strongly than e confirms h2
  - Quantitative: e confirms h with strength x, usually denoted C[h,e]. In this context C[h,e] is not equal to 1 - C[~h,e]
- Fuzzy sets

Model of Evidential Strength

A quantification scheme for modeling inexact reasoning, using the concepts of belief and disbelief as units of measurement. The terminology is based on:

  MB[h,e] = x : the measure of increased belief in the hypothesis h, based on the evidence e, is x
  MD[h,e] = y : the measure of increased disbelief in the hypothesis h, based on the evidence e, is y

The evidence e need not be an observed event, but may be a hypothesis subject to confirmation. For example, MB[h,e] = 0.7 reflects the extent to which the expert's belief that h is true is increased by the knowledge that e is true. In this sense MB[h,e] = 0 means that the expert has no reason to increase his/her belief in h on the basis of e.

Probability and the Evidential Model

In accordance with subjective probability theory, P(h) reflects the expert's belief in h at any given time; thus 1 - P(h) reflects the expert's disbelief regarding the truth of h. If P(h|e) > P(h), then the observation of e increases the expert's belief in h, while decreasing his/her disbelief regarding the truth of h. In fact, the proportionate decrease in disbelief is given by the ratio:

  (P(h|e) - P(h)) / (1 - P(h))

This ratio is called the measure of increased belief in h resulting from the observation of e, i.e. MB[h,e].

Probability and the Evidential Model (Cont.)

On the other hand, if P(h|e) < P(h), then the observation of e decreases the expert's belief in h, while increasing his/her disbelief regarding the truth of h. The proportionate decrease in this case is:

  (P(h) - P(h|e)) / P(h)

Note that one piece of evidence cannot both favor and disfavor a single hypothesis: when MB[h,e] > 0, MD[h,e] = 0, and when MD[h,e] > 0, MB[h,e] = 0. Furthermore, when P(h|e) = P(h), the evidence is independent of the hypothesis and MB[h,e] = MD[h,e] = 0.

Definitions of the Evidential Model

  MB[h,e] = 1                                        if P(h) = 1
            (max[P(h|e), P(h)] - P(h)) / (1 - P(h))  otherwise

  MD[h,e] = 1                                        if P(h) = 0
            (min[P(h|e), P(h)] - P(h)) / (0 - P(h))  otherwise

  CF[h,e] = MB[h,e] - MD[h,e]
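The three definitions above translate directly into code. A sketch taking the prior P(h) and posterior P(h|e) as inputs (the MD denominator is written as division by P(h) with the sign flipped, which is algebraically the same as dividing by 0 - P(h)):

```python
# Sketch of the MB/MD/CF definitions in terms of P(h) and P(h|e).

def mb(p_h, p_h_given_e):
    """Measure of increased belief in h due to e."""
    if p_h == 1:
        return 1.0
    return (max(p_h_given_e, p_h) - p_h) / (1 - p_h)

def md(p_h, p_h_given_e):
    """Measure of increased disbelief in h due to e."""
    if p_h == 0:
        return 1.0
    return (p_h - min(p_h_given_e, p_h)) / p_h

def cf(p_h, p_h_given_e):
    return mb(p_h, p_h_given_e) - md(p_h, p_h_given_e)

# Evidence that raises P(h) from 0.3 to 0.65:
print(mb(0.3, 0.65))   # ≈ 0.5
print(md(0.3, 0.65))   # 0.0
print(cf(0.3, 0.65))   # ≈ 0.5
```

Because of the max/min clamping, evidence that raises P(h) leaves MD at 0 and evidence that lowers it leaves MB at 0, matching the "cannot both favor and disfavor" property noted above.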

Characteristics of Belief Measures

Range of degrees:

  0 <= MB[h,e] <= 1
  0 <= MD[h,e] <= 1
  -1 <= CF[h,e] <= +1

Evidential strength of mutually exclusive hypotheses:
- If h is shown to be certain: P(h|e) = 1, MB[h,e] = 1, MD[h,e] = 0, CF[h,e] = 1
- If the negation of h is shown to be certain: P(~h|e) = 1, MB[h,e] = 0, MD[h,e] = 1, CF[h,e] = -1

Suggested limits and ranges:

  -1 = CF[h,~h] <= CF[h,e] <= CF[h,h] = +1

Note: MB[~h,e] = 1 if and only if MD[h,e] = 1. For mutually exclusive hypotheses h1 and h2, if MB[h1,e] = 1, then MD[h2,e] = 1.

Lack of evidence:
- MB[h,e] = 0 if h is not confirmed by e (i.e. e and h are independent, or e disconfirms h)
- MD[h,e] = 0 if h is not disconfirmed by e (i.e. e and h are independent, or e confirms h)
- CF[h,e] = 0 if e neither confirms nor disconfirms h (i.e. e and h are independent)

More Characteristics of Belief Measures

  CF[h,e] + CF[~h,e] =/= 1
  MB[h,e] = MD[~h,e]

The Belief Measure Model as an Approximation

Suppose e = s1 & s2 and that evidence e confirms d. Then:

  CF[d,e] = MB[d,e] - 0
          = (P(d|e) - P(d)) / (1 - P(d))
          = (P(d|s1&s2) - P(d)) / (1 - P(d))

which means we would still need to keep probability measurements and, moreover, we would need to keep MBs and MDs.

Defining Criteria for the Approximation

- MB[h, e+] increases toward 1 as confirming evidence is found, equaling 1 if and only if a piece of evidence logically implies h with certainty.
- MD[h, e-] increases toward 1 as disconfirming evidence is found, equaling 1 if and only if a piece of evidence logically implies ~h with certainty.
- CF[h, e-] <= CF[h, e- & e+] <= CF[h, e+]
- If MB[h, e+] = 1, then MD[h, e-] = 0 and CF[h, e+] = 1.
- If MD[h, e-] = 1, then MB[h, e+] = 0 and CF[h, e-] = -1.
- The case where MB[h, e+] = MD[h, e-] = 1 is contradictory, and hence CF is undefined.

If s1 & s2 indicates an ordered observation of evidence, first s1 then s2, then:

  MB[h, s1&s2] = MB[h, s2&s1]
  MD[h, s1&s2] = MD[h, s2&s1]
  CF[h, s1&s2] = CF[h, s2&s1]

If s2 denotes a piece of potential evidence, the truth or falsity of which is unknown:

  MB[h, s1&s2] = MB[h, s1]
  MD[h, s1&s2] = MD[h, s1]
  CF[h, s1&s2] = CF[h, s1]

Combining Functions

  MB[h, s1&s2] = 0                                     if MD[h, s1&s2] = 1
                 MB[h, s1] + MB[h, s2](1 - MB[h, s1])  otherwise

  MD[h, s1&s2] = 0                                     if MB[h, s1&s2] = 1
                 MD[h, s1] + MD[h, s2](1 - MD[h, s1])  otherwise

  MB[h1 or h2, e] = max(MB[h1, e], MB[h2, e])
  MD[h1 or h2, e] = min(MD[h1, e], MD[h2, e])

For a rule whose premise s1 is itself uncertain (MB'[h, s1] being the measure that would apply if s1 were certain):

  MB[h, s1] = MB'[h, s1] * max(0, CF[s1, e])
  MD[h, s1] = MD'[h, s1] * max(0, CF[s1, e])

Probabilistic Reasoning and Certainty Factors (Revisited)

Of the methods for utilizing evidence to select diagnoses or decisions, probability theory has the firmest appeal. The usefulness of Bayes' theorem, however, is limited by practical difficulties related to the volume of data required to compute the a-priori probabilities used in the theorem. On the other hand, CFs, MBs, and MDs offer an intuitive, yet informal, way of dealing with reasoning under uncertainty. The MYCIN model tries to combine these two areas (probabilistic reasoning and CFs) by providing a semi-formal bridge (theory) between them.
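Two of the combining functions above can be sketched as follows. The function names are illustrative, and the measure MB'[h, s1] (the rule's strength assuming a certain premise) is passed in as `mb_rule`:

```python
# Sketch: accumulating MB over evidence observed in sequence, and
# scaling a rule's MB by the certainty CF[s1, e] of its premise.

def mb_sequence(mb1, mb2, md_total=0.0):
    """MB[h, s1&s2]; forced to 0 if disbelief is already certain."""
    if md_total == 1.0:
        return 0.0
    return mb1 + mb2 * (1 - mb1)

def mb_uncertain_premise(mb_rule, cf_premise):
    """MB contributed when the premise s1 holds only with CF[s1, e]."""
    return mb_rule * max(0.0, cf_premise)

print(mb_sequence(0.4, 0.5))            # ≈ 0.7
print(mb_uncertain_premise(0.7, 0.3))   # ≈ 0.21
print(mb_uncertain_premise(0.7, -0.2))  # 0.0
```

The max(0, ...) clamp means a disconfirmed premise simply contributes nothing, rather than contributing negative belief.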

A Simple Probability Model (The MYCIN Model Prelude)

Consider a finite population of n members. Members of the population may possess one or more of several properties that define subpopulations, or sets. Properties of interest might be e1 or e2, which may be evidence for or against a diagnosis h. The number of individuals with a certain property, say e, is denoted n(e), and the number with both of two properties e1 and e2 is denoted n(e1 & e2). Probabilities can be computed as ratios of such counts, e.g. P(e) = n(e)/n and P(h|e) = n(e & h)/n(e).

A Simple Probability Model (Cont.)

From the above we observe that:

  P(h|e) = n(e & h)/n(e) = [n(e & h)/n(h)] * [n(h)/n] / [n(e)/n]

so a convenient form of Bayes' theorem is:

  P(h|e) = P(e|h) P(h) / P(e)

If two pieces of evidence e1 and e2 bear on a hypothesis h, and we assume e1 and e2 are independent (both overall and within the subpopulation having h), then the following ratios hold:

  n(e1 & e2)/n = [n(e1)/n] * [n(e2)/n]

and

  n(e1 & e2 & h)/n(h) = [n(e1 & h)/n(h)] * [n(e2 & h)/n(h)]

Simple Probability Model (Cont.)

With the above, the right-hand side of Bayes' theorem becomes:

  P(e1 & e2|h) / P(e1 & e2) = [P(e1|h)/P(e1)] * [P(e2|h)/P(e2)]

The idea is to ask the experts to estimate the ratios P(ei|h)/P(ei) (equivalently, P(h|ei)/P(h)) and the prior P(h), and from these compute P(h|e1 & e2 & ... & en). These ratios should be in the range [0, 1/P(h)]. In this context MB[h,e] = 1 when all individuals with e have disease h, and MD[h,e] = 1 when no individual with e has h.

Adding New Evidence

Serially adjusting the probability of a hypothesis with new evidence against the hypothesis:

  P(h|e') = [P(ei|h)/P(ei)] * P(h|e)

or with new evidence favoring the hypothesis:

  P(h|e') = 1 - [P(ei|~h)/P(ei)] * [1 - P(h|e)]
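The two serial adjustment formulas above are one-liners. A hedged sketch, assuming the expert supplies the relevant ratio for each new piece of evidence ei (function names are invented here):

```python
# Sketch of serial probability adjustment: each new datum e_i updates
# the running posterior P(h|e) via an expert-supplied ratio.

def adjust_against(p_h_given_e, ratio_e_given_h):
    """P(h|e') = [P(e_i|h)/P(e_i)] * P(h|e)."""
    return ratio_e_given_h * p_h_given_e

def adjust_for(p_h_given_e, ratio_e_given_not_h):
    """P(h|e') = 1 - [P(e_i|~h)/P(e_i)] * [1 - P(h|e)]."""
    return 1 - ratio_e_given_not_h * (1 - p_h_given_e)

print(adjust_against(0.4, 0.5))  # 0.2
print(adjust_for(0.4, 0.5))      # ≈ 0.7
```

A ratio below 1 in `adjust_against` shrinks the posterior toward 0, while a ratio below 1 in `adjust_for` shrinks the remaining doubt 1 - P(h|e), pushing the posterior toward 1.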

Measures of Belief and Probabilities

We can then define MB and MD as:

  MB[h,e] = 1 - P(e|~h)/P(e)
  MD[h,e] = 1 - P(e|h)/P(e)

The MYCIN Model

  MB[h1 & h2, e] = min(MB[h1,e], MB[h2,e])
  MD[h1 & h2, e] = max(MD[h1,e], MD[h2,e])
  MB[h1 or h2, e] = max(MB[h1,e], MB[h2,e])
  MD[h1 or h2, e] = min(MD[h1,e], MD[h2,e])

  1 - MB[h, e1 & e2] = (1 - MB[h,e1]) * (1 - MB[h,e2])
  1 - MD[h, e1 & e2] = (1 - MD[h,e1]) * (1 - MD[h,e2])

  CF(h, ef & ea) = MB[h, ef] - MD[h, ea]
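A short sketch of the MYCIN combining functions above, plus a check that the product form for accumulating evidence agrees with the incremental formula MB1 + MB2*(1 - MB1) used earlier:

```python
# Sketch of MYCIN's min/max combiners for hypothesis conjunction and
# disjunction, and the product form for accumulating evidence.

def mb_and(mb1, mb2): return min(mb1, mb2)   # MB[h1 & h2, e]
def mb_or(mb1, mb2):  return max(mb1, mb2)   # MB[h1 or h2, e]

def mb_accumulate(mb1, mb2):                 # from 1 - MB = (1-MB1)(1-MB2)
    return 1 - (1 - mb1) * (1 - mb2)

print(mb_and(0.8, 0.3))   # 0.3
print(mb_or(0.8, 0.3))    # 0.8
# Product form equals the incremental form MB1 + MB2*(1 - MB1):
print(abs(mb_accumulate(0.4, 0.5) - (0.4 + 0.5 * (1 - 0.4))) < 1e-12)  # True
```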

Reasoning Under Uncertainty
Part (3): The Dempster-Shafer Model
Kostas Kontogiannis, E&CE 457

The Dempster-Shafer Model

So far we have described techniques, all of which consider an individual hypothesis (proposition) and assign to it a point estimate in the form of a CF. An alternative technique is to consider sets of propositions and assign to them an interval of the form [Belief, Plausibility], that is, [Bel(p), 1 - Bel(~p)].

Belief and Plausibility

Belief (denoted Bel) measures the strength of the evidence in favor of a set of hypotheses. It ranges from 0 (indicating no support) to 1 (indicating certainty). Plausibility (denoted Pl) is defined as:

  Pl(s) = 1 - Bel(~s)

Plausibility also ranges from 0 to 1, and measures the extent to which evidence in favor of ~s leaves room for belief in s. In particular, if we have certain evidence in favor of ~s, then Bel(~s) = 1 and Pl(s) = 0, which tells us that the only possible value for Bel(s) is 0.

Objectives for Belief and Plausibility

To define Belief and Plausibility more formally, we start with an exhaustive universe of mutually exclusive hypotheses in our diagnostic domain. We call this set the frame of discernment and denote it Theta. Our goal is to attach some measure of belief to elements of Theta. In addition, since the elements of Theta are mutually exclusive, evidence in favor of some may have an effect on our belief in the others. The key function we use to measure the belief in elements of Theta is a probability density function, which we denote m.

The Probability Density Function in the Dempster-Shafer Model

The probability density function m used in the Dempster-Shafer model is defined not just for the elements of Theta but for all subsets of it. The quantity m(p) measures the amount of belief that is currently assigned to exactly the set p of hypotheses. If Theta contains n elements, there are 2^n subsets of Theta. We must assign m so that the sum of all the m values assigned to subsets of Theta equals 1. Although dealing with 2^n subsets may appear intractable, it usually turns out that many of the subsets never need to be considered, because they have no significance in a particular consultation and so their m value is 0.

Defining Belief in Terms of the Function m

Having defined m, we can now define Bel(p) for a set p as the sum of the values of m for p and for all its subsets. Thus Bel(p) is our overall belief that the correct answer lies somewhere in the set p. In order to use m, and thus Bel and Pl, in reasoning programs, we need functions that enable us to combine m's that arise from multiple sources of evidence. The combination of belief functions m1 and m2 is supported by the Dempster-Shafer model and results in a new belief function m3.
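Bel and Pl follow directly from m. A sketch representing each subset of the frame of discernment as a Python frozenset (the hypothesis names are made up for illustration):

```python
# Sketch of Bel and Pl in terms of the mass assignment m, which maps
# frozensets of hypotheses to their m values.

def bel(m, p):
    """Bel(p): total mass on p and all of its subsets."""
    return sum((v for s, v in m.items() if s <= p), 0.0)

def pl(m, p, theta):
    """Pl(p) = 1 - Bel(~p), with ~p taken relative to the frame theta."""
    return 1 - bel(m, theta - p)

theta = frozenset({"flu", "cold", "allergy"})
m = {frozenset({"flu"}): 0.5,          # mass committed exactly to {flu}
     frozenset({"flu", "cold"}): 0.3,
     theta: 0.2}                       # mass left on the whole frame
p = frozenset({"flu", "cold"})
print(bel(m, p))         # 0.8
print(pl(m, p, theta))   # 1.0
```

As the text notes, only the subsets with nonzero m need to be stored, so the 2^n subsets never have to be enumerated explicitly.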

Combining Belief Functions

To combine the belief functions m1 and m2, defined on focal sets X and Y respectively, we use the following formula (Dempster's rule of combination):

  m3(Z) = [ Sum over X intersect Y = Z of m1(X) * m2(Y) ] / [ 1 - Sum over X intersect Y = empty of m1(X) * m2(Y) ]

If no intersections of X and Y are empty, then m3 is computed using only the numerator of the fraction above (i.e. we normalize by dividing by 1). If some intersections of X and Y are empty, the numerator is normalized by 1 - k, where k is the sum of the products m1(X) * m2(Y) over those pairs X, Y whose intersection is empty.
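Dempster's rule above can be sketched as a double loop over the focal elements of the two sources, accumulating the conflict mass k as it goes. A minimal illustration (hypothesis names and masses are made up):

```python
# Sketch of Dempster's rule of combination. m1 and m2 map frozenset
# focal elements to their masses; the result m3 is normalized by 1 - k,
# where k is the mass falling on empty intersections.

def combine(m1, m2):
    combined, conflict = {}, 0.0
    for x, mx in m1.items():
        for y, my in m2.items():
            z = x & y
            if z:                         # non-empty intersection
                combined[z] = combined.get(z, 0.0) + mx * my
            else:                         # conflicting mass (part of k)
                conflict += mx * my
    return {z: v / (1 - conflict) for z, v in combined.items()}

A, B = frozenset({"flu"}), frozenset({"cold"})
theta = A | B
m1 = {A: 0.6, theta: 0.4}    # source 1: mostly supports flu
m2 = {B: 0.5, theta: 0.5}    # source 2: mostly supports cold
m3 = combine(m1, m2)
print(m3[A], m3[B], m3[theta])   # ≈ 0.429 0.286 0.286
```

Here k = 0.6 * 0.5 = 0.3 (the mass on the empty intersection of {flu} and {cold}), so every surviving product is divided by 0.7, and the resulting masses again sum to 1.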


A NEW CLASS OF FUSION RULES BASED ON T-CONORM AND T-NORM FUZZY OPERATORS A NEW CLASS OF FUSION RULES BASED ON T-CONORM AND T-NORM FUZZY OPERATORS Albena TCHAMOVA, Jean DEZERT and Florentin SMARANDACHE Abstract: In this paper a particular combination rule based on specified

More information

Application of Evidence Theory and Discounting Techniques to Aerospace Design

Application of Evidence Theory and Discounting Techniques to Aerospace Design Application of Evidence Theory and Discounting Techniques to Aerospace Design Fiona Browne 1, David Bell 1, Weiru Liu 1, Yan Jin 1, Colm Higgins 1, Niall Rooney 2, Hui Wang 2, and Jann Müller 3 1 School

More information

EnM Probability and Random Processes

EnM Probability and Random Processes Historical Note: EnM 503 - Probability and Random Processes Probability has its roots in games of chance, which have been played since prehistoric time. Games and equipment have been found in Egyptian

More information

On Conditional Independence in Evidence Theory

On Conditional Independence in Evidence Theory 6th International Symposium on Imprecise Probability: Theories and Applications, Durham, United Kingdom, 2009 On Conditional Independence in Evidence Theory Jiřina Vejnarová Institute of Information Theory

More information

CERTAINTY FACTORS. Expert System Lab Work. Lecture #6

CERTAINTY FACTORS. Expert System Lab Work. Lecture #6 CERTAINTY FACTORS Expert System Lab Work Lecture #6 Uncertanity There is a uncertainty in the mind Of experts when make a decision accordance their expertise Representation Uncertainty Basis Concept of

More information

1) partial observability (road state, other drivers' plans, etc.) 4) immense complexity of modelling and predicting tra_c

1) partial observability (road state, other drivers' plans, etc.) 4) immense complexity of modelling and predicting tra_c UNIT III Reasoning under uncertainty: Logics of non-monotonic reasoning - Implementation- Basic probability notation - Bayes rule Certainty factors and rule based systems-bayesian networks Dempster - Shafer

More information

PROBABILISTIC LOGIC. J. Webster (ed.), Wiley Encyclopedia of Electrical and Electronics Engineering Copyright c 1999 John Wiley & Sons, Inc.

PROBABILISTIC LOGIC. J. Webster (ed.), Wiley Encyclopedia of Electrical and Electronics Engineering Copyright c 1999 John Wiley & Sons, Inc. J. Webster (ed.), Wiley Encyclopedia of Electrical and Electronics Engineering Copyright c 1999 John Wiley & Sons, Inc. PROBABILISTIC LOGIC A deductive argument is a claim of the form: If P 1, P 2,...,andP

More information

Confirmation Theory. Pittsburgh Summer Program 1. Center for the Philosophy of Science, University of Pittsburgh July 7, 2017

Confirmation Theory. Pittsburgh Summer Program 1. Center for the Philosophy of Science, University of Pittsburgh July 7, 2017 Confirmation Theory Pittsburgh Summer Program 1 Center for the Philosophy of Science, University of Pittsburgh July 7, 2017 1 Confirmation Disconfirmation 1. Sometimes, a piece of evidence, E, gives reason

More information

UNCERTAINTY. In which we see what an agent should do when not all is crystal-clear.

UNCERTAINTY. In which we see what an agent should do when not all is crystal-clear. UNCERTAINTY In which we see what an agent should do when not all is crystal-clear. Outline Uncertainty Probabilistic Theory Axioms of Probability Probabilistic Reasoning Independency Bayes Rule Summary

More information

Recall from last time: Conditional probabilities. Lecture 2: Belief (Bayesian) networks. Bayes ball. Example (continued) Example: Inference problem

Recall from last time: Conditional probabilities. Lecture 2: Belief (Bayesian) networks. Bayes ball. Example (continued) Example: Inference problem Recall from last time: Conditional probabilities Our probabilistic models will compute and manipulate conditional probabilities. Given two random variables X, Y, we denote by Lecture 2: Belief (Bayesian)

More information

CSCE 222 Discrete Structures for Computing. Review for Exam 1. Dr. Hyunyoung Lee !!!

CSCE 222 Discrete Structures for Computing. Review for Exam 1. Dr. Hyunyoung Lee !!! CSCE 222 Discrete Structures for Computing Review for Exam 1 Dr. Hyunyoung Lee 1 Topics Propositional Logic (Sections 1.1, 1.2 and 1.3) Predicate Logic (Sections 1.4 and 1.5) Rules of Inferences and Proofs

More information

IN THIS paper we investigate the diagnosability of stochastic

IN THIS paper we investigate the diagnosability of stochastic 476 IEEE TRANSACTIONS ON AUTOMATIC CONTROL, VOL 50, NO 4, APRIL 2005 Diagnosability of Stochastic Discrete-Event Systems David Thorsley and Demosthenis Teneketzis, Fellow, IEEE Abstract We investigate

More information

CSC Discrete Math I, Spring Propositional Logic

CSC Discrete Math I, Spring Propositional Logic CSC 125 - Discrete Math I, Spring 2017 Propositional Logic Propositions A proposition is a declarative sentence that is either true or false Propositional Variables A propositional variable (p, q, r, s,...)

More information

13.4 INDEPENDENCE. 494 Chapter 13. Quantifying Uncertainty

13.4 INDEPENDENCE. 494 Chapter 13. Quantifying Uncertainty 494 Chapter 13. Quantifying Uncertainty table. In a realistic problem we could easily have n>100, makingo(2 n ) impractical. The full joint distribution in tabular form is just not a practical tool for

More information

Y. Xiang, Inference with Uncertain Knowledge 1

Y. Xiang, Inference with Uncertain Knowledge 1 Inference with Uncertain Knowledge Objectives Why must agent use uncertain knowledge? Fundamentals of Bayesian probability Inference with full joint distributions Inference with Bayes rule Bayesian networks

More information

Rough Set Theory Fundamental Assumption Approximation Space Information Systems Decision Tables (Data Tables)

Rough Set Theory Fundamental Assumption Approximation Space Information Systems Decision Tables (Data Tables) Rough Set Theory Fundamental Assumption Objects from the domain are perceived only through the values of attributes that can be evaluated on these objects. Objects with the same information are indiscernible.

More information

Single Maths B: Introduction to Probability

Single Maths B: Introduction to Probability Single Maths B: Introduction to Probability Overview Lecturer Email Office Homework Webpage Dr Jonathan Cumming j.a.cumming@durham.ac.uk CM233 None! http://maths.dur.ac.uk/stats/people/jac/singleb/ 1 Introduction

More information

CogSysI Lecture 9: Non-Monotonic and Human Reasoning

CogSysI Lecture 9: Non-Monotonic and Human Reasoning CogSysI Lecture 9: Non-Monotonic and Human Reasoning Intelligent Agents WS 2004/2005 Part II: Inference and Learning Non-Monotonic and Human Reasoning CogSysI Lecture 9: Non-Monotonic and Human Reasoning

More information

P Q (P Q) (P Q) (P Q) (P % Q) T T T T T T T F F T F F F T F T T T F F F F T T

P Q (P Q) (P Q) (P Q) (P % Q) T T T T T T T F F T F F F T F T T T F F F F T T Logic and Reasoning Final Exam Practice Fall 2017 Name Section Number The final examination is worth 100 points. 1. (10 points) What is an argument? Explain what is meant when one says that logic is the

More information

Lecture 2. Logic Compound Statements Conditional Statements Valid & Invalid Arguments Digital Logic Circuits. Reading (Epp s textbook)

Lecture 2. Logic Compound Statements Conditional Statements Valid & Invalid Arguments Digital Logic Circuits. Reading (Epp s textbook) Lecture 2 Logic Compound Statements Conditional Statements Valid & Invalid Arguments Digital Logic Circuits Reading (Epp s textbook) 2.1-2.4 1 Logic Logic is a system based on statements. A statement (or

More information

Computation and Logic Definitions

Computation and Logic Definitions Computation and Logic Definitions True and False Also called Boolean truth values, True and False represent the two values or states an atom can assume. We can use any two distinct objects to represent

More information

Examine characteristics of a sample and make inferences about the population

Examine characteristics of a sample and make inferences about the population Chapter 11 Introduction to Inferential Analysis Learning Objectives Understand inferential statistics Explain the difference between a population and a sample Explain the difference between parameter and

More information

Knowledge Representation. Propositional logic

Knowledge Representation. Propositional logic CS 2710 Foundations of AI Lecture 10 Knowledge Representation. Propositional logic Milos Hauskrecht milos@cs.pitt.edu 5329 Sennott Square Knowledge-based agent Knowledge base Inference engine Knowledge

More information

This article is published in Journal of Multiple-Valued Logic and Soft Computing 2008, 15(1), 5-38.

This article is published in Journal of Multiple-Valued Logic and Soft Computing 2008, 15(1), 5-38. Audun Jøsang. Conditional Reasoning with Subjective Logic. This article is published in Journal of Multiple-Valued Logic and Soft Computing 2008, 15(1), 5-38. Published in DUO with permission from Old

More information

Desire-as-belief revisited

Desire-as-belief revisited Desire-as-belief revisited Richard Bradley and Christian List June 30, 2008 1 Introduction On Hume s account of motivation, beliefs and desires are very di erent kinds of propositional attitudes. Beliefs

More information

Non-Axiomatic Logic (NAL) Specification. Pei Wang

Non-Axiomatic Logic (NAL) Specification. Pei Wang Non-Axiomatic Logic (NAL) Specification Pei Wang October 30, 2009 Contents 1 Introduction 1 1.1 NAL and NARS........................ 1 1.2 Structure of NAL........................ 2 1.3 Specifying NAL.........................

More information

Symbolic Logic 3. For an inference to be deductively valid it is impossible for the conclusion to be false if the premises are true.

Symbolic Logic 3. For an inference to be deductively valid it is impossible for the conclusion to be false if the premises are true. Symbolic Logic 3 Testing deductive validity with truth tables For an inference to be deductively valid it is impossible for the conclusion to be false if the premises are true. So, given that truth tables

More information

Introduction to Metalogic

Introduction to Metalogic Philosophy 135 Spring 2008 Tony Martin Introduction to Metalogic 1 The semantics of sentential logic. The language L of sentential logic. Symbols of L: Remarks: (i) sentence letters p 0, p 1, p 2,... (ii)

More information

Reasoning Under Uncertainty: Conditioning, Bayes Rule & the Chain Rule

Reasoning Under Uncertainty: Conditioning, Bayes Rule & the Chain Rule Reasoning Under Uncertainty: Conditioning, Bayes Rule & the Chain Rule Alan Mackworth UBC CS 322 Uncertainty 2 March 13, 2013 Textbook 6.1.3 Lecture Overview Recap: Probability & Possible World Semantics

More information

Building Bayesian Networks. Lecture3: Building BN p.1

Building Bayesian Networks. Lecture3: Building BN p.1 Building Bayesian Networks Lecture3: Building BN p.1 The focus today... Problem solving by Bayesian networks Designing Bayesian networks Qualitative part (structure) Quantitative part (probability assessment)

More information

BIVARIATE P-BOXES AND MAXITIVE FUNCTIONS. Keywords: Uni- and bivariate p-boxes, maxitive functions, focal sets, comonotonicity,

BIVARIATE P-BOXES AND MAXITIVE FUNCTIONS. Keywords: Uni- and bivariate p-boxes, maxitive functions, focal sets, comonotonicity, BIVARIATE P-BOXES AND MAXITIVE FUNCTIONS IGNACIO MONTES AND ENRIQUE MIRANDA Abstract. We give necessary and sufficient conditions for a maxitive function to be the upper probability of a bivariate p-box,

More information

MODULE -4 BAYEIAN LEARNING

MODULE -4 BAYEIAN LEARNING MODULE -4 BAYEIAN LEARNING CONTENT Introduction Bayes theorem Bayes theorem and concept learning Maximum likelihood and Least Squared Error Hypothesis Maximum likelihood Hypotheses for predicting probabilities

More information

On the teaching and learning of logic in mathematical contents. Kyeong Hah Roh Arizona State University

On the teaching and learning of logic in mathematical contents. Kyeong Hah Roh Arizona State University On the teaching and learning of logic in mathematical contents Kyeong Hah Roh Arizona State University khroh@asu.edu Students understanding of the formal definitions of limit teaching and learning of logic

More information

Classical Belief Conditioning and its Generalization to DSm Theory

Classical Belief Conditioning and its Generalization to DSm Theory Journal of Uncertain Systems Vol.2, No.4, pp.267-279, 2008 Online at: www.jus.org.uk Classical Belief Conditioning and its Generalization to DSm Theory ilan Daniel Institute of Computer Science, Academy

More information

Preliminary Statistics Lecture 2: Probability Theory (Outline) prelimsoas.webs.com

Preliminary Statistics Lecture 2: Probability Theory (Outline) prelimsoas.webs.com 1 School of Oriental and African Studies September 2015 Department of Economics Preliminary Statistics Lecture 2: Probability Theory (Outline) prelimsoas.webs.com Gujarati D. Basic Econometrics, Appendix

More information

Analyzing the Combination of Conflicting Belief Functions.

Analyzing the Combination of Conflicting Belief Functions. Analyzing the Combination of Conflicting Belief Functions. Philippe Smets IRIDIA Université Libre de Bruxelles 50 av. Roosevelt, CP 194-6, 1050 Bruxelles, Belgium psmets@ulb.ac.be http://iridia.ulb.ac.be/

More information

Chapter 2 Class Notes

Chapter 2 Class Notes Chapter 2 Class Notes Probability can be thought of in many ways, for example as a relative frequency of a long series of trials (e.g. flips of a coin or die) Another approach is to let an expert (such

More information

Ch.6 Uncertain Knowledge. Logic and Uncertainty. Representation. One problem with logical approaches: Department of Computer Science

Ch.6 Uncertain Knowledge. Logic and Uncertainty. Representation. One problem with logical approaches: Department of Computer Science Ch.6 Uncertain Knowledge Representation Hantao Zhang http://www.cs.uiowa.edu/ hzhang/c145 The University of Iowa Department of Computer Science Artificial Intelligence p.1/39 Logic and Uncertainty One

More information

Lecture Notes on The Curry-Howard Isomorphism

Lecture Notes on The Curry-Howard Isomorphism Lecture Notes on The Curry-Howard Isomorphism 15-312: Foundations of Programming Languages Frank Pfenning Lecture 27 ecember 4, 2003 In this lecture we explore an interesting connection between logic and

More information

Knowledge representation DATA INFORMATION KNOWLEDGE WISDOM. Figure Relation ship between data, information knowledge and wisdom.

Knowledge representation DATA INFORMATION KNOWLEDGE WISDOM. Figure Relation ship between data, information knowledge and wisdom. Knowledge representation Introduction Knowledge is the progression that starts with data which s limited utility. Data when processed become information, information when interpreted or evaluated becomes

More information

A hierarchical fusion of expert opinion in the Transferable Belief Model (TBM) Minh Ha-Duong, CNRS, France

A hierarchical fusion of expert opinion in the Transferable Belief Model (TBM) Minh Ha-Duong, CNRS, France Ambiguity, uncertainty and climate change, UC Berkeley, September 17-18, 2009 A hierarchical fusion of expert opinion in the Transferable Belief Model (TBM) Minh Ha-Duong, CNRS, France Outline 1. Intro:

More information

Proofs. Joe Patten August 10, 2018

Proofs. Joe Patten August 10, 2018 Proofs Joe Patten August 10, 2018 1 Statements and Open Sentences 1.1 Statements A statement is a declarative sentence or assertion that is either true or false. They are often labelled with a capital

More information

Knowledge Representation. Propositional logic.

Knowledge Representation. Propositional logic. CS 1571 Introduction to AI Lecture 10 Knowledge Representation. Propositional logic. Milos Hauskrecht milos@cs.pitt.edu 5329 Sennott Square Announcements Homework assignment 3 due today Homework assignment

More information

A REASONING MODEL BASED ON AN EXTENDED DEMPSTER-SHAFER THEORY *

A REASONING MODEL BASED ON AN EXTENDED DEMPSTER-SHAFER THEORY * From: AAAI-86 Proceedings. Copyright 1986, AAAI (www.aaai.org). All rights reserved. A REASONING MODEL BASED ON AN EXTENDED DEMPSTER-SHAFER THEORY * John Yen Computer Science Division Department of Electrical

More information

Paucity, abundance, and the theory of number: Online Appendices

Paucity, abundance, and the theory of number: Online Appendices Paucity, abundance, and the theory of number: Online Appendices Daniel Harbour Language, Volume 90, Number 1, March 2014, pp. s1-s4 (Article) Published by Linguistic Society of America DOI: https://doi.org/10.1353/lan.2014.0019

More information

Seminaar Abstrakte Wiskunde Seminar in Abstract Mathematics Lecture notes in progress (27 March 2010)

Seminaar Abstrakte Wiskunde Seminar in Abstract Mathematics Lecture notes in progress (27 March 2010) http://math.sun.ac.za/amsc/sam Seminaar Abstrakte Wiskunde Seminar in Abstract Mathematics 2009-2010 Lecture notes in progress (27 March 2010) Contents 2009 Semester I: Elements 5 1. Cartesian product

More information