Aggregation of Epistemic Uncertainty
Aggregation of Epistemic Uncertainty: Certainty Factors and Possibility Theory
Koichi Yamada, Nagaoka Univ. of Tech.

What is Epistemic Uncertainty?
- Aleatoric uncertainty (statistical / objective uncertainty): uncertainty related to frequency.
- Epistemic uncertainty: cognitive uncertainty caused by incomplete knowledge or lack of information; subjective uncertainty.
Examples of epistemic uncertainty:
- Q1: The gravitational acceleration in this room: about 9.8 m/s^2.
- Q2: Whether the arrested suspect is the real murderer or not.
The uncertainty contained in such answers cannot be represented by frequency; it is uncertainty regarded as a degree of belief.
Which Uncertainty Does Probability Represent?
Originally, probability was a measure of "frequency." In the 1740s, Thomas Bayes considered a way to deal with degrees of belief within the framework of probability theory, now called Bayesian probability or subjective probability.
Examples of subjective probabilities: "It will rain tomorrow with probability 50%." "Our team will win tomorrow with probability 99%."
These percentages are not frequencies, because tomorrow comes just once; they are our beliefs. Probability can thus be used both for frequency (aleatoric uncertainty) and for degrees of belief (epistemic uncertainty).

Other Theories for Representing Uncertainty
- Possibility theory: a theory related to the adjective "possible."
- Dempster-Shafer theory of evidence: a generalized theory of uncertainty.
- Certainty factors: the uncertainty representation employed in MYCIN (1974).
- Fuzzy set theory: vagueness contained in concepts and words.
- Rough set theory: indiscernibility and approximation due to our limited knowledge.
- Multi-valued logics: truth values between "completely true" and "completely false."
Note: these are all theories for dealing with epistemic uncertainty, which suggests that epistemic uncertainty has many aspects. → Here we focus on degrees of belief.
What is Important for Dealing with Epistemic Uncertainty?
The capability to deal with "ignorance" (an unknown situation) is important.
Example: Suppose there is a gang in a small town, and your old friend Tom has joined it. One day, a murder happens in the town. There is perfect evidence that one of the gang members did it, and no other information is given. How do you represent the uncertainty that "Tom is the murderer"?
Probability theory gives P(Tom) = 1/n, where n is the number of gang members; but we do not know the exact number. Probability cannot represent the uncertainty of this situation.

Representation in Other Theories
- Possibility theory: the uncertain situation is represented by a pair of possibilities: π(Tom) = 1.0 (the possibility that Tom is the murderer is 1.0) and π(¬Tom) = 1.0 (the possibility that Tom is NOT the murderer is also 1.0).
- Dempster-Shafer theory of evidence: uncertainty is represented by two measures: Pl(Tom) = 1.0 (the plausibility that Tom is the murderer is 1.0) and Bel(Tom) = 0.0 (the belief that Tom is the murderer is 0.0).
- Certainty factor model: CF(Tom) = 0, where +1 means perfect affirmation, -1 perfect negation, and 0 unknown (no information).
Aggregation of Epistemic Uncertainty
In everyday decision-making, we frequently gather multiple pieces of uncertain information and aggregate them.
- Which beach resort shall we go to during the next vacation?
- Which job should I choose among multiple offers?
- Which is telling the truth, President Trump or the New York Times?
We need to gather and aggregate much uncertain information to answer such questions.
- Aggregation is one of the most important information-processing operations for epistemic uncertainty.
- Many applications need information aggregation: decision-making, affective information processing, sensor fusion, flexible information retrieval, etc.
Only a few theories provide a standard aggregation function:
- Dempster-Shafer theory of evidence: Dempster's rule of combination.
- The certainty factor model.

Certainty Factor Model
The CF was devised to represent the uncertainty of a hypothesis given some evidence, instead of probability, in the famous expert system MYCIN, because:
- probability cannot express the unknown situation (ignorance);
- there is no standard way to aggregate multiple probability distributions derived from multiple pieces of evidence.
The CF model was judged "practical" by many practitioners, but was also criticized harshly by theoreticians as theoretically wrong: there was no sound interpretation of the CF model in the framework of probability theory.
Original Definition of Certainty Factors
The CF of hypothesis h given evidence e, Cf(h,e), takes a value in [-1, +1]:
  +1 : perfect affirmation
  -1 : perfect negation
   0 : neither is supported (unknown, no evidence)
It was originally defined as Cf(h,e) = MB(h,e) - MD(h,e), where
  MB(h,e) : the degree to which belief in h is revised by e toward affirmation
  MD(h,e) : the degree to which belief in h is revised by e toward negation
We do not adopt this definition, because it has been proved to be inconsistent with the aggregation function of CFs.

Aggregation Function of the CF Model
Let x (resp. y) be the CF of hypothesis h given evidence e_x (resp. e_y):
  x = Cf(h, e_x),  y = Cf(h, e_y)
The aggregation (combination) function f_M, with f_M(x, y) = Cf(h, e_x ∧ e_y), is
  f_M(x, y) = x + y - xy,                      if x, y ≥ 0
  f_M(x, y) = x + y + xy,                      if x, y ≤ 0
  f_M(x, y) = (x + y) / (1 - min(|x|, |y|)),   otherwise.
The function is commutative and associative, so when there are multiple CFs the aggregation result does not depend on the order of the sequence.
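As a concrete sketch, this combination rule (the standard MYCIN form) can be written directly in Python; the function name `combine_cf` is mine, not from the slides:

```python
def combine_cf(x, y):
    """MYCIN's combination of two certainty factors x, y in [-1, 1]."""
    if x >= 0 and y >= 0:
        return x + y - x * y                        # both pieces of evidence confirm h
    if x <= 0 and y <= 0:
        return x + y + x * y                        # both disconfirm h
    return (x + y) / (1 - min(abs(x), abs(y)))      # conflicting evidence

# Commutativity and associativity: the grouping does not change the result.
a = combine_cf(combine_cf(0.6, 0.3), -0.4)
b = combine_cf(0.6, combine_cf(0.3, -0.4))
print(round(a, 6), round(b, 6))
```

Here `combine_cf(0.6, 0.3)` gives 0.72, and both groupings with the conflicting CF -0.4 agree.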
New Sound Interpretation of CFs with Possibility Theory
Koichi Yamada: "Aggregation of Epistemic Uncertainty: A New Interpretation of the Certainty Factor with Possibility Theory and Causation Events," SCIS&ISIS 2018 (submitted).
The rest of this presentation discusses a new, sound interpretation of certainty factors using possibility theory.
A certainty factor and a possibility distribution are transformable into each other; both represent the same uncertainty.
We examine new aggregation functions in the framework of possibility theory. One of the aggregation functions is exactly the same as the one used in MYCIN. → This gives a theoretical basis to MYCIN's aggregation function.

Possibility Theory
Possibility is another scale that measures uncertainty with a value in [0, 1], similar to probability. According to L. A. Zadeh:
- Essentially, humans utilize possibility rather than probability in decision-making.
- The vagueness contained in natural language is principally possibilistic.
Possibility therefore seems appropriate for representing epistemic uncertainty.
Some impressive statements about possibility and probability:
- What is impossible is improbable. (Zadeh)
- What is possible can be improbable. (Zadeh)
- What is improbable is not necessarily impossible. (Zadeh)
- What is probable must be possible. (D. Dubois and H. Prade)
Possibility Measure
A possibility measure is defined in a way similar to a probability measure: the algebraic sum of probability is replaced by the max operation.
Axioms of possibility measures (Π : 2^U → [0,1], U the universal set):
  Π(∅) = 0
  Π(U) = 1
  Π(A ∪ B) = max(Π(A), Π(B))   (A and B need not be disjoint)
Properties:
  A ⊆ B implies Π(A) ≤ Π(B)
  N(A) = 1 - Π(A^c)   (necessity measure: A is necessary = "not A" is not possible)
Axioms of probability measures (P : 2^U → [0,1]):
  P(∅) = 0
  P(U) = 1
  P(A ∪ B) = P(A) + P(B), if A ∩ B = ∅
Properties:
  A ⊆ B implies P(A) ≤ P(B)
  P(A) = 1 - P(A^c)
  P(A ∪ B) = P(A) + P(B) - P(A ∩ B)

Possibility Distribution
Both probability and possibility have distribution functions; again, the algebraic sum of probability is replaced by the max operation.
Possibility distribution π : U → [0,1]:
  π(u_i) = Π({u_i})
  Π(A) = max_{u_i ∈ A} π(u_i)
  Π(U) = max_{u_i ∈ U} π(u_i) = max{π(u_1), π(u_2), ..., π(u_n)} = 1
A possibility distribution must be "normal" (its maximum is 1).
Probability distribution p : U → [0,1]:
  p(u_i) = P({u_i})
  P(A) = Σ_{u_i ∈ A} p(u_i)
  P(U) = Σ_{u_i ∈ U} p(u_i) = p(u_1) + p(u_2) + ... + p(u_n) = 1
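The max-versus-sum contrast, and its ability to express ignorance, can be sketched in a few lines of Python (function names are mine):

```python
def possibility(pi, event):
    """Pi(A) under distribution pi: the MAX over the outcomes in A, not a sum."""
    return max(pi[u] for u in event)

def necessity(pi, event):
    """N(A) = 1 - Pi(complement of A)."""
    complement = set(pi) - set(event)
    return 1.0 - possibility(pi, complement) if complement else 1.0

# "Tom is the murderer" under total ignorance: both outcomes fully possible,
# so the distribution is normal (max = 1) even though the values sum to 2.
pi = {"Tom": 1.0, "not Tom": 1.0}
print(possibility(pi, {"Tom"}), necessity(pi, {"Tom"}))  # → 1.0 0.0
```

Possibility 1.0 with necessity 0.0 is exactly the "unknown" state that a single probability value cannot express.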
Conditional Possibility
Conditions and independence are defined in a way similar to probability, though the details differ: the algebraic product of probability is replaced by the min operation.
Conditional possibility Π(B|A):
  Π(A ∩ B) = min(Π(A), Π(B|A))
  Π(B|A) = 1,           if Π(A) = Π(A ∩ B)
  Π(B|A) = Π(A ∩ B),    if Π(A) > Π(A ∩ B)
(1) When A is independent of B: Π(A|B) = Π(A).
(2) When B is independent of A: Π(B|A) = Π(B).
(3) When (1) or (2) holds, A and B are non-interactive: Π(A ∩ B) = min(Π(A), Π(B)).
Conditional probability P(B|A):
  P(A ∩ B) = P(A) P(B|A)
  P(B|A) = P(A ∩ B) / P(A), if P(A) ≠ 0
For probability, A is independent of B iff B is independent of A:
  P(A|B) = P(A) ⟺ P(B|A) = P(B) ⟺ P(A ∩ B) = P(A) P(B)

Hypothesis and Opposite Hypothesis
We introduce the opposite hypothesis (O-hypothesis) k to a hypothesis h. Example: which is the murderer, male or female? (h = male, k = female, with "unknown" as a third epistemic state.)
The hypothesis h and O-hypothesis k satisfy the following: h ∨ k is not a tautology, and ¬h ∧ ¬k is not a contradiction.
Note: under the closed world assumption, the assertion "male" ("female") should be rejected if there is no evidence for male (female). If we have no evidence for either, we have to reject both assertions; ¬h ∧ ¬k then represents "unknown" because of the lack of evidence.
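A minimal sketch of the min-based conditioning in Python, following the two-case definition above (names are mine, not from the slides):

```python
def cond_possibility(pi_a_and_b, pi_a):
    """Pi(B|A) from Pi(A ∩ B) and Pi(A), per the two-case min-based definition."""
    if pi_a == pi_a_and_b:
        return 1.0            # Pi(A) = Pi(A ∩ B)  =>  Pi(B|A) = 1
    return pi_a_and_b         # Pi(A) > Pi(A ∩ B)  =>  Pi(B|A) = Pi(A ∩ B)

# The joint is recovered via min, mirroring P(A ∩ B) = P(A) * P(B|A):
pi_a, pi_ab = 0.8, 0.5
pi_b_given_a = cond_possibility(pi_ab, pi_a)
assert min(pi_a, pi_b_given_a) == pi_ab
```

Note that when Π(A) = Π(A ∩ B) the conditional jumps to 1, which is what makes min (rather than a quotient) recover the joint.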
Causation Events
Y. Peng and J. A. Reggia (1987):
- h : e represents the event that evidence e supports hypothesis h.
- k : d represents the event that evidence d supports O-hypothesis k.

Hypothesis, O-hypothesis, and Causation Events
- E_h : the set of all pieces of evidence that possibly support h.
- E_k : the set of all pieces of evidence that possibly support k.
Hypothesis h is true iff at least one of the possible pieces of evidence supports h and no possible piece of evidence supports k. If we define h and k in this way, they satisfy the formulae of the previous slide.
Conditional Causation Possibility
The conditional causation possibility is the possibility that evidence e_i supports hypothesis h, given only the evidence e_i. "Only e_i" (written !e_i) means that e_i is present and the other pieces of evidence are not.
Note: we assume the relevant conjunction of causation events is not a contradiction even in that case. A state in which neither evidence e_i nor e_j is present (found) is possible, because we are considering the epistemic world.

Proposition
The possibility that h is true given only evidence e is the same as the possibility that e supports h given only e (because there is no evidence other than e that supports h). Likewise, the possibility that h is false given only e is the same as the possibility that e does not support h given only e.
A Possibility Distribution Represented by a Single Value in [-1, 1]
The possibility distribution of hypothesis h given only e is represented by the pair (π(h|!e), π(¬h|!e)), where max(π(h|!e), π(¬h|!e)) = 1.0.
This possibility distribution can be represented by a single value g_h(h|!e) in [-1, 1], and restored from g_h(h|!e) using the following equations:
  if g_h(h|!e) ≥ 0 : π(h|!e) = 1,               π(¬h|!e) = 1 - g_h(h|!e)
  if g_h(h|!e) ≤ 0 : π(h|!e) = 1 + g_h(h|!e),   π(¬h|!e) = 1

Transformation between g_h(h|!e) and Possibility Distributions
g_h(h|!e) is thus regarded as a single-value representation of a possibility distribution.
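The two directions of this transformation can be sketched in Python (function names are mine); the single value is recovered from a normal pair as π(h) - π(¬h), which agrees with both cases above:

```python
def to_pair(g):
    """Possibility pair (pi(h|!e), pi(not h|!e)) from a single value g in [-1, 1]."""
    if g >= 0:
        return (1.0, 1.0 - g)       # affirmation side: pi(h) pinned to 1
    return (1.0 + g, 1.0)           # negation side: pi(not h) pinned to 1

def to_g(pi_h, pi_not_h):
    """Single-value representation of a normal pair (the larger of the two is 1)."""
    return pi_h - pi_not_h

# Round trip: the two transformations are inverse to each other.
for g in (-1.0, -0.3, 0.0, 0.7, 1.0):
    assert abs(to_g(*to_pair(g)) - g) < 1e-9
```

For example, g = 0.7 corresponds to the pair (1.0, 0.3): h is fully possible, ¬h only possible to degree 0.3.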
Our Definition of Certainty Factors
We define the CF as this single-value representation: Cf(h,e) = g_h(h|!e).

Aggregation of CFs (1)
When two CFs x, y ≥ 0, the evidence e_x and e_y are supposed to support the hypothesis h. In this case, x and y are transformed to possibility distributions. Then we calculate the possibility distribution of h given only e_x and e_y, using the propositions shown before and assuming conditional independence and non-interactivity of causation events:
  π(h|!e_x, e_y) = 1.0
  π(¬h|!e_x, e_y) = min(1 - x, 1 - y)
The above possibility distribution can then be transformed back to a CF.
Aggregation of CFs (2)
When two CFs x, y ≤ 0, the evidence e_x and e_y are supposed to support the O-hypothesis k. In this case, x and y are transformed to possibility distributions. Then we calculate the possibility distribution given only e_x and e_y, using the propositions and assuming conditional independence and non-interactivity of causation events:
  π(k|!e_x, e_y) = 1.0
  π(¬k|!e_x, e_y) = min(1 + x, 1 + y)
The aggregated CF is then obtained by transforming this distribution back.

Aggregation of CFs (3)
When the two CFs satisfy x > 0 > y, the evidence e_x and e_y are supposed to support h and k, respectively. In this case, x and y are transformed to possibility distributions, and the possibility distribution given only e_x and e_y is calculated as before:
  π(h ∧ ¬k|!e_x, e_y) = 1 + y
  π(¬h ∧ k|!e_x, e_y) = 1 - x
The aggregated CF is then obtained from this distribution.
Aggregation Functions of CFs
Summing up the three cases above, we get the first aggregation function (min/max operations, no normalization). The possibility distribution obtained in the case x > 0 > y is not normal, because x and y contradict each other. If we normalize that distribution and then transform it to a CF, we get a second aggregation function (min/max with normalization).

Aggregation Functions of CFs (2)
The standard operations of possibility theory are min and max, but mathematically it is possible to use other t-norms and t-conorms. If we use the algebraic product and sum instead of min and max, we get two further aggregation functions:
- one without normalizing the possibility distribution in the case x > 0 > y;
- one with normalization in that case.
Note: the latter is completely the same as MYCIN's aggregation function. The certainty factor model can thus be justified in the framework of possibility theory.
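The four functions can be sketched in Python. The same-sign cases follow directly from the case analysis above; for the non-normal mixed case without normalization, I assume the single-value reading g = π(h) - π(¬h) = (1 + y) - (1 - x) = x + y, and the function names are mine:

```python
def f_min(x, y):
    """Min/Max operators, no normalization."""
    if x >= 0 and y >= 0:
        return max(x, y)          # 1 - min(1-x, 1-y)
    if x <= 0 and y <= 0:
        return min(x, y)
    return x + y                  # assumed reading of the non-normal pair

def f_m_nor(x, y):
    """Min/Max operators, normalizing the mixed case."""
    if (x >= 0) == (y >= 0):
        return f_min(x, y)
    return (x + y) / (1 - min(abs(x), abs(y)))

def f_alg(x, y):
    """Algebraic product/sum operators, no normalization."""
    if x >= 0 and y >= 0:
        return x + y - x * y      # 1 - (1-x)(1-y)
    if x <= 0 and y <= 0:
        return x + y + x * y
    return x + y                  # same assumption as in f_min

def f_a_nor(x, y):
    """Algebraic product/sum operators, with normalization: MYCIN's rule."""
    if (x >= 0) == (y >= 0):
        return f_alg(x, y)
    return (x + y) / (1 - min(abs(x), abs(y)))
```

For two confirming CFs 0.3 and 0.7, `f_min` keeps the stronger one (0.7), while `f_alg` reinforces them (0.79), which is why the min/max rules are non-decreasing and the algebraic rules strictly increasing.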
Mathematical Properties of the Four Aggregation Functions
The four rules (min/max with and without normalization; algebraic product/sum with and without normalization) are compared on commutativity, associativity, continuity, and monotonicity. All four are commutative; the min/max rules are non-decreasing in each argument, while the algebraic rules are increasing.

Numerical Example (1)
Suppose we get five pieces of evidence for a hypothesis h, whose CFs are given sequentially by 0.3, -0.5, 0.8, 0.4, ...
  f_min  : min/max operators, no normalization
  f_m-nor: min/max operators, with normalization
  f_alg  : algebraic product/sum operators, no normalization
  f_a-nor: algebraic product/sum operators, with normalization
Numerical Example (2)
Aggregation results when the order of the sequence is changed: where associativity fails, the result is affected strongly by the most recent information.

Effects of a Repetitive O-hypothesis with a Low CF
Suppose the hypothesis h first receives a high CF (0.9), and then the O-hypothesis k with a low CF (-0.1) is given repeatedly.
- Hypothesis h : true news (CF = 0.9)
- O-hypothesis k : fake news (CF = -0.1)
Simulation result (first CF = 0.9, then CF = -0.1 repeated 20 times):
- Aggregations without normalization are sensitive to the repetitive O-hypothesis, and the value of the CF decreases rapidly.
- In the min/max case, however, the result is bounded by the low CF (-0.1), while the algebraic case accumulates the CFs of the O-hypothesis.
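The bounded-versus-accumulating behavior can be checked with a small simulation. This is a sketch with hypothetical names; for the mixed-sign case of the two unnormalized rules I assume the single-value reading g = π(h) - π(¬h) = x + y:

```python
def step_min(c, y):
    """One step of the min/max rule without normalization."""
    if c >= 0 and y >= 0:
        return max(c, y)
    if c <= 0 and y <= 0:
        return min(c, y)
    return c + y                   # assumed mixed-sign reading

def step_alg(c, y):
    """One step of the algebraic rule without normalization."""
    if c >= 0 and y >= 0:
        return c + y - c * y
    if c <= 0 and y <= 0:
        return c + y + c * y
    return c + y                   # same assumption

c_min = c_alg = 0.9                # hypothesis h: CF = 0.9
for _ in range(20):                # O-hypothesis k: CF = -0.1, repeated 20 times
    c_min = step_min(c_min, -0.1)
    c_alg = step_alg(c_alg, -0.1)

# min/max bottoms out at the low CF; the algebraic rule keeps falling toward -1.
print(round(c_min, 3), round(c_alg, 3))
```

Both values drop quickly while the conflicting evidence erodes the initial 0.9, but once negative, `step_min` is pinned at -0.1 whereas `step_alg` accumulates the repeated -0.1 well past -0.5.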
Effects of a Repetitive O-hypothesis with a Low CF (2)
The same simulation with the hypothesis h given in the middle of the sequence, and with h given at the end:
- Aggregations with the algebraic operations accumulate the low CFs.
  => This effect is risky in cases where much fake information is repeated.
  => On the other hand, it is useful when we cannot get certain direct evidence but can get much indirect yet reliable evidence with a low CF.

Lessons from the Examples
In situations where there is much wrong evidence (e.g., noise):
- it is risky to use the algebraic operations, because of the effect of accumulating the CFs of wrong evidence;
- it is better to use normalization, because the decrease of the CF caused by wrong evidence is then small;
- so it is better to use the min/max operations with normalization.
In cases where most of the reliable evidence has low CFs and there is little wrong evidence, aggregation with the algebraic operations helps us. In applications where most evidence is indirect, with a low CF, but reliable, it is better to use the algebraic operations with normalization (e.g., clinical diagnosis).
Position of Epistemic Uncertainty in AI
Artificial Intelligence (goal: investigate human intelligence and develop human-like machine intelligence) is contrasted with Data Science/Engineering (goal: analyze data, find patterns, and develop machines that judge using those patterns).
- Symbolistic AI (traditional AI), AI with symbol processing (symbols = concepts): problem solving, heuristic search, deductive reasoning, knowledge representation, epistemic uncertainty.
- Computational AI (computational intelligence), AI with numerical processing: connectionism (neural nets), fuzzy logic, evolutionary computation.
- Symbolistic AI (knowledge discovery): attribute-oriented induction, rough set model, association learning.
- Computational AI (machine learning / statistical AI): discriminant analysis, support vector machines, decision trees, ensemble learning, Bayesian learning.
Epistemic AI, including epistemic uncertainty, sits on the symbolistic, human-intelligence side of this map.

Conclusion
Epistemic uncertainty was discussed. It is important even in the era of data science, as long as humans and robots have to make decisions based on much uncertain or vague information.
Theories of epistemic uncertainty should be able to deal with "ignorance" (the unknown situation) caused by lack of information or knowledge.
The CF model, which had been criticized for a long time, was recalled, newly interpreted with possibility theory, and its aggregation function was justified theoretically.
Four simple aggregation functions of certainty factors were proposed, and their mathematical properties were discussed.
More informationGenerative Techniques: Bayes Rule and the Axioms of Probability
Intelligent Systems: Reasoning and Recognition James L. Crowley ENSIMAG 2 / MoSIG M1 Second Semester 2016/2017 Lesson 8 3 March 2017 Generative Techniques: Bayes Rule and the Axioms of Probability Generative
More informationPossibilistic Logic. Damien Peelman, Antoine Coulon, Amadou Sylla, Antoine Dessaigne, Loïc Cerf, Narges Hadji-Hosseini.
Possibilistic Logic Damien Peelman, Antoine Coulon, Amadou Sylla, Antoine Dessaigne, Loïc Cerf, Narges Hadji-Hosseini November 21, 2005 1 Introduction In real life there are some situations where only
More informationMolecular Dynamics. Molecules in motion
Molecular Dynamics Molecules in motion 1 Molecules in mo1on Molecules are not sta1c, but move all the 1me Source: h9p://en.wikipedia.org/wiki/kine1c_theory 2 Gasses, liquids and solids Gasses, liquids
More informationLecture 10: Introduction to reasoning under uncertainty. Uncertainty
Lecture 10: Introduction to reasoning under uncertainty Introduction to reasoning under uncertainty Review of probability Axioms and inference Conditional probability Probability distributions COMP-424,
More informationLast Lecture Recap UVA CS / Introduc8on to Machine Learning and Data Mining. Lecture 3: Linear Regression
UVA CS 4501-001 / 6501 007 Introduc8on to Machine Learning and Data Mining Lecture 3: Linear Regression Yanjun Qi / Jane University of Virginia Department of Computer Science 1 Last Lecture Recap q Data
More informationImprecise Probability
Imprecise Probability Alexander Karlsson University of Skövde School of Humanities and Informatics alexander.karlsson@his.se 6th October 2006 0 D W 0 L 0 Introduction The term imprecise probability refers
More informationBias/variance tradeoff, Model assessment and selec+on
Applied induc+ve learning Bias/variance tradeoff, Model assessment and selec+on Pierre Geurts Department of Electrical Engineering and Computer Science University of Liège October 29, 2012 1 Supervised
More informationCourse Introduction. Probabilistic Modelling and Reasoning. Relationships between courses. Dealing with Uncertainty. Chris Williams.
Course Introduction Probabilistic Modelling and Reasoning Chris Williams School of Informatics, University of Edinburgh September 2008 Welcome Administration Handout Books Assignments Tutorials Course
More informationFuzzy Logic. An introduction. Universitat Politécnica de Catalunya. Departament de Teoria del Senyal i Comunicacions.
Universitat Politécnica de Catalunya Departament de Teoria del Senyal i Comunicacions Fuzzy Logic An introduction Prepared by Temko Andrey 2 Outline History and sphere of applications Basics. Fuzzy sets
More informationIntroduction to Particle Filters for Data Assimilation
Introduction to Particle Filters for Data Assimilation Mike Dowd Dept of Mathematics & Statistics (and Dept of Oceanography Dalhousie University, Halifax, Canada STATMOS Summer School in Data Assimila5on,
More informationUncertain Entailment and Modus Ponens in the Framework of Uncertain Logic
Journal of Uncertain Systems Vol.3, No.4, pp.243-251, 2009 Online at: www.jus.org.uk Uncertain Entailment and Modus Ponens in the Framework of Uncertain Logic Baoding Liu Uncertainty Theory Laboratory
More informationWe are IntechOpen, the world s leading publisher of Open Access books Built by scientists, for scientists. International authors and editors
We are IntechOpen, the world s leading publisher of Open Access books Built by scientists, for scientists 4,100 116,000 120M Open access books available International authors and editors Downloads Our
More informationCSCI 360 Introduc/on to Ar/ficial Intelligence Week 2: Problem Solving and Op/miza/on. Professor Wei-Min Shen Week 8.1 and 8.2
CSCI 360 Introduc/on to Ar/ficial Intelligence Week 2: Problem Solving and Op/miza/on Professor Wei-Min Shen Week 8.1 and 8.2 Status Check Projects Project 2 Midterm is coming, please do your homework!
More informationProbabilistic and Bayesian Analytics
Probabilistic and Bayesian Analytics Note to other teachers and users of these slides. Andrew would be delighted if you found this source material useful in giving your own lectures. Feel free to use these
More informationLinear Regression and Correla/on. Correla/on and Regression Analysis. Three Ques/ons 9/14/14. Chapter 13. Dr. Richard Jerz
Linear Regression and Correla/on Chapter 13 Dr. Richard Jerz 1 Correla/on and Regression Analysis Correla/on Analysis is the study of the rela/onship between variables. It is also defined as group of techniques
More informationLinear Regression and Correla/on
Linear Regression and Correla/on Chapter 13 Dr. Richard Jerz 1 Correla/on and Regression Analysis Correla/on Analysis is the study of the rela/onship between variables. It is also defined as group of techniques
More informationOn flexible database querying via extensions to fuzzy sets
On flexible database querying via extensions to fuzzy sets Guy de Tré, Rita de Caluwe Computer Science Laboratory Ghent University Sint-Pietersnieuwstraat 41, B-9000 Ghent, Belgium {guy.detre,rita.decaluwe}@ugent.be
More informationMul$- model ensemble challenge ini$al/model uncertain$es
Mul$- model ensemble challenge ini$al/model uncertain$es Yuejian Zhu Ensemble team leader Environmental Modeling Center NCEP/NWS/NOAA Acknowledgments: EMC ensemble team staffs Presenta$on for WMO/WWRP
More informationA generic framework for resolving the conict in the combination of belief structures E. Lefevre PSI, Universite/INSA de Rouen Place Emile Blondel, BP
A generic framework for resolving the conict in the combination of belief structures E. Lefevre PSI, Universite/INSA de Rouen Place Emile Blondel, BP 08 76131 Mont-Saint-Aignan Cedex, France Eric.Lefevre@insa-rouen.fr
More informationMA/CS 109 Lecture 7. Back To Exponen:al Growth Popula:on Models
MA/CS 109 Lecture 7 Back To Exponen:al Growth Popula:on Models Homework this week 1. Due next Thursday (not Tuesday) 2. Do most of computa:ons in discussion next week 3. If possible, bring your laptop
More informationDifferen'al Privacy with Bounded Priors: Reconciling U+lity and Privacy in Genome- Wide Associa+on Studies
Differen'al Privacy with Bounded Priors: Reconciling U+lity and Privacy in Genome- Wide Associa+on Studies Florian Tramèr, Zhicong Huang, Erman Ayday, Jean- Pierre Hubaux ACM CCS 205 Denver, Colorado,
More informationModeling radiocarbon in the Earth System. Radiocarbon summer school 2012
Modeling radiocarbon in the Earth System Radiocarbon summer school 2012 Outline Brief introduc9on to mathema9cal modeling Single pool models Mul9ple pool models Model implementa9on Parameter es9ma9on What
More informationAnalyzing the degree of conflict among belief functions.
Analyzing the degree of conflict among belief functions. Liu, W. 2006). Analyzing the degree of conflict among belief functions. Artificial Intelligence, 17011)11), 909-924. DOI: 10.1016/j.artint.2006.05.002
More informationCSE 21 Math for Algorithms and Systems Analysis. Lecture 10 Condi<onal Probability
CSE 21 Math for Algorithms and Systems Analysis Lecture 10 Condi
More informationA Problem Involving Games. Paccioli s Solution. Problems for Paccioli: Small Samples. n / (n + m) m / (n + m)
Class #10: Introduction to Probability Theory Artificial Intelligence (CS 452/552): M. Allen, 27 Sept. 17 A Problem Involving Games } Two players put money in on a game of chance } First one to certain
More informationUncertain Logic with Multiple Predicates
Uncertain Logic with Multiple Predicates Kai Yao, Zixiong Peng Uncertainty Theory Laboratory, Department of Mathematical Sciences Tsinghua University, Beijing 100084, China yaok09@mails.tsinghua.edu.cn,
More informationNumerical Methods in Biomedical Engineering
Numerical Methods in Biomedical Engineering Lecture s website for updates on lecture materials: h5p://9nyurl.com/m4frahb Individual and group homework Individual and group midterm and final projects (Op?onal)
More informationHybrid Logic and Uncertain Logic
Journal of Uncertain Systems Vol.3, No.2, pp.83-94, 2009 Online at: www.jus.org.uk Hybrid Logic and Uncertain Logic Xiang Li, Baoding Liu Department of Mathematical Sciences, Tsinghua University, Beijing,
More informationSta$s$cal Significance Tes$ng In Theory and In Prac$ce
Sta$s$cal Significance Tes$ng In Theory and In Prac$ce Ben Cartere8e University of Delaware h8p://ir.cis.udel.edu/ictir13tutorial Hypotheses and Experiments Hypothesis: Using an SVM for classifica$on will
More information2/2/2018. CS 103 Discrete Structures. Chapter 1. Propositional Logic. Chapter 1.1. Propositional Logic
CS 103 Discrete Structures Chapter 1 Propositional Logic Chapter 1.1 Propositional Logic 1 1.1 Propositional Logic Definition: A proposition :is a declarative sentence (that is, a sentence that declares
More informationSec$on Summary. Mathematical Proofs Forms of Theorems Trivial & Vacuous Proofs Direct Proofs Indirect Proofs
Section 1.7 Sec$on Summary Mathematical Proofs Forms of Theorems Trivial & Vacuous Proofs Direct Proofs Indirect Proofs Proof of the Contrapositive Proof by Contradiction 2 Proofs of Mathema$cal Statements
More informationToday s s lecture. Lecture 16: Uncertainty - 6. Dempster-Shafer Theory. Alternative Models of Dealing with Uncertainty Information/Evidence
Today s s lecture Lecture 6: Uncertainty - 6 Alternative Models of Dealing with Uncertainty Information/Evidence Dempster-Shaffer Theory of Evidence Victor Lesser CMPSCI 683 Fall 24 Fuzzy logic Logical
More informationEnsemble of Climate Models
Ensemble of Climate Models Claudia Tebaldi Climate Central and Department of Sta7s7cs, UBC Reto Knu>, Reinhard Furrer, Richard Smith, Bruno Sanso Outline Mul7 model ensembles (MMEs) a descrip7on at face
More informationIS4200/CS6200 Informa0on Retrieval. PageRank Con+nued. with slides from Hinrich Schütze and Chris6na Lioma
IS4200/CS6200 Informa0on Retrieval PageRank Con+nued with slides from Hinrich Schütze and Chris6na Lioma Exercise: Assump0ons underlying PageRank Assump0on 1: A link on the web is a quality signal the
More informationChemistry. Monday, October 9 th Tuesday, October 10 th, 2017
Chemistry Monday, October 9 th Tuesday, October 10 th, 2017 Do-Now: Unit 1 Test Day Do-Now 1. Write down today s FLT Copy & Complete 2. Write down one question you have about the test (topic-wise or format-wise).
More informationLearning Deep Genera,ve Models
Learning Deep Genera,ve Models Ruslan Salakhutdinov BCS, MIT and! Department of Statistics, University of Toronto Machine Learning s Successes Computer Vision: - Image inpain,ng/denoising, segmenta,on
More informationTask Schedulable Problem and Maximum Scheduling Problem in a Multi-agent System
JOURNAL OF SOFTWARE, VOL. 6, NO. 11, NOVEMBER 2011 2225 Task Schedulable Problem and Maximum Scheduling Problem in a Mul-agent System Bin Li, Xiaowei Zhang, Jun Wu, Junwu Zhu School of Informaon Engineering
More informationWhere are we? Knowledge Engineering Semester 2, Reasoning under Uncertainty. Probabilistic Reasoning
Knowledge Engineering Semester 2, 2004-05 Michael Rovatsos mrovatso@inf.ed.ac.uk Lecture 8 Dealing with Uncertainty 8th ebruary 2005 Where are we? Last time... Model-based reasoning oday... pproaches to
More informationGeneral linear model: basic
General linear model: basic Introducing General Linear Model (GLM): Start with an example Proper>es of the BOLD signal Linear Time Invariant (LTI) system The hemodynamic response func>on (Briefly) Evalua>ng
More informationContext-dependent Combination of Sensor Information in Dempster-Shafer Theory for BDI
Context-dependent Combination of Sensor Information in Dempster-Shafer Theory for BDI Sarah Calderwood Kevin McAreavey Weiru Liu Jun Hong Abstract There has been much interest in the Belief-Desire-Intention
More information3/10/11. Which interpreta/on sounds most reasonable to you? PH300 Modern Physics SP11
3// PH3 Modern Physics SP The problems of language here are really serious. We wish to speak in some way about the structure of the atoms. But we cannot speak about atoms in ordinary language. Recently:.
More informationShort introduc,on to the
OXFORD NEUROIMAGING PRIMERS Short introduc,on to the An General Introduction Linear Model to Neuroimaging for Neuroimaging Analysis Mark Jenkinson Mark Jenkinson Janine Michael Bijsterbosch Chappell Michael
More information