Fuzzy Systems. Possibility Theory


Fuzzy Systems: Possibility Theory
Prof. Dr. Rudolf Kruse, Christoph Doell {kruse,doell}@iws.cs.uni-magdeburg.de
Otto-von-Guericke University of Magdeburg, Faculty of Computer Science, Department of Knowledge Processing and Language Engineering
R. Kruse, C. Doell FS Possibility Theory Lecture 1

Outline
1. Partial Belief: Kolmogorov Axioms (Finite Case), Mass Distribution, Belief and Plausibility
2. Possibility and Necessity
3. Possibility Theory

A Simple Example
Oil contamination of water by trading vessels. A typical formulation: "The accident occurred approximately 10 miles away from the coast."
Locations of interest: open sea (z3), 12-mile zone (z2), 3-mile zone (z1), canal (ca), refueling dock (rd), loading dock (ld).
These 6 locations are disjoint and exhaustive: Ω = {z3, z2, z1, ca, rd, ld}.

Modeling Partial Belief
(Possibly vague) statements are often not simply true or false.
A decision maker should be able to quantify his/her degree of belief; this can be an objective measurement or a subjective valuation.
Probability theory: sample space Θ (finite set of distinct possible outcomes of some random experiment), event A ⊆ Θ.
For any Θ, the probability P is assumed to be a function P : 2^Θ → [0, 1] satisfying the Kolmogorov axioms.

Kolmogorov Axioms
For any Θ, the real-valued function P : 2^Θ → [0, 1] must satisfy
i) 0 ≤ P(A) ≤ 1 for all events A ⊆ Θ,
ii) P(Θ) = 1,
iii) if A ∩ B = ∅, then P(A ∪ B) = P(A) + P(B).

Partial Belief and Evidence Masses
We may not conceive the elements of Θ directly, but only their observations in some space Ω.
A mapping Γ attaches to each sensor θ ∈ Θ its output (an element, a subset, or a fuzzy set of Ω).
The probability P on Θ induces via Γ a structure on Ω; this structure represents partial beliefs about the actual state of the world ω_0.
If Γ : Θ → 2^Ω, then (P, Γ) is a random set.
If a subjective valuation is quantified, then evidence masses are attached to subsets of Ω. Thus the expert must partition his/her belief, attributing bigger amounts to more reliable pieces of knowledge.

Mass Distribution
Recall the example with Ω = {z3, z2, z1, ca, rd, ld}.
The propositional statement "in port" equals the event {ca, rd, ld}; such an event may represent the maximum level of differentiation for the expert.
The expert specifies a mass distribution m : 2^Ω → [0, 1]; here, Ω is called the frame of discernment.
m : 2^Ω → [0, 1] must satisfy (i) m(∅) = 0 and (ii) Σ_{A ⊆ Ω} m(A) = 1.
Subsets A ⊆ Ω with m(A) > 0 are called the focal elements of m.

Belief and Plausibility m(a) measures belief committed exactly to A For total amount of belief (credibility) of A, sum up m(b) whereas B A For maximum amount of belief movable to A, sum up m(b) with B A This leads to belief function and plausibility function Bel m : 2 Ω [0, 1], Bel m (A) = B:B A Pl m : 2 Ω [0, 1], Pl m (A) = B:B A m(b) m(b) R. Kruse, C. Doell FS Possibility Theory Lecture 1 6 / 32

Belief and Plausibility
In any case Bel(Ω) = 1 (closed world assumption).
Total ignorance is modeled by m_0 : 2^Ω → [0, 1] with m_0(Ω) = 1 and m_0(A) = 0 for all A ⊂ Ω. m_0 leads to Bel(Ω) = Pl(Ω) = 1 and Bel(A) = 0, Pl(A) = 1 for all ∅ ≠ A ⊂ Ω.
For an ordinary probability, use m_1 : 2^Ω → [0, 1] with m_1({ω}) = p_ω and m_1(A) = 0 for all sets A with |A| > 1. m_1 is called a Bayesian belief function.
Exact knowledge is modeled by m_2 : 2^Ω → [0, 1] with m_2({ω_0}) = 1 and m_2(A) = 0 for all A ≠ {ω_0}.

Example
Consider the statement: "The ship is in port" with a degree of certainty of 0.6; further evidence is not available.
Mass distribution m : 2^Ω → [0, 1] with m({in port}) = 0.6, m(Ω) = 0.4, and m(A) = 0 otherwise.
m(Ω) = 0.4 represents the inability to attach that amount of mass to any A ⊂ Ω; e.g. attaching the remaining 0.4 to {in port} as well would exceed the expert's statement.

Outline
1. Partial Belief
2. Possibility and Necessity: Nested Focal Elements, Possibility Distribution
3. Possibility Theory

Nested Focal Elements
Besides Bayesian belief functions, there is another important case.
If the focal elements A_1, ..., A_n of m are nested, i.e. can be arranged such that A_1 ⊆ A_2 ⊆ ... ⊆ A_n, then the belief function is called consonant.
Simplest case: a belief function with only one focal element A ⊆ Ω, where m(A) = 1. Then it follows that
Bel_m(B) = 1 if A ⊆ B, and 0 otherwise
Pl_m(B) = 1 if A ∩ B ≠ ∅, and 0 otherwise
Each ω ∈ A is a candidate for being the actual but unknown state ω_0.
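The nesting condition is easy to test mechanically: sort the focal elements by size and check pairwise inclusion. A minimal sketch, assuming the mass distribution is stored as a dict keyed by frozensets (the helper name is_consonant is mine, not lecture notation):

```python
def is_consonant(m):
    """A belief function is consonant iff its focal elements can be
    arranged as a chain A1 <= A2 <= ... <= An (subset inclusion)."""
    focal = sorted((B for B, mass in m.items() if mass > 0), key=len)
    return all(a <= b for a, b in zip(focal, focal[1:]))

omega = frozenset({"z3", "z2", "z1", "ca", "rd", "ld"})
in_port = frozenset({"ca", "rd", "ld"})

print(is_consonant({in_port: 0.6, omega: 0.4}))                    # True
print(is_consonant({in_port: 0.5, frozenset({"z3", "z2"}): 0.5}))  # False
```

Sorting by cardinality suffices because in a chain every smaller focal set must be contained in every larger one; two incomparable sets make some adjacent pair fail the subset test.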

Possibility Distribution
We definitely know that ω_0 ∉ Ā (i.e. ω_0 ∈ A).
This can be used to define ρ : Ω → {0, 1}, ρ(ω) = Pl({ω}).
ρ attaches 1 to possible and 0 to impossible elements; ρ is thus named a possibility distribution.

Possibility and Necessity
The truth of "possibly ω_0 ∈ B" is called the possibility of B ⊆ Ω; it is true if max{ρ(ω) | ω ∈ B} = 1.
The truth of "necessarily ω_0 ∈ B" is called the necessity of B ⊆ Ω; it is true if max{ρ(ω) | ω ∈ B̄} = 0.
"necessarily ω_0 ∈ B" being true requires "possibly ω_0 ∈ B̄" being false.

Possibility and Necessity Measures
Generally, possibility (and necessity) becomes a matter of degree.
Instead of ρ : Ω → {0, 1}, use a membership function π : Ω → [0, 1], π(ω) = Pl({ω}).
Thus, the possibility measure and the necessity measure are defined as
Π_m : 2^Ω → [0, 1], Π_m(B) = max{π(ω) : ω ∈ B}
nec_m : 2^Ω → [0, 1], nec_m(B) = 1 − Π_m(B̄)
Π and nec are plausibility and belief functions, respectively.
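Both measures are straightforward to compute from a tabulated π. A sketch (the helper names poss and nec are mine) using the egg-eating degrees from the example on the next slide:

```python
def poss(pi, B):
    """Pi(B) = max of pi over B; the max over the empty set is 0 (Pi(empty) = 0)."""
    return max((pi[w] for w in B), default=0)

def nec(pi, B, omega):
    """nec(B) = 1 - Pi(complement of B)."""
    return 1 - poss(pi, omega - set(B))

# pi(A(n)): degree to which "Anna eats n eggs" is possible
pi = {1: 1.0, 2: 1.0, 3: 1.0, 4: 1.0, 5: 0.8, 6: 0.6, 7: 0.4, 8: 0.2}
omega = set(pi)

print(poss(pi, {7, 8}))              # 0.4
print(nec(pi, {1, 2, 3, 4}, omega))  # ~0.2 (= 1 - 0.8)
```

The necessity of "at most 4 eggs" is low (0.2) even though each count from 1 to 4 is fully possible, because 5 eggs is still possible to degree 0.8.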

Example: Probability vs. Possibility
Statement A(n): "Anna ate n eggs for breakfast."
The (subjective) probability P(A(n)) can be determined by experiments: How many eggs will Anna eat for today's breakfast?
The possibility π(A(n)): How many eggs can Anna eat for breakfast?

n        1   2   3   4   5   6   7   8
π(A(n))  1   1   1   1  0.8 0.6 0.4 0.2
P(A(n)) 0.1 0.8 0.1  0   0   0   0   0

A possible event need not be probable; a probable event is always possible.

Properties of Possibility Measures
i) Π(∅) = 0
ii) Π(Ω) = 1
iii) Π(A ∪ B) = max{Π(A), Π(B)} for all A, B ⊆ Ω
The possibility of a set is determined by its most possible element.
nec(Ω) = 1 − Π(∅) = 1 expresses the closed world assumption: "necessarily ω_0 ∈ Ω" must be true.
Total ignorance: Π(B) = 1 and nec(B) = 0 for all B with ∅ ≠ B ⊂ Ω.
Perfect knowledge: Π({ω}) = nec({ω}) = 0 for all ω ≠ ω_0, and Π({ω_0}) = nec({ω_0}) = 1.

Nested Focal Elements
Figure: a complete sequence of nested focal elements A_1 ⊆ A_2 ⊆ ... ⊆ A_n of Π on 2^Ω, where Ω = {ω_1, ..., ω_n}, with masses m(A_1), ..., m(A_n) stacked over the elements and the induced degrees of possibility π(ω_1), ..., π(ω_n) read off below.

Example
Consider the ship locations again. Given the membership function
π(z3) = π(z2) = 0, π(z1) = π(ld) = 0.3, π(ca) = 1, π(rd) = 0.6
Π({z3, z2}) = 0 and nec({z1, ca, rd, ld}) = 1: we know it is impossible that the ship is located in {z3, z2}, so ω_0 ∈ {z1, ca, rd, ld}.
Π({ca, rd}) = 1 and nec({ca, rd}) = 1 − Π({z3, z2, z1, ld}) = 0.7 mean that the ship is possibly, but not with certainty, located in {ca, rd}.

Outline
1. Partial Belief
2. Possibility and Necessity
3. Possibility Theory: Axiomatic Approach, The Context Model, Possibility Distributions, Reasoning, Evidence Propagation

Possibility and Fuzzy Sets
Let the variable T be a temperature in °C (integers only). The current but unknown value T_0 is given by "T is around 21 °C".
Figure: possibility distribution π over the temperatures 17 to 25 °C, with
T:  19   20   21   22   23
π: 1/3  2/3   1   2/3  1/3
Incomplete information induces a possibility distribution function π.
π is numerically identical with a membership function; the nested α-cuts play the same role as focal elements.

Possibility Theory
The best-known calculus for handling uncertainty is probability theory [Laplace, 1812].
A less well-known but noteworthy alternative is possibility theory [Dubois and Prade, 1988].
Possibility theory can handle uncertain and imprecise information, while probability theory was designed to handle uncertain information only.

Possibility Theory: Axiomatic Approach
Definition: Let Ω be a (finite) sample space. A possibility measure Π on Ω is a function Π : 2^Ω → [0, 1] satisfying
i) Π(∅) = 0 and
ii) ∀E_1, E_2 ⊆ Ω : Π(E_1 ∪ E_2) = max{Π(E_1), Π(E_2)}.
This is similar to Kolmogorov's axioms of probability theory. From the axioms it follows that Π(E_1 ∩ E_2) ≤ min{Π(E_1), Π(E_2)}.
Attributes are introduced as random variables (as in probability theory); Π(A = a) is an abbreviation of Π({ω ∈ Ω | A(ω) = a}).
If an event E is possible without restriction, then Π(E) = 1. If an event E is impossible, then Π(E) = 0.

Possibility Theory and the Context Model
Interpretation of degrees of possibility [Gebhardt and Kruse, 1993].
Let Ω be the (nonempty) set of all possible states of the world, ω_0 the actual (but unknown) state.
Let C = {c_1, ..., c_n} be a set of contexts (observers, frame conditions etc.) and (C, 2^C, P) a finite probability space (context weights).
Let Γ : C → 2^Ω be a set-valued mapping which assigns to each context the most specific correct set-valued specification of ω_0. The sets Γ(c) are called the focal sets of Γ.
Γ is a random set (i.e. a set-valued random variable) [Nguyen, 1978].
The basic possibility assignment induced by Γ is the mapping
π : Ω → [0, 1], π(ω) = P({c ∈ C | ω ∈ Γ(c)}).

Example: Dice and Shakers
Five shakers, each chosen with probability 1/5: shaker 1 holds a tetrahedron (faces 1-4), shaker 2 a hexahedron (1-6), shaker 3 an octahedron (1-8), shaker 4 an icosahedron (numbered 1-10), shaker 5 a dodecahedron (1-12).

numbers  degree of possibility
1-4      1/5 + 1/5 + 1/5 + 1/5 + 1/5 = 1
5-6      1/5 + 1/5 + 1/5 + 1/5 = 4/5
7-8      1/5 + 1/5 + 1/5 = 3/5
9-10     1/5 + 1/5 = 2/5
11-12    1/5 = 1/5
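The table follows directly from the definition π(ω) = P({c | ω ∈ Γ(c)}). A small sketch (the function name basic_possibility is mine) recomputes it with exact fractions:

```python
from fractions import Fraction

def basic_possibility(weights, gamma):
    """pi(omega) = P({c : omega in Gamma(c)}) for the random set (weights, gamma)."""
    pi = {}
    for c, p in weights.items():
        for w in gamma[c]:
            pi[w] = pi.get(w, Fraction(0)) + p
    return pi

# Five shakers, each drawn with probability 1/5; shaker c holds a die
# with faces 1..n_c.
faces = {1: 4, 2: 6, 3: 8, 4: 10, 5: 12}
weights = {c: Fraction(1, 5) for c in faces}
gamma = {c: set(range(1, n + 1)) for c, n in faces.items()}

pi = basic_possibility(weights, gamma)
print(pi[3], pi[7], pi[12])  # 1 3/5 1/5
```

Every die can show a 3, so π(3) = 1; only the three larger dice can show a 7, so π(7) = 3/5.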

Definition: Let Γ : C → 2^Ω be a random set. The possibility measure induced by Γ is the mapping
Π : 2^Ω → [0, 1], Π(E) = P({c ∈ C | E ∩ Γ(c) ≠ ∅}).
Problem: from the given interpretation it only follows that
∀E ⊆ Ω : max{π(ω) | ω ∈ E} ≤ Π(E) ≤ min{1, Σ_{ω ∈ E} π(ω)}.
Figure: two random sets over Ω = {1, ..., 5} with context weights c_1 : 1/2, c_2 : 1/4, c_3 : 1/4, one inducing π = (0, 1/2, 1, 1/2, 1/4) and the other π = (1/4, 1/4, 1/2, 1/4, 1/4); the induced Π is not determined by π alone.
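Since the slide's diagram did not survive extraction cleanly, here is a hypothetical pair of random sets (focal sets of my own choosing) that makes the same point: both induce the same basic possibility assignment π, yet assign different measures Π(E) to the same event, with both values inside the stated bounds.

```python
from fractions import Fraction

def poss_measure(weights, gamma, E):
    """Pi(E) = P({c : E intersects Gamma(c)}) induced by the random set."""
    E = set(E)
    return sum((p for c, p in weights.items() if E & gamma[c]), Fraction(0))

half = Fraction(1, 2)
weights = {"c1": half, "c2": half}

# Both random sets induce pi(1) = pi(3) = 1/2 and pi(2) = 1.
gamma_a = {"c1": {1, 2}, "c2": {2, 3}}
gamma_b = {"c1": {2}, "c2": {1, 2, 3}}

E = {1, 3}
print(poss_measure(weights, gamma_a, E))  # 1    (both focal sets meet E)
print(poss_measure(weights, gamma_b, E))  # 1/2  (only Gamma(c2) meets E)
# Both values lie between max pi on E (= 1/2) and min(1, sum of pi on E) (= 1).
```

This is exactly the indicated problem: π alone pins Π(E) down only to an interval.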

From Context Model to Possibility Measures
Attempts to solve the indicated problem:
- Require the focal sets to be consonant: mass assignment theory [Baldwin et al., 1996]. Problem: the voting model is not sufficient to justify consonance.
- Use the lower bound as the most pessimistic choice [Gebhardt, 1997]. Problem: basic possibility assignments represent negative information, so the lower bound is actually the most optimistic choice.
- Justify the lower bound from decision making purposes.

From Context Model to Possibility Measures
Assume that in the end we must decide on one single event, and each event is described by the values of a set of attributes.
Then it can be useful to assign to a set of events the degree of possibility of the most possible event in the set.
Figure: example grids of degrees of possibility in which the value assigned to a set of events is obtained by taking the maximum over the corresponding rows/columns.

Definition: Let X = {A_1, ..., A_n} be a set of attributes defined on a (finite) sample space Ω with respective domains dom(A_i), i = 1, ..., n. A possibility distribution π_X over X is the restriction of a possibility measure Π on Ω to the set of all events that can be defined by stating values for all attributes in X. That is, π_X = Π|_{E_X}, where
E_X = { E ∈ 2^Ω | ∃a_1 ∈ dom(A_1) : ... ∃a_n ∈ dom(A_n) : E ≙ ⋀_{A_j ∈ X} A_j = a_j }
    = { E ∈ 2^Ω | ∃a_1 ∈ dom(A_1) : ... ∃a_n ∈ dom(A_n) : E = {ω ∈ Ω | ⋀_{A_j ∈ X} A_j(ω) = a_j} }.
This corresponds to the notion of a probability distribution.

A Possibility Distribution
Figure: a three-dimensional possibility distribution over the attributes color, shape (s/m/l) and size (small/medium/large); all numbers in parts per 1000. The numbers state the degrees of possibility of the corresponding value combinations, and the margins show the maximum projections to the two-dimensional subspaces.

Reasoning
Figure: the same distribution after using the information that the given object is green; all value combinations with other colors receive degree 0 (all numbers in parts per 1000), and the marginal degrees of possibility for shape and size shrink accordingly.

Possibilistic Decomposition
As for relational and probabilistic networks, the 3-dimensional possibility distribution can be decomposed into projections to subspaces, i.e. the maximum projection to the subspace color × shape and the maximum projection to the subspace shape × size.
It can be reconstructed using the following formula:
∀i, j, k : π(a_i^(color), a_j^(shape), a_k^(size))
  = min{ π(a_i^(color), a_j^(shape)), π(a_j^(shape), a_k^(size)) }
  = min{ max_k π(a_i^(color), a_j^(shape), a_k^(size)), max_i π(a_i^(color), a_j^(shape), a_k^(size)) }
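The min/max reconstruction can be checked numerically. The sketch below (helper names are mine) uses a small hypothetical 2x2x2 distribution built in the decomposable form π(i,j,k) = min(a(i,j), b(j,k)); such distributions are always recovered exactly from the two maximum projections, because min distributes over max.

```python
from itertools import product

def max_project(pi, axes_keep):
    """Maximum projection of a dict {(i, j, k): degree} onto the kept axes."""
    proj = {}
    for idx, v in pi.items():
        key = tuple(idx[a] for a in axes_keep)
        proj[key] = max(proj.get(key, 0.0), v)
    return proj

def reconstruct(pi):
    """Rebuild pi from its color-shape and shape-size max projections via min."""
    ab = max_project(pi, (0, 1))   # color x shape
    bc = max_project(pi, (1, 2))   # shape x size
    return {(i, j, k): min(ab[(i, j)], bc[(j, k)]) for (i, j, k) in pi}

# Hypothetical decomposable distribution pi(i,j,k) = min(a(i,j), b(j,k))
a = {(0, 0): 0.9, (0, 1): 0.4, (1, 0): 0.7, (1, 1): 1.0}
b = {(0, 0): 0.8, (0, 1): 0.3, (1, 0): 1.0, (1, 1): 0.6}
pi = {(i, j, k): min(a[(i, j)], b[(j, k)])
      for i, j, k in product(range(2), repeat=3)}

print(reconstruct(pi) == pi)  # True
```

For a general 3-dimensional π the reconstruction only yields an upper bound; equality is exactly the decomposability property exploited by possibilistic networks.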

Reasoning with Projections
Again the same result can be obtained using only the projections to the subspaces (maximal degrees of possibility): the new evidence on color is combined with the color × shape projection by min, max-projected to shape, and the result is propagated to size through the shape × size projection in the same way.
Figure: propagation of the new evidence through the color × shape and shape × size projection tables via alternating min (combine) and max (project over lines/columns) operations.

Conditional Possibility and Independence
Definition: Let Ω be a (finite) sample space, Π a possibility measure on Ω, and E_1, E_2 ⊆ Ω events. Then Π(E_1 | E_2) = Π(E_1 ∩ E_2) is called the conditional possibility of E_1 given E_2.
Definition: Let Ω be a (finite) sample space, Π a possibility measure on Ω, and A, B, and C attributes with respective domains dom(A), dom(B), and dom(C). A and B are called conditionally possibilistically independent given C, written A ⊥⊥_Π B | C, iff
∀a ∈ dom(A) : ∀b ∈ dom(B) : ∀c ∈ dom(C) :
Π(A = a, B = b | C = c) = min{Π(A = a | C = c), Π(B = b | C = c)}
This is similar to the corresponding notions of probability theory.
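The condition can be tested by brute force over all value triples; note that Π(A=a, B=b | C=c) here simply equals π(a,b,c), since conditioning is intersection. A sketch (function name cond_indep and the concrete numbers are mine) using a hypothetical distribution of the min-factorized form π(a,b,c) = min(f(a,c), g(b,c)), which always satisfies the condition:

```python
from itertools import product

def cond_indep(pi, A, B, C):
    """A and B conditionally possibilistically independent given C iff
    pi(a,b,c) = min(max_b' pi(a,b',c), max_a' pi(a',b,c)) for all a, b, c."""
    for a, b, c in product(A, B, C):
        p_ac = max(pi[(a, bb, c)] for bb in B)   # Pi(A=a | C=c)
        p_bc = max(pi[(aa, b, c)] for aa in A)   # Pi(B=b | C=c)
        if pi[(a, b, c)] != min(p_ac, p_bc):
            return False
    return True

A, B, C = range(2), range(2), range(2)
f = {(0, 0): 1.0, (0, 1): 0.5, (1, 0): 0.3, (1, 1): 1.0}
g = {(0, 0): 0.8, (0, 1): 1.0, (1, 0): 1.0, (1, 1): 0.2}
pi = {(a, b, c): min(f[(a, c)], g[(b, c)]) for a, b, c in product(A, B, C)}
print(cond_indep(pi, A, B, C))  # True

# Perturbing a single entry destroys the factorization:
broken = dict(pi)
broken[(0, 0, 0)] = 0.0
print(cond_indep(broken, A, B, C))  # False
```

The min in place of the product is what distinguishes this from probabilistic conditional independence.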

Possibilistic Evidence Propagation (A: color, B: shape, C: size)
π(B = b | A = a_obs)
(1) = max_{a ∈ dom(A)} max_{c ∈ dom(C)} { π(A = a, B = b, C = c | A = a_obs) }
(2) = max_{a ∈ dom(A)} max_{c ∈ dom(C)} { min{ π(A = a, B = b, C = c), π(A = a | A = a_obs) } }
(3) = max_{a ∈ dom(A)} max_{c ∈ dom(C)} { min{ π(A = a, B = b), π(B = b, C = c), π(A = a | A = a_obs) } }
  = max_{a ∈ dom(A)} { min{ π(A = a, B = b), π(A = a | A = a_obs), max_{c ∈ dom(C)} π(B = b, C = c) } }
    where max_{c ∈ dom(C)} π(B = b, C = c) = π(B = b) ≥ π(A = a, B = b)
  = max_{a ∈ dom(A)} { min{ π(A = a, B = b), π(A = a | A = a_obs) } }
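The final line of the derivation is exactly what one propagation step computes: combine the evidence on A with the A × B projection by min, then maximize over A. A minimal sketch (the projection values are hypothetical; the slide's concrete numbers live in the figures):

```python
def propagate(pi_ab, ev_a):
    """pi(B = b | obs) = max over a of min(pi_AB(a, b), ev_a(a))."""
    out = {}
    for (a, b), v in pi_ab.items():
        out[b] = max(out.get(b, 0.0), min(v, ev_a.get(a, 0.0)))
    return out

# Hypothetical color x shape projection (A: color, B: shape).
pi_ab = {("green", "circle"): 0.7, ("green", "square"): 0.2,
         ("red",   "circle"): 0.4, ("red",   "square"): 0.9}

# Hard evidence "the object is green": indicator on the observed value.
print(propagate(pi_ab, {"green": 1.0}))  # {'circle': 0.7, 'square': 0.2}
```

A second call with the shape × size projection and this result as evidence would propagate onward to size, matching the two-stage scheme of the preceding slides.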

Literature for the Lecture
Baldwin, J. F., Lawry, J., and Martin, T. P. (1996). A mass assignment theory of the probability of fuzzy events. Fuzzy Sets and Systems, 83:353-367.
Dubois, D. and Prade, H. (1988). Possibility Theory: An Approach to Computerized Processing of Uncertainty. Plenum Press, New York, USA.
Gebhardt, J. (1997). Learning from Data: Possibilistic Graphical Models. Habilitation thesis, TU Braunschweig.
Gebhardt, J. and Kruse, R. (1993). The context model: An integrating view of vagueness and uncertainty. International Journal of Approximate Reasoning, 9:283-314.
Laplace, P.-S. (1812). Théorie analytique des probabilités. Madame Veuve Courcier, Paris, France.
Nguyen, H. T. (1978). On random sets and belief functions. Journal of Mathematical Analysis and Applications, 65:531-542.