Handling imprecise and uncertain class labels in classification and clustering
Thierry Denœux
Université de Technologie de Compiègne, HEUDIASYC (UMR CNRS 6599)
COST Action IC 0702, Working group C, Mallorca, March 16, 2009
Classification and clustering: classical framework

- We consider a collection L of n objects.
- Each object is assumed to belong to one of K groups (classes).
- Each object is described by:
  - an attribute vector x ∈ R^p (attribute data), or
  - its similarity to all other objects (proximity data).
- The class membership of objects may be:
  - completely known, described by class labels (supervised learning);
  - completely unknown (unsupervised learning);
  - known for some objects and unknown for others (semi-supervised learning).
Classification and clustering: problems

- Problem 1: predict the class membership of objects drawn from the same population as L (classification).
- Problem 2: estimate parameters of the population from which L is drawn (mixture model estimation).
- Problem 3: determine the class membership of objects in L (clustering).

              supervised   unsupervised   semi-supervised
  Problem 1       x                             x
  Problem 2       x             x               x
  Problem 3                     x               x
Motivations

- In real situations, we may have only partial knowledge of class labels: an intermediate situation between supervised and unsupervised learning (partially supervised learning).
- The class membership of objects can usually be predicted only with some remaining uncertainty: the outputs of classification and clustering algorithms should reflect this uncertainty.
- The theory of belief functions is suitable for representing uncertain and imprecise class information:
  - as input to classification and mixture model estimation algorithms;
  - as output of classification and clustering algorithms.
Outline

1. Theory of belief functions
2. Classification of partially labeled data: principle, implementation
3. Mixture model estimation: problem statement, method, simulation results
4. Clustering: problem, evidential c-means
Mass function

- Let X be a variable taking values in a finite set Ω (the frame of discernment).
- Mass function: m : 2^Ω → [0, 1] such that ∑_{A⊆Ω} m(A) = 1.
- Every A ⊆ Ω such that m(A) > 0 is a focal set of m.
- Interpretation: m represents
  - an item of evidence regarding the value of X;
  - a state of knowledge (belief state) induced by this item of evidence.
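As an illustration, a mass function on a small frame can be represented as a dictionary mapping focal sets (frozensets) to masses. The frame and the numerical values below are invented for the example:

```python
# Frame of discernment Omega = {'a', 'b', 'c'} (hypothetical classes).
OMEGA = frozenset({'a', 'b', 'c'})

# A mass function: focal sets mapped to masses summing to 1.
m = {
    frozenset({'a'}): 0.5,       # evidence pointing precisely at 'a'
    frozenset({'a', 'b'}): 0.3,  # imprecise evidence: 'a' or 'b'
    OMEGA: 0.2,                  # residual ignorance
}

# Check the two defining conditions: focal sets are subsets of Omega,
# and the masses sum to 1.
assert all(A <= OMEGA for A in m)
assert abs(sum(m.values()) - 1.0) < 1e-9
```

Frozensets are used as keys because they are hashable and support the subset and intersection operators needed below.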
Full notation

m^Ω_{Ag,t}{X}[EC] denotes the mass function
- representing the beliefs of agent Ag;
- at time t;
- regarding variable X;
- expressed on frame Ω;
- based on evidential corpus EC.
Special cases

m may be seen as:
- a family of weighted sets {(A_i, m(A_i)), i = 1, ..., r};
- a generalized probability distribution (masses are distributed over 2^Ω instead of Ω).

Special cases:
- r = 1: categorical mass function (a set). We denote by m_A the categorical mass function with focal set A.
- |A_i| = 1 for i = 1, ..., r: Bayesian mass function (a probability distribution).
- A_1 ⊆ ... ⊆ A_r: consonant mass function (a possibility distribution).
Belief and plausibility functions

- Belief function: bel(A) = ∑_{∅≠B⊆A} m(B), for all A ⊆ Ω
  (degree of belief (support) in the hypothesis "X ∈ A").
- Plausibility function: pl(A) = ∑_{B∩A≠∅} m(B), for all A ⊆ Ω
  (upper bound on the degree of belief that could be assigned to A after taking into account new information).
- bel ≤ pl.
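With the dictionary representation above, bel and pl are direct transcriptions of the two sums (toy mass values again):

```python
def bel(m, A):
    """Degree of belief: total mass of the non-empty subsets of A."""
    return sum(v for B, v in m.items() if B and B <= A)

def pl(m, A):
    """Plausibility: total mass of the focal sets intersecting A."""
    return sum(v for B, v in m.items() if B & A)

OMEGA = frozenset({'a', 'b', 'c'})
m = {frozenset({'a'}): 0.5, frozenset({'a', 'b'}): 0.3, OMEGA: 0.2}

assert abs(bel(m, frozenset({'a'})) - 0.5) < 1e-9  # only {'a'} lies inside
assert abs(pl(m, frozenset({'a'})) - 1.0) < 1e-9   # every focal set meets {'a'}
assert bel(m, frozenset({'b'})) <= pl(m, frozenset({'b'}))  # bel <= pl
```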
Relations between m, bel and pl

- bel(A) = pl(Ω) − pl(Ā), for all A ⊆ Ω.
- m(A) = ∑_{∅≠B⊆A} (−1)^{|A∖B|} bel(B) for A ≠ ∅, and m(∅) = 1 − bel(Ω).
- m, bel and pl are thus three equivalent representations of the same piece of information.
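The Möbius inversion above can be checked numerically. This sketch recovers m from bel on a small frame, then verifies the round trip on the toy mass function:

```python
from itertools import chain, combinations

def subsets(S):
    """All subsets of S, as frozensets."""
    s = list(S)
    return [frozenset(c)
            for c in chain.from_iterable(combinations(s, r) for r in range(len(s) + 1))]

def bel(m, A):
    return sum(v for B, v in m.items() if B and B <= A)

def mass_from_bel(bel_fn, omega):
    """Moebius inversion: m(A) = sum over non-empty B subset of A of
    (-1)^|A\\B| bel(B) for A non-empty, and m(emptyset) = 1 - bel(Omega)."""
    m = {}
    for A in subsets(omega):
        if not A:
            continue
        v = sum((-1) ** len(A - B) * bel_fn(B) for B in subsets(A) if B)
        if abs(v) > 1e-12:
            m[A] = v
    if abs(1.0 - bel_fn(frozenset(omega))) > 1e-12:
        m[frozenset()] = 1.0 - bel_fn(frozenset(omega))
    return m

OMEGA = frozenset({'a', 'b', 'c'})
m0 = {frozenset({'a'}): 0.5, frozenset({'a', 'b'}): 0.3, OMEGA: 0.2}
m1 = mass_from_bel(lambda A: bel(m0, A), OMEGA)
assert all(abs(m1.get(A, 0.0) - v) < 1e-9 for A, v in m0.items())
```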
Dempster's rule

Definition (Dempster's rule of combination): for all A ⊆ Ω,
  (m_1 ⊕ m_2)(A) = ∑_{B∩C=A} m_1(B) m_2(C).

Properties:
- Commutativity, associativity.
- Neutral element: the vacuous mass function m_Ω such that m_Ω(Ω) = 1 (represents total ignorance).
- (m_1 ⊕ m_2)(∅) ≥ 0: degree of conflict.
- Justified axiomatically.
- Other rules exist (disjunctive rule, cautious rule, etc.).
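The (unnormalized) conjunctive combination fits in a few lines; the example masses are invented:

```python
def conjunctive(m1, m2):
    """Conjunctive (unnormalized Dempster) rule: for every pair of focal
    sets, the product of masses goes to their intersection."""
    out = {}
    for B, v1 in m1.items():
        for C, v2 in m2.items():
            A = B & C
            out[A] = out.get(A, 0.0) + v1 * v2
    return out

OMEGA = frozenset({'a', 'b', 'c'})
m1 = {frozenset({'a'}): 0.6, OMEGA: 0.4}
m2 = {frozenset({'b'}): 0.5, OMEGA: 0.5}

m12 = conjunctive(m1, m2)
assert abs(m12[frozenset()] - 0.3) < 1e-9        # degree of conflict
assert abs(m12[frozenset({'a'})] - 0.3) < 1e-9

# The vacuous mass function is the neutral element.
vac = {OMEGA: 1.0}
assert conjunctive(m1, vac) == m1
```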
Discounting

Discounting allows us to take into account meta-knowledge about the reliability of a source of information. Let
- m be a mass function provided by a source of information;
- α ∈ [0, 1] be the plausibility that the source is not reliable.

Discounting m with discount rate α yields the mass function
  ^α m = (1 − α) m + α m_Ω.

Properties: ^0 m = m and ^1 m = m_Ω.
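Discounting, again with the dictionary representation (α and the masses below are example values):

```python
def discount(m, alpha, omega):
    """alpha-discounting: scale all masses by (1 - alpha) and transfer
    the remaining mass alpha to the whole frame omega."""
    out = {A: (1 - alpha) * v for A, v in m.items()}
    out[omega] = out.get(omega, 0.0) + alpha
    return out

OMEGA = frozenset({'a', 'b'})
m = {frozenset({'a'}): 0.8, OMEGA: 0.2}

assert discount(m, 0.0, OMEGA) == m              # 0-discounting: unchanged
assert discount(m, 1.0, OMEGA)[OMEGA] == 1.0     # 1-discounting: vacuous
md = discount(m, 0.3, OMEGA)
assert abs(md[frozenset({'a'})] - 0.56) < 1e-9   # 0.7 * 0.8
assert abs(sum(md.values()) - 1.0) < 1e-9        # still a mass function
```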
Pignistic transformation

Assume that our knowledge about X is represented by a mass function m, and we have to choose one element of Ω. Several strategies:
1. Select the element with greatest plausibility.
2. Select the element with greatest pignistic probability (m assumed to be normal):
   BetP(ω) = ∑_{A⊆Ω : ω∈A} m(A) / |A|.
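The pignistic transformation shares each mass uniformly among the elements of its focal set; a sketch with the running toy example:

```python
def betp(m):
    """Pignistic probability of a normal mass function (m(emptyset) = 0):
    each mass m(A) is split equally among the elements of A."""
    p = {}
    for A, v in m.items():
        for w in A:
            p[w] = p.get(w, 0.0) + v / len(A)
    return p

OMEGA = frozenset({'a', 'b', 'c'})
m = {frozenset({'a'}): 0.5, frozenset({'a', 'b'}): 0.3, OMEGA: 0.2}

p = betp(m)
assert abs(p['a'] - (0.5 + 0.3 / 2 + 0.2 / 3)) < 1e-9
assert abs(sum(p.values()) - 1.0) < 1e-9  # a genuine probability distribution
assert max(p, key=p.get) == 'a'           # 'a' would be selected
```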
Problem

Let Ω denote the set of classes, and L the learning set
  L = {e_i = (x_i, m_i), i = 1, ..., n},
where x_i is the attribute vector for object o_i, and m_i = m^Ω{y_i} is a mass function regarding the class y_i of object o_i.

Special cases:
- m_i({ω_k}) = 1: precise labeling;
- m_i(A) = 1 for some A ⊆ Ω: imprecise (set-valued) labeling;
- m_i is a Bayesian mass function: probabilistic labeling;
- m_i is a consonant mass function: possibilistic labeling; etc.

Problem: build a mass function m^Ω{y}[x, L] regarding the class y of a new object o described by x.
Solution (Denœux, 1995)

- Each example e_i = (x_i, m_i) in L is an item of evidence regarding y.
- The reliability of this information decreases with the distance between x and x_i: it should be discounted with a discount rate α_i = 1 − φ(d(x, x_i)), where φ is a decreasing function from R+ to [0, 1]:
    m{y}[x, e_i] = ^{α_i} m_i.
- The n mass functions are then combined conjunctively:
    m{y}[x, L] = m{y}[x, e_1] ⊕ ... ⊕ m{y}[x, e_n].
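Assuming crisp neighbor labels and φ(d) = β exp(−γ d²), the rule can be sketched as below; the data set, β and γ are illustrative, and only the k nearest neighbors contribute:

```python
import math

def conjunctive(m1, m2):
    """Conjunctive combination: mass products go to set intersections."""
    out = {}
    for B, v1 in m1.items():
        for C, v2 in m2.items():
            A = B & C
            out[A] = out.get(A, 0.0) + v1 * v2
    return out

def d2(a, b):
    """Squared Euclidean distance."""
    return sum((u - v) ** 2 for u, v in zip(a, b))

def eknn(x, data, k, beta=0.95, gamma=1.0):
    """Evidential k-NN sketch with crisp neighbor labels: each neighbor
    contributes a simple mass function, discounted according to its
    distance, and the k masses are combined conjunctively."""
    omega = frozenset(y for _, y in data)
    neighbors = sorted(data, key=lambda e: d2(x, e[0]))[:k]
    m = {omega: 1.0}  # start from the vacuous mass function
    for xi, yi in neighbors:
        phi = beta * math.exp(-gamma * d2(x, xi))
        mi = {frozenset({yi}): phi, omega: 1.0 - phi}
        m = conjunctive(m, mi)
    return m

data = [((0.0, 0.0), 'a'), ((0.0, 1.0), 'a'),
        ((5.0, 5.0), 'b'), ((5.0, 6.0), 'b')]
m = eknn((0.0, 0.5), data, k=3)
assert m[frozenset({'a'})] > 0.9        # strong support for class 'a'
assert m.get(frozenset(), 0.0) < 1e-6   # almost no conflict
```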
Implementation

- Take into account only the k nearest neighbors of x in L: evidential k-NN rule (generalizes the voting k-NN rule).
- Definition of φ: for instance, φ(d) = β exp(−γ d²).
- Determination of the hyperparameters β and γ: heuristically, or by minimizing an error function (Zouhal and Denoeux, 1998).
- Summarize L as r prototypes learnt by minimizing an error function: RBF-like neural network approach (Denoeux, 2000).
Example: EEG data

500 EEG signals encoded as 64-D patterns, 50% negative (delta waves), labeled by 5 experts.
[Figure: two example signals, A/D converter output vs. time.]
Results on EEG data (Denoeux and Zouhal, 2001)

- K = 2 classes, d = 64; data labeled by 5 experts.
- Possibilistic labels computed from the distribution of expert labels using a probability-possibility transformation.
- n = 200 learning patterns, 300 test patterns.
[Table: error rates as a function of k for the voting k-NN, weighted k-NN, TBM (crisp labels) and TBM (uncertain labels) rules.]
Mixture model

The feature vectors and class labels are assumed to be drawn from a joint probability distribution:
  P(Y = k) = π_k, k = 1, ..., K,
  f(x | Y = k) = f(x; θ_k), k = 1, ..., K.
Let ψ = (π_1, ..., π_K, θ_1, ..., θ_K).
Data

- We consider a realization of an i.i.d. random sample of size n from (X, Y): (x_1, y_1), ..., (x_n, y_n).
- The class labels are assumed to be imperfectly observed and partially specified by mass functions. The learning set has the form
    L = {(x_1, m_1), ..., (x_n, m_n)}.
- Problem 2: estimate ψ using L.
- Remark: this problem encompasses supervised, unsupervised and semi-supervised learning as special cases.
Generalized likelihood criterion (Côme et al., 2009)

Approach: ψ̂ = arg max_ψ pl^Ψ(ψ | L).

Theorem: the logarithm of the conditional plausibility of ψ given L is
  ln pl^Ψ(ψ | L) = ∑_{i=1}^{n} ln ( ∑_{k=1}^{K} pl_ik π_k f(x_i; θ_k) ) + ν,
where pl_ik = pl_i({y_i = k}) and ν is a constant.
Generalized EM algorithm (Côme et al., 2009)

- An EM algorithm (with guaranteed convergence) can be derived to maximize the previous criterion.
- This algorithm becomes identical to the classical EM algorithm in the case of completely unsupervised or semi-supervised data.
- The complexity of this algorithm is identical to that of the classical EM algorithm.
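A minimal sketch of such an EM iteration, for a two-component 1-D Gaussian mixture with a plausibility-weighted E-step. The data and the naive initialization are invented; the full algorithm is described in Côme et al. (2009):

```python
import math

def norm_pdf(x, mu, var):
    """Univariate Gaussian density."""
    return math.exp(-(x - mu) ** 2 / (2 * var)) / math.sqrt(2 * math.pi * var)

def e2m(xs, pls, n_iter=50):
    """EM sketch for a 2-component 1-D Gaussian mixture with soft labels.
    pls[i] = (pl(y_i = 1), pl(y_i = 2)): plausibilities of the two classes.
    Vacuous labels (1, 1) recover fully unsupervised EM."""
    pi, mu, var = [0.5, 0.5], [min(xs), max(xs)], [1.0, 1.0]
    for _ in range(n_iter):
        # E-step: posterior weights modulated by the label plausibilities
        t = []
        for x, pl in zip(xs, pls):
            w = [pl[k] * pi[k] * norm_pdf(x, mu[k], var[k]) for k in range(2)]
            s = sum(w)
            t.append([wk / s for wk in w])
        # M-step: classical weighted updates
        for k in range(2):
            nk = sum(ti[k] for ti in t)
            pi[k] = nk / len(xs)
            mu[k] = sum(ti[k] * x for ti, x in zip(t, xs)) / nk
            var[k] = max(sum(ti[k] * (x - mu[k]) ** 2
                             for ti, x in zip(t, xs)) / nk, 1e-3)
    return pi, mu, var

xs = [0.0, 0.1, -0.1, 0.2, 5.0, 5.1, 4.9, 5.2]
pi, mu, var = e2m(xs, [(1.0, 1.0)] * len(xs))  # unsupervised case
assert mu[0] < 1.0 < mu[1]                     # the two clusters are found
assert abs(pi[0] - 0.5) < 0.05
```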
Experimental settings

- Simulated and real data sets.
- Each example i was assumed to be labelled by an expert who provides his/her most likely label ŷ_i and a measure of doubt p_i. This information is represented by a simple mass function:
    m_i({ŷ_i}) = 1 − p_i,  m_i(Ω) = p_i.
- Simulations: p_i drawn randomly from a Beta distribution; the true label was changed to any other label with probability p_i.
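This labeling scheme is easy to simulate; the Beta parameters and the classes below are illustrative, not those used in the experiments:

```python
import random

def simulate_expert_labels(true_labels, classes, a=2.0, b=5.0, seed=0):
    """For each object: draw a doubt p_i ~ Beta(a, b); with probability p_i,
    replace the true label by another label chosen at random; report the
    result as a simple mass function m_i({y_hat}) = 1 - p_i, m_i(Omega) = p_i."""
    rng = random.Random(seed)
    omega = frozenset(classes)
    out = []
    for y in true_labels:
        p = rng.betavariate(a, b)
        y_hat = rng.choice([c for c in classes if c != y]) if rng.random() < p else y
        out.append({frozenset({y_hat}): 1.0 - p, omega: p})
    return out

labels = [1, 2, 3] * 10
ms = simulate_expert_labels(labels, classes=[1, 2, 3])
assert all(abs(sum(m.values()) - 1.0) < 1e-9 for m in ms)
assert all(len(A) in (1, 3) for m in ms for A in m)  # singleton or Omega
```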
Results: Iris data

[Figure: results on the Iris data.]
Results: Wine data

[Figure: results on the Wine data.]
Credal partition

- n objects described by attribute vectors x_1, ..., x_n.
- Assumption: each object belongs to one of K classes in Ω = {ω_1, ..., ω_K}.
- Goal: express our beliefs regarding the class membership of objects, in the form of mass functions m_1, ..., m_n on Ω.
- The resulting structure, a credal partition, generalizes hard, fuzzy and possibilistic partitions.
[Table: example credal partition, giving the masses m_1(A), ..., m_5(A) for every subset A of Ω = {ω_1, ω_2, ω_3}.]
Special cases

- Each m_i is a certain bba: crisp partition of Ω.
- Each m_i is a Bayesian bba: fuzzy partition of Ω, with u_ik = m_i({ω_k}) for all i, k.
- Each m_i is a consonant bba: possibilistic partition of Ω, with u_ik = pl_i({ω_k}).
Algorithms

- EVCLUS (Denoeux and Masson, 2004): proximity (possibly non-metric) data; multidimensional scaling approach.
- Evidential c-means (ECM) (Masson and Denoeux, 2008): attribute data; alternate optimization of an FCM-like cost function.
Basic ideas

- Let v_k be the prototype associated with class ω_k (k = 1, ..., K).
- Let A_j be a non-empty subset of Ω (a set of classes). We associate with A_j a prototype v̄_j, defined as the center of mass of the v_k for all ω_k ∈ A_j.
- Basic idea: for each non-empty A_j ⊆ Ω, m_ij = m_i(A_j) should be high if x_i is close to v̄_j.
- The distance to the empty set is defined as a fixed value δ.
Optimization problem

Minimize
  J_ECM(M, V) = ∑_{i=1}^{n} ∑_{j : ∅≠A_j⊆Ω} |A_j|^α m_ij^β d_ij² + ∑_{i=1}^{n} δ² m_i∅^β,
subject to
  ∑_{j : ∅≠A_j⊆Ω} m_ij + m_i∅ = 1, i = 1, ..., n,
where m_i∅ denotes the mass assigned to the empty set for object i.
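As a concrete reading of this criterion, the sketch below computes the subset prototypes v̄_j (centers of mass, as defined above) and the value of J_ECM for given masses; the class names, data and hyperparameter values are examples:

```python
from itertools import combinations

def nonempty_subsets(classes):
    """All non-empty subsets A_j of the set of classes."""
    s = sorted(classes)
    return [frozenset(c) for r in range(1, len(s) + 1)
            for c in combinations(s, r)]

def subset_prototypes(centers):
    """v_bar_j = center of mass of the class prototypes v_k, omega_k in A_j."""
    dim = len(next(iter(centers.values())))
    return {A: tuple(sum(centers[k][d] for k in A) / len(A) for d in range(dim))
            for A in nonempty_subsets(centers)}

def j_ecm(X, masses, protos, alpha=1.0, beta=2.0, delta=10.0):
    """Value of J_ECM(M, V). masses[i] maps each subset A_j (including
    frozenset(), the empty set) to m_ij; protos maps each non-empty
    subset to its prototype."""
    J = 0.0
    for x, mi in zip(X, masses):
        for A, mij in mi.items():
            if not A:
                J += delta ** 2 * mij ** beta  # distance to emptyset is delta
            else:
                dij2 = sum((u - v) ** 2 for u, v in zip(x, protos[A]))
                J += len(A) ** alpha * mij ** beta * dij2
    return J

centers = {'w1': (0.0, 0.0), 'w2': (2.0, 0.0)}
protos = subset_prototypes(centers)
assert protos[frozenset({'w1', 'w2'})] == (1.0, 0.0)  # center of mass

X = [(0.0, 0.0)]
assert j_ecm(X, [{frozenset({'w1'}): 1.0}], protos) == 0.0  # perfect fit
assert j_ecm(X, [{frozenset({'w2'}): 1.0}], protos) == 4.0  # d^2 = 4
```

The actual ECM algorithm alternates closed-form updates of M and V that minimize this cost; only the cost itself is shown here.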
Butterfly dataset

[Figure: the butterfly data set in the (x_1, x_2) plane.]
Butterfly dataset: results

[Figure: bbas m({ω_1}), m({ω_2}), m(Ω) and m(∅) as a function of the point number.]
4-class data set

[Figure: the 4-class data set in the (x_1, x_2) plane.]
4-class data set: hard credal partition

[Figure: hard credal partition of the 4-class data set.]
4-class data set: lower approximation

[Figure: lower approximations L_1, ..., L_4 of the four clusters.]
4-class data set: upper approximation

[Figure: upper approximations U_1, ..., U_4 of the four clusters.]
Brain data

- Magnetic resonance imaging of a pathological brain, with 2 sets of acquisition parameters.
- Image 1 shows normal tissue (bright) and ventricles + cerebrospinal fluid (dark).
- Image 2 shows pathology (bright).
[Figure: (a) image 1; (b) image 2.]
Brain data: results in gray-level space

[Figure: credal partition in the plane of grey levels in image 2 vs. grey levels in image 1.]
Brain data: lower and upper approximations

[Figure: three panels (a), (b), (c) showing the lower and upper approximations.]
Conclusion

- The theory of belief functions extends both set theory and probability theory:
  - it allows for the representation of imprecision and uncertainty;
  - it is more general than possibility theory.
- Belief functions may be used to represent imprecise and/or uncertain knowledge of class labels (soft labels).
- Many classification and clustering algorithms can be adapted to:
  - handle such class labels (partially supervised learning);
  - generate them from data (credal partition).
References

T. Denœux. A k-nearest neighbor classification rule based on Dempster-Shafer theory. IEEE Transactions on Systems, Man and Cybernetics, 25(5), 1995.

L. M. Zouhal and T. Denœux. An evidence-theoretic k-NN rule with parameter optimization. IEEE Transactions on Systems, Man and Cybernetics C, 28(2), 1998.

T. Denœux. A neural network classifier based on Dempster-Shafer theory. IEEE Transactions on Systems, Man and Cybernetics A, 30(2), 2000.

T. Denœux and M.-H. Masson. EVCLUS: Evidential Clustering of Proximity Data. IEEE Transactions on Systems, Man and Cybernetics B, 34(1), 2004.

T. Denœux and P. Smets. Classification using Belief Functions: the Relationship between the Case-based and Model-based Approaches. IEEE Transactions on Systems, Man and Cybernetics B, 36(6), 2006.

M.-H. Masson and T. Denœux. ECM: An evidential version of the fuzzy c-means algorithm. Pattern Recognition, 41(4), 2008.

E. Côme, L. Oukhellou, T. Denœux and P. Aknin. Learning from partially supervised data using mixture models and belief functions. Pattern Recognition, 42(3), 2009.
More informationBelief functions: past, present and future
Belief functions: past, present and future CSA 2016, Algiers Fabio Cuzzolin Department of Computing and Communication Technologies Oxford Brookes University, Oxford, UK Algiers, 13/12/2016 Fabio Cuzzolin
More informationA NEW CLASS OF FUSION RULES BASED ON T-CONORM AND T-NORM FUZZY OPERATORS
A NEW CLASS OF FUSION RULES BASED ON T-CONORM AND T-NORM FUZZY OPERATORS Albena TCHAMOVA, Jean DEZERT and Florentin SMARANDACHE Abstract: In this paper a particular combination rule based on specified
More informationThe intersection probability and its properties
The intersection probability and its properties Fabio Cuzzolin INRIA Rhône-Alpes 655 avenue de l Europe Montbonnot, France Abstract In this paper we introduce the intersection probability, a Bayesian approximation
More informationPrinciples of Pattern Recognition. C. A. Murthy Machine Intelligence Unit Indian Statistical Institute Kolkata
Principles of Pattern Recognition C. A. Murthy Machine Intelligence Unit Indian Statistical Institute Kolkata e-mail: murthy@isical.ac.in Pattern Recognition Measurement Space > Feature Space >Decision
More informationCombining Belief Functions Issued from Dependent Sources
Combining Belief Functions Issued from Dependent Sources MARCO E.G.V. CATTANEO ETH Zürich, Switzerland Abstract Dempster s rule for combining two belief functions assumes the independence of the sources
More informationStatistical learning. Chapter 20, Sections 1 4 1
Statistical learning Chapter 20, Sections 1 4 Chapter 20, Sections 1 4 1 Outline Bayesian learning Maximum a posteriori and maximum likelihood learning Bayes net learning ML parameter learning with complete
More informationApplication of Evidence Theory and Discounting Techniques to Aerospace Design
Application of Evidence Theory and Discounting Techniques to Aerospace Design Fiona Browne 1, David Bell 1, Weiru Liu 1, Yan Jin 1, Colm Higgins 1, Niall Rooney 2, Hui Wang 2, and Jann Müller 3 1 School
More informationCautious OWA and Evidential Reasoning for Decision Making under Uncertainty
Cautious OWA and Evidential Reasoning for Decision Making under Uncertainty Jean-Marc Tacnet Cemagref -ETGR 2, rue de la papèterie - B.P. 76 F-38402 Saint Martin d Hères Cedex, France Email: jean-marc.tacnet@cemagref.fr
More informationData Preprocessing. Cluster Similarity
1 Cluster Similarity Similarity is most often measured with the help of a distance function. The smaller the distance, the more similar the data objects (points). A function d: M M R is a distance on M
More informationAdaptative combination rule and proportional conflict redistribution rule for information fusion
Adaptative combination rule and proportional conflict redistribution rule for information fusion M. C. Florea 1, J. Dezert 2, P. Valin 3, F. Smarandache 4, Anne-Laure Jousselme 3 1 Radiocommunication &
More informationCheng Soon Ong & Christian Walder. Canberra February June 2018
Cheng Soon Ong & Christian Walder Research Group and College of Engineering and Computer Science Canberra February June 218 Outlines Overview Introduction Linear Algebra Probability Linear Regression 1
More informationA new generalization of the proportional conflict redistribution rule stable in terms of decision
Arnaud Martin 1, Christophe Osswald 2 1,2 ENSIETA E 3 I 2 Laboratory, EA 3876, 2, rue Francois Verny, 29806 Brest Cedex 9, France. A new generalization of the proportional conflict redistribution rule
More informationClassification of uncertain data in the framework of belief functions : nearest-neighbor-based and rule-based approaches
Classification of uncertain data in the framework of belief functions : nearest-neighbor-based and rule-based approaches Lianmeng Jiao To cite this version: Lianmeng Jiao. Classification of uncertain data
More informationUncertainty and Rules
Uncertainty and Rules We have already seen that expert systems can operate within the realm of uncertainty. There are several sources of uncertainty in rules: Uncertainty related to individual rules Uncertainty
More informationApplication of Evidence Theory to Construction Projects
Application of Evidence Theory to Construction Projects Desmond Adair, University of Tasmania, Australia Martin Jaeger, University of Tasmania, Australia Abstract: Crucial decisions are necessary throughout
More informationSequential adaptive combination of unreliable sources of evidence
Sequential adaptive combination of unreliable sources of evidence Zhun-ga Liu, Quan Pan, Yong-mei Cheng School of Automation Northwestern Polytechnical University Xi an, China Email: liuzhunga@gmail.com
More informationAutonomous Underwater Vehicle sensors fusion by the theory of belief functions for Rapid Environment Assessment
Autonomous Underwater Vehicle sensors fusion by the theory of belief functions for Rapid Environment Assessment Arnaud Martin 1, Jean-Christophe Cexus 1, Gilles Le Chenadec 2, Elodie Cantéro 2, Thierry
More informationData Fusion with Imperfect Implication Rules
Data Fusion with Imperfect Implication Rules J. N. Heendeni 1, K. Premaratne 1, M. N. Murthi 1 and M. Scheutz 2 1 Elect. & Comp. Eng., Univ. of Miami, Coral Gables, FL, USA, j.anuja@umiami.edu, kamal@miami.edu,
More informationL11: Pattern recognition principles
L11: Pattern recognition principles Bayesian decision theory Statistical classifiers Dimensionality reduction Clustering This lecture is partly based on [Huang, Acero and Hon, 2001, ch. 4] Introduction
More informationA Practical Taxonomy of Methods and Literature for Managing Uncertain Spatial Data in Geographic Information Systems
A Practical Taxonomy of Methods and Literature for Managing Uncertain Spatial Data in Geographic Information Systems Madjid Tavana a,b, * a Business Systems and Analytics Department Distinguished Chair
More informationBelief functions: A gentle introduction
Belief functions: A gentle introduction Seoul National University Professor Fabio Cuzzolin School of Engineering, Computing and Mathematics Oxford Brookes University, Oxford, UK Seoul, Korea, 30/05/18
More informationProtein Structure Prediction Using Neural Networks
Protein Structure Prediction Using Neural Networks Martha Mercaldi Kasia Wilamowska Literature Review December 16, 2003 The Protein Folding Problem Evolution of Neural Networks Neural networks originally
More informationA Generalization of Bayesian Inference in the Dempster-Shafer Belief Theoretic Framework
A Generalization of Bayesian Inference in the Dempster-Shafer Belief Theoretic Framework J. N. Heendeni, K. Premaratne, M. N. Murthi, J. Uscinski University of Miami Coral Gables, FL, USA Email: j.anuja@umiami.edu,
More informationChange Detection from Remote Sensing Images Based on Evidential Reasoning
14th International Conference on Information Fusion Chicago, Illinois, USA, July 5-8, 2011 Change Detection from Remote Sensing Images Based on Evidential Reasoning Zhun-ga Liu 1,3, Jean Dezert 2, Grégoire
More informationReliability assessment for multi-state systems under uncertainties based on the Dempster-Shafer theory
Reliability assessment for multi-state systems under uncertainties based on the Dempster-Shafer theory Mohamed Sallak, Walter Schön, and Felipe Aguirre Compiegne University of Technology, UMR CNRS 7253
More informationClustering K-means. Clustering images. Machine Learning CSE546 Carlos Guestrin University of Washington. November 4, 2014.
Clustering K-means Machine Learning CSE546 Carlos Guestrin University of Washington November 4, 2014 1 Clustering images Set of Images [Goldberger et al.] 2 1 K-means Randomly initialize k centers µ (0)
More informationREASONING UNDER UNCERTAINTY: CERTAINTY THEORY
REASONING UNDER UNCERTAINTY: CERTAINTY THEORY Table of Content Introduction Certainty Theory Definition Certainty Theory: Values Interpretation Certainty Theory: Representation Certainty Factor Propagation
More informationContradiction measures and specificity degrees of basic belief assignments
Author manuscript, published in "International Conference on Information Fusion, Chicago : United States (2008)" Contradiction measures and specificity degrees of basic belief assignments Florentin Smarandache
More informationA Study of the Pari-Mutuel Model from the Point of View of Imprecise Probabilities
PMLR: Proceedings of Machine Learning Research, vol. 62, 229-240, 2017 ISIPTA 17 A Study of the Pari-Mutuel Model from the Point of View of Imprecise Probabilities Ignacio Montes Enrique Miranda Dep. of
More informationExtended component importance measures considering aleatory and epistemic uncertainties
Extended component importance measures considering aleatory and epistemic uncertainties Mohamed Sallak, Walter Schon, Felipe Aguirre To cite this version: Mohamed Sallak, Walter Schon, Felipe Aguirre.
More informationCOMBINING BELIEF FUNCTIONS ISSUED FROM DEPENDENT SOURCES
COMBINING BELIEF FUNCTIONS ISSUED FROM DEPENDENT SOURCES by Marco E. G. V. Cattaneo Research Report No. 117 May 2003 Sear für Statistik Eidgenössische Technische Hochschule (ETH) CH-8092 Zürich Switzerland
More informationThe Transferable Belief Model for reliability analysis of systems with data uncertainties and failure dependencies
The Transferable Belief Model for reliability analysis of systems with data uncertainties and failure dependencies Mohamed Sallak, Walter Schön, and Felipe Aguirre Compiegne University of Technology, UMR
More informationThe maximum Deng entropy
The maximum Deng entropy Bingyi Kang a, Yong Deng a,b,c, a School of Computer and Information Science, Southwest University, Chongqing, 40075, China b School of Electronics and Information, Northwestern
More informationPrevious Accomplishments. Focus of Research Iona College. Focus of Research Iona College. Publication List Iona College. Journals
Network-based Hard/Soft Information Fusion: Soft Information and its Fusion Ronald R. Yager, Tel. 212 249 2047, E-Mail: yager@panix.com Objectives: Support development of hard/soft information fusion Develop
More informationDecision of Prognostics and Health Management under Uncertainty
Decision of Prognostics and Health Management under Uncertainty Wang, Hong-feng Department of Mechanical and Aerospace Engineering, University of California, Irvine, 92868 ABSTRACT The decision making
More informationStatistical reasoning with set-valued information : Ontic vs. epistemic views
Statistical reasoning with set-valued information : Ontic vs. epistemic views Inès Couso, Didier Dubois To cite this version: Inès Couso, Didier Dubois. Statistical reasoning with set-valued information
More informationNaive Possibilistic Classifiers for Imprecise or Uncertain Numerical Data
Author manuscript, published in "Fuzzy Sets and Systems (2013) pp xxxx" DOI : 10.1016/j.fss.2013.07.012 Naive Possibilistic Classifiers for Imprecise or Uncertain Numerical Data Myriam Bounhas a,b,, Mohammad
More informationBIVARIATE P-BOXES AND MAXITIVE FUNCTIONS. Keywords: Uni- and bivariate p-boxes, maxitive functions, focal sets, comonotonicity,
BIVARIATE P-BOXES AND MAXITIVE FUNCTIONS IGNACIO MONTES AND ENRIQUE MIRANDA Abstract. We give necessary and sufficient conditions for a maxitive function to be the upper probability of a bivariate p-box,
More informationMachine Learning. Gaussian Mixture Models. Zhiyao Duan & Bryan Pardo, Machine Learning: EECS 349 Fall
Machine Learning Gaussian Mixture Models Zhiyao Duan & Bryan Pardo, Machine Learning: EECS 349 Fall 2012 1 The Generative Model POV We think of the data as being generated from some process. We assume
More informationPossibilistic information fusion using maximal coherent subsets
Possibilistic information fusion using maximal coherent subsets Sebastien Destercke and Didier Dubois and Eric Chojnacki Abstract When multiple sources provide information about the same unknown quantity,
More information