Equivalence Between Belief Theories and Naïve Bayesian Fusion for Systems with Independent Evidential Data: Part II, The Example
John J. Sudano
Lockheed Martin
Moorestown, NJ, 08057, USA
john.j.sudano@lmco.com

Abstract

The process of fusing multiple independent sensor measurements, communication-link data from other independent systems, and dynamic database information is essential to support critical decisions in a timely way. Many real systems can be mapped to such a process. The independence of the input evidential data with an equally probable (uniform) prior probability distribution (i.e., Naïve Bayesian fusion) greatly simplifies the mathematical techniques used to properly fuse the evidential data. Equivalence between belief fusion and Naïve Bayesian fusion is shown for this process. The equivalence comparison is done in probability space. The title of a 2001 colloquium, "Data Fusion & Target ID: Dempster-Shafer & Probability Theories Holy War" [8], depicts the state of mind of many researchers. The goal of this article is to show that large areas from both mathematical camps are equivalent. This equivalence can be exploited to reduce the computational complexity of the fusion process: the fusion can be done in the linear probability set space rather than the exponential power-set representation of the belief space. For a system with 10 possible hypotheses, the fusion of independent data in belief space would involve as many as 1024 members of the power set, while exactly the same results can be obtained by fusing 10 members in probability space. This implies a non-trivial saving in computational complexity for the implementation of many real systems, such as medical diagnostic systems, automated cognitive performance evaluation, oil exploration systems, combat identification, ballistic missile component discrimination, and semi-automated homeland security systems.
The numerical examples in this article help clarify the mathematical techniques and confirm the equivalence results.

1 Introduction

In the article "Equivalence Between Belief Theories and Naïve Bayesian Fusion for Systems with Independent Evidential Data: Part I, The Theory" [14], the author describes the equivalence between two mathematical techniques. In this article, numerical examples are calculated showing the equivalence.

Let Ω = {A, B, C} be the set of 3 possible outcomes, with the following notation for the power set:

A ~ [1,0,0]   B ~ [0,1,0]   C ~ [0,0,1]
AB ~ [1,1,0]   AC ~ [1,0,1]   BC ~ [0,1,1]   ABC ~ [1,1,1]

Four independent probability distributions are calculated from a weighted random process and sorted:

PD1 = { , , }   PD2 = { , , }   PD3 = { , , }   PD4 = { , , }   (1)

From the above probability distributions, calculate the BBAs with the Inverse Pignistic Probability Transform (IPPT) using the Generalized Sum Mean [11] for the values s = 1, t = 1. This mapping has the property that the pignistic probability proportional to belief equals the original probability distribution, i.e., PrBl(Pd) = Pd; generally this is not the case. The BBAs generated this way can be used by all pignistic probability transforms to estimate probabilities, so proper comparisons can be made for all combinations of belief theories and Naïve Bayesian fusion. Special care must be exercised in using PrBl, since it can give erroneous results for non-mature data sets.

For a given probability set P(A_i) with IPPT values of s = 1 and t = 1, the basic belief assignments are calculated as:

mm(A_i) = P(A_i) / D
mm(A_i, A_j) = (P(A_i) + P(A_j)) / (2D)
mm(A_i, A_j, A_k) = (P(A_i) + P(A_j) + P(A_k)) / (3D)
mm(A_1, A_2, ..., A_N) = (P(A_1) + P(A_2) + ... + P(A_N)) / (N·D)   (2)

where D is calculated analytically as D = (2^N − 1)/N.

For each PDi the calculated BBAs are:

mm1[1,0,0]  mm1[0,1,0]  mm1[0,0,1]  mm1[1,1,0]  mm1[1,0,1]  mm1[0,1,1]  mm1[1,1,1]   (3)
mm2[1,0,0]  mm2[0,1,0]  mm2[0,0,1]  mm2[1,1,0]  mm2[1,0,1]  mm2[0,1,1]  mm2[1,1,1]   (4)
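As a concrete illustration of the transform above, the following Python sketch (illustrative code, not the author's implementation; the input distribution is a made-up example) builds the BBA of equation (2) and checks the stated property that the belief-proportional pignistic transform recovers the original distribution, PrBl(Pd) = Pd:

```python
from itertools import combinations

def ippt_bba(p):
    """Inverse Pignistic Probability Transform (Generalized Sum Mean,
    s = t = 1): each subset receives the sum of its probabilities
    divided by (|subset| * D), with D = (2^N - 1)/N so that the
    masses sum to one."""
    n = len(p)
    d = (2 ** n - 1) / n
    return {s: sum(p[i] for i in s) / (len(s) * d)
            for k in range(1, n + 1)
            for s in combinations(range(n), k)}

def pr_bl(bba, n):
    """Pignistic probability proportional to belief: redistribute each
    mass over its singletons in proportion to the singleton masses."""
    bel = [bba[(i,)] for i in range(n)]
    out = [0.0] * n
    for s, v in bba.items():
        denom = sum(bel[j] for j in s)
        for i in s:
            out[i] += v * bel[i] / denom
    return out

p = [0.6, 0.3, 0.1]                        # hypothetical sorted PD
m = ippt_bba(p)
print(round(sum(m.values()), 6))           # 1.0, a valid BBA
print([round(x, 6) for x in pr_bl(m, 3)])  # [0.6, 0.3, 0.1], PrBl(Pd) = Pd
```

The recovery follows because each focal set A containing A_i contributes exactly P(A_i)/(|A|·D) to PrBl(A_i), and the sum of 1/|A| over all such sets is (2^N − 1)/N = D.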
mm3[1,0,0]  mm3[0,1,0]  mm3[0,0,1]  mm3[1,1,0]  mm3[1,0,1]  mm3[0,1,1]  mm3[1,1,1]   (5)
mm4[1,0,0]  mm4[0,1,0]  mm4[0,0,1]  mm4[1,1,0]  mm4[1,0,1]  mm4[0,1,1]  mm4[1,1,1]   (6)

The singleton plausibilities [9] for each set of BBAs are calculated as:

Pl_mmk(A_J) = Σ_{A_K : A_K ∩ A_J ≠ ∅} mm_k(A_K)   (7)

Pl_mm1 = { , , }   Pl_mm2 = { , , }   Pl_mm3 = { , , }   Pl_mm4 = { , , }   (8)

The pignistic probability proportional to normalized plausibility (PrNPl) is computed for each singleton element A_i ∈ Ω:

PrNPl_k(A_i) = Pl_mmk(A_i) / Σ_{A_j ∈ Ω} Pl_mmk(A_j)   (9)

PrNPl1 = { , , }   PrNPl2 = { , , }   PrNPl3 = { , , }   PrNPl4 = { , , }   (10)

Smets' pignistic probability for each set of BBAs is calculated as:

BetP_k(A_i) = Σ_{A : A_i ∈ A} mm_k(A) / |A|   (11)

BetPmm1 = { , , }   BetPmm2 = { , , }   BetPmm3 = { , , }   BetPmm4 = { , , }   (12)

The pignistic probability transforms proportional to plausibilities (PrPl) are calculated for each set of BBAs:

PrPl_k(A_i) = Σ_{A : A_i ∈ A} mm_k(A) · Pl(A_i) / Σ_{A_j ∈ A} Pl(A_j)   (13)

PrPl1[1,0,0]  PrPl1[0,1,0]  PrPl1[0,0,1]   PrPl2[1,0,0]  PrPl2[0,1,0]  PrPl2[0,0,1]   (14)
PrPl3[1,0,0]  PrPl3[0,1,0]  PrPl3[0,0,1]   PrPl4[1,0,0]  PrPl4[0,1,0]  PrPl4[0,0,1]   (15)

The pignistic probability proportional to all plausibilities (PraPl) is equal to the belief plus a component proportional to the plausibility:

PraPl(A_i) = Bel(A_i) + ε·Pl(A_i),  with  ε = (1 − Σ_{A_i ∈ Ω} Bel(A_i)) / Σ_{A_i ∈ Ω} Pl(A_i)   (16)

PraPl1 = { , , }   PraPl2 = { , , }   PraPl3 = { , , }   PraPl4 = { , , }   (17)

The Hybrid Pignistic Probability (PrHyb) [10] distributes the BBAs proportionally to PraPl:

PrHyb_k(A_i) = Σ_{A : A_i ∈ A} mm_k(A) · PraPl(A_i) / Σ_{A_j ∈ A} PraPl(A_j)   (18)

PrHyb1[1,0,0]  PrHyb1[0,1,0]  PrHyb1[0,0,1]   PrHyb2[1,0,0]  PrHyb2[0,1,0]  PrHyb2[0,0,1]   (19)
PrHyb3[1,0,0]  PrHyb3[0,1,0]  PrHyb3[0,0,1]   PrHyb4[1,0,0]  PrHyb4[0,1,0]  PrHyb4[0,0,1]   (20)

The pignistic probability transforms proportional to beliefs (PrBl) are calculated for each set of BBAs. Note that PrBl is equal to the original probability distribution [9].
PrBl_k(A_i) = Σ_{A : A_i ∈ A} mm_k(A) · mm_k(A_i) / Σ_{A_j ∈ A} mm_k(A_j)   (21)

PrBlmm1[1,0,0]  PrBlmm1[0,1,0]  PrBlmm1[0,0,1]   PrBlmm2[1,0,0]  PrBlmm2[0,1,0]  PrBlmm2[0,0,1]   (22)
PrBlmm3[1,0,0]  PrBlmm3[0,1,0]  PrBlmm3[0,0,1]   PrBlmm4[1,0,0]  PrBlmm4[0,1,0]  PrBlmm4[0,0,1]   (23)

2 Dempster-Shafer (DS) Belief Fusion

Combining two BBAs by using Dempster's rule of combination yields the fused BBA:

mm12(A) = Σ_{B ∩ C = A} mm1(B)·mm2(C) / (1 − Σ_{B ∩ C = ∅} mm1(B)·mm2(C))   (24)

mm12[1,0,0]  mm12[0,1,0]  mm12[0,0,1]  mm12[1,1,0]  mm12[1,0,1]  mm12[0,1,1]  mm12[1,1,1]   (25)

From the above BBAs, compute the pignistic probability PrNPl:

PrNPl12 = { , , }   (26)

Combining the fused BBAs with the third belief data input by using Dempster's rule of combination yields the fused BBA:

mm123(A) = Σ_{B ∩ C = A} mm12(B)·mm3(C) / (1 − Σ_{B ∩ C = ∅} mm12(B)·mm3(C))   (27)
mm123[1,0,0]  mm123[0,1,0]  mm123[0,0,1]  mm123[1,1,0]  mm123[1,0,1]  mm123[0,1,1]  mm123[1,1,1]   (28)

From the above BBAs, compute PrNPl:

PrNPl123 = { , , }   (29)

Combining the fused BBAs with the fourth belief data input by using Dempster's rule of combination yields the fused BBA:

mm1234[1,0,0]  mm1234[0,1,0]  mm1234[0,0,1]  mm1234[1,1,0]  mm1234[1,0,1]  mm1234[0,1,1]  mm1234[1,1,1]   (30)

From the above BBAs, compute PrNPl:

PrNPl1234 = { , , }   (31)

3 Naïve Bayesian Fusion of the Four PrNPl Probability Distributions

The Naïve Bayesian fusions of the four pignistic probabilities proportional to normalized plausibility (PrNPl) are calculated as:

BPrNPl12...N(A) = PrNPl1(A)·PrNPl2(A)···PrNPlN(A) / Σ_{B ∈ Ω} PrNPl1(B)···PrNPlN(B)   (32)

BPrNPl1 = { , , }   BPrNPl12 = { , , }   BPrNPl123 = { , , }   BPrNPl1234 = { , , }   (33)

Note the equivalence between the pignistic probability estimate PrNPl of the DS fusion (26), (29), (31) and the probability of the Naïve Bayesian fusions of the original BBAs as computed using BPrNPl (33).

4 The Fixsen-Mahler Modified Dempster-Shafer (MDS) Belief Fusion

Combining two BBAs by using the MDS combination rule (a uniform prior gives the cardinality weighting) yields the fused BBA:

mm12(A) = (1/K) Σ_{B ∩ C = A} mm1(B)·mm2(C)·|A| / (|B|·|C|),  with K chosen so the fused masses sum to one   (34)

mm12[1,0,0]  mm12[0,1,0]  mm12[0,0,1]  mm12[1,1,0]  mm12[1,0,1]  mm12[0,1,1]  mm12[1,1,1]   (35)

From the above BBAs, compute Smets' pignistic probabilities:

BetP12[1,0,0]  BetP12[0,1,0]  BetP12[0,0,1]   (36)

Combining the next input with the fused BBAs by using the MDS combination rule gives the fused BBA:

mm123(A) = (1/K) Σ_{B ∩ C = A} mm12(B)·mm3(C)·|A| / (|B|·|C|)   (37)

mm123[1,0,0]  mm123[0,1,0]  mm123[0,0,1]  mm123[1,1,0]  mm123[1,0,1]  mm123[0,1,1]  mm123[1,1,1]   (38)

From the above BBAs, compute Smets' pignistic probabilities:

BetP123[1,0,0]  BetP123[0,1,0]  BetP123[0,0,1]   (39)

Combining all four inputs by using the MDS combination rule gives the fused BBAs.
mm1234[1,0,0]  mm1234[0,1,0]  mm1234[0,0,1]  mm1234[1,1,0]  mm1234[1,0,1]  mm1234[0,1,1]  mm1234[1,1,1]   (40)

From the above BBAs, compute Smets' pignistic probabilities:

BetP1234[1,0,0]  BetP1234[0,1,0]  BetP1234[0,0,1]   (41)

5 The Sudano Generalized Belief Fusion with Cardinality Weighting

The Generalized Belief Fusion (GBF) is calculated with cardinality weighting:

mm12(A) = (1/K) Σ_{B ∩ C = A} mm1(B)·mm2(C)·|A| / (|B|·|C|)   (42)

mm12[1,0,0]  mm12[0,1,0]  mm12[0,0,1]  mm12[1,1,0]  mm12[1,0,1]  mm12[0,1,1]  mm12[1,1,1]   (43)

From the above BBAs, compute the pedigree pignistic probabilities with cardinality weighting:

PrPed12[1,0,0]  PrPed12[0,1,0]  PrPed12[0,0,1]   (44)

Fusing the first three inputs:

mm123[1,0,0]  mm123[0,1,0]  mm123[0,0,1]  mm123[1,1,0]  mm123[1,0,1]  mm123[0,1,1]  mm123[1,1,1]   (45)

From the above BBAs, compute the pedigree pignistic probabilities with cardinality weighting:

PrPed123[1,0,0]  PrPed123[0,1,0]  PrPed123[0,0,1]   (46)
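The equivalences noted in Section 3 (and, for BetP, in Section 6 below) can be spot-checked numerically: fusing in belief space and then applying the pignistic transform gives the same probabilities as transforming each input and fusing in probability space. The sketch below is illustrative Python, not the author's code: the input BBAs are hypothetical, `combine` is a simplified conjunctive combination whose weighting argument yields either Dempster's rule or the MDS rule (assuming the uniform-prior MDS weighting |A|/(|B|·|C|)):

```python
def combine(m1, m2, weight):
    """Conjunctive combination with a per-set weighting, renormalized:
    weight(A, B, C) = 1 gives Dempster's rule; |A|/(|B||C|) gives the
    Fixsen-Mahler MDS rule. Focal sets are tuples of outcome indices."""
    fused = {}
    for s1, v1 in m1.items():
        for s2, v2 in m2.items():
            inter = tuple(sorted(set(s1) & set(s2)))
            if inter:  # empty intersections (conflict) are discarded
                w = weight(inter, s1, s2)
                fused[inter] = fused.get(inter, 0.0) + v1 * v2 * w
    total = sum(fused.values())
    return {s: v / total for s, v in fused.items()}

def dempster(m1, m2):
    return combine(m1, m2, lambda a, b, c: 1.0)

def mds(m1, m2):
    return combine(m1, m2, lambda a, b, c: len(a) / (len(b) * len(c)))

def pr_npl(m, n):
    """Pignistic probability proportional to normalized plausibility."""
    pl = [sum(v for s, v in m.items() if i in s) for i in range(n)]
    t = sum(pl)
    return [x / t for x in pl]

def bet_p(m, n):
    """Smets' pignistic transform: split each mass evenly over its set."""
    return [sum(v / len(s) for s, v in m.items() if i in s)
            for i in range(n)]

def nb_fuse(p, q):
    """Naive Bayesian fusion with a uniform prior: component-wise
    product of the two distributions, renormalized."""
    f = [a * b for a, b in zip(p, q)]
    t = sum(f)
    return [x / t for x in f]

# Hypothetical input BBAs over a 3-element frame.
m1 = {(0,): 0.5, (1,): 0.2, (0, 1): 0.1, (0, 1, 2): 0.2}
m2 = {(0,): 0.4, (2,): 0.3, (1, 2): 0.1, (0, 1, 2): 0.2}

c1 = pr_npl(dempster(m1, m2), 3)              # DS fusion, then PrNPl
c1b = nb_fuse(pr_npl(m1, 3), pr_npl(m2, 3))   # PrNPl, then NB fusion
c2 = bet_p(mds(m1, m2), 3)                    # MDS fusion, then BetP
c2b = nb_fuse(bet_p(m1, 3), bet_p(m2, 3))     # BetP, then NB fusion
print(all(abs(a - b) < 1e-12 for a, b in zip(c1 + c2, c1b + c2b)))  # True
```

The agreement is exact up to floating point: the singleton plausibility of a Dempster combination is proportional to the product of the input singleton plausibilities, and the BetP of an MDS combination is proportional to the product of the input BetP values.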
Fusing all four inputs:

mm1234[1,0,0]  mm1234[0,1,0]  mm1234[0,0,1]  mm1234[1,1,0]  mm1234[1,0,1]  mm1234[0,1,1]  mm1234[1,1,1]   (47)

From the above BBAs, compute the pedigree pignistic probabilities:

PrPed1234[1,0,0]  PrPed1234[0,1,0]  PrPed1234[0,0,1]   (48)

6 Naïve Bayesian Fusion of the Smets Pignistic and Sudano Pedigree Pignistic Probabilities with Cardinality Weighting

The Naïve Bayesian fusions of the Smets pignistic probability BetP (12) and the pedigree pignistic probabilities with cardinality weighting are calculated as:

BetP1(A_i) = Σ_{A : A_i ∈ A} mm1(A) / |A|   (49)

BBetP12...N(A) = BetP1(A)·BetP2(A)···BetPN(A) / Σ_{B ∈ Ω} BetP1(B)···BetPN(B)   (50)

BBetP12 = { , , }   BBetP123 = { , , }   BBetP1234 = { , , }   (51)

The pedigree pignistic probability for a single set of BBAs with cardinality weighting is calculated as:

PrPed1(C) = Σ_{A : C ∈ A} mm1(A) · ρ1(C) / Σ_{A_j ∈ A} ρ1(A_j)   (52)
          = Σ_{A : C ∈ A} mm1(A) · |C| / Σ_{A_j ∈ A} |A_j|   (53)

Since C is a singleton, its cardinality value is one, so:

PrPed1(C) = Σ_{A : C ∈ A} mm1(A) / |A|   (54)

Note that the pedigree pignistic probability with cardinality weighting (54) is the same as Smets' pignistic probability (49).

The Naïve Bayesian fusions of the pedigree pignistic probability with cardinality weighting are calculated as:

BPrPed12...N(A) = PrPed1(A)·PrPed2(A)···PrPedN(A) / Σ_{B ∈ Ω} PrPed1(B)···PrPedN(B)   (55)

BPrPed12 = { , , }   BPrPed123 = { , , }   BPrPed1234 = { , , }   (56)

Note the equivalence between the Smets pignistic probability estimate BetP of the MDS fusion (36), (39), (41), the Sudano pedigree pignistic probabilities with cardinality weighting of the GBF fusion (44), (46), (48), and the probability of the Naïve Bayesian fusions as computed using BBetP (51) and BPrPed (56).

7 The Sudano Generalized Belief Fusion with Pl Weighting

Compute the Generalized Belief Fusion algorithm with the Pl weighting function.
mm12(A) = (1/K) Σ_{B ∩ C = A} mm1(B)·mm2(C) · Σ_{A_j ∈ A} Pl(A_j) / (Σ_{B_j ∈ B} Pl(B_j) · Σ_{C_j ∈ C} Pl(C_j))   (57)

mm12[1,0,0]  mm12[0,1,0]  mm12[0,0,1]  mm12[1,1,0]  mm12[1,0,1]  mm12[0,1,1]  mm12[1,1,1]   (58)

Compute the pedigree pignistic probability with Pl weighting:

PrPed(C) = Σ_{A : C ∈ A} mm(A) · Pl(C) / Σ_{A_j ∈ A} Pl(A_j)   (59)

PrPed12[1,0,0]  PrPed12[0,1,0]  PrPed12[0,0,1]   (60)

Compute the Generalized Belief Fusion algorithm with the Pl weighting function for the three inputs:

mm123[1,0,0]  mm123[0,1,0]  mm123[0,0,1]  mm123[1,1,0]  mm123[1,0,1]  mm123[0,1,1]  mm123[1,1,1]   (61)

Compute the pedigree pignistic probability with Pl weighting for the above BBAs:

PrPed123[1,0,0]  PrPed123[0,1,0]  PrPed123[0,0,1]   (62)

Compute the Generalized Belief Fusion algorithm with the Pl weighting function for all four inputs:

mm1234[1,0,0]  mm1234[0,1,0]  mm1234[0,0,1]  mm1234[1,1,0]  mm1234[1,0,1]  mm1234[0,1,1]  mm1234[1,1,1]   (63)

From the above BBAs, compute the pedigree pignistic probabilities:

PrPed1234[1,0,0]  PrPed1234[0,1,0]  PrPed1234[0,0,1]   (64)
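The pedigree pignistic transform (59) has the same shape for every weighting function ρ, so one small routine covers the cardinality, Pl, PraPl, and Bel weightings used throughout Sections 5 to 11. The following is an illustrative Python sketch with a hypothetical BBA: with the constant weighting ρ ≡ 1 (cardinality) it reduces to Smets' BetP, with ρ = Pl it gives PrPl, and with ρ = PraPl it gives the hybrid transform PrHyb of equation (18):

```python
# Hypothetical BBA over Omega = {A, B, C}; keys are tuples of indices.
m = {(0,): 0.30, (1,): 0.20, (2,): 0.10,
     (0, 1): 0.15, (0, 2): 0.10, (1, 2): 0.05, (0, 1, 2): 0.10}
N = 3

def pr_ped(m, n, rho):
    """Pedigree pignistic transform: split each mass over the
    singletons of its focal set in proportion to the weights rho."""
    w = [rho(i) for i in range(n)]
    out = [0.0] * n
    for s, v in m.items():
        denom = sum(w[j] for j in s)
        for i in s:
            out[i] += v * w[i] / denom
    return out

def pl(i):
    """Singleton plausibility of outcome i under the BBA m."""
    return sum(v for s, v in m.items() if i in s)

def pra_pl(i):
    """PraPl(i) = Bel(i) + eps * Pl(i), with
    eps = (1 - sum of singleton beliefs) / (sum of plausibilities)."""
    bel = [m.get((j,), 0.0) for j in range(N)]
    eps = (1.0 - sum(bel)) / sum(pl(j) for j in range(N))
    return bel[i] + eps * pl(i)

bet_p = pr_ped(m, N, lambda i: 1.0)   # cardinality weighting -> BetP
prpl = pr_ped(m, N, pl)               # plausibility weighting -> PrPl
prhyb = pr_ped(m, N, pra_pl)          # PraPl weighting -> PrHyb
print([round(x, 4) for x in bet_p])   # [0.4583, 0.3333, 0.2083]
```

Each weighting fully redistributes every mass over the singletons of its focal set, so all three outputs are probability distributions (they sum to one by construction).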
8 Naïve Bayesian Fusion of the Pignistic Probability Proportional to Plausibility

The Naïve Bayesian fusions of the pignistic probabilities proportional to plausibilities (PrPl) are calculated as:

BPrPl12...N(A) = PrPl1(A)·PrPl2(A)···PrPlN(A) / Σ_{B ∈ Ω} PrPl1(B)···PrPlN(B)   (65)

BPrPl1 = { , , }   BPrPl12 = { , , }   BPrPl123 = { , , }   BPrPl1234 = { , , }   (66)

Note the equivalence between the pedigree pignistic probability with Pl weighting of the Generalized Belief Fusion algorithm with the Pl weighting function (60), (62), (64) and the probability of the Naïve Bayesian fusions as computed using BPrPl (66).

9 The Sudano Generalized Belief Fusion with PraPl Weighting

The probability proportionality function ρ is the PraPl weighting function in the Generalized Belief Fusion algorithm:

mm12(A) = (1/K) Σ_{B ∩ C = A} mm1(B)·mm2(C) · Σ_{A_j ∈ A} PraPl(A_j) / (Σ_{B_j ∈ B} PraPl(B_j) · Σ_{C_j ∈ C} PraPl(C_j))   (67)

mm12[1,0,0]  mm12[0,1,0]  mm12[0,0,1]  mm12[1,1,0]  mm12[1,0,1]  mm12[0,1,1]  mm12[1,1,1]   (68)

From the above BBAs, compute the pedigree pignistic probabilities with the PraPl weighting function:

PrPed12[1,0,0]  PrPed12[0,1,0]  PrPed12[0,0,1]   (69)

Fusing the three BBA inputs:

mm123[1,0,0]  mm123[0,1,0]  mm123[0,0,1]  mm123[1,1,0]  mm123[1,0,1]  mm123[0,1,1]  mm123[1,1,1]   (70)

From the above BBAs, compute the pedigree pignistic probabilities with the PraPl weighting function:

PrPed123[1,0,0]  PrPed123[0,1,0]  PrPed123[0,0,1]   (71)

Fusing all four inputs:

mm1234[1,0,0]  mm1234[0,1,0]  mm1234[0,0,1]  mm1234[1,1,0]  mm1234[1,0,1]  mm1234[0,1,1]  mm1234[1,1,1]   (72)

From the above BBAs, compute the pedigree pignistic probabilities with the PraPl weighting function:

PrPed1234[1,0,0]  PrPed1234[0,1,0]  PrPed1234[0,0,1]   (73)

10 Naïve Bayesian Fusion of the Hybrid Pignistic Probability

The Naïve Bayesian fusions of the hybrid pignistic probability are calculated as:

BPrHyb12...N(A) = PrHyb1(A)·PrHyb2(A)···PrHybN(A) / Σ_{B ∈ Ω} PrHyb1(B)···PrHybN(B)   (74)

BPrHyb1 = { , , }   BPrHyb12 = { , , }   BPrHyb123 = { , , }   BPrHyb1234 = { , , }   (75)

Note the equivalence
between the Sudano pedigree pignistic probability of the Generalized Belief Fusion algorithm with the PraPl weighting function (69), (71), (73) and the probability of the Naïve Bayesian fusion as computed using BPrHyb (75).

11 The Sudano Generalized Belief Fusion with Belief Weighting

The Generalized Belief Fusion with belief as the weighting function:

mm12(A) = (1/K) Σ_{B ∩ C = A} mm1(B)·mm2(C) · Σ_{A_j ∈ A} Bel(A_j) / (Σ_{B_j ∈ B} Bel(B_j) · Σ_{C_j ∈ C} Bel(C_j))   (76)

mm12[1,0,0]  mm12[0,1,0]  mm12[0,0,1]  mm12[1,1,0]  mm12[1,0,1]  mm12[0,1,1]  mm12[1,1,1]   (77)

From the above BBAs, compute the pedigree pignistic probabilities with the Bel weighting function:

PrPed12[1,0,0]  PrPed12[0,1,0]  PrPed12[0,0,1]   (78)

Fusing the first three inputs:

mm123[1,0,0]  mm123[0,1,0]  mm123[0,0,1]  mm123[1,1,0]  mm123[1,0,1]  mm123[0,1,1]  mm123[1,1,1]   (79)
From the above BBAs, compute the pedigree pignistic probabilities with the Bel weighting function:

PrPed123[1,0,0]  PrPed123[0,1,0]  PrPed123[0,0,1]   (80)

Fusing all four inputs:

mm1234[1,0,0]  mm1234[0,1,0]  mm1234[0,0,1]  mm1234[1,1,0]  mm1234[1,0,1]  mm1234[0,1,1]  mm1234[1,1,1]   (81)

From the above BBAs, compute the pedigree pignistic probabilities:

PrPed1234[1,0,0]  PrPed1234[0,1,0]  PrPed1234[0,0,1]   (82)

12 Naïve Bayesian Fusion of the Pignistic Probability Proportional to Beliefs

The Naïve Bayesian fusions of the pedigree pignistic probabilities with the Bel weighting function, or the pignistic probability proportional to belief (22), (23), are calculated as:

BPrBl12...N(A) = PrBlmm1(A)·PrBlmm2(A)···PrBlmmN(A) / Σ_{B ∈ Ω} PrBlmm1(B)···PrBlmmN(B)   (83)

BPrBl1 = { , , }   BPrBl12 = { , , }   BPrBl123 = { , , }   BPrBl1234 = { , , }   (84)

13 Naïve Bayesian Fusion of the Original Four Probability Distributions

The Naïve Bayesian fusions of the four original probability distributions (1) are calculated as:

BPD12...N(A) = PD1(A)·PD2(A)···PDN(A) / Σ_{B ∈ Ω} PD1(B)···PDN(B)

14 Results

An example has been generated for fusing four independent belief data sources. Thirteen fusion comparisons for the same four input measurements have been computed; the thirteen fusion results fall into five unique classes. Figure 1 shows the thirteen fusion comparisons of the most probable hypothesis for the same four independent input measurements.

Figure 1: Comparison of thirteen methods of fusing the same four independent belief measurements; the probability increase of the most probable state is shown. Legend: GBF(Bel) + PrPed(Bl); BNF(PrBl); BNF(PD); GBF(PraPl) + PrPed; BNF(PrHyb); GBF(Pl) + PrPed; BNF(PrPl); GBF(Cardinality) + PrPed; MDS + BetP; BNF(PrPed(Cardinality)); BNF(BetP); DS-BF + PrNPl; GBF(1) + PrPed; BNF(PrNPl).
BPD1 = { , , }   BPD12 = { , , }   BPD123 = { , , }   BPD1234 = { , , }   (85)

Note the equivalence between the Sudano pedigree pignistic probability of the Generalized Belief Fusion algorithm with the belief weighting function (78), (80), (82) and the probability of the Naïve Bayesian fusion as computed using BPrBl (84).

The results fall into five unique curves.

Curve 1 shows the numerical equivalence between the Dempster-Shafer belief fusion mapped via the pignistic probability estimate proportional to the normalized plausibility (PrNPl) and the Naïve Bayesian fusion of the PrNPl computed for each input BBA set.

Curve 2 shows the numerical equivalence between (a) the Fixsen-Mahler Modified Dempster-Shafer (MDS) belief fusion mapped via Smets' pignistic probability estimate
(BetP), (b) the Sudano Generalized Belief Fusion with cardinality weighting mapped via the pedigree pignistic probabilities with cardinality weighting, and (c) the Naïve Bayesian fusion of the BetP computed for each input BBA set.

Curve 3 shows the numerical equivalence between the Sudano Generalized Belief Fusion with the plausibility weighting mapped via the pedigree pignistic probabilities with plausibility weighting, and the Naïve Bayesian fusion of the pignistic probability estimate proportional to the plausibility (PrPl) computed for each input BBA set.

Curve 4 shows the numerical equivalence between the Sudano Generalized Belief Fusion with the pignistic probability proportional to all plausibilities (PraPl) weighting mapped via the pedigree pignistic probabilities with PraPl weighting, and the Naïve Bayesian fusion of the hybrid pignistic probability computed for each input BBA set.

Curve 5 shows the numerical equivalence between the Sudano Generalized Belief Fusion with the pignistic probability proportional to belief (Bel) weighting mapped via the pedigree pignistic probabilities with Bel weighting, and the Naïve Bayesian fusion of the pignistic probability proportional to belief computed for each input BBA set. Since the BBAs are computed by the Inverse Pignistic Probability Transform (IPPT) using the Generalized Sum Mean, so that PrBl(Pd) = Pd, the Naïve Bayesian fusion as computed using the original probability distributions is also equivalent.

15 Conclusion

The process of fusing multiple independent sensor measurements, communication-link data from other independent systems, and dynamic database information is essential to support critical decisions in a timely way. Many real systems can be mapped to such a process. The independence of the input evidential data with an equally probable (uniform) prior probability distribution (i.e., Naïve Bayesian fusion) greatly simplifies the mathematical techniques used to properly fuse the evidential data.
Equivalence between the pignistic probability estimates of the belief fusion of the BBAs and the Naïve Bayesian fusion of the pignistic probability estimates of the individual BBAs has been shown for this process. The equivalence comparison is done in probability space.

The practical implications are notable for information fusion processes in many real systems. For many such systems, some inputs to the information fusion process are better represented by the exponential belief (power-set, 2^Ω) representation of the incomplete information set. Via an appropriate pignistic probability transform, all these inputs are mapped into the linear probability set representations and fused. This greatly simplifies the computational complexity, since the equivalent fusion results are obtained in linear probability space rather than exponential belief space.

References

[1] Shafer, G., A Mathematical Theory of Evidence, Princeton University Press, 1976.
[2] Fixsen, D. and Mahler, R., "A Dempster-Shafer Approach to Bayesian Classification," Proc. 5th Int'l Symp. on Sensor Fusion, Vol. I (Unclassified), Naval Training Center, Orlando, FL, April 1992.
[3] Smets, P. and Kennes, R., "The Transferable Belief Model," Artificial Intelligence, vol. 66, 1994.
[4] Fister, T. and Mitchell, R., "Modified Dempster-Shafer with Entropy Based Belief Body Compression," Proc. 1994 Joint Service Combat Identification Systems Conference (CISC), Naval Postgraduate School, CA, August 1994.
[5] Fixsen, D. and Mahler, R.P.S., "The Modified Dempster-Shafer Approach to Classification," IEEE Transactions on Systems, Man and Cybernetics, Part A, Vol. 27, No. 1, Jan. 1997.
[6] Lewis, D., "Naive (Bayes) at Forty: The Independence Assumption in Information Retrieval," Proceedings of the 10th European Conference on Machine Learning, ECML-98.
[7] Androutsopoulos, I., Koutsias, J., Chandrinos, K. V., Paliouras, G., and Spyropoulos, C. D., "An Evaluation of Naïve Bayesian Anti-Spam Filtering," in Proc.
of the Workshop on Machine Learning in the New Information Age.
[8] Peri, Joseph, "Data Fusion & Target ID: Dempster-Shafer & Probability Theories Holy War," May 18, 2001, Parsons Auditorium, JHU Applied Physics Lab Colloquium.
[9] Sudano, John J., "Pignistic Probability Transforms for Mixes of Low- and High-Probability Events," Fourth International Conference on Information Fusion, Montreal, QC, Canada, August 2001.
[10] Sudano, John J., "The System Probability Information Content (PIC) Relationship to Contributing Components, Combining Independent Multi-Source Beliefs, Hybrid and Pedigree Pignistic Probabilities," Proceedings of the Fifth International Conference on Information Fusion, Vol. 2, 2002.
[11] Sudano, John J., "Inverse Pignistic Probability Transforms," Proceedings of the Fifth International Conference on Information Fusion, Vol. 2, 2002.
[12] Dezert, J., "Foundations for a New Theory of Plausible and Paradoxical Reasoning," Information and Security Journal, An International Journal, edited by Tzvetan Semerdjiev, CLPP, Bulgarian Academy of Sciences, Sofia, Nov. 2002.
[13] Sudano, John J., "Generalized Belief Fusion Algorithm," Proceedings of the Sixth International Conference on Information Fusion, 2003.
[14] Sudano, John J., "Equivalence Between Belief Theories and Naïve Bayesian Fusion for Systems with Independent Evidential Data: Part I, The Theory," Proceedings of the Sixth International Conference on Information Fusion, 2003.
More informationData Fusion with Imperfect Implication Rules
Data Fusion with Imperfect Implication Rules J. N. Heendeni 1, K. Premaratne 1, M. N. Murthi 1 and M. Scheutz 2 1 Elect. & Comp. Eng., Univ. of Miami, Coral Gables, FL, USA, j.anuja@umiami.edu, kamal@miami.edu,
More informationFusion of imprecise beliefs
Jean Dezert 1, Florentin Smarandache 2 1 ONERA, 29 Av. de la Division Leclerc 92320, Chatillon, France 2 Department of Mathematics University of New Mexico Gallup, NM 8730, U.S.A. Fusion of imprecise beliefs
More informationA Generalization of Bayesian Inference in the Dempster-Shafer Belief Theoretic Framework
A Generalization of Bayesian Inference in the Dempster-Shafer Belief Theoretic Framework J. N. Heendeni, K. Premaratne, M. N. Murthi, J. Uscinski University of Miami Coral Gables, FL, USA Email: j.anuja@umiami.edu,
More informationAppeared in: International Journal of Approximate Reasoning, 41(3), April 2006, ON THE PLAUSIBILITY TRANSFORMATION METHOD FOR TRANSLATING
Appeared in: International Journal of Approximate Reasoning, 41(3), April 2006, 314--330. ON THE PLAUSIBILITY TRANSFORMATION METHOD FOR TRANSLATING BELIEF FUNCTION MODELS TO PROBABILITY MODELS Barry R.
More informationCombination of classifiers with optimal weight based on evidential reasoning
1 Combination of classifiers with optimal weight based on evidential reasoning Zhun-ga Liu 1, Quan Pan 1, Jean Dezert 2, Arnaud Martin 3 1. School of Automation, Northwestern Polytechnical University,
More informationMulti Sensor Data Fusion, Methods and Problems
Multi Sensor Data Fusion, Methods and Problems Rawa Adla 1, Youssef Bazzi 2, and Nizar Al-Holou 1 1 Department of Electrical and Computer Engineering, University of Detroit Mercy, Detroit, MI, U.S.A 2
More informationMulti-Object Association Decision Algorithms with Belief Functions
ulti-object Association Decision Algorithms with Belief Functions Jérémie Daniel and Jean-Philippe Lauffenburger Université de Haute-Alsace UHA) odélisation Intelligence Processus Systèmes IPS) laboratory
More informationA generic framework for resolving the conict in the combination of belief structures E. Lefevre PSI, Universite/INSA de Rouen Place Emile Blondel, BP
A generic framework for resolving the conict in the combination of belief structures E. Lefevre PSI, Universite/INSA de Rouen Place Emile Blondel, BP 08 76131 Mont-Saint-Aignan Cedex, France Eric.Lefevre@insa-rouen.fr
More informationGenetic Algorithm Based on Similarity for Probabilistic Transformation of Belief Functions
Genetic Algorithm Based on Similarity for Probabilistic Transformation of Belief Functions Yilin Dong a, Xinde Li a,, Jean Dezert b, Pei Li a, Xianghui Li a a Key Laboratory of Measurement and Control
More informationClassical Belief Conditioning and its Generalization to DSm Theory
Journal of Uncertain Systems Vol.2, No.4, pp.267-279, 2008 Online at: www.jus.org.uk Classical Belief Conditioning and its Generalization to DSm Theory ilan Daniel Institute of Computer Science, Academy
More informationA Mathematical Theory of Identification for Information Fusion
A Mathematical Theory of Identification for Information Fusion Tod M. Schuck Lockheed Martin Naval Electronic and Surveillance Systems Surface Systems P.O. Box 027 99 Borton Landing Road Building 3000
More informationSequential adaptive combination of unreliable sources of evidence
Sequential adaptive combination of unreliable sources of evidence Zhun-ga Liu, Quan Pan, Yong-mei Cheng School of Automation Northwestern Polytechnical University Xi an, China Email: liuzhunga@gmail.com
More informationData Fusion with Entropic Priors
Data Fusion with Entropic Priors Francesco PALMIERI, Domenico CIUONZO Dipartimento di Ingegneria dell Informazione, Seconda Università di Napoli, Italy Abstract. In classification problems, lack of knowledge
More informationCounter-examples to Dempster s rule of combination
Jean Dezert 1, Florentin Smarandache 2, Mohammad Khoshnevisan 3 1 ONERA, 29 Av. de la Division Leclerc 92320, Chatillon, France 2 Department of Mathematics University of New Mexico Gallup, NM 8730, U.S.A.
More informationA Fuzzy-Cautious OWA Approach with Evidential Reasoning
Advances and Applications of DSmT for Information Fusion Collected Works Volume 4 A Fuzzy-Cautious OWA Approach with Evidential Reasoning Deqiang Han Jean Dezert Jean-Marc Tacnet Chongzhao Han Originally
More informationUncertainty Measurement for Ultrasonic Sensor Fusion Using Generalized Aggregated Uncertainty Measure 1
AUT Journal of Modeling and Simulation AUT J. Model. Simul., 49(1)(2017)85-94 DOI: 10.22060/miscj.2016.827 Uncertainty Measurement for Ultrasonic Sensor Fusion Using Generalized Aggregated Uncertainty
More informationarxiv:cs/ v1 [cs.ai] 6 Sep 2004
The Generalized Pignistic Transformation Jean Dezert Florentin Smarandache Milan Daniel ONERA Dpt.of Mathematics Institute of Computer Science 9 Av. de la Div. Leclerc Univ. of New Mexico Academy of Sciences
More informationThe maximum Deng entropy
The maximum Deng entropy Bingyi Kang a, Yong Deng a,b,c, a School of Computer and Information Science, Southwest University, Chongqing, 40075, China b School of Electronics and Information, Northwestern
More informationSensor Data Fusion. Edited by Tzvetan Semerdjiev. Volume 9, Information & Security. Plausible and Paradoxical Reasoning. Sensor Data Processing
Information & Security Sensor Data Fusion Volume 9, 2002 Edited by Tzvetan Semerdjiev Editorial by Tzvetan Semerdjiev Reasoning and Object-Oriented Data Processing for Multisensor Data Fusion Plausible
More informationContext-dependent Combination of Sensor Information in Dempster-Shafer Theory for BDI
Context-dependent Combination of Sensor Information in Dempster-Shafer Theory for BDI Sarah Calderwood Kevin McAreavey Weiru Liu Jun Hong Abstract There has been much interest in the Belief-Desire-Intention
More informationEstimation of Target Behavior Tendencies using Dezert-Smarandache Theory
Estimation of Target Behavior Tendencies using Dezert-Smarandache Theory Albena Tchamova Tzvetan Semerdjiev Central Laboratory for Parallel Processing Central Laboratory for Parallel Processing Bulgarian
More informationThe Use of Locally Weighted Regression for the Data Fusion with Dempster-Shafer Theory
The Use of Locally Weighted Regression for the Data Fusion with Dempster-Shafer Theory by Z. Liu, D. S. Forsyth, S. M. Safizadeh, M.Genest, C. Mandache, and A. Fahr Structures, Materials Performance Laboratory,
More informationKnowledge Discovery in Clinical Databases with Neural Network Evidence Combination
nowledge Discovery in Clinical Databases with Neural Network Evidence Combination #T Srinivasan, Arvind Chandrasekhar 2, Jayesh Seshadri 3, J B Siddharth Jonathan 4 Department of Computer Science and Engineering,
More informationEvidence combination for a large number of sources
Evidence combination for a large number of sources Kuang Zhou a, Arnaud Martin b, and Quan Pan a a. Northwestern Polytechnical University, Xi an, Shaanxi 710072, PR China. b. DRUID, IRISA, University of
More informationMidterm 2 V1. Introduction to Artificial Intelligence. CS 188 Spring 2015
S 88 Spring 205 Introduction to rtificial Intelligence Midterm 2 V ˆ You have approximately 2 hours and 50 minutes. ˆ The exam is closed book, closed calculator, and closed notes except your one-page crib
More informationA PCR-BIMM filter For Maneuvering Target Tracking
A PCR-BIMM filter For Maneuvering Target Tracking Jean Dezert Benjamin Pannetier Originally published as Dezert J., Pannetier B., A PCR-BIMM filter for maneuvering target tracking, in Proc. of Fusion 21,
More informationA Comparison of Methods for Transforming Belief Function Models to Probability Models
Appeared in: TD Nielsen & NL Zhang (eds.), Symbolic and Quantitative Approaches to Reasoning with Uncertainty, 2003, 255 266, Springer, Berlin. A Comparison of Methods for Transforming Belief Function
More informationA Real Z-box Experiment for Testing Zadeh s Example
18th International Conference on Information Fusion Washington, DC - July 6-9, 2015 A Real Z-box Experiment for Testing Zadeh s Example Jean Dezert The French Aerospace Lab Chemin de la Hunière F-91761
More informationMeasure divergence degree of basic probability assignment based on Deng relative entropy
Measure divergence degree of basic probability assignment based on Deng relative entropy Liguo Fei a, Yong Deng a,b,c, a School of Computer and Information Science, Southwest University, Chongqing, 400715,
More informationLearning from Data. Amos Storkey, School of Informatics. Semester 1. amos/lfd/
Semester 1 http://www.anc.ed.ac.uk/ amos/lfd/ Introduction Welcome Administration Online notes Books: See website Assignments Tutorials Exams Acknowledgement: I would like to that David Barber and Chris
More informationUsing Belief Propagation to Counter Correlated Reports in Cooperative Spectrum Sensing
Using Belief Propagation to Counter Correlated Reports in Cooperative Spectrum Sensing Mihir Laghate and Danijela Cabric Department of Electrical Engineering, University of California, Los Angeles Emails:
More informationA MULTI-SENSOR FUSION TRACK SOLUTION TO ADDRESS THE MULTI-TARGET PROBLEM
Approved for public release; distribution is unlimited. A MULTI-SENSOR FUSION TRACK SOLUTION TO ADDRESS THE MULTI-TARGET PROBLEM By Dr. Buddy H. Jeun, Jay Jayaraman Lockheed Martin Aeronautical Systems,
More informationContradiction Measures and Specificity Degrees of Basic Belief Assignments
Contradiction Measures and Specificity Degrees of Basic Belief Assignments Florentin Smarandache Arnaud Martin Christophe Osswald Originally published as: Smarandache F., Martin A., Osswald C - Contradiction
More informationStatistical methods for decision making in mine action
Statistical methods for decision making in mine action Jan Larsen Intelligent Signal Processing Technical University of Denmark jl@imm.dtu.dk, www.imm.dtu.dk/~jl Jan Larsen 1 Why do we need statistical
More informationStatistical Multisource-Multitarget Information Fusion
Statistical Multisource-Multitarget Information Fusion Ronald P. S. Mahler ARTECH H O U S E BOSTON LONDON artechhouse.com Contents Preface Acknowledgments xxm xxv Chapter 1 Introduction to the Book 1 1.1
More informationCautious OWA and Evidential Reasoning for Decision Making under Uncertainty
Cautious OWA and Evidential Reasoning for Decision Making under Uncertainty Jean-Marc Tacnet Cemagref -ETGR 2, rue de la papèterie - B.P. 76 F-38402 Saint Martin d Hères Cedex, France Email: jean-marc.tacnet@cemagref.fr
More informationIntegrating Correlated Bayesian Networks Using Maximum Entropy
Applied Mathematical Sciences, Vol. 5, 2011, no. 48, 2361-2371 Integrating Correlated Bayesian Networks Using Maximum Entropy Kenneth D. Jarman Pacific Northwest National Laboratory PO Box 999, MSIN K7-90
More informationIN CHANGE detection from heterogeneous remote sensing
168 IEEE GEOSCIENCE AND REMOTE SENSING LETTERS, VOL. 11, NO. 1, JANUARY 014 Change Detection in Heterogeneous Remote Sensing Images Based on Multidimensional Evidential Reasoning Zhun-ga Liu, Grégoire
More informationReasoning with Uncertainty
Reasoning with Uncertainty Representing Uncertainty Manfred Huber 2005 1 Reasoning with Uncertainty The goal of reasoning is usually to: Determine the state of the world Determine what actions to take
More informationDecision theory. 1 We may also consider randomized decision rules, where δ maps observed data D to a probability distribution over
Point estimation Suppose we are interested in the value of a parameter θ, for example the unknown bias of a coin. We have already seen how one may use the Bayesian method to reason about θ; namely, we
More informationAn Improved Focal Element Control Rule
vailable online at www.sciencedirect.com Procedia ngineering 5 (0) 7 dvanced in Control ngineeringand Information Science n Improved Focal lement Control Rule JIN Hong-bin a, LN Jiang-qiao b a* a ir Force
More informationUnderstanding Loading in Feedback Amplifier Analysis
Understanding Loading in Feedback Amplifier Analysis Manuel ToledoQuiñones Electrical and Computer Engineering Department University of Puerto Rico Mayagüez, Puerto Rico Session 1532 Introduction The application
More informationApplication of DSmT for Land Cover Change Prediction
Samuel Corgne 1, Laurence Hubert-Moy 2, Gregoire Mercier 3, Jean Dezert 4 1, 2 COSTEL, CNRS UMR LETG 6554, Univ. Rennes 2, Place du recteur Henri Le Moal, 35043 Rennes, France 3 AMCIC, CNRS FRE 2658 team
More informationMATHEMATICS OF DATA FUSION
MATHEMATICS OF DATA FUSION by I. R. GOODMAN NCCOSC RDTE DTV, San Diego, California, U.S.A. RONALD P. S. MAHLER Lockheed Martin Tactical Defences Systems, Saint Paul, Minnesota, U.S.A. and HUNG T. NGUYEN
More informationSolution of the VBIED problem using Dezert-Smarandache Theory (DSmT)
Solution of the VBIED problem using Dezert-Smarandache Theory (DSmT) Dr. Jean Dezert The French Aerospace Lab. - ONERA 29 Av. de la Division Leclerc 92320 Châtillon, France jean.dezert@onera.fr Dr. Florentin
More informationMonte-Carlo Approximations for Dempster-Shafer Belief Theoretic Algorithms
4th International Conference on Information Fusion Chicago, Illinois, USA, July 5-8, Monte-Carlo Approximations for Dempster-Shafer Belief Theoretic Algorithms Thanuka L. Wickramarathne, Kamal Premaratne,
More informationDecision of Prognostics and Health Management under Uncertainty
Decision of Prognostics and Health Management under Uncertainty Wang, Hong-feng Department of Mechanical and Aerospace Engineering, University of California, Irvine, 92868 ABSTRACT The decision making
More informationData Fusion in the Transferable Belief Model.
Data Fusion in the Transferable Belief Model. Philippe Smets IRIDIA Université Libre de Bruxelles 50 av. Roosevelt,CP 194-6,1050 Bruxelles,Belgium psmets@ulb.ac.be http://iridia.ulb.ac.be/ psmets Abstract
More informationIntroduction to belief functions
Introduction to belief functions Thierry Denœux 1 1 Université de Technologie de Compiègne HEUDIASYC (UMR CNRS 6599) http://www.hds.utc.fr/ tdenoeux Spring School BFTA 2011 Autrans, April 4-8, 2011 Thierry
More informationRecall from last time: Conditional probabilities. Lecture 2: Belief (Bayesian) networks. Bayes ball. Example (continued) Example: Inference problem
Recall from last time: Conditional probabilities Our probabilistic models will compute and manipulate conditional probabilities. Given two random variables X, Y, we denote by Lecture 2: Belief (Bayesian)
More informationFission of Opinions in Subjective Logic
Fission of Opinions in Subjective Logic udun Jøsang University of Oslo - UNIK Graduate Center josang @ matnat.uio.no bstract Opinion fusion in subjective logic consists of combining separate observers
More informationREVEAL. Receiver Exploiting Variability in Estimated Acoustic Levels Project Review 16 Sept 2008
REVEAL Receiver Exploiting Variability in Estimated Acoustic Levels Project Review 16 Sept 2008 Presented to Program Officers: Drs. John Tague and Keith Davidson Undersea Signal Processing Team, Office
More informationApplication of New Absolute and Relative Conditioning Rules in Threat Assessment
Application of New Absolute and Relative Conditioning Rules in Threat Assessment Ksawery Krenc C4I Research and Development Department OBR CTM S.A. Gdynia, Poland Email: ksawery.krenc@ctm.gdynia.pl Florentin
More informationLearning from data with uncertain labels by boosting credal classifiers
Learning from data with uncertain labels by boosting credal classifiers ABSTRACT Benjamin Quost HeuDiaSyC laboratory deptartment of Computer Science Compiègne University of Technology Compiègne, France
More informationBayesian Reasoning. Adapted from slides by Tim Finin and Marie desjardins.
Bayesian Reasoning Adapted from slides by Tim Finin and Marie desjardins. 1 Outline Probability theory Bayesian inference From the joint distribution Using independence/factoring From sources of evidence
More informationMulticomponent DS Fusion Approach for Waveform EKG Detection
Multicomponent DS Fusion Approach for Waveform EKG Detection Nicholas Napoli University of Virginia njn5fg@virginia.edu August 10, 2013 Nicholas Napoli (UVa) Multicomponent EKG Fusion August 10, 2013 1
More informationOn belief functions implementations
On belief functions implementations Arnaud Martin Arnaud.Martin@univ-rennes1.fr Université de Rennes 1 - IRISA, Lannion, France Xi an, July, 9th 2017 1/44 Plan Natural order Smets codes General framework
More informationAnalysis of information fusion combining rules under the DSm theory using ESM inputs
Analysis of information fusion combining rules under the DSm theory using ESM inputs Pascal Djiknavorian Dominic Grenier Département de Génie Électrique et Informatique Faculté des Sciences et de Génie,
More informationA New Uncertainty Measure in Belief Entropy Framework
A New Uncertainty Measure in Belief Entropy Framework Moïse Digrais Mambé,4, Tchimou N Takpé 2,4, Nogbou Georges Anoh 3,4, Souleymane Oumtanaga,4 Institut National Polytechnique Félix Houphouët-Boigny,
More information