Multi Sensor Data Fusion, Methods and Problems

Multi Sensor Data Fusion, Methods and Problems

Rawa Adla 1, Youssef Bazzi 2, and Nizar Al-Holou 1
1 Department of Electrical and Computer Engineering, University of Detroit Mercy, Detroit, MI, U.S.A.
2 Department of Electrical and Computer Engineering, Lebanese University, Beirut, Lebanon

Abstract - Sensors, which monitor the surrounding environment in order to enhance our decisions, play a major role in our lives and contribute to our actions. A single sensor, however, cannot provide enough information; multiple sensors must therefore be integrated in a way that performs the additional task of interpretation, which can be more helpful and informative than what a single sensor can observe. Since a sensor's functional characteristics can lead to output containing erroneous measurement readings due to noise, measurement errors, and delays, multiple sensors are needed to confirm the certainty of desired actions. For the sensors to work together, a computational system is required to fuse the sensor data, in a process called multi-sensor data fusion. This paper presents an overview of multi-sensor data fusion using two techniques, Bayes and Dempster-Shafer, and highlights the techniques' shortcomings.

Keywords: sensor fusion; Bayes theory; Dempster-Shafer; fusion model.

1 Introduction

Sensor-data fusion refers to the integration of data from multiple sensors of identical or different technical characteristics. The merits of sensor fusion methods are to provide a better estimate of the feature of interest and to produce a result, represented by a hypothesis, that is more accurate than could be obtained using a single sensor.
Many problems arise when designing a system with a single sensor [1]: data are lost when the sensor fails; an individual sensor provides data only from its own field of view; the frequency of measurements is limited by the time the sensor needs to process its data; accuracy is limited by the precision of the sensor's sensing system; and there is uncertainty in the sensor's measurements of the objects it detects. In order to enhance the certainty of the observed data, a multi-sensor data fusion system is required.

Countless benefits can be derived from using sensor-data fusion methods, depending on the system and the nature of the application, since each system expects at least one of the following features [2][3]: redundancy, which keeps the whole system active even when a sensor failure or channel breakdown occurs; improved coverage area, which makes complementary data available; increased confidence in the results and in the certainty of the information; enhanced spatial resolution; extended temporal coverage; and an improved object detection rate.

Over the past few decades, sensor-data fusion has been researched extensively and has contributed developments to many fields, such as science, technology, and engineering. Building a multi-sensor data fusion system requires a deep understanding of the application's characteristics. This helps in choosing a suitable data-fusion architecture and in setting the data-transmission facets. Then the optimal fusion technique must be determined and a reliable algorithm for estimation and prediction acquired [4].

This paper is organized as follows: Section 2 presents the types of sensors, and Section 3 defines the levels of data fusion. Section 4 describes the fusion architectures, and Section 5 presents an overview of multi-sensor fusion models.
Section 6 discusses two methods of combining data, Bayes and the theory of evidence. The paper concludes with an evaluation of these two fusion techniques and a statement of their shortcomings.

2 Types of Sensors

There are two types of sensors: active and passive [5]. In active sensors, the output is generated as a result of a stimulation that causes an alteration in the electrical amplitude. This type of sensor requires a power source for excitation, and it requires data security because it transfers data directly. Examples include sonar, radar, and ultra-wideband sensors. Passive sensors, by contrast, do not require an electrical power source to work; they generate a voltage output from the energy they sense. Passive sensor data is also considered more secure than active sensor data. As an example, a camera senses the amount of light it receives from the environment and converts it into voltage levels; the variations in these voltages are stored as different pixel values in the computer. Each of these types of sensors can be further subdivided depending on whether the targets identify themselves or not; that is, whether the sensors are cooperative or uncooperative [5].

3 Sensor-Data Fusion Levels

In general, the process of sensor-data fusion can be categorized into three levels: low, intermediate, and high [4].

Low-level: also referred to as raw-data fusion. Raw data from multiple sensors of the same type are fused to generate a new raw data set that is more informative than the original raw data collected by the sensors.

Intermediate-level: also called feature fusion. It combines different features into a feature map, which is usually used for the detection of objects.

High-level: known as decision fusion. It requires a fusion method, such as a statistical or fuzzy-logic method, to compute a hypothesis value representing the decision.

4 Fusion Architecture

Three different multi-sensor data fusion architectures exist: centralized, pre-processed, and hybrid [6][7].

Centralized fusion: this architecture is employed when sensors of the same type are selected. The fusion process combines the raw data from all sensors in a central processor, which requires time synchronization of all sensors' data. The decision resulting from this type of fusion is based on the amount of data collected by the sensors. The advantages of centralized fusion are data security and the absence of data loss in preprocessing. On the other hand, centralized fusion requires sending all the collected data to the central processor, which is not acceptable in some applications.

Pre-processed fusion: also known as distributed fusion; used for sensors of the same or different types. In this architecture, the central processor is relieved of preprocessing the raw data from the individual sensors.

Hybrid fusion: this architecture mixes the centralized and pre-processed fusion schemes. In practice, it is the best fusion set-up.
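The difference between the centralized and pre-processed (distributed) schemes of Section 4 can be sketched in a few lines of Python. This is a minimal illustration only: the averaging fusion operator, the local pre-processing step, and the sensor readings are all invented here, since the paper does not prescribe a particular operator.

```python
# Minimal sketch of centralized vs. pre-processed (distributed) fusion.
# The mean operator and the readings are illustrative assumptions only.
from statistics import mean

raw = {
    's1': [10.1, 10.3, 9.9],   # time-synchronized raw readings per sensor
    's2': [10.4, 10.2, 10.0],
    's3': [9.8, 10.0, 10.2],
}

def centralized_fusion(raw_readings):
    """Every raw sample is shipped to one central processor and fused there."""
    samples = [x for readings in raw_readings.values() for x in readings]
    return mean(samples)

def distributed_fusion(raw_readings):
    """Each sensor pre-processes locally; only its local estimate is sent."""
    local_estimates = [mean(readings) for readings in raw_readings.values()]
    return mean(local_estimates)
```

With equal-length streams the two schemes agree numerically; the practical difference is bandwidth: centralized fusion ships every sample to the central processor, while distributed fusion ships one estimate per sensor.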
5 Fusion Models

There are several fusion models to consider when designing a multi-sensor data fusion system; they are reviewed in this section.

5.1 Joint Directors of Laboratories (JDL) Model

The JDL model was proposed by the U.S. Department of Defense (DoD) in 1986 [8]. It consists of five levels of data processing [8]:

Level 0 is Source Preprocessing, which determines the time, type, and identity of the collected data.

Level 1 is Object Refinement, which combines the information received from Level 0 in order to generate a representation of individual objects.

Level 2 is Situation Refinement, which defines the relationships between objects and detected targets and incorporates environmental information and observations.

Level 3 is Threat Refinement, in which inferences are constructed about the target to provide a solid basis for decisions and actions. These inferences are based on a priori information and predictions about the next state.

Level 4 is Process Refinement, which consists of three processes: monitoring the efficiency of the data fusion, defining the missing information needed to enhance multi-level data fusion, and allocating sensors to accomplish the aim of the fusion process.

In addition, the JDL model has a database management system responsible for controlling the data fusion processes, and a bus interconnecting the levels [3][8].

5.2 Waterfall Model

This model assigns processing priority to the lower levels first. The fusion process is based on six levels of processing [9], which can be matched to the JDL model levels: the first two levels, Sensing and Signal Processing, are similar to the first level of the JDL model (Level 0); the next two levels, Feature Extraction and Pattern Processing (Levels 3 and 4), relate to Level 1 of the JDL model; and Level 5, Situation Assessment, can be matched to Level 2 of the JDL model.
Level 6 is Decision Making, which is equivalent to the Threat Refinement level of the JDL model (Level 3). The advantage of the Waterfall model is that it is simple to understand and apply, but it has a major limitation: there is no feedback loop between the levels.

5.3 Intelligence Cycle Based Model

The Intelligence Cycle Based model is a cyclic model that captures some inherent processing behaviors among its stages. It consists of five stages [1][10]: Stage 1 is Planning and Direction, where the requirements are determined; Stage 2 is Collection, which gathers the required information; Stage 3 is Collation, which streamlines the collected information; Stage 4 is Evaluation, where the fusion process occurs; and Stage 5 is Dissemination, which distributes the fused inferences.

5.4 Boyd Model

The Boyd model is a cyclic model [11]. Its cycle has four stages of processing: observation, orientation, decision, and action. Each stage can be matched to a specific level of the JDL model [3]. The Observation stage is similar to Source Preprocessing in the JDL model and to the Collection stage of the Intelligence Cycle model. The Orientation stage is comparable to Levels 1, 2, and 3 of the JDL model, and can be considered similar to the Collection and Collation stages of the Intelligence Cycle model. The Decision stage is comparable to the Process Refinement level of the JDL model and equivalent to the Evaluation and Dissemination stages of the Intelligence Cycle model. The Action stage has no match in the JDL model, but it can be considered equivalent to the Dissemination stage of the Intelligence Cycle model. A better fusion process model can be obtained by combining the Intelligence Cycle and Boyd models, as in the Omnibus model.

5.5 Omnibus Model

This model is based on a combination of the Intelligence Cycle and Boyd models. Its cyclic structure can be compared to the Boyd loop, with enhancements to the structure of the processing levels [12]. The observation step of the Boyd model is modeled as sensing and signal processing; the orientation step is carried out as feature extraction and pattern processing; the decision step is replaced in the Omnibus model by two phases, context processing and decision making; and the action step is divided into control and resource tasking. The Omnibus model is considered more efficient and more general than the other fusion models, such as the Waterfall, Intelligence Cycle, and Boyd models; it takes the strengths of the previous models and at the same time is considered very easy to employ in a sensor fusion system.
5.6 Thomopoulos Model

The architecture of this model is based on three levels of data processing: the signal level, the evidence level, and the dynamics level [13]. The signal level is a correlation and learning process that integrates sensor data. The evidence level describes the statistical model that processes data from different sensors into some form of local inference. The dynamics level combines different observations in a centralized or decentralized fashion, assuming that a mathematical model describing the process from which the data are collected is known [13].

6 Fusion Techniques

Once the fusion architecture has been designed, one or more fusion techniques must be implemented to fuse the data at the different levels, keeping in mind that uncertainty exists in all descriptions of the sensing and data fusion process. An explicit measure of this uncertainty must therefore be provided in order to fuse the sensory information in an efficient and predictable manner [14]. Although there are many methods for representing uncertainty, almost all of the theoretical developments are based on probabilistic models [14]. Probabilistic models play an important role in developing data fusion methods, and a good understanding of probabilistic modeling techniques is essential whenever data fusion is to be performed in a coherent manner [14]. This section discusses two approaches used in sensor fusion systems: the Bayesian posterior, as a probabilistic modeling technique, and an alternative to probability, the Dempster-Shafer theory of evidence. The Bayes approach deals with probability values in which incomplete information, which leads to uncertainty, is not accounted for; that is, P(A) + P(¬A) = 1. Dempster-Shafer, by contrast, accounts for incomplete information in the mass function, where the masses assigned to A, to its complement, and to the full frame θ (ignorance) sum to one: m(A) + m(¬A) + m(θ) = 1.
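To make the contrast concrete, the following minimal Python sketch represents both views and applies the Dempster-Shafer combination rule that Section 6.2 develops as eq. (8). All numeric mass and probability values here are invented for illustration.

```python
# Contrasting Bayesian probabilities with Dempster-Shafer mass functions.
# All numeric values below are illustrative assumptions only.
from itertools import product

THETA = frozenset({'t1', 't2'})  # frame of discernment

# Bayesian view: P(A) + P(not A) = 1, no mass left for ignorance.
bayes = {'t1': 0.7, 't2': 0.3}
assert abs(sum(bayes.values()) - 1.0) < 1e-9

# Dempster-Shafer view: part of the mass may sit on the whole frame
# theta, expressing ignorance: m(t1) + m(t2) + m(theta) = 1.
m1 = {frozenset({'t1'}): 0.6, frozenset({'t2'}): 0.2, THETA: 0.2}
m2 = {frozenset({'t1'}): 0.5, frozenset({'t2'}): 0.3, THETA: 0.2}

def combine(ma, mb):
    """Dempster's rule of combination (eq. 8): conjunctive pooling of
    two mass functions, renormalized by (1 - K)."""
    pooled, k = {}, 0.0
    for (b, wb), (c, wc) in product(ma.items(), mb.items()):
        inter = b & c
        if inter:
            pooled[inter] = pooled.get(inter, 0.0) + wb * wc
        else:
            k += wb * wc  # mass on empty intersections is the conflict K
    return {a: w / (1.0 - k) for a, w in pooled.items()}, k

m12, conflict = combine(m1, m2)
assert abs(sum(m12.values()) - 1.0) < 1e-9  # combined masses sum to one
```

Note that after combination some mass can remain on θ itself, which is exactly the ignorance that a Bayesian assignment cannot express.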
6.1 Multi-Sensor Data Fusion Using the Bayes Technique

The main objective when developing a sensor fusion system is to integrate data from multiple sensors, whether identical or different, active or passive, in order to generate a reliable estimate of the object of interest, as depicted in Fig. 1. Several methods can be employed when multi-sensor fusion is required, such as deciding, guiding, averaging, Bayesian statistics, and integration [15].

[Fig. 1. General model of multi-sensor data fusion: sensors S1, S2, ..., Sm produce measurements X1, X2, ..., Xm of targets T1, T2, ..., Tn, which are combined by a fusion technique (Bayesian technique, Dempster-Shafer, fuzzy logic, AND/OR, intersection, averaging).]

In general, the model of sensor fusion consists of a set of sensors S = {s1, s2, s3, ..., sm}, where each sensor's output is represented by an array of likelihood values, that is, conditional probabilities. This array represents the environment of interest, which can be modeled by a single state T taking a finite set of target types, T = {t1, t2, t3, ..., tn}. A single sensor S1 observes the targets and returns one value for each type t_i. These values form the likelihood probability array P(s1 | t_i), illustrated in Table I. When more than one sensor is employed in the system, a sensor fusion method must be applied to integrate the various observations in order to assess each target type being detected. If another sensor S2 is introduced to detect or observe the same target types t1, t2, t3, ..., tn, a second likelihood array is generated, as in Table I. A likelihood vector is generated for every sensor in the system.

TABLE I. LIKELIHOOD VECTORS FROM SENSOR 1 (S1) AND SENSOR 2 (S2)

Target type:  t1         t2         t3         ...  tn
Sensor S1:    P(s1|t1)   P(s1|t2)   P(s1|t3)   ...  P(s1|tn)
Sensor S2:    P(s2|t1)   P(s2|t2)   P(s2|t3)   ...  P(s2|tn)

6.1.1 Bayes' Theorem

Bayes' rule is based on joint and conditional probability [16].

P(S | T_i): the likelihood of the observation S given the state of nature T_i; that is, the conditional probability that observation S will be made.

P(T_i | S): the posterior probability of T_i given the observation S, computed from the original prior information and the information gained by observation.

The Bayesian posterior probability represents our confidence in the hypothesis, based on the prior data and the current observation values, and enables the user to make a decision suited to the application type.
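As a minimal sketch of how the likelihood vectors of Table I feed Bayes' rule, the following Python fragment multiplies the per-sensor likelihoods as in (2), assumes the uniform priors of (3), and normalizes to obtain the posterior of (1). The likelihood values are invented for illustration.

```python
# Bayesian fusion of two likelihood vectors over three hypothetical
# target types. All numeric values are illustrative assumptions only.

targets = ['t1', 't2', 't3']
like_s1 = {'t1': 0.6, 't2': 0.3, 't3': 0.1}  # P(s1 | t_i), sensor 1
like_s2 = {'t1': 0.5, 't2': 0.4, 't3': 0.1}  # P(s2 | t_i), sensor 2

# Eq. (3): with no better information, assign uniform priors P(t_i) = 1/n.
n = len(targets)
prior = {t: 1.0 / n for t in targets}

# Eq. (2): joint likelihood, assuming conditionally independent sensors.
joint = {t: like_s1[t] * like_s2[t] for t in targets}

# Eq. (1): Bayes' rule, normalizing over all target types.
evidence = sum(joint[t] * prior[t] for t in targets)
posterior = {t: joint[t] * prior[t] / evidence for t in targets}

assert abs(sum(posterior.values()) - 1.0) < 1e-9
```

With uniform priors the posterior simply renormalizes the joint likelihoods, so the target favored by both sensors dominates the hypothesis.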
Table II presents the joint likelihood probability values (JLV) and the posterior probability values. The JLV for a target type t_i is generated from S1 and S2 as P(s1, s2 | t_i), based on (2):

P(S | t_i) = P(s1 | t_i) · P(s2 | t_i)    (2)

Then, by applying Bayes' theorem, the hypothesis value for each target type is computed as the posterior probability, using (1):

P(T_i | S) = P(S | T_i) P(T_i) / Σ_i P(S | T_i) P(T_i)    (1)

where P(T_i) is the prior belief about the value of T_i.

On the other hand, Bayes' theorem suffers from a few disadvantages. One disadvantage is the difficulty of acquiring accurate a priori probability values P(T_i). In many cases all the a priori probabilities are assigned equal values, which affects the precision of the posterior probability; the assigned value differs from case to case, depending on the number of targets n, as in (3):

P(t1) = P(t2) = P(t3) = ... = P(tn) = 1/n    (3)

TABLE II. GENERATING THE CONDITIONAL AND POSTERIOR PROBABILITY VALUES

JLV P(s|t_i)                      Posterior P(t_i|s)
P(s|t1) = P(s1|t1)·P(s2|t1)       P(t1|s) ∝ P(s|t1)·P(t1)
P(s|t2) = P(s1|t2)·P(s2|t2)       P(t2|s) ∝ P(s|t2)·P(t2)
P(s|t3) = P(s1|t3)·P(s2|t3)       P(t3|s) ∝ P(s|t3)·P(t3)
...                               ...
P(s|tn) = P(s1|tn)·P(s2|tn)       P(tn|s) ∝ P(s|tn)·P(tn)

6.2 Multi-Sensor Data Fusion Using the Theory of Evidence

This theory was developed by Dempster and Shafer [17][18] as a mathematical method that generalizes Bayesian theory and allows the representation of ignorance. Bayesian theory assigns each piece of evidence to only one possible event, while Dempster-Shafer associates evidence with multiple possible events. The Dempster-Shafer theory assumes that the states for which we have probabilities are independent with respect to our subjective probability judgments. Applying this method to a specific problem normally involves two steps [18]. The first step is to resolve the uncertainties into a priori independent items of evidence. The second is a computation using the Dempster-Shafer method, as illustrated in Table III.

Three functions in Dempster-Shafer theory must be noted due to their importance [16]:

1) The Basic Probability Assignment function, or mass function (bpa or m): this is the fundamental function of Dempster-Shafer theory. It is represented by a mass function m, defined as in (4):

m: P(x) → [0, 1],  m(∅) = 0,  Σ_{A ∈ P(x)} m(A) = 1    (4)

where P(x) is the power set of the frame x, ∅ is the null set, and A is a set in the power set (A ∈ P(x)) [19].

2) The Belief function (Bel): represents the lower boundary of the bpa interval. The belief for a set A is defined in (5):

Bel(A) = Σ_{B ⊆ A} m(B)    (5)

where B is a subset of A.

3) The Plausibility function (Pl): the upper boundary of the bpa interval. The plausibility for a set A is defined in (6):

Pl(A) = Σ_{B ∩ A ≠ ∅} m(B)    (6)

Since all assignments must add up to one, the two measures can be derived from each other, as in (7):

Pl(A) = 1 − Bel(Ā)    (7)

where Ā is the classical complement of A.

6.2.1 Dempster-Shafer Method of Fusion

Since the Dempster-Shafer method of integrating two events is a conjunctive relation (AND), the integration of two masses m1 and m2 is performed as in (8) [20]:

m12(A) = [ Σ_{B ∩ C = A} m1(B) · m2(C) ] / (1 − K),  for A ≠ ∅;  m12(∅) = 0,
K = Σ_{B ∩ C = ∅} m1(B) · m2(C)    (8)

where K is the degree of conflict between the pieces of evidence, and (1 − K) is used as a normalization factor.

TABLE III.
INTEGRATION OF SENSOR 1 (S1) AND SENSOR 2 (S2) USING DEMPSTER-SHAFER

                Sensor 1 (S1):
Sensor 2 (S2)   m1(A1)          m1(A2)          ...   m1(An)
m2(B1)          m1(A1)·m2(B1)   m1(A2)·m2(B1)   ...   m1(An)·m2(B1)
m2(B2)          m1(A1)·m2(B2)   m1(A2)·m2(B2)   ...   m1(An)·m2(B2)
m2(B3)          m1(A1)·m2(B3)   m1(A2)·m2(B3)   ...   m1(An)·m2(B3)
...             ...             ...             ...   ...
m2(Bm)          m1(A1)·m2(Bm)   m1(A2)·m2(Bm)   ...   m1(An)·m2(Bm)

Each cell mass m1(Ai)·m2(Bj) is assigned to the intersection Ai ∩ Bj (for example, An ∩ Bm receives m1(An)·m2(Bm)), or contributes to the conflict K when the intersection is empty; the masses of each sensor sum to one: m1(A1) + m1(A2) + ... + m1(An) = 1.

Dempster-Shafer theory is considered a straightforward method for expressing pieces of evidence at different levels of abstraction, and it can combine pieces of evidence, a capability that is not always available in other methods of fusion.

7 Conclusion

This paper presented a review of multi-sensor fusion and the requirements for building a data-fusion system. Two fusion techniques were reviewed: the Bayes and Dempster-Shafer theories. Bayes theory does not account for incomplete information: the sum of a probability and its complement in a sample space is one. Dempster-Shafer accounts for ignorance, or incomplete information, so the sum of the probability, its complement, and the ignorance is equal to one. Both methods represent the hypothesis with a single value (posterior or support); Dempster-Shafer additionally introduces an upper boundary called plausibility, which gives more room to the decision-making process. However, both theories require prior knowledge of the state: Bayes theory requires a priori probabilities, and Dempster-Shafer requires a mass function. Depending on the prior information available, one method has an advantage over the other.

One of the major problems with both techniques is that both assume independence between events in their computations, and this assumption introduces uncertainty into the hypothesis, sometimes adding to or taking away part of the value associated with it. In our ongoing research we have shown that when the dependency between two sensors' events (outputs) is quantified, this uncertainty is accounted for, so that the assumption of independence no longer controls the performance of the system.

8 References

[1] W. Elmenreich, "An introduction to sensor fusion," Research Report 47/2001, Institut für Technische Informatik, Vienna University of Technology, Austria, 2002.
[2] P. Grossmann, "Multisensor data fusion," GEC Journal of Technology, 15:27-37, 1998.
[3] D. L. Hall and S. H. McMullen, "Mathematical Techniques in Multisensor Data Fusion," Artech House, London, 1992.
[4] J. R. Raol, G. Girija, and N. Shanthakumar, "Theory of data fusion and kinematic-level fusion," CRC Press, 2009.
[5] R. C. Luo and M. G. Kay, "Data Fusion in Robotics and Machine Intelligence," Academic Press, 1992.
[6] P. Lytrivis, G. Thomaidis, and A. Amditis, "Sensor data fusion in automotive applications," in Sensor and Data Fusion, N. Milisavljevic (Ed.), InTech, ISBN 978-3-902613-52-3, DOI: 10.5772/6574, 2009.
[7] A. G. O. Mutambara, "Decentralized Estimation and Control for Multisensor Systems," CRC Press, Florida, USA, 1998.
[8] J. Llinas and D. L. Hall, "An introduction to multi-sensor data fusion," Proc. IEEE International Symposium on Circuits and Systems, 6:537-540, 1998.
[9] C. J. Harris, A. Bailey, and T. J. Dodd, "Multi-sensor data fusion in defence and aerospace," The Aeronautical Journal, 102(1015):229-244.
[10] A. N. Shulsky, "Silent Warfare: Understanding the World of Intelligence," Brassey's, New York.
[11] J. R. Boyd, "A Discourse on Winning and Losing," 1987.
[12] M. Bedworth and J. O'Brien, "The Omnibus model: a new model for data fusion?," 2nd International Conference on Information Fusion (FUSION 99), 1999.
[13] S. C. Thomopoulos, "Sensor integration and data fusion," Proc. SPIE 1198, Sensor Fusion II: Human and Machine Strategies, 1989.
[14] H. Durrant-Whyte, "Multi Sensor Data Fusion," Australian Centre for Field Robotics, 2006.
[15] J. K. Hackett and M. Shah, "Multi-sensor fusion: a perspective," Proc. 1990 IEEE International Conference on Robotics and Automation, vol. 2, pp. 1324-1330, May 1990.
[16] D. Koks and S. Challa, "An Introduction to Bayesian and Dempster-Shafer Data Fusion," Sadhana 29(2):145-176, 2004.
[17] G. Shafer, "A Mathematical Theory of Evidence," Princeton University Press, 1976.
[18] K. Sentz and S. Ferson, "Combination of Evidence in Dempster-Shafer Theory," Sandia National Laboratories, 2002.
[19] G. J. Klir and M. J. Wierman, "Uncertainty-Based Information: Elements of Generalized Information Theory," Physica-Verlag, Heidelberg, 1998.
[20] G. M. Provan, "The validity of Dempster-Shafer belief functions," International Journal of Approximate Reasoning, 6:389-399, 1992.