Nonlinear Bayesian Estimation with Convex Sets of Probability Densities

Benjamin Noack, Vesa Klumpp, Dietrich Brunn, and Uwe D. Hanebeck
Intelligent Sensor-Actuator-Systems Laboratory
Institute of Computer Science and Engineering
Universität Karlsruhe (TH), Germany

Abstract: This paper presents a theoretical framework for Bayesian estimation in the case of imprecisely known probability density functions. The lack of knowledge about the true density functions is represented by sets of densities. A formal Bayesian estimator for these sets is introduced, which is intractable for infinite sets. To obtain a tractable filter, properties of convex sets in the form of convex polytopes of densities are investigated. It is shown that pathwise connected sets and their convex hulls describe the same ignorance. Thus, an exact algorithm is derived, which only needs to process the hull, delivering tractable results in the case of a proper parametrization. Since the estimator delivers a convex hull of densities as output, the theoretical grounds are laid for deriving efficient Bayesian estimators for sets of densities. The derived filter is illustrated by means of an example.

Keywords: Nonlinear Bayesian estimation, convex set, convex polytope

I. INTRODUCTION

In many technical applications, unknown quantities are incorporated and have to be dealt with. Typical examples include localization problems, map building, reconstruction of quantities from noisy measurements, etc. A main problem is to determine estimates of inaccessible states from measurements corrupted by noise. In general, Bayesian inference models are used for combining data to reach conclusions and insight. This implies that the probability distributions used herein are assumed to be optimal. The appropriateness of stochastic mechanisms thus has to be questioned when probabilities are not well known. A proposal to overcome this issue is to process sets of possible probabilities, which are the focus of this paper.
After proposing a generic Bayesian estimator for sets of densities, convex sets parametrized as convex polytopes are investigated. It is shown that a pathwise connected set and its convex hull describe the same ignorance. Thus, restricting our investigation to convex sets is sufficient. An exact Bayesian estimator, which only processes the convex hull, is derived and illustrated by means of an example.

First of all, the existing approaches are discussed: For stochastic uncertainties, a multitude of approaches exist. For linear Gaussian systems, the famous Kalman filter [1] can be utilized. For the case of non-Gaussian noise, typically appearing in nonlinear systems, closed-form solutions do not exist in general. Additionally, since recursive processing is necessary for many technical systems, the complexity of the density representation increases continually. Only in special cases is the complexity bounded [2]. So, for almost all real-world applications, approximations are inevitable. A common approach is to linearize the system as in the extended Kalman filter [3] or to assume the joint densities to be Gaussian as in the unscented Kalman filter [4]. Widely used are particle filters, where densities are represented by random or pseudo-random samples [5]. Though they shine through their algorithmic simplicity, they do possess some problems in terms of convergence [6], [7]. Another approach is to represent the probability densities by function series: [8] uses Gram-Charlier expansions, [9] derives a filter based on generalized Edgeworth series for continuous-time systems with additive Gaussian noise, and [10], [11] use Fourier expansions. Furthermore, Gaussian mixtures [12], Dirac mixtures [13], or combinations of them [14] are also utilized.

Figure 1. Convex sets of probabilities in the theory of interval probabilities, robust Bayesian analysis, and coherent previsions.
Convex sets of probabilities found attention in the literature as well. [15] investigates interval probabilities. This concept pursues the goal of generalizing classical probability theory in order to achieve an extensible description of uncertainty. The key idea is to assign intervals to random events instead of single values. Convex sets have a significant role in this theory: The largest set of distributions yielding a certain interval of probabilities is convex. Such a set is called the structure of an interval probability assessment. In the theory of robust Bayesian analysis [16], [17], convex sets are used to assess global robustness. Neighbourhood

classes of functions are often employed to study sensitivity to prior distributions. At this, the range of deviations is computed while varying the prior over a class of functions. Such sets are commonly convex. Lower and upper probabilities are defined as a special case of lower and upper previsions in the subjectivistic and behavioristic theory of imprecise probabilities [18]. These probabilities generally correspond to convex polyhedra with parallel faces. As illustrated by Figure 1, convex sets are to be regarded as an underlying concept for a variety of theories which aspire to extend classical probability theory. Against the background of decision theory, [19] provides an overview and a justification of using convex sets of probabilities. In this context, convex sets are called credal sets. Considering linear loss or utility functions, an arbitrary set and its convex hull effect the same pattern of preferences. So the main reason for using credal sets is that the largest set containing all probability distributions which induce a specific decision is convex. Transferring the concepts of conditional probability and independence to the theory of probability sets turned out to be a difficult task. We overcome this issue by identifying all sets inducing the same probability assessment. For our investigations, we will follow Levi's epistemology as suggested in [20], [21]. According to this, stochastic models are intended to express lack of precision. Such a model produces a single estimate, which is considered to be the best one. Due to inappropriate assumptions, incomplete prior knowledge, or unknown quantities, there might be more estimates to be taken into account. In this regard, set-valued stochastic models can cope with the incapability to specify appropriate probabilities. In that case, a random variable will be characterized by a set of distributions, each of which is a possible and permissible probabilistic description of that variable.
Sets of probability distributions represent a state of ignorance, since we are not capable of making a decision among those distributions. This means that the possible distributions compete for being the best estimate. Adding all convex combinations to the set, i.e., taking the convex hull, is one way to effect a compromise. This accounts for the strong tendency towards convex sets. Existing approaches such as the set-valued Kalman filter [21] or projection-based approaches [22] model only the initial state as a convex set. In contrast, this work provides a theoretical framework for modeling system and measurement noise as convex sets as well. In this article, we consider systems with a continuous state space and investigate convex sets of probability density functions. The sets of the corresponding cumulative distributions are also convex because of the linearity of integration. The following section reviews the discrete-time Bayesian estimator and explains the concept of processing sets of densities.

Figure 2. Structure of a discrete-time system and a Bayesian estimator for sets of densities.

The remainder of this paper is structured as follows: In Section III, convex polytopes of densities are defined. It is shown that intervals of probabilities and expectation values can easily be computed by using the vertices of the polytopes. Section IV derives a Bayesian estimator for convex polytopes. This estimator processes only the vertices of the polytopes and thereby generates the vertices of a new convex polytope. This resulting polytope is the convex hull of the exact set which would arise from element-wise processing. The key result is that the exact set and its covering convex polytope yield the same probability intervals, even after multiple processing steps. An example application is investigated in Section V.
Here, the measurement uncertainties of a radar altimeter are given by a convex polytope. This work is concluded in Section VI with an outlook to future investigations.

II. PROBLEM FORMULATION

In this paper, we focus on probabilistic dynamic discrete-time nonlinear systems, which can be described by

$$x_{k+1} = a_k(x_k, u_k, w_k), \quad (1)$$
$$y_k = h_k(x_k, v_k), \quad (2)$$

where the underlining denotes vectors and the lowercase boldface letters random variables. In the system equation (1), the time update is performed: The system function $a_k(\cdot,\cdot,\cdot)$ projects the $n$-dimensional inaccessible state, characterized by the random vector $\mathbf{x}_k \in \Omega_x \subseteq \mathbb{R}^n$, with respect to the input $u_k \in \Omega_u \subseteq \mathbb{R}^r$ and the system noise $\mathbf{w}_k \in \Omega_w \subseteq \mathbb{R}^p$ with density $f^w_k(w_k)$, onto the state $\mathbf{x}_{k+1}$ at the discrete time $k+1$. The measurement equation (2) models the outcome of a measurement $\mathbf{y}_k \in \Omega_y \subseteq \mathbb{R}^m$ at time $k$ with respect to the state $\mathbf{x}_k$ and the measurement noise $\mathbf{v}_k \in \Omega_v \subseteq \mathbb{R}^q$ with density $f^v_k(v_k)$, utilizing the measurement function $h_k(\cdot,\cdot)$. The real system is a realization of the probabilistic system (1) and (2), as depicted in Figure 2. The realizations of random vectors are denoted by a hat. Assuming the probability densities of the noise $f^w_k(w_k)$, $f^v_k(v_k)$, and the initial density $f_0(x_0)$ as being stochastically independent, an estimate of the state $\mathbf{x}_k$ at time $k$ can be determined with the Bayesian estimator, which consists of two steps: The filtering step determines an improved estimate $f^e_k(x_k)$ by incorporating a measurement $\hat y_k$ into a prior estimate

$f^p_k(x_k)$ by evaluating

$$f^e_k(x_k) = \frac{1}{c_k}\, f^L_k(x_k)\, f^p_k(x_k), \quad (3)$$

with

$$f^L_k(x_k) := f(\hat y_k \mid x_k) = \int_{\Omega_v} \delta^q\big(\hat y_k - h_k(x_k, v_k)\big)\, f^v_k(v_k)\, \mathrm{d}v_k, \qquad c_k = \int_{\Omega_x} f^L_k(x_k)\, f^p_k(x_k)\, \mathrm{d}x_k.$$

The conditional density $f^L_k(x_k) = f(\hat y_k \mid x_k)$ is called likelihood, $c_k$ is a normalization constant, and $\delta^q(\cdot)$ is the $q$-dimensional Dirac delta function. In the prediction step, the state estimate is extrapolated from $f^e_k(x_k)$ at discrete time $k$ to time $k+1$ by determining

$$f^p_{k+1}(x_{k+1}) = \int_{\Omega_x} f^T_k(x_{k+1} \mid x_k)\, f^e_k(x_k)\, \mathrm{d}x_k, \quad (4)$$

with

$$f^T_k(x_{k+1} \mid x_k) = \int_{\Omega_w} \delta^p\big(x_{k+1} - a_k(x_k, u_k, w_k)\big)\, f^w_k(w_k)\, \mathrm{d}w_k.$$

The conditional density $f^T_k(x_{k+1} \mid x_k)$ is also called transition density. Since $u_k$ is assumed to be known, it is not explicitly denoted in $f^T_k(x_{k+1} \mid x_k)$. The prior density is normally the result of a previous prediction step, i.e., $f^p_k(x_k)$ is obtained from $f^e_{k-1}(x_{k-1})$ by means of (4).

Now we turn to investigating sets of densities: As discussed in the introduction, such sets can evolve from imprecise prior knowledge, unknown measurements, or unknown system parameters. Since this work focuses on theoretical aspects, it is not investigated how proper sets of densities can be obtained for practical applications. Here, we consider having a set of system functions $\mathcal{A}_k$ and a set of measurement functions $\mathcal{H}_k$ at time $k$. The set of transition densities is then given by

$$\mathcal{M}^T_k := \left\{ f^T_k \,\middle|\, f^T_k(x_{k+1} \mid x_k) = \int_{\Omega_w} \delta^p\big(x_{k+1} - a_k(x_k, u_k, w_k)\big)\, f^w_k(w_k)\, \mathrm{d}w_k,\ x_k, x_{k+1} \in \Omega_x,\ a_k \in \mathcal{A}_k \right\},$$

and the set of likelihoods is given by

$$\mathcal{M}^L_k := \left\{ f^L_k \,\middle|\, f^L_k(x_k) = \int_{\Omega_v} \delta^q\big(\hat y_k - h_k(x_k, v_k)\big)\, f^v_k(v_k)\, \mathrm{d}v_k,\ x_k \in \Omega_x,\ h_k \in \mathcal{H}_k \right\}.$$

The Bayesian estimator for sets of densities is defined as the element-wise processing of the sets. The filtering step can be written as

$$\mathcal{M}^e_k = \left\{ f^e_k \,\middle|\, f^e_k(x_k) = \frac{f^L_k(x_k)\, f^p_k(x_k)}{\int_{\Omega_x} f^L_k(x)\, f^p_k(x)\, \mathrm{d}x},\ f^L_k \in \mathcal{M}^L_k,\ f^p_k \in \mathcal{M}^p_k \right\},$$

with $\mathcal{M}^p_k$ being the set of priors. The prediction step results in

$$\mathcal{M}^p_{k+1} = \left\{ f^p_{k+1} \,\middle|\, f^p_{k+1}(x_{k+1}) = \int_{\Omega_x} f^T_k(x_{k+1} \mid x_k)\, f^e_k(x_k)\, \mathrm{d}x_k,\ x_{k+1} \in \Omega_x,\ f^T_k \in \mathcal{M}^T_k,\ f^e_k \in \mathcal{M}^e_k \right\}.$$

This formulation is more of formal importance, because for infinite sets the latter expressions are intractable. Therefore, a parametrization based on convex polytopes is investigated in the following sections.

III.
CONVEX SETS OF PROBABILITY DENSITIES

As a special case of convex sets, we consider convex polytopes of probability densities, which can easily be obtained by taking the convex hull of a finite number of density functions. The reason for choosing convex polytopes is clarity and mathematical brevity. Since the number of density functions may be arbitrarily large, any convex set can be approximated with an arbitrarily small error. A generalization to general convex sets is a laborious task but does not promise additional insights. A convex polytope is defined by

$$\mathcal{P} := \left\{ \sum_{i=1}^{n} \alpha_i f_i \,\middle|\, \sum_{i=1}^{n} \alpha_i = 1,\ \alpha_i \ge 0 \right\}, \quad (5)$$

where $f_1, \dots, f_n$ are arbitrary probability density functions. For every $f = \sum_i \alpha_i f_i \in \mathcal{P}$, the probability of an event $A$ can be written as the convex combination

$$P(A) = \int_A f(x)\, \mathrm{d}x = \sum_{i=1}^{n} \alpha_i \int_A f_i(x)\, \mathrm{d}x,$$

where the integrals in the latter expression are the probabilities computed from the vertices $f_1, \dots, f_n$. Hence, $\mathcal{P}$ yields a probability interval $I_{\mathcal{P}}(A) := [\underline P(A), \overline P(A)]$ for every event $A$, where

$$\underline P(A) = \min_{i=1,\dots,n} \int_A f_i(x)\, \mathrm{d}x \quad \text{and} \quad \overline P(A) = \max_{i=1,\dots,n} \int_A f_i(x)\, \mathrm{d}x.$$

The probability interval of the complementary event $A^c$ is given by $[1 - \overline P(A),\ 1 - \underline P(A)]$. Interestingly, a pathwise connected set $\mathcal{M}$ which contains $f_1, \dots, f_n$ and which has the convex hull $\mathcal{P}$, as depicted in Figure 3, provides the same intervals of probabilities as $\mathcal{P}$. This relationship is proven in the following theorem. Note that the operator $\operatorname{conv} \mathcal{M}$ denotes the convex hull of $\mathcal{M}$.

Theorem 1: Consider a polytope $\mathcal{P}$ as defined in equation (5) and a pathwise connected set $\mathcal{M}$ with the following properties: $f_1, \dots, f_n \in \mathcal{M}$ and $\mathcal{P} = \operatorname{conv} \mathcal{M}$.

Figure 3. The set $\mathcal{M}$ contains the vertices $f_1, \dots, f_n$ of $\mathcal{P}$ and is pathwise connected.

Then, for an event $A$, the set of probabilities

$$I_{\mathcal{M}}(A) := \left\{ P(A) \,\middle|\, P(A) = \int_A f(x)\, \mathrm{d}x,\ f \in \mathcal{M} \right\}$$

is an interval identical to $I_{\mathcal{P}}(A)$.

PROOF. Since $\mathcal{M} \subseteq \mathcal{P}$, the set of probabilities $I_{\mathcal{M}}(A)$ is a subset of $I_{\mathcal{P}}(A)$. Let $f_i$ and $f_j$ define the endpoints of the probability interval $I_{\mathcal{P}}(A)$. Since $\mathcal{M}$ is pathwise connected, there exists a continuous path $P_{i,j} : [0,1] \to \mathcal{M}$ with $P_{i,j}(0) = f_i$ and $P_{i,j}(1) = f_j$. Due to the linearity of the integral, the set

$$I_{i,j}(A) = \left\{ P(A) \,\middle|\, P(A) = \int_A f(x)\, \mathrm{d}x,\ f \in P_{i,j}([0,1]) \right\}$$

is pathwise connected as well and has the same endpoints as $I_{\mathcal{P}}(A)$. Thus, we have $I_{\mathcal{P}}(A) = I_{i,j}(A) \subseteq I_{\mathcal{M}}(A)$, and the proof is complete.

This theorem is a strong argument for the utilization of convex sets: It does not make any difference whether a pathwise connected set or its convex hull is considered. The probability intervals of any event will not differ. Analogously, we can derive an interval for the expectation values induced by $\mathcal{P}$, i.e.,

$$\left[ \min_{i=1,\dots,n} \mathrm{E}_{f_i}\{x\},\ \max_{i=1,\dots,n} \mathrm{E}_{f_i}\{x\} \right].$$

In a similar manner, the preceding observations can be adopted for every linear functional, since sets of convex combinations of scalars result in intervals. In contrast, specifying intervals for the variances cannot be achieved by determining the minimal and maximal variance of the vertex densities. A lengthy, but straightforward, calculation yields

$$\mathrm{V}_f\{x\} = \sum_{i=1}^{n} \alpha_i \mathrm{V}_{f_i}\{x\} + \sum_{i=1}^{n} \sum_{j=i+1}^{n} \alpha_i \alpha_j \left[ \mathrm{E}_{f_i}\{x\} - \mathrm{E}_{f_j}\{x\} \right]^2$$

for $f = \sum_i \alpha_i f_i \in \mathcal{P}$. So the variances of $\mathcal{P}$ depend both on the variances of the vertices $f_1, \dots, f_n$ and the squared distances of their expected values.

IV. BAYESIAN ESTIMATION WITH CONVEX SETS

The previously explained Bayesian estimator for sets of densities performs filter and prediction steps element-wise. Consequently, an appropriate representation of infinite sets is inevitable in order to realize the filter and prediction step with bounded complexity.
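As a numerical illustration of the quantities derived in Section III, the following sketch discretizes two vertex densities on a grid and computes the probability interval for an event, the expectation interval, and checks the mixture variance identity. All densities, weights, and the event are our own illustrative choices, not taken from the paper.

```python
import numpy as np

# Illustrative polytope P = conv{f_1, f_2} of 1-D densities on a fine grid.
x = np.linspace(-10.0, 10.0, 4001)
dx = x[1] - x[0]

def gaussian(x, mean, var):
    return np.exp(-0.5 * (x - mean) ** 2 / var) / np.sqrt(2.0 * np.pi * var)

vertices = np.array([gaussian(x, -1.0, 0.25), gaussian(x, 1.0, 1.0)])

# Probability interval I_P(A) for the event A = [0, inf):
# minimum and maximum over the vertex probabilities.
mask = x >= 0.0
probs = vertices[:, mask].sum(axis=1) * dx
interval = (probs.min(), probs.max())

# Expectation interval [min_i E_{f_i}, max_i E_{f_i}].
means = (vertices * x).sum(axis=1) * dx
e_interval = (means.min(), means.max())

# Variance of a mixture f = a1 f_1 + a2 f_2 via the closed-form identity:
# V_f = sum_i a_i V_{f_i} + sum_{i<j} a_i a_j (E_{f_i} - E_{f_j})^2.
alpha = np.array([0.3, 0.7])
f = alpha @ vertices
mean_f = (f * x).sum() * dx
var_direct = ((x - mean_f) ** 2 * f).sum() * dx
vertex_vars = ((x - means[:, None]) ** 2 * vertices).sum(axis=1) * dx
var_identity = alpha @ vertex_vars + alpha[0] * alpha[1] * (means[0] - means[1]) ** 2
assert abs(var_direct - var_identity) < 1e-8
```

The final assertion illustrates why the variance interval cannot be obtained from vertex variances alone: the cross term with the squared mean distance contributes as well.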
We have to ascertain that we are able to retain that representation after filtering and prediction for further processing steps. In this section, we present a Bayesian state estimator for convex polytopes of density functions. We assume that not only the initial state but also measurement and system noise are modelled as convex polytopes.

A. Filtering with Convex Polytopes

First, we consider the filter step. Let $\mathcal{P}^p := \operatorname{conv}\{f^p_1, \dots, f^p_m\}$ be the convex polytope of prior or predicted probability density functions and $\mathcal{P}^L := \operatorname{conv}\{f^L_1, \dots, f^L_n\}$ be the convex polytope of likelihoods. For the sake of clarity, we omit the time index $k$. Here, the subscripts denote the different vertices of a polytope. The element-wise application of the filter step (3) yields the exact set

$$\mathcal{M}^e := \left\{ \frac{f^p f^L}{\int_{\Omega_x} f^p(\nu)\, f^L(\nu)\, \mathrm{d}\nu} \,\middle|\, f^p \in \mathcal{P}^p,\ f^L \in \mathcal{P}^L \right\}.$$

The following theorem reveals a way of calculating the convex hull of $\mathcal{M}^e$. Let $V(\mathcal{P}^p)$ and $V(\mathcal{P}^L)$ denote the vertices of $\mathcal{P}^p$ and $\mathcal{P}^L$, respectively. We have $V(\mathcal{P}^p) = \{f^p_1, \dots, f^p_m\}$ and $V(\mathcal{P}^L) = \{f^L_1, \dots, f^L_n\}$.

Theorem 2 (Filtering Step for Convex Polytopes): The vertices of the smallest convex polytope covering $\mathcal{M}^e$ are obtained by element-wise filtering of $V(\mathcal{P}^p)$ with $V(\mathcal{P}^L)$. According to this, we define $\mathcal{P}^e$ by

$$\mathcal{P}^e := \operatorname{conv} \left\{ \frac{f^p_i f^L_j}{\int_{\Omega_x} f^p_i(\nu)\, f^L_j(\nu)\, \mathrm{d}\nu} \,\middle|\, f^p_i \in V(\mathcal{P}^p),\ f^L_j \in V(\mathcal{P}^L) \right\}.$$

Then we have $\mathcal{P}^e = \operatorname{conv} \mathcal{M}^e$.

PROOF. Suppose that $f^p$ is an element of $\mathcal{P}^p$ and $f^L$ lies in $\mathcal{P}^L$. Then there are weights $\alpha_1, \dots, \alpha_m$ and $\beta_1, \dots, \beta_n$ such that

$$f^p = \sum_{i=1}^{m} \alpha_i f^p_i \quad \text{and} \quad f^L = \sum_{j=1}^{n} \beta_j f^L_j,$$

where $f^p_i \in V(\mathcal{P}^p)$ and $f^L_j \in V(\mathcal{P}^L)$. For a shorter notation, we define

$$\Gamma_{i,j} := \int_{\Omega_x} f^p_i(\nu)\, f^L_j(\nu)\, \mathrm{d}\nu.$$

Applying (3) to $f^p$ and $f^L$ gives

$$f^e(x) = \frac{f^p(x)\, f^L(x)}{\int_{\Omega_x} f^p(\nu)\, f^L(\nu)\, \mathrm{d}\nu} = \frac{\sum_{i,j=1}^{m,n} \alpha_i \beta_j f^p_i(x)\, f^L_j(x)}{\sum_{k,l=1}^{m,n} \alpha_k \beta_l \Gamma_{k,l}} = \sum_{i,j=1}^{m,n} \left[ \frac{\alpha_i \beta_j \Gamma_{i,j}}{\sum_{k,l=1}^{m,n} \alpha_k \beta_l \Gamma_{k,l}} \right] \frac{f^p_i(x)\, f^L_j(x)}{\Gamma_{i,j}}.$$

The coefficients in the latter sum are non-negative and satisfy

$$\sum_{i,j=1}^{m,n} \frac{\alpha_i \beta_j \Gamma_{i,j}}{\sum_{k,l=1}^{m,n} \alpha_k \beta_l \Gamma_{k,l}} = 1.$$

Hence, $f^e$ is to be regarded as a convex combination of the density functions

$$\frac{f^p_i f^L_j}{\int_{\Omega_x} f^p_i(\nu)\, f^L_j(\nu)\, \mathrm{d}\nu},$$

which result from element-wise filtering of $V(\mathcal{P}^p)$ and $V(\mathcal{P}^L)$ and are themselves elements of $\mathcal{M}^e$. In particular, $f^e$ lies in the convex hull of these functions. This states that $\mathcal{M}^e \subseteq \mathcal{P}^e$, which finishes the proof.

If $\mathcal{P}^p$ or $\mathcal{P}^L$ consists of only one element, then convexity will be conserved, as stated in the next theorem.

Theorem 3: The exact set $\mathcal{M}^e$ is convex if $\mathcal{P}^p$ or $\mathcal{P}^L$ is a singleton set. In particular, the following equation applies: $\mathcal{P}^e = \mathcal{M}^e$.

PROOF. We will show $\mathcal{P}^e \subseteq \mathcal{M}^e$. Without loss of generality, we assume that $\mathcal{P}^L = \{f^L_1\}$. Let $f^e \in \mathcal{P}^e$ be arbitrary. Then this function is a convex combination

$$f^e = \sum_{i=1}^{m} \alpha_i \frac{f^p_i f^L_1}{\int_{\Omega_x} f^p_i(\nu)\, f^L_1(\nu)\, \mathrm{d}\nu}$$

of the vertices $V(\mathcal{P}^e)$, where $\sum_{i=1}^{m} \alpha_i = 1$, $\alpha_i \ge 0$. In order to prove that $f^e$ lies in $\mathcal{M}^e$, we have to determine $f^p = \sum_{i=1}^{m} \gamma_i f^p_i \in \mathcal{P}^p$ such that

$$f^e(x) = \sum_{i=1}^{m} \alpha_i \frac{f^p_i(x)\, f^L_1(x)}{\int_{\Omega_x} f^p_i(\nu)\, f^L_1(\nu)\, \mathrm{d}\nu} = \frac{f^p(x)\, f^L_1(x)}{\int_{\Omega_x} f^p(\nu)\, f^L_1(\nu)\, \mathrm{d}\nu} = \frac{\left( \sum_{i=1}^{m} \gamma_i f^p_i(x) \right) f^L_1(x)}{\int_{\Omega_x} \left( \sum_{k=1}^{m} \gamma_k f^p_k(\nu) \right) f^L_1(\nu)\, \mathrm{d}\nu} \in \mathcal{M}^e \quad (6)$$

holds. A comparison of coefficients gives

$$\alpha_i = \frac{\gamma_i \Gamma_i}{\sum_{k=1}^{m} \gamma_k \Gamma_k}. \quad (7)$$

As before, $\Gamma_i$ denotes the integral $\int_{\Omega_x} f^p_i(\nu)\, f^L_1(\nu)\, \mathrm{d}\nu$. We require that the weights $\gamma_i$ sum to one, and therefore we have

$$1 = \sum_{i=1}^{m} \gamma_i = \sum_{i=1}^{m} \frac{\alpha_i}{\Gamma_i} \left( \sum_{k=1}^{m} \gamma_k \Gamma_k \right).$$

Together with (7), we now obtain

$$\gamma_i = \frac{\alpha_i / \Gamma_i}{\sum_{k=1}^{m} \alpha_k / \Gamma_k}.$$

The weights $\gamma_i$ are obviously non-negative and thus result in the demanded function $f^p = \sum_{i=1}^{m} \gamma_i f^p_i$ for (6).

B.
Prediction with Convex Polytopes

In line with Theorem 2, a corresponding conclusion holds for the prediction step. We define the convex polytope of estimated densities by $\mathcal{P}^e = \operatorname{conv}\{f^e_1, \dots, f^e_m\}$ and accordingly the convex polytope of transition densities by $\mathcal{P}^T = \operatorname{conv}\{f^T_1, \dots, f^T_n\}$. As before, $\mathcal{M}^p$ denotes the resulting set after element-wise processing with (4).

Theorem 4 (Prediction Step for Convex Polytopes): The prediction step for sets of densities with the vertex sets $V(\mathcal{P}^e)$ and $V(\mathcal{P}^T)$ yields the vertices of $\mathcal{P}^p$, which is related to the exact set $\mathcal{M}^p$ by means of the equation

$$\mathcal{P}^p = \operatorname{conv} V(\mathcal{P}^p) = \operatorname{conv} \mathcal{M}^p.$$

PROOF. Every $f^e \in \mathcal{P}^e$ and every $f^T \in \mathcal{P}^T$ can be written as

$$f^e = \sum_{i=1}^{m} \alpha_i f^e_i \quad \text{and} \quad f^T = \sum_{j=1}^{n} \beta_j f^T_j,$$

respectively. Due to the linearity of integration, it follows directly that

$$f^p(x_{k+1}) = \int_{\Omega_x} f^T(x_{k+1} \mid x_k)\, f^e(x_k)\, \mathrm{d}x_k = \sum_{i,j=1}^{m,n} \alpha_i \beta_j \int_{\Omega_x} f^T_j(x_{k+1} \mid x_k)\, f^e_i(x_k)\, \mathrm{d}x_k, \quad (8)$$

where $\sum_{i,j=1}^{m,n} \alpha_i \beta_j = 1$ and the integrals $\int_{\Omega_x} f^T_j(x_{k+1} \mid x_k)\, f^e_i(x_k)\, \mathrm{d}x_k$ are the vertices of $\mathcal{P}^p$. Hence, the predicted density lies in $\mathcal{P}^p$, as anticipated.

In the following theorem, we again consider the situation of processing only one probability density function $f^e$ or $f^T$ instead of $\mathcal{P}^e$ or $\mathcal{P}^T$.

Theorem 5: If $\mathcal{P}^e$ or $\mathcal{P}^T$ is singleton, then $\mathcal{M}^p$ is convex and equal to $\mathcal{P}^p$.

PROOF. Considering (8), one can easily see that convexity will be preserved when the cardinality of $\mathcal{P}^e$ or $\mathcal{P}^T$ amounts to one.
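A minimal grid-based sketch of Theorems 2 and 4 follows. The discretization, densities, and noise model are our own illustrative assumptions (the paper works with exact densities): the filter and prediction steps are applied only to the vertex densities, producing the vertices of $\mathcal{P}^e$ and $\mathcal{P}^p$.

```python
import numpy as np

x = np.linspace(-10.0, 10.0, 1001)
dx = x[1] - x[0]

def gaussian(x, mean, var):
    return np.exp(-0.5 * (x - mean) ** 2 / var) / np.sqrt(2.0 * np.pi * var)

def filter_vertices(priors, likelihoods):
    """Theorem 2: pairwise normalized products f_i^p * f_j^L give V(P^e)."""
    out = []
    for fp in priors:
        for fl in likelihoods:
            prod = fp * fl
            out.append(prod / (prod.sum() * dx))
    return out

def predict_vertices(estimates, transitions):
    """Theorem 4: pairwise Chapman-Kolmogorov integrals give V(P^p).
    Each transition T is a matrix with T[i, j] = f^T(x' = x[i] | x = x[j])."""
    return [T @ fe * dx for fe in estimates for T in transitions]

# Singleton prior polytope and two likelihood vertices (hypothetical values).
priors = [gaussian(x, 0.0, 4.0)]
likelihoods = [gaussian(x, -1.0, 1.0), gaussian(x, 2.0, 1.0)]
posteriors = filter_vertices(priors, likelihoods)       # 1 * 2 = 2 vertices

# One transition vertex: random walk x' = x + w with w ~ N(0, 0.5).
T = gaussian(x[:, None] - x[None, :], 0.0, 0.5)
predictions = predict_vertices(posteriors, [T])         # 2 * 1 = 2 vertices
```

Since the prior and transition sets are singletons here, Theorems 3 and 5 guarantee that the resulting sets stay convex; with $m$ and $n$ vertices on both sides, the loops above would produce $m \cdot n$ vertex densities.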

Figure 4. Element-wise processing of the vertices $V(\mathcal{P}_1)$ and $V(\mathcal{P}_2)$ yields the vertices of $\mathcal{P}$, which is the convex hull of $\mathcal{M}$. The exact set $\mathcal{M}$ arises from element-wise processing of $\mathcal{P}_1$ and $\mathcal{P}_2$.

C. Discussion

Element-wise processing of two convex sets of density functions usually leads to a non-convex set $\mathcal{M}$. With Bayesian estimation, this occurs when combining sets of prior densities with sets of likelihoods or transition densities. It was shown that after each filter step or prediction step, the convex hull of the exact set can easily be found by restricting oneself to the vertices of the polytopes. Especially (8) points out that in general the exact set $\mathcal{M}$ is a proper subset of the corresponding polytope $\mathcal{P}$. This is due to the fact that this equation can be interpreted as a quadratic function of the weights. Considering the convex hull $\mathcal{P}$ instead of the non-convex resulting set $\mathcal{M}$ implies that density functions will be introduced after each processing step which are not part of the exact set, as illustrated in Figure 4. Fortunately, the use of the covering polytope for further processing is appropriate, since it leads to exactly the same probability assessment. This is due to the linearity of the integral, which yields the probability from the densities. One key statement of this paper is that after multiple recursive processing steps, the result of the proposed estimator is the convex hull of the exact set given by element-wise processing. This means that applying $k$ processing steps to $\mathcal{P}$ and taking the convex hull afterwards results in the same set as processing the vertices $V(\mathcal{P})$ and taking the convex hull of them at all time steps until $k$. This is mainly due to the fact that the resulting vertices are images of the vertices of $\mathcal{P}$. Other approaches working on sets of densities commonly just use sets of prior probabilities, whereas likelihoods and transition densities are fixed. In this particular case, convexity will be preserved due to Theorems 3 and 5.
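The key statement above can be checked numerically. The sketch below (with our own illustrative densities and event) filters randomly chosen interior members of a prior and a likelihood polytope element-wise and verifies that every resulting probability lies inside the interval spanned by the vertex-only posteriors, i.e., that the covering polytope yields the same probability assessment as the exact set.

```python
import numpy as np

rng = np.random.default_rng(1)
x = np.linspace(-10.0, 10.0, 1001)
dx = x[1] - x[0]

def gaussian(x, mean, var):
    return np.exp(-0.5 * (x - mean) ** 2 / var) / np.sqrt(2.0 * np.pi * var)

def bayes(fp, fl):
    prod = fp * fl
    return prod / (prod.sum() * dx)

prior_v = [gaussian(x, -2.0, 1.0), gaussian(x, 2.0, 1.0)]   # V(P^p)
lik_v = [gaussian(x, 0.0, 1.0), gaussian(x, 1.0, 4.0)]      # V(P^L)

# Vertex-only processing: 2 * 2 = 4 vertices of the covering polytope P^e.
hull = [bayes(fp, fl) for fp in prior_v for fl in lik_v]
mask = x >= 0.0
hull_p = [f[mask].sum() * dx for f in hull]
lo, hi = min(hull_p), max(hull_p)

# Element-wise processing of random interior members of both polytopes:
# each resulting P(A) must fall inside the interval from the hull vertices.
for _ in range(200):
    a, b = rng.uniform(), rng.uniform()
    f = bayes(a * prior_v[0] + (1.0 - a) * prior_v[1],
              b * lik_v[0] + (1.0 - b) * lik_v[1])
    p = f[mask].sum() * dx
    assert lo - 1e-9 <= p <= hi + 1e-9
```

The interior posteriors generally do not lie in the exact set's hull vertices themselves, but, by Theorem 2 and the linearity of the probability integral, their probabilities never leave the vertex interval.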
In terms of convex polytopes, we have proposed a method of modelling system and measurement noise as sets as well. An important point to emphasize is the fact that the vertices of the resulting polytope are in general affinely independent. Then, the polytope is a simplex, which cannot be simplified to a polytope with fewer vertices. The example in the following section shows a special case where some of the vertices coincide after each processing step. In conclusion, this section shows that only the finite sets of vertices are required to implement a Bayesian estimator for convex polytopes, though it can be interpreted as element-wise processing of an infinite number of densities. Unfortunately, the number of vertices needed to characterize the convex set increases exponentially over the number of processing steps, depending on the number of vertices of the prior sets. Processing two convex sets with $n$ and $m$ vertices usually leads to a set whose convex polytope is characterized by $n \cdot m$ vertices.

V. EXAMPLE

To illustrate the idea of an estimator for sets of densities, a simple example is discussed in this section. [23] presents a problem where a radar altimeter is utilized for measuring the ground clearance of a plane. The measurements differ significantly when flying over trees in contrast to flying over clear ground. The likelihood

$$f^L = \pi\, \mathcal{N}(\hat x_1, \sigma_1^2) + (1 - \pi)\, \mathcal{N}(\hat x_2, \sigma_2^2) \quad (9)$$

is suggested, where $\mathcal{N}(\hat x, \sigma^2)$ denotes the probability density function of the normal distribution with standard deviation $\sigma$ and expected value $\hat x$. When flying over a tree-covered area, $\mathcal{N}(\hat x_1, \sigma_1^2)$ describes the likelihood of an echo over clear ground and $\mathcal{N}(\hat x_2, \sigma_2^2)$ the echo over tree tops. In contrast to [23], we assume the probability $\pi$ of being over clear ground to be uncertain in a sense of ignorance. For our example, we assume having three identical measurements, which should be fused.
The likelihood has the parameters $\hat x_1 = -1$, $\sigma_1^2 = 1/4$, $\hat x_2 = 1$, $\sigma_2^2 = 1$, and the parameter $\pi$ lies in the interval $[0.3, 0.8]$. According to (9), the set is defined as

$$\mathcal{P}^L = \left\{ f^L \,\middle|\, f^L(x) = \lambda \left( 0.3\, \mathcal{N}(-1, \tfrac14) + 0.7\, \mathcal{N}(1, 1) \right) + (1 - \lambda) \left( 0.8\, \mathcal{N}(-1, \tfrac14) + 0.2\, \mathcal{N}(1, 1) \right),\ \lambda \in [0, 1] \right\}.$$

In the first step, the likelihood set $\mathcal{P}^L$ is fused with a uniform distribution. The result is naturally the same set, which is depicted in Figure 5(a). The multiplication with itself results in a set with four vertices of four-component Gaussian mixtures, as plotted in Figure 5(b). In the plot, only three components can be seen, since two are identical. This is because of the symmetries within the chosen sets due to the simplicity of the problem. Figure 5(c) shows the second fusion step. Now the set of vertices consists of eight six-component Gaussian mixtures. Again, only four different densities can be distinguished, since some of the densities are identical. The

final step is depicted in Figure 5(d). The resulting polytope has five vertices. Though some of the vertex densities are projected onto each other, it can be seen that the use of convex polytopes yields the same key problem as standard stochastic nonlinear filters: The complexity of the density representation typically increases exponentially with each filtering step. Here, the number of vertices and the number of Gaussian components increase with each step. Coping with this increase of complexity is part of future work. Considering the intervals of the expectations, which are depicted as vertical lines, it can be seen that the width stays constant from the initial to the first step and decreases from the first to the second step. This implies that ignorance decreases at a different rate than the stochastic uncertainty, which typically decreases strictly monotonically and is not discussed here.

Figure 5. Four fusion steps with $\mathcal{P}^L$. The prior distribution is uniform. The number of vertices increases by one after each step, but in general the number increases exponentially.
(a) The initial set of likelihood densities. The two vertex densities of the polytope and nine densities in between are plotted. The polytope was fused with a uniform density, so the result after the first step is identical with the likelihood set. The interval of the expectation, depicted by the vertical dashed lines, is $[-0.6, 0.4]$.
(b) After the first fusion step. Now only the resulting three vertex densities are plotted, and the expectation interval, depicted by the vertical dashed lines, is $[-0.97, 0.06]$.
(c) After the second fusion step. Only the resulting four vertex densities are plotted, and the expectation interval, depicted by the vertical dashed lines, is $[-0.99, -0.25]$.
(d) After the third fusion step. Only the resulting five vertex densities are plotted, and the expectation interval, depicted by the vertical dashed lines, is $[-1.0, -0.5]$.
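The fusion steps of this example can be sketched on a grid. This is our own discretized reconstruction, using the stated means $\hat x_1 = -1$, $\hat x_2 = 1$ and $\pi \in \{0.3, 0.8\}$ for the likelihood vertices; the first expectation interval $[-0.6, 0.4]$ is independent of the variances, so only that interval is checked against the text, and the later intervals are merely collected.

```python
import numpy as np

# Grid sketch of the radar-altimeter example (our own discretization).
x = np.linspace(-8.0, 8.0, 4001)
dx = x[1] - x[0]

def gaussian(x, mean, var):
    return np.exp(-0.5 * (x - mean) ** 2 / var) / np.sqrt(2.0 * np.pi * var)

def likelihood(pi):
    # f^L = pi * N(-1, 1/4) + (1 - pi) * N(1, 1), cf. (9)
    return pi * gaussian(x, -1.0, 0.25) + (1.0 - pi) * gaussian(x, 1.0, 1.0)

lik_vertices = [likelihood(0.3), likelihood(0.8)]        # V(P^L)

def fuse(priors, likelihoods):
    """Theorem 2: pairwise normalized products of the vertex densities."""
    out = []
    for fp in priors:
        for fl in likelihoods:
            prod = fp * fl
            out.append(prod / (prod.sum() * dx))
    return out

# Uniform prior; fuse the three identical measurements recursively.
estimates = [np.full_like(x, 1.0 / (x[-1] - x[0]))]
intervals = []
for _ in range(3):
    estimates = fuse(estimates, lik_vertices)
    means = [(f * x).sum() * dx for f in estimates]
    intervals.append((min(means), max(means)))
# First interval: E = 1 - 2*pi over pi in [0.3, 0.8], i.e. [-0.6, 0.4].
```

Note that this sketch does not merge coincident vertices, so it keeps $2^k$ densities after $k$ steps, whereas Figure 5 counts only the distinct ones.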
The investigation of the convergence of Bayesian estimators with sets of densities is another interesting point of further research.

VI. CONCLUSIONS AND FUTURE WORK

In this work, a theoretical framework for processing sets of probability densities in the context of Bayesian estimation was presented. Since a generic Bayesian estimator for density sets is intractable, convex sets were examined. To keep the mathematical expressions simple, only convex polytopes were discussed. It was shown that enlarging a non-convex set to

its convex hull does not increase the extent of ignorance, i.e., the probability intervals of arbitrary events. The convex hull is merely the larger set which constitutes the same probability assessment. Therefore, the utilization of convex sets is sufficient. A main drawback of using convex polytopes lies in the fact that the number of vertices grows exponentially during processing, which is a direct consequence of Theorems 2 and 4. There exists no straightforward method to simplify the resulting polytopes, since the function space is infinite-dimensional and the polytopes are thus typically simplices. A direction of future work might be to employ Gaussian mixture or Fourier representations. Due to approximation, vertices can then become affinely dependent, which allows reducing the number of vertices. It should be mentioned that the herein used sets are then sets of approximate functions and thus do not necessarily contain the true probability density. Another important point to emphasize is that convex polytopes are sets of mixture densities. Every function inside a polytope is a convex combination of the vertices, which can themselves be mixture densities. So for some applications, the number of vertex densities tends to be infinite. An example is a density set where the mean value is unknown but assumed to lie in a specific interval. In this regard, we desire to model convex sets of translated densities, which have no polytopic representation. Another application is decentralized data fusion. At this, information fusion is performed on several processing nodes, which usually are spatially distributed. Information in terms of measurements is incorporated into an estimate in each processing node. This estimate is transmitted to the other nodes, which incorporate this data into their own estimates. Due to the lack of knowledge about which measurements are incorporated into the estimate, multiple processing of the same measurement can occur, which leads to unknown correlations.
Approaches handling these unknown correlations are, e.g., covariance hulls and covariance intersection filters [24], [25], where marginal densities with unknown correlation coefficients are predestined for a set-based model. An examination in the light of the results of this work seems promising. Further research will focus on more general models of convex sets to which the presented results can be directly applied.

REFERENCES

[1] R. E. Kalman, "A New Approach to Linear Filtering and Prediction Problems," Transactions of the ASME, Journal of Basic Engineering, vol. 82, pp. 35-45, 1960.
[2] R. Kulhavý, "Can We Preserve the Structure of Recursive Bayesian Estimation in a Limited-Dimensional Implementation?" in International Symposium on the Mathematical Theory of Networks and Systems, 1993.
[3] H. W. Sorenson, "Least-Squares Estimation: From Gauss to Kalman," IEEE Spectrum, vol. 7, 1970.
[4] S. J. Julier and J. K. Uhlmann, "Unscented Filtering and Nonlinear Estimation," Proceedings of the IEEE, vol. 92, no. 3, pp. 401-422, 2004.
[5] M. S. Arulampalam, S. Maskell, N. Gordon, and T. Clapp, "A Tutorial on Particle Filters for On-line Non-linear/Non-Gaussian Bayesian Tracking," IEEE Transactions on Signal Processing, vol. 50, no. 2, pp. 174-188, 2002.
[6] F. Daum and J. Huang, "Curse of Dimensionality and Particle Filters," in Aerospace Conference, Proceedings IEEE, vol. 4, Mar. 2003.
[7] M. Šimandl and O. Straka, "Sampling Densities of Particle Filter: A Survey and Comparison," in American Control Conference (ACC 07), 2007.
[8] K. Srinivasan, "State Estimation by Orthogonal Expansion of Probability Distributions," IEEE Transactions on Automatic Control, vol. 15, no. 1, pp. 3-10, 1970.
[9] S. Challa, Y. Bar-Shalom, and V. Krishnamurthy, "Nonlinear Filtering via Generalized Edgeworth Series and Gauss-Hermite Quadrature," IEEE Transactions on Signal Processing, vol. 48, no. 6, Jun. 2000.
[10] R. Kronmal and M. Tarter, "The Estimation of Probability Densities and Cumulatives by Fourier Series Methods," Journal of the American Statistical Association, vol. 63, no. 323, Sep. 1968.
[11] D. Brunn, F. Sawo, and U. D. Hanebeck, "Nonlinear Multidimensional Bayesian Estimation with Fourier Densities," in Proceedings of the 2006 IEEE Conference on Decision and Control (CDC 2006), San Diego, California, Dec. 2006.
[12] D. L. Alspach and H. W. Sorenson, "Nonlinear Bayesian Estimation using Gaussian Sum Approximations," IEEE Transactions on Automatic Control, vol. 17, no. 4, Aug. 1972.
[13] O. C. Schrempf, D. Brunn, and U. D. Hanebeck, "Density Approximation Based on Dirac Mixtures with Regard to Nonlinear Estimation and Filtering," in Proceedings of the 2006 IEEE Conference on Decision and Control (CDC 2006), San Diego, California, Dec. 2006.
[14] M. F. Huber and U. D. Hanebeck, "The Hybrid Density Filter for Nonlinear Estimation based on Hybrid Conditional Density Approximation," in Proceedings of the 10th International Conference on Information Fusion (Fusion 2007), Quebec, Canada, Jul. 2007.
[15] K. Weichselberger, Elementare Grundbegriffe einer allgemeineren Wahrscheinlichkeitsrechnung I. Heidelberg: Physica-Verlag, 2001.
[16] J. O. Berger, "An Overview of Robust Bayesian Analysis," Test, vol. 3, 1994.
[17] J. O. Berger, D. Ríos Insua, and F. Ruggeri, "Bayesian Robustness," in Robust Bayesian Analysis, ser. Lecture Notes in Statistics, no. 152, ch. 1. New York: Springer-Verlag, 2000.
[18] P. Walley, Statistical Reasoning with Imprecise Probabilities, ser. Monographs on Statistics and Applied Probability 42. London: Chapman and Hall, 1991.
[19] F. G. Cozman, "A Brief Introduction to the Theory of Sets of Probability Measures," Robotics Institute, Carnegie Mellon University, Tech. Rep.
[20] I. Levi, "Imprecision and Indeterminacy in Probability Judgment," Philosophy of Science, vol. 52, no. 3, Sep. 1985.
[21] D. R. Morrell and W. C. Stirling, "Set-valued Filtering and Smoothing," IEEE Transactions on Systems, Man and Cybernetics, vol. 21, no. 1, Jan. 1991.
[22] J. Kenney and W. Stirling, "Nonlinear Filtering of Convex Sets of Probability Distributions," Journal of Statistical Planning and Inference, vol. 105, no. 1, 2002.
[23] T. Schön, F. Gustafsson, and P.-J. Nordlund, "Marginalized Particle Filters for Nonlinear State-space Models," Division of Communication Systems, Department of Electrical Engineering, Linköpings universitet, Tech. Rep., Oct. 2003.
[24] U. D. Hanebeck, K. Briechle, and J. Horn, "Tight Bound for the Joint Covariance of Two Random Vectors with Unknown but Constrained Cross Correlation," in Proceedings of the 2001 IEEE International Conference on Multisensor Fusion and Integration for Intelligent Systems (MFI 2001), Baden-Baden, Germany, Aug. 2001.
[25] L. Chen, P. O. Arambel, and R. K. Mehra, "Fusion under Unknown Correlation: Covariance Intersection as a Special Case," in Proceedings of the Fifth International Conference on Information Fusion, 2002.
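As a side illustration of the covariance intersection fusion mentioned in the concluding remarks (cf. [24], [25]), the sketch below implements the standard CI rule for two estimates with unknown cross correlation in the two-dimensional case. This is not the estimator of this paper; the weight choice and all function names are illustrative assumptions, and plain lists are used to avoid external dependencies.

```python
# Illustrative sketch of covariance intersection (CI) for 2-D estimates.
# CI fuses (xa, Pa) and (xb, Pb) without knowing their cross correlation:
#   P^-1 = w * Pa^-1 + (1 - w) * Pb^-1,
#   x    = P * (w * Pa^-1 xa + (1 - w) * Pb^-1 xb),  with w in [0, 1].

def inv2(m):
    """Invert a 2x2 matrix given as [[a, b], [c, d]]."""
    (a, b), (c, d) = m
    det = a * d - b * c
    return [[d / det, -b / det], [-c / det, a / det]]

def mat_add(m, n, s, t):
    """Weighted sum s*m + t*n of two 2x2 matrices."""
    return [[s * m[i][j] + t * n[i][j] for j in range(2)] for i in range(2)]

def mat_vec(m, v):
    """2x2 matrix times 2-vector."""
    return [m[0][0] * v[0] + m[0][1] * v[1],
            m[1][0] * v[0] + m[1][1] * v[1]]

def covariance_intersection(xa, Pa, xb, Pb, w):
    """Fuse two estimates with unknown cross correlation via CI."""
    Pa_inv, Pb_inv = inv2(Pa), inv2(Pb)
    P = inv2(mat_add(Pa_inv, Pb_inv, w, 1.0 - w))
    ya, yb = mat_vec(Pa_inv, xa), mat_vec(Pb_inv, xb)
    y = [w * ya[i] + (1.0 - w) * yb[i] for i in range(2)]
    return mat_vec(P, y), P

# Example: two estimates, each precise in a different coordinate.
x, P = covariance_intersection([0.0, 0.0], [[4.0, 0.0], [0.0, 1.0]],
                               [2.0, 2.0], [[1.0, 0.0], [0.0, 4.0]], 0.5)
# -> x == [1.6, 0.4], P == [[1.6, 0.0], [0.0, 1.6]]
```

In practice the weight w is typically chosen by minimizing the trace or determinant of the fused covariance; here it is fixed at 0.5 purely for illustration.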


More information

Minimum Necessary Data Rates for Accurate Track Fusion

Minimum Necessary Data Rates for Accurate Track Fusion Proceedings of the 44th IEEE Conference on Decision Control, the European Control Conference 005 Seville, Spain, December -5, 005 ThIA0.4 Minimum Necessary Data Rates for Accurate Trac Fusion Barbara F.

More information

A New Approach for Hybrid Bayesian Networks Using Full Densities

A New Approach for Hybrid Bayesian Networks Using Full Densities A New Approach for Hybrid Bayesian Networks Using Full Densities Oliver C. Schrempf Institute of Computer Design and Fault Tolerance Universität Karlsruhe e-mail: schrempf@ira.uka.de Uwe D. Hanebeck Institute

More information

EE-597 Notes Quantization

EE-597 Notes Quantization EE-597 Notes Quantization Phil Schniter June, 4 Quantization Given a continuous-time and continuous-amplitude signal (t, processing and storage by modern digital hardware requires discretization in both

More information

IN recent years, the problems of sparse signal recovery

IN recent years, the problems of sparse signal recovery IEEE/CAA JOURNAL OF AUTOMATICA SINICA, VOL. 1, NO. 2, APRIL 2014 149 Distributed Sparse Signal Estimation in Sensor Networs Using H -Consensus Filtering Haiyang Yu Yisha Liu Wei Wang Abstract This paper

More information

Gaussian Mixtures Proposal Density in Particle Filter for Track-Before-Detect

Gaussian Mixtures Proposal Density in Particle Filter for Track-Before-Detect 12th International Conference on Information Fusion Seattle, WA, USA, July 6-9, 29 Gaussian Mixtures Proposal Density in Particle Filter for Trac-Before-Detect Ondřej Straa, Miroslav Šimandl and Jindřich

More information

ROBUST CONSTRAINED ESTIMATION VIA UNSCENTED TRANSFORMATION. Pramod Vachhani a, Shankar Narasimhan b and Raghunathan Rengaswamy a 1

ROBUST CONSTRAINED ESTIMATION VIA UNSCENTED TRANSFORMATION. Pramod Vachhani a, Shankar Narasimhan b and Raghunathan Rengaswamy a 1 ROUST CONSTRINED ESTIMTION VI UNSCENTED TRNSFORMTION Pramod Vachhani a, Shankar Narasimhan b and Raghunathan Rengaswamy a a Department of Chemical Engineering, Clarkson University, Potsdam, NY -3699, US.

More information

Imprecise Bernoulli processes

Imprecise Bernoulli processes Imprecise Bernoulli processes Jasper De Bock and Gert de Cooman Ghent University, SYSTeMS Research Group Technologiepark Zwijnaarde 914, 9052 Zwijnaarde, Belgium. {jasper.debock,gert.decooman}@ugent.be

More information