Fundamentals of Multiple-Hypothesis Tracking


1 Fundamentals of Multiple-Hypothesis Tracking Stefano Coraluppi NATO Lecture Series STO IST-55 Advanced Algorithms for Effectively Fusing Hard and Soft Information ITA NDL SWE GBR, September-October 6

2 Outline Multi-target tracking preliminaries Multiple-hypothesis tracking Hypothesis aggregation

3 The Multi-Target Tracking Challenge
Difficulties
- Unknown and changing number of targets
- Uncertain target evolution
- Poor detection statistics (missed detections, fading effects, false alarms, redundant measurements, merged measurements)
- Poor measurement statistics (random errors, bias errors)
- Unknown data association
- Multi-sensor data (disparate data, active and passive sensors, multi-INT data)
The goal
- Generate a set of tracks that are close to targets in kinematic, feature, and identity space
Oh yes, another difficulty
- Choice of an appropriate metric or set of metrics

4 Examples in Active and Passive Sonar

5 Target Existence
Continuous-time birth-death process
- Birth rate λ_b, death rate λ_χ
- We consider a discrete-time sequence t_0, t_1, …, t_k, with no targets at t_0
- Discrete-time Poisson birth process with mean μ_b(Δt_k) over the time interval Δt_k = t_(k+1) − t_k:
  μ_b(Δt_k) = ∫ over [t_k, t_(k+1)] of λ_b e^(−λ_χ (t_(k+1) − τ)) dτ = (λ_b/λ_χ)(1 − e^(−λ_χ Δt_k))
- Discrete-time death probability:
  p_χ(Δt_k) = ∫ over [0, Δt_k] of λ_χ e^(−λ_χ τ) dτ = 1 − e^(−λ_χ Δt_k)
Notes
- Birth-discretization approximation: note that μ_b(Δt_k) < λ_b Δt_k
- Stationarity: μ_b(Δt_k) → λ_b/λ_χ as Δt_k → ∞
- Target independence: the numbers of births in temporally non-overlapping intervals are independent random variables
- The Poisson birth process can also be understood as the limit of the Binomial distribution (many potential target cells, small birth probability in each)
- The independence assumption is generally used for simplicity, though it does not always hold (e.g. group-target existence, uninformative large-variance priors)
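The birth-mean and death-probability expressions above can be checked numerically; a minimal sketch in Python, with hypothetical rates λ_b and λ_χ:

```python
import math

def birth_mean(lam_b, lam_chi, dt):
    # mu_b(dt) = (lam_b / lam_chi) * (1 - exp(-lam_chi * dt)):
    # expected number of targets born in the interval that survive to its end
    return (lam_b / lam_chi) * (1.0 - math.exp(-lam_chi * dt))

def death_prob(lam_chi, dt):
    # p_chi(dt) = 1 - exp(-lam_chi * dt): probability an existing target dies in dt
    return 1.0 - math.exp(-lam_chi * dt)

lam_b, lam_chi, dt = 2.0, 0.5, 1.0   # hypothetical rates and interval
mu = birth_mean(lam_b, lam_chi, dt)

# birth-discretization approximation: mu_b(dt) < lam_b * dt
assert mu < lam_b * dt
# stationarity: mu_b(dt) -> lam_b / lam_chi for large dt
assert abs(birth_mean(lam_b, lam_chi, 1e6) - lam_b / lam_chi) < 1e-9
```

The two assertions reproduce the slide's notes on the discretization bound and the stationary limit.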

6 Target Evolution
- Nearly-constant position model: ẋ = w
- Nearly-constant velocity model: ẍ = w
- Ornstein-Uhlenbeck process: ẋ = −γx + w
- Integrated Ornstein-Uhlenbeck process: ẋ = v, v̇ = −γv + w
- Singer model (nearly-constant acceleration): ẍ = a, ȧ = −γa + w
Notes
- Multiple-model generalizations are common in the tracking community
- Direct discrete-time modeling is also possible, with piecewise-constant process noise
- As with target existence, we generally assume independent target motion; some generalizations exploit hierarchical OU processes

7 Time-Discretization of Stochastic Dynamics
Nearly-constant velocity (NCV) model:
  Ẋ_t = F X_t + w_t, X = (x, ẋ)′, F = [0 1; 0 0], E[w_t] = 0, E[w_t w_τ′] = q δ(t − τ)
Discrete-time NCV model:
  X_(k+1) = A X_k + w_k, A = exp(F Δt) = [1 Δt; 0 1], w_k = ∫ over [t_k, t_(k+1)] of exp(F (t_(k+1) − t)) w_t dt
  E[w_k] = 0, E[w_k w_k′] = Q = q [Δt³/3 Δt²/2; Δt²/2 Δt]
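The discretization above is in closed form, since F is nilpotent (F² = 0) and the matrix exponential terminates after two terms; a minimal sketch with hypothetical parameter values:

```python
def ncv_discretize(dt, q):
    # A = exp(F dt) with F = [[0, 1], [0, 0]]; F^2 = 0, so A = I + F dt exactly
    A = [[1.0, dt],
         [0.0, 1.0]]
    # Q from integrating exp(F s) G q G' exp(F s)' over [0, dt], with G = [0, 1]'
    Q = [[q * dt**3 / 3.0, q * dt**2 / 2.0],
         [q * dt**2 / 2.0, q * dt]]
    return A, Q

A, Q = ncv_discretize(dt=0.5, q=2.0)   # hypothetical time step and noise intensity
```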

8 Linear Gaussian Filtering
The models shown on the previous slide are all linear Gaussian processes. If, additionally, we have linear measurements z = Cx + v with additive Gaussian noise, the optimal linear estimator, with respect to the minimum mean squared error (MMSE) criterion, is the optimal estimator overall.
For jointly Gaussian (x, y) ~ N([μ_x; μ_y], [Σ_x Σ_xy; Σ_yx Σ_y]):
  E[x | y] = μ_x + Σ_xy Σ_y⁻¹ (y − μ_y)
  E[(x − E[x|y])(x − E[x|y])′ | y] = Σ_x − Σ_xy Σ_y⁻¹ Σ_yx
Notes
- This linear estimator may be written recursively: the well-known Kalman filter
- This remains the optimal linear estimator even with non-Gaussian noise
- We often employ the Kalman filter even if linearity or Gaussianity are violated
Example
- With uniform measurement noise, z_i = x + v_i and v_i ~ U(−1, 1), i = 1, …, n, the feasibility-based midrange estimator x̂ = (min(z_1, …, z_n) + max(z_1, …, z_n))/2 outperforms the best linear estimator (the sample mean)
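The recursive form of the linear MMSE estimator is the Kalman filter; a minimal scalar sketch (all model parameters are hypothetical):

```python
def kalman_step(x, P, z, a, q, c, r):
    # predict through x_{k+1} = a x_k + w_k, Var(w_k) = q
    x_pred = a * x
    P_pred = a * P * a + q
    # update with measurement z_k = c x_k + v_k, Var(v_k) = r
    K = P_pred * c / (c * P_pred * c + r)   # Kalman gain
    x_new = x_pred + K * (z - c * x_pred)
    P_new = (1.0 - K * c) * P_pred
    return x_new, P_new

# estimate a constant state (a = 1, q = 0) from noisy measurements
x, P = 0.0, 100.0                           # diffuse prior
for z in [1.2, 0.8, 1.1, 0.9]:
    x, P = kalman_step(x, P, z, a=1.0, q=0.0, c=1.0, r=1.0)
```

With a diffuse prior and a static state, the recursion converges toward the sample mean of the measurements, as expected.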

9 Stationary Target Evolution
Modified Ornstein-Uhlenbeck process:
  ẋ = v, v̇ = −γ_p x − γ_v v + w
The discrete-time form:
  X_(k+1) = A X_k + w_k, A = exp(F Δt), E[w_k w_k′] = Q(Δt)
Notes
- Stability (and behavior) depend on the closed-loop eigenvalues
- Stationarity: the state covariance converges to the fixed point of Q̄ = A Q̄ A′ + Q
[Figure: path followed by the closed-loop eigenvalues in the complex plane for increasing feedback gain]

10 The Multiple-Model MOU Process for Evasive Target Filtering
- Use of variable-structure IMM (VS-IMM) for move-stop-move targets has been studied (Kirubarajan 3)
- We may consider an OU-IMM approach
- Motion-dependent detection model (Koch 4): P_D(ṙ) = p_d e^(−ln 2 (MDV/ṙ)²)
- MOU process for the move mode, a second MOU process for the stop mode with small σ_v
- Mode-dependent detection probabilities: P_D^i = p_d (1 − erf(MDV/σ_v^i)) (1D case), P_D^i = p_d e^(−(MDV/σ_v^i)²) (2D case)
- Use P_D^i Λ^i in the standard IMM mode-matched likelihood computation, and use 1 − P_D^i for missed-detection scoring
Advantages
- Monte Carlo simulation results are encouraging
- Allows for slow-moving targets below MDV
- Captures acceleration and deceleration naturally
- Avoids the problematic stop-move transition
Filter   RMS position error   RMS velocity error
KF       65.7 m               8.5 m/sec
VS-IMM   35.6 m               5.6 m/sec
OU-IMM   34.6 m               .8 m/sec

11 Kalman Smoothing
- Smoothing improves estimation performance and provides statistically-consistent interpolation
- The forward-backward smoother requires a time-reversed motion model for:
  x_(k+1) = A x_k + w_k, x_0 ~ N(0, Q̄), w_k ~ N(0, Q)
- The reverse-time kinematic model with uncorrelated disturbances w_B ~ N(0, Q_B) and terminal condition x_n ~ N(0, Q̄) (Verghese 1979):
  x_k = A_B x_(k+1) + w_B_k
  A_B = A⁻¹(I − Q Q̄⁻¹)
  Q_B = A⁻¹(Q − Q Q̄⁻¹ Q) A⁻ᵀ
- A form of the forward-backward smoother avoids the Information Filter on the backward pass (Wall 98); a slightly-simpler form uses inflated priors (Coraluppi 6), similar to recent distributed filtering work (Koch 4)
- The equation below is not a valid time reversal: the noise increments and the prior at the final time are not orthogonal to x_(k+1):
  x_(k+1) = A x_k + w_k ⟹ x_k = A⁻¹ x_(k+1) − A⁻¹ w_k
- (Note that there is no problem with the matrix inverse, as A is non-singular)
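Under the stationarity assumption Q̄ = A Q̄ A′ + Q, the backward transition matrix A⁻¹(I − Q Q̄⁻¹) coincides with the covariance-based form Q̄ A′ Q̄⁻¹; a dependency-free numerical check, with hypothetical stable dynamics:

```python
# minimal 2x2 matrix helpers
def mul(X, Y): return [[sum(X[i][k] * Y[k][j] for k in range(2)) for j in range(2)] for i in range(2)]
def sub(X, Y): return [[X[i][j] - Y[i][j] for j in range(2)] for i in range(2)]
def tr(X):     return [[X[j][i] for j in range(2)] for i in range(2)]
def inv(X):
    d = X[0][0] * X[1][1] - X[0][1] * X[1][0]
    return [[X[1][1] / d, -X[0][1] / d], [-X[1][0] / d, X[0][0] / d]]

I2 = [[1.0, 0.0], [0.0, 1.0]]
A = [[0.9, 0.1], [0.0, 0.8]]          # hypothetical stable dynamics
Q = [[0.2, 0.0], [0.0, 0.1]]          # hypothetical process noise

# stationary covariance Qbar = A Qbar A' + Q, by fixed-point iteration
Qbar = [[0.0, 0.0], [0.0, 0.0]]
for _ in range(500):
    AQA = mul(mul(A, Qbar), tr(A))
    Qbar = [[AQA[i][j] + Q[i][j] for j in range(2)] for i in range(2)]

A_B1 = mul(inv(A), sub(I2, mul(Q, inv(Qbar))))   # A^{-1}(I - Q Qbar^{-1})
A_B2 = mul(mul(Qbar, tr(A)), inv(Qbar))          # Qbar A' Qbar^{-1}
```

The identity follows from Q̄ − Q = A Q̄ A′, so that A⁻¹(Q̄ − Q) Q̄⁻¹ = Q̄ A′ Q̄⁻¹.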

12 Bayesian Estimation
Bayes risk for estimator X̂(Y) and cost C(X̂, X):
  J = E[C(X̂(Y), X)] = E_Y[ E_(X|Y)[ C(X̂(Y), X) | Y ] ]
Minimum mean squared error (MMSE) estimation:
  C(x̂, x) = ‖x̂ − x‖² ⟹ X̂(y) = E[X | y] (conditional mean)
Minimum mean absolute error (MMAE) estimation:
  C(x̂, x) = |x̂ − x| ⟹ X̂(y) = M[X | y] (conditional median)
Minimum probability of error, or maximum a posteriori (MAP) estimation:
  C(x̂, x) = 1{x̂ ≠ x} ⟹ X̂(y) = argmax_x f(x | y) = argmax_x f(y | x) f(x) (conditional mode)

13 A Bayesian Estimation Example
Prior distribution: f(x) = e^(−x), x ≥ 0
Conditional distribution: f(y | x) = x e^(−xy), y ≥ 0
Posterior: f(x | y) = f(y | x) f(x) / f(y) ∝ x e^(−x(1+y)), with f(y) = ∫ x e^(−x(1+y)) dx = (1+y)^(−2)
Estimators:
  X̂_MMSE(y) = ∫ x f(x | y) dx = 2/(1+y)
  X̂_MMAE(y) ≈ 1.678/(1+y)
  X̂_MAP(y) = argmax_x x e^(−x(1+y)) = 1/(1+y)
Points to note
- The MMSE, MMAE, and MAP estimators differ and are nonlinear
- Computation of the MAP estimator does not require f(y) (the unconditional density)
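For this example the posterior is a Gamma density with shape 2 and rate 1 + y, so the three estimators can be computed directly; a small numerical sketch, where the median constant ≈ 1.678 solves 1 − e^(−c)(1 + c) = 1/2:

```python
import math

def posterior_estimators(y):
    rate = 1.0 + y
    mmse = 2.0 / rate        # conditional mean of Gamma(shape 2, rate 1+y)
    mode = 1.0 / rate        # MAP estimate: conditional mode
    # conditional median: bisection on the shape-2 Gamma CDF, 1 - exp(-c)(1 + c)
    lo, hi = 0.0, 10.0
    for _ in range(60):
        c = 0.5 * (lo + hi)
        if 1.0 - math.exp(-c) * (1.0 + c) < 0.5:
            lo = c
        else:
            hi = c
    mmae = c / rate
    return mmse, mmae, mode

mmse, mmae, mode = posterior_estimators(y=1.0)
```

The ordering mode < median < mean illustrates the slide's point that the three estimators differ.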

14 The Multi-Target Tracking Problem
- Compute p(X | Z); almost nobody attempts this (Kreucher 5)
- Extract from p(X | Z) a set of tracks that is close to truth; but note that MAP estimation is not meaningful here (Mahler 7)
- Consider the following posterior distribution: with probability p there is one target, distributed in state space according to N(μ, Σ); with probability 1 − p, there are no targets. How to determine the MAP estimate?
- The same difficulty exists when comparing like-cardinality solutions with different target temporal support
Some solution paradigms
- Detection-based vs. unified detection & tracking
- Centralized vs. distributed
- Hard vs. soft
- Labelled vs. unlabeled
- Association-based vs. association-free
- Sequential vs. batch
- Deterministic vs. stochastic (Multiple-Hypothesis Tracking)

15 A Labeled-Tracking, Association-Free Method
Symmetric Measurement Equations (SME) approach
- Transforms the data association problem into one of nonlinear filtering (Kamen 99)
- The initial formulation assumes a known number of targets, no false alarms, and no missed detections; subsequent work relaxes these assumptions
- Non-trivial filter convergence challenges; it appears most useful in conjunction with established methods (e.g. MHT), for group-target track maintenance (Blackman 1999)
- Useful as an approach to establishing tracking performance bounds (Daum 1997)
Example
- Consider a two-target track-maintenance problem with measurements z_i = x_i + v_i, E[v_i²] = σ², i = 1, 2; the data association is unknown
- Symmetric measurements require: (i) association-independent zero-mean noise; (ii) an invertible (non-linear) observation function h; (iii) a full-rank Jacobian matrix H
  y_1 = c(z_1 + z_2)
  y_2 = c(z_1² + z_2² − 2σ²)

16 Outline Multi-target tracking preliminaries Multiple-hypothesis tracking Hypothesis aggregation

17 Association-Based MTT
- The MTT challenge: an intractable posterior probability distribution p(X | Z), where Z denotes all measured data
- Hybrid-state decomposition: p(X | Z) = Σ_q p(X | Z, q) p(q | Z)
- The MHT approach uses maximum a posteriori (MAP) estimation:
  q* = argmax_q p(q | Z), X* = argmax_X p(X | Z, q*)
- Recursive formulation:
  p(q_k | Z^k) = p(Z_k | Z^(k−1), q_k) p(q_k | q_(k−1)) p(q_(k−1) | Z^(k−1)) / p(Z_k | Z^(k−1))
- Unfortunately, the normalization is a very large sum
- MHT finds the likeliest explanation of the data, and uses this to determine (approximately) the likeliest set of tracks
- Recursive processing enables real-time operation and computational feasibility

18 Is MAP Estimation a Good Idea?
- Single-target tracking: MAP doesn't minimize MSE (Braca)
- This is true more generally: the MMSE, MMAE, and MAP criteria lead to the same estimator only under very narrow assumptions (linearity and Gaussianity)
- MAP estimation is useful in practice
- No need for computation of p(Z_k | Z^(k−1))
- MMSE and MMAE are difficult to define in a multi-target setting

19 Objections to Association-Based MTT
Bayes rule prescribes the following:
  p(q_k | Z^k) = p(Z^k | q_k) p(q_k) / p(Z^k)
Some issues that have been raised (Vo 8):
- Since q_k depends on Z^k, as it prescribes how to explain the data, is p(q_k) a valid prior? Is p(Z^k | q_k) a valid likelihood function?
- It is not clear whether p(Z^k | q_k) p(q_k) is the joint density p(Z^k, q_k): p(q_k) = ∫ p(Z^k | q_k) p(q_k) dZ^k must not depend on the data; this contradicts the fact that q_k does depend on Z^k! Does the normalizing constant even exist?
We can address these concerns with a conditioning argument:
  p(q_k | Z^k) = p(q_k | Z_k, Z^(k−1)) = p(Z_k | q_k, Z^(k−1)) p(q_k | Z^(k−1)) / p(Z_k | Z^(k−1))
- q_k is conditionally independent of Z_k given Z^(k−1); hence p(q_k | Z^(k−1)) is now a valid prior and p(Z_k | q_k, Z^(k−1)) is now a valid likelihood function
- The normalizing constant is well defined:
  p(Z_k | Z^(k−1)) = Σ over q_k consistent with Z_k of p(Z_k | q_k, Z^(k−1)) p(q_k | Z^(k−1))

20 Hypothesis-Oriented MHT (HO-MHT)
The starting point, writing q_k = (q_(k−1), q̃_k):
  p(q_k | Z^k) = p(q_k | Z_k, Z^(k−1)) = p(Z_k | q_k, Z^(k−1)) p(q_k | Z^(k−1)) / p(Z_k | Z^(k−1))
First numerator factor:
  p(Z_k | q_k, Z^(k−1)) = p(Z_k | Z^(k−1), q_k)
Second numerator factor:
  p(q_k | Z^(k−1)) = p(q̃_k | Z^(k−1), q_(k−1)) p(q_(k−1) | Z^(k−1)) = p(q̃_k | q_(k−1)) p(q_(k−1) | Z^(k−1))
The final form:
  p(q_k | Z^k) = p(Z_k | Z^(k−1), q_k) p(q̃_k | q_(k−1)) p(q_(k−1) | Z^(k−1)) / p(Z_k | Z^(k−1))

21 Assumptions Enabling Track-Oriented MHT (TO-MHT)
Target Poisson assumption
- Exponentially-distributed target inter-arrival (birth) times with parameter λ_b
- Exponentially-distributed target lifetimes with parameter λ_χ
  μ_b(Δt) = (λ_b/λ_χ)(1 − e^(−λ_χ Δt)), p_χ(Δt) = 1 − e^(−λ_χ Δt)
Sensor Bernoulli assumption
- Each target is detected with probability p_d
Poisson false alarm assumption
- Large number of detection cells N and vanishingly small false detection probability p_F, with p_F N → Λ

22 Track-Oriented MHT Recursion
Recall that we have:
  p(q_k | Z^k) = p(Z_k | Z^(k−1), q_k) p(q̃_k | q_(k−1)) p(q_(k−1) | Z^(k−1)) / p(Z_k | Z^(k−1))
Hypothesis contribution, with r_k = |Z_k| measurements, τ tracks, d detections, χ deaths, and b births:
  p(q̃_k | q_(k−1)) = e^(−(p_d μ_b + Λ)) (Λ^(r_k) / r_k!) p_χ^χ [(1 − p_χ)(1 − p_d)]^(τ−χ−d) [(1 − p_χ) p_d / Λ]^d [p_d μ_b / Λ]^b
Filtering contribution:
  p(Z_k | Z^(k−1), q_k) = Π over j ∈ J_d ∪ J_b of f(z_j | Z^(k−1), q_k) × Π over j ∈ J_fa of f_fa(z_j)
The MHT recursion:
  p(q_k | Z^k) = c_k p(q_(k−1) | Z^(k−1))
    × Π over deaths of p_χ
    × Π over missed detections of (1 − p_χ)(1 − p_d)
    × Π over j ∈ J_d (updates) of (1 − p_χ) p_d f(z_j | Z^(k−1), q_k) / (Λ f_fa(z_j))
    × Π over j ∈ J_b (births) of p_d μ_b f(z_j | Z^(k−1), q_k) / (Λ f_fa(z_j))
with the hypothesis-independent constant
  c_k = e^(−(p_d μ_b + Λ)) (Λ^(r_k) / r_k!) Π over j ∈ Z_k of f_fa(z_j) / p(Z_k | Z^(k−1))
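Since the constant c_k is common to all hypotheses, it cancels when comparing them, and the per-track factors can be accumulated in the log domain; a minimal scoring sketch with hypothetical parameter values:

```python
import math

def log_score_increments(p_chi, p_d, mu_b, Lam, f_ratio):
    # f_ratio stands for f(z_j | Z^{k-1}, q_k) / f_fa(z_j) for a detection;
    # terms common to all hypotheses (the constant c_k) are dropped
    return {
        "death":  math.log(p_chi),
        "miss":   math.log((1.0 - p_chi) * (1.0 - p_d)),
        "update": math.log((1.0 - p_chi) * p_d * f_ratio / Lam),
        "birth":  math.log(p_d * mu_b * f_ratio / Lam),
    }

inc = log_score_increments(p_chi=0.01, p_d=0.9, mu_b=0.1, Lam=2.0, f_ratio=50.0)
```

For a strong detection (f_ratio well above the clutter density Λ), the update factor dominates the missed-detection and death factors, which is exactly how track continuation wins in the recursion.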

23 Track-Oriented MHT
[Figure: the hypothesis-oriented view enumerates complete global hypotheses over measurement sets Z_1, Z_2, Z_3, with the hypothesis count growing combinatorially; the track-oriented view maintains per-track hypothesis trees over the same measurements, from which global hypotheses are reconstructed as needed]

24 Correcting Some Statements About MHT
Both TO-MHT and HO-MHT are solving the same problem
- MAP global-hypothesis estimation problem
- MMSE estimation problem conditioned on the MAP global hypothesis
- In the literature, it is occasionally (and, in my view, erroneously) stated that HO-MHT is MAP-MMSE, while TO-MHT is ML-MMSE (e.g. Bar-Shalom 9)
The global hypothesis q_k is well-posed as a state variable
- Mahler (7) claims that, since it is an observable, the use of Z^k in the MHT recursion is suspicious
- Note that we may express the recursion in prediction-update form:
  p(q_k | Z^(k−1)) = p(q̃_k | q_(k−1)) p(q_(k−1) | Z^(k−1)) (prediction step)
  p(q_k | Z^k) = p(Z_k | Z^(k−1), q_k) p(q_k | Z^(k−1)) / p(Z_k | Z^(k−1)) (update step)
- In practice, we are not concerned with predictions that ultimately lead to null posterior probability
Mahler (7) raises the concern that measurement labelling introduces an a priori order on the data that may introduce a statistical bias in the MHT solution
- In fact, measurement labels are arbitrary, and have no impact on p(q_k | Z^k); hence there is no impact on the resulting MHT solution (X, q)

25 Practical Considerations
Hypothesis generation
- Limit generation to a single parent global association hypothesis
Measurement gating
- Limit the number of association hypotheses
Hypothesis pruning
- Reduce the number of association hypotheses
Track extraction
- Online output, even for forensic problems
- Decoupled and sequential data association and track extraction
- The sequential mode (data association followed by track extraction, with feedback) proves beneficial

26 An Example: Two Closely-Spaced Targets
- Confirmed tracks T1 and T2, with newly-acquired detections R1(t_k), R2(t_k), R3(t_k) in their vicinity
- How to perform track maintenance for T1 & T2?

27 High-Performance Hypothesis Resolution: Illustration with Limited n-Scan (Tree Depth)
[Figure: track hypothesis trees for tracks T1 and T2 over reports R1, R2, and R3, showing track-coast, track-termination, track-update, and new-track hypotheses; the resolved tracks follow the optimal global hypothesis]
Global hypothesis selection as an integer program:
  maximize c′x subject to Ax ≤ b, x_i ∈ {0, 1} (for each vector element)
  LP relaxation: maximize c′x subject to Ax ≤ b, x ∈ [0, 1]^N
Constraints account for tracks T1 and T2, and for reports R1, R2, and R3
A simple, alternative scheme: track-score normalization and greedy track selection
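The alternative scheme can be sketched as follows: sort track hypotheses by normalized score and greedily accept each one that conflicts with nothing already selected. Scores and report IDs here are hypothetical:

```python
def greedy_select(hypotheses):
    # hypotheses: list of (normalized_score, set_of_report_ids)
    selected, used = [], set()
    for score, reports in sorted(hypotheses, key=lambda h: h[0], reverse=True):
        if used.isdisjoint(reports):     # no report may feed two tracks
            selected.append((score, reports))
            used |= reports
    return selected

# two track hypotheses compete for report R2; the stronger one wins
hyps = [(5.0, {"R1", "R2"}), (4.0, {"R2", "R3"}), (3.0, {"R4"})]
picked = greedy_select(hyps)
```

Greedy selection is fast but suboptimal relative to the integer program; it serves as a cheap fallback when the ILP is too large.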

28 Performance Metrics
- Track fragmentation (this lowers target purity)
- Non-tracked target (this lowers target completeness)
- Localization error (this is averaged over all target-track associations)
- Track swap (this lowers track purity)
- False tracks (these lower track completeness)
[Figure legend: targets vs. tracks]

29 Tracker Performance Calibration
Model-based optimization vs. black-box optimization
- Deterministic, e.g. a gradient-free generalized bisection algorithm
- Stochastic, e.g. Markov Chain Monte Carlo (MCMC) with the Metropolis-Hastings algorithm
Scalar objective
- Weighted combination of tracker metrics, including label-free optimal sub-pattern assignment (OSPA)
- Global likelihood
Why global likelihood?
- Matches the parameter-optimization objective to the MHT optimization objective
- No need for ground truth
Metropolis-Hastings detailed balance and acceptance probability:
  f(x) p(x→y) α(x→y) = f(y) p(y→x) α(y→x)
  α(x→y) = min(1, f(y) p(y→x) / (f(x) p(x→y)))
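A minimal random-walk Metropolis-Hastings sketch: the symmetric proposal makes p(x→y) = p(y→x), so the acceptance probability reduces to min(1, f(y)/f(x)). The standard-normal target here is a hypothetical stand-in for the global-likelihood objective:

```python
import math, random

def metropolis_hastings(log_f, x0, step, n, seed=0):
    rng = random.Random(seed)
    x, chain = x0, []
    for _ in range(n):
        y = x + rng.gauss(0.0, step)            # symmetric random-walk proposal
        # accept with probability min(1, f(y)/f(x)); 1 - u lies in (0, 1]
        if math.log(1.0 - rng.random()) < log_f(y) - log_f(x):
            x = y
        chain.append(x)
    return chain

chain = metropolis_hastings(lambda x: -0.5 * x * x, x0=0.0, step=1.0, n=20000)
mean = sum(chain) / len(chain)
var = sum(c * c for c in chain) / len(chain) - mean * mean
```

Moves that improve the (log) objective are always accepted; worsening moves are accepted with the ratio probability, which lets the chain escape local optima.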

30 Some Key Points
- MAP estimation based on p(X | Z) is not meaningful
- MAP estimation based on p(q | Z) is well posed
- MAP estimation may not achieve optimality for metrics of operational interest
- The recursive formulation of p(q_k | Z^k) is the key enabler for MHT
- Track-oriented MHT and further simplifying approximations are needed as well for practical implementations

31 Outline Multi-target tracking preliminaries Multiple-hypothesis tracking Hypothesis aggregation

32 Case I: Exact Aggregation, Indistinguishable Global Hypotheses
- Richer set of global hypotheses, without increasing data-association hypotheses (Coraluppi 4)
- Consider multiple target birth and death times, for the same data-association sequence
- Consider never-detected targets
The enhanced MHT recursion
- Aggregation over indistinguishable birth events on undetected targets
- Solution structure: decoupled observed-target and ghost-target solutions, data-independent ghost solution, common structure for visible targets (assuming stationary dynamics)
- Further aggregation is possible (with stationarity)
[Enhanced MHT recursion: the classical recursion augmented with factors that sum over deaths, missed detections, detections, births, unnoticed births, and ghost births]
- Enhanced MHT identifies the optimal birth time interval (in this illustration, birth is followed by a missed-detection event); classical MHT assumes birth occurs in the interval of the first detection
- Further aggregation considers multiple intervals

33 Benefits of Enhanced MHT
- Improved track extraction in temporally-staggered sensor settings
- Improved estimate of target cardinality in low-p_d settings
The time-invariant solution
- No undetected births of visible targets
- Fixed ghost-target structure, with decreasing numbers at longer lifetimes
[Figure: example ghost-target state trajectories over time, for given p_b and p_d]

34 Case II: Exact Aggregation, Indistinguishable Measurements
- Cardinality estimation for speed, privacy (Durand 3, Kodialam 7)
- Benefit of MHT aggregation and direct MHT-PHD comparison
- Cardinality MHT (CMHT) recursion (Coraluppi 4)
- Counting-targets limit of CPHD (Mahler 7)
- Kalman filter (suboptimal: the problem is neither linear nor Gaussian)
[Figure: cardinality estimates over time for truth, data, CMHT, CPHD, and the Kalman filter; the CMHT and CPHD recursions shown on the original slide are not recoverable here]

35 Case III: Approximate Aggregation
- Track coalescence (Fitzgerald 1985), track repulsion (Willett 7), and track-repulsion mitigation via track-break-track (Coraluppi 9)
- Further mitigation via hypothesis aggregation (Coraluppi)
[Figures: position vs. time for truth, MHT, ideal PDAF, and ECMHT (equivalent measurement); RMS error and swap rate as a function of the distance between targets; average log probability of the unique hypothesis vs. the hypothesis class; optimal swap rate as a function of distance between targets]

36 Conclusions
- Multi-target tracking is a challenging problem that requires statistical estimation and combinatorial optimization
- Multiple-hypothesis tracking is a mathematically rigorous and well-performing solution paradigm
- Practical MHTs must use judicious hypothesis management, track extraction, and hypothesis aggregation where possible
- MHT performance can be optimized via principled tracker modeling (this is hard) or through grey-box optimization

37 Recent Advances in Multiple-Hypothesis and Graph-Based Tracking Stefano Coraluppi NATO Lecture Series STO IST-55 Advanced Algorithms for Effectively Fusing Hard and Soft Information ITA NDL SWE GBR, September-October 6

38 Outline Distributed MHT Asynchronous MHT Graph-Based Tracking

39 Centralized MHT
- Single-sensor MHT works well in many applications
- Multi-sensor MHT is effective in some applications
[Figure: fragmentation without and with MHT processing; bistatic sonar example (Bistatic #1, Bistatic #2, Fused); theory vs. practice]

40 Multi-Sensor Centralized MHT
Small sensor networks
- Crazy Ivan Run, SEABAR 7
Large sensor networks
- Full-network processing
- Adaptive processing (based on context information)
- Adaptive processing (based on tracker modeling)
[Figure: Markov chain for track status (no active track / active track) with detection-probability and sojourn-time parameters; fusion performance curves of false track rate (FTR, per hour) vs. detection probability for 1, 2, and 3 sensors]

41 The Case for Distributed MHT
Why distributed multi-sensor systems?
- Bandwidth and legacy constraints
- Robustness against target fading and registration errors
- Ability to handle disparate sensors, including active and passive sensors
Why distributed (i.e. multi-stage) single-sensor systems?
- Improved performance with high-confidence same-sensor association
- Effectiveness in high-ambiguity settings (e.g. passive bistatic radar, HF radar)
- Processing flexibility with limited-quality sensor data (e.g. redundant measurements, hardware-induced artifacts, dim targets)
Examples
- Airborne sense and avoid: successful track maintenance
- GMTI radar tracking: significant processing flexibility, beyond the examples discussed here
- Active sonobuoy field: upfront static fusion via ML processing; residual processing
- HF radar tracking (BREST 9): fuse-before-track processing and track-extract-track processing

42 MHT with Active and Passive Sensors
The use of virtual measurements
- Yes: track initialization, hypothesis gating
- No: track update, hypothesis scoring
The use of equivalent measurements
- Equivalent-measurement (tracklet) formation: a sequence of measurements may be replaced by an equivalent measurement Z̃ that reproduces the update X̂ → X̂₊ via a Kalman step:
  X̂₊ = X̂ + L(Z̃ − C̃ X̂), P₊ = (I − L C̃) P, L = P C̃′ (C̃ P C̃′ + R̃)⁻¹
- Columns of C̃′ are fixed as the orthonormal eigenvectors of P − P₊ corresponding to positive eigenvalues
- L = (I − P₊ P⁻¹) C̃′, Z̃ = C̃ X̂ + (L′L)⁻¹ L′ (X̂₊ − X̂), R̃ = Λ
- Reduction in spurious-hypothesis formation, filter computations at the fusion center, and bandwidth requirements
- Note: the equivalent measurement may not be full-dimensional; suboptimal filtering (common target process noise); sub-optimal track scoring (even in the single-sensor setting); processing latency
The use of composite fusion logic
- Reduction in the impact of airborne clutter, while minimizing latency to single-sensor track initiation

43 MHT for Dim Targets in Clutter
- Redundant returns: MHT-based measurement clustering
- Persistent sidelobes: multiple-mode filtering
- Hardware-induced artifacts: constant-range tracking
- Dim targets: two-stage dynamic tracking

44 Results
- The overall processing architecture
- Visualization of the end-to-end result
- Low-power UAV-based GMTI radar data

45 Independent Performance Evaluation
- Significant in-house testing
- AFRL-provided CTESS
- STR Measures of Performance software that includes additional metrics, e.g. the optimal subpattern assignment (OSPA) metric
- AFRL performance evaluation against leading GMTI tracking technology providers
- The full set of metrics confirms STR Multi-Stage MHT (MS-MHT) is the best performer by a sizeable margin

46 Handling Redundant Measurements
- Most tracking paradigms adopt a point-target assumption (i.e. at most one measurement per target per sensor per scan)
- These algorithms do not directly address extended-object tracking and limited-performance detectors that lead to multiple detections per target
- Many researchers have addressed this challenge
- Some researchers explicitly consider multiple-model phenomena, e.g. for over-the-horizon-radar (OTHR) sensors (MHT: Sathyan 3)
- Other researchers assume a more challenging formulation, whereby the same detection model applies to all measurements (PDAF: Kirubarajan, PHD: Clark, Degen 4)
The MHT recursion can be generalized (Coraluppi 6)
- Poisson case, with cluster-cardinality distribution p_i = (λ^i / i!) e^(−λ):
  p(q̃_k | q_(k−1)) = (Λ^(r_k) e^(−Λ) e^(−μ_b) / r_k!) Π over deaths of p_χ × Π over surviving tracks with i measurements of (1 − p_χ) (λ^i / (i! Λ^i)) e^(−λ) × Π over births with i measurements of μ_b (λ^i / (i! Λ^i)) e^(−λ)
- Global hypothesis factorization is achieved
- The number of measurements in a cluster impacts explicitly the global hypothesis score; the benefit of the principled derivation is to establish the exact dependence
- Substituting p_d for 1 − e^(−λ) and limiting clusters to unity cardinality yields the familiar, classical MHT recursion
[Figure: 2D illustration of improved clustering (red) with respect to classical processing (blue)]

47 Practical Considerations
- Single-stage redundant-measurement MHT is computationally infeasible
- The number of ways to cluster N measurements is given by the Bell number B_N, which is (roughly) O(N^N)
- Feasibility relies on distributed MHT processing
- Multi-stage processing solutions include (i) redundant-measurement MHT for static clustering, followed by classical MHT; or (ii) classical MHT, followed by redundant-measurement MHT for track fusion
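The Bell numbers can be computed with the Bell triangle; a short sketch confirming the combinatorial growth that makes single-stage clustering infeasible:

```python
def bell_numbers(n):
    # B_N = number of ways to partition (cluster) N measurements,
    # computed via the Bell triangle; returns [B_1, ..., B_n]
    row, bells = [1], [1]
    for _ in range(n - 1):
        nxt = [row[-1]]              # each row starts with the previous row's last entry
        for v in row:
            nxt.append(nxt[-1] + v)
        row = nxt
        bells.append(row[-1])
    return bells

bells = bell_numbers(10)             # B_10 already exceeds 10^5
```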

48 More on Track Repulsion
- Not surprisingly, more measurements lead to improved localization performance
- Interestingly, analysis of bias errors shows that multiple measurements per target increase the track-repulsion effect
- [Figure: an explanation of the effect]

49 Outline Distributed MHT Asynchronous MHT Graph-Based Tracking

50 The Multi-INT Track Fusion Challenge
- No viable multi-INT technology solutions are available
- Distributed MHT performs correct multi-sensor fusion, but cannot scale to large problems
- Graph-based tracking (GBT) is fast, but cannot incorporate multi-INT data (Castañon)
- Paradigms are of interest that provide improved scalability while addressing the problem
- Asynchronous MHT (A-MHT) exploits forensic identity information to reduce spurious hypotheses and improve data association decisions
- Likelihood factorization: L(y^n) = L(y_1) Π over i = 2, …, n of L(y_i | y_(i−1))
[Figure: kinematic-based tracks with sparse identity information and significant track overlap / confusion]

51 Asynchronous Global Nearest Neighbor
From an estimation perspective, MTT leads to surprises
- Distributed processing may outperform centralized processing in robustness (simpler registration, handling of target fading) and performance (exploitation of single-sensor association)
- Delayed information (e.g. tracklets) reduces spurious association hypotheses
- Asynchronous data association may perform better than in-sequence sequential processing
[Figure: position vs. time, and average positional error vs. number of sensor scans, for clairvoyant, sequential, asynchronous (approx), and asynchronous (exact) processing]

52 Asynchronous MHT (A-MHT)
- MHT with track-breakage logic
- A-MHT: batch-level (forensic) processing
[Figure: track segments over time, with terminated and re-associated segments across processing steps (not time); target purity vs. hypothesis tree depth for classical MHT and asynchronous MHT]

53 Improved Graphical Exploitation
Basic idea
- Forward-backward coarse gating
- A-MHT on the reduced hypothesis space
[Figure: N levels of identity data and N−1 levels of kinematic tracks over time; two MHT paths (but not A-MHT paths) connect the same object (SIGINT track); an A-MHT path is highlighted]

54 Improved Graphical Exploitation
Basic idea
- Forward-backward coarse gating
- A-MHT on the reduced hypothesis space
[Figure: forward light cone from the same object (SIGINT track), in time-position space, over N levels of data and N−1 levels of kinematic tracks; no path enumeration in coarse gating]

55 Improved Graphical Exploitation
Basic idea
- Forward-backward coarse gating
- A-MHT on the reduced hypothesis space
[Figure: backward light cone to the same object (SIGINT track), in time-position space, over N levels of data and N−1 levels of kinematic tracks; no path enumeration in coarse gating]

56 Improved Graphical Exploitation
Basic idea
- Forward-backward coarse gating
- A-MHT on the reduced hypothesis space
[Figure: the reduced hypothesis space for the same object (SIGINT track), in time-position space, over N levels of data and N−1 levels of kinematic tracks; no path enumeration in coarse gating]

57 A-MHT with Enhanced Association Processing
Algorithms: Clairvoyant MHT; A-MHT; Improved A-MHT (light cone)
[Table: average localization error (m) and fraction of correct associations for each algorithm]
Results based on Monte Carlo realizations

58 Outline Distributed MHT Asynchronous MHT Graph-Based Tracking

59 An Illustration of the Complexity of Some Tracking Paradigms
- The data
- Fusion inference with no approximations: hypothesis-oriented MHT
- Fusion inference with some approximations (Poisson targets and clutter): track-oriented MHT
- Fusion inference with additional approximations (path independence): GBT

60 Multi-INT Graph-Based Tracking (MI-GBT)
- Simple scenario: e is the location of an ID measurement; the v_i are kinematic tracks
- Generate separate graphs that define the feasible set of flows for each ID; crucially, the emitter tracks do not show up as nodes. Constrain each kinematic track to be used exactly once.
- Solve for a consistent set of flows that maximizes the solution likelihood. Details in Coraluppi (6)
[Figure: source nodes, kinematic tracks v_1, v_2, v_3, and sink nodes in one graph with no ID and one graph for ID e]

61 MI-GBT Mathematical Formulation
Minimize:
  J = Σ over e ∈ E and (i,j) ∈ A of c_ij^e x_ij^e
- Costs encode target existence and evolution statistics as well as sensor detection and localization statistics
- Linear objective function: the unknowns are the edge-selection variables in the multi-INT graph
Subject to:
  x_ij^e ∈ {0, 1}, for all (i,j) ∈ A and all e ∈ E
  (each edge must be selected or not: no fractional values)
  Σ over e ∈ E and i with (i,j) ∈ A of x_ij^e = 1, for each v_j ∈ V
  (each kinematic track must be used exactly once)
  Σ over i with v_i ∈ V of x_si^e = 1 for the source node s, for each e ∈ E
  (each emitter must be used exactly once: unity flow in each emitter sub-graph)
  Σ over j with (j,i) ∈ A of x_ji^e − Σ over j with (i,j) ∈ A of x_ij^e = 0, for each v_i ∈ V and each e ∈ E
  (flow must be preserved within each sub-graph)

62 Complexity Estimates
Variables
- m sets of update tracks, V kinematic tracks in each set, E emitter tracks
Integer linear program (ILP) size
- A-MHT: M ~ O(V^m E) (exact solution to the multi-INT problem)
- MI-GBT: M ~ O(m V² E) (approximate solution to the multi-INT problem)
- GBT: M ~ O(m V²) (does not solve the multi-INT problem)
ILP solution complexity
- A-MHT & MI-GBT: assume O(M⁴) via LP relaxation
- GBT: O(M³) via a min-cost network flow (MCNF) or bipartite matching solution
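Plugging hypothetical problem sizes into these estimates (as read here: O(V^m E), O(m V² E), O(m V²)) shows why path enumeration dominates even for modest scenarios:

```python
def ilp_sizes(m, V, E):
    # indicative variable counts, as read from the estimates above
    return {
        "A-MHT":  V**m * E,       # O(V^m E): enumerates track paths
        "MI-GBT": m * V**2 * E,   # O(m V^2 E): edges per emitter sub-graph
        "GBT":    m * V**2,       # O(m V^2): kinematic-only edges
    }

sizes = ilp_sizes(m=10, V=50, E=3)   # hypothetical scenario sizes
```

For these values, the GBT and MI-GBT programs stay in the tens of thousands of variables, while the A-MHT path count is astronomically larger.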

63 Multi-INT MCMC (MI-MCMC)
- Efficient sampling in the global hypothesis space (Oh 9)
- Metropolis-Hastings sampling: moves follow a proposal distribution
- Moves that improve the solution likelihood are always accepted; otherwise an acceptance probability is applied
- The design includes merge, split, and swap moves
- [Figure: a single swap move, if executed here, will correct some measurement assignment errors in the MCMC solution]

64 Empirical Analysis on Small Scenarios Confirms Analytical Expectations
- GBT often violates identity constraints
- MHT (& A-MHT) have difficulty when emissions are temporally distant
- GBT and MI-GBT commit some errors when reasoning over kinematic-only data, due to the path-independence approximation
- MI-MCMC converges to high-likelihood solutions
[Figure: two realizations of the two-target scenario, annotated with GBT & MI-GBT errors, an MHT & A-MHT error, and MHT & A-MHT recovery from error]

65 MI-GBT, MCMC and MI-MCMC Exhibit Excellent Tracking Performance
Metrics
- Localization error: reflects target & track completeness (small is good here)
- Probability of correct association: reflects target & track purity (large is good here)
- The benefit of fast MI-MCMC convergence will be important for large-scale problems (not needed here)
Results on 5 Monte Carlo realizations for each scenario duration

66 More Challenging Problems Demonstrate the Need for MI-MCMC (Hot Start)
- One realization of MCMC & MI-MCMC likelihood convergence
- MI-MCMC performance is best
- Results on 5 Monte Carlo realizations for each scenario duration

67 Multi-INT Challenge Problem
- Simulated kinematic tracks for + targets
- High purity, high fragmentation
- Simulated identity emissions only on three high-value targets (HVTs); infrequent
- 5min scenario
[Figure: the HVTs in the multi-INT challenge problem, including HVT #3, with time-space confusion between two of the HVTs]

68 Experimental Framework
Pipeline: data simulation → multi-target tracking → evaluation
- Scenario: Poisson point process; kinematic tracker model; identity sensor model
- Algorithms: Graph-Based Tracker (GBT), Multiple-Hypothesis Tracker (MHT), Multi-INT Graph-Based Tracker (MI-GBT)
- Metrics evaluator

69 Results for HVT Scenario
- Non-ideal performance due to high target density, move-stop-move, limited identity detections
- MI-GBT outperforms MHT (on all metrics) and GBT (on HVT metrics)
- HVT purity lower than target purity, as confusion is significant by design
- MI-GBT HVT purity remains below unity, due to many closely-spaced kinematic tracks
- 8% improvement
[Table: target completeness, target purity, track completeness, track purity, HVT completeness, and HVT purity for MHT, GBT, and MI-GBT]

70 The Way Forward on Multi-INT Fusion
Exploit the complementary strengths of the MHT, GBT, and MCMC solution paradigms
- Kinematic sensor data → MHT → A-MHT → MI-GBT → MI-MCMC; intelligence sensor data → MHT
Consider more complex target phenomena that include emitters, persons, vehicles
- Emitters may be indistinguishable or may provide imprecise identification
- Multiple emitters may move between people
- Multiple people may move between vehicles
Include other types of data, including human-derived information
- Text-based reports
- Human intervention in fusion processing

71 Conclusions
- MHT is a powerful and mature approach to MTT
- MAP global hypothesis estimation is well posed, though it does not guarantee optimality with respect to metrics of interest
- There are surprising opportunities for performance gains in MHT: decoupling of data association and track extraction; hypothesis aggregation over indistinguishable or similar hypotheses; distributed tracking in both multi-sensor and single-sensor settings; asynchronous data association
- The multi-INT fusion problem goes beyond what advanced MHT alone can address: graph-based tracking must be exploited, and stochastic-sampling methods may provide valuable solution refinement


More information

2. What are the tradeoffs among different measures of error (e.g. probability of false alarm, probability of miss, etc.)?

2. What are the tradeoffs among different measures of error (e.g. probability of false alarm, probability of miss, etc.)? ECE 830 / CS 76 Spring 06 Instructors: R. Willett & R. Nowak Lecture 3: Likelihood ratio tests, Neyman-Pearson detectors, ROC curves, and sufficient statistics Executive summary In the last lecture we

More information

Lecture 5. G. Cowan Lectures on Statistical Data Analysis Lecture 5 page 1

Lecture 5. G. Cowan Lectures on Statistical Data Analysis Lecture 5 page 1 Lecture 5 1 Probability (90 min.) Definition, Bayes theorem, probability densities and their properties, catalogue of pdfs, Monte Carlo 2 Statistical tests (90 min.) general concepts, test statistics,

More information

A Note on Auxiliary Particle Filters

A Note on Auxiliary Particle Filters A Note on Auxiliary Particle Filters Adam M. Johansen a,, Arnaud Doucet b a Department of Mathematics, University of Bristol, UK b Departments of Statistics & Computer Science, University of British Columbia,

More information

Target Tracking and Classification using Collaborative Sensor Networks

Target Tracking and Classification using Collaborative Sensor Networks Target Tracking and Classification using Collaborative Sensor Networks Xiaodong Wang Department of Electrical Engineering Columbia University p.1/3 Talk Outline Background on distributed wireless sensor

More information

13 : Variational Inference: Loopy Belief Propagation and Mean Field

13 : Variational Inference: Loopy Belief Propagation and Mean Field 10-708: Probabilistic Graphical Models 10-708, Spring 2012 13 : Variational Inference: Loopy Belief Propagation and Mean Field Lecturer: Eric P. Xing Scribes: Peter Schulam and William Wang 1 Introduction

More information

Lecture : Probabilistic Machine Learning

Lecture : Probabilistic Machine Learning Lecture : Probabilistic Machine Learning Riashat Islam Reasoning and Learning Lab McGill University September 11, 2018 ML : Many Methods with Many Links Modelling Views of Machine Learning Machine Learning

More information