Target Tracking: Lecture 5
Multiple Target Tracking: Part II

Gustaf Hendeby (hendeby@isy.liu.se)
Div. Automatic Control, Dept. Electrical Engineering, Linköping University
December 10, 2014

Lecture Outline
1. Fundamental Components; Summary
2. Assignment Problem; Algorithm
3. Track-Based MHT: Implementation Details; Summary
4. User Interaction
5. Examples: SUPPORT; ADABTS
6. Summary: Concluding Remarks; Learn More...

Last Lecture
- Intro to multi-target tracking (MTT)
- Single hypothesis trackers (SHT): global nearest neighbor (GNN), joint probabilistic data association (JPDA)
- Auction algorithm
- Fundamental theorem of target tracking

Multiple Hypothesis Tracking (MHT)
- MHT: consider multiple association hypotheses over time
- Started with the conceptual MHT
- Integrated track initialization
- Two principal implementations: hypothesis based and track based

MHT: Basic Idea
- Described in Reid (1979)
- Intuitive, hypothesis-based, brute-force implementation
- Between consecutive time instants, the different association hypotheses {Θ_k^i}_{i=1}^{N_h} are kept in memory
- Idea: generate all possible hypotheses, and then prune to avoid combinatorial hypothesis growth
- Hypothesis limiting techniques: clustering, pruning low probability hypotheses, N-scan pruning, combining similar hypotheses, ...

Fundamental Components: Representing Hypotheses
- Each hypothesis Θ_k^i, i = 1, ..., N_h, is characterized by the number of targets (tracks) and their corresponding sufficient statistics
- (Figure: two hypotheses Θ_k^1 with P(Θ_k^1) and Θ_k^2 with P(Θ_k^2), containing different numbers of tracks and track estimates ŷ_{k|k})

Fundamental Components: Generating Hypotheses
- Form Θ_k^l = {θ_k, Θ_{k-1}^i}
- (Figure: each parent hypothesis Θ_{k-1}^i is expanded with the association events θ_k formed from measurements 1, 2, 3 and tracks T1, T2)
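To make the hypothesis bookkeeping concrete, here is a minimal sketch (not from the lecture) of a hypothesis record and of enumerating the association events θ_k that generate the children Θ_k^l = {θ_k, Θ_{k-1}^i} of one parent hypothesis. The names (Hypothesis, association_events) and the restriction that each track receives at most one measurement are assumptions of the sketch.

```python
from dataclasses import dataclass

@dataclass
class Hypothesis:
    tracks: dict        # track id -> sufficient statistics (e.g., mean, covariance)
    log_prob: float     # log P(Theta_k | y_0:k), unnormalized

def association_events(num_meas, track_ids):
    """Enumerate all association events theta_k for one scan.

    Each measurement is labelled 'fa' (false alarm), 'nt' (new target),
    or an existing track id; a track can be assigned at most one measurement.
    Returns a list of tuples, one label per measurement.
    """
    events = []

    def recurse(i, used, partial):
        if i == num_meas:
            events.append(tuple(partial))
            return
        for label in ('fa', 'nt'):
            recurse(i + 1, used, partial + [label])
        for tid in track_ids:
            if tid not in used:
                recurse(i + 1, used | {tid}, partial + [tid])

    recurse(0, frozenset(), [])
    return events

# Example: two measurements and one existing track T1 give 8 child hypotheses
print(association_events(2, ['T1']))
```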

Fundamental Components: Computing Hypothesis Probabilities
Let Θ_k^l = {θ_k, Θ_{k-1}^i}; then (using the fundamental theorem of target tracking)

  P(Θ_k^l | y_{0:k}) ∝ p(y_k | Θ_k^l, y_{0:k-1}) P(θ_k | Θ_{k-1}^i, y_{0:k-1}) P(Θ_{k-1}^i | y_{0:k-1})
                    ∝ β_fa^{m_k^fa} β_nt^{m_k^nt} [ ∏_{j ∈ J_D^i} P_D^j p_{k|k-1}^j(y_k^{θ_k(j)}) ] [ ∏_{j ∈ J_ND^i} (1 − P_D^j P_G^j) ] P(Θ_{k-1}^i | y_{0:k-1})

Note: the sets J_D^i (detected tracks) and J_ND^i (non-detected tracks) depend on Θ_{k-1}^i. The number of targets and the target estimates usually differ between hypotheses.

Fundamental Components: System Overview
New set of measurements {y_k^i}_{i=1}^{m_k} → hypothesis set {Θ_{k-1}^i}_{i=1}^{N_h} → generate new hypotheses {Θ_k^i}_{i=1}^{N_h} → calculate hypothesis probabilities {P(Θ_k^i)}_{i=1}^{N_h} → reduce the number of hypotheses Θ_k^i → user presentation logic

Reducing Complexity
- Clustering
- Pruning of low probability hypotheses
- N-scan pruning
- Merging similar hypotheses

Clustering
- Group targets without common measurements, and handle the groups separately
- (Figure: hypotheses Θ_k^1 and Θ_k^2 with track estimates ŷ_{k|k}; well-separated tracks are handled in separate clusters)
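The probability recursion above is most conveniently evaluated in log form. Below is a minimal sketch assuming Gaussian predicted measurement densities p_{k|k-1}^j and scalar detection/gate probabilities; the function name and argument layout are assumptions, not the lecture's code.

```python
import numpy as np
from scipy.stats import multivariate_normal

def log_hypothesis_prob(event, meas, tracks, log_prior,
                        beta_fa, beta_nt, P_D, P_G):
    """Unnormalized log P(Theta_k^l | y_0:k) for one association event.

    event     : one entry per measurement: 'fa', 'nt', or a track id
    meas      : list of measurement vectors y_k^i
    tracks    : dict track id -> (y_pred, S) predicted measurement mean/covariance
    log_prior : log P(Theta_{k-1}^i | y_0:k-1) of the parent hypothesis
    """
    m_fa = sum(a == 'fa' for a in event)
    m_nt = sum(a == 'nt' for a in event)
    logp = log_prior + m_fa * np.log(beta_fa) + m_nt * np.log(beta_nt)

    detected = {a for a in event if a not in ('fa', 'nt')}
    for i, a in enumerate(event):
        if a in detected:
            y_pred, S = tracks[a]
            # detected track: P_D * p_{k|k-1}(y_k^i)
            logp += np.log(P_D) + multivariate_normal.logpdf(meas[i], y_pred, S)
    for tid in tracks:
        if tid not in detected:
            # track without an associated measurement: (1 - P_D * P_G)
            logp += np.log(1 - P_D * P_G)
    return logp
```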

Clustering (cont.)
- Group targets without common measurements, and handle the groups separately
- (Figure: hypotheses Θ_k^1 and Θ_k^2 split into Cluster-1 and Cluster-2; measurement y_k falls inside Cluster-1)

Clustering: Cluster Management
When targets get closer:
- If a measurement falls inside the gates of tracks in different clusters, merge the clusters
- The hypotheses of each cluster are combined into super-hypotheses
When targets separate:
- If a group of tracks in a cluster does not share measurements with the other tracks in the cluster (for a period of time), split the cluster
- The hypotheses for the cluster are also divided into smaller hypotheses corresponding to the two smaller clusters

Clustering: Process Clusters Separately (1/2)
Hypothesis generation: form Θ_k^l = {θ_k, Θ_{k-1}^i} for each cluster as if the other clusters did not exist.
- (Figure: without clustering, each parent hypothesis is expanded against all measurements 1, 2, 3 and tracks T1, T2; with clustering, Cluster-1 and Cluster-2 are expanded against their own measurements only)
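One possible way to implement the "merge clusters bridged by a common measurement" rule is sketched below; the function name and the data layout (clusters as sets of track ids, one set of gated track ids per measurement) are assumptions.

```python
def merge_clusters(clusters, gated_tracks_per_meas):
    """Merge clusters whose tracks are gated by a common measurement.

    clusters              : list of disjoint sets of track ids
    gated_tracks_per_meas : iterable of sets; each set holds the track ids
                            whose gates contain that measurement
    Returns a new list of (possibly merged) clusters.
    """
    clusters = [set(c) for c in clusters]
    for gated in gated_tracks_per_meas:
        hit = [c for c in clusters if c & gated]   # clusters touched by this measurement
        if len(hit) > 1:                           # measurement bridges clusters: merge them
            merged = set().union(*hit)
            clusters = [c for c in clusters if c not in hit] + [merged]
    return clusters

# Example: a measurement gates T2 and T3, which live in different clusters,
# so those two clusters are merged while {'T4'} stays separate.
print(merge_clusters([{'T1', 'T2'}, {'T3'}, {'T4'}], [{'T2', 'T3'}]))
```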

Clustering: Process Clusters Separately (2/2)
Hypothesis reduction, for each cluster:
- Delete hypotheses with probability below a threshold γ_p (e.g., γ_p = 0.001).
  Deletion condition: P(Θ_k^i) < γ_p
- Keep only the most probable hypotheses with a total probability mass above a threshold γ_c (e.g., γ_c = 0.99).
  Keep the first i hypotheses in the ordering l_1, l_2, ... such that
    ∑_{κ=1}^{i} P(Θ_k^{l_κ}) > γ_c,   where the ordering satisfies P(Θ_k^{l_κ}) ≥ P(Θ_k^{l_{κ+1}})

N-Scan Pruning
- (Figure: hypothesis tree Θ_t^i over time steps 0-4; ambiguities are resolved after N steps and the remaining branches are pruned)
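The two reduction rules (drop hypotheses below γ_p, then keep the most probable ones until the kept mass exceeds γ_c) translate directly into a few lines of code. This is an illustrative sketch; the list-of-(hypothesis, probability) representation is an assumption.

```python
def prune_hypotheses(hyps, gamma_p=1e-3, gamma_c=0.99):
    """Prune a cluster's hypothesis list.

    hyps : list of (hypothesis, probability) pairs, probabilities summing to 1
    Keeps hypotheses with P >= gamma_p, ordered by decreasing probability,
    and cuts the list once the kept probability mass exceeds gamma_c.
    The remaining probabilities are renormalized.
    """
    hyps = sorted((h for h in hyps if h[1] >= gamma_p),
                  key=lambda h: h[1], reverse=True)
    kept, mass = [], 0.0
    for h, p in hyps:
        kept.append((h, p))
        mass += p
        if mass > gamma_c:
            break
    total = sum(p for _, p in kept)
    return [(h, p / total) for h, p in kept]

# 'E' falls below gamma_p; 'D' is cut once the kept mass exceeds gamma_c
print(prune_hypotheses([('A', 0.7), ('B', 0.25), ('C', 0.045),
                        ('D', 0.004), ('E', 0.0005)]))
```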

N-Scan Pruning (cont.)
- (Figure: the hypothesis tree over time steps 0-4 at successive stages; branches that are not consistent with the hypothesis chosen N steps back are pruned)

Hypothesis Merging
Reid's original paper suggests checking for hypothesis pairs with:
- the same number of targets (tracks)
- similar track estimates
If these conditions are satisfied:
- merge the hypotheses
- assign the new hypothesis the sum of the combined hypotheses' probabilities
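Reid-style merging can be sketched as below, under the assumptions that each hypothesis's track estimates are stacked in an array, that corresponding tracks appear in the same order in both hypotheses, and that similarity is measured by a fixed tolerance; none of these choices come from the slides.

```python
import numpy as np

def merge_similar_hypotheses(hyps, tol=1.0):
    """Merge hypotheses with the same number of tracks and similar estimates.

    hyps : list of (estimates, probability); estimates is an (n_tracks, dim) array
    Two hypotheses merge if they have the same shape and the maximum difference
    between corresponding estimates is below tol; the merged hypothesis keeps the
    more probable member's estimates and the summed probability.
    """
    merged = []
    for est, p in sorted(hyps, key=lambda h: h[1], reverse=True):
        for i, (est_m, p_m) in enumerate(merged):
            if est.shape == est_m.shape and np.max(np.abs(est - est_m)) < tol:
                merged[i] = (est_m, p_m + p)   # keep representative, sum probabilities
                break
        else:
            merged.append((est, p))
    return merged

# Two nearly identical two-target hypotheses collapse into one
h1 = (np.array([[0.0, 0.0], [10.0, 5.0]]), 0.4)
h2 = (np.array([[0.1, -0.1], [10.2, 5.1]]), 0.35)
h3 = (np.array([[0.0, 0.0]]), 0.25)            # different number of tracks: kept separate
print(merge_similar_hypotheses([h1, h2, h3]))
```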

Hypothesis-Based MHT: Summary
- Attractive method since each hypothesis is an alternative representation of reality: easily interpreted
- Drawback: generating all possible hypotheses only to discard (most of) them is inefficient
- Some hypotheses contain the same track; hence there are fewer unique tracks than hypotheses
- Track-based methods were popular until an efficient way to implement a hypothesis-based MHT was given by Cox and Hingorani (1996)

Efficient Hypothesis-Based MHT
- Proposed by Cox and Hingorani (1996)
- Generate only the best hypotheses; skip hypotheses that would be deleted anyway
- Use the N-best solutions to the assignment problem (introduced last lecture with GNN): Murty's method, 1968
- Find the N_h best hypotheses, generating as few unnecessary hypotheses as possible
- Hypothesis reduction techniques still apply

Assignment Problem: Repetition (1/2)
Let Θ_k^l = {θ_k, Θ_{k-1}^i}. Then

  P(Θ_k^l | y_{0:k}) ∝ p(y_k | Θ_k^l, y_{0:k-1}) P(θ_k | Θ_{k-1}^i, y_{0:k-1}) P(Θ_{k-1}^i | y_{0:k-1})
                    ∝ β_fa^{m_k^fa} β_nt^{m_k^nt} [ ∏_{j ∈ J_D^i} P_D^j p_{k|k-1}^j(y_k^{θ_k(j)}) ] [ ∏_{j ∈ J_ND^i} (1 − P_D^j P_G^j) ] P(Θ_{k-1}^i | y_{0:k-1})

Divide and multiply the right-hand side by

  C_i = ∏_{j=1}^{n_t^i} (1 − P_D^j P_G^j) = ∏_{j ∈ J_D^i} (1 − P_D^j P_G^j) ∏_{j ∈ J_ND^i} (1 − P_D^j P_G^j),

which gives

  P(Θ_k^l | y_{0:k}) ∝ β_fa^{m_k^fa} β_nt^{m_k^nt} [ ∏_{j ∈ J_D^i} (P_D^j p_{k|k-1}^j(y_k^{θ_k(j)})) / (1 − P_D^j P_G^j) ] C_i P(Θ_{k-1}^i | y_{0:k-1})

Assignment Problem: Repetition (2/2)
Logarithmize and form the assignment matrices, with entries

  l_ij = log [ P_D^j p_{k|k-1}^j(y_k^i) / (1 − P_D^j P_G^j) ]

One assignment matrix A_i is formed per parent hypothesis Θ_{k-1}^i: one row per measurement, one column per existing track (entry l_ij), plus one false-alarm column (entry log β_fa) and one new-target column (entry log β_nt) per measurement.
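Constructing the assignment matrix for one parent hypothesis follows directly from the l_ij expression above. The sketch below assumes Gaussian predicted measurement densities and uses the per-measurement false-alarm and new-target columns described on the slide; the function name and argument choices are assumptions.

```python
import numpy as np
from scipy.stats import multivariate_normal

def assignment_matrix(meas, tracks, beta_fa, beta_nt, P_D, P_G):
    """Build the m_k x (n_t + 2*m_k) reward matrix A_i for one parent hypothesis.

    Columns: one per existing track (entries l_ij), then one false-alarm and
    one new-target column per measurement; impossible entries are -inf.
    meas   : list of measurement vectors
    tracks : list of (y_pred, S) predicted measurement mean/covariance per track
    """
    m, n = len(meas), len(tracks)
    A = np.full((m, n + 2 * m), -np.inf)
    for i, y in enumerate(meas):
        for j, (y_pred, S) in enumerate(tracks):
            # l_ij = log( P_D * p^j_{k|k-1}(y_k^i) / (1 - P_D * P_G) )
            A[i, j] = (np.log(P_D)
                       + multivariate_normal.logpdf(y, y_pred, S)
                       - np.log(1 - P_D * P_G))
        A[i, n + i] = np.log(beta_fa)         # measurement i is a false alarm
        A[i, n + m + i] = np.log(beta_nt)     # measurement i starts a new target
    return A
```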

Assignment Problem: N-Best Solutions
- Given an assignment matrix A_i, the auction algorithm (or similar) finds the best assignment in polynomial time
- Generalizations of this problem find the N-best solutions:
  formulate the task as several best-assignment problems and solve these independently using the auction algorithm: Murty's method

Assignment Problem: Murty's Method
Given the assignment matrix A_i:
1. Find the best solution using the auction algorithm.
2. 2nd best solution:
   - Express the 2nd best solution as the solution of a number of best-solution assignment problems.
   - Find the solution to each of these problems with the auction algorithm.
   - The solution giving the maximum reward (minimum cost) is the second best solution.
3. Repeat the procedure for more solutions.

Algorithm: Outline
Aim: given the hypotheses {Θ_{k-1}^i}_{i=1}^{N_h} and the measurements {y_k^i}_{i=1}^{m_k}, find the N_h best hypotheses {Θ_k^i}_{i=1}^{N_h} without generating all hypotheses.

Reminder of the hypothesis probability:

  P(Θ_k^l | y_{0:k}) ∝ β_fa^{m_k^fa} β_nt^{m_k^nt} [ ∏_{j ∈ J_D^i} (P_D^j p_{k|k-1}^j(y_k^{θ_k(j)})) / (1 − P_D^j P_G^j) ] × C_i P(Θ_{k-1}^i | y_{0:k-1})

The bracketed factor is assignment dependent; C_i P(Θ_{k-1}^i | y_{0:k-1}) is the legacy term that depends only on the parent hypothesis.

Find {Θ_k^l}_{l=1}^{N_h} that maximizes P(Θ_k^l | y_{0:k}). Two steps:
- Obtain the assignment-dependent part from the assignment problem (Murty's method)
- Multiply the obtained quantity by the terms that depend on the previous hypothesis

Algorithm: Generating the N_h Best Hypotheses
Input: {Θ_{k-1}^i}_{i=1}^{N_h}, {P(Θ_{k-1}^i | y_{0:k-1})}_{i=1}^{N_h}, and {y_k^i}_{i=1}^{m_k}
Output: HYP-LIST (N_h hypotheses, decreasing probability), PROB-LIST (matching probabilities)
1. Initialize all elements in HYP-LIST and PROB-LIST
2. Find the assignment matrices {A_i}_{i=1}^{N_h} for {Θ_{k-1}^i}_{i=1}^{N_h}
3. For j = 1, ..., N_h:
   For i = 1, ..., N_h:
   1. For the assignment matrix A_i, find the jth best solution Θ_k^{ji}
   2. Compute the probability P(Θ_k^{ji})
   3. Update HYP-LIST and PROB-LIST: if the new hypothesis enters the list, discard the least probable entry
   4. If P(Θ_k^{ji}) is lower than the lowest probability in PROB-LIST, discard Θ_k^{ji} and never use A_i again in subsequent recursions
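A compact, unoptimized sketch of Murty's partitioning idea is given below. It uses SciPy's linear_sum_assignment in place of the auction algorithm referred to in the lecture, and represents each subproblem by a set of forbidden (row, column) pairs plus a set of forced row-to-column assignments; all names and the tiny example matrix are assumptions.

```python
import heapq
from itertools import count

import numpy as np
from scipy.optimize import linear_sum_assignment

def _best_assignment(reward, forbidden, forced):
    """Best (max-reward) assignment of rows to distinct columns under constraints.

    forbidden : set of (row, col) pairs that may not be used
    forced    : dict row -> col that must be used
    Returns (total_reward, cols) or None if no finite-reward assignment exists.
    """
    R = reward.astype(float)
    for r, c in forbidden:
        R[r, c] = -np.inf
    for r, c in forced.items():
        keep = R[r, c]
        R[r, :] = -np.inf
        R[r, c] = keep
    cost = np.nan_to_num(-R, posinf=1e12)      # solver minimizes; -inf -> big penalty
    rows, cols = linear_sum_assignment(cost)
    total = R[rows, cols].sum()
    if not np.isfinite(total):
        return None
    return total, tuple(int(c) for c in cols)

def murty(reward, N):
    """Return up to the N best assignments of an m x n reward matrix (Murty's method)."""
    first = _best_assignment(reward, set(), {})
    if first is None:
        return []
    tie = count()
    heap = [(-first[0], next(tie), first[1], set(), {})]
    solutions = []
    while heap and len(solutions) < N:
        neg, _, cols, forbidden, forced = heapq.heappop(heap)
        solutions.append((-neg, cols))
        # Partition the remaining solution space around the popped assignment
        fixed = dict(forced)
        for r, c in enumerate(cols):
            if r in forced:
                continue
            child = _best_assignment(reward, forbidden | {(r, c)}, fixed)
            if child is not None:
                heapq.heappush(heap, (-child[0], next(tie), child[1],
                                      forbidden | {(r, c)}, dict(fixed)))
            fixed[r] = c                       # force this pair in later children
    return solutions

# Tiny example: 2 measurements, 3 columns (one track, one fa, one nt column)
A = np.array([[4.0, 1.0, 0.0],
              [3.0, 0.0, 1.0]])
print(murty(A, 3))    # best, 2nd best, and 3rd best assignments with their rewards
```

Feeding each assignment matrix A_i to murty() and combining the returned rewards with the legacy term C_i P(Θ_{k-1}^i | y_{0:k-1}) gives the candidate probabilities used to maintain HYP-LIST and PROB-LIST in the algorithm above.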

Track-Based MHT: Motivation
- Hypotheses usually contain identical tracks
- There are significantly fewer tracks than hypotheses
- Idea: store tracks, T^i, not hypotheses, Θ^i, over time
- (Figure: hypotheses Θ_k^1 and Θ_k^2 sharing track estimates ŷ_{k|k})

Track-Based MHT: Principle
- Keep the tracks at time k, {T_k^i}_{i=1}^{N_t}, and their track scores Sc(T_k^i)
- Form a track tree, not a hypothesis tree
- Delete tracks with low scores
- (Figure: track list; old tracks T_{k-1}^1, T_{k-1}^2 spawn new tracks T_k^1, ..., T_k^8 when associated with measurements)

Track-Based MHT: Hypothesis Generation
- A hypothesis is a collection of compatible tracks, e.g., Θ_k^1 = {T_k^1, T_k^5, T_k^8}, Θ_k^2 = {T_k^2, T_k^3, T_k^7, T_k^8}
- Generating hypotheses is needed for reducing the number of tracks further and for user presentation
- Use only tracks with high scores
- Keep track compatibility information (e.g., in a binary matrix)
- (Table: binary compatibility matrix over the tracks T_k^1, ..., T_k^8)

Track-Based MHT: Track Scores and Probabilities
- Track probability: P(T_k^i) = ∑_{j : T_k^i ∈ Θ_k^j} P(Θ_k^j)
- Hypothesis score: Sc(Θ_k^i) = ∑_{j : T_k^j ∈ Θ_k^i} Sc(T_k^j)
- Hypothesis probability:

    P(Θ_k^i) = exp(Sc(Θ_k^i)) / (1 + ∑_{j=1}^{N_h} exp(Sc(Θ_k^j)))
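The score-to-probability relations on this slide are easy to mirror in code. The sketch below assumes hypotheses are given as sets of track ids and scores as a dictionary; these representations and the function name are assumptions.

```python
import numpy as np

def hypothesis_probabilities(hypotheses, track_scores):
    """Track-oriented MHT bookkeeping following the slide's formulas.

    hypotheses   : list of sets of track ids (each a collection of compatible tracks)
    track_scores : dict track id -> score Sc(T)
    Returns (array of hypothesis probabilities, dict of track probabilities).
    """
    # Hypothesis score: sum of the scores of its tracks
    scores = np.array([sum(track_scores[t] for t in hyp) for hyp in hypotheses])
    # P(Theta_i) = exp(Sc(Theta_i)) / (1 + sum_j exp(Sc(Theta_j)))
    expس = np.exp(scores)
    hyp_prob = expس / (1.0 + expس.sum())
    # Track probability: sum over the hypotheses that contain the track
    track_prob = {t: sum(p for hyp, p in zip(hypotheses, hyp_prob) if t in hyp)
                  for t in track_scores}
    return hyp_prob, track_prob

hyps = [{'T1', 'T5', 'T8'}, {'T2', 'T3', 'T7', 'T8'}]
scores = {'T1': 2.0, 'T2': 1.5, 'T3': 0.5, 'T5': 1.0, 'T7': 0.7, 'T8': 2.2}
print(hypothesis_probabilities(hyps, scores))
```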

Track-Based MHT: Complexity Reducing Techniques
- Cluster incompatible tracks for efficient hypothesis generation
- Apply N-scan pruning to the track trees
- Merge tracks with common recent measurement history

Track-Based MHT: System Components
New set of measurements {y_k^i}_{i=1}^{m_k} → track set {T_{k-1}^i}_{i=1}^{N_t} → generate new tracks {T_k^i}_{i=1}^{N_t} → discard low-score tracks → generate hypotheses {Θ_k^i}_{i=1}^{N_h} → calculate track probabilities → discard low-probability tracks → user presentation logic

User Interaction: User Presentation Logic
- Maximum probability hypothesis: the simplest alternative; possibly jumpy, since the maximum probability hypothesis can change erratically
- Show track clusters: (weighted) mean, covariance, and expected number of targets
- Keep a separate track list: update it at each time step with a selection of tracks from different hypotheses
- Consult Blackman and Popoli (1999) for details

Example: Harbor Protection (SUPPORT)
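As one possible illustration of the "show track clusters" presentation option, the sketch below moment-matches a cluster's target estimates across hypotheses into an expected target count, a weighted mean, and a covariance. The particular weighting (each hypothesis's probability spread evenly over its targets) is an assumption, not the lecture's prescription.

```python
import numpy as np

def cluster_display(hyp_probs, hyp_estimates):
    """Summarize one cluster over its hypotheses for display.

    hyp_probs     : array of hypothesis probabilities (sums to 1)
    hyp_estimates : list of (n_targets_i, dim) arrays with the target estimates
                    of each hypothesis
    Returns the expected number of targets plus the probability-weighted mean
    and covariance of all target estimates (moment matching).
    """
    n_expected = sum(p * len(est) for p, est in zip(hyp_probs, hyp_estimates))
    pts = np.vstack(hyp_estimates)
    w = np.concatenate([np.full(len(est), p / len(est))
                        for p, est in zip(hyp_probs, hyp_estimates)])
    w = w / w.sum()
    mean = w @ pts
    diff = pts - mean
    cov = (w[:, None] * diff).T @ diff
    return n_expected, mean, cov

probs = np.array([0.6, 0.4])
ests = [np.array([[0.0, 0.0], [10.0, 5.0]]), np.array([[0.2, 0.1]])]
print(cluster_display(probs, ests))   # expected 1.6 targets plus mean/covariance
```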

Example: Busy Indoor Environments (ADABTS)

Concluding Remarks: Which Multi-TT Method to Use?
(rows: available computational resources; columns: SNR)

  Computation \ SNR   Low              Medium         High
  Low                 Group TT / PHD   GNN            GNN
  Medium              MHT              GNN or JPDA    GNN
  High                TrBD / MHT       MHT            Any

- GNN and JPDA perform very badly in low SNR.
- When using GNN, one generally has to enlarge the overconfident covariances to account for the neglected data association uncertainty.
- JPDA suffers from track coalescence and should not be used with closely spaced targets; see the coalescence-avoiding versions.
- MHT requires a significantly higher computational load, but it is said to work reasonably under 10-100 times worse SNR.

Learning More

Samuel S. Blackman. Multiple hypothesis tracking for multiple target tracking. IEEE Transactions on Aerospace and Electronic Systems, 19(1):5-18, January 2004.

Samuel S. Blackman and Robert Popoli. Design and Analysis of Modern Tracking Systems. Artech House radar library. Artech House, Inc., 1999. ISBN 1-58053-006-0.

Ingemar J. Cox and Sunita L. Hingorani. An efficient implementation of Reid's multiple hypothesis tracking algorithm and its evaluation for the purpose of visual tracking. IEEE Transactions on Pattern Analysis and Machine Intelligence, 18(2):138-150, February 1996.

Ingemar J. Cox and Matthew L. Miller. On finding ranked assignments with application to multitarget tracking and motion correspondence. IEEE Transactions on Aerospace and Electronic Systems, 31(1):486-489, January 1995.

Ingemar J. Cox, Matthew L. Miller, Roy Danchick, and G. E. Newnam. A comparison of two algorithms for determining ranked assignments with application to multitarget tracking and motion correspondence. IEEE Transactions on Aerospace and Electronic Systems, 33(1):295-301, January 1997.

Roy Danchick and G. E. Newnam. Reformulating Reid's MHT method with generalised Murty K-best ranked linear assignment algorithm. IEE Proceedings - Radar, Sonar and Navigation, 153(1):13-22, February 2006.

Matthew L. Miller, Harold S. Stone, and Ingemar J. Cox. Optimizing Murty's ranked assignment method. IEEE Transactions on Aerospace and Electronic Systems, 33(3):851-862, July 1997.

Donald B. Reid. An algorithm for tracking multiple targets. IEEE Transactions on Automatic Control, 24(6):843-854, December 1979.