REGLERTEKNIK - AUTOMATIC CONTROL

Target Tracking: Lecture 5
Multiple Target Tracking: Part II

Gustaf Hendeby
hendeby@isy.liu.se
Div. Automatic Control, Dept. Electrical Engineering
Linköping University
December 10, 2014

Lecture Outline
1. Hypothesis-Based MHT: Fundamental Components; Summary
2. Assignment Problem; Algorithm
3. Track-Based MHT: Implementation Details; Summary
4. User Interaction
5. Examples: SUPPORT; ADABTS
6. Summary: Concluding Remarks; Learn More...

G. Hendeby (hendeby@isy.liu.se)  Target Tracking: Lecture 5 (MHT)  December 10, 2014

Last Lecture
- Intro to multi-target tracking (MTT)
- Single hypothesis tracker (SHT):
  - Global nearest neighbor (GNN)
  - Joint Probabilistic Data Association (JPDA)
  - Auction algorithm
- Fundamental theorem of target tracking

Multiple Hypothesis Tracking (MHT)
- MHT: consider multiple association hypotheses over time
- Started with the conceptual MHT
- Integrated track initialization
- Two principal implementations:
  - hypothesis based
  - track based
MHT: Basic Idea
- Described in Reid (1979)
- Intuitive, hypothesis-based, brute-force implementation
- Between consecutive time instants, the different association hypotheses, {Θ_k^i}_{i=1}^{N_h}, are kept in memory
- Idea: generate all possible hypotheses, and then prune to avoid combinatorial hypothesis growth
- Hypothesis limiting techniques:
  - clustering
  - pruning low-probability hypotheses
  - N-scan pruning
  - combining similar hypotheses
  - ...

Fundamental Components: Representing Hypotheses
Each hypothesis, Θ_k^i, is characterized by the number of targets (tracks) and their corresponding sufficient statistics.
[Figure: two hypotheses, Θ_k^1 with probability P(Θ_k^1) and Θ_k^2 with probability P(Θ_k^2), containing different numbers of tracks with estimates ŷ_{k|k}]

Fundamental Components: Generating Hypotheses
Given the previous hypotheses and the new measurements, form the new hypotheses
  Θ_k^l = {θ_k, Θ_{k-1}^i}
where θ_k is an association event for the current measurement set.
[Figure: hypothesis trees for Θ_{k-1}^1 and Θ_{k-1}^2, branching over associations of tracks T1 and T2 with measurements 1-3]
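As a concrete illustration of the brute-force idea, the sketch below enumerates every association event for one scan: each measurement is assigned to an existing track (at most one measurement per track), declared a false alarm, or used to start a new target. The encoding (-1 for false alarm, -2 for new target) is my own convention, not from the lecture.

```python
def enumerate_association_events(n_tracks, n_meas):
    """Enumerate all association events for one scan.

    Each measurement is assigned to an existing track index (at most one
    measurement per track), a false alarm (-1), or a new target (-2).
    Returns a list of tuples, one entry per measurement.
    """
    events = [()]
    for _ in range(n_meas):
        new_events = []
        for ev in events:
            # False alarm and new target are always possible.
            new_events.append(ev + (-1,))
            new_events.append(ev + (-2,))
            # Any track not already used in this event.
            for t in range(n_tracks):
                if t not in ev:
                    new_events.append(ev + (t,))
        events = new_events
    return events
```

Even for two tracks and two measurements this already yields 14 events, which is why the pruning techniques listed above are essential.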
Fundamental Components: Computing Hypothesis Probabilities
Let Θ_k^l = {θ_k, Θ_{k-1}^i}. Then (using the fundamental theorem of target tracking)

  P(Θ_k^l | y_{0:k}) ∝ p(y_k | Θ_k^l, y_{0:k-1}) P(θ_k | Θ_{k-1}^i, y_{0:k-1}) P(Θ_{k-1}^i | y_{0:k-1})
    ∝ β_fa^{m_fa} β_nt^{m_nt} [ ∏_{j ∈ J_D^i} P_D^j p_{k|k-1}^j(y_k^{θ_k(j)}) ] [ ∏_{j ∈ J_ND^i} (1 - P_D^j P_G^j) ] P(Θ_{k-1}^i | y_{0:k-1})

Note: the sets J_D^i (detected tracks) and J_ND^i (non-detected tracks) depend on Θ_{k-1}^i! The number of targets and the target estimates usually differ between hypotheses.

Fundamental Components: System Overview
New set of measurements {y_k^i}_{i=1}^{m_k}
→ Generate new hypotheses {Θ_k^i}_{i=1}^{N_h}
→ Calculate hypothesis probabilities {P(Θ_k^i)}_{i=1}^{N_h}
→ Reduce the number of hypotheses Θ_k^i
→ User presentation logic
(the reduced set of hypotheses is fed back to the generation step at the next time instant)

Reducing Complexity
- Clustering
- Pruning of low-probability hypotheses
- N-scan pruning
- Merging similar hypotheses

Clustering
Group targets without common measurements, and handle the groups separately.
[Figure: hypotheses Θ_k^1 and Θ_k^2 with track estimates ŷ_{k|k}]
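The probability recursion above can be sketched in the log domain. The function below is a minimal illustration, not the lecture's implementation: it reuses the -1/-2 encoding for false alarms and new targets, assumes a common detection probability p_d and gate probability p_g for all tracks, and takes precomputed measurement log-likelihoods as input.

```python
import math

def log_hypothesis_weight(assoc, meas_loglik, log_beta_fa, log_beta_nt,
                          p_d, p_g, n_tracks, log_prior):
    """Unnormalised log-probability of a child hypothesis.

    assoc[i]          = track index for measurement i, or -1 (false alarm)
                        / -2 (new target)
    meas_loglik[i][t] = log p_{k|k-1}^t(y_k^i), the predicted measurement
                        log-likelihood of measurement i under track t
    log_prior         = log P(parent hypothesis)
    """
    lw = log_prior
    detected = set()
    for i, a in enumerate(assoc):
        if a == -1:                       # false alarm: beta_fa factor
            lw += log_beta_fa
        elif a == -2:                     # new target: beta_nt factor
            lw += log_beta_nt
        else:                             # detection: P_D * likelihood
            detected.add(a)
            lw += math.log(p_d) + meas_loglik[i][a]
    # Tracks with no associated measurement contribute (1 - P_D * P_G).
    for t in range(n_tracks):
        if t not in detected:
            lw += math.log(1.0 - p_d * p_g)
    return lw
```

Normalising the resulting weights over all child hypotheses gives the posterior hypothesis probabilities.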
Clustering: Cluster Management
Group targets without common measurements, and handle the groups separately.
[Figure: hypotheses Θ_k^1 and Θ_k^2 partitioned into Cluster-1 and Cluster-2, with track estimates ŷ_{k|k} and a measurement y_k]

When targets get closer:
- If a measurement falls inside the gates of tracks in different clusters, merge the clusters.
- The hypotheses of the merged clusters are combined into super-hypotheses.
When targets separate:
- If a group of tracks in a cluster does not share measurements with the other tracks in the cluster (for a period of time), split the cluster.
- The hypotheses for the cluster are also divided into smaller hypotheses corresponding to the two smaller clusters.

Clustering: Process Clusters Separately (1/2) - hypothesis generation
Form Θ_k^l = {θ_k, Θ_{k-1}^i} for each cluster as if the other clusters did not exist.
[Figure: without clustering, the hypothesis trees for Θ_{k-1}^1 and Θ_{k-1}^2 branch over all measurements 1-3 and tracks T1, T2; with clustering, Cluster-1 and Cluster-2 each branch only over their own measurements]
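Grouping tracks that share gated measurements (directly or transitively) is a connected-components problem. A minimal union-find sketch, with a hypothetical boolean gating matrix as input, could look like this:

```python
def cluster_tracks(gate_matrix):
    """Cluster tracks that share measurements, directly or transitively.

    gate_matrix[t][m] is True if measurement m falls inside track t's gate.
    Returns a list of clusters, each a list of track indices.
    """
    n_tracks = len(gate_matrix)
    parent = list(range(n_tracks))

    def find(x):                      # union-find with path halving
        while parent[x] != x:
            parent[x] = parent[parent[x]]
            x = parent[x]
        return x

    n_meas = len(gate_matrix[0]) if n_tracks else 0
    for m in range(n_meas):
        gated = [t for t in range(n_tracks) if gate_matrix[t][m]]
        # All tracks gating the same measurement belong to one cluster.
        for t in gated[1:]:
            parent[find(t)] = find(gated[0])

    clusters = {}
    for t in range(n_tracks):
        clusters.setdefault(find(t), []).append(t)
    return list(clusters.values())
```

Re-running this every scan also realises the merge/split behaviour described above: clusters merge when a measurement gates tracks from both, and split once no measurement connects them.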
Clustering: Process Clusters Separately (2/2) - hypothesis reduction
For each cluster:
- Delete hypotheses with probability below a threshold, γ_p (e.g., γ_p = 0.001):
    Deletion condition: P(Θ_k^i) < γ_p
- Keep only the most probable hypotheses, with a total probability mass above a threshold, γ_c (e.g., γ_c = 0.99):
    Keep the smallest i such that Σ_{κ=1}^{i} P(Θ_k^{l_κ}) > γ_c,
    where l_κ is a sequence ordering the hypotheses such that P(Θ_k^{l_κ}) ≥ P(Θ_k^{l_{κ+1}}).

N-Scan Pruning
Assume that association ambiguities are resolved after N steps: once the best hypothesis at time k is identified, prune all branches of the hypothesis tree that deviate from it before time k - N.
[Figure: hypothesis tree over time steps 0-4 (hypotheses Θ_0^1, ..., Θ_2^7), successively pruned back to the branch containing the best hypothesis]
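The two reduction rules above can be sketched as follows; the renormalisation at the end is an illustrative choice, not something the slide prescribes.

```python
def prune_hypotheses(hyps, gamma_p=1e-3, gamma_c=0.99):
    """Prune a list of (hypothesis_id, probability) pairs.

    First drop hypotheses with probability below gamma_p, then keep the
    most probable hypotheses until their cumulative mass exceeds gamma_c,
    and finally renormalise the survivors.
    """
    kept = [(h, p) for h, p in hyps if p >= gamma_p]
    kept.sort(key=lambda hp: hp[1], reverse=True)

    out, mass = [], 0.0
    for h, p in kept:
        out.append((h, p))
        mass += p
        if mass > gamma_c:
            break

    total = sum(p for _, p in out)
    return [(h, p / total) for h, p in out]
```

With clustering, this is applied per cluster, so the thresholds act on much smaller hypothesis sets.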
[Figure: further frames of the N-scan pruning example, time steps 0-4; at each step the hypothesis tree (Θ_3^1, ..., Θ_3^6) is cut back to the subtree containing the best hypothesis]

Hypothesis Merging
Reid's original paper suggests checking for hypothesis pairs with:
- the same number of targets (tracks)
- similar track estimates
If these conditions are satisfied:
- merge the hypotheses
- assign the new hypothesis the sum of the probabilities of the combined hypotheses
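One common variant of N-scan pruning keeps only the subtree rooted at the ancestor, N steps back, of the currently most probable hypothesis. The sketch below represents each hypothesis by its association history (a tuple of per-scan events, oldest first); this encoding is my own.

```python
def n_scan_prune(hyps, n_scan):
    """N-scan pruning on (history, probability) pairs.

    history is a tuple of association events, oldest first. Find the
    ancestor node n_scan steps back of the best current hypothesis and
    drop every hypothesis not descending from that node.
    """
    best_hist, _ = max(hyps, key=lambda hp: hp[1])
    cut = len(best_hist) - n_scan
    if cut <= 0:
        return hyps                    # tree is not yet deep enough
    root = best_hist[:cut]
    return [(h, p) for h, p in hyps if h[:cut] == root]
```

Other variants pick the surviving root by summed subtree probability rather than by the single best leaf; the structure of the cut is the same.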
Hypothesis-Based MHT: Summary
- Attractive method, since each hypothesis is an alternative representation of reality and is easily interpreted.
- Drawback: generating all possible hypotheses only to discard (most of) them is inefficient.
- Some hypotheses contain the same track; hence there are fewer unique tracks than hypotheses.
- Track-based methods were popular until an efficient way to implement a hypothesis-based MHT was given by Cox and Hingorani (1996).

Efficient Hypothesis Generation
Proposed by Cox and Hingorani (1996):
- Generate only the best hypotheses; skip hypotheses that would be deleted anyway.
- Use the N-best solutions to the assignment problem (introduced last lecture with GNN): Murty's method, 1968.
- Find the N_h best hypotheses while generating as few unnecessary hypotheses as possible.
- Hypothesis reduction techniques still apply.

Assignment Problem: Repetition (1/2)
Let Θ_k^l = {θ_k, Θ_{k-1}^i}. Then

  P(Θ_k^l | y_{0:k}) ∝ β_fa^{m_fa} β_nt^{m_nt} [ ∏_{j ∈ J_D^i} P_D^j p_{k|k-1}^j(y_k^{θ_k(j)}) ] [ ∏_{j ∈ J_ND^i} (1 - P_D^j P_G^j) ] P(Θ_{k-1}^i | y_{0:k-1})

Divide and multiply the right-hand side by ∏_{j=1}^{n_t^i} (1 - P_D^j P_G^j) = [ ∏_{j ∈ J_D^i} (1 - P_D^j P_G^j) ][ ∏_{j ∈ J_ND^i} (1 - P_D^j P_G^j) ]:

  P(Θ_k^l | y_{0:k}) ∝ β_fa^{m_fa} β_nt^{m_nt} [ ∏_{j ∈ J_D^i} (P_D^j p_{k|k-1}^j(y_k^{θ_k(j)})) / (1 - P_D^j P_G^j) ] C_i P(Θ_{k-1}^i | y_{0:k-1})

where C_i = ∏_{j=1}^{n_t^i} (1 - P_D^j P_G^j) is independent of the association θ_k.

Assignment Problem: Repetition (2/2)
Logarithmize and form the assignment matrices, with rewards

  l_ij = log [ P_D^j p_{k|k-1}^j(y_k^i) / (1 - P_D^j P_G^j) ]

For a parent hypothesis with two tracks (T1, T2) and three measurements, the assignment matrix is

  A_1:   T1    T2    fa1       fa2       fa3       nt1       nt2       nt3
  y1     l_11  l_12  log β_fa  .         .         log β_nt  .         .
  y2     l_21  l_22  .         log β_fa  .         .         log β_nt  .
  y3     l_31  l_32  .         .         log β_fa  .         .         log β_nt

For a parent hypothesis with a single track T1, A_2 has the same structure with one track column. (Empty cells are impossible assignments, i.e., reward -∞.)
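The assignment matrix above can be assembled directly from the rewards l_ij. The sketch below follows that layout (track columns, then one false-alarm and one new-target column per measurement, with -inf marking impossible assignments) and again assumes common p_d and p_g for all tracks.

```python
import math

NEG_INF = float("-inf")

def assignment_matrix(meas_loglik, p_d, p_g, beta_fa, beta_nt):
    """Build the MHT assignment matrix for one parent hypothesis.

    Rows are measurements; columns are [tracks | false-alarm block |
    new-target block]. meas_loglik[i][j] = log p_{k|k-1}^j(y_k^i).
    """
    m = len(meas_loglik)
    n = len(meas_loglik[0]) if m else 0
    A = []
    for i in range(m):
        # Track columns: l_ij = log(P_D p(y)) - log(1 - P_D P_G).
        row = [math.log(p_d) + meas_loglik[i][j] - math.log(1 - p_d * p_g)
               for j in range(n)]
        # Diagonal false-alarm and new-target blocks.
        fa = [math.log(beta_fa) if c == i else NEG_INF for c in range(m)]
        nt = [math.log(beta_nt) if c == i else NEG_INF for c in range(m)]
        A.append(row + fa + nt)
    return A
```

Maximising the total reward over one-to-one row/column assignments of this matrix then yields the most probable child of the parent hypothesis.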
Assignment Problem: N-Best Solutions
Given an assignment matrix A_i, the Auction algorithm (or similar) finds the best assignment in polynomial time.
Generalizations of this problem to find the N-best solutions:
- Formulate the problem as several best-assignment problems (Murty's method).
- Solve each of them independently using the Auction algorithm.

Assignment Problem: Murty's Method
Given the assignment matrix A_i:
1. Find the best solution using the Auction algorithm.
2. 2nd best solution:
   - Express the 2nd best solution as the solution of a number of best-solution assignment problems.
   - Find the solution to each of these problems by Auction.
   - The solution giving the maximum reward (minimum cost) is the second best solution.
3. Repeat the procedure for more solutions.

Algorithm Outline
Aim: Given hypotheses {Θ_{k-1}^i}_{i=1}^{N_h} and measurements {y_k^i}_{i=1}^{m_k}, find the N_h best hypotheses {Θ_k^i}_{i=1}^{N_h} (avoid generating all hypotheses).

Reminder of the hypothesis probability:

  P(Θ_k^l | y_{0:k}) ∝ β_fa^{m_fa} β_nt^{m_nt} [ ∏_{j ∈ J_D^i} (P_D^j p_{k|k-1}^j(y_k^{θ_k(j)})) / (1 - P_D^j P_G^j) ] × C_i P(Θ_{k-1}^i | y_{0:k-1})

where the bracketed product is assignment dependent and C_i P(Θ_{k-1}^i | y_{0:k-1}) is the legacy (parent-hypothesis) part.

Find the {Θ_k^l}_{l=1}^{N_h} that maximize P(Θ_k^l | y_{0:k}). Two steps:
- Obtain the solution from the assignment problem (Murty's method).
- Multiply the obtained quantity by the terms that depend on the previous hypothesis.

Generating the N_h Best Hypotheses
Input: {Θ_{k-1}^i}_{i=1}^{N_h}, {P(Θ_{k-1}^i | y_{0:k-1})}_{i=1}^{N_h}, and {y_k^i}_{i=1}^{m_k}
Output: HYP-LIST (N_h hypotheses, in decreasing probability) and PROB-LIST (matching probabilities)

1. Initialize all elements in HYP-LIST and PROB-LIST to null.
2. Find the assignment matrices {A_i}_{i=1}^{N_h} for {Θ_{k-1}^i}_{i=1}^{N_h}.
3. For j = 1, ..., N_h:
   1. For i = 1, ..., N_h:
      1. For the assignment matrix A_i, find the jth best solution Θ_k^{ji}.
      2. Compute the probability P(Θ_k^{ji}).
      3. Update HYP-LIST and PROB-LIST: if the new hypothesis enters the list, discard the least probable entry.
      4. If P(Θ_k^{ji}) is lower than the lowest probability in PROB-LIST, discard Θ_k^{ji} and never use A_i again in subsequent iterations.
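A compact sketch of Murty's method follows. For brevity the single-assignment subproblems are solved by brute force over permutations rather than by the Auction algorithm the lecture assumes, so this is only practical for small matrices; rewards are maximised, and the partitioning forces/forbids (row, column) pairs exactly as described above.

```python
import heapq
from itertools import permutations

def best_assignment(A, forced=(), forbidden=frozenset()):
    """Brute-force best (maximum-reward) assignment of rows to distinct
    columns, honouring forced and forbidden (row, col) pairs. A real
    tracker would use the Auction algorithm here instead. Returns
    (reward, assignment) or None if no feasible assignment exists."""
    m, n = len(A), len(A[0])
    best = None
    for cols in permutations(range(n), m):
        if not all(cols[r] == c for r, c in forced):
            continue
        if any((r, cols[r]) in forbidden for r in range(m)):
            continue
        reward = sum(A[r][cols[r]] for r in range(m))
        if best is None or reward > best[0]:
            best = (reward, cols)
    return best

def murty(A, N):
    """Return up to the N best assignments, in decreasing reward."""
    first = best_assignment(A)
    if first is None:
        return []
    heap = [(-first[0], first[1], (), frozenset())]
    out = []
    while heap and len(out) < N:
        neg_r, asg, forced, forbidden = heapq.heappop(heap)
        out.append((-neg_r, asg))
        # Partition the remaining solution space: subproblem r keeps rows
        # 0..r-1 fixed to their current columns and forbids (r, asg[r]).
        forced_list = list(forced)
        for r in range(len(asg)):
            if any(fr == r for fr, _ in forced_list):
                continue
            sub_forbidden = frozenset(forbidden | {(r, asg[r])})
            sub = best_assignment(A, tuple(forced_list), sub_forbidden)
            if sub is not None:
                heapq.heappush(heap, (-sub[0], sub[1],
                                      tuple(forced_list), sub_forbidden))
            forced_list.append((r, asg[r]))
    return out
```

Because the subproblem solution spaces are disjoint, every assignment is produced exactly once, and the heap pops them in decreasing reward order.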
Track-Based MHT: Motivation
- Hypotheses usually contain identical tracks.
- There are significantly fewer unique tracks than hypotheses.
- Idea: store tracks, T^i, not hypotheses, Θ^i, over time.
[Figure: hypotheses Θ_k^1 and Θ_k^2 sharing track estimates ŷ_{k|k}]

Track-Based MHT: Principle
- Keep a track list: the tracks at time k, {T_k^i}_{i=1}^{N_t}, with track scores Sc(T_k^i).
- Form a track tree, not a hypothesis tree: each old track branches into new tracks, one per associated measurement (plus a no-detection branch).
- Delete tracks with low scores.
[Figure: track tree in which the old tracks T_{k-1}^1 and T_{k-1}^2 branch into the new tracks T_k^1, ..., T_k^8]

Track-Based MHT: Hypothesis Generation
A hypothesis is a collection of compatible tracks, e.g.,
  Θ_k^1 = {T_k^1, T_k^5, T_k^8},  Θ_k^2 = {T_k^2, T_k^3, T_k^7, T_k^8}
Generating hypotheses is needed to reduce the number of tracks further and for user presentation:
- Use only tracks with a high score.
- Keep track compatibility information, e.g., in a binary matrix whose (i, j) entry is 1 if tracks T_k^i and T_k^j are compatible (share no measurement) and 0 otherwise.
[Table: 8 × 8 binary compatibility matrix for the tracks T_k^1, ..., T_k^8]

Track-Based MHT: Track Scores and Probabilities
- Track probability: the total probability of the hypotheses containing the track,
    P(T_k^i) = Σ_{j : T_k^i ∈ Θ_k^j} P(Θ_k^j)
- Hypothesis score: the sum of the scores of the member tracks,
    Sc(Θ_k^i) = Σ_{j : T_k^j ∈ Θ_k^i} Sc(T_k^j)
- Hypothesis probability:
    P(Θ_k^i) = exp(Sc(Θ_k^i)) / (1 + Σ_{j=1}^{N_h} exp(Sc(Θ_k^j)))
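The score-to-probability relations above translate directly into code. The sketch below assumes track scores are already log-domain quantities and keeps the "1 +" term from the slide's normalisation (which accounts for the empty hypothesis); the data layout is my own.

```python
import math

def hypothesis_probabilities(hypotheses, track_scores):
    """Hypothesis probabilities from track scores.

    hypotheses   : list of sets of track ids (each set = one hypothesis)
    track_scores : dict mapping track id -> log-domain score Sc(T)
    Sc(Theta) is the sum of member track scores, and
    P(Theta) = exp(Sc(Theta)) / (1 + sum_j exp(Sc(Theta_j))).
    """
    scores = [sum(track_scores[t] for t in h) for h in hypotheses]
    denom = 1.0 + sum(math.exp(s) for s in scores)
    return [math.exp(s) / denom for s in scores]

def track_probability(track_id, hypotheses, hyp_probs):
    """P(track) = sum of probabilities of hypotheses containing it."""
    return sum(p for h, p in zip(hypotheses, hyp_probs)
               if track_id in h)
```

In a full implementation one would work with log-sum-exp to avoid overflow for large scores; the plain form above mirrors the slide's formula.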
Track-Based MHT: Complexity Reducing Techniques
- Cluster incompatible tracks for efficient hypothesis generation.
- Apply N-scan pruning to the track trees.
- Merge tracks with a common recent measurement history.

Track-Based MHT: System Components
New set of measurements {y_k^i}_{i=1}^{m_k}
→ Generate new tracks {T_k^i}_{i=1}^{N_t}
→ Discard low-score tracks
→ Generate hypotheses {Θ_k^i}_{i=1}^{N_h}
→ Calculate track probabilities
→ Discard low-probability tracks
→ User presentation logic
(the surviving set of tracks is fed back to the generation step at the next time instant)

User Presentation Logic
- Maximum probability hypothesis: the simplest alternative.
  Possibly jumpy; the maximum probability hypothesis can change erratically.
- Show track clusters: (weighted) mean, covariance, and expected number of targets.
- Keep a separate track list: update it at each time step with a selection of tracks from different hypotheses.
- Consult Blackman and Popoli (1999) for details.

Example: Harbor Protection (SUPPORT)
[Figure/video: MHT tracking example from the SUPPORT project]
Example: Busy Indoor Environments (ADABTS)
[Figure/video: MHT tracking example from the ADABTS project]

Which Multi-TT Method to Use?

  Computation \ SNR   Low              Medium        High
  Low                 Group TT / PHD   GNN           GNN
  Medium              MHT              GNN or JPDA   GNN
  High                TrBD / MHT       MHT           Any

- GNN and JPDA are very bad in low SNR.
- When using GNN, one generally has to enlarge the overconfident covariances to account for the neglected data association uncertainty.
- JPDA suffers from track coalescence and should not be used with closely spaced targets; see the coalescence-avoiding versions.
- MHT requires a significantly higher computational load, but it is said to work reasonably under 10-100 times worse SNR.

Learning More

Samuel S. Blackman. Multiple hypothesis tracking for multiple target tracking. IEEE Transactions on Aerospace and Electronic Systems, 19(1):5-18, January 2004.
Samuel S. Blackman and Robert Popoli. Design and Analysis of Modern Tracking Systems. Artech House radar library. Artech House, Inc., 1999. ISBN 1-58053-006-0.
Ingemar J. Cox and Sunita L. Hingorani. An efficient implementation of Reid's multiple hypothesis tracking algorithm and its evaluation for the purpose of visual tracking. IEEE Transactions on Pattern Analysis and Machine Intelligence, 18(2):138-150, February 1996.
Ingemar J. Cox and Matthew L. Miller. On finding ranked assignments with application to multitarget tracking and motion correspondence. IEEE Transactions on Aerospace and Electronic Systems, 31(1):486-489, January 1995.
Ingemar J. Cox, Matthew L. Miller, Roy Danchick, and G. E. Newnam. A comparison of two algorithms for determining ranked assignments with application to multitarget tracking and motion correspondence. IEEE Transactions on Aerospace and Electronic Systems, 33(1):295-301, January 1997.
Roy Danchick and G. E. Newnam.
Reformulating Reid s MHT method with generalised Murty K-best ranked linear assignment algorithm. IEE Proceedings-F Radar and Sonar Navigation, 53():3 22, February 2006. Matthew L. Miller, Harold S. Stone, and Ingemar J. Cox. Optimizing Murty s ranked assignment method. IEEE Transactions on Aerospace and Electronic Systems, 33(3):85 862, July 997. Donald B. Reid. An algorithm for tracking multiple tragets. IEEE Transactions on Automatic Control, 24(6):843 854, December 979. G. Hendeby (hendeby@isy.liu.se) Target Tracking: Lecture 5 (MHT) December 0, 204 35 / 36 G. Hendeby (hendeby@isy.liu.se) Target Tracking: Lecture 5 (MHT) December 0, 204 36 / 36