G. Hendeby (hendeby@isy.liu.se) — Target Tracking: Lecture 5 (MHT) — December 10, 2014
REGLERTEKNIK — AUTOMATIC CONTROL

Target Tracking: Lecture 5
Multiple Target Tracking: Part II

Gustaf Hendeby (hendeby@isy.liu.se)
Div. Automatic Control, Dept. Electrical Engineering
Linköping University
December 10, 2014

Lecture Outline
1. Multiple Hypothesis Tracking (MHT): fundamental components, summary
2. Hypothesis-Based MHT: assignment problem, algorithm
3. Track-Based MHT: implementation details, summary
4. User Interaction
5. Examples: SUPPORT, ADABTS
6. Summary: concluding remarks, learn more

Last Lecture
- Introduction to multi-target tracking (MTT)
- Single hypothesis trackers (SHT): global nearest neighbor (GNN) and joint probabilistic data association (JPDA)
- The auction algorithm
- The fundamental theorem of target tracking

Multiple Hypothesis Tracking (MHT)
- MHT: consider multiple association hypotheses over time
- Started with the conceptual MHT
- Integrated track initialization
- Two principal implementations: hypothesis based and track based
MHT: basic idea
- Described in Reid (1979): an intuitive, hypothesis-based, brute-force implementation.
- Between consecutive time instants, the association hypotheses {Θ_k^i}, i = 1, ..., N_h, are kept in memory.
- Idea: generate all possible hypotheses, then prune to avoid combinatorial hypothesis growth.
- Hypothesis limiting techniques: clustering, pruning low-probability hypotheses, N-scan pruning, combining similar hypotheses, ...

Fundamental Components: representing hypotheses
Each hypothesis Θ_k^i is characterized by the number of targets (tracks) it contains and their corresponding sufficient statistics (track estimates ŷ_{k|k}), together with the hypothesis probability P(Θ_k^i). (The slides illustrate two hypotheses Θ_k^1 and Θ_k^2 with different numbers of tracks.)

Fundamental Components: generating hypotheses
Given the previous hypotheses {Θ_{k-1}^i} and the current set of measurements, each new hypothesis is formed as Θ_k^l = {θ_k, Θ_{k-1}^i}, where θ_k associates the current measurements with existing tracks, false alarms, or new targets.
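The combinatorial growth is easy to see in code. Below is a minimal sketch (illustrative, not from the slides; the labels "FA"/"NT" and the function name are my own) that enumerates every association event θ_k for a single scan:

```python
from itertools import product

def association_events(n_targets, n_meas):
    """All association events theta_k for one scan: each measurement is
    labeled with an existing target index, "FA" (false alarm) or "NT"
    (new target); a target can receive at most one measurement."""
    labels = list(range(n_targets)) + ["FA", "NT"]
    events = []
    for event in product(labels, repeat=n_meas):
        targets = [a for a in event if isinstance(a, int)]
        if len(targets) == len(set(targets)):  # one-to-one on targets
            events.append(event)
    return events

# With 2 targets and 2 measurements there are 4*4 - 2 = 14 valid events.
events = association_events(2, 2)
```

Each previous hypothesis spawns one child per event, which is why the hypothesis count explodes after a few scans and pruning becomes mandatory.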
Fundamental Components: computing hypothesis probabilities
Let Θ_k^l = {θ_k, Θ_{k-1}^i}. Then, using the fundamental theorem of target tracking,

  P(Θ_k^l | y_{0:k}) ∝ p(y_k | Θ_k^l, y_{0:k-1}) P(θ_k | Θ_{k-1}^i, y_{0:k-1}) P(Θ_{k-1}^i | y_{0:k-1})
                    ∝ β_fa^{m_k^fa} β_nt^{m_k^nt} [ ∏_{j ∈ J_D^i} P_D^j p_{k|k-1}^j(y_k^{θ_k(j)}) ] [ ∏_{j ∈ J_ND^i} (1 − P_D^j P_G^j) ] P(Θ_{k-1}^i | y_{0:k-1})

where m_k^fa and m_k^nt are the numbers of measurements declared false alarms and new targets, β_fa and β_nt the corresponding densities, P_D^j and P_G^j the detection and gating probabilities of track j, and p_{k|k-1}^j(·) its predicted measurement likelihood.

Note: the index sets J_D^i (detected tracks) and J_ND^i (undetected tracks) depend on Θ_{k-1}^i. The number of targets and the target estimates usually differ between hypotheses.

Fundamental Components: system overview
New set of measurements {y_k^i}, i = 1, ..., m_k → from the hypothesis set {Θ_{k-1}^i} generate the new hypotheses {Θ_k^i} → calculate the hypothesis probabilities {P(Θ_k^i)} → reduce the number of hypotheses → user presentation logic.

Reducing Complexity
- Clustering: group targets without common measurements, and handle the groups separately.
- Pruning of low-probability hypotheses.
- N-scan pruning.
- Merging similar hypotheses.
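The hypothesis probability update above can be sketched directly, working in the log domain for numerical safety. This is my own illustrative implementation of the formula (the argument layout and names are assumptions, not from the slides):

```python
import math

def log_hypothesis_weight(assignment, log_lik, P_D, P_G, beta_fa, beta_nt,
                          log_prev):
    """Unnormalized log probability of a child hypothesis: false-alarm /
    new-target densities for unassigned measurements, detection factors
    P_D * likelihood for assigned tracks, and missed-detection factors
    (1 - P_D * P_G) for the undetected tracks.
    assignment: list over measurements with a track index, "FA" or "NT";
    log_lik[i][j]: log predicted likelihood of measurement i under track j."""
    detected = {a for a in assignment if isinstance(a, int)}
    w = log_prev  # log P(Theta_{k-1}^i | y_{0:k-1})
    for i, a in enumerate(assignment):
        if a == "FA":
            w += math.log(beta_fa)
        elif a == "NT":
            w += math.log(beta_nt)
        else:
            w += math.log(P_D[a]) + log_lik[i][a]
    for j in range(len(P_D)):
        if j not in detected:  # j in J_ND: missed detection
            w += math.log(1.0 - P_D[j] * P_G[j])
    return w
```

Normalizing the resulting weights over all children of all parents gives the posterior hypothesis probabilities.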
Clustering
Group targets without common measurements, and handle the groups separately. (The slides illustrate two clusters, each with its own hypotheses.)

Clustering: cluster management
When targets get closer:
- If a measurement falls inside the gates of tracks in different clusters, merge the clusters.
- The hypotheses of the merged clusters are combined into super-hypotheses.
When targets separate:
- If a group of tracks in a cluster does not share measurements with the other tracks in the cluster (for a period of time), split the cluster.
- The hypotheses of the original cluster are likewise divided into smaller hypotheses corresponding to the two smaller clusters.

Clustering: process clusters separately (1/2) — hypothesis generation
Form Θ_k^l = {θ_k, Θ_{k-1}^i} for each cluster as if the other clusters did not exist. Compared to the case without clustering, each cluster only considers its own tracks and the measurements that gate with them, which keeps the per-cluster hypothesis trees small.
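The grouping rule (tracks belong to the same cluster when they can share a measurement) is a connected-components problem. A minimal union-find sketch, with names of my own choosing:

```python
def cluster_tracks(gates):
    """Group tracks into clusters that share at least one gated measurement.
    gates[t] is the set of measurement indices inside track t's gate.
    Returns a list of clusters (sets of track indices)."""
    parent = list(range(len(gates)))

    def find(x):
        while parent[x] != x:
            parent[x] = parent[parent[x]]  # path halving
            x = parent[x]
        return x

    def union(a, b):
        parent[find(a)] = find(b)

    for t1 in range(len(gates)):
        for t2 in range(t1 + 1, len(gates)):
            if gates[t1] & gates[t2]:  # common measurement -> same cluster
                union(t1, t2)

    clusters = {}
    for t in range(len(gates)):
        clusters.setdefault(find(t), set()).add(t)
    return list(clusters.values())

# Tracks 0 and 1 share measurement 2; track 2 is isolated: two clusters.
clusters = cluster_tracks([{0, 2}, {2, 3}, {5}])
```

Merging clusters when a new measurement gates with tracks in two of them corresponds to one extra union; splitting requires recomputing the components.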
Clustering: process clusters separately (2/2) — hypothesis reduction
For each cluster:
- Delete hypotheses with probability below a threshold γ_p (e.g., γ_p = 0.001), i.e., delete Θ_k^i if P(Θ_k^i) < γ_p.
- Keep only the most probable hypotheses with a total probability mass above a threshold γ_c (e.g., γ_c = 0.99): with the hypotheses ordered so that P(Θ_k^{l_1}) ≥ P(Θ_k^{l_2}) ≥ ..., keep the smallest N such that ∑_{j=1}^{N} P(Θ_k^{l_j}) > γ_c.

N-Scan Pruning
Assume that association ambiguities are resolved after N steps: trace the current best hypothesis back N scans, keep only the branch of the hypothesis tree it descends from, and prune the remaining branches. (The slides illustrate this on a hypothesis tree Θ_0^1, Θ_0^2, Θ_0^3, ... grown over several scans.)
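The two probability-based reduction rules above can be sketched in a few lines (my own helper, with the thresholds from the text as defaults):

```python
def prune_hypotheses(probs, gamma_p=0.001, gamma_c=0.99):
    """Drop hypotheses with probability below gamma_p, then keep the most
    probable ones until their cumulative mass exceeds gamma_c. Returns the
    surviving indices and their renormalized probabilities."""
    order = sorted(range(len(probs)), key=lambda i: probs[i], reverse=True)
    kept, mass = [], 0.0
    for i in order:
        if probs[i] < gamma_p:
            break  # sorted descending, so all remaining are below gamma_p too
        kept.append(i)
        mass += probs[i]
        if mass > gamma_c:
            break  # enough probability mass retained
    total = sum(probs[i] for i in kept)
    return kept, [probs[i] / total for i in kept]

kept, p = prune_hypotheses([0.6, 0.35, 0.04, 0.01], gamma_c=0.97)
```

Renormalizing after pruning keeps the surviving hypothesis probabilities summing to one, as the subsequent recursion assumes.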
(Further N-scan pruning illustrations in the slides: the hypothesis tree is repeatedly regrown from the surviving branch and pruned again as new scans arrive.)

Hypothesis Merging
Reid's original paper suggests checking for hypothesis pairs with:
- the same number of targets (tracks), and
- similar track estimates.
If these conditions are satisfied, merge the hypotheses and assign the new hypothesis the sum of the probabilities of the combined hypotheses.
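Reid's merging criterion can be sketched as follows. This is an illustrative simplification of my own: estimates are scalars and "similar" means element-wise closeness below a tolerance, whereas a real tracker would compare full state vectors and covariances:

```python
def merge_hypotheses(hyps, tol=0.1):
    """Merge hypotheses with the same number of tracks and similar
    estimates; the merged hypothesis gets the summed probability.
    hyps: list of (estimates, probability) pairs, estimates as scalars."""
    merged = []
    for est, p in hyps:
        est = sorted(est)  # compare track sets irrespective of ordering
        for m in merged:
            if len(m[0]) == len(est) and all(
                    abs(a - b) < tol for a, b in zip(m[0], est)):
                m[1] += p  # same structure, similar estimates: merge
                break
        else:
            merged.append([est, p])
    return [(e, p) for e, p in merged]

# The first two hypotheses have two near-identical tracks and are merged.
out = merge_hypotheses([([1.0, 5.0], 0.4), ([5.02, 1.01], 0.35), ([9.0], 0.25)])
```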
Hypothesis-Based MHT: summary
- Attractive method, since each hypothesis is an alternative representation of reality and hence easily interpreted.
- Drawback: generating all possible hypotheses only to discard most of them is inefficient.
- Some hypotheses contain the same tracks; there are therefore fewer unique tracks than hypotheses.
- Track-based methods were popular until an efficient way to implement a hypothesis-based MHT was given by Cox and Hingorani (1996).

Efficient Hypothesis-Based MHT
- Proposed by Cox and Hingorani (1996).
- Generate only the best hypotheses; skip hypotheses that would be deleted anyway.
- Use the N-best solutions to the assignment problem (introduced last lecture with GNN): Murty's method (1968).
- Find the N_h best hypotheses while generating as few unnecessary hypotheses as possible.
- The hypothesis reduction techniques above still apply.

Assignment Problem: repetition (1/2)
Let Θ_k^l = {θ_k, Θ_{k-1}^i}, so that

  P(Θ_k^l | y_{0:k}) ∝ β_fa^{m_k^fa} β_nt^{m_k^nt} [ ∏_{j ∈ J_D^i} P_D^j p_{k|k-1}^j(y_k^{θ_k(j)}) ] [ ∏_{j ∈ J_ND^i} (1 − P_D^j P_G^j) ] P(Θ_{k-1}^i | y_{0:k-1}).

Divide and multiply the right-hand side by

  C_i ≜ ∏_{j=1}^{n_t^i} (1 − P_D^j P_G^j) = [ ∏_{j ∈ J_D^i} (1 − P_D^j P_G^j) ] [ ∏_{j ∈ J_ND^i} (1 − P_D^j P_G^j) ],

which gives

  P(Θ_k^l | y_{0:k}) ∝ β_fa^{m_k^fa} β_nt^{m_k^nt} [ ∏_{j ∈ J_D^i} P_D^j p_{k|k-1}^j(y_k^{θ_k(j)}) / (1 − P_D^j P_G^j) ] C_i P(Θ_{k-1}^i | y_{0:k-1}),

where C_i does not depend on the association θ_k.

Assignment Problem: repetition (2/2)
Take logarithms and form the assignment matrices. With

  l_ij = log [ P_D^j p_{k|k-1}^j(y_k^i) / (1 − P_D^j P_G^j) ],

each previous hypothesis Θ_{k-1}^i gives an assignment matrix A_i with one row per measurement and columns for the existing tracks (entries l_ij), one false-alarm column per measurement (diagonal entries log β_fa), and one new-target column per measurement (diagonal entries log β_nt). For example, a parent hypothesis with two tracks and three measurements gives a 3 × (2 + 3 + 3) matrix A_1, while a parent with a single track gives the smaller matrix A_2.
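The matrix layout above can be made concrete. The construction below is my own illustration (names are assumptions); -inf encodes the forbidden cells in the false-alarm and new-target blocks, since measurement i may only use its own fa/nt column:

```python
import math

def assignment_matrix(log_lik, P_D, P_G, beta_fa, beta_nt):
    """Build the m x (n + 2m) assignment matrix for one parent hypothesis:
    columns are [tracks | false-alarm block | new-target block], with
    l_ij = log(P_D_j * lik_ij / (1 - P_D_j * P_G_j)) in the track columns.
    log_lik[i][j]: log predicted likelihood of measurement i under track j."""
    m, n = len(log_lik), len(P_D)
    NEG = float("-inf")
    A = [[NEG] * (n + 2 * m) for _ in range(m)]
    for i in range(m):
        for j in range(n):
            A[i][j] = (math.log(P_D[j]) + log_lik[i][j]
                       - math.log(1.0 - P_D[j] * P_G[j]))
        A[i][n + i] = math.log(beta_fa)      # false-alarm column for meas i
        A[i][n + m + i] = math.log(beta_nt)  # new-target column for meas i
    return A
```

The total reward of an assignment in A_i is then, up to the constant log C_i + log P(Θ_{k-1}^i | y_{0:k-1}), the log probability of the corresponding child hypothesis.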
Assignment Problem: N-best solutions
Given an assignment matrix A_i, the auction algorithm (or similar) finds the best assignment in polynomial time. Generalizations of this problem find the N-best solutions:
- formulate the problem as several best-assignment problems,
- solve these independently using the auction algorithm.
This is Murty's method.

Murty's Method
Given the assignment matrix A_i:
1. Find the best solution using the auction algorithm.
2. 2nd-best solution: express the 2nd-best solution as the solution of a number of best-solution assignment problems (each obtained by forcing or forbidding parts of the best assignment), solve each of these with the auction algorithm, and take the solution with maximum reward (minimum cost) as the second best.
3. Repeat the procedure for further solutions.

Algorithm Outline
Aim: given the hypotheses {Θ_{k-1}^i}, i = 1, ..., N_h, and the measurements {y_k^i}, i = 1, ..., m_k, find the N_h best hypotheses {Θ_k^i} without generating all hypotheses.

Reminder of the hypothesis probability:

  P(Θ_k^l | y_{0:k}) ∝ β_fa^{m_k^fa} β_nt^{m_k^nt} [ ∏_{j ∈ J_D^i} P_D^j p_{k|k-1}^j(y_k^{θ_k(j)}) / (1 − P_D^j P_G^j) ] × C_i P(Θ_{k-1}^i | y_{0:k-1})
      (assignment dependent)                                                                             (legacy)

Find {Θ_k^l}, l = 1, ..., N_h, maximizing P(Θ_k^l | y_{0:k}), in two steps:
- obtain the assignment-dependent factor from the assignment problem (Murty's method),
- multiply the obtained quantity by the terms that depend on the previous hypothesis.

Generating the N_h Best Hypotheses
Input: {Θ_{k-1}^i} and {P(Θ_{k-1}^i | y_{0:k-1})}, i = 1, ..., N_h, and {y_k^i}, i = 1, ..., m_k.
Output: HYP-LIST (N_h hypotheses, in decreasing probability) and PROB-LIST (matching probabilities).
1. Initialize all elements of HYP-LIST and PROB-LIST.
2. Find the assignment matrices {A_i} for {Θ_{k-1}^i}.
3. For j = 1, ..., N_h:
     For i = 1, ..., N_h:
     1. For the assignment matrix A_i, find the j-th best solution Θ_k^{ji}.
     2. Compute the probability P(Θ_k^{ji}).
     3. Update HYP-LIST and PROB-LIST: if the new hypothesis enters the list, discard the least probable entry.
     4. If P(Θ_k^{ji}) is lower than the lowest probability in PROB-LIST, discard Θ_k^{ji} and never use A_i again in subsequent iterations (its j-th best solutions can only get worse).
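Murty's partitioning step can be sketched compactly. In this illustrative version (my own, under stated assumptions) the single-best solver is a brute-force search standing in for the auction algorithm, so it only works on toy sizes, rewards are maximized, and A must have at least as many columns as rows:

```python
from itertools import permutations, count
import heapq

def best_assignment(A, forced, forbidden):
    """Brute-force best (maximum reward) one-to-one assignment of rows to
    columns of A, honoring forced (row, col) pairs and forbidden pairs."""
    m, n = len(A), len(A[0])
    best = None
    for cols in permutations(range(n), m):
        sol = list(enumerate(cols))  # (row, col) pairs
        if any((r, c) in forbidden for r, c in sol):
            continue
        if any(sol[r][1] != c for r, c in forced):
            continue
        reward = sum(A[r][c] for r, c in sol)
        if best is None or reward > best[0]:
            best = (reward, sol)
    return best

def murty(A, N):
    """Up to the N best assignments, in decreasing reward. Each popped
    solution is partitioned into disjoint subproblems by forcing its
    first t-1 pairs and forbidding its t-th pair."""
    tie = count()  # tie-breaker so the heap never compares solutions
    heap = []
    first = best_assignment(A, [], set())
    heapq.heappush(heap, (-first[0], next(tie), first[1], [], set()))
    ranked = []
    while heap and len(ranked) < N:
        neg_r, _, sol, forced, forbidden = heapq.heappop(heap)
        ranked.append((-neg_r, sol))
        kept = list(forced)
        for pair in sol:
            if pair in kept:
                continue
            sub = best_assignment(A, kept, forbidden | {pair})
            if sub is not None:
                heapq.heappush(heap, (-sub[0], next(tie), sub[1],
                                      list(kept), forbidden | {pair}))
            kept.append(pair)
    return ranked

ranked = murty([[10, 5], [3, 7]], 3)  # a 2x2 problem has only 2 solutions
```

Replacing `best_assignment` with the auction algorithm recovers the polynomial-time version referred to in the slides.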
Track-Based MHT: motivation
- Hypotheses usually contain identical tracks, so there are significantly fewer unique tracks than hypotheses.
- Idea: store tracks, T^i, not hypotheses, Θ^i, over time.

Track-Based MHT: principle
- Keep a track list {T_k^i}, i = 1, ..., N_t, with track scores Sc(T_k^i).
- Form a track tree, not a hypothesis tree: each old track spawns new tracks, one per possible measurement association.
- Delete tracks with low scores.

Track-Based MHT: hypothesis generation
- A hypothesis is a collection of compatible tracks, e.g., Θ_k^1 = {T_k^1, T_k^5, T_k^8} and Θ_k^2 = {T_k^2, T_k^3, T_k^7, T_k^8}.
- Generating hypotheses is still needed, both for reducing the number of tracks further and for user presentation.
- Use only tracks with high scores.
- Keep track compatibility information, e.g., in a binary matrix whose (i, j) entry indicates whether tracks T_k^i and T_k^j share no measurements and can therefore coexist in a hypothesis.

Track Scores and Probabilities
- Hypothesis score: Sc(Θ_k^i) = ∑_{T_k^j ∈ Θ_k^i} Sc(T_k^j).
- Hypothesis probability:
    P(Θ_k^i) = exp(Sc(Θ_k^i)) / (1 + ∑_{j=1}^{N_h} exp(Sc(Θ_k^j)))
- Track probability: P(T_k^i) = ∑_{j : T_k^i ∈ Θ_k^j} P(Θ_k^j).
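The compatibility matrix and the score-to-probability mapping above can be combined into a short sketch (my own; brute-force enumeration is only feasible for small track lists, and the "1" in the denominator accounts for the hypothesis that everything is a false alarm):

```python
import math
from itertools import combinations

def enumerate_hypotheses(track_meas):
    """Enumerate every set of pairwise compatible tracks; tracks are
    compatible when they share no measurements. track_meas[i] is the set
    of measurement ids used by track i."""
    n = len(track_meas)
    compat = [[not (track_meas[a] & track_meas[b]) for b in range(n)]
              for a in range(n)]  # the binary compatibility matrix
    hyps = []
    for r in range(1, n + 1):
        for subset in combinations(range(n), r):
            if all(compat[a][b] for a, b in combinations(subset, 2)):
                hyps.append(list(subset))
    return hyps

def hypothesis_probabilities(track_scores, hyps):
    """Sc(Theta) = sum of member track scores;
    P(Theta_i) = exp(Sc_i) / (1 + sum_j exp(Sc_j)).
    Track probability = total probability of hypotheses containing it."""
    sc = [sum(track_scores[t] for t in h) for h in hyps]
    denom = 1.0 + sum(math.exp(s) for s in sc)
    probs = [math.exp(s) / denom for s in sc]
    track_prob = [sum(p for h, p in zip(hyps, probs) if t in h)
                  for t in range(len(track_scores))]
    return probs, track_prob
```

In a real implementation one would only evaluate high-score tracks and solve a best-hypothesis (maximum weighted independent set) problem instead of enumerating all subsets.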
Track-Based MHT: complexity reducing techniques
- Cluster incompatible tracks for efficient hypothesis generation.
- Apply N-scan pruning to the track trees.
- Merge tracks with a common recent measurement history.

Track-Based MHT: system components
New set of measurements {y_k^i}, i = 1, ..., m_k → from the track set {T_{k-1}^i} generate new tracks {T_k^i} → discard low-score tracks → generate hypotheses {Θ_k^i} → calculate track probabilities → discard low-probability tracks → user presentation logic.

User Presentation Logic
- Maximum probability hypothesis: the simplest alternative, but possibly jumpy; the maximum probability hypothesis can change erratically.
- Show track clusters: (weighted) mean, covariance, and expected number of targets.
- Keep a separate track list: update it at each step with a selection of tracks from different hypotheses.
- Consult Blackman and Popoli (1999) for details.

Example: harbor protection (SUPPORT)
Example: busy indoor environments (ADABTS)

Which Multi-TT Method to Use?

  Computation \ SNR   Low SNR          Medium SNR     High SNR
  Low                 Group TT / PHD   GNN            GNN
  Medium              MHT              GNN or JPDA    GNN
  High                TrBD / MHT       MHT            Any

- GNN and JPDA are very bad in low SNR.
- When using GNN, one generally has to enlarge the overconfident covariances to account for the neglected data association uncertainty.
- JPDA suffers from track coalescence and should not be used with closely spaced targets; see the coalescence-avoiding versions.
- MHT requires a significantly higher computational load, but it is said to work reasonably under 10-100 times worse SNR.

Learning More
Samuel S. Blackman. Multiple hypothesis tracking for multiple target tracking. IEEE Aerospace and Electronic Systems Magazine, 19(1):5-18, January 2004.
Samuel S. Blackman and Robert Popoli. Design and Analysis of Modern Tracking Systems. Artech House radar library. Artech House, Inc., 1999.
Ingemar J. Cox and Sunita L. Hingorani. An efficient implementation of Reid's multiple hypothesis tracking algorithm and its evaluation for the purpose of visual tracking. IEEE Transactions on Pattern Analysis and Machine Intelligence, 18(2):138-150, February 1996.
Ingemar J. Cox and Matthew L. Miller. On finding ranked assignments with application to multitarget tracking and motion correspondence. IEEE Transactions on Aerospace and Electronic Systems, 31(1):486-489, January 1995.
Ingemar J. Cox, Matthew L. Miller, Roy Danchick, and G. E. Newnam. A comparison of two algorithms for determining ranked assignments with application to multitarget tracking and motion correspondence. IEEE Transactions on Aerospace and Electronic Systems, 33(1):295-301, January 1997.
Roy Danchick and G. E. Newnam.
Reformulating Reid's MHT method with generalised Murty K-best ranked linear assignment algorithm. IEE Proceedings - Radar, Sonar and Navigation, 153(1):13-22, February 2006.
Matthew L. Miller, Harold S. Stone, and Ingemar J. Cox. Optimizing Murty's ranked assignment method. IEEE Transactions on Aerospace and Electronic Systems, 33(3):851-862, July 1997.
Donald B. Reid. An algorithm for tracking multiple targets. IEEE Transactions on Automatic Control, 24(6):843-854, December 1979.
More informationMachine Learning
Machine Learning 10-701 Tom M. Mitchell Machine Learning Department Carnegie Mellon University January 13, 2011 Today: The Big Picture Overfitting Review: probability Readings: Decision trees, overfiting
More informationDetecting Changes in Multivariate Time Series
Detecting Changes in Multivariate Time Series Alan Wise* Supervisor: Rebecca Wilson September 2 nd, 2016 *STOR-i Ball: Best Dressed Male 2016 What I Will Cover 1 Univariate Changepoint Detection Detecting
More informationDecision Trees. Machine Learning 10701/15781 Carlos Guestrin Carnegie Mellon University. February 5 th, Carlos Guestrin 1
Decision Trees Machine Learning 10701/15781 Carlos Guestrin Carnegie Mellon University February 5 th, 2007 2005-2007 Carlos Guestrin 1 Linear separability A dataset is linearly separable iff 9 a separating
More information26 : Spectral GMs. Lecturer: Eric P. Xing Scribes: Guillermo A Cidre, Abelino Jimenez G.
10-708: Probabilistic Graphical Models, Spring 2015 26 : Spectral GMs Lecturer: Eric P. Xing Scribes: Guillermo A Cidre, Abelino Jimenez G. 1 Introduction A common task in machine learning is to work with
More informationConditional Random Field
Introduction Linear-Chain General Specific Implementations Conclusions Corso di Elaborazione del Linguaggio Naturale Pisa, May, 2011 Introduction Linear-Chain General Specific Implementations Conclusions
More informationPAC-learning, VC Dimension and Margin-based Bounds
More details: General: http://www.learning-with-kernels.org/ Example of more complex bounds: http://www.research.ibm.com/people/t/tzhang/papers/jmlr02_cover.ps.gz PAC-learning, VC Dimension and Margin-based
More informationTSRT14: Sensor Fusion Lecture 9
TSRT14: Sensor Fusion Lecture 9 Simultaneous localization and mapping (SLAM) Gustaf Hendeby gustaf.hendeby@liu.se TSRT14 Lecture 9 Gustaf Hendeby Spring 2018 1 / 28 Le 9: simultaneous localization and
More informationSUPERVISED LEARNING: INTRODUCTION TO CLASSIFICATION
SUPERVISED LEARNING: INTRODUCTION TO CLASSIFICATION 1 Outline Basic terminology Features Training and validation Model selection Error and loss measures Statistical comparison Evaluation measures 2 Terminology
More informationIntroduction Dual Representations Kernel Design RBF Linear Reg. GP Regression GP Classification Summary. Kernel Methods. Henrik I Christensen
Kernel Methods Henrik I Christensen Robotics & Intelligent Machines @ GT Georgia Institute of Technology, Atlanta, GA 30332-0280 hic@cc.gatech.edu Henrik I Christensen (RIM@GT) Kernel Methods 1 / 37 Outline
More informationMultivariate statistical methods and data mining in particle physics
Multivariate statistical methods and data mining in particle physics RHUL Physics www.pp.rhul.ac.uk/~cowan Academic Training Lectures CERN 16 19 June, 2008 1 Outline Statement of the problem Some general
More informationCOMS 4721: Machine Learning for Data Science Lecture 16, 3/28/2017
COMS 4721: Machine Learning for Data Science Lecture 16, 3/28/2017 Prof. John Paisley Department of Electrical Engineering & Data Science Institute Columbia University SOFT CLUSTERING VS HARD CLUSTERING
More informationMulti-Target Particle Filtering for the Probability Hypothesis Density
Appears in the 6 th International Conference on Information Fusion, pp 8 86, Cairns, Australia. Multi-Target Particle Filtering for the Probability Hypothesis Density Hedvig Sidenbladh Department of Data
More informationGWAS V: Gaussian processes
GWAS V: Gaussian processes Dr. Oliver Stegle Christoh Lippert Prof. Dr. Karsten Borgwardt Max-Planck-Institutes Tübingen, Germany Tübingen Summer 2011 Oliver Stegle GWAS V: Gaussian processes Summer 2011
More informationParameter Estimation. Industrial AI Lab.
Parameter Estimation Industrial AI Lab. Generative Model X Y w y = ω T x + ε ε~n(0, σ 2 ) σ 2 2 Maximum Likelihood Estimation (MLE) Estimate parameters θ ω, σ 2 given a generative model Given observed
More informationChapter 14 Combining Models
Chapter 14 Combining Models T-61.62 Special Course II: Pattern Recognition and Machine Learning Spring 27 Laboratory of Computer and Information Science TKK April 3th 27 Outline Independent Mixing Coefficients
More informationMultimedia Databases 1/29/ Indexes for Multimedia Data Indexes for Multimedia Data Indexes for Multimedia Data
1/29/2010 13 Indexes for Multimedia Data 13 Indexes for Multimedia Data 13.1 R-Trees Multimedia Databases Wolf-Tilo Balke Silviu Homoceanu Institut für Informationssysteme Technische Universität Braunschweig
More informationComputer Vision Group Prof. Daniel Cremers. 2. Regression (cont.)
Prof. Daniel Cremers 2. Regression (cont.) Regression with MLE (Rep.) Assume that y is affected by Gaussian noise : t = f(x, w)+ where Thus, we have p(t x, w, )=N (t; f(x, w), 2 ) 2 Maximum A-Posteriori
More informationMachine Learning. Theory of Classification and Nonparametric Classifier. Lecture 2, January 16, What is theoretically the best classifier
Machine Learning 10-701/15 701/15-781, 781, Spring 2008 Theory of Classification and Nonparametric Classifier Eric Xing Lecture 2, January 16, 2006 Reading: Chap. 2,5 CB and handouts Outline What is theoretically
More informationEigenface-based facial recognition
Eigenface-based facial recognition Dimitri PISSARENKO December 1, 2002 1 General This document is based upon Turk and Pentland (1991b), Turk and Pentland (1991a) and Smith (2002). 2 How does it work? The
More informationOut-of-Sequence Data Processing for Track-before-Detect using Dynamic Programming
Out-of-Sequence Data Processing for Track-before-Detect using Dynamic Programming Felix Govaers, Yang Rong +, Lai Hoe Chee +, Teow Loo Nin +, Ng Gee Wah +, Wolfgang Koch Fraunhofer FKIE, Wachtberg, Germany
More informationRAO-BLACKWELLIZED PARTICLE FILTER FOR MARKOV MODULATED NONLINEARDYNAMIC SYSTEMS
RAO-BLACKWELLIZED PARTICLE FILTER FOR MARKOV MODULATED NONLINEARDYNAMIC SYSTEMS Saiat Saha and Gustaf Hendeby Linöping University Post Print N.B.: When citing this wor, cite the original article. 2014
More informationTutorial on Venn-ABERS prediction
Tutorial on Venn-ABERS prediction Paolo Toccaceli (joint work with Alex Gammerman and Ilia Nouretdinov) Computer Learning Research Centre Royal Holloway, University of London 6th Symposium on Conformal
More informationClustering bi-partite networks using collapsed latent block models
Clustering bi-partite networks using collapsed latent block models Jason Wyse, Nial Friel & Pierre Latouche Insight at UCD Laboratoire SAMM, Université Paris 1 Mail: jason.wyse@ucd.ie Insight Latent Space
More informationMULTI-MODEL FILTERING FOR ORBIT DETERMINATION DURING MANOEUVRE
MULTI-MODEL FILTERING FOR ORBIT DETERMINATION DURING MANOEUVRE Bruno CHRISTOPHE Vanessa FONDBERTASSE Office National d'etudes et de Recherches Aérospatiales (http://www.onera.fr) B.P. 72 - F-92322 CHATILLON
More informationOnline Clutter Estimation Using a Gaussian Kernel Density Estimator for Target Tracking
4th International Conference on Information Fusion Chicago, Illinois, USA, July 5-8, 0 Online Clutter Estimation Using a Gaussian Kernel Density Estimator for Target Tracking X. Chen, R. Tharmarasa, T.
More informationReducing Multiclass to Binary: A Unifying Approach for Margin Classifiers
Reducing Multiclass to Binary: A Unifying Approach for Margin Classifiers Erin Allwein, Robert Schapire and Yoram Singer Journal of Machine Learning Research, 1:113-141, 000 CSE 54: Seminar on Learning
More informationConstrained State Estimation Using the Unscented Kalman Filter
16th Mediterranean Conference on Control and Automation Congress Centre, Ajaccio, France June 25-27, 28 Constrained State Estimation Using the Unscented Kalman Filter Rambabu Kandepu, Lars Imsland and
More informationA Study of Poisson Multi-Bernoulli Mixture Conjugate Prior in Multiple Target Estimation
A Study of Poisson Multi-Bernoulli Mixture Conjugate Prior in Multiple Target Estimation Sampling-based Data Association, Multi-Bernoulli Mixture Approximation and Performance Evaluation Master s thesis
More informationOn Identification of Cascade Systems 1
On Identification of Cascade Systems 1 Bo Wahlberg Håkan Hjalmarsson Jonas Mårtensson Automatic Control and ACCESS, School of Electrical Engineering, KTH, SE-100 44 Stockholm, Sweden. (bo.wahlberg@ee.kth.se
More informationThe Kernel Trick, Gram Matrices, and Feature Extraction. CS6787 Lecture 4 Fall 2017
The Kernel Trick, Gram Matrices, and Feature Extraction CS6787 Lecture 4 Fall 2017 Momentum for Principle Component Analysis CS6787 Lecture 3.1 Fall 2017 Principle Component Analysis Setting: find the
More informationA Modified Incremental Principal Component Analysis for On-line Learning of Feature Space and Classifier
A Modified Incremental Principal Component Analysis for On-line Learning of Feature Space and Classifier Seiichi Ozawa, Shaoning Pang, and Nikola Kasabov Graduate School of Science and Technology, Kobe
More informationIncorporating Track Uncertainty into the OSPA Metric
14th International Conference on Information Fusion Chicago, Illinois, USA, July 5-8, 211 Incorporating Trac Uncertainty into the OSPA Metric Sharad Nagappa School of EPS Heriot Watt University Edinburgh,
More informationCS 231A Section 1: Linear Algebra & Probability Review
CS 231A Section 1: Linear Algebra & Probability Review 1 Topics Support Vector Machines Boosting Viola-Jones face detector Linear Algebra Review Notation Operations & Properties Matrix Calculus Probability
More informationIntroduction: MLE, MAP, Bayesian reasoning (28/8/13)
STA561: Probabilistic machine learning Introduction: MLE, MAP, Bayesian reasoning (28/8/13) Lecturer: Barbara Engelhardt Scribes: K. Ulrich, J. Subramanian, N. Raval, J. O Hollaren 1 Classifiers In this
More informationIntroduction to SVM and RVM
Introduction to SVM and RVM Machine Learning Seminar HUS HVL UIB Yushu Li, UIB Overview Support vector machine SVM First introduced by Vapnik, et al. 1992 Several literature and wide applications Relevance
More informationCS 231A Section 1: Linear Algebra & Probability Review. Kevin Tang
CS 231A Section 1: Linear Algebra & Probability Review Kevin Tang Kevin Tang Section 1-1 9/30/2011 Topics Support Vector Machines Boosting Viola Jones face detector Linear Algebra Review Notation Operations
More informationMMSE-Based Filtering for Linear and Nonlinear Systems in the Presence of Non-Gaussian System and Measurement Noise
MMSE-Based Filtering for Linear and Nonlinear Systems in the Presence of Non-Gaussian System and Measurement Noise I. Bilik 1 and J. Tabrikian 2 1 Dept. of Electrical and Computer Engineering, University
More informationVariational Principal Components
Variational Principal Components Christopher M. Bishop Microsoft Research 7 J. J. Thomson Avenue, Cambridge, CB3 0FB, U.K. cmbishop@microsoft.com http://research.microsoft.com/ cmbishop In Proceedings
More informationLecture : Probabilistic Machine Learning
Lecture : Probabilistic Machine Learning Riashat Islam Reasoning and Learning Lab McGill University September 11, 2018 ML : Many Methods with Many Links Modelling Views of Machine Learning Machine Learning
More information