Feature Selection & Dynamic Tracking F&P Textbook New: Ch 11, Old: Ch 17 Guido Gerig CS 6320, Spring 2013

Credits: Material from Greg Welch & Gary Bishop, UNC Chapel Hill; some slides modified from J.M. Frahm / M. Pollefeys, and R. Klette. Course Materials.

Material: Feature selection: SIFT Features. Tracking: Kalman Filter: F&P Chapter 17; Greg Welch and Gary Bishop, UNC: http://www.cs.unc.edu/~welch/kalman/ (web site with electronic and printed references, book lists, Java demo, software etc.). Course material SIGGRAPH: http://www.cs.unc.edu/~tracker/ref/s2001/kalman/index.html

Tracking Rigid Objects (sequence of example figures)

Feature Tracking: Tracking of good features & efficient search for subsequent positions. What are good features? Required properties: well-defined (i.e. neighboring points should all be different); stable across views (i.e. the same 3D point should be extracted as a feature for neighboring viewpoints).

Lowe's SIFT features (Lowe, ICCV99). SIFT: Scale Invariant Feature Transform. Recover features with change of position, orientation and scale.

SIFT features: scale-space DoG maxima; verify minimum contrast and cornerness; orientation from dominant gradient; descriptor based on gradient distributions.
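
As a rough sketch of this pipeline, the snippet below uses OpenCV's SIFT implementation (not part of the slides); the image path and threshold values are illustrative placeholders.

```python
# Minimal SIFT sketch (OpenCV >= 4.4); file name and thresholds are placeholders.
import cv2

img = cv2.imread("frame.png", cv2.IMREAD_GRAYSCALE)               # hypothetical input frame
sift = cv2.SIFT_create(contrastThreshold=0.04, edgeThreshold=10)  # contrast / cornerness checks
keypoints, descriptors = sift.detectAndCompute(img, None)

# Each keypoint carries position, scale and dominant orientation (kp.pt, kp.size, kp.angle);
# each descriptor row is a 128-dimensional histogram of local gradient distributions.
print(len(keypoints), None if descriptors is None else descriptors.shape)
```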

Dynamic Feature Tracking: Tracking is the problem of generating an inference about the motion of an object given a sequence of images. The key technical difficulty is maintaining an accurate representation of the posterior on object position given measurements, and doing so efficiently.

Kalman Filter: The Kalman filter is a very powerful tool when it comes to controlling noisy systems. Apollo 8 (December 1968), the first human spaceflight from the Earth to an orbit around the moon, would certainly not have been possible without the Kalman filter (see www.ion.org/museum/item_view.cfm?cid=6&scid=5&id=293). Applications: tracking, economics, navigation, depth and velocity measurements, ...

14 What is it used for? Tracking missiles; tracking heads/hands/drumsticks; extracting lip motion from video; fitting Bezier patches to point data; lots of computer vision applications; economics; navigation.

Key Concept: Noisy process data; estimate average trajectories. Smoothing: sliding window for averaging (here size 64). But what if the horizontal axis is time? We know the past but not the future! Time-dependent process: modeling of the process itself, including noise estimates.
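
As a contrast to the causal, time-dependent modelling introduced next, a sliding-window average of the kind shown here can be sketched as follows (synthetic signal; window size 64 as on the slide):

```python
# Sliding-window smoothing: a symmetric moving average of size 64.
# It uses samples from both past and future, which an online, causal filter
# such as the Kalman filter cannot do.
import numpy as np

rng = np.random.default_rng(0)
signal = np.cumsum(rng.normal(size=1000))             # synthetic noisy process data
window = np.ones(64) / 64.0
smoothed = np.convolve(signal, window, mode="same")   # sliding-window average
```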

Model for Tracking: The object has an internal state X_i (a capital letter indicates a random variable, a small letter x_i a particular value). The measurements obtained in frame i are Y_i, with value y_i.

Linear Dynamic Models: The state is the previous state, linearly transformed, plus Gaussian noise:
x_i ~ N(D_i x_{i-1}, Σ_d)
Relevant measurements are linearly obtained from the state, plus Gaussian noise:
y_i ~ N(M_i x_i, Σ_m)
It is sufficient to maintain the mean and standard deviation.
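
A minimal sketch of this model, assuming a scalar state and measurement and made-up noise levels, shows how such data would be generated:

```python
# Simulate the scalar linear dynamic model
#   x_i ~ N(d * x_{i-1}, sigma_d^2),   y_i ~ N(m * x_i, sigma_m^2)
import numpy as np

rng = np.random.default_rng(1)
d, m = 1.0, 1.0               # state transition and measurement "matrices" (scalars here)
sigma_d, sigma_m = 0.1, 0.5   # assumed process / measurement noise standard deviations

x, states, measurements = 0.0, [], []
for _ in range(100):
    x = rng.normal(d * x, sigma_d)   # state: previous state linearly transformed + noise
    y = rng.normal(m * x, sigma_m)   # measurement: linear function of the state + noise
    states.append(x)
    measurements.append(y)
```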

General Steps of Tracking:
1. Prediction: What is the next state of the object given past measurements? P(X_i | Y_0 = y_0, ..., Y_{i-1} = y_{i-1})
2. Data association: Which measurements are relevant for the state?
3. Correction: Compute a representation of the state from prediction and measurements: P(X_i | Y_0 = y_0, ..., Y_i = y_i)

Concept of Kalman Filtering: predict, then correct.

Independence Assumptions:
Only the immediate past matters: P(X_i | X_1, ..., X_{i-1}) = P(X_i | X_{i-1})
Measurements depend only on the current state: P(Y_i, Y_j, ..., Y_k | X_i) = P(Y_i | X_i) P(Y_j, ..., Y_k | X_i)
Important simplifications. Fortunately they do not limit us too much!

Spirit of Kalman Filtering, a really simple example: We are on a boat at night and have lost our position. We know: star positions.

Fixed Position: p_i is the position of the boat, v_i its velocity. Here p_i = p_{i-1}, and the state is X_i = p_i, so X_i = D_i X_{i-1} with D_i = I. We only measure position, so M_i = I and Y_i = M_i X_i = X_i.

Observer 1 makes a measurement y_0 with uncertainty σ_m0. Conditional density function: N(y_0, σ_m0²), so the estimate is x̂_0 = y_0 with σ_0 = σ_m0.

Then observer 2 makes a measurement y_1 with uncertainty σ_m1. Conditional density function: N(y_1, σ_m1²). What are x̂_1 and σ_1? How does the second measurement affect the estimate from the first measurement?

Combine measurements & variances (Kalman):
x̂_1 = x̂_0 + K_1 (y_1 - x̂_0), with Kalman gain K_1 = σ_0² / (σ_0² + σ_m1²)
Equivalently: x̂_1 = [σ_m1² / (σ_0² + σ_m1²)] y_0 + [σ_0² / (σ_0² + σ_m1²)] y_1
Combine variances (statistics): 1/σ_1² = 1/σ_0² + 1/σ_m1²

Combined estimate: conditional density function N(x̂_1, σ_1²). The original estimate is updated (corrected) in the presence of a new measurement.
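
A quick numeric check of these fusion formulas, with made-up measurement values and variances:

```python
# Fusing two noisy position estimates as on the previous slides (illustrative numbers).
y0, var0 = 10.0, 4.0    # first measurement: estimate x_hat_0 = y0, variance sigma_0^2
y1, var1 = 12.0, 1.0    # second, more precise measurement y1, variance sigma_m1^2

K1 = var0 / (var0 + var1)                    # Kalman gain: trust placed in the new measurement
x1 = y0 + K1 * (y1 - y0)                     # fused estimate x_hat_1 = 11.6
var_fused = 1.0 / (1.0 / var0 + 1.0 / var1)  # fused variance sigma_1^2 = 0.8, smaller than both
print(x1, var_fused)
```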

33 Predict / Correct: The KF operates by 1. predicting the new state and its uncertainty, and 2. correcting with the new measurement (predict-correct loop).

34 A really simple example: We are on a boat at night and have lost our position. We know: we move with constant velocity; star positions.

35 But suppose we're moving: Not all the difference is error; some may be motion. The KF can include a motion model and estimate velocity and position.

36 Process Model: Describes how the state changes over time. The state in the first example was scalar, and the process model was "nothing changes". A better model might be constant velocity motion: X_t = [p_t, v_t], with p_t = p_{t-1} + (Δt) v_{t-1} and v_t = v_{t-1}.

37 Measurement Model: What you see from where you are, not where you are from what you see.

38 Constant Velocity: p_t is the position of the boat, v_t its velocity, with p_t = p_{t-1} + (Δt) v_{t-1}. The state is X_t = [p_t, v_t]^T, and X_t = D X_{t-1} with
D = [ 1  Δt ]
    [ 0   1 ]
We only measure position, so M = [1 0] and Y_t = [1 0] [p_t, v_t]^T = p_t.
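
In code, the constant-velocity model amounts to these two matrices; the time step Δt = 1 and the example state are arbitrary illustrative choices:

```python
# Constant-velocity model matrices from this slide (dt = 1.0 chosen for illustration).
import numpy as np

dt = 1.0
D = np.array([[1.0, dt],
              [0.0, 1.0]])    # state transition: p <- p + dt*v, v <- v
M = np.array([[1.0, 0.0]])    # we measure only the position component

x = np.array([2.0, 0.5])      # example state: position 2.0, velocity 0.5
x_pred = D @ x                # predicted state one step later: [2.5, 0.5]
y_pred = M @ x_pred           # predicted measurement: [2.5]
```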

Multidimensional Statistics: to be seen as a generalization of the scalar-valued mean and variance to higher dimensions. For 2 variables, the mean becomes a 2-vector and the variance a 2x2 covariance matrix (see the sketch below). Source: Wikipedia.
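
A small synthetic sketch of what that looks like for two correlated variables (the data and their correlation are made up):

```python
# Mean vector and 2x2 covariance matrix of two correlated variables (synthetic data).
import numpy as np

rng = np.random.default_rng(2)
x = rng.normal(size=500)
y = 0.8 * x + rng.normal(scale=0.5, size=500)   # y correlated with x

mean = np.array([x.mean(), y.mean()])           # 2-dimensional mean vector
cov = np.cov(np.stack([x, y]))                  # 2x2 covariance matrix
```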

40 State and Error Covariance: the first two moments of the Gaussian process are the process state (mean) x̂ and the error covariance Σ.

41 The Process Model: Process dynamics X_i = D_i X_{i-1} + w_i, where D_i is the state transition and w_i the uncertainty over the interval, w_i ~ N(0, Σ_d). Σ_d is difficult to determine.

42 Measurement Model: Measurement relationship to the state: Y_i = M_i X_i + ν_i, with measurement matrix M_i and measurement uncertainty ν_i ~ N(0, Σ_m).

43 Predict (Time Update): With the model X_i = D_i X_{i-1}, Y_i = M_i X_i:
x̂_i^- = D_i x̂_{i-1}
Σ_i^- = D_i Σ_{i-1} D_i^T + Σ_d

44 Measurement Update (Correct): a posteriori state and error covariance:
x̂_i = x̂_i^- + K_i (y_i - M_i x̂_i^-)
Σ_i = (I - K_i M_i) Σ_i^-
The Kalman gain K_i minimizes the a posteriori error covariance.

The Kalman Gain:
K_i = Σ_i^- M_i^T (M_i Σ_i^- M_i^T + Σ_m)^{-1}
It weights prediction and measurement in the a posteriori estimate. For no measurement uncertainty (Σ_m = 0): K_i = Σ_i^- M_i^T (M_i Σ_i^- M_i^T)^{-1} = M_i^{-1}, so x̂_i = x̂_i^- + M_i^{-1} (y_i - M_i x̂_i^-) = M_i^{-1} y_i: the state is deduced only from the measurement.

46 The Kalman Gain, simple univariate (scalar) example:
K_i = (σ_i^-)² / ((σ_i^-)² + σ_m²)
A posteriori state and error covariance:
x̂_i = x̂_i^- + K_i (y_i - x̂_i^-)
σ_i² = (1 - K_i) (σ_i^-)²

47 Summary
PREDICT:
x̂_i^- = D x̂_{i-1}
Σ_i^- = D Σ_{i-1} D^T + Σ_d
CORRECT:
K_i = Σ_i^- M^T (M Σ_i^- M^T + Σ_m)^{-1}
x̂_i = x̂_i^- + K_i (y_i - M x̂_i^-)
Σ_i = (I - K_i M) Σ_i^-
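
The summary maps almost line by line onto code. The following is a minimal NumPy sketch of the two steps in the slides' notation (D, M, Σ_d, Σ_m supplied by the caller); it is an illustration, not the course's reference implementation:

```python
# Kalman filter predict / correct steps, following the summary slide's notation.
import numpy as np

def predict(x, Sigma, D, Sigma_d):
    """Time update: propagate the state mean and error covariance."""
    x_pred = D @ x
    Sigma_pred = D @ Sigma @ D.T + Sigma_d
    return x_pred, Sigma_pred

def correct(x_pred, Sigma_pred, y, M, Sigma_m):
    """Measurement update: fold the new measurement y into the estimate."""
    S = M @ Sigma_pred @ M.T + Sigma_m              # innovation covariance
    K = Sigma_pred @ M.T @ np.linalg.inv(S)         # Kalman gain
    x = x_pred + K @ (y - M @ x_pred)               # a posteriori state
    Sigma = (np.eye(len(x)) - K @ M) @ Sigma_pred   # a posteriori error covariance
    return x, Sigma
```

With the constant-velocity D and M from slide 38, calling predict and then correct once per frame gives the predict-correct loop of slide 33.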

48 Example: Estimating a Constant. The state transition matrix is D = I: x_i = D x_{i-1} + w_i = x_{i-1} + w_i. The measurement matrix is M = I: y_i = M x_i + ν_i = x_i + ν_i. Prediction: x̂_i^- = x̂_{i-1}, (σ_i^-)² = σ_{i-1}² + σ_d².

49 Measurement Update:
x̂_i = x̂_i^- + K_i (y_i - x̂_i^-)
σ_i² = (1 - K_i) (σ_i^-)²
K_i = (σ_i^-)² / ((σ_i^-)² + σ_m²)

50 Setup/Initialization: Generate 50 samples centered around -0.37727 with a standard deviation of 0.1 (variance 0.01). Process noise σ_d² = 10^-5; initial estimate x̂_0 = 0; initial error variance σ_0² = 1.

51 State and Measurements, σ_m = 0.1 (σ_m² = 0.01): The filter was told the correct measurement variance.

52 Error Covariance σ_i² (initially 1).

53 State and Measurements, σ_m² = 1: The filter was told that the measurement variance was 100 times greater (i.e. 1), so it was slower to believe the measurements.

54 State and Measurements, σ_m = 0.01 (σ_m² = 0.0001): The filter was told that the measurement variance was 100 times smaller (i.e. 0.0001), so it was very quick to believe the noisy measurements.
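
A scalar sketch that reproduces the behaviour described on slides 51-54: the data follow the setup of slide 50, and only the measurement variance handed to the filter is varied (0.01, 1 and 0.0001):

```python
# Scalar Kalman filter estimating a constant, as in the slides' example.
import numpy as np

def estimate_constant(measurement_var, n=50, true_value=-0.37727, seed=0):
    rng = np.random.default_rng(seed)
    y = true_value + 0.1 * rng.normal(size=n)   # 50 noisy samples, std 0.1 (var 0.01)

    x_hat, sigma2 = 0.0, 1.0                    # initial estimate and error variance
    sigma2_d = 1e-5                             # process noise variance
    estimates = []
    for yi in y:
        # predict: the "nothing changes" process model
        x_pred, sigma2_pred = x_hat, sigma2 + sigma2_d
        # correct with the new measurement
        K = sigma2_pred / (sigma2_pred + measurement_var)
        x_hat = x_pred + K * (yi - x_pred)
        sigma2 = (1.0 - K) * sigma2_pred
        estimates.append(x_hat)
    return np.array(estimates)

est_correct = estimate_constant(0.01)     # told the correct variance
est_sluggish = estimate_constant(1.0)     # told 100x too large: slow to believe measurements
est_jumpy = estimate_constant(0.0001)     # told 100x too small: chases the noisy measurements
```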

Demonstration: own experiments.

2D Position-Velocity (PV) model (example figures)

Example: Hand Gesture Recognition and Tracking

63 Kalman Filter Web Site: http://www.cs.unc.edu/~welch/kalman/ Electronic and printed references; book lists and recommendations; research papers; links to other sites; some software; news.

64 Java-Based KF Learning Tool: on-line 1D simulation; linear and non-linear; variable dynamics. http://www.cs.unc.edu/~welch/kalman/

65 KF Course Web Page: http://www.cs.unc.edu/~tracker/ref/s2001/kalman/index.html (http://www.cs.unc.edu/~tracker/) Java-Based KF Learning Tool; KF web page.

66 Relevant References:
Azarbayejani, Ali, and Alex Pentland (1995). Recursive Estimation of Motion, Structure, and Focal Length, IEEE Trans. Pattern Analysis and Machine Intelligence 17(6): 562-575.
Dellaert, Frank, Sebastian Thrun, and Charles Thorpe (1998). Jacobian Images of Super-Resolved Texture Maps for Model-Based Motion Estimation and Tracking, IEEE Workshop on Applications of Computer Vision (WACV'98), October, Princeton, NJ, IEEE Computer Society.
http://mac-welch.cs.unc.edu/~welch/comp256/

Extensions: Particle Filtering, Condensation. http://www.robots.ox.ac.uk/~misard/condensation.html
A. Blake, B. Bascle, M. Isard, and J. MacCormick, Statistical models of visual shape and motion, Phil. Trans. R. Soc. A, vol. 356, pp. 1283-1302, 1998.
M. Isard and A. Blake, Condensation: conditional density propagation for visual tracking, Int. J. Computer Vision, vol. 28, no. 1, pp. 5-28, 1998.
A. Blake, M. Isard, and D. Reynard, Learning to track the visual motion of contours, J. Artificial Intelligence, vol. 78, pp. 101-134, 1995.

Condensation Algorithm (Blake et al.)

Extensions: Particle Filtering, Condensation (example figures)