ECE 468: Digital Image Processing. Lecture 8


ECE 468: Digital Image Processing, Lecture 8. Prof. Sinisa Todorovic, sinisa@eecs.oregonstate.edu

Point Descriptors

Point Descriptors
Describe image properties in the neighborhood of a keypoint.
Descriptors = vectors that are ideally affine invariant.
Popular descriptors:
- Scale-invariant feature transform (SIFT)
- Steerable filters
- Shape context and geometric blur
- Gradient location and orientation histogram (GLOH)
- Histogram of oriented gradients (HOG)

SIFT
The key idea: a point can be described by the distribution of intensity gradients in the neighborhood of that point.

SIFT Descriptor
128-D vector = (4x4 blocks) x (8 bins of histogram)
Gradients are computed over a 16x16 patch centered at the point; each 4x4 subpatch contributes a histogram of gradients quantized into 8 angles.
The figure illustrates only an 8x8 pixel neighborhood transformed into 2x2 blocks, for visibility.
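To make the layout concrete, here is a minimal numpy sketch of this construction (Lowe's full SIFT additionally applies Gaussian weighting, trilinear interpolation, orientation normalization, and contrast clipping, all omitted here):

```python
import numpy as np

def sift_like_descriptor(patch):
    """Build a 128-D SIFT-style descriptor from a 16x16 grayscale patch:
    a 4x4 grid of 4x4-pixel cells, each contributing an 8-bin histogram
    of gradient orientations weighted by gradient magnitude."""
    assert patch.shape == (16, 16)
    gy, gx = np.gradient(patch.astype(float))   # image gradients
    mag = np.hypot(gx, gy)                      # gradient magnitude
    ang = np.arctan2(gy, gx)                    # orientation in [-pi, pi]
    bins = ((ang + np.pi) / (2 * np.pi) * 8).astype(int) % 8
    desc = np.zeros((4, 4, 8))
    for i in range(16):
        for j in range(16):
            desc[i // 4, j // 4, bins[i, j]] += mag[i, j]
    d = desc.ravel()                            # (4x4 blocks) x (8 bins) = 128-D
    return d / (np.linalg.norm(d) + 1e-12)      # normalize to unit length

patch = np.random.rand(16, 16)
print(sift_like_descriptor(patch).shape)        # (128,)
```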

MATLAB Code for SIFT
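The slide's MATLAB listing is not reproduced in this transcript; as an illustrative stand-in, an equivalent pipeline in Python with OpenCV (which ships a SIFT implementation in versions >= 4.4) looks like this, with placeholder image filenames:

```python
import cv2

# Detect SIFT keypoints and compute 128-D descriptors for two images,
# then match them with brute-force nearest neighbors.
img1 = cv2.imread("img1.png", cv2.IMREAD_GRAYSCALE)  # hypothetical filenames
img2 = cv2.imread("img2.png", cv2.IMREAD_GRAYSCALE)

sift = cv2.SIFT_create()
kp1, des1 = sift.detectAndCompute(img1, None)
kp2, des2 = sift.detectAndCompute(img2, None)

# Euclidean (L2) distance is the standard matching cost for SIFT.
bf = cv2.BFMatcher(cv2.NORM_L2, crossCheck=True)
matches = sorted(bf.match(des1, des2), key=lambda m: m.distance)
print(f"{len(matches)} one-to-one matches")
```

Setting crossCheck=True keeps only mutual nearest neighbors, a greedy stand-in for the one-to-one matching formulated in the following slides.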

Matching Cost of Two Descriptors
Euclidean distance: $\delta(d_1, d_2) = \|d_1 - d_2\|$
Chi-squared distance: $\delta(d_1, d_2) = \sum_i \frac{(d_{1i} - d_{2i})^2}{d_{1i} + d_{2i}}$
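Both costs are one-liners in numpy; a small sketch (the epsilon guard against empty histogram bins is an addition):

```python
import numpy as np

def euclidean(d1, d2):
    # delta(d1, d2) = ||d1 - d2||
    return np.linalg.norm(d1 - d2)

def chi_squared(d1, d2, eps=1e-12):
    # delta(d1, d2) = sum_i (d1i - d2i)^2 / (d1i + d2i); eps avoids 0/0
    return np.sum((d1 - d2) ** 2 / (d1 + d2 + eps))

d1 = np.array([0.2, 0.5, 0.3])
d2 = np.array([0.3, 0.4, 0.3])
print(euclidean(d1, d2), chi_squared(d1, d2))
```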

Matching Formulation
Given two sets of descriptors to be matched, $V = \{d_1, d_2, \ldots, d_N\}$ and $V' = \{d'_1, d'_2, \ldots, d'_M\}$, find the legal mapping $f \in F$, $f \subseteq \{(d, d') : d \in V, d' \in V'\}$, which minimizes the total cost of matching:
$\hat{f} = \arg\min_{f \in F} \sum_{(d, d') \in f} \delta(d, d'), \qquad \delta(d, d') \ge 0$

Total Cost of Matching
$A$ = cost matrix, with one entry per candidate pair:
$A = \big[\delta(d_i, d'_j)\big]_{N \times M}$
The total cost over all pairs is $\sum_{(d, d')} \delta(d, d') = \mathrm{tr}(A^T \mathbf{1})$, where $\mathbf{1}$ is the $N \times M$ matrix of all ones.

Total Cost of Matching
Again with the cost matrix $A = \big[\delta(d_i, d'_j)\big]_{N \times M}$ and $\mathrm{tr}(A^T \mathbf{1})$ the cost summed over all pairs: what is $\sum_{(d, d') \in f} \delta(d, d')$ for a particular matching $f$?

Linearization
$\sum_{(d, d') \in f} \delta(d, d') = \sum_{(d, d') \in f} \delta(d, d') \cdot 1 = \sum_{(d, d')} \delta(d, d') \, x(d, d')$
where
$x(d, d') = 1$ if $(d, d') \in f$ (matched pair)
$x(d, d') = 0$ if $(d, d') \notin f$ (unmatched pair)

Linearization
Linearization by introducing an $N \times M$ indicator matrix, e.g.
$X = \begin{bmatrix} 0 & 0 & 1 & \cdots & 0 \\ 0 & 1 & 0 & \cdots & 0 \\ \vdots & & & & \vdots \end{bmatrix}$
$x(d, d') = 1$ if $(d, d') \in f$ (matched pair); $x(d, d') = 0$ if $(d, d') \notin f$ (unmatched pair).

Linearization
With the cost matrix $A = [\delta_{d,d'}]_{N \times M}$ and the indicator matrix $X = [x_{d,d'}]_{N \times M}$:
$\sum_{(d, d') \in f} \delta(d, d') = \sum_{(d, d')} \delta_{d,d'} \, x_{d,d'} = \mathrm{tr}(A^T X)$
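Both trace identities are easy to sanity-check in numpy (random costs, an arbitrary one-to-one indicator):

```python
import numpy as np

rng = np.random.default_rng(0)
N, M = 3, 4
A = rng.random((N, M))                    # cost matrix
X = np.zeros((N, M))
X[0, 2] = X[1, 1] = X[2, 3] = 1           # indicator of a one-to-one matching f

print(A.sum(), np.trace(A.T @ np.ones((N, M))))  # cost over all pairs: tr(A^T 1)
print((A * X).sum(), np.trace(A.T @ X))          # cost over matched pairs: tr(A^T X)
```

Each line prints two identical values, since $\mathrm{tr}(A^T X) = \sum_{i,j} A_{ij} X_{ij}$.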

Matching Formulation
$\hat{X} = \arg\min_X \mathrm{tr}(A^T X)$
$\hat{X} = \mathbf{0}$ is a trivial solution, so we need to constrain the formulation.

Matching Formulation
$\min_X \mathrm{tr}(A^T X)$
subject to:
$\forall d \in V, \, \forall d' \in V': \; x_{dd'} \in \{0, 1\}$
$\forall d: \sum_{d'} x_{dd'} = 1$
$\forall d': \sum_{d} x_{dd'} = 1$
What is the meaning of this constraint?

Matching Formulation
$\min_X \mathrm{tr}(A^T X)$
subject to:
$\forall d \in V, \, \forall d' \in V': \; x_{dd'} \in \{0, 1\}$
$\forall d: \sum_{d'} x_{dd'} = 1$, $\forall d': \sum_{d} x_{dd'} = 1$
one-to-one matching

Relaxation
$\min_X \mathrm{tr}(A^T X)$
subject to:
$\forall d \in V, \, \forall d' \in V': \; x_{dd'} \in [0, 1]$ (binary constraint relaxed to the interval)
$\forall d: \sum_{d'} x_{dd'} = 1$, $\forall d': \sum_{d} x_{dd'} = 1$
one-to-one matching
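The relaxed problem is a linear program, and for the balanced case its optimum is attained at a 0/1 permutation matrix (a consequence of Birkhoff's theorem), so nothing is lost by relaxing. A sketch with scipy's LP solver on a made-up 3x3 cost matrix:

```python
import numpy as np
from scipy.optimize import linprog

rng = np.random.default_rng(1)
N = 3
A = rng.random((N, N))

# Variables: X flattened row-major; objective tr(A^T X) = sum_ij A_ij x_ij.
c = A.ravel()

# Equality constraints: every row of X sums to 1, every column sums to 1.
A_eq = np.zeros((2 * N, N * N))
for i in range(N):
    A_eq[i, i * N:(i + 1) * N] = 1          # row i sums to 1
    A_eq[N + i, i::N] = 1                   # column i sums to 1
b_eq = np.ones(2 * N)

res = linprog(c, A_eq=A_eq, b_eq=b_eq, bounds=(0, 1))
X = res.x.reshape(N, N)
print(np.round(X, 6))   # for generic costs, a 0/1 permutation matrix
```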

Linear Assignment Problem
$\min_X \mathrm{tr}(A^T X)$
subject to:
$\forall d \in V, \, \forall d' \in V': \; x_{dd'} \in [0, 1]$
$\forall d: \sum_{d'} x_{dd'} = 1$, $\forall d': \sum_{d} x_{dd'} = 1$
one-to-one matching
This is the linear assignment problem; the Hungarian algorithm solves the balanced case $|V| = |V'|$.
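In practice one calls a library solver; scipy ships one for exactly this problem (per its docs, a modified Jonker-Volgenant algorithm rather than the classical Hungarian method, but it solves the same LAP):

```python
import numpy as np
from scipy.optimize import linear_sum_assignment

# Made-up 3x3 cost matrix: entry (i, j) is the cost of matching d_i to d'_j.
A = np.array([[4.0, 1.0, 3.0],
              [2.0, 0.0, 5.0],
              [3.0, 2.0, 2.0]])

rows, cols = linear_sum_assignment(A)   # optimal one-to-one matching
print(list(zip(rows, cols)))            # matched pairs (i, j)
print(A[rows, cols].sum())              # minimal total cost
```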

Hungarian Algorithm

The Hungarian Algorithm
1. From each row of A, find the row minimum and subtract it from all elements in that row.
2. From each column of A, find the column minimum and subtract it from all elements in that column.
3. Cross out the minimum number of rows and columns of A needed to cover all zero elements of A.
4. If the number of crossed-out lines equals the number of rows of A, we are done; go to step 6.
5. Otherwise, find the minimum entry of A that is not crossed out. Subtract it from all entries of A that are not crossed out, and add it to all entries that are crossed out twice (covered by both a row line and a column line). Return to step 3 with the new matrix.
6. Solutions are the zero elements of A. Go first for a zero element that is unique in its row and column; then delete that row and column from A. Repeat until all rows or columns of A are deleted.
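The reduction steps 1-2 already convey the core invariant: subtracting a constant from a whole row or column shifts every candidate assignment's cost by the same amount, so the optimal matching is unchanged. A short sketch checking this on a made-up 4x4 matrix, with brute force over permutations standing in for the crossing-out logic of steps 3-5:

```python
import numpy as np
from itertools import permutations

A = np.array([[6.0, 2.0, 1.0, 8.0],
              [7.0, 1.0, 6.0, 4.0],
              [3.0, 7.0, 8.0, 9.0],
              [7.0, 6.0, 2.0, 10.0]])   # made-up 4x4 cost matrix

# Steps 1-2: row reduction, then column reduction.
R = A - A.min(axis=1, keepdims=True)   # subtract each row minimum
R = R - R.min(axis=0, keepdims=True)   # subtract each column minimum

# Every permutation picks one element per row and per column, so the
# reductions shift all assignment costs equally. Brute-force check:
def best(M):
    n = len(M)
    return min(permutations(range(n)),
               key=lambda p: sum(M[i, p[i]] for i in range(n)))

assert best(A) == best(R)
print(best(A))   # optimal column index for each row
```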

Example -- The Hungarian Algorithm

The slides run the algorithm on a 4x4 cost matrix A, showing the reduced matrix after every operation (bold entries on the slides mark crossed-out rows and columns):

Step 1: find the minimum in each row and subtract it.
Step 2: find the minimum in each column and subtract it.
Step 3: cross out the zeros with a minimum number of lines. We found fewer lines than rows, so we are not done.
Step 4: find the minimum entry that is not crossed out.
Step 5: subtract it from the non-crossed-out elements and add it to the elements crossed out twice.
Return to step 1: find the minimums in each row and subtract.
Repeated step 2: find the minimums in each column and subtract (note: no change from the previous step).
Repeated step 3: cross out the zeros with a minimum number of lines. Now the number of lines equals the number of rows, so we are done.
Step 6 (solution): go first for the zero element that is unique in its row and column, add the corresponding pair (d_i, d'_j) to f, and delete that row and column from A; repeating this grows f by one matched pair at a time until all four pairs are matched. There are a number of alternative optimal solutions!