CSE 554 Lecture 7: Alignment

Fall 2012

Review

- Fairing (smoothing): relocating vertices to achieve a smoother appearance. Method: centroid averaging.
- Simplification: reducing the vertex count. Method: edge collapsing.

Registration

Fitting one model to match the shape of another. Applications include:

- Automated annotation
- Tracking and motion analysis
- Shape and data comparison

Challenges: global and local shape differences.

- Imaging causes global shifts and tilts, which require alignment.
- The shape of the organ or tissue differs between subjects and evolves over time, which requires deformation.

[Figure: brain outlines of two mice, before alignment, after alignment, and after deformation.]

Alignment

Registration by translation or rotation:

- The structure stays rigid under these two transformations.
- They are called rigid-body or isometric (distance-preserving) transformations.
- Mathematically, they are represented as matrix/vector operations.

[Figure: a point set before and after alignment.]

Transformation Math: Translation

Vector addition: $p' = v + p$

2D: $\begin{pmatrix} p'_x \\ p'_y \end{pmatrix} = \begin{pmatrix} v_x \\ v_y \end{pmatrix} + \begin{pmatrix} p_x \\ p_y \end{pmatrix}$

3D: $\begin{pmatrix} p'_x \\ p'_y \\ p'_z \end{pmatrix} = \begin{pmatrix} v_x \\ v_y \\ v_z \end{pmatrix} + \begin{pmatrix} p_x \\ p_y \\ p_z \end{pmatrix}$

Transformation Math: Rotation

Matrix product: $p' = R\,p$

2D: $\begin{pmatrix} p'_x \\ p'_y \end{pmatrix} = R \begin{pmatrix} p_x \\ p_y \end{pmatrix}$, where $R = \begin{pmatrix} \cos\theta & -\sin\theta \\ \sin\theta & \cos\theta \end{pmatrix}$

This rotates around the origin. To rotate around another point $q$: $p' = R\,(p - q) + q$

3D: $\begin{pmatrix} p'_x \\ p'_y \\ p'_z \end{pmatrix} = R \begin{pmatrix} p_x \\ p_y \\ p_z \end{pmatrix}$, with rotations around the X, Y, and Z axes:

$$R_x = \begin{pmatrix} 1 & 0 & 0 \\ 0 & \cos\alpha & -\sin\alpha \\ 0 & \sin\alpha & \cos\alpha \end{pmatrix} \quad R_y = \begin{pmatrix} \cos\alpha & 0 & \sin\alpha \\ 0 & 1 & 0 \\ -\sin\alpha & 0 & \cos\alpha \end{pmatrix} \quad R_z = \begin{pmatrix} \cos\alpha & -\sin\alpha & 0 \\ \sin\alpha & \cos\alpha & 0 \\ 0 & 0 & 1 \end{pmatrix}$$

Any arbitrary 3D rotation can be composed from these three rotations.
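To make the formulas concrete, here is a minimal NumPy sketch (not from the slides; the function names are my own) of a 2D rotation around an arbitrary point and the three 3D axis rotations:

```python
import numpy as np

def rot2d(theta):
    """2D rotation matrix for angle theta (radians)."""
    c, s = np.cos(theta), np.sin(theta)
    return np.array([[c, -s], [s, c]])

def rotate_around(p, theta, q):
    """Rotate point p by theta around point q: p' = R (p - q) + q."""
    return rot2d(theta) @ (p - q) + q

def rot_x(a):
    c, s = np.cos(a), np.sin(a)
    return np.array([[1, 0, 0], [0, c, -s], [0, s, c]])

def rot_y(a):
    c, s = np.cos(a), np.sin(a)
    return np.array([[c, 0, s], [0, 1, 0], [-s, 0, c]])

def rot_z(a):
    c, s = np.cos(a), np.sin(a)
    return np.array([[c, -s, 0], [s, c, 0], [0, 0, 1]])

# Rotating (2, 0) by 90 degrees around (1, 0) gives (1, 1).
print(rotate_around(np.array([2.0, 0.0]), np.pi / 2, np.array([1.0, 0.0])))
```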

Transformation Math: Properties of a Rotation Matrix

Orthonormal (orthogonal and normalized): $R\,R^T = I$

Examples:

$$\begin{pmatrix} \cos\theta & -\sin\theta \\ \sin\theta & \cos\theta \end{pmatrix} \begin{pmatrix} \cos\theta & \sin\theta \\ -\sin\theta & \cos\theta \end{pmatrix} = \begin{pmatrix} 1 & 0 \\ 0 & 1 \end{pmatrix}$$

$$\begin{pmatrix} 1 & 0 & 0 \\ 0 & \cos\theta & -\sin\theta \\ 0 & \sin\theta & \cos\theta \end{pmatrix} \begin{pmatrix} 1 & 0 & 0 \\ 0 & \cos\theta & \sin\theta \\ 0 & -\sin\theta & \cos\theta \end{pmatrix} = \begin{pmatrix} 1 & 0 & 0 \\ 0 & 1 & 0 \\ 0 & 0 & 1 \end{pmatrix}$$

- Easy to invert: $R^{-1} = R^T$
- Any orthonormal matrix (with determinant 1) represents a rotation around some axis (not limited to X, Y, Z).

Given an orthonormal matrix, the angle of rotation it represents can be easily calculated from the trace of the matrix, i.e., the sum of its diagonal entries:

- 2D: the trace equals $2\cos\alpha$, where $\alpha$ is the rotation angle.
- 3D: the trace equals $1 + 2\cos\alpha$.
- The larger the trace, the smaller the rotation angle.
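A quick NumPy check of these two properties (my own sketch, not part of the slides):

```python
import numpy as np

theta = 0.7
R = np.array([[np.cos(theta), -np.sin(theta)],
              [np.sin(theta),  np.cos(theta)]])

print(np.allclose(R @ R.T, np.eye(2)))      # orthonormal: R R^T = I
print(np.allclose(np.linalg.inv(R), R.T))   # easy inverse: R^{-1} = R^T

# Recover the rotation angle from the trace (2D: trace = 2 cos(alpha)).
alpha = np.arccos(np.trace(R) / 2)
print(np.isclose(alpha, theta))             # True
```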

Transformation Math: Eigenvectors and Eigenvalues

Let $M$ be a square matrix; $v$ is an eigenvector and $\lambda$ an eigenvalue if:

$$M\,v = \lambda\,v$$

- If $M$ represents a rotation (i.e., is orthonormal), the rotation axis is an eigenvector whose eigenvalue is 1.
- There are at most $m$ distinct eigenvalues for an $m$-by-$m$ matrix.
- Any scalar multiple of an eigenvector is also an eigenvector (with the same eigenvalue).

Alignment

- Input: two models represented as point sets, a source and a target.
- Output: locations of the translated and rotated source points.

[Figure: source and target point sets.]
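As an illustration of the first bullet (a sketch of mine, not from the slides), the rotation axis of a 3D rotation matrix can be recovered as the eigenvector with eigenvalue 1:

```python
import numpy as np

a = 0.5
R_z = np.array([[np.cos(a), -np.sin(a), 0],   # rotation around the Z axis
                [np.sin(a),  np.cos(a), 0],
                [0,          0,         1]])

vals, vecs = np.linalg.eig(R_z)               # eigenvalues may be complex
i = np.argmin(np.abs(vals - 1.0))             # pick the eigenvalue closest to 1
print(np.real(vecs[:, i]))                    # the axis, here +/- (0, 0, 1)
```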

Alignment Methods

- Method 1: principal component analysis (PCA). Aligns principal directions.
- Method 2: singular value decomposition (SVD). Optimal alignment given prior knowledge of correspondence.
- Method 3: iterative closest point (ICP). An iterative SVD algorithm that computes correspondences as it goes.

Method 1: PCA

Compute a shape-aware coordinate system for each model:

- Origin: centroid of all points.
- Axes: directions in which the model varies most or least.

Then transform the source to align its origin and axes with those of the target.

Method 1: PCA

Computing the axes by principal component analysis (PCA). Consider a set of points $p_1, \dots, p_n$ with centroid location $c$, and construct the matrix $P$ whose $i$-th column is the vector $p_i - c$.

2D (2 by $n$):

$$P = \begin{pmatrix} p_{1x} - c_x & p_{2x} - c_x & \dots & p_{nx} - c_x \\ p_{1y} - c_y & p_{2y} - c_y & \dots & p_{ny} - c_y \end{pmatrix}$$

3D (3 by $n$):

$$P = \begin{pmatrix} p_{1x} - c_x & p_{2x} - c_x & \dots & p_{nx} - c_x \\ p_{1y} - c_y & p_{2y} - c_y & \dots & p_{ny} - c_y \\ p_{1z} - c_z & p_{2z} - c_z & \dots & p_{nz} - c_z \end{pmatrix}$$

Build the covariance matrix (a 2-by-2 matrix in 2D, a 3-by-3 matrix in 3D):

$$M = P\,P^T$$

Eigenvectors of the covariance matrix represent the principal directions of shape variation. The eigenvectors are unsigned and orthogonal (2 in 2D; 3 in 3D). Eigenvalues indicate the amount of variation along each eigenvector: the eigenvector with the largest (smallest) eigenvalue is the direction in which the model shape varies the most (least).
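A minimal NumPy sketch of this computation (the function name and the n-by-d point layout are my own choices, not the course code):

```python
import numpy as np

def pca_axes(points):
    """points: n-by-d array of point coordinates.
    Returns (centroid, axes, eigenvalues); the columns of `axes` are the
    unit eigenvectors of the covariance matrix, largest eigenvalue first."""
    c = points.mean(axis=0)
    P = (points - c).T                # d-by-n: the i-th column is p_i - c
    M = P @ P.T                       # d-by-d covariance matrix
    vals, vecs = np.linalg.eigh(M)    # eigh, since M is symmetric
    order = np.argsort(vals)[::-1]    # sort by decreasing variation
    return c, vecs[:, order], vals[order]
```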

Method 1: PCA

PCA-based alignment. Let $c_S$, $c_T$ be the centroids of the source and target.

- First, translate the source to align $c_S$ with $c_T$: $p_i' = p_i + (c_T - c_S)$
- Next, find the rotation $R$ that aligns the two sets of PCA axes, and rotate the (translated) source around $c_T$: $p_i' = R\,(p_i - c_T) + c_T$
- Combined: $p_i' = R\,(p_i - c_S) + c_T$

Finding the rotation between two sets of oriented axes. Let $A$, $B$ be the two matrices whose columns are the axes. The axes are orthogonal and normalized (i.e., both $A$ and $B$ are orthonormal). We wish to compute a rotation matrix $R$ such that:

$$R\,A = B$$

Since $A$ and $B$ are orthonormal, we have:

$$R = B\,A^{-1} = B\,A^T$$
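Putting the two steps together (a sketch of mine that reuses the hypothetical pca_axes above; it ignores the axis-orientation ambiguity discussed next):

```python
import numpy as np

def pca_align(source, target):
    """Align an n-by-d source point set to a target by matching
    centroids and PCA axes: p_i' = R (p_i - c_S) + c_T."""
    c_s, A, _ = pca_axes(source)
    c_t, B, _ = pca_axes(target)
    R = B @ A.T                        # solves R A = B, using A^{-1} = A^T
    return (source - c_s) @ R.T + c_t  # rows transform by R
```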

Method 1: PCA

Assigning orientation to PCA axes (the eigenvectors are unsigned):

- There are 2 possible orientation assignments in 2D.
- In 3D, there are 4 possibilities (observing the right-hand rule) for the 1st, 2nd, and 3rd eigenvectors.

Finding the rotation between two sets of un-oriented axes:

- Fix the orientation of the target axes.
- For each orientation assignment of the source axes, compute $R$.
- Pick the $R$ with the smallest rotation angle, by checking the trace of $R$: the larger the trace, the smaller the angle. See the sketch below.

[Figure: two orientation assignments, one giving a smaller rotation and one a larger rotation.]
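One way to implement this search (my own sketch; it assumes the axis matrices are right-handed to begin with and skips sign flips that would produce a left-handed frame):

```python
import numpy as np
from itertools import product

def best_rotation(A, B):
    """A, B: d-by-d orthonormal matrices whose columns are source/target axes.
    Try every valid sign assignment of the source axes and keep the R with
    the largest trace, i.e., the smallest rotation angle."""
    best = None
    for signs in product([1, -1], repeat=A.shape[1]):
        A_flipped = A * np.array(signs)     # flip the chosen columns
        if np.linalg.det(A_flipped) < 0:    # keep right-handed frames only
            continue                        # (2 options in 2D, 4 in 3D)
        R = B @ A_flipped.T
        if best is None or np.trace(R) > np.trace(best):
            best = R
    return best
```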

Method 1: PCA

Limitations:

- The centroid and axes are affected by noise.
- The axes can be unreliable for circular objects: the eigenvalues become similar, and the eigenvectors become unstable under rotation by a small angle.

[Figures: a noisy model whose PCA axes are perturbed, and a near-circular model misaligned by PCA.]

Method 2: SVD

Optimal alignment between corresponding points, assuming that for each source point we know where the corresponding target point is.

Formulating the problem:

- Source points $p_1, \dots, p_n$ with centroid location $c_S$.
- Target points $q_1, \dots, q_n$ with centroid location $c_T$, where $q_i$ is the corresponding point of $p_i$.
- After centroid alignment and rotation by some $R$, a transformed source point is located at: $p_i' = R\,(p_i - c_S) + c_T$
- We wish to find the $R$ that minimizes the sum of pairwise squared distances:

$$E = \sum_{i=1}^{n} \| q_i - p_i' \|^2$$

Method 2: SVD

An equivalent formulation:

- Let $P$ be the matrix whose $i$-th column is the vector $p_i - c_S$.
- Let $Q$ be the matrix whose $i$-th column is the vector $q_i - c_T$.
- Consider the cross-covariance matrix $M = P\,Q^T$.
- Find the orthonormal matrix $R$ that maximizes the trace $\mathrm{Tr}(R\,M)$.

Solving the minimization problem. The singular value decomposition (SVD) of an $m$-by-$m$ matrix $M$:

$$M = U\,W\,V^T$$

- $U$, $V$ are $m$-by-$m$ orthonormal matrices (i.e., rotations).
- $W$ is a diagonal $m$-by-$m$ matrix with non-negative entries.
- The orthonormal matrix $R = V\,U^T$ is the $R$ that maximizes the trace $\mathrm{Tr}(R\,M)$.
- SVD is available in Mathematica and in many Java/C++ libraries.
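A quick numerical sanity check of the trace-maximization claim (my own sketch, using NumPy rather than the Mathematica/Java/C++ libraries the slides mention):

```python
import numpy as np

rng = np.random.default_rng(1)
M = rng.normal(size=(2, 2))              # a stand-in cross-covariance matrix
U, W, Vt = np.linalg.svd(M)              # M = U diag(W) V^T
R_best = Vt.T @ U.T                      # R = V U^T

# No random 2D rotation should achieve a larger trace than R_best.
thetas = rng.uniform(0.0, 2.0 * np.pi, size=1000)
traces = [np.trace(np.array([[np.cos(t), -np.sin(t)],
                             [np.sin(t),  np.cos(t)]]) @ M) for t in thetas]
print(np.trace(R_best @ M) >= max(traces))   # True
```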

Method 2: SVD

SVD-based alignment, in summary:

1. Form the cross-covariance matrix $M = P\,Q^T$.
2. Compute the SVD: $M = U\,W\,V^T$.
3. The rotation matrix is $R = V\,U^T$.
4. Translate and rotate the source: $p_i' = R\,(p_i - c_S) + c_T$

[Figure: the source translated, then rotated onto the target.]

Advantage over PCA: more stable, as long as the correspondences are correct.
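The whole recipe in a short NumPy sketch (my own; note that production implementations often also guard against $\det(R) = -1$, a reflection case the slides do not cover):

```python
import numpy as np

def svd_align(source, target):
    """Rigid alignment with known correspondences: source[i] matches target[i].
    Both are n-by-d arrays. Returns the transformed source points."""
    c_s = source.mean(axis=0)
    c_t = target.mean(axis=0)
    P = (source - c_s).T               # columns are p_i - c_S
    Q = (target - c_t).T               # columns are q_i - c_T
    M = P @ Q.T                        # cross-covariance matrix
    U, W, Vt = np.linalg.svd(M)
    R = Vt.T @ U.T                     # R = V U^T
    return (source - c_s) @ R.T + c_t  # p_i' = R (p_i - c_S) + c_T

# Sanity check: a rotated and translated copy is recovered exactly.
rng = np.random.default_rng(0)
src = rng.normal(size=(50, 2))
t = 0.8
R_true = np.array([[np.cos(t), -np.sin(t)], [np.sin(t), np.cos(t)]])
tgt = src @ R_true.T + np.array([3.0, -1.0])
print(np.allclose(svd_align(src, tgt), tgt))   # True
```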

Method 2: SVD

[Figure: a second stability example, aligned correctly given correct correspondences.]

Limitation: SVD requires accurate correspondences, which are usually not available.

Method 3: ICP

The idea: use PCA alignment to obtain an initial guess of the correspondences, then iteratively improve the correspondences by repeated SVD.

Iterative closest point (ICP):

1. Transform the source by PCA-based alignment.
2. For each transformed source point, assign the closest target point as its corresponding point, then align source and target by SVD. Not all target points need to be used.
3. Repeat step 2 until a termination criterion is met.

[Figure: ICP progress after PCA, after 1 iteration, and after 10 iterations.]

ICP Algorithm

[Figure: another example of ICP progress after PCA, after 1 iteration, and after 10 iterations.]

Termination criteria:

- A user-given maximum number of iterations is reached.
- The improvement in fitting is small, as measured by the root mean squared distance (RMSD):

$$\mathrm{RMSD} = \sqrt{\frac{\sum_{i=1}^{n} \| q_i - p_i' \|^2}{n}}$$

The RMSD captures the average deviation over all corresponding pairs. The iteration stops if the difference in RMSD before and after an iteration falls beneath a user-given threshold.
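Combining the pieces, a compact ICP sketch (mine, not the course code; it reuses the hypothetical pca_align and svd_align above and borrows SciPy's cKDTree for the closest-point queries):

```python
import numpy as np
from scipy.spatial import cKDTree

def icp(source, target, max_iter=50, tol=1e-6):
    """Iterative closest point: PCA initialization, then repeated
    closest-point matching and SVD alignment until RMSD stops improving."""
    moved = pca_align(source, target)       # step 1: initial guess
    tree = cKDTree(target)                  # fast nearest-neighbor lookups
    prev_rmsd = np.inf
    for _ in range(max_iter):
        dists, idx = tree.query(moved)      # step 2: closest target points
        rmsd = np.sqrt(np.mean(dists ** 2))
        if prev_rmsd - rmsd < tol:          # step 3: improvement below threshold
            break
        prev_rmsd = rmsd
        moved = svd_align(moved, target[idx])
    return moved
```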

More Examples

[Figures: two more alignment examples, shown after PCA and after ICP.]