Statistics of Shape: Eigen Shapes, PCA and PGA


Sarang Joshi 10/10/2001 #1 Statistics of Shape: Eigen Shapes, PCA and PGA. Sarang Joshi, Departments of Radiation Oncology, Biomedical Engineering and Computer Science, University of North Carolina at Chapel Hill.

Sarang Joshi 2/16/2003 #2 Study of the Shape of the Hippocampus
Two approaches to study the shape variation of the hippocampus in populations:
- Statistics of deformation fields using Principal Components Analysis
- Statistics of medial descriptions using Lie groups: Principal Geodesic Analysis

Sarang Joshi 2/16/2003 #3 Statistics of Deformation Fields

Sarang Joshi 2/16/2003 #4 Hippocampal Mapping [Figure: atlas and patient hippocampi]

Sarang Joshi 2/16/2003 #5 Hippocampal Mapping [Figure: atlas and subject hippocampi]

Sarang Joshi May 3, 2000 #16 Shape of 2-D Sub-Manifolds of the Brain: Hippocampus
The provisional template hippocampal surface $M_0$ is carried onto the family of targets:

$$M_0 \xrightarrow{h_1} M_1, \quad M_0 \xrightarrow{h_2} M_2, \quad \ldots, \quad M_0 \xrightarrow{h_N} M_N,$$

with the inverse maps $h_1^{-1}, h_2^{-1}, \ldots, h_N^{-1}$ carrying the targets back to $M_0$.
[Figure: the template $M_0$ and its deformed copies $h_1 \circ M_0,\; h_2 \circ M_0,\; \ldots,\; h_N \circ M_0$.]

Sarang Joshi May 3, 2000 #17 Shape of 2-D Sub-Manifolds of the Brain: Hippocampus
The mean transformation and the template representing the entire population:

$$\bar{h} = \frac{1}{N} \sum_{i=1}^{N} h_i, \qquad M_{\text{temp}} = \bar{h} \cdot M_0.$$

[Figure: the mean hippocampus of the population of thirty subjects.]
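As an illustration, here is a minimal numpy sketch of this pointwise averaging, assuming each transformation $h_i$ has been sampled at a common set of template points (the array layout and function name are hypothetical):

```python
import numpy as np

# Hypothetical layout: h_samples[i, k] = h_i(x_k), an (N, P, 3) array of
# mapped positions for N subjects sampled at P template points.
def mean_transformation(h_samples: np.ndarray) -> np.ndarray:
    # h̄(x_k) = (1/N) * sum_i h_i(x_k): the pointwise average map
    return h_samples.mean(axis=0)

# The mean template M_temp is then the template points pushed through h̄.
```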

Sarang Joshi May 3, 2000 #18 Shape of 2-D Sub-Manifolds of the Brain: Hippocampus
Mean hippocampus representing the control population:

$$\bar{h}^{\text{control}} = \frac{1}{N_{\text{control}}} \sum_{i=1}^{N_{\text{control}}} h_i^{\text{control}}, \qquad M_{\text{control}} = \bar{h}^{\text{control}} \cdot M_0.$$

Mean hippocampus representing the schizophrenic population:

$$\bar{h}^{\text{schiz}} = \frac{1}{N_{\text{schiz}}} \sum_{i=1}^{N_{\text{schiz}}} h_i^{\text{schiz}}, \qquad M_{\text{schiz}} = \bar{h}^{\text{schiz}} \cdot M_0.$$

[Figure: mean hippocampi of the control and schizophrenic populations.]

Sarang Joshi May 3, 2000 #19 Gaussian Random Vector Fields on 2-D Sub-Manifolds
The hippocampi $M_i$, $i = 1, \ldots, N$, are deformations of the mean $M_{\text{temp}}$:

$$M_i = \{\, y \mid y = x + u_i(x), \; x \in M_{\text{temp}} \,\}, \qquad u_i(x) = h_i(x) - x, \quad x \in M_{\text{temp}}.$$

[Figure: the vector field $u_i(x)$ shown in red.]
Construct Gaussian random vector fields over sub-manifolds.

Sarang Joshi May 3, 2000 #19 Gaussian Random Vector Fields on 2-D Sub-Manifolds
Let $H(M)$ be the Hilbert space of square-integrable vector fields on $M$, with inner product

$$\langle f, g \rangle = \sum_{i=1}^{3} \int_M f_i(x)\, g_i(x)\, d\nu(x),$$

where $d\nu$ is a measure on the oriented manifold $M$.

Definition 1. The random field $\{U(x), x \in M\}$ is a Gaussian random field on a manifold $M$ with mean $\bar{u} \in H(M)$ and covariance operator $K_U(x, y)$ if, for all $f \in H(M)$, $\langle U, f \rangle$ is normally distributed with mean $m_f = \langle \bar{u}, f \rangle$ and variance $\sigma_f^2 = \langle K_U f, f \rangle$.

A Gaussian field is completely specified by its mean $\bar{u}$ and its covariance operator $K_U(x, y)$. Construct Gaussian random fields as a quadratic-mean limit using a complete $\mathbb{R}^3$-valued orthonormal basis $\{\phi_k, k = 1, 2, \ldots\}$ with $\langle \phi_i, \phi_j \rangle = 0$ for $i \neq j$.

Sarang Joshi May 3, 2000 #20 Gaussian Random Vector Fields on 2-D Sub-Manifolds
Theorem 1. Let $\{U(x), x \in M\}$ be a Gaussian random vector field with mean $m_U \in H$ and covariance $K_U$ of finite trace. There exists a sequence of finite-dimensional Gaussian random vector fields $\{U_n(x)\}$ such that

$$U(x) \stackrel{\text{q.m.}}{=} \lim_{n \to \infty} U_n(x), \qquad U_n(x) = \sum_{k=1}^{n} Z_k(\omega)\, \phi_k(x),$$

where $\{Z_k(\omega), k = 1, \ldots\}$ are independent Gaussian random variables with means $E\{Z_k\} = \mu_k$ and variances $E\{|Z_k|^2\} - (E\{Z_k\})^2 = \sigma_k^2 = \lambda_k$, $\sum_i \lambda_i < \infty$, and $(\phi_k, \lambda_k)$ are the eigenfunctions and eigenvalues of the covariance operator $K_U$:

$$\lambda_i \phi_i(x) = \int_M K_U(x, y)\, \phi_i(y)\, d\nu(y),$$

where $d\nu$ is the measure on the manifold $M$. If $d\nu$, the surface measure on $M_{\text{temp}}$, is atomic around the points $x_k$, then the $\phi_i$ satisfy the system of linear equations

$$\lambda_i \phi_i(x_k) = \sum_{j=1}^{M} \hat{K}_U(x_k, y_j)\, \phi_i(y_j)\, \nu(y_j), \qquad i = 1, \ldots, N,$$

where $\nu(y_j)$ is the surface measure around the point $y_j$.
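In this discretized setting, the eigenfunctions and eigenvalues can be estimated by an eigendecomposition of the empirical covariance of the sampled deformation fields, i.e. by ordinary PCA. A minimal sketch, assuming the fields are sampled at M points with uniform (unit) surface measure; the data layout and function name are hypothetical:

```python
import numpy as np

# Assumed layout: each deformation field u_i is sampled at M template
# points and flattened into a row of U (N subjects x 3M values).
def empirical_eigenfields(U: np.ndarray):
    U0 = U - U.mean(axis=0)              # center at the empirical mean field
    K = U0.T @ U0 / U.shape[0]           # empirical covariance K̂_U
    lam, phi = np.linalg.eigh(K)         # solves λ_i φ_i = K̂_U φ_i
    order = np.argsort(lam)[::-1]        # largest variance first
    return lam[order], phi[:, order]     # eigenvalues, discretized eigenfields
```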

Sarang Joshi May 3, 2000 #21 Eigen Shapes of the Hippocampus
The eigen shapes $E_i$, $i = 1, \ldots, N$, are defined as:

$$E_i = \{\, x + \sqrt{\lambda_i}\, \phi_i(x) : x \in M_{\text{temp}} \,\}.$$

The eigen shapes completely characterize the variation of the sub-manifold in the population.

Sarang Joshi May 3, 2000 #22 Statistical Significance of Shape Difference Between Populations
Assume that $\{u_j^{\text{schiz}}, u_j^{\text{control}}\}$, $j = 1, \ldots, 15$, are realizations of Gaussian processes with means $\bar{u}^{\text{schiz}}$ and $\bar{u}^{\text{control}}$ and common covariance $K_U$. Statistical hypothesis test on shape difference:

$$H_0: \bar{u}^{\text{norm}} = \bar{u}^{\text{schiz}}, \qquad H_1: \bar{u}^{\text{norm}} \neq \bar{u}^{\text{schiz}}.$$

Expand the deformation fields in the eigenfunctions $\phi_i$:

$$u_N^{\text{schiz}(j)}(x) = \sum_{i=1}^{N} Z_i^{\text{schiz}(j)} \phi_i(x), \qquad u_N^{\text{control}(j)}(x) = \sum_{i=1}^{N} Z_i^{\text{control}(j)} \phi_i(x).$$

The coefficient vectors $\{Z_j^{\text{schiz}}, Z_j^{\text{control}},\; j = 1, \ldots, 15\}$ are Gaussian random vectors with means $\bar{Z}^{\text{schiz}}$ and $\bar{Z}^{\text{control}}$ and common covariance $\Sigma$. Hotelling's $T^2$ test:

$$T_N^2 = \frac{M}{2} \left( \hat{Z}^{\text{norm}} - \hat{Z}^{\text{schiz}} \right)^T \hat{\Sigma}^{-1} \left( \hat{Z}^{\text{norm}} - \hat{Z}^{\text{schiz}} \right).$$

N: number of eigenfunctions.

  N    T_N^2      p-value P_N(H_0)
  3    9.8042     0.0471
  4    14.3086    0.0300
  5    14.4012    0.0612
  6    19.6038    0.0401
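A hedged sketch of the two-sample Hotelling $T^2$ computation on the truncated eigen-coefficient vectors, assuming equal group sizes M and the usual F approximation for the p-value (variable names and layout are hypothetical):

```python
import numpy as np
from scipy import stats

def hotelling_t2(Zc: np.ndarray, Zs: np.ndarray):
    """Zc, Zs: (M, N) eigen-coefficient matrices for the two groups."""
    M, N = Zc.shape
    d = Zc.mean(axis=0) - Zs.mean(axis=0)        # difference of group means
    S = (np.cov(Zc.T) + np.cov(Zs.T)) / 2        # pooled covariance (equal M)
    t2 = (M / 2) * d @ np.linalg.solve(S, d)     # T^2 = (M/2) dᵀ Σ̂⁻¹ d
    # F transformation: T^2 (2M - N - 1) / ((2M - 2) N) ~ F(N, 2M - N - 1)
    f = t2 * (2 * M - N - 1) / ((2 * M - 2) * N)
    p = stats.f.sf(f, N, 2 * M - N - 1)
    return t2, p
```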

Sarang Joshi May 3, 2000 #23 Bayesian Classification of Hippocampus Shape Between Populations
Bayesian log-likelihood ratio test: $H_0$: normal hippocampus, $H_1$: schizophrenic hippocampus.

$$\Lambda_N = -\left( Z - \hat{Z}^{\text{schiz}} \right)^{\dagger} \hat{\Sigma}^{-1} \left( Z - \hat{Z}^{\text{schiz}} \right) + \left( Z - \hat{Z}^{\text{norm}} \right)^{\dagger} \hat{\Sigma}^{-1} \left( Z - \hat{Z}^{\text{norm}} \right);$$

decide $H_0$ if $\Lambda_N < 0$ and $H_1$ if $\Lambda_N > 0$. Use the jackknife to estimate the probability of correct classification.
[Figure: log-likelihood ratios for the control and schizophrenic groups.]
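A minimal sketch of this log-likelihood ratio classifier with a leave-one-out (jackknife) estimate of the classification rate, under the same Gaussian coefficient model; the data layout and helper names are hypothetical:

```python
import numpy as np

def llr(z, mu_norm, mu_schiz, Sinv):
    # Λ = -(z - Ẑ_schiz)ᵀ Σ̂⁻¹ (z - Ẑ_schiz) + (z - Ẑ_norm)ᵀ Σ̂⁻¹ (z - Ẑ_norm)
    d0, d1 = z - mu_norm, z - mu_schiz
    return -d1 @ Sinv @ d1 + d0 @ Sinv @ d0

def pooled_cov_inv(A, B):
    # pooled within-group covariance, then a (pseudo-)inverse
    S = ((len(A) - 1) * np.cov(A.T) + (len(B) - 1) * np.cov(B.T)) \
        / (len(A) + len(B) - 2)
    return np.linalg.pinv(S)

def jackknife_rate(Zc, Zs):
    """Zc, Zs: (M, N) eigen-coefficient matrices, control / schiz."""
    correct = 0
    for A, B, is_control in [(Zc, Zs, True), (Zs, Zc, False)]:
        for j in range(len(A)):
            train = np.delete(A, j, axis=0)        # hold out subject j
            Sinv = pooled_cov_inv(train, B)
            mu_a, mu_b = train.mean(0), B.mean(0)
            lam = llr(A[j], mu_a, mu_b, Sinv) if is_control \
                  else llr(A[j], mu_b, mu_a, Sinv)
            # decide H0 (control) when Λ < 0, H1 (schiz) when Λ > 0
            correct += int(lam < 0) if is_control else int(lam > 0)
    return correct / (len(Zc) + len(Zs))
```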

Sarang Joshi 2/16/2003 #6 Statistics of Medial Descriptions
Each figure is a quad mesh of medial atoms:

$$\{\, m_{i,j} : i = 1, \ldots, N, \; j = 1, \ldots, M \,\}, \qquad m_{i,j} = (x_{i,j}, r, F, \theta).$$

The medial atom parameters include angles and rotations. Medial atoms do not form a Hilbert space, so eigen shapes cannot be used for their statistical characterization!

Sarang Joshi 2/16/2003 #7 Statistics of Medial Descriptions
The set of all medial atoms forms a Lie group:

$$m = (x, r, F, \theta) \in \mathbb{R}^3 \times \mathbb{R}^+ \times SO(3) \times SO(2)$$

- $\mathbb{R}^3$: position $x$
- $\mathbb{R}^+$: radius $r$
- $SO(3)$: frame $F$
- $SO(2)$: object angle $\theta$
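For concreteness, here is a hypothetical container mirroring this product-group structure, with the group operation acting componentwise (vector addition, scalar multiplication, matrix product, angle addition); it is a sketch, not the slides' actual representation:

```python
import numpy as np
from dataclasses import dataclass

@dataclass
class MedialAtom:
    """Hypothetical container for m = (x, r, F, θ) in R^3 x R^+ x SO(3) x SO(2)."""
    x: np.ndarray      # position in R^3
    r: float           # radius, r > 0
    F: np.ndarray      # 3x3 rotation matrix (frame) in SO(3)
    theta: float       # object angle, representing SO(2)

    def compose(self, other: "MedialAtom") -> "MedialAtom":
        # Componentwise group operation on the product group
        return MedialAtom(self.x + other.x,
                          self.r * other.r,
                          self.F @ other.F,
                          (self.theta + other.theta) % (2 * np.pi))
```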

Sarang Joshi 2/16/2003 #8 Lie Groups
A Lie group is a group $G$ which is also a differentiable manifold, where the group operations are differentiable maps:

$$\mu : G \times G \to G, \quad \mu(x, y) = xy; \qquad \iota : G \to G, \quad \iota(x) = x^{-1}.$$

Both composition and inversion are differentiable maps. Examples:
- $(\mathbb{R}^3, +)$: $\mu(x, y) = x + y$, $x^{-1} = -x$
- Multiplicative reals $(\mathbb{R}^+, \cdot)$: $\mu(x, y) = xy$, $x^{-1} = 1/x$
- $SO(3)$, $SO(2)$: the orthogonal matrix groups

Sarang Joshi 2/16/2003 #9 Lie Group Means
The algebraic mean is not defined on Lie groups. Use a geometric definition: the Riemannian distance is well defined on a manifold. Given $N$ medial atoms $\{m_i : i = 1, \ldots, N\}$, the mean is defined as the group element that minimizes the average squared distance to the data:

$$\bar{m} = \arg\min_{m} \frac{1}{N} \sum_{i=1}^{N} d(m, m_i)^2.$$

There is no closed-form solution; Lie-group optimization techniques are needed, as in the sketch below.
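One standard iterative scheme is a gradient descent that alternates the log map into the tangent space at the current estimate with an exponential-map update. This sketch handles only the $SO(3)$ component of a medial atom (the other components average within their own groups) and is an illustrative assumption, not the slides' specific algorithm:

```python
import numpy as np
from scipy.linalg import expm, logm

def karcher_mean_so3(Rs, iters=100, tol=1e-10):
    """Rs: list of 3x3 rotation matrices; returns their intrinsic mean."""
    mu = Rs[0]                                   # initialize at a data point
    for _ in range(iters):
        # average the log maps of the data in the tangent space at mu
        delta = sum(np.real(logm(mu.T @ R)) for R in Rs) / len(Rs)
        mu = mu @ expm(delta)                    # geodesic step toward the mean
        if np.linalg.norm(delta) < tol:          # converged: gradient ~ 0
            break
    return mu
```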

Sarang Joshi 2/16/2003 #10 Geodesic Curves
The medial manifold is curved, and hence has no straight lines. Distance-minimizing geodesic curves are the analogue of straight lines in Euclidean space. Geodesics in Lie groups are given by the exponential map:

$$g(t) = \exp(tA).$$

Geodesics through the identity are one-parameter subgroups, analogous to one-dimensional subspaces of $\mathbb{R}^N$.
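For example, with a skew-symmetric generator $A \in \mathfrak{so}(3)$, $\exp(tA)$ traces out a one-parameter subgroup of rotations; a small sketch:

```python
import numpy as np
from scipy.linalg import expm

# Skew-symmetric generator A in so(3): exp(tA) is a rotation about the z-axis.
A = np.array([[0., -1., 0.],
              [1.,  0., 0.],
              [0.,  0., 0.]])

for t in (0.0, 0.5, 1.0):
    g_t = expm(t * A)                            # point on the geodesic at t
    assert np.allclose(g_t @ g_t.T, np.eye(3))   # g(t) stays in SO(3)
```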

Sarang Joshi 2/16/2003 #11 Principal Geodesics
Since the set of all medial atoms is a curved manifold, linear PCA is likewise not defined. Principal geodesics are defined as the geodesics that minimize the residual distance to the data. There is no closed-form solution: nonlinear optimization is needed.
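In practice, principal geodesic analysis is often approximated by mapping the data into the tangent space at the intrinsic mean via the log map and running linear PCA there. A sketch of this linearized approximation for the $SO(3)$ component; it is an approximation to, not the exact, residual-distance minimization:

```python
import numpy as np
from scipy.linalg import logm

def tangent_pga_so3(Rs, mu):
    """Rs: rotation matrices; mu: their intrinsic mean (e.g. Karcher mean)."""
    # Each log map gives a skew-symmetric matrix; keep its 3 free entries
    # (the axis-angle vector) as a tangent vector at mu.
    V = np.array([np.real(logm(mu.T @ R))[[2, 0, 1], [1, 2, 0]] for R in Rs])
    V -= V.mean(axis=0)                          # center in the tangent space
    lam, W = np.linalg.eigh(V.T @ V / len(Rs))   # PCA of tangent vectors
    order = np.argsort(lam)[::-1]
    return lam[order], W[:, order]               # variances, principal directions
```

Exponentiating each principal direction back to the group, $g_i(t) = \mu \exp(t W_i)$, yields the approximate principal geodesics.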