Extrinsic Antimean and Bootstrap


Extrinsic Antimean and Bootstrap
Yunfan Wang, Department of Statistics, Florida State University
April 12, 2018

Outline

1 Introduction: Distance; Fréchet Function; Extrinsic Mean and Antimean
2 Kendall Shape Spaces: Extrinsic Antimean on the Complex Projective Space
3 Application: Computational Example; Results


Two Types of Distance

In statistics one mainly considers two types of distances on a manifold M:

1. A geodesic distance (arc distance): the Riemannian distance ρ_g associated with a Riemannian structure g on M.
2. A chord distance: the distance ρ_j induced by the Euclidean distance on R^N via an embedding j : M → R^N, given by ρ_j(p, q) = ||j(q) − j(p)||_0.

An intrinsic data analysis on a manifold is a statistical analysis of a probability measure using geodesic-distance-based statistics. An extrinsic data analysis is a statistical analysis based on chord-distance-based statistics.
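The two distances can be contrasted on the simplest compact manifold, the unit circle; a minimal sketch (the function names are illustrative, not from the slides):

```python
import numpy as np

def geodesic_distance(theta_p, theta_q):
    """Intrinsic (arc-length) distance between two points on the unit
    circle S^1, each given by its angle."""
    d = abs(theta_p - theta_q) % (2 * np.pi)
    return min(d, 2 * np.pi - d)

def chord_distance(theta_p, theta_q):
    """Extrinsic (chord) distance induced by the inclusion
    j : S^1 -> R^2, j(theta) = (cos theta, sin theta)."""
    jp = np.array([np.cos(theta_p), np.sin(theta_p)])
    jq = np.array([np.cos(theta_q), np.sin(theta_q)])
    return np.linalg.norm(jq - jp)

# Antipodal points: geodesic distance pi, chord distance 2 (the diameter);
# the chord never exceeds the arc.
print(geodesic_distance(0.0, np.pi), chord_distance(0.0, np.pi))
```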

Embedding

There are infinitely many embeddings of RP^m in a Euclidean space. When considering the sample mean and sample antimean, it is preferred to use an embedding j that is compatible with the transitive group actions of SO(m + 1) on RP^m and on j(RP^m), that is,

j(T · [x]) = T · j([x]), for all T ∈ SO(m + 1), [x] ∈ RP^m, (1)

where T · [x] = [Tx]. Such an embedding is said to be equivariant. The equivariant embedding of RP^m used so far in the axial data analysis literature is the Veronese-Whitney (VW) embedding j : RP^m → Sym+(m + 1, R), which associates to an axis the matrix of the orthogonal projection onto that axis:

j([x]) = xx^T, ||x|| = 1. (2)
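The equivariance (1) can be checked numerically: under the VW embedding, SO(m + 1) acts on Sym+(m + 1, R) by conjugation, T · A = T A T^T. A minimal sketch (helper name is illustrative):

```python
import numpy as np

def vw_embed(x):
    """Veronese-Whitney embedding of an axis [x] in RP^m: the orthogonal
    projection matrix x x^T onto the line spanned by x, per (2)."""
    x = x / np.linalg.norm(x)
    return np.outer(x, x)

# Equivariance check: j(T[x]) = T j([x]) T^T for T in SO(3).
rng = np.random.default_rng(0)
x = rng.normal(size=3)
T, _ = np.linalg.qr(rng.normal(size=(3, 3)))
if np.linalg.det(T) < 0:        # force determinant +1 so T is in SO(3)
    T[:, 0] = -T[:, 0]
lhs = vw_embed(T @ x)
rhs = T @ vw_embed(x) @ T.T
print(np.allclose(lhs, rhs))    # True

# Well-definedness on axes: [x] = [-x] gives the same projection matrix.
print(np.allclose(vw_embed(x), vw_embed(-x)))   # True
```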

Fréchet Function

Consider an object space provided with the chord distance associated with an embedding of the object space into a numerical space; the statistical analysis performed relative to a chord distance is termed extrinsic data analysis. The expected squared distance from the random object X to an arbitrary point p defines the Fréchet function associated with X:

F(p) = E(ρ²(p, X)), (3)

and its minimizers form the Fréchet mean set. When ρ is the chord distance on M induced by the Euclidean distance in R^N via an embedding j : M → R^N, the Fréchet function becomes

F(p) = ∫_M ||j(x) − j(p)||²_0 Q(dx), (4)

where Q = P_X is the probability measure on M associated with X.
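For an empirical distribution the integral in (4) is an average of squared chord distances; a minimal sketch for axial data under the VW embedding (helper names are illustrative):

```python
import numpy as np

def vw_embed_axis(x):
    """VW embedding of an axis [x] in RP^m: the projection matrix x x^T."""
    x = x / np.linalg.norm(x)
    return np.outer(x, x)

def frechet_function(p, sample, embed):
    """Empirical version of (4): average squared chord distance (Frobenius
    norm in the embedding space) from p to the sample points."""
    jp = embed(p)
    return float(np.mean([np.linalg.norm(embed(x) - jp) ** 2 for x in sample]))

# A degenerate sample concentrated at the axis [e_1]: the Fréchet function
# vanishes at e_1 and is positive elsewhere.
e1 = np.array([1.0, 0.0, 0.0])
e2 = np.array([0.0, 1.0, 0.0])
sample = [e1, e1, e1]
print(frechet_function(e1, sample, vw_embed_axis))   # 0
print(frechet_function(e2, sample, vw_embed_axis))   # 2
```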

Extrinsic Mean and Antimean

In this case the Fréchet mean set is called the extrinsic mean set, and if the extrinsic mean set of X has a unique point, that point is the extrinsic mean of X, labeled μ_E(X) or simply μ_E. Also, given X_1, ..., X_n i.i.d. random variables from Q, their extrinsic sample mean (set) is the extrinsic mean (set) of the empirical distribution Q̂_n = (1/n) Σ_{i=1}^n δ_{X_i}.

We now introduce a new location parameter for X.

Definition. The set of maximizers of the Fréchet function is called the extrinsic antimean set. In case the extrinsic antimean set has only one point, that point is called the extrinsic antimean of X, and is labeled αμ_{j,E}(Q), or simply αμ_E when j is understood.

Focal and Nonfocal

Remark. Let (M, ρ_0) be a compact metric space, where ρ_0 is the chord distance via the embedding j : M → R^N, that is,

ρ_0(p_1, p_2) = ||j(p_1) − j(p_2)||_0 = d_0(j(p_1), j(p_2)), (p_1, p_2) ∈ M²,

where d_0 is the Euclidean distance in R^N. Recall that a point y ∈ R^N for which there is a unique point p ∈ M satisfying

d_0(y, j(M)) = inf_{x ∈ M} ||y − j(x)||_0 = d_0(y, j(p))

is called j-nonfocal. A point which is not j-nonfocal is said to be j-focal. If y is a j-nonfocal point, its projection on j(M) is the unique point j(p) = P_j(y) ∈ j(M) with d_0(y, j(M)) = d_0(y, j(p)).

Focal and Nonfocal, cont.

Definition.
(a) A point y ∈ R^N for which there is a unique point p ∈ M satisfying

sup_{x ∈ M} ||y − j(x)||_0 = d_0(y, j(p))

is called αj-nonfocal. A point which is not αj-nonfocal is said to be αj-focal.
(b) If y is an αj-nonfocal point, its farthest projection on j(M) is the unique point z = P_{F,j}(y) ∈ j(M) with sup_{x ∈ M} ||y − j(x)||_0 = d_0(y, z).

Definition. A probability distribution Q on M is said to be αj-nonfocal if the mean μ of j(Q) is αj-nonfocal.

Extrinsic Mean

The extrinsic mean μ_{j,E}(Q) of a j-nonfocal probability measure Q on M w.r.t. an embedding j, when it exists, is given by μ_{j,E}(Q) = j^{−1}(P_j(μ)), where μ is the mean of j(Q). For the VW embedding, the focal set F is the set of matrices in Sym+(m + 1, R) whose largest eigenvalue has multiplicity at least 2. The projection P_j assigns to each nonnegative definite symmetric matrix A with largest eigenvalue of multiplicity 1 the matrix νν^T, where ν is a unit eigenvector of A corresponding to its largest eigenvalue. Furthermore, the VW mean of a random object [X] ∈ RP^m, X^T X = 1, is given by μ_{j,E}(Q) = [γ(m + 1)], where (λ(a), γ(a)), a = 1, ..., m + 1, are the eigenvalue and unit eigenvector pairs of E(XX^T), with eigenvalues in increasing order.
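The sample version of this projection is a single symmetric eigen-decomposition; a minimal sketch (function name illustrative):

```python
import numpy as np

def vw_sample_mean(X):
    """VW extrinsic sample mean of axes x_1, ..., x_n (rows of X, unit
    vectors): the axis of a unit eigenvector for the LARGEST eigenvalue
    of J = (1/n) sum_i x_i x_i^T."""
    J = X.T @ X / X.shape[0]          # average of the x_i x_i^T
    _, eigvecs = np.linalg.eigh(J)    # eigenvalues in increasing order
    return eigvecs[:, -1]             # top eigenvector = mean axis

# Axes clustered around e_1 in the x-y plane (signs are irrelevant for axes).
t = 0.1
X = np.array([[1.0, 0.0, 0.0],
              [np.cos(t),  np.sin(t), 0.0],
              [-np.cos(t), np.sin(t), 0.0]])
m = vw_sample_mean(X)
print(abs(m[0]))   # close to 1: the mean axis is essentially [e_1]
```

Note that `np.linalg.eigh` returns eigenvalues in ascending order, so the last eigenvector corresponds to the largest eigenvalue.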

Extrinsic Antimean

Proposition.
(i) The set of αVW-nonfocal points in Sym+(m + 1, R) is the set of matrices in Sym+(m + 1, R) whose smallest eigenvalue has multiplicity 1.
(ii) The farthest projection P_{F,j} : (αF)^c → j(RP^m) assigns to each nonnegative definite symmetric matrix A whose smallest eigenvalue has multiplicity 1 the matrix j([ν]), where ||ν|| = 1 and ν is an eigenvector of A corresponding to that smallest eigenvalue.

Extrinsic Antimean, cont.

Proposition. Let Q be a distribution on RP^m. The VW antimean set of a random object [X], X^T X = 1, on RP^m is the set of points p = [ν], ν ∈ V_1, where V_1 is the eigenspace corresponding to the smallest eigenvalue λ(1) of E(XX^T). If in addition Q = P_[X] is αVW-nonfocal, then

αμ_{j,E}(Q) = j^{−1}(P_{F,j}(μ)) = [γ(1)],

where (λ(a), γ(a)), a = 1, ..., m + 1, are the eigenvalues in increasing order and the corresponding unit eigenvectors of μ = E(XX^T).

Extrinsic Antimean, cont.

Proposition. Let x_1, ..., x_n be random observations from a distribution Q on RP^m, such that the sample mean of the embedded points, j(x)‾ = (1/n) Σ_{i=1}^n j(x_i), is αVW-nonfocal. Then the VW sample antimean of x_1, ..., x_n is given by

a x̄_{j,E} = j^{−1}(P_{F,j}(j(x)‾)) = [g(1)],

where (d(a), g(a)) are the eigenvalues in increasing order and the corresponding unit eigenvectors of J = (1/n) Σ_{i=1}^n x_i x_i^T.
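The sample antimean is the same eigen-decomposition as the sample mean, taking the bottom eigenvector instead of the top; a minimal sketch (function name illustrative):

```python
import numpy as np

def vw_sample_antimean(X):
    """VW sample antimean of axes x_1, ..., x_n (rows of X, unit vectors):
    the axis of a unit eigenvector for the SMALLEST eigenvalue of
    J = (1/n) sum_i x_i x_i^T."""
    J = X.T @ X / X.shape[0]
    _, eigvecs = np.linalg.eigh(J)    # ascending eigenvalues
    return eigvecs[:, 0]              # bottom eigenvector = antimean axis

# For axes lying in the x-y plane, the farthest axis (the antimean) is [e_3].
t = 0.1
X = np.array([[1.0, 0.0, 0.0],
              [np.cos(t),  np.sin(t), 0.0],
              [-np.cos(t), np.sin(t), 0.0]])
am = vw_sample_antimean(X)
print(abs(am[2]))   # close to 1
```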


Kendall Shape Spaces

We are concerned with a landmark-based nonparametric analysis of similarity shape data. For landmark-based shape data, one considers a k-ad x = (x_1, ..., x_k) ∈ (R^m)^k, which consists of k labeled points in R^m representing the coordinates of landmarks. G is the group of direct similarities of R^m. A similarity is a function f : R^m → R^m that uniformly dilates distances, that is, for which there is a K > 0 such that ||f(x) − f(y)|| = K ||x − y|| for all x, y ∈ R^m. A direct similarity is given by f(x) = Ax + b, A^T A = cI_m, c > 0, where A has positive determinant. Under composition, direct similarities form the group of direct similarities. The sample spaces considered here are called the shape spaces of k-ads in R^m.

Kendall Shape Spaces, cont.

Direct Similarity (Kendall) Shape: the geometrical information that remains when location, scale and rotational effects are filtered out from a k-ad. Two k-ads (z_1, z_2, ..., z_k) and (z'_1, z'_2, ..., z'_k) are said to have the same shape if there is a direct similarity T in the plane, that is, a composition of a rotation, a translation and a homothety, such that T(z_j) = z'_j for j = 1, ..., k. Having the same shape is an equivalence relation on the space of planar k-ads, and the set of all equivalence classes of k-ads is called the planar shape space of k-ads, denoted Σ_2^k.

Kendall Shape Spaces, cont.

Without loss of generality one may assume that two k-ads having the same shape also have the same center of mass, that is, Σz_j = Σz'_j = 0; then they have the same shape if there is a transformation T which keeps the origin fixed and is a rotation followed by a homothety, such that T(z_j) = z'_j for j = 1, ..., k − 1. Such a transformation T is determined by a nonzero complex number; that is to say, two k-ads with center of mass 0 have the same shape if there is a z ∈ C\{0} such that z·z_j = z'_j for j = 1, ..., k − 1. Thus the shape equivalence class of a planar k-ad is uniquely determined by a point in CP^{k−2}; that is to say, Σ_2^k is identified with CP^{k−2}.
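The normalization above (center at 0, scale to unit norm) can be sketched directly, representing a planar k-ad as a complex vector; the residual S^1 ambiguity is exactly the rotation, so the result is a point of CP^{k−2} (helper name illustrative):

```python
import numpy as np

def preshape(z):
    """Map a planar k-ad (complex k-vector) to a unit-norm representative
    of its shape: remove translation by centering, then remove scale.
    Two k-ads have the same Kendall shape iff their preshapes differ by a
    unit-modulus complex scalar (a rotation)."""
    z = np.asarray(z, dtype=complex)
    z = z - z.mean()                 # center of mass at 0
    return z / np.linalg.norm(z)     # unit norm

# A rotated, scaled, translated copy of a square has the same shape:
z = np.array([0, 1, 1 + 1j, 1j])            # a unit square, k = 4
w = (2 * np.exp(0.7j)) * z + (3 - 2j)       # similar copy of z
p, q = preshape(z), preshape(w)
# [p] = [q] in CP^{k-2} exactly when |<p, q>| = 1:
print(abs(np.vdot(p, q)))   # close to 1
```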

Embedding

The Procrustean distance, in our terminology, is the distance induced by the Euclidean distance on CP^{k−2} via the quadratic Veronese-Whitney embedding into a unit sphere of the linear space S(k − 1, C) of self-adjoint complex matrices of order k − 1. In order to define j : CP^{k−2} → S(k − 1, C), it is useful to note that CP^{k−2} = S^{2k−3}/S^1, where S^{2k−3} is the space of complex vectors in C^{k−1} of norm 1, and the equivalence relation on S^{2k−3} is multiplication by scalars in S^1 (complex numbers of modulus 1). If z = (z_1, z_2, ..., z_{k−1}) is in S^{2k−3}, we denote by [z] the equivalence class of z in CP^{k−2}.

Embedding, cont.

The Veronese-Whitney (or simply Veronese) map is in this case j([z]) = zz*, where, if z is considered as a column vector, z* is the adjoint of z, that is, the conjugate of the transpose of z. The Euclidean distance in the space of Hermitian matrices S(k − 1, C) is

d_0²(A, B) = Tr((A − B)(A − B)*) = Tr((A − B)²).

Definition. Let L_k = {(Z_1, ..., Z_k) ∈ C^k : Z_1 + ... + Z_k = 0}.

Theorem. The Kendall planar shape space Σ_2^k can be identified with the complex projective space P(L_k). Moreover, since L_k has complex dimension k − 1, P(L_k) ≅ P(C^{k−1}) = CP^{k−2}; therefore Kendall planar shape analysis is data analysis on the complex projective space CP^{k−2}.

Extrinsic Mean on the Complex Projective Space

Theorem. Let Q be a probability distribution on CP^{k−2} and let {[Z_r], ||Z_r|| = 1}_{r=1,...,n} be a random sample from Q.
(a) Q is nonfocal iff the largest eigenvalue λ of E[Z_1 Z_1*] is simple, and in this case μ_E(Q) = [m], where m is an eigenvector of E[Z_1 Z_1*] corresponding to λ, with ||m|| = 1.
(b) The extrinsic sample mean is X̄_E = [m], where m is a unit eigenvector of (1/n) Σ_{i=1}^n Z_i Z_i*, ||Z_i|| = 1, i = 1, ..., n, corresponding to the largest eigenvalue.

Extrinsic Antimean on the Complex Projective Space

Theorem. Let Q be a probability distribution on CP^{k−2} and let {[Z_r], ||Z_r|| = 1}_{r=1,...,n} be a random sample from Q.
(a) Q is αnonfocal iff the smallest eigenvalue λ of E[Z_1 Z_1*] is simple, and in this case αμ_E(Q) = [m], where m is an eigenvector of E[Z_1 Z_1*] corresponding to λ, with ||m|| = 1.
(b) The extrinsic sample antimean is αX̄_E = [m], where m is a unit eigenvector of (1/n) Σ_{i=1}^n Z_i Z_i*, ||Z_i|| = 1, i = 1, ..., n, corresponding to the smallest eigenvalue.
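Both theorems reduce to one Hermitian eigen-decomposition of the sample matrix J = (1/n) Σ_r Z_r Z_r*; a minimal sketch on C^2, i.e. shapes of planar triangles (function name illustrative):

```python
import numpy as np

def vw_mean_antimean_cp(Z):
    """VW sample mean and antimean on CP^{k-2}: unit eigenvectors of
    J = (1/n) sum_r Z_r Z_r^* for the largest / smallest eigenvalue.
    Rows of Z are the unit complex vectors Z_r."""
    J = Z.T @ Z.conj() / Z.shape[0]    # J is Hermitian
    _, eigvecs = np.linalg.eigh(J)     # ascending eigenvalues
    return eigvecs[:, -1], eigvecs[:, 0]

# Sample in C^2 concentrated near [(1, 0)]; the third point is a unit-scalar
# multiple of the second, hence the same projective point.
Z = np.array([[1.0, 0.0],
              [np.sqrt(0.99), 0.1j],
              [np.sqrt(0.99) * 1j, -0.1]])
mean, antimean = vw_mean_antimean_cp(Z)
print(abs(mean[0]), abs(antimean[1]))   # both close to 1
```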


Computational Example

We use the VW embedding of the complex projective space (Kendall shape space) to compare VW means and VW antimeans for a configuration of landmarks on the midface in a population of normal children, based on a study of growth measured from X-rays at 8 and 14 years of age.
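The bootstrap displays on the following slides resample the observed shapes with replacement and recompute the extrinsic sample statistic on each resample. A minimal nonparametric bootstrap sketch, shown here for the VW sample mean of axial data (the details of the slides' own computation are not given, so names and setup are illustrative):

```python
import numpy as np

def vw_axis_mean(X):
    """VW sample mean axis: top unit eigenvector of (1/n) sum_i x_i x_i^T."""
    _, vecs = np.linalg.eigh(X.T @ X / X.shape[0])
    return vecs[:, -1]

def bootstrap_vw_means(X, n_boot=300, seed=0):
    """Nonparametric bootstrap: resample the n axes with replacement and
    recompute the VW sample mean for each resample."""
    rng = np.random.default_rng(seed)
    n = X.shape[0]
    return np.array([vw_axis_mean(X[rng.integers(0, n, size=n)])
                     for _ in range(n_boot)])

# Concentrated axial sample around e_1; bootstrap means cluster near [e_1],
# which is what the bootstrap plots of the sample mean visualize.
rng = np.random.default_rng(1)
X = rng.normal(size=(50, 3)) * 0.1
X[:, 0] += 1.0
X /= np.linalg.norm(X, axis=1, keepdims=True)
boot = bootstrap_vw_means(X)
print(np.min(np.abs(boot[:, 0])))   # typically close to 1
```

The same loop with the smallest eigenvector gives bootstrap replicates of the sample antimean.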

Coordinates of Skull Image

Figure: The landmark coordinates for the children's skulls

Extrinsic Sample Mean

Figure: Extrinsic Sample Mean

Bootstraps of Extrinsic Sample Mean

Figure: Bootstraps of Extrinsic Sample Mean

Extrinsic Sample Antimean

Figure: Extrinsic Sample Antimean

Bootstraps of Extrinsic Sample Antimean

Figure: Bootstraps for Extrinsic Sample Antimean