Extrinsic Antimean and Bootstrap


1 Extrinsic Antimean and Bootstrap
Yunfan Wang
Department of Statistics, Florida State University
April 12, 2018

2 Outline
1 Introduction: Distance; Fréchet Function; Extrinsic Mean and Antimean
2 Kendall Shape Spaces: Extrinsic Antimean on the Complex Projective Space
3 Application: Computational Example; Results


4 Two Types of Distance
In statistics one mainly considers two types of distances on a manifold $M$:
1 A geodesic distance (arc distance), that is, the Riemannian distance $\rho_g$ associated with a Riemannian structure $g$ on $M$.
2 A chord distance, that is, the distance $\rho_j$ induced by the Euclidean distance on $\mathbb{R}^N$ via an embedding $j : M \to \mathbb{R}^N$, given by $\rho_j(p, q) = \|j(q) - j(p)\|_0$.
An intrinsic data analysis on a manifold is a statistical analysis of a probability measure using geodesic-distance-based statistics. An extrinsic data analysis is a statistical analysis based on chord-distance-based statistics.
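To make the distinction concrete, here is a minimal sketch (not from the slides) comparing the two distances on the unit sphere $S^2$, taking $j$ to be the inclusion of $S^2$ into $\mathbb{R}^3$; the function names are illustrative only.

```python
import numpy as np

# Minimal sketch (not from the slides): on the unit sphere S^2, embedded in R^3
# by the inclusion map j(p) = p, the geodesic (arc) distance is the angle
# between points, while the chord distance is the Euclidean distance between them.

def geodesic_distance(p, q):
    """Arc distance on the unit sphere: rho_g(p, q) = arccos(<p, q>)."""
    return np.arccos(np.clip(np.dot(p, q), -1.0, 1.0))

def chord_distance(p, q):
    """Chord distance induced by the embedding: rho_j(p, q) = ||j(q) - j(p)||."""
    return np.linalg.norm(q - p)

p = np.array([1.0, 0.0, 0.0])
q = np.array([0.0, 1.0, 0.0])
print(geodesic_distance(p, q))   # pi/2 = 1.5708...
print(chord_distance(p, q))      # sqrt(2) = 1.4142...
```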

5 Embedding
There are infinitely many embeddings of $\mathbb{RP}^m$ in a Euclidean space. When considering the sample mean and sample antimean, it is preferred to use an embedding $j$ that is compatible with the two transitive group actions of $SO(m+1)$ on $\mathbb{RP}^m$ and on $j(\mathbb{RP}^m)$ respectively, that is,
$$j(T \cdot [x]) = T \cdot j([x]), \quad \forall T \in SO(m+1), \ [x] \in \mathbb{RP}^m, \quad (1)$$
where $T \cdot [x] = [Tx]$. Such an embedding is said to be equivariant.
The equivariant embedding of $\mathbb{RP}^m$ that has been used so far in the axial data analysis literature is the Veronese-Whitney (VW) embedding $j : \mathbb{RP}^m \to \mathrm{Sym}^+(m+1, \mathbb{R})$, which associates to an axis the matrix of the orthogonal projection on this axis:
$$j([x]) = x x^T, \quad \|x\| = 1. \quad (2)$$
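As a rough illustration of the VW embedding and its equivariance property (1), here is a hedged sketch; `vw_embedding` is a hypothetical helper name, not code from the presentation.

```python
import numpy as np

# Hedged sketch of the VW embedding j([x]) = x x^T of RP^m into Sym^+(m+1, R);
# `vw_embedding` is an illustrative helper, not code from the presentation.

def vw_embedding(x):
    """Map the axis [x], represented by a vector x, to the projection matrix x x^T."""
    x = np.asarray(x, dtype=float)
    x = x / np.linalg.norm(x)            # use a unit representative of the axis
    return np.outer(x, x)

# Equivariance check: j(T[x]) = T j([x]) T^T, with SO(m+1) acting on symmetric
# matrices by conjugation.
rng = np.random.default_rng(0)
x = rng.normal(size=4)                         # a representative of an axis in RP^3
T, _ = np.linalg.qr(rng.normal(size=(4, 4)))   # an orthogonal matrix
assert np.allclose(vw_embedding(T @ x), T @ vw_embedding(x) @ T.T)
```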

6 Fréchet Function
Consider object spaces equipped with a chord distance associated with the embedding of the object space into a numerical space; the statistical analysis performed relative to a chord distance is termed extrinsic data analysis.
The expected squared distance from the random object $X$ to an arbitrary point $p$ defines what we call the Fréchet function associated with $X$:
$$F(p) = E(\rho^2(p, X)), \quad (3)$$
and its minimizers form the Fréchet mean set. In the case when $\rho$ is the chord distance on $M$ induced by the Euclidean distance in $\mathbb{R}^N$ via an embedding $j : M \to \mathbb{R}^N$, the Fréchet function becomes
$$F(p) = \int_M \|j(x) - j(p)\|_0^2 \, Q(dx), \quad (4)$$
where $Q = P_X$ is the probability measure on $M$ associated with $X$.
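For a sample, the Fréchet function in (4) is replaced by an empirical average of squared chord distances. Below is a minimal sketch (assumed, not from the slides) for the VW-embedded projective space; the minimizers of this empirical function form the extrinsic sample mean set and its maximizers the extrinsic sample antimean set.

```python
import numpy as np

# A minimal sketch (assumed, not from the slides) of the empirical version of (4)
# for the VW-embedded projective space: average the squared chord distances
# ||j(x_i) - j(p)||^2 between a candidate axis p and the sampled axes x_1, ..., x_n.

def vw_embedding(x):
    x = np.asarray(x, dtype=float)
    x = x / np.linalg.norm(x)
    return np.outer(x, x)

def empirical_frechet(p, sample):
    """F_n(p) = (1/n) sum_i ||j(x_i) - j(p)||_F^2 (Frobenius norm)."""
    jp = vw_embedding(p)
    return np.mean([np.linalg.norm(vw_embedding(x) - jp, 'fro') ** 2 for x in sample])

# Minimizers of F_n form the extrinsic sample mean set; maximizers form the
# extrinsic sample antimean set.
```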

7 Extrinsic Mean and Antimean
In this case the Fréchet mean set is called the extrinsic mean set, and if the extrinsic mean set of $X$ consists of a unique point, this point is the extrinsic mean of $X$, labeled $\mu_E(X)$ or simply $\mu_E$. Also, given $X_1, \dots, X_n$ i.i.d. random variables from $Q$, their extrinsic sample mean (set) is the extrinsic mean (set) of the empirical distribution $\hat{Q}_n = \frac{1}{n}\sum_{i=1}^n \delta_{X_i}$.
We now introduce a new location parameter for $X$.
Definition. The set of maximizers of the Fréchet function is called the extrinsic antimean set. In case the extrinsic antimean set has one point only, that point is called the extrinsic antimean of $X$, and is labeled $\alpha\mu_{j,E}(Q)$, or simply $\alpha\mu_E$ when $j$ is known.

8 Focal and Non-focal
Remark. Let $(M, \rho_0)$ be a compact metric space, where $\rho_0$ is the chord distance via the embedding $j : M \to \mathbb{R}^N$, that is,
$$\rho_0(p_1, p_2) = \|j(p_1) - j(p_2)\|_0 = d_0(j(p_1), j(p_2)), \quad \forall (p_1, p_2) \in M^2,$$
where $d_0$ is the Euclidean distance in $\mathbb{R}^N$. Recall that a point $y \in \mathbb{R}^N$ for which there is a unique point $p \in M$ satisfying the equality
$$d_0(y, j(M)) = \inf_{x \in M} \|y - j(x)\|_0 = d_0(y, j(p))$$
is called $j$-nonfocal. A point which is not $j$-nonfocal is said to be $j$-focal. If $y$ is a $j$-nonfocal point, its projection on $j(M)$ is the unique point $j(p) = P_j(y) \in j(M)$ with $d_0(y, j(M)) = d_0(y, j(p))$.

9 Focal and Non-focal (cont.)
Definition. (a) A point $y \in \mathbb{R}^N$ for which there is a unique point $p \in M$ satisfying the equality
$$\sup_{x \in M} \|y - j(x)\|_0 = d_0(y, j(p))$$
is called $\alpha j$-nonfocal. A point which is not $\alpha j$-nonfocal is said to be $\alpha j$-focal.
(b) If $y$ is an $\alpha j$-nonfocal point, its farthest projection on $j(M)$ is the unique point $z = P_{F,j}(y) \in j(M)$ with $\sup_{x \in M} \|y - j(x)\|_0 = d_0(y, z)$.
Definition. A probability distribution $Q$ on $M$ is said to be $\alpha j$-nonfocal if the mean $\mu$ of $j(Q)$ is $\alpha j$-nonfocal.

10 Extrinsic Mean
The extrinsic mean $\mu_{j,E}(Q)$ of a $j$-nonfocal probability measure $Q$ on $M$ w.r.t. an embedding $j$, when it exists, is given by $\mu_{j,E}(Q) = j^{-1}(P_j(\mu))$, where $\mu$ is the mean of $j(Q)$.
For the VW embedding, the focal set $F$ is the set of matrices in $\mathrm{Sym}^+(m+1, \mathbb{R})$ whose largest eigenvalue has multiplicity at least 2. The projection $P_j$ assigns to each nonnegative definite symmetric matrix $A$ with highest eigenvalue of multiplicity 1 the matrix $vv^T$, where $v$ is a unit eigenvector of $A$ corresponding to its largest eigenvalue.
Furthermore, the VW mean of a random object $[X] \in \mathbb{RP}^m$, $X^T X = 1$, is given by $\mu_{j,E}(Q) = [\gamma(m+1)]$, where $(\lambda(a), \gamma(a))$, $a = 1, \dots, m+1$, are the eigenvalues in increasing order and corresponding unit eigenvectors of $\mu = E[XX^T]$.
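A hedged sketch of the corresponding sample computation (an assumed implementation, not the presenter's code): the embedded sample is averaged and a unit eigenvector of the largest eigenvalue, assumed simple, represents the VW sample mean axis.

```python
import numpy as np

# Hedged sketch (assumed implementation) of the VW sample mean on RP^m: average
# the embedded sample, J = (1/n) sum_i x_i x_i^T, and take a unit eigenvector of
# its largest eigenvalue, assumed simple.

def vw_sample_mean(X):
    """X: (n, m+1) array whose rows are unit vectors representing the axes [x_i].
    Returns a unit eigenvector spanning the VW sample mean axis."""
    X = np.asarray(X, dtype=float)
    J = (X.T @ X) / X.shape[0]            # J = (1/n) sum_i x_i x_i^T
    eigvals, eigvecs = np.linalg.eigh(J)  # eigenvalues in increasing order
    return eigvecs[:, -1]                 # eigenvector of the largest eigenvalue

rng = np.random.default_rng(1)
sample = rng.normal(size=(50, 3))
sample /= np.linalg.norm(sample, axis=1, keepdims=True)
print(vw_sample_mean(sample))             # a representative of the mean axis
```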

11 Extrinsic Antimean
PROPOSITION. (i) The set of $\alpha$VW-nonfocal points in $\mathrm{Sym}^+(m+1, \mathbb{R})$ is the set of matrices in $\mathrm{Sym}^+(m+1, \mathbb{R})$ whose smallest eigenvalue has multiplicity 1.
(ii) The projection $P_{F,j} : (\alpha F)^c \to j(\mathbb{RP}^m)$ assigns to each nonnegative definite symmetric matrix $A$ with a smallest eigenvalue of multiplicity 1 the rank-one matrix $j([\nu])$, where $\|\nu\| = 1$ and $\nu$ is an eigenvector of $A$ corresponding to that smallest eigenvalue.

12 Extrinsic Antimean (cont.)
PROPOSITION. Let $Q$ be a distribution on $\mathbb{RP}^m$. The VW antimean set of a random object $[X]$, $X^T X = 1$, on $\mathbb{RP}^m$ is the set of points $p = [v]$, $v \in V_1$, $\|v\| = 1$, where $V_1$ is the eigenspace corresponding to the smallest eigenvalue $\lambda(1)$ of $E[XX^T]$. If in addition $Q = P_{[X]}$ is $\alpha$VW-nonfocal, then
$$\alpha\mu_{j,E}(Q) = j^{-1}(P_{F,j}(\mu)) = [\gamma(1)],$$
where $(\lambda(a), \gamma(a))$, $a = 1, \dots, m+1$, are the eigenvalues in increasing order and the corresponding unit eigenvectors of $\mu = E[XX^T]$.

13 Extrinsic Antimean (cont.)
PROPOSITION. Let $x_1, \dots, x_n$ be random observations from a distribution $Q$ on $\mathbb{RP}^m$ such that $\overline{j(x)}$ is $\alpha$VW-nonfocal. Then the VW sample antimean of $x_1, \dots, x_n$ is given by
$$a\bar{x}_{j,E} = j^{-1}(P_{F,j}(\overline{j(x)})) = [g(1)],$$
where $(d(a), g(a))$ are the eigenvalues in increasing order and the corresponding unit eigenvectors of $J = \frac{1}{n}\sum_{i=1}^n x_i x_i^T$.
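The sample antimean computation mirrors the sample mean, except that the smallest eigenvalue is used. Here is a hedged sketch under the same assumptions as the mean example above.

```python
import numpy as np

# Hedged sketch (assumed implementation) of the VW sample antimean on RP^m: the
# same matrix J = (1/n) sum_i x_i x_i^T is formed, but a unit eigenvector of its
# smallest eigenvalue (assumed simple) is taken instead.

def vw_sample_antimean(X):
    """X: (n, m+1) array of unit vectors representing the sampled axes.
    Returns a unit eigenvector g(1) spanning the VW sample antimean axis."""
    X = np.asarray(X, dtype=float)
    J = (X.T @ X) / X.shape[0]
    eigvals, eigvecs = np.linalg.eigh(J)  # eigenvalues in increasing order
    return eigvecs[:, 0]                  # eigenvector of the smallest eigenvalue
```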


15 Kendall Shape Spaces
We are concerned with a landmark-based nonparametric analysis of similarity shape data. For landmark-based shape data, one considers a k-ad $x = (x^1, \dots, x^k) \in (\mathbb{R}^m)^k$, which consists of $k$ labeled points in $\mathbb{R}^m$ representing the coordinates of landmarks.
$G$ is the group of direct similarities of $\mathbb{R}^m$. A similarity is a function $f : \mathbb{R}^m \to \mathbb{R}^m$ that uniformly dilates distances, that is, for which there is $K > 0$ such that $\|f(x) - f(y)\| = K\|x - y\|$ for all $x, y \in \mathbb{R}^m$. A direct similarity is given by $f(x) = Ax + b$, $A^T A = cI_m$, $c > 0$, where $A$ has a positive determinant. Direct similarities form a group under composition, the group of direct similarities.
The sample spaces considered here are the similarity shape spaces of k-ads in $\mathbb{R}^m$.

16 Kendall Shape Spaces (cont.)
Direct Similarity (Kendall) Shape: the geometrical information that remains when location, scale and rotational effects are filtered out from a k-ad.
Two k-ads $(z_1, z_2, \dots, z_k)$ and $(z'_1, z'_2, \dots, z'_k)$ are said to have the same shape if there is a direct similarity $T$ in the plane, that is, a composition of a rotation, a translation and a homothety, such that $T(z_j) = z'_j$ for $j = 1, \dots, k$. Having the same shape is an equivalence relation on the space of planar k-ads, and the set of all equivalence classes of k-ads is called the planar shape space of k-ads, denoted $\Sigma_2^k$.

17 Kendall Shape Spaces (cont.)
Without loss of generality one may assume that two k-ads that have the same shape also have the same center of mass, that is, $\sum z_j = \sum z'_j = 0$; they then have the same shape if there is a transformation $T$ which keeps the origin fixed and is a rotation followed by a homothety, such that $T(z_j) = z'_j$ for $j = 1, \dots, k-1$. Such a transformation $T$ is determined by a nonzero complex number; that is to say, two k-ads with center of mass $0$ have the same shape if there is a $z \in \mathbb{C}\setminus\{0\}$ such that $z z_j = z'_j$ for $j = 1, \dots, k-1$. Thus the shape equivalence class of a planar k-ad is uniquely determined by a point in $\mathbb{C}P^{k-2}$; that is to say, $\Sigma_2^k$ is identified with $\mathbb{C}P^{k-2}$.

18 Embedding
The Procrustean distance, in our terminology, is the distance induced on $\mathbb{C}P^{k-2}$ by the Euclidean distance via a quadratic Veronese-Whitney embedding into a unit sphere of the linear space $S(k-1, \mathbb{C})$ of self-adjoint complex matrices of order $k-1$. In order to define $j : \mathbb{C}P^{k-2} \to S(k-1, \mathbb{C})$ it is useful to note that $\mathbb{C}P^{k-2} = S^{2k-3}/S^1$, where $S^{2k-3}$ is the space of vectors in $\mathbb{C}^{k-1}$ of norm 1, and the equivalence relation on $S^{2k-3}$ is given by multiplication with scalars in $S^1$ (complex numbers of modulus 1). If $z = (z_1, z_2, \dots, z_{k-1})$ is in $S^{2k-3}$, we denote by $[z]$ the equivalence class of $z$ in $\mathbb{C}P^{k-2}$.
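In practice one passes from a raw planar k-ad to a representative in $S^{2k-3}$ by removing location and scale. The sketch below is an assumed implementation (not from the slides) that uses a Helmert submatrix to identify the centered hyperplane $L_k$ with $\mathbb{C}^{k-1}$; the helper names are illustrative only.

```python
import numpy as np

# Hedged sketch (assumed, not the presenter's code): reduce a raw planar k-ad,
# encoded as k complex landmarks, to a unit-norm preshape in C^{k-1}.  A Helmert
# submatrix identifies the centered hyperplane L_k with C^{k-1}; the S^1 orbit of
# the result is the shape, a point of CP^{k-2}.

def helmert_submatrix(k):
    """(k-1) x k Helmert submatrix H, with orthonormal rows and H @ (1,...,1) = 0."""
    H = np.zeros((k - 1, k))
    for i in range(1, k):
        H[i - 1, :i] = -1.0 / np.sqrt(i * (i + 1))
        H[i - 1, i] = i / np.sqrt(i * (i + 1))
    return H

def preshape(kad):
    """kad: length-k complex array of planar landmarks; returns z in C^{k-1}, ||z|| = 1."""
    kad = np.asarray(kad, dtype=complex)
    z = helmert_submatrix(len(kad)) @ kad     # remove location (centered coordinates)
    return z / np.linalg.norm(z)              # remove scale
```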

19 Embedding (cont.)
The Veronese-Whitney (or simply Veronese) map is in this case $j([z]) = z z^*$, where, if $z$ is considered as a column vector, $z^*$ is the adjoint of $z$, that is, the conjugate of the transpose of $z$. The Euclidean distance in the space of Hermitian matrices $S(k-1, \mathbb{C})$ is $d_0^2(A, B) = \mathrm{Tr}((A - B)(A - B)^*) = \mathrm{Tr}((A - B)^2)$.
Definition. Let $L_k = \{(Z^1, \dots, Z^k) \in \mathbb{C}^k : Z^1 + \dots + Z^k = 0\}$.
Theorem. The Kendall planar shape space $\Sigma_2^k$ can be identified with the complex projective space $P(L_k)$. Moreover, since $L_k$ has complex dimension $k-1$, $P(L_k) \simeq P(\mathbb{C}^{k-1}) = \mathbb{C}P^{k-2}$; therefore Kendall planar shape analysis is data analysis on the complex projective space $\mathbb{C}P^{k-2}$.

20 Extrinsic Mean on the Complex Projective Space
Theorem. Let $Q$ be a probability distribution on $\mathbb{C}P^{k-2}$ and let $\{[Z_r], \|Z_r\| = 1\}_{r=1,\dots,n}$ be a random sample from $Q$.
(a) $Q$ is nonfocal iff the largest eigenvalue $\lambda$ of $E[Z_1 Z_1^*]$ is simple, and in this case $\mu_E(Q) = [m]$, where $m$ is an eigenvector of $E[Z_1 Z_1^*]$ corresponding to $\lambda$, with $\|m\| = 1$.
(b) The extrinsic sample mean is $\bar{X}_E = [m]$, where $m$ is an eigenvector of norm 1 of $\frac{1}{n}\sum_{i=1}^n Z_i Z_i^*$, $\|Z_i\| = 1$, $i = 1, \dots, n$, corresponding to the largest eigenvalue.

21 Extrinsic Antimean on the Complex Projective Space
Theorem. Let $Q$ be a probability distribution on $\mathbb{C}P^{k-2}$ and let $\{[Z_r], \|Z_r\| = 1\}_{r=1,\dots,n}$ be a random sample from $Q$.
(a) $Q$ is $\alpha$-nonfocal iff the smallest eigenvalue $\lambda$ of $E[Z_1 Z_1^*]$ is simple, and in this case $\alpha\mu_E(Q) = [m]$, where $m$ is an eigenvector of $E[Z_1 Z_1^*]$ corresponding to $\lambda$, with $\|m\| = 1$.
(b) The extrinsic sample antimean is $\alpha\bar{X}_E = [m]$, where $m$ is an eigenvector of norm 1 of $\frac{1}{n}\sum_{i=1}^n Z_i Z_i^*$, $\|Z_i\| = 1$, $i = 1, \dots, n$, corresponding to the smallest eigenvalue.
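These two theorems suggest that a single eigendecomposition yields both estimates. A hedged sketch (assumed implementation, not the presenter's code):

```python
import numpy as np

# Hedged sketch (assumed implementation) of the VW extrinsic sample mean and
# antimean on CP^{k-2}: average the Hermitian matrices Z_i Z_i^* and take unit
# eigenvectors of the largest (mean) and smallest (antimean) eigenvalues, both
# assumed simple.

def vw_mean_antimean(Z):
    """Z: (n, k-1) complex array whose rows are unit-norm preshapes Z_i.
    Returns unit eigenvectors representing the mean axis and the antimean axis."""
    Z = np.asarray(Z, dtype=complex)
    K = sum(np.outer(z, z.conj()) for z in Z) / len(Z)   # (1/n) sum_i Z_i Z_i^*
    eigvals, eigvecs = np.linalg.eigh(K)                  # increasing eigenvalue order
    return eigvecs[:, -1], eigvecs[:, 0]
```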


23 Computational Example
We use the VW embedding of the complex projective space (Kendall shape space) to compare VW means and VW antimeans for a configuration of landmarks on the midface in a population of normal children, based on a study of growth measured from X-rays taken at 8 and 14 years of age.
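The bootstrap figures on the following slides can be reproduced, in outline, by resampling the preshapes with replacement and recomputing the VW sample mean and antimean for each resample. A minimal sketch, assuming the data are already reduced to unit-norm preshapes in $\mathbb{C}^{k-1}$ (function names are illustrative only):

```python
import numpy as np

# Minimal sketch (assumed, not the presenter's code) of a nonparametric bootstrap
# for the VW sample mean and antimean of planar shapes: resample the preshapes
# with replacement and recompute the extremal eigenvectors for each resample.

def vw_mean_antimean(Z):
    K = sum(np.outer(z, z.conj()) for z in Z) / len(Z)    # (1/n) sum_i Z_i Z_i^*
    eigvals, eigvecs = np.linalg.eigh(K)
    return eigvecs[:, -1], eigvecs[:, 0]                  # mean axis, antimean axis

def bootstrap_vw(Z, n_boot=500, seed=0):
    """Z: (n, k-1) complex array of unit-norm preshapes.
    Returns arrays of bootstrap VW sample mean and antimean representatives."""
    Z = np.asarray(Z, dtype=complex)
    rng = np.random.default_rng(seed)
    n = len(Z)
    means, antimeans = [], []
    for _ in range(n_boot):
        idx = rng.integers(0, n, size=n)                  # resample with replacement
        m, am = vw_mean_antimean(Z[idx])
        means.append(m)
        antimeans.append(am)
    return np.array(means), np.array(antimeans)
```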

24 Coordinates of Skull Image
Figure: the landmark coordinates for the children's skulls.

25 Extrinsic Sample Mean
Figure: Extrinsic Sample Mean

26 Bootstraps of Extrinsic Sample Mean
Figure: Bootstraps of Extrinsic Sample Mean

27 Extrinsic Sample Antimean
Figure: Extrinsic Sample Antimean

28 Bootstraps of Extrinsic Sample Antimean
Figure: Bootstraps of Extrinsic Sample Antimean
