Statistics of Shape: Eigen Shapes PCA and PGA


Slide 1: Statistics of Shape: Eigen Shapes, PCA and PGA
Sarang Joshi, Departments of Radiation Oncology, Biomedical Engineering and Computer Science, University of North Carolina at Chapel Hill

Slide 2: Study of the Shape of the Hippocampus
Two approaches to studying the shape variation of the hippocampus in populations:
- Statistics of deformation fields, using principal component analysis.
- Statistics of medial descriptions, using Lie groups: principal geodesic analysis.

Slide 3: Statistics of Deformation Fields

Slide 4: Hippocampal Mapping
[Figure: atlas and patient hippocampi]

Slide 5: Hippocampal Mapping
[Figure: atlas and subject hippocampi]

Slide 6: Shape of 2-D Sub-Manifolds of the Brain: Hippocampus
The provisory template hippocampal surface $M_0$ is carried onto the family of targets by the maps $h_1, h_2, \ldots, h_N$ and their inverses:
$$M_0 \underset{h_1^{-1}}{\overset{h_1}{\rightleftarrows}} M_1, \qquad M_0 \underset{h_2^{-1}}{\overset{h_2}{\rightleftarrows}} M_2, \qquad \ldots, \qquad M_0 \underset{h_N^{-1}}{\overset{h_N}{\rightleftarrows}} M_N.$$
[Figure: the template $M_0$ mapped by $h_1, h_2, \ldots, h_N$ onto $h_1 \circ M_0, h_2 \circ M_0, \ldots, h_N \circ M_0$]

Slide 7: Shape of 2-D Sub-Manifolds of the Brain: Hippocampus
The mean transformation and the template representing the entire population:
$$\bar{h} = \frac{1}{N} \sum_{i=1}^{N} h_i, \qquad M_{temp} = \bar{h} \circ M_0.$$
[Figure: the mean hippocampus of the population of thirty subjects]

Slide 8: Shape of 2-D Sub-Manifolds of the Brain: Hippocampus
Mean hippocampus representing the control population:
$$\bar{h}^{control} = \frac{1}{N_{control}} \sum_{i=1}^{N_{control}} h_i^{control}, \qquad M_{control} = \bar{h}^{control} \circ M_0.$$
Mean hippocampus representing the schizophrenic population:
$$\bar{h}^{schiz} = \frac{1}{N_{schiz}} \sum_{i=1}^{N_{schiz}} h_i^{schiz}, \qquad M_{schiz} = \bar{h}^{schiz} \circ M_0.$$
[Figure: control population vs. schizophrenic population]

Slide 9: Gaussian Random Vector Fields on 2-D Sub-Manifolds
The hippocampi $M_i$, $i = 1, \ldots, N$, are deformations of the mean $M_{temp}$:
$$M_i = \{\, y : y = x + u_i(x),\ x \in M_{temp} \,\}, \qquad u_i(x) = h_i(x) - x, \quad x \in M_{temp}.$$
[Figure: the vector field $u_i(x)$ shown in red]
Construct Gaussian random vector fields over the sub-manifolds.
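
As a concrete numerical sketch (the array names, shapes, and synthetic data here are illustrative assumptions, not from the slides), the deformation fields $u_i$ are just the mapped vertex positions minus the template vertices:

```python
import numpy as np

# Hypothetical data: N subjects, each a deformed copy of a template
# surface with V vertices in R^3; h[i] holds the mapped positions h_i(x).
rng = np.random.default_rng(0)
N, V = 30, 500
template = rng.normal(size=(V, 3))               # vertices x of M_temp
h = template + 0.1 * rng.normal(size=(N, V, 3))  # mapped surfaces h_i(x)

# Deformation fields u_i(x) = h_i(x) - x, flattened to one 3V-vector each.
u = (h - template).reshape(N, -1)
```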

Slide 10: Gaussian Random Vector Fields on 2-D Sub-Manifolds
Let $H(M)$ be the Hilbert space of square-integrable vector fields on $M$, with inner product
$$\langle f, g \rangle = \sum_{i=1}^{3} \int_M f_i(x) g_i(x)\, d\nu(x),$$
where $d\nu$ is a measure on the oriented manifold $M$.
Definition 1: The random field $\{U(x), x \in M\}$ is a Gaussian random field on a manifold $M$ with mean $\bar{u} \in H(M)$ and covariance operator $K_U(x, y)$ if, for all $f \in H(M)$, $\langle U, f \rangle$ is normally distributed with mean $m_f = \langle \bar{u}, f \rangle$ and variance $\sigma_f^2 = \langle K_U f, f \rangle$.
A Gaussian field is completely specified by its mean $\bar{u}$ and its covariance operator $K_U(x, y)$.
Construct Gaussian random fields as a quadratic-mean limit using a complete $\mathbb{R}^3$-valued orthonormal basis $\{\phi_k, k = 1, 2, \ldots\}$ with $\langle \phi_i, \phi_j \rangle = 0$ for $i \neq j$.

Slide 11: Gaussian Random Vector Fields on 2-D Sub-Manifolds
Theorem 1: Let $\{U(x), x \in M\}$ be a Gaussian random vector field with mean $m_U \in H$ and covariance $K_U$ of finite trace. Then there exists a sequence of finite-dimensional Gaussian random vector fields $\{U_n(x)\}$ such that
$$U(x) \overset{\text{q.m.}}{=} \lim_{n \to \infty} U_n(x), \qquad U_n(x) = \sum_{k=1}^{n} Z_k(\omega) \phi_k(x),$$
where $\{Z_k(\omega), k = 1, \ldots\}$ are independent Gaussian random variables with means $E\{Z_k\} = \mu_k$ and variances $E\{|Z_k|^2\} - E\{Z_k\}^2 = \sigma_k^2 = \lambda_k$, $\sum_i \lambda_i < \infty$, and $(\phi_k, \lambda_k)$ are the eigenfunctions and eigenvalues of the covariance operator $K_U$:
$$\lambda_i \phi_i(x) = \int_M K_U(x, y) \phi_i(y)\, d\nu(y),$$
where $d\nu$ is the measure on the manifold $M$. If $d\nu$, the surface measure on $M_{temp}$, is atomic around the points $x_k$, then the $\phi_i$ satisfy the system of linear equations
$$\lambda_i \phi_i(x_k) = \sum_{j=1}^{M} \hat{K}_U(x_k, y_j) \phi_i(y_j) \nu(y_j), \qquad i = 1, \ldots, N,$$
where $\nu(y_j)$ is the surface measure around the point $y_j$.
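
A minimal sketch of the discretized eigenproblem above, assuming the stacked fields `u` from the previous snippet and optional per-vertex surface-measure weights `nu`; the Gram-matrix shortcut is the usual trick for PCA when the dimension far exceeds the sample size:

```python
import numpy as np

def empirical_eigenfields(u, nu=None):
    """Eigenvalues and eigenfields of the sample covariance of the rows
    of u (N x d), under the weighted inner product <f, g> = sum f g nu."""
    N, d = u.shape
    nu = np.ones(d) if nu is None else nu
    mean = u.mean(axis=0)
    centered = u - mean
    # Solve in the N-dimensional span of the data: the eigenvectors of
    # (1/N) C diag(nu) C^T give the coefficients of the eigenfields.
    gram = (centered * nu) @ centered.T / N
    lam, a = np.linalg.eigh(gram)
    order = np.argsort(lam)[::-1]                 # descending eigenvalues
    lam, a = lam[order], a[:, order]
    phi = a.T @ centered                          # one eigenfield per row
    norms = np.linalg.norm(phi * np.sqrt(nu), axis=1, keepdims=True)
    phi /= np.maximum(norms, 1e-12)               # unit weighted norm
    return lam, phi, mean
```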

Slide 12: Eigen Shapes of the Hippocampus
Eigen shapes $E_i$, $i = 1, \ldots, N$, are defined as
$$E_i = \{\, x + \sqrt{\lambda_i}\, \phi_i(x) : x \in M_{temp} \,\}.$$
The eigen shapes completely characterize the variation of the sub-manifold in the population.
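
Continuing the same sketch, an eigen shape deforms the template along one eigenfield; the $\sqrt{\lambda_i}$ (one standard deviation) scaling follows the definition above, and the shapes assumed for `template` and `phi_i` come from the earlier snippets:

```python
import numpy as np

def eigen_shape(template, lam_i, phi_i):
    """E_i = { x + sqrt(lambda_i) phi_i(x) : x in M_temp }; template is
    (V, 3) and phi_i a flat 3V-vector, as in the previous snippets."""
    return template + np.sqrt(max(lam_i, 0.0)) * phi_i.reshape(-1, 3)
```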

Slide 13: Statistical Significance of Shape Difference Between Populations
Assume that $\{u_j^{schiz}, u_j^{control}\}$, $j = 1, \ldots, 15$, are realizations from Gaussian processes with means $\bar{u}^{schiz}$ and $\bar{u}^{control}$ and common covariance $K_U$.
Statistical hypothesis test on shape difference:
$$H_0: \bar{u}^{control} = \bar{u}^{schiz}, \qquad H_1: \bar{u}^{control} \neq \bar{u}^{schiz}.$$
Expand the deformation fields in the eigenfunctions $\phi_i$:
$$u_N^{schiz(j)}(x) = \sum_{i=1}^{N} Z_i^{schiz(j)} \phi_i(x), \qquad u_N^{control(j)}(x) = \sum_{i=1}^{N} Z_i^{control(j)} \phi_i(x),$$
so that $\{Z_j^{schiz}, Z_j^{control}, j = 1, \ldots, 15\}$ are Gaussian random vectors with means $\bar{Z}^{schiz}$ and $\bar{Z}^{control}$ and common covariance $\Sigma$.
Hotelling's $T^2$ test:
$$T_N^2 = \frac{M}{2} (\hat{Z}^{control} - \hat{Z}^{schiz})^T \hat{\Sigma}^{-1} (\hat{Z}^{control} - \hat{Z}^{schiz}).$$
[Figure: $T_N^2$ and the p-value $P_N(H_0)$ plotted against $N$, the number of eigenfunctions]
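
A self-contained sketch of the two-sample Hotelling $T^2$ test on the eigen-coefficient vectors. The pooled-covariance form and the F-distribution p-value are the textbook versions, not necessarily the exact variant on the slide; function and variable names are assumptions:

```python
import numpy as np
from scipy.stats import f as f_dist

def hotelling_t2(Za, Zb):
    """Two-sample Hotelling T^2; Za, Zb hold one coefficient vector
    Z^(j) per row for each population."""
    na, nb, k = len(Za), len(Zb), Za.shape[1]
    diff = Za.mean(axis=0) - Zb.mean(axis=0)
    # Pooled covariance estimate of the eigen-coefficients.
    S = ((na - 1) * np.cov(Za, rowvar=False) +
         (nb - 1) * np.cov(Zb, rowvar=False)) / (na + nb - 2)
    t2 = na * nb / (na + nb) * diff @ np.linalg.solve(S, diff)
    # T^2 maps to an F statistic with (k, na + nb - k - 1) dof.
    f_stat = (na + nb - k - 1) / (k * (na + nb - 2)) * t2
    return t2, f_dist.sf(f_stat, k, na + nb - k - 1)
```

With 15 subjects per group, as on the slide, the number of eigenfunctions $k$ kept must stay below $n_a + n_b - 1$ for the pooled covariance to be invertible.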

Slide 14: Bayesian Classification of Hippocampus Shape Between Populations
Bayesian log-likelihood ratio test, $H_0$: normal hippocampus, $H_1$: schizophrenic hippocampus:
$$\Lambda_N = -(Z - \hat{Z}^{schiz})^\dagger \hat{\Sigma}^{-1} (Z - \hat{Z}^{schiz}) + (Z - \hat{Z}^{norm})^\dagger \hat{\Sigma}^{-1} (Z - \hat{Z}^{norm}) \;\overset{H_1}{\underset{H_0}{\gtrless}}\; 0.$$
Use the jackknife to estimate the probability of correct classification.
[Figure: histogram of the log-likelihood ratio for the control and schizophrenic groups]
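
A sketch of the classifier and its leave-one-out (jackknife) evaluation; the sign convention ($\Lambda > 0$ decides schizophrenic), the label encoding, and the pooled covariance estimate are all assumptions of this sketch:

```python
import numpy as np

def loglik_ratio(z, mean_schiz, mean_norm, Sinv):
    """Lambda = -(z - Z_schiz)' S^-1 (z - Z_schiz)
               + (z - Z_norm)'  S^-1 (z - Z_norm)."""
    ds, dn = z - mean_schiz, z - mean_norm
    return -ds @ Sinv @ ds + dn @ Sinv @ dn

def jackknife_accuracy(Z, labels):
    """Leave-one-out classification accuracy; Z holds one coefficient
    vector per row, labels an int array: 1 = schiz, 0 = control."""
    correct = 0
    for i in range(len(Z)):
        keep = np.arange(len(Z)) != i
        Zs, Zn = Z[keep & (labels == 1)], Z[keep & (labels == 0)]
        S = np.cov(np.vstack([Zs - Zs.mean(0), Zn - Zn.mean(0)]),
                   rowvar=False)
        lam = loglik_ratio(Z[i], Zs.mean(0), Zn.mean(0), np.linalg.inv(S))
        correct += int((lam > 0) == bool(labels[i]))  # Lambda > 0 -> schiz
    return correct / len(Z)
```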

Slide 15: Statistics of Medial Descriptions
Each figure is a quad mesh of medial atoms
$$\{\, m_{i,j} : i = 1, \ldots, N;\; j = 1, \ldots, M \,\}, \qquad m_{i,j} = (x_{i,j}, r, F, \theta).$$
Medial atom parameters include angles and rotations, so medial atoms do not form a Hilbert space: the eigen-shape framework cannot be used for statistical characterization!

Slide 16: Statistics of Medial Descriptions
The set of all medial atoms forms a Lie group:
$$m = (x, r, F, \theta) \in \mathbb{R}^3 \times \mathbb{R}^+ \times SO(3) \times SO(2),$$
with position $x \in \mathbb{R}^3$, radius $r \in \mathbb{R}^+$, frame $F \in SO(3)$, and object angle $\theta \in SO(2)$.
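
To make the product-group structure concrete, here is a minimal container for one atom (the class and field names are illustrative, not an API from the slides):

```python
import numpy as np

class MedialAtom:
    """One medial atom m = (x, r, F, theta) in R^3 x R^+ x SO(3) x SO(2)."""
    def __init__(self, x, r, F, theta):
        self.x = np.asarray(x, dtype=float)   # position in R^3
        self.r = float(r)                     # radius, r > 0
        self.F = np.asarray(F, dtype=float)   # 3x3 rotation matrix in SO(3)
        self.theta = float(theta)             # object angle on SO(2) ~ S^1
        assert self.r > 0 and np.allclose(self.F @ self.F.T, np.eye(3))
```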

Slide 17: Lie Groups
A Lie group is a group $G$ that is also a differentiable manifold on which the group operations are differentiable maps:
$$\mu: G \times G \to G, \quad \mu(x, y) = xy; \qquad \iota: G \to G, \quad \iota(x) = x^{-1}.$$
Both composition and inversion are differentiable. Examples: $(\mathbb{R}^3, +)$ with $\mu(x, y) = x + y$ and $x^{-1} = -x$; the multiplicative reals with $\mu(x, y) = xy$ and $x^{-1} = 1/x$; and the orthogonal matrix groups $SO(3)$ and $SO(2)$.
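
The slide's examples can be checked numerically; a quick sketch:

```python
import numpy as np

# (R^3, +): composition is addition, the inverse of x is -x.
x = np.array([1., 2., 3.])
assert np.allclose(x + (-x), np.zeros(3))

def so2(t):
    """Rotation of the plane by angle t, an element of SO(2)."""
    return np.array([[np.cos(t), -np.sin(t)],
                     [np.sin(t),  np.cos(t)]])

R = so2(0.3)
assert np.allclose(R @ R.T, np.eye(2))              # inverse = transpose
assert np.allclose(so2(0.3) @ so2(0.4), so2(0.7))   # closed under composition
```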

Slide 18: Lie Group Means
The algebraic mean is not defined on Lie groups, so use a geometric definition: the Riemannian distance is well defined on a manifold. Given $N$ medial atoms $\{m_i : i = 1, \ldots, N\}$, the mean is defined as the group element that minimizes the average squared distance to the data:
$$\bar{m} = \arg\min_{m} \frac{1}{N} \sum_{i=1}^{N} d(m, m_i)^2.$$
There is no closed-form solution; Lie-group optimization techniques are needed.
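
A minimal sketch of the iterative mean computation on $SO(3)$ (the frame component of a medial atom), assuming SciPy's `Rotation`; the log-map/average/exp-map loop shown is one standard way to run the optimization, not necessarily the method used in the talk:

```python
import numpy as np
from scipy.spatial.transform import Rotation

def karcher_mean(rotations, iters=100, tol=1e-10):
    """Intrinsic mean of a list of scipy Rotations: gradient descent on
    the average squared geodesic distance, since no closed form exists."""
    mu = rotations[0]
    for _ in range(iters):
        # Log-map the data to the tangent space at the current estimate,
        # average there, then exp-map the average back onto the group.
        v = np.mean([(mu.inv() * R).as_rotvec() for R in rotations], axis=0)
        if np.linalg.norm(v) < tol:
            break
        mu = mu * Rotation.from_rotvec(v)
    return mu
```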

Slide 19: Geodesic Curves
The medial manifold is curved and hence has no straight lines. Distance-minimizing geodesic curves are analogous to straight lines in Euclidean space. Geodesics in Lie groups are given by the exponential map:
$$g(t) = \exp(ta).$$
Geodesics are one-parameter subgroups, analogous to one-dimensional subspaces of $\mathbb{R}^N$.
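
A small sketch of $g(t) = \exp(ta)$ on $SO(3)$: the generator $a$ is a skew-symmetric matrix in the Lie algebra $\mathfrak{so}(3)$, and the resulting curve is a one-parameter subgroup:

```python
import numpy as np
from scipy.linalg import expm

a = np.array([[0., -1., 0.],
              [1.,  0., 0.],
              [0.,  0., 0.]])     # skew-symmetric generator in so(3)
g = lambda t: expm(t * a)         # geodesic g(t) = exp(t a) through identity
# One-parameter subgroup property: g(s) g(t) = g(s + t).
assert np.allclose(g(0.5) @ g(0.25), g(0.75))
```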

Slide 20: Principal Geodesics
Since the set of all medial atoms is a curved manifold, linear PCA is not defined either. Principal geodesics are defined as the geodesics that minimize residual distance. There is no closed-form solution: nonlinear optimization is needed.
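
A common approximation to principal geodesic analysis, stated plainly as what it is: PCA in the tangent space at the Karcher mean (exact PGA would minimize residual geodesic distance by nonlinear optimization, as the slide notes). Sketch for $SO(3)$-valued data, reusing the `karcher_mean` result above:

```python
import numpy as np
from scipy.spatial.transform import Rotation

def tangent_pca(rotations, mean):
    """Approximate PGA: log-map the data at the intrinsic mean, then run
    ordinary PCA on the tangent-space coordinates."""
    V = np.array([(mean.inv() * R).as_rotvec() for R in rotations])
    V -= V.mean(axis=0)
    _, s, W = np.linalg.svd(V, full_matrices=False)
    return s**2 / len(V), W   # variances and principal directions (rows)
```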

[Figure: tangent plane $T_{x_i}(M)$ at $x_i$ with normal $n_i$, basis vectors $b_{1,x_i}$, $b_{2,x_i}$, and a neighboring point $x_j$]
Gaussian Random Fields for Statistical Characterization of Brain Submanifolds. Sarang Joshi, Department of Radiation Oncology and Biomedical Engineering, University of North Carolina at Chapel Hill
