
Differential Geometry and Lie Groups
with Applications to Medical Imaging, Computer Vision and Geometric Modeling

CIS610, Spring 2008

Jean Gallier
Department of Computer and Information Science
University of Pennsylvania
Philadelphia, PA 19104, USA
e-mail: jean@saul.cis.upenn.edu

January 22, 2008


Chapter 1

Introduction: Problems, Questions, Motivations

Let M be some space of data. Often, the space M has some geometric and topological structure. Sometimes, there is a way of multiplying the objects in M which makes M into a group.

Problems we often want to solve:

(1) Interpolate data: Given a, b ∈ M, compute (1 − λ)a + λb, for λ ∈ [0, 1].

(2) Compute the mean of a finite set of data (a sample) a = {a_1, ..., a_n},

    ā = (a_1 + ... + a_n)/n.

(3) Compute the (sample) variance of a finite set of data a = {a_1, ..., a_n},

    var(a) = (1/(n − 1)) Σ_{i=1}^{n} (a_i − ā)^2.

(4) Given two samples, a = {a_1, ..., a_n} and b = {b_1, ..., b_n}, find their (sample) covariance,

    cov(a, b) = (1/(n − 1)) Σ_{i=1}^{n} (a_i − ā)(b_i − b̄).

The covariance of a and b measures how a and b vary from their means with respect to each other. Note that cov(a, a) = var(a). If cov(a, b) = 0, we say that a and b are uncorrelated.
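When the data points are ordinary real numbers, these quantities are straightforward to compute. A minimal NumPy sketch of (2)-(4) under that assumption (the sample values are made up for illustration):

    import numpy as np

    a = np.array([1.0, 2.0, 4.0, 7.0])   # a sample a_1, ..., a_n
    b = np.array([0.5, 1.5, 3.0, 9.0])   # a second sample b_1, ..., b_n
    n = len(a)

    a_bar = a.sum() / n                                        # sample mean
    var_a = ((a - a_bar) ** 2).sum() / (n - 1)                 # sample variance
    cov_ab = ((a - a_bar) * (b - b.mean())).sum() / (n - 1)    # sample covariance

    # NumPy's built-ins agree (ddof=1 gives the n - 1 denominator):
    assert np.isclose(var_a, np.var(a, ddof=1))
    assert np.isclose(cov_ab, np.cov(a, b, ddof=1)[0, 1])
    print(a_bar, var_a, cov_ab)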

(5) Let X be an n × d matrix with entries in M. We denote the ith row of X by X_i, with 1 ≤ i ≤ n. The jth column is denoted C_j (1 ≤ j ≤ d). (It is sometimes called a feature vector, but this terminology is far from being universally accepted. In fact, many people in computer vision call the data points, X_i, feature vectors!)

Perform a principal components analysis (PCA) of the data set X_1, ..., X_n, with X_i ∈ M^d. The purpose of PCA is to identify patterns in data and to understand the variance-covariance structure of the data.
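As a concrete illustration of (5), here is a minimal PCA sketch via the SVD of the centered data matrix, assuming the data points X_i are ordinary vectors in R^d (the random data is only a placeholder):

    import numpy as np

    rng = np.random.default_rng(0)
    X = rng.normal(size=(100, 5))       # n = 100 data points X_i in R^5, one per row

    Xc = X - X.mean(axis=0)             # center each column (feature)
    U, S, Vt = np.linalg.svd(Xc, full_matrices=False)

    components = Vt                     # rows of Vt are the principal directions in R^d
    explained_variance = S ** 2 / (X.shape[0] - 1)

    k = 2                               # data reduction: keep the first k components
    X_reduced = Xc @ Vt[:k].T           # n x k matrix of scores
    print(explained_variance)
    print(X_reduced.shape)              # (100, 2)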

PCA is useful for

(a) Data reduction: Often much of the variability of the data can be accounted for by a smaller number of principal components.

(b) Interpretation: PCA can show relationships that were not previously suspected.

(6) Assume M is some kind of geometric space. For a, b ∈ M, find a shortest path from a to b.

If M is a vector space, there are good methods for solving these problems. However, if M is a curved space, it may be very difficult, even impossible, to solve these problems.

In fact, it is not clear that the notion of mean makes any sense if M is not a vector space!

For example, in medical imaging, DTI (Diffusion Tensor Imaging) produces a 3 × 3 symmetric, positive definite matrix at each voxel of an imaging volume. In brain imaging, this method is used to track white matter fibres, which exhibit higher diffusivity of water in the direction of the fibre. One would hope to produce statistical atlases from diffusion tensor images and to understand the anatomical variability caused by a disease.

Unfortunately, the space of n × n symmetric, positive definite matrices, SPD(n), is not a vector space. Consequently, standard linear statistical methods do not apply.

Recall that a matrix A is in SPD(n) iff it is symmetric and its eigenvalues are all strictly positive. The second condition is equivalent to x⊤Ax > 0 for all x ≠ 0 (x ∈ R^n).

For example, it is easy to show that a 2 × 2 symmetric matrix

    ( a  b )
    ( b  c )

is positive definite iff ac − b^2 > 0 and a, c > 0. So, SPD(2) can be viewed as a certain open region in R^3. It is a convex cone, so (convex) interpolation and the notion of mean make sense. The space SPD(n) is also a convex cone.
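Numerically, positive-definiteness is usually tested with a Cholesky factorization rather than by computing eigenvalues explicitly. A minimal sketch (the helper name is ours), together with the 2 × 2 criterion above:

    import numpy as np

    def is_spd(A):
        # A is in SPD(n) iff it is symmetric and x'Ax > 0 for all x != 0;
        # numerically, the second condition holds exactly when Cholesky succeeds.
        if not np.allclose(A, A.T):
            return False
        try:
            np.linalg.cholesky(A)
            return True
        except np.linalg.LinAlgError:
            return False

    A = np.array([[2.0, 1.0],
                  [1.0, 3.0]])
    a, b, c = A[0, 0], A[0, 1], A[1, 1]
    print(is_spd(A))                                # True
    print(a > 0 and c > 0 and a * c - b * b > 0)    # True, by the 2 x 2 criterion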

Viewing SPD(n) as a convex cone may not be the best way to represent it, however. It turns out that SPD(n) is the homogeneous space GL⁺(n)/SO(n) (in fact, a symmetric space; its determinant-one slice is SL(n)/SO(n)). It is a Riemannian manifold of nonpositive sectional curvature.

Since SPD(3) is convex, the mean of m matrices in SPD(3) belongs to SPD(3). However, the mean of matrices in SPD(3) with the same determinant can be a matrix with a greater determinant, which is undesirable. Worse, PCA is invalid because it does not preserve positive-definiteness.

Some of these problems can be alleviated using the exponential map and its inverse (log). Indeed, if S(n) denotes the vector space of all symmetric n × n matrices, then the exponential map exp: S(n) → SPD(n) is a bijection!
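A minimal NumPy sketch of this bijection, computed through the eigendecomposition of a symmetric matrix, and of the "average in the log domain" mean it makes possible (the helper names are ours):

    import numpy as np

    def sym_exp(S):
        # exp: S(n) -> SPD(n), via the eigendecomposition of the symmetric matrix S
        w, V = np.linalg.eigh(S)
        return (V * np.exp(w)) @ V.T

    def spd_log(P):
        # the inverse map log: SPD(n) -> S(n)
        w, V = np.linalg.eigh(P)
        return (V * np.log(w)) @ V.T

    def log_domain_mean(Ps):
        # map to the vector space S(n), average there, map back to SPD(n)
        return sym_exp(sum(spd_log(P) for P in Ps) / len(Ps))

    P1 = np.array([[2.0, 0.5], [0.5, 1.0]])
    P2 = np.array([[1.0, 0.0], [0.0, 4.0]])
    print(log_domain_mean([P1, P2]))    # always an SPD matrix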

This bijection is the basis of the approach of Arsigny, Fillard, Pennec and Ayache. We will study their papers on this subject later on.

Another example is given by the group of rotations, SO(3). These are the 3 × 3 (orthogonal) matrices R such that RR⊤ = R⊤R = I and det(R) = 1.

This is a curved space, and if R_1, R_2 ∈ SO(3), in general (1 − λ)R_1 + λR_2 is not a rotation matrix! It is also possible to interpolate, again using the exponential map exp: so(3) → SO(3), where so(3) is the vector space of 3 × 3 skew-symmetric matrices. Another interpolation method uses the quaternions. However, computing the mean is a harder problem.
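A minimal sketch of interpolation through the exponential map, using Rodrigues' formula for exp on so(3) and interpolating between the identity and exp(W) (the helper names are ours):

    import numpy as np

    def hat(w):
        # w in R^3  ->  the corresponding 3 x 3 skew-symmetric matrix in so(3)
        return np.array([[0.0, -w[2], w[1]],
                         [w[2], 0.0, -w[0]],
                         [-w[1], w[0], 0.0]])

    def exp_so3(W):
        # exp: so(3) -> SO(3), via Rodrigues' formula; W is skew-symmetric
        w = np.array([W[2, 1], W[0, 2], W[1, 0]])
        t = np.linalg.norm(w)
        if t < 1e-12:
            return np.eye(3)
        return np.eye(3) + (np.sin(t) / t) * W + ((1.0 - np.cos(t)) / t ** 2) * (W @ W)

    W = hat(np.array([0.0, 0.0, np.pi / 2]))    # rotation by pi/2 about the z-axis
    for lam in (0.0, 0.25, 0.5, 0.75, 1.0):
        R = exp_so3(lam * W)                    # stays in SO(3) for every lam,
        assert np.allclose(R @ R.T, np.eye(3))  # unlike (1 - lam) I + lam exp(W)
        assert np.isclose(np.linalg.det(R), 1.0)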

All this raises a number of questions:

(1) What is a good notion of space, M, so that

    (a) At least locally, such a space looks Euclidean.

    (b) We can promote calculus on R^n to functions f: M → R. In particular, we can

    (c) define smooth functions on M,

    (d) define the derivative (differential) of functions on M,

    (e) solve differential equations (ODEs) on M,

    (f) integrate (nice) functions on M.

    (g) We can define the distance between two points.

    (h) We can find shortest paths (perhaps only under certain conditions).

(2) What does it mean for a space to be curved? How do we define the notion of curvature?

(3) What is the effect of curvature (whatever it is)?

(4) What is the Laplacian on a manifold? For example, what is the Laplacian on the sphere S^n ⊆ R^{n+1}, with

    S^n = {(x_1, ..., x_{n+1}) ∈ R^{n+1} | x_1^2 + ... + x_{n+1}^2 = 1}?

What are the eigenfunctions of the Laplacian on the sphere (the nonzero functions f such that Δf = λf, for some λ ∈ R)? They turn out to be the spherical harmonics.

The concept of a manifold seems to be a very good candidate for the notion of space we are seeking.
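Before turning to manifolds in general, here is a quick numerical illustration of the eigenfunction claim on S^2 (with the convention Δ = div grad): a small finite-difference check, in the usual (polar, azimuthal) coordinates, that Y_1^0, which is proportional to the cosine of the polar angle, is an eigenfunction with eigenvalue −l(l + 1) = −2. A sketch only, with names of our choosing:

    import numpy as np

    def spherical_laplacian(f, phi, theta, h=1e-4):
        # Laplace-Beltrami operator on S^2, with phi the polar and theta the azimuthal angle:
        #   Delta f = (1/sin phi) d/dphi(sin phi df/dphi) + (1/sin^2 phi) d^2 f/dtheta^2
        def df_dphi(p):
            return (f(p + h, theta) - f(p - h, theta)) / (2 * h)
        term_phi = (np.sin(phi + h) * df_dphi(phi + h)
                    - np.sin(phi - h) * df_dphi(phi - h)) / (2 * h * np.sin(phi))
        term_theta = (f(phi, theta + h) - 2 * f(phi, theta)
                      + f(phi, theta - h)) / (h ** 2 * np.sin(phi) ** 2)
        return term_phi + term_theta

    def f(phi, theta):
        return np.cos(phi)                        # Y_1^0 up to a constant factor

    phi0, theta0 = 1.1, 0.3
    print(spherical_laplacian(f, phi0, theta0))   # approximately -0.9072
    print(-2 * f(phi0, theta0))                   # -2 cos(1.1) = -0.9072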

A very useful thing about manifolds is that for every point p ∈ M there is a kind of linear approximation, the tangent space T_pM to M at p. Near p, T_pM approximates M.

Given a smooth map f: M → N, for every p ∈ M there is a linear map (the tangent map) df_p: T_pM → T_{f(p)}N, which can be viewed as a linear approximation of f (near p).

Furthermore, if every tangent space T_pM has an inner product (a way to define orthogonality and distances), then M is called a Riemannian manifold, and there is a map exp: T_pM → M, defined at least near 0 (in T_pM).
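A concrete instance is M = S^n ⊂ R^{n+1} with the round metric, where the exponential map has the closed form exp_p(v) = cos(|v|) p + sin(|v|) v/|v| for a tangent vector v at p (v orthogonal to p). A minimal sketch (the helper name is ours):

    import numpy as np

    def sphere_exp(p, v):
        # Riemannian exponential map of the unit sphere at p, applied to a tangent vector v
        t = np.linalg.norm(v)
        if t < 1e-12:
            return p
        return np.cos(t) * p + np.sin(t) * (v / t)

    p = np.array([0.0, 0.0, 1.0])           # a point on S^2 (the north pole)
    v = np.array([np.pi / 2, 0.0, 0.0])     # a tangent vector at p (orthogonal to p)
    q = sphere_exp(p, v)
    print(q)                                # [1. 0. 0.]: a quarter of a great circle away
    print(np.linalg.norm(q))                # 1.0, so q lies back on the sphere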

In a Riemannian manifold, we can also define covariant derivatives and various notions of curvature.

When a manifold also has a group structure (so that multiplication and inversion are smooth), we get a very interesting structure called a Lie group.

Even if a manifold M is not a Lie group, there may be an action ·: G × M → M of a Lie group G on M, and under certain conditions M can be viewed as a quotient G/K, where K is a subgroup of G. When M = G/K, as above, certain notions on G can be transported to M. We say that M is a homogeneous space.

Let us now begin to familiarize ourselves with Lie groups and manifolds by looking at many concrete examples. We begin with groups of matrices (matrix groups).