Topological Data Analysis - Spring 2018

Simplicial Homology

Slightly rearranged, but mostly copy-pasted from Harer's and Edelsbrunner's Computational Topology, Verovsek's class notes, Gunnar Carlsson's Topology and Data, and Gunnar Carlsson's Class Notes for Math 149.

1 Review: Vector Space Terminology

Here we recall some standard definitions and terminology from linear algebra. First, a set $V$ along with a notion of addition $+$ forms an Abelian group if the following properties are satisfied:

- $v, v' \in V$ imply $v + v' \in V$;
- $v + v' = v' + v$ and $(v + v') + v'' = v + (v' + v'')$;
- there exists a neutral element $0 \in V$ which satisfies $v + 0 = v$ for every $v \in V$;
- for every $v \in V$, there is an inverse element $w \in V$ such that $v + w = 0$.

Suppose we also have some field of scalars $F$ and a notion of scalar multiplication, i.e. if $r \in F$ and $v \in V$ then $rv \in V$. Then $V$ is a vector space over $F$ if:

- scalar multiplication is compatible with field multiplication: $a(bv) = (ab)v$ for all $v \in V$ and $a, b \in F$;
- the identity element of scalar multiplication acts as the identity: $1v = v$, where $1$ denotes the multiplicative identity in $F$;
- scalar multiplication is distributive with respect to vector addition: $a(u + v) = au + av$ for all $u, v \in V$ and $a \in F$;
- scalar multiplication is distributive with respect to field addition: $(a + b)v = av + bv$ for all $v \in V$ and $a, b \in F$.

A set of vectors $v_1, \ldots, v_n \in V$ forms a basis for $V$ if every element $v \in V$ can be written as $v = r_1 v_1 + \ldots + r_n v_n$ for a unique choice of scalars $r_1, \ldots, r_n \in F$. Although $V$ can have many choices of basis, the number of elements in such a basis can be shown to be fixed; this number, $\mathrm{rank}(V)$, is called the dimension or rank of $V$.

A subset $W \subseteq V$ forms a vector subspace of $V$ if it is closed under addition and scalar multiplication. Given a fixed $v \in V$, we define the coset $v + W = \{v + w \mid w \in W\}$. The quotient space $V/W$ is then defined to be the set of all such cosets; it is itself a vector space over the same field, with vector addition defined by $(v + W) + (v' + W) = (v + v') + W$ and scalar multiplication given by $r(v + W) = (rv) + W$. The neutral element in $V/W$ is of course just $W = 0 + W$. It can be shown that $\mathrm{rank}(V/W) = \mathrm{rank}(V) - \mathrm{rank}(W)$.

Example 1.1. Let $V = \mathbb{R}^2$ and let $W$ be the $y$-axis. We want to give a simple description of $V/W$. Recall that two elements $(x, y)$ and $(x', y')$ of $\mathbb{R}^2$ give the same element of $V/W$, meaning $(x, y) + W = (x', y') + W$, if and only if $(x, y) - (x', y') = (x - x', y - y') \in W$. This means that $x - x' = 0$, since $W$ is the $y$-axis. Thus, two elements $(x, y)$ and $(x', y')$ of $V$ determine the same element of $V/W$ if and only if $x = x'$. So, a vector of $V/W$ is completely determined by specifying the $x$-coordinate, since the value of the $y$-coordinate does not matter. In particular, any element of $V/W$ is represented by exactly one element of the form $(x, 0)$, so we can identify $V/W$ with the set of vectors of the form $(x, 0)$, i.e. with the $x$-axis. Of course, the precise meaning of "identify" is that the map from $V/W$ to the $x$-axis which sends $(x, y) + W \mapsto (x, 0)$ is an isomorphism. We will not spell out all the details of this here, but the point is that it really does make sense to think of the quotient $V/W$ in this case as being the same as the $x$-axis.

If $V$ and $U$ are two vector spaces over $F$, we can form their direct sum $V \oplus U = \{(v, u) \mid v \in V,\ u \in U\}$. This is also a vector space over $F$, with vector addition and scalar multiplication defined componentwise.

A mapping $f\colon V \to U$ between vector spaces is called a linear transformation if $f$ respects all vector space structure: namely, if $f(rv + sv') = rf(v) + sf(v')$ holds for all $v, v' \in V$ and $r, s \in F$. We define the kernel of such a mapping by $\ker f = \{v \in V \mid f(v) = 0\}$; note that $\ker f$ is a subspace of $V$. We also define the image of this mapping to be $\mathrm{im}\, f = \{f(v) \mid v \in V\}$; note that $\mathrm{im}\, f$ is a subspace of $U$. Finally, it can be shown that $\mathrm{rank}(V) = \mathrm{rank}(\ker f) + \mathrm{rank}(\mathrm{im}\, f)$.
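As a concrete check of this last identity, the following short Python snippet (my own illustration, not part of the notes; it assumes SymPy is available) computes the kernel and image dimensions of a specific linear map and compares their sum with the dimension of the domain.

```python
import sympy as sp

# A concrete linear map f: R^4 -> R^3, represented by a 3 x 4 matrix.
A = sp.Matrix([[1, 0, 2, 0],
               [0, 1, 3, 0],
               [1, 1, 5, 0]])

dim_V = A.cols                     # rank(V): dimension of the domain
rank_im = A.rank()                 # rank(im f): dimension of the image
rank_ker = len(A.nullspace())      # rank(ker f): dimension of the kernel

print(rank_ker, rank_im, dim_V)        # 2 2 4
print(rank_ker + rank_im == dim_V)     # True: rank(ker f) + rank(im f) = rank(V)
```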

2 Simplicial Homology

Homology groups were introduced by Henri Poincaré in one of a series of papers on Analysis Situs. He named the ranks of the homology groups after another mathematician, Betti, who introduced a slightly different version years earlier. Homology groups were defined not for topological spaces directly, but for simplicial complexes. Unfortunately, (a) not every space can be described as a simplicial complex and (b) each space can be described as a simplicial complex in many different ways. In the early development of the subject, the apparent dependence of the homology calculation on the simplicial complex structure was a serious problem, and was the subject of a great deal of research. These problems were eventually resolved by Eilenberg, who showed that there is a way to extend the definition of homology groups to all spaces, in such a way that the result depends only on the space itself, and not on any particular structure as a simplicial complex. Eilenberg's solution was, however, extremely infinite in nature, and is not amenable to direct computation. Calculation of homology for simplicial complexes remains the best method for explicit computation. Because most spaces of interest are either explicitly simplicial complexes or homotopy equivalent to such, it turns out that simplicial calculation is sufficient for most situations.

We now begin the definition of the simplicial homology groups for a given simplicial complex $K$. Our working example is shown in Figure 1. Homology groups provide a mathematical language for the holes in a topological space. Perhaps surprisingly, they capture holes indirectly, by focusing on what surrounds them. Their main ingredients are group operations and maps that relate topologically meaningful subsets of a space with each other. In this section, we introduce the various groups involved in the setup.

Figure 1: Simplicial complex for which we will carry out a computation as an example. It has vertices $a, b, c, d, e$, edges $A = \{a,b\}$, $B = \{b,e\}$, $C = \{d,e\}$, $D = \{c,d\}$, $E = \{a,c\}$, $F = \{b,d\}$, and a single triangle $T = \{b,d,e\}$.

2.1 Chain Complexes

Fix a dimension $i$ and a field $F$.

Definition 2.1 (Vector spaces of chains). An $i$-chain is a formal sum of $i$-simplices, $\sum c_i \sigma_i$, where $c_i \in F$ and the sum is taken over all possible $i$-simplices $\sigma_i \in K$. The set of all such $i$-chains is denoted $C_i(K)$. We can add two $i$-chains: given $c = \sum c_i \sigma_i$ and $d = \sum s_i \sigma_i$, we define $c + d = \sum (c_i + s_i)\sigma_i$. We can also multiply $i$-chains by scalars in the obvious way. Hence $C_i(K)$ forms a vector space over $F$, called the vector space of $i$-chains in $K$; note that the set of $i$-simplices forms an obvious basis for $C_i(K)$, but that other bases are also possible. Hence the rank of $C_i(K)$ is simply $n_i$, the number of $i$-simplices in $K$. Note that the neutral element is $0 = \sum 0\,\sigma_i$. For $i < 0$ and $i > \dim(K)$, we have $C_i(K) = 0$ since there are no simplices in those dimensions.

From now on, we will make the simplifying assumption that our field $F$ is simply the binary field $\mathbb{Z}/2\mathbb{Z}$; in this case, an $i$-chain can be thought of as just a collection of $i$-simplices, and adding two $i$-chains corresponds to taking the symmetric difference of the collections.

To relate these vector spaces, we define the boundary of an $i$-simplex as the sum of its $(i-1)$-dimensional faces. Writing $\sigma = [u_0, u_1, \ldots, u_i]$ for the simplex spanned by the listed vertices, its boundary is
$$\partial_i \sigma = \sum_{j=0}^{i} [u_0, u_1, \ldots, \hat{u}_j, \ldots, u_i],$$
where the hat indicates that $u_j$ is omitted. For an $i$-chain $c = \sum c_i \sigma_i$, the boundary is the sum of the boundaries of its simplices, $\partial_i c = \sum c_i\, \partial_i \sigma_i$. Hence, taking the boundary maps an $i$-chain to an $(i-1)$-chain, and we write $\partial_i \colon C_i(K) \to C_{i-1}(K)$. Notice also that taking the boundary commutes with addition and scalar multiplication, that is, $\partial_i(\lambda c + \lambda' c') = \lambda\,\partial_i c + \lambda'\,\partial_i c'$. This is the defining property of a linear map. We will therefore refer to $\partial_i$ as the boundary homomorphism or, shorter, the boundary map for chains.

Every linear transformation can be represented by a matrix once we fix a basis for the domain and the target spaces. We choose the set of $i$-simplices as a basis for $C_i$ and the set of $(i-1)$-simplices as a basis for $C_{i-1}$, and we select some arbitrary but fixed ordering of these simplices. In these ordered bases, $\partial_i$ is represented by the boundary matrix $D_i = [a_{ji}]$, which has $n_{i-1}$ rows and $n_i$ columns; each row is indexed by an $(i-1)$-simplex and each column is indexed by an $i$-simplex. The $j$-th column contains a $1$ in each row indexed by an $(i-1)$-face of the $j$-th $i$-simplex, and all other entries are zero.

The chain complex is the sequence of chain groups connected by boundary homomorphisms,
$$\cdots \xrightarrow{\partial_{i+2}} C_{i+1}(K) \xrightarrow{\partial_{i+1}} C_i(K) \xrightarrow{\partial_i} C_{i-1}(K) \xrightarrow{\partial_{i-1}} \cdots \xrightarrow{\partial_1} C_0(K) \xrightarrow{\partial_0} 0.$$
It will often be convenient to drop the index from the boundary homomorphism, since it is implied by the dimension of the chain it applies to.

For an example, consider Figure 1. Then $\partial_2(T) = B + C + F \in C_1(K)$, and $\partial_1(F) = b + d$. Note also that $\partial_1(\partial_2(T)) = \partial_1(B + C + F) = \partial_1(B) + \partial_1(C) + \partial_1(F) = (b + e) + (d + e) + (b + d) = 0$.
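To make the boundary matrices concrete, here is a small Python sketch (my own illustration, not part of the notes) that builds $D_1$ and $D_2$ for the complex in Figure 1 over $\mathbb{Z}/2\mathbb{Z}$, using the rule that the column of an $i$-simplex has a $1$ in the row of each of its $(i-1)$-faces, and then checks the identity $\partial_1 \circ \partial_2 = 0$.

```python
from itertools import combinations

# Simplices of the complex in Figure 1, listed by dimension.
vertices = {'a': ('a',), 'b': ('b',), 'c': ('c',), 'd': ('d',), 'e': ('e',)}
edges    = {'A': ('a', 'b'), 'B': ('b', 'e'), 'C': ('d', 'e'),
            'D': ('c', 'd'), 'E': ('a', 'c'), 'F': ('b', 'd')}
triangles = {'T': ('b', 'd', 'e')}

def boundary_matrix(rows, cols):
    """Entry [i][j] is 1 iff the i-th row simplex is a codimension-1 face of the j-th column simplex."""
    row_names, col_names = list(rows), list(cols)
    D = [[0] * len(col_names) for _ in row_names]
    for j, c in enumerate(col_names):
        faces = set(combinations(sorted(cols[c]), len(cols[c]) - 1))
        for i, r in enumerate(row_names):
            if tuple(sorted(rows[r])) in faces:
                D[i][j] = 1
    return D

D1 = boundary_matrix(vertices, edges)      # 5 x 6, rows a..e, columns A..F
D2 = boundary_matrix(edges, triangles)     # 6 x 1, rows A..F, column T

# Check the Fundamental Lemma of Homology over Z/2Z: D1 * D2 = 0.
prod = [[sum(D1[i][k] * D2[k][j] for k in range(len(D2))) % 2
         for j in range(len(D2[0]))] for i in range(len(D1))]
print(all(entry == 0 for row in prod for entry in row))   # True
```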

2.2 Cycles, Boundaries, Homology

Using the boundary maps, we now distinguish two special types of chains and use them to define homology groups. First we define an $i$-cycle to be an $i$-chain with empty boundary; that is, $c \in C_i(K)$ is an $i$-cycle iff $\partial_i(c) = 0$. The set of all such $i$-cycles forms a subspace of $C_i(K)$, which we denote by $Z_i(K) = \ker \partial_i$; we set its rank to $z_i$. For the complex in Figure 1, we see that the group of $1$-cycles $Z_1(K)$ contains the chains $B + C + F$ and $A + D + E + F$, as well as others. On the other hand, $Z_2(K) = 0$, as can be seen by direct computation. Finally, every single $0$-chain forms a $0$-cycle, since $C_{-1}(K) = 0$.

Given an $i$-chain $c \in C_i(K)$, we say that $c$ is an $i$-boundary if there exists $d \in C_{i+1}(K)$ such that $c = \partial_{i+1}(d)$. The set of $i$-boundaries also forms a subspace of $C_i(K)$, and we denote it by $B_i(K) = \mathrm{im}\, \partial_{i+1}$; we set its rank to $b_i$. The fundamental property that makes homology work is that the boundary of a boundary is necessarily zero.

Lemma 2.2 (Fundamental Lemma of Homology). $\partial_i \partial_{i+1} d = 0$ for every integer $i$ and every $(i+1)$-chain $d$.

Proof. We just need to show that $\partial_i \partial_{i+1} \sigma = 0$ for an $(i+1)$-simplex $\sigma$. The boundary, $\partial_{i+1}\sigma$, consists of all $i$-faces of $\sigma$. Every $(i-1)$-face of $\sigma$ belongs to exactly two $i$-faces of $\sigma$, so over $\mathbb{Z}/2\mathbb{Z}$ the two contributions cancel and $\partial_i(\partial_{i+1}\sigma) = 0$.

Using this fact, we define the $i$-th homology group of $K$ to be $H_i(K) = Z_i(K)/B_i(K)$. The rank of this group is called the $i$-th Betti number of $K$, $\beta_i = \mathrm{rank}(H_i(K))$. Note that $\beta_i = z_i - b_i$ (note also that $n_i = z_i + b_{i-1}$). The elements of $H_i$ are called $i$-dimensional homology classes, and they correspond to cosets $c + B_i$, where $c$ is an $i$-cycle. Any two cycles in the same class are called homologous; note that $c, c' \in Z_i(K)$ are homologous iff there is an $(i+1)$-chain $d$ such that $c + c' = \partial(d)$.

Now let us compute the homology of the simplicial complex $K$ depicted in Figure 1. We first write down the chain complex. The vector spaces are
$$C_0(K) = \langle a, b, c, d, e \rangle, \quad C_1(K) = \langle A, B, C, D, E, F \rangle, \quad C_2(K) = \langle T \rangle, \quad C_i(K) = 0 \text{ for } i \geq 3,$$
and the boundary maps are $\partial_0 = 0$,
$$\partial_1 = \begin{array}{c|cccccc} & A & B & C & D & E & F \\ \hline a & 1 & 0 & 0 & 0 & 1 & 0 \\ b & 1 & 1 & 0 & 0 & 0 & 1 \\ c & 0 & 0 & 0 & 1 & 1 & 0 \\ d & 0 & 0 & 1 & 1 & 0 & 1 \\ e & 0 & 1 & 1 & 0 & 0 & 0 \end{array}\,, \qquad \partial_2 = \begin{array}{c|c} & T \\ \hline A & 0 \\ B & 1 \\ C & 1 \\ D & 0 \\ E & 0 \\ F & 1 \end{array}\,.$$
The cycles are
$$Z_2 = \ker \partial_2 = 0, \quad Z_1 = \ker \partial_1 = \langle F + B + C,\ F + A + E + D \rangle, \quad Z_0 = \ker \partial_0 = \langle a, b, c, d, e \rangle.$$

The boundaries are
$$B_2 = \mathrm{im}\, \partial_3 = 0, \quad B_1 = \mathrm{im}\, \partial_2 = \langle B + C + F \rangle, \quad B_0 = \mathrm{im}\, \partial_1 = \langle a + b,\ b + e,\ d + e,\ c + d \rangle.$$
Now
$$H_2 = Z_2(K)/B_2(K) = 0, \quad H_1 = \langle F + B + C,\ F + A + E + D \rangle / \langle B + C + F \rangle \cong \mathbb{Z}/2\mathbb{Z}, \quad H_0 = \langle a, b, c, d, e \rangle / \langle a + b,\ b + e,\ d + e,\ c + d \rangle \cong \mathbb{Z}/2\mathbb{Z}.$$
We can determine the Betti numbers from the cycles and boundaries directly:
$$\beta_2 = \mathrm{rank}\, Z_2 - \mathrm{rank}\, B_2 = 0, \quad \beta_1 = \mathrm{rank}\, Z_1 - \mathrm{rank}\, B_1 = 2 - 1 = 1, \quad \beta_0 = \mathrm{rank}\, Z_0 - \mathrm{rank}\, B_0 = 5 - 4 = 1.$$
For $i \geq 3$, $Z_i = B_i = 0$ and therefore the homology groups are trivial.

Theorem 2.3. The following statements hold:

1. Despite their definition, the homology groups do not depend on the choice of triangulation. In other words, no matter how we triangulate a given topological space, we will always get the same groups!

2. Homeomorphic spaces have isomorphic homology groups. In fact, we have a stronger statement: homotopy equivalent spaces have isomorphic homology groups.

We will not prove these statements in this class.

2.3 Induced Maps

A continuous map from one topological space to another maps cycles to cycles and boundaries to boundaries. We can therefore use the images to construct new homology groups. These are not necessarily the same as the ones of the original space, since cycles can become boundaries. We describe this more formally for two simplicial complexes and an inclusion, $f\colon K \to L$. Then $f$ induces a map from the chains of $K$ to the chains of the same dimension of $L$. Specifically, if $c = \sum c_j \sigma_j$ is an $i$-chain in $K$, then $f_\#(c) = \sum c_j \tau_j$, where $\tau_j = f(\sigma_j)$. Writing $\partial^K$ and $\partial^L$ for the boundary maps in the two complexes, we note that $f_\# \circ \partial^K = \partial^L \circ f_\#$, that is, the induced map commutes with the boundary map. This implies that $f_\#$ takes cycles to cycles, $f_\#(Z_i(K)) \subseteq Z_i(L)$, and boundaries to boundaries, $f_\#(B_i(K)) \subseteq B_i(L)$. Therefore, it defines a map on the quotients, which we call the induced map on homology, written $f_* \colon H_i(K) \to H_i(L)$. The rank of the image is bounded from above by both Betti numbers, $\mathrm{rank}\, f_*(H_i(K)) \leq \min\{\beta_i(K), \beta_i(L)\}$.
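The Betti numbers computed by hand above can be reproduced by a short mod-2 rank computation. Below is a dependency-free Python sketch of my own (not part of the notes); the matrices are the boundary matrices $D_1$ and $D_2$ written out above, and it uses the identities $z_i = n_i - \mathrm{rank}(D_i)$ and $b_i = \mathrm{rank}(D_{i+1})$.

```python
def rank_mod2(matrix):
    """Rank over Z/2Z, by Gaussian elimination on a copy of the matrix."""
    M = [row[:] for row in matrix]
    rank, cols = 0, len(M[0]) if M else 0
    for col in range(cols):
        pivot = next((r for r in range(rank, len(M)) if M[r][col] == 1), None)
        if pivot is None:
            continue
        M[rank], M[pivot] = M[pivot], M[rank]
        for r in range(len(M)):
            if r != rank and M[r][col] == 1:
                M[r] = [(x + y) % 2 for x, y in zip(M[r], M[rank])]
        rank += 1
    return rank

# Boundary matrices of the complex in Figure 1 (rows a..e / A..F, as in the text).
D1 = [[1, 0, 0, 0, 1, 0],
      [1, 1, 0, 0, 0, 1],
      [0, 0, 0, 1, 1, 0],
      [0, 0, 1, 1, 0, 1],
      [0, 1, 1, 0, 0, 0]]
D2 = [[0], [1], [1], [0], [0], [1]]

n0, n1, n2 = 5, 6, 1                                        # numbers of 0-, 1-, 2-simplices
z0, z1, z2 = n0, n1 - rank_mod2(D1), n2 - rank_mod2(D2)     # cycle ranks (every 0-chain is a cycle)
b0, b1, b2 = rank_mod2(D1), rank_mod2(D2), 0                # boundary ranks (no 3-simplices, so b2 = 0)

print(z0 - b0, z1 - b1, z2 - b2)     # Betti numbers: 1 1 0
```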

2.4 Matrix Reduction

Unlike for fundamental groups, there is a well-defined algorithm to compute the homology groups of an arbitrary simplicial complex $K$, originally due to Poincaré, called the reduction algorithm. In addition to the groups themselves, the reduction algorithm computes a basis for each homology group: a set of $i$-cycles whose homology classes generate $H_i(K)$. Poincaré's reduction algorithm is actually equivalent to an algorithm published four decades earlier by Smith for computing a certain normal form of an integer matrix, although Poincaré was apparently unaware of this fact. Smith's algorithm is in turn a variant of the standard Gaussian elimination algorithm, which was actually discovered by Chinese mathematicians some time before 100 AD.

From now on we assume a fixed simplicial complex $K$ and drop all mention of it from our notation. Before proceeding with the algorithm we prove the following proposition.

Proposition 2.4. Let $g\colon V \to V$ and $h\colon W \to W$ be invertible linear transformations and let $f\colon V \to W$. Then $\mathrm{im}\, f$ is isomorphic to $\mathrm{im}(h \circ f \circ g)$, and consequently $W/\mathrm{im}\, f$ is isomorphic to $W/\mathrm{im}(h \circ f \circ g)$. In matrix terms: if $A(f') = A(h)A(f)A(g)$ with $A(g)$ and $A(h)$ invertible, then $W/\mathrm{im}\, f \cong W/\mathrm{im}\, f'$.

Proof. This follows from the elementary observation that $w - w' \in \mathrm{im}(f)$ if and only if $h(w) - h(w') \in \mathrm{im}(h \circ f)$, together with the fact that $\mathrm{im}(f \circ g) = \mathrm{im}(f)$ since $g$ is invertible.

Corollary 2.5. Let $W$ be the vector space $k^m$ for some $m$, and suppose that we are given an $m \times n$ matrix $A$ with entries in a field $k$. $A$ can be regarded as a linear transformation from $V = k^n$ to $W$, and the span of the columns of this matrix is the image of the transformation $A$. If we apply any row or column operation (permuting rows/columns, multiplying a row/column by a non-zero element of $k$, or adding a multiple of one row/column to another) to obtain a matrix $A'$, then $W/\mathrm{im}(A)$ is isomorphic to $W/\mathrm{im}(A')$.

We want to calculate bases, and the ranks, for the cycle group $Z_i$ and the boundary group $B_i$. Recall that the former group is the kernel of $\partial_i$ and hence the set of vectors which $D_i$ sends to zero. We now perform some operations on $D_i$ to make these vectors more apparent. We confine ourselves to two column operations which modify $D_i$ without changing any ranks, and we do so by right-multiplying $D_i$ by a matrix $V = [v_{ij}]$:

- exchange column $k$ with column $l$: here $v_{kl} = v_{lk} = 1$, $v_{ii} = 1$ for all $i \neq k, l$, and all other entries are zero;
- replace column $l$ with the sum of column $k$ and column $l$: here $v_{kl} = 1$, $v_{ii} = 1$ for all $i$, and all other entries are zero.

The first column operation just swaps the names of basis elements, while the second replaces the $l$-th basis element with the sum of the $k$-th and the $l$-th, or by the sum of whatever the two columns represented before the operation.
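As a sanity check on the claim that these column operations are realized by right multiplication with the matrices $V$ described above, here is a small NumPy sketch (my own, assuming NumPy is available; not part of the notes), applied to the boundary matrix $D_1$ of the running example:

```python
import numpy as np

D1 = np.array([[1, 0, 0, 0, 1, 0],
               [1, 1, 0, 0, 0, 1],
               [0, 0, 0, 1, 1, 0],
               [0, 0, 1, 1, 0, 1],
               [0, 1, 1, 0, 0, 0]])

# Column operation 1: exchange columns k = 0 and l = 1 (v_kl = v_lk = 1, other diagonal entries 1).
V_swap = np.eye(6, dtype=int)
V_swap[0, 0] = V_swap[1, 1] = 0
V_swap[0, 1] = V_swap[1, 0] = 1

# Column operation 2: replace column l = 5 by column k = 0 plus column 5 (v_kl = 1).
V_add = np.eye(6, dtype=int)
V_add[0, 5] = 1

swapped = (D1 @ V_swap) % 2
added = (D1 @ V_add) % 2

print(np.array_equal(swapped[:, 0], D1[:, 1]))                  # True: columns 0 and 1 exchanged
print(np.array_equal(added[:, 5], (D1[:, 0] + D1[:, 5]) % 2))   # True: column 5 replaced by the sum

# Both V matrices are their own inverses mod 2, so the operations are invertible (Proposition 2.4 applies).
print(np.array_equal((V_swap @ V_swap) % 2, np.eye(6, dtype=int)))  # True
print(np.array_equal((V_add @ V_add) % 2, np.eye(6, dtype=int)))    # True
```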

We can also perform two row operations, each done by left-multiplying $D_i$ by a matrix $U = [u_{ji}]$:

- exchange rows $k$ and $l$: here $u_{lk} = u_{kl} = 1$, $u_{ii} = 1$ for all $i \neq k, l$, and all other entries are zero;
- replace row $l$ with the sum of row $k$ and row $l$: here $u_{lk} = 1$, $u_{ii} = 1$ for all $i$, and all other entries are zero.

The second row operation replaces the $k$-th basis element with the sum of the $k$-th and the $l$-th, or by whatever these rows represented before the operation. Note that after every such operation, we still have valid bases for $C_i$ and for $C_{i-1}$.

2.5 Smith Normal Form

The end goal is to put our matrix into Smith normal form $N_i$, which means a form where some initial segment of the diagonal (or perhaps the entire diagonal) consists of ones and the rest of the matrix is zero:
$$N_i = \begin{pmatrix} I & 0 \\ 0 & 0 \end{pmatrix},$$
where $I$ is an identity block. The number of columns of this matrix is $\mathrm{rank}(C_i) = n_i = b_{i-1} + z_i$. We arrange it so that the leftmost $b_{i-1}$ columns have ones on the diagonal and the rightmost $z_i$ columns are all zero. Then the latter columns represent $i$-chains which have zero boundary, in other words, basis elements for $Z_i$. The former represent $i$-chains whose non-zero boundaries form a basis for $B_{i-1}$. Hence, by reducing the matrices $D_i$ for all $i$ into Smith normal form, we can extract the numbers $z_i$ and $b_i$, and thus the Betti numbers $\beta_i = z_i - b_i$, for every $i$.

To actually produce bases for $Z_i$ and for $B_{i-1}$, we can keep track of the row and column operations. Writing the Smith normal form as $N_i = U_{i-1} D_i V_i$, one can show that the last $z_i$ columns of $V_i$ give a basis for $Z_i$ and the first $b_{i-1}$ columns of the inverse of $U_{i-1}$ give a basis for $B_{i-1}$.

2.6 Reduction process and example

How do we reduce $D_i$ into Smith normal form? This is easy. First we perform exchanges to move a $1$ to the top left corner. Using this $1$, we zero out the rest of the first row and the first column. We then recurse on the smaller submatrix obtained by removing the first row and the first column. It is not hard to see that this reduction takes time at most cubic in the number of simplices.
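The recursion just described can be transcribed quite directly. The following Python sketch (my own, over $\mathbb{Z}/2\mathbb{Z}$, and without tracking the basis-change matrices $U_{i-1}$ and $V_i$) reduces a matrix to Smith normal form and reports the number of ones on the diagonal; applied to $D_1$ of the running example it returns $4$, so $b_0 = 4$ and $z_1 = 6 - 4 = 2$.

```python
def smith_normal_form_mod2(D):
    """Reduce a 0/1 matrix to Smith normal form over Z/2Z.
    Returns (N, r): the reduced matrix and the number of 1s on the diagonal."""
    N = [row[:] for row in D]
    m, n = len(N), len(N[0]) if N else 0

    def reduce_from(t):
        # Find any 1 in the submatrix with rows >= t and columns >= t.
        for i in range(t, m):
            for j in range(t, n):
                if N[i][j] == 1:
                    # Exchange rows t, i and columns t, j to move the 1 to position (t, t).
                    N[t], N[i] = N[i], N[t]
                    for row in N:
                        row[t], row[j] = row[j], row[t]
                    # Use the 1 at (t, t) to zero out the rest of column t, then row t.
                    for r in range(m):
                        if r != t and N[r][t] == 1:
                            N[r] = [(a + b) % 2 for a, b in zip(N[r], N[t])]
                    for c in range(n):
                        if c != t and N[t][c] == 1:
                            for r in range(m):
                                N[r][c] = (N[r][c] + N[r][t]) % 2
                    return reduce_from(t + 1)   # recurse on the smaller submatrix
        return t                                # no 1 left: t ones on the diagonal

    return N, reduce_from(0)

D1 = [[1, 0, 0, 0, 1, 0],
      [1, 1, 0, 0, 0, 1],
      [0, 0, 0, 1, 1, 0],
      [0, 0, 1, 1, 0, 1],
      [0, 1, 1, 0, 0, 0]]

N1, r = smith_normal_form_mod2(D1)
print(r)    # 4 ones on the diagonal: b_0 = rank(D_1) = 4 and z_1 = n_1 - 4 = 2
```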

3 Simplifying Calculations

We usually need a lot of simplices to represent shapes, and the boundary matrices can be of dimensions that make it difficult to carry out a computation by hand. We now take a look at two methods for reducing the number of simplices while preserving homology.

Proposition 3.1. Let $K$ be an abstract simplicial complex. Suppose that $\sigma$ is a maximal simplex of $K$, i.e. one that is not contained in any other simplex, and let $p$ denote the dimension of $\sigma$. Suppose further that $\sigma' \subseteq \sigma$ is a face of dimension $p - 1$, and that $\sigma'$ is not contained in any simplex other than $\sigma$ and itself. Then we may form the subcomplex $K' \subseteq K$ by removing the simplices $\sigma$ and $\sigma'$, and the linear transformation $H_i(K') \to H_i(K)$ induced by the inclusion $K' \hookrightarrow K$ is an isomorphism.

Figure: A simplicial complex and a sequence of steps showing that its homology is the same as the homology of a point.
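The hypothesis of Proposition 3.1, that $\sigma'$ is a face contained in no simplex other than $\sigma$ and itself, is easy to test mechanically. The sketch below (my own illustration; the helper names are mine) builds the complex of Figure 1 as a set of sorted vertex tuples, finds such a pair, and removes it; by Proposition 3.1 the homology is unchanged.

```python
from itertools import combinations

def closure(top_simplices):
    """All faces of the given simplices, as a set of sorted vertex tuples."""
    K = set()
    for s in top_simplices:
        s = tuple(sorted(s))
        for k in range(1, len(s) + 1):
            K.update(combinations(s, k))
    return K

def find_free_pair(K):
    """Return a pair (sigma, sigma_prime) as in Proposition 3.1, or None if none exists."""
    for sigma in K:
        # sigma must be maximal: not a proper face of any other simplex of K.
        if any(set(sigma) < set(tau) for tau in K):
            continue
        for sigma_prime in combinations(sigma, len(sigma) - 1):
            cofaces = {tau for tau in K if set(sigma_prime) <= set(tau)}
            if cofaces == {sigma, sigma_prime}:   # contained only in sigma and in itself
                return sigma, sigma_prime
    return None

# The complex of Figure 1, generated by the triangle T = {b,d,e} and the edges A, E, D.
K = closure([('b', 'd', 'e'), ('a', 'b'), ('a', 'c'), ('c', 'd')])
sigma, sigma_prime = find_free_pair(K)
print(sigma, sigma_prime)      # ('b', 'd', 'e') ('b', 'd'): the triangle T and its free edge F = {b, d}

K_prime = K - {sigma, sigma_prime}
print(len(K), len(K_prime))    # 12 10; by Proposition 3.1, K' has the same homology as K
```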

Proposition 3.2. Let $K$ be any abstract simplicial complex, and let $\sigma$ be any maximal simplex in $K$. The maximality of $\sigma$ means that we may remove $\sigma$ from $K$ and obtain a subcomplex $K'$. Let $p$ denote the dimension of $\sigma$. We observe that $\partial_p(\sigma)$ is a cycle in $C_{p-1}(K')$, because all the $(p-1)$-simplices in $K$ are also in $K'$. Consequently, the equivalence class $[\partial_p(\sigma)]$ is an element $\xi_\sigma \in H_{p-1}(K')$. The relationship between the homology of $K$ and the homology of $K'$ is then given as follows.

1. $H_i(K') \cong H_i(K)$ whenever $i \neq p, p-1$.

2. $H_{p-1}(K) \cong H_{p-1}(K')/\langle \xi_\sigma \rangle$, where $\langle \xi_\sigma \rangle$ denotes the span of the vector $\xi_\sigma$. Note that if $\xi_\sigma$ is the zero element in $H_{p-1}(K')$, then $H_{p-1}(K') \cong H_{p-1}(K)$.

3. If $\xi_\sigma \neq 0$ in $H_{p-1}(K')$, then $H_p(K') \cong H_p(K)$. If $\xi_\sigma = 0$, then $H_p(K) \cong H_p(K') \oplus F$, where $F$ is the field.

4 Making homology more sensitive

It is useful to ask how sensitively homology measures the shape of a simplicial complex, by considering a simple shape recognition task, namely the recognition of printed letters.

We begin with the first three letters of the alphabet, and find that $H_1$ succeeds at distinguishing between them. However, after this initial success, we see that every other letter has the same first Betti number as one of these three. Homology can be refined to discriminate more finely between the letters. To understand how this works, we digress a bit to discuss how an analogous problem in manifold topology is approached.

The homology groups $H_i(\mathbb{R}^n)$ vanish for all $i > 0$. What this means is that homology is unable to distinguish between $\mathbb{R}^m$ and $\mathbb{R}^n$ when $m \neq n$. From the point of view of a topologist who is interested in distinguishing different manifolds from each other, this means that homology is in some ways a relatively weak invariant. This failure can be addressed by computing homology on auxiliary or derived spaces, constructed using various geometric constructions:

1. Removing a point: While the homology groups of $\mathbb{R}^n$ vanish, the homology groups of $\mathbb{R}^n \setminus \{0\}$ do not. To see this, we observe that we have the inclusion $i\colon S^{n-1} \to \mathbb{R}^n \setminus \{0\}$ as well as the map $r\colon \mathbb{R}^n \setminus \{0\} \to S^{n-1}$ defined by $r(v) = v/\lVert v \rVert$. The composite $r \circ i$ is equal to the identity map of $S^{n-1}$, and the other composite $i \circ r$ is homotopic to the identity map of $\mathbb{R}^n \setminus \{0\}$ via the straight-line homotopy $H(v, t) = (1-t)\,(i \circ r)(v) + t\,v$. With this we can detect the difference between $\mathbb{R}^n$ and $\mathbb{R}^m$, with $m \neq n$, by recognizing that the homologies of the results of removing a single point from the two spaces are different, since homology detects the difference between spheres of different dimensions.

2. Removing singular points: Consider the space given by the crossing of two lines. This space is also contractible, as one can easily see by retracting each line segment onto the crossing point $A$. On the other hand, we recognize that its shape has features which distinguish it from an interval or a circle, and we might want to detect that homologically. If we remove the singular point $A$, we find that the remaining space breaks up into four distinct components, which can be detected by $H_0$.
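As a toy version of the second construction, we can model the crossing of two segments as a small graph, remove the crossing vertex, and count the connected components of what remains; the count is $\beta_0$ of the punctured space. The following Python sketch (mine, purely illustrative) does this:

```python
from collections import defaultdict

# The crossing of two segments, modelled as a graph: a centre vertex 'A'
# joined to the four endpoints (each segment is subdivided at the crossing).
edges = [('A', 'n'), ('A', 's'), ('A', 'e'), ('A', 'w')]
vertices = {'A', 'n', 's', 'e', 'w'}

def components(vertices, edges):
    """Number of connected components (beta_0 of the graph), by depth-first search."""
    adj = defaultdict(set)
    for u, v in edges:
        adj[u].add(v)
        adj[v].add(u)
    seen, count = set(), 0
    for v in vertices:
        if v in seen:
            continue
        count += 1
        stack = [v]
        while stack:
            x = stack.pop()
            if x not in seen:
                seen.add(x)
                stack.extend(adj[x] - seen)
    return count

print(components(vertices, edges))                      # 1: the whole space is connected
punctured_edges = [e for e in edges if 'A' not in e]
print(components(vertices - {'A'}, punctured_edges))    # 4: removing A leaves four pieces
```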

If one uses homology on suitably constructed auxiliary spaces, one can obtain classification criteria for many interesting problems in shape discrimination. To think through how we might apply these ideas to the problem of distinguishing between letters, let us define an end of a space $X$ to be a point $x \in X$ such that there is a neighborhood $N$ of $x$ which is homeomorphic to $[0, 1)$, with the homeomorphism carrying $x$ to $0$. In this case, the auxiliary or derived space is the set of ends of the space, $e(X)$, and we can compute its zero-dimensional homology $H_0$ to get the Betti number $\beta_0$. We now obtain a partition of the set of letters according to the number of ends, and we have obtained improved discrimination in this way. Removing singular points and computing $\beta_0$ produces further resolution on the letters.