Math 396. An application of Gram-Schmidt to prove connectedness

1. Motivation and background

Let V be an n-dimensional vector space over R, and define GL(V) to be the set of invertible linear maps V → V (the notation stands for "General Linear"). In other words, this is the open locus in Hom_R(V, V) where the continuous (multi-variable) polynomial function det : Hom_R(V, V) → R is non-vanishing. When V = R^n, this is the set of invertible n × n matrices in Mat_{n×n}(R), and it is usually called GL_n(R) rather than GL(R^n). For example, when n = 2 and we imagine the 4-dimensional space Mat_{2×2}(R) as coordinatized by the matrix entries a, b, c, d, then GL_2(R) is the complement of the hypersurface in R^4 cut out by the condition ad − bc = 0. It's quite big.

We make GL(V) into a topological space by viewing it as an open subset of the finite-dimensional R-vector space Hom_R(V, V). The concepts of open set, closed set, limit, etc. in GL(V) can be expressed in terms of any choice of linear coordinates on V used to identify the situation with GL_n(R), in which two matrices are close exactly when their corresponding matrix entries (for each i, j) are close in R.

Consider the determinant map det : GL(V) → R − {0}. Being a polynomial function in the matrix entries relative to any choice of basis of V, this is visibly continuous, and it is trivially surjective (think of diagonal matrices). But the target is disconnected, so the source cannot be connected. More specifically,

U+ = {T ∈ GL(V) | det T > 0},  U− = {T ∈ GL(V) | det T < 0}

is a non-trivial separation of GL(V). But is this the only obstruction to connectedness? More specifically, if we define GL+(V) = {T ∈ GL(V) | det T > 0}, then is this connected? In fact, we will even prove it is path-connected. This is hard to see right away, but the proof will exhibit an explicit, geometrically constructed path of matrices joining up the identity map to any chosen T with positive determinant.
The method will essentially amount to a vivid geometric perspective on the Gram-Schmidt process.

A related connectedness question concerns the orthogonal matrices. Suppose we fix a choice of an inner product ⟨·, ·⟩ on V. We define

O(V) = O(V, ⟨·, ·⟩) = {T ∈ Hom_R(V, V) | ⟨T(v), T(v′)⟩ = ⟨v, v′⟩ for all v, v′ ∈ V},

called the orthogonal group for the data (V, ⟨·, ·⟩), though we usually suppress mention of ⟨·, ·⟩ in the notation. In other words, if T* denotes the adjoint map then the condition is T*T = 1 (which forces TT* = 1). In concrete terms, if we choose an orthonormal basis to identify V with R^n in such a way that our inner product goes over to the standard one, then O(V) becomes the explicit

O_n(R) = {M ∈ GL_n(R) | M M^t = 1}.

This is a closed subset of GL_n(R), since the condition M M^t = 1 amounts to a system of n^2 polynomial conditions on the matrix entries of M. For example, when n = 2 with

M = ( a  b )
    ( c  d )

we get the conditions a^2 + b^2 = 1, c^2 + d^2 = 1, ac + bd = 0.
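To make the n = 2 case concrete, here is a small pure-Python sanity check (the helper name and the sample matrices are ours, not from the handout): the three conditions say exactly that the rows of M are unit vectors orthogonal to each other, i.e. M Mᵗ = 1.

```python
import math

def is_orthogonal_2x2(M, tol=1e-12):
    """Check the three conditions from the text for M = ((a, b), (c, d)):
    a^2 + b^2 = 1, c^2 + d^2 = 1, ac + bd = 0 (i.e. M M^t = 1)."""
    (a, b), (c, d) = M
    return (abs(a*a + b*b - 1) < tol and
            abs(c*c + d*d - 1) < tol and
            abs(a*c + b*d) < tol)

theta = 0.73  # an arbitrary sample angle
rotation = [[math.cos(theta), -math.sin(theta)],
            [math.sin(theta),  math.cos(theta)]]  # in O_2(R), det = +1
reflection = [[1.0, 0.0],
              [0.0, -1.0]]                        # in O_2(R), det = -1
shear = [[1.0, 1.0],
         [0.0, 1.0]]                              # invertible but not orthogonal

print(is_orthogonal_2x2(rotation), is_orthogonal_2x2(reflection),
      is_orthogonal_2x2(shear))  # True True False
```

The rotation and reflection both pass the test even though their determinants have opposite signs, which anticipates the ±1 dichotomy discussed next.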

The elements of O_n(R) are the linear maps from R^n to R^n that preserve the standard inner product on R^n. We know that all eigenvalues of such a matrix over C (where it is unitary) have absolute value 1. For any M ∈ Mat_{n×n}(R) we know that the real number det(M) has absolute value equal to ∏ |λ_i|, where {λ_i} is the set of eigenvalues of M in C (counting multiplicities in terms of roots of the characteristic polynomial). Since |λ_i| = 1 for all i in the orthogonal (or rather, unitary) case, we see that |det(M)| = 1 for such matrices, so the determinant function on O_n(R) has values in {±1}. As with GL(V), the sign of the (continuous) determinant gives an evident non-trivial separation. Let's restrict our attention to

SO(V) = SO(V, ⟨·, ·⟩) = {T ∈ O(V) | det T = 1} = O(V) ∩ GL+(V).

Here, "S" stands for "special", which is the usual terminology when one imposes a det = 1 condition (e.g., SL(V) denotes the subgroup of elements in GL(V) with determinant 1, called the special linear group of V; for V = R^n it is usually denoted SL_n(R) ⊆ GL_n(R)). Is SO(V) connected? In fact, we'll prove it is path-connected.

Actually, the method of proof of the two connectedness results will be to first prove path-connectedness of SO(V), and then to use the choice of an inner product and the Gram-Schmidt algorithm to deduce from this that GL+(V) is path-connected. In order to motivate things with less clutter, we will first reduce the case of GL+(V) to that of SO(V), and then we'll handle the latter case.

2. Path-connectedness of GL+(V)

Let T ∈ GL+(V) be an element. We seek a continuous path in GL+(V) which links up T to the identity map. We now fix a choice of inner product on V, which can certainly be done (in lots of ways), so we get a corresponding orthogonal group O(V). What we'll actually do is use the Gram-Schmidt algorithm to find a path in GL(V) joining up T to an element of SO(V).
Then the path-connectedness of the latter (which we'll prove in the next section) will finish the job.

Here is the basic idea. Choose an orthonormal basis {e_1, ..., e_n} of V. Let v_j = T(e_j) be the image of the jth basis vector under the linear map T. Let {v_1′, ..., v_n′} be the orthonormal basis which results from applying the Gram-Schmidt process to the v_j's. Let T′ : V → V be the linear map which sends e_j to v_j′ (so T′ is an isomorphism). We will continuously deform the ordered set {v_1, ..., v_n} into {v_1′, ..., v_n′} using the Gram-Schmidt formulas, and this will lead to a path joining up T to T′ inside of GL+(V). We'll then show that T′ ∈ SO(V), so we'll be done (or rather, will be reduced to path-connectedness of SO(V)).

More explicitly, consider the formulas which define the Gram-Schmidt algorithm. We first run through it without normalizing: w_1 = v_1 and

w_j = v_j − Σ_{i=1}^{j−1} (⟨v_j, w_i⟩ / ⟨w_i, w_i⟩) w_i

for 2 ≤ j ≤ n. Then v_j′ = w_j / ‖w_j‖ for 1 ≤ j ≤ n. We now define visibly continuous functions w_j : [0, 1] → V as follows:

w_1(t) = v_1,   w_j(t) = v_j − t Σ_{i=1}^{j−1} (⟨v_j, w_i⟩ / ⟨w_i, w_i⟩) w_i.

Note that for every t and 1 ≤ i ≤ n we have

span(w_1(t), ..., w_i(t)) = span(v_1, ..., v_i),

so {w_1(t), ..., w_n(t)} is a basis of V for all t. Also, for t = 0 this is the original basis {v_1, ...} and for t = 1 it is the non-normalized basis {w_1, ...}. Making one final modification, if we define functions u_j : [0, 1] → V by the rule

u_j(t) = w_j(t) / ‖w_j(t)‖^t,

then each u_j is continuous (why?) with {u_1(t), ..., u_n(t)} a basis of V for all t; this yields the original basis {v_1, ...} for t = 0 and the Gram-Schmidt output {v_1′, ...} for t = 1. We conclude that [0, 1] → V × ··· × V = V^n defined by t ↦ (u_1(t), ..., u_n(t)) is a continuous system of bases which moves from {v_1, ..., v_n} to {v_1′, ..., v_n′}. Geometrically, we visualize a collection of n arrows sticking out of the origin, with this collection of arrows moving continuously from the v_i's to the v_i′'s. Such a visualization is sometimes called a moving frame.

Now recall we began with a linear map T : V → V determined by the condition T(e_j) = v_j, and we also defined a linear map T′ : V → V by the property T′(e_j) = v_j′. Note that T′ carries an orthonormal basis to an orthonormal basis. This at least makes T′ orthogonal, thanks to:

Lemma 2.1. Let T : (V, ⟨·, ·⟩) → (V′, ⟨·, ·⟩′) be a map between finite-dimensional inner product spaces, with ⟨T(e_i), T(e_j)⟩′ = ⟨e_i, e_j⟩ for a basis {e_1, ..., e_n} of V. Then T respects the inner products. That is, ⟨T(v_1), T(v_2)⟩′ = ⟨v_1, v_2⟩ for all v_1, v_2 ∈ V.

Proof. The pairings (v_1, v_2) ↦ ⟨T(v_1), T(v_2)⟩′ and (v_1, v_2) ↦ ⟨v_1, v_2⟩ are bilinear forms on V which, by hypothesis, coincide on pairs from a basis. But by bilinearity, a bilinear form is uniquely determined by its values on pairs from a basis. Thus, these two bilinear forms coincide, and that is what we needed to prove.

Although this lemma shows that T′ is orthogonal, it isn't immediately clear that det T′ = 1 (as opposed to det T′ = −1).
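The deformed Gram-Schmidt frame defined above is easy to experiment with numerically. Here is a pure-Python sketch (function and variable names, and the sample basis, are ours); it implements w_j(t) and u_j(t) as displayed and checks the two endpoint claims.

```python
import math

def dot(u, v):
    return sum(a * b for a, b in zip(u, v))

def gram_schmidt_path(vs, t):
    """Deformed Gram-Schmidt, following the handout's formulas:
    w_j(t) = v_j - t * sum_{i<j} (<v_j, w_i>/<w_i, w_i>) w_i, where the
    w_i are the fixed (non-normalized) Gram-Schmidt vectors, and
    u_j(t) = w_j(t) / ||w_j(t)||**t.
    Returns [u_1(t), ..., u_n(t)]: the input basis at t = 0 and the
    orthonormal Gram-Schmidt output at t = 1."""
    # The fixed non-normalized Gram-Schmidt vectors w_i.
    ws = []
    for v in vs:
        w = list(v)
        for p in ws:
            c = dot(v, p) / dot(p, p)
            w = [wi - c * pi for wi, pi in zip(w, p)]
        ws.append(w)
    # The deformation at time t.
    us = []
    for j, v in enumerate(vs):
        wt = list(v)
        for p in ws[:j]:
            c = t * dot(v, p) / dot(p, p)
            wt = [wi - c * pi for wi, pi in zip(wt, p)]
        norm_t = math.sqrt(dot(wt, wt)) ** t
        us.append([wi / norm_t for wi in wt])
    return us

# A sample basis of R^3 (our choice, for illustration).
vs = [[1.0, 1.0, 0.0], [0.0, 1.0, 1.0], [1.0, 0.0, 1.0]]
print(gram_schmidt_path(vs, 0.0) == vs)          # t = 0: the original basis
u1 = gram_schmidt_path(vs, 1.0)
print(all(abs(dot(u1[i], u1[j]) - (i == j)) < 1e-9
          for i in range(3) for j in range(3)))  # t = 1: orthonormal
```

Intermediate values of t interpolate between the two, giving the moving frame described above.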
The fact that T′ ∈ SO(V), which is to say det T′ > 0, will follow from our next observation: there is a continuous path in GL(V) which begins at our initial T and ends at T′. Indeed, define T_t : V → V to be the linear map determined by the requirement T_t(e_j) = u_j(t). Note that T_0 = T and T_1 = T′. Moreover, since {u_j(t)} is a basis for all t, it follows that T_t : V → V is invertible for all t, which is to say T_t ∈ GL(V).

We now show that the map [0, 1] → GL(V) defined by t ↦ T_t is actually continuous. To see the continuity, we impose coordinates via the orthonormal basis e = {e_1, ..., e_n}. In such terms, T_t is the matrix whose jth column is the list of e-coordinates of T_t(e_j) = u_j(t). But recall that t ↦ u_j(t) is a continuous function [0, 1] → V, and a map to a finite-dimensional R-vector space is continuous if and only if the resulting component functions relative to some (and then any) basis are continuous as maps to R. That is, the e-coordinate functions of the u_j(t)'s are continuous maps [0, 1] → R. In more explicit terms, if we write u_j(t) = Σ_i a_ij(t) e_i, then each a_ij : [0, 1] → R is continuous. Thus, if we stare at the matrix T_t = (a_ij(t)) in the e-coordinates, then every matrix entry is a continuous R-valued function of t. Since continuity for a matrix-valued function is equivalent to continuity of the matrix entry functions, it follows that [0, 1] → Hom_R(V, V) ≃ Mat_{n×n}(R) defined by t ↦ T_t really is continuous. The topology on GL(V) is induced by Hom_R(V, V), which is to say that continuity of t ↦ T_t as a GL(V)-valued map is a consequence of its continuity as a Hom_R(V, V)-valued map.

Summarizing what we have done so far: given a linear isomorphism T ∈ GL(V), we have constructed a continuous path inside of GL(V) which begins at T and ends at T′ ∈ O(V) (where we chose an inner product on V). Crucial to this was the explicit nature of the Gram-Schmidt algorithm. This basic construction never actually needed that det T > 0. But now we use the condition det T > 0 to prove det T′ > 0 (and hence T′ ∈ SO(V), as T′ is orthogonal). The point is simply that the map det : GL(V) → R − {0} is continuous, and hence the map [0, 1] → R − {0} defined by t ↦ det(T_t) is continuous (being a composite of continuous maps). Since a continuous map φ : [0, 1] → R − {0} must have connected (and hence interval) image, the sign of φ(t) must be the same throughout (Intermediate Value Theorem!).
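The sign argument can be observed numerically: sample det(T_t) along the path built from the deformed Gram-Schmidt frame and confirm that it never changes sign. This is a hedged sketch (the sample matrix, the helper names, and the 3 × 3 restriction are all our choices, not the handout's).

```python
import math

def dot(u, v):
    return sum(a * b for a, b in zip(u, v))

def det3(M):
    """Determinant of a 3x3 matrix (given as rows) by cofactor expansion."""
    (a, b, c), (d, e, f), (g, h, i) = M
    return a*(e*i - f*h) - b*(d*i - f*g) + c*(d*h - e*g)

def frame_at(vs, t):
    """The basis u_1(t), ..., u_n(t) from the deformed Gram-Schmidt formulas."""
    ws = []
    for v in vs:
        w = list(v)
        for p in ws:
            c = dot(v, p) / dot(p, p)
            w = [wi - c * pi for wi, pi in zip(w, p)]
        ws.append(w)
    us = []
    for j, v in enumerate(vs):
        wt = list(v)
        for p in ws[:j]:
            c = t * dot(v, p) / dot(p, p)
            wt = [wi - c * pi for wi, pi in zip(wt, p)]
        n = math.sqrt(dot(wt, wt)) ** t
        us.append([wi / n for wi in wt])
    return us

# Columns v_j = T(e_j) of a sample T with det T > 0 (our choice).
vs = [[2.0, 0.0, 0.0], [1.0, 1.0, 0.0], [0.0, 1.0, 3.0]]

signs = set()
for k in range(101):
    t = k / 100
    us = frame_at(vs, t)
    # T_t is the matrix whose columns are the u_j(t).
    Tt = [[us[j][i] for j in range(3)] for i in range(3)]
    signs.add(det3(Tt) > 0)
print(signs)  # {True}: det(T_t) stays positive along the whole path
```

At t = 1 the frame is orthonormal, so det(T_1) = ±1; the constant positive sign forces det(T_1) = +1, matching the argument in the text.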
In our situation, it follows that the function t ↦ det(T_t) has constant sign. Since the sign is positive at t = 0, it must then be positive at t = 1. We conclude that not only is T′ ∈ SO(V), but in fact we have constructed a continuous path from T to T′ entirely inside of GL+(V). Now we just need to prove the path-connectedness of SO(V) to find a path in there linking up T′ to the identity. This is done in the next section.

3. Path-connectedness of SO(V)

Choose any T ∈ SO(V). We will find a continuous path in SO(V) which begins at T and ends at the identity map. This will yield the desired path-connectedness. Choose an orthonormal basis {e_j} of V, and let v_j = T(e_j), so by orthogonality of T we know that {v_j} is an orthonormal basis of V as well. We will define a continuous function u : [0, 1] → V × ··· × V = V^n, described by t ↦ (u_1(t), ..., u_n(t)), such that u(0) = {e_1, ..., e_n}, u(1) = {v_1, ..., v_n}, and u(t) = {u_1(t), ..., u_n(t)} is an orthonormal basis of V for all t ∈ [0, 1].

Suppose for a moment that we have such a continuous system of orthonormal bases. Define the linear maps T_t : V → V by the condition T_t(e_j) = u_j(t). The map T_t is orthogonal since it takes an orthonormal basis to an orthonormal basis. Note that T_0 = id_V and T_1 = T. By the same method as in the previous section, the continuity of u implies that t ↦ T_t is a continuous map from [0, 1] to GL(V), and even into O(V). In particular, the function det(T_t) is a continuous non-vanishing function on [0, 1] with values in {±1} (since orthogonal maps from V to V have determinant ±1), whence this determinant is constant. The value at t = 0 is det(T_0) = det(id_V) = 1, so t ↦ T_t is a continuous path in SO(V) connecting the identity map to T, thereby finishing the proof of path-connectedness once we have constructed the above continuous system u of orthonormal bases moving from {e_i} to {v_i}. The construction of such a continuous u must somewhere use that the orthogonal map T : V → V sending e_j to v_j has determinant 1 rather than −1 (as otherwise no such u can exist!).

Now we give the construction of u. If dim V = 1, then the only orthogonal map on V with determinant 1 is the identity, so SO(V) consists of a single element and hence path-connectedness is trivial. We induct on dim V, so we can assume dim V > 1. Consider the two ordered orthonormal bases {e_i} and {v_i} related by the orthogonal map T with det T = 1. If e_1 and v_1 are linearly independent, let W be the 2-dimensional span of e_1 and v_1. If we have linear dependence, let W be a 2-dimensional subspace containing the common line spanned by e_1 and v_1. We have an orthogonal decomposition V = W ⊕ W^⊥ (note W^⊥ = 0 in case dim V = 2). Choose an ordered orthonormal basis of W of the form {v_1, v_1′}. We have e_1 = a v_1 + a′ v_1′ with a^2 + a′^2 = 1. We can find θ ∈ [0, 2π) such that (a, a′) = (cos θ, sin θ), so if we let r_t : W → W be the rotation by angle tθ for 0 ≤ t ≤ 1, then r_0 is the identity and r_1 is a rotation which sends v_1 to e_1. Define the linear map T_t : V → V on V = W ⊕ W^⊥ by the requirement that on W^⊥ it acts as the identity and on W it acts by r_t.
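In coordinates on W relative to the ordered orthonormal basis {v_1, v_1′}, the family r_t is just rotation by the angle tθ. A quick sketch of this step (the sample values of (a, a′) are our own, chosen to satisfy a^2 + a′^2 = 1):

```python
import math

def rotate(angle, p):
    """Apply the rotation by `angle` to the point p = (x, y) of the plane W,
    written in coordinates relative to the ordered basis {v_1, v_1'}."""
    x, y = p
    return (math.cos(angle) * x - math.sin(angle) * y,
            math.sin(angle) * x + math.cos(angle) * y)

# In these coordinates v_1 = (1, 0) and e_1 = (a, a') with a^2 + a'^2 = 1.
a, a_prime = 0.6, 0.8           # sample values with a^2 + a'^2 = 1
theta = math.atan2(a_prime, a)  # so (a, a') = (cos(theta), sin(theta))

v1 = (1.0, 0.0)
print(rotate(0.0 * theta, v1))  # r_0 is the identity: v_1 stays put
x, y = rotate(1.0 * theta, v1)  # r_1 sends v_1 to e_1
print(abs(x - a) < 1e-12 and abs(y - a_prime) < 1e-12)  # True
```

Each intermediate r_t is orthogonal with determinant 1, which is exactly why extending it by the identity on W^⊥ keeps T_t inside SO(V).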
It is clear from the construction on W and W^⊥ that T_t is an orthogonal map for all t, and even has determinant 1 for all t. The continuity of the trigonometric matrix entries of r_t makes it clear that t ↦ T_t ∘ T is a continuous map from [0, 1] to SO(V). Moreover, T_0 ∘ T = T and T_1 ∘ T sends e_1 to e_1. Thus, by moving along the continuous path t ↦ T_t ∘ T in SO(V) we link up our original map T to one which fixes e_1. If we can find a continuous path in SO(V) from T_1 ∘ T to the identity map, we'll be done by simply moving along the concatenation of the two paths.

Since T_1 ∘ T fixes e_1, if we let V′ = (Re_1)^⊥ then V = Re_1 ⊕ V′ is an orthogonal decomposition and the orthogonal map T_1 ∘ T must take V′ back into V′. If we let T′ : V′ → V′ denote the orthogonal map induced by T_1 ∘ T, then the action of T_1 ∘ T on V = Re_1 ⊕ V′ is described by id_{Re_1} ⊕ T′. Since dim V′ < dim V and 1 = det(T_1 ∘ T) = det(id_{Re_1}) · det T′ = det T′, we have T′ ∈ SO(V′), so by induction there is a continuous path [0, 1] → SO(V′), written t ↦ T′_t, which begins at T′ and ends at id_{V′}. Thus, the maps id_{Re_1} ⊕ T′_t form a continuous path in SO(V) beginning at T_1 ∘ T and ending at the identity.

Remark 3.1. We conclude with a challenge question. Observe that C^× is connected (in contrast with R^×). Hence, there is no determinant obstruction to connectivity of GL_n(C). Thus, one may be led to guess that if V is a nonzero finite-dimensional C-vector space then the open subset GL(V) of C-linear automorphisms in Hom_C(V, V) is connected, and even path-connected. (Here we give any finite-dimensional C-vector space, such as Hom_C(V, V), its natural topology as a finite-dimensional R-vector space.) Prove the correctness of this guess by using moving frames in the C-vector space V.
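As a numerical warm-up for the remark (not a solution to the challenge), one can see why C^× = C − {0} is path-connected: shrink the modulus to 1 and unwind the argument, never passing through 0. A sketch, with a sample point of our choosing:

```python
import cmath

def path_to_one(z, t):
    """A continuous path [0, 1] -> C - {0} with path_to_one(z, 0) = z and
    path_to_one(z, 1) = 1; the modulus r**(1 - t) never vanishes."""
    r, theta = abs(z), cmath.phase(z)
    return (r ** (1 - t)) * cmath.exp(1j * (1 - t) * theta)

z = -2.0 + 1.5j  # a sample nonzero complex number
print(abs(path_to_one(z, 0.0) - z) < 1e-12)    # True: starts at z
print(abs(path_to_one(z, 1.0) - 1.0) < 1e-12)  # True: ends at 1
print(all(path_to_one(z, k / 100) != 0 for k in range(101)))  # True: avoids 0
```

This is the n = 1 case of the guess; the challenge is to carry out the analogous moving-frame argument for arbitrary n.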