
Practice final solutions. I did not include definitions, which you can find in Axler or in the course notes. These solutions are on the terse side, but would be acceptable in the final. However, if you are not sure about something, you will be much more likely to get partial credit if you write out more details!

Problem 1. No justification is needed. In this question, $V$ is an $n$-dimensional vector space over $\mathbb{C}$, $T$ and $S$ are linear operators from $V$ to itself, and $W$ is an $m$-dimensional inner product space over $\mathbb{R}$. Mark true or false.

(a) Any spanning set for $V$ contains a basis. (True.)
(b) $V$ is isomorphic to its dual space $V^*$. (True; they have the same dimension.)
(c) Suppose $W_1, W_2$ are two subspaces of $W$ of the same dimension. Then there exists an isometry $\sigma : W \to W$ so that $\sigma(W_1) = W_2$. (True: take orthonormal bases for both, extend, and map one to the other.)
(d) If $TS$ is zero, then $ST$ is zero. (False: take $2 \times 2$ matrices $S$, $T$ with a nonzero entry only in the $(1,1)$ and $(1,2)$ positions, respectively.)
(e) If $TS$ is not injective, then $ST$ is not injective. (True: they have the same determinant.)
(f) If $TS$ is the identity operator, then $ST$ is the identity operator. (True: in finite dimensions, if $S$ is an inverse of $T$ on one side, it is an inverse on both sides.)
(g) The characteristic polynomials of $TS$ and $ST$ are the same. (True; tricky.)
(h) Any linear transformation from $W$ to itself has an eigenvector. (False: a rotation of $\mathbb{R}^2$; however, any self-adjoint transformation has an eigenvector.)
(i) If $x, y \in W$ satisfy $\langle x, y \rangle = 0$, and $x, y$ are both nonzero, then $x$ and $y$ are linearly independent. (True; the same argument as "orthonormal sets are linearly independent.")
(j) If $T^{n+1} = 0$, then also $T^n = 0$. (True; consider $\operatorname{image}(T) \supseteq \operatorname{image}(T^2) \supseteq \cdots$; this was on an old homework.)
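If you want to sanity-check (d) and (g) numerically, a short numpy computation does the job (this is only an illustration, not part of the graded solutions): the $2 \times 2$ counterexample for (d) can be multiplied out directly, and np.poly returns the coefficients of the characteristic polynomial of a square matrix, so (g) can be spot-checked on randomly chosen matrices.

```python
import numpy as np

# Counterexample for (d): TS = 0 but ST != 0.
S = np.array([[1.0, 0.0], [0.0, 0.0]])   # nonzero entry only in position (1, 1)
T = np.array([[0.0, 1.0], [0.0, 0.0]])   # nonzero entry only in position (1, 2)
print(T @ S)   # the zero matrix
print(S @ T)   # not the zero matrix

# Spot check for (g): TS and ST have the same characteristic polynomial.
rng = np.random.default_rng(0)
A, B = rng.standard_normal((3, 3)), rng.standard_normal((3, 3))
print(np.allclose(np.poly(A @ B), np.poly(B @ A)))   # True
```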

Problem 2. Let $V$ be a finite-dimensional space and $U \subseteq V$ a subspace with $U \neq V$. Suppose that $x_1, \dots, x_r$ is a linearly independent list in $V$ such that $\operatorname{span}(x_1, \dots, x_r) + U = V$. Prove that we can find $x_{r+1}, \dots, x_n \in U$ so that $x_1, \dots, x_n$ is a basis for $V$.

We can extend $x_1, \dots, x_r$ to a basis $x_1, \dots, x_r, y_{r+1}, \dots, y_n$ of $V$ (thus $n = \dim V$). For each $r+1 \le i \le n$, since $\operatorname{span}(x_1, \dots, x_r) + U = V$, there exist $w_i \in \operatorname{span}(x_1, \dots, x_r)$ and $x_i \in U$ with
$$y_i = w_i + x_i. \tag{1}$$
Then $x_1, \dots, x_n$ is a basis as desired. Indeed, it suffices to verify that it spans $V$, since $n = \dim(V)$. We may write any $v \in V$ as $\sum_{i=1}^{r} \alpha_i x_i + \sum_{i=r+1}^{n} \beta_i y_i$, which by (1) we may rewrite as
$$\sum_{i=1}^{r} \alpha_i x_i + \sum_{i=r+1}^{n} \beta_i x_i + \sum_{i=r+1}^{n} \beta_i w_i.$$
The last term on the right belongs to $\operatorname{span}(x_1, \dots, x_r)$, and so the entire expression belongs to $\operatorname{span}(x_1, \dots, x_n)$. Thus $v \in \operatorname{span}(x_1, \dots, x_n)$, as required.

Prove there exists a basis $(e_1, \dots, e_n)$ for $V$ so that $e_i \notin U$ for all $i$.

Choose a basis $e_1, \dots, e_t$ for $U$ and extend it to a basis $e_1, \dots, e_n$ for $V$. Then $e_j \notin U$ for $j > t$; also $e_j + e_n \notin U$ for $j \le t$, since otherwise $e_n = (e_j + e_n) - e_j$ would belong to $U$. Now consider
$$e_1 + e_n,\; e_2 + e_n,\; \dots,\; e_t + e_n,\; e_{t+1},\; e_{t+2},\; \dots,\; e_n.$$
It is still a basis for $V$, because it is linearly independent:
$$\sum_{i=1}^{t} \alpha_i (e_i + e_n) + \sum_{i=t+1}^{n} \alpha_i e_i = 0 \;\Longrightarrow\; \sum_{i=1}^{n-1} \alpha_i e_i + (\alpha_n + \alpha_1 + \alpha_2 + \cdots + \alpha_t)\, e_n = 0,$$
which implies $\alpha_i = 0$ for $i \le n-1$, and then that $\alpha_n = 0$.

Problem 3. Suppose $V$, $W$ are finite-dimensional vector spaces. Let $K$ be a subspace of $V$ and $I$ a subspace of $W$.

Explain why $\{T \in L(V, W) : \operatorname{null}(T) = K \text{ and } \operatorname{image}(T) = I\}$ is not a subspace of $L(V, W)$.

It does not contain $T = 0$ (at least, if $K$ is not all of $V$).

Compute (with proof) the dimension of $\{T \in L(V, W) : \operatorname{null}(T) \supseteq K \text{ and } \operatorname{image}(T) \subseteq I\}$.

Let $Q$ be the subspace described by this condition. Take a basis $(v_1, \dots, v_k)$ for $K$ and extend to a basis $(v_1, \dots, v_k, y_1, \dots, y_r)$ for $V$. Let $Y = \operatorname{span}(y_1, \dots, y_r)$. Then $V = K \oplus Y$ (an internal direct sum). Consider the maps
$$\Phi : Q \to L(Y, I), \qquad \Psi : L(Y, I) \to Q,$$
where $\Phi$ restricts a linear map to $Y$, while $\Psi(S)$ extends a map by zero, i.e. $\Psi(S)(k + y) = S(y)$ for $k \in K$, $y \in Y$. This is well-defined because the map $K \times Y \to V$ defined by $(k, y) \mapsto k + y$ is an isomorphism, by definition of the internal direct sum. Then $\Phi$ and $\Psi$ are inverse to one another, and so $Q$ is isomorphic to $L(Y, I)$. Thus
$$\dim Q = \dim Y \cdot \dim I = (\dim V - \dim K) \cdot \dim I.$$

Problem 4. Let $V$ be a finite-dimensional vector space over $\mathbb{C}$. Let $T \in L(V, V)$. Define the adjoint (dual) map $\hat{T} : V^* \to V^*$.

Suppose that there exists a basis of eigenvectors for $T$. Prove that there exists a basis of eigenvectors for $\hat{T}$.

Suppose $(v_i)$ is a basis of eigenvectors, i.e. $T(v_i) = \lambda_i v_i$. The matrix of $T$ with respect to the basis $(v_i)$ (used in both source and target) is diagonal, with the $\lambda_i$'s on the diagonal. Thus the matrix of $\hat{T}$ with respect to the dual basis $(v_i^*)$ is the transpose of this, i.e. again diagonal with the $\lambda_i$'s on the diagonal. Therefore $\hat{T}(v_i^*) = \lambda_i v_i^*$, so $(v_i^*)$ gives a basis of $\hat{T}$-eigenvectors.
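This first part of Problem 4 can also be checked numerically (an illustration only; the matrix below is made up for the check): if $T = P \operatorname{diag}(\lambda_i) P^{-1}$, then the rows of $P^{-1}$ represent the dual basis functionals $v_i^*$, and they should be eigenvectors of the transpose $T^{\mathsf{T}}$, which is the matrix of $\hat{T}$ in the dual basis.

```python
import numpy as np

# Build a diagonalizable T with prescribed eigenvectors (columns of P) and eigenvalues.
rng = np.random.default_rng(1)
P = rng.standard_normal((3, 3))            # columns: a basis of eigenvectors v_i
lam = np.array([2.0, -1.0, 5.0])           # the corresponding eigenvalues
T = P @ np.diag(lam) @ np.linalg.inv(P)    # so T v_i = lam_i v_i

dual = np.linalg.inv(P)                    # row i represents the functional v_i^*: x -> (P^{-1} x)_i
for i in range(3):
    # The matrix of the dual map in the dual basis is T^T, and v_i^* is an eigenvector of it.
    print(np.allclose(T.T @ dual[i], lam[i] * dual[i]))   # True for each i
```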

3 Suppose that image(t ) null(t ). Prove that image( ˆT ) null( ˆT ). Note that if image(t ) null(t ) if and only if T 2 = 0. This is so if and only if T ˆ 2 = ˆT ˆT = ˆT 2 is zero. This is so if and only if ˆT 2 = 0. Problem 5. Let V be a finite dimensional vector space over the complex numbers. Let T L(V, V ). Let I be the identity transformation from V to V, and let T, T 2,... be the successive powers of T. In this question we will consider the subspace Q of L(V, V ) spanned by the (infinite) ( list 1, T, ) T 2,.... 2 0 Compute Q in the case when V = C 2 and T =. 0 3 Q is equal to the vector space of diagonal matrices. Prove that there exists N 1 so that {1, T, T 2,..., T N } spans Q. Q is finite-dimensional, because it is a subspace of the finite-dimensional space L(V, V ). Therefore, it is spanned by a finite collection e 1,..., e r Q. Each e i is a linear combination of (1, T,..., T Ni ) for some N i. Let N = max N i. Then every e i belongs to the span of (1, T,..., T N ), and thus (1, T,..., T N ) spans Q. Prove that dim(q) dim(v ) always. 1 The Cayley-Hamilton theorem shows that there is a polynomial p of degree dim V specifically, the characteristic polynomial of T so that p(t ) = 0. In particular, [ ] T dim V is a linear combination of I, T,..., T dim V 1. Now, T n span(i, T,..., T dim V 1 ) = T n+1 span(t, T 2,..., T dim V ), and the latter space is equal to span(i, T,..., T dim V 1 ) by above. By induction, we see T n span(i, T,..., T dim V 1 ) for every n. Thus, Q is spanned by dim V elements. So dim(q) dim V. Problem 6. Consider R 3 with the usual inner product (the dot product) and let M be a 3 3 symmetric matrix, thought of as a linear map R 3 R 3. Let e 1 = (1, 1, 2), e 2 = (3, 1, 0) and let W be the span of e 1, e 2. Find a vector e 3 perpendicular to W. We apply the Gram-Schmidt process to the linearly independent set (e 1, e 2, (0, 0, 1)). (We check this is linearly independent by computing the determinant; it is easy to compute by expansion around the last row. It is also possible to directly solve the equations e 3, e 1 = e 3, e 2 = 0 to get to the answer.) First, we scale e 1 to length 1 to get f 1 = (1, 1, 2)/ 6. Then, we project e 2 to get f 2 = e 2 e 2, f 1 f 1 = (3, 1, 0) (2/3, 2/3, 4/3) = (7/3, 1/3, 4/3). Now we scale f 2 to length 1 to get f 2 = (7, 1, 4)/ 66. Finally, we project (0, 0, 1) to the orthogonal complement of f 1, f 2 : f 3 = (0, 0, 1) (1, 1, 2)/3 (7, 1, 4)( 4)/66 = (1/11, 3/11, 1/11). We scale it to length 1 to get f 3 = (1, 3, 1)/ 11. Thus, e 3 = (1, 3, 1) is a vector perpendicular to W. Suppose you are given that M(w) W whenever w W, i.e., M preserves W. Prove that e 3 is an eigenvector of W. 1 Hint: Use the Cayley-Hamilton theorem.

Suppose you are given that $M(w) \in W$ whenever $w \in W$, i.e., $M$ preserves $W$. Prove that $e_3$ is an eigenvector of $M$.

For every $w \in W$ we have $\langle M(e_3), w \rangle = \langle e_3, M(w) \rangle = 0$, since $M$ is symmetric, $M(w) \in W$, and $e_3$ is perpendicular to $W$. Thus $M(e_3) \in W^{\perp}$. We have seen that $\dim W^{\perp} = 3 - \dim(W) = 1$; since $e_3 \in W^{\perp}$, we must have $W^{\perp} = \operatorname{span}(e_3)$. Since $M(e_3) \in \operatorname{span}(e_3)$, it follows that $e_3$ is an eigenvector.

Find the vector in $W$ that is closest to $(0, 0, 1)$.

It was discussed in class that, if $f_1, f_2$ is an orthonormal basis for $W$, the closest vector to $v$ in $W$ is $w = \langle v, f_1 \rangle f_1 + \langle v, f_2 \rangle f_2$. Here "closest" means that $\|v - w\|^2$ is minimized. In our case this equals
$$\tfrac{1}{3}(1, 1, 2) - \tfrac{4}{66}(7, 1, -4) = (-1/11,\, 3/11,\, 10/11),$$
so $(-1/11, 3/11, 10/11)$ is the closest point of $W$ to $(0, 0, 1)$.

Problem 7. Let $A = \begin{pmatrix} 3 & 2 \\ 1 & 2 \end{pmatrix}$.

Compute the determinant, the characteristic polynomial, and the eigenvalues of $A$.

The determinant is 4. The characteristic polynomial is $(3 - \lambda)(2 - \lambda) - 2 = \lambda^2 - 5\lambda + 4 = (\lambda - 4)(\lambda - 1)$. So the eigenvalues are 4 and 1. The eigenvector $v_1$ corresponding to $\lambda = 4$ belongs to the kernel of $\begin{pmatrix} -1 & 2 \\ 1 & -2 \end{pmatrix}$, e.g. $v_1 = \begin{pmatrix} 2 \\ 1 \end{pmatrix}$; the eigenvector $v_2$ corresponding to $\lambda = 1$ belongs to the kernel of $\begin{pmatrix} 2 & 2 \\ 1 & 1 \end{pmatrix}$, e.g. $v_2 = \begin{pmatrix} 1 \\ -1 \end{pmatrix}$.

Compute $A^{100}$. (Please explain your procedure clearly; then you can get partial credit even if you make numerical errors.)

We have seen $Av_1 = 4v_1$ and $Av_2 = v_2$. Thus $A^{100} v_1 = 4^{100} v_1$ and $A^{100} v_2 = v_2$. In other words, with $D = \begin{pmatrix} 2 & 1 \\ 1 & -1 \end{pmatrix}$, we have $AD = D \begin{pmatrix} 4 & 0 \\ 0 & 1 \end{pmatrix}$. Thus
$$A^{100} = D \begin{pmatrix} 4^{100} & 0 \\ 0 & 1 \end{pmatrix} D^{-1} = \begin{pmatrix} (2 \cdot 4^{100} + 1)/3 & (2 \cdot 4^{100} - 2)/3 \\ (4^{100} - 1)/3 & (4^{100} + 2)/3 \end{pmatrix}.$$
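The same computation gives $A^n = D \operatorname{diag}(4^n, 1) D^{-1}$ for any $n \ge 0$, i.e. the same closed form with 100 replaced by $n$. Here is a small numerical check (illustration only; the exponent 10 is used just to keep the numbers small):

```python
import numpy as np

A = np.array([[3.0, 2.0], [1.0, 2.0]])
D = np.array([[2.0, 1.0], [1.0, -1.0]])              # columns: eigenvectors for 4 and 1

print(np.linalg.det(A))                              # 4.0
print(np.allclose(A @ D, D @ np.diag([4.0, 1.0])))   # True: AD = D diag(4, 1)

n = 10                                               # small exponent, same closed form
closed_form = np.array([[2 * 4.0**n + 1, 2 * 4.0**n - 2],
                        [4.0**n - 1, 4.0**n + 2]]) / 3
print(np.allclose(np.linalg.matrix_power(A, n), closed_form))   # True
```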

5 (2) B n v = (λi + N) n v = (λ n + nλ n 1 N + n(n 1)λ n 2 N 2 /2)v We have simply expanded out (λi + N) n, using the fact that N 3 = 0. There exists v V λ that belongs to the null-space of N: since λ is an eigenvalue, the null-space of B λi is nonzero. Applying the above equation to v, we see simply: B n = λ n v. If λ > 1 and we take n so large that λ n > 30, this contradicts the Fact above, since on the one hand B n v 30 v ; on the other hand λ n v > 30 v. Therefore, λ 1. Applying the same reasoning to B 1, shows that λ 1 1. Putting these two together, λ = 1. This completes the proof that every eigenvalue has absolute value 1. The triangle inequality in an inner product space implies that v 1 + v 2 + v 3 v 1 v 2 v 3. Therefore, (λ n + nλ n 1 N + n(n 1)λ n 2 N 2 n(n 1) /2)v N 2 v n Nv v. 2 So, if we can find v V λ so that N 2 (v) 0, the norm of the right-hand side of the equation (3) B n v = (λi + N) n v = (λ n + nλ n 1 N + n(n 1)λ n 2 N 2 /2)v will grow without bound as n (it is a quadratic function of n.) This is a contradiction to the Fact above, since the left hand side is at most 30 v. If N 2 (v) = 0 for all v V λ, but there exists v V λ so that N(v) 0, the same reasoning still gives a contradiction. Therefore, N = 0 on V λ. Therefore, B(v) = λv whenever v V λ, and any basis for V λ is a basis of genuine eigenvectors. Doing this for every generalized eigenspace shows that there exists a basis of eigenvectors. Note: This problem was too hard and way too long. Sorry; I didn t realize when setting it how long a solution takes to write. Problem 8. Suppose that V, W are finite-dimensional inner product spaces, and T : V W a linear map. Define the singular values of T. [CORRECTED.] Suppose we are given a k-dimensional subspace Q V so that T v v for each v Q. Prove that at least k of the singular values of T counted with multiplicity are 1. Let the singular values of T be σ 1 σ 2.... There exist orthonormal bases e i, f i for V so that T e i = σ i f i. Suppose that the statement is not true, i.e. σ k < 1. We will derive a contradiction. Let W = span(e k,..., e n ). Then dim(q W ) = dim(q) + dim(w ) dim(q + W ) dim(q) + dim(w ) dim(v ) = 1. So there exists nonzero v Q W. Because v W, we can write v = j k α je j for some α j ; then T v = α j σ j f j = αj 2 = v. j k j k α 2 j σ2 j < j k

Problem 8. Suppose that $V$, $W$ are finite-dimensional inner product spaces, and $T : V \to W$ a linear map. Define the singular values of $T$. [CORRECTED.]

Suppose we are given a $k$-dimensional subspace $Q \subseteq V$ so that $\|Tv\| \ge \|v\|$ for each $v \in Q$. Prove that at least $k$ of the singular values of $T$, counted with multiplicity, are $\ge 1$.

Let the singular values of $T$ be $\sigma_1 \ge \sigma_2 \ge \cdots$. There exist orthonormal bases $(e_i)$ for $V$ and $(f_i)$ for $W$ so that $T e_i = \sigma_i f_i$. Suppose that the statement is not true, i.e. $\sigma_k < 1$. We will derive a contradiction. Let $U = \operatorname{span}(e_k, \dots, e_n)$. Then
$$\dim(Q \cap U) = \dim(Q) + \dim(U) - \dim(Q + U) \ge \dim(Q) + \dim(U) - \dim(V) = 1.$$
So there exists a nonzero $v \in Q \cap U$. Because $v \in U$, we can write $v = \sum_{j \ge k} \alpha_j e_j$ for some $\alpha_j$; then $Tv = \sum_{j \ge k} \alpha_j \sigma_j f_j$, so
$$\|Tv\|^2 = \sum_{j \ge k} |\alpha_j|^2 \sigma_j^2 < \sum_{j \ge k} |\alpha_j|^2 = \|v\|^2.$$
Here we used the fact that $\sigma_j \le \sigma_k < 1$ for every $j \ge k$, and that some $\alpha_j$ is nonzero. However, the conclusion $\|Tv\| < \|v\|$ contradicts the fact that $v \in Q$.
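A numerical illustration of the two inequalities behind this argument (the random matrix and the choice of test subspaces below are my own setup, not part of the problem): with the singular values in decreasing order, $\|Tv\| \ge \sigma_k\|v\|$ on the span of the first $k$ right singular vectors, and $\|Tv\| \le \sigma_k\|v\|$ on the span of $e_k, \dots, e_n$, which is the inequality used in the proof.

```python
import numpy as np

rng = np.random.default_rng(2)
T = rng.standard_normal((5, 4))
U_, s, Vt = np.linalg.svd(T)          # s: singular values, decreasing; rows of Vt: e_1, ..., e_4
k = 2

for _ in range(1000):
    top = Vt[:k].T @ rng.standard_normal(k)              # random v in span(e_1, ..., e_k)
    bot = Vt[k - 1:].T @ rng.standard_normal(4 - k + 1)  # random v in span(e_k, ..., e_n)
    assert np.linalg.norm(T @ top) >= s[k - 1] * np.linalg.norm(top) - 1e-9
    assert np.linalg.norm(T @ bot) <= s[k - 1] * np.linalg.norm(bot) + 1e-9
print("inequalities verified")
```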