Linear Algebra Final Exam Solutions, December 13, 2008

Write clearly, with complete sentences, explaining your work. You will be graded on clarity, style, and brevity. If you add false statements to a correct argument, you will lose points.

Problem:   1    2    3    4    5    Total
Points:   20   20   20   20   20    100

1. Let F be a field and let V be a vector space over F.

(a) Let $(v_1, v_2, \ldots, v_n)$ be a list in V. Define what it means for $(v_1, v_2, \ldots, v_n)$ to be a basis of V. Define the dimension of V, explaining when it exists (and is finite).

(b) Prove that if $W_1$ and $W_2$ are linear subspaces of a finite dimensional space V, then $\dim(W_1 + W_2) = \dim(W_1) + \dim(W_2) - \dim(W_1 \cap W_2)$.

(c) Prove that, with the notation of the previous part, $\dim(W_1 \cap W_2) \geq \dim(W_1) + \dim(W_2) - \dim V$.

Solution.

(a) The list is a basis for V if and only if every element of V can be written uniquely as a sum $\sum a_i v_i$, or, equivalently, if the list is independent and spans V. If such a basis exists, then any two bases have the same length, and this length is the dimension of V.

(b) Choose a basis for $W_1 \cap W_2$ and prolong it to bases for $W_1$ and for $W_2$. Thus we find a list $(v_1, \ldots, v_{d+d_1+d_2})$ such that $(v_1, \ldots, v_d)$ is a basis for $W_1 \cap W_2$, $(v_1, \ldots, v_{d+d_1})$ is a basis for $W_1$, and $(v_1, \ldots, v_d, v_{d+d_1+1}, \ldots, v_{d+d_1+d_2})$ is a basis for $W_2$. It is easy to see that this list spans $W_1 + W_2$. Let us check that it is independent. If $\sum a_i v_i = 0$, then $a_1 v_1 + \cdots + a_{d+d_1} v_{d+d_1} = -(a_{d+d_1+1} v_{d+d_1+1} + \cdots + a_{d+d_1+d_2} v_{d+d_1+d_2})$. The vector on the left belongs to $W_1$ and the vector on the right belongs to $W_2$, hence both belong to $W_1 \cap W_2$. Hence $a_1 v_1 + \cdots + a_{d+d_1} v_{d+d_1} = b_1 v_1 + \cdots + b_d v_d$ for some $b_i$, and since $(v_1, \ldots, v_{d+d_1})$ is independent, $a_i = 0$ for $d < i \leq d + d_1$. But then we have a linear combination adding up to zero involving only the vectors in the basis for $W_2$, so all the coefficients are zero. Counting lengths now gives $\dim(W_1 + W_2) = d + d_1 + d_2 = (d + d_1) + (d + d_2) - d = \dim(W_1) + \dim(W_2) - \dim(W_1 \cap W_2)$.

(c) Since $W_1 + W_2$ is contained in V, its dimension is at most $\dim V$. Hence $\dim(W_1 \cap W_2) = \dim(W_1) + \dim(W_2) - \dim(W_1 + W_2) \geq \dim(W_1) + \dim(W_2) - \dim V$.
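As an illustrative aside (not part of the original exam), the dimension formula in (b) can be checked numerically. The sketch below, assuming NumPy and two random subspaces of $\mathbb{R}^5$ given as column spaces, reads off $\dim(W_1 \cap W_2)$ as the nullity of $[A_1 \mid -A_2]$, since pairs $(a, b)$ with $A_1 a = A_2 b$ correspond to vectors of the intersection when each matrix has independent columns.

```python
import numpy as np

rng = np.random.default_rng(0)

# W1 and W2 are the column spaces of two random 5x3 matrices (generically 3-dimensional).
A1 = rng.standard_normal((5, 3))
A2 = rng.standard_normal((5, 3))

dim_W1 = np.linalg.matrix_rank(A1)
dim_W2 = np.linalg.matrix_rank(A2)
dim_sum = np.linalg.matrix_rank(np.hstack([A1, A2]))    # dim(W1 + W2)

# dim(W1 ∩ W2): pairs (a, b) with A1 a = A2 b correspond bijectively to vectors
# A1 a in the intersection, so it equals the nullity of [A1 | -A2].
M = np.hstack([A1, -A2])
dim_cap = M.shape[1] - np.linalg.matrix_rank(M)

assert dim_sum == dim_W1 + dim_W2 - dim_cap
print(dim_W1, dim_W2, dim_sum, dim_cap)                 # typically 3 3 5 1
```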

2. Let T be a linear operator on a finite dimensional vector space V over a field F.

(a) If $\lambda \in F$, what is meant by the $\lambda$-eigenspace $E_\lambda(T)$ of T?

(b) If $\lambda \in F$, what is meant by the generalized $\lambda$-eigenspace $GE_\lambda(T)$ of T?

(c) Let $(v_1, \ldots, v_n)$ be a list of nonzero generalized eigenvectors corresponding to distinct eigenvalues $(\lambda_1, \ldots, \lambda_n)$. Prove that $(v_1, \ldots, v_n)$ is linearly independent. (Remark: if you have trouble with this, you can do it for ordinary eigenvectors instead of generalized eigenvectors for partial credit.)

Solution.

(a) The $\lambda$-eigenspace of T is the kernel of $T - \lambda I$.

(b) The generalized $\lambda$-eigenspace of T is the set of all vectors v such that $(T - \lambda I)^k v = 0$ for some k.

(c) We prove this by induction on n, the cases $n = 0, 1$ being trivial. First we recall that if N is a nilpotent operator and a is a nonzero scalar, then $aI + N$ is invertible. Write $GE_i$ for $GE_{\lambda_i}(T)$. Each $GE_i$ is invariant under $T - \lambda_n I$, and on $GE_i$ we have $T - \lambda_n I = (T - \lambda_i I) + c_i I$, where $c_i = \lambda_i - \lambda_n$. Thus if $i < n$, $T - \lambda_n I$ is invertible on $GE_i$, and hence so is each of its powers. Now suppose that $\sum a_i v_i = 0$. Apply $(T - \lambda_n I)^k$ to conclude that $0 = \sum_{i=1}^{n} a_i (T - \lambda_n I)^k v_i$. Let $v_i' := (T - \lambda_n I)^k v_i$. For k sufficiently large, $v_n' = 0$, so we get $0 = \sum_{i=1}^{n-1} a_i v_i'$. But for $i < n$, $v_i'$ is a nonzero element of $GE_i$, since $(T - \lambda_n I)^k$ is invertible on $GE_i$ and $v_i \neq 0$, and the induction assumption tells us that $a_i = 0$ for all $i < n$. Since $v_n \neq 0$ and now $a_n v_n = 0$, $a_n = 0$ also.
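As a concrete illustration (not part of the exam), the NumPy sketch below builds an operator with a genuine generalized eigenvector for $\lambda = 2$ and an ordinary eigenvector for $\lambda = 3$, then checks that the two vectors are linearly independent, as (c) predicts.

```python
import numpy as np

# Illustrative check (assumes NumPy): T has a 2x2 Jordan block for lambda = 2
# and a 1x1 block for lambda = 3.
T = np.array([[2.0, 1.0, 0.0],
              [0.0, 2.0, 0.0],
              [0.0, 0.0, 3.0]])
I = np.eye(3)

v1 = np.array([0.0, 1.0, 0.0])   # generalized eigenvector: (T-2I)v1 != 0 but (T-2I)^2 v1 = 0
v2 = np.array([0.0, 0.0, 1.0])   # ordinary eigenvector for lambda = 3

assert np.any((T - 2 * I) @ v1 != 0)
assert np.allclose(np.linalg.matrix_power(T - 2 * I, 2) @ v1, 0)
assert np.allclose((T - 3 * I) @ v2, 0)

# Independence: the matrix with columns v1, v2 has full column rank 2.
assert np.linalg.matrix_rank(np.column_stack([v1, v2])) == 2
```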

3. Let V be an inner product space over C and let T be a linear operator on V. Do not assume that V is finite dimensional.

(a) Let W be a finite dimensional subspace of V. State the theorem on orthogonal projection to W and use it to prove that $(W^\perp)^\perp = W$.

(b) Show that there is at most one operator $T^*$ such that $(Tv \mid w) = (v \mid T^*w)$ for all v and w. If this operator exists, it is called the adjoint of T.

(c) Suppose that $T^*$ exists. Prove that $\mathrm{Ker}(T^*) = \mathrm{Im}(T)^\perp$. Prove that if $\mathrm{Im}(T)$ is finite dimensional, then $\mathrm{Im}(T) = \mathrm{Ker}(T^*)^\perp$.

(d) Is the same conclusion true assuming instead that $\mathrm{Ker}(T^*)$ is finite dimensional?

Solution.

(a) The theorem says that if W is finite dimensional, then $V = W \oplus W^\perp$. Say $w \in W$ and $v \in W^\perp$. Then $(w \mid v) = 0$, and since v was arbitrary, $w \in (W^\perp)^\perp$. On the other hand, say $v \in (W^\perp)^\perp$. Then v can be written as a sum $v = w + v'$, where $w \in W$ and $v' \in W^\perp$. Then $(v \mid v') = 0$ and $(w \mid v') = 0$, hence $(v' \mid v') = (v - w \mid v') = 0$, so $v' = 0$ and $v = w \in W$.

(b) If $T^*$ and $T'$ both satisfy this equation, let $S := T^* - T'$. Then $(v \mid Sw) = 0$ for all v and w. In particular $(Sw \mid Sw) = 0$ for all w, hence $Sw = 0$ for all w, hence $S = 0$ and $T^* = T'$.

(c) If $v \in \mathrm{Ker}(T^*)$, then $(v \mid Tw) = (T^*v \mid w)^{\phantom{*}}\!\!\!\!\;\,$... more precisely, $(Tw \mid v) = (w \mid T^*v) = 0$ for all w, so $v \in \mathrm{Im}(T)^\perp$. Conversely, if $v \in \mathrm{Im}(T)^\perp$, then $(T^*v \mid T^*v) = (v \mid TT^*v) = 0$, so $T^*v = 0$ and $v \in \mathrm{Ker}(T^*)$. We know that $\mathrm{Ker}(T^*) = \mathrm{Im}(T)^\perp$, hence by part (a), $\mathrm{Ker}(T^*)^\perp = \mathrm{Im}(T)$ if $\mathrm{Im}(T)$ is finite dimensional.
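As a numerical aside (not part of the exam), the identity $\mathrm{Ker}(T^*) = \mathrm{Im}(T)^\perp$ from (c) can be checked for a matrix. The sketch below assumes $V = \mathbb{C}^4$ with the standard inner product, so the adjoint of a matrix $A$ is its conjugate transpose; a basis of $\mathrm{Ker}(A^*)$ is read off from the SVD.

```python
import numpy as np

rng = np.random.default_rng(1)

# Assumption: V = C^4 with the standard inner product (x|y) = sum conj(x_i) y_i,
# so the adjoint of the matrix A is its conjugate transpose A*.
A = rng.standard_normal((4, 2)) + 1j * rng.standard_normal((4, 2))
A_adj = A.conj().T

# The adjoint identity (Av | w) = (v | A*w) on random vectors.
v = rng.standard_normal(2) + 1j * rng.standard_normal(2)
w = rng.standard_normal(4) + 1j * rng.standard_normal(4)
assert np.isclose(np.vdot(A @ v, w), np.vdot(v, A_adj @ w))

# A basis of Ker(A*): right singular vectors of A* with no (i.e. zero) singular value.
_, s, Vh = np.linalg.svd(A_adj)          # A* is 2x4, so Vh is 4x4 and s has 2 entries
ker_basis = Vh[len(s):].conj().T         # 4x2; its columns span Ker(A*) for generic A

assert np.allclose(A_adj @ ker_basis, 0)        # the columns really lie in Ker(A*)
assert np.allclose(ker_basis.conj().T @ A, 0)   # and they are orthogonal to Im(A) = col(A)
```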

4. Say whether each of the following is true or false. If true, give a brief explanation. If false, give a counterexample, and if possible, a corrected version of the statement, for example by adding a missing hypothesis. (You need not prove the corrected statement in this case.) Here V is a finite dimensional complex vector space and T is a linear operator on V.

(a) The characteristic polynomial $f_T(t)$ and the minimal polynomial $p_T(t)$ of T have the same roots with the same multiplicities.

(b) The operator T is diagonalizable if and only if $f_T(t)$ has distinct roots.

(c) If V is an inner product space (still finite dimensional over C) and W is T-invariant, then $W^\perp$ is also T-invariant.

(d) The real vector space $\mathbb{R}^3$ has an orthogonal basis consisting of eigenvectors for the operator corresponding to the matrix
$$\begin{pmatrix} 1 & 2 & 3 \\ 2 & 3 & 4 \\ 3 & 4 & 5 \end{pmatrix}.$$

Solution.

(a) This is not true. For example, the zero operator on a two-dimensional vector space has characteristic polynomial $t^2$ but minimal polynomial $t$. It is true that $f_T$ and $p_T$ have the same roots, but the multiplicity of a root of $p_T$ may be less than its multiplicity in $f_T$.

(b) The example above is a counterexample: the zero operator is diagonalizable, yet $f_T(t) = t^2$ has a repeated root. The correct statement is that T is diagonalizable if and only if $p_T$ has distinct roots.

(c) This is not true: for example, $T(x_1, x_2) = (x_2, 0)$ on $\mathbb{C}^2$ with $W = \mathrm{span}\{(1,0)\}$, which is T-invariant while $W^\perp = \mathrm{span}\{(0,1)\}$ is not. It is true if T is self-adjoint, or, more generally, normal.

(d) This is true, since the matrix is symmetric and the corresponding operator is therefore self-adjoint.
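To see (d) concretely (again, not part of the exam), the NumPy sketch below diagonalizes the given symmetric matrix with `numpy.linalg.eigh`, which returns an orthonormal basis of eigenvectors for a real symmetric matrix.

```python
import numpy as np

# Part (d): the matrix is symmetric, so eigh returns an orthonormal eigenvector basis.
A = np.array([[1.0, 2.0, 3.0],
              [2.0, 3.0, 4.0],
              [3.0, 4.0, 5.0]])

eigvals, Q = np.linalg.eigh(A)
assert np.allclose(Q.T @ Q, np.eye(3))   # the eigenvector basis is orthonormal
assert np.allclose(A @ Q, Q * eigvals)   # each column q_i satisfies A q_i = lambda_i q_i
print(eigvals)
```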

5. Let T be a nilpotent operator on a vector space V. Let W be a T-invariant subspace of V.

(a) Prove that if $\mathrm{Ker}(T) \cap W = 0$, then $W = 0$.

(b) Prove that if $\mathrm{Im}(T) + W = V$, then $W = V$.

Hint: If you are completely stuck, try this for small values of the nilpotency exponent r (so $T^r = 0$), e.g. $r = 1$ or $2$.

Solution.

(a) Since W is T-invariant, we can consider the restriction $T|_W$ of T to W, which is an operator on W. Since $\mathrm{Ker}(T|_W) = \mathrm{Ker}(T) \cap W = 0$, $T|_W$ is injective. On the other hand, $T^n = 0$ for some n, and hence $(T|_W)^n = 0$ also. Since $(T|_W)^n$ is again injective, $W = 0$.

(b) Say $v \in V$. We can write $v = w_1 + Tv_1$ for some $w_1 \in W$, $v_1 \in V$. Now apply the same reasoning to write $v_1 = w + Tv_2$, and hence $v = w_1 + Tw + T^2 v_2$. Since W is T-invariant, $Tw \in W$, and hence $w_2 := w_1 + Tw \in W$. Thus we have written $v = w_2 + T^2 v_2$. Continuing in this way, we find for each n some $w_n \in W$ and $v_n \in V$ with $v = w_n + T^n v_n$. Since $T^n = 0$ for some n, we see that $v \in W$.
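As a final illustrative sketch (not part of the exam), part (a) can be seen concretely for the shift operator on $\mathbb{R}^4$: the last nonzero iterate $T^k v$ of any nonzero vector v lies both in $\mathrm{Ker}(T)$ and in the T-invariant subspace spanned by the iterates of v, so that subspace meets $\mathrm{Ker}(T)$ nontrivially.

```python
import numpy as np

# T is the shift operator on R^4 (T e1 = 0, T e_i = e_{i-1}), so T^4 = 0.
T = np.diag(np.ones(3), k=1)

rng = np.random.default_rng(2)
v = rng.standard_normal(4)

# W = span{v, Tv, T^2 v, ...} is the smallest T-invariant subspace containing v.
iterates = [v]
while np.any(np.abs(T @ iterates[-1]) > 1e-12):
    iterates.append(T @ iterates[-1])

last = iterates[-1]                  # the last nonzero iterate of v
assert np.allclose(T @ last, 0)      # it lies in Ker(T) ...
assert np.any(np.abs(last) > 1e-12)  # ... and it is nonzero, so Ker(T) ∩ W != 0
```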