EXERCISES AND SOLUTIONS IN LINEAR ALGEBRA


Mahmut Kuzucuoğlu
Middle East Technical University
matmah@metu.edu.tr
Ankara, TURKEY

March 14, 2015

TABLE OF CONTENTS

CHAPTERS

0. PREFACE
1. LINEAR ...
2. MAP ...
3. ...
4. ...
5. ...

Preface

I have taught linear algebra courses in various years. These problems were given to the students from the books which I followed in those years. I have kept the solutions of the exercises which I solved for the students. These notes are a collection of those solutions.

Mahmut Kuzucuoğlu
METU, Ankara
March 14, 2015

M E T U
DEPARTMENT OF MATHEMATICS
Math 262 Quiz I (05.03.2008)
Name: Answer Key    Duration: 60 minutes
Show all your work. Unsupported answers will not be graded.

1.) Let

    A = [  2   4  -2 ]
        [  0  -4   3 ]
        [ -3  -7   4 ].

(a) Find the characteristic polynomial of A.

Solution. The characteristic polynomial of A is f(x) = det(xI - A). So,

    f(x) = | x-2   -4     2  |
           |  0    x+4   -3  |
           |  3     7    x-4 |

         = (x-2) | x+4   -3  |  +  3 | -4     2  |
                 |  7    x-4 |       | x+4   -3  |

         = (x-2)(x^2 - 16 + 21) + 3(12 - 2x - 8)
         = (x-2)(x^2 + 5) + 3(4 - 2x)
         = (x-2)(x^2 + 5 - 6)
         = (x-2)(x-1)(x+1).

(b) Find the minimal polynomial of A.

Solution. We know that the minimal polynomial divides the characteristic polynomial and that they have the same roots. Since f has three distinct roots, the minimal polynomial of A is m_A(x) = f(x) = (x-2)(x-1)(x+1).

(c) Find the characteristic vectors and a basis B such that [A]_B is diagonal.

Solution. The characteristic values of A are c_1 = 2, c_2 = 1, c_3 = -1.

    A - 2I = [  0   4  -2 ]
             [  0  -6   3 ]
             [ -3  -7   2 ]

Row reduction gives the equations -3x - 7y + 2z = 0 and 2y - z = 0, so z = 2y and x = -y. Thus, α_1 = (-1, 1, 2) is a characteristic vector associated with the characteristic value c_1 = 2.

    A - I = [  1   4  -2 ]
            [  0  -5   3 ]
            [ -3  -7   3 ]

Row reduction gives x + 4y - 2z = 0 and -5y + 3z = 0, so y = 3k, z = 5k and x = -2k. Thus, α_2 = (-2, 3, 5) is a characteristic vector associated with the characteristic value c_2 = 1.

    A + I = [  3   4  -2 ]
            [  0  -3   3 ]
            [ -3  -7   5 ]

Row reduction gives 3x + 4y - 2z = 0 and y - z = 0, so y = 3t, z = 3t and x = -2t.

Thus, α_3 = (-2, 3, 3) is a characteristic vector associated with the characteristic value c_3 = -1. Now, B = {α_1, α_2, α_3} is a basis and

    [A]_B = [ 2  0   0 ]
            [ 0  1   0 ]
            [ 0  0  -1 ]

is a diagonal matrix.

(d) Find the A-conductor of the vector α = (1, 1, 1) into the invariant subspace spanned by (-1, 1, 2).

Solution. Set W = <(-1, 1, 2)> and denote the A-conductor of α into W by g(x). Since m_A(A) = 0, we have m_A(A)α ∈ W. Thus, g(x) divides m_A(x). Hence, the possibilities for g(x) are x - 2, x - 1, x + 1, (x-2)(x-1), (x-2)(x+1), (x-1)(x+1). We will try these polynomials. (Actually, the answer could be given directly.) Now,

    (A - 2I)α = (2, -3, -8) ∉ W,            so g(x) ≠ x - 2,
    (A - I)α  = (3, -2, -7) ∉ W,            so g(x) ≠ x - 1,
    (A + I)α  = (5, 0, -5) ∉ W,             so g(x) ≠ x + 1,
    (A - 2I)(A - I)α = (6, -9, -9) ∉ W,     so g(x) ≠ (x - 2)(x - 1),
    (A - 2I)(A + I)α = (10, -15, -25) ∉ W,  so g(x) ≠ (x - 2)(x + 1),
    (A - I)(A + I)α = (15, -15, -30) = -15 α_1 ∈ W,  so g(x) = (x - 1)(x + 1) = x^2 - 1.

2.) Find a 3 × 3 matrix whose minimal polynomial is x^2.

Solution. For the matrix

    A = [ 0  0  1 ]
        [ 0  0  0 ]
        [ 0  0  0 ]

we have A ≠ 0 and A^2 = 0. Thus, A is a 3 × 3 matrix whose minimal polynomial is x^2.

3.) Prove that similar matrices have the same minimal polynomial.

Solution. Let A and B be similar matrices, i.e., B = P^{-1}AP for some invertible matrix P. For any k > 0 we have B^k = (P^{-1}AP)^k = P^{-1}A^k P, which implies that f(B) = P^{-1}f(A)P for any polynomial f(x). Let f_A and f_B be the minimal polynomials of A and B, respectively. Then f_A(B) = P^{-1}f_A(A)P = P^{-1}OP = O implies that f_B divides f_A. On the other hand, O = f_B(B) = P^{-1}f_B(A)P gives us f_B(A) = O. Hence, f_A divides f_B. Therefore, we have f_A = f_B.
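The computations in Problem 1 can be checked mechanically. The following SymPy sketch is an added illustration, not part of the original quiz: it recomputes the characteristic polynomial, confirms that (x-2)(x-1)(x+1) annihilates A while a proper divisor does not, and verifies the three characteristic vectors found above.

```python
import sympy as sp

x = sp.symbols('x')

# The matrix from Quiz I, Problem 1.
A = sp.Matrix([[ 2,  4, -2],
               [ 0, -4,  3],
               [-3, -7,  4]])
I3 = sp.eye(3)

# Characteristic polynomial, factored: (x - 2)*(x - 1)*(x + 1).
print(sp.factor(A.charpoly(x).as_expr()))

# (x - 2)(x - 1)(x + 1) annihilates A ...
print((A - 2*I3) * (A - I3) * (A + I3) == sp.zeros(3, 3))   # True
# ... while the proper divisor (x - 2)(x - 1) does not.
print((A - 2*I3) * (A - I3) == sp.zeros(3, 3))              # False

# Characteristic values and vectors from part (c).
for value, vector in [(2, (-1, 1, 2)), (1, (-2, 3, 5)), (-1, (-2, 3, 3))]:
    v = sp.Matrix(3, 1, vector)
    print(A * v == value * v)   # True for each pair
```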

1. Math 262 Exercises and Solutions

(1) Let A be a 3 × 3 matrix with real entries. Prove that if A is not similar over R to a triangular matrix, then A is similar over C to a diagonal matrix.

Proof. Since A is a 3 × 3 matrix with real entries, the characteristic polynomial f(x) of A is a polynomial of degree 3 with real coefficients. We know that every polynomial of degree 3 with real coefficients has a real root, say c_1. On the other hand, since A is not similar over R to a triangular matrix, the minimal polynomial of A is not a product of polynomials of degree one. So one of the irreducible factors, h, of the minimal polynomial of A has degree 2. Then h has two non-real complex roots, one of which is the conjugate of the other. Thus, the characteristic polynomial has one real root and two non-real roots: c_1, λ and its conjugate λ̄. The minimal polynomial over the complex numbers is (x − c_1)(x − λ)(x − λ̄); since these three roots are distinct, the minimal polynomial is a product of distinct linear factors, which implies that A is diagonalizable over the complex numbers.

(2) Let T be a linear operator on a finite dimensional vector space over an algebraically closed field F. Let f be a polynomial over F. Prove that c is a characteristic value of f(T) if and only if c = f(t) for some characteristic value t of T.

Proof. Let t be a characteristic value of T and β be a nonzero characteristic vector associated with the characteristic value t. Then Tβ = tβ, T^2β = T(Tβ) = T(tβ) = tTβ = t^2β, and inductively T^kβ = t^kβ for any k ≥ 1. Thus, for any polynomial f(x) we have f(T)β = f(t)β, which means, since β ≠ 0, that f(t) is a characteristic value of the linear operator f(T).

Conversely, assume that c is a characteristic value of f(T). Since F is algebraically closed, the minimal polynomial of T is a product of linear polynomials, that is, T is similar to a triangular operator. So there is a basis B in which [T]_B is a triangular matrix; then [f(T)]_B is

also triangular, and on the diagonal of [f(T)]_B we have the entries f(c_i), where the c_i are the characteristic values of T. Since the characteristic values of a triangular matrix are its diagonal entries, c = f(c_i) for some i, that is, c = f(t) for some characteristic value t of T.

(3) Let c be a characteristic value of T and let W be the space of characteristic vectors associated with the characteristic value c. What is the restriction operator T|_W?

Solution. Every vector v ∈ W is a characteristic vector. Hence, Tv = cv for all v ∈ W. Therefore, T|_W = cI.

(4) Every matrix A satisfying A^2 = A is similar to a diagonal matrix.

Solution. A satisfies the polynomial x^2 − x. Thus, the minimal polynomial m_A(x) of A divides x^2 − x, that is, m_A(x) = x, m_A(x) = x − 1, or m_A(x) = x(x − 1). If m_A(x) = x, then A = 0. If m_A(x) = x − 1, then A = I. If m_A(x) = x(x − 1), then the minimal polynomial of A is a product of distinct polynomials of degree one. Thus, by a theorem, the matrix A is similar to a diagonal matrix with diagonal entries consisting of the characteristic values 0 and 1.

(5) Let T be a linear operator on V. If every subspace of V is invariant under T, then T is a scalar multiple of the identity operator.

Solution. If dim V = 1, then for any 0 ≠ v ∈ V we have Tv = cv, since <v> = V is invariant under T. Hence, T = cI. Assume that dim V > 1 and let B = {v_1, v_2, ..., v_n} be a basis for V. Since W_1 = <v_1> is invariant under T, we have Tv_1 = c_1 v_1. Similarly, since W_2 = <v_2> is invariant under T, we have Tv_2 = c_2 v_2. Now, W_3 = <v_1 + v_2> is also invariant under T. Hence, T(v_1 + v_2) = λ(v_1 + v_2), or c_1 v_1 + c_2 v_2 = λ(v_1 + v_2), which gives us (c_1 − λ)v_1 + (c_2 − λ)v_2 = 0. However, v_1 and v_2 are linearly independent, and hence we must have c_1 = c_2 = λ. Similarly, one can continue with the subspace <v_1 + v_2 + v_3>

and observe that T(v_3) = λv_3. So for any v_i ∈ B we have Tv_i = λv_i. Thus, T = λI.

(6) Let V be the vector space of n × n matrices over F. Let A be a fixed n × n matrix. Let T be the linear operator on V defined by T(B) = AB. Show that the minimal polynomial of T is the minimal polynomial of A.

Solution. Let m_A(x) = x^n + a_{n-1}x^{n-1} + ... + a_1 x + a_0 be the minimal polynomial of A, so that m_A(A) = 0. It is easy to see that T^k(B) = A^k B for any k ≥ 1. Then, for any B ∈ V we have

    m_A(T)B = (T^n + a_{n-1}T^{n-1} + ... + a_1 T + a_0 I)B
            = T^n(B) + a_{n-1}T^{n-1}(B) + ... + a_1 T(B) + a_0 B
            = A^n B + a_{n-1}A^{n-1}B + ... + a_1 AB + a_0 B
            = (A^n + a_{n-1}A^{n-1} + ... + a_1 A + a_0 I)B
            = m_A(A)B = 0.

Thus, we obtain m_A(T) = 0, which means that m_T(x) divides m_A(x). Now, let m_T(x) = x^m + c_{m-1}x^{m-1} + ... + c_1 x + c_0 be the minimal polynomial of T, so that m_T(T) = 0. Then, for any B ∈ V we have

    m_T(A)B = (A^m + c_{m-1}A^{m-1} + ... + c_1 A + c_0 I)B
            = A^m B + c_{m-1}A^{m-1}B + ... + c_1 AB + c_0 B
            = T^m(B) + c_{m-1}T^{m-1}(B) + ... + c_1 T(B) + c_0 B
            = (T^m + c_{m-1}T^{m-1} + ... + c_1 T + c_0 I)B
            = m_T(T)B = 0,

which leads to m_T(A) = 0 (take, for example, B = I), meaning that m_A(x) divides m_T(x). Since monic polynomials dividing each other are equal, we have m_T(x) = m_A(x).
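To make Exercise 6 concrete, here is a small illustrative sketch, not part of the original notes: for a sample 2 × 2 matrix A (my own choice) with minimal polynomial (x − 2)^2, we build the 4 × 4 matrix of the operator T(B) = AB with respect to the basis E_11, E_12, E_21, E_22 and confirm that (x − 2)^2 annihilates it while x − 2 does not, so T and A have the same minimal polynomial.

```python
import sympy as sp

# Illustrative choice: A has minimal polynomial (x - 2)^2.
A = sp.Matrix([[2, 1],
               [0, 2]])
n = A.rows

# Standard basis E_11, E_12, E_21, E_22 of the space of n x n matrices,
# listed in row-major order.
basis = []
for i in range(n):
    for j in range(n):
        E_ij = sp.zeros(n, n)
        E_ij[i, j] = 1
        basis.append(E_ij)

# Matrix of T(B) = A*B in that basis: column k holds the coordinates of
# T(E_k), i.e. the entries of A*E_k read off in row-major order.
T = sp.Matrix.hstack(*[(A * E).reshape(n * n, 1) for E in basis])

I_small, I_big = sp.eye(n), sp.eye(n * n)
print((A - 2*I_small)**2 == sp.zeros(n, n))          # True:  (x-2)^2 annihilates A
print((T - 2*I_big)**2 == sp.zeros(n*n, n*n))        # True:  (x-2)^2 annihilates T
print((T - 2*I_big) == sp.zeros(n*n, n*n))           # False: x-2 does not
```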

(7) If E is a projection and f is a polynomial, then show that f(E) = aI + bE. What are a and b in terms of the coefficients of f?

Solution. Let f(x) = c_0 + c_1 x + ... + c_n x^n. Then f(E) = c_0 I + c_1 E + ... + c_n E^n. Since E is a projection (E^2 = E), we have E^k = E for any k ≥ 1. Then

    f(E) = c_0 I + c_1 E + ... + c_n E^n = c_0 I + c_1 E + ... + c_n E = c_0 I + (c_1 + ... + c_n)E.

Thus, a is the constant term of f and b is the sum of all the other coefficients.

(8) Let V be a finite dimensional vector space and let W_1 be any subspace of V. Prove that there is a subspace W_2 of V such that V = W_1 ⊕ W_2.

Proof. Let B_{W_1} = {β_1, ..., β_k} be a basis for W_1. We may extend B_{W_1} to a basis B_V of V, say B_V = {β_1, ..., β_k, β_{k+1}, ..., β_n}. Let W_2 be the subspace spanned by β_{k+1}, ..., β_n. Then, as they are linearly independent in V, B_{W_2} = {β_{k+1}, ..., β_n} is a basis for W_2. Clearly W_1 + W_2 = V, as W_1 + W_2 contains a basis of V and so spans V. Let β ∈ W_1 ∩ W_2. Then β ∈ W_1 implies that β = c_1 β_1 + ... + c_k β_k, and β ∈ W_2 implies that β = c_{k+1} β_{k+1} + ... + c_n β_n. The last two equalities give us c_1 β_1 + ... + c_k β_k − c_{k+1} β_{k+1} − ... − c_n β_n = 0, but since the β_i are linearly independent, we obtain c_i = 0 for all i = 1, ..., n, which means that β = 0. That is, W_1 ∩ W_2 = {0}, and hence V = W_1 ⊕ W_2.

(9) Let V be a real vector space and E an idempotent linear operator on V, that is, a projection. Prove that I + E is invertible. Find (I + E)^{-1}.

Proof. Since E is an idempotent linear operator, it is diagonalizable by Question 4. So there exists a basis of V consisting of characteristic vectors of E corresponding to the characteristic values 0 and 1. That is, there exists a basis

B = {β_1, ..., β_n} such that Eβ_i = β_i for i = 1, ..., k, and Eβ_i = 0 for i = k+1, ..., n. Then (I + E)β_i = 2β_i for i = 1, ..., k and (I + E)β_i = β_i for i = k+1, ..., n, that is,

    [I + E]_B = [ 2I_1   0  ]
                [  0    I_2 ]

where I_1 stands for the k × k identity matrix, I_2 is the (n−k) × (n−k) identity matrix, and each 0 represents the zero matrix of the appropriate size. It is now easy to see that I + E is invertible, since det(I + E) = 2^k ≠ 0. To find the inverse of I + E, we note that

    ([I + E]_B)^{-1} = [ (1/2)I_1   0  ]  =  [ I_1   0  ]  −  [ (1/2)I_1   0 ]  =  I − (1/2)[E]_B.
                       [    0      I_2 ]     [  0   I_2 ]     [    0       0 ]

Therefore, (I + E)^{-1} = I − (1/2)E. (You may verify that this really is the inverse, by showing that (I + E)(I − (1/2)E) = (I − (1/2)E)(I + E) = I.)

(10) Let T be a linear operator on V which commutes with every projection operator on V. What can you say about T?

Solution. Let B be a basis for V consisting of vectors β_i, i ∈ I, where I is some index set. For each i we can write V as a direct sum V = W_i ⊕ U_i, where W_i = <β_i>. Then there exists a projection E_i of V onto the subspace W_i for each i ∈ I. Note that E_i v ∈ W_i for all v ∈ V, and E_i β_i = β_i. Now, by assumption, the linear operator T commutes with E_i for all i ∈ I, that is, TE_i = E_iT. Then Tβ_i = T(E_i β_i) = E_i(Tβ_i) ∈ W_i, which implies that Tβ_i = c_i β_i for some constant c_i ∈ F. Thus, β_i is a characteristic vector of T. Hence, V has a basis consisting of characteristic vectors of T. It follows that T is a diagonalizable linear operator on V.

(11) Let V be the vector space of continuous real valued functions on the interval [−1, 1] of the real line. Let W_e be the space

of even functions, f(−x) = f(x), and W_o be the space of odd functions, f(−x) = −f(x).
a) Show that V = W_e ⊕ W_o.
b) If T is the indefinite integral operator (Tf)(x) = ∫_0^x f(t) dt, are W_e and W_o invariant under T?

Solution. a) Let f ∈ V. Then we may write

    f(x) = (f(x) + f(−x))/2 + (f(x) − f(−x))/2.

Observe that f_e(x) = (f(x) + f(−x))/2 is a continuous even function and f_o(x) = (f(x) − f(−x))/2 is a continuous odd function. Hence, f = f_e + f_o, that is, V = W_e + W_o. To show that V = W_e ⊕ W_o, we need to show that W_e ∩ W_o = {0}. To see this, let g ∈ W_e ∩ W_o. Then g ∈ W_e implies that g(−x) = g(x), and g ∈ W_o implies that g(−x) = −g(x). Thus, we have g(x) = −g(x), i.e. 2g(x) = 0, for all x ∈ [−1, 1], which means that g = 0.

b) For f(x) = x ∈ W_o, we have (Tf)(x) = x^2/2 ∉ W_o, and for g(x) = x^2 ∈ W_e, we have (Tg)(x) = x^3/3 ∉ W_e. Thus, neither W_e nor W_o is invariant under T.

(12) Let V be a finite dimensional vector space over the field F, and let T be a linear operator on V such that rank(T) = 1. Prove that either T is diagonalizable or T is nilpotent, but not both.

Proof. Since rank(T) = dim(Im(T)) = 1, we have dim(Ker(T)) = n − 1. Let 0 ≠ β ∈ Im(T), so Im(T) = <β>. Since β ∈ Im(T), there exists a vector α_0 ∈ V such that Tα_0 = β. Let {α_1, α_2, ..., α_{n−1}} be a basis for Ker(T). Then B = {α_0, α_1, α_2, ..., α_{n−1}} is a basis for V. We have Tα_i = 0 for all i = 1, 2, ..., n − 1.

If Tα_0 ∈ Ker(T), then Tα_0 = c_1 α_1 + ... + c_{n−1} α_{n−1} and

    [T]_B = [ 0        0  ...  0 ]
            [ c_1      0  ...  0 ]
            [ c_2      0  ...  0 ]
            [ ...                ]
            [ c_{n-1}  0  ...  0 ]

and it is easily seen that T^2 = 0, meaning that T is nilpotent. Note that at least one of the c_i is nonzero, since otherwise α_0 would be in Ker(T), contradicting Tα_0 = β ≠ 0.

If Tα_0 ∉ Ker(T), then Tβ ∈ Im(T) and Tβ = c_0 β with c_0 ≠ 0. In this case we construct a new basis B' = {β, α_1, α_2, ..., α_{n−1}} and

    [T]_{B'} = [ c_0  0  ...  0 ]
               [  0   0  ...  0 ]
               [ ...             ]
               [  0   0  ...  0 ]

which means that T is diagonalizable.

(13) Let T be a linear operator on the finite dimensional vector space V. Suppose T has a cyclic vector. Prove that if U is any linear operator which commutes with T, then U is a polynomial in T.

Proof. Let B = {α, Tα, ..., T^{n−1}α} be a basis for V containing the cyclic vector α, and let m(x) = x^n + a_{n−1}x^{n−1} + ... + a_1 x + a_0 be the minimal polynomial of T. Since Uα is in V, it can be written as a linear combination of the basis vectors: Uα = b_0 α + b_1 Tα + ... + b_{n−1} T^{n−1}α, where b_0, b_1, ..., b_{n−1} are elements of the field F. That is, (b_0 I + b_1 T + ... + b_{n−1} T^{n−1} − U)α = 0. Now, since U and T commute, we have

    UT(α) = TU(α) = T(b_0 α + b_1 Tα + ... + b_{n−1} T^{n−1}α)
          = b_0 Tα + b_1 T^2α + ... + b_{n−1} T^nα
          = (b_0 I + b_1 T + ... + b_{n−1} T^{n−1})Tα

which means that (b_0 I + b_1 T + ... + b_{n−1} T^{n−1} − U)Tα = 0. Similarly, we can show that (b_0 I + b_1 T + ... + b_{n−1} T^{n−1} − U)T^iα = 0 for all i = 2, 3, ..., n − 1. Since the transformation b_0 I + b_1 T + ... + b_{n−1} T^{n−1} − U maps each basis vector to the zero vector, it is identically equal to zero on the whole space. Thus, we obtain U = b_0 I + b_1 T + ... + b_{n−1} T^{n−1}.

(14) Give an example of two 4 × 4 nilpotent matrices which have the same minimal polynomial but which are not similar.

Solution. Let

    A = [ 0  0  0  0 ]          B = [ 0  0  0  0 ]
        [ 1  0  0  0 ]   and        [ 1  0  0  0 ]
        [ 0  0  0  0 ]              [ 0  0  0  0 ]
        [ 0  0  0  0 ]              [ 0  0  1  0 ].

It is easy to see that m_A(x) = m_B(x) = x^2, but they are not similar since A has 3 linearly independent characteristic vectors corresponding to the characteristic value zero, while B has only 2.

(15) Show that if N is a nilpotent linear operator on an n-dimensional vector space V, then the characteristic polynomial of N is x^n.

Solution. Recall that N is nilpotent if N^k = 0 for some positive integer k. Since N is a nilpotent linear operator on V, the minimal polynomial of N is of the form x^m for some m ≤ n. Then all characteristic values of N are zero. Since the minimal polynomial is a product of linear polynomials, N is a triangulable operator. It follows that there exists a basis B of

V such that [N]_B is upper triangular with all diagonal entries equal to 0 (the diagonal entries are the characteristic values of N, which are all zero). Since similar matrices have the same characteristic polynomial, it follows that the characteristic polynomial of N is x^n, where n = dim V.

(16) Let T be the linear operator on R^3 which is represented in the standard ordered basis by the matrix

    [ 2  0   0 ]
    [ 0  2   0 ]
    [ 0  0  -1 ].

Prove that T has no cyclic vector. What is the T-cyclic subspace generated by the vector β = (1, 1, 3)?

Solution. Assume that T has a cyclic vector α = (a_1, a_2, a_3). Then B = {α, Tα, T^2α} would be a basis for R^3. That is, the vectors α = (a_1, a_2, a_3), Tα = (2a_1, 2a_2, −a_3), T^2α = (4a_1, 4a_2, a_3) must be linearly independent, or the matrix

    [  a_1    a_2    a_3 ]
    [ 2a_1   2a_2   −a_3 ]
    [ 4a_1   4a_2    a_3 ]

must be invertible. Applying elementary row operations (−2R_1 + R_2, −4R_1 + R_3, and then −R_2 + R_3), we obtain

    [  a_1    a_2    a_3  ]        [ a_1   a_2    a_3  ]        [ a_1   a_2    a_3  ]
    [ 2a_1   2a_2   −a_3  ]   →    [  0     0   −3a_3  ]   →    [  0     0   −3a_3  ]
    [ 4a_1   4a_2    a_3  ]        [  0     0   −3a_3  ]        [  0     0     0    ]

which is not invertible for any choice of α. Hence, T has no cyclic vector. To find the cyclic subspace generated by β, it is enough to check whether β and Tβ are linearly independent, since we have already shown that the set {α, Tα, T^2α} cannot be linearly independent for any α ∈ R^3. Clearly, β = (1, 1, 3) and Tβ = (2, 2, −3) are

linearly independent since, otherwise, one of them would be a scalar multiple of the other, which is not the case here. Thus, the cyclic subspace generated by β is

    Z(β; T) = <(1, 1, 3), (2, 2, −3)> = {λ(1, 1, 3) + µ(2, 2, −3) : λ, µ ∈ R}.

(17) Find the minimal polynomial and the rational form of the matrix

    T = [ c  0  1 ]
        [ 0  c  1 ]
        [ 1  1  c ].

Solution. The characteristic polynomial of T is

    f_T(x) = det(xI − T) = | x−c    0    −1  |
                           |  0    x−c   −1  |
                           | −1    −1   x−c  |

           = (x−c) | x−c   −1  |  −  |  0    x−c |
                   | −1    x−c |     | −1    −1  |

           = (x−c)((x−c)^2 − 1) − (x−c)
           = (x−c)((x−c)^2 − 2)
           = (x−c)(x−c−√2)(x−c+√2).

Since the characteristic polynomial and the minimal polynomial have the same roots, the minimal polynomial divides the characteristic polynomial, and these three roots are distinct, we have

    m_T(x) = f_T(x) = (x−c)((x−c)^2 − 2) = (x−c)^3 − 2(x−c) = x^3 + (−3c)x^2 + (3c^2 − 2)x + (−c^3 + 2c).

Thus, the rational form of T is

    R = [ 0  0   c^3 − 2c  ]
        [ 1  0  −3c^2 + 2  ]
        [ 0  1      3c     ].
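The result of Exercise 17 can be double-checked symbolically. The following SymPy sketch is an added verification, not part of the original notes: it recomputes the characteristic polynomial of T with c kept symbolic, confirms that the companion matrix R above has the same characteristic polynomial, and checks that T satisfies f_T, consistent with m_T = f_T.

```python
import sympy as sp

x, c = sp.symbols('x c')

T = sp.Matrix([[c, 0, 1],
               [0, c, 1],
               [1, 1, c]])

# Characteristic polynomial, factored over Q(c): (x - c)*(x**2 - 2*c*x + c**2 - 2).
f = T.charpoly(x).as_expr()
print(sp.factor(f))

# Companion (rational form) matrix of f = x^3 - 3c x^2 + (3c^2 - 2)x - (c^3 - 2c).
R = sp.Matrix([[0, 0, c**3 - 2*c],
               [1, 0, -3*c**2 + 2],
               [0, 1, 3*c]])

# R has the same characteristic polynomial as T ...
print(sp.expand(R.charpoly(x).as_expr() - f) == 0)    # True

# ... and T satisfies f (Cayley-Hamilton), as expected since m_T = f_T.
coeffs = sp.Poly(f, x).all_coeffs()
fT = sum((co * T**(len(coeffs) - 1 - k) for k, co in enumerate(coeffs)), sp.zeros(3, 3))
print(sp.simplify(fT) == sp.zeros(3, 3))               # True
```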