GENERALIZED EIGENVECTORS, MINIMAL POLYNOMIALS AND THE THEOREM OF CAYLEY-HAMILTON
FRANZ LUEF

Abstract. Our exposition is inspired by S. Axler's approach to linear algebra and largely follows his article "Down with Determinants"; see also the book Linear Algebra Done Right by S. Axler [1]. These are the lecture notes for the course "Lineare Algebra 2" of Prof. H.G. Feichtinger.

Before we introduce generalized eigenvectors of a linear transformation we recall some basic facts about eigenvalues and eigenvectors. Let V be an n-dimensional complex vector space. Recall that a complex number λ is called an eigenvalue of a linear operator T on V if T - λI is not injective, i.e. ker(T - λI) ≠ {0}. The main result about eigenvalues is that every linear operator on a finite-dimensional complex vector space has an eigenvalue! Furthermore we call a vector v ∈ V an eigenvector of T if Tv = λv for some eigenvalue λ. The central result on eigenvectors is that non-zero eigenvectors corresponding to distinct eigenvalues of a linear transformation on V are linearly independent. Consequently the number of distinct eigenvalues of T cannot exceed the dimension of V.

Unfortunately the eigenvectors of T need not span V. For example the linear transformation on C^4 whose matrix is

    T = [ 0 1 0 0 ]
        [ 0 0 1 0 ]
        [ 0 0 0 1 ]
        [ 0 0 0 0 ]

has only the eigenvalue 0, and its eigenvectors form a one-dimensional subspace of C^4. Observe that T, T^2, T^3 ≠ 0 but T^4 = 0. More generally a linear operator T such that T, T^2, ..., T^{p-1} ≠ 0 and T^p = 0 is called nilpotent of index p.

More generally, let T be a linear operator on V. The space of all linear operators on V is finite-dimensional (actually of dimension n^2). Hence there exists a smallest positive integer k such that I, T, T^2, ..., T^k are linearly dependent. In other words there exist unique complex numbers a_0, a_1, ..., a_{k-1} such that

    a_0 I + a_1 T + ... + a_{k-1} T^{k-1} + T^k = 0.

The polynomial m(x) = a_0 + a_1 x + ... + a_{k-1} x^{k-1} + x^k is called the minimal polynomial of T.
It is the monic polynomial of smallest degree such that m(T) = 0. A polynomial q such that q(T) = 0 is a so-called annihilating polynomial.
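The nilpotency claims above are easy to verify by direct computation. The following minimal pure-Python sketch (the helper functions are ours, not part of the notes) checks that for the 4x4 shift matrix S used as an example, the powers S, S^2, S^3 are nonzero while S^4 = 0, so that m(x) = x^4 annihilates S and S is nilpotent of index 4.

```python
# Pure-Python check of nilpotency for the 4x4 shift matrix S:
# S, S^2, S^3 are nonzero, while S^4 = 0, so S is nilpotent of
# index 4 and x^4 is an annihilating polynomial for S.

def matmul(A, B):
    """Multiply two square matrices given as lists of rows."""
    n = len(A)
    return [[sum(A[i][k] * B[k][j] for k in range(n)) for j in range(n)]
            for i in range(n)]

def is_zero(A):
    """Return True if every entry of A is zero."""
    return all(x == 0 for row in A for x in row)

S = [[0, 1, 0, 0],
     [0, 0, 1, 0],
     [0, 0, 0, 1],
     [0, 0, 0, 0]]

P = S
for p in range(1, 5):
    print(p, is_zero(P))   # prints: 1 False / 2 False / 3 False / 4 True
    P = matmul(P, S)
```

Since no monic polynomial of degree less than 4 can annihilate S (its powers up to S^3 are nonzero and, in fact, linearly independent), x^4 is the minimal polynomial of this operator.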
The Fundamental Theorem of Algebra yields that m(x) = (x - λ_1)^{α_1} (x - λ_2)^{α_2} ··· (x - λ_m)^{α_m}, where α_j is the multiplicity of the eigenvalue λ_j of T. Since

    m(T) = (T - λ_1 I)^{α_1} (T - λ_2 I)^{α_2} ··· (T - λ_m I)^{α_m} = 0,

it follows that for some j the operator (T - λ_j I)^{α_j} is not injective, i.e. ker(T - λ_j I)^{α_j} ≠ {0}. What is the structure of the subspace ker(T - λ_j I)^{α_j}? First of all we call a vector v ∈ V a generalized eigenvector of T if (T - λI)^k v = 0 for some eigenvalue λ of T and some positive integer k. The spaces ker(T - λI)^k thus consist of generalized eigenvectors of T corresponding to the eigenvalue λ.

Lemma 0.1. The set of generalized eigenvectors of T on an n-dimensional complex vector space corresponding to an eigenvalue λ equals ker(T - λI)^n.

Proof. Obviously, every element of ker(T - λI)^n is a generalized eigenvector of T corresponding to λ. Let us show the other inclusion. If v ≠ 0 is a generalized eigenvector of T corresponding to λ, then we need to prove that (T - λI)^n v = 0. By assumption there is a smallest positive integer k such that (T - λI)^k v = 0. We are done if we show that k ≤ n. To this end we prove that v, (T - λI)v, ..., (T - λI)^{k-1} v are linearly independent vectors. Since then we will have k linearly independent elements in an n-dimensional vector space, which implies that k ≤ n. Let a_0, a_1, ..., a_{k-1} be complex numbers such that

    a_0 v + a_1 (T - λI)v + ... + a_{k-1} (T - λI)^{k-1} v = 0.

Apply (T - λI)^{k-1} to both sides of the equation above, getting a_0 (T - λI)^{k-1} v = 0, which yields a_0 = 0. Now apply (T - λI)^{k-2} to both sides of the equation, getting a_1 (T - λI)^{k-1} v = 0, which implies a_1 = 0. Continuing in this fashion, we see that a_j = 0 for each j, as desired.

Following the basic pattern of the proof that non-zero eigenvectors corresponding to distinct eigenvalues of T are linearly independent, we obtain:

Proposition 0.2. Non-zero generalized eigenvectors corresponding to distinct eigenvalues of T are linearly independent.

Proof.
Suppose that v_1, ..., v_m are non-zero generalized eigenvectors of T corresponding to distinct eigenvalues λ_1, ..., λ_m. We assume that there are complex numbers a_1, ..., a_m such that

    a_1 v_1 + a_2 v_2 + ... + a_m v_m = 0.

Then we have to show that a_1 = a_2 = ... = a_m = 0. Let k be the smallest positive integer such that (T - λ_1 I)^k v_1 = 0. Then apply the linear operator

    (T - λ_1 I)^{k-1} (T - λ_2 I)^n ··· (T - λ_m I)^n
to both sides of the previous equation, getting

    a_1 (T - λ_1 I)^{k-1} (T - λ_2 I)^n ··· (T - λ_m I)^n v_1 = 0.

We rewrite (T - λ_2 I)^n ··· (T - λ_m I)^n as

    ((T - λ_1 I) + (λ_1 - λ_2)I)^n ··· ((T - λ_1 I) + (λ_1 - λ_m)I)^n.

An application of the binomial theorem gives a sum of terms which, when combined with (T - λ_1 I)^{k-1} on the left and applied to v_1, give 0, except for the single term without any factor (T - λ_1 I). Hence

    a_1 (λ_1 - λ_2)^n ··· (λ_1 - λ_m)^n (T - λ_1 I)^{k-1} v_1 = 0.

Since the eigenvalues are distinct and (T - λ_1 I)^{k-1} v_1 ≠ 0 by the choice of k, this forces a_1 = 0. Continuing in a similar fashion, we get a_j = 0 for each j, as desired.

The central fact about generalized eigenvectors is that they span V.

Theorem 0.3. Let V be an n-dimensional complex vector space, let T be a linear operator on V, and let λ be an eigenvalue of T. Then

    V = ker(T - λI)^n ⊕ im(T - λI)^n.

Moreover, V is spanned by the generalized eigenvectors of T.

Proof. The proof is by induction on n, the dimension of V. The result holds for n = 1. Suppose that n > 1 and that the result holds for all vector spaces of dimension less than n. Let λ be any eigenvalue of T. We want to show that

    V = ker(T - λI)^n ⊕ im(T - λI)^n =: V_1 ⊕ V_2.

Let v ∈ V_1 ∩ V_2. Then (T - λI)^n v = 0 and there exists a u ∈ V such that (T - λI)^n u = v. Applying (T - λI)^n to both sides of the last equation, we have (T - λI)^{2n} u = 0. Consequently, (T - λI)^n u = 0 by Lemma 0.1, i.e. v = 0. Thus V_1 ∩ V_2 = {0}. Since V_1 and V_2 are the kernel and the image of a linear operator on V, we have dim V = dim V_1 + dim V_2. Note that V_1 ≠ {0}, because λ is an eigenvalue of T; thus dim V_2 < n. Furthermore T maps V_2 into V_2, since T commutes with (T - λI)^n. By our induction hypothesis, V_2 is spanned by the generalized eigenvectors of T restricted to V_2, each of which is also a generalized eigenvector of T. Every element of V_1 is a generalized eigenvector of T, which gives the desired result.

Corollary 0.4. If 0 is the only eigenvalue of a linear operator T on V, then T is nilpotent.

Proof. By assumption 0 is the only eigenvalue of T. Then by Theorem 0.3 every vector v in V is a generalized eigenvector of T corresponding to the eigenvalue λ = 0. Consequently T^p = 0 for some p.
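The decomposition in Theorem 0.3 can be seen concretely in a small example. The sketch below uses the (assumed, illustrative) operator A = [[2,1,0],[0,2,0],[0,0,5]] on C^3 with eigenvalue λ = 2: the kernel of (A - 2I)^3 is span(e1, e2), the generalized eigenspace, while the image is span(e3), a complementary invariant subspace on which A - 2I is invertible.

```python
# A concrete instance of V = ker(A - 2I)^3 (+) im(A - 2I)^3 for the
# 3x3 operator A = [[2,1,0],[0,2,0],[0,0,5]] and the eigenvalue 2.

def matmul(A, B):
    """Multiply two square matrices given as lists of rows."""
    n = len(A)
    return [[sum(A[i][k] * B[k][j] for k in range(n)) for j in range(n)]
            for i in range(n)]

def matvec(A, v):
    """Apply the matrix A to the vector v."""
    return [sum(a * x for a, x in zip(row, v)) for row in A]

M = [[0, 1, 0],    # M = A - 2I
     [0, 0, 0],
     [0, 0, 3]]
M3 = matmul(matmul(M, M), M)   # (A - 2I)^3

# e1 and e2 lie in the kernel: they are generalized eigenvectors for 2.
print(matvec(M3, [1, 0, 0]), matvec(M3, [0, 1, 0]))  # [0, 0, 0] [0, 0, 0]
# e3 spans the image, on which A - 2I acts invertibly (it scales by 3).
print(matvec(M3, [0, 0, 1]))                         # [0, 0, 27]
```

Note that e2 is a generalized eigenvector that is not an eigenvector: (A - 2I)e2 = e1 ≠ 0, but (A - 2I)^2 e2 = 0, exactly the chain phenomenon used in the proof of Lemma 0.1.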
As a consequence we get the following structure theorem for linear transformations.

Theorem 0.5. Let λ_1, ..., λ_m be the distinct eigenvalues of T, with E_1, ..., E_m denoting the corresponding spaces of generalized eigenvectors. Then

(1) V = E_1 ⊕ E_2 ⊕ ... ⊕ E_m;
(2) T maps each E_j into itself;
(3) each (T - λ_j I) restricted to E_j is nilpotent;
(4) each T restricted to E_j has only one eigenvalue, namely λ_j.
Proof. (1) follows from the linear independence of generalized eigenvectors corresponding to distinct eigenvalues (Proposition 0.2) and the fact that the generalized eigenvectors of T span V (Theorem 0.3).

(2) Suppose v ∈ E_j. Then (T - λ_j I)^k v = 0 for some positive integer k. Furthermore we have (T - λ_j I)^k Tv = T(T - λ_j I)^k v = T(0) = 0, i.e. Tv ∈ E_j.

(3) is a reformulation of the definition of a generalized eigenvector.

(4) Let λ be an eigenvalue of T restricted to E_j, with corresponding non-zero eigenvector v ∈ E_j. Then (T - λ_j I)v = (λ - λ_j)v, and hence (T - λ_j I)^k v = (λ - λ_j)^k v for each positive integer k. But v is a generalized eigenvector of T corresponding to λ_j, so the left-hand side of the equation is 0 for some k, i.e. λ = λ_j.

The next theorem connects the minimal polynomial of T to the decomposition of V as a direct sum of generalized eigenspaces.

Theorem 0.6. Let λ_1, ..., λ_m be the distinct eigenvalues of T, let E_j denote the space of generalized eigenvectors corresponding to λ_j, and let α_j be the smallest positive integer such that (T - λ_j I)^{α_j} v = 0 for every v ∈ E_j. Let

    m(x) = (x - λ_1)^{α_1} (x - λ_2)^{α_2} ··· (x - λ_m)^{α_m}.

Then

(1) m has degree at most dim(V);
(2) if p is an annihilating polynomial of T, then p is a polynomial multiple of m;
(3) m is the minimal polynomial of T.

Proof. (1) Each α_j is at most the dimension of E_j, and V = E_1 ⊕ ... ⊕ E_m gives that the α_j's add up to at most n.

(2) Let p be a polynomial such that p(T) = 0. We show that p is a polynomial multiple of each (x - λ_j)^{α_j}. Fix j. Then p has to be of the form

    p(x) = a (x - r_1)^{δ_1} (x - r_2)^{δ_2} ··· (x - r_M)^{δ_M} (x - λ_j)^δ,

where a is a non-zero complex number, the r_k's are complex numbers all different from λ_j, the δ_k's are positive integers, and δ is a non-negative integer. Suppose v ∈ E_j. Then (T - λ_j I)^δ v is also in E_j. Now

    a (T - r_1 I)^{δ_1} (T - r_2 I)^{δ_2} ··· (T - r_M I)^{δ_M} (T - λ_j I)^δ v = p(T)v = 0

and (T - r_1 I)^{δ_1} (T - r_2 I)^{δ_2} ··· (T - r_M I)^{δ_M} is injective on E_j. Thus (T - λ_j I)^δ v = 0.
But v was an arbitrary element of E_j, so the minimality of α_j implies δ ≥ α_j, i.e. p is a polynomial multiple of (x - λ_j)^{α_j}.
(3) Suppose v is a vector in some E_j. Then m(T)v = 0. Because E_1, ..., E_m span V, we conclude that m(T) = 0. From (2) we know that no monic annihilating polynomial of lower degree exists, thus m must be the minimal polynomial.

Let λ be an eigenvalue of T. The multiplicity of λ (often called the algebraic multiplicity) is defined as the dimension of the space of generalized eigenvectors of T corresponding to λ. By Theorem 0.5 the sum of the multiplicities of all eigenvalues of T equals n. Let λ_1, ..., λ_m be the distinct eigenvalues of T, with corresponding multiplicities β_1, ..., β_m. Then the polynomial

    c(x) = (x - λ_1)^{β_1} ··· (x - λ_m)^{β_m}

is called the characteristic polynomial of T.

Theorem 0.7 (Cayley-Hamilton). Let c be the characteristic polynomial of T. Then c(T) = 0.

Proof. Note that α_j ≤ β_j = dim E_j, i.e. c is a polynomial multiple of the minimal polynomial m. Hence c(T) = m(T)·(remaining factors applied to T) = 0.

A linear operator on V is called diagonalizable if its eigenvectors span V, i.e. if V has a basis consisting of eigenvectors of T. In terms of our approach this may be expressed as follows: a linear operator T is diagonalizable if and only if all generalized eigenvectors of T are actually eigenvectors. If V carries an inner product, it turns out that the operators satisfying T*T = TT*, the so-called normal operators, are precisely those that are diagonalizable with respect to an orthonormal basis of eigenvectors; this is the spectral theorem.

References

[1] S. Axler. Linear Algebra Done Right. 2nd ed. Springer, New York, NY, 1997.

Fakultät für Mathematik, Nordbergstrasse 15, 1090 Wien, Austria
E-mail address: franz.luef@univie.ac.at
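As a concluding numerical illustration of the Cayley-Hamilton theorem: for a 2x2 matrix the characteristic polynomial specializes to c(x) = x^2 - (tr A)x + det A, and the sketch below (the matrix A is an arbitrary example of ours) checks that c(A) is the zero matrix.

```python
# Verifying Cayley-Hamilton for a 2x2 example:
# c(x) = x^2 - tr(A) x + det(A), and c(A) is the zero matrix.

def matmul(A, B):
    """Multiply two square matrices given as lists of rows."""
    n = len(A)
    return [[sum(A[i][k] * B[k][j] for k in range(n)) for j in range(n)]
            for i in range(n)]

A = [[1, 2],
     [3, 4]]
tr = A[0][0] + A[1][1]                         # trace: 5
det = A[0][0] * A[1][1] - A[0][1] * A[1][0]    # determinant: -2
A2 = matmul(A, A)

# c(A) = A^2 - tr(A)*A + det(A)*I, computed entrywise.
cA = [[A2[i][j] - tr * A[i][j] + (det if i == j else 0) for j in range(2)]
      for i in range(2)]
print(cA)   # [[0, 0], [0, 0]]
```

Here the eigenvalues of A are irrational, so the factored form c(x) = (x - λ_1)(x - λ_2) is awkward to write down exactly; expanding it to x^2 - (λ_1 + λ_2)x + λ_1 λ_2 = x^2 - (tr A)x + det A sidesteps that entirely.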
10. Noether Normalization and Hilbert s Nullstellensatz 91 10. Noether Normalization and Hilbert s Nullstellensatz In the last chapter we have gained much understanding for integral and finite ring extensions.
More informationEigenvalues and Eigenvectors
November 3, 2016 1 Definition () The (complex) number λ is called an eigenvalue of the n n matrix A provided there exists a nonzero (complex) vector v such that Av = λv, in which case the vector v is called
More informationJordan Normal Form. Chapter Minimal Polynomials
Chapter 8 Jordan Normal Form 81 Minimal Polynomials Recall p A (x) =det(xi A) is called the characteristic polynomial of the matrix A Theorem 811 Let A M n Then there exists a unique monic polynomial q
More informationBASIC ALGORITHMS IN LINEAR ALGEBRA. Matrices and Applications of Gaussian Elimination. A 2 x. A T m x. A 1 x A T 1. A m x
BASIC ALGORITHMS IN LINEAR ALGEBRA STEVEN DALE CUTKOSKY Matrices and Applications of Gaussian Elimination Systems of Equations Suppose that A is an n n matrix with coefficents in a field F, and x = (x,,
More informationCartan s Criteria. Math 649, Dan Barbasch. February 26
Cartan s Criteria Math 649, 2013 Dan Barbasch February 26 Cartan s Criteria REFERENCES: Humphreys, I.2 and I.3. Definition The Cartan-Killing form of a Lie algebra is the bilinear form B(x, y) := Tr(ad
More informationRemark By definition, an eigenvector must be a nonzero vector, but eigenvalue could be zero.
Sec 6 Eigenvalues and Eigenvectors Definition An eigenvector of an n n matrix A is a nonzero vector x such that A x λ x for some scalar λ A scalar λ is called an eigenvalue of A if there is a nontrivial
More informationMATH 110 SOLUTIONS TO THE PRACTICE FINAL EXERCISE 1. [Cauchy-Schwarz inequality]
MATH 110 SOLUTIONS TO THE PRACTICE FINAL PEYAM TABRIZIAN Note: There might be some mistakes and typos. Please let me know if you find any! EXERCISE 1 Theorem: [Cauchy-Schwarz inequality] Let V be a vector
More information1.4 Solvable Lie algebras
1.4. SOLVABLE LIE ALGEBRAS 17 1.4 Solvable Lie algebras 1.4.1 Derived series and solvable Lie algebras The derived series of a Lie algebra L is given by: L (0) = L, L (1) = [L, L],, L (2) = [L (1), L (1)
More informationJORDAN NORMAL FORM NOTES
18.700 JORDAN NORMAL FORM NOTES These are some supplementary notes on how to find the Jordan normal form of a small matrix. First we recall some of the facts from lecture, next we give the general algorithm
More informationGeneralized eigenvector - Wikipedia, the free encyclopedia
1 of 30 18/03/2013 20:00 Generalized eigenvector From Wikipedia, the free encyclopedia In linear algebra, for a matrix A, there may not always exist a full set of linearly independent eigenvectors that
More informationLinear Algebra- Final Exam Review
Linear Algebra- Final Exam Review. Let A be invertible. Show that, if v, v, v 3 are linearly independent vectors, so are Av, Av, Av 3. NOTE: It should be clear from your answer that you know the definition.
More informationDefinitions for Quizzes
Definitions for Quizzes Italicized text (or something close to it) will be given to you. Plain text is (an example of) what you should write as a definition. [Bracketed text will not be given, nor does
More informationMATH 304 Linear Algebra Lecture 33: Bases of eigenvectors. Diagonalization.
MATH 304 Linear Algebra Lecture 33: Bases of eigenvectors. Diagonalization. Eigenvalues and eigenvectors of an operator Definition. Let V be a vector space and L : V V be a linear operator. A number λ
More informationLecture 7: Positive Semidefinite Matrices
Lecture 7: Positive Semidefinite Matrices Rajat Mittal IIT Kanpur The main aim of this lecture note is to prepare your background for semidefinite programming. We have already seen some linear algebra.
More informationHomework For each of the following matrices, find the minimal polynomial and determine whether the matrix is diagonalizable.
Math 5327 Fall 2018 Homework 7 1. For each of the following matrices, find the minimal polynomial and determine whether the matrix is diagonalizable. 3 1 0 (a) A = 1 2 0 1 1 0 x 3 1 0 Solution: 1 x 2 0
More informationLA-3 Lecture Notes. Karl-Heinz Fieseler. Uppsala 2014
LA-3 Lecture Notes Karl-Heinz Fieseler Uppsala 2014 1 Contents 1 Dual space 2 2 Direct sums and quotient spaces 4 3 Bilinear forms 7 4 Jordan normal form 8 5 Minimal and characteristic polynomial 17 6
More information