
Eigenvalues, Eigenvectors, Similarity, and Diagonalization

We now turn our attention to linear transformations of the form T : V → V. To better understand the effect of T on the vector space V, we begin by looking for T-invariant subspaces of V.

Definition (T-invariant subspace). Let V be a vector space, let W be a subspace of V, and let T : V → V be a linear transformation. We say that W is a T-invariant subspace of V if T(W) ⊆ W. In other words, W is T-invariant if for every w ∈ W, it must be the case that T(w) ∈ W.

Example. If T : P → P is differentiation (where P is the vector space of all polynomials with real coefficients), then P_k, the subspace of polynomials of degree at most k, is T-invariant for every k.

Example. If T : V → V is a linear transformation and W is a one-dimensional T-invariant subspace of V, then there exists a scalar λ such that T(w) = λw for all w ∈ W.
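The differentiation example can be checked in coordinates. Here is a small numerical sketch (the specific matrix is illustrative, not from the notes), writing differentiation on P_3 with respect to the monomial basis {1, x, x², x³}:

```python
import numpy as np

# Matrix of the differentiation map D : P_3 -> P_3 with respect to the
# monomial basis {1, x, x^2, x^3}.  Column j holds the coordinates of
# D(x^j) = j * x^(j-1).
D = np.zeros((4, 4))
for j in range(1, 4):
    D[j - 1, j] = j

# A polynomial of degree <= 2 has zero coordinate on x^3, and so does its
# image -- illustrating that P_2 is a D-invariant subspace of P_3.
p = np.array([5.0, -2.0, 3.0, 0.0])   # 5 - 2x + 3x^2
dp = D @ p                            # its derivative: -2 + 6x
print(dp)                             # [-2.  6.  0.  0.]
```

Note how the last coordinate of the image stays zero: applying D never raises the degree, which is exactly the invariance T(W) ⊆ W.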

Definition (direct sum). Let W_1, ..., W_k be subspaces of a vector space V. If every vector v ∈ V can be written as v = w_1 + ··· + w_k where w_i ∈ W_i, then we say V is a sum of the subspaces W_1, ..., W_k and write V = W_1 + ··· + W_k. Furthermore, if W_i ∩ (W_1 + ··· + W_{i−1} + W_{i+1} + ··· + W_k) = {0} for each i (when k > 2, this is stronger than requiring the pairwise intersections to be trivial), then we say V is the direct sum of W_1, ..., W_k and write V = W_1 ⊕ ··· ⊕ W_k. In this case, the expression v = w_1 + ··· + w_k is unique.

Example. R³ is the direct sum of the xy-plane and the z-axis.

We understand a linear transformation T : V → V if we understand what T does to all of the T-invariant subspaces of V. As such, we might ask for a direct sum decomposition V = W_1 ⊕ ··· ⊕ W_k of V into T-invariant subspaces: knowing what T does to each W_i will completely determine what T does to V.

The simplest T-invariant subspaces are the one-dimensional T-invariant subspaces. As we saw in the example above, the vectors in these spaces are simply scaled when T acts on them. Our study of eigenvalues and eigenvectors will help us better understand such T-invariant subspaces.
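The R³ example above can be sketched numerically; the vector chosen below is arbitrary:

```python
import numpy as np

# Direct sum decomposition R^3 = (xy-plane) + (z-axis): every vector splits
# uniquely into a component from each subspace.
v = np.array([3.0, -1.0, 4.0])
w1 = np.array([v[0], v[1], 0.0])   # component in the xy-plane
w2 = np.array([0.0, 0.0, v[2]])    # component on the z-axis

assert np.allclose(w1 + w2, v)     # v = w1 + w2
print(w1, w2)
```

Because the two subspaces intersect only in the zero vector, this split is the only one possible, which is exactly the uniqueness claim in the definition.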

Definition (eigenvalue, eigenvector). Let V be a vector space over F, and let T : V → V be a linear transformation. An eigenvalue of T is a scalar λ ∈ F such that there is a nonzero vector v ∈ V with T(v) = λv. In this case, we say v is an eigenvector of T associated to λ. Also, we define the eigenspace of λ to be

E_λ = {v ∈ V : T(v) = λv} = {v ∈ V : (T − λI)v = 0} = ker(T − λI).

Note. Suppose V is finite-dimensional. If B is a basis for V, then

T(v) = λv  ⟺  [T(v)]_B = [λv]_B  ⟺  [T]_B [v]_B = λ[v]_B.

In other words, λ is an eigenvalue of T if and only if it is an eigenvalue of any matrix representation [T]_B of T. We can use this fact when actually trying to find the eigenvalues of T!
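This note is exactly how eigenvalues are found in practice: pick a basis, form the matrix, and compute. A minimal sketch (the matrix below is an illustrative choice, not from the notes):

```python
import numpy as np

# A matrix representation [T]_B of some transformation T on a 2-dimensional
# space; its eigenvalues are the eigenvalues of T itself.
A = np.array([[2.0, 1.0],
              [1.0, 2.0]])

eigvals, eigvecs = np.linalg.eig(A)
print(sorted(eigvals))               # eigenvalues 1.0 and 3.0

# Verify the defining equation T(v) = lambda * v for each eigenpair
# (eigenvectors are the columns of eigvecs).
for lam, v in zip(eigvals, eigvecs.T):
    assert np.allclose(A @ v, lam * v)
```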


Because the action of T on each of its eigenspaces is so straightforward, it's tempting to ask whether V can be written as a direct sum of its eigenspaces: V = E_{λ_1} ⊕ ··· ⊕ E_{λ_k}. If this is indeed the case, we say that T is diagonalizable.

Definition (diagonalizable). Let V be a finite-dimensional vector space, and let T : V → V be a linear transformation. Then T is diagonalizable if there is a basis C for V such that the matrix [T]_C is a diagonal matrix. Note that if [T]_C is diagonal, then the vectors in C are eigenvectors of T, and V is a direct sum of the eigenspaces of T.

Note. If V is finite-dimensional with basis B, then by Theorem 6.29 ([T]_C = P⁻¹[T]_B P), T : V → V is diagonalizable if and only if [T]_B is similar to a diagonal matrix.

Theorem. Let V be a finite-dimensional vector space. A linear transformation T : V → V is diagonalizable if and only if V has a basis of eigenvectors of T.
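The similarity relation [T]_C = P⁻¹[T]_B P can be verified directly: take the eigenvectors as the columns of P and check that conjugating by P produces a diagonal matrix. A sketch with an illustrative matrix:

```python
import numpy as np

A = np.array([[2.0, 1.0],
              [1.0, 2.0]])            # [T]_B for some T (illustrative)

eigvals, P = np.linalg.eig(A)         # columns of P: a basis C of eigenvectors
D = np.linalg.inv(P) @ A @ P          # change of basis: [T]_C = P^{-1} [T]_B P

# [T]_C is diagonal, with the eigenvalues on its diagonal.
assert np.allclose(D, np.diag(eigvals))
print(np.round(D, 10))
```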

Remember, in Math 12 we learned that...
- If T : V → V is a linear transformation and λ_1, ..., λ_m are distinct eigenvalues of T with corresponding eigenvectors v_1, ..., v_m, then {v_1, ..., v_m} is linearly independent.
- If V is n-dimensional and T has n distinct eigenvalues, then T is diagonalizable (but the converse is false!).
- T is invertible if and only if 0 is not an eigenvalue of T.
- T is diagonalizable if and only if the algebraic and geometric multiplicities of every eigenvalue are equal.
- The eigenvalues of a triangular matrix are the entries on its diagonal.
- Similar matrices have the same characteristic polynomial (and thus the same eigenvalues).
- Not all linear transformations are diagonalizable.
- Being diagonalizable has nothing to do with being invertible, and vice versa!
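Several of these facts can be seen at once in a single small example. The shear matrix below (an illustrative choice) is invertible but not diagonalizable, because its one eigenvalue has algebraic multiplicity 2 but geometric multiplicity 1:

```python
import numpy as np

# A shear: triangular, so its eigenvalues sit on the diagonal (both are 1),
# hence it is invertible (0 is not an eigenvalue) ...
A = np.array([[1.0, 1.0],
              [0.0, 1.0]])

eigvals, _ = np.linalg.eig(A)
assert np.allclose(eigvals, [1.0, 1.0])       # algebraic multiplicity 2

# ... but the eigenspace E_1 = ker(A - I) is only one-dimensional:
# rank(A - I) = 1, so dim ker(A - I) = 2 - 1 = 1 (geometric multiplicity 1).
assert np.linalg.matrix_rank(A - np.eye(2)) == 1
```

Since the multiplicities disagree, no basis of eigenvectors exists, so A is not similar to any diagonal matrix, even though it is invertible.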

Jordan Canonical Form

Even though not all linear transformations are diagonalizable, it turns out we can get pretty close. More specifically, let V be a finite-dimensional complex vector space, and let T : V → V be a linear transformation. Then there is a basis B of V (consisting of "generalized eigenvectors") such that [T]_B is block-diagonal and each block is a "Jordan block": a square matrix with a single eigenvalue λ repeated down the diagonal, 1's on the superdiagonal, and 0's everywhere else.
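A Jordan form can be computed symbolically; here is a sketch using SymPy's `jordan_form` on an illustrative matrix that is not diagonalizable:

```python
from sympy import Matrix

# A non-diagonalizable matrix: eigenvalue 2 with algebraic multiplicity 2
# but only a one-dimensional eigenspace, plus a simple eigenvalue 3.
A = Matrix([[2, 1, 0],
            [0, 2, 0],
            [0, 0, 3]])

P, J = A.jordan_form()      # P holds generalized eigenvectors; A = P J P^{-1}
print(J)

# J is block-diagonal: a 2x2 Jordan block for eigenvalue 2 (with a 1 on the
# superdiagonal) and a 1x1 block for eigenvalue 3.
assert A == P * J * P.inv()
```

The columns of P form the basis B of generalized eigenvectors described above, and J is the block-diagonal matrix [T]_B.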