DIAGONALIZATION BY SIMILARITY TRANSFORMATIONS

The correct choice of a coordinate system (or basis) can often simplify the form of an equation or the analysis of a particular problem. For example, consider the obliquely oriented ellipse in Figure 7.2.1, whose equation in the xy-coordinate system contains a cross-product (xy) term. By rotating the xy-coordinate system counterclockwise through an angle of 45 degrees into a uv-coordinate system by means of the rotation (5.6.13) on p. 326, the cross-product term is eliminated and the equation of the ellipse is simplified.

Similarity
Two n x n matrices A and B are said to be similar whenever there exists a nonsingular matrix P such that P^(-1)AP = B. The product P^(-1)AP is called a similarity transformation on A.
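The rotation idea can be sketched numerically. A rotation matrix P is orthogonal, so P^T = P^(-1), and changing coordinates in the quadratic form is itself a similarity transformation. The coefficients a and b below are hypothetical, chosen so that a 45-degree rotation removes the cross term (any form a x^2 + b xy + a y^2 with equal x^2 and y^2 coefficients works):

```python
import numpy as np

theta = np.pi / 4                      # 45-degree counterclockwise rotation
P = np.array([[np.cos(theta), -np.sin(theta)],
              [np.sin(theta),  np.cos(theta)]])

# hypothetical quadratic form a x^2 + b xy + a y^2, represented by
# the symmetric matrix Q = [[a, b/2], [b/2, a]]
a, b = 3.0, 2.0
Q = np.array([[a,       b / 2.0],
              [b / 2.0, a      ]])

# P is orthogonal, so P.T @ Q @ P is a similarity transformation of Q
Qrot = P.T @ Q @ P

# the off-diagonal (cross-product) entries vanish after the rotation
assert np.allclose(Qrot, np.diag(np.diag(Qrot)))
```

The columns of P are eigenvectors of Q, which is exactly why the rotated form is diagonal; this is the geometric content of the diagonalization discussed next.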

A Fundamental Problem
Given a square matrix A, reduce it to the simplest possible form by means of a similarity transformation. Diagonal matrices have the simplest form, so we first ask: is every square matrix similar to a diagonal matrix? Linear algebra and matrix theory would be simpler subjects if this were true, but it is not. For example, consider the nilpotent matrix

    A = [ 0  1 ]
        [ 0  0 ]

for which A^2 = 0. If A were similar to a diagonal matrix D, then D = P^(-1)AP would give D^2 = P^(-1)A^2P = 0, forcing D = 0 and hence A = 0, which is false. Thus A, as well as any other nonzero nilpotent matrix, is not similar to a diagonal matrix. Nonzero nilpotent matrices are not the only ones that cannot be diagonalized, but, as we will see, nilpotent matrices play a particularly important role in nondiagonalizability. So, if not all square matrices can be diagonalized by a similarity transformation, what are the characteristics of those that can? An answer is easily derived by examining the equation P^(-1)AP = D, or equivalently AP = PD. Comparing columns on both sides shows that P must be a matrix whose columns constitute n linearly independent eigenvectors, and that D is a diagonal matrix whose diagonal entries are the corresponding eigenvalues. It is straightforward to reverse this argument to prove the converse: if there exists a linearly independent set of n eigenvectors that are used as columns to build a nonsingular matrix P, and if D is the diagonal matrix whose diagonal entries are the corresponding eigenvalues, then P^(-1)AP = D. Below is a summary.
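The criterion above can be checked numerically. A minimal sketch using NumPy (the matrices are illustrative assumptions, not from the text): when the eigenvectors returned by `eig` are independent, the matrix P they form diagonalizes A, while for a nonzero nilpotent matrix the eigenvector matrix is rank deficient.

```python
import numpy as np

# a diagonalizable example: the columns of P are n independent eigenvectors
A = np.array([[4.0, 1.0],
              [2.0, 3.0]])
evals, P = np.linalg.eig(A)           # eigenpairs of A
D = np.diag(evals)
assert np.allclose(np.linalg.inv(P) @ A @ P, D)   # P^(-1) A P = D

# a nonzero nilpotent matrix: there is no second independent eigenvector,
# so the eigenvector matrix Q returned by eig is rank deficient
N = np.array([[0.0, 1.0],
              [0.0, 0.0]])
_, Q = np.linalg.eig(N)
rank = np.linalg.matrix_rank(Q)       # less than 2: N is not diagonalizable
```

Here A has the distinct eigenvalues 5 and 2, so a full set of independent eigenvectors is guaranteed, while N repeats the eigenvalue 0 with only a one-dimensional eigenspace.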

Since not all square matrices are diagonalizable, it is natural to inquire about the next best thing: can every square matrix be triangularized by similarity? This time the answer is yes, but before explaining why, we need to make the following observation.

Similarity Preserves Eigenvalues
Row reductions do not preserve eigenvalues (try a simple example). However, similar matrices have the same characteristic polynomial, so they have the same eigenvalues with the same multiplicities. Caution! Similar matrices need not have the same eigenvectors. In the context of linear operators, this means that the eigenvalues of a matrix representation of an operator L are invariant under a change of basis. In other words, the eigenvalues are intrinsic to L in the sense that they are independent of any coordinate representation.

Now we can establish the fact that every square matrix can be triangularized by a similarity transformation. In fact, as Issai Schur realized in 1909, the similarity transformation can always be made unitary: for every square A there is a unitary U such that U*AU = T is upper triangular, with the eigenvalues of A on the diagonal of T.

The Cayley-Hamilton theorem asserts that every square matrix satisfies its own characteristic equation p(lambda) = 0; that is, p(A) = 0. Problem: Show how the Cayley-Hamilton theorem follows from Schur's triangularization theorem. Solution: Schur's theorem ensures the existence of a unitary U such that U*AU = T is triangular, and the development allows the eigenvalues of A to appear in any given order on the diagonal of T. So, if sigma(A) = {lambda_1, lambda_2, ..., lambda_k} with lambda_i repeated a_i times, then there is a unitary U for which T is block upper triangular, its i-th diagonal block T_i being an a_i x a_i upper-triangular matrix with lambda_i repeated on its diagonal.
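These facts are easy to verify numerically. A minimal sketch with hypothetical matrices, using SciPy's Schur factorization and a direct check of Cayley-Hamilton in the 2 x 2 case, where p(lambda) = lambda^2 - tr(A) lambda + det(A):

```python
import numpy as np
from scipy.linalg import schur

A = np.array([[1.0, 2.0],
              [3.0, 4.0]])            # hypothetical example matrix
S = np.array([[1.0, 1.0],
              [0.0, 1.0]])            # any nonsingular matrix
B = np.linalg.inv(S) @ A @ S

# similar matrices share the same eigenvalues with the same multiplicities
assert np.allclose(np.sort(np.linalg.eigvals(A)), np.sort(np.linalg.eigvals(B)))

# Schur: a unitary U with U* A U = T upper triangular
T, U = schur(A)
assert np.allclose(U.conj().T @ A @ U, T)
assert abs(T[1, 0]) < 1e-10           # triangular (eigenvalues here are real)

# Cayley-Hamilton for 2 x 2: p(A) = A^2 - tr(A) A + det(A) I = 0
pA = A @ A - np.trace(A) * A + np.linalg.det(A) * np.eye(2)
assert np.allclose(pA, 0.0)
```

Note that `scipy.linalg.schur` returns the real Schur form by default; since this A has real eigenvalues, T comes out genuinely triangular.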

Since the diagonal block of T corresponding to lambda_i (call it T_i) is upper triangular with lambda_i repeated on its diagonal, T_i - lambda_i I is a nilpotent upper-triangular matrix of order a_i, so (T_i - lambda_i I)^(a_i) = 0. A block-multiplication argument then shows that p(T) = (T - lambda_1 I)^(a_1) (T - lambda_2 I)^(a_2) ... (T - lambda_k I)^(a_k) = 0, and thus p(A) = U p(T) U* = 0. Schur's theorem is not the complete story on triangularizing by similarity. By allowing nonunitary similarity transformations, the structure of the upper-triangular matrix T can be simplified to contain zeros everywhere except on the diagonal and the superdiagonal (the diagonal immediately above the main diagonal). This is the Jordan form developed on p. 590, but some of the seeds are sown here.


Determining whether or not an n x n matrix A is diagonalizable is equivalent to determining whether or not A has a complete set of n linearly independent eigenvectors, and this can be done if you are willing and able to compute all of the eigenvalues and eigenvectors of A. But this brute-force approach can be a monumental task. Fortunately, there are theoretical tools that help determine how many linearly independent eigenvectors a given matrix possesses.

Independent Eigenvectors
Let {lambda_1, lambda_2, ..., lambda_k} be a set of distinct eigenvalues of A.
If {(lambda_1, x_1), (lambda_2, x_2), ..., (lambda_k, x_k)} is a set of eigenpairs for A, then S = {x_1, x_2, ..., x_k} is a linearly independent set. (7.2.3)
If B_i is a basis for the null space N(A - lambda_i I), then B = B_1 U B_2 U ... U B_k is a linearly independent set. (7.2.4)
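Statement (7.2.4) can be sketched in code: compute a basis for each eigenspace N(A - lambda_i I) and stack the bases side by side; the result is linearly independent, and A is diagonalizable exactly when the total number of basis vectors reaches n. The symmetric matrix below is an illustrative assumption with distinct eigenvalues 10 and 1 (algebraic multiplicities 1 and 2):

```python
import numpy as np
from scipy.linalg import null_space

# illustrative symmetric matrix with eigenvalues 10, 1, 1
A = np.array([[5.0, 4.0, 2.0],
              [4.0, 5.0, 2.0],
              [2.0, 2.0, 2.0]])
distinct_eigenvalues = [10.0, 1.0]

# B_i = basis for the eigenspace N(A - lambda_i I), per (7.2.4)
blocks = [null_space(A - lam * np.eye(3)) for lam in distinct_eigenvalues]
B = np.hstack(blocks)                 # the union B = B_1 U B_2

# the union of eigenspace bases is linearly independent; here it
# contains 3 = n vectors, so A is diagonalizable
rank = np.linalg.matrix_rank(B)
```

For this A the eigenspace for 10 is one-dimensional and the eigenspace for 1 is two-dimensional, so B has full rank 3.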

These results lead to the following characterization of diagonalizability.

If an n x n matrix A happens to have n distinct eigenvalues, then each eigenvalue is simple. This means that geo mult_A(lambda) = alg mult_A(lambda) = 1 for each lambda, so (7.2.5) produces the following corollary guaranteeing diagonalizability.

Distinct Eigenvalues
If no eigenvalue of A is repeated, then A is diagonalizable. (7.2.6)
Caution! The converse is not true: a matrix with repeated eigenvalues can still be diagonalizable (the identity matrix is an extreme example). An elegant and more geometrical way of expressing diagonalizability is now presented to help simplify subsequent analyses and pave the way for extensions.
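Both the corollary and the caution can be illustrated numerically (the example matrices are assumptions, not from the text): distinct eigenvalues force a nonsingular eigenvector matrix, while the identity matrix repeats an eigenvalue yet is trivially diagonalizable.

```python
import numpy as np

# distinct eigenvalues (1 and 2) guarantee n independent eigenvectors
A = np.array([[1.0, 1.0],
              [0.0, 2.0]])
evals, P = np.linalg.eig(A)
assert np.linalg.matrix_rank(P) == 2   # P nonsingular: A is diagonalizable

# the converse fails: eye(2) has the eigenvalue 1 repeated twice,
# yet it is already diagonal, hence diagonalizable
evals_I, P_I = np.linalg.eig(np.eye(2))
assert np.linalg.matrix_rank(P_I) == 2  # still a full set of eigenvectors
```

The point is that (7.2.6) is a sufficient condition only; repeated eigenvalues require checking the geometric multiplicities, as in (7.2.4).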