Eigenvalues, Eigenvectors, and Diagonalization

Math 240, TA: Shuyi Weng, Winter 2017, February 23, 2017

The concepts of eigenvalues, eigenvectors, and diagonalization are best studied with examples. We will use some specific matrices as examples here.

Example 1. Consider the 2 × 2 matrix

A = \begin{pmatrix} 1 & 3 \\ 3 & 1 \end{pmatrix}.

First, this matrix corresponds to a linear transformation T : R^2 → R^2 defined by T(x) = Ax for any vector x ∈ R^2. Let us first investigate this linear transformation. Consider the vectors

x_1 = \begin{pmatrix} 1 \\ 0 \end{pmatrix}, \quad x_2 = \begin{pmatrix} 0 \\ 1 \end{pmatrix}, \quad x_3 = \begin{pmatrix} 1 \\ 1 \end{pmatrix}, \quad x_4 = \begin{pmatrix} 1 \\ -1 \end{pmatrix}.

The images of these vectors under T are

Ax_1 = \begin{pmatrix} 1 \\ 3 \end{pmatrix}, \quad Ax_2 = \begin{pmatrix} 3 \\ 1 \end{pmatrix}, \quad Ax_3 = \begin{pmatrix} 4 \\ 4 \end{pmatrix}, \quad Ax_4 = \begin{pmatrix} -2 \\ 2 \end{pmatrix}.

If we draw each original vector and its image in the same picture, we notice that the vector x_1 forms an angle with its image Ax_1, and so does x_2. However, x_3 and its image Ax_3 are collinear, and so are x_4 and its image Ax_4. In other words, Ax_3 is a scalar multiple of x_3, and Ax_4 is a scalar multiple of x_4. Here, we call x_3 and x_4 eigenvectors of the matrix A, and the scalars λ_3, λ_4 ∈ R such that Ax_3 = λ_3 x_3 and Ax_4 = λ_4 x_4 are called eigenvalues of the matrix A. The precise definitions of these two concepts are given below.
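Before formalizing anything, it can help to check Example 1 numerically. The minimal sketch below (assuming NumPy is available; the variable names are mine) applies A to the four vectors and tests whether each vector is collinear with its image.

import numpy as np

# The matrix from Example 1 and the four test vectors.
A = np.array([[1, 3],
              [3, 1]])
vectors = {"x1": np.array([1, 0]),
           "x2": np.array([0, 1]),
           "x3": np.array([1, 1]),
           "x4": np.array([1, -1])}

for name, x in vectors.items():
    Ax = A @ x
    # x is an eigenvector exactly when Ax is a scalar multiple of x,
    # i.e. when the 2x2 matrix with columns x and Ax is singular.
    collinear = np.isclose(np.linalg.det(np.column_stack([x, Ax])), 0)
    print(name, x, "->", Ax, "collinear:", collinear)

Only x_3 and x_4 come out collinear with their images, matching the picture described above.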

Definition. Let A be an n × n matrix. If there is a scalar λ and a nonzero vector v such that they satisfy the matrix equation Av = λv, then we call λ an eigenvalue of A, and v an eigenvector of A corresponding to the eigenvalue λ.

Remark. Given an eigenvalue λ of the matrix A, the eigenvector corresponding to λ is not unique. In fact, if v is such an eigenvector, then any nonzero scalar multiple of v is also an eigenvector corresponding to λ: by linearity, A(cv) = c(Av) = c(λv) = λ(cv) for any scalar c ≠ 0.

We want a systematic procedure to find the eigenvalues and eigenvectors of any given matrix A. In other words, we want to describe all possible solutions of the matrix equation Av = λv, where both λ and v are unknown to us. However, notice that λv = λIv, where I is the identity matrix. Moving every term to the left-hand side, we see that solving Av = λv is equivalent to solving the homogeneous system

(A - λI)v = 0.    (1)

By the invertible matrix theorem, this system has nontrivial solutions if and only if det(A - λI) = 0. This is called the characteristic equation. Notice that det(A - λI) is a polynomial of degree n in the variable λ; it is called the characteristic polynomial of A. The problem of finding all eigenvalues now reduces to finding all roots of the characteristic polynomial, which we (usually) know how to do. After finding the eigenvalues, we can solve the homogeneous system (1), or equivalently compute the null space of the matrix A - λI, to obtain the eigenvectors corresponding to each eigenvalue.

Remark. By the construction above, the eigenvectors corresponding to a specific eigenvalue λ, together with the zero vector, form a linear subspace. This subspace is called the eigenspace of A corresponding to λ.

Example 2. We still consider the matrix

A = \begin{pmatrix} 1 & 3 \\ 3 & 1 \end{pmatrix}.

The characteristic polynomial of A is

χ_A(λ) = det(A - λI) = \det \begin{pmatrix} 1 - λ & 3 \\ 3 & 1 - λ \end{pmatrix} = (1 - λ)^2 - 9 = λ^2 - 2λ - 8 = (λ + 2)(λ - 4).
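As a quick numerical illustration of this procedure (a sketch assuming NumPy; it uses the fact that for a 2 × 2 matrix the characteristic polynomial is λ^2 - tr(A)λ + det(A)), we can find the eigenvalues of A both as roots of the characteristic polynomial and with the built-in eigenvalue routine:

import numpy as np

A = np.array([[1, 3],
              [3, 1]])

# Characteristic polynomial of a 2x2 matrix: lambda^2 - trace(A)*lambda + det(A).
coeffs = [1, -np.trace(A), np.linalg.det(A)]
print(np.roots(coeffs))        # roots of the characteristic polynomial: 4 and -2
print(np.linalg.eigvals(A))    # eigenvalues computed directly: 4 and -2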

Hence, the eigenvalues of A are λ_1 = -2 and λ_2 = 4. For the eigenvalue λ_1 = -2, the corresponding eigenspace consists of all solutions of the matrix equation (A + 2I)v = 0, i.e., this eigenspace is

E_{-2} = nul(A + 2I) = nul \begin{pmatrix} 3 & 3 \\ 3 & 3 \end{pmatrix} = span\left\{ \begin{pmatrix} 1 \\ -1 \end{pmatrix} \right\}.

Similarly, the eigenspace corresponding to the eigenvalue λ_2 = 4 is

E_4 = nul(A - 4I) = nul \begin{pmatrix} -3 & 3 \\ 3 & -3 \end{pmatrix} = span\left\{ \begin{pmatrix} 1 \\ 1 \end{pmatrix} \right\}.

As verified in Example 1, the vectors v_1 = (1, -1) and v_2 = (1, 1) are eigenvectors of A.

One nice application of eigenvalues and eigenvectors is to diagonalize a matrix. Before that, we need to introduce the concept of similarity.

Definition. Let A and B be n × n matrices. We say that A and B are similar if there exists an invertible matrix P such that A = PBP^{-1}. Sometimes, the matrix P is referred to as the change-of-coordinates matrix.

Theorem 1. If A and B are similar, then they have the same characteristic polynomial, and hence the same eigenvalues.

Remark. The converse of this theorem is not true.

The motivation behind diagonalizing a matrix is to obtain a simpler way of computing matrix powers than arduously performing repeated matrix multiplication. Given a matrix A with an eigenvalue λ and corresponding eigenspace E_λ, we have a pretty good understanding of the action of A^k on the eigenspace E_λ: each application of A multiplies each vector in E_λ by the scalar λ, and there are a total of k applications, so each vector in E_λ is multiplied by λ^k under the action of A^k. We want to extend this relatively simple multiplication to the whole vector space R^n. If we assume that the n × n matrix A has n eigenvalues λ_1, ..., λ_n, counting multiplicity, and n linearly independent eigenvectors v_1, ..., v_n, with each v_i an eigenvector corresponding to the eigenvalue λ_i, then, by the invertible matrix theorem, the matrix

P = [v_1 ··· v_n]
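The two eigenspaces can also be computed as null spaces in code. The sketch below assumes SciPy is available; scipy.linalg.null_space returns an orthonormal basis, so the columns it prints are scaled versions of (1, -1) and (1, 1).

import numpy as np
from scipy.linalg import null_space

A = np.array([[1, 3],
              [3, 1]], dtype=float)
I = np.eye(2)

# Eigenspace for lambda = -2: the null space of A + 2I.
print(null_space(A + 2 * I))   # one column, parallel to (1, -1)
# Eigenspace for lambda = 4: the null space of A - 4I.
print(null_space(A - 4 * I))   # one column, parallel to (1, 1)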

is invertible. Furthermore, since the columns of P are eigenvectors, we have

AP = [λ_1 v_1 ··· λ_n v_n] = [v_1 ··· v_n] \begin{pmatrix} λ_1 & & 0 \\ & \ddots & \\ 0 & & λ_n \end{pmatrix} = PD,

where D is the diagonal matrix with the eigenvalues as its diagonal entries. Therefore, we have A = PDP^{-1}, and we have just diagonalized the matrix A.

Definition. A matrix A is diagonalizable if it is similar to a diagonal matrix. The diagonalization of a diagonalizable matrix A is the process described above, which achieves

A = PDP^{-1},

where P is invertible and D is diagonal.

Example 3. We go back to the examples with the matrix

A = \begin{pmatrix} 1 & 3 \\ 3 & 1 \end{pmatrix}.

In Example 2, we computed the eigenvalues and their corresponding eigenvectors:

λ_1 = -2, v_1 = \begin{pmatrix} 1 \\ -1 \end{pmatrix}; \quad λ_2 = 4, v_2 = \begin{pmatrix} 1 \\ 1 \end{pmatrix}.

By the diagonalization process described above, let

D = \begin{pmatrix} -2 & 0 \\ 0 & 4 \end{pmatrix} \quad and \quad P = \begin{pmatrix} 1 & 1 \\ -1 & 1 \end{pmatrix}.

Then A = PDP^{-1} is a diagonalization of A.

Example 4. If we want to compute the matrix power A^5, we do not have to perform tedious matrix multiplication four times. Notice that

A^5 = (PDP^{-1})^5 = PD(P^{-1}P)D(P^{-1}P)D(P^{-1}P)D(P^{-1}P)DP^{-1} = PD^5P^{-1}.

This computation is much simpler, since a power of a diagonal matrix is obtained by raising each diagonal entry to that power. Therefore,

A^5 = P \begin{pmatrix} (-2)^5 & 0 \\ 0 & 4^5 \end{pmatrix} P^{-1} = \begin{pmatrix} 1 & 1 \\ -1 & 1 \end{pmatrix} \begin{pmatrix} -32 & 0 \\ 0 & 1024 \end{pmatrix} \begin{pmatrix} 1/2 & -1/2 \\ 1/2 & 1/2 \end{pmatrix} = \begin{pmatrix} 496 & 528 \\ 528 & 496 \end{pmatrix}.
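The same computation is easy to verify numerically. The sketch below (assuming NumPy) builds P and D from Example 3, checks that A = PDP^{-1}, and computes A^5 as PD^5P^{-1}:

import numpy as np

A = np.array([[1, 3],
              [3, 1]])

# P has the eigenvectors from Example 3 as its columns; D holds the eigenvalues.
P = np.array([[1, 1],
              [-1, 1]])
D = np.diag([-2, 4])
P_inv = np.linalg.inv(P)

print(np.allclose(A, P @ D @ P_inv))                   # A = P D P^{-1}
A5 = P @ np.diag([(-2) ** 5, 4 ** 5]) @ P_inv          # P D^5 P^{-1}
print(A5)                                              # [[496. 528.] [528. 496.]]
print(np.allclose(A5, np.linalg.matrix_power(A, 5)))   # agrees with direct multiplication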

The following theorem characterizes the diagonalizable matrices.

Theorem 2. Let A be an n × n matrix. The following statements are equivalent.
1. A is diagonalizable.
2. A has n linearly independent eigenvectors.
3. The eigenvectors of A span R^n.
4. The sum of the dimensions of the eigenspaces of A equals n.

Remark. In other words, A is diagonalizable if and only if there are enough eigenvectors to form a basis of R^n. We call such a basis an eigenbasis of R^n.
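As a rough numerical companion to Theorem 2 (a sketch only: the function name is mine, it assumes NumPy, and floating-point rank decisions can misjudge borderline cases), one can test condition 2 by counting linearly independent eigenvectors:

import numpy as np

def is_diagonalizable(A, tol=1e-10):
    # Condition 2 of Theorem 2: A is diagonalizable iff it has n linearly
    # independent eigenvectors, i.e. the matrix of eigenvectors has full rank.
    n = A.shape[0]
    _, V = np.linalg.eig(A)                      # columns of V are eigenvectors
    return np.linalg.matrix_rank(V, tol=tol) == n

print(is_diagonalizable(np.array([[1., 3.], [3., 1.]])))   # True
print(is_diagonalizable(np.array([[1., 1.], [0., 1.]])))   # False: only one independent eigenvector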