
20 Diagonalization

Let $V$ and $W$ be vector spaces, with bases $S = \{e_1, \ldots, e_n\}$ and $T = \{f_1, \ldots, f_m\}$ respectively. Since these are bases, there exist constants $v_i$ and $w_j$ such that any vectors $v \in V$ and $w \in W$ can be written as:

$v = v_1 e_1 + v_2 e_2 + \cdots + v_n e_n$
$w = w_1 f_1 + w_2 f_2 + \cdots + w_m f_m$

We call the coefficients $v_1, \ldots, v_n$ the components of $v$ in the basis $\{e_1, \ldots, e_n\}$.

Example. Consider the basis $S = \{1 - t, 1 + t\}$ for the vector space $P_1(t)$. The vector $v = 2t$ has components $v_1 = -1$, $v_2 = 1$, because $v = -1(1 - t) + 1(1 + t)$.

We may consider these components as vectors in $\mathbb{R}^n$ and $\mathbb{R}^m$:

$\begin{pmatrix} v_1 \\ \vdots \\ v_n \end{pmatrix} \in \mathbb{R}^n, \qquad \begin{pmatrix} w_1 \\ \vdots \\ w_m \end{pmatrix} \in \mathbb{R}^m.$
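
Finding the components of a vector in a given basis amounts to solving a small linear system. The following minimal NumPy sketch (my own illustration; the variable names are not from the notes) verifies the example above.

```python
import numpy as np

# Components of v = 2t in the basis {1 - t, 1 + t} of P_1(t).
# Represent a + b*t by the coordinate vector (a, b) in the standard
# basis {1, t}; the chosen basis vectors become the columns of B.
B = np.array([[1.0, 1.0],    # constant terms of 1 - t and 1 + t
              [-1.0, 1.0]])  # t-coefficients of 1 - t and 1 + t
v = np.array([0.0, 2.0])     # v = 2t  ->  (a, b) = (0, 2)

# Solve B @ components = v for the components in {1 - t, 1 + t}.
components = np.linalg.solve(B, v)
print(components)            # expected: [-1.  1.]
```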

Now suppose we have a linear transformation $L \colon V \to W$. Then we can expect to write $L$ as an $m \times n$ matrix, turning an $n$-dimensional vector of coefficients corresponding to $v$ into an $m$-dimensional vector of coefficients for $w$. Using linearity, we write:

$L(v) = L(v_1 e_1 + v_2 e_2 + \cdots + v_n e_n) = v_1 L(e_1) + v_2 L(e_2) + \cdots + v_n L(e_n).$

This is a vector in $W$. Let's compute its components in $W$. We know that for each $e_j$, $L(e_j)$ is a vector in $W$, and can thus be written uniquely as a linear combination of vectors in the basis $T$. Then we can find coefficients $M^i_j$ such that:

$L(e_j) = f_1 M^1_j + \cdots + f_m M^m_j = \sum_{i=1}^m f_i M^i_j.$

We've written the $M^i_j$ on the right side of the $f$'s to agree with our previous notation for matrix multiplication. We have an "up-hill" rule where the matching indices for the multiplied objects run up and to the left, like so: $f_i M^i_j$.

Now $M^i_j$ is the $i$th component of $L(e_j)$. Regarding the coefficients $M^i_j$ as a matrix, we can see that the $j$th column of $M$ holds the coefficients of $L(e_j)$ in the basis $T$. Then we can write:

$L(v) = L(v_1 e_1 + v_2 e_2 + \cdots + v_n e_n)$
$\quad = v_1 L(e_1) + v_2 L(e_2) + \cdots + v_n L(e_n)$
$\quad = \sum_{j=1}^n v_j L(e_j)$
$\quad = \sum_{j=1}^n v_j (M^1_j f_1 + \cdots + M^m_j f_m)$
$\quad = \sum_{i=1}^m f_i \Big[ \sum_{j=1}^n M^i_j v_j \Big].$

The last equality is the definition of matrix multiplication. Thus:

$\begin{pmatrix} v_1 \\ \vdots \\ v_n \end{pmatrix} \overset{L}{\longmapsto} \begin{pmatrix} M^1_1 & \cdots & M^1_n \\ \vdots & & \vdots \\ M^m_1 & \cdots & M^m_n \end{pmatrix} \begin{pmatrix} v_1 \\ \vdots \\ v_n \end{pmatrix},$

and $M = (M^i_j)$ is called the matrix of $L$. Notice that this matrix depends on a choice of bases for $V$ and $W$.

Example. Let $L \colon P_1(t) \to P_1(t)$ be such that $L(a + bt) = (a + b)t$. Since $V = P_1(t) = W$, let's choose the same basis for $V$ and $W$; we'll use $\{1 - t, 1 + t\}$ for this example. Thus:

$L(1 - t) = (1 - 1)t = 0 = (1 - t)\cdot 0 + (1 + t)\cdot 0 \;\longmapsto\; \begin{pmatrix} 0 \\ 0 \end{pmatrix},$
$L(1 + t) = (1 + 1)t = 2t = (1 - t)\cdot(-1) + (1 + t)\cdot 1 \;\longmapsto\; \begin{pmatrix} -1 \\ 1 \end{pmatrix},$

so

$M = \begin{pmatrix} 0 & -1 \\ 0 & 1 \end{pmatrix}.$
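
To make the "columns are images of basis vectors" recipe concrete, here is a short NumPy sketch (my own illustration, not part of the notes) that rebuilds the matrix $M$ above for $L(a + bt) = (a + b)t$ in the basis $\{1 - t, 1 + t\}$ and checks it on $v = 2t$.

```python
import numpy as np

# Represent a + b*t by the coordinate vector (a, b) in the standard basis {1, t}.
B = np.array([[1.0, 1.0],    # columns: 1 - t and 1 + t in standard coordinates
              [-1.0, 1.0]])

def L(p):
    """L(a + b t) = (a + b) t, acting on standard-basis coordinates (a, b)."""
    a, b = p
    return np.array([0.0, a + b])

# Column j of M holds the components of L(e_j) in the basis {1 - t, 1 + t}.
M = np.column_stack([np.linalg.solve(B, L(B[:, j])) for j in range(2)])
print(M)                      # expected: [[ 0. -1.]
                              #            [ 0.  1.]]

# Check on v = 2t, whose components in {1 - t, 1 + t} are (-1, 1):
v_components = np.array([-1.0, 1.0])
print(M @ v_components)       # components of L(v) = 2t, expected: [-1.  1.]
```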

Now suppose we are lucky, and we have $L \colon V \to V$, and the basis $\{v_1, \ldots, v_n\}$ is a set of linearly independent eigenvectors for $L$, with eigenvalues $\lambda_1, \ldots, \lambda_n$. Then:

$L(v_1) = \lambda_1 v_1, \quad L(v_2) = \lambda_2 v_2, \quad \ldots, \quad L(v_n) = \lambda_n v_n.$

As a result, the matrix of $L$ is:

$M = \begin{pmatrix} \lambda_1 & & & \\ & \lambda_2 & & \\ & & \ddots & \\ & & & \lambda_n \end{pmatrix},$

where all entries off of the diagonal are zero.

We call the $n \times n$ matrix of a linear transformation $L \colon V \to V$ diagonalizable if there exists a collection of $n$ linearly independent eigenvectors for $L$. In other words, $L$ is diagonalizable if there exists a basis for $V$ of eigenvectors for $L$. In a basis of eigenvectors, the matrix of a linear transformation is diagonal. On the other hand, if an $n \times n$ matrix $M$ is diagonal, then the standard basis vectors $e_i$ are already a set of $n$ linearly independent eigenvectors for $M$.
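
Since diagonalizability is equivalent to having $n$ linearly independent eigenvectors, it can be tested numerically by asking whether the eigenvectors of a matrix span $\mathbb{R}^n$. Below is a small sketch of that check, assuming NumPy; the example matrices and the tolerance are my own choices, and the rank test is only a floating-point heuristic.

```python
import numpy as np

def is_diagonalizable(A, tol=1e-10):
    """Return True if the square matrix A has a full set of
    linearly independent eigenvectors (the columns of `vecs`)."""
    _, vecs = np.linalg.eig(A)
    return np.linalg.matrix_rank(vecs, tol=tol) == A.shape[0]

print(is_diagonalizable(np.array([[2.0, 0.0], [0.0, 3.0]])))  # True: already diagonal
print(is_diagonalizable(np.array([[1.0, 1.0], [0.0, 1.0]])))  # False: a Jordan block
```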

Change of Basis

Suppose we have two bases $S = \{v_1, \ldots, v_n\}$ and $T = \{u_1, \ldots, u_n\}$ for a vector space $V$. Then we may write each $v_i$ uniquely as a linear combination of the $u_j$:

$v_i = \sum_j u_j P^j_i.$

Here, the $P^j_i$ are constants, which we can regard as a matrix $P = (P^j_i)$. $P$ must have an inverse, since we can also write each $u_j$ uniquely as a linear combination of the $v_k$:

$u_j = \sum_k v_k Q^k_j.$

Then we can write:

$v_i = \sum_k \sum_j v_k Q^k_j P^j_i.$

But $\sum_j Q^k_j P^j_i$ is the $(k, i)$ entry of the product matrix $QP$. Since the only expression for $v_i$ in the basis $S$ is $v_i$ itself, $QP$ fixes each $v_i$. As a result, each $v_i$ is an eigenvector for $QP$ with eigenvalue $1$, so $QP$ is the identity.

The matrix $P$ is then called a change of basis matrix. Changing basis changes the matrix of a linear transformation. To wit, suppose $L \colon V \to V$ has matrix $M = (M^k_i)$ in the basis $T = \{u_1, \ldots, u_n\}$, so

$L(u_i) = \sum_k M^k_i u_k.$

Now, suppose that $S = \{v_1, \ldots, v_n\}$ is a basis of eigenvectors for $L$, with eigenvalues $\lambda_1, \ldots, \lambda_n$. Then

$L(v_i) = \lambda_i v_i = \sum_k v_k D^k_i,$

where $D$ is the diagonal matrix whose diagonal entries $D^k_k$ are the eigenvalues $\lambda_k$. Let $P$ be the change of basis matrix from the basis $T$ to the basis $S$. Then:

$L(v_i) = L\Big(\sum_j u_j P^j_i\Big) = \sum_j L(u_j) P^j_i = \sum_j \sum_k u_k M^k_j P^j_i.$

Meanwhile, we have:

$L(v_i) = \sum_k v_k D^k_i = \sum_k \sum_j u_j P^j_k D^k_i.$

In other words, we see that

$MP = PD \quad \text{or} \quad D = P^{-1} M P.$

We can summarize as follows: Change of basis multiplies vectors by the change of basis matrix $P$, to give vectors in the new basis. To get the matrix of a linear transformation in the new basis, we conjugate the matrix of $L$ by the change of basis matrix:

$M \mapsto P^{-1} M P.$
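
As a quick numerical illustration of $D = P^{-1} M P$: take the eigenvectors of a diagonalizable matrix as the columns of $P$ and conjugate. The sketch below assumes NumPy, and the matrix used is an arbitrary example of mine, not one from the notes.

```python
import numpy as np

# An arbitrary diagonalizable matrix (eigenvalues 5 and 2).
M = np.array([[4.0, 1.0],
              [2.0, 3.0]])

eigvals, P = np.linalg.eig(M)    # columns of P are eigenvectors of M
D = np.diag(eigvals)             # diagonal matrix of eigenvalues

# Changing to the eigenvector basis diagonalizes M.
print(np.linalg.inv(P) @ M @ P)                   # ~ D, up to rounding
print(np.allclose(np.linalg.inv(P) @ M @ P, D))   # True
```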

If for two matrices $N$ and $M$ there exists an invertible matrix $P$ such that $M = P^{-1} N P$, then we say that $M$ and $N$ are similar. The above discussion shows that diagonalizable matrices are similar to diagonal matrices.

References

Hefferon, Chapter Three, Section V: Change of Basis
Wikipedia: Change of Basis; Diagonalizable Matrix; Similar Matrix

Review Questions

1. Show that similarity of matrices is an equivalence relation. (The definition of an equivalence relation is given in Section 2, in the fourth review problem.)

2. When is the $2 \times 2$ matrix $\begin{pmatrix} a & b \\ c & d \end{pmatrix}$ diagonalizable? Include examples in your answer.

3. Let $P_n(t)$ be the vector space of polynomials of degree at most $n$, and let $\frac{d}{dt} \colon P_n(t) \to P_{n-1}(t)$ be the derivative operator. Find the matrix of $\frac{d}{dt}$ in the bases $\{1, t, \ldots, t^n\}$ for $P_n(t)$ and $\{1, t, \ldots, t^{n-1}\}$ for $P_{n-1}(t)$.

4. When writing a matrix for a linear transformation, we have seen that the choice of basis matters. In fact, even the order of the basis matters! Write all possible reorderings of the standard basis $\{e_1, e_2, e_3\}$ for $\mathbb{R}^3$. Write each change of basis matrix between the standard basis $\{e_1, e_2, e_3\}$ and each of its reorderings. What can you observe about these change of basis matrices? (Note: These matrices are known as permutation matrices.)

5. Given the linear transformation $L(x, y, z) = (2y - z, 3x, 2z + x + y)$, write the matrix $M$ for $L$ in the standard basis and in two other reorderings of the standard basis. Can you make any observations about the resulting matrices?