Eigenvalues and Eigenvectors


Chia-Ping Chen, Department of Computer Science and Engineering, National Sun Yat-sen University. Linear Algebra.

Eigenvalue Problem

Eigenvalue Equation. By definition, the eigenvalue equation for a matrix A is Ax = λx. Note that A must be square. This equation is satisfied by a vector x that, under the transformation by A, is mapped to a vector in the subspace spanned by x. Solving an eigenvalue equation is an eigenvalue problem.

Consider the unknowns λ and x in the eigenvalue equation for A, Ax = λx. A solution for λ is an eigenvalue of A. The set of eigenvalues of A is called the spectrum of A. A solution for x is an eigenvector of A. Note that if x is an eigenvector of A, then cx is also an eigenvector of A for any c ≠ 0.

Solving an Eigenvalue Equation. An eigenvalue equation Ax = λx is non-linear, as it involves the product λx of the unknowns. How do we find a solution? The key is to turn the non-linear equation into linear problems: first find the eigenvalues by solving a non-linear equation in λ alone; then, for each eigenvalue, find the eigenvectors with that eigenvalue by solving a system of linear equations for x.

Finding Eigenvalues. An eigenvalue λ of A must satisfy the characteristic equation of A: |A − λI| = 0. Why? There is x ≠ 0 such that Ax = λx ⇔ there is x ≠ 0 such that (A − λI)x = 0 ⇔ (A − λI) is singular ⇔ |A − λI| = 0.
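
As a quick numerical cross-check (an illustration added here, not part of the lecture), the sketch below finds eigenvalues both as roots of the characteristic polynomial and with NumPy's built-in routine, using the 2×2 matrix that appears in the examples below.

```python
import numpy as np

# Illustrative sketch: eigenvalues as roots of |A - lambda*I| = 0,
# compared against the library eigenvalue routine.
A = np.array([[4.0, -5.0],
              [2.0, -3.0]])

coeffs = np.poly(A)                         # char. poly coefficients: [1, -1, -2]
lam_roots = np.sort(np.roots(coeffs))       # roots of the characteristic equation
lam_direct = np.sort(np.linalg.eigvals(A))  # direct computation

print(lam_roots)   # [-1.  2.]
print(lam_direct)  # [-1.  2.]
```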

Finding Eigenvectors. For an eigenvalue λ of A, the eigenspace of A associated with λ is {x ≠ 0 : Ax = λx}. This set contains the eigenvectors of A with eigenvalue λ. Apart from the null vector, the eigenspace with eigenvalue λ is the same as the nullspace of (A − λI).

Example. Solve the eigenvalue equation for A = [4 −5; 2 −3]. |A − λI| = 0 ⇒ λ² − λ − 2 = 0 ⇒ λ = 2, −1. λ₁ = 2 ⇒ s₁ = [5; 2]. λ₂ = −1 ⇒ s₂ = [1; 1].

Example: Projection Matrix. Solve the eigenvalue equation for P = [1/2 1/2; 1/2 1/2]. |P − λI| = 0 ⇒ λ² − λ = 0 ⇒ λ = 1, 0. λ₁ = 1 ⇒ s₁ = [1; 1]. λ₂ = 0 ⇒ s₂ = [1; −1].

Example: Complex Eigenvalues. Solve the eigenvalue equation for K = [0 −1; 1 0]. |K − λI| = 0 ⇒ λ² + 1 = 0 ⇒ λ = i, −i. λ₁ = i ⇒ s₁ = [1; −i]. λ₂ = −i ⇒ s₂ = [1; i].

Number of Eigenvalues. A matrix of order n × n has at most n distinct eigenvalues. Why? The determinant |A − λI| = |a₁₁−λ ⋯ a₁ₙ; ⋮ ⋱ ⋮; aₙ₁ ⋯ aₙₙ−λ| is a polynomial in λ of degree n, with no more than n roots. Thus, we can denote the spectrum of A by {λ₁, ..., λₖ}, k ≤ n.

Eigenvalues vs. Distinct Eigenvalues. Let the spectrum of an n × n matrix A be {λ₁, ..., λₖ}. Then the characteristic polynomial of A can be written as |A − λI| = c ∏_{i=1}^{k} (λ − λᵢ)^γᵢ, where ∑_{i=1}^{k} γᵢ = n. Sometimes we denote the (not-necessarily-distinct) eigenvalues by λ₁, ..., λₙ.

Trace and Sum of Eigenvalues. The sum of the diagonal elements of a square matrix is its trace. The sum of the eigenvalues of a matrix equals its trace. Why? |A − λI| = |a₁₁−λ ⋯ a₁ₙ; ⋮ ⋱ ⋮; aₙ₁ ⋯ aₙₙ−λ| = c(λ − λ₁)⋯(λ − λₙ). Equating the λⁿ terms on both sides gives (−λ)ⁿ = cλⁿ, so c = (−1)ⁿ. Equating the λⁿ⁻¹ terms gives (−λ)ⁿ⁻¹(a₁₁ + ⋯ + aₙₙ) = cλⁿ⁻¹((−λ₁) + ⋯ + (−λₙ)), so λ₁ + ⋯ + λₙ = a₁₁ + ⋯ + aₙₙ.

Determinant and Product of Eigenvalues. The product of the eigenvalues of a matrix equals its determinant. Why? |A − λI| = (−1)ⁿ ∏_{i=1}^{n} (λ − λᵢ). Setting λ = 0 gives |A| = (−1)ⁿ ∏_{i=1}^{n} (−λᵢ) = ∏_{i=1}^{n} λᵢ.
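
A quick numerical check of the last two facts (an added illustration; the random matrix is arbitrary):

```python
import numpy as np

# The eigenvalues of a matrix sum to its trace and multiply to its determinant.
rng = np.random.default_rng(0)
A = rng.standard_normal((4, 4))
lam = np.linalg.eigvals(A)

print(np.allclose(lam.sum(), np.trace(A)))        # True
print(np.allclose(lam.prod(), np.linalg.det(A)))  # True
```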

Diagonalization and Decomposition

Multiplicity of Eigenvalue. Let A be a matrix of order n × n with spectrum {λ₁, ..., λₖ}. The characteristic polynomial of A can be written as f(λ) = |A − λI| = (−1)ⁿ ∏_{i=1}^{k} (λ − λᵢ)^γᵢ. γᵢ is the algebraic multiplicity of λᵢ. For the eigenspace of A associated with λᵢ, N(A − λᵢI) ≠ {0}, and gᵢ = dim N(A − λᵢI) is the geometric multiplicity of λᵢ.

Sum of Multiplicities. For any i, γᵢ ≥ gᵢ. Since ∑_{i=1}^{k} γᵢ = n, we have ∑_{i=1}^{k} gᵢ ≤ n.

Defective Matrix. An n × n matrix is defective if ∑_{i=1}^{k} gᵢ < n. It is non-defective if ∑_{i=1}^{k} gᵢ = n. For example, [0 1; 0 0] is defective, while [0 0; 0 0] is not.

Set of Independent Eigenvectors. For a non-defective matrix of order n × n, there is a set of n linearly independent eigenvectors (i.e., a basis for Rⁿ) of the matrix. Why? Eigenvalue λᵢ corresponds to an eigenspace of dimension gᵢ, contributing gᵢ linearly independent eigenvectors to the basis. Eigenvectors associated with distinct eigenvalues are linearly independent. So n linearly independent eigenvectors can be found, forming a basis.

Eigenvalue Decomposition. Let s₁, ..., sₙ be linearly independent eigenvectors of a non-defective matrix A with eigenvalues λ₁, ..., λₙ. Then A = SΛS⁻¹, where S = [s₁ ... sₙ] and Λ = diag(λ₁, ..., λₙ). This is called the eigenvalue decomposition of a (non-defective) matrix. Why? Asᵢ = λᵢsᵢ ⇒ AS = SΛ ⇒ A = SΛS⁻¹.

Diagonalization. It follows that Λ = S⁻¹AS. This is called diagonalization of a (non-defective) matrix. For the earlier examples: P = [1/2 1/2; 1/2 1/2] = [1 1; 1 −1] [1 0; 0 0] (1/2)[1 1; 1 −1], and K = [0 −1; 1 0] = [1 1; −i i] [i 0; 0 −i] (1/2)[1 i; 1 −i].

Common Eigenvectors. Non-defective matrices A and B have a common eigenvector matrix if they commute. Suppose AB = BA. Let x be an eigenvector of B with eigenvalue λ. Consider BAx and ABx: BAx = ABx = Aλx = λ(Ax). So Ax is also an eigenvector of B with eigenvalue λ. If the eigenspace of B for λ is one-dimensional, this means Ax = λ′x; that is, x is an eigenvector of A for some eigenvalue λ′. (When an eigenspace has higher dimension, common eigenvectors can still be chosen within it.)

Simultaneous Diagonalization. Non-defective matrices A and B have a common eigenvector matrix only if they commute, AB = BA. Suppose A and B have a common eigenvector matrix S, with A = SΛ₁S⁻¹ and B = SΛ₂S⁻¹. Then AB = SΛ₁S⁻¹SΛ₂S⁻¹ = SΛ₁Λ₂S⁻¹ = SΛ₂Λ₁S⁻¹ = SΛ₂S⁻¹SΛ₁S⁻¹ = BA.

Difference Equations

Power of a Matrix. Suppose A is diagonalizable. The power of A can be simplified by Aᵏ = (SΛS⁻¹)ᵏ = (SΛS⁻¹)(SΛS⁻¹)⋯(SΛS⁻¹) = SΛ(S⁻¹S)Λ(S⁻¹S)⋯(S⁻¹S)ΛS⁻¹ = SΛᵏS⁻¹. Note that SΛᵏS⁻¹ is much easier to compute than Aᵏ by repeated multiplication.
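
A sketch of this in NumPy (illustration only; the sample matrix is the Fibonacci matrix introduced on the next slide):

```python
import numpy as np

# Compute A^k by repeated multiplication and as S Lambda^k S^{-1}.
A = np.array([[1.0, 1.0],
              [1.0, 0.0]])
lam, S = np.linalg.eig(A)     # columns of S are eigenvectors of A
k = 10

Ak_direct = np.linalg.matrix_power(A, k)
Ak_eig = S @ np.diag(lam**k) @ np.linalg.inv(S)

print(np.allclose(Ak_direct, Ak_eig))   # True
```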

Fibonacci Sequence. Fibonacci numbers are defined recursively by F_{k+2} = F_{k+1} + F_k. We will look at the Fibonacci sequence which starts with F₀ = 0, F₁ = 1. The first few Fibonacci numbers are 0 1 1 2 3 5 8 13 21 ...

Representation of Recursion with Matrix. Combining the recurrence relation with a trivial equality, F_{k+2} = F_{k+1} + F_k and F_{k+1} = F_{k+1} + 0·F_k, we have a matrix equation for the Fibonacci numbers: [F_{k+2}; F_{k+1}] = [1 1; 1 0] [F_{k+1}; F_k], with initial condition [F₁; F₀] = [1; 0].

Simplification by Matrix. Define uₖ = [F_{k+1}; F_k] and A = [1 1; 1 0]. Let S be an eigenvector matrix of A and Λ be the associated eigenvalue matrix. We have uₖ = Auₖ₋₁ = A(Auₖ₋₂) = ⋯ = Aᵏu₀ = SΛᵏS⁻¹u₀ = [s₁ s₂] [λ₁ᵏc₁; λ₂ᵏc₂] = c₁λ₁ᵏs₁ + c₂λ₂ᵏs₂, where c = [c₁; c₂] = S⁻¹u₀.

Eigenvalue Problem of A. Eigenvalues: |A − λI| = 0 ⇒ λ² − λ − 1 = 0 ⇒ λ₁ = (1 + √5)/2, λ₂ = (1 − √5)/2. Eigenvectors: (A − λᵢI)sᵢ = 0 ⇒ sᵢ = [λᵢ; 1]. Fitting the initial condition: [c₁; c₂] = S⁻¹u₀ = [λ₁ λ₂; 1 1]⁻¹ [1; 0] = (1/(λ₁ − λ₂)) [1; −1].

Fibonacci Numbers. Thus uₖ = c₁λ₁ᵏs₁ + c₂λ₂ᵏs₂ = (1/(λ₁ − λ₂)) (λ₁ᵏ [λ₁; 1] − λ₂ᵏ [λ₂; 1]). So the Fibonacci numbers are F_k = (λ₁ᵏ − λ₂ᵏ)/(λ₁ − λ₂) = (1/√5) [((1 + √5)/2)ᵏ − ((1 − √5)/2)ᵏ], k = 0, 1, ...
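
The closed form can be checked against the recursion numerically; a minimal sketch (not from the lecture):

```python
import numpy as np

# F_k = (lam1^k - lam2^k) / sqrt(5), since lam1 - lam2 = sqrt(5).
lam1 = (1 + np.sqrt(5)) / 2
lam2 = (1 - np.sqrt(5)) / 2

def fib_closed(k):
    return (lam1**k - lam2**k) / np.sqrt(5)

F = [0, 1]
for _ in range(18):
    F.append(F[-1] + F[-2])   # F_{k+2} = F_{k+1} + F_k

print(all(round(fib_closed(k)) == F[k] for k in range(20)))   # True
```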

A Markov Process. Every year, 1/10 of the people outside Asia move in, and 2/10 of the people inside Asia move out. Let yₖ be the number of people outside Asia and zₖ be the number of people inside Asia at the end of year k. Then [y_{k+1}; z_{k+1}] = [0.9 0.2; 0.1 0.8] [yₖ; zₖ], or simply x_{k+1} = Axₖ, k = 0, 1, ... Matrix A is Markov, as the elements of A are non-negative and the elements in every column of A sum to 1.

Population as a Function of k. Using the eigenvalue decomposition of A, A = SΛS⁻¹ = [2/3 1/3; 1/3 −1/3] [1 0; 0 0.7] [1 1; 1 −2], we have xₖ = Axₖ₋₁ = A(Axₖ₋₂) = ⋯ = Aᵏx₀ = SΛᵏS⁻¹x₀. That is, [yₖ; zₖ] = [2/3 1/3; 1/3 −1/3] [1ᵏ 0; 0 0.7ᵏ] [1 1; 1 −2] [y₀; z₀] = (y₀ + z₀)(1)ᵏ [2/3; 1/3] + (y₀ − 2z₀)(0.7)ᵏ [1/3; −1/3].

Steady State. We can see that xₖ converges to the first term, x∞ = [y∞; z∞] = (y₀ + z₀) [2/3; 1/3]. Note that x∞ is an eigenvector of A with eigenvalue 1: Ax∞ = x∞. Thus x∞ is called the steady state, since the process stays there once entered.
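
Iterating the model shows this convergence; a sketch with made-up initial numbers (y₀ = 0, z₀ = 10):

```python
import numpy as np

# x_k approaches the steady state (y0 + z0) * [2/3, 1/3] as k grows,
# because the 0.7^k term dies out.
A = np.array([[0.9, 0.2],
              [0.1, 0.8]])
x = np.array([0.0, 10.0])   # y0 = 0 outside, z0 = 10 inside (made-up)

for _ in range(50):
    x = A @ x               # x_{k+1} = A x_k

print(x)                    # ~ [6.6667, 3.3333] = 10 * [2/3, 1/3]
```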

Linear Differential Equations

Linear Differential Equation. Consider an unknown function u(t) that satisfies du(t)/dt = au(t), where a is a constant. The equation is a differential equation as it involves a derivative. What is the solution of such a linear differential equation?

Solution. It is easy to verify that u(t) = u(0)e^{at}, where u(0) is the initial value.

System of Linear Differential Equations. Next, consider two unknown functions v(t) and w(t) that satisfy a system of linear differential equations, dv/dt = av + bw and dw/dt = cv + dw, where a, b, c, d are constants.

Matrix Representation. A system of linear differential equations can be represented by matrices. For the system of linear differential equations on the previous slide, we have du/dt = Au, where u = [v; w] and A = [a b; c d].

Sneak Preview. It turns out the solution of du/dt = Au can be expressed as u(t) = e^{At} u(0), with an appropriate definition of the matrix exponential e^{At}.

Working out an Example. Consider the case with u(t) = [v(t); w(t)] and A = [4 −5; 2 −3]. That is, dv/dt = 4v − 5w and dw/dt = 2v − 3w.

Looking for a Solution. An exponential function, after differentiation, is still an exponential function. So we assume v(t) = e^{αt} y, w(t) = e^{αt} z. Substituting into the differential equations, we have αe^{αt} y = 4e^{αt} y − 5e^{αt} z and αe^{αt} z = 2e^{αt} y − 3e^{αt} z. Eliminating e^{αt}, we get αy = 4y − 5z and αz = 2y − 3z, which has the form of an eigenvalue equation Ax = αx with x = [y; z].

General Solution via A. A solution of the differential equations consists of y, z, and α, which are an eigenvector and eigenvalue of A. If λ is an eigenvalue of A with associated eigenvector x, then d/dt (e^{λt} x) = λe^{λt} x = e^{λt} Ax = A (e^{λt} x). By linearity, any linear combination u = ∑ᵢ cᵢ e^{λᵢt} xᵢ, where λᵢ is an eigenvalue and xᵢ an associated eigenvector, satisfies the differential equation du/dt = Au.

The Solution. Given initial condition u(0) = u₀, the solution of du/dt = Au is u(t) = S diag(e^{λ₁t}, e^{λ₂t}) S⁻¹ u₀, where S is an eigenvector matrix of A. Indeed, u(t) = c₁e^{λ₁t}s₁ + c₂e^{λ₂t}s₂ = S diag(e^{λ₁t}, e^{λ₂t}) c; u(0) = u₀ = Sc gives c = S⁻¹u₀, hence u(t) = S diag(e^{λ₁t}, e^{λ₂t}) S⁻¹ u₀.

An Example of Initial Condition. Solve du/dt = Au, u = [v(t); w(t)], A = [4 −5; 2 −3], with initial condition v(0) = 8, w(0) = 5. |A − λI| = 0 ⇒ λ₁ = −1, s₁ = [1; 1]; λ₂ = 2, s₂ = [5; 2]. u(t) = S diag(e^{−t}, e^{2t}) S⁻¹ u₀ = [1 5; 1 2] [e^{−t} 0; 0 e^{2t}] (1/3)[−2 5; 1 −1] [8; 5] = 3e^{−t} [1; 1] + e^{2t} [5; 2].

Matrix Exponential. Let A be a square matrix. The matrix exponential is defined by e^A ≡ ∑_{k=0}^{∞} Aᵏ/k! = I + A + A²/2! + A³/3! + ⋯. It has the same form as the scalar exponential e^x = ∑_{k=0}^{∞} xᵏ/k! = 1 + x + x²/2! + x³/3! + ⋯.

Diagonal Matrix. For a diagonal matrix D = diag(d₁, ..., dₙ), e^D = I + D + D²/2! + ⋯ = diag(1 + d₁ + d₁²/2! + ⋯, ..., 1 + dₙ + dₙ²/2! + ⋯) = diag(e^{d₁}, ..., e^{dₙ}).

Diagonalizable Matrices. For a diagonalizable matrix A = SΛS⁻¹, e^A = I + A + A²/2! + ⋯ = I + SΛS⁻¹ + (SΛS⁻¹)²/2! + ⋯ = S(I + Λ + Λ²/2! + ⋯)S⁻¹ = Se^Λ S⁻¹.

Solution of Differential Equation in Matrix Form. The solution of du/dt = Au is u(t) = ∑ᵢ cᵢe^{λᵢt}sᵢ = Se^{Λt}S⁻¹u(0) = e^{At}u(0), since Se^{Λt}S⁻¹ = S(I + Λt + Λ²t²/2! + ⋯)S⁻¹ = I + (SΛS⁻¹)t + (SΛS⁻¹)²t²/2! + ⋯ = I + At + A²t²/2! + ⋯ = e^{At}.
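
A numerical check of this identity on the earlier example (a sketch; scipy.linalg.expm computes the matrix exponential):

```python
import numpy as np
from scipy.linalg import expm

# Solve du/dt = Au, u(0) = (8, 5), via the matrix exponential, and compare
# with the eigenvector form 3 e^{-t} (1,1) + e^{2t} (5,2) derived above.
A = np.array([[4.0, -5.0],
              [2.0, -3.0]])
u0 = np.array([8.0, 5.0])
t = 0.5

u_expm = expm(A * t) @ u0
u_eig = 3 * np.exp(-t) * np.array([1.0, 1.0]) + np.exp(2 * t) * np.array([5.0, 2.0])

print(np.allclose(u_expm, u_eig))   # True
```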

Higher-order Linear Differential Equations. A higher-order linear differential equation can be decomposed into a system of first-order linear differential equations. For example, the third-order linear differential equation y‴ + by″ + cy′ = 0 can be converted to u′ = Au, where u = [y; v; w] with v = y′ and w = y″, and A = [0 1 0; 0 0 1; 0 −c −b].

Linear Partial Differential Equations. A partial differential equation can be transformed into a system of linear differential equations if one variable is discretized. Consider a heat equation ∂u/∂t = ∂²u/∂x². When x is discretized into grid values u₁, ..., u_N, we have du/dt = Au, by defining u = [u₁; ...; u_N] and taking A to be the corresponding second-difference matrix (tridiagonal, with −2 on the diagonal and 1 on the adjacent off-diagonals, scaled by the squared grid spacing).

Complex Matrix

Vectors with Complex Numbers. Inner product: (x, y) = x̄₁y₁ + ⋯ + x̄ₙyₙ. Length: ‖x‖ = √(x, x) = √(x̄₁x₁ + ⋯ + x̄ₙxₙ) = √(|x₁|² + ⋯ + |xₙ|²). Orthogonality: (x, y) = 0 ⇔ x ⊥ y.

Example. Determine the inner product (x, y), the lengths ‖x‖ and ‖y‖, and the orthogonality of a given pair of vectors x, y with complex entries.

Hermitian. The Hermitian of a matrix A is defined as Aᴴ = (Ā)ᵀ. Basically, Hermitian = conjugation + transposition. It can be shown that (Aᴴ)ᴴ = A and (AB)ᴴ = BᴴAᴴ.

Hermitian and Inner Product. The inner product of two vectors can be written as (x, y) = xᴴy. It follows that (x, Ay) = xᴴAy = (Aᴴx)ᴴy = (Aᴴx, y) and (Ax, y) = (Ax)ᴴy = xᴴAᴴy = (x, Aᴴy).

Hermitian Matrix. By definition, a matrix A is Hermitian if Aᴴ = A. For example, A = [2 3−3i; 3+3i 5] is Hermitian, since Aᴴ = (Ā)ᵀ = [2 3+3i; 3−3i 5]ᵀ = [2 3−3i; 3+3i 5] = A.

Properties of a Hermitian Matrix. Let A be Hermitian. (1) xᴴAx is real for any x: (xᴴAx)ᴴ = xᴴAᴴ(xᴴ)ᴴ = xᴴAx, and a scalar equal to its own conjugate is real. (2) The eigenvalues of A are real: Asᵢ = λᵢsᵢ ⇒ sᵢᴴAsᵢ = λᵢsᵢᴴsᵢ ⇒ λᵢ = sᵢᴴAsᵢ / sᵢᴴsᵢ ∈ R. (3) Eigenvectors of A associated with distinct eigenvalues are orthogonal: (As₁, s₂) = (s₁, As₂) ⇒ (λ₁ − λ₂)(s₁, s₂) = 0 ⇒ (s₁, s₂) = 0.

Example. A = [2 3−3i; 3+3i 5]. |A − λI| = 0 ⇒ λ = 8, −1. (A − 8I)s₁ = 0 ⇒ s₁ = [1; 1+i]. (A + I)s₂ = 0 ⇒ s₂ = [−1+i; 1]. (s₁, s₂) = s₁ᴴs₂ = [1, 1−i] [−1+i; 1] = (−1+i) + (1−i) = 0.
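
These properties are easy to confirm numerically; a sketch using the matrix above (np.linalg.eigh is NumPy's routine for Hermitian input):

```python
import numpy as np

# Real eigenvalues and orthonormal eigenvectors of a Hermitian matrix.
A = np.array([[2.0, 3 - 3j],
              [3 + 3j, 5.0]])

lam, S = np.linalg.eigh(A)                     # real eigenvalues, ascending order
print(lam)                                     # [-1.  8.]
print(np.allclose(S.conj().T @ S, np.eye(2)))  # True: orthonormal eigenvectors
```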

Unitary Matrix. By definition, a matrix U is unitary if U⁻¹ = Uᴴ, i.e., UUᴴ = UᴴU = I. (1) Vector length is invariant under transformation by U: ‖Ux‖² = (Ux, Ux) = (x, UᴴUx) = (x, x) = ‖x‖². (2) The eigenvalues are of unit modulus: Usᵢ = λᵢsᵢ ⇒ ‖sᵢ‖ = ‖Usᵢ‖ = ‖λᵢsᵢ‖ = |λᵢ|‖sᵢ‖ ⇒ |λᵢ| = 1. (3) Eigenvectors associated with distinct eigenvalues are orthogonal: (Us₁, Us₂) = (s₁, UᴴUs₂) = (s₁, s₂), while (Us₁, Us₂) = (λ₁s₁, λ₂s₂) = λ̄₁λ₂(s₁, s₂), so (1 − λ̄₁λ₂)(s₁, s₂) = 0; since λ₁ ≠ λ₂ and |λ₁| = 1, we have λ̄₁λ₂ = λ₂/λ₁ ≠ 1, hence (s₁, s₂) = 0.

Examples of Unitary Matrices. Rotation matrix U = [cos t −sin t; sin t cos t]. Fourier matrix F = (1/2)[1 1 1 1; 1 ω ω² ω³; 1 ω² ω⁴ ω⁶; 1 ω³ ω⁶ ω⁹], ω = e^{i2π/4}. Permutation matrix P = [0 1 0 0; 0 0 1 0; 0 0 0 1; 1 0 0 0].
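
A sketch checking the unitary properties on the rotation matrix (the angle is arbitrary):

```python
import numpy as np

# A rotation matrix is unitary: U^H U = I, eigenvalues have modulus 1,
# and lengths are preserved.
t = 0.7
U = np.array([[np.cos(t), -np.sin(t)],
              [np.sin(t),  np.cos(t)]])

print(np.allclose(U.T @ U, np.eye(2)))           # True (U is real, so U^H = U^T)
print(np.abs(np.linalg.eigvals(U)))              # [1. 1.]
x = np.array([3.0, 4.0])
print(np.linalg.norm(U @ x), np.linalg.norm(x))  # both 5.0
```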

Skew-Hermitian. By definition, a matrix K is skew-Hermitian if Kᴴ = −K. Let K = iA = i[2 3−3i; 3+3i 5] = [2i 3+3i; −3+3i 5i], with A the Hermitian matrix above. Indeed Kᴴ = (iA)ᴴ = −iAᴴ = −iA = −K: Kᴴ = [2i 3+3i; −3+3i 5i]ᴴ = [−2i −3−3i; 3−3i −5i] = −K.

Similarity Transformation

Similar Matrices. By definition, matrix B is similar to matrix A if there exists an invertible M such that B = M⁻¹AM. The similarity relation is denoted by B ∼ A. Note that similarity is an equivalence relation. Thus, we also say A and B are similar.

Example. A = [1 0; 0 0]. Take M₁ = [1 b; 0 1], M₁⁻¹ = [1 −b; 0 1], and M₂ = [1 1; 1 −1], M₂⁻¹ = (1/2)[1 1; 1 −1]. Then A ∼ B₁ = M₁⁻¹AM₁ = [1 b; 0 0] and A ∼ B₂ = M₂⁻¹AM₂ = [1/2 1/2; 1/2 1/2].

Eigenvalues of Similar Matrices. If matrices A and B are similar, then they have the same eigenvalues. Suppose A and B are similar, with B = M⁻¹AM. Let λ be an eigenvalue of B. Then |B − λI| = 0 ⇒ |M⁻¹AM − λM⁻¹M| = 0 ⇒ |M⁻¹(A − λI)M| = 0 ⇒ |M⁻¹||A − λI||M| = 0 ⇒ |A − λI| = 0. So λ is an eigenvalue of A.
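
A numerical illustration (an added sketch; A and M are arbitrary random matrices):

```python
import numpy as np

# B = M^{-1} A M has the same eigenvalues as A for any invertible M.
rng = np.random.default_rng(1)
A = rng.standard_normal((3, 3))
M = rng.standard_normal((3, 3))   # almost surely invertible

B = np.linalg.inv(M) @ A @ M
print(np.allclose(np.sort(np.linalg.eigvals(A)),
                  np.sort(np.linalg.eigvals(B))))   # True
```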

Eigenvectors of Similar Matrices. An eigenvector x of A corresponds to an eigenvector y = M⁻¹x of B = M⁻¹AM. Furthermore, x and y are associated with the same eigenvalue. Let λ be an eigenvalue of A with an associated eigenvector x. Then Ax = λx ⇒ M⁻¹Ax = λM⁻¹x ⇒ (M⁻¹AM)(M⁻¹x) = λ(M⁻¹x) ⇒ By = λy.

Example. A = [1 0; 0 0], B₁ = [1 b; 0 0], B₂ = [1/2 1/2; 1/2 1/2]. The eigenvalues of A are 1 and 0, with eigenvectors [1; 0] and [0; 1]. B₁ ∼ A: the eigenvectors of B₁ are M₁⁻¹[1; 0] = [1; 0] and M₁⁻¹[0; 1] = [−b; 1]. B₂ ∼ A: the eigenvectors of B₂ are M₂⁻¹[1; 0] = [1/2; 1/2] and M₂⁻¹[0; 1] = [1/2; −1/2].

Change of Basis. The matrix representation of a linear transformation depends on the basis used. When there is a change of basis, the matrix changes to a similar matrix (via a similarity transformation).

Representation of a Vector. Through a basis, a vector in a vector space can be represented by a column. The representation depends on the basis used. Specifically, suppose B = {v₁, ..., vₙ} and x = c₁v₁ + ⋯ + cₙvₙ; then [x]_B = {cᵢ}.

Representation of a Linear Transformation. Through a basis for the domain D and a basis for the range R, a linear transformation from D to R can be represented by a matrix. The matrix representation of a linear transformation T : D → R depends on the bases used. Here we use the notation [T]_UV for the matrix representation of T using basis U for D and basis V for R.

Construction of Matrix Representation. Let U = {u₁, ..., uₙ} be a basis for D and V = {v₁, ..., vₘ} be a basis for R. Consider a linear transformation T : D → R. Suppose T(uⱼ) = ∑_{i=1}^{m} aᵢⱼvᵢ, j = 1, ..., n. Using basis U for D and basis V for R, we have [T]_UV = {aᵢⱼ}, so that T(x) = y ⇔ [y]_V = [T]_UV [x]_U for x ∈ D and y ∈ R.

Proof. Using basis U for D and basis V for R, suppose x = ∑_{j=1}^{n} xⱼuⱼ and T(x) = y = ∑_{i=1}^{m} yᵢvᵢ. Using linearity of T, T(x) = ∑_{j=1}^{n} xⱼT(uⱼ) = ∑_{j=1}^{n} xⱼ ∑_{i=1}^{m} aᵢⱼvᵢ = ∑_{i=1}^{m} (∑_{j=1}^{n} aᵢⱼxⱼ) vᵢ. Thus yᵢ = ∑ⱼ aᵢⱼxⱼ, i.e., [y]_V = [T]_UV [x]_U.

Two Bases. Consider a linear transformation within a vector space, T : S → S. With basis G = {V₁, ..., Vₙ}, the matrix representation of T is given by T(Vⱼ) = ∑_{i=1}^{n} bᵢⱼVᵢ, [T]_GG = {bᵢⱼ} = B. With basis H = {v₁, ..., vₙ}, the matrix representation of T is given by T(vⱼ) = ∑_{i=1}^{n} aᵢⱼvᵢ, [T]_HH = {aᵢⱼ} = A.

Identity Transformation. Consider the identity transformation I(x) = x. Obviously I is linear, with [I]_GG = I and [I]_HH = I. How about [I]_GH? Suppose the basis vectors are related by Vⱼ = ∑_{i=1}^{n} mᵢⱼvᵢ, j = 1, ..., n. We have I(Vⱼ) = Vⱼ = ∑_{i=1}^{n} mᵢⱼvᵢ, j = 1, ..., n, so [I]_GH = {mᵢⱼ} = M.

Change of Vector Representation. How are [x]_G = {x′ᵢ} and [x]_H = {xᵢ} related? x = ∑_{j=1}^{n} x′ⱼVⱼ = ∑_{j=1}^{n} x′ⱼ ∑_{i=1}^{n} mᵢⱼvᵢ = ∑_{i=1}^{n} (∑_{j=1}^{n} mᵢⱼx′ⱼ) vᵢ = ∑_{i=1}^{n} xᵢvᵢ. Thus xᵢ = ∑ⱼ mᵢⱼx′ⱼ; in matrix form, [x]_H = [I]_GH [x]_G. We also have [x]_G = ([I]_GH)⁻¹ [x]_H = [I]_HG [x]_H, so [I]_HG = ([I]_GH)⁻¹.

Change of Matrix Representation. How are [T]_GG and [T]_HH related? T(Vⱼ) = T(∑ᵢ mᵢⱼvᵢ) = ∑ᵢ mᵢⱼT(vᵢ) = ∑ᵢ mᵢⱼ ∑ₖ aₖᵢvₖ = ∑ₖ (∑ᵢ aₖᵢmᵢⱼ) vₖ, and also T(Vⱼ) = ∑ᵢ bᵢⱼVᵢ = ∑ᵢ bᵢⱼ ∑ₖ mₖᵢvₖ = ∑ₖ (∑ᵢ mₖᵢbᵢⱼ) vₖ. Thus ∑ᵢ aₖᵢmᵢⱼ = ∑ᵢ mₖᵢbᵢⱼ for all j, k. In matrix form, [T]_HH [I]_GH = [I]_GH [T]_GG.

Relation to Similarity Transformation. The identity [T]_HH [I]_GH = [I]_GH [T]_GG can be re-arranged as a similarity transformation: [T]_HH = [I]_GH [T]_GG ([I]_GH)⁻¹, and [T]_GG = ([I]_GH)⁻¹ [T]_HH [I]_GH = [I]_HG [T]_HH [I]_GH (B = M⁻¹AM with A = [T]_HH, B = [T]_GG, M = [I]_GH).

Eigenvector of a Linear Transformation. Consider a linear transformation T within a vector space. An eigenvector of T is defined by T(s) = λs. The matrix representation of T using an eigenbasis G = {s₁, ..., sₙ}, consisting of eigenvectors of T, is particularly simple: [T]_GG = diag(λ₁, ..., λₙ), since T(sᵢ) = λᵢsᵢ.

Eigenvalue Decomposition. Let H = {v₁, ..., vₙ} be another basis. Suppose sⱼ = ∑ᵢ sᵢⱼvᵢ. Then [I]_GH = {sᵢⱼ} = S. The representation of T with H can be constructed by [T]_HH = [I]_GH [T]_GG ([I]_GH)⁻¹ = SΛS⁻¹.

Example: Projection. Let T be the projection onto the line L at angle θ to the x-axis. With an eigenbasis, [T]_GG = Λ = [1 0; 0 0]. Let H be the standard basis. Then [I]_GH = S = [cos θ −sin θ; sin θ cos θ], and [T]_HH = A = SΛS⁻¹ = [cos²θ cos θ sin θ; cos θ sin θ sin²θ].
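
A sketch of this example in NumPy (θ is arbitrary): build A from its eigenbasis and check that it is indeed a projection.

```python
import numpy as np

# A = S diag(1, 0) S^{-1}: eigenvalue 1 along L, eigenvalue 0 across L.
theta = np.pi / 6
S = np.array([[np.cos(theta), -np.sin(theta)],
              [np.sin(theta),  np.cos(theta)]])
Lam = np.diag([1.0, 0.0])

A = S @ Lam @ np.linalg.inv(S)
print(np.allclose(A @ A, A))    # True: projecting twice changes nothing
print(np.allclose(A, [[np.cos(theta)**2, np.cos(theta)*np.sin(theta)],
                      [np.cos(theta)*np.sin(theta), np.sin(theta)**2]]))  # True
```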

Normal Matrix and Orthonormal Eigenbasis

Schur Lemma. For any matrix A, there exists a unitary matrix U such that T = UᴴAU is an upper-triangular matrix. Since A = UTUᴴ, A ∼ T: A is similar to an upper-triangular matrix.

Normal Matrix. By definition, a matrix is normal if it commutes with its Hermitian, i.e., NNᴴ = NᴴN.

Diagonalizability of a Normal Matrix. A normal matrix can be diagonalized. Let A be a normal matrix. According to the Schur lemma, there exists a unitary matrix U such that T = UᴴAU is upper-triangular. Furthermore, TTᴴ = UᴴAU(UᴴAU)ᴴ = UᴴAAᴴU = UᴴAᴴAU = TᴴT.

Consider the first diagonal element: (TTᴴ)₁₁ = (TᴴT)₁₁ ⇒ ∑ₖ t₁ₖt̄₁ₖ = t̄₁₁t₁₁ ⇒ ∑_{k>1} |t₁ₖ|² = 0 ⇒ t₁ₖ = 0 for k > 1. Similarly, (TTᴴ)₂₂ = (TᴴT)₂₂ ⇒ ∑ₖ t₂ₖt̄₂ₖ = |t₂₂|² (using t₁₂ = 0) ⇒ t₂ₖ = 0 for k > 2. The same argument applies to the remaining rows. Thus T is diagonal, i.e., A is diagonalizable.

Complete Orthonormal Eigenvectors. A normal matrix has sufficiently many orthonormal eigenvectors to form a basis. Since T is diagonal, UᴴAU = T ⇒ AU = UT ⇒ Auᵢ = tᵢᵢuᵢ, so uᵢ is an eigenvector of A associated with eigenvalue λᵢ = tᵢᵢ. Since U is unitary, UᴴU = I, so the vectors u₁, ..., uₙ are orthonormal.
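
The argument can be watched numerically; a sketch using the normal (in fact unitary) matrix K from the complex-eigenvalue example, with SciPy's Schur routine:

```python
import numpy as np
from scipy.linalg import schur

# For a normal matrix, the complex Schur form T = U^H K U comes out diagonal.
K = np.array([[0.0, -1.0],
              [1.0,  0.0]])

T, U = schur(K, output='complex')
print(np.allclose(T, np.diag(np.diag(T))))   # True: T is diagonal
print(np.diag(T))                            # i and -i (order may vary)
```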

Spectral Theorem. A real symmetric matrix A can be factorized as A = QΛQᵀ, where Λ is real and diagonal, and Q is real and orthogonal. A is Hermitian, so the eigenvalues are real; the eigenvectors can also be chosen real and orthonormal.

Example. A = [1 2; 2 1] = (1/√2)[1 1; 1 −1] [3 0; 0 −1] (1/√2)[1 1; 1 −1] = 3 · (1/2)[1 1; 1 1] + (−1) · (1/2)[1 −1; −1 1].
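
A check of this factorization (an added sketch; np.linalg.eigh returns eigenvalues in ascending order):

```python
import numpy as np

# A = Q diag(lam) Q^T, and equivalently the spectral sum 3 q1 q1^T - q2 q2^T.
A = np.array([[1.0, 2.0],
              [2.0, 1.0]])

lam, Q = np.linalg.eigh(A)   # lam = [-1., 3.], Q is orthogonal
print(np.allclose(Q @ np.diag(lam) @ Q.T, A))   # True

q1, q2 = Q[:, 1], Q[:, 0]    # eigenvectors for eigenvalues 3 and -1
print(np.allclose(3*np.outer(q1, q1) - np.outer(q2, q2), A))   # True
```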