Linear Algebra review: Powers of a diagonalizable matrix, Spectral decomposition
Prof. Tesler, Math 283, Fall 2016
Also see the separate version of this with Matlab and R commands.

Matrices

A matrix is a square or rectangular table of numbers. An m × n matrix has m rows and n columns; this is read "m by n". This matrix is 2 × 3:
    A = \begin{bmatrix} 1 & 2 & 3 \\ 4 & 5 & 6 \end{bmatrix}
The entry in row i, column j is denoted A_{i,j} or A_{ij}:
    A_{1,1} = 1,  A_{1,2} = 2,  A_{1,3} = 3,  A_{2,1} = 4,  A_{2,2} = 5,  A_{2,3} = 6

Matrix multiplication

A B = C:
    \underbrace{\begin{bmatrix} 1 & 2 & 3 \\ 4 & 5 & 6 \end{bmatrix}}_{2 × 3}
    \underbrace{\begin{bmatrix} 5 & -2 & 3 & 2 \\ 0 & 1 & 1 & -1 \\ -1 & 6 & 4 & 3 \end{bmatrix}}_{3 × 4}
    = \underbrace{C}_{2 × 4}
Let A be p × q and B be q × r. The product AB = C is a certain p × r matrix of dot products:
    C_{i,j} = \sum_{k=1}^{q} A_{i,k} B_{k,j} = (i-th row of A) · (j-th column of B)
The number of columns in A must equal the number of rows in B (namely q) in order to be able to compute the dot products.

Filling in C = AB one entry at a time:
    C_{1,1} = 1(5) + 2(0) + 3(-1)  =  5 + 0 - 3   =  2
    C_{1,2} = 1(-2) + 2(1) + 3(6)  = -2 + 2 + 18  = 18
    C_{1,3} = 1(3) + 2(1) + 3(4)   =  3 + 2 + 12  = 17
    C_{1,4} = 1(2) + 2(-1) + 3(3)  =  2 - 2 + 9   =  9
    C_{2,1} = 4(5) + 5(0) + 6(-1)  = 20 + 0 - 6   = 14
    C_{2,2} = 4(-2) + 5(1) + 6(6)  = -8 + 5 + 36  = 33
    C_{2,3} = 4(3) + 5(1) + 6(4)   = 12 + 5 + 24  = 41
    C_{2,4} = 4(2) + 5(-1) + 6(3)  =  8 - 5 + 18  = 21
Altogether:
    \begin{bmatrix} 1 & 2 & 3 \\ 4 & 5 & 6 \end{bmatrix}
    \begin{bmatrix} 5 & -2 & 3 & 2 \\ 0 & 1 & 1 & -1 \\ -1 & 6 & 4 & 3 \end{bmatrix}
    = \begin{bmatrix} 2 & 18 & 17 & 9 \\ 14 & 33 & 41 & 21 \end{bmatrix}
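The slides mention a separate version with Matlab and R commands; as an illustration only (not part of the original), here is a minimal numpy check of the product worked out above:

```python
# Minimal sketch (illustration, not from the slides): verify the worked product AB = C.
import numpy as np

A = np.array([[1, 2, 3],
              [4, 5, 6]])                      # 2x3
B = np.array([[ 5, -2, 3,  2],
              [ 0,  1, 1, -1],
              [-1,  6, 4,  3]])                # 3x4

C = A @ B                                      # 2x4 matrix of dot products
print(C)
# [[ 2 18 17  9]
#  [14 33 41 21]]

# Entry (i, j) is the dot product of row i of A with column j of B:
assert C[0, 0] == A[0, :] @ B[:, 0]            # 1*5 + 2*0 + 3*(-1) = 2
```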

Transpose of a matrix

Given a matrix A of dimensions p × q, the transpose A' is q × p, obtained by interchanging rows and columns: (A')_{ij} = A_{ji}.
    \begin{bmatrix} 1 & 2 & 3 \\ 4 & 5 & 6 \end{bmatrix}' = \begin{bmatrix} 1 & 4 \\ 2 & 5 \\ 3 & 6 \end{bmatrix}
The transpose of a product reverses the order and transposes the factors: (AB)' = B'A'. For the product computed above,
    (AB)' = \begin{bmatrix} 2 & 18 & 17 & 9 \\ 14 & 33 & 41 & 21 \end{bmatrix}' = \begin{bmatrix} 2 & 14 \\ 18 & 33 \\ 17 & 41 \\ 9 & 21 \end{bmatrix}
    = \begin{bmatrix} 5 & 0 & -1 \\ -2 & 1 & 6 \\ 3 & 1 & 4 \\ 2 & -1 & 3 \end{bmatrix} \begin{bmatrix} 1 & 4 \\ 2 & 5 \\ 3 & 6 \end{bmatrix} = B'A'
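A quick numpy sketch of the reversal rule (illustration only, not part of the original slides):

```python
# Sketch: transposes in numpy, and the reversal rule (AB)' = B'A'.
import numpy as np

A = np.array([[1, 2, 3], [4, 5, 6]])
B = np.array([[5, -2, 3, 2], [0, 1, 1, -1], [-1, 6, 4, 3]])

print(A.T)                                     # 3x2 transpose of A
assert np.array_equal((A @ B).T, B.T @ A.T)    # transpose reverses the order
```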

Multiplying several matrices

Multiplication is not commutative: usually AB ≠ BA. In fact, AB and BA are both defined only if A and B are both n × n, and even then they are usually not equal.
Multiplication is associative: (AB)C = A(BC).
Suppose A is p_1 × p_2, B is p_2 × p_3, C is p_3 × p_4, and D is p_4 × p_5. Then ABCD is p_1 × p_5. By associativity, it may be computed in many ways, such as A(B(CD)), (AB)(CD), ..., or directly by
    (ABCD)_{i,j} = \sum_{k_2=1}^{p_2} \sum_{k_3=1}^{p_3} \sum_{k_4=1}^{p_4} A_{i,k_2} B_{k_2,k_3} C_{k_3,k_4} D_{k_4,j}
This generalizes to any number of matrices. Powers A^2 = AA, A^3 = AAA, ... are defined for square matrices.
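As a small aside (not in the original slides), associativity and matrix powers are easy to check numerically; the random matrices below are just placeholders:

```python
# Sketch: associativity of matrix products, and powers of a square matrix.
import numpy as np

rng = np.random.default_rng(0)                 # arbitrary example matrices
A = rng.integers(-3, 4, size=(2, 3))
B = rng.integers(-3, 4, size=(3, 4))
C = rng.integers(-3, 4, size=(4, 5))

assert np.array_equal((A @ B) @ C, A @ (B @ C))   # (AB)C = A(BC), a 2x5 matrix

P = np.array([[8, -1], [6, 3]])
assert np.array_equal(np.linalg.matrix_power(P, 3), P @ P @ P)   # P^3 = PPP
```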

Inverse matrix

The n × n identity matrix I is
    I = \begin{bmatrix} 1 & 0 & \cdots & 0 \\ 0 & 1 & \cdots & 0 \\ \vdots & & \ddots & \vdots \\ 0 & 0 & \cdots & 1 \end{bmatrix}, \qquad I_{i,j} = \begin{cases} 1 & \text{if } i = j \text{ (main diagonal)} \\ 0 & \text{if } i \ne j \text{ (elsewhere)} \end{cases}
The inverse of an n × n matrix A is an n × n matrix A^{-1} such that A A^{-1} = I and A^{-1} A = I. It may or may not exist.
For 2 × 2 matrices,
    A = \begin{bmatrix} a & b \\ c & d \end{bmatrix}, \qquad A^{-1} = \frac{1}{ad - bc} \begin{bmatrix} d & -b \\ -c & a \end{bmatrix}
unless det(A) = ad - bc = 0, in which case A^{-1} is undefined. For n × n matrices, use the row reduction algorithm (a.k.a. Gaussian elimination) from Linear Algebra.
If A and B are invertible and the same size, then (AB)^{-1} = B^{-1} A^{-1}: the order is reversed and the factors are inverted.
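A small numpy sketch (illustration only) comparing the 2 × 2 formula with numpy's general inverse, using the matrix P = [8 -1; 6 3] that appears later in these slides:

```python
# Sketch: the 2x2 inverse formula versus numpy's general routine.
import numpy as np

A = np.array([[8., -1.],
              [6.,  3.]])
a, b, c, d = A.ravel()
det = a*d - b*c                                # 8*3 - (-1)*6 = 30, nonzero
A_inv = np.array([[d, -b], [-c, a]]) / det     # 2x2 formula

assert np.allclose(A_inv, np.linalg.inv(A))    # matches the row-reduction result
assert np.allclose(A @ A_inv, np.eye(2))       # A A^{-1} = I
```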

Span, basis, and linear (in)dependence

The span of vectors v_1, ..., v_k is the set of all linear combinations
    \alpha_1 v_1 + \cdots + \alpha_k v_k, \qquad \alpha_1, \ldots, \alpha_k \in \mathbb{R}

Example 1. In 3D,
    span\left\{ \begin{bmatrix} 1 \\ 0 \\ 0 \end{bmatrix}, \begin{bmatrix} 0 \\ 0 \\ 1 \end{bmatrix} \right\} = \left\{ \begin{bmatrix} x \\ 0 \\ z \end{bmatrix} : x, z \in \mathbb{R} \right\} = xz\text{-plane}
Here, the span of these two vectors is a 2-dimensional space. Every vector in it is generated by a unique linear combination.

Example 2. In 3D,
    span\left\{ \begin{bmatrix} 1 \\ 0 \\ 0 \end{bmatrix}, \begin{bmatrix} 1 \\ 1 \\ 0 \end{bmatrix}, \begin{bmatrix} 0 \\ 0 \\ 1/2 \end{bmatrix} \right\} = \left\{ \begin{bmatrix} x \\ y \\ z \end{bmatrix} : x, y, z \in \mathbb{R} \right\} = \mathbb{R}^3
Note that
    \begin{bmatrix} x \\ y \\ z \end{bmatrix} = (x - y) \begin{bmatrix} 1 \\ 0 \\ 0 \end{bmatrix} + y \begin{bmatrix} 1 \\ 1 \\ 0 \end{bmatrix} + 2z \begin{bmatrix} 0 \\ 0 \\ 1/2 \end{bmatrix}
Here, the span of these three vectors is a 3-dimensional space. Every vector in R^3 is generated by a unique linear combination.

Example 3. In 3D,
    span\left\{ \begin{bmatrix} 1 \\ 0 \\ 0 \end{bmatrix}, \begin{bmatrix} 1 \\ 0 \\ 1 \end{bmatrix}, \begin{bmatrix} 0 \\ 0 \\ 1 \end{bmatrix} \right\} = \left\{ \begin{bmatrix} x \\ 0 \\ z \end{bmatrix} : x, z \in \mathbb{R} \right\} = xz\text{-plane}
This is a plane (2D), even though it's a span of three vectors. Note that v_2 = v_1 + v_3, or v_1 - v_2 + v_3 = 0. There are multiple ways to generate each vector in the span: for all x, z, t,
    \begin{bmatrix} x \\ 0 \\ z \end{bmatrix} = x \begin{bmatrix} 1 \\ 0 \\ 0 \end{bmatrix} + z \begin{bmatrix} 0 \\ 0 \\ 1 \end{bmatrix} + t \underbrace{(v_1 - v_2 + v_3)}_{= 0} = (x + t) \begin{bmatrix} 1 \\ 0 \\ 0 \end{bmatrix} - t \begin{bmatrix} 1 \\ 0 \\ 1 \end{bmatrix} + (z + t) \begin{bmatrix} 0 \\ 0 \\ 1 \end{bmatrix}

Given vectors v_1, ..., v_k, if there is a linear combination \alpha_1 v_1 + \cdots + \alpha_k v_k = 0 with at least one \alpha_i ≠ 0, the vectors are linearly dependent (Ex. 3). Otherwise they are linearly independent (Ex. 1-2).
Linearly independent vectors form a basis of the space S they span. Any vector in S is a unique linear combination of basis vectors (vs. not unique if v_1, ..., v_k are linearly dependent).
One basis of R^n consists of a unit vector on each axis; in R^3 this is
    \begin{bmatrix} 1 \\ 0 \\ 0 \end{bmatrix}, \begin{bmatrix} 0 \\ 1 \\ 0 \end{bmatrix}, \begin{bmatrix} 0 \\ 0 \\ 1 \end{bmatrix},
but there are other possibilities, e.g., from Example 2:
    \begin{bmatrix} 1 \\ 0 \\ 0 \end{bmatrix}, \begin{bmatrix} 1 \\ 1 \\ 0 \end{bmatrix}, \begin{bmatrix} 0 \\ 0 \\ 1/2 \end{bmatrix}
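As a side illustration (not in the original slides), linear independence of each example's vectors can be tested by comparing the rank of the matrix whose columns are those vectors to the number of vectors:

```python
# Sketch: test linear (in)dependence via the rank of the matrix of column vectors.
import numpy as np

ex1 = np.array([[1, 0], [0, 0], [0, 1]], dtype=float)          # Example 1: 2 vectors
ex2 = np.array([[1, 1, 0], [0, 1, 0], [0, 0, 0.5]])            # Example 2: 3 vectors
ex3 = np.array([[1, 1, 0], [0, 0, 0], [0, 1, 1]], dtype=float) # Example 3: 3 vectors

for name, M in [("Ex. 1", ex1), ("Ex. 2", ex2), ("Ex. 3", ex3)]:
    rank = np.linalg.matrix_rank(M)
    independent = (rank == M.shape[1])         # independent iff rank = number of vectors
    print(name, "rank", rank, "independent:", independent)
# Ex. 1: rank 2, independent; Ex. 2: rank 3, independent; Ex. 3: rank 2, dependent
```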

Eigenvalues and eigenvectors

Let A be a square matrix (k × k) and v ≠ 0 be a column vector (k × 1). If A v = λ v for a scalar λ, then v is an eigenvector of A with eigenvalue λ.

Example.
    \begin{bmatrix} 8 & -1 \\ 6 & 3 \end{bmatrix} \begin{bmatrix} 1 \\ 3 \end{bmatrix} = \begin{bmatrix} (8)(1) + (-1)(3) \\ (6)(1) + (3)(3) \end{bmatrix} = \begin{bmatrix} 5 \\ 15 \end{bmatrix} = 5 \begin{bmatrix} 1 \\ 3 \end{bmatrix}
so \begin{bmatrix} 1 \\ 3 \end{bmatrix} is an eigenvector with eigenvalue 5.
But this is just a verification. How do we find eigenvalues and eigenvectors?

Finding eigenvalues and eigenvectors

We will work with the example
    P = \begin{bmatrix} 8 & -1 \\ 6 & 3 \end{bmatrix}
Form the identity matrix of the same dimensions:
    I = \begin{bmatrix} 1 & 0 \\ 0 & 1 \end{bmatrix}
The formula for the determinant depends on the dimensions of the matrix. For a 2 × 2 matrix,
    det \begin{bmatrix} a & b \\ c & d \end{bmatrix} = ad - bc

Compute the determinant of P - λI:
    det(P - λI) = det \begin{bmatrix} 8 - λ & -1 \\ 6 & 3 - λ \end{bmatrix} = (8 - λ)(3 - λ) - (-1)(6) = 24 - 11λ + λ^2 + 6 = λ^2 - 11λ + 30
This is the characteristic polynomial of P. It has degree k in λ.
The characteristic equation is det(P - λI) = 0. Solve it for λ. For k = 2, use the quadratic formula:
    λ = \frac{11 \pm \sqrt{(-11)^2 - 4(1)(30)}}{2} = \frac{11 \pm 1}{2} = 5, 6
The eigenvalues are λ = 5 and λ = 6.
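As an aside (not in the original), numpy can confirm both the roots of the characteristic polynomial and the eigenvalues directly:

```python
# Sketch: roots of the characteristic polynomial lambda^2 - 11*lambda + 30, and eig.
import numpy as np

P = np.array([[8., -1.],
              [6.,  3.]])

coeffs = [1, -11, 30]                          # lambda^2 - 11 lambda + 30
print(np.roots(coeffs))                        # [6. 5.]

eigenvalues, eigenvectors = np.linalg.eig(P)
print(eigenvalues)                             # 5 and 6 (order may vary)
```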

Finding the (right) eigenvector for λ = 5

Let v = \begin{bmatrix} a \\ b \end{bmatrix}. We will solve for a, b. The equation P v = λ v is equivalent to (P - λI) v = 0.
    \begin{bmatrix} 0 \\ 0 \end{bmatrix} = (P - 5I) v = \begin{bmatrix} 3 & -1 \\ 6 & -2 \end{bmatrix} \begin{bmatrix} a \\ b \end{bmatrix} = \begin{bmatrix} 3a - b \\ 6a - 2b \end{bmatrix}
so 3a - b = 0 and 6a - 2b = 0 (which are equivalent). Solving gives b = 3a. Thus,
    v = \begin{bmatrix} a \\ b \end{bmatrix} = \begin{bmatrix} a \\ 3a \end{bmatrix} = a \begin{bmatrix} 1 \\ 3 \end{bmatrix}
Any nonzero scalar multiple of \begin{bmatrix} 1 \\ 3 \end{bmatrix} is an eigenvector of P with eigenvalue 5.

Finding the (right) eigenvector for λ = 6

Let v = \begin{bmatrix} a \\ b \end{bmatrix}. We will solve for a, b. The equation P v = λ v is equivalent to (P - λI) v = 0.
    \begin{bmatrix} 0 \\ 0 \end{bmatrix} = (P - 6I) v = \begin{bmatrix} 2 & -1 \\ 6 & -3 \end{bmatrix} \begin{bmatrix} a \\ b \end{bmatrix} = \begin{bmatrix} 2a - b \\ 6a - 3b \end{bmatrix}
so 2a - b = 0 and 6a - 3b = 0 (which are equivalent). Solving gives b = 2a. Thus,
    v = \begin{bmatrix} a \\ b \end{bmatrix} = \begin{bmatrix} a \\ 2a \end{bmatrix} = a \begin{bmatrix} 1 \\ 2 \end{bmatrix}
Any nonzero scalar multiple of \begin{bmatrix} 1 \\ 2 \end{bmatrix} is an eigenvector of P with eigenvalue 6.

Verify the eigenvectors

    \begin{bmatrix} 8 & -1 \\ 6 & 3 \end{bmatrix} \begin{bmatrix} 1 \\ 3 \end{bmatrix} = \begin{bmatrix} 8(1) - 1(3) \\ 6(1) + 3(3) \end{bmatrix} = \begin{bmatrix} 5 \\ 15 \end{bmatrix} = 5 \begin{bmatrix} 1 \\ 3 \end{bmatrix}
    \begin{bmatrix} 8 & -1 \\ 6 & 3 \end{bmatrix} \begin{bmatrix} 2 \\ 4 \end{bmatrix} = \begin{bmatrix} 8(2) - 1(4) \\ 6(2) + 3(4) \end{bmatrix} = \begin{bmatrix} 12 \\ 24 \end{bmatrix} = 6 \begin{bmatrix} 2 \\ 4 \end{bmatrix}

Normalization: Which scalar multiple should we use?

In some applications, any nonzero multiple is fine. In others, a particular scaling is required.

Markov chains / stochastic matrices: Scale the vector so that the entries sum up to 1. For v = a \begin{bmatrix} 1 \\ 3 \end{bmatrix}, the sum is a(1 + 3) = 4a = 1, so a = 1/4:
    v = \begin{bmatrix} 1/4 \\ 3/4 \end{bmatrix}

Principal component analysis: Scale it to be a unit vector, so that the sum of the squares of its entries equals 1: 1 = a^2(1^2 + 3^2) = 10 a^2, so a = ±1/\sqrt{10} and
    v = ± \begin{bmatrix} 1/\sqrt{10} \\ 3/\sqrt{10} \end{bmatrix}  (two possibilities)
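Both scalings are one line of numpy; this sketch is an illustration, not part of the original slides:

```python
# Sketch: rescaling the eigenvector (1, 3) under the two conventions above.
import numpy as np

v = np.array([1., 3.])

v_markov = v / v.sum()                         # entries sum to 1
print(v_markov)                                # [0.25 0.75]

v_unit = v / np.linalg.norm(v)                 # sum of squares equals 1
print(v_unit)                                  # [0.3162... 0.9486...] = (1, 3)/sqrt(10)
# -v_unit is the other valid choice; numpy's eig may return either sign.
```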

Finding the left eigenvector for λ = 5

Let v = \begin{bmatrix} a & b \end{bmatrix}. We will solve for a, b. The equation v P = λ v is equivalent to v (P - λI) = 0.
    \begin{bmatrix} 0 & 0 \end{bmatrix} = v (P - 5I) = \begin{bmatrix} a & b \end{bmatrix} \begin{bmatrix} 3 & -1 \\ 6 & -2 \end{bmatrix} = \begin{bmatrix} 3a + 6b & -a - 2b \end{bmatrix}
so 3a + 6b = 0 and -a - 2b = 0 (which are equivalent). Solving gives b = -a/2. Thus,
    v = \begin{bmatrix} a & b \end{bmatrix} = \begin{bmatrix} a & -a/2 \end{bmatrix} = a \begin{bmatrix} 1 & -1/2 \end{bmatrix}
Any nonzero scalar multiple of \begin{bmatrix} 1 & -1/2 \end{bmatrix} is a left eigenvector of P with eigenvalue 5.

Finding the left eigenvector for λ = 6

Let v = \begin{bmatrix} a & b \end{bmatrix}. We will solve for a, b. The equation v P = λ v is equivalent to v (P - λI) = 0.
    \begin{bmatrix} 0 & 0 \end{bmatrix} = v (P - 6I) = \begin{bmatrix} a & b \end{bmatrix} \begin{bmatrix} 2 & -1 \\ 6 & -3 \end{bmatrix} = \begin{bmatrix} 2a + 6b & -a - 3b \end{bmatrix}
so 2a + 6b = 0 and -a - 3b = 0 (which are equivalent). Solving gives b = -a/3. Thus,
    v = \begin{bmatrix} a & b \end{bmatrix} = \begin{bmatrix} a & -a/3 \end{bmatrix} = a \begin{bmatrix} 1 & -1/3 \end{bmatrix}
Any nonzero scalar multiple of \begin{bmatrix} 1 & -1/3 \end{bmatrix} is a left eigenvector of P with eigenvalue 6.
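A hedged numpy sketch (not from the slides): since v P = λ v is equivalent to P' v' = λ v', left eigenvectors of P are transposed right eigenvectors of P'; the two left eigenvectors found above can also be checked directly.

```python
# Sketch: check the left eigenvectors, and relate them to eigenvectors of P transpose.
import numpy as np

P = np.array([[8., -1.],
              [6.,  3.]])

v5 = np.array([1., -0.5])                      # left eigenvector for lambda = 5
v6 = np.array([1., -1/3])                      # left eigenvector for lambda = 6
assert np.allclose(v5 @ P, 5 * v5)
assert np.allclose(v6 @ P, 6 * v6)

eigvals, W = np.linalg.eig(P.T)                # columns of W: right eigenvectors of P'
print(eigvals)                                 # 5 and 6, up to ordering
```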

Verify the left eigenvectors

    \begin{bmatrix} 2 & -1 \end{bmatrix} \begin{bmatrix} 8 & -1 \\ 6 & 3 \end{bmatrix} = \begin{bmatrix} 2(8) - 1(6) & 2(-1) - 1(3) \end{bmatrix} = \begin{bmatrix} 10 & -5 \end{bmatrix} = 5 \begin{bmatrix} 2 & -1 \end{bmatrix}
    \begin{bmatrix} 1.5 & -.5 \end{bmatrix} \begin{bmatrix} 8 & -1 \\ 6 & 3 \end{bmatrix} = \begin{bmatrix} 1.5(8) - .5(6) & 1.5(-1) - .5(3) \end{bmatrix} = \begin{bmatrix} 9 & -3 \end{bmatrix} = 6 \begin{bmatrix} 1.5 & -.5 \end{bmatrix}

Diagonalizing a matrix

This procedure assumes there are k linearly independent eigenvectors, where P is k × k. If the characteristic polynomial has k distinct roots, then there are k such eigenvectors. But if roots are repeated, there may or may not be a full set of eigenvectors. We'll explore this complication later.

Put the right eigenvectors r_1, r_2, ... into the columns of a matrix V, and form the diagonal matrix D with the eigenvalues λ_1, λ_2, ... in the same order:
    V = \begin{bmatrix} r_1 & r_2 \end{bmatrix} = \begin{bmatrix} 1 & 2 \\ 3 & 4 \end{bmatrix}, \qquad D = \begin{bmatrix} λ_1 & 0 \\ 0 & λ_2 \end{bmatrix} = \begin{bmatrix} 5 & 0 \\ 0 & 6 \end{bmatrix}
Compute
    V^{-1} = \begin{bmatrix} l_1 \\ l_2 \end{bmatrix} = \begin{bmatrix} -2 & 1 \\ 3/2 & -1/2 \end{bmatrix}
Its rows are the left eigenvectors l_1, l_2, ... of P, in the same order as the eigenvalues in D, scaled so that l_i r_i = 1.
This gives the diagonalization P = V D V^{-1}:
    \begin{bmatrix} 8 & -1 \\ 6 & 3 \end{bmatrix} = \begin{bmatrix} 1 & 2 \\ 3 & 4 \end{bmatrix} \begin{bmatrix} 5 & 0 \\ 0 & 6 \end{bmatrix} \begin{bmatrix} -2 & 1 \\ 3/2 & -1/2 \end{bmatrix}
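As a sketch (illustration only, not the slides' Matlab/R code), the diagonalization can be assembled and checked in numpy:

```python
# Sketch: build V, D, V^{-1} from the eigenvectors above and check P = V D V^{-1}.
import numpy as np

P = np.array([[8., -1.],
              [6.,  3.]])
V = np.array([[1., 2.],
              [3., 4.]])                       # columns: right eigenvectors r1, r2
D = np.diag([5., 6.])                          # eigenvalues in the same order
V_inv = np.linalg.inv(V)
print(V_inv)                                   # [[-2.   1. ], [ 1.5 -0.5]]  (rows: l1, l2)

assert np.allclose(V @ D @ V_inv, P)           # P = V D V^{-1}
assert np.allclose(V_inv @ V, np.eye(2))       # so l_i . r_i = 1
```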

Matrix powers using the spectral decomposition

An expansion of P^n is P^n = (V D V^{-1})(V D V^{-1}) \cdots (V D V^{-1}) = V D^n V^{-1}:
    P^n = V D^n V^{-1} = V \begin{bmatrix} 5^n & 0 \\ 0 & 6^n \end{bmatrix} V^{-1} = V \begin{bmatrix} 5^n & 0 \\ 0 & 0 \end{bmatrix} V^{-1} + V \begin{bmatrix} 0 & 0 \\ 0 & 6^n \end{bmatrix} V^{-1}

    V \begin{bmatrix} 5^n & 0 \\ 0 & 0 \end{bmatrix} V^{-1}
    = \begin{bmatrix} 1 & 2 \\ 3 & 4 \end{bmatrix} \begin{bmatrix} 5^n & 0 \\ 0 & 0 \end{bmatrix} \begin{bmatrix} -2 & 1 \\ 1.5 & -.5 \end{bmatrix}
    = \begin{bmatrix} (1)(5^n)(-2) & (1)(5^n)(1) \\ (3)(5^n)(-2) & (3)(5^n)(1) \end{bmatrix}
    = λ_1^n r_1 l_1 = 5^n \begin{bmatrix} 1 \\ 3 \end{bmatrix} \begin{bmatrix} -2 & 1 \end{bmatrix} = 5^n \begin{bmatrix} -2 & 1 \\ -6 & 3 \end{bmatrix}

    V \begin{bmatrix} 0 & 0 \\ 0 & 6^n \end{bmatrix} V^{-1}
    = \begin{bmatrix} 1 & 2 \\ 3 & 4 \end{bmatrix} \begin{bmatrix} 0 & 0 \\ 0 & 6^n \end{bmatrix} \begin{bmatrix} -2 & 1 \\ 1.5 & -.5 \end{bmatrix}
    = \begin{bmatrix} 2(6^n)(1.5) & 2(6^n)(-.5) \\ 4(6^n)(1.5) & 4(6^n)(-.5) \end{bmatrix}
    = λ_2^n r_2 l_2 = 6^n \begin{bmatrix} 2 \\ 4 \end{bmatrix} \begin{bmatrix} 1.5 & -.5 \end{bmatrix} = 6^n \begin{bmatrix} 3 & -1 \\ 6 & -2 \end{bmatrix}

Continuing the computation of P^n:
    P^n = V D^n V^{-1} = V \begin{bmatrix} 5^n & 0 \\ 0 & 6^n \end{bmatrix} V^{-1} = V \begin{bmatrix} 5^n & 0 \\ 0 & 0 \end{bmatrix} V^{-1} + V \begin{bmatrix} 0 & 0 \\ 0 & 6^n \end{bmatrix} V^{-1}
    = 5^n \begin{bmatrix} -2 & 1 \\ -6 & 3 \end{bmatrix} + 6^n \begin{bmatrix} 3 & -1 \\ 6 & -2 \end{bmatrix}
General formula (with k = 2 and two distinct eigenvalues):
    P^n = V D^n V^{-1} = λ_1^n r_1 l_1 + λ_2^n r_2 l_2
If P is k × k and diagonalizable, this becomes
    P^n = V D^n V^{-1} = λ_1^n r_1 l_1 + λ_2^n r_2 l_2 + \cdots + λ_k^n r_k l_k
What if the matrix is not diagonalizable? We will see a generalization called the Jordan Canonical Form.
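A final hedged sketch in numpy (an illustration, not the slides' Matlab/R code) comparing the three ways of computing P^n:

```python
# Sketch: P^n via V D^n V^{-1}, via the spectral decomposition, and by direct powering.
import numpy as np

P = np.array([[8., -1.], [6., 3.]])
V = np.array([[1., 2.], [3., 4.]])
V_inv = np.linalg.inv(V)
r1, r2 = V[:, [0]], V[:, [1]]                  # right eigenvectors as columns
l1, l2 = V_inv[[0], :], V_inv[[1], :]          # left eigenvectors as rows

n = 4
Pn_diag = V @ np.diag([5.0**n, 6.0**n]) @ V_inv
Pn_spec = 5.0**n * (r1 @ l1) + 6.0**n * (r2 @ l2)
Pn_direct = np.linalg.matrix_power(P, n)

assert np.allclose(Pn_diag, Pn_direct)
assert np.allclose(Pn_spec, Pn_direct)
# r1 @ l1 = [[-2, 1], [-6, 3]] and r2 @ l2 = [[3, -1], [6, -2]], matching the slide.
```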