Eigenvalues and Eigenvectors


5 Eigenvalues and Eigenvectors
5.3 DIAGONALIZATION

DIAGONALIZATION
Example 1: Let A = \begin{bmatrix} 7 & 2 \\ -4 & 1 \end{bmatrix}. Find a formula for A^k, given that A = PDP^{-1}, where
P = \begin{bmatrix} 1 & 1 \\ -1 & -2 \end{bmatrix} and D = \begin{bmatrix} 5 & 0 \\ 0 & 3 \end{bmatrix}.
Solution: The standard formula for the inverse of a 2 × 2 matrix yields
P^{-1} = \begin{bmatrix} 2 & 1 \\ -1 & -1 \end{bmatrix}.
Slide 5.3-2

DIAGONALIZATION
Then, by associativity of matrix multiplication,
A^2 = (PDP^{-1})(PDP^{-1}) = PD(P^{-1}P)DP^{-1} = PDDP^{-1} = PD^2P^{-1}
A^2 = PD^2P^{-1} = \begin{bmatrix} 1 & 1 \\ -1 & -2 \end{bmatrix} \begin{bmatrix} 5^2 & 0 \\ 0 & 3^2 \end{bmatrix} \begin{bmatrix} 2 & 1 \\ -1 & -1 \end{bmatrix}
Again,
A^3 = (PDP^{-1})A^2 = (PDP^{-1})PD^2P^{-1} = PDD^2P^{-1} = PD^3P^{-1}
Slide 5.3-3

DIAGONALIZATION
In general, for k ≥ 1,
A^k = PD^kP^{-1} = \begin{bmatrix} 1 & 1 \\ -1 & -2 \end{bmatrix} \begin{bmatrix} 5^k & 0 \\ 0 & 3^k \end{bmatrix} \begin{bmatrix} 2 & 1 \\ -1 & -1 \end{bmatrix}
= \begin{bmatrix} 2 \cdot 5^k - 3^k & 5^k - 3^k \\ 2 \cdot 3^k - 2 \cdot 5^k & 2 \cdot 3^k - 5^k \end{bmatrix}
A square matrix A is said to be diagonalizable if A is similar to a diagonal matrix, that is, if A = PDP^{-1} for some invertible matrix P and some diagonal matrix D.
Slide 5.3-4
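The pattern above is easy to check numerically. The following sketch is an addition (not part of the original slides) and assumes NumPy is available; it compares A^k computed by repeated multiplication with PD^kP^{-1} and with the closed form just derived.

```python
import numpy as np

# Matrices from Example 1 (slide 5.3-2)
A = np.array([[7, 2], [-4, 1]])
P = np.array([[1, 1], [-1, -2]])
P_inv = np.linalg.inv(P)          # equals [[2, 1], [-1, -1]]

k = 6
# A^k two ways: directly, and via the diagonalization A^k = P D^k P^{-1}
direct = np.linalg.matrix_power(A, k)
via_diag = P @ np.diag([5**k, 3**k]) @ P_inv

# Closed form from slide 5.3-4
closed = np.array([[2*5**k - 3**k,   5**k - 3**k],
                   [2*3**k - 2*5**k, 2*3**k - 5**k]])

assert np.allclose(direct, via_diag) and np.allclose(direct, closed)
print(direct)
```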

THE DIAGONALIZATION THEOREM
Theorem 5: An n × n matrix A is diagonalizable if and only if A has n linearly independent eigenvectors.
In fact, A = PDP^{-1}, with D a diagonal matrix, if and only if the columns of P are n linearly independent eigenvectors of A. In this case, the diagonal entries of D are eigenvalues of A that correspond, respectively, to the eigenvectors in P.
In other words, A is diagonalizable if and only if there are enough eigenvectors to form a basis of ℝⁿ. We call such a basis an eigenvector basis of ℝⁿ.
Slide 5.3-5
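As a quick illustration of Theorem 5 (an addition, not part of the slides), the sketch below uses NumPy's np.linalg.eig, which returns the eigenvalues of A together with a matrix whose columns are corresponding eigenvectors; when those columns are linearly independent, they serve as the P in A = PDP^{-1}.

```python
import numpy as np

A = np.array([[7., 2.], [-4., 1.]])   # matrix from Example 1

# Columns of P are eigenvectors; the diagonal of D holds the matching eigenvalues.
eigvals, P = np.linalg.eig(A)
D = np.diag(eigvals)

# A has 2 linearly independent eigenvectors, so P is invertible and A = P D P^{-1}.
assert np.linalg.matrix_rank(P) == A.shape[0]
assert np.allclose(A, P @ D @ np.linalg.inv(P))
```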

THE DIAGONALIZATION THEOREM
Proof: First, observe that if P is any n × n matrix with columns v_1, …, v_n, and if D is any diagonal matrix with diagonal entries λ_1, …, λ_n, then
AP = A[v_1 \;\; v_2 \;\; \cdots \;\; v_n] = [Av_1 \;\; Av_2 \;\; \cdots \;\; Av_n]  ----(1)
while
PD = P \begin{bmatrix} \lambda_1 & 0 & \cdots & 0 \\ 0 & \lambda_2 & \cdots & 0 \\ \vdots & \vdots & & \vdots \\ 0 & 0 & \cdots & \lambda_n \end{bmatrix} = [\lambda_1 v_1 \;\; \lambda_2 v_2 \;\; \cdots \;\; \lambda_n v_n]  ----(2)
Slide 5.3-6

THE DIAGONALIZATION THEOREM
Now suppose A is diagonalizable and A = PDP^{-1}. Then, right-multiplying this relation by P, we have AP = PD.
In this case, equations (1) and (2) imply that
[Av_1 \;\; Av_2 \;\; \cdots \;\; Av_n] = [\lambda_1 v_1 \;\; \lambda_2 v_2 \;\; \cdots \;\; \lambda_n v_n]  ----(3)
Equating columns, we find that
Av_1 = \lambda_1 v_1, \; Av_2 = \lambda_2 v_2, \; \ldots, \; Av_n = \lambda_n v_n  ----(4)
Since P is invertible, its columns v_1, …, v_n must be linearly independent.
Slide 5.3-7

THE DIAGONALIZATION THEOREM
Also, since these columns are nonzero, the equations in (4) show that λ_1, …, λ_n are eigenvalues and v_1, …, v_n are corresponding eigenvectors.
This argument proves the "only if" parts of the first and second statements, along with the third statement, of the theorem.
Finally, given any n eigenvectors v_1, …, v_n, use them to construct the columns of P and use corresponding eigenvalues λ_1, …, λ_n to construct D.
Slide 5.3-8

THE DIAGONALIZATION THEOREM
By equations (1) and (2), AP = PD. This is true without any condition on the eigenvectors. If, in fact, the eigenvectors are linearly independent, then P is invertible (by the Invertible Matrix Theorem), and AP = PD implies that A = PDP^{-1}.
Slide 5.3-9

DIAGONALIZING MATRICES
Example 2: Diagonalize the following matrix, if possible.
A = \begin{bmatrix} 1 & 3 & 3 \\ -3 & -5 & -3 \\ 3 & 3 & 1 \end{bmatrix}
That is, find an invertible matrix P and a diagonal matrix D such that A = PDP^{-1}.
Solution: There are four steps to implement the description in Theorem 5.
Step 1. Find the eigenvalues of A. Here, the characteristic equation turns out to involve a cubic polynomial that can be factored:
Slide 5.3-10

DIAGONALIZING MATRICES
0 = \det(A - \lambda I) = -\lambda^3 - 3\lambda^2 + 4 = -(\lambda - 1)(\lambda + 2)^2
The eigenvalues are λ = 1 and λ = −2.
Step 2. Find three linearly independent eigenvectors of A. Three vectors are needed because A is a 3 × 3 matrix. This is a critical step. If it fails, then Theorem 5 says that A cannot be diagonalized.
Slide 5.3-11

DIAGONALIZING MATRICES
Basis for λ = 1:  v_1 = \begin{bmatrix} 1 \\ -1 \\ 1 \end{bmatrix}
Basis for λ = −2:  v_2 = \begin{bmatrix} -1 \\ 1 \\ 0 \end{bmatrix} and v_3 = \begin{bmatrix} -1 \\ 0 \\ 1 \end{bmatrix}
You can check that {v_1, v_2, v_3} is a linearly independent set.
Slide 5.3-12
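A short NumPy check (added here for illustration; not in the original slides) confirms that each vector satisfies Av = λv for its eigenvalue and that the three vectors are linearly independent.

```python
import numpy as np

A = np.array([[1, 3, 3], [-3, -5, -3], [3, 3, 1]])
v1, v2, v3 = np.array([1, -1, 1]), np.array([-1, 1, 0]), np.array([-1, 0, 1])

# Each candidate satisfies A v = lambda v for its eigenvalue.
assert np.array_equal(A @ v1, 1 * v1)
assert np.array_equal(A @ v2, -2 * v2)
assert np.array_equal(A @ v3, -2 * v3)

# The three vectors are linearly independent: [v1 v2 v3] has rank 3.
assert np.linalg.matrix_rank(np.column_stack([v1, v2, v3])) == 3
```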

DIAGONALIZING MATRICES
Step 3. Construct P from the vectors in step 2. The order of the vectors is unimportant. Using the order chosen in step 2, form
P = [v_1 \;\; v_2 \;\; v_3] = \begin{bmatrix} 1 & -1 & -1 \\ -1 & 1 & 0 \\ 1 & 0 & 1 \end{bmatrix}
Step 4. Construct D from the corresponding eigenvalues. In this step, it is essential that the order of the eigenvalues matches the order chosen for the columns of P.
Slide 5.3-13

DIAGONALIZING MATRICES
Use the eigenvalue λ = −2 twice, once for each of the eigenvectors corresponding to λ = −2:
D = \begin{bmatrix} 1 & 0 & 0 \\ 0 & -2 & 0 \\ 0 & 0 & -2 \end{bmatrix}
To avoid computing P^{-1}, simply verify that AP = PD. Compute
AP = \begin{bmatrix} 1 & 3 & 3 \\ -3 & -5 & -3 \\ 3 & 3 & 1 \end{bmatrix} \begin{bmatrix} 1 & -1 & -1 \\ -1 & 1 & 0 \\ 1 & 0 & 1 \end{bmatrix} = \begin{bmatrix} 1 & 2 & 2 \\ -1 & -2 & 0 \\ 1 & 0 & -2 \end{bmatrix}
Slide 5.3-14

DIAGONALIZING MATRICES
PD = \begin{bmatrix} 1 & -1 & -1 \\ -1 & 1 & 0 \\ 1 & 0 & 1 \end{bmatrix} \begin{bmatrix} 1 & 0 & 0 \\ 0 & -2 & 0 \\ 0 & 0 & -2 \end{bmatrix} = \begin{bmatrix} 1 & 2 & 2 \\ -1 & -2 & 0 \\ 1 & 0 & -2 \end{bmatrix}
Theorem 6: An n × n matrix with n distinct eigenvalues is diagonalizable.
Proof: Let v_1, …, v_n be eigenvectors corresponding to the n distinct eigenvalues of a matrix A. Then {v_1, …, v_n} is linearly independent, by Theorem 2 in Section 5.1. Hence A is diagonalizable, by Theorem 5.
Slide 5.3-15
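The same AP = PD verification can be done numerically. The sketch below is an addition (not from the slides) and assumes NumPy; it checks that both products give the matrix shown on slides 5.3-14 and 5.3-15, and then recovers A = PDP^{-1}.

```python
import numpy as np

A = np.array([[1, 3, 3], [-3, -5, -3], [3, 3, 1]])
P = np.array([[1, -1, -1], [-1, 1, 0], [1, 0, 1]])   # columns v1, v2, v3
D = np.diag([1, -2, -2])                             # eigenvalues in matching order

# Both products equal [[1, 2, 2], [-1, -2, 0], [1, 0, -2]].
assert np.array_equal(A @ P, P @ D)

# Since AP = PD and P is invertible, A = P D P^{-1}.
assert np.allclose(A, P @ D @ np.linalg.inv(P))
```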

MATRICES WHOSE EIGENVALUES ARE NOT DISTINCT
It is not necessary for an n × n matrix to have n distinct eigenvalues in order to be diagonalizable. Theorem 6 provides a sufficient condition for a matrix to be diagonalizable.
If an n × n matrix A has n distinct eigenvalues, with corresponding eigenvectors v_1, …, v_n, and if P = [v_1 \;\; v_2 \;\; \cdots \;\; v_n], then P is automatically invertible because its columns are linearly independent, by Theorem 2.
Slide 5.3-16

MATRICES WHOSE EIGENVALUES ARE NOT DISTINCT
When A is diagonalizable but has fewer than n distinct eigenvalues, it is still possible to build P in a way that makes P automatically invertible, as the next theorem shows.
Theorem 7: Let A be an n × n matrix whose distinct eigenvalues are λ_1, …, λ_p.
a. For 1 ≤ k ≤ p, the dimension of the eigenspace for λ_k is less than or equal to the multiplicity of the eigenvalue λ_k.
Slide 5.3-17

MATRICES WHOSE EIGENVALUES ARE NOT DISTINCT
b. The matrix A is diagonalizable if and only if the sum of the dimensions of the eigenspaces equals n, and this happens if and only if (i) the characteristic polynomial factors completely into linear factors and (ii) the dimension of the eigenspace for each λ_k equals the multiplicity of λ_k.
c. If A is diagonalizable and B_k is a basis for the eigenspace corresponding to λ_k for each k, then the total collection of vectors in the sets B_1, …, B_p forms an eigenvector basis for ℝⁿ.
Slide 5.3-18
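Theorem 7(b) suggests a direct numerical test: compare the dimension of each eigenspace with the algebraic multiplicity of its eigenvalue and check that the dimensions sum to n. The sketch below is a rough illustration, not part of the slides; is_diagonalizable is a hypothetical helper, and grouping nearly equal computed eigenvalues by rounding is a simplifying assumption that can fail for ill-conditioned matrices.

```python
import numpy as np
from collections import Counter

def is_diagonalizable(A, tol=1e-8):
    """Numerical check of Theorem 7(b): A is diagonalizable iff, for every
    eigenvalue, the eigenspace dimension equals the algebraic multiplicity.
    Eigenvalues are rounded to group numerically equal values (a simplification)."""
    eigvals = np.linalg.eigvals(A)
    multiplicities = Counter(np.round(eigvals, 6))
    n = A.shape[0]
    total_dim = 0
    for lam, mult in multiplicities.items():
        # Geometric multiplicity = n - rank(A - lambda*I).
        geo = n - np.linalg.matrix_rank(A - lam * np.eye(n), tol=tol)
        if geo != mult:
            return False
        total_dim += geo
    return total_dim == n

A = np.array([[1, 3, 3], [-3, -5, -3], [3, 3, 1]])   # Example 2: diagonalizable
B = np.array([[2, 1], [0, 2]])                       # Jordan block: not diagonalizable
print(is_diagonalizable(A), is_diagonalizable(B))    # True False
```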