

Eigenvalues and Eigenvectors

If Av = λv with v nonzero, then λ is called an eigenvalue of A and v is called an eigenvector of A corresponding to the eigenvalue λ.

Agenda: understand the action of A by seeing how it acts on eigenvectors.

Rewriting Av = λv gives (λI - A)v = 0, a system to be satisfied by λ and v. For a given λ, the only solution is v = 0, except when det(λI - A) = 0; in that case a nonzero v satisfying (λI - A)v = 0 is guaranteed to exist, and that v is an eigenvector.

det(λI - A) is a polynomial of degree n with leading term λ^n. This polynomial is called the characteristic polynomial of the matrix A, and det(λI - A) = 0 is called the characteristic equation of the matrix A. The characteristic polynomial has n roots; these may or may not be distinct.

For each eigenvalue λ, the corresponding eigenvectors satisfy (λI - A)v = 0; the nonzero solutions v are the eigenvectors, and these are exactly the nonzero vectors in the null space of (λI - A). For each eigenvalue we calculate a basis for the null space of (λI - A), and these basis vectors represent the corresponding eigenvectors, with the understanding that any nonzero linear combination of them is again an eigenvector for that λ. We will see shortly that if λ is a nonrepeated root of the characteristic polynomial, then the null space of (λI - A) has dimension one, so there is only one corresponding eigenvector (that is, a basis of the null space has only one vector), which can be multiplied by any nonzero scalar.

Observation: If λ_i is an eigenvalue of A with v_i a corresponding eigenvector, then for any λ (it doesn't have to be an eigenvalue) we have (λI - A)v_i = λv_i - Av_i = (λ - λ_i)v_i.

Fun facts about eigenvalues/eigenvectors:

1) Eigenvectors corresponding to different eigenvalues are linearly independent.

2) If λ is a nonrepeated root of the characteristic polynomial, then there is exactly one corresponding eigenvector (up to a scalar multiple); in other words, the dimension of the null space of (λI - A) is one.

3) If λ is a repeated root of the characteristic polynomial with multiplicity m(λ), then there is at least one corresponding eigenvector and up to m(λ) independent corresponding eigenvectors; in other words, the dimension of the null space of (λI - A) is between 1 and m(λ).

4) As a consequence of item 1) above, if the characteristic polynomial has n distinct roots (where n is the degree of the polynomial), then there are n corresponding independent eigenvectors, which in turn constitute a basis for R^n (or C^n if applicable).

5) When bad things happen to good matrices: deficient matrices. A matrix is said to be deficient if it fails to have n independent eigenvectors. This can only happen if (but not necessarily if) an eigenvalue has multiplicity greater than 1. However, it is always true that if an eigenvalue λ has multiplicity m, then the null space of (A - λI)^m has dimension m. The vectors in this null space are called generalized eigenvectors. We won't pursue that further in this course, however.

6) Complex eigenvalues: If A is real, then complex eigenvalues/eigenvectors come in complex conjugate pairs: if λ is an eigenvalue with eigenvector v, then λ* is an eigenvalue with eigenvector v*.

7) If A is a triangular matrix, the eigenvalues are the diagonal entries.

Diagonalization

If A has a full set of eigenvectors (n linearly independent eigenvectors) and we put them as columns in a matrix V, then AV = VΛ, where Λ is a diagonal matrix with the eigenvalues (corresponding to the columns of V) down the diagonal. Then V^-1 A V = Λ; this is called diagonalizing A. Also, we have A = V Λ V^-1.

Calculating powers of A: A^m = (V Λ V^-1)^m = V Λ^m V^-1.

In general, if P is any invertible matrix, then P^-1 A P is called a similarity transformation of A. Diagonalizing A consists of finding a P for which the similarity transformation gives a diagonal matrix. Of course, we know that such a P would need to be a matrix whose columns are a full set of eigenvectors. A is diagonalizable if and only if there is a full set of eigenvectors.

Given any linearly independent set of n vectors, there is a matrix A that has these as eigenvectors: put them as the columns of V and set A = V Λ V^-1 for any diagonal Λ we wish to specify; the diagonal entries of Λ are the eigenvalues.
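The diagonalization AV = VΛ and the power formula A^m = V Λ^m V^-1 can be checked numerically. A minimal sketch in NumPy (the 2x2 matrix here is an arbitrary illustration with distinct eigenvalues, not one from the notes):

```python
import numpy as np

# Illustrative matrix (assumption, not from the notes); its eigenvalues
# are 2 and 5, which are distinct, so a full set of eigenvectors exists.
A = np.array([[4.0, 1.0],
              [2.0, 3.0]])

lam, V = np.linalg.eig(A)   # columns of V are eigenvectors
Lam = np.diag(lam)          # Λ: eigenvalues down the diagonal

# AV = VΛ and V^-1 A V = Λ (diagonalizing A)
assert np.allclose(A @ V, V @ Lam)
assert np.allclose(np.linalg.inv(V) @ A @ V, Lam)

# Powers: A^m = V Λ^m V^-1 (cheap, since Λ^m just raises diagonal entries)
m = 5
Am = V @ np.linalg.matrix_power(Lam, m) @ np.linalg.inv(V)
assert np.allclose(Am, np.linalg.matrix_power(A, m))
```

The point of the power formula is that Λ^m is trivial to compute (each diagonal entry raised to the m-th power), while multiplying A by itself m times is not.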
A similarity transformation can be considered as specifying the action of A in a transformed coordinate system:

Given y = Ax as a transformation in R^n, if we transform the coordinates by x = Pu and y = Pw (so that our new coordinate directions are the columns of P), then u and w are related by w = (P^-1 A P)u, so that P^-1 A P is the transformed action of A in the new coordinate system; i.e., A is transformed to the new matrix P^-1 A P in the new coordinate system.

If A doesn't have a full set of eigenvectors, then it cannot be diagonalized (why: if A can be diagonalized, then A = P Λ P^-1, so AP = PΛ and the columns of P are seen to be a full set of eigenvectors), but you can always find a similarity transformation P^-1 A P such that P^-1 A P has a special upper triangular form, called Jordan form. We will not delve any further into this, however. Remember, though, that we did note that a square matrix A always has a full set of generalized eigenvectors even when A itself is deficient.

Additional remark on similarity transformations: If P^-1 A P = B, then A and B have the same characteristic polynomial, the same eigenvalues, and the same eigenvector structure, in the sense that if v is a (generalized) eigenvector of A, then P^-1 v is a (generalized) eigenvector of B.

Symmetric matrices: A^T = A. Properties:

1) All eigenvalues are real.

2) There is always a full set of eigenvectors.

3) Eigenvectors from different eigenvalues are (automatically) orthogonal to each other.

So, in particular, if all the eigenvalues are distinct, there is an orthonormal basis of R^n consisting of eigenvectors of A. If an eigenvalue has multiplicity greater than 1, you can always arrange for the corresponding eigenvectors to be orthogonal to each other (Gram-Schmidt process). So, finally, you can always arrange for the orthogonal eigenvectors of A to have magnitude 1 and thus construct a full set of orthonormal eigenvectors of A. If V is the matrix whose columns are those eigenvectors, then we have V^T V = I, so that V^T = V^-1.

Application to quadratic forms: F(x,y,z) = xy + z^2 - 5x^2 - 7xz + 13yz

By writing a quadratic form as x^T A x where A is symmetric, the transformation x = Vu, where the columns of V are an orthonormal set of eigenvectors of A, gives new coordinates u in which the quadratic form is

x^T A x = u^T V^T A V u = u^T V^-1 A V u = u^T Λ u,

where Λ is the diagonal matrix of eigenvalues. This is called diagonalizing the quadratic form. In terms of the new coordinates, the quadratic form consists purely of a combination of squares of the coordinates, with no cross terms. In this example, the symmetric matrix has the square coefficients on the diagonal and half of each cross-term coefficient off the diagonal, and in MATLAB:

>> A = [-5 0.5 -3.5; 0.5 0 6.5; -3.5 6.5 1]
>> [V,D] = eig(A)
>> V'*V   % the columns of V are orthonormal, so V'*V is the 3x3 identity

The eigenvalues (the diagonal of D) come out to approximately -8.437, -3.000, and 7.437, so the new quadratic form is

xy + z^2 - 5x^2 - 7xz + 13yz = -8.437 u1^2 - 3 u2^2 + 7.437 u3^2, where x = Vu.

The new coordinate system is easily displayed within the original x-y-z coordinates. The columns of V are the new coordinate vectors (shown as red, green, blue unit vectors, respectively), corresponding to the vectors i, j, k in the standard coordinate system.
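The same computation can be mirrored in NumPy as a check (an alternative to the MATLAB session; `numpy.linalg.eigh` is the symmetric eigensolver, which guarantees real eigenvalues and orthonormal eigenvectors):

```python
import numpy as np

# Symmetric matrix of F(x,y,z) = xy + z^2 - 5x^2 - 7xz + 13yz:
# diagonal entries are the square coefficients, off-diagonal entries
# are half the cross-term coefficients.
A = np.array([[-5.0, 0.5, -3.5],
              [ 0.5, 0.0,  6.5],
              [-3.5, 6.5,  1.0]])

lam, V = np.linalg.eigh(A)   # eigh: symmetric eigensolver, ascending eigenvalues

# Columns of V are orthonormal: V^T V = I, so V^T = V^-1
assert np.allclose(V.T @ V, np.eye(3))

# Diagonalizing the quadratic form: V^T A V = Λ
assert np.allclose(V.T @ A @ V, np.diag(lam))

print(lam)   # approximately [-8.437, -3.000, 7.437]
```

The printed eigenvalues are the coefficients of u1^2, u2^2, u3^2 in the diagonalized form.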

[Figure: the u1, u2, u3 coordinate axes (the columns of V) plotted within the original x-y-z coordinate system.]
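As a closing side note tying back to the "deficient matrices" fun fact earlier, a small sketch (the 2x2 matrix is an illustrative choice, not from the notes) of an eigenvalue whose eigenspace is smaller than its multiplicity, while the generalized eigenspace has the full dimension:

```python
import numpy as np

# Deficient matrix: eigenvalue 2 is repeated (triangular, so the
# eigenvalues are the diagonal entries), but there is only one
# independent eigenvector.
A = np.array([[2.0, 1.0],
              [0.0, 2.0]])

assert np.allclose(np.linalg.eigvals(A), [2.0, 2.0])

# rank(A - 2I) = 1, so Null(A - 2I) has dimension 1:
# only one eigenvector direction, not a full set.
assert np.linalg.matrix_rank(A - 2 * np.eye(2)) == 1

# But (A - 2I)^2 = 0, so Null((A - 2I)^2) has dimension 2 = multiplicity:
# every vector is a generalized eigenvector.
N = A - 2 * np.eye(2)
assert np.allclose(N @ N, np.zeros((2, 2)))
```

This is the simplest case of the general fact that the null space of (A - λI)^m has dimension m when λ has multiplicity m.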