
© Igor Zelenko, Fall 2017

18: Repeated Eigenvalues: algebraic and geometric multiplicities of eigenvalues, generalized eigenvectors, and the solution of systems of differential equations with repeated eigenvalues in the case n = 2 (sec. 7.8)

1. We have seen that not every matrix admits a basis of eigenvectors. First, let us discuss a way to determine whether such a basis exists. Recall the following two equivalent characterizations of an eigenvalue:

(a) λ is an eigenvalue of A ⟺ det(A − λI) = 0;

(b) λ is an eigenvalue of A ⟺ there exists a nonzero vector v such that (A − λI)v = 0. The set of all such vectors, together with the zero vector, forms a vector space called the eigenspace of λ and denoted by E_λ.

Based on these two characterizations of an eigenvalue λ of a matrix A, one can assign to λ the following two positive integers:

Algebraic multiplicity of λ: the multiplicity of λ as a root of the characteristic polynomial det(A − xI), i.e. the maximal number of appearances of the factor (x − λ) in the factorization of this polynomial.

Geometric multiplicity of λ: the dimension dim E_λ of the eigenspace of λ, i.e. the maximal number of linearly independent eigenvectors corresponding to λ.

2. For example, if

N = ( 0  1 )
    ( 0  0 )

(as in the example in item 13 of the previous notes), then λ = 0 is the unique eigenvalue. Find the algebraic multiplicity and the geometric multiplicity of λ = 0.

THEOREM 1. The geometric multiplicity of an eigenvalue is not greater than its algebraic multiplicity.
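The two multiplicities for the matrix N of item 2 can be checked with a short computation. A minimal sketch in Python (the `rank2` helper is mine, not from the notes; the geometric multiplicity uses rank–nullity, dim E_λ = n − rank(N − λI)):

```python
# Sketch: algebraic vs. geometric multiplicity of λ = 0 for N = [[0,1],[0,0]].

def rank2(m):
    """Rank of a 2x2 matrix given as [[a, b], [c, d]]."""
    (a, b), (c, d) = m
    if a*d - b*c != 0:
        return 2
    if any(x != 0 for x in (a, b, c, d)):
        return 1
    return 0

N = [[0, 1], [0, 0]]

# Characteristic polynomial det(N - xI) = x^2 = (x - 0)^2,
# so the factor (x - 0) appears twice:
alg_mult = 2

# Geometric multiplicity: dim E_0 = 2 - rank(N - 0*I) by rank-nullity.
geom_mult = 2 - rank2(N)

print(alg_mult, geom_mult)  # 2 1
```

So the geometric multiplicity (1) is strictly smaller than the algebraic one (2), consistent with Theorem 1, and N has no basis of eigenvectors.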

THEOREM 2. A matrix A admits a basis of eigenvectors if and only if, for every eigenvalue λ of A, the geometric multiplicity of λ is equal to its algebraic multiplicity.

REMARK 3. In Linear Algebra, matrices admitting a basis of eigenvectors are called diagonalizable (because they are diagonal in this basis).

REMARK 4. A basis of eigenvectors always exists for the following classes of matrices:

symmetric matrices: Aᵀ = A, or equivalently, a_ij = a_ji for all i, j;

skew-symmetric matrices: Aᵀ = −A, or equivalently, a_ij = −a_ji for all i, j.

For symmetric matrices all eigenvalues are real, and the eigenspaces corresponding to different eigenvalues are orthogonal. For skew-symmetric matrices the eigenvalues are purely imaginary (i.e. of the form iβ).

3. If the matrix A does not admit a basis of eigenvectors, then for which vectors w other than the eigenvectors is it still easy to calculate e^{At} w, in light of the formula

e^{tA} = e^{λt} e^{t(A − λI)}    (1)

(see item 6 of the previous lecture notes)?

4. Assume that w is such that

(A − λI)w ≠ 0, but (A − λI)² w = 0    (2)

(the first relation means that w is not an eigenvector corresponding to λ). Calculate e^{At} w using (1).

5. More generally, assume that for some k > 0

(A − λI)^{k−1} w ≠ 0, but (A − λI)^k w = 0.    (3)
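The calculation asked for in item 4 gives e^{At} w = e^{λt}(w + t(A − λI)w): when (A − λI)² w = 0, the series for e^{t(A − λI)} w in (1) terminates after two terms. A sketch checking this numerically, with an illustrative matrix A and vector w of my choosing (not from the notes) that satisfy (2):

```python
import math

# Sketch: verify e^{At} w = e^{λt} (w + t (A - λI) w) against a truncated
# power series for e^{At} w. A, λ, w are illustrative choices.

A = [[2.0, 1.0], [0.0, 2.0]]   # repeated eigenvalue λ = 2
lam = 2.0
w = [0.0, 1.0]                 # (A - λI)w = (1, 0) ≠ 0, (A - λI)^2 w = 0

def matvec(M, x):
    return [M[0][0]*x[0] + M[0][1]*x[1], M[1][0]*x[0] + M[1][1]*x[1]]

def expAt_w(A, w, t, terms=60):
    """Power-series approximation of e^{At} w: sum of t^k A^k w / k!."""
    out = list(w)
    term = list(w)
    for k in range(1, terms):
        term = [t/k * c for c in matvec(A, term)]
        out = [o + c for o, c in zip(out, term)]
    return out

t = 0.7
B = [[A[0][0]-lam, A[0][1]], [A[1][0], A[1][1]-lam]]   # A - λI
Bw = matvec(B, w)
closed = [math.exp(lam*t) * (w[i] + t*Bw[i]) for i in range(2)]
series = expAt_w(A, w, t)
print(all(abs(c - s) < 1e-9 for c, s in zip(closed, series)))  # True
```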

Then e^{At} w can be calculated using only a finite number of terms when expanding e^{t(A − λI)} w in (1). Note that if for some λ there exists w satisfying (3), then λ must be an eigenvalue.

6. DEFINITION 5. A vector w satisfying (3) for some k > 0 is called a generalized eigenvector of λ (of order k). The set of all generalized eigenvectors of λ, together with the zero vector, is a vector space denoted by E_λ^gen.

REMARK 6. A (regular) eigenvector is a generalized eigenvector of order 1, so E_λ ⊆ E_λ^gen (given two sets A and B, the notation A ⊆ B means that the set A is a subset of the set B, i.e. any element of A also belongs to B).

THEOREM 7. The dimension of the space E_λ^gen of generalized eigenvectors of λ is equal to the algebraic multiplicity of λ.
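Definition 5 can be turned into a small order-classification routine. A sketch for 2 × 2 matrices (the matrix and vectors below are illustrative, not from the notes):

```python
# Sketch: find the order of a generalized eigenvector per Definition 5,
# i.e. the smallest k with (A - λI)^k w = 0.

def matvec(M, x):
    return [sum(M[i][j]*x[j] for j in range(2)) for i in range(2)]

def gen_order(A, lam, w, max_k=5):
    """Smallest k with (A - λI)^k w = 0, or None if none up to max_k."""
    B = [[A[0][0]-lam, A[0][1]], [A[1][0], A[1][1]-lam]]
    v = list(w)
    for k in range(1, max_k + 1):
        v = matvec(B, v)
        if all(c == 0 for c in v):
            return k
    return None

A = [[2, 1], [0, 2]]            # repeated eigenvalue λ = 2
print(gen_order(A, 2, [1, 0]))  # 1 -> an ordinary eigenvector
print(gen_order(A, 2, [0, 1]))  # 2 -> generalized eigenvector of order 2
```

Both vectors together span R², so dim E_2^gen = 2 here, matching the algebraic multiplicity as Theorem 7 asserts.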

THEOREM 8. Any matrix A admits a basis of generalized eigenvectors.

Let us see how it all works in the first nontrivial case, n = 2.

7. Let A be a 2 × 2 matrix and λ a repeated eigenvalue of A. Then its algebraic multiplicity is equal to 2. There are two options for the geometric multiplicity:

1. (trivial case) The geometric multiplicity of λ is equal to 2. Then A = λI.

2. (less trivial case) The geometric multiplicity is equal to 1. In the rest of these notes we concentrate on this case only.

8. PROPOSITION 9. Let w be a nonzero vector which is not an eigenvector of λ, i.e. w ∉ E_λ. Then w satisfies (A − λI)² w = 0, i.e. w is a generalized eigenvector of order 2. Besides, in this case

v := (A − λI)w    (4)

is an eigenvector of A corresponding to λ.

9. Note that {v, w} constructed above constitute a basis of R² (i.e. E_λ^gen = R²), so we have proved Theorem 8 in this case. Therefore, {e^{tA} v, e^{tA} w} form a fundamental set of solutions for the system X′ = AX. By construction and the calculation as in item 4 above,

e^{tA} v = e^{λt} v,
e^{tA} w = e^{λt} (w + tv).

Conclusion:

{e^{λt} v, e^{λt} (w + tv)}    (5)

form a fundamental set of solutions of X′ = AX, i.e. the general solution is

e^{λt} (C₁ v + C₂ (w + tv)).    (6)

10. This gives us the following algorithms for finding the fundamental set of solutions in the case of a repeated eigenvalue λ with geometric multiplicity 1.

Algorithm 1 (easier than the one in the book):

(a) Find the eigenspace E_λ of λ by finding all solutions of the system (A − λI)v = 0. Under our assumptions the dimension of this eigenspace must be equal to 1.

(b) Take any vector w not lying on the eigenline E_λ and find v := (A − λI)w.

With v and w so chosen, the general solution is given by (6).

Algorithm 2 (as in the book):

(a) Find an eigenvector v by finding one nonzero solution of the system (A − λI)v = 0.

(b) With the v found in step (a), find w such that (A − λI)w = v.

With v and w so chosen, the general solution is given by (6).

REMARK 10. The advantage of Algorithm 1 over Algorithm 2 is that in the first you solve only one linear system (the one that gives the eigenline), while in Algorithm 2 you need to solve one more linear system, (A − λI)w = v, for w (in Algorithm 1 you choose w and then find v from (4) instead).

11. Finally, let us give another algorithm which works only in the case n = 2 (for higher n it works only under the additional assumption that A has only one eigenvalue). This algorithm does not use eigenvectors explicitly (although implicitly we use the information that the eigenvalue is repeated). Proposition 9 actually implies that

(A − λI)² = 0.    (7)
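Algorithm 1 can be sketched in a few lines: pick any w off the eigenline, set v = (A − λI)w, and verify that X(t) = e^{λt}(w + tv) from (5) really solves X′ = AX. The matrix A and the vector w below are illustrative choices of mine, not from the notes:

```python
import math

# Sketch of Algorithm 1 for a 2x2 matrix with a repeated eigenvalue λ of
# geometric multiplicity 1, followed by a finite-difference check of X' = AX.

A = [[2.0, 1.0], [0.0, 2.0]]
lam = 2.0

def matvec(M, x):
    return [M[0][0]*x[0] + M[0][1]*x[1], M[1][0]*x[0] + M[1][1]*x[1]]

B = [[A[0][0]-lam, A[0][1]], [A[1][0], A[1][1]-lam]]   # A - λI

w = [0.0, 1.0]        # any vector not on the eigenline span{(1, 0)}
v = matvec(B, w)      # v := (A - λI)w, here (1, 0), an eigenvector

def X(t):
    """One solution from (5): e^{λt} (w + t v)."""
    return [math.exp(lam*t) * (w[i] + t*v[i]) for i in range(2)]

# Centered finite difference of X at t = 0.5 vs. the right-hand side A X(t).
t, h = 0.5, 1e-6
deriv = [(X(t+h)[i] - X(t-h)[i]) / (2*h) for i in range(2)]
rhs = matvec(A, X(t))
print(all(abs(d - r) < 1e-4 for d, r in zip(deriv, rhs)))  # True
```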

Then, based on (1) and (7),

e^{tA} = e^{λt} (I + t(A − λI)).

Conclusion:

e^{At} = e^{λt} (I + t(A − λI))    (8)

Algorithm 3: Calculate e^{tA} from (8). The columns of the resulting matrix form a fundamental set of solutions.

REMARK 11. Identity (7) is in fact a particular case of the following remarkable result from Linear Algebra, the Cayley-Hamilton theorem: if

det(A − λI) = (−1)ⁿ (λⁿ + a_{n−1} λ^{n−1} + ... + a₁ λ + a₀),    (9)

then

Aⁿ + a_{n−1} A^{n−1} + ... + a₁ A + a₀ I = 0.    (10)

In other words, if one substitutes the matrix A for λ and a₀I for a₀ into the polynomial in (9), one gets the zero matrix.

12. Example. Find the general solution of the system:

x₁′ = −3x₁ + (5/2) x₂
x₂′ = −(5/2) x₁ + 2x₂
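Algorithm 3 can be checked numerically on the example of item 12. A sketch assuming A = [[−3, 5/2], [−5/2, 2]] (repeated eigenvalue λ = −1/2): it first confirms identity (7), then compares formula (8) with a truncated power series for e^{tA}:

```python
import math

# Sketch: verify (A - λI)^2 = 0 and e^{tA} = e^{λt}(I + t(A - λI)) for the
# assumed example matrix A with repeated eigenvalue λ = -1/2.

A = [[-3.0, 2.5], [-2.5, 2.0]]
lam = -0.5

def matmul(X, Y):
    return [[sum(X[i][k]*Y[k][j] for k in range(2)) for j in range(2)]
            for i in range(2)]

B = [[A[0][0]-lam, A[0][1]], [A[1][0], A[1][1]-lam]]   # A - λI
B2 = matmul(B, B)
print(B2)  # [[0.0, 0.0], [0.0, 0.0]]  -- identity (7)

def expm_series(M, t, terms=80):
    """Truncated power series for e^{tM}: sum of t^k M^k / k!."""
    out = [[1.0, 0.0], [0.0, 1.0]]
    term = [[1.0, 0.0], [0.0, 1.0]]
    for k in range(1, terms):
        term = [[t/k * c for c in row] for row in matmul(term, M)]
        out = [[out[i][j] + term[i][j] for j in range(2)] for i in range(2)]
    return out

t = 0.8
closed = [[math.exp(lam*t) * ((1.0 if i == j else 0.0) + t*B[i][j])
           for j in range(2)] for i in range(2)]   # formula (8)
series = expm_series(A, t)
print(all(abs(closed[i][j] - series[i][j]) < 1e-9
          for i in range(2) for j in range(2)))    # True
```

The columns of `closed` are then a fundamental set of solutions of the example system, as Algorithm 3 states.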

13. Now return to a second-order linear homogeneous equation

a y″ + b y′ + c y = 0    (11)

with a repeated root λ of its characteristic polynomial. How can one explain that {e^{λt}, t e^{λt}} is a fundamental set of solutions from the theory of systems of first-order equations with repeated eigenvalues?

Consider the corresponding system

x₁′ = x₂
x₂′ = −(c/a) x₁ − (b/a) x₂.    (12)

The point here is that one can take

v = (1, λ)ᵀ,  w = (0, 1)ᵀ,

so the first component of e^{λt} v is equal to e^{λt} and the first component of e^{λt} (w + tv) is equal to t e^{λt}.
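The explanation above can be checked with concrete numbers. A sketch using the illustrative coefficients a = 1, b = 2, c = 1 (my choice, giving the repeated root λ = −1): the first component of e^{λt}(w + tv) is t e^{λt}, and it indeed satisfies (11):

```python
import math

# Sketch: with a = 1, b = 2, c = 1 (repeated root λ = -1), the companion-
# system solution e^{λt}(w + t v) with v = (1, λ), w = (0, 1) has first
# component t e^{λt}, which solves a y'' + b y' + c y = 0.

a, b, c = 1.0, 2.0, 1.0
lam = -b / (2*a)                 # repeated root of a r^2 + b r + c = 0
v, w = [1.0, lam], [0.0, 1.0]

def y(t):
    """First component of e^{λt} (w + t v); equals t e^{λt} here."""
    return math.exp(lam*t) * (w[0] + t*v[0])

# Finite-difference check of (11) at t = 1.3.
t, h = 1.3, 1e-5
yp  = (y(t+h) - y(t-h)) / (2*h)            # y'(t)
ypp = (y(t+h) - 2*y(t) + y(t-h)) / (h*h)   # y''(t)
print(abs(a*ypp + b*yp + c*y(t)) < 1e-5)   # True
```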