Eigenvalues and Eigenvectors

LECTURE 3: Eigenvalues and Eigenvectors

Definition 3.1. Let A be an n x n matrix. The eigenvalue-eigenvector problem for A is the problem of finding numbers λ and vectors v in R^n such that

    Av = λv

If λ, v are solutions of an eigenvector-eigenvalue problem, then the vector v is called an eigenvector of A and λ is called an eigenvalue of A.

Remark 3.2. Because eigenvectors and eigenvalues always come in pairs (λ, v), one often uses language like "λ is the eigenvalue of the vector v" or "v is an eigenvector of A with eigenvalue λ".

Note that the multiplication on the left-hand side of Av = λv is matrix multiplication (complicated), while the multiplication on the right-hand side is scalar multiplication (easy). Thus, one way of thinking about the eigenvector-eigenvalue problem is that, once a solution is found, we can simplify matrix multiplication. I remark that in so doing, we'll also get a better idea of what is actually going on in many problems.

1. Motivating Examples

1.1. Rotations and Axes of Rotation. Suppose R is a 3 x 3 matrix with the property that its transpose equals its inverse:

    R^T = R^(-1)

Such a matrix will define a linear transformation T_R : R^3 → R^3 that preserves the lengths of vectors. For, since

    ||x||^2 = x^T x = (x_1)^2 + (x_2)^2 + (x_3)^2,

if we set

    T_R(x) = Rx

we'll have

    ||Rx||^2 = (Rx)^T (Rx) = x^T R^T R x = x^T R^(-1) R x = x^T x

We call such matrices rotation matrices, because these are exactly the 3 x 3 matrices that implement rotations of 3-dimensional vectors. Now one property of rotations is that they always have an axis of rotation: there is always one line in R^3 that is left undisturbed by the rotation. We note, for example, that in the case of the planet Earth, the line that passes through the north and south poles is unaffected by its daily rotation. Given an arbitrary rotation matrix R, one can thus ask: how do we figure out its axis of rotation? Well, the condition we'd want is that

    Rx = (1) x

That is, we'd want to find an eigenvector x of R with eigenvalue 1. Once we find such a vector, we'll have the corresponding axis of rotation.
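
As a quick numerical illustration of the two facts just used (length preservation, and the axis of rotation as an eigenvalue-1 eigenvector), here is a minimal Python/numpy sketch. The particular matrix R is a hypothetical example (a quarter turn about the z-axis), not one taken from the lecture.

    import numpy as np

    # A hypothetical rotation matrix: a quarter turn about the z-axis.
    R = np.array([[0.0, -1.0, 0.0],
                  [1.0,  0.0, 0.0],
                  [0.0,  0.0, 1.0]])

    # Its transpose is its inverse ...
    print(np.allclose(R.T @ R, np.eye(3)))             # True

    # ... so it preserves lengths: ||Rx|| = ||x||.
    x = np.array([1.0, 2.0, 3.0])
    print(np.linalg.norm(R @ x), np.linalg.norm(x))    # both sqrt(14)

    # Its axis of rotation (here the z-axis) satisfies Rx = (1)x.
    e3 = np.array([0.0, 0.0, 1.0])
    print(np.allclose(R @ e3, e3))                     # True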

Example 3.3. Consider the following linear transformation T : R^3 → R^3:

    T([x, y, z]) = [x, y cos(θ) + z sin(θ), -y sin(θ) + z cos(θ)]

The corresponding matrix is

    A_T = [ 1     0         0      ]
          [ 0   cos(θ)    sin(θ)  ]
          [ 0  -sin(θ)    cos(θ)  ]

This transformation corresponds to a rotation about the x-axis by θ.

Question 3.4. What vectors in R^3 are unaffected by this transformation?

This question is equivalent to solving the linear system

(3.1)    A_T x = x

or

    (A_T - I) x = 0

or

    [ 0       0             0        ] [ x ]   [ 0 ]
    [ 0   cos(θ) - 1     sin(θ)    ] [ y ] = [ 0 ]
    [ 0   -sin(θ)      cos(θ) - 1  ] [ z ]   [ 0 ]

Row-reducing the matrix on the left-hand side (for θ not a multiple of 2π) yields

    x is arbitrary,   y = 0,   z = 0

In other words, the transformation T leaves the points along the x-axis unaffected, and only those points remain unaffected by the transformation.

Perhaps the preceding example is so simple-minded that the importance of the eigenvector-eigenvalue problem is obscured; surely a rotation about a coordinate axis will leave the points along that axis fixed. However, consider the linear transformation T : R^3 → R^3 that combines a rotation about the z-axis with a rotation about the x-axis:

    T : [ x ]      [ 1     0         0      ] [  cos(θ)   sin(θ)   0 ] [ x ]
        [ y ]  →   [ 0   cos(φ)    sin(φ)  ] [ -sin(θ)   cos(θ)   0 ] [ y ]
        [ z ]      [ 0  -sin(φ)    cos(φ)  ] [    0        0      1 ] [ z ]

                   [  (cos θ) x + (sin θ) y                          ]
               =   [ -(cos φ sin θ) x + (cos φ cos θ) y + (sin φ) z  ]
                   [  (sin φ sin θ) x - (sin φ cos θ) y + (cos φ) z  ]

This is in fact another rotation; but which rotation? What is its axis of rotation? The answer can be found by solving

    T x = (1) x

i.e., by solving an eigenvector-eigenvalue problem.
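
For this combined rotation the fixed axis is not obvious, but it can be computed numerically by solving Tx = (1)x, i.e. by asking an eigenvalue routine for the eigenvector with eigenvalue 1. The numpy sketch below uses the same sign conventions as the matrices above; the angles θ and φ are arbitrary values chosen only for illustration.

    import numpy as np

    theta, phi = 0.7, 0.3   # illustrative angles, in radians

    # Rotation about the z-axis, then rotation about the x-axis
    # (same sign conventions as the matrices displayed above).
    Rz = np.array([[ np.cos(theta), np.sin(theta), 0.0],
                   [-np.sin(theta), np.cos(theta), 0.0],
                   [ 0.0,           0.0,           1.0]])
    Rx = np.array([[1.0,  0.0,          0.0         ],
                   [0.0,  np.cos(phi),  np.sin(phi) ],
                   [0.0, -np.sin(phi),  np.cos(phi) ]])
    A = Rx @ Rz

    # The axis of rotation is an eigenvector with eigenvalue 1.
    vals, vecs = np.linalg.eig(A)
    axis = vecs[:, np.isclose(vals, 1.0)].real.ravel()
    print(axis / np.linalg.norm(axis))   # a unit vector along the fixed axis
    print(np.allclose(A @ axis, axis))   # True: the axis is left undisturbed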

Example 3.5. Let C^∞(R) denote the vector space of smooth (infinitely differentiable) functions on the real line. Consider the transformation D : C^∞(R) → C^∞(R) defined by

    D[f] = df/dx

It is easy to see that this is a linear transformation (it preserves scalar multiplication and vector addition in C^∞(R)). We can associate with D an infinite matrix, employing the simple monomials in x,

    {1, x, x^2, x^3, ...}

as a standard basis for C^∞(R):

    D[1] = 0
    D[x] = 1
    D[x^2] = 2x
    D[x^3] = 3x^2
    ...

    A_D = [ 0  1  0  0  ... ]
          [ 0  0  2  0  ... ]
          [ 0  0  0  3  ... ]
          [ :  :  :  :      ]

Solution of a differential equation of the form

    df/dx = λf

is then equivalent to a linear system of the form

(3.2)    A_D f = λf

Note the similarity of this equation with (3.1). In terms of these new semantics, the first example shows that the vectors along the x-axis are eigenvectors of the linear transformation T with eigenvalue 1; and the second example shows that the solution of a differential equation can be thought of as the problem of finding an eigenvector for a linear transformation on the vector space of smooth functions on R.
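
To make the infinite matrix A_D from Example 3.5 a little more concrete, here is a small numpy sketch that builds a truncated version of A_D on the finite basis {1, x, x^2, x^3, x^4} and checks that it reproduces D[x^3] = 3x^2. The truncation size is an arbitrary choice made only for illustration; the true A_D is infinite.

    import numpy as np

    n = 5   # truncate to polynomials of degree < 5, basis {1, x, x^2, x^3, x^4}

    # Truncated matrix of D = d/dx in the monomial basis:
    # column j holds the coordinates of D[x^j] = j x^(j-1).
    A_D = np.zeros((n, n))
    for j in range(1, n):
        A_D[j - 1, j] = j

    # Coordinates of f(x) = x^3 in the basis {1, x, x^2, x^3, x^4}.
    f = np.array([0.0, 0.0, 0.0, 1.0, 0.0])

    print(A_D @ f)   # [0. 0. 3. 0. 0.] -- the coordinates of 3x^2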

2. Computing Eigenvalues

For a given n x n matrix A, the eigenvalue problem is the problem of finding the eigenvalues and eigenvectors of A. Our abstract machinery now plays a crucial role. Before finding a solution v of

(3.3)    Av = λv,   λ in R

we first ask: when do non-trivial solutions exist? First we rewrite (3.3) as

    Av - λv = 0

or

    (A - λI) v = 0

Now recall that demanding the existence of non-zero solutions to a linear system Bx = 0 is equivalent to demanding that the null space of B is at least 1-dimensional, which is equivalent to saying that when B is row-reduced to row echelon form there is at least one row full of zeros, which implies det(B) = 0. Thus, we arrive at the following criterion for the existence of eigenvectors and eigenvalues.

Assertion. If non-trivial solutions of Av = λv exist, then det(A - λI) = 0.

Example 3.6. Find the possible eigenvalues of the matrix

    A = [ 3  2 ]
        [ 2  0 ]

If A is to have eigenvalues, then we must have

    0 = det(A - λI) = det [ 3 - λ    2 ]  = (3 - λ)(-λ) - 4 = λ^2 - 3λ - 4 = (λ - 4)(λ + 1)
                          [   2     -λ ]

Hence, the possible eigenvalues are 4 and -1.

This simple example actually serves as the prototype for the general case.

Fact 3.7. To find the eigenvalues of an n x n matrix A:
(1) Compute the determinant of the matrix H = A - λI, regarding λ as a symbolic variable.
(2) det(A - λI) will be a polynomial P_A(λ) of degree n in λ. We shall call this polynomial the characteristic polynomial of A.
(3) Solve P_A(λ) = 0. The roots of this equation will be the possible eigenvalues of A.

At this point it is worthwhile to recall the Fundamental Theorem of Algebra.

Theorem 3.8. Every polynomial p(λ) of degree n has a unique factorization in terms of linear polynomials:

    p(λ) = c (λ - r_1)^(m_1) (λ - r_2)^(m_2) ... (λ - r_s)^(m_s)

where r_1, r_2, ..., r_s are distinct and m_1 + m_2 + ... + m_s = n. The numbers r_i are, in general, complex numbers and are called the roots of p(λ); they coincide precisely with the solutions of p(λ) = 0. The integers m_i are called the multiplicities of the corresponding roots.

This theorem tells us that the eigenvalues of a matrix can be complex numbers (in fact, this is what one must expect in general), and that they may occur with some multiplicity. The multiplicity of an eigenvalue will be very relevant to the problem of diagonalizing matrices (which we will take up in the next lecture).

Example 3.9. Find the possible eigenvalues of

    A = [  0  2 ]
        [ -2  0 ]

We have

    det(A - λI) = det [ -λ   2 ]  = λ^2 + 4
                      [ -2  -λ ]

The characteristic polynomial is thus

    P_A(λ) = λ^2 + 4 = (λ + 2i)(λ - 2i)

Thus, the possible eigenvalues are λ = ±2i.
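
The three-step recipe in Fact 3.7 can be carried out symbolically. The sketch below uses sympy (an assumption about available tooling, not something used in the lecture itself) to reproduce the characteristic polynomials and eigenvalues of Examples 3.6 and 3.9.

    import sympy as sp

    lam = sp.symbols('lambda')

    # Example 3.6: A = [[3, 2], [2, 0]]
    A = sp.Matrix([[3, 2], [2, 0]])
    p_A = (A - lam * sp.eye(2)).det()
    print(sp.factor(p_A))       # (lambda - 4)*(lambda + 1)
    print(sp.solve(p_A, lam))   # [-1, 4]

    # Example 3.9: A = [[0, 2], [-2, 0]] has purely imaginary eigenvalues
    B = sp.Matrix([[0, 2], [-2, 0]])
    p_B = (B - lam * sp.eye(2)).det()
    print(sp.factor(p_B, extension=sp.I))   # (lambda - 2*I)*(lambda + 2*I)
    print(sp.solve(p_B, lam))               # [-2*I, 2*I]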

Example 3.10. Find the possible eigenvalues of

    A = [ 0  0   1 ]
        [ 0  1   2 ]
        [ 2  0  -1 ]

We have

    det(A - λI) = det [ -λ    0        1    ]
                      [  0  1 - λ      2    ]
                      [  2    0    -1 - λ   ]

                = (-λ) det [ 1 - λ     2    ]  - (0) det [ 0     2    ]  + (1) det [ 0  1 - λ ]
                           [   0    -1 - λ  ]            [ 2  -1 - λ  ]            [ 2    0   ]

                = λ(1 - λ)(1 + λ) + (-2)(1 - λ)

                = 3λ - λ^3 - 2

                = -(2 + λ)(λ - 1)^2

We thus see that A has two possible eigenvalues, λ = -2 and λ = 1. The eigenvalue λ = 1 occurs with multiplicity 2.

3. Computation of Eigenvectors

In finding the eigenvalues of a matrix, all we have accomplished is figuring out for what values of λ

(3.4)    Av = λv

can have a non-trivial solution. I will now show you how to find the corresponding eigenvectors v. This is very straightforward given our recent discussion of homogeneous linear systems, for finding vectors v such that equation (3.4) is satisfied is equivalent to finding solutions of

    (A - λI) x = 0

Let's do a quick example.

Example 3.11. Find the eigenvectors of

    A = [ 3  2 ]
        [ 2  0 ]

In Example 3.6, we discovered that the possible eigenvalues of this matrix are λ = 4, -1. Let's consider first the solutions of

(3.5)    (A - 4I) x = 0

(whose non-trivial solutions are the eigenvectors of A corresponding to the eigenvalue λ = 4). We have

    A - 4I = [ 3 - 4    2   ]  = [ -1   2 ]
             [   2    0 - 4 ]    [  2  -4 ]

This matrix row reduces to

    [ 1  -2 ]
    [ 0   0 ]

and so the solutions of (3.5) coincide with the solutions of

    x_1 - 2x_2 = 0
             0 = 0

Thus, any vector of the form

    x = [ 2x_2 ]  = x_2 [ 2 ]
        [  x_2 ]        [ 1 ]

will be an eigenvector of A with eigenvalue 4; that is,

    v = r [ 2 ] ,   r in R
          [ 1 ]

Now let's look for eigenvectors corresponding to the eigenvalue -1. We thus look for solutions of

(3.6)    (A + I) x = 0

We have

    A + I = [ 3 + 1    2   ]  = [ 4  2 ]
            [   2    0 + 1 ]    [ 2  1 ]

which is row-equivalent to

    [ 2  1 ]
    [ 0  0 ]

So the solutions of (3.6) coincide with the solutions of

    2x_1 + x_2 = 0
             0 = 0

that is, x_1 = -(1/2) x_2, and so

    x = x_2 [ -1/2 ]
            [   1  ]

So the corresponding eigenvectors will be vectors of the form

    v = r [ -1/2 ] ,   r in R
          [   1  ]
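
Since finding the eigenvectors for a given eigenvalue is just solving the homogeneous system (A - λI)x = 0, the hand computation in Example 3.11 can be double-checked with sympy's nullspace routine; this is the same method, merely automated.

    import sympy as sp

    A = sp.Matrix([[3, 2], [2, 0]])

    # Eigenvectors for lambda = 4: the null space of A - 4I
    print((A - 4 * sp.eye(2)).nullspace())   # [Matrix([[2], [1]])]

    # Eigenvectors for lambda = -1: the null space of A + I
    print((A + sp.eye(2)).nullspace())       # [Matrix([[-1/2], [1]])]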

Example 3.12. Find the eigenvectors of

    A = [ 0  0   1 ]
        [ 0  1   2 ]
        [ 2  0  -1 ]

Recall from Example 3.10 that the characteristic polynomial of this matrix factors as

    P_A(λ) = -(2 + λ)(λ - 1)^2

and so we have two possible eigenvalues: λ = -2 with multiplicity 1, and λ = 1 with multiplicity 2.

Let's look first for eigenvectors corresponding to the eigenvalue λ = -2. We have

    A - (-2)I = [ 2  0  1 ]   which row reduces to   [ 2  0  1 ]
                [ 0  3  2 ]                          [ 0  3  2 ]
                [ 2  0  1 ]                          [ 0  0  0 ]

and so the solutions x of (A + 2I) x = 0 satisfy

    2x_1 + x_3 = 0
    3x_2 + 2x_3 = 0

that is,

    x = [ -(1/2) x_3 ]  = x_3 [ -1/2 ]
        [ -(2/3) x_3 ]        [ -2/3 ]
        [     x_3    ]        [   1  ]

So the eigenvectors of A corresponding to the eigenvalue -2 are scalar multiples of the vector [ -1/2, -2/3, 1 ]; that is, the (-2)-eigenvectors fill out span{ [ -1/2, -2/3, 1 ] }.

We'll now repeat the calculation for the double root λ = 1. We have

    A - (1)I = [ -1  0   1 ]   which row reduces to   [ 1  0  0 ]
               [  0  0   2 ]                          [ 0  0  1 ]
               [  2  0  -2 ]                          [ 0  0  0 ]

and so the solutions x of (A - I) x = 0 satisfy

    x_1 = 0
    x_3 = 0

that is,

    x = x_2 [ 0 ]
            [ 1 ]
            [ 0 ]

So the eigenvectors of A corresponding to the eigenvalue 1 are scalar multiples of the vector [ 0, 1, 0 ].
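
Here is the analogous sympy check for the 3 x 3 matrix of Examples 3.10 and 3.12, again computing each eigenspace as a null space.

    import sympy as sp

    A = sp.Matrix([[0, 0,  1],
                   [0, 1,  2],
                   [2, 0, -1]])

    # Eigenvalue -2: a one-dimensional eigenspace spanned by (-1/2, -2/3, 1)
    print((A + 2 * sp.eye(3)).nullspace())

    # Eigenvalue 1 (a double root of the characteristic polynomial):
    # the eigenspace is still only one-dimensional, spanned by (0, 1, 0)
    print((A - sp.eye(3)).nullspace())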

4. Eigenspaces

Definition 3.13. Let A be an n x n matrix and suppose r in R is an eigenvalue of A. The r-eigenspace of A is the subset

    { v in R^n : Av = rv }

Lemma 3.14. Let A be an n x n matrix and suppose r in R is an eigenvalue of A. The r-eigenspace of A is a subspace of R^n.

Proof. This can be seen two ways. First of all, suppose r is an eigenvalue of A and v_1, v_2 lie in the r-eigenspace of A. Then Av_1 = rv_1 and Av_2 = rv_2, and so

    A(v_1 + v_2) = Av_1 + Av_2 = rv_1 + rv_2 = r(v_1 + v_2)

Thus, v_1 + v_2 is also in the r-eigenspace of A. We also have

    A(cv_1) = cAv_1 = crv_1 = r(cv_1)

Since the r-eigenspace of A is closed under both vector addition and scalar multiplication, it is a subspace. Even more easily,

    v lies in the r-eigenspace of A  ⟺  Av = rv  ⟺  v is a solution of (A - rI)x = 0

So

    r-eigenspace of A = NullSp(A - rI) = a subspace of R^n

5. Two Kinds of Eigenvalue Multiplicities

Here I want to introduce some nomenclature that will become important later. Suppose r is an eigenvalue of an n x n matrix A. This means that λ = r is a root of

    det(A - λI) = 0

We call the left-hand side of this equation the characteristic polynomial of A, and we shall denote it by p_A(λ). Thus,

    p_A(λ) = det(A - λI)

Since r is a root of p_A(λ) = 0, it follows from the Fundamental Theorem of Algebra that (λ - r) is a factor of p_A(λ). In fact, one has

Lemma 3.15. If r is an eigenvalue of an n x n matrix A, there is an integer m ≥ 1 such that

    p_A(λ) = (λ - r)^m Q(λ)

with Q(λ) a polynomial of degree n - m such that Q(r) ≠ 0.

We call the integer m in the lemma the algebraic multiplicity of the eigenvalue r.

Suppose again that r is an eigenvalue of an n x n matrix A. The dimension of the r-eigenspace of A is called the geometric multiplicity of the eigenvalue r.
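
In practice, the "even more easily" argument in the proof of Lemma 3.14 is also the computational recipe: the r-eigenspace is NullSp(A - rI), and its dimension is the geometric multiplicity just defined. The sketch below runs this on a hypothetical 3 x 3 matrix (chosen only for illustration, not taken from the lecture) whose 2-eigenspace turns out to be two-dimensional, and checks the closure property used in the proof.

    import sympy as sp

    # A hypothetical matrix whose 2-eigenspace is two-dimensional.
    A = sp.Matrix([[3, 1, 1],
                   [1, 3, 1],
                   [1, 1, 3]])
    r = 2

    # The r-eigenspace of A is NullSp(A - rI).
    basis = (A - r * sp.eye(3)).nullspace()
    print(len(basis))   # 2 -- the geometric multiplicity of the eigenvalue 2

    # Any linear combination of the basis vectors again satisfies Av = rv,
    # confirming that the eigenspace is a subspace.
    c1, c2 = sp.symbols('c1 c2')
    v = c1 * basis[0] + c2 * basis[1]
    print(sp.simplify(A * v - r * v))   # the zero vector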

Example 3.16. Consider the matrix

    A = [ 3  1  0 ]
        [ 0  3  0 ]
        [ 0  0  2 ]

Find the eigenvalues of A and their algebraic and geometric multiplicities.

Well, first let's find the eigenvalues of A. Its characteristic polynomial is

    p_A(λ) = det(A - λI) = det [ 3 - λ    1      0   ]
                               [   0    3 - λ    0   ]
                               [   0      0    2 - λ ]

Since the matrix A - λI is upper triangular (already in row echelon form), its determinant is just the product of its diagonal elements. Thus,

    p_A(λ) = (3 - λ)^2 (2 - λ)

From this we see that λ = 3 is an eigenvalue with algebraic multiplicity 2 and λ = 2 is an eigenvalue with algebraic multiplicity 1. To determine the geometric multiplicities we need to determine the dimensions of the corresponding eigenspaces.

λ = 3 eigenspace. The 3-eigenspace is the null space of

    A - 3I = [ 0  1   0 ]
             [ 0  0   0 ]
             [ 0  0  -1 ]

Row-reducing this matrix (pretty easily) we get

    3-eigenspace = NullSp [ 0  1  0 ]  = solution set of { x_2 = 0, x_3 = 0 }  = span{ [ 1, 0, 0 ] }
                          [ 0  0  1 ]
                          [ 0  0  0 ]

Since we have one basis vector, the dimension of the 3-eigenspace is 1. Thus, the geometric multiplicity of λ = 3 is 1. (Recall its algebraic multiplicity was 2.)

λ = 2 eigenspace. The 2-eigenspace is the solution set of (A - 2I)x = 0. We have

    A - 2I = [ 1  1  0 ]   which row reduces to   [ 1  0  0 ]
             [ 0  1  0 ]                          [ 0  1  0 ]
             [ 0  0  0 ]                          [ 0  0  0 ]

and so the null space corresponds to the solutions of

    x_1 = 0
    x_2 = 0

that is,

    x = x_3 [ 0 ]
            [ 0 ]
            [ 1 ]

so the 2-eigenspace is span{ [ 0, 0, 1 ] }. Since we have one basis vector, the 2-eigenspace is 1-dimensional. Hence, the geometric multiplicity of the eigenvalue λ = 2 is 1.

Remark 3.17. Actually, to determine the geometric multiplicity of an eigenvalue r, all we really have to do is row-reduce A - rI to row echelon form and count the number of columns without pivots; that count gives the dimension of NullSp(A - rI), which is the same thing as the geometric multiplicity of the eigenvalue r.
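
Both kinds of multiplicity in Example 3.16 can be recovered with sympy as a sanity check of the hand computation: the algebraic multiplicities are the exponents in the factored characteristic polynomial, and the geometric multiplicities are the number of basis vectors returned for NullSp(A - rI) (equivalently, the number of pivot-free columns mentioned in Remark 3.17).

    import sympy as sp

    lam = sp.symbols('lambda')

    A = sp.Matrix([[3, 1, 0],
                   [0, 3, 0],
                   [0, 0, 2]])

    # Algebraic multiplicities: exponents in the factored characteristic polynomial.
    p_A = (A - lam * sp.eye(3)).det()
    print(sp.factor(p_A))   # -(lambda - 3)**2 * (lambda - 2)

    # Geometric multiplicities: dimensions of the eigenspaces NullSp(A - rI).
    for r in (3, 2):
        print(r, len((A - r * sp.eye(3)).nullspace()))   # both eigenspaces are 1-dimensional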