Eigenvalues, Eigenvectors, and Diagonalization


Week 12

12.1 Opening Remarks

12.1.1 Predicting the Weather, Again

View at edX

Let us revisit the example from Week 4, in which we had a simple model for predicting the weather. Again, the following table tells us how the weather on any day (e.g., today) predicts the weather on the next day (e.g., tomorrow):

                            Today
                     sunny   cloudy   rainy
  Tomorrow  sunny     0.4      0.3     0.1
            cloudy    0.4      0.3     0.6
            rainy     0.2      0.4     0.3

This table is interpreted as follows: if today is rainy, then the probability that it will be cloudy tomorrow is 0.6, etc.

We introduced some notation:

  Let $\chi_s^{(k)}$ denote the probability that it will be sunny $k$ days from now (on day $k$).
  Let $\chi_c^{(k)}$ denote the probability that it will be cloudy $k$ days from now.
  Let $\chi_r^{(k)}$ denote the probability that it will be rainy $k$ days from now.

We then saw that predicting the weather for day $k+1$ based on the prediction for day $k$ was given by the system of linear equations

  $\chi_s^{(k+1)} = 0.4\,\chi_s^{(k)} + 0.3\,\chi_c^{(k)} + 0.1\,\chi_r^{(k)}$
  $\chi_c^{(k+1)} = 0.4\,\chi_s^{(k)} + 0.3\,\chi_c^{(k)} + 0.6\,\chi_r^{(k)}$
  $\chi_r^{(k+1)} = 0.2\,\chi_s^{(k)} + 0.4\,\chi_c^{(k)} + 0.3\,\chi_r^{(k)},$

which could then be written in matrix form as

  $x^{(k)} = \begin{pmatrix} \chi_s^{(k)} \\ \chi_c^{(k)} \\ \chi_r^{(k)} \end{pmatrix}$ and $P = \begin{pmatrix} 0.4 & 0.3 & 0.1 \\ 0.4 & 0.3 & 0.6 \\ 0.2 & 0.4 & 0.3 \end{pmatrix}$

so that

  $\begin{pmatrix} \chi_s^{(k+1)} \\ \chi_c^{(k+1)} \\ \chi_r^{(k+1)} \end{pmatrix} = \begin{pmatrix} 0.4 & 0.3 & 0.1 \\ 0.4 & 0.3 & 0.6 \\ 0.2 & 0.4 & 0.3 \end{pmatrix} \begin{pmatrix} \chi_s^{(k)} \\ \chi_c^{(k)} \\ \chi_r^{(k)} \end{pmatrix}$, or $x^{(k+1)} = P x^{(k)}$.

Now, if we start with day zero being cloudy, then the predictions for the first two weeks are given by a table listing, for each day 0 through 14, the probabilities of sunny, cloudy, and rainy weather. (The numerical entries of that table were lost in transcription; the sketch below regenerates them.)
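A minimal Octave/MATLAB sketch of this iteration (this script is ours, not one of the course's .m files); it reproduces the day-by-day predictions, starting from a cloudy day zero:

  P = [ 0.4 0.3 0.1
        0.4 0.3 0.6
        0.2 0.4 0.3 ];
  x = [ 0; 1; 0 ];                 % day zero is cloudy
  for k = 0:14
      fprintf( '%2d  %6.4f  %6.4f  %6.4f\n', k, x(1), x(2), x(3) );
      x = P * x;                   % x(k+1) = P x(k)
  end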

What you notice is that eventually $x^{(k+1)} = P x^{(k)} \approx x^{(k)}$. What this means is that there is a vector $x$ such that $P x = x$. Such a vector (if it is non-zero) is known as an eigenvector. In this example, it represents the long-term prediction of the weather. Or, in other words, a description of "typical weather": approximately 26% of the time it is sunny, 42% of the time it is cloudy, and 32% of the time rainy.

The question now is: How can we compute such vectors?

Some observations:

  $Px = x$ means that $Px - x = 0$, which in turn means that $(P - I)x = 0$.
  This means that $x$ is a vector in the null space of $P - I$: $x \in \mathcal{N}(P - I)$.
  But we know how to find vectors in the null space of a matrix. You reduce a system to row echelon form, identify the free variables, etc.
  But we also know that a nonzero vector in the null space is not unique.
  In this particular case, we know two more pieces of information:
    The components of $x$ must be nonnegative (a negative probability does not make sense).
    The components of $x$ must add to one (the probabilities must add to one).

The above example can be stated as a more general problem:

  $Ax = \lambda x$,

which is known as the (algebraic) eigenvalue problem. Scalars $\lambda$ that satisfy $Ax = \lambda x$ for a nonzero vector $x$ are known as eigenvalues, while the nonzero vectors are known as eigenvectors.

From the table above we can answer questions like "what is the typical weather?" (Answer: cloudy.) An approach similar to what we demonstrated in this unit is used, for example, to answer questions like "what is the most frequently visited webpage on a given topic?"
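The observations above translate directly into a computation. A minimal sketch, assuming Octave/MATLAB's built-in null (which returns a basis for the null space; for this theoretically singular $P - I$ its default tolerance should detect the one-dimensional null space):

  P = [ 0.4 0.3 0.1
        0.4 0.3 0.6
        0.2 0.4 0.3 ];
  x = null( P - eye(3) );   % nonzero vector with (P - I) x = 0
  x = x / sum( x )          % rescale so the components add to one

Rescaling by the sum picks out the eigenvector whose components are nonnegative and add to one; it reproduces the roughly 26% / 42% / 32% split quoted above.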

12.1.2 Outline

12.1 Opening Remarks
    12.1.1 Predicting the Weather, Again
    12.1.2 Outline
    12.1.3 What You Will Learn
12.2 Getting Started
    12.2.1 The Algebraic Eigenvalue Problem
    12.2.2 Simple Examples
    12.2.3 Diagonalizing
    12.2.4 Eigenvalues and Eigenvectors of 3 x 3 Matrices
12.3 The General Case
    12.3.1 Eigenvalues and Eigenvectors of n x n Matrices: Special Cases
    12.3.2 Eigenvalues of n x n Matrices
    12.3.3 Diagonalizing, Again
    12.3.4 Properties of Eigenvalues and Eigenvectors
12.4 Practical Methods for Computing Eigenvectors and Eigenvalues
    12.4.1 Predicting the Weather, One Last Time
    12.4.2 The Power Method
    12.4.3 In Preparation for this Week's Enrichment
12.5 Enrichment
    12.5.1 The Inverse Power Method
    12.5.2 The Rayleigh Quotient Iteration
    12.5.3 More Advanced Techniques
12.6 Wrap Up
    12.6.1 Homework
    12.6.2 Summary

12.1.3 What You Will Learn

Upon completion of this unit, you should be able to

  Determine whether a given vector is an eigenvector for a particular matrix.
  Find the eigenvalues and eigenvectors for small-sized matrices.
  Identify eigenvalues of special matrices such as the zero matrix, the identity matrix, diagonal matrices, and triangular matrices.
  Interpret an eigenvector of $A$ as a direction in which the "action" of $A$, $Ax$, is equivalent to $x$ being scaled without changing its direction. (Here scaling by a negative value still leaves the vector in the same direction.) Since this is true for any scalar multiple of $x$, it is the direction that is important, not the length of $x$.
  Compute the characteristic polynomial for $2 \times 2$ and $3 \times 3$ matrices.
  Know and apply the property that a matrix has an inverse if and only if its determinant is nonzero.
  Know and apply how the roots of the characteristic polynomial are related to the eigenvalues of a matrix.
  Recognize that if a matrix is real valued, then its characteristic polynomial has real valued coefficients but may still have complex eigenvalues that occur in conjugate pairs.
  Link diagonalization of a matrix with the eigenvalues and eigenvectors of that matrix.
  Make conjectures, reason, and develop arguments about properties of eigenvalues and eigenvectors.
  Understand practical algorithms for finding eigenvalues and eigenvectors, such as the power method for finding an eigenvector associated with the largest eigenvalue (in magnitude).

12.2 Getting Started

12.2.1 The Algebraic Eigenvalue Problem

View at edX

The algebraic eigenvalue problem is given by

  $Ax = \lambda x$,

where $A \in \mathbb{R}^{n \times n}$ is a square matrix, $\lambda$ is a scalar, and $x$ is a nonzero vector. Our goal is, given matrix $A$, to compute $\lambda$ and $x$. It must be noted from the beginning that $\lambda$ may be a complex number and that $x$ will have complex components if $\lambda$ is complex valued.

If $x \neq 0$, then $\lambda$ is said to be an eigenvalue and $x$ is said to be an eigenvector associated with the eigenvalue $\lambda$. The tuple $(\lambda, x)$ is said to be an eigenpair.

Here are some equivalent statements:

  $Ax = \lambda x$, where $x \neq 0$. (This is the statement of the algebraic eigenvalue problem.)
  $Ax - \lambda x = 0$, where $x \neq 0$. (This is merely a rearrangement of $Ax = \lambda x$.)
  $Ax - \lambda I x = 0$, where $x \neq 0$. (Early in the course we saw that $x = Ix$.)
  $(A - \lambda I)x = 0$, where $x \neq 0$. (This is a matter of factoring $x$ out.)
  $A - \lambda I$ is singular. (Since there is a nonzero vector $x$ such that $(A - \lambda I)x = 0$.)
  $\mathcal{N}(A - \lambda I)$ contains a nonzero vector $x$. (This is a consequence of there being a nonzero vector $x$ such that $(A - \lambda I)x = 0$.)
  $\dim(\mathcal{N}(A - \lambda I)) > 0$. (Since there is a nonzero vector in $\mathcal{N}(A - \lambda I)$, that subspace must have dimension greater than zero.)

If we find a vector $x \neq 0$ such that $Ax = \lambda x$, it is certainly not unique:

  For any scalar $\alpha$, $A(\alpha x) = \lambda(\alpha x)$ also holds.
  If $Ax = \lambda x$ and $Ay = \lambda y$, then $A(x + y) = Ax + Ay = \lambda x + \lambda y = \lambda(x + y)$.

We conclude that the set of all vectors $x$ that satisfy $Ax = \lambda x$ is a subspace.

It is not the case that the set of all vectors $x$ that satisfy $Ax = \lambda x$ is the set of all eigenvectors associated with $\lambda$. After all, the zero vector is in that set, but is not considered an eigenvector.

It is important to think about eigenvalues and eigenvectors in the following way: If $x$ is an eigenvector of $A$, then $x$ is a direction in which the "action" of $A$ (in other words, $Ax$) is equivalent to $x$ being scaled in length without changing its direction (other than changing sign). Here we use the term "length" somewhat liberally, since it can be negative, in which case the direction of $x$ will be exactly the opposite of what it was before. Since this is true for any scalar multiple of $x$, it is the direction that is important, not the magnitude of $x$.

12.2.2 Simple Examples

View at edX

In this unit, we build intuition about eigenvalues and eigenvectors by looking at simple examples.

Homework Which of the following are eigenpairs $(\lambda, x)$ of the $2 \times 2$ zero matrix:

  $\begin{pmatrix} 0 & 0 \\ 0 & 0 \end{pmatrix} x = \lambda x$, where $x \neq 0$.

Mark all correct answers. (The six candidate eigenpairs offered as answer choices did not survive transcription.)

View at edX

Homework Which of the following are eigenpairs $(\lambda, x)$ of the $2 \times 2$ identity matrix:

  $\begin{pmatrix} 1 & 0 \\ 0 & 1 \end{pmatrix} x = \lambda x$, where $x \neq 0$.

Mark all correct answers. (The six candidate eigenpairs offered as answer choices did not survive transcription.)

View at edX

Homework Let $A = \begin{pmatrix} 3 & 0 \\ 0 & -1 \end{pmatrix}$.

  $\begin{pmatrix} 3 & 0 \\ 0 & -1 \end{pmatrix} \begin{pmatrix} 1 \\ 0 \end{pmatrix} = 3 \begin{pmatrix} 1 \\ 0 \end{pmatrix}$, so that $\left(3, \begin{pmatrix} 1 \\ 0 \end{pmatrix}\right)$ is an eigenpair. True/False

  The set of all eigenvectors associated with eigenvalue 3 is characterized by (mark all that apply):
    All vectors $x \neq 0$ that satisfy $Ax = 3x$.
    All vectors $x \neq 0$ that satisfy $(A - 3I)x = 0$.
    All vectors $x \neq 0$ that satisfy $\begin{pmatrix} 0 & 0 \\ 0 & -4 \end{pmatrix} x = 0$.
    All vectors $x = \begin{pmatrix} \chi_0 \\ 0 \end{pmatrix}$, where $\chi_0 \neq 0$ is a scalar.

  $\begin{pmatrix} 3 & 0 \\ 0 & -1 \end{pmatrix} \begin{pmatrix} 0 \\ 1 \end{pmatrix} = -1 \begin{pmatrix} 0 \\ 1 \end{pmatrix}$, so that $\left(-1, \begin{pmatrix} 0 \\ 1 \end{pmatrix}\right)$ is an eigenpair. True/False

View at edX

Homework Consider the diagonal matrix

  $\begin{pmatrix} \delta_0 & & & \\ & \delta_1 & & \\ & & \ddots & \\ & & & \delta_{n-1} \end{pmatrix}$.

Eigenpairs for this matrix are given by $(\delta_0, e_0), (\delta_1, e_1), \ldots, (\delta_{n-1}, e_{n-1})$, where $e_j$ equals the $j$th unit basis vector. Always/Sometimes/Never

View at edX

Homework Which of the following are eigenpairs $(\lambda, x)$ of the $2 \times 2$ triangular matrix:

  $\begin{pmatrix} 3 & 1 \\ 0 & -1 \end{pmatrix} x = \lambda x$, where $x \neq 0$.

Mark all correct answers. (The six candidate eigenpairs offered as answer choices did not survive transcription.)

View at edX

Homework Consider the upper triangular matrix

  $U = \begin{pmatrix} \upsilon_{0,0} & \upsilon_{0,1} & \cdots & \upsilon_{0,n-1} \\ 0 & \upsilon_{1,1} & \cdots & \upsilon_{1,n-1} \\ \vdots & \vdots & \ddots & \vdots \\ 0 & 0 & \cdots & \upsilon_{n-1,n-1} \end{pmatrix}$.

The eigenvalues of this matrix are $\upsilon_{0,0}, \upsilon_{1,1}, \ldots, \upsilon_{n-1,n-1}$. Always/Sometimes/Never

View at edX

Below, on the left we discuss the general case, side-by-side with a specific example on the right.

  General: Consider $Ax = \lambda x$.
  Example: $\begin{pmatrix} 1 & -1 \\ 2 & 4 \end{pmatrix} \begin{pmatrix} \chi_0 \\ \chi_1 \end{pmatrix} = \lambda \begin{pmatrix} \chi_0 \\ \chi_1 \end{pmatrix}$.

  General: Rewrite as $Ax - \lambda x = 0$.
  Example: $\begin{pmatrix} 1 & -1 \\ 2 & 4 \end{pmatrix} \begin{pmatrix} \chi_0 \\ \chi_1 \end{pmatrix} - \lambda \begin{pmatrix} \chi_0 \\ \chi_1 \end{pmatrix} = 0$.

  General: Rewrite as $Ax - \lambda I x = 0$.
  Example: $\begin{pmatrix} 1 & -1 \\ 2 & 4 \end{pmatrix} \begin{pmatrix} \chi_0 \\ \chi_1 \end{pmatrix} - \lambda \begin{pmatrix} 1 & 0 \\ 0 & 1 \end{pmatrix} \begin{pmatrix} \chi_0 \\ \chi_1 \end{pmatrix} = 0$.

  General: Now $[A - \lambda I]x = 0$, where $A - \lambda I$ is the matrix $A$ with $\lambda$ subtracted from its diagonal elements.
  Example: $\begin{pmatrix} 1 - \lambda & -1 \\ 2 & 4 - \lambda \end{pmatrix} \begin{pmatrix} \chi_0 \\ \chi_1 \end{pmatrix} = 0$.

Now $A - \lambda I$ has a nontrivial vector $x$ in its null space if that matrix does not have an inverse. Recall that

  $\begin{pmatrix} \alpha_{0,0} & \alpha_{0,1} \\ \alpha_{1,0} & \alpha_{1,1} \end{pmatrix}^{-1} = \frac{1}{\alpha_{0,0}\alpha_{1,1} - \alpha_{1,0}\alpha_{0,1}} \begin{pmatrix} \alpha_{1,1} & -\alpha_{0,1} \\ -\alpha_{1,0} & \alpha_{0,0} \end{pmatrix}$.

Here the scalar $\alpha_{0,0}\alpha_{1,1} - \alpha_{1,0}\alpha_{0,1}$ is known as the determinant of the $2 \times 2$ matrix $A$, $\det(A)$.

This turns out to be a general statement:

Matrix $A$ has an inverse if and only if its determinant is nonzero. (We have not yet defined the determinant of a matrix of size greater than 2.)

So, the matrix $\begin{pmatrix} 1 - \lambda & -1 \\ 2 & 4 - \lambda \end{pmatrix}$ does not have an inverse if and only if

  $\det\begin{pmatrix} 1 - \lambda & -1 \\ 2 & 4 - \lambda \end{pmatrix} = (1 - \lambda)(4 - \lambda) - (2)(-1) = 0$.

But

  $(1 - \lambda)(4 - \lambda) + 2 = 4 - 5\lambda + \lambda^2 + 2 = \lambda^2 - 5\lambda + 6$.

This is a quadratic (second degree) polynomial, which has at most two distinct roots. In particular, by examination,

  $\lambda^2 - 5\lambda + 6 = (\lambda - 2)(\lambda - 3) = 0$,

so that this matrix has two eigenvalues: $\lambda = 2$ and $\lambda = 3$.

If we now take $\lambda = 2$, then we can determine an eigenvector associated with that eigenvalue:

  $\begin{pmatrix} 1 - 2 & -1 \\ 2 & 4 - 2 \end{pmatrix} \begin{pmatrix} \chi_0 \\ \chi_1 \end{pmatrix} = 0$, or $\begin{pmatrix} -1 & -1 \\ 2 & 2 \end{pmatrix} \begin{pmatrix} \chi_0 \\ \chi_1 \end{pmatrix} = 0$.

By examination, we find that $\begin{pmatrix} \chi_0 \\ \chi_1 \end{pmatrix} = \begin{pmatrix} 1 \\ -1 \end{pmatrix}$ is a vector in the null space and hence an eigenvector associated with the eigenvalue $\lambda = 2$. This is not a unique solution: any vector $\begin{pmatrix} \chi \\ -\chi \end{pmatrix}$ with $\chi \neq 0$ is an eigenvector.

Similarly, if we take $\lambda = 3$, then we can determine an eigenvector associated with that second eigenvalue:

  $\begin{pmatrix} 1 - 3 & -1 \\ 2 & 4 - 3 \end{pmatrix} \begin{pmatrix} \chi_0 \\ \chi_1 \end{pmatrix} = 0$, or $\begin{pmatrix} -2 & -1 \\ 2 & 1 \end{pmatrix} \begin{pmatrix} \chi_0 \\ \chi_1 \end{pmatrix} = 0$.

By examination, we find that $\begin{pmatrix} \chi_0 \\ \chi_1 \end{pmatrix} = \begin{pmatrix} 1 \\ -2 \end{pmatrix}$ is a vector in the null space and hence an eigenvector

associated with the eigenvalue $\lambda = 3$. Again, this is not a unique solution: any vector $\begin{pmatrix} \chi \\ -2\chi \end{pmatrix}$ with $\chi \neq 0$ is an eigenvector.

The above discussion identifies a systematic way for computing eigenvalues and eigenvectors of a $2 \times 2$ matrix:

  Compute $\det\begin{pmatrix} \alpha_{0,0} - \lambda & \alpha_{0,1} \\ \alpha_{1,0} & \alpha_{1,1} - \lambda \end{pmatrix} = (\alpha_{0,0} - \lambda)(\alpha_{1,1} - \lambda) - \alpha_{0,1}\alpha_{1,0}$.

  Recognize that this is a second degree polynomial in $\lambda$. It is called the characteristic polynomial of the matrix $A$, $p_2(\lambda)$.

  Compute the coefficients of $p_2(\lambda)$ so that $p_2(\lambda) = \lambda^2 + \beta\lambda + \gamma$.

  Solve $\lambda^2 + \beta\lambda + \gamma = 0$ for its roots. You can do this either by examination, or by using the quadratic formula:

    $\lambda = \dfrac{-\beta \pm \sqrt{\beta^2 - 4\gamma}}{2}$.

  For each of the roots, find an eigenvector that satisfies

    $\begin{pmatrix} \alpha_{0,0} - \lambda & \alpha_{0,1} \\ \alpha_{1,0} & \alpha_{1,1} - \lambda \end{pmatrix} \begin{pmatrix} \chi_0 \\ \chi_1 \end{pmatrix} = \begin{pmatrix} 0 \\ 0 \end{pmatrix}$.

    The easiest way to do this is to subtract the eigenvalue from the diagonal, set one of the components of $x$ to 1, and then solve for the other component.

  Check your answer! It is a matter of plugging it into $Ax = \lambda x$ and seeing if the computed $\lambda$ and $x$ satisfy the equation.

A $2 \times 2$ matrix yields a characteristic polynomial of degree at most two, and has at most two distinct eigenvalues. A code sketch of this recipe follows.
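A minimal Octave/MATLAB sketch of the recipe, applied to the example matrix from this unit (null extracts a normalized null-space vector, so the printed eigenvectors are scalar multiples of the ones found by examination):

  A = [ 1 -1
        2  4 ];
  beta  = -( A(1,1) + A(2,2) );            % p2(lambda) = lambda^2 + beta*lambda + gamma
  gamma = A(1,1)*A(2,2) - A(1,2)*A(2,1);
  lambdas = ( -beta + [ 1; -1 ] * sqrt( beta^2 - 4*gamma ) ) / 2;   % quadratic formula
  for i = 1:2
      x = null( A - lambdas(i) * eye(2) ); % eigenvector: nonzero vector with (A - lambda I) x = 0
      fprintf( 'lambda = %g,  x = ( %g, %g )\n', lambdas(i), x(1), x(2) );
      norm( A*x - lambdas(i)*x )           % check: should be (near) zero
  end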

Homework Consider a matrix $A$. (The entries of this matrix, and the candidate eigenvectors offered as answer choices, did not survive transcription.)

  The eigenvalue largest in magnitude is ______.
  Which of the following are eigenvectors associated with this largest eigenvalue (in magnitude)?
  The eigenvalue smallest in magnitude is ______.
  Which of the following are eigenvectors associated with this smallest eigenvalue (in magnitude)?

Homework Consider a matrix $A$. (The entries of this matrix did not survive transcription.)

  The eigenvalue largest in magnitude is ______.
  The eigenvalue smallest in magnitude is ______.

Example 12.1 Consider the matrix $A = \begin{pmatrix} 3 & -1 \\ 2 & 1 \end{pmatrix}$. To find the eigenvalues and eigenvectors of this matrix, we form $A - \lambda I = \begin{pmatrix} 3 - \lambda & -1 \\ 2 & 1 - \lambda \end{pmatrix}$ and check when the characteristic polynomial is equal to zero:

  $\det\begin{pmatrix} 3 - \lambda & -1 \\ 2 & 1 - \lambda \end{pmatrix} = (3 - \lambda)(1 - \lambda) - (-1)(2) = \lambda^2 - 4\lambda + 5$.

When is this equal to zero? We will use the quadratic formula:

  $\lambda = \dfrac{4 \pm \sqrt{(-4)^2 - 4(5)}}{2} = 2 \pm i$.

Thus, this matrix has complex valued eigenvalues in the form of a conjugate pair: $\lambda_0 = 2 + i$ and $\lambda_1 = 2 - i$. To find the corresponding eigenvectors:

$\lambda_0 = 2 + i$:

  $A - \lambda_0 I = \begin{pmatrix} 3 - (2 + i) & -1 \\ 2 & 1 - (2 + i) \end{pmatrix} = \begin{pmatrix} 1 - i & -1 \\ 2 & -1 - i \end{pmatrix}$.

  Find a nonzero vector in the null space:

    $\begin{pmatrix} 1 - i & -1 \\ 2 & -1 - i \end{pmatrix} \begin{pmatrix} \chi_0 \\ \chi_1 \end{pmatrix} = \begin{pmatrix} 0 \\ 0 \end{pmatrix}$.

  By examination,

    $\begin{pmatrix} 1 - i & -1 \\ 2 & -1 - i \end{pmatrix} \begin{pmatrix} 1 \\ 1 - i \end{pmatrix} = \begin{pmatrix} 0 \\ 0 \end{pmatrix}$.

  Eigenpair: $\left(2 + i, \begin{pmatrix} 1 \\ 1 - i \end{pmatrix}\right)$.

$\lambda_1 = 2 - i$:

  $A - \lambda_1 I = \begin{pmatrix} 3 - (2 - i) & -1 \\ 2 & 1 - (2 - i) \end{pmatrix} = \begin{pmatrix} 1 + i & -1 \\ 2 & -1 + i \end{pmatrix}$.

  Find a nonzero vector in the null space:

    $\begin{pmatrix} 1 + i & -1 \\ 2 & -1 + i \end{pmatrix} \begin{pmatrix} \chi_0 \\ \chi_1 \end{pmatrix} = \begin{pmatrix} 0 \\ 0 \end{pmatrix}$.

  By examination,

    $\begin{pmatrix} 1 + i & -1 \\ 2 & -1 + i \end{pmatrix} \begin{pmatrix} 1 \\ 1 + i \end{pmatrix} = \begin{pmatrix} 0 \\ 0 \end{pmatrix}$.

  Eigenpair: $\left(2 - i, \begin{pmatrix} 1 \\ 1 + i \end{pmatrix}\right)$.
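A quick Octave/MATLAB check of Example 12.1 (the built-in eig handles the complex case; the order of the two eigenvalues in its output may vary):

  A = [ 3 -1
        2  1 ];
  eig( A )                               % returns 2 + 1i and 2 - 1i
  ( A - (2+1i)*eye(2) ) * [ 1; 1-1i ]    % verifies the first eigenpair: the zero vector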

If $A$ is real valued, then its characteristic polynomial has real valued coefficients. However, a polynomial with real valued coefficients may still have complex valued roots. Thus, the eigenvalues of a real valued matrix may be complex.

Homework Consider a matrix $A$. Which of the following are the eigenvalues of $A$? (The matrix entries and all but the last answer choice, "$2 + i$ and $2 - i$", were lost in transcription.)

View at edX

12.2.3 Diagonalizing

View at edX

Diagonalizing a square matrix $A \in \mathbb{R}^{n \times n}$ is closely related to the problem of finding the eigenvalues and eigenvectors of a matrix. In this unit, we illustrate this for some simple $2 \times 2$ examples. A more thorough treatment then follows when we talk about the eigenvalues and eigenvectors of $n \times n$ matrices, later this week.

In the last unit, we found eigenpairs for the matrix $\begin{pmatrix} 1 & -1 \\ 2 & 4 \end{pmatrix}$.

Specifically,

  $\begin{pmatrix} 1 & -1 \\ 2 & 4 \end{pmatrix} \begin{pmatrix} 1 \\ -1 \end{pmatrix} = 2 \begin{pmatrix} 1 \\ -1 \end{pmatrix}$ and $\begin{pmatrix} 1 & -1 \\ 2 & 4 \end{pmatrix} \begin{pmatrix} 1 \\ -2 \end{pmatrix} = 3 \begin{pmatrix} 1 \\ -2 \end{pmatrix}$,

so that eigenpairs are given by

  $\left(2, \begin{pmatrix} 1 \\ -1 \end{pmatrix}\right)$ and $\left(3, \begin{pmatrix} 1 \\ -2 \end{pmatrix}\right)$.

Now, let's put our understanding of matrix-matrix multiplication from Weeks 4 and 5 to good use. With $A$ the $2 \times 2$ matrix above and eigenpairs $(\lambda_0, x_0)$ and $(\lambda_1, x_1)$:

  $Ax_0 = \lambda_0 x_0$ and $Ax_1 = \lambda_1 x_1$
  $\Leftrightarrow \left(Ax_0 \mid Ax_1\right) = \left(\lambda_0 x_0 \mid \lambda_1 x_1\right)$
  $\Leftrightarrow A \underbrace{\left(x_0 \mid x_1\right)}_{X} = \underbrace{\left(x_0 \mid x_1\right)}_{X} \underbrace{\begin{pmatrix} \lambda_0 & 0 \\ 0 & \lambda_1 \end{pmatrix}}_{\Lambda}$
  $\Leftrightarrow \underbrace{\left(x_0 \mid x_1\right)^{-1}}_{X^{-1}} A \underbrace{\left(x_0 \mid x_1\right)}_{X} = \underbrace{\begin{pmatrix} \lambda_0 & 0 \\ 0 & \lambda_1 \end{pmatrix}}_{\Lambda}.$

What we notice is that if we take the two eigenvectors of matrix $A$ and create with them a matrix $X$ that has those eigenvectors as its columns, then $X^{-1} A X = \Lambda$, where $\Lambda$ is a diagonal matrix with the eigenvalues on its diagonal. The matrix $X$ is said to diagonalize matrix $A$. A quick check of this for our example follows.
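A minimal Octave/MATLAB check of the diagonalization above:

  A = [ 1 -1
        2  4 ];
  X = [ 1  1
       -1 -2 ];               % the eigenvectors of A as columns
  Lambda = inv( X ) * A * X   % diagonal, with the eigenvalues 2 and 3 on the diagonal

In practice one would solve with X rather than form its inverse; inv is fine for a 2 x 2 illustration.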

Defective matrices

Now, it is not the case that for every $A \in \mathbb{R}^{n \times n}$ there is a nonsingular matrix $X \in \mathbb{R}^{n \times n}$ such that $X^{-1} A X = \Lambda$, where $\Lambda$ is diagonal. Matrices for which such a matrix $X$ does not exist are called defective matrices.

Homework The matrix $\begin{pmatrix} 0 & 1 \\ 0 & 0 \end{pmatrix}$ can be diagonalized. True/False

The matrix $\begin{pmatrix} \lambda & 1 \\ 0 & \lambda \end{pmatrix}$ is a simple example of what is often called a Jordan block. It, too, is defective.

View at edX

Homework In an earlier homework you considered a matrix $A$ and computed two eigenpairs, with eigenvalues 4 and 2. (The matrix entries, and the signs within the transcribed eigenpairs, were lost in transcription.)

  Matrix $A$ can be diagonalized by matrix $X = $ ______. (Yes, this matrix is not unique, so please use the info from the eigenpairs, in order...)
  $AX = $ ______
  $X^{-1} = $ ______
  $X^{-1} A X = $ ______

12.2.4 Eigenvalues and Eigenvectors of 3 x 3 Matrices

View at edX

Homework Let $A = \begin{pmatrix} 3 & 0 & 0 \\ 0 & -1 & 0 \\ 0 & 0 & 2 \end{pmatrix}$. Then which of the following are true:

  $\begin{pmatrix} 1 \\ 0 \\ 0 \end{pmatrix}$ is an eigenvector associated with eigenvalue 3. True/False

  $\begin{pmatrix} 0 \\ 1 \\ 0 \end{pmatrix}$ is an eigenvector associated with eigenvalue $-1$. True/False

  $\begin{pmatrix} 0 \\ \chi_1 \\ 0 \end{pmatrix}$, where $\chi_1 \neq 0$ is a scalar, is an eigenvector associated with eigenvalue $-1$. True/False

  $\begin{pmatrix} 0 \\ 0 \\ 1 \end{pmatrix}$ is an eigenvector associated with eigenvalue 2. True/False

View at edX

Homework Let $A = \begin{pmatrix} \alpha_{0,0} & 0 & 0 \\ 0 & \alpha_{1,1} & 0 \\ 0 & 0 & \alpha_{2,2} \end{pmatrix}$. Then which of the following are true:

  $\begin{pmatrix} 1 \\ 0 \\ 0 \end{pmatrix}$ is an eigenvector associated with eigenvalue $\alpha_{0,0}$. True/False

  $\begin{pmatrix} 0 \\ 1 \\ 0 \end{pmatrix}$ is an eigenvector associated with eigenvalue $\alpha_{1,1}$. True/False

  $\begin{pmatrix} 0 \\ \chi_1 \\ 0 \end{pmatrix}$, where $\chi_1 \neq 0$, is an eigenvector associated with eigenvalue $\alpha_{1,1}$. True/False

  $\begin{pmatrix} 0 \\ 0 \\ 1 \end{pmatrix}$ is an eigenvector associated with eigenvalue $\alpha_{2,2}$. True/False

Homework Let $A = \begin{pmatrix} 3 & 1 & -1 \\ 0 & -1 & 2 \\ 0 & 0 & 2 \end{pmatrix}$. Then which of the following are true:

  3, $-1$, and 2 are eigenvalues of $A$. True/False

  $\begin{pmatrix} 1 \\ 0 \\ 0 \end{pmatrix}$ is an eigenvector associated with eigenvalue 3. True/False

  $\begin{pmatrix} -1/4 \\ 1 \\ 0 \end{pmatrix}$ is an eigenvector associated with eigenvalue $-1$. True/False

  $\begin{pmatrix} -1/4\,\chi_1 \\ \chi_1 \\ 0 \end{pmatrix}$, where $\chi_1 \neq 0$, is an eigenvector associated with eigenvalue $-1$. True/False

  $\begin{pmatrix} 1/3 \\ 2/3 \\ 1 \end{pmatrix}$ is an eigenvector associated with eigenvalue 2. True/False

View at edX

Homework Let $A = \begin{pmatrix} \alpha_{0,0} & \alpha_{0,1} & \alpha_{0,2} \\ 0 & \alpha_{1,1} & \alpha_{1,2} \\ 0 & 0 & \alpha_{2,2} \end{pmatrix}$. Then the eigenvalues of this matrix are $\alpha_{0,0}$, $\alpha_{1,1}$, and $\alpha_{2,2}$. True/False

When we discussed how to find the eigenvalues of a $2 \times 2$ matrix, we saw that it all came down to the determinant of $A - \lambda I$, which then gave us the characteristic polynomial $p_2(\lambda)$. The roots of this

polynomial were the eigenvalues of the matrix.

Similarly, there is a formula for the determinant of a $3 \times 3$ matrix:

  $\det\begin{pmatrix} \alpha_{0,0} & \alpha_{0,1} & \alpha_{0,2} \\ \alpha_{1,0} & \alpha_{1,1} & \alpha_{1,2} \\ \alpha_{2,0} & \alpha_{2,1} & \alpha_{2,2} \end{pmatrix} = \left(\alpha_{0,0}\alpha_{1,1}\alpha_{2,2} + \alpha_{0,1}\alpha_{1,2}\alpha_{2,0} + \alpha_{0,2}\alpha_{1,0}\alpha_{2,1}\right) - \left(\alpha_{2,0}\alpha_{1,1}\alpha_{0,2} + \alpha_{2,1}\alpha_{1,2}\alpha_{0,0} + \alpha_{2,2}\alpha_{1,0}\alpha_{0,1}\right).$

(The two groups of terms can be remembered by appending the first two columns of the matrix to its right and taking products along the diagonals: products along the diagonals that go down to the right are added, and products along the diagonals that go up to the right are subtracted.)

Thus, for a $3 \times 3$ matrix, the characteristic polynomial becomes

  $p_3(\lambda) = \det\begin{pmatrix} \alpha_{0,0} - \lambda & \alpha_{0,1} & \alpha_{0,2} \\ \alpha_{1,0} & \alpha_{1,1} - \lambda & \alpha_{1,2} \\ \alpha_{2,0} & \alpha_{2,1} & \alpha_{2,2} - \lambda \end{pmatrix} = \left[(\alpha_{0,0} - \lambda)(\alpha_{1,1} - \lambda)(\alpha_{2,2} - \lambda) + \alpha_{0,1}\alpha_{1,2}\alpha_{2,0} + \alpha_{0,2}\alpha_{1,0}\alpha_{2,1}\right] - \left[\alpha_{2,0}(\alpha_{1,1} - \lambda)\alpha_{0,2} + \alpha_{2,1}\alpha_{1,2}(\alpha_{0,0} - \lambda) + (\alpha_{2,2} - \lambda)\alpha_{1,0}\alpha_{0,1}\right].$

Multiplying this out, we get a third degree polynomial. The roots of this cubic polynomial are the eigenvalues of the $3 \times 3$ matrix. Hence, a $3 \times 3$ matrix has at most three distinct eigenvalues.
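A sketch of this formula in Octave/MATLAB, checked against the built-in det (the test matrix is the triangular one from the homework above, whose determinant is the product of its diagonal, -6):

  A = [ 3  1 -1
        0 -1  2
        0  0  2 ];
  d =   A(1,1)*A(2,2)*A(3,3) + A(1,2)*A(2,3)*A(3,1) + A(1,3)*A(2,1)*A(3,2) ...
      - A(3,1)*A(2,2)*A(1,3) - A(3,2)*A(2,3)*A(1,1) - A(3,3)*A(2,1)*A(1,2);
  d - det( A )    % zero: the formula agrees with the built-in determinant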

Example 12.2 Compute the eigenvalues and eigenvectors of $A = \begin{pmatrix} 1 & 1 & 1 \\ 1 & 1 & 1 \\ 1 & 1 & 1 \end{pmatrix}$.

  $\det\begin{pmatrix} 1 - \lambda & 1 & 1 \\ 1 & 1 - \lambda & 1 \\ 1 & 1 & 1 - \lambda \end{pmatrix} = \underbrace{\left[(1 - \lambda)(1 - \lambda)(1 - \lambda) + 1 + 1\right]}_{3 - 3\lambda + 3\lambda^2 - \lambda^3} - \underbrace{\left[(1 - \lambda) + (1 - \lambda) + (1 - \lambda)\right]}_{3 - 3\lambda} = 3\lambda^2 - \lambda^3 = (3 - \lambda)\lambda^2.$

So, $\lambda = 0$ is a double root, while $\lambda = 3$ is the third root.

$\lambda_2 = 3$:

  $A - \lambda_2 I = \begin{pmatrix} -2 & 1 & 1 \\ 1 & -2 & 1 \\ 1 & 1 & -2 \end{pmatrix}$.

  We wish to find a nonzero vector in the null space:

    $\begin{pmatrix} -2 & 1 & 1 \\ 1 & -2 & 1 \\ 1 & 1 & -2 \end{pmatrix} \begin{pmatrix} \chi_0 \\ \chi_1 \\ \chi_2 \end{pmatrix} = \begin{pmatrix} 0 \\ 0 \\ 0 \end{pmatrix}$.

  By examination, I noticed that

    $\begin{pmatrix} -2 & 1 & 1 \\ 1 & -2 & 1 \\ 1 & 1 & -2 \end{pmatrix} \begin{pmatrix} 1 \\ 1 \\ 1 \end{pmatrix} = \begin{pmatrix} 0 \\ 0 \\ 0 \end{pmatrix}$.

  Eigenpair: $\left(3, \begin{pmatrix} 1 \\ 1 \\ 1 \end{pmatrix}\right)$.

$\lambda_0 = \lambda_1 = 0$:

  $A - 0I = \begin{pmatrix} 1 & 1 & 1 \\ 1 & 1 & 1 \\ 1 & 1 & 1 \end{pmatrix}$.

  Reducing this to row-echelon form gives us the matrix

    $\begin{pmatrix} 1 & 1 & 1 \\ 0 & 0 & 0 \\ 0 & 0 & 0 \end{pmatrix}$,

  for which we find vectors in the null space

    $\begin{pmatrix} -1 \\ 1 \\ 0 \end{pmatrix}$ and $\begin{pmatrix} -1 \\ 0 \\ 1 \end{pmatrix}$.

  Eigenpairs: $\left(0, \begin{pmatrix} -1 \\ 1 \\ 0 \end{pmatrix}\right)$ and $\left(0, \begin{pmatrix} -1 \\ 0 \\ 1 \end{pmatrix}\right)$.

What is interesting about this last example is that $\lambda = 0$ is a double root and yields two linearly independent

eigenvectors.

Homework Consider a matrix $A$. (Its entries did not survive transcription.) Which of the following is true about this matrix:

  $\left(3, \begin{pmatrix} 1 \\ 1 \\ 1 \end{pmatrix}\right)$ is an eigenpair. True/False
  $\left(0, \begin{pmatrix} -1 \\ 1 \\ 0 \end{pmatrix}\right)$ is an eigenpair. True/False
  $\left(0, \begin{pmatrix} -1 \\ 0 \\ 1 \end{pmatrix}\right)$ is an eigenpair. True/False
  This matrix is defective. True/False

12.3 The General Case

12.3.1 Eigenvalues and Eigenvectors of n x n Matrices: Special Cases

We are now ready to talk about eigenvalues and eigenvectors of arbitrary sized matrices.

View at edX

Homework Let $A \in \mathbb{R}^{n \times n}$ be a diagonal matrix:

  $A = \begin{pmatrix} \alpha_{0,0} & & & \\ & \alpha_{1,1} & & \\ & & \ddots & \\ & & & \alpha_{n-1,n-1} \end{pmatrix}$.

Then $e_i$ is an eigenvector associated with eigenvalue $\alpha_{i,i}$. True/False

View at edX

Homework Let $A = \begin{pmatrix} A_{00} & a_{01} & A_{02} \\ 0 & \alpha_{11} & a_{12}^T \\ 0 & 0 & A_{22} \end{pmatrix}$, where $A_{00}$ and $A_{22}$ are square. Then $\alpha_{11}$ is an eigenvalue of $A$ and

  $x = \begin{pmatrix} -(A_{00} - \alpha_{11} I)^{-1} a_{01} \\ 1 \\ 0 \end{pmatrix}$

is a corresponding eigenvector, provided $A_{00} - \alpha_{11} I$ is nonsingular. True/False

View at edX

Homework The eigenvalues of a triangular matrix can be found on its diagonal. True/False

12.3.2 Eigenvalues of n x n Matrices

View at edX

There is a formula for the determinant of an $n \times n$ matrix, which is an inductively defined function, meaning that the formula for the determinant of an $n \times n$ matrix is defined in terms of the determinant of an $(n-1) \times (n-1)$ matrix. Other than as a theoretical tool, the determinant of a general $n \times n$ matrix is not particularly useful. We restrict our discussion to some facts and observations about the determinant that impact the characteristic polynomial, which is the polynomial that results when one computes the determinant of the matrix $A - \lambda I$, $\det(A - \lambda I)$.

Theorem 12.3 A matrix $A \in \mathbb{R}^{n \times n}$ is nonsingular if and only if $\det(A) \neq 0$.

Theorem 12.4 Given $A \in \mathbb{R}^{n \times n}$,

  $p_n(\lambda) = \det(A - \lambda I) = \lambda^n + \gamma_{n-1}\lambda^{n-1} + \cdots + \gamma_1\lambda + \gamma_0$

for some coefficients $\gamma_0, \ldots, \gamma_{n-1} \in \mathbb{R}$.

Since we don't give the definition of a determinant, we do not prove the above theorems.

Definition 12.5 Given $A \in \mathbb{R}^{n \times n}$, $p_n(\lambda) = \det(A - \lambda I)$ is called the characteristic polynomial.

Theorem 12.6 Scalar $\lambda$ satisfies $Ax = \lambda x$ for some nonzero vector $x$ if and only if $\det(A - \lambda I) = 0$.

Proof: This is an immediate consequence of the fact that $Ax = \lambda x$ is equivalent to $(A - \lambda I)x = 0$ and the fact that $A - \lambda I$ is singular (has a nontrivial null space) if and only if $\det(A - \lambda I) = 0$.

Roots of the characteristic polynomial

Since an eigenvalue of $A$ is a root of $p_n(\lambda) = \det(A - \lambda I)$ and vice versa, we can exploit what we know about roots of $n$th degree polynomials. Let us review, relating what we know to the eigenvalues of $A$.

The characteristic polynomial of $A \in \mathbb{R}^{n \times n}$ is given by

  $p_n(\lambda) = \det(A - \lambda I) = \gamma_0 + \gamma_1\lambda + \cdots + \gamma_{n-1}\lambda^{n-1} + \lambda^n.$

Since $p_n(\lambda)$ is an $n$th degree polynomial, it has $n$ roots, counting multiplicity. Thus, matrix $A$ has $n$ eigenvalues, counting multiplicity.

Let $k$ equal the number of distinct roots of $p_n(\lambda)$. Clearly, $k \leq n$. Clearly, matrix $A$ then has $k$ distinct eigenvalues.

The set of all roots of $p_n(\lambda)$, which is the set of all eigenvalues of $A$, is denoted by $\Lambda(A)$ and is called the spectrum of matrix $A$.

The characteristic polynomial can be factored as

  $p_n(\lambda) = \det(A - \lambda I) = (\lambda - \lambda_0)^{n_0}(\lambda - \lambda_1)^{n_1}\cdots(\lambda - \lambda_{k-1})^{n_{k-1}},$

where $n_0 + n_1 + \cdots + n_{k-1} = n$ and $n_j$ is the multiplicity of the root $\lambda_j$, which is known as the algebraic multiplicity of eigenvalue $\lambda_j$.

If $A \in \mathbb{R}^{n \times n}$, then the coefficients of the characteristic polynomial are real ($\gamma_0, \ldots, \gamma_{n-1} \in \mathbb{R}$), but

  Some or all of the roots/eigenvalues may be complex valued, and
  Complex roots/eigenvalues come in "conjugate pairs": if $\lambda = \mathrm{Re}(\lambda) + i\,\mathrm{Im}(\lambda)$ is a root/eigenvalue, so is $\bar{\lambda} = \mathrm{Re}(\lambda) - i\,\mathrm{Im}(\lambda)$.

An inconvenient truth

Galois theory tells us that for $n \geq 5$, the roots of an arbitrary $p_n(\lambda)$ cannot be found in a finite number of computations.

Since we did not tell you how to compute the determinant of $A - \lambda I$, you will have to take the following for granted: For every $n$th degree polynomial

  $p_n(\lambda) = \gamma_0 + \gamma_1\lambda + \cdots + \gamma_{n-1}\lambda^{n-1} + \lambda^n,$

there exists a matrix $C$, called the companion matrix, that has the property that

  $p_n(\lambda) = \det(C - \lambda I) = \gamma_0 + \gamma_1\lambda + \cdots + \gamma_{n-1}\lambda^{n-1} + \lambda^n.$

In particular, the matrix

  $C = \begin{pmatrix} -\gamma_{n-1} & -\gamma_{n-2} & \cdots & -\gamma_1 & -\gamma_0 \\ 1 & 0 & \cdots & 0 & 0 \\ 0 & 1 & \cdots & 0 & 0 \\ \vdots & \vdots & \ddots & \vdots & \vdots \\ 0 & 0 & \cdots & 1 & 0 \end{pmatrix}$

is the companion matrix for $p_n(\lambda)$.
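A quick Octave/MATLAB check of the companion-matrix claim, with hypothetical coefficients (the built-in compan constructs exactly the matrix displayed above, and its eigenvalues coincide with the roots of the polynomial):

  g = [ 1 2 -3 1 5 ];      % lambda^4 + 2 lambda^3 - 3 lambda^2 + lambda + 5
  C = compan( g )          % the companion matrix for this polynomial
  sort( eig( C ) )         % ... whose eigenvalues ...
  sort( roots( g ) )       % ... equal the roots of the polynomial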

Homework If $A \in \mathbb{R}^{n \times n}$, then $\Lambda(A)$ has $n$ distinct elements. True/False

Homework Let $A \in \mathbb{R}^{n \times n}$ and $\lambda \in \Lambda(A)$. Let $S$ be the set of all vectors that satisfy $Ax = \lambda x$. (Notice that $S$ is the set of all eigenvectors corresponding to $\lambda$ plus the zero vector.) Then $S$ is a subspace. True/False

12.3.3 Diagonalizing, Again

View at edX

We now revisit the topic of diagonalizing a square matrix $A \in \mathbb{R}^{n \times n}$, but for general $n$ rather than the special case of $n = 2$ treated in Unit 12.2.3.

Let us start by assuming that matrix $A \in \mathbb{R}^{n \times n}$ has $n$ eigenvalues, $\lambda_0, \ldots, \lambda_{n-1}$, where we simply repeat eigenvalues that have algebraic multiplicity greater than one. Let us also assume that $x_j$ equals the eigenvector associated with eigenvalue $\lambda_j$ and, importantly, that $x_0, \ldots, x_{n-1}$ are linearly independent. Below,

we generalize the example from Unit 12.2.3:

  $Ax_0 = \lambda_0 x_0$; $Ax_1 = \lambda_1 x_1$; $\ldots$; $Ax_{n-1} = \lambda_{n-1} x_{n-1}$

if and only if < two matrices are equal if their columns are equal >

  $\left(Ax_0 \mid Ax_1 \mid \cdots \mid Ax_{n-1}\right) = \left(\lambda_0 x_0 \mid \lambda_1 x_1 \mid \cdots \mid \lambda_{n-1} x_{n-1}\right)$

if and only if < partitioned matrix-matrix multiplication >

  $A \left(x_0 \mid x_1 \mid \cdots \mid x_{n-1}\right) = \left(\lambda_0 x_0 \mid \lambda_1 x_1 \mid \cdots \mid \lambda_{n-1} x_{n-1}\right)$

if and only if < multiplication on the right by a diagonal matrix >

  $A \left(x_0 \mid x_1 \mid \cdots \mid x_{n-1}\right) = \left(x_0 \mid x_1 \mid \cdots \mid x_{n-1}\right) \begin{pmatrix} \lambda_0 & & & \\ & \lambda_1 & & \\ & & \ddots & \\ & & & \lambda_{n-1} \end{pmatrix}$

if and only if

  $AX = X\Lambda$, where $X = \left(x_0 \mid x_1 \mid \cdots \mid x_{n-1}\right)$ and $\Lambda = \begin{pmatrix} \lambda_0 & & & \\ & \lambda_1 & & \\ & & \ddots & \\ & & & \lambda_{n-1} \end{pmatrix}$

if and only if < columns of $X$ are linearly independent >

  $X^{-1} A X = \Lambda$.

The above argument motivates the following theorem:

Theorem 12.7 Let $A \in \mathbb{R}^{n \times n}$. Then there exists a nonsingular matrix $X$ such that $X^{-1} A X = \Lambda$, with $\Lambda$ diagonal, if and only if $A$ has $n$ linearly independent eigenvectors.

If $X$ is invertible (nonsingular, has linearly independent columns, etc.), then the following are equivalent:

  $X^{-1} A X = \Lambda$
  $A X = X \Lambda$
  $A = X \Lambda X^{-1}$

If $\Lambda$ is in addition diagonal, then the diagonal elements of $\Lambda$ are eigenvalues of $A$ and the columns of $X$ are eigenvectors of $A$.

Recognize that $\Lambda(A)$ denotes the spectrum (set of all eigenvalues) of matrix $A$, while here we also use $\Lambda$ to denote the matrix which has those eigenvalues on its diagonal. This possibly confusing use of the same symbol for two different but related things is commonly encountered in the linear algebra literature. For this reason, you might as well get used to it!

Defective (deficient) matrices

We already saw in Unit 12.2.3 that it is not the case that for every $A \in \mathbb{R}^{n \times n}$ there is a nonsingular matrix $X \in \mathbb{R}^{n \times n}$ such that $X^{-1} A X = \Lambda$, where $\Lambda$ is diagonal. In that unit, a $2 \times 2$ example was given that did not have two linearly independent eigenvectors.

In general, the $k \times k$ matrix $J_k(\lambda)$ given by

  $J_k(\lambda) = \begin{pmatrix} \lambda & 1 & & & \\ & \lambda & 1 & & \\ & & \lambda & \ddots & \\ & & & \ddots & 1 \\ & & & & \lambda \end{pmatrix}$

has eigenvalue $\lambda$ of algebraic multiplicity $k$, but geometric multiplicity one (it has only one linearly independent eigenvector). Such a matrix is known as a Jordan block.

Definition 12.8 The geometric multiplicity of an eigenvalue $\lambda$ equals the number of linearly independent eigenvectors that are associated with $\lambda$.

The following theorem has theoretical significance, but little practical significance, which is why we do not dwell on it:

Theorem 12.9 Let $A \in \mathbb{R}^{n \times n}$. Then there exists a nonsingular matrix $X \in \mathbb{R}^{n \times n}$ such that $A = XJX^{-1}$, where

  $J = \begin{pmatrix} J_{k_0}(\lambda_0) & & & \\ & J_{k_1}(\lambda_1) & & \\ & & \ddots & \\ & & & J_{k_{m-1}}(\lambda_{m-1}) \end{pmatrix}$,

where each $J_{k_j}(\lambda_j)$ is a Jordan block of size $k_j \times k_j$.

The factorization $A = XJX^{-1}$ is known as the Jordan Canonical Form of matrix $A$.

A few comments are in order:

  It is not the case that $\lambda_0, \lambda_1, \ldots, \lambda_{m-1}$ are distinct. If $\lambda_j$ appears in multiple Jordan blocks, the number of Jordan blocks in which $\lambda_j$ appears equals the geometric multiplicity of $\lambda_j$ (and the number of linearly independent eigenvectors associated with $\lambda_j$).

  The sum of the sizes of the blocks in which $\lambda_j$ appears as an eigenvalue equals the algebraic multiplicity of $\lambda_j$.
  If each Jordan block is $1 \times 1$, then the matrix is diagonalized by matrix $X$.
  If any of the blocks is not $1 \times 1$, then the matrix cannot be diagonalized.

Homework Consider a matrix $A$. (Its entries did not survive transcription.)

  The algebraic multiplicity of $\lambda = 2$ is ______.
  The geometric multiplicity of $\lambda = 2$ is ______.
  The following two vectors (whose entries were partially lost in transcription) are linearly independent eigenvectors associated with $\lambda = 2$. True/False

Homework Let $A \in \mathbb{R}^{n \times n}$, $\lambda \in \Lambda(A)$, and $S$ be the set of all vectors $x$ such that $Ax = \lambda x$. Finally, let $\lambda$ have algebraic multiplicity $k$ (meaning that it is a root of multiplicity $k$ of the characteristic polynomial). The dimension of $S$ is $k$ ($\dim(S) = k$). Always/Sometimes/Never

12.3.4 Properties of Eigenvalues and Eigenvectors

View at edX

In this unit, we look at a few theoretical results related to eigenvalues and eigenvectors.

Homework Let $A \in \mathbb{R}^{n \times n}$ and $A = \begin{pmatrix} A_{00} & A_{01} \\ 0 & A_{11} \end{pmatrix}$, where $A_{00}$ and $A_{11}$ are square matrices.

  $\Lambda(A) = \Lambda(A_{00}) \cup \Lambda(A_{11})$. Always/Sometimes/Never

The last exercise motivates the following theorem, which we will not prove:

View at edX

Theorem 12.10 Let $A \in \mathbb{R}^{n \times n}$ and

  $A = \begin{pmatrix} A_{00} & A_{01} & \cdots & A_{0,N-1} \\ 0 & A_{11} & \cdots & A_{1,N-1} \\ \vdots & & \ddots & \vdots \\ 0 & 0 & \cdots & A_{N-1,N-1} \end{pmatrix}$,

where all $A_{ii}$ are square matrices. Then $\Lambda(A) = \Lambda(A_{00}) \cup \Lambda(A_{11}) \cup \cdots \cup \Lambda(A_{N-1,N-1})$.
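A small Octave/MATLAB check of Theorem 12.10 for two diagonal blocks (the blocks here are hypothetical):

  A00 = [ 2 1
          0 3 ];
  A11 = [ 5 ];
  A01 = [ 1
          1 ];
  A = [ A00            A01
        zeros(1, 2)    A11 ];
  sort( eig( A ) )                      % {2, 3, 5} ...
  sort( [ eig( A00 ); eig( A11 ) ] )    % ... the union of the blocks' spectra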

Homework Let $A \in \mathbb{R}^{n \times n}$ be symmetric, $\lambda_i \neq \lambda_j$, $Ax_i = \lambda_i x_i$, and $Ax_j = \lambda_j x_j$.

  $x_i^T x_j = 0$. Always/Sometimes/Never

The following theorem requires us to remember more about complex arithmetic than we have time to remember. For this reason, we will just state it:

Theorem 12.11 Let $A \in \mathbb{R}^{n \times n}$ be symmetric. Then its eigenvalues are real valued.

View at edX

Homework If $Ax = \lambda x$ then $AAx = \lambda^2 x$. ($AA$ is often written as $A^2$.) Always/Sometimes/Never

Homework Let $Ax = \lambda x$ and $k \geq 1$. Recall that $A^k = \underbrace{A A \cdots A}_{k \text{ times}}$.

  $A^k x = \lambda^k x$. Always/Sometimes/Never

Homework $A \in \mathbb{R}^{n \times n}$ is nonsingular if and only if $0 \notin \Lambda(A)$. True/False

12.4 Practical Methods for Computing Eigenvectors and Eigenvalues

12.4.1 Predicting the Weather, One Last Time

View at edX

If you think back about how we computed the probabilities of different types of weather for day $k$, recall that $x^{(k+1)} = P x^{(k)}$, where $x^{(k)}$ is a vector with three components and $P$ is a $3 \times 3$ matrix. We also showed that $x^{(k)} = P^k x^{(0)}$.

We noticed that eventually $x^{(k+1)} \approx x^{(k)}$ and that, therefore, $x^{(k+1)}$ eventually came arbitrarily close to an eigenvector $x$ associated with the eigenvalue 1 of matrix $P$: $Px = x$.

Homework If $\lambda \in \Lambda(A)$ then $\lambda \in \Lambda(A^T)$. True/False

View at edX

Homework $\lambda \in \Lambda(A)$ if and only if $\lambda \in \Lambda(A^T)$. True/False

Ah! It seems like we may have stumbled upon a possible method for computing an eigenvector for this matrix:

  Start with a first guess $x^{(0)}$.
  for $k = 0, \ldots$, until $x^{(k)}$ doesn't change (much) anymore
    $x^{(k+1)} := P x^{(k)}$
  endfor

Can we use what we have learned about eigenvalues and eigenvectors to explain this? In the video, we give one explanation. Below we give an alternative explanation that uses diagonalization.

Let's assume that $P$ is diagonalizable:

  $P = V \Lambda V^{-1}$, where $\Lambda = \begin{pmatrix} \lambda_0 & & \\ & \lambda_1 & \\ & & \lambda_2 \end{pmatrix}$.

(Here we use the letter $V$ rather than $X$ since we already use $x^{(k)}$ in a different way.) Then we saw before that

  $x^{(k)} = P^k x^{(0)} = (V \Lambda V^{-1})^k x^{(0)} = V \Lambda^k V^{-1} x^{(0)} = V \begin{pmatrix} \lambda_0^k & & \\ & \lambda_1^k & \\ & & \lambda_2^k \end{pmatrix} V^{-1} x^{(0)}.$

Now, let's assume that $\lambda_0 = 1$ (since we noticed that $P$ has one as an eigenvalue), and that $|\lambda_1| < 1$ and $|\lambda_2| < 1$. Also, notice that $V = \left(v_0 \mid v_1 \mid v_2\right)$, where $v_i$ equals the eigenvector associated with $\lambda_i$. Finally, notice that $V$ has linearly independent columns and that therefore there exists a vector $w$ such that $Vw = x^{(0)}$.

Then

  $x^{(k)} = V \begin{pmatrix} 1 & & \\ & \lambda_1^k & \\ & & \lambda_2^k \end{pmatrix} V^{-1} x^{(0)} = V \begin{pmatrix} 1 & & \\ & \lambda_1^k & \\ & & \lambda_2^k \end{pmatrix} V^{-1} V w$

  $= \left(v_0 \mid v_1 \mid v_2\right) \begin{pmatrix} 1 & & \\ & \lambda_1^k & \\ & & \lambda_2^k \end{pmatrix} \begin{pmatrix} \omega_0 \\ \omega_1 \\ \omega_2 \end{pmatrix} = \left(v_0 \mid v_1 \mid v_2\right) \begin{pmatrix} \omega_0 \\ \lambda_1^k \omega_1 \\ \lambda_2^k \omega_2 \end{pmatrix} = \omega_0 v_0 + \lambda_1^k \omega_1 v_1 + \lambda_2^k \omega_2 v_2.$

Now, what if $k$ gets very large? We know that $\lim_{k \to \infty} \lambda_1^k = 0$, since $|\lambda_1| < 1$. Similarly, $\lim_{k \to \infty} \lambda_2^k = 0$. So,

  $\lim_{k \to \infty} x^{(k)} = \lim_{k \to \infty} \left( \omega_0 v_0 + \lambda_1^k \omega_1 v_1 + \lambda_2^k \omega_2 v_2 \right) = \omega_0 v_0 + \left(\lim_{k \to \infty} \lambda_1^k\right) \omega_1 v_1 + \left(\lim_{k \to \infty} \lambda_2^k\right) \omega_2 v_2 = \omega_0 v_0.$

Ah, so $x^{(k)}$ eventually becomes arbitrarily close to (converges to) a multiple of the eigenvector associated with the eigenvalue 1, provided $\omega_0 \neq 0$.

12.4.2 The Power Method

View at edX

So, a question is whether the method we described in the last unit can be used in general. The answer is yes. The resulting method is known as the Power Method.

First, let's make some assumptions. Given $A \in \mathbb{R}^{n \times n}$:

  Let $\lambda_0, \lambda_1, \ldots, \lambda_{n-1} \in \Lambda(A)$. We list eigenvalues that have algebraic multiplicity $k$ multiple ($k$) times in this list.
  Let us assume that $|\lambda_0| > |\lambda_1| \geq |\lambda_2| \geq \cdots \geq |\lambda_{n-1}|$. This implies that $\lambda_0$ is real, since complex eigenvalues come in conjugate pairs, and hence there would otherwise have been two eigenvalues with equal greatest magnitude. It also means that there is a real valued eigenvector associated with $\lambda_0$.
  Let us assume that $A \in \mathbb{R}^{n \times n}$ is diagonalizable so that

    $A = V \Lambda V^{-1} = \left(v_0 \mid v_1 \mid \cdots \mid v_{n-1}\right) \begin{pmatrix} \lambda_0 & & & \\ & \lambda_1 & & \\ & & \ddots & \\ & & & \lambda_{n-1} \end{pmatrix} \left(v_0 \mid v_1 \mid \cdots \mid v_{n-1}\right)^{-1}.$

    This means that $v_i$ is an eigenvector associated with $\lambda_i$.

These assumptions set the stage.

Now, we start with some vector $x^{(0)} \in \mathbb{R}^n$. Since $V$ is nonsingular, the vectors $v_0, \ldots, v_{n-1}$ form a linearly independent basis for $\mathbb{R}^n$. Hence,

  $x^{(0)} = \gamma_0 v_0 + \gamma_1 v_1 + \cdots + \gamma_{n-1} v_{n-1} = \left(v_0 \mid v_1 \mid \cdots \mid v_{n-1}\right) \begin{pmatrix} \gamma_0 \\ \gamma_1 \\ \vdots \\ \gamma_{n-1} \end{pmatrix} = Vc.$

Now, we generate

  $x^{(1)} = A x^{(0)}$
  $x^{(2)} = A x^{(1)}$
  $x^{(3)} = A x^{(2)}$
  $\vdots$

The following algorithm accomplishes this:

  for $k = 0, \ldots$, until $x^{(k)}$ doesn't change (much) anymore
    $x^{(k+1)} := A x^{(k)}$
  endfor

Notice that then

  $x^{(k)} = A x^{(k-1)} = A^2 x^{(k-2)} = \cdots = A^k x^{(0)}.$

But then

  $A^k x^{(0)} = A^k \underbrace{\left(\gamma_0 v_0 + \gamma_1 v_1 + \cdots + \gamma_{n-1} v_{n-1}\right)}_{Vc}$
  $\quad = A^k \gamma_0 v_0 + A^k \gamma_1 v_1 + \cdots + A^k \gamma_{n-1} v_{n-1}$
  $\quad = \gamma_0 A^k v_0 + \gamma_1 A^k v_1 + \cdots + \gamma_{n-1} A^k v_{n-1}$
  $\quad = \gamma_0 \lambda_0^k v_0 + \gamma_1 \lambda_1^k v_1 + \cdots + \gamma_{n-1} \lambda_{n-1}^k v_{n-1}$
  $\quad = \left(v_0 \mid v_1 \mid \cdots \mid v_{n-1}\right) \begin{pmatrix} \lambda_0^k \gamma_0 \\ \lambda_1^k \gamma_1 \\ \vdots \\ \lambda_{n-1}^k \gamma_{n-1} \end{pmatrix} = V \Lambda^k c.$

Now, if $\lambda_0 = 1$, then $|\lambda_j| < 1$ for $j > 0$ and hence

  $\lim_{k \to \infty} x^{(k)} = \lim_{k \to \infty} \left( \gamma_0 v_0 + \gamma_1 \lambda_1^k v_1 + \cdots + \gamma_{n-1} \lambda_{n-1}^k v_{n-1} \right) = \gamma_0 v_0,$

which means that $x^{(k)}$ eventually starts pointing in the direction of $v_0$, the eigenvector associated with the eigenvalue that is largest in magnitude. (Well, as long as $\gamma_0 \neq 0$.)

Homework Let $A \in \mathbb{R}^{n \times n}$ and $\mu \neq 0$ be a scalar. Then $\lambda \in \Lambda(A)$ if and only if $\lambda/\mu \in \Lambda\!\left(\frac{1}{\mu} A\right)$. True/False

What this last exercise shows is that if $\lambda_0 \neq 1$, then we can instead iterate with the matrix $\frac{1}{\lambda_0} A$, in which case

  $1 = \left|\frac{\lambda_0}{\lambda_0}\right| > \left|\frac{\lambda_1}{\lambda_0}\right| \geq \cdots \geq \left|\frac{\lambda_{n-1}}{\lambda_0}\right|.$

The iteration then becomes

  $x^{(1)} = \frac{1}{\lambda_0} A x^{(0)}$
  $x^{(2)} = \frac{1}{\lambda_0} A x^{(1)}$
  $x^{(3)} = \frac{1}{\lambda_0} A x^{(2)}$
  $\vdots$

The following algorithm accomplishes this:

  for $k = 0, \ldots$, until $x^{(k)}$ doesn't change (much) anymore
    $x^{(k+1)} := A x^{(k)} / \lambda_0$
  endfor

It is not hard to see that then

  $\lim_{k \to \infty} x^{(k)} = \lim_{k \to \infty} \left( \gamma_0 v_0 + \gamma_1 \left(\frac{\lambda_1}{\lambda_0}\right)^k v_1 + \cdots + \gamma_{n-1} \left(\frac{\lambda_{n-1}}{\lambda_0}\right)^k v_{n-1} \right) = \gamma_0 v_0.$

So, it seems that we have an algorithm that always works as long as $|\lambda_0| > |\lambda_1| \geq \cdots \geq |\lambda_{n-1}|$.

Unfortunately, we are cheating... If we knew $\lambda_0$, then we could simply compute the eigenvector by finding a vector in the null space of $A - \lambda_0 I$. The key insight now is that, in $x^{(k+1)} = A x^{(k)} / \lambda_0$, dividing by $\lambda_0$ is merely meant to keep the vector $x^{(k)}$ from getting progressively larger (if $|\lambda_0| > 1$) or smaller (if $|\lambda_0| < 1$). We can alternatively simply make $x^{(k)}$ of length one at each step, and that will have the same effect without requiring $\lambda_0$:

  for $k = 0, \ldots$, until $x^{(k)}$ doesn't change (much) anymore
    $x^{(k+1)} := A x^{(k)}$
    $x^{(k+1)} := x^{(k+1)} / \|x^{(k+1)}\|_2$
  endfor

This last algorithm is known as the Power Method for finding an eigenvector associated with the largest eigenvalue (in magnitude).
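A minimal Octave/MATLAB sketch of the Power Method as just stated (the test matrix, starting vector, and fixed iteration count are hypothetical; the course's PowerMethodScript.m, discussed next, is the authoritative version):

  A = [ 4 1 0
        1 3 1
        0 1 2 ];
  x = rand( 3, 1 );
  for k = 1:100
      x = A * x;             % x(k+1) := A x(k)
      x = x / norm( x );     % rescale to unit length instead of dividing by lambda0
  end
  x' * A * x                 % Rayleigh quotient: approximates lambda0 (see Unit 12.4.3)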

Homework We now walk you through a simple implementation of the Power Method, referring to files in directory LAFFSpring2015/Programming/Week12.

We want to work with a matrix $A$ for which we know the eigenvalues. Recall that a matrix $A$ is diagonalizable if and only if there exists a nonsingular matrix $V$ and diagonal matrix $\Lambda$ such that $A = V \Lambda V^{-1}$. The diagonal elements of $\Lambda$ then equal the eigenvalues of $A$, and the columns of $V$ the eigenvectors. Thus, given eigenvalues, we can create a matrix $A$ by creating a diagonal matrix with those eigenvalues on the diagonal and a random nonsingular matrix $V$, after which we can compute $A$ to equal $V \Lambda V^{-1}$. This is accomplished by the function

  [ A, V ] = CreateMatrixForEigenvalueProblem( eigs )

(see file CreateMatrixForEigenvalueProblem.m).

The script in PowerMethodScript.m then illustrates how the Power Method, starting with a random vector, computes an eigenvector corresponding to the eigenvalue that is largest in magnitude, and (via the Rayleigh quotient, a way for computing an eigenvalue given an eigenvector that is discussed in the next unit) an approximation for that eigenvalue. To try it out, in the Command Window type

  >> PowerMethodScript
  input a vector of eigenvalues. e.g.: [ 4; 3; 2; 1 ]
  [ 4; 3; 2; 1 ]

The script reports, for each step of the Power Method, the length of the component orthogonal to the eigenvector associated with the eigenvalue that is largest in magnitude. If this component becomes small, then the vector lies approximately in the direction of the desired eigenvector. The Rayleigh quotient slowly starts to get close to the eigenvalue that is largest in magnitude. The slow convergence is because the ratio of the second largest to the largest eigenvalue is not much smaller than 1.

Try some other distributions of eigenvalues. For example, [ 4; 1; 0.5; 0.25 ], which should converge faster, or [ 4; 3.9; 2; 1 ], which should converge much slower. You may also want to try PowerMethodScript2.m, which illustrates what happens if there are two eigenvalues that are equal in value and both largest in magnitude (relative to the other eigenvalues).

12.4.3 In Preparation for this Week's Enrichment

View at edX

In the last unit we introduced a practical method for computing an eigenvector associated with the largest eigenvalue (in magnitude). This method is known as the Power Method. The next homework shows how to compute an eigenvalue associated with an eigenvector. Thus, the Power Method can be used to first approximate that eigenvector, and then the below result can be used to compute the associated eigenvalue.

Given $A \in \mathbb{R}^{n \times n}$ and nonzero vector $x \in \mathbb{R}^n$, the scalar $x^T A x / x^T x$ is known as the Rayleigh quotient.

Homework Let $A \in \mathbb{R}^{n \times n}$ and $x$ equal an eigenvector of $A$. Assume that $x$ is real valued, as is the eigenvalue $\lambda$ with $Ax = \lambda x$.

  $\lambda = \dfrac{x^T A x}{x^T x}$ is the eigenvalue associated with the eigenvector $x$. Always/Sometimes/Never

Notice that we are carefully avoiding talking about complex valued eigenvectors. The above results can be modified for the case where $x$ is an eigenvector associated with a complex eigenvalue and the case where $A$ itself is complex valued. However, this goes beyond the scope of this course.

The following result allows the Power Method to be extended so that it can be used to compute the eigenvector associated with the smallest eigenvalue (in magnitude). The new method is called the Inverse Power Method and is discussed in this week's enrichment section.

Homework Let $A \in \mathbb{R}^{n \times n}$ be nonsingular, $\lambda \in \Lambda(A)$, and $Ax = \lambda x$. Then $A^{-1} x = \frac{1}{\lambda} x$. True/False

The Inverse Power Method can be accelerated by shifting the eigenvalues of the matrix, as discussed in this week's enrichment, yielding the Rayleigh Quotient Iteration. The following exercise prepares the way.

Homework Let $A \in \mathbb{R}^{n \times n}$ and $\lambda \in \Lambda(A)$. Then $\lambda - \mu \in \Lambda(A - \mu I)$. True/False

12.5 Enrichment

12.5.1 The Inverse Power Method

The Inverse Power Method exploits a property we established in Unit 12.4.3: if $A$ is nonsingular and $\lambda \in \Lambda(A)$, then $1/\lambda \in \Lambda(A^{-1})$.

Again, let's make some assumptions. Given nonsingular $A \in \mathbb{R}^{n \times n}$:

  Let $\lambda_0, \lambda_1, \ldots, \lambda_{n-2}, \lambda_{n-1} \in \Lambda(A)$. We list eigenvalues that have algebraic multiplicity $k$ multiple ($k$) times in this list.
  Let us assume that $|\lambda_0| \geq |\lambda_1| \geq \cdots \geq |\lambda_{n-2}| > |\lambda_{n-1}| > 0$. This implies that $\lambda_{n-1}$ is real, since complex eigenvalues come in conjugate pairs, and hence there would otherwise have been two eigenvalues with equal smallest magnitude. It also means that there is a real valued eigenvector associated with $\lambda_{n-1}$.
  Let us assume that $A \in \mathbb{R}^{n \times n}$ is diagonalizable so that

    $A = V \Lambda V^{-1} = \left(v_0 \mid v_1 \mid \cdots \mid v_{n-2} \mid v_{n-1}\right) \begin{pmatrix} \lambda_0 & & & & \\ & \lambda_1 & & & \\ & & \ddots & & \\ & & & \lambda_{n-2} & \\ & & & & \lambda_{n-1} \end{pmatrix} \left(v_0 \mid v_1 \mid \cdots \mid v_{n-2} \mid v_{n-1}\right)^{-1}.$

    This means that $v_i$ is an eigenvector associated with $\lambda_i$.

These assumptions set the stage.

Now, we again start with some vector $x^{(0)} \in \mathbb{R}^n$. Since $V$ is nonsingular, the vectors $v_0, \ldots, v_{n-1}$ form a linearly independent basis for $\mathbb{R}^n$. Hence,

  $x^{(0)} = \gamma_0 v_0 + \gamma_1 v_1 + \cdots + \gamma_{n-2} v_{n-2} + \gamma_{n-1} v_{n-1} = Vc.$

Now, we generate

  $x^{(1)} = A^{-1} x^{(0)}$
  $x^{(2)} = A^{-1} x^{(1)}$
  $x^{(3)} = A^{-1} x^{(2)}$
  $\vdots$

The following algorithm accomplishes this:

  for $k = 0, \ldots$, until $x^{(k)}$ doesn't change (much) anymore
    Solve $A x^{(k+1)} := x^{(k)}$
  endfor

(In practice, one would probably factor $A$ once, and reuse the factors for the solves.)

Notice that then

  $x^{(k)} = A^{-1} x^{(k-1)} = (A^{-1})^2 x^{(k-2)} = \cdots = (A^{-1})^k x^{(0)}.$

But then

  $(A^{-1})^k x^{(0)} = (A^{-1})^k \underbrace{\left(\gamma_0 v_0 + \gamma_1 v_1 + \cdots + \gamma_{n-2} v_{n-2} + \gamma_{n-1} v_{n-1}\right)}_{Vc}$
  $\quad = \gamma_0 (A^{-1})^k v_0 + \gamma_1 (A^{-1})^k v_1 + \cdots + \gamma_{n-2} (A^{-1})^k v_{n-2} + \gamma_{n-1} (A^{-1})^k v_{n-1}$
  $\quad = \gamma_0 \left(\frac{1}{\lambda_0}\right)^k v_0 + \gamma_1 \left(\frac{1}{\lambda_1}\right)^k v_1 + \cdots + \gamma_{n-2} \left(\frac{1}{\lambda_{n-2}}\right)^k v_{n-2} + \gamma_{n-1} \left(\frac{1}{\lambda_{n-1}}\right)^k v_{n-1}$
  $\quad = V (\Lambda^{-1})^k c.$

Now, if $\lambda_{n-1} = 1$, then $\left|\frac{1}{\lambda_j}\right| < 1$ for $j < n-1$, and hence

  $\lim_{k \to \infty} x^{(k)} = \lim_{k \to \infty} \left( \gamma_0 \left(\frac{1}{\lambda_0}\right)^k v_0 + \cdots + \gamma_{n-2} \left(\frac{1}{\lambda_{n-2}}\right)^k v_{n-2} + \gamma_{n-1} v_{n-1} \right) = \gamma_{n-1} v_{n-1},$

which means that $x^{(k)}$ eventually starts pointing in the direction of $v_{n-1}$, the eigenvector associated with the eigenvalue that is smallest in magnitude. (Well, as long as $\gamma_{n-1} \neq 0$.)

Similar to before, we can instead iterate with the matrix $\lambda_{n-1} A^{-1}$, in which case

  $\left|\frac{\lambda_{n-1}}{\lambda_0}\right| \leq \cdots \leq \left|\frac{\lambda_{n-1}}{\lambda_{n-2}}\right| < \left|\frac{\lambda_{n-1}}{\lambda_{n-1}}\right| = 1.$

The iteration then becomes

  $x^{(1)} = \lambda_{n-1} A^{-1} x^{(0)}$
  $x^{(2)} = \lambda_{n-1} A^{-1} x^{(1)}$
  $x^{(3)} = \lambda_{n-1} A^{-1} x^{(2)}$
  $\vdots$

The following algorithm accomplishes this:

  for $k = 0, \ldots$, until $x^{(k)}$ doesn't change (much) anymore
    Solve $A x^{(k+1)} := x^{(k)}$
    $x^{(k+1)} := \lambda_{n-1} x^{(k+1)}$
  endfor

It is not hard to see that then

  $\lim_{k \to \infty} x^{(k)} = \lim_{k \to \infty} \left( \gamma_0 \left(\frac{\lambda_{n-1}}{\lambda_0}\right)^k v_0 + \cdots + \gamma_{n-2} \left(\frac{\lambda_{n-1}}{\lambda_{n-2}}\right)^k v_{n-2} + \gamma_{n-1} v_{n-1} \right) = \gamma_{n-1} v_{n-1}.$

So, it seems that we have an algorithm that always works as long as $|\lambda_0| \geq \cdots \geq |\lambda_{n-2}| > |\lambda_{n-1}|$.

Again, we are cheating... If we knew $\lambda_{n-1}$, then we could simply compute the eigenvector by finding a vector in the null space of $A - \lambda_{n-1} I$. Again, the key insight is that, in $x^{(k+1)} = \lambda_{n-1} A^{-1} x^{(k)}$, multiplying by $\lambda_{n-1}$ is merely meant to keep the vector $x^{(k)}$ from getting progressively larger (if $|\lambda_{n-1}| < 1$) or smaller (if $|\lambda_{n-1}| > 1$). We can alternatively simply make $x^{(k)}$ of length one at each step, and that will have the same effect without requiring $\lambda_{n-1}$:

  for $k = 0, \ldots$, until $x^{(k)}$ doesn't change (much) anymore
    Solve $A x^{(k+1)} := x^{(k)}$
    $x^{(k+1)} := x^{(k+1)} / \|x^{(k+1)}\|_2$
  endfor

This last algorithm is known as the Inverse Power Method for finding an eigenvector associated with the smallest eigenvalue (in magnitude).
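A minimal Octave/MATLAB sketch of the Inverse Power Method, factoring A once and reusing the LU factors for every solve (the test matrix and iteration count are hypothetical; the course's InversePowerMethodScript.m, discussed next, is the authoritative version):

  A = [ 4 1 0
        1 3 1
        0 1 2 ];
  [ L, U, P ] = lu( A );         % factor once: P A = L U
  x = rand( 3, 1 );
  for k = 1:100
      x = U \ ( L \ ( P * x ) ); % solve A z = x, i.e. x := inv(A) x, reusing the factors
      x = x / norm( x );         % rescale to unit length
  end
  x' * A * x                     % approximates the eigenvalue smallest in magnitude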

Homework The script in InversePowerMethodScript.m illustrates how the Inverse Power Method, starting with a random vector, computes an eigenvector corresponding to the eigenvalue that is smallest in magnitude, and (via the Rayleigh quotient) an approximation for that eigenvalue. To try it out, in the Command Window type

  >> InversePowerMethodScript
  input a vector of eigenvalues. e.g.: [ 4; 3; 2; 1 ]
  [ 4; 3; 2; 1 ]

If you compare the script for the Power Method with this script, you notice that the difference is that we now use $A^{-1}$ instead of $A$. To save on computation, we compute the LU factorization once, and solve $LUz = x$, overwriting $x$ with $z$, to update $x := A^{-1} x$. You will notice that for this distribution of eigenvalues, the Inverse Power Method converges faster than the Power Method does.

Try some other distributions of eigenvalues. For example, [ 4; 3; 1.25; 1 ], which should converge slower, or [ 4; 3.9; 3.8; 1 ], which should converge faster.

Now, it is possible to accelerate the Inverse Power Method if one has a good guess of $\lambda_{n-1}$. The idea is as follows: Let $\mu$ be close to $\lambda_{n-1}$. Then we know that $(A - \mu I)x = (\lambda - \mu)x$. Thus, an eigenvector of $A$ is an eigenvector of $A^{-1}$, is an eigenvector of $A - \mu I$, and is an eigenvector of $(A - \mu I)^{-1}$. Now, if $\mu$ is close to $\lambda_{n-1}$, then hopefully

  $|\lambda_0 - \mu| \geq |\lambda_1 - \mu| \geq \cdots \geq |\lambda_{n-2} - \mu| > |\lambda_{n-1} - \mu|.$

The important thing is that if, as before,

  $x^{(0)} = \gamma_0 v_0 + \gamma_1 v_1 + \cdots + \gamma_{n-2} v_{n-2} + \gamma_{n-1} v_{n-1},$

where $v_j$ equals the eigenvector associated with $\lambda_j$, then

  $x^{(k)} = (\lambda_{n-1} - \mu)^k \left((A - \mu I)^{-1}\right)^k x^{(0)} = \gamma_0 \left(\frac{\lambda_{n-1} - \mu}{\lambda_0 - \mu}\right)^k v_0 + \gamma_1 \left(\frac{\lambda_{n-1} - \mu}{\lambda_1 - \mu}\right)^k v_1 + \cdots + \gamma_{n-2} \left(\frac{\lambda_{n-1} - \mu}{\lambda_{n-2} - \mu}\right)^k v_{n-2} + \gamma_{n-1} v_{n-1}.$

Now, how fast the terms involving $v_0, \ldots, v_{n-2}$ become negligible is dictated by the ratio

  $\left|\frac{\lambda_{n-1} - \mu}{\lambda_{n-2} - \mu}\right|.$

Clearly, this can be made arbitrarily small by picking $\mu$ arbitrarily close to $\lambda_{n-1}$. Of course, that would require knowing $\lambda_{n-1}$...

The practical algorithm for this is given by

  for $k = 0, \ldots$, until $x^{(k)}$ doesn't change (much) anymore
    Solve $(A - \mu I) x^{(k+1)} := x^{(k)}$
    $x^{(k+1)} := x^{(k+1)} / \|x^{(k+1)}\|_2$
  endfor

which is referred to as the Shifted Inverse Power Method. Obviously, we would want to factor $A - \mu I$ only once.

Homework The script in ShiftedInversePowerMethodScript.m illustrates how shifting the matrix can improve how fast the Inverse Power Method, starting with a random vector, computes an eigenvector corresponding to the eigenvalue that is smallest in magnitude, and (via the Rayleigh quotient) an approximation for that eigenvalue. To try it out, in the Command Window type

  >> ShiftedInversePowerMethodScript
  input a vector of eigenvalues. e.g.: [ 4; 3; 2; 1 ]
  [ 4; 3; 2; 1 ]
  <bunch of output>
  enter a shift to use: (a number close to the smallest eigenvalue) 0.9

If you compare the script for the Inverse Power Method with this script, you notice that the difference is that we now iterate with $(A - \sigma I)^{-1}$, where $\sigma$ is the shift, instead of $A^{-1}$. To save on computation, we compute the LU factorization of $A - \sigma I$ once, and solve $LUz = x$, overwriting $x$ with $z$, to update $x := (A - \sigma I)^{-1} x$. You will notice that if you pick the shift close to the smallest eigenvalue (in magnitude), this Shifted Inverse Power Method converges faster than the Inverse Power Method does. Indeed, pick the shift very close, and the convergence is very fast. See what happens if you pick the shift exactly equal to the smallest eigenvalue. See what happens if you pick it close to another eigenvalue.

12.5.2 The Rayleigh Quotient Iteration

In the previous unit, we explained that the Shifted Inverse Power Method converges quickly if only we knew a scalar $\mu$ close to $\lambda_{n-1}$. The observation is that $x^{(k)}$ eventually approaches $v_{n-1}$. If we knew $v_{n-1}$ but not $\lambda_{n-1}$, then we could compute the Rayleigh quotient:

  $\lambda_{n-1} = \frac{v_{n-1}^T A v_{n-1}}{v_{n-1}^T v_{n-1}}.$

But we know an approximation of $v_{n-1}$ (or at least its direction) and hence can pick

  $\mu = \frac{x^{(k)T} A x^{(k)}}{x^{(k)T} x^{(k)}} \approx \lambda_{n-1},$

which will become a progressively better approximation to $\lambda_{n-1}$ as $k$ increases. This then motivates the Rayleigh Quotient Iteration:

  for $k = 0, \ldots$, until $x^{(k)}$ doesn't change (much) anymore
    $\mu := \dfrac{x^{(k)T} A x^{(k)}}{x^{(k)T} x^{(k)}}$
    Solve $(A - \mu I) x^{(k+1)} := x^{(k)}$
    $x^{(k+1)} := x^{(k+1)} / \|x^{(k+1)}\|_2$
  endfor
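A minimal Octave/MATLAB sketch of the Rayleigh Quotient Iteration (the test matrix and iteration count are hypothetical; since x is kept at unit length, the Rayleigh quotient reduces to x' * A * x, as noted next):

  A = [ 4 1 0
        1 3 1
        0 1 2 ];
  x = rand( 3, 1 );
  x = x / norm( x );
  for k = 1:10
      mu = x' * A * x;                % Rayleigh quotient (x has unit length)
      x = ( A - mu * eye(3) ) \ x;    % solve the shifted system; may warn near convergence
      x = x / norm( x );
  end
  mu = x' * A * x                     % the eigenvalue the iteration converged to

Note that the shifted matrix changes every iteration, so it must be refactored each time, matching the disadvantage discussed below.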

Notice that if $x^{(0)}$ has length one, then we can compute $\mu := x^{(k)T} A x^{(k)}$ instead, since $x^{(k)}$ will always be of length one.

The disadvantage of the Rayleigh Quotient Iteration is that one cannot factor $A - \mu I$ once before starting the loop. The advantage is that it converges dazzlingly fast. Obviously, "dazzlingly" is not a very precise term. Unfortunately, quantifying how fast it converges is beyond this enrichment.

Homework The script in RayleighQuotientIterationScript.m illustrates how shifting the matrix by the Rayleigh quotient can greatly improve how fast the Shifted Inverse Power Method, starting with a random vector, computes an eigenvector. (It could be that the random vector is close to an eigenvector associated with any of the eigenvalues, in which case the method will start converging towards an eigenvector associated with that eigenvalue. Pay close attention to how many digits are accurate from one iteration to the next.) To try it out, in the Command Window type

  >> RayleighQuotientIterationScript
  input a vector of eigenvalues. e.g.: [ 4; 3; 2; 1 ]
  [ 4; 3; 2; 1 ]

12.5.3 More Advanced Techniques

The Power Method and its variants are the basis of algorithms that compute all eigenvalues and eigenvectors of a given matrix. Details, presented with notation similar to what you have learned in this class, can be found in "LAFF: Notes on Numerical Linear Algebra."

Consider this unit a cliffhanger. You will want to take a graduate level course on Numerical Linear Algebra next!

12.6 Wrap Up

12.6.1 Homework

No additional homework this week.

12.6.2 Summary

The algebraic eigenvalue problem

The algebraic eigenvalue problem is given by

  $Ax = \lambda x$,

where $A \in \mathbb{R}^{n \times n}$ is a square matrix, $\lambda$ is a scalar, and $x$ is a nonzero vector.

  If $x \neq 0$, then $\lambda$ is said to be an eigenvalue and $x$ is said to be an eigenvector associated with the eigenvalue $\lambda$.
  The tuple $(\lambda, x)$ is said to be an eigenpair.
  The set of all vectors that satisfy $Ax = \lambda x$ is a subspace.


More information

Eigenvalues and Eigenvectors

Eigenvalues and Eigenvectors Contents Eigenvalues and Eigenvectors. Basic Concepts. Applications of Eigenvalues and Eigenvectors 8.3 Repeated Eigenvalues and Symmetric Matrices 3.4 Numerical Determination of Eigenvalues and Eigenvectors

More information

Computational Methods. Eigenvalues and Singular Values

Computational Methods. Eigenvalues and Singular Values Computational Methods Eigenvalues and Singular Values Manfred Huber 2010 1 Eigenvalues and Singular Values Eigenvalues and singular values describe important aspects of transformations and of data relations

More information

Numerical Methods I Eigenvalue Problems

Numerical Methods I Eigenvalue Problems Numerical Methods I Eigenvalue Problems Aleksandar Donev Courant Institute, NYU 1 donev@courant.nyu.edu 1 MATH-GA 2011.003 / CSCI-GA 2945.003, Fall 2014 October 2nd, 2014 A. Donev (Courant Institute) Lecture

More information

33AH, WINTER 2018: STUDY GUIDE FOR FINAL EXAM

33AH, WINTER 2018: STUDY GUIDE FOR FINAL EXAM 33AH, WINTER 2018: STUDY GUIDE FOR FINAL EXAM (UPDATED MARCH 17, 2018) The final exam will be cumulative, with a bit more weight on more recent material. This outline covers the what we ve done since the

More information

EIGENVALUE PROBLEMS. EIGENVALUE PROBLEMS p. 1/4

EIGENVALUE PROBLEMS. EIGENVALUE PROBLEMS p. 1/4 EIGENVALUE PROBLEMS EIGENVALUE PROBLEMS p. 1/4 EIGENVALUE PROBLEMS p. 2/4 Eigenvalues and eigenvectors Let A C n n. Suppose Ax = λx, x 0, then x is a (right) eigenvector of A, corresponding to the eigenvalue

More information

EIGENVALUES AND EIGENVECTORS 3

EIGENVALUES AND EIGENVECTORS 3 EIGENVALUES AND EIGENVECTORS 3 1. Motivation 1.1. Diagonal matrices. Perhaps the simplest type of linear transformations are those whose matrix is diagonal (in some basis). Consider for example the matrices

More information

Eigenvectors and Hermitian Operators

Eigenvectors and Hermitian Operators 7 71 Eigenvalues and Eigenvectors Basic Definitions Let L be a linear operator on some given vector space V A scalar λ and a nonzero vector v are referred to, respectively, as an eigenvalue and corresponding

More information

Chapter 7. Canonical Forms. 7.1 Eigenvalues and Eigenvectors

Chapter 7. Canonical Forms. 7.1 Eigenvalues and Eigenvectors Chapter 7 Canonical Forms 7.1 Eigenvalues and Eigenvectors Definition 7.1.1. Let V be a vector space over the field F and let T be a linear operator on V. An eigenvalue of T is a scalar λ F such that there

More information

MATH 320, WEEK 11: Eigenvalues and Eigenvectors

MATH 320, WEEK 11: Eigenvalues and Eigenvectors MATH 30, WEEK : Eigenvalues and Eigenvectors Eigenvalues and Eigenvectors We have learned about several vector spaces which naturally arise from matrix operations In particular, we have learned about the

More information

Calculating determinants for larger matrices

Calculating determinants for larger matrices Day 26 Calculating determinants for larger matrices We now proceed to define det A for n n matrices A As before, we are looking for a function of A that satisfies the product formula det(ab) = det A det

More information

Notes on Eigenvalues, Singular Values and QR

Notes on Eigenvalues, Singular Values and QR Notes on Eigenvalues, Singular Values and QR Michael Overton, Numerical Computing, Spring 2017 March 30, 2017 1 Eigenvalues Everyone who has studied linear algebra knows the definition: given a square

More information

18.06SC Final Exam Solutions

18.06SC Final Exam Solutions 18.06SC Final Exam Solutions 1 (4+7=11 pts.) Suppose A is 3 by 4, and Ax = 0 has exactly 2 special solutions: 1 2 x 1 = 1 and x 2 = 1 1 0 0 1 (a) Remembering that A is 3 by 4, find its row reduced echelon

More information

Eigenvalue and Eigenvector Problems

Eigenvalue and Eigenvector Problems Eigenvalue and Eigenvector Problems An attempt to introduce eigenproblems Radu Trîmbiţaş Babeş-Bolyai University April 8, 2009 Radu Trîmbiţaş ( Babeş-Bolyai University) Eigenvalue and Eigenvector Problems

More information

Lecture 10: Powers of Matrices, Difference Equations

Lecture 10: Powers of Matrices, Difference Equations Lecture 10: Powers of Matrices, Difference Equations Difference Equations A difference equation, also sometimes called a recurrence equation is an equation that defines a sequence recursively, i.e. each

More information

Solving a system by back-substitution, checking consistency of a system (no rows of the form

Solving a system by back-substitution, checking consistency of a system (no rows of the form MATH 520 LEARNING OBJECTIVES SPRING 2017 BROWN UNIVERSITY SAMUEL S. WATSON Week 1 (23 Jan through 27 Jan) Definition of a system of linear equations, definition of a solution of a linear system, elementary

More information

1. In this problem, if the statement is always true, circle T; otherwise, circle F.

1. In this problem, if the statement is always true, circle T; otherwise, circle F. Math 1553, Extra Practice for Midterm 3 (sections 45-65) Solutions 1 In this problem, if the statement is always true, circle T; otherwise, circle F a) T F If A is a square matrix and the homogeneous equation

More information

4 Matrix Diagonalization and Eigensystems

4 Matrix Diagonalization and Eigensystems 14.102, Math for Economists Fall 2004 Lecture Notes, 9/21/2004 These notes are primarily based on those written by George Marios Angeletos for the Harvard Math Camp in 1999 and 2000, and updated by Stavros

More information

MATH 221, Spring Homework 10 Solutions

MATH 221, Spring Homework 10 Solutions MATH 22, Spring 28 - Homework Solutions Due Tuesday, May Section 52 Page 279, Problem 2: 4 λ A λi = and the characteristic polynomial is det(a λi) = ( 4 λ)( λ) ( )(6) = λ 6 λ 2 +λ+2 The solutions to the

More information

Eigenvalues and Eigenvectors

Eigenvalues and Eigenvectors Eigenvalues and Eigenvectors week -2 Fall 26 Eigenvalues and eigenvectors The most simple linear transformation from R n to R n may be the transformation of the form: T (x,,, x n ) (λ x, λ 2,, λ n x n

More information

6 EIGENVALUES AND EIGENVECTORS

6 EIGENVALUES AND EIGENVECTORS 6 EIGENVALUES AND EIGENVECTORS INTRODUCTION TO EIGENVALUES 61 Linear equations Ax = b come from steady state problems Eigenvalues have their greatest importance in dynamic problems The solution of du/dt

More information

DS-GA 1002 Lecture notes 0 Fall Linear Algebra. These notes provide a review of basic concepts in linear algebra.

DS-GA 1002 Lecture notes 0 Fall Linear Algebra. These notes provide a review of basic concepts in linear algebra. DS-GA 1002 Lecture notes 0 Fall 2016 Linear Algebra These notes provide a review of basic concepts in linear algebra. 1 Vector spaces You are no doubt familiar with vectors in R 2 or R 3, i.e. [ ] 1.1

More information

4. Determinants.

4. Determinants. 4. Determinants 4.1. Determinants; Cofactor Expansion Determinants of 2 2 and 3 3 Matrices 2 2 determinant 4.1. Determinants; Cofactor Expansion Determinants of 2 2 and 3 3 Matrices 3 3 determinant 4.1.

More information

MIT Final Exam Solutions, Spring 2017

MIT Final Exam Solutions, Spring 2017 MIT 8.6 Final Exam Solutions, Spring 7 Problem : For some real matrix A, the following vectors form a basis for its column space and null space: C(A) = span,, N(A) = span,,. (a) What is the size m n of

More information

1 Last time: least-squares problems

1 Last time: least-squares problems MATH Linear algebra (Fall 07) Lecture Last time: least-squares problems Definition. If A is an m n matrix and b R m, then a least-squares solution to the linear system Ax = b is a vector x R n such that

More information

IMPORTANT DEFINITIONS AND THEOREMS REFERENCE SHEET

IMPORTANT DEFINITIONS AND THEOREMS REFERENCE SHEET IMPORTANT DEFINITIONS AND THEOREMS REFERENCE SHEET This is a (not quite comprehensive) list of definitions and theorems given in Math 1553. Pay particular attention to the ones in red. Study Tip For each

More information

Chapter 5 Eigenvalues and Eigenvectors

Chapter 5 Eigenvalues and Eigenvectors Chapter 5 Eigenvalues and Eigenvectors Outline 5.1 Eigenvalues and Eigenvectors 5.2 Diagonalization 5.3 Complex Vector Spaces 2 5.1 Eigenvalues and Eigenvectors Eigenvalue and Eigenvector If A is a n n

More information

TMA Calculus 3. Lecture 21, April 3. Toke Meier Carlsen Norwegian University of Science and Technology Spring 2013

TMA Calculus 3. Lecture 21, April 3. Toke Meier Carlsen Norwegian University of Science and Technology Spring 2013 TMA4115 - Calculus 3 Lecture 21, April 3 Toke Meier Carlsen Norwegian University of Science and Technology Spring 2013 www.ntnu.no TMA4115 - Calculus 3, Lecture 21 Review of last week s lecture Last week

More information

A = 3 1. We conclude that the algebraic multiplicity of the eigenvalues are both one, that is,

A = 3 1. We conclude that the algebraic multiplicity of the eigenvalues are both one, that is, 65 Diagonalizable Matrices It is useful to introduce few more concepts, that are common in the literature Definition 65 The characteristic polynomial of an n n matrix A is the function p(λ) det(a λi) Example

More information

EE5120 Linear Algebra: Tutorial 6, July-Dec Covers sec 4.2, 5.1, 5.2 of GS

EE5120 Linear Algebra: Tutorial 6, July-Dec Covers sec 4.2, 5.1, 5.2 of GS EE0 Linear Algebra: Tutorial 6, July-Dec 07-8 Covers sec 4.,.,. of GS. State True or False with proper explanation: (a) All vectors are eigenvectors of the Identity matrix. (b) Any matrix can be diagonalized.

More information

Eigenvalues and Eigenvectors

Eigenvalues and Eigenvectors Chapter 1 Eigenvalues and Eigenvectors Among problems in numerical linear algebra, the determination of the eigenvalues and eigenvectors of matrices is second in importance only to the solution of linear

More information

OHSx XM511 Linear Algebra: Solutions to Online True/False Exercises

OHSx XM511 Linear Algebra: Solutions to Online True/False Exercises This document gives the solutions to all of the online exercises for OHSx XM511. The section ( ) numbers refer to the textbook. TYPE I are True/False. Answers are in square brackets [. Lecture 02 ( 1.1)

More information

Math 205, Summer I, Week 4b:

Math 205, Summer I, Week 4b: Math 205, Summer I, 2016 Week 4b: Chapter 5, Sections 6, 7 and 8 (5.5 is NOT on the syllabus) 5.6 Eigenvalues and Eigenvectors 5.7 Eigenspaces, nondefective matrices 5.8 Diagonalization [*** See next slide

More information

Bare-bones outline of eigenvalue theory and the Jordan canonical form

Bare-bones outline of eigenvalue theory and the Jordan canonical form Bare-bones outline of eigenvalue theory and the Jordan canonical form April 3, 2007 N.B.: You should also consult the text/class notes for worked examples. Let F be a field, let V be a finite-dimensional

More information

Foundations of Matrix Analysis

Foundations of Matrix Analysis 1 Foundations of Matrix Analysis In this chapter we recall the basic elements of linear algebra which will be employed in the remainder of the text For most of the proofs as well as for the details, the

More information

Econ Slides from Lecture 7

Econ Slides from Lecture 7 Econ 205 Sobel Econ 205 - Slides from Lecture 7 Joel Sobel August 31, 2010 Linear Algebra: Main Theory A linear combination of a collection of vectors {x 1,..., x k } is a vector of the form k λ ix i for

More information

Math 489AB Exercises for Chapter 2 Fall Section 2.3

Math 489AB Exercises for Chapter 2 Fall Section 2.3 Math 489AB Exercises for Chapter 2 Fall 2008 Section 2.3 2.3.3. Let A M n (R). Then the eigenvalues of A are the roots of the characteristic polynomial p A (t). Since A is real, p A (t) is a polynomial

More information

Diagonalization. MATH 1502 Calculus II Notes. November 4, 2008

Diagonalization. MATH 1502 Calculus II Notes. November 4, 2008 Diagonalization MATH 1502 Calculus II Notes November 4, 2008 We want to understand all linear transformations L : R n R m. The basic case is the one in which n = m. That is to say, the case in which the

More information

Math 315: Linear Algebra Solutions to Assignment 7

Math 315: Linear Algebra Solutions to Assignment 7 Math 5: Linear Algebra s to Assignment 7 # Find the eigenvalues of the following matrices. (a.) 4 0 0 0 (b.) 0 0 9 5 4. (a.) The characteristic polynomial det(λi A) = (λ )(λ )(λ ), so the eigenvalues are

More information

LS.2 Homogeneous Linear Systems with Constant Coefficients

LS.2 Homogeneous Linear Systems with Constant Coefficients LS2 Homogeneous Linear Systems with Constant Coefficients Using matrices to solve linear systems The naive way to solve a linear system of ODE s with constant coefficients is by eliminating variables,

More information

Chapters 5 & 6: Theory Review: Solutions Math 308 F Spring 2015

Chapters 5 & 6: Theory Review: Solutions Math 308 F Spring 2015 Chapters 5 & 6: Theory Review: Solutions Math 308 F Spring 205. If A is a 3 3 triangular matrix, explain why det(a) is equal to the product of entries on the diagonal. If A is a lower triangular or diagonal

More information

Extreme Values and Positive/ Negative Definite Matrix Conditions

Extreme Values and Positive/ Negative Definite Matrix Conditions Extreme Values and Positive/ Negative Definite Matrix Conditions James K. Peterson Department of Biological Sciences and Department of Mathematical Sciences Clemson University November 8, 016 Outline 1

More information

IMPORTANT DEFINITIONS AND THEOREMS REFERENCE SHEET

IMPORTANT DEFINITIONS AND THEOREMS REFERENCE SHEET IMPORTANT DEFINITIONS AND THEOREMS REFERENCE SHEET This is a (not quite comprehensive) list of definitions and theorems given in Math 1553. Pay particular attention to the ones in red. Study Tip For each

More information

Linear algebra II Homework #1 solutions A = This means that every eigenvector with eigenvalue λ = 1 must have the form

Linear algebra II Homework #1 solutions A = This means that every eigenvector with eigenvalue λ = 1 must have the form Linear algebra II Homework # solutions. Find the eigenvalues and the eigenvectors of the matrix 4 6 A =. 5 Since tra = 9 and deta = = 8, the characteristic polynomial is f(λ) = λ (tra)λ+deta = λ 9λ+8 =

More information

DIAGONALIZATION. In order to see the implications of this definition, let us consider the following example Example 1. Consider the matrix

DIAGONALIZATION. In order to see the implications of this definition, let us consider the following example Example 1. Consider the matrix DIAGONALIZATION Definition We say that a matrix A of size n n is diagonalizable if there is a basis of R n consisting of eigenvectors of A ie if there are n linearly independent vectors v v n such that

More information

EIGENVALUE PROBLEMS. Background on eigenvalues/ eigenvectors / decompositions. Perturbation analysis, condition numbers..

EIGENVALUE PROBLEMS. Background on eigenvalues/ eigenvectors / decompositions. Perturbation analysis, condition numbers.. EIGENVALUE PROBLEMS Background on eigenvalues/ eigenvectors / decompositions Perturbation analysis, condition numbers.. Power method The QR algorithm Practical QR algorithms: use of Hessenberg form and

More information

Linear Algebra Review

Linear Algebra Review Chapter 1 Linear Algebra Review It is assumed that you have had a course in linear algebra, and are familiar with matrix multiplication, eigenvectors, etc. I will review some of these terms here, but quite

More information

DM554 Linear and Integer Programming. Lecture 9. Diagonalization. Marco Chiarandini

DM554 Linear and Integer Programming. Lecture 9. Diagonalization. Marco Chiarandini DM554 Linear and Integer Programming Lecture 9 Marco Chiarandini Department of Mathematics & Computer Science University of Southern Denmark Outline 1. More on 2. 3. 2 Resume Linear transformations and

More information

Linear algebra II Tutorial solutions #1 A = x 1

Linear algebra II Tutorial solutions #1 A = x 1 Linear algebra II Tutorial solutions #. Find the eigenvalues and the eigenvectors of the matrix [ ] 5 2 A =. 4 3 Since tra = 8 and deta = 5 8 = 7, the characteristic polynomial is f(λ) = λ 2 (tra)λ+deta

More information

Matrix Vector Products

Matrix Vector Products We covered these notes in the tutorial sessions I strongly recommend that you further read the presented materials in classical books on linear algebra Please make sure that you understand the proofs and

More information

Numerical Linear Algebra

Numerical Linear Algebra Numerical Linear Algebra The two principal problems in linear algebra are: Linear system Given an n n matrix A and an n-vector b, determine x IR n such that A x = b Eigenvalue problem Given an n n matrix

More information

Linear Algebra, Summer 2011, pt. 2

Linear Algebra, Summer 2011, pt. 2 Linear Algebra, Summer 2, pt. 2 June 8, 2 Contents Inverses. 2 Vector Spaces. 3 2. Examples of vector spaces..................... 3 2.2 The column space......................... 6 2.3 The null space...........................

More information

Lecture 10 - Eigenvalues problem

Lecture 10 - Eigenvalues problem Lecture 10 - Eigenvalues problem Department of Computer Science University of Houston February 28, 2008 1 Lecture 10 - Eigenvalues problem Introduction Eigenvalue problems form an important class of problems

More information

3.3 Eigenvalues and Eigenvectors

3.3 Eigenvalues and Eigenvectors .. EIGENVALUES AND EIGENVECTORS 27. Eigenvalues and Eigenvectors In this section, we assume A is an n n matrix and x is an n vector... Definitions In general, the product Ax results is another n vector

More information

Numerical Methods - Numerical Linear Algebra

Numerical Methods - Numerical Linear Algebra Numerical Methods - Numerical Linear Algebra Y. K. Goh Universiti Tunku Abdul Rahman 2013 Y. K. Goh (UTAR) Numerical Methods - Numerical Linear Algebra I 2013 1 / 62 Outline 1 Motivation 2 Solving Linear

More information

Eigenvalues, Eigenvectors. Eigenvalues and eigenvector will be fundamentally related to the nature of the solutions of state space systems.

Eigenvalues, Eigenvectors. Eigenvalues and eigenvector will be fundamentally related to the nature of the solutions of state space systems. Chapter 3 Linear Algebra In this Chapter we provide a review of some basic concepts from Linear Algebra which will be required in order to compute solutions of LTI systems in state space form, discuss

More information

G1110 & 852G1 Numerical Linear Algebra

G1110 & 852G1 Numerical Linear Algebra The University of Sussex Department of Mathematics G & 85G Numerical Linear Algebra Lecture Notes Autumn Term Kerstin Hesse (w aw S w a w w (w aw H(wa = (w aw + w Figure : Geometric explanation of the

More information

Math 504 (Fall 2011) 1. (*) Consider the matrices

Math 504 (Fall 2011) 1. (*) Consider the matrices Math 504 (Fall 2011) Instructor: Emre Mengi Study Guide for Weeks 11-14 This homework concerns the following topics. Basic definitions and facts about eigenvalues and eigenvectors (Trefethen&Bau, Lecture

More information

Math 310 Final Exam Solutions

Math 310 Final Exam Solutions Math 3 Final Exam Solutions. ( pts) Consider the system of equations Ax = b where: A, b (a) Compute deta. Is A singular or nonsingular? (b) Compute A, if possible. (c) Write the row reduced echelon form

More information

Elementary Linear Algebra

Elementary Linear Algebra Matrices J MUSCAT Elementary Linear Algebra Matrices Definition Dr J Muscat 2002 A matrix is a rectangular array of numbers, arranged in rows and columns a a 2 a 3 a n a 2 a 22 a 23 a 2n A = a m a mn We

More information

Announcements Wednesday, November 7

Announcements Wednesday, November 7 Announcements Wednesday, November 7 The third midterm is on Friday, November 6 That is one week from this Friday The exam covers 45, 5, 52 53, 6, 62, 64, 65 (through today s material) WeBWorK 6, 62 are

More information

Eigenvalues and eigenvectors

Eigenvalues and eigenvectors Chapter 6 Eigenvalues and eigenvectors An eigenvalue of a square matrix represents the linear operator as a scaling of the associated eigenvector, and the action of certain matrices on general vectors

More information

CS 246 Review of Linear Algebra 01/17/19

CS 246 Review of Linear Algebra 01/17/19 1 Linear algebra In this section we will discuss vectors and matrices. We denote the (i, j)th entry of a matrix A as A ij, and the ith entry of a vector as v i. 1.1 Vectors and vector operations A vector

More information