Math 489AB Exercises for Chapter 2, Fall 2008

Section 2.3

2.3.3. Let A ∈ M_n(R). The eigenvalues of A are the roots of the characteristic polynomial p_A(t). Since A is real, p_A(t) is a polynomial with real coefficients. Thus the roots of p_A(t), and hence the eigenvalues of A, come in complex conjugate pairs. (See Appendix C.)

2.3.6. Suppose A, B ∈ M_n are simultaneously similar to upper triangular matrices; that is, there is a nonsingular matrix S such that S⁻¹AS = Δ1 and S⁻¹BS = Δ2, with Δ1 and Δ2 upper triangular. Then

AB − BA = SΔ1S⁻¹SΔ2S⁻¹ − SΔ2S⁻¹SΔ1S⁻¹ = SΔ1Δ2S⁻¹ − SΔ2Δ1S⁻¹ = S(Δ1Δ2 − Δ2Δ1)S⁻¹.

Therefore the eigenvalues of AB − BA are the eigenvalues of Δ1Δ2 − Δ2Δ1. Now let t11, ..., tnn be the diagonal elements of Δ1 and let s11, ..., snn be the diagonal elements of Δ2. Then

        [ t11  *  ...  *  ] [ s11  *  ...  *  ]   [ t11·s11    *     ...    *     ]
Δ1Δ2 =  [  0  t22 ...  *  ] [  0  s22 ...  *  ] = [    0    t22·s22  ...    *     ]
        [  :        .  :  ] [  :        .  :  ]   [    :             .      :     ]
        [  0   0  ... tnn ] [  0   0  ... snn ]   [    0       0    ... tnn·snn   ]

where the *'s represent possibly nonzero elements. Thus the diagonal elements of Δ1Δ2 are t11·s11, t22·s22, ..., tnn·snn. A similar calculation shows that the diagonal elements of Δ2Δ1 are the same. Thus Δ1Δ2 − Δ2Δ1 is an upper triangular matrix with 0's on the diagonal. Therefore the eigenvalues of Δ1Δ2 − Δ2Δ1, and hence the eigenvalues of AB − BA, are all zero.

Section 2.4

2.4.9. Let A ∈ M_n, B ∈ M_m, and suppose A and B have no eigenvalues in common. Suppose X ∈ M_{n,m} satisfies AX − XB = 0, i.e. AX = XB. Multiply both sides of this equation by A on the left to get

A²X = A(XB) = (AX)B = (XB)B = XB².

Repeating this k times, we obtain A^k X = X B^k for every nonnegative integer k. So for any polynomial p(t) = a_k t^k + a_{k−1} t^{k−1} + ... + a_1 t + a_0,

p(A)X = [a_k A^k + a_{k−1} A^{k−1} + ... + a_1 A + a_0 I] X
      = a_k A^k X + a_{k−1} A^{k−1} X + ... + a_1 AX + a_0 X
      = a_k X B^k + a_{k−1} X B^{k−1} + ... + a_1 XB + a_0 X
      = X [a_k B^k + a_{k−1} B^{k−1} + ... + a_1 B + a_0 I] = X p(B).

In particular this is true for the characteristic polynomial of A, which can be factored as p_A(t) = (t − λ1)(t − λ2)···(t − λn), where λ1, ..., λn are the eigenvalues of A. By the Cayley-Hamilton Theorem, p_A(A) = 0, so we have 0 = p_A(A)X = X p_A(B). Notice that p_A(B) = (B − λ1·I)(B − λ2·I)···(B − λn·I). Since no λi is an eigenvalue of B, each B − λi·I is nonsingular, and thus p_A(B) is nonsingular. (This is true because the eigenvalues of B are exactly those values λ for which B − λI is singular.) Therefore X p_A(B) = 0 has only the zero solution X = 0. And, since AX = XB implies p_A(A)X = X p_A(B), the equation AX − XB = 0 has only the zero solution.

Now consider the function T(X) = AX − XB. This is a map T : M_{n,m} → M_{n,m} on a vector space of dimension nm, and the above equation can be written as T(X) = 0. Moreover, for any scalars α, β and matrices X, Y, we have

T(αX + βY) = A(αX + βY) − (αX + βY)B = αAX + βAY − αXB − βYB = α(AX − XB) + β(AY − YB) = αT(X) + βT(Y),

so T is a linear transformation on M_{n,m}. Since T(X) = 0 has only the zero solution, T is a nonsingular linear transformation, hence T(X) = C has a unique solution for every C ∈ M_{n,m} (see Section 0.5). In summary, we have shown that if A and B have no eigenvalues in common, then the equation AX − XB = C has a unique solution for every C.

Section 2.5

2.5.1. Proposition: A ∈ M_n is normal if and only if ‖Ax‖ = ‖A*x‖ for all x ∈ Cⁿ.

Proof. (⇒): Suppose A is normal, i.e. A*A = AA*. Then for any x ∈ Cⁿ,

‖Ax‖² = ⟨Ax, Ax⟩ = (Ax)*(Ax) = x*A*Ax = x*AA*x = ⟨A*x, A*x⟩ = ‖A*x‖².

Thus ‖Ax‖ = ‖A*x‖ for all x ∈ Cⁿ.

(⇐): Suppose ‖Ax‖ = ‖A*x‖ for all x ∈ Cⁿ. Then for any x ∈ Cⁿ,

x*A*Ax = ⟨Ax, Ax⟩ = ‖Ax‖² = ‖A*x‖² = ⟨A*x, A*x⟩ = x*AA*x,

i.e. x*A*Ax = x*AA*x for all x. Thus it must be that A*A = AA*, so A is normal. (See Problem 4.1.6.)

2.5.3. We know already that the eigenvalues of a Hermitian matrix are real. Thus we will show that if the eigenvalues of a normal matrix A are real, then A is Hermitian. So suppose A is normal with real eigenvalues. Then A is unitarily similar to a real diagonal matrix D; that is, there is a unitary matrix S such that S*AS = D with D real. Since D is real, D* = D, so take the Hermitian adjoint of the above equation:

D* = (S*AS)* = S*A*S = D.

Thus S*AS = S*A*S. Multiply on the left by S and on the right by S*:

SS*ASS* = SS*A*SS*, i.e. A = A*,

so A is Hermitian.

2.5.19. Proposition: Let A ∈ M_n and a ∈ C. Then A is normal if and only if A + aI is normal.

Proof. (⇒): Let A ∈ M_n be normal and a ∈ C. Then

(A + aI)*(A + aI) = (A* + āI)(A + aI) = A*A + aA* + āA + āaI = AA* + aA* + āA + āaI = (A + aI)(A* + āI) = (A + aI)(A + aI)*,

so A + aI is normal.

(⇐): Suppose A + aI is normal. Then, as in the above calculation,

(A + aI)*(A + aI) = (A + aI)(A + aI)*
A*A + aA* + āA + āaI = AA* + aA* + āA + āaI.

Subtract aA* + āA + āaI from both sides to obtain A*A = AA*, thus A is normal.

2.5.20. Suppose A ∈ M_n is normal and Ax = λx. Then (A − λI)x = 0, and by the previous problem, A − λI is normal. Moreover, since (A − λI)x = 0, Problem 2.5.1 implies that ‖(A − λI)*x‖ = ‖(A − λI)x‖ = 0. Thus (A − λI)*x = (A* − λ̄I)x = 0, so A*x = λ̄x.

Section 2.6

2.6.5. Let A ∈ M_n and let A_k be the kth iterate of the QR algorithm:

A = A_0 = Q_0 R_0, and A_k = Q_k R_k, A_{k+1} = R_k Q_k,

where each Q_k is unitary and each R_k is upper triangular. Notice that

Q_k A_{k+1} Q_k* = Q_k R_k Q_k Q_k* = Q_k R_k = A_k,

so A_{k+1} is unitarily similar to A_k. By induction, each A_k is unitarily similar to A. Thus, for each k there is a unitary matrix U_k such that U_k A_k U_k* = A. Now suppose A_k → B. By the selection principle (Lemma 2.1.8), there is a subsequence U_{k_1}, U_{k_2}, ... such that all the entries of U_{k_i} converge to a unitary matrix U_0 as i → ∞. For this subsequence,

U_{k_i} A_{k_i} U_{k_i}* = A.

Taking the limit as i → ∞,

lim_{i→∞} U_{k_i} A_{k_i} U_{k_i}* = U_0 B U_0* = A.

Thus B is unitarily similar to A.
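A short numerical illustration of the iteration in 2.6.5 (NumPy; the test matrix, its eigenvalues, and the iteration count are arbitrary choices for illustration): for a symmetric matrix with well-separated eigenvalue magnitudes, the unshifted QR iteration converges to a diagonal limit B, and as shown above B is unitarily (here orthogonally) similar to A:

```python
import numpy as np

rng = np.random.default_rng(1)

# Symmetric A = Q0 diag(eigs) Q0^T with well-separated eigenvalue magnitudes,
# so the unshifted QR iteration converges.
Q0, _ = np.linalg.qr(rng.standard_normal((5, 5)))
eigs = np.array([5.0, 3.0, 2.0, 1.0, 0.5])
A = Q0 @ np.diag(eigs) @ Q0.T

Ak = A.copy()
for _ in range(200):
    Qk, Rk = np.linalg.qr(Ak)  # A_k = Q_k R_k
    Ak = Rk @ Qk               # A_{k+1} = R_k Q_k = Q_k^* A_k Q_k

# Every iterate is unitarily similar to A, so the spectrum is preserved,
# and the limit here is diagonal with the eigenvalues of A on its diagonal.
assert np.allclose(np.sort(np.linalg.eigvalsh(Ak)), np.sort(eigs))
assert np.allclose(np.sort(np.diag(Ak)), np.sort(eigs), atol=1e-8)
```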

Exercises for Chapter 3

Section 3.1

3.1.2. For

A = [ 1  1 ]
    [ 1  1 ],

the characteristic polynomial is

p_A(t) = det [ t−1   −1 ] = (t − 1)² − 1,
             [ −1   t−1 ]

so A has eigenvalues λ1 = 0 and λ2 = 2. Since there are two distinct eigenvalues, A is diagonalizable, so the Jordan form of A is

[ 0  0 ]      [ 2  0 ]
[ 0  2 ]  or  [ 0  0 ].

For

A = [ 3  1  2 ]
    [ 0  3  0 ]
    [ 0  0  3 ],

the matrix is triangular, so the eigenvalues are given by the diagonal entries. Hence A has a single eigenvalue, λ = 3, of multiplicity 3. To determine the Jordan form we must find the dimension of the eigenspace of λ = 3. We first find the eigenvectors of A, i.e. the solutions of (A − 3I)x = 0:

[ 0  1  2 ] [ x1 ]
[ 0  0  0 ] [ x2 ] = 0   ⟺   x2 + 2x3 = 0.
[ 0  0  0 ] [ x3 ]

There are therefore two linearly independent eigenvectors associated with λ = 3, for example

v1 = (1, 0, 0)ᵀ and v2 = (0, −2, 1)ᵀ.

There are therefore two Jordan blocks. Since n = 3, one of these has size 2 and the other has size 1. Thus the Jordan form of A is

[ 3  1  0 ]
[ 0  3  0 ]
[ 0  0  3 ].

3.1.3. Let A ∈ M_n be a matrix with complex (non-real) entries but only real eigenvalues. Then A is similar to a matrix in Jordan form, i.e. A = SJS⁻¹, where S is nonsingular and J is in Jordan form. The Jordan form J of A has the eigenvalues of A on its diagonal, and the only other possible nonzero elements of J are 1's in superdiagonal positions. Thus, since the eigenvalues of A are real, J contains only real elements. Therefore S must contain complex entries: otherwise the product SJS⁻¹ would be real, which contradicts the assumption that A has complex entries. So the similarity matrix cannot be real.
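The Jordan-form computation in 3.1.2 can be checked numerically. The sketch below (NumPy) confirms that the geometric multiplicity of λ = 3 is 2, and exhibits one valid choice of similarity matrix S with A = SJS⁻¹, built from the eigenvector e1, the generalized eigenvector e2 (which satisfies (A − 3I)e2 = e1), and the second eigenvector (0, −2, 1)ᵀ:

```python
import numpy as np

# The 3x3 matrix from Exercise 3.1.2.
A = np.array([[3.0, 1.0, 2.0],
              [0.0, 3.0, 0.0],
              [0.0, 0.0, 3.0]])

# Geometric multiplicity of lambda = 3 is 3 - rank(A - 3I) = 2,
# so there are two Jordan blocks.
assert np.linalg.matrix_rank(A - 3 * np.eye(3)) == 1

# Claimed Jordan form: one 2x2 block and one 1x1 block for lambda = 3.
J = np.array([[3.0, 1.0, 0.0],
              [0.0, 3.0, 0.0],
              [0.0, 0.0, 3.0]])

# Columns of S: eigenvector e1, generalized eigenvector e2 ((A - 3I)e2 = e1),
# and the second eigenvector (0, -2, 1)^T.
S = np.array([[1.0, 0.0,  0.0],
              [0.0, 1.0, -2.0],
              [0.0, 0.0,  1.0]])

# A S = S J is equivalent to A = S J S^{-1} since S is nonsingular.
assert np.allclose(A @ S, S @ J)
```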