(V.C) Complex eigenvalues and other topics


Let's take a closer look at the characteristic polynomial for a $2\times 2$ matrix, namely $A = \begin{pmatrix} a & b \\ c & d \end{pmatrix}$:

$$f_A(\lambda) = \det(\lambda I - A) = (\lambda - a)(\lambda - d) - bc = \lambda^2 - (a+d)\lambda + (ad - bc) = \lambda^2 - \mathrm{tr}(A)\,\lambda + \det A.$$

More generally, for $A \in M_n(F)$ one can check by direct calculation that

$$f_A(\lambda) = \lambda^n - \mathrm{tr}(A)\,\lambda^{n-1} + P_2(A)\,\lambda^{n-2} - \ldots + (-1)^n \det A =: \sum_{k=0}^{n} (-1)^k P_k(A)\,\lambda^{n-k},$$

where $P_1(A) = \mathrm{tr}(A)$ and $P_n(A) = \det A$. The $P_k(A)$ so defined are invariant polynomials of degree $k$ in the entries of $A$: $P_k(A) = P_k(S^{-1}AS)$, as one can see by comparing coefficients of like powers of $\lambda$ in the end terms of

$$\sum_k (-1)^k P_k(A)\,\lambda^{n-k} = \det(\lambda I - A) = \det\{S^{-1}(\lambda I - A)S\} = \det(\lambda I - S^{-1}AS) = \sum_k (-1)^k P_k(S^{-1}AS)\,\lambda^{n-k}.$$

If you didn't know $\mathrm{tr}(A) = \mathrm{tr}(S^{-1}AS)$ for all invertible $S$ (say, by not doing the exercises), now you do.
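These coefficient identities are easy to sanity-check numerically. Here is a small sketch (not part of the original notes) using numpy's `poly`, which returns the coefficients of $\det(\lambda I - A)$ from highest power down:

```python
import numpy as np

rng = np.random.default_rng(0)
A = rng.integers(-3, 4, size=(3, 3)).astype(float)

# coefficients of f_A(lambda) = det(lambda*I - A), highest power first:
# [1, -P_1(A), +P_2(A), -P_3(A)], i.e. the k-th entry is (-1)^k P_k(A)
coeffs = np.poly(A)

assert np.isclose(coeffs[1], -np.trace(A))        # P_1 = tr(A)
assert np.isclose(coeffs[3], -np.linalg.det(A))   # P_3 = det(A), with sign (-1)^3

# invariance: P_k(A) = P_k(S^{-1} A S) for any invertible S
S = np.array([[1.0, 2.0, 0.0], [0.0, 1.0, 3.0], [1.0, 0.0, 1.0]])
assert abs(np.linalg.det(S)) > 1e-12
assert np.allclose(coeffs, np.poly(np.linalg.inv(S) @ A @ S))
```

The particular $S$ here is an arbitrary invertible matrix chosen for illustration; any invertible $S$ gives the same characteristic polynomial.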

In particular, if $A$ is diagonalizable ($= SDS^{-1}$) then $P_k(A) = P_k(SDS^{-1}) = P_k(D)$. Now by definition

$$f_D(\lambda) = \det\begin{pmatrix} \lambda - \lambda_1 & & \\ & \ddots & \\ & & \lambda - \lambda_n \end{pmatrix} = (\lambda - \lambda_1)\cdots(\lambda - \lambda_n) = \sum_k (-1)^k P_k(D)\,\lambda^{n-k},$$

and so

$$P_k(A) = (-1)^k\left\{\text{coefficient of } \lambda^{n-k} \text{ in } (\lambda - \lambda_1)\cdots(\lambda - \lambda_n)\right\} = \sum_{1 \le i_1 < \ldots < i_k \le n} \lambda_{i_1}\cdots\lambda_{i_k} =: k^{\text{th}}\text{ elementary symmetric function in the }\{\lambda_i\}.$$

We caution that these are not formulas for the $P_k(A)$, any more than the special cases $\det A = P_n(A) = \lambda_1\cdots\lambda_n$, $\mathrm{tr}(A) = P_1(A) = \lambda_1 + \ldots + \lambda_n$ are formulas for determinant and trace.

Decoupling linear PDEs. Frequently in physics and engineering one encounters systems of partial differential equations like

$$\frac{\partial u_1}{\partial t} + a_{11}\frac{\partial u_1}{\partial x} + a_{12}\frac{\partial u_2}{\partial x} = 0, \qquad \frac{\partial u_2}{\partial t} + a_{21}\frac{\partial u_1}{\partial x} + a_{22}\frac{\partial u_2}{\partial x} = 0,$$

which are coupled in the sense that the equation for the change in $u_1$ has a term involving $u_2$, and vice versa. We can decouple them by a linear change of coordinates. Assume

$$A = \begin{pmatrix} a_{11} & a_{12} \\ a_{21} & a_{22} \end{pmatrix} = \begin{pmatrix} \vec{v}_1 & \vec{v}_2 \end{pmatrix}\begin{pmatrix} \lambda_1 & 0 \\ 0 & \lambda_2 \end{pmatrix}\begin{pmatrix} \vec{v}_1 & \vec{v}_2 \end{pmatrix}^{-1} = SDS^{-1}$$

is diagonalizable, with eigenbasis $\mathcal{B} = \{\vec{v}_1, \vec{v}_2\}$, and rewrite the system in matrix form

$$\frac{\partial \vec{u}}{\partial t} + A\,\frac{\partial \vec{u}}{\partial x} = \vec{0}.$$

Let $\vec{c} = \begin{pmatrix} c_1 \\ c_2 \end{pmatrix} := S^{-1}\vec{u}$ be the coordinates of $\vec{u}$ with respect to $\mathcal{B}$, and write

$$\frac{\partial \vec{u}}{\partial t} + SDS^{-1}\frac{\partial \vec{u}}{\partial x} = \vec{0} \implies S^{-1}\frac{\partial \vec{u}}{\partial t} + DS^{-1}\frac{\partial \vec{u}}{\partial x} = \vec{0}.$$

Since $S^{-1}$ is a matrix of constants, the differential operators commute with it. The last equation becomes

$$\frac{\partial \vec{c}}{\partial t} + D\,\frac{\partial \vec{c}}{\partial x} = \vec{0},$$

which corresponds to the two PDEs

$$\frac{\partial c_1}{\partial t} + \lambda_1\frac{\partial c_1}{\partial x} = 0, \qquad \frac{\partial c_2}{\partial t} + \lambda_2\frac{\partial c_2}{\partial x} = 0.$$

These are not coupled, and may be solved separately to get the traveling waves

$$c_1(x,t) = \Phi_I(x - \lambda_1 t), \qquad c_2(x,t) = \Phi_{II}(x - \lambda_2 t),$$

where $\Phi_I, \Phi_{II}$ are any functions on the real line. To get more specific we must specify the initial conditions $u_1(x,0) = g(x)$ and $u_2(x,0) = h(x)$ of the system.

EXAMPLE 1. Let's solve the system

$$\frac{\partial u_1}{\partial t} + \frac{\partial u_1}{\partial x} + 4\frac{\partial u_2}{\partial x} = 0, \qquad \frac{\partial u_2}{\partial t} + \frac{\partial u_1}{\partial x} + \frac{\partial u_2}{\partial x} = 0$$

with initial conditions $u_1(x,0) = e^x$, $u_2(x,0) = \cos x$.
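Both the traveling-wave claim and the closed-form answer that this method produces for the example can be verified symbolically. A sketch using sympy (not part of the original notes; the formulas for $u_1, u_2$ are the ones worked out in the example below):

```python
import sympy as sp

x, t, lam = sp.symbols('x t lam')
Phi = sp.Function('Phi')

# any function of x - lam*t solves the decoupled equation c_t + lam*c_x = 0
c = Phi(x - lam*t)
assert sp.simplify(sp.diff(c, t) + lam*sp.diff(c, x)) == 0

# closed-form solution of the example system (as derived in the text)
u1 = sp.exp(x - 3*t)/2 + sp.cos(x - 3*t) + sp.exp(x + t)/2 - sp.cos(x + t)
u2 = sp.exp(x - 3*t)/4 + sp.cos(x - 3*t)/2 - sp.exp(x + t)/4 + sp.cos(x + t)/2

# it satisfies the coupled system ...
assert sp.simplify(sp.diff(u1, t) + sp.diff(u1, x) + 4*sp.diff(u2, x)) == 0
assert sp.simplify(sp.diff(u2, t) + sp.diff(u1, x) + sp.diff(u2, x)) == 0
# ... and the initial conditions
assert sp.simplify(u1.subs(t, 0) - sp.exp(x)) == 0
assert sp.simplify(u2.subs(t, 0) - sp.cos(x)) == 0
```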

The first step is to diagonalize:

$$A = \begin{pmatrix} 1 & 4 \\ 1 & 1 \end{pmatrix} = \begin{pmatrix} 2 & 2 \\ 1 & -1 \end{pmatrix}\begin{pmatrix} 3 & 0 \\ 0 & -1 \end{pmatrix}\cdot\frac{1}{4}\begin{pmatrix} 1 & 2 \\ 1 & -2 \end{pmatrix} = S\,D\,S^{-1}.$$

Then we solve for $\Phi_I$ and $\Phi_{II}$:

$$\begin{pmatrix} \Phi_I(x) \\ \Phi_{II}(x) \end{pmatrix} = \begin{pmatrix} c_1(x,0) \\ c_2(x,0) \end{pmatrix} = S^{-1}\vec{u}(x,0) = \frac{1}{4}\begin{pmatrix} 1 & 2 \\ 1 & -2 \end{pmatrix}\begin{pmatrix} e^x \\ \cos x \end{pmatrix},$$

i.e. $\Phi_I(x) = \frac{1}{4}e^x + \frac{1}{2}\cos x$, $\Phi_{II}(x) = \frac{1}{4}e^x - \frac{1}{2}\cos x$; and so (since $\lambda_1 = 3$, $\lambda_2 = -1$)

$$\vec{c}(x,t) = \begin{pmatrix} \Phi_I(x - \lambda_1 t) \\ \Phi_{II}(x - \lambda_2 t) \end{pmatrix} = \begin{pmatrix} \frac{1}{4}e^{x-3t} + \frac{1}{2}\cos(x-3t) \\ \frac{1}{4}e^{x+t} - \frac{1}{2}\cos(x+t) \end{pmatrix}.$$

This is the solution in eigen-coordinates. In terms of the original variables $u_1$ and $u_2$ we have

$$\begin{pmatrix} u_1 \\ u_2 \end{pmatrix} = \vec{u}(x,t) = S\,\vec{c}(x,t) = \begin{pmatrix} 2 & 2 \\ 1 & -1 \end{pmatrix}\begin{pmatrix} \frac{1}{4}e^{x-3t} + \frac{1}{2}\cos(x-3t) \\ \frac{1}{4}e^{x+t} - \frac{1}{2}\cos(x+t) \end{pmatrix} = \begin{pmatrix} \frac{1}{2}e^{x-3t} + \cos(x-3t) + \frac{1}{2}e^{x+t} - \cos(x+t) \\ \frac{1}{4}e^{x-3t} + \frac{1}{2}\cos(x-3t) - \frac{1}{4}e^{x+t} + \frac{1}{2}\cos(x+t) \end{pmatrix}.$$

If you are interested in applications of this sort of thing, look in any book on applied PDEs (a beautiful subject in itself).

Complex diagonalization of real matrices. Let $A \in M_n(\mathbb{R})$. (In this lecture $A$ always has real entries.) If the characteristic polynomial $f_A(\lambda)$ does not break completely into linear factors over $\mathbb{R}$, then $A$ does not have a complete set of eigenvalues in $\mathbb{R}$. The algebraic

multiplicities $d_j$ of the real eigenvalues do not add up to $n$. In this case

$$f_A(\lambda) = (\lambda - \sigma_1)^{d_1}\cdots(\lambda - \sigma_r)^{d_r}\,\underbrace{(\lambda^2 + \mu_1\lambda + \nu_1)^{c_1}\cdots(\lambda^2 + \mu_s\lambda + \nu_s)^{c_s}}_{\text{irreducible quadratic factors: } \mu_k^2 < 4\nu_k},$$

where the $\sigma_i, \mu_j, \nu_j$ are all real and $\sum d_j + 2\sum c_k = n$. Over $\mathbb{C}$ this $f_A(\lambda)$ factors completely: if we set $\xi_k = -\frac{\mu_k}{2}$, $\eta_k = \frac{1}{2}\sqrt{4\nu_k - \mu_k^2}$ (since $\mu_k^2 < 4\nu_k$ by hypothesis, these will be real), then the quadratic formula shows

$$\lambda^2 + \mu_k\lambda + \nu_k = \{\lambda - (\xi_k + i\eta_k)\}\cdot\{\lambda - (\xi_k - i\eta_k)\}.$$

Therefore each irreducible quadratic factor gives rise to a conjugate pair of complex eigenvalues, and the algebraic multiplicities of

$$\sigma_1, \ldots, \sigma_r;\quad \xi_1 + i\eta_1,\ \xi_1 - i\eta_1,\ \ldots,\ \xi_s + i\eta_s,\ \xi_s - i\eta_s$$

do add up to $n$.

Since $A$ is real, if $A\vec{v} = \lambda\vec{v}$ for a real $\vec{v} \neq \vec{0}$ then $\lambda$ is also real. So if we now take $\vec{v} \in E_{\xi_k + i\eta_k}(A)$ a nonzero eigenvector, $\vec{v}$ cannot be real, since $\eta_k \neq 0$ by hypothesis. Therefore we write $\vec{v} = \vec{u} + i\vec{w}$, where $\vec{u}$ and $\vec{w}$ are real and $\vec{w} \neq \vec{0}$. We have (taking the complex conjugate)

$$\{(\xi_k + i\eta_k)I - A\}(\vec{u} + i\vec{w}) = \vec{0} \implies \{(\xi_k - i\eta_k)I - A\}(\vec{u} - i\vec{w}) = \vec{0},$$

where $A$ is unaffected because it is real. That is, $\bar{\vec{v}} = \vec{u} - i\vec{w} \in E_{\xi_k - i\eta_k}(A)$, and so to each conjugate pair of complex eigenvalues of $A$ there corresponds at least one conjugate pair of complex eigenvectors. In fact, if $A$ is diagonalizable over $\mathbb{C}$, then there is an eigenbasis

$$\vec{v}_1, \ldots, \vec{v}_m;\quad \vec{u}_1 + i\vec{w}_1,\ \vec{u}_1 - i\vec{w}_1,\ \ldots,\ \vec{u}_p + i\vec{w}_p,\ \vec{u}_p - i\vec{w}_p$$

where all $\vec{v}_i, \vec{u}_i, \vec{w}_i$ are real, with respective eigenvalues

$$\lambda_1, \ldots, \lambda_m;\quad \xi_1 + i\eta_1,\ \xi_1 - i\eta_1,\ \ldots,\ \xi_p + i\eta_p,\ \xi_p - i\eta_p \qquad (\lambda_i, \xi_i, \eta_i \in \mathbb{R}),$$

where $m + 2p = n$. Of course one can now write a matrix $S$ with the vectors of the complex eigenbasis as its columns, and get $A = SDS^{-1}$ where $D$ has the complex eigenvalues as its diagonal entries. Such a diagonalization is old news. The new, beautiful fact is that one can decompose $A$ into real $n \times n$ matrices $A = MBM^{-1}$, where

$$M = \begin{pmatrix} \vec{v}_1 & \cdots & \vec{v}_m & \vec{u}_1 & \vec{w}_1 & \cdots & \vec{u}_p & \vec{w}_p \end{pmatrix},$$

$$B = \begin{pmatrix} \lambda_1 & & & & & \\ & \ddots & & & & \\ & & \lambda_m & & & \\ & & & \begin{matrix} \xi_1 & \eta_1 \\ -\eta_1 & \xi_1 \end{matrix} & & \\ & & & & \ddots & \\ & & & & & \begin{matrix} \xi_p & \eta_p \\ -\eta_p & \xi_p \end{matrix} \end{pmatrix},$$

i.e. $B$ is almost diagonal. You can think of the $2\times 2$ blocks along the diagonal as corresponding to the irreducible quadratic factors of $f_A(\lambda)$. It will essentially become clear why this works from the following section.

The situation for real $2\times 2$ matrices. Let's start by diagonalizing real rotation-dilation matrices over $\mathbb{C}$. The matrix

$$A = \begin{pmatrix} a & -b \\ b & a \end{pmatrix} \in M_2(\mathbb{R})$$

has eigenvalues $a \pm bi$. To obtain the eigenvectors, look at

$$E_{a+bi}(A) = \ker\{(a+bi)I - A\} = \ker\begin{pmatrix} bi & b \\ -b & bi \end{pmatrix} = \ker\begin{pmatrix} i & 1 \\ -1 & i \end{pmatrix} = \mathrm{span}\left\{\begin{pmatrix} i \\ 1 \end{pmatrix}\right\}$$

and

$$E_{a-bi}(A) = \ker\{(a-bi)I - A\} = \ker\begin{pmatrix} -bi & b \\ -b & -bi \end{pmatrix} = \ker\begin{pmatrix} -i & 1 \\ -1 & -i \end{pmatrix} = \mathrm{span}\left\{\begin{pmatrix} -i \\ 1 \end{pmatrix}\right\}.$$

(Note that you could also just take the conjugate of $\begin{pmatrix} i \\ 1 \end{pmatrix}$.) Instead of $S$, write

$$N := \begin{pmatrix} i & -i \\ 1 & 1 \end{pmatrix};$$

$N$ will mean this particular matrix for the remainder of the section. By the usual diagonalization procedure one gets

$$\begin{pmatrix} a & -b \\ b & a \end{pmatrix} = NDN^{-1} = \begin{pmatrix} i & -i \\ 1 & 1 \end{pmatrix}\begin{pmatrix} a+bi & 0 \\ 0 & a-bi \end{pmatrix}\cdot\frac{1}{2}\begin{pmatrix} -i & 1 \\ i & 1 \end{pmatrix}.$$

From this we have immediately the useful relation

$$\boxed{\begin{pmatrix} a+bi & 0 \\ 0 & a-bi \end{pmatrix} = N^{-1}\begin{pmatrix} a & -b \\ b & a \end{pmatrix}N = r\,N^{-1}R_\theta N,}$$

where $R_\theta$ is the counterclockwise rotation by $\theta = \arctan(b/a)$ and $r = \sqrt{a^2 + b^2}$ the dilation (which commutes with everything). These are just the $\theta$ and $r$ for which $a + bi = re^{i\theta}$.
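The boxed relation is easy to verify numerically for sample values of $a$ and $b$; a quick sketch (not in the original notes):

```python
import numpy as np

a, b = 2.0, 3.0
A = np.array([[a, -b], [b, a]])
N = np.array([[1j, -1j], [1.0, 1.0]])

# N^{-1} A N should be diag(a + bi, a - bi)
D = np.linalg.inv(N) @ A @ N
assert np.allclose(D, np.diag([a + b*1j, a - b*1j]))

# and A itself is the rotation-dilation r * R_theta, with a + bi = r e^{i theta}
r, theta = np.hypot(a, b), np.arctan2(b, a)
R = np.array([[np.cos(theta), -np.sin(theta)],
              [np.sin(theta),  np.cos(theta)]])
assert np.allclose(A, r * R)
```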

More generally, let $A = \begin{pmatrix} \alpha_{11} & \alpha_{12} \\ \alpha_{21} & \alpha_{22} \end{pmatrix} \in M_2(\mathbb{R})$; then there are three possibilities for the roots of

$$f_A(\lambda) = \lambda^2 - (\alpha_{11} + \alpha_{22})\lambda + (\alpha_{11}\alpha_{22} - \alpha_{12}\alpha_{21}) = \lambda^2 - \mathrm{tr}(A)\lambda + \det A,$$

based on its discriminant (i.e., the quantity under the radical in the quadratic formula):

(i) $(\mathrm{tr}\,A)^2 > 4\det A$: distinct real roots $\implies$ $A$ is diagonalizable over $\mathbb{R}$;

(ii) $(\mathrm{tr}\,A)^2 = 4\det A$: repeated real root $\implies$ ??? ($A$ may not be diagonalizable, even over $\mathbb{C}$);

(iii) $(\mathrm{tr}\,A)^2 < 4\det A$: distinct conjugate complex roots $\implies$ $A$ diagonalizable over $\mathbb{C}$ but not over $\mathbb{R}$ ($f_A(\lambda)$ is an irreducible quadratic).

We are interested in case (iii). Call the roots $a + bi$, $a - bi$ (noting that $b \neq 0$), and let $\vec{v} = \vec{u} + i\vec{w} \in E_{a+bi}(A)$ be a nonzero eigenvector (in particular, $\vec{w} \neq \vec{0}$ as before). Then $\vec{u} - i\vec{w} \in E_{a-bi}(A)$, since (noting that $A$ is real, so $A = \bar{A}$)

$$A(\vec{u} + i\vec{w}) = (a+bi)(\vec{u} + i\vec{w}) \overset{\text{conjugation}}{\implies} A(\vec{u} - i\vec{w}) = (a-bi)(\vec{u} - i\vec{w}),$$

and $\{\vec{u} + i\vec{w},\ \vec{u} - i\vec{w}\}$ give an $A$-eigenbasis for $\mathbb{C}^2$. (They are independent because they have distinct eigenvalues; note also that their independence implies independence of $\vec{u}$ and $\vec{w}$: why?) Taking

$$S = \begin{pmatrix} \vec{u} + i\vec{w} & \vec{u} - i\vec{w} \end{pmatrix}$$

we have (using the boxed relation above)

$$A = S\begin{pmatrix} a+ib & 0 \\ 0 & a-ib \end{pmatrix}S^{-1} = SN^{-1}\begin{pmatrix} a & -b \\ b & a \end{pmatrix}NS^{-1};$$

if we set

$$M = SN^{-1} = \begin{pmatrix} \vec{u}+i\vec{w} & \vec{u}-i\vec{w} \end{pmatrix}\cdot\frac{1}{2}\begin{pmatrix} -i & 1 \\ i & 1 \end{pmatrix} = \begin{pmatrix} \frac{1}{2}\{-i(\vec{u}+i\vec{w}) + i(\vec{u}-i\vec{w})\} & \frac{1}{2}\{(\vec{u}+i\vec{w}) + (\vec{u}-i\vec{w})\} \end{pmatrix} = \begin{pmatrix} \vec{w} & \vec{u} \end{pmatrix},$$

then we have the decomposition

$$A = M\begin{pmatrix} a & -b \\ b & a \end{pmatrix}M^{-1} = r\,MR_\theta M^{-1}, \qquad M = \begin{pmatrix} \vec{w} & \vec{u} \end{pmatrix},$$

completely over $\mathbb{R}$!! This plays exactly the same role in solving linear dynamical systems over $\mathbb{R}$ when the eigenvalues are conjugate complex, as the decomposition $A = SDS^{-1}$ (diagonalization) did in the case of distinct real eigenvalues.

EXAMPLE 2. Take $A = \begin{pmatrix} 3 & -5 \\ 1 & -1 \end{pmatrix}$, so that

$$f_A(\lambda) = \det\begin{pmatrix} \lambda - 3 & 5 \\ -1 & \lambda + 1 \end{pmatrix} = \lambda^2 - 2\lambda + 2$$

has conjugate complex roots $\lambda = 1 \pm i =: a \pm ib$, which is to say $a = b = 1$. Moreover

$$E_{1+i}(A) = \ker\begin{pmatrix} -2+i & 5 \\ -1 & 2+i \end{pmatrix}$$

$$= \ker\begin{pmatrix} 1 & -(2+i) \\ 0 & 0 \end{pmatrix} = \mathrm{span}\left\{\begin{pmatrix} 2+i \\ 1 \end{pmatrix}\right\},$$

and so we have the eigenvector

$$\vec{v} = \begin{pmatrix} 2+i \\ 1 \end{pmatrix} = \begin{pmatrix} 2 \\ 1 \end{pmatrix} + i\begin{pmatrix} 1 \\ 0 \end{pmatrix} =: \vec{u} + i\vec{w};$$

its conjugate $\vec{u} - i\vec{w}$ is the other one. Set

$$M = \begin{pmatrix} \vec{w} & \vec{u} \end{pmatrix} = \begin{pmatrix} 1 & 2 \\ 0 & 1 \end{pmatrix}, \quad\text{and note that}\quad M^{-1} = \begin{pmatrix} 1 & -2 \\ 0 & 1 \end{pmatrix};$$

then according to our result above, we have the real decomposition

$$A = M\begin{pmatrix} a & -b \\ b & a \end{pmatrix}M^{-1} = \begin{pmatrix} 1 & 2 \\ 0 & 1 \end{pmatrix}\begin{pmatrix} 1 & -1 \\ 1 & 1 \end{pmatrix}\begin{pmatrix} 1 & -2 \\ 0 & 1 \end{pmatrix}.$$

One may conceptualize this in terms of transformations as follows: if $T: \mathbb{R}^2 \to \mathbb{R}^2$ is the transformation with matrix $[T]_{\hat{e}} = A$ relative to the standard basis $\{\hat{e}_1, \hat{e}_2\}$ of $\mathbb{R}^2$, then in terms of $\mathcal{B} = \{\vec{w}, \vec{u}\}$,

$$[T]_{\mathcal{B}} = \begin{pmatrix} 1 & -1 \\ 1 & 1 \end{pmatrix} = \sqrt{2}\,R_{\pi/4} =: r\,R_\theta.$$

So $T$ is a sort of skew rotation-dilation.

Real dynamical systems with complex eigenvalues.

I. A discrete dynamical system. Let $A$ remain the matrix of the above example. We will work out the solution to $\vec{x}(\tau+1) = A\vec{x}(\tau)$ abstractly, and then substitute in the numerical information. Let

$$\vec{\gamma}(\tau) = M^{-1}\vec{x}(\tau), \qquad \vec{\gamma}_0 = M^{-1}\vec{x}_0$$

be the present analogue of the eigen-coordinates $\vec{c} = S^{-1}\vec{x}$. These are the coordinates with respect to $\vec{w}$ and $\vec{u}$: that is, $\vec{x} = \gamma_1\vec{w} + \gamma_2\vec{u}$. Using our real decomposition of $A$ (because we want a real solution),

$$\vec{x}(\tau) = A^\tau\vec{x}_0 = \left(r\,MR_\theta M^{-1}\right)^\tau\vec{x}_0 = r^\tau\,MR_\theta^\tau M^{-1}\vec{x}_0 = r^\tau\,MR_{\tau\theta}M^{-1}\vec{x}_0.$$

Multiplying both sides by $M^{-1}$, this becomes

$$M^{-1}\vec{x}(\tau) = r^\tau R_{\tau\theta}M^{-1}\vec{x}_0, \quad\text{or}\quad \vec{\gamma}(\tau) = r^\tau R_{\tau\theta}\vec{\gamma}_0.$$

Suppose we start at $\vec{u}$, i.e. take $\vec{x}_0 = \vec{u} = \begin{pmatrix} 2 \\ 1 \end{pmatrix}$. Then $\vec{\gamma}_0 = \begin{pmatrix} 0 \\ 1 \end{pmatrix}$ (why?) and

$$\vec{\gamma}(\tau) = (\sqrt{2})^\tau R_{\frac{\pi}{4}\tau}\begin{pmatrix} 0 \\ 1 \end{pmatrix} = (\sqrt{2})^\tau\begin{pmatrix} -\sin(\frac{\pi}{4}\tau) \\ \cos(\frac{\pi}{4}\tau) \end{pmatrix},$$

or in $\hat{e}$-coordinates

$$\vec{x}(\tau) = M\vec{\gamma}(\tau) = (\sqrt{2})^\tau\left\{\cos\left(\tfrac{\pi}{4}\tau\right)\vec{u} - \sin\left(\tfrac{\pi}{4}\tau\right)\vec{w}\right\}.$$

II. A continuous dynamical system. Again assuming $A =$ the matrix from the example, let's investigate the continuous dynamical system

$$\frac{d\vec{x}}{d\tau} = A\vec{x}(\tau).$$

This turns out to be unexpectedly complicated; we will employ both sets of coordinates. Put

$$\vec{\gamma}(\tau) = M^{-1}\vec{x}(\tau), \qquad \vec{c}(\tau) = S^{-1}\vec{x}(\tau),$$

where $M = \begin{pmatrix} \vec{w} & \vec{u} \end{pmatrix}$, $S = \begin{pmatrix} \vec{u}+i\vec{w} & \vec{u}-i\vec{w} \end{pmatrix}$.

First use the eigencoordinates $\vec{c}$ to transform $\frac{d\vec{x}}{d\tau} = A\vec{x}$ into

$$\frac{d\vec{c}}{d\tau} = \begin{pmatrix} a+bi & 0 \\ 0 & a-bi \end{pmatrix}\vec{c}(\tau).$$

The solution to this is just

$$\vec{c}(\tau) = \begin{pmatrix} e^{(a+bi)\tau} & 0 \\ 0 & e^{(a-bi)\tau} \end{pmatrix}\vec{c}_0 = e^{a\tau}\begin{pmatrix} e^{ib\tau} & 0 \\ 0 & e^{-ib\tau} \end{pmatrix}\vec{c}_0,$$

i.e.

$$S^{-1}\vec{x}(\tau) = e^{a\tau}\begin{pmatrix} e^{ib\tau} & 0 \\ 0 & e^{-ib\tau} \end{pmatrix}S^{-1}\vec{x}_0,$$

or

$$\vec{x}(\tau) = e^{a\tau}S\begin{pmatrix} e^{ib\tau} & 0 \\ 0 & e^{-ib\tau} \end{pmatrix}S^{-1}\vec{x}_0 = e^{a\tau}S\begin{pmatrix} \cos b\tau + i\sin b\tau & 0 \\ 0 & \cos b\tau - i\sin b\tau \end{pmatrix}S^{-1}\vec{x}_0$$

$$= e^{a\tau}SN^{-1}\begin{pmatrix} \cos b\tau & -\sin b\tau \\ \sin b\tau & \cos b\tau \end{pmatrix}NS^{-1}\vec{x}_0 = e^{a\tau}MR_{b\tau}M^{-1}\vec{x}_0.$$

In order to decouple and solve the differential equations we had to diagonalize, but that introduced complex numbers. To then get rid of them, we had to be more clever than in the last three types of dynamical systems (discrete with real or complex $\lambda$, continuous with real $\lambda$). Now all we have to do is transform to $\{\vec{w}, \vec{u}\}$ coordinates:

$$M^{-1}\vec{x}(\tau) = e^{a\tau}R_{b\tau}M^{-1}\vec{x}_0 \implies \vec{\gamma}(\tau) = e^{a\tau}R_{b\tau}\vec{\gamma}_0.$$
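For the running example, both closed forms (the discrete one derived above, and the continuous one spelled out just below) can be checked against direct computation. A numerical sketch, not part of the original notes:

```python
import numpy as np

A = np.array([[3.0, -5.0], [1.0, -1.0]])           # eigenvalues 1 +/- i
u, w = np.array([2.0, 1.0]), np.array([1.0, 0.0])  # v = u + i*w for eigenvalue 1 + i
a, b = 1.0, 1.0
M = np.column_stack([w, u])
B = np.array([[a, -b], [b, a]])
assert np.allclose(M @ B @ np.linalg.inv(M), A)    # the real decomposition

# discrete system: x(tau) = r^tau (cos(theta*tau) u - sin(theta*tau) w), x_0 = u
r, theta = np.sqrt(2.0), np.pi / 4
x = u.copy()
for tau in range(1, 9):
    x = A @ x                                      # iterate x(tau+1) = A x(tau)
    closed = r**tau * (np.cos(theta*tau)*u - np.sin(theta*tau)*w)
    assert np.allclose(x, closed)

# continuous system: x(tau) = e^{a tau}(cos(b tau) u - sin(b tau) w) solves x' = A x
for tau in np.linspace(0.0, 3.0, 13):
    xt = np.exp(a*tau) * (np.cos(b*tau)*u - np.sin(b*tau)*w)
    dxt = np.exp(a*tau) * ((a*np.cos(b*tau) - b*np.sin(b*tau))*u
                           - (a*np.sin(b*tau) + b*np.cos(b*tau))*w)
    assert np.allclose(A @ xt, dxt)                # derivative matches A x(tau)
```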

Again let $\vec{x}_0 = \vec{u} = \begin{pmatrix} 2 \\ 1 \end{pmatrix}$, $\vec{\gamma}_0 = \begin{pmatrix} 0 \\ 1 \end{pmatrix}$ be the initial state, so that the solution looks like

$$\vec{\gamma}(\tau) = e^{a\tau}\begin{pmatrix} \cos b\tau & -\sin b\tau \\ \sin b\tau & \cos b\tau \end{pmatrix}\begin{pmatrix} 0 \\ 1 \end{pmatrix} = e^{a\tau}\begin{pmatrix} -\sin b\tau \\ \cos b\tau \end{pmatrix},$$

or

$$\vec{x}(\tau) = M\vec{\gamma}(\tau) = e^{a\tau}\{\cos(b\tau)\,\vec{u} - \sin(b\tau)\,\vec{w}\} = e^\tau\left\{\cos\tau\begin{pmatrix} 2 \\ 1 \end{pmatrix} - \sin\tau\begin{pmatrix} 1 \\ 0 \end{pmatrix}\right\}.$$

This corresponds to the picture

[Figure: a trajectory spiraling outward from $\vec{x}_0 = \vec{u}$, sketched relative to the directions $\vec{w}$ and $\vec{u}$. I make no claims to quantitative correctness here; the spiral probably unfolds at a different rate than what is drawn.]

Although for this particular matrix there are not dramatic differences between the discrete and continuous systems as far as the qualitative phase diagram goes, note the distinctions in the dilations

$$r^\tau = \left(\sqrt{a^2 + b^2}\right)^\tau \quad\text{vs.}\quad e^{a\tau}$$

and the rotations

$$R_{\tau\arctan(b/a)} \quad\text{vs.}\quad R_{\tau b}.$$

This leads to qualitative differences for certain values of $a$ and $b$.

Exercises

(1) (a) Show that

$$A = \begin{pmatrix} 0 & -1 & 0 \\ 1 & 0 & 0 \\ 0 & 0 & 1 \end{pmatrix}$$

cannot be diagonalized over $\mathbb{R}$. (A $90^\circ$ rotation in $3$-space has only one real eigenvector: the one it rotates around.)

(b) Now diagonalize it over $\mathbb{C}$.

(2) Decouple the linear system of PDEs

$$\frac{\partial u}{\partial t} + 8\frac{\partial v}{\partial x} = 0, \qquad \frac{\partial v}{\partial t} + 2\frac{\partial u}{\partial x} = 0,$$

and find the solution given initial conditions $u(x,0) = \sin x$, $v(x,0) = x^2$.

(3) Let $T: \mathbb{R}^2 \to \mathbb{R}^2$ be the transformation given in the standard basis by

$$A = \begin{pmatrix} 4 & -10 \\ 4 & 8 \end{pmatrix}.$$

(a) Diagonalize $A$ over $\mathbb{C}$. (b) Now show that $A$ is similar ($A = RBR^{-1}$) to a rotation-dilation matrix (both $R$ and $B$ are supposed to be real). (c) Give a closed-form solution to the discrete dynamical system $\vec{x}(t+1) = A\vec{x}(t)$ if $\vec{x}(0) = \begin{pmatrix} 3 \\ 2 \end{pmatrix}$, and draw a phase portrait for the system.

(4) Partially diagonalize the matrix

$$A = \begin{pmatrix} \cdot & 3 & \cdot \\ 2 & 4 & 2 \\ \cdot & 3 & \cdot \end{pmatrix}$$

over $\mathbb{R}$: that is, write it as $MBM^{-1}$, where

$$B = \begin{pmatrix} \lambda & 0 & 0 \\ 0 & a & b \\ 0 & -b & a \end{pmatrix}$$

with $a, b, \lambda \in \mathbb{R}$.