Section 5.5. Complex Eigenvalues (Part II)


Motivation: Complex Versus Two Real Eigenvalues

Today's decomposition is closely analogous to diagonalization.

Theorem. Let A be a 2×2 matrix with linearly independent eigenvectors v₁, v₂ and associated eigenvalues λ₁, λ₂. Then A = PDP^{-1}, where
P = ( v₁  v₂ ) and D = \begin{pmatrix} λ₁ & 0 \\ 0 & λ₂ \end{pmatrix};
the matrix D scales the x-axis by λ₁ and the y-axis by λ₂.

Theorem. Let A be a 2×2 real matrix with a complex eigenvalue λ = a + bi (where b ≠ 0), and let v be an eigenvector. Then A = PCP^{-1}, where
P = ( Re v  Im v ) and C = (rotation) (scaling).
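
As a quick numerical sanity check of the diagonalization theorem (this snippet is an added illustration, not part of the original slides; the matrix is a made-up example with real eigenvalues 3 and 1), numpy recovers P and D and confirms A = PDP^{-1}:

import numpy as np

# A made-up symmetric 2x2 matrix with real eigenvalues 3 and 1.
A = np.array([[2.0, 1.0],
              [1.0, 2.0]])

# Columns of P are eigenvectors; D holds the eigenvalues on the diagonal.
eigvals, P = np.linalg.eig(A)
D = np.diag(eigvals)

print(np.allclose(A, P @ D @ np.linalg.inv(P)))   # True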

Computing Eigenvectors of 2×2 Matrices
Especially useful for complex eigenvalues

Let A be a 2×2 matrix, and let λ be an eigenvalue of A. Then A − λI is not invertible, so its rows are linearly dependent. When we row reduce, the second row becomes zero. Save time! There is no need to find the exact entries of the second row.

2×2 Shortcut. If A − λI = \begin{pmatrix} a & b \\ \star & \star \end{pmatrix} for the eigenvalue λ, then use \begin{pmatrix} -b \\ a \end{pmatrix} or \begin{pmatrix} b \\ -a \end{pmatrix} as an eigenvector.

Example: A = (1/√2) \begin{pmatrix} 1 & -1 \\ 1 & 1 \end{pmatrix}, λ = (1 + i)/√2.
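
Here is a minimal numpy sketch (an added illustration, not from the slides) of the 2×2 shortcut applied to the rotation matrix in the example: the vector (-b, a) built from the first row of A − λI is checked to be an eigenvector.

import numpy as np

A = np.array([[1, -1],
              [1,  1]]) / np.sqrt(2)      # rotation by pi/4
lam = (1 + 1j) / np.sqrt(2)               # one of its eigenvalues

a, b = (A - lam * np.eye(2))[0]           # first row of A - lambda*I
v = np.array([-b, a])                     # the 2x2 shortcut eigenvector

print(np.allclose(A @ v, lam * v))        # True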

Poll Example, Completed

The last poll used the matrix of rotation by π/4:
A = (1/√2) \begin{pmatrix} 1 & -1 \\ 1 & 1 \end{pmatrix}, which has eigenvalues λ = (1 ± i)/√2.

Compute an eigenvector for λ = (1 + i)/√2 (factor out the 1/√2):
A − λI = (1/√2) \begin{pmatrix} 1 - (1 + i) & -1 \\ 1 & 1 - (1 + i) \end{pmatrix} = (1/√2) \begin{pmatrix} -i & -1 \\ 1 & -i \end{pmatrix}.

Row reducing (the scalar factor does not matter):
\begin{pmatrix} -i & -1 \\ 1 & -i \end{pmatrix} → \begin{pmatrix} -i & -1 \\ 0 & 0 \end{pmatrix} → \begin{pmatrix} 1 & -i \\ 0 & 0 \end{pmatrix}.

The parametric form is y = -ix, so an eigenvector is v = \begin{pmatrix} 1 \\ -i \end{pmatrix}.

A 3×3 Example

Find the complex eigenvalues and eigenvectors of
A = \begin{pmatrix} 4/5 & -3/5 & 0 \\ 3/5 & 4/5 & 0 \\ 0 & 0 & 2 \end{pmatrix}.

Recall: when we find an eigenvector v with eigenvalue λ, we automatically know that v̄ is an eigenvector with eigenvalue λ̄.

A 3×3 Example, Continued

A = \begin{pmatrix} 4/5 & -3/5 & 0 \\ 3/5 & 4/5 & 0 \\ 0 & 0 & 2 \end{pmatrix}

Find an eigenvector v with eigenvalue λ = (4 + 3i)/5. Row reduce:
A − λI = \begin{pmatrix} -3i/5 & -3/5 & 0 \\ 3/5 & -3i/5 & 0 \\ 0 & 0 & (6 - 3i)/5 \end{pmatrix} → \begin{pmatrix} 1 & -i & 0 \\ 0 & 0 & 1 \\ 0 & 0 & 0 \end{pmatrix},
so z = 0 and y = -ix. An eigenvector is v = (1, -i, 0); hence v̄ = (1, i, 0) is an eigenvector with eigenvalue λ̄ = (4 - 3i)/5.
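
A numpy check of this example (an added sketch, not part of the slides): the eigenvalues should come out as (4 ± 3i)/5 and 2, and the eigenvector numpy returns for (4 + 3i)/5 is a scalar multiple of (1, -i, 0).

import numpy as np

A = np.array([[4/5, -3/5, 0],
              [3/5,  4/5, 0],
              [0,    0,   2]])

eigvals, eigvecs = np.linalg.eig(A)
print(eigvals)   # approximately 0.8+0.6j, 0.8-0.6j, 2 (in some order)

# Any nonzero scalar multiple of (1, -i, 0) is an equally valid
# eigenvector for lambda = (4 + 3i)/5.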

Complex Eigenvectors: Matrix Decomposition (2×2 case)

Theorem. Let A be a 2×2 real matrix with a complex (non-real) eigenvalue λ, and let v be an eigenvector. Then A = PCP^{-1}, where
P = ( Re v  Im v ) and C = \begin{pmatrix} Re λ & Im λ \\ -Im λ & Re λ \end{pmatrix}.

The matrix C is a composition of scaling by |λ| and rotation by -θ, where θ = arg(λ):
C = |λ| \begin{pmatrix} cos θ & sin θ \\ -sin θ & cos θ \end{pmatrix}.

With a complex eigenvalue λ: the matrix C corresponds to multiplication by λ̄ on C ≅ R². The matrix A is similar to C, that is, to a rotation through -arg(λ) composed with a scaling by |λ|.
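
The theorem is easy to test numerically. The sketch below is an added illustration (the matrix is a made-up example with eigenvalues 2 ± i, not one from the slides): it builds P and C from one eigenvalue/eigenvector pair returned by numpy and confirms A = PCP^{-1}.

import numpy as np

A = np.array([[1.0, -2.0],
              [1.0,  3.0]])               # made-up matrix, eigenvalues 2 +/- i

eigvals, eigvecs = np.linalg.eig(A)
lam = eigvals[0]                          # one complex eigenvalue
v = eigvecs[:, 0]                         # an associated eigenvector

P = np.column_stack([v.real, v.imag])     # P = ( Re v  Im v )
C = np.array([[ lam.real, lam.imag],
              [-lam.imag, lam.real]])

print(np.allclose(A, P @ C @ np.linalg.inv(P)))   # True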

Decomposition: Geometric Interpretation, Example 1

What does A = \begin{pmatrix} 1 & -1 \\ 1 & 1 \end{pmatrix} do geometrically?

Decomposition: Geometric Interpretation, Example 1, Continued

A: rotate by π/4, then scale by √2.
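
As a quick check (an added sketch, not part of the slides): A should equal √2 times the standard rotation matrix for the angle π/4.

import numpy as np

A = np.array([[1, -1],
              [1,  1]])
theta = np.pi / 4
R = np.array([[np.cos(theta), -np.sin(theta)],
              [np.sin(theta),  np.cos(theta)]])   # rotation by theta

print(np.allclose(A, np.sqrt(2) * R))             # True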

Decomposition: Geometric Interpretation, Example 2

What does A = \begin{pmatrix} √3 + 1 & -2 \\ 1 & √3 - 1 \end{pmatrix} do geometrically?

Decomposition: Geometric Interpretation, Example 2, Continued

A = \begin{pmatrix} √3 + 1 & -2 \\ 1 & √3 - 1 \end{pmatrix},  C = \begin{pmatrix} √3 & -1 \\ 1 & √3 \end{pmatrix},  λ = √3 - i.

The matrix C scales by a factor of |λ| = √((√3)² + (-1)²) = √4 = 2.

The argument of λ is -π/6. [Picture: λ = √3 - i in the complex plane, at angle -π/6 below the positive real axis.]

Therefore C rotates by +π/6.

C: rotate by π/6, then scale by 2.
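
A one-line check of the modulus and argument (an added sketch, not from the slides):

import numpy as np

lam = np.sqrt(3) - 1j
print(np.isclose(np.abs(lam), 2))                 # True: the scaling factor
print(np.isclose(np.angle(lam), -np.pi / 6))      # True: the argument of lambda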

Decomposition: Geometric Interpretation, Example 2, Continued

What does A = \begin{pmatrix} √3 + 1 & -2 \\ 1 & √3 - 1 \end{pmatrix} do geometrically?

A: rotate "around an ellipse", then scale by 2.

A = PCP^{-1} does the same thing as C, but with respect to the basis
{ \begin{pmatrix} 1 \\ 1 \end{pmatrix}, \begin{pmatrix} -1 \\ 0 \end{pmatrix} }
of columns of P.
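
A numerical check of this decomposition (an added sketch, not from the slides): with P built from the eigenvector v = (1 - i, 1) for λ = √3 - i, and C as above, PCP^{-1} recovers A.

import numpy as np

s3 = np.sqrt(3)
A = np.array([[s3 + 1, -2],
              [1, s3 - 1]])
P = np.array([[1, -1],
              [1,  0]])          # columns: Re v and Im v for v = (1 - i, 1)
C = np.array([[s3, -1],
              [1,  s3]])         # rotate by +pi/6, then scale by 2

print(np.allclose(A, P @ C @ np.linalg.inv(P)))   # True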

The 3-Dimensional Case

Theorem. Let A be a real 3×3 matrix. Suppose that A has one real eigenvalue λ₁ with eigenvector v₁, and one conjugate pair of complex eigenvalues λ₂, λ̄₂ with eigenvectors v₂, v̄₂. Then A = PCP^{-1}, where
P = ( v₁  Re v₂  Im v₂ ) and C = \begin{pmatrix} λ₁ & 0 & 0 \\ 0 & Re λ₂ & Im λ₂ \\ 0 & -Im λ₂ & Re λ₂ \end{pmatrix}.
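
A numpy sketch (added, not from the slides) applying this theorem to the 3×3 example from earlier, using λ₁ = 2 with v₁ = (0, 0, 1) and λ₂ = (4 + 3i)/5 with v₂ = (1, -i, 0):

import numpy as np

A = np.array([[4/5, -3/5, 0],
              [3/5,  4/5, 0],
              [0,    0,   2]])

lam1 = 2.0
v1 = np.array([0.0, 0.0, 1.0])            # real eigenvalue / eigenvector
lam2 = (4 + 3j) / 5
v2 = np.array([1, -1j, 0])                # eigenvector for lam2

P = np.column_stack([v1, v2.real, v2.imag])
C = np.array([[lam1, 0,          0        ],
              [0,    lam2.real,  lam2.imag],
              [0,   -lam2.imag,  lam2.real]])

print(np.allclose(A, P @ C @ np.linalg.inv(P)))   # True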

The 3-Dimensional Case: Pictures

Let A = \begin{pmatrix} 1 & -1 & 0 \\ 1 & 1 & 0 \\ 0 & 0 & 2 \end{pmatrix}. This acts on the xy-plane by rotation by π/4 and scaling by √2. It acts on the z-axis by scaling by 2.

[Pictures: the action of A viewed from above and viewed looking down the y-axis.]

Note: A is already block diagonal. In general, these dynamics occur along the axes given by the columns of P (when A = PCP^{-1}).

Extra: The n-Dimensional Case

Theorem. Let A be a real n×n matrix. Suppose that for each (real or complex) eigenvalue, the dimension of the eigenspace equals the algebraic multiplicity. Then A = PCP^{-1}, where:

1. C is block diagonal:
   - the real eigenvalues (with their multiplicities) appear in 1×1 blocks;
   - the pairs of conjugate complex eigenvalues (with their multiplicities) appear in 2×2 blocks of the form
     \begin{pmatrix} Re λ & Im λ \\ -Im λ & Re λ \end{pmatrix}
     (here λ is non-real, i.e., has nonzero imaginary part).

2. The columns of P either form bases for the eigenspaces of the real eigenvalues, or come in pairs ( Re v  Im v ) for the non-real eigenvalues.
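
To make the statement concrete, here is a small numpy sketch (an added illustration, not from the slides; the helper name real_block_form is made up, and it assumes A has distinct eigenvalues so that each eigenvalue contributes exactly one block). It assembles P and the block-diagonal C from the output of np.linalg.eig and checks A = PCP^{-1} on the matrix from the previous slide.

import numpy as np

def real_block_form(A, tol=1e-12):
    # Sketch: return P, C with A = P C P^{-1}. Assumes A is real with
    # distinct eigenvalues, so each eigenvalue contributes exactly one block.
    eigvals, eigvecs = np.linalg.eig(A)
    cols, blocks = [], []
    used = np.zeros(len(eigvals), dtype=bool)
    for k, lam in enumerate(eigvals):
        if used[k]:
            continue
        used[k] = True
        v = eigvecs[:, k]
        if abs(lam.imag) < tol:
            # Real eigenvalue: one real eigenvector column, a 1x1 block.
            cols.append(v.real)
            blocks.append(np.array([[lam.real]]))
        else:
            # Non-real eigenvalue: columns Re v, Im v and a 2x2 rotation-scaling block.
            cols.append(v.real)
            cols.append(v.imag)
            blocks.append(np.array([[lam.real, lam.imag],
                                    [-lam.imag, lam.real]]))
            # Mark the conjugate eigenvalue as already handled.
            used[np.argmin(np.abs(eigvals - lam.conjugate()))] = True
    P = np.column_stack(cols)
    C = np.zeros_like(P)
    i = 0
    for B in blocks:
        n = B.shape[0]
        C[i:i+n, i:i+n] = B
        i += n
    return P, C

# Check on the matrix from the previous slide (eigenvalues 1 +/- i and 2).
A = np.array([[1.0, -1.0, 0.0],
              [1.0,  1.0, 0.0],
              [0.0,  0.0, 2.0]])
P, C = real_block_form(A)
print(np.allclose(A, P @ C @ np.linalg.inv(P)))   # True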

Extra: Why This Is Not a Weird Thing to Do
An anachronistic historical aside

In the beginning, people only used counting numbers for, well, counting things: 1, 2, 3, 4, 5, ....

Then someone (the Persian mathematician Muḥammad ibn Mūsā al-Khwārizmī, c. 825) had the ridiculous idea that there should be a number that represents an absence of quantity: the number 0. This blew everyone's mind.

Then it occurred to someone (the Chinese mathematician Liu Hui, c. 3rd century) that there should be negative numbers to represent a deficit in quantity. That seemed reasonable, until people realized that 10 + (-3) would have to equal 7. This is when people started saying, "bah, math is just too hard for me."

At this point it was inconvenient that you couldn't divide 2 by 3. Thus someone (the Indian mathematician Aryabhata, c. 5th century) invented fractions (rational numbers) to represent fractional quantities. These proved very popular. The Pythagoreans developed a whole belief system around the notion that any quantity worth considering could be broken down into whole numbers in this way.

Then the Pythagoreans (c. 6th century BCE) discovered that the hypotenuse of an isosceles right triangle with side length 1 (i.e., √2) is not a fraction. This caused a serious existential crisis and led to at least one death by drowning. The real number √2, which is not a fraction, was thus invented to solve the equation x² - 2 = 0.

Now we come to invent a number i that solves the equation x² + 1 = 0.