7.1, 8.1, 8.2 Diagonalization
P. Danziger

1 Change of Basis

Given a basis of $\mathbb{R}^n$, $B = \{v_1, \ldots, v_n\}$, we have seen that the matrix whose columns consist of these vectors can be thought of as a change of basis matrix,
$$A_B = \begin{pmatrix} v_1 & v_2 & \cdots & v_n \end{pmatrix}.$$
Given a vector $u \in \mathbb{R}^n$, we write $[u]_B$ to represent the components of $u$ with respect to the basis $B$. We write $S$ for the standard basis, so $[u]_S$ are the components of $u$ with respect to the standard basis.

Given a vector $u \in \mathbb{R}^n$, we find the components of $u$ with respect to the basis $B$ by solving $A_B [u]_B = [u]_S$. Since $B$ is a basis, $A_B$ is invertible, and we can find $[u]_B$ by
$$[u]_B = A_B^{-1} [u]_S.$$
[Note: in the book's rather confusing notation, $A_B$ would be called $A_{B \to S}$ and $A_B^{-1}$ is $A_{S \to B}$.]

$A_B$ can be thought of as a passive transformation: instead of thinking of the transformation $T_{A_B}$ associated with $A_B$ as actively moving the space, we think of it as passively moving the coordinates.

Now, note that if an $n \times n$ matrix $P$ is invertible, then the columns of $P$ form a basis for $\mathbb{R}^n$, so every invertible matrix is a change of basis matrix. $P^{-1}$ can be thought of as the matrix which transforms to the coordinate system consisting of the columns of $P$; $P$ then transforms such a coordinate system back to the original coordinate system.

Let $B$ be the basis formed by the column vectors of $P$. Given a vector $u \in \mathbb{R}^n$, we can find $[u]_B$ by solving the system of equations $P [u]_B = [u]_S$. We can solve this equation using the inverse, so the components of $u$ with respect to the basis given by the column vectors of $P$ are
$$[u]_B = P^{-1} [u]_S.$$

Example 1

1. Find the components of $u = (2, 1)$ with respect to the basis $B = \{(1, 1), (1, 2)\}$.

Consider
$$P = \begin{pmatrix} 1 & 1 \\ 1 & 2 \end{pmatrix}, \qquad P^{-1} = \begin{pmatrix} 2 & -1 \\ -1 & 1 \end{pmatrix},$$
so
$$[u]_B = P^{-1} [u]_S = \begin{pmatrix} 2 & -1 \\ -1 & 1 \end{pmatrix} \begin{pmatrix} 2 \\ 1 \end{pmatrix} = \begin{pmatrix} 3 \\ -1 \end{pmatrix}.$$
That is, $(3, -1)_B$ are the components of the vector $u$ with respect to the basis $\{(1, 1), (1, 2)\}$.
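As a quick numerical check of part 1 of Example 1 (a small sketch, not part of the original notes; it assumes numpy is available), we can find $[u]_B$ by solving $P[u]_B = [u]_S$ directly rather than forming $P^{-1}$:

```python
import numpy as np

# Columns of P are the basis vectors of B = {(1, 1), (1, 2)}.
P = np.array([[1.0, 1.0],
              [1.0, 2.0]])
u_S = np.array([2.0, 1.0])      # u in standard coordinates

# Components with respect to B: solve P [u]_B = [u]_S.
u_B = np.linalg.solve(P, u_S)
print(u_B)                      # [ 3. -1.]  -- the (3, -1)_B found above
```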

2. Given that a vector has components $(3, -1)_B$ with respect to the basis $B = \{(1, 1), (1, 2)\}$, find the components of $u$ with respect to the standard basis.

$$[u]_S = P [u]_B = \begin{pmatrix} 1 & 1 \\ 1 & 2 \end{pmatrix} \begin{pmatrix} 3 \\ -1 \end{pmatrix} = \begin{pmatrix} 2 \\ 1 \end{pmatrix},$$
so $u$ is the vector $(2, 1)$.

2 Similarity

Definition 2 Two $n \times n$ matrices, $A$ and $B$, are similar if there exists an invertible matrix $P$ such that $B = P^{-1} A P$.

Notes

- If $A$ is similar to $B$, so $B = P^{-1} A P$, then $A = P B P^{-1}$, and so $B$ is similar to $A$.
- If $A$ is similar to $B$, so $B = P^{-1} A P$, then $P B = A P$.
- If $A$ and $B$ are similar matrices, so $B = P^{-1} A P$, then $B$ can be thought of as performing the same transformation as $A$, but on vectors given with respect to the basis given by the columns of $P$. Let $C$ be the basis formed by the columns of $P$ and, for any given $u \in \mathbb{R}^n$, let $v = T_A(u)$. Now
$$T_B([u]_C) = B [u]_C = P^{-1} A P [u]_C = P^{-1} A u = P^{-1} T_A(u) = P^{-1} v = [v]_C,$$
using $P[u]_C = u$.

Example 3 Find a matrix similar to $A = \begin{pmatrix} 1 & 2 \\ 2 & 1 \end{pmatrix}$ with respect to the basis $\{(2, 1), (1, 1)\}$.

Take $P = \begin{pmatrix} 2 & 1 \\ 1 & 1 \end{pmatrix}$, so $P^{-1} = \begin{pmatrix} 1 & -1 \\ -1 & 2 \end{pmatrix}$, and
$$B = P^{-1} A P = \begin{pmatrix} 1 & -1 \\ -1 & 2 \end{pmatrix} \begin{pmatrix} 1 & 2 \\ 2 & 1 \end{pmatrix} \begin{pmatrix} 2 & 1 \\ 1 & 1 \end{pmatrix} = \begin{pmatrix} -1 & 0 \\ 6 & 3 \end{pmatrix}.$$
With respect to the basis $\{(2, 1), (1, 1)\}$, $A$ looks like $\begin{pmatrix} -1 & 0 \\ 6 & 3 \end{pmatrix}$.
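A short numpy sketch (not part of the original notes) can confirm the computation in Example 3, and also the note above that $PB = AP$ whenever $B = P^{-1}AP$:

```python
import numpy as np

A = np.array([[1.0, 2.0],
              [2.0, 1.0]])
P = np.array([[2.0, 1.0],           # columns are the basis {(2, 1), (1, 1)}
              [1.0, 1.0]])

B = np.linalg.inv(P) @ A @ P
print(B)                            # [[-1.  0.], [ 6.  3.]]

# Sanity check from the Notes: if B = P^{-1} A P then P B = A P.
print(np.allclose(P @ B, A @ P))    # True
```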

Theorem 4 If $A$ and $B$ are similar matrices, then they have the same characteristic polynomial, and hence the same eigenvalues.

Proof: Let $A$ and $B$ be similar matrices, so $B = P^{-1} A P$ for some invertible matrix $P$. Then
$$|\lambda I - B| = |\lambda I - P^{-1} A P|$$
$$= |P^{-1} \lambda I P - P^{-1} A P| \qquad (P^{-1} \lambda I P = \lambda P^{-1} P = \lambda I)$$
$$= |P^{-1} (\lambda I - A) P| \qquad \text{(distribution)}$$
$$= |P^{-1}| \, |\lambda I - A| \, |P| \qquad (|XY| = |X| \, |Y|)$$
$$= |\lambda I - A| \qquad (|P^{-1}| \, |P| = |P^{-1} P| = |I| = 1).$$
Therefore, $A$ and $B$ have the same characteristic polynomial and hence the same eigenvalues. $\square$

3 Diagonalization

Definition 5 A matrix is said to be diagonalizable if it is similar to a diagonal matrix.

Consider an $n \times n$ matrix $A$ with $n$ linearly independent eigenvectors $v_1, v_2, \ldots, v_n$, which thus form a basis $B$. Recall that the matrix associated with a transformation has columns given by the images of the basis vectors. With respect to this basis,
$$T_A([v_i]_B) = \lambda_i [v_i]_B = \lambda_i e_i,$$
so if $D$ is the matrix of $A$ with respect to this basis, then $D$ is diagonal. Let $P$ be the matrix whose columns are the linearly independent eigenvectors of $A$; then $A$ in this coordinate system will be $D$, where $A = P D P^{-1}$, or more usefully $D = P^{-1} A P$.

Theorem 6 An $n \times n$ matrix $A$ is diagonalizable if and only if it has $n$ linearly independent eigenvectors $v_1, v_2, \ldots, v_n$ corresponding to eigenvalues $\lambda_1, \lambda_2, \ldots, \lambda_n$ respectively (with possible repetition). In this case $A$ is similar to the matrix $D = \mathrm{diag}(\lambda_1, \lambda_2, \ldots, \lambda_n)$:
$$D = P^{-1} A P,$$
where $P$ is the matrix whose columns are the eigenvectors of $A$, i.e. $P = \begin{pmatrix} v_1 & v_2 & \cdots & v_n \end{pmatrix}$.

Note that not all matrices have $n$ linearly independent eigenvectors.

Theorem 7 Given an $n \times n$ matrix $A$, the following statements are equivalent:

1. $A$ is diagonalizable.
2. $A$ has $n$ linearly independent eigenvectors.
3. The algebraic multiplicity equals the geometric multiplicity for each eigenvalue of $A$.
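Condition 2 of Theorem 7 is easy to test numerically: collect the eigenvectors as columns and check that they are linearly independent. The following is a minimal sketch, not from the notes; the function name and the two test matrices are illustrative, it assumes numpy, and it relies on a floating-point rank tolerance:

```python
import numpy as np

def is_diagonalizable(A, tol=1e-10):
    """Condition 2 of Theorem 7: does A have n linearly independent
    eigenvectors?  (Numerical check, up to the tolerance tol.)"""
    n = A.shape[0]
    eigenvalues, P = np.linalg.eig(A)     # columns of P are eigenvectors
    return np.linalg.matrix_rank(P, tol=tol) == n

# Hypothetical examples: J has eigenvalue 1 with algebraic multiplicity 2
# but only one independent eigenvector; S has two distinct eigenvalues.
J = np.array([[1.0, 1.0],
              [0.0, 1.0]])
S = np.array([[2.0, 1.0],
              [1.0, 2.0]])
print(is_diagonalizable(J))   # expected: False
print(is_diagonalizable(S))   # expected: True
```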

Example 8

1. Diagonalize $A = \begin{pmatrix} 1 & 1 \\ 1 & 1 \end{pmatrix}$.

The eigenvalues are $\lambda = 0, 2$, with corresponding eigenvectors $(1, -1)$ and $(1, 1)$, so
$$P = \begin{pmatrix} 1 & 1 \\ -1 & 1 \end{pmatrix}, \qquad P^{-1} = \frac{1}{2} \begin{pmatrix} 1 & -1 \\ 1 & 1 \end{pmatrix},$$
and $D = \mathrm{diag}(0, 2)$:
$$P^{-1} A P = \frac{1}{2} \begin{pmatrix} 1 & -1 \\ 1 & 1 \end{pmatrix} \begin{pmatrix} 1 & 1 \\ 1 & 1 \end{pmatrix} \begin{pmatrix} 1 & 1 \\ -1 & 1 \end{pmatrix} = \begin{pmatrix} 0 & 0 \\ 0 & 2 \end{pmatrix}.$$

2. Diagonalize $A = \begin{pmatrix} 2 & 0 & 0 \\ 0 & 1 & 1 \\ 0 & 0 & 1 \end{pmatrix}$.

The eigenvalues are $\lambda = 1$ and $\lambda = 2$. But the algebraic multiplicity of $\lambda = 1$ is 2, whereas its geometric multiplicity is 1, so $A$ is not diagonalizable.

4 Applications

4.1 Powers of Matrices

Theorem 9 If $D = P^{-1} A P$, then $A^k = P D^k P^{-1}$.

Proof: $D = P^{-1} A P$, so
$$D^k = \underbrace{(P^{-1} A P)(P^{-1} A P) \cdots (P^{-1} A P)}_{k} = P^{-1} A (P P^{-1}) A (P P^{-1}) \cdots (P P^{-1}) A P = P^{-1} \underbrace{A A \cdots A}_{k} P = P^{-1} A^k P,$$
and hence $A^k = P D^k P^{-1}$. $\square$
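Theorem 9 is the practical payoff: powers of a diagonal matrix are trivial, since $D^k = \mathrm{diag}(\lambda_1^k, \ldots, \lambda_n^k)$, so $A^k$ costs only two extra matrix products once $A$ has been diagonalized. A minimal numerical sketch, not from the notes (the function name and test matrix are illustrative, and it assumes $A$ is diagonalizable):

```python
import numpy as np

def power_by_diagonalization(A, k):
    """Compute A^k = P D^k P^{-1} (Theorem 9), assuming A is diagonalizable."""
    eigenvalues, P = np.linalg.eig(A)        # D = diag(eigenvalues)
    Dk = np.diag(eigenvalues ** k)
    return P @ Dk @ np.linalg.inv(P)

# Hypothetical test matrix with eigenvalues 2 and 5.
A = np.array([[4.0, 1.0],
              [2.0, 3.0]])
print(power_by_diagonalization(A, 5))
print(np.linalg.matrix_power(A, 5))          # should agree
```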

Example 10 Find $A^{10}$, where $A = \begin{pmatrix} 1 & 1 \\ 1 & 1 \end{pmatrix}$.

From Example 8,
$$P = \begin{pmatrix} 1 & 1 \\ -1 & 1 \end{pmatrix}, \qquad P^{-1} = \frac{1}{2} \begin{pmatrix} 1 & -1 \\ 1 & 1 \end{pmatrix}, \qquad D = \mathrm{diag}(0, 2) = \begin{pmatrix} 0 & 0 \\ 0 & 2 \end{pmatrix},$$
so $D^{10} = \begin{pmatrix} 0 & 0 \\ 0 & 1024 \end{pmatrix}$ and
$$A^{10} = P D^{10} P^{-1} = \begin{pmatrix} 1 & 1 \\ -1 & 1 \end{pmatrix} \begin{pmatrix} 0 & 0 \\ 0 & 1024 \end{pmatrix} \frac{1}{2} \begin{pmatrix} 1 & -1 \\ 1 & 1 \end{pmatrix} = \begin{pmatrix} 512 & 512 \\ 512 & 512 \end{pmatrix}.$$

4.2 Application to Differential Equations

Suppose we have $n$ functions of some variable $t$: $x_1(t), x_2(t), \ldots, x_n(t)$. We may represent them with the vector
$$x(t) = \begin{pmatrix} x_1(t) \\ x_2(t) \\ \vdots \\ x_n(t) \end{pmatrix};$$
$x(t)$ is called a vector function. We wish to solve a system of differential equations of the form
$$\begin{aligned}
x_1'(t) &= a_{11} x_1(t) + a_{12} x_2(t) + \cdots + a_{1n} x_n(t) \\
x_2'(t) &= a_{21} x_1(t) + a_{22} x_2(t) + \cdots + a_{2n} x_n(t) \\
&\ \, \vdots \\
x_n'(t) &= a_{n1} x_1(t) + a_{n2} x_2(t) + \cdots + a_{nn} x_n(t),
\end{aligned}$$
where $x_i'$ indicates the derivative of the function $x_i(t)$. We can represent this in matrix notation as
$$x'(t) = A x(t).$$
Given such a system and a set of initial conditions, we would like to find the functions $x_i(t)$.

Recall that for an arbitrary function $x(t)$,
$$x'(t) = \lambda x(t) \implies x(t) = C e^{\lambda t},$$
where $C$ is a constant of integration.
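This one fact is what makes diagonal systems easy: each component is an independent scalar equation of exactly this form. As a small sanity check (a sketch, not part of the notes; it assumes sympy is available), a symbolic solver reproduces it directly:

```python
import sympy as sp

t, lam = sp.symbols('t lambda')
x = sp.Function('x')

# x'(t) = lambda * x(t)  has general solution  C * exp(lambda * t).
solution = sp.dsolve(sp.Eq(x(t).diff(t), lam * x(t)), x(t))
print(solution)        # Eq(x(t), C1*exp(lambda*t))
```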

Example 11 Solve the system
$$\begin{pmatrix} x'(t) \\ y'(t) \\ z'(t) \end{pmatrix} = \begin{pmatrix} 3 & 0 & 0 \\ 0 & 2 & 0 \\ 0 & 0 & 9 \end{pmatrix} \begin{pmatrix} x(t) \\ y(t) \\ z(t) \end{pmatrix},$$
where $x(0) = 1$, $y(0) = 3$, $z(1) = 2e^9$.

The system decouples into $x'(t) = 3x(t)$, $y'(t) = 2y(t)$, $z'(t) = 9z(t)$. Thus
$$x(t) = C_1 e^{3t}, \qquad y(t) = C_2 e^{2t}, \qquad z(t) = C_3 e^{9t}.$$
Substituting in the initial conditions gives $C_1 = 1$, $C_2 = 3$, $C_3 = 2$.

Now suppose that $A$ is diagonalizable, so there exist an invertible $P$ and a diagonal $D$ such that $D = P^{-1} A P$. Suppose $x(t)$ satisfies the linear system of differential equations
$$x'(t) = A x(t). \qquad (*)$$
Define $y(t) = P^{-1} x(t)$. Premultiplying $(*)$ by $P^{-1}$ gives
$$P^{-1} x'(t) = P^{-1} A (P P^{-1}) x(t),$$
and since $P P^{-1} = I$, this is
$$y'(t) = D y(t).$$
Thus to solve $(*)$, we first solve $y'(t) = D y(t)$, which is decoupled as in Example 11, to find $y(t)$, and then
$$x(t) = P y(t).$$
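The recipe above (diagonalize, solve the decoupled system, map back with $P$) is easy to turn into code. A minimal numerical sketch, not from the notes: the function name and test data are illustrative, and it assumes $A$ is diagonalizable with real eigenvalues.

```python
import numpy as np

def solve_linear_ode(A, x0, times):
    """Solve x'(t) = A x(t), x(0) = x0 by diagonalization:
    y = P^{-1} x decouples the system, so y_i(t) = y_i(0) e^{lambda_i t}."""
    eigenvalues, P = np.linalg.eig(A)       # D = diag(eigenvalues)
    y0 = np.linalg.solve(P, x0)             # y(0) = P^{-1} x(0)
    return [P @ (y0 * np.exp(eigenvalues * t)) for t in times]

# Hypothetical example: A has eigenvalues 1 and 3.
A = np.array([[2.0, 1.0],
              [1.0, 2.0]])
for x_t in solve_linear_ode(A, x0=np.array([1.0, 0.0]), times=[0.0, 0.5, 1.0]):
    print(x_t)       # the first line should reproduce x(0) = (1, 0)
```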

Example 12 Solve
$$\begin{aligned}
x_1'(t) &= x_1(t) + x_2(t) \\
x_2'(t) &= x_1(t) + x_2(t),
\end{aligned}$$
where $x_1(0) = 0$, $x_2(0) = 1$.

Here $A = \begin{pmatrix} 1 & 1 \\ 1 & 1 \end{pmatrix}$, and we know from Example 8 that $A = P D P^{-1}$, where
$$P = \begin{pmatrix} 1 & 1 \\ -1 & 1 \end{pmatrix} \qquad \text{and} \qquad D = \mathrm{diag}(0, 2).$$
Consider $y(t) = P^{-1} x(t)$. Then
$$\begin{pmatrix} y_1'(t) \\ y_2'(t) \end{pmatrix} = \begin{pmatrix} 0 & 0 \\ 0 & 2 \end{pmatrix} \begin{pmatrix} y_1(t) \\ y_2(t) \end{pmatrix},$$
so
$$y_1'(t) = 0 \implies y_1(t) = C_1, \qquad y_2'(t) = 2 y_2(t) \implies y_2(t) = C_2 e^{2t}.$$
Then
$$x(t) = P y(t) = \begin{pmatrix} 1 & 1 \\ -1 & 1 \end{pmatrix} \begin{pmatrix} C_1 \\ C_2 e^{2t} \end{pmatrix} = \begin{pmatrix} C_1 + C_2 e^{2t} \\ -C_1 + C_2 e^{2t} \end{pmatrix}.$$
Now apply the initial conditions: $x_1(0) = 0$ gives $C_1 + C_2 = 0$, and $x_2(0) = 1$ gives $-C_1 + C_2 = 1$. Solving gives $C_1 = -\tfrac{1}{2}$, $C_2 = \tfrac{1}{2}$, so
$$\begin{pmatrix} x_1(t) \\ x_2(t) \end{pmatrix} = \frac{1}{2} \begin{pmatrix} -1 + e^{2t} \\ 1 + e^{2t} \end{pmatrix}.$$
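As a final check (a sketch, not part of the notes; it assumes sympy is available), we can substitute the answer back into the system of Example 12 and into the initial conditions:

```python
import sympy as sp

t = sp.symbols('t')
x1 = (-1 + sp.exp(2 * t)) / 2
x2 = ( 1 + sp.exp(2 * t)) / 2

# The system of Example 12 is x1' = x1 + x2, x2' = x1 + x2.
print(sp.simplify(x1.diff(t) - (x1 + x2)))   # 0
print(sp.simplify(x2.diff(t) - (x1 + x2)))   # 0
print(x1.subs(t, 0), x2.subs(t, 0))          # 0 1
```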