Constant coefficients systems


5.3. 2×2 Constant Coefficients Systems

Section Objective(s): Diagonalizable systems. Real distinct eigenvalues. Complex eigenvalues. Non-diagonalizable systems.

5.3.1. Diagonalizable Systems.

Remark: We review the solutions of 2×2 diagonalizable systems.

Theorem 5.3.1 (Diagonalizable Systems). If the 2×2 constant matrix A is diagonalizable with eigenpairs λ_±, v^(±), then the general solution of x' = A x is

    x_gen(t) = c_+ e^{λ_+ t} v^(+) + c_- e^{λ_- t} v^(-).

Remark: We have three cases:
(i) The eigenvalues λ_+, λ_- are real and distinct: (a) 0 < λ_- < λ_+, (b) λ_- < 0 < λ_+, (c) λ_- < λ_+ < 0.
(ii) The eigenvalues λ_± = α ± βi are distinct and complex.
(iii) The eigenvalue λ_+ = λ_- = λ is repeated and real.
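
As a sanity check of Theorem 5.3.1 (an addition, not part of the original notes, assuming NumPy and SciPy are available), the sketch below builds the eigenpair solution for a diagonalizable matrix and compares it against the matrix-exponential solution of x' = A x.

```python
# Minimal numerical sketch of Theorem 5.3.1: for a diagonalizable 2x2 matrix A,
# the eigenpair formula c+ e^{l+ t} v+ + c- e^{l- t} v- matches expm(A t) x0.
import numpy as np
from scipy.linalg import expm

A = np.array([[1.0, 3.0],
              [3.0, 1.0]])            # a diagonalizable 2x2 matrix
lam, V = np.linalg.eig(A)             # columns of V are the eigenvectors

x0 = np.array([2.0, -1.0])            # arbitrary initial condition
c = np.linalg.solve(V, x0)            # constants from x(0) = c+ v+ + c- v-

t = 0.7
x_eig = V @ (c * np.exp(lam * t))     # c+ e^{l+ t} v+ + c- e^{l- t} v-
x_ref = expm(A * t) @ x0              # reference solution of x' = A x
print(np.allclose(x_eig, x_ref))      # True
```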

Remark: The case (i) was studied in the previous section.

Example 5.3.1: Find the general solution of the 2×2 linear system x' = A x, with

    A = [ 1, 3; 3, 1 ].

Solution: We computed in a previous example the eigenpairs of the coefficient matrix,

    λ_+ = 4, v^(+) = [ 1, 1 ]^T,   and   λ_- = -2, v^(-) = [ -1, 1 ]^T.

This coefficient matrix has distinct real eigenvalues, so the general solution is

    x_gen(t) = c_+ e^{4t} [ 1, 1 ]^T + c_- e^{-2t} [ -1, 1 ]^T.

Remark: We now focus on case (ii). A real matrix can have complex eigenvalues. But in this case, the eigenvalues and the eigenvectors come in complex conjugate pairs, λ_+ = conj(λ_-), v^(+) = conj(v^(-)).

Theorem 5.3.2 (Conjugate Pairs). If A is a square matrix with real coefficients and (λ, v) is an eigenpair of A, then so is its complex conjugate (conj(λ), conj(v)).

Example: If a 2×2 matrix A has the eigenpair

    λ = 7 - 2i,   v = [ 2 - 3i, 5 + i ]^T,

then A also has the eigenpair

    λ = 7 + 2i,   v = [ 2 + 3i, 5 - i ]^T.

Proof of Theorem 5.3.2: Complex conjugate the eigenvalue-eigenvector equation for λ and v, and recall that the matrix A is real-valued, hence conj(A) = A. We obtain

    A v = λ v   ⟹   conj(A) conj(v) = conj(λ) conj(v)   ⟹   A conj(v) = conj(λ) conj(v).

This establishes the Theorem.
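
The eigenpairs quoted in Example 5.3.1 can be verified directly; the following short sketch (assuming NumPy, not from the notes) checks A v = λ v for both pairs.

```python
# Verify the eigenpairs used in Example 5.3.1.
import numpy as np

A = np.array([[1.0, 3.0],
              [3.0, 1.0]])
v_plus = np.array([1.0, 1.0])
v_minus = np.array([-1.0, 1.0])

print(np.allclose(A @ v_plus, 4 * v_plus))     # True: (4, [1, 1]) is an eigenpair
print(np.allclose(A @ v_minus, -2 * v_minus))  # True: (-2, [-1, 1]) is an eigenpair
```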

Theorem 5.3.3 (Complex and Real Solutions). If a 2×2 matrix A has eigenpairs λ_± = α ± iβ, v^(±) = a ± ib, where α, β, a, and b are real, then the equation x' = A x has the fundamental solutions

    x^(+)(t) = e^{λ_+ t} v^(+),   x^(-)(t) = e^{λ_- t} v^(-),

but it also has the real-valued fundamental solutions

    x^(1)(t) = ( a cos(βt) - b sin(βt) ) e^{αt},   x^(2)(t) = ( a sin(βt) + b cos(βt) ) e^{αt}.

Proof of Theorem 5.3.3: We know that the solutions x^(±) are linearly independent. The new information in the theorem above is the formula for the real-valued fundamental solutions. They are obtained from x^(±) as follows:

    x^(±) = (a ± ib) e^{(α ± iβ)t}
          = e^{αt} (a ± ib) e^{±iβt}
          = e^{αt} (a ± ib) ( cos(βt) ± i sin(βt) )
          = e^{αt} ( a cos(βt) - b sin(βt) ) ± i e^{αt} ( a sin(βt) + b cos(βt) ).

Therefore, we get

    x^(+) = e^{αt} ( a cos(βt) - b sin(βt) ) + i e^{αt} ( a sin(βt) + b cos(βt) ),
    x^(-) = e^{αt} ( a cos(βt) - b sin(βt) ) - i e^{αt} ( a sin(βt) + b cos(βt) ).

Since the differential equation x' = A x is linear, the functions below are also solutions,

    x^(1) = (1/2) ( x^(+) + x^(-) ) = ( a cos(βt) - b sin(βt) ) e^{αt},
    x^(2) = (1/(2i)) ( x^(+) - x^(-) ) = ( a sin(βt) + b cos(βt) ) e^{αt}.

This establishes the Theorem.
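
A quick numerical illustration of Theorem 5.3.3 (an addition, assuming NumPy): the real-valued solution x^(1)(t) = ( a cos(βt) - b sin(βt) ) e^{αt}, built from a computed eigenpair, satisfies x' = A x; the sketch checks this with a central finite difference.

```python
# Check that the real part of a complex eigen-solution solves x' = A x.
import numpy as np

A = np.array([[2.0, 3.0],
              [-3.0, 2.0]])
lam, V = np.linalg.eig(A)
alpha, beta = lam[0].real, lam[0].imag        # lambda = alpha + i beta
a_vec, b_vec = V[:, 0].real, V[:, 0].imag     # eigenvector v = a + i b

def x1(t):
    # x1(t) = ( a cos(beta t) - b sin(beta t) ) e^{alpha t}
    return (a_vec * np.cos(beta * t) - b_vec * np.sin(beta * t)) * np.exp(alpha * t)

t, h = 0.4, 1e-6
dx = (x1(t + h) - x1(t - h)) / (2 * h)        # numerical derivative of x1
print(np.allclose(dx, A @ x1(t), atol=1e-4))  # True: x1 solves x' = A x
```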

Example 5.3.2: Find real-valued fundamental solutions to the differential equation x' = A x, with

    A = [ 2, 3; -3, 2 ].

Solution: First find the eigenvalues of the matrix A above,

    p(λ) = det [ 2-λ, 3; -3, 2-λ ] = (λ - 2)^2 + 9 = 0   ⟹   λ_± = 2 ± 3i.

Then find the respective eigenvectors. The one corresponding to λ_+ is a nonzero solution of the homogeneous linear system with coefficient matrix

    A - (2 + 3i) I = [ 2-(2+3i), 3; -3, 2-(2+3i) ] = [ -3i, 3; -3, -3i ].

Gauss elimination gives

    [ -3i, 3; -3, -3i ]  →  [ -i, 1; -1, -i ]  →  [ 1, i; -1, -i ]  →  [ 1, i; 0, 0 ].

Therefore the components of the eigenvector v^(+) = [ v_1^(+), v_2^(+) ]^T satisfy v_1^(+) = -i v_2^(+). Choosing v_2^(+) = 1 we get

    v^(+) = [ -i, 1 ]^T,   λ_+ = 2 + 3i.

The second eigenvector is the complex conjugate of the eigenvector found above, that is,

    v^(-) = [ i, 1 ]^T,   λ_- = 2 - 3i.

Notice that v^(±) = [ 0, 1 ]^T ± i [ -1, 0 ]^T. Then, the real and imaginary parts of the eigenvalues and of the eigenvectors are given by

    α = 2,   β = 3,   a = [ 0, 1 ]^T,   b = [ -1, 0 ]^T.
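
The eigenpair found above can be checked in one line (a sketch assuming NumPy; not part of the notes).

```python
# Verify the eigenpair (2 + 3i, [-i, 1]) of Example 5.3.2.
import numpy as np

A = np.array([[2.0, 3.0],
              [-3.0, 2.0]])
lam_plus = 2 + 3j
v_plus = np.array([-1j, 1.0])

print(np.allclose(A @ v_plus, lam_plus * v_plus))   # True
```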

So a real-valued expression for a fundamental set of solutions is given by

    x^(1) = ( [ 0, 1 ]^T cos(3t) - [ -1, 0 ]^T sin(3t) ) e^{2t}   ⟹   x^(1) = [ sin(3t), cos(3t) ]^T e^{2t},

    x^(2) = ( [ 0, 1 ]^T sin(3t) + [ -1, 0 ]^T cos(3t) ) e^{2t}   ⟹   x^(2) = [ -cos(3t), sin(3t) ]^T e^{2t}.
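
As an optional check (assuming NumPy and SciPy; not from the notes), the fundamental matrix with columns x^(1), x^(2) should equal expm(A t) applied to its value at t = 0, since each column solves x' = A x.

```python
# Compare the fundamental matrix of Example 5.3.2 with the matrix exponential.
import numpy as np
from scipy.linalg import expm

A = np.array([[2.0, 3.0],
              [-3.0, 2.0]])

def X(t):
    # Columns are x(1)(t) = [sin 3t, cos 3t] e^{2t} and x(2)(t) = [-cos 3t, sin 3t] e^{2t}.
    return np.exp(2 * t) * np.array([[np.sin(3 * t), -np.cos(3 * t)],
                                     [np.cos(3 * t),  np.sin(3 * t)]])

t = 0.3
print(np.allclose(X(t), expm(A * t) @ X(0)))   # True
```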

Remark: We now discuss case (iii): 2×2 diagonalizable matrices with a repeated eigenvalue.

Theorem 5.3.4. Every 2×2 diagonalizable matrix with repeated eigenvalue λ has the form

    A = λ I.

Proof of Theorem 5.3.4: Since the matrix A is diagonalizable, there exists an invertible matrix P such that A = P D P^{-1}. Since A is 2×2 with a repeated eigenvalue λ, then

    D = [ λ, 0; 0, λ ] = λ I_2.

Putting these two facts together,

    A = P (λ I_2) P^{-1} = λ P P^{-1} = λ I.

This establishes the Theorem.

Remark: The differential equation x' = λ I x is already decoupled,

    x_1' = λ x_1,   x_2' = λ x_2,

so each component is solved on its own, x_1 = c_1 e^{λt}, x_2 = c_2 e^{λt}; this case is too simple to need further discussion.
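
A tiny numerical illustration of this remark (not in the notes, assuming NumPy and SciPy): for A = λ I the matrix exponential is just e^{λt} I, which is another way of saying that the system is decoupled.

```python
# For A = lambda * I, expm(A t) equals e^{lambda t} I.
import numpy as np
from scipy.linalg import expm

lam = -1.5
A = lam * np.eye(2)
t = 0.8
print(np.allclose(expm(A * t), np.exp(lam * t) * np.eye(2)))   # True
```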

5.3.2. Non-Diagonalizable Systems.

Remark: In this case it is not so easy to find two fundamental solutions.

Example: Find fundamental solutions to the system x' = A x, with

    A = [ -6, 4; -1, -2 ].

Solution: We start by computing the eigenvalues of A,

    p(λ) = det [ -6-λ, 4; -1, -2-λ ] = (λ + 6)(λ + 2) + 4 = λ^2 + 8λ + 16 = (λ + 4)^2.

We have a repeated eigenvalue λ = -4. The eigenvector v is a nonzero solution of (A + 4I) v = 0,

    [ -6+4, 4; -1, -2+4 ] v = [ -2, 4; -1, 2 ] [ v_1, v_2 ]^T = [ 0, 0 ]^T.

So we have only one independent equation, -v_1 + 2 v_2 = 0, that is, v_1 = 2 v_2. Hence

    v = [ v_1, v_2 ]^T = [ 2 v_2, v_2 ]^T = v_2 [ 2, 1 ]^T,

and choosing v_2 = 1 we get the eigenpair λ = -4, v = [ 2, 1 ]^T. So one fundamental solution is

    x^(1) = [ 2, 1 ]^T e^{-4t}.

However, we do not yet know a second fundamental solution in this case.
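
The repeated eigenvalue and the single eigen-direction can be confirmed numerically; the sketch below (assuming NumPy, not part of the notes) checks that A + 4I has rank 1 and that (-4, [2, 1]) is an eigenpair.

```python
# Confirm the repeated eigenvalue and the one-dimensional eigenspace.
import numpy as np

A = np.array([[-6.0, 4.0],
              [-1.0, -2.0]])
print(np.linalg.eigvals(A))                       # both eigenvalues close to -4
print(np.linalg.matrix_rank(A + 4 * np.eye(2)))   # 1, so only one eigen-direction
v = np.array([2.0, 1.0])
print(np.allclose(A @ v, -4 * v))                 # True: (-4, [2, 1]) is an eigenpair
```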

Theorem 5.3.5 (Repeated Eigenvalue). If a 2×2 matrix A has a repeated eigenvalue λ with only one associated eigen-direction, given by the eigenvector v, then the differential system x'(t) = A x(t) has the linearly independent set of solutions

    x^(1)(t) = e^{λt} v,   x^(2)(t) = e^{λt} ( v t + w ),

where the vector w is any solution of the algebraic linear system

    (A - λI) w = v.

Remark: For the proof see the Lecture Notes.

Example (similar to 5.3.3): Find the fundamental solutions of the differential equation x' = A x, with

    A = [ -6, 4; -1, -2 ].

Solution: We already know that an eigenpair of A is

    λ = -4,   v = [ 2, 1 ]^T.

Any other eigenvector associated with λ = -4 is proportional to the eigenvector above, so the matrix A is not diagonalizable. We then solve for a vector w the linear system (A + 4I) w = v,

    [ -2, 4; -1, 2 ] [ w_1, w_2 ]^T = [ 2, 1 ]^T   ⟹   -w_1 + 2 w_2 = 1   ⟹   w_1 = 2 w_2 - 1.

Therefore,

    w = [ w_1, w_2 ]^T = [ 2 w_2 - 1, w_2 ]^T = w_2 [ 2, 1 ]^T + [ -1, 0 ]^T.

So, given any solution w, then c v + w is also a solution for any c ∈ R. We choose w_2 = 0, so w = [ -1, 0 ]^T. Therefore, a fundamental set of solutions to the differential equation above is formed by

    x^(1)(t) = e^{-4t} [ 2, 1 ]^T,   x^(2)(t) = e^{-4t} ( [ 2, 1 ]^T t + [ -1, 0 ]^T ).
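
A hedged verification of this construction (assuming NumPy; not part of the notes): w = [-1, 0] solves (A + 4I) w = v, and x^(2)(t) = e^{-4t} ( [2, 1] t + [-1, 0] ) satisfies x' = A x, checked here with a central finite difference.

```python
# Verify the generalized-eigenvector solution of Theorem 5.3.5 for this example.
import numpy as np

A = np.array([[-6.0, 4.0],
              [-1.0, -2.0]])
v = np.array([2.0, 1.0])
w = np.array([-1.0, 0.0])

print(np.allclose((A + 4 * np.eye(2)) @ w, v))      # True: (A - lambda I) w = v, lambda = -4

def x2(t):
    # x2(t) = e^{lambda t} ( v t + w ) with lambda = -4
    return np.exp(-4 * t) * (v * t + w)

t, h = 0.5, 1e-6
dx = (x2(t + h) - x2(t - h)) / (2 * h)              # numerical derivative of x2
print(np.allclose(dx, A @ x2(t), atol=1e-4))        # True: x2 solves x' = A x
```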