Math Homework 8 (selected problems)

Math 102 - Homework 8 (selected problems)
David Lipshutz

Problem 1. (Strang, 5.5: #14) In the list below, which classes of matrices contain A and which contain B?

    A = [0 1 0 0]        B = (1/4) [1 1 1 1]
        [0 0 1 0]                  [1 1 1 1]
        [0 0 0 1]                  [1 1 1 1]
        [1 0 0 0]                  [1 1 1 1]

Orthogonal, invertible, projection, permutation, Hermitian, rank-1, diagonalizable, Markov. Find the eigenvalues of A and B.

Proof. A is orthogonal, invertible, not a projection (since A^2 ≠ A), a permutation matrix, not Hermitian (since A^T ≠ A), diagonalizable, and Markov. The characteristic polynomial of A is p(λ) = λ^4 - 1, so the eigenvalues of A are ±1 and ±i. B is a projection (onto the line spanned by (1, 1, 1, 1)), Hermitian, rank-1, diagonalizable, and Markov. The eigenvalues of B are 1, since it is a Markov matrix, and 0 with multiplicity three, since it has rank 1.

Problem 2. (Strang, 5.5: #16) Write one significant fact about the eigenvalues of each of the following.

(a) A real symmetric matrix.
(b) A stable matrix: all solutions to du/dt = Au approach zero.
(c) An orthogonal matrix.
(d) A Markov matrix.
(e) A defective matrix (nondiagonalizable).
(f) A singular matrix.

Proof. (a) Every eigenvalue is real. (b) Every eigenvalue has negative real part, so that e^{λt} → 0. (c) Every eigenvalue satisfies |λ| = 1. (d) 1 is an eigenvalue of the matrix, and every eigenvalue satisfies |λ| ≤ 1. (e) The matrix has a repeated eigenvalue with fewer independent eigenvectors than its multiplicity. (f) 0 is an eigenvalue of the matrix.
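The classifications and eigenvalues in Problem 1 can be spot-checked numerically. The following NumPy sketch is not part of the original solution; A and B are the matrices from the problem statement.

```python
import numpy as np

# A: the 4x4 cyclic permutation matrix; B: the rank-1 averaging matrix.
A = np.array([[0, 1, 0, 0],
              [0, 0, 1, 0],
              [0, 0, 0, 1],
              [1, 0, 0, 0]], dtype=float)
B = np.ones((4, 4)) / 4

assert np.allclose(A.T @ A, np.eye(4))   # A is orthogonal
assert np.allclose(B @ B, B)             # B is a projection
assert np.linalg.matrix_rank(B) == 1     # B has rank 1

eigs_A = np.linalg.eigvals(A)            # expect 1, -1, i, -i
eigs_B = np.linalg.eigvals(B)            # expect 1, 0, 0, 0
for target in (1, -1, 1j, -1j):
    assert min(abs(eigs_A - target)) < 1e-10
assert np.allclose(sorted(eigs_B.real), [0, 0, 0, 1])
```

Both row sums and column sums of A and B equal 1, which is why each is Markov with 1 as an eigenvalue.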

Problem 3. (Strang, 5.5: #18) Show that a unitary matrix has |det U| = 1, but that det U may differ from det U^H. Describe all 2 by 2 unitary matrices.

Proof. If λ_1, ..., λ_n are the eigenvalues of U, then |det U| = |λ_1| ⋯ |λ_n| = 1, since every eigenvalue of a unitary matrix has absolute value 1. For

    U = [1 0]    we have    U^H = [1  0]
        [0 i]                     [0 -i]

so det U = i while det U^H = -i. Now suppose

    U = [r_1 e^{iθ_1}   r_3 e^{iθ_3}]
        [r_2 e^{iθ_2}   r_4 e^{iθ_4}]

is unitary. Then U has orthonormal columns, so r_1^2 + r_2^2 = 1 = r_3^2 + r_4^2. Let r_1 = sin φ_1 and r_2 = cos φ_1, and let r_3 = cos φ_2 and r_4 = sin φ_2, where 0 ≤ φ_1, φ_2 ≤ π/2. Orthogonality of the column vectors gives

    sin φ_1 cos φ_2 e^{i(θ_3 - θ_1)} + cos φ_1 sin φ_2 e^{i(θ_4 - θ_2)} = 0.

The two terms must have equal absolute values and opposite phases, so φ_2 = φ_1 = φ and θ_4 - θ_2 = θ_3 - θ_1 + π. Therefore

    U = [sin φ e^{iθ_1}    cos φ e^{iθ_3}              ]
        [cos φ e^{iθ_2}   -sin φ e^{i(θ_2 + θ_3 - θ_1)}]

for some 0 ≤ φ ≤ π/2.

Problem 4. (Strang, 5.5: #38) If v_1, ..., v_n is an orthonormal basis for C^n, the matrix with those columns is what kind of matrix? Show that any vector z equals (v_1^H z)v_1 + ⋯ + (v_n^H z)v_n.

Proof. The matrix V = [v_1 ⋯ v_n] is unitary, since its columns are orthonormal. Then

    z = Iz = V V^H z = [v_1 ⋯ v_n] [v_1^H z, ..., v_n^H z]^T = (v_1^H z)v_1 + ⋯ + (v_n^H z)v_n.

Problem 5. (Strang, 5.5: #44) How are the eigenvalues of A^H (square matrix) related to the eigenvalues of A?

Proof. If λ is an eigenvalue of A, then det(A - λI) = 0, so A - λI is singular, and hence so is (A - λI)^H = A^H - λ̄I. Therefore det(A^H - λ̄I) = 0, and the conjugate λ̄ is an eigenvalue of A^H. Running the argument in reverse gives the converse, so the eigenvalues of A^H are exactly the complex conjugates of the eigenvalues of A.

Problem 6. (Strang, 5.5: #46) If A + iB is a unitary matrix (A and B are real), show that

    Q = [A -B]
        [B  A]

is an orthogonal matrix.
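The claim of Problem 6 can be checked numerically before it is proved. In this sketch (not part of the original solution), a sample unitary matrix A + iB is obtained by QR-factorizing a random complex matrix; the size 3 is arbitrary.

```python
import numpy as np

rng = np.random.default_rng(0)
Z = rng.standard_normal((3, 3)) + 1j * rng.standard_normal((3, 3))
U, _ = np.linalg.qr(Z)                      # the QR factor U = A + iB is unitary
assert np.allclose(U.conj().T @ U, np.eye(3))

A, B = U.real, U.imag
Q = np.block([[A, -B], [B, A]])             # the real 6x6 block matrix of Problem 6
assert np.allclose(Q.T @ Q, np.eye(6))      # Q is orthogonal
```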

Proof. We have that I (A + ib)(a + ib) H (A + ib)(a T ib T ) (AA T + BB T ) i(ab T BA T ), which implies that AA T + BB T I and AB T BA T. Then A B QQ H A T B T AA T + BB T AB T BA T I 0 I B A B T A T BA T AB T BB T + AA T 0 I So Q is an orthogonal matrix. Problem 7. (Strang, 5.6: #8) What matrix M changes the basis V 1 (1, 1), V 2 (1, 4) to the basis v 1 (2, 5), v 2 (1, 4) The columns of M come from expressing V 1 and V 2 as combinations of m ij v i of the v s. Proof. Note that V 1 v 1 v 2 and V 2 v 2, so we want a matrix that takes av 1 + bv 2 to 1 0 a(v 1 v 2 ) + bv 2 av 1 + (b a)v 2. This is given by M. 1 1 Problem 8. (Strang, 5.6: #38) These Jordan matrices have eigenvalues 0, 0, 0, 0. They have two eigenvectors (find them). But the block sizes don t match and J is not similar to K: J 0 0 0 1 and K 0 0 1 0 For any matrix M, compare JM and MK. If they are equal, show that M is not invertible. Then M 1 JM K is impossible. Proof. M 21 M 22 M 23 M 24 0 M 11 M 12 0 JM M 41 M 42 M 43 M 44 MK 0 M 21 M 22 0 0 M 31 M 32 0 0 M 41 M 42 0 If JM MK then M 11 M 21 M 31 M 41 0 det M 0 so M is not invertible, so M 1 JM K is impossible. Problem 9. (Strang, 5.6: #42) Prove that AB has the same eigenvalues as BA. Proof. If λ is an eigenvalue of AB, ABx λx for some x which implies BA(Bx) λ(bx), so λ is an eigenvalue of BA with eigenvector Bx. Problem 10. (Strang, 5.6: #44) Why is each of these statements true? (a) If A is similar to B, then A 2 is similar to B 2. 3

(b) A^2 and B^2 can be similar when A and B are not similar (try λ = 0, 0).

(c) [3 1; 0 4] is similar to [3 0; 0 4].

(d) [3 1; 0 3] is not similar to [3 0; 0 3].

(e) If we exchange rows 1 and 2 of A and then exchange columns 1 and 2 of A, the resulting matrix has the same eigenvalues.

Proof. (a) If A = PBP^{-1}, then A^2 = (PBP^{-1})(PBP^{-1}) = PB^2P^{-1}.

(b) Let

    A = [0 1]    and    B = 0.
        [0 0]

Then A^2 = 0 is obviously similar to B^2 = 0, but PBP^{-1} = 0 ≠ A for every invertible P.

(c) Since the eigenvalues of [3 1; 0 4] are λ = 3, 4 with respective eigenvectors (1, 0) and (1, 1), the matrix can be diagonalized as follows:

    [3 1] = [1 1] [3 0] [1 1]^{-1}
    [0 4]   [0 1] [0 4] [0 1]

(d) [3 0; 0 3] = 3I, and P(3I)P^{-1} = 3I for every invertible P, so 3I is similar only to itself and not to [3 1; 0 3].

(e) Exchanging rows 1 and 2 and then columns 1 and 2 multiplies A on both sides by the permutation matrix

    P = [0 1 0]
        [1 0 0]
        [0 0 I]

Since P^{-1} = P, the matrix PAP = PAP^{-1} is similar to A and therefore has the same eigenvalues.

Problem 11. (Strang, Appendix B: #6) Find the Jordan form J and the matrix M for A and B (B has eigenvalues 1, 1, 1, -1). What is the solution to du/dt = Au, and what is e^{At}?

    A = [0 0 1 0 0]      B = [ 1 -1  0  1]
        [0 0 0 1 0]          [ 0  2  0 -1]
        [0 0 0 0 1]          [ 2 -1 -1  1]
        [0 0 0 0 0]          [-2  1  2  0]
        [0 0 0 0 0]
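Before the proof of Problem 11, its setup can be verified numerically. The minus signs of B did not survive the transcription, so the sign pattern below is an assumption, chosen to be consistent with the eigenvalues 1, 1, 1, -1 stated in the problem; this NumPy sketch is not part of the original solution.

```python
import numpy as np

# Hypothetical sign reconstruction of B (minus signs lost in transcription).
B = np.array([[ 1, -1,  0,  1],
              [ 0,  2,  0, -1],
              [ 2, -1, -1,  1],
              [-2,  1,  2,  0]], dtype=float)

# Eigenvalues should be 1, 1, 1, -1 (loose tolerance: the repeated
# eigenvalue 1 is defective, so it is computed less accurately).
eigs = np.sort(np.linalg.eigvals(B).real)
assert np.allclose(eigs, [-1, 1, 1, 1], atol=1e-5)

# lambda = 1 has algebraic multiplicity 3 but only 2 eigenvectors:
assert np.linalg.matrix_rank(B - np.eye(4)) == 2   # nullity of B - I is 2

# A is nilpotent (A^3 = 0), so all five of its eigenvalues are 0.
A = np.zeros((5, 5))
A[0, 2] = A[1, 3] = A[2, 4] = 1.0
assert np.allclose(np.linalg.matrix_power(A, 3), 0)
```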

Proof. The method for finding the Jordan form of a matrix given in the book goes as follows; first apply it to A.

1. Find linearly independent vectors such that each vector is either an eigenvector or a vector the matrix sends to its eigenvalue times itself plus the previous vector in its chain: Ae_1 = 0e_1, Ae_3 = 0e_3 + e_1, Ae_5 = 0e_5 + e_3, and Ae_2 = 0e_2, Ae_4 = 0e_4 + e_2.

2. Set M to be the matrix with those chains as its columns: M = [e_1 e_3 e_5 e_2 e_4].

3. Then J = M^{-1}AM:

    J = [0 1 0 0 0]
        [0 0 1 0 0]
        [0 0 0 0 0]
        [0 0 0 0 1]
        [0 0 0 0 0]

4. To solve du/dt = Au, change variables by u = Mv; then M dv/dt = du/dt = Au = AMv = MJv, so dv/dt = Jv. Since

    e^{Jt} = [1 t t^2/2 0 0]
             [0 1 t     0 0]
             [0 0 1     0 0]
             [0 0 0     1 t]
             [0 0 0     0 1]

the solution is u = M e^{Jt} M^{-1} u(0), and e^{At} = M e^{Jt} M^{-1}.

B has eigenvalues 1, 1, 1, -1, with associated eigenvector (1, -1, 3, -3) for λ = -1 and eigenvectors (0, 1, 0, 1) and (1, 0, 1, 0) for λ = 1. The eigenvalue -1 occurs once, with one associated eigenvector, so there is one block for the eigenvalue -1. The eigenvalue 1 occurs three times with only two independent eigenvectors, so there are two blocks for the eigenvalue 1; since 1 has multiplicity 3, one of those blocks must be 2 by 2 and the other 1 by 1. So the Jordan form is

    J = [1 1 0  0]
        [0 1 0  0]
        [0 0 1  0]
        [0 0 0 -1]

1. To find M = [x_1 x_2 x_3 x_4], set BM = MJ:

    B [x_1 x_2 x_3 x_4] = [x_1 x_2 x_3 x_4] [1 1 0  0]
                                            [0 1 0  0]
                                            [0 0 1  0]
                                            [0 0 0 -1]

2. This leads to Bx_1 = x_1 and Bx_2 = x_2 + x_1, i.e. (B - I)x_2 = x_1, together with Bx_3 = x_3 and Bx_4 = -x_4. That means x_1 must be an eigenvector of B with eigenvalue 1 that also lies in the column space of B - I. Through some simple computation, x_1 = (1, -1, 1, -1) fits these specifications. Then x_2 = (0, -1/2, 0, 1/2), x_3 = (0, 1, 0, 1) and x_4 = (1, -1, 3, -3).

3. Setting J = M^{-1}BM, we get

    J = [ 3/2   0  -1/2   0 ] [ 1 -1  0  1] [ 1   0    0   1]
        [ -1   -1   1     1 ] [ 0  2  0 -1] [-1  -1/2  1  -1]
        [ 1/2  1/2  1/2  1/2] [ 2 -1 -1  1] [ 1   0    0   3]
        [-1/2   0   1/2   0 ] [-2  1  2  0] [-1   1/2  1  -3]

      = [1 1 0  0]
        [0 1 0  0]
        [0 0 1  0]
        [0 0 0 -1]

which is what we want.
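As a final check, both Jordan factorizations in Problem 11 can be confirmed numerically. The signs in B and M below are an assumption (minus signs were lost in transcription), chosen to match the stated eigenvalues and eigenvectors; the sketch is not part of the original solution.

```python
import numpy as np

# Part A: the 5x5 nilpotent matrix with chains e5 -> e3 -> e1 and e4 -> e2.
A = np.zeros((5, 5))
A[0, 2] = A[1, 3] = A[2, 4] = 1.0
I5 = np.eye(5)
M_A = I5[:, [0, 2, 4, 1, 3]]             # columns e1, e3, e5, e2, e4
J_A = np.linalg.inv(M_A) @ A @ M_A

J_A_expected = np.zeros((5, 5))          # Jordan blocks of sizes 3 and 2
J_A_expected[0, 1] = J_A_expected[1, 2] = J_A_expected[3, 4] = 1.0
assert np.allclose(J_A, J_A_expected)

# Since A^3 = 0, the series e^{At} = I + At + A^2 t^2/2 is exact,
# and it agrees with M e^{Jt} M^{-1}.
t = 1.7
expJt = I5 + J_A * t + J_A @ J_A * (t**2 / 2)
expAt = I5 + A * t + A @ A * (t**2 / 2)
assert np.allclose(expAt, M_A @ expJt @ np.linalg.inv(M_A))

# Part B: sign reconstruction of B and M (an assumption; see above).
B = np.array([[ 1, -1,  0,  1],
              [ 0,  2,  0, -1],
              [ 2, -1, -1,  1],
              [-2,  1,  2,  0]], dtype=float)
M = np.array([[ 1,  0.0, 0,  1],
              [-1, -0.5, 1, -1],
              [ 1,  0.0, 0,  3],
              [-1,  0.5, 1, -3]])
J = np.array([[1, 1, 0,  0],
              [0, 1, 0,  0],
              [0, 0, 1,  0],
              [0, 0, 0, -1]], dtype=float)
assert np.allclose(np.linalg.inv(M) @ B @ M, J)  # M^{-1} B M is the Jordan form
```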