MATH 511 ADVANCED LINEAR ALGEBRA SPRING 2006

Sherod Eubanks
HOMEWORK 2
Section 2.1: 2, 5, 9, 12; Section 2.3: 3, 6; Section 2.4: 2, 4, 5, 9, 11

Section 2.1: Unitary Matrices

Problem 2
If λ ∈ σ(U) and U ∈ M_n is unitary, show that |λ| = 1.

Solution. If λ ∈ σ(U), U ∈ M_n is unitary, and Ux = λx for x ≠ 0, then by Theorem 2.1.4(g) we have

‖x‖ = ‖Ux‖ = ‖λx‖ = |λ|·‖x‖,

hence |λ| = 1, as desired.

Problem 5
Show that the permutation matrices in M_n are orthogonal and that the permutation matrices form a subgroup of the group of real orthogonal matrices. How many different permutation matrices are there in M_n?

Solution. By definition, a matrix P ∈ M_n is called a permutation matrix if exactly one entry in each row and column is equal to 1, and all other entries are 0. That is, letting e_i ∈ C^n denote the standard basis element of C^n that has a 1 in the i-th row and zeros elsewhere, and letting S_n be the set of all permutations on n elements, P = [e_{σ(1)} ⋯ e_{σ(n)}] = P_σ for some permutation σ ∈ S_n, where σ(k) denotes the k-th member of σ. Observe that for any σ ∈ S_n,

e_{σ(i)}^T e_{σ(j)} = 1 if i = j, and 0 otherwise (1 ≤ i, j ≤ n),

by the definition of the e_i, so

P_σ^T P_σ = [e_{σ(i)}^T e_{σ(j)}]_{i,j=1}^n = I_n (= P_σ P_σ^T),

where I_n denotes the n × n identity matrix. Hence P_σ^{-1} = P_σ^T (permutation matrices are trivially nonsingular), and so P_σ is (real) orthogonal. Since the above holds for any σ ∈ S_n, it follows that every permutation matrix is orthogonal.

Now, notice that I_n is the permutation matrix corresponding to the identity of the group S_n, so the set of all permutation matrices in M_n is (trivially) nonempty and contains the identity element of GL_n. Moreover, by the preceding paragraph, P_σ^T = P_σ^{-1} for each σ ∈ S_n and corresponding permutation matrix P_σ; observe further that P_σ^{-1} = P_{σ^{-1}}, since P_σ has a 1 in column i, row σ(i), and P_σ^{-1} = P_σ^T = P_τ has a 1 in column σ(i), row τ(σ(i)) = i, for all i = 1, ..., n.
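The orthogonality and counting claims of Problem 5 are easy to check numerically. The sketch below (plain Python, no external libraries; all helper names are ours, and permutations are 0-indexed for convenience) builds P_σ for every σ ∈ S_3, verifies P_σ^T P_σ = I_3, and counts 3! = 6 distinct permutation matrices.

```python
from itertools import permutations

def perm_matrix(sigma):
    """P_sigma whose j-th column is e_{sigma(j)} (0-indexed)."""
    n = len(sigma)
    # entry (i, j) is 1 exactly when i == sigma(j)
    return [[1 if i == sigma[j] else 0 for j in range(n)] for i in range(n)]

def transpose(M):
    return [list(row) for row in zip(*M)]

def matmul(A, B):
    n = len(A)
    return [[sum(A[i][k] * B[k][j] for k in range(n)) for j in range(n)]
            for i in range(n)]

n = 3
identity = [[1 if i == j else 0 for j in range(n)] for i in range(n)]
mats = []
for sigma in permutations(range(n)):
    P = perm_matrix(sigma)
    assert matmul(transpose(P), P) == identity  # P^T P = I: P is orthogonal
    mats.append(P)

# The map sigma -> P_sigma is a bijection, so there are n! = 6 of them in M_3.
distinct = {tuple(map(tuple, P)) for P in mats}
print(len(distinct))  # 6
```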
Thus τ ∘ σ = e, the identity element of S_n, so τ = σ^{-1} since S_n is a group. As such, the inverse (transpose) of a permutation matrix is again a permutation matrix. Finally, if ν ∈ S_n is any other permutation, then the preceding discussion shows that

P_σ P_ν = [P_σ e_{ν(1)} ⋯ P_σ e_{ν(n)}] = [e_{(σ∘ν)(1)} ⋯ e_{(σ∘ν)(n)}] = P_{σ∘ν},

hence, as σ ∘ ν ∈ S_n, the product of permutation matrices is again a permutation matrix (this is rather trivial given the definition of permutation matrix, but it illustrates the connection between permutation matrices in M_n and permutations in S_n). Therefore, the set of all permutation matrices is not only a subgroup of GL_n but, since each is orthogonal, also a subgroup of the group of all real orthogonal matrices. Moreover, the mapping σ ↦ P_σ is a bijection, hence, as o(S_n) = n!, there are n! different permutation matrices in M_n (thus the order of the subgroup in question is n!).

Problem 9
If U ∈ M_n is unitary, show that Ū, U^T, and U* are all unitary.

Solution. Let U ∈ M_n be unitary. That Ū is unitary follows readily from Theorem 2.1.4(d); that U^T is unitary follows from the fact that, as the columns of U form an orthonormal set by Theorem 2.1.4(e), the rows of U^T form an orthonormal set. Now, since Ū is unitary, the rows of Ū form an orthonormal set, hence the columns of Ū^T = U* form an orthonormal set, and thus U* is unitary.

Problem 12
Show that if A ∈ M_n is similar to a unitary matrix, then A^{-1} is similar to A*.

Solution. If A ∈ M_n is similar to the unitary matrix U, then there is a nonsingular matrix S such that U = SAS^{-1}, hence AS^{-1} = S^{-1}U, and as such

A(S^{-1}U*S) = S^{-1}UU*S = S^{-1}S = I_n.

Since S and U are nonsingular, S^{-1}U*S is nonsingular, hence A is nonsingular (by the exercise preceding Theorem 2.1.4). Thus A^{-1} = S^{-1}U*S, so that U* = SA^{-1}S^{-1}. On the other hand, taking adjoints in U = SAS^{-1} gives U* = (S^{-1})*A*S*. Therefore SA^{-1}S^{-1} = (S^{-1})*A*S*, and, since S*S is nonsingular and (S^{-1})* = (S*)^{-1} by the nonsingularity of S,

A^{-1} = S^{-1}(S*)^{-1} A* S*S = (S*S)^{-1} A* (S*S),

which implies that A^{-1} and A* are similar.

Section 2.3: Schur's Unitary Triangularization Theorem

Problem 3
Let A ∈ M_n(R). Explain why the nonreal eigenvalues of A (if any) must occur in conjugate pairs.

Solution. A simple answer to the given question is that since A ∈ M_n(R), the characteristic polynomial p_A(t) has real coefficients, hence any nonreal roots occur in conjugate pairs, and it follows that any nonreal eigenvalues of A must occur in conjugate pairs. This also follows from Theorem 2.3.4, since there is a real orthogonal matrix Q ∈ M_n(R) such that Q^T A Q ∈ M_n(R) is block upper triangular,

Q^T A Q = [ A_1   *    ⋯   *
                  A_2  ⋯   *
                       ⋱
             0             A_k ],

where each A_i is either a real 1 × 1 matrix (whose entry lies in σ(A)) or a real 2 × 2 matrix with a nonreal pair of complex conjugate eigenvalues. Hence, since σ(A) = σ(Q^T A Q) by similarity, any nonreal eigenvalues of A must occur in conjugate pairs.

Problem 6
Let A, B ∈ M_n be given, and suppose A and B are simultaneously similar to upper triangular matrices; that is, S^{-1}AS and S^{-1}BS are both upper triangular for some nonsingular S ∈ M_n. Show that every eigenvalue of AB − BA must be zero.

Solution. Put T_A = S^{-1}AS and T_B = S^{-1}BS. Since T_A and T_B are upper triangular, T_A T_B and T_B T_A are upper triangular, and T_A T_B = S^{-1}AS S^{-1}BS = S^{-1}ABS and, similarly, T_B T_A = S^{-1}BAS. Now

T_A T_B − T_B T_A = S^{-1}ABS − S^{-1}BAS = S^{-1}(AB − BA)S,

and since T_A T_B and T_B T_A are both upper triangular, T_A T_B − T_B T_A is also upper triangular, hence the eigenvalues of AB − BA are the diagonal entries of T_A T_B − T_B T_A. But if T_A = [t_ij] and T_B = [s_ij], then t_ij = s_ij = 0 for i > j, hence

[ t_11   *   ] [ s_11   *   ]   [ t_11 s_11      *       ]
[            ] [            ] = [                        ]
[  0    t_nn ] [  0    s_nn ]   [     0      t_nn s_nn   ],

so the diagonal of T_A T_B consists of t_ii s_ii, i = 1, ..., n, and, by a similar computation, the diagonal of T_B T_A consists of s_ii t_ii (i.e., the two sets of diagonal entries are the same). Therefore the diagonal of T_A T_B − T_B T_A is t_ii s_ii − s_ii t_ii = 0 for all i = 1, ..., n, which implies that every eigenvalue of AB − BA is zero, as desired.

Section 2.4: Some Implications of Schur's Theorem

Problem 2
If A ∈ M_n, show that the rank of A is not less than the number of nonzero eigenvalues of A.

Solution. If A ∈ M_n and σ(A) = {λ_1, ..., λ_n}, then by Schur's Theorem there is a unitary matrix U such that U*AU = T = [t_ij], where T is upper triangular and t_ii = λ_i, i = 1, ..., n. If k of the eigenvalues of A are nonzero, then T has k nonzero and n − k zero entries along its main diagonal.
As such, the k columns of T containing the nonzero diagonal entries constitute a linearly independent set (since T is upper triangular), so rank(T) ≥ k. But then rank(A) ≥ k, since U is nonsingular and rank is invariant under multiplication by nonsingular matrices. Of course, we may certainly have rank(A) > k, for if

A = [ 0  1
      0  0 ],

then A is already upper triangular and σ(A) = {0}, so even though A has no nonzero eigenvalues, rank(A) = 1 > 0.

Problem 4
Let A ∈ M_n be a nonsingular matrix. Show that any matrix that commutes with A also commutes with A^{-1}.
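Before proving it, the claim of Problem 4 can be sanity-checked numerically. A minimal sketch (plain Python with exact Fraction arithmetic; the specific 2 × 2 commuting pair chosen here is ours) takes matrices A, B with AB = BA and A nonsingular, and confirms that B also commutes with A^{-1}.

```python
from fractions import Fraction

def matmul(A, B):
    n = len(A)
    return [[sum(A[i][k] * B[k][j] for k in range(n)) for j in range(n)]
            for i in range(n)]

def inverse_2x2(A):
    # exact inverse of a 2x2 matrix via the adjugate formula
    a, b = A[0]
    c, d = A[1]
    det = Fraction(a * d - b * c)
    assert det != 0
    return [[d / det, -b / det], [-c / det, a / det]]

# A commuting pair with A nonsingular (upper triangular Toeplitz matrices commute).
A = [[Fraction(2), Fraction(1)], [Fraction(0), Fraction(2)]]
B = [[Fraction(1), Fraction(3)], [Fraction(0), Fraction(1)]]

assert matmul(A, B) == matmul(B, A)           # AB = BA
Ainv = inverse_2x2(A)
assert matmul(A, Ainv) == [[1, 0], [0, 1]]    # A A^{-1} = I
assert matmul(Ainv, B) == matmul(B, Ainv)     # hence A^{-1}B = B A^{-1}
print("B commutes with A and with A^(-1)")
```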

Solution. Here we provide two proofs of the given statement. First, if A ∈ M_n is nonsingular and AB = BA for some B ∈ M_n, then B = A^{-1}BA, hence BA^{-1} = A^{-1}B, so B commutes with A^{-1} if and only if it commutes with A.

Second, by Corollary 2.4.4, since A ∈ M_n is nonsingular, there is a polynomial q(t), whose coefficients depend on A and where deg q(t) ≤ n − 1, such that A^{-1} = q(A). Put k = deg q(t) and write q(t) = a_k t^k + a_{k-1} t^{k-1} + ⋯ + a_1 t + a_0, where a_k ≠ 0. Now, observe that showing that BA = AB implies Bq(A) = q(A)B will prove the given statement. Note that for any p ∈ N we have

B A^p = BA·A^{p-1} = AB·A^{p-1} = ⋯ = A^i B A^{p-i} = ⋯ = A^{p-1}BA = A^p B,

so B commutes with any positive integer power of A; as such, we compute

q(A)B = (a_k A^k + a_{k-1} A^{k-1} + ⋯ + a_1 A + a_0 I)B
      = a_k A^k B + a_{k-1} A^{k-1} B + ⋯ + a_1 AB + a_0 IB
      = a_k B A^k + a_{k-1} B A^{k-1} + ⋯ + a_1 BA + a_0 BI
      = B(a_k A^k + a_{k-1} A^{k-1} + ⋯ + a_1 A + a_0 I)
      = Bq(A),

and thus A^{-1}B = BA^{-1}, as desired.

Problem 5
Use (2.3.1) to show that if A ∈ M_n has eigenvalues λ_1, λ_2, ..., λ_n, then

Σ_{i=1}^n λ_i^k = tr(A^k), k = 1, 2, ....

Solution. First, if A ∈ M_n and σ(A) = {λ_1, ..., λ_n}, then letting p(t) = t^k for k = 1, 2, ..., by Theorem 1.1.6 we have that p(A) = A^k has eigenvalues p(λ_i) = λ_i^k, i = 1, ..., n. Now, by Schur's Theorem, for each k = 1, 2, ... there is a unitary matrix U_k ∈ M_n such that U_k* A^k U_k = T_k = [t_ij^(k)], where T_k is upper triangular and t_ii^(k) = λ_i^k, i = 1, ..., n. Hence, by Problem 11 (below), as tr(AB) = tr(BA), and as the trace of a matrix is the sum of the eigenvalues of the matrix,

tr(A^k) = tr(U_k U_k* A^k) = tr(U_k* A^k U_k) = tr(T_k) = Σ_{i=1}^n λ_i^k, k = 1, 2, ...,

as desired.

Problem 9
Let A ∈ M_n, B ∈ M_m be given, and suppose A and B have no eigenvalues in common; that is, σ(A) ∩ σ(B) is empty. Use the Cayley-Hamilton theorem (2.4.2) to show that the equation AX − XB = 0, X ∈ M_{n,m}, has only the solution X = 0.
Deduce from this fact that the equation AX − XB = C has a unique solution X ∈ M_{n,m} for each given C ∈ M_{n,m}.

Solution. Suppose AX = XB, for A, B, and X as given above. Then, assuming that A^k X = XB^k for k = 1, ..., p, we have

A^{p+1}X = A(A^p X) = A(XB^p) = (AX)B^p = (XB)B^p = XB^{p+1},

thus by induction A^k X = XB^k for all k = 1, 2, .... In this way, if p(t) is any polynomial, it follows that p(A)X = Xp(B) (as in Problem 4 above). So p_A(A)X = Xp_A(B), hence, as p_A(A) = 0 by the Cayley-Hamilton Theorem, we have Xp_A(B) = 0. But since p_A(t) = (t − λ_1)(t − λ_2)⋯(t − λ_n), where λ_i ∈ σ(A), i = 1, ..., n, it follows that p_A(B) = Π_{i=1}^n (B − λ_i I). Moreover, the eigenvalues of the matrix p_A(B) are p_A(μ_j) for μ_j ∈ σ(B), j = 1, ..., m. Since σ(A) ∩ σ(B) = ∅, we have μ_j ≠ λ_i for all 1 ≤ i ≤ n and 1 ≤ j ≤ m, hence p_A(μ_j) = Π_{i=1}^n (μ_j − λ_i) ≠ 0 for each j = 1, ..., m. So, as all of the eigenvalues of p_A(B) are nonzero, p_A(B) is nonsingular, and as such Xp_A(B) = 0 has the unique solution X = 0; hence AX − XB = 0 has only the solution X = 0. Finally, consider the linear transformation T : M_{n,m} → M_{n,m} defined by T(X) = AX − XB. Since T(X) = 0 has only the solution X = 0, T is injective, hence (as M_{n,m} is finite dimensional) bijective, so T(X) = C has a unique solution X ∈ M_{n,m} for each C ∈ M_{n,m}, and the proof is complete.

Problem 11
Let A, B ∈ M_n be given and consider the commutator C = AB − BA. Show that tr(C) = 0. Consider

A = [ 0  1          B = [ 0  0
      0  0 ]   and        1  0 ]

and show that a commutator need not be nilpotent; that is, some eigenvalues of a commutator can be nonzero, even though the sum of the eigenvalues must be zero.

Solution. First, by the definition of trace as the sum of diagonal entries, we have tr(C) = tr(AB − BA) = tr(AB) − tr(BA). By Theorem 1.3.20 the eigenvalues of AB and BA are the same (counting multiplicity), and as the trace of a matrix is also the sum of its eigenvalues, we have tr(AB) = tr(BA), so that tr(C) = 0. Now, observe that with A and B as given above, we have

C = AB − BA = [ 1   0
                0  −1 ],

so that C has (nonzero) eigenvalues 1 and −1, and hence C is not nilpotent, but we see that tr(C) = 1 + (−1) = 0.