Homework 11 Solutions. Math 110, Fall 2013.

1. (a) Suppose that $T$ were self-adjoint. Then the Spectral Theorem tells us that there would exist an orthonormal basis of $\mathcal{P}_2(\mathbb{R})$, say $(p_1, p_2, p_3)$, consisting of eigenvectors of $T$. It is straightforward to see, by inspection of $T$, that the eigenvalues of $T$ are $\lambda = 0, 1$ and that
\[
T(v) = 0 \iff v \in \operatorname{null}(T) = \{a_0 + a_2 x^2 \mid a_0, a_2 \in \mathbb{R}\}, \qquad T(v) = v \iff v \in \operatorname{span}(x).
\]
Thus we have $p_1, p_2 \in \operatorname{null}(T)$, with $\langle p_1, p_2\rangle = 0$ and $\|p_1\| = \|p_2\| = 1$. The remaining eigenvector $p_3$ must be an element of $\operatorname{span}(x)$, so that $p_3 = cx$ for some (nonzero) $c \in \mathbb{R}$ such that $\|p_3\| = 1$. Moreover, we would require that $\operatorname{null}(T) \perp \operatorname{span}(x)$, since eigenvectors of a self-adjoint operator associated to distinct eigenvalues are orthogonal. Therefore $\operatorname{null}(T) = \operatorname{span}(x)^{\perp}$, by dimension considerations ($\dim \operatorname{span}(x)^{\perp} = \dim \mathcal{P}_2 - \dim \operatorname{span}(x) = 3 - 1 = 2$). However, if $p = a_0 + a_2 x^2 \in \operatorname{span}(x)^{\perp}$ then
\[
0 = \langle a_0 + a_2 x^2, x\rangle = \int_0^1 (a_0 + a_2 x^2)\,x\,dx = \tfrac{1}{4}(2a_0 + a_2) \implies a_2 = -2a_0,
\]
so that $p = a_0(1 - 2x^2)$. That is, we have shown that $\operatorname{span}(1 - 2x^2) = \operatorname{span}(x)^{\perp} = \operatorname{null}(T)$. Then we would have
\[
2 = \dim \operatorname{null}(T) = \dim \operatorname{span}(1 - 2x^2) = 1,
\]
which is absurd. Hence our assumption that $T$ is self-adjoint is false.

(b) Theorem 6.47 asserts that the matrix of $T^*$ (relative to bases $C$ of $W$ and $B$ of $V$) is the conjugate transpose of the matrix of $T$ (relative to $B$ of $V$ and $C$ of $W$) only when both $B$ and $C$ are orthonormal. However, the basis $B = C = (1, x, x^2)$ of $\mathcal{P}_2$ is not orthonormal (with respect to the given inner product), so we are not contradicting Theorem 6.47. If we chose an orthonormal basis of $\mathcal{P}_2$, call it $A$ (obtained by applying the Gram-Schmidt process to $(1, x, x^2)$, for example), then we would find $[T]_A \neq [T]_A^t$.

2. This is false. If it were true, then the matrices $A, B$ of any two self-adjoint operators $T, S$ (relative to an orthonormal basis) would satisfy $(AB)^* = AB$, where for a square matrix $C$ we write $C^* = \overline{C}^{\,t}$. However, $(AB)^* = B^* A^* = BA$, since $T, S$ are self-adjoint. So we need only find two non-commuting self-adjoint operators; we can take the following operators on the Euclidean space $\mathbb{C}^2$:
\[
T \colon \mathbb{C}^2 \to \mathbb{C}^2, \quad x \mapsto \begin{bmatrix} 0 & 1 \\ 1 & 0 \end{bmatrix} x, \qquad
S \colon \mathbb{C}^2 \to \mathbb{C}^2, \quad x \mapsto \begin{bmatrix} 2 & 0 \\ 0 & 1 \end{bmatrix} x.
\]
You can check that $TS(e_1) \neq ST(e_1)$, where $e_1 = \begin{bmatrix} 1 \\ 0 \end{bmatrix}$.
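
For concreteness, the failure of commutativity can be checked numerically; the following minimal numpy sketch uses the two matrices displayed above and verifies that $TS(e_1) \neq ST(e_1)$, so that $TS$ is not self-adjoint.

```python
import numpy as np

A = np.array([[0., 1.],
              [1., 0.]])   # matrix of T in the standard basis
B = np.array([[2., 0.],
              [0., 1.]])   # matrix of S in the standard basis
e1 = np.array([1., 0.])

print(A @ (B @ e1))                          # T(S(e1)) = [0., 2.]
print(B @ (A @ e1))                          # S(T(e1)) = [0., 1.]
print(np.allclose((A @ B).conj().T, A @ B))  # False: AB is not self-adjoint
```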

3. (a) Let $T, S \in \mathcal{L}(V)$ be self-adjoint operators on $V$, a real inner product space. Then the zero operator on $V$, $Z \colon V \to V$, $v \mapsto 0_V$, is self-adjoint; we have $(T + S)^* = T^* + S^* = T + S$, so that $T + S$ is self-adjoint; and if $c \in \mathbb{R}$ then $(cT)^* = \bar{c}\,T^* = cT$, so that $cT$ is self-adjoint. Hence the set of self-adjoint operators on a real inner product space is a subspace of $\mathcal{L}(V)$.

(b) If $T$ is a nonzero self-adjoint operator on the complex inner product space $V$, then
\[
(\sqrt{-1}\,T)^* = \overline{\sqrt{-1}}\,T^* = -\sqrt{-1}\,T \neq \sqrt{-1}\,T.
\]
Hence the set of self-adjoint operators is not closed under scalar multiplication.

4. Let $P \in \mathcal{L}(V)$ be such that $P^2 = P$.

($\Rightarrow$) Suppose that $P$ is an orthogonal projection. Then $\operatorname{range}(P) = \operatorname{null}(P)^{\perp}$. Moreover, the only eigenvalues of $P$ are $\lambda = 0, 1$ (this was proved in a previous HW exercise) and $P(v) = v$ for every $v \in \operatorname{range}(P)$ (this is true of any projection), so that $\operatorname{range}(P)$ consists of eigenvectors with eigenvalue $\lambda = 1$. Hence we can find an orthonormal basis $(u_1, \ldots, u_k)$ of $\operatorname{range}(P)$ (using Gram-Schmidt applied to any basis of $\operatorname{range}(P)$) and an orthonormal basis $(v_1, \ldots, v_l)$ of $\operatorname{null}(P)$ (using Gram-Schmidt applied to any basis of $\operatorname{null}(P)$). Then $(u_1, \ldots, u_k, v_1, \ldots, v_l)$ is an orthonormal basis of $V$ consisting of eigenvectors of $P$. Hence, if $V$ is a real inner product space, then $P$ is self-adjoint by the Spectral Theorem. If $V$ is a complex inner product space then, for any $v \in V$, we can write $v = u + z$ with $u \in \operatorname{range}(P)$ and $z \in \operatorname{null}(P)$, so that
\[
\langle P(v), v\rangle = \langle u, u + z\rangle = \langle u, u\rangle + \langle u, z\rangle = \|u\|^2 + 0 \in \mathbb{R}.
\]
Hence $P$ is self-adjoint when $V$ is a complex inner product space.

($\Leftarrow$) Suppose that $P$ is self-adjoint. Then we have $\operatorname{null}(P) = \operatorname{null}(P^*)$ and
\[
\operatorname{range}(P) = \operatorname{null}(P^*)^{\perp} = \operatorname{null}(P)^{\perp} \implies V = \operatorname{null}(P) \oplus \operatorname{null}(P)^{\perp} = \operatorname{null}(P) \oplus \operatorname{range}(P).
\]
Since $P$ is self-adjoint, there is an orthonormal basis of $V$ consisting of eigenvectors of $P$; call it $(u_1, \ldots, u_k, v_1, \ldots, v_l)$, where $P(u_i) = 0_V$ and $P(v_i) \neq 0_V$. Hence $\dim \operatorname{null}(P) = k$ (i.e., the $u$'s are an orthonormal basis of $\operatorname{null}(P)$) and, since $\operatorname{span}(v_1, \ldots, v_l) \subseteq \operatorname{null}(P)^{\perp} = \operatorname{range}(P)$ and $\dim \operatorname{range}(P) = \dim V - \dim \operatorname{null}(P) = (k + l) - k = l$, we see that $\operatorname{span}(v_1, \ldots, v_l) = \operatorname{range}(P)$. Thus $(v_1, \ldots, v_l)$ is an orthonormal basis of $\operatorname{range}(P)$ consisting of eigenvectors of $P$ with nonzero associated eigenvalues. As we are assuming that $P^2 = P$, the only eigenvalues of $P$ are $\lambda = 0, 1$, so that the only nonzero eigenvalue is $\lambda = 1$. Hence, for every $u \in \operatorname{range}(P)$ we have $P(u) = u$. Thus, since we can write any $v \in V$ as $v = z + u$ with $z \in \operatorname{null}(P)$ and $u \in \operatorname{range}(P)$, we see that $P(v) = P(z) + P(u) = 0_V + u = u$, so that $P$ is the projection onto $\operatorname{range}(P)$ with $\operatorname{null}(P) = \operatorname{range}(P)^{\perp}$; hence it is an orthogonal projection.
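
To see the content of problem 4 in matrix form, here is a small numpy sketch; the two idempotent matrices are chosen here just for illustration (they are not taken from the assignment). The orthogonal projection of $\mathbb{R}^2$ onto $\operatorname{span}((1,0))$ is symmetric, while an oblique projection onto the same line along $\operatorname{span}((1,1))$ is idempotent but not symmetric, hence not an orthogonal projection.

```python
import numpy as np

# Orthogonal projection of R^2 onto span((1, 0)).
P = np.array([[1., 0.],
              [0., 0.]])

# Oblique projection onto span((1, 0)) along span((1, 1)).
Q = np.array([[1., -1.],
              [0.,  0.]])

for name, M in (("P", P), ("Q", Q)):
    print(name,
          np.allclose(M @ M, M),   # idempotent: True for both
          np.allclose(M, M.T))     # self-adjoint: True only for P
```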

5. Let $V$ be an inner product space and take an orthonormal basis $(v_1, \ldots, v_n)$ of $V$ (so that $n \geq 2$). Consider the normal operators $T, S \in \mathcal{L}(V)$ defined as follows:
\[
T(v_1) = 3v_1, \quad T(v_2) = v_2, \quad T(v_i) = 0_V \ (i \geq 3); \qquad S(v_1) = v_2, \quad S(v_2) = -v_1, \quad S(v_i) = 0_V \ (i \geq 3).
\]
Then the $n \times n$ matrices of $T, S$ with respect to the given basis are
\[
A = \begin{bmatrix} 3 & 0 & \cdots & 0 \\ 0 & 1 & \cdots & 0 \\ \vdots & \vdots & \ddots & \vdots \\ 0 & 0 & \cdots & 0 \end{bmatrix}, \qquad
B = \begin{bmatrix} 0 & -1 & \cdots & 0 \\ 1 & 0 & \cdots & 0 \\ \vdots & \vdots & \ddots & \vdots \\ 0 & 0 & \cdots & 0 \end{bmatrix}.
\]
Since $A = A^t$ we have $T^* = T$, and as $BB^t = B^t B$ we have $SS^* = S^* S$, giving that both $T$ and $S$ are normal. Now we see that the matrix of $T + S$ is
\[
A + B = \begin{bmatrix} 3 & -1 & \cdots & 0 \\ 1 & 1 & \cdots & 0 \\ \vdots & \vdots & \ddots & \vdots \\ 0 & 0 & \cdots & 0 \end{bmatrix},
\]
and this last matrix is not diagonalisable (so that $T + S$ is not diagonalisable): indeed, the eigenvalues of $T + S$ are $\lambda = 0, 2$, and $\operatorname{null}(T + S) = \operatorname{span}(v_3, \ldots, v_n)$ while the $\lambda = 2$ eigenspace is $\operatorname{span}(v_1 + v_2)$. If $T + S$ were diagonalisable then we would need two linearly independent eigenvectors with eigenvalue $\lambda = 2$, which obviously can't be the case. Hence $T + S$ is not normal (it isn't diagonalisable).

6. Let $T \in \mathcal{L}(V)$ be normal. Then we must have $\operatorname{null}(T) = \operatorname{null}(T^*)$ (this is at the top of p. 131). Hence
\[
\operatorname{range}(T) = \operatorname{null}(T^*)^{\perp} = \operatorname{null}(T)^{\perp} = \operatorname{range}(T^*).
\]
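
The claims in problem 5 are easy to confirm numerically. The sketch below takes the case $n = 3$ of the matrices $A$ and $B$ above: it checks that $A$ and $B$ are normal, that $A + B$ is not, and that the eigenvalue $\lambda = 2$ of $A + B$ has only a one-dimensional eigenspace.

```python
import numpy as np

def is_normal(M):
    # For a real matrix M we have M* = M^t.
    return np.allclose(M @ M.T, M.T @ M)

A = np.array([[3., 0., 0.],
              [0., 1., 0.],
              [0., 0., 0.]])
B = np.array([[0., -1., 0.],
              [1.,  0., 0.],
              [0.,  0., 0.]])

print(is_normal(A), is_normal(B), is_normal(A + B))        # True True False

eigvals = np.linalg.eigvals(A + B)
print(np.round(np.sort(eigvals.real), 4))                  # approximately [0. 2. 2.]
# Dimension of the lambda = 2 eigenspace: 3 - rank(A + B - 2I) = 1.
print(3 - np.linalg.matrix_rank(A + B - 2 * np.eye(3)))    # 1
```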

In particular, null(t ) range(t ) = {0} Let s prove null(t j ) = null(t ), for every j 1, by induction: the case j = 1 is trivial Assume the result hold for j = s - we ll show it holds for j = s + 1 Since null(t ) null(t s+1 ) always holds, we need only show that null(t ) null(t s+1 ) So, let z null(t s+1 ) Then, 0 = T s+1 (z) = T (T s (z)) = T s (z) null(t ) range(t ) = {0} = z null(t s ) = null(t ), by induction Hence, null(t s+1 ) null(t ) and the result is proved 8 The requirements on T imply that the vectors (1, 2, 3) and (2, 5, 7) are eigenvectors of T However, with respect to the dot product on R 3, we see that (1, 2, 3) (2, 5, 7) = 2 + 10 + 21 = 33 0 so that eigenvectors corresponding to distinct eigenvalues are not orthogonal, contradicting Corollary 78 9 ( ) Suppose that T is self-adjoint Then, by Proposition 71 we see that all eigenvalues of T are real ( ) Suppose that all eigenvalues of the normal operator T are real Then, by the (complex) Spectral Theorem, we can find an orthonormal basis B of V consisting of eigenvectors of T Hence, we have the matrix of T relative to B is Then, we have that Hence, T is self-adjoint [T ] B =, where λ 1,, λ n R [T ] B = [T ] t B = [T ] B = T = T 10 Since T is normal, there is an orthonormal basis B of V consisting of eigenvectors of T Then, we have λ i [T ] B =, and [T i ] B = 0 λ i n Hence, if T 8 = T 9 then we must have λ 8 [T 8 ] B = 0 λ 8 n so that, foe each i = 1,, n λ 9 0 λ 9 n λ 8 i = λ 9 i = λ 8 i (1 λ i ) = 0 [T 9 ] B 4

10. Since $T$ is normal, there is an orthonormal basis $B$ of $V$ consisting of eigenvectors of $T$. Then we have
\[
[T]_B = \begin{bmatrix} \lambda_1 & & \\ & \ddots & \\ & & \lambda_n \end{bmatrix}
\qquad \text{and} \qquad
[T^i]_B = \begin{bmatrix} \lambda_1^i & & \\ & \ddots & \\ & & \lambda_n^i \end{bmatrix}.
\]
Hence, if $T^8 = T^9$ then we must have
\[
[T^8]_B = \begin{bmatrix} \lambda_1^8 & & \\ & \ddots & \\ & & \lambda_n^8 \end{bmatrix} = \begin{bmatrix} \lambda_1^9 & & \\ & \ddots & \\ & & \lambda_n^9 \end{bmatrix} = [T^9]_B,
\]
so that, for each $i = 1, \ldots, n$,
\[
\lambda_i^8 = \lambda_i^9 \implies \lambda_i^8(1 - \lambda_i) = 0.
\]
In particular, each eigenvalue $\lambda_i$ is equal to either $1$ or $0$. Since the eigenvalues of $T$ are real, $T$ is self-adjoint (by the previous exercise). Moreover, if we assume that $\lambda_1 = \cdots = \lambda_k = 0$ and $\lambda_{k+1} = \cdots = \lambda_n = 1$, then we have
\[
[T^2]_B = \begin{bmatrix} \lambda_1^2 & & \\ & \ddots & \\ & & \lambda_n^2 \end{bmatrix} = \begin{bmatrix} \lambda_1 & & \\ & \ddots & \\ & & \lambda_n \end{bmatrix} = [T]_B,
\]
so that $T^2 = T$.

11. Let $T$ be normal and let $B = (v_1, \ldots, v_n)$ be an orthonormal basis of $V$ consisting of eigenvectors of $T$, so that
\[
[T]_B = \begin{bmatrix} \lambda_1 & & \\ & \ddots & \\ & & \lambda_n \end{bmatrix}.
\]
Then, by the Fundamental Theorem of Algebra, we can find a (complex) square root of $\lambda_i$ for each $i = 1, \ldots, n$: suppose that $\mu_i^2 = \lambda_i$ for each $i$. Define the operator $S \in \mathcal{L}(V)$ as follows: $S(v_1) = \mu_1 v_1, \ldots, S(v_n) = \mu_n v_n$. Then we have
\[
[S^2]_B = \begin{bmatrix} \mu_1^2 & & \\ & \ddots & \\ & & \mu_n^2 \end{bmatrix} = [T]_B \implies S^2 = T.
\]
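
The construction in problem 11 is exactly how a square root of a normal matrix can be computed in practice: diagonalise, take complex square roots of the eigenvalues, and conjugate back. A minimal numpy sketch, with a matrix chosen here for illustration (a scaled rotation, which is normal with eigenvalues $\pm 2i$):

```python
import numpy as np

T = np.array([[0., -2.],
              [2.,  0.]])                    # normal: T T^t = T^t T = 4I

# Diagonalise T, take principal square roots of the eigenvalues, conjugate back.
lams, U = np.linalg.eig(T)
S = U @ np.diag(np.sqrt(lams.astype(complex))) @ np.linalg.inv(U)

print(np.allclose(S @ S, T))                 # True: S^2 = T
```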