E2 212: Matrix Theory (Fall 2010) Solutions to Test - 1


1. Let $X = [x_1, x_2, \ldots, x_n] \in \mathbb{R}^{m \times n}$ be a tall matrix. Let $S = R(X)$, and let $P$ be an orthogonal projector onto $S$.

(a) If $X$ is full rank, show that $P$ can be written as $P = X(X^T X)^{-1} X^T$. What is $\|P\|_2$? (4 points)

(b) Suitably modify $P$ when $X$ is not full rank. Hint: Let $Q$ be an orthonormal basis for $R(X)$. (4 points)

(c) Let $X$ be full rank and let $B$ be a square matrix such that $B^T B = X^T X$. What special property does the matrix $XB^{-1}$ possess? Using this, what can you say about the matrix $P = X(X^T X)^{-1} X^T$? (4 points)

(d) Show that the rank of $P$ is equal to its trace. (4 points)

(e) What are the singular values of $P$? (4 points)

Solution. (a) Recall that, by definition, a matrix $P \in \mathbb{R}^{m \times m}$ is an orthogonal projector onto $S$ if (i) $R(P) = S$, (ii) $P^2 = P$, and (iii) $P^T = P$. Since $X$ has full column rank, $X^T X$ is invertible, so $P = X(X^T X)^{-1} X^T$ is well defined. For any $y \in \mathbb{R}^m$, $Py = X[(X^T X)^{-1} X^T y] \in R(X)$, and for any $s = Xa \in S$, $Ps = X(X^T X)^{-1} X^T X a = Xa = s$; hence $R(P) = S$. The other two properties are straightforward to verify. Finally, $\|P\|_2 = 1$, because the eigenvalues of $P$ are all $0$ or $1$ (since $P^2 = P$), the largest eigenvalue is $1$, and $P$ is symmetric.

(b) It is simple to verify that $QQ^T$ is the required projector, where the columns of $Q$ form an orthonormal basis for $R(X)$; it satisfies the three properties above. Alternatively, if $X_{\mathrm{sub}}$ is the matrix constructed from the largest possible subset of linearly independent columns of $X$ (think of the Gram-Schmidt procedure), then $P = X_{\mathrm{sub}}(X_{\mathrm{sub}}^T X_{\mathrm{sub}})^{-1} X_{\mathrm{sub}}^T$ is the required projector.

(c) $(XB^{-1})^T (XB^{-1}) = B^{-T} X^T X B^{-1} = B^{-T} B^T B B^{-1} = I$, hence $XB^{-1}$ has orthonormal columns. Moreover, its columns span $R(X)$, since $B^{-1}$ is full rank. By (b), this implies that $P = XB^{-1}(XB^{-1})^T = X(B^T B)^{-1} X^T = X(X^T X)^{-1} X^T$ is an orthogonal projector onto $R(X)$.

(d) There are multiple ways of solving this. One is to write $P = QQ^T$, where $Q$ has $r$ orthonormal columns and $r = \mathrm{rank}(P)$. Then $\mathrm{trace}(P) = \mathrm{trace}(QQ^T) = \mathrm{trace}(Q^T Q) = \mathrm{trace}(I_r) = r = \mathrm{rank}(P)$.

(e) The singular values are $1$ and $0$. (They are the same as the eigenvalues, since $P$ is symmetric and $P^2 = P$.)

2. Let $A$ and $B$ be $n \times n$ matrices that commute: $AB = BA$. If $\lambda$ is an eigenvalue of $A$, let $V_\lambda$ be the subspace of all eigenvectors corresponding to this eigenvalue.

(a) Prove that the eigenspace $V_\lambda$ is an invariant subspace of the matrix $B$, i.e., show that $\forall v \in V_\lambda$, $Bv \in V_\lambda$. (4 points)

(b) Show that $\exists v \in V_\lambda$ such that $v$ is an eigenvector of $B$. (12 points)

Solution. (a) Let $v \in V_\lambda$, so that $Av = \lambda v$. Then $A(Bv) = (AB)v = (BA)v = B(Av) = \lambda(Bv)$, so $Bv$ is either zero or an eigenvector of $A$ with eigenvalue $\lambda$. In either case, $Bv \in V_\lambda$.

(b) See Matrix Analysis by Horn and Johnson, Lemma 1.3.17, p. 51. (The key observation is that, by part (a), the restriction of $B$ to $V_\lambda$ is a linear operator on $V_\lambda$, and every linear operator on a nonzero complex vector space has at least one eigenvector.)

3. Let $A$ be an $n \times n$ matrix and $z$ be a vector with the property that $A^{k-1} z \neq 0$ but $A^k z = 0$. Show that $z, Az, \ldots, A^{k-1} z$ are linearly independent. (8 points)

Solution. We need to prove that $\alpha_1 z + \alpha_2 Az + \cdots + \alpha_k A^{k-1} z = 0 \Rightarrow \alpha_i = 0$ for $i = 1, \ldots, k$. Suppose $\alpha_1 z + \alpha_2 Az + \cdots + \alpha_k A^{k-1} z = 0$. Multiplying by $A^{k-1}$ and using the fact that $A^j z = 0$ for $j \geq k$, we get
$$A^{k-1}\left[\alpha_1 z + \alpha_2 Az + \cdots + \alpha_k A^{k-1} z\right] = \alpha_1 A^{k-1} z = 0 \;\Rightarrow\; \alpha_1 = 0.$$
Setting $\alpha_1 = 0$ and multiplying the equation by $A^{k-2}$, we can similarly show that $\alpha_2 = 0$. Continuing in this way, we get $\alpha_1 = \alpha_2 = \cdots = \alpha_k = 0$.

4. There are no square matrices $A$, $B$ with the property that $AB - BA = I$. Prove or give a counterexample. (8 points)

Solution. Let us assume such matrices of size $N \times N$ exist. Taking the trace of both sides and using the fact that $\mathrm{trace}(AB) = \mathrm{trace}(BA)$, we get $0$ on the left-hand side and $N$ on the right-hand side, which is a contradiction.

5. What is wrong with this argument for the proof of the Cayley-Hamilton theorem: since $p_A(t) = \det(tI - A)$, $p_A(A) = \det(AI - A) = 0$, therefore $p_A(A) = 0$? (4 points)

Solution. One cannot substitute the matrix $A$ for the scalar $t$ inside the determinant: $\det(AI - A)$ is the scalar $\det(0) = 0$, whereas $p_A(A)$ is a matrix. One has to write out the characteristic polynomial in powers of $t$ and then substitute $A$ for $t$ (with the constant term multiplied by $I$).

6. Consider the Householder reflection, for $x \in \mathbb{R}^n$:
$$H_x = I - 2\,\frac{xx^T}{x^T x}.$$

(a) Show that $H_x x = -x$, and that if $y \in \mathbb{R}^n$ is such that $y^T x = x^T y = 0$, then $H_x y = y$. (4 points)

(b) Show that $H_x$ is symmetric and orthogonal. What is the trace of $H_x$? (4 points)

(c) Given linearly independent vectors $y, z \in \mathbb{R}^n$, find $x$ as a linear combination of $y$ and $z$ such that $H_x y \in \mathrm{span}(z)$. (8 points)

Solution. (a) $H_x x = x - \dfrac{2x(x^T x)}{x^T x} = x - 2x = -x$. If $x^T y = 0$, then $H_x y = y - \dfrac{2x(x^T y)}{x^T x} = y$.

(b) $H_x^T = \left(I - 2\,\dfrac{xx^T}{x^T x}\right)^T = I - 2\,\dfrac{(xx^T)^T}{x^T x} = I - 2\,\dfrac{xx^T}{x^T x} = H_x$, and
$$H_x^T H_x = \left(I - 2\,\frac{xx^T}{x^T x}\right)\left(I - 2\,\frac{xx^T}{x^T x}\right) = I - 4\,\frac{xx^T}{x^T x} + 4\,\frac{(xx^T)(xx^T)}{(x^T x)^2} = I - 4\,\frac{xx^T}{x^T x} + 4\,\frac{x(x^T x)x^T}{(x^T x)^2} = I.$$
Also, $\mathrm{trace}(H_x) = n - 2$, since the eigenvalues are $+1$ (with multiplicity $n-1$) and $-1$ (with multiplicity $1$), and the trace equals the sum of the eigenvalues.
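The projector and reflector properties established in Problems 1 and 6 lend themselves to a quick numerical sanity check. The NumPy sketch below (the sizes and random seed are arbitrary illustrative choices, not part of the original problems) verifies each claimed identity on a random instance.

```python
import numpy as np

rng = np.random.default_rng(0)

# Problem 1: orthogonal projector P = X (X^T X)^{-1} X^T for a tall full-rank X.
m, n = 6, 3
X = rng.standard_normal((m, n))
P = X @ np.linalg.inv(X.T @ X) @ X.T

assert np.allclose(P @ P, P)                              # idempotent: P^2 = P
assert np.allclose(P.T, P)                                # symmetric
assert np.isclose(np.trace(P), np.linalg.matrix_rank(P))  # trace = rank = n
assert np.isclose(np.linalg.norm(P, 2), 1.0)              # ||P||_2 = 1

# Singular values of P: 1 with multiplicity n, 0 with multiplicity m - n.
s = np.linalg.svd(P, compute_uv=False)
assert np.allclose(s, [1.0] * n + [0.0] * (m - n), atol=1e-10)

# Problem 6: Householder reflection H_x = I - 2 x x^T / (x^T x).
x = rng.standard_normal(m)
H = np.eye(m) - 2 * np.outer(x, x) / (x @ x)

assert np.allclose(H @ x, -x)            # H_x x = -x
assert np.allclose(H.T, H)               # symmetric
assert np.allclose(H.T @ H, np.eye(m))   # orthogonal
assert np.isclose(np.trace(H), m - 2)    # trace = n - 2
```

Every assertion passes, matching the algebraic proofs above.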

(c) Let $x = ay + bz$. Then
$$H_x y = y - 2\,\frac{(ay+bz)(ay+bz)^T y}{(ay+bz)^T(ay+bz)} = y - \frac{2\left(a\,y^T y + b\,z^T y\right)(ay + bz)}{a^2\, y^T y + 2ab\, y^T z + b^2\, z^T z},$$
that is,
$$H_x y = \left(1 - \frac{2a\left(a\,y^T y + b\,z^T y\right)}{a^2\, y^T y + 2ab\, y^T z + b^2\, z^T z}\right) y \;-\; \frac{2b\left(a\,y^T y + b\,z^T y\right)}{a^2\, y^T y + 2ab\, y^T z + b^2\, z^T z}\, z.$$
For this vector to lie in $\mathrm{span}(z)$, the coefficient of $y$ must be zero:
$$a^2\, y^T y + 2ab\, y^T z + b^2\, z^T z = 2a^2\, y^T y + 2ab\, z^T y \;\Rightarrow\; b^2\, z^T z - a^2\, y^T y = 0.$$
One possible solution is $a = 1$ and $b = \|y\|_2 / \|z\|_2$.

7. Let $A \in \mathbb{R}^{m \times n}$ with $m \geq n$ and $y \in \mathbb{R}^n$, and define
$$\tilde{A} = \begin{bmatrix} A \\ y^T \end{bmatrix}.$$
Show that $\sigma_n(\tilde{A}) \geq \sigma_n(A)$ and $\sigma_1(\tilde{A}) \leq \sqrt{\|A\|_2^2 + \|y\|_2^2}$. (12 points)

Solution. First, note that $\tilde{A}^T \tilde{A} = A^T A + yy^T$. Also, $\sigma_n^2(A) = \lambda_{\min}(A^T A)$ and $\sigma_n^2(\tilde{A}) = \lambda_{\min}(\tilde{A}^T \tilde{A})$. Now,
$$\lambda_{\min}(\tilde{A}^T \tilde{A}) = \min_{\|x\|=1} x^T \tilde{A}^T \tilde{A} x = \min_{\|x\|=1} \left[ x^T A^T A x + (y^T x)^2 \right] \geq \min_{\|x\|=1} x^T A^T A x = \lambda_{\min}(A^T A),$$
where the inequality holds because the term $(y^T x)^2$ is always non-negative. Thus, $\sigma_n(\tilde{A}) \geq \sigma_n(A)$. To show the second part,
$$\sigma_1^2(\tilde{A}) = \max_{\|x\|=1} \left[ x^T A^T A x + (y^T x)^2 \right] \leq \max_{\|x\|=1} x^T A^T A x + \max_{\|x\|=1} (y^T x)^2 = \|A\|_2^2 + \|y\|_2^2.$$
Thus, $\sigma_1^2(\tilde{A}) \leq \|A\|_2^2 + \|y\|_2^2$, which proves the required result.

8. (a) Show that any square matrix which commutes with a diagonal matrix whose diagonal entries are all distinct is itself diagonal. (4 points)

(b) Two matrices are called simultaneously diagonalizable if they are diagonalized by the same matrix. Show that simultaneously diagonalizable matrices commute. (4 points)

(c) If two matrices commute and one of them is similar to a diagonal matrix with all distinct diagonal entries, show that they are simultaneously diagonalizable. (8 points)

Solution. (a) Let $A = [a_{ij}]$ be a square matrix which commutes with the diagonal matrix $D = \mathrm{diag}(d_1, \ldots, d_n)$, i.e., $AD = DA$. Comparing the $(i,j)$th entries of the two sides,
$$\sum_{l=1}^n a_{il} d_{lj} = \sum_{l=1}^n d_{il} a_{lj} \;\Rightarrow\; a_{ij} d_{jj} = d_{ii} a_{ij} \;\Rightarrow\; a_{ij}\left(d_{jj} - d_{ii}\right) = 0.$$
Since all the diagonal entries of $D$ are distinct, it follows that $a_{ij} = 0$ for $i \neq j$. Hence $A$ is a diagonal matrix.

(b) Let $A$ and $B$ be two simultaneously diagonalizable matrices, with $SAS^{-1} = D_1$ and $SBS^{-1} = D_2$, where $D_1$ and $D_2$ are diagonal. Since diagonal matrices always commute,
$$AB = S^{-1} D_1 S\, S^{-1} D_2 S = S^{-1} D_1 D_2 S = S^{-1} D_2 D_1 S = S^{-1} D_2 S\, S^{-1} D_1 S = BA.$$

(c) Let $AB = BA$ and $SAS^{-1} = D_1$, where $D_1$ is diagonal with all distinct diagonal entries. Then
$$AB = BA \;\Rightarrow\; S^{-1} D_1 S\, B = B\, S^{-1} D_1 S \;\Rightarrow\; D_1 (SBS^{-1}) = (SBS^{-1}) D_1.$$
So $SBS^{-1}$ commutes with a diagonal matrix whose diagonal entries are all distinct; by (a), $SBS^{-1}$ is therefore diagonal, and $S$ diagonalizes both $A$ and $B$.
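The conclusion of Problem 8(c) can also be illustrated numerically. The sketch below (the size, seed, and the choice $B = A^2 + 3A$ are illustrative assumptions; any polynomial in $A$ commutes with $A$) builds a commuting pair in which $A$ has distinct eigenvalues, and confirms that the similarity diagonalizing $A$ also diagonalizes $B$.

```python
import numpy as np

rng = np.random.default_rng(1)

# Build A = S^{-1} D1 S with distinct diagonal entries in D1,
# and a commuting partner B = A^2 + 3A.
n = 4
S = rng.standard_normal((n, n))           # almost surely invertible
D1 = np.diag([1.0, 2.0, 3.0, 4.0])        # distinct diagonal entries
A = np.linalg.solve(S, D1 @ S)            # A = S^{-1} D1 S
B = A @ A + 3 * A

assert np.allclose(A @ B, B @ A)          # the pair commutes

# S A S^{-1} = D1 by construction; Problem 8(c) says S B S^{-1} is diagonal too.
SBSi = S @ B @ np.linalg.inv(S)
assert np.allclose(SBSi, np.diag(np.diag(SBSi)))  # off-diagonal entries vanish
```

Here $SBS^{-1} = D_1^2 + 3D_1 = \mathrm{diag}(4, 10, 18, 28)$, exactly as the proof predicts.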