CS205 Homework #2 Solutions

Problem 1 [Heath 3.29, page 152]

Let v be a nonzero n-vector. The hyperplane normal to v is the (n-1)-dimensional subspace of all vectors z such that v^T z = 0. A reflector is a linear transformation R such that Rx = -x if x is a scalar multiple of v, and Rx = x if v^T x = 0. Thus, the hyperplane acts as a mirror: for any vector, its component within the hyperplane is invariant, whereas its component orthogonal to the hyperplane is reversed.

1. Show that R = 2P - I, where P is the orthogonal projector onto the hyperplane normal to v. Draw a picture to illustrate this result.
2. Show that R is symmetric and orthogonal.
3. Show that the Householder transformation H = I - 2 (v v^T)/(v^T v) is a reflector.
4. Show that for any two vectors s and t such that s != t and ||s||_2 = ||t||_2, there is a reflector R such that Rs = t.

Solution

1. We can obtain the reflection Rx of a vector x with respect to a hyperplane through the origin by adding to x twice the vector from x to Px, where Px is the projection of x onto the same hyperplane (see figure 1). Thus

   Rx = x + 2(Px - x) = 2Px - x = (2P - I)x.

Since this has to hold for all x, we have R = 2P - I. An alternative way to derive the same result is to observe that the projection Px lies halfway between x and its reflection Rx. Therefore

   (1/2)(x + Rx) = Px  =>  Rx = 2Px - x = (2P - I)x,

which leads to the same result.

2. A reflection with respect to a hyperplane through the origin does not change the magnitude of the reflected vector (see figure 1). Therefore we have

   ||Rx||_2 = ||x||_2  =>  x^T R^T R x = x^T x  =>  x^T (R^T R - I) x = 0

for any vector x.
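The claims of this problem can be spot-checked numerically alongside the formal proofs. A small numpy sketch (not part of the original solution; the random test vectors are arbitrary):

```python
import numpy as np

rng = np.random.default_rng(0)
n = 5
v = rng.standard_normal(n)

# Orthogonal projector onto the hyperplane {z : v^T z = 0}
P = np.eye(n) - np.outer(v, v) / (v @ v)
R = 2 * P - np.eye(n)                      # part 1: the reflector R = 2P - I

assert np.allclose(R @ v, -v)              # R reverses multiples of v
z = P @ rng.standard_normal(n)             # z lies in the hyperplane
assert np.allclose(R @ z, z)               # R fixes the hyperplane
assert np.allclose(R, R.T)                 # part 2: symmetric ...
assert np.allclose(R.T @ R, np.eye(n))     # ... and orthogonal

# Part 3: 2P - I coincides with the Householder matrix I - 2 v v^T / (v^T v)
H_v = np.eye(n) - 2 * np.outer(v, v) / (v @ v)
assert np.allclose(R, H_v)

# Part 4: the reflector built from w = s - t maps s to t when ||s|| = ||t||
s = rng.standard_normal(n)
t = rng.standard_normal(n)
t *= np.linalg.norm(s) / np.linalg.norm(t)
w = s - t
H = np.eye(n) - 2 * np.outer(w, w) / (w @ w)
assert np.allclose(H @ s, t)
```

Note that the identity R = 2P - I makes part 3 immediate once P = I - v v^T/(v^T v) is recognized as the projector onto the hyperplane.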

Figure 1: Reflector

If we could show that x^T (R^T R - I) x = 0 for all x implies R^T R - I = 0, we would have proven the orthogonality of R. Furthermore, since reflecting a vector twice gives back the original vector, we have R^2 = I. Therefore we would have

   R^T R = I  =>  R^T R^2 = R  =>  R^T = R,

which shows that R is symmetric.

In order to show that R^T R - I = 0, it suffices to show that for a symmetric matrix C, x^T C x = 0 for all x implies C = 0 (since R^T R - I is symmetric). To show that, we note that e_i^T C e_j = C_ij, where e_k is the k-th column of the identity matrix. We have e_i^T C e_i = C_ii = 0 for any i, and furthermore

   0 = (e_i + e_j)^T C (e_i + e_j) = C_ii + C_jj + C_ij + C_ji = 2 C_ij = 2 C_ji,

so C = 0.

3. The Householder matrix reverses all vectors in the direction of v,

   H(alpha v) = (I - 2 (v v^T)/(v^T v)) (alpha v) = alpha v - 2 alpha v (v^T v)/(v^T v) = alpha (v - 2v) = -(alpha v),

and leaves all vectors x with v^T x = 0 invariant,

   Hx = (I - 2 (v v^T)/(v^T v)) x = x - 2 v (v^T x)/(v^T v) = x.

Therefore, H is a reflector about the hyperplane {x : v^T x = 0}.

4. Any two vectors s and t are reflections of each other with respect to the hyperplane normal to the vector s - t that passes through the midpoint of s and t. When ||s||_2 = ||t||_2,

that hyperplane passes through the origin and can be written as {x : (s - t)^T x = 0}. Therefore the Householder transform

   H = I - 2 ((s - t)(s - t)^T) / ((s - t)^T (s - t))

is the reflection that maps s to t and vice versa. To show that formally, we have

   Hs = s - 2 ((s - t)(s - t)^T s) / ((s - t)^T (s - t))
      = [ s (s - t)^T (s - t) - 2 (s - t)(s - t)^T s ] / ((s - t)^T (s - t)).

The numerator expands to

   s s^T s - 2 s t^T s + s t^T t - 2 s s^T s + 2 s t^T s + 2 t s^T s - 2 t t^T s
      = 2 t s^T s - 2 t t^T s           [s s^T s + s t^T t - 2 s s^T s = 0,  -2 s t^T s + 2 s t^T s = 0]
      = t s^T s - 2 t t^T s + t t^T t   [s^T s = t^T t]
      = t (s - t)^T (s - t),

where we used ||s||_2 = ||t||_2, i.e. s^T s = t^T t. Dividing by (s - t)^T (s - t) gives Hs = t.

Problem 2

Let A be a rectangular m x n matrix with full column rank and m > n. Consider the (thin) QR decomposition A = QR, where Q is m x n with orthonormal columns and R is n x n upper triangular.

1. Show that P_0 = I - Q Q^T is the projection matrix onto the nullspace of A^T.
2. Show that for every x we have

      ||Ax - b||_2^2 = ||A(x - x_0)||_2^2 + ||A x_0 - b||_2^2,

   where x_0 is the least squares solution of Ax = b.
3. Show that the minimum value for the 2-norm of the residual is attained when x is equal to the least squares solution, and that this minimum value is equal to ||P_0 b||_2.

Solution

1. We know that the nullspace of A^T and the column space of A are orthogonal complements of each other. Therefore, any vector x can be written as x = x_1 + x_2, where x_1 is in the nullspace of A^T and x_2 is in the column space of A. For x_1 this means that A^T x_1 = 0. Since A is full rank, the QR decomposition is defined and

   A^T x_1 = 0  =>  R^T Q^T x_1 = 0  =>  Q^T x_1 = 0,

since R is nonsingular. On the other hand, x_2 belongs to the column space of A, therefore it can be written as x_2 = Ay = QRy, where y ∈ R^n.
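Anticipating the results established in the rest of this solution, all three claims of Problem 2 can be verified numerically. A numpy sketch with arbitrary sizes (not part of the original solution):

```python
import numpy as np

rng = np.random.default_rng(1)
m, n = 8, 3
A = rng.standard_normal((m, n))          # full column rank with probability 1
b = rng.standard_normal(m)

Q, R = np.linalg.qr(A)                   # thin QR: Q is m x n, R is n x n
P0 = np.eye(m) - Q @ Q.T

# 1. P0 projects onto null(A^T): it annihilates col(A) and is idempotent
assert np.allclose(P0 @ A, 0)
assert np.allclose(P0 @ P0, P0)

# 2. the residual splits as claimed for an arbitrary competitor x
x0 = np.linalg.solve(R, Q.T @ b)         # least squares solution
x = rng.standard_normal(n)
lhs = np.linalg.norm(A @ x - b) ** 2
rhs = np.linalg.norm(A @ (x - x0)) ** 2 + np.linalg.norm(A @ x0 - b) ** 2
assert np.isclose(lhs, rhs)

# 3. the minimum residual norm equals ||P0 b||_2
assert np.isclose(np.linalg.norm(A @ x0 - b), np.linalg.norm(P0 @ b))
```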

Thus, the action of P_0 on x amounts to

   P_0 x = (I - Q Q^T)(x_1 + x_2) = x_1 + x_2 - Q Q^T x_1 - Q Q^T Q R y = x_1 + x_2 - Q R y = x_1.

Therefore, P_0 is the projection matrix onto the nullspace of A^T (and Q Q^T is the projection matrix onto the column space of A).

2. We have

   ||Ax - b||_2^2 = ||A(x - x_0) + (A x_0 - b)||_2^2
                  = [A(x - x_0) + (A x_0 - b)]^T [A(x - x_0) + (A x_0 - b)]
                  = ||A(x - x_0)||_2^2 + ||A x_0 - b||_2^2 + 2 (x - x_0)^T A^T (A x_0 - b)
                  = ||A(x - x_0)||_2^2 + ||A x_0 - b||_2^2 + 2 (x - x_0)^T (A^T A x_0 - A^T b)
                  = ||A(x - x_0)||_2^2 + ||A x_0 - b||_2^2,   [A^T A x_0 - A^T b = 0 by the normal equations]

3. From the equation above, the minimum value of ||Ax - b||_2 is attained for x = x_0, since the term ||A x_0 - b||_2^2 does not depend on x. The least squares solution is given by

   R x_0 = Q^T b  =>  x_0 = R^{-1} Q^T b.

Therefore the minimum value of the residual ||Ax - b||_2 is

   ||A x_0 - b||_2 = ||Q R R^{-1} Q^T b - b||_2 = ||Q Q^T b - b||_2 = ||(Q Q^T - I) b||_2 = ||(I - Q Q^T) b||_2 = ||P_0 b||_2.

Intuitively, this means that the least squares solution annihilates the component of the residual in the column space of A, and the minimum value of the residual is exactly the norm of the component of b that is not contained in the column space of A.

Problem 3

State whether the following classes of matrices are positive (semi-)definite, negative (semi-)definite, indefinite, or whether their definiteness cannot be determined in general.

1. Orthogonal matrices
2. Matrices of the form A^T A, where A is a rectangular matrix
3. Projection matrices
4. Matrices of the form I - P, where P is a projection matrix
5. Householder matrices

6. Upper triangular matrices with positive diagonal elements
7. Diagonally dominant matrices with positive elements on the diagonal. A matrix is called diagonally dominant if a_ii > sum_{j != i} |a_ij| and a_ii > sum_{j != i} |a_ji|.

Solution

1. Any diagonal matrix with values +1 or -1 on the diagonal is orthogonal. Nevertheless, it can be positive definite (if it equals I), negative definite (if it equals -I), or indefinite (in any other case). Thus the definiteness of orthogonal matrices cannot be determined in the general case.

2. We have x^T (A^T A) x = ||Ax||_2^2 >= 0. Thus a matrix of the form A^T A is always positive semidefinite. In addition, if A has full column rank, then A^T A is positive definite (since then Ax = 0 => x = 0).

3. Let V be the vector subspace that a projection matrix P projects onto, and V^⊥ its orthogonal complement. Let x = x_1 + x_2 be an arbitrary vector, where x_1 is the component of x in V and x_2 its component in V^⊥. Therefore

   x^T P x = (x_1 + x_2)^T P (x_1 + x_2) = x_1^T P x_1 = ||x_1||_2^2 >= 0,

where we used the fact that P is symmetric and P x_2 = 0. Therefore a projection matrix is always positive semidefinite.

4. The matrix I - P is the projection onto the orthogonal complement of the space P projects onto. Therefore it is a projection matrix itself, and thus positive semidefinite.

5. Given the Householder matrix H = I - 2 (v v^T)/(v^T v), we have

   v^T H v = v^T (-v) = -||v||_2^2 < 0,

while if w is a nonzero vector orthogonal to v (such a vector always exists in 2 or more dimensions), then

   w^T H w = w^T w = ||w||_2^2 > 0.

Therefore a Householder matrix is always indefinite (in the special 1D case the matrix reduces to the single number -1, which is negative definite).

6. The identity I is an example of such a matrix that is positive definite. The matrix

   [ 1  3 ]
   [ 0  1 ]

is an example of an indefinite one. However, we can conclude that such a matrix can never be negative definite, because e_i^T A e_i = A_ii > 0, where e_i is the i-th column of the identity matrix.
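Several of these cases (and the diagonally dominant case treated next) can be spot-checked by inspecting the eigenvalues of the symmetric part of the matrix, since x^T A x = x^T ((A + A^T)/2) x. A numpy sketch with arbitrary example matrices (not part of the original solution):

```python
import numpy as np

def definiteness(A):
    # Definiteness of A is decided by the eigenvalues of its symmetric part
    lam = np.linalg.eigvalsh((A + A.T) / 2)
    if np.all(lam > 0):
        return "positive definite"
    if np.all(lam < 0):
        return "negative definite"
    return "indefinite or semidefinite"

# Case 2: A^T A is positive semidefinite (here even definite, B has full column rank)
B = np.array([[1.0, 2.0], [3.0, 4.0], [5.0, 7.0]])
assert definiteness(B.T @ B) == "positive definite"

# Case 5: a Householder matrix is indefinite for n >= 2
v = np.array([1.0, 2.0, 3.0])
H = np.eye(3) - 2 * np.outer(v, v) / (v @ v)
assert definiteness(H) == "indefinite or semidefinite"

# Case 6: upper triangular with positive diagonal can still be indefinite
assert definiteness(np.array([[1.0, 3.0], [0.0, 1.0]])) == "indefinite or semidefinite"

# Case 7: row- and column-diagonally dominant with positive diagonal
D = np.array([[4.0, 1.0, -1.0], [0.5, 3.0, 1.0], [1.0, -0.5, 3.0]])
assert definiteness(D) == "positive definite"
```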

7. Using both dominance conditions, for x != 0 we have

   x^T A x = sum_{i,j} A_ij x_i x_j
           = sum_i A_ii x_i^2 + sum_{i != j} A_ij x_i x_j
           >= sum_i A_ii x_i^2 - sum_{i != j} |A_ij| |x_i| |x_j|
           = (1/2) sum_i (A_ii + A_ii) x_i^2 - sum_{i != j} |A_ij| |x_i| |x_j|
           > (1/2) sum_{i != j} (|A_ij| + |A_ji|) x_i^2 - sum_{i != j} |A_ij| |x_i| |x_j|
           = sum_{i != j} |A_ij| ((1/2) x_i^2 + (1/2) x_j^2) - sum_{i != j} |A_ij| |x_i| |x_j|
           = (1/2) sum_{i != j} |A_ij| (|x_i| - |x_j|)^2
           >= 0.

Therefore a diagonally dominant matrix with positive diagonal elements is positive definite.

Problem 4 [Heath 3.12, page 150]

1. Let A be an n x n matrix. Show that any two of the following conditions imply the other:
   (a) A^T = A
   (b) A^T A = I
   (c) A^2 = I
2. Give a specific example, other than the identity matrix I or a permutation of it, of a 3 x 3 matrix that has all three of these properties.
3. Name a nontrivial class of matrices that have all three of these properties.

Solution

1. (a) and (b) imply (c):  A^2 = A A = A^T A = I.
   (a) and (c) imply (b):  A^T A = A A = A^2 = I.
   (b) and (c) imply (a):  A^2 = I gives A^{-1} = A, and A^T A = I gives A^T = A^{-1}, so A^T = A.
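As a concrete check of these implications, any Householder matrix satisfies all three conditions simultaneously. A numpy sketch (not part of the original solution; v is an arbitrary choice):

```python
import numpy as np

v = np.array([1.0, 1.0, 1.0])
A = np.eye(3) - 2 * np.outer(v, v) / (v @ v)   # a 3 x 3 Householder matrix

assert np.allclose(A, A.T)                 # (a) symmetric
assert np.allclose(A.T @ A, np.eye(3))     # (b) orthogonal
assert np.allclose(A @ A, np.eye(3))       # (c) involutory
```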

2. For example,

   [ 1  0  0 ]        [  1/3  -2/3  -2/3 ]
   [ 0  1  0 ]   or   [ -2/3   1/3  -2/3 ]
   [ 0  0 -1 ]        [ -2/3  -2/3   1/3 ]

(the second is the Householder matrix built from v = (1, 1, 1)^T).

3. Reflection matrices (e.g. Householder matrices; see Problem 1).

Problem 5 [Heath 3.16, page 150]

Consider the vector a as an n x 1 matrix.

1. Write out its QR factorization, showing the matrices Q and R explicitly.
2. What is the solution to the linear least squares problem ax = b, where b is a given n-vector?

Solution

1. By simple application of the algorithm, we have

   Q = a / ||a||_2,    R = [ ||a||_2 ].

2. The least squares solution is given by the equation

   R x = Q^T b  =>  ||a||_2 x = (a^T b) / ||a||_2  =>  x = (a^T b) / (a^T a).
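Both parts of Problem 5 can be confirmed numerically. A small numpy sketch with a hypothetical a and b (not part of the original solution):

```python
import numpy as np

a = np.array([3.0, 4.0])
b = np.array([1.0, 2.0])

# QR factorization of a viewed as an n x 1 matrix
Q = (a / np.linalg.norm(a)).reshape(-1, 1)   # n x 1, unit column
R = np.array([[np.linalg.norm(a)]])          # 1 x 1
assert np.allclose(Q @ R, a.reshape(-1, 1))  # A = QR

# Least squares solution x = a^T b / a^T a, cross-checked against lstsq
x = (a @ b) / (a @ a)
expected = np.linalg.lstsq(a.reshape(-1, 1), b, rcond=None)[0][0]
assert np.isclose(x, expected)
```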