ACM 104. Homework Set 4 Solutions February 14, 2001

1. (Franklin, Problem 4, page 55) Suppose that we feel that some observations are more important or reliable than others. Redefine the function to be minimized as
$$\varphi(x) = \sum_{i=1}^m \sigma_i^2 \,(b_i - x_0 - x_1 a_{i1} - x_2 a_{i2} - \cdots - x_n a_{in})^2 .$$
Now what are the appropriate normal equations? (Assume real data.)

Following the procedure in the text, we differentiate the above relation with respect to $x_j$:
$$0 = \frac{\partial \varphi}{\partial x_j} = -2 \sum_{i=1}^m a_{ij}\, \sigma_i^2 \,(b_i - x_0 - x_1 a_{i1} - \cdots - x_n a_{in}),$$
where for $j = 0$ the factor $a_{i0}$ is understood to be 1, matching the column of ones below. Now define
$$b = \begin{pmatrix} b_1 \\ b_2 \\ \vdots \\ b_m \end{pmatrix}, \quad
A = \begin{pmatrix} 1 & a_{11} & \cdots & a_{1n} \\ 1 & a_{21} & \cdots & a_{2n} \\ \vdots & \vdots & & \vdots \\ 1 & a_{m1} & \cdots & a_{mn} \end{pmatrix}, \quad
x = \begin{pmatrix} x_0 \\ x_1 \\ \vdots \\ x_n \end{pmatrix}, \quad
\Sigma = \begin{pmatrix} \sigma_1^2 & 0 & \cdots & 0 \\ 0 & \sigma_2^2 & \cdots & 0 \\ \vdots & & \ddots & \vdots \\ 0 & 0 & \cdots & \sigma_m^2 \end{pmatrix}.$$
The normal equations can then be written as
$$0 = A^T \Sigma (b - Ax) \quad\Longrightarrow\quad A^T \Sigma A\, x = A^T \Sigma b.$$

2. (Franklin Chapter 4, Problem 3, page 75) Let
$$A = \begin{pmatrix} 0 & 1 & 0 & 0 \\ 0 & 0 & 1 & 0 \\ 0 & 0 & 0 & 1 \\ -\alpha_4 & -\alpha_3 & -\alpha_2 & -\alpha_1 \end{pmatrix}, \qquad
x = \begin{pmatrix} x_1 \\ x_2 \\ x_3 \\ x_4 \end{pmatrix}.$$
Can $x$ be an eigenvector of this matrix $A$ if $x_1 = 0$? If $\lambda$ is an eigenvalue of $A$, and if $x_1 = 1$, what is the form of the eigenvector $x$? What is the characteristic polynomial of $A$? Under what condition does $A$ have four linearly independent eigenvectors?

If $x$ were an eigenvector of $A$ with $x_1 = 0$, then there would exist a scalar $\gamma$ such that
$$\begin{pmatrix} 0 & 1 & 0 & 0 \\ 0 & 0 & 1 & 0 \\ 0 & 0 & 0 & 1 \\ -\alpha_4 & -\alpha_3 & -\alpha_2 & -\alpha_1 \end{pmatrix}
\begin{pmatrix} 0 \\ x_2 \\ x_3 \\ x_4 \end{pmatrix} = \gamma \begin{pmatrix} 0 \\ x_2 \\ x_3 \\ x_4 \end{pmatrix},$$

i.e. $x_2 = 0$, $x_3 = \gamma x_2$, $x_4 = \gamma x_3$, and $-(\alpha_3 x_2 + \alpha_2 x_3 + \alpha_1 x_4) = \gamma x_4$. This implies $x_2 = x_3 = x_4 = 0$, which means that $x$ is not an eigenvector, since it is not a nonzero vector. We conclude that $x$ cannot be an eigenvector if $x_1 = 0$.

Now let $\lambda$ be an eigenvalue and $x_1 = 1$. Then we have
$$\begin{pmatrix} 0 & 1 & 0 & 0 \\ 0 & 0 & 1 & 0 \\ 0 & 0 & 0 & 1 \\ -\alpha_4 & -\alpha_3 & -\alpha_2 & -\alpha_1 \end{pmatrix}
\begin{pmatrix} 1 \\ x_2 \\ x_3 \\ x_4 \end{pmatrix} = \lambda \begin{pmatrix} 1 \\ x_2 \\ x_3 \\ x_4 \end{pmatrix},$$
i.e. $x_2 = \lambda$, $x_3 = \lambda x_2$, $x_4 = \lambda x_3$, and $-(\alpha_4 + \alpha_3 x_2 + \alpha_2 x_3 + \alpha_1 x_4) = \lambda x_4$. So the form of the eigenvector is
$$x = \begin{pmatrix} 1 \\ \lambda \\ \lambda^2 \\ \lambda^3 \end{pmatrix}, \qquad\text{with}\quad \alpha_4 + \lambda \alpha_3 + \lambda^2 \alpha_2 + \lambda^3 \alpha_1 + \lambda^4 = 0.$$

The characteristic polynomial of this matrix is a polynomial of degree four with leading coefficient 1 whose roots are the eigenvalues of the matrix. Looking at the previous relation for the eigenvalues, we see that
$$p(\lambda) = \lambda^4 + \alpha_1 \lambda^3 + \alpha_2 \lambda^2 + \alpha_3 \lambda + \alpha_4$$
is the characteristic polynomial.

If the characteristic polynomial has four distinct roots, then there are four linearly independent eigenvectors. Distinct eigenvalues are a sufficient condition, but for a general matrix not a necessary one, since a repeated eigenvalue can contribute several linearly independent eigenvectors. In this case, however, we can show that the condition is also necessary. We may always scale the $x_1$ component to be 1, so suppose we have the four eigenvectors
$$\begin{pmatrix} 1 \\ \lambda_1 \\ \lambda_1^2 \\ \lambda_1^3 \end{pmatrix}, \quad \begin{pmatrix} 1 \\ \lambda_2 \\ \lambda_2^2 \\ \lambda_2^3 \end{pmatrix}, \quad \begin{pmatrix} 1 \\ \lambda_3 \\ \lambda_3^2 \\ \lambda_3^3 \end{pmatrix}, \quad \begin{pmatrix} 1 \\ \lambda_4 \\ \lambda_4^2 \\ \lambda_4^3 \end{pmatrix}.$$
These are linearly independent if and only if the determinant of the matrix having them as rows (or columns) is nonzero. So consider the Vandermonde determinant
$$\begin{vmatrix} 1 & \lambda_1 & \lambda_1^2 & \lambda_1^3 \\ 1 & \lambda_2 & \lambda_2^2 & \lambda_2^3 \\ 1 & \lambda_3 & \lambda_3^2 & \lambda_3^3 \\ 1 & \lambda_4 & \lambda_4^2 & \lambda_4^3 \end{vmatrix} = \prod_{i > j} (\lambda_i - \lambda_j).$$
This is nonzero if and only if $\lambda_i \neq \lambda_j$ for $i \neq j$. Therefore we need the eigenvalues of this matrix to be distinct in order to get four linearly independent eigenvectors.

3. (Franklin Chapter 4, Problem 4, page 75) Generalize the last problem, replacing 4 by $n$. (The matrix $A$ is known as the companion matrix of the polynomial $\lambda^n + \alpha_1 \lambda^{n-1} + \cdots + \alpha_n$.)

If $x$ were an eigenvector of $A$ with $x_1 = 0$, then there would exist a scalar $\gamma$ such that
$$\begin{pmatrix} 0 & 1 & & \\ & \ddots & \ddots & \\ & & 0 & 1 \\ -\alpha_n & -\alpha_{n-1} & \cdots & -\alpha_1 \end{pmatrix}
\begin{pmatrix} 0 \\ x_2 \\ \vdots \\ x_n \end{pmatrix} = \gamma \begin{pmatrix} 0 \\ x_2 \\ \vdots \\ x_n \end{pmatrix},$$

which implies, as before, that $(x_1, x_2, \ldots, x_n)$ is the zero vector; so there are no eigenvectors of the companion matrix with $x_1 = 0$.

Now let $\lambda$ be an eigenvalue and $x_1 = 1$. Then we have
$$\begin{pmatrix} 0 & 1 & & \\ & \ddots & \ddots & \\ & & 0 & 1 \\ -\alpha_n & -\alpha_{n-1} & \cdots & -\alpha_1 \end{pmatrix}
\begin{pmatrix} 1 \\ x_2 \\ \vdots \\ x_n \end{pmatrix} = \lambda \begin{pmatrix} 1 \\ x_2 \\ \vdots \\ x_n \end{pmatrix},$$
so the eigenvector is
$$x = \begin{pmatrix} 1 \\ \lambda \\ \vdots \\ \lambda^{n-1} \end{pmatrix}, \qquad\text{with}\quad \alpha_n + \lambda \alpha_{n-1} + \cdots + \lambda^{n-1} \alpha_1 + \lambda^n = 0.$$
The characteristic polynomial of an $n \times n$ matrix is a polynomial of degree $n$ with leading coefficient 1 whose roots are the eigenvalues of the matrix. Looking at the previous relation for the eigenvalues, we see that
$$p(\lambda) = \lambda^n + \alpha_1 \lambda^{n-1} + \cdots + \alpha_{n-1} \lambda + \alpha_n$$
is the characteristic polynomial.

If the characteristic polynomial has $n$ distinct roots, then there are $n$ linearly independent eigenvectors. We now show that this is also a necessary condition. We may always scale the $x_1$ component to be 1, so suppose we have the $n$ eigenvectors arising from the eigenvalues $\lambda_1, \lambda_2, \ldots, \lambda_n$. They are linearly independent if and only if the determinant of the matrix they form (as rows or columns) is nonzero. So consider the Vandermonde determinant
$$\begin{vmatrix} 1 & \lambda_1 & \cdots & \lambda_1^{n-1} \\ 1 & \lambda_2 & \cdots & \lambda_2^{n-1} \\ \vdots & & & \vdots \\ 1 & \lambda_n & \cdots & \lambda_n^{n-1} \end{vmatrix} = \prod_{i > j} (\lambda_i - \lambda_j).$$
This is nonzero if and only if $\lambda_i \neq \lambda_j$ for $i \neq j$. Therefore we need the eigenvalues to be distinct in order to get $n$ linearly independent eigenvectors.

4. (Franklin Chapter 4, Problem 5, page 88) Assume that $\rho_1 > 0, \ldots, \rho_n > 0$. Redefine the length of a vector as
$$\|x\| = \left( \rho_1 |x_1|^2 + \cdots + \rho_n |x_n|^2 \right)^{1/2}.$$
Which matrices $V$ preserve this length?

Let
$$Q = \begin{pmatrix} \rho_1^{1/2} & & 0 \\ & \ddots & \\ 0 & & \rho_n^{1/2} \end{pmatrix}
\qquad\text{so that}\qquad
Q^2 = P = \begin{pmatrix} \rho_1 & & 0 \\ & \ddots & \\ 0 & & \rho_n \end{pmatrix};$$

then we can write the new length as $\|x\|_P^2 = (Qx, Qx) = (Px, x)$.

We first show that $\|Vx\|_P = \|x\|_P$ for all $x$ if and only if $(QVx, QVy) = (Qx, Qy)$ for all $x, y$. For the "only if" part, suppose that $\|Vx\|_P = \|x\|_P$. We can write
$$\|x + y\|_P^2 = \sum_i \rho_i |x_i + y_i|^2 = \sum_i \rho_i |x_i|^2 + \sum_i \rho_i x_i \bar{y}_i + \sum_i \rho_i \bar{x}_i y_i + \sum_i \rho_i |y_i|^2,$$
i.e.
$$\|x + y\|_P^2 = \|x\|_P^2 + 2\,\mathrm{Re}(Px, y) + \|y\|_P^2. \tag{1}$$
Replacing $x + y$ with $x + iy$ we get
$$\|x + iy\|_P^2 = \|x\|_P^2 + 2\,\mathrm{Im}(Px, y) + \|y\|_P^2. \tag{2}$$
Now replacing $x$ with $Vx$ and $y$ with $Vy$ in (1) and (2) we obtain
$$\|V(x + y)\|_P^2 = \|Vx\|_P^2 + 2\,\mathrm{Re}(PVx, Vy) + \|Vy\|_P^2, \tag{3}$$
$$\|V(x + iy)\|_P^2 = \|Vx\|_P^2 + 2\,\mathrm{Im}(PVx, Vy) + \|Vy\|_P^2. \tag{4}$$
Since we assumed that $\|Vx\|_P = \|x\|_P$, equating (1) with (3) and (2) with (4) gives
$$\mathrm{Re}(PVx, Vy) = \mathrm{Re}(Px, y), \qquad \mathrm{Im}(PVx, Vy) = \mathrm{Im}(Px, y),$$
so $(PVx, Vy) = (Px, y)$, which implies $(V^* P V x, y) = (Px, y)$, i.e. $(QVx, QVy) = (Qx, Qy)$.

Conversely, assuming that $(QVx, QVy) = (Qx, Qy)$ and setting $x = y$, we have
$$(QVx, QVx) = (Qx, Qx) \quad\Longrightarrow\quad \|Vx\|_P^2 = \|x\|_P^2,$$
which implies $\|Vx\|_P = \|x\|_P$.

Now take the relation $(V^* P V x, y) = (Px, y)$, which holds for all $x, y$, and set $x = e_i$, $y = e_j$ to pick out individual entries: $(V^* P V)_{ji} = P_{ji}$, so $V^* P V = P$, since all components are equal. Therefore $V$ preserves the length $\|x\| = (\rho_1 |x_1|^2 + \cdots + \rho_n |x_n|^2)^{1/2}$ exactly when $V^* P V = P$.

5. (Franklin Chapter 4, Problem 4, page 94) Let $a_1, \ldots, a_m$ be linearly independent vectors over the field of complex numbers. Define $\alpha_{ij} = (a_i)^* a_j = (a_j, a_i)$. Show that $\det(\alpha_{ij}) \neq 0$ by showing that an eigenvector $x$ of the matrix $(\alpha_{ij})$ belonging to the eigenvalue 0 would satisfy the relation $\sum_i x_i a_i = 0$.

Define $A = (a_1 \; a_2 \; \cdots \; a_m)$, the matrix whose columns are the $a_i$. The matrix $(\alpha_{ij})$ can be written as $A^* A$, where by $A^*$ we mean the complex conjugate of the transpose of $A$. If zero were an eigenvalue of $(\alpha_{ij})$ with eigenvector $x$, we would have $A^* A x = 0$, and hence
$$0 = (x, A^* A x) = (Ax, Ax) = \|Ax\|^2 \quad\Longrightarrow\quad Ax = \sum_i x_i a_i = 0.$$
Since $A$ has independent columns, this forces $x = 0$, contradicting the fact that an eigenvector is nonzero. Hence 0 is not an eigenvalue of $(\alpha_{ij})$, and $\det(\alpha_{ij}) \neq 0$.
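The Gram-matrix argument of problem 5 is easy to spot-check numerically. In the sketch below the dimensions, random vectors, and tolerances are arbitrary illustrative choices, not taken from the text: we form $A^*A$ first for independent complex columns and then after forcing a linear dependence.

```python
import numpy as np

rng = np.random.default_rng(0)

# Three linearly independent complex vectors in C^5, as the columns of A.
A = rng.standard_normal((5, 3)) + 1j * rng.standard_normal((5, 3))

G = A.conj().T @ A          # Gram matrix: G[i, j] = (a_i)^* a_j

# Independent columns => the Gram matrix is nonsingular.
assert abs(np.linalg.det(G)) > 1e-6

# Make the third column a combination of the first two: det(G) collapses to ~0.
A[:, 2] = 2 * A[:, 0] - A[:, 1]
G = A.conj().T @ A
assert abs(np.linalg.det(G)) < 1e-8
```

The dependent case exhibits exactly the eigenvector of eigenvalue 0 used in the proof: the coefficient vector $x = (2, -1, -1)^T$ satisfies $\sum_i x_i a_i = 0$.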

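Problem 4's characterization of the length-preserving matrices, $V^* P V = P$, can also be verified numerically. In this sketch the weights $\rho_i$, the test vector, and the construction $V = Q^{-1} U Q$ with $U$ unitary (one convenient way to manufacture a matrix satisfying $V^* P V = P$; it is not a construction from the text) are all illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(3)
n = 4
rho = rng.uniform(0.5, 2.0, n)        # weights rho_i > 0, chosen arbitrarily
P = np.diag(rho)
Q = np.diag(np.sqrt(rho))             # Q^2 = P

# Conjugating a unitary U by Q gives a V with V* P V = P.
U, _ = np.linalg.qr(rng.standard_normal((n, n)) + 1j * rng.standard_normal((n, n)))
V = np.linalg.inv(Q) @ U @ Q
assert np.allclose(V.conj().T @ P @ V, P)

# Such a V preserves the weighted length for any vector x.
x = rng.standard_normal(n) + 1j * rng.standard_normal(n)
wnorm = lambda z: np.sqrt(np.sum(rho * np.abs(z) ** 2))
assert np.allclose(wnorm(V @ x), wnorm(x))
```

The check works because $V^* P V = Q U^* Q^{-1} Q^2 Q^{-1} U Q = Q U^* U Q = Q^2 = P$.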
6. Let $B$ be an $m \times n$ matrix and let $A$ be a square invertible ($m \times m$) matrix. Prove that $AB$ has the same nullspace and the same row space (and hence the same rank) as $B$ itself. Use this fact to show that the rank of a matrix is invariant under changes of coordinates.

For $x \in \mathbb{R}^n$ we have $Bx = 0 \Rightarrow ABx = 0$, so $N(B) \subseteq N(AB)$. Also, since $A$ is invertible, $ABx = 0 \Rightarrow A^{-1}ABx = A^{-1}0 \Rightarrow Bx = 0$, so $N(AB) \subseteq N(B)$. These two inclusions imply $N(B) = N(AB)$.

The row space of a matrix is the space of all row vectors that can be written as linear combinations of the rows of that matrix:
$$\text{row space}(B) = \{ \text{all vectors of the form } x^T B, \text{ for some } x \in \mathbb{R}^m \},$$
$$\text{row space}(AB) = \{ \text{all vectors of the form } y^T A B, \text{ for some } y \in \mathbb{R}^m \}.$$
Since $y^T A$ is a row vector with $m$ components, every element of the row space of $AB$ is also in the row space of $B$. Conversely, any element $x^T B$ of the row space of $B$ is also in the row space of $AB$: the corresponding $y^T$ is $y^T = x^T A^{-1}$. So the row spaces of $B$ and $AB$ are the same, and the two matrices have the same rank.

Let $X \sim Y$ denote the fact that the matrices $X$ and $Y$ have the same rank. Applying what we proved to the case $m = n$: when $B$ is an $n \times n$ matrix and $A$ is an $n \times n$ invertible matrix, $B \sim AB$. Since taking the transpose of a matrix does not change its rank,
$$B \sim B^T \sim A^T B^T = (BA)^T \sim BA,$$
so $B \sim BA$ whenever $A$ is invertible. Now let $A$ and $C$ be the same matrix up to a coordinate transformation, that is, $A = T^{-1} C T$ for some invertible matrix $T$. Then
$$C \sim CT \sim T^{-1} C T = A,$$
so the rank of a matrix is invariant under changes of coordinates.

7. Let $A$ and $B$ be two square matrices. Prove that the characteristic polynomials of $BA$ and $AB$ are identical. (Hint: it may be helpful first to establish the result for $\det A \neq 0$. In the case where $\det A = 0$, think about perturbing the matrix $A$ so that it becomes invertible.)

First suppose $\det(A) \neq 0$. Then $A$ is invertible and
$$\det(AB - \lambda I) = \det(A^{-1}) \det(AB - \lambda I) \det(A) = \det\!\left( A^{-1} (AB - \lambda I) A \right) = \det(BA - \lambda I),$$
so the matrices $AB$ and $BA$ have the same characteristic polynomial, and therefore the same eigenvalues.

If $\det(A) = 0$, we know from problem set 3, exercise 4, that $\det(A + \epsilon I) \neq 0$ for $\epsilon > 0$ small enough. Applying what we just showed to the matrices $A + \epsilon I$ and $B$, we conclude that
$$AB + \epsilon B \ \text{ and } \ BA + \epsilon B \ \text{ have the same characteristic polynomial} \tag{5}$$
for every sufficiently small $\epsilon > 0$. The coefficients of the polynomial $\det(A - \lambda I)$ depend continuously on the entries of $A$: indeed, $\det(A - \lambda I) = \prod_i (\lambda_i - \lambda)$ shows that the coefficients of the characteristic polynomial depend continuously on the eigenvalues (in fact, they are polynomials in the eigenvalues), and the eigenvalues of a matrix depend continuously on the matrix. Letting $\epsilon \to 0$, statement (5) must therefore hold also for $\epsilon = 0$.

8. A projection matrix is a matrix which satisfies the identity $P^2 = P$.

(1) If $P = P^T P$, show that $P$ is a projection matrix.
(2) If $P$ is an orthogonal projection matrix, show that $P = P^T$.
(3) Suppose that $P$ is the projection onto the subspace $M$ and $Q$ is the projection onto the orthogonal complement $M^\perp$. What are $P + Q$ and $PQ$? Show that $P - Q$ is its own inverse.

(1) Taking the transpose of both sides of $P = P^T P$, we get $P^T = P^T P$, so $P = P^T$, and then $P^2 = P^T P = P$.

(2) As we did for problem 5 in problem set 3, we can represent the orthogonal projection in a basis consisting of the $r$ vectors that span the space onto which $P$ projects together with the $n - r$ vectors orthogonal to that space. In that basis the matrix of $P$ is diagonal, with the first $r$ diagonal entries equal to 1 and the rest 0. Let $\Lambda$ be this diagonal matrix and let $S$ be the orthogonal matrix whose columns are these basis vectors. We have $P = S \Lambda S^T$, so
$$P^T = S \Lambda^T S^T = S \Lambda S^T = P,$$
which proves the claim.

(3) The orthogonal complement $M^\perp$ consists of the vectors that are orthogonal to all vectors in $M$. The problem is again simplified if we choose as basis the $r = \dim(M)$ eigenvectors of $P$ with eigenvalue 1 together with the $n - r$ eigenvectors of $Q$ with eigenvalue 1. In this basis the matrices representing the two projections are
$$P = \begin{pmatrix} I_r & 0 \\ 0 & 0 \end{pmatrix}, \qquad Q = \begin{pmatrix} 0 & 0 \\ 0 & I_{n-r} \end{pmatrix},$$
with $I_r$ and $I_{n-r}$ the $r \times r$ and $(n-r) \times (n-r)$ identity matrices. Then $P + Q$ is the identity, $PQ$ is the zero matrix, and $P - Q$ is indeed its own inverse. To express any matrix in a different basis we multiply on the left by $T$ and on the right by $T^{-1}$, and by inspection these results continue to hold when $P$ and $Q$ are expressed in an arbitrary basis. Intuitively they make sense: $Px + Qx = x$ says that the components of $x$ in $M$ and $M^\perp$ sum back to the vector itself; $PQx = 0$ says that projecting $Qx \in M^\perp$ onto $M$ gives the zero vector; and using these two results,
$$(P - Q)(P - Q) = P^2 - PQ - QP + Q^2 = P + Q = I.$$
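All of the identities in problem 8 can be confirmed numerically. In this sketch the dimensions are arbitrary, and the projection is built with the standard formula $P = A (A^T A)^{-1} A^T$ for the orthogonal projection onto the column space of $A$ (a formula not used in the text, which instead argues basis-by-basis).

```python
import numpy as np

rng = np.random.default_rng(2)
n, r = 6, 2

# Orthogonal projection P onto M = column space of a random n x r matrix A;
# Q = I - P projects onto the orthogonal complement of M.
A = rng.standard_normal((n, r))
P = A @ np.linalg.solve(A.T @ A, A.T)
Q = np.eye(n) - P

assert np.allclose(P @ P, P)                       # P^2 = P: a projection
assert np.allclose(P, P.T)                         # orthogonal => symmetric
assert np.allclose(P + Q, np.eye(n))               # P + Q = I
assert np.allclose(P @ Q, np.zeros((n, n)))        # P Q = 0
assert np.allclose((P - Q) @ (P - Q), np.eye(n))   # P - Q is its own inverse
```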