Numerical Linear Algebra Homework Assignment - Week 2
Đoàn Trần Nguyên Tùng
Student ID: 1411352
8th October 2016

Exercise 2.1: Show that if a matrix $A$ is both triangular and unitary, then it is diagonal.

Suppose the $n \times n$ matrix $A = (a_{ij})$ is
- upper-triangular: $a_{ij} = 0$ for $n \ge i > j \ge 1$;
- unitary: $A^* = A^{-1}$, where $A^*$ is a lower-triangular matrix.

If we can prove that $A^{-1}$ is upper-triangular, then $A^* = A^{-1}$ is both upper- and lower-triangular, so $A^*$ must be a diagonal matrix, and so is $A$.

Recall the algorithm for finding the inverse of an invertible matrix $A$:
- Create the augmented matrix $(A \mid I_n)$.
- Use elementary row operations on $(A \mid I_n)$ to reduce the part corresponding to $A$ to the identity matrix $I_n$.
- The part that corresponded to $I_n$ before performing the row operations is now the inverse $A^{-1}$ of $A$.

Because $A$ is upper-triangular (hence already in row echelon form), we do not need any row-switching operations to reduce it to $I_n$. We only need row-multiplying operations and row-addition operations, each of which adds a lower row multiplied by a scalar $\alpha$ to an upper row.

Moreover, recall that if an elementary row operation is performed on a matrix $M$, the result $M_1$ can be written as the product $M_1 = E_1 M$, with $E_1$ the elementary matrix corresponding to the operation performed. In this case, we only need to consider the elementary matrices below:
- The row operation multiplying the entries of the $i$-th row by a scalar $\alpha$ corresponds to the elementary matrix obtained by replacing the $i$-th diagonal entry $1$ of the identity matrix by $\alpha$; this matrix is diagonal and can be viewed as upper-triangular.
- The row operation adding the $q$-th row multiplied by a scalar $\beta$ to the $p$-th row corresponds to the elementary matrix obtained by placing the scalar $\beta$ at the $(p, q)$ position of the identity matrix.
We note that we only need operations that add a lower row multiplied by a scalar to an upper row of the matrix $A$, which means $q > p$, so the $(p, q)$ position of the identity matrix lies above the diagonal. Therefore this elementary matrix must also be upper-triangular.

Thus all the elementary matrices needed to transform $A$ into $I_n$ are upper-triangular: $E_k \cdots E_2 E_1 A = I_n$, so $A^{-1} = E_k \cdots E_2 E_1$ (with the $E_i$ elementary matrices). Since the product of two upper-triangular matrices is upper-triangular (last week's homework), $A^{-1}$ must be upper-triangular, which completes our proof.

Exercise 2.2: The Pythagorean theorem asserts that for a set of $n$ orthogonal vectors $\{x_i\}$,
\[ \Bigl\| \sum_{i=1}^{n} x_i \Bigr\|^2 = \sum_{i=1}^{n} \|x_i\|^2. \tag{1} \]
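Before the formal proof, identity (1) can be sanity-checked numerically. The following plain-Python sketch verifies it for $n = 3$; the three example vectors are hypothetical, chosen to be pairwise orthogonal in $\mathbb{C}^3$, and the helpers `inner` and `norm_sq` are ad-hoc, not a library API:

```python
# Sanity check of the Pythagorean identity for orthogonal complex vectors.
# Pure Python; vectors are lists of complex numbers (hypothetical examples).

def inner(x, y):
    """Standard inner product <x, y> = sum of conj(x_i) * y_i."""
    return sum(a.conjugate() * b for a, b in zip(x, y))

def norm_sq(x):
    """Squared norm ||x||^2 = <x, x> (always real)."""
    return inner(x, x).real

# Three pairwise-orthogonal vectors in C^3 (scaled standard basis vectors).
x1 = [2 + 1j, 0, 0]
x2 = [0, 3 - 2j, 0]
x3 = [0, 0, 1j]

total = [a + b + c for a, b, c in zip(x1, x2, x3)]
lhs = norm_sq(total)                                 # ||x1 + x2 + x3||^2
rhs = norm_sq(x1) + norm_sq(x2) + norm_sq(x3)        # sum of ||x_i||^2
assert abs(lhs - rhs) < 1e-12
```

The same check works for any pairwise-orthogonal set; it is of course no substitute for the induction argument below.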
(a) Prove this in the case $n = 2$ by an explicit computation of $\|x_1 + x_2\|^2$.
(b) Show that this computation also establishes the general case, by induction.

(a) We prove (1) for the case $n = 2$:
\[ \|x_1 + x_2\|^2 = \langle x_1 + x_2,\, x_1 + x_2 \rangle = \langle x_1, x_1 \rangle + \langle x_1, x_2 \rangle + \langle x_2, x_1 \rangle + \langle x_2, x_2 \rangle = \langle x_1, x_1 \rangle + 0 + 0 + \langle x_2, x_2 \rangle = \|x_1\|^2 + \|x_2\|^2. \]

(b) In order to prove (1) in the general case by induction, we need two steps:
- Base case: (1) holds for $n = 2$; this is proved in (a).
- Induction step: assume that (1) holds for $n = k$,
\[ \Bigl\| \sum_{i=1}^{k} x_i \Bigr\|^2 = \sum_{i=1}^{k} \|x_i\|^2, \tag{2} \]
and show that (1) then holds for $n = k + 1$:
\[ \Bigl\| \sum_{i=1}^{k+1} x_i \Bigr\|^2 = \sum_{i=1}^{k+1} \|x_i\|^2. \tag{3} \]

We can see that if $x_1, \dots, x_{k+1}$ are orthogonal vectors, then $y_k = \sum_{i=1}^{k} x_i$ and $x_{k+1}$ are two orthogonal vectors, since $x_{k+1}$ is orthogonal to each term of the sum $y_k$. Therefore we can apply the result in (a) and the assumption (2) to this case:
\[ \Bigl\| \sum_{i=1}^{k+1} x_i \Bigr\|^2 = \Bigl\| \sum_{i=1}^{k} x_i + x_{k+1} \Bigr\|^2 = \|y_k + x_{k+1}\|^2 = \|y_k\|^2 + \|x_{k+1}\|^2 = \sum_{i=1}^{k} \|x_i\|^2 + \|x_{k+1}\|^2 = \sum_{i=1}^{k+1} \|x_i\|^2. \]
Then our proof is completed.

Exercise 2.3: Let $A \in \mathbb{C}^{m \times m}$ be hermitian. An eigenvector of $A$ is a nonzero vector $x \in \mathbb{C}^m$ such that $Ax = \lambda x$ for some $\lambda \in \mathbb{C}$, the corresponding eigenvalue.
(a) Prove that all eigenvalues of $A$ are real.
(b) Prove that if $x$ and $y$ are eigenvectors corresponding to distinct eigenvalues, then $x$ and $y$ are orthogonal.

(a) $A$ is hermitian: $A^* = A$. Let $\lambda \in \mathbb{C}$ be an eigenvalue of $A$; then there is a nonzero vector $x \in \mathbb{C}^m$ such that $Ax = \lambda x$. To prove that $\lambda$ is real, we need to show that $\bar{\lambda} = \lambda$. Taking the conjugate transpose of both sides, $(Ax)^* = (\lambda x)^*$,
so $x^* A^* = \bar{\lambda} x^*$. Then
\[ x^* A = \bar{\lambda} x^* \;\Rightarrow\; x^* A x = \bar{\lambda} x^* x \;\Rightarrow\; x^* \lambda x = \bar{\lambda} x^* x \;\Rightarrow\; \lambda x^* x = \bar{\lambda} x^* x \;\Rightarrow\; \lambda \langle x, x \rangle = \bar{\lambda} \langle x, x \rangle \;\Rightarrow\; \lambda = \bar{\lambda}, \]
where the last step uses $\langle x, x \rangle = \|x\|^2 > 0$ for the nonzero vector $x$. This completes the proof.

(b) Suppose that $\lambda_1$ and $\lambda_2$ are two distinct eigenvalues of $A$, and $x$ and $y$ are corresponding eigenvectors:
\[ Ax = \lambda_1 x, \qquad Ay = \lambda_2 y. \tag{4} \]
We need to prove that $\langle x, y \rangle = x^* y = 0$. Starting from $(Ax)^* = (\lambda_1 x)^*$ and using $A^* = A$ together with $\bar{\lambda}_1 = \lambda_1$ from (a), we get $x^* A = \lambda_1 x^*$, hence
\[ x^* A y = \lambda_1 x^* y \;\Rightarrow\; x^* \lambda_2 y = \lambda_1 x^* y \;\Rightarrow\; \lambda_2 x^* y = \lambda_1 x^* y \;\Rightarrow\; (\lambda_2 - \lambda_1) x^* y = 0. \]
Because $\lambda_1$ and $\lambda_2$ are two distinct eigenvalues of $A$, $x^* y$ must be $0$.

Exercise 2.4: What can be said about the eigenvalues of a unitary matrix?

Let $A$ be a unitary matrix: $A^* = A^{-1}$. Let $\lambda \in \mathbb{C}$ be an eigenvalue of $A$; then there is a nonzero vector $x \in \mathbb{C}^m$ such that $Ax = \lambda x$, and hence $\|Ax\| = \|\lambda x\| = |\lambda| \, \|x\|$. Because $A$ is unitary, $\|Ax\| = \|x\|$ (indeed $\|Ax\|^2 = x^* A^* A x = x^* x = \|x\|^2$). We have
\[ \|Ax\| = |\lambda| \, \|x\| \quad \text{and} \quad \|Ax\| = \|x\| \;\Rightarrow\; |\lambda| = 1. \]
Therefore, we can conclude that the eigenvalues of a unitary matrix all have absolute value $1$.

Exercise 2.5: Let $S \in \mathbb{C}^{m \times m}$ be skew-hermitian, i.e. $S^* = -S$.
(a) Show by using Exercise 2.1 that the eigenvalues of $S$ are pure imaginary.
(b) Show that $I - S$ is nonsingular.
(c) Show that the matrix $Q = (I - S)^{-1}(I + S)$, known as the Cayley transform of $S$, is unitary. (This is a matrix analogue of a linear fractional transformation $(1 + s)/(1 - s)$, which maps the left half of the complex $s$-plane conformally onto the unit disk.)

(a) Let $\lambda \in \mathbb{C}$ be an eigenvalue of $S$; then there exists a nonzero vector (eigenvector) $x$ such that $Sx = \lambda x$. We need to show that $\bar{\lambda} = -\lambda$ in order to show that the eigenvalues of $S$ are pure imaginary:
\[ Sx = \lambda x \;\Rightarrow\; (Sx)^* = (\lambda x)^* \;\Rightarrow\; x^* S^* = \bar{\lambda} x^* \;\Rightarrow\; -x^* S = \bar{\lambda} x^* \;\Rightarrow\; -x^* S x = \bar{\lambda} x^* x \;\Rightarrow\; -\lambda \langle x, x \rangle = \bar{\lambda} \langle x, x \rangle \;\Rightarrow\; \bar{\lambda} = -\lambda. \]
Hence the real part of $\lambda$ is zero, i.e. $\lambda$ is pure imaginary. Our proof is completed.

(b) Recall that every eigenvalue of a matrix $A$ satisfies the characteristic polynomial $P_A(x) = \det(A - xI) = 0$. If $x = 0$ is an eigenvalue of $A$, then $P_A(0) = \det(A) = 0$, so $A$ is not invertible ($A$ is singular). Conversely, if $A$ is not invertible, the equation $AX = 0$ has not only the trivial solution $X = 0$ but also nontrivial, nonzero solutions. Thus there exists $v \ne 0$ such that $Av = 0 \cdot v$, which means $v$ is an eigenvector of $A$ corresponding to the eigenvalue $0$; therefore $0$ is an eigenvalue of $A$.

The two statements above can be combined into: "A matrix is not invertible (is singular) if and only if $0$ is an eigenvalue of it," which can be rewritten as: "A matrix is invertible (is nonsingular) if and only if $0$ is not an eigenvalue of it."

Back to our problem: let $\lambda$ be an eigenvalue of $I - S$, so there exists an eigenvector $u$ such that $(I - S - \lambda I)u = 0$; we need to prove that $\lambda \ne 0$. Suppose that $\lambda = 0$. Then
\[ (I - S - \lambda I)u = 0 \iff (I - S)u = 0 \iff (S - 1 \cdot I)u = 0, \]
which means $1$ is an eigenvalue of $S$, contradicting the statement in (a) that the eigenvalues of $S$ are pure imaginary. So $\lambda = 0$ is not an eigenvalue of $I - S$, and therefore $I - S$ is nonsingular (invertible).

(c) Similarly to the proof in (b), suppose $\lambda = 0$ is an eigenvalue of $I + S$ and $x$ is a corresponding eigenvector. We have $(I + S - \lambda I)x = (I + S - 0 \cdot I)x = (S - (-1) I)x = 0$, which means $-1$ is an eigenvalue of $S$, contradicting (a).
Therefore, $0$ is not an eigenvalue of $I + S$, and thus $I + S$ is nonsingular.
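Parts (a) and (b) can be illustrated numerically. The plain-Python sketch below uses a hypothetical $2 \times 2$ skew-hermitian matrix $S$, computes its eigenvalues from the characteristic polynomial $\lambda^2 - \operatorname{tr}(S)\lambda + \det(S) = 0$, and checks that they are pure imaginary and that $\det(I - S) \ne 0$:

```python
# Numerical illustration of 2.5(a) and 2.5(b) for a 2x2 skew-hermitian matrix.
# Pure Python; the example matrix S is hypothetical.
import cmath

# Skew-hermitian: diagonal entries purely imaginary, S[1][0] = -conj(S[0][1]).
S = [[1j, 2 + 1j],
     [-2 + 1j, -3j]]

# Eigenvalues of a 2x2 matrix from lambda^2 - tr(S)*lambda + det(S) = 0.
tr = S[0][0] + S[1][1]
det = S[0][0] * S[1][1] - S[0][1] * S[1][0]
disc = cmath.sqrt(tr * tr - 4 * det)
eigs = [(tr + disc) / 2, (tr - disc) / 2]
assert all(abs(lam.real) < 1e-12 for lam in eigs)   # (a): pure imaginary

# (b): I - S is nonsingular, i.e. det(I - S) != 0.
I_minus_S = [[1 - S[0][0], -S[0][1]],
             [-S[1][0], 1 - S[1][1]]]
d = I_minus_S[0][0] * I_minus_S[1][1] - I_minus_S[0][1] * I_minus_S[1][0]
assert abs(d) > 1e-12
```

For this particular $S$ the eigenvalues come out as $2i$ and $-4i$, consistent with (a).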
We need to show that $Q$ is unitary: $Q^* = Q^{-1}$.
\[ Q^* = \left[ (I - S)^{-1} (I + S) \right]^* = (I + S)^* \left[ (I - S)^{-1} \right]^* = (I + S)^* \left[ (I - S)^* \right]^{-1} = (I + S^*) (I - S^*)^{-1} = (I - S)(I + S)^{-1}. \]
We can see that $(I - S)(I + S) = I - S^2 = (I + S)(I - S)$. Hence
\[ Q^* Q = (I - S)(I + S)^{-1} (I - S)^{-1} (I + S) = (I - S)\left[ (I - S)(I + S) \right]^{-1} (I + S) = (I - S)\left[ (I + S)(I - S) \right]^{-1} (I + S) = (I - S)(I - S)^{-1} (I + S)^{-1} (I + S) = I. \]
Similarly, we can check that
\[ Q Q^* = (I - S)^{-1} (I + S)(I - S)(I + S)^{-1} = (I - S)^{-1} (I - S)(I + S)(I + S)^{-1} = I. \]
Therefore, we can conclude that $Q^* = Q^{-1}$, i.e. $Q$ is unitary.
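The identity $Q^* Q = I$ can also be verified numerically. The sketch below (plain Python; the skew-hermitian matrix $S$ and the ad-hoc $2 \times 2$ helpers `matmul`, `inv2`, `conj_transpose` are illustrative, not a library API) builds the Cayley transform and checks unitarity entrywise:

```python
# Numerical check that the Cayley transform Q = (I - S)^{-1} (I + S)
# of a skew-hermitian S is unitary. Pure Python 2x2 complex arithmetic;
# the example matrix S is hypothetical.

def matmul(A, B):
    """Product of two 2x2 matrices."""
    return [[sum(A[i][k] * B[k][j] for k in range(2)) for j in range(2)]
            for i in range(2)]

def inv2(A):
    """Inverse of a 2x2 matrix via the adjugate formula."""
    d = A[0][0] * A[1][1] - A[0][1] * A[1][0]
    return [[A[1][1] / d, -A[0][1] / d],
            [-A[1][0] / d, A[0][0] / d]]

def conj_transpose(A):
    """Conjugate transpose A*."""
    return [[A[j][i].conjugate() for j in range(2)] for i in range(2)]

S = [[1j, 2 + 1j],
     [-2 + 1j, -3j]]                      # skew-hermitian: S* = -S

I = [[1, 0], [0, 1]]
I_minus_S = [[I[i][j] - S[i][j] for j in range(2)] for i in range(2)]
I_plus_S = [[I[i][j] + S[i][j] for j in range(2)] for i in range(2)]

Q = matmul(inv2(I_minus_S), I_plus_S)     # Cayley transform of S
QstarQ = matmul(conj_transpose(Q), Q)

# Q*Q should equal the identity up to floating-point rounding.
for i in range(2):
    for j in range(2):
        assert abs(QstarQ[i][j] - I[i][j]) < 1e-12
```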