
University of Colorado Denver
Department of Mathematical and Statistical Sciences
Applied Linear Algebra Ph.D. Preliminary Exam
June 8, 2012

Name:

Exam Rules:
- This is a closed book exam. Once the exam begins, you have 4 hours to do your best. Submit as many solutions as you can. All solutions will be graded, and your final grade will be based on your six best solutions. Each problem is worth 20 points.
- Justify your solutions: cite theorems that you use, provide counter-examples for disproof, give explanations, and show calculations for numerical problems.
- If you are asked to prove a theorem, do not merely quote that theorem as your proof; instead, produce an independent proof.
- Begin each solution on a new page and use additional paper, if necessary. Write only on one side of the paper.
- Write legibly using a dark pencil or pen.
- Ask the proctor if you have any questions.

Good luck!

Score:   1. ___   5. ___
         2. ___   6. ___
         3. ___   7. ___
         4. ___   8. ___
         Total ___

DO NOT TURN THE PAGE UNTIL TOLD TO DO SO.

Applied Linear Algebra Preliminary Exam Committee: Steve Billups (Chair), Alexander Engau, Julien Langou.

1. Find an orthogonal basis for the space $P_2$ of quadratic polynomials with the inner product $\langle f, g \rangle = f(-1)g(-1) + f(0)g(0) + f(1)g(1)$.

Two ways.

First way. Take a first nonzero quadratic polynomial, $x(x+1)$, whose value is 0 at $-1$ and $0$ and nonzero at $1$; a second polynomial, $(x-1)(x+1)$, whose value is 0 at $-1$ and $1$ and nonzero at $0$; and a third polynomial, $x(x-1)$, whose value is 0 at $0$ and $1$ and nonzero at $-1$. It is easy to see that these three polynomials are orthogonal with respect to the given inner product: each pairwise product vanishes at all three points $-1$, $0$, $1$. We just need to normalize accordingly. Since $\|x(x+1)\| = 2$, $\|(x-1)(x+1)\| = 1$, and $\|x(x-1)\| = 2$, we find
$$\frac{x(x+1)}{2}, \qquad (x-1)(x+1), \qquad \frac{x(x-1)}{2}.$$

Second way. We can apply the Gram-Schmidt process to three linearly independent vectors in $P_2$, for example $1$, $x$, and $x^2$.
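As a quick numerical sanity check (an addition to this transcription, not part of the official solution), the orthogonality and the norms above can be verified by representing each polynomial by its values at the three nodes; a minimal sketch using NumPy:

    import numpy as np

    # Discrete inner product <f, g> = f(-1)g(-1) + f(0)g(0) + f(1)g(1):
    # a polynomial is represented by its values at the nodes -1, 0, 1.
    nodes = np.array([-1.0, 0.0, 1.0])

    p1 = nodes * (nodes + 1) / 2      # x(x+1)/2
    p2 = (nodes - 1) * (nodes + 1)    # (x-1)(x+1)
    p3 = nodes * (nodes - 1) / 2      # x(x-1)/2

    for a, b in [(p1, p2), (p1, p3), (p2, p3)]:
        assert abs(np.dot(a, b)) < 1e-12       # pairwise orthogonal
    for p in (p1, p2, p3):
        assert np.isclose(np.dot(p, p), 1.0)   # each has norm 1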

2. A real $n \times n$ matrix $A$ is an isometry if it preserves length: $\|Ax\| = \|x\|$ for all vectors $x \in \mathbb{R}^n$. Show that the following are equivalent.
(a) $A$ is an isometry (preserves length).
(b) $\langle Ax, Ay \rangle = \langle x, y \rangle$ for all vectors $x, y$, so $A$ preserves inner products.
(c) $A^{-1} = A^T$.
(d) The columns of $A$ are unit vectors that are mutually orthogonal.

(b) implies (a). Trivial, since $\|x\|$ is defined as $\sqrt{\langle x, x \rangle}$. So if a map preserves inner products, it preserves length.

(a) implies (b). Assume that $A$ preserves lengths. Let $x, y \in \mathbb{R}^n$. We have $\|A(x+y)\|^2 = \|x+y\|^2$, so the quantity $\|A(x+y)\|^2 - \|x+y\|^2$ is zero. On the other hand,
$$\|A(x+y)\|^2 - \|x+y\|^2 = \langle A(x+y), A(x+y) \rangle - \langle x+y, x+y \rangle = \langle Ax, Ax \rangle + \langle Ax, Ay \rangle + \langle Ay, Ax \rangle + \langle Ay, Ay \rangle - \langle x, x \rangle - \langle x, y \rangle - \langle y, x \rangle - \langle y, y \rangle.$$
We note that $\langle Ax, Ay \rangle = \langle Ay, Ax \rangle$ (symmetry of the inner product) and that $\langle Ax, Ax \rangle = \|Ax\|^2 = \|x\|^2 = \langle x, x \rangle$, and likewise $\langle Ay, Ay \rangle = \langle y, y \rangle$ ($A$ preserves lengths). All in all, we obtain $\|A(x+y)\|^2 - \|x+y\|^2 = 2\langle Ax, Ay \rangle - 2\langle x, y \rangle$. Setting this to zero gives $\langle Ax, Ay \rangle = \langle x, y \rangle$. Therefore $A$ preserves inner products. We proved that (a) is equivalent to (b).

(c) implies (b). Assume $A^{-1} = A^T$. Let $x, y \in \mathbb{R}^n$. Then $\langle Ax, Ay \rangle = \langle A^T A x, y \rangle = \langle A^{-1} A x, y \rangle = \langle x, y \rangle$. So $A$ preserves inner products.

(b) implies (d). Assume $A$ preserves inner products. Let $a_j$ be the $j$th column of $A$. Then $\langle a_i, a_j \rangle = \langle Ae_i, Ae_j \rangle = \langle e_i, e_j \rangle$. This proves that the columns of $A$ are unit vectors that are mutually orthogonal.

(d) implies (c). Assume that the columns of $A$ are unit vectors that are mutually orthogonal; that is, $\langle a_j, a_j \rangle = 1$ and $\langle a_i, a_j \rangle = 0$ for $i \neq j$. Since $(A^T A)_{ij} = a_i^T a_j = \langle a_i, a_j \rangle$, this gives $A^T A = I$, so $A^{-1} = A^T$.

We proved (a) iff (b), and (b) implies (d) implies (c) implies (b), so all four statements are equivalent.
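An illustrative numerical check (my addition, not part of the exam): a rotation matrix is an isometry, and the four equivalent conditions can all be tested directly with NumPy.

    import numpy as np

    t = 0.7
    A = np.array([[np.cos(t), -np.sin(t)],
                  [np.sin(t),  np.cos(t)]])   # a rotation: an isometry

    rng = np.random.default_rng(0)
    x, y = rng.standard_normal(2), rng.standard_normal(2)

    assert np.isclose(np.linalg.norm(A @ x), np.linalg.norm(x))   # (a)
    assert np.isclose((A @ x) @ (A @ y), x @ y)                   # (b)
    assert np.allclose(np.linalg.inv(A), A.T)                     # (c)
    assert np.allclose(A.T @ A, np.eye(2))                        # (d)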

3. Let $p \geq q$. Let $A$ be a real $p \times q$ matrix with rank $q$. Prove that the QR decomposition $A = QR$ is unique if $R$ is forced to have positive entries on its main diagonal, $Q$ is $p \times q$, and $R$ is $q \times q$.

Assume that $A = Q_1 R_1$ and $A = Q_2 R_2$ with $R_1, R_2$ upper triangular with positive entries on the diagonal and $Q_1^T Q_1 = I_q$ and $Q_2^T Q_2 = I_q$. We first note that since $A$ has full rank, $R_1$ and $R_2$ are invertible.

We have $Q_1 R_1 = Q_2 R_2$; multiplying on the left by $Q_1^T$ and on the right by $R_2^{-1}$ gives $R_1 R_2^{-1} = Q_1^T Q_2$. Since $R_1 R_2^{-1}$ is a product of upper triangular matrices, $Q_1^T Q_2$ is upper triangular. Multiplying instead by $Q_2^T$ and $R_1^{-1}$ gives $R_2 R_1^{-1} = Q_2^T Q_1$, so $Q_2^T Q_1$ is upper triangular, and therefore $Q_1^T Q_2 = (Q_2^T Q_1)^T$ is lower triangular. Being both upper and lower triangular, $Q_1^T Q_2$ is diagonal (and invertible).

Let $D = Q_1^T Q_2$. From $R_1 R_2^{-1} = Q_1^T Q_2$ we see that $R_1 = D R_2$. Substituting into $Q_1 R_1 = Q_2 R_2$ gives $Q_1 D R_2 = Q_2 R_2$, and since $R_2$ is invertible, $Q_1 = Q_2 D^{-1}$. Now $Q_1^T Q_1 = I$ and $Q_2^T Q_2 = I$ give $D^{-2} = I$, that is, $D^2 = I$. Therefore $D$ has $\pm 1$ entries on the diagonal.

We come back to the relation $R_1 = D R_2$. Since the diagonal entries of $R_1$ are given by $(R_1)_{ii} = D_{ii} (R_2)_{ii}$, since $(R_1)_{ii}$ and $(R_2)_{ii}$ are both positive, and since $D_{ii} = \pm 1$, we see that this forces $D_{ii} = 1$. Finally $D = I$, and so $Q_1 = Q_2$ and $R_1 = R_2$.
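A numerical illustration (a sketch I am adding, assuming NumPy): numpy.linalg.qr does not enforce a sign convention, but flipping signs so that R has a positive diagonal produces exactly the normalized factorization whose uniqueness the problem asserts.

    import numpy as np

    def positive_qr(A):
        # Thin QR with the diagonal of R forced positive.
        Q, R = np.linalg.qr(A)            # Q is p x q, R is q x q
        signs = np.sign(np.diag(R))
        return Q * signs, signs[:, None] * R

    rng = np.random.default_rng(1)
    A = rng.standard_normal((6, 4))       # full column rank with probability 1

    Q, R = positive_qr(A)
    assert np.allclose(Q @ R, A)          # still a factorization of A
    assert np.all(np.diag(R) > 0)         # normalized diagonal
    assert np.allclose(Q.T @ Q, np.eye(4))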

4. Let $A$ and $B$ be $n \times n$ complex matrices such that $AB = BA$. Show that if $A$ has $n$ distinct eigenvalues, then $A$, $B$, and $AB$ are all diagonalizable.

Let $\lambda_1, \ldots, \lambda_n$ be the $n$ distinct eigenvalues of $A$ with corresponding (nonzero) eigenvectors $v_1, \ldots, v_n$. We know that a list of eigenvectors belonging to distinct eigenvalues must be linearly independent. Hence $(v_1, \ldots, v_n)$ is a basis of $\mathbb{C}^n$ consisting of eigenvectors of $A$, so that $A$ is similar to the diagonal matrix $\mathrm{diag}(\lambda_1, \ldots, \lambda_n)$.

Then $A(Bv_i) = BAv_i = B(\lambda_i v_i) = \lambda_i (Bv_i)$. So $Bv_i$ belongs to the eigenspace of $A$ associated with the eigenvalue $\lambda_i$, which is 1-dimensional because $A$ has $n$ distinct eigenvalues. This means that $Bv_i = \mu_i v_i$ for some scalar $\mu_i$ (which might be equal to $0$). Hence $(v_1, \ldots, v_n)$ is also a basis of eigenvectors of $B$, with $v_i$ associated with the eigenvalue $\mu_i$, so $B$ is diagonalizable. Then $ABv_i = \mu_i \lambda_i v_i$, so $AB$ is clearly similar to the matrix $\mathrm{diag}(\mu_1 \lambda_1, \ldots, \mu_n \lambda_n)$.
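A small numerical illustration (my addition, not from the exam): build $B$ as a polynomial in a matrix $A$ with distinct eigenvalues, so that $AB = BA$ holds automatically, and check that the eigenvectors of $A$ are also eigenvectors of $B$.

    import numpy as np

    rng = np.random.default_rng(2)
    V = rng.standard_normal((4, 4))       # columns: eigenvector basis
    A = V @ np.diag([1.0, 2.0, 3.0, 4.0]) @ np.linalg.inv(V)  # distinct eigenvalues
    B = 2 * A @ A + A + np.eye(4)         # a polynomial in A, so it commutes with A

    assert np.allclose(A @ B, B @ A)

    # Each eigenvector of A must be an eigenvector of B as well.
    for i in range(4):
        v = V[:, i]
        Bv = B @ v
        mu = (v @ Bv) / (v @ v)           # recover the eigenvalue mu_i of B
        assert np.allclose(Bv, mu * v)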

5. In this problem, $\mathbb{R}$ is the field of real numbers. Let $(u_1, u_2, \ldots, u_m)$ be an orthonormal basis for a subspace $W \neq \{0\}$ of the vector space $V = \mathbb{R}^{n \times 1}$ (under the standard inner product), let $U$ be the $n \times m$ matrix defined by $U = [u_1, u_2, \ldots, u_m]$, and let $P$ be the $n \times n$ matrix defined by $P = UU^T$.
(a) Prove that if $v$ is any given member of $V$, then among all the vectors $w$ in $W$, the one which minimizes $\|v - w\|$ is given by $w = \langle v, u_1 \rangle u_1 + \langle v, u_2 \rangle u_2 + \cdots + \langle v, u_m \rangle u_m$. (The vector $w$ is called the projection of $v$ onto $W$.)
(b) Prove: For any vector $x \in \mathbb{R}^{n \times 1}$, the projection $w$ of $x$ onto $W$ is given by $w = Px$.
(c) Prove: $P$ is a projection matrix. (Recall that a matrix $P \in \mathbb{R}^{n \times n}$ is called a projection matrix if and only if $P$ is symmetric and idempotent.)
(d) If $V = \mathbb{R}^{3 \times 1}$ and $W = \mathrm{span}\{(1, 2, 2)^T, (1, 0, 1)^T\}$, find the projection matrix $P$ described above and use it to find the projection of $(2, 2, 2)^T$ onto $W$.

(a) First, it is clear that $w \in W$. Note as well that $v - w \in W^\perp$, since for all $x \in W$,
$$\langle v - w, x \rangle = \langle v - \langle v, u_1 \rangle u_1 - \cdots - \langle v, u_m \rangle u_m, \, x \rangle = \langle v, x \rangle - \langle v, u_1 \rangle \langle u_1, x \rangle - \cdots - \langle v, u_m \rangle \langle u_m, x \rangle = 0.$$
The last equality comes from the fact that, since $x \in W$, we have $x = \langle x, u_1 \rangle u_1 + \cdots + \langle x, u_m \rangle u_m$, hence $\langle v, x \rangle = \langle v, u_1 \rangle \langle u_1, x \rangle + \cdots + \langle v, u_m \rangle \langle u_m, x \rangle$.
Now consider any $x \in W$. We have
$$\|v - x\|^2 = \|(v - w) + (w - x)\|^2 = \|v - w\|^2 + 2\langle v - w, w - x \rangle + \|w - x\|^2.$$
Since $v - w \in W^\perp$ and $w - x \in W$, we have $\langle v - w, w - x \rangle = 0$, so that
$$\|v - x\|^2 = \|v - w\|^2 + \|w - x\|^2.$$
We see that the minimum of $\|v - x\|^2$ is $\|v - w\|^2$ and is realized exactly when $x = w$.

(b) $w = \langle v, u_1 \rangle u_1 + \cdots + \langle v, u_m \rangle u_m = u_1(u_1^T v) + \cdots + u_m(u_m^T v) = (u_1 u_1^T + \cdots + u_m u_m^T) v = UU^T v = Pv.$

(c) First, $P^T = (UU^T)^T = UU^T = P$; second, $P^2 = (UU^T)^2 = U(U^T U)U^T = UU^T = P$, where we have used the fact that $U^T U = I$ (the columns of $U$ are orthonormal).

(d) An orthonormal basis for $W$ is, for example, $u_1 = (1/3, \, 2/3, \, 2/3)^T$ and $u_2 = (2/3, \, -2/3, \, 1/3)^T$ (normalize $(1,2,2)^T$, then apply Gram-Schmidt to $(1,0,1)^T$). We get
$$P = UU^T = \begin{pmatrix} 5/9 & -2/9 & 4/9 \\ -2/9 & 8/9 & 2/9 \\ 4/9 & 2/9 & 5/9 \end{pmatrix}.$$
Finally,
$$w = Px = (14/9, \, 16/9, \, 22/9)^T.$$
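To illustrate part (d) numerically (my own sketch using NumPy, not part of the exam), one can build the orthonormal basis with a QR factorization and confirm $P$ and the projection; $P = UU^T$ does not depend on the sign choices QR makes.

    import numpy as np

    W = np.array([[1.0, 1.0],
                  [2.0, 0.0],
                  [2.0, 1.0]])            # columns span W
    U, _ = np.linalg.qr(W)                # orthonormal basis for W
    P = U @ U.T                           # projection matrix onto W

    assert np.allclose(P, P.T)            # symmetric
    assert np.allclose(P @ P, P)          # idempotent
    print(P * 9)                          # entries of 9P: [[5,-2,4],[-2,8,2],[4,2,5]]

    x = np.array([2.0, 2.0, 2.0])
    print(P @ x * 9)                      # [14, 16, 22], i.e. w = (14/9, 16/9, 22/9)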

6. Let $V = \mathbb{R}^5$ and let $T \in \mathcal{L}(V)$ be defined by $T(a, b, c, d, e) = (2a, \, 2b, \, 2c + d, \, a + 2d, \, b + 2e)$.
(a) (8 points) Find the characteristic and minimal polynomial of $T$.
(b) (8 points) Determine a basis of $\mathbb{R}^5$ consisting of eigenvectors and generalized eigenvectors of $T$.
(c) (4 points) Find the Jordan form of $T$ with respect to your basis.

The matrix of $T$ in the standard basis $(e_1, e_2, e_3, e_4, e_5)$ is
$$\begin{pmatrix} 2 & 0 & 0 & 0 & 0 \\ 0 & 2 & 0 & 0 & 0 \\ 0 & 0 & 2 & 1 & 0 \\ 1 & 0 & 0 & 2 & 0 \\ 0 & 1 & 0 & 0 & 2 \end{pmatrix}.$$
If we reorder the basis as $(e_3, e_4, e_1, e_5, e_2)$, the matrix of $T$ in this basis is
$$\begin{pmatrix} 2 & 1 & 0 & 0 & 0 \\ 0 & 2 & 1 & 0 & 0 \\ 0 & 0 & 2 & 0 & 0 \\ 0 & 0 & 0 & 2 & 1 \\ 0 & 0 & 0 & 0 & 2 \end{pmatrix},$$
which is in Jordan form, with one $3 \times 3$ and one $2 \times 2$ Jordan block for the eigenvalue $2$. This answers questions (b) and (c). To answer (a), we readily see that the characteristic polynomial of $T$ is $(x - 2)^5$ and the minimal polynomial of $T$ is $(x - 2)^3$ (the size of the largest Jordan block).
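One can check the Jordan structure numerically (an illustrative sketch, assuming NumPy): the ranks of the powers of $A - 2I$ determine the block sizes, and $(A - 2I)^3 = 0$ while $(A - 2I)^2 \neq 0$ confirms the minimal polynomial.

    import numpy as np

    A = np.array([[2, 0, 0, 0, 0],
                  [0, 2, 0, 0, 0],
                  [0, 0, 2, 1, 0],
                  [1, 0, 0, 2, 0],
                  [0, 1, 0, 0, 2]], dtype=float)

    N = A - 2 * np.eye(5)                      # nilpotent part of A
    for k in range(1, 4):
        r = np.linalg.matrix_rank(np.linalg.matrix_power(N, k))
        print(k, r)                            # prints ranks 3, 1, 0
    # N^3 = 0 but N^2 != 0, so the minimal polynomial is (x - 2)^3;
    # dim ker N = 5 - 3 = 2 gives the number of Jordan blocks.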

7. Suppose that $W$ is finite dimensional and $T \in \mathcal{L}(V, W)$. Prove that $T$ is injective if and only if there exists $S \in \mathcal{L}(W, V)$ such that $ST$ is the identity map on $V$.

First suppose that there exists $S \in \mathcal{L}(W, V)$ such that $ST$ is the identity map on $V$. Let $x$ and $y$ in $V$ be such that $Tx = Ty$. Applying $S$ gives $STx = STy$; but $ST$ is the identity, so $STx = x$ and $STy = y$, and we get $x = y$. This means $T$ is injective.

Now suppose that $T$ is injective. Consider a basis $(w_1, \ldots, w_m)$ of $\mathrm{Range}(T)$. (Here we use the fact that $W$ is finite dimensional.) Since $w_1, \ldots, w_m$ belong to $\mathrm{Range}(T)$, there exist $v_1, \ldots, v_m$ in $V$ such that $w_i = Tv_i$ for $i = 1, \ldots, m$. Since $(w_1, \ldots, w_m)$ is a basis, it is a linearly independent list; that is, $(Tv_1, \ldots, Tv_m)$ is linearly independent. This readily implies that $(v_1, \ldots, v_m)$ is linearly independent. (By contraposition: if $(v_1, \ldots, v_m)$ were linearly dependent, then $(Tv_1, \ldots, Tv_m)$ would be linearly dependent.)

Now let $x \in V$. Since $(w_1, \ldots, w_m)$ spans $\mathrm{Range}(T)$, there exist scalars $\alpha_1, \ldots, \alpha_m$ such that $Tx = \alpha_1 w_1 + \cdots + \alpha_m w_m = \alpha_1 Tv_1 + \cdots + \alpha_m Tv_m$. So $T(x - \alpha_1 v_1 - \cdots - \alpha_m v_m) = 0$. Now $T$ is injective, so $x - \alpha_1 v_1 - \cdots - \alpha_m v_m = 0$, that is, $x = \alpha_1 v_1 + \cdots + \alpha_m v_m \in \mathrm{Span}(v_1, \ldots, v_m)$. This proves that $(v_1, \ldots, v_m)$ spans $V$. We conclude that $(v_1, \ldots, v_m)$ is a basis of $V$. (So this implies that $V$ is finite dimensional.)

Now we use the incomplete basis theorem to extend $(w_1, \ldots, w_m)$ with $w_{m+1}, \ldots, w_n$ so that $(w_1, \ldots, w_n)$ is a basis of $W$. We define $S : W \to V$ on this basis by $Sw_1 = v_1, \, Sw_2 = v_2, \ldots, Sw_m = v_m$ and $Sw_{m+1} = Sw_{m+2} = \cdots = Sw_n = 0$. It is clear that $S \in \mathcal{L}(W, V)$ and that $ST$ is the identity map on $V$: indeed, $STv_i = Sw_i = v_i$ on the basis $(v_1, \ldots, v_m)$, so $ST$ is the identity by linearity.
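In matrix terms (an illustrative sketch of mine, not from the exam), an injective $T$ corresponds to a matrix with full column rank, and one concrete left inverse $S$ is given by the Moore-Penrose pseudoinverse:

    import numpy as np

    rng = np.random.default_rng(3)
    T = rng.standard_normal((5, 3))       # full column rank w.p. 1: injective
    assert np.linalg.matrix_rank(T) == 3

    S = np.linalg.pinv(T)                 # here pinv(T) = (T^T T)^{-1} T^T
    assert np.allclose(S @ T, np.eye(3))  # ST is the identity on V = R^3
    # TS is only the projection onto Range(T), not the identity on W = R^5.
    assert not np.allclose(T @ S, np.eye(5))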

8. (a) Prove that a normal operator on a finite dimensional complex inner product space with real eigenvalues is self-adjoint.
(b) Let $V$ be a finite dimensional real inner product space and let $T : V \to V$ be a self-adjoint operator. Is it true that $T$ must have a cube root? Explain. (A cube root of $T$ is an operator $S : V \to V$ such that $S^3 = T$.)

(a) Let $V$ be a finite dimensional complex inner product space and $T : V \to V$ be a normal operator with real eigenvalues. Let $A$ be the matrix of $T$ in an orthonormal basis. Since $T$ is normal, $T$ is diagonalizable in an orthonormal basis (spectral theorem). Therefore there exists a unitary matrix $U$ ($U^H U = I$) such that $A = UDU^H$ with $D$ diagonal. We also know that the eigenvalues of $T$ are real, so $D$ is a real matrix; in particular, this implies $D = D^H$. In this case,
$$A^H = (UDU^H)^H = U D^H U^H = UDU^H = A,$$
so $T$ is self-adjoint.

(b) Yes, $T$ has a cube root. The proof of existence is by construction. Let $A$ be the matrix of $T$ in an orthonormal basis. Since $T$ is a self-adjoint operator, $T$ is diagonalizable in an orthonormal basis with real eigenvalues. Therefore there exists a real orthogonal matrix $U$ ($U^T U = I$) such that $A = UDU^T$ with $D$ real and diagonal. Define $S = UD^{1/3}U^T$, where $D^{1/3}$ is the diagonal matrix of the real cube roots of the diagonal entries of $D$; these exist even for negative entries, since every real number has a real cube root. Then clearly $S^3 = UDU^T = A$, so $S$ is a cube root of $T$.
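A numerical sketch of the construction in (b) (my addition, assuming NumPy): diagonalize a symmetric matrix with numpy.linalg.eigh, take real cube roots of the eigenvalues, and verify that $S^3 = A$.

    import numpy as np

    rng = np.random.default_rng(4)
    M = rng.standard_normal((4, 4))
    A = (M + M.T) / 2                    # a symmetric (self-adjoint) matrix

    lam, U = np.linalg.eigh(A)           # A = U diag(lam) U^T, U orthogonal
    S = U @ np.diag(np.cbrt(lam)) @ U.T  # cbrt handles negative eigenvalues

    assert np.allclose(S @ S @ S, A)     # S is a cube root of A
    assert np.allclose(S, S.T)           # and S is itself self-adjoint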