University of Colorado Denver Department of Mathematical and Statistical Sciences Applied Linear Algebra Ph.D. Preliminary Exam June 13, 2014


University of Colorado Denver, Department of Mathematical and Statistical Sciences
Applied Linear Algebra Ph.D. Preliminary Exam, June 13, 2014

Name:

Exam Rules: This is a closed book exam. Once the exam begins, you have 4 hours to do your best. Submit as many solutions as you can. All solutions will be graded and your final grade will be based on your six best solutions. Each problem is worth 20 points. Justify your solutions: cite theorems that you use, provide counter-examples for disproof, give explanations, and show calculations for numerical problems. If you are asked to prove a theorem, do not merely quote that theorem as your proof; instead, produce an independent proof. Begin each solution on a new page and use additional paper, if necessary. Write only on one side of the paper. Write legibly using a dark pencil or pen. Ask the proctor if you have any questions. Good luck!

[Score table: problems 1-7 and Total]

DO NOT TURN THE PAGE UNTIL TOLD TO DO SO.

Applied Linear Algebra Preliminary Exam Committee: Joshua French (Chair), Julien Langou, Anatolii Puhalskii

1. Assume the following general definition for a real positive semidefinite matrix: an n x n real matrix A is said to be positive semidefinite if and only if, for all vectors x in R^n, x^T A x >= 0. In particular, this definition allows real matrices which are not symmetric to be positive semidefinite.

(a) Prove that if A and B are real symmetric positive semidefinite matrices and the matrix A is nonsingular, then AB has only real nonnegative eigenvalues. (10 pts)

(b) Provide a counterexample showing that the requirement that the matrices be symmetric cannot be dropped. (10 pts)

Solution: (a) Since A is symmetric positive semidefinite and nonsingular, it is symmetric positive definite, so A^{1/2} and A^{-1/2} are well defined. The matrix AB has the same eigenvalues as the matrix A^{-1/2} (AB) A^{1/2} = A^{1/2} B A^{1/2}. The latter matrix is self-adjoint and positive semidefinite, so it has real nonnegative eigenvalues. (Note: the result also holds if we remove the assumption that A is nonsingular; in other words, A and B only need to be two n-by-n symmetric positive semidefinite matrices. The proof gets a little trickier, though.)

(b) One needs to provide positive semidefinite matrices A and B, with A nonsingular, such that AB has an eigenvalue which is not real and nonnegative. Given part (a), we understand that either A or B (or both) has to be nonsymmetric. To create a nonsymmetric positive semidefinite matrix A, one simply takes a symmetric positive definite matrix H and adds an antisymmetric matrix S; then A = H + S is positive semidefinite, since x^T S x = 0 for every x. Choosing H, S and a symmetric positive semidefinite B suitably produces a pair for which A is positive semidefinite and nonsingular, B is positive semidefinite, and AB does not have real nonnegative eigenvalues.
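As an illustrative numerical check of both parts (using NumPy; the particular 2 x 2 matrices for part (b) are one convenient choice of H + S together with a diagonal B, not necessarily the pair intended above):

```python
import numpy as np

rng = np.random.default_rng(0)

# Part (a): for symmetric PSD A (nonsingular) and symmetric PSD B,
# the eigenvalues of AB are real and nonnegative.
M = rng.standard_normal((4, 4))
N = rng.standard_normal((4, 4))
A = M @ M.T + np.eye(4)        # symmetric positive definite, hence nonsingular
B = N @ N.T                    # symmetric positive semidefinite
eig_ab = np.linalg.eigvals(A @ B)
print(np.allclose(eig_ab.imag, 0), np.all(eig_ab.real >= -1e-10))   # True True

# Part (b): drop symmetry of A. Take A = H + S with H = I (SPD) and S antisymmetric,
# so x^T A x = x^T x >= 0 and A is nonsingular; B is symmetric PSD (diagonal).
A = np.array([[1.0, 1.0],
              [-1.0, 1.0]])    # H = I, S = [[0, 1], [-1, 0]]
B = np.diag([1.0, 4.0])
print(np.linalg.eigvals(A @ B))  # complex pair 2.5 +/- 1.32...j, not real nonnegative
```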

2. (a) Suppose A and B are real-valued symmetric n x n matrices. Show that

trace(AB) <= sqrt( trace(A^2) trace(B^2) ).

What are the conditions for equality to hold? (10 pts)

(b) Let A be a real m x n matrix. Show that

trace(AA^T) <= ( trace( (AA^T)^{1/2} ) )^2.

When does equality hold? (10 pts)

Solution: (a) By the Cauchy-Schwarz inequality,

trace(AB) = sum_{i,j} a_{ij} b_{ij} <= sqrt( sum_{i,j} a_{ij}^2 ) sqrt( sum_{i,j} b_{ij}^2 ) = sqrt( trace(A^2) trace(B^2) ).

For equality to hold, one of the matrices has to be a scalar multiple of the other.

(b) Let AA^T = P^T D P, where D is a nonnegative diagonal matrix and P is an orthogonal matrix. Then

trace(AA^T) = trace(D) = sum_i lambda_i <= ( sum_i sqrt(lambda_i) )^2 = ( trace(D^{1/2}) )^2 = ( trace( (AA^T)^{1/2} ) )^2.

The fact that sum_i lambda_i <= ( sum_i sqrt(lambda_i) )^2 comes from expanding the square on the right-hand side: all cross terms are nonnegative. Equality holds if and only if D has at most one nonzero entry, that is, AA^T has at most one nonzero eigenvalue, that is, A has at most one nonzero singular value.
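Both inequalities are easy to check numerically; the sketch below (our own illustration, with NumPy) verifies (a) for random symmetric matrices and restates (b) in terms of the singular values of A:

```python
import numpy as np

rng = np.random.default_rng(1)
n, m = 5, 3

# (a) trace(AB) <= sqrt(trace(A^2) trace(B^2)) for random symmetric A, B.
A = rng.standard_normal((n, n)); A = (A + A.T) / 2
B = rng.standard_normal((n, n)); B = (B + B.T) / 2
lhs = np.trace(A @ B)
rhs = np.sqrt(np.trace(A @ A) * np.trace(B @ B))
print(lhs <= rhs + 1e-12)          # True; equality iff B is a scalar multiple of A

# (b) trace(AA^T) <= (trace((AA^T)^{1/2}))^2.  In terms of the singular values s_i
# of A, this reads sum(s_i^2) <= (sum(s_i))^2.
C = rng.standard_normal((m, n))
s = np.linalg.svd(C, compute_uv=False)
print(np.sum(s**2) <= np.sum(s)**2 + 1e-12)   # True; equality iff at most one s_i != 0
```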

3. Let f : M_n(R) -> M_n(R), A |-> A^T.

(a) What are the eigenvalues of f? (10 pts)

(b) Is f diagonalizable? If yes, give a basis of eigenvectors. If no, give as many linearly independent eigenvectors as possible. (10 pts)

Solution: It is clear that f^2 = I, therefore p(x) = (x - 1)(x + 1) is such that p(f) = 0. This implies that the eigenvalues of f are contained in the set {-1, 1}. Also, p(f) = 0 implies that f is diagonalizable, since p has only simple roots. Now it is clear that any nonzero symmetric matrix is an eigenvector associated with eigenvalue 1, and that an eigenvector associated with eigenvalue 1 is a symmetric matrix. If we call S_n the subspace of symmetric matrices and E_1 the eigenspace of f associated with eigenvalue 1, we have S_n = E_1. It is also clear that any nonzero antisymmetric matrix is an eigenvector associated with eigenvalue -1, and that an eigenvector associated with eigenvalue -1 is an antisymmetric matrix. If we call A_n the subspace of antisymmetric matrices and E_{-1} the eigenspace of f associated with eigenvalue -1, we have A_n = E_{-1}. Since S_n and (for n >= 2) A_n are nonzero, the eigenvalues of f are exactly 1 and -1. We know that M_n(R) = S_n ⊕ A_n. Therefore we can diagonalize f by taking a basis of S_n and a basis of A_n to form a basis of M_n(R).
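As an illustrative aside: for n = 2, in the basis {E11, E12, E21, E22} the map f is represented by a 4 x 4 permutation matrix, and the spectrum and eigenspaces can be checked directly (NumPy sketch, names our own):

```python
import numpy as np

# Matrix of f(A) = A^T on M_2(R) in the basis {E11, E12, E21, E22}:
# transposition swaps the E12 and E21 coordinates and fixes the other two.
T = np.array([[1., 0., 0., 0.],
              [0., 0., 1., 0.],
              [0., 1., 0., 0.],
              [0., 0., 0., 1.]])

print(np.linalg.eigvalsh(T))          # [-1.  1.  1.  1.]: dim E_1 = 3, dim E_{-1} = 1

# The vector of a symmetric matrix is fixed by T; that of an antisymmetric one is negated.
s = np.array([1., 2., 2., 3.])        # [[1, 2], [2, 3]] (symmetric)
k = np.array([0., 1., -1., 0.])       # [[0, 1], [-1, 0]] (antisymmetric)
print(np.allclose(T @ s, s), np.allclose(T @ k, -k))   # True True
```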

4. Define the n x n matrix

A_n =
[ a+b   b    b   ...   b
   a   a+b   b   ...   b
   a    a   a+b  ...   b
   .    .    .    .    .
   a    a    a   ...  a+b ],

that is, (A_n)_{ii} = a + b, (A_n)_{ij} = b for j > i, and (A_n)_{ij} = a for j < i.

(a) Compute D_n = det(A_n). (10 pts)

(b) Give the value of D_n for n = 10, a = 2, and b = -1. (10 pts)

Solution: (a) We perform (in this order) the row operations L_n <- L_n - L_{n-1}, then L_{n-1} <- L_{n-1} - L_{n-2}, ..., and finally L_2 <- L_2 - L_1. (These transformations do not change the value of the determinant.) We get

D_n = det
[ a+b   b    b   ...   b    b
  -b    a    0   ...   0    0
   0   -b    a   ...   0    0
   .    .    .    .    .    .
   0    0    0   ...  -b    a ],

i.e., row 1 is unchanged and, for i >= 2, row i has -b in column i-1, a in column i, and zeros elsewhere. We expand with respect to the last column: its only nonzero entries are b (in row 1) and a (in row n), so

D_n = (-1)^{n+1} b det(M_1) + a D_{n-1},

where M_1 is the (n-1) x (n-1) minor obtained by deleting row 1 and column n. M_1 is triangular with -b on the diagonal, so det(M_1) = (-b)^{n-1}, and (-1)^{n+1} b (-b)^{n-1} = b^n. And so we get

D_n = b^n + a D_{n-1}.

We have D_1 = a + b. (Note: we could get D_1 from D_1 = b + a D_0 if we define D_0 to be 1.)

So we get D_2 = b^2 + a D_1 = b^2 + ab + a^2. Quick check:

D_2 = det [ a+b  b ; a  a+b ] = (a+b)^2 - ab = a^2 + ab + b^2.

So we get D_3 = b^3 + a D_2 = b^3 + ab^2 + a^2 b + a^3. Pursuing in an identical manner, we get

D_n = b^n + a b^{n-1} + ... + a^{n-1} b + a^n = sum_{k=0}^{n} a^k b^{n-k}.

We can simplify by noticing that

(a - b)(b^n + a b^{n-1} + ... + a^{n-1} b + a^n) = a^{n+1} - b^{n+1}.

So, if a != b, we have

D_n = (a^{n+1} - b^{n+1}) / (a - b),

and, if a = b, we get

D_n = (n+1) a^n.

(And we check that the latter expression for a = b is the limit of the former expression as b goes to a.)

(b) For n = 10, a = 2, and b = -1, we get

D_10 = (2^{11} - (-1)^{11}) / (2 - (-1)) = 2049/3 = 683.
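A small numerical check of the closed form (an illustration added here; the helper names A_n and D_closed are chosen for this sketch):

```python
import numpy as np

def A_n(n, a, b):
    """The n x n matrix with a+b on the diagonal, b above it, and a below it."""
    M = np.zeros((n, n))
    M[np.triu_indices(n, 1)] = b
    M[np.tril_indices(n, -1)] = a
    np.fill_diagonal(M, a + b)
    return M

def D_closed(n, a, b):
    """Closed form for D_n = det(A_n) from part (a)."""
    return (a**(n + 1) - b**(n + 1)) / (a - b) if a != b else (n + 1) * a**n

for n, a, b in [(5, 3.0, 2.0), (6, 2.0, 2.0), (10, 2.0, -1.0)]:
    print(np.linalg.det(A_n(n, a, b)), D_closed(n, a, b))
# The two columns agree up to floating-point rounding; the last row gives 683 (part (b)).
```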

5. Suppose that u and v are vectors in a real inner product space V.

(a) Prove that (||u|| + ||v||) <u,v> / (||u|| ||v||) <= ||u + v||. (10 pts)

(b) Prove or disprove the following inequality: (||u|| + ||v||) |<u,v>| / (||u|| ||v||) <= ||u + v||. (10 pts)

Solution: (a) Case 1: <u,v> <= 0. The inequality follows trivially since a norm is nonnegative: the left side is no more than 0 while the right side is no less than 0.

Case 2: <u,v> > 0. Squaring the left side, we have

((||u|| + ||v||) <u,v> / (||u|| ||v||))^2
  = (||u||^2 + ||v||^2 + 2 ||u|| ||v||) <u,v>^2 / (||u||^2 ||v||^2)
 <= (||u||^2 + ||v||^2 + 2 ||u|| ||v||) <u,v> / (||u|| ||v||)            (1)
  = (||u||/||v||) <u,v> + (||v||/||u||) <u,v> + 2 <u,v>                  (2)
 <= (||u||/||v||) ||u|| ||v|| + (||v||/||u||) ||u|| ||v|| + 2 <u,v>      (3)
  = ||u||^2 + ||v||^2 + 2 <u,v> = ||u + v||^2.                           (4)

Both (1) and (3) are obtained by applying the Cauchy-Schwarz inequality to <u,v>, while (2) and (4) are obtained by simplifying. Taking square roots (both sides are positive in this case) gives the claim.

(b) The inequality is false in general. Let u = (1, 0), v = (-1, 0), and use the Euclidean inner product (dot product). Then the left side of the inequality becomes (1 + 1) * 1 / (1 * 1) = 2 while the right side is ||u + v|| = 0. (Note: one can also use one-dimensional vectors, u = (1) and v = (-1).)
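A randomized check of (a) and the explicit counterexample for (b) (illustrative NumPy sketch; the helper lhs is our own):

```python
import numpy as np

rng = np.random.default_rng(2)

def lhs(u, v, absolute=False):
    """Left-hand side (||u|| + ||v||) <u,v> / (||u|| ||v||), optionally with |<u,v>|."""
    ip = float(u @ v)
    if absolute:
        ip = abs(ip)
    return (np.linalg.norm(u) + np.linalg.norm(v)) * ip / (np.linalg.norm(u) * np.linalg.norm(v))

# (a): the inequality holds for random real vectors.
print(all(lhs(u, v) <= np.linalg.norm(u + v) + 1e-12
          for u, v in (rng.standard_normal((2, 5)) for _ in range(1000))))   # True

# (b): with |<u,v>| the inequality fails, e.g. for antiparallel unit vectors.
u, v = np.array([1.0, 0.0]), np.array([-1.0, 0.0])
print(lhs(u, v, absolute=True), np.linalg.norm(u + v))                       # 2.0 0.0
```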

6. Let V be a vector space. Let f ∈ L(V). Let p be a projection (so p ∈ L(V) and p^2 = p). Prove that Null(f ∘ p) = Null(p) ⊕ (Null(f) ∩ Range(p)). (20 pts)

Solution: Firstly, we would like to prove that Null(p) + (Null(f) ∩ Range(p)) ⊆ Null(f ∘ p). (Note: we recall that if A, B and C are subspaces, to prove that A + B ⊆ C, we just need to prove that A ⊆ C and B ⊆ C.)

Null(p) ⊆ Null(f ∘ p): Let x ∈ Null(p); then p(x) = 0, so (f ∘ p)(x) = 0, so x ∈ Null(f ∘ p).

Null(f) ∩ Range(p) ⊆ Null(f ∘ p): Let x ∈ Null(f) ∩ Range(p). Since x ∈ Range(p), there exists y such that x = p(y). Since x ∈ Null(f), we have f(x) = 0. Now let us look at (f ∘ p)(x). (Note: we want to prove that (f ∘ p)(x) = 0.) We have

(f ∘ p)(x) = (f ∘ p)(p(y)) = f(p^2(y)) = f(p(y)) = f(x) = 0.

We have used the facts that (first to second expression): x = p(y); (third to fourth): p^2 = p; (fourth to fifth): p(y) = x; (fifth to sixth): f(x) = 0. This proves that x ∈ Null(f ∘ p).

We proved that Null(p) + (Null(f) ∩ Range(p)) ⊆ Null(f ∘ p).

Secondly, we would like to prove that Null(f ∘ p) ⊆ Null(p) + (Null(f) ∩ Range(p)). Let x ∈ Null(f ∘ p); we can write x as

x = (x - p(x)) + p(x),

where

(a) (x - p(x)) ∈ Null(p). Indeed, p(x - p(x)) = p(x) - p^2(x), but p = p^2, so p(x - p(x)) = 0, so (x - p(x)) ∈ Null(p).

(b) p(x) ∈ Null(f) ∩ Range(p). It is a fact that p(x) ∈ Range(p). Moreover, since x ∈ Null(f ∘ p), we have that (f ∘ p)(x) = 0, i.e., f(p(x)) = 0, which proves that p(x) ∈ Null(f). So p(x) ∈ Null(f) ∩ Range(p).

Therefore we have that Null(f ∘ p) ⊆ Null(p) + (Null(f) ∩ Range(p)).

At this point, we have proved that Null(f ∘ p) = Null(p) + (Null(f) ∩ Range(p)). It remains to prove that the sum is direct. Let x ∈ Null(p) ∩ (Null(f) ∩ Range(p)). Then x ∈ Range(p), so there exists u ∈ V such that x = p(u); but x ∈ Null(p), so p(x) = 0, so p^2(u) = 0; but p^2 = p, so p(u) = 0, so x = 0. We proved that Null(p) ∩ (Null(f) ∩ Range(p)) = {0}, so the sum in the previous paragraph is direct. We are done and we can conclude that Null(f ∘ p) = Null(p) ⊕ (Null(f) ∩ Range(p)).
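In finite dimensions the statement can be sanity-checked numerically by comparing dimensions. The sketch below (our own illustration; P is an orthogonal projection and F is built so that Null(f) meets Range(p) nontrivially) confirms that dim Null(f ∘ p) = dim Null(p) + dim(Null(f) ∩ Range(p)):

```python
import numpy as np

rng = np.random.default_rng(3)
n = 8

# An orthogonal projection p onto a 4-dimensional subspace (so p^2 = p).
Q, _ = np.linalg.qr(rng.standard_normal((n, 4)))
P = Q @ Q.T

# An f whose null space meets Range(p) in a 2-dimensional subspace:
# f kills span(Q[:, :2]) but is otherwise injective.
G = rng.standard_normal((n, n))
F = G @ (np.eye(n) - Q[:, :2] @ Q[:, :2].T)

def null_basis(M, tol=1e-10):
    """Orthonormal basis of Null(M) from the SVD."""
    _, s, Vt = np.linalg.svd(M)
    return Vt[np.sum(s > tol):].T

dim_null_fp = null_basis(F @ P).shape[1]
dim_null_p = null_basis(P).shape[1]
Nf = null_basis(F)
# dim(Null(f) ∩ Range(p)) = dim Null(f) + dim Range(p) - dim(Null(f) + Range(p))
dim_inter = Nf.shape[1] + 4 - np.linalg.matrix_rank(np.hstack([Nf, Q]), tol=1e-10)

print(dim_null_fp, dim_null_p + dim_inter)   # 6 6: the dimensions of the two sides agree
```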

7. (a) Let n ∈ N \ {0, 1} (so n >= 2) and A ∈ M_n(C) such that rank(A) = 1. Prove that A is diagonalizable if and only if trace(A) != 0. (10 pts)

(b) Let a_1, ..., a_n ∈ C \ {0} (so the a_i's are nonzero complex numbers), and let A = (a_i / a_j)_{1 <= i, j <= n}. (This means that the entry (i, j) of A is a_i / a_j.) Show that A is diagonalizable. Give a basis of eigenvectors (with the associated eigenvalues) for A. (10 pts)

Solution: (a) First we note that rank(A) = 1 implies dim(Null(A)) = n - 1 (by the rank theorem). So, if rank(A) = 1 and n >= 2, then dim(Null(A)) >= 1 and so 0 is an eigenvalue of A. We call ν_0 the geometric multiplicity of the eigenvalue 0, µ_0 the algebraic multiplicity of the eigenvalue 0, and E_0 the eigenspace associated with the eigenvalue 0. Now, since dim(Null(A)) = n - 1, we have that dim(E_0) = n - 1; in other words, the geometric multiplicity of the eigenvalue 0, ν_0, is n - 1. We know that, for a given eigenvalue, the algebraic multiplicity is always greater than or equal to the geometric multiplicity. For the eigenvalue 0, this reads: ν_0 <= µ_0. For a rank-1 matrix, there are therefore only two cases: either ν_0 = µ_0 = n - 1, or ν_0 = n - 1 and µ_0 = n.

Case 1: ν_0 = µ_0 = n - 1. In this case, since µ_0 = n - 1, there has to exist another eigenvalue λ different from zero (because the algebraic multiplicities of the eigenvalues have to sum to n). For that eigenvalue λ, the geometric multiplicity ν_λ is at least 1, but can be no more than 1 (because µ_0 = n - 1 and the sum of the algebraic multiplicities of distinct eigenvalues is at most n). So ν_λ = 1. So we have ν_λ = 1 and ν_0 = n - 1, so A is diagonalizable. We also note that, in this case, trace(A) = λ != 0 (the trace is the sum of the eigenvalues counted with their algebraic multiplicities), and so, in this case, trace(A) != 0.

Case 2: ν_0 = n - 1, µ_0 = n. In this case, since µ_0 = n, A has only the eigenvalue 0. We also have that A is not diagonalizable (its only eigenspace, E_0, has dimension n - 1 < n) and that trace(A) = 0.

Starting from a rank-1 matrix, we found two possibilities. Either ν_0 = µ_0 = n - 1, in which case A is diagonalizable and trace(A) != 0; or ν_0 = n - 1 and µ_0 = n, in which case A is not diagonalizable and trace(A) = 0. This enables us to conclude that, for a rank-1 matrix A,

A is diagonalizable <=> trace(A) != 0.

(b) We observe that the matrix A is of rank 1. Indeed,

A = (a_1, a_2, ..., a_n)^T (1/a_1, 1/a_2, ..., 1/a_n),

the product of a nonzero column by a nonzero row. We also have trace(A) = n != 0. So, by the previous question, we see that A is diagonalizable (since trace(A) != 0). We also see that A has eigenvalue 0 with geometric multiplicity n - 1 and eigenvalue n (the nonzero eigenvalue equals trace(A) = n) with geometric multiplicity 1.

Eigenvalue 0: To find n - 1 linearly independent eigenvectors associated with eigenvalue 0, we want to find a basis for the null space of A, which is the same as the null space of the single row (1/a_1, 1/a_2, ..., 1/a_n). We have (for example) that x_1 is a leading variable and that x_2, x_3, ..., x_n are free variables. This gives the general solution

(x_1, x_2, ..., x_n)^T = x_2 (-a_1/a_2, 1, 0, ..., 0)^T + x_3 (-a_1/a_3, 0, 1, ..., 0)^T + ... + x_n (-a_1/a_n, 0, ..., 0, 1)^T.

So a basis for E_0 is, for example,

v_1 = (-a_1/a_2, 1, 0, ..., 0)^T, v_2 = (-a_1/a_3, 0, 1, 0, ..., 0)^T, ..., v_{n-1} = (-a_1/a_n, 0, ..., 0, 1)^T.

Eigenvalue n: We see that an eigenvector for eigenvalue n is, for example,

v_n = (a_1, a_2, ..., a_n)^T.

Answer: The above (v_1, ..., v_n) is a basis of C^n made of eigenvectors of A (v_1, ..., v_{n-1} with eigenvalue 0, and v_n with eigenvalue n).
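A numerical illustration of part (b) for a random choice of nonzero complex a_i (added here as a sketch, not part of the exam):

```python
import numpy as np

rng = np.random.default_rng(4)
n = 5
a = rng.standard_normal(n) + 1j * rng.standard_normal(n)   # nonzero almost surely

A = np.outer(a, 1.0 / a)            # A[i, j] = a_i / a_j

print(np.linalg.matrix_rank(A))                     # 1
print(np.round(np.linalg.eigvals(A), 10))           # eigenvalue 0 (n-1 times) and n

# Check the eigenvectors from the solution.
print(np.allclose(A @ a, n * a))                    # v_n = a has eigenvalue n
v1 = np.zeros(n, dtype=complex)
v1[0] = -a[0] / a[1]
v1[1] = 1.0
print(np.allclose(A @ v1, 0))                       # v_1 has eigenvalue 0
```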