Jordan Canonical Form of A Partitioned Complex Matrix and Its Application to Real Quaternion Matrices
COMMUNICATIONS IN ALGEBRA, 29(6), 2001

Fuzhen Zhang
Department of Math, Science and Technology, Nova Southeastern University, Fort Lauderdale, Florida 33314, USA
zhang@nova.edu

Yimin Wei
Department of Mathematics, Fudan University, Shanghai, P. R. China
ymwei@fudan.edu.cn

Abstract. Let Σ be the collection of all 2n × 2n partitioned complex matrices

$$\begin{pmatrix} A_1 & A_2 \\ -\bar A_2 & \bar A_1 \end{pmatrix},$$

where A_1 and A_2 are n × n complex matrices and the bars on top of them denote the matrix conjugate. We show that Σ is closed under similarity transformation to Jordan (canonical) forms. Precisely, any matrix in Σ is similar to a matrix of the form J ⊕ J̄ ∈ Σ via an invertible matrix in Σ, where J is a Jordan form whose diagonal elements all have nonnegative imaginary parts. An application of this result gives the Jordan form of real quaternion matrices.

1. Isomorphism and complex adjoint of real quaternion matrices

This research was motivated by studying the analogous properties of quaternion matrices to those of complex matrices. Throughout this paper we denote by C and Q, respectively, the sets of complex numbers and real quaternions, and let F^n and M_n(F) be the n-tuples and the n × n
matrices over F, where F = C or Q. We call real quaternions simply quaternions, and take {1, i, j, k} as a basis of Q. In addition, for A ∈ M_n(Q), Ā denotes the conjugate of A and A* = Ā^T is the (quaternionic) conjugate transpose of A.

Let A ∈ M_n(Q). Write A = A_1 + A_2 j, where A_1 and A_2 are n × n complex matrices. We associate with A the 2n × 2n complex matrix

$$\phi(A) = \begin{pmatrix} A_1 & A_2 \\ -\bar A_2 & \bar A_1 \end{pmatrix} \tag{1}$$

and call φ(A) the complex adjoint matrix of the quaternion matrix A. Let Σ be the collection of all 2n × 2n partitioned complex matrices of the form (1). The mapping A ↦ φ(A) is an isomorphism between M_n(Q) and Σ. Some basic properties of φ are (see [11]):

1. φ(A + B) = φ(A) + φ(B);
2. φ(AB) = φ(A)φ(B);
3. φ(A*) = (φ(A))*;
4. φ(A^{-1}) = (φ(A))^{-1} if A is invertible;
5. A has rank r if and only if φ(A) has rank 2r;
6. φ(A) is unitary, Hermitian, or normal if and only if A is unitary, Hermitian, or normal.

Complex adjoint matrices have been employed in the study of quaternion matrices by many researchers (see, e.g., [4], [6], [9] or [11]). In order to obtain some properties such as commutativity, normality and polar decomposition for quaternion matrices, Wiegmann [8] investigated the Jordan form of a quaternion matrix through its adjoint using solely complex matrix theory. He claimed the statement in our abstract. His proof, however, is false. The purpose of the present paper is to complete the proof. For a survey on quaternion matrices, see [11]. For other aspects of quaternions such as Cayley numbers and quaternionic quantum mechanics, see [7] and [1].

The problems we shall study in this paper are formulated as follows:

Problem 1: Any A ∈ Σ has a Jordan form in Σ;
Problem 2: Any A ∈ Σ is similar to a Jordan form in Σ via an invertible P ∈ Σ;
Problem 3: Any A ∈ M_n(Q) has a Jordan form in M_n(C) in the usual sense.
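Properties 2 and 5 of the adjoint map can be checked numerically. A minimal NumPy sketch (the helper names `phi` and `qmul` are ours, not from the paper; the product rule uses jb = b̄j for complex b):

```python
import numpy as np

def phi(A1, A2):
    """Complex adjoint of the quaternion matrix A = A1 + A2*j, as in (1)."""
    return np.block([[A1, A2], [-A2.conj(), A1.conj()]])

def qmul(A1, A2, B1, B2):
    """Components of the quaternion product (A1 + A2 j)(B1 + B2 j),
    using j*b = conj(b)*j for a complex scalar b."""
    return A1 @ B1 - A2 @ B2.conj(), A1 @ B2 + A2 @ B1.conj()

rng = np.random.default_rng(0)
n = 3
A1, A2, B1, B2 = [rng.standard_normal((n, n)) + 1j * rng.standard_normal((n, n))
                  for _ in range(4)]

# property 2: phi is multiplicative
assert np.allclose(phi(*qmul(A1, A2, B1, B2)), phi(A1, A2) @ phi(B1, B2))
# property 5: a generic quaternion matrix is invertible (rank n), so phi(A) has rank 2n
assert np.linalg.matrix_rank(phi(A1, A2)) == 2 * n
```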
Problems 2 and 3 are equivalent, since if P^{-1}AP = B holds over Q, then (φ(P))^{-1} φ(A) φ(P) = φ(B) holds over C. Conversely, if there exists a nonsingular matrix S in Σ such that S^{-1} φ(A) S = T, then T is in Σ. If we write

$$S = \begin{pmatrix} S_1 & S_2 \\ -\bar S_2 & \bar S_1 \end{pmatrix}, \qquad T = \begin{pmatrix} T_1 & T_2 \\ -\bar T_2 & \bar T_1 \end{pmatrix},$$

then the identity S^{-1} φ(A) S = T, or φ(A)S = ST, implies

$$A_1 S_1 - A_2 \bar S_2 = S_1 T_1 - S_2 \bar T_2 \quad\text{and}\quad A_1 S_2 + A_2 \bar S_1 = S_1 T_2 + S_2 \bar T_1.$$

It follows that

$$(A_1 + A_2 j)(S_1 + S_2 j) = (S_1 + S_2 j)(T_1 + T_2 j).$$

Therefore the quaternion matrix A is similar to T_1 + T_2 j via the nonsingular quaternion matrix S_1 + S_2 j. We note that in this way many quaternion matrix problems can be converted into partitioned complex matrix problems.

Recall that the left and right eigenvalues λ of a square quaternion matrix A, defined respectively by Ax = λx and Ax = xλ for some nonzero quaternion vector x, are different in general. It has been evident that right eigenvalues, simply called eigenvalues, are more useful in the study of quaternion matrices. For left eigenvalues, one may refer to [10] (for the existence). Furthermore, any n × n quaternion matrix has n complex right eigenvalues with nonnegative imaginary parts [11].

Problem 1 is immediate from Problems 2 and 3, but not conversely, since the invertible matrix through which A is similar to its Jordan form in Σ need not be in Σ. Take, for instance,

$$A = \begin{pmatrix} 0 & 1 \\ -1 & 0 \end{pmatrix}.$$

Then the eigenvalues of A are i and −i, and A is similar to its Jordan form J = diag(i, −i). It is easy to verify that P^{-1}AP = J, where

$$P = \begin{pmatrix} 1 & 1 \\ i & -i \end{pmatrix} \notin \Sigma.$$

However, as seen later, one can choose an invertible matrix in Σ that does the job. We give an affirmative answer to Problem 2, and consequently to Problems 1 and 3.
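The 2 × 2 example above can be verified directly. The particular Σ-diagonalizer Q below is one choice we supply for illustration (the paper only asserts that some such matrix exists):

```python
import numpy as np

A = np.array([[0, 1], [-1, 0]], dtype=complex)   # in Sigma with n = 1: A1 = (0), A2 = (1)
J = np.diag([1j, -1j])

P = np.array([[1, 1], [1j, -1j]])                # diagonalizes A but is NOT of the Sigma form
assert np.allclose(np.linalg.inv(P) @ A @ P, J)

# an invertible matrix of the Sigma form ((a, b), (-conj(b), conj(a))) that also works:
Q = np.array([[1, 1j], [1j, 1]])
assert np.allclose(np.linalg.inv(Q) @ A @ Q, J)
```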
2. Adjoint vectors and lemmas

As we saw in the previous section, an invertible matrix P that takes a matrix A ∈ Σ to its Jordan form need not be in Σ. Suppose such a P exists in Σ. If (v_{1i}, ..., v_{ni}, w_{1i}, ..., w_{ni})^T is the i-th column vector of P, 1 ≤ i ≤ n, then (−w̄_{1i}, ..., −w̄_{ni}, v̄_{1i}, ..., v̄_{ni})^T should be the (n+i)-th column vector of P. This vector is called the adjoint vector of the former one. In general, the adjoint vector v^ε of a column vector v with 2n complex components is defined by

$$v^{\varepsilon} = \begin{pmatrix} -\bar v_2 \\ \bar v_1 \end{pmatrix}, \quad \text{if } v = \begin{pmatrix} v_1 \\ v_2 \end{pmatrix}, \quad v_1, v_2 \in \mathbb{C}^n.$$

Obviously v and v^ε are linearly independent if v ≠ 0. In fact, they are orthogonal. It is readily seen that for any vectors u and v in C^{2n} and any complex number c,

$$(cu + v)^{\varepsilon} = \bar c\, u^{\varepsilon} + v^{\varepsilon}, \qquad (u^{\varepsilon})^{\varepsilon} = -u. \tag{2}$$

The following lemmas, extracted from [8], will be used in the proof of our main theorem when choosing basis vectors of the form v, v^ε for certain eigenspaces.

Lemma 1. If v_1, v_2, ..., v_k ∈ C^{2n} are linearly independent, then so are v_1^ε, v_2^ε, ..., v_k^ε.

Lemma 2. Let v_1, v_2, ..., v_k, v_{k+1} be vectors in C^{2n}. If v_1, v_1^ε, ..., v_k, v_k^ε, v_{k+1} are linearly independent, then v_1, v_1^ε, ..., v_k, v_k^ε, v_{k+1}, v_{k+1}^ε are also linearly independent.

Proof. Let a_1, b_1, ..., a_{k+1}, b_{k+1} be complex numbers such that

$$a_1 v_1 + b_1 v_1^{\varepsilon} + \cdots + a_k v_k + b_k v_k^{\varepsilon} + a_{k+1} v_{k+1} + b_{k+1} v_{k+1}^{\varepsilon} = 0.$$

Take adjoints, using (2), to get

$$\bar a_1 v_1^{\varepsilon} - \bar b_1 v_1 + \cdots + \bar a_k v_k^{\varepsilon} - \bar b_k v_k + \bar a_{k+1} v_{k+1}^{\varepsilon} - \bar b_{k+1} v_{k+1} = 0.$$

Multiplying the first equation by ā_{k+1} and the second by b_{k+1}, then subtracting to get rid of v_{k+1}^ε, one has |a_{k+1}|² + |b_{k+1}|² = 0, since v_1, v_1^ε, ..., v_k, v_k^ε, v_{k+1} are linearly independent by assumption. Thus a_{k+1} = b_{k+1} = 0; hence all a's and b's equal zero due to the linear independence assumption.
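The three properties of the adjoint vector (orthogonality and the two identities in (2)) can be checked numerically. A small sketch, with `eps` implementing our reading of the definition v^ε = (−v̄_2, v̄_1)^T:

```python
import numpy as np

def eps(v):
    """Adjoint vector of v = (v1, v2) in C^{2n}: (-conj(v2), conj(v1))."""
    n = v.size // 2
    return np.concatenate([-v[n:].conj(), v[:n].conj()])

rng = np.random.default_rng(1)
u = rng.standard_normal(6) + 1j * rng.standard_normal(6)
v = rng.standard_normal(6) + 1j * rng.standard_normal(6)
c = 2 - 3j

assert np.isclose(np.vdot(v, eps(v)), 0)                          # v and v^eps are orthogonal
assert np.allclose(eps(c * u + v), np.conj(c) * eps(u) + eps(v))  # first identity in (2)
assert np.allclose(eps(eps(u)), -u)                               # second identity in (2)
```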
Lemma 3. Let u ∈ C^{2n}, c ∈ C, and A, B ∈ Σ. Then

$$[(A + cB)u]^{\varepsilon} = (A + \bar c B)u^{\varepsilon}. \tag{3}$$

In particular,

$$[(A - \lambda I)u]^{\varepsilon} = (A - \bar\lambda I)u^{\varepsilon}, \tag{4}$$

and if u is an eigenvector of A corresponding to λ, then so is u^ε corresponding to λ̄.

3. Jordan form of the partitioned matrices

Jordan canonical forms play a fundamental role in linear algebra. The Jordan form of an n × n complex matrix can be carried out in three different classical ways: (i) using elementary divisor and invariant factor theory, (ii) by a pure matrix proof [3], and (iii) utilizing the invariance of generalized eigenspaces [5]. We outline the last approach for convenience of later use. The proof of our theorem is based on a thorough understanding and reconstruction of this approach.

Step 1: Let λ ∈ C be an eigenvalue of A ∈ M_n(C). The null spaces

$$N((A - \lambda I)^k) = \{x \in \mathbb{C}^n \mid (A - \lambda I)^k x = 0\}, \quad k = 1, 2, \ldots,$$

form an increasing chain of subspaces of C^n. Thus some two consecutive null spaces, and from there on, are identical. Let r be the smallest such positive integer, i.e.,

$$N((A - \lambda I)^{r-1}) \subsetneq N((A - \lambda I)^{r}) = N((A - \lambda I)^{r+1}).$$

The subspace N((A − λI)^r), abbreviated N_λ, is the generalized eigenspace of A corresponding to λ, and r is the index of λ. Note that N_λ is an invariant subspace of C^n under A because (A − λI)^i and A commute. For the same reason, the range R_λ = {(A − λI)^r v : v ∈ C^n} is also an invariant subspace under A. In addition, C^n = N_λ ⊕ R_λ.

Step 2: Let λ_1, λ_2, ..., λ_k ∈ C be distinct eigenvalues of A. It can be shown that N_{λ_i} ⊆ R_{λ_j} for any i ≠ j. This implies the decomposition that is key to the entire proof:

$$\mathbb{C}^n = N_{\lambda_1} \oplus N_{\lambda_2} \oplus \cdots \oplus N_{\lambda_k}. \tag{5}$$

Since each N_{λ_i} is an invariant subspace under A, one may focus on A restricted to a typical generalized eigenspace N_λ and show that the matrix of A under a certain basis of N_λ is of the desired Jordan form.
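The stabilizing null-space chain of Step 1 can be computed by ranks. A numerical sketch (the helper `index_of` is ours; rank-based nullity is fine for small exact examples):

```python
import numpy as np

def index_of(A, lam, tol=1e-9):
    """Smallest r with null((A - lam I)^r) = null((A - lam I)^{r+1}),
    found by tracking nullities until two consecutive ones agree."""
    n = A.shape[0]
    M = A - lam * np.eye(n)
    dims, Pk = [0], np.eye(n)
    while True:
        Pk = Pk @ M
        d = n - np.linalg.matrix_rank(Pk, tol=tol)
        if d == dims[-1]:
            return len(dims) - 1
        dims.append(d)

# a single 3x3 Jordan block with eigenvalue 2 has index 3:
J3 = 2 * np.eye(3) + np.diag([1.0, 1.0], k=1)
assert index_of(J3, 2) == 3

# a diagonalizable eigenvalue has index 1:
D = np.diag([2.0, 2.0, 5.0])
assert index_of(D, 2) == 1
```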
Step 3: Let x ∈ N_λ. Then (A − λI)^r x = 0. If x ≠ 0, then there exists an integer s ≤ r such that (A − λI)^s x = 0 but (A − λI)^{s−1} x ≠ 0. Such a vector x is referred to as a generalized eigenvector of rank s corresponding to λ. A useful fact is that generalized eigenvectors of different ranks corresponding to the eigenvalue λ are linearly independent.

Step 4: Consider N_λ with the index of λ being r, and let

$$N_i = \{x \in \mathbb{C}^n \mid (A - \lambda I)^i x = 0\}, \quad i = 1, 2, \ldots, r.$$

Then N_1 ⊆ N_2 ⊆ ··· ⊆ N_r = N_λ. Note that the set N_{i+1} \ N_i contains all generalized eigenvectors of rank i+1. We may form a basis for each N_{i+1} by combining a set of basis vectors of N_i and a set of linearly independent generalized eigenvectors of rank i+1. Let T_i be the subspace spanned by these generalized eigenvectors. Then

$$N_{i+1} = N_i \oplus T_i, \quad i = 1, 2, \ldots, r-1.$$

With T_0 = N_1, we then have the decomposition for each N_λ:

$$N_\lambda = T_{r-1} \oplus T_{r-2} \oplus \cdots \oplus T_1 \oplus T_0. \tag{6}$$

Furthermore, if vectors x_1, x_2, ..., x_p in T_i are linearly independent, then the vectors

$$(A - \lambda I)^j x_1,\ (A - \lambda I)^j x_2,\ \ldots,\ (A - \lambda I)^j x_p, \quad j = 1, 2, \ldots, i, \tag{7}$$

are also linearly independent.

Step 5: We now choose a basis for N_λ that gives the Jordan blocks of eigenvalue λ. Let the dimension of T_i be d_i and choose a basis for each T_i, i = 0, 1, ..., r−1, to form a basis of N_λ by (6) and (7), as follows:

T_{r-1}: x_1, ..., x_{d_{r-1}};
T_{r-2}: (A−λI)x_1, ..., (A−λI)x_{d_{r-1}}, x_{d_{r-1}+1}, ..., x_{d_{r-2}};
T_{r-3}: (A−λI)²x_1, ..., (A−λI)²x_{d_{r-1}}, (A−λI)x_{d_{r-1}+1}, ..., (A−λI)x_{d_{r-2}}, x_{d_{r-2}+1}, ..., x_{d_{r-3}};
...
T_0: (A−λI)^{r-1}x_1, ..., (A−λI)^{r-1}x_{d_{r-1}}, (A−λI)^{r-2}x_{d_{r-1}+1}, ..., (A−λI)x_{d_1}, x_{d_1+1}, ..., x_{d_0}.

Step 6: Rearrange the basis vectors just obtained. For x_1, we have

$$(A - \lambda I)^{r-1} x_1,\ (A - \lambda I)^{r-2} x_1,\ \ldots,\ (A - \lambda I) x_1,\ x_1.$$
Let P_{x_1} be the n × r matrix having these linearly independent vectors as columns. Then (A − λI)P_{x_1} = P_{x_1}K, where K is the r × r matrix with superdiagonal entries 1 and 0 elsewhere. Thus

$$A P_{x_1} = P_{x_1} J, \tag{8}$$

where J is the r × r Jordan block with diagonal entries λ. Repeating the process for each x_j in the basis of N_λ, one obtains a Jordan form J_λ, a direct sum of the Jordan blocks of A belonging to λ, through a matrix P_λ with linearly independent column vectors. In symbols, A P_λ = P_λ J_λ. Now let λ_1, λ_2, ..., λ_k be the distinct eigenvalues of A ∈ M_n(C) and set P = (P_{λ_1}, P_{λ_2}, ..., P_{λ_k}). Then P is invertible and P^{-1}AP is a Jordan form of A.

For quaternion matrices, with Q^n viewed as a right vector space over Q and with the (right) eigenvalues of the matrices, this approach fails at the early stages in Steps 1 and 2; N_λ is not an invariant subspace of A in general, since (A − λI)^i and A do not commute if A has quaternion entries. Thus the decomposition (5) for C^n does not hold for Q^n in general.

Aiming to resolve Problem 2 rather than dealing with the quaternion matrices directly, our strategy is to follow the above steps and show that any matrix in Σ has Jordan blocks in conjugate pairs. An invertible matrix P in Σ which gives the similarity of A to its Jordan form is obtained by suitably choosing adjoint vectors to form bases for N_λ and N_{λ̄} if λ is nonreal, and for N_λ itself if λ is real. To this end, we need the basic properties of the matrices in Σ. Notice that Σ is closed under addition, multiplication, and inversion; namely, A, B ∈ Σ implies A + B, AB, A^{-1} ∈ Σ if the inverse exists.

We observe that the nonreal eigenvalues of a matrix in Σ occur in conjugate pairs. A stronger result is the following statement [12].

Lemma 4. An n-square complex matrix A is similar to Ā (equivalently, to A*) if and only if the Jordan blocks of the nonreal eigenvalues of A occur in conjugate pairs.
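The chain construction of Steps 5–6 and the identity (8) can be checked numerically. A sketch, with the example matrix chosen here (any matrix similar to a single Jordan block works):

```python
import numpy as np

lam, r = 5.0, 4
J = lam * np.eye(r) + np.diag(np.ones(r - 1), k=1)   # one r x r Jordan block

rng = np.random.default_rng(3)
S = rng.standard_normal((r, r))
A = S @ J @ np.linalg.inv(S)   # a matrix whose Jordan form is the single block J
x = S[:, -1]                   # a generalized eigenvector of rank r for lam

M = A - lam * np.eye(r)
# columns (A - lam I)^{r-1} x, ..., (A - lam I) x, x, as in Step 6
Px = np.column_stack([np.linalg.matrix_power(M, j) @ x for j in range(r - 1, -1, -1)])

assert np.linalg.matrix_rank(Px) == r   # the Jordan chain is linearly independent
assert np.allclose(A @ Px, Px @ J)      # identity (8): A P_x = P_x J
```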
As a consequence, any matrix A ∈ Σ is similar to a Jordan canonical form J ⊕ J̄ ⊕ R, where J has nonreal diagonal entries and R is real. Note that if A has no real eigenvalues then Problem 1 is settled, and that A and Ā are not necessarily similar if only the occurrence of the nonreal eigenvalues in conjugate pairs is assumed.
Now that the eigenvalues of A ∈ Σ occur in conjugate pairs, we have the decomposition (5) for C^{2n}:

$$\mathbb{C}^{2n} = N_{\lambda_1} \oplus N_{\bar\lambda_1} \oplus N_{\lambda_2} \oplus N_{\bar\lambda_2} \oplus \cdots \oplus N_{\lambda_p} \oplus N_{\bar\lambda_p} \oplus N_{\lambda_{p+1}} \oplus \cdots \oplus N_{\lambda_q}, \tag{9}$$

where the λ's are distinct eigenvalues of A, with λ_1, ..., λ_p nonreal and λ_{p+1}, ..., λ_q real. With (9) we shall proceed to Step 4 and consider N_λ and N_{λ̄} together if λ is nonreal. For the real case, we choose paired vectors v, v^ε to form a basis for N_λ.

Theorem 1. For any A ∈ Σ there exists an invertible matrix P ∈ Σ such that P^{-1}AP = J ⊕ J̄ ∈ Σ is a Jordan canonical form of A, where J has all its diagonal entries with nonnegative imaginary parts.

Proof. Let λ be a nonreal eigenvalue of A. Then λ̄ is also a nonreal eigenvalue of A, and λ ≠ λ̄. We are to pair the generalized eigenspaces N_λ and N_{λ̄}. If the index of λ is r, then so is the index of λ̄, because, by Lemma 3,

$$(A - \lambda I)^i x = 0 \iff (A - \bar\lambda I)^i x^{\varepsilon} = 0. \tag{10}$$

If we denote the analogues for λ̄ of the N_i and T_i of Step 4 by N_i′ and T_i′, then (6) and (7) hold side by side for N_λ and N_{λ̄}. Furthermore, the basis vectors for T_i in Step 5 that are used to form the basis of N_λ will produce, by Lemma 2 and with the replacement of λ by λ̄ and of x by x^ε, the basis vectors for T_i′ to form the basis of N_{λ̄}. Since λ and λ̄ are different eigenvalues of A and the decomposition (9) is a direct sum, all the vectors just obtained for N_λ and N_{λ̄} are linearly independent. Thus, as in identity (8) of Step 6, AP_x = P_x J and AP_{x^ε} = P_{x^ε} J̄ both hold. Therefore

$$A(P_x, P_{x^{\varepsilon}}) = (P_x, P_{x^{\varepsilon}})(J \oplus \bar J).$$

The Jordan blocks J and J̄ are thus paired. Note that the column vectors of (P_x, P_{x^ε}) are linearly independent. Repeat this process for every nonreal eigenvalue λ of A to pair all the Jordan blocks J and J̄ of λ and λ̄.

Now we deal with the case in which λ is real. To carry out Step 5, we first show that T_{r-1} has a basis consisting of vectors in paired form v, v^ε. Let 0 ≠ x_1 ∈ T_{r-1} be of rank r. Then the rank of x_1^ε is also r by (10). Thus x_1^ε ∈ T_{r-1}, and x_1 and x_1^ε are linearly independent. If the dimension of T_{r-1} is 2, we then turn to T_{r-2}.
Otherwise, let x_2 ∈ T_{r-1} be linearly independent of x_1 and
x_1^ε. Then by Lemma 3, x_1, x_1^ε, x_2, x_2^ε are linearly independent. Continuing this way, we have a basis for T_{r-1} consisting of paired vectors x_1, x_1^ε, ..., x_k, x_k^ε for some k. For T_{r-2}, we apply A − λI to the above vectors to get linearly independent vectors of rank r−1:

$$(A - \lambda I)x_1,\ (A - \lambda I)x_1^{\varepsilon},\ \ldots,\ (A - \lambda I)x_k,\ (A - \lambda I)x_k^{\varepsilon}. \tag{11}$$

Note that each (A − λI)x_i^ε is the adjoint of (A − λI)x_i, since λ is real. The vectors in (11) are also in pairs v, v^ε. If x_{k+1} is a vector of rank r−1 which is not a linear combination of the vectors in (11), then by Lemma 3 the vectors in (11) together with x_{k+1} and x_{k+1}^ε are linearly independent. In this way we obtain a basis of paired vectors v, v^ε for T_{r-2}. The same idea applies to all T_i in Step 5 to yield basis vectors of N_λ in pairs

$$v = (A - \lambda I)^i x, \qquad v^{\varepsilon} = (A - \lambda I)^i x^{\varepsilon}$$

(regarding (A − λI)^0 = I). Now in Step 6, we have AP_x = P_x J and AP_{x^ε} = P_{x^ε} J. It follows that A(P_x, P_{x^ε}) = (P_x, P_{x^ε})(J ⊕ J). The real Jordan blocks are thus paired. Note that the column vectors of (P_x, P_{x^ε}) are linearly independent. The same argument works for all real eigenvalues.

We now put together all the matrices P_x and P_{x^ε} obtained from the above process for all eigenvalues λ, real and nonreal, to form a 2n × 2n matrix P. Then AP = P(J ⊕ J̄) (permutations may be applied if necessary). The matrix P is invertible, because its column vectors are the basis vectors of the subspaces in the direct sum decomposition (9). It is readily seen that P is in Σ.

As a consequence, we have the following result.

Theorem 2. Let A be an n × n quaternion matrix with distinct (right) eigenvalues λ_1, λ_2, ..., λ_s, s ≤ n, whose imaginary parts are nonnegative. Then there exists an invertible quaternion matrix P such that P^{-1}AP = J_1 ⊕ J_2 ⊕ ··· ⊕ J_s is in a unique Jordan canonical form, where each J_t is a direct sum of the Jordan blocks of λ_t. The uniqueness follows immediately from that of the complex case.
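The eigenvalue pairing underlying Theorems 1 and 2 can be observed numerically for a random quaternion matrix A = A_1 + A_2 j through its adjoint (generic position is assumed, so no real eigenvalues occur in this sketch):

```python
import numpy as np

rng = np.random.default_rng(2)
n = 4
A1 = rng.standard_normal((n, n)) + 1j * rng.standard_normal((n, n))
A2 = rng.standard_normal((n, n)) + 1j * rng.standard_normal((n, n))
PhiA = np.block([[A1, A2], [-A2.conj(), A1.conj()]])   # phi(A) as in (1)

w = np.linalg.eigvals(PhiA)

# the 2n eigenvalues of phi(A) occur in conjugate pairs ...
assert all(np.min(np.abs(w - z.conj())) < 1e-8 for z in w)
# ... so generically n of them have positive imaginary part: these are the
# right eigenvalues of A that appear on the diagonal of J
assert np.count_nonzero(w.imag > 0) == n
```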
4. Wiegmann's proof

It has been evident that some quaternion matrix problems (such as the existence of left eigenvalues) are much more difficult than expected (see [2] and [10]). It does not seem easy to give a direct and elementary proof of Theorem 2. Wiegmann
studied Problem 2 using only complex matrix theory. For A ∈ Σ, as a square complex matrix, A has a Jordan form J via an invertible matrix P; P and J need not be in Σ. The idea of Wiegmann's proof is to change the column vectors of P step by step to linearly independent vectors that span the same subspaces as certain column vectors of P, without altering the relations of the vectors to the partitioned matrix A, so that the resulting matrix P is of the desired form. But there is an invalid statement in his proof.

To understand Wiegmann's idea and see what is wrong with the proof, we follow his line of proof, with ε instead of * for the adjoint vectors, since the latter is now commonly used for the conjugate transpose in matrix theory. In addition, our A ∈ Σ is the same as the matrix A in [8].

Let A be a matrix in Σ, and let P be an invertible complex matrix such that

$$P^{-1}AP = J \quad\text{or}\quad AP = PJ, \tag{12}$$

where J is an ordinary Jordan form of A, i.e., a direct sum of Jordan blocks of the form

$$\begin{pmatrix} \lambda & 1 & & \\ & \lambda & \ddots & \\ & & \ddots & 1 \\ & & & \lambda \end{pmatrix}.$$

Thus each column vector v of P satisfies one and only one of the following relations: (i) Av = λv, or (ii) Av = w + λv, where w is the column vector adjacent to v on the left.

Let λ_1, λ_2, ..., λ_m be the distinct eigenvalues of A, and let V_{λ_i} be the vector space spanned by the column vectors of P corresponding to λ_i, i = 1, 2, ..., m. Then

$$\mathbb{C}^{2n} = V_{\lambda_1} \oplus \cdots \oplus V_{\lambda_m}.$$

Since the above decomposition is a direct sum, one may focus on a typical subspace V_λ. The strategy of Wiegmann's proof is to choose basis vectors in pairs v, v^ε for V_λ if λ is real, and for V_λ and V_{λ̄} if λ is nonreal, which satisfy the relations (i) and (ii). We shall see a false statement in Wiegmann's proof for the real case.

Let λ be a real eigenvalue of A. Consider the column vectors of P that correspond to λ. If v_1 denotes the first column vector of P for the first Jordan block of λ, write
the remaining vectors of P for this block successively as v_1^{(1)}, v_1^{(2)}, ..., illustrated by the block

$$\begin{pmatrix} \lambda & 1 & 0 \\ 0 & \lambda & 1 \\ 0 & 0 & \lambda \end{pmatrix}.$$

Note that v_1 is an eigenvector of A belonging to λ. Repeating for all other Jordan blocks of λ, we label the column vectors of P corresponding to the second, third, ... Jordan blocks of λ respectively as

v_2, v_2^{(1)}, v_2^{(2)}, ...;  v_3, v_3^{(1)}, v_3^{(2)}, ...;  ...

If we write Span S to denote the space generated by the vectors in S, then

$$V_\lambda = \operatorname{Span}\{v_1, v_2, v_3, \ldots\} \oplus \operatorname{Span}\{v_1^{(1)}, v_2^{(1)}, v_3^{(1)}, \ldots\} \oplus \operatorname{Span}\{v_1^{(2)}, v_2^{(2)}, v_3^{(2)}, \ldots\} \oplus \cdots \tag{13}$$

Note that the first summand Span{v_1, v_2, v_3, ...} in (13) is the eigenspace of λ and is invariant under A. Wiegmann's idea is to replace the basis vectors in each span by linearly independent vectors of the form v, v^ε. Note that if v is a type (i) vector for A, so is v^ε by Lemma 3. One can show by using Lemma 2 that the vector space spanned by the type (i) vectors for λ is of even dimension and has a basis consisting of paired vectors of the form v, v^ε.

If v is a type (ii) vector, then there is a nonzero vector w such that

$$(A - \lambda I)v = w, \tag{14}$$

which, since λ is real, results in

$$(A - \lambda I)v^{\varepsilon} = w^{\varepsilon}. \tag{15}$$

Now let there be 2k column vectors v_1, v_2, ..., v_p, ..., v_{2k} of P of type (i) for λ, where p is the number of Jordan blocks of λ with size more than 1. Then

$$(A - \lambda I)v_i^{(1)} = v_i, \quad i = 1, 2, \ldots, p.$$

It can be shown that p is even, for if v_1 and v_1^{(1)} satisfy (15), then so do v_1^ε and (v_1^{(1)})^ε. Moreover, if v_1 and v_1^{(1)} are linearly independent, then so are v_1^ε and (v_1^{(1)})^ε by Lemma 1. Either p = 2 or the process can be continued, so that p = 2q. It was concluded in Wiegmann's proof that in this way we see that there
exists a set of linearly independent vectors v_1, v_1^ε, ..., v_q, v_q^ε, ..., v_k, v_k^ε such that v_1^{(1)}, (v_1^{(1)})^ε, ..., v_q^{(1)}, (v_q^{(1)})^ε provide a basis for the space spanned by v_1^{(1)}, ..., v_{2q}^{(1)} as taken above, where v_i and v_i^{(1)} are related as above, and that the process can be repeated for the v_j^{(2)}'s, v_j^{(3)}'s, ..., until 2n linearly independent vectors which form a matrix P of the desired form are obtained. Putting this all in symbols:

$$\operatorname{Span}\{v_1, v_2, v_3, \ldots\} = \operatorname{Span}\{v_1, v_1^{\varepsilon}, \ldots\}, \tag{16}$$

$$\operatorname{Span}\{v_1^{(1)}, v_2^{(1)}, v_3^{(1)}, \ldots\} = \operatorname{Span}\{v_1^{(1)}, (v_1^{(1)})^{\varepsilon}, \ldots\}, \tag{17}$$

$$\operatorname{Span}\{v_1^{(2)}, v_2^{(2)}, v_3^{(2)}, \ldots\} = \operatorname{Span}\{v_1^{(2)}, (v_1^{(2)})^{\varepsilon}, \ldots\}, \quad \ldots$$

We point out that the assertion that v_1^{(1)}, (v_1^{(1)})^ε, ..., v_q^{(1)}, (v_q^{(1)})^ε provide a basis for the space spanned by v_1^{(1)}, ..., v_{2q}^{(1)} is false. That is, (17) and the others, except (16), do not hold in general. Thus the process cannot be repeated. For a counterexample, take 4 × 4 matrices A = J (a direct sum of two Jordan blocks of size 2 for a real eigenvalue λ) and an invertible P with AP = PJ, where the first and the third column vectors of P are eigenvectors of A belonging to λ. Thus the column vectors of P from left to right are respectively v_1, v_1^{(1)}, v_2, v_2^{(1)}. One can arrange the entries of P so that v_1^{(1)} and (v_1^{(1)})^ε do not provide a basis for the space spanned by v_1^{(1)} and v_2^{(1)}, since (v_1^{(1)})^ε is not a linear combination of v_1^{(1)} and v_2^{(1)}. We conclude that Wiegmann's proof is false.

ACKNOWLEDGEMENTS

Work of the first author was supported in part by the Nova Faculty Development Funds. The work of the second author was supported by a project of the National Natural Science Foundation of China while he visited Harvard University.
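A concrete instance of this failure, with matrices chosen here for illustration (not necessarily those printed in the journal): take λ = 0 and A = J the direct sum of two nilpotent 2 × 2 Jordan blocks, so A ∈ Σ, and pick P with AP = PJ whose second column is (0, 1, 1, 0)^T.

```python
import numpy as np

def eps(v):
    """Adjoint vector of v = (v1, v2) in C^{2n}: (-conj(v2), conj(v1))."""
    n = v.size // 2
    return np.concatenate([-v[n:].conj(), v[:n].conj()])

N = np.array([[0., 1.], [0., 0.]])
A = J = np.block([[N, np.zeros((2, 2))], [np.zeros((2, 2)), N]])  # A = J in Sigma, lam = 0

# columns: v1, v1^(1), v2, v2^(1)  (entries chosen here for illustration)
P = np.array([[1., 0., 0., 0.],
              [0., 1., 0., 0.],
              [0., 1., 1., 0.],
              [0., 0., 0., 1.]])
assert np.linalg.matrix_rank(P) == 4
assert np.allclose(A @ P, P @ J)

v1, v1_1, v2, v2_1 = (P[:, i] + 0j for i in range(4))
assert np.allclose(eps(v1), v2)          # (16) does hold here: eps(v1) = v2

# but (17) fails: eps(v1^(1)) is NOT a linear combination of v1^(1) and v2^(1)
B = np.column_stack([v1_1, v2_1])
w = eps(v1_1)
coef, *_ = np.linalg.lstsq(B, w, rcond=None)
assert not np.allclose(B @ coef, w)
```

Here eps(v1^(1)) = (−1, 0, 0, 1)^T has a nonzero first entry, while every combination of v1^(1) = (0, 1, 1, 0)^T and v2^(1) = (0, 0, 0, 1)^T has first entry 0.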
References

[1] Adler, S. L. Quaternionic Quantum Mechanics and Quantum Fields. Oxford University Press: New York, 1995.
[2] Cohn, P. M. Skew Field Constructions. Cambridge University Press: London, 1977.
[3] Horn, R. A.; Johnson, C. R. Matrix Analysis. Cambridge University Press: New York, 1985.
[4] Huang, L. Jordan Canonical Form of a Matrix over the Quaternion Field. Northeast Math. J. (China) 1994, 10(1).
[5] Lancaster, P.; Tismenetsky, M. The Theory of Matrices. Academic Press: Orlando, 1985.
[6] So, W.; Thompson, R. C.; Zhang, F. The Numerical Range of Normal Matrices with Quaternion Entries. Linear and Multilinear Algebra 1994, 37.
[7] Ward, J. P. Quaternions and Cayley Numbers. Kluwer Academic Publishers: Norwell, 1997.
[8] Wiegmann, N. A. Some Theorems on Matrices with Real Quaternion Elements. Canad. J. Math. 1955, 7.
[9] Wolf, L. A. Similarity of Matrices in Which the Elements Are Real Quaternions. Bull. Amer. Math. Soc. 1936, 42.
[10] Wood, R. M. W. Quaternionic Eigenvalues. Bull. London Math. Soc. 1984, 17.
[11] Zhang, F. Quaternions and Matrices of Quaternions. Linear Algebra and Its Applications 1997, 251.
[12] Zhang, F. Matrix Theory: Basic Results and Techniques. Springer: New York, 1999.
More informationA NOTE ON THE JORDAN CANONICAL FORM
A NOTE ON THE JORDAN CANONICAL FORM H. Azad Department of Mathematics and Statistics King Fahd University of Petroleum & Minerals Dhahran, Saudi Arabia hassanaz@kfupm.edu.sa Abstract A proof of the Jordan
More informationLinear Algebra 1. M.T.Nair Department of Mathematics, IIT Madras. and in that case x is called an eigenvector of T corresponding to the eigenvalue λ.
Linear Algebra 1 M.T.Nair Department of Mathematics, IIT Madras 1 Eigenvalues and Eigenvectors 1.1 Definition and Examples Definition 1.1. Let V be a vector space (over a field F) and T : V V be a linear
More informationMTH 464: Computational Linear Algebra
MTH 464: Computational Linear Algebra Lecture Outlines Exam 2 Material Prof. M. Beauregard Department of Mathematics & Statistics Stephen F. Austin State University March 2, 2018 Linear Algebra (MTH 464)
More information235 Final exam review questions
5 Final exam review questions Paul Hacking December 4, 0 () Let A be an n n matrix and T : R n R n, T (x) = Ax the linear transformation with matrix A. What does it mean to say that a vector v R n is an
More informationYimin Wei a,b,,1, Xiezhang Li c,2, Fanbin Bu d, Fuzhen Zhang e. Abstract
Linear Algebra and its Applications 49 (006) 765 77 wwwelseviercom/locate/laa Relative perturbation bounds for the eigenvalues of diagonalizable and singular matrices Application of perturbation theory
More informationBASIC ALGORITHMS IN LINEAR ALGEBRA. Matrices and Applications of Gaussian Elimination. A 2 x. A T m x. A 1 x A T 1. A m x
BASIC ALGORITHMS IN LINEAR ALGEBRA STEVEN DALE CUTKOSKY Matrices and Applications of Gaussian Elimination Systems of Equations Suppose that A is an n n matrix with coefficents in a field F, and x = (x,,
More information6 Inner Product Spaces
Lectures 16,17,18 6 Inner Product Spaces 6.1 Basic Definition Parallelogram law, the ability to measure angle between two vectors and in particular, the concept of perpendicularity make the euclidean space
More informationMath 4A Notes. Written by Victoria Kala Last updated June 11, 2017
Math 4A Notes Written by Victoria Kala vtkala@math.ucsb.edu Last updated June 11, 2017 Systems of Linear Equations A linear equation is an equation that can be written in the form a 1 x 1 + a 2 x 2 +...
More informationLecture notes: Applied linear algebra Part 1. Version 2
Lecture notes: Applied linear algebra Part 1. Version 2 Michael Karow Berlin University of Technology karow@math.tu-berlin.de October 2, 2008 1 Notation, basic notions and facts 1.1 Subspaces, range and
More informationEigenvalues, Eigenvectors, and Diagonalization
Math 240 TA: Shuyi Weng Winter 207 February 23, 207 Eigenvalues, Eigenvectors, and Diagonalization The concepts of eigenvalues, eigenvectors, and diagonalization are best studied with examples. We will
More informationLinear algebra II Tutorial solutions #1 A = x 1
Linear algebra II Tutorial solutions #. Find the eigenvalues and the eigenvectors of the matrix [ ] 5 2 A =. 4 3 Since tra = 8 and deta = 5 8 = 7, the characteristic polynomial is f(λ) = λ 2 (tra)λ+deta
More information4.1 Eigenvalues, Eigenvectors, and The Characteristic Polynomial
Linear Algebra (part 4): Eigenvalues, Diagonalization, and the Jordan Form (by Evan Dummit, 27, v ) Contents 4 Eigenvalues, Diagonalization, and the Jordan Canonical Form 4 Eigenvalues, Eigenvectors, and
More informationw T 1 w T 2. w T n 0 if i j 1 if i = j
Lyapunov Operator Let A F n n be given, and define a linear operator L A : C n n C n n as L A (X) := A X + XA Suppose A is diagonalizable (what follows can be generalized even if this is not possible -
More informationREVIEW FOR EXAM III SIMILARITY AND DIAGONALIZATION
REVIEW FOR EXAM III The exam covers sections 4.4, the portions of 4. on systems of differential equations and on Markov chains, and..4. SIMILARITY AND DIAGONALIZATION. Two matrices A and B are similar
More informationMATH 315 Linear Algebra Homework #1 Assigned: August 20, 2018
Homework #1 Assigned: August 20, 2018 Review the following subjects involving systems of equations and matrices from Calculus II. Linear systems of equations Converting systems to matrix form Pivot entry
More information2 Eigenvectors and Eigenvalues in abstract spaces.
MA322 Sathaye Notes on Eigenvalues Spring 27 Introduction In these notes, we start with the definition of eigenvectors in abstract vector spaces and follow with the more common definition of eigenvectors
More informationJordan Normal Form Revisited
Mathematics & Statistics Auburn University, Alabama, USA Oct 3, 2007 Jordan Normal Form Revisited Speaker: Tin-Yau Tam Graduate Student Seminar Page 1 of 19 tamtiny@auburn.edu Let us start with some online
More information1. What is the determinant of the following matrix? a 1 a 2 4a 3 2a 2 b 1 b 2 4b 3 2b c 1. = 4, then det
What is the determinant of the following matrix? 3 4 3 4 3 4 4 3 A 0 B 8 C 55 D 0 E 60 If det a a a 3 b b b 3 c c c 3 = 4, then det a a 4a 3 a b b 4b 3 b c c c 3 c = A 8 B 6 C 4 D E 3 Let A be an n n matrix
More informationSolving a system by back-substitution, checking consistency of a system (no rows of the form
MATH 520 LEARNING OBJECTIVES SPRING 2017 BROWN UNIVERSITY SAMUEL S. WATSON Week 1 (23 Jan through 27 Jan) Definition of a system of linear equations, definition of a solution of a linear system, elementary
More informationEigenvalues and Eigenvectors
November 3, 2016 1 Definition () The (complex) number λ is called an eigenvalue of the n n matrix A provided there exists a nonzero (complex) vector v such that Av = λv, in which case the vector v is called
More informationEigenvalues and Eigenvectors
CHAPTER Eigenvalues and Eigenvectors CHAPTER CONTENTS. Eigenvalues and Eigenvectors 9. Diagonalization. Complex Vector Spaces.4 Differential Equations 6. Dynamical Systems and Markov Chains INTRODUCTION
More informationEigenvalues and Eigenvectors
Eigenvalues and Eigenvectors week -2 Fall 26 Eigenvalues and eigenvectors The most simple linear transformation from R n to R n may be the transformation of the form: T (x,,, x n ) (λ x, λ 2,, λ n x n
More informationChapter 4 & 5: Vector Spaces & Linear Transformations
Chapter 4 & 5: Vector Spaces & Linear Transformations Philip Gressman University of Pennsylvania Philip Gressman Math 240 002 2014C: Chapters 4 & 5 1 / 40 Objective The purpose of Chapter 4 is to think
More informationThe Jordan canonical form
The Jordan canonical form Francisco Javier Sayas University of Delaware November 22, 213 The contents of these notes have been translated and slightly modified from a previous version in Spanish. Part
More informationMATH 304 Linear Algebra Lecture 20: The Gram-Schmidt process (continued). Eigenvalues and eigenvectors.
MATH 304 Linear Algebra Lecture 20: The Gram-Schmidt process (continued). Eigenvalues and eigenvectors. Orthogonal sets Let V be a vector space with an inner product. Definition. Nonzero vectors v 1,v
More informationChap 3. Linear Algebra
Chap 3. Linear Algebra Outlines 1. Introduction 2. Basis, Representation, and Orthonormalization 3. Linear Algebraic Equations 4. Similarity Transformation 5. Diagonal Form and Jordan Form 6. Functions
More informationRemark 1 By definition, an eigenvector must be a nonzero vector, but eigenvalue could be zero.
Sec 5 Eigenvectors and Eigenvalues In this chapter, vector means column vector Definition An eigenvector of an n n matrix A is a nonzero vector x such that A x λ x for some scalar λ A scalar λ is called
More informationLinear Algebra 2 Spectral Notes
Linear Algebra 2 Spectral Notes In what follows, V is an inner product vector space over F, where F = R or C. We will use results seen so far; in particular that every linear operator T L(V ) has a complex
More informationSUPPLEMENT TO CHAPTERS VII/VIII
SUPPLEMENT TO CHAPTERS VII/VIII The characteristic polynomial of an operator Let A M n,n (F ) be an n n-matrix Then the characteristic polynomial of A is defined by: C A (x) = det(xi A) where I denotes
More informationMath Final December 2006 C. Robinson
Math 285-1 Final December 2006 C. Robinson 2 5 8 5 1 2 0-1 0 1. (21 Points) The matrix A = 1 2 2 3 1 8 3 2 6 has the reduced echelon form U = 0 0 1 2 0 0 0 0 0 1. 2 6 1 0 0 0 0 0 a. Find a basis for the
More informationTopics in linear algebra
Chapter 6 Topics in linear algebra 6.1 Change of basis I want to remind you of one of the basic ideas in linear algebra: change of basis. Let F be a field, V and W be finite dimensional vector spaces over
More informationMath Spring 2011 Final Exam
Math 471 - Spring 211 Final Exam Instructions The following exam consists of three problems, each with multiple parts. There are 15 points available on the exam. The highest possible score is 125. Your
More information2 b 3 b 4. c c 2 c 3 c 4
OHSx XM511 Linear Algebra: Multiple Choice Questions for Chapter 4 a a 2 a 3 a 4 b b 1. What is the determinant of 2 b 3 b 4 c c 2 c 3 c 4? d d 2 d 3 d 4 (a) abcd (b) abcd(a b)(b c)(c d)(d a) (c) abcd(a
More informationIMPORTANT DEFINITIONS AND THEOREMS REFERENCE SHEET
IMPORTANT DEFINITIONS AND THEOREMS REFERENCE SHEET This is a (not quite comprehensive) list of definitions and theorems given in Math 1553. Pay particular attention to the ones in red. Study Tip For each
More informationUniversity of Colorado Denver Department of Mathematical and Statistical Sciences Applied Linear Algebra Ph.D. Preliminary Exam May 25th, 2018
University of Colorado Denver Department of Mathematical and Statistical Sciences Applied Linear Algebra Ph.D. Preliminary Exam May 25th, 2018 Name: Exam Rules: This exam lasts 4 hours. There are 8 problems.
More informationContents. Preface for the Instructor. Preface for the Student. xvii. Acknowledgments. 1 Vector Spaces 1 1.A R n and C n 2
Contents Preface for the Instructor xi Preface for the Student xv Acknowledgments xvii 1 Vector Spaces 1 1.A R n and C n 2 Complex Numbers 2 Lists 5 F n 6 Digression on Fields 10 Exercises 1.A 11 1.B Definition
More informationIMPORTANT DEFINITIONS AND THEOREMS REFERENCE SHEET
IMPORTANT DEFINITIONS AND THEOREMS REFERENCE SHEET This is a (not quite comprehensive) list of definitions and theorems given in Math 1553. Pay particular attention to the ones in red. Study Tip For each
More informationNONCOMMUTATIVE POLYNOMIAL EQUATIONS. Edward S. Letzter. Introduction
NONCOMMUTATIVE POLYNOMIAL EQUATIONS Edward S Letzter Introduction My aim in these notes is twofold: First, to briefly review some linear algebra Second, to provide you with some new tools and techniques
More informationJordan Canonical Form Homework Solutions
Jordan Canonical Form Homework Solutions For each of the following, put the matrix in Jordan canonical form and find the matrix S such that S AS = J. [ ]. A = A λi = λ λ = ( λ) = λ λ = λ =, Since we have
More informationStudy Guide for Linear Algebra Exam 2
Study Guide for Linear Algebra Exam 2 Term Vector Space Definition A Vector Space is a nonempty set V of objects, on which are defined two operations, called addition and multiplication by scalars (real
More informationReview of Linear Algebra Definitions, Change of Basis, Trace, Spectral Theorem
Review of Linear Algebra Definitions, Change of Basis, Trace, Spectral Theorem Steven J. Miller June 19, 2004 Abstract Matrices can be thought of as rectangular (often square) arrays of numbers, or as
More informationStat 159/259: Linear Algebra Notes
Stat 159/259: Linear Algebra Notes Jarrod Millman November 16, 2015 Abstract These notes assume you ve taken a semester of undergraduate linear algebra. In particular, I assume you are familiar with the
More informationMath 3191 Applied Linear Algebra
Math 9 Applied Linear Algebra Lecture 9: Diagonalization Stephen Billups University of Colorado at Denver Math 9Applied Linear Algebra p./9 Section. Diagonalization The goal here is to develop a useful
More informationEigenvalues and Eigenvectors
Chapter 1 Eigenvalues and Eigenvectors Among problems in numerical linear algebra, the determination of the eigenvalues and eigenvectors of matrices is second in importance only to the solution of linear
More informationGQE ALGEBRA PROBLEMS
GQE ALGEBRA PROBLEMS JAKOB STREIPEL Contents. Eigenthings 2. Norms, Inner Products, Orthogonality, and Such 6 3. Determinants, Inverses, and Linear (In)dependence 4. (Invariant) Subspaces 3 Throughout
More informationLinear Algebra. Min Yan
Linear Algebra Min Yan January 2, 2018 2 Contents 1 Vector Space 7 1.1 Definition................................. 7 1.1.1 Axioms of Vector Space..................... 7 1.1.2 Consequence of Axiom......................
More informationarxiv: v3 [math.ra] 22 Aug 2014
arxiv:1407.0331v3 [math.ra] 22 Aug 2014 Positivity of Partitioned Hermitian Matrices with Unitarily Invariant Norms Abstract Chi-Kwong Li a, Fuzhen Zhang b a Department of Mathematics, College of William
More informationMAPPING AND PRESERVER PROPERTIES OF THE PRINCIPAL PIVOT TRANSFORM
MAPPING AND PRESERVER PROPERTIES OF THE PRINCIPAL PIVOT TRANSFORM OLGA SLYUSAREVA AND MICHAEL TSATSOMEROS Abstract. The principal pivot transform (PPT) is a transformation of a matrix A tantamount to exchanging
More informationMath 4153 Exam 3 Review. The syllabus for Exam 3 is Chapter 6 (pages ), Chapter 7 through page 137, and Chapter 8 through page 182 in Axler.
Math 453 Exam 3 Review The syllabus for Exam 3 is Chapter 6 (pages -2), Chapter 7 through page 37, and Chapter 8 through page 82 in Axler.. You should be sure to know precise definition of the terms we
More informationThe Singular Value Decomposition
The Singular Value Decomposition Philippe B. Laval KSU Fall 2015 Philippe B. Laval (KSU) SVD Fall 2015 1 / 13 Review of Key Concepts We review some key definitions and results about matrices that will
More informationMath 113 Homework 5. Bowei Liu, Chao Li. Fall 2013
Math 113 Homework 5 Bowei Liu, Chao Li Fall 2013 This homework is due Thursday November 7th at the start of class. Remember to write clearly, and justify your solutions. Please make sure to put your name
More informationMath 102, Winter Final Exam Review. Chapter 1. Matrices and Gaussian Elimination
Math 0, Winter 07 Final Exam Review Chapter. Matrices and Gaussian Elimination { x + x =,. Different forms of a system of linear equations. Example: The x + 4x = 4. [ ] [ ] [ ] vector form (or the column
More informationProblem Set (T) If A is an m n matrix, B is an n p matrix and D is a p s matrix, then show
MTH 0: Linear Algebra Department of Mathematics and Statistics Indian Institute of Technology - Kanpur Problem Set Problems marked (T) are for discussions in Tutorial sessions (T) If A is an m n matrix,
More informationSpectral Theorem for Self-adjoint Linear Operators
Notes for the undergraduate lecture by David Adams. (These are the notes I would write if I was teaching a course on this topic. I have included more material than I will cover in the 45 minute lecture;
More informationDM554 Linear and Integer Programming. Lecture 9. Diagonalization. Marco Chiarandini
DM554 Linear and Integer Programming Lecture 9 Marco Chiarandini Department of Mathematics & Computer Science University of Southern Denmark Outline 1. More on 2. 3. 2 Resume Linear transformations and
More informationLecture Summaries for Linear Algebra M51A
These lecture summaries may also be viewed online by clicking the L icon at the top right of any lecture screen. Lecture Summaries for Linear Algebra M51A refers to the section in the textbook. Lecture
More informationMATH JORDAN FORM
MATH 53 JORDAN FORM Let A,, A k be square matrices of size n,, n k, respectively with entries in a field F We define the matrix A A k of size n = n + + n k as the block matrix A 0 0 0 0 A 0 0 0 0 A k It
More informationCalculating determinants for larger matrices
Day 26 Calculating determinants for larger matrices We now proceed to define det A for n n matrices A As before, we are looking for a function of A that satisfies the product formula det(ab) = det A det
More information