POLYNOMIAL EQUATIONS OVER MATRICES

Robert Lee Wilson

Here are two well-known facts about polynomial equations over the complex numbers (C):

(I) (Vieta Theorem) For any complex numbers x_1, ..., x_n (not necessarily distinct), there is a unique monic polynomial over C,

f(x) = x^n + a_{n-1}x^{n-1} + ... + a_1 x + a_0 = (x - x_1)(x - x_2) \cdots (x - x_n),

such that the equation f(x) = 0 has roots x_1, ..., x_n (counting multiplicities). Then

a_{n-i} = (-1)^i \sum_{j_1 < ... < j_i} x_{j_1} \cdots x_{j_i}.

(II) (Fundamental Theorem of Algebra) The equation over C

x^n + a_{n-1}x^{n-1} + ... + a_1 x + a_0 = 0

has a root. Consequently, it has exactly n roots (counting multiplicities).

This article describes the similar statements that can be made about a polynomial equation

X^n + A_{n-1}X^{n-1} + ... + A_1 X + A_0 = 0

over M_k(C), the algebra of k by k matrices over C. For simplicity I will assume n = k = 2. All the results can be extended to general n and k, but the notation gets more complicated. Such problems for polynomials of low degree (particularly for n = 2) have been treated by several authors, since they arise naturally in control theory and in functional analysis.

See, for example, [GHR], [LR]. For general n, the solution of the diagonalizable case is due to Fuchs and Schwarz [FS].

The differences between the situation for equations over the complex numbers and the situation for equations over matrices arise for two reasons: multiplication in M_k(C) is not commutative, and not all matrices in M_k(C) are invertible.

Analogues of the Vieta Theorem

Let X_1 and X_2 be roots of the quadratic equation X^2 + A_1 X + A_0 = 0 over any algebra (not necessarily commutative; e.g., M_k(C)). Then we have

X_1^2 + A_1 X_1 + A_0 = 0 and X_2^2 + A_1 X_2 + A_0 = 0.

Taking the difference gives

X_1^2 - X_2^2 + A_1(X_1 - X_2) = 0.

Replace X_1^2 - X_2^2 by

X_1(X_1 - X_2) + (X_1 - X_2)X_2.

(This is the noncommutative version of the well-known formula u^2 - v^2 = (u + v)(u - v) that holds in the commutative case.) We obtain

X_1(X_1 - X_2) + (X_1 - X_2)X_2 + A_1(X_1 - X_2) = 0.

Thus

(1) A_1(X_1 - X_2) = -X_1(X_1 - X_2) - (X_1 - X_2)X_2.

This has two consequences. First, if X_1 - X_2 is invertible we may multiply on the right by its inverse and obtain

A_1 = -X_1 - (X_1 - X_2)X_2(X_1 - X_2)^{-1}.

To make this look nicer, write y_1 = X_1 and

y_2 = (X_1 - X_2)X_2(X_1 - X_2)^{-1}.

Then

A_1 = -y_1 - y_2 and A_0 = -X_1^2 - A_1 X_1 = -y_1^2 + (y_1 + y_2)y_1 = y_2 y_1.

In other words:

A_1 = -(y_1 + y_2), A_0 = y_2 y_1.

These are close analogues of the formulas in the commutative case, but they involve rational functions, not just polynomials. This generalization of the Vieta Theorem (which can be extended to equations of degree n by using the theory of quasideterminants) is due to Gelfand and Retakh [GR1], [GR2].

Second, equation (1) may be rewritten as

(2) (X_1 + A_1)(X_1 - X_2) = -(X_1 - X_2)X_2.

Now work in M_2(C) and take X_1, X_2 to be a pair of matrices for which the row space of (X_1 - X_2)X_2 is not contained in the row space of X_1 - X_2 (a concrete such pair is exhibited in the sketch below). The rows of the matrix (X_1 + A_1)(X_1 - X_2) must be contained in the row space of the matrix X_1 - X_2. Thus it is impossible to satisfy (2), and hence there is no quadratic equation over M_2(C) with roots X_1 and X_2.
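Both consequences can be checked in a few lines. The sketch below assumes SymPy, and the matrices in it are our own illustrative choices, not taken from the article. Part (a) builds A_1 and A_0 from a pair of prospective roots by the formulas above and confirms that both matrices are roots; part (b) gives a concrete pair X_1, X_2 for which the row space of (X_1 - X_2)X_2 is not contained in that of X_1 - X_2, so that no quadratic equation over M_2(C) has both as roots.

    # Assumes SymPy; the matrices below are illustrative choices of ours.
    from sympy import Matrix

    # (a) Noncommutative Vieta: build A1, A0 from two prospective roots.
    X1 = Matrix([[1, 2], [0, 3]])
    X2 = Matrix([[0, 1], [1, 1]])
    D = X1 - X2                          # invertible here (det = 3)
    y1, y2 = X1, D * X2 * D.inv()
    A1, A0 = -(y1 + y2), y2 * y1
    print(X1**2 + A1*X1 + A0)            # the zero matrix
    print(X2**2 + A1*X2 + A0)            # the zero matrix

    # (b) An obstruction: here X1 - X2 has rank one, and the row space of
    # (X1 - X2)*X2 is not contained in the row space of X1 - X2, so no
    # quadratic equation over M_2(C) has both X1 and X2 as roots.
    X1 = Matrix([[1, 0], [0, 0]])
    X2 = Matrix([[0, 1], [0, 0]])
    D = X1 - X2
    print(D.rowspace())                  # spanned by (1, -1)
    print((D * X2).rowspace())           # spanned by (0, 1)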

Solutions of polynomial equations - how many solutions can there be?

Some examples:

(1) X^2 = \begin{pmatrix} 0 & 1 \\ 0 & 0 \end{pmatrix} has no solution. To see this, suppose that \begin{pmatrix} a & b \\ c & d \end{pmatrix} is a solution and observe that

\begin{pmatrix} a & b \\ c & d \end{pmatrix}^2 = \begin{pmatrix} a^2 + bc & b(a + d) \\ c(a + d) & bc + d^2 \end{pmatrix}.

If this is equal to \begin{pmatrix} 0 & 1 \\ 0 & 0 \end{pmatrix}, then b(a + d) = 1, so a + d ≠ 0. But c(a + d) = 0, so c = 0. But then (comparing the entries on the diagonal) a^2 = d^2 = 0, so a = d = 0, a contradiction. We will return to this example later and see another, less computational, way to analyze it.

(2) X^2 = \begin{pmatrix} 0 & 0 \\ 0 & 0 \end{pmatrix} has infinitely many solutions. For example, \begin{pmatrix} 0 & x \\ 0 & 0 \end{pmatrix} is a solution for any x ∈ C.

(3) X^2 = \begin{pmatrix} 1 & 0 \\ 0 & 4 \end{pmatrix} has four solutions:

X = \begin{pmatrix} \pm 1 & 0 \\ 0 & \pm 2 \end{pmatrix} (all four choices of signs).

It is clear that these four matrices are solutions. To see that there are no other solutions, let X = \begin{pmatrix} a & b \\ c & d \end{pmatrix}. Then, as before,

X^2 = \begin{pmatrix} a^2 + bc & b(a + d) \\ c(a + d) & bc + d^2 \end{pmatrix}.

If this is equal to \begin{pmatrix} 1 & 0 \\ 0 & 4 \end{pmatrix}, then

a^2 + bc = 1, bc + d^2 = 4 and b(a + d) = c(a + d) = 0.

If a + d ≠ 0, then b = c = 0, a^2 = 1 and d^2 = 4, giving the four solutions listed. If a + d = 0, then a = -d, so a^2 = d^2. But then 1 = a^2 + bc = bc + d^2 = 4, a contradiction.
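Examples (1) and (3) can also be checked mechanically by solving the entrywise equations; a small SymPy sketch (our own verification, not part of the original argument):

    # Assumes SymPy.  We solve the four entrywise polynomial equations
    # X^2 - B = 0 in the unknowns a, b, c, d.
    from sympy import Matrix, symbols, solve

    a, b, c, d = symbols('a b c d')
    X = Matrix([[a, b], [c, d]])

    def square_roots_of(B):
        return solve(list(X**2 - B), [a, b, c, d], dict=True)

    print(square_roots_of(Matrix([[0, 1], [0, 0]])))   # [] : no solutions, as in (1)
    print(square_roots_of(Matrix([[1, 0], [0, 4]])))   # the four solutions of (3)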

(4) X^2 = \begin{pmatrix} 1 & 0 \\ 0 & 1 \end{pmatrix} has infinitely many solutions. The same analysis as in (3) shows that

X = \begin{pmatrix} a & b \\ c & -a \end{pmatrix}

is a solution whenever a^2 + bc = 1. If b = c, these are just the matrices of reflections of C^2.

Here is a question we will answer: for what values of l is there a quadratic equation X^2 + A_1 X + A_0 = 0 with exactly l solutions? So far, we know that there is such an equation if l = 0, 4, or ∞. We have not yet seen the answer for l = 1, 2, 3, 5, 6, 7, 8, ....

We want to find all solutions to

(3) X^2 + A_1 X + A_0 = 0.

First we need to think about how we can specify a solution X. There are several ways to do this:

(a) Write down the entries of X, e.g.,

X = \begin{pmatrix} 1 & 2 \\ 0 & 3 \end{pmatrix}.

(b) Write down Xe_1 (the first column of X) and Xe_2 (the second column of X). E.g.,

Xe_1 = \begin{pmatrix} 1 \\ 0 \end{pmatrix}, Xe_2 = \begin{pmatrix} 2 \\ 3 \end{pmatrix}

describes the same matrix as in (a).

(c) Write down Xv_1 and Xv_2 for any basis {v_1, v_2} of C^2. E.g.,

X \begin{pmatrix} 1 \\ 0 \end{pmatrix} = \begin{pmatrix} 1 \\ 0 \end{pmatrix}, X \begin{pmatrix} 1 \\ 1 \end{pmatrix} = \begin{pmatrix} 3 \\ 3 \end{pmatrix}

describes the same matrix as in (a) and (b).

(d) Write down a basis {v_1, v_2} for C^2 consisting of eigenvectors for X and specify the corresponding eigenvalue for each eigenvector. (Recall that a nonzero vector v is an eigenvector for X with corresponding eigenvalue λ if Xv = λv.) Note that requiring that \begin{pmatrix} 1 \\ 0 \end{pmatrix} be an eigenvector for X with corresponding eigenvalue 1 and that \begin{pmatrix} 1 \\ 1 \end{pmatrix} be an eigenvector for X with corresponding eigenvalue 3 describes the same matrix as in (a), (b) and (c).

Note that replacing v_i by c_i v_i, where c_1, c_2 ≠ 0, describes the same matrix. Thus, if we let

S = \{ \begin{pmatrix} 1 \\ u \end{pmatrix} : u ∈ C \} ∪ \{ \begin{pmatrix} 0 \\ 1 \end{pmatrix} \},

we may assume that the eigenvectors are chosen from S. It is important to note that not every matrix can be described in this way. The matrices that can be described this way are called diagonalizable matrices.

Diagonalizable solutions

We can now find all diagonalizable solutions to equation (3). Let X be such a solution and let v be an eigenvector for X with corresponding eigenvalue λ. Then

0 = (X^2 + A_1 X + A_0)v = X^2 v + A_1 Xv + A_0 v = λ^2 v + λ A_1 v + A_0 v = (λ^2 I + λ A_1 + A_0)v.

This is equivalent to requiring that

det(λ^2 I + λ A_1 + A_0) = 0 and v ∈ Null(λ^2 I + λ A_1 + A_0).

Thus λ is a root of the equation

det(t^2 I + t A_1 + A_0) = 0,

a fourth degree equation. Each of its roots is a possible eigenvalue for X, and the corresponding eigenvectors may be taken from the nonzero elements of Null(λ^2 I + λ A_1 + A_0).

Now the matrix t^2 I + t A_1 + A_0 is a matrix with entries in C[t]. A general result (rational canonical form) says that

P(t)(t^2 I + t A_1 + A_0)Q(t) = \begin{pmatrix} f_1(t) & 0 \\ 0 & f_2(t) \end{pmatrix},

where P(t) and Q(t) are invertible matrices over C[t] with det P(t), det Q(t) nonzero constants, and f_1(t), f_2(t) are monic polynomials in t with f_1(t) dividing f_2(t). (The polynomial f_1(t) is the greatest common divisor of all of the entries of the matrix t^2 I + t A_1 + A_0.) Then

det(t^2 I + t A_1 + A_0) = f_1(t) f_2(t),

and λ is a root of det(t^2 I + t A_1 + A_0) = 0 if and only if (t - λ) divides f_2(t). Furthermore, if (t - λ) divides f_1(t), then λ^2 I + λ A_1 + A_0 = 0 and so

Null(λ^2 I + λ A_1 + A_0) = C^2.

If (t - λ) divides f_2(t) but (t - λ) does not divide f_1(t), then

Null(λ^2 I + λ A_1 + A_0) = Span \{ Q(λ) \begin{pmatrix} 0 \\ 1 \end{pmatrix} \}

is a one-dimensional subspace.

We can now describe how to find all diagonalizable solutions of equation (3). Denote the roots of det(t^2 I + t A_1 + A_0) = 0 by λ_i, 1 ≤ i ≤ m, where m ≤ 4.

First suppose f_1(t) = 1. Then, for any i, 1 ≤ i ≤ m, we have

dim Null(λ_i^2 I + λ_i A_1 + A_0) = 1.

Therefore this space is spanned by a single vector of S. Denote this vector by v_i. Then, for any pair (i, j), 1 ≤ i < j ≤ m, with v_i ≠ v_j, there is a solution X to equation (3) such that v_i is an eigenvector with eigenvalue λ_i and v_j is an eigenvector

with eigenvalue λ_j. Every diagonalizable solution to equation (3) arises in this way. Note that in this case there are at most 6 = \binom{4}{2} diagonalizable solutions.

Next suppose f_1(t) has a single root λ_1. Then m ≤ 3 (for f_2(t) has at most three roots, one of which is λ_1). In this case,

dim Null(λ_1^2 I + λ_1 A_1 + A_0) = 2 and dim Null(λ_i^2 I + λ_i A_1 + A_0) = 1 for 2 ≤ i ≤ m.

Therefore, for 2 ≤ i ≤ m, Null(λ_i^2 I + λ_i A_1 + A_0) is spanned by a single vector of S. Denote this vector by v_i. Now, since Null(λ_1^2 I + λ_1 A_1 + A_0) = C^2, λ_1 I is a solution. Also, if 2 ≤ i ≤ m and if v ∈ S with v ≠ v_i, there is a solution X to equation (3) such that v is an eigenvector for X with eigenvalue λ_1 and v_i is an eigenvector for X with eigenvalue λ_i. Here there are infinitely many choices for v and hence infinitely many solutions. In addition, if v_2 ≠ v_3, there is a solution X to equation (3) such that v_2 is an eigenvector for X with eigenvalue λ_2 and v_3 is an eigenvector for X with eigenvalue λ_3. Every diagonalizable solution to equation (3) arises in this way. Notice that in this case there are infinitely many diagonalizable solutions if and only if m > 1.

Finally, suppose f_1(t) has two roots λ_1 ≠ λ_2. Then f_1(t) = f_2(t), and λ_1, λ_2 are the only roots of det(t^2 I + t A_1 + A_0) = 0. Furthermore, dim Null(λ_i^2 I + λ_i A_1 + A_0) = 2 for i = 1, 2. Then λ_1 I and λ_2 I are solutions, and for any pair (v_1, v_2) of distinct elements of S there is a solution X to equation (3) such that v_1 is an eigenvector for X with eigenvalue λ_1 and v_2 is an eigenvector for X with eigenvalue λ_2. Every diagonalizable solution to equation (3) arises in this way. In this case there are infinitely many diagonalizable solutions.

This discussion goes over without serious change to the case of equations of degree n over M_k(C). The maximum finite number of diagonalizable solutions becomes \binom{nk}{k}. This maximum finite number of diagonalizable solutions (and the analysis of the problem using eigenvectors and eigenvalues) is due to Fuchs and Schwarz [FS].
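The recipe just described is easy to put into code. The sketch below (assuming SymPy; the function name diagonalizable_solutions is ours) computes the possible eigenvalues as the roots of det(t^2 I + t A_1 + A_0), takes the one-dimensional nullspaces, and assembles one solution for each pair of roots with independent eigenvectors. Roots whose nullspace is two-dimensional lead to infinite families, as explained above, and are simply skipped here.

    # Assumes SymPy.  Returns the isolated diagonalizable solutions of
    # X^2 + A1*X + A0 = 0 over M_2(C); infinite families are not listed.
    from itertools import combinations
    from sympy import Matrix, eye, symbols, roots

    t = symbols('t')

    def diagonalizable_solutions(A1, A0):
        M = t**2 * eye(2) + t * A1 + A0
        lam_list = list(roots(M.det(), t).keys())      # possible eigenvalues
        pairs = []
        for lam in lam_list:
            ns = M.subs(t, lam).nullspace()
            if len(ns) == 1:                           # dim Null = 1
                pairs.append((lam, ns[0]))
            # dim Null = 2 gives infinitely many solutions; skipped here
        sols = []
        for (l1, v1), (l2, v2) in combinations(pairs, 2):
            Q = Matrix.hstack(v1, v2)
            if Q.det() != 0:                           # independent eigenvectors
                sols.append(Q * Matrix([[l1, 0], [0, l2]]) * Q.inv())
        return sols

    # Example (3) above: X^2 = diag(1,4), i.e. A1 = 0, A0 = -diag(1,4).
    for X in diagonalizable_solutions(Matrix.zeros(2, 2), Matrix([[-1, 0], [0, -4]])):
        print(X)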

Examples:

(1) Find all diagonalizable solutions of

X^2 = \begin{pmatrix} 0 & 1 \\ 0 & 0 \end{pmatrix}.

The possible eigenvalues are the roots of

det(t^2 I - \begin{pmatrix} 0 & 1 \\ 0 & 0 \end{pmatrix}) = t^4.

Thus 0 is the only possible eigenvalue. The possible eigenvectors are the nonzero elements of

Null \begin{pmatrix} 0 & -1 \\ 0 & 0 \end{pmatrix}.

These are just the nonzero multiples of \begin{pmatrix} 1 \\ 0 \end{pmatrix}. Thus we cannot find a basis for C^2 consisting of eigenvectors for X, so there are no diagonalizable solutions.

(2) Find all diagonalizable solutions of

X^2 = \begin{pmatrix} 0 & 0 \\ 0 & 0 \end{pmatrix}.

The possible eigenvalues are the roots of det(t^2 I) = t^4, so, again, 0 is the only possible eigenvalue. The possible eigenvectors are the nonzero elements of the nullspace of the zero matrix, i.e., all nonzero vectors in C^2. Thus we may take any basis for C^2 and declare that both elements of the basis are eigenvectors with corresponding eigenvalue 0. This means that X is the zero matrix. Thus there is only one diagonalizable solution.

(3) Find all diagonalizable solutions of

X^2 = \begin{pmatrix} 1 & 0 \\ 0 & 4 \end{pmatrix}.

The possible eigenvalues are the roots of

det(t^2 I - \begin{pmatrix} 1 & 0 \\ 0 & 4 \end{pmatrix}) = det \begin{pmatrix} t^2 - 1 & 0 \\ 0 & t^2 - 4 \end{pmatrix} = (t^2 - 1)(t^2 - 4) = (t - 1)(t + 1)(t - 2)(t + 2).

Thus the possible eigenvalues are 1, -1, 2, -2. If 1 or -1 is an eigenvalue, the corresponding eigenvector must be a nonzero element in

Null \begin{pmatrix} 0 & 0 \\ 0 & -3 \end{pmatrix},

that is, a nonzero multiple of \begin{pmatrix} 1 \\ 0 \end{pmatrix}. If 2 or -2 is an eigenvalue, the corresponding eigenvector must be a nonzero element of

Null \begin{pmatrix} 3 & 0 \\ 0 & 0 \end{pmatrix},

that is, a nonzero multiple of \begin{pmatrix} 0 \\ 1 \end{pmatrix}.

Thus, in determining X, there are two choices (1 or -1) for the eigenvalue to assign to \begin{pmatrix} 1 \\ 0 \end{pmatrix} and two choices (2 or -2) for the eigenvalue to assign to \begin{pmatrix} 0 \\ 1 \end{pmatrix}. Thus there are four diagonalizable solutions.

(4) Find all diagonalizable solutions of

X^2 = \begin{pmatrix} 1 & 0 \\ 0 & 1 \end{pmatrix}.

The possible eigenvalues are the roots of

det(t^2 I - I) = det \begin{pmatrix} t^2 - 1 & 0 \\ 0 & t^2 - 1 \end{pmatrix} = (t - 1)^2 (t + 1)^2.

Thus the possible eigenvalues are 1 and -1. In either case, any vector in S is a possible eigenvector. Thus we may choose {v_1, v_2} to be any pair of distinct elements of S and declare both elements to be eigenvectors for 1. This gives the identity matrix. We may also declare both elements to be eigenvectors for -1. This gives -I. But we may also declare v_1 to be an eigenvector corresponding to 1 and v_2 to be an eigenvector corresponding to -1. There are infinitely many ways to do this, so there are infinitely many solutions. (Note that if we choose an orthogonal basis for C^2, this procedure produces the matrix of a reflection.)

Non-diagonalizable solutions

It is well-known that if a k by k matrix has k distinct eigenvalues, then the matrix is diagonalizable. Thus if a 2 by 2 matrix is not diagonalizable, it can have only one eigenvalue. Note that if X^2 + A_1 X + A_0 = 0 and if we set Y = X + rI, then

Y^2 + (A_1 - 2rI)Y + (r^2 I - r A_1 + A_0) = 0.

Since λ is an eigenvalue of X if and only if r + λ is an eigenvalue of Y, we may assume, by replacing X by X + rI for an appropriate r, that 0 is an eigenvalue of X. Since we are assuming that X is not diagonalizable and that 0 is an eigenvalue of X, we have that 0 is the only eigenvalue of X. This implies that X^2 = 0 (for the characteristic polynomial of X must be t^2 and, by the Cayley-Hamilton Theorem, X satisfies its characteristic polynomial).
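The substitution Y = X + rI is easy to confirm symbolically; here is a minimal SymPy check (ours), using a generic 2 by 2 matrix of symbols for each of X, A_1 and A_0.

    # Assumes SymPy.  Check that X^2 + A1*X + A0 equals
    # Y^2 + (A1 - 2rI)Y + (r^2 I - r A1 + A0) when Y = X + rI.
    from sympy import MatrixSymbol, symbols, eye

    r = symbols('r')
    X  = MatrixSymbol('x', 2, 2).as_explicit()   # generic 2x2 matrix of symbols
    A1 = MatrixSymbol('a', 2, 2).as_explicit()
    A0 = MatrixSymbol('b', 2, 2).as_explicit()
    I  = eye(2)

    Y = X + r*I
    lhs = X**2 + A1*X + A0
    rhs = Y**2 + (A1 - 2*r*I)*Y + (r**2*I - r*A1 + A0)
    print((lhs - rhs).expand())                  # the zero matrix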

Hence it is sufficient to consider nilpotent solutions of X^2 + A_1 X + A_0 = 0. Since we are assuming that X is not diagonalizable, X ≠ 0. Hence there is a linearly independent set {v_0, v_1} such that

X v_0 = 0 and X v_1 = v_0.

Then

0 = (X^2 + A_1 X + A_0)v_0 = X^2 v_0 + A_1 X v_0 + A_0 v_0 = A_0 v_0

and

0 = (X^2 + A_1 X + A_0)v_1 = X^2 v_1 + A_1 X v_1 + A_0 v_1 = A_1 v_0 + A_0 v_1.

Recall that C[t] denotes the algebra of polynomials in t with coefficients from the complex numbers. Let (t^2) denote the ideal in C[t] generated by t^2, that is, the set of all polynomials divisible by t^2. Then the quotient algebra C[t]/(t^2) denotes the algebra of polynomials where we set t^2 = 0. We write (C[t]/(t^2))^2 for the set of all

\begin{pmatrix} g_1(t) \\ g_2(t) \end{pmatrix}

where g_1(t), g_2(t) ∈ C[t]/(t^2). Note that M_2(C[t]) acts on (C[t]/(t^2))^2 by matrix multiplication. Note also that we may write any element in (C[t]/(t^2))^2 uniquely in the form w_0 + t w_1 where w_0, w_1 ∈ C^2. Now let v_0, v_1 be as above; that is,

A_0 v_0 = 0 and A_1 v_0 + A_0 v_1 = 0.

Note that this is equivalent to

(4) (t^2 I + t A_1 + A_0)(v_0 + t v_1) = 0 in (C[t]/(t^2))^2.

Recall that

P(t)(t^2 I + t A_1 + A_0)Q(t) = \begin{pmatrix} f_1(t) & 0 \\ 0 & f_2(t) \end{pmatrix},

where P(t) and Q(t) are invertible matrices over C[t] with det P(t), det Q(t) nonzero constants, and f_1(t), f_2(t) are monic polynomials in t with f_1(t) dividing f_2(t). The same decomposition holds over C[t]/(t^2). Note that (4) has no solution with {v_0, v_1} linearly independent unless t^2 divides f_2(t); therefore, from now on we assume t^2 divides f_2(t). Note that if t^2 divides f_1(t), then our equation is X^2 = 0 and the set of solutions is the set of all nilpotent matrices.

After setting t^2 = 0, the left-hand column of Q(t) may be written as p_0 + t p_1 and the right-hand column may be written as q_0 + t q_1, where p_0, p_1, q_0, q_1 ∈ C^2. Note that {p_0, q_0} is linearly independent, since p_0 and q_0 are the columns of the invertible matrix Q(0).

If f_1(t) = t, then the set of solutions of (4) is

C t Q(t) \begin{pmatrix} 1 \\ 0 \end{pmatrix} + C Q(t) \begin{pmatrix} 0 \\ 1 \end{pmatrix} + C t Q(t) \begin{pmatrix} 0 \\ 1 \end{pmatrix} = Span \{ t p_0, q_0 + t q_1, t q_0 \}.

Thus there is a solution X with

X(a p_0 + q_1) = q_0, X(q_0) = 0

whenever {a p_0 + q_1, q_0} is linearly independent. Since {p_0, q_0} is linearly independent, there are infinitely many such solutions.

If t does not divide f_1(t), then the set of solutions of (4) is

C Q(t) \begin{pmatrix} 0 \\ 1 \end{pmatrix} + C t Q(t) \begin{pmatrix} 0 \\ 1 \end{pmatrix} = Span \{ q_0 + t q_1, t q_0 \}.

If {q_0, q_1} is linearly independent, there is a unique nilpotent solution X, determined by X q_1 = q_0 and X q_0 = 0. If {q_0, q_1} is linearly dependent, there is no nilpotent solution.

Example: Find all nilpotent solutions of

X^2 = \begin{pmatrix} 0 & 1 \\ 0 & 0 \end{pmatrix}.

We write

\begin{pmatrix} 1 & 0 \\ t^2 & 1 \end{pmatrix} \begin{pmatrix} t^2 & -1 \\ 0 & t^2 \end{pmatrix} \begin{pmatrix} 0 & 1 \\ -1 & t^2 \end{pmatrix} = \begin{pmatrix} 1 & 0 \\ 0 & t^4 \end{pmatrix}.

Thus

q_0 = \begin{pmatrix} 1 \\ 0 \end{pmatrix}, q_1 = \begin{pmatrix} 0 \\ 0 \end{pmatrix}.

Since the set {q_0, q_1} is linearly dependent, there is no nilpotent solution.

Equations for which {q_0, q_1} turns out to be linearly independent also occur, and each such equation has exactly one nilpotent solution, namely the X determined by X q_1 = q_0 and X q_0 = 0; a concrete instance appears in the sketch below, verified by a more elementary route.
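As a cross-check of the nilpotent analysis, by a more elementary route than the Smith-form computation above and assuming SymPy: X is a nilpotent solution of X^2 + A_1 X + A_0 = 0 exactly when X^2 = 0 and A_1 X + A_0 = 0, so the nilpotent solutions can be found by solving these equations entrywise. The second equation below, X^2 + X = \begin{pmatrix} 0 & 1 \\ 0 & 0 \end{pmatrix}, is our own example of an equation with a unique nilpotent solution.

    # Assumes SymPy.  Elementary cross-check: X is a nilpotent solution of
    # X^2 + A1*X + A0 = 0 iff X^2 = 0 and A1*X + A0 = 0.
    from sympy import Matrix, symbols, solve, eye

    a, b, c, d = symbols('a b c d')
    X = Matrix([[a, b], [c, d]])

    def nilpotent_solutions(A1, A0):
        eqs = list(X**2) + list(A1*X + A0)
        return [X.subs(sol) for sol in solve(eqs, [a, b, c, d], dict=True)]

    # X^2 = [[0,1],[0,0]] (A1 = 0, A0 = [[0,-1],[0,0]]): no nilpotent solution.
    print(nilpotent_solutions(Matrix.zeros(2, 2), Matrix([[0, -1], [0, 0]])))

    # Our own example X^2 + X + [[0,-1],[0,0]] = 0: the unique nilpotent
    # solution is [[0,1],[0,0]].
    print(nilpotent_solutions(eye(2), Matrix([[0, -1], [0, 0]])))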

How many solutions can there be?

We can now answer our earlier question: how many solutions can the equation X^2 + A_1 X + A_0 = 0 have?

We have seen that if this equation has a finite number of solutions, then the number of diagonalizable solutions is at most \binom{m}{2}, where m is the number of distinct roots of

(5) det(t^2 I + t A_1 + A_0) = 0.

We have also seen that the equation can have a nilpotent solution only if 0 is a root of (5) with multiplicity greater than 1. Furthermore, the number of nilpotent solutions is 0, 1 or ∞.

Therefore, if (5) has four distinct roots and the number of solutions is finite, this number is at most 6. If (5) has a single repeated root (which we may assume to be 0) and the number of solutions is finite, there is at most 1 non-diagonalizable solution and there are at most 3 = \binom{3}{2} diagonalizable solutions. Finally, if (5) has two repeated roots and the number of solutions is finite, there are at most two non-diagonalizable solutions (one corresponding to each root of (5)); also m = 2, so there is at most 1 = \binom{2}{2} diagonalizable solution. Thus, in any case, the maximum possible finite number of solutions is 6.

The following examples show that for each l = 0, 1, 2, 3, 4, 5, 6 there is an equation with exactly l solutions. Each example may be analyzed using the methods above.

Examples:

(0) X^2 + \begin{pmatrix} 0 & -1 \\ 0 & 0 \end{pmatrix} = 0, that is, X^2 = \begin{pmatrix} 0 & 1 \\ 0 & 0 \end{pmatrix}, has no solutions. This is the promised less computational analysis of the very first example: we saw above that this equation has no diagonalizable solutions and no nilpotent solutions, and since 0 is its only possible eigenvalue, any non-diagonalizable solution would itself be nilpotent. So there are no solutions at all.

(1a) There is an equation X^2 + A_1 X + A_0 = 0 with a unique solution, and that solution is nilpotent: one can choose A_1 and A_0 so that

det(t^2 I + t A_1 + A_0) = t^3(t + 1),

so that the (one-dimensional) nullspaces of t^2 I + t A_1 + A_0 at t = 0 and at t = -1 are spanned by the same element of S, and so that the nilpotent analysis at 0 gives {q_0, q_1} linearly independent. Then there is no diagonalizable solution (the only possible eigenvalues are 0 and -1, and their candidate eigenvectors coincide), there is a unique nilpotent solution, and -1, being a simple root, cannot occur as the eigenvalue of a non-diagonalizable solution. So the nilpotent solution is the only solution.

(1b) There is an equation X^2 + A_1 X + A_0 = 0 with a unique solution, and that solution is diagonalizable: one can choose A_1 and A_0 so that

det(t^2 I + t A_1 + A_0) = t^2(t + 1)^2,

so that the possible eigenvalues are 0 and -1, with each of the corresponding nullspaces one-dimensional and the two spanned by different elements of S. Then there is exactly one diagonalizable solution (with eigenvalues 0 and -1). The nilpotent analysis at the root 0 gives {q_0, q_1} linearly dependent, so there is no nilpotent solution; and a non-diagonalizable solution with eigenvalue -1 of multiplicity 2 would correspond (upon replacing X by X + I) to a nilpotent solution of the shifted equation, which the same analysis rules out. Therefore the diagonalizable solution is the only solution.

(2) There is an equation X^2 + A_1 X + A_0 = 0 with exactly two solutions, neither of which is diagonalizable: one can choose A_1 and A_0 so that

det(t^2 I + t A_1 + A_0) = t^2(t + 1)^2,

so that the possible eigenvalues are 0 and -1, and so that the corresponding one-dimensional nullspaces are spanned by the same element of S. Then there are no diagonalizable solutions. The nilpotent analysis at the double root 0 gives {q_0, q_1} linearly independent and hence a unique nilpotent solution. A non-diagonalizable solution with eigenvalue -1 of multiplicity 2 would correspond (upon replacing X by X + I) to a nilpotent solution of the shifted equation; carrying out the same analysis there also produces a unique nilpotent solution, which corresponds to the unique non-diagonalizable solution of the original equation with eigenvalue -1. Thus there are exactly two solutions, neither of them diagonalizable.

(3) There is an equation X^2 + A_1 X + A_0 = 0 with exactly three solutions, one nilpotent and two diagonalizable: one can choose A_1 and A_0 so that

det(t^2 I + t A_1 + A_0) = t^2(t + 1)(t - 1),

so that the possible eigenvalues are 0, -1 and 1, each with a one-dimensional nullspace spanned by a single element of S, and so that exactly two of the three pairs of eigenvalues involve distinct elements of S. Those two pairs give the two diagonalizable solutions. The nilpotent analysis at the double root 0 gives {q_0, q_1} linearly independent and hence a unique nilpotent solution, while -1 and 1, being simple roots, cannot occur as the eigenvalue of a non-diagonalizable solution. Thus there are exactly three solutions.

(4) X^2 = \begin{pmatrix} 1 & 0 \\ 0 & 4 \end{pmatrix} has four (diagonalizable) solutions. We have

det(t^2 I - \begin{pmatrix} 1 & 0 \\ 0 & 4 \end{pmatrix}) = (t - 1)(t + 1)(t - 2)(t + 2).

Since this factors into distinct linear factors, all solutions are diagonalizable. We have found these solutions above.

(5) There is an equation X^2 + A_1 X + A_0 = 0 with exactly five (diagonalizable) solutions: one can choose A_1 and A_0 so that det(t^2 I + t A_1 + A_0) factors into four distinct linear factors (so that, as in (4), every solution is diagonalizable and each of the four roots has a one-dimensional nullspace spanned by a single element of S), but so that exactly two of the four roots have the same element of S. Of the six pairs of possible eigenvalues, the pair that shares an eigenvector direction gives no solution, and each of the remaining five pairs gives exactly one; so there are exactly five solutions.

(6) There is an equation X^2 + A_1 X + A_0 = 0 with exactly six (diagonalizable) solutions: one can choose A_1 and A_0 so that det(t^2 I + t A_1 + A_0) factors into four distinct linear factors and so that the four corresponding elements of S are pairwise distinct. Then every solution is diagonalizable, and there is a diagonalizable solution corresponding to each pair of distinct possible eigenvalues, six solutions in all.

(∞) X^2 = I (or X^2 = 0) has infinitely many solutions, as we saw above.
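It is instructive to see how an equation attaining the maximum of six solutions can be manufactured from the noncommutative Vieta formulas of the first section. In the SymPy sketch below the two "seed" roots X_1, X_2 are our own choice (not the matrices behind example (6) above): they have eigenvalues 0, 1 and 2, 3 with four pairwise distinct eigenvector directions. The sketch builds A_1 and A_0 from them and checks that all six matrices obtained by mixing the four eigenpairs two at a time satisfy the equation; since det(t^2 I + t A_1 + A_0) then has four distinct roots, the counting argument above shows these six are all of the solutions.

    # Assumes SymPy.  X1 and X2 below are our own choice of "seed" roots.
    from itertools import combinations
    from sympy import Matrix, Rational

    # Four eigenpairs: eigenvalues 0,1 (for X1) and 2,3 (for X2), with
    # pairwise distinct eigenvector directions.
    eigenpairs = [(0, Matrix([1, 0])), (1, Matrix([0, 1])),
                  (2, Matrix([1, 1])), (3, Matrix([1, -1]))]
    X1 = Matrix([[0, 0], [0, 1]])
    X2 = Matrix([[Rational(5, 2), Rational(-1, 2)],
                 [Rational(-1, 2), Rational(5, 2)]])   # eigenpairs (2,(1,1)), (3,(1,-1))

    # Noncommutative Vieta: A1 = -(y1 + y2), A0 = y2*y1.
    D = X1 - X2
    y1, y2 = X1, D * X2 * D.inv()
    A1, A0 = -(y1 + y2), y2 * y1

    # Mixing the four eigenpairs two at a time gives six candidate solutions;
    # each one satisfies the equation.
    count = 0
    for (l1, v1), (l2, v2) in combinations(eigenpairs, 2):
        Q = Matrix.hstack(v1, v2)
        X = Q * Matrix([[l1, 0], [0, l2]]) * Q.inv()
        assert X**2 + A1*X + A0 == Matrix.zeros(2, 2)
        count += 1
    print(count)    # 6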

References

[FS] D. Fuchs and A. Schwarz, A Matrix Vieta Theorem, E. B. Dynkin Seminar, Amer. Math. Soc. Transl. Ser. 2, 169 (1996).

[GHR] I. Gohberg, P. Lancaster and L. Rodman, Matrix Polynomials, Academic Press, 1982.

[GR1] I. Gelfand and V. Retakh, Noncommutative Vieta Theorem and Symmetric Functions, The Gelfand Mathematical Seminars 1993-1995, Birkhäuser (1996).

[GR2] I. Gelfand and V. Retakh, Quasideterminants, I, Selecta Math. 3 (1997), no. 4.

[LR] P. Lancaster and L. Rodman, Algebraic Riccati Equations, Clarendon Press, 1995.

Department of Mathematics, Rutgers University, Piscataway, NJ

E-mail address: rwilson@math.rutgers.edu
