# NONCOMMUTATIVE POLYNOMIAL EQUATIONS, by Edward S. Letzter


Introduction

My aim in these notes is twofold: first, to briefly review some linear algebra; second, to provide you with some new tools and techniques for studying equations involving matrices. I don't expect you to grasp all of the material immediately, but I hope that you will have a good understanding of it by the end of the course.

Note: There are exercises scattered throughout; some are trivial, and a few require a substantial effort.

Preliminaries

0. (i) If X and Y are sets, then X × Y refers to the new set {(x, y) : x ∈ X and y ∈ Y}.

(ii) The notation f: X → Y will be used to denote a function whose domain is X and whose values are contained in the image set Y; we will also describe this situation as follows: x ↦ f(x).

(iii) Let f: X → Y. If W ⊆ X, then f(W) = {f(w) : w ∈ W}.

These notes were originally developed for the summer REU (Research Experiences for Undergraduates) programs at Texas A&M University and Temple University, supported in part by NSF REU Site Grants. Typeset by AMS-TeX.

If f(X) = Y, then f is said to be surjective, and we further say that f maps X onto Y. Note that f always maps X surjectively onto f(X). If f(x_1) = f(x_2) if and only if x_1 = x_2, for all x_1, x_2 ∈ X, then we say that f is injective. If f is both injective and surjective, then we say that f is bijective, or that f is a bijection.

Matrices

1. (i) The set of complex numbers, equipped with the usual addition and multiplication operations, will be denoted C. All matrices discussed below will be assumed to have complex entries.

(ii) I will assume that you are already comfortable with matrix arithmetic (i.e., matrix addition, scalar multiplication, and matrix multiplication). The n × n identity matrix will be denoted I; the size of I will always be clear from the context in which it is used. Recall that an inverse of an n × n matrix A is a matrix A^{-1} such that A^{-1}A = AA^{-1} = I. While A may not have an inverse, if A^{-1} exists then it is unique (prove this!). When A^{-1} exists we say that A is either invertible or nonsingular. If A^{-1} does not exist, then A is said to be singular.

(iii) The trace of an n × n matrix A is the sum of the entries along the main diagonal and is denoted trace(A). Recall (or prove for yourself), assuming A and B to be n × n matrices, that trace(A + B) = trace(A) + trace(B), that trace(AB) = trace(BA), and that trace(αA) = α·trace(A) for all α ∈ C. However, it is not generally true that trace(AB) = trace(A)trace(B). If Q is an invertible matrix, then trace(A) = trace(Q^{-1}AQ). Also, when I is the n × n identity matrix, trace(I) = n.

(iv) The determinant of an n × n matrix A is described in almost every text on linear algebra (see, for example, Chapter 5 of Linear Algebra, 2nd Edition, by K. Hoffman and R. Kunze) and is denoted det(A). (Unfamiliarity with determinants should not present an obstacle for you when working on your research projects.)
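These trace identities are easy to confirm numerically. A small NumPy sketch (the particular matrices are arbitrary illustrative choices):

```python
import numpy as np

A = np.array([[1, 2], [3, 4]], dtype=complex)
B = np.array([[0, 1j], [5, -2]], dtype=complex)
Q = np.array([[2, 1], [1, 1]], dtype=complex)  # invertible

# trace is additive, and trace(AB) = trace(BA)
assert np.isclose(np.trace(A + B), np.trace(A) + np.trace(B))
assert np.isclose(np.trace(A @ B), np.trace(B @ A))

# trace is invariant under conjugation: trace(Q^{-1} A Q) = trace(A)
assert np.isclose(np.trace(np.linalg.inv(Q) @ A @ Q), np.trace(A))

# but trace is NOT multiplicative in general
assert not np.isclose(np.trace(A @ B), np.trace(A) * np.trace(B))
```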

MATH 4096: SPRING

Vector Spaces

2. A (complex) vector space V is a nonempty set equipped with a (vector) sum V × V → V, (u, v) ↦ u + v, a scalar product C × V → V, (α, v) ↦ αv, and a zero vector 0 ∈ V such that the following properties hold, for u, v, w ∈ V and α, β ∈ C: u + (v + w) = (u + v) + w; u + v = v + u; u + 0 = u; there exists an element −u ∈ V such that u + (−u) = 0; α(βu) = (αβ)u; (α + β)u = αu + βu; α(u + v) = αu + αv; 1u = u. The elements of V are referred to as vectors.

3. (i) Using the definition in (2), you should check that the zero vector is unique, that −u = (−1)u for all u ∈ V, that α0 = 0 for all α ∈ C, and that if u ∈ V is nonzero then αu = 0 if and only if α = 0.

(ii) We will use C^n to designate the set of n × 1 complex matrices (column vectors) with entries α_1, α_2, ..., α_n ∈ C. It follows immediately that C^n is a vector space under the usual matrix addition and scalar multiplication. When there is no possibility of confusion, we will use 0 to designate the zero n × 1 matrix (equivalently, the zero vector) in C^n.

4. For the remainder of these notes, V will denote a vector space.

Subspaces and Bases

5. A nonempty subset U of V is a (vector) subspace of V provided that U is closed under vector sums and scalar products: if u, v ∈ U and α ∈ C, then u + v ∈ U and αu ∈ U. (It is easy to check that a subspace U of V is itself a vector space under the sum and scalar product defined for V.) The subspace of V consisting only of the zero vector will be denoted (0).

6. Let X be a nonempty (but not necessarily finite) subset of V.

(i) The span of X, denoted span X, is the set of linear combinations α_1x_1 + α_2x_2 + ··· + α_tx_t, for all α_1, ..., α_t ∈ C, all x_1, ..., x_t ∈ X, and all positive integers t. It is standard to say that the span of the empty subset of V is (0), and it is straightforward to check that span X is a subspace of V. If U is the subspace of V spanned by X, then we say that X is a spanning set for U. Every subspace U of V has a spanning set, since span U = U. If u ∈ V, then span{u} is also denoted Cu.

(ii) The nonempty set X is linearly dependent if there exist α_1, α_2, ..., α_t ∈ C, at least one of which is not equal to zero, and pairwise distinct x_1, x_2, ..., x_t ∈ X, such that α_1x_1 + α_2x_2 + ··· + α_tx_t = 0. Note that any subset containing the zero vector is immediately linearly dependent. If X is not linearly dependent, then it is said to be linearly independent. Check that a subset of V consisting of exactly one nonzero vector is necessarily linearly independent, and more generally that every nonempty subset of a linearly independent set is linearly independent.

(iii) Exercise: Let X be a finite set spanning the subspace U, and suppose that X contains at least one nonzero vector. Prove that X contains a linearly independent subset also spanning U. (The conclusion remains true when X is infinite, but the proof is somewhat more complicated.)

(iv) A linearly independent (and so necessarily nonempty) spanning set of V is termed a basis for V. It follows from the preceding that every vector space has a (possibly infinite) basis. If V has a basis comprised of exactly n (pairwise distinct) vectors, then every basis of V consists of precisely n vectors (see, for example, Section 3, Chapter 2, of Linear Algebra,

2nd Edition, by K. Hoffman and R. Kunze, for a proof), and we say that the dimension of V is n; if no such n exists, we say that V is infinite dimensional. In general we will write dim V to denote the dimension of V. If dim V = n, and if X is linearly independent, then X has no more than n distinct elements (again see, for example, Section 3, Chapter 2, of Hoffman and Kunze's text). Therefore, if U is a subspace of V, then dim U ≤ dim V. Exercise: If u is a nonzero vector in V, then Cu is a one-dimensional subspace of V.

(v) Suppose that X is linearly independent, that x_1, ..., x_t are pairwise distinct vectors in X, and that α_1, α_2, ..., α_t, β_1, β_2, ..., β_t ∈ C. Then α_1x_1 + α_2x_2 + ··· + α_tx_t = β_1x_1 + β_2x_2 + ··· + β_tx_t if and only if α_1 = β_1, α_2 = β_2, ..., α_t = β_t. (Prove this yourself.) Now suppose that X is a basis for V. It follows from the last "if and only if" that every vector u ∈ V can be uniquely written in the form u = α_1x_1 + α_2x_2 + ··· + α_tx_t for x_1, x_2, ..., x_t ∈ X and α_1, α_2, ..., α_t ∈ C. (Check this last statement. Don't assume that X is finite.)

(vi) If X is linearly independent, and u ∈ V is not contained in span X, then X ∪ {u} is linearly independent. (Prove this yourself.)

7. The vectors e_1 = (1, 0, ..., 0)^T, e_2 = (0, 1, 0, ..., 0)^T, ..., e_{n−1} = (0, ..., 0, 1, 0)^T, e_n = (0, ..., 0, 1)^T form a basis for C^n called the standard basis.

8. If X is a linearly independent subset of V, then we say that X can be extended to a basis of V provided there exists a (necessarily linearly independent) subset Y of V such that X ∪ Y is a basis for V.
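In C^n, linear independence of finitely many vectors can be tested mechanically: stack the vectors as columns and compare the rank of the resulting matrix with the number of vectors. A NumPy sketch (the vectors are illustrative):

```python
import numpy as np

def is_independent(vectors):
    # columns are linearly independent iff rank equals the number of vectors
    M = np.column_stack(vectors)
    return np.linalg.matrix_rank(M) == len(vectors)

v1 = np.array([1, 0, 0])
v2 = np.array([1, 1, 0])
v3 = np.array([2, 1, 0])   # v3 = v1 + v2, so {v1, v2, v3} is dependent

assert is_independent([v1, v2])
assert not is_independent([v1, v2, v3])
```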

Lemma. Every linearly independent subset of V can be extended to a basis for V.

Proof. (The following inductive proof assumes that V is finite dimensional.) Suppose that X is a (necessarily finite, by (6iv)) linearly independent subset of V. Set X_1 = X. For i ≥ 1, we may assume that a series X_1 ⊆ X_2 ⊆ ··· ⊆ X_i of linearly independent subsets of V has been chosen. If span X_i = V, then X can be extended to a basis for V, and the proof follows. Now suppose that span X_i ≠ V. We can therefore choose a vector y ∈ V \ span X_i, and by (6vi), we can set X_{i+1} = X_i ∪ {y} to obtain a linearly independent subset of V properly containing X_i. Therefore, if X cannot be extended to a basis for V, there exists an infinite series X_1 ⊊ X_2 ⊊ ··· ⊊ X_i ⊊ X_{i+1} ⊊ ··· of linearly independent subsets of V, a contradiction to the finite dimensionality of V, by (6iv).

9. It follows from (8) that if U is a subspace of V, then any basis for U can be extended to a basis for V.

10. Corollary. If U is a subspace of V, and if dim U = dim V, then U = V.

Proof. Exercise.

Linear Transformations

11. For the remainder of these notes, W will denote a vector space.

12. (i) A function T: V → W is a linear transformation provided that T(u + v) = T(u) + T(v) and T(αu) = αT(u) for all u, v ∈ V and α ∈ C.

(ii) A linear transformation T: V → V is also called a linear operator on V.

(iii) A bijective linear transformation is termed a (linear) isomorphism, and a bijective linear operator is called a (linear) automorphism.

13. Let T: V → W be a linear transformation. The following facts are routine to verify.

(i) Let U be a vector space, and let S: U → V be a linear transformation. The composition T ∘ S: U → W is also a linear transformation. If S and T are isomorphisms, then so is T ∘ S.

(ii) If T is an isomorphism, then T^{-1}: W → V is also an isomorphism. We thus say that V and W are isomorphic to each other when there exists an isomorphism from V onto W.

14. Let T: V → W be a linear transformation.

(i) Set K = {u ∈ V : T(u) = 0}. It is easy to verify (and you should!) that K is a subspace of V, called the kernel of T.

(ii) Exercise: Prove that T is injective if and only if its kernel is (0). (Hint: T(x_1) = T(x_2) if and only if T(x_1 − x_2) = 0.)

15. Suppose that X = {x_1, x_2, ..., x_n} is a basis for V, and let T: V → W be a linear transformation.

(i) Let u be an arbitrary vector in V. Set u = α_1x_1 + α_2x_2 + ··· + α_nx_n for suitable α_1, α_2, ..., α_n ∈ C. Then T(u) = α_1T(x_1) + α_2T(x_2) + ··· + α_nT(x_n). Roughly speaking, the preceding equality asserts that T is completely determined by its action on X.

(ii) Exercise, using (i): T is injective if and only if T(X) is a linearly independent subset of W. Consequently, if T is injective, then dim V ≤ dim W.

(iii) Exercise, also using (i): T is surjective if and only if T(X) spans W. Consequently, if T is surjective, then dim V ≥ dim W.

(iv) We deduce from (ii) and (iii) that if T is an isomorphism, then dim V = dim W.
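The kernel criterion in (14ii) has a computational counterpart for matrices: the map A: C^n → C^m is injective exactly when its kernel is (0), which happens exactly when rank(A) = n. A NumPy sketch (the matrices are illustrative):

```python
import numpy as np

def kernel_is_zero(A):
    # A: C^n -> C^m is injective iff kernel = (0), iff rank(A) = n
    m, n = A.shape
    return np.linalg.matrix_rank(A) == n

A = np.array([[1, 0], [0, 1], [1, 1]])   # injective map C^2 -> C^3
B = np.array([[1, 2], [2, 4], [3, 6]])   # second column = 2 * first column

assert kernel_is_zero(A)
assert not kernel_is_zero(B)   # B kills the vector (2, -1)^T
```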

16. Assume that {x_1, x_2, ..., x_n} is a basis for V and that {y_1, y_2, ..., y_n} is a basis for W; in particular, dim V = dim W.

(i) It is routine to check that the assignment α_1x_1 + α_2x_2 + ··· + α_nx_n ↦ α_1y_1 + α_2y_2 + ··· + α_ny_n produces an isomorphism from V onto W.

(ii) It follows from (i) that any two vector spaces of dimension n are isomorphic. In particular, if {x_1, x_2, ..., x_n} is a basis for V, then it follows from (7) and (i) that there is an isomorphism from V onto C^n given by α_1x_1 + α_2x_2 + ··· + α_nx_n ↦ (α_1, α_2, ..., α_n)^T, for α_1, ..., α_n ∈ C.

Matrices Associated to Linear Transformations

17. (i) Let A be an m × n matrix (with complex entries). Then the assignment x ↦ Ax, for x ∈ C^n, produces a linear transformation from C^n to C^m. We will let A denote both the matrix and the corresponding linear transformation.

(ii) An n × n matrix A is invertible (or, equivalently, nonsingular) if and only if Ax ≠ 0 for all nonzero x ∈ C^n, if and only if the determinant of A is nonzero. (See, for example, Theorem 13, Chapter 1, and Theorem 4, Chapter 5, of Linear Algebra, 2nd Edition, by K. Hoffman and R. Kunze.)

(iii) Exercise: Let A be an n × n matrix. Then the linear transformation A: C^n → C^n is an isomorphism if and only if A is invertible.

(iv) Let A be an m × n matrix, and let C_1, ..., C_n denote the columns of A, ordered appropriately. It is useful to note, for x = (α_1, α_2, ..., α_n)^T with α_1, α_2, ..., α_n ∈ C, that Ax = α_1C_1 + α_2C_2 + ··· + α_nC_n.

18. Let T: C^n → C^m be a linear transformation, and let e_1, e_2, ..., e_n be the standard basis vectors of C^n. Let A denote the m × n matrix whose jth column is equal to T(e_j), for 1 ≤ j ≤ n. For x = (α_1, α_2, ..., α_n)^T with α_1, α_2, ..., α_n ∈ C, it follows from (17iv) that T(x) = T(α_1e_1 + α_2e_2 + ··· + α_ne_n) = α_1T(e_1) + α_2T(e_2) + ··· + α_nT(e_n) = Ax. Therefore, every linear transformation from C^n to C^m corresponds to an m × n matrix.

Change of Basis

19. (i) Suppose that T: V → W is a linear transformation, that {x_1, x_2, ..., x_n} is a basis for V, and that {y_1, y_2, ..., y_m} is a basis for W. Let ϕ: V → C^n denote the isomorphism sending α_1x_1 + α_2x_2 + ··· + α_nx_n ↦ (α_1, α_2, ..., α_n)^T, for α_1, α_2, ..., α_n ∈ C, and let ψ: C^m → W be the isomorphism mapping (β_1, β_2, ..., β_m)^T ↦ β_1y_1 + β_2y_2 + ··· + β_my_m, for β_1, β_2, ..., β_m ∈ C. Next, set

T(x_1) = a_{11}y_1 + a_{21}y_2 + ··· + a_{m1}y_m
T(x_2) = a_{12}y_1 + a_{22}y_2 + ··· + a_{m2}y_m
...
T(x_n) = a_{1n}y_1 + a_{2n}y_2 + ··· + a_{mn}y_m,
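The construction in (18), building the matrix of T column by column from T(e_1), ..., T(e_n), can be carried out directly in code (the particular map T below is an illustrative choice):

```python
import numpy as np

def matrix_of(T, n):
    # build the m x n matrix whose j-th column is T(e_j), as in (18)
    return np.column_stack([T(e) for e in np.eye(n)])

# an illustrative linear map C^3 -> C^2
T = lambda x: np.array([x[0] + 2 * x[1], 3 * x[2] - x[0]])
A = matrix_of(T, 3)

x = np.array([1.0, -1.0, 2.0])
assert np.allclose(A @ x, T(x))   # T(x) = Ax
```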

for a_{11}, a_{12}, a_{21}, ..., a_{mn} ∈ C, and set A equal to the m × n matrix (a_{ij}). Then T = ψ ∘ A ∘ ϕ. To illustrate:

    V  --T-->  W
    |ϕ         ↑ψ
    v          |
   C^n --A--> C^m

(ii) Note that the matrix A in (i) depends on the choice of bases for V and W.

(iii) Part (i) can be used to justify the informal assertion that the study of linear transformations of finite dimensional vector spaces is identical to matrix theory.

20. (i) Let T: V → V and T′: V′ → V′ be linear operators, respectively, on the vector spaces V and V′. We say that T and T′ are equivalent if there exists an isomorphism ϕ: V → V′ such that T′ = ϕ ∘ T ∘ ϕ^{-1}. The following diagram may help to visualize the involved compositions of functions:

    V  --T-->  V
    |ϕ         |ϕ
    v          v
    V′ --T′--> V′

Observe that T′ = ϕ ∘ T ∘ ϕ^{-1} if and only if T = ϕ^{-1} ∘ T′ ∘ ϕ.

(ii) Let T: V → V be a linear operator, and suppose that we are given a vector space isomorphism ϕ: V → V′. Set T′ = ϕ ∘ T ∘ ϕ^{-1}. Then T′: V′ → V′ is a linear operator that is trivially equivalent to T.

(iii) Exercise: Suppose that A and A′ are n × n matrices. Then A, A′: C^n → C^n are equivalent linear transformations if and only if there exists an invertible n × n matrix M such that A′ = MAM^{-1}. In other words, A and A′ are equivalent if and only if they are conjugate.

(iv) Let T: V → V be a linear operator, and assume that X = {x_1, x_2, ..., x_n} is a basis for V. Let ϕ: V → C^n be the isomorphism α_1x_1 + α_2x_2 + ··· + α_nx_n ↦ (α_1, α_2, ..., α_n)^T,

for α_1, α_2, ..., α_n ∈ C. Set

T(x_1) = a_{11}x_1 + a_{21}x_2 + ··· + a_{n1}x_n
T(x_2) = a_{12}x_1 + a_{22}x_2 + ··· + a_{n2}x_n
...
T(x_n) = a_{1n}x_1 + a_{2n}x_2 + ··· + a_{nn}x_n,

for a_{11}, a_{12}, a_{21}, ..., a_{nn} ∈ C, and set A equal to the n × n matrix (a_{ij}). Then T = ϕ^{-1} ∘ A ∘ ϕ, and T is equivalent to A. As before, A depends on the choice of basis X, and we describe the above scenario by writing [T]_X = A.

(v) Retain the notation of (iv). To better understand how A depends on X, let {y_1, y_2, ..., y_n} be another basis for V, and let ψ be the isomorphism β_1y_1 + β_2y_2 + ··· + β_ny_n ↦ (β_1, β_2, ..., β_n)^T, for β_1, β_2, ..., β_n ∈ C. As before, set

T(y_1) = b_{11}y_1 + b_{21}y_2 + ··· + b_{n1}y_n
T(y_2) = b_{12}y_1 + b_{22}y_2 + ··· + b_{n2}y_n
...
T(y_n) = b_{1n}y_1 + b_{2n}y_2 + ··· + b_{nn}y_n,

for b_{11}, b_{12}, b_{21}, ..., b_{nn} ∈ C, and set B equal to the n × n matrix (b_{ij}). We now have T = ψ^{-1} ∘ B ∘ ψ, and so T is equivalent to B. Now combine the preceding two situations:

   C^n <--ϕ-- V --ψ--> C^n
    |A        |T        |B
    v         v         v
   C^n <--ϕ-- V --ψ--> C^n

Note that ϕ ∘ ψ^{-1}: C^n → C^n is an isomorphism, and so there exists an invertible n × n matrix Q whose associated linear transformation Q: C^n → C^n is the map ϕ ∘ ψ^{-1}. We

therefore obtain a new diagram:

   C^n --A--> C^n
    ↑Q         ↑Q
   C^n --B--> C^n

Consequently, B = Q^{-1}AQ, and in particular, A and B are conjugate. Exercise: Suppose that T is equivalent to the matrix A, as above, and that C is an n × n matrix conjugate to A. Then T is equivalent to C.

(vi) To summarize, if T is a linear operator on a finite dimensional vector space V, then different bases for V only affect the matrix depiction of T up to conjugation. More generally, the study of linear operators on n-dimensional vector spaces, up to equivalence, is identical to the study of n × n matrices, up to conjugation.

Eigenvectors and Eigenvalues

21. Let T: V → V be a linear operator.

(i) A nonzero vector v ∈ V is an eigenvector for T, or more briefly a T-eigenvector, provided there exists a λ ∈ C such that T(v) = λv. In the preceding setting, λ is termed the T-eigenvalue associated to v. Note that an eigenvector is necessarily nonzero but that an eigenvalue may equal zero.

(ii) Let ϕ: V → V′ be an isomorphism of vector spaces. Further suppose T′: V′ → V′ is a linear operator such that T′ = ϕ ∘ T ∘ ϕ^{-1}. Let v be a nonzero vector in V. Exercise: v is an eigenvector for T with eigenvalue λ if and only if ϕ(v) is an eigenvector for T′ with eigenvalue λ.

(iii) Exercise: Let v_1, v_2, ..., v_n be eigenvectors for T with pairwise distinct eigenvalues λ_1, λ_2, ..., λ_n. Then v_1, v_2, ..., v_n are linearly independent.

(iv) Suppose that the dimension of V is n, and let v be an eigenvector for T with eigenvalue λ. Since {v} is a linearly independent set, we can extend it to a basis X =

{x_1, x_2, ..., x_n} with x_1 = v. Set A = (a_{ij}) = [T]_X, as in (20iv). Exercise: a_{11} = λ and a_{21} = a_{31} = ··· = a_{n1} = 0. In other words,

    [ λ  a_{12} ··· a_{1n} ]
    [ 0  a_{22} ··· a_{2n} ]
A = [ 0  a_{32} ··· a_{3n} ]
    [ ⋮    ⋮           ⋮   ]
    [ 0  a_{n2} ··· a_{nn} ]

22. We now consider the existence of eigenvectors and eigenvalues, first studying the case for n × n matrices. This is the one part of our discussion where some knowledge of the properties of determinants is required. To start, let A be an n × n matrix, also viewed as a linear transformation A: C^n → C^n.

(i) Recall that I denotes the n × n identity matrix, and let λ ∈ C. The scalar product λI is the matrix with entries λ along the main diagonal and zero elsewhere. Such matrices are referred to as scalar matrices. More generally, the linear operator on V sending each v ∈ V to λv is referred to as a scalar operator and is often simply denoted λ: V → V.

(ii) Note that a nonzero vector v ∈ C^n is an A-eigenvector with eigenvalue λ if and only if Av = λv, if and only if (A − λI)v = 0. But there exists a nonzero v such that (A − λI)v = 0 if and only if A − λI is singular, if and only if det(A − λI) = 0.

(iii) Set f(x) = det(xI − A). Then f(x) is a polynomial of degree n and is called the characteristic polynomial of A. (Moreover, the coefficient of x^n, the so-called leading coefficient, is 1. Polynomials whose leading coefficient is 1 are termed monic.) From (ii) we see that A has an eigenvector if and only if f(x) has a root. To be more specific, the eigenvalues of A are exactly the roots of f(x).

(iv) The Fundamental Theorem of Algebra asserts that every complex polynomial in one variable can be factored into a product of polynomials all of degree one. Therefore, for suitably chosen λ_1, ..., λ_t ∈ C,

f(x) = (x − λ_1)^{i_1}(x − λ_2)^{i_2}···(x − λ_t)^{i_t},

for t ≤ n. The λ_1, λ_2, ..., λ_t are precisely the eigenvalues of A. (That all of the degree-one factors in the above product can be chosen to be monic follows from the fact that f(x) is itself monic.) In particular, there exists at least one A-eigenvector in C^n.

(v) We have been working with complex vector spaces and matrices precisely because they allow the above application of the Fundamental Theorem of Algebra.
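The relationship between eigenvalues and the characteristic polynomial is easy to check numerically; in the NumPy sketch below, np.poly returns the coefficients of the monic polynomial det(xI − A), and the matrix is an illustrative choice:

```python
import numpy as np

A = np.array([[2, 1], [0, 3]], dtype=float)

coeffs = np.poly(A)          # coefficients of f(x) = det(xI - A)
roots = np.roots(coeffs)     # roots of the characteristic polynomial
eigs = np.linalg.eigvals(A)  # eigenvalues of A

# f(x) = x^2 - 5x + 6 = (x - 2)(x - 3); eigenvalues are the roots of f
assert np.allclose(coeffs, [1.0, -5.0, 6.0])
assert np.allclose(sorted(roots), sorted(eigs))
```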

23. Theorem. Let T: V → V be a linear operator, and assume that V is finite dimensional. Then V contains a (nonzero) T-eigenvector.

Proof. Assuming that dim V = n, we have already seen that T will be equivalent to A: C^n → C^n, for some n × n matrix A. Furthermore, V contains a T-eigenvector if and only if C^n contains an A-eigenvector, as noted in (21ii). However, in (22iv) we proved that A has at least one eigenvector.

Invariant Subspaces

24. Let T: V → V be a linear operator, and let U be a subspace of V.

(i) We say that U is T-invariant if T(U) ⊆ U. Note that (0) and V are T-invariant subspaces of V.

(ii) If U is T-invariant, then the restriction T|_U: U → U, u ↦ T(u), is a linear operator on U. If S is a set of linear operators, then we will say that U is S-invariant when U is F-invariant for each F ∈ S.

25. (i) A nonzero vector v ∈ V is a T-eigenvector if and only if Cv is a T-invariant subspace of V. In particular, the set of T-eigenvectors in V corresponds exactly to the set of one-dimensional T-invariant subspaces of V.

(ii) Choose λ ∈ C, and set V_λ^T = {v ∈ V : T(v) = λv}. Note that 0 ∈ V_λ^T. Also, V_λ^T ≠ (0) if and only if V has a T-eigenvector with associated eigenvalue λ. We call V_λ^T the (T, λ)-eigenspace of V. Exercise: V_λ^T is a T-invariant subspace of V.

Upper Triangular, Nilpotent, and Semisimple Matrices

27. (i) Problem: Use the results obtained so far in these notes to prove that every n × n matrix is conjugate to an upper triangular matrix. (An n × n matrix A is termed upper triangular if every entry below the main diagonal is equal to zero.)

(ii) Exercise: An n × n matrix A is nilpotent if A^m = 0 for some positive integer m. Prove that a matrix is nilpotent if and only if it is conjugate to a strictly upper triangular matrix. (An upper triangular n × n matrix is said to be strictly upper triangular when the entries along the main diagonal are also equal to zero.)

(iii) Exercise: An n × n matrix A is semisimple, or diagonalizable, if C^n has a basis of A-eigenvectors. Prove that a matrix is semisimple if and only if it is conjugate to a diagonal matrix. (An n × n matrix is diagonal if all of the entries not on the main diagonal are zero. A scalar matrix is diagonal, but a diagonal matrix need not be scalar.)

(iv) Exercise: Every n × n matrix is the sum of a nilpotent matrix and a semisimple matrix.

28. A more precise decomposition of a square matrix can be obtained as follows: A Jordan Block is an m × m matrix of the form

[ λ 1             ]
[   λ 1           ]
[     ⋱ ⋱        ]
[         λ 1    ]
[            λ   ],

where the unlisted entries are understood, by standard convention, to equal zero.

Theorem. Every (complex) n × n matrix is conjugate to a block diagonal matrix

[ J_1             ]
[     J_2         ]
[         ⋱      ]
[            J_t ],

where each J_i is a Jordan Block.

Proof. See, for example, Chapter 7, Section 3 of Linear Algebra, 2nd Edition, by K. Hoffman and R. Kunze.

We say that the matrix listed in the theorem is in Jordan Canonical Form.
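A quick numerical illustration of (27ii): a strictly upper triangular n × n matrix N satisfies N^n = 0, and nilpotency survives conjugation, since (QNQ^{-1})^n = QN^nQ^{-1} = 0. The matrices below are illustrative choices:

```python
import numpy as np

# a strictly upper triangular 3 x 3 matrix is nilpotent: N^3 = 0
N = np.array([[0, 1, 2],
              [0, 0, 3],
              [0, 0, 0]], dtype=float)
assert np.allclose(np.linalg.matrix_power(N, 3), 0)

# nilpotency is preserved under conjugation: (Q N Q^{-1})^3 = Q N^3 Q^{-1} = 0
Q = np.array([[1, 1, 0],
              [0, 1, 1],
              [0, 0, 1]], dtype=float)
M = Q @ N @ np.linalg.inv(Q)
assert np.allclose(np.linalg.matrix_power(M, 3), 0)
```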

Noncommutative Polynomial Equations

29. (i) Consider the equation XY = −YX. We say that

X̃ =
[ 0 1 ]
[ 1 0 ],

Ỹ =
[ 1  0 ]
[ 0 −1 ]

is a (2 × 2) matrix solution to this equation, since X̃Ỹ = −ỸX̃.

(ii) In the equation XY + YX = 2, we think of the 2 as a scalar matrix, with 2's along the main diagonal and zeros elsewhere, but with indeterminate size. Observe then that

X̃ =
[ 1  1 ]
[ 0 −1 ],

Ỹ =
[ 1  0 ]
[ 0 −1 ]

is a 2 × 2 solution to XY + YX = 2, since

X̃Ỹ + ỸX̃ =
[ 1 −1 ]   [ 1 1 ]   [ 2 0 ]
[ 0  1 ] + [ 0 1 ] = [ 0 2 ] = 2I.

(iii) Both XY = −YX and XY + YX = 2 are examples of noncommutative polynomial equations. We think of expressions such as XY, X²Y − 3YX, and Y²X³Y as noncommutative polynomials, or more precisely as polynomials in the noncommuting variables X and Y.

(iv) We need to briefly explain the use of exponents in (iii). For all positive integers m, let X^m = XX···X (m times).
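Solutions like those in (29) can be verified by machine. The sketch below checks one standard anticommuting pair and one pair satisfying XY + YX = 2I; these particular matrices are illustrative choices:

```python
import numpy as np

I = np.eye(2)

# an anticommuting pair: XY = -YX
X = np.array([[0, 1], [1, 0]], dtype=float)
Y = np.array([[1, 0], [0, -1]], dtype=float)
assert np.allclose(X @ Y, -(Y @ X))

# a pair satisfying XY + YX = 2 (i.e., twice the identity matrix)
X2 = np.array([[1, 1], [0, -1]], dtype=float)
Y2 = np.array([[1, 0], [0, -1]], dtype=float)
assert np.allclose(X2 @ Y2 + Y2 @ X2, 2 * I)
```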

Hence, for all positive integers m and m′, X^{m+m′} = X^m X^{m′}. We will follow the usual convention that X^0 = 1. However, we will only allow non-negative exponents to appear in noncommutative polynomials.

(v) Negative exponents are permitted in noncommutative Laurent polynomials, such as X⁴YX⁻⁴ + 7Y⁻¹ + 3 and X⁻¹Y + YX⁻¹ − 5, where we interpret the exponent −1 to mean that X⁻¹X = XX⁻¹ = 1 and Y⁻¹Y = YY⁻¹ = 1. We then set X^{−m} = X⁻¹X⁻¹···X⁻¹ (m times).

30. We can also study systems of noncommutative polynomial equations, such as

HE − EH = 2E,   HF − FH = −2F,   EF − FE = H,

which has the following (r + 1) × (r + 1) solution, for each positive integer r:

H̃ = the diagonal matrix diag(r, r − 2, r − 4, ..., −r);

F̃ = the matrix with 1's along the subdiagonal (the entries just below the main diagonal) and zeros elsewhere;

Ẽ = the matrix with μ_1, μ_2, ..., μ_r along the superdiagonal (the entries just above the main diagonal) and zeros elsewhere, where μ_i = i(r − i + 1).

Exercise: Verify that H̃Ẽ − ẼH̃ = 2Ẽ, H̃F̃ − F̃H̃ = −2F̃, and ẼF̃ − F̃Ẽ = H̃.
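The exercise in (30) can be checked by machine for any particular r. The sketch below builds the three matrices, taking μ_i = i(r − i + 1) on the superdiagonal of Ẽ, which is the choice that forces ẼF̃ − F̃Ẽ = H̃:

```python
import numpy as np

def sl2_solution(r):
    """(r+1) x (r+1) matrices with HE - EH = 2E, HF - FH = -2F, EF - FE = H."""
    n = r + 1
    H = np.diag([float(r - 2 * k) for k in range(n)])
    F = np.diag(np.ones(r), -1)                        # 1's on the subdiagonal
    mu = [float(i * (r - i + 1)) for i in range(1, n)] # mu_i = i(r - i + 1)
    E = np.diag(mu, 1)                                 # mu's on the superdiagonal
    return E, F, H

for r in range(1, 6):
    E, F, H = sl2_solution(r)
    assert np.allclose(H @ E - E @ H, 2 * E)
    assert np.allclose(H @ F - F @ H, -2 * F)
    assert np.allclose(E @ F - F @ E, H)
```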

31. More generally, let S be a system of noncommutative polynomial equations f_1(X_1, ..., X_s) = 0, f_2(X_1, ..., X_s) = 0, ..., f_t(X_1, ..., X_s) = 0 in the (noncommuting) variables X_1, X_2, ..., X_s. The n × n matrices X̃_1, X̃_2, ..., X̃_s form an n × n matrix solution to the system when the equations all remain true after setting X_i = X̃_i, for 1 ≤ i ≤ s. We will also refer to X̃_1, ..., X̃_s as, more simply, either a matrix solution or an n × n solution. (More formally, the set {X̃_1, X̃_2, ..., X̃_s} is the solution.)

32. Exercise: Prove that there are no matrix solutions to YX − XY = 1.

33. As in (31), let S be a system of noncommutative polynomial equations in the variables X_1, ..., X_s. Further suppose that X̃_1, X̃_2, ..., X̃_s and X̃′_1, X̃′_2, ..., X̃′_s are two n × n solutions to S (in particular, the dimensions of the two solutions are the same). We say that these solutions are equivalent if there exists an invertible n × n matrix Q such that X̃′_i = QX̃_iQ^{-1}, for 1 ≤ i ≤ s.

34. A matrix solution to a system of noncommutative polynomial equations is (upper) triangularizable if it is equivalent to a solution consisting only of upper triangular matrices. In (27) we noted that every n × n matrix is individually similar to an upper triangular matrix, but this fact does not imply that matrix solutions consisting of more than one matrix must be triangularizable. For example, other than Ẽ = 0, F̃ = 0, H̃ = 0, there are no triangularizable solutions to the system in (30). (Don't worry, for the moment, about the proof of this assertion.)

35. Exercise: Prove that every matrix solution to XY − YX = Y is upper triangularizable. Do the same for XY − YX = Y².
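The exercise in (32) turns on the trace: trace(YX − XY) = 0 for all n × n matrices X and Y, while the trace of the identity is n ≠ 0. A numerical illustration (random matrices, so this illustrates the trace identity rather than proving the claim):

```python
import numpy as np

rng = np.random.default_rng(0)
n = 4
X = rng.standard_normal((n, n))
Y = rng.standard_normal((n, n))

# trace(YX - XY) = trace(YX) - trace(XY) = 0, while trace(I) = n != 0,
# so YX - XY = I has no matrix solution of any size
assert np.isclose(np.trace(Y @ X - X @ Y), 0)
assert np.trace(np.eye(n)) == n
```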

36. Let S be a system of noncommutative polynomial equations, as in (31).

(i) Suppose that X̃_1, X̃_2, ..., X̃_s is a 1 × 1 solution to S. Note that X̃_iX̃_j = X̃_jX̃_i for all 1 ≤ i, j ≤ s. Therefore, to find the 1-dimensional solutions to S we can add to it the equations X_iX_j = X_jX_i, for all 1 ≤ i, j ≤ s.

(ii) Exercise: Find the 1-dimensional solutions to the equations in (29).

(iii) Let X̃_1, X̃_2, ..., X̃_s be a triangularizable n × n solution to S, and fix 1 ≤ i ≤ n. Check that the ii-th entries of X̃_1, X̃_2, ..., X̃_s form a 1-dimensional solution to S.

(iv) Exercise: Show that Ẽ, F̃, H̃ = 0 is the only 1-dimensional solution to the system of equations in (30). Using this conclusion and (iii), prove that the system in (30) has no nonzero triangularizable solutions.

Irreducible Solutions

We will use M_n to denote the set of all n × n matrices.

37. (i) Consider the solution in (29i), and note that

(X̃² + Ỹ)/2 =
[ 1 0 ]
[ 0 0 ],

(X̃² − Ỹ)/2 =
[ 0 0 ]
[ 0 1 ],

X̃((X̃² + Ỹ)/2) =
[ 0 0 ]
[ 1 0 ],

((X̃² + Ỹ)/2)X̃ =
[ 0 1 ]
[ 0 0 ].

Consequently, every matrix in M_2 can be written as a linear combination of products of powers of X̃ and Ỹ.

(ii) Exercise: For the solution in (29ii), can every matrix in M_2 be written as a linear combination of products of powers of X̃ and Ỹ?

(iii) Can you show that every matrix in M_{r+1} can be written as a linear combination of products of powers of the solution Ẽ, F̃, H̃ in (30)?

38. Let S be a system of noncommutative polynomial equations f_1(X_1, ..., X_s) = 0, f_2(X_1, ..., X_s) = 0, ..., f_t(X_1, ..., X_s) = 0,

and let X̃ = {X̃_1, ..., X̃_s} be an n × n matrix solution to S. We will say that X̃ is irreducible provided every matrix in M_n can be written as a linear combination of products of powers of matrices from X̃. The irreducible solutions are minimal in a sense that will be justified (in part) later.

39. (i) Exercise: An upper triangularizable solution is irreducible if and only if it is 1-dimensional. Consequently, the irreducible solutions to XY − YX = Y (see (35)) are all 1-dimensional.

(ii) Exercise: Let N be an n × n matrix commuting with every matrix in M_n. Then N is a scalar matrix.

40. Aside: (i) Recall that n × n matrices can be viewed as linear operators on C^n. Exercise: Show that (0) and C^n are the only M_n-invariant subspaces of C^n (i.e., subspaces of C^n simultaneously invariant under all of the matrices in M_n; see (24)).

(ii) Use (i) to deduce the following: Let X̃ be an irreducible n × n solution to the system S, as in (38). Then (0) and C^n are the only X̃-invariant subspaces of C^n. (This conclusion helps to explain why we think of irreducible solutions as being "minimal.") In particular, when n > 1, there are no common eigenvectors for the matrices in X̃.

(iii) Molien's Theorem (1893): Let A be a set of n × n matrices, and suppose that (0) and C^n are the only A-invariant subspaces of C^n. Then every matrix in M_n can be written as a linear combination of products of powers of matrices in A. (We will omit the proof, only noting that it requires the Fundamental Theorem of Algebra.)

Noncommutative Algebra in Two Variables

41. (i) We can add and multiply polynomials (and Laurent polynomials) in noncommuting variables. For example, (1/2)X + X = (3/2)X, and (X − Y)(X + Y) = X² + XY − YX − Y².
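Irreducibility in the sense of (38) can be tested numerically: flatten words in the matrices into vectors and compute a rank. A sketch (the bound on word length and the sample matrices are illustrative choices):

```python
import numpy as np
from itertools import product

def words_span_Mn(mats, max_len=4):
    """Check whether products of length <= max_len of the given n x n
    matrices (together with the identity) span all of M_n."""
    n = mats[0].shape[0]
    span = [np.eye(n)]
    for length in range(1, max_len + 1):
        for choice in product(mats, repeat=length):
            W = np.eye(n)
            for M in choice:
                W = W @ M
            span.append(W)
    V = np.array([W.flatten() for W in span])
    return np.linalg.matrix_rank(V) == n * n

X = np.array([[0, 1], [1, 0]], dtype=float)
Y = np.array([[1, 0], [0, -1]], dtype=float)
assert words_span_Mn([X, Y])      # an irreducible pair: words span all of M_2

Z = np.eye(2)
assert not words_span_Mn([Z])     # powers of the identity span only CI
```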

Note that we cannot simplify the last expression to X² − Y². However, scalars always commute with the variables (and with other scalars): (X + 2)² = (X + 2)(X + 2) = X² + 2X + X·2 + 4 = X² + 4X + 4.

(ii) Any polynomial always commutes with itself.

(iii) It is sometimes useful to think of multiplication in this context as juxtaposition.

(iv) If f(X, Y) is a noncommutative polynomial, then we can define a scalar product (cf. (2)), λ·f(X, Y) = λf(X, Y). On the left-hand side λ is an element of C, and on the right-hand side λ is a scalar polynomial.

42. (i) When no additional conditions are placed on the variables X and Y, we denote the set of all noncommutative polynomials in them by C{X, Y}.

(ii) It is not hard to verify that C{X, Y} is a complex vector space under the addition and scalar multiplication described in (41). The following is a basis for C{X, Y}:

{ X^{i_1}Y^{j_1}X^{i_2}Y^{j_2}···X^{i_l}Y^{j_l} : l = 1, 2, 3, ...; i_1, j_1, i_2, j_2, ..., i_l, j_l are non-negative integers }.

43. (i) When we write C{X, Y}/⟨XY = −YX⟩, we mean the set of noncommutative polynomials in X and Y, with multiplication as in (41), but with the substitution rule XY = −YX. For example, in C{X, Y}/⟨XY = −YX⟩, X²Y = YX² and (X + Y)X = X² + YX = X² − XY.

(ii) By C{X, Y}/⟨XY = YX⟩ we mean the set of polynomials in X and Y subject to XY = YX. Therefore, C{X, Y}/⟨XY = YX⟩ can be viewed as the commutative polynomial ring in X and Y.
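Computer algebra systems can manipulate noncommuting variables directly. A SymPy sketch of the expansions in (41), using symbols declared noncommutative:

```python
import sympy as sp

X, Y = sp.symbols('X Y', commutative=False)

# (X - Y)(X + Y) expands to X^2 + XY - YX - Y^2; the cross terms do not cancel
lhs = sp.expand((X - Y) * (X + Y))
assert lhs == X**2 + X*Y - Y*X - Y**2

# scalars still commute with the variables: (X + 2)^2 = X^2 + 4X + 4
assert sp.expand((X + 2)**2) == X**2 + 4*X + 4
```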

(iii) Note that both C{X, Y}/⟨XY + YX⟩ and C{X, Y}/⟨XY − YX⟩ are vector spaces under polynomial addition and scalar multiplication. In each of these two cases, { X^iY^j : i, j are non-negative integers } is a basis. (Exercise: Try to prove this last assertion.)

(iv) In general, if f_1(X, Y), f_2(X, Y), ..., f_t(X, Y) are polynomials in X and Y, then C{X, Y}/⟨f_1(X, Y) = 0, f_2(X, Y) = 0, ..., f_t(X, Y) = 0⟩ is the vector space comprised of polynomials in X and Y subject to the substitution rules 0 = f_1(X, Y), 0 = f_2(X, Y), ..., 0 = f_t(X, Y). It is often difficult to compute a basis for these abstractly defined vector spaces.

44. The polynomial equations f_1(X, Y) = 0, ..., f_t(X, Y) = 0 in (43) are termed relations.

45. Aside: Vector spaces equipped with some type of multiplication are usually referred to as algebras.

Noncommutative Algebra in Several Variables

46. (i) We can add and multiply polynomials in the noncommuting variables X_1, X_2, ..., X_s using the procedure described in (41). For example, (X_1X_5³)(3X_2 − 4X_1) = 3X_1X_5³X_2 − 4X_1X_5³X_1, where X_5³ = X_5X_5X_5.

(ii) The set of all polynomials in the noncommuting variables X_1, X_2, ..., X_s, when not subjected to additional restrictions, is denoted C{X_1, X_2, ..., X_s}.

(iii) Following (42), C{X_1, X_2, ..., X_s} is a C-vector space with basis

{ X_{a_1}^{b_1}X_{a_2}^{b_2}X_{a_3}^{b_3}···X_{a_l}^{b_l} : l = 1, 2, 3, ...; a_1, a_2, ..., a_l ∈ {1, 2, ..., s}; b_1, b_2, ..., b_l are non-negative integers }.

47. (i) In view of (43)-(45), when we are given a system S of equations f_1(X_1, ..., X_s) = 0, f_2(X_1, ..., X_s) = 0, ..., f_t(X_1, ..., X_s) = 0, we can consider the polynomials in the noncommuting variables X_1, X_2, ..., X_s subject to the equations in S. The set of all such noncommutative polynomials is denoted

C{X_1, X_2, ..., X_s}/⟨f_1(X_1, ..., X_s) = 0, f_2(X_1, ..., X_s) = 0, ..., f_t(X_1, ..., X_s) = 0⟩,

and it has a vector space structure similar to that described in (43). As in (44), the equations in S are referred to as relations.

(ii) Exercise: Consider the noncommutative polynomials, in the variables E, F, H, subject to the relations in (30). Show that HFE = FEH.

© E. S. Letzter
