Chapter 4

Algebras

4.1 Definition

It is time to introduce the notion of an algebra over a commutative ring. So let R be a commutative ring. An R-algebra is a ring A (unital as always) that is an R-module (left, say) such that r(ab) = (ra)b = a(rb) for all r ∈ R, a, b ∈ A. You can say it the other way round: an R-algebra is an R-module A equipped with an R-bilinear multiplication A × A → A making it into a ring. Thus, this multiplication satisfies the conditions

(A1) 1_R a = a;
(A2) (r + s)a = ra + sa;
(A3) r(a + b) = ra + rb;
(A4) (rs)a = r(sa);
(A5) r(ab) = (ra)b = a(rb)

for all r, s ∈ R, a, b ∈ A. (Note: strictly speaking I should call this an associative, unital R-algebra; there are other important sorts of algebra which are not associative, but since we won't meet them in this course I'll just stick to "algebra".)

Note a Z-algebra is just the old definition of ring: Z-algebras = rings, just as Z-modules = Abelian groups. So you should view the passage from rings to R-algebras as analogous to the passage from Abelian groups to R-modules! This is the idea of studying objects (e.g. Abelian groups, rings) relative to a fixed commutative base ring R.

There is an equivalent formulation of the definition of R-algebra: an R-algebra is a ring A together with a distinguished ring homomorphism (the "structure map") s : R → A such that the image of s lies in the center Z(A) = {a ∈ A : ab = ba for all b ∈ A}. Indeed, given such a ring homomorphism, define an R-action R × A → A by (r, a) ↦ s(r)a. Now check this satisfies the above axioms (A1)-(A5) (the last one holding because im s ⊆ Z(A)). Conversely, given an R-algebra as defined originally, one obtains a ring homomorphism s : R → A by defining s(r) = r1_A, and the image lies in Z(A) by (A5).

Let A be an R-algebra. Then, given an A-module M, we can in particular think of M as just an R-module, defining rm = s(r)m for r ∈ R, m ∈ M. So you can hope to exploit the additional structure of the base ring R in studying A-modules.
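These axioms, and the equivalence with the structure-map description, are easy to spot-check on a concrete algebra. Here is a small Python sketch (my own illustration, not part of the notes) for A = M_2(Q) as a Q-algebra, with structure map s(r) = r·1_A landing in the scalar matrices; the helper names are mine.

```python
from fractions import Fraction as Q
import random

# 2x2 matrices over Q, written as ((a, b), (c, d)); A = M_2(Q) as a Q-algebra.
def add(x, y):  return tuple(tuple(x[i][j] + y[i][j] for j in range(2)) for i in range(2))
def mul(x, y):  return tuple(tuple(sum(x[i][k] * y[k][j] for k in range(2)) for j in range(2)) for i in range(2))
def smul(r, x): return tuple(tuple(r * x[i][j] for j in range(2)) for i in range(2))  # the R-action r.a

one = ((Q(1), Q(0)), (Q(0), Q(1)))
def s(r):       return smul(r, one)        # structure map s : Q -> Z(M_2(Q)), r -> r*1_A

def rand_scalar(): return Q(random.randint(-5, 5), random.randint(1, 5))
def rand_mat():    return tuple(tuple(rand_scalar() for _ in range(2)) for _ in range(2))

for _ in range(100):
    r, t = rand_scalar(), rand_scalar()
    a, b = rand_mat(), rand_mat()
    assert smul(Q(1), a) == a                                              # (A1)
    assert smul(r + t, a) == add(smul(r, a), smul(t, a))                   # (A2)
    assert smul(r, add(a, b)) == add(smul(r, a), smul(r, b))               # (A3)
    assert smul(r * t, a) == smul(r, smul(t, a))                           # (A4)
    assert smul(r, mul(a, b)) == mul(smul(r, a), b) == mul(a, smul(r, b))  # (A5)
    assert smul(r, a) == mul(s(r), a)        # the action is left multiplication by s(r)
    assert mul(s(r), a) == mul(a, s(r))      # im(s) is central
print("axioms (A1)-(A5) hold on all samples")
```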

In particular, if R = F is a field and A is an F-algebra, then A and any A-module M are in particular vector spaces over F. So we can talk about finite dimensional F-algebras and finite dimensional modules over an F-algebra, meaning their underlying dimension as vector spaces over F.

Note given A-modules M, N, an A-module homomorphism between them is automatically an R-module homomorphism (for the underlying R-module structure). However, it is not necessarily the case that a ring homomorphism between two different R-algebras is an R-module homomorphism. So one defines an R-algebra homomorphism f : A → B between two R-algebras A and B to be a ring homomorphism in the old sense that is in addition R-linear (i.e. it is an R-module homomorphism too). Ring homomorphisms and R-algebra homomorphisms are different things!

Now for examples. Actually, we already know plenty. Let R be a commutative ring. Then, the polynomial ring R[X_1, ..., X_n] is evidently an R-algebra: indeed, R[X_1, ..., X_n] contains a copy of R as the subring consisting of polynomials of degree zero. The ring M_n(R) of n × n matrices over R is an R-algebra: again, it contains a copy of R as the subring consisting of the scalar matrices. But note there is a big difference between this and the previous example: M_n(R) is finitely generated as an R-module (indeed, it is free of rank n^2) whereas R[X_1, ..., X_n] is not. In case F is a field, M_n(F) is a finite dimensional F-algebra, while F[X_1, ..., X_n] is not.

For the next example, let M be any (left) R-module. Consider the Abelian group End_R(M). We make it into a ring by defining the product of two endomorphisms of M simply to be their composition. Now I claim that End_R(M) is in fact an R-algebra: indeed, we define rθ for r ∈ R, θ ∈ End_R(M) by setting (rθ)(m) = r(θ(m)) (= θ(rm)) for all m ∈ M. Let us check that rθ really is an R-endomorphism of M. Take another s ∈ R. Then, s((rθ)(m)) = sr(θ(m)) = θ(srm) = θ(rsm) = (rθ)(sm). Note we really did use the commutativity of R!

We can generalize the previous two examples. Suppose now that A is a (not necessarily commutative) R-algebra and M is a left A-module. Then, the ring D = End_A(M) is also an R-algebra, defining (rd)(m) = r(d(m)) for all r ∈ R, d ∈ D, m ∈ M.

For the final example of an algebra, let G be any group. Define the group algebra RG to be the free R-module on basis the elements of G. Thus an element of RG looks like ∑_{g ∈ G} r_g g for coefficients r_g ∈ R, all but finitely many of which are zero. Multiplication is defined by the rule

(∑_{g ∈ G} r_g g)(∑_{h ∈ G} s_h h) = ∑_{g,h ∈ G} r_g s_h (gh).

In other words, the multiplication in RG is induced by the multiplication in G and R-bilinearity. Note that RG contains a copy of R as a subring, namely R1_G, so is certainly an R-algebra. The construction of the group algebra RG allows the possibility of studying the abstract group G by studying the category of RG-modules, so module theory can be applied to group theory. Note finally that in case G is a finite group and F is a field, the group algebra FG is a finite dimensional F-algebra.
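Here is a small Python sketch (mine, not from the notes) of the multiplication rule for RG, taking R = Q and G = S_3, and storing an element ∑ r_g g as a dictionary from group elements to coefficients:

```python
from fractions import Fraction as Q
from itertools import permutations

# G = S_3, a permutation g stored as the tuple (g(0), g(1), g(2)).
G = list(permutations(range(3)))

def compose(g, h):                     # (gh)(i) = g(h(i))
    return tuple(g[h[i]] for i in range(3))

# An element of QG is a dict {g: coefficient}, with missing keys meaning 0.
def gmul(x, y):
    """(sum_g r_g g)(sum_h s_h h) = sum_{g,h} r_g s_h (gh)."""
    prod = {}
    for g, r in x.items():
        for h, s in y.items():
            gh = compose(g, h)
            prod[gh] = prod.get(gh, Q(0)) + r * s
    return {g: c for g, c in prod.items() if c != 0}

e = (0, 1, 2)                          # identity of S_3
t = (1, 0, 2)                          # the transposition swapping 0 and 1
c = (1, 2, 0)                          # a 3-cycle

x = {e: Q(1), t: Q(2)}                 # 1 + 2*t
y = {c: Q(1, 2)}                       # (1/2)*c
print(gmul(x, y))                      # {(1, 2, 0): Fraction(1, 2), (0, 2, 1): Fraction(1, 1)}
print(gmul(x, {e: Q(1)}) == x)         # 1_G really is a multiplicative identity: True
```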

3 4.2. SOME MULTILINEAR ALGEBRA Some multilinear algebra Let F be a field (actually, everything here could be done more generally over an arbitrary commutative ring, e.g. Z but let s stick to the case of a field for simplicity). Given a vector space V over F, we have defined V V (meaning V F V ). Hence, we have (V V ) V and V (V V ). But these two are canonically isomorphic, the isomorphism mapping a generator (u v) w of the first to u (v w) in the second. From now on we ll identify the two, and write simply V V V for either. More generally, we ll write T n (V ) for V V (n times), where the tensor is put together in any order you like, all different ways giving canonically isomorphic vector spaces. Set T (V ) = n 0 T n (V ) and call it the tensor algebra on V. Note T 0 (V ) = F by convention, just a copy of the ground field, and T 1 (V ) = V. Then, T (V ) is an F -vector space given, but we can define a multiplication on it making it into an F -algebra as follows: define a map T m (V ) T n (V ) T m+n (V ) = T m (V ) T n (V ) by (x, y) x y. Then extend linearly to give a map from T (V ) T (V ) T (V ). Since is associative, the resulting multiplication is associative, and T (V ) is an F -algebra. To be quite explicit, suppose that e i (i I) is a basis for V. Then, {e i1 e i2 e in n 0, i 1,..., i n I} gives a basis of monomials for T (V ). Multiplication of the basis elements is just by joining tensors together. In other words, T (V ) looks like non-commutative polynomials in the e i (i I) an arbitrary element being a linear combination of finitely many non-commuting monomials. The importance of the tensor algebra is as follows: Universal property of the tensor algebra. Given any F -linear map f : V A to an F - algebra A, there is a unique F -linear ring homomorphism f : T (V ) A such that f = f i, where i : V T (V ) is the inclusion of V into T 1 (V ) T (V ). (Note: this universal property could be taken as the definition of T (V )!) Proof. The vector space T n (V ) is defined by the following universal property: given a multilinear map f n : V V (n copies) to a vector space W, there exists a unique F -linear map f n : T n (V ) W such that f n = f n i n, where i n is the obvious map V V T n (V ). Now define the map f n so that f n (v 1, v 2,..., v n ) = f(v 1 )f(v 2 )... f(v n ) (multiplication in the ring A). Check that it is multilinear in the v i, since A is an F -algebra. Hence it induces a unique f n : T n (V ) A. Now define f : T (V ) A by glueing all the f n together. In other words, using the basis notation above, we have that f(e i1 e in ) = f(e i1 )f(e i2 )... f(e in ). This is an F -linear ring homomorphism, and is clearly the only option to extend f to such a thing. We can restate the theorem as follows: Let X be a basis of V. Then, given any set map f from X to an F -algebra A, there exists a unique F -linear homomorphism f : T (V ) A extending f (proof: extend the map X A to a linear map V A in the unique way then apply the theorem). In other words, T (V ) plays the role of the free F -algebra on the set X. Then you can define algebras by generators and relations but we won t go into that... Next, we come to the symmetric algebra on V. Continue with V being a vector space. We ll be working with V V (n times). Call a map f : V V W

4 100 CHAPTER 4. ALGEBRAS a symmetric multilinear map if its multilinear, i.e. f(v 1,..., cv i + c v i,..., v n ) = cf(v 1,..., v i,..., v n ) + c f(v 1,..., v i,..., v n ) for each i, and moreover f(v 1,..., v i, v i+1,..., v n ) = f(v 1,..., v i+1, v i,..., v n ) for each i (so its symmetric). The symmetric power S n (V ), together with a distinguished map i : V V S n (V ), is characterized by the following universal property: given any symmetric multilinear map f : V V W to a vector space W, there exists a unique linear map f : S n (V ) W such that f = f i. Note this is exactly the same universal property as defined T n (V ) in the proof of the universal property of the tensor algebra, but we ve added the word symmetric too. As usual with universal properties, S n (V ), if it exists, is unique up to canonical isomorphism (so we call it the symmetric algebra). Still, we need to prove existence with some sort of construction. So now define S n (V ) = T n (V )/I n where I n = v 1 v i v i+1 v n v 1 v i+1 v i v n is the subspace spanned by all such terms for all v 1,..., v n V. The distinguished map i : V V S n (V ) is the obvious map V V T n (V ) followed by the quotient map T n (V ) S n (V ). Now take a symmetric multilinear map f : V V W. By the universal property defining T n (V ), there is a unique F -linear map f 1 : T n (V ) W extending f. Now since f is symmetric, we have that f 1 (v 1 v i v i+1 v n v 1 v i+1 v i v n ) = 0. Hence, f 1 annihilates all generators of I n, hence all of I n. So f 1 factors through the quotient S n (V ) of T n (V ) to induce a unique map f : S n (V ) W with f = f i. Note we have really just used the univeral property of T n followed by the universal property of quotients! So now we ve defined the nth symmetric power of a vector space. Note its customary to denote the image in S n (V ) of (v 1,..., v n ) V V under the map i by v 1 v n. So v 1 v n is the image of v 1 v n under the quotient map T n (V ) S n (V ). We can glue all S n (V ) together to define the symmetric algebra on V : S(V ) = n 0 S n (V ) where again S 0 (V ) = F, S 1 (V ) = V. Since each S n (V ) is by construction a quotient of T n (V ), we see that S(V ) is a quotient of T (V ), i.e. S(V ) = T (V )/I where I = n 0 I n. I claim in fact that I is an ideal of T (V ), so that S(V ) is actually a quotient algebra of T (V ). Actually, I claim even more, namely: Lemma. I is the two-sided ideal of T (V ) generated by the elements v w w v for all v, w V. Proof. Let J be the two-sided ideal generated by the given elements. By definition, I is spanned as an F -vector space by terms like v 1 v i v i+1 v n v 1 v i+1 v i v n = v 1 v i 1 (v i v i+1 v i+1 v i ) v i+2 v n.

This clearly lies in J, hence I ⊆ J. On the other hand, the generators of the ideal J lie in I, so it just remains to show that I is a two-sided ideal of T(V). Since T(V) is spanned by monomials of the form u = u_1 ⊗ ... ⊗ u_m, it suffices to check that for any generator v = v_1 ⊗ ... ⊗ v_i ⊗ v_{i+1} ⊗ ... ⊗ v_n − v_1 ⊗ ... ⊗ v_{i+1} ⊗ v_i ⊗ ... ⊗ v_n of I, we have that uv and vu both lie in I. But that's obvious!

The lemma shows that S(V) really is an F-algebra, as a quotient of T(V) by an ideal. Multiplication of two monomials v_1 ... v_m and w_1 ... w_n in S(V) is just by concatenation, giving v_1 ... v_m w_1 ... w_n. Moreover, since the multiplication is symmetric, we can reorder this as we like in S(V) to see that v_1 ... v_m w_1 ... w_n = w_1 ... w_n v_1 ... v_m. Hence, since these pure elements (the images of the pure tensors which generate T(V)) span S(V) as an F-vector space, we see that S(V) is a commutative F-algebra. Note not all elements of S(V) can be written as v_1 ... v_m, just as not all tensors are pure tensors.

The symmetric algebra S(V), together with the inclusion map i : V → S(V), is characterized by the following universal property (compare with the universal property of the tensor algebra):

Universal property of the symmetric algebra. Given an F-linear map f : V → A, where A is a commutative F-algebra, there exists a unique F-linear ring homomorphism f′ : S(V) → A such that f = f′ ∘ i.

Proof. Since A is commutative, the map f_1 : T(V) → A given by the universal property of the tensor algebra annihilates all elements v ⊗ w − w ⊗ v ∈ T^2(V). These generate the ideal I by the preceding lemma. Hence, f_1 factors through the quotient S(V) of T(V).

Now suppose that V is finite dimensional on basis e_1, ..., e_m. Then, we've seen S(V) before! Indeed, let F[x_1, ..., x_m] be the polynomial ring over F in indeterminates x_1, ..., x_m. The map e_i ↦ x_i extends to a unique F-linear map V → F[x_1, ..., x_m], hence, since the polynomial ring is commutative, the universal property of symmetric algebras gives us a unique F-linear homomorphism S(V) → F[x_1, ..., x_m]. Now, S(V) is spanned by the images of pure tensors of the form e_{i_1} ⊗ ... ⊗ e_{i_n}. Moreover, any such can be reordered in S(V) using the symmetric property to assume that i_1 ≤ ... ≤ i_n. Hence, S(V) is spanned by the ordered monomials of the form e_{i_1} ... e_{i_n} for all i_1 ≤ ... ≤ i_n and all n ≥ 0. Clearly, such a monomial maps to x_{i_1} ... x_{i_n} in the polynomial ring F[x_1, ..., x_m]. But we know (by definition) that the ordered monomials give a basis for the F-vector space F[x_1, ..., x_m]. Hence, they must in fact be linearly independent in S(V) too, and we've constructed an isomorphism:

Basis theorem for symmetric powers. Let V be an F-vector space of dimension m. Then, S(V) is isomorphic to F[x_1, ..., x_m], the isomorphism mapping a basis element e_i of V to the indeterminate x_i in the polynomial ring. In particular, S^n(V) has basis given by all ordered monomials in the basis of the form e_{i_1} ... e_{i_n} with 1 ≤ i_1 ≤ ... ≤ i_n ≤ m.

In this language, the universal property of symmetric algebras gives that the polynomial algebra F[x_1, ..., x_m] is the free commutative F-algebra on the generators x_1, ..., x_m. Note in particular that if V is finite dimensional, say of dimension m, the basis theorem implies that

dim S^n(V) = (m + n − 1 choose n)

(exercise!).
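The exercise can be spot-checked by brute force: the ordered monomials e_{i_1} ... e_{i_n} with 1 ≤ i_1 ≤ ... ≤ i_n ≤ m are exactly the weakly increasing index sequences, and counting them should give the binomial coefficient. A quick sketch of mine, in Python:

```python
from itertools import combinations_with_replacement
from math import comb

# dim S^n(V) = number of ordered monomials e_{i_1} ... e_{i_n} with
# 1 <= i_1 <= ... <= i_n <= m, which should equal C(m + n - 1, n).
for m in range(1, 7):          # m = dim V
    for n in range(0, 7):      # n = symmetric power
        monomials = list(combinations_with_replacement(range(1, m + 1), n))
        assert len(monomials) == comb(m + n - 1, n)
print("dim S^n(V) = C(m+n-1, n) checked for small m, n")
```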

6 102 CHAPTER 4. ALGEBRAS The last important topic in multilinear algebra I want to cover is the exterior algebra. The construction goes in much the same way as the symmetric algebra, however unlike there we do not have a known object like the polynomial algebra to compare it with so we have to work harder to get the analogous basis theorem to the basis theorem for symmetric powers. Start with V being any vector space. Define K to be the two-sided ideal of the tensor algebra T (V ) generated by the elements {x x x V }. Note that (v + w) (v + w) = v v + w v + v w + w w. So K also contains all the elements {v w + w v v, w V } automatically. Define the exterior (or Grassmann) algebra (V ) to be the quotient algebra (V ) = T (V )/K. We ll write v 1 v n for the image in (V ) of the pure tensor v 1 v n T (V ). So, now we have anti-symmetric properties like: and v 1 v i v i... v n = 0 v 1 v i v i+1... v n = v 1 v i+1 v i... v n. Since the ideal K is generated by homogeneous elements, K = n 0 K n where K n = K T n (V ). It follows that n (V ) = (V ) n 0 where n (V ) = T n (V )/K n is the subspace spanned by the images of all pure tensors of degree n. We ll call n (V ) the nth exterior power. The exterior power n (V ), together with the distinguished map i : V V n (V ), (v 1,..., v n ) v 1 v n, is characterized by the following universal property. First, call a multlinear map f : V V W to a vector space W alternating if we have that f(v 1,..., v j, v j+1,..., v n ) = 0 whenever v j = v j+1 for some j. The map i : V V n V is multilinear and alternating. Moreover, given any other multilinear, alternating map f : V V W, there exists a unique linear map f : n V W such that f = f i. The proof is the same as for the symmetric powers, you should compare the two constructions! Using this all important universal property, we can now prove: Basis theorem for exterior powers. Suppose that V is finite dimensional, with basis e 1,..., e m. Then, n (V ) has basis given by all monomials e i1 e in for all sequences 1 i 1 < < i n m. In particular, ( ) dim n m (V ) =, n and is zero for n > m.
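The count in the basis theorem is easy to sanity-check numerically before reading the proof below: the strictly ordered monomials correspond to strictly increasing index sequences, which itertools.combinations enumerates, and there are none once n > m. A quick sketch of mine:

```python
from itertools import combinations
from math import comb

# dim Λ^n(V) = number of strictly increasing sequences 1 <= i_1 < ... < i_n <= m,
# which should equal C(m, n); in particular it is 0 once n > m.
for m in range(1, 7):          # m = dim V
    for n in range(0, 9):      # n = exterior power (allow n > m)
        basis = list(combinations(range(1, m + 1), n))
        assert len(basis) == comb(m, n)
        if n > m:
            assert basis == []
print("dim Λ^n(V) = C(m, n) checked for small m, n")
```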

Proof. Since Λ^n(V) is a quotient of T^n(V), we certainly have that Λ^n(V) is spanned by all monomials e_{i_1} ∧ ... ∧ e_{i_n}. Now, using the antisymmetric properties of ∧, we get that it is spanned by the given strictly ordered monomials. The problem is to prove that the given elements are linearly independent. For this, we proceed by induction on n, the case n = 1 being immediate since Λ^1(V) = V. Now consider n > 1. Let f_1, ..., f_m be the basis for the dual space V* dual to the given basis e_1, ..., e_m of V. For each i = 1, ..., m, we wish to define a map F_i : Λ^n(V) → Λ^{n−1}(V). Start with the multilinear, alternating map φ_i : V × ... × V → Λ^{n−1}(V) defined by

φ_i(v_1, ..., v_n) = Σ_{j=1}^{n} (−1)^j f_i(v_j) v_1 ∧ ... ∧ v_{j−1} ∧ v_{j+1} ∧ ... ∧ v_n.

It's clearly multilinear, and to check it's alternating, we just need to see that if v_j = v_{j+1} then

φ_i(v_1, ..., v_j, v_{j+1}, ..., v_n) = (−1)^j f_i(v_j) v_1 ∧ ... ∧ v_{j+1} ∧ ... ∧ v_n + (−1)^{j+1} f_i(v_{j+1}) v_1 ∧ ... ∧ v_j ∧ ... ∧ v_n,

which is zero. Now the universal property of Λ^n gives that φ_i induces a unique linear map F_i : Λ^n(V) → Λ^{n−1}(V).

Now we show that the given elements are linearly independent. Take a linear relation

Σ_{i_1 < ... < i_n} a_{i_1,...,i_n} e_{i_1} ∧ ... ∧ e_{i_n} = 0.

Apply the linear map F_1; since f_1 annihilates all e_j except for e_1 by definition, this kills every monomial with i_1 > 1, and we get

Σ_{1 < i_2 < ... < i_n} a_{1,i_2,...,i_n} e_{i_2} ∧ ... ∧ e_{i_n} = 0.

By the induction hypothesis, all such monomials are linearly independent, hence all a_{1,i_2,...,i_n} are zero. Now apply F_2 in the same way to get that all a_{2,i_2,...,i_n} are zero, etc...

Now let's focus on a special case. Suppose that dim V = n and consider Λ^n(V). Its dimension, by the theorem, is (n choose n) = 1, and it has basis just the element e_1 ∧ ... ∧ e_n. Let f : V → V be a linear map. Define a map V × ... × V → Λ^n(V) by (v_1, ..., v_n) ↦ f(v_1) ∧ ... ∧ f(v_n). One easily checks that this is multilinear and alternating. Hence, it induces a unique linear map we'll denote Λ^n f : Λ^n(V) → Λ^n(V). But Λ^n(V) is one dimensional, so such a linear map is just multiplication by a scalar. In other words, there is a unique scalar, which we'll call det f, determined by the equation

f(e_1) ∧ ... ∧ f(e_n) = (det f) e_1 ∧ ... ∧ e_n.

Of course, det f is exactly the determinant of the linear transformation f. Note our definition of determinant is basis free, always a good thing. You can see that what we're calling det f is the same as usual, as follows. Suppose the matrix of f in our fixed basis is A, defined from f(e_j) = Σ_i A_{i,j} e_i.

8 104 CHAPTER 4. ALGEBRAS Then, f(e 1 ) f(e n ) = i 1,...,i n A i1,1a i2,2... A in,ne i1 e in. Now terms on the right hand side are zero unless all i 1,..., i n are distinct, i.e. (i 1,..., i n ) = (g1, g2,..., gn) for some permutation g S n. So: f(e 1 ) f(e n ) = g S n A g1,1 A g2,2... A gn,n e g1 e gn. Finally, e g1 e gn = sgn(g)e 1 e n where sgn denotes the sign of a permutation. So: This shows that f(e 1 ) f(e n ) = det f = g S n sgn(g)a g1,1 A g2,2... A gn,n e 1 e n. g S n sgn(g)a g1,1 A g2,2... A gn,n which is exactly the usual Laplace expansion definition of determinant. Here s a final result to illustrate how nicely the universal property definition of determinant gives its properties: Multiplicativity of determinant. For linear transformations f, g : V V, we have that det(f g) = det f det g. Proof. By definition, n [f g] is the map uniquely determined by ( n [f g])(v 1 v n ) = f(g(v 1 )) f(g(v n )). But n f n g also satisfies this equation. So, n (f g) = n f n g. The left hand side is scalar multiplication by det(f g), the right hand side is scalar multiplication by det f det g. 4.3 Chain conditions Let M be a (left or right) R-module. Then, M is called Noetherian if it satisfies the ascending chain condition (ACC) on submodules. This means that every ascending chain M 1 M 2 M 3... of submodules of M eventually stabilizes, i.e. M n = M n+1 = M n+2 =... for sufficiently large n. Similarly, M is called Artinian if it satisfies the descending chain condition (DCC) on submodules. So every descending chain M 1 M 2 M 3... of submodules of M eventually stabilizes, i.e. M n = M n+1 = M n+2 =... for sufficiently large n. A ring R is called left (resp. right) Noetherian if it is Noetherian viewed as a left (resp. right) R-module. Similarly, R is called left (resp. right) Artinian if it is Artinian viewed as a left (resp. right) R-module. In the case of commutative rings, we can omit the left or right here, but we cannot in general as the pathological examples on the homework show. In this chapter, we ll mainly be concerned with the Artinian property, but it doesn t make sense to introduce one chain condition without the other. We will discuss Noetherian rings in detail in the chapter on commutative algebra. By the way, you should think of the Noetherian property

9 4.3. CHAIN CONDITIONS 105 as a quite weak finiteness property on a ring, whereas the Artinian property is rather strong (see Hopkin s theorem below for the justification of this). We already know plenty of examples of Noetherian and Artinian rings. For instance, any PID is Noetherian (Lemma 2.4.1), but it need not be Artinian (e.g. Z is not Artinian: (p) (p 2 )... is an infinitely descending chain). The ring Z p n for p prime is both Noetherian and Artinian: indeed, here there are only finitely many ideals in total, the (p i ) for 0 i n. The main source of Artinian rings is as follows. Let F be a field and suppose that R is an F -algebra. Then I claim that every R-module M which is finite dimensional as an F -vector space is both Artinian and Noetherian. Well, R-submodules of M are in particular F -vector subspaces. And clearly in a finite dimensional vector space, you cannot have infinite chains of proper subspaces. So finite dimensionality of M does the job immediately! In particular, if the F -algebra R is finite dimensional as a vector space over F, then R is both left and right Artinian and Noetherian. Now you see why finite dimensional algebras over a field (e.g. M n (F ), the group algebra F G for G a finite group, etc...) are particularly nice things! Lemma. Let 0 K i M π Q 0 be a short exact sequence of R-modules. Then M is Noetherian (resp. Artinian) if and only if both K and Q are Noetherian (resp. Artinian). Proof. I just prove the result for the Noetherian property, the Artinian case being analogous. First, suppose M satisfies ACC. Then obviously K does as it is isomorphic to a submodule of M. Similarly Q does by the lattice isomorphism theorem for modules. Conversely, suppose K and Q both satisfy ACC. Let M 1 M 2... be an ascending chain of R-submodules of M. Set K i = i 1 (i(k) M i ), Q i = π(m i ). Then, K 1 K 2... is an ascending chain of submodules of K, and Q 1 Q 2... is an ascending chain of submodules of Q. So by assumption, there exists n 1 such that K N = K n and Q N = Q n for all N n. Now, we have evident short exact sequences and 0 K n M n Q n 0 0 K N M N Q N 0 for each N n. Letting α : K n K N, β : M n M N and γ : Q n Q N be the inclusions, one obtains a commutative diagram with α and γ being isomorphisms. Then the five lemma implies that β is surjective, hence M N = M n for all N n as required Theorem. If R is left (resp. right) Noetherian, then every finitely generated left (resp. right) R-module is Noetherian. Similarly, if R is left (resp. right) Artinian, then every finitely generated left (resp. right) R-module is Artinian. Proof. Let s do this for the Artinian case, the Noetherian case being similar. So assume that R is left Artinian and M is a finitely generated left R-module. Then, M is a quotient of a free R-module with a finite basis. So it suffices to show that R n is an Artinian left R-module. We proceed by induction on n, the case n = 1 being given. For n > 1, the submodule of R n spanned by the first (n 1) basis elements is isomorphic to R (n 1), and the quotient by this submodule is isomorphic to R. By induction both R (n 1) and R are Artinian. Hence, R n is Artinian by Lemma The next results are special ones about Noetherian rings (but see Hopkin s theorem below!) Lemma. Let R be a left Noetherian ring and M be a finitely generated left R-module. Then every R-submodule of M is also finitely generated.

10 106 CHAPTER 4. ALGEBRAS Proof. By Theorem 4.3.2, M is Noetherian. Let N be an arbitrary R-submodule of M and let A be the set of finitely generated R-submodules of M contained in N. Note A is non-empty as the zero module is certainly there! Now I claim that A contains a maximal element. Well, pick M 1 A. If M 1 is maximal in A, we are done. Else, we can find M 2 A with M 1 M 2. Repeat. The process must terminate, else we construct an infinite ascending chain of submodules of N M, contradicting the fact that M is Noetherian. (Warning: as with all such arguments we have secretly appealed to the axiom of choice here! So now let N be a maximal element of A. Say N is generated by m 1,..., m n. Now take any m N. Then, (m 1,..., m n, m) is a finitely generated submodule of M contained in N and containing N. By maximality of N, we therefore have that N = (m 1,..., m n, m), i.e. m N. This shows that N = N, hence N is finitely generated Corollary. Let R be a ring. Then, The following are equivalent: (i) R is left Noetherian; (ii) every left ideal of R is finitely generated; (Similar statements hold on the right, of course). Proof. (i) (ii). This is a special case of Lemma 4.3.3, taking M = R. (ii) (i). Let I 1 I 2... be an ascending chain of left ideals of R. Then, I = n 1 I n is also a left ideal of R, hence finitely generated, by a 1,..., a m say. Then for some sufficiently large n, all of a 1,..., a m lie in I n. Hence I n = I and R is Noetherian. Now you see that for commutative rings Noetherian is an obvious generalization of a PID. Instead of insisting all ideals are generated by a single element, one has that every ideal is generated by finitely many elements. 4.4 Wedderburn structure theorems Now we have the basic language of algebras and of Artinian rings and modules, we can begin to discuss the structure of rings. The first step is to understand the structure of semisimple rings. Recall that an R-module M is simple or irreducible if it is non-zero and has no submodules other than M and (0). Schur s lemma. Let M be a simple R-module. Then, End R (M) is a division ring. Proof. Let f : M M be a non-zero R-endomorphism of M. Then, ker f and im f are both R-submodules of M, with ker f M and im f (0) since f is non-zero. Hence, since M is simple, ker f = (0) and im f = M. This shows that f is a bijection, hence invertible. Throughout the section, we will also develop parallel but stronger versions of the results dealing with finite dimensional algebras over algebraically closed fields (recall a field F is algebraically closed if every monic f(x) F [X] has a root in F ). Relative Schur s lemma. Let F be an algebraically closed field and A be an F -algebra. Let M be a simple A-module which is finite dimensional as an F -vector space. Then, End R (M) = F. Proof. Let f : M M be an A-endomorphism of M. Then in particular, f : M M is an F -linear endomorphism of a finite dimensional vector space. Since F is algebraically closed, f has an eigenvector v λ M of eigenvalue λ F (because the characteristic polynomial of f has a root in F ).

11 4.4. WEDDERBURN STRUCTURE THEOREMS 107 Now consider f λ id M. It is also an A-endomorphism of M, and moreover its kernel is non-zero as v λ is annihilated by f λ id M. Hence since M is irreducible, the kernel of f λ id M is all of M, i.e. f = λ id M. This shows that the only R-endomorphisms of M are the scalars, as required. Now, let R be a non-zero ring. Call R left semisimple if R R is semisimple as a left R-module (recall section 3.4). Similarly, R is right semisimple if R R is semisimple as a right R-module. Finally, R is simple if it has no two-sided ideals other than R and (0). Here s a lemma from the previous chapter that should be quite familiar by now Lemma. Let A be an R-algebra, M a left A-module and D = End A (M). Then, as R-algebras. End A (M n ) = M n (D) Proof. Let us write elements of M n as column vectors m 1. m n with m i M. Suppose we have f End A (M n ). Let f i,j : M M be the map sending m M to the ith coordinate of the vector 0. 0 f m 0. 0 where the m is in the jth row. Then, f i,j is an A-module homomorphism, so is an element of D. Moreover, m 1 f 1,1... f 1,n m 1 f. = (matrix multiplication!), m n f n,1... f n,n m n so that f is uniquely determined by the f i,j. Thus, we obtain an R-algebra isomorphism f (f i,j ) between End A (M n ) and M n (D). Wedderburn s first structure theorem. The following conditions on a non-zero ring R are equivalent: (i) R is simple and left Artinian; (i ) R is simple and right Artinian; (ii) R is left semisimple and all simple left R-modules are isomorphic; (ii ) R is right semisimple and all simple right R-modules are isomorphic; (iii) R is isomorphic to the matrix ring M n (D) for some n 1 and D a division ring. Moreover, the integer n and the division ring D in (iii) are determined uniquely by R (up to isomorphism). Proof. First, note the condition (iii) is left-right symmetric. So let us just prove the equivalence of (i),(ii) and (iii).

12 108 CHAPTER 4. ALGEBRAS (i) (ii). Let U be a minimal left ideal of R. This exists because R is left Artinian so we have DCC on left ideals! Then, U = Rx for any non-zero x U, and Rx is a simple left R-module. Since R is a simple ring, the non-zero two-sided ideal RxR must be all of R. Hence, R = RxR = a R Rxa. Note each Rxa is a homomorphic image of the simple left R-module Rx, so is either isomorphic to Rx or zero. Thus we have written R R as a sum of simple R-modules, so by Lemma 3.4.1, there exists a subset S R such that R = a S Rxa with each Rxa for a S being isomorphic to U. Hence, R is left semisimple. Moreover, if M is any simple left R-module, then M = R/J for a maximal left ideal J of R. We can pick a S such that xa / J. Then, the quotient map R R/J maps the simple submodule Rxa of R to a non-zero submodule of M. Hence using simplicity, M = Rxa. This shows all simple R-modules are isomorphic to U. (ii) (iii). Let U be a simple left R-module. We have that R = a S Ra for some subset S of R, where each left R-module Ra is isomorphic to U. I claim that S is finite. Indeed, 1 R lies in a S Ra hence in Ra 1 Ra n for some finite subset {a 1,..., a n } of S. But 1 R generates R as a left R-module, so in fact {a 1,..., a n } = S. Thus, R = U n for some n, uniquely determined as the composition length of R R. Now, by Schur s lemma, End R (U) = D, where D a Division ring uniquely determined up to isomorphism as the endomorphism ring of a simple left R-module. Hence applying Lemma 4.4.1, Now finally, we observe that End R ( R R) = End R (U n ) = M n (D). End R ( R R) = R op, the isomorphism in the forward direction being determined by evaluation at 1 R. To see that this is an isomorphism, one constructs the inverse map, namely, the map sending r R to f r : R R R R with f r (s) = sr for each s R ( f r is right multiplication by r ). Now we have shown that R op = Mn (D). Hence, noting that matrix transposition is an isomorphism between M n (D) op and M n (D op ), we get that R = M n (D op ) and D op is also a division ring. (iii) (i). We know that M n (D) is simple as it is Morita equivalent to D, which is a division ring (or you can prove this directly!). It is left Artinian for instance because M n (D) is finite dimensional over the division ring D. This gives (i). Corollary. Every simple left (or right) Artinian ring R is an algebra over a field. Proof. Observe M n (D) is an algebra over Z(D), which is a field. The relative version for finite dimensional algebras over algebraically closed fields is as follows:

13 4.4. WEDDERBURN STRUCTURE THEOREMS 109 Relative first structure theorem. Let F be an algebraically closed field and A be a finite dimensional F -algebra (hence automatically left and right Artinian). (i) A is simple; (ii) A is left semisimple and all simple left R-modules are isomorphic; (ii ) A is right semisimple and all simple right R-modules are isomorphic; (iii) A is isomorphic to the matrix ring M n (F ) for some n (indeed, n 2 = dim A). Proof. The proof is the same, except that one gets that the division ring D equals F since one can use the stronger relative Schur s lemma. Wedderburn s second structure theorem. Every left (or right) semisimple ring R is isomorphic to a finite product of matrix rings over division rings: R = M n1 (D 1 ) M nr (D r ) for uniquely determined n i 1 and division rings D i. Conversely, any such ring is both left and right semisimple. Proof. Since R is left semisimple, we can decompose R R as i I U i where each U i is simple. Note 1 R already lies in a sum of finitely many of the U i, hence the index set I is actually finite. Now gather together isomorphic U i s to write where RR = H 1 H m H i = U n i i for irreducible R-modules U i and integers n i 1, with U i = Uj for i j. By Schur s lemma, D i = End R (U i ) is a division ring. Moreover, there are no R-module homomorphisms between H i and H j for i j, because none of their composition factors are isomorphic. Hence: R = End R ( R R) op = EndR (H 1 H m ) op = EndR (H 1 ) op End R (H m ) op = M n1 (D 1 ) op M nm (D m ) op = Mn1 (D op 1 ) M n m (Dm op ). The integers n i are uniquely determined as the multiplicities of the irreducible left R-modules in a composition series of R R, while the division rings D i are uniquely determined up to isomorphism as the endomorphism rings of the simple left R-modules. For the converse, we need to show that a product of matrix algebras over division rings is left and right semisimple. This follows because a single matrix algebra M n (D) over a division ring is both left and right semisimple according to the first Wedderburn structure theorem. The theorem shows: Corollary. A ring R is left semisimple if and only if it is right semisimple. So we can now just call R semisimple if it is either left or right semisimple in the old sense. Also: Corollary. Any semisimple ring is both left and right Artinian. Proof. A product of finitely many matrix rings over division algebras is left and right Artinian. Finally, we need to give the relative version:

14 110 CHAPTER 4. ALGEBRAS Relative second structure theorem. Let F be an algebraically closed field and A be a finite dimensional F -algebra that is semisimple. Then, for uniquely determined integers n i 1. Proof. A = M n1 (F ) M nr (F ) Just use the relative Schur s lemma in the same proof. This theorem lays bare the structure of finite dimensional semisimple algebras over algebraically closed fields: A is uniquely determined up to isomorphism by the integers n i, which are precisely the dimensions of the simple A-modules. 4.5 The Jacobson radical I want to explain in this section how understanding of the case of semisimple rings gives information about more general rings. The key notion is that of the Jacobson radical, which will be defined using the following equivalent properties: Jacobson radical theorem. Let R be a ring, a R. The following are equivalent: (i) a annihilates every simple left R-module; (i) a annihilates every simple right R-module; (ii) a lies in every maximal left ideal of R; (ii) a lies in every maximal right ideal of R; (iii) 1 xa has a left inverse for every x R; (iii) 1 ay has a right inverse for every y R; (iv) 1 xay is a unit for every x, y R. Proof. Since (iv) is left-right symmetric, I will only prove equivalence of (i) (iv). (i) (ii). If I is a maximal left ideal of R, then R/I is a simple left R-module. So, a(r/i) = 0, i.e. a I. (ii) (iii). Assume (ii) holds but 1 xa does not have a left inverse for some x R. Then, R(1 xa) R. So, there exists a maximal left ideal I with R(1 xa) I < R. But then, 1 xa I, and a I by assumption. Hence, 1 I so I = R, a contradiction. (iii) (i). Let M be a simple left R-module. Take u M. If au 0, then Rau = M as M is simple, so u = rau for some r R. But this implies that (1 ra)u = 0, hence since (1 ra) has a left inverse by assumption, we get that u = 0. This contradiction shows that in fact au = 0 for all u M, i.e. am = 0. (iv) (iii). This is trivial (take y = 1). (i),(iii) (iv). For any y R and any simple left R-module M, aym am = 0. So ay also satisfies the condition in (i), hence (iii). So for every x R, 1 xay has a left inverse, 1 b say. The equation (1 b)(1 xay) = 1 implies b = (b 1)xay. Hence bm = 0 for all simple left R-modules M, so b satisfies (i) hence (iii). So, 1 b has a left inverse, 1 c, say. Then, hence 1 c = (1 c)(1 b)(1 xay) = 1 xay, c = xay. Now we have that 1 b is a left inverse to 1 xay, and 1 xay is a left inverse to 1 b. Hence, 1 xay is a unit with inverse 1 b.
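Before this set of elements gets a name, here is a brute-force check (my own toy example) that conditions (ii) and (iii) of the theorem pick out the same elements in the commutative ring Z/12Z: the elements a with 1 − xa a unit for every x turn out to be exactly the intersection of the two maximal ideals (2) and (3), namely the multiples of 6.

```python
n = 12
R = range(n)                       # the ring Z/12Z

def is_unit(u):
    return any((u * v) % n == 1 for v in R)

# Condition (iii): 1 - x*a is a unit (left inverse = inverse here, R is commutative) for every x.
cond_iii = {a for a in R if all(is_unit((1 - x * a) % n) for x in R)}

# Condition (ii): a lies in every maximal ideal; the maximal ideals of Z/12Z
# are (2) and (3), so their intersection is (6).
cond_ii = {a for a in R if a % 2 == 0} & {a for a in R if a % 3 == 0}

print(sorted(cond_iii), sorted(cond_ii))   # both are [0, 6]
assert cond_iii == cond_ii
```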

15 4.5. THE JACOBSON RADICAL 111 Now define the Jacobson radical J(R) to be J(R) = {a R a satisfies the equivalent conditions in the theorem}. Thus for instance using (i), J(R) is the intersection of the annihilators in R of all the simple left R-modules, or using (ii) J(R) = I I where I runs over all maximal left ideals of R. This implies that J(R) is a left ideal of R, but equally well using (ii), J(R) = I I where I runs over all maximal right ideals of R so that J is a right ideal of R. Hence, J(R) is a two-sided ideal of R. The following result is a basic trick in ring theory and maybe gives a first clue as to why the Jacobson radical is so important. Nakayama s lemma. Let R be a ring and M be a finitely generated left R-module. If J(R)M = M then M = 0. Proof. Suppose that M is non-zero and set J = J(R) for short. Let X = {m 1,..., m n } be a minimal set of generators of M, so m 1 0. Since JM = M, we can write for some j i J. So, m 1 = j 1 m j n m n (1 R j 1 )m 1 = j 2 m j n m n. But 1 R j 1 is a unit by the definition of J(R). So we get that m 1 = (1 R j 1 ) 1 j 2 m (1 R j 1 ) 1 j n m n. This contradicts the minimality of the initial set of generators chosen. We will apply Nakayama s lemma on the next chapter, but actually never in this chapter. The remainder of the section is concerned with the Jacobson radical in an Artinian ring!!! All these results are false if the ring is not Artinian... First characterization of the Jacobson radical for Artinian rings. Suppose that R is left (or right) Artinian. Then, J(R) is the unique smallest two-sided ideal of R such that R/J(R) is a semisimple algebra. Proof. Pick a maximal left ideal I 1 of R. Then (if possible) pick a maximal left ideal I 2 such that I 2 I 1 (hence I 1 I 2 is strictly smaller than I 1 ). Then (if possible) pick a maximal left ideal I 3 such that I 3 I 1 I 2 (hence I 1 I 2 I 3 is strictly smaller than I 1 I 2 ). Keep going! The process must terminate after finitely many steps, else you construct an infinite descending chain R I 1 I 1 I 2 I 1 I 2 I 3... of left ideals of R, contradicting the fact that R is left Artinian. We thus obtain finitely many maximal left ideals I 1, I 2,..., I r of R such that I 1 I r is contained in every maximal left ideal of R. In other words, J(R) = I 1 I r, so the Jacobson radical of an Artinian ring is the intersection of finitely many maximal left ideals.

16 112 CHAPTER 4. ALGEBRAS Consider the map R R/I 1 R/I r, a (a + I 1,..., a + I r ). It is an R-module map with kernel I 1 I r = J(R). So it induces an embedding R/J R/I 1 R/I r. The right hand side is a semisimple R-module, as it is a direct sum of simples, hence R/J is also a semisimple R-module. This shows that the quotient R/J is a semisimple ring. Now let K be another two-sided ideal of R such that R/K is a semisimple ring. By the lattice isomorphism theorem, we can write R/K = i I B i /K where B i is a left ideal of R containing K, i I B i = R and B i ( j i B j) = K for each i. Set C i = j i B j for short. Then, R/C i = (C i + B i )/C i = Bi /(B i C i ) = B i /K. This is simple, so C i is a maximal left ideal of R. Hence, each C i J(R). So K = C i also contains J(R). Thus J(R) is the unique smallest such ideal. Thus you see the first step to understanding the structure of an Arinian ring: understand the semisimple ring R/J(R) in the sense of Wedderburn s structure theorem Corollary. Let R be a left Artinian ring and M be a left R-module. Then, M is semisimple if and only if J(R)M = 0. Proof. If M is semisimple, it is a direct sum of simples. Now, J(R) annihilates all simple left R-modules by definition, hence J(R) annihilates M. Conversely, if J(R) annihilates M, then we can view M as an R/J(R)-module by defining (a + J(R))m = am for all a R, m M (one needs J(R) to annihilate M for this to be well-defined!). Since R/J(R) is semisimple by the previous theorem, M is a semisimple R/J(R)-module. But the R-module structure on M is just obtained by lifting the R/J(R)-module structure, so this means that M is semisimple as an R-module too. Second characterization of the Jacobson radical for Artinian rings. Let R be a left (or right) Artinian ring. Then, J(R) is a nilpotent two-sided ideal of R (i.e. J(R) n = 0 for some n > 0) and is equal to the sum of all nilpotent left ideals of R. Proof. Suppose x R is nilpotent, say x n = 0. Then, (1 x) is a unit, indeed, (1 x)(1 + x + x x n 1 ) = 1. So if I is a nilpotent left ideal of R, then every x I satisfies condition (iii) of the Jacobson radical theorem. This shows every nilpotent ideal of R is contained in J(R). It therefore just remains to prove that J(R) is itself a nilpotent ideal. Set J = J(R). Consider the chain J J 2 J 3... of two-sided ideals of R. Since R is left Artinian, the chain stabilizes, so J k = J k+1 =... for some k. Set I = J k, so I 2 = I. We need to prove that I = 0. Well, suppose for a contradiction that I 0. Choose a left ideal K of R minimal such that IK 0 (use the fact that R is left Artinian). Take any a K with Ia 0. Then, I 2 a = Ia 0, so the left ideal Ia of R coincides with K by the minimality of K. Hence, a K lies in Ia, so we can write a = xa for some x I. So, (1 x)a = 0. But x J, so 1 x is a unit, hence a = 0, which is a contradiction.
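As a small illustration of the last two results (my own example; I take for granted, rather than prove, that the strictly upper triangular matrices give the radical here): let R be the ring of upper triangular 2 × 2 matrices over Q and J the strictly upper triangular ones. Then J is a nilpotent two-sided ideal with J^2 = 0, and for every x ∈ J the element 1 − x is a unit with inverse 1 + x, exactly as in the proof above.

```python
from fractions import Fraction as Q
from itertools import product

# R = upper triangular 2x2 matrices over Q; J = strictly upper triangular ones.
def mul(a, b):
    return tuple(tuple(sum(a[i][k] * b[k][j] for k in range(2)) for j in range(2)) for i in range(2))

one  = ((Q(1), Q(0)), (Q(0), Q(1)))
zero = ((Q(0), Q(0)), (Q(0), Q(0)))

J = [((Q(0), Q(t)), (Q(0), Q(0))) for t in range(-3, 4)]   # sample elements of J

for x, y in product(J, J):
    assert mul(x, y) == zero                               # J^2 = 0, so J is a nilpotent ideal

for x in J:
    t = x[0][1]
    one_minus_x = ((Q(1), -t), (Q(0), Q(1)))               # the matrix 1 - x
    inv         = ((Q(1),  t), (Q(0), Q(1)))               # (1 - x)^{-1} = 1 + x, since x^2 = 0
    assert mul(one_minus_x, inv) == one and mul(inv, one_minus_x) == one

print("J^2 = 0 and 1 - x is a unit for every sampled x in J")
```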

Corollary. In particular, a left (or right) Artinian ring R is semisimple if and only if it has no non-zero nilpotent ideals.

We end with an important application:

Hopkins' theorem. Let R be a left Artinian ring and M be a left R-module. The following are equivalent:
(i) M is Artinian;
(ii) M is Noetherian;
(iii) M has a composition series;
(iv) M is finitely generated.

Proof. (i) ⇒ (iii) and (ii) ⇒ (iii). Let J = J(R). Then, J is nilpotent by the second characterization of the Jacobson radical. So, J^n = 0 for some n. Consider

M ⊇ JM ⊇ J^2 M ⊇ ... ⊇ J^n M = 0.

It is a descending chain of R-submodules of M. Set F_i = J^i M / J^{i+1} M. Then, F_i is annihilated by J, hence is a semisimple left R-module. Now, if M is Artinian (resp. Noetherian), so is each F_i, so each F_i is in fact a direct sum of finitely many simple R-modules. Thus, each F_i obviously has a composition series, so M does too by the lattice isomorphism theorem.

(iii) ⇒ (iv). Let M = M_1 ⊋ M_2 ⊋ ... ⊋ M_n = 0 be a composition series of M. Pick m_i ∈ M_i \ M_{i+1} for each i = 1, ..., n−1. Then, the image of m_i in M_i / M_{i+1} generates M_i / M_{i+1}, as it is a simple R-module. It follows that the m_i generate M. Hence, M is finitely generated.

(iv) ⇒ (i) is Theorem 4.3.2.

(iii) ⇒ (ii) and (iii) ⇒ (i). These both follow immediately from the Jordan-Hölder theorem (actually, the Schreier refinement lemma, using that any refinement of a composition series is trivial).

Now we can show that "Artinian implies Noetherian":

Corollary. If R is left (resp. right) Artinian, then R is left (resp. right) Noetherian.

Proof. Apply Hopkins' theorem to R viewed as a left (resp. right) module over itself, which is Artinian by assumption.

4.6 Character theory of finite groups

Wedderburn's theorem has a very important application to the study of finite groups; indeed, it is the starting point for the study of character theory of finite groups. In this section, I'll follow Rotman section 8.5 fairly closely.

Maschke's theorem. Let G be a finite group and F be a field either of characteristic 0 or of characteristic p > 0 with p not dividing |G|. Then, the group algebra FG is a semisimple algebra.

Proof. It is certainly enough to show that every left FG-module M is semisimple. We just need to show that every short exact sequence 0 → K → M → Q → 0 of FG-modules, with surjection π : M → Q, splits. Well, pick any F-linear map θ : Q → M such that π ∘ θ = id_Q. The problem is that θ need not be an FG-module map. Well, for g ∈ G, define a new F-linear map g·θ : Q → M, x ↦ gθ(g^{-1}x).

Then, (π ∘ (g·θ))(x) = π(gθ(g^{-1}x)) = g(π ∘ θ)(g^{-1}x) = gg^{-1}x = x, so each g·θ also satisfies π ∘ (g·θ) = id_Q. Now define

Av θ = (1/|G|) Σ_{g ∈ G} g·θ

(note |G| is invertible in F by the assumption on the characteristic). This is another F-linear map such that π ∘ Av θ = id_Q. Moreover, Av θ is even an FG-module map: for h ∈ G,

h (Av θ)(x) = (1/|G|) Σ_{g ∈ G} hgθ(g^{-1}x) = (1/|G|) Σ_{g ∈ G} gθ(g^{-1}hx) = (Av θ)(hx),

the middle equality holding because g ↦ hg is a bijection of G onto itself. Hence, Av θ defines the required splitting.

Remark. Maschke's theorem is actually "if and only if": FG is a semisimple algebra if and only if char F = 0 or char F does not divide |G|. Let me just give one example. Recall the corollary above: a left Artinian ring is semisimple if and only if it has no non-zero nilpotent ideals. This can be applied in particular to commutative rings: if x ∈ R is a nilpotent element, the ideal (x) it generates is a nilpotent ideal, so a commutative Artinian ring is semisimple if and only if it has no non-zero nilpotent elements. Now let G = C_p = <x>, the cyclic group of order p. Maschke's theorem shows that if F is any field of characteristic different from p, then FG is semisimple. Conversely, suppose F is a field of characteristic p. The group algebra FG is commutative and finite dimensional, hence is a commutative Artinian ring. Consider the element x − 1 ∈ FG, where x is a generator of G. We have that

(x − 1)^p = x^p + (−1)^p = 1 + (−1)^p = 0,

since in characteristic p all the intermediate binomial coefficients vanish. Hence, x − 1 is a non-zero nilpotent element. Thus, FG is not semisimple in this case.

We are going to be interested from now on in the following situation: G is a finite group (so that FG is Artinian); the field F is of characteristic 0 (so that FG is semisimple); the field F is algebraically closed (so that the souped-up version of Wedderburn's theorem holds). Actually I'll just assume F = C from now on. Then, we have shown that

(1)   CG ≅ M_{n_1}(C) × ... × M_{n_r}(C)

where r is the number of isomorphism classes of irreducible CG-modules and n_1, ..., n_r are their dimensions. These give some interesting invariants of the group G defined using modules... Note first (comparing dimensions of each side as C-vector spaces) that

|G| = (n_1)^2 + ... + (n_r)^2.

The number r has a simple group theoretic meaning too:

Lemma. The number r in (1) is equal to the number of conjugacy classes in the group G.

19 4.6. CHARACTER THEORY OF FINITE GROUPS 115 Proof. Let us compute dim Z(CG) in two different ways. First, since the center of M n (C) is one dimensional spanned by the identity matrix, dim Z(CG) = r (one for each matrix algebra in the Wedderburn decomposition). On the other hand if a g g CG g G is central then conjugating by h G you see that a g = a hgh 1 for all h G. Hence the coefficients a g are constant on conjugacy classes. Hence if C 1,..., C s are the conjugacy classes of G, the elements z i = g C i g form a basis {z 1,..., z s } for Z(CG). Hence r = dim Z(CG) = s Example. Say G = S 3. There are three conjugacy classes. Hence r = 3, i.e. there are three isomorphism classes of irreducible CS 3 -modules. Moreover n n n 2 3 = 6 so the dimensions can only be 1, 1 and Example. Say G is abelian. Then there are r = G conjugacy classes, and n n 2 r = r hence each n i = 1. This proves that there are n isomorphism classes of irreducible CG-module, and all of these modules are one dimensional. Before going any further I want to introduce a little language. A (finite dimensional complex) representation of G means a pair (V, ρ) where V is a finite dimensional complex vector space and ρ : G GL(V ) is a group homomorphism. The representation is called faithful if this map ρ is injective. Note having a representation (V, ρ) of G is exactly the same as having a finite dimensional CG-module V : given a representation (V, ρ) we get a CG-module structure on V by defining gv := ρ(g)(v); conversely given a CG-module structure on V we get a representation ρ : G GL(V ) by defining ρ(g) to be the linear map v gv. This is all one big tautology. But I will often switch between calling things CG-modules and calling them representations. Sorry. Say ρ : G GL(V ) is a representation of G. Pick a basis for V, v 1,..., v n. Then each element of GL(V ) becomes an invertible n n matrix, and in this way you can regard ρ instead as a group homomorphism ρ : G GL n (C). This is a matrix representation: ρ maps the group G to the group of n n invertible matrices. You should compare the notion of matrix representation with that of permutation representation from before: that was a map from G to the symmetric group S n for some n. It corresponded to having a G-set X of size n... Its all very analogous to what we re doing now, but sets have been replaced by vector spaces. Before we called a permutation representation faithful if the map G S n was injective (then it embedded G as a subgroup of a symmetric group); now we are calling a (linear) representation faithful if the map G GL n (C) is injective (then it embeds G as a subgroup of the matrix group) Example. Recall that GL(C) = GL 1 (C) is just the multiplicative group C. For any group G, there is always the trivial representation, namely, the homomorphism mapping every g G to 1 GL 1 (C). This corresponds to the trivial CG-module equal to C as a vector space with every g G acting as Example. For the symmetric group S n, there is a group homomorphism sgn : S n {±1}. The latter group sits inside C, so you can view this as another 1-dimensional representation, the sign representation. The corresponding module is not isomorphic to the trivial module (providing n > 1). Now we ve constructed both of the one dimensonal CS 3 -modules: one is trivial, one is sign. What about the two dimensional CS 3 -module? There is a simple way to construct (linear) representations of G out of permutation representations. Suppose X = {x 1,..., x n } is a finite G-set. 
Let CX be the C-vector space on basis X. Each
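As a sketch of the construction being started here (each g ∈ G permutes the basis X of CX, so it acts on CX by a permutation matrix), here is my own illustration for G = S_3 acting on X = {1, 2, 3}; it produces a 3-dimensional matrix representation S_3 → GL_3(C).

```python
from itertools import permutations

# G = S_3 acting on X = {0, 1, 2}; each g acts on CX by the permutation matrix
# P(g) with P(g)[i][j] = 1 if g(j) = i, else 0.  This gives a matrix
# representation rho : G -> GL_3(C), i.e. CX is a 3-dimensional CG-module.
G = list(permutations(range(3)))

def P(g):
    return tuple(tuple(1 if g[j] == i else 0 for j in range(3)) for i in range(3))

def matmul(a, b):
    return tuple(tuple(sum(a[i][k] * b[k][j] for k in range(3)) for j in range(3)) for i in range(3))

def compose(g, h):                     # (gh)(i) = g(h(i))
    return tuple(g[h[i]] for i in range(3))

# Check that P is a group homomorphism: P(gh) = P(g) P(h) for all g, h.
for g in G:
    for h in G:
        assert P(compose(g, h)) == matmul(P(g), P(h))
print("P : S_3 -> GL_3(C) is a 3-dimensional (permutation) matrix representation")
```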


More information

Representation Theory

Representation Theory Part II Year 2018 2017 2016 2015 2014 2013 2012 2011 2010 2009 2008 2007 2006 2005 2018 Paper 1, Section II 19I 93 (a) Define the derived subgroup, G, of a finite group G. Show that if χ is a linear character

More information

REPRESENTATIONS OF S n AND GL(n, C)

REPRESENTATIONS OF S n AND GL(n, C) REPRESENTATIONS OF S n AND GL(n, C) SEAN MCAFEE 1 outline For a given finite group G, we have that the number of irreducible representations of G is equal to the number of conjugacy classes of G Although

More information

is an isomorphism, and V = U W. Proof. Let u 1,..., u m be a basis of U, and add linearly independent

is an isomorphism, and V = U W. Proof. Let u 1,..., u m be a basis of U, and add linearly independent Lecture 4. G-Modules PCMI Summer 2015 Undergraduate Lectures on Flag Varieties Lecture 4. The categories of G-modules, mostly for finite groups, and a recipe for finding every irreducible G-module of a

More information

NOTES ON SPLITTING FIELDS

NOTES ON SPLITTING FIELDS NOTES ON SPLITTING FIELDS CİHAN BAHRAN I will try to define the notion of a splitting field of an algebra over a field using my words, to understand it better. The sources I use are Peter Webb s and T.Y

More information

Definitions. Notations. Injective, Surjective and Bijective. Divides. Cartesian Product. Relations. Equivalence Relations

Definitions. Notations. Injective, Surjective and Bijective. Divides. Cartesian Product. Relations. Equivalence Relations Page 1 Definitions Tuesday, May 8, 2018 12:23 AM Notations " " means "equals, by definition" the set of all real numbers the set of integers Denote a function from a set to a set by Denote the image of

More information

2. Prime and Maximal Ideals

2. Prime and Maximal Ideals 18 Andreas Gathmann 2. Prime and Maximal Ideals There are two special kinds of ideals that are of particular importance, both algebraically and geometrically: the so-called prime and maximal ideals. Let

More information

Algebra Homework, Edition 2 9 September 2010

Algebra Homework, Edition 2 9 September 2010 Algebra Homework, Edition 2 9 September 2010 Problem 6. (1) Let I and J be ideals of a commutative ring R with I + J = R. Prove that IJ = I J. (2) Let I, J, and K be ideals of a principal ideal domain.

More information

L(C G (x) 0 ) c g (x). Proof. Recall C G (x) = {g G xgx 1 = g} and c g (x) = {X g Ad xx = X}. In general, it is obvious that

L(C G (x) 0 ) c g (x). Proof. Recall C G (x) = {g G xgx 1 = g} and c g (x) = {X g Ad xx = X}. In general, it is obvious that ALGEBRAIC GROUPS 61 5. Root systems and semisimple Lie algebras 5.1. Characteristic 0 theory. Assume in this subsection that chark = 0. Let me recall a couple of definitions made earlier: G is called reductive

More information

12. Projective modules The blanket assumptions about the base ring k, the k-algebra A, and A-modules enumerated at the start of 11 continue to hold.

12. Projective modules The blanket assumptions about the base ring k, the k-algebra A, and A-modules enumerated at the start of 11 continue to hold. 12. Projective modules The blanket assumptions about the base ring k, the k-algebra A, and A-modules enumerated at the start of 11 continue to hold. 12.1. Indecomposability of M and the localness of End

More information

Lecture 2. (1) Every P L A (M) has a maximal element, (2) Every ascending chain of submodules stabilizes (ACC).

Lecture 2. (1) Every P L A (M) has a maximal element, (2) Every ascending chain of submodules stabilizes (ACC). Lecture 2 1. Noetherian and Artinian rings and modules Let A be a commutative ring with identity, A M a module, and φ : M N an A-linear map. Then ker φ = {m M : φ(m) = 0} is a submodule of M and im φ is

More information

A Little Beyond: Linear Algebra

A Little Beyond: Linear Algebra A Little Beyond: Linear Algebra Akshay Tiwary March 6, 2016 Any suggestions, questions and remarks are welcome! 1 A little extra Linear Algebra 1. Show that any set of non-zero polynomials in [x], no two

More information

The most important result in this section is undoubtedly the following theorem.

The most important result in this section is undoubtedly the following theorem. 28 COMMUTATIVE ALGEBRA 6.4. Examples of Noetherian rings. So far the only rings we can easily prove are Noetherian are principal ideal domains, like Z and k[x], or finite. Our goal now is to develop theorems

More information

its image and kernel. A subgroup of a group G is a non-empty subset K of G such that k 1 k 1

its image and kernel. A subgroup of a group G is a non-empty subset K of G such that k 1 k 1 10 Chapter 1 Groups 1.1 Isomorphism theorems Throughout the chapter, we ll be studying the category of groups. Let G, H be groups. Recall that a homomorphism f : G H means a function such that f(g 1 g

More information

Theorem 5.3. Let E/F, E = F (u), be a simple field extension. Then u is algebraic if and only if E/F is finite. In this case, [E : F ] = deg f u.

Theorem 5.3. Let E/F, E = F (u), be a simple field extension. Then u is algebraic if and only if E/F is finite. In this case, [E : F ] = deg f u. 5. Fields 5.1. Field extensions. Let F E be a subfield of the field E. We also describe this situation by saying that E is an extension field of F, and we write E/F to express this fact. If E/F is a field

More information

Review of Linear Algebra

Review of Linear Algebra Review of Linear Algebra Throughout these notes, F denotes a field (often called the scalars in this context). 1 Definition of a vector space Definition 1.1. A F -vector space or simply a vector space

More information

Rings. Chapter Homomorphisms and ideals

Rings. Chapter Homomorphisms and ideals Chapter 2 Rings This chapter should be at least in part a review of stuff you ve seen before. Roughly it is covered in Rotman chapter 3 and sections 6.1 and 6.2. You should *know* well all the material

More information

Exercises on chapter 1

Exercises on chapter 1 Exercises on chapter 1 1. Let G be a group and H and K be subgroups. Let HK = {hk h H, k K}. (i) Prove that HK is a subgroup of G if and only if HK = KH. (ii) If either H or K is a normal subgroup of G

More information

ALGEBRAIC GROUPS J. WARNER

ALGEBRAIC GROUPS J. WARNER ALGEBRAIC GROUPS J. WARNER Let k be an algebraically closed field. varieties unless otherwise stated. 1. Definitions and Examples For simplicity we will work strictly with affine Definition 1.1. An algebraic

More information

Notes on nilpotent orbits Computational Theory of Real Reductive Groups Workshop. Eric Sommers

Notes on nilpotent orbits Computational Theory of Real Reductive Groups Workshop. Eric Sommers Notes on nilpotent orbits Computational Theory of Real Reductive Groups Workshop Eric Sommers 17 July 2009 2 Contents 1 Background 5 1.1 Linear algebra......................................... 5 1.1.1

More information

ALGEBRA QUALIFYING EXAM SPRING 2012

ALGEBRA QUALIFYING EXAM SPRING 2012 ALGEBRA QUALIFYING EXAM SPRING 2012 Work all of the problems. Justify the statements in your solutions by reference to specific results, as appropriate. Partial credit is awarded for partial solutions.

More information

3.1. Derivations. Let A be a commutative k-algebra. Let M be a left A-module. A derivation of A in M is a linear map D : A M such that

3.1. Derivations. Let A be a commutative k-algebra. Let M be a left A-module. A derivation of A in M is a linear map D : A M such that ALGEBRAIC GROUPS 33 3. Lie algebras Now we introduce the Lie algebra of an algebraic group. First, we need to do some more algebraic geometry to understand the tangent space to an algebraic variety at

More information

Math 429/581 (Advanced) Group Theory. Summary of Definitions, Examples, and Theorems by Stefan Gille

Math 429/581 (Advanced) Group Theory. Summary of Definitions, Examples, and Theorems by Stefan Gille Math 429/581 (Advanced) Group Theory Summary of Definitions, Examples, and Theorems by Stefan Gille 1 2 0. Group Operations 0.1. Definition. Let G be a group and X a set. A (left) operation of G on X is

More information

The Gelfand-Tsetlin Basis (Too Many Direct Sums, and Also a Graph)

The Gelfand-Tsetlin Basis (Too Many Direct Sums, and Also a Graph) The Gelfand-Tsetlin Basis (Too Many Direct Sums, and Also a Graph) David Grabovsky June 13, 2018 Abstract The symmetric groups S n, consisting of all permutations on a set of n elements, naturally contain

More information

MAT 445/ INTRODUCTION TO REPRESENTATION THEORY

MAT 445/ INTRODUCTION TO REPRESENTATION THEORY MAT 445/1196 - INTRODUCTION TO REPRESENTATION THEORY CHAPTER 1 Representation Theory of Groups - Algebraic Foundations 1.1 Basic definitions, Schur s Lemma 1.2 Tensor products 1.3 Unitary representations

More information

Exercises on chapter 0

Exercises on chapter 0 Exercises on chapter 0 1. A partially ordered set (poset) is a set X together with a relation such that (a) x x for all x X; (b) x y and y x implies that x = y for all x, y X; (c) x y and y z implies that

More information

Representations. 1 Basic definitions

Representations. 1 Basic definitions Representations 1 Basic definitions If V is a k-vector space, we denote by Aut V the group of k-linear isomorphisms F : V V and by End V the k-vector space of k-linear maps F : V V. Thus, if V = k n, then

More information

Presentation 1

Presentation 1 18.704 Presentation 1 Jesse Selover March 5, 2015 We re going to try to cover a pretty strange result. It might seem unmotivated if I do a bad job, so I m going to try to do my best. The overarching theme

More information

Since G is a compact Lie group, we can apply Schur orthogonality to see that G χ π (g) 2 dg =

Since G is a compact Lie group, we can apply Schur orthogonality to see that G χ π (g) 2 dg = Problem 1 Show that if π is an irreducible representation of a compact lie group G then π is also irreducible. Give an example of a G and π such that π = π, and another for which π π. Is this true for

More information

Projective modules: Wedderburn rings

Projective modules: Wedderburn rings Projective modules: Wedderburn rings April 10, 2008 8 Wedderburn rings A Wedderburn ring is an artinian ring which has no nonzero nilpotent left ideals. Note that if R has no left ideals I such that I

More information

(d) Since we can think of isometries of a regular 2n-gon as invertible linear operators on R 2, we get a 2-dimensional representation of G for

(d) Since we can think of isometries of a regular 2n-gon as invertible linear operators on R 2, we get a 2-dimensional representation of G for Solutions to Homework #7 0. Prove that [S n, S n ] = A n for every n 2 (where A n is the alternating group). Solution: Since [f, g] = f 1 g 1 fg is an even permutation for all f, g S n and since A n is

More information

Algebra Exam Syllabus

Algebra Exam Syllabus Algebra Exam Syllabus The Algebra comprehensive exam covers four broad areas of algebra: (1) Groups; (2) Rings; (3) Modules; and (4) Linear Algebra. These topics are all covered in the first semester graduate

More information

HARTSHORNE EXERCISES

HARTSHORNE EXERCISES HARTSHORNE EXERCISES J. WARNER Hartshorne, Exercise I.5.6. Blowing Up Curve Singularities (a) Let Y be the cusp x 3 = y 2 + x 4 + y 4 or the node xy = x 6 + y 6. Show that the curve Ỹ obtained by blowing

More information

Math 121 Homework 4: Notes on Selected Problems

Math 121 Homework 4: Notes on Selected Problems Math 121 Homework 4: Notes on Selected Problems 11.2.9. If W is a subspace of the vector space V stable under the linear transformation (i.e., (W ) W ), show that induces linear transformations W on W

More information

INJECTIVE MODULES: PREPARATORY MATERIAL FOR THE SNOWBIRD SUMMER SCHOOL ON COMMUTATIVE ALGEBRA

INJECTIVE MODULES: PREPARATORY MATERIAL FOR THE SNOWBIRD SUMMER SCHOOL ON COMMUTATIVE ALGEBRA INJECTIVE MODULES: PREPARATORY MATERIAL FOR THE SNOWBIRD SUMMER SCHOOL ON COMMUTATIVE ALGEBRA These notes are intended to give the reader an idea what injective modules are, where they show up, and, to

More information

Exercises on chapter 4

Exercises on chapter 4 Exercises on chapter 4 Always R-algebra means associative, unital R-algebra. (There are other sorts of R-algebra but we won t meet them in this course.) 1. Let A and B be algebras over a field F. (i) Explain

More information

ALGEBRA QUALIFYING EXAM PROBLEMS

ALGEBRA QUALIFYING EXAM PROBLEMS ALGEBRA QUALIFYING EXAM PROBLEMS Kent State University Department of Mathematical Sciences Compiled and Maintained by Donald L. White Version: August 29, 2017 CONTENTS LINEAR ALGEBRA AND MODULES General

More information

Chapter 2 Linear Transformations

Chapter 2 Linear Transformations Chapter 2 Linear Transformations Linear Transformations Loosely speaking, a linear transformation is a function from one vector space to another that preserves the vector space operations. Let us be more

More information

Math 210C. The representation ring

Math 210C. The representation ring Math 210C. The representation ring 1. Introduction Let G be a nontrivial connected compact Lie group that is semisimple and simply connected (e.g., SU(n) for n 2, Sp(n) for n 1, or Spin(n) for n 3). Let

More information

Chapter 3. Rings. The basic commutative rings in mathematics are the integers Z, the. Examples

Chapter 3. Rings. The basic commutative rings in mathematics are the integers Z, the. Examples Chapter 3 Rings Rings are additive abelian groups with a second operation called multiplication. The connection between the two operations is provided by the distributive law. Assuming the results of Chapter

More information

Some notes on linear algebra

Some notes on linear algebra Some notes on linear algebra Throughout these notes, k denotes a field (often called the scalars in this context). Recall that this means that there are two binary operations on k, denoted + and, that

More information

Infinite-Dimensional Triangularization

Infinite-Dimensional Triangularization Infinite-Dimensional Triangularization Zachary Mesyan March 11, 2018 Abstract The goal of this paper is to generalize the theory of triangularizing matrices to linear transformations of an arbitrary vector

More information

Representations and Linear Actions

Representations and Linear Actions Representations and Linear Actions Definition 0.1. Let G be an S-group. A representation of G is a morphism of S-groups φ G GL(n, S) for some n. We say φ is faithful if it is a monomorphism (in the category

More information

A PROOF OF BURNSIDE S p a q b THEOREM

A PROOF OF BURNSIDE S p a q b THEOREM A PROOF OF BURNSIDE S p a q b THEOREM OBOB Abstract. We prove that if p and q are prime, then any group of order p a q b is solvable. Throughout this note, denote by A the set of algebraic numbers. We

More information

SUMMARY ALGEBRA I LOUIS-PHILIPPE THIBAULT

SUMMARY ALGEBRA I LOUIS-PHILIPPE THIBAULT SUMMARY ALGEBRA I LOUIS-PHILIPPE THIBAULT Contents 1. Group Theory 1 1.1. Basic Notions 1 1.2. Isomorphism Theorems 2 1.3. Jordan- Holder Theorem 2 1.4. Symmetric Group 3 1.5. Group action on Sets 3 1.6.

More information

4.3 Composition Series

4.3 Composition Series 4.3 Composition Series Let M be an A-module. A series for M is a strictly decreasing sequence of submodules M = M 0 M 1... M n = {0} beginning with M and finishing with {0 }. The length of this series

More information

Classification of semisimple Lie algebras

Classification of semisimple Lie algebras Chapter 6 Classification of semisimple Lie algebras When we studied sl 2 (C), we discovered that it is spanned by elements e, f and h fulfilling the relations: [e, h] = 2e, [ f, h] = 2 f and [e, f ] =

More information

ne varieties (continued)

ne varieties (continued) Chapter 2 A ne varieties (continued) 2.1 Products For some problems its not very natural to restrict to irreducible varieties. So we broaden the previous story. Given an a ne algebraic set X A n k, we

More information

Reid 5.2. Describe the irreducible components of V (J) for J = (y 2 x 4, x 2 2x 3 x 2 y + 2xy + y 2 y) in k[x, y, z]. Here k is algebraically closed.

Reid 5.2. Describe the irreducible components of V (J) for J = (y 2 x 4, x 2 2x 3 x 2 y + 2xy + y 2 y) in k[x, y, z]. Here k is algebraically closed. Reid 5.2. Describe the irreducible components of V (J) for J = (y 2 x 4, x 2 2x 3 x 2 y + 2xy + y 2 y) in k[x, y, z]. Here k is algebraically closed. Answer: Note that the first generator factors as (y

More information

Representations of quivers

Representations of quivers Representations of quivers Gwyn Bellamy October 13, 215 1 Quivers Let k be a field. Recall that a k-algebra is a k-vector space A with a bilinear map A A A making A into a unital, associative ring. Notice

More information

Representations of algebraic groups and their Lie algebras Jens Carsten Jantzen Lecture III

Representations of algebraic groups and their Lie algebras Jens Carsten Jantzen Lecture III Representations of algebraic groups and their Lie algebras Jens Carsten Jantzen Lecture III Lie algebras. Let K be again an algebraically closed field. For the moment let G be an arbitrary algebraic group

More information

1 Fields and vector spaces

1 Fields and vector spaces 1 Fields and vector spaces In this section we revise some algebraic preliminaries and establish notation. 1.1 Division rings and fields A division ring, or skew field, is a structure F with two binary

More information

ELEMENTARY SUBALGEBRAS OF RESTRICTED LIE ALGEBRAS

ELEMENTARY SUBALGEBRAS OF RESTRICTED LIE ALGEBRAS ELEMENTARY SUBALGEBRAS OF RESTRICTED LIE ALGEBRAS J. WARNER SUMMARY OF A PAPER BY J. CARLSON, E. FRIEDLANDER, AND J. PEVTSOVA, AND FURTHER OBSERVATIONS 1. The Nullcone and Restricted Nullcone We will need

More information

NOTES ON FINITE FIELDS

NOTES ON FINITE FIELDS NOTES ON FINITE FIELDS AARON LANDESMAN CONTENTS 1. Introduction to finite fields 2 2. Definition and constructions of fields 3 2.1. The definition of a field 3 2.2. Constructing field extensions by adjoining

More information

ADVANCED COMMUTATIVE ALGEBRA: PROBLEM SETS

ADVANCED COMMUTATIVE ALGEBRA: PROBLEM SETS ADVANCED COMMUTATIVE ALGEBRA: PROBLEM SETS UZI VISHNE The 11 problem sets below were composed by Michael Schein, according to his course. Take into account that we are covering slightly different material.

More information

ALGEBRA II: RINGS AND MODULES OVER LITTLE RINGS.

ALGEBRA II: RINGS AND MODULES OVER LITTLE RINGS. ALGEBRA II: RINGS AND MODULES OVER LITTLE RINGS. KEVIN MCGERTY. 1. RINGS The central characters of this course are algebraic objects known as rings. A ring is any mathematical structure where you can add

More information

GRE Subject test preparation Spring 2016 Topic: Abstract Algebra, Linear Algebra, Number Theory.

GRE Subject test preparation Spring 2016 Topic: Abstract Algebra, Linear Algebra, Number Theory. GRE Subject test preparation Spring 2016 Topic: Abstract Algebra, Linear Algebra, Number Theory. Linear Algebra Standard matrix manipulation to compute the kernel, intersection of subspaces, column spaces,

More information

Bare-bones outline of eigenvalue theory and the Jordan canonical form

Bare-bones outline of eigenvalue theory and the Jordan canonical form Bare-bones outline of eigenvalue theory and the Jordan canonical form April 3, 2007 N.B.: You should also consult the text/class notes for worked examples. Let F be a field, let V be a finite-dimensional

More information

January 2016 Qualifying Examination

January 2016 Qualifying Examination January 2016 Qualifying Examination If you have any difficulty with the wording of the following problems please contact the supervisor immediately. All persons responsible for these problems, in principle,

More information

A PRIMER ON SESQUILINEAR FORMS

A PRIMER ON SESQUILINEAR FORMS A PRIMER ON SESQUILINEAR FORMS BRIAN OSSERMAN This is an alternative presentation of most of the material from 8., 8.2, 8.3, 8.4, 8.5 and 8.8 of Artin s book. Any terminology (such as sesquilinear form

More information

5 Dedekind extensions

5 Dedekind extensions 18.785 Number theory I Fall 2016 Lecture #5 09/22/2016 5 Dedekind extensions In this lecture we prove that the integral closure of a Dedekind domain in a finite extension of its fraction field is also

More information

and this makes M into an R-module by (1.2). 2

and this makes M into an R-module by (1.2). 2 1. Modules Definition 1.1. Let R be a commutative ring. A module over R is set M together with a binary operation, denoted +, which makes M into an abelian group, with 0 as the identity element, together

More information

LECTURE 4: REPRESENTATION THEORY OF SL 2 (F) AND sl 2 (F)

LECTURE 4: REPRESENTATION THEORY OF SL 2 (F) AND sl 2 (F) LECTURE 4: REPRESENTATION THEORY OF SL 2 (F) AND sl 2 (F) IVAN LOSEV In this lecture we will discuss the representation theory of the algebraic group SL 2 (F) and of the Lie algebra sl 2 (F), where F is

More information

Algebra Exam Topics. Updated August 2017

Algebra Exam Topics. Updated August 2017 Algebra Exam Topics Updated August 2017 Starting Fall 2017, the Masters Algebra Exam will have 14 questions. Of these students will answer the first 8 questions from Topics 1, 2, and 3. They then have

More information

THE CLOSED-POINT ZARISKI TOPOLOGY FOR IRREDUCIBLE REPRESENTATIONS. K. R. Goodearl and E. S. Letzter

THE CLOSED-POINT ZARISKI TOPOLOGY FOR IRREDUCIBLE REPRESENTATIONS. K. R. Goodearl and E. S. Letzter THE CLOSED-POINT ZARISKI TOPOLOGY FOR IRREDUCIBLE REPRESENTATIONS K. R. Goodearl and E. S. Letzter Abstract. In previous work, the second author introduced a topology, for spaces of irreducible representations,

More information

ALGEBRAIC GEOMETRY COURSE NOTES, LECTURE 2: HILBERT S NULLSTELLENSATZ.

ALGEBRAIC GEOMETRY COURSE NOTES, LECTURE 2: HILBERT S NULLSTELLENSATZ. ALGEBRAIC GEOMETRY COURSE NOTES, LECTURE 2: HILBERT S NULLSTELLENSATZ. ANDREW SALCH 1. Hilbert s Nullstellensatz. The last lecture left off with the claim that, if J k[x 1,..., x n ] is an ideal, then

More information

11. Finitely-generated modules

11. Finitely-generated modules 11. Finitely-generated modules 11.1 Free modules 11.2 Finitely-generated modules over domains 11.3 PIDs are UFDs 11.4 Structure theorem, again 11.5 Recovering the earlier structure theorem 11.6 Submodules

More information

LECTURE NOTES AMRITANSHU PRASAD

LECTURE NOTES AMRITANSHU PRASAD LECTURE NOTES AMRITANSHU PRASAD Let K be a field. 1. Basic definitions Definition 1.1. A K-algebra is a K-vector space together with an associative product A A A which is K-linear, with respect to which

More information

Introduction to Arithmetic Geometry Fall 2013 Lecture #18 11/07/2013

Introduction to Arithmetic Geometry Fall 2013 Lecture #18 11/07/2013 18.782 Introduction to Arithmetic Geometry Fall 2013 Lecture #18 11/07/2013 As usual, all the rings we consider are commutative rings with an identity element. 18.1 Regular local rings Consider a local

More information

Math 762 Spring h Y (Z 1 ) (1) h X (Z 2 ) h X (Z 1 ) Φ Z 1. h Y (Z 2 )

Math 762 Spring h Y (Z 1 ) (1) h X (Z 2 ) h X (Z 1 ) Φ Z 1. h Y (Z 2 ) Math 762 Spring 2016 Homework 3 Drew Armstrong Problem 1. Yoneda s Lemma. We have seen that the bifunctor Hom C (, ) : C C Set is analogous to a bilinear form on a K-vector space, : V V K. Recall that

More information