Math 121 Homework 4: Notes on Selected Problems

11.2.9. If W is a subspace of the vector space V stable under the linear transformation φ (i.e., φ(W) ⊆ W), show that φ induces linear transformations φ|_W on W and φ̄ on the quotient vector space V/W. If φ|_W and φ̄ are nonsingular, prove that φ is nonsingular. Prove that the converse holds if V has finite dimension, and give a counterexample with V infinite dimensional.

Solution. Consider the square

        W ----> V
  φ|_W  |       |  φ
        v       v
        W ----> V

whose horizontal maps are the inclusion W → V. There is at most one map φ|_W completing the diagram, since the bottom map is an inclusion; and such a map exists, since the composition of the top and right maps factors as

  W → Im(W → V → V) → V,

and by the assumption φ(W) ⊆ W the inclusion on the right factors through the inclusion W → V. Similarly, there is at most one map φ̄ completing the diagram

        V ----> V/W
     φ  |       |  φ̄
        v       v
        V ----> V/W

whose horizontal maps are the projection V → V/W, since the top map is surjective; and such a map exists by the universal property of the quotient V → V/W, since precomposing the composite V → V → V/W with the inclusion W → V at the top left gives the zero map, which can be seen by adjoining the earlier commutative square involving φ|_W. We now have the commutative diagram

        W ----> V ----> V/W
  φ|_W  |    φ  |       |  φ̄
        v       v       v
        W ----> V ----> V/W.

Assume that φ|_W and φ̄ are nonsingular. Let v be an element of the kernel of φ. By commutativity of the right square, the image v + W of v in V/W is killed by φ̄; but φ̄ is injective, so v + W = 0 + W. That is, v lies in W, or more precisely there is a w in W whose image under the inclusion W → V is v. The commutativity of the left square and the

injectivity of the inclusion W → V on the bottom row then imply that φ|_W(w) = 0; but φ|_W is injective, so w = 0. Finally, the image v of w in V is also 0. This proves that φ is nonsingular.

Note that a partial converse holds even without restricting the dimension of V: if φ is nonsingular, then φ|_W must also be nonsingular, since the first map in any factorization of an injective map must itself be injective.

Assume now that V is finite dimensional and that φ is nonsingular. The dimension of the image of φ equals that of V, so φ is an isomorphism. Then φ̄ is surjective, since the last map in any factorization of a surjective map must itself be surjective. Any surjective linear map between vector spaces of the same finite dimension has trivial kernel (it induces an isomorphism from the domain modulo the kernel, and isomorphisms preserve dimension). Therefore φ̄ is nonsingular.

For an example where φ is nonsingular but φ̄ is singular (by the above, V is necessarily infinite dimensional), let V be the free vector space on the nonnegative integers {0} ∪ N, and let W be the subspace spanned by the basis vectors corresponding to N. If φ is the linear map of V that takes the basis element corresponding to i to the basis element corresponding to i + 1, then φ is injective and carries W into W, but the induced map φ̄ is singular: the image in V/W of the basis element corresponding to 0 is nonzero, yet it is killed by φ̄.

11.2.38. Let A and B be square matrices. Prove that the trace of their Kronecker product is the product of the traces: tr(A ⊗ B) = tr(A) tr(B).

Solution. The corresponding (and equivalent, given the relation between the Kronecker product and the tensor product) basis-free statement is: if φ_i : V_i → V_i is a linear map of finite-dimensional vector spaces for i = 1, 2, then the trace of φ_1 ⊗ φ_2 : V_1 ⊗ V_2 → V_1 ⊗ V_2 is the product of the traces of φ_1 and φ_2. Since the trace and the assignment (φ_1, φ_2) ↦ φ_1 ⊗ φ_2 are linear in each argument, and every endomorphism of a finite-dimensional space is a sum of rank-one maps, it suffices to prove this in the case that each φ_i has the form v ↦ α_i(v) v_{i,0} for some α_i in V_i* and v_{i,0} in V_i. Then φ_1 ⊗ φ_2 is given by

  v_1 ⊗ v_2 ↦ α_1(v_1) v_{1,0} ⊗ α_2(v_2) v_{2,0}.
Note that for i = 1, 2 the map φ_i corresponds to α_i ⊗ v_{i,0} in V_i* ⊗ V_i, and φ_1 ⊗ φ_2 corresponds to (α_1 ⊗ α_2) ⊗ (v_{1,0} ⊗ v_{2,0}) in (V_1 ⊗ V_2)* ⊗ (V_1 ⊗ V_2). (We write α_1 ⊗ α_2 for the element of (V_1 ⊗ V_2)* given by v_1 ⊗ v_2 ↦ α_1(v_1) α_2(v_2). This abuse of notation is justified because the identification of the base field with the tensor product of the base field with itself over itself gives an injection V_1* ⊗ V_2* → (V_1 ⊗ V_2)* that takes the element α_1 ⊗ α_2 of V_1* ⊗ V_2* to precisely the above functional.) Hence the trace of φ_i is α_i(v_{i,0}), and the trace of φ_1 ⊗ φ_2 is

  (α_1 ⊗ α_2)(v_{1,0} ⊗ v_{2,0}) = α_1(v_{1,0}) α_2(v_{2,0}).

This completes the proof.

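As a concrete sanity check (an illustration of ours, not part of the assigned problem), the identity tr(A ⊗ B) = tr(A) tr(B) can be verified for small integer matrices; the helper functions below are our own, and any linear-algebra library would do as well.

```python
# Sanity check of tr(A ⊗ B) = tr(A) tr(B) for small integer matrices.

def kron(A, B):
    """Kronecker product of matrices given as lists of rows."""
    m, n = len(A), len(A[0])
    p, q = len(B), len(B[0])
    return [
        [A[i][j] * B[k][l] for j in range(n) for l in range(q)]
        for i in range(m) for k in range(p)
    ]

def trace(M):
    return sum(M[i][i] for i in range(len(M)))

A = [[1, 2], [3, 4]]                        # tr(A) = 5
B = [[5, 6, 7], [8, 9, 10], [11, 12, 13]]   # tr(B) = 27

# The diagonal entry of A ⊗ B at index i*3 + k is A[i][i] * B[k][k],
# so the trace is exactly tr(A) * tr(B).
assert trace(kron(A, B)) == trace(A) * trace(B) == 135
```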
11.3.3(b). Let S be any subset of V* for some finite dimensional space V. Define Ann(S) = {v ∈ V : f(v) = 0 for all f ∈ S}. (Ann(S) is called the annihilator of S in V.) Let W_1 and W_2 be subspaces of V*. Prove that Ann(W_1 ∩ W_2) = Ann(W_1) + Ann(W_2).

Solution. For each i, W_1 ∩ W_2 ⊆ W_i, so

  {v ∈ V : f(v) = 0 for all f ∈ W_i} ⊆ {v ∈ V : f(v) = 0 for all f ∈ W_1 ∩ W_2},

that is, Ann(W_i) ⊆ Ann(W_1 ∩ W_2). Thus

  Ann(W_1) + Ann(W_2) ⊆ Ann(W_1 ∩ W_2).

For the reverse inclusion, we first prove a lemma about extending bases.

Lemma. Let V be a vector space with subspaces V_1 and V_2. Then there exists a basis B for V such that B ∩ V_i is a basis for V_i for i = 1, 2 and B ∩ (V_1 ∩ V_2) is a basis for V_1 ∩ V_2.

Proof. Let B_3 be a basis for V_3 = V_1 ∩ V_2. Since any linearly independent set is contained in a basis, for i = 1, 2 there exists a set B_i, disjoint from B_3, such that the union of B_i and B_3 is a basis for V_i. In particular the union of B_1, B_2, and B_3 spans the subspace V_1 + V_2 of V, and we now show that it is in fact independent and hence a basis for that subspace. Assume that for each i there is a finite set of distinct elements v_j^i (1 ≤ j ≤ r_i) of B_i and scalars a_j^i such that Σ a_j^i v_j^i = 0, where the sum is over all i and j. Then

  Σ_j a_j^1 v_j^1 = − Σ_j a_j^2 v_j^2 − Σ_j a_j^3 v_j^3,

and the right side lies in V_2, so Σ_j a_j^1 v_j^1 lies in V_2 and hence in V_1 ∩ V_2 = span(B_3). Since the union of B_1 and B_3 is an independent set, there is a unique way to write such an element as a linear combination, and hence a_j^1 = 0 for all j. Similarly a_j^2 = 0 for all j. Then by independence of B_3, a_j^3 = 0 for all j. This proves independence of the union of B_1, B_2, and B_3, and we may extend this set to a basis B of V. Then B is the desired basis for V.

Apply the lemma to choose a basis B for V* such that, for i = 1, 2, B ∩ W_i is a basis for W_i and B ∩ (W_1 ∩ W_2) is a basis for W_1 ∩ W_2. Let v be an element of Ann(W_1 ∩ W_2). In particular f(v) = 0 when f is in

the intersection of B ∩ W_1 and B ∩ W_2, so we may define functionals φ_1 and φ_2 on V*, that is, elements of V**, by the requirements that

  φ_1(f) = 0    for f in B ∩ W_1,
  φ_1(f) = f(v) for f in B ∩ W_2,
  φ_1(f) = f(v) for f in B \ (W_1 ∪ W_2),

and

  φ_2(f) = f(v) for f in B ∩ W_1,
  φ_2(f) = 0    for f in B ∩ W_2,
  φ_2(f) = 0    for f in B \ (W_1 ∪ W_2).

(The first two requirements in each definition overlap on B ∩ W_1 ∩ W_2 but agree there, precisely because f(v) = 0 for such f.) The natural injection V → V** is surjective because V is finite dimensional, so for each i we may choose v_i in V whose image in V** is φ_i. This means that, for each i and all f in B ∩ W_i, f(v_i) = φ_i(f) = 0, and the same then holds for every f in W_i, so v_i lies in Ann(W_i). For any f in B,

  f(v_1 + v_2) = f(v_1) + f(v_2) = φ_1(f) + φ_2(f) = f(v),

so the same is true for any f in V*, which implies that v_1 + v_2 = v. This shows that v lies in Ann(W_1) + Ann(W_2); since v was an arbitrary element of Ann(W_1 ∩ W_2), the proof is complete.

Problem 2. Let V be a finite-dimensional vector space over the field Q. Prove that there do not exist linear transformations X, Y : V → V such that XY − YX is the identity map. Show, however, that such transformations can exist for vector spaces over a finite field, or for infinite-dimensional vector spaces.

Solution. Since XY − YX has trace zero while the identity of V has trace dim V (we assume V has finite nonzero dimension), which is not zero in Q, XY − YX can never be the identity of V for a Q-vector space V.

Let k be a field and consider the polynomial ring V = k[t], which is a k-vector space. Differentiation (t^i ↦ i t^{i−1}) defines a k-linear endomorphism X of V. The k-algebra structure on V allows us to define another k-linear endomorphism Y of V, multiplication by t. By the product rule for differentiation, XY = YX + 1_V. Hence XY − YX is the identity of V.

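The k[t] example can be checked mechanically. The sketch below is our own illustration, not part of the assigned solution: polynomials over Q are represented as coefficient lists (an arbitrary encoding choice of ours), and (XY − YX)(p) = p is verified on a few sample polynomials.

```python
from fractions import Fraction

# Polynomials over Q as coefficient lists: p[i] is the coefficient of t^i.

def X(p):
    """Differentiation: t^i -> i * t^(i-1)."""
    return [Fraction(i) * c for i, c in enumerate(p)][1:] or [Fraction(0)]

def Y(p):
    """Multiplication by t: shift every coefficient up one degree."""
    return [Fraction(0)] + p

def sub(p, q):
    """Coefficientwise difference, padding to a common length."""
    n = max(len(p), len(q))
    p = p + [Fraction(0)] * (n - len(p))
    q = q + [Fraction(0)] * (n - len(q))
    return [a - b for a, b in zip(p, q)]

def trim(p):
    """Drop trailing zero coefficients (keep at least one entry)."""
    while len(p) > 1 and p[-1] == 0:
        p = p[:-1]
    return p

# Check that (XY - YX)(p) = p for some sample polynomials.
samples = [[Fraction(3)],
           [Fraction(1), Fraction(2)],
           [Fraction(0), Fraction(1), Fraction(5), Fraction(7)]]
for p in samples:
    commutator = sub(X(Y(p)), Y(X(p)))
    assert trim(commutator) == trim(p)
```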
Now assume that k has positive characteristic p. Let W be the ideal of the ring V = k[t] generated by t^p. Then W is invariant under Y, and, because the derivative of t^p is zero in characteristic p, W is also invariant under X. Therefore X and Y induce k-linear endomorphisms X̄ and Ȳ of the quotient V/W such that X̄Ȳ − ȲX̄ is the identity of V/W, which is a finite dimensional vector space.

Note. An element of the endomorphism ring of a finite dimensional vector space (over any field) is a commutator (an element of the form XY − YX) if and only if it has trace zero.

Problem 3. Let V, W be finite-dimensional spaces, and Φ : V* ⊗ W → Hom(V, W) the unique homomorphism sending v ⊗ w, for v in V* and w in W, to the map x ∈ V ↦ ⟨x, v⟩ w, where ⟨x, v⟩ denotes v(x). Prove (as asserted in class) that if X ∈ V* ⊗ W is such that the rank of Φ(X) is r, then X can be written as the sum of at most r pure tensors.

Solution. Let X be an element of V* ⊗ W, and let W′ be the image of Φ(X), a subspace of W of dimension r. Write ι : W′ → W for the inclusion and ρ : W → W/W′ for the projection. Consider the commutative diagram

  V* ⊗ W′ ----V*⊗ι----> V* ⊗ W ----V*⊗ρ----> V* ⊗ (W/W′)
     | ψ                   | Φ                   |
     v                     v                     v
  Hom(V, W′) --Hom(V,ι)--> Hom(V, W) --Hom(V,ρ)--> Hom(V, W/W′)

where the vertical maps are the natural isomorphisms and the bottom row has the property that the image of the left map equals the kernel of the right map. Since Hom(V, ρ)(Φ(X)) = 0, there exists g in Hom(V, W′) such that Hom(V, ι)(g) = Φ(X). By commutativity, (V* ⊗ ι)(ψ⁻¹(g)) and X have the same image under the middle isomorphism Φ. Therefore X = (V* ⊗ ι)(ψ⁻¹(g)) lies in the image of V* ⊗ ι. Every element of V* ⊗ W′ may be written as a linear combination of at most r = dim W′ pure tensors (expand in a basis of W′), and such a representation for ψ⁻¹(g) gives the desired representation for X, because V* ⊗ ι maps pure tensors to pure tensors.

Problem 4. Let V be a finite dimensional vector space over a field F. Let T be a linear transformation from V to V*. We say T is symmetric if, for some basis B of V with dual basis B*, the matrix of T with respect to B, B* is symmetric.
Prove that this definition is independent of the choice of B.

Solution. Consider v and w in B. The w*-component of Tv, an element of V*, may be obtained by evaluating at w, giving (Tv)(w). Similarly, the v*-component of Tw is (Tw)(v). The condition that the matrix of T with respect to B, B* be symmetric is therefore that (Tv)(w) = (Tw)(v) for all v and w in B. By bilinearity this implies, and hence is equivalent to, the stronger statement that (Tv)(w) = (Tw)(v) for all v and w in V. The latter condition makes no reference to B, so the definition of symmetric is independent of the choice of B. (The condition that T be symmetric can be restated by saying that the pairing ⟨·, ·⟩ : V × V → F given by ⟨v, w⟩ = (Tv)(w) is symmetric: ⟨v, w⟩ = ⟨w, v⟩.)
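In coordinates, the basis-independence in Problem 4 amounts to the fact that the matrix of T : V → V* transforms by congruence: if A is the matrix with respect to B, B* and P is the change-of-basis matrix to a new basis, the new matrix is PᵀAP, which is symmetric exactly when A is (since (PᵀAP)ᵀ = PᵀAᵀP). The quick check below is our own illustration, with arbitrarily chosen sample matrices A and P.

```python
# Congruence check: if A (the matrix of T with respect to a basis and its
# dual) is symmetric, then so is P^T A P, the matrix of T after the change
# of basis P.  A and P are arbitrary sample choices, not from the text.

def transpose(M):
    return [list(row) for row in zip(*M)]

def matmul(M, N):
    return [[sum(M[i][k] * N[k][j] for k in range(len(N)))
             for j in range(len(N[0]))] for i in range(len(M))]

A = [[2, 1, 0],
     [1, 3, 5],
     [0, 5, 4]]    # symmetric

P = [[1, 2, 0],
     [0, 1, 3],
     [1, 0, 1]]    # invertible change-of-basis matrix (det = 7)

C = matmul(matmul(transpose(P), A), P)
assert C == transpose(C)  # the congruent matrix is again symmetric
```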