LECTURE 16: TENSOR PRODUCTS

We address an aesthetic concern raised by bilinear forms: a bilinear function is not a linear map out of a vector space (bilinearity is not linearity with respect to the vector space structure on $U \times V$). When doing linear algebra, we want to be able to bring all of that machinery to bear, and that means we need to work only with vector spaces and linear maps. Our constructions today will answer that point.

We begin with a generalization of a bilinear form.

Definition. A function $f\colon U \times V \to W$ is bilinear if
$$f(a\vec u_1 + b\vec u_2, \vec v) = af(\vec u_1, \vec v) + bf(\vec u_2, \vec v)$$
and
$$f(\vec u, a\vec v_1 + b\vec v_2) = af(\vec u, \vec v_1) + bf(\vec u, \vec v_2).$$
Let $\operatorname{Bilin}(U, V; W)$ denote the set of all bilinear functions $U \times V \to W$. This is clearly a vector space under the usual pointwise operations.

Thus we have, for each fixed $\vec u$, a linear map $f_{\vec u} = f(\vec u, -)\colon V \to W$, and for each $\vec v \in V$, a linear map $f_{\vec v} = f(-, \vec v)\colon U \to W$. This is already a kind of generalization of the Riesz map we saw last time, and we will return to it later.

The tensor product is the unique (up to isomorphism) vector space $U \otimes V$ such that for any $W$,
$$L(U \otimes V, W) \cong \operatorname{Bilin}(U, V; W).$$
In some sense, this is our defining principle. We'll give two different constructions and then show that they are the same.

We first review the notion of a universal property. We've actually already seen this notion twice before:
(1) The free vector space on a set $X$, $F\{X\}$, had the defining property that to give a map of sets $X \to V$ was the same as to give a linear transformation $F\{X\} \to V$, and we recovered the map of sets by composing the linear map with the canonical map of sets $X \to F\{X\}$ given by the inclusion of a basis.
(2) The quotient space $V/W$ had the defining property that given a linear map $L\colon V \to U$ such that $W \subseteq \ker(L)$, we can find a unique map $V/W \to U$ such that the composite $V \to V/W \to U$ is $L$.
In both of these cases, we had two pieces of data: the vector space we wanted and a distinguished map to that vector space which factors any map of a certain kind (in the first example, maps of sets; in the second, linear maps whose kernel contains $W$).

We can do the same for the tensor product by specifying its universal property.

Definition. Let $U$ and $V$ be vector spaces. Then the tensor product is a vector space $U \otimes V$ such that
(1) there is a bilinear map $i\colon U \times V \to U \otimes V$, and
(2) given any bilinear map $f\colon U \times V \to W$, there is a unique linear map $L_f\colon U \otimes V \to W$ such that $L_f \circ i = f$.
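Before continuing, here is a minimal numerical sketch of the bilinearity conditions above (assuming Python with NumPy, which is not part of these notes): the function $f(\vec u, \vec v) = \vec u^{\mathsf T} A \vec v$ is bilinear, and we check both defining identities at a few vectors. Any such $f$ is exactly the kind of map the universal property says must factor through $U \otimes V$.

```python
# A quick numerical sanity check (illustrative only, not part of the formal
# development): f(u, v) = u^T A v is a bilinear function R^3 x R^2 -> R, and
# we verify the two defining identities at randomly chosen vectors.
import numpy as np

rng = np.random.default_rng(0)
A = rng.standard_normal((3, 2))

def f(u, v):
    """The bilinear function f(u, v) = u^T A v."""
    return u @ A @ v

u1, u2 = rng.standard_normal(3), rng.standard_normal(3)
v1, v2 = rng.standard_normal(2), rng.standard_normal(2)
a, b = 2.0, -3.0

# Linearity in the first slot: f(a u1 + b u2, v) = a f(u1, v) + b f(u2, v)
assert np.isclose(f(a * u1 + b * u2, v1), a * f(u1, v1) + b * f(u2, v1))
# Linearity in the second slot: f(u, a v1 + b v2) = a f(u, v1) + b f(u, v2)
assert np.isclose(f(u1, a * v1 + b * v2), a * f(u1, v1) + b * f(u1, v2))
```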

Of course, this definition might be defining nothing. On the other hand, if we have two vector spaces that satisfy the two conditions of the definition, then they are isomorphic (and if we work slightly harder and ensure that the map $U \times V \to U \otimes V$ is remembered, then the isomorphism is also unique).

Proposition. If $(U \otimes V)_1$ and $(U \otimes V)_2$, together with their bilinear maps $i_1$ and $i_2$ respectively, both satisfy the conditions of the definition, then there is a unique isomorphism that takes $i_1$ to $i_2$ and vice versa.

Proof. Setting $W$ in the definition equal to $(U \otimes V)_2$, we get a unique linear map $L_{i_2}\colon (U \otimes V)_1 \to (U \otimes V)_2$ such that $L_{i_2} \circ i_1 = i_2$. On the other hand, if we swap the roles of $(U \otimes V)_1$ and $(U \otimes V)_2$, then we get a map $L_{i_1}\colon (U \otimes V)_2 \to (U \otimes V)_1$. Now the composite $L_2 = L_{i_2} \circ L_{i_1}\colon (U \otimes V)_2 \to (U \otimes V)_2$ is a linear map such that $L_2 \circ i_2 = i_2$. Applying the defining property of $(U \otimes V)_2$ with $W = (U \otimes V)_2$ and the bilinear map $i_2$, we see that $L_2$ is the unique map with $L_2 \circ i_2 = i_2$ (this is the uniqueness part of the definition). Of course, the identity map is also such a map, so by uniqueness $L_2$ is the identity. The same argument shows that $L_1 = L_{i_1} \circ L_{i_2}$ is the identity, and we have constructed inverse isomorphisms.

We'll now focus on two constructions of very different flavors: an almost unusable one that has very good formal properties, and an unnatural one (with a choice [boo!] of basis) that is easier to compute with.

Construction #1. In this construction, we will produce something that obviously satisfies the conditions of the definition. We begin by thinking again about what we want: a vector space $U \otimes V$, together with a bilinear map $i\colon U \times V \to U \otimes V$, such that for any $W$,
$$L(U \otimes V, W) \cong \operatorname{Bilin}(U, V; W),$$
where the isomorphism is given by $L \mapsto L \circ i$.

We will build $U \otimes V$ and $i$ by first considering all functions (not just bilinear ones). The free vector space $F\{-\}$ has the desired universal property here:
$$L(F\{U \times V\}, W) \cong \operatorname{Map}(U \times V, W),$$
and the isomorphism is again given by composing with the inclusion $i\colon U \times V \to F\{U \times V\}$. Every bilinear function is in particular a map of sets $U \times V \to W$, so we simply need to impose restrictions that make our functions bilinear. We'll consider linearity in the first factor; linearity in the second is an obvious extension. A function $f\colon U \times V \to W$ is linear in the first factor iff
$$f(a\vec u_1 + b\vec u_2, \vec v) = af(\vec u_1, \vec v) + bf(\vec u_2, \vec v),$$
or equivalently
$$f(a\vec u_1 + b\vec u_2, \vec v) - af(\vec u_1, \vec v) - bf(\vec u_2, \vec v) = 0.$$
This is a linear combination of values of $f$ on elements of $U \times V$, so we can record the same information with $L_f$, the linear transformation $F\{U \times V\} \to W$ encoding $f$. Our identification now becomes: $f$ is linear in the first factor iff
$$L_f\bigl((a\vec u_1 + b\vec u_2, \vec v) - a(\vec u_1, \vec v) - b(\vec u_2, \vec v)\bigr) = 0,$$
since $L_f$ is linear and $L_f((\vec u, \vec v)) = f(\vec u, \vec v)$.
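To see concretely why these combinations should become zero in the tensor product, here is a small sketch (NumPy assumed; it jumps ahead to the coordinate model of Construction #2, where $\vec u \otimes \vec v$ is realized as the outer-product matrix $\vec u\,\vec v^{\mathsf T}$): the vector $(a\vec u_1 + b\vec u_2, \vec v) - a(\vec u_1, \vec v) - b(\vec u_2, \vec v)$ is sent to the zero matrix, which is exactly the relation we are about to quotient out.

```python
# Sketch (NumPy assumed): in the concrete model where u (x) v is the outer
# product u v^T, the relation vector (a u1 + b u2, v) - a(u1, v) - b(u2, v)
# maps to zero -- the relation we quotient out of F{U x V}.
import numpy as np

rng = np.random.default_rng(1)
u1, u2 = rng.standard_normal(3), rng.standard_normal(3)
v = rng.standard_normal(2)
a, b = 1.5, -0.5

lhs = np.outer(a * u1 + b * u2, v)               # image of (a u1 + b u2, v)
rhs = a * np.outer(u1, v) + b * np.outer(u2, v)  # image of a(u1, v) + b(u2, v)
assert np.allclose(lhs, rhs)                     # their difference dies in the quotient
```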

This is a much nicer condition. We now consider the subspace $I$ of $F\{U \times V\}$ spanned by all possible vectors of this form:
$$I = \bigl\langle\, (a\vec u_1 + b\vec u_2, \vec v) - a(\vec u_1, \vec v) - b(\vec u_2, \vec v),\ \ (\vec u, a\vec v_1 + b\vec v_2) - a(\vec u, \vec v_1) - b(\vec u, \vec v_2) \,\bigr\rangle.$$
Thus $f$ is bilinear iff $I \subseteq \ker(L_f)$. The universal property of the quotient then tells us that $f$ is bilinear iff $L_f$ factors uniquely through $F\{U \times V\}/I$. Here is a diagram that records much of this:
$$U \times V \xrightarrow{\ i\ } F\{U \times V\} \xrightarrow{\ \pi_I\ } F\{U \times V\}/I,$$
with $f\colon U \times V \to W$ and the diagonal maps $L_f\colon F\{U \times V\} \to W$ and $\overline{L}_f\colon F\{U \times V\}/I \to W$ all mapping down to $W$. The first diagonal map exists and is unique no matter what $f$ is. The second diagonal map exists and is unique whenever $f$ is bilinear.

Thus we define the tensor product to be $F\{U \times V\}/I$, and the structure map is the composite $\pi_I \circ i$. We must check that this is bilinear. This is easy, however. The element $(a\vec u_1 + b\vec u_2, \vec v) \in U \times V$ maps under $i$ to the basis vector of the same name. Since we know that $(a\vec u_1 + b\vec u_2, \vec v) - a(\vec u_1, \vec v) - b(\vec u_2, \vec v) \in I$, we have
$$\pi_I\bigl((a\vec u_1 + b\vec u_2, \vec v)\bigr) = a\pi_I\bigl((\vec u_1, \vec v)\bigr) + b\pi_I\bigl((\vec u_2, \vec v)\bigr),$$
and similarly in the second factor, so $\pi_I \circ i$ is bilinear. That this construction has the right universal property is immediate.

There are a few drawbacks:
(1) If the field $F$ is infinite, then $F\{U \times V\}$ is HUGE (its basis is the entire set $U \times V$, so its dimension is the cardinality of $U \times V$). Additionally, $I$ is also HUGE (and it has essentially the same dimension).
(2) It's not immediately clear that $U \otimes V$ is not the zero space, in other words that $I \neq F\{U \times V\}$.
Additionally, at the end of the day, we don't normally specify a linear transformation by saying where EVERY element goes; we just work with a basis. That brings us to Construction #2.

Construction #2. Now assume that $U$ has a basis $\{\vec u_1, \dots, \vec u_n\}$ and $V$ has a basis $\{\vec v_1, \dots, \vec v_m\}$ (infinite bases work no differently here). We start with an easy proposition.

Proposition. Let $f$ be a bilinear function $U \times V \to W$. Then $f$ is uniquely determined by its values on the pairs $(\vec u_i, \vec v_j)$, and any choice of values on these pairs determines a bilinear function.

Proof. Let $\vec u = \sum_i a_i \vec u_i$ and let $\vec v = \sum_j b_j \vec v_j$. Then by linearity in the first factor,
$$f(\vec u, \vec v) = \sum_i a_i f(\vec u_i, \vec v).$$
By linearity in the second factor, this is
$$\sum_i a_i \sum_j b_j f(\vec u_i, \vec v_j) = \sum_{i,j} a_i b_j f(\vec u_i, \vec v_j).$$
Thus specifying the values on the pairs $(\vec u_i, \vec v_j)$ determines $f$. Conversely, for any collection of values on these pairs, the formula above defines a bilinear function, so there are no additional restrictions: any collection of values gives a bilinear function.
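As a quick illustration of this proposition (NumPy assumed; the specific setup is mine, not from the notes): take $U = \mathbf{R}^3$ and $V = \mathbf{R}^2$ with their standard bases and $f(\vec u, \vec v) = \vec u^{\mathsf T} A \vec v$. The table of values $f(\vec u_i, \vec v_j)$ is just the matrix $A$, and the formula from the proof reconstructs $f$ from that table.

```python
# Sketch (NumPy assumed): a bilinear f is pinned down by the table of values
# F[i, j] = f(u_i, v_j) on basis pairs.  With standard bases and
# f(u, v) = u^T A v, the table is A itself, and we recover
# f(u, v) = sum_{i,j} a_i b_j F[i, j] for u = sum a_i e_i, v = sum b_j e_j.
import numpy as np

rng = np.random.default_rng(2)
A = rng.standard_normal((3, 2))
F = A.copy()                        # the table of values f(u_i, v_j)

u, v = rng.standard_normal(3), rng.standard_normal(2)
reconstructed = sum(u[i] * v[j] * F[i, j] for i in range(3) for j in range(2))
assert np.isclose(reconstructed, u @ A @ v)
```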

This gives us our second definition:
$$U \otimes V = F\{(\vec u_i, \vec v_j)\}.$$
From the above proposition, we know that bilinear functions $U \times V \to W$ are in one-to-one correspondence with linear maps $F\{(\vec u_i, \vec v_j)\} \to W$, and therefore we have almost verified the universal property. The only remaining point is to produce a bilinear map $U \times V \to U \otimes V$. However, using our one-to-one correspondence, we just have to produce a linear transformation $U \otimes V \to U \otimes V$. The identity works very nicely, and the corresponding bilinear map is the last piece of structure for the universal property.

This gives us a very nice, easy consequence.

Proposition. The dimension of $U \otimes V$ is $\dim(U)\dim(V)$.

Moving on. We already showed that any two things satisfying the universal property are uniquely isomorphic. Thus our two definitions of $\otimes$ agree, and we will feel free to use either depending on context. In all cases, we will denote the image of $(\vec u, \vec v)$ in $U \otimes V$ by $\vec u \otimes \vec v$.

We turn now to maps. Consider $S\colon U \to U'$ and $T\colon V \to V'$. We want to produce a new linear map $S \otimes T\colon U \otimes V \to U' \otimes V'$. To produce a linear map like this is the same as to produce a bilinear map $U \times V \to U' \otimes V'$. This, however, is essentially formal (one of the advantages of a universal property).

Proposition. If $i\colon U' \times V' \to U' \otimes V'$ is the bilinear structure map, then $i \circ (S \times T)\colon U \times V \to U' \otimes V'$ is bilinear. Here $S \times T\colon U \times V \to U' \times V'$ is given by $(S \times T)(\vec u, \vec v) = (S(\vec u), T(\vec v))$.

Proof. We first check linearity in the first factor. Our map is a composite, so we break it into two pieces and analyze them separately. First,
$$(S \times T)(a\vec u_1 + b\vec u_2, \vec v) = \bigl(S(a\vec u_1 + b\vec u_2), T(\vec v)\bigr) = \bigl(aS(\vec u_1) + bS(\vec u_2), T(\vec v)\bigr).$$
The map $i$ is bilinear, so we know that
$$i\bigl((aS(\vec u_1) + bS(\vec u_2), T(\vec v))\bigr) = a\,i\bigl((S(\vec u_1), T(\vec v))\bigr) + b\,i\bigl((S(\vec u_2), T(\vec v))\bigr).$$
This is exactly what expresses linearity in the first factor. The second factor is handled identically.

Thus we get a linear map $S \otimes T\colon U \otimes V \to U' \otimes V'$ by setting
$$(S \otimes T)(\vec u \otimes \vec v) = S(\vec u) \otimes T(\vec v)$$
and extending by linearity.
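In coordinates this is easy to compute with. Here is a sketch (NumPy assumed; the identification is mine, though it is the standard one): listing the basis $\vec u_i \otimes \vec v_j$ in lexicographic order identifies the coordinates of $\vec u \otimes \vec v$ with the Kronecker product of the coordinate vectors, and then the matrix of $S \otimes T$ is the Kronecker product of the matrices of $S$ and $T$, so the defining identity $(S \otimes T)(\vec u \otimes \vec v) = S(\vec u) \otimes T(\vec v)$ becomes a matrix identity.

```python
# Sketch (NumPy assumed): identify u (x) v with np.kron(u, v) in the basis
# {u_i (x) v_j}.  Then the matrix of S (x) T is np.kron(S, T), and
# (S (x) T)(u (x) v) = S(u) (x) T(v) is the Kronecker mixed-product identity.
import numpy as np

rng = np.random.default_rng(3)
S = rng.standard_normal((4, 3))     # S : R^3 -> R^4
T = rng.standard_normal((5, 2))     # T : R^2 -> R^5
u, v = rng.standard_normal(3), rng.standard_normal(2)

lhs = np.kron(S, T) @ np.kron(u, v)   # (S (x) T) applied to u (x) v
rhs = np.kron(S @ u, T @ v)           # S(u) (x) T(v)
assert np.allclose(lhs, rhs)

# Dimension check from the proposition: dim(U (x) V) = dim U * dim V.
assert np.kron(S, T).shape == (4 * 5, 3 * 2)
```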

Proposition. The assignment $(S, T) \mapsto S \otimes T$ is a bilinear map
$$L(U, U') \times L(V, V') \to L(U \otimes V, U' \otimes V'),$$
and thus descends to a linear map
$$L(U, U') \otimes L(V, V') \to L(U \otimes V, U' \otimes V').$$
When all spaces are finite dimensional, this is an isomorphism.

We can drop the dimensionality restrictions, but we won't explore that here.

Proof. This kind of argument is the same as the ones we have been giving. We check directly that $(S, T) \mapsto S \otimes T$ is linear in each slot, and then we conclude that we have a map. For this, consider $(aS_1 + bS_2) \otimes T$. Equality of functions means they have the same values at each point, so it suffices to show that on a generic test point $\vec u \otimes \vec v$,
$$\bigl((aS_1 + bS_2) \otimes T\bigr)(\vec u \otimes \vec v) = a(S_1 \otimes T)(\vec u \otimes \vec v) + b(S_2 \otimes T)(\vec u \otimes \vec v).$$
We write out the left-hand side:
$$\bigl((aS_1 + bS_2) \otimes T\bigr)(\vec u \otimes \vec v) = \bigl((aS_1 + bS_2)(\vec u)\bigr) \otimes T(\vec v) = \bigl(aS_1(\vec u) + bS_2(\vec u)\bigr) \otimes T(\vec v).$$
Since the tensor product in $U' \otimes V'$ (where the last term in the equality lives) is bilinear, we conclude that this equals the right-hand side of the desired equality. Linearity in the $T$ slot is handled the same way.

Showing that the descended map is an isomorphism is slightly trickier, and the careful details are left as an exercise. For now, note that if we choose a basis for all of our vector spaces, then we get a basis for the spaces of linear transformations and for the tensor products by mirroring the dual basis construction. With these bases, it is immediate that basis vectors map to basis vectors in a bijective way, and hence that the map is an isomorphism.
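As a final sanity check in the Kronecker model sketched above (NumPy assumed, illustrative only): linearity of $(S, T) \mapsto S \otimes T$ in the $S$ slot is a matrix identity, and the dimensions on the two sides of the proposition agree, consistent with the map being an isomorphism.

```python
# Sketch (NumPy assumed, Kronecker model as above): (S, T) |-> S (x) T is
# linear in the first slot, and the dimension count on both sides of
# L(U, U') (x) L(V, V') ~= L(U (x) V, U' (x) V') matches.
import numpy as np

rng = np.random.default_rng(4)
S1, S2 = rng.standard_normal((4, 3)), rng.standard_normal((4, 3))
T = rng.standard_normal((5, 2))
a, b = 0.7, -2.0

# Linearity in the S slot: (a S1 + b S2) (x) T = a (S1 (x) T) + b (S2 (x) T)
assert np.allclose(np.kron(a * S1 + b * S2, T),
                   a * np.kron(S1, T) + b * np.kron(S2, T))

# Dimension count with dim U = 3, dim U' = 4, dim V = 2, dim V' = 5:
dimU, dimU2, dimV, dimV2 = 3, 4, 2, 5
assert (dimU * dimU2) * (dimV * dimV2) == (dimU * dimV) * (dimU2 * dimV2)
```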