When F : U → V is a linear transformation, there are two special subspaces associated to F which are very important. One is a subspace of U and the other is a subspace of V. They are:

ker F ⊆ U (the kernel of F), which is { u ∈ U | F(u) = 0 }

Im(F) ⊆ V (the image of F), which is { v ∈ V | v = F(u) for some u ∈ U }.

We say that F is onto V (or surjective, "surgettivo") if Im(F) = V. We say that F is injective ("iniettivo") if ker F = {0}. Recall that F is injective if and only if F is a 1-1 function.

The important theorem which connects these various spaces with each other is the following (with the notation above):

dim U = dim(ker(F)) + dim(Im(F)).

Putting these facts together we have the following conclusion: if F : U → V is a linear transformation and dim U = dim V, then

F is injective ⟺ F is surjective.

Finding the Kernel and the Image of a linear transformation

A knowledge of the kernel of a linear transformation is important for understanding the physical nature of the linear transformation. Not only does it tell us whether the linear transformation is 1-1, it also gives us some idea of how the function behaves. Let me illustrate this with a simple example in R^2.

Example: Suppose we have a linear transformation F : R^2 → R^2 which has, with respect to the standard ordered basis (s.o.b.) of R^2, the matrix

A = [  1  -1 ]
    [ -1   1 ]

From what you learned last term: since the determinant of this matrix is 0, the linear transformation is not 1-1, i.e. there is a nonzero kernel. What can we say about the image of this linear transformation? E.g., what is the dimension of the image?

Recall the theorem that you proved last term:

Theorem: Let F : U → V be a linear transformation between vector spaces, suppose that B is a basis for U and B′ is a basis for V, and let M(F)_{B,B′} = A be the matrix for F with respect to the two bases B and B′. Then

dim Im(F) = rk A.

In fact, Im(F) is precisely the space spanned by the columns of A (naturally, expressed with respect to the basis B′ of V).

Let's apply all this theory to the matrix above. We see that the columns of A span a 1-dimensional subspace of R^2, and so the image of this linear transformation is a line in R^2.

We can use another important theorem from the first term to say something about the dimension of the kernel of F.

Theorem: Let F be as above, and suppose that dim U = d. Then

d = dim Im(F) + dim ker(F),  i.e.  rk A = d − dim ker(F).

If we apply this to the matrix A above we see that rk A = 1 and d = 2, and so dim ker(F) = 1 also. I.e. the matrix A above represents a linear transformation from R^2 to R^2 whose image is a line through the origin of R^2 and whose kernel is a line through the origin of R^2. What are these two lines?
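The two theorems just recalled can be checked numerically. The following is a small sketch (not part of the original notes; it assumes NumPy is available) that computes rk A for the example matrix and deduces dim ker(F) by rank-nullity:

```python
import numpy as np

# Matrix of F from the example, with respect to the standard basis.
A = np.array([[1.0, -1.0],
              [-1.0, 1.0]])

d = A.shape[1]                      # d = dim U = 2
rk = np.linalg.matrix_rank(A)       # dim Im(F) = rk A
dim_ker = d - rk                    # rank-nullity: d = dim Im(F) + dim ker(F)

print(rk, dim_ker)                  # prints: 1 1
```

Both the image and the kernel come out 1-dimensional, matching the two lines found above.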

Let's first calculate the kernel of this linear transformation. We want to find all the vectors u ∈ R^2 which F takes to 0 in the codomain R^2. We can use the matrix A, but we have to remember that it gives us this information in terms of the coordinates with respect to the basis used to form A. So, let's look for all the vectors u ∈ R^2 whose coordinate vector

[u]_B = (x, y),  where B is the s.o.b. of R^2,

satisfies A[u]_B = (0, 0), i.e.

[  1  -1 ] [ x ]   [ 0 ]
[ -1   1 ] [ y ] = [ 0 ]

We get

x − y = 0
−x + y = 0

If we solve these two linear equations (there is in fact only one, since the second equation is a multiple of the first), we get x = y. I.e. the kernel of this linear transformation consists of all the vectors u ∈ R^2 for which

[u]_B = (t, t),  t ∈ R.

This is the subspace of R^2 generated by the vector u = (1, 1), for example. A parametric description of this subspace (with respect to the standard basis) is

x = λ, y = λ,  for any λ ∈ R.

What about the image of F? We know that, with respect to the basis B for the codomain of F, the image is spanned by the columns of A, i.e.

Im(F) = ⟨v_1 = (1, −1), v_2 = (−1, 1)⟩
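As a numerical sanity check (again a sketch, not from the notes, assuming NumPy), we can verify that (1, 1) generates the kernel:

```python
import numpy as np

A = np.array([[1.0, -1.0],
              [-1.0, 1.0]])

u = np.array([1.0, 1.0])            # claimed generator of ker(F)
assert np.allclose(A @ u, 0)        # F(u) = 0, so u lies in the kernel

# Rank-nullity gives dim ker(F) = 2 - rk A = 2 - 1 = 1,
# so the single vector (1, 1) spans all of ker(F).
dim_ker = 2 - np.linalg.matrix_rank(A)
print(dim_ker)                      # prints: 1
```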

It is easy to see that v_2 = −v_1, and so we conclude that

Im(F) = ⟨v_1⟩ = ⟨(1, −1)⟩.

A parametric description of the image of F is

x = λ, y = −λ,  for any λ ∈ R.

We can actually see more about the nature of a linear transformation by studying this example a bit more carefully. As we noted, the image of this transformation is the set of all points on the line with implicit equation y = −x, i.e. x + y = 0. E.g. the point (2, −2) is in the image of F.

What are ALL the points in R^2 which get sent to (2, −2)? This is not a completely crazy question, as we have just answered it for the point (0, 0) in the image of F: we asked for all the points in R^2 which get sent to (0, 0) when we sought the kernel of F! Now we are looking for all the points which get sent to (2, −2). We know there are some, because (2, −2) is in the image.

There are several ways to consider this question. One is to try to find all such vectors by writing down what this means in terms of the matrix A of the linear transformation: we look for all the

[u]_B = (x, y)

for which A[u]_B = (2, −2), i.e.

[  1  -1 ] [ x ]   [  2 ]
[ -1   1 ] [ y ] = [ -2 ]

I.e. we would be trying to solve the equations

x − y = 2
−x + y = −2
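The column-space description of Im(F) can also be checked directly (a sketch assuming NumPy):

```python
import numpy as np

A = np.array([[1.0, -1.0],
              [-1.0, 1.0]])

v1, v2 = A[:, 0], A[:, 1]           # columns of A span Im(F)
assert np.allclose(v2, -v1)         # v2 = -v1, so Im(F) = <(1, -1)>

# (2, -2) = 2*v1 satisfies x + y = 0, hence lies in the image:
w = np.array([2.0, -2.0])
assert np.allclose(w, 2.0 * v1)
print(w[0] + w[1])                  # prints: 0.0
```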

i.e. find all the solutions to this system of equations. Instead of just solving the system (which is very easy), let's think about this for a moment. Suppose we had one vector, say (2, 0), which F sends to (2, −2). How would we find others? There is a very simple way: if u is a vector in ker(F), then (2, 0) + u has the same image as (2, 0), since

F((2, 0) + u) = F(2, 0) + F(u) = (2, −2) + 0 = (2, −2).

So we always get more vectors sent to (2, −2) by adding, to any vector we know gets sent to (2, −2), a vector in ker(F). On the other hand, if u_2 is any vector in R^2 which F takes to (2, −2), then

F(u_2 − (2, 0)) = F(u_2) − F(2, 0) = (0, 0),

i.e. u_2 − (2, 0) ∈ ker(F). Put another way, u_2 = (2, 0) + w where w ∈ ker(F). So we have shown that

{u ∈ R^2 | F(u) = (2, −2)} = {u ∈ R^2 | u = (2, 0) + w, w ∈ ker(F)}.

What does that set of vectors look like? A moment's reflection shows that this is a line of points in R^2 which does not go through the origin: it passes through the point (2, 0) and is parallel to the line which was ker(F).

So we see something very nice about a linear transformation, which this example illustrates. The behaviour of a linear transformation is very uniform. Moreover, we can tell a great deal about this uniformity by finding the kernel of the linear transformation. Indeed, it is true that the kernel describes all the vectors which the linear transformation carries to the 0 vector, but it also tells us how much F carries to ANY vector: the domain is partitioned into subsets, each of which looks like the kernel, and each of which gets sent to a different point by F.

(Mention the connections with systems of linear equations, homogeneous and non-homogeneous.)
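This uniformity — the fiber over (2, −2) is one particular solution plus the kernel — can be illustrated with a short check (a sketch assuming NumPy; the sample values of t are arbitrary):

```python
import numpy as np

A = np.array([[1.0, -1.0],
              [-1.0, 1.0]])
w = np.array([2.0, -2.0])

u0 = np.array([2.0, 0.0])           # one particular vector with F(u0) = w
assert np.allclose(A @ u0, w)

k = np.array([1.0, 1.0])            # generator of ker(F)
# F(u0 + t*k) = F(u0) + t*F(k) = w + 0 = w for every t:
for t in (-3.0, 0.5, 7.0):
    assert np.allclose(A @ (u0 + t * k), w)
```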

With these considerations about linear transformations made, we are now ready to describe the objects we will be studying in our geometry this semester. These will be the affine subsets of R^n.

Definition: An affine subset A of R^n is a subset of the following type: there is a u ∈ R^n and a subspace V of R^n such that

A = u + V = {u + v | v ∈ V}.

Remark: We have just seen these in the example above. The set of all vectors in R^n which a linear transformation F : R^n → R^m takes to a fixed vector w ∈ R^m is an affine subset of R^n: it is a subset of the form u + ker(F), where u is one vector that is taken to w and ker(F) is the subspace we need for the definition.

Notation: We will denote the affine subset described above by A = {u, V}.

So, given an affine subset A = {u, V} ⊆ R^n, is there anything unique about this way of writing A? A simple example will show us that not everything about this way of writing A is unique. For example, let A = {(1, 1) + V} where V = ⟨(1, 0)⟩.

Claim: A = {(2, 1) + V} also.

(Algebraic argument:) (1, 1) + (r, 0) = (2, 1) + (r − 1, 0). (Geometric argument: sketch in class.)

Definition-Theorem: Let A = {u, V} be an affine subset of R^n. Then V is uniquely determined by A and is (thus) called the subspace associated to A. In the classical literature, V is called the giacitura of A.

Proof: Suppose that A = {u, V} = {u′, V′}. We want to show that V = V′. First notice that u = u + 0 and u′ = u′ + 0 are both in A. Moreover, for any vector w ∈ A, w − u ∈ V and w − u′ ∈ V′.

In particular this is true for w = u′ and w = u. So we conclude that

u′ − u ∈ V,  u − u′ ∈ V′,  and so  u′ − u = −(u − u′) ∈ V ∩ V′.

Now we will show that V′ ⊆ V. (I leave it to the interested student to prove the other inclusion, V ⊆ V′, and hence that V = V′, as we wanted.)

So, let v′ ∈ V′. Then u′ + v′ ∈ A and hence (u′ + v′) − u ∈ V, i.e. (u′ − u) + v′ ∈ V. But u′ − u is already in V. Therefore we must have v′ ∈ V. This finishes the proof that V′ ⊆ V.

So, from the last theorem we see that the giacitura V of an affine subset A = {u, V} of R^n is uniquely determined by the subset, but the vector u is not (see the example above). How much liberty exactly do we have in choosing u? Well, it is obvious from the description of A as the set of all vectors of the form u + v, with v ∈ V, that u ∈ A. So that is a restriction on the choice of u. It turns out to be the only restriction!

Proposition: Let A = {u, V} be an affine subset of R^n. Then for any element u′ ∈ A we have A = {u′, V}.

Proof: It will be enough to show that for u′ ∈ A we have A = u′ + V. Since we are given that A = u + V, it will be enough to show that u′ + V = u + V. (I will only show one of the inclusions and leave the reverse inclusion to the interested student.)

Since u′ ∈ A we can write u′ = u + v for some vector v ∈ V. Now let x ∈ u′ + V; then x = u′ + v_1 where v_1 ∈ V. We can also write

x = (u + v) + v_1 = u + (v + v_1).

But since v and v_1 are in V, and V is a subspace of R^n, v + v_1 is also in V. Thus x = u + v_2 where v_2 = v + v_1 ∈ V. It follows that x ∈ u + V. That finishes one of the inclusions we needed to verify.

Examples:

a) The affine subsets of R^1 are the subsets which consist of a single point, and the subset which is all of R^1 itself. (We also include the empty set as an affine subset, as a convention.) (Explain.)

b) The affine subsets of R^2 are the subsets which consist of: a single point (these all have giacitura (0)); the lines in R^2 (these all have giacitura a line through the origin of R^2); and finally all of R^2. (Explain.)

c) The affine subsets of R^3 are planes, lines and points, as well as all of R^3. (Explain.)
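The uniqueness of the giacitura and the freedom in choosing the base point can be illustrated numerically for the earlier example A = {(1, 1), V} with V = ⟨(1, 0)⟩ (a sketch assuming NumPy; the sample parameters t are arbitrary):

```python
import numpy as np

gen = np.array([1.0, 0.0])          # generator of the giacitura V
u = np.array([1.0, 1.0])            # original base point of A = u + V
u2 = np.array([2.0, 1.0])           # another point of A

# u2 - u must lie in V, i.e. be a multiple of (1, 0):
assert np.isclose((u2 - u)[1], 0.0)

# Any point of u + V can be rewritten as u2 + (something in V),
# so u + V and u2 + V are the same affine subset:
for t in (-2.0, 0.0, 3.5):
    p = u + t * gen                 # a point of u + V
    assert np.isclose((p - u2)[1], 0.0)   # p - u2 is in V, so p is in u2 + V
```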