Linear algebra and differential equations (Math 54): Lecture 10
Vivek Shende
February 24, 2016

Hello and welcome to class! As you may have observed, your usual professor isn't here today. He'll be back on Thursday. Office hours this week are moved to Thursday, from 12:00 to 2:00. In the meantime, please welcome Martin Olsson!

Hello and welcome to class! Last time, we saw the definitions of vector space and subspace, and many examples: R^n, its subspaces, spaces of functions. I learned that many of you have a much harder time figuring out whether a given subset of a function space is a subspace than with the same question for subsets of R^n. That's not surprising: thinking of function spaces as vector spaces is a new concept; it takes some time to get used to it.

Hello and welcome to class! This time, we study the notions of linear independence, spanning sets, and bases in the context of general vector spaces. In particular, we emphasize how these notions serve to give coordinates to abstract, otherwise unfamiliar vector spaces, e.g. subspaces of function spaces. This will allow us to pretend that these vector spaces are just R^n.

Review: the definition of a vector space. Definition: The data of an R-vector space is a set V, equipped with a distinguished element 0 ∈ V and two maps

+ : V × V → V
· : R × V → V

This data determines a vector space if it obeys the following rules:

u + (v + w) = (u + v) + w
v + w = w + v
v + 0 = v
c · (d · v) = (cd) · v
1 · v = v
0 · v = 0
(c + d) v = c v + d v
c (v + w) = c v + c w

Review: the definition of a subspace. Definition: If V is a vector space, we say a subset W of V is a subspace if: zero is in W; the sum of any two elements of W is in W; any scalar multiple of an element of W is in W. Fact: A subspace is itself a vector space.

Review: linear combinations and linear span. Given a vector space V, vectors v_1, ..., v_n ∈ V, and scalars c_1, ..., c_n ∈ R, we can name an element

c_1 v_1 + ... + c_n v_n ∈ V

This is said to be a linear combination of the v_i. The set of linear combinations of the v_i is the linear span of the v_i. Linear spans are subspaces.

Polynomials. Last time we learned that functions R → R form a vector space, and observed that polynomial functions form a subspace. In particular, polynomial functions form a vector space. Following the notation of the book, we write:

P : all polynomials
P_n : polynomials of degree at most n

Polynomials. You add them like this:

(5x^4 + 4x^3 + 3x^2 + 2x + 1) + (2x^3 + 4x^2 + 8x + 16) = 5x^4 + 6x^3 + 7x^2 + 10x + 17

You scalar multiply them like this:

7 · (x^2 + 2x + 1) = 7x^2 + 14x + 7

And there's a zero polynomial.
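
Since a polynomial in P_n is determined by its coefficients, these operations are exactly vector arithmetic in R^(n+1). Here is a minimal Python sketch of that idea (my illustration, not from the lecture; the helper names add and scale are mine), representing a polynomial by its coefficient list, constant term first:

```python
# Represent a polynomial in P_n by its coefficient list [c0, c1, ..., cn],
# so x^2 + 2x + 1 is [1, 2, 1]. Addition and scalar multiplication are
# then the componentwise operations of R^(n+1).

def add(p, q):
    n = max(len(p), len(q))
    p = p + [0] * (n - len(p))   # pad with zeros so the lengths match
    q = q + [0] * (n - len(q))
    return [a + b for a, b in zip(p, q)]

def scale(c, p):
    return [c * a for a in p]

# (5x^4 + 4x^3 + 3x^2 + 2x + 1) + (2x^3 + 4x^2 + 8x + 16)
print(add([1, 2, 3, 4, 5], [16, 8, 4, 2]))  # [17, 10, 7, 6, 5]
print(scale(7, [1, 2, 1]))                  # [7, 14, 7] = 7x^2 + 14x + 7
```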

Spanning sets. We can turn this around and ask: given a vector space V, find a collection of vectors v_1, v_2, ... such that every element of V is a linear combination of the v_i. In other words, find a collection of vectors which span V. Such a collection is called a spanning set.

Review: Does it span R^2?

(0, 1) : no
(1, 0), (0, 1) : yes
(2, 3), (4, 5) : yes
(2, 3), (4, 6) : no
(2, 3), (4, 6), (1, 0) : yes
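
One mechanical way to check these answers (a sketch of mine using numpy, not part of the lecture): put the vectors in the columns of a matrix; they span R^2 exactly when that matrix has rank 2.

```python
import numpy as np

# Candidate spanning sets for R^2, written as columns of a matrix.
candidates = [
    [(0, 1)],
    [(1, 0), (0, 1)],
    [(2, 3), (4, 5)],
    [(2, 3), (4, 6)],
    [(2, 3), (4, 6), (1, 0)],
]
for vecs in candidates:
    A = np.column_stack([np.array(v, dtype=float) for v in vecs])
    # The columns span R^2 exactly when rank(A) == 2.
    print(vecs, "->", "spans" if np.linalg.matrix_rank(A) == 2 else "does not span")
```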

Does it span P_2?

x^3 + 3x^2 + 3x + 1, x^2 + 2x + 1, x + 1, 1 : no (the first one isn't even in P_2)
1 : no
1, x : no
1, x, x^2 : yes (every polynomial of degree at most 2 can be written as ax^2 + bx + c)
1, x + 1, x^2 + 2x + 1 : yes
1 + 2x + 3x^2, 4 + 5x + 6x^2, 7 + 8x + 9x^2 : no (if you don't see why now, we'll discuss it in a little bit)
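
Identifying a polynomial a + bx + cx^2 with the vector (a, b, c) turns the last question back into a rank computation in R^3. A small numpy check (my illustration, not from the lecture):

```python
import numpy as np

# Coefficient vectors (constant, x, x^2) of 1+2x+3x^2, 4+5x+6x^2, 7+8x+9x^2.
A = np.column_stack([(1, 2, 3), (4, 5, 6), (7, 8, 9)]).astype(float)
print(np.linalg.matrix_rank(A))  # 2, not 3: the three polynomials do not span P_2
# Indeed, 7 + 8x + 9x^2 = 2*(4 + 5x + 6x^2) - (1 + 2x + 3x^2).
```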

What spans P? No finite set spans P. Indeed, given any finite collection of polynomials, they have some maximal degree, say n. Then no linear combination of them has degree greater than n: so they do not span P.

What spans P? There's an infinite set that spans P: all elements of P. One can do better: the set {1, x, x^2, ...} spans P.

Linear independence. Definition: Vectors {v_1, v_2, ...} in a vector space V are linearly independent if none is a linear combination of the others. Equivalently: whenever ∑_i c_i v_i = 0 for some constants c_i ∈ R, all the c_i must be zero. We already met this notion for vectors in R^n.
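
For finitely many vectors in R^n (or in any space identified with R^n via coordinates), this condition is again checkable by rank: the vectors are independent exactly when the matrix with those columns has rank equal to the number of vectors. A numpy sketch (mine, not the lecture's):

```python
import numpy as np

def independent(vectors):
    """True iff the given vectors (tuples in R^n) are linearly independent."""
    A = np.column_stack([np.array(v, dtype=float) for v in vectors])
    # Independent <=> the only solution of A c = 0 is c = 0 <=> full column rank.
    return np.linalg.matrix_rank(A) == A.shape[1]

print(independent([(1, 0), (0, 1)]))   # True
print(independent([(2, 3), (4, 6)]))   # False: (4, 6) = 2 * (2, 3)
```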

Linear independence of a single vector. The set {v} is always linearly independent unless v = 0. Indeed, suppose {v} is linearly dependent. This means that there's some a ≠ 0 such that 0 = a · v. Multiplying by 1/a and expanding according to the axioms, we have on the one hand, that anything times the zero vector is zero:

(1/a) · 0 = (1/a) · (0 · 0) = ((1/a) · 0) · 0 = 0 · 0 = 0

and on the other hand,

(1/a) · (a · v) = ((1/a) · a) · v = 1 · v = v

So v = 0. (We already saw this argument in R^n.)

Linear independence of two vectors. The set {v, w} is linearly independent unless one vector is a multiple of the other. Indeed, suppose {v, w} is linearly dependent. This means that there are some a, b, not both zero, such that

0 = a · v + b · w

If a is zero, then b · w = 0 and b is not zero; by the previous slide, w = 0 = 0 · v. If on the other hand a is not zero, then we can, using the vector space axioms, rearrange the above equation to

v = -(b/a) · w

(We already saw this argument in R^n.)

Is it linearly independent? The subset {1, x^2, x^5} of P_5? Yes: consider a linear combination a · 1 + b · x^2 + c · x^5 = a + bx^2 + cx^5. This is the zero polynomial only if a = b = c = 0.

The subset {(x + 1)^2, (x - 1)^2, x} of P? No:

(x + 1)^2 - (x - 1)^2 - 4x = 0
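
You can verify such a dependence relation symbolically, for instance with sympy (a quick check of mine, not from the notes):

```python
from sympy import symbols, expand

x = symbols('x')
# Expanding the claimed dependence relation gives the zero polynomial.
print(expand((x + 1)**2 - (x - 1)**2 - 4*x))  # 0
```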

Is it linearly independent? The subset {e^x, e^{2x}} of the space of functions? Yes: suppose one were a multiple of the other, say e^x = c · e^{2x}. Then we would have c = e^{-x}, which is not a constant.

The subset {sin(x)^2, cos(x)^2, 1} of the space of functions? No: sin(x)^2 + cos(x)^2 = 1.

Is it linearly independent? The subset {e^x, e^{2x}, e^{3x}} of the space of functions? Yes. You won't have to do this kind of thing until we start talking about differential equations, but here's a hint of things to come. Suppose a e^x + b e^{2x} + c e^{3x} = 0. If any coefficient is zero, we can use the method of the last slide. So let's assume this isn't the case. We take a derivative and get a e^x + 2b e^{2x} + 3c e^{3x} = 0. Subtracting the first equation from this, we see b e^{2x} + 2c e^{3x} = 0. Rearranging, we learn that e^x = -b/(2c), which is a contradiction: the left side is not constant. This sort of interleaving of differential calculus and linear algebra will characterize the third part of this class.
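
As a further peek ahead (my illustration, not the lecture's): the Wronskian determinant, which appears when we study differential equations, packages this derivative trick, and sympy can compute it directly.

```python
from sympy import symbols, exp, wronskian, simplify

x = symbols('x')
# The Wronskian of e^x, e^(2x), e^(3x); it is never zero, which (as we'll
# see later in the course) forces these functions to be independent.
W = wronskian([exp(x), exp(2*x), exp(3*x)], x)
print(simplify(W))  # 2*exp(6*x)
```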

Bases. Definition: A subset {v_1, v_2, ...} of a vector space V is a basis for V if it is linearly independent and spans V.

Example: As we know, the vectors e_1 = (1, 0, 0), e_2 = (0, 1, 0), e_3 = (0, 0, 1) are a basis for R^3.

Example: {1, x, x^2} is a basis for P_2. They span: every polynomial of degree at most 2 is some ax^2 + bx + c. And they're linearly independent: ax^2 + bx + c is the zero polynomial only if a, b, c are all zero.

Unique representation. Another way to write the definition: a subset {v_1, v_2, ...} of a vector space V is a basis for V if there's one (spanning) and only one (linear independence) way to write any element v ∈ V as a linear combination of the v_i.

Bases from spanning sets. Given a finite spanning set {v_i} of V, one can find a basis by iteratively throwing out vectors in the linear span of the others. Indeed, this procedure does not change the linear span of all the vectors. On the other hand, it must terminate, since the number of vectors decreases each time, and it can only terminate when the remaining vectors are linearly independent. (We already saw this argument for T : R^n → R^m.)

Example. Consider the polynomials

x + 1, x^2 + 2x + 1, 4, 4x + 3, x^2

Let's find, from among them, a basis for whatever space they span.

x^2 + 2x + 1 = (x^2) + 2 · (x + 1) - (1/4) · 4. So, let's throw it out. We're left with

x + 1, 4, 4x + 3, x^2

4x + 3 = 4 · (x + 1) - (1/4) · 4. So throw it out. We're left with

x + 1, 4, x^2

These are linearly independent.
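
The throwing-out procedure is mechanical enough to code. Here is a sketch (mine, with polynomials encoded as coefficient vectors; the lecture does this by hand) that discards any vector lying in the span of the others:

```python
import numpy as np

def in_span(v, others):
    """True iff v lies in the linear span of the vectors in `others`."""
    if not others:
        return not np.any(v)
    A = np.column_stack(others)
    # Appending v leaves the rank unchanged exactly when v is in the span.
    return np.linalg.matrix_rank(A) == np.linalg.matrix_rank(np.column_stack(others + [v]))

def prune_to_basis(vectors):
    """Iteratively throw out a vector in the span of the others."""
    vecs = [np.array(v, dtype=float) for v in vectors]
    i = 0
    while i < len(vecs):
        if in_span(vecs[i], vecs[:i] + vecs[i + 1:]):
            vecs.pop(i)   # redundant: removing it doesn't change the span
            i = 0         # restart the scan over the smaller list
        else:
            i += 1
    return vecs

# x+1, x^2+2x+1, 4, 4x+3, x^2 as coefficient vectors (constant, x, x^2):
polys = [(1, 1, 0), (1, 2, 1), (4, 0, 0), (3, 4, 0), (0, 0, 1)]
print(prune_to_basis(polys))  # three independent vectors remain
```

Which vectors survive depends on the order in which redundancies are found, but any surviving set is a basis of the same span.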

Let's see how these notions interact with linear transformations.

Review: Linear transformations. Definition: If V and W are vector spaces, a function T : V → W is said to be a linear transformation if

T(c v + c' v') = c T(v) + c' T(v')

for all c, c' in R and all v, v' in V.

Review: one-to-one and onto. A function f : X → Y is said to be:

onto if every element of Y is f of at least one element of X;
one-to-one if every element of Y is f of at most one element of X.

For a linear transformation T : R^m → R^n, we know that it is one-to-one if and only if the columns of the associated matrix are linearly independent, and onto if and only if the columns of the associated matrix span the codomain.
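
In matrix terms, both conditions reduce to a rank computation. A numpy sketch (the 3-by-2 matrix is my example, not the lecture's):

```python
import numpy as np

# T(x) = A x with A an n-by-m matrix, so T : R^m -> R^n.
A = np.array([[1.0, 0.0],
              [0.0, 1.0],
              [2.0, 3.0]])
n, m = A.shape
r = np.linalg.matrix_rank(A)
print("one-to-one:", r == m)  # True: the columns are independent
print("onto:", r == n)        # False: two columns cannot span R^3
```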

The identity function. For a set X, there's a function from X to itself which does nothing:

id_X : X → X
x ↦ x

When X is a vector space, id_X is a linear transformation. (Do you see why?) When X = R^n, the matrix of id_X is the identity matrix.

Invertible functions. A function f : X → Y is said to be invertible if there's another function which undoes it, i.e., some g : Y → X with g ∘ f = id_X and f ∘ g = id_Y. Note the similarity to the definition of an invertible matrix. Indeed, when X, Y are vector spaces and f, g are linear, then we're asking their matrices to multiply to the identity matrix. As with matrices, if f has an inverse, it's unique. We write it f^{-1}.

Invertible functions. A function is invertible if and only if it's one-to-one and onto.

If f : X → Y is invertible, then it's onto: for any y ∈ Y, we have f(f^{-1}(y)) = y. It's one-to-one: for any x with f(x) = y, we have x = f^{-1}(f(x)) = f^{-1}(y), so there's only one such x.

Conversely, if f : X → Y is one-to-one and onto, then for any y ∈ Y there is a unique x ∈ X with f(x) = y. Consider

f^{-1} : Y → X
y ↦ the unique x with f(x) = y

You can check this is the inverse.

Invertible functions. The inverse of an invertible linear transformation is again linear. Consider some T : V → W and its inverse T^{-1} : W → V:

T^{-1}(c_1 w_1 + c_2 w_2) = T^{-1}(c_1 T T^{-1}(w_1) + c_2 T T^{-1}(w_2))
= T^{-1}(T(c_1 T^{-1}(w_1) + c_2 T^{-1}(w_2)))
= c_1 T^{-1}(w_1) + c_2 T^{-1}(w_2)

So T^{-1} is linear.

Isomorphism. Invertible linear transformations are also called isomorphisms. If there's an isomorphism f : V → W, we say V and W are isomorphic vector spaces. Isomorphic vector spaces look the same to linear algebra. More precisely, any question which can be asked just in terms of operations which make sense in any vector space must have the same answer in both. You use the isomorphism f to translate back and forth.

Isomorphism. For example, the following map is an isomorphism:

P_2 → R^3
ax^2 + bx + c ↦ (a, b, c)

For the purposes of linear algebra, P_2 and R^3 are the same. For instance, we learn immediately that no collection of more than three vectors in P_2 can ever be linearly independent. For other purposes, P_2 and R^3 are quite different: it doesn't make sense to solve an element of R^3 for x, or to take its derivative.

Independence, span, bases, and linear transformations. A linear transformation T : V → W is onto if and only if the image of a spanning set is a spanning set.

Suppose a collection {v_i} of vectors spans V. To say {T(v_i)} spans W means any w in W is a linear combination of the T(v_i). Since T is onto, we can at least write w = T(v) for some v in V. Since the v_i span, we can write v = ∑_i a_i v_i, hence

w = T(v) = T(∑_i a_i v_i) = ∑_i a_i T(v_i)

For the converse: if the image of some collection of vectors in V spans W, then by linearity T must map V onto W. (I'll let you think through the details.)

Independence, span, bases, and linear transformations. A linear transformation T : V → W is one-to-one if and only if the only element sent to zero is zero.

Indeed, by definition, T is one-to-one if every element of W has at most one thing in V mapping to it. If two things map to zero, then T is certainly not one-to-one. Conversely, suppose only 0 maps to 0. Then T(v_1) = T(v_2), or in other words 0 = T(v_1) - T(v_2) = T(v_1 - v_2), implies v_1 - v_2 = 0, and thus v_1 = v_2. (We already saw this argument for T : R^n → R^m.)

Independence, span, bases, and linear transformations. A linear transformation T : V → W is one-to-one if, and only if, images of linearly independent vectors are linearly independent.

Say {v_i} in V are linearly independent, but their images in W are not. Then we must have ∑ a_i T(v_i) = 0 for some a_i not all zero. By linearity, T(∑ a_i v_i) = 0; by independence, ∑ a_i v_i ≠ 0. By the previous slide, T is not one-to-one.

Conversely, if T is not one-to-one, there is some nonzero vector v with T(v) = 0. Thus the linearly independent set {v} is sent to the non-linearly-independent set {0}.