NOTES (1) FOR MATH 375, FALL 2012

1 Vector Spaces

1.1 Axioms

Linear algebra grows out of the problem of solving simultaneous systems of linear equations such as

    3x + 2y = 5,
    x − 3y = 9,                    (1.1.1)

or

    2x + 3y − z = 4,
    x − 4y + 2z = −1,              (1.1.2)
    3x + 10y − 4z = 9.

While you are probably familiar with techniques for solving (1.1.1), the system in (1.1.2) is a bit trickier since there are actually infinitely many solutions. In both of these examples there are the same number of equations as unknowns, but it is also of interest to study systems where these numbers can be different, such as

    3x + 2y = 5,
    x − 3y = 9,                    (1.1.3)
    x + 4y = 11,

or

    2x + 3y − z = 4,               (1.1.4)
    x − 4y + 2z = −1.

We begin by studying the spaces of all possible solutions to any of these systems. Thus we consider all possible pairs of real numbers (x, y) for systems (1.1.1) and (1.1.3), and all triples of real numbers (x, y, z) for the systems (1.1.2) and (1.1.4). More generally, we want to be able to study a system of m equations in n unknowns:

    c_{1,1} x_1 + c_{1,2} x_2 + ... + c_{1,n−1} x_{n−1} + c_{1,n} x_n = d_1,
    c_{2,1} x_1 + c_{2,2} x_2 + ... + c_{2,n−1} x_{n−1} + c_{2,n} x_n = d_2,
    ...                                                                            (1.1.5)
    c_{m,1} x_1 + c_{m,2} x_2 + ... + c_{m,n−1} x_{n−1} + c_{m,n} x_n = d_m.

Here the possible solutions are n-tuples of real numbers (x_1, x_2, ..., x_{n−1}, x_n).

9/5/12

Definition 1.1. The space of all ordered n-tuples of real numbers is denoted by R^n. This space can come equipped with two algebraic operations, addition and scalar multiplication. These are defined as follows:

(a) If x = (x_1, ..., x_n) and y = (y_1, ..., y_n) are any two elements of R^n, their sum x + y is defined to be the n-tuple x + y = (x_1 + y_1, ..., x_n + y_n).

(b) If x = (x_1, ..., x_n) is any element of R^n and λ is any real number, the product λx is defined to be the n-tuple λx = (λx_1, ..., λx_n).

We denote the special element (0, ..., 0) by 0.

Sometimes we want to consider equations with complex numbers as coefficients or data, and then we are led to solutions which involve complex numbers.

Definition 1.2. The space of all ordered n-tuples of complex numbers is denoted by C^n. This space can come equipped with two algebraic operations, addition and scalar multiplication. These are defined as follows:

(a) If z = (z_1, ..., z_n) and w = (w_1, ..., w_n) are any two elements of C^n, their sum z + w is defined to be the n-tuple

    z + w = (z_1 + w_1, ..., z_n + w_n).               (1.1.6)

(b) If z = (z_1, ..., z_n) is any element of C^n and λ is any complex number, the product λz is defined to be the n-tuple

    λz = (λz_1, ..., λz_n).                            (1.1.7)

We denote the special element (0, ..., 0) by 0.

The following properties of addition and scalar multiplication in R^n are easy to check:

    If x, y ∈ R^n, then x + y ∈ R^n.
    If x ∈ R^n and λ ∈ R, then λx ∈ R^n.
    If x, y ∈ R^n, then x + y = y + x.
    If x, y, t ∈ R^n, then (x + y) + t = x + (y + t).
    If x ∈ R^n, then 0 + x = x.
    If x ∈ R^n, then x + (−1)x = 0.
    If x ∈ R^n and if λ, µ ∈ R, then (λµ)x = λ(µx).
    If x, y ∈ R^n and if λ ∈ R, then λ(x + y) = λx + λy.
    If x ∈ R^n and if λ, µ ∈ R, then (λ + µ)x = λx + µx.
    If x ∈ R^n, then 1x = x.

Moreover, there is an identical set of properties for the complex version C^n. These properties appear sufficiently often in mathematics to motivate the following abstract definition.

Definition 1.3. A nonempty set V is a real vector space if the following ten axioms are satisfied:

(1) If x, y ∈ V, there is a unique element in V called the sum of x and y; it is denoted by x + y ∈ V.
(2) If x ∈ V and λ ∈ R, there is a unique element in V called the product of λ and x; it is denoted by λx ∈ V.
(3) If x, y ∈ V, then x + y = y + x.
(4) If x, y, t ∈ V, then (x + y) + t = x + (y + t).
(5) If x ∈ V, then 0 + x = x.
(6) If x ∈ V, then x + (−1)x = 0.
(7) If x ∈ V and if λ, µ ∈ R, then (λµ)x = λ(µx).
(8) If x, y ∈ V and if λ ∈ R, then λ(x + y) = λx + λy.
(9) If x ∈ V and if λ, µ ∈ R, then (λ + µ)x = λx + µx.
(10) If x ∈ V, then 1x = x.

A non-empty set V is called a complex vector space if the same ten axioms hold with the collection of scalars R replaced by the (larger) collection of scalars C.

Why do mathematicians make an abstract definition like this? While one can make up any definition one wants, the real reason is that the definition is useful. This usually involves two statements:

(A) There are many examples of real and complex vector spaces in addition to the special cases of R^n and C^n discussed above. When an object appears sufficiently often, it is useful to give it a name.

(B) Many of the results that we want to establish do NOT depend on the particular nature of the set of elements we are studying, but ONLY depend on the properties listed in the axioms. Thus the definition focuses our attention on what is important, and what may be just incidental.
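
As a quick aside (an addition of mine, not part of the original notes), here is a minimal numerical spot-check of a few of the R^n axioms from Definition 1.3, assuming NumPy is available. It tests the identities on randomly chosen vectors and scalars; it is a sanity check, not a proof.

    # A numerical spot-check (not a proof) of several R^n axioms from Definition 1.3.
    import numpy as np

    rng = np.random.default_rng(0)
    n = 4
    x, y, t = rng.standard_normal((3, n))     # three random elements of R^n
    lam, mu = rng.standard_normal(2)          # two random real scalars

    print(np.allclose(x + y, y + x))                      # Axiom (3): commutativity
    print(np.allclose((x + y) + t, x + (y + t)))          # Axiom (4): associativity
    print(np.allclose(x + (-1) * x, np.zeros(n)))         # Axiom (6): additive inverse
    print(np.allclose((lam * mu) * x, lam * (mu * x)))    # Axiom (7)
    print(np.allclose((lam + mu) * x, lam * x + mu * x))  # Axiom (9)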

1.2 Further examples of vector spaces

1.2.1 Vector spaces of functions

9/7/12

One of the most important examples arises as follows. Let S be a non-empty set. A real-valued function on S is then a rule, often denoted by a letter like f or g, which associates to each element s ∈ S a unique real number, denoted by f(s) or g(s). We consider the set V = V_R(S) of all real-valued functions on S. It is very important to note that the elements of V are individual functions or rules; the values of the functions (which are real numbers) are not elements of V.

For example, suppose that S consists of the interval [0, 1] = {t ∈ R : 0 ≤ t ≤ 1}. Let us write down some examples of elements of V_R(S):

    f_1 defined by f_1(t) = 1 + t^2;
    f_2 defined by f_2(t) = sin(t) + e^{5t};
    f_3 defined by f_3(t) = 1 for 0 ≤ t ≤ 1/2, and f_3(t) = 2t for 1/2 ≤ t ≤ 1;
    f_4 defined by f_4(t) = 1 for 0 ≤ t ≤ 1/2, and f_4(t) = 2t − 1 for 1/2 < t ≤ 1;
    f_5 defined by f_5(t) = 1 for t a rational number, and f_5(t) = 2t − 1 for t an irrational number.

In order to make V_R(S) into a vector space, we must satisfy axioms (1) and (2), which means we have to define the sum of two functions, and also the product of a real number and a function. But this is easy to do. If f, g ∈ V_R(S) and if λ ∈ R, let f + g and λf be the real-valued functions defined on S given by

    (f + g)(s) = f(s) + g(s),        (λf)(s) = λ f(s).           (1.2.1)

Lemma 1.4. If addition and scalar multiplication are defined as in equation (1.2.1), then the set V_R(S) of all real-valued functions on the set S is a real vector space.
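
As an aside (not from the notes), the operations in (1.2.1) translate directly into code. The sketch below, assuming plain Python with the standard math module, represents elements of V_R(S) for S = [0, 1] as ordinary callables and builds f + g and λf pointwise.

    # A sketch of the operations (1.2.1): elements of V_R(S) as Python callables.
    import math

    def add(f, g):
        # (f + g)(s) = f(s) + g(s)
        return lambda s: f(s) + g(s)

    def scale(lam, f):
        # (lam f)(s) = lam * f(s)
        return lambda s: lam * f(s)

    def f1(t):
        return 1 + t**2                        # f_1 from the notes

    def f2(t):
        return math.sin(t) + math.exp(5 * t)   # f_2 from the notes

    h = add(f1, scale(3.0, f2))   # the element f_1 + 3 f_2 of V_R([0, 1])
    print(h(0.25))                # its value at the point t = 0.25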

1.2.2 Vector spaces of matrices

An m × n complex matrix is a rectangular array of complex numbers with m rows and n columns. Thus an m × n complex matrix M looks like

    M =
        [ a_{1,1} + ib_{1,1}    a_{1,2} + ib_{1,2}    ...    a_{1,n} + ib_{1,n} ]
        [ a_{2,1} + ib_{2,1}    a_{2,2} + ib_{2,2}    ...    a_{2,n} + ib_{2,n} ]        (1.2.2)
        [         ...                   ...           ...            ...        ]
        [ a_{m,1} + ib_{m,1}    a_{m,2} + ib_{m,2}    ...    a_{m,n} + ib_{m,n} ]

Let M_{m×n}(C) denote the set of all m × n complex matrices. We can make M_{m×n}(C) into a complex vector space by defining addition and scalar multiplication as follows:

(A) Let

    M_1 =
        [ a_{1,1} + ib_{1,1}    ...    a_{1,n} + ib_{1,n} ]
        [ a_{2,1} + ib_{2,1}    ...    a_{2,n} + ib_{2,n} ]
        [         ...           ...            ...        ]
        [ a_{m,1} + ib_{m,1}    ...    a_{m,n} + ib_{m,n} ]

    M_2 =
        [ x_{1,1} + iy_{1,1}    ...    x_{1,n} + iy_{1,n} ]
        [ x_{2,1} + iy_{2,1}    ...    x_{2,n} + iy_{2,n} ]
        [         ...           ...            ...        ]
        [ x_{m,1} + iy_{m,1}    ...    x_{m,n} + iy_{m,n} ]

Then define

    M_1 + M_2 =
        [ (a_{1,1} + x_{1,1}) + i(b_{1,1} + y_{1,1})    ...    (a_{1,n} + x_{1,n}) + i(b_{1,n} + y_{1,n}) ]
        [ (a_{2,1} + x_{2,1}) + i(b_{2,1} + y_{2,1})    ...    (a_{2,n} + x_{2,n}) + i(b_{2,n} + y_{2,n}) ]        (1.2.3)
        [                     ...                       ...                        ...                    ]
        [ (a_{m,1} + x_{m,1}) + i(b_{m,1} + y_{m,1})    ...    (a_{m,n} + x_{m,n}) + i(b_{m,n} + y_{m,n}) ]

(B) Let λ + iµ ∈ C and let

    M =
        [ a_{1,1} + ib_{1,1}    a_{1,2} + ib_{1,2}    ...    a_{1,n} + ib_{1,n} ]
        [ a_{2,1} + ib_{2,1}    a_{2,2} + ib_{2,2}    ...    a_{2,n} + ib_{2,n} ]        (1.2.4)
        [         ...                   ...           ...            ...        ]
        [ a_{m,1} + ib_{m,1}    a_{m,2} + ib_{m,2}    ...    a_{m,n} + ib_{m,n} ]

Then define

    (λ + iµ)M =
        [ (λ + iµ)(a_{1,1} + ib_{1,1})    (λ + iµ)(a_{1,2} + ib_{1,2})    ...    (λ + iµ)(a_{1,n} + ib_{1,n}) ]
        [ (λ + iµ)(a_{2,1} + ib_{2,1})    (λ + iµ)(a_{2,2} + ib_{2,2})    ...    (λ + iµ)(a_{2,n} + ib_{2,n}) ]        (1.2.5)
        [              ...                             ...                ...                 ...             ]
        [ (λ + iµ)(a_{m,1} + ib_{m,1})    (λ + iµ)(a_{m,2} + ib_{m,2})    ...    (λ + iµ)(a_{m,n} + ib_{m,n}) ]

Lemma 1.5. If addition and scalar multiplication are defined as in equations (1.2.3) and (1.2.5), then the set M_{m×n}(C) of all m × n complex matrices is a complex vector space.

1.2.3 Homogeneous solutions

Let us modify the system of linear equations in (1.1.2) and consider the system

    2x + 3y − z = 0,
    x − 4y + 2z = 0,                   (1.2.6)
    3x + 10y − 4z = 0.

Let V denote the set of all 3-tuples of real numbers (x, y, z) which are solutions of the system (1.2.6). Note that this is certainly a subset of R^3, so we can add elements of V and multiply them by real scalars. We make the following important observations:

    The sum of two elements of V is not just an element of R^3, but is actually an element of V again.
    The product of any real number with an element of V is not just an element of R^3, but is actually an element of V again.

Lemma 1.6. The set V of all solutions to the system (1.2.6) is a vector space if addition and scalar multiplication have the usual meaning for elements of R^3.

The key to the observations above is that the terms on the right-hand side of the equations in (1.2.6) are all zero. One can (and should) check that the set of all solutions to the original system (1.1.2) is NOT a vector space.
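
A numerical aside (an addition of mine, assuming NumPy): the sketch below extracts a nonzero solution of the homogeneous system (1.2.6) from the null space of its coefficient matrix, checks the two closure observations above, and then illustrates the final remark by showing that doubling a solution of the inhomogeneous system (1.1.2) is no longer a solution.

    # Closure checks for the homogeneous system (1.2.6), and a failure of closure
    # for the inhomogeneous system (1.1.2).
    import numpy as np

    A = np.array([[2., 3., -1.],
                  [1., -4., 2.],
                  [3., 10., -4.]])

    # A basis for the numerical null space of A: right-singular vectors whose
    # singular value is (essentially) zero.
    _, s, Vt = np.linalg.svd(A)
    u = Vt[s < 1e-10][0]                  # one nonzero solution of the homogeneous system
    v = 2.5 * u                           # another solution (here the null space is a line)

    print(np.allclose(A @ (u + v), 0))    # True: sums of solutions are again solutions
    print(np.allclose(A @ (7.0 * u), 0))  # True: scalar multiples of solutions are again solutions

    # The inhomogeneous system (1.1.2) has right-hand side b != 0.
    b = np.array([4., -1., 9.])
    p = np.linalg.lstsq(A, b, rcond=None)[0]   # one particular solution of (1.1.2)
    print(np.allclose(A @ p, b))               # True: p really solves (1.1.2)
    print(np.allclose(A @ (2 * p), b))         # False: 2p does not, so the solution set is not closed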

1.3 Using the Axioms

(Examples 1 and 2 from the text.) Let V be a vector space.

(1) Prove that there is one and only one identity element in V.

Proof. The existence of an identity element is guaranteed by Axiom (5). The point of this problem is to show uniqueness. Thus suppose there are two elements 0_1, 0_2 ∈ V such that for all x ∈ V it follows that x + 0_1 = x and x + 0_2 = x. We must show that 0_1 = 0_2. We argue as follows. Since x + 0_1 = x for all x ∈ V and since 0_2 ∈ V, it follows that 0_2 + 0_1 = 0_2. Also, since x + 0_2 = x for all x ∈ V and since 0_1 ∈ V, it follows that 0_1 + 0_2 = 0_1. But then

    0_1 = 0_1 + 0_2 = 0_2 + 0_1    (by Axiom (3))    = 0_2,

and so 0_1 = 0_2.

(2) Prove that if x ∈ V there is one and only one inverse for x in V.

Proof. The existence of an inverse is guaranteed by Axiom (6), which asserts that x + (−1)x = 0. The point of this exercise is to show uniqueness. Thus suppose there are two elements y_1 and y_2 so that x + y_1 = 0 and x + y_2 = 0. We must show that y_1 = y_2. We argue as follows. We have

    y_1 = y_1 + 0              (by Axiom (5))
        = y_1 + (x + y_2)      (since we are assuming that 0 = x + y_2)
        = (y_1 + x) + y_2      (by Axiom (4))
        = (x + y_1) + y_2      (by Axiom (3))
        = 0 + y_2              (since we are assuming that 0 = x + y_1)
        = y_2 + 0              (by Axiom (3))
        = y_2                  (by Axiom (5)),

and so y_1 = y_2.

1.4 Subspaces

9/10/12

The example of homogeneous solutions in Subsection 1.2.3 above motivates the following definition.

Definition 1.7. Let V be a vector space. A subspace of V is a subset W ⊂ V such that W is a vector space if the operations of addition and multiplication by a scalar are those that are given in V.

Note that in order to be a vector space, W must be non-empty, and in particular it must contain the identity element 0 of V.

Lemma 1.8. Let V be a vector space, and let W ⊂ V be a non-empty subset. Then W is a subspace if and only if the following two conditions are satisfied:

(a) If x and y are elements of W, then x + y is also an element of W.
(b) If x is an element of W and if λ is a scalar, then λx is again an element of W.

(Note that x + y and λx are automatically elements of V since we are assuming that V is a vector space. The key point is the requirement that they actually belong to the subset W.)

Proof. If W is a vector space, then assertions (a) and (b) follow from Axioms (1) and (2) for vector spaces. Thus the harder part of the proof is the opposite implication: if (a) and (b) are satisfied, then W is a vector space. But if (a) and (b) are satisfied, then Axioms (1) and (2) for vector spaces are satisfied. Then all the other eight axioms follow since they are true for elements of V and hence for elements of W.

As an application, observe that since the sum of two continuous functions is continuous, and since the product of a continuous function by a scalar is continuous, it follows that the set of continuous functions on the interval [0, 1] is a subspace of the set of all functions on [0, 1] (see Subsection 1.2.1), and hence this set is a vector space.

2 Linear Combinations and Linear Span

2.1 Definitions

If V is a vector space and if S = {x_1, ..., x_n} ⊂ V is a finite set of vectors, then a vector y ∈ V is a linear combination of the vectors {x_1, ..., x_n} if there are scalars {α_1, ..., α_n} such that y = α_1 x_1 + α_2 x_2 + ... + α_n x_n. Note, for example, that the vector 0 is always a linear combination of {x_1, ..., x_n} since we can write

    0 = 0x_1 + 0x_2 + ... + 0x_n.                      (2.1.1)

Definition 2.1. Let V be a vector space, and let S = {x_1, ..., x_n} ⊂ V be a finite set of vectors. The span of S, denoted by L(S), is the set of all linear combinations of the elements of S. That is,

    L(S) = {y ∈ V : y = α_1 x_1 + ... + α_n x_n}.

Proposition 2.2. Let V be a vector space, and let S = {x_1, ..., x_n} be a finite set of elements of V. Then the span L(S), the set of all linear combinations of {x_1, ..., x_n}, is a subspace of V.

Proof. Let v_1 = α_1 x_1 + ... + α_n x_n and v_2 = β_1 x_1 + ... + β_n x_n be two linear combinations of the set of elements {x_1, ..., x_n}, and let λ be a scalar. Then the axioms for a vector space show that

    v_1 + v_2 = (α_1 + β_1)x_1 + ... + (α_n + β_n)x_n,
    λv_1 = λα_1 x_1 + ... + λα_n x_n.

(You should verify this!) This shows that L(S) is closed under addition and multiplication by scalars. It now follows from Lemma 1.8 that L(S) is a subspace of V.

9/12/12

2.2 The notion of linear span, and why should anyone be interested?

I would like to convince you that the notion of linear span, defined in Definition 2.1, actually has something to do with solving linear equations. Suppose that a, b, c are three given real numbers, and consider the system of equations

    3x + 2y − z = a,
    x − y + 3z = b,                    (2.2.1)
    3x + 7y − 11z = c.

We ask: for which numbers a, b, and c can we solve this system? For example, let a = 4, b = 3, and c = −1. That is, consider the system

    3x + 2y − z = 4,
    x − y + 3z = 3,
    3x + 7y − 11z = −1.

This system of equations does have a solution. For example, we can take x = 1, y = 1, and z = 1. On the other hand, if we let a = 1, b = 0, and c = 1, I claim that there is no solution. In this case we are considering the system

    3x + 2y − z = 1,
    x − y + 3z = 0,
    3x + 7y − 11z = 1.

Suppose there were a solution (x, y, z) which satisfies all three equations. If we multiply the second equation by 3, we get the system

    3x + 2y − z = 1,
    3x − 3y + 9z = 0,
    3x + 7y − 11z = 1.

Then subtracting the second equation from the first, we get

    5y − 10z = 1,

and subtracting the third equation from the second, we get

    −10y + 20z = −1.

But then it would follow that

    −1 = −10y + 20z = (−2)(5y − 10z) = (−2)(1) = −2,

which is impossible! Thus for some triples of numbers (a, b, c) we can solve the system in (2.2.1), while for others we cannot solve the system. Is there anything useful we can say about when the system can or cannot be solved?

Let us use some of the notation we have introduced. In the vector space R^3, let S be the subset consisting of three vectors as follows:

    S = { (3, 1, 3), (2, −1, 7), (−1, 3, −11) }.

What does it mean that a vector (a, b, c) ∈ R^3 is a linear combination of these three vectors? It means that we can find three scalars x, y, and z so that

    (a, b, c) = x(3, 1, 3) + y(2, −1, 7) + z(−1, 3, −11).

But we can calculate the vector on the right-hand side. It is just

    (3x + 2y − z, x − y + 3z, 3x + 7y − 11z).

Thus to say that (a, b, c) is a linear combination of the three vectors is to say that there are scalars x, y, z so that

    (a, b, c) = (3x + 2y − z, x − y + 3z, 3x + 7y − 11z).

But this just means that

    a = 3x + 2y − z,
    b = x − y + 3z,
    c = 3x + 7y − 11z.

Thus (a, b, c) is a linear combination of the three vectors S = { (3, 1, 3), (2, −1, 7), (−1, 3, −11) } if and only if the system of equations given in (2.2.1) has a solution. In other words, we have rephrased the question of whether we can solve this system of equations into another question about whether a vector is a linear combination of three special vectors. (Note that the three vectors are the columns of the 3 × 3 matrix of coefficients in the system of equations.)

In particular, the set of ALL vectors (a, b, c) for which we can solve the system is the same as the linear span L(S). Now last time we proved that L(S) is a subspace of R^3. Thus we can conclude that, since we cannot solve the system for all triples (a, b, c), the set of triples for which we can solve the system is either a plane passing through the origin, or a line passing through the origin, or only the origin. Of course, the last case cannot be true since we found a non-zero triple for which the system can be solved.

EXERCISE: Decide whether the span L(S) is a line or a plane.
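
As a numerical companion to this discussion (an addition of mine, assuming NumPy), the sketch below tests whether a given (a, b, c) lies in L(S) by checking that appending it as an extra column does not raise the rank of the coefficient matrix; it reproduces the two cases worked out above.

    # Membership in L(S): (a, b, c) is in the span exactly when the system (2.2.1)
    # is solvable, i.e. when rank([A | rhs]) == rank(A).
    import numpy as np

    A = np.array([[3., 2., -1.],     # the columns of A are the three vectors of S
                  [1., -1., 3.],
                  [3., 7., -11.]])

    def in_span(rhs):
        rhs = np.asarray(rhs, dtype=float).reshape(-1, 1)
        return np.linalg.matrix_rank(np.hstack([A, rhs])) == np.linalg.matrix_rank(A)

    print(in_span([4., 3., -1.]))   # True: the notes exhibit the solution x = y = z = 1
    print(in_span([1., 0., 1.]))    # False: matches the contradiction derived above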

2.3 Linear dependence and independence

Let us return to equation (2.1.1), which gives a trivial way of writing 0 as a linear combination. Is it possible to write the vector 0 as a linear combination of {x_1, ..., x_n} in some other way? That is, do there exist scalars {α_1, ..., α_n} which are not all zero such that

    0 = α_1 x_1 + α_2 x_2 + ... + α_n x_n?

Let us look at two examples.

Example 1: In the vector space R^3 let v_1 = (1, 2, 3), let v_2 = (3, 2, 1), and let v_3 = (−1, 2, 5). It is easy to check that

    2v_1 + (−1)v_2 + (−1)v_3 = 0.

Thus there is a non-trivial linear combination of these three vectors which equals zero.

Example 2: In the vector space R^3 let w_1 = (1, 2, 3), let w_2 = (3, 2, 0), and let w_3 = (−1, 0, 0). If α_1, α_2, α_3 ∈ R and if

    α_1 w_1 + α_2 w_2 + α_3 w_3 = (α_1 + 3α_2 − α_3, 2α_1 + 2α_2, 3α_1) = 0,

then

    0 = α_1 + 3α_2 − α_3,
    0 = 2α_1 + 2α_2,
    0 = 3α_1.

Solving these equations, it follows that α_1 = α_2 = α_3 = 0. Thus in this case, there is NO non-trivial linear combination which gives the vector 0.

We give a name to these two kinds of phenomena.

Definition 2.3. Let V be a vector space. A set S of vectors in V is linearly dependent if there is a finite set of distinct elements {x_1, ..., x_n} ⊂ S and scalars {α_1, ..., α_n}, at least one of which is non-zero, such that

    α_1 x_1 + ... + α_n x_n = 0.

A set of vectors S ⊂ V is linearly independent if it is not linearly dependent.

Thus Example 1 above shows that the set of vectors S = {(1, 2, 3), (3, 2, 1), (−1, 2, 5)} ⊂ R^3 is linearly dependent, while Example 2 shows that the set of vectors S = {(1, 2, 3), (3, 2, 0), (−1, 0, 0)} ⊂ R^3 is linearly independent.
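
As a quick aside (not from the notes, and assuming NumPy), these two examples can be checked mechanically: a set of vectors in R^3 is linearly independent exactly when the matrix having them as columns has full column rank.

    # Rank check for Examples 1 and 2.
    import numpy as np

    ex1 = np.column_stack([(1., 2., 3.), (3., 2., 1.), (-1., 2., 5.)])   # Example 1
    ex2 = np.column_stack([(1., 2., 3.), (3., 2., 0.), (-1., 0., 0.)])   # Example 2

    print(np.linalg.matrix_rank(ex1) == 3)   # False: Example 1 is linearly dependent
    print(np.linalg.matrix_rank(ex2) == 3)   # True: Example 2 is linearly independent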

The following is perhaps the most important result about linear independence.

Theorem 2.4. Let V be a vector space, and let S = {x_1, ..., x_k} ⊂ V be a linearly independent set of vectors. Let L(S) denote the linear span of this set. Then every set of k + 1 elements of L(S) is a linearly dependent set.

Proof. We argue by mathematical induction on the number k of elements in the set S. We first consider the case when k = 1. Thus S = {x} is a linearly independent set of elements. This means that x ≠ 0 (why?). Let y_1 and y_2 be any two elements of L(S). Then y_1 = α_1 x and y_2 = α_2 x. If both α_1 = α_2 = 0, then y_1 = y_2 = 0, and these vectors are linearly dependent. Thus suppose at least one of {α_1, α_2} is non-zero. Then

    α_2 y_1 − α_1 y_2 = α_2 α_1 x − α_1 α_2 x = 0,

and this gives a non-trivial linear combination equaling zero. Thus {y_1, y_2} is linearly dependent. This completes the case when k = 1.

By mathematical induction, we now assume that whenever we have a set S′ of k − 1 elements which are linearly independent, then any set of k elements in L(S′) is linearly dependent. We want to show that the result is also true for a set S = {x_1, ..., x_k} of linearly independent elements.

9/17/12

Let {y_1, ..., y_{k+1}} be any collection of k + 1 elements in L(S). Since each y_j is a linear combination of {x_1, ..., x_k}, there are scalars α_{j,l} (1 ≤ j ≤ k + 1, 1 ≤ l ≤ k) so that

    y_1 = α_{1,1} x_1 + α_{1,2} x_2 + ... + α_{1,k−1} x_{k−1} + α_{1,k} x_k,
    y_2 = α_{2,1} x_1 + α_{2,2} x_2 + ... + α_{2,k−1} x_{k−1} + α_{2,k} x_k,
    ...
    y_k = α_{k,1} x_1 + α_{k,2} x_2 + ... + α_{k,k−1} x_{k−1} + α_{k,k} x_k,
    y_{k+1} = α_{k+1,1} x_1 + α_{k+1,2} x_2 + ... + α_{k+1,k−1} x_{k−1} + α_{k+1,k} x_k.

There are now two possibilities. Either α_{1,k} = α_{2,k} = ... = α_{k,k} = α_{k+1,k} = 0, or at least one of these coefficients is not zero.

If α_{1,k} = α_{2,k} = ... = α_{k,k} = α_{k+1,k} = 0, then looking only at the first k equations, we have written {y_1, ..., y_k} as linear combinations of {x_1, ..., x_{k−1}}. By our induction hypothesis, the vectors {y_1, ..., y_k} are linearly dependent, and then the same is true of the vectors {y_1, ..., y_{k+1}}.

Now suppose that at least one of these coefficients is non-zero. After relabeling the indices, let us assume that α_{k+1,k} ≠ 0. Then we can do some algebra (i.e. we subtract a multiple of the (k + 1)-st equation from each of the first k equations) to get

    y_1 − (α_{1,k}/α_{k+1,k}) y_{k+1} = (α_{1,1} − (α_{1,k}/α_{k+1,k}) α_{k+1,1}) x_1 + ... + (α_{1,k−1} − (α_{1,k}/α_{k+1,k}) α_{k+1,k−1}) x_{k−1},
    y_2 − (α_{2,k}/α_{k+1,k}) y_{k+1} = (α_{2,1} − (α_{2,k}/α_{k+1,k}) α_{k+1,1}) x_1 + ... + (α_{2,k−1} − (α_{2,k}/α_{k+1,k}) α_{k+1,k−1}) x_{k−1},
    ...
    y_k − (α_{k,k}/α_{k+1,k}) y_{k+1} = (α_{k,1} − (α_{k,k}/α_{k+1,k}) α_{k+1,1}) x_1 + ... + (α_{k,k−1} − (α_{k,k}/α_{k+1,k}) α_{k+1,k−1}) x_{k−1}.

We now have k elements, each a linear combination of the (k − 1) linearly independent elements {x_1, ..., x_{k−1}}. It follows from the induction hypothesis that these k elements are linearly dependent. Thus there are scalars {λ_1, ..., λ_k}, not all zero, such that

    λ_1 (y_1 − (α_{1,k}/α_{k+1,k}) y_{k+1}) + ... + λ_k (y_k − (α_{k,k}/α_{k+1,k}) y_{k+1}) = 0.

But this says that

    Σ_{j=1}^{k} λ_j y_j − ( Σ_{j=1}^{k} λ_j α_{j,k}/α_{k+1,k} ) y_{k+1} = 0,

which shows that the vectors {y_1, ..., y_k, y_{k+1}} are also linearly dependent. This completes the proof.

9/19/12

3 Bases and dimension

Definition 3.1. Let V be a vector space and let S ⊂ V be a set of elements of V. Then S is a basis for V if and only if:

(a) The linear span L(S) is equal to V. That is, for every v ∈ V there are elements x_1, ..., x_k in S and scalars α_1, ..., α_k so that v = α_1 x_1 + ... + α_k x_k.

(b) The elements of the set S are linearly independent. That is, if {y_1, ..., y_m} are any distinct elements of S and if {β_1, ..., β_m} are scalars such that β_1 y_1 + ... + β_m y_m = 0, then β_1 = ... = β_m = 0.

If a vector space V has a basis consisting of a finite number of elements, or consists only of the vector 0, the space is said to be finite dimensional.

Theorem 3.2. Let V be a finite dimensional vector space. Then every basis for V has the same number of elements. This number is called the dimension of the vector space V.

Proof. Let S be a basis for V which consists of a finite number, say n, of elements s_1, ..., s_n. Let T be any other basis for V. Suppose T contains more than n elements. Then we can find n + 1 elements t_1, ..., t_{n+1} in T. Since S is a basis, L(S) = V. But then we have n + 1 elements t_1, ..., t_{n+1} in L(S), and by Theorem 2.4, the elements {t_1, ..., t_{n+1}} must be linearly dependent. But this contradicts the hypothesis that the elements of a basis are linearly independent. It follows that T contains at most n elements.

Next, suppose that T consists of strictly fewer than n elements, say T = {t_1, ..., t_k} with k < n. Then since T is a basis, L(T) = V. But then we have n elements s_1, ..., s_n in L(T), and by Theorem 2.4, these elements must be linearly dependent. This contradicts the same hypothesis. Thus T consists of exactly n elements, and this completes the proof.

Examples:

I) The set S = {(1, 0), (0, 1)} ⊂ R^2 is a basis for R^2.

II) The set S = {(1, 1), (1, −1)} ⊂ C^2 is a basis for C^2.

III) The vector space R^n is an n-dimensional real vector space, and the vector space C^n is an n-dimensional complex vector space.

IV) The set S = {1, t, t^2, ..., t^n} is a basis for the vector space of polynomials of degree less than or equal to n.

V) The space of all polynomials does not have a finite basis.

9/21/12

Theorem 3.3. Let V be a finite dimensional vector space with dimension n. Then:

(a) Any set of linearly independent elements in V has at most n elements in it.

(b) If S = {v_1, ..., v_k} ⊂ V are linearly independent vectors and v_{k+1} ∈ V, then v_{k+1} ∉ L(S) if and only if the vectors {v_1, v_2, ..., v_k, v_{k+1}} are also linearly independent.

(c) Any set S of n linearly independent elements in V is a basis for V.

(d) Any set S of n elements in V which span V is a basis for V.

(e) If S ⊂ V contains a non-zero element, then we can find a subset S′ ⊂ S consisting of linearly independent elements such that L(S) = L(S′).

Proof. Since V has dimension n, there is a basis B = {x_1, ..., x_n} for V consisting of n elements. This means that L(B) = V and the n vectors {x_1, ..., x_n} are linearly independent.

Proof of (a). Let S = {v_1, ..., v_k} be any set of linearly independent elements in V. Since L(B) = V, we have {v_1, ..., v_k} ⊂ L(B). Then by Theorem 2.4, since {v_1, ..., v_k} are linearly independent, we must have k ≤ n. This establishes statement (a).

Proof of (b). Let S = {v_1, ..., v_k} be any set of k linearly independent elements in V. If v_{k+1} ∈ L(S), then we can write v_{k+1} = λ_1 v_1 + ... + λ_k v_k, and hence

    λ_1 v_1 + ... + λ_k v_k + (−1)v_{k+1} = 0.

This shows that the vectors {v_1, ..., v_{k+1}} are not linearly independent. Next let v_{k+1} ∉ L(S). Suppose we have an equation

    α_1 v_1 + ... + α_k v_k + α_{k+1} v_{k+1} = 0.               (3.0.1)

We first show that α_{k+1} = 0, using an argument by contradiction. If α_{k+1} ≠ 0 we could write

    v_{k+1} = −(α_1/α_{k+1}) v_1 − ... − (α_k/α_{k+1}) v_k.

But this would mean that v_{k+1} ∈ L(S), which cannot be the case. Thus α_{k+1} = 0. But then we have the equation

    α_1 v_1 + ... + α_k v_k = 0.

Since the vectors {v_1, ..., v_k} are assumed to be linearly independent, it follows that α_1 = α_2 = ... = α_k = 0. Thus for any equation of the form (3.0.1), it follows that all the coefficients are zero, and this says that the vectors {v_1, ..., v_k, v_{k+1}} are linearly independent. This establishes (b).

Proof of (c). Let S = {v_1, ..., v_n} be any set of n linearly independent elements in V. To show that S is a basis, we only need to show that L(S) = V. We argue by contradiction. If L(S) ≠ V, then there is a vector v_{n+1} in V such that v_{n+1} ∉ L(S). But by assertion (b) of the Theorem, the vectors {v_1, ..., v_n, v_{n+1}} are linearly independent. However, this contradicts assertion (a) of the Theorem, which we have already established. Thus the hypothesis that L(S) ≠ V leads to a contradiction, and so we must have L(S) = V. Since the elements of S are assumed to be linearly independent, it follows that S is a basis. This establishes (c).

Proof of (d). We need to show that the elements of S are linearly independent. If they are not, then by assertion (e) below (whose proof does not depend on assertion (d)), we can find a subset S′ ⊂ S of elements which are linearly independent, and which also span V. But this would mean that we can find a basis for V with strictly fewer than n elements, which is a contradiction. This establishes (d).

Proof of (e). Let v_1 ∈ S be a non-zero element. Then {v_1} is a linearly independent subset of S. We are done if L(v_1) = L(S). If L(v_1) ≠ L(S), choose v_2 ∈ S such that v_2 is not in L(v_1). [How do we know we can do this?] Then {v_1, v_2} is a set of linearly independent vectors. We are done if L(v_1, v_2) = L(S). If not, we repeat the process. This must stop after n times since V is n-dimensional. This completes the proof.

Example: Consider the system of equations

    3x + 2y − z = a,
    2x − 5y + z = b,
    7x + 30y − 9z = c.

(a) Find a basis for the set of vectors (x, y, z) ∈ R^3 which are solutions to this system when a = b = c = 0.

(b) Find a basis for the set of vectors (a, b, c) ∈ R^3 for which this system has a solution.

Solution for (a): We need to find the solutions to the system of homogeneous equations

    3x + 2y − z = 0,
    2x − 5y + z = 0,
    7x + 30y − 9z = 0.

If we add the first two equations we get

    5x − 3y = 0,

and if we multiply the first equation by 9 and subtract it from the third equation, we get

    −20x + 12y = 0.

This second equation is a multiple of the first, and so the solution to the two of them is any pair (x, y) with y = (5/3)x. Once we have x and y we can use any of the equations to solve for z, and we get z = 3x + 2y = 3x + (10/3)x = (19/3)x. Thus we see that the solutions to the system of homogeneous equations are exactly the vectors of the form

    v = (x, (5/3)x, (19/3)x) = (x/3)(3, 5, 19).

Thus the set of solutions to the homogeneous equations is spanned by the single vector (3, 5, 19), and the set of solutions is 1-dimensional.

Solution for (b): We can rewrite this system as a single vector equation

    x(3, 2, 7) + y(2, −5, 30) + z(−1, 1, −9) = (a, b, c).

Thus the set of vectors (a, b, c) ∈ R^3 for which this system has a solution is exactly the linear span of the set of three vectors S = {(3, 2, 7), (2, −5, 30), (−1, 1, −9)}. We need to find a basis for L(S). If the three vectors were linearly independent, they would be a basis. Let us check. Suppose

    α(3, 2, 7) + β(2, −5, 30) + γ(−1, 1, −9) = (0, 0, 0).

This is just the system

    3α + 2β − γ = 0,
    2α − 5β + γ = 0,
    7α + 30β − 9γ = 0.

We have already seen that this system has a non-zero solution. Thus these three vectors are linearly dependent. Let us look at the first two and ask if they are linearly independent. That is, suppose

    α(3, 2, 7) + β(2, −5, 30) = (0, 0, 0).

This is just the system

    3α + 2β = 0,
    2α − 5β = 0,
    7α + 30β = 0.

This time it is easy to see that the only solution is α = β = 0. Thus the two vectors {(3, 2, 7), (2, −5, 30)} are linearly independent, and hence form a basis for the set of vectors (a, b, c) for which the inhomogeneous system has a solution. This vector space has dimension 2.
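
To close with a numerical cross-check (an addition of mine, assuming NumPy), the sketch below confirms the two dimensions found in this example: the homogeneous solution set is the line spanned by (3, 5, 19), and the span of the columns, that is, the set of attainable right-hand sides (a, b, c), has dimension 2.

    # Cross-check of the final example: dimensions of the solution space and of L(S).
    import numpy as np

    A = np.array([[3., 2., -1.],
                  [2., -5., 1.],
                  [7., 30., -9.]])

    print(np.allclose(A @ np.array([3., 5., 19.]), 0))   # True: (3, 5, 19) solves the homogeneous system
    r = np.linalg.matrix_rank(A)
    print(r)        # 2: dimension of the span of the columns, i.e. of the attainable (a, b, c)
    print(3 - r)    # 1: dimension of the space of homogeneous solutions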