Linear Independence
Reading: Lay 1.7


September 11, 2013

In this section, we discuss the concepts of linear dependence and independence. I am going to introduce the definitions, then work some examples and try to flesh out the meaning and implications of the definitions. These definitions might seem strange at first, and you might not understand why we care about dependence or independence (Lay does not do a good job explaining why). So I have included an appendix to these notes, Appendix A, which describes one reason why we care about these things.

1 Definitions of dependence and independence

Definition 1.1. An indexed set of vectors {v_1, ..., v_p} is called linearly independent if the vector equation

    x_1 v_1 + ... + x_p v_p = 0

has only the trivial solution x_1 = x_2 = ... = x_p = 0.

Definition 1.2. An indexed set of vectors {u_1, ..., u_p} is called linearly dependent if the vector equation

    x_1 u_1 + ... + x_p u_p = 0

holds for some weights x_1, x_2, ..., x_p which are not all zero.

The first thing to note is that a given set of vectors is either linearly independent or linearly dependent; it cannot be both or neither.
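As an aside (my addition, not something from Lay), these definitions are easy to test by machine. Here is a minimal sketch using Python's sympy library, assuming sympy is installed; the function name is mine. Stack the vectors as the columns of a matrix A. Then x_1 v_1 + ... + x_p v_p = 0 is exactly the matrix equation Ax = 0, which has only the trivial solution precisely when A has a pivot in every column, i.e., when rank(A) = p.

    from sympy import Matrix

    def is_linearly_independent(vectors):
        """Return True if the given column vectors are linearly independent."""
        # x_1 v_1 + ... + x_p v_p = 0 is Ax = 0, where A has the vectors
        # as its columns; the solution is unique (hence trivial) exactly
        # when every column of A is a pivot column.
        A = Matrix.hstack(*vectors)
        return A.rank() == len(vectors)

    print(is_linearly_independent([Matrix([1, 0]), Matrix([0, 1])]))  # True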

Example 1.3. Consider the vectors

    v_1 = (1, 1, 1),    v_2 = (0, 0, 1),    v_3 = (6, 6, -1).

Is the set {v_1, v_2, v_3} linearly dependent or linearly independent? The way to check is to look at the equation

    x_1 v_1 + x_2 v_2 + x_3 v_3 = 0.    (1)

If this equation has a solution other than x_1 = x_2 = x_3 = 0, then they are linearly dependent; otherwise, they are linearly independent. So write the augmented matrix for (1):

    [ 1  0   6 | 0 ]
    [ 1  0   6 | 0 ]    (2)
    [ 1  1  -1 | 0 ]

We start on the computation of the RREF in the usual way. Add -1 times the top row to the second row:

    [ 1  0   6 | 0 ]
    [ 0  0   0 | 0 ]    (3)
    [ 1  1  -1 | 0 ]

We can actually answer the question just by looking at the matrix (3); we do not need to finish computing the RREF. Since there are only two nonzero rows in (3), there can be at most two pivot positions in the RREF. But there are three variables x_1, x_2, x_3, so this means that x_3 is a free variable. This implies that there is a nontrivial solution, so the vectors are linearly dependent.

The last paragraph was somewhat advanced, so in case you did not follow, we will finish computing the RREF. Take the matrix (3), add -1 times the first row to the third row, and then swap the second and third rows:

    [ 1  0   6 | 0 ]
    [ 0  1  -7 | 0 ]
    [ 0  0   0 | 0 ]

This is the RREF of (2). It corresponds to the system of equations

    x_1 + 6 x_3 = 0
    x_2 - 7 x_3 = 0.

There is a free variable x_3, so there exists a solution to (1) which is not the trivial solution (for instance, take x_1 = -6, x_2 = 7, x_3 = 1). Therefore, the set {v_1, v_2, v_3} is linearly dependent.
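If you would like to double-check this computation, here is a short sympy verification (my addition, not part of the original notes). The rref() method returns the reduced matrix together with the indices of the pivot columns.

    from sympy import Matrix

    v1, v2, v3 = Matrix([1, 1, 1]), Matrix([0, 0, 1]), Matrix([6, 6, -1])
    A = Matrix.hstack(v1, v2, v3)   # coefficient matrix of equation (1)
    print(A.rref())
    # (Matrix([[1, 0, 6], [0, 1, -7], [0, 0, 0]]), (0, 1))
    print(-6*v1 + 7*v2 + v3)        # the nontrivial solution found above
    # Matrix([[0], [0], [0]])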

Example 1.4. Is the set {u_1, u_2, u_3} linearly independent or dependent, if

    u_1 = (1, 0, 0),    u_2 = (1, 2, 2),    u_3 = (1, 1, 1)?

As before, we write the augmented matrix

    [ 1  1  1 | 0 ]
    [ 0  2  1 | 0 ]    (4)
    [ 0  2  1 | 0 ]

The first pivot position is the upper-left entry, and the column below it is already all zeros. So we move to the second pivot position and begin emptying the column underneath:

    [ 1  1  1 | 0 ]
    [ 0  2  1 | 0 ]
    [ 0  0  0 | 0 ]

Again we have only two pivots, so there is a nontrivial solution. Thus {u_1, u_2, u_3} is linearly dependent.

Example 1.5. What about the vectors

    v_1 = (1, 2),    v_2 = (1, 0);

is the set {v_1, v_2} linearly dependent? Write the augmented matrix

    [ 1  1 | 0 ]
    [ 2  0 | 0 ]

Computing the RREF gives:

    [ 1  0 | 0 ]
    [ 0  1 | 0 ]

There are no free variables, so the only solution to the equation x_1 v_1 + x_2 v_2 = 0 is x_1 = x_2 = 0. Thus, the set {v_1, v_2} is linearly independent.
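Both of these examples can be confirmed the same way; here is a quick sympy check (again my addition), counting pivots via the rank:

    from sympy import Matrix

    # Example 1.4: only two pivot columns for three vectors.
    U = Matrix([[1, 1, 1], [0, 2, 1], [0, 2, 1]])
    print(U.rank())  # 2 < 3, so {u_1, u_2, u_3} is linearly dependent

    # Example 1.5: a pivot in every column.
    V = Matrix([[1, 1], [2, 0]])
    print(V.rank())  # 2 == 2, so {v_1, v_2} is linearly independent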

1.1 Homogeneous equations

One small point to note is the connection of linear independence to the homogeneous equation Ax = 0. Suppose the matrix A = [a_1 ... a_n]. Then the solutions of Ax = 0 are those vectors x = (x_1, ..., x_n) such that

    x_1 a_1 + ... + x_n a_n = 0.

So the homogeneous equation has a nontrivial solution (that is, a solution x ≠ 0) if and only if the columns of A are linearly dependent.
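In sympy this connection is visible directly: nullspace() returns a basis for the solution set of Ax = 0, so the returned list is nonempty exactly when the columns of A are linearly dependent. A small sketch (my addition), reusing the matrix from Example 1.3:

    from sympy import Matrix

    A = Matrix([[1, 0, 6], [1, 0, 6], [1, 1, -1]])  # columns v_1, v_2, v_3
    print(A.nullspace())
    # [Matrix([[-6], [7], [1]])] -- a nontrivial solution of Ax = 0,
    # so the columns of A are linearly dependent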

2 Sets of one or two vectors

We will talk now about what linear dependence and independence look like.

2.1 One vector

First, we consider sets of one vector, like {v}. Such a set is linearly dependent if and only if v = 0. This is because, writing v = (v_1, ..., v_n), the equation x_1 v = 0 is equivalent to the linear system

    x_1 v_1 = 0
    x_1 v_2 = 0
    ...
    x_1 v_n = 0    (5)

If v_i ≠ 0 for some i, then the system (5) is only solved by x_1 = 0.

2.2 Two vectors

Suppose we have a set of vectors {v_1, v_2}, and suppose that this set is linearly dependent. Then there exist x_1, x_2 (not both zero) such that

    x_1 v_1 + x_2 v_2 = 0,

or equivalently

    x_1 v_1 = -x_2 v_2.

Thus, if a set of two vectors is linearly dependent, then one is a multiple of the other (if, say, x_1 ≠ 0, then v_1 = (-x_2/x_1) v_2).

[Figure 1: A pair of sets of two vectors in R^2. The set {u_1, u_2} is linearly independent. The set {v_1, v_2} is linearly dependent; indeed, v_2 = 1.5 v_1.]

What about the converse: is a set {v_1, v_2}, where one vector is a multiple of the other, linearly dependent? It is not hard to see that the answer is yes. Thus, we get the following characterization of linear dependence: a set {v_1, v_2} is linearly dependent if and only if one of the vectors is a multiple of the other.
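For two vectors, this criterion is easy to test numerically. The vectors below are hypothetical stand-ins for the dependent pair drawn in Figure 1 (the figure itself does not survive in these notes); any pair with v_2 = 1.5 v_1 behaves the same way:

    from sympy import Matrix, Rational

    v1 = Matrix([2, 1])                  # hypothetical choice
    v2 = Rational(3, 2) * v1             # v_2 = 1.5 v_1, a multiple of v_1
    print(Matrix.hstack(v1, v2).rank())  # 1 < 2, so {v_1, v_2} is dependent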

3 More than two vectors

What about bigger sets of vectors, like {v_1, ..., v_p}? Something like what we just proved for the case of two vectors is true, but just slightly more complicated.

Theorem 3.1. The set S = {v_1, ..., v_p} is linearly dependent if and only if at least one of the vectors in S is a linear combination of the others.

Note that the theorem does not say that every vector is a linear combination of the others! All that the theorem guarantees is the existence of some v_j which can be written as a linear combination of v_1, ..., v_{j-1}, v_{j+1}, ..., v_p.

Lay proves Theorem 3.1 in the section, and I highly recommend you read the proof. It will help you develop a sense of what linear dependence actually means, which will help you work the problems.

When we consider subsets of R^3, there are some nice pictures associated with linear independence and dependence. One is the following: say we have three vectors v_1, v_2, v_3 in R^3, and suppose the set {v_1, v_2} is linearly independent. Is the set {v_1, v_2, v_3} linearly dependent? If it is linearly dependent, then it is possible to write

    v_3 = c_1 v_1 + c_2 v_2

for some c_1, c_2 (why?). That means that

    v_3 is in span{v_1, v_2}.    (6)

But since {v_1, v_2} is linearly independent, span{v_1, v_2} is a plane (see Lay 1.3, the subsection called "A geometric description of span{v} and span{u, v}"). Thus, (6) implies that v_3 is in the plane spanned by v_1 and v_2.

There are two nice theorems that let us tell that certain sets of vectors are linearly dependent without doing too much work. We will not prove them (the proofs are in Lay), but will give concrete examples which convey the idea of each proof. You can think about how to generalize these into a true proof, or read the proofs in Lay.

Theorem 3.2. If a set of vectors S contains the zero vector, then S is linearly dependent.

Example 3.3. Suppose

    v_1 = (5, 3, 2),    v_2 = (0, 0, 0).

Is the set {v_1, v_2} linearly dependent?

Recall that linear dependence is equivalent to the existence of weights x_1, x_2, not both equal to zero, such that x_1 v_1 + x_2 v_2 = 0. We can take x_1 = 0 and x_2 = 1:

    x_1 v_1 + x_2 v_2 = 0 v_1 + 1 v_2 = 0 + 0 = 0.

So the set is linearly dependent, just as we said in Theorem 3.2. This sort of strategy works with any set of vectors containing 0, which proves the theorem.

The second theorem:

Theorem 3.4. If a set contains more vectors than there are components in each vector, then the set is linearly dependent. That is, if S = {v_1, v_2, ..., v_p} is a set of vectors in R^n and p > n, then S is linearly dependent.

Again, rather than proving this in full detail, we will sketch the idea with an example.

Example 3.5. Say we have vectors

    a_1 = (-2, 2),    a_2 = (1, 1),    a_3 = (1, 0).

Is the set {a_1, a_2, a_3} linearly dependent? Write the augmented matrix

    [ -2  1  1 | 0 ]
    [  2  1  0 | 0 ]

Since there are only two rows, there will be at most two pivot columns. However, there are three variables x_1, x_2, x_3. Thus, the variable x_3 will be free, so there will be a nontrivial solution. In case you don't believe me, you can check that the RREF is

    [ 1  0  -1/4 | 0 ]
    [ 0  1   1/2 | 0 ]

so a_1 - 2 a_2 + 4 a_3 = 0.
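Here too the claim is easy to verify in sympy (my addition): with more columns than rows, the rank can never reach the number of columns, so a free variable is forced.

    from sympy import Matrix

    a1, a2, a3 = Matrix([-2, 2]), Matrix([1, 1]), Matrix([1, 0])
    A = Matrix.hstack(a1, a2, a3)
    print(A.rref())
    # (Matrix([[1, 0, -1/4], [0, 1, 1/2]]), (0, 1))
    print(a1 - 2*a2 + 4*a3)  # Matrix([[0], [0]]), the dependence relation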

A Appendix: Why do we care about linear dependence?

In this section, we try to describe exactly why you would care about linear dependence and independence. If you feel like the definitions make sense to you and you see why they would be useful, then this section is not so necessary for you.

Say we have a set of three vectors {w_1, w_2, w_3}. Suppose though that

    w_1 + 4 w_2 = w_3,

so you can write the vector w_3 as a linear combination of w_1 and w_2. Remember that span{w_1, w_2, w_3} is the set of all vectors that can be written as a linear combination of the vectors w_1, w_2, w_3. So suppose y is a vector in span{w_1, w_2, w_3}:

    y = c_1 w_1 + c_2 w_2 + c_3 w_3.

Remember though that w_1 + 4 w_2 = w_3, so we can rewrite the above as

    y = c_1 w_1 + c_2 w_2 + c_3 (w_1 + 4 w_2) = (c_1 + c_3) w_1 + (c_2 + 4 c_3) w_2.

So

    y is in span{w_1, w_2}.    (7)

On the other hand, suppose we have some arbitrary vector z in span{w_1, w_2}. Then z is in span{w_1, w_2, w_3} automatically, because adding vectors to a set only makes the span larger. What this and the reasoning leading up to (7) imply is that

    span{w_1, w_2} = span{w_1, w_2, w_3}.

Let's summarize what we have learned here. If we have a set of vectors {v_1, ..., v_p} which is linearly dependent, then we can remove a vector from this set without changing the span. This is a good reason to care about linear dependence. Later in the course, we will see that facts like the ones we discussed in this appendix have some very useful consequences.
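This conclusion can also be checked on a concrete example. The vectors below are hypothetical (the appendix never fixes particular w's); the point is only that once w_3 = w_1 + 4 w_2, adding w_3 does not enlarge the span. Since span{w_1, w_2} sits inside span{w_1, w_2, w_3}, equal ranks force the two spans to be equal:

    from sympy import Matrix

    w1, w2 = Matrix([1, 0, 2]), Matrix([0, 1, 1])   # hypothetical choice
    w3 = w1 + 4*w2                                  # the dependence from above
    rank_two = Matrix.hstack(w1, w2).rank()
    rank_three = Matrix.hstack(w1, w2, w3).rank()
    print(rank_two, rank_three)  # 2 2 -- same rank, hence the same span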