Lecture 4: Linear independence, span, and bases (1)


Travis Schedler
Tue, Sep 20, 2011 (version: Wed, Sep 21, 6:30 PM)

Goals (2)

- Understand linear independence and examples.
- Understand span and examples.
- Understand bases and examples.
- Preview dimension.
- Prove the main theorem on linear independence and span, for the finite-dimensional situation.

Warm-up exercise 1 (3)

What are all of the subspaces of R^2? Try to define them also using set notation; e.g., {(a, b) ∈ R^2 : ab = 1} or {(a, 1/a) : a ∈ R, a ≠ 0} would describe the hyperbola xy = 1.

Answer: The subspaces are {0} (otherwise denoted by 0), R^2 itself, and all of the lines through the origin,

    L_{λ,µ} = {(a, b) ∈ R^2 : µa + λb = 0} = {(λt, -µt) : t ∈ R},

where λ and µ are not both zero. (Note that L_{λ,µ} = L_{cλ,cµ} for nonzero c ∈ R.)

Warm-up exercise 2 (4)

True or false? If true, explain why; if false, give a counterexample: Suppose V = V_1 + V_2 + V_3. Then V = V_1 ⊕ V_2 ⊕ V_3 if and only if 0 = V_1 ∩ V_2 = V_1 ∩ V_3 = V_2 ∩ V_3.

Answer: False. One counterexample is V = R^2, with V_1 = the x-axis, V_2 = the y-axis, and V_3 = the line x = y. Then V = R^2 = V_1 + V_2 + V_3 and V_i ∩ V_j = 0 for all i ≠ j, yet the sum is not direct: for instance (1, 1) = (1, 0) + (0, 1) + (0, 0) = (0, 0) + (0, 0) + (1, 1) gives two different decompositions.
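
As a quick numerical aside (not from the original slides), here is a minimal Python/NumPy sketch of that counterexample: it checks that each pair of the three lines meets only at the origin, and exhibits two different ways of writing the same vector as a sum of elements of V_1, V_2, V_3. The specific vectors are just the ones named above.

    import numpy as np

    # Direction vectors of the three lines V1 (x-axis), V2 (y-axis), V3 (x = y).
    d1, d2, d3 = np.array([1.0, 0.0]), np.array([0.0, 1.0]), np.array([1.0, 1.0])

    # Pairwise intersections are {0}: two non-parallel lines through the origin
    # meet only at 0, checked via the rank of the 2x2 matrix of directions.
    for a, b in [(d1, d2), (d1, d3), (d2, d3)]:
        assert np.linalg.matrix_rank(np.column_stack([a, b])) == 2

    # The same vector (1, 1) decomposes in two different ways as v1 + v2 + v3,
    # so V1 + V2 + V3 is NOT direct even though pairwise intersections are {0}.
    v = np.array([1.0, 1.0])
    decomposition_A = (1.0 * d1, 1.0 * d2, 0.0 * d3)   # (1,0) + (0,1) + (0,0)
    decomposition_B = (0.0 * d1, 0.0 * d2, 1.0 * d3)   # (0,0) + (0,0) + (1,1)
    assert np.allclose(sum(decomposition_A), v) and np.allclose(sum(decomposition_B), v)
    print("Two distinct decompositions of", v, "-> the sum is not direct.")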

Linear (in)dependence (5)

Recall from the book:

Definition 1. A list (v_1, ..., v_m) is linearly independent if the only choice of a_1, ..., a_m ∈ F such that

    a_1 v_1 + ... + a_m v_m = 0    (0.1)

is a_1 = ... = a_m = 0. The list is linearly dependent if there exist a_1, ..., a_m ∈ F, not all zero, such that (0.1) holds.

Examples (6)

- Any sublist of a linearly independent list is still linearly independent.
- For any nonzero vector v ∈ V, the list (v), containing only the vector v, is linearly independent.
- Any list containing zero is linearly dependent.
- A list (v, w) of two elements is linearly independent if and only if v ≠ 0 and w is not a scalar multiple of v.
- The case of R^3: The list ((1, 0, 0), (0, 1, 0), (0, 0, 1)) is linearly independent. The list ((1, 1, 2), (2, 2, 4)) is linearly dependent. The list ((1, 0, 0), (0, 1, 1), (1, 2, 2)) is linearly dependent. In general, (u, v, w) is linearly independent if and only if u, v, w do not all lie in the same plane through the origin. Any list of length four or more in R^3 turns out to be linearly dependent (why?). A numerical check of these examples appears after the span definition below.

Span (7)

Definition 2. The span of a list (v_1, ..., v_m) of vectors in V is the vector subspace

    Span(v_1, ..., v_m) := {a_1 v_1 + ... + a_m v_m : a_1, ..., a_m ∈ F}.    (0.2)

Given a vector space V, a list (v_1, ..., v_m) of elements of V is a spanning list for V if V = Span(v_1, ..., v_m).

Note that, in a definition, "if" really means "if and only if", since we are making a definition.
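
As an aside (not in the original notes), the R^3 examples above can be checked numerically: a list of vectors in R^n is linearly independent exactly when the matrix whose columns are those vectors has rank equal to the length of the list. A minimal Python/NumPy sketch:

    import numpy as np

    def is_independent(*vectors):
        """A list is linearly independent iff the matrix with these columns
        has rank equal to the number of vectors (computed numerically)."""
        A = np.column_stack(vectors)
        return np.linalg.matrix_rank(A) == len(vectors)

    # The three lists in R^3 from the examples above.
    print(is_independent([1, 0, 0], [0, 1, 0], [0, 0, 1]))   # True:  the standard basis
    print(is_independent([1, 1, 2], [2, 2, 4]))              # False: second vector is 2x the first
    print(is_independent([1, 0, 0], [0, 1, 1], [1, 2, 2]))   # False: v3 = v1 + 2*v2

Any four vectors in R^3 fail this test, since a 3 x 4 matrix has rank at most 3; this answers the "why?" above and is a special case of Theorem 6 below.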

Exercises:

- Show that the set Span(v_1, ..., v_m) is always a vector subspace.
- Show that Span(v_1, ..., v_m) = Span(v_1) + ... + Span(v_m). This relates span to sum!
- Show that (v_1, ..., v_m) is linearly independent if and only if Span(v_1, ..., v_m) = Span(v_1) ⊕ ... ⊕ Span(v_m) and all the v_i are nonzero.

Examples (8)

- If we append any vectors to a spanning list, the result is still spanning.
- Span(0) = {0}. In general, Span(v_1, ..., v_m) = {0} if and only if all of v_1, ..., v_m are zero.
- For nonzero v, Span(v) = {λv : λ ∈ F} is the line through v and the origin.
- For nonzero v and for w not a multiple of v, Span(v, w) = {av + bw : a, b ∈ F} is the plane through the origin containing v and w.
- Span(x, x^2, ..., x^n) = the space of polynomials of degree at most n which are multiples of x (i.e., which vanish at 0).

The linear dependence lemma (9)

We strengthen (a) from Lemma 2.4 and state it differently:

Lemma 3. (a) A list (v_1, ..., v_m) is linearly independent if and only if v_1 ≠ 0 and, for all 2 ≤ j ≤ m, v_j ∉ Span(v_1, ..., v_{j-1}).
(b) If the list is linearly dependent and v_j ∈ Span(v_1, ..., v_{j-1}), then the span of the list (v_1, ..., v_{j-1}, v_{j+1}, ..., v_m) obtained by removing v_j is the same as the span of (v_1, ..., v_m).

Intuitive idea: As we saw, a list (v_1, ..., v_m) in R^n is linearly independent if and only if v_1 spans a line, (v_1, v_2) spans a plane, etc., and (v_1, ..., v_j) spans a j-dimensional space for all 1 ≤ j ≤ m, i.e., each v_j is not in the span of the previous v_1, ..., v_{j-1}. For part (b), the idea is: if v_j ∈ Span(v_1, ..., v_{j-1}), then we can replace v_j by a linear combination of v_1, ..., v_{j-1}, so the span is unchanged if we discard v_j.
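
As an illustration of part (b) (not in the original notes), here is a minimal Python/NumPy sketch using the dependent list ((1, 0, 0), (0, 1, 1), (1, 2, 2)) from the earlier examples: it finds the first v_j lying in the span of its predecessors via a least-squares solve, removes it, and checks that the span (measured by rank) is unchanged.

    import numpy as np

    vectors = [np.array([1., 0., 0.]), np.array([0., 1., 1.]), np.array([1., 2., 2.])]

    # Find the first v_j (j >= 2) with v_j in Span(v_1, ..., v_{j-1}):
    # least squares gives the best combination; zero residual means v_j is in the span.
    for j in range(1, len(vectors)):
        A = np.column_stack(vectors[:j])
        coeffs = np.linalg.lstsq(A, vectors[j], rcond=None)[0]
        if np.allclose(A @ coeffs, vectors[j]):
            print(f"v_{j+1} is in the span of the previous vectors; coefficients: {coeffs}")
            break

    # Removing that v_j leaves the span unchanged: the ranks agree.
    remaining = vectors[:j] + vectors[j+1:]
    full_rank = np.linalg.matrix_rank(np.column_stack(vectors))
    reduced_rank = np.linalg.matrix_rank(np.column_stack(remaining))
    print(full_rank == reduced_rank)   # True: same span after discarding v_j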

Proof of lemma (10)

Proof. Axler proves half of (a) and all of (b): If (v_1, ..., v_m) is linearly dependent, then either v_1 = 0 or, for some j, v_j ∈ Span(v_1, ..., v_{j-1}) [half of part (a)]. In the latter case, the span of (v_1, ..., v_m) is the same as the span of the list obtained by removing v_j [part (b)].

Using this, it remains to prove the other half of part (a). Suppose that v_1 = 0. Then (0, v_2, ..., v_m) is linearly dependent: 1·v_1 + 0·v_2 + ... + 0·v_m = 0. Suppose that j ≥ 2 and v_j ∈ Span(v_1, ..., v_{j-1}). Write v_j = a_1 v_1 + ... + a_{j-1} v_{j-1}. Then a_1 v_1 + ... + a_{j-1} v_{j-1} + (-1)·v_j = 0. So (v_1, ..., v_m) is linearly dependent.

Bases (11)

Definition 4. A basis of V is a list (v_1, ..., v_m) of vectors in V that is both linearly independent and a spanning list.

Main (characterizing) property:

Proposition 0.3 (Proposition 2.8). A list (v_1, ..., v_m) in V is a basis if and only if, for every vector v ∈ V, there exist unique a_1, ..., a_m ∈ F such that v = a_1 v_1 + ... + a_m v_m.

Proof: Read it in Axler and understand it! Idea of proof: Break into parts:
(a) The list (v_1, ..., v_m) is spanning if and only if every vector v can be written in at least one way as v = a_1 v_1 + ... + a_m v_m.
(b) The list (v_1, ..., v_m) is linearly independent if and only if every vector v can be written in at most one way as v = a_1 v_1 + ... + a_m v_m.

Exercise: Read the proof in Axler and use it to prove (a) and (b) separately.

Examples (12)

- A basis of R^2: ((1, 0), (0, 1)). More generally, (u, v) where u ≠ 0 and v is not a multiple of u.
- The standard basis of R^n: (v_1, ..., v_n), where v_i = (0, 0, ..., 0, 1, 0, ..., 0), with 1 in the i-th place.
- Functions {1, 2, ..., n} → F: a basis is the collection of delta functions, (δ_1, ..., δ_n), where δ_i(j) = 0 if i ≠ j, and δ_i(i) = 1.
- Deep result we will prove: For F^n or the space of functions {1, ..., n} → F, a spanning list is a basis if and only if it has length n. Similarly, again for F^n or the space of functions {1, ..., n} → F, a linearly independent list is a basis if and only if it has length n.
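
As a small numerical companion to Proposition 0.3 (not in the original notes), the following Python/NumPy sketch computes the unique coordinates of a vector with respect to a basis of R^3; the particular basis and vector are arbitrary choices for illustration.

    import numpy as np

    # Columns of B form a (claimed) basis of R^3; det(B) != 0 confirms it is
    # both linearly independent and spanning.
    B = np.column_stack([[1., 1., 0.], [0., 1., 1.], [1., 0., 1.]])
    assert not np.isclose(np.linalg.det(B), 0.0)

    v = np.array([2., 3., 5.])

    # Proposition 0.3: there exist unique a_1, a_2, a_3 with v = a_1 v_1 + a_2 v_2 + a_3 v_3.
    # Solving B a = v gives exactly those coordinates.
    a = np.linalg.solve(B, v)
    assert np.allclose(B @ a, v)
    print("coordinates of v in this basis:", a)

Uniqueness corresponds to B being invertible: a spanning but dependent list would give many solutions (at least one way), an independent but non-spanning list possibly none (at most one way), matching parts (a) and (b) above.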

Preview on Dimension (13)

The above deep properties are saying that F^n and the space of functions {1, ..., n} → F have dimension n.

Definition 5. A vector space V is finite-dimensional if and only if there is a (finite) spanning list (v_1, ..., v_m).

A deep result we will prove next time:

Proposition 0.4. If V is finite-dimensional, then the following numbers coincide: (a) the minimum length of a spanning list; (b) the maximum length of a linearly independent list; (c) the length of every basis. This number is called the dimension of V.

A vector space which is not finite-dimensional is called infinite-dimensional. This includes P(F) and F^∞.

Main theorem on finite spanning and linearly independent lists (14)

The key ingredient in the proof of the previous proposition on dimension is our main theorem:

Theorem 6. In a finite-dimensional vector space, the length of every linearly independent list is less than or equal to the length of every spanning list.
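
Before the proof, a quick numerical sanity check (an aside, not from the notes), in Python/NumPy for R^3: the standard spanning list has length 3, and any list of 4 vectors (here randomly chosen) fails the rank test for independence, consistent with "independent length ≤ spanning length".

    import numpy as np

    rng = np.random.default_rng(0)

    # (e_1, e_2, e_3) is a spanning list of R^3 of length n = 3.
    spanning = np.eye(3)
    assert np.linalg.matrix_rank(spanning) == 3

    # Any list of 4 vectors in R^3 is linearly dependent: a 3 x 4 matrix has rank <= 3 < 4.
    # We test a batch of random lists; none can be independent, so m <= n = 3 here.
    for _ in range(100):
        four_vectors = rng.standard_normal((3, 4))   # columns are 4 vectors in R^3
        assert np.linalg.matrix_rank(four_vectors) <= 3
    print("No list of 4 vectors in R^3 is linearly independent (rank <= 3).")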

Proof: Let (w_1, ..., w_n) be a spanning list and (u_1, ..., u_m) be a linearly independent list. We have to show m ≤ n. In fact, we will show that (u_1, ..., u_m) can be extended to a spanning list of length n by adding some of the w_j's, so in particular m ≤ n. To do so, we work in the other direction: we begin with (w_1, ..., w_n) and at each step replace a w_j with a u_i.

Proof of main theorem continued (15)

First, let us append u_1: form the list (u_1, w_1, ..., w_n). Since u_1 ∈ Span(w_1, ..., w_n), this list is linearly dependent. By the linear dependence lemma, since u_1 ≠ 0, for some j, w_j ∈ Span(u_1, w_1, ..., w_{j-1}). Then the list (u_1, w_1, ..., w_{j-1}, w_{j+1}, ..., w_n) is also spanning.

Inductively, suppose that we have a spanning list of length n, beginning with u_1, ..., u_i, and ending with n - i of the w's. Append u_{i+1} to the list. Since u_{i+1} is in the span of the other vectors, the result is linearly dependent. Reorder the list to begin with u_1, ..., u_{i+1}. By the linear dependence lemma, either u_j ∈ Span(u_1, ..., u_{j-1}) for some j ≤ i + 1 (impossible, since (u_1, ..., u_{i+1}) is linearly independent), or else one of the w's, say w_k, is in the span of the previous elements in the list. Now discard w_k, completing the induction. After m steps we get a spanning list of length n beginning with u_1, ..., u_m. So m ≤ n.

For next time: Read all of Chapter 2, and understand the proofs of Theorems 2.10 and 2.12; bring questions to class.