Math 240, 4.3 Linear Independence; Bases. A. DeCelles


1 Overview

Main ideas:
1. definitions of linear independence, linear dependence, dependence relation, basis
2. characterization of a linearly dependent set using linear combinations (thm 4)
3. paring down a spanning set to find a basis (thm 5)
4. bases for the null space and column space of a matrix (thm 6)
5. a basis is a minimal spanning set and a maximal linearly independent subset

Examples in text:
1. a linearly dependent set in P
2. linearly independent and dependent sets in C[0, 1], the vector space of continuous functions on [0, 1]
3. columns of an invertible matrix
4. standard basis for R^n
5. determine whether three vectors in R^3 are a basis for R^3
6. standard basis for P_n
7. find a basis for the subspace spanned by a given set
8. find a basis for the column space of a matrix in rref
9. find a basis for the column space of a matrix not in rref
10. enlarging a linearly independent set to a basis, and beyond

2 Discussion and Worked Examples

2.1 Linearly Independent Sets and Spanning Sets in R^n

(Warm-up: For each set of vectors in R^2, determine whether the set is linearly independent and describe the span.)

Two Observations
(1) Two vectors in R^2 that do not lie on the same line span the whole plane.
(2) A set of three or more vectors in R^2 is always linearly dependent.

Let's take some time to recap the arguments for why these facts are true. Let {v_1, ..., v_n} be a set of n vectors in R^m, and let A be the matrix whose columns are the vectors v_1, ..., v_n. Then

  {v_1, ..., v_n} spans R^m      ⇔  Ax = b is consistent for all b in R^m  ⇔  A has a pivot in every row
  {v_1, ..., v_n} is lin. indep. ⇔  Ax = 0 has only the trivial solution   ⇔  A has a pivot in every column
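The pivot criteria above are easy to check by row reducing the matrix whose columns are the given vectors. Here is a minimal sketch (assuming sympy is available; the three vectors in R^3 are illustrative, not taken from the warm-up):

```python
# Check the pivot criteria: columns span R^m iff there is a pivot in every
# row; columns are linearly independent iff there is a pivot in every column.
from sympy import Matrix

v1, v2, v3 = Matrix([1, 0, 2]), Matrix([0, 1, 1]), Matrix([1, 1, 3])
A = Matrix.hstack(v1, v2, v3)      # columns are the vectors
_, pivot_cols = A.rref()           # reduced row echelon form and pivot columns

m, n = A.shape
print(len(pivot_cols) == m)        # spans R^m?  here False
print(len(pivot_cols) == n)        # linearly independent?  here False, since v3 = v1 + v2
```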

When n = m, A has the same number of rows and columns, so if it has a pivot in every row it must have a pivot in every column, and vice versa. Thus

  {v_1, ..., v_n} spans R^n      ⇔  Ax = b is consistent for all b in R^n  ⇔  A has a pivot in every row
  {v_1, ..., v_n} is lin. indep. ⇔  Ax = 0 has only the trivial solution   ⇔  A has a pivot in every column

and, since n = m, the two right-hand conditions coincide. Thus a set of n vectors in R^n spans R^n if and only if it is linearly independent. Also notice that any set of n vectors in R^m (with n > m) is linearly dependent, because if A has more columns than rows it cannot have a pivot in every column. Further, a set of n vectors in R^m (with n < m) cannot span R^m, because if A has more rows than columns it cannot have a pivot in every row.

A linearly independent spanning set for R^n is called a basis for R^n. What we have observed in the plane and proven in R^n is that:
- Any linearly independent set of exactly n vectors in R^n is a basis for R^n.
- Any spanning set for R^n that has exactly n vectors is a basis for R^n.
- A basis for R^n cannot have more than n vectors (or it would be linearly dependent) and cannot have fewer than n vectors (or it would not span). Thus a basis for R^n has exactly n vectors.

Note  The columns of an n × n invertible matrix form a basis for R^n, by the Invertible Matrix Theorem.

Note  Given a linearly independent set in R^n that does not span R^n, we can add new vectors to make a basis. Similarly, given a linearly dependent spanning set in R^n, we can remove vectors to make a basis.

Example  For each of the sets of vectors in R^2 above, either add or remove vectors to make a basis.

In order to define a basis for an abstract vector space, we need an abstract notion of linear independence.

2.2 Linearly Independent Sets and Bases in an Abstract Vector Space

Definition  Let {v_1, v_2, ..., v_p} be a subset of a vector space V. The set is linearly dependent if there are weights c_1, c_2, ..., c_p, some of which may be zero but not all, such that

  c_1 v_1 + c_2 v_2 + ... + c_p v_p = 0

Otherwise the set is linearly independent: the only set of weights c_1, c_2, ..., c_p satisfying this equation is the set of all zero weights, c_1 = c_2 = ... = c_p = 0.
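To illustrate the square case in the first Note above: a quick sketch (assuming sympy; the 3 × 3 matrix is illustrative, not from the text):

```python
# For a square matrix, "pivot in every row" and "pivot in every column" are
# the same condition, so the columns span R^n exactly when they are
# linearly independent.
from sympy import Matrix

A = Matrix([[1, 2, 0],
            [0, 1, 3],
            [4, 0, 1]])
_, pivots = A.rref()
n = A.shape[0]
print(len(pivots) == n)        # True: pivot in every row and every column
print(A.det() != 0)            # equivalently, A is invertible, so its columns
                               # form a basis for R^3 (Invertible Matrix Theorem)
```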

A set consisting of just one vector is linearly dependent if and only if it is the set consisting of the zero vector. A set of more than one vector is linearly dependent if and only if one of the vectors can be written as a linear combination of the others. (This is Theorem 4 in the text.)

Note  Recall that in R^n a set of vectors is linearly independent if and only if the homogeneous equation Ax = 0 has only the trivial solution, where A is the matrix whose columns are the vectors in the specified set. However, for an abstract vector space we may not be able to write vectors as columns of a matrix.

Example  Consider the following polynomials in P and determine whether they are linearly dependent or independent:

  P(x) = 2      Q(x) = x^2      R(x) = x^2 + 1

Example  The functions sin(x) and cos(x) are linearly independent in the vector space of functions on R, but the functions sin(2x) and sin(x) cos(x) are linearly dependent, since sin(2x) = 2 sin(x) cos(x).

Definition  A linearly independent spanning set of a vector space V is called a basis for V. (The plural of "basis" is "bases.")

The standard basis for R^n is {e_1, e_2, ..., e_n}, where e_1 = (1, 0, ..., 0), e_2 = (0, 1, 0, ..., 0), ..., e_n = (0, ..., 0, 1).

The standard basis for P_n is {1, x, x^2, ..., x^n}. Certainly any polynomial with real coefficients of degree d ≤ n can be written as a linear combination of these monomials. The fact that they are linearly independent is also clear, because all of the coefficients of the zero polynomial are zero.

As in R^n, a spanning set for a vector space V that is linearly dependent can be pared down to a basis for V by removing vectors that can be written as linear combinations of previous vectors in the spanning set, and a linearly independent set that does not span V can be beefed up to a basis for V by adding vectors to the set that are linearly independent:

  lin. indep. set that does not span   -- add vectors -->   BASIS   -- add vectors -->   lin. dep. spanning set
                                      <-- remove vectors --        <-- remove vectors --

In this sense a basis is a minimal spanning set and a maximal linearly independent set.
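One way to settle the polynomial example above is to work with coefficient vectors relative to the standard basis {1, x, x^2}. A sketch (assuming sympy):

```python
# Test the polynomials from the example for linear dependence by writing
# each one as a coefficient vector in the standard basis {1, x, x^2} and
# row reducing the matrix whose columns are those vectors.
from sympy import Matrix

P = Matrix([2, 0, 0])      # P(x) = 2
Q = Matrix([0, 0, 1])      # Q(x) = x^2
R = Matrix([1, 0, 1])      # R(x) = x^2 + 1
M = Matrix.hstack(P, Q, R)

print(M.nullspace())       # a nonzero vector means a dependence relation exists;
                           # here R = (1/2) P + Q, so the set is dependent
```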

2.3 Bases for the Null Space and the Column Space of a Matrix

Recall that row reduction can be described in terms of multiplication by elementary matrices:

  A  →  E_1 A  →  E_2 E_1 A  →  ...  →  E_r ··· E_1 A   (rref)

Let A_rref denote the reduced row echelon form of A. If Ax = 0, then A_rref x = E_r ··· E_1 (Ax) = 0. On the other hand, if A_rref x = 0, then (E_r ··· E_1) A x = 0, and thus Ax = (E_r ··· E_1)^(-1) (0) = 0, since elementary matrices are invertible. Thus

  Ax = 0  ⇔  A_rref x = 0

This key observation allows us to determine bases for the null space and the column space of a matrix from the reduced row echelon form of the matrix.

Example  Find bases for the null spaces and column spaces of the matrices A and B, which are given below along with their reduced row echelon forms.

  [the matrices A and B and their reduced row echelon forms A_rref and B_rref are displayed here]

Since Ax = 0 if and only if A_rref x = 0, the null space of a matrix is the same as the null space of its reduced row echelon form. Recall that an explicit description of the null space of a matrix is obtained by writing the solution set of the associated homogeneous equation in parametric vector form. A vector x is in the null space of A if and only if it can be written in parametric vector form as

  x = x_3 v + x_5 u

where x_3 and x_5 are the free variables. Thus the null space is spanned by the vectors v and u. Notice that these two vectors are linearly independent: if c v + d u = 0, then every entry of the vector c v + d u is zero, and the entries in the free-variable positions (the third and fifth) are just c and d, so c = 0 and d = 0.
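The same computation can be checked mechanically. A sketch (assuming sympy); the matrix below is an illustrative stand-in with the same pivot pattern as A (pivots in columns 1, 2, 4; free variables x_3 and x_5), not the matrix of the example:

```python
# Basis for the null space: one spanning vector per free variable, read off
# from the parametric vector form of the solutions of Ax = 0.
from sympy import Matrix

A = Matrix([[1, 0, 2, 0, 0],
            [0, 1, 1, 0, 3],
            [0, 0, 0, 1, 2]])
print(A.rref())        # rref and pivot columns (0, 1, 3), i.e. columns 1, 2, 4
print(A.nullspace())   # two vectors, one per free variable; they are
                       # automatically linearly independent, hence a basis
```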

Recall that whenever we obtain a spanning set for the null space of a matrix from the reduced row echelon form of the matrix, the spanning set will be linearly independent. In other words, this procedure always yields a basis for the null space of the original matrix.

Similarly, to find a basis for the null space of B, we write the solution set of Bx = 0 in parametric vector form. A vector x is a solution to Bx = 0 if and only if x = x_3 w for a single vector w, since x_3 is the only free variable. Thus a basis for the null space of B is {w}.

Recall that the column space of A is the span of the columns of A. Thus for a given matrix A we already have a spanning set for the column space of A, namely the columns of A. However, the columns may be linearly dependent, in which case we need to remove some of the vectors to obtain a linearly independent spanning set, i.e. a basis.

Recall that we can interpret the matrix-vector product Ax as a linear combination of the columns of A, where the entries of x are the weights in the linear combination. The columns of A are linearly independent if and only if the only way to weight the columns of A so that the linear combination equals zero is to have all the weights be zero, i.e. the only solution to Ax = 0 is the zero vector x = 0. We can tell from the reduced row echelon form of A whether there are nontrivial solutions to Ax = 0, since Ax = 0 if and only if A_rref x = 0. Looking at the matrices A and B above, it is clear that Ax = 0 and Bx = 0 both have nontrivial solutions.

If the columns of a matrix A are linearly dependent, each nonzero vector x such that Ax = 0 gives a dependency relation among the columns of A. Since Ax = 0 if and only if A_rref x = 0, the dependency relations among the columns of A are the same as the dependency relations among the columns of A_rref. For example, if the third column of A_rref is a linear combination of the first and second columns of A_rref, then the third column of A is a linear combination of the first and second columns of A.

Look at the matrix A_rref above. Notice that the third column is a linear combination of the first two columns, and the fifth column is a linear combination of the second and fourth columns. Thus the third and fifth columns are redundant in the following sense: the span of the columns of A_rref is the same as the span of the first, second, and fourth columns of A_rref. Since the first, second, and fourth columns are linearly independent and they span the column space of A_rref, they are a basis for the column space of A_rref. Since the dependency relations for the columns of A are the same as the dependency relations for the columns of A_rref, the third column of A is a linear combination of the first and second columns, and the fifth column of A is a linear combination of the second and fourth columns of A. Thus the first, second, and fourth columns of A form a basis for the column space of A.
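The column-space procedure is just as mechanical: keep the columns of the original matrix that sit in the pivot positions of its rref. A sketch (assuming sympy; the matrix is an illustrative placeholder, not the A or B of the example):

```python
# Basis for the column space: the pivot columns of the ORIGINAL matrix,
# where the pivot positions are read off from the rref.
from sympy import Matrix

M = Matrix([[1, 2, 3, 1],
            [2, 4, 6, 3],
            [1, 2, 4, 2]])
R, pivot_cols = M.rref()
basis = [M.col(j) for j in pivot_cols]   # columns of M, not of R
print(pivot_cols)                        # which columns are pivot columns
print(basis)                             # a basis for Col(M)
```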

Similarly, since the third column of B_rref is a linear combination of the first and second columns of B_rref, we may remove it from the spanning set without changing the span. The remaining four columns are linearly independent, so they form a basis for the column space of B_rref. Thus the first, second, fourth, and fifth columns of B form a basis for the column space of B.

Summary:
1. To find a basis for the null space of a matrix A: find A_rref and write the solution set of A_rref x = 0 in parametric vector form, using the free variables as the parameters. The spanning set for the null space obtained in this way is a basis.
2. To find a basis for the column space of a matrix A: the columns of A are a spanning set for the column space of A. To determine which columns to remove in order to ensure linear independence, find A_rref. If any columns of A_rref are linear combinations of previous columns, remove the corresponding columns of A from the spanning set. The resulting spanning set will be a basis.
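The two summary steps can be bundled into a single routine. A sketch (assuming sympy; the function name find_bases and the 2 × 3 test matrix are illustrative, not from the text):

```python
# The two-step summary above packaged as one helper.
from sympy import Matrix

def find_bases(A: Matrix):
    """Return (basis of Nul A, basis of Col A) following the summary above."""
    _, pivot_cols = A.rref()
    null_basis = A.nullspace()                   # parametric-vector-form vectors
    col_basis = [A.col(j) for j in pivot_cols]   # pivot columns of A itself
    return null_basis, col_basis

null_basis, col_basis = find_bases(Matrix([[1, 2, 2], [2, 4, 4]]))
print(null_basis, col_basis)   # Nul basis has 2 vectors, Col basis has 1
```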