CHAPTER 5. Linear Operators, Span, Linear Independence, Basis Sets, and Dimension


A SERIES OF CLASS NOTES TO INTRODUCE LINEAR AND NONLINEAR PROBLEMS TO ENGINEERS, SCIENTISTS, AND APPLIED MATHEMATICIANS

LINEAR CLASS NOTES: A COLLECTION OF HANDOUTS FOR REVIEW AND PREVIEW OF LINEAR THEORY INCLUDING FUNDAMENTALS OF LINEAR ALGEBRA

CHAPTER 5

Linear Operators, Span, Linear Independence, Basis Sets, and Dimension

1. Linear Operators
2. Spanning Sets
3. Linear Independence of Column Vectors
4. Basis Sets and Dimension

Ch. 5 Pg. 1

Handout #1    LINEAR OPERATOR THEORY    Professor Moseley

In this handout, we preview linear operator theory. The most important examples of linear operators are differential and integral operators and operators defined by matrix multiplication; these arise in many applications. Lumped parameter systems (e.g., linear circuits and mass/spring systems) give rise to discrete operators defined on finite dimensional vector spaces (e.g., R^n). Differential and integral equations (e.g., Maxwell's equations and the Navier-Stokes equations) are used to model distributed (continuum) systems and require infinite dimensional vector spaces; these give rise to differential and integral operators on function spaces. Even without covering any topics in differential equations, your background in calculus should be sufficient to see how discrete and continuous operators are connected as linear operators on a vector space.

A function or map T from one vector space V to another vector space W is often called an operator. If we wish to think geometrically (e.g., if V and W are R^2 or R^3) rather than algebraically, we might call T a transformation.

DEFINITION #1. Let V and W be vector spaces over the same field K. An operator T:V→W is said to be linear if for all x, y ∈ V and all scalars α, β ∈ K, it is true that

    T(αx + βy) = αT(x) + βT(y).    (1)

THEOREM #1. Let V and W be vector spaces over the same field K. An operator T:V→W is linear if and only if the following two properties hold:
    i) x, y ∈ V implies T(x + y) = T(x) + T(y),    (2)
    ii) α ∈ K and x ∈ V implies T(αx) = αT(x).    (3)

EXAMPLE #1. Let the operator T:R^n→R^m be defined by matrix multiplication of the column vector x by the m x n matrix A; that is, let

    T(x) =df A x    (4)

where x = [x1, x2, ..., xn]^T ∈ R^n and A = [a_ij] ∈ R^(m x n), i.e.,

    A = [ a11  a12  ...  a1n ]
        [ a21  a22  ...  a2n ]
        [ ...             ...]
        [ am1  am2  ...  amn ].

Then T is a linear operator.
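As a quick sanity check of Example #1, a small Python sketch (not part of the original notes; the matrix A and the vectors x, y below are arbitrary illustrative choices) verifies the linearity condition (1) numerically for T(x) = A x:

```python
# Sketch (not from the notes): checking T(a*x + b*y) = a*T(x) + b*T(y)
# for the matrix operator T(x) = A x, with plain Python lists.

def mat_vec(A, x):
    """Multiply a matrix A (list of rows) by a column vector x."""
    return [sum(a_ij * x_j for a_ij, x_j in zip(row, x)) for row in A]

def axpy(a, x, b, y):
    """The linear combination a*x + b*y of two vectors."""
    return [a * xi + b * yi for xi, yi in zip(x, y)]

A = [[1.0, 2.0, 0.0],
     [0.0, 1.0, 3.0]]                    # 2 x 3, so T : R^3 -> R^2
x, y = [1.0, -1.0, 2.0], [0.0, 4.0, 1.0]
a, b = 3.0, -2.0

lhs = mat_vec(A, axpy(a, x, b, y))               # T(a x + b y)
rhs = axpy(a, mat_vec(A, x), b, mat_vec(A, y))   # a T(x) + b T(y)
assert lhs == rhs                                # property (1) holds
```

The same check with other matrices and scalars illustrates Theorem #1: matrix multiplication distributes over vector addition and commutes with scaling.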

EXAMPLE #2. Let I = (a,b). The operator D:C^1(I,R)→C(I,R) is defined by

    D(f) = df/dx    (5)

where f ∈ C^1(I,R) = {f:I→R : df/dx exists and is continuous on I} and C(I,R) = {f:I→R : f is continuous on I}. Then D is a linear operator. We may restrict D to A(I,R) = {f:I→R : f is analytic on I} so that D:A(I,R)→A(I,R) maps a vector space back to itself.

DEFINITION #2. Let T:V→W be a mapping from a set V to a set W. The set R(T) = {y ∈ W : there exists an x ∈ V such that y = T(x)} is called the range of T. If W has an additive structure (i.e., a binary operation which we call addition) with an additive identity which we call 0, then the set N(T) = {x ∈ V : T(x) = 0} is called the null set of T (or nullity of T). If T is a linear operator from a vector space V to another vector space W, we can say more.

THEOREM #2. Let T:V→W be a linear operator from a vector space V to a vector space W. The range of T, R(T) = {y ∈ W : there exists an x ∈ V such that y = T(x)}, is a subspace of W, and the null set of T, N(T) = {x ∈ V : T(x) = 0}, is a subspace of V.

We rename these sets.

DEFINITION #3. Let T:V→W be a linear operator from a vector space V to another vector space W. The set R(T) = {y ∈ W : there exists an x ∈ V such that y = T(x)} is called the range space of T, and the set N(T) = {x ∈ V : T(x) = 0} is called the null space of T.
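To see the linearity of D in Example #2 concretely without any analysis, a Python sketch (not part of the original notes; the coefficient-list representation is an illustrative finite stand-in for C^1(I,R)) checks D(2p + q) = 2 D(p) + D(q) on polynomials:

```python
# Sketch (not from the notes): polynomials p(x) = c0 + c1*x + ... stored
# as coefficient lists; differentiation D becomes a map on those lists,
# and we verify the defining property of a linear operator on samples.

def deriv(c):
    """Coefficient list of p'(x), given the coefficient list c of p(x)."""
    return [k * ck for k, ck in enumerate(c)][1:] or [0]

def add(p, q):
    """Coefficient list of p(x) + q(x)."""
    n = max(len(p), len(q))
    p, q = p + [0] * (n - len(p)), q + [0] * (n - len(q))
    return [a + b for a, b in zip(p, q)]

def scale(a, p):
    """Coefficient list of a*p(x)."""
    return [a * ck for ck in p]

p = [1, 0, 3]       # p(x) = 1 + 3x^2
q = [0, 2, 0, 5]    # q(x) = 2x + 5x^3

# D(2p + q) = 2 D(p) + D(q): linearity of differentiation
assert deriv(add(scale(2, p), q)) == add(scale(2, deriv(p)), deriv(q))
```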

EXERCISES on Linear Operator Theory

EXERCISE #1. True or False.
1. A linear circuit is an example of a lumped parameter system.
2. A mass/spring system is an example of a lumped parameter system.
3. A function or map T from one vector space V to another vector space W is often called an operator.
4. An operator T:V→W is said to be linear if for all x, y ∈ V and all scalars α, β ∈ K, it is true that T(αx + βy) = αT(x) + βT(y).
5. An operator T:V→W is linear if and only if the following two properties hold: i) x, y ∈ V implies T(x + y) = T(x) + T(y), and ii) α ∈ K and x ∈ V implies T(αx) = αT(x).
6. The operator T:R^n→R^m defined by T(x) = A x, where A is an m x n matrix, is a linear operator.
7. The operator D:C^1(a,b)→C(a,b) defined by D(f) = df/dx is a linear operator.
8. C(a,b) = {f:(a,b)→R : f is continuous}.
9. C^1(a,b) = {f:(a,b)→R : df/dx exists and is continuous}.
10. If T:V→W, then the set R(T) = {y ∈ W : there exists an x ∈ V such that y = T(x)} is called the range of T.
11. If T:V→W and W has an additive structure (i.e., a binary operation which we call addition) with an additive identity which we call 0, then the set N(T) = {x ∈ V : T(x) = 0} is called the null set of T (or nullity of T).
12. If T:V→W is a linear operator from a vector space V to a vector space W, then the range of T, R(T) = {y ∈ W : there exists an x ∈ V such that y = T(x)}, is a subspace of W.
13. If T:V→W is a linear operator from a vector space V to a vector space W, then the null set of T, N(T) = {x ∈ V : T(x) = 0}, is a subspace of V.
14. If T:V→W is a linear operator from a vector space V to another vector space W, then the set R(T) = {y ∈ W : there exists an x ∈ V such that y = T(x)} is called the range space of T.
15. If T:V→W is a linear operator from a vector space V to another vector space W, then the set N(T) = {x ∈ V : T(x) = 0} is called the null space of T.
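Theorem #2 above asserts that N(T) is a subspace whenever T is linear. A Python sketch (not part of the original notes; the singular matrix below is an illustrative choice) spot-checks closure of the null space under addition and scalar multiplication:

```python
# Sketch (not from the notes): for T([x, y]^T) = A [x, y]^T with the
# singular matrix A below, N(T) is the y-axis; we check on samples that
# sums and scalar multiples of null-space vectors stay in N(T).

A = [[1, 0],
     [0, 0]]        # T([x, y]^T) = [x, 0]^T, an illustrative choice

def T(x):
    return [sum(a_ij * xi for a_ij, xi in zip(row, x)) for row in A]

u, v = [0, 2], [0, -5]                     # both lie in N(T)
assert T(u) == [0, 0] and T(v) == [0, 0]

w = [ui + vi for ui, vi in zip(u, v)]      # u + v
assert T(w) == [0, 0]                      # the sum stays in N(T)
assert T([3 * ui for ui in u]) == [0, 0]   # a scalar multiple stays in N(T)
```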

Handout #2    SPANNING SETS    Professor Moseley

DEFINITION #1. If x1, x2, ..., xn are vectors in a vector space V and α1, α2, ..., αn are scalars, then

    α1 x1 + α2 x2 + ... + αn xn = Σ_{i=1}^{n} αi xi

is called a linear combination of the vectors. (It is important to note that a linear combination allows only a finite number of vectors.)

EXAMPLE #1. Consider the system of linear algebraic equations:

    x + y + 2z = 1
    4x + y = -3
    x - y - z = 0.

This set of scalar equations can be written as the vector equation

    x [1, 4, 1]^T + y [1, 1, -1]^T + z [2, 0, -1]^T = [1, -3, 0]^T

where the left hand side (LHS) is a linear combination of the (column vectors whose components come from the) columns of the coefficient matrix

    A = [ 1  1  2 ]
        [ 4  1  0 ]
        [ 1 -1 -1 ].

If we generalize Example #1, we obtain:

THEOREM #1. The general system

    A x = b    (1)

(where A is n x n and x, b are n x 1) has a solution if and only if the vector b can be written as a linear combination of the

columns of A. That is, b is in the range space of T(x) = A x if and only if b can be written as a linear combination of the (column vectors whose components come from the) columns of the coefficient matrix A.

DEFINITION #2. Let S be a (finite) subset of a subspace W of a vector space V. If every vector in W can be written as a linear combination of (a finite number of) vectors in S, then S is said to span W or to form a spanning set for W. On the other hand, if S is any (finite) set of vectors, the span of S, written Span(S), is the set of all possible linear combinations of (a finite number of) vectors in S.

THEOREM #2. For any (finite) subset S of a vector space V, Span(S) is a subspace of V.

THEOREM #3. If (a finite set) S is a spanning set for W, then Span(S) = W.

EXAMPLES. Consider the following subsets of R^3.
1. S = {[1,0,0]^T} = {î}. Then Span(S) = {x î : x ∈ R} = the x-axis.
2. S = {[1,0,0]^T, [0,1,0]^T} = {î, ĵ}. Then Span(S) = {x î + y ĵ : x, y ∈ R} = the xy-plane.
3. S = {[1,0,0]^T, [1,1,0]^T} = {î, î+ĵ}. Then Span(S) = the xy-plane.
4. S = {[1,0,0]^T, [0,1,0]^T, [1,1,0]^T} = {î, ĵ, î+ĵ}. Then Span(S) = the xy-plane.

DEFINITION #3. For any matrix A, the span of the set of column vectors is the column space of A. The usual notation for the column space is R(A), and sometimes R_A; normally we will use R(A). The reason for this notation is that when we think of the operator T(x) defined by (matrix) multiplication of the matrix A by the column vector x = [x1, ..., xn]^T, we see that T maps vectors x ∈ R^n into vectors y = A x ∈ R^m, so the column space R(A) is seen to be the range (space) of T.

COMMENT. Consider the general system (1) above. Theorem #1 can now be rephrased to say that (1) has a solution if and only if b is in the column space of A (i.e., in the range of the operator T).

DEFINITION #4. For any matrix A, the span of the set of row vectors is the row space of A. The usual notation for the row space is R(A^T), and sometimes R_{A^T}, since the row space of A is the column space of A^T.
Normally we will use R(A^T). If we think of the operator T^T(y) defined by (matrix) multiplication of the matrix A^T by the column vector y = [y1, ..., ym]^T, we see that A^T maps vectors y ∈ R^m into vectors x = A^T y ∈ R^n,

so the row space R(A^T) is seen to be the range (space) of A^T. Recall that all of the coefficient matrices for the associated linear systems of algebraic equations obtained in the process of doing Gauss elimination are row equivalent. Hence they all have the same row space. However, they do not in general have the same column space.

EXERCISES on Spanning Sets

EXERCISE #1. True or False.
1. If x1, x2, ..., xn are vectors in a vector space V and α1, α2, ..., αn are scalars, then α1 x1 + α2 x2 + ... + αn xn = Σ_{i=1}^{n} αi xi is a linear combination of the vectors.
2. A linear combination allows only a finite number of vectors.
3. The linear algebraic equations x + y + 2z = 1, 4x + y = -3, x - y - z = 0 form a system of scalar equations.
4. The coefficient matrix for the system of linear algebraic equations x + y + 2z = 1, 4x + y = -3, x - y - z = 0 is
    A = [ 1  1  2 ]
        [ 4  1  0 ]
        [ 1 -1 -1 ].
5. The linear algebraic equations x + y + 2z = 1, 4x + y = -3, x - y - z = 0 can be written as the vector equation x [1, 4, 1]^T + y [1, 1, -1]^T + z [2, 0, -1]^T = [1, -3, 0]^T.
6. The left hand side (LHS) of the vector equation x [1, 4, 1]^T + y [1, 1, -1]^T + z [2, 0, -1]^T = [1, -3, 0]^T is a linear combination of the column vectors whose components come from the columns of the coefficient matrix A above.
7. The general system A x = b has a solution if and only if the vector b can be written

as a linear combination of the columns of A.
8. b is in the range space of T(x) = A x if and only if b can be written as a linear combination of the column vectors whose components come from the columns of the coefficient matrix A.
9. If S is a finite subset of a subspace W of a vector space V and every vector in W can be written as a linear combination of a finite number of vectors in S, then S is said to span W or to form a spanning set for W.
10. If S is a finite subset of a subspace W of a vector space V, then the span of S, written Span(S), is the set of all possible linear combinations of a finite number of vectors in S.
11. For any finite subset S of a vector space V, Span(S) is a subspace of V.
12. If S is a finite subset of a subspace W of a vector space V and S is a spanning set for W, then Span(S) = W.
13. For any matrix A, the span of the set of column vectors is the column space of A.
14. The usual notation for the column space is R(A) and sometimes R_A.
15. The reason that R(A) is the column space of A is that when we think of the operator T(x) defined by matrix multiplication of the matrix A by the column vector x = [x1, ..., xn]^T, we see that T maps vectors x ∈ R^n into vectors y = A x ∈ R^m, which lie in the column space R(A).
16. The column space R(A) is the range space of T.
17. A x = b has a solution if and only if b is in the range of the operator T(x) = A x.
18. A x = b has a solution if and only if b is in the column space of A.
19. For any matrix A, the span of the set of row vectors is the row space of A.
20. The usual notation for the row space is R(A^T) and sometimes R_{A^T}, since the row space of A is the column space of A^T.
21. If we think of the operator T^T(y) defined by (matrix) multiplication of the matrix A^T by the column vector y = [y1, ..., ym]^T, we see that A^T maps vectors y ∈ R^m into vectors x = A^T y ∈ R^n, so the row space R(A^T) is seen to be the range space of A^T.
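The identity behind Theorem #1 — A x is exactly the combination x1·(column 1) + ... + xn·(column n), so A x = b is solvable precisely when b lies in the column space R(A) — can be checked with a short Python sketch (not part of the original notes; the matrix and vector are illustrative choices):

```python
# Sketch (not from the notes): computing A x two ways, row by row and as
# a linear combination of the columns of A, and checking they agree.

A = [[1, 1, 2],
     [4, 1, 0],
     [1, -1, -1]]
x = [1, -2, 3]

# ordinary row-by-row matrix-vector product
Ax = [sum(A[i][j] * x[j] for j in range(3)) for i in range(3)]

# the same vector assembled column by column: x1*col1 + x2*col2 + x3*col3
cols = [[A[i][j] for i in range(3)] for j in range(3)]
combo = [sum(x[j] * cols[j][i] for j in range(3)) for i in range(3)]

assert Ax == combo    # A x is a combination of the columns of A
```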

Handout #3    LINEAR INDEPENDENCE (OF COLUMN VECTORS)    Professor Moseley

DEFINITION #1. Let V be a vector space. A finite set of vectors S = {x1, ..., xk} ⊆ V is linearly independent (l.i.) if the only set of scalars c1, c2, ..., ck which satisfy the (homogeneous) vector equation

    c1 x1 + c2 x2 + ... + ck xk = 0    (1)

is c1 = c2 = ... = ck = 0; that is, (1) has only the trivial solution. If there is a set of scalars not all zero satisfying (1), then S is linearly dependent (l.d.).

It is common practice to describe the vectors (rather than the set of vectors) as being linearly independent or linearly dependent. Although this is technically incorrect, it is widespread, and hence we accept this terminology. Since we may consider (1) as a linear homogeneous equation with unknown vector [c1, c2, ..., ck]^T ∈ R^k, if (1) has one nontrivial solution, then it in fact has an infinite number of nontrivial solutions. As part of the standard procedure for showing that a set is linearly dependent directly using the definition (DUD), you must exhibit one (and only one) such nontrivial solution. To show that a set is linearly independent directly using the definition (DUD), you must show that the only solution of (1) is the trivial solution (i.e., that all the ci's must be zero).

Before looking at the application of this definition to column vectors, we state four theorems.

THEOREM #1. Let V be a vector space. If a finite set of vectors S = {x1, ..., xk} ⊆ V contains the zero vector, then S is linearly dependent.

THEOREM #2. Let V be a vector space. If x ≠ 0, then S = {x} ⊆ V is linearly independent.
Proof. To show that S is linearly independent we must show that c1 x = 0 implies c1 = 0. By the zero product theorem, if c1 x = 0, then c1 = 0 or x = 0. But by hypothesis x ≠ 0. Hence c1 x = 0 implies c1 = 0, and S = {x} where x ≠ 0 is linearly independent. Q.E.D.

THEOREM #3. Let V be a vector space and S = {x, y} ⊆ V. If either x or y is the zero vector, then S is linearly dependent.

THEOREM #4.
Let V be a vector space and S = {x, y} ⊆ V where x and y are nonzero vectors. Then S is linearly dependent if and only if one vector is a scalar multiple of the other.

Although the definition is stated for an abstract vector space and hence applies to any

vector space, and we have stated some theorems in this abstract setting, in this section we focus on column vectors in R^n (or C^n or K^n). Since we now know how to solve a system of linear algebraic equations, we can use this to develop a procedure to show that a finite set in R^n (or C^n or K^n) is linearly independent. We also show how to give sufficient conditions for a finite set in R^n (or C^n or K^n) to be linearly dependent.

PROCEDURE. To determine whether a set S = {x1, ..., xk} ⊆ V is linearly independent or linearly dependent, we first write down equation (1) and try to solve it. If we can show that the only solution is the trivial solution c1 = c2 = ... = ck = 0, then we have shown directly using the definition (DUD) of linear independence that S is linearly independent. On the other hand (OTOH), if we can exhibit a nontrivial solution, then we have shown directly using the definition of linear dependence that S is linearly dependent. We might recall that the linear theory assures us that if there is one nontrivial solution, then there are an infinite number of nontrivial solutions. However, to show that S is linearly dependent directly using the definition, it is not necessary (or desirable) to find all of the nontrivial solutions. Although you could argue that once you are convinced (using some theorem) that there are an infinite number of solutions, you do not need to exhibit one, this will not be considered a proof directly using the definition (as it requires a theorem). Thus to prove that S is linearly dependent directly using the definition of linear dependence, you must exhibit one nontrivial solution. This will help you to better understand the concepts of linear independence and linear dependence.

We apply these procedures to finite sets in R^n (or C^n). For S = {x1, ..., xk} ⊆ R^n (or C^n), (1) becomes a system of n equations (since we are in R^n (or C^n)) in k unknowns (since we have k vectors). (This can be confusing when applying general theorems about m equations in n unknowns. However, this should not be a problem when using DUD on specific problems.)

EXAMPLE #1. Determine (using DUD) whether S = { [1, 1]^T, [2, 3]^T, [1, 2]^T } is linearly independent.

Solution. (This is not a yes/no question; a proof is required.) Assume

    c1 [1, 1]^T + c2 [2, 3]^T + c3 [1, 2]^T = [0, 0]^T    (2)

and (try to) solve. The vector equation (2) is equivalent to two scalar equations (since we are in R^2) in three unknowns (since S has three vectors):

    c1 + 2c2 + c3 = 0
    c1 + 3c2 + 2c3 = 0.    (3)

Simple systems can often be solved using ad hoc (for this case only) procedures. But for

complicated systems we might wish to write the system in the form A c = 0 and use Gauss elimination (on a computer). Note that when we reduce A we do not have to augment. Why? For this example,

    A = [ 1  2  1 ]      and      c = [c1, c2, c3]^T.
        [ 1  3  2 ]

Since the system is homogeneous, we can solve by reducing A without augmenting (why?):

    [ 1  2  1 ]   R2 <- R2 - R1   [ 1  2  1 ]
    [ 1  3  2 ]   ------------->  [ 0  1  1 ]

so that c2 + c3 = 0 and c1 + 2c2 + c3 = 0; hence c2 = -c3 and c1 = c3. The general solution of (2) is therefore

    [c1, c2, c3]^T = c3 [1, -1, 1]^T.

Hence there are an infinite number of solutions. They are the vectors in the subspace W = {x ∈ R^3 : x = c [1, -1, 1]^T with c ∈ R}. Since there is a nontrivial solution, S is linearly dependent. However, we must exhibit one nontrivial solution, which we do by choosing c1 = 1, c2 = -1, and c3 = 1. Hence we have

    (1) [1, 1]^T + (-1) [2, 3]^T + (1) [1, 2]^T = [0, 0]^T.    (4)

Since we have exhibited a nontrivial linear combination of the vectors in S, (4) alone proves that S is a linearly dependent set in the vector space R^2. Q.E.D.

It may be easy to guess a nontrivial solution (if one exists). We call this method the Clever Ad Hoc (CAH) method. You may have noted that the first and third vectors in the previous example add together to give the second vector. Hence the coefficients c1, c2, and c3 could have been easily guessed.

EXAMPLE #2. Determine whether S = { [1, 2, 1]^T, [3, 6, 3]^T, [4, 5, 6]^T } is linearly independent.

Solution. (Again this is not a yes/no question; a proof is required.) Note that since

    3 [1, 2, 1]^T = [3, 6, 3]^T,

we have

    (3) [1, 2, 1]^T + (-1) [3, 6, 3]^T + (0) [4, 5, 6]^T = [0, 0, 0]^T.    (5)

Since we have exhibited a nontrivial linear combination of the vectors in S, (5) alone proves that S is a linearly dependent set in the vector space R^3. Hence S is linearly dependent. Q.E.D.

We give a final example.

EXAMPLE #3. Determine whether S = { [1, 2]^T, [1, 1]^T } is linearly independent.

Solution. Since [1, 2]^T is not a multiple of [1, 1]^T, we assume

    c1 [1, 2]^T + c2 [1, 1]^T = [0, 0]^T,

i.e., A c = 0 where

    A = [ 1  1 ]      and      c = [c1, c2]^T.
        [ 2  1 ]

Hence

    [ 1  1 ]   R2 <- R2 - 2R1   [ 1  1 ]
    [ 2  1 ]   -------------->  [ 0 -1 ]

so that -c2 = 0 and c1 + c2 = 0; hence c1 = c2 = 0. Since we have proved that the trivial linear combination with c1 = c2 = 0 is the only linear combination of the vectors in S that gives the zero vector in R^2, we have proved that S is a linearly independent set. Q.E.D.
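The row-reduction step of the DUD procedure can be mechanized. A Python sketch (not part of the original notes; the rref helper is a minimal hypothetical Gauss-Jordan over the rationals, not the notes' own method) applies it to the vectors of Example #1:

```python
# Sketch (not from the notes): reduce A c = 0, where the columns of A
# are the vectors of S, and read off dependence from the free columns.

from fractions import Fraction

def rref(M):
    """Reduced row echelon form of M; returns (R, pivot_columns)."""
    M = [[Fraction(x) for x in row] for row in M]
    pivots, r = [], 0
    for c in range(len(M[0])):
        piv = next((i for i in range(r, len(M)) if M[i][c] != 0), None)
        if piv is None:
            continue                     # no pivot in this column
        M[r], M[piv] = M[piv], M[r]      # swap the pivot row up
        M[r] = [x / M[r][c] for x in M[r]]   # normalize the pivot to 1
        for i in range(len(M)):
            if i != r and M[i][c] != 0:  # eliminate the rest of the column
                M[i] = [a - M[i][c] * b for a, b in zip(M[i], M[r])]
        pivots.append(c)
        r += 1
    return M, pivots

# columns of A are the vectors of S in Example #1
A = [[1, 2, 1],
     [1, 3, 2]]
R, pivots = rref(A)
assert len(pivots) < 3    # a free column exists, so S is linearly dependent

# free variable c3 = 1 gives c1 = 1, c2 = -1, matching equation (4)
c = [1, -1, 1]
assert all(sum(A[i][j] * c[j] for j in range(3)) == 0 for i in range(2))
```

Running the same reduction on the two vectors of Example #3 yields two pivot columns and no free variable, confirming independence.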

EXERCISES on Linear Independence (of Column Vectors)

EXERCISE #1. True or False.
1. If V is a vector space and S = {x1, ..., xk} ⊆ V, then S is linearly independent (l.i.) if the only set of scalars c1, c2, ..., ck which satisfy the (homogeneous) vector equation c1 x1 + c2 x2 + ... + ck xk = 0 is c1 = c2 = ... = ck = 0.
2. If V is a vector space and S = {x1, ..., xk} ⊆ V, then S is linearly independent (l.i.) if the only solution of the (homogeneous) vector equation c1 x1 + c2 x2 + ... + ck xk = 0 is the trivial solution.
3. If V is a vector space and S = {x1, ..., xk} ⊆ V, then S is linearly dependent (l.d.) if there is a set of scalars not all zero satisfying c1 x1 + c2 x2 + ... + ck xk = 0.
4. If V is a vector space and S = {x1, ..., xk} ⊆ V contains the zero vector, then S is linearly dependent.
5. If V is a vector space, x ≠ 0, and S = {x} ⊆ V, then S is linearly independent.
6. If c1 x = 0, then by the zero product theorem either c1 = 0 or x = 0.
7. If V is a vector space, S = {x, y} ⊆ V, and either x or y is the zero vector, then S is linearly dependent.
8. If V is a vector space and S = {x, y} ⊆ V where x and y are nonzero vectors, then S is linearly dependent if and only if one vector is a scalar multiple of the other.

EXERCISE #2. Use the procedure given above to determine (using DUD) whether S = { [1, 1]^T, [2, 3]^T, [4, 2]^T } is linearly dependent or linearly independent or neither. You must explain completely.

EXERCISE #3. Use the procedure given above to determine (using DUD) whether S = { [1, 2, 1]^T, [4, 8, 4]^T, [4, 5, 6]^T } is linearly dependent or linearly independent or neither. You must explain completely.

EXERCISE #4. Use the procedure given above to determine (using DUD) whether S = { [1, 2]^T, [1, 1]^T } is linearly dependent or linearly independent or neither. You must explain completely.

EXERCISE #5. Prove Theorem #1.
EXERCISE #6. Prove Theorem #3.
EXERCISE #7. Prove Theorem #4.