Linear Independence

A consistent system of linear equations with matrix equation Ax = b, where A is an m × n matrix, has a solution set whose graph in R^n is a linear object; that is, it has one of only n + 1 possible shapes: a point (a copy of R^0), a line (a copy of R^1), a plane (a copy of R^2), a 3-space (a copy of R^3), ..., or all of R^n. We have seen that, given one particular solution x = p, every other solution is a translate x = p + v_h of a solution v_h to the associated homogeneous system of equations with matrix form Ax = 0. That is, the solution set of Ax = b has the same shape as, and is parallel to, the solution set of the homogeneous system Ax = 0. So the shape of the solution set depends only on A and not on b.

In fact, the process of row reducing A shows that the solutions v_h of the associated homogeneous system have the form v_h = x_1 v_1 + ··· + x_f v_f, where x_1, x_2, ..., x_f are the free variables that arise (each corresponding to a column of A that does not contain a pivot entry). Therefore, the shape of the solution set of Ax = b is determined entirely by the number of free variables that appear in the row reduction procedure.
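The free-variable count described above can be computed numerically: it is the number of columns of A minus the number of pivot columns, i.e. n − rank(A). A minimal NumPy sketch (the matrix here is my own example, not one from the text):

```python
import numpy as np

# The shape of the solution set of a consistent system Ax = b depends only
# on A: it is determined by the number of free variables, which equals the
# number of columns minus the number of pivot columns, i.e. n - rank(A).
A = np.array([[1.0, 2.0, 3.0],
              [2.0, 4.0, 6.0]])   # second row is twice the first, so rank(A) = 1

n = A.shape[1]
free_variables = n - np.linalg.matrix_rank(A)

print(free_variables)  # 2 free variables: the solution set of Ax = 0 is a plane
```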

The fundamental case occurs when there are no free variables, that is, when the reduced echelon form of A has a pivot in every column. In this case, by expressing Ax = 0 as a vector equation of the form x_1 a_1 + ··· + x_n a_n = 0, where A = [a_1 a_2 ··· a_n], we have a situation in which the only solution to the system is the trivial solution x = v_h = 0. This leads to the following very important definition: a set of vectors a_1, a_2, ..., a_n is said to be linearly independent when the vector equation x_1 a_1 + ··· + x_n a_n = 0 has only the trivial solution x_1 = x_2 = ··· = x_n = 0. Otherwise, there is a nontrivial solution (that is, at least one of the x_i in the solution can be nonzero), and we say that the vectors are linearly dependent; we then have a linear dependence relation among them. Alternatively, to say that the vectors are linearly dependent is to say that the zero vector 0 can be expressed as a nontrivial linear combination of them.

Determining whether a set of vectors a_1, a_2, ..., a_n is linearly independent is easy when one of the vectors is 0: if, say, a_1 = 0, then we have a simple solution to x_1 a_1 + ··· + x_n a_n = 0 given by choosing

x_1 to be any nonzero value we please and setting all the other x_i equal to 0. Consequently, if a set of vectors contains the zero vector, it must always be linearly dependent. Equivalently, a set of linearly independent vectors cannot contain the zero vector.

Another situation in which it is easy to determine linear independence is when there are more vectors in the set than entries in each vector. If n > m, then the n vectors a_1, a_2, ..., a_n in R^m are the columns of an m × n matrix A. The vector equation x_1 a_1 + ··· + x_n a_n = 0 is equivalent to the matrix equation Ax = 0, whose corresponding linear system has more variables than equations. Thus there must be at least one free variable in the solution, meaning that there are nontrivial solutions to x_1 a_1 + ··· + x_n a_n = 0: if n > m, then the set {a_1, a_2, ..., a_n} of vectors in R^m must be linearly dependent.

When n is small we have a clear geometric picture of linear independence. For instance, the case n = 1 produces the equation x_1 a_1 = 0, and as long as a_1 ≠ 0, we have only the trivial solution x_1 = 0. A single nonzero vector always forms a linearly independent set.
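The definition suggests a direct computational test: a_1, ..., a_n are linearly independent exactly when A = [a_1 ··· a_n] has a pivot in every column, i.e. rank(A) = n. A minimal NumPy sketch (the function name is my own); the two quick criteria above, a set containing 0 and a set with more vectors than entries, both fail the test:

```python
import numpy as np

def is_linearly_independent(vectors):
    """a_1, ..., a_n are independent iff x_1 a_1 + ... + x_n a_n = 0 has only
    the trivial solution, i.e. iff A = [a_1 ... a_n] has a pivot in every
    column, i.e. iff rank(A) equals the number of columns."""
    A = np.column_stack(vectors)
    return np.linalg.matrix_rank(A) == A.shape[1]

e1 = np.array([1.0, 0.0, 0.0])
e2 = np.array([0.0, 1.0, 0.0])

print(is_linearly_independent([e1, e2]))           # True
print(is_linearly_independent([e1, np.zeros(3)]))  # False: contains the zero vector
print(is_linearly_independent([np.array([1.0, 0.0]),
                               np.array([0.0, 1.0]),
                               np.array([2.0, 3.0])]))  # False: 3 vectors in R^2
```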

When n = 2, the equation takes the form x_1 a_1 + x_2 a_2 = 0. If this has a nontrivial solution, then one of the coefficients, say x_1, is nonzero. We can then solve the equation for a_1 and obtain a relation showing that a_1 is a scalar multiple of a_2. Conversely, if one of the vectors is a scalar multiple of the other, we can rearrange that relation into the form x_1 a_1 + x_2 a_2 = 0 with a nonzero coefficient. Thus, a set of two nonzero vectors is linearly dependent if and only if each is a scalar multiple of the other. More generally, we can prove the following.

Theorem. A set {a_1, a_2, ..., a_n} of vectors is linearly dependent if and only if at least one of the vectors a_i is a linear combination of the others. In fact, if {a_1, a_2, ..., a_n} is a linearly dependent set and a_1 ≠ 0, then there must be some a_j (with j > 1) which is a linear combination of the preceding vectors a_1, a_2, ..., a_{j-1}.

Proof. If {a_1, a_2, ..., a_n} is a linearly dependent set, then there are values of the x_i, not all 0, that make x_1 a_1 + ··· + x_n a_n = 0 true. If we choose an index i with x_i nonzero and solve the vector equation for a_i, this shows that a_i is a linear combination of the other vectors in the set.

Conversely, if a_i can be written as a linear combination of the remaining vectors, then moving a_i to the other side of that equation expresses 0 as a nontrivial linear combination of all the a_i, so {a_1, a_2, ..., a_n} is a linearly dependent set.

Furthermore, if {a_1, a_2, ..., a_n} is a linearly dependent set and a_1 ≠ 0, then there is a nontrivial solution to x_1 a_1 + ··· + x_n a_n = 0. Let j be the largest subscript whose coefficient x_j in this equation is nonzero; then in fact x_1 a_1 + ··· + x_j a_j = 0. Note that j > 1, since j = 1 would give x_1 a_1 = 0 with x_1 ≠ 0, forcing a_1 = 0. We can therefore solve this equation for a_j, thereby expressing a_j as a linear combination of the preceding collection of vectors a_1, a_2, ..., a_{j-1}. //
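The constructive claim in the proof can be mirrored computationally: the first a_j that fails to increase the rank of [a_1 ··· a_j] is a linear combination of its predecessors. A minimal NumPy sketch (the function name and example vectors are my own):

```python
import numpy as np

def first_dependent_index(vectors, tol=1e-10):
    """For a dependent list with vectors[0] nonzero, return the smallest j
    (0-based) such that vectors[j] is a linear combination of vectors[0..j-1];
    appending such a vector does not increase the rank of the matrix of
    preceding columns.  Returns None if the list is linearly independent."""
    for j in range(1, len(vectors)):
        prefix = np.column_stack(vectors[:j])
        with_j = np.column_stack(vectors[:j + 1])
        if np.linalg.matrix_rank(with_j, tol=tol) == np.linalg.matrix_rank(prefix, tol=tol):
            return j
    return None

a1 = np.array([1.0, 0.0, 0.0])
a2 = np.array([0.0, 1.0, 0.0])
a3 = a1 + 2 * a2                      # a linear combination of its predecessors

print(first_dependent_index([a1, a2, a3]))  # 2: the third vector depends on a1, a2
print(first_dependent_index([a1, a2]))      # None: the pair is independent
```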