Lecture 18: Section 4.3


Shuanglin Shao. November 6, 2013

Linear Independence and Linear Dependence. We will discuss linear independence and linear dependence of vectors in a vector space.

Definition. If S = {v_1, v_2, ..., v_r} is a nonempty set of vectors in a vector space V, then the vector equation

k_1 v_1 + k_2 v_2 + ... + k_r v_r = 0

has at least one solution, namely k_1 = 0, k_2 = 0, ..., k_r = 0. We call it the trivial solution. If this is the only solution, then S is said to be a linearly independent set. If there are solutions in addition to the trivial solution, then S is said to be a linearly dependent set.
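This definition translates directly into a computation: the vectors are linearly independent exactly when the matrix they form has rank equal to the number of vectors. A minimal sketch, not part of the lecture (the helper function name is our own):

```python
import numpy as np

def is_linearly_independent(vectors):
    """The only solution of k_1 v_1 + ... + k_r v_r = 0 is the trivial
    one exactly when the rank of the matrix whose rows are the vectors
    equals the number of vectors."""
    A = np.array(vectors, dtype=float)
    return np.linalg.matrix_rank(A) == len(vectors)

# The standard unit vectors of R^3 are linearly independent:
print(is_linearly_independent([(1, 0, 0), (0, 1, 0), (0, 0, 1)]))  # True
# Any set containing the zero vector is linearly dependent:
print(is_linearly_independent([(1, 2), (0, 0)]))  # False
```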

Example 1. Linear independence of the standard unit vectors in R^n. Let e_1 = (1, 0, 0, ..., 0), e_2 = (0, 1, 0, ..., 0), ..., e_n = (0, 0, ..., 1). Suppose that c_1 e_1 + c_2 e_2 + ... + c_n e_n = 0. Then (c_1, c_2, ..., c_n) = (0, 0, ..., 0). This implies that c_1 = 0, c_2 = 0, ..., c_n = 0. Therefore {e_1, e_2, ..., e_n} is linearly independent.

Example 2. Linear independence in R^3. Determine whether the vectors v_1 = (1, -2, 3), v_2 = (5, 6, -1) and v_3 = (3, 2, 1) are linearly independent or linearly dependent in R^3.

Solution. Suppose that there exist k_1, k_2 and k_3 such that k_1 v_1 + k_2 v_2 + k_3 v_3 = (0, 0, 0). Thus

(k_1 + 5k_2 + 3k_3, -2k_1 + 6k_2 + 2k_3, 3k_1 - k_2 + k_3) = (0, 0, 0). (1)

Writing this as a linear system,

k_1 + 5k_2 + 3k_3 = 0,
-2k_1 + 6k_2 + 2k_3 = 0,
3k_1 - k_2 + k_3 = 0.

The augmented matrix is

[  1   5   3 | 0 ]
[ -2   6   2 | 0 ]
[  3  -1   1 | 0 ].

By elementary row reductions,

[ 1  5  3 | 0 ]
[ 0  2  1 | 0 ]
[ 0  0  0 | 0 ].

Since the last row consists only of zeros, there is a free variable, and hence there exist nontrivial solutions (k_1, k_2, k_3) of equation (1). Therefore v_1, v_2, v_3 are linearly dependent.
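As a quick numerical cross-check of this example (not part of the lecture; the particular solution below was found by back-substitution in the reduced system):

```python
import numpy as np

# Columns are v1 = (1, -2, 3), v2 = (5, 6, -1), v3 = (3, 2, 1).
A = np.array([[ 1.0,  5.0, 3.0],
              [-2.0,  6.0, 2.0],
              [ 3.0, -1.0, 1.0]])

# Rank 2 < 3 columns, so nontrivial solutions of A @ k = 0 exist.
print(np.linalg.matrix_rank(A))  # 2

# Back-substitution gives, e.g., (k1, k2, k3) = (1, 1, -2):
k = np.array([1.0, 1.0, -2.0])
print(A @ k)  # [0. 0. 0.]
```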

Example 3. Linear independence in R^4. Determine whether the vectors v_1 = (1, 2, 2, -1), v_2 = (4, 9, 9, -4), v_3 = (5, 8, 9, -5) in R^4 are linearly dependent or linearly independent.

Solution. Suppose that there exist k_1, k_2, k_3 such that k_1 v_1 + k_2 v_2 + k_3 v_3 = (0, 0, 0, 0).

Thus

k_1 + 4k_2 + 5k_3 = 0,
2k_1 + 9k_2 + 8k_3 = 0,
2k_1 + 9k_2 + 9k_3 = 0,
-k_1 - 4k_2 - 5k_3 = 0.

The 4th equation is -1 times the first equation, so it can be dropped. The augmented matrix is

[ 1  4  5 | 0 ]
[ 2  9  8 | 0 ]
[ 2  9  9 | 0 ].

By elementary row reductions, we have

[ 1  4   5 | 0 ]
[ 0  1  -2 | 0 ]
[ 0  0   1 | 0 ].

This implies that k_3 = 0, k_2 = 0, k_1 = 0. Therefore v_1, v_2, v_3 are linearly independent.
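The same rank computation confirms this example (an illustrative check, not from the lecture):

```python
import numpy as np

# Columns are v1 = (1, 2, 2, -1), v2 = (4, 9, 9, -4), v3 = (5, 8, 9, -5).
A = np.array([[ 1.0,  4.0,  5.0],
              [ 2.0,  9.0,  8.0],
              [ 2.0,  9.0,  9.0],
              [-1.0, -4.0, -5.0]])

# Rank equals the number of columns, so only the trivial solution of
# A @ k = 0 exists and the vectors are linearly independent.
print(np.linalg.matrix_rank(A))  # 3
```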

Example 4. An important linearly independent set in P_n. Show that the polynomials 1, x, ..., x^n form a linearly independent set in P_n.

Solution. Suppose that there exist a_0, a_1, ..., a_n such that

a_0 + a_1 x + a_2 x^2 + ... + a_n x^n = 0

for all x. A nonzero polynomial of degree at most n has at most n roots, so the polynomial on the left must be the zero polynomial; that is,

a_0 = a_1 = a_2 = ... = a_n = 0.

Hence the set of polynomials {1, x, ..., x^n} is linearly independent.
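One way to see the step from "the polynomial vanishes for all x" to "all coefficients are zero" is to evaluate at n + 1 distinct points: the resulting Vandermonde system is invertible, so the coefficients must vanish. A sketch for n = 3 (our own illustration, not from the lecture):

```python
import numpy as np

n = 3
points = np.array([0.0, 1.0, 2.0, 3.0])        # n + 1 distinct points
# Row i is (1, x_i, x_i^2, x_i^3): evaluating a_0 + a_1 x + ... + a_n x^n
# at each point turns the identity into the linear system V @ a = 0.
V = np.vander(points, n + 1, increasing=True)

# V has full rank (points are distinct), so V @ a = 0 forces a = 0.
print(np.linalg.matrix_rank(V))  # 4
```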

Theorem. Let S be a set with two or more vectors. Then

(a) S is linearly dependent if and only if at least one of the vectors in S is expressible as a linear combination of the other vectors in S.

(b) S is linearly independent if and only if no vector in S is expressible as a linear combination of the other vectors in S.

Proof. We observe that the second statement follows from the first statement, so we only need to prove part (a). Let S = {v_1, v_2, ..., v_k}.

(⇐) If, say, v_1 = c_2 v_2 + ... + c_k v_k, then v_1 - c_2 v_2 - ... - c_k v_k = 0. Thus (1, -c_2, ..., -c_k) is a nonzero solution of the vector equation, so S is linearly dependent. The proof in the case where some vector other than v_1 is a linear combination of the other vectors in S is similar.

(⇒) Suppose that S is linearly dependent, i.e., there exist (c_1, c_2, ..., c_k), not all zero, such that

c_1 v_1 + c_2 v_2 + ... + c_k v_k = 0.

Suppose that c_1 ≠ 0 (the other cases are similar). Then

v_1 = -(c_2/c_1) v_2 - ... - (c_k/c_1) v_k.

This proves the statement.

Theorem.

(a) A finite set that contains the zero vector 0 is linearly dependent.

(b) A set with exactly one vector is linearly independent if and only if that vector is nonzero.

(c) A set with exactly two vectors is linearly independent if and only if neither vector is a scalar multiple of the other.

The proof is omitted.

Theorem. Let S = {v_1, v_2, ..., v_r} be a set of vectors in R^n. If r > n, then S is linearly dependent.

Proof. Suppose that there exist x_1, x_2, ..., x_r such that

x_1 v_1 + x_2 v_2 + ... + x_r v_r = 0. (2)

We need to show that there exists a nonzero solution (x_1, x_2, ..., x_r) of this equation. Write v_i = (a_{1i}, a_{2i}, ..., a_{ni}). Then we have the system of equations

a_{11} x_1 + a_{12} x_2 + ... + a_{1r} x_r = 0,
a_{21} x_1 + a_{22} x_2 + ... + a_{2r} x_r = 0,
...
a_{n1} x_1 + a_{n2} x_2 + ... + a_{nr} x_r = 0.

Then the augmented matrix is

[ a_{11}  a_{12}  ...  a_{1r} | 0 ]
[ a_{21}  a_{22}  ...  a_{2r} | 0 ]
[  ...     ...    ...   ...   | 0 ]
[ a_{n1}  a_{n2}  ...  a_{nr} | 0 ].

By Theorem 1.2.2, since this homogeneous system has more unknowns (r) than equations (n), it has infinitely many solutions. So there exists a nonzero solution of equation (2).
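The theorem can be illustrated numerically: any four vectors in R^3 give a 3 x 4 homogeneous system, and a nonzero solution can be read off the singular value decomposition. A sketch with hypothetical data (not from the lecture):

```python
import numpy as np

# Four columns v1..v4 in R^3, so r = 4 > n = 3 (hypothetical data).
rng = np.random.default_rng(0)
A = rng.integers(-5, 6, size=(3, 4)).astype(float)

# At most 3 equations constrain 4 unknowns, so rank(A) < 4 ...
print(np.linalg.matrix_rank(A) < 4)  # True

# ... and the last right-singular vector is a nonzero null vector,
# i.e. a nontrivial solution of x_1 v_1 + ... + x_4 v_4 = 0:
_, _, Vt = np.linalg.svd(A)
x = Vt[-1]
print(np.allclose(A @ x, 0))  # True
```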

Alternative proof. Using the standard unit basis of R^n, each v_i can be expressed as a linear combination of e_1, e_2, ..., e_n:

v_1 = a_{11} e_1 + a_{12} e_2 + ... + a_{1n} e_n,
v_2 = a_{21} e_1 + a_{22} e_2 + ... + a_{2n} e_n,
...
v_n = a_{n1} e_1 + a_{n2} e_2 + ... + a_{nn} e_n,
...
v_r = a_{r1} e_1 + a_{r2} e_2 + ... + a_{rn} e_n.

We write the first n equations in matrix form:

[ v_1 ]   [ a_{11}  a_{12}  ...  a_{1n} ] [ e_1 ]
[ v_2 ] = [ a_{21}  a_{22}  ...  a_{2n} ] [ e_2 ]
[ ... ]   [  ...     ...    ...   ...   ] [ ... ]
[ v_n ]   [ a_{n1}  a_{n2}  ...  a_{nn} ] [ e_n ].

Let A be the coefficient matrix.

We divide it into two cases.

Case 1. If A is invertible, then the column vector of e_1, e_2, ..., e_n can be written as the product of the inverse matrix A^{-1} and the column vector of v_1, v_2, ..., v_n, so each e_i is a linear combination of v_1, v_2, ..., v_n. Since v_r is a linear combination of e_1, e_2, ..., e_n, it follows that v_r is a linear combination of v_1, v_2, ..., v_n. Hence the set of vectors {v_1, v_2, ..., v_r} is linearly dependent.

Case 2. If A is not invertible, then by elementary row operations we can reduce the last row of the matrix A to the zero row. Applying the same elementary row operations to the left-hand side, we see that a linear combination of {v_1, v_2, ..., v_n} with coefficients not all zero is equal to zero. Thus {v_1, ..., v_n} is linearly dependent, and hence {v_1, ..., v_n, ..., v_r} is linearly dependent.

Homework and Reading.

Homework: Ex. #2, #3, #4, #6, #10, #12, #13, #16, and the true-or-false questions on page 200.

Reading: Section 4.4.