
Roberto's Notes on Linear Algebra
Chapter 7: Subspaces
Section 1: Dependence and independence

What you need to know already:
- Basic facts and operations involving Euclidean vectors.
- Matrices, determinants and their connections to vector properties.

What you can learn here:
- The definition and basic facts about the key linear algebra properties of dependence and independence.

By now we have seen the concept of linear combinations several times. It puts together the two basic vector operations of addition and scalar multiplication, and it is behind many of the methods we have constructed, such as Gauss-Jordan elimination. In this section we are going to use it to define another concept that we shall later be able to generalize to other kinds of vectors.

We begin with a rather obvious observation, but one that sets the stage for our main definition.

Knot on your finger

Given any finite set S = {v1, v2, ..., vk} of Euclidean vectors, the zero vector can always be obtained as a linear combination of them by picking all coefficients to be 0:

    0 v1 + 0 v2 + ... + 0 vk = 0

This is called the trivial linear combination.

A question that turns out to be very interesting and fruitful is whether the zero vector can be obtained as a linear combination of the same vectors in other ways.

Definition

A finite set S = {v1, v2, ..., vk} of Euclidean vectors is said to be linearly dependent if there are scalars c1, c2, ..., ck, not all equal to 0, such that:

    c1 v1 + c2 v2 + ... + ck vk = 0

This is called a non-trivial linear combination.

This may not seem like much, but believe me, it is a key idea that we shall exploit quite a bit. Here is a simple example to make this less abstract.
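The difference between the trivial and a non-trivial combination is easy to experiment with numerically. Here is a minimal sketch, assuming NumPy and two made-up vectors (not taken from the notes), where the second vector is deliberately a multiple of the first:

```python
import numpy as np

# Two made-up vectors; v2 is deliberately a multiple of v1,
# so the set {v1, v2} is linearly dependent.
v1 = np.array([1.0, 2.0])
v2 = np.array([2.0, 4.0])

# The trivial combination (all coefficients 0) always gives the zero vector.
trivial = 0 * v1 + 0 * v2
print(np.allclose(trivial, 0))      # True

# A non-trivial combination (coefficients 2 and -1, not all zero)
# that also gives the zero vector proves that the set is dependent.
nontrivial = 2 * v1 - 1 * v2
print(np.allclose(nontrivial, 0))   # True
```

For an independent set, only the trivial choice of coefficients would make the second check succeed.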

Example: S = {(1, 2, 3), (2, 0, 4), (10, 8, 24)}

Notice that:

    4 (1, 2, 3) + 3 (2, 0, 4) - (10, 8, 24) = (0, 0, 0)

That is, here is a linear combination of these vectors that provides the zero vector but uses non-zero coefficients.

Technical fact

If a finite set of Euclidean vectors is linearly dependent, then there are infinitely many sets of coefficients that prove such a relation.

Proof

Well, if c1, c2, ..., ck do the trick, obviously so do bc1, bc2, ..., bck for any non-zero scalar b. And since there are infinitely many different such sets of scalars, there are infinitely many ways of proving the relation!

That reminds me of solutions of linear systems: if there is more than one (the trivial and one non-trivial), there are infinitely many of them.

Good observation, and there is a very close connection between the two concepts.

Technical fact

A finite set S = {v1, v2, ..., vk} of Euclidean vectors is linearly dependent if and only if the homogeneous system

    x1 v1 + x2 v2 + ... + xk vk = 0

has non-trivial solutions.

Of course! Solutions of this system are exactly what we are looking for, so you are just saying the same thing in a different way.

Yes, and here is yet another way to say pretty much the same thing, but from a different perspective.

Definition

If a set S = {v1, v2, ..., vk} of Euclidean vectors is not linearly dependent, it is said to be linearly independent.

Example: S = {(1, 0, 0), (2, 1, 0), (3, 2, 1)}

This set of vectors is independent. To see this, we consider the homogeneous system

    (1, 0, 0) x1 + (2, 1, 0) x2 + (3, 2, 1) x3 = 0

Its coefficient matrix is:
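The connection between dependence and homogeneous systems can be illustrated numerically: a non-trivial solution of the system is a null-space vector of the matrix whose columns are the given vectors. Here is a sketch, assuming NumPy and, purely for illustration, a set whose third vector was built as 4 times the first plus 3 times the second:

```python
import numpy as np

# Columns are the three vectors of the set; the third column was chosen
# (for illustration) as 4 times the first plus 3 times the second.
A = np.array([[1.0, 2.0, 10.0],
              [2.0, 0.0,  8.0],
              [3.0, 4.0, 24.0]])

# Non-trivial solutions of A x = 0 form the null space of A.
# The rows of Vt belonging to (near-)zero singular values span it.
_, s, Vt = np.linalg.svd(A)
null_vectors = Vt[s < 1e-10]

print(len(null_vectors))        # 1: a one-dimensional null space
x = null_vectors[0]
print(np.allclose(A @ x, 0))    # True: a non-trivial solution of A x = 0
```

Any non-zero scalar multiple of x is another valid set of coefficients, which is exactly the "infinitely many sets of coefficients" fact above.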

    [ 1  2  3 ]
    [ 0  1  2 ]
    [ 0  0  1 ]

This matrix is clearly invertible, and hence the system only has one solution: the trivial one. Therefore the set is independent.

Do we need to use the word "linearly" all the time?

No: in fact, from now on I will omit it, except when it is important to emphasize it. Therefore, whenever we talk about a set of vectors being dependent or independent, we shall mean linearly so. And since we are at it, I will also omit the reference to Euclidean vectors. Although there are other types of vectors, and we shall study them later, for now we are only dealing with them, so let's agree that all the vectors we shall use here are Euclidean. And to complete the jargon cleanup, let us agree to the following.

Knot on your finger

The definition of being dependent or independent can be applied to either a set S of vectors or the vectors that comprise it. Therefore, we can say, equally correctly, either that a set S = {v1, v2, ..., vk} of vectors is dependent or independent, or that the vectors v1, v2, ..., vk themselves are.

OK, I will keep all this in mind. But the last two examples seemed cooked up so as to easily show their being dependent or not.

They were! In the first case I constructed the third vector so that an easy linear combination would do the trick, while in the second the 0 components were placed strategically so as to get the conclusion.

So how do we check dependence in a set of vectors that was not built in a contrived way?

I am glad you asked, but look back at the definition: it hinges on whether a certain homogeneous system has non-trivial solutions or not. So the following general method should not be surprising.

Strategy for checking the dependence of a set of Euclidean vectors

To determine if the vectors of S = {v1, v2, ..., vk} are dependent, it is sufficient to either:
- check if the homogeneous system x1 v1 + x2 v2 + ... + xk vk = 0 has non-trivial solutions, or
- check if the rank of the matrix A = [v1 v2 ... vk] is less than k.
Proof

Both conditions are necessary and sufficient for the existence of infinitely many non-trivial linear combinations that produce the zero vector.
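The rank condition of the strategy is one line of code in a numerical setting. Here is a sketch, assuming NumPy; the helper name `is_dependent` and the test vectors are made up for illustration:

```python
import numpy as np

def is_dependent(vectors):
    """Return True when the given Euclidean vectors are linearly dependent.

    Following the strategy above: stack the vectors as the columns of a
    matrix A and check whether rank(A) is less than the number of vectors.
    """
    A = np.column_stack(vectors)
    return np.linalg.matrix_rank(A) < len(vectors)

# Two quick illustrations:
print(is_dependent([np.array([1.0, 0.0]), np.array([0.0, 1.0])]))  # False
print(is_dependent([np.array([1.0, 2.0]), np.array([2.0, 4.0])]))  # True
```

Note that `matrix_rank` decides rank numerically through singular values, so with floating-point data the answer depends on a tolerance; for hand-sized integer examples this is not a concern.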

Example: S = {(1, 2, 3, 1), (2, 0, 4, 3), (3, 14, 13, 1)}

To check if this set of vectors is independent, we compute an REF of the matrix that has them as columns:

    A = [ 1  2   3 ]
        [ 2  0  14 ]
        [ 3  4  13 ]
        [ 1  3   1 ]

For instance, here is one such matrix:

    REF(A) = [ 1  2   3 ]
             [ 0  1  -2 ]
             [ 0  0   0 ]
             [ 0  0   0 ]

We can see that the homogeneous system for which this is the matrix of coefficients has a free variable, hence infinitely many solutions, so that the set is dependent. Alternatively, we notice that this matrix has rank lower than the three vectors we were given, again confirming the dependence.

But my calculator will not allow me to compute an REF for a matrix with more rows than columns.

If you are so short of time that you need to use a calculator, you can always include a column of zeros and ignore it. We shall see in the Learning questions that there is another way out: use the transpose of the matrix we need, that is, write the vectors as rows. It turns out that the condition on the rank being less than k can be applied to it as well.

Example: S = {(1, 2, 3, 1), (2, 0, 4, 3), (3, 14, 13, 1)}

To check if this set of vectors is independent, we compute an REF of the transpose of the previous matrix:

    A^T = [ 1   2   3  1 ]
          [ 2   0   4  3 ]
          [ 3  14  13  1 ]

For instance, here is one such matrix:

    REF(A^T) = [ 1   2   3  1 ]
               [ 0  -4  -2  1 ]
               [ 0   0   0  0 ]

Again, the rank is less than the number of vectors, and they are dependent.

As a hint for your proof that this can be done, here is an important property of dependence that will be useful in our later dealings with the concept.

Technical fact

A set of vectors S = {v1, v2, ..., vk} is dependent if and only if one of them is a linear combination of the others.

The proof of this fact is also very straightforward and is to be found in the Learning questions. Speaking of which, it is time for them, before we move on to seeing how all this is developed further.

Wait! I just thought of a way to construct a dependent set of vectors! Just make sure it includes the zero vector!

Brilliant! We might as well highlight this smart observation.
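The transpose trick works because transposing a matrix never changes its rank. Here is a quick numerical check of that fact, assuming NumPy and a 4x3 matrix whose entries are assumed for illustration, with the third column equal to 7 times the first minus 2 times the second (so the columns are dependent):

```python
import numpy as np

# A 4x3 matrix whose three columns are vectors in R^4; the entries are
# assumed for illustration, with the third column equal to 7 times the
# first minus 2 times the second (so the columns are dependent).
A = np.array([[1.0,  2.0,  3.0],
              [2.0,  0.0, 14.0],
              [3.0,  4.0, 13.0],
              [1.0,  3.0,  1.0]])

# Transposing a matrix never changes its rank, so the vectors can be
# tested just as well when written as the rows of A^T.
print(np.linalg.matrix_rank(A))      # 2
print(np.linalg.matrix_rank(A.T))    # 2: same rank as A
print(np.linalg.matrix_rank(A) < 3)  # True: fewer than 3, hence dependent
```

Working with A^T is convenient precisely in the situation described above: it has fewer rows than columns, which some calculators handle more gracefully.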

Technical fact

Any set of vectors that includes the zero vector is dependent.

Proof

That's because we can pick 1 as the constant multiplier of the zero vector and 0 as the multiplier of each of the other vectors, and thus get the non-trivial linear combination that satisfies the definition of dependence: given S = {v1, v2, ..., vk, 0},

    0 v1 + 0 v2 + ... + 0 vk + 1 * 0 = 0

Summary

- A set of Euclidean vectors is dependent if one of them is a linear combination of the others.
- A set of Euclidean vectors is independent if the only way to obtain the zero vector as a linear combination of them is by using the trivial one.

Common errors to avoid

Do not fixate your attention on the computational steps involved in determining whether a set of vectors is dependent or independent. Focus instead on understanding what this relationship means and why the methods we have seen work.

Learning questions for Section LA 7-1

Review questions:

1. Describe the definitions of linear dependence and linear independence.
2. Present an alternative and effective way to check if a set of vectors is linearly dependent or not.
3. Explain the connection between homogeneous linear systems and linear dependence.
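The zero-vector fact can be verified directly: the combination with coefficient 1 on the zero vector and 0 elsewhere is non-trivial, and the rank test agrees. A sketch, assuming NumPy and two arbitrary vectors chosen here for illustration:

```python
import numpy as np

# Two arbitrary vectors plus the zero vector.
v1 = np.array([3.0, 1.0, 4.0])
v2 = np.array([1.0, 5.0, 9.0])
z = np.zeros(3)

# Coefficients (0, 0, 1): non-trivial, since not all of them are zero,
# yet the resulting combination is the zero vector.
combo = 0 * v1 + 0 * v2 + 1 * z
print(np.allclose(combo, 0))          # True

# The rank test agrees: 3 vectors, but rank at most 2.
A = np.column_stack([v1, v2, z])
print(np.linalg.matrix_rank(A) < 3)   # True: the set is dependent
```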

Memory questions:

1. When is a set of vectors linearly independent?
2. When is a set of vectors linearly dependent?
3. Which vector makes any set that contains it linearly dependent?

Computation questions:

For each of the sets of vectors presented in questions 1-6, determine if the set is dependent or independent.

1. 1 1 1 17 3 36 3 5 3 8
2. 1 1 1 3 3 7 4
3. 1 3 0 3 8 4 3
4. 3 0 6 1 4 0 1 1
5. 1 3 1 3 1 1 4 1 1
6. 3 1 1 4 6 1 4 6

Show that each of the sets of vectors presented in questions 7-8 is dependent, and express each of the vectors as a linear combination of the others.

7. 1 0 3 0 1
8. 4 4 3 1 6 7 5 14 1

9. Given the vectors u = (3, 1, 1) and v = (2, 2, 2):
   a) explain why they are linearly independent;
   b) provide a vector that is a linear combination of u and v but is not parallel to either;
   c) construct the general equation of the plane through the origin that contains the span of u and v.

Theory questions:

1. Explain why the vectors u = (3, 1, 1, 1) and v = (1, 3, 1, 1) are linearly independent.
2. If u and v are linearly independent Euclidean vectors, is u . v equal to 0, different from 0, or can it be either?
3. If u and v are linearly independent vectors, are u + v and u - v also linearly independent?
4. If u, v, w form a dependent set, what will the rank of the matrix A = [u v w] be?
5. If S = {v1, v2, ..., vk} is a dependent set of non-zero vectors, how do we find the scalars that make one of them a linear combination of the others?
6. If 6 vectors are linearly dependent, what can we say about the rank of the matrix that has such vectors as columns?
7. If a set of vectors is linearly dependent, how many different linear combinations of them will generate the 0 vector?

Proof questions:

1. Prove that a set S = {v1, v2, ..., vk} is dependent if and only if one of them is a linear combination of the others.
2. Prove that a set S = {v1, v2, ..., vk} is dependent if and only if the matrix that has the vectors as rows has rank less than k.
3. Prove that if two non-zero vectors in R^n are orthogonal, they are also independent.
4. Prove that a set S = {v1, v2, ..., vn} of n vectors in R^n is dependent if and only if the determinant of the matrix A = [v1 v2 ... vn], where the vectors are written as columns, is 0.

Templated questions:

1. Pick a small set of Euclidean vectors and check if they are dependent or independent.
2. Pick a small set of dependent Euclidean vectors and express each of them as a linear combination of the others.

What questions do you have for your instructor?