Section 1.7. Linear Independence

Motivation

Sometimes the span of a set of vectors is smaller than you expect from the number of vectors.

[Pictures: a plane Span{v, w} spanned by two vectors v and w, and the same plane labeled Span{u, v, w} after adding a third vector u that lies in it.]

This means you don't need so many vectors to express the same set of vectors. Notice in each case that one vector in the set is already in the span of the others, so it doesn't make the span bigger. Today we will formalize this idea in the concept of linear (in)dependence.

Definition

A set of vectors {v_1, v_2, ..., v_p} in R^n is linearly independent if the vector equation

    x_1 v_1 + x_2 v_2 + ... + x_p v_p = 0

has only the trivial solution x_1 = x_2 = ... = x_p = 0.

The opposite: the set {v_1, v_2, ..., v_p} is linearly dependent if there exist numbers x_1, x_2, ..., x_p, not all equal to zero, such that

    x_1 v_1 + x_2 v_2 + ... + x_p v_p = 0.

This is called a linear dependence relation.

Like span, linear (in)dependence is another one of those big vocabulary words that you absolutely need to learn. Much of the rest of the course will be built on these concepts, and you need to know exactly what they mean in order to be able to answer questions on quizzes and exams (and solve real-world problems later on).
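
As a concrete illustration of the definition, here is a minimal numeric sketch in Python (not from the slides; the specific vectors are made-up examples):

```python
# A small numeric illustration of the definition (the vectors here are made up).
import numpy as np

v1 = np.array([1.0, 2.0])
v2 = np.array([2.0, 4.0])

# The weights (x1, x2) = (2, -1) are not all zero but give the zero vector,
# so 2*v1 - v2 = 0 is a linear dependence relation: {v1, v2} is dependent.
print(2 * v1 - v2)        # [0. 0.]

e1 = np.array([1.0, 0.0])
e2 = np.array([0.0, 1.0])

# For e1 and e2, the equation x1*e1 + x2*e2 = (x1, x2) = (0, 0) forces
# x1 = x2 = 0, so only the trivial solution exists: {e1, e2} is independent.
print(0 * e1 + 0 * e2)    # the only combination of e1, e2 that gives [0. 0.]
```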

Definition

A set of vectors {v_1, v_2, ..., v_p} in R^n is linearly independent if the vector equation

    x_1 v_1 + x_2 v_2 + ... + x_p v_p = 0

has only the trivial solution x_1 = x_2 = ... = x_p = 0. The set {v_1, v_2, ..., v_p} is linearly dependent otherwise.

The notion of linear (in)dependence applies to a collection of vectors, not to a single vector, or to one vector in the presence of some others.

Checking Linear Independence

Question: Is {(1, 1, 1), (1, -1, 2), (3, 1, 4)} linearly independent?

Equivalently, does the (homogeneous) vector equation

    x (1, 1, 1) + y (1, -1, 2) + z (3, 1, 4) = (0, 0, 0)

have a nontrivial solution? How do we solve this kind of vector equation? Row reduce the matrix whose columns are the three vectors:

    [ 1  1  3 ]                  [ 1  0  2 ]
    [ 1 -1  1 ]   row reduce     [ 0  1  1 ]
    [ 1  2  4 ]                  [ 0  0  0 ]

So x = -2z and y = -z. So the vectors are linearly dependent, and an equation of linear dependence is (taking z = 1)

    -2 (1, 1, 1) - (1, -1, 2) + (3, 1, 4) = (0, 0, 0).
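
For readers who want to check the row reduction by machine, here is a short SymPy sketch (an added illustration, not part of the original slides) that reproduces the reduced row echelon form and verifies the dependence relation:

```python
# Verifying the first example with SymPy (an illustrative sketch).
from sympy import Matrix

v1 = Matrix([1, 1, 1])
v2 = Matrix([1, -1, 2])
v3 = Matrix([3, 1, 4])

A = Matrix.hstack(v1, v2, v3)   # the matrix whose columns are v1, v2, v3
print(A.rref())                 # (Matrix([[1, 0, 2], [0, 1, 1], [0, 0, 0]]), (0, 1))

# Only two pivot columns, so the vectors are linearly dependent.
# Check the dependence relation from the slide (taking z = 1):
print(-2*v1 - v2 + v3)          # Matrix([[0], [0], [0]])
```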

Checking Linear Independence

Question: Is {(1, 1, 0), (1, -1, 2), (3, 1, 4)} linearly independent?

Equivalently, does the (homogeneous) vector equation

    x (1, 1, 0) + y (1, -1, 2) + z (3, 1, 4) = (0, 0, 0)

have a nontrivial solution? Row reduce:

    [ 1  1  3 ]                  [ 1  0  0 ]
    [ 1 -1  1 ]   row reduce     [ 0  1  0 ]
    [ 0  2  4 ]                  [ 0  0  1 ]

The trivial solution x = y = z = 0 is the unique solution. So the vectors are linearly independent.
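
The same machine check for the second example (again a sketch added here for illustration):

```python
# Verifying the second example with SymPy (an illustrative sketch).
from sympy import Matrix, eye

A = Matrix.hstack(Matrix([1, 1, 0]), Matrix([1, -1, 2]), Matrix([3, 1, 4]))
rref, pivots = A.rref()

print(rref == eye(3))   # True: the reduced row echelon form is the identity
print(pivots)           # (0, 1, 2): a pivot in every column, so Ax = 0 has only
                        # the trivial solution and the columns are independent
```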

Linear Independence and Matrix Columns

By definition, {v_1, v_2, ..., v_p} is linearly independent if and only if the vector equation

    x_1 v_1 + x_2 v_2 + ... + x_p v_p = 0

has only the trivial solution. This holds if and only if the matrix equation Ax = 0 has only the trivial solution, where A is the matrix with columns v_1, v_2, ..., v_p:

    A = [ v_1  v_2  ...  v_p ].

This is true if and only if the matrix A has a pivot in each column.

Important: The vectors v_1, v_2, ..., v_p are linearly independent if and only if the matrix with columns v_1, v_2, ..., v_p has a pivot in each column. Solving the matrix equation Ax = 0 will either verify that the columns v_1, v_2, ..., v_p of A are linearly independent, or will produce a linear dependence relation.
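
Computationally, "a pivot in each column" is the same as saying the number of pivots (the rank of A) equals the number of columns, which suggests a simple check like the following sketch (an added illustration; the helper name is made up):

```python
# Pivot-in-every-column criterion, phrased via the rank (illustrative sketch).
import numpy as np

def columns_independent(A: np.ndarray) -> bool:
    """True when A has a pivot in every column, i.e. rank(A) == number of columns."""
    return np.linalg.matrix_rank(A) == A.shape[1]

A = np.column_stack([[1, 1, 1], [1, -1, 2], [3, 1, 4]])   # first example: dependent
B = np.column_stack([[1, 1, 0], [1, -1, 2], [3, 1, 4]])   # second example: independent
print(columns_independent(A), columns_independent(B))     # False True
```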

Linear Dependence Criterion

If one of the vectors {v_1, v_2, ..., v_p} is a linear combination of the other ones, for instance

    v_3 = 2 v_1 - (1/2) v_2 + 6 v_4,

then the vectors are linearly dependent:

    2 v_1 - (1/2) v_2 - v_3 + 6 v_4 = 0.

Conversely, if the vectors are linearly dependent, say

    2 v_1 - (1/2) v_2 + 6 v_4 = 0,

then one vector is a linear combination of the other ones:

    v_2 = 4 v_1 + 12 v_4.

Theorem: A set of vectors {v_1, v_2, ..., v_p} is linearly dependent if and only if one of the vectors is in the span of the other ones.
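
A quick numeric sanity check of the rearrangement above, using randomly chosen vectors (an added sketch, not something done in the slides):

```python
# If v2 = 4*v1 + 12*v4, then 2*v1 - (1/2)*v2 + 6*v4 should be the zero vector.
import numpy as np

rng = np.random.default_rng(0)        # any vectors work; these are random
v1, v4 = rng.random(3), rng.random(3)
v2 = 4 * v1 + 12 * v4

print(np.allclose(2 * v1 - 0.5 * v2 + 6 * v4, 0))   # True
```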

Pictures in R^2

[Picture: a nonzero vector v and the line Span{v} through the origin.]

In this picture:

- One vector {v}: linearly independent if v ≠ 0.

Pictures in R^2

[Picture: two non-collinear vectors v and w with their lines Span{v} and Span{w}.]

In this picture:

- One vector {v}: linearly independent if v ≠ 0.
- Two vectors {v, w}: linearly independent: neither is in the span of the other.

Pictures in R^2

[Picture: three vectors v, w, u in the plane, with Span{v}, Span{w}, and Span{v, w} marked.]

In this picture:

- One vector {v}: linearly independent if v ≠ 0.
- Two vectors {v, w}: linearly independent: neither is in the span of the other.
- Three vectors {v, w, u}: linearly dependent: u is in Span{v, w}. Also v is in Span{u, w} and w is in Span{u, v}.

Pictures in R^2

[Picture: two collinear vectors v and w on the line Span{v}.]

- Two collinear vectors {v, w}: linearly dependent: w is in Span{v} (and vice versa).

Two vectors are linearly dependent if and only if they are collinear.

Pictures in R^2

[Picture: collinear vectors v and w on the line Span{v}, plus a third vector u off that line.]

- Two collinear vectors {v, w}: linearly dependent: w is in Span{v} (and vice versa). Two vectors are linearly dependent if and only if they are collinear.
- Three vectors {v, w, u}: linearly dependent: w is in Span{v} (and vice versa).

If a set of vectors is linearly dependent, then so is any larger set of vectors!
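
In coordinates, "collinear" can be tested by checking whether the 2 x 2 matrix [v w] has rank at most 1, as in this added sketch (the specific vectors are made up):

```python
# Two vectors in R^2 are linearly dependent exactly when [v w] has rank <= 1.
import numpy as np

def collinear(v, w) -> bool:
    """True if v and w lie on a common line through the origin."""
    return np.linalg.matrix_rank(np.column_stack([v, w])) <= 1

print(collinear([1.0, 2.0], [2.0, 4.0]))   # True:  w = 2v
print(collinear([1.0, 2.0], [2.0, 1.0]))   # False: neither is a multiple of the other
```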

Pictures in R^3

[Picture: vectors v, w, u in space, with the lines Span{v} and Span{w} and the plane Span{v, w}; u does not lie in that plane.]

In this picture:

- Two vectors {v, w}: linearly independent: neither is in the span of the other.
- Three vectors {v, w, u}: linearly independent: none of them is in the span of the other two.

Pictures in R^3

[Picture: vectors v, w, x in space, with the lines Span{v} and Span{w}; x lies in the plane Span{v, w}.]

In this picture:

- Two vectors {v, w}: linearly independent: neither is in the span of the other.
- Three vectors {v, w, x}: linearly dependent: x is in Span{v, w}.

Which subsets are linearly dependent?

Think about: Are there four vectors u, v, w, x in R^3 which are linearly dependent, but such that u is not a linear combination of v, w, x? If so, draw a picture; if not, give an argument.

Yes: in fact, the pictures on the previous slides provide such an example. Linear dependence of {v_1, ..., v_p} means that some v_i is a linear combination of the others, not that every v_i is.

Linear Dependence: Stronger criterion

Suppose a set of vectors {v_1, v_2, ..., v_p} is linearly dependent. Take the largest j such that v_j is in the span of the others. Is v_j in the span of v_1, v_2, ..., v_{j-1}?

For example, suppose j = 3 and

    v_3 = 2 v_1 - (1/2) v_2 + 6 v_4.

Rearrange:

    v_4 = -(1/6) (2 v_1 - (1/2) v_2 - v_3),

so v_4 is also in the span of v_1, v_2, v_3, but v_3 was supposed to be the last one that was in the span of the others. This contradiction shows the coefficient of v_4 (and of any later vector) must in fact be zero, so v_j really is in the span of v_1, v_2, ..., v_{j-1}.

Better Theorem: A set of vectors {v_1, v_2, ..., v_p} is linearly dependent if and only if there is some j such that v_j is in Span{v_1, v_2, ..., v_{j-1}}.

Increasing span criterion

If the vector v_j is not in Span{v_1, v_2, ..., v_{j-1}}, it means Span{v_1, v_2, ..., v_j} is bigger than Span{v_1, v_2, ..., v_{j-1}}.

If this is true for all j: a set of vectors is linearly independent if and only if, every time you add another vector to the set, the span gets bigger.

Theorem: A set of vectors {v_1, v_2, ..., v_p} is linearly independent if and only if, for every j, the span of v_1, v_2, ..., v_j is strictly larger than the span of v_1, v_2, ..., v_{j-1}.
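
The increasing span criterion can be phrased computationally: adding v_j enlarges the span exactly when it increases the rank of the matrix built so far. Here is an added sketch of that check (the function name is made up):

```python
# Linear independence via the increasing span criterion (illustrative sketch).
import numpy as np

def spans_keep_growing(vectors) -> bool:
    """True if each vector strictly enlarges the span of the ones before it."""
    prev_rank = 0
    for j in range(1, len(vectors) + 1):
        rank = np.linalg.matrix_rank(np.column_stack(vectors[:j]))
        if rank == prev_rank:        # the span did not get bigger at step j
            return False
        prev_rank = rank
    return True                      # equivalent to linear independence

vs = [np.array([1, 1, 1]), np.array([1, -1, 2]), np.array([3, 1, 4])]
print(spans_keep_growing(vs))        # False: the third vector adds nothing new
```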

Increasing span criterion: pictures

Theorem: A set of vectors {v_1, v_2, ..., v_p} is linearly independent if and only if, for every j, the span of v_1, v_2, ..., v_j is strictly larger than the span of v_1, v_2, ..., v_{j-1}.

[Picture: one vector v and the line Span{v}.]

- One vector {v}: linearly independent: the span got bigger (than {0}).

Increasing span criterion: pictures

Theorem: A set of vectors {v_1, v_2, ..., v_p} is linearly independent if and only if, for every j, the span of v_1, v_2, ..., v_j is strictly larger than the span of v_1, v_2, ..., v_{j-1}.

[Picture: vectors v and w with the line Span{v} and the plane Span{v, w}.]

- One vector {v}: linearly independent: the span got bigger (than {0}).
- Two vectors {v, w}: linearly independent: the span got bigger.

Increasing span criterion: pictures

Theorem: A set of vectors {v_1, v_2, ..., v_p} is linearly independent if and only if, for every j, the span of v_1, v_2, ..., v_j is strictly larger than the span of v_1, v_2, ..., v_{j-1}.

[Picture: vectors v, w, u with Span{v}, Span{v, w}, and Span{v, w, u} marked.]

- One vector {v}: linearly independent: the span got bigger (than {0}).
- Two vectors {v, w}: linearly independent: the span got bigger.
- Three vectors {v, w, u}: linearly independent: the span got bigger.

Increasing span criterion: pictures

Theorem: A set of vectors {v_1, v_2, ..., v_p} is linearly independent if and only if, for every j, the span of v_1, v_2, ..., v_j is strictly larger than the span of v_1, v_2, ..., v_{j-1}.

[Picture: vectors v, w, x with Span{v} and Span{v, w, x} marked; x lies in the plane spanned by v and w.]

- One vector {v}: linearly independent: the span got bigger (than {0}).
- Two vectors {v, w}: linearly independent: the span got bigger.
- Three vectors {v, w, x}: linearly dependent: the span didn't get bigger.

Extra: Linear Independence

Two more facts:

Fact 1: Say v_1, v_2, ..., v_n are in R^m. If n > m, then {v_1, v_2, ..., v_n} is linearly dependent: the matrix

    A = [ v_1  v_2  ...  v_n ]

cannot have a pivot in each column (it is too wide). This says you can't have 4 linearly independent vectors in R^3, for instance. A wide matrix can't have linearly independent columns.

Fact 2: If one of v_1, v_2, ..., v_n is zero, then {v_1, v_2, ..., v_n} is linearly dependent. For instance, if v_1 = 0, then

    1 v_1 + 0 v_2 + 0 v_3 + ... + 0 v_n = 0

is a linear dependence relation. A set containing the zero vector is linearly dependent.
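
Both facts can be checked on small made-up examples (an added sketch, not part of the slides):

```python
# Fact 1: four vectors in R^3 are always dependent (a 3x4 matrix has at most 3 pivots).
import numpy as np

A = np.random.default_rng(1).random((3, 4))
print(np.linalg.matrix_rank(A) < A.shape[1])    # True: the 4 columns are dependent

# Fact 2: a set containing the zero vector is dependent, since
# 1*0 + 0*v2 + ... + 0*vn = 0 is a nontrivial dependence relation.
B = np.column_stack([np.zeros(3), [1, 2, 3], [4, 5, 6]])
print(np.linalg.matrix_rank(B) < B.shape[1])    # True
```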