Rank, Homogeneous Systems, Linear Independence


Sept 11: Rank, Homogeneous Systems, Linear Independence

If (A B) is a linear system represented as an augmented matrix, then A is called the COEFFICIENT MATRIX and B is (usually) called the RIGHT HAND SIDE (rhs).

Definition: A linear system (A B), represented as an augmented matrix, is HOMOGENEOUS if B = O, the zero vector.

Definition: If (A B) is a linear system, then (A O) is its ASSOCIATED HOMOGENEOUS SYSTEM.

VECTOR REPRESENTATION OF A LINEAR SYSTEM

If S is a linear system and (A B) an augmented matrix representation with C[1], C[2], ..., C[n] the columns of A, then the associated VECTOR REPRESENTATION of S is the vector equation

    x1 C1 + x2 C2 + ... + xn Cn = B

That is, S is represented by B written as a linear combination of the columns of A with variable coefficients. A solution to S is then a choice of values for the variables for which the expression is an equality of vectors.

EXAMPLE:

    (A B) =   x   y  RHS
              1   2   -1
              2   4   -2

Here the coefficient matrix is

    A =  1  2
         2  4

and the RHS is B = (-1, -2). The associated homogeneous system is

    x   y  RHS
    1   2   0
    2   4   0

We determine the parametric solution to (A B) by calculating the REF

    x   y  RHS
    1   2   -1
    0   0    0

which gives x = -1 - 2y with y free, so the general solution is

    (x, y) = (-1, 0) + y(-2, 1)
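The vector representation can be checked numerically. Below is a small Python sketch (not part of the original notes) that forms x1 C1 + ... + xn Cn from the columns of a matrix; the 2x2 system used is the example above, read as x + 2y = -1, 2x + 4y = -2 (some digits in the transcript are garbled, so these entries are a reconstruction).

```python
def combine_columns(A, x):
    """Return x1*C1 + ... + xn*Cn, where C_j are the columns of A."""
    m, n = len(A), len(A[0])
    return [sum(A[i][j] * x[j] for j in range(n)) for i in range(m)]

# Reconstructed example system: x + 2y = -1, 2x + 4y = -2
A = [[1, 2],
     [2, 4]]
B = [-1, -2]

# (x, y) = (-1, 0) is a solution: (-1)*C1 + 0*C2 = B
print(combine_columns(A, [-1, 0]) == B)   # True
```

A choice of variable values solves the system exactly when the corresponding linear combination of the columns reproduces B.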

We have written the general solution as a linear combination of two vectors:

(*)    With X = (x, y), we write X = Xp + Xh, where Xp = (-1, 0) and Xh = y(-2, 1).

The "p" is for "particular" and the "h" for "homogeneous". The interpretation is that any solution X can be written as this one specific solution Xp plus some solution Xh to the associated homogeneous system. Indeed, if we solve the associated homogeneous system

    x   y  RHS
    1   2   0
    2   4   0

we get the REF

    x   y  RHS
    1   2   0
    0   0   0

from which x = -2y with y free, so (x, y) = y(-2, 1), which is exactly the Xh = y(-2, 1) in line (*).

Thus solving a linear system can be interpreted as a 2-part process:

(a) Find a particular solution Xp. (We will see that any particular solution will work.)
(b) Solve the homogeneous system.

Then the solutions all have the form X = Xp + Xh, where Xh is some solution to the homogeneous system. These can be done in either order. Our parametric process does them simultaneously.

NOTE: HOMOGENEOUS SYSTEMS ARE ALWAYS CONSISTENT because the zero vector O = (0, ..., 0) is always a solution. THIS IS CALLED THE TRIVIAL SOLUTION. So even if a system is inconsistent, its associated homogeneous system will always be consistent.

The system (A B) has infinitely many solutions if and only if (a) it is consistent and (b) the associated homogeneous system (A O) has infinitely many solutions.
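The decomposition X = Xp + Xh can be verified directly. A quick Python check, assuming the reconstructed entries A = [[1, 2], [2, 4]] and B = (-1, -2) for the example system:

```python
def apply(A, v):
    """Matrix-vector product A*v."""
    return [sum(a * x for a, x in zip(row, v)) for row in A]

A = [[1, 2], [2, 4]]
B = [-1, -2]
Xp = [-1, 0]   # particular solution of (A B)
Xh = [-2, 1]   # generator of the homogeneous solutions

# Xh solves (A O) ...
print(apply(A, Xh) == [0, 0])   # True

# ... and Xp + t*Xh solves (A B) for every scalar t
for t in range(-3, 4):
    X = [p + t * h for p, h in zip(Xp, Xh)]
    assert apply(A, X) == B
```

The loop illustrates the claim that adding any homogeneous solution to a particular solution again gives a solution.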

The homogeneous system (A O) has infinitely many solutions if and only if there is a free variable.

Recall that if A is a matrix, then the RANK of A is the number of pivots in any REF of A. Equivalently, the rank of A is the number of non-zero rows in any REF of A. This is not by any means "obvious". We will take it as a fact for now and prove it when we have more tools.

If A is an m by n matrix (m rows and n columns), then of course the rank of A is the number of pivot variables. Since there is a variable for each column of A, the rank is at most the number of columns. Since there is a pivot in each non-zero row of the REF, and the REF and A have the same number of rows, the rank of A is at most the number of rows of A. So if A is an m by n matrix, then

    0 <= rank(A) <= min(m, n)

Since there is a free variable exactly when there is a variable that is not a pivot variable, we see:

THE HOMOGENEOUS SYSTEM (A O) HAS ONLY THE TRIVIAL SOLUTION IF AND ONLY IF THE RANK OF A IS EQUAL TO THE NUMBER OF COLUMNS OF A.

SAID ANOTHER WAY: that the homogeneous system (A O) has only the trivial solution means that if C1, C2, ..., Cn are the columns of A, then the only way to write

    x1 C1 + x2 C2 + ... + xn Cn = O   (the zero vector)

is with x1 = x2 = ... = xn = 0.

DEFINITION: The set of vectors {C1, C2, ..., Cn} is LINEARLY INDEPENDENT if the only way to write O as a linear combination of C1, C2, ..., Cn is with all coefficients equal to 0. Thus:

THE COLUMNS OF THE MATRIX A ARE LINEARLY INDEPENDENT IF AND ONLY IF THE RANK OF A IS EQUAL TO THE NUMBER OF COLUMNS OF A.

Given any set of n vectors {C1, C2, ..., Cn} in R^m, we can always make them the columns of a matrix and calculate its rank. Thus we have a simple way to determine whether a set of vectors is linearly independent:

(a) assemble the vectors into a matrix A;
(b) calculate a REF of A.

Then: the vectors are linearly independent if and only if the rank of A is n.
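The rank-based independence test lends itself to a short computation. Here is a minimal pure-Python row-reduction sketch (an illustration, not the course's algorithmic notation), using exact fractions to avoid round-off; rank(A) counts pivots, so the columns are independent exactly when rank(A) equals the number of columns.

```python
from fractions import Fraction

def rank(A):
    """Rank = number of pivots found by Gaussian elimination to REF."""
    M = [[Fraction(x) for x in row] for row in A]
    m, n = len(M), len(M[0]) if M else 0
    r = 0                         # current pivot row
    for c in range(n):
        pivot = next((i for i in range(r, m) if M[i][c] != 0), None)
        if pivot is None:
            continue              # no pivot here: free variable
        M[r], M[pivot] = M[pivot], M[r]
        for i in range(r + 1, m):
            f = M[i][c] / M[r][c]
            M[i] = [a - f * b for a, b in zip(M[i], M[r])]
        r += 1
    return r

print(rank([[1, 2], [2, 4]]))          # 1: two columns, rank 1 -> dependent
print(rank([[1, 0], [0, 1], [5, 7]]))  # 2: two columns, rank 2 -> independent
```

Note that rank never exceeds min(m, n), matching the inequality above.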

If the rank is less than n, then there are non-trivial ways to write O as a linear combination of {C1, C2, ..., Cn}. These are just the non-trivial solutions to the homogeneous system (A O).

Example: The two vectors C1 = (1, 2) and C2 = (2, 4) are linearly dependent since

(**)    2 C1 + (-1) C2 = O

What are all of the ways to write O as a linear combination of C1 and C2?

Solution: The pairs (x, y) such that x C1 + y C2 = O are the solutions to the homogeneous system (A O) where

    A =  1  2
         2  4

We found above that the parametric solution to this system is (x, y) = y(-2, 1). So every way to write x C1 + y C2 = O has the form (-2y) C1 + y C2, where y is a real number. Note that y = 0 provides the trivial solution and y = -1 gives the solution (**).

Example: Consider a 4 by 6 matrix A whose REF M has three non-zero rows (and hence three pivot columns) and a fourth row of all zeros.

QUESTIONS:

1. What is the rank of A? Are the columns of A an independent set? Why or why not?

Answer: The rank is 3, the number of non-zero rows of M. The columns are not an independent set: the condition for independence is that rank(A) equals the number of columns of A, and here the rank is 3 while A has 6 columns.

2. What is the largest number of columns of A that could be independent? Why?

Answer: 3. Given any subset of the columns of A, if they form the columns of a matrix, then the row operations that produced M would reduce that matrix to the one consisting of the corresponding columns of M. That matrix has at most three non-zero rows, so it has rank at most 3.
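The dependence relation in the first example can be confirmed mechanically; the check below uses the reconstructed vectors C1 = (1, 2) and C2 = (2, 4).

```python
C1 = [1, 2]
C2 = [2, 4]

def comb(x, y):
    """Return x*C1 + y*C2."""
    return [x * a + y * b for a, b in zip(C1, C2)]

# The dependence relation (**): 2*C1 + (-1)*C2 = O
print(comb(2, -1) == [0, 0])   # True

# Every relation x*C1 + y*C2 = O has the form x = -2y
for y in range(-4, 5):
    assert comb(-2 * y, y) == [0, 0]
```

The loop runs through several values of the free variable y, tracing out the one-parameter family of dependence relations.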

3. If Ci is the i-th column of A, how do we decide which subsets of the columns are linearly independent?

Answer: In each case, form the matrix whose columns are the given vectors; the subset is linearly independent if and only if the rank of that matrix equals the number of columns chosen. By Question 2, any subset with more than three columns is automatically dependent.

4. The matrix M was produced from A by the sequence

    R2 -> R2 - 2*R1,  R3 -> R3 - 3*R1,  R4 -> R4 + R1,
    R3 -> R3 - R2,  R4 -> R4 + 2*R2,  R4 <-> R3

The sequence converting M back into A is the sequence of inverses of these operations, in the reverse order. That is,

    R4 <-> R3,  R4 -> R4 - 2*R2,  R3 -> R3 + R2,
    R4 -> R4 - R1,  R3 -> R3 + 3*R1,  R2 -> R2 + 2*R1

Use these to modify B in the linear system (A B) to a vector C such that (A C) is inconsistent.

Solution: The REF of (A B) has a bottom row of all zeros. If instead the bottom row of the REF had been, say, (0 0 0 0 0 -1), then the system would have been inconsistent. Thus we want to replace the vector B
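The criterion used here, that a REF row of the form (0 ... 0 | b) with b != 0 forces inconsistency, is easy to test mechanically. A small Python sketch with made-up 2-row examples:

```python
def ref_inconsistent(R):
    """True if an augmented REF has a row (0, ..., 0 | b) with b != 0,
    i.e. the system is inconsistent."""
    return any(all(x == 0 for x in row[:-1]) and row[-1] != 0 for row in R)

# REF of a consistent system: bottom row entirely zero
consistent = [[1, 2, -1],
              [0, 0,  0]]
# Same REF with the bottom right-hand side changed to a non-zero value
inconsistent = [[1, 2, -1],
                [0, 0,  5]]

print(ref_inconsistent(consistent))    # False
print(ref_inconsistent(inconsistent))  # True
```

Changing only the right-hand-side entry of a zero row flips the system from consistent to inconsistent, which is exactly the idea behind Question 4.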

with a vector C that the given sequence of elementary row operations would take to a vector D whose bottom entry is non-zero, so that the REF of (A C) ends in a row (0 ... 0 d) with d != 0. To do this we apply the reverse sequence to D, one operation at a time:

    R4 <-> R3
    R4 -> R4 - 2*R2
    R3 -> R3 + R2
    R4 -> R4 - R1
    R3 -> R3 + 3*R1
    R2 -> R2 + 2*R1

The vector produced at the end is the desired C, and (A C) is inconsistent.
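As a sanity check on the idea behind Question 4, the following Python sketch applies a sequence of elementary row operations and then their inverses in reverse order, recovering the original matrix. The matrix and operations are illustrative examples, not the ones in the notes.

```python
def row_add(M, i, j, c):
    """Elementary operation R_i -> R_i + c*R_j (in place)."""
    M[i] = [a + c * b for a, b in zip(M[i], M[j])]

def row_swap(M, i, j):
    """Elementary operation R_i <-> R_j (in place)."""
    M[i], M[j] = M[j], M[i]

M = [[1, 2, 3], [4, 5, 6], [7, 8, 10]]
orig = [row[:] for row in M]

# A forward sequence of elementary operations...
row_add(M, 1, 0, -4)   # R2 -> R2 - 4*R1
row_add(M, 2, 0, -7)   # R3 -> R3 - 7*R1
row_swap(M, 1, 2)      # R2 <-> R3

# ...undone by the inverse operations in the reverse order
row_swap(M, 1, 2)      # R2 <-> R3
row_add(M, 2, 0, 7)    # R3 -> R3 + 7*R1
row_add(M, 1, 0, 4)    # R2 -> R2 + 4*R1

print(M == orig)       # True
```

Each elementary operation is invertible, and reversing the order makes the cancellations telescope, which is why running the inverse sequence on M reconstructs A.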