Sections 1.5, 1.7. Ma 322 Fall 2013. Sept. 9-13


Sections 1.5, 1.7. Ma 322 Fall 2013. Sept. 9-13

Summary

- Solutions of homogeneous equations AX = 0.
- Using the rank.
- Parametric solution of AX = B.
- Linear dependence and independence of vectors in R^n.
- Using REF and RREF as convenient.

The Homogeneous Equation AX = 0.

- A linear system (A | 0) is said to be homogeneous when the RHS entries are all 0. In this case, the REF of M = (A | 0) can be seen to be M' = (A' | 0), i.e. the 0 column can be omitted throughout the reduction process, if convenient.
- Clearly, rank(M) = rank(A), so a homogeneous system is always consistent. Indeed, it is also clear that X = 0 is a solution to AX = 0, and hence consistency is directly obvious!
- Let the common rank be r. Then there are exactly r pivot variables and n - r free variables.
- The final solution consists of solving for the pivot variables in terms of the free variables and reporting the conclusion.
- Here is a worked example.

An example.

- Consider a system in REF:

      x   y   z   w   t | RHS
    [ 2  -6   0  -4   6 |  0  ]
    [ 0   0   1  -3   2 |  0  ]
    [ 0   0   0   0   7 |  0  ]
    [ 0   0   0   0   0 |  0  ]

- Identify the pc list as (1, 3, 5, ∞), the pivot variables as x, z, t and the free variables as y, w.
- The fourth equation is ignored. The third gives t = 0, the second gives z = 3w - 2t = 3w, and the first gives x = 3y + 2w - 3t = 3y + 2w.
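The pivot/free-variable count in the example can be read off mechanically from a matrix in REF. A minimal sketch in pure Python, using the matrix of the worked example (note that Python indexes columns from 0, while the notes' pc list is 1-based):

```python
# REF matrix A of the homogeneous example (columns x, y, z, w, t)
A = [[2, -6, 0, -4, 6],
     [0,  0, 1, -3, 2],
     [0,  0, 0,  0, 7],
     [0,  0, 0,  0, 0]]

# In REF the rank is the number of nonzero rows; the pivot column of a
# nonzero row is the position of its first nonzero entry.
pc = [next(j for j, a in enumerate(row) if a != 0)
      for row in A if any(a != 0 for a in row)]

r, n = len(pc), len(A[0])
pivot_vars = [("x", "y", "z", "w", "t")[j] for j in pc]
print(pc, r, n - r, pivot_vars)   # [0, 2, 4] 3 2 ['x', 'z', 't']
```

The 0-based pivot columns [0, 2, 4] correspond to the notes' pc list (1, 3, 5): rank 3 and n - r = 2 free variables (y and w).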

Reporting the Solution.

- The above solution is best reported as a vector:

    (x, y, z, w, t) = (3y + 2w, y, 3w, w, 0) = y v1 + w v2,

  where v1 = (3, 1, 0, 0, 0) and v2 = (2, 0, 3, 1, 0), written as column vectors.

Conclusion.

- Thus the solution is seen to be a member of Span{v1, v2}.
- Denoting a general member of the span as v_h, we write X = v_h. It is important to remember that v_h stands for any one of an infinite collection of vectors and should not be confused with a specific single vector or with the whole span!
- In general, the solution of AX = 0 is always a member of the span of n - r vectors, where n is the number of variables and r = rank(A).
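The claim that every member of Span{v1, v2} solves AX = 0 can be checked directly; a small sketch with the example's data (the choice y = 5, w = -7 is arbitrary):

```python
# REF matrix of the example and the two spanning vectors read off above
A  = [[2, -6, 0, -4, 6],
      [0,  0, 1, -3, 2],
      [0,  0, 0,  0, 7],
      [0,  0, 0,  0, 0]]
v1 = [3, 1, 0, 0, 0]   # coefficient vector of the free variable y
v2 = [2, 0, 3, 1, 0]   # coefficient vector of the free variable w

def matvec(M, v):
    """Matrix-vector product, row by row."""
    return [sum(a * b for a, b in zip(row, v)) for row in M]

# Any combination X = y*v1 + w*v2 should satisfy A X = 0; try y = 5, w = -7.
y, w = 5, -7
X = [y * a + w * b for a, b in zip(v1, v2)]
print(matvec(A, X))   # [0, 0, 0, 0]
```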

The Non-homogeneous Case.

- More generally, if we put the augmented matrix M = (A | B) of a non-homogeneous system into REF, say M' = (A' | B'), then we need to verify the consistency condition first.
- If the condition fails, then there is no solution.
- If the condition holds, then the solution process and reporting are just as above, except that the final answer is a fixed vector plus a span of (n - r) vectors as before. Here is an example.

A Non-homogeneous Example.

- Consider a system in REF:

      x   y   z   w   t | RHS
    [ 2  -6   0  -4   6 |  18 ]
    [ 0   0   1  -3   2 |   5 ]
    [ 0   0   0   0   7 |  35 ]
    [ 0   0   0   0   0 |   0 ]

- Identify the pc list as (1, 3, 5, ∞), the pivot variables as x, z, t and the free variables as y, w.
- The fourth equation is ignored. The third gives t = 5, the second gives z = 3w - 2t + 5 = 3w - 10 + 5 = 3w - 5, and the first gives x = 3y + 2w - 3t + 9 = 3y + 2w - 15 + 9 = 3y + 2w - 6.

Reporting the Full Solution.

- The above solution is best reported as a vector:

    (x, y, z, w, t) = (3y + 2w - 6, y, 3w - 5, w, 5) = v_p + y v1 + w v2,

  where v_p = (-6, 0, -5, 0, 5), v1 = (3, 1, 0, 0, 0), and v2 = (2, 0, 3, 1, 0), written as column vectors.

Conclusion.

- Thus the solution is seen to be v_p plus a member of Span{v1, v2}.
- Denoting a general member of the span as v_h, we write X = v_p + v_h. It is important to remember that v_h stands for any one of an infinite collection of vectors and should not be confused with a specific single vector or with the whole span!
- In general, the solution of a consistent system AX = B is always X = v_p + v_h, where v_h is a general member of a span of n - r vectors, n is the number of variables, and r = rank(A).
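The structure X = v_p + y v1 + w v2 can likewise be verified against the non-homogeneous example; a short sketch (y = 2, w = 3 is an arbitrary choice of the free variables):

```python
# Non-homogeneous example: A X = B with the REF data above
A  = [[2, -6, 0, -4, 6],
      [0,  0, 1, -3, 2],
      [0,  0, 0,  0, 7],
      [0,  0, 0,  0, 0]]
B  = [18, 5, 35, 0]
vp = [-6, 0, -5, 0, 5]   # the particular solution v_p
v1 = [3, 1, 0, 0, 0]     # spanning vectors of the homogeneous part
v2 = [2, 0, 3, 1, 0]

def matvec(M, v):
    """Matrix-vector product, row by row."""
    return [sum(a * b for a, b in zip(row, v)) for row in M]

# X = vp + y*v1 + w*v2 should give A X = B for every y, w; try y = 2, w = 3.
y, w = 2, 3
X = [p + y * a + w * b for p, a, b in zip(vp, v1, v2)]
print(matvec(A, X) == B)   # True
```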

Trivial Solutions.

- If we consider a homogeneous equation AX = 0 and C1, C2, ..., Cn are the n columns of A, then we are solving the equation:

    x1 C1 + x2 C2 + ... + xn Cn = 0.

- As already observed, this system is consistent and always has the solution x1 = x2 = ... = xn = 0, or briefly X = 0.

  Def. 1: Trivial Solution. The solution X = 0 is said to be the trivial solution of the system AX = 0.

- As already observed, the solution of AX = 0 is X = v_h, where v_h is a general member of a span of n - r vectors, where r = rank(A).

Linear Dependence.

- Def. 2: Linearly Dependent Vectors. The columns C1, C2, ..., Cn of A are said to be linearly dependent if the system AX = 0 has a non-trivial solution, or equivalently n - r > 0, i.e. the rank of A is less than its number of columns.
- For future use, we restate this definition more generally thus:

  Def. 2 (general): Linearly Dependent Vectors. Any set S of vectors is said to be linearly dependent if there is a positive integer n such that n distinct vectors v1, v2, ..., vn of S satisfy c1 v1 + c2 v2 + ... + cn vn = 0 where at least one of the ci is non-zero.

- Note that this makes sense even for an infinite set S.

Linear Independence.

- Def. 3: Linearly Independent Vectors. A set S of vectors is said to be linearly independent if it is not linearly dependent!
- Def. 4: Notations rownum, colnum. For any matrix A, we shall define colnum(A) to be the number of columns in A and rownum(A) to be the number of rows in A.
- Though logically clear, linear independence can be difficult to verify without better tools. We describe such a tool next.
- Convention: We shall often drop the word "linearly" from the terms linearly dependent and linearly independent.

Test of Linear Dependence/Independence.

- For a finite set of vectors in R^m, there is a simple criterion for dependence/independence.
- Given vectors v1, v2, ..., vn in R^m, make a matrix A by taking these as columns and find its rank (by using REF, for example).
- Suppose rank(A) = r. It is obvious that r ≤ n = colnum(A). Then we have: v1, v2, ..., vn are linearly dependent iff r < n, and thus they are linearly independent iff r = n.
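The rank test can be sketched directly: stack the vectors as columns, row-reduce with exact fractions, and compare the rank with the number of columns. The `rank` and `independent` helpers and the sample vectors below are illustrative, not from the notes:

```python
from fractions import Fraction

def rank(rows):
    """Rank of a matrix (list of rows) via Gaussian elimination."""
    m = [[Fraction(x) for x in row] for row in rows]
    r = 0                                 # number of pivots found so far
    for c in range(len(m[0])):            # scan columns left to right
        piv = next((i for i in range(r, len(m)) if m[i][c] != 0), None)
        if piv is None:
            continue                      # no pivot in this column
        m[r], m[piv] = m[piv], m[r]       # swap the pivot row up
        for i in range(r + 1, len(m)):    # clear entries below the pivot
            f = m[i][c] / m[r][c]
            m[i] = [a - f * b for a, b in zip(m[i], m[r])]
        r += 1
    return r

def independent(vectors):
    """Vectors in R^m are independent iff rank(A) == colnum(A)."""
    A = [list(col) for col in zip(*vectors)]   # the vectors become columns
    return rank(A) == len(vectors)

print(independent([(1, 0, 2), (0, 1, 1)]))             # True
print(independent([(1, 0, 2), (0, 1, 1), (1, 1, 3)]))  # False: v3 = v1 + v2
```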

A Proof of a Better Criterion.

- Given vectors v1, v2, ..., vr in R^m, here is a more general criterion for their independence.
- Make a matrix A using v1, v2, ..., vr as columns and assume that A has r different rows with different finite pivot columns. Then v1, v2, ..., vr are linearly independent.
- Proof: Observe that each of the r columns must have a pivot in it, and so each of 1, 2, ..., r appears in the pc list.
- Consider the equation x1 v1 + x2 v2 + ... + xr vr = 0. If we consider the row which has its pivot in column r, then clearly its equation gives xr = 0.
- Now if we consider the row which has its pivot in column r - 1, then its equation involves x_{r-1} and may also involve xr. Since xr is already known to be zero, we get x_{r-1} = 0.
- Proceeding thus, we get x1 = x2 = ... = xr = 0, i.e. the columns are linearly independent.

Using REF, RREF.

- We have illustrated how to make and use REF, the row echelon form.
- We defined RREF but did not work much with it. Roughly, it needs twice as much work as REF, and hence we avoided using it when it was not really needed.
- It is, however, needed for some of the later work, and we include an illustration of the method to get that form.
- Unlike REF, the form RREF is well defined (unique for a given matrix) and thus has theoretical merit.
- Typically, once we have achieved RREF, the act of solving equations becomes simply writing down the answers.

An example of RREF.

- Problem: Consider the following augmented matrix (already in REF) and convert it to RREF. Use it to write out the solution of the associated linear system.

      x   y   z | RHS
    [ 1   1   3 |  12 ]
    [ 0   1   6 |  20 ]
    [ 0   0   9 |  27 ]

- Notice that the pc list is (1, 2, 3) and the respective pivot entries are 1, 1, 9.
- The process is to start with the last pivot, make all entries above it equal to zero, and make it 1. Then repeat with the earlier pivots.

RREF Example concluded.

- Thus the operations R1 - (3/9) R3, R2 - (6/9) R3, (1/9) R3 give:

      x   y   z | RHS          x   y   z | RHS
    [ 1   1   3 |  12 ]      [ 1   1   0 |  3 ]
    [ 0   1   6 |  20 ]  →   [ 0   1   0 |  2 ]
    [ 0   0   9 |  27 ]      [ 0   0   1 |  3 ]

- Then the operation R1 - R2 finishes off the RREF:

      x   y   z | RHS
    [ 1   0   0 |  1 ]
    [ 0   1   0 |  2 ]
    [ 0   0   1 |  3 ]

- The answer to the system can now simply be read off: x = 1, y = 2, z = 3.
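The row operations of the example can be carried out mechanically; a minimal sketch with exact fractions (the `addrow` helper is illustrative):

```python
from fractions import Fraction as F

# Augmented matrix of the example, already in REF (columns x, y, z | RHS)
M = [[F(1), F(1), F(3), F(12)],
     [F(0), F(1), F(6), F(20)],
     [F(0), F(0), F(9), F(27)]]

def addrow(M, i, j, c):
    """Row operation Ri <- Ri + c * Rj."""
    M[i] = [a + c * b for a, b in zip(M[i], M[j])]

# Work upward from the last pivot: clear above it, then scale it to 1.
addrow(M, 0, 2, F(-3, 9))           # R1 <- R1 - (3/9) R3
addrow(M, 1, 2, F(-6, 9))           # R2 <- R2 - (6/9) R3
M[2] = [F(1, 9) * a for a in M[2]]  # R3 <- (1/9) R3
addrow(M, 0, 1, F(-1))              # R1 <- R1 - R2

print([int(row[-1]) for row in M])  # [1, 2, 3]: x = 1, y = 2, z = 3
```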