Sections 1.5, 1.7. Ma 322, Spring 2017. Jan 24-28

Sections 1.5, 1.7. Ma 322, Spring 2017. Jan 24-28.

Summary

- Text: Solution Sets of Linear Systems (1.5), Linear Independence (1.7).
- Solutions of homogeneous equations AX = 0.
- Using the rank.
- Parametric solution of AX = B.
- Linear dependence and independence of vectors in R^n.
- Using REF and RREF as convenient.

The Homogeneous Equation AX = 0.

- Def. 1: Homogeneous System of Equations. A linear system (A B) is said to be homogeneous when B = 0, i.e. the RHS entries of B are all 0. In this case, the REF of M = (A 0) has the form (A' 0), where A' is an REF of A; the zero column can be omitted throughout the reduction process, since it never changes.
- Clearly rank(M) = rank(A), so a homogeneous system is always consistent. Indeed, X = 0 is clearly a solution of AX = 0, so consistency is also directly obvious.
- Let the common rank be r. Then there are exactly r pivot variables and n - r free variables.
- The final solution consists of solving for the pivot variables in terms of the free variables and reporting the conclusion.
- Here is a worked example.

An Example.

- Consider a system in REF:

     x   y   z   w   t | RHS
     2   6   0  -4   6 |   0
     0   0   1  -3  -2 |   0
     0   0   0   0   7 |   0
     0   0   0   0   0 |   0

- Identify the pc list as (1, 3, 5, 0), the pivot variables as x, z, t and the free variables as y, w.
- The fourth equation is ignored (it reads 0 = 0). The third gives t = 0, the second gives z = 3w + 2t = 3w, and the first gives x = -3y + 2w - 3t = -3y + 2w.
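
For readers who want to check this bookkeeping by machine, here is a minimal sketch using SymPy (an illustration added here, not part of the original notes); it recomputes the rank, the pivot columns and the number of free variables for the coefficient matrix of the example.

```python
# Minimal check of the bookkeeping above using SymPy (not part of the original notes).
from sympy import Matrix

# Coefficient matrix A of the homogeneous system, already in REF
A = Matrix([
    [2, 6, 0, -4,  6],
    [0, 0, 1, -3, -2],
    [0, 0, 0,  0,  7],
    [0, 0, 0,  0,  0],
])

r = A.rank()               # common rank of A and (A 0)
n = A.cols                 # number of variables: x, y, z, w, t
pivot_cols = A.rref()[1]   # zero-based pivot columns
print("rank r =", r)                                            # 3
print("pivot columns (1-based):", [c + 1 for c in pivot_cols])  # [1, 3, 5]
print("free variables: n - r =", n - r)                         # 2 (namely y and w)
```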

Reporting the Solution.

- The above solution is best reported as a vector:

     (x, y, z, w, t) = (-3y + 2w, y, 3w, w, 0) = y v_1 + w v_2,

  where v_1 = (-3, 1, 0, 0, 0) and v_2 = (2, 0, 3, 1, 0), all written as column vectors.

Conclusion.

- Thus the solution is seen to be a member of Span{v_1, v_2}.
- Denoting a general member of the span by v_h, we write X = v_h. It is important to remember that v_h stands for any one of an infinite collection of vectors and should not be confused with a specific single vector or with the whole span!
- In general, the solution of AX = 0 is always a member of the span of n - r vectors, where n is the number of variables and r = rank(A).
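
A short sketch along the same lines (again SymPy, purely illustrative) confirms that v_1 and v_2 solve AX = 0 and that the solution set has dimension n - r = 2, so Span{v_1, v_2} is indeed the whole solution set.

```python
# SymPy sketch (illustrative, not from the notes): v_1 and v_2 solve AX = 0 and,
# since the solution set has dimension n - r = 2, they span all of it.
from sympy import Matrix

A = Matrix([
    [2, 6, 0, -4,  6],
    [0, 0, 1, -3, -2],
    [0, 0, 0,  0,  7],
    [0, 0, 0,  0,  0],
])
v1 = Matrix([-3, 1, 0, 0, 0])
v2 = Matrix([ 2, 0, 3, 1, 0])

print(A * v1)              # zero vector, so v_1 solves AX = 0
print(A * v2)              # zero vector, so v_2 solves AX = 0
print(len(A.nullspace()))  # 2 = n - r, so Span{v_1, v_2} is the whole solution set
```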

The Non-homogeneous Case.

- More generally, if we put the augmented matrix M = (A B) of a non-homogeneous system into REF, say M' = (A' B'), then we need to verify the consistency condition first.
- If the condition fails, then there is no solution.
- If the condition holds, then the solution process and the reporting are just as above, except that the final answer is a fixed vector plus a general member of a span of n - r vectors, as before. Here is an example.

A Non-homogeneous Example.

- Consider a system in REF:

     x   y   z   w   t | RHS
     2   6   0  -4   6 |  18
     0   0   1  -3  -2 |   5
     0   0   0   0   7 |  35
     0   0   0   0   0 |   0

- Identify the pc list as (1, 3, 5, 0), the pivot variables as x, z, t and the free variables as y, w.
- The fourth equation is ignored. The third gives t = 5, the second gives z = 3w + 2t + 5 = 3w + 10 + 5 = 3w + 15, and the first gives x = -3y + 2w - 3t + 9 = -3y + 2w - 15 + 9 = -3y + 2w - 6.
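
The consistency check and the back substitution can also be done mechanically; the following SymPy sketch (illustrative only) forms the augmented matrix, compares rank(A) with rank(A B), and lets rref finish the work.

```python
# SymPy sketch (illustrative, not from the notes): check consistency of (A B) and
# let rref carry out the back substitution for the non-homogeneous example.
from sympy import Matrix

A = Matrix([
    [2, 6, 0, -4,  6],
    [0, 0, 1, -3, -2],
    [0, 0, 0,  0,  7],
    [0, 0, 0,  0,  0],
])
B = Matrix([18, 5, 35, 0])
M = A.row_join(B)            # augmented matrix (A B)

print(A.rank() == M.rank())  # True: the consistency condition holds
print(M.rref()[0])           # RREF of (A B); pivot variables in terms of free ones
```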

Reporting the Full Solution.

- The above solution is best reported as a vector:

     (x, y, z, w, t) = (-3y + 2w - 6, y, 3w + 15, w, 5) = v_p + y v_1 + w v_2,

  where v_p = (-6, 0, 15, 0, 5), v_1 = (-3, 1, 0, 0, 0) and v_2 = (2, 0, 3, 1, 0), all written as column vectors.

Conclusion.

- Thus the solution is seen to be v_p plus a member of Span{v_1, v_2}.
- Denoting a general member of the span by v_h, we write X = v_p + v_h. It is important to remember that v_h stands for any one of an infinite collection of vectors and should not be confused with a specific single vector or with the whole span!
- In general, the solution of AX = B is always X = v_p + v_h, where v_h is a general member of a span of n - r vectors, with n the number of variables and r = rank(A).
- Def. 2: Homogeneous and Particular Solutions. We call v_p a particular solution and v_h a homogeneous solution. Note that neither of these is unique, but with proper identifications they exhibit all the solutions of the system in a parametric form.
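
The following SymPy sketch (illustrative only) verifies the reported answer: v_p is a particular solution, and v_p + y v_1 + w v_2 solves AX = B for every choice of y and w.

```python
# SymPy sketch (illustrative, not from the notes): v_p is a particular solution, and
# X = v_p + y*v_1 + w*v_2 solves AX = B for every choice of the free variables y, w.
from sympy import Matrix, symbols

A  = Matrix([
    [2, 6, 0, -4,  6],
    [0, 0, 1, -3, -2],
    [0, 0, 0,  0,  7],
    [0, 0, 0,  0,  0],
])
B  = Matrix([18, 5, 35, 0])
vp = Matrix([-6, 0, 15, 0, 5])
v1 = Matrix([-3, 1, 0, 0, 0])
v2 = Matrix([ 2, 0, 3, 1, 0])

y, w = symbols('y w')
X = vp + y*v1 + w*v2
print((A*vp - B).expand())  # zero vector: v_p solves AX = B
print((A*X - B).expand())   # zero vector for all y, w: so does v_p + v_h
```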

Trivial Solutions.

- If we consider a homogeneous equation AX = 0 and C_1, C_2, ..., C_n are the n columns of A, then we are solving the equation

     x_1 C_1 + x_2 C_2 + ... + x_n C_n = 0.

- As already observed, this system is consistent and always has the solution x_1 = x_2 = ... = x_n = 0, or briefly X = 0.
- Def. 3: Trivial Solution. The solution X = 0 is said to be the trivial solution of the system AX = 0.
- As already observed, the solution of AX = 0 is X = v_h, where v_h is a general member of a span of n - r vectors and r = rank(A). Thus, for a homogeneous system, a particular solution is the trivial solution!

Linear Dependence.

- Def. 4: Linearly Dependent Vectors. The columns C_1, C_2, ..., C_n of A are said to be linearly dependent if the system AX = 0 has a non-trivial solution, or equivalently n - r > 0, i.e. the rank of A is less than its number of columns.
- For future use, we restate this definition more generally:
  Def. 5 (general): Linearly Dependent Vectors. Any set S of vectors is said to be linearly dependent if there is a positive integer n such that n distinct vectors v_1, v_2, ..., v_n of S satisfy c_1 v_1 + c_2 v_2 + ... + c_n v_n = 0, where at least one of the c_i is non-zero.
- Note that this makes sense even for an infinite set S.
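
The two definitions fit together computationally as well; in this SymPy sketch (illustrative only) a non-trivial solution of AX = 0 for the earlier example is read off and used as a dependence relation among the columns.

```python
# SymPy sketch (illustrative, not from the notes): a non-trivial solution of AX = 0
# is precisely a dependence relation c_1 C_1 + ... + c_n C_n = 0 among the columns.
from sympy import Matrix

A = Matrix([
    [2, 6, 0, -4,  6],
    [0, 0, 1, -3, -2],
    [0, 0, 0,  0,  7],
    [0, 0, 0,  0,  0],
])

X = A.nullspace()[0]  # one non-trivial solution (it exists because rank(A) = 3 < 5)
print(X.T)            # the non-zero coefficients c_1, ..., c_5
print(A * X)          # zero vector: the columns of A are linearly dependent
```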

Linear Independence.

- Def. 6: Linearly Independent Vectors. A set S of vectors is said to be linearly independent if it is not linearly dependent!
- Def. 7: Notations rownum, colnum. For any matrix A, we shall define colnum(A) to be the number of columns of A and rownum(A) to be the number of rows of A.
- Though logically clear, linear independence can be difficult to verify without better tools. We describe such a tool next.
- Convention: We shall often drop the word "linearly" from the terms "linearly dependent" and "linearly independent".

Test of Linear Dependence/Independence.

- For a finite set of vectors in R^m, there is a simple criterion for dependence/independence.
- Given vectors v_1, v_2, ..., v_n in R^m, make a matrix A by taking these as columns and find its rank (by using REF, for example).
- Suppose rank(A) = r. It is obvious that r ≤ n = colnum(A). Then we have: v_1, v_2, ..., v_n are linearly dependent iff r < n, and thus they are linearly independent iff r = n.
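
Here is a small SymPy sketch of the rank test; the example vectors are made up for illustration, with v_3 chosen as 2 v_1 + 3 v_2 so that the set is deliberately dependent.

```python
# SymPy sketch of the rank test (illustrative; the vectors below are made up, with
# v_3 = 2*v_1 + 3*v_2 so that the set is dependent on purpose).
from sympy import Matrix

v1 = Matrix([1, 0, 2])
v2 = Matrix([0, 1, 1])
v3 = Matrix([2, 3, 7])

A = Matrix.hstack(v1, v2, v3)   # the given vectors as columns
r, n = A.rank(), A.cols
print("independent" if r == n else "dependent")  # "dependent", since r = 2 < 3 = n
```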

Using REF, RREF.

- We have illustrated how to make and use REF, the row echelon form.
- We defined RREF, but did not work much with it. Roughly, it needs twice as much work as REF, and hence we avoided using it when it was not really needed.
- It is, however, needed for some of the later work, and we include an illustration of the method used to get that form.
- Unlike REF, the form RREF is well defined and thus has theoretical merit.
- Typically, once RREF has been achieved, the act of solving equations becomes simply writing down the answers.

An Example of RREF.

- Consider the following augmented matrix (already in REF) and convert it to RREF. Use it to write out the solution of the associated linear system.

     x   y   z | RHS
     1   1   3 |  12
     0   1   6 |  20
     0   0   9 |  27

- Notice that the pc list is (1, 2, 3) and the respective pivot entries are 1, 1, 9.
- The process is to start with the last pivot, make all entries above it equal to zero, and make it 1. Then repeat with the earlier pivots.

RREF Example Concluded.

- Thus, the operations R_1 - (3/9) R_3, R_2 - (6/9) R_3 and (1/9) R_3 turn the matrix above into:

     x   y   z | RHS
     1   1   0 |   3
     0   1   0 |   2
     0   0   1 |   3

- Then the operation R_1 - R_2 finishes off the RREF:

     x   y   z | RHS
     1   0   0 |   1
     0   1   0 |   2
     0   0   1 |   3

- The answer to the system can now simply be read off: x = 1, y = 2, z = 3.
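
The same computation can be reproduced mechanically; this SymPy sketch (illustrative only) applies rref to the augmented matrix and reads off the solution from the last column.

```python
# SymPy sketch (illustrative, not from the notes): rref reproduces the RREF above,
# and the last column gives the solution x = 1, y = 2, z = 3.
from sympy import Matrix

M = Matrix([
    [1, 1, 3, 12],
    [0, 1, 6, 20],
    [0, 0, 9, 27],
])

R, pivots = M.rref()
print(R)         # Matrix([[1, 0, 0, 1], [0, 1, 0, 2], [0, 0, 1, 3]])
print(R[:, -1])  # the solution column: (1, 2, 3)
```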