Elementary Linear Algebra


Elementary Linear Algebra, Anton & Rorres, 10th Edition. Lecture Set 05, Chapter 4 (Part II): General Vector Spaces. 163 Engineering Mathematics 3, Computer Engineering Program, Academic Year 1/2012. Asst. Prof. Dr. Aranya Walairacht, Asst. Prof. Dr. Somsak Walairacht.

Chapter Content: Dimension; Change of Basis; Row Space, Column Space, and Null Space; Rank, Nullity, and the Fundamental Matrix Spaces. (2012/7/11, Elementary Linear Algebra)

4-5 Finite-Dimensional. A nonzero vector space V is called finite-dimensional if it contains a finite set of vectors {v1, v2, ..., vn} that forms a basis. If no such set exists, V is called infinite-dimensional. In addition, we shall regard the zero vector space to be finite-dimensional. Example: The vector spaces R^n, P_n, and M_mn are finite-dimensional. The vector spaces F(-inf, inf), C(-inf, inf), C^m(-inf, inf), and C^inf(-inf, inf) are infinite-dimensional.

Theorems 4.5.1 & 4.5.2. Theorem 4.5.1: All bases for a finite-dimensional vector space have the same number of vectors. Theorem 4.5.2: Let V be a finite-dimensional vector space and {v1, v2, ..., vn} any basis. (a) If a set in V has more than n vectors, then it is linearly dependent. (b) If a set in V has fewer than n vectors, then it does not span V.

4.5 Dimension. The dimension of a finite-dimensional vector space V, denoted by dim(V), is defined to be the number of vectors in a basis for V. We define the zero vector space to have dimension zero. Dimensions of some vector spaces: dim(R^n) = n [the standard basis has n vectors]; dim(M_mn) = mn [the standard basis has mn vectors]; dim(P_n) = n + 1 [the standard basis has n + 1 vectors].

Example (Standard Basis for P_n): S = {1, x, x^2, ..., x^n} is a basis for the vector space P_n of polynomials of the form a0 + a1 x + ... + an x^n. The set S is called the standard basis for P_n. Find the coordinate vector of the polynomial p = a0 + a1 x + a2 x^2 relative to the basis S = {1, x, x^2} for P_2. Solution: The coordinates of p = a0 + a1 x + a2 x^2 are the scalar coefficients a_i of the basis vectors 1, x, and x^2, so (p)_S = (a0, a1, a2).

4-5 Example 3: Determine a basis for, and the dimension of, the solution space of the homogeneous system
2x1 + 2x2 - x3 + x5 = 0
-x1 - x2 + 2x3 - 3x4 + x5 = 0
x1 + x2 - 2x3 - x5 = 0
x3 + x4 + x5 = 0
Solution: The general solution of the given system is x1 = -s - t, x2 = s, x3 = -t, x4 = 0, x5 = t. Therefore, the solution vectors can be written as (x1, x2, x3, x4, x5) = (-s - t, s, -t, 0, t) or, alternatively, as (x1, x2, x3, x4, x5) = s(-1, 1, 0, 0, 0) + t(-1, 0, -1, 0, 1). A basis is v1 = (-1, 1, 0, 0, 0), v2 = (-1, 0, -1, 0, 1), so the solution space has dimension 2.
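As a quick numerical check of Example 3 (a NumPy sketch; the slides themselves work by hand row reduction, and the signs in the matrix below are restored from the standard Anton & Rorres example where the extraction dropped them):

```python
import numpy as np

# Coefficient matrix of the homogeneous system in Example 3
A = np.array([
    [ 2,  2, -1,  0,  1],
    [-1, -1,  2, -3,  1],
    [ 1,  1, -2,  0, -1],
    [ 0,  0,  1,  1,  1],
])

# Dimension of the solution (null) space is n - rank(A)
rank = np.linalg.matrix_rank(A)       # 3
nullity = A.shape[1] - rank           # 5 - 3 = 2

# The basis vectors read off from the general solution
v1 = np.array([-1, 1,  0, 0, 0])
v2 = np.array([-1, 0, -1, 0, 1])
assert np.all(A @ v1 == 0) and np.all(A @ v2 == 0)
```

The nullity computed from the rank agrees with the two free parameters s and t in the general solution.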

Theorem 4.5.3 (Plus/Minus Theorem): Let S be a nonempty set of vectors in a vector space V. (a) If S is a linearly independent set, and if v is a vector in V that is outside of span(S), then the set S U {v} that results by inserting v into S is still linearly independent. (b) If v is a vector in S that is expressible as a linear combination of other vectors in S, and if S - {v} denotes the set obtained by removing v from S, then S and S - {v} span the same space; that is, span(S) = span(S - {v}).

4-5 Example 5: Applying the Plus/Minus Theorem. Show that p1 = 1 - x^2, p2 = 2 - x^2, p3 = x^3 are linearly independent vectors. Solution: S = {p1, p2} is linearly independent, since neither polynomial is a scalar multiple of the other. Because p3 cannot be expressed as a linear combination of the vectors in S, it can be adjoined to S, so S' = {p1, p2, p3} is linearly independent.

Theorem 4.5.4: If V is an n-dimensional vector space, and if S is a set in V with exactly n vectors, then S is a basis for V if either S spans V or S is linearly independent.

4-5 Example 6: Show that v1 = (-3, 7) and v2 = (5, 5) form a basis for R^2 by inspection. Solution: Neither vector is a scalar multiple of the other, so the two vectors form a linearly independent set in the 2-dimensional space R^2, and hence form a basis by Theorem 4.5.4. Similarly, show that v1 = (2, 0, -1), v2 = (4, 0, 7), v3 = (-1, 1, 4) form a basis for R^3 by inspection.
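A determinant gives the same "by inspection" conclusion numerically: n vectors form a basis for R^n exactly when the matrix having them as columns has nonzero determinant (Theorem 4.8.4 later in this lecture). A NumPy sketch, not part of the slides; the R^3 vectors are our reading of the garbled slide, following the standard textbook example:

```python
import numpy as np

# The two vectors in R^2 from Example 6
v1, v2 = (-3, 7), (5, 5)
d2 = np.linalg.det(np.column_stack([v1, v2]))      # -15 - 35 = -50, nonzero

# The three vectors in R^3 (assumed reconstruction of the slide's example)
w1, w2, w3 = (2, 0, -1), (4, 0, 7), (-1, 1, 4)
d3 = np.linalg.det(np.column_stack([w1, w2, w3]))  # -18, nonzero
```

A nonzero determinant confirms linear independence, so by Theorem 4.5.4 each set is a basis.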

Theorem 4.5.5: Let S be a finite set of vectors in a finite-dimensional vector space V. (a) If S spans V but is not a basis for V, then S can be reduced to a basis for V by removing appropriate vectors from S. (b) If S is a linearly independent set that is not already a basis for V, then S can be enlarged to a basis for V by inserting appropriate vectors into S.

Theorem 4.5.6: If W is a subspace of a finite-dimensional vector space V, then: (a) W is finite-dimensional. (b) dim(W) <= dim(V). (c) W = V if and only if dim(W) = dim(V).

Chapter Content: Dimension; Change of Basis; Row Space, Column Space, and Null Space; Rank, Nullity, and the Fundamental Matrix Spaces.

4.6 Change of Basis. If S = {v1, v2, ..., vn} is a basis for a finite-dimensional vector space V, if v = c1 v1 + c2 v2 + ... + cn vn is the expression for a vector v in terms of the basis S, and if (v)_S = (c1, c2, ..., cn) is the coordinate vector of v relative to S, then the mapping v -> (v)_S creates a connection (a one-to-one correspondence) between vectors in the general vector space V and vectors in the familiar vector space R^n, called the coordinate map from V to R^n.

The Change-of-Basis Problem: if we change the basis for a vector space V from an old basis B to a new basis B', how is the old coordinate vector (v)_B of a vector v related to the new coordinate vector (v)_B'?



Solution of the Change-of-Basis Problem: the old and new coordinate vectors are related by (v)_B = P (v)_B', where the columns of P are the coordinate vectors of the new basis vectors relative to the old basis, P = [ [u1']_B | [u2']_B | ... | [un']_B ]. The matrix P is called the transition matrix from B' to B.


4-6 Example 1: Finding Transition Matrices. Consider the bases B = {u1, u2} and B' = {u1', u2'}, where u1 = (1, 0), u2 = (0, 1), u1' = (1, 1), u2' = (2, 1). (a) Find the transition matrix from B' to B. (b) Find the transition matrix from B to B'.

4-6 Example 1: Finding Transition Matrices (cont.). Writing the new basis vectors in terms of the old gives u1' = u1 + u2 and u2' = 2u1 + u2, so the transition matrix from B' to B is
P = [ 1 2
      1 1 ]
Inverting, the transition matrix from B to B' is
P^-1 = [ -1  2
          1 -1 ]
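The two transition matrices of Example 1 can be checked numerically. A NumPy sketch (not part of the original slides):

```python
import numpy as np

# Old basis B = {u1, u2} and new basis B' = {u1', u2'} from Example 1
u1,  u2  = np.array([1, 0]), np.array([0, 1])
u1p, u2p = np.array([1, 1]), np.array([2, 1])

# Columns of the transition matrix from B' to B are the B-coordinates of
# u1' and u2'; since B is the standard basis, these are the vectors themselves.
P = np.column_stack([u1p, u2p])    # [[1, 2], [1, 1]]

# The transition matrix from B to B' is the inverse of P
P_inv = np.linalg.inv(P)           # [[-1, 2], [1, -1]]

# Check: the vector with B'-coordinates (1, 1) is u1' + u2',
# and P maps those B'-coordinates to its B-coordinates.
assert np.allclose(P @ np.array([1, 1]), u1p + u2p)
```

Multiplying by P converts new-basis coordinates to old-basis coordinates; multiplying by P^-1 goes the other way.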

Example 2: Computing Coordinate Vectors.

Theorem 4.6.1: If P is the transition matrix from a basis B' to a basis B for a finite-dimensional vector space V, then P is invertible and P^-1 is the transition matrix from B to B'.

Efficient Method for Computing the Transition Matrix: form the partitioned matrix [new basis | old basis] and row reduce until the left block is the identity; the right block is then the transition matrix. Symbolically, [B' | B] row reduces to [I | transition matrix from B to B'].


Chapter Content: Dimension; Change of Basis; Row Space, Column Space, and Null Space; Rank, Nullity, and the Fundamental Matrix Spaces.

4.7 Definition: For an m x n matrix
A = [ a11 a12 ... a1n
      a21 a22 ... a2n
      ...
      am1 am2 ... amn ]
the vectors r1 = [a11 a12 ... a1n], r2 = [a21 a22 ... a2n], ..., rm = [am1 am2 ... amn] in R^n formed from the rows of A are called the row vectors of A, and the vectors c1 = [a11; a21; ...; am1], c2 = [a12; a22; ...; am2], ..., cn = [a1n; a2n; ...; amn] in R^m formed from the columns of A are called the column vectors of A.

4-7 Example 1: Row and Column Vectors of a 2 x 3 Matrix. Let
A = [ 2  1 0
      3 -1 4 ]
The row vectors of A are r1 = [2 1 0] and r2 = [3 -1 4], and the column vectors of A are c1 = [2; 3], c2 = [1; -1], and c3 = [0; 4].

4-7 Row Space, Column Space, and Null Space. If A is an m x n matrix, the subspace of R^n spanned by the row vectors of A is called the row space of A, and the subspace of R^m spanned by the column vectors of A is called the column space of A. The solution space of the homogeneous system of equations Ax = 0, which is a subspace of R^n, is called the null space of A.

Systems of Linear Equations: a product Ax can be written as a linear combination of the column vectors of A, namely Ax = x1 c1 + x2 c2 + ... + xn cn. Consequently, a linear system Ax = b is consistent if and only if b is expressible as a linear combination of the column vectors of A; that is, if and only if b is in the column space of A.

Example 2: Let Ax = b be the linear system
[ -1 3  2 ] [x1]   [  1 ]
[  1 2 -3 ] [x2] = [ -9 ]
[  2 1 -2 ] [x3]   [ -3 ]
Show that b is in the column space of A, and express b as a linear combination of the column vectors of A. Solution: Solving the system by Gaussian elimination yields x1 = 2, x2 = -1, x3 = 3. Since the system is consistent, b is in the column space of A. Moreover, it follows that
2 [ -1; 1; 2 ] - [ 3; 2; 1 ] + 3 [ 2; -3; -2 ] = [ 1; -9; -3 ]
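Example 2 can be verified directly: solving Ax = b produces exactly the coefficients that express b as a combination of the columns of A. A NumPy sketch (not part of the slides; the signs in A and b are restored from the standard textbook example):

```python
import numpy as np

# Example 2: is b in the column space of A?
A = np.array([[-1, 3,  2],
              [ 1, 2, -3],
              [ 2, 1, -2]])
b = np.array([1, -9, -3])

# A is square and invertible here, so solving Ax = b gives the unique
# coefficients expressing b as a linear combination of the columns of A.
x = np.linalg.solve(A, b)          # (2, -1, 3)
assert np.allclose(A @ x, b)
```

Consistency of the system is exactly the statement that b lies in the column space.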


General and Particular Solutions. Remark: The vector x0 is called a particular solution of Ax = b. The expression x0 + c1 v1 + ... + ck vk is called the general solution of Ax = b, and the expression c1 v1 + ... + ck vk is called the general solution of Ax = 0. The general solution of Ax = b is thus the sum of any particular solution of Ax = b and the general solution of Ax = 0.

Example 3: General Solution of Ax = b. The solution to the nonhomogeneous system
x1 + 3x2 - 2x3 + 2x5 = 0
2x1 + 6x2 - 5x3 - 2x4 + 4x5 - 3x6 = -1
5x3 + 10x4 + 15x6 = 5
2x1 + 6x2 + 8x4 + 4x5 + 18x6 = 6
is x1 = -3r - 4s - 2t, x2 = r, x3 = -2s, x4 = s, x5 = t, x6 = 1/3. The result can be written in vector form as
x = (0, 0, 0, 0, 0, 1/3) + r(-3, 1, 0, 0, 0, 0) + s(-4, 0, -2, 1, 0, 0) + t(-2, 0, 0, 0, 1, 0)
which is the general solution. The vector x0 = (0, 0, 0, 0, 0, 1/3) is a particular solution of the nonhomogeneous system, and the linear combination xh that follows is the general solution of the homogeneous system.

Theorems 4.7.3 & 4.7.4: Elementary row operations do not change the null space of a matrix. Elementary row operations do not change the row space of a matrix.

Example 4 (from Example 3): Find a basis for the null space of
A = [ 1 3 -2  0 2  0
      2 6 -5 -2 4 -3
      0 0  5 10 0 15
      2 6  0  8 4 18 ]
Solution: The null space of A is the solution space of the homogeneous system
x1 + 3x2 - 2x3 + 2x5 = 0
2x1 + 6x2 - 5x3 - 2x4 + 4x5 - 3x6 = 0
5x3 + 10x4 + 15x6 = 0
2x1 + 6x2 + 8x4 + 4x5 + 18x6 = 0
From the general solution found in Example 3, the vectors
v1 = (-3, 1, 0, 0, 0, 0), v2 = (-4, 0, -2, 1, 0, 0), v3 = (-2, 0, 0, 0, 1, 0)
form a basis for the null space.
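The three basis vectors of Example 4 can be checked mechanically: each must solve Ax = 0, they must be independent, and their count must equal the nullity. A NumPy sketch (not part of the slides; the matrix entries follow the standard textbook example):

```python
import numpy as np

# Coefficient matrix of the homogeneous system in Example 4
A = np.array([[1, 3, -2,  0, 2,  0],
              [2, 6, -5, -2, 4, -3],
              [0, 0,  5, 10, 0, 15],
              [2, 6,  0,  8, 4, 18]])

# Basis vectors for the null space, read off from the general solution
V = np.array([[-3, 1,  0, 0, 0, 0],
              [-4, 0, -2, 1, 0, 0],
              [-2, 0,  0, 0, 1, 0]])

assert np.all(A @ V.T == 0)                         # each vi solves Ax = 0
assert np.linalg.matrix_rank(V) == 3                # the vi are independent
assert A.shape[1] - np.linalg.matrix_rank(A) == 3   # nullity = n - rank = 3
```

Three independent solutions matching a nullity of three confirm that {v1, v2, v3} is a basis.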

Theorem 4.7.5: If a matrix R is in row-echelon form, then the row vectors with the leading 1's (i.e., the nonzero row vectors) form a basis for the row space of R, and the column vectors that contain the leading 1's of the row vectors form a basis for the column space of R.

Example 5: Bases for Row and Column Spaces (using Theorem 4.7.5).

Theorem 4.7.6: If A and B are row-equivalent matrices, then: (a) A given set of column vectors of A is linearly independent if and only if the corresponding column vectors of B are linearly independent. (b) A given set of column vectors of A forms a basis for the column space of A if and only if the corresponding column vectors of B form a basis for the column space of B.

Example 6: Find a Basis for the Row Space by Row Reduction.
A = [  1 -3  4 -2  5  4
       2 -6  9 -1  8  2
       2 -6  9 -1  9  7
      -1  3 -4  2 -5 -4 ]
row reduces to
R = [ 1 -3 4 -2  5  4
      0  0 1  3 -2 -6
      0  0 0  0  1  5
      0  0 0  0  0  0 ]
The nonzero rows of R form a basis for the row space of A. To find a basis for the column space by row reduction, note the correspondence between the pivot columns of R and the columns of A (see Example 7).


Example 7: Basis for the Column Space by Row Reduction.

Example 8 (Basis for a Vector Space Using Row Operations): Find a basis for the space spanned by the vectors v1 = (1, -2, 0, 0, 3), v2 = (2, -5, -3, -2, 6), v3 = (0, 5, 15, 10, 0), v4 = (2, 6, 18, 8, 6). Solution: Write the vectors as row vectors first:
[ 1 -2  0  0 3
  2 -5 -3 -2 6
  0  5 15 10 0
  2  6 18  8 6 ]
Row reduction yields the row-echelon form
[ 1 -2 0 0 3
  0  1 3 2 0
  0  0 1 1 0
  0  0 0 0 0 ]
The nonzero row vectors in this matrix are w1 = (1, -2, 0, 0, 3), w2 = (0, 1, 3, 2, 0), w3 = (0, 0, 1, 1, 0). These vectors form a basis for the row space, and hence a basis for the subspace of R^5 spanned by v1, v2, v3, and v4.
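That the echelon rows of Example 8 span the same space as the original vectors can be checked with ranks: both sets have rank 3, and stacking them together adds no new dimension. A NumPy sketch (not part of the slides):

```python
import numpy as np

# The vectors of Example 8, stacked as rows
M = np.array([[1, -2,  0,  0, 3],
              [2, -5, -3, -2, 6],
              [0,  5, 15, 10, 0],
              [2,  6, 18,  8, 6]])

# The three nonzero row-echelon rows found on the slide
W = np.array([[1, -2, 0, 0, 3],
              [0,  1, 3, 2, 0],
              [0,  0, 1, 1, 0]])

assert np.linalg.matrix_rank(M) == 3                 # span of the vi is 3-dimensional
assert np.linalg.matrix_rank(W) == 3                 # the wi are independent
assert np.linalg.matrix_rank(np.vstack([M, W])) == 3 # same span: rank does not grow
```

Since row operations preserve the row space (Theorem 4.7.4), equal ranks with no growth under stacking confirm {w1, w2, w3} is a basis for span{v1, v2, v3, v4}.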

Remarks: Keeping in mind that A and R may have different column spaces, we cannot find a basis for the column space of A directly from the column vectors of R. However, it follows from Theorem 4.7.6(b) that if we can find a set of column vectors of R that forms a basis for the column space of R, then the corresponding column vectors of A will form a basis for the column space of A. In Example 6, the basis vectors obtained for the column space of A consisted of column vectors of A, but the basis vectors obtained for the row space of A were not all row vectors of A. The transpose of the matrix can be used to solve this problem.

Example 9 (Basis for the Row Space of a Matrix): Find a basis for the row space of
A = [ 1 -2  0  0 3
      2 -5 -3 -2 6
      0  5 15 10 0
      2  6 18  8 6 ]
consisting entirely of row vectors from A. Solution: Transpose A, so that the column vectors of A^T are the row vectors of A; row reduce A^T to locate the columns containing the leading 1's; then transpose again. The corresponding rows of A give a basis for the row space consisting of row vectors from A:
r1 = [1 -2 0 0 3], r2 = [2 -5 -3 -2 6], r4 = [2 6 18 8 6]

Example 10 (Basis and Linear Combinations): (a) Find a subset of the vectors v1 = (1, -2, 0, 3), v2 = (2, -5, -3, 6), v3 = (0, 1, 3, 0), v4 = (2, -1, 4, -7), v5 = (5, -8, 1, 2) that forms a basis for the space spanned by these vectors. Solution (a): Write the vectors as the columns of a matrix and row reduce; call the resulting columns w1, ..., w5. The leading 1's occur in columns 1, 2, and 4, so by Theorem 4.7.6(b) the corresponding columns of the original matrix form a basis for its column space. Thus {v1, v2, v4} is a basis for the space spanned by the given vectors.

Example 10 (cont.): (b) Express each vector not in the basis as a linear combination of the basis vectors. Solution (b): Expressing w3 as a linear combination of w1 and w2, and w5 as a linear combination of w1, w2, and w4, gives
w3 = 2w1 - w2 and w5 = w1 + w2 + w4
We call these the dependency equations. The corresponding relationships among the original vectors are
v3 = 2v1 - v2 and v5 = v1 + v2 + v4
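The dependency equations of Example 10(b) are easy to verify directly, along with the independence of the chosen basis. A NumPy sketch (not part of the slides):

```python
import numpy as np

v1 = np.array([1, -2,  0,  3])
v2 = np.array([2, -5, -3,  6])
v3 = np.array([0,  1,  3,  0])
v4 = np.array([2, -1,  4, -7])
v5 = np.array([5, -8,  1,  2])

# The dependency equations found in Example 10(b)
assert np.array_equal(v3, 2*v1 - v2)
assert np.array_equal(v5, v1 + v2 + v4)

# {v1, v2, v4} really is linearly independent: the 4x3 matrix has rank 3
assert np.linalg.matrix_rank(np.column_stack([v1, v2, v4])) == 3
```

Together these checks show {v1, v2, v4} spans all five vectors and is independent, i.e. it is a basis for span{v1, ..., v5}.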

Basis for Span(S).

Chapter Content: Dimension; Change of Basis; Row Space, Column Space, and Null Space; Rank, Nullity, and the Fundamental Matrix Spaces.

4-8 Dimension and Rank. Theorem 4.8.1: If A is any matrix, then the row space and column space of A have the same dimension. (See Example 6.) Definition: The common dimension of the row space and column space of a matrix A is called the rank of A and is denoted by rank(A). The dimension of the null space of A is called the nullity of A and is denoted by nullity(A).

Example 1: Rank and Nullity of a 4 x 6 Matrix. Find the rank and nullity of the matrix
A = [ -1  2 0  4  5 -3
       3 -7 2  0  1  4
       2 -5 2  4  6  1
       4 -9 2 -4 -4  7 ]
Solution: The reduced row-echelon form of A is
[ 1 0 -4 -28 -37 13
  0 1 -2 -12 -16  5
  0 0  0   0   0  0
  0 0  0   0   0  0 ]
Since there are two nonzero rows, the row space and column space are both two-dimensional, so rank(A) = 2.

Example 1: Rank and Nullity of a 4 x 6 Matrix (cont.). The corresponding system of equations is
x1 - 4x3 - 28x4 - 37x5 + 13x6 = 0
x2 - 2x3 - 12x4 - 16x5 + 5x6 = 0
It follows that the general solution of the system is
x1 = 4r + 28s + 37t - 13u, x2 = 2r + 12s + 16t - 5u, x3 = r, x4 = s, x5 = t, x6 = u
or, in vector form,
(x1, x2, x3, x4, x5, x6) = r(4, 2, 1, 0, 0, 0) + s(28, 12, 0, 1, 0, 0) + t(37, 16, 0, 0, 1, 0) + u(-13, -5, 0, 0, 0, 1)
Because the four vectors on the right form a basis for the solution space, nullity(A) = 4.
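Both halves of Example 1 can be confirmed numerically: the rank, the resulting nullity, and that each of the four vectors read off from the general solution lies in the null space. A NumPy sketch (not part of the slides):

```python
import numpy as np

# The 4x6 matrix of Example 1
A = np.array([[-1,  2, 0,  4,  5, -3],
              [ 3, -7, 2,  0,  1,  4],
              [ 2, -5, 2,  4,  6,  1],
              [ 4, -9, 2, -4, -4,  7]])

rank = np.linalg.matrix_rank(A)    # 2
nullity = A.shape[1] - rank        # 6 - 2 = 4

# The four null-space basis vectors read off from the general solution
V = np.array([[  4,  2, 1, 0, 0, 0],
              [ 28, 12, 0, 1, 0, 0],
              [ 37, 16, 0, 0, 1, 0],
              [-13, -5, 0, 0, 0, 1]])
assert np.all(A @ V.T == 0)        # each vector solves Ax = 0
```

This matches the hand computation: rank 2 and nullity 4.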

Maximum Value for Rank: If A is an m x n matrix, the row vectors lie in R^n and the column vectors lie in R^m, so the row space of A is at most n-dimensional and the column space is at most m-dimensional. Since the row space and column space have the same dimension (the rank of A), we must conclude that the rank of A is at most the smaller of m and n. That is, rank(A) <= min(m, n).

Example 2: If A is a 7 x 4 matrix, the rank of A is at most 4, so the seven row vectors must be linearly dependent. If A is a 4 x 7 matrix, the rank of A is at most 4, so the seven column vectors must be linearly dependent.

Theorems. Theorem 4.8.2 (Dimension Theorem for Matrices): If A is a matrix with n columns, then rank(A) + nullity(A) = n. Theorem 4.8.3: If A is an m x n matrix, then: (a) rank(A) = the number of leading variables in the solution of Ax = 0. (b) nullity(A) = the number of parameters in the general solution of Ax = 0.

Example 3: Sum of Rank and Nullity. The matrix A of Example 1 has 6 columns, so rank(A) + nullity(A) = 6. This is consistent with Example 1, where we showed that rank(A) = 2 and nullity(A) = 4.

Example 4: Number of Parameters in a General Solution. Find the number of parameters in the general solution of Ax = 0 if A is a 5 x 7 matrix of rank 3. Solution: nullity(A) = n - rank(A) = 7 - 3 = 4. Thus, there are four parameters.

Theorem 4.8.4 (Equivalent Statements): If A is an n x n matrix, then the following are equivalent: (a) A is invertible. (b) Ax = 0 has only the trivial solution. (c) The reduced row-echelon form of A is I_n. (d) A is expressible as a product of elementary matrices. (e) Ax = b is consistent for every n x 1 matrix b. (f) Ax = b has exactly one solution for every n x 1 matrix b. (g) det(A) != 0. (h) The column vectors of A are linearly independent. (i) The row vectors of A are linearly independent. (j) The column vectors of A span R^n. (k) The row vectors of A span R^n. (l) The column vectors of A form a basis for R^n. (m) The row vectors of A form a basis for R^n. (n) A has rank n. (o) A has nullity 0.

Theorem 4.8.5: If Ax = b is a consistent linear system of m equations in n unknowns, and if A has rank r, then the general solution of the system contains n - r parameters.

Overdetermined and Underdetermined Systems. Theorem 4.8.6: Let A be an m x n matrix. (a) (Overdetermined case) If m > n, then the linear system Ax = b is inconsistent for at least one vector b in R^m. (b) (Underdetermined case) If m < n, then for each vector b in R^m the linear system Ax = b is either inconsistent or has infinitely many solutions.

Example 5: What can you say about the solutions of an overdetermined system Ax = b of 7 equations in 5 unknowns in which A has rank r = 4? Solution: The system is consistent for some vectors b in R^7, and for any such b the number of parameters in the general solution is n - r = 5 - 4 = 1. What can you say about the solutions of an underdetermined system Ax = b of 5 equations in 7 unknowns in which A has rank r = 4? Solution: The system may be consistent or inconsistent, but if it is consistent for a vector b in R^5, then the general solution has n - r = 7 - 4 = 3 parameters.

Example 6 (m > n): The linear system
x1 - 2x2 = b1
x1 - x2 = b2
x1 + x2 = b3
x1 + 2x2 = b4
x1 + 3x2 = b5
is overdetermined, so it cannot be consistent for all possible values of b1, b2, b3, b4, and b5. Exact conditions under which the system is consistent can be obtained by solving the linear system by Gauss-Jordan elimination, which reduces the augmented matrix to
[ 1 0 | 2b2 - b1
  0 1 | b2 - b1
  0 0 | b3 - 3b2 + 2b1
  0 0 | b4 - 4b2 + 3b1
  0 0 | b5 - 5b2 + 4b1 ]

Example 6 (cont.): Thus, the system is consistent if and only if b1, b2, b3, b4, and b5 satisfy the conditions
2b1 - 3b2 + b3 = 0
3b1 - 4b2 + b4 = 0
4b1 - 5b2 + b5 = 0
or, on solving this homogeneous linear system,
b1 = 5r - 4s, b2 = 4r - 3s, b3 = 2r - s, b4 = r, b5 = s
where r and s are arbitrary.
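The consistency conditions of Example 6 can be tested numerically: Ax = b is consistent exactly when appending b as an extra column does not increase the rank. A NumPy sketch (not part of the slides):

```python
import numpy as np

# Coefficient matrix of the overdetermined system in Example 6
A = np.array([[1, -2],
              [1, -1],
              [1,  1],
              [1,  2],
              [1,  3]])

def consistent(b):
    # Ax = b is consistent iff augmenting with b does not raise the rank
    return np.linalg.matrix_rank(np.column_stack([A, b])) == np.linalg.matrix_rank(A)

# A right-hand side of the stated parametric form (here r = 1, s = 2) is consistent...
r, s = 1, 2
b_good = np.array([5*r - 4*s, 4*r - 3*s, 2*r - s, r, s])
# ...while a generic right-hand side is not
b_bad = np.array([1, 0, 0, 0, 0])

assert consistent(b_good)
assert not consistent(b_bad)
```

This mirrors Theorem 4.8.6(a): with 5 equations in 2 unknowns, most choices of b leave the system inconsistent.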

Four Fundamental Matrix Spaces: Considering a matrix A and its transpose A^T together, there are six vector spaces of interest: the row space of A, the row space of A^T, the column space of A, the column space of A^T, the null space of A, and the null space of A^T. Since transposing converts row vectors to column vectors and vice versa, only four of these are distinct; the fundamental matrix spaces associated with A are the row space of A, the column space of A, the null space of A, and the null space of A^T.

Four Fundamental Matrix Spaces: If A is an m x n matrix, then the row space of A and the null space of A are subspaces of R^n, while the column space of A and the null space of A^T are subspaces of R^m. What is the relationship between the dimensions of these four vector spaces?

Dimensions of the Fundamental Spaces. Theorem 4.8.7: If A is any matrix, then rank(A) = rank(A^T). Suppose that A is an m x n matrix of rank r. Then A^T is an n x m matrix of rank r by Theorem 4.8.7, and nullity(A) = n - r and nullity(A^T) = m - r by Theorem 4.8.2.
Fundamental space: Dimension
Row space of A: r
Column space of A: r
Null space of A: n - r
Null space of A^T: m - r
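These four dimensions can be verified on a concrete matrix, reusing the rank-2 matrix from Example 1 of Section 4.8. A NumPy sketch (not part of the slides):

```python
import numpy as np

# The 4x6, rank-2 matrix from Example 1 of Section 4.8
A = np.array([[-1,  2, 0,  4,  5, -3],
              [ 3, -7, 2,  0,  1,  4],
              [ 2, -5, 2,  4,  6,  1],
              [ 4, -9, 2, -4, -4,  7]])
m, n = A.shape                     # m = 4, n = 6
r = np.linalg.matrix_rank(A)       # 2

assert np.linalg.matrix_rank(A.T) == r   # Theorem 4.8.7: rank(A) = rank(A^T)
assert n - r == 4                        # nullity(A)   = n - r (dim of null space of A)
assert m - r == 2                        # nullity(A^T) = m - r (dim of null space of A^T)
```

The row space and column space each have dimension r = 2, while the two null spaces have dimensions n - r = 4 and m - r = 2, exactly as in the table above.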

Orthogonal Complement.

Example 7.

Geometric Link Between the Fundamental Spaces.

Theorem 4.8.10 (Equivalent Statements): If A is an n x n matrix, then the following are equivalent: (a) A is invertible. (b) Ax = 0 has only the trivial solution. (c) The reduced row-echelon form of A is I_n. (d) A is expressible as a product of elementary matrices. (e) Ax = b is consistent for every n x 1 matrix b. (f) Ax = b has exactly one solution for every n x 1 matrix b. (g) det(A) != 0. (h) The column vectors of A are linearly independent. (i) The row vectors of A are linearly independent. (j) The column vectors of A span R^n. (k) The row vectors of A span R^n. (l) The column vectors of A form a basis for R^n. (m) The row vectors of A form a basis for R^n. (n) A has rank n. (o) A has nullity 0. (p) The orthogonal complement of the null space of A is R^n. (q) The orthogonal complement of the row space of A is {0}.