Lecture Notes: Rank of a Matrix

Yufei Tao
Department of Computer Science and Engineering
Chinese University of Hong Kong
taoyf@cse.cuhk.edu.hk

1 Linear Independence

Definition 1. Let r_1, r_2, ..., r_m be vectors of the same dimensionality. Given real values c_1, ..., c_m, the vector

    c_1 r_1 + c_2 r_2 + ... + c_m r_m

is called a linear combination of r_1, r_2, ..., r_m.

Definition 2. Let r_1, r_2, ..., r_m be vectors of the same dimensionality. If we can find real values c_1, ..., c_m such that (i) c_1, ..., c_m are not all zero, and (ii)

    c_1 r_1 + c_2 r_2 + ... + c_m r_m = 0,

then we say that r_1, ..., r_m are linearly dependent. Otherwise, r_1, ..., r_m are linearly independent.

Example 1. Consider the vectors r_1 = [1, 2], r_2 = [0, 1], r_3 = [3, 4] (commas are added between the entries so that you can see more easily that they are distinct numbers). The vectors r_1, r_2, and r_3 are linearly dependent because 3r_1 - 2r_2 - r_3 = 0. On the other hand, r_1 and r_2 are linearly independent because the equation

    c_1 r_1 + c_2 r_2 = 0    (1)

has the unique solution c_1 = c_2 = 0.
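To see the claims of Example 1 numerically, here is a minimal sketch (not part of the original notes) assuming NumPy is available; it uses numpy.linalg.matrix_rank, which returns the number of linearly independent rows of a matrix.

    import numpy as np

    # The vectors of Example 1.
    r1 = np.array([1, 2])
    r2 = np.array([0, 1])
    r3 = np.array([3, 4])

    # r1, r2, r3 are linearly dependent: 3*r1 - 2*r2 - r3 is the zero vector.
    print(3 * r1 - 2 * r2 - r3)                            # [0 0]

    # A set of vectors is linearly independent exactly when the matrix that
    # stacks them as rows has rank equal to the number of vectors.
    print(np.linalg.matrix_rank(np.vstack([r1, r2, r3])))  # 2 (< 3, so dependent)
    print(np.linalg.matrix_rank(np.vstack([r1, r2])))      # 2 (= 2, so independent)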

2 Rank of a Matrix

Definition 3. The rank of a matrix A, denoted as rank A, is the maximum number of linearly independent row vectors of A.

Example 2. Consider the matrix

    A = \begin{bmatrix} 1 & 2 \\ 0 & 1 \\ 3 & 4 \end{bmatrix}.

Let r_i (i \in [1, 3]) be the i-th row of A. The rank of A cannot be 3 because we have seen in Example 1 that r_1, r_2, and r_3 are linearly dependent. On the other hand, we already know that r_1 and r_2 are linearly independent. It follows that 2 is the maximum number of row vectors of A that are linearly independent. Therefore, rank A = 2.

The above example shows a method for calculating the rank of a matrix. However, the method is not easy to apply when the matrix has large dimensions. Next, we will give an alternative method for rank computation that is much easier to use.

Let A and B both be m × n matrices. Recall that A and B are said to be row-equivalent if we can convert A to B by applying the following elementary row operations:

1. Switch two rows of A.
2. Multiply all numbers of a row of A by the same non-zero value.
3. Let r_i and r_j be two row vectors of A. Update row r_i to r_i + r_j.

In what follows, we refer to the above as Operations 1, 2, and 3, respectively.

Lemma 1. If A and B are row-equivalent, then they have the same rank.

Proof. See the appendix.

Example 3. We already know from Example 2 that the following matrix has rank 2:

    \begin{bmatrix} 1 & 2 \\ 0 & 1 \\ 3 & 4 \end{bmatrix}.

By Lemma 1, all of the following matrices also have rank 2:

    \begin{bmatrix} 1 & 2 \\ 0 & 3 \\ 3 & 4 \end{bmatrix}, \quad \begin{bmatrix} 1 & 5 \\ 0 & 3 \\ 3 & 4 \end{bmatrix}, \quad \text{and} \quad \begin{bmatrix} 1 & 5 \\ 0 & 3 \\ 3 & 1 \end{bmatrix}.

We say that a row of a matrix is a non-zero row if the row contains at least one non-zero value. Then, we have the following fact:

Lemma 2. If matrix A is in row echelon form, then rank A is the number of non-zero rows of A.
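The three operations and Lemma 1 can be sanity-checked numerically. Below is a minimal sketch (not part of the original notes, assuming NumPy): each operation applied to the matrix of Example 2 leaves the rank unchanged.

    import numpy as np

    A = np.array([[1, 2],
                  [0, 1],
                  [3, 4]], dtype=float)   # the matrix of Example 2

    def op1_swap(M, i, j):
        """Operation 1: switch rows i and j."""
        M = M.copy(); M[[i, j]] = M[[j, i]]; return M

    def op2_scale(M, i, c):
        """Operation 2: multiply row i by a non-zero value c."""
        M = M.copy(); M[i] *= c; return M

    def op3_add(M, i, j):
        """Operation 3: update row i to row i + row j."""
        M = M.copy(); M[i] += M[j]; return M

    for B in (A, op1_swap(A, 0, 2), op2_scale(A, 1, 3), op3_add(A, 0, 1)):
        print(np.linalg.matrix_rank(B))   # prints 2 for every matrix

Note, for instance, that op2_scale(A, 1, 3) yields the first matrix shown in Example 3.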

The proof is simple and left to you as an exercise.

Example 4. The ranks of the following matrices are 2 and 3, respectively:

    \begin{bmatrix} 1 & 2 & 3 \\ 0 & 1 & 3 \\ 0 & 0 & 0 \end{bmatrix}, \quad \text{and} \quad \begin{bmatrix} 0 & 1 & 3 & 0 \\ 0 & 0 & 3 & 0 \\ 0 & 0 & 0 & 3 \\ 0 & 0 & 0 & 0 \end{bmatrix}.

Lemmas 1 and 2 suggest the following approach to compute the rank of a matrix A: first, convert A to a matrix A' in row echelon form, and then count the number of non-zero rows of A'.

Example 5. Next, we use this approach to calculate the rank of the matrix in Example 2 (in the derivation below, \rightarrow indicates applying elementary row operations):

    \begin{bmatrix} 1 & 2 \\ 0 & 1 \\ 3 & 4 \end{bmatrix} \rightarrow \begin{bmatrix} 1 & 2 \\ 0 & 1 \\ 0 & -2 \end{bmatrix} \rightarrow \begin{bmatrix} 1 & 2 \\ 0 & 1 \\ 0 & 0 \end{bmatrix}.

The last matrix is in row echelon form and has 2 non-zero rows, so the rank is 2.

Example 6. Compute the rank of the following matrix:

    \begin{bmatrix} 1 & 2 & 3 & 4 \\ 5 & 6 & 7 & 8 \\ 3 & 2 & 1 & 0 \end{bmatrix}.

Solution.

    \begin{bmatrix} 1 & 2 & 3 & 4 \\ 5 & 6 & 7 & 8 \\ 3 & 2 & 1 & 0 \end{bmatrix} \rightarrow \begin{bmatrix} 1 & 2 & 3 & 4 \\ 0 & -4 & -8 & -12 \\ 0 & -4 & -8 & -12 \end{bmatrix} \rightarrow \begin{bmatrix} 1 & 2 & 3 & 4 \\ 0 & -4 & -8 & -12 \\ 0 & 0 & 0 & 0 \end{bmatrix}.

Hence, the original matrix has rank 2.

Lemma 3. Suppose that r_1, r_2, ..., r_k are linearly independent, but r_1, r_2, ..., r_{k+1} are linearly dependent. Then, r_{k+1} must be a linear combination of r_1, r_2, ..., r_k.

Proof. Since r_1, r_2, ..., r_{k+1} are linearly dependent, there exist c_1, ..., c_{k+1} such that (i) they are not all zero, and (ii)

    c_1 r_1 + c_2 r_2 + ... + c_k r_k + c_{k+1} r_{k+1} = 0.

Note that c_{k+1} cannot be 0. Otherwise, it would follow that c_1 r_1 + c_2 r_2 + ... + c_k r_k = 0 where c_1, ..., c_k are not all zero (because c_1, ..., c_{k+1} are not all zero and c_{k+1} = 0), meaning that r_1, ..., r_k were linearly dependent, which is a contradiction. Now that c_{k+1} ≠ 0, we have:

    r_{k+1} = -(c_1 / c_{k+1}) r_1 - (c_2 / c_{k+1}) r_2 - ... - (c_k / c_{k+1}) r_k.

Therefore, r_{k+1} is a linear combination of r_1, r_2, ..., r_k.
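The echelon-based approach translates directly into code. The sketch below is not part of the original notes; it assumes NumPy and uses floating-point arithmetic, so a small tolerance stands in for "exactly zero". It reduces a matrix to row echelon form with elementary row operations and counts the non-zero rows.

    import numpy as np

    def rank_by_row_echelon(A, tol=1e-9):
        """Reduce A to row echelon form, then count the non-zero rows
        (which equals the rank, by Lemmas 1 and 2)."""
        M = np.array(A, dtype=float)
        rows, cols = M.shape
        pivot_row = 0
        for col in range(cols):
            # Find a row at or below pivot_row with a non-zero entry in this column.
            candidates = np.where(np.abs(M[pivot_row:, col]) > tol)[0]
            if candidates.size == 0:
                continue
            r = pivot_row + candidates[0]
            M[[pivot_row, r]] = M[[r, pivot_row]]      # Operation 1: swap rows
            # Zero out the entries below the pivot (Operations 2 and 3 combined).
            for i in range(pivot_row + 1, rows):
                M[i] -= (M[i, col] / M[pivot_row, col]) * M[pivot_row]
            pivot_row += 1
            if pivot_row == rows:
                break
        return int(np.sum(np.any(np.abs(M) > tol, axis=1)))

    print(rank_by_row_echelon([[1, 2], [0, 1], [3, 4]]))                    # 2 (Example 5)
    print(rank_by_row_echelon([[1, 2, 3, 4], [5, 6, 7, 8], [3, 2, 1, 0]]))  # 2 (Example 6)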

Lemma 3 implies that, if a matrix has rank k, then it has only k "effective" rows, in the following sense: take any k linearly independent rows; every other row, together with those k rows, forms a linearly dependent set (otherwise the rank would exceed k), and hence, by Lemma 3, can be derived as a linear combination of those k rows. For instance, consider the matrix in Example 6. We know that its rank is 2, and that the first two rows are linearly independent. Thus, we must be able to represent the 3rd row as a linear combination of the first two. Indeed, this is true:

    [3, 2, 1, 0] = (-2) [1, 2, 3, 4] + [5, 6, 7, 8].

3 An Important Property of Ranks

In this section, we will prove a non-trivial lemma about ranks.

Lemma 4. The rank of a matrix A is the same as the rank of A^T.

Proof. (Sketch) Define the column-rank of A to be the maximum number of linearly independent column vectors of A. Note that the column-rank of A is exactly the same as the rank of A^T. Hence, to prove the lemma, it suffices to show that the rank of A is the same as the column-rank of A. We first show:

Fact 1: If A is in row echelon form, then the rank of A cannot exceed its column-rank.

Fact 2: Elementary row operations on A do not change its column-rank.

The proofs of the above facts are left to you as an exercise, but here are the hints. For Fact 1: assume that A has rank k; take k columns appropriately and prove that they must be linearly independent. For Fact 2: it can be proved in the same style as the proof of Lemma 1.

Since elementary row operations on A do not change its rank either, combining both facts shows that the rank of A is at most its column-rank. On the other hand, reversing the above argument (applying it to A^T) shows that the column-rank of A is at most its rank. With this, we complete the proof of the lemma.

Example 7. Let A be the matrix of Example 6:

    A = \begin{bmatrix} 1 & 2 & 3 & 4 \\ 5 & 6 & 7 & 8 \\ 3 & 2 & 1 & 0 \end{bmatrix}.

Compute the rank of A^T.

Solution. From Example 6, we know that the rank of A is 2. Lemma 4 tells us that the rank of A^T must also be 2.
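Both claims above can be checked numerically for the matrix of Examples 6 and 7. This is a minimal sketch, not part of the original notes, assuming NumPy:

    import numpy as np

    A = np.array([[1, 2, 3, 4],
                  [5, 6, 7, 8],
                  [3, 2, 1, 0]])

    # The 3rd row is a linear combination of the first two: (-2)*row1 + row2.
    print(np.array_equal(-2 * A[0] + A[1], A[2]))                  # True

    # Lemma 4: A and its transpose have the same rank.
    print(np.linalg.matrix_rank(A), np.linalg.matrix_rank(A.T))    # 2 2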

Appendix: Proof of Lemma 1

Let us assume that B is obtained from A by applying one elementary row operation. To prove the lemma, it suffices to show that A and B have the same rank. Let r_1, r_2, ..., r_m be the row vectors of A.

Case 1: Operation 1 was applied. In this case, B has exactly the same row vectors as A. Hence, A and B have the same rank.

Case 2: Operation 2 was applied. Let R be an arbitrary non-empty subset of {r_1, r_2, ..., r_m}. Define R' to be the set of corresponding rows in B (the correspondence here means: R' includes the i-th row of B if and only if R includes the i-th row of A). We will prove:

Claim: R is linearly dependent if and only if R' is linearly dependent.

The claim implies that A and B have the same rank. Without loss of generality, assume that the row operation multiplies the first row r_1 of A by a non-zero value c. In other words, the row vectors of B are cr_1, r_2, ..., r_m. If r_1 is not in R, then R' = R, in which case our claim obviously holds. It remains to consider the case where r_1 is in R. Define k = |R|. Without loss of generality, assume that R = {r_1, r_2, ..., r_k}. The set of corresponding rows in B is R' = {cr_1, r_2, ..., r_k}.

If there exist α_1, ..., α_k such that (i) they are not all zero, and (ii) it holds that

    α_1 r_1 + α_2 r_2 + ... + α_k r_k = 0,

then

    (α_1 / c) (cr_1) + α_2 r_2 + ... + α_k r_k = 0.

In other words, we can find β_1, ..., β_k (namely β_1 = α_1 / c and β_i = α_i for i ≥ 2) such that (i) they are not all zero, and (ii) β_1 (cr_1) + β_2 r_2 + ... + β_k r_k = 0. Hence, R being linearly dependent implies R' being linearly dependent. Reversing the above argument shows that R' being linearly dependent implies R being linearly dependent.

Case 3: Operation 3 was applied. The proof of this case is similar to the proof of Case 2, and is left to you as an exercise.
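Before attempting the exercise, it may help to see the Claim of Case 2 checked numerically. The sketch below is not part of the original notes and assumes NumPy; it scales one row of the matrix from Example 2 and verifies that every subset of rows keeps its dependence status.

    import numpy as np
    from itertools import combinations

    A = np.array([[1, 2],
                  [0, 1],
                  [3, 4]], dtype=float)

    B = A.copy()
    B[0] *= 5.0        # Operation 2: multiply the first row by a non-zero value

    def dependent(rows):
        """A set of vectors is linearly dependent iff the rank of the stacked
        matrix is smaller than the number of vectors."""
        return np.linalg.matrix_rank(rows) < len(rows)

    # The Claim of Case 2: a subset R of rows of A is linearly dependent
    # if and only if the corresponding subset R' of rows of B is.
    for size in (1, 2, 3):
        for idx in combinations(range(3), size):
            print(idx, dependent(A[list(idx)]) == dependent(B[list(idx)]))  # all True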