NOTES ON MATRICES OF FULL COLUMN (ROW) RANK

Shayle R. Searle
Biometrics Unit, Cornell University, Ithaca, N.Y. 14853

BU-1361-M                                                August 1996

ABSTRACT

A useful left (right) inverse of a full column (row) rank matrix is the Moore-Penrose inverse; and linear equations based on a full column (row) rank matrix have exactly one solution (many solutions). These solutions are characterized.

Key words: left inverse, right inverse, Moore-Penrose inverse, solution to linear equations.

LEFT INVERSES

A matrix A of order r x c can have a left inverse L, such that LA = I, only if r >= c; and it does have left inverses if, and only if, it has full column rank, rA = c. Then, for given A, there can be many values of L, one of which is A+ = (A'A)^-1 A', the Moore-Penrose inverse of A. For

    A = [ 2  3 ]
        [ 4  1 ]
        [ 2 -2 ],

    A'A = [ 24   6 ],   (A'A)^-1 = (1/300) [ 14  -6 ],
          [  6  14 ]                       [ -6  24 ]

and

    A+ = (A'A)^-1 A' = (1/300) [ 14  -6 ] [ 2  4  2 ] = (1/30) [ 1  5  4 ].
                               [ -6  24 ] [ 3  1 -2 ]          [ 6  0 -6 ]

The conditions for A+ being the Moore-Penrose inverse of A are easily verified: first, that A+A is symmetric, which it is because A+A = I; then that AA+A = A and A+AA+ = A+; and finally that

    AA+ = (1/30) [  20  10 -10 ]
                 [  10  20  10 ]
                 [ -10  10  20 ],

which is symmetric.

RIGHT INVERSES

The existence of right inverses is very much (but not entirely) the converse of the situation for left inverses. A of order r x c can have right inverses R, with AR = I, only if c >= r; and it does have them if, and only if, A has full row rank, rA = r. One such right inverse is A'(AA')^-1, the Moore-Penrose inverse of A of full row rank.

MOORE-PENROSE INVERSE

An interesting point here is that the expressions for the Moore-Penrose inverse of A are not the same in the preceding two cases:

    A+  = (A'A)^-1 A'   for A of full column rank,      (1)

and

    A++ = A'(AA')^-1    for A of full row rank.         (2)
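These verifications are easy to reproduce numerically. The following is a minimal sketch (Python with NumPy assumed; the language and library are choices of convenience, not part of the note), using the 3 x 2 example above:

    import numpy as np

    A = np.array([[2.0, 3.0],
                  [4.0, 1.0],
                  [2.0, -2.0]])              # the 3 x 2 example; rank 2 = full column rank

    # Moore-Penrose inverse from the full-column-rank formula A+ = (A'A)^-1 A'
    Aplus = np.linalg.inv(A.T @ A) @ A.T
    print(np.allclose(30 * Aplus, [[1, 5, 4], [6, 0, -6]]))   # matches the hand computation

    # The four Penrose conditions
    print(np.allclose(A @ Aplus @ A, A))                      # A A+ A = A
    print(np.allclose(Aplus @ A @ Aplus, Aplus))              # A+ A A+ = A+
    print(np.allclose((A @ Aplus).T, A @ Aplus))              # A A+ symmetric
    print(np.allclose(Aplus @ A, np.eye(2)))                  # A+ A = I, hence symmetric

    print(np.allclose(Aplus, np.linalg.pinv(A)))              # agrees with NumPy's pinv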

Result (2) is derived from (1) by noting that for A of full row rank, A' has full column rank. Therefore (1) gives (A')+ as (AA')^-1 A. But the transpose of (A')+ is A+, so giving (2). Note, too, that the two inverses (A'A)^-1 and (AA')^-1 are not the same, not even of the same order; moreover, for rectangular A, when one of them exists the other does not. Thus (A'A)^-1 exists when A has full column rank but (AA')^-1 does not; and vice versa for A of full row rank.

A general expression for the Moore-Penrose inverse AM of A is (Searle, 1982, p. 216)

    AM = A'(AA')^- A (A'A)^- A',

where (AA')^- is any generalized inverse of AA' satisfying just the first of the Penrose conditions, AA'(AA')^- AA' = AA'. It is of interest to see how this reduces to A+ for A of full column rank. Write A = HT with

    H = [ I ],
        [ K ]

for some non-singular T and some K (permuting the rows of A, if need be, so that its first c rows are linearly independent). Then AA' = H(TT')H', and pre- and post-multiplying AA'(AA')^- AA' = AA' by (H'H)^-1 H' and H(H'H)^-1, respectively, gives

    (TT') H'(AA')^- H (TT') = TT',   i.e.,   H'(AA')^- H = (TT')^-1.

Therefore

    A'(AA')^- A = T'H'(AA')^- HT = T'(TT')^-1 T = I,

and AM reduces to A+, as it should:

    AM = I (A'A)^- A' = (A'A)^-1 A',

because A having full column rank makes A'A non-singular. Similarly, AM reduces to A++ when A has full row rank.
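That the general expression is insensitive to which generalized inverses are chosen can also be checked numerically. Below is a small sketch (Python with NumPy; the construction of the g-inverses is one convenient choice among many), using a matrix of neither full row nor full column rank:

    import numpy as np

    rng = np.random.default_rng(0)

    # A 4 x 3 matrix of rank 2: neither full row rank nor full column rank
    A = rng.standard_normal((4, 2)) @ rng.standard_normal((2, 3))

    def g_inv(S, W):
        # Returns G with S G S = S (first Penrose condition only);
        # W is arbitrary, so different W give different g-inverses.
        Sp = np.linalg.pinv(S)
        return Sp + (np.eye(S.shape[0]) - Sp @ S) @ W

    G1 = g_inv(A @ A.T, rng.standard_normal((4, 4)))   # a g-inverse of AA'
    G2 = g_inv(A.T @ A, rng.standard_normal((3, 3)))   # a g-inverse of A'A

    AM = A.T @ G1 @ A @ G2 @ A.T                       # AM = A'(AA')^- A (A'A)^- A'
    print(np.allclose(AM, np.linalg.pinv(A)))          # True: AM is the Moore-Penrose inverse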

LINEAR EQUATIONS WITH A FULL COLUMN RANK MATRIX

Consider linear equations Ax = y for known A, of full column rank c, and known y. The general solution is

    x̃ = A-y + (A-A - I)z

for arbitrary z; and, through the arbitrariness of z, it generates all possible solutions (loc. cit., Chapter 9). But, with A of full column rank, pre-multiplying AA-A = A by (A'A)^-1 A' gives A-A = I; and so there is only one solution,

    x̃ = A-y,

the same for all generalized inverses A-.

One might well ask: what happens if the first c rows of A (of full column rank) are not linearly independent? Then T^-1 would not exist. This can be circumvented by using B = PA, where P is a permutation matrix, so that B is A with its rows permuted to have the first c rows of B linearly independent. Then, as with A-A = I in the preceding paragraph, we now have B-B = I. But with B = PA, A = P'B (because P is orthogonal) and A- = B-P, and so

    A-A = B-PP'B = B-B = I.

What is sometimes puzzling about the x̃ = A-y solution is why Ax̃ = AA-y equals y. It is not because AA- is an identity matrix, since that is not so. One approach is that, without knowing x̃, we do know that y = Ax for some x. Therefore

    Ax̃ = AA-y = AA-Ax = Ax = y.

Fortunately, one does not have to rely on the existence of T^-1 for calculating an A-. The full column rank property of A ensures the existence of (A'A)^-1, and so A- = A+ = (A'A)^-1 A' is the easiest calculation, and x̃ = (A'A)^-1 A'y is the solution of Ax = y for full column rank A.
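A minimal sketch of this calculation (Python with NumPy; the matrix is that of the worked example below, and y is an illustrative value of my own choosing, constructed as Ax so that the equations are consistent):

    import numpy as np

    A = np.array([[9.0, 2.0],
                  [4.0, 1.0],
                  [7.0, 3.0]])                   # full column rank
    y = A @ np.array([2.0, -3.0])                # illustrative y, consistent by construction

    # The easiest generalized inverse: A- = A+ = (A'A)^-1 A'
    Aplus = np.linalg.inv(A.T @ A) @ A.T
    x = Aplus @ y
    print(np.allclose(A @ x, y))                 # True: x solves Ax = y
    print(np.allclose(x, [2.0, -3.0]))           # True: the solution is unique

    # A different generalized inverse, A- = [T^-1  0], gives the same solution
    T = A[:2, :]                                 # first two rows, non-singular here
    Aminus = np.hstack([np.linalg.inv(T), np.zeros((2, 1))])
    print(np.allclose(Aminus @ y, x))            # True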

For

    A = [ 9  2 ]
        [ 4  1 ]
        [ 7  3 ],

the solution using T^-1 is

    x̃ = [ T^-1  0 ] y = [  1 -2  0 ] y,
                        [ -4  9  0 ]

and with A+ = (A'A)^-1 A' the solution is

    x̃ = A+y = [ 146  43 ]^-1 [ 9  4  7 ] y = (1/195) [  14  -43 ] [ 9  4  7 ] y = (1/195) [  40  13  -31 ] y,
               [  43  14 ]    [ 2  1  3 ]            [ -43  146 ] [ 2  1  3 ]             [ -95 -26  137 ]

the same as before, for every y for which the equations are consistent.

LINEAR EQUATIONS WITH A FULL ROW RANK MATRIX

Consider Ax = y with A of full row rank, represented as

    A = [ R  RQ ]

for R non-singular of order r. On partitioning x conformably with the partitioning of A we write

    x = [ x1 ]
        [ x2 ]

and get

    [ R  RQ ] [ x1 ] = y,   i.e.,   Rx1 + RQx2 = y,
              [ x2 ]

so that

    x̃1 = R^-1 y - Qx2.

Thus, for any x2, the solution

    x̃ = [ x̃1 ]
         [ x2 ]

satisfies Ax̃ = y. Since Q = R^-1(RQ), and RQ is simply the notation for the columns of A beyond the r columns of the non-singular R, it is more useful to write

    x̃1 = R^-1 (y - RQx2).

Thus, in a numerical example with r = 2 and c = 3, this gives

    x̃1 = R^-1 (y - RQx2) = [ -9 +  5x2 ],
                            [ 27 - 13x2 ]

so that x̃ = (-9 + 5x2, 27 - 13x2, x2)' satisfies Ax̃ = y for every value of x2.
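A corresponding sketch for the full-row-rank case (Python with NumPy; the 2 x 3 matrix and y here are illustrative values, coinciding with the permuted matrix C and the y of the final example below), using the general-solution form quoted earlier with A- = A++:

    import numpy as np

    A = np.array([[1.0, 3.0, 1.0],
                  [2.0, 7.0, 2.0]])              # full row rank: rank 2 = number of rows
    y = np.array([8.0, 18.0])

    # Moore-Penrose inverse for full row rank: A++ = A'(AA')^-1
    App = A.T @ np.linalg.inv(A @ A.T)
    print(np.allclose(A @ App, np.eye(2)))       # True: A A++ = I, a right inverse

    # General solution x = A++ y + (A++ A - I) z, one solution for each z
    rng = np.random.default_rng(1)
    for _ in range(3):
        z = rng.standard_normal(3)
        x = App @ y + (App @ A - np.eye(3)) @ z
        print(np.allclose(A @ x, y))             # True for every z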

If A is such that its first r columns are not linearly independent, then re-sequence the columns to have C = AP for P a permutation matrix, with C having its first r columns linearly independent. Then, because Ax = y is APP'x = y which, with z = P'x, is Cz = y, obtain z̃ as a solution of Cz = y and from that get x̃ = Pz̃ as the solution of Ax = y. For example,

    C = [ 1  3  1 ] = [ 1  1  3 ] [ 1  0  0 ]
        [ 2  7  2 ]   [ 2  2  7 ] [ 0  0  1 ] = AP,
                                  [ 0  1  0 ]

where the first two columns of A are linearly dependent but the first two columns of C are not. Then, with y = (8, 18)',

    z̃1 = [ 1  3 ]^-1 ( [  8 ] - [ 1 ] z2 ) = [  7 -3 ] [  8 -  z2 ] = [ 2 - z2 ],
         [ 2  7 ]     ( [ 18 ]   [ 2 ]    )  [ -2  1 ] [ 18 - 2z2 ]   [ 2      ]

so that

    z̃ = [ 2 - z2 ]   and   x̃ = Pz̃ = [ 2 - z2 ]
         [ 2      ]                   [ z2     ]
         [ z2     ]                   [ 2      ].

REFERENCE

Searle, Shayle R. (1982). Matrix Algebra Useful for Statistics. Wiley, New York.