Matrix Inverses                                              November 9, 2014

The Inverse of a Matrix

Now that we have discussed how to multiply two matrices, we can finally have a proper discussion of what we mean by the expression A^{-1} for a matrix A. First, consider the case of a real number a. In the strictest mathematical sense, when we talk about dividing by a number a, we really mean multiplying by the inverse a^{-1}. In addition, the number a^{-1} is defined as the number which satisfies the equations

    a a^{-1} = a^{-1} a = 1.

This leads us to use the following definition for matrix inverses:

Definition 1. If A is an n x n matrix, then the inverse matrix A^{-1} is the matrix which satisfies

    A A^{-1} = I_n,    A^{-1} A = I_n.

If such a matrix exists, we say that A is invertible.

There are some things to note about this definition. First, the definition can only work if we assume that the matrix A is square. Otherwise, we would not be able to change the order of multiplication the way we do in the above expressions and still have something that makes sense. Second, we note that it is possible to define inverses for non-square matrices. Based on the first comment, however, we see that if A isn't a square matrix, the thing we use as the inverse on the left would generally be different from the matrix we use as the inverse on the right. Thus, in such scenarios, we do not have one
unique inverse. Because of this, we avoid the situation by working only with square matrices.

Once we know we are working with a square matrix, how do we find its inverse? The answer comes from the following theorem.

Theorem 1. An n x n matrix A is invertible if and only if A is row equivalent to I_n. In this case, any sequence of elementary row operations that reduces A to I_n also transforms I_n into A^{-1}.

This theorem is quite useful because it gives us a procedure for finding inverses: if a sequence of elementary row operations reduces a matrix A to I_n, then applying those same row operations to I_n produces A^{-1}. A convenient way of doing this is the following algorithm, which we illustrate with an example.

Finding Inverses

To find the inverse of a matrix A, we must perform on the identity matrix I_n the same row operations that transform A into I_n. To do both simultaneously, we can simply augment A with I_n and then row reduce, as is shown in the following example.

Example 1. Find the inverse of the matrix

    A = [ 1  2 ]
        [ 3  0 ]

The procedure for finding the inverse consists of the following three steps.

Step 1: Augment the given matrix with the identity matrix:

    [ A | I ] = [ 1  2 | 1  0 ]
                [ 3  0 | 0  1 ]

Step 2: Row reduce to put the resulting matrix in reduced echelon form:

    [ 1  2 | 1  0 ]   R2 -> R2 - 3R1   [ 1  2 |  1    0   ]
    [ 3  0 | 0  1 ]   ------------->   [ 0 -6 | -3    1   ]

                      R2 -> -(1/6)R2   [ 1  2 |  1    0   ]
                      ------------->   [ 0  1 | 1/2 -1/6  ]

                      R1 -> R1 - 2R2   [ 1  0 |  0   1/3  ]
                      ------------->   [ 0  1 | 1/2 -1/6  ]

Step 3: Read the inverse matrix from the right-hand side of the resulting augmented matrix:

    A^{-1} = [  0   1/3  ]
             [ 1/2 -1/6  ]
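The three-step procedure above can be sketched in code. This is a minimal illustration, not part of the original notes; the function name `invert` and the use of exact fractions are my own choices.

```python
from fractions import Fraction

def invert(A):
    """Invert a square matrix by augmenting with I and row reducing
    (the three-step procedure described above)."""
    n = len(A)
    # Step 1: augment A with the identity matrix I_n.
    M = [
        [Fraction(A[i][j]) for j in range(n)]
        + [Fraction(1) if i == j else Fraction(0) for j in range(n)]
        for i in range(n)
    ]
    # Step 2: row reduce [A | I] to reduced echelon form [I | A^-1].
    for col in range(n):
        # Find a row at or below the diagonal with a nonzero pivot.
        pivot = next((r for r in range(col, n) if M[r][col] != 0), None)
        if pivot is None:
            raise ValueError("matrix is not invertible")
        M[col], M[pivot] = M[pivot], M[col]
        # Scale the pivot row so the pivot entry becomes 1.
        p = M[col][col]
        M[col] = [entry / p for entry in M[col]]
        # Eliminate the pivot column from every other row.
        for r in range(n):
            if r != col and M[r][col] != 0:
                factor = M[r][col]
                M[r] = [x - factor * y for x, y in zip(M[r], M[col])]
    # Step 3: read A^-1 from the right-hand side.
    return [row[n:] for row in M]

# Example 1: A = [[1, 2], [3, 0]] gives A^-1 = [[0, 1/3], [1/2, -1/6]].
A_inv = invert([[1, 2], [3, 0]])
```

Note that the function raises an error when no nonzero pivot can be found; as discussed later in these notes, not every square matrix has an inverse.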
Example 2. Let

    B = [ 1  2  1 ]
        [ 0  1  1 ]
        [ 0  2  1 ]

Find B^{-1}.

Solution: As before, we first augment B with the identity matrix to obtain

    [ 1  2  1 | 1  0  0 ]
    [ 0  1  1 | 0  1  0 ]
    [ 0  2  1 | 0  0  1 ]

We then row reduce:

    [ 1  2  1 | 1  0  0 ]   R1 -> R1 - 2R2   [ 1  0 -1 | 1 -2  0 ]
    [ 0  1  1 | 0  1  0 ]   R3 -> R3 - 2R2   [ 0  1  1 | 0  1  0 ]
    [ 0  2  1 | 0  0  1 ]   ------------->   [ 0  0 -1 | 0 -2  1 ]

                            R3 -> -R3        [ 1  0 -1 | 1 -2  0 ]
                            ------------->   [ 0  1  1 | 0  1  0 ]
                                             [ 0  0  1 | 0  2 -1 ]

                            R1 -> R1 + R3    [ 1  0  0 | 1  0 -1 ]
                            R2 -> R2 - R3    [ 0  1  0 | 0 -1  1 ]
                            ------------->   [ 0  0  1 | 0  2 -1 ]

Now that we have obtained the identity matrix I_3 on the left side of this large matrix, we can read the inverse off from the right side:

    B^{-1} = [ 1  0 -1 ]
             [ 0 -1  1 ]
             [ 0  2 -1 ]

Properties of Inverses

Inverse matrices have several properties which will be of use to us later. These are the content of the following theorem.

Theorem 2. Assume the matrices A and B are invertible. Then the following equations hold:

    1. (A^{-1})^{-1} = A
    2. (AB)^{-1} = B^{-1} A^{-1}
    3. (A^T)^{-1} = (A^{-1})^T
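The properties in Theorem 2 can be checked numerically. The sketch below verifies properties 2 and 3 using the matrix A and its inverse from Example 1, together with a second invertible 2 x 2 matrix B chosen for illustration (its inverse was computed by hand; det B = -1). The helper names `matmul` and `transpose` are my own.

```python
from fractions import Fraction

def matmul(X, Y):
    """Multiply two square matrices of the same size."""
    n = len(X)
    return [[sum(Fraction(X[i][k]) * Y[k][j] for k in range(n))
             for j in range(n)] for i in range(n)]

def transpose(X):
    """Transpose a matrix."""
    return [list(row) for row in zip(*X)]

I2 = [[1, 0], [0, 1]]

# A and A^-1 from Example 1; B is an illustrative invertible matrix.
A     = [[1, 2], [3, 0]]
A_inv = [[Fraction(0), Fraction(1, 3)], [Fraction(1, 2), Fraction(-1, 6)]]
B     = [[1, 2], [1, 1]]
B_inv = [[-1, 2], [1, -1]]

# Property 2: (AB)^-1 = B^-1 A^-1, i.e. (B^-1 A^-1)(AB) = I.
assert matmul(matmul(B_inv, A_inv), matmul(A, B)) == I2
# Property 3: (A^T)^-1 = (A^-1)^T, i.e. (A^-1)^T A^T = I.
assert matmul(transpose(A_inv), transpose(A)) == I2
```

Note the order reversal in property 2: the check multiplies B^{-1} A^{-1}, not A^{-1} B^{-1}, against AB.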
Throughout this discussion, we have avoided talking about whether or not a matrix will actually have an inverse. We touched on the issue briefly at the beginning when we said that matrices must be square to have an inverse. However, this requirement only tells us that a matrix may have an inverse; it does not guarantee that it does. In fact, the algorithm described above does not guarantee the existence of inverses either. For example, try applying the procedure to the matrix

    A = [ 1  2 ]
        [ 3  6 ]

You will see that it does not work. The problem here is not that the algorithm fails to give us the inverse in this case. Rather, the matrix A simply does not have an inverse. This raises the following question: given a matrix A, how do we know if it actually has an inverse? To answer this question, we must introduce a new idea, known as the determinant.

Inverses and Determinants

Fortunately for us, there is a simple test to determine whether a square matrix has an inverse. This test involves an operation on matrices called the determinant of the matrix. It is, unfortunately, a tedious computation. However, as we will see, it gives us a definitive answer as to whether the inverse matrix exists or not. The way to compute the determinant of a matrix depends on its size, and so we begin with 2 x 2 matrices.

Definition 2. Let A be the 2 x 2 matrix

    A = [ a  b ]
        [ c  d ]

Then the determinant of A, denoted det A, is given by

    det A = ad - bc.

Example 3. Let A be the matrix

    A = [ 1  2 ]
        [ 3  6 ]

Then det A = (1)(6) - (2)(3) = 0.
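Definition 2 is a one-line computation. The sketch below (the name `det2` is mine) evaluates it for the matrix that broke the inversion procedure above, and for the matrix of Example 1.

```python
def det2(M):
    """Determinant of a 2 x 2 matrix [[a, b], [c, d]]: ad - bc."""
    (a, b), (c, d) = M
    return a * d - b * c

# The matrix on which the inversion procedure fails has determinant 0.
assert det2([[1, 2], [3, 6]]) == 0
# The matrix from Example 1, by contrast, has nonzero determinant.
assert det2([[1, 2], [3, 0]]) == -6
```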
Example 4. Let B be the matrix

    B = [ 1  2 ]
        [ 1  1 ]

Then det B = (1)(1) - (2)(1) = -1.

Now, let us move on to the other case we will consider, the 3 x 3 matrices. In this case, we compute the determinant by reducing the computation to a combination of smaller determinants in the following way. Suppose we have a 3 x 3 matrix A given by

    A = [ a  b  c ]
        [ d  e  f ]
        [ g  h  i ]

Let A_11 be the portion of this matrix obtained by deleting the first row and the first column:

    A_11 = [ e  f ]
           [ h  i ]

Let A_12 be the portion of the matrix obtained by deleting the first row and second column:

    A_12 = [ d  f ]
           [ g  i ]

Finally, let A_13 be the portion obtained by deleting the first row and third column:

    A_13 = [ d  e ]
           [ g  h ]

Then the determinant of A is given by

    det A = a det A_11 - b det A_12 + c det A_13.

Example 5. Let A be given by

    A = [ 1  0  1 ]
        [ 2  2  1 ]
        [ 1  2  3 ]
As in the explanation above, we compute the submatrices

    A_11 = [ 2  1 ]    A_12 = [ 2  1 ]    A_13 = [ 2  2 ]
           [ 2  3 ]           [ 1  3 ]           [ 1  2 ]

Then

    det A = 1 det A_11 - 0 det A_12 + 1 det A_13 = 1(4) - 0(5) + 1(2) = 6.

Now that we understand determinants, we can finally answer the question: how do we know if a matrix has an inverse? The answer comes in the form of the following theorem.

Theorem 3. Let A be a square matrix. Then A has an inverse if and only if det A ≠ 0.

Looking back at the matrices A and B in Examples 3 and 4, we see that A does not have an inverse (as we stated earlier), while B does. Thus, applying the procedure for finding inverses to B will work, while applying it to A will not.

Solving Systems of Equations

Now that we understand how matrix inverses work, we can use them to solve systems of equations. Consider the system

    x_1 + 2 x_2 = 2
    3 x_1       = 6

Based on earlier discussions, we saw that we can write this as Ax = b, where

    A = [ 1  2 ],   x = [ x_1 ],   b = [ 2 ]
        [ 3  0 ]        [ x_2 ]        [ 6 ]

Since det A = (1)(0) - (2)(3) = -6 ≠ 0, Theorem 3 tells us that A is invertible. In fact, we have computed its inverse above, in Example 1:

    A^{-1} = [  0   1/3  ]
             [ 1/2 -1/6  ]
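As a check on the determinant computations, the cofactor expansion can be sketched in code; `det2` and `det3` are my own names, and the test values come from Example 5 and from the coefficient matrix A above.

```python
def det2(M):
    """Determinant of a 2 x 2 matrix: ad - bc (Definition 2)."""
    (a, b), (c, d) = M
    return a * d - b * c

def det3(M):
    """Determinant of a 3 x 3 matrix by cofactor expansion along the
    first row: det A = a det A_11 - b det A_12 + c det A_13."""
    (a, b, c), (d, e, f), (g, h, i) = M
    A11 = [[e, f], [h, i]]  # delete row 1, column 1
    A12 = [[d, f], [g, i]]  # delete row 1, column 2
    A13 = [[d, e], [g, h]]  # delete row 1, column 3
    return a * det2(A11) - b * det2(A12) + c * det2(A13)

# Example 5: det A = 1(4) - 0(5) + 1(2) = 6, so this A is invertible.
assert det3([[1, 0, 1], [2, 2, 1], [1, 2, 3]]) == 6
# The coefficient matrix of the system: det A = -6, nonzero as claimed.
assert det2([[1, 2], [3, 0]]) == -6
```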
We can then solve the system by using the formula

    x = A^{-1} b = [  0   1/3  ] [ 2 ] = [ 2 ]
                   [ 1/2 -1/6  ] [ 6 ]   [ 0 ]

That is, x_1 = 2 and x_2 = 0, and one can check directly that these values satisfy both equations of the original system.
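The computation x = A^{-1} b above can be carried out as a short matrix-vector multiplication. This is a minimal sketch with names of my own choosing; A^{-1} is taken from Example 1.

```python
from fractions import Fraction

def matvec(M, v):
    """Multiply a matrix M by a column vector v."""
    return [sum(Fraction(M[i][j]) * v[j] for j in range(len(v)))
            for i in range(len(M))]

# The system x1 + 2 x2 = 2, 3 x1 = 6 in matrix form Ax = b,
# with A^-1 as computed in Example 1.
A     = [[1, 2], [3, 0]]
b     = [2, 6]
A_inv = [[Fraction(0), Fraction(1, 3)], [Fraction(1, 2), Fraction(-1, 6)]]

x = matvec(A_inv, b)      # x = A^-1 b
assert x == [2, 0]        # x1 = 2, x2 = 0
assert matvec(A, x) == b  # check: Ax really does equal b
```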