INVERSE OF A MATRIX [2.2]


The inverse of a matrix: Introduction

We have a mapping from R^n to R^n represented by a matrix A. Can we invert this mapping? That is, can we find a matrix (call it B for now) such that when B is applied to Ax the result is x?

x --(product by A)--> Ax --(product by B)--> x

Example: a blurring operation. We want to revert the blurring, i.e., to deblur. So: blurring is A; deblurring is B. B is the inverse of A and is denoted by A^-1. 8-2 Text: 2.2 Inverse

Recall that I_n x = x for all x. Since we want A^-1 (Ax) = x for all x, we need to have A^-1 A = I_n. Naturally the inverse of A^-1 should be A itself, so we also want A A^-1 = I_n. Finding an inverse of A is not always possible; when it is possible, we say that the matrix A is invertible. Next: the details.

The inverse of a matrix

An n x n matrix A is said to be invertible if there is an n x n matrix B such that BA = I and AB = I, where I = I_n, the n x n identity matrix. In this case, B is an inverse of A. In fact, B is uniquely determined by A: if C were another inverse of A, then

C = CI = C(AB) = (CA)B = IB = B

This unique inverse is denoted by A^-1, so that A A^-1 = A^-1 A = I.

Matrix inverse - the 2 x 2 case

Let A = [a b; c d]. If ad - bc != 0, then A is invertible and

A^-1 = 1/(ad - bc) [d -b; -c a]

Verify the result. If ad - bc = 0, then A is not invertible (it does not have an inverse). The quantity ad - bc is called the determinant of A, written det(A). The above says that a 2 x 2 matrix is invertible if and only if det(A) != 0.
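The 2 x 2 formula can be sketched directly in Python; the standard-library fractions module keeps the arithmetic exact (the function name inv2x2 and the sample matrix are my own, not from the notes):

```python
from fractions import Fraction

def inv2x2(a, b, c, d):
    """Inverse of [[a, b], [c, d]] via the ad - bc formula.
    Raises ValueError when the determinant is zero (A not invertible)."""
    det = Fraction(a * d - b * c)
    if det == 0:
        raise ValueError("matrix is not invertible: det = 0")
    return [[d / det, -b / det], [-c / det, a / det]]

# A = [[2, 4], [1, 3]] has det = 2*3 - 4*1 = 2, so it is invertible.
Ainv = inv2x2(2, 4, 1, 3)
print(Ainv)  # [[Fraction(3, 2), Fraction(-2, 1)], [Fraction(-1, 2), Fraction(1, 1)]]
```

Multiplying A by the result (in either order) gives I, which is a quick sanity check on the sign pattern [d -b; -c a].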

Matrix inverse - Properties

Theorem. If A is invertible, then for each b in R^n, the equation Ax = b has the unique solution x = A^-1 b.

Proof: Take any b in R^n. A solution exists because if A^-1 b is substituted for x, then Ax = A(A^-1 b) = (A^-1 A)b = Ib = b. So A^-1 b is a solution. To prove that the solution is unique, show that if u is any solution, then u must equal A^-1 b. If Au = b, we can multiply both sides by A^-1 and obtain A^-1 Au = A^-1 b, then Iu = A^-1 b, and u = A^-1 b.

Show: if A is invertible then it is one-to-one, i.e., its columns are linearly independent.
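As a quick illustration of the theorem, with a small invertible example of my own choosing, the solution of Ax = b can be computed as x = A^-1 b and checked by substitution:

```python
from fractions import Fraction

# Solve Ax = b for A = [[2, 4], [1, 3]], b = [2, 1], via x = A^-1 b.
a, b_, c, d = 2, 4, 1, 3
det = Fraction(a * d - b_ * c)          # det = 2, nonzero, so A is invertible
Ainv = [[d / det, -b_ / det], [-c / det, a / det]]

rhs = [2, 1]
x = [Ainv[0][0] * rhs[0] + Ainv[0][1] * rhs[1],
     Ainv[1][0] * rhs[0] + Ainv[1][1] * rhs[1]]
print(x)  # [Fraction(1, 1), Fraction(0, 1)] -> x = (1, 0)

# Check: substituting x back gives Ax = b, so x really is a solution.
assert [a * x[0] + b_ * x[1], c * x[0] + d * x[1]] == rhs
```

The uniqueness part of the theorem says any other solution method (e.g. elimination) must produce this same x.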

Matrix inverse - Properties

a. If A is an invertible matrix, then A^-1 is invertible and (A^-1)^-1 = A.
b. If A and B are n x n invertible matrices, then so is AB, and (AB)^-1 = B^-1 A^-1.
c. If A is an invertible matrix, then so is A^T, and the inverse of A^T is the transpose of A^-1: (A^T)^-1 = (A^-1)^T.

Common notation: (A^T)^-1 = A^-T.
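Properties (b) and (c) are easy to spot-check numerically with 2 x 2 matrices, using the inversion formula from the previous slide (the helper names inv2, mul2, tr2 and the sample matrices are mine):

```python
from fractions import Fraction

def inv2(M):
    (a, b), (c, d) = M
    det = Fraction(a * d - b * c)
    return [[d / det, -b / det], [-c / det, a / det]]

def mul2(X, Y):
    return [[sum(X[i][k] * Y[k][j] for k in range(2)) for j in range(2)]
            for i in range(2)]

def tr2(M):
    return [[M[0][0], M[1][0]], [M[0][1], M[1][1]]]

A = [[2, 4], [1, 3]]   # det = 2
B = [[1, 2], [3, 5]]   # det = -1

# (AB)^-1 = B^-1 A^-1  (note the reversed order)
assert inv2(mul2(A, B)) == mul2(inv2(B), inv2(A))
# (A^T)^-1 = (A^-1)^T, often written A^-T
assert inv2(tr2(A)) == tr2(inv2(A))
print("both properties verified")
```

Reversing the order in (b) matters: in general mul2(inv2(A), inv2(B)) is not the inverse of AB.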

Existence of the inverse and related properties

Our next goal is to prove the following theorem.

Existence Theorem. The following four statements are equivalent:
(1) A is invertible
(2) The columns of A are linearly independent
(3) The span of the columns of A is R^n
(4) rref(A) is the identity matrix

Elementary matrices

Consider the matrix on the right and call it E:

E =
1 0 0 0
0 1 0 0
r 0 1 0
0 0 0 1

What is the result of the product EA for some matrix A? Can this operation result in a change of the linear independence of the columns of A? [Prove or disprove.]

Consider now the matrix P, obtained by swapping rows 2 and 4 of I:

P =
1 0 0 0
0 0 0 1
0 0 1 0
0 1 0 0

Same questions as above. Matrices like E (an elementary elimination matrix) and P (a permutation matrix) are called elementary matrices.
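A small experiment, with sample values of my own, shows what E and P do to a matrix A (matmul and identity are my helper names):

```python
from fractions import Fraction

def matmul(X, Y):
    return [[sum(X[i][k] * Y[k][j] for k in range(len(Y))) for j in range(len(Y[0]))]
            for i in range(len(X))]

def identity(n):
    return [[1 if i == j else 0 for j in range(n)] for i in range(n)]

r = Fraction(-1, 2)
E = identity(4); E[2][0] = r               # r in row 3, column 1, as on the slide
P = identity(4); P[1], P[3] = P[3], P[1]   # swap rows 2 and 4 of I

A = [[1, 2], [3, 4], [5, 6], [7, 8]]       # any 4 x n matrix works

print(matmul(E, A))  # row 3 becomes row3 + r*row1; the other rows are unchanged
print(matmul(P, A))  # rows 2 and 4 are exchanged
```

Both E and P are invertible (subtract r times row 1 back, resp. swap the rows again), which is the key to why they cannot change the linear independence of the columns.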

Elimination algorithms and elementary matrices

We will show the following: Gaussian elimination, Gauss-Jordan, reduction to echelon form, and reduction to reduced row echelon form are all based on multiplying the original matrix on the left by a sequence of elementary matrices. Each of these transformations preserves the linear independence of the columns of the original matrix. An elementary matrix is one that is obtained by performing a single elementary row operation on an identity matrix.

Let us revisit Gaussian elimination. Recommended: compare with the lecture note example in Section 1.1.

Recall: Gaussian Elimination

Consider the example seen in Section 1.1. Step 1 must transform

2 4 4  2
1 3 1  1
1 5 6 -6

into

x x x x
0 x x x
0 x x x

row2 := row2 - 1/2 row1:

2 4 4  2
0 1 -1  0
1 5 6 -6

row3 := row3 - 1/2 row1:

2 4 4  2
0 1 -1  0
0 3 4 -7

The first transformation (row2 := row2 - 1/2 row1) is equivalent to performing this product:

[  1   0 0 ]   [ 2 4 4  2 ]   [ 2 4 4  2 ]
[-1/2  1 0 ] x [ 1 3 1  1 ] = [ 0 1 -1  0 ]
[  0   0 1 ]   [ 1 5 6 -6 ]   [ 1 5 6 -6 ]

Similarly, the operation on row 3 is equivalent to the product:

[  1   0 0 ]   [ 2 4 4  2 ]   [ 2 4 4  2 ]
[  0   1 0 ] x [ 0 1 -1  0 ] = [ 0 1 -1  0 ]
[-1/2  0 1 ]   [ 1 5 6 -6 ]   [ 0 3 4 -7 ]

Hint: use the row-wise form of the matrix products. The matrix on the left is called an elementary elimination matrix. Do the same thing for the 2nd (and last) step of GE.
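The elimination steps above, plus the last step of GE (row3 := row3 - 3 row2), can be checked as left-multiplications by elimination matrices. A minimal pure-Python sketch with exact fractions; for brevity it works on the 3 x 3 coefficient part of the example only, and the matmul helper is mine:

```python
from fractions import Fraction

def matmul(X, Y):
    return [[sum(X[i][k] * Y[k][j] for k in range(len(Y))) for j in range(len(Y[0]))]
            for i in range(len(X))]

A = [[2, 4, 4], [1, 3, 1], [1, 5, 6]]   # coefficient part of the example

E1 = [[1, 0, 0], [Fraction(-1, 2), 1, 0], [0, 0, 1]]   # row2 := row2 - 1/2 row1
E2 = [[1, 0, 0], [0, 1, 0], [Fraction(-1, 2), 0, 1]]   # row3 := row3 - 1/2 row1
E3 = [[1, 0, 0], [0, 1, 0], [0, -3, 1]]                # row3 := row3 - 3 row2 (last GE step)

U = matmul(E3, matmul(E2, matmul(E1, A)))
print(U)  # upper triangular, equal to [[2, 4, 4], [0, 1, -1], [0, 0, 7]]
```

Reading the products row-wise, each E_i touches exactly one row of its right factor, which is the "row-wise form" hint on the slide.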

Another type of elementary matrix: Permutations

We used these in partial pivoting. A permutation matrix is a matrix obtained from the identity matrix by permuting its rows. For example, for the permutation p = {3, 1, 4, 2} we obtain

P =
0 0 1 0
1 0 0 0
0 0 0 1
0 1 0 0

Important observation: the matrix PA is obtained from A by permuting its rows with the permutation p:

(PA)_i,: = A_p(i),:
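The construction of P from p, and the row-permutation identity, can be sketched in a few lines of Python (perm_matrix and matmul are my helper names; p is taken 1-based, as on the slide):

```python
def perm_matrix(p):
    """Permutation matrix for a 1-based permutation p, e.g. p = [3, 1, 4, 2]:
    row i of P is row p[i] of the identity, so (P A)[i] = A[p[i]]."""
    n = len(p)
    return [[1 if j == p[i] - 1 else 0 for j in range(n)] for i in range(n)]

def matmul(X, Y):
    return [[sum(X[i][k] * Y[k][j] for k in range(len(Y))) for j in range(len(Y[0]))]
            for i in range(len(X))]

p = [3, 1, 4, 2]
P = perm_matrix(p)
A = [[10, 11], [20, 21], [30, 31], [40, 41]]
PA = matmul(P, A)

# (PA)_i,: = A_p(i),: for every row i
for i in range(4):
    assert PA[i] == A[p[i] - 1]
print(PA)  # [[30, 31], [10, 11], [40, 41], [20, 21]]
```

The assertion inside the loop is exactly the slide's observation: row i of PA is row p(i) of A.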

What does this mean? It means, for example, that the 3rd row of PA is simply row number p(3), which is 4, of the original matrix A:

3rd row of PA equals p(3)-th row of A

Why is this true? What can you say of the j-th column of AP? What is the matrix PA when

P =
0 0 1 0
1 0 0 0
0 0 0 1
0 1 0 0

A =
1 2 3 4
5 6 7 8
9 0 1 2
3 4 5 6 ?

Back to elementary matrices

Do the elementary matrices E_1, E_2, ..., E_{n-1} (including permutations) change the linear independence of the columns?

Prove: if u, v, w (3 columns of A) are independent, then the columns E_1 u, E_1 v, E_1 w are independent, where E_1 is an elementary matrix (an elimination matrix or a permutation matrix).

So (*very important*): elimination operations (Gaussian elimination, Gauss-Jordan, reduction to echelon form, and reduction to rref) preserve the linear independence of the columns.

Consequence: Gaussian elimination with partial pivoting cannot fail when the columns of A are linearly independent.

Conclusion: when A has independent columns, the linear system Ax = b always has a unique solution.

Proof: The only way in which GE can fail is that the algorithm reaches a system with a matrix U that is upper triangular down to some step k but has no usable pivot in column k (the entries of column k in positions k through n are all zero). Then we can find a solution to Ux = 0 with x not the zero vector: take x_k = 1 and x_i = 0 for i > k, then back-solve to get x_{k-1}, x_{k-2}, ..., x_2, x_1 (in the example, k = 4). Done. Therefore the columns of U are dependent, and so the columns of the original A must be dependent too. The argument is a little simpler with Gauss-Jordan. We reach the same conclusion using Gauss-Jordan with pivoting (permute below row k!).

Theorem: Let A be an n x n matrix. Then the columns of A are linearly independent iff its reduced echelon form is the identity matrix.

The only way in which rref(A) != I is by having at least one free variable. Form the augmented system [A, 0]. Set this free variable to one (and the other free variables to zero) and solve for the basic variables. Result: a nontrivial solution to the system Ax = 0. Contradiction.

Conversely, if rref(A) = I, then the columns of A are independent, since the elementary operations do not alter linear independence.

Theorem: Let A be an n x n matrix. Then A has independent columns if and only if A is invertible.

From the previous theorem, A can be reduced to the identity matrix by the reduced echelon form procedure. There are elementary matrices E_1, E_2, ..., E_p such that

E_p E_{p-1} ... E_2 E_1 A = I

Call C the matrix E_p E_{p-1} ... E_1. Then CA = I, so A has a left inverse. It also has a right inverse X (such that AX = I), because any system Ax = b has a solution (Gaussian elimination will not fail, since the columns are linearly independent).

Therefore we can solve Ax_i = e_i, where e_i is the i-th column of I. For X = [x_1, x_2, ..., x_n] this gives AX = I. Finally, X = C because

C = CI = C(AX) = (CA)X = IX = X

Conversely, let A be invertible. Its columns are linearly independent iff (by definition) Ax = 0 implies x = 0, and this is trivially true, as can be seen by multiplying Ax = 0 on the left by A^-1.

Q: Can we now prove the Existence Theorem?

Existence Theorem. The following four statements are equivalent:
(1) A is invertible
(2) The columns of A are linearly independent
(3) The span of the columns of A is R^n
(4) rref(A) is the identity matrix

We have proved (1) iff (2) and also (2) iff (4). It is easy to show (2) => (3) and then (3) => (4). Is this enough to prove the theorem?

The most important result to remember is:

A invertible  <=>  rref(A) = I  <=>  columns of A linearly independent

Proof: (3) => (4). As was seen before, (3) implies that there is a pivot in every row. Since the matrix is n x n, the only possible rref echelon matrix of this type is I.

(2) => (3). Proof by contradiction. Assume A has linearly independent columns, and assume that some system Ax = b does not have a solution. Then [A, b] has a reduced row echelon form in which the b column contains a pivot. So there is a zero row in the A part of the echelon matrix. This means we have at least one free variable, so the system Ax = 0 has nontrivial solutions. Contradiction.

Computing the inverse

Q: How do I compute the inverse of a matrix A?
A: Two common strategies [not necessarily the best]:
- using the reduced row echelon form;
- solving the n systems Ax_i = e_i for i = 1, ..., n.

How to use the echelon form? We could record the product of the E_i's, as suggested by one of the previous theorems. Too complicated!

Instead, perform the echelon-form reduction on the augmented matrix [A, I]. Assuming A is invertible, the result is of the form [I, C]. The inverse is C. Explain why. What will happen if A is not invertible?
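A sketch of this [A, I] -> [I, C] procedure in pure Python, using exact rational arithmetic and partial pivoting (the function name invert and the choice of sample matrix, the coefficient matrix from the Gaussian-elimination example, are my own):

```python
from fractions import Fraction

def invert(A):
    """Invert A by row reducing [A | I] to [I | C]; returns C.
    Raises ValueError when A is not invertible (no pivot can be found)."""
    n = len(A)
    # build the augmented matrix [A | I] with exact rational entries
    M = [[Fraction(A[i][j]) for j in range(n)] +
         [Fraction(1 if i == j else 0) for j in range(n)] for i in range(n)]
    for col in range(n):
        # partial pivoting: pick the largest entry in the column
        piv = max(range(col, n), key=lambda r: abs(M[r][col]))
        if M[piv][col] == 0:
            raise ValueError("matrix is not invertible")
        M[col], M[piv] = M[piv], M[col]
        M[col] = [x / M[col][col] for x in M[col]]   # scale pivot row to 1
        for r in range(n):                           # eliminate the column elsewhere
            if r != col and M[r][col] != 0:
                M[r] = [x - M[r][col] * y for x, y in zip(M[r], M[col])]
    return [row[n:] for row in M]

A = [[2, 4, 4], [1, 3, 1], [1, 5, 6]]
C = invert(A)

# check: A C = I, so C really is the inverse
for i in range(3):
    for j in range(3):
        assert sum(A[i][k] * C[k][j] for k in range(3)) == (1 if i == j else 0)
print("A C = I verified")
```

If A is not invertible, some column runs out of nonzero pivot candidates and the ValueError branch fires; that is the "what will happen" case of the slide.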

Example: Compute the inverse of a 3 x 3 matrix A. Solution: first form the augmented matrix [A, I], then compute its reduced echelon form [I, C]. The inverse is C. [The slide's matrix entries did not survive transcription.]