EECS 16A Designing Information Devices and Systems I Fall 2018 Lecture Notes Note 6

6.1 Introduction: Matrix Inversion

In the last note, we considered a system of pumps and reservoirs where the water in each reservoir is represented as a vector and the pumps, represented as a matrix, act on the reservoirs to move water into a new state. If we know the current state of the reservoirs, $\vec{v}[t]$, and we know the state transition matrix describing the pumps, $A$, we can find the water at the next time step through matrix-vector multiplication:

$$\vec{v}[t+1] = A\,\vec{v}[t]$$

However, suppose we'd like to find the water in the reservoirs at a previous timestep, $\vec{v}[t-1]$. Is there a state transition matrix $B$ that can take us backwards in time?

$$\vec{v}[t-1] = B\,\vec{v}[t]$$

It turns out that the matrix that undoes the effects of $A$ is its inverse! In this note, we'll define matrix inverses, introduce some of their properties, and investigate when matrix inverses exist (and when they don't).

6.1.1 Definition and properties of matrix inverses

Definition 6.1 (Inverse): A square matrix $A$ is said to be invertible if there exists a matrix $B$ such that

$$AB = BA = I, \qquad (1)$$

where $I$ is the identity matrix. In this case, we call the matrix $B$ the inverse of the matrix $A$, which we denote as $A^{-1}$.

Example 6.1 (Matrix inverse): Consider the $2 \times 2$ matrix $A = \begin{bmatrix} 1 & 1 \\ 2 & 1 \end{bmatrix}$. Then $A^{-1} = \begin{bmatrix} -1 & 1 \\ 2 & -1 \end{bmatrix}$. We can verify that the following holds:

$$AA^{-1} = \begin{bmatrix} 1 & 1 \\ 2 & 1 \end{bmatrix} \begin{bmatrix} -1 & 1 \\ 2 & -1 \end{bmatrix} = \begin{bmatrix} 1 & 0 \\ 0 & 1 \end{bmatrix} \qquad (2)$$

$$A^{-1}A = \begin{bmatrix} -1 & 1 \\ 2 & -1 \end{bmatrix} \begin{bmatrix} 1 & 1 \\ 2 & 1 \end{bmatrix} = \begin{bmatrix} 1 & 0 \\ 0 & 1 \end{bmatrix}. \qquad (3)$$
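To make Example 6.1 concrete, here is a quick numerical check. This is a sketch, not part of the original note; it assumes Python with NumPy as the tool of choice.

```python
# Verify the inverse claimed in Example 6.1 against Definition 6.1.
import numpy as np

A = np.array([[1.0, 1.0],
              [2.0, 1.0]])

# Ask NumPy for the inverse and compare with the matrix in the example.
A_inv = np.linalg.inv(A)
print(A_inv)  # expect [[-1.  1.], [ 2. -1.]]

# Definition 6.1 requires both AB = I and BA = I.
I = np.eye(2)
print(np.allclose(A @ A_inv, I))  # True
print(np.allclose(A_inv @ A, I))  # True
```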

Let's show an important property of matrix inverses: if $A$ is an invertible matrix, then its inverse must be unique.

Proof. Suppose $B_1$ and $B_2$ are both inverses of the matrix $A$. Then we have

$$AB_1 = B_1 A = I \qquad (4)$$

$$AB_2 = B_2 A = I \qquad (5)$$

Now take the equation

$$AB_1 = I. \qquad (6)$$

Multiplying both sides of the equation by $B_2$ from the left, we have

$$B_2(AB_1) = B_2 I = B_2. \qquad (7)$$

Notice that by associativity of matrix multiplication, the left hand side of the equation above becomes

$$B_2(AB_1) = (B_2 A)B_1 = IB_1 = B_1. \qquad (8)$$

Hence we have

$$B_1 = B_2. \qquad (9)$$

We see that $B_1$ and $B_2$ must be equal, so the inverse of any invertible matrix is unique.

In discussion, you will see a few more useful properties of matrix inverses. Some of the next natural questions to ask are: How do we know whether or not a matrix is invertible? If a matrix is invertible, how do we go about finding its inverse? It turns out Gaussian elimination can help us answer these questions!

6.1.2 Finding Inverses With Gaussian Elimination

A square matrix $M$ and its inverse $M^{-1}$ will always satisfy the conditions $MM^{-1} = I$ and $M^{-1}M = I$, where $I$ is the identity matrix.

Let $M = \begin{bmatrix} 1 & 1 \\ 2 & 1 \end{bmatrix}$ and $M^{-1} = \begin{bmatrix} b_{11} & b_{12} \\ b_{21} & b_{22} \end{bmatrix}$. We want to find the values of $b_{ij}$ such that the equation $MM^{-1} = I$ is satisfied:

$$\begin{bmatrix} 1 & 1 \\ 2 & 1 \end{bmatrix} \begin{bmatrix} b_{11} & b_{12} \\ b_{21} & b_{22} \end{bmatrix} = \begin{bmatrix} 1 & 0 \\ 0 & 1 \end{bmatrix}$$

Just as we did for solving equations of the form $A\vec{x} = \vec{b}$, we can write the above as an augmented matrix, which joins the left and right numerical matrices together and hides the variable matrix, as shown below:

$$\left[\begin{array}{cc|cc} 1 & 1 & 1 & 0 \\ 2 & 1 & 0 & 1 \end{array}\right]$$
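The augmented matrix above is easy to build programmatically. A minimal sketch, again assuming NumPy (the note itself works by hand):

```python
# Build the augmented matrix [M | I] introduced in Section 6.1.2.
import numpy as np

M = np.array([[1.0, 1.0],
              [2.0, 1.0]])

# hstack joins M and the identity side by side, one row per equation.
aug = np.hstack([M, np.eye(2)])
print(aug)
# [[1. 1. 1. 0.]
#  [2. 1. 0. 1.]]
```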

Now, to find the inverse matrix $M^{-1}$ using Gaussian elimination, we have to transform the left numerical matrix (left half of the augmented matrix) into the identity matrix; then the right numerical matrix (right half of the augmented matrix) becomes our solution. In equation form, starting from $MM^{-1} = I$, we are transforming $M$ and $I$ simultaneously using row operations so that the equation becomes $IM^{-1} = A$, where $A$ is the numerical matrix resulting from the Gaussian elimination. Since $M^{-1}$ is multiplied by the identity matrix $I$, the resulting numerical matrix $A$ must equal $M^{-1}$, and we have the values for the elements of our inverse matrix. We will now do the actual computation below:

$$\left[\begin{array}{cc|cc} 1 & 1 & 1 & 0 \\ 2 & 1 & 0 & 1 \end{array}\right] \xrightarrow{R_2 \leftarrow R_2 - 2R_1} \left[\begin{array}{cc|cc} 1 & 1 & 1 & 0 \\ 0 & -1 & -2 & 1 \end{array}\right] \xrightarrow{R_2 \leftarrow -1(R_2)} \left[\begin{array}{cc|cc} 1 & 1 & 1 & 0 \\ 0 & 1 & 2 & -1 \end{array}\right] \xrightarrow{R_1 \leftarrow R_1 - R_2} \left[\begin{array}{cc|cc} 1 & 0 & -1 & 1 \\ 0 & 1 & 2 & -1 \end{array}\right]$$

$M^{-1}$ is the right half of the augmented matrix. Therefore $M^{-1} = \begin{bmatrix} -1 & 1 \\ 2 & -1 \end{bmatrix}$.

More generally, for any $n \times n$ matrix $M$, we can perform Gaussian elimination on the augmented matrix $\left[\, M \mid I_n \,\right]$. If, at termination of Gaussian elimination, we end up with an identity matrix on the left, then the matrix on the right is the inverse of the matrix $M$: $\left[\, I_n \mid M^{-1} \,\right]$. If we don't end up with an identity matrix on the left after running Gaussian elimination, we know that the matrix is not invertible.

Knowing whether a matrix is invertible can tell us about the rows/columns of a matrix, and knowing about the rows/columns can tell us whether a matrix is invertible. Let's look at how.

Additional Resources: For more on matrix inverses, read Strang pages 83-85 and try Problem Set 2.5; or read Schaum's pages 33-34 and try Problems 2.17 to 2.20, 2.54, 2.55, 2.57, and 2.58.
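Before connecting invertibility to rows and columns, here is the $\left[\, M \mid I_n \,\right]$ procedure above written out in code. This is an illustrative from-scratch sketch (our own helper, not official course code; NumPy is assumed for array handling): it row-reduces the augmented matrix and reads the inverse off the right half, returning None when no pivot can be found, i.e., when the matrix is not invertible.

```python
import numpy as np

def invert_via_gaussian_elimination(M, tol=1e-12):
    """Return the inverse of M by row-reducing [M | I], or None if M is singular."""
    M = np.asarray(M, dtype=float)
    n = M.shape[0]
    aug = np.hstack([M, np.eye(n)])  # the augmented matrix [M | I_n]

    for col in range(n):
        # Find a row at or below `col` whose entry in this column is usable
        # (nonzero up to the tolerance) to serve as the pivot.
        pivot_row = next((r for r in range(col, n) if abs(aug[r, col]) > tol), None)
        if pivot_row is None:
            return None  # no pivot in this column: M is not invertible
        # Swap the pivot row into place and scale it so the pivot equals 1.
        aug[[col, pivot_row]] = aug[[pivot_row, col]]
        aug[col] /= aug[col, col]
        # Eliminate every other entry in this column, above and below.
        for r in range(n):
            if r != col:
                aug[r] -= aug[r, col] * aug[col]

    return aug[:, n:]  # left half is now I_n, so the right half is M^{-1}

print(invert_via_gaussian_elimination([[1, 1], [2, 1]]))
# expect [[-1.  1.], [ 2. -1.]], matching the hand computation above
print(invert_via_gaussian_elimination([[1, 1], [2, 2]]))
# None: the rows are linearly dependent
```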

6.2 Connecting invertibility with matrix rows and columns

First let's consider how the rows of the matrix relate to invertibility.

Example 6.2 (Invertibility Intuition: Rows): Suppose we have a black and white image with two pixels. We cannot directly see the shade of each pixel, but we can measure linear combinations of the light the two pixels absorb. Define the light absorbed by the two pixels as $x_1$ and $x_2$. We take the following two measurements, $y_1, y_2$:

$$y_1 = x_1 + x_2$$

$$y_2 = 2x_1 + 2x_2$$

Written in matrix form:

$$\vec{y} = \begin{bmatrix} 1 & 1 \\ 2 & 2 \end{bmatrix} \vec{x} \qquad (10)$$

Can we invert our measurement matrix to solve for the shade of each pixel? No: our second measurement $y_2$ does not provide any new information, since it is linearly dependent with the first measurement. In our matrix representation, each measurement corresponds to a row, so we can guess that we need linearly independent rows to have an invertible matrix.

We won't prove this rigorously, but we can extend this intuition by examining the Gaussian elimination method for finding matrix inverses: if we run Gaussian elimination on a matrix $M$ and do not end up with the identity matrix, this means that the matrix is not invertible. If we don't get the identity matrix, we will have a row of zeros, which indicates that the rows of $M$ are linearly dependent.

Now let's look from the column perspective. Consider $A$ as an operator on any vector $\vec{x} \in \mathbb{R}^n$. What does it mean for $A$ to have an inverse? It suggests that we can find a matrix that "undoes" the effect of matrix $A$ operating on any vector $\vec{x} \in \mathbb{R}^n$. What property should $A$ have in order for this to be possible? $A$ should map any two distinct vectors to distinct vectors in $\mathbb{R}^n$, i.e., $A\vec{x}_1 \neq A\vec{x}_2$ for vectors $\vec{x}_1, \vec{x}_2$ such that $\vec{x}_1 \neq \vec{x}_2$. Consider this example:

Example 6.3 (Invertibility Intuition: Columns): Is the matrix $A = \begin{bmatrix} 1 & 1 \\ 0 & 0 \end{bmatrix}$ invertible? Intuitively, it is not, because $A$ can map two distinct vectors into the same vector:

$$\begin{bmatrix} 1 & 1 \\ 0 & 0 \end{bmatrix} \begin{bmatrix} 1 \\ 3 \end{bmatrix} = 1 \begin{bmatrix} 1 \\ 0 \end{bmatrix} + 3 \begin{bmatrix} 1 \\ 0 \end{bmatrix} = \begin{bmatrix} 4 \\ 0 \end{bmatrix} \qquad (11)$$

$$\begin{bmatrix} 1 & 1 \\ 0 & 0 \end{bmatrix} \begin{bmatrix} 2 \\ 2 \end{bmatrix} = 2 \begin{bmatrix} 1 \\ 0 \end{bmatrix} + 2 \begin{bmatrix} 1 \\ 0 \end{bmatrix} = \begin{bmatrix} 4 \\ 0 \end{bmatrix}. \qquad (12)$$

We cannot recover the vector uniquely after it is operated on by $A$. This is connected with the fact that the columns are linearly dependent: different weighted combinations of the columns could generate the same vector.

Theorem 6.1: A matrix $A$ is invertible if and only if its columns are linearly independent.
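A small numerical companion to Example 6.3 and Theorem 6.1, sketched with NumPy (our assumption, as before): the singular matrix sends two distinct inputs to the same output, and its rank exposes the dependent columns.

```python
import numpy as np

A = np.array([[1.0, 1.0],
              [0.0, 0.0]])

x1 = np.array([1.0, 3.0])
x2 = np.array([2.0, 2.0])

# Two distinct inputs collide at the same output, so A cannot be undone.
print(A @ x1)  # [4. 0.]
print(A @ x2)  # [4. 0.]

# Rank below the number of columns means linearly dependent columns,
# which by Theorem 6.1 means A is not invertible.
print(np.linalg.matrix_rank(A))  # 1, although A has 2 columns
```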

Let's prove this theorem. The statement "if and only if" means that we need to prove two things:

1. If $A$ is invertible, then its columns are linearly independent.
2. If $A$'s columns are linearly independent, then it is invertible.

For the first statement, we're trying to prove the following:

$$A^{-1} \text{ exists} \implies \text{the columns of } A \text{ are linearly independent}$$

We know that if the columns of $A$ are linearly independent, then $A\vec{x} = \vec{0}$ only when $\vec{x} = \vec{0}$, so we can rephrase what we're trying to prove as

$$A^{-1} \text{ exists} \implies (A\vec{x} = \vec{0} \text{ only when } \vec{x} = \vec{0})$$

To prove this, assume that $A$ is invertible, and let $\vec{v}$ be some vector such that $A\vec{v} = \vec{0}$. Left-multiplying both sides by $A^{-1}$ gives

$$A^{-1}A\vec{v} = A^{-1}\vec{0} \implies I\vec{v} = \vec{0} \implies \vec{v} = \vec{0}$$

Hooray! We've successfully proven the first statement. The second statement is a little trickier to prove:

$$(\text{columns of } A \text{ are linearly independent}) \implies A^{-1} \text{ exists}$$

or, rewritten (recall that a square matrix with linearly independent columns gives a unique solution to $A\vec{x} = \vec{b}$ for every $\vec{b}$),

$$(A\vec{x} = \vec{b} \text{ has a unique solution } \vec{x}) \implies A^{-1} \text{ exists}$$

Now suppose that $\{\vec{e}_1, \vec{e}_2, \dots, \vec{e}_n\}$ are the columns of the identity matrix. We want to see if there is a matrix $M$ which is $A$'s inverse, i.e., which obeys the following properties:

$$AM = MA = I = \begin{bmatrix} \vec{e}_1 & \vec{e}_2 & \cdots & \vec{e}_n \end{bmatrix}$$

Using the definition of matrix-matrix multiplication as a series of stacked matrix-vector multiplications,

$$A \begin{bmatrix} \vec{m}_1 & \vec{m}_2 & \cdots & \vec{m}_n \end{bmatrix} = \begin{bmatrix} A\vec{m}_1 & A\vec{m}_2 & \cdots & A\vec{m}_n \end{bmatrix} = \begin{bmatrix} \vec{e}_1 & \vec{e}_2 & \cdots & \vec{e}_n \end{bmatrix}$$

What we need to prove now is that there are vectors $\{\vec{m}_1, \vec{m}_2, \dots, \vec{m}_n\}$ such that

$$A\vec{m}_i = \vec{e}_i \quad \text{for all } i \in \{1, 2, \dots, n\},$$

but we already know this is true, because $A$ is a square matrix and each such system has a unique solution by the linear independence of the columns of $A$. We also already know from lecture that the left and right inverses are identical. Thus, we've proven the second statement: the vectors $\vec{m}_i$ exist, so $M$ exists.
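The proof constructs the inverse one column at a time by solving $A\vec{m}_i = \vec{e}_i$. Here is a sketch of that construction (assuming NumPy, with np.linalg.solve standing in for "find the unique solution guaranteed by linear independence"):

```python
import numpy as np

A = np.array([[1.0, 1.0],
              [2.0, 1.0]])
n = A.shape[0]

# Solve A m_i = e_i for each column e_i of the identity, then stack the
# solutions m_i as the columns of M.
E = np.eye(n)
M = np.column_stack([np.linalg.solve(A, E[:, i]) for i in range(n)])

# M is a right inverse by construction; as the note states, the left and
# right inverses of a square matrix coincide, which we can confirm:
print(np.allclose(A @ M, E))  # True
print(np.allclose(M @ A, E))  # True
```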

6.3 Practice Problems

These practice problems are also available in an interactive form on the course website.

1. Find the inverse of $\begin{bmatrix} 1 & 0 & 2 \\ 1 & 1 & 1 \\ 0 & 1 & 0 \end{bmatrix}$.

2. Find the inverse of $\begin{bmatrix} 1 & 0 & 2 \\ 1 & 1 & 1 \\ 0 & 1 & 1 \end{bmatrix}$.

3. Suppose $A = BC$, where $B$ is a $4 \times 2$ matrix and $C$ is a $2 \times 4$ matrix. Is $A$ invertible?
(a) Yes, $A$ is invertible.
(b) No, $A$ is not invertible.
(c) Depends on $C$ only.
(d) Depends on $B$ and $C$.

4. Let the matrix $A$ be the state transition matrix for some system. Given some state after $n$ steps, $\vec{x}[n]$, can we always find $\vec{x}[n+1]$?
(a) Yes, we simply apply the matrix $A$ to $\vec{x}[n]$.
(b) No, we need to know the initial state $\vec{x}[0]$.
(c) No, we don't have enough information about the system.

5. Let the matrix $A$ be the state transition matrix for some system. Given some state after $n$ steps, $\vec{x}[n]$, can we always find $\vec{x}[n-1]$?
(a) Yes, we can use Gaussian elimination to find the initial state.
(b) Yes, we simply apply the matrix $A^{-1}$ to $\vec{x}[n]$.
(c) No, we don't know whether the matrix $A$ is invertible.

6. Suppose that the state transition matrix for a system is given by $\begin{bmatrix} 1 & 0.5 \\ 0 & 0 \end{bmatrix}$. Given some state after $n$ steps, $\vec{x}[n]$, can we find $\vec{x}[n-1]$?

7. True or False: The inverse of a diagonal matrix, where all of the diagonal entries are non-zero, is another diagonal matrix.

8. True or False: If $A^n = 0$, where $0$ is the zero matrix, for some $n \in \mathbb{N}$, then $A$ is not invertible.
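To check your answers to Problems 1 and 2 numerically, a throwaway script like the following works (a sketch assuming NumPy; np.linalg.inv raises LinAlgError for a singular matrix, so the same loop also detects non-invertibility):

```python
import numpy as np

problems = [
    np.array([[1.0, 0.0, 2.0],
              [1.0, 1.0, 1.0],
              [0.0, 1.0, 0.0]]),  # Problem 1
    np.array([[1.0, 0.0, 2.0],
              [1.0, 1.0, 1.0],
              [0.0, 1.0, 1.0]]),  # Problem 2
]

for M in problems:
    try:
        M_inv = np.linalg.inv(M)
        assert np.allclose(M @ M_inv, np.eye(3))  # sanity check: M M^{-1} = I
        print(M_inv)
    except np.linalg.LinAlgError:
        print("not invertible")
```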