EECS 16A Designing Information Devices and Systems I Fall 2018 Lecture Notes Note 6

6.1 Introduction: Matrix Inversion

In the last note, we considered a system of pumps and reservoirs where the water in each reservoir is represented as a vector and the pumps, represented as a matrix, act on the reservoirs to move the water into a new state. If we know the current state of the reservoirs, $\vec{v}[t]$, and we know the state transition matrix describing the pumps, $A$, we can find the water at the next time step through matrix-vector multiplication:

$$\vec{v}[t+1] = A\,\vec{v}[t]$$

However, suppose we'd like to find the water in the reservoirs at a previous timestep, $\vec{v}[t-1]$. Is there a state transition matrix $B$ that can take us backwards in time?

$$\vec{v}[t-1] = B\,\vec{v}[t]$$

It turns out that the matrix that undoes the effects of $A$ is its inverse! In this note, we'll define matrix inverses, introduce some of their properties, and investigate when matrix inverses exist (and when they don't).

6.1.1 Definition and properties of matrix inverses

Definition 6.1 (Inverse): A square matrix $A$ is said to be invertible if there exists a matrix $B$ such that

$$AB = BA = I, \quad (1)$$

where $I$ is the identity matrix. In this case, we call the matrix $B$ the inverse of the matrix $A$, which we denote as $A^{-1}$.

Example 6.1 (Matrix inverse): Consider the $2 \times 2$ matrix $A = \begin{bmatrix} 1 & 1 \\ 2 & 1 \end{bmatrix}$. Then $A^{-1} = \begin{bmatrix} -1 & 1 \\ 2 & -1 \end{bmatrix}$. We can verify that the following holds:

$$AA^{-1} = \begin{bmatrix} 1 & 1 \\ 2 & 1 \end{bmatrix} \begin{bmatrix} -1 & 1 \\ 2 & -1 \end{bmatrix} = \begin{bmatrix} 1 & 0 \\ 0 & 1 \end{bmatrix} \quad (2)$$

$$A^{-1}A = \begin{bmatrix} -1 & 1 \\ 2 & -1 \end{bmatrix} \begin{bmatrix} 1 & 1 \\ 2 & 1 \end{bmatrix} = \begin{bmatrix} 1 & 0 \\ 0 & 1 \end{bmatrix}. \quad (3)$$
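Equation (1) is also easy to check numerically. The snippet below is a small illustrative sketch (numpy is an assumption here, not something the note itself uses) that verifies Example 6.1:

```python
import numpy as np

A = np.array([[1, 1],
              [2, 1]])
A_inv = np.array([[-1, 1],
                  [2, -1]])

# Both products should be the 2x2 identity, matching AB = BA = I in Definition 6.1.
print(A @ A_inv)                          # [[1 0]
                                          #  [0 1]]
print(A_inv @ A)                          # same identity matrix
print(np.allclose(A @ A_inv, np.eye(2)))  # True
```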

Let's show an important property of matrix inverses: if $A$ is an invertible matrix, then its inverse must be unique.

Proof. Suppose $B_1$ and $B_2$ are both inverses of the matrix $A$. Then we have

$$AB_1 = B_1 A = I \quad (4)$$

$$AB_2 = B_2 A = I \quad (5)$$

Now take the equation

$$AB_1 = I. \quad (6)$$

Multiplying both sides of the equation by $B_2$ from the left, we have

$$B_2 (AB_1) = B_2 I = B_2. \quad (7)$$

Notice that by associativity of matrix multiplication, the left hand side of the equation above becomes

$$B_2 (AB_1) = (B_2 A) B_1 = I B_1 = B_1. \quad (8)$$

Hence we have

$$B_1 = B_2. \quad (9)$$

We see that $B_1$ and $B_2$ must be equal, so the inverse of any invertible matrix is unique.

In discussion, you will see a few more useful properties of matrix inverses. Some of the next natural questions to ask are: How do we know whether or not a matrix is invertible? If a matrix is invertible, how do we go about finding its inverse? It turns out Gaussian elimination can help us answer both questions!

6.1.2 Finding Inverses With Gaussian Elimination

A square matrix $M$ and its inverse $M^{-1}$ always satisfy the conditions

$$MM^{-1} = I \quad \text{and} \quad M^{-1}M = I,$$

where $I$ is the identity matrix.

Let $M = \begin{bmatrix} 1 & 1 \\ 2 & 1 \end{bmatrix}$ and $M^{-1} = \begin{bmatrix} b_{11} & b_{12} \\ b_{21} & b_{22} \end{bmatrix}$. We want to find the values of the $b_{ij}$ such that the equation $MM^{-1} = I$ is satisfied:

$$\begin{bmatrix} 1 & 1 \\ 2 & 1 \end{bmatrix} \begin{bmatrix} b_{11} & b_{12} \\ b_{21} & b_{22} \end{bmatrix} = \begin{bmatrix} 1 & 0 \\ 0 & 1 \end{bmatrix}$$

Just as we did for solving equations of the form $A\vec{x} = \vec{b}$, we can write the above as an augmented matrix, which joins the left and right numerical matrices together and hides the variable matrix, as shown below:

$$\left[\begin{array}{cc|cc} 1 & 1 & 1 & 0 \\ 2 & 1 & 0 & 1 \end{array}\right]$$
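Before carrying out the elimination by hand below, note that each unknown column of $M^{-1}$ is simply the solution of a system we already know how to solve: $M$ times the $i$-th column of $M^{-1}$ equals the $i$-th column of $I$. The numpy sketch below (an illustrative addition; numpy and the variable names are assumptions, not part of the note) solves those two systems directly:

```python
import numpy as np

M = np.array([[1, 1],
              [2, 1]])
identity = np.eye(2)

# Solve M x = e_i for each column e_i of the identity; stacking the solutions
# column by column gives M^{-1}.
columns = [np.linalg.solve(M, identity[:, i]) for i in range(2)]
M_inv = np.column_stack(columns)
print(M_inv)  # the columns stack to [[-1, 1], [2, -1]]
```

Gaussian elimination on the augmented matrix, carried out next, arrives at the same answer without calling a solver.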

Now, to find the inverse matrix $M^{-1}$ using Gaussian elimination, we have to transform the left numerical matrix (the left half of the augmented matrix) into the identity matrix; the right numerical matrix (the right half of the augmented matrix) then becomes our solution. In terms of the equation $MM^{-1} = I$, we are transforming $M$ and $I$ simultaneously using row operations so that the equation becomes $IM^{-1} = A$, where $A$ is the numerical matrix resulting from the Gaussian elimination. Since $M^{-1}$ is multiplied by the identity matrix $I$, the resulting numerical matrix $A$ must equal $M^{-1}$, and we have the values for the elements of our inverse matrix. We now do the actual computation:

$$\left[\begin{array}{cc|cc} 1 & 1 & 1 & 0 \\ 2 & 1 & 0 & 1 \end{array}\right]
\xrightarrow{R_2 - 2R_1}
\left[\begin{array}{cc|cc} 1 & 1 & 1 & 0 \\ 0 & -1 & -2 & 1 \end{array}\right]
\xrightarrow{R_1 + R_2}
\left[\begin{array}{cc|cc} 1 & 0 & -1 & 1 \\ 0 & -1 & -2 & 1 \end{array}\right]
\xrightarrow{-1 \cdot R_2}
\left[\begin{array}{cc|cc} 1 & 0 & -1 & 1 \\ 0 & 1 & 2 & -1 \end{array}\right]$$

$M^{-1}$ is the right half of the augmented matrix. Therefore $M^{-1} = \begin{bmatrix} -1 & 1 \\ 2 & -1 \end{bmatrix}$.

More generally, for any $n \times n$ matrix $M$, we can perform Gaussian elimination on the augmented matrix $[\,M \mid I_n\,]$. If at the termination of Gaussian elimination we end up with an identity matrix on the left, then the matrix on the right is the inverse of the matrix $M$: $[\,I_n \mid M^{-1}\,]$. If we don't end up with an identity matrix on the left after running Gaussian elimination, we know that the matrix is not invertible.

Knowing whether a matrix is invertible can tell us about the rows/columns of the matrix, and knowing about the rows/columns can tell us whether the matrix is invertible - let's look at how.

Additional Resources: For more on matrix inverses, read Strang pages 83-85 and try Problem Set 2.5; or read Schaum's pages 33-34 and try Problems 2.17 to 2.20, 2.54, 2.55, 2.57, and 2.58.

6.2 Connecting invertibility with matrix rows and columns

First, let's consider how the rows of the matrix relate to invertibility.

Theorem 6.1: A matrix $M$ is invertible if and only if its rows are linearly independent.

If we run Gaussian elimination on our matrix $M$ and do not end up with the identity matrix (meaning the matrix is not invertible), we will end up with a row of zeros, indicating that the rows of the matrix are linearly dependent. Conversely, we also know that if the rows of $M$ are linearly dependent, running Gaussian elimination on the matrix will give us at least one row of zeros. Hence we can conclude that a matrix $M$ is invertible if and only if its rows are linearly independent.
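The $[\,M \mid I_n\,]$ procedure from Section 6.1.2, together with the zero-row test above, can be captured in a short program. The sketch below is an illustration in plain Python (the function name, the tolerance, and the row swaps for numerical safety are additions of this sketch, not course code): it row-reduces the augmented matrix and reports non-invertibility when no pivot can be found in some column.

```python
def invert_via_gaussian_elimination(M, tol=1e-12):
    """Return the inverse of square matrix M (list of rows), or None if M is not invertible."""
    n = len(M)
    # Build the augmented matrix [M | I_n] as a list of rows of floats.
    aug = [list(map(float, M[i])) + [1.0 if j == i else 0.0 for j in range(n)]
           for i in range(n)]

    for col in range(n):
        # Pick the row (at or below `col`) with the largest entry in this column.
        pivot_row = max(range(col, n), key=lambda r: abs(aug[r][col]))
        if abs(aug[pivot_row][col]) < tol:
            return None  # no pivot available: the rows are linearly dependent
        aug[col], aug[pivot_row] = aug[pivot_row], aug[col]

        # Scale the pivot row so the pivot becomes 1.
        pivot = aug[col][col]
        aug[col] = [x / pivot for x in aug[col]]

        # Eliminate this column from every other row (Gauss-Jordan style).
        for r in range(n):
            if r != col and aug[r][col] != 0.0:
                factor = aug[r][col]
                aug[r] = [a - factor * b for a, b in zip(aug[r], aug[col])]

    # The left half is now the identity, so the right half is M^{-1}.
    return [row[n:] for row in aug]


print(invert_via_gaussian_elimination([[1, 1], [2, 1]]))  # [[-1.0, 1.0], [2.0, -1.0]]
print(invert_via_gaussian_elimination([[1, 1], [0, 0]]))  # None (dependent rows)
```

The row swap (partial pivoting) is not essential to the mathematics in this note; it only keeps the arithmetic stable when entries differ widely in size.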

Now let's look from the column perspective. Consider $A$ as an operator on any vector $\vec{x} \in \mathbb{R}^n$. What does it mean for $A$ to have an inverse? It suggests that we can find a matrix that "undoes" the effect of matrix $A$ operating on any vector $\vec{x} \in \mathbb{R}^n$. What property should $A$ have in order for this to be possible? $A$ should map any two distinct vectors to distinct vectors in $\mathbb{R}^n$, i.e., $A\vec{x}_1 \neq A\vec{x}_2$ for vectors $\vec{x}_1, \vec{x}_2$ such that $\vec{x}_1 \neq \vec{x}_2$. Consider this example:

Example 6.2 (Invertibility intuition): Is the matrix $A = \begin{bmatrix} 1 & 1 \\ 0 & 0 \end{bmatrix}$ invertible? Intuitively, it is not, because $A$ can map two distinct vectors into the same vector:

$$\begin{bmatrix} 1 & 1 \\ 0 & 0 \end{bmatrix} \begin{bmatrix} 1 \\ 3 \end{bmatrix} = 1 \begin{bmatrix} 1 \\ 0 \end{bmatrix} + 3 \begin{bmatrix} 1 \\ 0 \end{bmatrix} = \begin{bmatrix} 4 \\ 0 \end{bmatrix} \quad (10)$$

$$\begin{bmatrix} 1 & 1 \\ 0 & 0 \end{bmatrix} \begin{bmatrix} 2 \\ 2 \end{bmatrix} = 2 \begin{bmatrix} 1 \\ 0 \end{bmatrix} + 2 \begin{bmatrix} 1 \\ 0 \end{bmatrix} = \begin{bmatrix} 4 \\ 0 \end{bmatrix}. \quad (11)$$

We cannot recover the vector uniquely after it is operated on by $A$. This is connected with the fact that the columns are linearly dependent: different weighted combinations of the columns can generate the same vector.

Theorem 6.2: A matrix $A$ is invertible if and only if its columns are linearly independent.

From Example 6.2, we can infer that a square matrix $A \in \mathbb{R}^{n \times n}$ is invertible if and only if there exists a unique solution to the system of linear equations $A\vec{x} = \vec{b}$ for any vector $\vec{b} \in \mathbb{R}^n$. To connect this idea to matrix columns, let's consider the specific case

$$A\vec{x} = \vec{0}. \quad (12)$$

For $A$ to be invertible, $A\vec{x} = \vec{0}$ must have a unique solution: $\vec{x} = \vec{0}$. This means that the only linear combination of the columns of $A$ that equals zero is the one in which all of the coefficients are zero, which is the definition of linear independence. So the columns of $A$ must be linearly independent for the solution to be unique. Conversely, we showed in Note 3 that if the columns of $A$ are linearly dependent, then the system of equations $A\vec{x} = \vec{b}$ cannot have a unique solution, and therefore $A$ is not invertible.
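Here is how Example 6.2 and Theorem 6.2 look numerically. This is an illustrative sketch (numpy is an assumption, not something the note relies on): two distinct inputs collide, the column rank is deficient, and numpy refuses to invert the matrix.

```python
import numpy as np

A = np.array([[1, 1],
              [0, 0]])

x1 = np.array([1, 3])
x2 = np.array([2, 2])

# Two distinct inputs land on the same output, so the action of A cannot be undone.
print(A @ x1)  # [4 0]
print(A @ x2)  # [4 0]

# The columns are linearly dependent: the rank is 1, not 2.
print(np.linalg.matrix_rank(A))  # 1

# Asking numpy for an inverse raises a LinAlgError because the matrix is singular.
try:
    np.linalg.inv(A)
except np.linalg.LinAlgError as err:
    print("not invertible:", err)
```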

6.3 Practice Problems

These practice problems are also available in an interactive form on the course website.

1. Find the inverse of $\begin{bmatrix} 1 & 0 & 2 \\ 1 & 1 & 1 \\ 0 & 1 & 0 \end{bmatrix}$.

2. Find the inverse of $\begin{bmatrix} 1 & 0 & 2 \\ 1 & 1 & 1 \\ 0 & 1 & 1 \end{bmatrix}$.

3. Suppose $A = BC$, where $B$ is a $4 \times 2$ matrix and $C$ is a $2 \times 4$ matrix. Is $A$ invertible?

   (a) Yes, $A$ is invertible.
   (b) No, $A$ is not invertible.
   (c) Depends on $C$ only.
   (d) Depends on $B$ and $C$.

4. Let the matrix $A$ be the state transition matrix for some system. Given some state after $n$ steps, $\vec{x}[n]$, can we always find $\vec{x}[n+1]$?

   (a) Yes, we simply apply the matrix $A$ to $\vec{x}[n]$.
   (b) No, we need to know the initial state $\vec{x}[0]$.
   (c) No, we don't have enough information about the system.

5. Let the matrix $A$ be the state transition matrix for some system. Given some state after $n$ steps, $\vec{x}[n]$, can we always find $\vec{x}[n-1]$?

   (a) Yes, we can use Gaussian elimination to find the initial state.
   (b) Yes, we simply apply the matrix $A^{-1}$ to $\vec{x}[n]$.
   (c) No, we don't know whether the matrix $A$ is invertible.

6. Suppose that the state transition matrix for a system is given by $\begin{bmatrix} 1 & 0.5 \\ 0 & 0 \end{bmatrix}$. Given some state after $n$ steps, $\vec{x}[n]$, can we find $\vec{x}[n-1]$?

7. True or False: The inverse of a diagonal matrix, where all of the diagonal entries are non-zero, is another diagonal matrix.

8. True or False: If $A^n = 0$, where $0$ is the zero matrix, for some positive integer $n$, then $A$ is not invertible.
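If you want to check your hand computations for Problems 1 and 2, the optional numpy sketch below (an addition of these edited notes, not part of the original problem set) prints each inverse and confirms it against the definition:

```python
import numpy as np

M1 = np.array([[1, 0, 2],
               [1, 1, 1],
               [0, 1, 0]])
M2 = np.array([[1, 0, 2],
               [1, 1, 1],
               [0, 1, 1]])

for M in (M1, M2):
    M_inv = np.linalg.inv(M)                   # raises LinAlgError if M is not invertible
    print(M_inv)
    print(np.allclose(M @ M_inv, np.eye(3)))   # True: M_inv really satisfies M M^{-1} = I
```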