Solving Ax = b with different b's: LU-Factorization


Solving Ax = b with different b's: LU-Factorization
Linear Algebra
Josh Engwer (TTU)
14 September 2015

Elementary Row Operations (Review)

Consider solving several linear systems Ax = b with the same A but different b's:
  Performing Gauss-Jordan elimination on each [A | b] repeats the same work a lot!
  Computing x = A^(-1) b only works if A is square AND invertible!

Recall the definition of an elementary row operation:

Definition (Elementary Row Operations)
There are three types of elementary row operations applicable to [A | b]:
  (SWAP)     [R_i ↔ R_j]          Swap row i and row j
  (SCALE)    [αR_j → R_j]         Multiply row j by a non-zero scalar α
  (COMBINE)  [αR_i + R_j → R_j]   Add a scalar multiple α of row i to row j

VERY IMPORTANT: For this section (LARSON 2.4) only:
  The only elementary row operation considered is COMBINE.
  COMBINE operations will be applied to A instead of [A | b].

COMBINE Operations (Examples)

Each COMBINE operation below is applied to the same matrix A:

(-5)R1 + R2 → R2:
    [ 3  4  0 ]       [   3   4   0 ]
    [ 1  0 -1 ]   →   [ -14 -20  -1 ]
    [ 2  7  9 ]       [   2   7   9 ]

(5)R1 + R3 → R3:
    [ 3  4  0 ]       [   3   4   0 ]
    [ 1  0 -1 ]   →   [   1   0  -1 ]
    [ 2  7  9 ]       [  17  27   9 ]

(-5)R2 + R3 → R3:
    [ 3  4  0 ]       [   3   4   0 ]
    [ 1  0 -1 ]   →   [   1   0  -1 ]
    [ 2  7  9 ]       [  -3   7  14 ]
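
The same COMBINE operations can be carried out numerically by updating one row of an array in place. A minimal NumPy sketch, using the 3×3 matrix from this slide (the array names are just labels for the example):

    import numpy as np

    # The 3x3 matrix used in the examples above.
    A = np.array([[3.0, 4.0,  0.0],
                  [1.0, 0.0, -1.0],
                  [2.0, 7.0,  9.0]])

    # (-5)R1 + R2 -> R2   (rows are 0-indexed in NumPy)
    B = A.copy()
    B[1] = -5 * B[0] + B[1]
    print(B)        # row 2 becomes [-14, -20, -1]

    # (5)R1 + R3 -> R3
    C = A.copy()
    C[2] = 5 * C[0] + C[2]
    print(C)        # row 3 becomes [17, 27, 9]

    # (-5)R2 + R3 -> R3
    D = A.copy()
    D[2] = -5 * D[1] + D[2]
    print(D)        # row 3 becomes [-3, 7, 14]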

Elementary Matrices Representing SWAP Operations

It would be useful to encapsulate an elementary row op into a matrix:

Definition (Elementary Matrix)
An n×n square matrix E is an elementary matrix if it can be obtained from the n×n identity matrix I by a single elementary row operation.

Here are some 3×3 elementary matrices:

                  [ 0 1 0 ]                [ 1 0 0 ]
(SWAP) [R1 ↔ R2]: [ 1 0 0 ],   [R2 ↔ R3]:  [ 0 0 1 ]
                  [ 0 0 1 ]                [ 0 1 0 ]

NOTE: SWAP operations will never be used in this section; the above elementary matrix representations are shown here for completeness.

Elementary Matrices Representing SCALE Operations

It would be useful to encapsulate an elementary row op into a matrix:

Definition (Elementary Matrix)
An n×n square matrix E is an elementary matrix if it can be obtained from the n×n identity matrix I by a single elementary row operation.

Here are some 3×3 elementary matrices:

                    [ 3 0 0 ]                   [ 1 0  0 ]
(SCALE) [3R1 → R1]: [ 0 1 0 ],   [(-8)R3 → R3]: [ 0 1  0 ]
                    [ 0 0 1 ]                   [ 0 0 -8 ]

NOTE: SCALE operations will never be used in this section; the above elementary matrix representations are shown here for completeness.

Elementary Matrices Representing COMBINE Operations

It would be useful to encapsulate an elementary row op into a matrix:

Definition (Elementary Matrix)
An n×n square matrix E is an elementary matrix if it can be obtained from the n×n identity matrix I by a single elementary row operation.

Here are some 3×3 elementary matrices:

                           [ 1 0 0 ]                       [ 1  0 0 ]
(COMBINE) [5R1 + R2 → R2]: [ 5 1 0 ],   [(-5)R2 + R3 → R3]: [ 0  1 0 ]
                           [ 0 0 1 ]                       [ 0 -5 1 ]

NOTE: All elementary matrices considered in this section will be COMBINEs.

Elementary Matrices applied to A with m = n

To apply an elementary row op to a matrix A, left-multiply A by the elementary matrix.

        [ 3  4  0 ]
Let A = [ 1  0 -1 ].  Then:
        [ 2  7  9 ]

       [  1 0 0 ] [ 3  4  0 ]   [   3   4   0 ]
E1 A = [ -5 1 0 ] [ 1  0 -1 ] = [ -14 -20  -1 ]      ( (-5)R1 + R2 → R2 )
       [  0 0 1 ] [ 2  7  9 ]   [   2   7   9 ]

       [ 1 0 0 ] [ 3  4  0 ]    [   3   4   0 ]
E2 A = [ 0 1 0 ] [ 1  0 -1 ]  = [   1   0  -1 ]      ( 5R1 + R3 → R3 )
       [ 5 0 1 ] [ 2  7  9 ]    [  17  27   9 ]

       [ 1  0 0 ] [ 3  4  0 ]   [   3   4   0 ]
E3 A = [ 0  1 0 ] [ 1  0 -1 ] = [   1   0  -1 ]      ( (-5)R2 + R3 → R3 )
       [ 0 -5 1 ] [ 2  7  9 ]   [  -3   7  14 ]

WARNING: Right-multiplying A by an elementary matrix applies the corresponding elementary column operation.
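
The left-multiplication rule is easy to check numerically. A small NumPy sketch using the matrix and the first elementary matrix from this slide:

    import numpy as np

    A = np.array([[3.0, 4.0,  0.0],
                  [1.0, 0.0, -1.0],
                  [2.0, 7.0,  9.0]])

    # Elementary matrix for (-5)R1 + R2 -> R2: start from I, apply the op to I.
    E1 = np.eye(3)
    E1[1, 0] = -5.0

    # Left-multiplication performs the ROW operation on A.
    print(E1 @ A)     # same result as applying (-5)R1 + R2 -> R2 directly to A

    # Right-multiplication performs the corresponding COLUMN operation instead
    # (the warning above): here it adds (-5) times column 2 to column 1.
    print(A @ E1)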

Elementary Matrices applied to A with m < n

To apply an elementary row op to a matrix A, left-multiply A by the elementary matrix.

        [ 3  4  0  8 ]
Let A = [ 1  0 -1 -6 ].  Then:
        [ 2  7  9  3 ]

       [  1 0 0 ] [ 3  4  0  8 ]   [   3   4   0    8 ]
E1 A = [ -5 1 0 ] [ 1  0 -1 -6 ] = [ -14 -20  -1  -46 ]      ( (-5)R1 + R2 → R2 )
       [  0 0 1 ] [ 2  7  9  3 ]   [   2   7   9    3 ]

       [ 1 0 0 ] [ 3  4  0  8 ]    [   3   4   0    8 ]
E2 A = [ 0 1 0 ] [ 1  0 -1 -6 ]  = [   1   0  -1   -6 ]      ( 5R1 + R3 → R3 )
       [ 5 0 1 ] [ 2  7  9  3 ]    [  17  27   9   43 ]

       [ 1  0 0 ] [ 3  4  0  8 ]   [   3   4   0    8 ]
E3 A = [ 0  1 0 ] [ 1  0 -1 -6 ] = [   1   0  -1   -6 ]      ( (-5)R2 + R3 → R3 )
       [ 0 -5 1 ] [ 2  7  9  3 ]   [  -3   7  14   33 ]

Elementary Matrices applied to A with m > n

To apply an elementary row op to a matrix A, left-multiply A by the elementary matrix.

        [ 3  4 ]
Let A = [ 1  0 ].  Then:
        [ 2  7 ]

       [  1 0 0 ] [ 3  4 ]   [   3   4 ]
E1 A = [ -5 1 0 ] [ 1  0 ] = [ -14 -20 ]      ( (-5)R1 + R2 → R2 )
       [  0 0 1 ] [ 2  7 ]   [   2   7 ]

       [ 1 0 0 ] [ 3  4 ]    [   3   4 ]
E2 A = [ 0 1 0 ] [ 1  0 ]  = [   1   0 ]      ( 5R1 + R3 → R3 )
       [ 5 0 1 ] [ 2  7 ]    [  17  27 ]

       [ 1  0 0 ] [ 3  4 ]   [   3   4 ]
E3 A = [ 0  1 0 ] [ 1  0 ] = [   1   0 ]      ( (-5)R2 + R3 → R3 )
       [ 0 -5 1 ] [ 2  7 ]   [  -3   7 ]

Inverse of an Elementary Matrix

Theorem (Elementary Matrices are Invertible)
If E is an elementary matrix, then E^(-1) exists and is an elementary matrix.

Question: What is the inverse of an elementary matrix?
Answer:   The inverse undoes the elementary row operation.

ELEMENTARY MATRIX:                       ITS INVERSE:

               [ 0 1 0 ]                                  [ 0 1 0 ]
[R1 ↔ R2]:     [ 1 0 0 ]                 [R1 ↔ R2]:       [ 1 0 0 ]
               [ 0 0 1 ]                                  [ 0 0 1 ]

               [ 3 0 0 ]                                  [ 1/3 0 0 ]
[3R1 → R1]:    [ 0 1 0 ]                 [(1/3)R1 → R1]:  [ 0   1 0 ]
               [ 0 0 1 ]                                  [ 0   0 1 ]

               [ 1 0 0 ]                                  [  1 0 0 ]
[5R1+R2 → R2]: [ 5 1 0 ]                 [(-5)R1+R2 → R2]:[ -5 1 0 ]
               [ 0 0 1 ]                                  [  0 0 1 ]
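
The "undo" rule for COMBINE matrices (flip the sign of the multiplier) can be verified directly. A quick NumPy check on a hypothetical 3×3 example:

    import numpy as np

    # E encodes [5R1 + R2 -> R2]; its inverse encodes [(-5)R1 + R2 -> R2].
    E = np.eye(3)
    E[1, 0] = 5.0

    E_inv = np.eye(3)
    E_inv[1, 0] = -5.0

    print(np.allclose(E @ E_inv, np.eye(3)))     # True: the ops cancel
    print(np.allclose(np.linalg.inv(E), E_inv))  # True: this IS the inverse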

Main Diagonal of a Matrix (Definition)

Definition (Main Diagonal of a Matrix)
The main diagonal of an m×n matrix consists of all (k, k)-entries of the matrix for k = 1, 2, ..., min{m, n}.

The main diagonal entries of each matrix below (highlighted in blue on the original slide) are listed alongside it:

  [ 1 2 ]
  [ 3 4 ]             main diagonal: 1, 4

  [ 1 2 ]
  [ 3 4 ]
  [ 5 6 ]             main diagonal: 1, 4

  [ 1 2 3 ]
  [ 4 5 6 ]           main diagonal: 1, 5

  [ 1 2 3 ]
  [ 4 5 6 ]
  [ 7 8 9 ]           main diagonal: 1, 5, 9

  [ 1  2  3  4 ]
  [ 5  6  7  8 ]
  [ 9 10 11 12 ]      main diagonal: 1, 6, 11

  [  1  2  3 ]
  [  4  5  6 ]
  [  7  8  9 ]
  [ 10 11 12 ]        main diagonal: 1, 5, 9

Special Matrices: Triangular Matrices

In certain applications & higher math, triangular matrices are useful:

Definition (Triangular Matrices)
Let L, U be m×n matrices. Then:
  L is called lower triangular if the entries above the main diagonal are all zero.
  U is called upper triangular if the entries below the main diagonal are all zero.
  A matrix is called triangular if it's either lower or upper triangular.

Lower Triangular (examples):

  [ 1 0 ]     [ 1 0 ]     [ 4 0 0 ]
  [ 3 4 ],    [ 3 4 ],    [ 5 8 0 ]
              [ 5 0 ]

Upper Triangular (examples):

  [ 1 2 ]     [ 1 2 ]     [ 1 0 3 ]
  [ 0 4 ],    [ 0 4 ],    [ 0 5 6 ]
              [ 0 0 ]

Special Matrices: Unit Triangular Matrices

Moreover, in what's coming next, unit triangular matrices are useful:

Definition (Unit Triangular Matrices)
Let L, U be m×n matrices. Then:
  L is unit lower triangular if it's lower triangular and has all ones on the main diagonal.
  U is unit upper triangular if it's upper triangular and has all ones on the main diagonal.

Unit Lower Triangular (examples):

  [ 1 0 ]     [ 1 0 ]     [ 1 0 0 ]
  [ 3 1 ],    [ 3 1 ],    [ 4 1 0 ]
              [ 1 0 ]

Unit Upper Triangular (examples):

  [ 1 2 ]     [ 1 2 ]     [ 1 0 3 ]
  [ 0 1 ],    [ 0 1 ],    [ 0 1 1 ]
              [ 0 0 ]

The Value of Triangular Matrices

Consider the following 3×3 lower triangular linear system:

   x1             =  1
  2x1 + 3x2       = -1
  4x1 + 5x2 + 6x3 = 11

i.e. [A | b], where A = [ 1 0 0 ]
                        [ 2 3 0 ]
                        [ 4 5 6 ]

Then the solution can be found by a forward-solve (AKA forward substitution):

   x1             =  1   ⟹   x1 = 1
  2x1 + 3x2       = -1   ⟹   2(1) + 3x2 = -1          ⟹   x2 = -1
  4x1 + 5x2 + 6x3 = 11   ⟹   4(1) + 5(-1) + 6x3 = 11  ⟹   x3 = 2

The cases where the lower triangular linear system has no solution or infinitely many solutions can also be handled with little trouble by a forward-solve. Having said that, for this section (LARSON 2.4), only linear systems with a unique solution will be considered.
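
Forward substitution is straightforward to code. A minimal sketch (the function name forward_solve is just a label, not from the slides), assuming a square lower triangular system with nonzero diagonal so that a unique solution exists, as the slide assumes:

    import numpy as np

    def forward_solve(L, b):
        """Solve Lx = b for square lower triangular L with nonzero diagonal."""
        n = len(b)
        x = np.zeros(n)
        for i in range(n):
            # Row i determines x[i] once x[0..i-1] are already known.
            x[i] = (b[i] - L[i, :i] @ x[:i]) / L[i, i]
        return x

    L = np.array([[1.0, 0.0, 0.0],
                  [2.0, 3.0, 0.0],
                  [4.0, 5.0, 6.0]])
    b = np.array([1.0, -1.0, 11.0])
    print(forward_solve(L, b))   # [ 1. -1.  2.], matching the slide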

The Value of Triangular Matrices

Consider the following 3×3 upper triangular linear system:

  x1 + 2x2 + 3x3 =  3
       4x2 + 5x3 = 10
             6x3 = 12

i.e. [A | b], where A = [ 1 2 3 ]
                        [ 0 4 5 ]
                        [ 0 0 6 ]

Then the solution can be found by a back-solve (AKA back substitution):

             6x3 = 12   ⟹   x3 = 12/6             ⟹   x3 = 2
       4x2 + 5x3 = 10   ⟹   4x2 + 5(2) = 10       ⟹   x2 = 0
  x1 + 2x2 + 3x3 =  3   ⟹   x1 + 2(0) + 3(2) = 3  ⟹   x1 = -3

The cases where the upper triangular linear system has no solution or infinitely many solutions can also be handled with little trouble by a back-solve. Having said that, for this section (LARSON 2.4), only linear systems with a unique solution will be considered.
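
Back substitution is the mirror image. A minimal sketch (back_solve is an illustrative name), again assuming a square upper triangular system with nonzero diagonal:

    import numpy as np

    def back_solve(U, y):
        """Solve Ux = y for square upper triangular U with nonzero diagonal."""
        n = len(y)
        x = np.zeros(n)
        for i in range(n - 1, -1, -1):
            # Row i determines x[i] once x[i+1..n-1] are already known.
            x[i] = (y[i] - U[i, i + 1:] @ x[i + 1:]) / U[i, i]
        return x

    U = np.array([[1.0, 2.0, 3.0],
                  [0.0, 4.0, 5.0],
                  [0.0, 0.0, 6.0]])
    y = np.array([3.0, 10.0, 12.0])
    print(back_solve(U, y))   # [-3.  0.  2.], matching the slide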

LU-Factorization of a Matrix (Procedure)

There are times in Linear Algebra where factoring a matrix is quite useful. Here is the first such instance:

Proposition (LU-Factorization of a Matrix)
GIVEN: m×n matrix A where no row swaps are necessary.
TASK:  Form A = LU, where L is square unit lower triangular & U is upper triangular.

(1) COMBINE to zero-out an entry below the main diagonal: [αR_i + R_j → R_j]
(2) Form the m×m elementary matrix corresponding to the COMBINE operation: E
(3) Find the inverse of the elementary matrix: E^(-1)
(4) Repeat steps (1)-(3) for all such entries, top-to-bottom, left-to-right
(5) The resulting matrix is upper triangular: U = E_k E_(k-1) ··· E_3 E_2 E_1 A
(6) Determine L:
      E_k E_(k-1) ··· E_2 E_1 A = U   ⟹   A = E_1^(-1) E_2^(-1) ··· E_(k-1)^(-1) E_k^(-1) U,
    so L = E_1^(-1) E_2^(-1) ··· E_(k-1)^(-1) E_k^(-1).
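
The procedure translates directly into code when no row swaps are needed. A minimal sketch of an LU-factorization without pivoting (the name lu_nopivot is illustrative; it assumes every pivot it divides by is nonzero, per the "no row swaps" hypothesis):

    import numpy as np

    def lu_nopivot(A):
        """Factor A = L U, with L square unit lower triangular and U upper
        triangular, assuming no row swaps (i.e. no zero pivots) are needed."""
        A = np.array(A, dtype=float)
        m, n = A.shape
        L = np.eye(m)
        U = A.copy()
        for j in range(min(m, n)):           # left-to-right over pivot columns
            for i in range(j + 1, m):        # top-to-bottom below the diagonal
                alpha = -U[i, j] / U[j, j]   # COMBINE: alpha*Rj + Ri -> Ri
                U[i] = alpha * U[j] + U[i]   # zero out entry (i, j)
                L[i, j] = -alpha             # the inverse op contributes -alpha to L
        return L, U

    A = np.array([[1.0, 2.0],
                  [3.0, 4.0]])
    L, U = lu_nopivot(A)
    print(L)                       # [[1, 0], [3, 1]]
    print(U)                       # [[1, 2], [0, -2]]
    print(np.allclose(L @ U, A))   # True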

Products of Inverses of COMBINE Elementary Matrices

Products of inverses of COMBINE elementary matrices can be found instantly:

Proposition (Products of Inverses of COMBINE Elementary Matrices)
Let E_1, E_2, ..., E_(k-1), E_k be n×n COMBINE elementary matrices. Then the n×n matrix product E_1^(-1) E_2^(-1) ··· E_(k-1)^(-1) E_k^(-1) is simply the unit lower triangular matrix with each entry below the main diagonal being the corresponding single non-zero off-diagonal entry of one of the inverse elementary matrices.

         [ 1 0 0 ]         [ 1 0 0 ]         [ 1 0 0 ]
Let E1 = [ 3 1 0 ],   E2 = [ 0 1 0 ],   E3 = [ 0 1 0 ].   Then:
         [ 0 0 1 ]         [ 2 0 1 ]         [ 0 8 1 ]

                          [  1  0  0 ]
E1^(-1) E2^(-1) E3^(-1) = [ -3  1  0 ]
                          [ -2 -8  1 ]

The product E3 E2 E1 is not obvious:

           [  1  0  0 ]
E3 E2 E1 = [  3  1  0 ]
           [ 26  8  1 ]
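
Both claims on this slide can be checked numerically: the product of the inverses just collects the negated multipliers, while E3 E2 E1 mixes them together. A short NumPy check using the matrices above:

    import numpy as np

    I = np.eye(3)
    E1 = I.copy(); E1[1, 0] = 3.0    # [3R1 + R2 -> R2]
    E2 = I.copy(); E2[2, 0] = 2.0    # [2R1 + R3 -> R3]
    E3 = I.copy(); E3[2, 1] = 8.0    # [8R2 + R3 -> R3]

    # Product of the INVERSES: read off the negated multipliers directly.
    P = np.linalg.inv(E1) @ np.linalg.inv(E2) @ np.linalg.inv(E3)
    print(P)             # [[1, 0, 0], [-3, 1, 0], [-2, -8, 1]]

    # The product E3 E2 E1 is not obvious: the multipliers interact.
    print(E3 @ E2 @ E1)  # [[1, 0, 0], [3, 1, 0], [26, 8, 1]]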

Solving Ax = b via LU-Factorization of A

Proposition (Solving Ax = b via LU-Factorization of A)
GIVEN: m×n linear system Ax = b where no row swaps are necessary.
TASK:  Solve the linear system via A = LU.

(1) Perform the LU-Factorization of A (see previous slides)
(2) Notice Ax = b  ⟹  (LU)x = b  ⟹  L(Ux) = b  ⟹  let y = Ux
(3) Solve the square triangular system Ly = b for y via forward-solve
(4) Solve the triangular system Ux = y for x:
      If U is square, solve Ux = y via back-solve
      If U is non-square, solve Ux = y via Gauss-Jordan Elimination

REMARK: Here, Ax = b will always be square & have a unique solution. The reason is that most applications lead to square linear systems with unique solutions. Moreover, most computer algorithms can only handle square linear systems with a unique solution. Finally, using A = LU is preferable to computing A^(-1), since it's too slow & unstable for a computer to invert most large square matrices. Take Numerical Linear Algebra for the details.
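
In practice this pipeline is available off the shelf in SciPy. A short sketch using scipy.linalg.lu_factor and lu_solve (SciPy may still pivot internally, which is fine; the point is that the factorization replaces computing A^(-1)):

    import numpy as np
    from scipy.linalg import lu_factor, lu_solve

    A = np.array([[1.0, 2.0],
                  [3.0, 4.0]])
    b = np.array([1.0, -1.0])

    lu, piv = lu_factor(A)        # step (1): factor A once
    x = lu_solve((lu, piv), b)    # steps (3)-(4): forward-solve, then back-solve
    print(x)                      # [-3.  2.]
    print(np.allclose(A @ x, b))  # True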

Solving Ax = b with different b's via A = LU

A = LU is efficient when solving several linear systems with the same matrix A and different RHS b's, since A = LU only has to be computed once:

Ax = b1  ⟹  L(Ux) = b1  ⟹  Solve Ly1 = b1  ⟹  Solve Ux = y1
Ax = b2  ⟹  L(Ux) = b2  ⟹  Solve Ly2 = b2  ⟹  Solve Ux = y2
Ax = b3  ⟹  L(Ux) = b3  ⟹  Solve Ly3 = b3  ⟹  Solve Ux = y3
Ax = b4  ⟹  L(Ux) = b4  ⟹  Solve Ly4 = b4  ⟹  Solve Ux = y4
  ...

If A is square, computing A = LU is more efficient than computing A^(-1).
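
This reuse pattern looks like the following in code: factor once, then do a cheap pair of triangular solves per right-hand side. A sketch with SciPy; the matrix and the b's here are made-up illustrative data, not from the slides:

    import numpy as np
    from scipy.linalg import lu_factor, lu_solve

    A = np.array([[ 2.0,  1.0, 1.0],
                  [ 4.0, -6.0, 0.0],
                  [-2.0,  7.0, 2.0]])

    lu, piv = lu_factor(A)                     # computed once, reused below

    rhs_list = [np.array([5.0, -2.0, 9.0]),    # b1
                np.array([1.0,  0.0, 0.0]),    # b2
                np.array([0.0,  1.0, 0.0])]    # b3

    for k, b in enumerate(rhs_list, start=1):
        x = lu_solve((lu, piv), b)             # only triangular solves per b
        print(f"x for b{k}:", x)
        assert np.allclose(A @ x, b)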

Solving Ax = b via LU-Factorization of A (Example)

WEX 2-4-1:  Let A = [ 1 2 ]
                    [ 3 4 ]

(a) Find the LU-Factorization of A.

    A = [ 1 2 ]   --(-3)R1+R2→R2-->   [ 1  2 ] = U,   with  E1 = [  1 0 ]
        [ 3 4 ]                       [ 0 -2 ]                   [ -3 1 ]

    E1 A = U   ⟹   A = E1^(-1) U = LU,   where   L = E1^(-1) = [ 1 0 ]
                                                               [ 3 1 ]
    Hence
        L = [ 1 0 ],    U = [ 1  2 ]
            [ 3 1 ]         [ 0 -2 ]

(b) Use A = LU to solve the linear system Ax = b, where b = [  1 ]
                                                            [ -1 ]

    Ly = b:    y1        =  1    --Forward-Solve-->   y = [  1 ]
               3y1 + y2  = -1                             [ -4 ]

    Ux = y:    x1 + 2x2  =  1    --Back-Solve-->      x = [ -3 ]
                   -2x2  = -4                             [  2 ]
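
The hand computation in WEX 2-4-1 can be double-checked by running the two triangular solves explicitly. A small verification with SciPy's solve_triangular, using the L, U, and b found above:

    import numpy as np
    from scipy.linalg import solve_triangular

    L = np.array([[1.0, 0.0],
                  [3.0, 1.0]])
    U = np.array([[1.0,  2.0],
                  [0.0, -2.0]])
    b = np.array([1.0, -1.0])

    y = solve_triangular(L, b, lower=True)    # forward-solve: y = [ 1, -4]
    x = solve_triangular(U, y, lower=False)   # back-solve:    x = [-3,  2]
    print(y, x)

    A = np.array([[1.0, 2.0], [3.0, 4.0]])
    print(np.allclose(L @ U, A), np.allclose(A @ x, b))   # True True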

Fin.