
Math 2270 - Lecture 10: LU Factorization
Dylan Zwick
Fall 2012

This lecture covers section 2.6 of the textbook.

1 The Matrices L and U

In elimination what we do is take a system of equations and convert it into an upper-triangular system. Viewed from the matrix perspective, what we're doing is taking an equation Ax = b and finding an elimination matrix E such that EA is upper-triangular. The system EAx = Eb then becomes much easier to solve. If we write EA = U, indicating EA is upper-triangular, then our equation is

Ux = Eb.

What might not be obvious here is that the matrix E is lower-triangular and invertible, and on top of that its inverse E^{-1} is also lower-triangular. Denote this inverse by L = E^{-1}. Then if we multiply both sides of the above equation by L = E^{-1} we get

LUx = LEb = E^{-1}Eb = b.

This looks an awful lot like our original equation Ax = b, and in fact it is our original equation in disguise. This is because A = LU. So, we've factored A as the product of two matrices, one upper-triangular and the other lower-triangular. Note that throughout this discussion and for the rest of the lecture we'll assume our matrix A is invertible.

2 The Nuts and Bolts of LU Factorization

We're now going to take a deeper look at LU factorization, using the LU factorization of

A = [ 1  2  0 ]
    [ 1  3  1 ]
    [ 0  1  2 ]

as our running example. To get our matrix U we need to perform elimination on the matrix A, and the first step in elimination here is to subtract the first row from the second. This is achieved with the elimination matrix

E12 = [  1  0  0 ]
      [ -1  1  0 ]
      [  0  0  1 ]

We note that the above elimination matrix is lower-triangular. In fact, all our elimination matrices will be lower-triangular, because in elimination we're always subtracting a higher row from a lower row. Performing our first elimination step we obtain

E12 A = [  1  0  0 ] [ 1  2  0 ]   [ 1  2  0 ]
        [ -1  1  0 ] [ 1  3  1 ] = [ 0  1  1 ]
        [  0  0  1 ] [ 0  1  2 ]   [ 0  1  2 ]

We're going to assume throughout this lecture that we don't need to permute any rows as part of elimination.
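As a quick numerical check of the elimination just performed, here is a minimal numpy sketch (not part of the original notes); it carries out this first step with E12 and the second step with the matrix E23 described below, and verifies that the resulting L = E^{-1} and U satisfy A = LU.

```python
import numpy as np

# The running example; A is invertible and needs no row permutations.
A = np.array([[1., 2., 0.],
              [1., 3., 1.],
              [0., 1., 2.]])

# First step: E12 subtracts row 1 from row 2.
E12 = np.array([[ 1., 0., 0.],
                [-1., 1., 0.],
                [ 0., 0., 1.]])

# Second step (introduced below): E23 subtracts row 2 from row 3.
E23 = np.array([[1.,  0., 0.],
                [0.,  1., 0.],
                [0., -1., 1.]])

E = E23 @ E12          # one matrix that performs all of elimination
U = E @ A              # the upper-triangular factor
L = np.linalg.inv(E)   # lower-triangular, holds the elimination multipliers

print(U)                        # [[1. 2. 0.] [0. 1. 1.] [0. 0. 1.]]
print(L)                        # [[1. 0. 0.] [1. 1. 0.] [0. 1. 1.]]
print(np.allclose(A, L @ U))    # True: A = LU
```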

All the terms below our pivot in the first column are now 0, so we move on to the second pivot, in row 2 and the second column. We want to subtract the second row from the third. This operation is accomplished by the elimination matrix

E23 = [ 1  0  0 ]
      [ 0  1  0 ]
      [ 0 -1  1 ]

The result of the next step in elimination is

E23 E12 A = [ 1  0  0 ] [ 1  2  0 ]   [ 1  2  0 ]
            [ 0  1  0 ] [ 0  1  1 ] = [ 0  1  1 ]
            [ 0 -1  1 ] [ 0  1  2 ]   [ 0  0  1 ]

This is the conclusion of elimination, as our transformed matrix is now upper-triangular. We can multiply E23 and E12 to get our matrix E = E23 E12 that transforms A directly into U. This matrix is

E = E23 E12 = [ 1  0  0 ] [  1  0  0 ]   [  1  0  0 ]
              [ 0  1  0 ] [ -1  1  0 ] = [ -1  1  0 ]
              [ 0 -1  1 ] [  0  0  1 ]   [  1 -1  1 ]

Note the 1 term in the bottom-left. This is because as we do elimination we first subtract 1 of row 1 from row 2, and then subtract 1 of the modified row 2 from row 3. This means that in the end we subtract 1 of the original row 2 from row 3, and add 1 of the original row 1 to row 3. Kind of complicated, isn't it? The amazing thing is that this entanglement doesn't show up when we look at the inverse of E. The inverse of E will be

E^{-1} = [ 1  0  0 ]
         [ 1  1  0 ]
         [ 0  1  1 ]

What's going on here? Well, the idea is that the matrix E^{-1} takes U and takes it back to A:

E^{-1} U = A.

Now, we derived U from A through elimination. When we do elimination we might change the rows of A, but once those rows become the final rows we see in U we stop. Once we have a pivot row, it never changes again. In our particular example we have

Row 3 of U = (Row 3 of A) - (Row 2 of U).

When computing the third row of U, we subtract multiples of earlier rows of U, not rows of A. However, if we want to figure out row 3 of A, then doing some algebra on the above equation we get

Row 3 of A = (Row 3 of U) + (Row 2 of U).

The equation for row 3 of A involves just the rows of U, and no other row of A. Finally, one could argue that, in some sense, the factorization A = LU isn't complete. The lower-triangular matrix will always have 1s on the diagonal, while the upper-triangular matrix will not. Sometimes we want to make sure the upper-triangular matrix has 1s on the diagonal as well, and so we factor our matrix as

A = LDU

where D is a diagonal matrix consisting of the pivots, L is lower-triangular with 1s on the diagonal, and U is upper-triangular with 1s on the diagonal.
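To make the A = LDU step concrete, here is a small numpy sketch (again not from the notes, using a made-up L and U whose pivots are not all 1, unlike the running example): D collects the pivots from U's diagonal, and dividing each row of U by its pivot leaves a unit upper-triangular factor.

```python
import numpy as np

# A made-up LU pair (not the lecture's example) whose pivots 2, 3, 5 are not all 1,
# so the diagonal matrix D in A = LDU is visible.
L = np.array([[1., 0., 0.],
              [2., 1., 0.],
              [0., 3., 1.]])
U = np.array([[2., 4., 2.],
              [0., 3., 6.],
              [0., 0., 5.]])

pivots = np.diag(U)            # the pivots sit on U's diagonal
D = np.diag(pivots)            # diagonal matrix of pivots
U_unit = U / pivots[:, None]   # divide each row of U by its pivot -> 1s on the diagonal

A = L @ U
print(np.allclose(A, L @ D @ U_unit))   # True: A = LU = L D U_unit
print(U_unit)                           # unit upper-triangular factor
```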
