Linear Algebra: Matrix decompositions. Monday, February 11th. Math 365, Week #4


Linear algebra. A typical linear system of equations is a set of equations of the form

    a_11 x_1 + a_12 x_2 + a_13 x_3 = b_1
    a_21 x_1 + a_22 x_2 + a_23 x_3 = b_2
    a_31 x_1 + a_32 x_2 + a_33 x_3 = b_3

The variables x_1, x_2, and x_3 appear only as linear terms (no powers or products).

Linear algebra. Where do linear systems come from? Fitting curves to data, polynomial approximation to functions, computational fluid dynamics, network flow, computer graphics, difference equations, differential equations, dynamical systems theory, and more.

Typical linear system. How does Matlab solve a linear system like the 3x3 system above? Does such a system always have a solution? Can such a system be solved efficiently for millions of equations? What does Matlab do if we have more equations than unknowns? More unknowns than equations?

Solving linear systems. We are already familiar with at least one type of linear system: two equations in two unknowns,

    a_11 x_1 + a_12 x_2 = b_1
    a_21 x_1 + a_22 x_2 = b_2

The solution is the intersection of the two lines represented by the equations. It is a point (x_1, x_2) that satisfies both equations simultaneously. We could also view the solution as providing the correct linear combination of the column vectors that gives the right-hand side:

    x_1 [a_11; a_21] + x_2 [a_12; a_22] = [b_1; b_2]

Linear algebra: a 2x2 system. We can row-reduce the augmented matrix [A | b] to find the solution. Use an elementary row operation, (eqn 2) - (a_21/a_11) (eqn 1), to produce a 0 in the lower-left corner. Then use back substitution to solve first for x_2 from the second equation and then for x_1 from the first.
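A minimal sketch of this 2x2 elimination in Python/NumPy. The coefficients below are illustrative stand-ins (not the slide's values); the steps are the ones described above: one row operation, then back substitution.

    import numpy as np

    # Illustrative 2x2 system (stand-in coefficients):
    #   2*x1 + 1*x2 = 3
    #   1*x1 - 1*x2 = 0
    A = np.array([[2.0, 1.0],
                  [1.0, -1.0]])
    b = np.array([3.0, 0.0])

    # Elementary row operation: (eqn 2) - (a21/a11)*(eqn 1) zeros the lower-left entry.
    m21 = A[1, 0] / A[0, 0]
    A[1, :] -= m21 * A[0, :]
    b[1] -= m21 * b[0]

    # Back substitution: x2 from the second equation, then x1 from the first.
    x2 = b[1] / A[1, 1]
    x1 = (b[0] - A[0, 1] * x2) / A[0, 0]
    print(x1, x2)   # 1.0 1.0 for these stand-in coefficients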

Linear algebra: a 2x2 system. [Figure: the same solution shown two ways, as the intersection of two lines and as a linear combination of the column vectors.]

Linear algebra: a 3x3 system. Apply elementary row operations to the augmented matrix to zero out the entries below the diagonal and reduce the system to an upper triangular system. Each operation has the form (eqn i) - m_ij (eqn j); the numbers m_ij are the multipliers, and the nonzero diagonal entries left behind are the pivots.

Solve the upper triangular system U x = c for x. Notice that the right-hand side c is not the right-hand side of the original system; it has been transformed by the same row operations. Use back substitution to solve for x. Step 1: solve the last equation for x_3. Step 2: substitute x_3 into the second equation and solve for x_2. Step 3: substitute x_2 and x_3 (known from the previous steps) into the first equation and solve for x_1.
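The back-substitution loop, sketched in Python/NumPy for a general upper triangular system (the function name and the random test matrix are illustrative, not from the slides):

    import numpy as np

    def back_substitute(U, c):
        """Solve U x = c for upper triangular U, working from the last row upward."""
        n = len(c)
        x = np.zeros(n)
        for i in range(n - 1, -1, -1):
            # entries to the right of the diagonal multiply unknowns already solved for
            x[i] = (c[i] - U[i, i+1:] @ x[i+1:]) / U[i, i]
        return x

    # Quick check on a random 3x3 upper triangular system
    U = np.triu(np.random.rand(3, 3)) + np.eye(3)
    c = np.random.rand(3)
    print(np.allclose(U @ back_substitute(U, c), c))   # expect True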

Cost of elimination. The cost of eliminating entries below the 1st pivot: to eliminate (or "zero out") the column below the first pivot, we must do (n-1)^2 multiplications and (n-1)^2 subtractions.

Cost of elimination. The cost of eliminating entries below the 2nd pivot: every row below the first now starts with a zero (rows of the form 0 X X X X). To eliminate the column below the second pivot, we must do (n-2)^2 multiplications and (n-2)^2 subtractions.

Cost of elimination. The cost of eliminating entries below the 3rd pivot: the rows below the second now start with two zeros (0 0 X X X). To eliminate the column below the third pivot, we must do (n-3)^2 multiplications and (n-3)^2 subtractions.

Cost of elimination. The cost of eliminating entries below the 4th pivot: the rows below the third now start with three zeros (0 0 0 X X). To eliminate the column below the fourth pivot, we must do (n-4)^2 multiplications and (n-4)^2 subtractions, which is just 1 of each for the 5x5 pattern pictured.

Cost of elimination. The total number of multiplications is then

    sum_{k=1}^{n-1} k^2 = (n-1)(n)(2n-1)/6 ~ n^3/3,

where we ignore lower-order powers of n. The number of subtractions is the same. So we have a total of about 2n^3/3 operations (ignoring terms involving n^2 and n). We say that elimination is an O(n^3) process. This is considered expensive for a linear solve.
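A small sanity check of that count, sketched in Python (the helper name is ours, not the course's): summing (n-k)^2 over the pivots reproduces the closed form above.

    def elimination_multiplications(n):
        """Multiplications needed to zero out the columns below pivots 1..n-1."""
        return sum((n - k) ** 2 for k in range(1, n))

    n = 10
    print(elimination_multiplications(n))        # 285
    print((n - 1) * n * (2 * n - 1) // 6)        # closed form: also 285
    print(n ** 3 // 3)                           # leading-order estimate: 333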

What about the cost of the back solve? Step k of the back solve requires about k multiplications and k additions. So the total work for a back solve is

    sum_{k=1}^{n} k = n(n+1)/2 ~ n^2/2.

Ignore lower-order powers of n, but pay attention to the coefficient of the highest-order power of n. That gives about (1/2)n^2 + (1/2)n^2 = n^2 operations for the multiplications and additions together, considerably cheaper than the elimination procedure.

The LU decomposition. If we have more than one right-hand side (as is often the case), A x = b_i, i = 1, 2, ..., M, we can store the work involved in carrying out the elimination by storing the multipliers used in the row operations.

Linear algebra: the 3x3 system again. Recall the elimination above: elementary row operations applied to the augmented matrix zero out the entries below the diagonal and reduce the system to an upper triangular system. The multipliers m_ij used in the operations (eqn i) - m_ij (eqn j) are exactly the numbers worth keeping.

LU decomposition. The elimination leaves an upper triangular matrix U. Store the multipliers in a lower triangular matrix L with ones on its diagonal: the multiplier m_ij goes in position (i, j) below the diagonal.

LU decomposition. The product L U is equal to A. It does not cost us anything extra to store the multipliers, but by doing so we can now solve many systems involving the matrix A.
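A sketch of elimination that records the multipliers, in Python/NumPy, with a check that L U reproduces A. The function name and the 3x3 matrix are illustrative, and this version assumes no zero pivots are encountered:

    import numpy as np

    def lu_no_pivot(A):
        """LU factorization without row exchanges: store each multiplier in L as it is used."""
        n = A.shape[0]
        U = A.astype(float).copy()
        L = np.eye(n)
        for k in range(n - 1):
            for i in range(k + 1, n):
                m = U[i, k] / U[k, k]      # multiplier for (row i) - m*(row k)
                L[i, k] = m                # stored in the position that was just zeroed out
                U[i, k:] -= m * U[k, k:]
        return L, U

    A = np.array([[ 2.0,  1.0, 1.0],
                  [ 4.0, -6.0, 0.0],
                  [-2.0,  7.0, 2.0]])
    L, U = lu_no_pivot(A)
    print(np.allclose(L @ U, A))   # expect True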

Solution procedure given LU = A. How can we solve a system A x = b using the LU factorization? Step 0: factor A into L U (row reduction). Step 1: solve L y = b (forward substitution). Step 2: solve U x = y (back substitution). For each right-hand side, we only need to do on the order of n^2 operations; the expensive part is forming the original LU decomposition.
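In SciPy the same factor-once, solve-many pattern looks roughly like this (the random matrix and right-hand sides are illustrative):

    import numpy as np
    from scipy.linalg import lu_factor, lu_solve

    rng = np.random.default_rng(0)
    A = rng.standard_normal((5, 5))
    lu, piv = lu_factor(A)                 # Step 0: the O(n^3) factorization, done once

    for _ in range(3):                     # many right-hand sides, O(n^2) each
        b = rng.standard_normal(5)
        x = lu_solve((lu, piv), b)         # Steps 1-2: forward then back substitution
        print(np.allclose(A @ x, b))       # expect True each time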

Row exchanges. What if we start with a system whose matrix A has a zero in a pivot position, for example a zero in the upper-left entry? All we need to do is exchange rows of A and do the decomposition LU = PA, where P is a permutation matrix, e.g.

    P = [ 0 0 1
          0 1 0
          1 0 0 ]
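SciPy's lu routine returns such a permutation matrix. Note its convention is A = P L U, so L U = P^T A; the P on the slide corresponds to SciPy's P transposed. The matrix below is illustrative, chosen only because its upper-left entry is zero.

    import numpy as np
    from scipy.linalg import lu

    A = np.array([[0.0, 1.0, 2.0],
                  [3.0, 4.0, 5.0],
                  [6.0, 7.0, 9.0]])
    P, L, U = lu(A)                        # SciPy convention: A = P @ L @ U
    print(np.allclose(P @ L @ U, A))       # expect True
    print(np.allclose(L @ U, P.T @ A))     # the slide's LU = PA, with P.T playing P's role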

Partial pivoting. We can also do row exchanges not just to avoid a zero pivot, but to make the pivot as large as possible. This is called partial pivoting: find the largest candidate pivot (in absolute value) in the current column, and do a row exchange. One can also do full pivoting by looking for the largest pivot in the entire remaining matrix, but this is rarely done.
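A sketch of elimination with partial pivoting (our own helper, returning P, L, U with P A = L U); at each step it swaps in the row whose entry in the pivot column is largest in magnitude:

    import numpy as np

    def lu_partial_pivot(A):
        """Gaussian elimination with partial pivoting; returns P, L, U with P @ A = L @ U."""
        n = A.shape[0]
        U = A.astype(float).copy()
        L = np.eye(n)
        P = np.eye(n)
        for k in range(n - 1):
            # pick the row at or below k with the largest entry in column k
            p = k + np.argmax(np.abs(U[k:, k]))
            for M in (U, P):
                M[[k, p], :] = M[[p, k], :]
            L[[k, p], :k] = L[[p, k], :k]   # keep previously stored multipliers consistent
            for i in range(k + 1, n):
                m = U[i, k] / U[k, k]
                L[i, k] = m
                U[i, k:] -= m * U[k, k:]
        return P, L, U

    A = np.array([[1e-12, 1.0], [1.0, 1.0]])   # tiny pivot: exchange rows first
    P, L, U = lu_partial_pivot(A)
    print(np.allclose(P @ A, L @ U))           # expect True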

Top 10 algorithms. The matrix decompositions are listed as one of the top 10 algorithms of the 20th century. From SIAM News, Volume 33, Number 4, "The Best of the 20th Century: Editors Name Top 10 Algorithms": 1951: Alston Householder of Oak Ridge National Laboratory formalizes the decompositional approach to matrix computations. The ability to factor matrices into triangular, diagonal, orthogonal, and other special forms has turned out to be extremely useful. The decompositional approach has enabled software developers to produce flexible and efficient matrix packages. It also facilitates the analysis of rounding errors, one of the big bugbears of numerical linear algebra. (In 1961, James Wilkinson of the National Physical Laboratory in London published a seminal paper in the Journal of the ACM, titled "Error Analysis of Direct Methods of Matrix Inversion," based on the LU decomposition of a matrix as a product of lower and upper triangular factors.)