Linear Algebra: Matrix Decompositions
Monday, February 11th, Math Week #1
Saturday, February 1
Linear algebra

A typical linear system of equations, three equations in three unknowns:

    a_11 x_1 + a_12 x_2 + a_13 x_3 = b_1
    a_21 x_1 + a_22 x_2 + a_23 x_3 = b_2
    a_31 x_1 + a_32 x_2 + a_33 x_3 = b_3

The variables x_1, x_2, and x_3 appear only as linear terms (no powers or products).
Linear algebra

Where do linear systems come from?
- Fitting curves to data
- Polynomial approximation of functions
- Computational fluid dynamics
- Network flow
- Computer graphics
- Difference equations
- Differential equations
- Dynamical systems theory
- ...
Typical linear system

How does Matlab solve a linear system Ax = b, such as the 3x3 system above?

- Does such a system always have a solution?
- Can such a system be solved efficiently for millions of equations?
- What does Matlab do if we have more equations than unknowns? More unknowns than equations?
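To make the question concrete, here is a minimal sketch in NumPy, whose `linalg.solve` (like Matlab's backslash) performs an LU-based direct solve. The 3x3 system below is an arbitrary invertible example chosen for illustration, not the slide's original coefficients:

```python
import numpy as np

# A hypothetical 3x3 system; any nonsingular A works the same way.
A = np.array([[2.0, -1.0, 1.0],
              [3.0,  1.0, 9.0],
              [1.0,  4.0, -2.0]])
b = np.array([5.0, 0.0, 3.0])

# NumPy's analogue of Matlab's  x = A \ b : an LU-based direct solve.
x = np.linalg.solve(A, b)

# The residual A x - b should be (numerically) zero.
print(np.allclose(A @ x, b))  # True
```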
Solving linear systems

We are already familiar with at least one type of linear system: two equations in two unknowns,

    a_11 x_1 + a_12 x_2 = b_1
    a_21 x_1 + a_22 x_2 = b_2

The solution is the intersection of the two lines represented by the equations: a point (x_1, x_2) that satisfies both equations simultaneously.

We could also view the solution as providing the correct linear combination of the column vectors (a_11, a_21) and (a_12, a_22) that gives us (b_1, b_2):

    x_1 [a_11; a_21] + x_2 [a_12; a_22] = [b_1; b_2]
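The two views can be checked numerically. This sketch uses a hypothetical 2x2 system (illustrative numbers, not the slide's originals): the same solution x satisfies both rows and also gives the linear combination of A's columns equal to b.

```python
import numpy as np

# Hypothetical 2x2 system:
#   1*x1 + 2*x2 = 5
#   3*x1 - 1*x2 = 1
A = np.array([[1.0, 2.0],
              [3.0, -1.0]])
b = np.array([5.0, 1.0])
x = np.linalg.solve(A, b)

# Row view: the point x lies on both lines.
# Column view: x weights the columns of A so they add up to b.
combo = x[0] * A[:, 0] + x[1] * A[:, 1]
print(np.allclose(combo, b))  # True
```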
Linear algebra: a 2x2 system

We can row-reduce the augmented matrix to find the solution:

    [a_11 a_12 | b_1]        [a_11  a_12 | b_1 ]
    [a_21 a_22 | b_2]   ->   [  0  a'_22 | b'_2]

Use an elementary row operation, (eqn 2) <- (eqn 2) - (a_21/a_11)(eqn 1), to produce a 0 in the lower-left corner. Then use back-substitution to solve first for x_2 = b'_2 / a'_22 and then for x_1 = (b_1 - a_12 x_2) / a_11.
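The row operation and back-substitution above can be sketched directly in code; `solve_2x2` is an illustrative helper name and the test numbers are hypothetical:

```python
# Solve a 2x2 system by one elementary row operation plus back-substitution:
#   [a11 a12 | b1]
#   [a21 a22 | b2]
def solve_2x2(a11, a12, b1, a21, a22, b2):
    # (eqn 2) <- (eqn 2) - (a21/a11) * (eqn 1): zero the lower-left entry.
    m = a21 / a11
    a22p = a22 - m * a12
    b2p = b2 - m * b1
    # Back-substitution: x2 first, then x1.
    x2 = b2p / a22p
    x1 = (b1 - a12 * x2) / a11
    return x1, x2

print(solve_2x2(1.0, 2.0, 5.0, 3.0, -1.0, 1.0))  # (1.0, 2.0)
```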
Linear algebra: a 2x2 system

[Figure: the solution viewed as the intersection of two lines in the (x_1, x_2) plane, and as a linear combination of vectors.]
Linear algebra: a 3x3 system

Apply elementary row operations to the augmented matrix [A | b] to zero out the entries below the diagonal and reduce the system to an upper triangular system [U | c]:

    (eqn 2) <- (eqn 2) - l_21 (eqn 1)
    (eqn 3) <- (eqn 3) - l_31 (eqn 1)
    (eqn 3) <- (eqn 3) - l_32 (eqn 2)

The diagonal entries of U are the pivots; the factors l_ij are the multipliers.
Solve the upper triangular system for x

    U x = c

Notice that the right-hand side c is not the right-hand side b of the original system. Use back-substitution to solve for x:

    Step 1: x_3 = c_3 / u_33
    Step 2: x_2 = (c_2 - u_23 x_3) / u_22
    Step 3: x_1 = (c_1 - u_12 x_2 - u_13 x_3) / u_11

Each step uses only values known from the previous steps.
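The full procedure, forward elimination followed by back-substitution, can be sketched for a general n x n system. `gauss_solve` is an illustrative helper (no pivoting, so it assumes no zero pivots are encountered), and the 3x3 test system is hypothetical:

```python
import numpy as np

def gauss_solve(A, b):
    """Forward elimination without pivoting, then back-substitution."""
    A = A.astype(float).copy()
    b = b.astype(float).copy()
    n = len(b)
    # Forward elimination: zero out entries below each pivot.
    for k in range(n - 1):
        for i in range(k + 1, n):
            m = A[i, k] / A[k, k]          # the multiplier
            A[i, k:] -= m * A[k, k:]
            b[i] -= m * b[k]               # the right-hand side changes too
    # Back-substitution on the upper triangular system U x = c.
    x = np.zeros(n)
    for i in range(n - 1, -1, -1):
        x[i] = (b[i] - A[i, i + 1:] @ x[i + 1:]) / A[i, i]
    return x

A = np.array([[2.0, 1.0, 1.0],
              [4.0, 3.0, 3.0],
              [8.0, 7.0, 9.0]])
b = np.array([4.0, 10.0, 24.0])
print(gauss_solve(A, b))  # [1. 1. 1.]
```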
Cost of elimination

The cost of eliminating entries below the 1st pivot: to eliminate (or "zero out") the column below the first pivot of an n x n matrix (here n = 5), we must do (n-1)^2 = (4)^2 = 16 multiplications and subtractions.
Cost of elimination

The cost of eliminating entries below the 2nd pivot:

    0 X X X X
    0 X X X X
    0 X X X X
    0 X X X X

To eliminate the column below the second pivot, we must do (n-2)^2 = (3)^2 = 9 multiplications and subtractions.
Cost of elimination

The cost of eliminating entries below the 3rd pivot:

    0 X X X X
    0 0 X X X
    0 0 X X X
    0 0 X X X

To eliminate the column below the third pivot, we must do (n-3)^2 = (2)^2 = 4 multiplications and subtractions.
Cost of elimination

The cost of eliminating entries below the 4th pivot:

    0 X X X X
    0 0 X X X
    0 0 0 X X
    0 0 0 X X

To eliminate the column below the fourth pivot, we must do (n-4)^2 = (1)^2 = 1 multiplication and subtraction.
Cost of elimination

The total number of multiplications is then:

    1^2 + 2^2 + ... + (n-1)^2 = (n-1)(n)(2n-1)/6 ≈ n^3/3

(we ignore lower-order powers of n). The number of subtractions is the same, so we have a total of about 2n^3/3 operations (ignoring terms involving n^2 and n). We say that elimination is an O(n^3) process. This is considered expensive for a linear solve.
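The operation count is easy to verify: summing the (n-k)^2 multiplications per column reproduces the closed form (n-1)(n)(2n-1)/6 ≈ n^3/3. `elimination_multiplications` is an illustrative name:

```python
def elimination_multiplications(n):
    """Count the multiplications in forward elimination of an n x n matrix:
    (n-k)^2 for the column below the k-th pivot."""
    count = 0
    for k in range(1, n):
        count += (n - k) ** 2
    return count

n = 100
exact = (n - 1) * n * (2 * n - 1) // 6     # sum-of-squares formula
print(elimination_multiplications(n) == exact)       # True
print(elimination_multiplications(n) / (n**3 / 3))   # close to 1 for large n
```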
What about the cost of the back solve?

Step k in the back solve requires k multiplications and k additions, so the total work for a back solve is:

    1 + 2 + ... + n = n(n+1)/2 ≈ n^2/2

(ignore lower-order powers of n, but pay attention to the coefficient of the highest-order power). That gives n^2/2 + n^2/2 = n^2 operations counting both the additions and the multiplications: considerably cheaper than the elimination procedure.
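The back-solve count can be checked the same way; with the slide's tally of k multiplications at step k, the total matches n(n+1)/2. `backsolve_multiplications` is an illustrative name:

```python
def backsolve_multiplications(n):
    """Step k of back-substitution uses k multiplications,
    so the total is 1 + 2 + ... + n."""
    return sum(range(1, n + 1))

n = 1000
print(backsolve_multiplications(n) == n * (n + 1) // 2)  # True
# ~ n^2/2 multiplications and the same number of additions: ~ n^2 in total,
# far cheaper than the ~ 2n^3/3 operations of elimination.
```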
The LU decomposition

We often have more than one right-hand side to solve with the same matrix (as is often the case):

    A x = b_i,   i = 1, 2, ..., M

We can store the work involved in carrying out the elimination by saving the multipliers used in the row operations.
Linear algebra: a 3x3 system (revisited)

Recall: elementary row operations applied to the augmented matrix [A | b] zero out the entries below the diagonal and reduce the system to an upper triangular system [U | c]:

    (eqn 2) <- (eqn 2) - l_21 (eqn 1)
    (eqn 3) <- (eqn 3) - l_31 (eqn 1)
    (eqn 3) <- (eqn 3) - l_32 (eqn 2)

This time, keep the multipliers l_ij.
LU decomposition

Elimination leaves an upper triangular matrix:

    U = [ u_11 u_12 u_13 ]
        [  0   u_22 u_23 ]
        [  0    0   u_33 ]

Store the multipliers in a unit lower triangular matrix:

    L = [  1    0   0 ]
        [ l_21  1   0 ]
        [ l_31 l_32 1 ]
LU decomposition

The product LU is equal to A:

    L U = A

It does not cost us anything to store the multipliers. But by doing so, we can now solve many systems involving the matrix A.
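Storing the multipliers as they are computed yields L directly. This minimal no-pivoting sketch (`lu_nopivot` is an illustrative name, and the 3x3 matrix is hypothetical) verifies that the product LU reproduces A:

```python
import numpy as np

def lu_nopivot(A):
    """Doolittle-style LU without pivoting: multipliers go into L
    (unit diagonal), the reduced rows into U. Assumes no zero pivots."""
    n = A.shape[0]
    L = np.eye(n)
    U = A.astype(float).copy()
    for k in range(n - 1):
        for i in range(k + 1, n):
            L[i, k] = U[i, k] / U[k, k]   # the multiplier, stored for free
            U[i, k:] -= L[i, k] * U[k, k:]
    return L, U

A = np.array([[2.0, 1.0, 1.0],
              [4.0, 3.0, 3.0],
              [8.0, 7.0, 9.0]])
L, U = lu_nopivot(A)
print(np.allclose(L @ U, A))  # True
```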
Solution procedure given LU = A

How can we solve a system Ax = b using the LU factorization?

    Step 0: Factor A into LU    (row reduction)
    Step 1: Solve Ly = b        (forward substitution)
    Step 2: Solve Ux = y        (back substitution)

For each right-hand side, we only need to do O(n^2) operations. The expensive part is forming the original LU decomposition.
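Steps 1 and 2 are the two triangular solves sketched below (`forward_sub` and `back_sub` are illustrative names). The L and U here are a hypothetical pre-computed factorization, reused cheaply for several right-hand sides:

```python
import numpy as np

def forward_sub(L, b):
    """Solve L y = b for unit lower triangular L."""
    n = len(b)
    y = np.zeros(n)
    for i in range(n):
        y[i] = b[i] - L[i, :i] @ y[:i]
    return y

def back_sub(U, y):
    """Solve U x = y for upper triangular U."""
    n = len(y)
    x = np.zeros(n)
    for i in range(n - 1, -1, -1):
        x[i] = (y[i] - U[i, i + 1:] @ x[i + 1:]) / U[i, i]
    return x

# Factor once (here L and U are given, with A = L U), then each new
# right-hand side costs only two O(n^2) triangular solves.
L = np.array([[1.0, 0.0, 0.0],
              [2.0, 1.0, 0.0],
              [4.0, 3.0, 1.0]])
U = np.array([[2.0, 1.0, 1.0],
              [0.0, 1.0, 1.0],
              [0.0, 0.0, 2.0]])
A = L @ U
for b in (np.array([4.0, 10.0, 24.0]), np.array([1.0, 0.0, 0.0])):
    x = back_sub(U, forward_sub(L, b))
    print(np.allclose(A @ x, b))  # True
```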
Row exchanges

What if we start with a system whose first pivot is zero, e.g. a_11 = 0? All we need to do is exchange the rows of A, and do the decomposition on LU = PA, where P is a permutation matrix, e.g.

    P = [ 0 0 1 ]
        [ 0 1 0 ]
        [ 1 0 0 ]
Partial pivoting

We can also do row exchanges not just to avoid a zero pivot, but also to make the pivot as large as possible. This is called partial pivoting: find the largest candidate pivot (in magnitude) in the current column, at or below the diagonal, and do a row exchange.

One can also do full pivoting by looking for the largest pivot in the entire remaining submatrix, but this is rarely done.
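A sketch of partial pivoting (`lu_partial_pivot` is an illustrative name): at each column, the largest remaining entry is swapped into the pivot position, and the swaps are recorded in a permutation matrix P so that PA = LU. The hypothetical example matrix has a zero first pivot, which elimination without row exchanges could not handle:

```python
import numpy as np

def lu_partial_pivot(A):
    """LU with partial pivoting: returns P, L, U with P A = L U."""
    n = A.shape[0]
    U = A.astype(float).copy()
    L = np.eye(n)
    P = np.eye(n)
    for k in range(n - 1):
        # Choose the largest (in magnitude) pivot candidate in column k.
        p = k + np.argmax(np.abs(U[k:, k]))
        if p != k:
            U[[k, p], :] = U[[p, k], :]
            P[[k, p], :] = P[[p, k], :]
            L[[k, p], :k] = L[[p, k], :k]   # swap stored multipliers too
        for i in range(k + 1, n):
            L[i, k] = U[i, k] / U[k, k]
            U[i, k:] -= L[i, k] * U[k, k:]
    return P, L, U

# Zero in the (1,1) position: plain elimination fails, PA = LU succeeds.
A = np.array([[0.0, 2.0, 1.0],
              [4.0, 3.0, 3.0],
              [8.0, 7.0, 9.0]])
P, L, U = lu_partial_pivot(A)
print(np.allclose(P @ A, L @ U))  # True
```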
Top 10 algorithms

The matrix decompositions are listed as one of the top 10 algorithms of the 20th century, from SIAM News, "The Best of the 20th Century: Editors Name Top 10 Algorithms":

"1951: Alston Householder of Oak Ridge National Laboratory formalizes the decompositional approach to matrix computations. The ability to factor matrices into triangular, diagonal, orthogonal, and other special forms has turned out to be extremely useful. The decompositional approach has enabled software developers to produce flexible and efficient matrix packages. It also facilitates the analysis of rounding errors, one of the big bugbears of numerical linear algebra. (In 1961, James Wilkinson of the National Physical Laboratory in London published a seminal paper in the Journal of the ACM, titled 'Error Analysis of Direct Methods of Matrix Inversion,' based on the LU decomposition of a matrix as a product of lower and upper triangular factors.)"