The System of Linear Equations. Direct Methods. Xiaozhou Li.

1/16 The Direct Methods
Xiaozhou Li (xiaozhouli@uestc.edu.cn, http://xiaozhouli.com)
School of Mathematical Sciences
University of Electronic Science and Technology of China, Chengdu, China

2/16 Does the LU factorization always work?

That is, A = LU, where

L = \begin{pmatrix} 1 & & & \\ m_{21} & 1 & & \\ \vdots & & \ddots & \\ m_{n1} & \cdots & m_{n,n-1} & 1 \end{pmatrix}
\quad\text{and}\quad
U = A^{(n-1)} = \begin{pmatrix} a_{11} & a_{12} & \cdots & a_{1n} \\ & a_{22}^{(1)} & \cdots & a_{2n}^{(1)} \\ & & \ddots & \vdots \\ & & & a_{nn}^{(n-1)} \end{pmatrix}.

3/16 Theorem. a_{kk}^{(k-1)} \neq 0 for k = 1, 2, \ldots, n if and only if the leading principal minors of A are all nonzero:

D_k = \begin{vmatrix} a_{11} & a_{12} & \cdots & a_{1k} \\ a_{21} & a_{22} & \cdots & a_{2k} \\ \vdots & \vdots & & \vdots \\ a_{k1} & a_{k2} & \cdots & a_{kk} \end{vmatrix} \neq 0, \qquad k = 1, \ldots, n.

Theorem. If the LU factorization of the matrix A exists, then it is unique.
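
As a quick numerical check of the first theorem (my own illustration, not from the slides; the 3 x 3 matrix below is a hypothetical example), one can compute D_k = det(A[1:k, 1:k]) before attempting elimination without row exchanges:

```python
import numpy as np

# Hypothetical example: check the leading principal minors D_k before
# attempting an LU factorization without pivoting.
A = np.array([[4.0, 3.0, 2.0],
              [2.0, 4.0, 1.0],
              [1.0, 1.0, 3.0]])

for k in range(1, A.shape[0] + 1):
    print(f"D_{k} = {np.linalg.det(A[:k, :k]):.4g}")
# All D_k nonzero  =>  elimination needs no row exchanges and A = LU exists.
```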

4/16

A = \begin{pmatrix} a_{11} & a_{12} & \cdots & a_{1n} \\ a_{21} & a_{22} & \cdots & a_{2n} \\ \vdots & \vdots & & \vdots \\ a_{n1} & a_{n2} & \cdots & a_{nn} \end{pmatrix} = LU,
\quad\text{where}\quad
L = \begin{pmatrix} 1 & & & \\ l_{21} & 1 & & \\ \vdots & & \ddots & \\ l_{n1} & \cdots & l_{n,n-1} & 1 \end{pmatrix},
\quad
U = \begin{pmatrix} u_{11} & u_{12} & \cdots & u_{1n} \\ & u_{22} & \cdots & u_{2n} \\ & & \ddots & \vdots \\ & & & u_{nn} \end{pmatrix}.

5/16 1st row and 1st column:

u_{1j} = a_{1j}, \quad j = 1, \ldots, n; \qquad l_{i1} = \frac{a_{i1}}{u_{11}}, \quad i = 2, \ldots, n.

6/16 rth row:

a_{rj} = \sum_{k=1}^{n} l_{rk} u_{kj} = \sum_{k=1}^{r-1} l_{rk} u_{kj} + u_{rj}
\;\Longrightarrow\;
u_{rj} = a_{rj} - \sum_{k=1}^{r-1} l_{rk} u_{kj}, \qquad j = r, \ldots, n.

rth column:

a_{ir} = \sum_{k=1}^{n} l_{ik} u_{kr} = \sum_{k=1}^{r-1} l_{ik} u_{kr} + l_{ir} u_{rr}
\;\Longrightarrow\;
l_{ir} = \left( a_{ir} - \sum_{k=1}^{r-1} l_{ik} u_{kr} \right) \Big/ u_{rr}, \qquad i = r+1, \ldots, n.
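
These row/column recurrences translate directly into code. Below is a minimal Python sketch of this compact factorization with L unit lower triangular; the function name lu_doolittle, the use of NumPy, and the test matrix are my own choices rather than part of the slides, and no pivoting is performed, so all u_rr are assumed nonzero.

```python
import numpy as np

def lu_doolittle(A):
    """Compact LU factorization following the slide's recurrences.

    Returns (L, U) with L unit lower triangular and U upper triangular
    such that A = L @ U. Assumes all pivots u_rr are nonzero (no pivoting).
    """
    A = np.asarray(A, dtype=float)
    n = A.shape[0]
    L = np.eye(n)
    U = np.zeros((n, n))
    for r in range(n):
        # rth row of U: u_rj = a_rj - sum_{k<r} l_rk u_kj
        for j in range(r, n):
            U[r, j] = A[r, j] - L[r, :r] @ U[:r, j]
        # rth column of L: l_ir = (a_ir - sum_{k<r} l_ik u_kr) / u_rr
        for i in range(r + 1, n):
            L[i, r] = (A[i, r] - L[i, :r] @ U[:r, r]) / U[r, r]
    return L, U

# Small check on a hypothetical example matrix:
A = np.array([[4.0, 3.0, 2.0],
              [2.0, 4.0, 1.0],
              [1.0, 1.0, 3.0]])
L, U = lu_doolittle(A)
print(np.allclose(L @ U, A))  # True
```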

7/16 - 8/16 Example.

9/16 Crout factorization:

A = \begin{pmatrix} a_{11} & a_{12} & \cdots & a_{1n} \\ a_{21} & a_{22} & \cdots & a_{2n} \\ \vdots & \vdots & & \vdots \\ a_{n1} & a_{n2} & \cdots & a_{nn} \end{pmatrix} = L\bar{U},
\quad\text{where}\quad
L = \begin{pmatrix} l_{11} & & & \\ l_{21} & \ddots & & \\ \vdots & & \ddots & \\ l_{n1} & \cdots & l_{n,n-1} & l_{nn} \end{pmatrix},
\quad
\bar{U} = \begin{pmatrix} 1 & \bar{u}_{12} & \cdots & \bar{u}_{1n} \\ & 1 & \cdots & \bar{u}_{2n} \\ & & \ddots & \vdots \\ & & & 1 \end{pmatrix}.

Cholesky factorization: when A is a symmetric positive definite (SPD) matrix, A = LL^T with L lower triangular.
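
For the SPD case, the Cholesky factor can be built column by column with the same bordered-sum idea as the recurrences above. A minimal sketch follows (my own illustration, not from the slides; the helper name cholesky_lower and the test matrix are assumptions):

```python
import numpy as np

def cholesky_lower(A):
    """Return lower-triangular L with A = L @ L.T, assuming A is SPD."""
    A = np.asarray(A, dtype=float)
    n = A.shape[0]
    L = np.zeros((n, n))
    for j in range(n):
        s = A[j, j] - L[j, :j] @ L[j, :j]
        L[j, j] = np.sqrt(s)                      # s > 0 when A is SPD
        for i in range(j + 1, n):
            L[i, j] = (A[i, j] - L[i, :j] @ L[j, :j]) / L[j, j]
    return L

# Compare against NumPy's built-in routine on a small SPD matrix:
A = np.array([[4.0, 2.0, 2.0],
              [2.0, 3.0, 1.0],
              [2.0, 1.0, 3.0]])
print(np.allclose(cholesky_lower(A), np.linalg.cholesky(A)))  # True
```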

10/16 Another Example

The system

\begin{cases} 0.0003\, x_1 + 1.566\, x_2 = 1.569 \\ 0.3454\, x_1 - 2.436\, x_2 = 1.018 \end{cases}

has the exact solution x_1 = 10, x_2 = 1. Solve it by elimination using four-decimal floating-point arithmetic.

Pivoting Strategy 1 (keep 0.0003 as the pivot):

\left[\begin{array}{cc|c} 0.0003 & 1.566 & 1.569 \\ 0.3454 & -2.436 & 1.018 \end{array}\right]
\;\longrightarrow\;
\left[\begin{array}{cc|c} 0.0003 & 1.566 & 1.569 \\ 0 & -1.804\times 10^{3} & -1.805\times 10^{3} \end{array}\right]

So we have x_2 = 1.001 and x_1 = 3.333, far from the true x_1 = 10.

11/16 Pivoting Strategy 2 (interchange the rows, so 0.3454 becomes the pivot):

\left[\begin{array}{cc|c} 0.0003 & 1.566 & 1.569 \\ 0.3454 & -2.436 & 1.018 \end{array}\right]
\;\longrightarrow\;
\left[\begin{array}{cc|c} 0.3454 & -2.436 & 1.018 \\ 0.0003 & 1.566 & 1.569 \end{array}\right]
\;\longrightarrow\;
\left[\begin{array}{cc|c} 0.3454 & -2.436 & 1.018 \\ 0 & 1.568 & 1.568 \end{array}\right]

So we have x_2 = 1 and x_1 = 10.

It is not possible at present to give a best pivoting strategy for a general linear system, nor is it even clear what such a term might mean.
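
To reproduce the effect of limited precision programmatically, one can round every intermediate result to four significant decimal digits. The sketch below is my own illustration, not from the slides: the helper fl is a crude model of four-digit arithmetic and the hard-coded elimination steps mirror the two strategies above.

```python
import math

def fl(x, digits=4):
    """Round x to `digits` significant decimal digits (crude float model)."""
    if x == 0.0:
        return 0.0
    e = math.floor(math.log10(abs(x)))
    return round(x, digits - 1 - e)

# Strategy 1: keep 0.0003 as the pivot.
m   = fl(0.3454 / 0.0003)                 # multiplier 1151
a22 = fl(-2.436 - fl(m * 1.566))          # -1804
b2  = fl(1.018  - fl(m * 1.569))          # -1805
x2  = fl(b2 / a22)                        # 1.001
x1  = fl(fl(1.569 - fl(1.566 * x2)) / 0.0003)
print("strategy 1:", x1, x2)              # 3.333 1.001  (x1 badly wrong)

# Strategy 2: interchange rows, pivot on 0.3454.
m   = fl(0.0003 / 0.3454)
a22 = fl(1.566 - fl(m * -2.436))          # 1.568
b2  = fl(1.569 - fl(m * 1.018))           # 1.568
x2  = fl(b2 / a22)                        # 1.0
x1  = fl(fl(1.018 + fl(2.436 * x2)) / 0.3454)
print("strategy 2:", x1, x2)              # 10.0 1.0
```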

12/16 For the sake of economy, the pivotal equation for each step must be selected on the basis of the current state of the system under consideration at the beginning of the step, i.e., without foreknowledge of the effect of the selection on later steps. A currently accepted strategy is partial pivoting.

13/16 Compare the candidate pivots before carrying out each elimination step: at the beginning of the kth step of the elimination, pick as pivotal equation the one among the n - k + 1 remaining candidates that has the absolutely largest coefficient of x_k; call it equation p (k <= p <= n). Exchange the kth row and the pth row if necessary. Using partial pivoting ensures that all multipliers, i.e., the entries of L, are no greater than 1 in absolute value.
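
A minimal Python sketch of elimination with partial pivoting, returning P, L, U with PA = LU, is given below; it is my own illustration (the function name lu_partial_pivot is not from the slides).

```python
import numpy as np

def lu_partial_pivot(A):
    """Gaussian elimination with partial pivoting: returns P, L, U with P @ A = L @ U."""
    A = np.asarray(A, dtype=float)
    n = A.shape[0]
    U = A.copy()
    L = np.eye(n)
    P = np.eye(n)
    for k in range(n - 1):
        # Pick the row p >= k with the largest |U[p, k]| and swap it into place.
        p = k + np.argmax(np.abs(U[k:, k]))
        if p != k:
            U[[k, p], :] = U[[p, k], :]
            P[[k, p], :] = P[[p, k], :]
            L[[k, p], :k] = L[[p, k], :k]   # swap already-computed multipliers
        # Eliminate below the pivot; multipliers have magnitude <= 1.
        for i in range(k + 1, n):
            L[i, k] = U[i, k] / U[k, k]
            U[i, k:] -= L[i, k] * U[k, k:]
    return P, L, U

# Quick self-check on a small matrix (the same one as the later worked example):
A = np.array([[2.0, 1.0, 5.0], [4.0, 4.0, -4.0], [1.0, 3.0, 1.0]])
P, L, U = lu_partial_pivot(A)
print(np.allclose(P @ A, L @ U))   # True
```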

14/16 Permutation Matrices

Definition. A permutation matrix is an n × n matrix consisting of all zeros, except for a single 1 in every row and every column. Equivalently, a permutation matrix P is created by applying arbitrary row exchanges (or arbitrary column exchanges) to the n × n identity matrix.

Theorem. Let P be the n × n permutation matrix formed by a particular set of row exchanges applied to the identity matrix. Then, for any n × n matrix A, PA is the matrix obtained by applying exactly the same set of row exchanges to A.
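
A small illustration of the theorem (my own example, not from the slides), using the matrix from the next slide: exchanging rows 1 and 2 of the identity gives P, and P @ A applies exactly that exchange to A.

```python
import numpy as np

# P: exchange rows 1 and 2 of the 3x3 identity matrix.
P = np.eye(3)[[1, 0, 2], :]

A = np.array([[2.0, 1.0, 5.0],
              [4.0, 4.0, -4.0],
              [1.0, 3.0, 1.0]])

# P @ A applies exactly the same row exchange to A.
print(P @ A)          # rows of A in the order 2, 1, 3
```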

15/16 Example

Find the PA = LU factorization of the matrix

A = \begin{pmatrix} 2 & 1 & 5 \\ 4 & 4 & -4 \\ 1 & 3 & 1 \end{pmatrix}.

The solution:

\begin{pmatrix} 0 & 1 & 0 \\ 0 & 0 & 1 \\ 1 & 0 & 0 \end{pmatrix}
\begin{pmatrix} 2 & 1 & 5 \\ 4 & 4 & -4 \\ 1 & 3 & 1 \end{pmatrix}
=
\begin{pmatrix} 1 & 0 & 0 \\ 1/4 & 1 & 0 \\ 1/2 & -1/2 & 1 \end{pmatrix}
\begin{pmatrix} 4 & 4 & -4 \\ 0 & 2 & 2 \\ 0 & 0 & 8 \end{pmatrix}.

If we rewrite this as A = PLU, then what is P? It is the inverse of the permutation matrix above, P^{-1} = P^T.
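
One way to check this example (my own addition, not part of the slides) is scipy.linalg.lu, which returns its factors in the A = PLU convention asked about at the end of the slide:

```python
import numpy as np
from scipy.linalg import lu

A = np.array([[2.0, 1.0, 5.0],
              [4.0, 4.0, -4.0],
              [1.0, 3.0, 1.0]])

# SciPy's lu returns P, L, U with A = P @ L @ U (the A = PLU form).
P, L, U = lu(A)
print(L)                            # multipliers 1/4, 1/2, -1/2 below a unit diagonal
print(U)                            # [[4, 4, -4], [0, 2, 2], [0, 0, 8]]
print(np.allclose(P @ L @ U, A))    # True
# P.T @ A == L @ U recovers the PA = LU form used on the slide.
```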

16/16 Comments on pivoting

Pivoting increases the computational cost of elimination.

Consider a variation of the previous example:

\begin{cases} 3\, x_1 + 1566\, x_2 = 1569 \\ 0.3454\, x_1 - 2.436\, x_2 = 1.018 \end{cases}

Is partial pivoting stable here? This motivates scaled partial pivoting (code; a sketch is given below).

Is pivoting always better than eliminating without pivoting? Consider the tridiagonal case and, more generally, banded (d-diagonal) matrices.

Complete pivoting (elimination with both row and column exchanges).
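
The slide points to code for scaled partial pivoting; the sketch below is my own minimal version of the idea, not the code referenced there (the function name solve_scaled_partial_pivoting is an assumption). At step k the pivot row maximizes |a_pk| / s_p, where s_p is the largest absolute entry of row p.

```python
import numpy as np

def solve_scaled_partial_pivoting(A, b):
    """Solve A x = b by elimination with scaled partial pivoting (a sketch)."""
    A = np.array(A, dtype=float)
    b = np.array(b, dtype=float)
    n = A.shape[0]
    s = np.max(np.abs(A), axis=1)          # row scale factors
    for k in range(n - 1):
        # Pivot row maximizes the scaled ratio |a_pk| / s_p.
        p = k + np.argmax(np.abs(A[k:, k]) / s[k:])
        if p != k:                          # row interchange
            A[[k, p]], b[[k, p]], s[[k, p]] = A[[p, k]], b[[p, k]], s[[p, k]]
        for i in range(k + 1, n):
            m = A[i, k] / A[k, k]
            A[i, k:] -= m * A[k, k:]
            b[i] -= m * b[k]
    # Back substitution.
    x = np.zeros(n)
    for i in range(n - 1, -1, -1):
        x[i] = (b[i] - A[i, i + 1:] @ x[i + 1:]) / A[i, i]
    return x

# Check on the system from slide 10/16 (exact solution x1 = 10, x2 = 1):
print(solve_scaled_partial_pivoting([[0.0003, 1.566], [0.3454, -2.436]],
                                    [1.569, 1.018]))   # approx [10, 1]
```

For the scaled variant on this slide, the rule compares 3/1566 with 0.3454/2.436, so the second equation is still chosen as the pivotal one even though its leading coefficient is smaller in absolute value.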