Solving Systems of Linear Equations


1 Solving Systems of Linear Equations
Motivation
  The idea behind Google's PageRank
  An example from economics
Gaussian elimination
LU decomposition
Iterative methods
  The Jacobi method
Summary

2 Motivation
Systems of linear equations of the form A x = b, or in components

a11 x1 + a12 x2 + ... + a1n xn = b1
a21 x1 + a22 x2 + ... + a2n xn = b2
...
am1 x1 + am2 x2 + ... + amn xn = bm

with

    [ a11 a12 ... a1n ]        [ b1 ]
A = [ a21 a22 ... a2n ]    b = [ b2 ]
    [ ...             ]        [ ...]
    [ am1 am2 ... amn ]        [ bm ]

are found in many contexts. For instance, we need them to find roots of systems of non-linear equations using Newton's method.

3 Short Reminder (1)
Matrix-vector multiplication:

[ a11 a12 a13 ] [ x1 ]   [ a11 x1 + a12 x2 + a13 x3 ]
[ a21 a22 a23 ] [ x2 ] = [ a21 x1 + a22 x2 + a23 x3 ]
[ a31 a32 a33 ] [ x3 ]   [ a31 x1 + a32 x2 + a33 x3 ]

Matrix-matrix multiplication:

[ a11 a12 a13 ] [ b11 b12 b13 ]   [ ...                  ]
[ a21 a22 a23 ] [ b21 b22 b23 ] = [ ... sum_k aik bkj ...]
[ a31 a32 a33 ] [ b31 b32 b33 ]   [ ...                  ]

i.e. the (i,j) entry of the product is the sum over k of aik bkj.
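
As a quick illustration of these two operations, here is a minimal NumPy sketch; the matrices are arbitrary example values, not taken from the slides.

    import numpy as np

    A = np.array([[1, 2, 3],
                  [4, 5, 6],
                  [7, 8, 9]])
    x = np.array([1, 0, -1])
    B = np.array([[1, 0, 0],
                  [0, 2, 0],
                  [0, 0, 3]])

    print(A @ x)   # matrix-vector product: entry i is sum_j A[i, j] * x[j]
    print(A @ B)   # matrix-matrix product: entry (i, j) is sum_k A[i, k] * B[k, j]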

4 Short Reminder (2)
Graphical illustration of 2-d systems: every equation represents a line (figure: non-singular vs. singular case).
The following situations are possible:
Lines cross at exactly one point -> unique solution (non-singular)
Lines are parallel and don't cross -> no solution (singular)
Lines overlap completely -> infinitely many solutions (singular)
In higher dimensions the same situations are possible.

5 Short Reminder (3)
Focus on square matrices. Some criteria for an n x n matrix A to be non-singular:
The inverse of A (denoted A^-1) exists
det(A) != 0
rank(A) = n
For any vector z != 0 it holds that Az != 0
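
These criteria can be checked numerically; a small sketch using NumPy's determinant, rank and inverse routines, with made-up example matrices:

    import numpy as np

    A = np.array([[2.0, 1.0], [5.0, 7.0]])   # non-singular example
    S = np.array([[1.0, 2.0], [2.0, 4.0]])   # singular example (second row = 2 * first row)

    print(np.linalg.det(A), np.linalg.matrix_rank(A))   # det != 0, rank = n = 2
    print(np.linalg.det(S), np.linalg.matrix_rank(S))   # det ~ 0,  rank = 1 < n
    print(np.linalg.inv(A))                             # exists only in the non-singular case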

6 Example: How Google Search Works (sort of)
In a particular group of people, who is the most popular?
Say there are three people: Jane, Charlie, Fred.
We ask them to list their friends: J lists C; C lists J, F; F lists J.
Could write this as a matrix (a 1 in row i, column j means that person j lists person i as a friend):
    J  C  F
J   0  1  1
C   1  0  0
F   0  1  0

7 Example (2)
Some people list more friends than others. To compensate we normalise each column by its sum.
Now we want to find a vector p = (j, c, f) which assigns a popularity to each person.
Idea: a person's popularity is the weighted sum of the popularities of the people who mention them: p = Lp
    J    C   F
J   0   1/2  1
C   1    0   0   = L   (the linking matrix)
F   0   1/2  0

8 Example (3)
(Figure: the friendship network drawn as a graph, with link weights 1, 1/2, 1/2, 1/3, 1/3, 1/3.)
A node (person) is the more popular
the more people list it as a friend,
the fewer other friends these people list, and
the more popular these friends are
(i.e. having one very popular good friend can make you more popular than having 10 not-so-popular friends).

9 Example (4)
This defines a system of linear equations, Lp = p:
    J    C   F
J   0   1/2  1
C   1    0   0   = L
F   0   1/2  0
E.g.: j = 1/2 c + f,  c = j,  f = 1/2 c
In this case this is easy to solve. The system is under-determined: set (e.g.) j = 1 as a scale -> c = 1, f = 1/2, i.e. Charlie and Jane are most popular, Fred is less so.
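
For larger networks this is done numerically. A minimal sketch, assuming NumPy: p = Lp means p is an eigenvector of L with eigenvalue 1, so we can take that eigenvector and rescale it.

    import numpy as np

    # Linking matrix from the slides (columns J, C, F, each normalised to sum 1)
    L = np.array([[0.0, 0.5, 1.0],
                  [1.0, 0.0, 0.0],
                  [0.0, 0.5, 0.0]])

    vals, vecs = np.linalg.eig(L)
    k = np.argmin(np.abs(vals - 1.0))   # index of the eigenvalue closest to 1
    p = np.real(vecs[:, k])
    p = p / p[0]                        # fix the scale so that j = 1
    print(p)                            # -> approximately [1. , 1. , 0.5]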

10 Remark
We could use this to assign a popularity to webpages. The friend lists from before would then just refer to the lists of links in webpages, i.e. how many webpages link to a given page (normalised).
The popularity p could be interpreted as page rank (or we might wish to add contextual information c for search queries and not base the ranking purely on popularity, e.g. p = Lp + c).
We might wonder if the system Lp = p always has positive solutions p -> this is ensured by the Frobenius-Perron theorem.

11 Example: Setting Prices in an Economy
This is a problem that goes back to Wassily Leontief (who got the Nobel prize for economics in 1973).
Suppose in a far away land of Eigenbazistan, in a small country town called Matrixville, there lived a Farmer, a Tailor, a Carpenter, a Coal Miner and Slacker Bob. The Farmer produced food; the Tailor, clothes; the Carpenter, housing; the Coal Miner supplied energy; and Slacker Bob made High Quality 100% Proof Moonshine, half of which he drank himself.
Let's assume that:
There is no outside supply or demand
Everything produced is consumed
We need to specify what fraction of each good is consumed by each person.
Problem: what are the incomes of each person?

12 Consumption
(Table: rows Farmer, Tailor, Carpenter, Coal Miner, Slacker Bob; columns Food, Clothes, Housing, Energy, High Quality 100% Proof Moonshine; each entry is the fraction of that good consumed by that person.)
I.e. the Farmer consumes 25% of all food, 15% of all clothes, 25% of all housing, 18% of all energy (and 20% of the moonshine).
Let's say the price levels are p_F, p_T, p_CA, p_CO, p_S. Since this is a closed economy, each person's income must equal the value of what they consume; for the Farmer:
p_F = 0.25 p_F + 0.15 p_T + 0.25 p_CA + 0.18 p_CO + 0.20 p_S

13 Again... a System of Linear Equations
p_F  = 0.25 p_F + 0.15 p_T + 0.25 p_CA + 0.18 p_CO + 0.20 p_S
p_T  = 0.15 p_F + ...                               + 0.05 p_S
p_CA = 0.22 p_F + ...                               + 0.10 p_S
p_CO = 0.20 p_F + ...                               + 0.15 p_S
p_S  = 0.18 p_F + ...                               + 0.50 p_S
(one equation per person, with the coefficients taken from the consumption table)
Again this is a system of the type Ap = p (i.e. under-determined, we can set one price arbitrarily).
Not so easy to solve by hand... computer methods would be useful!
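
A sketch of how such a closed-economy system Ap = p can be solved numerically: fix one price and solve the remaining equations. The function name and the 2-person consumption matrix below are made up for illustration; they are not the Matrixville table.

    import numpy as np

    def closed_economy_prices(A):
        # Solve A p = p for a column-stochastic consumption matrix A,
        # fixing the first price to 1 (the system is under-determined).
        n = A.shape[0]
        M = A - np.eye(n)              # (A - I) p = 0
        rhs = -M[1:, 0]                # move the p[0]-column to the right-hand side
        p_rest = np.linalg.solve(M[1:, 1:], rhs)
        return np.concatenate(([1.0], p_rest))

    # Toy 2-person economy (illustrative numbers only; columns sum to 1)
    A = np.array([[0.6, 0.3],
                  [0.4, 0.7]])
    print(closed_economy_prices(A))    # -> [1.0, 1.333...]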

14 Gaussian Elimination
One idea to mechanise solving such systems is Gaussian elimination. Consider the system:
2x + y - z = 8
-3x - y + 2z = -11
-2x + y + 2z = -3
Observations:
We can multiply equations by constants, swap equations, or add and subtract equations without changing the solution.
Can we apply such operations to transform the system into a convenient form? What is a convenient form? -> triangular

15 Gaussian Elimination (2)
Triangular form:
2x + y - z = 8         (1)
1/3 y + 1/3 z = 2/3    (2)
-z = 1                 (3)
If our system had the above form we could just solve it by successive substitution, i.e.
(3): z = -1
substitute into (2): 1/3 y - 1/3 = 2/3  ->  y = 3
substitute into (1): 2x + 3 - (-1) = 8  ->  x = 2

16 How to Transform into Triangular Form?
Start with:
2x + y - z = 8       (1)
-3x - y + 2z = -11   (2)
-2x + y + 2z = -3    (3)
Pivot on (1). Multiply (2) by 2/3 and add (1); the result replaces (2):
2x + y - z = 8           (1)
1/3 y + 1/3 z = 2/3      (2')
-2x + y + 2z = -3        (3)
Multiply (3) by 1 and add (1); the result replaces (3):
2x + y - z = 8           (1)
1/3 y + 1/3 z = 2/3      (2')
2y + z = 5               (3')

17 How to Transform into Triangular Form? (2)
2x + y - z = 8           (1)
1/3 y + 1/3 z = 2/3      (2')
2y + z = 5               (3')
Now multiply (2') by -6 and add to (3'):
2x + y - z = 8           (1)
1/3 y + 1/3 z = 2/3      (2')
-z = 1                   (3'')
The matrix is now in triangular form. We can diagonalise it as follows:

18 Diagonalising...
Multiply (2') by 3 and add (3''); multiply (3'') by -1 and add to (1):
2x + y = 7     (1')
y = 3          (2'')
-z = 1         (3'')
And finally: multiply (2'') by -1 and add to (1'), then scale the coefficients:
2x = 4         (1'')
y = 3          (2'')
-z = 1         (3'')
i.e. x = 2, y = 3, z = -1.
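
As a quick check of this worked example, the same system can be handed to a library solver; a minimal NumPy sketch:

    import numpy as np

    A = np.array([[ 2.0,  1.0, -1.0],
                  [-3.0, -1.0,  2.0],
                  [-2.0,  1.0,  2.0]])
    b = np.array([8.0, -11.0, -3.0])

    print(np.linalg.solve(A, b))   # -> [ 2.  3. -1.], i.e. x = 2, y = 3, z = -1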

19 Gaussian Elimination
Usually this is done using only the coefficient scheme, the augmented matrix.
It can also be used to diagonalise (invert) matrices: just append the identity matrix and perform all operations on it as well:
[ coefficient matrix | identity matrix ]

20 Example...
(Worked on the slide: the augmented matrix [A | I] for the example above is reduced with row operations such as 3/2*(1)+(2), clearing the entries below the diagonal.)

21 Example (2)
(Continuing on the slide: further row operations clear the entries above the diagonal, and the rows are rescaled by 1/2*(1), 1*(2) and -1*(3) so that the left block becomes the identity.)
The right block is then A^-1.
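
A minimal sketch of this augmented-matrix (Gauss-Jordan) inversion in NumPy; the helper name is made up, and no pivoting is done, so it assumes the pivots it meets are non-zero:

    import numpy as np

    def invert_gauss_jordan(A):
        # Invert A by row-reducing the augmented matrix [A | I].
        n = A.shape[0]
        M = np.hstack([A.astype(float), np.eye(n)])   # the augmented matrix [A | I]
        for k in range(n):
            M[k] /= M[k, k]                  # scale the pivot row
            for i in range(n):
                if i != k:
                    M[i] -= M[i, k] * M[k]   # clear column k in every other row
        return M[:, n:]                      # the right block is now A^-1

    A = np.array([[ 2.0,  1.0, -1.0],
                  [-3.0, -1.0,  2.0],
                  [-2.0,  1.0,  2.0]])
    A_inv = invert_gauss_jordan(A)
    print(np.allclose(A @ A_inv, np.eye(3)))   # True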

22 Example (3)
OK, we found A^-1. How can we check that this is correct? (Multiply: A A^-1 should give the identity.)
Suppose we now have A^-1: how is this useful to find x? (x = A^-1 b.)

23 Some Remarks
The pivot may be zero (or inconvenient for numerical stability) -> reorder rows.
If the pivot is small we divide by a very small number; better to reorder rows so as to use the largest possible pivot.
The computational complexity is of order O(n^3) (roughly: n-1 pivot columns need to be dealt with; for each, order n rows are updated with order n operations each).

24 Pseudocode
for k = 1 ... min(m,n):
    Find the k-th pivot:
    i_max := argmax (i = k ... m, abs(A[i, k]))
    if A[i_max, k] = 0
        error "Matrix is singular!"
    swap rows(k, i_max)
    Do for all rows below the pivot:
    for i = k+1 ... m:
        Do for all remaining elements in the current row:
        for j = k+1 ... n:
            A[i, j] := A[i, j] - A[k, j] * (A[i, k] / A[k, k])
        Fill the lower triangular part with zeros:
        A[i, k] := 0
This turns an m x n matrix into (row) echelon form, which can then be solved by back-substitution.
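
A runnable Python version of this pseudocode, as a sketch (the function name is made up); like the pseudocode it only reduces the matrix itself, whereas a real solver would apply the same operations to the right-hand side:

    import numpy as np

    def forward_elimination(A):
        # Row-echelon reduction with partial pivoting, following the pseudocode above.
        A = A.astype(float)
        m, n = A.shape
        for k in range(min(m, n)):
            # find the k-th pivot: the row with the largest |A[i, k]| for i >= k
            i_max = k + np.argmax(np.abs(A[k:, k]))
            if A[i_max, k] == 0:
                raise ValueError("Matrix is singular!")
            A[[k, i_max]] = A[[i_max, k]]          # swap rows k and i_max
            for i in range(k + 1, m):              # rows below the pivot
                factor = A[i, k] / A[k, k]
                A[i, k + 1:] -= factor * A[k, k + 1:]
                A[i, k] = 0.0                      # fill the lower part with zeros
        return A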

25 LU Decomposition
A slightly different strategy is LU factorisation: write A as a product of two triangular matrices, one upper triangular (U) and one lower triangular (L): A = LU
[ a11 a12 a13 ]   [ l11  0   0  ] [ u11 u12 u13 ]
[ a21 a22 a23 ] = [ l21 l22  0  ] [  0  u22 u23 ]
[ a31 a32 a33 ]   [ l31 l32 l33 ] [  0   0  u33 ]
How does this help to solve Ax = b?  Ax = LUx = b
We can first solve Ly = b and then Ux = y.

26 LU Decomposition (2)
Ly = b is easy to solve because
[ l11  0   0  ] [ y1 ]   [ l11 y1                   ]   [ b1 ]
[ l21 l22  0  ] [ y2 ] = [ l21 y1 + l22 y2          ] = [ b2 ]
[ l31 l32 l33 ] [ y3 ]   [ l31 y1 + l32 y2 + l33 y3 ]   [ b3 ]
so we can solve it by forward substitution (first y1, then y2, then y3).
Similarly, Ux = y is easy:
[ u11 u12 u13 ] [ x1 ]   [ u11 x1 + u12 x2 + u13 x3 ]   [ y1 ]
[  0  u22 u23 ] [ x2 ] = [ u22 x2 + u23 x3          ] = [ y2 ]
[  0   0  u33 ] [ x3 ]   [ u33 x3                   ]   [ y3 ]
and is solved by back-substitution (first x3, then x2, then x1).
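
A minimal sketch of the two triangular solves, assuming NumPy arrays with non-zero diagonal entries (the function names are made up):

    import numpy as np

    def forward_substitution(L, b):
        # Solve L y = b for a lower-triangular L.
        n = len(b)
        y = np.zeros(n)
        for i in range(n):
            y[i] = (b[i] - L[i, :i] @ y[:i]) / L[i, i]
        return y

    def back_substitution(U, y):
        # Solve U x = y for an upper-triangular U.
        n = len(y)
        x = np.zeros(n)
        for i in range(n - 1, -1, -1):
            x[i] = (y[i] - U[i, i+1:] @ x[i+1:]) / U[i, i]
        return x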

27 LU Decomposition (3)
This is essentially a variant of Gaussian elimination. Like in Gaussian elimination we might have to exchange
rows ("partial pivoting"): PA = LU
rows and columns ("full pivoting"): PAQ = LU
where P is a permutation matrix that accounts for the row permutations and Q for the column permutations.
Any square matrix allows an LUP factorization.
For positive definite symmetric (or Hermitian) matrices there is a similar decomposition, the Cholesky decomposition A = L L* (often used in statistics).
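
Standard library routines exist for both factorisations; a small sketch using SciPy and NumPy with made-up example matrices:

    import numpy as np
    from scipy.linalg import lu

    A = np.array([[ 2.0,  1.0, -1.0],
                  [-3.0, -1.0,  2.0],
                  [-2.0,  1.0,  2.0]])

    # LU factorisation with partial pivoting: SciPy returns P, L, U with A = P @ L @ U
    P, L, U = lu(A)
    print(np.allclose(A, P @ L @ U))   # True

    # Cholesky factorisation A = L L^T for a symmetric positive definite matrix
    S = np.array([[4.0, 2.0], [2.0, 3.0]])
    C = np.linalg.cholesky(S)          # lower-triangular factor
    print(np.allclose(S, C @ C.T))     # True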

28 Doolittle Algorithm (1)
One algorithm to generate L, U (and P).
Similarly to GE we aim to remove the entries of A under the diagonal, which is achieved by multiplying A by an appropriate matrix L' from the left, e.g.:
multiply the first row by -a21/a11 and add it to the second row,
multiply the first row by -a31/a11 and add it to the third row.
This is equivalent to multiplying A from the left:
[    1       0  0 ] [ a11 a12 a13 ]   [ a11  a12                   a13                 ]
[ -a21/a11   1  0 ] [ a21 a22 a23 ] = [  0   a22 - (a21/a11) a12   a23 - (a21/a11) a13 ]
[ -a31/a11   0  1 ] [ a31 a32 a33 ]   [  0   a32 - (a31/a11) a12   a33 - (a31/a11) a13 ]

29 Doolittle Algorithm (2)
Start with the matrix A^(0) = A.
At stage n, left-multiply A^(n-1) by a matrix L_n that has ones on the diagonal and non-zero entries only in the n-th column under the diagonal: there it contains -l_{i,n}, with the multipliers
l_{i,n} = a^(n-1)_{i,n} / a^(n-1)_{n,n}
In N-1 steps we obtain an upper triangular matrix:
A^(N-1) = L_{N-1} ... L_2 L_1 A = U
Hence A = L_1^{-1} L_2^{-1} ... L_{N-1}^{-1} U. Since products and inverses of lower triangular matrices are lower triangular, we have
L = L_1^{-1} L_2^{-1} ... L_{N-1}^{-1}   (*)
and A = LU.

30 Doolittle (3)
Still need to determine L from (*).
L_n is the identity matrix with the entries -l_{i,n}, ..., -l_{N,n} in the n-th column under the diagonal; its inverse L_n^{-1} is the identity with +l_{i,n}, ..., +l_{N,n} in the same positions.

31 Doolittle (4)
And hence:
    [ 1                                 ]
    [ l_21  1                           ]
L = [ l_31  l_32  1                     ]
    [ l_41  l_42  l_43  1               ]
    [ ...                               ]
    [ l_N1  l_N2  ...   l_{N,N-1}   1   ]
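
A compact sketch of Doolittle-style LU factorisation without pivoting (the function name is made up, and it assumes all pivots encountered are non-zero):

    import numpy as np

    def doolittle_lu(A):
        # A = L U with unit diagonal in L; no pivoting.
        A = A.astype(float)
        n = A.shape[0]
        L = np.eye(n)
        U = A.copy()
        for k in range(n - 1):
            for i in range(k + 1, n):
                L[i, k] = U[i, k] / U[k, k]      # multiplier l_{i,k}
                U[i, k:] -= L[i, k] * U[k, k:]   # eliminate the entry below the pivot
        return L, U

    A = np.array([[ 2.0,  1.0, -1.0],
                  [-3.0, -1.0,  2.0],
                  [-2.0,  1.0,  2.0]])
    L, U = doolittle_lu(A)
    print(np.allclose(A, L @ U))   # True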

32 Beyond Doolittle...
LU decomposition has some slight advantages over Gaussian elimination, e.g. once we have L and U we can solve systems with the same A for every right-hand side b.
Small modifications can also be used to calculate the matrix inverse.
Generalizations are e.g. the Crout algorithm or LUP decomposition (the wiki pages are very good on this topic and give many more advanced references).

33 Iteration Methods
The time complexity of LU decomposition or Gaussian elimination is O(n^3), i.e. dealing with millions of equations (which arise when solving certain PDEs) can take a long time...
There are faster methods that only come up with approximate solutions; these are iteration methods.
The trade-off of these methods is that they might not converge for general types of coefficient matrices A.
For diagonally dominant matrices a widely used method is the Jacobi method.

34 Jacobi Method
Want to solve Ax = b (*)
Write A = D + R, where D contains the diagonal elements of A and R is the rest.
Diagonally dominant means that D is large compared to R, i.e. |a_ii| > sum_{j != i} |a_ij| for every row i.
Write (*) as
A x = (D + R) x = b
x = D^-1 (b - R x)
and iterate this:
x_{n+1} = D^-1 (b - R x_n)     (n = iteration number)
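
A minimal sketch of this iteration in Python (the function name is made up, and the diagonal of A is assumed to be non-zero):

    import numpy as np

    def jacobi(A, b, x0, iterations=25):
        # Jacobi iteration x_{n+1} = D^{-1} (b - R x_n), where A = D + R
        D_diag = np.diag(A)              # diagonal elements of A
        R = A - np.diag(D_diag)          # the rest of A
        x = x0.astype(float)
        for _ in range(iterations):
            x = (b - R @ x) / D_diag     # multiplying by D^{-1} is an element-wise division
        return x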

35 Example Jacobi
Consider the system:
2x + y = 11
5x + 7y = 13
A = [ 2 1 ]     b = [ 11 ]
    [ 5 7 ]         [ 13 ]
Jacobi: x_{n+1} = D^-1 (b - R x_n), say x_{n+1} = c + T x_n, with x_0 = [1, 1]^T
D = [ 2 0 ]    R = [ 0 1 ]    D^-1 = [ 1/2  0  ]
    [ 0 7 ]        [ 5 0 ]           [  0  1/7 ]
c = D^-1 b = [11/2, 13/7]^T
T = -D^-1 R = [   0  -1/2 ]
              [ -5/7   0  ]

36 Example Jacobi (2)
Hence we have to iterate x_{n+1} = c + T x_n:
x_{n+1} = [ 11/2 ] + [   0  -1/2 ] x_n
          [ 13/7 ]   [ -5/7   0  ]
With x_0 = [1, 1]^T the first step gives
x_1 = [ 11/2 ] + [   0  -1/2 ] [ 1 ] = [  5  ]
      [ 13/7 ]   [ -5/7   0  ] [ 1 ]   [ 8/7 ]
Iterating 25 times one obtains x = [7.111..., -3.222...]
Exact -> x = 64/9 = 7.1111..., y = -29/9 = -3.2222...
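
The same numbers can be reproduced with a few lines of NumPy (a sketch, not the slides' own code):

    import numpy as np

    A = np.array([[2.0, 1.0], [5.0, 7.0]])
    b = np.array([11.0, 13.0])

    D_diag = np.diag(A)                 # [2, 7]
    R = A - np.diag(D_diag)             # off-diagonal part
    x = np.array([1.0, 1.0])            # x_0
    for _ in range(25):
        x = (b - R @ x) / D_diag        # x_{n+1} = D^{-1} (b - R x_n)

    print(x)                            # -> approximately [ 7.111 -3.222]
    print(np.linalg.solve(A, b))        # exact solution [ 64/9, -29/9 ]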

37 The Gauss-Seidel Method
The Jacobi method is only guaranteed to converge for diagonally dominant matrices.
Gauss-Seidel can be used for general matrices, but convergence is only guaranteed if the matrix is
diagonally dominant, or
symmetric and positive definite.
Idea: write A = L + U, where L is the lower triangular part (including the diagonal) and U is the strictly upper triangular part, and iterate
x_{n+1} = L^-1 (b - U x_n)
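
A minimal sketch of this iteration (the function name is made up; for brevity the triangular solve uses a generic solver rather than explicit forward substitution):

    import numpy as np

    def gauss_seidel(A, b, x0, iterations=25):
        # Gauss-Seidel: A = L + U, L lower triangular (incl. diagonal), U strictly upper
        L = np.tril(A)
        U = A - L
        x = x0.astype(float)
        for _ in range(iterations):
            x = np.linalg.solve(L, b - U @ x)   # L x_{n+1} = b - U x_n
        return x

    A = np.array([[2.0, 1.0], [5.0, 7.0]])
    b = np.array([11.0, 13.0])
    print(gauss_seidel(A, b, np.array([1.0, 1.0])))   # -> close to [64/9, -29/9]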

38 Software Packages
LINPACK
Package for solving a wide variety of linear systems and special systems (e.g. symmetric, banded, etc.)
Has become a standard benchmark for comparing the performance of computers
LAPACK
More recent replacement of LINPACK; higher performance on modern computer architectures, including some parallel computers
Available from Netlib --

39 Summary
Exact methods to solve linear systems of equations:
Gaussian elimination
LU decomposition (Doolittle algorithm)
With full/partial pivoting -> stable in practice
O(n^3)
Iterative methods, e.g. the Jacobi method:
Faster; suited to the millions of equations that might arise when solving certain PDEs
Limited convergence
More sophisticated: weighted Jacobi, Gauss-Seidel, successive over-relaxation, ...

40 References
The English wiki pages are quite comprehensive on this topic (and I used them quite a bit for preparing the lecture).
More comprehensive:
David M. Young, Jr., Iterative Solution of Large Linear Systems, Academic Press (reprinted by Dover, 2003).
Richard S. Varga, Matrix Iterative Analysis, Second ed. (of the 1962 Prentice Hall edition), Springer-Verlag, 2002.
