Important Matrix Factorizations


Outline: the LU Factorization, the Cholesky Factorization, and the QR Factorization.

LU Factorization: Gaussian Elimination Matrices

Gaussian elimination transforms vectors of the form
$$\begin{pmatrix} a \\ \alpha \\ b \end{pmatrix}, \qquad\text{where } a \in \mathbb{R}^{k},\ 0 \neq \alpha \in \mathbb{R},\ b \in \mathbb{R}^{n-k-1},$$
to those of the form
$$\begin{pmatrix} a \\ \alpha \\ 0 \end{pmatrix}.$$
This is accomplished by left matrix multiplication as follows:
$$\begin{bmatrix} I_{k\times k} & 0 & 0 \\ 0 & 1 & 0 \\ 0 & -\alpha^{-1} b & I_{(n-k-1)\times(n-k-1)} \end{bmatrix}
\begin{pmatrix} a \\ \alpha \\ b \end{pmatrix} = \begin{pmatrix} a \\ \alpha \\ 0 \end{pmatrix}.$$
The matrix on the left is called a Gaussian elimination matrix.

Gaussian Elimination Matrices

The Gaussian elimination matrix has ones on the diagonal and so is invertible. Indeed,
$$\begin{bmatrix} I_{k\times k} & 0 & 0 \\ 0 & 1 & 0 \\ 0 & -\alpha^{-1} b & I_{(n-k-1)\times(n-k-1)} \end{bmatrix}^{-1}
= \begin{bmatrix} I_{k\times k} & 0 & 0 \\ 0 & 1 & 0 \\ 0 & \alpha^{-1} b & I_{(n-k-1)\times(n-k-1)} \end{bmatrix}.$$
Also note that
$$\begin{bmatrix} I_{k\times k} & 0 & 0 \\ 0 & 1 & 0 \\ 0 & -\alpha^{-1} b & I_{(n-k-1)\times(n-k-1)} \end{bmatrix}
\begin{pmatrix} x \\ 0 \\ y \end{pmatrix} = \begin{pmatrix} x \\ 0 \\ y \end{pmatrix},$$
so vectors whose $(k+1)$st entry is zero are left unchanged.
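To make the action of a Gaussian elimination matrix concrete, here is a minimal numpy sketch (the function name make_gauss_elim and the test vector are illustrative, not from the lecture) that builds the matrix above for given k, alpha, and b and checks that it zeros the trailing block:

    import numpy as np

    def make_gauss_elim(k, alpha, b):
        """Gaussian elimination matrix that zeros the entries below position k."""
        n = k + 1 + b.size
        G = np.eye(n)
        G[k + 1:, k] = -b / alpha          # the block -b/alpha sits below the pivot
        return G

    # Example with k = 2: a = (1, 2) in R^2, alpha = 4, b = (8, 2)
    v = np.array([1.0, 2.0, 4.0, 8.0, 2.0])
    G = make_gauss_elim(2, v[2], v[3:])
    print(G @ v)                            # -> [1. 2. 4. 0. 0.]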

LU Factorization

Suppose
$$A = \begin{bmatrix} a_1 & v_1^T \\ u_1 & \tilde A_1 \end{bmatrix} \in \mathbb{C}^{m\times n},$$
with $0 \neq a_1 \in \mathbb{C}$, $u_1 \in \mathbb{C}^{m-1}$, $v_1 \in \mathbb{C}^{n-1}$, and $\tilde A_1 \in \mathbb{C}^{(m-1)\times(n-1)}$. Then
$$\begin{bmatrix} 1 & 0 \\ -\dfrac{u_1}{a_1} & I \end{bmatrix}
\begin{bmatrix} a_1 & v_1^T \\ u_1 & \tilde A_1 \end{bmatrix}
= \begin{bmatrix} a_1 & v_1^T \\ 0 & A_1 \end{bmatrix}, \qquad (*)$$
where $A_1 = \tilde A_1 - u_1 v_1^T / a_1$.

Repeating this step on the successive lower-right blocks (at most $m-1$ times) gives
$$L_{m-1}^{-1} \cdots L_2^{-1} L_1^{-1} A = U,$$
where $U$ is upper triangular. Hence $A = LU$, where $L = L_1 L_2 \cdots L_{m-1}$ is lower triangular with ones on the diagonal.
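A minimal numpy sketch of this recursion, written as a loop over columns (no pivoting, so it assumes every pivot $a_1$ encountered is nonzero; the function name lu_no_pivot is illustrative):

    import numpy as np

    def lu_no_pivot(A):
        """Return (L, U) with A = L @ U, L unit lower triangular, U upper triangular.
        Assumes all pivots are nonzero (no pivoting is performed)."""
        m, n = A.shape
        L = np.eye(m)
        U = A.astype(float).copy()
        for k in range(min(m - 1, n)):
            L[k + 1:, k] = U[k + 1:, k] / U[k, k]                  # multipliers u_1 / a_1
            U[k + 1:, k:] -= np.outer(L[k + 1:, k], U[k, k:])      # A_1 = A~_1 - u_1 v_1^T / a_1
        return L, U

    A = np.array([[4.0, 3.0], [6.0, 3.0]])
    L, U = lu_no_pivot(A)
    print(np.allclose(L @ U, A))    # True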

Cholesky Factorization

Suppose $M \in \mathbb{R}^{n\times n}$ is symmetric and positive definite and has the LU factorization $M = LU$, with $L$ unit lower triangular. Then
$$L^{-1} M L^{-T} = U L^{-T}$$
is both upper triangular and symmetric; that is, $U L^{-T} = D$, where $D$ is diagonal. Since $M$ is positive definite, $D$ has positive diagonal entries, so
$$M = L D L^T = \hat L \hat L^T, \qquad \hat L = L D^{1/2}.$$
This is called the Cholesky factorization of $M$.
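A minimal numpy sketch of the $\hat L = L D^{1/2}$ construction, reusing the illustrative lu_no_pivot helper from the LU sketch above; in practice one would simply call numpy.linalg.cholesky, which is used here only to cross-check the result.

    import numpy as np

    def cholesky_via_ldl(M):
        """Return L_hat with M = L_hat @ L_hat.T, assuming M is symmetric positive definite."""
        L, U = lu_no_pivot(M)
        d = np.diag(U)                  # diagonal of D = U L^{-T}
        return L * np.sqrt(d)           # L_hat = L D^{1/2} (scales the columns of L)

    M = np.array([[4.0, 2.0], [2.0, 3.0]])
    L_hat = cholesky_via_ldl(M)
    print(np.allclose(L_hat @ L_hat.T, M))              # True
    print(np.allclose(L_hat, np.linalg.cholesky(M)))    # True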

The QR Factorization: Householder Reflections

Given $w \in \mathbb{R}^n$, we can associate the matrix
$$U = I - 2\,\frac{w w^T}{w^T w},$$
which reflects $\mathbb{R}^n$ across the hyperplane $\mathrm{Span}\{w\}^{\perp}$. The matrix $U$ is called the Householder reflection across this hyperplane.

Given a pair of vectors $x$ and $y$ with $\|x\|_2 = \|y\|_2$ and $x \neq y$, there is a Householder reflection such that $y = Ux$:
$$U = I - 2\,\frac{(x-y)(x-y)^T}{(x-y)^T(x-y)}.$$

Householder reflections are symmetric unitary transformations: $U^{-1} = U^T = U$.
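A minimal numpy sketch checking the mapping property $y = Ux$ (the function name householder_map is illustrative; it assumes $x \neq y$ and equal 2-norms, as above):

    import numpy as np

    def householder_map(x, y):
        """Householder reflection U with U @ x == y, assuming ||x|| == ||y|| and x != y."""
        w = x - y
        return np.eye(x.size) - 2.0 * np.outer(w, w) / (w @ w)

    x = np.array([3.0, 4.0])
    y = np.array([5.0, 0.0])                    # same 2-norm as x
    U = householder_map(x, y)
    print(np.allclose(U @ x, y))                                # True
    print(np.allclose(U, U.T), np.allclose(U @ U, np.eye(2)))   # True True (symmetric, U^-1 = U)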

The QR Factorization

Given $A \in \mathbb{R}^{m\times n}$, write
$$A_0 = A = \begin{bmatrix} \alpha_0 & a_0^T \\ b_0 & \tilde A_0 \end{bmatrix}
\qquad\text{and}\qquad
\nu_0 = \left\| \begin{pmatrix} \alpha_0 \\ b_0 \end{pmatrix} \right\|_2.$$
Set
$$H_0 = I - 2\,\frac{w w^T}{w^T w},
\qquad\text{where } w = \begin{pmatrix} \alpha_0 \\ b_0 \end{pmatrix} - \nu_0 e_1 = \begin{pmatrix} \alpha_0 - \nu_0 \\ b_0 \end{pmatrix}.$$
Then
$$H_0 A = \begin{bmatrix} \nu_0 & a_1^T \\ 0 & A_1 \end{bmatrix}.$$

QR Factorization

Repeating this step on the lower-right blocks $A_1, A_2, \ldots$ (with each Householder reflector embedded in the lower-right corner of an identity matrix) gives
$$Q^T A = H_{n-1} H_{n-2} \cdots H_0 A = R,$$
where $R$ is upper triangular and $Q = H_0 H_1 \cdots H_{n-1}$ is unitary. Then $A = QR$ is called the QR factorization of $A$.
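A minimal numpy sketch of this Householder QR loop for a tall matrix $A$ ($m \ge n$); the function name householder_qr is illustrative, and the sign choice $\nu = -\operatorname{sign}(\alpha_0)\,\|\cdot\|_2$ is a standard numerical-stability tweak not spelled out in the slides (any $\nu$ with $|\nu| = \|(\alpha_0, b_0)\|_2$ works in exact arithmetic).

    import numpy as np

    def householder_qr(A):
        """Return (Q, R) with A = Q @ R, Q orthogonal, R upper triangular (assumes m >= n)."""
        m, n = A.shape
        R = A.astype(float).copy()
        Q = np.eye(m)
        for k in range(n):
            x = R[k:, k]
            nu = -np.sign(x[0]) * np.linalg.norm(x) if x[0] != 0 else np.linalg.norm(x)
            w = x.copy()
            w[0] -= nu                      # w = (alpha_0 - nu_0, b_0)
            if np.allclose(w, 0):
                continue                    # column already in the desired form
            H = np.eye(m - k) - 2.0 * np.outer(w, w) / (w @ w)
            R[k:, k:] = H @ R[k:, k:]       # apply the embedded H_k to the lower-right block
            Q[:, k:] = Q[:, k:] @ H         # accumulate Q = H_0 H_1 ... H_{n-1}
        return Q, R

    A = np.random.rand(5, 3)
    Q, R = householder_qr(A)
    print(np.allclose(Q @ R, A), np.allclose(Q.T @ Q, np.eye(5)))   # True True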

Orthogonal Projections

Suppose $A \in \mathbb{R}^{m\times n}$ with $m > n$, so that $A$ is tall. The QR factorization of $A$ looks like
$$A = [Q_1,\ Q_2] \begin{bmatrix} R \\ 0 \end{bmatrix} = Q_1 R,$$
where the columns of $Q_1$ and $Q_2$ together form an orthonormal basis for $\mathbb{R}^m$. The columns of $Q_1$ form an orthonormal basis for the range of $A$, with
$$Q_1 Q_1^T = \text{the orthogonal projector onto } \mathrm{Ran}(A)$$
and
$$I - Q_1 Q_1^T = Q_2 Q_2^T = \text{the orthogonal projector onto } \mathrm{Ran}(A)^{\perp}.$$
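A minimal numpy sketch of these projectors for a tall matrix, using numpy.linalg.qr in "complete" mode (the variable names P_ran and P_perp are illustrative):

    import numpy as np

    A = np.random.rand(6, 3)                    # tall: m > n
    Q, R = np.linalg.qr(A, mode="complete")     # Q is 6x6 orthogonal, R is 6x3
    Q1, Q2 = Q[:, :3], Q[:, 3:]                 # basis for Ran(A) and its complement

    P_ran  = Q1 @ Q1.T                          # orthogonal projector onto Ran(A)
    P_perp = Q2 @ Q2.T                          # orthogonal projector onto Ran(A)^perp
    print(np.allclose(P_ran + P_perp, np.eye(6)))   # True: I - Q1 Q1^T = Q2 Q2^T
    print(np.allclose(P_ran @ A, A))                # True: columns of A lie in Ran(A)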

Orthogonal Projections

Similarly, if $A \in \mathbb{R}^{m\times n}$ with $m < n$, then $A^T$ is tall, and the QR factorization of $A^T$ looks like
$$A^T = [Q_1,\ Q_2] \begin{bmatrix} R \\ 0 \end{bmatrix} = Q_1 R,$$
where the columns of $Q_1$ and $Q_2$ together form an orthonormal basis for $\mathbb{R}^n$. The columns of $Q_1$ form an orthonormal basis for the range of $A^T$, with
$$Q_1 Q_1^T = \text{the orthogonal projector onto } \mathrm{Ran}(A^T)$$
and
$$I - Q_1 Q_1^T = Q_2 Q_2^T = \text{the orthogonal projector onto } \mathrm{Ran}(A^T)^{\perp} = \mathrm{Nul}(A).$$
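The same sketch applied to the wide case, again with illustrative variable names and assuming $A$ has full row rank; it checks that $Q_2 Q_2^T$ projects onto $\mathrm{Nul}(A)$:

    import numpy as np

    A = np.random.rand(3, 6)                        # wide: m < n
    Q, R = np.linalg.qr(A.T, mode="complete")       # factor A^T, which is tall
    Q1, Q2 = Q[:, :3], Q[:, 3:]

    P_null = Q2 @ Q2.T                              # projector onto Ran(A^T)^perp = Nul(A)
    x = np.random.rand(6)
    print(np.allclose(A @ (P_null @ x), 0))             # True: projected vectors lie in Nul(A)
    print(np.allclose(Q1 @ Q1.T + P_null, np.eye(6)))   # True: the two projectors sum to I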