Notes on Householder QR Factorization


Robert A. van de Geijn
Department of Computer Science
The University of Texas at Austin
Austin, TX 78712
rvdg@cs.utexas.edu

September 2, 2014

1 Motivation

A fundamental problem to avoid in numerical codes is the situation where one starts with large values and ends up with small values with large relative errors in them. This is known as catastrophic cancellation. The Gram-Schmidt algorithms can inherently fall victim to this: column $a_j$ is successively reduced in length as components in the directions of $\{q_0, \ldots, q_{j-1}\}$ are subtracted, leaving a small vector if $a_j$ was almost in the span of the first $j$ columns of $A$. Application of a unitary transformation to a matrix or vector inherently preserves length. Thus, it would be beneficial if the QR factorization could be implemented as the successive application of unitary transformations. The Householder QR factorization accomplishes this.

The first fundamental insight is that the product of unitary matrices is itself unitary. If, given $A \in \mathbb{C}^{m \times n}$ with $m \geq n$, one could find a sequence of unitary matrices $\{H_0, \ldots, H_{n-1}\}$ such that $R = H_{n-1} \cdots H_0 A$, where $R \in \mathbb{C}^{m \times n}$ is upper triangular (its last $m - n$ rows are zero), then
$$ A = \underbrace{H_0^H \cdots H_{n-1}^H}_{Q} R = QR = \begin{pmatrix} Q_L & Q_R \end{pmatrix} \begin{pmatrix} R_T \\ 0 \end{pmatrix} = Q_L R_T, $$
where $Q_L$ equals the first $n$ columns of $Q$ and $R_T$ equals the first $n$ rows of $R$. Then $A = Q_L R_T$ is the QR factorization of $A$. The second fundamental insight will be that the desired unitary transformations $\{H_0, \ldots, H_{n-1}\}$ can be computed and applied cheaply.
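The cancellation phenomenon is easy to reproduce. Below is a small NumPy sketch (not part of the original notes; the test matrix and the `classical_gram_schmidt` helper are my own illustration): on a matrix whose columns are nearly linearly dependent, classical Gram-Schmidt loses orthogonality badly, while a Householder-based QR (as computed by `numpy.linalg.qr`, which calls LAPACK's Householder-based routines) does not.

```python
import numpy as np

def classical_gram_schmidt(A):
    """Classical Gram-Schmidt: returns Q, R with A = Q R (no reorthogonalization)."""
    m, n = A.shape
    Q = np.zeros((m, n))
    R = np.zeros((n, n))
    for j in range(n):
        v = A[:, j].copy()
        for i in range(j):
            R[i, j] = Q[:, i] @ A[:, j]   # component in the direction of q_i
            v -= R[i, j] * Q[:, i]        # subtract it: v shrinks, errors grow
        R[j, j] = np.linalg.norm(v)
        Q[:, j] = v / R[j, j]
    return Q, R

eps = 1e-8
A = np.array([[1.0, 1.0, 1.0],
              [eps, 0.0, 0.0],
              [0.0, eps, 0.0],
              [0.0, 0.0, eps]])

Q_cgs, _ = classical_gram_schmidt(A)
Q_hh, _ = np.linalg.qr(A)                            # Householder-based
print(np.linalg.norm(Q_cgs.T @ Q_cgs - np.eye(3)))   # large (~0.7): orthogonality lost
print(np.linalg.norm(Q_hh.T @ Q_hh - np.eye(3)))     # ~1e-16: near machine precision
```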

2 Householder Transformations (Reflectors)

2.1 The general case

In this section we discuss Householder transformations, also referred to as reflectors.

[Figure 1. Left: illustration that shows how, given vector $x$ and unit length vector $u$, the subspace orthogonal to $u$ becomes a "mirror" for reflecting $x$, represented by the transformation $(I - 2uu^H)$. Right: illustration that shows how to compute $u$ given vectors $x$ and $y$ with $\|x\|_2 = \|y\|_2$, via $v = x - y$.]

Definition 1 Let $u \in \mathbb{C}^n$ be a vector of unit length ($\|u\|_2 = 1$). Then $H = I - 2uu^H$ is said to be a reflector or Householder transformation.

We observe:

- Any vector $z$ that is perpendicular to $u$ is left unchanged:
  $$ (I - 2uu^H)z = z - 2u\underbrace{(u^H z)}_{0} = z. $$
- Any vector $x$ can be written as $x = z + (u^H x)u$, where $z$ is perpendicular to $u$ and $(u^H x)u$ is the component of $x$ in the direction of $u$. Then
  $$ (I - 2uu^H)x = (I - 2uu^H)\left(z + (u^H x)u\right) = z + (u^H x)u - 2u\underbrace{(u^H z)}_{0} - 2(u^H x)u\underbrace{(u^H u)}_{1} = z - (u^H x)u. $$

This can be interpreted as follows: the space perpendicular to $u$ acts as a "mirror". Any vector in that space (along the mirror) is not reflected, while any other vector has the component that is orthogonal to that space (the component outside, orthogonal to, the mirror) reversed in direction, as illustrated in Figure 1. Notice that a reflection preserves the length of the vector.

Exercise 2 Show that if $H$ is a reflector, then

- $HH = I$ (reflecting the reflection of a vector results in the original vector),
- $H = H^H$, and
- $H^H H = I$ (a reflection is a unitary matrix and thus preserves the norm).
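These properties are easy to check numerically. A quick sketch (my own, not part of the notes), using a random complex $u$ of unit length:

```python
import numpy as np

rng = np.random.default_rng(0)
u = rng.standard_normal(5) + 1j * rng.standard_normal(5)
u /= np.linalg.norm(u)
H = np.eye(5) - 2.0 * np.outer(u, u.conj())    # H = I - 2 u u^H

x = rng.standard_normal(5) + 1j * rng.standard_normal(5)
z = x - (u.conj() @ x) * u                     # component of x orthogonal to u
print(np.allclose(H, H.conj().T))              # H = H^H
print(np.allclose(H @ H, np.eye(5)))           # HH = I (Exercise 2)
print(np.allclose(H @ z, z))                   # vectors along the "mirror" are unchanged
print(np.isclose(np.linalg.norm(H @ x), np.linalg.norm(x)))  # length preserved
```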

Next, let us ask the question of how to reflect a given $x \in \mathbb{C}^n$ into another vector $y \in \mathbb{C}^n$ with $\|x\|_2 = \|y\|_2$. In other words, how do we compute a vector $u$ so that $(I - 2uu^H)x = y$? From our discussion above, we need to find a vector $u$ that is perpendicular to the space with respect to which we will reflect. From Figure 1 (right) we notice that the vector from $y$ to $x$, $v = x - y$, is perpendicular to the desired space. Thus, $u$ must equal a unit vector in the direction of $v$: $u = v / \|v\|_2$.

Remark 3 In subsequent discussion we will prefer to give Householder transformations as $I - uu^H/\tau$, where $\tau = u^H u / 2$, so that $u$ no longer needs to be a unit vector, just a direction. The reason for this will become obvious later.

In the next subsection, we will need to find a Householder transformation $H$ that maps a vector $x$ to a multiple of the first unit basis vector $e_1$. Let us first discuss how to find $H$ in the case where $x \in \mathbb{R}^n$. We seek $v$ so that
$$ \left(I - \frac{2}{v^T v} v v^T\right) x = \pm \|x\|_2 e_1. $$
Since the resulting vector that we want is $y = \pm \|x\|_2 e_1$, we must choose
$$ v = x - y = x \mp \|x\|_2 e_1. $$

Exercise 4 Show that if $x \in \mathbb{R}^n$, $v = x \mp \|x\|_2 e_1$, and $\tau = v^T v / 2$, then $\left(I - \frac{1}{\tau} v v^T\right) x = \pm \|x\|_2 e_1$.

In practice, we choose $v = x + \mathrm{sign}(\chi_1)\|x\|_2 e_1$, where $\chi_1$ denotes the first element of $x$. The reason is as follows: the first element of $v$, $\nu_1$, will be $\nu_1 = \chi_1 \mp \|x\|_2$. If $\chi_1$ is positive and $\|x\|_2$ is almost equal to $\chi_1$, then $\chi_1 - \|x\|_2$ is a small number, and if there is error in $\chi_1$ and/or $\|x\|_2$, this error becomes large relative to the result $\chi_1 - \|x\|_2$, due to catastrophic cancellation. Regardless of whether $\chi_1$ is positive or negative, we can avoid this by choosing $v = x + \mathrm{sign}(\chi_1)\|x\|_2 e_1$.
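The cancellation that this sign choice avoids is easy to observe. A tiny sketch (my own illustration): for $x = (1, 10^{-9}, 10^{-9})^T$, the quantity $\chi_1 - \|x\|_2$ is about $-10^{-18}$, but in double precision it evaluates to exactly $0$ (a 100% relative error), while $\chi_1 + \|x\|_2 \approx 2$ is computed to full accuracy.

```python
import numpy as np

# chi1 - ||x||_2 cancels catastrophically when x2 is tiny relative to chi1;
# chi1 + sign(chi1) * ||x||_2 does not.
x = np.array([1.0, 1e-9, 1e-9])
chi1, norm_x = x[0], np.linalg.norm(x)
print(chi1 - norm_x)   # prints 0.0, though the true value is about -1e-18
print(chi1 + norm_x)   # prints 2.0: accurate to full precision
```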

2.2 As implemented for the Householder QR factorization (real case)

Next, we discuss a slight variant of the above discussion that is used in practice. To do so, we view $x$ as a vector that consists of its first element, $\chi_1$, and the rest of the vector, $x_2$. More precisely, partition
$$ x = \begin{pmatrix} \chi_1 \\ x_2 \end{pmatrix}, $$
where $\chi_1$ equals the first element of $x$ and $x_2$ is the rest of $x$. Then we will wish to find a Householder vector $u = \begin{pmatrix} 1 \\ u_2 \end{pmatrix}$ so that
$$ \left(I - \frac{1}{\tau} \begin{pmatrix} 1 \\ u_2 \end{pmatrix} \begin{pmatrix} 1 \\ u_2 \end{pmatrix}^T\right) \begin{pmatrix} \chi_1 \\ x_2 \end{pmatrix} = \begin{pmatrix} \pm\|x\|_2 \\ 0 \end{pmatrix}. $$
Notice that $y$ in the previous discussion equals the vector $\begin{pmatrix} \pm\|x\|_2 \\ 0 \end{pmatrix}$, so the direction of $u$ is given by
$$ v = \begin{pmatrix} \chi_1 \mp \|x\|_2 \\ x_2 \end{pmatrix}. $$
We now wish to normalize this vector so its first entry equals 1:
$$ u = \frac{v}{\nu_1} = \begin{pmatrix} 1 \\ x_2/\nu_1 \end{pmatrix}, $$
where $\nu_1 = \chi_1 \mp \|x\|_2$ equals the first element of $v$. (Note that if $\nu_1 = 0$ then $u_2$ can be set to $0$.)

2.3 The complex case (optional)

Next, let us work out the complex case, dealing explicitly with $x$ as a vector that consists of its first element, $\chi_1$, and the rest of the vector, $x_2$: again partition
$$ x = \begin{pmatrix} \chi_1 \\ x_2 \end{pmatrix}, $$
where $\chi_1$ equals the first element of $x$ and $x_2$ is the rest of $x$. Then we will wish to find a Householder vector $u = \begin{pmatrix} 1 \\ u_2 \end{pmatrix}$ so that
$$ \left(I - \frac{1}{\tau} \begin{pmatrix} 1 \\ u_2 \end{pmatrix} \begin{pmatrix} 1 \\ u_2 \end{pmatrix}^H\right) \begin{pmatrix} \chi_1 \\ x_2 \end{pmatrix} = \begin{pmatrix} \pm\|x\|_2 \\ 0 \end{pmatrix}. $$
Here $\pm$ denotes a complex scalar on the complex unit circle. By the same argument as before,
$$ v = \begin{pmatrix} \chi_1 \mp \|x\|_2 \\ x_2 \end{pmatrix}. $$
We now wish to normalize this vector so its first entry equals 1:
$$ u = \frac{v}{\nu_1} = \begin{pmatrix} 1 \\ x_2/\nu_1 \end{pmatrix}, $$
where $\nu_1 = \chi_1 \mp \|x\|_2$. If $\nu_1 = 0$ then we set $u_2$ to $0$.

Algorithm (Figure 2): $\left[ \begin{pmatrix} \rho \\ u_2 \end{pmatrix}, \tau \right] = \mathrm{Housev}\left( \begin{pmatrix} \chi_1 \\ x_2 \end{pmatrix} \right)$

Simple formulation:
- $\alpha := \|x\|_2$
- $\rho := -\mathrm{sign}(\chi_1)\,\alpha$
- $\nu_1 := \chi_1 + \mathrm{sign}(\chi_1)\,\alpha$
- $u_2 := x_2/\nu_1$
- $\tau := (1 + u_2^H u_2)/2$

Efficient computation:
- $\chi_2 := \|x_2\|_2$
- $\alpha := \left\| \begin{pmatrix} \chi_1 \\ \chi_2 \end{pmatrix} \right\|_2$ ($= \|x\|_2$, computed from two scalars)
- $\rho := -\mathrm{sign}(\chi_1)\,\alpha$
- $\nu_1 := \chi_1 - \rho$
- $u_2 := x_2/\nu_1$
- $\chi_2 := \chi_2/|\nu_1|$
- $\tau := (1 + \chi_2^2)/2$

Figure 2: Computing the Householder transformation. Left: simple formulation. Right: efficient computation. Note: I have not completely double-checked these formulas for the complex case. They work for the real case.

Exercise 5 Verify that
$$ \left(I - \frac{1}{\tau} \begin{pmatrix} 1 \\ u_2 \end{pmatrix} \begin{pmatrix} 1 \\ u_2 \end{pmatrix}^H\right) \begin{pmatrix} \chi_1 \\ x_2 \end{pmatrix} = \begin{pmatrix} \rho \\ 0 \end{pmatrix}, $$
where $\tau = u^H u / 2 = (1 + u_2^H u_2)/2$ and $\rho = \pm\|x\|_2$. Hint: $\bar{\rho}\rho = |\rho|^2 = \|x\|_2^2$ since $H$ preserves the norm. Also, $\|x\|_2^2 = \bar{\chi}_1 \chi_1 + x_2^H x_2$, and $\bar{z} z = z \bar{z}$.

Again, the choice of $\pm$ is important. For the complex case we choose $\pm = -\mathrm{sign}(\chi_1)$, where $\mathrm{sign}(\chi_1) = \chi_1/|\chi_1|$.

2.4 A routine for computing the Householder vector

We will refer to the vector $\begin{pmatrix} 1 \\ u_2 \end{pmatrix}$ as the Householder vector that reflects $x$ into $\pm\|x\|_2 e_1$ and introduce the notation
$$ \left[ \begin{pmatrix} \rho \\ u_2 \end{pmatrix}, \tau \right] := \mathrm{Housev}\left( \begin{pmatrix} \chi_1 \\ x_2 \end{pmatrix} \right) $$
for the computation of the above mentioned vector, and scalars $\rho$ and $\tau$, from vector $x$. We will use the notation $H(x)$ for the transformation $I - \frac{1}{\tau} u u^H$, where $u$ and $\tau$ are computed by $\mathrm{Housev}(x)$.
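A minimal NumPy transliteration of Figure 2 (real case only; my own sketch, following the "efficient computation" column and the convention that $\nu_1 = 0$ implies $u_2 = 0$):

```python
import numpy as np

def housev(chi1, x2):
    """[ (rho; u2), tau ] := Housev( (chi1; x2) ), real case (Figure 2, right column)."""
    chi2 = np.linalg.norm(x2)
    alpha = np.hypot(chi1, chi2)            # alpha = ||x||_2, without overflow
    if alpha == 0.0:                        # x = 0: nu1 = 0, so set u2 = 0
        return 0.0, np.zeros_like(x2), 0.5  # tau = (1 + u2^T u2)/2 = 1/2
    sign = 1.0 if chi1 >= 0.0 else -1.0     # convention: sign(0) = 1
    rho = -sign * alpha                     # rho = -sign(chi1) ||x||_2
    nu1 = chi1 - rho                        # = chi1 + sign(chi1) ||x||_2: no cancellation
    u2 = x2 / nu1
    chi2 = chi2 / abs(nu1)
    tau = (1.0 + chi2 * chi2) / 2.0         # tau = (1 + u2^T u2)/2
    return rho, u2, tau

# Check that (I - u u^T / tau) x = rho e_1:
x = np.array([3.0, 1.0, 2.0])
rho, u2, tau = housev(x[0], x[1:])
u = np.concatenate(([1.0], u2))
print(x - (u @ x / tau) * u)    # approx (rho, 0, 0), with |rho| = ||x||_2
```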

3 Householder QR Factorization

Let $A$ be an $m \times n$ matrix with $m \geq n$. We will now show how to compute $A \to QR$, the QR factorization, as a sequence of Householder transformations applied to $A$, which eventually zeroes out all elements of that matrix below the diagonal. The process is illustrated in Figure 3.

[Figure 3: Illustration of Householder QR factorization. Panels show the original matrix, the computation of $\left[ \begin{pmatrix} \rho_{11} \\ u_{21} \end{pmatrix}, \tau_1 \right] = \mathrm{Housev}\left( \begin{pmatrix} \alpha_{11} \\ a_{21} \end{pmatrix} \right)$ from the current column, the update of the remaining columns, and the "move forward" to the next iteration.]

In the first iteration, we partition
$$ A \to \begin{pmatrix} \alpha_{11} & a_{12}^T \\ a_{21} & A_{22} \end{pmatrix}. $$
Let
$$ \left[ \begin{pmatrix} \rho_{11} \\ u_{21} \end{pmatrix}, \tau_1 \right] = \mathrm{Housev}\left( \begin{pmatrix} \alpha_{11} \\ a_{21} \end{pmatrix} \right) $$
be the Householder transform computed from the first column of $A$. Then applying this Householder transform to $A$ yields
$$ \begin{pmatrix} \alpha_{11} & a_{12}^T \\ a_{21} & A_{22} \end{pmatrix} := \left(I - \frac{1}{\tau_1} \begin{pmatrix} 1 \\ u_{21} \end{pmatrix} \begin{pmatrix} 1 \\ u_{21} \end{pmatrix}^H\right) \begin{pmatrix} \alpha_{11} & a_{12}^T \\ a_{21} & A_{22} \end{pmatrix} = \begin{pmatrix} \rho_{11} & a_{12}^T - w_{12}^T \\ 0 & A_{22} - u_{21} w_{12}^T \end{pmatrix}, $$
where $w_{12}^T = (a_{12}^T + u_{21}^H A_{22})/\tau_1$. Computation of a full QR factorization of $A$ will now proceed with the updated matrix $A_{22}$.

Now let us assume that after $k$ iterations of the algorithm matrix $A$ contains
$$ A \to \begin{pmatrix} R_{TL} & R_{TR} \\ 0 & A_{BR} \end{pmatrix} = \begin{pmatrix} R_{00} & r_{01} & R_{02} \\ 0 & \alpha_{11} & a_{12}^T \\ 0 & a_{21} & A_{22} \end{pmatrix}, $$
where $R_{TL}$ and $R_{00}$ are $k \times k$ upper triangular matrices. Let
$$ \left[ \begin{pmatrix} \rho_{11} \\ u_{21} \end{pmatrix}, \tau_1 \right] = \mathrm{Housev}\left( \begin{pmatrix} \alpha_{11} \\ a_{21} \end{pmatrix} \right) $$
and update
$$ A := \begin{pmatrix} I & 0 \\ 0 & H \end{pmatrix} \begin{pmatrix} R_{00} & r_{01} & R_{02} \\ 0 & \alpha_{11} & a_{12}^T \\ 0 & a_{21} & A_{22} \end{pmatrix} = \begin{pmatrix} R_{00} & r_{01} & R_{02} \\ 0 & \rho_{11} & a_{12}^T - w_{12}^T \\ 0 & 0 & A_{22} - u_{21} w_{12}^T \end{pmatrix}, $$
where $H = I - \frac{1}{\tau_1} \begin{pmatrix} 1 \\ u_{21} \end{pmatrix} \begin{pmatrix} 1 \\ u_{21} \end{pmatrix}^H$ and, again, $w_{12}^T = (a_{12}^T + u_{21}^H A_{22})/\tau_1$.

Let
$$ H_k = \begin{pmatrix} I_{k \times k} & 0 \\ 0 & I - \frac{1}{\tau_1} \begin{pmatrix} 1 \\ u_{21} \end{pmatrix} \begin{pmatrix} 1 \\ u_{21} \end{pmatrix}^H \end{pmatrix} $$
equal the Householder transform so computed during the $(k+1)$st iteration. Then upon completion matrix $A$ contains
$$ R = \begin{pmatrix} R_{TL} \\ 0 \end{pmatrix} = H_{n-1} \cdots H_1 H_0 \hat{A}, $$
where $\hat{A}$ denotes the original contents of $A$ and $R_{TL}$ is an upper triangular matrix. Rearranging this we find that
$$ \hat{A} = H_0 H_1 \cdots H_{n-1} R, $$
which shows that if $Q = H_0 H_1 \cdots H_{n-1}$ then $\hat{A} = QR$.

Exercise 6 Show that
$$ \begin{pmatrix} I_{k \times k} & 0 \\ 0 & I - \frac{1}{\tau} \begin{pmatrix} 1 \\ u_2 \end{pmatrix} \begin{pmatrix} 1 \\ u_2 \end{pmatrix}^H \end{pmatrix} = I - \frac{1}{\tau} \begin{pmatrix} 0 \\ 1 \\ u_2 \end{pmatrix} \begin{pmatrix} 0 \\ 1 \\ u_2 \end{pmatrix}^H. $$

Typically, the algorithm overwrites the original matrix $A$ with the upper triangular matrix, and at each step $u_{21}$ is stored over the elements that become zero, thus overwriting $a_{21}$. (It is for this reason that the first element of $u$ was normalized to equal 1.) In this case $Q$ is usually not explicitly formed, as it can be stored as the separate Householder vectors below the diagonal of the overwritten matrix. The algorithm that overwrites $A$ in this manner is given in Figure 4. We will let $[\{U \backslash R\}, t] = \mathrm{HouseholderQR}(A)$ denote the operation that computes the QR factorization of the $m \times n$ matrix $A$, with $m \geq n$, via Householder transformations. It returns the Householder vectors and matrix $R$ in the first argument and, in $t$, the vector of scalars $\tau_i$ that are computed as part of the Householder transformations.
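For what it is worth, Exercise 6 above is also easy to confirm numerically (a sketch of mine; note that zero-padding $u$ leaves $\tau = u^H u / 2$ unchanged):

```python
import numpy as np

rng = np.random.default_rng(0)
u = np.concatenate(([1.0], rng.standard_normal(3)))   # u = (1; u2)
tau = (u @ u) / 2.0
H = np.eye(4) - np.outer(u, u) / tau                  # reflector acting on the tail

lhs = np.block([[np.eye(2),        np.zeros((2, 4))],
                [np.zeros((4, 2)), H               ]])
v = np.concatenate((np.zeros(2), u))                  # zero-padded vector (0; 1; u2)
rhs = np.eye(6) - np.outer(v, v) / tau                # same tau, since v^T v = u^T u
print(np.allclose(lhs, rhs))                          # True
```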

Algorithm (Figure 4): $[A, t] = \mathrm{HouseholderQR}(A)$

Partition $A \to \begin{pmatrix} A_{TL} & A_{TR} \\ A_{BL} & A_{BR} \end{pmatrix}$ and $t \to \begin{pmatrix} t_T \\ t_B \end{pmatrix}$, where $A_{TL}$ is $0 \times 0$ and $t_T$ has $0$ elements.
While $n(A_{TL}) < n(A)$:
  Repartition
  $$ \begin{pmatrix} A_{TL} & A_{TR} \\ A_{BL} & A_{BR} \end{pmatrix} \to \begin{pmatrix} A_{00} & a_{01} & A_{02} \\ a_{10}^T & \alpha_{11} & a_{12}^T \\ A_{20} & a_{21} & A_{22} \end{pmatrix}, \quad \begin{pmatrix} t_T \\ t_B \end{pmatrix} \to \begin{pmatrix} t_0 \\ \tau_1 \\ t_2 \end{pmatrix}, $$
  where $\alpha_{11}$ and $\tau_1$ are scalars.
  Update $\left[ \begin{pmatrix} \alpha_{11} \\ a_{21} \end{pmatrix}, \tau_1 \right] := \mathrm{Housev}\left( \begin{pmatrix} \alpha_{11} \\ a_{21} \end{pmatrix} \right)$.
  Update $\begin{pmatrix} a_{12}^T \\ A_{22} \end{pmatrix} := \left(I - \frac{1}{\tau_1} \begin{pmatrix} 1 \\ u_{21} \end{pmatrix} \begin{pmatrix} 1 \\ u_{21} \end{pmatrix}^H\right) \begin{pmatrix} a_{12}^T \\ A_{22} \end{pmatrix}$ (where $a_{21}$ now contains $u_{21}$) via the steps
    $w_{12}^T := (a_{12}^T + a_{21}^H A_{22})/\tau_1$
    $\begin{pmatrix} a_{12}^T \\ A_{22} \end{pmatrix} := \begin{pmatrix} a_{12}^T - w_{12}^T \\ A_{22} - a_{21} w_{12}^T \end{pmatrix}$
  Continue by moving the partition boundaries forward by one row and column.
endwhile

Figure 4: Unblocked Householder transformation based QR factorization.

Theorem 7 Given $A \in \mathbb{C}^{m \times n}$, the cost of the algorithm in Figure 4 is given by
$$ C_{HQR}(m, n) \approx 2mn^2 - \frac{2}{3} n^3 \text{ flops}. $$

Proof: The bulk of the computation is in $w_{12}^T = (a_{12}^T + u_{21}^H A_{22})/\tau_1$ and $A_{22} - u_{21} w_{12}^T$. During the $k$th iteration (when $R_{TL}$ is $k \times k$), this means a matrix-vector multiplication ($u_{21}^H A_{22}$) and a rank-1 update with matrix $A_{22}$, which is of size approximately $(m - k) \times (n - k)$, for a cost of $4(m-k)(n-k)$ flops. Thus, substituting $j = n - k$ and using $\sum_{j=1}^{n} j^2 \approx \int_0^n x^2\,dx = n^3/3$, the total cost is approximately
$$ \sum_{k=0}^{n-1} 4(m-k)(n-k) = 4 \sum_{j=1}^{n} (m-n+j)\, j = 4(m-n) \sum_{j=1}^{n} j + 4 \sum_{j=1}^{n} j^2 \approx 2(m-n)n^2 + \frac{4}{3} n^3 = 2mn^2 - \frac{2}{3} n^3. $$
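A direct NumPy rendering of Figure 4 (real case; my own sketch, reusing the `housev` routine sketched earlier): $A$ is overwritten with $R$ in its upper triangle and with the vectors $u_{21}$ below the diagonal, and the $\tau_i$ are returned in `t`.

```python
import numpy as np

def householder_qr(A):
    """Overwrite A (m x n, m >= n) with {U\\R}; return (A, t). Real case; uses housev above."""
    m, n = A.shape
    t = np.zeros(n)
    for k in range(n):
        rho, u2, tau = housev(A[k, k], A[k + 1:, k])
        A[k, k] = rho                      # diagonal entry of R
        A[k + 1:, k] = u2                  # Householder vector over the zeroed entries
        t[k] = tau
        # ( a12^T; A22 ) := (I - (1/tau) u u^T) ( a12^T; A22 ), with u = (1; u2):
        w12 = (A[k, k + 1:] + u2 @ A[k + 1:, k + 1:]) / tau
        A[k, k + 1:] -= w12
        A[k + 1:, k + 1:] -= np.outer(u2, w12)
    return A, t
```

The two lines computing and applying `w12` are exactly the matrix-vector multiply and rank-1 update counted in the proof of Theorem 7.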

4 Forming Q

Given $A \in \mathbb{C}^{m \times n}$, let $[A, t] = \mathrm{HouseholderQR}(A)$ yield the matrix $A$ with the Householder vectors stored below the diagonal, $R$ stored on and above the diagonal, and the $\tau_i$ stored in vector $t$. We now discuss how to form the first $n$ columns of $Q = H_0 H_1 \cdots H_{n-1}$. The computation is illustrated in Figure 5.

[Figure 5: Illustration of the computation of $Q$. Panels show the original (factored) matrix, the update of the current partitioning via the steps of Figure 6 ($\alpha_{11} := 1 - 1/\tau_1$, $a_{12}^T := -u_{21}^H A_{22}/\tau_1$, $A_{22} := A_{22} + a_{21} a_{12}^T$, $a_{21} := -a_{21}/\tau_1$), and the "move forward" (backward over the matrix) to the next iteration.]

Notice that to pick out the first $n$ columns we must form
$$ Q \begin{pmatrix} I_{n \times n} \\ 0 \end{pmatrix} = H_0 \cdots H_{n-1} \begin{pmatrix} I_{n \times n} \\ 0 \end{pmatrix} = H_0 \cdots H_{k-1} \underbrace{H_k \cdots H_{n-1} \begin{pmatrix} I_{n \times n} \\ 0 \end{pmatrix}}_{B_k}, $$
where $B_k$ is defined as indicated.

Lemma 8 $B_k$ has the form
$$ B_k = H_k \cdots H_{n-1} \begin{pmatrix} I_{n \times n} \\ 0 \end{pmatrix} = \begin{pmatrix} I_{k \times k} & 0 \\ 0 & \tilde{B}_k \end{pmatrix}. $$

Proof: The proof of this is by induction on $k$:

Base case: $k = n$. Then $B_n = \begin{pmatrix} I_{n \times n} \\ 0 \end{pmatrix}$, which has the desired form.

Inductive step: Assume the result is true for $B_{k+1}$. We show it is true for $B_k$:
$$ B_k = H_k H_{k+1} \cdots H_{n-1} \begin{pmatrix} I_{n \times n} \\ 0 \end{pmatrix} = H_k B_{k+1} = H_k \begin{pmatrix} I_{(k+1) \times (k+1)} & 0 \\ 0 & \tilde{B}_{k+1} \end{pmatrix}. $$
Partitioning out the first $k$ rows and columns, and writing $u_k = \begin{pmatrix} 1 \\ u_2 \end{pmatrix}$ for the Householder vector computed from the $(k+1)$st column,
$$ B_k = \begin{pmatrix} I_{k \times k} & 0 \\ 0 & I - \frac{1}{\tau_k} u_k u_k^H \end{pmatrix} \begin{pmatrix} I_{k \times k} & 0 & 0 \\ 0 & 1 & 0 \\ 0 & 0 & \tilde{B}_{k+1} \end{pmatrix} = \begin{pmatrix} I_{k \times k} & 0 & 0 \\ 0 & 1 - 1/\tau_k & -y_k^T \\ 0 & -u_2/\tau_k & \tilde{B}_{k+1} - u_2 y_k^T \end{pmatrix} = \begin{pmatrix} I_{k \times k} & 0 \\ 0 & \tilde{B}_k \end{pmatrix}, $$
where $y_k^T = u_2^H \tilde{B}_{k+1} / \tau_k$ and $\tilde{B}_k = \begin{pmatrix} 1 - 1/\tau_k & -y_k^T \\ -u_2/\tau_k & \tilde{B}_{k+1} - u_2 y_k^T \end{pmatrix}$, which has the desired form.

By the Principle of Mathematical Induction the result holds for $B_0, \ldots, B_n$.

Algorithm (Figure 6): $[A] = \mathrm{FormQ}(A, t)$

Partition $A \to \begin{pmatrix} A_{TL} & A_{TR} \\ A_{BL} & A_{BR} \end{pmatrix}$ and $t \to \begin{pmatrix} t_T \\ t_B \end{pmatrix}$, where $A_{TL}$ is $n(A) \times n(A)$ and $t_T$ has $n(A)$ elements.
While $n(A_{TL}) > 0$:
  Repartition
  $$ \begin{pmatrix} A_{TL} & A_{TR} \\ A_{BL} & A_{BR} \end{pmatrix} \to \begin{pmatrix} A_{00} & a_{01} & A_{02} \\ a_{10}^T & \alpha_{11} & a_{12}^T \\ A_{20} & a_{21} & A_{22} \end{pmatrix}, \quad \begin{pmatrix} t_T \\ t_B \end{pmatrix} \to \begin{pmatrix} t_0 \\ \tau_1 \\ t_2 \end{pmatrix}, $$
  where $\alpha_{11}$ and $\tau_1$ are scalars.
  Update $\begin{pmatrix} \alpha_{11} & a_{12}^T \\ a_{21} & A_{22} \end{pmatrix} := \left(I - \frac{1}{\tau_1} \begin{pmatrix} 1 \\ u_{21} \end{pmatrix} \begin{pmatrix} 1 \\ u_{21} \end{pmatrix}^H\right) \begin{pmatrix} 1 & 0 \\ 0 & A_{22} \end{pmatrix}$ (where $u_{21} = a_{21}$) via the steps
    $\alpha_{11} := 1 - 1/\tau_1$
    $a_{12}^T := -(a_{21}^H A_{22})/\tau_1$
    $A_{22} := A_{22} + a_{21} a_{12}^T$
    $a_{21} := -a_{21}/\tau_1$
  Continue by moving the partition boundaries backward by one row and column.
endwhile

Figure 6: Algorithm for overwriting $A$ with $Q$ from the Householder transformations stored as Householder vectors below the diagonal of $A$ (as produced by $[A, t] = \mathrm{HouseholderQR}(A)$).

Theorem 9 Given $[A, t] = \mathrm{HouseholderQR}(A)$ from Figure 4, the algorithm in Figure 6 overwrites $A$ with the first $n = n(A)$ columns of $Q$ as defined by the Householder transformations stored below the diagonal of $A$ and in the vector $t$.

Proof: The algorithm is justified by the proof of Lemma 8.

Theorem 10 Given $A \in \mathbb{C}^{m \times n}$, the cost of the algorithm in Figure 6 is given by
$$ C_{FormQ}(m, n) \approx 2mn^2 - \frac{2}{3} n^3 \text{ flops}. $$

Proof: The proof for Theorem 7 can be easily modified to establish this result.
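The backward sweep of Figure 6 translates directly (again a real-case sketch of mine; note that $R$ must be copied out of the upper triangle before calling this, since the matrix is overwritten with $Q$):

```python
import numpy as np

def form_q(A, t):
    """Overwrite the output of householder_qr with the first n columns of Q."""
    m, n = A.shape
    for k in range(n - 1, -1, -1):          # sweep from bottom-right to top-left
        u2, tau = A[k + 1:, k].copy(), t[k]
        A[k, k] = 1.0 - 1.0 / tau                        # alpha11 := 1 - 1/tau
        A[k, k + 1:] = -(u2 @ A[k + 1:, k + 1:]) / tau   # a12^T := -(u2^T A22)/tau
        A[k + 1:, k + 1:] += np.outer(u2, A[k, k + 1:])  # A22   := A22 + u2 a12^T
        A[k + 1:, k] = -u2 / tau                         # a21   := -u2/tau
    return A[:, :n]

# End-to-end check (using householder_qr and housev from the earlier sketches):
rng = np.random.default_rng(1)
A0 = rng.standard_normal((6, 4))
A, t = householder_qr(A0.copy())
R = np.triu(A)[:4, :]                      # extract R before it is overwritten
Q = form_q(A, t)
print(np.allclose(Q @ R, A0))              # A = Q_L R
print(np.allclose(Q.T @ Q, np.eye(4)))     # orthonormal columns
```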

Exercise 11 If $m = n$, then $Q$ could alternatively be accumulated by the sequence
$$ Q = (\cdots ((I H_0) H_1) \cdots H_{n-1}). $$
Give a high-level reason why this would be much more expensive than the algorithm in Figure 6.

5 Applying $Q^H$

In a future Note, we will see that the QR factorization is used to solve the linear least-squares problem. To do so, we need to be able to compute $\hat{y} = Q^H y$, where
$$ Q^H = H_{n-1} \cdots H_0. $$

Let us start by computing $H_0 y$, partitioning $y = \begin{pmatrix} \psi_1 \\ y_2 \end{pmatrix}$:
$$ \left(I - \frac{1}{\tau_1} \begin{pmatrix} 1 \\ u_2 \end{pmatrix} \begin{pmatrix} 1 \\ u_2 \end{pmatrix}^H\right) \begin{pmatrix} \psi_1 \\ y_2 \end{pmatrix} = \begin{pmatrix} \psi_1 \\ y_2 \end{pmatrix} - \underbrace{\frac{\psi_1 + u_2^H y_2}{\tau_1}}_{\omega_1} \begin{pmatrix} 1 \\ u_2 \end{pmatrix} = \begin{pmatrix} \psi_1 - \omega_1 \\ y_2 - \omega_1 u_2 \end{pmatrix}. $$
More generally, let us compute $H_k y$, partitioning $y = \begin{pmatrix} y_0 \\ \psi_1 \\ y_2 \end{pmatrix}$ with $y_0$ of length $k$:
$$ \begin{pmatrix} I & 0 \\ 0 & I - \frac{1}{\tau_1} \begin{pmatrix} 1 \\ u_2 \end{pmatrix} \begin{pmatrix} 1 \\ u_2 \end{pmatrix}^H \end{pmatrix} \begin{pmatrix} y_0 \\ \psi_1 \\ y_2 \end{pmatrix} = \begin{pmatrix} y_0 \\ \psi_1 - \omega_1 \\ y_2 - \omega_1 u_2 \end{pmatrix}, $$
where $\omega_1 = (\psi_1 + u_2^H y_2)/\tau_1$. This motivates the algorithm in Figure 7 for computing $y := H_{n-1} \cdots H_0 y$ given the output matrix $A$ and vector $t$ from routine HouseholderQR.

Algorithm (Figure 7): $[y] = \mathrm{ApplyQt}(A, t, y)$

Partition $A$ and $t$ as in Figure 4, and $y \to \begin{pmatrix} y_T \\ y_B \end{pmatrix}$, where $A_{TL}$ is $0 \times 0$, $t_T$ has $0$ elements, and $y_T$ has $0$ elements.
While $n(A_{TL}) < n(A)$:
  Repartition as in Figure 4, and $\begin{pmatrix} y_T \\ y_B \end{pmatrix} \to \begin{pmatrix} y_0 \\ \psi_1 \\ y_2 \end{pmatrix}$, where $\psi_1$ is a scalar.
  Update $\begin{pmatrix} \psi_1 \\ y_2 \end{pmatrix} := \left(I - \frac{1}{\tau_1} \begin{pmatrix} 1 \\ u_{21} \end{pmatrix} \begin{pmatrix} 1 \\ u_{21} \end{pmatrix}^H\right) \begin{pmatrix} \psi_1 \\ y_2 \end{pmatrix}$ (where $u_{21} = a_{21}$) via the steps
    $\omega_1 := (\psi_1 + a_{21}^H y_2)/\tau_1$
    $\begin{pmatrix} \psi_1 \\ y_2 \end{pmatrix} := \begin{pmatrix} \psi_1 - \omega_1 \\ y_2 - \omega_1 a_{21} \end{pmatrix}$
  Continue by moving the partition boundaries forward by one.
endwhile

Figure 7: Algorithm for computing $y := H_{n-1} \cdots H_0 y$ given the output from routine HouseholderQR.

The cost of this algorithm can be analyzed as follows: when $y_T$ is of length $k$, the bulk of the computation is in an inner product with vectors of length $m - k$ (to compute $\omega_1$) and an axpy operation with vectors of length $m - k$ (to subsequently update $\psi_1$ and $y_2$). Thus, the cost is approximately given by
$$ \sum_{k=0}^{n-1} 4(m-k) \approx 4mn - 2n^2. $$
Notice that this is much cheaper than forming $Q$ and then multiplying.
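And Figure 7, in the same style (real case; my sketch): applying the reflectors forward computes $y := Q^T y$ without ever forming $Q$. The final lines check it against the factorization computed by the earlier sketches: applying $Q^T$ to a column of the original matrix must reproduce the corresponding column of $R$, padded with zeros.

```python
import numpy as np

def apply_qt(A, t, y):
    """y := H_{n-1} ... H_0 y = Q^T y, given [A, t] = householder_qr(A). Real case."""
    m, n = A.shape
    for k in range(n):                     # forward sweep: H_0 is applied first
        u2 = A[k + 1:, k]                  # Householder vector stored below the diagonal
        omega = (y[k] + u2 @ y[k + 1:]) / t[k]
        y[k] -= omega
        y[k + 1:] -= omega * u2
    return y

rng = np.random.default_rng(2)
A0 = rng.standard_normal((6, 4))
A, t = householder_qr(A0.copy())           # from the earlier sketch
R = np.triu(A)[:4, :]
y = apply_qt(A, t, A0[:, 2].copy())
print(np.allclose(y, np.r_[R[:, 2], np.zeros(2)]))   # Q^T a_2 = column 2 of (R; 0)
```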

