Orthogonal Transformations
1 Orthogonal Transformations
Tom Lyche, University of Oslo, Norway
2 Applications of $Qx$ with $Q^TQ = I$
1. solving least squares problems (today)
2. solving linear equations (today)
3. solving eigenvalue problems (next time)
4. finding the singular value decomposition
3 Stability of orthogonal transformations
We recall: Suppose $Q$ is an orthogonal matrix, that is $Q^TQ = I$.
If $\hat{x} = x + e$ is an approximation to $x$, then $Q\hat{x} = Qx + Qe$ and $\|Qe\|_2 = \|e\|_2$.
If $A \in \mathbb{R}^{m,n}$ and $\hat{A} = A + E$ is an approximation to $A$, then $Q\hat{A} = QA + QE$ and $\|QE\|_2 = \|E\|_2$.
Conclusion: when an orthogonal transformation is applied to a vector or a matrix the error will not grow.
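This norm-preservation property is easy to check numerically (a minimal sketch, not part of the slides; the orthogonal $Q$ is just taken from a QR factorization of a random matrix):

    % Orthogonal Q does not amplify an error vector e.
    [Q, ~] = qr(randn(5));      % a 5x5 orthogonal matrix
    e = 1e-8*randn(5, 1);       % a small error
    norm(Q*e) - norm(e)         % zero up to rounding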
4 QR-Decomposition and Factorization
Definition 1. Let $A \in \mathbb{R}^{m,n}$ with $m \ge n \ge 1$.
We say that $A = QR$ is a QR-decomposition of $A$ if $Q \in \mathbb{R}^{m,m}$ is square and orthogonal and $R = \begin{bmatrix} R_1 \\ 0 \end{bmatrix}$ with $R_1 \in \mathbb{R}^{n,n}$ upper triangular and $0 \in \mathbb{R}^{m-n,n}$ the zero matrix.
We say that $A = QR$ is a QR-factorization of $A$ if $Q \in \mathbb{R}^{m,n}$ has orthonormal columns and $R = R_1 \in \mathbb{R}^{n,n}$ is upper triangular.
5 QR decomp ↔ QR fact
It is easy to construct a QR-factorization from a QR-decomposition $A = QR$. We simply partition $Q = [Q_1, Q_2]$ and $R = \begin{bmatrix} R_1 \\ 0 \end{bmatrix}$, where $Q_1 \in \mathbb{R}^{m,n}$ and $R_1 \in \mathbb{R}^{n,n}$. Then $A = Q_1R_1$ is a QR-factorization of $A$.
Conversely, we construct the QR-decomposition from the QR-factorization $A = Q_1R_1$ by extending the columns of $Q_1$ to an orthonormal basis $Q = [Q_1, Q_2]$ for $\mathbb{R}^m$ and defining $R = \begin{bmatrix} R_1 \\ 0 \end{bmatrix} \in \mathbb{R}^{m,n}$.
If $m = n$ then the two factorizations are the same.
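In MATLAB/Octave the two forms correspond to the full and economy-size outputs of the built-in qr (a sketch illustrating the partitioning, not part of the lecture code):

    A = randn(6, 3);
    [Q, R]   = qr(A);               % QR-decomposition: Q is 6x6, R is 6x3
    [Q1, R1] = qr(A, 0);            % QR-factorization: Q1 is 6x3, R1 is 3x3
    norm(Q(:, 1:3)*R(1:3, :) - A)   % ~0: the leading partition reproduces A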
6 Example
An example of a QR-decomposition is $A = QR$, while a QR-factorization $A = Q_1R_1$ is obtained by dropping the last column of $Q$ and the last row of $R$. (The numerical entries of $A$, $Q$, and $R$ on this slide did not survive transcription.)
7 Existence of QR Factorization
Lemma 2. A matrix $A \in \mathbb{R}^{m,n}$ with linearly independent columns has a QR-decomposition and a QR-factorization. The QR-factorization is unique if $R_1$ has positive diagonal entries.
Proof. Existence: $A^TA$ is symmetric positive definite, so it has a Cholesky factorization $A^TA = R_1^TR_1$ with $R_1 \in \mathbb{R}^{n,n}$ upper triangular and nonsingular. Set $Q_1 := AR_1^{-1}$. Then $Q_1^TQ_1 = R_1^{-T}A^TAR_1^{-1} = R_1^{-T}R_1^TR_1R_1^{-1} = I$, so $A = Q_1R_1$ is a QR-factorization.
Uniqueness: $A = Q_1R_1$ implies $A^TA = R_1^TR_1$. Since the Cholesky factorization is unique, $R_1$ is unique, and then $Q_1 = AR_1^{-1}$ is unique.
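The constructive step of the proof can be carried out directly (a minimal sketch; it works for well-conditioned $A$, but forming $A^TA$ squares the condition number, a drawback discussed on slide 27):

    A  = [1 2; 3 4; 5 6];       % columns are linearly independent
    R1 = chol(A'*A);            % upper triangular Cholesky factor: A'*A = R1'*R1
    Q1 = A/R1;                  % Q1 := A*inv(R1), computed by a triangular solve
    norm(Q1'*Q1 - eye(2))       % ~0: Q1 has orthonormal columns
    norm(Q1*R1 - A)             % ~0: A = Q1*R1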
8 QR and Gram-Schmidt
If $A \in \mathbb{R}^{m,n}$ has rank $n$, then the columns $\{a_1,\ldots,a_n\}$ form a basis for $\mathrm{span}(A)$ and the Gram-Schmidt orthogonalization process takes the form
$v_1 = a_1, \qquad v_j = a_j - \sum_{i=1}^{j-1} \frac{a_j^Tv_i}{v_i^Tv_i}\, v_i, \quad \text{for } j = 2,\ldots,n.$
Then $\{v_1,\ldots,v_n\}$ is an orthogonal basis for $\mathrm{span}(A)$, and
$a_1 = v_1, \qquad a_j = \sum_{i=1}^{j-1} \rho_{ij} v_i + v_j, \quad \text{where } \rho_{ij} = \frac{a_j^Tv_i}{v_i^Tv_i}.$
9 GS = QR-factorization
From $a_1 = v_1$ and $a_j = \sum_{i=1}^{j-1} \rho_{ij} v_i + v_j$ we get $A = V\hat{R}$, where $V := [v_1,\ldots,v_n] \in \mathbb{R}^{m,n}$ and $\hat{R}$ is unit upper triangular.
Since $\{v_1,\ldots,v_n\}$ is a basis for $\mathrm{span}(A)$, the matrix $D := \mathrm{diag}(\|v_1\|_2,\ldots,\|v_n\|_2)$ is nonsingular, and the matrix $Q_1 := VD^{-1} = \left[\frac{v_1}{\|v_1\|_2},\ldots,\frac{v_n}{\|v_n\|_2}\right]$ has orthonormal columns.
Therefore, $A = Q_1R_1$ with $R_1 := D\hat{R}$ is a QR-factorization of $A$ with positive diagonal entries in $R_1$.
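The construction above translates directly into code (a sketch; this is classical Gram-Schmidt, the numerically unstable variant warned about on slide 12):

    function [Q1, R1] = gramschmidt_qr(A)
    % Sketch: QR-factorization of a full-rank A via classical Gram-Schmidt.
    [~, n] = size(A);
    V = A;                                  % columns become v_1,...,v_n
    Rhat = eye(n);                          % unit upper triangular
    for j = 2:n
        for i = 1:j-1
            Rhat(i, j) = (A(:, j)'*V(:, i))/(V(:, i)'*V(:, i));  % rho_ij
            V(:, j) = V(:, j) - Rhat(i, j)*V(:, i);
        end
    end
    D  = diag(sqrt(sum(V.^2, 1)));          % diag(||v_1||_2, ..., ||v_n||_2)
    Q1 = V/D;                               % normalize the columns
    R1 = D*Rhat;                            % positive diagonal entries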
10 QR and Least Squares
Suppose $A \in \mathbb{R}^{m,n}$ has rank $n$ and let $b \in \mathbb{R}^m$. Consider the least squares problem $\min_{x \in \mathbb{R}^n} \|Ax - b\|_2$.
Suppose $A = QR$ is a QR-decomposition of $A$. We partition $Q = [Q_1, Q_2]$ and $R = \begin{bmatrix} R_1 \\ 0 \end{bmatrix}$, where $Q_1 \in \mathbb{R}^{m,n}$ and $R_1 \in \mathbb{R}^{n,n}$. Then
$\|Ax - b\|_2^2 = \|QRx - b\|_2^2 = \|Rx - Q^Tb\|_2^2 = \|R_1x - Q_1^Tb\|_2^2 + \|Q_2^Tb\|_2^2.$
Thus $\|Ax - b\|_2 \ge \|Q_2^Tb\|_2$ for all $x \in \mathbb{R}^n$, with equality if $R_1x = Q_1^Tb$.
11 Least Squares using QR
1. Find a QR-factorization $A = Q_1R_1$ of $A$.
2. Solve $R_1x = Q_1^Tb$ for the least squares solution $x$.
In the slide's numerical example (the entries of $A$ and $b$ did not survive transcription), solving the triangular system $R_1x = Q_1^Tb$ gives the least squares solution $x = [1, 0, 0]^T$.
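The two steps in code (a sketch using the built-in economy-size qr; backslash on a triangular matrix performs the back substitution):

    A = randn(20, 3);  b = randn(20, 1);   % a hypothetical overdetermined system
    [Q1, R1] = qr(A, 0);                   % step 1: QR-factorization
    x = R1\(Q1'*b);                        % step 2: solve R1*x = Q1'*b
    norm(A'*(A*x - b))                     % ~0: residual orthogonal to range(A)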
12 Finding QR
Gram-Schmidt is numerically unstable.
Use instead orthogonal transformations; they are stable.
We study Householder transformations and Givens rotations.
13 Householder Transformations
A matrix $H \in \mathbb{R}^{n,n}$ of the form $H := I - uu^T$, where $u \in \mathbb{R}^n$ and $u^Tu = 2$, is called a Householder transformation. For $n = 2$ we find
$H = \begin{bmatrix} 1 - u_1^2 & -u_1u_2 \\ -u_2u_1 & 1 - u_2^2 \end{bmatrix}.$
A Householder transformation is symmetric and orthogonal. In particular,
$H^TH = H^2 = (I - uu^T)(I - uu^T) = I - 2uu^T + u(u^Tu)u^T = I.$
14 Alternative representations
Householder himself used $I - 2uu^T$, where $u^Tu = 1$. For any nonzero $v \in \mathbb{R}^n$ the matrix $H := I - 2\frac{vv^T}{v^Tv}$ is a Householder transformation. In fact $H = I - uu^T$, where $u := \sqrt{2}\,\frac{v}{\|v\|_2}$.
15 Transformation
Lemma 3. Suppose $x, y \in \mathbb{R}^n$ with $\|x\|_2 = \|y\|_2$ and $v := x - y \ne 0$. Then $\left(I - 2\frac{vv^T}{v^Tv}\right)x = y$.
Proof. Since $x^Tx = y^Ty$ we have
$v^Tv = (x - y)^T(x - y) = 2x^Tx - 2y^Tx = 2v^Tx. \quad (1)$
But then $\left(I - 2\frac{vv^T}{v^Tv}\right)x = x - \frac{2v^Tx}{v^Tv}\,v = x - v = y.$
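Lemma 3 is easy to verify numerically for any pair of equal-norm vectors (a sketch with made-up data):

    x = [3; 4];  y = [5; 0];               % ||x||_2 = ||y||_2 = 5
    v = x - y;
    H = eye(2) - 2*(v*v')/(v'*v);          % the Householder reflection
    H*x                                    % returns y = [5; 0]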
16 Geometric Interpretation
$Hx = y$: $Hx$ is the mirror image (reflection) of $x$.
$H = I - 2\frac{vv^T}{v^Tv} = P - \frac{vv^T}{v^Tv}$, where $P := I - \frac{vv^T}{v^Tv}$.
$Px$ is the orthogonal projection of $x$ into $\mathrm{span}\{x + y\}$:
$Px = x - \frac{v^Tx}{v^Tv}\,v \overset{(1)}{=} x - \tfrac{1}{2}v = \tfrac{1}{2}(x + y).$
(Figure: the reflection $y = Hx$ and the projection $Px$ of $x$.)
17 Zero out entries
Let $x \in \mathbb{R}^n \setminus \{0\}$ and $y = \alpha e_1$. If $\alpha^2 = x^Tx$ then $\|x\|_2 = \|y\|_2$.
Two solutions: $\alpha = +\|x\|_2$ and $\alpha = -\|x\|_2$. Choose $\alpha$ to have the opposite sign of $x_1$:
we obtain $y \ne x$, so $H$ exists even if $x = e_1$;
numerical stability, since we avoid cancellation in the subtraction $v_1 = x_1 - \alpha$.
18 Formulas for an algorithm
Lemma 1. For a nonzero vector $x \in \mathbb{R}^n$ we define
$\alpha := \begin{cases} -\|x\|_2 & \text{if } x_1 > 0, \\ +\|x\|_2 & \text{otherwise,} \end{cases} \quad (2)$
and $H := I - uu^T$ with
$u = \frac{x/\alpha - e_1}{\sqrt{1 - x_1/\alpha}}. \quad (3)$
Then $H$ is a Householder transformation and $Hx = \alpha e_1$.
19 Proof
Let $y := \alpha e_1$ and $v = x - y$. By construction $x^Tx = y^Ty$ and $v \ne 0$. By Lemma 3 we have $Hx = \alpha e_1$, where $H = I - 2\frac{vv^T}{v^Tv}$ is a Householder transformation. Since
$0 < v^Tv = (x - \alpha e_1)^T(x - \alpha e_1) = x^Tx - 2\alpha x_1 + \alpha^2 = 2\alpha(\alpha - x_1),$
we find
$H = I - \frac{2(x - \alpha e_1)(x - \alpha e_1)^T}{2\alpha(\alpha - x_1)} = I - \frac{(x/\alpha - e_1)(x/\alpha - e_1)^T}{1 - x_1/\alpha} = I - uu^T.$
20 Example
For $x := [1, 2, 2]^T$ we have $\|x\|_2 = 3$, and since $x_1 > 0$ we choose $\alpha = -3$. We find $u = [2, 1, 1]^T/\sqrt{3}$ (up to an irrelevant sign) and
$H = I - \frac{1}{3}\begin{bmatrix} 4 & 2 & 2 \\ 2 & 1 & 1 \\ 2 & 1 & 1 \end{bmatrix} = \frac{1}{3}\begin{bmatrix} -1 & -2 & -2 \\ -2 & 2 & -1 \\ -2 & -1 & 2 \end{bmatrix}.$
We see that $Hx = -3e_1$.
21 Algorithm housegen
To a given $x \in \mathbb{R}^n$ the following algorithm computes $a = \alpha$ and the vector $u$ so that $(I - uu^T)x = \alpha e_1$.
function [u, a] = housegen(x)
% Compute u and a so that (I - u*u')*x = a*e_1.
a = norm(x);
u = x;
if a == 0
    u(1) = sqrt(2);          % x = 0: any Householder H works, since Hx = 0
    return;
end
if u(1) > 0                  % choose the sign of a opposite to x(1)
    a = -a;
end
u = u/a;                     % u = x/alpha
u(1) = u(1) - 1;             % u = x/alpha - e_1; note u(1) <= 0
u = u/sqrt(-u(1));           % scale so that u'*u = 2
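Assuming housegen as listed above, a quick usage check reproducing the example from slide 20 ($H$ is never formed explicitly; $Hx$ is computed as $x - u(u^Tx)$):

    x = [1; 2; 2];
    [u, a] = housegen(x);      % a = -3 since x(1) > 0
    x - u*(u'*x)               % applies H = I - u*u'; returns [-3; 0; 0]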
22 Zero only part of a vector
Suppose $y \in \mathbb{R}^k$, $z \in \mathbb{R}^{n-k}$ and $\alpha^2 = z^Tz$. Consider finding a Householder transformation $H$ such that $H\begin{bmatrix} y \\ z \end{bmatrix} = \begin{bmatrix} y \\ \alpha e_1 \end{bmatrix}$.
Let $[\hat{u}, \alpha] = \mathrm{housegen}(z)$ and set $u^T = [0^T, \hat{u}^T]$. Then
$H = I - uu^T = \begin{bmatrix} I & 0 \\ 0 & I \end{bmatrix} - \begin{bmatrix} 0 \\ \hat{u} \end{bmatrix}\begin{bmatrix} 0^T & \hat{u}^T \end{bmatrix} = \begin{bmatrix} I & 0 \\ 0 & \hat{H} \end{bmatrix}, \quad \text{where } \hat{H} = I - \hat{u}\hat{u}^T.$
Since $u^Tu = \hat{u}^T\hat{u} = 2$, both $H$ and $\hat{H}$ are Householder transformations.
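A sketch of this embedding in code, assuming housegen from slide 21:

    y = randn(2, 1);  z = randn(3, 1);     % only z is to be zeroed out
    [uhat, alpha] = housegen(z);
    u = [zeros(2, 1); uhat];               % pad with zeros so H leaves y alone
    w = [y; z];
    w - u*(u'*w)                           % returns [y; alpha; 0; 0]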
23 Householder Triangulation
$\begin{bmatrix} x&x&x \\ x&x&x \\ x&x&x \\ x&x&x \end{bmatrix} \xrightarrow{H_1} \begin{bmatrix} r_{11}&r_{12}&r_{13} \\ 0&x&x \\ 0&x&x \\ 0&x&x \end{bmatrix} \xrightarrow{H_2} \begin{bmatrix} r_{11}&r_{12}&r_{13} \\ 0&r_{22}&r_{23} \\ 0&0&x \\ 0&0&x \end{bmatrix} \xrightarrow{H_3} \begin{bmatrix} r_{11}&r_{12}&r_{13} \\ 0&r_{22}&r_{23} \\ 0&0&r_{33} \\ 0&0&0 \end{bmatrix} = \begin{bmatrix} R_1 \\ 0 \end{bmatrix}.$
Writing $A_1 = D_1$ and $A_k = \begin{bmatrix} B_k & C_k \\ 0 & D_k \end{bmatrix}$ for $k = 2, 3, \ldots$, the transformation is applied to the lower right block $D_k$.
$A = QR$, where $Q := H_1H_2 \cdots H_n$ and $R := \begin{bmatrix} R_1 \\ 0 \end{bmatrix}$.
24 Algorithm houselsq
Suppose $A \in \mathbb{R}^{m,n}$ has rank $n$ and that $b \in \mathbb{R}^m$. This algorithm uses Householder transformations to solve the least squares problem $\min_x \|Ax - b\|_2$ if $m > n$ and the linear system $Ax = b$ if $m = n$. It uses housegen and backsolve.
function x = houselsq(A, b)
[m, n] = size(A);
A = [A b];                                % augment A with the right-hand side
for k = 1:min(n, m-1)
    [v, A(k,k)] = housegen(A(k:m, k));    % v defines H_k; A(k,k) becomes r_kk
    A(k:m, k+1:n+1) = A(k:m, k+1:n+1) - v*(v'*A(k:m, k+1:n+1));   % apply H_k
end
x = backsolve(A(1:n, 1:n), A(1:n, n+1));  % solve R_1*x = Q_1'*b
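The helper backsolve is not listed on the slides; a minimal back-substitution sketch with the assumed interface is:

    function x = backsolve(R, c)
    % Sketch of the assumed helper: solve R*x = c for upper triangular R.
    n = length(c);
    x = zeros(n, 1);
    for k = n:-1:1
        x(k) = (c(k) - R(k, k+1:n)*x(k+1:n))/R(k, k);
    end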
25 QR without full rank
Theorem 2. Suppose $m \ge n \ge 1$ and $A \in \mathbb{R}^{m,n}$. Then $A$ has a QR-decomposition and a QR-factorization.
26 Compare with normal equations
$A = QR$ and $R_1x = Q_1^Tb$, compared to $A^TAx = A^Tb$.
Flops: Householder $2mn^2 - 2n^3/3$, normal equations $mn^2$.
Householder is twice as expensive for large $m$.
Householder is numerically stable.
The normal equations are often ill-conditioned.
27 Squaring the condition number
Recall that the 2-norm condition number of a matrix is the square root of the ratio of the largest and smallest eigenvalues of $A^TA$. It follows that the 2-norm condition number of $A^TA$ is the square of the 2-norm condition number of $A$. Thus if $A$ is mildly ill-conditioned, the normal equations can be quite ill-conditioned, and solving the normal equations can give inaccurate results.
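A quick illustration (a sketch; the Hilbert matrix is a standard mildly ill-conditioned example):

    A = hilb(6);       % a mildly ill-conditioned 6x6 matrix
    cond(A)            % about 1.5e7
    cond(A'*A)         % about 2.2e14, i.e. cond(A)^2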
28 What about linear systems?
houselsq can be used to solve linear systems. If $A$ is nonsingular and $m = n$, then the output $x$ will be the solution of $Ax = b$. This follows since the QR-decomposition and QR-factorization are the same when $A$ is square. Therefore, if $Rx = Q^Tb$ then $\|Ax - b\|_2 = \|QRx - b\|_2 = \|Rx - Q^Tb\|_2 = 0$. So Algorithm houselsq (slide 24) can be used as an alternative to Gaussian elimination. The two methods are similar in that both reduce $A$ to upper triangular form using a sequence of transformations.
29 Which method is better?
Linear systems can be constructed where Gaussian elimination with partial pivoting fails numerically. On the other hand, the transformations used in Householder triangulation are orthogonal, so the method is quite stable. So why is Gaussian elimination more popular than Householder triangulation? One reason is that the flop count for Householder when $m = n$ is $4n^3/3$, while the count for Gaussian elimination is half of that. Numerical stability can be a problem with Gaussian elimination, but years and years of experience show that it works well for most practical problems, and pivoting is often not necessary. Tradition might also play a role.
30 Rotations
In some applications, the matrix we want to triangulate has a special structure. Suppose for example that $A \in \mathbb{R}^{n,n}$ is square and upper Hessenberg, as illustrated by a Wilkinson diagram for $n = 4$:
$A = \begin{bmatrix} x&x&x&x \\ x&x&x&x \\ 0&x&x&x \\ 0&0&x&x \end{bmatrix}.$
Only one entry in each column needs to be annihilated, and a full Householder transformation would be inefficient. In this case we can use a simpler transformation.
31 Givens Rotations
Definition 3. A plane rotation (also called a Givens rotation) is an orthogonal matrix of the form
$P := \begin{bmatrix} c & s \\ -s & c \end{bmatrix}, \quad \text{where } c^2 + s^2 = 1.$
There is a unique angle $\theta \in [0, 2\pi)$ such that $c = \cos\theta$ and $s = \sin\theta$.
(Figure 1: Clockwise rotation of $x$ into $y = Px$.)
32 Zero in one component
Suppose $x = \begin{bmatrix} x_1 \\ x_2 \end{bmatrix} \ne 0$, and set $r := \|x\|_2$, $c := x_1/r$, $s := x_2/r$. Then
$Px = \frac{1}{r}\begin{bmatrix} x_1 & x_2 \\ -x_2 & x_1 \end{bmatrix}\begin{bmatrix} x_1 \\ x_2 \end{bmatrix} = \frac{1}{r}\begin{bmatrix} x_1^2 + x_2^2 \\ 0 \end{bmatrix} = \begin{bmatrix} r \\ 0 \end{bmatrix},$
and we have introduced a zero in $x$. We can take $P = I$ when $x = 0$.
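In code, a hypothetical helper returning the rotation that zeroes the second component (a sketch; hypot computes $r$ without overflow):

    function [c, s] = givensgen(x1, x2)
    % Sketch: c, s such that [c s; -s c]*[x1; x2] = [r; 0].
    r = hypot(x1, x2);
    if r == 0
        c = 1; s = 0;          % x = 0: take P = I
    else
        c = x1/r; s = x2/r;
    end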
33 Rotation in the i,j-plane
For $1 \le i < j \le n$ we define a rotation in the $i,j$-plane as a matrix $P_{ij} = (p_{kl}) \in \mathbb{R}^{n,n}$ with $p_{kl} = \delta_{kl}$ except for positions $ii, jj, ij, ji$, which are given by
$\begin{bmatrix} p_{ii} & p_{ij} \\ p_{ji} & p_{jj} \end{bmatrix} = \begin{bmatrix} c & s \\ -s & c \end{bmatrix}, \quad \text{where } c^2 + s^2 = 1.$
For example, for $n = 4$,
$P_{23} = \begin{bmatrix} 1&0&0&0 \\ 0&c&s&0 \\ 0&-s&c&0 \\ 0&0&0&1 \end{bmatrix}.$
34 Triangulating an upper Hessenberg matrix
$A = \begin{bmatrix} x&x&x&x \\ x&x&x&x \\ 0&x&x&x \\ 0&0&x&x \end{bmatrix} \xrightarrow{P_{12}} \begin{bmatrix} r_{11}&r_{12}&r_{13}&r_{14} \\ 0&x&x&x \\ 0&x&x&x \\ 0&0&x&x \end{bmatrix} \xrightarrow{P_{23}} \begin{bmatrix} r_{11}&r_{12}&r_{13}&r_{14} \\ 0&r_{22}&r_{23}&r_{24} \\ 0&0&x&x \\ 0&0&x&x \end{bmatrix} \xrightarrow{P_{34}} \begin{bmatrix} r_{11}&r_{12}&r_{13}&r_{14} \\ 0&r_{22}&r_{23}&r_{24} \\ 0&0&r_{33}&r_{34} \\ 0&0&0&r_{44} \end{bmatrix} = R.$
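A sketch of the whole sweep, assuming the hypothetical givensgen above; each rotation touches only two rows, so the total cost for an upper Hessenberg matrix is $O(n^2)$, far cheaper than full Householder triangulation:

    function R = hess_triangulate(A)
    % Sketch: reduce upper Hessenberg A to upper triangular R with Givens rotations.
    n = size(A, 1);
    R = A;
    for k = 1:n-1
        [c, s] = givensgen(R(k, k), R(k+1, k));
        R(k:k+1, k:n) = [c s; -s c]*R(k:k+1, k:n);   % rotate rows k and k+1
        R(k+1, k) = 0;                               % clean rounding residue
    end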