AMS526: Numerical Analysis I (Numerical Linear Algebra)
Lecture 7: More on Householder Reflectors; Least Squares Problems
Xiangmin Jiao, SUNY Stony Brook
Outline
1. Householder Reflectors
2. Linear Least Squares Problems
Householder Triangularization
- Method introduced by Alston Scott Householder in 1958
- It applies a sequence of unitary matrices on the left of A to make the columns triangular, e.g.,
  $$\begin{bmatrix} \times & \times & \times \\ \times & \times & \times \\ \times & \times & \times \\ \times & \times & \times \end{bmatrix}
  \xrightarrow{Q_1}
  \begin{bmatrix} \times & \times & \times \\ 0 & \times & \times \\ 0 & \times & \times \\ 0 & \times & \times \end{bmatrix}
  \xrightarrow{Q_2}
  \begin{bmatrix} \times & \times & \times \\ 0 & \times & \times \\ 0 & 0 & \times \\ 0 & 0 & \times \end{bmatrix}
  \xrightarrow{Q_3}
  \begin{bmatrix} \times & \times & \times \\ 0 & \times & \times \\ 0 & 0 & \times \\ 0 & 0 & 0 \end{bmatrix}$$
  $$A \qquad\qquad Q_1 A \qquad\qquad Q_2 Q_1 A \qquad\qquad Q_3 Q_2 Q_1 A$$
- After n steps we have a product of unitary matrices $\underbrace{Q_n \cdots Q_2 Q_1}_{Q^*} A = R$, and in turn the full QR factorization $A = QR$
- $Q_k$ introduces zeros below the diagonal of the kth column while preserving the zeros below the diagonal in the preceding columns
- The key question is how to find $Q_k$
Householder Reflectors
- First, consider $Q_1$: $Q_1 a_1 = \|a_1\| e_1$, where $e_1 = (1, 0, \dots, 0)^T$. Why is the length $\|a_1\|$? Because $Q_1$ is unitary, it preserves the 2-norm.
- $Q_1$ reflects $a_1$ across the hyperplane $H$ orthogonal to $v = \|a_1\| e_1 - a_1$, and therefore
  $$Q_1 = I - 2\,\frac{v v^*}{v^* v}$$
- More generally,
  $$Q_k = \begin{bmatrix} I & 0 \\ 0 & F \end{bmatrix}$$
  where $I$ is $(k-1)\times(k-1)$ and $F$ is $(m-k+1)\times(m-k+1)$ such that $F x = \|x\|_2\, e_1$, where $x = (a_{k,k}, a_{k+1,k}, \dots, a_{m,k})^T$
- What is $F$? It has the same form as $Q_1$, with $v = \|x\| e_1 - x$
Choice of Reflectors
- We could choose $F$ such that $F x = -\|x\| e_1$ instead of $F x = \|x\| e_1$, or more generally $F x = z\|x\| e_1$ with $|z| = 1$ for $z \in \mathbb{C}$
- This leads to an infinite number of possible QR factorizations of A
- If we require $z \in \mathbb{R}$, we still have two choices
- Numerically, it is undesirable for $v^* v$ to be close to zero for $v = z\|x\| e_1 - x$, and $\|v\|$ is larger if $z = -\mathrm{sign}(x_1)$ (see the numerical illustration below)
- Therefore, $v = -\mathrm{sign}(x_1)\|x\| e_1 - x$. Since $I - 2\,\frac{v v^*}{v^* v}$ is independent of the sign of $v$, we can clear out the factor $-1$ and obtain $v = \mathrm{sign}(x_1)\|x\| e_1 + x$
- For completeness, if $x_1 = 0$, set $\mathrm{sign}(x_1)$ to 1 (instead of 0)
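As a small numerical illustration (not from the slides; the vector x is made up), the following NumPy sketch shows why the sign choice matters when x is nearly parallel to e_1: the choice v = ||x|| e_1 - x is computed with severe cancellation and is tiny, while v = sign(x_1) ||x|| e_1 + x stays well away from zero.

```python
import numpy as np

# Hypothetical x almost parallel to e_1
x = np.array([1.0, 1e-9, 1e-9])
nx = np.linalg.norm(x)
e1 = np.array([1.0, 0.0, 0.0])

v_bad = nx * e1 - x                    # severe cancellation in the first entry
v_good = np.sign(x[0]) * nx * e1 + x   # no cancellation

print(np.linalg.norm(v_bad))    # ~1.4e-9: tiny, so v*v is close to zero
print(np.linalg.norm(v_good))   # ~2.0: well scaled
```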
Householder Algorithm

Householder QR Factorization
for k = 1 to n
  1. $x = A_{k:m,k}$
  2. $v_k = \mathrm{sign}(x_1)\|x\| e_1 + x$
  3. $v_k = v_k / \|v_k\|$
  4. $A_{k:m,k:n} = A_{k:m,k:n} - 2 v_k (v_k^* A_{k:m,k:n})$

- Note that $\mathrm{sign}(x) = 1$ if $x \ge 0$ and $\mathrm{sign}(x) = -1$ if $x < 0$
- R is left in place of A
- The matrix Q is not formed explicitly, but the reflection vectors $v_k$ are stored (see the sketch below)
- Question: Can A be reused to store both R and the $v_k$ completely?
- Answer: We can use the lower-triangular portion of A to store all but one entry of each $v_k$, so an additional array of size n is needed.
- Question: What happens if $v_k$ is 0 in line 3 of the loop?
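A minimal NumPy sketch of the algorithm above, for real A with m >= n (the name house_qr is my own; sign(0) is taken as 1 as on the slide, and a zero reflector is simply skipped). The vectors it returns are exactly what the next slide uses to apply Q implicitly.

```python
import numpy as np

def house_qr(A):
    """Householder QR sketch: overwrite a copy of A with R (in its upper-
    triangular part) and return the normalized reflection vectors v_1..v_n."""
    A = np.array(A, dtype=float, copy=True)
    m, n = A.shape
    V = []
    for k in range(n):
        x = A[k:, k]
        sign = 1.0 if x[0] >= 0 else -1.0        # sign(0) = 1
        v = x.copy()
        v[0] += sign * np.linalg.norm(x)         # v_k = sign(x_1) ||x|| e_1 + x
        nv = np.linalg.norm(v)
        if nv > 0.0:                             # skip if the column is already zero
            v /= nv
            A[k:, k:] -= 2.0 * np.outer(v, v @ A[k:, k:])
        V.append(v)
    return A, V
```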
Applying or Forming Q
- Compute $Q^* b = Q_n \cdots Q_1 b$:

  Implicit calculation of $Q^* b$
  for k = 1 to n
    $b_{k:m} = b_{k:m} - 2 v_k (v_k^* b_{k:m})$

- Compute $Q x = Q_1 Q_2 \cdots Q_n x$:

  Implicit calculation of $Q x$
  for k = n downto 1
    $x_{k:m} = x_{k:m} - 2 v_k (v_k^* x_{k:m})$

- Question: How to form $Q$ and $\hat{Q}$, respectively?
- Answer: Apply the second algorithm to $x = I_{m\times m}$ (to form Q) or to the first n columns of I (to form $\hat{Q}$), respectively (sketched below)
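Continuing the sketch (reusing the list V returned by house_qr above; function names are mine), the two loops on this slide translate directly, and forming Q just applies the second routine to the columns of the identity:

```python
import numpy as np

def apply_Qstar(V, b):
    """Compute Q* b implicitly from the stored reflectors (real case: Q^T b)."""
    b = np.array(b, dtype=float, copy=True)
    for k, v in enumerate(V):
        b[k:] -= 2.0 * v * (v @ b[k:])
    return b

def apply_Q(V, x):
    """Compute Q x implicitly by applying the reflectors in reverse order."""
    x = np.array(x, dtype=float, copy=True)
    for k in range(len(V) - 1, -1, -1):
        v = V[k]
        x[k:] -= 2.0 * v * (v @ x[k:])
    return x

def form_Q(V, m):
    """Form the full m-by-m Q column by column; Q_hat is its first n columns."""
    return np.column_stack([apply_Q(V, col) for col in np.eye(m).T])
```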
Operation Count
- Most work is done in the update $A_{k:m,k:n} = A_{k:m,k:n} - 2 v_k (v_k^* A_{k:m,k:n})$
- Flops per iteration:
  - $2(m-k)(n-k)$ for the dot products $v_k^* A_{k:m,k:n}$
  - $(m-k)(n-k)$ for the outer product $2 v_k(\cdot)$
  - $(m-k)(n-k)$ for the subtraction
  - $4(m-k)(n-k)$ total
- Including the outer loop, the total flop count is
  $$\sum_{k=1}^{n} 4(m-k)(n-k) = 4 \sum_{k=1}^{n} (mn - km - kn + k^2) \approx 4mn^2 - 4(m+n)\frac{n^2}{2} + \frac{4n^3}{3} = 2mn^2 - \frac{2}{3}n^3$$
- If $m \approx n$, it is more efficient than the Gram-Schmidt method (which costs about $2mn^2$ flops); if $m \gg n$, the cost is similar to Gram-Schmidt
Givens Rotations
- Instead of using reflections, we can rotate $x$ to obtain $\|x\| e_1$
- A Givens rotation $R = \begin{bmatrix} \cos\theta & -\sin\theta \\ \sin\theta & \cos\theta \end{bmatrix}$ rotates $x \in \mathbb{R}^2$ counterclockwise by $\theta$
- Choose $\theta$ to be the angle between $(x_i, x_j)^T$ and $(1, 0)^T$, and we have
  $$\begin{bmatrix} \cos\theta & \sin\theta \\ -\sin\theta & \cos\theta \end{bmatrix} \begin{bmatrix} x_i \\ x_j \end{bmatrix} = \begin{bmatrix} \sqrt{x_i^2 + x_j^2} \\ 0 \end{bmatrix}$$
  where $\cos\theta = \dfrac{x_i}{\sqrt{x_i^2 + x_j^2}}$ and $\sin\theta = \dfrac{x_j}{\sqrt{x_i^2 + x_j^2}}$
Givens QR
- Introduce zeros in each column bottom-up, one zero at a time, by rotating pairs of adjacent rows, e.g., rows (4,5), then (3,4), and so on (see the sketch below)
- To introduce a zero via rows $i$ and $i+1$, left-multiply by a matrix $F$ with $F_{i:i+1,\,i:i+1}$ being the $2\times 2$ rotation matrix and $F_{kk} = 1$ for $k \ne i, i+1$
- The flop count of Givens QR is $3mn^2 - n^3$, which is about 50% more expensive than Householder QR
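A sketch of Givens QR in the same spirit (function names are mine): each 2x2 rotation combines two adjacent rows so that the lower entry of the current column becomes zero, working from the bottom of the column upward.

```python
import numpy as np

def givens(a, b):
    """Return (c, s) with [[c, s], [-s, c]] @ [a, b] = [r, 0]."""
    r = np.hypot(a, b)
    return (1.0, 0.0) if r == 0.0 else (a / r, b / r)

def givens_qr(A):
    """QR by Givens rotations: zero out each column bottom-up; returns R."""
    R = np.array(A, dtype=float, copy=True)
    m, n = R.shape
    for j in range(n):
        for i in range(m - 1, j, -1):            # zero R[i, j] using rows i-1 and i
            c, s = givens(R[i - 1, j], R[i, j])
            G = np.array([[c, s], [-s, c]])
            R[i - 1:i + 1, j:] = G @ R[i - 1:i + 1, j:]
    return R
```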
Outline
1. Householder Reflectors
2. Linear Least Squares Problems
Linear Least Squares Problems
- An overdetermined system of equations $Ax \approx b$, where A has more rows than columns (and, for now, full rank), in general has no solution
- Example application: polynomial least squares fitting (a small sketch follows this slide)
- In general, we minimize the residual $r = b - Ax$
- Measuring the residual in the 2-norm, we obtain the linear least squares problem: Given $A \in \mathbb{C}^{m\times n}$, $m \ge n$, and $b \in \mathbb{C}^m$, find $x \in \mathbb{C}^n$ such that $\|b - Ax\|_2$ is minimized
- If A has full rank, the minimizer x is the solution of the normal equations $A^* A x = A^* b$, or in terms of the pseudoinverse $A^+$, $x = A^+ b$, where $A^+ = (A^* A)^{-1} A^* \in \mathbb{C}^{n\times m}$
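As a concrete (made-up) example of polynomial least squares fitting, the following sketch builds the matrix for a quadratic fit and compares the normal-equation solution with NumPy's library least-squares routine:

```python
import numpy as np

rng = np.random.default_rng(0)
t = np.linspace(0.0, 1.0, 20)
b = 1.0 + 2.0 * t - 3.0 * t**2 + 0.01 * rng.standard_normal(t.size)  # noisy samples

A = np.column_stack([np.ones_like(t), t, t**2])   # 20x3, full rank

x_normal = np.linalg.solve(A.T @ A, A.T @ b)      # normal equations A* A x = A* b
x_lstsq, *_ = np.linalg.lstsq(A, b, rcond=None)   # library least-squares solver

print(np.allclose(x_normal, x_lstsq))             # True: same minimizer
```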
Geometric Interpretation
- $Ax$ lies in range(A), and the point in range(A) closest to b is the orthogonal projection of b onto range(A)
- The residual r is then orthogonal to range(A), and hence $A^* r = A^* (b - Ax) = 0$
- $Ax$ is the orthogonal projection of b, where $x = A^+ b$, so $P = A A^+ = A (A^* A)^{-1} A^*$ is an orthogonal projector (recall Lecture 5); a quick numerical check follows
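A small numerical check of these statements on random made-up data: the matrix P = A (A^T A)^{-1} A^T is idempotent and symmetric, and the least-squares residual is orthogonal to range(A).

```python
import numpy as np

rng = np.random.default_rng(1)
A = rng.standard_normal((6, 3))
b = rng.standard_normal(6)

P = A @ np.linalg.solve(A.T @ A, A.T)       # P = A A^+ (orthogonal projector)
x = np.linalg.solve(A.T @ A, A.T @ b)       # x = A^+ b
r = b - A @ x

print(np.allclose(P @ P, P), np.allclose(P, P.T))   # projector: idempotent, symmetric
print(np.allclose(A.T @ r, 0.0))                    # residual orthogonal to range(A)
```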
Solution of Least Squares Problems
- One approach is to solve the normal equations $A^* A x = A^* b$ directly using Cholesky factorization
  - This is unstable, but very efficient if $m \gg n$ ($\approx mn^2 + \frac{1}{3}n^3$ flops)
- A more robust approach is to use the QR factorization $A = \hat{Q}\hat{R}$
  - b can be projected onto range(A) by $P = \hat{Q}\hat{Q}^*$, and therefore $\hat{Q}\hat{R} x = \hat{Q}\hat{Q}^* b$
  - Left-multiplying by $\hat{Q}^*$ gives $\hat{R} x = \hat{Q}^* b$ (note $A^+ = \hat{R}^{-1}\hat{Q}^*$)

Least squares via QR factorization
1. Compute the reduced QR factorization $A = \hat{Q}\hat{R}$
2. Compute the vector $c = \hat{Q}^* b$
3. Solve the upper-triangular system $\hat{R} x = c$ for x

- Computation is dominated by the QR factorization ($\approx 2mn^2 - \frac{2}{3}n^3$ flops)
- Question: If Householder QR is used, how do we compute $\hat{Q}^* b$?
- Answer: Compute $Q^* b$ implicitly (where Q is from the full QR factorization) and then take the first n entries of the resulting $Q^* b$ (sketched below)
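A sketch of the QR route, once with NumPy's built-in factorization and once reusing the house_qr and apply_Qstar sketches from earlier in this lecture, where Q-hat* b is obtained by applying the reflectors to b and keeping the first n entries:

```python
import numpy as np

def lstsq_qr(A, b):
    """Least squares via reduced QR: solve R_hat x = Q_hat^* b."""
    Qhat, Rhat = np.linalg.qr(A, mode='reduced')
    return np.linalg.solve(Rhat, Qhat.T @ b)   # Rhat is triangular; a triangular solver would be cheaper

def lstsq_householder(A, b):
    """Same solve, reusing house_qr / apply_Qstar from the sketches above."""
    m, n = A.shape
    R, V = house_qr(A)
    c = apply_Qstar(V, b)[:n]                  # first n entries of Q^* b
    return np.linalg.solve(np.triu(R[:n, :n]), c)
```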
Solution by SVD
- Using $A = \hat{U}\hat{\Sigma}V^*$, b can be projected onto range(A) by $P = \hat{U}\hat{U}^*$, and therefore $\hat{U}\hat{\Sigma}V^* x = \hat{U}\hat{U}^* b$
- Left-multiplying by $\hat{U}^*$ gives $\hat{\Sigma}V^* x = \hat{U}^* b$

Least squares via SVD
1. Compute the reduced SVD $A = \hat{U}\hat{\Sigma}V^*$
2. Compute the vector $c = \hat{U}^* b$
3. Solve the diagonal system $\hat{\Sigma} w = c$ for w
4. Set $x = V w$

- Work is dominated by the SVD ($\approx 2mn^2 + 11n^3$ flops), which is very expensive if $m \approx n$
- Best numerical stability
- Question: If A is rank deficient, how do we solve $Ax \approx b$?
- Answer: x is no longer unique; constrain x to be orthogonal to the null space of A, i.e., take the minimum-norm solution (see the sketch below)
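A sketch of the SVD route that also covers the rank-deficient case: singular values below a tolerance are treated as zero, which yields the minimum-norm solution, i.e., x orthogonal to the null space of A. The tolerance is a made-up parameter, not from the slides.

```python
import numpy as np

def lstsq_svd(A, b, tol=1e-12):
    """Minimum-norm least-squares solution via the reduced SVD."""
    U, s, Vt = np.linalg.svd(A, full_matrices=False)
    c = U.T @ b
    w = np.zeros_like(s)
    keep = s > tol * s[0]            # drop (numerically) zero singular values
    w[keep] = c[keep] / s[keep]
    return Vt.T @ w
```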