MATH 434/534 Theoretical Assignment 7 Solution

Chapter 7

7.1 Let $H = I - 2\dfrac{uu^T}{u^Tu}$ be a Householder matrix. Then prove the following:

(i) $Hu = -u$;
(ii) $Hv = v$ if $v^Tu = 0$.

Solution. (i) $Hu = \left(I - 2\dfrac{uu^T}{u^Tu}\right)u = u - 2\dfrac{u(u^Tu)}{u^Tu} = u - 2u = -u$.

(ii) $Hv = \left(I - 2\dfrac{uu^T}{u^Tu}\right)v = v - 2\dfrac{u(u^Tv)}{u^Tu} = v - 0 = v$, because $v^Tu = 0$ implies $u^Tv = 0$.

7.2 Let $x$ be an $n$-vector. Develop an algorithm to compute a Householder matrix $H = I - 2\dfrac{uu^T}{u^Tu}$ such that $Hx$ has zeros in positions $(r+1)$ through $n$, $r < n$. How many flops will be required to implement this algorithm? Given $x = (1, 2, 3)^T$, apply your algorithm to construct $H$ such that $Hx$ has a zero in the third position.

Solution. For $x = (x_1, \dots, x_n)^T$, let $y = (x_r, \dots, x_n)^T$. Now find $\hat{u}$ such that $\hat{H} = I - 2\dfrac{\hat{u}\hat{u}^T}{\hat{u}^T\hat{u}}$ satisfies $\hat{H}y = (\ast, 0, \dots, 0)^T$. Then $u = (\underbrace{0, \dots, 0}_{r-1 \text{ zeros}}, \hat{u})^T$ generates a Householder transform $H$ such that

$Hx = (\underbrace{\ast, \dots, \ast}_{r \text{ entries}}, \underbrace{0, \dots, 0}_{n-r \text{ zeros}})^T.$

This takes $3(n-r)$ flops.

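The construction can be sketched in NumPy as follows and run on the example $x = (1,2,3)^T$ with $r = 2$. This is our own illustrative code, not part of the assignment; the function name `householder_zero_below` is ours, and we use the standard sign choice $\hat{u} = y + \mathrm{sign}(y_1)\|y\|_2 e_1$ to avoid cancellation.

```python
import numpy as np

def householder_zero_below(x, r):
    """Build u so that H = I - 2*u*u^T/(u^T*u) zeros out entries
    r+1, ..., n of x (positions are 1-based, as in the problem)."""
    x = np.asarray(x, dtype=float)
    y = x[r-1:]                                   # trailing part to be reduced
    sign = 1.0 if y[0] >= 0 else -1.0             # avoid cancellation in y[0] + sign*||y||
    uhat = y.copy()
    uhat[0] += sign * np.linalg.norm(y)
    return np.concatenate([np.zeros(r-1), uhat])  # r-1 leading zeros

x = np.array([1.0, 2.0, 3.0])
u = householder_zero_below(x, r=2)
H = np.eye(3) - 2.0 * np.outer(u, u) / (u @ u)
print(np.round(H @ x, 12))                        # [1, -sqrt(13), 0]
```

Forming $Hx$ as $x - \dfrac{2(u^Tx)}{u^Tu}u$ rather than building $H$ explicitly is what keeps the cost linear in $n - r$.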
7.7 Using the Cholesky decomposition of $A^TA$, where $A$ is $m \times n$ ($m \geq n$) and has full rank, prove that $A$ has a unique reduced QR factorization with positive diagonal entries of $R$. What are the computational drawbacks of this approach for finding a reduced QR factorization numerically? Illustrate the difficulties with an example.

Solution. Let $A^TA = HH^T = R_1^TR_1$, where $R_1$ is an upper triangular matrix with positive diagonal entries. Let $Q_1 = AR_1^{-1}$; then $A = Q_1R_1$. Now considering $Q_1^TQ_1$, we have

$Q_1^TQ_1 = (R_1^T)^{-1}A^TAR_1^{-1} = (R_1^T)^{-1}R_1^TR_1R_1^{-1} = I.$

This shows that the columns of $Q_1$ are orthonormal. Thus $A = Q_1R_1$ is a reduced QR factorization of $A$.

To prove uniqueness, assume that $A = \tilde{Q}_1\tilde{R}_1$ is another reduced QR factorization of $A$, with $\tilde{Q}_1^T\tilde{Q}_1 = I$ and $\tilde{R}_1$ upper triangular with positive diagonal entries. Then we have

$A^TA = \tilde{R}_1^T\tilde{Q}_1^T\tilde{Q}_1\tilde{R}_1 = \tilde{R}_1^T\tilde{R}_1,$

so $\tilde{R}_1^T\tilde{R}_1$ is a Cholesky factorization of $A^TA$. Since the Cholesky factorization is unique, $\tilde{R}_1 = R_1$. Furthermore, $\tilde{Q}_1 = A\tilde{R}_1^{-1} = AR_1^{-1} = Q_1$. Therefore the reduced QR factorization of $A$ is unique.

Computational drawbacks. The matrix $A^TA$ may not be formed accurately in finite precision (its condition number is the square of that of $A$). The columns of the computed $Q_1$ may then be far from orthogonal, and the entries of $R_1$ may not be computed accurately.

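The difficulties can be illustrated numerically. Below is a minimal sketch (our own example, using a Läuchli-type matrix; not from the text) comparing the Cholesky approach above with Householder-based QR:

```python
import numpy as np

eps = 1e-7                                   # column near-dependence parameter
A = np.array([[1.0, 1.0],
              [eps, 0.0],
              [0.0, eps]])                   # Lauchli-type: Cond_2(A) ~ 1/eps

# Reduced QR via Cholesky of A^T A (the approach analyzed above)
R1 = np.linalg.cholesky(A.T @ A).T           # upper triangular, positive diagonal
Q1 = A @ np.linalg.inv(R1)

# Reduced QR via Householder reflections
Q2, R2 = np.linalg.qr(A)

I2 = np.eye(2)
print(np.linalg.norm(Q1.T @ Q1 - I2))        # large: columns of Q1 far from orthogonal
print(np.linalg.norm(Q2.T @ Q2 - I2))        # ~1e-16: orthonormal to machine precision
```

Since $\mathrm{Cond}_2(A^TA) = \mathrm{Cond}_2(A)^2 \approx 10^{14}$ here, forming $A^TA$ already destroys most of the information distinguishing the two columns.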
7.16 (a) Let $A$ be $m \times n$, and let $U$ and $V$ be orthogonal. Then, using the definition of singular values, prove that the singular values of $A$ and $V^TAV$ are the same. What about the singular vectors? (b) How are the singular vectors of $A$ related to those of $U^TAV$? (c) How are the singular values of a symmetric matrix related to its eigenvalues?

Solution. (a) Applying the singular value decomposition to the matrix $A$, we have $A = U_1\Sigma_1V_1^T$. Now multiplying $A = U_1\Sigma_1V_1^T$ by the orthogonal matrices $V^T$ and $V$, we have

$V^TAV = V^TU_1\Sigma_1V_1^TV = P\Sigma_1Q^T,$

where $P = V^TU_1$ and $Q = V^TV_1$ are orthogonal. This is an SVD of $V^TAV$, so $A$ and $V^TAV$ have the same singular values. The singular vectors do change: the left and right singular vectors of $V^TAV$ are the columns of $V^TU_1$ and $V^TV_1$, respectively.

(b) Applying the singular value decomposition as before, we have $U^TAV = U^TU_1\Sigma_1V_1^TV = (U^TU_1)\Sigma_1(V^TV_1)^T$. The singular vectors of $A$ are the columns of $U_1$ and $V_1$, so the left and right singular vectors of $U^TAV$ are the columns of $U^TU_1$ and $V^TV_1$.

(c) Let $A = U_1\Sigma_1V_1^T$ be the singular value decomposition of $A$. Then we have

$A^2 = A^TA = V_1\Sigma_1^TU_1^TU_1\Sigma_1V_1^T = V_1\Sigma_1^T\Sigma_1V_1^T.$

Since $V_1^T = V_1^{-1}$ and orthogonal similarity does not change eigenvalues, the eigenvalues of $A^2$ are the squares of the singular values of $A$. The eigenvalues of $A^2$ are also the squares of the eigenvalues of $A$, so the singular values of a symmetric matrix are the absolute values of its eigenvalues.

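A quick numerical check of part (c); this is our own example, and any symmetric matrix with a negative eigenvalue makes the point:

```python
import numpy as np

A = np.array([[2.0, 1.0],
              [1.0, -3.0]])                     # symmetric, indefinite

eigvals = np.linalg.eigvalsh(A)                 # real eigenvalues
svals = np.linalg.svd(A, compute_uv=False)      # singular values, descending

print(np.sort(np.abs(eigvals))[::-1])           # absolute eigenvalues...
print(svals)                                    # ...coincide with the singular values
```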
7.19 For an $m \times n$ matrix $A$, prove the following results using the SVD of $A$: (i) $\mathrm{rank}(A^TA) = \mathrm{rank}(AA^T) = \mathrm{rank}(A) = \mathrm{rank}(A^T)$; (ii) $A^TA$ and $AA^T$ have the same nonzero eigenvalues; (iii) if the eigenvectors $u_1$ and $u_2$ of $A^TA$ are orthogonal, then $Au_1$ and $Au_2$ are orthogonal.

Without loss of generality, we assume $m \geq n$ for all parts of Problem 7.19.

Singular Value Decomposition. For a matrix $A \in \mathbb{R}^{m \times n}$, there exist orthogonal matrices $U \in \mathbb{R}^{m \times m}$ and $V \in \mathbb{R}^{n \times n}$ such that $A = U\Sigma V^T$, where $\Sigma = \mathrm{diag}(\sigma_1, \dots, \sigma_n) \in \mathbb{R}^{m \times n}$ and $\sigma_1 \geq \sigma_2 \geq \cdots \geq \sigma_n \geq 0$.

(i) From the SVD of $A$, there are orthogonal matrices $U$ and $V$ such that $A = U\Sigma V^T$. Then we have $A^T = V\Sigma^TU^T$, and since $\Sigma = \mathrm{diag}(\sigma_1, \dots, \sigma_n) \in \mathbb{R}^{m \times n}$, we have $\Sigma^T = \mathrm{diag}(\sigma_1, \dots, \sigma_n) \in \mathbb{R}^{n \times m}$. We also know that $\mathrm{rank}(A)$ equals the number of nonzero singular values. Since the singular values of both $A$ and $A^T$ are $\sigma_1, \sigma_2, \dots, \sigma_n$, the number of nonzero singular values of $A$ equals the number of nonzero singular values of $A^T$. Therefore $\mathrm{rank}(A) = \mathrm{rank}(A^T)$.

Now consider $AA^T$ and $A^TA$. Since $A = U\Sigma V^T$ and $A^T = V\Sigma^TU^T$, we have

$AA^T = (U\Sigma V^T)(V\Sigma^TU^T) = U(\Sigma\Sigma^T)U^T \quad (1)$
$A^TA = (V\Sigma^TU^T)(U\Sigma V^T) = V(\Sigma^T\Sigma)V^T \quad (2)$

Looking at $\Sigma\Sigma^T$ and $\Sigma^T\Sigma$, we can see

$\Sigma\Sigma^T = \mathrm{diag}(\sigma_1^2, \dots, \sigma_n^2, 0, \dots, 0) \in \mathbb{R}^{m \times m}, \qquad \Sigma^T\Sigma = \mathrm{diag}(\sigma_1^2, \dots, \sigma_n^2) \in \mathbb{R}^{n \times n}.$

From (1) and (2), the singular values of $AA^T$ and of $A^TA$ are $\sigma_1^2, \sigma_2^2, \dots, \sigma_n^2$ (padded with $m - n$ zeros for $AA^T$). Therefore the number of nonzero singular values of $AA^T$ equals that of $A^TA$, that is, $\mathrm{rank}(AA^T) = \mathrm{rank}(A^TA)$. Also, since the singular values of $A$ are $\sigma_1, \dots, \sigma_n$ and those of $AA^T$ are $\sigma_1^2, \dots, \sigma_n^2$, the number of nonzero singular values of $A$ equals that of $AA^T$, so $\mathrm{rank}(A) = \mathrm{rank}(AA^T)$. Altogether, $\mathrm{rank}(A^TA) = \mathrm{rank}(AA^T) = \mathrm{rank}(A) = \mathrm{rank}(A^T)$.

(ii) By the singular value decomposition, we have $A = U\Sigma V^T$ with $\Sigma = \mathrm{diag}(\sigma_1, \sigma_2, \dots, \sigma_n) \in \mathbb{R}^{m \times n}$, and $A^T = V\Sigma^TU^T$ with $\Sigma^T = \mathrm{diag}(\sigma_1, \sigma_2, \dots, \sigma_n) \in \mathbb{R}^{n \times m}$. By the definition of singular values, for a real matrix $A$ the square roots of the eigenvalues of $A^TA$ are called the singular values of $A$; that is, the eigenvalues of $A^TA$ are $\sigma_1^2, \sigma_2^2, \dots, \sigma_n^2$. Since $AA^T = (A^T)^TA^T$ and the singular values of $A^T$ are also $\sigma_1, \sigma_2, \dots, \sigma_n$, the nonzero eigenvalues of $AA^T$ are likewise $\sigma_1^2, \sigma_2^2, \dots, \sigma_n^2$. Therefore $A^TA$ and $AA^T$ have the same nonzero eigenvalues.

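A small numerical check of (i) and (ii); our own example, a $5 \times 3$ matrix of rank 2 by construction:

```python
import numpy as np

rng = np.random.default_rng(0)
A = rng.standard_normal((5, 3)) @ np.diag([1.0, 2.0, 0.0])   # rank 2

ranks = [np.linalg.matrix_rank(M) for M in (A.T @ A, A @ A.T, A, A.T)]
print(ranks)                                         # (i): all four ranks agree

ev_n = np.sort(np.linalg.eigvalsh(A.T @ A))[::-1]    # n = 3 eigenvalues
ev_m = np.sort(np.linalg.eigvalsh(A @ A.T))[::-1]    # m = 5 eigenvalues
print(ev_n)
print(ev_m)                                          # (ii): same nonzero values, plus extra zeros
```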
(iii) Suppose that the eigenvectors $u_1$ and $u_2$ of $A^TA$ are orthogonal. Then we have

$(A^TA)u_1 = \lambda_1u_1 \quad (1)$
$(A^TA)u_2 = \lambda_2u_2 \quad (2)$
$u_1^Tu_2 = u_2^Tu_1 = 0 \quad (3)$

Consider $(Au_1)^T(Au_2)$ and $(Au_2)^T(Au_1)$. Using (2) and (3), we have

$(Au_1)^T(Au_2) = u_1^TA^TAu_2 = u_1^T(\lambda_2u_2) = \lambda_2u_1^Tu_2 = 0.$

Similarly, using (1) and (3), we have

$(Au_2)^T(Au_1) = u_2^TA^TAu_1 = u_2^T(\lambda_1u_1) = \lambda_1u_2^Tu_1 = 0.$

Therefore $Au_1$ and $Au_2$ are orthogonal.

7.21 Let $U$ have orthonormal columns. Then, using the SVD, prove that (i) $\|AU\|_2 = \|A\|_2$, (ii) $\|AU\|_F = \|A\|_F$.

Again without loss of generality, we assume $m \geq n$ for all parts of Problem 7.21. Since $U = [u_1\ u_2\ \cdots\ u_n]$ has orthonormal columns, its columns satisfy

$u_i^Tu_j = \begin{cases} 0 & \text{if } i \neq j \\ 1 & \text{if } i = j \end{cases}$

which implies $U^TU = I$. Since $U$ is square here, $U$ is an orthogonal matrix.

From the singular value decomposition, for the matrix $A$ there exist orthogonal matrices $\hat{U}$ and $\tilde{V}$ such that $A = \hat{U}\Sigma\tilde{V}^T$, where $\Sigma = \mathrm{diag}(\sigma_1, \sigma_2, \dots, \sigma_n)$. Now multiplying $A = \hat{U}\Sigma\tilde{V}^T$ by $U$, we have $AU = \hat{U}\Sigma\tilde{V}^TU$. Since the product of orthogonal matrices is an orthogonal matrix, letting $W^T = \tilde{V}^TU$, we have

$AU = \hat{U}\Sigma W^T. \quad (\ast)$

(i) Since $\|A\|_2 = \sigma_{\max}$, from $(\ast)$ we have $\|AU\|_2 = \sigma_{\max}$. Therefore $\|AU\|_2 = \|A\|_2$.

(ii) Since $\|A\|_F = (\sigma_1^2 + \sigma_2^2 + \cdots + \sigma_n^2)^{1/2}$, again from $(\ast)$ we have $\|AU\|_F = (\sigma_1^2 + \sigma_2^2 + \cdots + \sigma_n^2)^{1/2}$. Therefore we showed $\|A\|_F = \|AU\|_F$.

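A numerical check of 7.21; this is our own example, with the orthogonal $U$ obtained from the QR factorization of a random square matrix:

```python
import numpy as np

rng = np.random.default_rng(1)
A = rng.standard_normal((5, 4))
U, _ = np.linalg.qr(rng.standard_normal((4, 4)))      # a 4x4 orthogonal matrix

print(np.linalg.norm(A @ U, 2), np.linalg.norm(A, 2))          # equal spectral norms
print(np.linalg.norm(A @ U, 'fro'), np.linalg.norm(A, 'fro'))  # equal Frobenius norms
```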
7.22 Let $U\Sigma V^T$ be the SVD of $A$. Then prove that $\|U^TAV\|_F^2 = \sum_{i=1}^{p}\sigma_i^2$, where the $\sigma_i$ are the singular values of $A$.

Solution. From the singular value decomposition, for an $m \times n$ matrix $A$ there exist orthogonal matrices $U \in \mathbb{R}^{m \times m}$ and $V \in \mathbb{R}^{n \times n}$ such that $A = U\Sigma V^T$, where $\Sigma = \mathrm{diag}(\sigma_1, \dots, \sigma_p) \in \mathbb{R}^{m \times n}$ and $p = \min(m, n)$. Since $U$ and $V$ are orthogonal matrices, we have

$A = U\Sigma V^T \implies U^TAV = \Sigma.$

Now taking the Frobenius norm of $U^TAV = \Sigma$, we have

$\|U^TAV\|_F^2 = \|\Sigma\|_F^2. \quad (1)$

By the definition of the Frobenius norm, $\|A\|_F = \left[\sum_{j=1}^{n}\sum_{i=1}^{m}|a_{ij}|^2\right]^{1/2}$; since $\Sigma = \mathrm{diag}(\sigma_1, \dots, \sigma_p)$, we have

$\|\Sigma\|_F^2 = \sum_{i=1}^{p}\sigma_i^2. \quad (2)$

From (1) and (2), we showed that $\|U^TAV\|_F^2 = \sum_{i=1}^{p}\sigma_i^2$.

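A one-line numerical confirmation (our own example):

```python
import numpy as np

rng = np.random.default_rng(2)
A = rng.standard_normal((4, 6))
U, s, Vt = np.linalg.svd(A)                      # full SVD: U is 4x4, Vt is 6x6

print(np.linalg.norm(U.T @ A @ Vt.T, 'fro')**2)  # ||U^T A V||_F^2 ...
print(np.sum(s**2))                              # ... equals the sum of sigma_i^2
```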
Let $A$ be an $m \times n$ matrix.

(a) Using the SVD of $A$, prove that (i) $\|A^TA\|_2 = \|A\|_2^2$; (ii) $\mathrm{Cond}_2(A^TA) = (\mathrm{Cond}_2(A))^2$; (iii) $\mathrm{Cond}_2(A) = \mathrm{Cond}_2(U^TAV)$, where $U$ and $V$ are orthogonal.

(b) Let $\mathrm{rank}(A_{m \times n}) = n$, and let $B_{m \times r}$ be a matrix obtained by deleting $(n - r)$ columns from $A$. Then prove that $\mathrm{Cond}_2(B) \leq \mathrm{Cond}_2(A)$.

Solution. (a) Using the singular value decomposition, we have orthogonal matrices $U \in \mathbb{R}^{m \times m}$ and $V \in \mathbb{R}^{n \times n}$ such that $A = U\Sigma V^T$. Without loss of generality, we assume $m \geq n$.

(i) By the singular value decomposition, we have $A = U\Sigma_{m \times n}V^T$; therefore $\|A\|_2 = \sigma_{\max} = \sigma_1$. Also, we have $A^T = V\Sigma^T_{n \times m}U^T$. Considering $A^TA$, we have

$A^TA = V\Sigma^T_{n \times m}U^TU\Sigma_{m \times n}V^T = V(\Sigma^T\Sigma)V^T.$

Since $\Sigma^T\Sigma = \mathrm{diag}(\sigma_1^2, \sigma_2^2, \dots, \sigma_n^2)$ is $n \times n$, we get $\|A^TA\|_2 = \sigma_{\max}^2 = \sigma_1^2$. Therefore $\|A^TA\|_2 = \|A\|_2^2$.

(ii) $\mathrm{Cond}_2(A^TA) = \dfrac{\sigma_{\max}(A^TA)}{\sigma_{\min}(A^TA)} = \dfrac{(\sigma_{\max}(A))^2}{(\sigma_{\min}(A))^2} = \left(\dfrac{\sigma_{\max}(A)}{\sigma_{\min}(A)}\right)^2 = (\mathrm{Cond}_2(A))^2.$

(iii) By the singular value decomposition, we have $A = U\Sigma V^T$, so $U^TAV = \Sigma$. Since $\mathrm{Cond}_2(\Sigma) = \sigma_{\max}/\sigma_{\min}$, we have

$\mathrm{Cond}_2(U^TAV) = \mathrm{Cond}_2(\Sigma) = \dfrac{\sigma_{\max}}{\sigma_{\min}}. \quad (1)$

Also, $\mathrm{Cond}_2(A) = \sigma_{\max}/\sigma_{\min}$. Therefore $\mathrm{Cond}_2(A) = \mathrm{Cond}_2(U^TAV)$.

(b) After a column permutation we may assume that $B$ consists of the first $r$ columns of $A$, so that $A = [B \mid C]$ for some $m \times (n-r)$ block $C$. Since $\mathrm{rank}(A) = n$, all $n$ singular values of $A$ are nonzero: $\sigma_1 \geq \cdots \geq \sigma_n > 0$. Deleting columns cannot increase the largest singular value or decrease the smallest one: for any unit vector $z \in \mathbb{R}^r$, padding with zeros gives $\|Bz\|_2 = \|A(z, 0)^T\|_2$, so $\sigma_{\max}(B) \leq \sigma_1$ and $\sigma_{\min}(B) \geq \sigma_n$. Therefore

$\mathrm{Cond}_2(B) = \dfrac{\sigma_{\max}(B)}{\sigma_{\min}(B)} \leq \dfrac{\sigma_1}{\sigma_n} = \mathrm{Cond}_2(A).$
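A numerical sanity check of (a)(ii) and (b); our own example:

```python
import numpy as np

rng = np.random.default_rng(3)
A = rng.standard_normal((6, 4))                       # full column rank w.p. 1

print(np.linalg.cond(A.T @ A), np.linalg.cond(A)**2)  # (a)(ii): agree up to rounding

B = A[:, :3]                                          # delete one column
print(np.linalg.cond(B) <= np.linalg.cond(A))         # (b): True
```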