
DEPARTMENT OF MATHEMATICS TECHNICAL REPORT

SINGULAR VALUE DECOMPOSITION

Andrew Lounsbury

September 2018

TENNESSEE TECHNOLOGICAL UNIVERSITY
Cookeville, TN 38505

Singular Value Decomposition

Andrew Lounsbury
Department of Mathematics, Tennessee Technological University, Cookeville, TN 38505

September 2018

Completed in partial fulfillment of the requirements for MATH 4993, Mathematical Research (3 cr.), in spring 2018, under the supervision of Dr. R. Ablamowicz.

Abstract. The Singular Value Decomposition (SVD) provides a cohesive summary of a handful of topics introduced in basic linear algebra. The SVD may be applied to digital photographs so that they may be approximated and transmitted with a concise computation.

Keywords: singular value, matrix factorization

Contents

1 Introduction
2 Steps for Calculation of SVD
3 Theory
4 Examples
5 Maple
6 Conclusions
A Appendix

1 Introduction

This paper begins with a definition of the SVD and instructions on how to compute it, which include calculating eigenvalues, singular values, eigenvectors, and left and right singular vectors, or, alternatively, orthonormal bases for the four fundamental subspaces of a matrix. We present two theorems that result from the SVD, with corresponding proofs.

We provide examples of matrices and their singular value decompositions. There is also a section involving Maple that includes examples of photographs; it demonstrates how the inclusion of more and more information from the SVD allows one to construct increasingly accurate approximations of a color image.

Definition 1. Let A be an m x n real matrix of rank r <= min(m, n). A Singular Value Decomposition (SVD) of A is a factorization

    A = U Sigma V^T,    (1)

where U and V are orthogonal matrices, so that U^T U = I_m and V^T V = I_n, and the matrix Sigma contains the singular values of A on its pseudo-diagonal, with zeros elsewhere. Thus,

    A = U Sigma V^T = [u_1 u_2 ... u_m] Sigma [v_1 v_2 ... v_n]^T,    (2)
        \-- U (m x m) --/   \-- (m x n) --/  \---- V^T (n x n) ----/

with u_1, ..., u_m being the orthonormal columns of U, sigma_1, ..., sigma_r being the singular values of A on the pseudo-diagonal of Sigma, satisfying sigma_1 >= sigma_2 >= ... >= sigma_r > 0, and v_1, ..., v_n being the orthonormal columns of V.

Singular values are defined as the positive square roots of the nonzero eigenvalues of A^T A. Note that since A^T A, of size n x n, is real and symmetric of rank r, r of its eigenvalues sigma_i^2, i = 1, ..., r, are positive and therefore real, while the remaining n - r eigenvalues are zero. In particular,

    A^T A = V (Sigma^T Sigma) V^T,   A^T A v_i = sigma_i^2 v_i, i = 1, ..., r,   A^T A v_i = 0, i = r + 1, ..., n.    (3)

Thus, the first r vectors v_i are the eigenvectors of A^T A with the eigenvalues sigma_i^2. Likewise, we have

    A A^T = U (Sigma Sigma^T) U^T,   A A^T u_i = sigma_i^2 u_i, i = 1, ..., r,   A A^T u_i = 0, i = r + 1, ..., m.    (4)

Thus, the first r vectors u_i are the eigenvectors of A A^T with the eigenvalues sigma_i^2. Furthermore, it can be shown (see Theorem 1 and Lemma 5) that

    A v_i = sigma_i u_i, i = 1, ..., r,   and   A v_i = 0, i = r + 1, ..., n.    (5)
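The factorization in Definition 1 can be checked numerically. Below is a minimal sketch in Maple, using the same LinearAlgebra commands that appear in Section 5; the 3 x 2 matrix is chosen here only for illustration and is not one of the matrices treated in this report.

    with(LinearAlgebra):
    A := Matrix([[1, 1], [1, 0], [0, 1]]):
    # U and V^T as returned by Maple; S is the list of singular values.
    U, Vt := SingularValues(A, output = ['U', 'Vt']):
    S := SingularValues(A, output = list):
    # Assemble the 3 x 2 matrix Sigma with the sigma_i on its pseudo-diagonal.
    Sigma := Matrix(3, 2, (i, j) -> `if`(i = j, S[i], 0)):
    # The Frobenius norm of A - U.Sigma.V^T is ~0, up to floating-point roundoff.
    Norm(A - U . Sigma . Vt, Frobenius);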

If rank(A) = r < min(m, n), then there are n - r zero columns and m - r zero rows in Sigma, rendering the columns u_{r+1}, ..., u_m of U and the rows v_{r+1}^T, ..., v_n^T of V^T unnecessary for recovering A. Therefore,

    A = U Sigma V^T = [u_1 ... u_r] diag(sigma_1, ..., sigma_r) [v_1 ... v_r]^T = u_1 sigma_1 v_1^T + ... + u_r sigma_r v_r^T,    (6)

where the three factors now have sizes m x r, r x r, and r x n.

2 Steps for Calculation of SVD

Here, we provide an algorithm for calculating a singular value decomposition of a matrix; a Maple sketch of these steps follows the list.

1. Compute A^T A for a real m x n matrix A of rank r.

2. Compute the singular values of A. Solve the characteristic equation det(A^T A - lambda I) = 0 of A^T A for the nonzero eigenvalues lambda_1, ..., lambda_r of A^T A. These eigenvalues will be positive; take their square roots to obtain sigma_1, ..., sigma_r, which are the singular values of A, that is,

    sigma_i = +sqrt(lambda_i), i = 1, ..., r.    (7)

3. Sort the singular values, possibly renaming them, so that sigma_1 >= sigma_2 >= ... >= sigma_r.

4. Construct the Sigma matrix of size m x n such that Sigma_ii = sigma_i for i = 1, ..., r, and Sigma_ij = 0 when i <> j.

5. Compute the eigenvectors of A^T A. Find a basis for Null(A^T A - lambda_i I); that is, solve (A^T A - lambda_i I) s_i = 0 for s_i, an eigenvector of A^T A corresponding to lambda_i, for each eigenvalue lambda_i. Since A^T A is symmetric, its eigenvectors corresponding to different eigenvalues are already orthogonal (but likely not orthonormal); see Lemma 1.

6. Compute the (right singular) vectors v_1, ..., v_r by normalizing each eigenvector s_i, multiplying it by 1/||s_i||. That is, let

    v_i = s_i / ||s_i||, i = 1, ..., r.    (8)

If n > r, the additional n - r vectors v_{r+1}, ..., v_n need to be chosen as an orthonormal basis of Null(A). Note that since A v_i = sigma_i u_i for i = 1, ..., r, the vectors v_1, ..., v_r provide an orthonormal basis for Row(A), while the vectors u_1, ..., u_r provide an orthonormal basis for Col(A). In particular,

    R^n = Row(A) (+) Null(A) = span{v_1, ..., v_r} (+) span{v_{r+1}, ..., v_n}.    (9)

7. Construct the orthogonal matrix V = [v_1 ... v_n].

8. Verify that V^T V = I_n.

9. Compute the (left singular) vectors u_1, ..., u_r from A v_i = sigma_i u_i:

    u_i = A v_i / sigma_i, i = 1, ..., r.    (10)

In this method, u_1, ..., u_r are orthonormal by Lemma 5. Alternatively: (i) A A^T = U (Sigma Sigma^T) U^T suggests that the vectors of U can be calculated as the eigenvectors of A A^T; in using this method, the vectors need to be normalized first, namely u_i = s_i / ||s_i||, where s_i is an eigenvector of A A^T. (ii) Since A^T A and A A^T share their nonzero eigenvalues by Lemma 8, sigma_1, ..., sigma_r are also the square roots of the nonzero eigenvalues of A A^T.

10. If m > r, the additional m - r vectors u_{r+1}, ..., u_m need to be chosen as an orthonormal basis of Null(A^T). Note that since A v_i = sigma_i u_i for i = 1, ..., r, the vectors u_1, ..., u_r provide an orthonormal basis for Col(A), while the vectors u_{r+1}, ..., u_m provide an orthonormal basis for the left null space Null(A^T). In particular,

    R^m = Col(A) (+) Null(A^T) = span{u_1, ..., u_r} (+) span{u_{r+1}, ..., u_m}.    (11)

11. Construct the orthogonal matrix U = [u_1 ... u_m].

12. Verify that U^T U = I_m and that A = U Sigma V^T.

13. Construct the dyadic decomposition of A, as described in Theorem 2:

    A = U Sigma V^T = sigma_1 u_1 v_1^T + sigma_2 u_2 v_2^T + ... + sigma_r u_r v_r^T.    (12)

A dyad is the product of an n x 1 column vector with a 1 x n row vector, e.g., u v^T, resulting in a square n x n matrix whose rank is 1 by Lemma 6.
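The following Maple sketch walks through the first of these steps on one small matrix; the 3 x 4 rank-2 matrix is chosen here purely for illustration, and only the first singular pair is computed.

    with(LinearAlgebra):
    A := Matrix([[1, 0, 1, 0], [0, 1, 0, 1], [1, 0, 1, 0]]):  # rank 2
    AtA := Transpose(A) . A:                 # Step 1: A^T A, a 4 x 4 matrix
    Eigenvalues(AtA);                        # Step 2: here lambda = 4, 2, 0, 0,
    # so sigma_1 = 2 and sigma_2 = sqrt(2) after sorting (Steps 2-3).
    # Step 5: an eigenvector for lambda_1 = 4 spans Null(A^T A - 4 I).
    s1 := NullSpace(AtA - 4*IdentityMatrix(4))[1]:
    v1 := s1 / Norm(s1, 2):                  # Step 6: right singular vector
    u1 := (A . v1) / 2:                      # Step 9: u_1 = A v_1 / sigma_1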

3 Theory

In this section, we provide the two theorems related to the SVD, along with their proofs.

Theorem 1. Let A = U Sigma V^T be a singular value decomposition of an m x n real matrix A of rank r. Then:

1. AV = U Sigma, with A v_i = sigma_i u_i for i = 1, ..., r and A v_i = 0 for i = r + 1, ..., n. Hence Row(A) = span{v_1, ..., v_r} and Null(A) = span{v_{r+1}, ..., v_n}.

2. A^T A = V (Sigma^T Sigma) V^T : R^n -> R^n.

3. A^T A V = V (Sigma^T Sigma), with A^T A v_i = sigma_i^2 v_i for i = 1, ..., r and A^T A v_i = 0 for i = r + 1, ..., n. Hence Row(A^T A) = span{v_1, ..., v_r} and Null(A^T A) = span{v_{r+1}, ..., v_n}.

4. U^T A = Sigma V^T, with u_i^T A = sigma_i v_i^T for i = 1, ..., r and u_i^T A = 0 for i = r + 1, ..., m. Hence Col(A) = span{u_1, ..., u_r} and Null(A^T) = span{u_{r+1}, ..., u_m}.

5. A A^T = U (Sigma Sigma^T) U^T : R^m -> R^m.

6. A A^T U = U (Sigma Sigma^T), with A A^T u_i = sigma_i^2 u_i for i = 1, ..., r and A A^T u_i = 0 for i = r + 1, ..., m. Hence Row(A A^T) = span{u_1, ..., u_r} and Null(A A^T) = span{u_{r+1}, ..., u_m}.

Proof of (1). AV = (U Sigma V^T) V = U Sigma (V^T V) = U Sigma. So

    AV = [A v_1 ... A v_r | A v_{r+1} ... A v_n] = [sigma_1 u_1 ... sigma_r u_r | 0 ... 0].

Hence A v_1 = sigma_1 u_1, ..., A v_r = sigma_r u_r, and A v_{r+1} = ... = A v_n = 0.

Proof of (2). A^T A = (U Sigma V^T)^T (U Sigma V^T) = (V Sigma^T U^T)(U Sigma V^T) = V Sigma^T (U^T U) Sigma V^T = V (Sigma^T Sigma) V^T.

Proof of (3). A^T A V = (V Sigma^T U^T)(U Sigma V^T) V = V Sigma^T (U^T U) Sigma (V^T V) = V (Sigma^T Sigma). So

    A^T A V = [A^T A v_1 ... A^T A v_n] = [lambda_1 v_1 ... lambda_r v_r | 0 ... 0] = [sigma_1^2 v_1 ... sigma_r^2 v_r | 0 ... 0].

Hence A^T A v_1 = sigma_1^2 v_1, ..., A^T A v_r = sigma_r^2 v_r, and A^T A v_{r+1} = ... = A^T A v_n = 0.

Proof of (4). U^T A = U^T (U Sigma V^T) = (U^T U) Sigma V^T = Sigma V^T.

So

    U^T A = [u_1^T A; ...; u_m^T A] = [sigma_1 v_1^T; ...; sigma_r v_r^T; 0; ...; 0].

Hence u_1^T A = sigma_1 v_1^T, ..., u_r^T A = sigma_r v_r^T, and u_{r+1}^T A = ... = u_m^T A = 0.

Proof of (5). A A^T = (U Sigma V^T)(U Sigma V^T)^T = (U Sigma V^T)(V Sigma^T U^T) = U Sigma (V^T V) Sigma^T U^T = U (Sigma Sigma^T) U^T.

Proof of (6). A A^T U = (U Sigma V^T)(V Sigma^T U^T) U = U Sigma (V^T V) Sigma^T (U^T U) = U (Sigma Sigma^T).

So

    A A^T U = [A A^T u_1 ... A A^T u_m] = [lambda_1 u_1 ... lambda_r u_r | 0 ... 0] = [sigma_1^2 u_1 ... sigma_r^2 u_r | 0 ... 0].

Hence A A^T u_1 = sigma_1^2 u_1, ..., A A^T u_r = sigma_r^2 u_r, and A A^T u_{r+1} = ... = A A^T u_m = 0.

Theorem 2. Let A = U Sigma V^T be a singular value decomposition of an m x n real matrix A of rank r. Then

    A = U Sigma V^T = sum_{i=1}^{r} sigma_i u_i v_i^T = sigma_1 u_1 v_1^T + sigma_2 u_2 v_2^T + ... + sigma_r u_r v_r^T.    (13)

Proof. By (6), only the first r columns of U Sigma and the first r rows of V^T contribute to the product, so

    A = U Sigma V^T = [sigma_1 u_1 | sigma_2 u_2 | ... | sigma_r u_r] [v_1^T; v_2^T; ...; v_r^T].

Expanding this product column by row, each column sigma_i u_i multiplies the matching row v_i^T, and each entry of A is the corresponding sum of dyad entries. Therefore

    A = sigma_1 u_1 v_1^T + sigma_2 u_2 v_2^T + ... + sigma_r u_r v_r^T = sum_{i=1}^{r} sigma_i u_i v_i^T.
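Theorem 2 is what the image approximations in Section 5 rely on: truncating the sum after k < r dyads gives a rank-k approximation of A. Below is a minimal Maple check, again on the illustrative 3 x 4 rank-2 matrix from the Section 2 sketch (not a matrix from the examples that follow):

    with(LinearAlgebra):
    A := Matrix([[1, 0, 1, 0], [0, 1, 0, 1], [1, 0, 1, 0]]):
    U, Vt := SingularValues(A, output = ['U', 'Vt']):
    S := SingularValues(A, output = list):
    # Sum of the first r = 2 dyads sigma_i u_i v_i^T; it reproduces A.
    A2 := add(S[i] * (Column(U, i) . Row(Vt, i)), i = 1 .. 2):
    Norm(A - A2, Frobenius);   # ~0, up to floating-point roundoff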

4 Examples

In this section we calculate the singular value decomposition of a few matrices.

Example 1. Let A be a real 3 x 4 matrix of rank 2, so that A^T A is a 4 x 4 symmetric matrix with exactly two positive eigenvalues. Solving the characteristic equation det(A^T A - lambda I) = 0 yields lambda_1 >= lambda_2 > 0 and lambda_3 = lambda_4 = 0, and hence sigma_1 = sqrt(lambda_1) and sigma_2 = sqrt(lambda_2), so that Sigma is the 3 x 4 matrix with sigma_1 and sigma_2 in its first two diagonal positions and zeros elsewhere. Solving (A^T A - lambda_i I) s_i = 0 gives eigenvectors s_1, s_2, s_3, s_4; eigenvectors corresponding to the distinct eigenvalues are orthogonal by Lemma 1 (within the eigenspace for lambda = 0 an orthogonal basis can be chosen), and normalizing them produces the columns of V = [v_1 v_2 v_3 v_4]. The first two left singular vectors are then

    u_1 = A v_1 / sigma_1,   u_2 = A v_2 / sigma_2.

Since A is 3 x 4, U should be a 3 x 3 matrix. However, there is no sigma_3, so we cannot use u_i = A v_i / sigma_i to find u_3. Instead, we use the left null space of A: solve A^T y = 0 for a nonzero y in Null(A^T), then normalize y to obtain

    u_3 = y / ||y||.

So U = [u_1 u_2 u_3], and one verifies that U^T U = I_3, V^T V = I_4, and

    A = U Sigma V^T.
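Carrying Example 1 out in Maple on the illustrative 3 x 4 rank-2 matrix from the Section 2 sketch (a stand-in chosen for this sketch, not the matrix worked by hand in the report):

    with(LinearAlgebra):
    A := Matrix([[1, 0, 1, 0], [0, 1, 0, 1], [1, 0, 1, 0]]):
    # sigma_1 = 2 and sigma_2 = sqrt(2), as computed in the Section 2 sketch.
    v1 := Vector([1, 0, 1, 0]) / sqrt(2):  u1 := (A . v1) / 2:
    v2 := Vector([0, 1, 0, 1]) / sqrt(2):  u2 := (A . v2) / sqrt(2):
    # There is no sigma_3: take u_3 from the left null space Null(A^T).
    y  := NullSpace(Transpose(A))[1]:
    u3 := y / Norm(y, 2):
    U  := <u1 | u2 | u3>:                  # 3 x 3 orthogonal matrix
    Transpose(U) . U;                      # = I_3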

Example 2. Notice that when A is a symmetric matrix, A^T A = A A^T, so the left and right singular vectors can be taken from the same orthonormal set and less work is required: rank(A) = rank(A^T A) = rank(A A^T), and det(A^T A - lambda I) = det(A A^T - lambda I). For instance, let

    A = [1 1; 1 0],   so that   A^T A = A A^T = A^2 = [2 1; 1 1].

Then det(A^T A - lambda I) = (2 - lambda)(1 - lambda) - 1 = lambda^2 - 3 lambda + 1 = 0 gives

    lambda_1 = (3 + sqrt(5))/2,   lambda_2 = (3 - sqrt(5))/2,

and the singular values are

    sigma_1 = sqrt(lambda_1) = (1 + sqrt(5))/2,   sigma_2 = sqrt(lambda_2) = (sqrt(5) - 1)/2,   Sigma = diag(sigma_1, sigma_2).

Solving (A^T A - lambda_i I) s_i = 0 gives the eigenvectors s_1 = (1, sigma_2) and s_2 = (1, -sigma_1), and normalizing them gives the right singular vectors v_i = s_i / ||s_i||. A short computation with A v_i = sigma_i u_i then shows that

    u_1 = v_1   and   u_2 = -v_2.

Thus A = U Sigma V^T with V = [v_1 v_2] and U = [v_1 -v_2]. Notice that the eigenvalues of A, namely (1 + sqrt(5))/2 and (1 - sqrt(5))/2, equal the singular values of A up to sign: the singular values of a symmetric matrix are the absolute values of its eigenvalues (Lemma 7), and the SVD coincides with the spectral decomposition A = Q Lambda Q^T up to these sign choices. In particular, for a symmetric positive semidefinite matrix, U = V and the eigenvalues themselves are the singular values.
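A quick Maple check of this example, with the matrix assumed above:

    with(LinearAlgebra):
    A := Matrix([[1, 1], [1, 0]]):
    Eigenvalues(A);                           # (1 + sqrt(5))/2 and (1 - sqrt(5))/2
    SingularValues(evalf(A), output = list);  # 1.618... and 0.618...

The singular values 1.618... = (1 + sqrt(5))/2 and 0.618... = (sqrt(5) - 1)/2 agree with the eigenvalues only up to the sign of the second one.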

5 Maple

The purpose of subjecting a color photograph to the singular value decomposition is to greatly reduce the amount of data required to transmit the photograph to or from a satellite, for instance. A digital image is essentially a matrix comprised of three other matrices of identical size: the red, green, and blue layers that combine to produce the colors in the original image. Obtain the three layers of the image using the Maple command GetLayers from the ImageTools package; it is on each of these three layers that we perform the SVD. Define them as img_r, img_g, and img_b. Define the singular values of each matrix using the SingularValues command in the LinearAlgebra package. Maple will also calculate U and V; simply set the output of SingularValues to ['U', 'Vt']. In the argument of the following procedure, the variable n denotes which approximation the procedure will compute, that is, the number of singular values that it will include; posint indicates that n must be a positive integer.

    approx := proc(img_r, img_g, img_b, n::posint)
      local Singr, Singg, Singb, Ur, Ug, Ub, Vtr, Vtg, Vtb,
            singr, singg, singb, ur, ug, ub, vr, vg, vb, Mr, Mg, Mb, i, img_rgb;

In place of a Sigma matrix, we create a list of the sigma's for each of the red, green, and blue layers, as well as the red, green, and blue U and V matrices. It is important to note that Maple outputs the transpose of V; hence, it does not need to be transposed.

      Singr := SingularValues(img_r, output = list):
      Singg := SingularValues(img_g, output = list):
      Singb := SingularValues(img_b, output = list):
      Ur, Vtr := SingularValues(img_r, output = ['U', 'Vt']);
      Ug, Vtg := SingularValues(img_g, output = ['U', 'Vt']);
      Ub, Vtb := SingularValues(img_b, output = ['U', 'Vt']);

Pulling out each individual sigma_i, u_i, and v_i to create the sigma_i u_i v_i^T dyads for i = 1, ..., n:

      for i from 1 to n do
        singr[i] := Singr[i];
        singg[i] := Singg[i];
        singb[i] := Singb[i];
        ur[i] := LinearAlgebra:-Column(Ur, i);
        vr[i] := LinearAlgebra:-Column(LinearAlgebra:-Transpose(Vtr), i);
        ug[i] := LinearAlgebra:-Column(Ug, i);
        vg[i] := LinearAlgebra:-Column(LinearAlgebra:-Transpose(Vtg), i);
        ub[i] := LinearAlgebra:-Column(Ub, i);
        vb[i] := LinearAlgebra:-Column(LinearAlgebra:-Transpose(Vtb), i);
      end do;

Note that Maple stores the image data as floating-point numbers, so values that would be 0 otherwise are stored as small decimal numbers very close to, yet still greater than, 0. This means that, when working with Maple, rank(A) = r = min(m, n). Adding the dyads to produce the approximations of each layer:

      Mr := add(singr[i] * (ur[i] . LinearAlgebra:-Transpose(vr[i])), i = 1 .. n);
      Mg := add(singg[i] * (ug[i] . LinearAlgebra:-Transpose(vg[i])), i = 1 .. n);
      Mb := add(singb[i] * (ub[i] . LinearAlgebra:-Transpose(vb[i])), i = 1 .. n);

Combining the approximations of each layer, and displaying the result in Maple:

      img_rgb := CombineLayers(Mr, Mg, Mb):
      Embed(img_rgb);
    end proc:
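A hypothetical call, assuming an image file photo.jpg (the file name is a placeholder) and that both packages have been loaded:

    with(ImageTools): with(LinearAlgebra):
    img := Read("photo.jpg"):               # read the photograph from disk
    img_r, img_g, img_b := GetLayers(img):  # split into red, green, and blue layers
    approx(img_r, img_g, img_b, 20);        # embed the n = 20 approximation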

Application. Suppose we have a color photograph measuring m x n pixels, that is, mn entries per layer. When we separate it into its red, green, and blue layers, the number of entries becomes 3mn. Apply the SVD to each of these matrices and take enough of the dyadic decomposition to obtain a meaningful approximation. For the sake of example, suppose two products from the outer-product decomposition suffice; in other words, for each layer we keep

    sigma_1 u_1 v_1^T + sigma_2 u_2 v_2^T.

The sigma's are just scalars, the u_i are m x 1 column vectors, and the v_i^T are 1 x n row vectors. So it follows that sigma_1 u_1 v_1^T + sigma_2 u_2 v_2^T is determined by 2(m + n + 1) numbers. Multiply this by three to account for the red, green, and blue layers to obtain 6(m + n + 1) numbers, a small percentage of the original 3mn entries that we would have had to send had we not utilized the SVD. Now, when the satellite sends the photograph down to Earth, it sends sigma_1 and sigma_2, u_1 and u_2, and v_1 and v_2 separately; all that needs to be done to recover the approximation is to multiply these together and add them up back on Earth.
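As a quick check of this arithmetic, the following lines compute the ratio for a hypothetical 1000 x 1000 photograph with k = 2 dyads per layer (both values are chosen purely for illustration):

    m, n, k := 1000, 1000, 2:
    original := 3*m*n;                # entries in the three color layers: 3000000
    compressed := 3*k*(m + n + 1);    # k sigmas, k u's, and k v's per layer: 12006
    evalf(100*compressed/original);   # about 0.4 percent of the original data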

Example 3. Here, we show a progression of approximations of a photograph of a galaxy. The picture has dimensions 780 x 960, and each approximation uses the first n singular values. The figure panels show:

(a) The original image.

(b) n = 1. This is merely sigma_1 u_1 v_1^T, the first term in the sum of 780 dyads sigma_i u_i v_i^T; thus, it bears very little resemblance to the original image.

(c) A small n. With only a small percentage of all the data, we can already see the shapes in the original image starting to come together.

(d) A larger n. With only a few percent of the information contained in the original image, we have an approximation that is close to being indistinguishable from the original.

(e) A still larger n, about 64% of all the information in the original image. The lack of a noticeable difference between this picture and the previous one illustrates the fact that the sigma_i values are getting much smaller as i increases.

(f) n = 780. When we use all singular values, the approximation is the same as the original image.

6 Conclusions

In summary, the application of the singular value decomposition we have detailed provides a method of calculating very accurate approximations of photographs, so that they may be transmitted from satellites to Earth without requiring large amounts of data. The SVD provides bases for the four fundamental subspaces of a matrix, and it gets its versatility from the ordering of the sigma values. The SVD is also used in the calculation of pseudoinverses, as illustrated in [3], among other things.
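As a pointer in that direction, the following sketch has Maple compute a Moore-Penrose pseudoinverse directly (the matrix is the illustrative rank-2 example used in the earlier sketches, and we assume the method = pseudo option of LinearAlgebra:-MatrixInverse):

    with(LinearAlgebra):
    A := Matrix([[1, 0, 1, 0], [0, 1, 0, 1], [1, 0, 1, 0]]):
    Ap := MatrixInverse(A, method = pseudo):  # Moore-Penrose pseudoinverse (4 x 3)
    Norm(A . Ap . A - A, Frobenius);          # = 0, since A Ap A = A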

A Appendix

In this appendix, we prove a few results related to symmetric matrices.

Lemma 1. Let A^T = A. Then eigenvectors of A corresponding to different eigenvalues are orthogonal.

Proof. Let A v_1 = lambda_1 v_1 and A v_2 = lambda_2 v_2 with lambda_1 <> lambda_2. We show that v_1 . v_2 = v_1^T v_2 = 0. Observe that

    lambda_1 (v_1 . v_2) = (lambda_1 v_1)^T v_2 = (A v_1)^T v_2 = (v_1^T A^T) v_2 = v_1^T (A v_2) = v_1^T (lambda_2 v_2) = lambda_2 (v_1 . v_2).

Thus lambda_1 (v_1 . v_2) - lambda_2 (v_1 . v_2) = (lambda_1 - lambda_2)(v_1 . v_2) = 0, so v_1 . v_2 = 0 since lambda_1 <> lambda_2.

Lemma 2. Let A^T = A with A real. Then the eigenvalues of A are real.

Proof. Suppose A^T = A and conj(A) = A. Let A v = lambda v, where v <> 0. We show that conj(lambda) = lambda. Taking conjugates, A conj(v) = conj(lambda) conj(v), and thus

    conj(v)^T (A v) = conj(v)^T (lambda v) = lambda (conj(v)^T v),   while   conj(v)^T A v = (A^T conj(v))^T v = (A conj(v))^T v = conj(lambda) (conj(v)^T v).    (14)

Let v = [z_1, ..., z_n]^T with z_i in C. Then

    conj(v)^T v = conj(z_1) z_1 + ... + conj(z_n) z_n = |z_1|^2 + ... + |z_n|^2 > 0.

From (14) we have lambda (conj(v)^T v) = conj(lambda) (conj(v)^T v), hence (conj(lambda) - lambda)(conj(v)^T v) = 0, so conj(lambda) = lambda since conj(v)^T v > 0.

Lemma 3. Let A be an m x n matrix of rank r. Then:

1. Null(A) = Null(A^T A).
2. rank(A) = rank(A^T A).
3. dim(Null(A)) = n - r = dim(Null(A^T A)), dim(Col(A)) = r = dim(Row(A)), and dim(Col(A^T A)) = rank(A^T A) = rank(A) = r.
4. Null(A^T) = Null(A A^T).
5. rank(A^T) = rank(A A^T) = rank(A), so rank(A) = rank(A^T A) = rank(A^T) = rank(A A^T).
6. dim(Null(A^T)) = m - r = dim(Null(A A^T)), dim(Col(A^T)) = rank(A^T) = rank(A) = r = dim(Row(A^T)), and dim(Col(A A^T)) = rank(A A^T) = rank(A) = r.

Proof of (1). Let x be in Null(A). Then Ax = 0, so A^T (Ax) = (A^T A) x = 0 and x is in Null(A^T A). Conversely, let x be in Null(A^T A). Then (A^T A) x = 0, thus A^T (Ax) = 0, so Ax is in Null(A^T). On the other hand, Ax is in Col(A). Since R^m = Col(A) (+) Null(A^T), we conclude that Ax = 0, because Col(A) intersect Null(A^T) = {0}. Thus Null(A^T A) is contained in Null(A), and so Null(A) = Null(A^T A).

Proof of (2). Let r = rank(A). Then r = n - dim(Null(A)) = n - dim(Null(A^T A)) = rank(A^T A), since A^T A is n x n.

Proof of (4). Let x be in Null(A^T); then A^T x = 0, so A (A^T x) = (A A^T) x = 0 and x is in Null(A A^T). Conversely, let x be in Null(A A^T). Then A (A^T x) = 0, so A^T x is in Null(A). On the other hand, A^T x is in Col(A^T). Since R^n = Col(A^T) (+) Null(A), we conclude that A^T x = 0, because Col(A^T) intersect Null(A) = {0}. Thus Null(A A^T) is contained in Null(A^T), and so Null(A^T) = Null(A A^T).

Parts (3), (5), and (6) follow from (1), (2), and (4) by the same dimension count as in (2).

Lemma 4. sigma_i > 0 for i = 1, ..., r.

Proof. Since A^T A v_i = sigma_i^2 v_i for i = 1, ..., r, and the vectors {v_1, ..., v_r} provide a basis for Row(A), we have A v_i <> 0 for i = 1, ..., r. If A^T A v_i = 0, then v_i would belong to Null(A^T A) = Null(A), which would give A v_i = 0, a contradiction. So A^T A v_i = sigma_i^2 v_i <> 0, which implies sigma_i^2 > 0 and therefore sigma_i > 0 for i = 1, ..., r.

Lemma 5. Since the vectors {v_1, ..., v_r} are orthonormal, the vectors {u_1, ..., u_r} computed as

    u_i = A v_i / sigma_i, i = 1, ..., r,

are also orthonormal.

Proof. We have

    (A v_i)^T (A v_j) = v_i^T (A^T A v_j) = v_i^T (sigma_j^2 v_j) = sigma_j^2 (v_i^T v_j) = sigma_j^2 delta_ij = { sigma_j^2, i = j; 0, i <> j }.

Thus the vectors {u_i} are orthogonal, and they are orthonormal because

    u_i^T u_j = (A v_i / sigma_i)^T (A v_j / sigma_j) = (A v_i)^T (A v_j) / (sigma_i sigma_j) = sigma_j^2 delta_ij / (sigma_i sigma_j) = delta_ij = { 1, i = j; 0, i <> j }.

Lemma 6. Let v be a nonzero n x 1 vector. Then the dyad v v^T has rank 1.

Proof. We compute

    v v^T = v [v_1 v_2 ... v_n] = [v_1 v | v_2 v | ... | v_n v],

an n x n matrix whose rows are v_1 v^T, v_2 v^T, ..., v_n v^T. Since every row of v v^T is a multiple of v^T, the rows are linearly dependent, and any nonzero row spans the row space. Therefore rank(v v^T) = 1.

Lemma 7. Let A be a real symmetric matrix. Then the singular values of A are the absolute values of its nonzero eigenvalues.

Proof. Let A^T = A, let lambda be an eigenvalue of A, and let v <> 0 be a corresponding eigenvector. Then A v = lambda v, and

    A^T A v = A (A v) = A (lambda v) = lambda (A v) = lambda^2 v.

Hence lambda^2 is an eigenvalue of A^T A, and therefore |lambda| = sqrt(lambda^2) is a singular value of A whenever lambda <> 0. On the other hand, let sigma > 0 be a singular value of A, so that sigma^2 is an eigenvalue of A^T A = A^2. Then

    det(A^T A - sigma^2 I) = det(A^2 - sigma^2 I) = det((A - sigma I)(A + sigma I)) = det(A - sigma I) det(A + sigma I) = 0,

which implies that det(A - sigma I) = 0 or det(A + sigma I) = 0; that is, sigma or -sigma is an eigenvalue of A. In either case, sigma = |lambda| for some eigenvalue lambda of A. In particular, when A is symmetric positive semidefinite, the singular values of A are exactly its eigenvalues.

Lemma 8. Let A be an m x n matrix. Then A^T A and A A^T share the same nonzero eigenvalues and, therefore, both provide the singular values of A. In the case where m = n, the characteristic polynomials agree: p_{A^T A}(t) = p_{A A^T}(t).

Proof. From parts (2) and (5) of Theorem 1, we have

    A^T A = V (Sigma^T Sigma) V^T = V Lambda_n V^T,   A A^T = U (Sigma Sigma^T) U^T = U Lambda_m U^T,

where Lambda_n = Sigma^T Sigma is n x n and Lambda_m = Sigma Sigma^T is m x m. Hence A^T A is similar to Lambda_n, and A A^T is similar to Lambda_m. We cannot conclude that Lambda_n is similar to Lambda_m when m <> n. However, from Definition 1 we know what Sigma and Sigma^T look like, so we know that when we multiply them together to obtain Lambda_n and Lambda_m we get two square diagonal matrices with sigma_1^2, ..., sigma_r^2, that is, lambda_1, ..., lambda_r, followed by zeros along the diagonal, and zeros elsewhere. The only difference between the two is that one is larger, depending on whether m < n or m > n, and has more zeros on its diagonal:

    Lambda_n = diag(lambda_1, ..., lambda_r, 0, ..., 0)  (n x n),   Lambda_m = diag(lambda_1, ..., lambda_r, 0, ..., 0)  (m x m).

Since the eigenvalues of a diagonal matrix are simply the entries on its diagonal, we know that Lambda_n and Lambda_m both have lambda_1, ..., lambda_r as eigenvalues. Because they are similar to A^T A and A A^T, respectively, we know that A^T A and A A^T both must also have lambda_1, ..., lambda_r as eigenvalues. Moreover, an n x n matrix has n entries on its diagonal and hence has a characteristic polynomial of degree n. Thus

    p_{A^T A}(t) = det(Lambda_n - tI) = (lambda_1 - t)(lambda_2 - t) ... (lambda_r - t) (-t)^{n-r} = (-t)^{n-r} h(t),

and

    p_{A A^T}(t) = det(Lambda_m - tI) = (lambda_1 - t)(lambda_2 - t) ... (lambda_r - t) (-t)^{m-r} = (-t)^{m-r} h(t),

where h(t) = (lambda_1 - t) ... (lambda_r - t) is a polynomial of degree r with only lambda_1, ..., lambda_r as roots. Therefore, when A is a square n x n matrix, we have the special case where p_{A^T A}(t) = p_{A A^T}(t) = (-t)^{n-r} h(t), and Lambda_m = Lambda_n.
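A quick Maple check of this relation, again with the illustrative rank-2 matrix used in the earlier sketches (note that Maple's CharacteristicPolynomial uses the monic convention det(tI - M)):

    with(LinearAlgebra):
    A := Matrix([[1, 0, 1, 0], [0, 1, 0, 1], [1, 0, 1, 0]]):
    factor(CharacteristicPolynomial(Transpose(A) . A, t));  # t^2 (t - 4)(t - 2)
    factor(CharacteristicPolynomial(A . Transpose(A), t));  # t   (t - 4)(t - 2)

The two polynomials differ only by the power of t, exactly as in the proof.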

References

[1] Maple 2018. Maplesoft, a division of Waterloo Maple Inc., Waterloo, Ontario.

[2] APOD, 3 April 2018, "The Blue Horsehead Nebula in Infrared", Astronomy Picture of the Day, NASA.

[3] D. Poole, Linear Algebra: A Modern Introduction, Cengage Learning, Stamford, CT.

[4] G. Strang, Introduction to Linear Algebra, Wellesley-Cambridge Press, Wellesley, MA.
