ON THE SINGULAR DECOMPOSITION OF MATRICES
An. Şt. Univ. Ovidius Constanţa, Vol. 18, 2010, 55–62

Alina PETRESCU-NIŢǍ

Abstract

This paper is an original presentation of the algorithm of the singular decomposition and, implicitly, of the computation of the pseudoinverse of any matrix with real coefficients. At the same time, we give a geometric interpretation of that decomposition. In the last section, we present an application of the singular decomposition to a problem of pattern recognition.

1 Introduction

Let $A \in M_{m,n}(\mathbb{R})$ be a matrix; for any column vector $X \in \mathbb{R}^n \cong M_{n,1}(\mathbb{R})$, define $f_A : \mathbb{R}^n \to \mathbb{R}^m$, $f_A(X) = AX$. One may associate to $A$ four remarkable vector spaces:

$$I(A) = \operatorname{Im} f_A, \quad N(A) = \operatorname{Ker} f_A, \quad I(A^T) \quad \text{and} \quad N(A^T) = \{\, Y \mid A^T Y = 0 \,\}.$$

$I(A)$ is the vector subspace of $\mathbb{R}^m$ generated by the columns of the matrix $A$, whereas $I(A^T)$ is the subspace of $\mathbb{R}^n$ generated by the rows of $A$. If the rank of the matrix $A$ is $\rho(A) = r$, then

$$\dim I(A) = r, \quad \dim N(A) = n - r, \quad \dim I(A^T) = r, \quad \dim N(A^T) = m - r.$$

By convention, any vector $x \in \mathbb{R}^n$, $x = (x_1, x_2, \dots, x_n)$, is identified with the corresponding column vector $X$; for any $x, y \in \mathbb{R}^n$, the Euclidean scalar product is defined by $\langle x, y \rangle = X^T Y$. The following properties are well known:

1. The subspace $N(A)$ is orthogonal to $I(A^T)$, and $N(A^T)$ is orthogonal to $I(A)$.

Key Words: pseudoinverse of a matrix, algorithm of singular decomposition, geometric interpretation, pattern recognition of plane images.
Mathematics Subject Classification: 15A09
Received: August, 2009. Accepted: January, 2010.
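The dimension relations above can be checked numerically. The following sketch is ours, not from the paper: it builds a deliberately rank-deficient matrix (the sizes and the random construction are assumptions for illustration) and verifies $\dim I(A) = r$, $\dim N(A) = n - r$, $\dim N(A^T) = m - r$ with NumPy.

```python
# Sketch (our own example): checking the dimension relations for the four
# fundamental subspaces of a rank-deficient matrix.
import numpy as np

m, n, r = 5, 4, 2
rng = np.random.default_rng(0)
# A 5x4 matrix of rank 2, built as a product of full-rank factors.
A = rng.standard_normal((m, r)) @ rng.standard_normal((r, n))

rank = np.linalg.matrix_rank(A)
dim_IA  = rank                        # dim I(A): column space of A
dim_NA  = n - rank                    # dim N(A): null space of A
dim_IAT = np.linalg.matrix_rank(A.T)  # dim I(A^T): row space of A
dim_NAT = m - rank                    # dim N(A^T): null space of A^T

assert (dim_IA, dim_NA, dim_IAT, dim_NAT) == (2, 2, 2, 3)
```

The check relies on the rank-nullity theorem exactly as stated in the text; `matrix_rank` computes the rank via singular values, which is the decomposition the paper is about to construct.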
2. The following orthogonal decompositions hold:

$$\mathbb{R}^n = N(A) \oplus I(A^T), \qquad \mathbb{R}^m = N(A^T) \oplus I(A).$$

3. The square matrix $A^T A$ of order $n$ is symmetric and non-negatively defined, i.e. $X^T (A^T A) X \ge 0$ for any column vector $X$.

4. For any matrix $A \in M_{m,n}(\mathbb{R})$ of rank $r$, the number of positive eigenvalues of the matrix $A^T A$ is equal to $r$.

5. We suppose that $m \ge n$. Let $\lambda_1, \lambda_2, \dots, \lambda_r$ be the positive eigenvalues of the matrix $A^T A$ and $\sigma_k = \sqrt{\lambda_k}$, $1 \le k \le r$. These are called the singular numbers of the matrix $A$. There exists an orthonormal basis $\{v_1, v_2, \dots, v_n\}$ of $\mathbb{R}^n$ formed by unit eigenvectors of $A^T A$, such that $A^T A v_i = \sigma_i^2 v_i$, $1 \le i \le r$, and $A^T A v_j = 0$, $r+1 \le j \le n$. If we denote $u_i = \frac{1}{\sigma_i} A v_i$, $1 \le i \le r$, it follows that the vectors $v_1, \dots, v_r$ form an orthonormal basis for $I(A^T)$ and $v_{r+1}, \dots, v_n$ form an orthonormal basis for $N(A)$. Further, $u_1, \dots, u_r$ form an orthonormal basis for $I(A)$, which can be extended to an orthonormal basis $u_1, \dots, u_r, u_{r+1}, \dots, u_m$ of $\mathbb{R}^m$; the added vectors $u_{r+1}, \dots, u_m$ form an orthonormal basis for $N(A^T)$.

6. The orthogonal matrices $U = (u_1 \mid u_2 \mid \dots \mid u_m)$ and $V = (v_1 \mid v_2 \mid \dots \mid v_n)$ are invertible (the inverse is just the transposed matrix). We have

$$AV = (Av_1 \mid Av_2 \mid \dots \mid Av_n) = (\sigma_1 u_1 \mid \dots \mid \sigma_r u_r \mid 0 \mid \dots \mid 0), \qquad \sigma_i > 0, \ i = 1, 2, \dots, r,$$

and if we denote $S = \operatorname{diag}(\sigma_1, \sigma_2, \dots, \sigma_r)$ and $\Sigma = \begin{pmatrix} S & 0 \\ 0 & 0 \end{pmatrix}$, an $m \times n$ matrix, it follows that $AV = U\Sigma$ and the following relation holds:

$$A = U \Sigma V^T. \tag{1}$$

This is the singular decomposition of the matrix $A$. If $A = 0$, then $\Sigma = 0$.

7. Until now we have supposed that $m \ge n$. If $A \in M_{m,n}(\mathbb{R})$ and $m < n$, then we may consider the $n \times m$ matrix $B = A^T$. According to the properties 5 and 6, we have the singular decomposition $B = U_1 \Sigma_1 V_1^T$, and it follows that

$$A = B^T = V_1 \Sigma_1^T U_1^T. \tag{2}$$
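Properties 5 and 6 are constructive: the whole decomposition can be read off from the eigen-decomposition of $A^T A$. A minimal NumPy sketch (the example matrix is our own assumption, and the basis completion via QR is one possible implementation of "extend to an orthonormal basis"):

```python
# Sketch: build A = U Sigma V^T from the eigenvectors of A^T A,
# following properties 5 and 6 of the text.
import numpy as np

A = np.array([[3.0, 1.0],
              [1.0, 3.0],
              [0.0, 2.0]])          # m = 3 >= n = 2
m, n = A.shape

lam, V = np.linalg.eigh(A.T @ A)    # eigenvalues ascending, orthonormal V
order = np.argsort(lam)[::-1]       # reorder descending
lam, V = lam[order], V[:, order]

r = int(np.sum(lam > 1e-12))        # rank = number of positive eigenvalues
sigma = np.sqrt(lam[:r])            # singular numbers sigma_k = sqrt(lambda_k)

U_r = A @ V[:, :r] / sigma          # u_i = (1/sigma_i) A v_i
# Complete u_1,...,u_r to an orthonormal basis of R^m; the added columns
# span N(A^T).
Q, _ = np.linalg.qr(np.hstack([U_r, np.eye(m)]))
U = np.hstack([U_r, Q[:, r:m]])

Sigma = np.zeros((m, n))
Sigma[:r, :r] = np.diag(sigma)
assert np.allclose(U @ Sigma @ V.T, A, atol=1e-10)
```

Note that `eigh` already returns unit eigenvectors, so the columns of `V` are exactly the basis $\{v_1, \dots, v_n\}$ of property 5.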
The main application of the singular decomposition is the explicit way to compute the pseudoinverse $A^+$ of an arbitrary non-null matrix $A \in M_{m,n}(\mathbb{R})$: namely, if $A = U \Sigma V^T$ according to (1), then

$$A^+ = V \Sigma^+ U^T, \quad \text{where } \Sigma^+ = \begin{pmatrix} S^{-1} & 0 \\ 0 & 0 \end{pmatrix} \in M_{n,m}(\mathbb{R}) \text{ and } S^{-1} = \operatorname{diag}\!\left(\frac{1}{\sigma_1}, \frac{1}{\sigma_2}, \dots, \frac{1}{\sigma_r}\right).$$

NOTE. One knows that for any $A \in M_{m,n}(\mathbb{R})$ there is $A^+ \in M_{n,m}(\mathbb{R})$, called the pseudoinverse of $A$, such that for any $y \in \mathbb{R}^m$ the minimum of the Euclidean norm $\lVert Ax - y \rVert$ is attained for $x = A^+ y$.

2 The Singular Decomposition Algorithm

One knows that any rectangular matrix (or square matrix, invertible or not) admits a singular decomposition given by (1) or (2). Suppose now that the matrix $A \in M_{m,n}(\mathbb{R})$ is given and $m \ge n$. If $m < n$, then the algorithm applies to the matrix $A^T$.

Step 1. Compute the symmetric matrix $A^T A \in M_n(\mathbb{R})$ and determine the nonzero eigenvalues $\lambda_1, \lambda_2, \dots, \lambda_r$, as well as the singular numbers $\sigma_1 = \sqrt{\lambda_1}, \sigma_2 = \sqrt{\lambda_2}, \dots, \sigma_r = \sqrt{\lambda_r}$, where $\rho(A) = r$.

Step 2. Determine an orthonormal basis $\{v_1, v_2, \dots, v_n\}$ of $\mathbb{R}^n$ formed by unit eigenvectors of $A^T A$, and denote by $V \in M_n(\mathbb{R})$ the orthogonal matrix whose columns are the vectors $v_1, v_2, \dots, v_n$.

Step 3. Compute the unit column vectors $u_i = \frac{1}{\sigma_i} A v_i$ for $1 \le i \le r$ and complete them to an orthonormal basis $\{u_1, \dots, u_r, u_{r+1}, \dots, u_m\}$ of $\mathbb{R}^m$. Denote by $U \in M_m(\mathbb{R})$ the orthogonal matrix formed by the column vectors $u_1, \dots, u_r, u_{r+1}, \dots, u_m$.

Step 4. Taking $S = \operatorname{diag}(\sigma_1, \dots, \sigma_r)$ and defining the $m \times n$ matrix $\Sigma = \begin{pmatrix} S & 0 \\ 0 & 0 \end{pmatrix}$, having $S$ in the upper-left corner, one obtains the singular decomposition (1).

Example 1. Take $A = \begin{pmatrix} 2 & 4 \\ 0 & 0 \end{pmatrix}$, hence $m = 2$, $n = 2$, $r = 1$. The matrix $A^T A$ has the eigenvalues $\lambda_1 = 20$, $\lambda_2 = 0$, with the unit eigenvectors $v_1 = \frac{1}{\sqrt{5}}(1, 2)^T$, $v_2 = \frac{1}{\sqrt{5}}(2, -1)^T$. Then $u_1 = \frac{1}{\sqrt{20}} A v_1 = (1, 0)^T$, and take $u_2 = (0, 1)^T$. So

$$U = \begin{pmatrix} 1 & 0 \\ 0 & 1 \end{pmatrix}, \quad V = \frac{1}{\sqrt{5}} \begin{pmatrix} 1 & 2 \\ 2 & -1 \end{pmatrix}, \quad \Sigma = \begin{pmatrix} 2\sqrt{5} & 0 \\ 0 & 0 \end{pmatrix},$$

and finally $A = U \Sigma V^T$.
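The pseudoinverse formula $A^+ = V \Sigma^+ U^T$ can be checked against NumPy's built-in `pinv`. The rank-one test matrix below is our own choice, not the paper's; the sketch also verifies two of the Moore-Penrose identities that characterize $A^+$.

```python
# Sketch: pseudoinverse via the singular decomposition, A^+ = V Sigma^+ U^T,
# cross-checked with numpy.linalg.pinv.  The matrix A is our own example.
import numpy as np

A = np.array([[2.0, 4.0],
              [1.0, 2.0],
              [0.0, 0.0]])                       # rank 1, m = 3, n = 2

U, s, Vt = np.linalg.svd(A)                      # full SVD: U 3x3, Vt 2x2
r = int(np.sum(s > 1e-12))

Sigma_plus = np.zeros((A.shape[1], A.shape[0]))  # n x m, S^{-1} in the corner
Sigma_plus[:r, :r] = np.diag(1.0 / s[:r])

A_plus = Vt.T @ Sigma_plus @ U.T
assert np.allclose(A_plus, np.linalg.pinv(A))

# A^+ y is the least-squares solution of Ax = y, as stated in the NOTE.
y = np.array([1.0, 2.0, 3.0])
x = A_plus @ y

# Two Moore-Penrose identities:
assert np.allclose(A @ A_plus @ A, A)
assert np.allclose(A_plus @ A @ A_plus, A_plus)
```

Zeroing out the reciprocal of every singular number below a tolerance (here `1e-12`) is what makes the formula well defined for rank-deficient matrices.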
Example 2. For $B = \begin{pmatrix} 1 & 1 & 0 \\ 1 & 0 & 1 \end{pmatrix}$ and $A = B^T$, we have $A^T A = \begin{pmatrix} 2 & 1 \\ 1 & 2 \end{pmatrix}$; $\lambda_1 = 3$, $\lambda_2 = 1$, and the singular numbers of $A$ are $\sigma_1 = \sqrt{3}$, $\sigma_2 = 1$, $r = 2$. Hence the unit eigenvectors of $A^T A$ are

$$v_1 = \frac{1}{\sqrt{2}}(1, 1)^T \quad \text{and} \quad v_2 = \frac{1}{\sqrt{2}}(1, -1)^T.$$

Take $u_1 = \frac{1}{\sqrt{3}} A v_1 = \left(\frac{2}{\sqrt{6}}, \frac{1}{\sqrt{6}}, \frac{1}{\sqrt{6}}\right)^T$, $u_2 = A v_2 = \left(0, \frac{1}{\sqrt{2}}, -\frac{1}{\sqrt{2}}\right)^T$, and we complete $u_1, u_2$ to an orthonormal basis of $\mathbb{R}^3$. We take $u_3 = (a, b, c)^T$ with unknown components and impose the conditions $u_3 \perp u_1$, $u_3 \perp u_2$ and $a^2 + b^2 + c^2 = 1$. It follows that $u_3 = \left(\frac{1}{\sqrt{3}}, -\frac{1}{\sqrt{3}}, -\frac{1}{\sqrt{3}}\right)^T$, and we denote

$$U = \begin{pmatrix} \frac{2}{\sqrt{6}} & 0 & \frac{1}{\sqrt{3}} \\[2pt] \frac{1}{\sqrt{6}} & \frac{1}{\sqrt{2}} & -\frac{1}{\sqrt{3}} \\[2pt] \frac{1}{\sqrt{6}} & -\frac{1}{\sqrt{2}} & -\frac{1}{\sqrt{3}} \end{pmatrix}, \quad V = \frac{1}{\sqrt{2}} \begin{pmatrix} 1 & 1 \\ 1 & -1 \end{pmatrix}, \quad \Sigma = \begin{pmatrix} \sqrt{3} & 0 \\ 0 & 1 \\ 0 & 0 \end{pmatrix};$$

finally, the singular decomposition is $A = U \Sigma V^T$ and $B = V \Sigma^T U^T$.

3 Geometric Interpretation of the Singular Decomposition

From a geometrical point of view, any orthogonal matrix $U \in M_m(\mathbb{R})$ corresponds to a rotation of the space $\mathbb{R}^m$. For $m = 2$, an orthogonal matrix $U \in M_2(\mathbb{R})$ has the form

$$U = \begin{pmatrix} \cos\theta & -\sin\theta \\ \sin\theta & \cos\theta \end{pmatrix}, \qquad \theta \in \mathbb{R},$$

and the application $f_U : \mathbb{R}^2 \to \mathbb{R}^2$, $f_U(x, y) = (x', y')$, becomes $x' = x\cos\theta - y\sin\theta$, $y' = x\sin\theta + y\cos\theta$. These are the formulae of the plane rotation of angle $\theta$ centered at the origin. This fact generalizes to higher dimensions.

Any matrix of type $S$ or $\Sigma$ corresponds, from a geometrical point of view, to a scale change. For instance, if $n = 2$ and $S = \begin{pmatrix} \sigma_1 & 0 \\ 0 & \sigma_2 \end{pmatrix}$, the application $f_S : \mathbb{R}^2 \to \mathbb{R}^2$, $f_S(x, y) = (x', y')$, becomes $x' = \sigma_1 x$ and $y' = \sigma_2 y$, and we get the plane scale-change formulae.
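The data of Example 2, as reconstructed from the transcription (the matrix $B = \begin{pmatrix} 1 & 1 & 0 \\ 1 & 0 & 1 \end{pmatrix}$ and the listed vectors are our reading of the garbled source), can be verified numerically; one of the two possible signs of $u_3$ is chosen, which does not affect the decomposition since $u_3$ multiplies a zero row of $\Sigma$.

```python
# Numerical check of Example 2 as transcribed: A = B^T has singular
# numbers sqrt(3) and 1, and U Sigma V^T reproduces A.
import numpy as np

B = np.array([[1.0, 1.0, 0.0],
              [1.0, 0.0, 1.0]])
A = B.T                                    # m = 3, n = 2

lam = np.linalg.eigvalsh(A.T @ A)          # eigenvalues of A^T A
assert np.allclose(np.sort(lam), [1.0, 3.0])

v1 = np.array([1.0, 1.0]) / np.sqrt(2)
v2 = np.array([1.0, -1.0]) / np.sqrt(2)
u1 = A @ v1 / np.sqrt(3)                   # (2, 1, 1)/sqrt(6)
u2 = A @ v2                                # (0, 1, -1)/sqrt(2)
u3 = np.array([1.0, -1.0, -1.0]) / np.sqrt(3)

U = np.column_stack([u1, u2, u3])
V = np.column_stack([v1, v2])
Sigma = np.array([[np.sqrt(3.0), 0.0],
                  [0.0, 1.0],
                  [0.0, 0.0]])
assert np.allclose(U.T @ U, np.eye(3))     # U is orthogonal
assert np.allclose(U @ Sigma @ V.T, A)     # the singular decomposition
assert np.allclose(V @ Sigma.T @ U.T, B)   # and B = V Sigma^T U^T
```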
Proposition. Any linear application $f : \mathbb{R}^n \to \mathbb{R}^n$ is a composition of a rotation with a scale change, followed by another rotation.

Proof. Let $A$ be the matrix associated to the linear application $f$ with respect to the canonical basis. According to (1), the singular decomposition of the matrix $A$ has the form $A = U \Sigma V^T$, with $U$ and $V$ orthogonal matrices and $\Sigma$ a diagonal matrix. Then $f = f_A = f_U \circ f_\Sigma \circ f_{V^T}$. The maps $f_U$ and $f_{V^T}$ correspond to orthogonal matrices and they represent rotations, whereas $f_\Sigma$ is a scale change, namely $f_\Sigma(x_1, x_2, \dots, x_n) = (\sigma_1 x_1, \dots, \sigma_r x_r, 0, \dots, 0)$. Relation (1) thus corresponds to the statement of the proposition.

Figure 1. The image of the unit sphere $S^{n-1}$ under $f_A$.

Let $A \in M_n(\mathbb{R})$ be a nonsingular square matrix (hence $m = n = r$) and let $f_A : \mathbb{R}^n \to \mathbb{R}^n$ be the linear application associated to $A$ in the canonical basis of $\mathbb{R}^n$. Through the application $f_A$, the unit sphere $S^{n-1} = \{x \in \mathbb{R}^n \mid \lVert x \rVert = 1\}$ is transformed into an $n$-dimensional ellipsoid $E_n = f_A(S^{n-1})$. Indeed, if $y = f_A(x)$, it follows that $Y = AX$, $X = A^{-1}Y$, hence

$$1 = \lVert X \rVert^2 = \lVert A^{-1} Y \rVert^2 = \langle A^{-1} Y, A^{-1} Y \rangle = Y^T (A^{-1})^T A^{-1}\, Y,$$

and the matrix $C = (A^{-1})^T A^{-1}$ is positively defined, hence the set $E_n = \{Y = AX \mid X \in S^{n-1}\}$ defines an ellipsoid.

The recursive construction of the orthonormal bases $v_1, v_2, \dots, v_n$ and $u_1, \dots, u_n$ also has a geometric interpretation, which is presented in the sequel. Let $w_1$ be
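The proposition can be illustrated numerically: applying the three factors in sequence (rotation $V^T$, scale change $\Sigma$, rotation $U$) to points of the unit circle yields exactly $f_A$, and the longest and shortest radial vectors of the resulting ellipse are the singular numbers. The $2 \times 2$ matrix below is our own example.

```python
# Sketch: f_A = f_U o f_Sigma o f_{V^T}, and the unit circle is mapped
# onto an ellipse whose semi-axes are the singular numbers of A.
import numpy as np

A = np.array([[3.0, 1.0],
              [1.0, 2.0]])
U, s, Vt = np.linalg.svd(A)

# Sample points on the unit circle S^1.
t = np.linspace(0.0, 2.0 * np.pi, 400)
circle = np.vstack([np.cos(t), np.sin(t)])        # shape (2, 400)

# Rotation V^T, then scale change Sigma, then rotation U.
ellipse = U @ np.diag(s) @ (Vt @ circle)
assert np.allclose(ellipse, A @ circle)

# Longest / shortest radial vectors of the ellipse = sigma_1 / sigma_2
# (up to the sampling resolution of the circle).
radii = np.linalg.norm(ellipse, axis=0)
assert abs(radii.max() - s[0]) < 1e-3
assert abs(radii.min() - s[1]) < 1e-3
```

The maximal radial vector found here is the vector $w_1$ of the recursive construction described next, and the preimage direction attaining it is $v_1$.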
a radial vector of maximal length in the ellipsoid and $v_1 = A^{-1} w_1$; we take $u_1 = \frac{1}{\lVert w_1 \rVert} w_1$. If we denote by $H$ the tangent hyperplane at $v_1$ to the unit sphere $S^{n-1}$ and $H' = f_A(H)$, then it follows that the hyperplane $H'$ is tangent at $w_1$ to the ellipsoid $E_n$ (see the figure). Indeed, we have $w_1 \in H'$, and $H'$ has just one common point with the ellipsoid $E_n$ (otherwise, since the application $f_A$ is bijective, it would follow that $H$ is not tangent to the sphere); moreover, $w_1 \perp H'$. Considering the restriction $g$ of the linear application $f_A$ to $H$, we obtain a linear application $g : H \to H'$, for which we can repeat the previous construction. This geometric interpretation leads to the singular decomposition without appealing to the study of the matrix $A^T A$.

4 An Application of the Singular Decomposition to the Classification of 2D Images

Let $A \in M_{m,n}(\mathbb{R})$ be the gray-levels matrix of a 2D black-and-white image (for example: a photograph, a map, a black-and-white TV image, etc.). Such a matrix can be obtained by splitting the image with a rectangular network and associating to each node $(i, j)$, $1 \le i \le m$, $1 \le j \le n$, the gray level expressed as an integer number in the range between 0 and 63, where, for example, 0 stands for absolute white and 63 stands for absolute black.

Let us consider the singular decomposition

$$A = U \Sigma V^T = \sum_{i=1}^{r} \sqrt{\lambda_i}\, u_i v_i^T, \qquad r = \rho(A),$$

where $\lambda_1 > \lambda_2 > \dots > \lambda_r > 0$ and $\lambda_1, \dots, \lambda_r$ are the nonzero eigenvalues of the matrix $A A^T$. If $A = (a_{ij})$, $1 \le i \le m$, $1 \le j \le n$, then the Frobenius norm $\lVert A \rVert_F = \left(\sum_{i,j} a_{ij}^2\right)^{1/2}$ can be called the energy of the considered image. If the small eigenvalues are eliminated, we obtain the approximation

$$A_1 = \sum_{i=1}^{k} \sqrt{\lambda_i}\, u_i v_i^T, \quad k \ll r, \qquad \text{and} \qquad \lVert A - A_1 \rVert_F = \left(\sum_{i=k+1}^{r} \lambda_i\right)^{1/2}.$$

If $B \in M_{m,n}(\mathbb{R})$ is another matrix, then the matrix

$$B_1 = U \Sigma' V^T = \sum_{i=1}^{r} \lambda'_i\, u_i v_i^T, \qquad \Sigma' = \operatorname{diag}(\lambda'_1, \dots, \lambda'_r, 0, \dots, 0), \quad \lambda'_i = u_i^T B v_i, \ 1 \le i \le r,$$

can be called the projective image of $B$ on $A$. By the triangle inequality applied to $B - B_1 = (B - A) + (A - B_1)$, we have

$$\lVert B - B_1 \rVert_F \le \lVert A - B \rVert_F + \left(\sum_{i=1}^{r} \left(\sqrt{\lambda_i} - \lambda'_i\right)^2\right)^{1/2}.$$

For similar 2D images of the same class, the distance between the associated matrices (i.e. the Frobenius norm of the difference of the matrices) is also
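The rank-$k$ approximation and its exact error formula can be demonstrated on synthetic data (the image below is random gray levels of our own making, not an image from the paper):

```python
# Sketch: rank-k approximation of a gray-levels matrix; the approximation
# error equals the square root of the sum of the discarded eigenvalues.
import numpy as np

rng = np.random.default_rng(1)
A = rng.integers(0, 64, size=(32, 48)).astype(float)   # gray levels 0..63

U, s, Vt = np.linalg.svd(A, full_matrices=False)
k = 8
A1 = U[:, :k] @ np.diag(s[:k]) @ Vt[:k, :]             # sum of k rank-1 terms

err = np.linalg.norm(A - A1, 'fro')
# lambda_i = s_i^2, so ||A - A_1||_F = sqrt(lambda_{k+1} + ... + lambda_r).
assert np.isclose(err, np.sqrt(np.sum(s[k:] ** 2)))
# Most of the "energy" ||A||_F is retained.
assert err < np.linalg.norm(A, 'fro')
```

Storing the truncated factors takes $k(m + n + 1)$ numbers instead of $mn$, which is the compression argument implicit in the text.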
small; passing to the projective images, these distances become even smaller, because $\lVert B_1 - C_1 \rVert_F \le \lVert B - C \rVert_F$ for any two matrices $B, C \in M_{m,n}(\mathbb{R})$.

Let us suppose that, for an image class $\omega$, we have the learning sample $A_1, A_2, \dots, A_N \in M_{m,n}(\mathbb{R})$. The average is given by $\mu = \frac{1}{N}(A_1 + \dots + A_N)$ and, as above, the singular decomposition of the average is $\mu = U \Sigma V^T = \sum_{i=1}^{r} \sqrt{\lambda_i}\, u_i v_i^T$. Similarly we get the projective images $A'_1, \dots, A'_N$ on $\mu$, hence $A'_i = \sum_{j=1}^{r} x_j^{(i)} u_j v_j^T$, $1 \le i \le N$, where $x_j^{(i)} = u_j^T A_i v_j$, $1 \le j \le r$. The vector $X_i = \left(x_1^{(i)}, \dots, x_r^{(i)}\right)$ can be interpreted as the coordinates vector of the projective image on $\mu$ of the matrix $A_i$, $1 \le i \le N$.

The algorithm of supervised classification of images

Let $\omega_1, \omega_2, \dots, \omega_M$ be $M$ classes of images (already existing classes). We suppose that each class $\omega_i$ is represented by $N_i$ matrices of learning samples $A_1^{(i)}, A_2^{(i)}, \dots, A_{N_i}^{(i)}$ belonging to $M_{m,n}(\mathbb{R})$.

Step 1. Compute the average $\mu_i = \frac{1}{N_i}\left(A_1^{(i)} + \dots + A_{N_i}^{(i)}\right)$ and the singular decomposition of this matrix, hence the set of vectors $u_j^{(i)}, v_j^{(i)}$, $1 \le j \le k$, $k \le \min(m, n)$, $1 \le i \le M$.

Step 2. Compute the vectors of the coordinates of the projective images on $\mu_i$, i.e. $X_j^{(i)} = \left(x_j^{(i)}(1), \dots, x_j^{(i)}(k)\right)$, where $x_j^{(i)}(p) = \left(u_p^{(i)}\right)^T A_j^{(i)} v_p^{(i)}$, $1 \le p \le k$, $1 \le j \le N_i$, $1 \le i \le M$.

Step 3. Compute the centers of the classes $\omega_i$ by $X_C^{(i)} = \frac{1}{N_i} \sum_{j=1}^{N_i} X_j^{(i)}$.

Step 4 (the recognition step). For any unclassified new 2D image $F \in M_{m,n}(\mathbb{R})$, compute the projective images of $F$ on $\mu_i$, $1 \le i \le M$, and the corresponding coordinates vectors $Y_1, \dots, Y_M$. If we denote $z_p^{(i)} = \left(u_p^{(i)}\right)^T F\, v_p^{(i)}$, $1 \le p \le k$, we have $Y_i = \left(z_1^{(i)}, \dots, z_k^{(i)}\right)$, $1 \le i \le M$. If $\min_{1 \le i \le M} \lVert Y_i - X_C^{(i)} \rVert$ is reached for an index $i = i_0$ (not necessarily unique), then the image $F$ is placed in the class $\omega_{i_0}$.

References

[1] Ben-Israel, A., Greville, T.N.E., Generalized Inverses: Theory and Applications, Springer-Verlag, New York, 2003.

[2] Niţǎ, A., Generalized inverse of a matrix with applications to optimization of some systems, Ph.D. Thesis (in Romanian), University of Bucharest, Faculty of Mathematics and Computer Science, 2004.
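The four steps can be sketched end to end on synthetic data. Everything below (image sizes, the noisy-pattern data model, variable names) is an assumption of ours for illustration; the structure follows Steps 1-4 above with a common number $k$ of retained terms per class.

```python
# Compact sketch (synthetic data) of the supervised classification algorithm:
# class averages mu_i, coordinates of projective images, class centers,
# and nearest-center recognition.
import numpy as np

rng = np.random.default_rng(2)
m, n, k, M, N = 16, 16, 5, 3, 10   # image size, kept terms, classes, samples

# Synthetic learning samples: each class is a fixed gray-level pattern
# plus small noise.
patterns = [rng.integers(0, 64, size=(m, n)).astype(float) for _ in range(M)]
samples = [[p + rng.normal(0.0, 1.0, (m, n)) for _ in range(N)]
           for p in patterns]

def coords(F, U, V):
    """Coordinates z_p = u_p^T F v_p, 1 <= p <= k, of the projective image."""
    return np.array([U[:, p] @ F @ V[:, p] for p in range(k)])

bases, centers = [], []
for cls in samples:
    mu = sum(cls) / N                              # Step 1: class average
    U, s, Vt = np.linalg.svd(mu)
    V = Vt.T
    X = np.array([coords(A, U, V) for A in cls])   # Step 2: coordinate vectors
    bases.append((U, V))
    centers.append(X.mean(axis=0))                 # Step 3: class center

def classify(F):
    # Step 4: place F in the class whose center X_C^(i) is nearest to Y_i.
    dists = [np.linalg.norm(coords(F, U, V) - c)
             for (U, V), c in zip(bases, centers)]
    return int(np.argmin(dists))

# A fresh noisy sample drawn from class 1 should be recognized as class 1.
test_image = patterns[1] + rng.normal(0.0, 1.0, (m, n))
assert classify(test_image) == 1
```

Only $k$ projections per class are compared, so recognition is cheap once the $k$ singular vector pairs of each class average have been precomputed.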
[3] Strang, G., Linear Algebra and Its Applications, Academic Press, 1976.
[4] Wang, J., A recurrent neural network for real-time matrix inversion, Appl. Math. Comput., 55 (1993).

[5] Wang, J., Recurrent neural networks for computing pseudoinverses of rank-deficient matrices, Appl. Math. Comput.

University POLITEHNICA of Bucharest
Department of Mathematics
Splaiul Independenţei 313, Bucharest, Romania
anita@euler.math.pub.ro
Existence of Generalized Inverse: Ten Proofs and Some Remarks R B Bapat Introduction The theory of g-inverses has seen a substantial growth over the past few decades. It is an area of great theoretical
More informationNotes on Linear Algebra
1 Notes on Linear Algebra Jean Walrand August 2005 I INTRODUCTION Linear Algebra is the theory of linear transformations Applications abound in estimation control and Markov chains You should be familiar
More information3 (Maths) Linear Algebra
3 (Maths) Linear Algebra References: Simon and Blume, chapters 6 to 11, 16 and 23; Pemberton and Rau, chapters 11 to 13 and 25; Sundaram, sections 1.3 and 1.5. The methods and concepts of linear algebra
More informationFall TMA4145 Linear Methods. Exercise set Given the matrix 1 2
Norwegian University of Science and Technology Department of Mathematical Sciences TMA445 Linear Methods Fall 07 Exercise set Please justify your answers! The most important part is how you arrive at an
More informationSymmetric and anti symmetric matrices
Symmetric and anti symmetric matrices In linear algebra, a symmetric matrix is a square matrix that is equal to its transpose. Formally, matrix A is symmetric if. A = A Because equal matrices have equal
More informationLinear Algebra review Powers of a diagonalizable matrix Spectral decomposition
Linear Algebra review Powers of a diagonalizable matrix Spectral decomposition Prof. Tesler Math 283 Fall 2016 Also see the separate version of this with Matlab and R commands. Prof. Tesler Diagonalizing
More informationBare minimum on matrix algebra. Psychology 588: Covariance structure and factor models
Bare minimum on matrix algebra Psychology 588: Covariance structure and factor models Matrix multiplication 2 Consider three notations for linear combinations y11 y1 m x11 x 1p b11 b 1m y y x x b b n1
More information[3] (b) Find a reduced row-echelon matrix row-equivalent to ,1 2 2
MATH Key for sample nal exam, August 998 []. (a) Dene the term \reduced row-echelon matrix". A matrix is reduced row-echelon if the following conditions are satised. every zero row lies below every nonzero
More informationLinear Algebra Primer
Linear Algebra Primer David Doria daviddoria@gmail.com Wednesday 3 rd December, 2008 Contents Why is it called Linear Algebra? 4 2 What is a Matrix? 4 2. Input and Output.....................................
More informationLinear Algebra review Powers of a diagonalizable matrix Spectral decomposition
Linear Algebra review Powers of a diagonalizable matrix Spectral decomposition Prof. Tesler Math 283 Fall 2018 Also see the separate version of this with Matlab and R commands. Prof. Tesler Diagonalizing
More informationLinear Algebra & Geometry why is linear algebra useful in computer vision?
Linear Algebra & Geometry why is linear algebra useful in computer vision? References: -Any book on linear algebra! -[HZ] chapters 2, 4 Some of the slides in this lecture are courtesy to Prof. Octavia
More informationSingular Value Decomposition
Chapter 6 Singular Value Decomposition In Chapter 5, we derived a number of algorithms for computing the eigenvalues and eigenvectors of matrices A R n n. Having developed this machinery, we complete our
More informationECE 275A Homework #3 Solutions
ECE 75A Homework #3 Solutions. Proof of (a). Obviously Ax = 0 y, Ax = 0 for all y. To show sufficiency, note that if y, Ax = 0 for all y, then it must certainly be true for the particular value of y =
More informationLinear Algebra Review. Fei-Fei Li
Linear Algebra Review Fei-Fei Li 1 / 51 Vectors Vectors and matrices are just collections of ordered numbers that represent something: movements in space, scaling factors, pixel brightnesses, etc. A vector
More informationLinear Algebra Methods for Data Mining
Linear Algebra Methods for Data Mining Saara Hyvönen, Saara.Hyvonen@cs.helsinki.fi Spring 2007 1. Basic Linear Algebra Linear Algebra Methods for Data Mining, Spring 2007, University of Helsinki Example
More informationReview of Some Concepts from Linear Algebra: Part 2
Review of Some Concepts from Linear Algebra: Part 2 Department of Mathematics Boise State University January 16, 2019 Math 566 Linear Algebra Review: Part 2 January 16, 2019 1 / 22 Vector spaces A set
More informationBackground Mathematics (2/2) 1. David Barber
Background Mathematics (2/2) 1 David Barber University College London Modified by Samson Cheung (sccheung@ieee.org) 1 These slides accompany the book Bayesian Reasoning and Machine Learning. The book and
More informationLecture: Linear algebra. 4. Solutions of linear equation systems The fundamental theorem of linear algebra
Lecture: Linear algebra. 1. Subspaces. 2. Orthogonal complement. 3. The four fundamental subspaces 4. Solutions of linear equation systems The fundamental theorem of linear algebra 5. Determining the fundamental
More informationLinear vector spaces and subspaces.
Math 2051 W2008 Margo Kondratieva Week 1 Linear vector spaces and subspaces. Section 1.1 The notion of a linear vector space. For the purpose of these notes we regard (m 1)-matrices as m-dimensional vectors,
More information