Maths for Signals and Systems Linear Algebra in Engineering


1 Maths for Signals and Systems Linear Algebra in Engineering Lectures 13-15, Tuesday 8th and Friday 11th November 2016 DR TANIA STATHAKI READER (ASSOCIATE PROFESSOR) IN SIGNAL PROCESSING IMPERIAL COLLEGE LONDON

2 Positive definite matrices
A symmetric (or Hermitian) matrix is positive definite if and only if (iff) all its eigenvalues are positive (for symmetric or Hermitian matrices the eigenvalues are always real). Therefore, the pivots are positive and the determinant is positive. However, a positive determinant does not guarantee positive definiteness.
Example: Consider the matrix A = [5 2; 2 3].
The eigenvalues are obtained from: (5 − λ)(3 − λ) − 4 = 0, i.e., λ² − 8λ + 11 = 0, so λ1,2 = (8 ± √20)/2 = 4 ± √5.
The eigenvalues are positive and the matrix is symmetric, therefore the matrix is positive definite.
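The eigenvalue test in the example can be reproduced numerically. A minimal sketch using NumPy (not part of the lectures, assumed here for illustration):

```python
import numpy as np

# The symmetric matrix from the example.
A = np.array([[5.0, 2.0],
              [2.0, 3.0]])

# eigvalsh exploits symmetry and returns real eigenvalues, sorted ascending.
eigvals = np.linalg.eigvalsh(A)  # expect 4 - sqrt(5) and 4 + sqrt(5)

is_positive_definite = bool(np.all(eigvals > 0))
```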

3 Positive definite matrices cont.
We are talking about symmetric matrices. We have various equivalent tests for positive definiteness. Consider a symmetric matrix A = [a b; b c]. It is positive definite iff:
The eigenvalues are positive: λ1 > 0, λ2 > 0.
The pivots are positive: a > 0, (ac − b²)/a > 0.
All determinants of leading ("north-west") sub-matrices are positive: a > 0, ac − b² > 0.
x^T A x > 0 for every vector x ≠ 0, where x^T A x = [x1 x2][a b; b c][x1; x2] = ax1² + 2bx1x2 + cx2². This expression is called a Quadratic Form.
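The tests above must all agree on a positive definite matrix. A sketch using NumPy (assumed here for illustration; the quadratic-form test is only sampled at random vectors rather than proved for all x):

```python
import numpy as np

A = np.array([[5.0, 2.0],
              [2.0, 3.0]])  # the example matrix; assumed symmetric

# Test 1: all eigenvalues positive.
t1 = bool(np.all(np.linalg.eigvalsh(A) > 0))

# Test 2: leading ("north-west") minors positive.
t2 = A[0, 0] > 0 and np.linalg.det(A) > 0

# Test 3: x^T A x > 0 for a sample of random nonzero vectors.
rng = np.random.default_rng(0)
xs = rng.standard_normal((1000, 2))
t3 = bool(np.all(np.einsum('ij,jk,ik->i', xs, A, xs) > 0))
```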

4 Positive semi-definite matrices
Example: Consider the matrix [2 6; 6 x]. Which sufficiently large values of x make the matrix positive definite? The answer is x > 18. (The determinant is 2x − 36 > 0, so x > 18.)
If x = 18 we obtain the matrix [2 6; 6 18]. For x = 18 the matrix is positive semi-definite. The eigenvalues are λ1 = 0 and λ2 = 20; one of its eigenvalues is zero. It has only one pivot since the matrix is singular; the pivots are 2 and 0.
Its quadratic form is [x1 x2][2 6; 6 18][x1; x2] = 2x1² + 12x1x2 + 18x2² = 2(x1 + 3x2)², which is zero along the whole line x1 = −3x2. In that case the matrix marginally fails the test.
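The boundary case x = 18 can be checked directly. A NumPy sketch (illustrative, not from the lectures):

```python
import numpy as np

x = 18.0
A = np.array([[2.0, 6.0],
              [6.0, x]])

eigvals = np.linalg.eigvalsh(A)  # expect 0 and 20: semi-definite, not definite

# The quadratic form collapses to a single square, 2*(x1 + 3*x2)^2.
rng = np.random.default_rng(1)
v = rng.standard_normal(2)
q = v @ A @ v
sq = 2.0 * (v[0] + 3.0 * v[1]) ** 2
```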

5 Graph of quadratic form
In mathematics, a quadratic form is a homogeneous polynomial of degree two in a number of variables. For example, the function that appears in the positive-definiteness test of a 2 × 2 symmetric matrix, f(x1, x2) = ax1² + 2bx1x2 + cx2², is a quadratic form in the variables x1 and x2.
[The slide shows the surface of f(x1, x2) in two cases: positive definite, a bowl with a minimum at the origin, and not positive definite, a saddle.]
For the positive definite case the origin is a minimum. Obviously, the first derivatives must be zero at the minimum. This condition is not enough: the matrix of second derivatives [f_x1x1 f_x1x2; f_x2x1 f_x2x2] must be positive definite, i.e., f_x1x1 > 0 and f_x1x1 f_x2x2 − f_x1x2² > 0. "Positive" for a number turns into "positive definite" for a matrix.

6 Example 1
Example: A = [2 6; 6 20], trace(A) = 22 = λ1 + λ2, det(A) = 4 = λ1λ2.
[x1 x2][2 6; 6 20][x1; x2] = 2x1² + 12x1x2 + 20x2²
f(x1, x2) = 2x1² + 12x1x2 + 20x2² = 2(x1 + 3x2)² + 2x2². [The slide shows the bowl-shaped surface of f, with its minimum at the origin.]
A horizontal intersection could be f(x1, x2) = 1. It is an ellipse: 2(x1 + 3x2)² + 2x2² = 1.

7 Example 1 cont.
Example: A = [2 6; 6 20], trace(A) = 22 = λ1 + λ2, det(A) = 4 = λ1λ2.
f(x1, x2) = 2x1² + 12x1x2 + 20x2² = 2(x1 + 3x2)² + 2x2²
Note that completing the square is effectively elimination: A = [2 6; 6 20] = LU with U = [2 6; 0 2] and L = [1 0; 3 1].
The pivots and the multipliers appear in the quadratic form when we complete the square: in 2(x1 + 3x2)² + 2x2², the pivots 2 and 2 multiply the squares and 3 is the multiplier. Pivots are the coefficients of the squared terms, so positive pivots imply a sum of squares and hence positive definiteness.
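The link between elimination and completing the square can be checked numerically. A NumPy sketch (illustrative; one elimination step done by hand):

```python
import numpy as np

A = np.array([[2.0, 6.0],
              [6.0, 20.0]])

# One elimination step: subtract (multiplier) * row 1 from row 2.
m = A[1, 0] / A[0, 0]   # multiplier, 3
U = A.copy()
U[1] -= m * U[0]        # U = [[2, 6], [0, 2]]
pivots = np.diag(U)     # 2 and 2

# The same numbers complete the square in the quadratic form.
rng = np.random.default_rng(2)
x1, x2 = rng.standard_normal(2)
q = 2 * x1**2 + 12 * x1 * x2 + 20 * x2**2
sq = pivots[0] * (x1 + m * x2) ** 2 + pivots[1] * x2**2
```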

8 Example 2
Example: Consider the matrix A = [2 −1 0; −1 2 −1; 0 −1 2].
The leading ("north-west") determinants are 2, 3, 4. The pivots are 2, 3/2, 4/3.
The quadratic form is x^T A x = 2x1² + 2x2² + 2x3² − 2x1x2 − 2x2x3.
This can be written as: 2(x1 − (1/2)x2)² + (3/2)(x2 − (2/3)x3)² + (4/3)x3².
The eigenvalues of A are λ1 = 2 − √2, λ2 = 2, λ3 = 2 + √2.
The matrix A is positive definite since x^T A x > 0 for every x ≠ 0.
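The minors, pivots and eigenvalues quoted for this 3 × 3 example can all be verified. A NumPy sketch (illustrative; pivots computed as ratios of successive leading minors):

```python
import numpy as np

A = np.array([[ 2.0, -1.0,  0.0],
              [-1.0,  2.0, -1.0],
              [ 0.0, -1.0,  2.0]])

# Leading ("north-west") minors: expect 2, 3, 4.
minors = [np.linalg.det(A[:k, :k]) for k in (1, 2, 3)]

# Pivots are ratios of successive leading minors: expect 2, 3/2, 4/3.
pivots = [minors[0], minors[1] / minors[0], minors[2] / minors[1]]

eigvals = np.linalg.eigvalsh(A)  # expect 2 - sqrt(2), 2, 2 + sqrt(2)
```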

9 Positive definite matrices cont.
If a matrix A is positive definite, its inverse A⁻¹ is also positive definite. This comes from the fact that the eigenvalues of the inverse of a matrix are the inverses of the eigenvalues of the original matrix.
If matrices A and B are positive definite, then their sum is positive definite. This comes from the fact that x^T(A + B)x = x^T A x + x^T B x > 0. The same argument holds for positive semi-definiteness.
Consider a matrix A of size m × n, m ≠ n (rectangular, not square). In that case we are interested in the matrix A^T A, which is square. Is A^T A positive definite?

10 The case of A^T A and AA^T
Is A^T A positive definite? x^T A^T A x = (Ax)^T(Ax) = ‖Ax‖². In order to have ‖Ax‖² > 0 for every x ≠ 0, the null space of A must contain only the zero vector.
In case of A being a rectangular matrix of size m × n with m > n, the rank of A must be n.
In case of A being a rectangular matrix of size m × n with m < n, the null space of A cannot contain only the zero vector and therefore, A^T A is not positive definite.
Following the above analysis, it is straightforward to show that AA^T is positive definite if m < n and the rank of A is m.
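Both cases can be illustrated with random matrices, which have full rank with probability 1. A NumPy sketch (illustrative only):

```python
import numpy as np

rng = np.random.default_rng(3)

# Tall matrix (m > n), full column rank: A^T A is positive definite.
A = rng.standard_normal((5, 3))
tall_pd = bool(np.all(np.linalg.eigvalsh(A.T @ A) > 0))

# Wide matrix (m < n): A^T A is singular, hence only semi-definite.
B = rng.standard_normal((3, 5))
wide_pd = bool(np.all(np.linalg.eigvalsh(B.T @ B) > 1e-10))
```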

11 Similar matrices
Consider two square matrices A and B. Suppose that for some invertible matrix M the relationship B = M⁻¹AM holds. In that case we say that A and B are similar matrices.
Example: Consider a matrix A which has a full set of eigenvectors. In that case S⁻¹AS = Λ. Based on the above, A is similar to Λ.
Similar matrices have the same eigenvalues. Matrices with identical eigenvalues are not necessarily similar; there are different families of matrices with the same eigenvalues.
Consider a matrix A with eigenvalue λ and corresponding eigenvector x, and the matrix B = M⁻¹AM. We have:
Ax = λx ⟹ AMM⁻¹x = λx ⟹ M⁻¹AMM⁻¹x = λM⁻¹x ⟹ B(M⁻¹x) = λ(M⁻¹x)
Therefore, λ is also an eigenvalue of B, with corresponding eigenvector M⁻¹x.
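That similar matrices share eigenvalues is easy to demonstrate numerically. A NumPy sketch (illustrative; a random M is invertible with probability 1):

```python
import numpy as np

rng = np.random.default_rng(4)
A = rng.standard_normal((4, 4))
M = rng.standard_normal((4, 4))

B = np.linalg.inv(M) @ A @ M  # B is similar to A

# Compare the two spectra as sorted complex numbers.
ev_A = np.sort_complex(np.linalg.eigvals(A))
ev_B = np.sort_complex(np.linalg.eigvals(B))
```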

12 Matrices with identical eigenvalues, some repeated
Consider the families of matrices with repeated eigenvalues.
Example: Let's take the 2 × 2 matrices with eigenvalues λ1 = λ2 = 4. The two matrices [4 0; 0 4] = 4I and [4 1; 0 4] have eigenvalues 4, 4 but they belong to different families. There are two families of matrices with eigenvalues 4, 4.
The matrix 4I has no relatives; the only matrix similar to it is itself, since M⁻¹(4I)M = 4I for every invertible M.
The big family includes [4 1; 0 4] and any matrix of the form [4 a; 0 4], a ≠ 0. These matrices are not diagonalizable since they only have one independent eigenvector.

13 Singular Value Decomposition (SVD)
The so-called Singular Value Decomposition (SVD) is one of the main highlights of Linear Algebra.
Consider a matrix A of dimension m × n and rank r. I would like to diagonalize A. What I know so far is A = SΛS⁻¹. This diagonalization has the following weaknesses:
A has to be square.
There are not always enough eigenvectors. For example, consider the matrix [1 a; 0 1], a ≠ 0. It only has the eigenvector [x 0]^T.
Goal: I am looking for a type of decomposition which can be applied to any matrix.

14 Singular Value Decomposition (SVD) cont.
I am looking for a matrix factorization of the form A = UΣV^T, where A is any real matrix of dimension m × n and furthermore:
U is a unitary matrix (U^T U = I) with columns u_i, of dimension m × m.
Σ is an m × n rectangular matrix with non-negative real entries only along the main diagonal. The main diagonal is defined by the elements σ_ij with i = j.
V is a unitary matrix (V^T V = I) with columns v_i, of dimension n × n. U is, in general, different from V.
The above type of decomposition is called the Singular Value Decomposition. The non-zero elements of Σ are the so-called Singular Values of the matrix A. They are chosen to be positive.
When A is a square invertible matrix then A = SΛS⁻¹. When A is a symmetric matrix, the eigenvectors in S can be chosen orthonormal, so A = QΛQ^T. Therefore, for symmetric matrices with non-negative eigenvalues the SVD is effectively an eigenvector decomposition: U = Q = V and Σ = Λ.
For complex matrices, the transpose must be replaced with the conjugate transpose.
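The factorization and the stated properties of U, Σ and V can be checked with NumPy's built-in SVD. A minimal sketch (illustrative, not part of the lectures):

```python
import numpy as np

rng = np.random.default_rng(5)
A = rng.standard_normal((4, 3))

# full_matrices=True returns U (4x4), the singular values, and V^T (3x3).
U, s, Vt = np.linalg.svd(A, full_matrices=True)

# Rebuild the rectangular Sigma and check A = U Sigma V^T.
Sigma = np.zeros((4, 3))
Sigma[:3, :3] = np.diag(s)
reconstructed = U @ Sigma @ Vt
```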

15 Singular Value Decomposition (SVD) cont.
From A = UΣV^T, the following relationship holds: AV = UΣ.
Do not forget that U and V are assumed to be unitary matrices and therefore, U^T U = UU^T = V^T V = VV^T = I.
If I manage to write A = UΣV^T, the matrix A^T A is decomposed as: A^T A = VΣ^T U^T UΣV^T = VΣ^T ΣV^T.
In the above expression Σ^T Σ is a matrix of dimension (n × m)(m × n) = n × n (square matrix). From the form of the original Σ, you can easily deduce that Σ^T Σ is a diagonal matrix. It has r non-zero elements across the diagonal; these are the squares of the singular values of A, which are located along the main diagonal of the rectangular matrix Σ. Σ^T Σ = Σ² if the original matrix A is a square matrix. Please note the difference between the diagonal (square matrices) and the main diagonal (rectangular matrices).
Therefore, the above expression is the eigenvector decomposition of A^T A: A^T A = V(Σ^T Σ)V^T.

16 Singular Value Decomposition (SVD) cont.
Similarly, the eigenvector decomposition of AA^T is: AA^T = UΣV^T VΣ^T U^T = UΣΣ^T U^T.
In the above expression ΣΣ^T is a matrix of dimension (m × n)(n × m) = m × m. Similarly to Σ^T Σ, it is a square matrix with r non-zero elements across the diagonal.
Based on the properties stated in previous slides, the number and values of the non-zero elements of the matrices ΣΣ^T and Σ^T Σ are identical. Note that these two matrices have different dimensions if m ≠ n. In that case the bigger one has at least one zero element on its diagonal, since both have rank r with r ≤ min(m, n).
From this and the previous slides, we deduce that we can determine all the factors of the SVD from the eigenvector decompositions of the matrices A^T A and AA^T.

17 Useful properties
Let A be an m × n matrix and let B be an n × m matrix with n ≥ m. Then the n eigenvalues of BA are the m eigenvalues of AB together with n − m extra zero eigenvalues. Therefore, the non-zero eigenvalues of AB and BA are identical. Therefore:
Let A be an m × n matrix with n ≥ m. Then the n eigenvalues of A^T A are the m eigenvalues of AA^T together with n − m extra zero eigenvalues. Similar comments hold for n ≤ m.
The matrices A, A^T A and AA^T have the same rank.
Let A be an m × n matrix with n ≥ m and rank r. The matrix A has r singular values. Both A^T A and AA^T have r non-zero eigenvalues, which are the squares of the singular values of A. Furthermore:
A^T A is of dimension n × n. It has r eigenvectors v1 … vr associated with its r non-zero eigenvalues and n − r eigenvectors associated with its n − r zero eigenvalues.
AA^T is of dimension m × m. It has r eigenvectors u1 … ur associated with its r non-zero eigenvalues and m − r eigenvectors associated with its m − r zero eigenvalues.

18 Singular Value Decomposition (SVD) cont.
I can write V = [v1 … vr vr+1 … vn] and U = [u1 … ur ur+1 … um]. The matrices U and V have already been defined previously.
Note that in the above matrices, I put first the columns which are the eigenvectors of A^T A and AA^T corresponding to non-zero eigenvalues.
To take the above even further, I order the eigenvectors according to the magnitude of the associated eigenvalue: the eigenvector that corresponds to the maximum eigenvalue is placed in the first column, and so on. This ordering is very helpful in various real-life applications.

19 Singular Value Decomposition (SVD) cont.
As already shown, from A = UΣV^T we obtain AV = UΣ, or A[v1 … vr vr+1 … vn] = [u1 … ur ur+1 … um]Σ.
Therefore, we can break AV = UΣ into a set of relationships of the form Av_i = σ_i u_i. Note that σ_i is a scalar while v_i and u_i are vectors. For i ≤ r, the relationship AV = UΣ tells us that:
The vectors v1, v2, …, vr are in the row space of A. This is because from AV = UΣ we have U^T AVV^T = U^T UΣV^T ⟹ U^T A = ΣV^T ⟹ v_i^T = (1/σ_i)u_i^T A, σ_i ≠ 0. Furthermore, since the v_i's associated with σ_i ≠ 0 are orthonormal, they form a basis of the row space.
The vectors u1, u2, …, ur are in the column space of A. This observation comes directly from u_i = (1/σ_i)Av_i, σ_i ≠ 0, i.e., the u_i's are linear combinations of the columns of A. Furthermore, the u_i's associated with σ_i ≠ 0 are orthonormal. Thus, they form a basis of the column space.
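The column-by-column relationship Av_i = σ_i u_i can be verified directly on a computed SVD. A NumPy sketch (illustrative):

```python
import numpy as np

rng = np.random.default_rng(7)
A = rng.standard_normal((3, 5))

U, s, Vt = np.linalg.svd(A)
V = Vt.T

# For each singular value, A v_i should equal sigma_i u_i.
checks = [np.allclose(A @ V[:, i], s[i] * U[:, i]) for i in range(len(s))]
```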

20 Singular Value Decomposition (SVD) cont.
Based on the facts that:
Av_i = σ_i u_i,
the v_i's form an orthonormal basis of the row space of A,
the u_i's form an orthonormal basis of the column space of A,
we conclude that: with the SVD, an orthonormal basis of the row space, given by the columns of V, is mapped by the matrix A to an orthonormal basis of the column space, given by the columns of U (scaled by the singular values). This comes from AV = UΣ.
The n − r additional v's, which correspond to the zero eigenvalues of the matrix A^T A, are taken from the null space of A. The m − r additional u's, which correspond to the zero eigenvalues of the matrix AA^T, are taken from the left null space of A.

21 Examples of different Σ matrices
We managed to find an orthonormal basis (V) of the row space and an orthonormal basis (U) of the column space that diagonalize the matrix A to Σ. In general, the basis in V is different from the basis in U. The SVD is written as: A[v1 … vr vr+1 … vn] = [u1 … ur ur+1 … um]Σ.
The form of the matrix Σ depends on the dimensions m, n, r. It is of dimension m × n. Its elements are chosen as:
Σ_ij = σ_i if i = j and 1 ≤ i, j ≤ r; 0 otherwise.
The σ_i² are the non-zero eigenvalues of A^T A (or AA^T); the σ_i are the non-zero singular values of A.

22 Examples of different Σ matrices cont.
Example: m = n = r = 3.
Σ = [σ1 0 0; 0 σ2 0; 0 0 σ3]
Example: m = 4, n = 3, r = 2.
Σ = [σ1 0 0; 0 σ2 0; 0 0 0; 0 0 0]

23 Truncated or Reduced Singular Value Decomposition
In the expression for the SVD we can reduce the dimensions of all matrices involved by ignoring the eigenvectors which correspond to zero eigenvalues. In that case we have:
A[v1 … vr] = [u1 … ur] diag(σ1, …, σr), i.e., A = u1σ1v1^T + … + urσrvr^T
where:
The dimension of A is m × n.
The dimension of [v1 … vr] is n × r.
The dimension of [u1 … ur] is m × r.
The dimension of Σ is r × r.
The above formulation is called the Truncated or Reduced Singular Value Decomposition. As seen, the truncated SVD gives the splitting of A into a sum of r matrices, each of rank 1. In the case of a square, invertible matrix (m = n = r), the two decompositions are identical.
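The splitting into r rank-one pieces can be checked numerically. A NumPy sketch (illustrative; `full_matrices=False` gives the reduced factors directly):

```python
import numpy as np

rng = np.random.default_rng(8)
A = rng.standard_normal((4, 3))

U, s, Vt = np.linalg.svd(A, full_matrices=False)  # reduced form
r = len(s)

# Sum of r rank-one matrices u_i sigma_i v_i^T rebuilds A.
rank_one_sum = sum(s[i] * np.outer(U[:, i], Vt[i]) for i in range(r))
```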

24 Singular Value Decomposition. Example 1.
Example: A = [4 4; −3 3] and A^T A = [4 −3; 4 3][4 4; −3 3] = [25 7; 7 25].
The eigenvalues of A^T A are σ1² = 32 and σ2² = 18. The eigenvectors of A^T A are v1 = (1/√2)[1; 1] and v2 = (1/√2)[1; −1], and A^T A = VΣ²V^T.
Similarly, AA^T = [4 4; −3 3][4 −3; 4 3] = [32 0; 0 18]. Therefore, the eigenvectors of AA^T appear to be u1 = [1; 0] and u2 = [0; 1], and AA^T = UΣ²U^T.
CAREFUL: the u_i's are chosen to satisfy the relationship u_i = (1/σ_i)Av_i, i = 1, 2. Here Av2 = (1/√2)[0; −6], so we must take u2 = [0; −1] rather than [0; 1].
Therefore, the SVD of A = [4 4; −3 3] is:
A = UΣV^T = [1 0; 0 −1][√32 0; 0 √18] · (1/√2)[1 1; 1 −1]
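The factors worked out in this example can be multiplied back together as a check. A NumPy sketch (illustrative):

```python
import numpy as np

A = np.array([[ 4.0, 4.0],
              [-3.0, 3.0]])

# Factors from the example, including the sign-corrected u2.
U = np.array([[1.0,  0.0],
              [0.0, -1.0]])
Sigma = np.diag([np.sqrt(32.0), np.sqrt(18.0)])
V = (1.0 / np.sqrt(2.0)) * np.array([[1.0,  1.0],
                                     [1.0, -1.0]])

reconstructed = U @ Sigma @ V.T
```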

25 Singular Value Decomposition. Example 2.
Example: A = [4 3; 8 6] (singular, rank 1) and A^T A = [4 8; 3 6][4 3; 8 6] = [80 60; 60 45].
The eigenvalues of A^T A are σ1² = 125 and σ2² = 0. The eigenvectors of A^T A are v1 = [4/5; 3/5] and v2 = [3/5; −4/5], and A^T A = VΣ²V^T.
Similarly, AA^T = [4 3; 8 6][4 8; 3 6] = [25 50; 50 100].
u1 is chosen to satisfy the relationship u1 = (1/σ1)Av1 = (1/√125)[5; 10] = [1/√5; 2/√5].
u2 is chosen to be perpendicular to u1: u2 = [2/√5; −1/√5]. Note that the presence of u2 and v2 does not affect the calculations, since their elements are multiplied by zeros.
Therefore, the SVD of A = [4 3; 8 6] is:
A = UΣV^T = [1/√5 2/√5; 2/√5 −1/√5][√125 0; 0 0][4/5 3/5; 3/5 −4/5]

26 Singular Value Decomposition. Example 2 cont.
The SVD of A = [4 3; 8 6] is:
A = UΣV^T = [1/√5 2/√5; 2/√5 −1/√5][√125 0; 0 0][4/5 3/5; 3/5 −4/5]
The truncated SVD is:
A = u1σ1v1^T = [1/√5; 2/√5] · √125 · [4/5 3/5] = [4 3; 8 6]

27 Singular Value Decomposition. Example 3.
Example: A = [1 1 0; 0 1 1]. We see that r = 2.
A^T A = [1 0; 1 1; 0 1][1 1 0; 0 1 1] = [1 1 0; 1 2 1; 0 1 1]
The eigenvalues of A^T A are σ1² = 3, σ2² = 1 and σ3² = 0 (obviously, since r = 2).
The eigenvectors of A^T A are v1 = (1/√6)[1; 2; 1], v2 = (1/√2)[1; 0; −1] and v3 = (1/√3)[1; −1; 1].
Similarly, AA^T = [2 1; 1 2].
u1 is chosen to satisfy the relationship u1 = (1/σ1)Av1 = (1/√2)[1; 1].
u2 is chosen to satisfy the relationship u2 = (1/σ2)Av2 = (1/√2)[1; −1]. Note that the presence of v3 does not affect the calculations, since its elements are multiplied by zeros.

28 Singular Value Decomposition. Example 3 cont.
Therefore, the SVD of A = [1 1 0; 0 1 1] is:
A = [1/√2 1/√2; 1/√2 −1/√2][√3 0 0; 0 1 0][1/√6 2/√6 1/√6; 1/√2 0 −1/√2; 1/√3 −1/√3 1/√3]
The truncated SVD for this example is:
A = [1/√2 1/√2; 1/√2 −1/√2][√3 0; 0 1][1/√6 2/√6 1/√6; 1/√2 0 −1/√2]
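This rectangular example, with its full and truncated factors, can also be multiplied back together as a check. A NumPy sketch (illustrative):

```python
import numpy as np

A = np.array([[1.0, 1.0, 0.0],
              [0.0, 1.0, 1.0]])

s2, s3, s6 = np.sqrt(2.0), np.sqrt(3.0), np.sqrt(6.0)
U = np.array([[1/s2,  1/s2],
              [1/s2, -1/s2]])
Sigma = np.array([[s3, 0.0, 0.0],
                  [0.0, 1.0, 0.0]])
Vt = np.array([[1/s6,  2/s6,  1/s6],
               [1/s2,  0.0,  -1/s2],
               [1/s3, -1/s3,  1/s3]])

full = U @ Sigma @ Vt                         # full SVD
truncated = U @ np.diag([s3, 1.0]) @ Vt[:2]   # drop the zero singular value
```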

29 Pseudoinverse
Suppose that A is a matrix of dimension m × n and rank r. The SVD of the matrix A is given by: A = UΣV^T.
I define a matrix Σ⁺ of dimension n × m as follows:
Σ⁺_ij = 1/σ_i if i = j and 1 ≤ i, j ≤ r; 0 otherwise.
The matrix A⁺ = VΣ⁺U^T is called the Pseudoinverse of the matrix A, or the Moore-Penrose inverse.
A⁺A = VΣ⁺U^T UΣV^T = VΣ⁺ΣV^T. The matrix Σ⁺Σ is of dimension n × n (square) and has rank r. It is given by:
(Σ⁺Σ)_ij = 1 if i = j and 1 ≤ i, j ≤ r; 0 otherwise.
AA⁺ = UΣV^T VΣ⁺U^T = UΣΣ⁺U^T. The matrix ΣΣ⁺ is of dimension m × m and has rank r. It is defined analogously to Σ⁺Σ.
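The pseudoinverse built from the SVD, as defined on this slide, agrees with NumPy's built-in `np.linalg.pinv`. A sketch (illustrative; the random matrix has full column rank with probability 1):

```python
import numpy as np

rng = np.random.default_rng(9)
A = rng.standard_normal((4, 3))

# Build A+ = V Sigma+ U^T from the SVD.
U, s, Vt = np.linalg.svd(A, full_matrices=True)
Sigma_plus = np.zeros((3, 4))
Sigma_plus[:3, :3] = np.diag(1.0 / s)
A_plus = Vt.T @ Sigma_plus @ U.T
```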

30 Pseudoinverse cont.
Note that Σ⁺Σ and ΣΣ⁺ have different dimensions. They look like identity matrices in which the last (n − r) or (m − r) diagonal elements have been replaced by zeros.
If m ≥ n and the rank of A is n, then Σ⁺Σ = I (n × n). In that case A⁺A = VΣ⁺U^T UΣV^T = VΣ⁺ΣV^T = VV^T = I (n × n). Therefore, A⁺ is a left inverse of A.
If m ≤ n and the rank of A is m, then ΣΣ⁺ = I (m × m). In that case AA⁺ = I (m × m). Therefore, A⁺ is a right inverse of A.

31 Pseudoinverse cont.
As already proved, the relationship Av_i = σ_i u_i, which comes directly from AV = UΣ, maps a vector from the row space to the column space.
Similarly, from A⁺ = VΣ⁺U^T we get A⁺U = VΣ⁺ and therefore, A⁺u_i = (1/σ_i)v_i.
Therefore, multiplying a vector from the column space by the pseudoinverse A⁺ gives a vector in the row space.

32 Other types of matrix inverses
Consider a matrix A of dimension m × n and rank r. The following cases hold:
r = m = n. In that case AA⁻¹ = I = A⁻¹A. The matrix A has a two-sided inverse, or simply an inverse.
r = n < m (more rows than columns). The matrix has full column rank (independent columns). The null space contains only the zero vector. There are 0 or 1 solutions to Ax = b. In that case A^T A, of dimension n × n, is invertible, and A has a left inverse only: A_left = (A^T A)⁻¹A^T, since (A^T A)⁻¹A^T A = I (n × n).
r = m < n (more columns than rows). The matrix has full row rank (independent rows). There are n − m free variables. The left null space contains only the zero vector. There are infinitely many solutions to Ax = b. In that case AA^T, of dimension m × m, is invertible, and A has a right inverse only: A_right = A^T(AA^T)⁻¹, since AA^T(AA^T)⁻¹ = I (m × m).
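The left and right inverse formulas above can be checked on random full-rank matrices. A NumPy sketch (illustrative; random matrices have full rank with probability 1):

```python
import numpy as np

rng = np.random.default_rng(10)

# Tall matrix, full column rank: left inverse (A^T A)^-1 A^T.
A = rng.standard_normal((5, 3))
A_left = np.linalg.inv(A.T @ A) @ A.T

# Wide matrix, full row rank: right inverse A^T (A A^T)^-1.
B = rng.standard_normal((3, 5))
B_right = B.T @ np.linalg.inv(B @ B.T)
```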


More information

Knowledge Discovery and Data Mining 1 (VO) ( )

Knowledge Discovery and Data Mining 1 (VO) ( ) Knowledge Discovery and Data Mining 1 (VO) (707.003) Review of Linear Algebra Denis Helic KTI, TU Graz Oct 9, 2014 Denis Helic (KTI, TU Graz) KDDM1 Oct 9, 2014 1 / 74 Big picture: KDDM Probability Theory

More information

Eigenvalues and Eigenvectors

Eigenvalues and Eigenvectors /88 Chia-Ping Chen Department of Computer Science and Engineering National Sun Yat-sen University Linear Algebra Eigenvalue Problem /88 Eigenvalue Equation By definition, the eigenvalue equation for matrix

More information

forms Christopher Engström November 14, 2014 MAA704: Matrix factorization and canonical forms Matrix properties Matrix factorization Canonical forms

forms Christopher Engström November 14, 2014 MAA704: Matrix factorization and canonical forms Matrix properties Matrix factorization Canonical forms Christopher Engström November 14, 2014 Hermitian LU QR echelon Contents of todays lecture Some interesting / useful / important of matrices Hermitian LU QR echelon Rewriting a as a product of several matrices.

More information

Properties of Matrices and Operations on Matrices

Properties of Matrices and Operations on Matrices Properties of Matrices and Operations on Matrices A common data structure for statistical analysis is a rectangular array or matris. Rows represent individual observational units, or just observations,

More information

APPLICATIONS The eigenvalues are λ = 5, 5. An orthonormal basis of eigenvectors consists of

APPLICATIONS The eigenvalues are λ = 5, 5. An orthonormal basis of eigenvectors consists of CHAPTER III APPLICATIONS The eigenvalues are λ =, An orthonormal basis of eigenvectors consists of, The eigenvalues are λ =, A basis of eigenvectors consists of, 4 which are not perpendicular However,

More information

LinGloss. A glossary of linear algebra

LinGloss. A glossary of linear algebra LinGloss A glossary of linear algebra Contents: Decompositions Types of Matrices Theorems Other objects? Quasi-triangular A matrix A is quasi-triangular iff it is a triangular matrix except its diagonal

More information

MATH 583A REVIEW SESSION #1

MATH 583A REVIEW SESSION #1 MATH 583A REVIEW SESSION #1 BOJAN DURICKOVIC 1. Vector Spaces Very quick review of the basic linear algebra concepts (see any linear algebra textbook): (finite dimensional) vector space (or linear space),

More information

Computational Methods CMSC/AMSC/MAPL 460. EigenValue decomposition Singular Value Decomposition. Ramani Duraiswami, Dept. of Computer Science

Computational Methods CMSC/AMSC/MAPL 460. EigenValue decomposition Singular Value Decomposition. Ramani Duraiswami, Dept. of Computer Science Computational Methods CMSC/AMSC/MAPL 460 EigenValue decomposition Singular Value Decomposition Ramani Duraiswami, Dept. of Computer Science Hermitian Matrices A square matrix for which A = A H is said

More information

(a) If A is a 3 by 4 matrix, what does this tell us about its nullspace? Solution: dim N(A) 1, since rank(a) 3. Ax =

(a) If A is a 3 by 4 matrix, what does this tell us about its nullspace? Solution: dim N(A) 1, since rank(a) 3. Ax = . (5 points) (a) If A is a 3 by 4 matrix, what does this tell us about its nullspace? dim N(A), since rank(a) 3. (b) If we also know that Ax = has no solution, what do we know about the rank of A? C(A)

More information

Announcements Wednesday, November 01

Announcements Wednesday, November 01 Announcements Wednesday, November 01 WeBWorK 3.1, 3.2 are due today at 11:59pm. The quiz on Friday covers 3.1, 3.2. My office is Skiles 244. Rabinoffice hours are Monday, 1 3pm and Tuesday, 9 11am. Section

More information

Math Linear Algebra Final Exam Review Sheet

Math Linear Algebra Final Exam Review Sheet Math 15-1 Linear Algebra Final Exam Review Sheet Vector Operations Vector addition is a component-wise operation. Two vectors v and w may be added together as long as they contain the same number n of

More information

Lecture notes on Quantum Computing. Chapter 1 Mathematical Background

Lecture notes on Quantum Computing. Chapter 1 Mathematical Background Lecture notes on Quantum Computing Chapter 1 Mathematical Background Vector states of a quantum system with n physical states are represented by unique vectors in C n, the set of n 1 column vectors 1 For

More information

THE SINGULAR VALUE DECOMPOSITION MARKUS GRASMAIR

THE SINGULAR VALUE DECOMPOSITION MARKUS GRASMAIR THE SINGULAR VALUE DECOMPOSITION MARKUS GRASMAIR 1. Definition Existence Theorem 1. Assume that A R m n. Then there exist orthogonal matrices U R m m V R n n, values σ 1 σ 2... σ p 0 with p = min{m, n},

More information

Math 520 Exam 2 Topic Outline Sections 1 3 (Xiao/Dumas/Liaw) Spring 2008

Math 520 Exam 2 Topic Outline Sections 1 3 (Xiao/Dumas/Liaw) Spring 2008 Math 520 Exam 2 Topic Outline Sections 1 3 (Xiao/Dumas/Liaw) Spring 2008 Exam 2 will be held on Tuesday, April 8, 7-8pm in 117 MacMillan What will be covered The exam will cover material from the lectures

More information

Linear Algebra Review. Fei-Fei Li

Linear Algebra Review. Fei-Fei Li Linear Algebra Review Fei-Fei Li 1 / 37 Vectors Vectors and matrices are just collections of ordered numbers that represent something: movements in space, scaling factors, pixel brightnesses, etc. A vector

More information

Elementary maths for GMT

Elementary maths for GMT Elementary maths for GMT Linear Algebra Part 2: Matrices, Elimination and Determinant m n matrices The system of m linear equations in n variables x 1, x 2,, x n a 11 x 1 + a 12 x 2 + + a 1n x n = b 1

More information

B553 Lecture 5: Matrix Algebra Review

B553 Lecture 5: Matrix Algebra Review B553 Lecture 5: Matrix Algebra Review Kris Hauser January 19, 2012 We have seen in prior lectures how vectors represent points in R n and gradients of functions. Matrices represent linear transformations

More information

I. Multiple Choice Questions (Answer any eight)

I. Multiple Choice Questions (Answer any eight) Name of the student : Roll No : CS65: Linear Algebra and Random Processes Exam - Course Instructor : Prashanth L.A. Date : Sep-24, 27 Duration : 5 minutes INSTRUCTIONS: The test will be evaluated ONLY

More information

Linear Algebra, part 3 QR and SVD

Linear Algebra, part 3 QR and SVD Linear Algebra, part 3 QR and SVD Anna-Karin Tornberg Mathematical Models, Analysis and Simulation Fall semester, 2012 Going back to least squares (Section 1.4 from Strang, now also see section 5.2). We

More information

Chap 3. Linear Algebra

Chap 3. Linear Algebra Chap 3. Linear Algebra Outlines 1. Introduction 2. Basis, Representation, and Orthonormalization 3. Linear Algebraic Equations 4. Similarity Transformation 5. Diagonal Form and Jordan Form 6. Functions

More information

Linear Algebra & Geometry why is linear algebra useful in computer vision?

Linear Algebra & Geometry why is linear algebra useful in computer vision? Linear Algebra & Geometry why is linear algebra useful in computer vision? References: -Any book on linear algebra! -[HZ] chapters 2, 4 Some of the slides in this lecture are courtesy to Prof. Octavia

More information

Linear Algebra review Powers of a diagonalizable matrix Spectral decomposition

Linear Algebra review Powers of a diagonalizable matrix Spectral decomposition Linear Algebra review Powers of a diagonalizable matrix Spectral decomposition Prof. Tesler Math 283 Fall 2018 Also see the separate version of this with Matlab and R commands. Prof. Tesler Diagonalizing

More information

Linear Algebra Review. Vectors

Linear Algebra Review. Vectors Linear Algebra Review 9/4/7 Linear Algebra Review By Tim K. Marks UCSD Borrows heavily from: Jana Kosecka http://cs.gmu.edu/~kosecka/cs682.html Virginia de Sa (UCSD) Cogsci 8F Linear Algebra review Vectors

More information

Assignment #10: Diagonalization of Symmetric Matrices, Quadratic Forms, Optimization, Singular Value Decomposition. Name:

Assignment #10: Diagonalization of Symmetric Matrices, Quadratic Forms, Optimization, Singular Value Decomposition. Name: Assignment #10: Diagonalization of Symmetric Matrices, Quadratic Forms, Optimization, Singular Value Decomposition Due date: Friday, May 4, 2018 (1:35pm) Name: Section Number Assignment #10: Diagonalization

More information

Mobile Robotics 1. A Compact Course on Linear Algebra. Giorgio Grisetti

Mobile Robotics 1. A Compact Course on Linear Algebra. Giorgio Grisetti Mobile Robotics 1 A Compact Course on Linear Algebra Giorgio Grisetti SA-1 Vectors Arrays of numbers They represent a point in a n dimensional space 2 Vectors: Scalar Product Scalar-Vector Product Changes

More information

Singular Value Decomposition

Singular Value Decomposition Chapter 6 Singular Value Decomposition In Chapter 5, we derived a number of algorithms for computing the eigenvalues and eigenvectors of matrices A R n n. Having developed this machinery, we complete our

More information

Symmetric and anti symmetric matrices

Symmetric and anti symmetric matrices Symmetric and anti symmetric matrices In linear algebra, a symmetric matrix is a square matrix that is equal to its transpose. Formally, matrix A is symmetric if. A = A Because equal matrices have equal

More information

2. Every linear system with the same number of equations as unknowns has a unique solution.

2. Every linear system with the same number of equations as unknowns has a unique solution. 1. For matrices A, B, C, A + B = A + C if and only if A = B. 2. Every linear system with the same number of equations as unknowns has a unique solution. 3. Every linear system with the same number of equations

More information

8. Diagonalization.

8. Diagonalization. 8. Diagonalization 8.1. Matrix Representations of Linear Transformations Matrix of A Linear Operator with Respect to A Basis We know that every linear transformation T: R n R m has an associated standard

More information

MATH36001 Generalized Inverses and the SVD 2015

MATH36001 Generalized Inverses and the SVD 2015 MATH36001 Generalized Inverses and the SVD 201 1 Generalized Inverses of Matrices A matrix has an inverse only if it is square and nonsingular. However there are theoretical and practical applications

More information

7. Symmetric Matrices and Quadratic Forms

7. Symmetric Matrices and Quadratic Forms Linear Algebra 7. Symmetric Matrices and Quadratic Forms CSIE NCU 1 7. Symmetric Matrices and Quadratic Forms 7.1 Diagonalization of symmetric matrices 2 7.2 Quadratic forms.. 9 7.4 The singular value

More information

AMS526: Numerical Analysis I (Numerical Linear Algebra)

AMS526: Numerical Analysis I (Numerical Linear Algebra) AMS526: Numerical Analysis I (Numerical Linear Algebra) Lecture 16: Eigenvalue Problems; Similarity Transformations Xiangmin Jiao Stony Brook University Xiangmin Jiao Numerical Analysis I 1 / 18 Eigenvalue

More information

Linear Algebra Review. Fei-Fei Li

Linear Algebra Review. Fei-Fei Li Linear Algebra Review Fei-Fei Li 1 / 51 Vectors Vectors and matrices are just collections of ordered numbers that represent something: movements in space, scaling factors, pixel brightnesses, etc. A vector

More information

MATH 315 Linear Algebra Homework #1 Assigned: August 20, 2018

MATH 315 Linear Algebra Homework #1 Assigned: August 20, 2018 Homework #1 Assigned: August 20, 2018 Review the following subjects involving systems of equations and matrices from Calculus II. Linear systems of equations Converting systems to matrix form Pivot entry

More information

Linear Algebra, part 3. Going back to least squares. Mathematical Models, Analysis and Simulation = 0. a T 1 e. a T n e. Anna-Karin Tornberg

Linear Algebra, part 3. Going back to least squares. Mathematical Models, Analysis and Simulation = 0. a T 1 e. a T n e. Anna-Karin Tornberg Linear Algebra, part 3 Anna-Karin Tornberg Mathematical Models, Analysis and Simulation Fall semester, 2010 Going back to least squares (Sections 1.7 and 2.3 from Strang). We know from before: The vector

More information

Math Bootcamp An p-dimensional vector is p numbers put together. Written as. x 1 x =. x p

Math Bootcamp An p-dimensional vector is p numbers put together. Written as. x 1 x =. x p Math Bootcamp 2012 1 Review of matrix algebra 1.1 Vectors and rules of operations An p-dimensional vector is p numbers put together. Written as x 1 x =. x p. When p = 1, this represents a point in the

More information

Homework For each of the following matrices, find the minimal polynomial and determine whether the matrix is diagonalizable.

Homework For each of the following matrices, find the minimal polynomial and determine whether the matrix is diagonalizable. Math 5327 Fall 2018 Homework 7 1. For each of the following matrices, find the minimal polynomial and determine whether the matrix is diagonalizable. 3 1 0 (a) A = 1 2 0 1 1 0 x 3 1 0 Solution: 1 x 2 0

More information

33AH, WINTER 2018: STUDY GUIDE FOR FINAL EXAM

33AH, WINTER 2018: STUDY GUIDE FOR FINAL EXAM 33AH, WINTER 2018: STUDY GUIDE FOR FINAL EXAM (UPDATED MARCH 17, 2018) The final exam will be cumulative, with a bit more weight on more recent material. This outline covers the what we ve done since the

More information

Eigenvalues and diagonalization

Eigenvalues and diagonalization Eigenvalues and diagonalization Patrick Breheny November 15 Patrick Breheny BST 764: Applied Statistical Modeling 1/20 Introduction The next topic in our course, principal components analysis, revolves

More information

Fall TMA4145 Linear Methods. Exercise set Given the matrix 1 2

Fall TMA4145 Linear Methods. Exercise set Given the matrix 1 2 Norwegian University of Science and Technology Department of Mathematical Sciences TMA445 Linear Methods Fall 07 Exercise set Please justify your answers! The most important part is how you arrive at an

More information

[Disclaimer: This is not a complete list of everything you need to know, just some of the topics that gave people difficulty.]

[Disclaimer: This is not a complete list of everything you need to know, just some of the topics that gave people difficulty.] Math 43 Review Notes [Disclaimer: This is not a complete list of everything you need to know, just some of the topics that gave people difficulty Dot Product If v (v, v, v 3 and w (w, w, w 3, then the

More information

The Singular Value Decomposition

The Singular Value Decomposition The Singular Value Decomposition An Important topic in NLA Radu Tiberiu Trîmbiţaş Babeş-Bolyai University February 23, 2009 Radu Tiberiu Trîmbiţaş ( Babeş-Bolyai University)The Singular Value Decomposition

More information

Math 315: Linear Algebra Solutions to Assignment 7

Math 315: Linear Algebra Solutions to Assignment 7 Math 5: Linear Algebra s to Assignment 7 # Find the eigenvalues of the following matrices. (a.) 4 0 0 0 (b.) 0 0 9 5 4. (a.) The characteristic polynomial det(λi A) = (λ )(λ )(λ ), so the eigenvalues are

More information

Computational Methods CMSC/AMSC/MAPL 460. Eigenvalues and Eigenvectors. Ramani Duraiswami, Dept. of Computer Science

Computational Methods CMSC/AMSC/MAPL 460. Eigenvalues and Eigenvectors. Ramani Duraiswami, Dept. of Computer Science Computational Methods CMSC/AMSC/MAPL 460 Eigenvalues and Eigenvectors Ramani Duraiswami, Dept. of Computer Science Eigen Values of a Matrix Recap: A N N matrix A has an eigenvector x (non-zero) with corresponding

More information

Glossary of Linear Algebra Terms. Prepared by Vince Zaccone For Campus Learning Assistance Services at UCSB

Glossary of Linear Algebra Terms. Prepared by Vince Zaccone For Campus Learning Assistance Services at UCSB Glossary of Linear Algebra Terms Basis (for a subspace) A linearly independent set of vectors that spans the space Basic Variable A variable in a linear system that corresponds to a pivot column in the

More information

5.) For each of the given sets of vectors, determine whether or not the set spans R 3. Give reasons for your answers.

5.) For each of the given sets of vectors, determine whether or not the set spans R 3. Give reasons for your answers. Linear Algebra - Test File - Spring Test # For problems - consider the following system of equations. x + y - z = x + y + 4z = x + y + 6z =.) Solve the system without using your calculator..) Find the

More information

HOMEWORK PROBLEMS FROM STRANG S LINEAR ALGEBRA AND ITS APPLICATIONS (4TH EDITION)

HOMEWORK PROBLEMS FROM STRANG S LINEAR ALGEBRA AND ITS APPLICATIONS (4TH EDITION) HOMEWORK PROBLEMS FROM STRANG S LINEAR ALGEBRA AND ITS APPLICATIONS (4TH EDITION) PROFESSOR STEVEN MILLER: BROWN UNIVERSITY: SPRING 2007 1. CHAPTER 1: MATRICES AND GAUSSIAN ELIMINATION Page 9, # 3: Describe

More information

Singular Value Decomposition

Singular Value Decomposition Singular Value Decomposition CS 205A: Mathematical Methods for Robotics, Vision, and Graphics Doug James (and Justin Solomon) CS 205A: Mathematical Methods Singular Value Decomposition 1 / 35 Understanding

More information

MATH 1120 (LINEAR ALGEBRA 1), FINAL EXAM FALL 2011 SOLUTIONS TO PRACTICE VERSION

MATH 1120 (LINEAR ALGEBRA 1), FINAL EXAM FALL 2011 SOLUTIONS TO PRACTICE VERSION MATH (LINEAR ALGEBRA ) FINAL EXAM FALL SOLUTIONS TO PRACTICE VERSION Problem (a) For each matrix below (i) find a basis for its column space (ii) find a basis for its row space (iii) determine whether

More information

Chapter 7: Symmetric Matrices and Quadratic Forms

Chapter 7: Symmetric Matrices and Quadratic Forms Chapter 7: Symmetric Matrices and Quadratic Forms (Last Updated: December, 06) These notes are derived primarily from Linear Algebra and its applications by David Lay (4ed). A few theorems have been moved

More information