Linear Algebra 20 Symmetric Matrices and Quadratic Forms
Wei-Shi Zheng

1 What Do You Learn from This Note

Do you still remember the following equation, or something like it, that I mentioned in class:
$$f = x^T A x. \tag{1}$$
It is exactly a quadratic form function, which will be treated in detail below. It connects eigen-analysis and orthogonal matrices very well.

Basic concepts: symmetric matrix (对称矩阵), quadratic form, quadratic form function, positive definite matrix (正定矩阵), spectral decomposition (谱分解), singular value decomposition (奇异值分解).

2 What Is a Symmetric Matrix

Definition 1 (symmetric matrix). A square matrix $A$ is said to be symmetric iff $A^T = A$ (or equivalently, $[A]_{ij} = [A]_{ji}$).

Examples: $0_{n \times n}$, $I_n$, diagonal matrices, etc.

Theorem 2. All square matrices in $\mathbb{R}^{n \times n}$ form a vector space. Denote this vector space by $M_n$.
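Definition 1 is easy to check numerically. A minimal sketch, assuming NumPy is available (`is_symmetric` is a hypothetical helper, not from the notes):

```python
import numpy as np

def is_symmetric(A, tol=1e-12):
    """Check Definition 1: A is symmetric iff A is square and A^T = A."""
    A = np.asarray(A, dtype=float)
    return A.shape[0] == A.shape[1] and np.allclose(A, A.T, atol=tol)

# The examples from the notes: the zero matrix, the identity,
# and any diagonal matrix are all symmetric.
print(is_symmetric(np.zeros((3, 3))))          # True
print(is_symmetric(np.eye(3)))                 # True
print(is_symmetric(np.diag([1.0, 2.0, 3.0])))  # True
print(is_symmetric([[1, 2], [3, 4]]))          # False
```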
Theorem 3. The set of all symmetric matrices of size $n$, denoted by $\mathrm{Sym}_n$, forms a subspace of $M_n$.

Proof. Let $A$ and $B$ be symmetric matrices of size $n$ and $c$ a scalar. Then:

1. $0_{n \times n}$ is obviously symmetric;
2. $A + B$ is symmetric, since $(A+B)^T = A^T + B^T = A + B$;
3. $cA$ is symmetric, since $(cA)^T = cA^T = cA$.

So $\mathrm{Sym}_n$ is a subspace of $M_n$.

3 Eigenvectors for a Symmetric Matrix

Theorem 4. Let $A \in \mathrm{Sym}_n(\mathbb{R})$, and let $\lambda_1, \lambda_2 \in \mathbb{R}$ be distinct eigenvalues of $A$ with eigenvectors $v_1, v_2$ respectively. Then $v_1 \perp v_2$.

Proof. We have
$$\lambda_1 (v_1 \cdot v_2) = (\lambda_1 v_1) \cdot v_2 = (A v_1) \cdot v_2 = v_1^T A^T v_2 = v_1^T A v_2 = v_1 \cdot (A v_2) = v_1 \cdot (\lambda_2 v_2) = \lambda_2 (v_1 \cdot v_2).$$
So $(\lambda_1 - \lambda_2)(v_1 \cdot v_2) = 0$. But $\lambda_1 - \lambda_2 \neq 0$ since the eigenvalues are distinct, which forces $v_1 \cdot v_2 = 0$.

Theorem 5. Let $A \in \mathrm{Sym}_n(\mathbb{R})$ and let $\lambda$ be a complex eigenvalue of $A$. Then $\lambda \in \mathbb{R}$; that is, $\lambda$ is a real eigenvalue of $A$.

Proof. Let $v \in \mathbb{C}^n$ be a $\lambda$-eigenvector, so $Av = \lambda v$. Also $A^T = A = \bar{A}$. So
$$\lambda(\bar{v}^T v) = \bar{v}^T(\lambda v) = \bar{v}^T A v = (A \bar{v})^T v = (\bar{\lambda} \bar{v})^T v = \bar{\lambda}(\bar{v}^T v).$$
So $(\lambda - \bar{\lambda})\,\bar{v}^T v = 0$. Since $v \neq 0$, it follows that $\bar{v}^T v = \bar{v}_1 v_1 + \cdots + \bar{v}_n v_n = |v_1|^2 + \cdots + |v_n|^2 > 0$, where $v = (v_1 \cdots v_n)^T$, which forces $\lambda - \bar{\lambda} = 0$. So $\lambda$ is real.
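Theorems 4 and 5 can be spot-checked numerically. A sketch assuming NumPy; the random matrix, seed, and tolerances are illustrative choices. Note that `np.linalg.eig` makes no symmetry assumption, yet for a symmetric matrix the eigenvalues come out real, as Theorem 5 predicts:

```python
import numpy as np

rng = np.random.default_rng(0)
B = rng.standard_normal((4, 4))
A = (B + B.T) / 2              # symmetrize a random matrix

# Theorem 5: all eigenvalues of a real symmetric matrix are real.
vals, vecs = np.linalg.eig(A)
assert np.allclose(vals.imag, 0)

# Theorem 4: eigenvectors for distinct eigenvalues are orthogonal.
# (A random symmetric matrix has distinct eigenvalues almost surely.)
v1, v2 = vecs[:, 0].real, vecs[:, 1].real
assert abs(np.dot(v1, v2)) < 1e-8
```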
4 Orthogonality & Symmetry

Theorem 6. Let $A$ be any symmetric matrix and $U$ any orthogonal matrix. Then $U^{-1}AU$ is still a symmetric matrix.

Proof. Notice that $U^T = U^{-1}$. So $(U^{-1}AU)^T = U^T A^T (U^{-1})^T = U^{-1}AU$.

We can now prove the major result on symmetric matrices.

Theorem 7 (orthogonally diagonalizable). Let $A$ be a square matrix. Then $A$ is symmetric if and only if there exists an orthogonal matrix $U$ such that $U^{-1}AU$ is diagonal.

Proof. Firstly, it is easy to verify that if there exists an orthogonal matrix $U$ such that $U^{-1}AU$ is diagonal, then $A$ is symmetric: letting the diagonal matrix be $D$, we have $A = UDU^{-1} = UDU^T \in \mathrm{Sym}_n(\mathbb{R})$.

Conversely, if $A$ is symmetric, we need to prove that there exists an orthogonal matrix $U$ such that $U^{-1}AU$ is diagonal. This can be done by induction on $n$ as follows.

1. Step 1 (base case): $n = 1$. Trivial.
2. Step 2 (inductive hypothesis): Assume the result holds for $n - 1$.
3. Step 3 (inductive step): Let $\lambda_1$ be an eigenvalue of $A$, which is real by Theorem 5, and $u_1$ a unit $\lambda_1$-eigenvector. Then we can extend $\{u_1\}$ to a basis of $\mathbb{R}^n$, say $\{u_1, v_2, \ldots, v_n\}$. Furthermore, by applying the Gram–Schmidt process, we obtain an orthonormal basis $\mathcal{U}_1 = \{u_1, u_2, \ldots, u_n\}$ of $\mathbb{R}^n$. Define $U_1 = (u_1 \cdots u_n) \in O_n(\mathbb{R})$, which is the change-of-basis matrix from basis $\mathcal{U}_1$ to the standard basis $\mathcal{E}$. Let $F = U_1^{-1} A U_1$. It is not hard to see that the first column of $F$ is $(\lambda_1\ 0 \cdots 0)^T$, and since $U_1^{-1} = U_1^T$, $F$ has the form
$$F = \begin{pmatrix} \lambda_1 & 0_{n-1}^T \\ 0_{n-1} & A' \end{pmatrix},$$
where $A' \in \mathrm{Sym}_{n-1}(\mathbb{R})$. By the inductive hypothesis, there exists $U' \in O_{n-1}(\mathbb{R})$ such that $U'^{-1} A' U'$ is diagonal. Define
$$U_2 = \begin{pmatrix} 1 & 0_{n-1}^T \\ 0_{n-1} & U' \end{pmatrix},$$
which is an orthogonal matrix. Then
$$U_2^{-1} U_1^{-1} A U_1 U_2 = U_2^{-1} F U_2 = \begin{pmatrix} \lambda_1 & 0_{n-1}^T \\ 0_{n-1} & U'^{-1} A' U' \end{pmatrix},$$
which is diagonal. So take $U = U_1 U_2$; then $U$ is an orthogonal matrix and $U^{-1}AU$ is diagonal.

Remarks (Spectral Decomposition, 谱分解): If $A$ is orthogonally diagonalizable, with
$$A = PDP^T = (u_1 \cdots u_n) \begin{pmatrix} \lambda_1 & & \\ & \ddots & \\ & & \lambda_n \end{pmatrix} \begin{pmatrix} u_1^T \\ \vdots \\ u_n^T \end{pmatrix},$$
then
$$A = \lambda_1 u_1 u_1^T + \cdots + \lambda_n u_n u_n^T$$
is the spectral decomposition of the matrix $A$.

5 Singular Value Decomposition

Question: If a matrix $A$ is not symmetric, it may not be orthogonally diagonalizable. However, $A^TA$ is symmetric and can be orthogonally diagonalized as $PDP^{-1}$. Is it then possible to find an orthogonal matrix $U$ and a matrix $\Sigma$ such that
$$PDP^{-1} = P \Sigma^T U^T U \Sigma P^{-1}, \qquad A = U \Sigma P^{-1} = U \Sigma V^T?$$

Answer: Yes, it can be done. The singular value decomposition realizes this!

Question: What are singular values?

Definition 8. Let $A$ be an $m \times n$ matrix. Then $A^TA$ can be orthogonally diagonalized. Let $\{v_1, \ldots, v_n\}$ be an orthonormal eigenvector basis of the matrix $A^TA$, where the corresponding eigenvalues are $\lambda_1 \geq \lambda_2 \geq \cdots \geq \lambda_n \geq 0$. Then we say $\sigma_i = \sqrt{\lambda_i}$ are the singular values of the matrix $A$.
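Theorem 7, the spectral decomposition, and Definition 8 can all be verified numerically. A sketch assuming NumPy, whose `eigh`/`eigvalsh` routines are designed for symmetric matrices and return an orthogonal eigenvector matrix; the random matrices and seed are illustrative choices:

```python
import numpy as np

rng = np.random.default_rng(2)
B = rng.standard_normal((4, 4))
A = (B + B.T) / 2                      # a random symmetric matrix

# Theorem 7: eigh returns an orthogonal U with U^T A U diagonal.
eigvals, U = np.linalg.eigh(A)
assert np.allclose(U.T @ U, np.eye(4))
assert np.allclose(U.T @ A @ U, np.diag(eigvals))

# Spectral decomposition: A = sum_i lambda_i u_i u_i^T.
A_rebuilt = sum(lam * np.outer(u, u) for lam, u in zip(eigvals, U.T))
assert np.allclose(A, A_rebuilt)

# Definition 8: the singular values of M are the square roots of the
# eigenvalues of M^T M (here M is 3x4, so it has 3 singular values).
M = rng.standard_normal((3, 4))
lams_M = np.linalg.eigvalsh(M.T @ M)[::-1]     # sorted descending
svals = np.linalg.svd(M, compute_uv=False)     # also descending
assert np.allclose(np.sqrt(np.clip(lams_M[:3], 0, None)), svals)
```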
Theorem 9. Let $A$ be an $m \times n$ matrix with $r = \mathrm{rank}(A)$. Then there exist an $m \times m$ orthogonal matrix $U$, an $n \times n$ orthogonal matrix $V$, and an $m \times n$ matrix
$$\Sigma = \begin{pmatrix} D & O_{r \times (n-r)} \\ O_{(m-r) \times r} & O_{(m-r) \times (n-r)} \end{pmatrix}, \qquad D = \mathrm{diag}(\sigma_1, \sigma_2, \ldots, \sigma_r),$$
where $\sigma_1 \geq \sigma_2 \geq \cdots \geq \sigma_r > 0$ are the singular values of $A$, such that $A = U \Sigma V^T$.

Proof.
STEP 1: Let $\{v_1, \ldots, v_n\}$ be an orthonormal eigenvector basis of the matrix $A^TA$, where the corresponding eigenvalues are $\lambda_1 \geq \lambda_2 \geq \cdots \geq \lambda_r > \lambda_{r+1} = \lambda_{r+2} = \cdots = \lambda_n = 0$. Then one can prove that $\{Av_1, \ldots, Av_r\}$ forms an orthogonal basis of $\mathrm{Col}\,A$.
STEP 2: Normalize all vectors in the set $\{Av_1, \ldots, Av_r\}$ to obtain an orthonormal set $\{u_1, \ldots, u_r\}$, where
$$u_i = \frac{1}{\|Av_i\|} Av_i = \frac{1}{\sigma_i} Av_i, \quad \text{that is,} \quad Av_i = \sigma_i u_i, \quad 1 \leq i \leq r.$$
STEP 3: Extend $\{u_1, \ldots, u_r\}$ to an orthonormal basis $\{u_1, \ldots, u_r, u_{r+1}, \ldots, u_m\}$ of $\mathbb{R}^m$.
STEP 4: Let $U = [u_1, \ldots, u_m]$ and $V = [v_1, \ldots, v_n]$. Then they are orthogonal matrices and
$$AV = [Av_1, \ldots, Av_r, 0, \ldots, 0] = [\sigma_1 u_1, \ldots, \sigma_r u_r, 0, \ldots, 0] = U \Sigma.$$
Then we easily have $A = U \Sigma V^T$.

Example: Compute the singular value decomposition for a matrix $A$.

Applications: Image Processing (see in-class demo)
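The four STEPs of the proof translate directly into code. A sketch assuming NumPy; `svd_from_gram` is a hypothetical name, and a QR factorization stands in for the Gram–Schmidt-style basis extension of STEP 3:

```python
import numpy as np

def svd_from_gram(A, tol=1e-10):
    """Build an SVD A = U Sigma V^T following STEPs 1-4 of Theorem 9."""
    m, n = A.shape
    # STEP 1: orthonormal eigenbasis of A^T A, eigenvalues descending.
    lams, V = np.linalg.eigh(A.T @ A)          # eigh returns ascending
    order = np.argsort(lams)[::-1]
    lams, V = lams[order], V[:, order]
    sigmas = np.sqrt(np.clip(lams, 0, None))
    r = int(np.sum(sigmas > tol))              # rank of A
    # STEP 2: u_i = A v_i / sigma_i for i = 1..r.
    U_r = (A @ V[:, :r]) / sigmas[:r]
    # STEP 3: extend {u_1,...,u_r} to an orthonormal basis of R^m.
    U, _ = np.linalg.qr(np.hstack([U_r, np.eye(m)]))
    U[:, :r] = U_r                             # keep the computed u_i exactly
    # STEP 4: assemble Sigma.
    Sigma = np.zeros((m, n))
    Sigma[:r, :r] = np.diag(sigmas[:r])
    return U, Sigma, V

rng = np.random.default_rng(3)
A = rng.standard_normal((4, 3))
U, S, V = svd_from_gram(A)
assert np.allclose(U @ S @ V.T, A)
assert np.allclose(U.T @ U, np.eye(4))
assert np.allclose(V.T @ V, np.eye(3))
```

The QR of $[U_r \mid I_m]$ works because the columns of $U_r$ are already orthonormal, so the remaining columns of $Q$ complete them to an orthonormal basis of $\mathbb{R}^m$.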
6 Application: Quadratic Forms (二次型)

Definition 10 (quadratic form). A real quadratic form $q(x_1, \ldots, x_n)$ in $n$ variables $x_1, \ldots, x_n$ is a polynomial over $\mathbb{R}$ of the form
$$q(x_1, \ldots, x_n) = \sum_{1 \leq i \leq j \leq n} c_{ij} x_i x_j.$$

Examples: $0$, $8x_1^2$, $x_2 x_7$, $x_1^2 + x_2^2 + x_3^2$, $x_1 x_2 + 3 x_2 x_5$, $2 x_3 x_4 - 4 x_1^2$ are all quadratic forms.

Now suppose we are given a quadratic form $q(x_1, \ldots, x_n) = \sum_{1 \leq i \leq j \leq n} c_{ij} x_i x_j$. We can define a symmetric matrix $A$ by
$$[A]_{ij} = \begin{cases} c_{ij} & \text{if } i = j; \\ \frac{1}{2} c_{ij} & \text{if } i < j; \\ \frac{1}{2} c_{ji} & \text{if } i > j. \end{cases}$$
Then it is easy to verify that $q(x_1, \ldots, x_n) = (x_1 \cdots x_n) A (x_1 \cdots x_n)^T$.

Conversely, for any $A \in \mathrm{Sym}_n(\mathbb{R})$,
$$q(x_1, \ldots, x_n) = (x_1 \cdots x_n) A (x_1 \cdots x_n)^T = \sum_{1 \leq i \leq n} a_{ii} x_i^2 + \sum_{1 \leq i < j \leq n} 2 a_{ij} x_i x_j$$
is a quadratic form, called the quadratic form corresponding to $A$ and denoted by $q_A(x_1, \ldots, x_n)$. Thus, any quadratic form in $n$ variables is represented by a unique symmetric matrix of size $n$.

Definition 11 (quadratic form function). Let $q(x_1, \ldots, x_n)$ be a real quadratic form. Then the function $f_q : \mathbb{R}^n \to \mathbb{R}$, $f_q(x) = q(x_1, \ldots, x_n)$, where $x = (x_1 \cdots x_n)^T$, is called the quadratic form function induced by $q$. If $q$ corresponds to $A \in \mathrm{Sym}_n$, then we write $f_A$ for $f_q$, and $f_A(x) = x^T A x$.
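The passage from coefficients $c_{ij}$ to the symmetric matrix $A$ can be sketched as follows, assuming NumPy (`quad_form_matrix` is a hypothetical helper; indices are 1-based to match the notes):

```python
import numpy as np

def quad_form_matrix(n, coeffs):
    """Build the symmetric matrix A with q(x) = x^T A x from the
    coefficients c_ij (i <= j) of q = sum_{i<=j} c_ij x_i x_j:
    diagonal entries get c_ii, off-diagonal pairs each get c_ij / 2."""
    A = np.zeros((n, n))
    for (i, j), c in coeffs.items():
        if i == j:
            A[i - 1, i - 1] = c
        else:
            A[i - 1, j - 1] = A[j - 1, i - 1] = c / 2
    return A

# q(x1, x2, x3) = 2 x1^2 + 3 x1 x2 - x2 x3
A = quad_form_matrix(3, {(1, 1): 2, (1, 2): 3, (2, 3): -1})
x = np.array([1.0, 2.0, -1.0])
q_direct = 2 * x[0]**2 + 3 * x[0] * x[1] - x[1] * x[2]
assert np.allclose(A, A.T)            # A is symmetric
assert np.isclose(x @ A @ x, q_direct)  # x^T A x recovers q(x)
```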
Question: Can we obtain a simpler quadratic form for the same quadratic form function?

Recall that for any $x \in \mathbb{R}^n$ we have $x = [x]_{\mathcal{E}}$, where $\mathcal{E}$ is the standard basis. Suppose $\mathcal{B}$ is another basis of $\mathbb{R}^n$. Then $x = [x]_{\mathcal{E}} = P[x]_{\mathcal{B}}$, where $P = [\mathcal{B}]_{\mathcal{E}}$. So for any quadratic form function $f_A$ we have
$$f_A(x) = f_A(Py) = y^T P^T A P\, y = f_{P^TAP}(y),$$
where $y = [x]_{\mathcal{B}}$. So $f_A$ and $f_{P^TAP}$ can be regarded as the same quadratic form function with respect to different bases. Therefore, for the sake of convenience, it is natural to choose an invertible matrix $P$ such that $f_{P^TAP}$ has a simple form. Moreover, under the postulates of Euclidean geometry, only orthogonal coordinate transformations are admitted; that is, $P$ is required to be orthogonal. Now by Theorem 7, we can choose an orthogonal matrix $U$ such that $U^TAU$ is diagonal. So we have:

Theorem 12. Let $f_A(x) = x^TAx$ be a quadratic form function. Then there exists $U \in O_n$ such that
$$f_A(x) = f_{U^TAU}(y) = y^T U^T A U\, y = \lambda_1 y_1^2 + \cdots + \lambda_n y_n^2,$$
where $y = (y_1 \cdots y_n)^T$ and $\lambda_1, \ldots, \lambda_n$ are all the eigenvalues of $A$, counted with multiplicity.
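Theorem 12 in action: a sketch assuming NumPy, using a concrete $2 \times 2$ matrix chosen for illustration (its eigenvalues are $1$ and $3$):

```python
import numpy as np

A = np.array([[2.0, 1.0],
              [1.0, 2.0]])
lams, U = np.linalg.eigh(A)        # lams ascending: [1, 3]

rng = np.random.default_rng(4)
x = rng.standard_normal(2)
y = U.T @ x                        # change of coordinates: x = U y

# f_A(x) = lambda_1 y_1^2 + lambda_2 y_2^2, with no cross term.
assert np.allclose(lams, [1.0, 3.0])
assert np.isclose(x @ A @ x, lams[0] * y[0]**2 + lams[1] * y[1]**2)
```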
6.1 Positive definite quadratic forms

Definition 13 (positive definite quadratic form). Let $q$ be a real quadratic form in $n$ variables. Then $q$ is said to be

1. positive definite (正定的) if $f_q(x) > 0$ for all $x \neq 0$;
2. negative definite (负定的) if $f_q(x) < 0$ for all $x \neq 0$;
3. indefinite (不定的) if the values of $f_q(x)$ can be both positive and negative.

Theorem 14. Let $q_A$ be the quadratic form corresponding to $A \in \mathrm{Sym}_n(\mathbb{R})$ and $\lambda_1, \ldots, \lambda_n$ all the eigenvalues of $A$. Then

1. $q_A$ is positive definite iff $\lambda_i > 0$ for all $i = 1, \ldots, n$;
2. $q_A$ is negative definite iff $\lambda_i < 0$ for all $i = 1, \ldots, n$;
3. $q_A$ is indefinite iff there exist $i, j$ such that $\lambda_i > 0$ and $\lambda_j < 0$.

Proof. Let $U$ be an orthogonal matrix such that $f_A(x) = f_{U^TAU}(y) = \lambda_1 y_1^2 + \cdots + \lambda_n y_n^2$. Then the range of $f_A(x)$ is the same as that of $f_{U^TAU}$. So the result follows easily by analyzing the range of $\lambda_1 y_1^2 + \cdots + \lambda_n y_n^2$.

Theorem 15. Let $A \in \mathrm{Sym}_2(\mathbb{R})$ and let $\lambda_1 \geq \lambda_2$ be the eigenvalues of $A$. Then the equation $f_A(x) = 1$ represents a conic section (圆锥曲线), which can be classified as:

1. an ellipse (椭圆) if $\lambda_1 \geq \lambda_2 > 0$;
2. a hyperbola (双曲线) if $\lambda_1 > 0 > \lambda_2$;
3. the empty set if $0 \geq \lambda_1 \geq \lambda_2$;
4. a pair of parallel lines if $\lambda_1 > \lambda_2 = 0$.

Proof. Let $U \in O_2$ be such that $f_A(x) = f_{U^TAU}(y) = \lambda_1 y_1^2 + \lambda_2 y_2^2$. Then the original equation becomes $\lambda_1 y_1^2 + \lambda_2 y_2^2 = 1$, and the result follows easily.
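Theorem 14 gives a practical definiteness test. A sketch assuming NumPy; `classify` is a hypothetical helper, and the semidefinite cases, which Definition 13 does not name, are lumped together:

```python
import numpy as np

def classify(A, tol=1e-12):
    """Classify the quadratic form q_A by the signs of A's eigenvalues,
    per Theorem 14."""
    lams = np.linalg.eigvalsh(A)
    if np.all(lams > tol):
        return "positive definite"
    if np.all(lams < -tol):
        return "negative definite"
    if lams.min() < -tol and lams.max() > tol:
        return "indefinite"
    return "other (semidefinite)"

print(classify(np.array([[2.0, 1.0], [1.0, 2.0]])))   # positive definite
print(classify(np.array([[1.0, 0.0], [0.0, -3.0]])))  # indefinite
print(classify(-np.eye(2)))                           # negative definite
```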
7 Geometric Understanding of Symmetric Matrices

Theorem 16. Let $A$ be a symmetric matrix. Then
$$M = \max\{\,x^T A x : \|x\| = 1\,\} = \text{the maximum eigenvalue of } A,$$
$$m = \min\{\,x^T A x : \|x\| = 1\,\} = \text{the minimum eigenvalue of } A.$$

The Café Terrace on the Place du Forum, by Van Gogh
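Theorem 16 can be checked by sampling unit vectors: the values $x^T A x$ never escape the interval bounded by the extreme eigenvalues. A sketch assuming NumPy:

```python
import numpy as np

rng = np.random.default_rng(5)
B = rng.standard_normal((3, 3))
A = (B + B.T) / 2
lams = np.linalg.eigvalsh(A)       # ascending eigenvalues

# Sample many random unit vectors and evaluate x^T A x for each.
X = rng.standard_normal((10000, 3))
X /= np.linalg.norm(X, axis=1, keepdims=True)
vals = np.einsum('ij,jk,ik->i', X, A, X)

# Every sampled value lies in [lambda_min, lambda_max].
assert vals.max() <= lams[-1] + 1e-9
assert vals.min() >= lams[0] - 1e-9
```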
Math 443 Differential Geometry Spring 2013 Handout 3: Bilinear and Quadratic Forms This handout should be read just before Chapter 4 of the textbook. Endomorphisms of a Vector Space This handout discusses
More informationMath 102 Final Exam - Dec 14 - PCYNH pm Fall Name Student No. Section A0
Math 12 Final Exam - Dec 14 - PCYNH 122-6pm Fall 212 Name Student No. Section A No aids allowed. Answer all questions on test paper. Total Marks: 4 8 questions (plus a 9th bonus question), 5 points per
More informationQuizzes for Math 304
Quizzes for Math 304 QUIZ. A system of linear equations has augmented matrix 2 4 4 A = 2 0 2 4 3 5 2 a) Write down this system of equations; b) Find the reduced row-echelon form of A; c) What are the pivot
More informationAMS526: Numerical Analysis I (Numerical Linear Algebra)
AMS526: Numerical Analysis I (Numerical Linear Algebra) Lecture 16: Eigenvalue Problems; Similarity Transformations Xiangmin Jiao Stony Brook University Xiangmin Jiao Numerical Analysis I 1 / 18 Eigenvalue
More informationORTHOGONALITY AND LEAST-SQUARES [CHAP. 6]
ORTHOGONALITY AND LEAST-SQUARES [CHAP. 6] Inner products and Norms Inner product or dot product of 2 vectors u and v in R n : u.v = u 1 v 1 + u 2 v 2 + + u n v n Calculate u.v when u = 1 2 2 0 v = 1 0
More informationLinear Algebra Review. Vectors
Linear Algebra Review 9/4/7 Linear Algebra Review By Tim K. Marks UCSD Borrows heavily from: Jana Kosecka http://cs.gmu.edu/~kosecka/cs682.html Virginia de Sa (UCSD) Cogsci 8F Linear Algebra review Vectors
More information1. General Vector Spaces
1.1. Vector space axioms. 1. General Vector Spaces Definition 1.1. Let V be a nonempty set of objects on which the operations of addition and scalar multiplication are defined. By addition we mean a rule
More informationThe Singular Value Decomposition and Least Squares Problems
The Singular Value Decomposition and Least Squares Problems Tom Lyche Centre of Mathematics for Applications, Department of Informatics, University of Oslo September 27, 2009 Applications of SVD solving
More informationLinear Algebra & Geometry why is linear algebra useful in computer vision?
Linear Algebra & Geometry why is linear algebra useful in computer vision? References: -Any book on linear algebra! -[HZ] chapters 2, 4 Some of the slides in this lecture are courtesy to Prof. Octavia
More informationTranspose & Dot Product
Transpose & Dot Product Def: The transpose of an m n matrix A is the n m matrix A T whose columns are the rows of A. So: The columns of A T are the rows of A. The rows of A T are the columns of A. Example:
More informationEE5120 Linear Algebra: Tutorial 6, July-Dec Covers sec 4.2, 5.1, 5.2 of GS
EE0 Linear Algebra: Tutorial 6, July-Dec 07-8 Covers sec 4.,.,. of GS. State True or False with proper explanation: (a) All vectors are eigenvectors of the Identity matrix. (b) Any matrix can be diagonalized.
More informationMATH 115A: SAMPLE FINAL SOLUTIONS
MATH A: SAMPLE FINAL SOLUTIONS JOE HUGHES. Let V be the set of all functions f : R R such that f( x) = f(x) for all x R. Show that V is a vector space over R under the usual addition and scalar multiplication
More information