Chapter 7: Symmetric Matrices and Quadratic Forms

(Last Updated: December 2016)

These notes are derived primarily from Linear Algebra and its Applications by David Lay (4th ed). A few theorems have been moved around.

1. Diagonalization of Symmetric Matrices

We have seen already that it is quite time intensive to determine whether a matrix is diagonalizable. We'll see that there are certain cases when a matrix is always diagonalizable.

Definition 1. A matrix A is symmetric if A^T = A.

Example 1. Let
    A = [  3  -2   4 ]
        [ -2   6   2 ]
        [  4   2   3 ].
Note that A^T = A, so A is symmetric. The characteristic polynomial of A is χ_A(t) = (t + 2)(t - 7)^2, so the eigenvalues are -2 and 7. The corresponding eigenspaces have bases
    λ = -2: { (-1, -1/2, 1) },    λ = 7: { (1, 0, 1), (-1/2, 1, 0) }.
Hence, A is diagonalizable. Now we use Gram-Schmidt to find an orthogonal basis for R^3. Note that the eigenvector for λ = -2 is already orthogonal to both eigenvectors for λ = 7. We obtain
    v_1 = (1, 0, 1),   v_2 = (-1/4, 1, 1/4),   v_3 = (-1, -1/2, 1).
Finally, we normalize each vector,
    u_1 = (1/√2, 0, 1/√2),   u_2 = (-1/√18, 4/√18, 1/√18),   u_3 = (-2/3, -1/3, 2/3).
Now the matrix U = [u_1 u_2 u_3] is orthogonal and so U^T U = I.

Theorem 2. If A is symmetric, then any two eigenvectors from different eigenspaces are orthogonal.

Proof. Let v_1, v_2 be eigenvectors for A with corresponding eigenvalues λ_1, λ_2, λ_1 ≠ λ_2. Then
    λ_1 (v_1 · v_2) = (λ_1 v_1)^T v_2 = (A v_1)^T v_2 = v_1^T A^T v_2 = v_1^T A v_2 = v_1^T (λ_2 v_2) = λ_2 (v_1 · v_2).
Hence, (λ_1 - λ_2)(v_1 · v_2) = 0. Since λ_1 ≠ λ_2, we must have v_1 · v_2 = 0.
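
A quick numerical check of Example 1 (an illustrative aside, not part of the notes; it assumes NumPy is available) confirms that U is orthogonal and that U^T A U is diagonal:

    import numpy as np

    A = np.array([[3., -2., 4.],
                  [-2., 6., 2.],
                  [4., 2., 3.]])

    # Orthonormal eigenvectors found in Example 1.
    u1 = np.array([1., 0., 1.]) / np.sqrt(2)
    u2 = np.array([-1., 4., 1.]) / np.sqrt(18)
    u3 = np.array([-2., -1., 2.]) / 3.
    U = np.column_stack([u1, u2, u3])

    print(np.allclose(U.T @ U, np.eye(3)))   # True: U is orthogonal
    print(np.round(U.T @ A @ U, 10))         # diag(7, 7, -2)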

Based on the previous theorem, we say that the eigenspaces of A are mutually orthogonal.

Definition 2. An n × n matrix A is orthogonally diagonalizable if there exists an orthogonal n × n matrix P and a diagonal matrix D such that A = PDP^T.

Theorem 3. If A is orthogonally diagonalizable, then A is symmetric.

Proof. Since A is orthogonally diagonalizable, then A = PDP^T for some orthogonal matrix P and diagonal matrix D. A is symmetric because
    A^T = (PDP^T)^T = (P^T)^T D^T P^T = PDP^T = A.

It turns out the converse of the above theorem is also true! The set of eigenvalues of a matrix A is called the spectrum of A and is denoted σ_A.

Theorem 4 (The Spectral Theorem for symmetric matrices). Let A be a (real) symmetric n × n matrix. Then the following hold.
(1) A has n real eigenvalues, counting multiplicities.
(2) For each eigenvalue λ of A, geomult_λ(A) = algmult_λ(A).
(3) The eigenspaces are mutually orthogonal.
(4) A is orthogonally diagonalizable.

Proof. We proved in HW9, Exercise 6 that every eigenvalue of a symmetric matrix is real. The second part of (1) as well as (2) are immediate consequences of (4). We proved (3) in Theorem 2. Note that (4) is trivial when A has n distinct eigenvalues by (3).

We prove (4) by induction. Clearly the result holds when A is 1 × 1. Assume (n-1) × (n-1) symmetric matrices are orthogonally diagonalizable. Let A be n × n, let λ_1 be an eigenvalue of A, and let u_1 be a (unit) eigenvector for λ_1. By the Gram-Schmidt process, we may extend u_1 to an orthonormal basis {u_1, ..., u_n} for R^n, where {u_2, ..., u_n} is a basis for W = (Span{u_1})^⊥. Set U = [u_1 u_2 ⋯ u_n]. Then
    U^T A U = [ u_1^T A u_1  ⋯  u_1^T A u_n ]   [ λ_1  * ]
              [      ⋮               ⋮      ] = [  0   B ].
              [ u_n^T A u_1  ⋯  u_n^T A u_n ]
The first column is as indicated because u_i^T A u_1 = u_i^T (λ_1 u_1) = λ_1 (u_i · u_1) = λ_1 δ_{i1}. As U^T A U is symmetric, * = 0 and B is a symmetric (n-1) × (n-1) matrix that is orthogonally diagonalizable with eigenvalues λ_2, ..., λ_n (by the inductive hypothesis). Because A and U^T A U are similar, the eigenvalues of A are λ_1, ..., λ_n.

Since B is orthogonally diagonalizable, there exists an orthogonal matrix Q such that Q^T B Q = D, where the diagonal entries of D are λ_2, ..., λ_n. Now
    [ 1  0 ]^T [ λ_1  0 ] [ 1  0 ]   [ λ_1     0     ]   [ λ_1  0 ]
    [ 0  Q ]   [  0   B ] [ 0  Q ] = [  0   Q^T B Q  ] = [  0   D ].
Note that [ 1 0 ; 0 Q ] is orthogonal. Set V = U [ 1 0 ; 0 Q ]. As the product of orthogonal matrices is orthogonal, V is itself orthogonal and V^T A V is diagonal.

Suppose A is orthogonally diagonalizable, so A = UDU^T where U = [u_1 ⋯ u_n] and D is the diagonal matrix whose diagonal entries are the eigenvalues of A, λ_1, ..., λ_n. Then
    A = UDU^T = λ_1 u_1 u_1^T + ⋯ + λ_n u_n u_n^T.
This is known as the spectral decomposition of A. Each u_i u_i^T is called a projection matrix because (u_i u_i^T)x is the projection of x onto Span{u_i}.

Example 5. Construct a spectral decomposition of the matrix A in Example 1. Recall that
    A = [  3  -2   4 ]
        [ -2   6   2 ]
        [  4   2   3 ]
and our orthonormal basis of Col(A) was
    u_1 = (1/√2, 0, 1/√2),   u_2 = (-1/√18, 4/√18, 1/√18),   u_3 = (-2/3, -1/3, 2/3).
Setting U = [u_1 u_2 u_3] gives U^T A U = D = diag(7, 7, -2). The projection matrices are
    u_1 u_1^T = [ 1/2  0  1/2 ]
                [  0   0   0  ]
                [ 1/2  0  1/2 ],

    u_2 u_2^T = [  1/18  -2/9  -1/18 ]
                [ -2/9    8/9   2/9  ]
                [ -1/18   2/9   1/18 ],

    u_3 u_3^T = [  4/9   2/9  -4/9 ]
                [  2/9   1/9  -2/9 ]
                [ -4/9  -2/9   4/9 ].
The spectral decomposition is
    7 u_1 u_1^T + 7 u_2 u_2^T - 2 u_3 u_3^T = A.
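
A short check of Example 5 (again an illustrative sketch assuming NumPy, not part of the notes); the projection matrices are formed with np.outer and recombined with the eigenvalues:

    import numpy as np

    A = np.array([[3., -2., 4.],
                  [-2., 6., 2.],
                  [4., 2., 3.]])
    u1 = np.array([1., 0., 1.]) / np.sqrt(2)
    u2 = np.array([-1., 4., 1.]) / np.sqrt(18)
    u3 = np.array([-2., -1., 2.]) / 3.

    P1, P2, P3 = (np.outer(u, u) for u in (u1, u2, u3))   # projection matrices u_i u_i^T
    print(np.allclose(P1 @ P1, P1))                # True: P1 projects onto Span{u1}
    print(np.allclose(7*P1 + 7*P2 - 2*P3, A))      # True: the spectral decomposition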

2. Quadratic Forms

Definition 3. A quadratic form is a function Q on R^n given by Q(x) = x^T A x where A is an n × n symmetric matrix, called the matrix of the quadratic form.

Example 6. The function x ↦ ‖x‖^2 is a quadratic form given by setting A = I. Quadratic forms appear in differential geometry, physics, economics, and statistics.

Example 7. Let
    A = [  1  -1 ]    and   x = (x_1, x_2).
        [ -1   1 ]
The corresponding quadratic form is
    Q(x) = x^T A x = x_1^2 - 2x_1x_2 + x_2^2.

Example 8. Find the matrix of the quadratic form Q(x) = 8x_1^2 + 7x_2^2 - 3x_3^2 - 6x_1x_2 + 4x_1x_3 - 2x_2x_3. By inspection we see that
    A = [  8  -3   2 ]
        [ -3   7  -1 ]
        [  2  -1  -3 ].

Theorem 9 (Principal Axes Theorem). Let A be an n × n symmetric matrix. Then there is an orthogonal change of variable x = Py that transforms the quadratic form x^T A x into a quadratic form y^T D y such that D is diagonal.

Proof. By the Spectral Theorem, there exists an orthogonal matrix P such that P^T A P = D with D diagonal. For all x ∈ R^n, set y = P^T x. Then x = Py and
    x^T A x = (Py)^T A (Py) = y^T (P^T A P) y = y^T D y.

Example 10. Let Q(x) = 3x_1^2 - 4x_1x_2 + 6x_2^2. Make a change of variable that transforms Q into a quadratic form with no cross-product terms.

We have A = [ 3 -2 ; -2 6 ] with eigenvalues 7 and 2, and eigenbases { (1, -2) } and { (2, 1) }, respectively. We normalize each to determine our diagonalizing matrix P, so that P^T A P = D where
    P = [  1/√5   2/√5 ]    and   D = [ 7  0 ]
        [ -2/√5   1/√5 ]              [ 0  2 ].
Our change of variable is x = Py and the new form is Q(x) = y^T D y = 7y_1^2 + 2y_2^2.

If A is a symmetric n × n matrix, the quadratic form Q(x) = x^T A x is a real-valued function with domain R^n. If n = 2, then the graph of Q(x) is the set of points (x_1, x_2, z) with z = Q(x). For example, if Q(x) = 3x_1^2 + 7x_2^2, then Q(x) > 0 for all x ≠ 0. Such a form is called positive definite, and it is possible to determine this property from the eigenvalues of A.
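
The change of variable in Example 10 can be verified numerically. The sketch below is illustrative only and assumes NumPy; note that np.linalg.eigh returns eigenvalues in increasing order, so here the diagonal form appears as 2y_1^2 + 7y_2^2 rather than 7y_1^2 + 2y_2^2:

    import numpy as np

    A = np.array([[3., -2.],
                  [-2., 6.]])
    eigvals, P = np.linalg.eigh(A)     # eigvals = [2., 7.] in increasing order

    x = np.array([1., -2.])            # any test point
    y = P.T @ x                        # change of variable x = P y

    Q_x = x @ A @ x                    # 3x1^2 - 4x1x2 + 6x2^2 evaluated at x
    Q_y = eigvals @ y**2               # 2y1^2 + 7y2^2, no cross-product term
    print(np.isclose(Q_x, Q_y))        # True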

Definition 4. A quadratic form Q is
(1) positive definite if Q(x) > 0 for all x ≠ 0.
(2) negative definite if Q(x) < 0 for all x ≠ 0.
(3) indefinite if Q(x) assumes positive and negative values.

Theorem 11. Let A be an n × n symmetric matrix. Then Q(x) = x^T A x is
(1) positive definite if and only if the eigenvalues of A are all positive.
(2) negative definite if and only if the eigenvalues of A are all negative.
(3) indefinite otherwise.

Proof. By the Principal Axes Theorem, there exists an orthogonal matrix P such that x = Py and
    Q(x) = x^T A x = y^T D y = λ_1 y_1^2 + ⋯ + λ_n y_n^2,
where λ_1, ..., λ_n are the eigenvalues of A. Since P is invertible, there is a 1-1 correspondence between all nonzero x and nonzero y. The values above for y ≠ 0 are clearly determined by the signs of the λ_i. Hence so are the corresponding values of Q at x ≠ 0.

We can also determine the maximum and minimum of a quadratic form when evaluated on a unit vector. This is known as constrained optimization.

Theorem 12. Let A be a symmetric matrix. Set
    m = min{ x^T A x : ‖x‖ = 1 }   and   M = max{ x^T A x : ‖x‖ = 1 }.
Then M is the greatest eigenvalue of A and m is the least eigenvalue of A. Moreover, x^T A x = M (resp. m) when x is a unit eigenvector corresponding to M (resp. m).

Example 13. Let Q(x) = 7x_1^2 + x_2^2 + 7x_3^2 - 8x_1x_2 - 4x_1x_3 - 8x_2x_3. Find a vector x such that Q(x) is maximized (minimized) subject to x^T x = 1.

The matrix of Q is
    A = [  7  -4  -2 ]
        [ -4   1  -4 ]
        [ -2  -4   7 ]
and the eigenvalues are 9, 9, and -3. Hence, the maximum (resp. minimum) value of Q subject to the constraint is 9 (resp. -3).

An eigenvector for 9 is (1, 0, -1). Hence, setting u = (1/√2, 0, -1/√2) gives Q(u) = 9.

An eigenvector for -3 is (1, 2, 1). Hence, setting v = (1/√6, 2/√6, 1/√6) gives Q(v) = -3.
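
An illustrative numerical check of Theorem 12 on Example 13 (assumes NumPy; not part of the notes):

    import numpy as np

    A = np.array([[7., -4., -2.],
                  [-4., 1., -4.],
                  [-2., -4., 7.]])
    print(np.linalg.eigvalsh(A).round(10))     # [-3. 9. 9.]

    u = np.array([1., 0., -1.]) / np.sqrt(2)   # unit eigenvector for 9
    v = np.array([1., 2., 1.]) / np.sqrt(6)    # unit eigenvector for -3
    print(u @ A @ u)                           # ~9, the maximum of Q on the unit sphere
    print(v @ A @ v)                           # ~-3, the minimum of Q on the unit sphere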

4. Singular Value Decomposition

The key question in this section is whether it is possible to diagonalize a non-square matrix.

Example 14. Let
    A = [ 4  11  14 ]
        [ 8   7  -2 ].
A linear transformation with matrix A maps the unit sphere {x ∈ R^3 : ‖x‖ = 1} onto an ellipse in R^2. Find a unit vector x at which the length ‖Ax‖ is maximized and compute its length.

The key observation here is that ‖Ax‖ is maximized at the same x that maximizes ‖Ax‖^2, and ‖Ax‖^2 = (Ax)^T (Ax) = x^T (A^T A) x. Thus, we want to maximize the quadratic form Q(x) = x^T (A^T A) x subject to the constraint ‖x‖ = 1. The eigenvalues of
    A^T A = [  80  100   40 ]
            [ 100  170  140 ]
            [  40  140  200 ]
are λ_1 = 360, λ_2 = 90, λ_3 = 0 with corresponding (unit) eigenvectors
    v_1 = (1/3, 2/3, 2/3),   v_2 = (-2/3, -1/3, 2/3),   v_3 = (2/3, -2/3, 1/3).
The maximum value is 360, obtained when x = v_1. That is, the vector Av_1 corresponds to the point on the ellipse furthest from the origin. Then
    Av_1 = (18, 6)   and   ‖Av_1‖ = √360 = 6√10.

The trick we utilized above is a handy one. That is, even though A is not symmetric (it wasn't even square), A^T A is symmetric and we can extract information about A from A^T A.

Let A be an m × n matrix. Then A^T A can be orthogonally diagonalized. Let {v_1, ..., v_n} be an orthonormal A^T A-eigenbasis for R^n with corresponding eigenvalues λ_1, ..., λ_n. For 1 ≤ i ≤ n,
    0 ≤ ‖Av_i‖^2 = (Av_i)^T (Av_i) = v_i^T (A^T A) v_i = v_i^T (λ_i v_i) = λ_i v_i^T v_i = λ_i.
Hence, λ_i ≥ 0 for all i. Arrange the eigenvalues such that λ_1 ≥ λ_2 ≥ ⋯ ≥ λ_n ≥ 0 and define σ_i = √λ_i. That is, the σ_i represent the lengths of the vectors Av_i. These are called the singular values of A.

Example 15. In Example 14, the singular values are σ_1 = √360 = 6√10, σ_2 = √90 = 3√10, σ_3 = 0.

Theorem 16. Let A be an m × n matrix. Suppose {v_1, ..., v_n} is an orthonormal basis for R^n consisting of eigenvectors of A^T A with corresponding eigenvalues λ_1 ≥ λ_2 ≥ ⋯ ≥ λ_n ≥ 0. Suppose A has r nonzero singular values. Then {Av_1, ..., Av_r} is an orthogonal basis for Col(A) and rank A = r.

Proof. Suppose i ≠ j. Then
    (Av_i) · (Av_j) = (Av_i)^T (Av_j) = v_i^T (A^T A) v_j = λ_j (v_i^T v_j) = 0.
Hence, {Av_1, ..., Av_r} is an orthogonal set and hence linearly independent. It is left to show that the given set spans Col(A). Since there are exactly r nonzero singular values, Av_i ≠ 0 if and only if 1 ≤ i ≤ r. Let y ∈ Col(A), so y = Ax for some x ∈ R^n. Then x = c_1 v_1 + ⋯ + c_n v_n for some scalars c_i ∈ R and so
    y = Ax = c_1 Av_1 + ⋯ + c_r Av_r + c_{r+1} Av_{r+1} + ⋯ + c_n Av_n = c_1 Av_1 + ⋯ + c_r Av_r.
Thus, y ∈ Span{Av_1, ..., Av_r} and so {Av_1, ..., Av_r} is an orthogonal basis for Col(A) and rank A = dim Col(A) = r.

Theorem 17 (Singular Value Decomposition). Let A be an m × n matrix with rank r. Then there exists an m × n matrix
    Σ = [ D  0 ]
        [ 0  0 ]
where D is an r × r diagonal matrix whose diagonal entries are the first r singular values of A, σ_1 ≥ σ_2 ≥ ⋯ ≥ σ_r > 0, and there exist an m × m orthogonal matrix U and an n × n orthogonal matrix V such that A = UΣV^T.

Proof. Let {v_1, ..., v_n} be an orthonormal basis for R^n consisting of eigenvectors of A^T A, ordered as in Theorem 16. By Theorem 16, there exists an orthogonal basis {Av_1, ..., Av_r} of Col(A). For each 1 ≤ i ≤ r, set
    u_i = Av_i / ‖Av_i‖ = (1/σ_i) Av_i.
Then Av_i = σ_i u_i. Extend {u_1, ..., u_r} to an orthonormal basis {u_1, ..., u_m} of R^m. Let U = [u_1 ⋯ u_m] and V = [v_1 ⋯ v_n]. Then both U and V are orthogonal and
    AV = [ Av_1 ⋯ Av_r  0 ⋯ 0 ] = [ σ_1 u_1 ⋯ σ_r u_r  0 ⋯ 0 ] = UΣ.
Since V is orthogonal, A = AVV^T = UΣV^T.

The columns of U in the preceding theorem are called the left singular vectors of A and the columns of V are the right singular vectors of A. We summarize the method for computing an SVD below.

Let A be an m × n matrix of rank r. Then A = UΣV^T where U, Σ, V are as below.
(1) Find an orthonormal basis {v_1, ..., v_n} of R^n consisting of eigenvectors of A^T A.
(2) Arrange the eigenvalues of A^T A in decreasing order. The matrix V is [v_1 ⋯ v_n] with the eigenvectors in this order.
(3) The matrix Σ is obtained by placing the r nonzero singular values along the diagonal of D in decreasing order.
(4) Set u_i = Av_i / ‖Av_i‖ for i = 1, ..., r. Extend the orthogonal set {u_1, ..., u_r} to an orthonormal basis {u_1, ..., u_m} of R^m by adding vectors not in the span and applying Gram-Schmidt. The matrix U is [u_1 ⋯ u_m].
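
The four steps above can be carried out numerically. The sketch below assumes NumPy; svd_via_ata is a hypothetical helper written only for illustration, and in practice one would call np.linalg.svd directly. It builds U, Σ, V for the matrix of Example 14:

    import numpy as np

    def svd_via_ata(A, rtol=1e-10):
        # Illustrative SVD built from A^T A, following steps (1)-(4) above.
        # (A helper written for this sketch; in practice use np.linalg.svd.)
        m, n = A.shape
        eigvals, V = np.linalg.eigh(A.T @ A)       # (1) orthonormal eigenvectors of A^T A
        order = np.argsort(eigvals)[::-1]          # (2) decreasing eigenvalue order
        eigvals, V = eigvals[order], V[:, order]
        sigmas = np.sqrt(np.maximum(eigvals, 0.0))
        r = int(np.sum(eigvals > rtol * eigvals[0]))   # numerical rank

        Sigma = np.zeros((m, n))                   # (3) singular values on the diagonal
        Sigma[:r, :r] = np.diag(sigmas[:r])

        U = A @ V[:, :r] / sigmas[:r]              # (4) u_i = A v_i / ||A v_i||
        if r < m:                                  # complete {u_1,...,u_r} to a basis of R^m
            Q, _ = np.linalg.qr(np.hstack([U, np.eye(m)]))
            U = np.hstack([U, Q[:, r:]])
        return U, Sigma, V

    A = np.array([[4., 11., 14.],
                  [8., 7., -2.]])
    U, S, V = svd_via_ata(A)
    print(np.allclose(U @ S @ V.T, A))             # True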

Example 18. Consider A as in Example 14. We have already that
    V = [ 1/3  -2/3   2/3 ]
        [ 2/3  -1/3  -2/3 ]
        [ 2/3   2/3   1/3 ].
The nonzero singular values are σ_1 = √360 = 6√10 and σ_2 = √90 = 3√10, so
    D = [ 6√10    0   ]   and   Σ = [ D  0 ] = [ 6√10    0    0 ]
        [   0   3√10  ]                        [   0   3√10   0 ].
Now
    u_1 = (1/σ_1) Av_1 = (1/(6√10)) (18, 6) = (3/√10, 1/√10)
and
    u_2 = (1/σ_2) Av_2 = (1/(3√10)) (3, -9) = (1/√10, -3/√10).
Note that {u_1, u_2} is already a basis for R^2 and so U = [u_1 u_2]. Now we check that A = UΣV^T.

Example 19. Construct the singular value decomposition of
    A = [ 7  1 ]
        [ 0  0 ]
        [ 5  5 ].
We compute
    A^T A = [ 74  32 ]
            [ 32  26 ].
The eigenvalues of A^T A are λ_1 = 90 and λ_2 = 10 (note that λ_1 > λ_2 > 0). The corresponding eigenvectors are (2, 1) and (-1, 2), respectively. Normalizing gives
    v_1 = (2/√5, 1/√5)   and   v_2 = (-1/√5, 2/√5),   so   V = [ 2/√5  -1/√5 ]
                                                               [ 1/√5   2/√5 ].
The singular values are now σ_1 = √90 = 3√10 and σ_2 = √10. Hence,
    D = [ 3√10   0  ]   and   Σ = [ D ] = [ 3√10   0  ]
        [  0    √10 ]             [ 0 ]   [  0    √10 ]
                                          [  0     0  ].
A has rank 2 and
    u_1 = (1/σ_1) Av_1 = (1/√2, 0, 1/√2)   and   u_2 = (1/σ_2) Av_2 = (-1/√2, 0, 1/√2).
Choose u_3 = (0, 1, 0) so that {u_1, u_2, u_3} is an orthonormal basis of R^3. Then U = [u_1 u_2 u_3] and A = UΣV^T.
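
A short numerical cross-check of Example 19 (illustrative only, assuming NumPy):

    import numpy as np

    A = np.array([[7., 1.],
                  [0., 0.],
                  [5., 5.]])
    print(A.T @ A)            # [[74. 32.] [32. 26.]], with eigenvalues 90 and 10

    V = np.array([[2., -1.],
                  [1., 2.]]) / np.sqrt(5)
    Sigma = np.array([[3*np.sqrt(10), 0.],
                      [0., np.sqrt(10)],
                      [0., 0.]])
    U = np.column_stack([np.array([1., 0., 1.]) / np.sqrt(2),
                         np.array([-1., 0., 1.]) / np.sqrt(2),
                         np.array([0., 1., 0.])])
    print(np.allclose(U @ Sigma @ V.T, A))    # True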

Example 20. Construct the singular value decomposition of
    A = [  1  -1 ]
        [ -2   2 ]
        [  2  -2 ].
We compute
    A^T A = [  9  -9 ]
            [ -9   9 ].
The eigenvalues of A^T A are λ_1 = 18 and λ_2 = 0 with normalized eigenvectors
    v_1 = (1/√2, -1/√2)   and   v_2 = (1/√2, 1/√2),   so   V = [  1/√2  1/√2 ]
                                                               [ -1/√2  1/√2 ].
The singular values are now σ_1 = √18 = 3√2 and σ_2 = 0. Hence,
    Σ = [ 3√2  0 ]
        [  0   0 ]
        [  0   0 ]
and
    u_1 = (1/σ_1) Av_1 = (1/3, -2/3, 2/3).
To find u_2, u_3 such that {u_1, u_2, u_3} is an orthonormal basis of R^3, we find a basis for Nul(u_1^T). Set u_1^T x = 0. Then we have (1/3)x_1 - (2/3)x_2 + (2/3)x_3 = 0, which implies x_1 - 2x_2 + 2x_3 = 0. Hence, the parametric solution is given by
    x = (x_1, x_2, x_3) = (2x_2 - 2x_3, x_2, x_3) = x_2 (2, 1, 0) + x_3 (-2, 0, 1).
Set w_2 = (2, 1, 0) and w_3 = (-2, 0, 1). By construction, w_2 and w_3 are orthogonal to u_1 but not to each other. We apply Gram-Schmidt. Set ũ_2 = w_2. Then
    ũ_3 = w_3 - proj_{Span{ũ_2}} w_3 = w_3 - ((ũ_2 · w_3)/(ũ_2 · ũ_2)) ũ_2 = (-2, 0, 1) - (-4/5)(2, 1, 0) = (-2/5, 4/5, 1).
Normalizing gives
    u_2 = (2/√5, 1/√5, 0)   and   u_3 = (-2/(3√5), 4/(3√5), 5/(3√5)).
Now {u_1, u_2, u_3} is an orthonormal basis of R^3. Then U = [u_1 u_2 u_3] and A = UΣV^T.
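
The basis-completion step of Example 20 can also be checked numerically. The sketch below (illustrative, assuming NumPy) repeats the Gram-Schmidt computation and verifies that A = UΣV^T:

    import numpy as np

    A = np.array([[1., -1.],
                  [-2., 2.],
                  [2., -2.]])

    u1 = np.array([1., -2., 2.]) / 3.          # = (1/sigma_1) A v_1
    w2 = np.array([2., 1., 0.])                # spanning vectors of Nul(u1^T)
    w3 = np.array([-2., 0., 1.])

    u2 = w2 / np.linalg.norm(w2)               # Gram-Schmidt on {w2, w3}
    t3 = w3 - (w3 @ u2) * u2
    u3 = t3 / np.linalg.norm(t3)

    U = np.column_stack([u1, u2, u3])
    Sigma = np.zeros((3, 2))
    Sigma[0, 0] = 3 * np.sqrt(2)
    V = np.array([[1., 1.],
                  [-1., 1.]]) / np.sqrt(2)

    print(np.allclose(U.T @ U, np.eye(3)))     # True: U is orthogonal
    print(np.allclose(U @ Sigma @ V.T, A))     # True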
