Chapter 6: Orthogonality


(Last Updated: November 7, 2017.) These notes are derived primarily from Linear Algebra and its Applications by David Lay (4ed). A few theorems have been moved around.

1. Inner products

We now return to a discussion of the geometry of vectors. There are many applications of the notion of orthogonality, some of which we will discuss. A basic (geometric) question that we will address shortly is the following. Suppose you are given a plane P and a point p (in R^3). What is the distance from p to P? That is, what is the length of the shortest possible line segment that one could draw from p to P?

Definition 1. Let u, v ∈ R^n. The inner product of u and v is defined as

u · v = u^T v = [u_1 ⋯ u_n] (v_1, …, v_n)^T = u_1 v_1 + ⋯ + u_n v_n.

The inner product is also referred to as the dot product. Another product, the cross product, will be discussed at a later time.

Theorem 1. Let u, v, w ∈ R^n and let c ∈ R. Then
(1) u · v = v · u.
(2) (u + v) · w = (u · w) + (v · w).
(3) (cu) · v = c(u · v).
(4) u · u ≥ 0, and u · u = 0 if and only if u = 0.

The inner product is a useful tool to study the geometry of vectors.

Example 2. For a given pair of vectors a, b ∈ R^n, compute a · b and b · a.

Definition 2. The length (or norm) of v ∈ R^n is the non-negative scalar ‖v‖ defined by

‖v‖ = √(v · v) = √(v_1^2 + ⋯ + v_n^2),  so that  ‖v‖^2 = v · v.

A unit vector is a vector of length 1. Note that ‖cv‖ = |c| ‖v‖ for c ∈ R. If v ≠ 0, then u = v/‖v‖ is the unit vector in the same direction as v.

Example 3. Find the lengths of a and b in Example 2 and find their associated unit vectors.
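To make these definitions concrete, here is a small NumPy sketch of the dot product, the norm, and unit vectors. The vectors a and b are invented stand-ins, not the data from Example 2.

```python
import numpy as np

# Illustrative vectors (placeholders, not the data from Example 2).
a = np.array([3.0, -5.0, 1.0])
b = np.array([2.0, 1.0, 4.0])

# Inner (dot) product: a . b = a^T b = a_1 b_1 + ... + a_n b_n.
print(a @ b)            # same as np.dot(a, b)
print(b @ a)            # Theorem 1(1): the dot product is symmetric

# Length (norm): ||a|| = sqrt(a . a).
norm_a = np.sqrt(a @ a)
print(norm_a, np.linalg.norm(a))  # two equivalent computations

# Unit vector in the direction of a: u = a / ||a||.
u = a / norm_a
print(np.linalg.norm(u))          # 1.0 up to rounding
```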

Recall that the distance between two points (a_1, b_1) and (a_2, b_2) in R^2 is determined by the well-known distance formula √((a_1 − a_2)^2 + (b_1 − b_2)^2). We can similarly define distance between vectors.

Definition 3. For u, v ∈ R^n, the distance between u and v, written d(u, v), is the length of the vector u − v. That is, d(u, v) = ‖u − v‖.

Example 4. Find the distance between a and b in Example 2.

Definition 4. Two vectors u, v ∈ R^n are said to be orthogonal (to each other) if u · v = 0.

Orthogonality generalizes the idea of perpendicular lines in R^2. Two lines (represented as vectors u, v) are perpendicular if and only if the distance from u to v equals the distance from u to −v. Now

(d(u, −v))^2 = ‖u − (−v)‖^2 = ‖u + v‖^2 = (u + v) · (u + v) = ‖u‖^2 + 2 u · v + ‖v‖^2,
(d(u, v))^2 = ‖u − v‖^2 = (u − v) · (u − v) = ‖u‖^2 − 2 u · v + ‖v‖^2.

Hence, these two quantities are equal if and only if 2 u · v = −2 u · v. Equivalently, u · v = 0. The next theorem now follows directly.

Theorem 5 (The Pythagorean Theorem). Two vectors u, v ∈ R^n are orthogonal if and only if ‖u + v‖^2 = ‖u‖^2 + ‖v‖^2.

Definition 5. Let W ⊆ R^n be a subspace. The set W^⊥ = {z ∈ R^n : z · w = 0 for all w ∈ W} is called the orthogonal complement of W. Part of your homework will be to show that W^⊥ is a subspace of R^n.

Example 6. Let W be a plane through the origin in R^3 and let L be a line through 0 perpendicular to W. If z ∈ L and w ∈ W are nonzero, then the line segment from 0 to z is perpendicular to the line segment from 0 to w. In fact, L^⊥ = W and W^⊥ = L.

Theorem 7. Let A be an m × n matrix. Then (Row A)^⊥ = Nul A and (Col A)^⊥ = Nul A^T.

Proof. If x ∈ Nul A, then Ax = 0 by definition. Hence, x is perpendicular to each row of A. Since the rows of A span Row A, then x ∈ (Row A)^⊥. Conversely, if x ∈ (Row A)^⊥, then x is orthogonal to each row of A and Ax = 0, so x ∈ Nul A. The proof of the second statement is similar.

Let u, v ∈ R^2 be nonzero and let θ be the angle between them. By the Law of Cosines,

‖u − v‖^2 = ‖u‖^2 + ‖v‖^2 − 2 ‖u‖ ‖v‖ cos θ.

Rearranging gives

‖u‖ ‖v‖ cos θ = (1/2)[ ‖u‖^2 + ‖v‖^2 − ‖u − v‖^2 ]
             = (1/2)[ (u_1^2 + u_2^2) + (v_1^2 + v_2^2) − (u_1 − v_1)^2 − (u_2 − v_2)^2 ]
             = u_1 v_1 + u_2 v_2 = u · v.

Hence, cos θ = (u · v) / (‖u‖ ‖v‖).
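The Pythagorean Theorem and the angle formula are easy to check numerically. In this sketch the vectors are arbitrary illustrative choices.

```python
import numpy as np

u = np.array([1.0, 2.0, 2.0])
v = np.array([-2.0, -2.0, 3.0])   # chosen so that u . v = 0

# Theorem 5: u . v = 0  iff  ||u + v||^2 = ||u||^2 + ||v||^2.
print(u @ v)                                        # 0.0
print(np.linalg.norm(u + v)**2)                     # 26.0
print(np.linalg.norm(u)**2 + np.linalg.norm(v)**2)  # 26.0

# Angle between two nonzero vectors: cos(theta) = u.w / (||u|| ||w||).
w = np.array([1.0, 0.0, 0.0])
cos_theta = (u @ w) / (np.linalg.norm(u) * np.linalg.norm(w))
print(np.degrees(np.arccos(cos_theta)))             # about 70.5 degrees
```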

2. Orthogonal Sets

Definition 6. A set of vectors {u_1, …, u_p} in R^n is said to be orthogonal if u_i · u_j = 0 for all i ≠ j. If, in addition, each u_i is a unit vector, then the set is said to be orthonormal.

Example 8. Show that a given set of vectors is orthogonal. Is it orthonormal? If not, find a set of orthonormal vectors with the same span.

Theorem 9. If S = {u_1, …, u_p} is an orthogonal set of nonzero vectors in R^n, then S is linearly independent and hence a basis for the subspace spanned by S.

Proof. Write 0 = c_1 u_1 + ⋯ + c_p u_p. Then

0 = 0 · u_1 = (c_1 u_1 + ⋯ + c_p u_p) · u_1 = c_1 (u_1 · u_1) + ⋯ + c_p (u_p · u_1) = c_1 (u_1 · u_1).

Since u_1 · u_1 ≠ 0 (because u_1 ≠ 0), then c_1 = 0. Repeating this argument with u_2, …, u_p gives c_2 = ⋯ = c_p = 0. Hence, S is linearly independent.

Let {u_1, …, u_p} be an orthogonal basis for a subspace W of R^n, let y ∈ W, and write y = c_1 u_1 + ⋯ + c_p u_p. Then y · u_i = c_i (u_i · u_i), and so

c_i = (y · u_i) / (u_i · u_i),  i = 1, …, p.

Example 10. Show that a given set S of three vectors is an orthogonal basis of R^3 and express a given vector x as a linear combination of these vectors.

Here is an easier version of the problem hinted at in the beginning of this chapter. Given a point p and a line L (in R^2), what is the distance from p to L? The solution uses orthogonal projections. Let L = Span{u} and let p be given by the vector y. We need to know the length of the vector through y orthogonal to u. By translation, this is equivalent to the length of the vector z = y − ŷ, where ŷ = αu for some scalar α. Then

0 = z · u = (y − αu) · u = y · u − (αu) · u = y · u − α(u · u).

Hence, α = (y · u)/(u · u), and so ŷ = ((y · u)/(u · u)) u. Note that this definition does not change if we replace u by cu for any nonzero scalar c; thus the projection depends only on the line L and not on the choice of u spanning it.
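The weight formula c_i = (y · u_i)/(u_i · u_i) can be tested directly. The orthogonal basis below is an invented one, not the set S of Example 10.

```python
import numpy as np

# An orthogonal (not orthonormal) basis of R^3 -- illustrative choice.
u1 = np.array([1.0, 1.0, 0.0])
u2 = np.array([1.0, -1.0, 0.0])
u3 = np.array([0.0, 0.0, 2.0])
y  = np.array([4.0, -2.0, 6.0])

# Weights c_i = (y . u_i) / (u_i . u_i) for an orthogonal basis.
coeffs = [(y @ u) / (u @ u) for u in (u1, u2, u3)]
print(coeffs)                         # [1.0, 3.0, 3.0]

# Reassemble y from the weights: y = c1 u1 + c2 u2 + c3 u3.
recon = sum(c * u for c, u in zip(coeffs, (u1, u2, u3)))
print(recon)                          # [ 4. -2.  6.]
```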

Definition 7. Given vectors y, u ∈ R^n with u ≠ 0, and L = Span{u}, the orthogonal projection of y onto L is defined as

ŷ = proj_L y = ((y · u)/(u · u)) u.

Note that this gives a decomposition of the vector y as y = ŷ + z, where ŷ ∈ L and z ∈ L^⊥. Hence, every vector in R^n can be written (uniquely) as the sum of an element in L and an element in L^⊥. Since L ∩ L^⊥ = {0}, it follows that dim L^⊥ = n − 1. In the next section we will generalize this to larger subspaces.

Example 11. Compute the orthogonal projection of a given vector y onto the line L through a given nonzero vector u and the origin. Use this to find the distance from y to L.

Definition 8. If W is a subspace of R^n spanned by an orthonormal set S = {u_1, …, u_p}, then we say S is an orthonormal basis of W.

Example 12. The standard basis is an orthonormal basis of R^n.

Theorem 13. An m × n matrix U has orthonormal columns if and only if U^T U = I.

Proof. Write U = [u_1 ⋯ u_n]. The (i, j) entry of U^T U is u_i^T u_j = u_i · u_j, so

U^T U =
[ u_1 · u_1   u_1 · u_2   ⋯   u_1 · u_n ]
[ u_2 · u_1   u_2 · u_2   ⋯   u_2 · u_n ]
[     ⋮           ⋮        ⋱       ⋮    ]
[ u_n · u_1   u_n · u_2   ⋯   u_n · u_n ].

Hence, U^T U = I if and only if u_i · u_i = 1 for all i and u_i · u_j = 0 for all i ≠ j.

Theorem 14. Let U be an m × n matrix with orthonormal columns and let x, y ∈ R^n. Then
(1) ‖Ux‖ = ‖x‖,
(2) (Ux) · (Uy) = x · y,
(3) (Ux) · (Uy) = 0 if and only if x · y = 0.

Proof. We will prove (1). The rest are left as an exercise. Write U = [u_1 ⋯ u_n]. Then

‖Ux‖^2 = (Ux) · (Ux) = (x_1 u_1 + ⋯ + x_n u_n) · (x_1 u_1 + ⋯ + x_n u_n)
       = Σ_{i,j} (x_i u_i) · (x_j u_j) = Σ_{i,j} x_i x_j (u_i · u_j) = Σ_i x_i^2 (u_i · u_i) = Σ_i x_i^2 = ‖x‖^2.
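Here is the projection formula and the resulting point-to-line distance in code; y and u are illustrative values rather than the ones from Example 11, and the last lines check Theorem 13 on a small matrix with orthonormal columns.

```python
import numpy as np

# Project y onto the line L = Span{u}; illustrative vectors.
y = np.array([7.0, 6.0])
u = np.array([4.0, 2.0])

y_hat = (y @ u) / (u @ u) * u     # orthogonal projection of y onto L
z = y - y_hat                     # component of y orthogonal to L

print(y_hat)                      # [8. 4.]
print(z @ u)                      # 0.0: z is orthogonal to u
print(np.linalg.norm(z))          # distance from y to L (sqrt(5))

# Theorem 13: a matrix with orthonormal columns satisfies U^T U = I.
s = 1 / np.sqrt(2)
U = np.array([[s, 0.0], [s, 0.0], [0.0, 1.0]])
print(U.T @ U)                    # 2x2 identity, up to rounding
```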

3. Orthogonal Projections

The next definition generalizes projections onto lines.

Definition 9. Let W be a subspace of R^n with orthogonal basis {u_1, …, u_p}. For y ∈ R^n, the orthogonal projection of y onto W is given by

proj_W y = ((y · u_1)/(u_1 · u_1)) u_1 + ⋯ + ((y · u_p)/(u_p · u_p)) u_p.

This definition matches our previous one when W is 1-dimensional. Note that proj_W y ∈ W because it is a linear combination of basis elements. Also note that the definition simplifies when the basis {u_1, …, u_p} is orthonormal. In this case, if we let U = [u_1 ⋯ u_p], then proj_W y = U U^T y for all y ∈ R^n.

Theorem 15 (Orthogonal Decomposition Theorem). Let W be a subspace of R^n with orthogonal basis {u_1, …, u_p}. Then each y ∈ R^n can be written uniquely in the form y = ŷ + z, where ŷ ∈ W and z ∈ W^⊥. In fact, ŷ = proj_W y and z = y − ŷ.

Proof. Note that if W = {0}, then this theorem is trivial. As noted above, proj_W y ∈ W. We claim z = y − ŷ ∈ W^⊥:

z · u_1 = (y − ŷ) · u_1 = y · u_1 − ŷ · u_1 = y · u_1 − ((y · u_1)/(u_1 · u_1)) (u_1 · u_1) = y · u_1 − y · u_1 = 0.

It is clear that this holds similarly for u_2, …, u_p. By linearity, z · w = 0 for every w ∈ W, so z ∈ W^⊥. To prove uniqueness, let y = w + x be another decomposition with w ∈ W and x ∈ W^⊥. Then w + x = y = ŷ + z, so w − ŷ = z − x. But w − ŷ ∈ W and z − x ∈ W^⊥. Since W ∩ W^⊥ = {0}, then w − ŷ = 0, so w = ŷ. Similarly, z = x.

We will show in the next section that every subspace has an orthogonal basis.

Corollary 16. Let W be a subspace of R^n with orthogonal basis {u_1, …, u_p}. Then y ∈ W if and only if proj_W y = y.

Example 17. Let W = Span{u_1, u_2} for the given orthogonal vectors u_1 and u_2. Write the given vector y as the sum of a vector ŷ ∈ W and a vector z ∈ W^⊥.

Theorem 18 (Best Approximation Theorem). Let W be a subspace of R^n and y ∈ R^n. Then ŷ = proj_W y is the point of W closest to y, in the sense that ‖y − ŷ‖ < ‖y − v‖ for all v ∈ W, v ≠ ŷ.
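A numerical sketch of the Orthogonal Decomposition Theorem using the orthonormal form proj_W y = U U^T y; the plane W and the vector y are arbitrary test data.

```python
import numpy as np

# Orthonormal basis for a plane W in R^3 (illustrative).
u1 = np.array([1.0, 1.0, 0.0]) / np.sqrt(2)
u2 = np.array([0.0, 0.0, 1.0])
U = np.column_stack([u1, u2])

y = np.array([2.0, 4.0, 5.0])

y_hat = U @ U.T @ y        # proj_W y, since the columns of U are orthonormal
z = y - y_hat              # the component of y in W-perp

print(y_hat)               # [3. 3. 5.]
print(z)                   # [-1.  1.  0.]
print(z @ u1, z @ u2)      # both 0: z is orthogonal to W
```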

4. The Gram-Schmidt Process

Orthogonal projections give us a way to find an orthogonal basis for any subspace W of R^n.

Example 19. Let W = Span{x_1, x_2} for given linearly independent vectors x_1, x_2. Construct an orthogonal basis for W.

Let v_1 = x_1 and W_1 = Span{v_1}. It suffices to find a vector v_2 ∈ W orthogonal to W_1. Let p = proj_{W_1} x_2 ∈ W_1. Then x_2 = p + (x_2 − p), where x_2 − p ⊥ W_1. Set

v_2 = x_2 − p = x_2 − ((x_2 · v_1)/(v_1 · v_1)) v_1.

Now v_1 · v_2 = 0 and v_1, v_2 ∈ W. Hence, {v_1, v_2} is a basis for W. Note that if we wanted an orthonormal basis for W, then we could just take the unit vectors associated to v_1 and v_2.

This process can continue. Say W were three-dimensional. We could then let W_2 = Span{v_1, v_2} and find the projection of x_3 onto W_2. We'll prove the next theorem using this idea.

Theorem 20 (The Gram-Schmidt Process). Given a basis {x_1, …, x_p} for a nonzero subspace W ⊆ R^n, define

v_1 = x_1,
v_2 = x_2 − ((x_2 · v_1)/(v_1 · v_1)) v_1,
v_3 = x_3 − ((x_3 · v_1)/(v_1 · v_1)) v_1 − ((x_3 · v_2)/(v_2 · v_2)) v_2,
⋮
v_p = x_p − ((x_p · v_1)/(v_1 · v_1)) v_1 − ((x_p · v_2)/(v_2 · v_2)) v_2 − ⋯ − ((x_p · v_{p−1})/(v_{p−1} · v_{p−1})) v_{p−1}.

Then {v_1, …, v_p} is an orthogonal basis for W. In addition, Span{v_1, …, v_k} = Span{x_1, …, x_k} for all 1 ≤ k ≤ p.

Proof. For 1 ≤ k ≤ p, set W_k = Span{x_1, …, x_k} and V_k = Span{v_1, …, v_k}. Since v_1 = x_1, it (trivially) holds that W_1 = V_1 and {v_1} is orthogonal. Suppose for some k, 1 ≤ k < p, that W_k = V_k and that {v_1, …, v_k} is an orthogonal set. Define

v_{k+1} = x_{k+1} − proj_{W_k} x_{k+1}.

By the Orthogonal Decomposition Theorem, v_{k+1} is orthogonal to W_k. Since x_{k+1} ∈ W_{k+1} and proj_{W_k} x_{k+1} ∈ W_k ⊆ W_{k+1}, then v_{k+1} ∈ W_{k+1}; moreover, v_{k+1} ≠ 0, since otherwise x_{k+1} would lie in W_k = V_k, contradicting linear independence of the x_i. Hence, {v_1, …, v_{k+1}} is an orthogonal set of k + 1 nonzero vectors in W_{k+1}, and hence a basis of W_{k+1}. Hence, W_{k+1} = V_{k+1}. The result now follows by induction.

Example 21. Let W = Span{x_1, x_2, x_3} with the x_i below. Construct an orthogonal basis for W.

x_1 = (1, 1, 1),  x_2 = (0, 1, 1),  x_3 = (0, 0, 1).

Set v_1 = x_1. Then

v_2 = x_2 − ((x_2 · v_1)/(v_1 · v_1)) v_1 = (0, 1, 1) − (2/3)(1, 1, 1) = (−2/3, 1/3, 1/3).

Now,

v_3 = x_3 − ((x_3 · v_1)/(v_1 · v_1)) v_1 − ((x_3 · v_2)/(v_2 · v_2)) v_2
    = (0, 0, 1) − (1/3)(1, 1, 1) − (1/2)(−2/3, 1/3, 1/3) = (0, −1/2, 1/2).

Hence, an orthogonal basis for W is {v_1, v_2, v_3}.
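Theorem 20 translates directly into a short routine. The sketch below is plain classical Gram-Schmidt, checked against Example 21; numerically robust code would use a QR factorization instead.

```python
import numpy as np

def gram_schmidt(X):
    """Return an orthogonal basis for the span of the columns of X.

    Plain classical Gram-Schmidt, following Theorem 20; assumes the
    columns of X are linearly independent.
    """
    V = []
    for x in X.T:
        v = x.astype(float)
        for w in V:
            v = v - (x @ w) / (w @ w) * w   # subtract projection onto w
        V.append(v)
    return np.column_stack(V)

# Columns are x1, x2, x3 from Example 21.
X = np.array([[1, 0, 0],
              [1, 1, 0],
              [1, 1, 1]], dtype=float)
V = gram_schmidt(X)
print(V.round(4))          # columns: (1,1,1), (-2/3,1/3,1/3), (0,-1/2,1/2)
print((V.T @ V).round(4))  # diagonal matrix: the columns are orthogonal
```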

5. Least-squares problems

In data science, one often wants to be able to approximate a set of data by a curve. Possibly, one might hope to construct the line that best fits the data. This is known (by one name) as linear regression. In this section we'll study the linear algebra approach to this problem.

Suppose the system Ax = b is inconsistent. Previously, we gave up all hope of solving such a system, because no solution existed. However, if we give up the idea that we must find an exact solution, and instead focus on finding an approximate solution, then we may have hope.

Definition 10. If A is an m × n matrix and b ∈ R^m, a least-squares solution of Ax = b is a vector x̂ ∈ R^n such that for all x ∈ R^n,

‖b − Ax̂‖ ≤ ‖b − Ax‖.

Geometrically, we think of Ax̂ as the projection of b onto Col A. That is, if b̂ = proj_{Col A} b, then the equation Ax = b̂ is consistent. Let x̂ ∈ R^n be a solution (there may be several). By the Best Approximation Theorem, b̂ is the point of Col A closest to b, and so x̂ is a least-squares solution of Ax = b.

By the Orthogonal Decomposition Theorem, b − b̂ is orthogonal to Col A. Hence, if a_j is any column of A, then a_j · (b − b̂) = 0. That is, a_j^T (b − b̂) = 0. But a_j^T is a row of A^T, and so A^T (b − b̂) = 0. Replacing b̂ with Ax̂ and expanding, we get

A^T A x̂ = A^T b.

The equations corresponding to this system are the normal equations for Ax = b. We have now essentially proven the following theorem.

Theorem 22. The set of least-squares solutions of Ax = b coincides with the nonempty set of solutions of the normal equations A^T A x = A^T b.

Example 23. Find a least-squares solution of a given inconsistent system Ax = b. We use the normal equations: first compute A^T A and A^T b, then solve the equation A^T A x = A^T b, here by inverting A^T A:

x̂ = (A^T A)^{−1} A^T b.

Hence, when A^T A is invertible, the least-squares solution x̂ is unique and x̂ = (A^T A)^{−1} A^T b.
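The normal equations in code. The system below is an invented inconsistent 3 × 2 example (the entries of Example 23 did not survive transcription), and the answer is checked against NumPy's built-in least-squares solver.

```python
import numpy as np

# An inconsistent 3x2 system -- illustrative data.
A = np.array([[4.0, 0.0],
              [0.0, 2.0],
              [1.0, 1.0]])
b = np.array([2.0, 0.0, 11.0])

# Normal equations: A^T A x = A^T b.
x_hat = np.linalg.solve(A.T @ A, A.T @ b)
print(x_hat)                              # [1. 2.]

# Same answer from the library routine.
print(np.linalg.lstsq(A, b, rcond=None)[0])

# The residual b - A x_hat is orthogonal to Col A.
print(A.T @ (b - A @ x_hat))              # ~ [0. 0.]
```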

As an application of this, we'll see how to fit data to a line using least-squares. To match notation commonly used in statistical analysis, we denote the equation Ax = b by Xβ = y. The matrix X is referred to as the design matrix, β as the parameter vector, and y as the observation vector.

Suppose we have a set of data points (x_1, y_1), (x_2, y_2), …, (x_n, y_n), perhaps from some experiments. We would like to model this data by a line, so as to predict outcomes that did not appear in our experiment. Say this line is written y = β_0 + β_1 x. The residual of a point (x_i, y_i) is the vertical distance from that point to the line. The least-squares line is the line that minimizes the sum of the squares of the residuals.

Suppose the data all lay on the line. Then the points would all satisfy

β_0 + β_1 x_1 = y_1,
β_0 + β_1 x_2 = y_2,
⋮
β_0 + β_1 x_n = y_n.

We could write this system as Xβ = y, where

X =
[ 1  x_1 ]
[ 1  x_2 ]
[ ⋮   ⋮  ]
[ 1  x_n ],
β = (β_0, β_1)^T,  y = (y_1, …, y_n)^T.

If the data does not lie on the line (and this is likely), then we want the vector β to be the least-squares solution of Xβ = y, that is, the vector minimizing the distance between Xβ and y.

Example 24. Find the equation y = β_0 + β_1 x of the least-squares line that best fits a given set of data points. We build the matrix X and vector y from the data as above. For the least-squares solution of Xβ = y, we have the normal equations X^T X β = X^T y. Hence,

(β_0, β_1)^T = (X^T X)^{−1} X^T y.
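Fitting a least-squares line via the design matrix; the data points are invented for illustration, since the ones in Example 24 were garbled in transcription.

```python
import numpy as np

# Invented data points (x_i, y_i).
x = np.array([2.0, 5.0, 7.0, 8.0])
y = np.array([1.0, 2.0, 3.0, 3.0])

# Design matrix: a column of ones and a column of the x-values.
X = np.column_stack([np.ones_like(x), x])

# beta = (X^T X)^{-1} X^T y via the normal equations.
beta = np.linalg.solve(X.T @ X, X.T @ y)
b0, b1 = beta
print(f"least-squares line: y = {b0:.4f} + {b1:.4f} x")

# Sum of squared residuals, the quantity the line minimizes.
print(np.sum((y - X @ beta) ** 2))
```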

7. Diagonalization of Symmetric Matrices

We have seen already that it can be quite time-intensive to determine whether a matrix is diagonalizable. We'll see that there are certain cases when a matrix is always diagonalizable.

Definition 11. A matrix A is symmetric if A^T = A.

Example 25. Let

A =
[  3  −2   4 ]
[ −2   6   2 ]
[  4   2   3 ].

Note that A^T = A, so A is symmetric. The characteristic polynomial of A is χ_A(t) = (t + 2)(t − 7)^2, so the eigenvalues are −2 and 7. The corresponding eigenspaces have bases

λ = −2: { (−1, −1/2, 1) },   λ = 7: { (1, 0, 1), (−1/2, 1, 0) }.

Hence, A is diagonalizable. Now we use Gram-Schmidt to find an orthogonal basis for R^3. Note that the eigenvector for λ = −2 is already orthogonal to both eigenvectors for λ = 7, so we only need to apply Gram-Schmidt within the λ = 7 eigenspace:

v_1 = (1, 0, 1),   v_2 = (−1/2, 1, 0) − ((−1/2)/2) v_1 = (−1/4, 1, 1/4),   v_3 = (−1, −1/2, 1).

Finally, we normalize each vector:

u_1 = (1/√2, 0, 1/√2),   u_2 = (−1/√18, 4/√18, 1/√18),   u_3 = (−2/3, −1/3, 2/3).

Now the matrix U = [u_1 u_2 u_3] is orthogonal, and so U^T U = I.

Theorem 26. If A is symmetric, then any two eigenvectors from different eigenspaces are orthogonal.

Proof. Let v_1, v_2 be eigenvectors for A with corresponding eigenvalues λ_1, λ_2, λ_1 ≠ λ_2. Then

λ_1 (v_1 · v_2) = (λ_1 v_1)^T v_2 = (A v_1)^T v_2 = v_1^T A^T v_2 = v_1^T A v_2 = v_1^T (λ_2 v_2) = λ_2 (v_1 · v_2).

Hence, (λ_1 − λ_2)(v_1 · v_2) = 0. Since λ_1 ≠ λ_2, we must have v_1 · v_2 = 0.

Based on the previous theorem, we say that the eigenspaces of A are mutually orthogonal.

Definition 12. An n × n matrix A is orthogonally diagonalizable if there exist an orthogonal n × n matrix P and a diagonal matrix D such that A = P D P^T.

Theorem 27. If A is orthogonally diagonalizable, then A is symmetric.
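NumPy's eigh routine is designed for symmetric (Hermitian) matrices: it returns real eigenvalues and orthonormal eigenvectors, numerically confirming Theorem 26 and foreshadowing the Spectral Theorem below. A check on the matrix of Example 25:

```python
import numpy as np

A = np.array([[ 3.0, -2.0, 4.0],
              [-2.0,  6.0, 2.0],
              [ 4.0,  2.0, 3.0]])

# eigh is specialized to symmetric matrices.
eigvals, U = np.linalg.eigh(A)
print(eigvals)                 # [-2.  7.  7.] up to rounding

# The eigenvector matrix is orthogonal: U^T U = I ...
print((U.T @ U).round(8))

# ... and it orthogonally diagonalizes A: U^T A U = D.
print((U.T @ A @ U).round(8))
```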

Proof. Since A is orthogonally diagonalizable, then A = P D P^T for some orthogonal matrix P and diagonal matrix D. A is symmetric because

A^T = (P D P^T)^T = (P^T)^T D^T P^T = P D P^T = A.

It turns out the converse of the above theorem is also true! The set of eigenvalues of a matrix A is called the spectrum of A and is denoted σ_A.

Theorem 28 (The Spectral Theorem for symmetric matrices). Let A be a (real) n × n symmetric matrix. Then the following hold.
(1) A has n real eigenvalues, counting multiplicities.
(2) For each eigenvalue λ of A, geomult_λ(A) = algmult_λ(A).
(3) The eigenspaces are mutually orthogonal.
(4) A is orthogonally diagonalizable.

Proof. Every eigenvalue of a symmetric matrix is real (showing this is one of the problems on the extra credit homework assignment). The second part of (1), as well as (2), are immediate consequences of (4). We proved (3) in Theorem 26. Note that (4) is trivial when A has n distinct eigenvalues, by (3). We prove (4) by induction on n. Clearly the result holds when A is 1 × 1. Assume all (n − 1) × (n − 1) symmetric matrices are orthogonally diagonalizable.

Let A be n × n, let λ_1 be an eigenvalue of A, and let u_1 be a (unit) eigenvector for λ_1. By the Gram-Schmidt process, we may extend u_1 to an orthonormal basis {u_1, u_2, …, u_n} for R^n, so that {u_2, …, u_n} is an orthonormal basis for W = (Span{u_1})^⊥. Set U = [u_1 u_2 ⋯ u_n]. Then

U^T A U =
[ u_1^T A u_1  ⋯  u_1^T A u_n ]
[      ⋮        ⋱       ⋮     ]
[ u_n^T A u_1  ⋯  u_n^T A u_n ]
=
[ λ_1  ∗ ]
[  0   B ].

The first column is as indicated because u_i^T A u_1 = u_i^T (λ_1 u_1) = λ_1 (u_i · u_1) = λ_1 δ_{i1}. As U^T A U is symmetric, ∗ = 0 and B is a symmetric (n − 1) × (n − 1) matrix, which by the inductive hypothesis is orthogonally diagonalizable, say with eigenvalues λ_2, …, λ_n. Because A and U^T A U are similar, the eigenvalues of A are λ_1, λ_2, …, λ_n. Since B is orthogonally diagonalizable, there exists an orthogonal matrix Q such that Q^T B Q = D, where the diagonal entries of D are λ_2, …, λ_n. Now

[ 1  0 ]^T [ λ_1  0 ] [ 1  0 ]   [ λ_1     0     ]   [ λ_1  0 ]
[ 0  Q ]   [  0   B ] [ 0  Q ] = [  0   Q^T B Q  ] = [  0   D ].

Note that the block matrix [ 1 0 ; 0 Q ] is orthogonal. Set V = U [ 1 0 ; 0 Q ]. As the product of orthogonal matrices is orthogonal, V is itself orthogonal, and V^T A V is diagonal.

Suppose A is orthogonally diagonalizable, so A = U D U^T, where U = [u_1 ⋯ u_n] and D is the diagonal matrix whose diagonal entries are the eigenvalues λ_1, …, λ_n of A. Then

A = U D U^T = λ_1 u_1 u_1^T + ⋯ + λ_n u_n u_n^T.

This is known as the spectral decomposition of A. Each u_i u_i^T is called a projection matrix because (u_i u_i^T) x is the projection of x onto Span{u_i}.

Example 29. Construct a spectral decomposition of the matrix A in Example 25. Recall that

A =
[  3  −2   4 ]
[ −2   6   2 ]
[  4   2   3 ]

and our orthonormal eigenvector basis of R^3 was

u_1 = (1/√2, 0, 1/√2),   u_2 = (−1/√18, 4/√18, 1/√18),   u_3 = (−2/3, −1/3, 2/3).

Setting U = [u_1 u_2 u_3] gives U^T A U = D = diag(7, 7, −2). The projection matrices are

u_1 u_1^T =
[ 1/2  0  1/2 ]
[  0   0   0  ]
[ 1/2  0  1/2 ],

u_2 u_2^T =
[  1/18  −2/9  −1/18 ]
[ −2/9    8/9    2/9  ]
[ −1/18   2/9    1/18 ],

u_3 u_3^T =
[  4/9   2/9  −4/9 ]
[  2/9   1/9  −2/9 ]
[ −4/9  −2/9   4/9 ].

The spectral decomposition is

A = 7 u_1 u_1^T + 7 u_2 u_2^T − 2 u_3 u_3^T.
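The spectral decomposition is a one-line sum once the orthonormal eigenvectors are in hand; this sketch rebuilds the matrix A of Examples 25 and 29 from its eigenvalues and projection matrices.

```python
import numpy as np

A = np.array([[ 3.0, -2.0, 4.0],
              [-2.0,  6.0, 2.0],
              [ 4.0,  2.0, 3.0]])

eigvals, U = np.linalg.eigh(A)

# A = sum_i lambda_i u_i u_i^T  (spectral decomposition).
recon = sum(lam * np.outer(u, u) for lam, u in zip(eigvals, U.T))
print(np.allclose(recon, A))      # True

# Each u_i u_i^T is a projection matrix onto Span{u_i}.
P1 = np.outer(U[:, 0], U[:, 0])
print(np.allclose(P1 @ P1, P1))   # idempotent: P^2 = P
```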
