Solomon Webster
Math 2040 Chapter 6: Orthogonality and Least Squares

6.1 and some of 6.7: Inner Product, Length and Orthogonality.

Definition: If x, y ∈ R^n, then x · y = x_1 y_1 + ... + x_n y_n is the dot product of x and y.

Typical Problem: Compute [2, 4]^T · [5, 3]^T.

Important: The dot product of x and y is a scalar, NOT a vector.

Remark: The dot product is mysterious and beautiful and powerful.

The matrix product form of the dot product: When x is viewed as an n × 1 matrix, we get x · y = x^T y.

Theorem 1 (page 376): The dot product is a commutative, bilinear, positive definite operation. That is,
1. (Commutativity) x · y = y · x,
2. (Bilinearity) (αx + βy) · z = α(x · z) + β(y · z) and x · (αy + βz) = α(x · y) + β(x · z), and
3. (Positive definiteness) x · x > 0 if x ≠ 0.

Remark: x · x = x_1^2 + ... + x_n^2, so, always, x · x ≥ 0. Also, x · x = 0 implies x_1^2 + ... + x_n^2 = 0, in which case x = 0. Thus, the positive definite property is equivalent to the property that always x · x ≥ 0, and x · x = 0 iff x = 0. Finally, it is worth noting that, in the presence of commutativity, the bilinearity property is equivalent to the two properties
(Distributivity of dot product over addition) x · (y + z) = x · y + x · z, and
(Scalars float) (αx) · y = α(x · y) = x · (αy).

Abstraction: The abstraction of the above properties yields the idea of an inner product on a vector space. Thus, we define an inner product space to be a vector space V together with the assignment to each u, v ∈ V of a scalar <u, v>, called the inner product of u and v, such that for all u, v, w ∈ V and α, β ∈ R, the following conditions hold.
1. (Commutativity) <u, v> = <v, u>,
2. (Bilinearity) <αu + βv, w> = α<u, w> + β<v, w> and <u, αv + βw> = α<u, v> + β<u, w>, and
3. (Positive definiteness) <u, u> is positive if u ≠ 0.

Remark: The positive definiteness property is equivalent to the condition that <u, u> is nonnegative, and <u, u> = 0 iff u = 0. In the presence of inner product commutativity, the bilinearity property of the inner product is equivalent to the two properties
(Distributivity of inner product over addition) <u, v + w> = <u, v> + <u, w>, and
(Scalars float) <αu, v> = α<u, v> = <u, αv>.

Typical Problems: Establish each of the following.
1. R^n together with the dot product is an example of an inner product space.
2. Any polynomial p(t) of degree at most n is completely determined by its values p(t_0), p(t_1), ..., p(t_n) at n + 1 distinct points t_0, t_1, ..., t_n in R. Given this, it is an easy matter to show that <p(t), q(t)> = p(t_0)q(t_0) + p(t_1)q(t_1) + ... + p(t_n)q(t_n) is an inner product on P_n whenever t_0, t_1, ..., t_n are n + 1 distinct points in R.
3. Recall C[a, b] denotes the vector space of real valued continuous functions defined on the interval [a, b] = {x | a ≤ x ≤ b} in R. C[a, b] becomes an inner product space when the inner product is given by <f, g> = ∫_a^b f(x)g(x) dx for each f, g ∈ C[a, b].

Convention: Throughout what follows, unless otherwise specified, R^n is considered to be an inner product space with the inner product equal to the dot product.

Distance: The (Euclidean) distance from x to y in R^n is the nonnegative square root of (x_1 − y_1)^2 + ... + (x_n − y_n)^2 (i.e., +sqrt((x_1 − y_1)^2 + ... + (x_n − y_n)^2)).

Typical Problem: With x = [2, 4]^T and y = [5, 3]^T, find the distance from x to y in R^2.

Norm: The norm or length of x ∈ R^n is the distance from x to 0. Thus, the norm of x ∈ R^n is sqrt(x_1^2 + ... + x_n^2).

Typical Problem: With x = [2, 4]^T in R^2, find the length (or norm) of x.

Proposition: The distance from x to y in R^n is sqrt((x − y) · (x − y)) and the norm of x is sqrt(x · x).

Notation: The distance from x to y is denoted ||x − y||. The norm of x is denoted ||x||.

Unit Vectors: A vector x is a unit vector if it has length 1. If x ≠ 0, then x/||x|| is a unit vector in the direction of x.

Typical Problem: With x = [2, 4]^T in R^2, find a unit vector in the direction of x.

Abstraction: The distance in an inner product space V from u to v is defined by ||u − v|| = sqrt(<u − v, u − v>) and the norm of u ∈ V is defined to be ||u|| = sqrt(<u, u>).
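The typical problems above (dot product, distance, norm, and unit vector for x = [2, 4]^T and y = [5, 3]^T) can be checked with a short Python sketch; the helper names `dot`, `norm`, and `distance` are my own, not from the notes:

```python
import math

def dot(x, y):
    # x . y = x_1 y_1 + ... + x_n y_n  (a scalar, not a vector)
    return sum(a * b for a, b in zip(x, y))

def norm(x):
    # ||x|| = sqrt(x . x)
    return math.sqrt(dot(x, x))

def distance(x, y):
    # ||x - y||
    return norm([a - b for a, b in zip(x, y)])

x, y = [2, 4], [5, 3]
print(dot(x, y))                 # 2*5 + 4*3 = 22
print(distance(x, y))            # sqrt((2-5)^2 + (4-3)^2) = sqrt(10)
print(norm(x))                   # sqrt(2^2 + 4^2) = sqrt(20)
unit = [a / norm(x) for a in x]  # unit vector in the direction of x
print(norm(unit))                # ≈ 1.0
```

Note that `dot(x, y) == dot(y, x)` and `dot(x, x) > 0` for `x != 0`, matching the commutativity and positive definiteness properties of Theorem 1.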
Proposition: For any vectors u, v, and w in an inner product space V, each of the following properties holds.
1. (Symmetry) The distance from u to v is the distance from v to u; that is, ||u − v|| = ||v − u||.
2. (Degeneracy) The distance from u to v is zero if and only if u = v; that is, ||u − v|| = 0 iff u = v.
3. (Triangle Inequality) The distance from u to w is less than or equal to the distance from u to v plus the distance from v to w; that is, ||u − w|| ≤ ||u − v|| + ||v − w||.

Remark: The properties above allow further abstraction to more general spaces, giving us the important ideas of metrics, metric spaces, normed spaces, normed linear spaces, Banach spaces, etc.

Typical Problems: On P_2, consider the inner product <p(t), q(t)> = p(−1)q(−1) + p(0)q(0) + p(1)q(1) for each p(t), q(t) ∈ P_2. Let p(t) = 1 + t^2 and q(t) = 2 − 2t + t^2.
1. Compute <p(t), q(t)>.
2. What is the length of p(t)?
3. What is the distance from p(t) to q(t)?
4. Find a unit vector in the direction of p(t).

Proposition: Let u and v be vectors in an inner product space V. Then
1. ||u − v||^2 = <u − v, u − v>,
2. ||u|| = ||u − 0|| (i.e., the norm of u is its distance from the origin),
3. ||u||^2 = <u, u>,
4. ||u − v||^2 = ||u||^2 − 2<u, v> + ||v||^2, and
5. ||u + v||^2 = ||u||^2 + 2<u, v> + ||v||^2.

Pythagoras Theorem: <u, v> = 0 if and only if ||u + v||^2 = ||u||^2 + ||v||^2 in any inner product space V.

Proof: Examine item 5 in the last proposition above.

Definition: In an inner product space, vectors u and v are called orthogonal or perpendicular vectors if and only if <u, v> = 0. When u and v are orthogonal, we write u ⊥ v.

Typical problems:
1. Determine if u ⊥ v for given vectors u and v in R^n.
2. On P_2, consider the inner product <p(t), q(t)> = p(−1)q(−1) + p(0)q(0) + p(1)q(1) for each p(t), q(t) ∈ P_2. Let p(t) = 1 + t^2 and q(t) = 2 − 2t + t^2. Determine if p(t) ⊥ q(t).
3. Prove 0 ⊥ u for any vector u in an inner product space V.
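The typical problems on P_2 just listed can be verified numerically. A Python sketch of this evaluation inner product (the function names are my own):

```python
import math

def ip(p, q, nodes=(-1, 0, 1)):
    # <p, q> = p(-1)q(-1) + p(0)q(0) + p(1)q(1): evaluation inner product on P_2
    return sum(p(t) * q(t) for t in nodes)

p = lambda t: 1 + t**2
q = lambda t: 2 - 2*t + t**2

print(ip(p, q))                 # 2*5 + 1*2 + 2*1 = 14
print(math.sqrt(ip(p, p)))      # ||p|| = sqrt(4 + 1 + 4) = 3.0
d = lambda t: p(t) - q(t)       # (p - q)(t) = -1 + 2t
print(math.sqrt(ip(d, d)))      # distance from p to q = sqrt(9 + 1 + 1) = sqrt(11)
```

Since ||p(t)|| = 3, a unit vector in the direction of p(t) is p(t)/3 = (1 + t^2)/3.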
Proposition: If W is a subspace of an inner product space V, then {u | u ⊥ w for all w ∈ W} is also a subspace of V.

Definition: If W is a subspace of an inner product space V, then {u | u ⊥ w for all w ∈ W} is denoted by W^⊥ (spoken "W perp"), and called the orthogonal complement of W.

Proposition: If W is a subspace of an inner product space V, then W ∩ W^⊥ = {0}.

Proposition: If W = Span{w_1, ..., w_k} in an inner product space V, then u ∈ W^⊥ if and only if u ⊥ w_i for all i = 1, ..., k.

Theorem 3 (page 381) Generalized: Let A be an m × n matrix. Then
(1) Col(A^T) and Nul(A) are each other's orthogonal complements in R^n, and
(2) Col(A) and Nul(A^T) are each other's orthogonal complements in R^m.

Proof: See the Appendix below.

Sections 6.2 through 6.4 via 6.7: Orthogonality and Gram-Schmidt.

Definition: A set of vectors which are pairwise orthogonal is called an orthogonal set. An orthogonal set which is also a basis of an inner product space is called an orthogonal basis. If all of the vectors in an orthogonal set are unit vectors, then the set is called orthonormal.

Warning: A matrix whose columns form an orthonormal set is called an orthogonal matrix, NOT an orthonormal matrix. There is no such thing as an orthonormal matrix. This is a historical accident. Also, it is more usual to speak of n × n orthogonal matrices than of n × m ones.

Linear Independence of Orthogonal Sets: If {b_1, ..., b_p} is an orthogonal set of nonzero vectors in an inner product space V, then it is linearly independent. This is because 0 = α_1 b_1 + ... + α_i b_i + ... + α_p b_p implies
<b_i, 0> = <b_i, α_1 b_1 + ... + α_i b_i + ... + α_p b_p> (since 0 = α_1 b_1 + ... + α_i b_i + ... + α_p b_p)
= α_1 <b_i, b_1> + ... + α_i <b_i, b_i> + ... + α_p <b_i, b_p> (by bilinearity)
= α_i <b_i, b_i> (since <b_i, b_j> = 0 for all i ≠ j and <b_i, b_i> ≠ 0),
so α_i = <b_i, 0> / <b_i, b_i> = 0 (since <b_i, 0> = 0) for all i.

Representations Using an Orthogonal Basis: An orthogonal basis provides a useful representation of an inner product space because coordinates are easily computed.
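The coordinate formula derived just below can be sketched in a few lines of Python; the function name and the example orthogonal basis are my own choices:

```python
def dot(x, y):
    return sum(a * b for a, b in zip(x, y))

def coords_in_orthogonal_basis(x, basis):
    # [x]_B for an orthogonal basis B: the i-th coordinate is <b_i, x> / <b_i, b_i>
    return [dot(b, x) / dot(b, b) for b in basis]

# Illustrative orthogonal basis of R^2:
B = [[1, 1], [1, -1]]
print(coords_in_orthogonal_basis([3, 5], B))  # [4.0, -1.0]: [3,5] = 4[1,1] - 1[1,-1]
```

No linear system needs to be solved: each coordinate comes from two inner products, which is exactly the point of the discussion that follows.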
To see this, let B = {b_1, ..., b_n} be an orthogonal basis of an inner product space V and let x ∈ V. Generally, [x]_B = P_B^{-1}(x) is difficult to compute, but not in this case. Write x = α_1 b_1 + ... + α_i b_i + ... + α_n b_n.
Then, as just above (but using x instead of 0), <b_i, x> = α_i <b_i, b_i>, so α_i = <b_i, x> / <b_i, b_i> for each i. Therefore,
[x]_B = P_B^{-1}(x) = [ <b_1, x>/<b_1, b_1>, ..., <b_i, x>/<b_i, b_i>, ..., <b_n, x>/<b_n, b_n> ]^T.

Testing Orthogonality in R^n: In R^n, {v_1, ..., v_p} is an orthogonal set if and only if A^T A is a diagonal matrix, where A = [v_1 ... v_p], because if D = A^T A then d_{i,j} = v_i^T v_j = v_i · v_j for each i and j. Also, in R^n, {v_1, ..., v_p} is an orthonormal set if and only if A^T A = I_p. It is worth noting here, for the sake of reducing the number of computations, that regardless of whether D = A^T A is diagonal or not, it is symmetric (that is, D^T = D), so to determine if D is diagonal, one only needs to check the entries strictly below its diagonal (if any are nonzero then orthogonality fails). For orthonormality, one must check the unity of the diagonal elements as well. Finally, after repeating the warning that there is no such thing as an orthonormal matrix, we notice that an n × p matrix A is orthogonal iff A^T A = I_p, in which case, we see that an n × n matrix A is orthogonal iff A^T = A^{-1}. Thus, {v_1, ..., v_n} is an orthonormal basis of R^n iff A = [v_1 ... v_n] is an orthogonal matrix.

Typical problems:
1. Show the standard basis {e_1, ..., e_n} of R^n is orthonormal.
2. Is {[1, 2, 2, 3]^T, [2, 1, 6, 4]^T} an orthogonal set in R^4? Orthonormal?
3. Is {[1, 2, 2, 3]^T, [2, 1, 6, 4]^T, [2, 3, 1, 2]^T} an orthogonal set in R^4?
4. Let P_1 have inner product defined by evaluation at t_1 = −1 and t_2 = 1 (that is, <p(t), q(t)> = p(−1)q(−1) + p(1)q(1)). Is B = {1 + t, 1 − t} orthogonal? An orthogonal basis? An orthonormal basis? How about S = {1, t}?
5. Repeat the last item but with t_1 = 0 and t_2 = 2.
6. Compute [2 + 3t]_B for B = {1 + t, 1 − t} in P_1 with inner product given by evaluation at t_1 = −1 and t_2 = 1.

The Orthogonal Projection Theorem: Let B = {b_1, ..., b_p} be an orthogonal set of nonzero vectors in an inner product space V.
Then for any y ∈ V, there is a unique vector
ŷ = (<y, b_1>/<b_1, b_1>) b_1 + ... + (<y, b_p>/<b_p, b_p>) b_p
in W = Span(B) (called the orthogonal projection of y onto W) and a unique vector z = y − ŷ in W^⊥ such that y = ŷ + z.
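The projection formula in the theorem can be sketched directly; the helper names and the example vectors are my own, and the basis passed in must already be an orthogonal set of nonzero vectors, as the theorem requires:

```python
def dot(x, y):
    return sum(a * b for a, b in zip(x, y))

def proj(y, basis):
    """Orthogonal projection of y onto Span(basis): the sum of
    (<y, b_i>/<b_i, b_i>) b_i over the orthogonal basis vectors b_i."""
    yhat = [0.0] * len(y)
    for b in basis:
        c = dot(y, b) / dot(b, b)
        yhat = [yh + c * bi for yh, bi in zip(yhat, b)]
    return yhat

# Illustrative orthogonal set in R^3:
b1, b2 = [1, 1, 0], [1, -1, 0]
y = [3, 5, 7]
yhat = proj(y, [b1, b2])
z = [yi - yh for yi, yh in zip(y, yhat)]
print(yhat, z)   # yhat lies in W = Span{b1, b2}; z = y - yhat lies in W-perp
```

Here `z` is orthogonal to both b1 and b2, illustrating the decomposition y = ŷ + z of the theorem.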
Proof: Obviously, the conclusion y = ŷ + z follows from the equation z = y − ŷ defining z. The existence and uniqueness of z are guaranteed by the existence and uniqueness of ŷ, respectively, because z = y − ŷ. ŷ exists because it is given by a computable formula. That ŷ ∈ W is obvious. The uniqueness of ŷ and the fact that z ∈ W^⊥ will be established in class.

Typical problems:
1. Find the orthogonal projection of [2, 0, 1, 2]^T onto the subspace of R^4 spanned by {[1, 2, 2, 3]^T, [2, 1, 6, 4]^T}.
2. Use an orthogonal projection to extend the orthogonal set {[1, 2, 2]^T, [0, 1, 1]^T} to an orthogonal basis of R^3.
3. Let P_1 have inner product defined by evaluation at t_1 = −1 and t_2 = 1. Find the orthogonal projection of 2 + 3t onto Span{p(t)} in P_1 where p(t) = 1 − t.
4. Repeat the last item except use t_1 = 2 and t_2 = 1.

Lecture Target: This is the optimal spot to end the lecture of Thursday, August 3rd.

Cauchy-Schwarz Inequality: For any vectors u and v in an inner product space, |<u, v>| ≤ ||u|| ||v||. The elegant but tricky proof of this inequality is given in the textbook on page 432. This inequality is the key to the metric triangle property mentioned above (also, see page 433 of the text).

Alternative Notation: The orthogonal projection ŷ of y onto the subspace W is written on occasion Proj_W y; that is, ŷ = Proj_W y.

The Best Approximation Theorem: For any subspace W = Span(B) and y ∈ V in a finite dimensional inner product space, Proj_W y is the best approximation to y by a vector in W; that is, ||y − Proj_W y|| ≤ ||y − v|| for all v ∈ W.

Proof: To be done in class.

Typical Problems:
1. Find the best approximation to [2, 0, 1, 2]^T in W = Span{[1, 2, 2, 3]^T, [2, 1, 6, 4]^T}.
2. Let P_1 have inner product defined by evaluation at t_1 = −1 and t_2 = 1. Find the best approximation to 2 + 3t in Span{p(t)} in P_1 where p(t) = 1 − t.

The Gram-Schmidt Process: Let {b_1, ..., b_p} be a basis of a subspace W in an inner product space V.
Let v_1 = b_1 and for 1 < i ≤ p let v_i = b_i − Proj_{W_i} b_i where W_i = Span({v_1, ..., v_{i−1}}). Then {v_1, ..., v_p} is an orthogonal basis of W. Upon normalizing each v_i, we get an orthonormal basis {v_1/||v_1||, ..., v_p/||v_p||} of W.

Typical problems:
1. With B = {[0, 1, 1, 1]^T, [1, 1, 1, 0]^T, [1, 0, 1, 1]^T}, find the orthogonal projection of [0, 1, 1, 0]^T onto Span(B) in R^4.
2. Let P_2 have inner product defined by evaluation at t_1 = −1, t_2 = 0 and t_3 = 1. Find an orthonormal basis for P_2.

The QR Factorization: Let A be an m × n matrix with linearly independent columns. Let Q be the matrix of columns formed by applying the Gram-Schmidt process to the columns of A and normalizing each of the resulting columns. Then Q is an orthogonal matrix. Let R = Q^T A. Since Q is orthogonal, we have A = QR. Moreover, R is n × n, upper triangular, invertible and has positive entries on its diagonal.

Typical Problem: Find the QR factorization of a given matrix A.

Least Squares: If A is an m × n matrix and b ∈ R^m, then a least squares solution of Ax = b is a vector x̂ ∈ R^n such that ||b − Ax̂|| ≤ ||b − Ax|| for all x ∈ R^n. Clearly, the set of least squares solutions of Ax = b is the set of solutions of Ax̂ = b̂ where b̂ = Proj_{Col(A)} b. Now b − b̂ = b − Ax̂, so b − Ax̂ lies in the orthogonal complement of Col(A). This orthogonal complement is Nul(A^T), so A^T(b − Ax̂) = 0. Thus, we seek x̂ such that A^T A x̂ = A^T b. The system of equations A^T A x̂ = A^T b is called the system of normal equations for x̂ in the least squares problem Ax = b. When A has linearly independent columns, A^T A is invertible and x̂ = (A^T A)^{-1} A^T b. From this we get a matrix product description of orthogonal projection, for b̂ = Ax̂ = A(A^T A)^{-1} A^T b.

The invertibility of A^T A mentioned above is not obvious. The proof that A^T A is invertible when A has linearly independent columns proceeds by noticing Col(A^T) = R^n (since Nul(A) = {0}, and since, as is shown in the Appendix below, Col(A^T) is the orthogonal complement of Nul(A)). Therefore, A^T is onto. It will follow from this that A^T A is also onto. To see this, let x ∈ R^n. Since A^T is onto, there is y ∈ R^m such that A^T y = x. Let ŷ be the orthogonal projection of y onto Col(A) in R^m and let z = y − ŷ. Then z ∈ Nul(A^T) because, again by the Appendix below, Nul(A^T) is the orthogonal complement of Col(A) in R^m. Therefore A^T z = 0 and y = z + ŷ.
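The Gram-Schmidt construction and the factorization R = Q^T A above can be sketched on a small example; the function name and the example matrix are my own choices, and the q's are stored as lists (the columns of Q):

```python
import math

def dot(x, y):
    return sum(a * b for a, b in zip(x, y))

def gram_schmidt_qr(cols):
    """Gram-Schmidt on the columns a_1, ..., a_n of A, normalizing as we go,
    followed by R = Q^T A."""
    qs = []
    for a in cols:
        v = list(map(float, a))
        for q in qs:
            c = dot(a, q)                    # coefficient of the projection onto q
            v = [vi - c * qi for vi, qi in zip(v, q)]
        nv = math.sqrt(dot(v, v))
        qs.append([vi / nv for vi in v])     # normalize v_i to get q_i
    R = [[dot(q, a) for a in cols] for q in qs]   # upper triangular
    return qs, R

cols = [[1, 1, 0], [1, 0, 1]]                # columns of A (illustrative)
qs, R = gram_schmidt_qr(cols)
print(R)                                     # R[1][0] is 0 up to rounding
```

The subdiagonal entry of R vanishes because each q_i is orthogonal to the earlier columns' span, which is exactly why R comes out upper triangular.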
Now, since ŷ ∈ Col(A), there is a vector v ∈ R^n such that Av = ŷ. We have
A^T A v = A^T ŷ = 0 + A^T ŷ = A^T z + A^T ŷ = A^T(z + ŷ) = A^T y = x.

Therefore, as claimed, A^T A is an onto linear transformation. But A^T A is an n × n matrix, so, by the Invertible Matrix Theorem, A^T A is invertible (because it is onto). This argument provides an inverse statement to the one given in the assigned exercise.

Typical Problem: Let A have columns a_1 = [0, 1, 1, 1]^T, a_2 = [1, 1, 1, 0]^T, and a_3 = [1, 0, 1, 1]^T. Use least squares to find the orthogonal projection of [0, 1, 1, 0]^T onto Col(A) in R^4.

Least Squares Lines: In R^n, let 1 = [1, 1, ..., 1]^T be the column vector all of whose coordinates are 1. Given x, y ∈ R^n, the least squares line is given by β_0, β_1 ∈ R such that ŷ = β_0 1 + β_1 x where ŷ is the best approximation of y in Col([1 x]). Thus, we seek β̂ = [β_0, β_1]^T ∈ R^2 such that [1 x] β̂ = ŷ. The normal equations provide the solution:
[1 x]^T [1 x] β̂ = [1 x]^T y.

Typical Problem: Find the equation y = β_0 + β_1 x of the least squares line that best fits the data points (−1, 0), (0, 1), (1, 2), (2, 4).

Appendix. Above, we generalized Theorem 3 on page 381. Here, we repeat the statement of the generalization, and then we go on to prove it.

Theorem 3 (page 381) Generalized: Let A be an m × n matrix. Then
(1) Col(A^T) and Nul(A) are each other's orthogonal complements in R^n, and
(2) Col(A) and Nul(A^T) are each other's orthogonal complements in R^m.

Proof. The technical meaning of the two statements is
(1) Row(A) = Col(A^T) = (Nul(A))^⊥ and Nul(A) = (Col(A^T))^⊥ = (Row(A))^⊥, and
(2) Col(A) = (Nul(A^T))^⊥ and Nul(A^T) = (Col(A))^⊥.
(2) follows from (1) upon replacing A, A^T, m, and n throughout (1) by A^T, A, n, and m, respectively. The second part of (1), that Nul(A) = (Col(A^T))^⊥ = (Row(A))^⊥, is easy and given in the textbook on page 381. The first part of (1), that Row(A) = Col(A^T) = (Nul(A))^⊥,
is more difficult and requires an exploitation of the rank of A. We will prove Row(A) = Col(A^T) = (Nul(A))^⊥. Row(A) = Col(A^T) follows by the observation that transpose is an isomorphism sending rows to columns and columns to rows. Now we turn our attention to proving Col(A^T) = (Nul(A))^⊥ in R^n.

A brief meditation on the relationships between the rows of A, the columns of A^T, and the solutions of Ax = 0 will allow the reader to see that each column of A^T is orthogonal to any x ∈ Nul(A). Therefore, each column of A^T is in (Nul(A))^⊥. Thus,
Col(A^T) is a subspace of (Nul(A))^⊥, (*)
because (Nul(A))^⊥ is a vector space (the vector space structure being inherited from R^n).

Let r = Rank(A). We know that dim(Col(A^T)) = Rank(A^T) = Rank(A) = r. Let p = dim((Nul(A))^⊥). Then r ≤ p by (*), so if we can show
p ≤ r,
then we will have
dim(Col(A^T)) = r = p = dim((Nul(A))^⊥). (**)
Once p ≤ r is established, the desired result, that Col(A^T) = (Nul(A))^⊥, follows immediately from (*) and (**), because a subspace (namely, Col(A^T)) with dimension equal to the dimension of its parent space (namely, (Nul(A))^⊥) must be all of the parent space.

Let B = {v_1, v_2, ..., v_p} be a basis of (Nul(A))^⊥. Let q = dim(Nul(A)); so that q = n − r by the Rank Theorem. Let C = {w_1, w_2, ..., w_q} be a basis of Nul(A), and let D = {v_1, v_2, ..., v_p, w_1, w_2, ..., w_q} be the union of B and C. We will show D is linearly independent. Why will this help? Well, if D is shown to be linearly independent, then since p + q is the number of elements in D, it follows that p + q ≤ n by the Basis Theorem, in which case we have p + (n − r) = p + q ≤ n, so p − r ≤ 0; whence, p ≤ r, as required. Thus, once we have established that D is linearly independent, the proof will be complete. We do this next.

Suppose
(α_1 v_1 + ... + α_p v_p) + (β_1 w_1 + ... + β_q w_q) = 0
for some scalars α_1, ..., α_p, β_1, ..., β_q, where the bracketing has no use other than to suggest what follows next. Let x = α_1 v_1 + ... + α_p v_p and y = β_1 w_1 + ... + β_q w_q.
We have x + y = 0, so x = −y. Now, x ∈ (Nul(A))^⊥ and y ∈ Nul(A). But y ∈ Nul(A) and x = −y imply x ∈ Nul(A). Therefore, x ∈ (Nul(A))^⊥ ∩ Nul(A). But, far above, we noted this means x = 0. Also, y = 0, because x = −y and x = 0. Now we have it,
because x = 0 and y = 0 give us 0 = α_1 v_1 + ... + α_p v_p and 0 = β_1 w_1 + ... + β_q w_q; whence, all of the α-scalars and β-scalars are 0 by the linear independence of the bases B and C, respectively. Therefore, D is linearly independent as claimed, and this completes the proof.
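Returning to the least squares line typical problem (data points (−1, 0), (0, 1), (1, 2), (2, 4)): here [1 x]^T [1 x] β̂ = [1 x]^T y reduces to a 2 × 2 system, sketched below in Python. The function name is my own, and Cramer's rule stands in for a general solver:

```python
def lsq_line(xs, ys):
    """Solve the normal equations for the least squares line y = b0 + b1*x:
    [1 x]^T [1 x] = [[n, Σx], [Σx, Σx²]]  and  [1 x]^T y = [Σy, Σxy]."""
    n = len(xs)
    sx, sy = sum(xs), sum(ys)
    sxx = sum(x * x for x in xs)
    sxy = sum(x * y for x, y in zip(xs, ys))
    det = n * sxx - sx * sx
    b0 = (sy * sxx - sx * sxy) / det   # Cramer's rule on the 2x2 system
    b1 = (n * sxy - sx * sy) / det
    return b0, b1

print(lsq_line([-1, 0, 1, 2], [0, 1, 2, 4]))   # fitted line: y = 1.1 + 1.3x
```

For this data, [1 x]^T [1 x] = [[4, 2], [2, 6]] and [1 x]^T y = [7, 10], giving β_0 = 1.1 and β_1 = 1.3.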
More informationDefinition 1. A set V is a vector space over the scalar field F {R, C} iff. there are two operations defined on V, called vector addition
6 Vector Spaces with Inned Product Basis and Dimension Section Objective(s): Vector Spaces and Subspaces Linear (In)dependence Basis and Dimension Inner Product 6 Vector Spaces and Subspaces Definition
More informationWorksheet for Lecture 15 (due October 23) Section 4.3 Linearly Independent Sets; Bases
Worksheet for Lecture 5 (due October 23) Name: Section 4.3 Linearly Independent Sets; Bases Definition An indexed set {v,..., v n } in a vector space V is linearly dependent if there is a linear relation
More informationW2 ) = dim(w 1 )+ dim(w 2 ) for any two finite dimensional subspaces W 1, W 2 of V.
MA322 Sathaye Final Preparations Spring 2017 The final MA 322 exams will be given as described in the course web site (following the Registrar s listing. You should check and verify that you do not have
More informationMath Linear Algebra
Math 220 - Linear Algebra (Summer 208) Solutions to Homework #7 Exercise 6..20 (a) TRUE. u v v u = 0 is equivalent to u v = v u. The latter identity is true due to the commutative property of the inner
More informationLINEAR ALGEBRA W W L CHEN
LINEAR ALGEBRA W W L CHEN c W W L Chen, 1997, 2008. This chapter is available free to all individuals, on the understanding that it is not to be used for financial gain, and may be downloaded and/or photocopied,
More informationA Primer in Econometric Theory
A Primer in Econometric Theory Lecture 1: Vector Spaces John Stachurski Lectures by Akshay Shanker May 5, 2017 1/104 Overview Linear algebra is an important foundation for mathematics and, in particular,
More informationMATH 304 Linear Algebra Lecture 19: Least squares problems (continued). Norms and inner products.
MATH 304 Linear Algebra Lecture 19: Least squares problems (continued). Norms and inner products. Orthogonal projection Theorem 1 Let V be a subspace of R n. Then any vector x R n is uniquely represented
More informationDefinitions for Quizzes
Definitions for Quizzes Italicized text (or something close to it) will be given to you. Plain text is (an example of) what you should write as a definition. [Bracketed text will not be given, nor does
More informationMath 102, Winter Final Exam Review. Chapter 1. Matrices and Gaussian Elimination
Math 0, Winter 07 Final Exam Review Chapter. Matrices and Gaussian Elimination { x + x =,. Different forms of a system of linear equations. Example: The x + 4x = 4. [ ] [ ] [ ] vector form (or the column
More informationThere are two things that are particularly nice about the first basis
Orthogonality and the Gram-Schmidt Process In Chapter 4, we spent a great deal of time studying the problem of finding a basis for a vector space We know that a basis for a vector space can potentially
More informationNOTES on LINEAR ALGEBRA 1
School of Economics, Management and Statistics University of Bologna Academic Year 207/8 NOTES on LINEAR ALGEBRA for the students of Stats and Maths This is a modified version of the notes by Prof Laura
More informationLINEAR ALGEBRA SUMMARY SHEET.
LINEAR ALGEBRA SUMMARY SHEET RADON ROSBOROUGH https://intuitiveexplanationscom/linear-algebra-summary-sheet/ This document is a concise collection of many of the important theorems of linear algebra, organized
More informationLinear Algebra Massoud Malek
CSUEB Linear Algebra Massoud Malek Inner Product and Normed Space In all that follows, the n n identity matrix is denoted by I n, the n n zero matrix by Z n, and the zero vector by θ n An inner product
More informationExam in TMA4110 Calculus 3, June 2013 Solution
Norwegian University of Science and Technology Department of Mathematical Sciences Page of 8 Exam in TMA4 Calculus 3, June 3 Solution Problem Let T : R 3 R 3 be a linear transformation such that T = 4,
More informationInner Product Spaces
Inner Product Spaces Introduction Recall in the lecture on vector spaces that geometric vectors (i.e. vectors in two and three-dimensional Cartesian space have the properties of addition, subtraction,
More informationSolutions to Final Practice Problems Written by Victoria Kala Last updated 12/5/2015
Solutions to Final Practice Problems Written by Victoria Kala vtkala@math.ucsb.edu Last updated /5/05 Answers This page contains answers only. See the following pages for detailed solutions. (. (a x. See
More informationb 1 b 2.. b = b m A = [a 1,a 2,...,a n ] where a 1,j a 2,j a j = a m,j Let A R m n and x 1 x 2 x = x n
Lectures -2: Linear Algebra Background Almost all linear and nonlinear problems in scientific computation require the use of linear algebra These lectures review basic concepts in a way that has proven
More informationMATH 235. Final ANSWERS May 5, 2015
MATH 235 Final ANSWERS May 5, 25. ( points) Fix positive integers m, n and consider the vector space V of all m n matrices with entries in the real numbers R. (a) Find the dimension of V and prove your
More informationREVIEW FOR EXAM III SIMILARITY AND DIAGONALIZATION
REVIEW FOR EXAM III The exam covers sections 4.4, the portions of 4. on systems of differential equations and on Markov chains, and..4. SIMILARITY AND DIAGONALIZATION. Two matrices A and B are similar
More informationChapter 2. Vectors and Vector Spaces
2.1. Operations on Vectors 1 Chapter 2. Vectors and Vector Spaces Section 2.1. Operations on Vectors Note. In this section, we define several arithmetic operations on vectors (especially, vector addition
More informationLinear Algebra Highlights
Linear Algebra Highlights Chapter 1 A linear equation in n variables is of the form a 1 x 1 + a 2 x 2 + + a n x n. We can have m equations in n variables, a system of linear equations, which we want to
More informationReview of Some Concepts from Linear Algebra: Part 2
Review of Some Concepts from Linear Algebra: Part 2 Department of Mathematics Boise State University January 16, 2019 Math 566 Linear Algebra Review: Part 2 January 16, 2019 1 / 22 Vector spaces A set
More informationChapter 6. Orthogonality
6.4 The Projection Matrix 1 Chapter 6. Orthogonality 6.4 The Projection Matrix Note. In Section 6.1 (Projections), we projected a vector b R n onto a subspace W of R n. We did so by finding a basis for
More informationMath 413/513 Chapter 6 (from Friedberg, Insel, & Spence)
Math 413/513 Chapter 6 (from Friedberg, Insel, & Spence) David Glickenstein December 7, 2015 1 Inner product spaces In this chapter, we will only consider the elds R and C. De nition 1 Let V be a vector
More informationBindel, Fall 2016 Matrix Computations (CS 6210) Notes for
1 Logistics Notes for 2016-08-29 General announcement: we are switching from weekly to bi-weekly homeworks (mostly because the course is much bigger than planned). If you want to do HW but are not formally
More informationLinear Algebra- Final Exam Review
Linear Algebra- Final Exam Review. Let A be invertible. Show that, if v, v, v 3 are linearly independent vectors, so are Av, Av, Av 3. NOTE: It should be clear from your answer that you know the definition.
More informationMATH 304 Linear Algebra Lecture 20: The Gram-Schmidt process (continued). Eigenvalues and eigenvectors.
MATH 304 Linear Algebra Lecture 20: The Gram-Schmidt process (continued). Eigenvalues and eigenvectors. Orthogonal sets Let V be a vector space with an inner product. Definition. Nonzero vectors v 1,v
More informationPseudoinverse & Moore-Penrose Conditions
ECE 275AB Lecture 7 Fall 2008 V1.0 c K. Kreutz-Delgado, UC San Diego p. 1/1 Lecture 7 ECE 275A Pseudoinverse & Moore-Penrose Conditions ECE 275AB Lecture 7 Fall 2008 V1.0 c K. Kreutz-Delgado, UC San Diego
More informationLinear Analysis Lecture 5
Linear Analysis Lecture 5 Inner Products and V Let dim V < with inner product,. Choose a basis B and let v, w V have coordinates in F n given by x 1. x n and y 1. y n, respectively. Let A F n n be the
More information7. Dimension and Structure.
7. Dimension and Structure 7.1. Basis and Dimension Bases for Subspaces Example 2 The standard unit vectors e 1, e 2,, e n are linearly independent, for if we write (2) in component form, then we obtain
More informationOctober 25, 2013 INNER PRODUCT SPACES
October 25, 2013 INNER PRODUCT SPACES RODICA D. COSTIN Contents 1. Inner product 2 1.1. Inner product 2 1.2. Inner product spaces 4 2. Orthogonal bases 5 2.1. Existence of an orthogonal basis 7 2.2. Orthogonal
More informationThe Gram Schmidt Process
u 2 u The Gram Schmidt Process Now we will present a procedure, based on orthogonal projection, that converts any linearly independent set of vectors into an orthogonal set. Let us begin with the simple
More informationThe Gram Schmidt Process
The Gram Schmidt Process Now we will present a procedure, based on orthogonal projection, that converts any linearly independent set of vectors into an orthogonal set. Let us begin with the simple case
More informationEXAM. Exam 1. Math 5316, Fall December 2, 2012
EXAM Exam Math 536, Fall 22 December 2, 22 Write all of your answers on separate sheets of paper. You can keep the exam questions. This is a takehome exam, to be worked individually. You can use your notes.
More informationVector spaces. DS-GA 1013 / MATH-GA 2824 Optimization-based Data Analysis.
Vector spaces DS-GA 1013 / MATH-GA 2824 Optimization-based Data Analysis http://www.cims.nyu.edu/~cfgranda/pages/obda_fall17/index.html Carlos Fernandez-Granda Vector space Consists of: A set V A scalar
More informationMATH 167: APPLIED LINEAR ALGEBRA Chapter 3
MATH 167: APPLIED LINEAR ALGEBRA Chapter 3 Jesús De Loera, UC Davis February 18, 2012 Orthogonal Vectors and Subspaces (3.1). In real life vector spaces come with additional METRIC properties!! We have
More informationChapter 6 - Orthogonality
Chapter 6 - Orthogonality Maggie Myers Robert A. van de Geijn The University of Texas at Austin Orthogonality Fall 2009 http://z.cs.utexas.edu/wiki/pla.wiki/ 1 Orthogonal Vectors and Subspaces http://z.cs.utexas.edu/wiki/pla.wiki/
More informationOrthogonal Projection and Least Squares Prof. Philip Pennance 1 -Version: December 12, 2016
Orthogonal Projection and Least Squares Prof. Philip Pennance 1 -Version: December 12, 2016 1. Let V be a vector space. A linear transformation P : V V is called a projection if it is idempotent. That
More informationMath 290, Midterm II-key
Math 290, Midterm II-key Name (Print): (first) Signature: (last) The following rules apply: There are a total of 20 points on this 50 minutes exam. This contains 7 pages (including this cover page) and
More informationLinear Algebra 2 Spectral Notes
Linear Algebra 2 Spectral Notes In what follows, V is an inner product vector space over F, where F = R or C. We will use results seen so far; in particular that every linear operator T L(V ) has a complex
More informationApplied Linear Algebra in Geoscience Using MATLAB
Applied Linear Algebra in Geoscience Using MATLAB Contents Getting Started Creating Arrays Mathematical Operations with Arrays Using Script Files and Managing Data Two-Dimensional Plots Programming in
More informationLinear algebra review
EE263 Autumn 2015 S. Boyd and S. Lall Linear algebra review vector space, subspaces independence, basis, dimension nullspace and range left and right invertibility 1 Vector spaces a vector space or linear
More informationMTH 464: Computational Linear Algebra
MTH 464: Computational Linear Algebra Lecture Outlines Exam 2 Material Prof. M. Beauregard Department of Mathematics & Statistics Stephen F. Austin State University March 2, 2018 Linear Algebra (MTH 464)
More informationx 1 + 2x 2 + 3x 3 = 0 x 1 + 2x 2 + 3x 3 = 0, x 2 + x 3 = 0 x 3 3 x 3 1
. Orthogonal Complements and Projections In this section we discuss orthogonal complements and orthogonal projections. The orthogonal complement of a subspace S is the complement that is orthogonal to
More information