Orthogonality and Least Squares


1 6 Orthogonality and Least Squares 6.1 INNER PRODUCT, LENGTH, AND ORTHOGONALITY

2 INNER PRODUCT If u and v are vectors in R^n, then we regard u and v as n×1 matrices. The transpose u^T is a 1×n matrix, and the matrix product u^Tv is a 1×1 matrix, which we write as a single real number (a scalar) without brackets. The number u^Tv is called the inner product of u and v, and it is written as u·v. The inner product is also referred to as a dot product. Slide 6.1-2

3 INNER PRODUCT If u = (u_1, u_2, ..., u_n) and v = (v_1, v_2, ..., v_n), then the inner product of u and v is u^Tv = u_1v_1 + u_2v_2 + ... + u_nv_n. Slide 6.1-3
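The formula above can be sketched in plain Python; this is an illustrative helper (the function name `dot` and the sample vectors are not from the slides):

```python
def dot(u, v):
    """Inner product u . v = u1*v1 + u2*v2 + ... + un*vn."""
    assert len(u) == len(v), "vectors must have the same length"
    return sum(ui * vi for ui, vi in zip(u, v))

u = [2, -5, -1]
v = [3, 2, -3]
print(dot(u, v))  # 2*3 + (-5)*2 + (-1)*(-3) = -1
```

Note that `dot(u, v)` and `dot(v, u)` agree, which is property (a) of Theorem 1 on the next slide.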

4 INNER PRODUCT Theorem 1: Let u, v, and w be vectors in R^n, and let c be a scalar. Then
a. u·v = v·u
b. (u + v)·w = u·w + v·w
c. (cu)·v = c(u·v) = u·(cv)
d. u·u ≥ 0, and u·u = 0 if and only if u = 0.
Properties (b) and (c) can be combined several times to produce the following useful rule: (c_1u_1 + ... + c_pu_p)·w = c_1(u_1·w) + ... + c_p(u_p·w) Slide 6.1-4

5 THE LENGTH OF A VECTOR If v is in R^n, with entries v_1, ..., v_n, then the square root of v·v is defined because v·v is nonnegative. Definition: The length (or norm) of v is the nonnegative scalar ||v|| defined by ||v|| = √(v·v) = √(v_1^2 + v_2^2 + ... + v_n^2), and ||v||^2 = v·v. Suppose v is in R^2, say v = (a, b). Slide 6.1-5

6 THE LENGTH OF A VECTOR If we identify v with a geometric point in the plane, as usual, then ||v|| coincides with the standard notion of the length of the line segment from the origin to v. This follows from the Pythagorean Theorem applied to a triangle such as the one shown in the following figure. For any scalar c, the length of cv is |c| times the length of v. That is, ||cv|| = |c| ||v||. Slide 6.1-6

7 THE LENGTH OF A VECTOR A vector whose length is 1 is called a unit vector. If we divide a nonzero vector v by its length (that is, multiply by 1/||v||), we obtain a unit vector u, because the length of u is (1/||v||)||v|| = 1. The process of creating u from v is sometimes called normalizing v, and we say that u is in the same direction as v. Slide 6.1-7

8 THE LENGTH OF A VECTOR Example 1: Let v = (1, -2, 2, 0). Find a unit vector u in the same direction as v. Solution: First, compute the length of v: ||v||^2 = v·v = (1)^2 + (-2)^2 + (2)^2 + (0)^2 = 9, so ||v|| = √9 = 3. Then, multiply v by 1/||v|| to obtain u = (1/||v||)v = (1/3)v = (1/3, -2/3, 2/3, 0). Slide 6.1-8
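The normalization in Example 1 can be reproduced with a short sketch (helper names `norm` and `normalize` are chosen here for illustration):

```python
import math

def norm(v):
    """Length ||v|| = sqrt(v . v)."""
    return math.sqrt(sum(x * x for x in v))

def normalize(v):
    """Unit vector in the same direction as a nonzero v."""
    n = norm(v)
    return [x / n for x in v]

v = [1, -2, 2, 0]      # the vector from Example 1
print(norm(v))         # 3.0
u = normalize(v)       # (1/3, -2/3, 2/3, 0), a unit vector
```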

9 DISTANCE IN R^n To check that ||u|| = 1, it suffices to show that ||u||^2 = 1: ||u||^2 = u·u = (1/3)^2 + (-2/3)^2 + (2/3)^2 + (0)^2 = 1/9 + 4/9 + 4/9 + 0 = 1. Definition: For u and v in R^n, the distance between u and v, written as dist(u, v), is the length of the vector u - v. That is, dist(u, v) = ||u - v||. Slide 6.1-9

10 DISTANCE IN R^n Example 2: Compute the distance between the vectors u = (7, 1) and v = (3, 2). Solution: Calculate u - v = (7 - 3, 1 - 2) = (4, -1), so ||u - v|| = √(4^2 + (-1)^2) = √17. The vectors u, v, and u - v are shown in the figure on the next slide. When the vector u - v is added to v, the result is u. Slide 6.1-10
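A minimal sketch of the distance computation in Example 2 (the helper name `dist` mirrors the slide's notation but the code itself is illustrative):

```python
import math

def dist(u, v):
    """dist(u, v) = ||u - v||."""
    return math.sqrt(sum((a - b) ** 2 for a, b in zip(u, v)))

# Example 2: u = (7, 1), v = (3, 2); the distance is sqrt(17)
print(dist([7, 1], [3, 2]))
```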

11 DISTANCE IN R^n Notice that the parallelogram in the above figure shows that the distance from u to v is the same as the distance from u - v to 0. Slide 6.1-11

12 ORTHOGONAL VECTORS Consider R^2 or R^3 and two lines through the origin determined by vectors u and v. See the figure below. The two lines shown in the figure are geometrically perpendicular if and only if the distance from u to v is the same as the distance from u to -v. This is the same as requiring the squares of the distances to be the same. Slide 6.1-12

13 ORTHOGONAL VECTORS Now
[dist(u, -v)]^2 = ||u - (-v)||^2 = ||u + v||^2 = (u + v)·(u + v)
= u·(u + v) + v·(u + v)   [Theorem 1(b)]
= u·u + u·v + v·u + v·v   [Theorem 1(a), (b)]
= ||u||^2 + ||v||^2 + 2u·v   [Theorem 1(a)]
The same calculations with v and -v interchanged show that
[dist(u, v)]^2 = ||u||^2 + ||v||^2 + 2u·(-v) = ||u||^2 + ||v||^2 - 2u·v Slide 6.1-13

14 ORTHOGONAL VECTORS The two squared distances are equal if and only if 2u·v = -2u·v, which happens if and only if u·v = 0. This calculation shows that when vectors u and v are identified with geometric points, the corresponding lines through the points and the origin are perpendicular if and only if u·v = 0. Definition: Two vectors u and v in R^n are orthogonal (to each other) if u·v = 0. The zero vector is orthogonal to every vector in R^n because 0^Tv = 0 for all v. Slide 6.1-14

15 THE PYTHAGOREAN THEOREM Theorem 2: Two vectors u and v are orthogonal if and only if ||u + v||^2 = ||u||^2 + ||v||^2. Orthogonal Complements: If a vector z is orthogonal to every vector in a subspace W of R^n, then z is said to be orthogonal to W. The set of all vectors z that are orthogonal to W is called the orthogonal complement of W and is denoted by W⊥ (and read as "W perpendicular" or simply "W perp"). Slide 6.1-15
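Theorem 2 can be checked numerically for a concrete orthogonal pair; the vectors below are illustrative choices, not from the slides:

```python
def dot(u, v):
    return sum(a * b for a, b in zip(u, v))

def norm_sq(v):
    """Squared length ||v||^2 = v . v."""
    return dot(v, v)

u = [3, 1]
v = [-2, 6]                       # orthogonal to u: 3*(-2) + 1*6 = 0
assert dot(u, v) == 0
s = [a + b for a, b in zip(u, v)]  # u + v = (1, 7)
print(norm_sq(s))                  # ||u + v||^2 = 50
print(norm_sq(u) + norm_sq(v))     # ||u||^2 + ||v||^2 = 10 + 40 = 50
```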

16 ORTHOGONAL COMPLEMENTS 1. A vector x is in W⊥ if and only if x is orthogonal to every vector in a set that spans W. 2. W⊥ is a subspace of R^n. Theorem 3: Let A be an m×n matrix. The orthogonal complement of the row space of A is the null space of A, and the orthogonal complement of the column space of A is the null space of A^T: (Row A)⊥ = Nul A and (Col A)⊥ = Nul A^T. Slide 6.1-16

17 6 Orthogonality and Least Squares 6.2 ORTHOGONAL SETS

18 ORTHOGONAL SETS A set of vectors {u_1, ..., u_p} in R^n is said to be an orthogonal set if each pair of distinct vectors from the set is orthogonal, that is, if u_i·u_j = 0 whenever i ≠ j. Theorem 4: If S = {u_1, ..., u_p} is an orthogonal set of nonzero vectors in R^n, then S is linearly independent and hence is a basis for the subspace spanned by S. Slide 6.2-2

19 ORTHOGONAL SETS Proof: If 0 = c_1u_1 + ... + c_pu_p for some scalars c_1, ..., c_p, then
0 = 0·u_1 = (c_1u_1 + c_2u_2 + ... + c_pu_p)·u_1
= (c_1u_1)·u_1 + (c_2u_2)·u_1 + ... + (c_pu_p)·u_1
= c_1(u_1·u_1) + c_2(u_2·u_1) + ... + c_p(u_p·u_1)
= c_1(u_1·u_1)
because u_1 is orthogonal to u_2, ..., u_p. Since u_1 is nonzero, u_1·u_1 is not zero and so c_1 = 0. Similarly, c_2, ..., c_p must be zero. Slide 6.2-3

20 ORTHOGONAL SETS Thus S is linearly independent. Definition: An orthogonal basis for a subspace W of R^n is a basis for W that is also an orthogonal set. Theorem 5: Let {u_1, ..., u_p} be an orthogonal basis for a subspace W of R^n. For each y in W, the weights in the linear combination y = c_1u_1 + ... + c_pu_p are given by c_j = (y·u_j)/(u_j·u_j) (j = 1, ..., p). Slide 6.2-4
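The weight formula of Theorem 5 can be sketched directly; the helper name `weights` and the small orthogonal basis for R^2 below are illustrative assumptions:

```python
def dot(u, v):
    return sum(a * b for a, b in zip(u, v))

def weights(y, basis):
    """c_j = (y . u_j) / (u_j . u_j) for an orthogonal basis {u_1, ..., u_p}."""
    return [dot(y, u) / dot(u, u) for u in basis]

u1, u2 = [3, 1], [-1, 3]   # an orthogonal basis for R^2: 3*(-1) + 1*3 = 0
y = [6, 1]
c1, c2 = weights(y, [u1, u2])
# Reconstruct y from the weights: c1*u1 + c2*u2 should give y back.
rebuilt = [c1 * a + c2 * b for a, b in zip(u1, u2)]
print(rebuilt)  # [6.0, 1.0]
```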

21 ORTHOGONAL SETS Proof: The orthogonality of {u_1, ..., u_p} shows that y·u_1 = (c_1u_1 + c_2u_2 + ... + c_pu_p)·u_1 = c_1(u_1·u_1). Since u_1·u_1 is not zero, the equation above can be solved for c_1. To find c_j for j = 2, ..., p, compute y·u_j and solve for c_j. Slide 6.2-5

22 AN ORTHOGONAL PROJECTION Given a nonzero vector u in R^n, consider the problem of decomposing a vector y in R^n into the sum of two vectors, one a multiple of u and the other orthogonal to u. We wish to write y = ŷ + z ----(1) where ŷ = αu for some scalar α and z is some vector orthogonal to u. See the following figure. Slide 6.2-6

23 AN ORTHOGONAL PROJECTION Given any scalar α, let z = y - αu, so that (1) is satisfied. Then y - ŷ is orthogonal to u if and only if 0 = (y - αu)·u = y·u - (αu)·u = y·u - α(u·u). That is, (1) is satisfied with z orthogonal to u if and only if α = (y·u)/(u·u) and ŷ = ((y·u)/(u·u))u. The vector ŷ is called the orthogonal projection of y onto u, and the vector z is called the component of y orthogonal to u. Slide 6.2-7

24 AN ORTHOGONAL PROJECTION If c is any nonzero scalar and if u is replaced by cu in the definition of ŷ, then the orthogonal projection of y onto cu is exactly the same as the orthogonal projection of y onto u. Hence this projection is determined by the subspace L spanned by u (the line through u and 0). Sometimes ŷ is denoted by proj_L y and is called the orthogonal projection of y onto L. That is, ŷ = proj_L y = ((y·u)/(u·u))u ----(2) Slide 6.2-8

25 AN ORTHOGONAL PROJECTION Example 1: Let y = (7, 6) and u = (4, 2). Find the orthogonal projection of y onto u. Then write y as the sum of two orthogonal vectors, one in Span{u} and one orthogonal to u. Solution: Compute y·u = (7)(4) + (6)(2) = 40 and u·u = (4)(4) + (2)(2) = 20. Slide 6.2-9

26 AN ORTHOGONAL PROJECTION The orthogonal projection of y onto u is ŷ = ((y·u)/(u·u))u = (40/20)u = 2(4, 2) = (8, 4), and the component of y orthogonal to u is y - ŷ = (7, 6) - (8, 4) = (-1, 2). The sum of these two vectors is y. Slide 6.2-10

27 AN ORTHOGONAL PROJECTION That is, (7, 6) = (8, 4) + (-1, 2), i.e., y = ŷ + (y - ŷ). The decomposition of y is illustrated in the following figure. Slide 6.2-11

28 AN ORTHOGONAL PROJECTION Note: If the calculations above are correct, then {ŷ, y - ŷ} will be an orthogonal set. As a check, compute ŷ·(y - ŷ) = (8, 4)·(-1, 2) = -8 + 8 = 0. Since the line segment in the figure on the previous slide between y and ŷ is perpendicular to L, by construction of ŷ, the point identified with ŷ is the closest point of L to y. Slide 6.2-12
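Formula (2) and the check from Example 1 can be sketched in a few lines (the helper name `proj_line` is an illustrative choice):

```python
def dot(u, v):
    return sum(a * b for a, b in zip(u, v))

def proj_line(y, u):
    """Orthogonal projection of y onto the line spanned by u, per formula (2)."""
    c = dot(y, u) / dot(u, u)
    return [c * x for x in u]

y, u = [7, 6], [4, 2]          # the vectors from Example 1
yhat = proj_line(y, u)          # (8, 4)
z = [a - b for a, b in zip(y, yhat)]  # (-1, 2), component orthogonal to u
print(yhat, z)
print(dot(yhat, z))  # 0.0 -- the two components are orthogonal, as on this slide
```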

29 ORTHONORMAL SETS A set {u_1, ..., u_p} is an orthonormal set if it is an orthogonal set of unit vectors. If W is the subspace spanned by such a set, then {u_1, ..., u_p} is an orthonormal basis for W, since the set is automatically linearly independent, by Theorem 4. The simplest example of an orthonormal set is the standard basis {e_1, ..., e_n} for R^n. Any nonempty subset of {e_1, ..., e_n} is orthonormal, too. Slide 6.2-13

30 ORTHONORMAL SETS Example 2: Show that {v_1, v_2, v_3} is an orthonormal basis of R^3, where v_1 = (3/√11, 1/√11, 1/√11), v_2 = (-1/√6, 2/√6, 1/√6), v_3 = (-1/√66, -4/√66, 7/√66). Solution: Compute v_1·v_2 = -3/√66 + 2/√66 + 1/√66 = 0 and v_1·v_3 = -3/√726 - 4/√726 + 7/√726 = 0. Slide 6.2-14

31 ORTHONORMAL SETS v_2·v_3 = 1/√396 - 8/√396 + 7/√396 = 0. Thus {v_1, v_2, v_3} is an orthogonal set. Also, v_1·v_1 = 9/11 + 1/11 + 1/11 = 1, v_2·v_2 = 1/6 + 4/6 + 1/6 = 1, v_3·v_3 = 1/66 + 16/66 + 49/66 = 1, which shows that v_1, v_2, and v_3 are unit vectors. Thus {v_1, v_2, v_3} is an orthonormal set. Since the set is linearly independent, its three vectors form a basis for R^3. See the figure on the next slide. Slide 6.2-15
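The orthonormality checks from Example 2 are easy to replicate numerically (a sketch; equality holds only up to floating-point rounding, hence the tolerance-style comments):

```python
import math

def dot(u, v):
    return sum(a * b for a, b in zip(u, v))

s11, s6, s66 = 1 / math.sqrt(11), 1 / math.sqrt(6), 1 / math.sqrt(66)
v1 = [3 * s11, s11, s11]
v2 = [-s6, 2 * s6, s6]
v3 = [-s66, -4 * s66, 7 * s66]

for a, b in [(v1, v2), (v1, v3), (v2, v3)]:
    print(dot(a, b))   # approximately 0: the set is pairwise orthogonal
for v in (v1, v2, v3):
    print(dot(v, v))   # approximately 1: each vector is a unit vector
```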

32 ORTHONORMAL SETS When the vectors in an orthogonal set of nonzero vectors are normalized to have unit length, the new vectors will still be orthogonal, and hence the new set will be an orthonormal set. Slide 6.2-16

33 6 Orthogonality and Least Squares 6.3 ORTHOGONAL PROJECTIONS

34 ORTHOGONAL PROJECTIONS The orthogonal projection of a point in R^2 onto a line through the origin has an important analogue in R^n. Given a vector y and a subspace W in R^n, there is a vector ŷ in W such that (1) ŷ is the unique vector in W for which y - ŷ is orthogonal to W, and (2) ŷ is the unique vector in W closest to y. See the following figure. Slide 6.3-2

35 THE ORTHOGONAL DECOMPOSITION THEOREM These two properties of ŷ provide the key to finding the least-squares solutions of linear systems. Theorem 8: Let W be a subspace of R^n. Then each y in R^n can be written uniquely in the form y = ŷ + z ----(1) where ŷ is in W and z is in W⊥. In fact, if {u_1, ..., u_p} is any orthogonal basis of W, then ŷ = ((y·u_1)/(u_1·u_1))u_1 + ... + ((y·u_p)/(u_p·u_p))u_p ----(2) and z = y - ŷ. Slide 6.3-3

36 THE ORTHOGONAL DECOMPOSITION THEOREM The vector ŷ in (1) is called the orthogonal projection of y onto W and often is written as proj_W y. See the following figure. Proof: Let {u_1, ..., u_p} be any orthogonal basis for W, and define ŷ by (2). Then ŷ is in W because ŷ is a linear combination of the basis vectors u_1, ..., u_p. Slide 6.3-4

37 THE ORTHOGONAL DECOMPOSITION THEOREM Let z = y - ŷ. Since u_1 is orthogonal to u_2, ..., u_p, it follows from (2) that z·u_1 = (y - ŷ)·u_1 = y·u_1 - ((y·u_1)/(u_1·u_1))(u_1·u_1) = y·u_1 - y·u_1 = 0. Thus z is orthogonal to u_1. Similarly, z is orthogonal to each u_j in the basis for W. Hence z is orthogonal to every vector in W. That is, z is in W⊥. Slide 6.3-5

38 THE ORTHOGONAL DECOMPOSITION THEOREM To show that the decomposition in (1) is unique, suppose y can also be written as y = ŷ_1 + z_1, with ŷ_1 in W and z_1 in W⊥. Then ŷ + z = ŷ_1 + z_1 (since both sides equal y), and so ŷ - ŷ_1 = z_1 - z. This equality shows that the vector v = ŷ - ŷ_1 is in W and in W⊥ (because z_1 and z are both in W⊥, and W⊥ is a subspace). Hence v·v = 0, which shows that v = 0. This proves that ŷ = ŷ_1 and also z_1 = z. Slide 6.3-6

39 THE ORTHOGONAL DECOMPOSITION THEOREM The uniqueness of the decomposition (1) shows that the orthogonal projection ŷ depends only on W and not on the particular basis used in (2). Example 1: Let u_1 = (2, 5, -1), u_2 = (-2, 1, 1), and y = (1, 2, 3). Observe that {u_1, u_2} is an orthogonal basis for W = Span{u_1, u_2}. Write y as the sum of a vector in W and a vector orthogonal to W. Slide 6.3-7

40 THE ORTHOGONAL DECOMPOSITION THEOREM Solution: The orthogonal projection of y onto W is
ŷ = ((y·u_1)/(u_1·u_1))u_1 + ((y·u_2)/(u_2·u_2))u_2
= (9/30)(2, 5, -1) + (3/6)(-2, 1, 1) = (-2/5, 2, 1/5)
Also, y - ŷ = (1, 2, 3) - (-2/5, 2, 1/5) = (7/5, 0, 14/5). Slide 6.3-8

41 THE ORTHOGONAL DECOMPOSITION THEOREM Theorem 8 ensures that y - ŷ is in W⊥. To check the calculations, verify that y - ŷ is orthogonal to both u_1 and u_2 and hence to all of W. The desired decomposition of y is y = (1, 2, 3) = (-2/5, 2, 1/5) + (7/5, 0, 14/5). Slide 6.3-9
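Formula (2) of Theorem 8, applied to the data of Example 1, can be sketched as follows (the helper name `proj_subspace` is an illustrative choice; it assumes the basis is orthogonal, as Theorem 8 requires):

```python
def dot(u, v):
    return sum(a * b for a, b in zip(u, v))

def proj_subspace(y, basis):
    """proj_W y for an orthogonal basis of W, per formula (2) of Theorem 8."""
    p = [0.0] * len(y)
    for u in basis:
        c = dot(y, u) / dot(u, u)
        p = [pi + c * ui for pi, ui in zip(p, u)]
    return p

u1, u2 = [2, 5, -1], [-2, 1, 1]
y = [1, 2, 3]
yhat = proj_subspace(y, [u1, u2])      # approximately (-2/5, 2, 1/5)
z = [a - b for a, b in zip(y, yhat)]   # approximately (7/5, 0, 14/5)
print(yhat, z)
print(dot(z, u1), dot(z, u2))          # both approximately 0: z is in W-perp
```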

42 PROPERTIES OF ORTHOGONAL PROJECTIONS If {u_1, ..., u_p} is an orthogonal basis for W and if y happens to be in W, then the formula for proj_W y is exactly the same as the representation of y given in Theorem 5 in Section 6.2. In this case, proj_W y = y. That is, if y is in W = Span{u_1, ..., u_p}, then proj_W y = y. Slide 6.3-10

43 THE BEST APPROXIMATION THEOREM Theorem 9: Let W be a subspace of R^n, let y be any vector in R^n, and let ŷ be the orthogonal projection of y onto W. Then ŷ is the closest point in W to y, in the sense that ||y - ŷ|| < ||y - v|| ----(3) for all v in W distinct from ŷ. The vector ŷ in Theorem 9 is called the best approximation to y by elements of W. The distance from y to v, given by ||y - v||, can be regarded as the error of using v in place of y. Theorem 9 says that this error is minimized when v = ŷ. Slide 6.3-11

44 THE BEST APPROXIMATION THEOREM Inequality (3) leads to a new proof that ŷ does not depend on the particular orthogonal basis used to compute it. If a different orthogonal basis for W were used to construct an orthogonal projection of y, then this projection would also be the closest point in W to y, namely, ŷ. Slide 6.3-12

45 THE BEST APPROXIMATION THEOREM Proof: Take v in W distinct from ŷ. See the following figure. Then ŷ - v is in W. By the Orthogonal Decomposition Theorem, y - ŷ is orthogonal to W. In particular, y - ŷ is orthogonal to ŷ - v (which is in W). Slide 6.3-13

46 THE BEST APPROXIMATION THEOREM Since y - v = (y - ŷ) + (ŷ - v), the Pythagorean Theorem gives ||y - v||^2 = ||y - ŷ||^2 + ||ŷ - v||^2. (See the colored right triangle in the figure on the previous slide. The length of each side is labeled.) Now ||ŷ - v||^2 > 0 because ŷ - v ≠ 0, and so inequality (3) follows immediately. Slide 6.3-14

47 PROPERTIES OF ORTHOGONAL PROJECTIONS Example 2: The distance from a point y in R^n to a subspace W is defined as the distance from y to the nearest point in W. Find the distance from y to W = Span{u_1, u_2}, where y = (-1, -5, 10), u_1 = (5, -2, 1), u_2 = (1, 2, -1). Solution: By the Best Approximation Theorem, the distance from y to W is ||y - ŷ||, where ŷ = proj_W y. Slide 6.3-15

48 PROPERTIES OF ORTHOGONAL PROJECTIONS Since {u_1, u_2} is an orthogonal basis for W,
ŷ = (15/30)u_1 + (-21/6)u_2 = (1/2)(5, -2, 1) - (7/2)(1, 2, -1) = (-1, -8, 4)
y - ŷ = (-1, -5, 10) - (-1, -8, 4) = (0, 3, 6)
||y - ŷ||^2 = 3^2 + 6^2 = 45
The distance from y to W is ||y - ŷ|| = √45 = 3√5. Slide 6.3-16
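The distance calculation of Example 2 can be replayed directly from the projection formula (a self-contained sketch using the example's data):

```python
import math

def dot(u, v):
    return sum(a * b for a, b in zip(u, v))

y = [-1, -5, 10]
u1, u2 = [5, -2, 1], [1, 2, -1]

yhat = [0.0, 0.0, 0.0]
for u in (u1, u2):
    c = dot(y, u) / dot(u, u)           # 15/30 = 1/2, then -21/6 = -7/2
    yhat = [p + c * x for p, x in zip(yhat, u)]

z = [a - b for a, b in zip(y, yhat)]    # (0, 3, 6)
d = math.sqrt(dot(z, z))                # 3*sqrt(5), approximately 6.708
print(yhat)  # [-1.0, -8.0, 4.0]
print(d)
```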

49 6 Orthogonality and Least Squares 6.5 LEAST-SQUARES PROBLEMS

50 LEAST-SQUARES PROBLEMS Definition: If A is m×n and b is in R^m, a least-squares solution of Ax = b is an x̂ in R^n such that ||b - Ax̂|| ≤ ||b - Ax|| for all x in R^n. The most important aspect of the least-squares problem is that no matter what x we select, the vector Ax will necessarily be in the column space, Col A. So we seek an x that makes Ax the closest point in Col A to b. See the figure on the next slide. Slide 6.5-2

51 LEAST-SQUARES PROBLEMS Solution of the General Least-Squares Problem: Given A and b, apply the Best Approximation Theorem to the subspace Col A. Let b̂ = proj_Col A b. Slide 6.5-3

52 SOLUTION OF THE GENERAL LEAST-SQUARES PROBLEM Because b̂ is in the column space of A, the equation Ax = b̂ is consistent, and there is an x̂ in R^n such that Ax̂ = b̂ ----(1) Since b̂ is the closest point in Col A to b, a vector x̂ is a least-squares solution of Ax = b if and only if x̂ satisfies (1). Such an x̂ in R^n is a list of weights that will build b̂ out of the columns of A. See the figure on the next slide. Slide 6.5-4

53 SOLUTION OF THE GENERAL LEAST-SQUARES PROBLEM Suppose x̂ satisfies Ax̂ = b̂. By the Orthogonal Decomposition Theorem, the projection b̂ has the property that b - b̂ is orthogonal to Col A, so b - Ax̂ is orthogonal to each column of A. If a_j is any column of A, then a_j·(b - Ax̂) = 0, that is, a_j^T(b - Ax̂) = 0. Slide 6.5-5

54 SOLUTION OF THE GENERAL LEAST-SQUARES PROBLEM Since each a_j^T is a row of A^T, A^T(b - Ax̂) = 0 ----(2) Thus A^Tb - A^TAx̂ = 0, that is, A^TAx̂ = A^Tb. These calculations show that each least-squares solution of Ax = b satisfies the equation A^TAx = A^Tb ----(3) The matrix equation (3) represents a system of equations called the normal equations for Ax = b. A solution of (3) is often denoted by x̂. Slide 6.5-6

55 SOLUTION OF THE GENERAL LEAST-SQUARES PROBLEM Theorem 13: The set of least-squares solutions of Ax = b coincides with the nonempty set of solutions of the normal equations A^TAx = A^Tb. Proof: The set of least-squares solutions is nonempty, and each least-squares solution x̂ satisfies the normal equations. Conversely, suppose x̂ satisfies A^TAx̂ = A^Tb. Then x̂ satisfies (2), which shows that b - Ax̂ is orthogonal to the rows of A^T and hence is orthogonal to the columns of A. Since the columns of A span Col A, the vector b - Ax̂ is orthogonal to all of Col A. Slide 6.5-7

56 SOLUTION OF THE GENERAL LEAST-SQUARES PROBLEM Hence the equation b = Ax̂ + (b - Ax̂) is a decomposition of b into the sum of a vector in Col A and a vector orthogonal to Col A. By the uniqueness of the orthogonal decomposition, Ax̂ must be the orthogonal projection of b onto Col A. That is, Ax̂ = b̂, and x̂ is a least-squares solution. Slide 6.5-8

57 SOLUTION OF THE GENERAL LEAST-SQUARES PROBLEM Example 1: Find a least-squares solution of the inconsistent system Ax = b for
A = [4 0; 0 2; 1 1], b = (2, 0, 11)
(rows of A separated by semicolons). Solution: To use the normal equations (3), compute:
A^TA = [4 0 1; 0 2 1][4 0; 0 2; 1 1] = [17 1; 1 5] Slide 6.5-9

58 SOLUTION OF THE GENERAL LEAST-SQUARES PROBLEM
A^Tb = [4 0 1; 0 2 1](2, 0, 11) = (19, 11)
Then the equation A^TAx = A^Tb becomes
[17 1; 1 5](x_1, x_2) = (19, 11) Slide 6.5-10

59 SOLUTION OF THE GENERAL LEAST-SQUARES PROBLEM Row operations can be used to solve the system on the previous slide, but since A^TA is invertible and 2×2, it is probably faster to compute (A^TA)^(-1) = (1/84)[5 -1; -1 17] and then solve A^TAx = A^Tb as
x̂ = (A^TA)^(-1)A^Tb = (1/84)[5 -1; -1 17](19, 11) = (1/84)(84, 168) = (1, 2). Slide 6.5-11
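The whole of Example 1 (forming A^TA and A^Tb, then solving the 2×2 normal equations) can be sketched in plain Python; the 2×2 solve below uses Cramer's rule, an implementation choice not taken from the slides:

```python
A = [[4, 0], [0, 2], [1, 1]]
b = [2, 0, 11]

cols = list(zip(*A))  # columns of A
# A^T A and A^T b, entry by entry:
AtA = [[sum(x * y for x, y in zip(c, d)) for d in cols] for c in cols]
Atb = [sum(x * y for x, y in zip(c, b)) for c in cols]
print(AtA)  # [[17, 1], [1, 5]]
print(Atb)  # [19, 11]

# Solve the 2x2 system AtA * x = Atb by Cramer's rule.
det = AtA[0][0] * AtA[1][1] - AtA[0][1] * AtA[1][0]   # 17*5 - 1*1 = 84
x1 = (Atb[0] * AtA[1][1] - AtA[0][1] * Atb[1]) / det  # (95 - 11)/84 = 1.0
x2 = (AtA[0][0] * Atb[1] - Atb[0] * AtA[1][0]) / det  # (187 - 19)/84 = 2.0
print([x1, x2])  # [1.0, 2.0]
```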

60 SOLUTION OF THE GENERAL LEAST-SQUARES PROBLEM Theorem 14: Let A be an m×n matrix. The following statements are logically equivalent:
a. The equation Ax = b has a unique least-squares solution for each b in R^m.
b. The columns of A are linearly independent.
c. The matrix A^TA is invertible.
When these statements are true, the least-squares solution x̂ is given by x̂ = (A^TA)^(-1)A^Tb ----(4) When a least-squares solution x̂ is used to produce Ax̂ as an approximation to b, the distance from b to Ax̂ is called the least-squares error of this approximation. Slide 6.5-12

61 ALTERNATIVE CALCULATIONS OF LEAST-SQUARES SOLUTIONS Example 2: Find a least-squares solution of Ax = b for
A = [1 -6; 1 -2; 1 1; 1 7], b = (-1, 2, 1, 6)
Solution: Because the columns a_1 and a_2 of A are orthogonal, the orthogonal projection of b onto Col A is given by
b̂ = ((b·a_1)/(a_1·a_1))a_1 + ((b·a_2)/(a_2·a_2))a_2 = (8/4)a_1 + (45/90)a_2 ----(5) Slide 6.5-13

62 ALTERNATIVE CALCULATIONS OF LEAST-SQUARES SOLUTIONS
b̂ = 2(1, 1, 1, 1) + (1/2)(-6, -2, 1, 7) = (2, 2, 2, 2) + (-3, -1, 1/2, 7/2) = (-1, 1, 5/2, 11/2)
Now that b̂ is known, we can solve Ax̂ = b̂. But this is trivial, since we already know what weights to place on the columns of A to produce b̂. It is clear from (5) that x̂ = (8/4, 45/90) = (2, 1/2). Slide 6.5-14
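The orthogonal-columns shortcut of Example 2 can be sketched directly: project b onto each column and read off the weights (a minimal illustration using the example's data):

```python
def dot(u, v):
    return sum(a * b for a, b in zip(u, v))

a1 = [1, 1, 1, 1]
a2 = [-6, -2, 1, 7]
b = [-1, 2, 1, 6]

assert dot(a1, a2) == 0        # the columns are orthogonal, so (5) applies
x1 = dot(b, a1) / dot(a1, a1)  # 8/4  = 2.0
x2 = dot(b, a2) / dot(a2, a2)  # 45/90 = 0.5
bhat = [x1 * p + x2 * q for p, q in zip(a1, a2)]
print([x1, x2])  # [2.0, 0.5] -- the least-squares solution
print(bhat)      # [-1.0, 1.0, 2.5, 5.5] -- the projection of b onto Col A
```

When the columns are not orthogonal, this shortcut does not apply and one falls back on the normal equations of the previous slides.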


More information

Review of Linear Algebra

Review of Linear Algebra Review of Linear Algebra Definitions An m n (read "m by n") matrix, is a rectangular array of entries, where m is the number of rows and n the number of columns. 2 Definitions (Con t) A is square if m=

More information

2.4 Hilbert Spaces. Outline

2.4 Hilbert Spaces. Outline 2.4 Hilbert Spaces Tom Lewis Spring Semester 2017 Outline Hilbert spaces L 2 ([a, b]) Orthogonality Approximations Definition A Hilbert space is an inner product space which is complete in the norm defined

More information

MATH 240 Spring, Chapter 1: Linear Equations and Matrices

MATH 240 Spring, Chapter 1: Linear Equations and Matrices MATH 240 Spring, 2006 Chapter Summaries for Kolman / Hill, Elementary Linear Algebra, 8th Ed. Sections 1.1 1.6, 2.1 2.2, 3.2 3.8, 4.3 4.5, 5.1 5.3, 5.5, 6.1 6.5, 7.1 7.2, 7.4 DEFINITIONS Chapter 1: Linear

More information

Linear Algebra Massoud Malek

Linear Algebra Massoud Malek CSUEB Linear Algebra Massoud Malek Inner Product and Normed Space In all that follows, the n n identity matrix is denoted by I n, the n n zero matrix by Z n, and the zero vector by θ n An inner product

More information

Problem # Max points possible Actual score Total 120

Problem # Max points possible Actual score Total 120 FINAL EXAMINATION - MATH 2121, FALL 2017. Name: ID#: Email: Lecture & Tutorial: Problem # Max points possible Actual score 1 15 2 15 3 10 4 15 5 15 6 15 7 10 8 10 9 15 Total 120 You have 180 minutes to

More information

Orthogonality. 6.1 Orthogonal Vectors and Subspaces. Chapter 6

Orthogonality. 6.1 Orthogonal Vectors and Subspaces. Chapter 6 Chapter 6 Orthogonality 6.1 Orthogonal Vectors and Subspaces Recall that if nonzero vectors x, y R n are linearly independent then the subspace of all vectors αx + βy, α, β R (the space spanned by x and

More information

Math 4377/6308 Advanced Linear Algebra

Math 4377/6308 Advanced Linear Algebra 1.4 Linear Combinations Math 4377/6308 Advanced Linear Algebra 1.4 Linear Combinations & Systems of Linear Equations Jiwen He Department of Mathematics, University of Houston jiwenhe@math.uh.edu math.uh.edu/

More information

WI1403-LR Linear Algebra. Delft University of Technology

WI1403-LR Linear Algebra. Delft University of Technology WI1403-LR Linear Algebra Delft University of Technology Year 2013 2014 Michele Facchinelli Version 10 Last modified on February 1, 2017 Preface This summary was written for the course WI1403-LR Linear

More information

1. What is the determinant of the following matrix? a 1 a 2 4a 3 2a 2 b 1 b 2 4b 3 2b c 1. = 4, then det

1. What is the determinant of the following matrix? a 1 a 2 4a 3 2a 2 b 1 b 2 4b 3 2b c 1. = 4, then det What is the determinant of the following matrix? 3 4 3 4 3 4 4 3 A 0 B 8 C 55 D 0 E 60 If det a a a 3 b b b 3 c c c 3 = 4, then det a a 4a 3 a b b 4b 3 b c c c 3 c = A 8 B 6 C 4 D E 3 Let A be an n n matrix

More information

22m:033 Notes: 6.1 Inner Product, Length and Orthogonality

22m:033 Notes: 6.1 Inner Product, Length and Orthogonality m:033 Notes: 6. Inner Product, Length and Orthogonality Dennis Roseman University of Iowa Iowa City, IA http://www.math.uiowa.edu/ roseman April, 00 The inner product Arithmetic is based on addition and

More information

Vector Spaces, Orthogonality, and Linear Least Squares

Vector Spaces, Orthogonality, and Linear Least Squares Week Vector Spaces, Orthogonality, and Linear Least Squares. Opening Remarks.. Visualizing Planes, Lines, and Solutions Consider the following system of linear equations from the opener for Week 9: χ χ

More information

Final Review Written by Victoria Kala SH 6432u Office Hours R 12:30 1:30pm Last Updated 11/30/2015

Final Review Written by Victoria Kala SH 6432u Office Hours R 12:30 1:30pm Last Updated 11/30/2015 Final Review Written by Victoria Kala vtkala@mathucsbedu SH 6432u Office Hours R 12:30 1:30pm Last Updated 11/30/2015 Summary This review contains notes on sections 44 47, 51 53, 61, 62, 65 For your final,

More information

Elementary maths for GMT

Elementary maths for GMT Elementary maths for GMT Linear Algebra Part 2: Matrices, Elimination and Determinant m n matrices The system of m linear equations in n variables x 1, x 2,, x n a 11 x 1 + a 12 x 2 + + a 1n x n = b 1

More information

y 2 . = x 1y 1 + x 2 y x + + x n y n 2 7 = 1(2) + 3(7) 5(4) = 3. x x = x x x2 n.

y 2 . = x 1y 1 + x 2 y x + + x n y n 2 7 = 1(2) + 3(7) 5(4) = 3. x x = x x x2 n. 6.. Length, Angle, and Orthogonality In this section, we discuss the defintion of length and angle for vectors and define what it means for two vectors to be orthogonal. Then, we see that linear systems

More information

Math 2331 Linear Algebra

Math 2331 Linear Algebra 6. Orthogonal Projections Math 2 Linear Algebra 6. Orthogonal Projections Jiwen He Department of Mathematics, University of Houston jiwenhe@math.uh.edu math.uh.edu/ jiwenhe/math2 Jiwen He, University of

More information

Glossary of Linear Algebra Terms. Prepared by Vince Zaccone For Campus Learning Assistance Services at UCSB

Glossary of Linear Algebra Terms. Prepared by Vince Zaccone For Campus Learning Assistance Services at UCSB Glossary of Linear Algebra Terms Basis (for a subspace) A linearly independent set of vectors that spans the space Basic Variable A variable in a linear system that corresponds to a pivot column in the

More information

Linear Algebra- Final Exam Review

Linear Algebra- Final Exam Review Linear Algebra- Final Exam Review. Let A be invertible. Show that, if v, v, v 3 are linearly independent vectors, so are Av, Av, Av 3. NOTE: It should be clear from your answer that you know the definition.

More information

Vector spaces. DS-GA 1013 / MATH-GA 2824 Optimization-based Data Analysis.

Vector spaces. DS-GA 1013 / MATH-GA 2824 Optimization-based Data Analysis. Vector spaces DS-GA 1013 / MATH-GA 2824 Optimization-based Data Analysis http://www.cims.nyu.edu/~cfgranda/pages/obda_fall17/index.html Carlos Fernandez-Granda Vector space Consists of: A set V A scalar

More information

Math 261 Lecture Notes: Sections 6.1, 6.2, 6.3 and 6.4 Orthogonal Sets and Projections

Math 261 Lecture Notes: Sections 6.1, 6.2, 6.3 and 6.4 Orthogonal Sets and Projections Math 6 Lecture Notes: Sections 6., 6., 6. and 6. Orthogonal Sets and Projections We will not cover general inner product spaces. We will, however, focus on a particular inner product space the inner product

More information

Definitions for Quizzes

Definitions for Quizzes Definitions for Quizzes Italicized text (or something close to it) will be given to you. Plain text is (an example of) what you should write as a definition. [Bracketed text will not be given, nor does

More information

Lecture 23: 6.1 Inner Products

Lecture 23: 6.1 Inner Products Lecture 23: 6.1 Inner Products Wei-Ta Chu 2008/12/17 Definition An inner product on a real vector space V is a function that associates a real number u, vwith each pair of vectors u and v in V in such

More information

LINEAR ALGEBRA W W L CHEN

LINEAR ALGEBRA W W L CHEN LINEAR ALGEBRA W W L CHEN c W W L Chen, 1997, 2008. This chapter is available free to all individuals, on the understanding that it is not to be used for financial gain, and may be downloaded and/or photocopied,

More information

6.1 SOLUTIONS. 1. Since. and. 2. Since. 3. Since. 4. Since. x w = 6(3) + ( 2)( 1) + 3( 5) = 5, 6. Since x. u u ( 1) 2 5, v u = 4( 1) + 6(2) = 8, and

6.1 SOLUTIONS. 1. Since. and. 2. Since. 3. Since. 4. Since. x w = 6(3) + ( 2)( 1) + 3( 5) = 5, 6. Since x. u u ( 1) 2 5, v u = 4( 1) + 6(2) = 8, and 6. SOLUIONS Notes: he first half of this section is computational and is easily learned. he second half concerns the concepts of orthogonality and orthogonal complements, which are essential for later

More information

MATH 304 Linear Algebra Lecture 18: Orthogonal projection (continued). Least squares problems. Normed vector spaces.

MATH 304 Linear Algebra Lecture 18: Orthogonal projection (continued). Least squares problems. Normed vector spaces. MATH 304 Linear Algebra Lecture 18: Orthogonal projection (continued). Least squares problems. Normed vector spaces. Orthogonality Definition 1. Vectors x,y R n are said to be orthogonal (denoted x y)

More information

Normed & Inner Product Vector Spaces

Normed & Inner Product Vector Spaces Normed & Inner Product Vector Spaces ECE 174 Introduction to Linear & Nonlinear Optimization Ken Kreutz-Delgado ECE Department, UC San Diego Ken Kreutz-Delgado (UC San Diego) ECE 174 Fall 2016 1 / 27 Normed

More information

LINEAR ALGEBRA REVIEW

LINEAR ALGEBRA REVIEW LINEAR ALGEBRA REVIEW When we define a term, we put it in boldface. This is a very compressed review; please read it very carefully and be sure to ask questions on parts you aren t sure of. x 1 WedenotethesetofrealnumbersbyR.

More information

Math 102, Winter Final Exam Review. Chapter 1. Matrices and Gaussian Elimination

Math 102, Winter Final Exam Review. Chapter 1. Matrices and Gaussian Elimination Math 0, Winter 07 Final Exam Review Chapter. Matrices and Gaussian Elimination { x + x =,. Different forms of a system of linear equations. Example: The x + 4x = 4. [ ] [ ] [ ] vector form (or the column

More information

6 Inner Product Spaces

6 Inner Product Spaces Lectures 16,17,18 6 Inner Product Spaces 6.1 Basic Definition Parallelogram law, the ability to measure angle between two vectors and in particular, the concept of perpendicularity make the euclidean space

More information

MATH 221, Spring Homework 10 Solutions

MATH 221, Spring Homework 10 Solutions MATH 22, Spring 28 - Homework Solutions Due Tuesday, May Section 52 Page 279, Problem 2: 4 λ A λi = and the characteristic polynomial is det(a λi) = ( 4 λ)( λ) ( )(6) = λ 6 λ 2 +λ+2 The solutions to the

More information

Announcements Wednesday, November 15

Announcements Wednesday, November 15 Announcements Wednesday, November 15 The third midterm is on this Friday, November 17. The exam covers 3.1, 3.2, 5.1, 5.2, 5.3, and 5.5. About half the problems will be conceptual, and the other half computational.

More information

Linear Equations and Vectors

Linear Equations and Vectors Chapter Linear Equations and Vectors Linear Algebra, Fall 6 Matrices and Systems of Linear Equations Figure. Linear Algebra, Fall 6 Figure. Linear Algebra, Fall 6 Figure. Linear Algebra, Fall 6 Unique

More information

MATH 167: APPLIED LINEAR ALGEBRA Least-Squares

MATH 167: APPLIED LINEAR ALGEBRA Least-Squares MATH 167: APPLIED LINEAR ALGEBRA Least-Squares October 30, 2014 Least Squares We do a series of experiments, collecting data. We wish to see patterns!! We expect the output b to be a linear function of

More information

Review Notes for Linear Algebra True or False Last Updated: February 22, 2010

Review Notes for Linear Algebra True or False Last Updated: February 22, 2010 Review Notes for Linear Algebra True or False Last Updated: February 22, 2010 Chapter 4 [ Vector Spaces 4.1 If {v 1,v 2,,v n } and {w 1,w 2,,w n } are linearly independent, then {v 1 +w 1,v 2 +w 2,,v n

More information

Projections and Least Square Solutions. Recall that given an inner product space V with subspace W and orthogonal basis for

Projections and Least Square Solutions. Recall that given an inner product space V with subspace W and orthogonal basis for Math 57 Spring 18 Projections and Least Square Solutions Recall that given an inner product space V with subspace W and orthogonal basis for W, B {v 1, v,..., v k }, the orthogonal projection of V onto

More information

MATH 22A: LINEAR ALGEBRA Chapter 4

MATH 22A: LINEAR ALGEBRA Chapter 4 MATH 22A: LINEAR ALGEBRA Chapter 4 Jesús De Loera, UC Davis November 30, 2012 Orthogonality and Least Squares Approximation QUESTION: Suppose Ax = b has no solution!! Then what to do? Can we find an Approximate

More information

Math 520 Exam 2 Topic Outline Sections 1 3 (Xiao/Dumas/Liaw) Spring 2008

Math 520 Exam 2 Topic Outline Sections 1 3 (Xiao/Dumas/Liaw) Spring 2008 Math 520 Exam 2 Topic Outline Sections 1 3 (Xiao/Dumas/Liaw) Spring 2008 Exam 2 will be held on Tuesday, April 8, 7-8pm in 117 MacMillan What will be covered The exam will cover material from the lectures

More information

Math Linear Algebra II. 1. Inner Products and Norms

Math Linear Algebra II. 1. Inner Products and Norms Math 342 - Linear Algebra II Notes 1. Inner Products and Norms One knows from a basic introduction to vectors in R n Math 254 at OSU) that the length of a vector x = x 1 x 2... x n ) T R n, denoted x,

More information

The Transpose of a Vector

The Transpose of a Vector 8 CHAPTER Vectors The Transpose of a Vector We now consider the transpose of a vector in R n, which is a row vector. For a vector u 1 u. u n the transpose is denoted by u T = [ u 1 u u n ] EXAMPLE -5 Find

More information

Fourier and Wavelet Signal Processing

Fourier and Wavelet Signal Processing Ecole Polytechnique Federale de Lausanne (EPFL) Audio-Visual Communications Laboratory (LCAV) Fourier and Wavelet Signal Processing Martin Vetterli Amina Chebira, Ali Hormati Spring 2011 2/25/2011 1 Outline

More information

Announcements Monday, November 26

Announcements Monday, November 26 Announcements Monday, November 26 Please fill out your CIOS survey! WeBWorK 6.6, 7.1, 7.2 are due on Wednesday. No quiz on Friday! But this is the only recitation on chapter 7. My office is Skiles 244

More information

(v, w) = arccos( < v, w >

(v, w) = arccos( < v, w > MA322 F all206 Notes on Inner Products Notes on Chapter 6 Inner product. Given a real vector space V, an inner product is defined to be a bilinear map F : V V R such that the following holds: Commutativity:

More information

Orthogonal Projection and Least Squares Prof. Philip Pennance 1 -Version: December 12, 2016

Orthogonal Projection and Least Squares Prof. Philip Pennance 1 -Version: December 12, 2016 Orthogonal Projection and Least Squares Prof. Philip Pennance 1 -Version: December 12, 2016 1. Let V be a vector space. A linear transformation P : V V is called a projection if it is idempotent. That

More information

Math 54 HW 4 solutions

Math 54 HW 4 solutions Math 54 HW 4 solutions 2.2. Section 2.2 (a) False: Recall that performing a series of elementary row operations A is equivalent to multiplying A by a series of elementary matrices. Suppose that E,...,

More information

Kevin James. MTHSC 3110 Section 4.3 Linear Independence in Vector Sp

Kevin James. MTHSC 3110 Section 4.3 Linear Independence in Vector Sp MTHSC 3 Section 4.3 Linear Independence in Vector Spaces; Bases Definition Let V be a vector space and let { v. v 2,..., v p } V. If the only solution to the equation x v + x 2 v 2 + + x p v p = is the

More information

Announcements Monday, November 26

Announcements Monday, November 26 Announcements Monday, November 26 Please fill out your CIOS survey! WeBWorK 6.6, 7.1, 7.2 are due on Wednesday. No quiz on Friday! But this is the only recitation on chapter 7. My office is Skiles 244

More information

Lecture 20: 6.1 Inner Products

Lecture 20: 6.1 Inner Products Lecture 0: 6.1 Inner Products Wei-Ta Chu 011/1/5 Definition An inner product on a real vector space V is a function that associates a real number u, v with each pair of vectors u and v in V in such a way

More information

Math 18, Linear Algebra, Lecture C00, Spring 2017 Review and Practice Problems for Final Exam

Math 18, Linear Algebra, Lecture C00, Spring 2017 Review and Practice Problems for Final Exam Math 8, Linear Algebra, Lecture C, Spring 7 Review and Practice Problems for Final Exam. The augmentedmatrix of a linear system has been transformed by row operations into 5 4 8. Determine if the system

More information

Math Camp II. Basic Linear Algebra. Yiqing Xu. Aug 26, 2014 MIT

Math Camp II. Basic Linear Algebra. Yiqing Xu. Aug 26, 2014 MIT Math Camp II Basic Linear Algebra Yiqing Xu MIT Aug 26, 2014 1 Solving Systems of Linear Equations 2 Vectors and Vector Spaces 3 Matrices 4 Least Squares Systems of Linear Equations Definition A linear

More information

Solutions to Final Practice Problems Written by Victoria Kala Last updated 12/5/2015

Solutions to Final Practice Problems Written by Victoria Kala Last updated 12/5/2015 Solutions to Final Practice Problems Written by Victoria Kala vtkala@math.ucsb.edu Last updated /5/05 Answers This page contains answers only. See the following pages for detailed solutions. (. (a x. See

More information

MATH 1120 (LINEAR ALGEBRA 1), FINAL EXAM FALL 2011 SOLUTIONS TO PRACTICE VERSION

MATH 1120 (LINEAR ALGEBRA 1), FINAL EXAM FALL 2011 SOLUTIONS TO PRACTICE VERSION MATH (LINEAR ALGEBRA ) FINAL EXAM FALL SOLUTIONS TO PRACTICE VERSION Problem (a) For each matrix below (i) find a basis for its column space (ii) find a basis for its row space (iii) determine whether

More information

MTH501- Linear Algebra MCQS MIDTERM EXAMINATION ~ LIBRIANSMINE ~

MTH501- Linear Algebra MCQS MIDTERM EXAMINATION ~ LIBRIANSMINE ~ MTH501- Linear Algebra MCQS MIDTERM EXAMINATION ~ LIBRIANSMINE ~ Question No: 1 (Marks: 1) If for a linear transformation the equation T(x) =0 has only the trivial solution then T is One-to-one Onto Question

More information

MATH 2331 Linear Algebra. Section 1.1 Systems of Linear Equations. Finding the solution to a set of two equations in two variables: Example 1: Solve:

MATH 2331 Linear Algebra. Section 1.1 Systems of Linear Equations. Finding the solution to a set of two equations in two variables: Example 1: Solve: MATH 2331 Linear Algebra Section 1.1 Systems of Linear Equations Finding the solution to a set of two equations in two variables: Example 1: Solve: x x = 3 1 2 2x + 4x = 12 1 2 Geometric meaning: Do these

More information

1 Last time: inverses

1 Last time: inverses MATH Linear algebra (Fall 8) Lecture 8 Last time: inverses The following all mean the same thing for a function f : X Y : f is invertible f is one-to-one and onto 3 For each b Y there is exactly one a

More information

4.3 - Linear Combinations and Independence of Vectors

4.3 - Linear Combinations and Independence of Vectors - Linear Combinations and Independence of Vectors De nitions, Theorems, and Examples De nition 1 A vector v in a vector space V is called a linear combination of the vectors u 1, u,,u k in V if v can be

More information

(v, w) = arccos( < v, w >

(v, w) = arccos( < v, w > MA322 F all203 Notes on Inner Products Notes on Chapter 6 Inner product. Given a real vector space V, an inner product is defined to be a bilinear map F : V V R such that the following holds: For all v,

More information

Dot Products. K. Behrend. April 3, Abstract A short review of some basic facts on the dot product. Projections. The spectral theorem.

Dot Products. K. Behrend. April 3, Abstract A short review of some basic facts on the dot product. Projections. The spectral theorem. Dot Products K. Behrend April 3, 008 Abstract A short review of some basic facts on the dot product. Projections. The spectral theorem. Contents The dot product 3. Length of a vector........................

More information

Linear Algebra V = T = ( 4 3 ).

Linear Algebra V = T = ( 4 3 ). Linear Algebra Vectors A column vector is a list of numbers stored vertically The dimension of a column vector is the number of values in the vector W is a -dimensional column vector and V is a 5-dimensional

More information

Chapter 3 Transformations

Chapter 3 Transformations Chapter 3 Transformations An Introduction to Optimization Spring, 2014 Wei-Ta Chu 1 Linear Transformations A function is called a linear transformation if 1. for every and 2. for every If we fix the bases

More information