1 Recall: Dot product on $\mathbb{R}^2$: $u \cdot v = (u_1, u_2) \cdot (v_1, v_2) = u_1 v_1 + u_2 v_2$, $u \cdot u = u_1^2 + u_2^2 = \|u\|^2$. Geometric Meaning: $u \cdot v = \|u\|\,\|v\| \cos\theta$. [Figure: vectors $u$ and $v$ with angle $\theta$ between them]

2 Reason: The opposite side of the triangle is given by $u - v$. $\|u - v\|^2 = (u - v) \cdot (u - v) = u \cdot u - v \cdot u - u \cdot v + v \cdot v = \|u\|^2 + \|v\|^2 - 2\,u \cdot v$. By the Cosine law: $c^2 = a^2 + b^2 - 2ab \cos\theta$, i.e. $\|u - v\|^2 = \|u\|^2 + \|v\|^2 - 2\|u\|\|v\|\cos\theta$. Comparing the two equalities, we get: $u \cdot v = \|u\|\|v\|\cos\theta$.

3 Inner Product: a generalization of the dot product. Direct generalization to $\mathbb{R}^n$: $u \cdot v := u_1 v_1 + \cdots + u_n v_n = \sum_{i=1}^n u_i v_i$. Using matrix notation: $\sum_{i=1}^n u_i v_i = [\,u_1 \ \cdots \ u_n\,] \begin{bmatrix} v_1 \\ \vdots \\ v_n \end{bmatrix} = u^T v = v^T u$. This is called the (standard) inner product on $\mathbb{R}^n$.
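
Extra (editor's sketch, not in the original notes): the three equivalent forms above are easy to check numerically. A minimal Python/NumPy illustration with arbitrary sample vectors:

```python
import numpy as np

u = np.array([1.0, 2.0, 3.0])
v = np.array([4.0, -1.0, 2.0])

# Three equivalent forms of the standard inner product on R^n:
ip_sum = sum(ui * vi for ui, vi in zip(u, v))  # sum_i u_i v_i
ip_dot = np.dot(u, v)                          # u . v
ip_mat = u.T @ v                               # u^T v (= v^T u)

print(ip_sum, ip_dot, ip_mat)  # 8.0 8.0 8.0
```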

4 Thm 1 (P.359): Let $u, v, w \in \mathbb{R}^n$ and $c \in \mathbb{R}$. Then: (i) $u \cdot v = v \cdot u$; (ii) $(u + v) \cdot w = u \cdot w + v \cdot w$; (iii) $(cu) \cdot v = c(u \cdot v)$; (iv) $u \cdot u \ge 0$, and $u \cdot u = 0$ iff $u = 0$. Note: (iv) is sometimes called the positive-definite property. A general inner product is defined using the above 4 properties. For a complex inner product, one needs to add a complex conjugate to (i).

5 Def: The length (or norm) of $v$ is defined as: $\|v\| := \sqrt{v \cdot v}$. Vectors with $\|v\| = 1$ are called unit vectors. Def: The distance between $u, v$ is defined as: $\mathrm{dist}(u, v) := \|u - v\|$. Def: The angle between $u, v$ is defined as: $\angle(u, v) := \cos^{-1} \dfrac{u \cdot v}{\|u\|\,\|v\|}$.
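
Extra (editor's sketch, not in the original notes): the three definitions translate directly into NumPy; the sample vectors are illustrative:

```python
import numpy as np

u = np.array([3.0, 4.0])
v = np.array([1.0, 0.0])

norm_u = np.sqrt(u @ u)          # ||u|| = sqrt(u . u) = 5.0
dist_uv = np.linalg.norm(u - v)  # dist(u, v) = ||u - v||
cos_t = (u @ v) / (np.linalg.norm(u) * np.linalg.norm(v))
angle = np.arccos(cos_t)         # angle between u and v, in radians

print(norm_u, dist_uv, np.degrees(angle))  # 5.0  4.472...  53.13...
```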

6 Extra: The General Inner Product Space. Let $V$ be a vector space over $\mathbb{R}$ or $\mathbb{C}$. Def: An inner product on $V$ is a real/complex-valued function of two vector variables $\langle u, v \rangle$ such that: (a) $\langle u, v \rangle = \overline{\langle v, u \rangle}$; (conjugate symmetric) (b) $\langle u + v, w \rangle = \langle u, w \rangle + \langle v, w \rangle$; (c) $\langle cu, v \rangle = c\,\langle u, v \rangle$; (linear in the first vector variable) (d) $\langle u, u \rangle \ge 0$, and $\langle u, u \rangle = 0$ iff $u = 0$. (positive-definite property)

7 Def: A real/complex vector space $V$ equipped with an inner product is called an inner product space. Note: (i) An inner product is conjugate linear in the second vector variable: $\langle u, cv_1 + dv_2 \rangle = \bar{c}\,\langle u, v_1 \rangle + \bar{d}\,\langle u, v_2 \rangle$. (ii) If we replaced (a) by $\langle u, v \rangle = \langle v, u \rangle$, then $\langle iu, iu \rangle = i^2 \langle u, u \rangle = -\langle u, u \rangle$, which would be incompatible with (d). When working with a complex inner product space, one must take a complex conjugate when interchanging $u$ and $v$.

8 Examples of (general) inner product spaces: 1. The dot product on $\mathbb{C}^n$ ($*$: conjugate transpose): $\langle u, v \rangle := u_1 \bar{v}_1 + \cdots + u_n \bar{v}_n = v^* u$. 2. A non-standard inner product on $\mathbb{R}^2$: $\langle u, v \rangle := u_1 v_1 - u_1 v_2 - u_2 v_1 + 2 u_2 v_2 = v^T \begin{bmatrix} 1 & -1 \\ -1 & 2 \end{bmatrix} u$. 3. An inner product on the matrix space $M_{m \times n}$: $\langle A, B \rangle := \mathrm{tr}(B^* A) = \sum_{j=1}^m \sum_{k=1}^n a_{jk} \bar{b}_{jk}$.

9 4. Consider the vector space $V$ of continuous real/complex-valued functions defined on the interval $[a, b]$. Then the following is an inner product on $V$: $\langle f, g \rangle := \frac{1}{b-a} \int_a^b f(t)\,\overline{g(t)}\,dt$. [In the real case, the norm $\|f\|$ gives the root-mean-square (r.m.s.) of the area bounded by the curve of $f$ and the $t$-axis over the interval $[a, b]$.]
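
Extra (editor's sketch, not in the original notes): this integral inner product can be approximated on a grid; `fn_inner` is a hypothetical helper name and the test functions are illustrative:

```python
import numpy as np

# <f, g> = (1/(b-a)) * integral_a^b f(t) g(t) dt, approximated on a fine grid.
def fn_inner(f, g, a, b, n=100_001):
    t = np.linspace(a, b, n)
    return np.trapz(f(t) * g(t), t) / (b - a)

# On [0, 1]: <t, 1> = 1/2, and ||t||^2 = <t, t> = 1/3.
print(fn_inner(lambda t: t, lambda t: np.ones_like(t), 0.0, 1.0))  # ~0.5
print(fn_inner(lambda t: t, lambda t: t, 0.0, 1.0))                # ~0.3333
```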

10 Schwarz's inequality: $(a_1 b_1 + \cdots + a_n b_n)^2 \le (a_1^2 + \cdots + a_n^2)(b_1^2 + \cdots + b_n^2)$. Pf: The following equation cannot have two distinct real solutions: $(a_1 x + b_1)^2 + \cdots + (a_n x + b_n)^2 = 0$, i.e. $(a_1^2 + \cdots + a_n^2)x^2 + 2(a_1 b_1 + \cdots + a_n b_n)x + (b_1^2 + \cdots + b_n^2) = 0$. So the discriminant satisfies $\Delta \le 0$, and this gives the inequality.

11 The Cauchy-Schwarz Inequality: $|u \cdot v| \le \|u\|\,\|v\|$, and equality holds if, and only if, $\{u, v\}$ is l.d. Proof: When $u \ne 0$, set $\hat{u} = \frac{1}{\|u\|} u$. Consider $w = v - (v \cdot \hat{u})\hat{u}$. [Figure: $v$, $w$, and the component $(v \cdot \hat{u})\hat{u}$ along $u$] Obviously, $w \cdot w = \|w\|^2 \ge 0$, which gives the Cauchy-Schwarz Inequality.

12 Set $k = v \cdot \hat{u}$: $0 \le (v - k\hat{u}) \cdot (v - k\hat{u}) = v \cdot v - 2k(v \cdot \hat{u}) + k^2 (\hat{u} \cdot \hat{u}) = \|v\|^2 - k^2$. Note that $k = \frac{v \cdot u}{\|u\|}$, so: $k^2 = \frac{(v \cdot u)^2}{\|u\|^2} \le \|v\|^2$, i.e. $(u \cdot v)^2 \le \|u\|^2 \|v\|^2$. Taking positive square roots, we obtain the result.

13 Thm: (Triangle Inequality) For $u, v \in \mathbb{R}^n$: $\|u + v\| \le \|u\| + \|v\|$, and equality holds iff one of the vectors is a non-negative scalar multiple of the other. Proof: Consider $\|u + v\|^2$: $(u + v) \cdot (u + v) = \|u\|^2 + 2(u \cdot v) + \|v\|^2 \le \|u\|^2 + 2\|u\|\|v\| + \|v\|^2 = (\|u\| + \|v\|)^2$, using Cauchy-Schwarz. Taking square roots, we obtain the inequality.

14 Orthogonality: Pythagoras' Theorem in vector form: $\|u + v\|^2 = \|u\|^2 + \|v\|^2$. [Figure: right triangle with legs $u$, $v$ and hypotenuse $u + v$] But in general we have: $\|u + v\|^2 = \|u\|^2 + 2(u \cdot v) + \|v\|^2$, so we need $u \cdot v = 0$.

15 Def: Let $u, v$ be two vectors in $\mathbb{R}^n$. When $u \cdot v = 0$, we say that $u$ is orthogonal to $v$, denoted by $u \perp v$. This generalizes the concept of perpendicularity. $0$ is the only vector that is orthogonal to every vector $v$ in $\mathbb{R}^n$. Example: In $\mathbb{R}^2$, we have: $\begin{bmatrix} 3 \\ 4 \end{bmatrix} \perp \begin{bmatrix} -4 \\ 3 \end{bmatrix}$. Thm 2 (P.362): $u$ and $v$ are orthogonal iff $\|u + v\|^2 = \|u\|^2 + \|v\|^2$.

16 Common Orthogonality: Def: Let $S$ be a set of vectors in $\mathbb{R}^n$. If $u$ is orthogonal to every vector in $S$, we say $u$ is orthogonal to $S$, denoted by $u \perp S$; i.e. we can regard $u$ as a common perpendicular to $S$. Examples: (i) $0 \perp \mathbb{R}^n$. (ii) In $\mathbb{R}^2$, let $S$ = the x-axis. Then $e_2 \perp S$. (iii) In $\mathbb{R}^3$, let $S$ = the x-axis. Then both $e_2 \perp S$ and $e_3 \perp S$. Exercise: Let $u, v \perp S$. Show that: (i) $(au + bv) \perp S$ for any numbers $a, b$; (ii) $u \perp \mathrm{Span}\,S$. ***

17 Orthogonal Complement: Def: Let $S$ be a set of vectors in $\mathbb{R}^n$. We define: $S^\perp := \{u \in \mathbb{R}^n \mid u \perp S\}$, called the orthogonal complement of $S$ in $\mathbb{R}^n$; i.e. $S^\perp$ collects all the common perpendiculars to $S$. Examples: (i) $\{0\}^\perp = \mathbb{R}^n$, $(\mathbb{R}^n)^\perp = \{0\}$. (ii) In $\mathbb{R}^2$, let $S$ = the x-axis. Then $S^\perp$ = the y-axis. (iii) In $\mathbb{R}^3$, take $S = \{e_1\}$. Then $S^\perp$ = the yz-plane.

18 Thm: $S^\perp$ is always a subspace of $\mathbb{R}^n$. Checking: (i) $0 \perp v$ for every $v \in S$, so $0 \in S^\perp$. (ii) Pick any $u_1, u_2 \in S^\perp$. For any scalars $a, b \in \mathbb{R}$, consider: $(au_1 + bu_2) \cdot v = a(u_1 \cdot v) + b(u_2 \cdot v) = a \cdot 0 + b \cdot 0 = 0$ whenever $v \in S$. So $au_1 + bu_2 \in S^\perp$ (cf. previous exercise). Note: $S$ itself need not be a subspace. Thm: (a) $S^\perp = (\mathrm{Span}\,S)^\perp$. (b) $\mathrm{Span}\,S \subseteq (S^\perp)^\perp$. Pf: (a) $S^\perp \supseteq (\mathrm{Span}\,S)^\perp$ is easy to see, since any vector $u \perp \mathrm{Span}\,S$ must also satisfy $u \perp S$.

19 Now, pick any $u \in S^\perp$. For every $v \in \mathrm{Span}\,S$, write: $v = c_1 v_1 + \cdots + c_p v_p$, $v_i \in S$, $i = 1, \ldots, p$. Then, since $u \perp S$: $u \cdot v = c_1 (u \cdot v_1) + \cdots + c_p (u \cdot v_p) = 0$, and hence $u \in (\mathrm{Span}\,S)^\perp$, so $S^\perp \subseteq (\mathrm{Span}\,S)^\perp$ is proved. (b) Pick a vector $w \in \mathrm{Span}\,S$; we have the l.c. $w = c_1 v_1 + \cdots + c_p v_p$. For any $u \in S^\perp$: $w \cdot u = c_1 (v_1 \cdot u) + \cdots + c_p (v_p \cdot u) = 0$, so $w \in (S^\perp)^\perp$.

20 Thm 3 (P.363): Let $A$ be an $m \times n$ matrix. Then: $(\mathrm{Row}\,A)^\perp = \mathrm{Nul}\,A$ and $(\mathrm{Col}\,A)^\perp = \mathrm{Nul}\,A^T$. Pf: Writing $r_1, \ldots, r_m$ for the rows of $A$ (as column vectors), the product $Ax$ can be rewritten as: $Ax = \begin{bmatrix} r_1^T \\ \vdots \\ r_m^T \end{bmatrix} x = \begin{bmatrix} r_1^T x \\ \vdots \\ r_m^T x \end{bmatrix}$. So $x \in \mathrm{Nul}\,A$ iff $x \perp \{r_1, \ldots, r_m\}$ iff $x \in (\mathrm{Row}\,A)^\perp$. Hence $(\mathrm{Row}\,A)^\perp = \mathrm{Nul}\,A$. Applying the result to $A^T$, we obtain: $(\mathrm{Col}\,A)^\perp = (\mathrm{Row}\,A^T)^\perp = \mathrm{Nul}\,A^T$.
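
Extra (editor's sketch, not in the original notes): the identity $(\mathrm{Row}\,A)^\perp = \mathrm{Nul}\,A$ can be checked numerically; the matrix is an arbitrary illustrative choice:

```python
import numpy as np
from scipy.linalg import null_space

A = np.array([[1.0, 2.0, 3.0],
              [2.0, 4.0, 6.0]])   # rank 1, so Nul A is 2-dimensional

N = null_space(A)                 # columns: an orthonormal basis of Nul A
print(np.allclose(A @ N, 0))      # True: each row of A is orthogonal to Nul A
```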

21 Orthogonal sets and Orthonormal sets. Def: A set $S$ is called orthogonal if any two vectors in $S$ are always orthogonal to each other. Def: A set $S$ is called orthonormal if (i) $S$ is orthogonal, and (ii) each vector in $S$ is of unit length. Example: An orthonormal set in $\mathbb{R}^3$: $\left\{ \frac{1}{\sqrt{3}}\begin{bmatrix} 1 \\ 1 \\ 1 \end{bmatrix},\ \frac{1}{\sqrt{2}}\begin{bmatrix} 1 \\ -1 \\ 0 \end{bmatrix},\ \frac{1}{\sqrt{6}}\begin{bmatrix} 1 \\ 1 \\ -2 \end{bmatrix} \right\}$.

22 Thm 4 (P.366): An orthogonal set $S$ of non-zero vectors is always linearly independent. Pf: Let $S = \{u_1, u_2, \ldots, u_p\}$ and consider the relation: $c_1 u_1 + c_2 u_2 + \cdots + c_p u_p = 0$. Take the inner product with $u_1$; then: $c_1 (u_1 \cdot u_1) + c_2 (u_2 \cdot u_1) + \cdots + c_p (u_p \cdot u_1) = 0$, i.e. $c_1 \|u_1\|^2 + c_2 \cdot 0 + \cdots + c_p \cdot 0 = 0$. As $u_1 \ne 0$, we must have $c_1 = 0$. Similarly for the other $c_i$. So $S$ must be l.i.

23 The method of proof of the previous Thm 4 gives: Thm 5 (P.367): Let $S = \{u_1, \ldots, u_p\}$ be an orthogonal set of non-zero vectors and let $v \in \mathrm{Span}\,S$. Then: $v = \frac{v \cdot u_1}{\|u_1\|^2}\, u_1 + \cdots + \frac{v \cdot u_p}{\|u_p\|^2}\, u_p$. Pf: Let $c_1, \ldots, c_p$ be such that $v = c_1 u_1 + \cdots + c_p u_p$. Taking the inner product with $u_1$, we have: $v \cdot u_1 = c_1 (u_1 \cdot u_1) + \cdots + c_p (u_p \cdot u_1) = c_1 \|u_1\|^2$. So $c_1 = \frac{v \cdot u_1}{\|u_1\|^2}$. Similarly for the other $c_i$.

24 Thm 5′: Let $S = \{\hat{u}_1, \ldots, \hat{u}_p\}$ be an orthonormal set. Then for any $v \in \mathrm{Span}\,S$, we have: $v = (v \cdot \hat{u}_1)\hat{u}_1 + \cdots + (v \cdot \hat{u}_p)\hat{u}_p$. Remark: This generalizes our familiar expression in $\mathbb{R}^3$: $v = (v \cdot \mathbf{i})\mathbf{i} + (v \cdot \mathbf{j})\mathbf{j} + (v \cdot \mathbf{k})\mathbf{k}$. Example: Express $v$ as a l.c. of the vectors in $S$: $v = \begin{bmatrix} 1 \\ 2 \\ 3 \end{bmatrix}$, $S = \left\{ \begin{bmatrix} 3 \\ 1 \\ 1 \end{bmatrix}, \begin{bmatrix} -1 \\ 2 \\ 1 \end{bmatrix}, \begin{bmatrix} -1 \\ -4 \\ 7 \end{bmatrix} \right\}$.

25 New method: Compute $c_1, c_2, c_3$ directly: $c_1 = \frac{v \cdot u_1}{\|u_1\|^2} = \frac{8}{11}$, $c_2 = \frac{v \cdot u_2}{\|u_2\|^2} = \frac{6}{6} = 1$, $c_3 = \frac{v \cdot u_3}{\|u_3\|^2} = \frac{12}{66} = \frac{2}{11}$.
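
Extra (editor's sketch, not in the original notes; it assumes the vectors as reconstructed in the example above): the coefficients can be verified numerically:

```python
import numpy as np

u1 = np.array([3.0, 1.0, 1.0])
u2 = np.array([-1.0, 2.0, 1.0])
u3 = np.array([-1.0, -4.0, 7.0])
v  = np.array([1.0, 2.0, 3.0])

cs = [(v @ u) / (u @ u) for u in (u1, u2, u3)]
print(cs)                              # [0.7272..., 1.0, 0.1818...] = 8/11, 1, 2/11
print(cs[0]*u1 + cs[1]*u2 + cs[2]*u3)  # [1. 2. 3.] -- reassembles v
```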

26 Exercise: Determine if $v \in \mathrm{Span}\{u_1, u_2\}$: $v = \begin{bmatrix} 3 \\ 2 \\ 5 \end{bmatrix}$, $u_1 = \begin{bmatrix} 1 \\ 2 \\ 2 \end{bmatrix}$, $u_2 = \cdots$ ***

27 Orthogonal basis and Orthonormal basis. Def: A basis for a subspace $W$ is called an orthogonal basis if it is an orthogonal set. Def: A basis for a subspace $W$ is called an orthonormal basis if it is an orthonormal set. Examples: (i) $\{e_1, \ldots, e_n\}$ is an orthonormal basis for $\mathbb{R}^n$. (ii) $S = \left\{ \begin{bmatrix} 3 \\ 4 \end{bmatrix}, \begin{bmatrix} -4 \\ 3 \end{bmatrix} \right\}$ is an orthogonal basis for $\mathbb{R}^2$; $S' = \left\{ \begin{bmatrix} 3/5 \\ 4/5 \end{bmatrix}, \begin{bmatrix} -4/5 \\ 3/5 \end{bmatrix} \right\}$ is an orthonormal basis for $\mathbb{R}^2$.

28 (iii) The following set $S$: $S = \left\{ \begin{bmatrix} 3 \\ 1 \\ 1 \end{bmatrix}, \begin{bmatrix} -1 \\ 2 \\ 1 \end{bmatrix}, \begin{bmatrix} -1 \\ -4 \\ 7 \end{bmatrix} \right\}$ is an orthogonal basis for $\mathbb{R}^3$. (iv) The columns of an $n \times n$ orthogonal matrix $A$ form an orthonormal basis for $\mathbb{R}^n$. Orthogonal matrix: a square matrix with $A^T A = I_n$.

29 Checking: Write $A = [\,v_1 \ \cdots \ v_n\,]$. The $(i,j)$-th entry of $A^T A$ is $v_i^T v_j = v_i \cdot v_j$, while the $(i,j)$-th entry of $I_n$ is $1$ if $i = j$ and $0$ if $i \ne j$. The above checking also works for non-square matrices: Thm 6 (P.371): The $n$ columns of an $m \times n$ matrix $U$ are orthonormal iff $U^T U = I_n$. But for square matrices: $AB = I \Rightarrow BA = I$. So: (iv′) The rows of an $n \times n$ orthogonal matrix $A$ (written in column form) also form an orthonormal basis for $\mathbb{R}^n$.
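
Extra (editor's sketch, not in the original notes): Thm 6 in NumPy, with an illustrative $3 \times 2$ matrix $U$ whose columns are orthonormal. Note that $UU^T \ne I$ here, matching the remark that $AB = I \Rightarrow BA = I$ only holds for square matrices:

```python
import numpy as np

U = np.column_stack([np.array([1.0,  1.0, 1.0]) / np.sqrt(3),
                     np.array([1.0, -1.0, 0.0]) / np.sqrt(2)])

print(np.allclose(U.T @ U, np.eye(2)))  # True: orthonormal columns
print(np.allclose(U @ U.T, np.eye(3)))  # False: U is not square
```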

30 Matrices having orthonormal columns are very special: Thm 7 (P.371): Let $T : \mathbb{R}^n \to \mathbb{R}^m$ be a linear transformation given by an $m \times n$ standard matrix $U$ with orthonormal columns. Then for any $x, y \in \mathbb{R}^n$: a. $\|Ux\| = \|x\|$ (preserving length); b. $(Ux) \cdot (Uy) = x \cdot y$ (preserving inner product); c. $(Ux) \cdot (Uy) = 0$ iff $x \cdot y = 0$ (preserving orthogonality). Pf: Direct verifications using $U^T U = I_n$. The results are not true for merely orthogonal columns.

31 Recall: Let $S = \{u_1, \ldots, u_p\}$ be orthogonal. When $v \in W = \mathrm{Span}\,S$, we have: $v = \frac{v \cdot u_1}{\|u_1\|^2}\, u_1 + \cdots + \frac{v \cdot u_p}{\|u_p\|^2}\, u_p$. What happens if $v \notin W$? LHS $\ne$ RHS, as the RHS is always a vector in $W$. Still, $v' :=$ RHS is computable. What is the relation between $v$ and $v'$?

32 LHS $= v$, RHS $= v' = \sum_{i=1}^p \frac{v \cdot u_i}{\|u_i\|^2}\, u_i$. Take the inner product of the RHS with $u_j$: $v' \cdot u_j = \left( \sum_{i=1}^p \frac{v \cdot u_i}{\|u_i\|^2}\, u_i \right) \cdot u_j = \sum_{i=1}^p \frac{v \cdot u_i}{\|u_i\|^2} (u_i \cdot u_j) = \frac{v \cdot u_j}{\|u_j\|^2} (u_j \cdot u_j) = v \cdot u_j$, which is the same as LHS $\cdot\, u_j$.

33 In other words, $(v - v') \cdot u_j = 0$ for $j = 1, \ldots, p$. Thm: The vector $z = v - v'$ is orthogonal to every vector in $\mathrm{Span}\,S$, i.e. $z \in (\mathrm{Span}\,S)^\perp$. [Figure: $v$ above the subspace $W$, with $v' \in W$ and $z = v - v'$ perpendicular to each $u_i$]

34 Def: Let $\{u_1, \ldots, u_p\}$ be an orthogonal basis for $W$. For each $v$ in $\mathbb{R}^n$, the following vector in $W$: $\mathrm{proj}_W v := \frac{v \cdot u_1}{\|u_1\|^2}\, u_1 + \cdots + \frac{v \cdot u_p}{\|u_p\|^2}\, u_p$ is called the orthogonal projection of $v$ onto $W$. Remark: $\{u_1, \ldots, u_p\}$ must be orthogonal; otherwise the RHS will not give us the correct vector $v$. Note: $v = \mathrm{proj}_W v \iff v \in W$.
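
Extra (editor's sketch, not in the original notes): the projection formula as a small Python function; `proj` is a hypothetical helper name, and the example anticipates the xy-plane example on the next slide:

```python
import numpy as np

def proj(v, basis):
    """Orthogonal projection of v onto W = Span(basis).
    `basis` must be an *orthogonal* set of vectors (see the Remark above)."""
    return sum(((v @ u) / (u @ u)) * u for u in basis)

e1, e2 = np.array([1.0, 0.0, 0.0]), np.array([0.0, 1.0, 0.0])
v = np.array([2.0, 3.0, 4.0])
print(proj(v, [e1, e2]))  # [2. 3. 0.] -- the projection onto the xy-plane
```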

35 Example: In $\mathbb{R}^3$, consider $S = \{e_1, e_2\}$. Then $W = \mathrm{Span}\,S$ is the xy-plane. For any vector $v = \begin{bmatrix} x \\ y \\ z \end{bmatrix} \in \mathbb{R}^3$: [Figure: $v$ with components $\frac{v \cdot e_1}{\|e_1\|^2}\, e_1$ along the x-axis and $\frac{v \cdot e_2}{\|e_2\|^2}\, e_2$ along the y-axis] $\mathrm{proj}_W v = \begin{bmatrix} x \\ y \\ 0 \end{bmatrix}$.

36 Exercise: Consider $\mathbb{R}^3$ and $W = \mathrm{Span}\{u_1, u_2\}$. Find $\mathrm{proj}_W v$: $v = \begin{bmatrix} 1 \\ 0 \\ 1 \end{bmatrix}$, $u_1 = \begin{bmatrix} 2 \\ 2 \\ 1 \end{bmatrix}$, $u_2 = \cdots$ ***

37 Def: The decomposition: $v = \mathrm{proj}_W v + (v - \mathrm{proj}_W v)$, with $(v - \mathrm{proj}_W v) \in W^\perp$, is called the orthogonal decomposition of $v$ w.r.t. $W$. [Figure: $v$ split into $w = \mathrm{proj}_W v$ in $W$ and $z = v - \mathrm{proj}_W v$ perpendicular to $W$] Thm 8 (P.376): The orthogonal decomposition w.r.t. $W$ is the unique way to write $v = w + z$ with $w \in W$ and $z \in W^\perp$.

38 Exercise: Find the orthogonal projection of $v$ onto $W = \mathrm{Nul}\,A$: $A = [\,\cdots\,]$, $v = \cdots$ *** Thm 9 (P.378): Let $v \in \mathbb{R}^n$ and let $w \in W$. Then we have: $\|v - \mathrm{proj}_W v\| \le \|v - w\|$, and equality holds only when $w = \mathrm{proj}_W v$.

39 Pf: We can rewrite $v - w$ as: $v - w = (v - \mathrm{proj}_W v) + (\mathrm{proj}_W v - w)$. [Figure: right-angled triangle with legs $v - \mathrm{proj}_W v$ and $\mathrm{proj}_W v - w$, and hypotenuse $v - w$] We can then apply Pythagoras' Theorem to the right-angled triangle.

40 $\|v - w\|^2 = \|v - \mathrm{proj}_W v\|^2 + \|\mathrm{proj}_W v - w\|^2 \ge \|v - \mathrm{proj}_W v\|^2$, and equality holds iff $\mathrm{proj}_W v - w = 0$, i.e. iff $w = \mathrm{proj}_W v$. Because of the inequality $\|v - \mathrm{proj}_W v\| \le \|v - w\|$, $\mathrm{proj}_W v$ is sometimes called the best approximation of $v$ by vectors in $W$.

41 Def: The distance of $v$ to $W$ is defined as: $\mathrm{dist}(v, W) := \|v - \mathrm{proj}_W v\|$. Obviously, $v \in W$ iff $\mathrm{dist}(v, W) = 0$. Exercise: Let $W = \mathrm{Span}\{u_1, u_2, u_3\}$. Find $\mathrm{dist}(v, W)$: $u_1 = \cdots$, $u_2 = \cdots$, $u_3 = \cdots$ and $v = \cdots$ Sol: Remember to check that $\{u_1, u_2, u_3\}$ is orthogonal. ***

42 Extension of an Orthogonal Set. Let $S = \{u_1, \ldots, u_p\}$ be an orthogonal basis for $W = \mathrm{Span}\,S$. When $W \ne \mathbb{R}^n$, we can find a vector $v \notin W$, and then: $z = v - \mathrm{proj}_W v \ne 0$. This vector $z$ is in $W^\perp$, i.e. it satisfies: $z \cdot w = 0$ for every $w \in W$. Hence the following set is again orthogonal: $S \cup \{z\} = \{u_1, \ldots, u_p, z\}$.

43 Thm: $\mathrm{Span}(S \cup \{v\}) = \mathrm{Span}(S \cup \{z\})$. In other words, we can extend an orthogonal set $S$ by adding the vector $z$. $S_1 = \{u_1\}$ orthogonal, $v_2 \notin \mathrm{Span}\,S_1$, then compute $z_2$: $S_2 = \{u_1, z_2\}$ is again orthogonal, and $\mathrm{Span}\{u_1, v_2\} = \mathrm{Span}\{u_1, z_2\}$. Writing $u_2 = z_2$: $S_2 = \{u_1, u_2\}$ orthogonal, $v_3 \notin \mathrm{Span}\,S_2$, compute $z_3$: $S_3 = \{u_1, u_2, z_3\}$ is again orthogonal, and $\mathrm{Span}\{u_1, u_2, v_3\} = \mathrm{Span}\{u_1, u_2, z_3\}$. Continuing in this way is called the Gram-Schmidt orthogonalization process.

44 Thm 11 (P.383): Let $\{x_1, \ldots, x_p\}$ be l.i. Define $u_1 = x_1$ and: $u_2 = x_2 - \frac{x_2 \cdot u_1}{\|u_1\|^2}\, u_1$, $u_3 = x_3 - \frac{x_3 \cdot u_1}{\|u_1\|^2}\, u_1 - \frac{x_3 \cdot u_2}{\|u_2\|^2}\, u_2$, ..., $u_p = x_p - \sum_{i=1}^{p-1} \frac{x_p \cdot u_i}{\|u_i\|^2}\, u_i$. Then $\{u_1, \ldots, u_p\}$ will be orthogonal, and for $1 \le k \le p$: $\mathrm{Span}\{x_1, \ldots, x_k\} = \mathrm{Span}\{u_1, \ldots, u_k\}$.
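
Extra (editor's sketch, not in the original notes): Thm 11 as a short routine; the test vectors are those of the worked example two slides below, as reconstructed there:

```python
import numpy as np

def gram_schmidt(xs):
    """Orthogonalize a linearly independent list of vectors (Thm 11)."""
    us = []
    for x in xs:
        # subtract the projection of x onto the span of the previous u_i
        u = x - sum(((x @ ui) / (ui @ ui)) * ui for ui in us)
        us.append(u)
    return us

x1 = np.array([1.0, 1.0, 0.0])
x2 = np.array([1.0, 2.0, 0.0])
x3 = np.array([1.0, 1.0, 1.0])
for u in gram_schmidt([x1, x2, x3]):
    print(u)   # [1. 1. 0.], [-0.5 0.5 0.], [0. 0. 1.]
```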

45 Notes: (i) We must use $\{u_i\}$ to compute $\mathrm{proj}_{W_k} x_{k+1}$, since the formula $\mathrm{proj}_{W_k} x_{k+1} = \sum_{i=1}^k \frac{x_{k+1} \cdot u_i}{\|u_i\|^2}\, u_i$ is only valid for the orthogonal set $\{u_i\}$. (ii) If we obtain $u_k = 0$ for some $k$, i.e. $x_k = \mathrm{proj}_{W_{k-1}} x_k$, we have: $x_k \in \mathrm{Span}\{x_1, \ldots, x_{k-1}\}$, so $\{x_1, \ldots, x_k\}$ would be l.d. instead. (iii) All the $u_i$ will be non-zero vectors, as $\{x_i\}$ is l.i.

46 Example: Apply the Gram-Schmidt Process to $\{x_1, x_2, x_3\}$: $x_1 = \begin{bmatrix} 1 \\ 1 \\ 0 \end{bmatrix}$, $x_2 = \begin{bmatrix} 1 \\ 2 \\ 0 \end{bmatrix}$, $x_3 = \begin{bmatrix} 1 \\ 1 \\ 1 \end{bmatrix}$. Solution: Take $u_1 = x_1$. Then: $u_2 = x_2 - \frac{x_2 \cdot u_1}{\|u_1\|^2}\, u_1 = x_2 - \frac{3}{2}\, u_1 = \begin{bmatrix} -1/2 \\ 1/2 \\ 0 \end{bmatrix}$, $u_3 = x_3 - \frac{x_3 \cdot u_1}{\|u_1\|^2}\, u_1 - \frac{x_3 \cdot u_2}{\|u_2\|^2}\, u_2 = x_3 - u_1 - 0 \cdot u_2 = \begin{bmatrix} 0 \\ 0 \\ 1 \end{bmatrix}$.

47 Example: Apply the Gram-Schmidt Process to $\{x_1, x_3, x_2\}$ (the same vectors, taken in a different order): $x_1 = \begin{bmatrix} 1 \\ 1 \\ 0 \end{bmatrix}$, $x_3 = \begin{bmatrix} 1 \\ 1 \\ 1 \end{bmatrix}$, $x_2 = \begin{bmatrix} 1 \\ 2 \\ 0 \end{bmatrix}$. Solution: Take $u_1 = x_1$. Then: $u_2 = x_3 - \frac{x_3 \cdot u_1}{\|u_1\|^2}\, u_1 = x_3 - u_1 = \begin{bmatrix} 0 \\ 0 \\ 1 \end{bmatrix}$, $u_3 = x_2 - \frac{x_2 \cdot u_1}{\|u_1\|^2}\, u_1 - \frac{x_2 \cdot u_2}{\|u_2\|^2}\, u_2 = x_2 - \frac{3}{2}\, u_1 - 0 \cdot u_2 = \begin{bmatrix} -1/2 \\ 1/2 \\ 0 \end{bmatrix}$.

48 Exercise: Find an orthogonal basis for $\mathrm{Col}\,A$: $A = \cdots$ Sol: First find a basis for $\mathrm{Col}\,A$ (e.g. the pivot columns of $A$). Then apply the Gram-Schmidt Process. ***

49 Approximation Problems: Solve $Ax = b$. Due to the presence of errors, a consistent system may appear as an inconsistent system: $x_1 + x_2 = 1$, $x_1 - x_2 = 0$, $2x_1 + 2x_2 = 2$ becomes $x_1 + x_2 = 1.01$, $x_1 - x_2 = 0.01$, $2x_1 + 2x_2 = 2.01$. Also, in practice, exact solutions are usually not necessary. How do we obtain a good approximate solution for the above inconsistent system?

50 Least squares solution: How do we measure the goodness of $x_0$ as an approximate solution to the system $Ax = b$? Minimize the difference $\|x - x_0\|$? Problem: but $x$ is unknown... Another way of approximation: $x_0 \approx x \iff Ax_0 \approx Ax = b$.

51 Analysis: Find $x_0$ such that: $Ax_0 = b_0$, and $b_0$ is as close to $b$ as possible. $b_0$ must be in $\mathrm{Col}\,A$. $\|b - b_0\|^2$ is a sum of squares, whence the name least squares solution. Best approximation property of orthogonal projection: $\|b - \mathrm{proj}_W b\| \le \|b - w\|$ for every $w$ in $W = \mathrm{Col}\,A$. So we should take $b_0 = \mathrm{proj}_W b$.

52 Example: Find the least squares solution of the inconsistent system: $x_1 + x_2 = 1.01$, $x_1 - x_2 = 0.01$, $2x_1 + 2x_2 = 2.01$. To compute $\mathrm{proj}_W b$, we need an orthogonal basis for $W = \mathrm{Col}\,A$ first. A basis for $\mathrm{Col}\,A$ is: $\left\{ \begin{bmatrix} 1 \\ 1 \\ 2 \end{bmatrix}, \begin{bmatrix} 1 \\ -1 \\ 2 \end{bmatrix} \right\}$.

53 Then, by the Gram-Schmidt Process, we get an orthogonal basis for $W = \mathrm{Col}\,A$: $\left\{ \begin{bmatrix} 1 \\ 1 \\ 2 \end{bmatrix}, \frac{1}{3}\begin{bmatrix} 1 \\ -5 \\ 2 \end{bmatrix} \right\}$. Compute $b_0 = \mathrm{proj}_W b$: $b_0 = \frac{b \cdot u_1}{\|u_1\|^2}\, u_1 + \frac{b \cdot u_2}{\|u_2\|^2}\, u_2$.

54 Hence: $b_0 = \begin{bmatrix} 1.006 \\ 0.01 \\ 2.012 \end{bmatrix}$. Since $b_0 \in \mathrm{Col}\,A$, the system $Ax_0 = b_0$ must be consistent. Solving $Ax_0 = b_0$, we obtain the following least squares solution: $x_0 = \begin{bmatrix} 0.508 \\ 0.498 \end{bmatrix}$.

55 But we have the following result: $(\mathrm{Col}\,A)^\perp = \mathrm{Nul}\,A^T$. Then, since we take $b_0 = \mathrm{proj}_{\mathrm{Col}\,A}\, b$: $(b - b_0) \in (\mathrm{Col}\,A)^\perp$, i.e. $(b - b_0) \in \mathrm{Nul}\,A^T$, i.e. $A^T (b - b_0) = 0$, i.e. $A^T b_0 = A^T b$. So, if $x_0$ is an approximate solution with $Ax_0 = b_0$, we have: $A^T (Ax_0) = A^T b$. The above is usually called the normal equation of $Ax = b$.

56 Thm 13 (P.389): The least squares solutions of $Ax = b$ are the solutions of the normal equation $A^T A x = A^T b$. In the following case, the least squares solution will be unique: Thm 14 (P.391): Let $A$ be an $m \times n$ matrix with $\mathrm{rank}\,A = n$. Then the $n \times n$ matrix $A^T A$ is invertible. Example: Find again the least squares solution of: $x_1 + x_2 = 1.01$, $x_1 - x_2 = 0.01$, $2x_1 + 2x_2 = 2.01$.

57 Solution: Solve the normal equation. Compute: $A^T A = \begin{bmatrix} 1 & 1 & 2 \\ 1 & -1 & 2 \end{bmatrix} \begin{bmatrix} 1 & 1 \\ 1 & -1 \\ 2 & 2 \end{bmatrix} = \begin{bmatrix} 6 & 4 \\ 4 & 6 \end{bmatrix}$, $A^T b = \begin{bmatrix} 1 & 1 & 2 \\ 1 & -1 & 2 \end{bmatrix} \begin{bmatrix} 1.01 \\ 0.01 \\ 2.01 \end{bmatrix} = \begin{bmatrix} 5.04 \\ 5.02 \end{bmatrix}$. So the normal equation is: $\begin{bmatrix} 6 & 4 \\ 4 & 6 \end{bmatrix} \begin{bmatrix} x_1 \\ x_2 \end{bmatrix} = \begin{bmatrix} 5.04 \\ 5.02 \end{bmatrix}$.
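
Extra (editor's sketch, not in the original notes): solving this normal equation numerically recovers the least squares solution found above:

```python
import numpy as np

A = np.array([[1.0,  1.0],
              [1.0, -1.0],
              [2.0,  2.0]])
b = np.array([1.01, 0.01, 2.01])

x0 = np.linalg.solve(A.T @ A, A.T @ b)       # normal equation A^T A x = A^T b
print(x0)                                    # [0.508 0.498]
print(np.linalg.lstsq(A, b, rcond=None)[0])  # same answer from the built-in solver
```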

58 Least Squares Problems. Linear Regression: Fitting data $(x_i, y_i)$ with a straight line. [Figure: scatter plot of data points and a fitted line; we minimize the differences indicated by the red vertical intervals]

59 When a straight line $y = c + mx$ can pass through all the points, it will of course best fit the data. This requires the system $c + mx_1 = y_1$, ..., $c + mx_n = y_n$, i.e. $\begin{bmatrix} 1 & x_1 \\ \vdots & \vdots \\ 1 & x_n \end{bmatrix} \begin{bmatrix} c \\ m \end{bmatrix} = \begin{bmatrix} y_1 \\ \vdots \\ y_n \end{bmatrix}$, to be consistent. But in general the above system $Ax = b$ is inconsistent.

60 Measurement of closeness: the square sum of y-distances: $|y_1 - (mx_1 + c)|^2 + \cdots + |y_n - (mx_n + c)|^2$. Note that this is expressed as $\|b - b_0\|^2$, where: $b = \begin{bmatrix} y_1 \\ \vdots \\ y_n \end{bmatrix}$, $b_0 = \begin{bmatrix} c + mx_1 \\ \vdots \\ c + mx_n \end{bmatrix}$. $b_0 \in \mathrm{Col}\,A$, since $Ax = b_0$ is consistent. Use the normal equation!

61 Example: Find a straight line that best fits the points: $(2, 1), (5, 2), (7, 3), (8, 3)$, in the sense of minimizing the square-sum of y-distances. Sol: The (inconsistent) system is: $\begin{bmatrix} 1 & 2 \\ 1 & 5 \\ 1 & 7 \\ 1 & 8 \end{bmatrix} \begin{bmatrix} c \\ m \end{bmatrix} = \begin{bmatrix} 1 \\ 2 \\ 3 \\ 3 \end{bmatrix}$. We are going to find its least squares solution.

62 Compute: $A^T A = \begin{bmatrix} 1 & 1 & 1 & 1 \\ 2 & 5 & 7 & 8 \end{bmatrix} \begin{bmatrix} 1 & 2 \\ 1 & 5 \\ 1 & 7 \\ 1 & 8 \end{bmatrix} = \begin{bmatrix} 4 & 22 \\ 22 & 142 \end{bmatrix}$, $A^T b = \begin{bmatrix} 1 & 1 & 1 & 1 \\ 2 & 5 & 7 & 8 \end{bmatrix} \begin{bmatrix} 1 \\ 2 \\ 3 \\ 3 \end{bmatrix} = \begin{bmatrix} 9 \\ 57 \end{bmatrix}$.

63 So the normal equation is: $\begin{bmatrix} 4 & 22 \\ 22 & 142 \end{bmatrix} \begin{bmatrix} c \\ m \end{bmatrix} = \begin{bmatrix} 9 \\ 57 \end{bmatrix}$, which has the unique solution $(c, m) = (\frac{2}{7}, \frac{5}{14})$. The best fit straight line is: $y = \frac{2}{7} + \frac{5}{14}\, x$.
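
Extra (editor's sketch, not in the original notes): the regression example checked numerically via the normal equation:

```python
import numpy as np

xs = np.array([2.0, 5.0, 7.0, 8.0])
ys = np.array([1.0, 2.0, 3.0, 3.0])

A = np.column_stack([np.ones_like(xs), xs])  # columns: 1, x
c, m = np.linalg.solve(A.T @ A, A.T @ ys)    # normal equation
print(c, m)  # 0.2857... = 2/7 and 0.3571... = 5/14
```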

64 Polynomial Curve Fitting: Example: Find a polynomial curve of degree at most 2 which best fits the following data: $(2, 1), (5, 2), (7, 3), (8, 3)$, in the sense of least squares. Sol: Consider the general form of the fitting curve: $y = a_0 + a_1 x + a_2 x^2$.

65 The curve cannot pass through all 4 points, as the system $a_0 + 2a_1 + 4a_2 = 1$, $a_0 + 5a_1 + 25a_2 = 2$, $a_0 + 7a_1 + 49a_2 = 3$, $a_0 + 8a_1 + 64a_2 = 3$ is inconsistent. Again, use the normal equation.

66 The corresponding normal equation $A^T A x = A^T b$ is: $\begin{bmatrix} 4 & 22 & 142 \\ 22 & 142 & 988 \\ 142 & 988 & 7138 \end{bmatrix} \begin{bmatrix} a_0 \\ a_1 \\ a_2 \end{bmatrix} = \begin{bmatrix} 9 \\ 57 \\ 393 \end{bmatrix}$, which has the unique solution $(\frac{19}{132}, \frac{19}{44}, -\frac{1}{132})$. So the best fitting polynomial is: $y = \frac{19}{132} + \frac{19}{44}\, x - \frac{1}{132}\, x^2$.
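
Extra (editor's sketch, not in the original notes): the same computation in NumPy, building the columns 1, x, x² from the data:

```python
import numpy as np

xs = np.array([2.0, 5.0, 7.0, 8.0])
ys = np.array([1.0, 2.0, 3.0, 3.0])

A = np.column_stack([np.ones_like(xs), xs, xs**2])  # columns: 1, x, x^2
a0, a1, a2 = np.linalg.solve(A.T @ A, A.T @ ys)
print(a0, a1, a2)  # 0.1439... (=19/132), 0.4318... (=19/44), -0.00757... (=-1/132)
```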

67 General Curve Fitting: Example: Find a curve of the form $c_0 + c_1 \sin x + c_2 \sin 2x$ which best fits the following data: $(\frac{\pi}{6}, 1), (\frac{\pi}{4}, 2), (\frac{\pi}{3}, 3), (\frac{\pi}{2}, 3)$, in the sense of least squares. Sol: Let $y = c_0 + c_1 \sin x + c_2 \sin 2x$. The system $c_0 + c_1 \sin\frac{\pi}{6} + c_2 \sin\frac{2\pi}{6} = 1$, $c_0 + c_1 \sin\frac{\pi}{4} + c_2 \sin\frac{2\pi}{4} = 2$, $c_0 + c_1 \sin\frac{\pi}{3} + c_2 \sin\frac{2\pi}{3} = 3$, $c_0 + c_1 \sin\frac{\pi}{2} + c_2 \sin\frac{2\pi}{2} = 3$ is inconsistent.

68 Solving $A^T A x = A^T b$ gives: $c_0 = \cdots$, $c_1 = \cdots$, $c_2 = \cdots$ So the best fitting function is: $(\cdots) + (\cdots)\sin x + (\cdots)\sin 2x$.
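
Extra (editor's sketch, not in the original notes): the coefficients can be computed numerically from the normal equation:

```python
import numpy as np

x = np.array([np.pi/6, np.pi/4, np.pi/3, np.pi/2])
y = np.array([1.0, 2.0, 3.0, 3.0])

A = np.column_stack([np.ones_like(x), np.sin(x), np.sin(2*x)])
c0, c1, c2 = np.linalg.solve(A.T @ A, A.T @ y)
print(c0, c1, c2)  # approximately -2.2917, 5.3131, 0.6731
```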

69 Extra: Continuous Curve Fitting. Find $g(x)$ best fitting a given $f(x)$. [Figure: the curves of $f(x)$ and $g(x)$] Try to minimize the difference (area) between the two curves.

70 To minimize the root-mean-square (r.m.s.) of the area between the two curves: $\sqrt{\frac{1}{b-a}\int_a^b |f(x) - g(x)|^2\, dx}$. This is given by the following inner product: $\langle f, g \rangle = \frac{1}{b-a}\int_a^b f(x)g(x)\, dx$. We are not in $\mathbb{R}^n$, and this is not the standard inner product... so there is no normal equation. But we can use orthogonal projection.

71 Recall: Formula of orthogonal projection in general: $\mathrm{proj}_W y = \sum_{i=1}^p \frac{\langle y, u_i \rangle}{\langle u_i, u_i \rangle}\, u_i$, where $\{u_1, \ldots, u_p\}$ is an orthogonal basis of $W$. Example: Fit $f(x) = x$ over $[0, 1]$ by a l.c. of $S = \{1, \sin 2\pi k x, \cos 2\pi k x;\ k = 1, 2, \ldots, n\}$. Sol: $S$ is orthogonal under the inner product (direct checking): $\langle f, g \rangle = \int_0^1 f(x)g(x)\, dx$.

72 So compute those $\langle y, u_i \rangle$: $\langle f(x), 1 \rangle = \frac{1}{2}$, $\langle f(x), \sin 2\pi k x \rangle = -\frac{1}{2\pi k}$, $\langle f(x), \cos 2\pi k x \rangle = 0$. We also need those $\langle u_i, u_i \rangle$: $\langle 1, 1 \rangle = 1$, $\langle \sin 2\pi k x, \sin 2\pi k x \rangle = \frac{1}{2}$, $\langle \cos 2\pi k x, \cos 2\pi k x \rangle = \frac{1}{2}$.

73 So the best fitting curve is $g(x) = \mathrm{proj}_W f(x)$: $g(x) = \frac{1}{2} - \frac{1}{\pi}\left( \sin 2\pi x + \frac{\sin 4\pi x}{2} + \cdots + \frac{\sin 2n\pi x}{n} \right)$. [Figure: graph of $f(x) = x$ together with the approximation $g(x)$ when $n = 5$]
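
Extra (editor's sketch, not in the original notes): the key inner products $\langle f, \sin 2\pi k x \rangle = -\frac{1}{2\pi k}$ can be checked by numerical integration:

```python
import numpy as np

x = np.linspace(0.0, 1.0, 200_001)
for k in (1, 2, 3):
    ip = np.trapz(x * np.sin(2*np.pi*k*x), x)  # <f, sin 2*pi*k*x> on [0, 1]
    print(k, ip, -1/(2*np.pi*k))               # the two values agree
```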

74 Example: Let $f(x) = \mathrm{sgn}(x)$, the sign of $x$: $\mathrm{sgn}(x) = -1$ for $x < 0$, $0$ for $x = 0$, $1$ for $x > 0$. Find the best r.m.s. approximation function over $[-1, 1]$ using a l.c. of $S = \{1, \sin k\pi x, \cos k\pi x;\ k = 1, 2, 3, \ldots, 2n+1\}$. Sol: The interval has changed. Use the new inner product: $\langle f, g \rangle = \frac{1}{2}\int_{-1}^1 f(x)g(x)\, dx$.

75 Then $S$ is orthogonal (this needs another checking) and: $\langle 1, 1 \rangle = 1$, $\langle \sin k\pi x, \sin k\pi x \rangle = \frac{1}{2} = \langle \cos k\pi x, \cos k\pi x \rangle$. So, we compute: $\langle \mathrm{sgn}(x), 1 \rangle = 0$; $\langle \mathrm{sgn}(x), \sin k\pi x \rangle = 0$ if $k$ is even, $\frac{2}{k\pi}$ if $k$ is odd; $\langle \mathrm{sgn}(x), \cos k\pi x \rangle = 0$.

76 Hence the best r.m.s. approximation to $\mathrm{sgn}(x)$ over $[-1, 1]$ is: $\frac{4}{\pi}\left( \sin \pi x + \frac{\sin 3\pi x}{3} + \frac{\sin 5\pi x}{5} + \cdots + \frac{\sin(2n+1)\pi x}{2n+1} \right)$. [Figure: graph of the approximation when $2n+1 = 9$]
