Orthogonal Projection
Hung-yi Lee
Reference Textbook: Chapter 7.3, 7.4
Outline:
- What is Orthogonal Complement
- What is Orthogonal Projection
- How to do Orthogonal Projection
- Application of Orthogonal Projection
Orthogonal Complement
The orthogonal complement of a nonempty vector set S, denoted S^⊥ (read "S perp"), is the set of all vectors that are orthogonal to every vector in S:
S^⊥ = { v : v · u = 0 for all u ∈ S }
For example, if S = R^n then S^⊥ = {0}, and if S = {0} then S^⊥ = R^n.
Orthogonal Complement (example)
Let W = { [w1 w2 0]^T : w1, w2 ∈ R } and V = { [0 0 v3]^T : v3 ∈ R }. Is V = W^⊥?
V ⊆ W^⊥: for all v ∈ V and w ∈ W, v · w = 0.
W^⊥ ⊆ V: since e1, e2 ∈ W, every z = [z1 z2 z3]^T ∈ W^⊥ must have z1 = z2 = 0.
Hence V = W^⊥.
Properties of Orthogonal Complement
- S^⊥ is always a subspace, even when S itself is not.
- For any nonempty vector set S, (Span S)^⊥ = S^⊥.
- If W is a subspace and B is a basis of W, then B^⊥ = W^⊥.
- What is S ∩ S^⊥? At most the zero vector, since any vector in both must be orthogonal to itself.
Properties of Orthogonal Complement (example)
For W = Span{u1, u2}, where u1 = [1 1 1 4]^T and u2 = [1 1 1 2]^T:
v ∈ W^⊥ if and only if u1 · v = u2 · v = 0, i.e., v = [x1 x2 x3 x4]^T satisfies
x1 + x2 + x3 + 4x4 = 0
x1 + x2 + x3 + 2x4 = 0
Solving this homogeneous system gives a basis for W^⊥: {[−1 1 0 0]^T, [−1 0 1 0]^T}.
In other words, W^⊥ = {solutions of Ax = 0} = Null A, where the rows of A are u1^T and u2^T.
Properties of Orthogonal Complement
For any matrix A:
- (Row A)^⊥ = Null A. Proof: v ∈ (Row A)^⊥ ⟺ w · v = 0 for all w ∈ Span{rows of A} ⟺ Av = 0.
- (Col A)^⊥ = Null A^T, since (Col A)^⊥ = (Row A^T)^⊥ = Null A^T.
For any subspace W of R^n:
dim W + dim W^⊥ = n (rank + nullity)
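These identities are easy to check numerically. Below is a minimal sketch (assuming numpy is available; the matrix A reuses the vectors from the example above) that builds a null-space basis via the SVD and verifies that every row of A is orthogonal to it, and that dim Row A + dim Null A = n.

```python
import numpy as np

A = np.array([[1., 1., 1., 4.],
              [1., 1., 1., 2.]])

# Null-space basis from the SVD: the right singular vectors whose
# singular values are (numerically) zero span Null A.
_, s, Vt = np.linalg.svd(A)
rank = int(np.sum(s > 1e-10))
null_basis = Vt[rank:].T                         # columns span Null A

# Every row of A is orthogonal to every null-space vector:
print(np.allclose(A @ null_basis, 0))            # True
# dim(Row A) + dim(Null A) = n:
print(rank + null_basis.shape[1] == A.shape[1])  # True
```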
Unique Decomposition
For any subspace W of R^n, let {w1, w2, …, wk} be a basis of W and {z1, z2, …, z_{n−k}} a basis of W^⊥ (recall dim W + dim W^⊥ = n). Together they form a basis for R^n.
Hence every vector u can be written uniquely as u = w + z, with w ∈ W and z ∈ W^⊥.
What is Orthogonal Projection
Orthogonal Projection
Decompose u = w + z (unique), with w ∈ W and z ∈ W^⊥.
Orthogonal projection operator: the function U_W mapping u to w; U_W(u) is the orthogonal projection of u on W.
Is U_W linear? Yes: it has a standard matrix, as shown below.
Orthogonal Projection
Closest: w = U_W(u) is the vector in the subspace W closest to u.
Always orthogonal: z = u − w is orthogonal to every vector in W.
Closest Vector Property
Among all vectors in a subspace W, the vector closest to u is the orthogonal projection w = U_W(u) of u on W.
Proof: let w' ∈ W, w' ≠ w. Since w − w' ∈ W and u − w ∈ W^⊥, (u − w) · (w − w') = 0, so
‖u − w'‖² = ‖(u − w) + (w − w')‖² = ‖u − w‖² + ‖w − w'‖² > ‖u − w‖².
The distance from a vector u to a subspace W is therefore the distance between u and the orthogonal projection of u on W.
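A quick numerical illustration of the closest-vector property (a sketch assuming numpy; the subspace and test vectors are made up for the demo). The projection of u onto W = Col C is computed here with numpy's least-squares routine, and no other vector of W comes closer to u:

```python
import numpy as np

rng = np.random.default_rng(0)

# W = Col C, a 2-dimensional subspace of R^4 (example data).
C = np.array([[1., 0.],
              [1., 1.],
              [0., 1.],
              [2., 1.]])
u = rng.standard_normal(4)

# Orthogonal projection of u on W: w = C v, where v is the
# least-squares solution of C v ≈ u (equivalently w = P_W u).
v, *_ = np.linalg.lstsq(C, u, rcond=None)
w = C @ v

# Any other vector w' in W is at least as far from u.
for _ in range(5):
    w_other = C @ rng.standard_normal(2)
    assert np.linalg.norm(u - w) <= np.linalg.norm(u - w_other)
print("projection is the closest vector in W")
```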
Orthogonal Projection Matrix
The orthogonal projection operator is linear, so it has a standard matrix P_W, the orthogonal projection matrix:
w = U_W(u) = P_W u, with z = u − w ∈ W^⊥.
For any vector v, U_W(v) = P_W v.
How to do Orthogonal Projection
Orthogonal Projection on a Line
Orthogonal projection of a vector on a line L:
- u: any nonzero vector on L
- v: any vector
- w: the orthogonal projection of v onto L, w = cu
- z = v − w
Since z · u = 0:
(v − cu) · u = v · u − c‖u‖² = 0  ⟹  c = (v · u) / ‖u‖²
w = cu = ((v · u) / ‖u‖²) u
Distance from the tip of v to L: ‖z‖ = ‖v − w‖ = ‖v − ((v · u)/‖u‖²) u‖.
Orthogonal Projection (example)
L is the line y = (1/2)x, v = [4 1]^T, u = [2 1]^T.
c = (v · u)/‖u‖² = (4·2 + 1·1)/(2² + 1²) = 9/5
w = cu = [18/5 9/5]^T, z = v − w = [2/5 −4/5]^T, and indeed z · u = 0.
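The same computation in a few lines of numpy (a minimal sketch; the array names follow the slide):

```python
import numpy as np

v = np.array([4., 1.])
u = np.array([2., 1.])         # any nonzero vector on L: y = x/2

c = (v @ u) / (u @ u)          # c = v·u / ‖u‖²  ->  9/5
w = c * u                      # projection of v onto L -> [3.6, 1.8]
z = v - w                      # component orthogonal to L

print(c, w, z)                 # 1.8 [3.6 1.8] [ 0.4 -0.8]
print(np.isclose(z @ u, 0.0))  # True: z ⊥ u
```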
Orthogonal Projection Matrix
Let C be an n × k matrix whose columns form a basis for a subspace W. Then
P_W = C (C^T C)^{-1} C^T (an n × n matrix).
Proof: let u ∈ R^n and w = U_W(u). Since W = Col C, w = Cv for some v ∈ R^k, and u − w ∈ W^⊥ = Null C^T, so
0 = C^T (u − w) = C^T u − C^T w = C^T u − C^T C v.
Hence C^T u = C^T C v, and since C^T C is invertible (next slide), v = (C^T C)^{-1} C^T u and w = C (C^T C)^{-1} C^T u.
Orthogonal Projection Matrix (lemma)
Let C be a matrix with linearly independent columns. Then C^T C is invertible.
Proof: it suffices to show that C^T C has linearly independent columns. Suppose C^T C b = 0 for some b. Then
b^T C^T C b = (Cb)^T (Cb) = (Cb) · (Cb) = ‖Cb‖² = 0,
so Cb = 0, and b = 0 since C has linearly independent columns. Thus C^T C is invertible.
Orthogonal Projection Matrix (example)
Let W be the 2-dimensional subspace of R^3 with equation x1 − x2 + 2x3 = 0.
W has basis {[1 1 0]^T, [−2 0 1]^T}, so C has columns [1 1 0]^T and [−2 0 1]^T, and
P_W = C (C^T C)^{-1} C^T = (1/6) [[5 1 −2], [1 5 2], [−2 2 2]]
For example, P_W [3 1 −4]^T = [4 0 −2]^T.
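A numpy check of this example (a sketch; the basis and test vector are the ones reconstructed above):

```python
import numpy as np

# Columns of C: a basis of the plane x1 - x2 + 2*x3 = 0.
C = np.array([[1., -2.],
              [1.,  0.],
              [0.,  1.]])

# P_W = C (C^T C)^{-1} C^T
P_W = C @ np.linalg.inv(C.T @ C) @ C.T
print(np.round(6 * P_W))            # [[5 1 -2], [1 5 2], [-2 2 2]]

u = np.array([3., 1., -4.])
print(P_W @ u)                      # [ 4.  0. -2.]

# Sanity checks: P_W kills the normal direction and is idempotent.
n = np.array([1., -1., 2.])
print(np.allclose(P_W @ n, 0))      # True
print(np.allclose(P_W @ P_W, P_W))  # True
```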
Application of Orthogonal Projection
Solution of an Inconsistent System of Linear Equations
Suppose Ax = b is an inconsistent system, i.e., b is not in the column space of A.
Instead, find the vector z minimizing ‖Az − b‖:
Az = P_W b, where W = Col A.
Since no x gives Ax = b exactly, Az = P_W b is the closest achievable vector in Col A.
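By the projection-matrix formula, z solves the normal equations A^T A z = A^T b. A minimal numpy sketch (the 3 × 2 system here is made up for illustration):

```python
import numpy as np

# An inconsistent system: b is not in Col A.
A = np.array([[1., 0.],
              [0., 1.],
              [1., 1.]])
b = np.array([1., 1., 1.])

# Normal equations A^T A z = A^T b (A has L.I. columns).
z = np.linalg.solve(A.T @ A, A.T @ b)

# Equivalently, numpy's built-in least-squares routine:
z_lstsq, *_ = np.linalg.lstsq(A, b, rcond=None)
print(np.allclose(z, z_lstsq))            # True

# A z is the orthogonal projection of b on Col A,
# so the residual b - A z is orthogonal to Col A:
print(np.allclose(A.T @ (b - A @ z), 0))  # True
```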
Least Squares Approximation
Goal: predict from data pairs (x1, y1), (x2, y2), …, (xi, yi), e.g.
(today's stock price, tomorrow's stock price)
(today's PM2.5, tomorrow's PM2.5)
Find the least-squares line y = a0 + a1 x that best fits the data. (This is regression.)
Least Squares Approximation
For the line y = a0 + a1 x, the data point (xi, yi) is predicted as (xi, a0 + a1 xi), so the i-th error is yi − (a0 + a1 xi).
Find a0 and a1 minimizing the total squared error E.
Least Squares Approximation
Stack the data into vectors: y = [y1 … yn]^T, v1 = [1 1 … 1]^T, v2 = [x1 … xn]^T, C = [v1 v2], a = [a0 a1]^T. Then the error vector is y − Ca and
E = ‖y − (a0 v1 + a1 v2)‖² = ‖y − Ca‖²
Least Squares Approximation
Find a minimizing E = ‖y − Ca‖². Since B = {v1, v2} is linearly independent, Ca ranges over the subspace W = Span B, and E is minimized when Ca is the orthogonal projection of y on W:
find a such that Ca = P_W y, i.e., a = (C^T C)^{-1} C^T y.
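A minimal sketch of the line fit in numpy (the data points are made up for illustration):

```python
import numpy as np

# Example data (hypothetical).
x = np.array([1., 2., 3., 4., 5.])
y = np.array([1.1, 1.9, 3.2, 3.8, 5.1])

# Design matrix C = [v1 v2], with v1 = ones and v2 = x.
C = np.column_stack([np.ones_like(x), x])

# a = (C^T C)^{-1} C^T y  (normal equations).
a0, a1 = np.linalg.solve(C.T @ C, C.T @ y)
print(f"y = {a0:.3f} + {a1:.3f} x")

# Cross-check with numpy's polynomial fit (highest degree first).
print(np.polyfit(x, y, 1))   # [a1, a0]
```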
Example: the least-squares line for the data is y = 0.056 + 0.745x.
Prediction: if the rough weight is 2.65, the estimated finished weight is 0.056 + 0.745(2.65) = 2.030.
Least Squares Approximation
Best quadratic fit: use y = a0 + a1 x + a2 x² to fit the data points (x1, y1), (x2, y2), …, (xn, yn).
Error vector: e = [y1 − (a0 + a1 x1 + a2 x1²), …, yn − (a0 + a1 xn + a2 xn²)]^T
Find a0, a1, and a2 minimizing E = ‖e‖².
Fitting y = a0 + a1 x + a2 x² to the example data gives y = 101.00 + 29.77x − 16.11x².
A best-fitting polynomial of any desired maximum degree can be found by the same method.
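The only change from the straight-line fit is an extra column x² in the design matrix. A sketch (the data are made up):

```python
import numpy as np

x = np.array([0., 1., 2., 3., 4.])
y = np.array([100., 115., 97., 45., -40.])   # hypothetical data

# Design matrix [1, x, x^2]; same normal equations as before.
C = np.column_stack([np.ones_like(x), x, x**2])
a0, a1, a2 = np.linalg.solve(C.T @ C, C.T @ y)
print(f"y = {a0:.2f} + {a1:.2f} x + {a2:.2f} x^2")
```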
Multivariable Least Squares Approximation
The same method handles several input variables, e.g. fitting y_A = a0 + a1 x_A + a2 x_B.
http://www.palass.org/publications/newsletter/palaeomath-101/palaeomath-part-4-regression-iv
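Two predictor variables just mean one more column in the design matrix (a sketch with made-up data; the names y_A, x_A, x_B follow the slide):

```python
import numpy as np

# Hypothetical observations: two predictors, one response.
x_A = np.array([1., 2., 3., 4., 5.])
x_B = np.array([2., 1., 4., 3., 5.])
y_A = np.array([4.1, 4.9, 9.2, 9.8, 14.1])

# Design matrix [1, x_A, x_B]; solve the normal equations.
C = np.column_stack([np.ones_like(x_A), x_A, x_B])
a0, a1, a2 = np.linalg.solve(C.T @ C, C.T @ y_A)
print(f"y_A = {a0:.2f} + {a1:.2f} x_A + {a2:.2f} x_B")
```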