
Linear Algebra [1] 4.2 The Dot Product and Projections
1. In R^3 the dot product is defined by u · v = ‖u‖ ‖v‖ cos θ.
2. For u = (x1, y1, z1) and v = (x2, y2, z2), we have u · v = x1x2 + y1y2 + z1z2.
3. cos θ = (u · v) / (‖u‖ ‖v‖), and u and v are orthogonal if and only if u · v = 0.

Linear Algebra [2] 4.3 Planes
A nonzero vector n is called a normal to a plane if it is orthogonal to every vector in the plane. A point P is on the plane with normal n through the point P0 if and only if n · P0P = 0, where P0P denotes the vector from P0 to P.

Linear Algebra [3]
If n = (a, b, c), P(x, y, z) and P0(x0, y0, z0), then
P0P = OP - OP0 = (x - x0, y - y0, z - z0), and n · P0P = (a, b, c) · (x - x0, y - y0, z - z0) = 0.
Hence, the plane through P0(x0, y0, z0) with normal n = (a, b, c) is given by
a(x - x0) + b(y - y0) + c(z - z0) = 0.
Eg. An equation of the plane through P0(1, -1, 3) with normal n = (3, -1, 2) is 3(x - 1) - (y + 1) + 2(z - 3) = 0. This simplifies to 3x - y + 2z = 10.
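The point-normal form above can be spot-checked numerically; a minimal Python sketch (the helper name plane_value is my own, not from the notes):

```python
def plane_value(n, p0, p):
    # n · (P0P) = a(x - x0) + b(y - y0) + c(z - z0)
    return sum(nc * (pc - p0c) for nc, p0c, pc in zip(n, p0, p))

n, P0 = (3, -1, 2), (1, -1, 3)

# P0 itself lies on the plane, so the value is 0
print(plane_value(n, P0, P0))  # 0

# The simplified form: a point (x, y, z) is on the plane
# exactly when 3x - y + 2z = 10; check the constant at P0
assert 3 * 1 - (-1) + 2 * 3 == 10
```

A point off the plane gives a nonzero value, proportional to its (signed) distance times ‖n‖.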

Linear Algebra [4] 7. Orthogonality in R^n
Given X = (x1, ..., xn) and Y = (y1, ..., yn) in R^n, the dot product of X and Y is defined by
X · Y = X^T Y = x1y1 + x2y2 + ... + xnyn.
The length ‖X‖ of X is defined by
‖X‖ = √(X · X) = √(x1² + x2² + ... + xn²).

Linear Algebra [5]
Eg. If X = (1, 1, 1, 2) and Y = (2, 2, 2, -2), then
X · Y = X^T Y = 2 + 2 + 2 - 4 = 2,
and ‖X‖² = 1 + 1 + 1 + 4 = 7, so ‖X‖ = √7.
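The dot product and length definitions translate directly into code; a minimal Python sketch (the helper names dot and length are illustrative, not from the notes):

```python
from math import sqrt

def dot(x, y):
    # X · Y = x1*y1 + x2*y2 + ... + xn*yn
    return sum(a * b for a, b in zip(x, y))

def length(x):
    # ‖X‖ = sqrt(X · X)
    return sqrt(dot(x, x))

X = [1, 1, 1, 2]
print(dot(X, X))   # 1 + 1 + 1 + 4 = 7
print(length(X))   # sqrt(7)
```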

Linear Algebra [6] Thm.
1. X · Y = Y · X.
2. X · (Y + Z) = X · Y + X · Z, and (X + Y) · Z = X · Z + Y · Z.
3. (kX) · Y = X · (kY) = k(X · Y), k ∈ R.
4. X · X ≥ 0, and X · X = 0 if and only if X = O.
5. ‖kX‖ = |k| ‖X‖, k ∈ R.
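These identities can be spot-checked on concrete vectors; a brief Python sketch (the sample vectors and scalar are my own choices):

```python
def dot(x, y):
    return sum(a * b for a, b in zip(x, y))

X, Y, Z, k = (1, -2, 3), (0, 4, -1), (2, 2, 2), -3

assert dot(X, Y) == dot(Y, X)                                         # property 1
assert dot(X, [y + z for y, z in zip(Y, Z)]) == dot(X, Y) + dot(X, Z) # property 2
assert dot([k * c for c in X], Y) == k * dot(X, Y)                    # property 3
assert dot(X, X) >= 0                                                 # property 4
```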

Linear Algebra [7]
Eg. ‖X + Y‖² = (X + Y) · (X + Y) = X · X + X · Y + Y · X + Y · Y = ‖X‖² + ‖Y‖² + 2(X · Y).
If X ≠ O, then X/‖X‖ has length 1. Indeed, ‖X/‖X‖‖ = (1/‖X‖)‖X‖ = 1.

Linear Algebra [8] Def.
1. Two vectors X and Y are orthogonal if X · Y = 0.
2. A set {X1, X2, ..., Xm} of nonzero vectors in R^n is called an orthogonal set if Xi · Xj = 0 for i ≠ j.
3. An orthogonal set {X1, X2, ..., Xm} is orthonormal if ‖Xi‖ = 1 for all i.
Rmk. If {X1, X2, ..., Xm} is orthogonal, then {X1/‖X1‖, X2/‖X2‖, ..., Xm/‖Xm‖} is orthonormal.

Linear Algebra [9]
Eg. X1 = (1, 2, 2), X2 = (2, 1, -2) and X3 = (2, -2, 1) form an orthogonal set, and since ‖X1‖ = ‖X2‖ = ‖X3‖ = 3,
(1/3)X1, (1/3)X2, (1/3)X3
form an orthonormal set.

Linear Algebra [10] Thm. Every orthogonal set of vectors in R^n is linearly independent.
Proof. Let {X1, X2, ..., Xm} be orthogonal. Consider r1X1 + r2X2 + ... + rmXm = O. Then
0 = Xi · O = Xi · (r1X1 + r2X2 + ... + rmXm) = r1(Xi · X1) + r2(Xi · X2) + ... + rm(Xi · Xm) = ri ‖Xi‖².
Hence ri = 0 for each i.
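The pairwise-orthogonality condition is mechanical to verify; a small Python sketch (the example set is my own illustrative choice):

```python
def dot(x, y):
    return sum(a * b for a, b in zip(x, y))

# A pairwise-orthogonal set in R^3 (illustrative choice)
vecs = [(1, 2, 2), (2, 1, -2), (2, -2, 1)]

# Xi · Xj = 0 for all i != j
for i in range(len(vecs)):
    for j in range(i + 1, len(vecs)):
        assert dot(vecs[i], vecs[j]) == 0

# As in the proof: dotting r1X1 + r2X2 + r3X3 = O with Xi gives
# ri * ‖Xi‖² = 0, hence ri = 0, so the set is linearly independent.
```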

Linear Algebra [11] Thm. If {X1, X2, ..., Xn} is an orthogonal basis of R^n, then
X = (X · X1 / ‖X1‖²) X1 + (X · X2 / ‖X2‖²) X2 + ... + (X · Xn / ‖Xn‖²) Xn
for every X in R^n.
Proof. If X = r1X1 + r2X2 + ... + rnXn, then X · Xi = ri (Xi · Xi) = ri ‖Xi‖². Therefore, ri = (X · Xi) / ‖Xi‖².

Linear Algebra [12]
Eg. Suppose {X1, X2, X3} is an orthogonal basis of R^3, and write X = r1X1 + r2X2 + r3X3. If
r1 = (X · X1)/‖X1‖² = 1/2, r2 = (X · X2)/‖X2‖² = 1/3, r3 = (X · X3)/‖X3‖² = 0,
then X = (1/2) X1 + (1/3) X2.
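The coefficient formula ri = (X · Xi)/‖Xi‖² can be checked with exact arithmetic; a hedged Python sketch (the basis and the vector X below are my own illustration, not the example's data):

```python
from fractions import Fraction

def dot(x, y):
    return sum(a * b for a, b in zip(x, y))

# An orthogonal basis of R^3 (illustrative) and a vector X
basis = [(1, 2, 2), (2, 1, -2), (2, -2, 1)]
X = (3, 0, 3)

# ri = (X · Xi) / ‖Xi‖²
coeffs = [Fraction(dot(X, B), dot(B, B)) for B in basis]

# Reconstruct X from the expansion X = r1X1 + r2X2 + r3X3
recon = [sum(r * Fraction(B[k]) for r, B in zip(coeffs, basis)) for k in range(3)]
print(coeffs)  # [Fraction(1, 1), Fraction(0, 1), Fraction(1, 1)]
print(recon)   # [Fraction(3, 1), Fraction(0, 1), Fraction(3, 1)]
```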

Linear Algebra [13] Projections
Def. If U is a subspace of R^n, we define the orthogonal complement U⊥ of U by
U⊥ = {X ∈ R^n : X · Y = 0 for all Y ∈ U}.
Observe that if U = span{X1, ..., Xm}, then U⊥ = {X ∈ R^n : X · Xi = 0 for all i}.

Linear Algebra [14]
Eg. Find U⊥ if U = span{(1, -1, 2, 0), (1, 0, -2, 3)} in R^4.
Solution. Let X = (x, y, z, w) ∈ U⊥. Then X · (1, -1, 2, 0) = 0 and X · (1, 0, -2, 3) = 0 yield
x - y + 2z = 0,
x - 2z + 3w = 0.

Linear Algebra [15]
Row reducing,
[ 1 -1  2 0 ]    →    [ 1 0 -2 3 ]
[ 1  0 -2 3 ]         [ 0 1 -4 3 ],
so x = 2z - 3w and y = 4z - 3w. Hence
(x, y, z, w) = s(2, 4, 1, 0) + t(-3, -3, 0, 1),
and U⊥ = span{(2, 4, 1, 0), (-3, -3, 0, 1)}.
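The two spanning sets can be cross-checked by computing all pairwise dot products; a quick Python sketch (using the vectors of this example):

```python
def dot(x, y):
    return sum(a * b for a, b in zip(x, y))

U_basis = [(1, -1, 2, 0), (1, 0, -2, 3)]
U_perp_basis = [(2, 4, 1, 0), (-3, -3, 0, 1)]

# Every spanning vector of U⊥ is orthogonal to every spanning vector of U
checks = [dot(u, v) for u in U_basis for v in U_perp_basis]
print(checks)  # [0, 0, 0, 0]
```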

Linear Algebra [16] Def. Let {X1, X2, ..., Xm} be an orthogonal basis of a subspace U of R^n. Given X in R^n, we define
proj_U(X) = (X · X1 / ‖X1‖²) X1 + (X · X2 / ‖X2‖²) X2 + ... + (X · Xm / ‖Xm‖²) Xm
and call it the orthogonal projection of X on U.

Linear Algebra [17] Thm. If U is a subspace of R^n and X ∈ R^n, write P = proj_U(X). Then:
1. P ∈ U and X - P ∈ U⊥.
2. ‖X - P‖ ≤ ‖X - Y‖ for all Y ∈ U.
3. dim U + dim U⊥ = n.
Proof. 1. Clearly, P ∈ U. For each i,
(X - P) · Xi = X · Xi - P · Xi = X · Xi - (X · Xi / ‖Xi‖²)(Xi · Xi) = 0.
Thus X - P ∈ U⊥.

Linear Algebra [18]
2. Write X - Y = (X - P) + (P - Y). Then P - Y is in U and X - P ∈ U⊥, so
‖X - Y‖² = ‖(X - P) + (P - Y)‖² = ‖X - P‖² + ‖P - Y‖² + 2(X - P) · (P - Y) = ‖X - P‖² + ‖P - Y‖² ≥ ‖X - P‖².
3. Let {X1, ..., Xm} and {Y1, ..., Yk} be orthogonal bases of U and U⊥, respectively. Then {X1, ..., Xm, Y1, ..., Yk} is orthogonal, so linearly independent. If X ∈ R^n, then X = P + (X - P) with P ∈ U and X - P ∈ U⊥, so {X1, ..., Xm, Y1, ..., Yk} spans R^n. It is therefore a basis of R^n, and m + k = n.

Linear Algebra [19]
Eg. Let U = span{X1, X2}, where X1 = (1, 1, 1, 0) and X2 = (1, 0, -1, -1). If X = (3, 1, 0, 2), find the vector in U closest to X, and express X as the sum of a vector in U and a vector in U⊥.
Solution. Note that {X1, X2} is orthogonal.

Linear Algebra [20]
P = proj_U(X) = (X · X1 / ‖X1‖²) X1 + (X · X2 / ‖X2‖²) X2 = (4/3) X1 + (1/3) X2 = (5/3, 4/3, 1, -1/3).
This P is the vector in U closest to X, and
X = P + (X - P) = (5/3, 4/3, 1, -1/3) + (4/3, -1/3, -1, 7/3),
with P ∈ U and X - P ∈ U⊥.
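The projection formula is mechanical to implement; a Python sketch with exact fractions (the helper name proj is mine; the vectors are illustrative values consistent with the example above):

```python
from fractions import Fraction

def dot(x, y):
    return sum(a * b for a, b in zip(x, y))

def proj(basis, x):
    # proj_U(X) = sum over i of (X · Xi / ‖Xi‖²) Xi,
    # valid when basis is an ORTHOGONAL basis of U
    out = [Fraction(0)] * len(x)
    for b in basis:
        r = Fraction(dot(x, b), dot(b, b))
        out = [o + r * c for o, c in zip(out, b)]
    return out

X1, X2 = (1, 1, 1, 0), (1, 0, -1, -1)
X = (3, 1, 0, 2)
P = proj([X1, X2], X)
print(P)  # [Fraction(5, 3), Fraction(4, 3), Fraction(1, 1), Fraction(-1, 3)]

# X - P lies in U⊥: it is orthogonal to X1 and X2
residual = [a - b for a, b in zip(X, P)]
assert dot(residual, X1) == 0 and dot(residual, X2) == 0
```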

Linear Algebra [21] Gram-Schmidt Orthogonalization Algorithm
Question: Given a basis B = {Y1, ..., Ym} of U, how can we obtain an orthogonal basis from B?
Answer: Construct X1, ..., Xm in U as follows:
X1 = Y1,
X2 = Y2 - (Y2 · X1 / ‖X1‖²) X1,
X3 = Y3 - (Y3 · X1 / ‖X1‖²) X1 - (Y3 · X2 / ‖X2‖²) X2,
...
Xm = Ym - (Ym · X1 / ‖X1‖²) X1 - (Ym · X2 / ‖X2‖²) X2 - ... - (Ym · X_{m-1} / ‖X_{m-1}‖²) X_{m-1}.
Then {X1, ..., Xm} is an orthogonal basis of U.

Linear Algebra [22]
Proof. Let U1 = span{X1}, U2 = span{X1, X2}, ..., U_{m-1} = span{X1, ..., X_{m-1}}.
{X1} is orthogonal.
X2 = Y2 - proj_{U1}(Y2) and X2 ∈ U1⊥, so {X1, X2} is orthogonal.
X3 = Y3 - proj_{U2}(Y3) and X3 ∈ U2⊥, so {X1, X2, X3} is orthogonal.
Continue the process:
Xm = Ym - proj_{U_{m-1}}(Ym) and Xm ∈ U_{m-1}⊥, so {X1, ..., Xm} is orthogonal.

Linear Algebra [23]
Eg. Let U be the subspace of R^4 with basis {Y1, Y2, Y3}, where
Y1 = (1, 1, 1, 0), Y2 = (1, 0, 1, 1), Y3 = (1, 0, 0, -1).
Find an orthogonal basis of U.
Solution. X1 = Y1 = (1, 1, 1, 0), and
X2 = Y2 - (Y2 · X1 / ‖X1‖²) X1 = Y2 - (2/3) X1 = (1/3, -2/3, 1/3, 1) = (1/3)(1, -2, 1, 3).

Linear Algebra [24]
X3 = Y3 - (Y3 · X1 / ‖X1‖²) X1 - (Y3 · X2 / ‖X2‖²) X2 = Y3 - (1/3) X1 + (2/5) X2 = (4/5, -3/5, -1/5, -3/5) = (1/5)(4, -3, -1, -3).
Thus
(1, 1, 1, 0), (1/3)(1, -2, 1, 3), (1/5)(4, -3, -1, -3)
is an orthogonal basis of U.
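The whole algorithm fits in a few lines; a Python sketch using exact fractions (the function name gram_schmidt is mine; the input basis uses illustrative values consistent with the example above):

```python
from fractions import Fraction

def dot(x, y):
    return sum(a * b for a, b in zip(x, y))

def gram_schmidt(ys):
    """Gram-Schmidt: turn a basis into an orthogonal basis."""
    xs = []
    for y in ys:
        x = [Fraction(c) for c in y]
        for b in xs:
            r = dot(y, b) / dot(b, b)  # (Yk · Xi) / ‖Xi‖²
            x = [xc - r * bc for xc, bc in zip(x, b)]
        xs.append(x)
    return xs

Y1, Y2, Y3 = (1, 1, 1, 0), (1, 0, 1, 1), (1, 0, 0, -1)
X1, X2, X3 = gram_schmidt([Y1, Y2, Y3])
print(X2)  # [Fraction(1, 3), Fraction(-2, 3), Fraction(1, 3), Fraction(1, 1)]
print(X3)  # [Fraction(4, 5), Fraction(-3, 5), Fraction(-1, 5), Fraction(-3, 5)]
assert dot(X1, X2) == 0 and dot(X1, X3) == 0 and dot(X2, X3) == 0
```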

Linear Algebra [25]
How can we get an orthonormal basis? Divide each basis vector by its length:
(1/√3)(1, 1, 1, 0), (1/√15)(1, -2, 1, 3), (1/√35)(4, -3, -1, -3).