MTH 2310, FALL 2011
SECTION 6.2: ORTHOGONAL SETS

Homework Problems: 1, 5, 9, 13, 17, 21, 23¹, 27, 29, 35

1. Introduction

We have previously discussed the benefits of having a set of vectors that is linearly independent or that spans a certain vector space. It turns out that having vectors that are mutually orthogonal also has certain benefits, which we discuss in this section.

2. Orthogonal Sets

Definition 1. A set of vectors $\{u_1, \ldots, u_p\}$ in $\mathbb{R}^n$ is an orthogonal set if each pair of vectors from the set is orthogonal; that is, if $u_i \cdot u_j = 0$ whenever $i \neq j$.

Exercise 2. Show that the set $\left\{ \begin{bmatrix} 3 \\ 1 \\ 1 \end{bmatrix}, \begin{bmatrix} -1 \\ 2 \\ 1 \end{bmatrix}, \begin{bmatrix} -1/2 \\ -2 \\ 7/2 \end{bmatrix} \right\}$ is an orthogonal set (i.e., dot each vector with the other two and confirm that each inner product gives you 0).
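For reference, the check the exercise asks for amounts to three dot products. Writing $u_1, u_2, u_3$ for the three vectors in the order listed:

$$u_1 \cdot u_2 = (3)(-1) + (1)(2) + (1)(1) = 0,$$
$$u_1 \cdot u_3 = (3)\left(-\tfrac{1}{2}\right) + (1)(-2) + (1)\left(\tfrac{7}{2}\right) = 0,$$
$$u_2 \cdot u_3 = (-1)\left(-\tfrac{1}{2}\right) + (2)(-2) + (1)\left(\tfrac{7}{2}\right) = 0.$$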

Theorem 2.1. If $S = \{u_1, \ldots, u_p\}$ is an orthogonal set of nonzero vectors in $\mathbb{R}^n$, then $S$ is linearly independent and hence is a basis for the subspace spanned by $S$.

Proof. To prove this, we must show that if

(1) $\quad 0 = c_1 u_1 + \cdots + c_p u_p$

for some scalars $c_1, \ldots, c_p$, then each $c_i$ must equal zero. Since all we know is that the vectors are mutually orthogonal, let's start by taking equation (1) and dotting both sides with the vector $u_1$.

Definition 3. An orthogonal basis is a basis that is also an orthogonal set.

It turns out that an orthogonal basis is much nicer than other bases, as the next theorem illustrates. Basically, the idea is that a difficult part of changing bases is calculating what the new weights for your vectors will be, and this calculation is greatly simplified if the vectors are orthogonal.

Theorem 2.2. Let $\{u_1, \ldots, u_p\}$ be an orthogonal basis for a vector space $W$. For each $y$ in $W$, the weights in the linear combination $y = c_1 u_1 + \cdots + c_p u_p$ are given by

$$c_j = \frac{y \cdot u_j}{u_j \cdot u_j} \qquad (j = 1, \ldots, p).$$

Proof. This proof is similar to the one above: start by taking both sides of $y = c_1 u_1 + \cdots + c_p u_p$ and dotting them with $u_1$, then use this to get the expression for $c_1$, then explain why the expression for $c_j$ holds for any $j$.
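Carrying out the suggested step fills in both proofs. Dotting both sides of (1) with $u_1$ and using the orthogonality of the $u_i$ gives

$$0 = (c_1 u_1 + \cdots + c_p u_p) \cdot u_1 = c_1 (u_1 \cdot u_1) + c_2 (u_2 \cdot u_1) + \cdots + c_p (u_p \cdot u_1) = c_1 (u_1 \cdot u_1),$$

and since $u_1 \neq 0$ we have $u_1 \cdot u_1 \neq 0$, so $c_1 = 0$. Dotting with $u_j$ instead shows each $c_j = 0$, which proves Theorem 2.1. The same computation applied to $y = c_1 u_1 + \cdots + c_p u_p$ gives $y \cdot u_j = c_j (u_j \cdot u_j)$, and dividing by $u_j \cdot u_j$ yields the weight formula of Theorem 2.2.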

Example 4. The set $\left\{ \begin{bmatrix} 3 \\ 1 \\ 1 \end{bmatrix}, \begin{bmatrix} -1 \\ 2 \\ 1 \end{bmatrix}, \begin{bmatrix} -1/2 \\ -2 \\ 7/2 \end{bmatrix} \right\}$ is an orthogonal basis for $\mathbb{R}^3$. Use Theorem 2.2 to express the vector $y = \begin{bmatrix} 6 \\ 1 \\ -8 \end{bmatrix}$ as a linear combination of that basis.

(Worked out, with $u_1, u_2, u_3$ denoting the basis vectors: $c_1 = \frac{y \cdot u_1}{u_1 \cdot u_1} = \frac{18 + 1 - 8}{9 + 1 + 1} = 1$, $c_2 = \frac{y \cdot u_2}{u_2 \cdot u_2} = \frac{-6 + 2 - 8}{1 + 4 + 1} = -2$, and $c_3 = \frac{y \cdot u_3}{u_3 \cdot u_3} = \frac{-3 - 2 - 28}{33/2} = -2$, so $y = u_1 - 2u_2 - 2u_3$. Notice that no linear system had to be solved; that is the payoff of orthogonality.)

3. Orthogonal Projection

In terms of the standard basis, it is easy to decompose a vector into its constituent parts. Given a different basis, though, it can be difficult to see how any other vector is written in terms of the basis. Geometrically, the big idea is this:

In general, we can state this problem in the following way: given a nonzero vector $u$ in $\mathbb{R}^n$, how can we decompose any other vector $y$ into the sum of two vectors, one of which is a scalar multiple of $u$ and the other of which is orthogonal to $u$? We do the following:

Definition 5. Let $u$ be a nonzero vector and $y$ another vector. Then

$$\hat{y} = \frac{y \cdot u}{u \cdot u}\, u$$

is the orthogonal projection of $y$ onto $u$, and $z = y - \hat{y}$ is the component of $y$ orthogonal to $u$.

Notice that if the vector $u$ were replaced by any scalar multiple $cu$, you would still get the same result for $\hat{y}$. (You may want to check this on your own if you do not believe me; a one-line check appears after Definition 6.) The important thing is not the vector $u$, but the line spanned by the vector $u$.

Definition 6. Let $u$ be a nonzero vector and $y$ another vector, and let $L$ be the line $\mathrm{span}\{u\}$. Then

$$\hat{y} = \mathrm{proj}_L y = \frac{y \cdot u}{u \cdot u}\, u$$

is the orthogonal projection of $y$ onto $L$.
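The promised check that $\hat{y}$ does not change when $u$ is replaced by $cu$ (for any scalar $c \neq 0$):

$$\frac{y \cdot (cu)}{(cu) \cdot (cu)}\,(cu) = \frac{c\,(y \cdot u)}{c^2\,(u \cdot u)}\,(cu) = \frac{y \cdot u}{u \cdot u}\, u = \hat{y}.$$

This is why Definition 6 makes sense: $\mathrm{proj}_L y$ depends only on the line $L$, not on which nonzero vector spanning $L$ is used.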

Exercise 7. Let $y = \begin{bmatrix} 7 \\ 6 \end{bmatrix}$ and $u = \begin{bmatrix} 4 \\ 2 \end{bmatrix}$.

(a) Use the definitions above to find $\hat{y}$, the orthogonal projection of $y$ onto $u$.

(b) Find the component of $y$ orthogonal to $u$ (that is, $z = y - \hat{y}$).

(c) Check that $\hat{y}$ and $z$ are orthogonal.

Exercise 8. Given $u \neq 0$ in $\mathbb{R}^n$, let $L = \mathrm{Span}\{u\}$. Show that the mapping $x \mapsto \mathrm{proj}_L x$ is a linear transformation.

Another interpretation of the orthogonal projection is as follows. Since $\hat{y}$ and $z$ are perpendicular, we can think of $\hat{y}$ as the point on the line $L$ that is closest to the point $y$. This idea forms the foundation for many applications of linear algebra: often you cannot find an exact solution to a problem, and you are instead searching for the closest thing to a solution; in other words, you are looking for a projection of your ideal solution. We discuss this more in Section 6.5.
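For reference, Exercise 7 works out as follows:

$$\hat{y} = \frac{y \cdot u}{u \cdot u}\, u = \frac{28 + 12}{16 + 4} \begin{bmatrix} 4 \\ 2 \end{bmatrix} = \begin{bmatrix} 8 \\ 4 \end{bmatrix}, \qquad z = y - \hat{y} = \begin{bmatrix} -1 \\ 2 \end{bmatrix}, \qquad \hat{y} \cdot z = (8)(-1) + (4)(2) = 0.$$

And for Exercise 8, linearity follows from the linearity of the dot product in its first argument: $\mathrm{proj}_L(x + w) = \frac{(x + w) \cdot u}{u \cdot u}\, u = \mathrm{proj}_L x + \mathrm{proj}_L w$, and $\mathrm{proj}_L(cx) = \frac{(cx) \cdot u}{u \cdot u}\, u = c\, \mathrm{proj}_L x$.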

4. Orthonormal Sets

If you have two vectors that are orthogonal, any scalar multiples of the two vectors will also be orthogonal. (Why is this true?) Thus, if we have a set of orthogonal vectors and we scale them to be unit vectors, they become a set of orthogonal unit vectors, which will make us happy.

Definition 9. An orthogonal set of unit vectors is called an orthonormal set. If the set spans a vector space, we call it an orthonormal basis, since the set will be linearly independent. (Why?)

Example 10. Prove that $\left\{ \begin{bmatrix} 2/\sqrt{5} \\ 1/\sqrt{5} \end{bmatrix}, \begin{bmatrix} -1/\sqrt{5} \\ 2/\sqrt{5} \end{bmatrix} \right\}$ is an orthonormal basis.

Theorem 4.1. A matrix $U$ has orthonormal columns if and only if $U^T U = I$.

Proof. We prove the case where $U$ has three columns; the general case follows easily. Let $U = [u_1\ u_2\ u_3]$.
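The computation behind Theorem 4.1, written out for three columns: the $(i, j)$ entry of $U^T U$ is $u_i^T u_j = u_i \cdot u_j$, so

$$U^T U = \begin{bmatrix} u_1^T \\ u_2^T \\ u_3^T \end{bmatrix} \begin{bmatrix} u_1 & u_2 & u_3 \end{bmatrix} = \begin{bmatrix} u_1 \cdot u_1 & u_1 \cdot u_2 & u_1 \cdot u_3 \\ u_2 \cdot u_1 & u_2 \cdot u_2 & u_2 \cdot u_3 \\ u_3 \cdot u_1 & u_3 \cdot u_2 & u_3 \cdot u_3 \end{bmatrix}.$$

This equals $I$ exactly when the diagonal entries are 1 (each column is a unit vector) and the off-diagonal entries are 0 (the columns are mutually orthogonal), which is precisely the statement that the columns are orthonormal. Example 10 is the same check in miniature: the dot product of the two vectors is $\frac{2}{\sqrt{5}} \cdot \frac{-1}{\sqrt{5}} + \frac{1}{\sqrt{5}} \cdot \frac{2}{\sqrt{5}} = 0$, each has length $\sqrt{4/5 + 1/5} = 1$, and two linearly independent vectors in $\mathbb{R}^2$ form a basis.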

Theorem 4.2. Let $U$ be an $m \times n$ matrix with orthonormal columns and let $x$ and $y$ be in $\mathbb{R}^n$. Then

(a) $\|Ux\| = \|x\|$

(b) $(Ux) \cdot (Uy) = x \cdot y$

(c) $(Ux) \cdot (Uy) = 0$ if and only if $x \cdot y = 0$

Proof. We prove only part (b); the others follow from it. We prove the case where $U$ has three columns. Let $U = [u_1\ u_2\ u_3]$, $x = \begin{bmatrix} x_1 \\ x_2 \\ x_3 \end{bmatrix}$, and $y = \begin{bmatrix} y_1 \\ y_2 \\ y_3 \end{bmatrix}$. Writing the dot product as a matrix product and using Theorem 4.1,

$$(Ux) \cdot (Uy) = (Ux)^T (Uy) = x^T U^T U y = x^T y = x \cdot y.$$

Taking $y = x$ in (b) gives $\|Ux\|^2 = \|x\|^2$, which is part (a), and part (c) is immediate from (b).

Notes
¹ T, T, F, F, F