Orthogonal Complements


Orthogonal Complements

Definition. Let W be a subspace of R^n. If a vector z is orthogonal to every vector in W, then z is said to be orthogonal to W. The set of all such vectors z is called the orthogonal complement of W, denoted W⊥.

- W⊥ is a subspace of R^n (HW8 - Problem 2)
- z ∈ W⊥ if and only if z is orthogonal to all the vectors in any basis of W
- If A is an m × n matrix, then [Row(A)]⊥ = Nul(A) and [Col(A)]⊥ = Nul(A^T)

Quang T. Bach Math 18 December 1, 2017 1 / 13
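The relation [Row(A)]⊥ = Nul(A) can be sanity-checked numerically: every vector in Nul(A) must be orthogonal to every row of A. A minimal sketch, using a hypothetical 2×3 matrix and a null-space vector found by hand (neither appears in the slides):

```python
def dot(u, v):
    """Dot product of two vectors given as lists."""
    return sum(a * b for a, b in zip(u, v))

# A hypothetical 2x3 matrix A (an assumed example, not from the slides)
A = [[1, 2, 3],
     [0, 1, 1]]

# By hand, x = (-1, -1, 1) solves Ax = 0:
#   row 1: -1 - 2 + 3 = 0,  row 2: -1 + 1 = 0
x = [-1, -1, 1]

# x lies in Nul(A), so it is orthogonal to every row of A,
# i.e. to all of Row(A) by linearity
assert all(dot(row, x) == 0 for row in A)
```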

Orthogonal Sets

Definition. A set of vectors S = {v_1, v_2, ..., v_p} in R^n is said to be an orthogonal set if every pair of distinct vectors in S is orthogonal; namely, v_i · v_j = 0 whenever i ≠ j. In addition, if each vector in S is a unit vector, then S is called orthonormal.

(Orthogonality Implies Linear Independence) If S = {v_1, v_2, ..., v_p} is an orthogonal set of non-zero vectors in R^n, then S is linearly independent and thus forms a basis for the subspace spanned by S.
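The pairwise condition v_i · v_j = 0 translates directly into a small checker. The helper name and test vectors below are illustrative, not from the slides:

```python
def dot(u, v):
    return sum(a * b for a, b in zip(u, v))

def is_orthogonal_set(vectors):
    """True if every pair of distinct vectors dots to zero."""
    return all(dot(vectors[i], vectors[j]) == 0
               for i in range(len(vectors))
               for j in range(i + 1, len(vectors)))

# The standard basis of R^3 is an orthogonal (indeed orthonormal) set
assert is_orthogonal_set([[1, 0, 0], [0, 1, 0], [0, 0, 1]])
# (1,1,0) and (1,0,0) are not orthogonal: their dot product is 1
assert not is_orthogonal_set([[1, 1, 0], [1, 0, 0]])
```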

Question 1. Which of the following sets is not orthogonal?

A. { (1, 2)^T, (2, −1)^T }
B. { (1/√5, 2/√5)^T, (−2/√5, 1/√5)^T }
C. { (1, 2)^T, (2, −1)^T, (0, 0)^T }
D. { (1, 2)^T, (2, −1)^T, (2, 4)^T }
E. (C) and (D)

Question 2. What is the maximum size of an orthogonal set of non-zero vectors in R^3?

A. 2
B. 3
C. 4
D. More than 4, but the number is finite
E. Infinitely many

Orthogonal Matrices

An m × n matrix U is called orthogonal if its columns form a set of orthonormal vectors.

Remark: An orthogonal matrix does not have to be square.

- If U is an m × n orthogonal matrix, then U^T U = I_n.
- If U is square and orthogonal, then U^{-1} = U^T (HW7 - Problem 3).

Question 3. Let A be an m × n orthogonal matrix. Which of the following must be false about its dimensions m and n?

A. m = n
B. m > n
C. m < n
D. Choose this if all of the above are possible
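The identity U^T U = I_n just says that entry (i, j) of U^T U is the dot product of columns i and j. A sketch with an assumed non-square (3×2) matrix whose columns are orthonormal, using exact fractions:

```python
from fractions import Fraction

def dot(u, v):
    return sum(a * b for a, b in zip(u, v))

# U is 3x2, stored as a list of its columns (an assumed example):
# column norms are 1 and the columns are orthogonal
cols = [[Fraction(1), Fraction(0), Fraction(0)],
        [Fraction(0), Fraction(3, 5), Fraction(4, 5)]]

# (U^T U)_{ij} = (column i) · (column j); for orthonormal columns
# this is the 2x2 identity, even though U itself is not square
UtU = [[dot(ci, cj) for cj in cols] for ci in cols]
assert UtU == [[1, 0], [0, 1]]
```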

Linear Transformations Using Orthogonal Matrices

Let U be an m × n matrix with orthonormal columns, and let x and y be vectors in R^n. Then:

a. ‖Ux‖ = ‖x‖
b. (Ux) · (Uy) = x · y
c. (Ux) · (Uy) = 0 if and only if x · y = 0

Note: The theorem above shows that the linear transformation x ↦ Ux preserves lengths and orthogonality. (HW8 - Problem 1)
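Properties (a) and (b) can be checked exactly for a concrete U; the 3×2 matrix and the vectors below are assumptions chosen for illustration:

```python
from fractions import Fraction as F

def dot(u, v):
    return sum(a * b for a, b in zip(u, v))

# Columns of an assumed 3x2 matrix U with orthonormal columns
c1 = [F(3, 5), F(4, 5), F(0)]
c2 = [F(-4, 5), F(3, 5), F(0)]

def apply_U(x):
    # Ux = x1*c1 + x2*c2 (linear combination of the columns)
    return [x[0] * a + x[1] * b for a, b in zip(c1, c2)]

x, y = [F(2), F(-1)], [F(5), F(3)]
Ux, Uy = apply_U(x), apply_U(y)

assert dot(Ux, Ux) == dot(x, x)   # ‖Ux‖² = ‖x‖², hence ‖Ux‖ = ‖x‖
assert dot(Ux, Uy) == dot(x, y)   # dot products (and angles) preserved
```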

Orthogonal Basis

Definition. If an orthogonal (orthonormal) set S is also a basis for a vector space V, then S is called an orthogonal basis (orthonormal basis) for V.

Example. The following sets are orthonormal bases of R^3:

S_1 = { (1/√2, 1/√2, 0)^T, (−√2/6, √2/6, 2√2/3)^T, (2/3, −2/3, 1/3)^T }
S_2 = { (cos θ, sin θ, 0)^T, (−sin θ, cos θ, 0)^T, (0, 0, 1)^T }

To see this, check that each set is orthonormal; linear independence, and hence the basis property, then follows from the theorem above.
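S_2 can be checked for any sample angle by confirming that all pairwise dot products match the identity pattern; the value θ = 0.7 below is an arbitrary choice:

```python
import math

def dot(u, v):
    return sum(a * b for a, b in zip(u, v))

t = 0.7  # arbitrary sample angle θ
S2 = [[math.cos(t), math.sin(t), 0.0],
      [-math.sin(t), math.cos(t), 0.0],
      [0.0, 0.0, 1.0]]

# Orthonormal: v_i · v_j is 1 when i = j (unit length) and 0 otherwise
for i in range(3):
    for j in range(3):
        expected = 1.0 if i == j else 0.0
        assert math.isclose(dot(S2[i], S2[j]), expected, abs_tol=1e-12)
```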

Orthogonal Projection

Given a vector y ∈ R^n and a non-zero vector u ∈ R^n, we can always decompose y into a sum of two vectors

y = ŷ + z,

where ŷ is a multiple of u and z is orthogonal to u. The vector ŷ is called the orthogonal projection of y onto u, and the vector z is the component of y orthogonal to u. In general, the projection of y onto Span{u} is given by

ŷ = (y · u / u · u) u.

We sometimes denote this by proj_L(y), where L is the subspace spanned by u.
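The projection formula is a one-liner in code. The helper and the vectors below are illustrative, not from the slides; exact fractions keep the check clean:

```python
from fractions import Fraction as F

def dot(u, v):
    return sum(a * b for a, b in zip(u, v))

def project(y, u):
    """ŷ = (y·u / u·u) u, the orthogonal projection of y onto Span{u}."""
    c = F(dot(y, u), dot(u, u))
    return [c * ui for ui in u]

y, u = [3, 1], [1, 1]
y_hat = project(y, u)
z = [a - b for a, b in zip(y, y_hat)]

assert y_hat == [2, 2]   # (3 + 1)/(1 + 1) times (1, 1)
assert dot(z, u) == 0    # the remainder z is orthogonal to u
```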

Orthogonal Projection - Example

Example. Find the orthogonal projection of y = (7, 6)^T onto u = (4, 2)^T, and decompose y.

Answer:

ŷ = (y · u / u · u) u = ((28 + 12) / (16 + 4)) u = 2u = (8, 4)^T
z = y − ŷ = (−1, 2)^T
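The arithmetic above can be replayed in a few lines (a sketch mirroring the example's numbers):

```python
def dot(u, v):
    return sum(a * b for a, b in zip(u, v))

y, u = [7, 6], [4, 2]
c = dot(y, u) / dot(u, u)          # (28 + 12) / (16 + 4) = 2.0
y_hat = [c * ui for ui in u]
z = [a - b for a, b in zip(y, y_hat)]

assert y_hat == [8.0, 4.0]
assert z == [-1.0, 2.0]
assert dot(z, u) == 0.0            # z really is orthogonal to u
```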

Coordinates under an Orthogonal Basis

Let S = {v_1, v_2, ..., v_p} be an orthogonal basis for a subspace W of R^n. For each u ∈ W, the weights in the linear combination

u = c_1 v_1 + c_2 v_2 + ... + c_p v_p

are given by

c_j = (u · v_j) / (v_j · v_j).
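With an orthogonal basis, each weight comes from a single dot-product quotient, with no linear system to solve. A sketch with an assumed orthogonal basis of R^2:

```python
from fractions import Fraction as F

def dot(u, v):
    return sum(a * b for a, b in zip(u, v))

# An assumed orthogonal (not orthonormal) basis of R^2
v1, v2 = [1, 1], [1, -1]
u = [5, 3]

c1 = F(dot(u, v1), dot(v1, v1))   # 8/2 = 4
c2 = F(dot(u, v2), dot(v2, v2))   # 2/2 = 1

# c1*v1 + c2*v2 recovers u exactly
recon = [c1 * a + c2 * b for a, b in zip(v1, v2)]
assert recon == [5, 3]
```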

Orthogonal Decomposition

Let W be a subspace of R^n with an orthogonal basis S = {v_1, v_2, ..., v_p}. Then every vector y in R^n can be written as y = ŷ + z, where

ŷ = (y · v_1 / v_1 · v_1) v_1 + (y · v_2 / v_2 · v_2) v_2 + ... + (y · v_p / v_p · v_p) v_p ∈ W

and

z = y − ŷ ∈ W⊥.

We sometimes denote ŷ, the orthogonal projection of y onto W, by proj_W(y).
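This extends the single-vector projection by summing one term per basis vector. The basis below (spanning the xy-plane) is an assumed example:

```python
from fractions import Fraction as F

def dot(u, v):
    return sum(a * b for a, b in zip(u, v))

def proj_W(y, basis):
    """ŷ = Σ (y·v_i / v_i·v_i) v_i, for an ORTHOGONAL basis of W."""
    y_hat = [F(0)] * len(y)
    for v in basis:
        c = F(dot(y, v), dot(v, v))
        y_hat = [a + c * b for a, b in zip(y_hat, v)]
    return y_hat

y = [1, 2, 3]
basis = [[1, 0, 0], [0, 1, 0]]   # W = the xy-plane (assumed example)
y_hat = proj_W(y, basis)
z = [a - b for a, b in zip(y, y_hat)]

assert y_hat == [1, 2, 0]
assert all(dot(z, v) == 0 for v in basis)   # z ∈ W⊥
```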

Orthogonal Decomposition - Example

Example. Let y = (1, 2, 3)^T and W = Span{v_1, v_2}, where v_1 = (2, 5, −1)^T and v_2 = (−2, 1, 1)^T. Decompose y into the sum of a vector in W and a vector in W⊥.

ŷ = (y · v_1 / v_1 · v_1) v_1 + (y · v_2 / v_2 · v_2) v_2 = (3/10) v_1 + (1/2) v_2 = (−2/5, 2, 1/5)^T ∈ W
z = y − ŷ = (7/5, 0, 14/5)^T ∈ W⊥

Check that ŷ ⊥ z.
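The example's fractions can be confirmed exactly (a sketch reusing the example's y, v_1, v_2):

```python
from fractions import Fraction as F

def dot(u, v):
    return sum(a * b for a, b in zip(u, v))

y = [1, 2, 3]
v1, v2 = [2, 5, -1], [-2, 1, 1]
assert dot(v1, v2) == 0                      # {v1, v2} is an orthogonal basis of W

c1 = F(dot(y, v1), dot(v1, v1))              # 9/30 = 3/10
c2 = F(dot(y, v2), dot(v2, v2))              # 3/6  = 1/2
y_hat = [c1 * a + c2 * b for a, b in zip(v1, v2)]
z = [a - b for a, b in zip(y, y_hat)]

assert y_hat == [F(-2, 5), 2, F(1, 5)]
assert z == [F(7, 5), 0, F(14, 5)]
assert dot(z, v1) == 0 and dot(z, v2) == 0   # z ∈ W⊥, so ŷ ⊥ z
```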

Orthogonal Projection - Properties

Lemma. If y ∈ W, then proj_W(y) = y.

Let W be a subspace of R^n, let y be any vector in R^n, and let ŷ be the orthogonal projection of y onto W. Then ŷ is the closest point in W to y, in the sense that

d(y, ŷ) = ‖y − ŷ‖ < ‖y − v‖ = d(y, v)

for all v ∈ W with v ≠ ŷ. The vector ŷ is called the best approximation to y by elements of W.
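The best-approximation property can be probed by comparing ‖y − ŷ‖ against ‖y − v‖ for many sampled points v of W. The subspace (the xy-plane) and the vectors below are assumptions for illustration:

```python
import math
import random

y = [1.0, 2.0, 3.0]
# W = xy-plane, so projecting onto W just zeroes the last coordinate
y_hat = [1.0, 2.0, 0.0]

def dist(a, b):
    """Euclidean distance between two vectors."""
    return math.sqrt(sum((p - q) ** 2 for p, q in zip(a, b)))

random.seed(0)
for _ in range(100):
    # A random point v of W (third coordinate 0)
    v = [random.uniform(-10, 10), random.uniform(-10, 10), 0.0]
    # ŷ is never farther from y than any other point of W
    assert dist(y, y_hat) <= dist(y, v)
```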