Announcements Monday, November 19

Announcements Monday, November 19

- You should already have the link to view your graded midterm online.
- Course grades will be curved at the end of the semester. The percentage of A's, B's, and C's to be awarded depends on many factors, and will not be determined until all grades are in. Individual exam grades are not curved.
- Send regrade requests by tomorrow.
- WeBWorK 6.6, 7.1, 7.2 are due the Wednesday after Thanksgiving.
- No more quizzes!
- My office is Skiles 244, and Rabinoffice hours are Mondays 12-1pm and Wednesdays 1-3pm. (But not this Wednesday.)

Section 7.2 Orthogonal Complements

Orthogonal Complements Definition. Let W be a subspace of $\mathbb{R}^n$. Its orthogonal complement is $W^\perp = \{ v \text{ in } \mathbb{R}^n \mid v \cdot w = 0 \text{ for all } w \text{ in } W \}$, read "W perp." (Notation: $W^\perp$ is the orthogonal complement; $A^T$ is the transpose.) Pictures: the orthogonal complement of a line in $\mathbb{R}^2$ is the perpendicular line [interactive]; the orthogonal complement of a line in $\mathbb{R}^3$ is the perpendicular plane [interactive]; the orthogonal complement of a plane in $\mathbb{R}^3$ is the perpendicular line [interactive].

Poll

Orthogonal Complements Basic properties. Let W be a subspace of $\mathbb{R}^n$. Facts:
1. $W^\perp$ is also a subspace of $\mathbb{R}^n$.
2. $(W^\perp)^\perp = W$.
3. $\dim W + \dim W^\perp = n$.
4. If $W = \operatorname{Span}\{v_1, v_2, \ldots, v_m\}$, then $W^\perp$ = all vectors orthogonal to each $v_1, v_2, \ldots, v_m$ = $\{ x \text{ in } \mathbb{R}^n \mid x \cdot v_i = 0 \text{ for all } i = 1, 2, \ldots, m \} = \operatorname{Nul}\begin{pmatrix} v_1^T \\ v_2^T \\ \vdots \\ v_m^T \end{pmatrix}$.

Orthogonal Complements Computation. Problem: if $W = \operatorname{Span}\left\{ \begin{pmatrix} 1 \\ 1 \\ -1 \end{pmatrix}, \begin{pmatrix} 1 \\ 1 \\ 1 \end{pmatrix} \right\}$, compute $W^\perp$. [interactive] Use the fact that $\operatorname{Span}\{v_1, v_2, \ldots, v_m\}^\perp = \operatorname{Nul}\begin{pmatrix} v_1^T \\ v_2^T \\ \vdots \\ v_m^T \end{pmatrix}$.
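As a numerical sanity check (not part of the original slides), here is a short NumPy/SciPy sketch of this computation; scipy.linalg.null_space stands in for the row reduction, and the two vectors are the ones from the problem above.

```python
import numpy as np
from scipy.linalg import null_space

# Spanning set for W (the vectors from the problem above).
v1 = np.array([1.0, 1.0, -1.0])
v2 = np.array([1.0, 1.0, 1.0])

# Stack the spanning vectors as rows: W-perp = Nul(A).
A = np.vstack([v1, v2])

# Orthonormal basis for Nul(A), i.e. for W-perp.
B = null_space(A)
print(B)        # one column, proportional to (1, -1, 0)
print(A @ B)    # ~0: the column is orthogonal to both v1 and v2
```

The single column of B is proportional to (1, -1, 0), so $W^\perp$ is the line spanned by that vector.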

Orthogonal Complements Row space, column space, null space. Definition: The row space of an $m \times n$ matrix A is the span of the rows of A. It is denoted $\operatorname{Row} A$. Equivalently, it is the column space of $A^T$: $\operatorname{Row} A = \operatorname{Col} A^T$. It is a subspace of $\mathbb{R}^n$. We showed before that if A has rows $v_1^T, v_2^T, \ldots, v_m^T$, then $\operatorname{Span}\{v_1, v_2, \ldots, v_m\}^\perp = \operatorname{Nul} A$. Hence we have shown: Fact: $(\operatorname{Row} A)^\perp = \operatorname{Nul} A$. Replacing A by $A^T$, and remembering $\operatorname{Row} A^T = \operatorname{Col} A$: Fact: $(\operatorname{Col} A)^\perp = \operatorname{Nul} A^T$. Using property 2 and taking the orthogonal complements of both sides, we get: Fact: $(\operatorname{Nul} A)^\perp = \operatorname{Row} A$ and $\operatorname{Col} A = (\operatorname{Nul} A^T)^\perp$.

Orthogonal Complements Reference sheet: orthogonal complements of most of the subspaces we've seen. For any vectors $v_1, v_2, \ldots, v_m$: $\operatorname{Span}\{v_1, v_2, \ldots, v_m\}^\perp = \operatorname{Nul}\begin{pmatrix} v_1^T \\ v_2^T \\ \vdots \\ v_m^T \end{pmatrix}$. For any matrix A: $\operatorname{Row} A = \operatorname{Col} A^T$, and $(\operatorname{Row} A)^\perp = \operatorname{Nul} A$, $\operatorname{Row} A = (\operatorname{Nul} A)^\perp$, $(\operatorname{Col} A)^\perp = \operatorname{Nul} A^T$, $\operatorname{Col} A = (\operatorname{Nul} A^T)^\perp$. For any other subspace W, first find a basis $v_1, \ldots, v_m$, then use the above trick to compute $W^\perp = \operatorname{Span}\{v_1, \ldots, v_m\}^\perp$.

Section 7.3 Orthogonal Projections

Best Approximation. Suppose you measure a data point x which you know for theoretical reasons must lie on a subspace W. Due to measurement error, though, the measured x is not actually in W. Best approximation: y is the closest point to x on W. How do you know that y is the closest point? The vector from y to x is orthogonal to W: it is in the orthogonal complement $W^\perp$.

Orthogonal Decomposition. Theorem: Every vector x in $\mathbb{R}^n$ can be written as $x = x_W + x_{W^\perp}$ for unique vectors $x_W$ in W and $x_{W^\perp}$ in $W^\perp$. The equation $x = x_W + x_{W^\perp}$ is called the orthogonal decomposition of x (with respect to W). The vector $x_W$ is the orthogonal projection of x onto W, and it is the closest vector to x on W. [interactive 1] [interactive 2]

Orthogonal Decomposition Justification. Theorem: Every vector x in $\mathbb{R}^n$ can be written as $x = x_W + x_{W^\perp}$ for unique vectors $x_W$ in W and $x_{W^\perp}$ in $W^\perp$. Why? Uniqueness, at least, is quick: if $x = y + z = y' + z'$ with $y, y'$ in W and $z, z'$ in $W^\perp$, then $y - y' = z' - z$ lies in both W and $W^\perp$, so it is orthogonal to itself and hence zero.

Orthogonal Decomposition Example. Let W be the xy-plane in $\mathbb{R}^3$; then $W^\perp$ is the z-axis. For example, $x = \begin{pmatrix} 2 \\ 1 \\ 3 \end{pmatrix} \implies x_W = \begin{pmatrix} 2 \\ 1 \\ 0 \end{pmatrix}, \; x_{W^\perp} = \begin{pmatrix} 0 \\ 0 \\ 3 \end{pmatrix}$; in general, $x = \begin{pmatrix} a \\ b \\ c \end{pmatrix} \implies x_W = \begin{pmatrix} a \\ b \\ 0 \end{pmatrix}, \; x_{W^\perp} = \begin{pmatrix} 0 \\ 0 \\ c \end{pmatrix}$. This is just decomposing a vector into a horizontal component (in the xy-plane) and a vertical component (on the z-axis). [interactive]

Orthogonal Decomposition Computation? Problem: Given x and W, how do you compute the decomposition $x = x_W + x_{W^\perp}$? Observation: it is enough to compute $x_W$, because $x_{W^\perp} = x - x_W$.

The $A^TA$ Trick. Theorem (the $A^TA$ trick): Let W be a subspace of $\mathbb{R}^n$, let $v_1, v_2, \ldots, v_m$ be a spanning set for W (e.g., a basis), and let A be the matrix with columns $v_1, v_2, \ldots, v_m$. Then for any x in $\mathbb{R}^n$, the matrix equation $A^TAv = A^Tx$ (in the unknown vector v) is consistent, and $x_W = Av$ for any solution v.

Recipe for computing $x = x_W + x_{W^\perp}$:
1. Write W as the column space of a matrix A.
2. Find a solution v of $A^TAv = A^Tx$ (by row reducing).
3. Then $x_W = Av$ and $x_{W^\perp} = x - x_W$.
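The recipe above translates directly into a few lines of NumPy. This sketch is not from the slides; the helper name project_onto is made up, and np.linalg.lstsq is used to find a solution v of $A^TAv = A^Tx$ even when $A^TA$ is not invertible.

```python
import numpy as np

def project_onto(A, x):
    """Orthogonal projection of x onto W = Col(A), via the A^T A trick.

    Solves A^T A v = A^T x; this system is always consistent, and
    x_W = A v for any solution v.  lstsq returns a least-squares
    solution, which here is an exact one.
    """
    v, *_ = np.linalg.lstsq(A.T @ A, A.T @ x, rcond=None)
    x_W = A @ v
    return x_W, x - x_W          # (x_W, x_{W-perp})

# Example: project x = (1, 2, 3) onto the xy-plane, W = Col(A).
A = np.array([[1.0, 0.0],
              [0.0, 1.0],
              [0.0, 0.0]])
x = np.array([1.0, 2.0, 3.0])
x_W, x_perp = project_onto(A, x)
print(x_W)     # [1. 2. 0.]
print(x_perp)  # [0. 0. 3.]
```

Using lstsq rather than np.linalg.solve matters when the spanning set is linearly dependent: then $A^TA$ is singular but the system is still consistent, and any solution v gives the same $x_W = Av$.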

The $A^TA$ Trick Example. Problem: Compute the orthogonal projection of a vector $x = (x_1, x_2, x_3)$ in $\mathbb{R}^3$ onto the xy-plane.

The $A^TA$ Trick Another Example. Problem: Let $x = \begin{pmatrix} 1 \\ 2 \\ 3 \end{pmatrix}$ and $W = \left\{ \begin{pmatrix} x_1 \\ x_2 \\ x_3 \end{pmatrix} \text{ in } \mathbb{R}^3 \;\middle|\; x_1 - x_2 + x_3 = 0 \right\}$. Compute the distance from x to W. [interactive]
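As a check on this example (assuming the reconstructed data x = (1, 2, 3) and the plane $x_1 - x_2 + x_3 = 0$): here $W^\perp$ is the line spanned by the normal vector u = (1, -1, 1), so the distance from x to W is $\|x_{W^\perp}\|$. A minimal NumPy sketch:

```python
import numpy as np

x = np.array([1.0, 2.0, 3.0])
u = np.array([1.0, -1.0, 1.0])   # normal to W, i.e. a basis of W-perp

# Project x onto the line W-perp: x_perp = (u.x / u.u) u.
x_perp = (u @ x) / (u @ u) * u
print(x_perp)                    # [ 2/3, -2/3, 2/3 ]
print(np.linalg.norm(x_perp))    # distance from x to W: 2/sqrt(3) = 1.1547...
```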

The $A^TA$ Trick Proof. Theorem (the $A^TA$ trick): Let W be a subspace of $\mathbb{R}^n$, let $v_1, v_2, \ldots, v_m$ be a spanning set for W (e.g., a basis), and let A be the matrix with columns $v_1, v_2, \ldots, v_m$. Then for any x in $\mathbb{R}^n$, the matrix equation $A^TAv = A^Tx$ (in the unknown vector v) is consistent, and $x_W = Av$ for any solution v. Proof: Write $x = x_W + x_{W^\perp}$. Since $x_W$ is in $W = \operatorname{Col} A$, we have $x_W = Av_0$ for some vector $v_0$. Then $x - Av_0 = x_{W^\perp}$ lies in $W^\perp = (\operatorname{Col} A)^\perp = \operatorname{Nul} A^T$, so $A^T(x - Av_0) = 0$; that is, $v_0$ solves $A^TAv = A^Tx$, and the equation is consistent. Conversely, if v is any solution, then $A^T(x - Av) = 0$, so $x - Av$ is in $\operatorname{Nul} A^T = W^\perp$, while $Av$ is in W; by uniqueness of the orthogonal decomposition, $Av = x_W$.

Orthogonal Projection onto a Line. Problem: Let $L = \operatorname{Span}\{u\}$ be a line in $\mathbb{R}^n$ and let x be a vector in $\mathbb{R}^n$. Compute $x_L$. Projection onto a line: the projection of x onto the line $L = \operatorname{Span}\{u\}$ is $x_L = \dfrac{u \cdot x}{u \cdot u}\,u$, and $x_{L^\perp} = x - x_L$.

Orthogonal Projection onto a Line Example. Problem: Compute the orthogonal projection of $x = \begin{pmatrix} -6 \\ 4 \end{pmatrix}$ onto the line L spanned by $u = \begin{pmatrix} 3 \\ 2 \end{pmatrix}$, and find the distance from x to L. Answer: $x_L = \dfrac{u \cdot x}{u \cdot u}\,u = -\dfrac{10}{13}\begin{pmatrix} 3 \\ 2 \end{pmatrix}$. [interactive]
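A quick NumPy verification of this computation (again a sketch, not from the slides, assuming the reconstructed values x = (-6, 4) and u = (3, 2)):

```python
import numpy as np

x = np.array([-6.0, 4.0])
u = np.array([3.0, 2.0])

x_L = (u @ x) / (u @ u) * u       # projection onto L = Span{u}
print(x_L)                        # [-30/13, -20/13] = [-2.3077, -1.5385]
print(np.linalg.norm(x - x_L))    # distance from x to L: 6.6564...
```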

Summary. Let W be a subspace of $\mathbb{R}^n$.

- The orthogonal complement $W^\perp$ is the set of all vectors orthogonal to everything in W.
- We have $(W^\perp)^\perp = W$ and $\dim W + \dim W^\perp = n$.
- $\operatorname{Row} A = \operatorname{Col} A^T$, $(\operatorname{Row} A)^\perp = \operatorname{Nul} A$, $\operatorname{Row} A = (\operatorname{Nul} A)^\perp$, $(\operatorname{Col} A)^\perp = \operatorname{Nul} A^T$, $\operatorname{Col} A = (\operatorname{Nul} A^T)^\perp$.
- Orthogonal decomposition: any vector x in $\mathbb{R}^n$ can be written in a unique way as $x = x_W + x_{W^\perp}$ for $x_W$ in W and $x_{W^\perp}$ in $W^\perp$.
- The vector $x_W$ is the orthogonal projection of x onto W. It is the closest point to x in W: it is the best approximation.
- The distance from x to W is $\|x_{W^\perp}\|$.
- If $W = \operatorname{Col} A$, then to compute $x_W$, solve the equation $A^TAv = A^Tx$; then $x_W = Av$.
- If $W = L = \operatorname{Span}\{u\}$ is a line, then $x_L = \dfrac{u \cdot x}{u \cdot u}\,u$.