Section 6.1. Inner Product, Length, and Orthogonality


Orientation

Goal: "almost solve" the equation Ax = b.

Problem: in the real world, data is imperfect. Suppose you measure a data point x that, for theoretical reasons, must lie on the plane Span{u, v}. Due to measurement error, the measured x is not actually in Span{u, v}. What do you do? The real value is probably the closest point on the plane to x.

New terms: orthogonal projection (the "closest point"), orthogonal vectors, angle.

The Dot Product

The dot product encodes the notion of angle between two vectors. We will use it to define orthogonality (i.e., when two vectors are perpendicular).

Definition. The dot product of two vectors x, y in R^n is

    x · y = x_1 y_1 + x_2 y_2 + ... + x_n y_n.

This is the same as x^T y.

Example.

    (1, 2, 3) · (4, 5, 6) = 1·4 + 2·5 + 3·6 = 32.
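The example above can be checked numerically. Here is a quick sketch using NumPy (a library choice not made in the notes):

```python
import numpy as np

# Dot product of x = (1, 2, 3) and y = (4, 5, 6), as in the example above.
x = np.array([1, 2, 3])
y = np.array([4, 5, 6])

d = x @ y            # 1*4 + 2*5 + 3*6 = 32
d_as_xTy = x.T @ y   # the same number, computed as x^T y
```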

Properties of the Dot Product

Many usual arithmetic rules hold, as long as you remember you can only dot two vectors together, and that the result is a scalar.

    x · y = y · x
    (x + y) · z = x · z + y · z
    (cx) · y = c(x · y)

Dotting a vector with itself is special:

    x · x = x_1^2 + x_2^2 + ... + x_n^2.

Hence:

    x · x ≥ 0
    x · x = 0 if and only if x = 0.

Important: x · y = 0 does not imply x = 0 or y = 0. For example, (1, 0) · (0, 1) = 0.
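These properties can be spot-checked on arbitrary integer vectors; a small NumPy sketch (the vectors here are random, not from the notes):

```python
import numpy as np

rng = np.random.default_rng(0)
x, y, z = rng.integers(-5, 6, size=(3, 4))  # three arbitrary vectors in R^4
c = 3

symmetric    = (x @ y) == (y @ x)                # x·y = y·x
distributive = ((x + y) @ z) == (x @ z + y @ z)  # (x+y)·z = x·z + y·z
homogeneous  = ((c * x) @ y) == c * (x @ y)      # (cx)·y = c(x·y)
nonnegative  = (x @ x) >= 0                      # x·x ≥ 0
```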

The Dot Product and Length

Definition. The length or norm of a vector x in R^n is

    ||x|| = sqrt(x · x) = sqrt(x_1^2 + x_2^2 + ... + x_n^2).

Why is this a good definition? The Pythagorean theorem! For example, the vector (3, 4) has length sqrt(3^2 + 4^2) = 5.

Fact. If x is a vector and c is a scalar, then ||cx|| = |c| ||x||. For example,

    ||(6, 8)|| = ||2 (3, 4)|| = 2 ||(3, 4)|| = 10.
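In NumPy the norm is `np.linalg.norm`; a minimal sketch of the two facts above, with the numbers from the slide:

```python
import numpy as np

x = np.array([3.0, 4.0])

length = np.linalg.norm(x)        # sqrt(3^2 + 4^2) = 5
scaled = np.linalg.norm(-2 * x)   # ||cx|| = |c|·||x|| = 10, even for c = -2
```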

The Dot Product and Distance

Definition. The distance between two points x, y in R^n is

    dist(x, y) = ||y − x||.

This is just the length of the vector from x to y.

Example. Let x = (1, 2) and y = (4, 4). Then

    dist(x, y) = ||y − x|| = ||(3, 2)|| = sqrt(3^2 + 2^2) = sqrt(13).
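The distance in the example can be computed the same way (a NumPy sketch):

```python
import numpy as np

x = np.array([1.0, 2.0])
y = np.array([4.0, 4.0])

dist = np.linalg.norm(y - x)  # ||(3, 2)|| = sqrt(13)
```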

Unit Vectors

Definition. A unit vector is a vector v with length ||v|| = 1.

Example. The unit coordinate vectors are unit vectors:

    ||e_1|| = ||(1, 0, 0)|| = sqrt(1^2 + 0^2 + 0^2) = 1.

Definition. Let x be a nonzero vector in R^n. The unit vector in the direction of x is the vector x / ||x||.

Is this really a unit vector? Yes: since 1/||x|| is a positive scalar,

    || x / ||x|| || = (1 / ||x||) ||x|| = 1.

Unit Vectors Example

Problem: What is the unit vector u in the direction of x = (3, 4)?

    u = x / ||x|| = (1 / sqrt(3^2 + 4^2)) (3, 4) = (1/5) (3, 4) = (3/5, 4/5).
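A numerical version of the same computation (NumPy sketch):

```python
import numpy as np

x = np.array([3.0, 4.0])
u = x / np.linalg.norm(x)  # unit vector in the direction of x: (3/5, 4/5)
```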

Orthogonality

Definition. Two vectors x, y are orthogonal or perpendicular if x · y = 0. Notation: x ⊥ y.

Why is this a good definition? The Pythagorean theorem / law of cosines:

    x and y are perpendicular
    ⟺ ||x||^2 + ||y||^2 = ||x − y||^2
    ⟺ x · x + y · y = (x − y) · (x − y)
    ⟺ x · x + y · y = x · x + y · y − 2 x · y
    ⟺ x · y = 0.

Fact (Pythagorean theorem): x ⊥ y ⟺ ||x − y||^2 = ||x||^2 + ||y||^2.
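The equivalence can be spot-checked on a pair of orthogonal vectors, chosen here for illustration (NumPy sketch):

```python
import numpy as np

x = np.array([1.0, 1.0, -1.0])
y = np.array([1.0, 0.0, 1.0])   # x · y = 1 + 0 - 1 = 0, so x ⊥ y

lhs = np.linalg.norm(x - y) ** 2                       # ||x - y||^2
rhs = np.linalg.norm(x) ** 2 + np.linalg.norm(y) ** 2  # ||x||^2 + ||y||^2
```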

Orthogonality Example

Problem: Find all vectors orthogonal to v = (1, 1, −1).

We have to find all vectors x such that x · v = 0. This means solving the equation

    0 = x · v = (x_1, x_2, x_3) · (1, 1, −1) = x_1 + x_2 − x_3.

The parametric form of the solution is x_1 = −x_2 + x_3, so the parametric vector form of the general solution is

    x = (x_1, x_2, x_3) = x_2 (−1, 1, 0) + x_3 (1, 0, 1).

For instance, (1, 0, 1) ⊥ (1, 1, −1) because (1, 0, 1) · (1, 1, −1) = 1 + 0 − 1 = 0.
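Both basis vectors from the parametric vector form can be checked against v directly (NumPy sketch):

```python
import numpy as np

v = np.array([1, 1, -1])
b1 = np.array([-1, 1, 0])   # basis vectors from the parametric vector form
b2 = np.array([1, 0, 1])

dots = (b1 @ v, b2 @ v)     # both dot products should be 0
```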

Orthogonality Example

Problem: Find all vectors orthogonal to both v = (1, 1, −1) and w = (1, 1, 1).

Now we have to solve the system of two homogeneous equations

    0 = x · v = (x_1, x_2, x_3) · (1, 1, −1) = x_1 + x_2 − x_3
    0 = x · w = (x_1, x_2, x_3) · (1, 1, 1) = x_1 + x_2 + x_3.

In matrix form (the rows are v and w):

    ( 1  1  −1 )   rref   ( 1  1  0 )
    ( 1  1   1 )  ----->  ( 0  0  1 )

The parametric vector form of the solution is

    (x_1, x_2, x_3) = x_2 (−1, 1, 0).

Orthogonality General procedure

Problem: Find all vectors orthogonal to v_1, v_2, ..., v_m in R^n.

This is the same as finding all vectors x such that

    0 = v_1^T x = v_2^T x = ... = v_m^T x.

Putting the row vectors v_1^T, v_2^T, ..., v_m^T into a matrix, this is the same as finding all x such that

    ( v_1^T )       ( v_1 · x )
    ( v_2^T ) x  =  ( v_2 · x )  =  0.
    (  ...  )       (   ...   )
    ( v_m^T )       ( v_m · x )

The key observation: the set of all vectors orthogonal to some vectors v_1, v_2, ..., v_m in R^n is the null space of the m × n matrix whose rows are v_1^T, v_2^T, ..., v_m^T. In particular, this set is a subspace!
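Following the key observation, a basis for the set of vectors orthogonal to v_1, ..., v_m can be computed as a null-space basis. One way to do this numerically is via the SVD; the helper name `null_space_basis` below is my own, not from the notes:

```python
import numpy as np

def null_space_basis(A, tol=1e-12):
    """Columns of the result form an orthonormal basis of Nul A."""
    _, s, Vt = np.linalg.svd(A)
    rank = int((s > tol).sum())
    return Vt[rank:].T  # right singular vectors past the rank span Nul A

# All vectors orthogonal to v = (1, 1, -1) and w = (1, 1, 1):
A = np.array([[1.0, 1.0, -1.0],   # rows are v^T and w^T
              [1.0, 1.0,  1.0]])
N = null_space_basis(A)           # one basis vector, proportional to (-1, 1, 0)
```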

Orthogonal Complements

Definition. Let W be a subspace of R^n. Its orthogonal complement is

    W^⊥ = { v in R^n : v · w = 0 for all w in W },

read "W perp."

Pictures:
- The orthogonal complement of a line in R^2 is the perpendicular line.
- The orthogonal complement of a line in R^3 is the perpendicular plane.
- The orthogonal complement of a plane in R^3 is the perpendicular line.

Orthogonal Complements Basic properties

Facts: Let W be a subspace of R^n.

1. W^⊥ is also a subspace of R^n.
2. (W^⊥)^⊥ = W.
3. dim W + dim W^⊥ = n.
4. If W = Span{v_1, v_2, ..., v_m}, then

    W^⊥ = all vectors orthogonal to each of v_1, v_2, ..., v_m
        = { x in R^n : x · v_i = 0 for all i = 1, 2, ..., m }
        = Nul of the matrix whose rows are v_1^T, v_2^T, ..., v_m^T.

In other words, property 4 says:

    (Span{v_1, v_2, ..., v_m})^⊥ = Nul of the matrix whose rows are v_1^T, v_2^T, ..., v_m^T.
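Fact 3 can be spot-checked numerically: by property 4, the complement of a row span is a null space, and the SVD gives the dimensions of both. A NumPy sketch with an arbitrary random matrix:

```python
import numpy as np

rng = np.random.default_rng(1)
A = rng.standard_normal((3, 5))   # W = span of the rows, a subspace of R^5

_, s, Vt = np.linalg.svd(A)
dim_W = int((s > 1e-12).sum())    # dim W = rank of A
N = Vt[dim_W:].T                  # orthonormal basis of W^perp = Nul A
dim_W_perp = N.shape[1]           # the two dimensions should sum to n = 5
```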

Orthogonal Complements Row space, column space, null space

Definition. The row space of an m × n matrix A is the span of the rows of A. It is denoted Row A. Equivalently, it is the column space of A^T:

    Row A = Col A^T.

It is a subspace of R^n.

We showed before that if A has rows v_1^T, v_2^T, ..., v_m^T, then

    (Span{v_1, v_2, ..., v_m})^⊥ = Nul A.

Hence we have shown:

    (Row A)^⊥ = Nul A.

Other facts:
- (Col A)^⊥ = Nul A^T (replace A by A^T, and remember Row A^T = Col A).
- (Nul A)^⊥ = Row A and Col A = (Nul A^T)^⊥ (use property 2 and take orthogonal complements of both sides).

Reference sheet: Orthogonal Complements of Most of the Subspaces We've Seen

For any vectors v_1, v_2, ..., v_m:

    (Span{v_1, v_2, ..., v_m})^⊥ = Nul of the matrix whose rows are v_1^T, v_2^T, ..., v_m^T.

For any matrix A:

    Row A = Col A^T, thus:
    (Row A)^⊥ = Nul A
    (Col A)^⊥ = Nul A^T
    Row A = (Nul A)^⊥
    Col A = (Nul A^T)^⊥

Extra: Practice proving a set is a subspace

Example: let's check that W^⊥ is a subspace.

- Is 0 in W^⊥? Yes: 0 · w = 0 for any w in W.
- Closed under addition: suppose x, y are in W^⊥, so x · w = 0 and y · w = 0 for all w in W. Then (x + y) · w = x · w + y · w = 0 + 0 = 0 for all w in W, so x + y is also in W^⊥.
- Closed under scalar multiplication: suppose x is in W^⊥, so x · w = 0 for all w in W. If c is a scalar, then (cx) · w = c(x · w) = c(0) = 0 for any w in W, so cx is in W^⊥.