ORTHOGONALITY AND LEAST-SQUARES [CHAP. 6]

Inner products and Norms

Inner product or dot product of two vectors u and v in R^n:

    u.v = u_1 v_1 + u_2 v_2 + ... + u_n v_n

Calculate u.v when u = (1, 2, 2, 0)^T and v = (1, 0, 1, 5)^T.

If u and v are vectors in R^n, then we can regard u and v as n x 1 matrices. The transpose u^T is a 1 x n matrix, and the matrix product u^T v is a 1 x 1 matrix, i.e., a scalar. Then note that

    u.v = v.u = u^T v = v^T u
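A minimal sketch of these identities on the exercise's vectors. NumPy is an assumption here; the slides name no software:

```python
import numpy as np

u = np.array([1, 2, 2, 0])
v = np.array([1, 0, 1, 5])

# Dot product three equivalent ways: u.v = u^T v = v^T u
print(np.dot(u, v))   # 1*1 + 2*0 + 2*1 + 0*5 = 3
print(u @ v)          # same value via the matrix product u^T v
print(v @ u)          # symmetry: v.u = u.v
```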

Length of a vector in R^n

The Euclidean norm of a vector u is ||u|| = sqrt(u.u), i.e.,

    ||u|| = (u.u)^{1/2} = sqrt(u_1^2 + u_2^2 + ... + u_n^2)

This is the length of the vector u. If we identify v with a geometric point in the plane, then ||v|| is the standard notion of the length of the line segment from 0 to v. This follows from the Pythagorean Theorem applied to a right triangle.

A vector of length one is often called a unit vector. The process of dividing a vector by its length to create a vector of unit length (a unit vector) is called normalizing.

Normalize v = (1, 2, 2, 0).
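A sketch of the normalization exercise, again in NumPy (an assumption, not prescribed by the slides):

```python
import numpy as np

v = np.array([1.0, 2.0, 2.0, 0.0])

norm_v = np.sqrt(v @ v)   # ||v|| = sqrt(1 + 4 + 4 + 0) = 3
unit_v = v / norm_v       # (1/3, 2/3, 2/3, 0) has length 1

print(norm_v, unit_v, np.linalg.norm(unit_v))   # 3.0 [...] 1.0
```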

Important properties

For any scalar α, the length of αv is |α| times the length of v. That is,

    ||αv|| = |α| ||v||

The length of the sum of any two vectors does not exceed the sum of the lengths of the vectors (triangle inequality):

    ||u + v|| <= ||u|| + ||v||

The Cauchy-Schwarz inequality:

    |x.y| <= ||x|| ||y||
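These properties are easy to sanity-check numerically. A minimal sketch, assuming NumPy and arbitrary random vectors:

```python
import numpy as np

rng = np.random.default_rng(0)
u, v = rng.normal(size=5), rng.normal(size=5)
alpha = -2.5

# Homogeneity: ||alpha v|| = |alpha| ||v||
assert np.isclose(np.linalg.norm(alpha * v), abs(alpha) * np.linalg.norm(v))

# Triangle inequality: ||u + v|| <= ||u|| + ||v||
assert np.linalg.norm(u + v) <= np.linalg.norm(u) + np.linalg.norm(v)

# Cauchy-Schwarz: |u.v| <= ||u|| ||v||
assert abs(u @ v) <= np.linalg.norm(u) * np.linalg.norm(v)
```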

Distance in R^n

Definition: The distance between u and v, two vectors in R^n, is the length of the vector u - v. It is written dist(u, v) or d(u, v):

    d(u, v) = ||u - v||

Find the distance between u = (1, 1) and v = (4, 3).
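The distance exercise as a NumPy sketch (NumPy assumed):

```python
import numpy as np

u = np.array([1.0, 1.0])
v = np.array([4.0, 3.0])

# d(u, v) = ||u - v|| = ||(-3, -2)|| = sqrt(13)
print(np.linalg.norm(u - v))   # 3.6055...
```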

Orthogonality

1. Two vectors u and v are orthogonal if (u, v) = 0.
2. A system of vectors {v_1, ..., v_n} is orthogonal if (v_i, v_j) = 0 for i != j; and orthonormal if (v_i, v_j) = δ_ij.

Pythagoras' theorem: two vectors u and v are orthogonal if and only if

    ||u + v||^2 = ||u||^2 + ||v||^2

[Figure: right triangle with legs u and v and hypotenuse u + v.]
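A quick numerical illustration of the Pythagorean characterization (NumPy assumed; the vectors are an arbitrary orthogonal pair chosen for the example):

```python
import numpy as np

u = np.array([1.0, 2.0, 2.0, 0.0])
v = np.array([2.0, -1.0, 0.0, 5.0])

print(u @ v)   # 0.0, so u and v are orthogonal

lhs = np.linalg.norm(u + v) ** 2
rhs = np.linalg.norm(u) ** 2 + np.linalg.norm(v) ** 2
print(np.isclose(lhs, rhs))   # True: ||u+v||^2 = ||u||^2 + ||v||^2
```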

Least-Squares systems: Background

Recall orthogonality: x ⊥ y if x.y = 0. Equivalently, x ⊥ y if y^T x = 0 or x^T y = 0. The zero vector is trivially orthogonal to any vector.

A vector x is orthogonal to a subspace S if x ⊥ y for all y in S. If the columns of A = [a_1, a_2, ..., a_n] form a basis of S, then

    x ⊥ S  <=>  A^T x = 0  <=>  x^T A = 0

The space of all vectors orthogonal to S is a subspace, written S^⊥ (the orthogonal complement of S). Two subspaces S_1, S_2 are orthogonal to each other when x ⊥ y for all x in S_1 and all y in S_2.

Show that Nul(A) ⊥ Col(A^T) and Nul(A^T) ⊥ Col(A).

Indeed, Ax = 0 means (A^T)^T x = 0. So if x is in Nul(A), it is orthogonal to the columns of A^T, i.e., to the range of A^T. For the second result, replace A by A^T.

Find the subspace of all vectors that are orthogonal to span{v_1, v_2}, where

    [v_1, v_2] = [ 1  1 ]
                 [ 1  0 ]
                 [ 1  1 ]
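One way to carry out this exercise numerically: the sought subspace is Nul(V^T) for V = [v_1, v_2]. A sketch assuming NumPy and SciPy (the entries of V follow the 3 x 2 matrix above):

```python
import numpy as np
from scipy.linalg import null_space

V = np.array([[1.0, 1.0],
              [1.0, 0.0],
              [1.0, 1.0]])   # columns are v1 = (1,1,1) and v2 = (1,0,1)

# The orthogonal complement of span{v1, v2} is Nul(V^T).
basis = null_space(V.T)      # orthonormal basis; a single column here

print(basis.ravel())         # proportional to (1, 0, -1)
print(V.T @ basis)           # ~0: orthogonal to both v1 and v2
```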

Least-Squares systems

Problem: given an m x n matrix A and a right-hand side b in R^m, find x in R^n which minimizes

    ||b - Ax||

Assumption: m > n and rank(A) = n (A is of full rank).

Find equivalent conditions to this assumption.

Theorem: If A has full rank then A^T A is invertible.

Proof: We need to prove that A^T A x = 0 implies x = 0. Assume A^T A x = 0. Then x^T A^T A x = 0, i.e., (Ax)^T (Ax) = 0, or ||Ax||^2 = 0. This means Ax = 0. But since the columns of A are independent, x must be zero. QED.
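A small sketch of the theorem on a random full-rank matrix (NumPy assumed; the matrix is arbitrary):

```python
import numpy as np

rng = np.random.default_rng(1)
A = rng.normal(size=(6, 3))         # random 6x3 matrix: full column rank with probability 1

gram = A.T @ A                      # the 3x3 Gram matrix A^T A
print(np.linalg.matrix_rank(A))     # 3: the columns of A are independent
print(np.linalg.matrix_rank(gram))  # 3: A^T A is invertible, as the theorem asserts
```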

Theorem: Let A be an m x n matrix of rank n. Then x* is the solution of the least-squares problem min ||b - Ax|| if and only if

    b - Ax* ⊥ Col(A)

if and only if

    A^T (b - Ax*) = 0

if and only if

    A^T A x* = A^T b

Proof: see text.

Illustration of the theorem: x* is the best approximation to the vector b from the subspace span{A} if and only if b - Ax* is orthogonal to the whole subspace span{A}. This in turn is equivalent to

    A^T (b - Ax*) = 0  <=>  A^T A x* = A^T b

Note: span{A} = Col(A) = column space of A.

[Figure: b, its projection Ax* onto the subspace span{A}, and the residual b - Ax* perpendicular to span{A}.]

Normal equations

The system A^T A x = A^T b is called the system of normal equations for the matrix A and right-hand side b. Its solution is the solution of the least-squares problem min ||b - Ax||.

Find the least-squares solution by solving the normal equations when:

    A = [ 1  1  0 ]        [ 2 ]
        [ 2  1  1 ]    b = [ 0 ]
        [ 1  1  2 ]        [ 4 ]
        [ 2  1  1 ]        [ 0 ]
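A sketch of this exercise via the normal equations (NumPy assumed; the entries of A and b are as read from the slide above):

```python
import numpy as np

A = np.array([[1.0, 1.0, 0.0],
              [2.0, 1.0, 1.0],
              [1.0, 1.0, 2.0],
              [2.0, 1.0, 1.0]])
b = np.array([2.0, 0.0, 4.0, 0.0])

# Solve A^T A x = A^T b (A has full column rank, so A^T A is invertible).
x = np.linalg.solve(A.T @ A, A.T @ b)
print(x)                    # [-3.  5.  1.]

# The residual b - Ax is orthogonal to Col(A):
print(A.T @ (b - A @ x))    # ~[0. 0. 0.]
```

In floating-point practice one would typically prefer np.linalg.lstsq(A, b, rcond=None), which uses an orthogonal factorization and avoids explicitly forming A^T A (whose condition number is the square of A's).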

Application: Linear data fitting

Experimental data (not accurate) provides measurements y_1, ..., y_m of an unknown linear function φ at points t_1, ..., t_m.

Problem: find the best possible approximation to φ. Must find φ(t) = β_0 + β_1 t such that φ(t_j) ≈ y_j, j = 1, ..., m.

Question: close in what sense? In the least-squares approximation sense: find φ such that

    |φ(t_1) - y_1|^2 + |φ(t_2) - y_2|^2 + ... + |φ(t_m) - y_m|^2 = Min

We want to find the best fit in the least-squares sense for the equations

    β_0 + β_1 t_1 = y_1
    β_0 + β_1 t_2 = y_2
    ...
    β_0 + β_1 t_m = y_m

[Figure: data points (t_i, y_i) in the (t, y) plane and the fitted line y = β_0 + β_1 t.]

Using matrix notation, this means: find the best approximation to the vector y from linear combinations of the vectors f_1, f_2, where

    y = (y_1, y_2, ..., y_m)^T,   f_1 = (1, 1, ..., 1)^T,   f_2 = (t_1, t_2, ..., t_m)^T

Define F = [f_1, f_2] and x = (β_0, β_1)^T. We want to find x such that ||Fx - y|| is minimum.

This is a least-squares linear system; F is m x 2. The vector x minimizes ||y - Fx|| if and only if it is the solution of the normal equations:

    F^T F x = F^T y
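A sketch of the whole fitting procedure on synthetic data (the data, the true line, and NumPy are all assumptions made for illustration):

```python
import numpy as np

# Synthetic noisy measurements of a linear function phi(t) = 1 + 2t
rng = np.random.default_rng(2)
t = np.linspace(0.0, 1.0, 20)
y = 1.0 + 2.0 * t + 0.05 * rng.normal(size=t.size)

# Build F = [f1, f2] with f1 = all ones and f2 = the sample points
F = np.column_stack([np.ones_like(t), t])

# Solve the normal equations F^T F x = F^T y for x = (beta0, beta1)
beta = np.linalg.solve(F.T @ F, F.T @ y)
print(beta)   # approximately [1.0, 2.0]
```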