MATH 304 Linear Algebra Lecture 18: Orthogonal projection (continued). Least squares problems. Normed vector spaces.


Orthogonality

Definition 1. Vectors x, y ∈ R^n are said to be orthogonal (denoted x ⊥ y) if x·y = 0.

Definition 2. A vector x ∈ R^n is said to be orthogonal to a nonempty set Y ⊆ R^n (denoted x ⊥ Y) if x·y = 0 for any y ∈ Y.

Definition 3. Nonempty sets X, Y ⊆ R^n are said to be orthogonal (denoted X ⊥ Y) if x·y = 0 for any x ∈ X and y ∈ Y.

Orthogonal complement

Definition. Let S ⊆ R^n. The orthogonal complement of S, denoted S⊥, is the set of all vectors x ∈ R^n that are orthogonal to S.

Theorem 1 (i) S⊥ is a subspace of R^n. (ii) (S⊥)⊥ = Span(S).

Theorem 2 If V is a subspace of R^n, then (i) (V⊥)⊥ = V, (ii) V ∩ V⊥ = {0}, (iii) dim V + dim V⊥ = n.

Theorem 3 If V is the row space of a matrix, then V⊥ is the nullspace of the same matrix.
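Theorem 3 can be checked numerically. The following sketch (not part of the lecture; it assumes NumPy) recovers a basis of the nullspace of a matrix A from the SVD, and verifies that it is orthogonal to the rows of A, i.e. to V = row space of A:

```python
import numpy as np

# Rows of A span a subspace V of R^3; by Theorem 3, V-perp is the
# nullspace of A.  A basis of the nullspace consists of the right
# singular vectors whose singular values are (numerically) zero.
A = np.array([[1.0, 1.0, 0.0],
              [0.0, 1.0, 1.0]])   # V = Span((1,1,0), (0,1,1))

_, s, Vt = np.linalg.svd(A)
rank = int(np.sum(s > 1e-10))
null_basis = Vt[rank:]            # basis of V-perp; here a single vector

# Every basis vector of V-perp is orthogonal to every row of A.
print(A @ null_basis.T)           # numerically the zero matrix
```

Here dim V = 2 and dim V⊥ = 1, so the nullspace basis is a single unit vector proportional to (1, −1, 1), consistent with Theorem 2(iii).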


Orthogonal projection

Theorem 1 Let V be a subspace of R^n. Then any vector x ∈ R^n is uniquely represented as x = p + o, where p ∈ V and o ∈ V⊥.

In the above expansion, p is called the orthogonal projection of the vector x onto the subspace V.

Theorem 2 ‖x − v‖ > ‖x − p‖ for any v ≠ p in V. Thus ‖o‖ = ‖x − p‖ = min_{v∈V} ‖x − v‖ is the distance from the vector x to the subspace V.


Problem. Let Π be the plane spanned by vectors v1 = (1, 1, 0) and v2 = (0, 1, 1).
(i) Find the orthogonal projection of the vector x = (4, 0, −1) onto the plane Π.
(ii) Find the distance from x to Π.

We have x = p + o, where p ∈ Π and o ⊥ Π. Then the orthogonal projection of x onto Π is p and the distance from x to Π is ‖o‖.

We have p = αv1 + βv2 for some α, β ∈ R. Then o = x − p = x − αv1 − βv2.

o·v1 = 0, o·v2 = 0  ⟺  α(v1·v1) + β(v2·v1) = x·v1, α(v1·v2) + β(v2·v2) = x·v2

x = (4, 0, −1), v1 = (1, 1, 0), v2 = (0, 1, 1)

α(v1·v1) + β(v2·v1) = x·v1
α(v1·v2) + β(v2·v2) = x·v2
⟺  2α + β = 4, α + 2β = −1  ⟺  α = 3, β = −2

p = 3v1 − 2v2 = (3, 1, −2)
o = x − p = (1, −1, 1)
‖o‖ = √3
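The same 2×2 system in α, β can be set up and solved numerically. A minimal NumPy sketch (not from the lecture; the minus sign in x is taken from the computed answer, x = (4, 0, −1)):

```python
import numpy as np

# Orthogonal projection of x onto the plane spanned by v1, v2:
# o = x - alpha*v1 - beta*v2 must be orthogonal to both v1 and v2,
# which gives a 2x2 system with the Gram matrix of v1, v2.
v1 = np.array([1.0, 1.0, 0.0])
v2 = np.array([0.0, 1.0, 1.0])
x  = np.array([4.0, 0.0, -1.0])

G = np.array([[v1 @ v1, v2 @ v1],
              [v1 @ v2, v2 @ v2]])      # Gram matrix [[2, 1], [1, 2]]
rhs = np.array([x @ v1, x @ v2])        # (4, -1)
alpha, beta = np.linalg.solve(G, rhs)   # alpha = 3, beta = -2

p = alpha * v1 + beta * v2              # projection (3, 1, -2)
o = x - p                               # (1, -1, 1); distance = sqrt(3)
```

This reproduces p = (3, 1, −2) and the distance ‖o‖ = √3 found above.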

Overdetermined system of linear equations:

x + 2y = 3
3x + 2y = 5
x + y = 2.09

No solution: the system is inconsistent. Eliminating x (subtract 3 times the first equation from the second, and the first from the third) gives −4y = −4 and −y = −0.91, i.e. y = 1 and y = 0.91: a contradiction.

Assume that a solution (x0, y0) does exist but the system is not quite accurate, namely, there may be some errors in the right-hand sides.

Problem. Find a good approximation of (x0, y0).

One approach is the least squares fit. Namely, we look for a pair (x, y) that minimizes the sum
(x + 2y − 3)² + (3x + 2y − 5)² + (x + y − 2.09)².

Least squares solution

System of linear equations:

a11 x1 + a12 x2 + ... + a1n xn = b1
a21 x1 + a22 x2 + ... + a2n xn = b2
...
am1 x1 + am2 x2 + ... + amn xn = bm

⟺  Ax = b

For any x ∈ R^n define a residual r(x) = b − Ax. The least squares solution x̂ to the system is the one that minimizes ‖r(x)‖ (or, equivalently, ‖r(x)‖²):

‖r(x)‖² = Σ_{i=1}^{m} (a_{i1} x1 + a_{i2} x2 + ... + a_{in} xn − b_i)²

Let A be an m×n matrix and let b ∈ R^m.

Theorem A vector x̂ is a least squares solution of the system Ax = b if and only if it is a solution of the associated normal system AᵀAx = Aᵀb.

Proof: Ax is an arbitrary vector in R(A), the column space of A. Hence the length of r(x) = b − Ax is minimal if Ax is the orthogonal projection of b onto R(A), that is, if r(x) is orthogonal to R(A). We know that R(A)⊥ = N(Aᵀ), the nullspace of the transpose matrix. Thus x̂ is a least squares solution if and only if

Aᵀ r(x̂) = 0  ⟺  Aᵀ(b − Ax̂) = 0  ⟺  AᵀAx̂ = Aᵀb.

Problem. Find the least squares solution to

x + 2y = 3
3x + 2y = 5
x + y = 2.09

In matrix form Ax = b with A = [1 2; 3 2; 1 1] and b = (3, 5, 2.09). The normal system AᵀAx = Aᵀb is

AᵀA = [11 9; 9 9],  Aᵀb = (20.09, 18.09),

i.e. 11x + 9y = 20.09, 9x + 9y = 18.09  ⟹  x = 1, y = 1.01.
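The normal-system computation above can be reproduced in a few lines. A quick NumPy check (not part of the lecture), also comparing against NumPy's built-in least squares solver:

```python
import numpy as np

# Least squares solution of the inconsistent 3x2 system
# via the normal equations A^T A x = A^T b.
A = np.array([[1.0, 2.0],
              [3.0, 2.0],
              [1.0, 1.0]])
b = np.array([3.0, 5.0, 2.09])

xhat = np.linalg.solve(A.T @ A, A.T @ b)    # (1, 1.01)

# np.linalg.lstsq minimizes ||b - Ax|| directly and must agree.
xhat2, *_ = np.linalg.lstsq(A, b, rcond=None)
```

Both routes give (x, y) = (1, 1.01); lstsq is preferable numerically for ill-conditioned A, since it avoids forming AᵀA.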

Problem. Find the constant function that is the least squares fit to the following data:

x:    0 1 2 3
f(x): 1 0 1 2

f(x) = c gives the system c = 1, c = 0, c = 1, c = 2, i.e. [1; 1; 1; 1](c) = [1; 0; 1; 2]. The normal system is

(1, 1, 1, 1)[1; 1; 1; 1](c) = (1, 1, 1, 1)[1; 0; 1; 2]  ⟺  4c = 1 + 0 + 1 + 2,

so c = (1 + 0 + 1 + 2)/4 = 1 (the arithmetic mean of the data values).
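As a small numerical illustration (not from the lecture): setting up the same normal equation with a one-column design matrix recovers the arithmetic mean.

```python
import numpy as np

# Least-squares constant fit f(x) = c to the data values:
# the normal equation 4c = 1 + 0 + 1 + 2 gives the arithmetic mean.
fx = np.array([1.0, 0.0, 1.0, 2.0])
A = np.ones((4, 1))                          # design matrix for f(x) = c

c = np.linalg.solve(A.T @ A, A.T @ fx)[0]    # c = 1.0, equal to fx.mean()
```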

Problem. Find the linear polynomial that is the least squares fit to the following data:

x:    0 1 2 3
f(x): 1 0 1 2

f(x) = c1 + c2 x gives the system

c1 = 1
c1 + c2 = 0
c1 + 2c2 = 1
c1 + 3c2 = 2

In matrix form, A(c1; c2) = b with A = [1 0; 1 1; 1 2; 1 3] and b = (1, 0, 1, 2). The normal system AᵀA(c1; c2) = Aᵀb is

[4 6; 6 14](c1; c2) = (4, 8)  ⟺  c1 = 0.4, c2 = 0.4.
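The line fit can be verified the same way. A NumPy sketch (not part of the lecture), building the design matrix with columns 1 and x:

```python
import numpy as np

# Least-squares line f(x) = c1 + c2*x through the four data points,
# via the normal equations with design matrix [1, x].
xs = np.array([0.0, 1.0, 2.0, 3.0])
fx = np.array([1.0, 0.0, 1.0, 2.0])

A = np.column_stack([np.ones_like(xs), xs])   # [[1,0],[1,1],[1,2],[1,3]]
c1, c2 = np.linalg.solve(A.T @ A, A.T @ fx)   # c1 = 0.4, c2 = 0.4
```

So the best-fitting line is f(x) = 0.4 + 0.4x, matching the hand computation.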

Norm

The notion of norm generalizes the notion of length of a vector in R^n.

Definition. Let V be a vector space. A function α : V → R is called a norm on V if it has the following properties:
(i) α(x) ≥ 0, and α(x) = 0 only for x = 0 (positivity);
(ii) α(rx) = |r| α(x) for all r ∈ R (homogeneity);
(iii) α(x + y) ≤ α(x) + α(y) (triangle inequality).

Notation. The norm of a vector x ∈ V is usually denoted ‖x‖. Different norms on V are distinguished by subscripts, e.g., ‖x‖₁ and ‖x‖₂.

Examples. V = R^n, x = (x1, x2, ..., xn) ∈ R^n.

‖x‖∞ = max(|x1|, |x2|, ..., |xn|).
Positivity and homogeneity are obvious. The triangle inequality: |xi + yi| ≤ |xi| + |yi| ≤ max_j |xj| + max_j |yj|, hence max_j |xj + yj| ≤ max_j |xj| + max_j |yj|.

‖x‖₁ = |x1| + |x2| + ... + |xn|.
Positivity and homogeneity are obvious. The triangle inequality: |xi + yi| ≤ |xi| + |yi|, hence Σ_j |xj + yj| ≤ Σ_j |xj| + Σ_j |yj|.

Examples. V = R^n, x = (x1, x2, ..., xn) ∈ R^n.

‖x‖_p = (|x1|^p + |x2|^p + ... + |xn|^p)^{1/p}, p > 0.

Theorem ‖x‖_p is a norm on R^n for any p ≥ 1.

Remark. ‖x‖₂ is the Euclidean length of x.

Definition. A normed vector space is a vector space endowed with a norm. The norm defines a distance function on the normed vector space: dist(x, y) = ‖x − y‖. Then we say that a sequence x1, x2, ... converges to a vector x if dist(x, xn) → 0 as n → ∞.
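The p-norms are easy to compute directly from the definition. A small illustration (not from the lecture; it assumes NumPy) for x = (3, −4), where the 2-norm is the familiar Euclidean length 5:

```python
import numpy as np

# The p-norm on R^n straight from the definition; p = 2 recovers the
# Euclidean length, and the max-norm is the p -> infinity limit.
def p_norm(x, p):
    return np.sum(np.abs(x) ** p) ** (1.0 / p)

x = np.array([3.0, -4.0])
n1   = p_norm(x, 1)          # 7.0 = |3| + |-4|
n2   = p_norm(x, 2)          # 5.0, the Euclidean length
ninf = np.max(np.abs(x))     # 4.0, the max-norm
```

These agree with NumPy's built-in np.linalg.norm(x, p) for the same p.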

[Figure: unit circles ‖x‖ = 1 in R² for four norms:
‖x‖ = (x1² + x2²)^{1/2} (black), ‖x‖ = (½ x1² + x2²)^{1/2} (green),
‖x‖ = |x1| + |x2| (blue), ‖x‖ = max(|x1|, |x2|) (red).]

Examples. V = C[a, b], f : [a, b] → R.

‖f‖∞ = max_{a≤x≤b} |f(x)|.

‖f‖₁ = ∫_a^b |f(x)| dx.

‖f‖_p = (∫_a^b |f(x)|^p dx)^{1/p}, p > 0.

Theorem ‖f‖_p is a norm on C[a, b] for any p ≥ 1.
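These function-space norms can be approximated on a grid. A numerical sketch (not from the lecture; the function f(x) = x on [0, 1] is my example), where ‖f‖∞ = 1 and ‖f‖₁ = ∫₀¹ x dx = 1/2:

```python
import numpy as np

# Discretized norms on C[0, 1] for f(x) = x, sampled on a fine grid.
xs = np.linspace(0.0, 1.0, 10001)
f  = xs                                    # f(x) = x

# max-norm: largest sampled value of |f|.
sup_norm = np.max(np.abs(f))               # 1.0

# 1-norm: trapezoidal approximation of the integral of |f|.
dx = xs[1] - xs[0]
one_norm = np.sum((np.abs(f[:-1]) + np.abs(f[1:])) / 2.0) * dx   # ~ 0.5
```

The trapezoid rule is exact for this linear f, so one_norm matches 1/2 to rounding error.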