MATH 304 Linear Algebra Lecture 19: Least squares problems (continued). Norms and inner products.


Orthogonal projection

Theorem 1. Let $V$ be a subspace of $\mathbb{R}^n$. Then any vector $x \in \mathbb{R}^n$ is uniquely represented as $x = p + o$, where $p \in V$ and $o \in V^\perp$. In the above expansion, $p$ is called the orthogonal projection of the vector $x$ onto the subspace $V$.

Theorem 2. $\|x - v\| > \|x - p\|$ for any $v \ne p$ in $V$. Thus $\|o\| = \|x - p\| = \min_{v \in V} \|x - v\|$ is the distance from the vector $x$ to the subspace $V$.
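The decomposition in Theorem 1 can be computed directly. The sketch below (a numerical illustration, not from the lecture; the matrix and vector are made-up data) projects a vector onto the column space of a matrix and checks that the remainder is orthogonal to it:

```python
import numpy as np

# A small numerical illustration of Theorem 1 (A and x are made-up
# data).  V is the column space of A; p is the orthogonal projection
# of x onto V, and o = x - p lies in the orthogonal complement.
A = np.array([[1.0, 0.0],
              [0.0, 1.0],
              [1.0, 1.0]])        # columns span a plane V in R^3
x = np.array([1.0, 2.0, 0.0])

# Solve the normal system A^T A c = A^T x, then p = A c
c = np.linalg.solve(A.T @ A, A.T @ x)
p = A @ c                         # projection of x onto V
o = x - p                         # component orthogonal to V

print(p, o)                       # p = (0, 1, 1), o = (1, 1, -1)
print(A.T @ o)                    # (0, 0): o is orthogonal to V
```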

[Figure: the vector $x$ decomposed as $x = p + o$, with $p \in V$ and $o \in V^\perp$.]

Least squares solution

System of linear equations:
$$a_{11}x_1 + a_{12}x_2 + \cdots + a_{1n}x_n = b_1$$
$$a_{21}x_1 + a_{22}x_2 + \cdots + a_{2n}x_n = b_2$$
$$\cdots$$
$$a_{m1}x_1 + a_{m2}x_2 + \cdots + a_{mn}x_n = b_m$$
In matrix form: $Ax = b$.

For any $x \in \mathbb{R}^n$ define the residual $r(x) = b - Ax$. The least squares solution $x$ of the system is the one that minimizes $\|r(x)\|$ (or, equivalently, $\|r(x)\|^2$):
$$\|r(x)\|^2 = \sum_{i=1}^m \left(a_{i1}x_1 + a_{i2}x_2 + \cdots + a_{in}x_n - b_i\right)^2.$$

Let $A$ be an $m \times n$ matrix and let $b \in \mathbb{R}^m$.

Theorem. A vector $\hat{x}$ is a least squares solution of the system $Ax = b$ if and only if it is a solution of the associated normal system $A^T A x = A^T b$.

Proof: $Ax$ is an arbitrary vector in $R(A)$, the column space of $A$. Hence the length of $r(x) = b - Ax$ is minimal if $Ax$ is the orthogonal projection of $b$ onto $R(A)$, that is, if $r(x)$ is orthogonal to $R(A)$. We know that $\{\text{row space}\}^\perp = \{\text{nullspace}\}$ for any matrix. In particular, $R(A)^\perp = N(A^T)$, the nullspace of the transpose of $A$. Thus $\hat{x}$ is a least squares solution if and only if
$$A^T r(\hat{x}) = 0 \iff A^T(b - A\hat{x}) = 0 \iff A^T A \hat{x} = A^T b.$$

Corollary. The normal system $A^T A x = A^T b$ is always consistent.
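As a quick numerical check of the theorem, the sketch below (with a small made-up inconsistent system) solves the normal system directly and compares the result with a library least squares routine:

```python
import numpy as np

# Sketch: the least squares solution of Ax = b solves A^T A x = A^T b.
# A and b are made-up data for an inconsistent 4x2 system.
A = np.array([[1.0, 0.0],
              [1.0, 1.0],
              [1.0, 2.0],
              [1.0, 3.0]])
b = np.array([1.0, 0.0, 1.0, 2.0])

x_normal = np.linalg.solve(A.T @ A, A.T @ b)     # via the normal system
x_lstsq = np.linalg.lstsq(A, b, rcond=None)[0]   # via numpy's least squares

print(x_normal, x_lstsq)   # both minimize ||b - Ax||
```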

Problem. Find the constant function that is the least squares fit to the following data:

x:    0  1  2  3
f(x): 1  0  1  2

Seeking $f(x) = c$, the data give the overdetermined system
$$c = 1, \quad c = 0, \quad c = 1, \quad c = 2 \iff \begin{pmatrix}1\\1\\1\\1\end{pmatrix}(c) = \begin{pmatrix}1\\0\\1\\2\end{pmatrix}.$$
The normal system is
$$(1,1,1,1)\begin{pmatrix}1\\1\\1\\1\end{pmatrix}(c) = (1,1,1,1)\begin{pmatrix}1\\0\\1\\2\end{pmatrix} \iff 4c = 4,$$
so $c = \frac{1}{4}(1+0+1+2) = 1$, the arithmetic mean of the data values.

Problem. Find the linear polynomial that is the least squares fit to the following data:

x:    0  1  2  3
f(x): 1  0  1  2

Seeking $f(x) = c_1 + c_2 x$, the data give the system $c_1 = 1$, $c_1 + c_2 = 0$, $c_1 + 2c_2 = 1$, $c_1 + 3c_2 = 2$, i.e.
$$\begin{pmatrix}1&0\\1&1\\1&2\\1&3\end{pmatrix}\begin{pmatrix}c_1\\c_2\end{pmatrix} = \begin{pmatrix}1\\0\\1\\2\end{pmatrix}.$$
The normal system:
$$\begin{pmatrix}1&1&1&1\\0&1&2&3\end{pmatrix}\begin{pmatrix}1&0\\1&1\\1&2\\1&3\end{pmatrix}\begin{pmatrix}c_1\\c_2\end{pmatrix} = \begin{pmatrix}1&1&1&1\\0&1&2&3\end{pmatrix}\begin{pmatrix}1\\0\\1\\2\end{pmatrix}$$
$$\iff \begin{pmatrix}4&6\\6&14\end{pmatrix}\begin{pmatrix}c_1\\c_2\end{pmatrix} = \begin{pmatrix}4\\8\end{pmatrix} \iff \begin{cases}c_1 = 0.4\\ c_2 = 0.4\end{cases}$$

Problem. Find the quadratic polynomial that is the least squares fit to the following data:

x:    0  1  2  3
f(x): 1  0  1  2

Seeking $f(x) = c_1 + c_2 x + c_3 x^2$, the data give the system $c_1 = 1$, $c_1 + c_2 + c_3 = 0$, $c_1 + 2c_2 + 4c_3 = 1$, $c_1 + 3c_2 + 9c_3 = 2$, i.e.
$$\begin{pmatrix}1&0&0\\1&1&1\\1&2&4\\1&3&9\end{pmatrix}\begin{pmatrix}c_1\\c_2\\c_3\end{pmatrix} = \begin{pmatrix}1\\0\\1\\2\end{pmatrix}.$$
The normal system:
$$\begin{pmatrix}1&1&1&1\\0&1&2&3\\0&1&4&9\end{pmatrix}\begin{pmatrix}1&0&0\\1&1&1\\1&2&4\\1&3&9\end{pmatrix}\begin{pmatrix}c_1\\c_2\\c_3\end{pmatrix} = \begin{pmatrix}1&1&1&1\\0&1&2&3\\0&1&4&9\end{pmatrix}\begin{pmatrix}1\\0\\1\\2\end{pmatrix}$$
$$\iff \begin{pmatrix}4&6&14\\6&14&36\\14&36&98\end{pmatrix}\begin{pmatrix}c_1\\c_2\\c_3\end{pmatrix} = \begin{pmatrix}4\\8\\22\end{pmatrix} \iff \begin{cases}c_1 = 0.9\\ c_2 = -1.1\\ c_3 = 0.5\end{cases}$$
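The fit can be checked numerically. The sketch below (using numpy, not part of the lecture) builds the matrix whose columns are $1$, $x$, $x^2$ at the data points and solves the normal system:

```python
import numpy as np

# Verify the quadratic least squares fit: columns of A are 1, x, x^2
# evaluated at the data points x = 0, 1, 2, 3.
x = np.array([0.0, 1.0, 2.0, 3.0])
y = np.array([1.0, 0.0, 1.0, 2.0])
A = np.column_stack([np.ones_like(x), x, x**2])

c = np.linalg.solve(A.T @ A, A.T @ y)
print(c)   # approximately [0.9, -1.1, 0.5]
```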

Norm

The notion of norm generalizes the notion of length of a vector in $\mathbb{R}^n$.

Definition. Let $V$ be a vector space. A function $\alpha: V \to \mathbb{R}$ is called a norm on $V$ if it has the following properties:
(i) $\alpha(x) \ge 0$, and $\alpha(x) = 0$ only for $x = 0$ (positivity);
(ii) $\alpha(rx) = |r|\,\alpha(x)$ for all $r \in \mathbb{R}$ (homogeneity);
(iii) $\alpha(x + y) \le \alpha(x) + \alpha(y)$ (triangle inequality).

Notation. The norm of a vector $x \in V$ is usually denoted $\|x\|$. Different norms on $V$ are distinguished by subscripts, e.g., $\|x\|_1$ and $\|x\|_2$.

Examples. V = R n, x = (x 1, x 2,...,x n ) R n. x = max( x 1, x 2,..., x n ). Positivity and homogeneity are obvious. The triangle inequality: x i + y i x i + y i max j x j + max j y j = max j x j + y j max j x j + max j y j x 1 = x 1 + x 2 + + x n. Positivity and homogeneity are obvious. The triangle inequality: x i + y i x i + y i = j x j + y j j x j + j y j

Examples. V = R n, x = (x 1, x 2,...,x n ) R n. x p = ( x 1 p + x 2 p + + x n p) 1/p, p > 0. Remark. x 2 = Euclidean length of x. Theorem x p is a norm on R n for any p 1. Positivity and homogeneity are still obvious (and hold for any p > 0). The triangle inequality for p 1 is known as the Minkowski inequality: ( x1 + y 1 p + x 2 + y 2 p + + x n + y n p) 1/p ( x 1 p + + x n p) 1/p + ( y1 p + + y n p) 1/p.

Normed vector space

Definition. A normed vector space is a vector space endowed with a norm. The norm defines a distance function on the normed vector space: $\mathrm{dist}(x, y) = \|x - y\|$. Then we say that a sequence $x_1, x_2, \dots$ converges to a vector $x$ if $\mathrm{dist}(x, x_n) \to 0$ as $n \to \infty$. Also, we say that a vector $x$ is a good approximation of a vector $x_0$ if $\mathrm{dist}(x, x_0)$ is small.
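A minimal sketch of this notion of convergence (the sequence below is made up): $x_n = (1 + 1/n,\, 2)$ converges to $x = (1, 2)$, and the distance happens to be $1/n$ in the 1-norm, the 2-norm, and the max-norm alike.

```python
import numpy as np

# Sketch: convergence in a normed space means dist(x, x_n) -> 0.
# For x_n = (1 + 1/n, 2) and x = (1, 2), the difference is (1/n, 0),
# whose 1-, 2- and max-norms all equal 1/n.
x = np.array([1.0, 2.0])
for n in [1, 10, 100, 1000]:
    xn = np.array([1.0 + 1.0 / n, 2.0])
    print(n, np.linalg.norm(x - xn, 1),
          np.linalg.norm(x - xn, 2),
          np.linalg.norm(x - xn, np.inf))
```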

[Figure: unit circles $\|x\| = 1$ in $\mathbb{R}^2$ for four norms: $\|x\| = (x_1^2 + x_2^2)^{1/2}$ (black), $\|x\| = (\frac{1}{2}x_1^2 + x_2^2)^{1/2}$ (green), $\|x\| = |x_1| + |x_2|$ (blue), $\|x\| = \max(|x_1|, |x_2|)$ (red).]

Examples. $V = C[a, b]$, $f: [a, b] \to \mathbb{R}$.

$\|f\|_\infty = \max_{a \le x \le b} |f(x)|$.

$\|f\|_1 = \int_a^b |f(x)|\,dx$.

$\|f\|_p = \left(\int_a^b |f(x)|^p\,dx\right)^{1/p}$, $p > 0$.

Theorem. $\|f\|_p$ is a norm on $C[a, b]$ for any $p \ge 1$.
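These integral norms can be approximated numerically. The sketch below (a midpoint Riemann sum, not part of the lecture) estimates the norms of $f(x) = x$ on $[0, 1]$, whose exact values are $\|f\|_\infty = 1$, $\|f\|_1 = 1/2$, and $\|f\|_2 = 1/\sqrt{3}$:

```python
import numpy as np

# Midpoint-rule approximation of ||f||_inf, ||f||_1, ||f||_2
# for f(x) = x on [0, 1].
n = 100000
xs = (np.arange(n) + 0.5) / n       # midpoints of a uniform grid
f = xs                              # f(x) = x

norm_inf = np.max(np.abs(f))                    # ~ 1
norm_1 = np.sum(np.abs(f)) / n                  # ~ 1/2
norm_2 = (np.sum(np.abs(f) ** 2) / n) ** 0.5    # ~ 1/sqrt(3)

print(norm_inf, norm_1, norm_2)
```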

Inner product

The notion of inner product generalizes the notion of dot product of vectors in $\mathbb{R}^n$.

Definition. Let $V$ be a vector space. A function $\beta: V \times V \to \mathbb{R}$, usually denoted $\beta(x, y) = \langle x, y \rangle$, is called an inner product on $V$ if it is positive, symmetric, and bilinear. That is, if
(i) $\langle x, x \rangle \ge 0$, and $\langle x, x \rangle = 0$ only for $x = 0$ (positivity);
(ii) $\langle x, y \rangle = \langle y, x \rangle$ (symmetry);
(iii) $\langle rx, y \rangle = r \langle x, y \rangle$ (homogeneity);
(iv) $\langle x + y, z \rangle = \langle x, z \rangle + \langle y, z \rangle$ (distributive law).

An inner product space is a vector space endowed with an inner product.

Examples. V = R n. x,y = x y = x 1 y 1 + x 2 y 2 + + x n y n. x,y = d 1 x 1 y 1 + d 2 x 2 y 2 + + d n x n y n, where d 1, d 2,...,d n > 0. x,y = (Dx) (Dy), where D is an invertible n n matrix. Remarks. (a) Invertibility of D is necessary to show that x,x = 0 = x = 0. (b) The second example is a particular case of the third one when D = diag(d 1/2 1, d 1/2 2,...,dn 1/2 ).

Counterexamples. $V = \mathbb{R}^2$.

$\langle x, y \rangle = x_1 y_1 - x_2 y_2$. Let $v = (1, 2)$; then $\langle v, v \rangle = 1^2 - 2^2 = -3$. This $\langle x, y \rangle$ is symmetric and bilinear, but not positive.

$\langle x, y \rangle = 2x_1 y_1 + x_1 x_2 + 2x_2 y_2 + y_1 y_2$. For $v = (1, 1)$, $w = (1, 0)$ we get $\langle v, w \rangle = 3$ but $\langle 2v, w \rangle = 8 \ne 2\langle v, w \rangle$. This $\langle x, y \rangle$ is positive and symmetric, but not bilinear.

$\langle x, y \rangle = x_1 y_1 + x_1 y_2 - x_2 y_1 + x_2 y_2$. For $v = (1, 1)$, $w = (1, 0)$ we get $\langle v, w \rangle = 0$ but $\langle w, v \rangle = 2$. This $\langle x, y \rangle$ is positive and bilinear, but not symmetric.
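The three failures above can be reproduced mechanically; a sketch (not from the lecture) evaluating each form at the vectors used in the counterexamples:

```python
import numpy as np

# Sketch reproducing the three counterexamples numerically.
def b1(x, y):  # symmetric and bilinear, but not positive
    return x[0]*y[0] - x[1]*y[1]

def b2(x, y):  # positive and symmetric, but not bilinear
    return 2*x[0]*y[0] + x[0]*x[1] + 2*x[1]*y[1] + y[0]*y[1]

def b3(x, y):  # positive and bilinear, but not symmetric
    return x[0]*y[0] + x[0]*y[1] - x[1]*y[0] + x[1]*y[1]

v = np.array([1.0, 2.0])
print(b1(v, v))                  # -3.0: positivity fails

v, w = np.array([1.0, 1.0]), np.array([1.0, 0.0])
print(b2(2*v, w), 2*b2(v, w))    # 8.0 vs 6.0: homogeneity fails
print(b3(v, w), b3(w, v))        # 0.0 vs 2.0: symmetry fails
```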