Dot product and linear least squares problems


Dot Product

For vectors $u, v \in \mathbb{R}^n$ we define the dot product

$$u \cdot v = u_1 v_1 + \cdots + u_n v_n.$$

Note that we can also write this as a row vector times a column vector:

$$u \cdot v = [u_1, \ldots, u_n] \begin{bmatrix} v_1 \\ \vdots \\ v_n \end{bmatrix} = u^\top v.$$

The dot product $u \cdot u = u_1^2 + \cdots + u_n^2$ gives the square of the Euclidean length of the vector, aka the norm of the vector:

$$\|u\| = (u \cdot u)^{1/2} = \left(u_1^2 + \cdots + u_n^2\right)^{1/2}.$$

Theorem (Cauchy-Schwarz inequality). For $a, b \in \mathbb{R}^n$ we have

$$|a \cdot b| \le \|a\|\,\|b\|. \tag{1}$$

Proof. Step 1, for vectors with $\|a\| = 1$ and $\|b\| = 1$: Then

$$0 \le (b - a) \cdot (b - a) = b \cdot b - 2\,a \cdot b + a \cdot a, \qquad\text{so}\qquad 2\,a \cdot b \le a \cdot a + b \cdot b = 1 + 1.$$

Hence $a \cdot b \le 1$. Using $(b + a)$ instead of $(b - a)$ gives $a \cdot b \ge -1$, i.e., $|a \cdot b| \le 1$.

Step 2, for general vectors $a, b$: If $a = 0$ or $b = 0$ we see that (1) obviously holds. If both vectors are different from $0$ let $u := a/\|a\|$ and $v := b/\|b\|$; then $u \cdot v = \frac{a \cdot b}{\|a\|\,\|b\|}$. Since $\|u\| = 1$ and $\|v\| = 1$ we get from step 1 that $|u \cdot v| \le 1$, i.e., $|a \cdot b| \le \|a\|\,\|b\|$. ∎

Consider a triangle with the three points $0$, $a$, $b$. Then the vector from $a$ to $b$ is given by $c = b - a$, and the lengths of the three sides of the triangle are $\|a\|$, $\|b\|$, $\|c\|$. Let $\theta$ denote the angle between the vectors $a$ and $b$. Then the law of cosines tells us that

$$\|c\|^2 = \|a\|^2 + \|b\|^2 - 2\,\|a\|\,\|b\|\cos\theta.$$

Multiplying out $\|c\|^2 = (b - a) \cdot (b - a)$ gives

$$\|c\|^2 = \|a\|^2 + \|b\|^2 - 2\,a \cdot b.$$

By comparing the last two equations we obtain $a \cdot b = \|a\|\,\|b\|\cos\theta$. If $a$ and $b$ are different from $0$ we have

$$\cos\theta = \frac{a \cdot b}{\|a\|\,\|b\|}.$$

This tells us how to compute the angle $\theta$ between two vectors:

$$q := \frac{a \cdot b}{\|a\|\,\|b\|}, \qquad \theta := \cos^{-1} q.$$

Because of (1) we have $-1 \le q \le 1$, hence the inverse cosine function gives $0 \le \theta \le \pi$:

- $q = 1 \iff \theta = 0$, i.e., $a, b$ point in the same direction
- $q = 0 \iff \theta = \pi/2$, i.e., $a, b$ are orthogonal
- $q = -1 \iff \theta = \pi$, i.e., $a, b$ point in opposite directions
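Here is a minimal Matlab illustration of the dot product, the norm, and the angle formula; the vectors u, v are hypothetical sample data chosen only for this sketch:

    u = [1; 2; 2];
    v = [3; 0; 4];
    duv   = u.'*v;           % dot product u.v = u1*v1 + ... + un*vn
    nu    = sqrt(u.'*u);     % norm ||u|| = (u.u)^(1/2), here 3
    nv    = norm(v);         % same computation with the built-in norm, here 5
    q     = duv/(nu*nv);     % by Cauchy-Schwarz we always have |q| <= 1
    theta = acos(q);         % angle between u and v in radians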

Example. For two given vectors $a, b \in \mathbb{R}^2$ we compute

$$q := \frac{a \cdot b}{\|a\|\,\|b\|}$$

and then obtain the angle as $\theta := \cos^{-1} q$.

(Figure: the two vectors $a$ and $b$ plotted in the $(x_1, x_2)$-plane.)

We say vectors $a, b \in \mathbb{R}^n$ are orthogonal, written $a \perp b$, iff $a \cdot b = 0$. We say a vector $a \in \mathbb{R}^n$ is orthogonal on a subspace $V$ of $\mathbb{R}^n$, written $a \perp V$, iff $a \cdot v = 0$ for all $v \in V$.

Orthogonal projection onto a line

Consider a 1-dimensional subspace $V = \operatorname{span}\{v\}$ of $\mathbb{R}^n$ given by a vector $v \in \mathbb{R}^n$, $v \ne 0$. This is a line through the origin. For a given point $b \in \mathbb{R}^n$ we want to find the point $u \in V$ which is closest to the point $b$:

Find $u \in V$ such that $\|u - b\|$ is minimal.

The point $u$ must have the following properties:

- $u \in V$, i.e., $u = cv$ with some unknown $c \in \mathbb{R}$
- $u - b \perp V$, i.e., $v \cdot (u - b) = 0$

By plugging $u = cv$ into the second property we get by multiplying out

$$v \cdot (cv - b) = c\,(v \cdot v) - (v \cdot b) = 0 \qquad\Longrightarrow\qquad c = \frac{v \cdot b}{v \cdot v}.$$

Therefore the point $u$ on the line given by $v$ which is closest to the point $b$ is given by

$$u = \frac{v \cdot b}{v \cdot v}\, v.$$

We say that $u$ is the orthogonal projection of the point $b$ onto the line given by $v$ and use the notation

$$\operatorname{pr}_v b = \frac{v \cdot b}{v \cdot v}\, v.$$
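A short Matlab sketch of the projection formula; the data v, b are hypothetical and serve only as an illustration:

    v = [1; 2];
    b = [3; 1];
    c = (v.'*b)/(v.'*v);   % c = (v.b)/(v.v), here 5/5 = 1
    u = c*v;               % closest point on the line, here [1; 2]
    r = u - b;             % difference vector u - b
    v.'*r                  % = 0: u - b is orthogonal to the line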

Orthogonal projection onto a 2-dimensional subspace V

Consider a 2-dimensional subspace $V = \operatorname{span}\{v^{(1)}, v^{(2)}\}$ of $\mathbb{R}^n$ given by two linearly independent vectors $v^{(1)}, v^{(2)} \in \mathbb{R}^n$. This is a plane through the origin. For a given point $b \in \mathbb{R}^n$ we want to find the point $u \in V$ which is closest to the point $b$:

Find $u \in V$ such that $\|u - b\|$ is minimal.

The point $u$ must have the following properties:

- $u \in V$, i.e., $u = c_1 v^{(1)} + c_2 v^{(2)}$ with some unknowns $c_1, c_2 \in \mathbb{R}$
- $u - b \perp V$, i.e., $v^{(1)} \cdot (u - b) = 0$ and $v^{(2)} \cdot (u - b) = 0$

By plugging $u = c_1 v^{(1)} + c_2 v^{(2)}$ into the second property we obtain a linear system of two equations for the two unknowns which we can then solve. We can use the $n \times 2$ matrix $A = [v^{(1)}, v^{(2)}]$ to express the two properties:

- $u \in V$, i.e., $u = Ac$ with an unknown vector $c = \begin{bmatrix} c_1 \\ c_2 \end{bmatrix} \in \mathbb{R}^2$
- $u - b \perp V$, i.e., $A^\top (u - b) = 0$

By plugging $u = Ac$ into the second property we obtain $A^\top(Ac - b) = 0$, i.e.,

$$A^\top A\, c = A^\top b.$$

These are the so-called normal equations (since they express that $u - b$ is orthogonal or normal on the subspace $V$). This is how to find the point $u \in V$ which is closest to $b$:

- find the matrix $M := A^\top A \in \mathbb{R}^{2 \times 2}$ and the vector $g := A^\top b \in \mathbb{R}^2$
- solve the linear system $Mc = g$ for $c \in \mathbb{R}^2$
- let $u := Ac$

Example. For a plane $V = \operatorname{span}\{v^{(1)}, v^{(2)}\}$ and a given point $b$ we let $A = [v^{(1)}, v^{(2)}]$, compute $M = A^\top A$ and $g = A^\top b$, and solve the linear system $Mc = g$. The closest point is then $u = Ac$. We can check that this is correct by finding the difference vector $r = Ac - b$ and checking $v^{(1)} \cdot r = 0$ and $v^{(2)} \cdot r = 0$.
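A Matlab sketch of this recipe, with hypothetical data v1, v2, b chosen only for illustration:

    v1 = [1; 1; 0];
    v2 = [0; 1; 1];
    b  = [1; 2; 4];
    A = [v1, v2];
    M = A.'*A;               % 2 x 2 matrix of the normal equations
    g = A.'*b;
    c = M\g;                 % solve M c = g, here c = [0; 3]
    u = A*c;                 % closest point in the plane, here [0; 3; 3]
    r = u - b;               % difference vector
    [v1.'*r, v2.'*r]         % both dot products are 0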

Orthogonal projection onto a k-dimensional subspace V, aka least squares problem

Consider a $k$-dimensional subspace $V = \operatorname{span}\{v^{(1)}, \ldots, v^{(k)}\}$ of $\mathbb{R}^n$ given by $k$ linearly independent vectors $v^{(1)}, \ldots, v^{(k)} \in \mathbb{R}^n$. I.e., the vectors $v^{(1)}, \ldots, v^{(k)}$ form a basis for the subspace $V$. For a given point $b \in \mathbb{R}^n$ we want to find the point $u \in V$ which is closest to the point $b$:

Find $u \in V$ such that $\|u - b\|$ is minimal. (2)

Let us define the $n \times k$ matrix $A = [v^{(1)}, \ldots, v^{(k)}]$. The point $u$ must have the following properties:

- $u \in V$, i.e., $u = c_1 v^{(1)} + \cdots + c_k v^{(k)} = Ac$ with some unknown vector $c \in \mathbb{R}^k$
- $u - b \perp V$, i.e., $v^{(1)} \cdot (u - b) = 0, \ldots, v^{(k)} \cdot (u - b) = 0$, i.e., $A^\top (u - b) = 0$

By plugging $u = Ac$ into the second property we obtain

$$A^\top (Ac - b) = 0 \tag{3}$$

or

$$A^\top A\, c = A^\top b. \tag{4}$$

These are the so-called normal equations (since they express that $u - b$ is orthogonal or normal on the subspace $V$). This is how to find the point $u \in V$ which is closest to $b$:

- find the matrix $M := A^\top A \in \mathbb{R}^{k \times k}$ and the vector $g := A^\top b \in \mathbb{R}^k$
- solve the $k \times k$ linear system $Mc = g$ for $c \in \mathbb{R}^k$
- let $u := Ac$
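The same recipe as a small Matlab helper; the function name normal_eq is our own choice for this sketch:

    function [c, u] = normal_eq(A, b)
        % Solve the least squares problem ||A*c - b|| = min via the
        % normal equations, and return the closest point u in range(A).
        M = A.'*A;      % k x k matrix
        g = A.'*b;
        c = M\g;        % solve M c = g
        u = A*c;
    end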

Note that the normal equations (4) always have a unique solution:

Theorem. Assume the matrix $A \in \mathbb{R}^{n \times k}$ with $k \le n$ has linearly independent columns (i.e., $\operatorname{rank} A = k$). Then the matrix $M = A^\top A$ is nonsingular.

Proof. We have to show that $Mc = 0$ implies $c = 0$. Assume we have $c \in \mathbb{R}^k$ such that $Mc = 0$. Then we can multiply from the left with $c^\top$ and get, with $y := Ac$,

$$0 = c^\top M c = c^\top A^\top A c = y^\top y = \|y\|^2$$

as $y^\top = c^\top A^\top$. Since $\|y\| = 0$ we have $y = Ac = 0$. This means that we have a linear combination of the columns of $A$ which gives the zero vector. Since by assumption the columns of $A$ are linearly independent we must have $c = 0$. ∎

So solving the normal equations gives us a unique $c \in \mathbb{R}^k$. We then get by $u = Ac$ a point on the subspace $V$. We now want to formally prove that this point $u \in V$ is really the unique answer to our minimization problem (2):

Theorem. Assume the matrix $A \in \mathbb{R}^{n \times k}$ with $k \le n$ has linearly independent columns (i.e., $\operatorname{rank} A = k$). Then for any given $b \in \mathbb{R}^n$ the minimization problem

find $c \in \mathbb{R}^k$ such that $\|Ac - b\|$ is minimal (5)

has a unique solution which is obtained by solving the normal equations $A^\top A c = A^\top b$.

Proof. Let $c \in \mathbb{R}^k$ be the unique solution of the normal equations. Consider now $\tilde c = c + d$ where $d \in \mathbb{R}^k$ is nonzero. We then have

$$\|A\tilde c - b\|^2 = \|(Ac - b) + Ad\|^2 = (Ac - b) \cdot (Ac - b) + 2\,(Ad) \cdot (Ac - b) + (Ad) \cdot (Ad).$$

We have for the middle term

$$(Ad) \cdot (Ac - b) = d^\top A^\top (Ac - b) = 0$$

by the normal equations (3). Hence

$$\|A\tilde c - b\|^2 = \|Ac - b\|^2 + \|Ad\|^2.$$

Since $d \ne 0$ we have $Ad \ne 0$ since the columns of $A$ are linearly independent, and hence $\|Ad\|^2 > 0$. This means that for any vector $\tilde c$ different from $c$ we get $\|A\tilde c - b\| > \|Ac - b\|$, i.e., the vector $c$ from the normal equations is the unique solution of (5). ∎
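A quick Matlab illustration of the first theorem: if the columns of $A$ are linearly dependent then $M = A^\top A$ is singular. The matrix below is hypothetical; its third column is the sum of the first two:

    A = [1 0 1;
         1 1 2;
         1 2 3;
         1 3 4];        % column 3 = column 1 + column 2
    M = A.'*A;
    rank(A)             % = 2 < 3: the columns are linearly dependent
    rank(M)             % = 2 as well, so M is singular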

Least squares problem with orthogonal basis

For a least squares problem we are given $n$ linearly independent vectors $a^{(1)}, \ldots, a^{(n)} \in \mathbb{R}^m$ which form a basis for the subspace $V = \operatorname{span}\{a^{(1)}, \ldots, a^{(n)}\}$. For a given right hand side vector $b \in \mathbb{R}^m$ we want to find $u \in V$ such that $\|u - b\|$ is minimal. We can write $u = c_1 a^{(1)} + \cdots + c_n a^{(n)} = Ac$ with the matrix $A = [a^{(1)}, \ldots, a^{(n)}] \in \mathbb{R}^{m \times n}$. Hence we want to find $c \in \mathbb{R}^n$ such that $\|Ac - b\|$ is minimal.

Solving this problem is much simpler if we have an orthogonal basis for the subspace $V$: Assume we have vectors $p^{(1)}, \ldots, p^{(n)}$ such that

- $\operatorname{span}\{p^{(1)}, \ldots, p^{(n)}\} = V$
- the vectors are orthogonal on each other: $p^{(i)} \cdot p^{(j)} = 0$ for $i \ne j$

We can then write $u = d_1 p^{(1)} + \cdots + d_n p^{(n)} = Pd$ with the matrix $P = [p^{(1)}, \ldots, p^{(n)}] \in \mathbb{R}^{m \times n}$. Hence we want to find $d \in \mathbb{R}^n$ such that $\|Pd - b\|$ is minimal. The normal equations for this problem give

$$(P^\top P)\, d = P^\top b \tag{6}$$

where the matrix

$$P^\top P = \begin{bmatrix} p^{(1)\top} \\ \vdots \\ p^{(n)\top} \end{bmatrix} [p^{(1)}, \ldots, p^{(n)}] = \begin{bmatrix} p^{(1)} \cdot p^{(1)} & \cdots & p^{(1)} \cdot p^{(n)} \\ \vdots & & \vdots \\ p^{(n)} \cdot p^{(1)} & \cdots & p^{(n)} \cdot p^{(n)} \end{bmatrix} = \begin{bmatrix} p^{(1)} \cdot p^{(1)} & & \\ & \ddots & \\ & & p^{(n)} \cdot p^{(n)} \end{bmatrix}$$

is now diagonal since $p^{(i)} \cdot p^{(j)} = 0$ for $i \ne j$. Therefore the normal equations (6) are actually decoupled,

$$(p^{(1)} \cdot p^{(1)})\, d_1 = p^{(1)} \cdot b, \qquad \ldots, \qquad (p^{(n)} \cdot p^{(n)})\, d_n = p^{(n)} \cdot b,$$

and have the solution

$$d_i = \frac{p^{(i)} \cdot b}{p^{(i)} \cdot p^{(i)}} \qquad\text{for } i = 1, \ldots, n.$$

Gram-Schmidt orthogonalization

We still need a method to construct from a given basis $a^{(1)}, \ldots, a^{(n)}$ an orthogonal basis $p^{(1)}, \ldots, p^{(n)}$. Given $n$ linearly independent vectors $a^{(1)}, \ldots, a^{(n)} \in \mathbb{R}^m$ we want to find vectors $p^{(1)}, \ldots, p^{(n)}$ such that

- $\operatorname{span}\{p^{(1)}, \ldots, p^{(n)}\} = \operatorname{span}\{a^{(1)}, \ldots, a^{(n)}\}$
- the vectors are orthogonal on each other: $p^{(i)} \cdot p^{(j)} = 0$ for $i \ne j$

Step 1: $p^{(1)} := a^{(1)}$.

Step 2: $p^{(2)} := a^{(2)} - s_{12}\, p^{(1)}$ where we choose $s_{12}$ such that $p^{(1)} \cdot p^{(2)} = 0$:

$$p^{(1)} \cdot a^{(2)} - s_{12}\, p^{(1)} \cdot p^{(1)} = 0 \qquad\Longrightarrow\qquad s_{12} = \frac{p^{(1)} \cdot a^{(2)}}{p^{(1)} \cdot p^{(1)}}.$$

Step 3: $p^{(3)} := a^{(3)} - s_{13}\, p^{(1)} - s_{23}\, p^{(2)}$ where we choose $s_{13}, s_{23}$ such that $p^{(1)} \cdot p^{(3)} = 0$ and $p^{(2)} \cdot p^{(3)} = 0$:

$$p^{(1)} \cdot a^{(3)} - s_{13}\, p^{(1)} \cdot p^{(1)} - s_{23} \underbrace{p^{(1)} \cdot p^{(2)}}_{=0} = 0, \qquad\text{hence } s_{13} = \frac{p^{(1)} \cdot a^{(3)}}{p^{(1)} \cdot p^{(1)}},$$

$$p^{(2)} \cdot a^{(3)} - s_{13} \underbrace{p^{(2)} \cdot p^{(1)}}_{=0} - s_{23}\, p^{(2)} \cdot p^{(2)} = 0, \qquad\text{hence } s_{23} = \frac{p^{(2)} \cdot a^{(3)}}{p^{(2)} \cdot p^{(2)}}.$$

...

Step n: $p^{(n)} := a^{(n)} - s_{1n}\, p^{(1)} - \cdots - s_{n-1,n}\, p^{(n-1)}$ where we choose $s_{1n}, \ldots, s_{n-1,n}$ such that $p^{(j)} \cdot p^{(n)} = 0$ for $j = 1, \ldots, n-1$, which yields

$$s_{jn} = \frac{p^{(j)} \cdot a^{(n)}}{p^{(j)} \cdot p^{(j)}} \qquad\text{for } j = 1, \ldots, n-1.$$
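The following Matlab function is a sketch of this classical Gram-Schmidt process; the name gram_schmidt and its interface are our own choice for illustration. It returns $P$ with orthogonal columns and a unit upper triangular $S$ with $A = PS$ (the decomposition $A = PS$ is derived below):

    function [P, S] = gram_schmidt(A)
        % Classical Gram-Schmidt: A = P*S with orthogonal columns in P
        % and a unit upper triangular matrix S.
        [m, n] = size(A);
        P = zeros(m, n);
        S = eye(n);
        for k = 1:n
            p = A(:, k);                   % start with a(k)
            for j = 1:k-1
                S(j, k) = (P(:, j).'*A(:, k))/(P(:, j).'*P(:, j));
                p = p - S(j, k)*P(:, j);   % subtract the component along p(j)
            end
            P(:, k) = p;                   % orthogonal to p(1), ..., p(k-1)
        end
    end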

Example: We are given the vectors

$$a^{(1)} = \begin{bmatrix} 1 \\ 1 \\ 1 \\ 1 \end{bmatrix}, \qquad a^{(2)} = \begin{bmatrix} 0 \\ 1 \\ 2 \\ 3 \end{bmatrix}, \qquad a^{(3)} = \begin{bmatrix} 0 \\ 1 \\ 4 \\ 9 \end{bmatrix}.$$

Use Gram-Schmidt orthogonalization to find an orthogonal basis $p^{(1)}, p^{(2)}, p^{(3)}$ for the subspace $V = \operatorname{span}\{a^{(1)}, a^{(2)}, a^{(3)}\}$.

Step 1: $p^{(1)} := a^{(1)} = (1, 1, 1, 1)^\top$.

Step 2: $p^{(2)} := a^{(2)} - \dfrac{p^{(1)} \cdot a^{(2)}}{p^{(1)} \cdot p^{(1)}}\, p^{(1)} = a^{(2)} - \dfrac{6}{4}\, p^{(1)} = \left(-\tfrac{3}{2}, -\tfrac{1}{2}, \tfrac{1}{2}, \tfrac{3}{2}\right)^\top$.

Step 3: $p^{(3)} := a^{(3)} - \dfrac{p^{(1)} \cdot a^{(3)}}{p^{(1)} \cdot p^{(1)}}\, p^{(1)} - \dfrac{p^{(2)} \cdot a^{(3)}}{p^{(2)} \cdot p^{(2)}}\, p^{(2)} = a^{(3)} - \dfrac{14}{4}\, p^{(1)} - \dfrac{15}{5}\, p^{(2)} = (1, -1, -1, 1)^\top$.

Note that we have

$$a^{(1)} = p^{(1)}, \qquad a^{(2)} = p^{(2)} + \tfrac{6}{4}\, p^{(1)}, \qquad a^{(3)} = p^{(3)} + \tfrac{14}{4}\, p^{(1)} + \tfrac{15}{5}\, p^{(2)},$$

which we can write as

$$\underbrace{[a^{(1)}, a^{(2)}, a^{(3)}]}_{A} = \underbrace{[p^{(1)}, p^{(2)}, p^{(3)}]}_{P} \underbrace{\begin{bmatrix} 1 & \tfrac{3}{2} & \tfrac{7}{2} \\ 0 & 1 & 3 \\ 0 & 0 & 1 \end{bmatrix}}_{S}.$$

In the general case we have

$$a^{(1)} = p^{(1)}, \quad a^{(2)} = p^{(2)} + s_{12}\, p^{(1)}, \quad a^{(3)} = p^{(3)} + s_{13}\, p^{(1)} + s_{23}\, p^{(2)}, \quad \ldots, \quad a^{(n)} = p^{(n)} + s_{1n}\, p^{(1)} + \cdots + s_{n-1,n}\, p^{(n-1)},$$

which we can write as

$$[a^{(1)}, a^{(2)}, \ldots, a^{(n)}] = [p^{(1)}, p^{(2)}, \ldots, p^{(n)}] \begin{bmatrix} 1 & s_{12} & \cdots & s_{1n} \\ & 1 & \ddots & \vdots \\ & & \ddots & s_{n-1,n} \\ & & & 1 \end{bmatrix}.$$

Therefore we obtain a decomposition $A = PS$ where

- $P \in \mathbb{R}^{m \times n}$ has orthogonal columns
- $S \in \mathbb{R}^{n \times n}$ is upper triangular, with $1$'s on the diagonal
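Applying the gram_schmidt sketch from above to the matrix $A$ of this example should reproduce $P$ and $S$:

    A = [1 0 0; 1 1 1; 1 2 4; 1 3 9];
    [P, S] = gram_schmidt(A);
    % P = [1 -3/2  1        S = [1 3/2 7/2
    %      1 -1/2 -1             0  1   3
    %      1  1/2 -1             0  0   1]
    %      1  3/2  1]
    norm(A - P*S)              % = 0 up to roundoff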

Note that the vectors $p^{(1)}, \ldots, p^{(n)}$ are different from $0$: Assume, e.g., that $p^{(3)} = a^{(3)} - s_{13}\, p^{(1)} - s_{23}\, p^{(2)} = 0$; then $a^{(3)} = s_{13}\, p^{(1)} + s_{23}\, p^{(2)}$ is in $\operatorname{span}\{p^{(1)}, p^{(2)}\} = \operatorname{span}\{a^{(1)}, a^{(2)}\}$. This is a contradiction to the assumption that $a^{(1)}, a^{(2)}, a^{(3)}$ are linearly independent.

Solving the least squares problem $\|Ac - b\| = \min$ using orthogonalization

We are given $A \in \mathbb{R}^{m \times n}$ with linearly independent columns and $b \in \mathbb{R}^m$. We want to find $c \in \mathbb{R}^n$ such that $\|Ac - b\| = \min$. From the Gram-Schmidt method we get $A = PS$, hence we want to find $c$ such that

$$\|P \underbrace{Sc}_{d} - b\| = \min.$$

This gives the following method for solving the least squares problem:

- use Gram-Schmidt to find the decomposition $A = PS$
- solve $\|Pd - b\| = \min$: $d_i := \dfrac{p^{(i)} \cdot b}{p^{(i)} \cdot p^{(i)}}$ for $i = 1, \ldots, n$
- solve $Sc = d$ by back substitution

Example: Solve the least squares problem $\|Ac - b\| = \min$ for the matrix $A$ from the example above and a given right hand side vector $b \in \mathbb{R}^4$. Gram-Schmidt gives the decomposition $A = PS$ computed above, so

$$d_1 = \frac{p^{(1)} \cdot b}{p^{(1)} \cdot p^{(1)}}, \qquad d_2 = \frac{p^{(2)} \cdot b}{p^{(2)} \cdot p^{(2)}}, \qquad d_3 = \frac{p^{(3)} \cdot b}{p^{(3)} \cdot p^{(3)}},$$

and solving

$$\begin{bmatrix} 1 & \tfrac{3}{2} & \tfrac{7}{2} \\ 0 & 1 & 3 \\ 0 & 0 & 1 \end{bmatrix} \begin{bmatrix} c_1 \\ c_2 \\ c_3 \end{bmatrix} = \begin{bmatrix} d_1 \\ d_2 \\ d_3 \end{bmatrix}$$

by back substitution gives $c_3 = d_3$, then $c_2 = d_2 - 3 c_3$, then $c_1 = d_1 - \tfrac{3}{2} c_2 - \tfrac{7}{2} c_3$. Hence the solution of our least squares problem is the vector $c = (c_1, c_2, c_3)^\top$.

Note: If you want to solve a least squares problem by hand with pencil and paper, it is usually easier to use the normal equations. But for numerical computation on a computer using orthogonalization is usually more efficient and more accurate.
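In Matlab the three steps look as follows, using the gram_schmidt sketch from above; the right hand side b here is hypothetical sample data:

    A = [1 0 0; 1 1 1; 1 2 4; 1 3 9];
    b = [1; 0; 1; 2];
    [P, S] = gram_schmidt(A);
    d = (P.'*b)./diag(P.'*P);   % d(i) = (p(i).b)/(p(i).p(i))
    c = S\d;                    % back substitution (S is upper triangular)
    norm(A.'*(A*c - b))         % = 0: the residual is orthogonal to range(A)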

Finding an orthonormal basis $q^{(1)}, \ldots, q^{(n)}$: the QR decomposition

The Gram-Schmidt method gives an orthogonal basis $p^{(1)}, \ldots, p^{(n)}$ for $V = \operatorname{span}\{a^{(1)}, \ldots, a^{(n)}\}$. Often it is convenient to have a so-called orthonormal basis $q^{(1)}, \ldots, q^{(n)}$ where the basis vectors have length $1$: Define

$$q^{(j)} = \frac{p^{(j)}}{\|p^{(j)}\|} \qquad\text{for } j = 1, \ldots, n;$$

then we have

$$\operatorname{span}\{q^{(1)}, \ldots, q^{(n)}\} = V, \qquad q^{(j)} \cdot q^{(k)} = \begin{cases} 1 & \text{for } j = k \\ 0 & \text{otherwise.} \end{cases}$$

This means that the matrix $Q = [q^{(1)}, \ldots, q^{(n)}]$ satisfies $Q^\top Q = I$ where $I$ is the $n \times n$ identity matrix. Since $p^{(j)} = \|p^{(j)}\|\, q^{(j)}$ we have

$$a^{(1)} = \underbrace{\|p^{(1)}\|}_{r_{11}}\, q^{(1)}, \qquad a^{(2)} = \underbrace{s_{12}\,\|p^{(1)}\|}_{r_{12}}\, q^{(1)} + \underbrace{\|p^{(2)}\|}_{r_{22}}\, q^{(2)}, \qquad \ldots, \qquad a^{(n)} = \underbrace{s_{1n}\,\|p^{(1)}\|}_{r_{1n}}\, q^{(1)} + \cdots + \underbrace{s_{n-1,n}\,\|p^{(n-1)}\|}_{r_{n-1,n}}\, q^{(n-1)} + \underbrace{\|p^{(n)}\|}_{r_{nn}}\, q^{(n)},$$

which we can write as

$$[a^{(1)}, a^{(2)}, \ldots, a^{(n)}] = [q^{(1)}, q^{(2)}, \ldots, q^{(n)}] \begin{bmatrix} r_{11} & r_{12} & \cdots & r_{1n} \\ & r_{22} & \ddots & \vdots \\ & & \ddots & r_{n-1,n} \\ & & & r_{nn} \end{bmatrix}, \qquad\text{i.e.,}\qquad A = QR,$$

where the $n \times n$ matrix $R$ is given by

$$\text{row } j \text{ of } R = \|p^{(j)}\| \cdot (\text{row } j \text{ of } S), \qquad j = 1, \ldots, n.$$

We obtain the so-called QR decomposition $A = QR$ where

- the matrix $Q \in \mathbb{R}^{m \times n}$ has orthonormal columns, $\operatorname{range} Q = \operatorname{range} A$
- the matrix $R \in \mathbb{R}^{n \times n}$ is upper triangular, with nonzero diagonal elements

Example: In our example we have $p^{(1)} \cdot p^{(1)} = 4$, $p^{(2)} \cdot p^{(2)} = 5$, $p^{(3)} \cdot p^{(3)} = 4$, hence

$$q^{(1)} = \frac{p^{(1)}}{2} = \begin{bmatrix} 1/2 \\ 1/2 \\ 1/2 \\ 1/2 \end{bmatrix}, \qquad q^{(2)} = \frac{p^{(2)}}{\sqrt{5}} = \begin{bmatrix} -3\sqrt{5}/10 \\ -\sqrt{5}/10 \\ \sqrt{5}/10 \\ 3\sqrt{5}/10 \end{bmatrix}, \qquad q^{(3)} = \frac{p^{(3)}}{2} = \begin{bmatrix} 1/2 \\ -1/2 \\ -1/2 \\ 1/2 \end{bmatrix},$$

and we obtain the QR decomposition

$$\begin{bmatrix} 1 & 0 & 0 \\ 1 & 1 & 1 \\ 1 & 2 & 4 \\ 1 & 3 & 9 \end{bmatrix} = \begin{bmatrix} 1/2 & -3\sqrt{5}/10 & 1/2 \\ 1/2 & -\sqrt{5}/10 & -1/2 \\ 1/2 & \sqrt{5}/10 & -1/2 \\ 1/2 & 3\sqrt{5}/10 & 1/2 \end{bmatrix} \begin{bmatrix} 2 & 3 & 7 \\ 0 & \sqrt{5} & 3\sqrt{5} \\ 0 & 0 & 2 \end{bmatrix}.$$
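Constructing $Q$ and $R$ from the Gram-Schmidt output in Matlab, a sketch using the gram_schmidt function from above (the elementwise operations rely on implicit expansion, available in newer Matlab versions):

    A = [1 0 0; 1 1 1; 1 2 4; 1 3 9];
    [P, S] = gram_schmidt(A);
    lens = sqrt(diag(P.'*P)).';   % row vector of the lengths ||p(j)||
    Q = P./lens;                  % q(j) = p(j)/||p(j)||, columnwise
    R = lens.'.*S;                % row j of R = ||p(j)|| * (row j of S)
    norm(Q.'*Q - eye(3))          % = 0: Q has orthonormal columns
    norm(A - Q*R)                 % = 0 up to roundoff: A = Q*R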

In Matlab we can find the QR decomposition using [Q,R] = qr(A,0). This works for symbolic matrices

    >> A = sym([1 0 0; 1 1 1; 1 2 4; 1 3 9]);
    >> [Q,R] = qr(A,0)
    Q =
    [ 1/2, -(3*5^(1/2))/10,  1/2]
    [ 1/2,     -5^(1/2)/10, -1/2]
    [ 1/2,      5^(1/2)/10, -1/2]
    [ 1/2,  (3*5^(1/2))/10,  1/2]
    R =
    [ 2,       3,         7]
    [ 0, 5^(1/2), 3*5^(1/2)]
    [ 0,       0,         2]

and it works for numerical matrices

    >> A = [1 0 0; 1 1 1; 1 2 4; 1 3 9];
    >> [Q,R] = qr(A,0)
    Q =
        0.5000    0.6708   -0.5000
        0.5000    0.2236    0.5000
        0.5000   -0.2236    0.5000
        0.5000   -0.6708   -0.5000
    R =
        2.0000    3.0000    7.0000
             0   -2.2361   -6.7082
             0         0   -2.0000

Note that for the numerical matrix Matlab returned the basis $q^{(1)}, -q^{(2)}, -q^{(3)}$ (which is also an orthonormal basis), and hence rows 2 and 3 of the returned matrix $R$ are $(-1)$ times our previous matrix $R$.

If we want to find an orthonormal basis for $\operatorname{range} A$ and an orthonormal basis for the orthogonal complement $(\operatorname{range} A)^\perp = \operatorname{null}(A^\top)$ we can use the command [Qh,Rh] = qr(A): it returns matrices $\hat Q \in \mathbb{R}^{m \times m}$ and $\hat R \in \mathbb{R}^{m \times n}$ with

$$\hat Q = [\underbrace{q^{(1)}, \ldots, q^{(n)}}_{\text{basis for } \operatorname{range} A},\ \underbrace{q^{(n+1)}, \ldots, q^{(m)}}_{\text{basis for } (\operatorname{range} A)^\perp}], \qquad \hat R = \begin{bmatrix} R \\ m - n \text{ rows of zeros} \end{bmatrix}.$$

    >> A = [1 0 0; 1 1 1; 1 2 4; 1 3 9];
    >> [Qh,Rh] = qr(A);

Here the fourth column of $\hat Q$ is an orthonormal basis for the one-dimensional subspace $(\operatorname{range} A)^\perp$, and $\hat R$ consists of $R$ with one additional row of zeros. But in most cases we only need an orthonormal basis for $\operatorname{range} A$ and we should use [Q,R] = qr(A,0) (which Matlab calls the economy size decomposition).
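Continuing the transcript above, we can check that the extra column of the full decomposition is orthogonal on $\operatorname{range} A$:

    q4 = Qh(:,4);     % the single extra column (m - n = 1)
    norm(A.'*q4)      % = 0 up to roundoff: q4 is orthogonal on range(A)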

Solving the least squares problem $\|Ac - b\| = \min$ using the QR decomposition

If we use an orthonormal basis $q^{(1)}, \ldots, q^{(n)}$ for $\operatorname{span}\{a^{(1)}, \ldots, a^{(n)}\}$ we have $Q^\top Q = I$. The solution of $\|Qd - b\| = \min$ is therefore given by the normal equations $(Q^\top Q)\, d = Q^\top b$, i.e., we obtain $d = Q^\top b$. This gives the following method for solving the least squares problem:

- find the QR decomposition $A = QR$
- let $d = Q^\top b$
- solve $Rc = d$ by back substitution

In Matlab we can do this as follows:

    [Q,R] = qr(A,0);
    d = Q'*b;
    c = R\d;

This works for both numerical and symbolic matrices, and in our example it yields the least squares solution $c$ for the given right hand side vector $b$. For a numerical matrix $A$ we can use the shortcut

    c = A\b;

which actually uses the QR decomposition to find the solution of $\|Ac - b\| = \min$.

Warning: This shortcut does not work for symbolic matrices: for a symbolic $A$ the command c = A\b tries to solve the linear system $Ac = b$ exactly; since our system is inconsistent, Matlab prints the warning "The system is inconsistent. Solution does not exist." and returns c = [Inf; Inf; Inf].
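A short numerical check that the two approaches agree, with a hypothetical right hand side b chosen only for illustration:

    A = [1 0 0; 1 1 1; 1 2 4; 1 3 9];
    b = [1; 0; 1; 2];
    [Q, R] = qr(A, 0);
    c1 = R\(Q.'*b);    % QR-based solve as described above
    c2 = A\b;          % Matlab's backslash shortcut
    norm(c1 - c2)      % = 0 up to roundoff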
