
MTH 2310, FALL 2011
SECTION 6.2: ORTHOGONAL SETS

Homework Problems: 1, 5, 9, 13, 17, 21, 23¹, 27, 29

1. Introduction

We have previously discussed the benefits of having a set of vectors that is linearly independent or that spans a certain vector space. It turns out that having vectors that are mutually orthogonal also has certain benefits, which we discuss in this section.

2. Orthogonal Sets

Definition 1. A set of vectors {u_1, ..., u_p} in R^n is an orthogonal set if each pair of distinct vectors from the set is orthogonal, that is, if u_i · u_j = 0 whenever i ≠ j.

Exercise 2. Show that the set {u_1, u_2, u_3}, where u_1 = (3, 1, 1), u_2 = (-1, 2, 1), and u_3 = (-1/2, -2, 7/2), is an orthogonal set (i.e., dot each vector with the other two and confirm that each inner product gives you 0).
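As a quick numerical sanity check for Exercise 2, a minimal NumPy sketch:

```python
import numpy as np

# The three vectors of Exercise 2.
u1 = np.array([3.0, 1.0, 1.0])
u2 = np.array([-1.0, 2.0, 1.0])
u3 = np.array([-0.5, -2.0, 3.5])

# An orthogonal set: every pair of distinct vectors has dot product 0.
print(np.dot(u1, u2), np.dot(u1, u3), np.dot(u2, u3))  # 0.0 0.0 0.0
```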

Theorem 2.1. If S = {u_1, ..., u_p} is an orthogonal set of nonzero vectors in R^n, then S is linearly independent and hence is a basis for the subspace spanned by S.

Proof. To prove this, we must show that if

    (1)    0 = c_1 u_1 + ··· + c_p u_p

for some scalars c_1, ..., c_p, then each c_i must equal zero. Since all we know is that the vectors are mutually orthogonal, let's start by taking equation (1) and dotting both sides with the vector u_1.

Definition 3. An orthogonal basis is a basis that is also an orthogonal set.

It turns out that an orthogonal basis is much nicer to work with than other bases, as the next theorem illustrates. Basically, the idea is that a difficult part of changing bases is calculating the new weights for your vectors, and this calculation is greatly simplified if the basis vectors are orthogonal.

Theorem 2.2. Let {u_1, ..., u_p} be an orthogonal basis for a vector space W. For each y in W, the weights in the linear combination y = c_1 u_1 + ··· + c_p u_p are given by

    c_j = (y · u_j) / (u_j · u_j)    (j = 1, ..., p).

Proof. This proof is similar to the one above: start by taking both sides of y = c_1 u_1 + ··· + c_p u_p and dotting them with u_1, then use this to get the expression for c_1, then explain why the expression for c_j is true for any j.
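In more detail, the key step in both proofs is that dotting with u_1 kills every term except the first; a sketch of that step:

```latex
0 \;=\; \mathbf{0}\cdot u_1
  \;=\; (c_1 u_1 + \cdots + c_p u_p)\cdot u_1
  \;=\; c_1\,(u_1\cdot u_1) + c_2\,(u_2\cdot u_1) + \cdots + c_p\,(u_p\cdot u_1)
  \;=\; c_1\,(u_1\cdot u_1).
```

Since u_1 ≠ 0, we have u_1 · u_1 > 0, so c_1 = 0, and the same argument with u_j in place of u_1 gives c_j = 0 for every j. Dotting y = c_1 u_1 + ··· + c_p u_p with u_j instead gives y · u_j = c_j (u_j · u_j), which rearranges to the formula in Theorem 2.2.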

Example 4. The set {(3, 1, 1), (-1, 2, 1), (-1/2, -2, 7/2)} is an orthogonal basis for R^3. Use Theorem 2.2 to express the vector y = (6, 1, -8) as a linear combination of that basis.

3. Orthogonal Projection

In terms of the standard basis, it is easy to decompose a vector into its constituent parts. Given a different basis, though, it can be difficult to see how any other vector is written in terms of the basis. Geometrically, the big idea is this:
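A minimal NumPy sketch of the computation in Example 4, using the weight formula from Theorem 2.2:

```python
import numpy as np

basis = [np.array([3.0, 1.0, 1.0]),
         np.array([-1.0, 2.0, 1.0]),
         np.array([-0.5, -2.0, 3.5])]
y = np.array([6.0, 1.0, -8.0])

# Weights from Theorem 2.2: c_j = (y . u_j) / (u_j . u_j).
c = [np.dot(y, u) / np.dot(u, u) for u in basis]
print(c)                                        # [1.0, -2.0, -2.0]
print(sum(cj * u for cj, u in zip(c, basis)))   # recovers y: [ 6.  1. -8.]
```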

In general, we can state this problem in the following way: given a nonzero vector u in R^n, how can we decompose any other vector y into the sum of two vectors, one of which is a scalar multiple of u and the other of which is orthogonal to u? We do the following:

Definition 5. Let u be a nonzero vector and y another vector. Then

    ŷ = ((y · u) / (u · u)) u

is the orthogonal projection of y onto u, and z = y - ŷ is the component of y orthogonal to u.

Notice that if the vector u were replaced by any scalar multiple cu, you would still get the same result for ŷ (you may want to check this on your own if you do not believe me). The important thing is not the vector u itself, but the line spanned by u.

Definition 6. Let u be a nonzero vector and y another vector, and let L be the line span{u}. Then

    ŷ = proj_L y = ((y · u) / (u · u)) u

is the orthogonal projection of y onto L.
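A minimal NumPy sketch of the scaling claim, using arbitrarily chosen vectors y and u:

```python
import numpy as np

def proj(y, u):
    """Orthogonal projection of y onto the line spanned by u (Definitions 5 and 6)."""
    return (np.dot(y, u) / np.dot(u, u)) * u

y = np.array([3.0, 5.0])
u = np.array([1.0, 2.0])

# Replacing u by any nonzero scalar multiple cu gives the same projection.
print(proj(y, u))          # [2.6 5.2]
print(proj(y, -3.0 * u))   # [2.6 5.2]
```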

Exercise 7. Let y = (7, 6) and u = (4, 2).
(a) Use the definitions above to find ŷ, the orthogonal projection of y onto u.
(b) Find the component of y orthogonal to u (that is, z = y - ŷ).
(c) Check that ŷ and z are orthogonal.

Exercise 8. Given u ≠ 0 in R^n, let L = Span{u}. Show that the mapping x ↦ proj_L x is a linear transformation.

Another interpretation of the orthogonal projection is as follows. Since ŷ and z are perpendicular, we can think of ŷ as the point on the line L that is closest to the point y. This idea forms the foundation for many applications of linear algebra: often you cannot find an exact solution to a problem and are instead searching for the closest thing to a solution; in other words, you are looking for a projection of your ideal solution. We discuss this more in Section 6.5.
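If you want to check your answers to Exercise 7 numerically, a minimal NumPy sketch:

```python
import numpy as np

y = np.array([7.0, 6.0])
u = np.array([4.0, 2.0])

y_hat = (np.dot(y, u) / np.dot(u, u)) * u   # (a) projection of y onto u
z = y - y_hat                               # (b) component of y orthogonal to u
print(y_hat, z, np.dot(y_hat, z))           # (c) [8. 4.] [-1.  2.] 0.0
```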

4. Orthonormal Sets

If you have two vectors that are orthogonal, any scalar multiples of the two vectors will still be orthogonal. Why is this true? Thus if we have a set of orthogonal vectors and we scale them to be unit vectors, they become a set of orthogonal unit vectors, which will make us happy.

Definition 9. An orthogonal set of unit vectors is called an orthonormal set. If the set spans a vector space we call it an orthonormal basis, since the set will be linearly independent (Why?).

Example 10. Prove that {(2/√5, 1/√5), (-1/√5, 2/√5)} is an orthonormal basis for R^2.

Theorem 4.1. A matrix U has orthonormal columns if and only if U^T U = I.

Proof. We prove the case where U has three columns; the general case follows easily. Let U = [u_1 u_2 u_3].
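Theorem 4.1 is easy to test numerically; a minimal NumPy sketch using the two vectors of Example 10 as the columns of U:

```python
import numpy as np

s = np.sqrt(5.0)
# Columns of U are the two vectors from Example 10.
U = np.array([[2/s, -1/s],
              [1/s,  2/s]])

# Orthonormal columns  <=>  U^T U = I  (Theorem 4.1).
print(U.T @ U)   # identity matrix, up to floating-point rounding
```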

Theorem 4.2. Let U be an m × n matrix with orthonormal columns and let x and y be in R^n. Then
(a) ||Ux|| = ||x||
(b) (Ux) · (Uy) = x · y
(c) (Ux) · (Uy) = 0 if and only if x · y = 0

Proof. We prove only part (b); the others follow from it. We prove the case where U has three columns. Let U = [u_1 u_2 u_3], x = (x_1, x_2, x_3), and y = (y_1, y_2, y_3).

Notes
¹ T, T, F, F, F
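A minimal NumPy sketch of Theorem 4.2, using a 3 × 2 matrix with orthonormal columns obtained from a QR factorization (the particular numbers are arbitrary):

```python
import numpy as np

rng = np.random.default_rng(0)
# A 3 x 2 matrix with orthonormal columns, via a QR factorization.
U, _ = np.linalg.qr(rng.standard_normal((3, 2)))

x = np.array([1.0, -2.0])
y = np.array([3.0, 0.5])

print(np.linalg.norm(U @ x), np.linalg.norm(x))   # (a) same length
print(np.dot(U @ x, U @ y), np.dot(x, y))         # (b) same dot product
```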
