22m:033 Notes: 6.1 Inner Product, Length and Orthogonality

22m:033 Notes: 6.1 Inner Product, Length and Orthogonality
Dennis Roseman
University of Iowa
Iowa City, IA
http://www.math.uiowa.edu/~roseman
April

1 The inner product

Arithmetic is based on addition and multiplication. Geometry is based on length and angle. Using a dot product we obtain length and angle from addition and multiplication in a relatively simple way.

Definition 1.1 Suppose $\vec{u}$ is an $n$-dimensional vector with coordinates $\{u_i\}$ and $\vec{v}$ is an $n$-dimensional vector with coordinates $\{v_i\}$. Then the inner product or dot product of $\vec{u}$ and $\vec{v}$ is defined:
$$\vec{u}\cdot\vec{v} = u_1 v_1 + u_2 v_2 + \cdots + u_n v_n.$$

Remark 1.2 If we express a vector as a matrix with one column, the inner product can be expressed in terms of matrix multiplication as $\vec{u}\cdot\vec{v} = (\vec{u})^T\vec{v}$. So the dot product of $\vec{u} = \begin{pmatrix} 2 \\ 3 \end{pmatrix}$ and $\vec{v} = \begin{pmatrix} 3 \\ 5 \end{pmatrix}$ can be calculated:
$$\vec{u}\cdot\vec{v} = (\vec{u})^T\vec{v} = \begin{pmatrix} 2 & 3 \end{pmatrix}\begin{pmatrix} 3 \\ 5 \end{pmatrix} = 6 + 15 = 21.$$
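As a quick numerical check, here is a minimal Python sketch (assuming NumPy is available) that computes the dot product of Remark 1.2 both as a sum of coordinatewise products and as the matrix product $(\vec{u})^T\vec{v}$:

import numpy as np

u = np.array([2, 3])
v = np.array([3, 5])

# Dot product as a sum of coordinatewise products: u1*v1 + u2*v2
print(np.dot(u, v))                # 21

# The same value as the matrix product (u)^T v,
# treating u and v as column vectors (2x1 matrices).
u_col = u.reshape(-1, 1)
v_col = v.reshape(-1, 1)
print((u_col.T @ v_col).item())    # 21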

It is important to note that the dot product of two vectors is a number. This makes it very different from more familiar multiplications, where we multiply two things of some type and get another thing of that same type. Other than that, the formal properties of the dot product have the familiar look of multiplication:

Proposition 1.3 Suppose $\vec{u}$, $\vec{v}$ and $\vec{w}$ are $n$-dimensional vectors and $c$ a number. Then
1. $\vec{u}\cdot\vec{v} = \vec{v}\cdot\vec{u}$
2. $(\vec{u} + \vec{v})\cdot\vec{w} = \vec{u}\cdot\vec{w} + \vec{v}\cdot\vec{w}$
3. $(c\vec{u})\cdot\vec{v} = \vec{u}\cdot(c\vec{v}) = c(\vec{u}\cdot\vec{v})$
4. $0 \leq \vec{u}\cdot\vec{u}$
5. $\vec{u}\cdot\vec{u} = 0$ if and only if $\vec{u} = \vec{0}$

Remark 1.4 Note that in the last item there are two different zeros: the number zero and the vector zero.
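The identities of Proposition 1.3 can be spot-checked numerically. The Python sketch below (assuming NumPy, with arbitrarily chosen vectors) only illustrates them; it does not prove them:

import numpy as np

rng = np.random.default_rng(0)
u, v, w = rng.integers(-5, 6, size=(3, 4))   # three random 4-dimensional vectors
c = 3

# 1. Symmetry: u.v = v.u
print(np.dot(u, v) == np.dot(v, u))
# 2. Additivity: (u + v).w = u.w + v.w
print(np.dot(u + v, w) == np.dot(u, w) + np.dot(v, w))
# 3. Homogeneity: (c u).v = c (u.v)
print(np.dot(c * u, v) == c * np.dot(u, v))
# 4. Nonnegativity: u.u >= 0 (and it is zero only for the zero vector)
print(np.dot(u, u) >= 0)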

2 Length of a vector

Definition 2.1 The length (or norm) of a vector $\vec{v}$ is defined to be $\sqrt{\vec{v}\cdot\vec{v}}$. A common notation for the length of $\vec{v}$ is $\|\vec{v}\|$.

Remark 2.2 For two and three dimensional vectors the formula agrees with the familiar length formulas. If $\vec{v} = \begin{pmatrix} x \\ y \end{pmatrix}$ then $\|\vec{v}\| = \sqrt{x^2 + y^2}$. This is the familiar Pythagorean Theorem in the plane. If $\vec{v} = \begin{pmatrix} x \\ y \\ z \end{pmatrix}$ then $\|\vec{v}\| = \sqrt{x^2 + y^2 + z^2}$. This is the three dimensional version of the Pythagorean Theorem. For higher dimensions we have: if $\vec{v} = \begin{pmatrix} x_1 \\ \vdots \\ x_n \end{pmatrix}$ then $\|\vec{v}\| = \sqrt{x_1^2 + x_2^2 + \cdots + x_n^2}$. This is clearly an analogue of the Pythagorean Theorem. It allows us to define length in any $\mathbb{R}^n$, even though we may have difficulty visualizing things in high dimensions. A vector $\vec{v} \in \mathbb{R}^n$ lies on a line $L = \{t\vec{v}\}$, and our length is just length as measured in this line.

Example 2.3 So the length of $\begin{pmatrix} 1 \\ 2 \\ 3 \\ 4 \end{pmatrix}$ is $\sqrt{1^2 + 2^2 + 3^2 + 4^2} = \sqrt{1 + 4 + 9 + 16} = \sqrt{30}$.

Remark 2.4 It is easy to check from the definition that if $c$ is a number, then $\|c\vec{v}\| = |c|\,\|\vec{v}\|$. For example, if $\vec{v} = \begin{pmatrix} x \\ y \end{pmatrix}$ then $c\vec{v} = \begin{pmatrix} cx \\ cy \end{pmatrix}$, and so
$$\|c\vec{v}\| = \sqrt{(cx)^2 + (cy)^2} = \sqrt{c^2(x^2 + y^2)} = \sqrt{c^2}\sqrt{x^2 + y^2} = |c|\sqrt{x^2 + y^2}.$$

Definition 2.5 A vector with length $1$ is called a unit vector.
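A short Python sketch (assuming NumPy) of the length computations above, using the vector of Example 2.3:

import numpy as np

v = np.array([1, 2, 3, 4])

# Length via the definition sqrt(v.v) and via the built-in norm.
print(np.sqrt(np.dot(v, v)))     # 5.477... = sqrt(30)
print(np.linalg.norm(v))         # same value

# Remark 2.4: ||c v|| = |c| ||v||, illustrated for c = -3.
c = -3
print(np.isclose(np.linalg.norm(c * v), abs(c) * np.linalg.norm(v)))   # True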

Definition 2.6 If $\vec{v}$ is a non-zero vector, the vector $\left(\frac{1}{\|\vec{v}\|}\right)\vec{v}$ is called the normalization of $\vec{v}$.

Remark 2.7 If $\vec{v}$ is a non-zero vector, then $\left(\frac{1}{\|\vec{v}\|}\right)\vec{v}$ is a unit vector since, using Remark 2.4, we see:
$$\left\| \frac{1}{\|\vec{v}\|}\vec{v} \right\| = \frac{1}{\|\vec{v}\|}\|\vec{v}\| = \frac{\|\vec{v}\|}{\|\vec{v}\|} = 1.$$

Example 2.8 The normalization of $\begin{pmatrix} 1 \\ 2 \\ 3 \\ 4 \end{pmatrix}$ is $\begin{pmatrix} 1/\sqrt{30} \\ 2/\sqrt{30} \\ 3/\sqrt{30} \\ 4/\sqrt{30} \end{pmatrix}$.

3 Distance in $\mathbb{R}^n$

Definition 3.1 Suppose $\vec{u}$ and $\vec{v}$ are $n$-dimensional vectors. We define the distance between $\vec{u}$ and $\vec{v}$ to be
$$\mathrm{dist}(\vec{u}, \vec{v}) = \|\vec{u} - \vec{v}\|.$$
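Normalization and distance are equally direct to compute. A minimal Python sketch assuming NumPy, where the vector is the one from Example 2.8 and the second vector is chosen only for illustration:

import numpy as np

v = np.array([1, 2, 3, 4])

# Normalization: (1/||v||) v is a unit vector.
v_hat = v / np.linalg.norm(v)
print(v_hat)                     # the entries 1, 2, 3, 4 each divided by sqrt(30)
print(np.linalg.norm(v_hat))     # 1.0 (up to rounding)

# Distance between two vectors: the length of their difference (Definition 3.1).
u = np.array([4, 3, 2, 1])       # a second vector, chosen just for illustration
print(np.linalg.norm(u - v))     # dist(u, v)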

Remark 3.2 In other words, the distance between two vectors is the length of their difference. If we write out the formulas we find that the distance between two vectors is the square root of the sum of the squares of the differences of the coordinates:
$$\mathrm{dist}(\vec{u}, \vec{v}) = \sqrt{\sum_{i=1}^{n} (u_i - v_i)^2}.$$
Also note we can write $\mathrm{dist}(\vec{u}, \vec{v}) = \sqrt{(\vec{u} - \vec{v})\cdot(\vec{u} - \vec{v})}$.

4 Angles in $\mathbb{R}^n$

The famed Pythagorean Theorem $a^2 + b^2 = c^2$ only holds for right triangles, where the angle $\theta$ opposite the side of length $c$ (that is, the hypotenuse) is $\pi/2$. The theorem covering the case when $\theta \neq \pi/2$ is the Law of Cosines:
$$a^2 + b^2 = c^2 + 2ab\cos\theta.$$
Two vectors $\vec{u}$ and $\vec{v}$ in the plane $\mathbb{R}^2$ (for the moment assume they are linearly independent) determine a triangle whose vertices are the origin and the two endpoints of the vectors. Let $\theta$ be the angle of this triangle at $\vec{0}$; we refer to this as the angle between the vectors. Applying the Law of Cosines to this triangle, with $a = \|\vec{u}\|$, $b = \|\vec{v}\|$ and $c = \|\vec{u} - \vec{v}\|$, and expanding $\|\vec{u} - \vec{v}\|^2 = (\vec{u} - \vec{v})\cdot(\vec{u} - \vec{v}) = \|\vec{u}\|^2 - 2\,\vec{u}\cdot\vec{v} + \|\vec{v}\|^2$, we obtain
$$\vec{u}\cdot\vec{v} = \|\vec{u}\|\,\|\vec{v}\|\cos\theta.$$
It is easy to check that this formula also works if $\vec{u}$ and $\vec{v}$ are not linearly independent. In the general case suppose we have two vectors $\vec{u}$ and $\vec{v}$ in $\mathbb{R}^n$ (for the moment assume they are linearly independent). Then they lie in a well defined two-dimensional subset which contains the origin and is geometrically just like $\mathbb{R}^2$. In this plane we can measure our angle $\theta$, and we will find that our formula still holds:
$$\vec{u}\cdot\vec{v} = \|\vec{u}\|\,\|\vec{v}\|\cos\theta.$$
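Turning the formula around, $\cos\theta = \dfrac{\vec{u}\cdot\vec{v}}{\|\vec{u}\|\,\|\vec{v}\|}$ lets us compute the angle itself. A minimal Python sketch assuming NumPy; the two vectors are chosen only for illustration:

import numpy as np

u = np.array([1.0, 0.0])
v = np.array([1.0, 1.0])

# cos(theta) = (u.v) / (||u|| ||v||); clip guards against rounding slightly outside [-1, 1].
cos_theta = np.dot(u, v) / (np.linalg.norm(u) * np.linalg.norm(v))
theta = np.arccos(np.clip(cos_theta, -1.0, 1.0))

print(np.degrees(theta))   # 45.0 degrees for these two vectors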

5 Orthogonal vectors

Two line segments joined at a point are called orthogonal if they meet at a right angle. We can think of these line segments as giving two vectors. Since $\cos(\pi/2) = 0$, we see from the formula of Section 4 that the dot product of these two vectors must be zero.

Definition 5.1 Two vectors $\vec{u}$ and $\vec{v}$ in $\mathbb{R}^n$ are orthogonal if and only if $\vec{u}\cdot\vec{v} = 0$.

Example 5.2 So we can see that $\begin{pmatrix} 1 \\ 4 \\ 2 \end{pmatrix}$ and $\begin{pmatrix} 1 \\ 1 \\ 1 \end{pmatrix}$ are not orthogonal since their dot product is $1 + 4 + 2 \neq 0$. Also $\begin{pmatrix} 1 \\ 4 \\ 2 \end{pmatrix}$ and $\begin{pmatrix} 2 \\ -1 \\ 1 \end{pmatrix}$ are orthogonal since their dot product is $2 - 4 + 2 = 0$.
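Checking orthogonality is just checking whether a dot product is zero. A small Python sketch assuming NumPy, using the vectors of Example 5.2:

import numpy as np

u = np.array([1, 4, 2])
v = np.array([1, 1, 1])
w = np.array([2, -1, 1])

# Two vectors are orthogonal exactly when their dot product is 0.
print(np.dot(u, v), np.dot(u, v) == 0)   # 7, False: u and v are not orthogonal
print(np.dot(u, w), np.dot(u, w) == 0)   # 0, True:  u and w are orthogonal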

Example 5.3 Find a vector of the form $\begin{pmatrix} a \\ 1 \end{pmatrix}$ which is orthogonal to $\begin{pmatrix} 1 \\ -3 \end{pmatrix}$. The dot product of these vectors is $a - 3$, and we need $a - 3 = 0$, so we conclude that $a = 3$.

6 Orthogonal complements

Definition 6.1 If $\vec{z}$ is a vector in $\mathbb{R}^n$ and $W$ a subspace, we say $\vec{z}$ is orthogonal to $W$ if $\vec{z}$ is orthogonal to every vector in $W$.

Remark 6.2 If $\vec{z}$ is a vector in $\mathbb{R}^n$, the set of all vectors orthogonal to $\vec{z}$ is a subspace. If $W$ is a subspace of $\mathbb{R}^n$, the set of all vectors orthogonal to every vector in $W$ is also a subspace of $\mathbb{R}^n$. We denote this subspace by $W^{\perp}$.

Definition 6.3 If $W$ is a subspace of $\mathbb{R}^n$, the subspace $W^{\perp}$ of all vectors orthogonal to $W$ is called the orthogonal complement of $W$.
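Definition 6.1 demands orthogonality to every vector of $W$; numerically we can at least spot-check a candidate $\vec{z}$ against a few vectors of $W$ built from a spanning set. A small illustrative Python sketch assuming NumPy (the spanning vectors and $\vec{z}$ here are chosen only for illustration):

import numpy as np

# W is spanned by w1 and w2; z is a candidate vector orthogonal to W.
w1 = np.array([1, 0, 1])
w2 = np.array([0, 1, 1])
z = np.array([1, 1, -1])    # z.w1 = 0 and z.w2 = 0

rng = np.random.default_rng(1)
for _ in range(5):
    a, b = rng.uniform(-10, 10, size=2)
    w = a * w1 + b * w2                    # an arbitrary vector of W
    print(np.isclose(np.dot(z, w), 0.0))   # True every time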

Definition 6.4 Let $A$ be an $m \times n$ matrix. The row space, $\mathrm{Row}\,A$, of $A$ is the subspace of $\mathbb{R}^n$ spanned by the row vectors of $A$. The column space, $\mathrm{Col}\,A$, of $A$ is the subspace of $\mathbb{R}^m$ spanned by the column vectors of $A$.

When we do matrix multiplication $AB = C$ we calculate $c_{ij}$ from the $i$-th row of $A$ and the $j$-th column of $B$. That calculation can be viewed as a dot product of the $i$-th row vector of $A$ and the $j$-th column vector of $B$. Now think about an equation $A\vec{x} = \vec{0}$. We can think of this as saying that the vector $\vec{x}$ is orthogonal to every row vector of $A$. This leads us to the following theorem.

Proposition 6.5 Let $A$ be an $m \times n$ matrix. Then
1. $(\mathrm{Row}\,A)^{\perp} = \mathrm{Nul}\,A$
2. $(\mathrm{Col}\,A)^{\perp} = \mathrm{Nul}\,A^T$

Example 6.6 Find a basis for the orthogonal complement of the subspace $W$ spanned by $\begin{pmatrix} 1 \\ 3 \\ 4 \\ 2 \end{pmatrix}$ and $\begin{pmatrix} 0 \\ 1 \\ 1 \\ 0 \end{pmatrix}$.

By Proposition 6.5 we need only find the null space of
$$A = \begin{pmatrix} 1 & 3 & 4 & 2 \\ 0 & 1 & 1 & 0 \end{pmatrix}.$$
We do this by row reducing $A$ to get
$$\begin{pmatrix} 1 & 3 & 4 & 2 \\ 0 & 1 & 1 & 0 \end{pmatrix} \longrightarrow \begin{pmatrix} 1 & 0 & 1 & 2 \\ 0 & 1 & 1 & 0 \end{pmatrix}.$$
Letting $x_4 = s$ and $x_3 = t$ we will have $x_1 + t + 2s = 0$ and $x_2 + t = 0$. So the null space is all vectors
$$\begin{pmatrix} -t - 2s \\ -t \\ t \\ s \end{pmatrix} = t\begin{pmatrix} -1 \\ -1 \\ 1 \\ 0 \end{pmatrix} + s\begin{pmatrix} -2 \\ 0 \\ 0 \\ 1 \end{pmatrix}.$$
We conclude that
$$\left\{ \begin{pmatrix} -1 \\ -1 \\ 1 \\ 0 \end{pmatrix},\ \begin{pmatrix} -2 \\ 0 \\ 0 \\ 1 \end{pmatrix} \right\}$$
is a basis for $W^{\perp}$.
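A Python sketch assuming NumPy that double-checks Example 6.6 as worked above: each claimed basis vector of $W^{\perp}$ should be sent to $\vec{0}$ by $A$, that is, it should be orthogonal to both rows of $A$.

import numpy as np

# Rows of A are the vectors spanning W, as in Example 6.6.
A = np.array([[1, 3, 4, 2],
              [0, 1, 1, 0]])

# Claimed basis of the orthogonal complement W-perp.
b1 = np.array([-1, -1, 1, 0])
b2 = np.array([-2, 0, 0, 1])

# A b = 0 means b is orthogonal to every row of A, hence to W.
print(A @ b1)   # [0 0]
print(A @ b2)   # [0 0]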

7 Problems

1. Let $\vec{v} = \begin{pmatrix} 1 \\ 2 \\ 3 \end{pmatrix}$. Find a basis for the subspace of $\mathbb{R}^3$ orthogonal to the subspace spanned by $\vec{v}$.

2. Find a basis for the orthogonal complement of the subspace $W$ spanned by $\begin{pmatrix} 1 \\ 3 \\ 2 \end{pmatrix}$ and $\begin{pmatrix} 2 \\ 4 \\ 1 \end{pmatrix}$.