Lecture Note on Linear Algebra 17. Standard Inner Product

Wei-Shi Zheng, wszheng@ieee.org, November 21, 2011

1 What Do You Learn from This Note

Up to this point, the theory we have established can be applied to any vector space in general. However, vector spaces over $\mathbb{R}$ carry geometric structure of their own. You probably already know concepts such as the lengths, angles and distances of vectors in $\mathbb{R}^2$ and $\mathbb{R}^3$. Now we shall see how these geometric concepts can be generalized to vector spaces over $\mathbb{R}$, particularly $\mathbb{R}^n$.

Basic concepts: inner product (内积), length (长度), unit vector (单位向量), distance (距离), orthogonality (正交), orthogonal complements (正交补).

2 Inner Product & Length

2.1 Basic Concept

Definition 1 (inner product or dot product, 内积). Let $u, v \in \mathbb{R}^n$. The standard inner product or dot product $u \cdot v$ of $u$ and $v$ is defined to be the real number $u^T v$.

Definition 2 (norm or length, 模或长度). Let $v \in \mathbb{R}^n$. The norm or length $\|v\|$ of $v$ is defined to be

    $\|v\| = \sqrt{v \cdot v},$

which is a non-negative real number.
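To make Definitions 1 and 2 concrete, here is a minimal numerical sketch in Python (assuming NumPy is available; the particular vectors are illustrative choices, not taken from the notes):

import numpy as np

# Two example vectors in R^3 (arbitrary choices for illustration).
u = np.array([1.0, 3.0, -5.0])
v = np.array([2.0, 7.0, 4.0])

# Definition 1: the standard inner product u . v is the real number u^T v.
dot = u @ v                      # 1*2 + 3*7 + (-5)*4 = 3; same as np.dot(u, v)

# Definition 2: the norm ||v|| is sqrt(v . v).
norm_v = np.sqrt(v @ v)          # agrees with np.linalg.norm(v)

print(dot, norm_v)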

Definition 3 (unit vector, 单位向量). A unit vector is a vector whose norm is 1. If we are given any non-zero vector $v$, then $\frac{v}{\|v\|}$ is a unit vector, which is called the unit vector in the same direction as $v$. The process of forming $\frac{v}{\|v\|}$ is called normalisation.

Definition 4 (angle). Let $u, v \in \mathbb{R}^n \setminus \{0\}$. The angle $\theta$ between $u$ and $v$ is defined to be

    $\theta = \arccos \frac{u \cdot v}{\|u\| \, \|v\|},$

which is a number in the interval $[0, \pi]$.

Definition 5 (distance, 距离). Let $u, v \in \mathbb{R}^n$. The distance $d(u, v)$ between $u$ and $v$ is defined to be $d(u, v) = \|u - v\|$.

Definition 6 (orthogonality, 正交). Two vectors $u$ and $v$ in $\mathbb{R}^n$ are orthogonal if $u \cdot v = 0$.

2.2 Some Properties

Theorem 7. Let $u, v, w \in \mathbb{R}^n$ and $c \in \mathbb{R}$. Then

1. $u \cdot u \ge 0$, and $u \cdot u = 0$ iff $u = 0$;
2. $u \cdot v = v \cdot u$;
3. $u \cdot (v + w) = u \cdot v + u \cdot w$;
4. $u \cdot (cv) = c(u \cdot v)$.

(So we say $\mathbb{R}^n$ equipped with the dot product is an inner product space.)

Proof. Straightforward.
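Definitions 3-6 and Theorem 7 translate directly into short computations. The following sketch (again assuming NumPy; the example vectors are arbitrary) checks normalisation, the angle formula, the distance, and the symmetry and linearity properties:

import numpy as np

u = np.array([3.0, 4.0])
v = np.array([4.0, -3.0])
w = np.array([1.0, 2.0])
c = 2.5

unit_u = u / np.linalg.norm(u)     # Definition 3: the unit vector in the direction of u
theta = np.arccos(u @ v / (np.linalg.norm(u) * np.linalg.norm(v)))   # Definition 4: angle
dist = np.linalg.norm(u - v)       # Definition 5: distance d(u, v) = ||u - v||

print(dist)                                            # distance between u and v
print(np.isclose(np.linalg.norm(unit_u), 1.0))         # True: unit_u has norm 1
print(np.isclose(u @ v, 0.0), np.isclose(theta, np.pi / 2))   # Definition 6: u and v are orthogonal
print(np.isclose(u @ v, v @ u))                        # Theorem 7, property 2 (symmetry)
print(np.isclose(u @ (v + w), u @ v + u @ w))          # Theorem 7, property 3 (additivity)
print(np.isclose(u @ (c * v), c * (u @ v)))            # Theorem 7, property 4 (homogeneity)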

Remarks:

1) Properties 3 and 4 of Theorem 7 say that the dot product is linear in the second operand, and property 2 then ensures that it is also linear in the first operand.

2) We have $\|v\| = 0$ iff $v = 0$.

3) For any scalar $c$, we have

    $\|cv\| = \sqrt{(cv) \cdot (cv)} = \sqrt{c^2 (v \cdot v)} = |c| \, \|v\|.$

Example: Textbook P.377.

Theorem 8 (the Cauchy-Schwarz inequality). Let $u, v \in \mathbb{R}^n$. Then

    $(u \cdot v)^2 \le (u \cdot u)(v \cdot v),$

or equivalently, $|u \cdot v| \le \|u\| \, \|v\|$. The equality holds if and only if $u$ and $v$ are linearly dependent.

Proof. Step 1. Suppose first that $u$ and $v$ are linearly independent. Then neither $u$ nor $v$ is $0$, and $u - tv \ne 0$ for any scalar $t$, so we have

    $0 < (u - tv) \cdot (u - tv)$   [by 1. of Theorem 7]
    $= (u \cdot u) - t(u \cdot v) - t(v \cdot u) + t^2 (v \cdot v)$   [by linearity]
    $= (u \cdot u) - 2t(u \cdot v) + t^2 (v \cdot v).$   [by symmetry]

Since $t$ is arbitrary, we may take $t = \frac{u \cdot v}{v \cdot v}$ and substitute it into the above inequality to obtain

    $0 < (u \cdot u) - 2\frac{(u \cdot v)^2}{v \cdot v} + \frac{(u \cdot v)^2}{v \cdot v} = (u \cdot u) - \frac{(u \cdot v)^2}{v \cdot v}.$

Multiplying both sides by $(v \cdot v)$, we get $0 < (u \cdot u)(v \cdot v) - (u \cdot v)^2$, so $(u \cdot v)^2 < (u \cdot u)(v \cdot v)$.

Step 2. If $u$ and $v$ are linearly dependent, then either $u = \lambda v$ or $v = \lambda u$. In both cases, it is easy to check that $(u \cdot v)^2 = (u \cdot u)(v \cdot v)$.

By the Cauchy-Schwarz inequality, for any $u, v \in \mathbb{R}^n \setminus \{0\}$ we have

    $-1 \le \frac{u \cdot v}{\|u\| \, \|v\|} \le 1,$

so the angle $\theta$ in Definition 4 is well defined.
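A quick numerical sanity check of Theorem 8 (NumPy assumed; random example vectors, so this is evidence rather than a proof):

import numpy as np

rng = np.random.default_rng(0)
u = rng.standard_normal(5)
v = rng.standard_normal(5)

# Cauchy-Schwarz: (u . v)^2 <= (u . u)(v . v).
print((u @ v) ** 2 <= (u @ u) * (v @ v))

# Equality holds when u and v are linearly dependent, e.g. w = 3u.
w = 3.0 * u
print(np.isclose((u @ w) ** 2, (u @ u) * (w @ w)))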

Remarks:

1. By definition, we have $u \cdot v = \|u\| \, \|v\| \cos\theta$.

2. By the Cauchy-Schwarz inequality, $\theta = 0$ or $\pi$ iff $u$ and $v$ are linearly dependent. It is not hard to see that $\theta = 0$ if $u$ and $v$ are in the same direction and $\theta = \pi$ if $u$ and $v$ are in opposite directions.

Theorem 9. Let $u, v \in \mathbb{R}^n$ and $c \in \mathbb{R}$. Then

1. $\|u\| \ge 0$, and $\|u\| = 0$ iff $u = 0$;
2. $\|cu\| = |c| \, \|u\|$;
3. $\|u + v\| \le \|u\| + \|v\|$, and the equality holds iff $u$ and $v$ are in the same direction.

(So we say $\mathbb{R}^n$ equipped with $\|\cdot\|$ is a normed space.)

Proof. 1. and 2. have been shown already. For 3. we have

    $\|u + v\|^2 = u \cdot u + v \cdot v + 2(u \cdot v) \le \|u\|^2 + \|v\|^2 + 2\|u\| \, \|v\| = (\|u\| + \|v\|)^2$   [by the Cauchy-Schwarz inequality]

So $\|u + v\| \le \|u\| + \|v\|$, and the equality holds iff $u \cdot v = \|u\| \, \|v\|$, iff $u$ and $v$ are in the same direction.

Example: Textbook P.378.

Theorem 10. Let $u, v, w \in \mathbb{R}^n$. Then

1. $d(u, v) \ge 0$, and $d(u, v) = 0$ iff $u = v$;
2. $d(u, v) = d(v, u)$;
3. Triangle inequality: $d(u, w) \le d(u, v) + d(v, w)$.

(So we say $\mathbb{R}^n$ equipped with $d(\cdot, \cdot)$ is a metric space.)

Proof. 1. and 2. are obvious. For 3., we have

    $d(u, w) = \|u - w\| = \|(u - v) + (v - w)\| \le \|u - v\| + \|v - w\| = d(u, v) + d(v, w).$

3 Orthogonality: More

Definition 11 (orthogonality, 正交). Vectors $u$ and $v$ are said to be orthogonal, written $u \perp v$, iff $u \cdot v = 0$. If neither $u$ nor $v$ is $0$, then $u \perp v$ is equivalent to $\theta = \frac{\pi}{2}$.
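The triangle inequalities of Theorem 9 (for the norm) and Theorem 10 (for the distance) can be spot-checked in the same way (NumPy assumed; random example vectors in $\mathbb{R}^4$):

import numpy as np

rng = np.random.default_rng(1)
u, v, w = rng.standard_normal((3, 4))    # three random vectors in R^4

# Theorem 9, property 3: ||u + v|| <= ||u|| + ||v||.
print(np.linalg.norm(u + v) <= np.linalg.norm(u) + np.linalg.norm(v))

# Theorem 10, property 3: d(u, w) <= d(u, v) + d(v, w), where d(x, y) = ||x - y||.
def d(x, y):
    return np.linalg.norm(x - y)

print(d(u, w) <= d(u, v) + d(v, w))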

Theorem 12. Let $u, v \in \mathbb{R}^n$.

1. The Cosine Theorem: If neither $u$ nor $v$ is $0$, then

    $\|u - v\|^2 = \|u\|^2 + \|v\|^2 - 2\|u\| \, \|v\| \cos\theta.$

2. The Pythagorean Theorem: If $u \perp v$, then

    $\|u + v\|^2 = \|u\|^2 + \|v\|^2.$

Proof.

1. $\|u - v\|^2 = (u - v) \cdot (u - v) = u \cdot u + v \cdot v - 2(u \cdot v) = \|u\|^2 + \|v\|^2 - 2\|u\| \, \|v\| \cos\theta.$

2. $\|u + v\|^2 = u \cdot u + v \cdot v + 2(u \cdot v) = \|u\|^2 + \|v\|^2.$

Definition 13 (orthogonal complements, 正交补). If a vector $z$ is orthogonal to every vector in a subspace $W$ of $\mathbb{R}^n$, then $z$ is said to be orthogonal to $W$. The set of all vectors $z$ that are orthogonal to $W$ is called the orthogonal complement of $W$ and is denoted by $W^\perp$.

Theorem 14. Let $W$ be a subspace of $\mathbb{R}^n$. Then $W^\perp$ is a subspace.

Proof. Exercise.

Theorem 15. Let $A$ be an $m \times n$ matrix. Then

    $(\mathrm{Row}\,A)^\perp = \mathrm{Nul}\,A$   and   $(\mathrm{Col}\,A)^\perp = \mathrm{Nul}\,A^T.$

(Illustration: The Flight into Egypt, by Giotto di Bondone.)
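Finally, Theorem 15 can be checked numerically for a concrete matrix. The sketch below (NumPy assumed; the matrix A is an arbitrary example) computes a basis of Nul A from the SVD and verifies that every basis vector is orthogonal to every row of A, i.e. that Nul A lies inside (Row A)^⊥:

import numpy as np

A = np.array([[1.0, 2.0, 3.0],
              [4.0, 5.0, 6.0]])           # an example 2 x 3 matrix of rank 2

# Right singular vectors belonging to (numerically) zero singular values span Nul A.
_, s, Vt = np.linalg.svd(A)
rank = int((s > 1e-10).sum())
null_basis = Vt[rank:]                    # here a single vector spanning Nul A

# Theorem 15: vectors in Nul A are orthogonal to the rows of A (i.e. to Row A).
print(np.allclose(A @ null_basis.T, 0.0))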