Inner Product and Orthogonality

P. Sam Johnson (NITK)
October 3, 2014

Overview

In the Euclidean spaces $\mathbb{R}^2$ and $\mathbb{R}^3$ there are two concepts, viz., length (or distance) and angle, which have no analogues over a general field. Fortunately there is a single concept, usually known as the inner product or scalar product, which covers both length and angle. We discuss the concept of orthogonality and some applications.

Scalar product of vectors in $\mathbb{R}^2$

Let $a = (a_1, a_2)$ and $b = (b_1, b_2)$ be vectors in $\mathbb{R}^2$ represented by the points $A$ and $B$ as in the figure. Then the scalar product of $a$ and $b$ is defined to be
$$\langle a, b \rangle = \|a\|\,\|b\|\cos\theta,$$
where $\|a\|$ is the length of $OA$, $\|b\|$ is the length of $OB$, and $\theta$ is the angle between $OA$ and $OB$.
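
As a quick numerical check (an illustrative sketch with made-up vectors, not part of the original slides), the geometric definition $\|a\|\,\|b\|\cos\theta$ agrees with the coordinate formula $a_1 b_1 + a_2 b_2$:

```python
import numpy as np

# Hypothetical example vectors in R^2 (not from the slides).
a = np.array([3.0, 4.0])
b = np.array([2.0, 1.0])

# Angle between OA and OB, measured independently of the dot product.
theta = np.arctan2(a[1], a[0]) - np.arctan2(b[1], b[0])

geometric  = np.linalg.norm(a) * np.linalg.norm(b) * np.cos(theta)
coordinate = a[0] * b[0] + a[1] * b[1]

print(geometric, coordinate)   # both are 10.0 (up to rounding)
```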

Length, distance, angle: in terms of the inner product

The scalar product gives the following concepts of length, distance, and angle.
1. Length of a vector: the length $OA$ can be defined in terms of the scalar product, since $OA^2 = \|a\|^2 = \langle a, a \rangle$.
2. Distance between vectors: if $OABC$ is a parallelogram, the distance $AB = OC = \sqrt{\langle b - a,\, b - a \rangle}$, since $C$ represents the vector $b - a$.
3. The angle $\theta$ can be obtained as
$$\theta = \cos^{-1}\!\left(\frac{\langle a, b \rangle}{\sqrt{\langle a, a \rangle}\,\sqrt{\langle b, b \rangle}}\right).$$

The above concepts and results have obvious analogues in $\mathbb{R}^3$. The concept of angle between vectors is generalized to a vector space with an inner product (called an inner product space). Motivated by the scalar product (dot product) on $\mathbb{R}^2$, we now give the axiomatic definition of an inner product on a vector space over $\mathbb{K}$, where $\mathbb{K}$ is either $\mathbb{R}$ or $\mathbb{C}$.

Definition. Let $X$ be a vector space over $\mathbb{K}$. A function $\langle \cdot, \cdot \rangle : X \times X \to \mathbb{K}$ is an inner product on $X$ if for any $x, y, z \in X$ and $\alpha, \beta \in \mathbb{K}$ the following conditions are satisfied:
1. $\langle \alpha x + \beta y, z \rangle = \alpha \langle x, z \rangle + \beta \langle y, z \rangle$ (linearity with respect to the first variable),
2. $\langle x, x \rangle \geq 0$ (positivity), and $\langle x, x \rangle = 0 \iff x = 0$,
3. $\langle x, y \rangle = \overline{\langle y, x \rangle}$ (conjugate symmetry, often known as Hermitian symmetry; the bar denotes the complex conjugate).
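
A minimal sketch (assuming the canonical inner product $\langle x, y \rangle = \sum_i x_i \overline{y_i}$ on $\mathbb{C}^n$, introduced in the examples below, and random sample vectors) checking the three axioms numerically:

```python
import numpy as np

rng = np.random.default_rng(0)

def ip(x, y):
    """Canonical inner product on C^n: linear in x, conjugate-linear in y."""
    return np.sum(x * np.conj(y))

n = 4
x, y, z = (rng.normal(size=n) + 1j * rng.normal(size=n) for _ in range(3))
alpha, beta = 2 - 1j, 0.5 + 3j

# 1. Linearity in the first argument
assert np.isclose(ip(alpha * x + beta * y, z), alpha * ip(x, z) + beta * ip(y, z))
# 2. Positivity: <x, x> is real and nonnegative (and zero only for x = 0)
assert abs(ip(x, x).imag) < 1e-12 and ip(x, x).real >= 0
# 3. Conjugate (Hermitian) symmetry
assert np.isclose(ip(x, y), np.conj(ip(y, x)))
print("all three axioms hold on this sample")
```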

One does not extend the inner product to vector spaces over a general field, mainly because $\langle x, x \rangle \geq 0$ has no meaning in a general field.

Terminology: a vector space with an inner product is called an inner product space. A real inner product space is also known as a Euclidean space, and a complex inner product space as a unitary space.

Properties of an inner product

1. The restriction of an inner product to a subspace is an inner product.
2. In any inner product space, we have $\langle x, \alpha y + \beta z \rangle = \overline{\alpha}\,\langle x, y \rangle + \overline{\beta}\,\langle x, z \rangle$ and $\langle 0, y \rangle = \langle x, 0 \rangle = 0$.
3. When the second argument is held fixed, the inner product is linear in the first argument. Similarly, when the first argument is held fixed, the inner product is conjugate-linear in the second argument.

Concept of length: Norm

The inner product combines the concepts of length and angle. We shall discuss the first concept, length.

Definition. A norm on a (real or complex) vector space $V$ is a map $x \mapsto \|x\|$ from $V$ to $\mathbb{R}$ satisfying the following three conditions:
1. $\|x\| \geq 0$, and $\|x\| = 0$ iff $x = 0$,
2. $\|\alpha x\| = |\alpha| \cdot \|x\|$,
3. $\|x + y\| \leq \|x\| + \|y\|$.

A vector space together with a norm on it is called a normed vector space, a normed linear space, or simply a normed space.

Each inner product induces a norm, defined by $\|x\| = \sqrt{\langle x, x \rangle}$.

Theorem. Every inner product space is a normed space.

In any inner product space, we have the following:
1. the length of the vector $x$ is $\|x\| = \sqrt{\langle x, x \rangle}$,
2. the distance between two vectors $x$ and $y$ is $\|x - y\| = \sqrt{\langle x - y, x - y \rangle}$.
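
A short numerical illustration (hypothetical vectors, standard inner product on $\mathbb{R}^n$) of the induced norm and distance:

```python
import numpy as np

def ip(x, y):
    return np.dot(x, y)                    # standard inner product on R^n

x = np.array([1.0, 2.0, 2.0])
y = np.array([4.0, 0.0, 3.0])

length   = np.sqrt(ip(x, x))               # ||x||
distance = np.sqrt(ip(x - y, x - y))       # ||x - y||

print(length)                              # 3.0
print(distance, np.linalg.norm(x - y))     # same value computed both ways
```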

Another concept: Angle

The angle between two nonzero vectors $x$ and $y$ is defined by the formula
$$\cos\theta = \frac{\langle x, y \rangle}{\|x\|\,\|y\|}.$$
It is understood that the angle $\theta$ should be chosen in the closed interval $[0, \pi]$.

The inner product combines the concepts of length and angle. We shall discuss an important special case of the second concept, viz., the angle between two vectors being $90^{\circ}$.

Law of Cosines: Verification in $\mathbb{R}^2$

We call upon the Law of Cosines from trigonometry, which asserts that in a triangle having sides $a, b, c$ and opposing angles $A, B, C$, the formula
$$c^2 = a^2 + b^2 - 2ab\cos C$$
holds. Create a triangle having sides $x$, $y$, and $x - y$. Then in the Law of Cosines let $C = \theta$, $a = \|x\|$, $b = \|y\|$, and $c = \|x - y\|$. This produces the equation
$$\|x - y\|^2 = \|x\|^2 + \|y\|^2 - 2\|x\|\,\|y\|\cos\theta.$$

Hence, expanding $\|x - y\|^2 = \langle x - y, x - y \rangle$, we obtain
$$\|x\|^2 - 2\langle x, y \rangle + \|y\|^2 = \|x\|^2 + \|y\|^2 - 2\|x\|\,\|y\|\cos\theta.$$
When this equation is simplified, we arrive at
$$\cos\theta = \frac{\langle x, y \rangle}{\|x\|\,\|y\|} = \frac{\langle x, y \rangle}{\sqrt{\langle x, x \rangle}\,\sqrt{\langle y, y \rangle}}.$$

Theorem (Cauchy-Schwarz Inequality). Let $X$ be an inner product space. Then
$$|\langle x, y \rangle|^2 \leq \langle x, x \rangle\,\langle y, y \rangle \quad \text{for all } x, y \in X.$$
Equality occurs iff $x$ and $y$ are linearly dependent.
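
An illustrative check (random complex vectors with the canonical inner product; not part of the slides) of the Cauchy-Schwarz inequality, and of the equality case for linearly dependent vectors:

```python
import numpy as np

rng = np.random.default_rng(1)
ip = lambda x, y: np.sum(x * np.conj(y))   # canonical inner product on C^n

x = rng.normal(size=5) + 1j * rng.normal(size=5)
y = rng.normal(size=5) + 1j * rng.normal(size=5)

lhs = abs(ip(x, y)) ** 2
rhs = (ip(x, x) * ip(y, y)).real
assert lhs <= rhs + 1e-12                  # Cauchy-Schwarz

z = (2 - 3j) * x                           # z and x are linearly dependent
assert np.isclose(abs(ip(x, z)) ** 2, (ip(x, x) * ip(z, z)).real)  # equality case
print("Cauchy-Schwarz verified on this sample")
```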

Examples of Inner Product Spaces

1. The space $\mathbb{K}^n$ of ordered $n$-tuples $(x_1, x_2, \ldots, x_n)$ of (real or complex) scalars is an inner product space with respect to the (canonical) inner product
$$\langle x, y \rangle = \sum_{i=1}^{n} x_i\,\overline{y_i}.$$
2. The space $\ell^2$ of all sequences $(x_n)_{n=1}^{\infty}$ of (real or complex) scalars such that $\sum_{n=1}^{\infty} |x_n|^2 < \infty$ is an inner product space with the inner product defined by
$$\langle x, y \rangle = \sum_{n=1}^{\infty} x_n\,\overline{y_n}.$$
3. Fix any finite subset $A$ of $\mathbb{R}$ with size $n$ and let $V = P_n$ over $\mathbb{R}$. Then
$$\langle p, q \rangle := \sum_{a \in A} p(a)\,q(a)$$
is an inner product on $V$.

Examples of Inner Product Spaces

1. The space $C[a, b]$ of all continuous scalar-valued functions on the interval $[a, b]$ is an inner product space with the inner product defined by
$$\langle f, g \rangle = \int_a^b f(x)\,\overline{g(x)}\,dx.$$
2. If $h \in C[a, b]$ is such that $h(t) > 0$ for all $t \in [a, b]$, then
$$\langle f, g \rangle = \int_a^b h(t)\,f(t)\,\overline{g(t)}\,dt$$
is also an inner product.
3. $\langle A, B \rangle = \operatorname{tr}(B^{*}A)$ is an inner product on $\mathbb{C}^{m \times n}$.
4. Let $V$ be the vector space of all real-valued random variables with mean $0$ and finite variance, defined on a fixed probability space. Let $F = \mathbb{R}$ and define $\langle x, y \rangle$ to be the covariance between $x$ and $y$.
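
A small sketch (hypothetical random matrices) of example 3, the trace inner product $\langle A, B \rangle = \operatorname{tr}(B^{*}A)$ on $\mathbb{C}^{m \times n}$, checked against the entrywise sum $\sum_{i,j} a_{ij}\overline{b_{ij}}$:

```python
import numpy as np

rng = np.random.default_rng(2)
A = rng.normal(size=(3, 2)) + 1j * rng.normal(size=(3, 2))
B = rng.normal(size=(3, 2)) + 1j * rng.normal(size=(3, 2))

trace_form     = np.trace(B.conj().T @ A)      # tr(B* A)
entrywise_form = np.sum(A * np.conj(B))        # sum_ij a_ij * conj(b_ij)

assert np.isclose(trace_form, entrywise_form)
assert np.trace(A.conj().T @ A).real >= 0      # positivity of <A, A>
print(trace_form)
```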

Exercises

1. Prove that $\langle x, y \rangle = y^T x$ and $\langle x, y \rangle = x^{*} y$ are not inner products on $\mathbb{C}^n$.
2. In $\mathbb{C}^{n \times n}$, verify that
$$\langle A, B \rangle = \sum_{i=1}^{n} a_{ii}\,\overline{b_{ii}}$$
is not an inner product. Which of the axioms are violated?

Inner product associated with a matrix

Let $V$ be an inner product space over $\mathbb{K}$ and $B = \{u_1, u_2, \ldots, u_n\}$ a basis of $V$. Let $\alpha = (\alpha_1, \alpha_2, \ldots, \alpha_n)^T$ and $\beta = (\beta_1, \beta_2, \ldots, \beta_n)^T$ be the coordinate vectors of $x$ and $y$ respectively with respect to $B$, and let $A = (a_{ij})$, where $a_{ij} = \langle u_j, u_i \rangle$. Then
$$\langle x, y \rangle = \Big\langle \sum_i \alpha_i u_i,\; \sum_j \beta_j u_j \Big\rangle = \sum_{i,j} \overline{\beta_j}\, a_{ji}\, \alpha_i = \beta^{*} A \alpha. \tag{1}$$
The matrix $A$ will satisfy the following conditions:
1. $A^{*} = A$,
2. $\alpha^{*} A \alpha \geq 0$ for all $\alpha \in \mathbb{K}^n$,
3. if $\alpha^{*} A \alpha = 0$ then $\alpha = 0$.

Matrix associated with an inner product

Conversely, if $A$ is a matrix satisfying the above three conditions, then $\langle \cdot, \cdot \rangle$ defined by (1) is an inner product on $V$.

Suppose $A = B^{*}B$, where $B$ is a matrix with $n$ columns and rank $n$. Then
$$\langle x, y \rangle = y^{*} B^{*} B x$$
is an inner product, because
$$\overline{\langle y, x \rangle} = \overline{x^{*} B^{*} B y} = (x^{*} B^{*} B y)^{*} = y^{*} B^{*} B x = \langle x, y \rangle,$$
and
$$\langle x, x \rangle = (Bx)^{*}(Bx) \geq 0.$$
If $\langle x, x \rangle = 0$ then $Bx = 0$, and so $x = 0$ since $B$ has rank $n$.

We shall later show that any matrix $A$ satisfying the above three conditions can be written as $B^{*}B$ for some non-singular $B$.
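
A numerical sketch (random full-rank $B$ of hypothetical size) of the construction $A = B^{*}B$ and the inner product $\langle x, y \rangle = y^{*}Ax$:

```python
import numpy as np

rng = np.random.default_rng(3)
n = 4
B = rng.normal(size=(n, n)) + 1j * rng.normal(size=(n, n))  # non-singular with probability 1
A = B.conj().T @ B                                           # A = B* B

ip = lambda x, y: y.conj() @ A @ x                           # <x, y> = y* A x

x = rng.normal(size=n) + 1j * rng.normal(size=n)
y = rng.normal(size=n) + 1j * rng.normal(size=n)

assert np.allclose(A, A.conj().T)                # A* = A
assert np.isclose(ip(x, y), np.conj(ip(y, x)))   # conjugate symmetry
assert ip(x, x).real > 0                         # positivity for x != 0
print("A = B*B defines an inner product on this sample")
```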

Orthogonality

Let $V$ be an inner product space, $x, y \in V$, and let $A, B$ be subsets of $V$.
- $x$ and $y$ are orthogonal to each other (we write $x \perp y$) if $\langle x, y \rangle = 0$.
- $A$ is orthogonal if $x \perp y$ for every pair of distinct vectors $x, y$ in $A$.
- $A$ is orthonormal if $A$ is orthogonal and every vector in $A$ has norm 1.
- $A$ is orthogonal to $B$ if every vector in $A$ is orthogonal to every vector in $B$.

Properties of Orthogonality
- $x \perp y \iff y \perp x$.
- $0 \perp x$ for all $x$.
- $x \perp x \iff x = 0$.
- If $x \perp y$ and $x \perp z$, then $x \perp (\alpha y + \beta z)$ for any $\alpha, \beta \in \mathbb{K}$.
- The empty set is orthonormal (in a vacuous sense).

Exercises

A set of vectors is orthogonal iff its elements are pairwise orthogonal. Is the corresponding statement for linear independence true? (Linear independence is a property of the entire set, whereas orthogonality is a property of pairs.)

1. Any orthogonal set $A$ not containing the null vector is linearly independent.
2. Any orthonormal set is linearly independent.
3. If the subspaces $S_1, S_2, \ldots, S_k$ are orthogonal to one another, then the sum $S_1 + S_2 + \cdots + S_k$ is direct.

Orthogonal Complement

Definition. The orthogonal complement of a set $S$ in an inner product space is the set $\{x : x \perp s \text{ for all } s \in S\}$. The orthogonal complement of $S$ is denoted by $S^{\perp}$.

Theorem. In an inner product space, the orthogonal complement of any subset is a subspace.

Theorem. In an inner product space, the orthogonal complement of a set is the same as the orthogonal complement of its span.

Theorem. If $M$ is a subspace of an $n$-dimensional inner product space, then $\dim(M) + \dim(M^{\perp}) = n$.

Theorem. Let $M$ and $N$ be subsets of an inner product space. If $M \subseteq N$, then $N^{\perp} \subseteq M^{\perp}$.

Theorem. If $M$ is a finite-dimensional subspace of an inner product space $V$, then $V$ is the direct sum of $M$ and $M^{\perp}$:
$$V = M + M^{\perp} \quad \text{and} \quad M \cap M^{\perp} = \{0\}.$$
Moreover, $(M^{\perp})^{\perp} = M$.
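
An illustrative computation (hypothetical subspace of $\mathbb{R}^5$ spanned by random vectors, standard inner product, assuming SciPy is available): the orthogonal complement of $M = \operatorname{col}(A)$ is the null space of $A^T$, and the dimensions add up to $n$:

```python
import numpy as np
from scipy.linalg import null_space, orth

rng = np.random.default_rng(4)
n = 5
A = rng.normal(size=(n, 2))       # M = column space of A, a subspace of R^5

M_basis     = orth(A)             # orthonormal basis of M
Mperp_basis = null_space(A.T)     # orthonormal basis of M^perp = null(A^T)

assert M_basis.shape[1] + Mperp_basis.shape[1] == n   # dim M + dim M^perp = n
assert np.allclose(M_basis.T @ Mperp_basis, 0)        # M is orthogonal to M^perp
print(M_basis.shape[1], Mperp_basis.shape[1])         # 2 3
```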

Pythagoras' theorem

In a real inner product space, if $x \perp y$, then $\|x + y\|^2 = \|x\|^2 + \|y\|^2$. The converse is true for real inner product spaces but not for complex inner product spaces.

More generally,
$$\Big\| \sum_{i=1}^{k} x_i \Big\|^2 = \sum_{i=1}^{k} \|x_i\|^2 \quad \text{if } \{x_1, x_2, \ldots, x_k\} \text{ is orthogonal}.$$
The converse is not true for either real or complex inner product spaces.

System corresponding to an orthonormal basis

Definition. Let $S$ be a subspace of an inner product space. We say that $B$ is an orthogonal basis (resp. an orthonormal basis) of $S$ if $B$ is a basis of $S$ and $B$ is an orthogonal (resp. an orthonormal) set.

We have seen that a basis corresponds to a coordinate system. An orthonormal basis corresponds to a system of rectangular coordinates in which the reference point on each axis is at unit distance from the origin.

For a given orthonormal basis, finding the coordinates with respect to such a coordinate system is easy, as shown in the following.

Theorem. Let $B = \{x_1, x_2, \ldots, x_n\}$ be an orthonormal basis of an inner product space $V$. Then for any $x \in V$, we have
$$x = \sum_{j=1}^{n} \alpha_j x_j, \quad \text{where } \alpha_j = \langle x, x_j \rangle.$$
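
A short sketch (orthonormal basis obtained from a QR factorization of a random matrix; hypothetical data) of recovering coordinates via $\alpha_j = \langle x, x_j \rangle$:

```python
import numpy as np

rng = np.random.default_rng(5)
n = 4
Q, _ = np.linalg.qr(rng.normal(size=(n, n)))   # columns of Q: an orthonormal basis of R^4

x = rng.normal(size=n)
alpha = Q.T @ x                                # alpha_j = <x, x_j> for the standard inner product

x_rebuilt = Q @ alpha                          # sum_j alpha_j x_j
assert np.allclose(x, x_rebuilt)
print(alpha)
```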

Exercises

Let $x_1, x_2, \ldots, x_k$ form an orthonormal set.
1. Show that $\big\| \sum_{i=1}^{k} \alpha_i x_i \big\|^2 = \sum_{i=1}^{k} |\alpha_i|^2$.
2. If $z$ is the residual of $x$ on $\{x_1, x_2, \ldots, x_k\}$, show that
$$\|z\|^2 = \Big\| x - \sum_{i=1}^{k} \langle x, x_i \rangle x_i \Big\|^2 = \|x\|^2 - \sum_{i=1}^{k} |\langle x, x_i \rangle|^2.$$
3. Bessel's inequality:
$$\|x\|^2 \geq \sum_{i=1}^{k} |\langle x, x_i \rangle|^2 \quad \text{for any } x.$$
Show also that equality holds iff $x \in \operatorname{Sp}(\{x_1, x_2, \ldots, x_k\})$.

Exercises

Let $B = \{x_1, x_2, \ldots, x_k\}$ be an orthonormal set in a finite-dimensional inner product space $V$. Show that the following statements are equivalent:
1. $B$ is an orthonormal basis (maximal),
2. $\langle x, x_i \rangle = 0$ for $i = 1, 2, \ldots, k$ implies $x = 0$,
3. $B$ generates $V$,
4. if $x \in V$ then $x = \sum_{i=1}^{k} \langle x, x_i \rangle x_i$,
5. if $x, y \in V$ then $\langle x, y \rangle = \sum_{i=1}^{k} \langle x, x_i \rangle \langle x_i, y \rangle$,
6. if $x \in V$ then $\|x\|^2 = \sum_{i=1}^{k} |\langle x, x_i \rangle|^2$.

Orthogonal Projection

One vector $x$ can be projected orthogonally onto another vector $y$, provided that $y$ is not zero. The idea is that the projection of $x$ onto $y$ should be a scalar multiple of $y$, say $\alpha y$, such that $x - \alpha y$ is orthogonal to $y$. What is the correct value of $\alpha$?

Theorem. In any inner product space, the orthogonal projection of a vector $x$ onto a nonzero vector $y$ is the point
$$p = \frac{\langle x, y \rangle}{\langle y, y \rangle}\, y.$$
It has the property that $x - p$ is orthogonal to $y$. Thus, $x$ is split into an orthogonal pair of vectors in the equation $x = p + (x - p)$.
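
A minimal sketch (hypothetical vectors in $\mathbb{R}^3$, standard inner product) of projecting $x$ onto $y$ and checking that the residual is orthogonal to $y$:

```python
import numpy as np

x = np.array([2.0, 1.0, 3.0])
y = np.array([1.0, 1.0, 0.0])

p = (np.dot(x, y) / np.dot(y, y)) * y    # p = (<x,y> / <y,y>) y

assert np.isclose(np.dot(x - p, y), 0)   # x - p is orthogonal to y
print(p)                                 # [1.5 1.5 0. ]
```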

Construction of the orthogonal projection matrix

Notice that our concept of projecting $x$ onto $y$ does not depend on the magnitude of the vector $y$. Actually, the formula can be remembered more easily as $p = \langle x, v \rangle v$, where $v$ is the normalized $y$; that is, $v = y/\|y\|$.

The calculation of an orthogonal projection can be carried out in several different ways. For example, we can begin with the point $z$ that is to be projected and the matrix $U$ whose columns are the vectors $u_i$ spanning the subspace $W$ onto which we project. The point $p$ that we seek is a linear combination of the columns of $U$ and is therefore of the form $p = Uc$ for some unknown vector $c$ in $\mathbb{R}^n$.

The orthogonality condition is that $z - p$ should be orthogonal to all the columns of $U$, or in other terms,
$$(z - p)^T U = 0.$$
Since $p = Uc$, this last equation becomes
$$(z - Uc)^T U = 0 \iff U^T(z - Uc) = 0 \iff U^T U c = U^T z.$$
We shall see later how a well-chosen basis of $W$ will lead to $U^T U = I$. In this case we obtain $c = U^T z$.
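
A numerical sketch (random $U$ with independent columns; hypothetical data) of solving the normal equations $U^T U c = U^T z$ and checking the orthogonality of the residual:

```python
import numpy as np

rng = np.random.default_rng(6)
U = rng.normal(size=(5, 2))              # columns span the subspace W of R^5
z = rng.normal(size=5)

c = np.linalg.solve(U.T @ U, U.T @ z)    # normal equations: U^T U c = U^T z
p = U @ c                                # orthogonal projection of z onto W

assert np.allclose(U.T @ (z - p), 0)     # z - p is orthogonal to every column of U

# With an orthonormal basis Q of W, U^T U = I and the projection is simply Q Q^T z.
Q, _ = np.linalg.qr(U)
assert np.allclose(p, Q @ (Q.T @ z))
```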

Suppose we are working in the space $\mathbb{R}^n$, and we have an orthonormal set of $n$ vectors, $u_1, u_2, \ldots, u_n$. Put them into a matrix $U$ as columns. The resulting matrix is square, and this property is crucial. The orthonormality now gives us the equation $U^T U = I$. Such a matrix $U$ is said to be orthogonal. It is obviously invertible, as $U^T$ is its inverse. Since $U$ is square, we also have $UU^T = I$, so the rows of $U$ also form an orthonormal set of vectors! This is an impressive bit of magic.

Definition. A real matrix $U$ is orthogonal if $UU^T = U^T U = I$. A complex matrix $U$ is unitary if $UU^H = U^H U = I$.
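
A quick check (orthogonal matrix obtained from a QR factorization of a random matrix; illustrative only) that both the columns and the rows are orthonormal:

```python
import numpy as np

rng = np.random.default_rng(7)
U, _ = np.linalg.qr(rng.normal(size=(4, 4)))   # U is orthogonal

I = np.eye(4)
assert np.allclose(U.T @ U, I)                 # columns are orthonormal
assert np.allclose(U @ U.T, I)                 # rows are orthonormal too, since U is square
assert np.allclose(np.linalg.inv(U), U.T)      # the inverse of U is U^T
print("U is orthogonal")
```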

Theorem. Let $\{u_1, u_2, \ldots, u_n\}$ be an orthonormal basis for a subspace $U$ of an inner product space. The orthogonal projection of any $x$ onto $U$ is the point
$$p = \sum_{i=1}^{n} \langle x, u_i \rangle u_i.$$

Theorem. In order that a vector be orthogonal to a subspace (in an inner product space), it is sufficient that the vector be orthogonal to each member of a set that spans the subspace.

Exercises

1. Let $z$ be a fixed nonnull vector in the plane. What is the locus of the point $x$ such that $\langle x, z \rangle = 0$? What happens if $0$ is replaced by a non-zero scalar?
2. If $x_1, x_2, y_1, y_2$ are real numbers, show that
$$(x_1 x_2 + y_1 y_2)^2 \leq (x_1^2 + y_1^2)(x_2^2 + y_2^2).$$
Hence deduce that $PQ + QR \geq PR$ for any three points $P$, $Q$ and $R$ in the plane.
3. Let $A = \{x_1, x_2, \ldots, x_k\}$ be an orthogonal set (not necessarily a basis) of non-null vectors in $V$. Then for any $x \in V$, we call
$$z := x - \sum_{j=1}^{k} \frac{\langle x, x_j \rangle}{\langle x_j, x_j \rangle}\, x_j$$
the residual of $x$ with respect to $A$. Prove that the residual $z$ is orthogonal to each $x_i$.

Application: Work and Forces

Let the vector $f$ be the force exerted on an object, let the vector $d$ be the displacement caused by the force, and let $\theta$ be the angle between $f$ and $d$. For example, suppose we are pulling a heavy load on a dolly with a constant force so that it moves horizontally along the ground. The work done in moving the dolly through a distance $\|d\|$ is given by the distance moved multiplied by the magnitude of the component of the force in the direction of motion.

Application: Work and Forces

The component of $f$ in the direction of $d$ is $\|f\|\cos\theta$. By the definition, the work accomplished is
$$W = \|f\|\,\|d\|\cos\theta = \langle f, d \rangle.$$
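
A tiny numerical illustration (made-up force and displacement vectors) that $\|f\|\,\|d\|\cos\theta$ and $\langle f, d \rangle$ give the same work:

```python
import numpy as np

f = np.array([10.0, 5.0])            # force, pulling at an angle above the horizontal
d = np.array([4.0, 0.0])             # horizontal displacement

theta = np.arctan2(f[1], f[0])       # angle between f and the horizontal direction d

work_geometric = np.linalg.norm(f) * np.linalg.norm(d) * np.cos(theta)
work_inner     = f @ d

print(work_geometric, work_inner)    # both 40.0
```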

Application: Collision

The Law of Cosines can be applied to determine the final location of a ball after a glancing collision with a wall, as shown in the figure. Let $u = (u_1, u_2)$ be the initial position, $v = (v_1, v_2)$ be the final position, and $u - v$ be the change in position as a result of the collision. From the Law of Cosines, it follows that the magnitude of the change in position satisfies
$$\|u - v\|^2 = \|u\|^2 + \|v\|^2 - 2\|u\|\,\|v\|\cos\varphi.$$

Application: Collision

From the expression
$$\|u - v\|^2 = \|u\|^2 + \|v\|^2 - 2\|u\|\,\|v\|\cos\varphi,$$
we can obtain
$$\langle u, v \rangle = \|u\|\,\|v\|\cos\varphi,$$
which gives a connection between the inner product of the vectors $u$ and $v$ and the angle $\varphi$ between them.
