Linear Algebra. Paul Yiu. Department of Mathematics, Florida Atlantic University. Fall 2011. 6A: Inner products


6A: Inner products

In this chapter, the field $F = \mathbb{R}$ or $\mathbb{C}$. We regard $F$ as equipped with a conjugation $\chi : F \to F$. If $F = \mathbb{R}$, $\chi$ is simply the identity. If $F = \mathbb{C}$, $\chi(\lambda) = \bar{\lambda}$ is complex conjugation. In both cases, we write $\chi(\lambda) = \bar{\lambda}$.

Inner product

Let $V$ be a vector space over $F$. An inner product on $V$ is a positive definite, conjugate-symmetric, sesquilinear form $\langle \cdot, \cdot \rangle : V \times V \to F$:

Conjugate symmetry: $\langle x, y \rangle = \overline{\langle y, x \rangle}$ for $x, y \in V$.

Sesquilinearity: (i) $\langle \lambda x + \lambda' x', y \rangle = \lambda \langle x, y \rangle + \lambda' \langle x', y \rangle$; (ii) $\langle x, \mu y + \mu' y' \rangle = \bar{\mu} \langle x, y \rangle + \bar{\mu}' \langle x, y' \rangle$. Note that (ii) follows from (i) and conjugate symmetry.

Positive definiteness: $\langle x, x \rangle \geq 0$ for every $x \in V$, and equality holds if and only if $x = 0$.

Standard inner product on $\mathbb{R}^n$

Consider $\mathbb{R}^n$ with the standard basis $\mathcal{B} = \{e_1, e_2, \ldots, e_n\}$ (where $e_i$ has $1$ in its $i$-th position, and $0$ elsewhere). The standard inner product on $\mathbb{R}^n$ is defined by
$$\left\langle \sum_{i=1}^n a_i e_i, \; \sum_{i=1}^n b_i e_i \right\rangle = \sum_{i=1}^n a_i b_i.$$
For $x = \sum_{i=1}^n x_i e_i$,
$$\|x\|^2 = \langle x, x \rangle = \sum_{i=1}^n x_i^2.$$

Unitary space $\mathbb{C}^n$

Consider $\mathbb{C}^n$ with the standard basis $\mathcal{B} = \{e_1, e_2, \ldots, e_n\}$ (where $e_i$ has $1$ in its $i$-th position, and $0$ elsewhere). The Hermitian inner product on $\mathbb{C}^n$ is defined by
$$\left\langle \sum_{i=1}^n a_i e_i, \; \sum_{i=1}^n b_i e_i \right\rangle = \sum_{i=1}^n a_i \bar{b}_i.$$
For $x = \sum_{i=1}^n x_i e_i$,
$$\|x\|^2 = \langle x, x \rangle = \sum_{i=1}^n x_i \bar{x}_i = \sum_{i=1}^n |x_i|^2.$$
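As a concrete check of the two formulas above, here is a minimal NumPy sketch (the vectors are illustrative, not from the notes; note that NumPy's `vdot` conjugates its first argument, while the convention here conjugates the second):

```python
import numpy as np

# Standard inner product on R^n: <x, y> = sum_i x_i y_i
x = np.array([1.0, 2.0, 3.0])
y = np.array([4.0, -1.0, 2.0])
print(np.dot(x, y))            # 1*4 + 2*(-1) + 3*2 = 8.0
print(np.dot(x, x))            # ||x||^2 = 1 + 4 + 9 = 14.0

# Hermitian inner product on C^n: <a, b> = sum_i a_i * conj(b_i)
a = np.array([1 + 1j, 2 - 1j])
b = np.array([3j, 1 + 2j])
# np.vdot computes sum_i conj(a_i) * b_i, the conjugate of our convention:
print(np.conj(np.vdot(a, b)))  # <a, b> under the convention above
print(np.vdot(a, a).real)      # ||a||^2 = |1+1j|^2 + |2-1j|^2 = 7.0
```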

Examples

(1) The space $C[a, b]$ of continuous complex-valued functions on a closed interval $[a, b]$, with inner product
$$\langle f, g \rangle = \int_a^b f(x)\, \overline{g(x)}\, dx.$$

(2) The space $\ell^2$. Let $F = \mathbb{R}$ or $\mathbb{C}$. An infinite sequence $(x_n)$, $n = 0, 1, 2, \ldots$, in $F$ is square summable if $\sum_{n=0}^\infty |x_n|^2 < \infty$ (convergent). The square summable sequences form the space $\ell^2$, with inner product
$$\langle (x_n), (y_n) \rangle = \sum_{n=0}^\infty x_n \bar{y}_n.$$
This series converges absolutely by the comparison test, since $(|x_n| - |y_n|)^2 \geq 0$ gives
$$|x_n \bar{y}_n| = |x_n|\,|y_n| \leq \tfrac{1}{2}\left(|x_n|^2 + |y_n|^2\right).$$
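The $C[a, b]$ inner product can be explored numerically. Below is a minimal sketch that approximates $\langle f, g \rangle$ by a midpoint Riemann sum; the helper name `l2_inner` and the grid size are illustrative choices, not part of the notes:

```python
import numpy as np

def l2_inner(f, g, a, b, n=100_000):
    """Approximate <f, g> = integral_a^b f(x) * conj(g(x)) dx by a midpoint sum."""
    h = (b - a) / n
    x = a + h * (np.arange(n) + 0.5)   # midpoints of n equal subintervals
    return np.sum(f(x) * np.conj(g(x))) * h

# sin and cos are orthogonal on [-pi, pi]; each has squared norm pi.
print(l2_inner(np.sin, np.cos, -np.pi, np.pi))  # ~ 0
print(l2_inner(np.sin, np.sin, -np.pi, np.pi))  # ~ pi
```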

Norms

The norm of $x \in V$ is defined by $\|x\| = \sqrt{\langle x, x \rangle}$. A vector $x \in V$ is a unit vector if $\|x\| = 1$. A nonzero vector $x$ in a positive definite inner product space can be normalized into a unit vector:
$$\tilde{x} := \frac{x}{\sqrt{\langle x, x \rangle}},$$
i.e., $\|\tilde{x}\| = 1$.
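In NumPy the same normalization is one line (`np.linalg.norm` computes the Euclidean norm $\sqrt{\langle x, x \rangle}$; the vector is an arbitrary example):

```python
import numpy as np

x = np.array([3.0, 4.0])
x_unit = x / np.linalg.norm(x)   # tilde-x = x / ||x||
print(x_unit)                    # [0.6 0.8]
print(np.linalg.norm(x_unit))    # 1.0
```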

Some inequalities

Recall that the norm of $x \in V$ is defined by $\|x\| = \sqrt{\langle x, x \rangle}$.

(1) $\|x\| \geq 0$, and $\|x\| = 0$ if and only if $x = 0$.

(2) $\|\lambda x\| = |\lambda| \, \|x\|$ for $\lambda \in F$ and $x \in V$.

(3) Cauchy-Schwarz inequality: $|\langle x, y \rangle| \leq \|x\| \, \|y\|$. Equality holds if and only if $x$ and $y$ are linearly dependent.

(4) Triangle inequality: $\|x + y\| \leq \|x\| + \|y\|$. Equality holds if and only if one of $x$, $y$ is a nonnegative real multiple of the other.

(5) Parallelogram law: $\|x + y\|^2 + \|x - y\|^2 = 2\left(\|x\|^2 + \|y\|^2\right)$.
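A quick numerical sanity check of (3)-(5) on random real vectors (a sketch; the seed and dimension are arbitrary):

```python
import numpy as np

rng = np.random.default_rng(0)
x = rng.standard_normal(5)
y = rng.standard_normal(5)
nx, ny = np.linalg.norm(x), np.linalg.norm(y)

# (3) Cauchy-Schwarz: |<x, y>| <= ||x|| ||y||
print(abs(np.dot(x, y)) <= nx * ny)                      # True
# (4) Triangle inequality: ||x + y|| <= ||x|| + ||y||
print(np.linalg.norm(x + y) <= nx + ny)                  # True
# (5) Parallelogram law, an equality (up to floating-point rounding)
lhs = np.linalg.norm(x + y)**2 + np.linalg.norm(x - y)**2
print(np.isclose(lhs, 2 * (nx**2 + ny**2)))              # True
```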

Cauchy-Schwarz inequality

Proof. We may assume $x$ and $y$ nonzero. For any $\lambda \in F$,
$$0 \leq \|x - \lambda y\|^2 = \langle x - \lambda y, \; x - \lambda y \rangle = \|x\|^2 - \bar{\lambda}\langle x, y \rangle - \lambda\left(\langle y, x \rangle - \bar{\lambda}\,\|y\|^2\right).$$
By choosing $\lambda = \dfrac{\langle x, y \rangle}{\|y\|^2}$, so that $\bar{\lambda} = \dfrac{\langle y, x \rangle}{\|y\|^2}$ and the bracket vanishes, this becomes
$$0 \leq \|x\|^2 - \frac{\langle y, x \rangle}{\|y\|^2}\,\langle x, y \rangle = \frac{\|x\|^2\,\|y\|^2 - |\langle x, y \rangle|^2}{\|y\|^2}.$$
Therefore, $|\langle x, y \rangle| \leq \|x\|\,\|y\|$. Clearly, equality holds if and only if $x - \lambda y = 0$ for some $\lambda \in F$, i.e., $x$ and $y$ are linearly dependent.

Polarization identities

(1) The real case:
$$\langle x, y \rangle = \tfrac{1}{4}\left(\|x + y\|^2 - \|x - y\|^2\right).$$
(2) The complex case:
$$\langle x, y \rangle = \tfrac{1}{4}\left(\|x + y\|^2 - \|x - y\|^2\right) + \tfrac{i}{4}\left(\|x + iy\|^2 - \|x - iy\|^2\right).$$
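The complex identity can be verified numerically. A minimal sketch for the inner product $\langle x, y \rangle = \sum_i x_i \bar{y}_i$ on $\mathbb{C}^4$, with randomly chosen vectors:

```python
import numpy as np

rng = np.random.default_rng(1)
x = rng.standard_normal(4) + 1j * rng.standard_normal(4)
y = rng.standard_normal(4) + 1j * rng.standard_normal(4)

def sq(v):
    """Squared norm ||v||^2 = <v, v>."""
    return np.vdot(v, v).real

# Recover <x, y> from norms alone via the complex polarization identity.
recovered = (sq(x + y) - sq(x - y)) / 4 + 1j * (sq(x + 1j*y) - sq(x - 1j*y)) / 4
print(np.isclose(recovered, np.dot(x, np.conj(y))))  # True
```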

6B: Orthogonality

Orthogonality

Let $V$ be an inner product space. Two vectors $x, y \in V$ are orthogonal if $\langle x, y \rangle = 0$. Notation: $x \perp y$.

Orthogonal $\Longrightarrow$ linearly independent

Proposition. Mutually orthogonal nonzero vectors are linearly independent.

Proof. Let $u_1, \ldots, u_s$ be mutually orthogonal nonzero vectors in an inner product space $V$. Suppose $\sum_{i=1}^s \lambda_i u_i = 0$. Then for each $j = 1, \ldots, s$,
$$0 = \left\langle \sum_{i=1}^s \lambda_i u_i, \; u_j \right\rangle = \sum_{i=1}^s \lambda_i \langle u_i, u_j \rangle = \lambda_j \langle u_j, u_j \rangle,$$
so $\lambda_j = 0$ since $\langle u_j, u_j \rangle \neq 0$. Therefore, the vectors are linearly independent.

Orthogonal projection of a vector onto a subspace

Let $V$ be a positive definite inner product space, and $W \subseteq V$ a subspace. For $x \in V$, the orthogonal projection of $x$ onto $W$ is the vector $y \in W$ for which $x - y \perp x'$ for every $x' \in W$, i.e., $\langle x', x \rangle = \langle x', y \rangle$ for every $x' \in W$. To find $y$, let $\mathcal{B} = \{u_1, \ldots, u_s\}$ be a basis of $W$, and write $y = \sum_{i=1}^s a_i u_i$. Equivalently, $\langle x - y, u_i \rangle = 0$, so we require, for each $i = 1, \ldots, s$,
$$\langle x, u_i \rangle = \left\langle \sum_{j=1}^s a_j u_j, \; u_i \right\rangle = \sum_{j=1}^s a_j \langle u_j, u_i \rangle.$$
This amounts to solving the system of linear equations
$$M \begin{pmatrix} a_1 \\ \vdots \\ a_s \end{pmatrix} = \begin{pmatrix} \langle x, u_1 \rangle \\ \vdots \\ \langle x, u_s \rangle \end{pmatrix},$$
where $M$ is the Gram matrix with $(i, j)$ entry $\langle u_j, u_i \rangle$. We shall justify below that $M$ is nonsingular. Granting this, the above system has a unique solution $a_1, \ldots, a_s$. This gives the orthogonal projection $y$ of $x$ onto $W$.
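The linear system above translates directly into code. A minimal real-case sketch (the function name `project_onto_subspace` is illustrative, not from the notes):

```python
import numpy as np

def project_onto_subspace(x, basis):
    """Orthogonal projection of x onto span(basis), by solving the Gram system M a = b."""
    U = np.column_stack(basis)   # columns u_1, ..., u_s
    M = U.T @ U                  # Gram matrix: M[i, j] = <u_j, u_i> (real case)
    b = U.T @ x                  # right-hand side: b[i] = <x, u_i>
    a = np.linalg.solve(M, b)    # unique solution, since M is nonsingular
    return U @ a

u1 = np.array([1.0, 1.0, 0.0])
u2 = np.array([0.0, 1.0, 1.0])
x = np.array([1.0, 2.0, 3.0])
y = project_onto_subspace(x, [u1, u2])
# the residual x - y is orthogonal to every basis vector of W:
print(np.allclose([np.dot(x - y, u1), np.dot(x - y, u2)], 0))  # True
```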

$(\langle u_i, u_j \rangle)$ is a nonsingular matrix

Proposition. Let $\{u_1, \ldots, u_n\}$ be a basis of a positive definite inner product space. The matrix $M = (\langle u_i, u_j \rangle)$ is nonsingular.

Proof. Suppose, for a contradiction, that $M$ is singular. Then there exist $b_1, \ldots, b_n \in F$, not all zero, such that
$$M \begin{pmatrix} b_1 \\ \vdots \\ b_n \end{pmatrix} = \begin{pmatrix} 0 \\ \vdots \\ 0 \end{pmatrix}.$$
Let $y = \sum_{j=1}^n \bar{b}_j u_j$ (in the real case, simply $y = \sum_j b_j u_j$). Then $\langle u_i, y \rangle = \sum_{j=1}^n b_j \langle u_i, u_j \rangle = 0$ for each $i$, and it follows that $\langle x, y \rangle = 0$ for every $x \in V$. This is clearly impossible, since $\langle y, y \rangle > 0$.

Orthogonal complement

Let $V$ be an inner product space over $F$, and $W$ a subspace of $V$. The orthogonal complement of $W$ is the subspace
$$W^\perp := \{x \in V : \langle x, y \rangle = 0 \text{ for every } y \in W\}.$$
Theorem. $V = W \oplus W^\perp$.

Orthogonal decomposition

Theorem. $V = W \oplus W^\perp$.

Proof. (1) $W \cap W^\perp = \{0\}$: any $x \in W \cap W^\perp$ satisfies $\langle x, x \rangle = 0$, so $x = 0$. (2) For every $x \in V$, $x = y + (x - y)$, where $y$ is the orthogonal projection of $x$ onto $W$, and $x - y \in W^\perp$. Therefore, we have an orthogonal direct sum decomposition $V = W \oplus W^\perp$.

Orthogonal decomposition: infinite dimensional case

A decomposition $V = W \oplus W^\perp$ may not hold if $V$ is infinite dimensional.

Example. Let $V = \ell^2$, and let $W$ be the span of the vectors $e_n = (0, \ldots, 0, 1, 0, \ldots)$, $n = 0, 1, 2, \ldots$, where $e_n$ has $1$ in its $n$-th position and $0$ elsewhere. The sequence $e_0, e_1, \ldots, e_n, \ldots$ is orthonormal. Note that $W$ is a proper subspace of $\ell^2$, since each element of $W$, being a finite linear combination of $e_0, e_1, \ldots$, has only finitely many nonzero terms. If $x = (x_n) \in W^\perp$, then for each integer $n$, $0 = \langle x, e_n \rangle = x_n$. This shows that $x = 0$, and $W^\perp = \{0\}$. Clearly, $\ell^2 \neq W = W \oplus W^\perp$.

Orthogonal and orthonormal bases

Let $V$ be an inner product space. An orthogonal basis of $V$ is a basis consisting of mutually orthogonal vectors, i.e., a basis $\mathcal{B} = \{u_1, \ldots, u_n\}$ of $V$ satisfying $\langle u_i, u_j \rangle = 0$ for distinct $i, j = 1, \ldots, n$. An orthonormal basis is an orthogonal basis of unit vectors, i.e., a basis $\mathcal{B} = \{u_1, \ldots, u_n\}$ of $V$ satisfying $\langle u_i, u_j \rangle = \delta_{ij}$ for $i, j = 1, \ldots, n$.

Existence of orthogonal basis

Theorem. A finite dimensional positive definite inner product space contains an orthogonal basis.

Proof. (Induction on dimension.) This is clearly true for 1-dimensional inner product spaces: every nonzero vector constitutes an orthogonal basis. Assume it is true for all inner product spaces of dimension $< n$, and let $V$ be an inner product space of dimension $n$. Choose a nonzero $v \in V$. Then $V = \mathrm{Span}(v) \oplus v^\perp$, and $\dim_F v^\perp = n - 1$. By the inductive hypothesis, $v^\perp$ contains an orthogonal basis $\mathcal{B}$. Then $\{v\} \cup \mathcal{B}$ is an orthogonal basis of $V$.

Corollary. A finite dimensional positive definite inner product space contains an orthonormal basis.

6C: Gram-Schmidt orthogonalization

In this section, $V$ is a positive definite inner product space over $\mathbb{R}$ or $\mathbb{C}$.

Orthogonal projection onto a vector

Lemma. The orthogonal projection of $u \in V$ along $v \neq 0$ is the vector
$$\frac{\langle u, v \rangle}{\langle v, v \rangle}\, v.$$
Proof. Write the orthogonal projection as $av$ for some $a \in F$. We require $\langle u - av, v \rangle = 0$. From this, $\langle u, v \rangle - a\langle v, v \rangle = 0$, and $a = \dfrac{\langle u, v \rangle}{\langle v, v \rangle}$.
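In code, the lemma is a one-line formula. A small real-case sketch (the helper name `project_along` is illustrative):

```python
import numpy as np

def project_along(u, v):
    """Orthogonal projection of u along a nonzero v: (<u, v> / <v, v>) * v."""
    return (np.dot(u, v) / np.dot(v, v)) * v

u = np.array([2.0, 3.0])
v = np.array([1.0, 0.0])
p = project_along(u, v)
print(p)                     # [2. 0.]
print(np.dot(u - p, v))      # 0.0: the residual u - av is orthogonal to v
```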

Orthogonal projection onto a subspace (again)

(1) Let $W \subseteq V$ be a subspace of $V$, with an orthogonal basis $u_1, \ldots, u_s$. For $x \in V$, the orthogonal projection of $x$ onto $W$ is the vector $\hat{x} \in W$ such that $x - \hat{x} \perp w$ for every $w \in W$.

[Figure: $x$, its projection $\hat{x}$ onto $W$, and the orthogonal residual $x - \hat{x}$.]

Write $\hat{x} = \sum_{i=1}^s \lambda_i u_i$. For each $i = 1, 2, \ldots, s$, we require $\langle x - \hat{x}, u_i \rangle = 0$. This means $\langle x, u_i \rangle - \lambda_i \langle u_i, u_i \rangle = 0$, so $\lambda_i = \dfrac{\langle x, u_i \rangle}{\|u_i\|^2}$ and
$$\hat{x} = \sum_{i=1}^s \frac{\langle x, u_i \rangle}{\|u_i\|^2}\, u_i.$$
(2) If $e_1, \ldots, e_s$ is an orthonormal basis of $W \subseteq V$, then
$$\hat{x} = \sum_{i=1}^s \langle x, e_i \rangle e_i.$$

Bessel's inequality: $\|\hat{x}\| \leq \|x\|$

Proof. From $x = \hat{x} + (x - \hat{x})$, with $\langle \hat{x}, x - \hat{x} \rangle = 0$, we have
$$\|x\|^2 = \|\hat{x}\|^2 + \|x - \hat{x}\|^2 \geq \|\hat{x}\|^2.$$
Therefore, $\|\hat{x}\| \leq \|x\|$. Equality holds if and only if $x - \hat{x} = 0$, i.e., $x \in W$.

Best approximation

$\hat{x}$ is the best approximation to $x$ in $W$, in the sense that $\|x - \hat{x}\| < \|x - w\|$ for every $w \in W \setminus \{\hat{x}\}$.

Proof. For $w \in W$, $x - w = (x - \hat{x}) + (\hat{x} - w)$. Note that $\hat{x} - w \in W$ and is orthogonal to $x - \hat{x}$. Therefore,
$$\|x - w\|^2 = \|x - \hat{x}\|^2 + \|\hat{x} - w\|^2 \geq \|x - \hat{x}\|^2.$$
Equality holds if and only if $\hat{x} - w = 0$. Therefore, $\|x - \hat{x}\| < \|x - w\|$ for $w \in W \setminus \{\hat{x}\}$.

Remark. If $V$ is infinite dimensional, we also call
$$\hat{x} = \sum_{i=1}^n \langle x, e_i \rangle e_i$$
the Fourier expansion of $x$ with respect to the orthonormal set $e_1, \ldots, e_n$. The best approximation property and Bessel's inequality are still valid.

Example: the space $\ell^2$

For each integer $n \geq 0$, let $e_n$ be the infinite sequence which has $1$ in its $n$-th position, and $0$ elsewhere. The set $e_0, e_1, \ldots, e_n, \ldots$ is an orthonormal set; however, its span is not the whole space $\ell^2$. Let $W$ be the span of $e_0, e_1, \ldots, e_n$. The orthogonal projection of $x = (x_i) \in \ell^2$ onto $W$ is the truncation of $x$ at position $n$:
$$\hat{x} = (x_0, x_1, \ldots, x_n, 0, 0, \ldots).$$

Example: $V = C[-\pi, \pi]$

The functions $1, \cos nx, \sin nx$, $n = 1, 2, \ldots$, form an orthogonal set:
$$\int_{-\pi}^{\pi} \cos mx \cos nx \, dx = \delta_{mn}\,\pi, \quad m, n = 1, 2, \ldots,$$
$$\int_{-\pi}^{\pi} \sin mx \sin nx \, dx = \delta_{mn}\,\pi, \quad m, n = 1, 2, \ldots,$$
$$\int_{-\pi}^{\pi} \cos mx \sin nx \, dx = 0, \quad m, n = 0, 1, \ldots.$$
These lead to the Fourier expansion of $f(x) \in C[-\pi, \pi]$:
$$\hat{f} = \frac{a_0}{2} + \sum_{k=1}^{n} \left(a_k \cos kx + b_k \sin kx\right),$$
where
$$a_0 = \frac{1}{\pi} \int_{-\pi}^{\pi} f(x)\, dx, \qquad a_k = \frac{1}{\pi} \int_{-\pi}^{\pi} f(x) \cos kx \, dx, \qquad b_k = \frac{1}{\pi} \int_{-\pi}^{\pi} f(x) \sin kx \, dx, \quad k = 1, 2, \ldots$$
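The coefficient integrals are easy to approximate numerically. A minimal sketch (the helper `fourier_coeffs` and the grid size are illustrative choices), checked against the known expansion of $f(x) = x$, whose coefficients are $a_k = 0$ and $b_k = 2(-1)^{k+1}/k$:

```python
import numpy as np

def fourier_coeffs(f, n_terms, n_grid=100_000):
    """Approximate a_0, a_k, b_k of f on [-pi, pi] with a midpoint Riemann sum."""
    h = 2 * np.pi / n_grid
    x = -np.pi + h * (np.arange(n_grid) + 0.5)
    fx = f(x)
    a0 = np.sum(fx) * h / np.pi
    a = np.array([np.sum(fx * np.cos(k * x)) * h / np.pi for k in range(1, n_terms + 1)])
    b = np.array([np.sum(fx * np.sin(k * x)) * h / np.pi for k in range(1, n_terms + 1)])
    return a0, a, b

a0, a, b = fourier_coeffs(lambda x: x, 3)
print(np.round(b, 4))   # approximately [ 2.     -1.      0.6667]
```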

Gram-Schmidt orthogonalization

Let $v_1, \ldots, v_n$ be given vectors in a positive definite inner product space $V$. Define a sequence $u_1, \ldots, u_n$ as follows:
(i) $u_1 = v_1$;
(ii) for $k = 2, \ldots, n$, $u_k = v_k - \sum_{i=1}^{k-1} \lambda_{k,i}\, u_i$, where
$$\lambda_{k,i} = \begin{cases} \dfrac{\langle v_k, u_i \rangle}{\|u_i\|^2} & \text{if } u_i \neq 0, \\[2mm] 0 & \text{if } u_i = 0. \end{cases}$$
Then $u_1, \ldots, u_n$ is a sequence of mutually orthogonal vectors satisfying
$$\mathrm{Span}(u_1, \ldots, u_k) = \mathrm{Span}(v_1, \ldots, v_k)$$
for each $k = 1, \ldots, n$.

Proposition. If $v_1, \ldots, v_n$ is a basis of $V$, then $u_1, \ldots, u_n$ is an orthogonal basis of $V$.
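A direct transcription of (i) and (ii) into NumPy for the real case; a sketch, with a small tolerance standing in for the exact $u_i = 0$ test:

```python
import numpy as np

def gram_schmidt(vectors, tol=1e-12):
    """Gram-Schmidt: return mutually orthogonal u_1, ..., u_n with
    Span(u_1, ..., u_k) = Span(v_1, ..., v_k) for each k."""
    us = []
    for v in vectors:
        u = v.astype(float).copy()
        for w in us:
            nw2 = np.dot(w, w)
            if nw2 > tol:                        # lambda_{k,i} = 0 when u_i = 0
                u -= (np.dot(v, w) / nw2) * w    # subtract (<v_k, u_i>/||u_i||^2) u_i
        us.append(u)
    return us

vs = [np.array([1.0, 1.0, 0.0]), np.array([1.0, 0.0, 1.0]), np.array([0.0, 1.0, 1.0])]
us = gram_schmidt(vs)
# all pairs are orthogonal (up to floating-point rounding):
print(all(abs(np.dot(us[i], us[j])) < 1e-10 for i in range(3) for j in range(i)))  # True
```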

QR factorization of a matrix

Given an $m \times n$ matrix over $F = \mathbb{R}$ or $\mathbb{C}$, $A = (v_1 \; v_2 \; \cdots \; v_n)$, by reorganizing the vectors obtained in the Gram-Schmidt orthogonalization process, we obtain a factorization $A = QR$, in which
(i) $Q = (\tilde{u}_1 \; \tilde{u}_2 \; \cdots \; \tilde{u}_n)$ consists of the vectors $u_1, \ldots, u_n$ normalized, $\tilde{u}_i = u_i / \|u_i\|$;
(ii) $R$ is the upper triangular matrix
$$R = \begin{pmatrix} \|u_1\| & \dfrac{\langle v_2, u_1 \rangle}{\|u_1\|} & \cdots & \dfrac{\langle v_n, u_1 \rangle}{\|u_1\|} \\ 0 & \|u_2\| & \cdots & \dfrac{\langle v_n, u_2 \rangle}{\|u_2\|} \\ \vdots & & \ddots & \vdots \\ 0 & 0 & \cdots & \|u_n\| \end{pmatrix}.$$
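NumPy's built-in QR can be compared against this description. A real-case sketch (np.linalg.qr may flip the signs of columns of $Q$ and rows of $R$, so magnitudes are what to compare):

```python
import numpy as np

# Columns of A are v_1, v_2, v_3.
A = np.array([[1.0, 1.0, 0.0],
              [1.0, 0.0, 1.0],
              [0.0, 1.0, 1.0]]).T

Q, R = np.linalg.qr(A)
print(np.allclose(A, Q @ R))                 # A = QR
print(np.allclose(Q.T @ Q, np.eye(3)))       # columns of Q are orthonormal
print(np.allclose(np.tril(R, -1), 0))        # R is upper triangular
# |r_11| = ||v_1|| = ||u_1||, matching the (1,1) entry of R above:
print(np.isclose(abs(R[0, 0]), np.linalg.norm(A[:, 0])))
```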