MATH 235: Inner Product Spaces, Assignment 7


Hand in questions 3, 4, 5, 6, 9, and the MATLAB exercises by 9:30 am on Wednesday, March 26, 2008.

Contents
1 Orthogonal Basis for Inner Product Space
2 Inner-Product Function Space
3 Weighted Inner Product in R² ***
4 Inner Product in S₃ ***
5 Inner Products in M₂₂ and P₂ ***
6 Cauchy-Schwarz and Triangle Inequalities ***
7 Length in Function Space
8 More on Function Space
9 Linear Transformations and Inner Products ***
MATLAB ***

1 Orthogonal Basis for Inner Product Space

If V = P₃ with the inner product ⟨f, g⟩ = ∫_0^1 f(x)g(x) dx, apply the Gram-Schmidt algorithm to obtain an orthogonal basis from B = {1, x, x², x³}.

2 Inner-Product Function Space

Consider the vector space C[0, 1] of all continuously differentiable functions defined on the closed interval [0, 1]. The inner product on C[0, 1] is defined by ⟨f, g⟩ = ∫_0^1 f(x)g(x) dx. Find the inner products of the following pairs of functions and state whether they are orthogonal.

1. f(x) = cos(2πx) and g(x) = sin(2πx)
2. f(x) = x and g(x) = eˣ
3. f(x) = x and g(x) = 3x³

3 Weighted Inner Product in R² ***

Show whether or not the following are valid inner products in R². If one is a valid inner product, then find a nonzero vector that is orthogonal to the vector y = (1, 2)ᵀ.

1. ⟨u, v⟩ := 7u₁v₁ + 0.2u₂v₂
2. ⟨u, v⟩ := 7u₁v₁ − 0.2u₂v₂
3. ⟨u, v⟩ := 7u₁v₁ − 1.2u₂v₂

4 Inner Product in S₃ ***

Let S₃ denote the vector space of 3×3 symmetric matrices.

1. Prove that ⟨S, T⟩ = tr(ST) is an inner product on S₃.
2. Let S₁ = and S₂ = . Apply the Gram-Schmidt process in S₃ to find a matrix orthogonal to both S₁ and S₂.
3. Find an orthonormal basis for S₃ using the above three matrices.
4. Let T = . Find the projection of T onto the span of {S₁, S₂}.
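The Gram-Schmidt computation in question 1 can be checked with exact rational arithmetic. Below is a Python sketch (not part of the assignment), assuming the inner product ⟨f, g⟩ = ∫_0^1 f(x)g(x) dx; polynomials are stored as coefficient lists, so the product term x^(i+j) contributes 1/(i + j + 1) to the integral:

```python
from fractions import Fraction

# Polynomials as coefficient lists: p[i] is the coefficient of x**i.

def inner(p, q):
    # <p, q> = integral over [0, 1] of p(x) q(x) dx (interval is an assumption)
    total = Fraction(0)
    for i, a in enumerate(p):
        for j, b in enumerate(q):
            total += Fraction(a) * Fraction(b) / (i + j + 1)
    return total

def sub_scaled(p, q, c):
    # return p - c*q, padding the shorter list with zeros
    n = max(len(p), len(q))
    p = list(p) + [Fraction(0)] * (n - len(p))
    q = list(q) + [Fraction(0)] * (n - len(q))
    return [a - c * b for a, b in zip(p, q)]

def gram_schmidt(basis):
    # classical Gram-Schmidt: v_k = x_k - sum of projections onto earlier v_i
    ortho = []
    for x in basis:
        v = [Fraction(c) for c in x]
        for u in ortho:
            v = sub_scaled(v, u, inner(x, u) / inner(u, u))
        ortho.append(v)
    return ortho

B = [[1], [0, 1], [0, 0, 1], [0, 0, 0, 1]]   # {1, x, x^2, x^3}
# the four orthogonal polynomials over [0,1]:
# 1,  x - 1/2,  x^2 - x + 1/6,  x^3 - (3/2)x^2 + (3/5)x - 1/20
for v in gram_schmidt(B):
    print(v)
```

Using `Fraction` keeps every projection coefficient exact, so the output can be read off as the shifted Legendre polynomials up to scaling.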

5 Inner Products in M₂₂ and P₂ ***

Show which of the following are valid inner products.

1. Let A = [a₁ a₂; a₃ a₄] and B = [b₁ b₂; b₃ b₄] be real 2×2 matrices.
   (a) Define ⟨A, B⟩ := a₁b₁ + 2a₂b₃ + 3a₃b₂ + a₄b₄.
   (b) Define ⟨A, B⟩ := tr(BᵀA).
2. Let p(x) and q(x) be polynomials in P₂.
   (a) Define ⟨p, q⟩ := p(1)q(1) + p(1/2)q(1/2) + p(1/3)q(1/3).
   (b) Define ⟨p, q⟩ := p(1/3)q(1/3) + p(1/2)q(1/2) + p(1)q(1).

6 Cauchy-Schwarz and Triangle Inequalities ***

1. Let D = diag(3, π, 2) and consider the inner product defined on R³ by ⟨u, v⟩ = uᵀDv. Let u = (2, 3, 7)ᵀ and v = (4, 8, 9)ᵀ. Verify that both inequalities hold.
2. Find the projection of v on the span of u and call it w. Verify that equality holds for the Cauchy-Schwarz inequality applied to u and w.

7 Length in Function Space

Let W = Span{1, cos(x), cos(2x)} ⊆ C[−π, π] with inner product ⟨f, g⟩ = ∫_{−π}^{π} f(x)g(x) dx. It is known that {1/√(2π), (1/√π) cos(x), (1/√π) cos(2x)} is an orthonormal basis for W. A function f ∈ W satisfies ∫_{−π}^{π} f(x) dx = a, ∫_{−π}^{π} f(x) cos(x) dx = b and ∫_{−π}^{π} f(x) cos(2x) dx = c. Find an expression for f in terms of a, b and c.

8 More on Function Space

Consider the vector space P₂ consisting of polynomials of degree at most 2, together with the inner product ⟨f, g⟩ = ∫_0^1 f(x)g(x) dx for f, g ∈ P₂. Let W be the subspace of P₂ having basis {1, x}.

1. Find an orthonormal basis for W.
2. Let h(x) = x² + 1. Find the vector in W that is closest to h(x).
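The two inequalities of question 6 can be verified numerically for any weighted inner product ⟨u, v⟩ = uᵀDv with positive diagonal weights. A Python sketch, assuming for concreteness the weights D = diag(3, π, 2) and the vectors u = (2, 3, 7)ᵀ, v = (4, 8, 9)ᵀ:

```python
import math

# Numerical check of Cauchy-Schwarz and the triangle inequality for a
# weighted inner product <u, v> = u^T D v. The diagonal weights and the
# vectors are assumed example values; any positive weights work.
d = [3.0, math.pi, 2.0]          # diagonal of D
u = [2.0, 3.0, 7.0]
v = [4.0, 8.0, 9.0]

def ip(a, b):
    # <a, b> = a^T D b for diagonal D
    return sum(di * ai * bi for di, ai, bi in zip(d, a, b))

def norm(a):
    return math.sqrt(ip(a, a))

# Cauchy-Schwarz: |<u, v>| <= ||u|| ||v||
assert abs(ip(u, v)) <= norm(u) * norm(v)

# Triangle inequality: ||u + v|| <= ||u|| + ||v||
s = [ui + vi for ui, vi in zip(u, v)]
assert norm(s) <= norm(u) + norm(v)

# Equality in Cauchy-Schwarz for w = projection of v onto span{u} (part 2)
c = ip(v, u) / ip(u, u)
w = [c * ui for ui in u]
assert math.isclose(abs(ip(u, w)), norm(u) * norm(w))
print("inequalities hold; equality for the projection w")
```

The equality check illustrates part 2: once w is a scalar multiple of u, the Cauchy-Schwarz bound is attained exactly.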

9 Linear Transformations and Inner Products ***

Let V be a real inner product space (with the usual notation) and let {v₁, v₂, ..., vₙ} be an orthonormal basis for V (so V is finite dimensional). Fix a linear transformation T : V → R. Define

v₀ = T(v₁)v₁ + T(v₂)v₂ + ⋯ + T(vₙ)vₙ.

1. Show that for all v ∈ V, T(v) = ⟨v, v₀⟩.
2. Prove that, for vectors u, w ∈ V, if ⟨u, v⟩ = ⟨w, v⟩ for all v ∈ V, then u = w.
3. Deduce from part 2 that v₀ is the only vector in V with the property displayed in part 1.

NOTE: This question shows how a suitable inner product produces a correspondence between vectors in the vector space and linear functionals T from it to the reals. This is used to identify tangent vectors with what are called cotangent vectors in differential geometry, general relativity and other symbolic calculations of mathematical physics, where it is sometimes referred to as "moving indices up and down". A slightly surprising consequence of part 3 is that the vector v₀, despite appearances, depends only on the inner product and T, but not on the choice of orthonormal basis.
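The construction in question 9 can be made concrete in R³ with the dot product, where the standard basis is orthonormal. A small Python sketch; the particular functional T below is an arbitrary illustrative choice:

```python
import math

# Question 9 in the concrete case V = R^3 with the dot product: build
# v0 = T(e1) e1 + T(e2) e2 + T(e3) e3 and check that <x, v0> recovers T(x).
def T(x):
    # an arbitrary linear functional T : R^3 -> R (illustrative values)
    return 2.0 * x[0] - 5.0 * x[1] + 0.5 * x[2]

basis = [[1.0, 0.0, 0.0], [0.0, 1.0, 0.0], [0.0, 0.0, 1.0]]

def dot(a, b):
    return sum(p * q for p, q in zip(a, b))

v0 = [sum(T(e) * e[i] for e in basis) for i in range(3)]
print(v0)                       # the representing vector: [2.0, -5.0, 0.5]

# Part 1: T(x) = <x, v0>, spot-checked on one vector
x = [1.5, -2.0, 4.0]
assert math.isclose(T(x), dot(x, v0))
```

Note that v0 simply collects the coefficients of T in the orthonormal basis, which is exactly why part 1 holds.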

MATLAB ***

Gram-Schmidt & QR Factorization

Overview of the Gram-Schmidt Process

Given a basis {x₁, ..., x_p} for a subspace W of Rⁿ, define

v₁ = x₁
v₂ = x₂ − ((x₂ · v₁)/(v₁ · v₁)) v₁
v₃ = x₃ − ((x₃ · v₁)/(v₁ · v₁)) v₁ − ((x₃ · v₂)/(v₂ · v₂)) v₂
⋮
v_p = x_p − ((x_p · v₁)/(v₁ · v₁)) v₁ − ((x_p · v₂)/(v₂ · v₂)) v₂ − ⋯ − ((x_p · v_{p−1})/(v_{p−1} · v_{p−1})) v_{p−1}

Then {v₁, ..., v_p} is an orthogonal basis for W.

When working on a computer, a basis is often presented as a matrix A with linearly independent columns. The columns of matrix A are a basis for its column space (or range). The Gram-Schmidt process can be applied to the column vectors of matrix A so that the output of the process is an orthogonal set of vectors. For example, let

x₁ = (1, 1, 1, 1)ᵀ, x₂ = (0, 1, 1, 1)ᵀ, x₃ = (0, 0, 1, 1)ᵀ.

Then, by inspection, {x₁, x₂, x₃} is linearly independent and thus forms a basis for a subspace W of R⁴. Consider the following matrix A formed by using the basis {x₁, x₂, x₃} as column vectors:

A = [1 0 0; 1 1 0; 1 1 1; 1 1 1]

Matrix A will be used to illustrate the Gram-Schmidt process in MATLAB. First, enter matrix A.

>> A = [1 0 0; 1 1 0; 1 1 1; 1 1 1] %Create matrix A

Next, calculate the orthogonal basis {v1, v2, v3} by using the Gram-Schmidt algorithm stated above. Recall that in MATLAB the notation A(:,1) selects the first column of A, A(:,2) selects the second column, and so on. Also recall that v' is the short form for transpose(v) (it changes a row vector to a column vector, or vice versa).

>> v1 = A(:,1) %Calculate v1
>> v2 = A(:,2) - (A(:,2)'*v1)/(v1'*v1)*v1 %Calculate v2
>> v3 = A(:,3) - (A(:,3)'*v1)/(v1'*v1)*v1 - (A(:,3)'*v2)/(v2'*v2)*v2 %Calculate v3
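The same three steps can be mirrored outside MATLAB in plain Python, assuming the example columns x1 = (1,1,1,1), x2 = (0,1,1,1), x3 = (0,0,1,1):

```python
# A plain-Python mirror of the three MATLAB Gram-Schmidt steps, assuming
# the example columns x1 = (1,1,1,1), x2 = (0,1,1,1), x3 = (0,0,1,1).
cols = [[1.0, 1.0, 1.0, 1.0],
        [0.0, 1.0, 1.0, 1.0],
        [0.0, 0.0, 1.0, 1.0]]

def dot(a, b):
    return sum(p * q for p, q in zip(a, b))

ortho = []
for x in cols:
    v = list(x)
    for u in ortho:
        c = dot(x, u) / dot(u, u)        # projection coefficient (x.u)/(u.u)
        v = [vi - c * ui for vi, ui in zip(v, u)]
    ortho.append(v)

for v in ortho:
    print([round(vi, 4) for vi in v])
```

Each pass subtracts the projections onto the previously computed vectors, exactly as the v2 and v3 commands do above.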

MATLAB tells us that by using the Gram-Schmidt process on matrix A, the corresponding orthogonal basis is

v1 = (1, 1, 1, 1)ᵀ, v2 = (−0.75, 0.25, 0.25, 0.25)ᵀ, v3 = (0, −0.6667, 0.3333, 0.3333)ᵀ

To convert from an orthogonal basis to an orthonormal basis, we simply divide each orthogonal vector by its norm:

>> vn1 = v1/norm(v1) %Calculate normalized v1
>> vn2 = v2/norm(v2) %Calculate normalized v2
>> vn3 = v3/norm(v3) %Calculate normalized v3

The following theorem provides a method for checking whether the columns of a matrix are orthonormal: an m×n matrix U has orthonormal columns if and only if UᵀU = I. Using this theorem and MATLAB, we can check whether {vn1, vn2, vn3} forms an orthonormal basis. First define a new matrix U that uses {vn1, vn2, vn3} as its column vectors.

>> U = [vn1 vn2 vn3] %Create matrix U
>> transpose(U) * U  %Apply theorem

Indeed, MATLAB returns the identity matrix I, so by the theorem {vn1, vn2, vn3} forms an orthonormal basis.
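The UᵀU = I criterion can also be reproduced by hand on the normalized vectors. A Python sketch, assuming the 4-decimal rounded values this example produces (so the comparison tolerance is deliberately loose), with the signs classical Gram-Schmidt gives here:

```python
# Check of the U^T U = I orthonormality criterion on the rounded values of
# vn1, vn2, vn3. The entries are assumed from the worked example above;
# rounding to 4 decimals means we only expect agreement to about 1e-3.
cols = [
    [0.5, 0.5, 0.5, 0.5],
    [-0.8660, 0.2887, 0.2887, 0.2887],
    [0.0, -0.8165, 0.4082, 0.4082],
]

def dot(a, b):
    return sum(p * q for p, q in zip(a, b))

for i in range(3):
    for j in range(3):
        expected = 1.0 if i == j else 0.0      # entry (i, j) of U^T U
        assert abs(dot(cols[i], cols[j]) - expected) < 1e-3
print("U^T U is the identity, up to rounding")
```

Each entry of UᵀU is just a dot product of two columns, which is why the identity matrix certifies orthonormality.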

QR Factorization

QR factorization is the matrix version of the Gram-Schmidt method. QR factorization is particularly useful because computer algorithms exist that implement this factorization with good control over round-off error. Additionally, there is no simple method for computing eigenvalues of matrices larger than 3×3, since numerically it is difficult to calculate the roots of the characteristic polynomial; applying QR factorization iteratively is the most successful algorithm in use today for computing eigenvalues. For these reasons, QR is an important factorization technique in linear algebra.

Overview of QR Factorization

If A is an m×n matrix with linearly independent columns, then A can be factored as A = QR, where Q is an m×n matrix whose orthonormal columns form an orthonormal basis of the column space, and R is an invertible, upper triangular matrix.

To illustrate this method in MATLAB, consider the following matrix A:

A = [1 0 0; 1 1 0; 1 1 1; 1 1 1]

This is the same matrix that was used in the section on the Gram-Schmidt process. We have already used MATLAB to calculate the orthonormal basis for its column space:

vn1 = (0.5, 0.5, 0.5, 0.5)ᵀ, vn2 = (−0.866, 0.2887, 0.2887, 0.2887)ᵀ, vn3 = (0, −0.8165, 0.4082, 0.4082)ᵀ

As such, according to the description of the QR factorization method, we expect to find

Q = [0.5 −0.866 0; 0.5 0.2887 −0.8165; 0.5 0.2887 0.4082; 0.5 0.2887 0.4082]

Let's use MATLAB to verify this result. MATLAB's command for QR factorization is [Q R] = qr(A), where A is an m×n matrix, Q is returned as an m×m orthogonal matrix, and R is returned as an m×n upper triangular matrix. This MATLAB command is very general, since it applies to any matrix and in some cases returns more information than required. If A has rank n, then only the first n columns of Q form an orthonormal basis for the column space of A.
Enter the following in MATLAB:

>> A = [1 0 0; 1 1 0; 1 1 1; 1 1 1] %Create matrix A
>> rank(A)                          %Compute rank of matrix A

Since A is a 4×3 matrix and its rank is 3, we should expect the orthonormal basis to be the first 3 columns of Q in the result of the following command:

>> [Q R] = qr(a) %Calculate the QR factorization of A Indeed, the first 3 columns of Q match those of the orthonormal basis that we calculated using the Gram-Schmidt process. Notice that the MATLAB qr command is much easier to use than the Gram-Schmidt process. The qr command in MATLAB is also less computationally demanding on your computer (more efficient and not prone to rounding errors). The Gram-Schmidt method is numerically unstable when implemented on a computer the vectors that are returned are often not quite orthogonal because of rounding errors in the calculation. Therefore, the QR factorization method is usually prefered when using a computer. Consider the following matrix A: A = 4 5 2 2 5 2 4 8 8 Use MATLAB to determine if the columns of A are linearly independent? State which MATLAB commands you used and explain your answer. Using MATLAB, find the QR factorization of matrix A (i.e. find Q and R such that A = QR). What is an orthonormal basis for the column space of matrix A? Given any m n matrix B that has already been entered in MATLAB, what single command could you use to determine if matrix B has orthonormal columns? 8