Best approximation in the 2-norm


Best approximation in the 2-norm. Department of Mathematical Sciences, NTNU. September 26th, 2012.

Vector space
A real vector space V is a set with a zero element 0 and three operations:
Addition: if x, y ∈ V then x + y ∈ V.
Inverse: for all x ∈ V there is −x ∈ V such that x + (−x) = 0.
Scalar multiplication: if λ ∈ R and x ∈ V then λx ∈ V.
These satisfy:
1. x + y = y + x
2. (x + y) + z = x + (y + z)
3. 0 + x = x + 0 = x
4. 0x = 0
5. 1x = x
6. (λμ)x = λ(μx)
7. λ(x + y) = λx + λy
8. (λ + μ)x = λx + μx

Inner product spaces: 9.2
Definition of inner product space
Let V be a linear space over R. A function ⟨·, ·⟩ : V × V → R is called an inner product on V if:
1. ⟨f + g, h⟩ = ⟨f, h⟩ + ⟨g, h⟩ for all f, g, h in V;
2. ⟨λf, g⟩ = λ⟨f, g⟩ for all f, g in V and λ ∈ R;
3. ⟨f, g⟩ = ⟨g, f⟩ for all f, g in V;
4. ⟨f, f⟩ > 0 if f ≠ 0, f ∈ V.
A linear space with an inner product is an inner product space.

Example
The n-dimensional Euclidean space R^n is an inner product space with
⟨u, v⟩ = Σ_{i=1}^n u_i v_i,  u, v ∈ R^n,
where u = [u_1, ..., u_n]^T and v = [v_1, ..., v_n]^T.
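The Euclidean example above can be checked in a few lines. A minimal sketch in Python; the sample vectors u and v are arbitrary, not from the lecture:

```python
import numpy as np

# The Euclidean inner product on R^n, <u, v> = sum_{i=1}^n u_i v_i,
# and the norm it induces (see the "induced norm" slide below).
def inner(u, v):
    return float(np.dot(u, v))

u = np.array([1.0, 2.0, 3.0])
v = np.array([4.0, -1.0, 2.0])

print(inner(u, v))           # 1*4 + 2*(-1) + 3*2 = 8.0
print(np.sqrt(inner(u, u)))  # induced norm ||u|| = sqrt(14)
```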

Inner product spaces: orthogonality and norm
Orthogonality
If in an inner product space V we have ⟨f, g⟩ = 0, then we say that f is orthogonal to g.

Induced norm
In an inner product space we can define the norm ‖f‖ := ⟨f, f⟩^{1/2}; this is called the norm induced by the inner product.

Theorem
An inner product space V over R with the induced norm defined above is a normed space.
Proof. We need to prove that the induced norm satisfies the axioms of a norm; the proof makes use of the Cauchy-Schwarz inequality.

Lemma (Cauchy-Schwarz inequality)
|⟨f, g⟩| ≤ ‖f‖ ‖g‖ for all f, g ∈ V.
Proof. Similar to the one seen in Chapter 2 for the inner product in R^n.

Using the lemma we prove the triangle inequality for the induced norm:
‖f + g‖² = ⟨f + g, f + g⟩ = ‖f‖² + 2⟨f, g⟩ + ‖g‖².
From Cauchy-Schwarz we get
‖f + g‖² ≤ ‖f‖² + 2‖f‖ ‖g‖ + ‖g‖² = (‖f‖ + ‖g‖)²,
so ‖f + g‖ ≤ ‖f‖ + ‖g‖.
The other two axioms of a norm are proved directly from the definition of the induced norm.
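The two inequalities are easy to spot-check numerically. A sketch (random sampling is evidence, not a proof) for the Euclidean inner product on R^5:

```python
import numpy as np

# Spot-check Cauchy-Schwarz, |<f, g>| <= ||f|| ||g||, and the triangle
# inequality it implies, on random vectors; 1e-12 absorbs rounding error.
rng = np.random.default_rng(0)
ok = True
for _ in range(1000):
    f = rng.standard_normal(5)
    g = rng.standard_normal(5)
    ok &= abs(f @ g) <= np.linalg.norm(f) * np.linalg.norm(g) + 1e-12
    ok &= np.linalg.norm(f + g) <= np.linalg.norm(f) + np.linalg.norm(g) + 1e-12
print(ok)  # True
```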

Space of continuous functions
Example (9.3)
C[a, b] (continuous functions defined on the closed interval [a, b]) is an inner product space with the inner product
⟨f, g⟩ = ∫_a^b w(x) f(x) g(x) dx,
where w is a weight function that is defined, positive, continuous and integrable on (a, b). The induced norm is
‖f‖_2 = ( ∫_a^b w(x) |f(x)|² dx )^{1/2}.
This norm is called the 2-norm on C[a, b].

Space of square integrable functions: L²
We do not need f to be continuous in order for ‖f‖_2 to be finite; it is sufficient that f²w is integrable on [a, b].

Example
L²_w(a, b) := { f : (a, b) → R | w(x)|f(x)|² is integrable on (a, b) }.
This is an inner product space with the inner product
⟨f, g⟩ = ∫_a^b w(x) f(x) g(x) dx,
where w is a weight function defined, positive, continuous and integrable on (a, b). The induced norm is
‖f‖_2 = ( ∫_a^b w(x) |f(x)|² dx )^{1/2}.
This norm is called the L²-norm.
Note: C[a, b] ⊂ L²_w(a, b). In the case w ≡ 1 we write L²(a, b).

Best approximation in the 2-norm: ch 9.3
The problem of best approximation in the 2-norm can be formulated as follows:
Best approximation problem in the 2-norm
Given f ∈ L²_w(a, b), find p_n ∈ Π_n such that
‖f − p_n‖_2 = inf_{q ∈ Π_n} ‖f − q‖_2.
Such a p_n is called a polynomial of best approximation of degree n to the function f in the 2-norm on (a, b).

Theorem 9.2
Let f ∈ L²_w(a, b). There exists a unique polynomial p_n ∈ Π_n such that
‖f − p_n‖_2 = min_{q ∈ Π_n} ‖f − q‖_2.

Example: best approximation in the 2-norm vs the ∞-norm
Let ε > 0 be fixed and
f(x) = 1 − e^{−x/ε},  x ∈ [0, 1].
Find p_0, the best approximation of degree 0, in the 2-norm and in the ∞-norm.

In general, to find p_n(x) = c_0 + c_1 x + ... + c_n x^n, the best approximation in the 2-norm on [0, 1]:
Solve M c = b, where M is the Hilbert matrix
M_{j,k} = ∫_0^1 x^{k+j} dx = 1/(k + j + 1),  j, k = 0, ..., n,
and b = [b_0, ..., b_n]^T with
b_j = ∫_0^1 f(x) x^j dx.
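A small sketch of this monomial-basis approach. The test function f(x) = e^x and the degree n = 3 are arbitrary choices, not from the lecture; the right-hand side is computed by midpoint quadrature:

```python
import numpy as np

n = 3
# Hilbert matrix: M[j, k] = int_0^1 x^(j+k) dx = 1/(j + k + 1)
M = np.array([[1.0 / (j + k + 1) for k in range(n + 1)] for j in range(n + 1)])

# b_j = int_0^1 f(x) x^j dx by the midpoint rule on a fine grid.
x = (np.arange(100_000) + 0.5) / 100_000
fx = np.exp(x)
b = np.array([np.mean(fx * x**j) for j in range(n + 1)])

c = np.linalg.solve(M, b)   # coefficients of p_n(x) = c0 + c1 x + ... + cn x^n

err = np.max(np.abs(np.polyval(c[::-1], x) - fx))
print(err)                  # small: the cubic tracks e^x closely on [0, 1]
print(np.linalg.cond(M))    # Hilbert matrices are notoriously ill-conditioned
```

The large condition number is the practical reason for switching to an orthogonal basis, as the next slide does.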

Alternatively, write p_n(x) = γ_0 ϕ_0(x) + γ_1 ϕ_1(x) + ... + γ_n ϕ_n(x), the best approximation in the 2-norm on [a, b], where
Π_n = span{1, x, ..., x^n} = span{ϕ_0, ϕ_1, ..., ϕ_n},
and find γ_0, ..., γ_n minimizing
E(γ_0, ..., γ_n) := ∫_a^b w(x) ( f(x) − Σ_{j=0}^n γ_j ϕ_j(x) )² dx.

This leads to M γ = b, where γ = [γ_0, ..., γ_n]^T, M is the matrix
M_{j,k} = ⟨ϕ_k, ϕ_j⟩,  j, k = 0, ..., n,
and b = [b_0, ..., b_n]^T with b_j = ⟨f, ϕ_j⟩.
We would like M to be diagonal, i.e. ⟨ϕ_k, ϕ_j⟩ = 0 for k ≠ j.

Def 9.4: Orthogonal polynomials
Given a weight function w on (a, b), we say that the sequence of polynomials ϕ_j, j = 0, 1, ..., is a system of orthogonal polynomials on (a, b) with respect to w if each ϕ_j is of exact degree j and
⟨ϕ_j, ϕ_k⟩ = 0 whenever j ≠ k.
To construct a system of orthogonal polynomials we can use the Gram-Schmidt process.

Gram-Schmidt orthogonalization
Let ϕ_0 ≡ 1 and assume ϕ_j, j = 0, ..., n (n ≥ 0), are already orthogonal, so
⟨ϕ_k, ϕ_j⟩ = ∫_a^b w(x) ϕ_k(x) ϕ_j(x) dx = 0,  k, j ∈ {0, ..., n}, k ≠ j.
Consider
q(x) = x^{n+1} − a_0 ϕ_0(x) − ... − a_n ϕ_n(x) ∈ Π_{n+1},
and take
a_j = ⟨x^{n+1}, ϕ_j⟩ / ⟨ϕ_j, ϕ_j⟩,  j = 0, ..., n.
Then
⟨q, ϕ_j⟩ = ⟨x^{n+1}, ϕ_j⟩ − a_j ⟨ϕ_j, ϕ_j⟩ = 0,
and we can take
ϕ_{n+1} = α q,
with α a scaling constant.
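The recursion above can be sketched with exact rational arithmetic, specialised to (a, b) = (0, 1) and w ≡ 1 (the setting of the closing example), using ∫_0^1 x^k dx = 1/(k + 1):

```python
from fractions import Fraction

# Polynomials are coefficient lists [c0, c1, ...], where c_k multiplies x^k.

def poly_mul(p, q):
    out = [Fraction(0)] * (len(p) + len(q) - 1)
    for i, a in enumerate(p):
        for j, b in enumerate(q):
            out[i + j] += a * b
    return out

def ip(p, q):
    # <p, q> = int_0^1 p(x) q(x) dx, integrated monomial by monomial
    return sum(c / (k + 1) for k, c in enumerate(poly_mul(p, q)))

def gram_schmidt(n):
    phis = [[Fraction(1)]]                      # phi_0 = 1
    for m in range(1, n + 1):
        q = [Fraction(0)] * m + [Fraction(1)]   # start from x^m
        for phi in phis:                        # subtract a_j phi_j
            a = ip(q, phi) / ip(phi, phi)
            for k, c in enumerate(phi):
                q[k] -= a * c
        phis.append(q)
    return phis

print(gram_schmidt(2))
# phi_1 = x - 1/2 and phi_2 = x^2 - x + 1/6, matching Example 9.6's setting
```

Here the scaling constant α is taken to be 1, so each ϕ_m is monic.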

Example (9.6 Legendre polynomials)
(a, b) = (−1, 1), w(x) ≡ 1:
ϕ_0(x) ≡ 1, ϕ_1(x) = x, ϕ_2(x) = (3/2)x² − 1/2, ϕ_3(x) = (5/2)x³ − (3/2)x.

Example (9.6 Chebyshev polynomials)
(a, b) = (−1, 1), w(x) = (1 − x²)^{−1/2}:
T_n(x) = cos(n arccos(x)),  n = 0, 1, ....
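The Chebyshev orthogonality is easy to verify numerically: substituting x = cos(t) turns the weighted integral ∫_{−1}^{1} w(x) T_m(x) T_n(x) dx into ∫_0^π cos(mt) cos(nt) dt. A sketch (the indices 2 and 3 are arbitrary picks):

```python
import numpy as np

N = 200_000
t = (np.arange(N) + 0.5) * np.pi / N   # midpoint rule on [0, pi]

def chebyshev_ip(m, n):
    # <T_m, T_n> with weight (1 - x^2)^(-1/2), after x = cos(t)
    return (np.pi / N) * np.sum(np.cos(m * t) * np.cos(n * t))

print(abs(chebyshev_ip(2, 3)) < 1e-9)               # True: orthogonal
print(abs(chebyshev_ip(3, 3) - np.pi / 2) < 1e-9)   # True: <T_n, T_n> = pi/2 for n >= 1
```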

Characterization of the best approximation in the 2-norm
Theorem 9.3
p_n ∈ Π_n satisfies
‖f − p_n‖_2 = min_{q ∈ Π_n} ‖f − q‖_2,  f ∈ L²_w(a, b),
if and only if
⟨f − p_n, q⟩ = 0 for all q ∈ Π_n.
Remark: this theorem relates best approximation to orthogonality.

Example
On (0, 1) with w(x) ≡ 1, Gram-Schmidt orthogonalization gives
ϕ_0 = 1, ϕ_1(x) = x − 1/2, ϕ_2(x) = x² − x + 1/6.
So for
f(x) = 1 − e^{−x/ε},  x ∈ [0, 1],
the best approximation p_2 is
p_2(x) = γ_0 + γ_1 ϕ_1(x) + γ_2 ϕ_2(x),
with
γ_0 = ∫_0^1 (1 − e^{−x/ε}) dx,
γ_1 = ∫_0^1 (1 − e^{−x/ε})(x − 1/2) dx / ∫_0^1 (x − 1/2)² dx,
γ_2 = ∫_0^1 (1 − e^{−x/ε})(x² − x + 1/6) dx / ∫_0^1 (x² − x + 1/6)² dx.
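A sketch of this closing example, with the coefficients computed by midpoint quadrature; ε = 0.1 is an arbitrary choice, and the final loop checks the orthogonality condition of Theorem 9.3:

```python
import numpy as np

eps = 0.1
N = 200_000
x = (np.arange(N) + 0.5) / N            # midpoints in [0, 1]
f = 1.0 - np.exp(-x / eps)

# Orthogonal basis from the Gram-Schmidt example above.
phis = [np.ones_like(x), x - 0.5, x**2 - x + 1.0 / 6.0]

# gamma_j = <f, phi_j> / <phi_j, phi_j>; np.mean approximates int_0^1.
gamma = [np.mean(f * p) / np.mean(p * p) for p in phis]
p2 = sum(g * p for g, p in zip(gamma, phis))

# Theorem 9.3: the residual f - p2 is orthogonal to every element of Pi_2,
# so in particular to each basis function.
for p in phis:
    print(abs(np.mean((f - p2) * p)) < 1e-9)  # True for each basis function
```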