Inner products and Norms

Inner product of two vectors x and y in R^n:

    (x, y) = x_1 y_1 + x_2 y_2 + ... + x_n y_n

Notation: (x, y), or y^T x.

For complex vectors x, y in C^n:

    (x, y) = x_1 ȳ_1 + x_2 ȳ_2 + ... + x_n ȳ_n

Note: (x, y) = y^H x.

[Text 3; GvL 2.2-2.3; AB: 1.1.7]
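Both formulas can be written in one line each with NumPy; a small sketch (the vectors here are arbitrary illustrative values):

```python
import numpy as np

# Real case: (x, y) = x_1 y_1 + ... + x_n y_n = y^T x
x = np.array([1.0, 2.0, 3.0])
y = np.array([4.0, 5.0, 6.0])
real_ip = y.T @ x                     # 1*4 + 2*5 + 3*6 = 32

# Complex case: (x, y) = x_1 ȳ_1 + ... + x_n ȳ_n = y^H x
z = np.array([1 + 1j, 2 - 1j])
w = np.array([3 - 2j, 0 + 1j])
complex_ip = np.conj(w) @ z           # conjugate the SECOND argument
```

Note that for complex vectors the second argument is conjugated, so the order of x and y matters.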

Properties of the inner product:

- (x, y) is the complex conjugate of (y, x).
- (αx, y) = α (x, y).
- (x, x) is always real and non-negative.
- (x, x) = 0 iff x = 0 (for finite-dimensional spaces).
- Given A ∈ C^{m×n}: (Ax, y) = (x, A^H y) for all x ∈ C^n, y ∈ C^m.
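The last property (the adjoint identity) is easy to sanity-check numerically; a sketch with random complex data, assuming NumPy:

```python
import numpy as np

rng = np.random.default_rng(0)
A = rng.standard_normal((3, 4)) + 1j * rng.standard_normal((3, 4))
x = rng.standard_normal(4) + 1j * rng.standard_normal(4)
y = rng.standard_normal(3) + 1j * rng.standard_normal(3)

def ip(u, v):
    return np.conj(v) @ u             # (u, v) = v^H u

lhs = ip(A @ x, y)                    # (Ax, y)
rhs = ip(x, A.conj().T @ y)           # (x, A^H y)
adjoint_ok = np.isclose(lhs, rhs)
```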

Vector norms

Norms are needed to measure lengths of vectors and the closeness of two vectors. Examples of use: estimating the convergence rate of an iterative method; estimating the error of an approximation to a given solution; ...

A vector norm on a vector space X is a real-valued function x -> ||x|| on X which satisfies the following three conditions:

1. ||x|| ≥ 0 for all x ∈ X, and ||x|| = 0 iff x = 0.
2. ||αx|| = |α| ||x|| for all x ∈ X, α ∈ C.
3. ||x + y|| ≤ ||x|| + ||y|| for all x, y ∈ X.

The third property is called the triangle inequality.

Important example: the Euclidean norm on X = C^n,

    ||x||_2 = (x, x)^{1/2} = ( |x_1|^2 + |x_2|^2 + ... + |x_n|^2 )^{1/2}

Exercise: Show that when Q is orthogonal, ||Qx||_2 = ||x||_2.

The most common vector norms in numerical linear algebra are special cases of the Hölder norms (for p ≥ 1):

    ||x||_p = ( Σ_{i=1}^n |x_i|^p )^{1/p}

Exercise: Find out how to show that these are indeed norms for any p ≥ 1. (The third requirement, the triangle inequality, is the hard part!)
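The invariance ||Qx||_2 = ||x||_2 asked for in the exercise can be observed numerically before proving it; a sketch using a random orthogonal Q obtained from a QR factorization:

```python
import numpy as np

rng = np.random.default_rng(1)
Q, _ = np.linalg.qr(rng.standard_normal((5, 5)))   # Q has orthonormal columns
x = rng.standard_normal(5)

# an orthogonal transformation preserves the Euclidean norm
invariant = np.isclose(np.linalg.norm(Q @ x, 2), np.linalg.norm(x, 2))
```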

Property: the limit of ||x||_p as p → ∞ exists:

    lim_{p→∞} ||x||_p = max_{i=1,...,n} |x_i|

This defines a norm, denoted by ||.||_∞. The cases p = 1, p = 2, and p = ∞ lead to the norms most important in practice. These are:

    ||x||_1 = |x_1| + |x_2| + ... + |x_n|,
    ||x||_2 = [ |x_1|^2 + |x_2|^2 + ... + |x_n|^2 ]^{1/2},
    ||x||_∞ = max_{i=1,...,n} |x_i|.
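The three norms, and the limiting behavior in p, can be illustrated on a small vector (NumPy's norm accepts any real ord ≥ 1):

```python
import numpy as np

x = np.array([3.0, -4.0, 1.0])

one_norm = np.linalg.norm(x, 1)        # |3| + |-4| + |1| = 8
two_norm = np.linalg.norm(x, 2)        # sqrt(9 + 16 + 1) = sqrt(26)
inf_norm = np.linalg.norm(x, np.inf)   # max |x_i| = 4

# a large but finite p is already extremely close to the infinity norm
p100 = np.linalg.norm(x, 100)
```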

The Cauchy-Schwarz inequality (important) is:

    |(x, y)| ≤ ||x||_2 ||y||_2

Exercise: When do you have equality in the above relation?

Exercise: Expand (x + y, x + y). What does the Cauchy-Schwarz inequality imply?

The Hölder inequality (less important for p ≠ 2) is:

    |(x, y)| ≤ ||x||_p ||y||_q,   with 1/p + 1/q = 1
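A quick numerical illustration of both inequalities (the conjugate pair p = 3, q = 3/2 is an arbitrary choice satisfying 1/p + 1/q = 1):

```python
import numpy as np

rng = np.random.default_rng(2)
x = rng.standard_normal(6)
y = rng.standard_normal(6)

cauchy_schwarz = abs(x @ y) <= np.linalg.norm(x, 2) * np.linalg.norm(y, 2)

p, q = 3.0, 1.5                       # conjugate exponents: 1/3 + 1/1.5 = 1
holder = abs(x @ y) <= np.linalg.norm(x, p) * np.linalg.norm(y, q)
```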

Equivalence of norms: In finite-dimensional spaces (R^n, C^n, ...) all norms are equivalent: if φ_1 and φ_2 are two norms, then there are constants α, β > 0 such that

    β φ_2(x) ≤ φ_1(x) ≤ α φ_2(x)   for all x.

In other words, any norm can be bounded above and below in terms of any other norm.

Exercise: How can you prove this result?

Exercise: Show that for any x:  (1/√n) ||x||_1 ≤ ||x||_2 ≤ ||x||_1.

Exercise: What are the unit balls B_p = {x : ||x||_p ≤ 1} associated with the norms ||.||_p for p = 1, 2, ∞, in R^2?
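The bound in the second exercise can be observed on random data before proving it; a sketch:

```python
import numpy as np

rng = np.random.default_rng(3)
n = 7
x = rng.standard_normal(n)

n1 = np.linalg.norm(x, 1)
n2 = np.linalg.norm(x, 2)

# (1/sqrt(n)) ||x||_1  <=  ||x||_2  <=  ||x||_1
equivalence = (n1 / np.sqrt(n) <= n2 <= n1)
```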

Convergence of vector sequences

A sequence of vectors x^(k), k = 1, ..., converges to a vector x with respect to the norm ||.|| if, by definition,

    lim_{k→∞} ||x^(k) - x|| = 0

Important point: because all norms in R^n are equivalent, convergence of x^(k) with respect to a given norm implies convergence with respect to any other norm.

Notation: lim_{k→∞} x^(k) = x.

Example: The sequence

    x^(k) = [ 1 + 1/k,  k/(k + log_2 k),  1/k ]^T

converges to

    x = [ 1, 1, 0 ]^T

Note: Convergence of x^(k) to x is the same as the convergence of each individual component x_i^(k) of x^(k) to the corresponding component x_i of x.
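The decay of the error for this example can be tabulated for a few values of k (the max-norm is used here, but by norm equivalence any norm would do):

```python
import numpy as np

def x_k(k):
    # the sequence from the example above
    return np.array([1 + 1/k, k / (k + np.log2(k)), 1/k])

x_limit = np.array([1.0, 1.0, 0.0])

errors = [np.linalg.norm(x_k(k) - x_limit, np.inf) for k in (10, 1000, 100000)]
```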

Matrix norms

Matrix norms can be defined by considering m × n matrices as vectors in R^{mn}. These norms satisfy the usual properties of vector norms, i.e.,

1. ||A|| ≥ 0 for all A ∈ C^{m×n}, and ||A|| = 0 iff A = 0.
2. ||αA|| = |α| ||A|| for all A ∈ C^{m×n}, α ∈ C.
3. ||A + B|| ≤ ||A|| + ||B|| for all A, B ∈ C^{m×n}.

However, these will in general lack the right properties for composition of operators (products of matrices). The case of ||.||_2 yields the Frobenius norm of matrices.

Given a matrix A in C^{m×n}, define the family of matrix norms

    ||A||_p = max_{x ∈ C^n, x ≠ 0} ||Ax||_p / ||x||_p

These norms satisfy the usual properties of vector norms (see previous page). The matrix norm ||.||_p is said to be induced by the vector norm ||.||_p. Again, the important cases are p = 1, 2, ∞.
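The definition says ||A||_p is the largest possible ratio ||Ax||_p / ||x||_p. One consequence is easy to check numerically: no particular x can produce a ratio above the induced norm. A sketch for p = 2, where NumPy computes the induced norm directly:

```python
import numpy as np

rng = np.random.default_rng(4)
A = rng.standard_normal((3, 3))

exact = np.linalg.norm(A, 2)          # induced 2-norm (largest singular value)

# sample many directions x; the ratio can approach but never exceed ||A||_2
best_ratio = 0.0
for _ in range(2000):
    x = rng.standard_normal(3)
    best_ratio = max(best_ratio, np.linalg.norm(A @ x, 2) / np.linalg.norm(x, 2))

bounded = best_ratio <= exact + 1e-12
```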

Consistency / sub-multiplicativity of matrix norms

A fundamental property of matrix norms is consistency:

    ||AB||_p ≤ ||A||_p ||B||_p

[Also termed "sub-multiplicativity".]

Consequence: ||A^k||_p ≤ ||A||_p^k, so A^k converges to zero if any of its p-norms is < 1. [Note: this is a sufficient but not a necessary condition.]
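Both the consistency inequality and its consequence can be seen on a small example (the matrices are arbitrary; the infinity norm is used because it is easy to read off as the largest row sum):

```python
import numpy as np

A = np.array([[0.5, 0.4],
              [0.0, 0.3]])
B = np.array([[1.0, 2.0],
              [3.0, 4.0]])

# consistency: ||AB|| <= ||A|| ||B||
consistent = (np.linalg.norm(A @ B, np.inf)
              <= np.linalg.norm(A, np.inf) * np.linalg.norm(B, np.inf))

# ||A||_inf = 0.9 < 1, so the powers of A must converge to zero
norm_lt_one = np.linalg.norm(A, np.inf) < 1
vanishes = np.linalg.norm(np.linalg.matrix_power(A, 200), np.inf) < 1e-12
```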

Frobenius norm of matrices

The Frobenius norm of a matrix is defined by

    ||A||_F = ( Σ_{j=1}^n Σ_{i=1}^m |a_{ij}|^2 )^{1/2}

This is the same as the 2-norm of the column vector in C^{mn} consisting of all the columns (respectively rows) of A stacked together. This norm is also consistent [but it is not induced by a vector norm].
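The definition and the "2-norm of the stacked entries" interpretation agree, as a short check shows:

```python
import numpy as np

A = np.array([[1.0, 2.0],
              [3.0, 4.0]])

fro = np.sqrt(np.sum(np.abs(A) ** 2))        # the definition above
stacked = np.linalg.norm(A.reshape(-1), 2)   # 2-norm of all entries stacked
builtin = np.linalg.norm(A, 'fro')           # NumPy's built-in Frobenius norm
```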

Exercise: Compute the Frobenius norms of the matrices

    1 1 1 2 1 1 0 1 5 0 3 2 1 1 2

Exercise: Prove that the Frobenius norm is consistent. [Hint: Use Cauchy-Schwarz.]

Exercise: Define the vector 1-norm of a matrix A as the 1-norm of the vector of stacked columns of A. Is this norm a consistent matrix norm? [Hint: The result is true. Use Cauchy-Schwarz to prove it.]

Expressions of standard matrix norms

Recall the notation (for square n × n matrices):

    ρ(A) = max_i |λ_i(A)|,    Tr(A) = Σ_{i=1}^n a_{ii} = Σ_{i=1}^n λ_i(A),

where λ_i(A), i = 1, 2, ..., n, are all the eigenvalues of A. Then:

    ||A||_1 = max_{j=1,...,n} Σ_{i=1}^m |a_{ij}|   (largest column sum),
    ||A||_∞ = max_{i=1,...,m} Σ_{j=1}^n |a_{ij}|   (largest row sum),
    ||A||_2 = [ ρ(A^H A) ]^{1/2} = [ ρ(A A^H) ]^{1/2},
    ||A||_F = [ Tr(A^H A) ]^{1/2} = [ Tr(A A^H) ]^{1/2}.
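All four expressions can be verified against NumPy's built-in norms on a random real matrix (so A^H = A^T here):

```python
import numpy as np

rng = np.random.default_rng(5)
A = rng.standard_normal((4, 3))

col_sum = np.max(np.abs(A).sum(axis=0))               # ||A||_1
row_sum = np.max(np.abs(A).sum(axis=1))               # ||A||_inf
two = np.sqrt(np.max(np.linalg.eigvalsh(A.T @ A)))    # rho(A^H A)^{1/2}
fro = np.sqrt(np.trace(A.T @ A))                      # Tr(A^H A)^{1/2}

all_match = (np.isclose(col_sum, np.linalg.norm(A, 1))
             and np.isclose(row_sum, np.linalg.norm(A, np.inf))
             and np.isclose(two, np.linalg.norm(A, 2))
             and np.isclose(fro, np.linalg.norm(A, 'fro')))
```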

The eigenvalues of A^H A are real and ≥ 0. Their square roots are the singular values of A (to be covered later). Thus ||A||_2 = the largest singular value of A, and ||A||_F = the 2-norm of the vector of all singular values of A.

Exercise: Compute the p-norms for p = 1, 2, ∞, F for the matrix

    A = [ 0  2
          0  1 ]

Exercise: Show that ρ(A) ≤ ||A|| for any matrix norm.
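One way to check a hand computation for the matrix in the exercise (this also illustrates the second exercise: the spectral radius sits below every one of these norms):

```python
import numpy as np

A = np.array([[0.0, 2.0],
              [0.0, 1.0]])

norm_1 = np.linalg.norm(A, 1)          # largest column sum: 2 + 1 = 3
norm_inf = np.linalg.norm(A, np.inf)   # largest row sum: 0 + 2 = 2
norm_2 = np.linalg.norm(A, 2)          # largest singular value: sqrt(5)
norm_f = np.linalg.norm(A, 'fro')      # sqrt(0 + 4 + 0 + 1) = sqrt(5)

rho = np.max(np.abs(np.linalg.eigvals(A)))   # spectral radius: 1
below_all = all(rho <= v for v in (norm_1, norm_inf, norm_2, norm_f))
```

Note that ||A||_2 = ||A||_F here: A has rank one, so it has only one nonzero singular value.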

Is ρ(a) a norm? 1. ρ(a) = A 2 when A is Hermitian (A H = A). True for this particular case... 2.... However, not true in general. For ( ) 0 1 A =, 0 0 we have ρ(a) = 0 while A 0. Also, triangle inequality not satisfied for the pair A, and B = A T. Indeed, ρ(a + B) = 1 while ρ(a) + ρ(b) = 0. 2-17 Text 3; GvL 2.2-2.3; AB: 1.1.7 Norms

A few properties of the 2-norm and the F-norm

Let A = u v^T (a rank-one matrix). Then ||A||_2 = ||u||_2 ||v||_2.

Exercise: Prove this result. In this case, what is ||A||_F?

For any A ∈ C^{m×n} and any unitary matrix Q ∈ C^{m×m} we have

    ||QA||_2 = ||A||_2,    ||QA||_F = ||A||_F.

Exercise: Show that the result is still true for any Q with orthonormal columns, i.e., when Q ∈ C^{p×m} with p > m.

Exercise: Let Q ∈ C^{n×n} be unitary. Do we have ||AQ||_2 = ||A||_2? ||AQ||_F = ||A||_F? What if Q ∈ C^{n×p}, with p < n?
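The rank-one formula and the unitary invariance can both be observed numerically (real data, so "unitary" means orthogonal here; Q is drawn from a QR factorization):

```python
import numpy as np

rng = np.random.default_rng(6)
u = rng.standard_normal(4)
v = rng.standard_normal(3)
A = np.outer(u, v)                    # A = u v^T, rank one

# ||u v^T||_2 = ||u||_2 ||v||_2
rank_one = np.isclose(np.linalg.norm(A, 2),
                      np.linalg.norm(u, 2) * np.linalg.norm(v, 2))

# a rank-one matrix has a single nonzero singular value,
# so its F-norm coincides with its 2-norm
f_matches = np.isclose(np.linalg.norm(A, 'fro'), np.linalg.norm(A, 2))

Q, _ = np.linalg.qr(rng.standard_normal((4, 4)))   # orthogonal Q
invariant = (np.isclose(np.linalg.norm(Q @ A, 2), np.linalg.norm(A, 2))
             and np.isclose(np.linalg.norm(Q @ A, 'fro'),
                            np.linalg.norm(A, 'fro')))
```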