AMS526: Numerical Analysis I (Numerical Linear Algebra)
Lecture 2: Orthogonal Vectors and Matrices; Vector Norms
Xiangmin Jiao, SUNY Stony Brook

Outline
1. Orthogonal Vectors and Matrices
2. Vector Norms

Inner Product

The inner product (dot product) of two column vectors $u, v \in \mathbb{C}^m$ is $u^* v = \sum_{i=1}^m \bar{u}_i v_i$. In contrast, the outer product of $u$ and $v$ is $uv^*$. Note that the cross product is a different operation.

The Euclidean length of $u$ is the square root of the inner product of $u$ with itself, i.e., $\sqrt{u^* u}$.

The inner product of two unit vectors $u$ and $v$ is the cosine of the angle $\alpha$ between $u$ and $v$; in general, $\cos \alpha = \frac{u^* v}{\|u\|\,\|v\|}$.

The inner product is bilinear, in the sense that it is linear in each vector separately:
$(u_1 + u_2)^* v = u_1^* v + u_2^* v$
$u^* (v_1 + v_2) = u^* v_1 + u^* v_2$
$(\alpha u)^* (\beta v) = \bar{\alpha} \beta \, u^* v$
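As a quick illustration (a minimal NumPy sketch with made-up vectors, not part of the original slides), the inner product, outer product, Euclidean length, and angle can be computed as follows:

```python
import numpy as np

# Hypothetical example vectors in C^3 (chosen for illustration only)
u = np.array([1 + 1j, 2 - 1j, 3j])
v = np.array([2 + 0j, 1 + 1j, 1 - 1j])

inner = np.vdot(u, v)                 # u* v = sum_i conj(u_i) * v_i
outer = np.outer(u, v.conj())         # u v*: a 3x3 rank-one matrix
length = np.sqrt(np.vdot(u, u).real)  # Euclidean length sqrt(u* u)

# Cosine of the angle between two real vectors
x = np.array([1.0, 0.0])
y = np.array([1.0, 1.0])
cos_alpha = np.dot(x, y) / (np.linalg.norm(x) * np.linalg.norm(y))

print(inner, length, cos_alpha)       # cos_alpha ≈ 0.707, i.e., a 45-degree angle
```

Note that `np.vdot` conjugates its first argument, matching the convention $u^* v$ above.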

Orthogonal Vectors

Definition. A pair of vectors $x$ and $y$ are orthogonal if $x^* y = 0$. In other words, the angle between them is 90 degrees.

Definition. Two sets of vectors $X$ and $Y$ are orthogonal if every $x \in X$ is orthogonal to every $y \in Y$.

Definition. A set of nonzero vectors $S$ is orthogonal if its vectors are pairwise orthogonal. It is orthonormal if it is orthogonal and, in addition, each vector has unit Euclidean length.
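A small sketch of these definitions (the set below is a made-up example): an orthogonal set becomes orthonormal once each vector is divided by its length.

```python
import numpy as np

# S is a hypothetical orthogonal set; normalizing each vector yields an
# orthonormal set.
S = [np.array([2.0, 0.0, 0.0]),
     np.array([0.0, 3.0, 0.0]),
     np.array([0.0, 0.0, 1.5])]

pairwise = all(np.dot(S[i], S[j]) == 0
               for i in range(len(S)) for j in range(i + 1, len(S)))
print(pairwise)                                # True: pairwise orthogonal

T = [s / np.linalg.norm(s) for s in S]         # normalize each vector
print([float(np.linalg.norm(t)) for t in T])   # [1.0, 1.0, 1.0]
```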

Orthogonal Vectors

Theorem. The vectors in an orthogonal set $S$ are linearly independent.

Proof. By contradiction: if a vector in $S$ could be expressed as a linear combination of the other vectors in the set, then it would be orthogonal to itself; but the only vector orthogonal to itself is the zero vector, contradicting the assumption that the vectors in $S$ are nonzero.

Question: If the column vectors of an $m \times n$ matrix $A$ are orthogonal, what is the rank of $A$?

Answer: $n = \min\{m, n\}$. In other words, $A$ has full rank.
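A numerical spot-check of this answer (a sketch using QR to manufacture orthonormal columns; the sizes and random seed are arbitrary):

```python
import numpy as np

# Build a 5x3 matrix with orthonormal columns and confirm it has full rank.
m, n = 5, 3
rng = np.random.default_rng(0)
Q, _ = np.linalg.qr(rng.standard_normal((m, n)))  # Q is m x n, orthonormal columns

print(np.allclose(Q.T @ Q, np.eye(n)))  # True: columns are orthonormal
print(np.linalg.matrix_rank(Q))         # 3, i.e., n = min{m, n}: full rank
```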

Components of a Vector

Given an orthonormal set $\{q_1, q_2, \ldots, q_m\}$ forming a basis of $\mathbb{C}^m$, a vector $v$ can be decomposed into orthogonal components as $v = \sum_{i=1}^m (q_i^* v) q_i$.

Another way to express this decomposition is $v = \sum_{i=1}^m (q_i q_i^*) v$. Each $q_i q_i^*$ is an orthogonal projection matrix. Note that it is NOT an orthogonal matrix.

More generally, given an orthonormal set $\{q_1, q_2, \ldots, q_n\}$ with $n \le m$, we have
$v = r + \sum_{i=1}^n (q_i^* v) q_i = r + \sum_{i=1}^n (q_i q_i^*) v$, where $r^* q_i = 0$ for $1 \le i \le n$.

Let $Q$ be composed of the column vectors $\{q_1, q_2, \ldots, q_n\}$. Then $QQ^* = \sum_{i=1}^n q_i q_i^*$ is an orthogonal projection matrix.
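A sketch of this decomposition on made-up real data (so $*$ becomes transpose): project $v$ onto the span of the $q_i$ and check that the remainder $r$ is orthogonal to every $q_i$.

```python
import numpy as np

# Decompose v = r + sum_i (q_i* v) q_i, with n < m orthonormal vectors.
m, n = 4, 2
rng = np.random.default_rng(1)
Q, _ = np.linalg.qr(rng.standard_normal((m, n)))  # orthonormal columns q_1, ..., q_n
v = rng.standard_normal(m)

P = Q @ Q.T        # QQ* = sum_i q_i q_i*: orthogonal projection onto span{q_i}
r = v - P @ v      # remainder, orthogonal to every q_i

print(np.allclose(Q.T @ r, 0))  # True: r* q_i = 0 for all i
print(np.allclose(P @ P, P))    # True: P is a projection matrix (P^2 = P)
```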

Unitary Matrices

Definition. A matrix $Q$ is unitary if $Q^* = Q^{-1}$, i.e., if $Q^* Q = QQ^* = I$. In the real case, we say the matrix is orthogonal.

Its column vectors are orthonormal; in other words, $q_i^* q_j = \delta_{ij}$, the Kronecker delta.

Question: What is the geometric meaning of multiplication by a unitary matrix?

Answer: It preserves angles and Euclidean length. In the real case, multiplication by an orthogonal matrix $Q$ is a rotation (if $\det(Q) = 1$) or a reflection (if $\det(Q) = -1$).
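To make this concrete, here is a sketch with a 2-D rotation and a 2-D reflection (the angle and test vector are arbitrary choices): both matrices are orthogonal and preserve Euclidean length, and their determinants distinguish them.

```python
import numpy as np

theta = np.pi / 3
R = np.array([[np.cos(theta), -np.sin(theta)],
              [np.sin(theta),  np.cos(theta)]])  # rotation by theta: det = +1
F = np.array([[1.0,  0.0],
              [0.0, -1.0]])                      # reflection across x-axis: det = -1

x = np.array([3.0, 4.0])
print(np.allclose(R.T @ R, np.eye(2)))           # True: R is orthogonal
print(np.linalg.norm(x), np.linalg.norm(R @ x), np.linalg.norm(F @ x))  # all 5.0
print(np.linalg.det(R), np.linalg.det(F))        # 1.0 and -1.0
```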

Outline
1. Orthogonal Vectors and Matrices
2. Vector Norms

Definition of Norms

A norm captures the size of a vector or the distance between vectors. There are many different measures of size, but a norm must satisfy some requirements:

Definition. A norm is a function $\|\cdot\| : \mathbb{C}^m \to \mathbb{R}$ that assigns a real-valued length to each vector. It must satisfy the following conditions:
(1) $\|x\| \ge 0$, and $\|x\| = 0$ only if $x = 0$,
(2) $\|x + y\| \le \|x\| + \|y\|$,
(3) $\|\alpha x\| = |\alpha| \, \|x\|$.

An example is the Euclidean length (i.e., $\|x\| = \sqrt{\sum_{i=1}^m |x_i|^2}$).
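A numerical spot-check of the three conditions for the Euclidean norm (an illustration on sample vectors, not a proof):

```python
import numpy as np

x = np.array([1.0, -2.0, 2.0])
y = np.array([0.5, 0.5, -1.0])
alpha = -3.0

print(np.linalg.norm(x) >= 0)                                          # condition (1)
print(np.linalg.norm(x + y) <= np.linalg.norm(x) + np.linalg.norm(y))  # condition (2)
print(np.isclose(np.linalg.norm(alpha * x), abs(alpha) * np.linalg.norm(x)))  # condition (3)
```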

p-Norms

The Euclidean length is a special case of the p-norms, defined for $1 \le p \le \infty$ as
$\|x\|_p = \left( \sum_{i=1}^m |x_i|^p \right)^{1/p}$

The Euclidean norm is the 2-norm $\|x\|_2$ (i.e., $p = 2$).

1-norm: $\|x\|_1 = \sum_{i=1}^m |x_i|$

$\infty$-norm: $\|x\|_\infty$. What is its value?

Answer: $\|x\|_\infty = \max_{1 \le i \le m} |x_i|$

Why do we require $p \ge 1$? What happens if $0 < p < 1$?
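The common p-norms on a small sample vector (values worked out by hand in the comments):

```python
import numpy as np

x = np.array([3.0, -4.0, 1.0])

print(np.linalg.norm(x, 1))       # 1-norm: 3 + 4 + 1 = 8.0
print(np.linalg.norm(x, 2))       # 2-norm: sqrt(9 + 16 + 1) ≈ 5.099
print(np.linalg.norm(x, np.inf))  # inf-norm: max|x_i| = 4.0
```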

Weighted p-Norms

A generalization of the p-norm is the weighted p-norm, which assigns different weights (priorities) to different components. It is anisotropic instead of isotropic.

Algebraically, $\|x\|_W = \|Wx\|$, where $W$ is a diagonal matrix whose $i$th diagonal entry $w_i \ne 0$ is the weight for the $i$th component. In other words,
$\|x\|_W = \left( \sum_{i=1}^m |w_i x_i|^p \right)^{1/p}$

What happens if we allow $w_i = 0$?

Can we further generalize it to allow $W$ to be an arbitrary matrix? No. But we can allow $W$ to be an arbitrary nonsingular matrix.
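A sketch of the weighted 2-norm $\|x\|_W = \|Wx\|_2$ with a diagonal, nonsingular $W$ (the weights and vector below are made up for illustration):

```python
import numpy as np

x = np.array([1.0, 2.0, 3.0])
w = np.array([2.0, 1.0, 0.5])  # all weights nonzero, so W is nonsingular
W = np.diag(w)

via_matrix = np.linalg.norm(W @ x)           # ||W x||_2
via_sum = np.sum(np.abs(w * x) ** 2) ** 0.5  # the explicit sum formula, p = 2
print(np.isclose(via_matrix, via_sum))       # True

# If some w_i = 0, then ||x||_W = 0 for certain nonzero x, violating
# condition (1) of the norm definition, so the result is only a seminorm.
```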