Linear Algebra Review


January 29, 2013

Table of contents: Metrics, vector spaces, norms and Banach spaces, dot products and Hilbert spaces, matrices and matrix norms, eigensystems, singular value decomposition.

Metric: Given a space X, a mapping d : X × X → R⁺₀ is a metric if, for all x, y and z in X:
- d(x, y) = 0 is equivalent to x = y
- d(x, y) = d(y, x)
- d(x, y) ≤ d(x, z) + d(z, y)

Example of a metric: Euclidean distance. Given X = R^n, define d(x, y) := (Σ_{i=1}^n (x_i − y_i)^2)^{1/2}. Then:
- d(a, b) = 0 is equivalent to a = b
- d(a, b) = d(b, a)
- d(a, b) ≤ d(a, c) + d(c, b) (this is the triangle inequality)
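As a quick numerical illustration (not part of the original slides), the following numpy sketch evaluates the Euclidean distance and spot-checks the three metric properties on random vectors; the helper name `euclidean` is ours.

```python
import numpy as np

def euclidean(x, y):
    # d(x, y) = sqrt(sum_i (x_i - y_i)^2)
    return np.sqrt(np.sum((x - y) ** 2))

rng = np.random.default_rng(0)
a, b, c = rng.standard_normal((3, 5))   # three random vectors in R^5

print(euclidean(a, a))                                        # 0: d(a, a) = 0
print(np.isclose(euclidean(a, b), euclidean(b, a)))           # symmetry
print(euclidean(a, b) <= euclidean(a, c) + euclidean(c, b))   # triangle inequality
```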

Vector space: A vector space is a space X such that for all x, y ∈ X and for all α ∈ R:
- x + y ∈ X
- αx ∈ X

Examples of vector spaces:
- Real numbers: given x, y ∈ R and α ∈ R, we have x + y ∈ R and αx ∈ R.
- R^n: given x, y ∈ R^n and α ∈ R, we have x + y ∈ R^n and αx ∈ R^n.

Examples of vector spaces: Polynomials. Given f(x) = Σ_{i=0}^n a_i x^i, g(x) = Σ_{i=0}^n b_i x^i, and α ∈ R:
- f(x) + g(x) = Σ_{i=0}^n (a_i + b_i) x^i, i.e. again a polynomial of order n
- αf(x) = Σ_{i=0}^n α a_i x^i, i.e. again a polynomial of order n

Cauchy sequence: Given a metric space X, a Cauchy sequence is a sequence (x_i) in X such that for every ε > 0 there exists an n_0 such that for all m, n ≥ n_0, d(x_m, x_n) ≤ ε.

Completeness: A space X is complete if every Cauchy sequence has its limit in X. For example, (0, 1) is not complete but [0, 1] is. The set Q of rational numbers is not complete: you can construct a rational sequence that converges to √2, but √2 is not in Q.

Norm: Given a vector space X, a norm is a mapping ‖·‖ : X → R⁺₀ that satisfies, for all x, y ∈ X and for all α ∈ R:
- ‖x‖ = 0 if and only if x = 0
- ‖αx‖ = |α| ‖x‖
- ‖x + y‖ ≤ ‖x‖ + ‖y‖ (triangle inequality)
Every norm induces a metric: d(x, y) := ‖x − y‖.

Banach space: A Banach space is a complete vector space X together with a norm ‖·‖.
- ℓ_p^m spaces: R^m with the norm ‖x‖ := (Σ_{i=1}^m |x_i|^p)^{1/p}
- ℓ_p spaces: subspaces of R^N (infinite sequences) with ‖x‖ := (Σ_{i=1}^∞ |x_i|^p)^{1/p}
- Function spaces L_p(X): functions over X with ‖f‖ := (∫_X |f(x)|^p dx)^{1/p}
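A small numpy sketch (ours, not from the slides) that computes the ℓ_p norm of a vector in R^m directly from the definition and compares it with numpy's built-in norm; the helper name `lp_norm` is an assumption for illustration.

```python
import numpy as np

def lp_norm(x, p):
    # ||x||_p = (sum_i |x_i|^p)^(1/p)
    return np.sum(np.abs(x) ** p) ** (1.0 / p)

x = np.array([3.0, -4.0, 12.0])
for p in (1, 2, 3):
    # the hand-rolled value and numpy's vector norm agree
    print(p, lp_norm(x, p), np.linalg.norm(x, p))
```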

Dot product: Given a vector space X, a dot product is a mapping ⟨·,·⟩ : X × X → R that satisfies, for all x, y and z ∈ X and for all α ∈ R:
- Symmetry: ⟨x, y⟩ = ⟨y, x⟩
- Linearity: ⟨x, αy⟩ = α ⟨x, y⟩
- Additivity: ⟨x, y + z⟩ = ⟨x, y⟩ + ⟨x, z⟩
- Positive definiteness: ⟨x, x⟩ ≥ 0, with ⟨x, x⟩ = 0 only for x = 0 (this is what lets the dot product induce a norm below)

Hilbert space: A Hilbert space is a complete vector space X together with a dot product ⟨·,·⟩. The dot product automatically generates a norm: ‖x‖ := √⟨x, x⟩. Hilbert spaces are special cases of Banach spaces.

Examples of Hilbert spaces:
- Euclidean spaces with the standard dot product: for x, y ∈ R^m, ⟨x, y⟩ = Σ_{i=1}^m x_i y_i
- Function spaces L_2(X): functions f : X → C with the dot product ⟨f, g⟩ = ∫_X f(x) g(x)* dx (with a complex conjugate on the second factor)
- ℓ_2, infinite sequences of real numbers (R^N): ⟨x, y⟩ = Σ_{i=1}^∞ x_i y_i
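For the Euclidean case, here is a brief numpy check (an illustration we added) that the standard dot product is symmetric, linear, and additive, and that it induces the 2-norm via ‖x‖ = √⟨x, x⟩.

```python
import numpy as np

rng = np.random.default_rng(1)
x, y, z = rng.standard_normal((3, 4))   # random vectors in R^4
alpha = 2.5

print(np.isclose(x @ y, y @ x))                       # symmetry
print(np.isclose(x @ (alpha * y), alpha * (x @ y)))   # linearity
print(np.isclose(x @ (y + z), x @ y + x @ z))         # additivity
print(np.isclose(np.sqrt(x @ x), np.linalg.norm(x)))  # induced norm = Euclidean norm
```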

Matrices: A matrix M ∈ R^{m×n} corresponds to a linear map from R^n to R^m. A symmetric matrix M ∈ R^{m×m} satisfies M_ij = M_ji. An anti-symmetric matrix M ∈ R^{m×m} satisfies M_ij = −M_ji. Rank: denote by I the image of R^n under M; rank(M) is the smallest number of vectors that span I.

Matrices: orthogonality. A matrix M ∈ R^{m×m} is orthogonal if M^T M = I. This means M^T = M^{−1}. The columns of an orthogonal matrix are mutually orthonormal, and so are its rows.
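A quick numpy illustration (ours): the Q factor of a QR decomposition of a generic matrix is orthogonal, so Q^T Q = I and Q^T acts as Q^{−1}.

```python
import numpy as np

rng = np.random.default_rng(2)
A = rng.standard_normal((4, 4))
Q, _ = np.linalg.qr(A)                      # Q is orthogonal for a generic A

print(np.allclose(Q.T @ Q, np.eye(4)))      # Q^T Q = I (columns are orthonormal)
print(np.allclose(Q @ Q.T, np.eye(4)))      # rows are orthonormal too
print(np.allclose(Q.T, np.linalg.inv(Q)))   # Q^T = Q^{-1}
```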

Matrix norms: The operator norm of a linear operator A between two Banach spaces X and Y is
‖A‖ := max_{x ∈ X, x ≠ 0} ‖Ax‖ / ‖x‖.
This is indeed a norm:
- ‖αA‖ = max_x ‖αAx‖ / ‖x‖ = |α| ‖A‖
- ‖A + B‖ = max_x ‖(A + B)x‖ / ‖x‖ ≤ max_x ‖Ax‖ / ‖x‖ + max_x ‖Bx‖ / ‖x‖ = ‖A‖ + ‖B‖
- ‖A‖ = 0 implies max_x ‖Ax‖ / ‖x‖ = 0, and thus Ax = 0 for all x, i.e. A = 0

Matrix norms: Frobenius norm (in analogy with the Euclidean vector norm):
‖M‖²_Frob = Σ_{i=1}^m Σ_{j=1}^m M_ij²
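The entrywise definition can be checked directly against numpy's built-in Frobenius norm (a small sketch we added):

```python
import numpy as np

rng = np.random.default_rng(3)
M = rng.standard_normal((3, 3))

frob_sq = np.sum(M ** 2)   # sum_i sum_j M_ij^2
print(np.isclose(frob_sq, np.linalg.norm(M, 'fro') ** 2))
```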

Eigensystems: Given M ∈ R^{m×m}, λ ∈ R is an eigenvalue with eigenvector x ∈ R^m, x ≠ 0, if Mx = λx.

Eigensystems, symmetric matrices: For symmetric matrices all eigenvalues are real and the matrix is fully diagonalizable (i.e. it has m linearly independent eigenvectors). Eigenvectors with different eigenvalues are mutually orthogonal. Proof, for two eigenvectors x and x′ with respective eigenvalues λ and λ′:
λ x^T x′ = (Mx)^T x′ = x^T (M^T x′) = x^T (Mx′) = λ′ x^T x′,
so λ = λ′ or x^T x′ = 0. We can therefore decompose M = O^T Λ O with O orthogonal and Λ = diag(λ_1, ..., λ_m).
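A numpy sketch of this decomposition, added here for illustration. Note that np.linalg.eigh returns the eigenvectors as the columns of a matrix V with M = V Λ V^T, so V^T plays the role of O in the slides' M = O^T Λ O.

```python
import numpy as np

rng = np.random.default_rng(4)
A = rng.standard_normal((4, 4))
M = (A + A.T) / 2                        # build a symmetric matrix

lam, V = np.linalg.eigh(M)               # real eigenvalues, orthonormal eigenvectors (columns of V)
O = V.T                                  # slides' convention: M = O^T Lambda O

print(np.isrealobj(lam))                               # eigenvalues are real
print(np.allclose(V.T @ V, np.eye(4)))                 # eigenvectors are mutually orthonormal
print(np.allclose(O.T @ np.diag(lam) @ O, M))          # M = O^T Lambda O
```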

Eigensystems, symmetric matrices: We also have, for the operator norm,
‖M‖² = max_{x ∈ R^m, x ≠ 0} ‖Mx‖² / ‖x‖² = max_{‖x‖ = 1} ‖Mx‖² = max_{‖x‖ = 1} x^T M^T M x = max_{‖x‖ = 1} x^T O^T Λ O O^T Λ O x = max_{‖x‖ = 1} x^T O^T Λ² O x = max_{‖y‖ = 1} y^T Λ² y = max_{i ∈ [m]} λ_i²,
where we substituted y = Ox, which ranges over all unit vectors since O is orthogonal.

Eigensystems, symmetric matrices: For the Frobenius norm,
‖M‖²_Frob = tr(M M^T) = tr(O^T Λ O O^T Λ O) = tr(O^T Λ² O) = tr(Λ² O O^T) = tr(Λ²) = Σ_{i=1}^m λ_i².
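Both identities are easy to verify numerically for a random symmetric matrix (our sketch, using numpy; np.linalg.norm with ord=2 returns the operator norm of a matrix):

```python
import numpy as np

rng = np.random.default_rng(5)
A = rng.standard_normal((5, 5))
M = (A + A.T) / 2                       # symmetric matrix
lam = np.linalg.eigvalsh(M)             # its (real) eigenvalues

print(np.isclose(np.linalg.norm(M, 2) ** 2, np.max(lam ** 2)))       # operator norm^2 = max_i lambda_i^2
print(np.isclose(np.linalg.norm(M, 'fro') ** 2, np.sum(lam ** 2)))   # Frobenius norm^2 = sum_i lambda_i^2
```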

Matrices: invariants.
Trace: tr(M) = Σ_{i=1}^m M_ii, and tr(AB) = tr(BA). For symmetric matrices: tr(M) = tr(O^T Λ O) = tr(Λ O O^T) = tr(Λ) = Σ_{i=1}^m λ_i.
Determinant: det(M) = Π_{i=1}^m λ_i.
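These invariants can be confirmed numerically (an added sketch): the trace equals the sum of the eigenvalues, the determinant equals their product, and tr(AB) = tr(BA).

```python
import numpy as np

rng = np.random.default_rng(6)
A, B = rng.standard_normal((2, 3, 3))   # two random 3x3 matrices
S = (A + A.T) / 2                       # symmetric example
lam = np.linalg.eigvalsh(S)

print(np.isclose(np.trace(A @ B), np.trace(B @ A)))   # tr(AB) = tr(BA)
print(np.isclose(np.trace(S), np.sum(lam)))           # tr(M) = sum of eigenvalues
print(np.isclose(np.linalg.det(S), np.prod(lam)))     # det(M) = product of eigenvalues
```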

Positive definiteness: A positive definite matrix is a matrix M ∈ R^{m×m} such that x^T M x > 0 for all x ∈ R^m with x ≠ 0. Such a matrix has only positive eigenvalues: for an eigenvector x with eigenvalue λ, x^T M x = λ x^T x = λ ‖x‖² > 0, so λ > 0. A (symmetric) positive definite M induces a norm: ‖x‖²_M = x^T M x.
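A short numpy sketch (ours) that builds a positive definite matrix, checks the positivity of its eigenvalues, and evaluates the induced norm ‖x‖²_M = x^T M x:

```python
import numpy as np

rng = np.random.default_rng(7)
A = rng.standard_normal((4, 4))
M = A @ A.T + np.eye(4)                     # A A^T + I is symmetric positive definite

print(np.all(np.linalg.eigvalsh(M) > 0))    # all eigenvalues are positive

x = rng.standard_normal(4)
norm_M_sq = x @ M @ x                       # induced norm: ||x||_M^2 = x^T M x
print(norm_M_sq > 0)                        # strictly positive for x != 0
```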

Singular value decomposition: We want a similar decomposition for an arbitrary matrix M ∈ R^{m×n} with m ≥ n:
M = U Λ O, where
- U ∈ R^{m×n} with U^T U = I,
- O ∈ R^{n×n} with O^T O = I,
- Λ = diag(λ_1, λ_2, ..., λ_n) holds the singular values.
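Finally, a numpy sketch (added for illustration) of the thin SVD for an m×n matrix with m ≥ n. np.linalg.svd with full_matrices=False returns the third factor directly in the role of O from the slides' M = U Λ O.

```python
import numpy as np

rng = np.random.default_rng(8)
M = rng.standard_normal((5, 3))                     # m = 5, n = 3, m >= n

U, s, O = np.linalg.svd(M, full_matrices=False)     # thin SVD: M = U diag(s) O

print(np.allclose(U.T @ U, np.eye(3)))              # U^T U = I
print(np.allclose(O.T @ O, np.eye(3)))              # O^T O = I
print(np.allclose(U @ np.diag(s) @ O, M))           # reconstruction M = U Lambda O
```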