Basic Calculus Review


Basic Calculus Review Lorenzo Rosasco ISML Mod. 2 - Machine Learning

Vector Spaces Functionals and Operators (Matrices)

Vector Space
A vector space is a set V with binary operations + : V × V → V and · : R × V → V such that for all a, b ∈ R and v, w, x ∈ V:
1. v + w = w + v
2. (v + w) + x = v + (w + x)
3. There exists 0 ∈ V such that v + 0 = v for all v ∈ V
4. For every v ∈ V there exists −v ∈ V such that v + (−v) = 0
5. a(bv) = (ab)v
6. 1v = v
7. (a + b)v = av + bv
8. a(v + w) = av + aw
Examples: Rⁿ, the space of polynomials, the space of functions.

Inner Product
An inner product is a function ⟨·, ·⟩ : V × V → R such that for all a, b ∈ R and v, w, x ∈ V:
1. ⟨v, w⟩ = ⟨w, v⟩
2. ⟨av + bw, x⟩ = a⟨v, x⟩ + b⟨w, x⟩
3. ⟨v, v⟩ ≥ 0, and ⟨v, v⟩ = 0 if and only if v = 0.
v, w ∈ V are orthogonal if ⟨v, w⟩ = 0.
Given W ⊆ V, we have V = W ⊕ W⊥, where W⊥ = { v ∈ V : ⟨v, w⟩ = 0 for all w ∈ W }.
Cauchy-Schwarz inequality: |⟨v, w⟩| ≤ ⟨v, v⟩^(1/2) ⟨w, w⟩^(1/2).
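For the standard inner product on Rⁿ, orthogonality and the Cauchy-Schwarz inequality are easy to check numerically; a minimal NumPy sketch (the vectors are chosen arbitrarily):

```python
import numpy as np

# Cauchy-Schwarz: |<v, w>| <= <v, v>^(1/2) <w, w>^(1/2).
v = np.array([1.0, 2.0, 3.0])
w = np.array([-2.0, 0.5, 4.0])

lhs = abs(np.dot(v, w))
rhs = np.sqrt(np.dot(v, v)) * np.sqrt(np.dot(w, w))
print(lhs <= rhs)  # True

# Orthogonality: <u, z> = 0.
u = np.array([1.0, 0.0])
z = np.array([0.0, 5.0])
print(np.dot(u, z) == 0.0)  # True
```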

Norm
A norm is a function ‖·‖ : V → R such that for all a ∈ R and v, w ∈ V:
1. ‖v‖ ≥ 0, and ‖v‖ = 0 if and only if v = 0
2. ‖av‖ = |a| ‖v‖
3. ‖v + w‖ ≤ ‖v‖ + ‖w‖
A norm can be defined from an inner product: ‖v‖ = ⟨v, v⟩^(1/2).

Metric
A metric is a function d : V × V → R such that for all v, w, x ∈ V:
1. d(v, w) ≥ 0, and d(v, w) = 0 if and only if v = w
2. d(v, w) = d(w, v)
3. d(v, w) ≤ d(v, x) + d(x, w)
A metric can be defined from a norm: d(v, w) = ‖v − w‖.
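In Rⁿ the chain inner product → norm → metric is concrete; a minimal NumPy sketch:

```python
import numpy as np

v = np.array([3.0, 4.0])
w = np.array([0.0, 0.0])

# Norm induced by the inner product: ||v|| = <v, v>^(1/2).
norm_v = np.sqrt(np.dot(v, v))
print(norm_v)                 # 5.0

# Metric induced by the norm: d(v, w) = ||v - w||.
print(np.linalg.norm(v - w))  # 5.0
```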

Basis
B = {v_1, ..., v_n} is a basis of V if every v ∈ V can be uniquely decomposed as
v = a_1 v_1 + ... + a_n v_n
for some a_1, ..., a_n ∈ R.
An orthonormal basis is a basis that is orthogonal (⟨v_i, v_j⟩ = 0 for i ≠ j) and normalized (‖v_i‖ = 1).
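Any linearly independent set can be turned into an orthonormal basis of its span by Gram-Schmidt; a minimal NumPy sketch (the function `gram_schmidt` is our own, not a library routine):

```python
import numpy as np

def gram_schmidt(vectors):
    """Orthonormalize a list of linearly independent vectors
    (classical Gram-Schmidt)."""
    basis = []
    for v in vectors:
        # Subtract the projections onto the basis vectors found so far.
        w = v - sum(np.dot(v, b) * b for b in basis)
        basis.append(w / np.linalg.norm(w))
    return basis

# Orthonormalize two vectors in R^2.
e1, e2 = gram_schmidt([np.array([3.0, 1.0]), np.array([2.0, 2.0])])
print(np.dot(e1, e2))      # ~0: orthogonal
print(np.linalg.norm(e1))  # 1.0: normalized
```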

Vector Spaces Functionals and Operators (Matrices)

Maps
Next we review basic properties of maps on a Hilbert space H:
functionals: Ψ : H → R
linear operators A : H → H, such that A(af + bg) = aAf + bAg, with a, b ∈ R and f, g ∈ H.

Representation of Continuous Functionals
Let H be a Hilbert space and g ∈ H; then Ψ_g(f) = ⟨f, g⟩, f ∈ H, is a continuous linear functional.
Riesz representation theorem: every continuous linear functional Ψ on H can be written uniquely in the form Ψ(f) = ⟨f, g⟩ for some g ∈ H.

Matrix
Every linear operator L : Rⁿ → Rᵐ can be represented by an m × n matrix A.
If A ∈ R^(m×n), the transpose of A is the matrix Aᵀ ∈ R^(n×m) satisfying
⟨Ax, y⟩_Rᵐ = (Ax)ᵀ y = xᵀ Aᵀ y = ⟨x, Aᵀy⟩_Rⁿ
for every x ∈ Rⁿ and y ∈ Rᵐ.
A is symmetric if Aᵀ = A.
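The defining identity of the transpose, ⟨Ax, y⟩ = ⟨x, Aᵀy⟩, can be checked numerically; a minimal NumPy sketch with m = 3, n = 2 and a random matrix:

```python
import numpy as np

rng = np.random.default_rng(0)
A = rng.standard_normal((3, 2))  # A in R^(3x2), so A : R^2 -> R^3
x = rng.standard_normal(2)       # x in R^n
y = rng.standard_normal(3)       # y in R^m

lhs = np.dot(A @ x, y)    # <Ax, y> in R^m
rhs = np.dot(x, A.T @ y)  # <x, A^T y> in R^n
print(np.isclose(lhs, rhs))  # True
```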

Eigenvalues and Eigenvectors
Let A ∈ R^(n×n). A nonzero vector v ∈ Rⁿ is an eigenvector of A with corresponding eigenvalue λ ∈ R if Av = λv.
Symmetric matrices have real eigenvalues.
Spectral Theorem: Let A be a symmetric n × n matrix. Then there is an orthonormal basis of Rⁿ consisting of eigenvectors of A.
Eigendecomposition: A = VΛVᵀ, or equivalently,
A = Σ_{i=1}^{n} λ_i v_i v_iᵀ.
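The eigendecomposition of a symmetric matrix can be computed and verified with NumPy; a minimal sketch:

```python
import numpy as np

# A symmetric matrix and its eigendecomposition A = V Lambda V^T.
A = np.array([[2.0, 1.0],
              [1.0, 2.0]])

eigvals, V = np.linalg.eigh(A)  # eigh: eigensolver for symmetric matrices
Lam = np.diag(eigvals)

# Reconstruct A from the decomposition.
print(np.allclose(A, V @ Lam @ V.T))  # True

# Equivalently, A = sum_i lambda_i v_i v_i^T.
A_sum = sum(l * np.outer(v, v) for l, v in zip(eigvals, V.T))
print(np.allclose(A, A_sum))          # True
```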

Singular Value Decomposition
Every A ∈ R^(m×n) can be written as A = UΣVᵀ, where U ∈ R^(m×m) is orthogonal, Σ ∈ R^(m×n) is diagonal, and V ∈ R^(n×n) is orthogonal.
Singular system:
Av_i = σ_i u_i        AAᵀu_i = σ_i² u_i
Aᵀu_i = σ_i v_i       AᵀAv_i = σ_i² v_i
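The singular system relations can be verified directly from NumPy's SVD; a minimal sketch for a random 3 × 2 matrix:

```python
import numpy as np

rng = np.random.default_rng(1)
A = rng.standard_normal((3, 2))

# A = U @ Sigma @ Vt; s holds the singular values sigma_i in decreasing order.
U, s, Vt = np.linalg.svd(A)
v1, u1, s1 = Vt[0], U[:, 0], s[0]  # first singular triple

print(np.allclose(A @ v1, s1 * u1))           # A v_i = sigma_i u_i
print(np.allclose(A.T @ u1, s1 * v1))         # A^T u_i = sigma_i v_i
print(np.allclose(A.T @ A @ v1, s1**2 * v1))  # A^T A v_i = sigma_i^2 v_i
```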

Matrix Norm
The spectral norm of A ∈ R^(m×n) is
‖A‖_spec = σ_max(A) = √(λ_max(AAᵀ)) = √(λ_max(AᵀA)).
The Frobenius norm of A ∈ R^(m×n) is
‖A‖_F = √(Σ_{i=1}^{m} Σ_{j=1}^{n} a_ij²) = √(Σ_{i=1}^{min{m,n}} σ_i²).
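Both norms can be read off the singular values; a minimal NumPy sketch:

```python
import numpy as np

rng = np.random.default_rng(2)
A = rng.standard_normal((3, 2))

s = np.linalg.svd(A, compute_uv=False)  # singular values sigma_i

# Spectral norm = largest singular value.
print(np.isclose(np.linalg.norm(A, 2), s.max()))                    # True
# Frobenius norm = sqrt(sum of squared singular values).
print(np.isclose(np.linalg.norm(A, 'fro'), np.sqrt((s**2).sum())))  # True
```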

Positive Definite Matrix
A real symmetric matrix A ∈ R^(m×m) is positive definite if xᵀAx > 0 for all nonzero x ∈ Rᵐ. A positive definite matrix has positive eigenvalues.
Note: for positive semi-definite matrices, > is replaced by ≥.
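Since a symmetric matrix is positive definite exactly when all its eigenvalues are positive, the property is easy to test; a minimal NumPy sketch:

```python
import numpy as np

# A symmetric matrix is positive definite iff all eigenvalues are > 0.
A = np.array([[ 2.0, -1.0],
              [-1.0,  2.0]])

eigvals = np.linalg.eigvalsh(A)  # eigenvalues of a symmetric matrix
print(np.all(eigvals > 0))       # True: A is positive definite

# The quadratic form x^T A x is positive for a nonzero x:
x = np.array([1.0, 3.0])
print(x @ A @ x > 0)             # True
```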