Mathematical foundations - linear algebra

Andrea Passerini (passerini@disi.unitn.it)
Machine Learning

Vector space

Definition (over the reals). A set $X$ is called a vector space over $\mathbb{R}$ if addition and scalar multiplication are defined and satisfy, for all $x, y, z \in X$ and $\lambda, \mu \in \mathbb{R}$:

Addition:
- associative: $x + (y + z) = (x + y) + z$
- commutative: $x + y = y + x$
- identity element: there exists $0 \in X$ such that $x + 0 = x$
- inverse element: for all $x \in X$ there exists $x' \in X$ such that $x + x' = 0$

Scalar multiplication:
- distributive over vectors: $\lambda(x + y) = \lambda x + \lambda y$
- distributive over scalars: $(\lambda + \mu)x = \lambda x + \mu x$
- associative over scalars: $\lambda(\mu x) = (\lambda\mu)x$
- identity element: $1 \in \mathbb{R}$ satisfies $1x = x$

Properties and operations in vector spaces

- subspace: any non-empty subset of $X$ that is itself a vector space (e.g. the image of a projection)
- linear combination: given $\lambda_i \in \mathbb{R}$ and $x_i \in X$, the vector $\sum_{i=1}^n \lambda_i x_i$
- span: the span of vectors $x_1, \dots, x_n$ is the set of their linear combinations
\[
\left\{ \sum_{i=1}^n \lambda_i x_i \;:\; \lambda_i \in \mathbb{R} \right\}
\]
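The linear-combination and span definitions above can be checked numerically; the following is an illustrative sketch with made-up vectors, not part of the original slides:

```python
import numpy as np

# Two vectors in R^3 (made-up example data)
x1 = np.array([1.0, 0.0, 2.0])
x2 = np.array([0.0, 1.0, -1.0])

# A linear combination sum_i lambda_i * x_i with coefficients lambda = (3, -2)
lam = np.array([3.0, -2.0])
v = lam[0] * x1 + lam[1] * x2

# v lies in the span of {x1, x2}: adding it as a column does not increase the rank
rank_before = np.linalg.matrix_rank(np.column_stack([x1, x2]))
rank_after = np.linalg.matrix_rank(np.column_stack([x1, x2, v]))
```

Since `v` is built from `x1` and `x2`, the rank stays the same after appending it.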

Basis in vector space

Linear independence. A set of vectors $\{x_i\}$ is linearly independent if none of them can be written as a linear combination of the others.

Basis. A set of vectors $\{x_i\}$ is a basis for $X$ if any element of $X$ can be written uniquely as a linear combination of the vectors $x_i$. A necessary condition is that the vectors $x_i$ are linearly independent. All bases of $X$ have the same number of elements, called the dimension of the vector space.
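A quick numerical check of linear independence and of the uniqueness of basis coefficients, using numpy with made-up vectors (an illustration, not from the slides):

```python
import numpy as np

# Candidate basis of R^3 (made-up vectors)
x1 = np.array([1.0, 0.0, 0.0])
x2 = np.array([1.0, 1.0, 0.0])
x3 = np.array([1.0, 1.0, 1.0])
B = np.column_stack([x1, x2, x3])

# The vectors are linearly independent iff the rank equals their number
independent = np.linalg.matrix_rank(B) == 3

# For a basis, the coefficients of any y are unique: solve B @ lam = y
y = np.array([2.0, 3.0, 1.0])
lam = np.linalg.solve(B, y)
```

Here `lam` is the unique coefficient vector expressing `y` in this basis.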

Linear maps

Definition. Given two vector spaces $X, Z$, a function $f : X \to Z$ is a linear map if for all $x, y \in X$ and $\lambda \in \mathbb{R}$:
\[
f(x + y) = f(x) + f(y), \qquad f(\lambda x) = \lambda f(x)
\]

Linear maps as matrices

A linear map between two finite-dimensional spaces $X, Z$ of dimensions $n, m$ can always be written as a matrix. Let $\{x_1, \dots, x_n\}$ and $\{z_1, \dots, z_m\}$ be bases for $X$ and $Z$ respectively, and write the image of each basis vector as $f(x_i) = \sum_{j=1}^m a_{ij} z_j$. For any $x = \sum_{i=1}^n \lambda_i x_i \in X$ we then have:
\[
f(x) = f\left(\sum_{i=1}^n \lambda_i x_i\right)
= \sum_{i=1}^n \lambda_i f(x_i)
= \sum_{i=1}^n \lambda_i \sum_{j=1}^m a_{ij} z_j
= \sum_{j=1}^m \left(\sum_{i=1}^n \lambda_i a_{ij}\right) z_j
= \sum_{j=1}^m \mu_j z_j
\]

Linear maps as matrices

Matrix of basis transformation:
\[
M \in \mathbb{R}^{m \times n} = \begin{pmatrix} a_{11} & \cdots & a_{1n} \\ \vdots & & \vdots \\ a_{m1} & \cdots & a_{mn} \end{pmatrix}
\]
It maps the coefficients $\lambda$ of $x$ in the input basis to the coefficients $\mu$ of $f(x)$ in the output basis: $M\lambda = \mu$.
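A minimal sketch of the coefficient mapping $M\lambda = \mu$, with an arbitrarily chosen matrix (not from the slides):

```python
import numpy as np

# A linear map f: R^3 -> R^2 represented by M in R^{2x3} (made-up entries)
M = np.array([[1.0, 0.0, 2.0],
              [0.0, 3.0, 1.0]])

lam = np.array([1.0, 2.0, 3.0])  # coefficients of x in the input basis
mu = M @ lam                     # coefficients of f(x) in the output basis

# Linearity carries over to the matrix form: M(a*lam + b*lam2) = a*(M lam) + b*(M lam2)
lam2 = np.array([0.0, 1.0, -1.0])
lhs = M @ (2 * lam + 5 * lam2)
rhs = 2 * (M @ lam) + 5 * (M @ lam2)
```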

Matrix properties

- transpose: the matrix $M^T$ obtained by exchanging rows with columns; it satisfies $(MN)^T = N^T M^T$
- trace: the sum of the diagonal elements of a matrix, $\mathrm{tr}(M) = \sum_{i=1}^n M_{ii}$
- inverse: the matrix $M^{-1}$ which multiplied with the original matrix gives the identity: $M M^{-1} = I$
- rank: the rank of an $n \times m$ matrix is the dimension of the space spanned by its columns
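The four properties can be verified directly with numpy on small made-up matrices (illustrative only):

```python
import numpy as np

M = np.array([[2.0, 1.0],
              [0.0, 3.0]])
N = np.array([[1.0, 4.0],
              [2.0, 1.0]])

transpose_rule = np.allclose((M @ N).T, N.T @ M.T)  # (MN)^T = N^T M^T
trace_M = np.trace(M)                               # sum of diagonal elements
Minv = np.linalg.inv(M)
identity_ok = np.allclose(M @ Minv, np.eye(2))      # M M^{-1} = I
rank_M = np.linalg.matrix_rank(M)                   # dimension of the column space
```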

Metric structure

Norm. A function $\|\cdot\| : X \to \mathbb{R}^+_0$ is a norm if for all $x, y \in X$ and $\lambda \in \mathbb{R}$:
- $\|x + y\| \le \|x\| + \|y\|$
- $\|\lambda x\| = |\lambda| \, \|x\|$
- $\|x\| > 0$ if $x \ne 0$

Metric. A norm defines a metric $d : X \times X \to \mathbb{R}^+_0$ via $d(x, y) = \|x - y\|$.

Note. The concept of norm is stronger than that of metric: not every metric gives rise to a norm.
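The norm axioms and the induced metric, checked on made-up vectors with the Euclidean norm (an illustration, not from the slides):

```python
import numpy as np

x = np.array([3.0, -4.0])
y = np.array([1.0, 2.0])
lam = -2.0

triangle = np.linalg.norm(x + y) <= np.linalg.norm(x) + np.linalg.norm(y)
homogeneity = np.isclose(np.linalg.norm(lam * x), abs(lam) * np.linalg.norm(x))
positivity = np.linalg.norm(x) > 0

d_xy = np.linalg.norm(x - y)  # the metric induced by the norm: d(x, y) = ||x - y||
```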

Dot product

Bilinear form. A function $Q : X \times X \to \mathbb{R}$ is a bilinear form if for all $x, y, z \in X$ and $\lambda, \mu \in \mathbb{R}$:
\[
Q(\lambda x + \mu y, z) = \lambda Q(x, z) + \mu Q(y, z), \qquad
Q(x, \lambda y + \mu z) = \lambda Q(x, y) + \mu Q(x, z)
\]
A bilinear form is symmetric if $Q(x, y) = Q(y, x)$ for all $x, y \in X$.

Dot product

Dot product. A dot product $\langle \cdot, \cdot \rangle : X \times X \to \mathbb{R}$ is a symmetric bilinear form which is positive definite: $\langle x, x \rangle \ge 0$ for all $x \in X$. A strictly positive definite dot product additionally satisfies $\langle x, x \rangle = 0$ iff $x = 0$.

Norm. Any dot product defines a corresponding norm via $\|x\| = \sqrt{\langle x, x \rangle}$.

Properties of dot product

- angle: the angle $\theta$ between two vectors $x, z$ is defined by $\cos\theta = \dfrac{\langle x, z \rangle}{\|x\| \, \|z\|}$
- orthogonal: two vectors are orthogonal if $\langle x, y \rangle = 0$
- orthonormal: a set of vectors $\{x_1, \dots, x_n\}$ is orthonormal if $\langle x_i, x_j \rangle = \delta_{ij}$, where $\delta_{ij} = 1$ if $i = j$ and $0$ otherwise
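Angle, orthogonality and orthonormality on made-up vectors (illustrative sketch, not from the slides):

```python
import numpy as np

x = np.array([1.0, 0.0])
z = np.array([1.0, 1.0])

# cos(theta) = <x, z> / (||x|| ||z||)
cos_theta = np.dot(x, z) / (np.linalg.norm(x) * np.linalg.norm(z))
theta = np.arccos(cos_theta)  # pi/4 for these two vectors

orthogonal = np.isclose(np.dot(x, np.array([0.0, 5.0])), 0.0)

# An orthonormal set satisfies <x_i, x_j> = delta_ij, i.e. E^T E = I
E = np.array([[1.0, 0.0],
              [0.0, 1.0]])
orthonormal = np.allclose(E.T @ E, np.eye(2))
```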

Eigenvalues and eigenvectors

Definition. Given a matrix $M$, a real value $\lambda$ and a non-zero vector $x$ are an eigenvalue and corresponding eigenvector of $M$ if
\[
Mx = \lambda x
\]

Spectral decomposition. Given an $n \times n$ symmetric matrix $M$, it is possible to find an orthonormal set of $n$ eigenvectors. The matrix $V$ having the eigenvectors as columns is unitary, that is $V^T = V^{-1}$, and $M$ can be factorized as
\[
M = V \Lambda V^T
\]
where $\Lambda$ is the diagonal matrix of the eigenvalues corresponding to the eigenvectors in $V$.
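The spectral decomposition can be reproduced with numpy's symmetric eigensolver; the matrix below is made up for illustration:

```python
import numpy as np

M = np.array([[2.0, 1.0],
              [1.0, 2.0]])  # symmetric, made-up entries

eigvals, V = np.linalg.eigh(M)  # orthonormal eigenvectors as columns of V

unitary = np.allclose(V.T @ V, np.eye(2))  # V^T = V^{-1}
M_rebuilt = V @ np.diag(eigvals) @ V.T     # M = V Lambda V^T
factorization_ok = np.allclose(M_rebuilt, M)
```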

Eigenvalues and eigenvectors

Singular matrices. A matrix is singular if it has a zero eigenvalue: $Mx = 0x = 0$ for some $x \ne 0$. A singular matrix has linearly dependent columns $M_1, \dots, M_n$:
\[
M_1 x_1 + \dots + M_{n-1} x_{n-1} + M_n x_n = 0,
\]
so that (for $x_n \ne 0$)
\[
M_n = -M_1 \frac{x_1}{x_n} - \dots - M_{n-1} \frac{x_{n-1}}{x_n}
\]

Determinant. The determinant $|M|$ of an $n \times n$ matrix $M$ is the product of its eigenvalues. A matrix is invertible iff its determinant is non-zero, i.e. iff it is not singular.
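A singular example, checking the zero eigenvalue and the determinant-as-product-of-eigenvalues fact with numpy (made-up matrix):

```python
import numpy as np

# Singular by construction: the second column is twice the first
M = np.array([[1.0, 2.0],
              [2.0, 4.0]])

eigvals = np.linalg.eigvalsh(M)             # real eigenvalues (M is symmetric)
has_zero_eig = bool(np.any(np.isclose(eigvals, 0.0)))
det_M = np.linalg.det(M)
product_ok = np.isclose(det_M, np.prod(eigvals))  # det equals the product of eigenvalues
```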

Eigenvalues and eigenvectors

Symmetric matrices. Eigenvectors corresponding to distinct eigenvalues are orthogonal: if $Ax = \lambda x$ and $Az = \mu z$ with $A$ symmetric and $\lambda \ne \mu$, then
\[
\lambda \langle x, z \rangle = \langle Ax, z \rangle = (Ax)^T z = x^T A^T z = x^T A z = \langle x, Az \rangle = \mu \langle x, z \rangle,
\]
which forces $\langle x, z \rangle = 0$.

Note. An $n \times n$ symmetric matrix can have at most $n$ distinct eigenvalues.

Positive semi-definite matrix

Definition. An $n \times n$ symmetric matrix $M$ is positive semi-definite if all its eigenvalues are non-negative. Equivalent (necessary and sufficient) conditions:
- $x^T M x \ge 0$ for all $x \in \mathbb{R}^n$
- there exists a real matrix $B$ such that $M = B^T B$
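Both equivalent conditions can be exercised numerically; the matrix $B$ below is arbitrary, so $M = B^T B$ is positive semi-definite by construction (an illustration, not from the slides):

```python
import numpy as np

B = np.array([[1.0, 2.0],
              [0.0, 1.0],
              [3.0, -1.0]])  # arbitrary real matrix
M = B.T @ B                  # symmetric PSD by construction

eigvals = np.linalg.eigvalsh(M)
psd_by_eigs = bool(np.all(eigvals >= -1e-12))  # all eigenvalues non-negative

# Quadratic form: x^T M x = ||B x||^2 >= 0 for any x
x = np.array([0.7, -1.3])  # made-up test vector
quad = x @ M @ x
quad_matches = bool(np.isclose(quad, np.linalg.norm(B @ x) ** 2) and quad >= 0.0)
```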

Functional analysis

Cauchy sequence. A sequence $(x_i)_{i \in \mathbb{N}}$ in a normed space $X$ is called a Cauchy sequence if for all $\epsilon > 0$ there exists $\bar{n} \in \mathbb{N}$ such that $\|x_n - x_{n'}\| < \epsilon$ for all $n, n' > \bar{n}$. A Cauchy sequence converges to a point $x \in X$ if
\[
\lim_{n \to \infty} \|x_n - x\| = 0
\]

Banach and Hilbert spaces. A space is complete if all Cauchy sequences in the space converge (to a point in the space). A complete normed space is called a Banach space; a complete dot product space is called a Hilbert space.
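As a concrete worked instance (not in the slides), the real sequence $x_n = 1/n$ is Cauchy in $\mathbb{R}$ and converges to $0$:

```latex
\[
\|x_n - x_{n'}\| = \left|\frac{1}{n} - \frac{1}{n'}\right|
\le \frac{1}{n} + \frac{1}{n'}
< \frac{2}{\bar{n}}
\quad \text{for all } n, n' > \bar{n},
\]
so choosing $\bar{n} \ge 2/\epsilon$ gives $\|x_n - x_{n'}\| < \epsilon$, and
$\lim_{n\to\infty} \|x_n - 0\| = \lim_{n\to\infty} 1/n = 0$.
```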

Hilbert spaces

A Hilbert space can be (and often is) infinite-dimensional. Infinite-dimensional Hilbert spaces are usually required to be separable, i.e. to contain a countable dense subset: there exists a sequence $(x_i)_{i \in \mathbb{N}}$ of elements of the space such that for any $x \in X$ and any $\epsilon > 0$,
\[
\min_{i} \|x_i - x\| < \epsilon
\]