Linear Algebra (Review) Volker Tresp 2018


Vectors

k, M, N are scalars. A one-dimensional array c is a column vector; thus in two dimensions

c = [ c_1
      c_2 ]

c_i is the i-th component of c, and c^T = (c_1, c_2) is a row vector, the transpose of c.

Matrices

A two-dimensional array A is a matrix, e.g.,

A = [ a_{1,1}  a_{1,2}  a_{1,3}
      a_{2,1}  a_{2,2}  a_{2,3} ]

If A is an N × M matrix, then the transpose A^T is an M × N matrix: the columns (rows) of A are the rows (columns) of A^T, and vice versa.

A^T = [ a_{1,1}  a_{2,1}
        a_{1,2}  a_{2,2}
        a_{1,3}  a_{2,3} ]

Addition of Two Vectors

Let c = a + d. Then c_j = a_j + d_j.

Multiplication of a Vector with a Scalar

c = ka is a vector with c_j = k a_j.

C = kA is a matrix of the same dimensionality as A, with c_{i,j} = k a_{i,j}.
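The componentwise rules for vector addition and scalar multiplication can be sketched as follows (a minimal NumPy example; the values are arbitrary and not from the slides):

```python
import numpy as np

a = np.array([1.0, 2.0])
d = np.array([3.0, 4.0])
k = 2.0

c = a + d   # componentwise addition: c_j = a_j + d_j
s = k * a   # scalar times vector:    s_j = k * a_j

print(c)  # [4. 6.]
print(s)  # [2. 4.]
```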

Scalar Product of Two Vectors

The scalar product (also called dot product) is defined as

a · c = a^T c = Σ_{j=1}^M a_j c_j

and is a scalar. Special case: a^T a = Σ_{j=1}^M a_j^2.
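A quick numerical check of the scalar product formula (a NumPy sketch with arbitrary values):

```python
import numpy as np

a = np.array([1.0, 2.0, 3.0])
c = np.array([4.0, 5.0, 6.0])

# a·c = sum_j a_j c_j = 4 + 10 + 18
dot = a @ c
print(dot)    # 32.0
# special case a^T a: the sum of squares 1 + 4 + 9
print(a @ a)  # 14.0
```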

Matrix-Vector Product

A matrix consists of many row vectors, so the product of a matrix with a column vector consists of many scalar products. If A is an N × M matrix and c is an M-dimensional column vector, then d = Ac is an N-dimensional column vector with

d_i = Σ_{j=1}^M a_{i,j} c_j

Matrix-Matrix Product

A matrix also consists of many column vectors, so the product of a matrix with a matrix consists of many matrix-vector products. If A is an N × M matrix and C is an M × K matrix, then D = AC is an N × K matrix with

d_{i,k} = Σ_{j=1}^M a_{i,j} c_{j,k}
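The matrix-vector and matrix-matrix product rules above can be verified numerically (a NumPy sketch; the matrices are arbitrary examples):

```python
import numpy as np

A = np.array([[1.0, 2.0, 3.0],
              [4.0, 5.0, 6.0]])   # N x M matrix, N = 2, M = 3
c = np.array([1.0, 0.0, -1.0])    # M-dimensional column vector
C = np.array([[1.0, 0.0],
              [0.0, 1.0],
              [1.0, 1.0]])        # M x K matrix, K = 2

d = A @ c   # N-vector:     d_i     = sum_j a_{i,j} c_j
D = A @ C   # N x K matrix: d_{i,k} = sum_j a_{i,j} c_{j,k}

# each column of D is A applied to the corresponding column of C
assert np.allclose(D[:, 0], A @ C[:, 0])
```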

Multiplication of a Row Vector with a Matrix

The product of a row vector with a matrix is a row vector. If A is an N × M matrix, d is an N-dimensional vector, and c^T = d^T A, then c is an M-dimensional vector with

c_j = Σ_{i=1}^N d_i a_{i,j}

Outer Product

Special case: the product of a column vector with a row vector is a matrix. Let d be an N-dimensional vector and c an M-dimensional vector; then

A = d c^T

is an N × M matrix with a_{i,j} = d_i c_j. Example (N = 2, M = 3):

d c^T = [ d_1 c_1  d_1 c_2  d_1 c_3
          d_2 c_1  d_2 c_2  d_2 c_3 ]
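A minimal sketch of the outer product, matching the 2 × 3 example above (arbitrary values, using NumPy's `np.outer`):

```python
import numpy as np

d = np.array([1.0, 2.0])        # N = 2
c = np.array([3.0, 4.0, 5.0])   # M = 3

A = np.outer(d, c)              # N x M matrix with A[i, j] = d[i] * c[j]
print(A)
# [[ 3.  4.  5.]
#  [ 6.  8. 10.]]
```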

Matrix Transpose

The transpose A^T exchanges rows and columns. We have

(A^T)^T = A

(AC)^T = C^T A^T

Unit Matrix

I = [ 1  0  ...  0
      0  1  ...  0
      ...
      0  0  ...  1 ]

Diagonal Matrix

An N × N diagonal matrix:

A = [ a_{1,1}  0        ...  0
      0        a_{2,2}  ...  0
      ...
      0        0        ...  a_{N,N} ]

Matrix Inverse

Let A be an N × N square matrix. If the inverse matrix A^{-1} exists (it is then unique), we have

A^{-1} A = I,  A A^{-1} = I

If the corresponding inverses exist,

(AC)^{-1} = C^{-1} A^{-1}
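Both inverse identities can be checked numerically (a NumPy sketch; the matrices are arbitrary invertible examples):

```python
import numpy as np

A = np.array([[2.0, 1.0],
              [1.0, 1.0]])
C = np.array([[1.0, 2.0],
              [0.0, 1.0]])

A_inv = np.linalg.inv(A)
# A^{-1} A = I and A A^{-1} = I
assert np.allclose(A_inv @ A, np.eye(2))
assert np.allclose(A @ A_inv, np.eye(2))

# (AC)^{-1} = C^{-1} A^{-1}
lhs = np.linalg.inv(A @ C)
rhs = np.linalg.inv(C) @ A_inv
assert np.allclose(lhs, rhs)
```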

Orthogonal Matrices

Orthogonal matrix (more precisely: orthonormal matrix): a square matrix R is orthogonal if all of its columns are orthonormal. It follows (non-trivially) that all rows are orthonormal as well, and

R^T R = I,  R R^T = I,  R^{-1} = R^T
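A rotation matrix is a standard example of an orthogonal matrix; the identities above can be checked as follows (a NumPy sketch with an arbitrary angle):

```python
import numpy as np

theta = 0.3  # arbitrary rotation angle
R = np.array([[np.cos(theta), -np.sin(theta)],
              [np.sin(theta),  np.cos(theta)]])  # 2-D rotation matrix

# columns (and rows) are orthonormal, so R^T R = R R^T = I
print(np.allclose(R.T @ R, np.eye(2)))           # True
print(np.allclose(R @ R.T, np.eye(2)))           # True
# and the inverse is simply the transpose
print(np.allclose(np.linalg.inv(R), R.T))        # True
```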

Singular Value Decomposition (SVD)

Any N × M matrix X can be factored as

X = U D V^T

where U and V are both orthonormal matrices: U is an N × N matrix and V is an M × M matrix. D is an N × M diagonal matrix with diagonal entries (singular values) d_i ≥ 0, i = 1, ..., r, where r = min(M, N).

The u_j (columns of U) are the left singular vectors. The v_j (columns of V) are the right singular vectors. The d_j (diagonal entries of D) are the singular values.
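The factorization can be reproduced with NumPy's `np.linalg.svd`, which returns the r = min(M, N) singular values as a vector rather than as the N × M matrix D (a sketch with an arbitrary matrix):

```python
import numpy as np

X = np.array([[1.0, 0.0, 2.0],
              [0.0, 3.0, 0.0]])   # N = 2, M = 3

U, s, Vt = np.linalg.svd(X)       # U: N x N, Vt: M x M, s: the r singular values

# rebuild the N x M diagonal matrix D from the singular-value vector s
D = np.zeros(X.shape)
D[:len(s), :len(s)] = np.diag(s)

assert np.allclose(U @ D @ Vt, X)                 # X = U D V^T
assert np.allclose(U.T @ U, np.eye(U.shape[0]))   # U is orthonormal
assert np.all(s >= 0)                             # singular values are nonnegative
```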

Appendix*

A vector is defined in a vector space. Example: c ∈ R^2 and c = c_1 e_1 + c_2 e_2 with an orthogonal basis e_1, e_2. We denote by c both the vector and its component representation. A matrix is a 2-D array that is defined with respect to a vector space.

The dot product is identical to the inner product ⟨a, c⟩ for Euclidean vector spaces with orthonormal basis vectors e_i:

⟨a, c⟩ = ( Σ_i a_i e_i ) · ( Σ_i c_i e_i ) = Σ_i a_i c_i = a · c = a^T c

An outer product is also called a dyadic product and, when related to vector spaces, is written as d ⊗ c. Note that a matrix is generated from two vectors. An outer product is a special case of a tensor product.

C = A B^T can be written as a sum of outer products,

C = Σ_j a_j b_j^T

where a_j is a column vector of A and b_j is a column vector of B.
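This decomposition into outer products can be verified numerically (a NumPy sketch with random matrices of matching inner dimension):

```python
import numpy as np

rng = np.random.default_rng(0)
A = rng.standard_normal((4, 3))   # columns a_j, j = 1..3
B = rng.standard_normal((5, 3))   # columns b_j, j = 1..3

C = A @ B.T                       # 4 x 5 matrix

# sum of outer products of the corresponding columns of A and B
C_sum = sum(np.outer(A[:, j], B[:, j]) for j in range(A.shape[1]))

assert np.allclose(C, C_sum)      # C = sum_j a_j b_j^T
```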