Computational Methods CMSC/AMSC/MAPL 460. Linear Systems, Matrices, LU Decomposition, Ramani Duraiswami, Dept. of Computer Science



Class Outline
Much of scientific computation involves the solution of linear equations
Even non-linear problems are solved by linearization
Some interpretations of matrices and vectors
Matrix-vector multiplication and its complexity
Identity, inverse, and singular matrices
Permutation, lower triangular, and upper triangular matrices

Vectors
Ordered set of numbers: (1, 2, 3, 4)
Example: (x, y, z) coordinates of a point in space; the vector is the line joining the origin of coordinates to the point
Vectors are usually indicated with bold lower-case letters; scalars with lower-case letters
Operations with vectors:
Addition u + v, with identity 0 (v + 0 = v) and inverse -v (v + (-v) = 0)
Scalar multiplication, with the distributive rules: α(u + v) = αu + αv and (α + β)u = αu + βu

Vector Addition
v + w = (x_1, x_2) + (y_1, y_2) = (x_1 + y_1, x_2 + y_2)
Geometrically, w is placed at the tip of v, and v + w is the diagonal of the resulting parallelogram.

Vector Spaces
A linear combination of vectors results in a new vector: v = α_1 v_1 + α_2 v_2 + ... + α_n v_n
If the only set of scalars such that α_1 v_1 + α_2 v_2 + ... + α_n v_n = 0 is α_1 = α_2 = ... = α_n = 0, then we say the vectors are linearly independent
The dimension of a space is the greatest number of linearly independent vectors possible in a set of vectors from that space
For a vector space of dimension n, any set of n linearly independent vectors forms a basis
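The linear-independence test above can be checked numerically: stack the candidate vectors as columns of a matrix and compare its rank to the number of columns. A minimal NumPy sketch (the course itself uses Matlab, where rank(A) plays the same role); the example matrix A is hypothetical:

```python
import numpy as np

def columns_independent(A):
    # Columns are linearly independent iff the rank equals the column count.
    return np.linalg.matrix_rank(A) == A.shape[1]

# Third column is the sum of the first two, so the set is dependent.
A = np.array([[1.0, 0.0, 1.0],
              [0.0, 1.0, 1.0],
              [0.0, 0.0, 0.0]])
```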

Vector Spaces: Basis Vectors
Given a basis for a vector space:
Each vector in the space is a unique linear combination of the basis vectors
The coordinates of a vector are the scalars from this linear combination
Best-known example: Cartesian coordinates
Note that a given vector v will have different coordinates in different bases
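Finding the coordinates of a vector in a given basis amounts to solving a linear system: if the basis vectors are the columns of B, the coordinates c of v satisfy B c = v. A sketch with a hypothetical 2-D basis:

```python
import numpy as np

# Basis vectors as columns of B (a hypothetical non-Cartesian basis).
B = np.array([[1.0, 1.0],
              [0.0, 1.0]])
v = np.array([3.0, 2.0])

# Coordinates of v in this basis: solve B @ c = v.
c = np.linalg.solve(B, v)
```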

Dot Product
The dot product or, more generally, inner product of two vectors is a scalar:
v_1 · v_2 = x_1 x_2 + y_1 y_2 + z_1 z_2 (in 3D)
Useful for many purposes:
Computing the length of a vector: length(v) = sqrt(v · v)
Normalizing a vector, making it unit length
Computing the angle between two vectors: u · v = |u| |v| cos(θ)
Checking two vectors for orthogonality
Projecting one vector onto another
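Each of the uses listed above is one line of code. A NumPy sketch (the vectors u and v are arbitrary examples, not from the slides):

```python
import numpy as np

u = np.array([3.0, 0.0, 0.0])
v = np.array([1.0, 1.0, 0.0])

length = np.sqrt(v @ v)                  # length(v) = sqrt(v . v)
unit_v = v / length                      # normalize to unit length
cos_theta = (u @ v) / (np.linalg.norm(u) * np.linalg.norm(v))
angle = np.arccos(cos_theta)             # angle between u and v
proj_u_on_v = ((u @ v) / (v @ v)) * v    # projection of u onto v
```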

Vector Norms
v = (x_1, x_2, ..., x_n)
Two norm (Euclidean norm): ||v||_2 = sqrt( Σ_{i=1}^{n} x_i^2 )
If ||v||_2 = 1, v is a unit vector
Infinity norm: ||v||_∞ = max( |x_1|, |x_2|, ..., |x_n| )
One norm ("Manhattan distance"): ||v||_1 = Σ_{i=1}^{n} |x_i|
Exercise: for a 2-dimensional vector, write down the set of vectors with two, one, and infinity norm equal to unity
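The three norms can be computed directly from the definitions; NumPy exposes them all through one function. A sketch with an arbitrary example vector:

```python
import numpy as np

v = np.array([3.0, -4.0])

two_norm = np.linalg.norm(v)          # sqrt(3^2 + (-4)^2) = 5
one_norm = np.linalg.norm(v, 1)       # |3| + |-4| = 7
inf_norm = np.linalg.norm(v, np.inf)  # max(|3|, |-4|) = 4
```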

Linear Transformations: Matrices
A linear transformation:
Maps one vector to another
Preserves linear combinations
Thus the behavior of a linear transformation is completely determined by what it does to a basis
It turns out that any linear transformation can be represented by a matrix

Matrices
By convention, matrix element m_ij is located at row i and column j:
M = [ m_11 m_12 ... m_1n
      m_21 m_22 ... m_2n
      ...
      m_m1 m_m2 ... m_mn ]
By convention, vectors are columns:
v = [ v_1
      v_2
      v_3 ]

How are matrices stored in a computer?
Matlab and Fortran: column by column (column-major); indices start at 1
What is the most efficient way to access a matrix?
C arrays are closely linked to pointers; indices start at 0
C native matrices are row-major
Many issues must be dealt with by properly defining matrices, or by using a set of matrix definitions (see http://www.library.cornell.edu/nr/bookcpdf/c1-2.pdf for a nice discussion)
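The row-major versus column-major distinction can be made concrete in NumPy, which supports both layouts explicitly (order='C' is C/row-major, order='F' is Fortran/Matlab column-major). A sketch of the idea, not the course's Matlab code:

```python
import numpy as np

A_row = np.array([[1, 2], [3, 4]], order='C')  # row-major storage
A_col = np.asfortranarray(A_row)               # same matrix, column-major

# ravel(order='K') reads the elements in actual memory order:
row_major_order = A_row.ravel(order='K')  # row by row:    1 2 3 4
col_major_order = A_col.ravel(order='K')  # column by column: 1 3 2 4
```

Traversing a matrix in its storage order keeps memory accesses contiguous, which is why loop order matters for performance.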

Ways to define matrices
Perhaps all entries are random
More often, they are somehow functions of i and j
Matlab code to generate the matrix
If the matrix is of size N × N, how many operations are needed to enter the values?
Sometimes each column or row is given by a formula:
Example: the Vandermonde matrix in polynomial interpolation
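The Vandermonde matrix is a good example of entries defined as a function of i and j: V[i, j] = x_i^j, so filling an N × N matrix takes N^2 operations. A NumPy sketch (the course would use Matlab's vander; note NumPy's default column order is decreasing powers, so increasing=True is needed here):

```python
import numpy as np

# V[i, j] = x[i] ** j, columns in increasing powers.
x = np.array([1.0, 2.0, 3.0])
V = np.vander(x, increasing=True)
```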

Some special matrices
Matlab code
How many operations and memory does this take?
Vectorized operations
A matrix may be sparse, i.e. most elements are zero. How many operations/memory?
The answer is still N^2 unless we avoid referring to the zero elements altogether
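Avoiding the zero elements is exactly what sparse formats do: store only the nonzero entries, so a matrix-vector product costs O(nnz) operations instead of O(N^2). A minimal coordinate-format sketch (libraries such as Matlab's sparse or scipy.sparse implement the same idea far more efficiently); the example matrix is hypothetical:

```python
import numpy as np

# Store only the nonzeros as (row, col, value) triples.
def sparse_matvec(triples, x, n):
    y = np.zeros(n)
    for i, j, a in triples:   # one multiply-add per nonzero
        y[i] += a * x[j]
    return y

# A 3x3 matrix with only 3 nonzero entries out of 9.
triples = [(0, 0, 1.0), (1, 2, 2.0), (2, 1, 3.0)]
x = np.array([1.0, 1.0, 1.0])
y = sparse_matvec(triples, x, 3)
```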

Some special matrices
Matrices may be built up from blocks of smaller matrices

Matrix-vector product
Matrix-vector multiplication applies a linear transformation to a vector: v' = M v
In 3D:
[ v'_1        [ m_11 m_12 m_13     [ x
  v'_2    =     m_21 m_22 m_23   ×   y
  v'_3 ]        m_31 m_32 m_33 ]     z ]
Recall how to do matrix multiplication
How many operations does a matrix-vector product take?
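Counting operations in the definition answers the question on the slide: each of the N output entries needs N multiplications and N-1 additions, so the product is O(N^2). A sketch spelling out the loops (in practice one would call the library routine, e.g. M @ v):

```python
import numpy as np

def matvec(M, v):
    # N^2 multiplications and N*(N-1) additions for an N x N matrix.
    n = M.shape[0]
    out = np.zeros(n)
    for i in range(n):
        for j in range(n):
            out[i] += M[i, j] * v[j]
    return out

M = np.array([[1.0, 2.0], [3.0, 4.0]])
v = np.array([1.0, 1.0])
```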


Matrix Transformations
A sequence or composition of linear transformations corresponds to the product of the corresponding matrices
Note: the matrices to the right affect the vector first
Note: the order of the matrices matters!
The identity matrix I has no effect in multiplication: I v = v
Some (not all) matrices have an inverse: M^{-1}(M(v)) = v
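The two notes above are easy to see with a rotation and a scaling: the rightmost matrix acts first, and swapping the order changes the result. A sketch with two arbitrary example transformations:

```python
import numpy as np

R = np.array([[0.0, -1.0], [1.0, 0.0]])  # rotate 90 degrees counterclockwise
S = np.array([[2.0, 0.0], [0.0, 1.0]])   # scale x by 2
v = np.array([1.0, 0.0])

sr = (S @ R) @ v   # rotate first, then scale
rs = (R @ S) @ v   # scale first, then rotate -> different result
inv_check = np.linalg.inv(R) @ (R @ v)   # M^{-1}(M(v)) = v
```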

Ways to implement a matrix-vector product
Access the matrix:
Element by element along rows
Element by element along columns
As column vectors
As row vectors
Discuss the advantages of each
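The row-oriented and column-oriented variants compute the same product but access memory differently: the first takes a dot product of each row with x, the second forms a linear combination of the columns (friendlier to column-major storage, as in Matlab or Fortran). A sketch of both:

```python
import numpy as np

def matvec_by_rows(A, x):
    # y[i] is the dot product of row i with x.
    return np.array([A[i, :] @ x for i in range(A.shape[0])])

def matvec_by_cols(A, x):
    # y is a linear combination of the columns of A, weighted by x.
    y = np.zeros(A.shape[0])
    for j in range(A.shape[1]):
        y += x[j] * A[:, j]
    return y

A = np.array([[1.0, 2.0], [3.0, 4.0]])
x = np.array([1.0, 1.0])
```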

Matrix norms
Can be defined using the corresponding vector norms: two norm, one norm, infinity norm
The two norm is hard to compute: we need to find the maximum singular value; this is related to the idea that a matrix acting on the unit sphere converts it into an ellipsoid
The Frobenius norm is defined just using the matrix elements
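All four matrix norms are available through the same NumPy routine; note that the one and infinity norms reduce to simple column and row sums, while the two norm requires the singular values. A sketch with an arbitrary example matrix:

```python
import numpy as np

A = np.array([[1.0, 2.0],
              [0.0, 2.0]])

one_norm = np.linalg.norm(A, 1)        # maximum absolute column sum
inf_norm = np.linalg.norm(A, np.inf)   # maximum absolute row sum
two_norm = np.linalg.norm(A, 2)        # largest singular value
fro_norm = np.linalg.norm(A, 'fro')    # sqrt of the sum of squared entries
```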