Introduction to Mobile Robotics Compact Course on Linear Algebra. Wolfram Burgard, Cyrill Stachniss, Kai Arras, Maren Bennewitz


Vectors Arrays of numbers. A vector represents a point in an n-dimensional space.

Vectors: Scalar-Vector Product Multiplying a vector by a scalar changes the length of the vector, but not its direction.

Vectors: Sum The sum of vectors is commutative and can be visualized as chaining the vectors.

Vectors: Dot Product The inner product of two vectors x and y is a scalar: x.y = sum_i x_i y_i. If one of the two vectors, e.g., y, has unit length, the inner product x.y returns the length of the projection of x along the direction of y. If x.y = 0, the two vectors are orthogonal.
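These dot-product facts can be checked numerically. A minimal sketch in NumPy (not part of the original slides; the example vectors are made up for illustration):

```python
import numpy as np

x = np.array([3.0, 4.0])
u = np.array([1.0, 0.0])  # unit vector along the first axis

# Inner product is a scalar: x.u = sum_i x_i * u_i
dot = np.dot(x, u)

# Since u has unit length, x.u is the length of the projection of x onto u
proj_len = dot  # here: 3.0

# Orthogonality: the dot product of perpendicular vectors is zero
v = np.array([0.0, 1.0])
orthogonal = np.isclose(np.dot(u, v), 0.0)
```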

Vectors: Linear (In)Dependence A vector y is linearly dependent on x_1, ..., x_k if y can be obtained by summing up the x_i, each properly scaled: y = sum_i a_i x_i. If there exist no coefficients a_1, ..., a_k such that y = sum_i a_i x_i, then y is independent of x_1, ..., x_k.

Matrices A matrix A is written as a table of values with n rows and m columns: A = [a_ij]. The 1st index refers to the row, the 2nd index to the column. Note: a d-dimensional vector is equivalent to a d x 1 matrix.

Matrices as Collections of Vectors A matrix can be seen as a collection of its column vectors or, equivalently, of its row vectors.

Important Matrix Operations Multiplication by a scalar. Sum (commutative, associative). Multiplication by a vector. Product (not commutative). Inversion (requires a square matrix of full rank). Transposition.

Scalar Multiplication & Sum In the scalar multiplication, every element of the vector or matrix is multiplied by the scalar. The sum of two vectors is a vector consisting of the pair-wise sums of the individual entries; likewise, the sum of two matrices is a matrix consisting of the pair-wise sums of the individual entries.

Matrix-Vector Product The i-th component of b = Ax is the dot product of the i-th row vector of A with x. Equivalently, b is linearly dependent on the column vectors of A, with the entries of x as coefficients: b = sum_j x_j a_j.

Matrix-Vector Product If the column vectors of A represent a reference system, the product b = Ax computes the global transformation of the vector x according to that reference system.
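Both readings of the matrix-vector product can be verified in a few lines of NumPy (a sketch, not part of the slides; the matrix is an invented 90-degree rotation):

```python
import numpy as np

# Columns of A form a reference frame; A @ x expresses x in the global frame
A = np.array([[0.0, -1.0],
              [1.0,  0.0]])  # columns: a rotated pair of basis vectors
x = np.array([2.0, 3.0])

b = A @ x

# Reading 1: b is the linear combination of the columns of A, coefficients x
b_cols = x[0] * A[:, 0] + x[1] * A[:, 1]

# Reading 2: each b_i is the dot product of the i-th row of A with x
b_rows = np.array([np.dot(A[i, :], x) for i in range(2)])
```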

Matrix-Matrix Product The product C = AB can be defined through: the dot products of the row vectors of A and the column vectors of B (c_ij = a_i.b_j), or the linear combinations of the columns of A scaled by the coefficients of the columns of B.

Matrix-Matrix Product If we consider the second interpretation, we see that the columns of C are the global transformations of the columns of B through A. All the interpretations made for the matrix-vector product hold.

Linear Systems (1) Ax = b. Interpretations: a set of linear equations; a way to find the coordinates x in the reference system of A such that Ax = b. Solvable by Gaussian elimination (as taught in school).
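In practice such a system is handed to a library solver. A minimal sketch with NumPy (the 2x2 system is made up for illustration):

```python
import numpy as np

# Example system: 2x + y = 3, x + 3y = 5
A = np.array([[2.0, 1.0],
              [1.0, 3.0]])
b = np.array([3.0, 5.0])

# np.linalg.solve performs a Gaussian-elimination-style (LU) factorization
x = np.linalg.solve(A, b)
```

`x` is the coordinate vector satisfying Ax = b exactly (here A is square and full rank).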

Linear Systems (2) Notes: Many efficient solvers exist, e.g., conjugate gradients or sparse Cholesky decomposition. One can obtain a reduced system (A', b') by considering the augmented matrix (A, b) and suppressing all rows that are linearly dependent. Let A'x = b' be the reduced system, with A' of size n' x m, b' of size n' x 1, and rank(A') = min(n', m). The system may be either over-constrained (n' > m) or under-constrained (n' < m).

Over-Constrained Systems More (independent) equations than unknowns. An over-constrained system does not admit an exact solution. However, if rank(A) = cols(A), one may find the solution of minimum residual norm (least squares) in closed form by pseudo-inversion: x = (A^T A)^-1 A^T b. Note: rank = maximum number of linearly independent rows/columns.
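The closed-form pseudo-inverse and a numerically preferable library solver give the same answer; a sketch with invented data (3 equations, 2 unknowns):

```python
import numpy as np

# Over-constrained: 3 equations, 2 unknowns, rank(A) = cols(A) = 2
A = np.array([[1.0, 0.0],
              [0.0, 1.0],
              [1.0, 1.0]])
b = np.array([1.0, 2.0, 4.0])

# Closed-form pseudo-inverse solution: x = (A^T A)^-1 A^T b
x_pinv = np.linalg.inv(A.T @ A) @ A.T @ b

# Same result via the least-squares solver (better conditioned in practice)
x_lstsq, *_ = np.linalg.lstsq(A, b, rcond=None)
```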

Under-Constrained Systems More unknowns than (independent) equations. The system is under-constrained if the number of linearly independent rows (or columns) of A is smaller than the dimension of b. An under-constrained system admits infinitely many solutions; the number of degrees of freedom of this solution set is cols(A') - rows(A').

Inverse If A is a square matrix of full rank, then there is a unique matrix B = A^-1 such that AB = I holds. The i-th row of A and the j-th column of A^-1 are: orthogonal (if i != j), or their dot product is 1 (if i = j).

Matrix Inversion The i-th column of A^-1 can be found by solving the linear system A x = e_i, where e_i is the i-th column of the identity matrix.
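This column-by-column view of inversion translates directly into code; a sketch with a made-up 2x2 matrix:

```python
import numpy as np

A = np.array([[4.0, 7.0],
              [2.0, 6.0]])
n = A.shape[0]
I = np.eye(n)

# Build A^-1 one column at a time: solve A x = e_i for each identity column e_i
A_inv = np.column_stack([np.linalg.solve(A, I[:, i]) for i in range(n)])
```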

Trace (tr) Only defined for square matrices. Sum of the elements on the main diagonal, that is tr(A) = sum_i a_ii. It is a linear operator with the following properties. Additivity: tr(A + B) = tr(A) + tr(B). Homogeneity: tr(cA) = c tr(A). Pairwise commutativity: tr(AB) = tr(BA). The trace is similarity invariant: tr(P^-1 A P) = tr(A). The trace is transpose invariant: tr(A^T) = tr(A). Given two vectors a and b, tr(a^T b) = tr(a b^T).

Rank Maximum number of linearly independent rows (columns); the dimension of the image of the transformation. When A is an n x m matrix we have rank(A) <= min(n, m), and rank(A) = 0 holds iff A is the null matrix. The map x -> Ax is injective iff rank(A) = m, and surjective iff rank(A) = n; if n = m, the map is bijective and A is invertible iff rank(A) = n. Computation of the rank is done by Gaussian elimination on the matrix, counting the number of non-zero rows.
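A quick numerical check (a sketch, not from the slides; note NumPy computes the rank via the SVD rather than by elimination):

```python
import numpy as np

# The third row is the sum of the first two -> only 2 independent rows
A = np.array([[1.0, 2.0, 3.0],
              [4.0, 5.0, 6.0],
              [5.0, 7.0, 9.0]])

r = np.linalg.matrix_rank(A)
```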

Determinant (det) Only defined for square matrices. The inverse of A exists if and only if det(A) != 0. For 2x2 matrices: let A = (a11 a12; a21 a22), then det(A) = a11 a22 - a12 a21. For 3x3 matrices the Sarrus rule holds: det(A) = a11 a22 a33 + a12 a23 a31 + a13 a21 a32 - a13 a22 a31 - a11 a23 a32 - a12 a21 a33.

Determinant For general n x n matrices, let A_ij be the submatrix obtained from A by deleting the i-th row and the j-th column, and let C_ij = (-1)^(i+j) det(A_ij) be the (i,j)-cofactor. Then det(A) = sum_j a_1j C_1j. This is called the cofactor expansion across the first row.

Determinant Problem: take a 25 x 25 matrix (which is considered small). The cofactor expansion method requires about n! multiplications; for n = 25, this is 1.5 x 10^25 multiplications, for which a present-day supercomputer would take 500,000 years. There are much faster methods, namely using Gaussian elimination to bring the matrix into triangular form, because for triangular matrices the determinant is the product of the diagonal elements.
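The triangular-matrix property can be verified directly; a sketch with an invented upper-triangular matrix (NumPy's `det` itself uses an LU factorization, i.e., the fast elimination-based route):

```python
import numpy as np

# For a triangular matrix, the determinant is the product of the diagonal
U = np.array([[2.0, 5.0, 1.0],
              [0.0, 3.0, 4.0],
              [0.0, 0.0, 0.5]])

det_diag = np.prod(np.diag(U))  # 2 * 3 * 0.5
det_np = np.linalg.det(U)       # LU-based determinant
```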

Determinant: Properties Row operations (B is still a square matrix): If B results from A by interchanging two rows, then det(B) = -det(A). If B results from A by multiplying one row with a number c, then det(B) = c det(A). If B results from A by adding a multiple of one row to another row, then det(B) = det(A). Transpose: det(A^T) = det(A). Multiplication: det(AB) = det(A) det(B). This does not apply to addition: in general det(A + B) != det(A) + det(B).

Determinant: Applications Find the inverse using Cramer's rule: A^-1 = adj(A) / det(A), with adj(A) being the adjugate of A, adj(A)_ij = C_ji, where the C_ij are the cofactors of A. Compute eigenvalues: solve the characteristic polynomial det(A - lambda I) = 0. Area and volume: |det(A)| is the volume of the parallelepiped spanned by the rows r_i of A.

Orthonormal Matrix A matrix is orthonormal iff its column (row) vectors represent an orthonormal basis. As a linear transformation, it is norm-preserving. Some properties: the transpose is the inverse (A^T = A^-1); the determinant has unity norm (det(A) = +-1).

Rotation Matrix A rotation matrix is an orthonormal matrix with det(R) = +1. 2D rotation by an angle theta: R(theta) = (cos theta -sin theta; sin theta cos theta). 3D rotations about the main axes: R_x(theta), R_y(theta), R_z(theta). IMPORTANT: rotations are not commutative.
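The defining properties of a rotation matrix can be checked in a few lines; a sketch (not part of the slides) for the 2D case:

```python
import numpy as np

def rot2d(theta):
    """2D rotation matrix: orthonormal with determinant +1."""
    c, s = np.cos(theta), np.sin(theta)
    return np.array([[c, -s],
                     [s,  c]])

R = rot2d(np.pi / 2)

# Orthonormal: the transpose is the inverse
ortho_ok = np.allclose(R.T @ R, np.eye(2))
det_R = np.linalg.det(R)

# Norm preserving: rotating a vector does not change its length
v = np.array([3.0, 4.0])
norm_ok = np.isclose(np.linalg.norm(R @ v), np.linalg.norm(v))
```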

Matrices to Represent Affine Transformations A general and easy way to describe a 3D transformation is via a homogeneous matrix T = (R t; 0 1), combining a rotation matrix R and a translation vector t. This naturally takes into account the non-commutativity of the transformations. See: homogeneous coordinates.

Combining Transformations A simple interpretation: chaining of transformations (represented as homogeneous matrices). Matrix A represents the pose of a robot in the space. Matrix B represents the position of a sensor on the robot. The sensor perceives an object at a given location p, in its own frame [the sensor has no clue on where it is in the world]. Where is the object in the global frame? Bp gives the pose of the object with respect to the robot, and ABp gives the pose of the object with respect to the world.
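The robot/sensor chaining above can be sketched in 2D with homogeneous matrices (the poses below are invented numbers, not from the slides):

```python
import numpy as np

def homog(R, t):
    """Build a 2D homogeneous transform (R t; 0 1) from R (2x2) and t (2,)."""
    T = np.eye(3)
    T[:2, :2] = R
    T[:2, 2] = t
    return T

# A: robot pose in the world (rotated 90 degrees, standing at (2, 0))
Rw = np.array([[0.0, -1.0],
               [1.0,  0.0]])
A = homog(Rw, np.array([2.0, 0.0]))

# B: sensor pose on the robot (no rotation, mounted 1 m ahead)
B = homog(np.eye(2), np.array([1.0, 0.0]))

# p: object seen by the sensor, in the sensor frame (homogeneous coordinates)
p = np.array([0.5, 0.0, 1.0])

p_robot = B @ p        # Bp: object with respect to the robot
p_world = A @ B @ p    # ABp: object with respect to the world
```

Note the order: the sensor-local point is first mapped into the robot frame by B, then into the world frame by A.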

Symmetric Matrix A matrix A is symmetric if A = A^T; it is skew-symmetric if A = -A^T. Every symmetric matrix: is diagonalizable as A = Q D Q^T, where D is a diagonal matrix of eigenvalues and Q is an orthogonal matrix whose columns are the eigenvectors of A; defines a quadratic form x^T A x.
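The diagonalization A = Q D Q^T can be reproduced with NumPy's symmetric eigensolver (a sketch with a made-up 2x2 symmetric matrix):

```python
import numpy as np

A = np.array([[2.0, 1.0],
              [1.0, 2.0]])    # symmetric: A == A.T

# eigh is the eigensolver for symmetric/Hermitian matrices;
# it returns eigenvalues in ascending order and orthonormal eigenvectors
eigvals, Q = np.linalg.eigh(A)
D = np.diag(eigvals)

A_rebuilt = Q @ D @ Q.T
```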

Positive Definite Matrix The analogue of a positive number. Definition: a symmetric matrix A is positive definite iff x^T A x > 0 for all x != 0 (e.g., the identity matrix).

Positive Definite Matrix: Properties Invertible, with positive definite inverse. All real eigenvalues > 0. Trace is > 0. Admits a Cholesky decomposition A = L L^T with L lower triangular.
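These properties can be checked numerically; a sketch with an invented positive definite matrix (NumPy's `cholesky` raises an error for non-positive-definite input):

```python
import numpy as np

A = np.array([[4.0, 2.0],
              [2.0, 3.0]])    # symmetric, x^T A x > 0 for all x != 0

# All eigenvalues positive, trace positive
eigvals = np.linalg.eigvalsh(A)

# Cholesky decomposition A = L L^T with L lower triangular
L = np.linalg.cholesky(A)
```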

Jacobian Matrix It is, in general, a non-square matrix. Given a vector-valued function f(x) = (f_1(x), ..., f_m(x))^T of x = (x_1, ..., x_n)^T, the Jacobian matrix is defined as the m x n matrix of partial derivatives J_ij = df_i/dx_j.

Jacobian Matrix It gives the orientation of the tangent plane to the vector-valued function at a given point and generalizes the gradient of a scalar-valued function.
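The definition J_ij = df_i/dx_j can be approximated numerically by finite differences; a sketch with a made-up example function (not from the slides):

```python
import numpy as np

def f(x):
    """Example vector-valued function f: R^2 -> R^2 (invented for illustration)."""
    return np.array([x[0] ** 2, x[0] * x[1]])

def numerical_jacobian(f, x, eps=1e-6):
    """Approximate the m x n Jacobian J_ij = df_i/dx_j by central differences."""
    n = len(x)
    cols = []
    for j in range(n):
        dx = np.zeros(n)
        dx[j] = eps
        cols.append((f(x + dx) - f(x - dx)) / (2 * eps))
    return np.column_stack(cols)

x0 = np.array([2.0, 3.0])
J = numerical_jacobian(f, x0)
# Analytic Jacobian at (2, 3): [[2*x_1, 0], [x_2, x_1]] = [[4, 0], [3, 2]]
```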

Quadratic Forms Many functions can be locally approximated with a quadratic form f(x) = x^T A x + x^T b. Often, one is interested in finding the minimum (or maximum) of a quadratic form, i.e., x* = argmin_x (x^T A x + x^T b).

Quadratic Forms Question: how to efficiently compute a solution to this minimization problem? At the minimum, we have df/dx = 0. By using the definition of the matrix product, we can compute the derivative: df/dx = (A + A^T) x + b.

Quadratic Forms The minimum of f is where its derivative is 0. Thus, we can solve the system (A + A^T) x + b = 0. If the matrix A is symmetric, the system becomes 2Ax + b = 0. Solving it, x = -(1/2) A^-1 b, leads to the minimum.
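The derivation can be checked end to end: set the derivative of f(x) = x^T A x + x^T b to zero and solve. A sketch with an invented symmetric positive definite A (which guarantees the stationary point is a minimum):

```python
import numpy as np

A = np.array([[2.0, 0.0],
              [0.0, 4.0]])   # symmetric positive definite
b = np.array([-4.0, -8.0])

def f(x):
    return x @ A @ x + x @ b

# A symmetric => condition 2 A x + b = 0, i.e., x* = -(1/2) A^-1 b
x_star = np.linalg.solve(2 * A, -b)
```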

Further Reading A quick-and-dirty guide to matrices is the Matrix Cookbook, available at: http://matrixcookbook.com