Mobile Robotics 1. A Compact Course on Linear Algebra. Giorgio Grisetti

Vectors. Arrays of numbers that represent a point in an n-dimensional space.

Vectors: Scalar-Vector Product. Multiplying a vector by a scalar changes the length of the vector, but not its direction.

Vectors: Sum. The sum of vectors is commutative and can be visualized as chaining the vectors.

Vectors: Dot Product. The inner product of two vectors x and y is a scalar: x . y = x^T y = sum_i x_i y_i. If one of the two vectors, say y, has unit length, the inner product returns the length of the projection of x along the direction of y. If the two vectors are orthogonal, the inner product is 0.
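A quick numerical check of these dot-product facts. This is a sketch using NumPy; the vectors are made up for illustration:

```python
import numpy as np

x = np.array([3.0, 4.0])
u = np.array([1.0, 0.0])   # unit vector

# Inner product is a scalar; since |u| = 1, it equals the length
# of the projection of x onto the direction of u
proj_len = x @ u

# Orthogonal vectors have inner product 0
a = np.array([2.0, 0.0])
b = np.array([0.0, 5.0])
orthogonal = np.isclose(a @ b, 0.0)
```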

Vectors: Linear (In)Dependence. A vector x is linearly dependent on vectors v_1, ..., v_k if x = l_1 v_1 + ... + l_k v_k for some coefficients l_i; in other words, if x can be obtained by summing up the v_i, properly scaled. If no such coefficients l_i exist, then x is independent of v_1, ..., v_k.

Matrices. A matrix is written as a table of values and can be used in many ways:

Matrices as Collections of Vectors. A matrix can be read as a collection of column vectors or as a collection of row vectors.

Matrices: Operations. Sum (commutative, associative), product (not commutative), inversion (defined for square, full-rank matrices), transposition, multiplication by a scalar, multiplication by a vector.

Matrix-Vector Product. The i-th component of y = Ax is the dot product of the i-th row of A with x. The vector y is linearly dependent on the columns of A, with the entries of x as coefficients.

Matrix-Vector Product. If the column vectors of A represent a reference system, the product y = Ax computes the global transformation of the vector x according to A.

Matrix-Vector Product. Each entry a_ij can be seen as a linear mixing coefficient that tells how the j-th component of x contributes to the i-th component of y. Example: the Jacobian of a multidimensional function.
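The two readings of the matrix-vector product, row-wise dot products and a column combination, can be verified numerically. A sketch with NumPy; the matrix and vector are made up for illustration:

```python
import numpy as np

A = np.array([[1.0, 2.0],
              [3.0, 4.0],
              [5.0, 6.0]])
x = np.array([10.0, -1.0])

# Row view: the i-th component of y is the dot product of row i with x
y_rows = np.array([row @ x for row in A])

# Column view: y is a linear combination of the columns of A,
# with the entries of x as mixing coefficients
y_cols = x[0] * A[:, 0] + x[1] * A[:, 1]
```

Both views agree with the built-in product `A @ x`.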

Matrix-Matrix Product. C = AB can be defined through the dot product of row and column vectors (c_ij is the dot product of the i-th row of A with the j-th column of B), or as the linear combination of the columns of A scaled by the coefficients of the columns of B.

Matrix-Matrix Product. If we consider the second interpretation, we see that the columns of C are the projections of the columns of B through A. All the interpretations made for the matrix-vector product hold.
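Both interpretations of the matrix-matrix product can be checked side by side. A small NumPy sketch with made-up 2x2 matrices:

```python
import numpy as np

A = np.array([[1.0, 0.0],
              [2.0, 1.0]])
B = np.array([[3.0, 1.0],
              [0.0, 4.0]])

# Entry view: c_ij is the dot product of row i of A with column j of B
C_entries = np.array([[A[i] @ B[:, j] for j in range(2)] for i in range(2)])

# Column view: column j of C is the column j of B "projected through" A
C_columns = np.column_stack([A @ B[:, j] for j in range(2)])
```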

Linear Systems. Ax = b. Interpretation: find the coordinates x in the reference system of A such that b is the result of the transformation Ax. Many efficient solvers exist: conjugate gradients, sparse Cholesky decomposition (if the system is SPD). The system may be over- or under-constrained. One can obtain a reduced system (A' b') by considering the augmented matrix (A b) and suppressing all the rows which are linearly dependent.

Linear Systems. The system is over-constrained if the number of linearly independent rows of A is greater than the dimension of x (more independent equations than unknowns). An over-constrained system does not, in general, admit an exact solution; however, one may find an approximate (least-squares) solution by pseudo-inversion.

Linear Systems. The system is under-constrained if the number of linearly independent rows of A is smaller than the dimension of x (fewer independent equations than unknowns). An under-constrained system admits infinitely many solutions; the dimension of the solution space is dim(x) - rank(A). The rank of a matrix is the maximum number of linearly independent rows or columns.
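Both cases can be handled numerically via the pseudo-inverse. A sketch using NumPy's `lstsq` and `pinv`; the systems are made up for illustration:

```python
import numpy as np

# Over-constrained: 3 equations, 2 unknowns -> no exact solution in general;
# lstsq returns the least-squares (pseudo-inverse) solution
A_over = np.array([[1.0, 0.0],
                   [0.0, 1.0],
                   [1.0, 1.0]])
b_over = np.array([1.0, 1.0, 3.0])
x_ls, *_ = np.linalg.lstsq(A_over, b_over, rcond=None)

# Under-constrained: 1 equation, 2 unknowns -> infinitely many solutions;
# the pseudo-inverse picks the minimum-norm one
A_under = np.array([[1.0, 1.0]])
b_under = np.array([2.0])
x_min = np.linalg.pinv(A_under) @ b_under
```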

Matrix Inversion. AB = BA = I. If A is a square matrix of full rank, then there is a unique matrix B = A^-1 such that the above equation holds. The i-th row of A and the j-th column of A^-1 are orthogonal if i is different from j; if i = j, their scalar product is 1. The i-th column of A^-1 can therefore be found by solving the system Ax = e_i, where e_i is the i-th column of the identity matrix.
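The column-by-column construction of the inverse described above can be sketched directly (NumPy, with a made-up 2x2 matrix):

```python
import numpy as np

A = np.array([[2.0, 1.0],
              [1.0, 1.0]])
n = A.shape[0]
I = np.eye(n)

# Column i of A^-1 is the solution of A x = e_i,
# where e_i is the i-th column of the identity matrix
A_inv = np.column_stack([np.linalg.solve(A, I[:, i]) for i in range(n)])
```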

Trace. Only defined for square matrices: the sum of the elements on the main diagonal, tr(A) = sum_i a_ii. It is a linear operator with the following properties. Additivity: tr(A + B) = tr(A) + tr(B). Homogeneity: tr(lA) = l tr(A). Pairwise commutativity: tr(AB) = tr(BA). The trace is similarity-invariant: tr(P^-1 A P) = tr(A). The trace is transpose-invariant: tr(A^T) = tr(A).

Rank. The maximum number of linearly independent rows (columns); equivalently, the dimension of the image of the transformation. When A is m x n we have rank(A) <= min(m, n), and rank(A) = 0 iff A is the null matrix. A is injective iff rank(A) = n, and surjective iff rank(A) = m; if m = n, A is bijective and invertible iff rank(A) = n. Computation of the rank is done by performing Gaussian elimination on the matrix and counting the number of non-zero rows.
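A quick rank check on a matrix with a deliberately dependent row. A sketch using NumPy's `matrix_rank` (which uses SVD rather than Gaussian elimination, but gives the same answer):

```python
import numpy as np

# The third row is the sum of the first two -> only 2 independent rows
A = np.array([[1.0, 2.0, 3.0],
              [0.0, 1.0, 1.0],
              [1.0, 3.0, 4.0]])
r = np.linalg.matrix_rank(A)

# The identity matrix is full rank
r_full = np.linalg.matrix_rank(np.eye(3))
```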

Determinant. Only defined for square matrices. Remember: A is invertible if and only if det(A) is non-zero. For 2 x 2 matrices: det(A) = a11 a22 - a12 a21. For 3 x 3 matrices, the rule of Sarrus applies.

Determinant. For general n x n matrices: let A_ij be the submatrix obtained from A by deleting the i-th row and the j-th column; the determinant formulas above can be rewritten in terms of these submatrices.

Determinant. Let C_ij = (-1)^(i+j) det(A_ij) be the (i, j)-cofactor; then det(A) = a11 C11 + a12 C12 + ... + a1n C1n. This is called the cofactor expansion across the first row.

Determinant. Problem: take a 25 x 25 matrix (which is considered small). The cofactor expansion method requires about n! multiplications; for n = 25 this is 1.5 x 10^25 multiplications, for which a present-day supercomputer would take some 500,000 years. There are much faster methods, namely using Gaussian elimination to bring the matrix into triangular form. Then the determinant is the product of the diagonal elements (up to sign changes introduced by row swaps), because for triangular matrices the determinant is exactly the product of the diagonal elements.
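The elimination-based method can be sketched in a few lines. This is an illustrative implementation (the helper name and test matrix are made up), checked against NumPy's `det`:

```python
import numpy as np

def det_by_elimination(A):
    """Determinant via Gaussian elimination with partial pivoting:
    reduce to upper-triangular form, multiply the diagonal elements,
    and flip the sign once per row swap."""
    U = A.astype(float).copy()
    n = U.shape[0]
    sign = 1.0
    for k in range(n):
        p = k + np.argmax(np.abs(U[k:, k]))   # choose pivot row
        if np.isclose(U[p, k], 0.0):
            return 0.0                        # singular matrix
        if p != k:
            U[[k, p]] = U[[p, k]]             # row swap flips the sign
            sign = -sign
        # eliminate entries below the pivot
        U[k + 1:] -= np.outer(U[k + 1:, k] / U[k, k], U[k])
    return sign * np.prod(np.diag(U))

A = np.array([[2.0, 1.0, 0.0],
              [1.0, 3.0, 1.0],
              [0.0, 1.0, 2.0]])
d = det_by_elimination(A)
```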

Determinant: Properties. Row operations (A still a square matrix): if B results from A by interchanging two rows, then det(B) = -det(A); if B results from A by multiplying one row by a scalar l, then det(B) = l det(A); if B results from A by adding a multiple of one row to another row, then det(B) = det(A). Transpose: det(A^T) = det(A). Multiplication: det(AB) = det(A) det(B). This does not apply to addition: in general det(A + B) is not det(A) + det(B).

Determinant: Applications. Find the inverse using Cramer's rule: A^-1 = (1 / det(A)) adj(A), with adj(A) being the adjugate of A. Compute eigenvalues: solve the characteristic polynomial det(A - lI) = 0. Area and volume: |det(A)| is the volume of the parallelepiped spanned by the rows of A (with r_i the i-th row).
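The eigenvalue application can be checked numerically: for a 2x2 matrix the characteristic polynomial is l^2 - tr(A) l + det(A). A NumPy sketch with a made-up matrix, compared against `eigvals`:

```python
import numpy as np

A = np.array([[4.0, 1.0],
              [2.0, 3.0]])

# Characteristic polynomial of a 2x2 matrix: l^2 - tr(A) l + det(A) = 0
coeffs = [1.0, -np.trace(A), np.linalg.det(A)]
eig_poly = np.sort(np.roots(coeffs))

# Compare with the library eigenvalue routine
eig_np = np.sort(np.linalg.eigvals(A))
```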

Orthogonal Matrix. A matrix Q is orthogonal iff its column (row) vectors represent an orthonormal basis. As a linear transformation, it is norm-preserving and acts as an isometry in Euclidean space (rotation, reflection). Some properties: the transpose is the inverse, Q^T = Q^-1; the determinant is +1 or -1.

Rotation Matrix. Important in robotics. 2D rotation by angle t: R(t) = [cos t, -sin t; sin t, cos t]. 3D rotations about the main axes are built analogously (R_x, R_y, R_z). IMPORTANT: rotations are not commutative.
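The non-commutativity is easy to demonstrate numerically, together with the orthogonality properties from the previous slide. A NumPy sketch (the helper names `rot_x` and `rot_z` are made up for illustration):

```python
import numpy as np

def rot_x(a):
    """3D rotation about the x axis by angle a."""
    c, s = np.cos(a), np.sin(a)
    return np.array([[1.0, 0.0, 0.0],
                     [0.0, c, -s],
                     [0.0, s, c]])

def rot_z(a):
    """3D rotation about the z axis by angle a."""
    c, s = np.cos(a), np.sin(a)
    return np.array([[c, -s, 0.0],
                     [s, c, 0.0],
                     [0.0, 0.0, 1.0]])

Rx = rot_x(np.pi / 2)
Rz = rot_z(np.pi / 2)

commute = np.allclose(Rx @ Rz, Rz @ Rx)          # rotations do not commute
orthonormal = np.allclose(Rx.T @ Rx, np.eye(3))  # R^T = R^-1
```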

Matrices as Affine Transformations. A general and easy way to describe a 3D transformation is via matrices: a homogeneous matrix combines a rotation matrix R and a translation vector t as T = [R, t; 0, 1]. This representation behaves homogeneously in 2D and 3D and naturally takes into account the non-commutativity of the transformations.

Combining Transformations. A simple interpretation: chaining of transformations (represented as homogeneous matrices). Matrix A represents the pose of a robot in the space; matrix B represents the position of a sensor on the robot. The sensor perceives an object at a given location p in its own frame (the sensor has no clue where it is in the world). Where is the object in the global frame? Bp gives the pose of the object with respect to the robot; ABp gives the pose of the object with respect to the world.
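The robot/sensor chaining above can be sketched in 2D with homogeneous matrices. A NumPy sketch; the helper `se2` and the concrete poses are made up for illustration:

```python
import numpy as np

def se2(x, y, theta):
    """Homogeneous 2D transform [R t; 0 1] for a planar pose (x, y, theta)."""
    c, s = np.cos(theta), np.sin(theta)
    return np.array([[c, -s, x],
                     [s, c, y],
                     [0.0, 0.0, 1.0]])

A = se2(2.0, 0.0, np.pi / 2)   # pose of the robot in the world
B = se2(1.0, 0.0, 0.0)         # pose of the sensor on the robot
p = np.array([1.0, 0.0, 1.0])  # object in the sensor frame (homogeneous coords)

p_robot = B @ p        # object with respect to the robot
p_world = A @ B @ p    # object with respect to the world
```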

Symmetric Matrix. A matrix A is symmetric if A = A^T; it is anti-symmetric if A^T = -A. Every symmetric matrix can be diagonalized as A = Q D Q^T, where D is a diagonal matrix of eigenvalues and Q is an orthogonal matrix whose columns are the eigenvectors of A, and defines a quadratic form x^T A x.

Positive Definite Matrix. The analogue of a positive number. Definition: a symmetric matrix A is positive definite if x^T A x > 0 for every non-zero x (positive semi-definite if x^T A x >= 0).

Positive Definite Matrix. Properties: invertible, with positive definite inverse; all eigenvalues > 0; trace > 0. For any SPD matrix A there exists a Cholesky decomposition A = L L^T, with L lower triangular. Partial ordering: A > B iff A - B is positive definite. If A > 0 and B > 0, then A + B is positive definite.
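These properties can be verified on a small example. A NumPy sketch with a made-up SPD matrix, using `eigvalsh` and `cholesky`:

```python
import numpy as np

A = np.array([[4.0, 2.0],
              [2.0, 3.0]])   # symmetric positive definite

# All eigenvalues of an SPD matrix are > 0
eigvals = np.linalg.eigvalsh(A)
is_pd = bool(np.all(eigvals > 0))

# Cholesky decomposition A = L L^T with L lower triangular
L = np.linalg.cholesky(A)
recon_ok = np.allclose(L @ L.T, A)

# The quadratic form x^T A x is positive for a non-zero test vector
x = np.array([1.0, -1.0])
qform = x @ A @ x
```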

Jacobian Matrix. It is, in general, a non-square matrix. Suppose you have a vector-valued function f : R^n -> R^m, f(x) = (f_1(x), ..., f_m(x))^T. Let the gradient operator be the vector of (first-order) partial derivatives, (d/dx_1, ..., d/dx_n). Then the Jacobian matrix is defined as the m x n matrix with entries J_ij = df_i/dx_j; the i-th row of J is the gradient of f_i.

Jacobian Matrix. It gives the orientation of the tangent plane to the vector-valued function at a given point and generalizes the gradient of a scalar-valued function. It is heavily used for first-order error propagation (see later in the course).
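A Jacobian can be approximated numerically by finite differences, which also serves as a sanity check for hand-derived Jacobians. A sketch (the helper `numerical_jacobian` and the test function are made up for illustration):

```python
import numpy as np

def numerical_jacobian(f, x, eps=1e-6):
    """Finite-difference approximation of J[i, j] = d f_i / d x_j."""
    f0 = np.asarray(f(x))
    J = np.zeros((f0.size, x.size))
    for j in range(x.size):
        dx = np.zeros_like(x)
        dx[j] = eps
        J[:, j] = (np.asarray(f(x + dx)) - f0) / eps
    return J

# f(x, y) = (x*y, x + y) has the analytic Jacobian [[y, x], [1, 1]]
f = lambda v: np.array([v[0] * v[1], v[0] + v[1]])
x0 = np.array([2.0, 3.0])
J = numerical_jacobian(f, x0)
```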

Quadratic Forms. Many important functions can be locally approximated with a quadratic form, such as f(x) = x^T A x + b^T x. Often one is interested in finding the minimum (or maximum) of a quadratic form.

Quadratic Forms. How can we use the matrix properties to quickly compute a solution to this minimization problem? At the minimum the gradient vanishes. By using the definition of the matrix product we can compute the derivative of f: grad f(x) = (A + A^T) x + b.

Quadratic Forms. The minimum of f is attained where its derivative is 0. Thus we can solve the system (A + A^T) x = -b. If the matrix A is symmetric, the system becomes 2Ax = -b.
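The minimization recipe reduces to solving one linear system. A NumPy sketch with a made-up symmetric positive definite A, including a crude check that the stationary point really is a minimum:

```python
import numpy as np

# Minimize f(x) = x^T A x + b^T x with A symmetric positive definite.
# The gradient (A + A^T) x + b = 2 A x + b vanishes where 2 A x = -b.
A = np.array([[2.0, 0.0],
              [0.0, 1.0]])
b = np.array([-4.0, -2.0])

x_star = np.linalg.solve(2 * A, -b)

def f(x):
    return x @ A @ x + b @ x

# Nudging x_star in a few directions should not decrease f
better_than_neighbors = all(
    f(x_star) <= f(x_star + d)
    for d in (np.array([0.1, 0.0]), np.array([0.0, -0.1]))
)
```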