Maths for Signals and Systems: Linear Algebra in Engineering

Maths for Signals and Systems: Linear Algebra in Engineering
Lecture 18, Friday 18th November 2016
DR TANIA STATHAKI
READER (ASSOCIATE PROFESSOR) IN SIGNAL PROCESSING
IMPERIAL COLLEGE LONDON

Mathematics for Signals and Systems
In this set of lectures we will talk about two topics:
- Linear transformations
- Summary of decompositions and matrices

Linear transformations
Consider the parameters/functions/vectors/other mathematical quantities denoted by u and v. A transformation is an operator applied to these quantities, i.e., T(u), T(v). A linear transformation possesses the following two properties:

T(u + v) = T(u) + T(v)
T(cv) = cT(v), where c is a scalar.

By grouping the above two conditions we get T(c_1 u + c_2 v) = c_1 T(u) + c_2 T(v). A linear transformation always maps the zero vector to zero.
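These properties can be checked numerically. Below is a minimal sketch (assuming Python with numpy; the matrix A, the vectors u and v, and the scalar c are illustrative, not from the slides) confirming that T(v) = Av satisfies both conditions and maps zero to zero.

    import numpy as np

    rng = np.random.default_rng(0)
    A = rng.standard_normal((2, 2))      # an arbitrary matrix transformation
    u, v = rng.standard_normal(2), rng.standard_normal(2)
    c = 3.0

    print(np.allclose(A @ (u + v), A @ u + A @ v))    # additivity: True
    print(np.allclose(A @ (c * v), c * (A @ v)))      # homogeneity: True
    print(np.allclose(A @ np.zeros(2), np.zeros(2)))  # zero maps to zero: True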

Examples of transformations
- Is the transformation T: R^2 → R^2 which projects any vector of the 2-D plane onto a specific straight line a linear transformation?
- Is the transformation T: R^2 → R^2 which shifts the entire plane by a fixed vector v_0 a linear transformation?
- Is the transformation T: R^3 → R which takes a vector as input and produces its length as output a linear transformation?
- Is the transformation T: R^2 → R^2 which rotates a vector by 45° a linear transformation?
- Is the transformation T(v) = Av, where A is a matrix, a linear transformation?
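For a line through the origin, the projection is linear; the shift and the length transformations are not; rotation and T(v) = Av are linear. As a quick illustration, here is a sketch (Python/numpy assumed; v0 and the test vectors are illustrative) showing how the shift and the length fail the linearity conditions:

    import numpy as np

    v0 = np.array([1.0, 2.0])
    shift = lambda v: v + v0                 # T(v) = v + v0
    print(np.allclose(shift(np.zeros(2)), np.zeros(2)))  # False: zero is not mapped to zero

    length = lambda v: np.linalg.norm(v)     # T(v) = ||v||
    v = np.array([3.0, 4.0])
    print(length(-1.0 * v) == -1.0 * length(v))          # False: homogeneity fails for c = -1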

Examples of transformations
Consider a transformation T: R^3 → R^2. If T(v) = Av, then A is a matrix of size 2 × 3. If we know the outputs of the transformation applied to a set of vectors v_1, v_2, …, v_n which form a basis of some space, then we know the output for any vector that belongs to that space. Recall: the coordinates of a vector depend on the chosen basis. Most of the time when we talk about coordinates we think of the standard basis, which consists of the rows (columns) of the identity matrix. Another popular basis consists of the eigenvectors of a matrix.

Examples of transformations: Projections
Consider the matrix A that represents a linear transformation T. Most of the time the required transformation is of the form T: R^n → R^m. I need to choose two bases, one for R^n, denoted by v_1, v_2, …, v_n, and one for R^m, denoted by w_1, w_2, …, w_m. I am looking for a transformation that, when applied to a vector described in the input coordinates, produces the output coordinates. Consider R^2 and the transformation which projects any vector onto a given line. As a basis for R^2 I take the vectors shown in red in the figure, not the standard vectors [1; 0] and [0; 1]. One of the basis vectors lies on the required line and the other is perpendicular to it.

Examples of transformations: Projections cont.
I consider as basis for R^2 the vectors shown in red in the figure, both before and after the transformation: v_1 = w_1 lies on the line and v_2 = w_2 is perpendicular to it. Any vector v in R^2 can be written as v = c_1 v_1 + c_2 v_2. We are looking for T(·) such that T(v_1) = v_1 and T(v_2) = 0. Furthermore,

T(v) = c_1 T(v_1) + c_2 T(v_2) = c_1 v_1,

so in these coordinates the transformation maps [c_1; c_2] to [c_1; 0], i.e., the matrix is

[1 0]
[0 0]

The matrix in that case is Λ. This is the "good" matrix.
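A small numerical sketch of this change of basis (Python/numpy assumed; the 45° line is an illustrative choice of the projection line): in the basis {v_1 on the line, v_2 perpendicular to it}, the projection matrix becomes exactly Λ = diag(1, 0).

    import numpy as np

    v1 = np.array([1.0, 1.0]) / np.sqrt(2)   # basis vector on the line
    v2 = np.array([1.0, -1.0]) / np.sqrt(2)  # basis vector perpendicular to it
    P = np.outer(v1, v1)                     # projection in standard coordinates

    S = np.column_stack([v1, v2])            # change-of-basis matrix
    Lam = np.linalg.inv(S) @ P @ S           # the matrix in the new basis
    print(np.round(Lam, 12))                 # [[1, 0], [0, 0]]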

Examples of transformations: Projections cont.
I now consider as basis for R^2 the standard basis: v_1 = [1; 0] and v_2 = [0; 1]. Consider projections onto the 45° line, i.e., the line through a = [1; 1]. In this example the required matrix is

P = a a^T / (a^T a) = [1/2 1/2]
                      [1/2 1/2]

Here we didn't choose the best basis, we chose the handiest basis.
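A sketch of the same projection built in the standard basis (Python/numpy assumed; the test vector b is illustrative):

    import numpy as np

    a = np.array([[1.0], [1.0]])             # direction of the 45-degree line
    P = (a @ a.T) / (a.T @ a)                # P = a a^T / (a^T a)
    print(P)                                 # [[0.5, 0.5], [0.5, 0.5]]

    b = np.array([[2.0], [0.0]])
    print(P @ b)                             # [[1.0], [1.0]]: the projection of b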

Examples of transformations: Derivative of a function
Consider a linear transformation that takes the derivative of a function. (The derivative is a linear transformation!)

T = d(·)/dx

Consider the input c_1 + c_2 x + c_3 x^2; its basis consists of the functions 1, x, x^2. The output should be c_2 + 2c_3 x; its basis consists of the functions 1, x. I am looking for a matrix A such that

A [c_1; c_2; c_3] = [c_2; 2c_3].

This is

A = [0 1 0]
    [0 0 2]
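A sketch of this matrix acting on polynomial coefficients (Python/numpy assumed; the particular polynomial is illustrative):

    import numpy as np

    A = np.array([[0, 1, 0],
                  [0, 0, 2]])                # d/dx in the bases {1, x, x^2} and {1, x}

    c = np.array([5, 3, 4])                  # coefficients of 5 + 3x + 4x^2
    print(A @ c)                             # [3, 8]: the derivative 3 + 8x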

Summary of Decompositions: LU Decomposition
What is it? A = LU, where A is a square matrix, L is a lower triangular matrix with 1s on the diagonal, and U is an upper triangular matrix with the pivots of A on the diagonal.
When does it exist? If the matrix is invertible (the determinant is not 0), then a pure LU decomposition exists if and only if the leading principal minors are not 0. The leading principal minors are the north-west determinants. Example: the matrix

A = [0 1]
    [1 1]

does not have an LU decomposition although it is invertible. If the matrix is not invertible (the determinant is 0), a pure LU decomposition may or may not exist. LU decomposition works with rectangular matrices as well, with slight modifications/extensions.
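A sketch of this example in practice (assuming Python with scipy; scipy.linalg.lu factors with row exchanges, A = PLU, which is how software sidesteps the zero leading minor):

    import numpy as np
    from scipy.linalg import lu

    A = np.array([[0.0, 1.0],
                  [1.0, 1.0]])
    P, L, U = lu(A)                          # scipy returns A = P @ L @ U
    print(P)                                 # a genuine row exchange, not the identity
    print(np.allclose(A, P @ L @ U))         # True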

LDU Decomposition
What is it? A = LDU, where A is a square matrix, L is a lower triangular matrix with 1s on the diagonal, D is a diagonal matrix with the pivots of A on the diagonal, and U is an upper triangular matrix with 1s on the diagonal.
When does it exist? Under the same requirements as the pure LU decomposition. Example:

A = LU = [2 1] = [1 0] [2 1]
         [8 7]   [4 1] [0 3]

The above can be written as:

A = LDU = [2 1] = [1 0] [2 0] [1 1/2]
          [8 7]   [4 1] [0 3] [0  1 ]

LDU decomposition works with rectangular matrices as well, with slight modifications/extensions.
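A sketch recovering D and the unit-diagonal U from the slide's LU example (Python/numpy assumed):

    import numpy as np

    A = np.array([[2.0, 1.0],
                  [8.0, 7.0]])
    L = np.array([[1.0, 0.0],
                  [4.0, 1.0]])
    U_lu = np.array([[2.0, 1.0],
                     [0.0, 3.0]])            # from A = L @ U_lu

    D = np.diag(np.diag(U_lu))               # the pivots 2 and 3
    U = np.linalg.inv(D) @ U_lu              # divide each row by its pivot
    print(U)                                 # [[1, 0.5], [0, 1]]
    print(np.allclose(A, L @ D @ U))         # True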

A = ER
What is it? A = ER, where A is any matrix of dimension m × n and E is a square invertible matrix of dimension m × m.

R = [I_{r×r}       F_{r×(n−r)}    ]
    [0_{(m−r)×r}   0_{(m−r)×(n−r)}]

Here F is some r × (n−r) matrix produced by the elimination, and r is the rank of A.
When does it exist? Always.
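A sketch of computing E and R (assuming Python with sympy; the example matrix, of rank 2, is illustrative). Row-reducing the augmented matrix [A | I] yields R in the first block and E^{-1} in the second:

    import sympy as sp

    A = sp.Matrix([[1, 2, 3],
                   [2, 4, 6],
                   [1, 0, 1]])
    aug, _ = A.row_join(sp.eye(A.rows)).rref()
    R = aug[:, :A.cols]                      # reduced row echelon form of A
    E = aug[:, A.cols:].inv()                # invert the recorded row operations
    print(R)                                 # the [I F; 0 0] pattern
    print(A == E * R)                        # True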

A = QR for square matrices
What is it? A = QR, where A is any matrix of dimension n × n. Q is a square invertible matrix of dimension n × n with orthonormal columns. The columns of Q are derived from the columns of A; if A is not invertible, the first r columns of Q are derived from the independent columns of A, where r is the rank of A, and the last (n − r) columns of Q are chosen to be orthogonal to the first r. R is upper triangular; if A is NOT invertible, the last (n − r) rows of R are filled with zeros. (Look at Problem Sheet 6, Question 6, Matrix B.)
When does it exist? Always. QR decomposition exists also for rectangular matrices; I haven't taught this case.
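A sketch for an invertible square matrix (Python/numpy assumed; the matrix is illustrative):

    import numpy as np

    A = np.array([[2.0, 1.0],
                  [1.0, 1.0]])
    Q, R = np.linalg.qr(A)
    print(np.allclose(Q.T @ Q, np.eye(2)))   # orthonormal columns
    print(np.round(R, 3))                    # upper triangular
    print(np.allclose(A, Q @ R))             # True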

A = SΛS^{-1}
What is it? A = SΛS^{-1}, where A is a square matrix of dimension n × n, S is the matrix whose columns are n linearly independent eigenvectors of A, and Λ is the diagonal matrix of the corresponding eigenvalues.
When does it exist? When A has n linearly independent eigenvectors.
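A sketch of this decomposition (Python/numpy assumed; the matrix, with eigenvalues 5 and 2, is illustrative):

    import numpy as np

    A = np.array([[4.0, 1.0],
                  [2.0, 3.0]])
    lam, S = np.linalg.eig(A)                # columns of S are the eigenvectors
    Lam = np.diag(lam)
    print(np.allclose(A, S @ Lam @ np.linalg.inv(S)))  # True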

A = QΛQ^T
What is it? A = QΛQ^T, where A is a real, symmetric matrix of dimension n × n, Q is an orthogonal matrix whose columns are the eigenvectors of A, and Λ is the diagonal matrix of the (real) eigenvalues. The above decomposition is the so-called Spectral Theorem.
When does it exist? When A is real and symmetric.
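A sketch of the spectral theorem (Python/numpy assumed; the symmetric matrix is illustrative). numpy.linalg.eigh returns an orthonormal set of eigenvectors for a symmetric input:

    import numpy as np

    A = np.array([[2.0, 1.0],
                  [1.0, 2.0]])
    lam, Q = np.linalg.eigh(A)               # real eigenvalues, orthonormal eigenvectors
    print(np.allclose(Q @ Q.T, np.eye(2)))   # Q is orthogonal
    print(np.allclose(A, Q @ np.diag(lam) @ Q.T))  # True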

The Singular Value Decomposition A = UΣV^T
What is it? A = UΣV^T, where A is any matrix of dimension m × n. U is an orthogonal matrix of dimension m × m whose columns are the eigenvectors of the matrix AA^T. Σ is an m × n matrix with the singular values of A on its main diagonal and zeros elsewhere. V is an orthogonal matrix of dimension n × n whose columns are the eigenvectors of the matrix A^T A. The squares of the nonzero singular values of A are the nonzero eigenvalues of both AA^T and A^T A.
When does it exist? Always.
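A sketch for a rectangular matrix (Python/numpy assumed; the 3 × 2 matrix is illustrative), also checking that the squared singular values match the eigenvalues of A^T A:

    import numpy as np

    A = np.array([[3.0, 0.0],
                  [4.0, 5.0],
                  [0.0, 0.0]])
    U, s, Vt = np.linalg.svd(A)
    Sigma = np.zeros_like(A)                 # 3 x 2, zeros off the main diagonal
    Sigma[:2, :2] = np.diag(s)
    print(np.allclose(A, U @ Sigma @ Vt))    # True
    evals = np.sort(np.linalg.eigvalsh(A.T @ A))[::-1]
    print(np.allclose(s**2, evals))          # sigma_i^2 = lambda_i(A^T A): True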

Summary of matrices
Projection matrix P onto a subspace S:
- p = Pb is the closest point to b in S.
- P^2 = P = P^T. The condition P^2 = P is sufficient to characterise a matrix as a projection matrix; the extra condition P = P^T makes the projection orthogonal.
- Eigenvalues: 0 or 1. Eigenvectors are in S or in S^⊥.
- If the columns of a matrix A are a basis for S, then P = A(A^T A)^{-1} A^T.
Orthogonal matrix Q:
- A square matrix with orthonormal columns.
- Q^{-1} = Q^T.
- Eigenvalues: |λ| = 1. Eigenvectors are orthogonal.
- Preserves lengths and angles, i.e., ||Qx|| = ||x||.
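A sketch verifying these properties on concrete examples (Python/numpy assumed; the one-column A and the 45° rotation Q are illustrative):

    import numpy as np

    A = np.array([[1.0], [1.0]])
    P = A @ np.linalg.inv(A.T @ A) @ A.T     # projection onto the column space of A
    print(np.allclose(P @ P, P), np.allclose(P, P.T))  # P^2 = P = P^T: True True
    print(np.round(np.linalg.eigvalsh(P), 12))         # eigenvalues [0, 1]

    t = np.pi / 4
    Q = np.array([[np.cos(t), -np.sin(t)],
                  [np.sin(t),  np.cos(t)]])  # rotation, an orthogonal matrix
    x = np.array([3.0, 4.0])
    print(np.allclose(np.linalg.norm(Q @ x), np.linalg.norm(x)))  # length preserved
    print(np.abs(np.linalg.eigvals(Q)))      # [1, 1]: eigenvalues on the unit circle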

Also recall:
- Symmetric (or Hermitian) matrices: real eigenvalues.
- Positive definite and positive semi-definite matrices: their eigenvalues are positive or non-negative, respectively.
- The matrices A^T A and AA^T: they have the same nonzero eigenvalues, and all their eigenvalues are non-negative.
- Square matrices; rectangular matrices.
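A sketch of the third point (Python/numpy assumed; the 2 × 3 matrix is illustrative): A A^T and A^T A share the nonzero eigenvalues 6 and 1, and the larger product picks up an extra zero.

    import numpy as np

    A = np.array([[1.0, 2.0, 0.0],
                  [0.0, 1.0, 1.0]])
    print(np.sort(np.linalg.eigvalsh(A @ A.T))[::-1])  # [6, 1]
    print(np.sort(np.linalg.eigvalsh(A.T @ A))[::-1])  # [6, 1, 0]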