LinGloss. A glossary of linear algebra

Contents: Decompositions · Types of Matrices · Theorems · Other objects?

Quasi-triangular
A matrix A is quasi-triangular iff it is triangular except that its diagonal is composed of 1×1 and 2×2 blocks.

Gram Matrix (Gramian Matrix, Gramian)
For a set of vectors v_1, …, v_n, the Gram matrix is the matrix G with G_ij = ⟨v_i, v_j⟩. The set is linearly independent iff det(G) ≠ 0.

Cramer's Rule
For invertible A ∈ F^{n×n} we can compute the solution x of Ax = b by
    x_i = det(A(i)) / det(A),
where A(i) := A with b replacing the i-th column.

Bilinear form
For a vector space V over a field K, f : V × V → K is a bilinear form iff it is linear in each component:
    f(a v_1 + b v_2, u) = a f(v_1, u) + b f(v_2, u),
and likewise in the second argument.
- Every A ∈ K^{n×n} defines a bilinear form f_A(x, y) = x^T A y.
- Bilinear forms on an n-dimensional V are in 1-1 correspondence with matrices A ∈ K^{n×n}.
- f is symmetric iff A_f is symmetric, iff f(v, u) = f(u, v).
- f is positive definite iff A_f is positive definite.
- f is non-degenerate iff rank(A_f) = dim(V).

Inner Product Space (Euclidean Space)
A vector space V equipped with an inner product ⟨·,·⟩ satisfying, for all v, u ∈ V:
1. ⟨u, v⟩ = ⟨v, u⟩
2. ⟨v, v⟩ ≥ 0, with ⟨v, v⟩ = 0 iff v = 0
3. ⟨·,·⟩ is bilinear.

Equivalent matrices
A and B are equivalent iff A = Q^{-1} B P for invertible Q, P. For rectangular matrices, equivalence means the matrices represent the same linear transformation (with respect to different bases).

Similar matrices
A and B are similar iff A = P^{-1} B P for invertible P. Note that this is a relation on square matrices only.
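Cramer's Rule above is easy to check numerically. A minimal numpy sketch — numpy, the 3×3 matrix, and the right-hand side are my additions, chosen arbitrarily for illustration:

```python
import numpy as np

# Arbitrary invertible 3x3 system Ax = b (values chosen only for illustration).
A = np.array([[2.0, 1.0, 0.0],
              [1.0, 3.0, 1.0],
              [0.0, 1.0, 2.0]])
b = np.array([1.0, 2.0, 3.0])

det_A = np.linalg.det(A)
x = np.empty(3)
for i in range(3):
    A_i = A.copy()
    A_i[:, i] = b                      # A(i): replace the i-th column with b
    x[i] = np.linalg.det(A_i) / det_A  # x_i = det(A(i)) / det(A)

x_direct = np.linalg.solve(A, b)       # should agree with Cramer's Rule
```

For large n this is far slower than Gaussian elimination; Cramer's Rule is mainly of theoretical interest.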

Moore-Penrose Inverse (Pseudo-inverse, generalized inverse)
Given A ∈ F^{m×n}, there exists a matrix A⁺ which satisfies:
1. A A⁺ A = A
2. A⁺ A A⁺ = A⁺
3. (A A⁺)^T = A A⁺
4. (A⁺ A)^T = A⁺ A
- A⁺ is unique.
- If r = rank(A) and we factor A = BC with B of size m×r and C of size r×n, then
    A⁺ = C^T (C C^T)^{-1} (B^T B)^{-1} B^T.

Group Inverse
A ∈ F^{n×n} can have a group inverse A# ∈ F^{n×n} satisfying:
1. A A# = A# A
2. A# A A# = A#
3. A A# A = A
- If v is an eigenvector of A with eigenvalue λ ≠ 0, then v is an eigenvector of A# with eigenvalue 1/λ.
- If v is an eigenvector of A with eigenvalue λ = 0, then v is an eigenvector of A# with eigenvalue 0.
- A# is unique if it exists. If A^{-1} exists, then A# = A^{-1}.
- Let A = BC with B of size n×r and C of size r×n. If CB is invertible, then A# exists and A# = B (CB)^{-2} C.

Spectral radius
Let λ_1, …, λ_n be the eigenvalues of A ∈ C^{n×n} (A may be real; its eigenvalues can still be complex). The spectral radius is
    ρ(A) := max_i |λ_i|.
Note: ρ(A) ≤ ‖A^k‖^{1/k} for all k ∈ N, for any consistent matrix norm ‖·‖.

Matrix Norm
A matrix norm is a map ‖·‖ : R^{n×m} → R satisfying:
1. ‖A‖ ≥ 0 for all A ∈ R^{n×m}, with ‖A‖ = 0 iff A = 0
2. ‖A + B‖ ≤ ‖A‖ + ‖B‖ for all A, B ∈ R^{n×m}
3. ‖αA‖ = |α| ‖A‖ for all α ∈ R and A ∈ R^{n×m}
Matrix norms ‖·‖_a, ‖·‖_b, ‖·‖_c on appropriate spaces are consistent iff ‖AB‖_a ≤ ‖A‖_b ‖B‖_c always holds.

We can define a matrix norm ‖·‖_{a,b} that is subordinate to the vector norms ‖·‖_a and ‖·‖_b by
    ‖A‖_{a,b} := sup_{x ≠ 0} ‖Ax‖_b / ‖x‖_a.
Note that subordinate matrix norms satisfy
    ‖A‖_{a,b} = max_{‖x‖_a = 1} ‖Ax‖_b.

Frobenius Norm
For A ∈ K^{m×n} we have
    ‖A‖_F = ( Σ_{i,j} |A_ij|² )^{1/2} = ( tr(A A*) )^{1/2} = ( Σ_i σ_i² )^{1/2},
where the σ_i are the singular values of A. Compare with ‖A‖_2² = σ_1² (and, when A is normal, σ_1 = |λ_1|).

Hermitian Matrix (Self-adjoint)
M ∈ F^{n×n} is Hermitian iff M equals its Hermitian transpose (conjugate transpose, Hermitian conjugate, adjoint matrix): M = M* = conj(M)^T = M^H, i.e. M_ij = conj(M_ji). Note: Hermitian implies normal.

Projection Matrix (Projection, Idempotent Matrix, Projector)
A is a projection matrix iff A² = A.
- All eigenvalues are 0 or 1.
- The minimal polynomial divides x² − x.
- det(A) = 0 or 1.

Markov Matrix (Stochastic Matrix)
M ∈ R^{n×n} is Markov iff for all i, j:
1. 0 ≤ M_ij ≤ 1
2. Σ_{j=1}^n M_ij = 1
- For Markov M, N, the product MN is also Markov.
- Column stochastic means the same property with respect to columns.

Orthogonal Matrix
M ∈ R^{n×n} is orthogonal iff M^T M = M M^T = I.
- det(M) = ±1
- preserves inner products (⟨v, w⟩ = ⟨Mv, Mw⟩)
- preserves lengths, i.e. is an isometry
- eigenvalues all have modulus 1

Householder Matrix (Householder Reflector, Reflector Matrix)
For any u ∈ F^n with u^T u = 1, the Householder reflector of u is H_u = I − 2 u u^T.
- symmetric
- orthogonal
- eigenvalues: −1 with multiplicity 1, and +1 with multiplicity n − 1
- eigenvectors: u (for eigenvalue −1); the other n − 1 form a basis of the hyperplane u⊥ = null(u^T)

Unitary Matrix
A ∈ C^{n×n} is unitary iff A* A = A A* = I.
- |det(A)| = 1
- preserves inner products (⟨v, w⟩ = ⟨Av, Aw⟩)
- preserves lengths, i.e. is an isometry
- eigenvalues all have modulus 1

Congruent Matrices
Two matrices A, B ∈ F^{n×n} are congruent iff A = P^T B P for some invertible matrix P (note: P need not be orthogonal!).
- Congruent matrices represent the same bilinear form with respect to different bases.
- This is different from similar matrices, which represent the same linear transformation with respect to different bases.
- It is also different from orthogonally-similar matrices, which represent rotations of the same linear transformation.
- Law of Inertia: congruent matrices have the same number of positive, negative, and zero eigenvalues (so they also have the same signature).

Deficient Matrix (Defective Matrix)
A matrix with fewer than n linearly independent eigenvectors, i.e. at least one eigenvalue has a Jordan block of size > 1. A matrix is deficient iff it is non-diagonalizable.

Derogatory Matrix
A matrix is derogatory iff some eigenvalue has more than one Jordan block associated to it, i.e. geometric multiplicity g_λ > 1 for some λ. Note that this is distinct from simply having a repeated eigenvalue: a deficient matrix has at least one eigenvalue with algebraic multiplicity a_λ > 1 but geometric multiplicity g_λ < a_λ, while a derogatory matrix simply has more than one Jordan block for some λ; whether g_λ = a_λ is not determined.
- A matrix is non-derogatory iff its characteristic polynomial equals its minimal polynomial.

Simple eigenvalue
An eigenvalue with algebraic multiplicity 1 (which necessarily implies geometric multiplicity 1).

Semisimple eigenvalue
An eigenvalue for which all associated Jordan blocks have dimension 1; i.e. algebraic multiplicity = geometric multiplicity.

Defective eigenvalue (Deficient eigenvalue)
An eigenvalue that is not semisimple: an eigenvalue with geometric multiplicity < algebraic multiplicity. A matrix has a defective eigenvalue iff it is a defective (deficient) matrix.
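The smallest example of a defective eigenvalue is a single 2×2 Jordan block. A minimal numpy sketch (numpy and the specific value 2 are my additions for illustration):

```python
import numpy as np

# J is a single 2x2 Jordan block with eigenvalue 2: algebraic multiplicity 2,
# but only one independent eigenvector, so the eigenvalue is defective.
J = np.array([[2.0, 1.0],
              [0.0, 2.0]])

eigvals = np.linalg.eigvals(J)   # both eigenvalues equal 2: a_lambda = 2
# Geometric multiplicity = dim null(J - 2I) = n - rank(J - 2I)
geom_mult = 2 - np.linalg.matrix_rank(J - 2.0 * np.eye(2))
```

Here a_λ = 2 but g_λ = 1, so J is defective (and non-diagonalizable); since it has only one Jordan block, it is also non-derogatory.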

Signature (of a symmetric bilinear form)
Any symmetric bilinear form on a finite-dimensional vector space has a diagonal matrix representation. The signature is
    s = p − n,
where p = # of positive entries on the diagonal and n = # of negative entries on the diagonal.

Isometry
A linear transformation T on F^n is an isometry iff ‖Tv‖ = ‖v‖ for all v ∈ F^n.
- An isometry is necessarily invertible.
- All eigenvalues satisfy |λ| = 1.

Rank (of a matrix)
For M ∈ F^{m×n}, rank(M) := # of independent columns (= # of independent rows).
- If A ∈ F^{m×m} and C ∈ F^{n×n} are nonsingular, then rank(AM) = rank(MC) = rank(M).
- For A, B ∈ F^{n×n}, AB is nonsingular iff A and B are both nonsingular.
- rank(M) = dimension of the largest nonsingular submatrix of M.
- For arbitrary U, V we have 0 ≤ rank(UV) ≤ min(rank(U), rank(V)).
- It is possible that UV = 0 even when both factors have full rank, e.g.
    U = (1 0),  V = (0 1)^T,  UV = 0.

Normal Matrix
A ∈ F^{n×n} is normal iff A* A = A A*.
Theorem: A is normal iff it is unitarily diagonalizable, meaning Q A Q* = D for some unitary Q and diagonal D.
- Note: being diagonalizable does not imply being unitarily diagonalizable.
- For real matrices, being orthogonally diagonalizable implies symmetric: if A = Q D Q^T then A^T = (Q D Q^T)^T = A; and being symmetric implies normal. So over R:
    SYMMETRIC = ORTHOGONALLY DIAGONALIZABLE ⇒ NORMAL

Theorems

Spectral Mapping Theorem
Let A ∈ F^{n×n} have eigenvalues λ_1, …, λ_n (not necessarily distinct), and take p(x) ∈ F[x]. Then the eigenvalues of p(A) are p(λ_1), …, p(λ_n). Note the characteristic polynomial has changed from Π(x − λ_i) to Π(x − p(λ_i)).

Perron-Frobenius Theorem
If A ∈ R^{n×n} has all positive entries, then:
1. There exists an eigenvector with all positive entries, unique up to scalar multiples.
2. Its corresponding eigenvalue is λ_1, the eigenvalue of largest modulus (i.e. the spectral radius ρ(A)).
3. That eigenvalue has multiplicity 1 (i.e. it is a simple eigenvalue).
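The Perron-Frobenius conclusions can be observed numerically. A minimal numpy sketch — the positive matrix below is made up for illustration (its rows happen to sum to 1, so the dominant eigenvalue is 1):

```python
import numpy as np

# Arbitrary entrywise-positive matrix (row sums equal 1, so rho(A) = 1).
A = np.array([[0.5, 0.3, 0.2],
              [0.1, 0.7, 0.2],
              [0.4, 0.4, 0.2]])

w, V = np.linalg.eig(A)
k = np.argmax(np.abs(w))        # index of the dominant eigenvalue
lam = w[k].real
v = V[:, k].real
v = v / v[np.argmax(np.abs(v))] # scale so the largest entry is +1

# Dominant eigenvalue is simple: no other eigenvalue attains the same modulus.
dominant_is_simple = int(np.sum(np.isclose(np.abs(w), np.abs(w[k])))) == 1
all_positive = bool(np.all(v > 0))
```

After scaling, every entry of the Perron vector v is strictly positive, and the dominant eigenvalue is simple, matching points 1-3 above.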

Schur Decomposition
For any A ∈ C^{n×n} there exist a unitary Q ∈ C^{n×n} and an upper triangular U ∈ C^{n×n} such that A = Q U Q*.
- not necessarily unique
- an extension of the spectral decomposition theorem (i.e. the diagonalization theorem)
- If A is positive definite, then the Schur, singular value, and spectral decompositions coincide.
- If A is real, then there exist an orthogonal Q ∈ R^{n×n} and an upper quasi-triangular T ∈ R^{n×n} such that A = Q T Q^T.

Singular Value Decomposition (SVD)
Any M ∈ C^{m×n} has a singular value decomposition M = U Σ V* with U (m×m) unitary, V (n×n) unitary, and Σ (m×n) diagonal with the singular values as its entries.
- U and V aren't unique, but Σ is, up to re-ordering.
- If M is diagonalizable
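The SVD's basic guarantees, and its connection to the Frobenius and 2-norms noted earlier, can be checked with numpy on a random rectangular matrix (numpy and the random example are my additions):

```python
import numpy as np

rng = np.random.default_rng(2)
M = rng.standard_normal((4, 6))   # arbitrary 4 x 6 real matrix

U, s, Vh = np.linalg.svd(M)       # M = U @ Sigma @ Vh, s sorted descending
Sigma = np.zeros((4, 6))
Sigma[:4, :4] = np.diag(s)        # embed singular values in a 4 x 6 "diagonal"

reconstructs = np.allclose(U @ Sigma @ Vh, M)
U_unitary = np.allclose(U @ U.T, np.eye(4))
V_unitary = np.allclose(Vh @ Vh.T, np.eye(6))
# ||M||_F^2 = sum of squared singular values; ||M||_2 = sigma_1
frob_matches = np.isclose(np.linalg.norm(M, 'fro') ** 2, np.sum(s ** 2))
two_norm_matches = np.isclose(np.linalg.norm(M, 2), s[0])
```

Note that `np.linalg.svd` returns V*, not V, and with the default `full_matrices=True` the factors U and Vh are square, matching the definition above.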