CS 246 Review of Linear Algebra 01/17/19


1 Linear algebra

In this section we will discuss vectors and matrices. We denote the $(i, j)$th entry of a matrix $A$ as $A_{ij}$, and the $i$th entry of a vector as $v_i$.

1.1 Vectors and vector operations

A vector is a one-dimensional matrix, and it can be written as a column vector:
\[
v = \begin{bmatrix} v_1 \\ v_2 \\ \vdots \\ v_n \end{bmatrix}
\]
or a row vector:
\[
v = \begin{bmatrix} v_1 & v_2 & \cdots & v_n \end{bmatrix}
\]

1.1.1 Dot product

The dot product of two equal-length vectors $u = (u_1, \ldots, u_n)$ and $v = (v_1, \ldots, v_n)$ is
\[
u \cdot v = u^T v = u_1 v_1 + u_2 v_2 + \cdots + u_n v_n = \sum_{i=1}^{n} u_i v_i
\]
Two vectors are orthogonal if their dot product is zero. In 2D space we also call orthogonal vectors perpendicular.

1.1.2 Norm

The $\ell_2$ norm, or length, of a vector $(v_1, \ldots, v_n)$ is just $\sqrt{v_1^2 + v_2^2 + \cdots + v_n^2}$. The norm of a vector $v$ is usually written as $\|v\|$. A more general norm is the $\ell_p$ norm,
\[
\|v\|_p = \left( \sum_{i=1}^{n} |v_i|^p \right)^{1/p}
\]
When we take $p = 2$, we recover the $\ell_2$ norm.
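As a quick numerical companion (not part of the original notes; a minimal NumPy sketch with arbitrary example values):

```python
import numpy as np

u = np.array([1.0, 2.0, 3.0])
v = np.array([4.0, -5.0, 6.0])

# Dot product: sum_i u_i v_i
print(u @ v)                    # same as np.dot(u, v)

# l2 norm (length): sqrt(sum_i v_i^2)
print(np.linalg.norm(v))        # ord=2 is the default

# General lp norm: (sum_i |v_i|^p)^(1/p), here p = 3
print(np.linalg.norm(v, ord=3))

# Orthogonality: dot product equals zero
a, b = np.array([1.0, 0.0]), np.array([0.0, 2.0])
print(np.isclose(a @ b, 0.0))   # True
```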

1.1.3 Triangle inequality

For two vectors $u$ and $v$, we have
\[
\|u + v\| \le \|u\| + \|v\| \quad \text{and} \quad \|u - v\| \ge \|u\| - \|v\|
\]

1.2 Matrix operations

1.2.1 Matrix addition

Matrix addition is defined for matrices of the same dimension. Matrices are added componentwise:
\[
\begin{bmatrix} 1 & 2 \\ 3 & 4 \end{bmatrix} + \begin{bmatrix} 5 & 6 \\ 7 & 8 \end{bmatrix}
= \begin{bmatrix} 1+5 & 2+6 \\ 3+7 & 4+8 \end{bmatrix}
= \begin{bmatrix} 6 & 8 \\ 10 & 12 \end{bmatrix}
\]

1.2.2 Matrix multiplication

Matrices can be multiplied like so:
\[
\begin{bmatrix} 1 & 2 \\ 3 & 4 \end{bmatrix} \begin{bmatrix} 5 & 6 \\ 7 & 8 \end{bmatrix}
= \begin{bmatrix} 1 \cdot 5 + 2 \cdot 7 & 1 \cdot 6 + 2 \cdot 8 \\ 3 \cdot 5 + 4 \cdot 7 & 3 \cdot 6 + 4 \cdot 8 \end{bmatrix}
= \begin{bmatrix} 19 & 22 \\ 43 & 50 \end{bmatrix}
\]
You can also multiply non-square matrices, but the dimensions have to match (i.e. the number of columns of the first matrix has to equal the number of rows of the second matrix):
\[
\begin{bmatrix} 1 & 2 \\ 3 & 4 \\ 5 & 6 \end{bmatrix} \begin{bmatrix} 1 & 2 & 3 \\ 4 & 5 & 6 \end{bmatrix}
= \begin{bmatrix} 1 \cdot 1 + 2 \cdot 4 & 1 \cdot 2 + 2 \cdot 5 & 1 \cdot 3 + 2 \cdot 6 \\ 3 \cdot 1 + 4 \cdot 4 & 3 \cdot 2 + 4 \cdot 5 & 3 \cdot 3 + 4 \cdot 6 \\ 5 \cdot 1 + 6 \cdot 4 & 5 \cdot 2 + 6 \cdot 5 & 5 \cdot 3 + 6 \cdot 6 \end{bmatrix}
= \begin{bmatrix} 9 & 12 & 15 \\ 19 & 26 & 33 \\ 29 & 40 & 51 \end{bmatrix}
\]
In general, if matrix $A$ is multiplied by matrix $B$, we have $(AB)_{ij} = \sum_k A_{ik} B_{kj}$ for all entries $(i, j)$ of the matrix product.

Matrix multiplication is associative, i.e. $(AB)C = A(BC)$. It is also distributive, i.e. $A(B + C) = AB + AC$. However, it is not commutative: $AB$ does not have to equal $BA$.

Note that if you multiply a 1-by-$n$ matrix with an $n$-by-1 matrix, that is the same as taking the dot product of the corresponding vectors.
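The examples above can be checked mechanically; this small NumPy sketch uses the same matrices as the worked products:

```python
import numpy as np

A = np.array([[1, 2], [3, 4]])
B = np.array([[5, 6], [7, 8]])

# Componentwise addition and the 2x2 product from above
print(A + B)        # [[ 6  8] [10 12]]
print(A @ B)        # [[19 22] [43 50]]

# Non-square product: (3x2) @ (2x3) -> (3x3); inner dimensions must match
C = np.array([[1, 2], [3, 4], [5, 6]])
D = np.array([[1, 2, 3], [4, 5, 6]])
print(C @ D)        # [[ 9 12 15] [19 26 33] [29 40 51]]

# Multiplication is not commutative in general
print(np.array_equal(A @ B, B @ A))  # False
```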

1.2.3 Matrix transpose

The transpose operation switches a matrix's rows with its columns, so
\[
\begin{bmatrix} 1 & 2 \\ 3 & 4 \\ 5 & 6 \end{bmatrix}^T
= \begin{bmatrix} 1 & 3 & 5 \\ 2 & 4 & 6 \end{bmatrix}
\]
In other words, we define $A^T$ by $(A^T)_{ij} = A_{ji}$.

Properties:
- $(A^T)^T = A$
- $(AB)^T = B^T A^T$

  Proof. Let $AB = C$ and $(AB)^T = D$. Then
  \[
  D_{ij} = C_{ji} = \sum_k A_{jk} B_{ki} = \sum_k (A^T)_{kj} (B^T)_{ik} = \sum_k (B^T)_{ik} (A^T)_{kj},
  \]
  so $D = B^T A^T$.
- $(A + B)^T = A^T + B^T$

1.2.4 Identity matrix

The identity matrix $I_n$ is an $n$-by-$n$ matrix with all 1's on the diagonal, and 0's everywhere else. It is usually abbreviated $I$ when it is clear what the dimensions of the matrix are. It has the property that when you multiply it by any other matrix, you get that matrix. In other words, if $A$ is an $m$-by-$n$ matrix, then $AI_n = I_m A = A$.
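A quick numerical check of the transpose properties and the identity behavior (a NumPy sketch; the matrices are arbitrary small examples):

```python
import numpy as np

A = np.array([[1, 2], [3, 4], [5, 6]])   # 3x2
B = np.array([[1, 2, 3], [4, 5, 6]])     # 2x3

print(A.T)                                   # rows and columns swapped
print(np.array_equal(A.T.T, A))              # (A^T)^T = A       -> True
print(np.array_equal((A @ B).T, B.T @ A.T))  # (AB)^T = B^T A^T  -> True

# Identity: for a 3x2 matrix A, A I_2 = I_3 A = A
print(np.allclose(A @ np.eye(2), A))         # True
print(np.allclose(np.eye(3) @ A, A))         # True
```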

1.2.5 Matrix inverse

The inverse of a matrix $A$ is the matrix that you can multiply $A$ by to get the identity matrix. Not all matrices have an inverse. (The ones that have an inverse are called invertible.) In other words, $A^{-1}$ is the matrix where $AA^{-1} = A^{-1}A = I$ (if it exists). The inverse of a matrix is unique, if it exists.

Properties:
- $(A^{-1})^{-1} = A$
- $(AB)^{-1} = B^{-1} A^{-1}$
- $(A^{-1})^T = (A^T)^{-1}$

  Proof. $(A^{-1})^T (A^T) = (A A^{-1})^T = I$.

1.3 Types of matrices

1.3.1 Diagonal matrix

A diagonal matrix is a matrix that has 0's everywhere except the diagonal. A diagonal matrix can be written $D = \operatorname{diag}(d_1, d_2, \ldots, d_n)$, which corresponds to the matrix
\[
D = \begin{bmatrix} d_1 & 0 & \cdots & 0 \\ 0 & d_2 & \cdots & 0 \\ \vdots & \vdots & \ddots & \vdots \\ 0 & 0 & \cdots & d_n \end{bmatrix}
\]
You may verify that
\[
D^k = \begin{bmatrix} d_1^k & 0 & \cdots & 0 \\ 0 & d_2^k & \cdots & 0 \\ \vdots & \vdots & \ddots & \vdots \\ 0 & 0 & \cdots & d_n^k \end{bmatrix}
\]
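One way to verify the $D^k$ identity numerically (a minimal NumPy sketch; the diagonal entries here are arbitrary):

```python
import numpy as np

d = np.array([2.0, 3.0, 5.0])
D = np.diag(d)

# D^3 computed two ways: repeated matrix product vs. entrywise powers
print(np.linalg.matrix_power(D, 3))
print(np.diag(d ** 3))   # same matrix: powers of a diagonal act entrywise
```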

1.3.2 Triangular matrix

A lower triangular matrix is a matrix that has all its nonzero elements on or below the diagonal. An upper triangular matrix is a matrix that has all its nonzero elements on or above the diagonal.

1.3.3 Symmetric matrix

$A$ is symmetric if $A = A^T$, i.e. $A_{ij} = A_{ji}$ for all entries $(i, j)$ in $A$. Note that a matrix must be square in order to be symmetric.

1.3.4 Orthogonal matrix

A matrix $U$ is orthogonal if $UU^T = U^T U = I$. (That is, the inverse of an orthogonal matrix is its transpose.)

Orthogonal matrices have the property that every row is orthogonal to every other row. That is, the dot product of any row vector with any other row vector is 0. In addition, every row is a unit vector, i.e. it has norm 1. (Try verifying this for yourself!) Similarly, every column is a unit vector, and every column is orthogonal to every other column. (You can verify this by noting that if $U$ is orthogonal, then $U^T$ is also orthogonal.)

1.4 Linear independence and span

A linear combination of the vectors $v_1, \ldots, v_n$ is an expression of the form $a_1 v_1 + a_2 v_2 + \cdots + a_n v_n$, where $a_1, \ldots, a_n$ are real numbers. Note that some of the $a_i$'s may be zero. The span of a set of vectors is the set of all possible linear combinations of that set of vectors.

The vectors $v_1, \ldots, v_n$ are linearly independent if you cannot find coefficients $a_1, \ldots, a_n$ where $a_1 v_1 + \cdots + a_n v_n = 0$ (except for the trivial solution $a_1 = a_2 = \cdots = a_n = 0$). Intuitively, this means you cannot write any of the vectors as a linear combination of the others. (A set of vectors is linearly dependent if it is not linearly independent.)

1.5 Eigenvalues and eigenvectors

Sometimes, multiplying a matrix by a vector just stretches that vector. If that happens, the vector is called an eigenvector of the matrix, and the stretching factor is called the eigenvalue.

Definition: Given a square matrix $A$, $\lambda$ is an eigenvalue of $A$ with the corresponding eigenvector $x$ if $Ax = \lambda x$. (Note that in this definition, $x$ is a vector, and $\lambda$ is a number.) (By convention, the zero vector cannot be an eigenvector of any matrix.)

Example: If
\[
A = \begin{bmatrix} 2 & 1 \\ 1 & 2 \end{bmatrix}
\]
then the vector $\begin{bmatrix} 3 \\ -3 \end{bmatrix}$ is an eigenvector with eigenvalue 1, because
\[
Ax = \begin{bmatrix} 2 & 1 \\ 1 & 2 \end{bmatrix} \begin{bmatrix} 3 \\ -3 \end{bmatrix}
= \begin{bmatrix} 3 \\ -3 \end{bmatrix}
= 1 \begin{bmatrix} 3 \\ -3 \end{bmatrix}
\]
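The eigenvector claim in the example is easy to confirm numerically (a one-off NumPy check of $Ax = \lambda x$ for this specific $A$ and $x$):

```python
import numpy as np

A = np.array([[2.0, 1.0], [1.0, 2.0]])
x = np.array([3.0, -3.0])

# A x should equal lambda * x with lambda = 1
print(A @ x)                          # [ 3. -3.]
print(np.allclose(A @ x, 1.0 * x))    # True
```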

1.5.1 Solving for eigenvalues and eigenvectors

We exploit the fact that $Ax = \lambda x$ if and only if $(A - \lambda I)x = 0$. (Note that $\lambda I$ is the diagonal matrix where all the diagonal entries are $\lambda$, and all other entries are zero.)

This equation has a nonzero solution for $x$ if and only if the determinant of $A - \lambda I$ equals 0. (We won't prove this here, but you can google the invertible matrix theorem.) Therefore, you can find the eigenvalues of the matrix $A$ by solving the equation $\det(A - \lambda I) = 0$ for $\lambda$. Once you have done that, you can find the corresponding eigenvector for each eigenvalue $\lambda$ by solving the system of equations $(A - \lambda I)x = 0$ for $x$.

Example: If
\[
A = \begin{bmatrix} 2 & 1 \\ 1 & 2 \end{bmatrix}
\]
then
\[
A - \lambda I = \begin{bmatrix} 2 - \lambda & 1 \\ 1 & 2 - \lambda \end{bmatrix}
\]
and
\[
\det(A - \lambda I) = (2 - \lambda)^2 - 1 = \lambda^2 - 4\lambda + 3
\]
Setting this equal to 0, we find that $\lambda = 1$ and $\lambda = 3$ are the eigenvalues.

To find the eigenvectors for $\lambda = 1$, we plug $\lambda$ into the equation $(A - \lambda I)x = 0$. This gives us
\[
\begin{bmatrix} 1 & 1 \\ 1 & 1 \end{bmatrix} \begin{bmatrix} x_1 \\ x_2 \end{bmatrix} = \begin{bmatrix} 0 \\ 0 \end{bmatrix}
\]
Any vector where $x_2 = -x_1$ is a solution to this equation, and in particular, $\begin{bmatrix} 3 \\ -3 \end{bmatrix}$ is one solution.

To find the eigenvectors for $\lambda = 3$, we again plug $\lambda$ into the equation, and this time we get
\[
\begin{bmatrix} -1 & 1 \\ 1 & -1 \end{bmatrix} \begin{bmatrix} x_1 \\ x_2 \end{bmatrix} = \begin{bmatrix} 0 \\ 0 \end{bmatrix}
\]
Any vector where $x_2 = x_1$ is a solution to this equation.

(Note: The above method is never used to calculate eigenvalues and eigenvectors for large matrices in practice; iterative methods are used instead.)
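In practice one calls a library routine rather than expanding the characteristic polynomial by hand. A minimal NumPy sketch for the example above (np.linalg.eig returns the eigenvectors, normalized to unit length, as the columns of the second return value):

```python
import numpy as np

A = np.array([[2.0, 1.0], [1.0, 2.0]])

eigvals, eigvecs = np.linalg.eig(A)
print(eigvals)    # eigenvalues 3 and 1 (order is not guaranteed)
print(eigvecs)    # columns are unit-norm eigenvectors

# Check A x = lambda x for each eigenpair
for lam, x in zip(eigvals, eigvecs.T):
    print(np.allclose(A @ x, lam * x))   # True
```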

1.5.2 Properties of eigenvalues and eigenvectors

- Usually eigenvectors are normalized to unit length.
- If $A$ is symmetric, then all its eigenvalues are real.
- The eigenvalues of any triangular matrix are its diagonal entries.
- The trace of a matrix (i.e. the sum of the elements on its diagonal) is equal to the sum of its eigenvalues.
- The determinant of a matrix is equal to the product of its eigenvalues.

1.6 Matrix eigendecomposition

Theorem: Suppose $A$ is an $n$-by-$n$ matrix with $n$ linearly independent eigenvectors. Then $A$ can be written as $A = PDP^{-1}$, where $P$ is the matrix whose columns are the eigenvectors of $A$, and $D$ is the diagonal matrix whose entries are the corresponding eigenvalues.

In addition, $A^2 = (PDP^{-1})(PDP^{-1}) = PD^2P^{-1}$, and $A^n = PD^nP^{-1}$. (This is interesting because it's much easier to raise a diagonal matrix to a power than to exponentiate an ordinary matrix.)
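The decomposition and the fast-power trick can be sketched in a few lines of NumPy (reusing the symmetric example matrix from section 1.5; the exponent 5 is arbitrary):

```python
import numpy as np

A = np.array([[2.0, 1.0], [1.0, 2.0]])

# Eigendecomposition A = P D P^{-1}
eigvals, P = np.linalg.eig(A)
D = np.diag(eigvals)
print(np.allclose(A, P @ D @ np.linalg.inv(P)))          # True

# A^5 via the decomposition: only the diagonal entries get powered
A5 = P @ np.diag(eigvals ** 5) @ np.linalg.inv(P)
print(np.allclose(A5, np.linalg.matrix_power(A, 5)))     # True
```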