Introduction to linear algebra


Vector. A vector, v, of dimension n is an n × 1 rectangular array of elements, v = (v_1, v_2, ..., v_n)^T. By default, vectors will be column vectors; they may also be row vectors, when transposed: v^T = [v_1, v_2, ..., v_n].

A vector, v = (v_1, v_2, ..., v_n)^T, of dimension n can be thought of as a point in n-dimensional space. [Figure: a vector v plotted as a point with coordinates v_1, v_2, v_3 on three axes.]

Properties of vector spaces. Commutativity of addition: a + b = b + a. Associativity of addition: a + (b + c) = (a + b) + c. Identity element of addition: a + 0 = a, where 0 = (0, 0, ..., 0)^T. Additive inverse: a + (−a) = 0. Identity of scalar multiplication: 1·a = a. Distributivity of scalar multiplication: c(a + b) = c·a + c·b and (c + d)·a = c·a + d·a. Compatibility of scalar multiplication: (cd)·a = c(d·a).

Other vector operations. Dot (inner) product: a · b = a_1 b_1 + a_2 b_2 + ... + a_n b_n. Let x and y denote two p × 1 vectors. The length of x is ||x|| = sqrt(x^T x) = sqrt(x_1^2 + ... + x_p^2), and the angle θ between x and y satisfies cos θ = x^T y / (sqrt(x^T x) · sqrt(y^T y)).
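The length and angle formulas above can be sketched numerically; this is a minimal NumPy illustration (the choice of library and the example vectors are assumptions, not from the slides).

```python
import numpy as np

x = np.array([3.0, 0.0])
y = np.array([1.0, 1.0])

dot = x @ y                        # x . y = x1*y1 + ... + xn*yn
len_x = np.sqrt(x @ x)             # ||x|| = sqrt(x'x)
len_y = np.sqrt(y @ y)
cos_theta = dot / (len_x * len_y)  # cos(theta) = x'y / (||x|| ||y||)
theta = np.arccos(cos_theta)       # angle between x and y, in radians
```

For these vectors the angle comes out to π/4, as expected for (3, 0) and (1, 1).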

Note: let x and y denote two p × 1 vectors. Then cos θ = 0 if x'y = 0, i.e. θ = π/2. Thus if x'y = 0, then x and y are orthogonal.

How similar are two signals? For A = (a_1, a_2, ..., a_n) and B = (b_1, b_2, ..., b_n), the dot product is A · B = Σ_i a_i b_i = ||A|| ||B|| cos θ. Identical (normalized) vectors: θ = 0, Â · B̂ = 1. Perpendicular vectors: θ = π/2, A · B = 0. The dot product is the same as the cross-correlation at zero lag: (f ⋆ g)(0) = Σ_τ f(τ) g(τ).

What are the characteristics of the dot product? [Figure, repeated slide builds: log-scale plots of S/N (0.1 to 10) for the dot product of Signal+Noise versus Noise alone, as the number of dimensions grows to 1000.]

Vector Basis. A basis is a set of linearly independent vectors that span the vector space: any vector in the space may be represented as a linear combination of the basis vectors. (Linear independence does not require the dot products to be zero; a basis whose vectors are mutually orthogonal is called an orthogonal basis.) If, in addition, all the basis vectors are of length 1, the basis is called orthonormal.
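With an orthonormal basis, the coefficients of the linear combination are just dot products with the basis vectors. A small hypothetical example in R² (the particular basis and vector are illustrative, not from the slides):

```python
import numpy as np

# An orthonormal basis of R^2: two unit vectors at 45 degrees to the axes.
e1 = np.array([1.0, 1.0]) / np.sqrt(2)
e2 = np.array([1.0, -1.0]) / np.sqrt(2)

v = np.array([2.0, 0.0])

# Coordinates of v in this basis are the dot products v . e_i.
c1, c2 = v @ e1, v @ e2
reconstructed = c1 * e1 + c2 * e2   # v as a linear combination of the basis
```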

Matrix. An n × m matrix, A = (a_ij), is a rectangular array of elements with n rows and m columns: row i of A is (a_i1, a_i2, ..., a_im), for i = 1, ..., n. Its dimensions are n × m.

Matrix Operations: Addition. Let A = (a_ij) and B = (b_ij) denote two n × m matrices. Then the sum, A + B, is the n × m matrix A + B = (a_ij + b_ij): entry (i, j) of the sum is a_ij + b_ij. The dimensions of A and B are required to both be n × m.

Scalar Multiplication. Let A = (a_ij) denote an n × m matrix and let c be any scalar. Then cA is the n × m matrix cA = (c·a_ij): every entry of A is multiplied by c.

Gaussian Elimination. A method to solve linear equations. Let Ax = b be a linear system of equations; represent it as the augmented matrix (A | b), with rows (a_i1, ..., a_im | b_i). Use the following operations: multiply a row by a constant; interchange two rows; add a multiple of one row to another — to transform it to row echelon form (C | d), with zeros below the diagonal.

Gaussian Elimination Example. Solve this system of linear equations: 2x_2 + x_3 = −8; x_1 − 2x_2 − 3x_3 = 0; −x_1 + x_2 + 2x_3 = 3. Solution: Swap Row 1 and Row 2. Add Row 1 to Row 3. Swap Row 2 and Row 3. Add twice Row 2 to Row 3. Add −1 times Row 3 to Row 2. Add −3 times Row 3 to Row 1. Add −2 times Row 2 to Row 1. Multiply Rows 2 and 3 by −1. This yields x_1 = −4, x_2 = −5, x_3 = 2.
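The worked example can be spot-checked numerically. Minus signs are easily lost in transcription, so the system below is the sign reconstruction consistent with the listed row operations (an assumption worth stating); NumPy's solver reproduces the answer the eliminations lead to.

```python
import numpy as np

# Coefficient matrix and right-hand side of the example system
# (signs reconstructed; see the caveat in the text above).
A = np.array([[0.0,  2.0,  1.0],
              [1.0, -2.0, -3.0],
              [-1.0, 1.0,  2.0]])
b = np.array([-8.0, 0.0, 3.0])

x = np.linalg.solve(A, b)   # same answer the row operations produce
```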

Matrix multiplication. Let A = (a_ij) denote an n × m matrix and B = (b_jl) denote an m × k matrix. A and B are compatible iff the second dimension of A is equal to the first dimension of B. Then the n × k matrix C = (c_il), where c_il = Σ_{j=1}^{m} a_ij b_jl, is called the product of A and B and is denoted by AB. In other words, element (i, l) of the product matrix is the dot product of the i-th row vector of A with the l-th column vector of B.
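The formula c_il = Σ_j a_ij b_jl can be written out as explicit loops; this is a teaching sketch (in practice one would just call `A @ B`).

```python
import numpy as np

def matmul(A, B):
    """Product of an n x m matrix A and an m x k matrix B, element by element."""
    n, m = A.shape
    m2, k = B.shape
    assert m == m2, "A and B are not compatible"
    C = np.zeros((n, k))
    for i in range(n):
        for l in range(k):
            # element (i, l) = dot product of row i of A and column l of B
            C[i, l] = sum(A[i, j] * B[j, l] for j in range(m))
    return C

A = np.array([[1.0, 2.0], [3.0, 4.0]])
B = np.array([[5.0, 6.0], [7.0, 8.0]])
C = matmul(A, B)
```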

Identity matrix. An n × n identity matrix, I or I_n, is the square matrix with ones on the diagonal and zeros elsewhere. Note: 1. AI = A. 2. IA = A.

Definition (the inverse of an n × n matrix). Let A = (a_ij) denote an n × n matrix, and let B denote an n × n matrix such that AB = BA = I. If such a matrix B exists, then A is called invertible; B is called the inverse of A and is denoted by A⁻¹.

Note: let A and B be two matrices whose inverses exist, and let C = AB. Then the inverse of the matrix C exists and C⁻¹ = B⁻¹A⁻¹. Proof: C[B⁻¹A⁻¹] = [AB][B⁻¹A⁻¹] = A[BB⁻¹]A⁻¹ = A I A⁻¹ = AA⁻¹ = I.

Block Matrices. Let the n × m matrix A be partitioned into sub-matrices A_11 (q × p), A_12 (q × (m − p)), A_21 ((n − q) × p), A_22 ((n − q) × (m − p)): A = [A_11 A_12; A_21 A_22]. Similarly, partition the m × k matrix B into B_11 (p × l), B_12 (p × (k − l)), B_21 ((m − p) × l), B_22 ((m − p) × (k − l)): B = [B_11 B_12; B_21 B_22].

Product of Blocked Matrices. Then AB = [A_11 A_12; A_21 A_22][B_11 B_12; B_21 B_22] = [A_11B_11 + A_12B_21, A_11B_12 + A_12B_22; A_21B_11 + A_22B_21, A_21B_12 + A_22B_22].
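The block-product formula can be verified numerically on a small example; here a 4 × 4 case split into 2 × 2 blocks (the sizes and the random matrices are illustrative assumptions).

```python
import numpy as np

rng = np.random.default_rng(0)
A = rng.standard_normal((4, 4))
B = rng.standard_normal((4, 4))

# Partition each matrix into four 2 x 2 blocks.
A11, A12 = A[:2, :2], A[:2, 2:]
A21, A22 = A[2:, :2], A[2:, 2:]
B11, B12 = B[:2, :2], B[:2, 2:]
B21, B22 = B[2:, :2], B[2:, 2:]

# Assemble AB from the block formula.
top = np.hstack([A11 @ B11 + A12 @ B21, A11 @ B12 + A12 @ B22])
bottom = np.hstack([A21 @ B11 + A22 @ B21, A21 @ B12 + A22 @ B22])
blockwise = np.vstack([top, bottom])
```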

Gauss-Jordan elimination. To find the inverse of a matrix A, use Gaussian elimination to transform (A | I) into (I | B); then B is the inverse of A.

Example of Gauss-Jordan elimination. Find the inverse of the matrix A = [3 0 2; 2 0 −2; 0 1 1]. Start from the augmented matrix (A | I): [3 0 2 | 1 0 0; 2 0 −2 | 0 1 0; 0 1 1 | 0 0 1]. R1 ← R1 + R2: [5 0 0 | 1 1 0; 2 0 −2 | 0 1 0; 0 1 1 | 0 0 1]. R1 ← R1/5: [1 0 0 | 0.2 0.2 0; 2 0 −2 | 0 1 0; 0 1 1 | 0 0 1]. R2 ← R2 − 2R1: [1 0 0 | 0.2 0.2 0; 0 0 −2 | −0.4 0.6 0; 0 1 1 | 0 0 1]. R2 ← R2/(−2): [1 0 0 | 0.2 0.2 0; 0 0 1 | 0.2 −0.3 0; 0 1 1 | 0 0 1]. Swap R2 and R3: [1 0 0 | 0.2 0.2 0; 0 1 1 | 0 0 1; 0 0 1 | 0.2 −0.3 0]. R2 ← R2 − R3: [1 0 0 | 0.2 0.2 0; 0 1 0 | −0.2 0.3 1; 0 0 1 | 0.2 −0.3 0]. Hence A⁻¹ = [0.2 0.2 0; −0.2 0.3 1; 0.2 −0.3 0].
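The worked inverse can be checked directly with NumPy; some signs in the intermediate steps were garbled in transcription, so this is a sanity check of the final answer rather than of every step.

```python
import numpy as np

# The matrix from the slide.
A = np.array([[3.0, 0.0,  2.0],
              [2.0, 0.0, -2.0],
              [0.0, 1.0,  1.0]])

A_inv = np.linalg.inv(A)   # should match the Gauss-Jordan result
```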

The transpose of a matrix. Consider the n × m matrix A = (a_ij). Then the m × n matrix A' (also denoted by A^T) whose (j, i) entry is a_ij — that is, A' = (a_ji) — is called the transpose of A: the rows of A become the columns of A'.

Symmetric Matrices. An n × n matrix, A, is said to be symmetric if A' = A. Note: (AB)' = B'A', (AB)⁻¹ = B⁻¹A⁻¹, and (A⁻¹)' = (A')⁻¹.

The trace and the determinant of a square matrix. Let A = (a_ij) denote an n × n matrix. Then the trace is tr(A) = Σ_{i=1}^{n} a_ii.

The determinant is det(A) = |A| = Σ_{j=1}^{n} a_ij A_ij, where A_ij = (−1)^{i+j} × (the determinant of the matrix obtained by deleting the i-th row and j-th column) is the cofactor of a_ij. For example, |a_11 a_12; a_21 a_22| = a_11 a_22 − a_12 a_21.

Some properties. 1. |I| = 1, tr(I) = n. 2. |AB| = |A||B|, tr(ABC) = tr(CAB). 3. |A⁻¹| = 1/|A|. 4. |A^T| = |A|. 5. |cA| = c^n |A|, for square n × n A. 6. |A| = Π_{i=1}^{n} a_ii, for a square triangular matrix.
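Properties 2–5 are easy to spot-check on random matrices; this is a numerical sanity check on one random draw, not a proof.

```python
import numpy as np

rng = np.random.default_rng(1)
A = rng.standard_normal((3, 3))
B = rng.standard_normal((3, 3))
C = rng.standard_normal((3, 3))
c = 2.5

det = np.linalg.det
prop2_det = np.isclose(det(A @ B), det(A) * det(B))              # |AB| = |A||B|
prop2_tr = np.isclose(np.trace(A @ B @ C), np.trace(C @ A @ B))  # cyclic trace
prop3 = np.isclose(det(np.linalg.inv(A)), 1 / det(A))            # |A^-1| = 1/|A|
prop4 = np.isclose(det(A.T), det(A))                             # |A'| = |A|
prop5 = np.isclose(det(c * A), c**3 * det(A))                    # |cA| = c^n |A|, n = 3
```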

Special Types of Matrices. 1. Orthogonal matrices. A matrix P is orthogonal if P'P = PP' = I. In this case, P⁻¹ = P'. Also, the rows (columns) of P have length 1 and are orthogonal to each other.

Suppose P is an orthogonal matrix, so P'P = PP' = I. Let x and y denote p × 1 vectors, and let u = Px and v = Py. Then u'v = (Px)'(Py) = x'P'Py = x'y, and u'u = (Px)'(Px) = x'P'Px = x'x. Orthogonal transformations preserve lengths and angles; examples are rotations about the origin and reflections.
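A rotation about the origin is the standard example of an orthogonal transformation; a quick numerical check that it preserves dot products (and hence lengths and angles). The angle and vectors are arbitrary choices.

```python
import numpy as np

theta = 0.7   # an arbitrary rotation angle, in radians
P = np.array([[np.cos(theta), -np.sin(theta)],
              [np.sin(theta),  np.cos(theta)]])  # rotation matrix

x = np.array([1.0, 2.0])
y = np.array([3.0, -1.0])
u, v = P @ x, P @ y   # rotated vectors
```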

Special Types of Matrices (continued). 2. Positive definite matrices. A symmetric matrix, A, is called positive definite if x'Ax = a_11 x_1² + ... + a_nn x_n² + 2a_12 x_1 x_2 + ... + 2a_{n−1,n} x_{n−1} x_n > 0 for all x ≠ 0. A symmetric matrix, A, is called positive semi-definite if x'Ax ≥ 0 for all x ≠ 0.

Special Types of Matrices (continued). 3. Idempotent matrices. A symmetric matrix, E, is called idempotent if EE = E. Idempotent matrices project vectors onto a linear subspace: E(Ex) = (EE)x = Ex, so applying E a second time changes nothing.

Definition. Let A be an n × n matrix, and let x and λ be such that Ax = λx with x ≠ 0. Then λ is called an eigenvalue of A, and x is called an eigenvector of A.

Note: ( A λi) x = 0 If A λi 0 then x = ( A λi) 1 0 = 0 thus A λi = 0 is the condition for an eigenvalue.

" A λi = det ( a λ) a 11 1n " " a ( a λ) n1 nn = polynomial of degree n in λ. % ' ' = 0 ' ' Hence there are n possible eigenvalues λ 1,, λ n, some may be 0 or repeat.

Theorem. If the matrix A is symmetric, then the eigenvalues of A, λ_1, ..., λ_n, are real. Theorem. If the matrix A is positive definite, then the eigenvalues of A, λ_1, ..., λ_n, are positive. Theorem. If the matrix A is symmetric with eigenvalues λ_1, ..., λ_n and corresponding eigenvectors x_1, ..., x_n (i.e., A x_i = λ_i x_i), then λ_i ≠ λ_j implies x_i' x_j = 0.

Diagonalization. Theorem. If the matrix A is symmetric with distinct eigenvalues λ_1, ..., λ_n and corresponding eigenvectors x_1, ..., x_n, normalized so that x_i' x_i = 1, then A = λ_1 x_1 x_1' + ... + λ_n x_n x_n' = [x_1, ..., x_n] diag(λ_1, ..., λ_n) [x_1, ..., x_n]' = PDP'.
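The decomposition A = PDP' can be demonstrated on a small symmetric matrix; NumPy's `eigh` returns real eigenvalues (in ascending order) and orthonormal eigenvectors, matching the theorem. The example matrix is an illustrative choice.

```python
import numpy as np

A = np.array([[2.0, 1.0],
              [1.0, 2.0]])   # symmetric, eigenvalues 1 and 3

eigvals, P = np.linalg.eigh(A)   # columns of P are the eigenvectors x_i
D = np.diag(eigvals)
reconstructed = P @ D @ P.T      # lambda_1 x1 x1' + ... + lambda_n xn xn'
```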

Least Squares. A typical linear model can be written as y = Xβ + ε, where y = (y_1, ..., y_n)^T, X is the n × k design matrix whose i-th row is (1, x_2i, ..., x_ki), β = (β_1, ..., β_k)^T, and ε = (ε_1, ..., ε_n)^T. Here β is a vector of unknown regression coefficients, ε are error terms, y is the response variable, and X is the design matrix.

Least Squares (continued). Let b be the estimate of β and e the residuals, so y = Xb + e. S(b) = Σ e_i² is the objective function we would like to minimize: S(b) = (y − Xb)'(y − Xb) = y'y − y'Xb − b'X'y + b'X'Xb. Setting the derivative of S(b) with respect to b to 0 gives −2X'y + 2X'Xb = 0, hence X'Xb = X'y, and the least squares estimator is b = (X'X)⁻¹X'y.

Least Squares (continued). b = (X'X)⁻¹X'y, and e = y − Xb = y − X(X'X)⁻¹X'y = (I − X(X'X)⁻¹X')y = (I − H)y. H = X(X'X)⁻¹X' is symmetric and idempotent, H² = H. HX = X(X'X)⁻¹X'X = X, so H is a projection onto the X subspace. Thus y = Hy + e: the least squares method projects y onto the X subspace, minimizing the residual error e.
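The normal equations and the hat-matrix identities above can be sketched end to end on made-up data (the data-generating model and sample size here are illustrative assumptions).

```python
import numpy as np

rng = np.random.default_rng(2)
n = 20
X = np.column_stack([np.ones(n), rng.standard_normal(n)])  # design matrix: intercept + one regressor
beta_true = np.array([1.0, 2.0])
y = X @ beta_true + 0.1 * rng.standard_normal(n)           # response with small noise

# Least squares via the normal equations: X'X b = X'y.
b = np.linalg.solve(X.T @ X, X.T @ y)

# Hat matrix H = X (X'X)^{-1} X' and residuals e = (I - H) y.
H = X @ np.linalg.solve(X.T @ X, X.T)
e = y - H @ y
```

Here `b` matches what a library solver would return, and `H` satisfies the projection properties from the slide.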