Special Types of Matrices

Certain matrices, such as the identity matrix

$$I = \begin{bmatrix} 1 & 0 & 0 & 0 \\ 0 & 1 & 0 & 0 \\ 0 & 0 & 1 & 0 \\ 0 & 0 & 0 & 1 \end{bmatrix},$$

have a special shape, which endows the matrix with helpful properties. The identity matrix is an example of a diagonal matrix; we will discuss several types of special matrices in this section, including diagonal matrices, as well as the properties that make them interesting.

Diagonal Matrices

Examine the matrices below:

$$\begin{bmatrix} 3i & 0 \\ 0 & 5 \end{bmatrix}, \qquad
\begin{bmatrix} 7 & 0 & 0 & 0 \\ 0 & 3 & 0 & 0 \\ 0 & 0 & i & 0 \\ 0 & 0 & 0 & 1 \end{bmatrix}, \qquad
\begin{bmatrix} 2 & 0 & 0 \\ 0 & \tfrac{1}{2} & 0 \\ 0 & 0 & 1 \end{bmatrix}$$

You should notice that the three matrices have a common shape: their only nonzero entries occur on their diagonals.

Definition. A square matrix is diagonal if all of its off-diagonal entries are 0s. A diagonal matrix has form

$$\begin{bmatrix} d_1 & 0 & 0 & \cdots & 0 \\ 0 & d_2 & 0 & \cdots & 0 \\ 0 & 0 & d_3 & \cdots & 0 \\ \vdots & \vdots & \vdots & \ddots & \vdots \\ 0 & 0 & 0 & \cdots & d_n \end{bmatrix}.$$

We give such matrices a name because they have interesting properties not shared by nondiagonal matrices; we discuss these properties below.
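As a quick computational aside, here is a minimal sketch in Python with NumPy (not part of the original notes; the entries and names are illustrative) showing one way to build a diagonal matrix from its diagonal entries and confirm the defining property that every off-diagonal entry is 0.

```python
import numpy as np

# Build a diagonal matrix from its diagonal entries d_1, ..., d_n.
d = np.array([3j, 5, 7])          # example diagonal entries (complex entries are allowed)
D = np.diag(d)                    # np.diag places d on the diagonal and zeros elsewhere

# Check the defining property: every off-diagonal entry is 0.
off_diag_mask = ~np.eye(len(d), dtype=bool)
print(D)
print(np.all(D[off_diag_mask] == 0))  # True
```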

Properties of Diagonal Matrices

We have seen that matrix multiplication is, in general, quite tedious. However, if the two matrices in question are actually diagonal matrices, multiplication becomes quite simple, as indicated by the following theorem:

Theorem. The product of a pair of $n \times n$ diagonal matrices

$$A = \begin{bmatrix} a_{11} & 0 & 0 & \cdots & 0 \\ 0 & a_{22} & 0 & \cdots & 0 \\ 0 & 0 & a_{33} & \cdots & 0 \\ \vdots & \vdots & \vdots & \ddots & \vdots \\ 0 & 0 & 0 & \cdots & a_{nn} \end{bmatrix}
\qquad\text{and}\qquad
B = \begin{bmatrix} b_{11} & 0 & 0 & \cdots & 0 \\ 0 & b_{22} & 0 & \cdots & 0 \\ 0 & 0 & b_{33} & \cdots & 0 \\ \vdots & \vdots & \vdots & \ddots & \vdots \\ 0 & 0 & 0 & \cdots & b_{nn} \end{bmatrix}$$

is also an $n \times n$ diagonal matrix, and has form

$$AB = \begin{bmatrix} a_{11}b_{11} & 0 & 0 & \cdots & 0 \\ 0 & a_{22}b_{22} & 0 & \cdots & 0 \\ 0 & 0 & a_{33}b_{33} & \cdots & 0 \\ \vdots & \vdots & \vdots & \ddots & \vdots \\ 0 & 0 & 0 & \cdots & a_{nn}b_{nn} \end{bmatrix}.$$

Proof. To prove the theorem, we must accomplish two things:

1. Show that each off-diagonal entry of AB is 0.
2. Show that the diagonal entries of AB have form $a_{ii}b_{ii}$.

We know that the $(i,j)$ entry $p_{ij}$ of the product AB has form

$$p_{ij} = \sum_{k=1}^{n} a_{ik}b_{kj}.$$

We may now divide the problem into two cases:

Case 1: Off-diagonal entries.
Case 2: Diagonal entries.

Notice that an off-diagonal entry $p_{ij}$ of a matrix must have $i \neq j$; on the other hand, every diagonal entry has form $p_{ii}$. We investigate the cases using this observation.

Case 1: Off-diagonal entries. Since $p_{ij}$ is off-diagonal, $i \neq j$. We know that

$$p_{ij} = \sum_{k=1}^{n} a_{ik}b_{kj};$$

in addition, A and B are diagonal, so $a_{ik} = 0$ whenever $i \neq k$, and similarly $b_{kj} = 0$ whenever $k \neq j$. Since $i \neq j$, every term in the expansion of this sum has at least one 0 factor. Thus $p_{ij} = 0$ whenever $i \neq j$.

Case 2: Diagonal entries. The diagonal entry $p_{ii}$ has form

$$p_{ii} = \sum_{k=1}^{n} a_{ik}b_{ki};$$

using the reasoning above, the only possible nonzero term in this sum is $a_{ii}b_{ii}$. Thus

$$p_{ii} = a_{ii}b_{ii}.$$
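The theorem is easy to check numerically. Below is a minimal Python/NumPy sketch (not part of the original notes; the diagonal entries are made up) that multiplies two diagonal matrices and confirms the product is the diagonal matrix with entries $a_{ii}b_{ii}$.

```python
import numpy as np

a = np.array([2.0, -3.0, 0.5, 7.0])   # diagonal entries of A
b = np.array([4.0, 1.5, -2.0, 6.0])   # diagonal entries of B

A = np.diag(a)
B = np.diag(b)

# Full matrix product versus the entrywise product of the diagonals.
product = A @ B
shortcut = np.diag(a * b)

print(np.allclose(product, shortcut))  # True: AB is diagonal with entries a_ii * b_ii
```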

The theorem above has several nice consequences. For starters, finding the inverse of a diagonal matrix is quite simple (unlike finding inverses for most other matrices!). Indeed, a diagonal matrix is invertible if and only if all of its diagonal entries are nonzero; if this is the case, then the inverse of

$$D = \begin{bmatrix} d_1 & 0 & 0 & \cdots & 0 \\ 0 & d_2 & 0 & \cdots & 0 \\ 0 & 0 & d_3 & \cdots & 0 \\ \vdots & \vdots & \vdots & \ddots & \vdots \\ 0 & 0 & 0 & \cdots & d_n \end{bmatrix}
\qquad\text{is}\qquad
D^{-1} = \begin{bmatrix} \tfrac{1}{d_1} & 0 & 0 & \cdots & 0 \\ 0 & \tfrac{1}{d_2} & 0 & \cdots & 0 \\ 0 & 0 & \tfrac{1}{d_3} & \cdots & 0 \\ \vdots & \vdots & \vdots & \ddots & \vdots \\ 0 & 0 & 0 & \cdots & \tfrac{1}{d_n} \end{bmatrix}.$$

Example. Given

$$A = \begin{bmatrix} 2 & 0 & 0 & 0 \\ 0 & \tfrac{7}{3} & 0 & 0 \\ 0 & 0 & -1 & 0 \\ 0 & 0 & 0 & \tfrac{5}{4} \end{bmatrix},$$

find $A^{-1}$ and $A^3$.

$A^{-1}$ is simple, and you should check that

$$A^{-1} = \begin{bmatrix} \tfrac{1}{2} & 0 & 0 & 0 \\ 0 & \tfrac{3}{7} & 0 & 0 \\ 0 & 0 & -1 & 0 \\ 0 & 0 & 0 & \tfrac{4}{5} \end{bmatrix}.$$

To find $A^3$, we could start by finding $A^2$, then calculating $A^2 \cdot A$. However, we know that each product will yield another diagonal matrix, whose diagonal entries are just the products of the corresponding diagonal entries of the factors. So to find $A^3$, all we really need to do is cube each of its diagonal entries. We have

$$A^3 = \begin{bmatrix} 2^3 & 0 & 0 & 0 \\ 0 & \left(\tfrac{7}{3}\right)^3 & 0 & 0 \\ 0 & 0 & (-1)^3 & 0 \\ 0 & 0 & 0 & \left(\tfrac{5}{4}\right)^3 \end{bmatrix}
= \begin{bmatrix} 8 & 0 & 0 & 0 \\ 0 & \tfrac{343}{27} & 0 & 0 \\ 0 & 0 & -1 & 0 \\ 0 & 0 & 0 & \tfrac{125}{64} \end{bmatrix}.$$
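A quick numerical check of this example (a Python/NumPy sketch, not part of the original notes; the entries match the example above) confirms that inverting and cubing a diagonal matrix amounts to inverting and cubing its diagonal entries.

```python
import numpy as np

d = np.array([2.0, 7.0/3.0, -1.0, 5.0/4.0])   # diagonal entries of A
A = np.diag(d)

# Inverse and cube computed from the diagonal entries only.
A_inv = np.diag(1.0 / d)
A_cubed = np.diag(d ** 3)

print(np.allclose(A_inv, np.linalg.inv(A)))                # True
print(np.allclose(A_cubed, np.linalg.matrix_power(A, 3)))  # True
```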

Upper and Lower Triangular Matrices

Triangular matrices are our next special type of matrix:

Definition. A square matrix is upper triangular if all of its below-diagonal entries are 0s. An upper triangular matrix has form

$$\begin{bmatrix} a_{11} & a_{12} & a_{13} & \cdots & a_{1n} \\ 0 & a_{22} & a_{23} & \cdots & a_{2n} \\ 0 & 0 & a_{33} & \cdots & a_{3n} \\ \vdots & \vdots & \vdots & \ddots & \vdots \\ 0 & 0 & 0 & \cdots & a_{nn} \end{bmatrix}.$$

A square matrix is lower triangular if all of its above-diagonal entries are 0s. A lower triangular matrix has form

$$\begin{bmatrix} a_{11} & 0 & 0 & \cdots & 0 \\ a_{21} & a_{22} & 0 & \cdots & 0 \\ a_{31} & a_{32} & a_{33} & \cdots & 0 \\ \vdots & \vdots & \vdots & \ddots & \vdots \\ a_{n1} & a_{n2} & a_{n3} & \cdots & a_{nn} \end{bmatrix}.$$

The first two matrices below are upper triangular, and the last is lower triangular:

$$\begin{bmatrix} 1 & 3 & 0 \\ 0 & 2 & 5 \\ 0 & 0 & 1 \end{bmatrix}, \qquad
\begin{bmatrix} 2 & 0 & 1 \\ 0 & 3 & 0 \\ 0 & 0 & 2 \end{bmatrix}, \qquad
\begin{bmatrix} 1 & 0 & 0 \\ 3 & 2 & 0 \\ 0 & 5 & 1 \end{bmatrix}$$

In the example above, you should have noticed that the first and third matrices are just transposes of one another. This is an illustration of the first part of the following theorem, which lists some important properties of triangular matrices:

Theorem.
(a) The transpose of an upper triangular matrix is lower triangular, and vice versa.
(b) The product of two upper triangular matrices is upper triangular, and the product of two lower triangular matrices is lower triangular.
(c) A triangular matrix is invertible if and only if each of its diagonal entries is nonzero.
(d) The inverse of an invertible upper triangular matrix is upper triangular, and the inverse of an invertible lower triangular matrix is lower triangular.
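Parts (a) and (b) are easy to test numerically. The sketch below (Python/NumPy, not part of the original notes) uses the first two example matrices shown above; np.triu and np.tril extract the upper and lower triangular parts of a matrix.

```python
import numpy as np

U1 = np.array([[1, 3, 0],
               [0, 2, 5],
               [0, 0, 1]])
U2 = np.array([[2, 0, 1],
               [0, 3, 0],
               [0, 0, 2]])

# (a) The transpose of an upper triangular matrix is lower triangular.
print(np.allclose(U1.T, np.tril(U1.T)))        # True

# (b) The product of two upper triangular matrices is upper triangular.
print(np.allclose(U1 @ U2, np.triu(U1 @ U2)))  # True
```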

Symmetric Matrices

The matrix

$$A = \begin{bmatrix} 6 & 3i & 1 \\ 3i & 4 & 0 \\ 1 & 0 & 5 \end{bmatrix}$$

has a special property, which you can discover by calculating $A^T$:

$$A^T = \begin{bmatrix} 6 & 3i & 1 \\ 3i & 4 & 0 \\ 1 & 0 & 5 \end{bmatrix}.$$

Notice that A and $A^T$ are the same matrix, that is, $A = A^T$. As you may have suspected, we have a name for such special types of matrices:

Definition. A square matrix A is symmetric if $A^T = A$. If A is symmetric, then $a_{ij} = a_{ji}$.

The following theorem lists some interesting properties of symmetric matrices:

Theorem. If A and B are symmetric $n \times n$ matrices, and k is any scalar, then:
(a) $A^T$ is symmetric.
(b) $A + B$ and $A - B$ are symmetric.
(c) $kA$ is symmetric.

Proof. Let's prove part (b) of the theorem. We'd like to show that, if A and B are symmetric, then so is $A + B$. Of course, if A and B are symmetric, then we know that $A^T = A$ and $B^T = B$. Now, to show that $A + B$ is symmetric, we need to be convinced that

$$(A + B)^T = A + B.$$

Let's check that this is the case:

$$(A + B)^T = A^T + B^T,$$

since the transpose of a sum is the sum of the transposes. Fortunately, combining this with the observation that $A^T = A$ and $B^T = B$ gives us

$$(A + B)^T = A^T + B^T = A + B.$$

Thus $(A + B)^T = A + B$, as we hoped, which means that $A + B$ is a symmetric matrix.
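These closure properties are easy to sanity-check numerically. The sketch below (Python/NumPy, not part of the original notes) tests parts (b) and (c) using the symmetric matrix from the example above; the second matrix B and the scalar k are made up for the check.

```python
import numpy as np

def is_symmetric(M):
    """A matrix is symmetric when it equals its transpose."""
    return np.allclose(M, M.T)

A = np.array([[6, 3j, 1],
              [3j, 4, 0],
              [1, 0, 5]])
B = np.array([[1, 2, 0],
              [2, 0, 4],
              [0, 4, 3]])
k = 2.5

print(is_symmetric(A + B), is_symmetric(A - B))  # True True   (part (b))
print(is_symmetric(k * A))                       # True        (part (c))
```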

Hermitian Matrices

Recall that the conjugate transpose $A^*$ of a matrix A is the matrix $A^* = [\overline{a_{ji}}]$, whose $(i,j)$th entry is the complex conjugate of the $(j,i)$th entry of A.

Definition. A square matrix A is hermitian if $A^* = A$; that is, if $a_{ij} = \overline{a_{ji}}$.

The matrix

$$A = \begin{bmatrix} 1 & 1 + i \\ 1 - i & 0 \end{bmatrix}$$

is hermitian since

$$A^* = \begin{bmatrix} 1 & 1 + i \\ 1 - i & 0 \end{bmatrix} = A.$$

On the other hand, the matrix

$$A = \begin{bmatrix} 6 & 3i & 1 \\ 3i & 4 & 0 \\ 1 & 0 & 5 \end{bmatrix},$$

while symmetric ($A^T = A$), is not hermitian:

$$A^* = \begin{bmatrix} 6 & -3i & 1 \\ -3i & 4 & 0 \\ 1 & 0 & 5 \end{bmatrix} \neq A.$$

Even though A from the last example is symmetric, it is not hermitian; this is due to the signs of the complex entries in A.
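The distinction is easy to see numerically. The sketch below (Python/NumPy, not part of the original notes) checks both example matrices above, using the conjugate transpose directly.

```python
import numpy as np

def is_hermitian(M):
    """A matrix is hermitian when it equals its conjugate transpose."""
    return np.allclose(M, M.conj().T)

H = np.array([[1, 1 + 1j],
              [1 - 1j, 0]])
A = np.array([[6, 3j, 1],
              [3j, 4, 0],
              [1, 0, 5]])

print(is_hermitian(H))      # True
print(np.allclose(A, A.T))  # True: A is symmetric
print(is_hermitian(A))      # False: A is not hermitian
```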

However, if a matrix A is symmetric and has strictly real entries, then A is automatically (trivially) hermitian, as indicated by the following theorem:

Theorem. A real symmetric matrix A is also hermitian.

Proof. Let $A = [a_{ij}]$ be real symmetric. Since A is symmetric, we know that $a_{ij} = a_{ji}$. In addition, each entry of A is real, so $\overline{a_{ij}} = a_{ij}$. Combining these observations, we see that

$$a_{ij} = a_{ji} = \overline{a_{ji}},$$

so that A is hermitian.
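As a final numerical illustration (a Python/NumPy sketch, not part of the original notes; the matrix is randomly generated), a real symmetric matrix passes the hermitian test as well.

```python
import numpy as np

rng = np.random.default_rng(0)
M = rng.standard_normal((4, 4))
S = M + M.T                        # a real symmetric matrix

print(np.allclose(S, S.T))         # True: symmetric
print(np.allclose(S, S.conj().T))  # True: hermitian, since every entry is real
```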