Solving Homogeneous Systems with Sub-matrices


Pure Mathematical Sciences, Vol. 7, 2018, no. 1, 11-18
HIKARI Ltd, www.m-hikari.com
https://doi.org/10.12988/pms.2018.843

Solving Homogeneous Systems with Sub-matrices

Massoud Malek

Mathematics, California State University East Bay, Hayward, CA 94542, USA

Copyright © 2018 Massoud Malek. This article is distributed under the Creative Commons Attribution License, which permits unrestricted use, distribution, and reproduction in any medium, provided the original work is properly cited.

Abstract

We show that linearly independent solutions of M X = θ, where M is an m × n matrix, may be found by using the largest non-singular sub-matrix of M. With this method we may also obtain eigenvectors and generalized eigenvectors corresponding to an eigenvalue. Finally, we shall explain how to construct a generalized modal matrix, in order to obtain a Jordan canonical form of a square matrix without solving any system of equations except for finding ranks.

Mathematics Subject Classification: 15B51

Keywords: Eigenvalues and eigenvectors; eigenspace; defective eigenvalue; defective matrix; generalized eigenvectors; generalized modal matrix; Jordan canonical form

1 Introduction

In all that follows, the n × n identity matrix is denoted by I_n. A permutation matrix P is obtained from the identity matrix by permuting some of its rows or columns. The zero column vector is denoted by θ, and the m × n zero matrix is denoted by Z_{m×n}.

Let λ be an eigenvalue of the n × n real or complex matrix A. An eigenvector of A corresponding to the eigenvalue λ is a non-trivial solution of

    ( A - λ I_n ) u = θ.

The set of all such vectors is called the eigenspace corresponding to the eigenvalue λ. The algebraic multiplicity of an eigenvalue is its multiplicity as a root of the characteristic polynomial, and its geometric multiplicity is the dimension of its eigenspace. An eigenvalue is said to be defective if its geometric multiplicity is less than its algebraic multiplicity. Matrices with some defective eigenvalues are called defective; these matrices are not diagonalizable.

If λ is defective, then for an integer k > 1, any nonzero vector u(λ, k) satisfying

    A_λ^k u(λ, k) = θ    with    A_λ^{k-1} u(λ, k) ≠ θ,    where A_λ = A - λ I_n,

is called a generalized eigenvector of order k corresponding to the eigenvalue λ. Clearly, a generalized eigenvector of order one is just an eigenvector.

In this paper, we show that linearly independent solutions of the homogeneous linear system M X = θ, where M is an m × n matrix, may be obtained by using the largest non-singular sub-matrix of M. The same method may be used to find all the eigenvectors and generalized eigenvectors of a square matrix corresponding to an eigenvalue.

If the rank of the m × n matrix M is n, then the nullity of M is zero; this clearly implies that θ is the only solution of M X = θ. Given an m × n matrix M of rank h < n, we partition the matrix M, or N = P M Q (permutation-equivalent to M), into

    M = [ M11  M12 ]    or    N = P M Q = [ N11  N12 ]
        [ M21  M22 ]                      [ N21  N22 ],

where at least one of the blocks M_ij or N_ij is a non-singular h × h matrix.

2 Results

Let U be an n × (n-h) matrix whose columns represent n-h linearly independent solutions of M X = θ. Clearly U is not unique. Our main result explains how to find U by using the inverse of an h × h block of M.

Theorem 1. Suppose the rank of the m × n matrix M is h < n.

1. If the block M11 is a non-singular h × h matrix, then

       U = [ -M11^{-1} M12 ]
           [    I_{n-h}    ]

2. If the block M12 is a non-singular h × h matrix, then

       U = [    I_{n-h}    ]
           [ -M12^{-1} M11 ]

3. If the block M21 is a non-singular h × h matrix, then

       U = [ -M21^{-1} M22 ]
           [    I_{n-h}    ]

4. If the block M22 is a non-singular h × h matrix, then

       U = [    I_{n-h}    ]
           [ -M22^{-1} M21 ]

Proof. Clearly, the rank of any n × (n-h) matrix containing I_{n-h} is n-h.

(1) We have

    [ M11  M12 ] [ -M11^{-1} M12 ]  =  -M12 + M12  =  Z_{h×(n-h)}.
                 [    I_{n-h}    ]

Since each row of [ M21  M22 ] is a linear combination of the rows of [ M11  M12 ], we have

    M U = [ M11  M12 ] [ -M11^{-1} M12 ]  =  Z_{m×(n-h)}.
          [ M21  M22 ] [    I_{n-h}    ]

Thus the columns of U are n-h linearly independent solutions of M X = θ. The proofs of (2) through (4) are similar to the proof of the first part.

The matrix

    M = [ 1  1  0  0 ]
        [ 1  1  1  1 ]
        [ 0  0  1  1 ]

of rank 2 cannot be partitioned into

    M = [ M11  M12 ]
        [ M21  M22 ],

where one of the M_ij blocks is a 2 × 2 non-singular sub-matrix. But by interchanging the second and the third columns, we may obtain a matrix with at least one such block. In our next corollary, we address this case.

Corollary 1. Let P and Q be two permutation matrices which make the block N11 of the matrix

    N = P M Q = [ N11  N12 ]
                [ N21  N22 ]

an h × h invertible block. Then U = Q V, where

    V = [ -N11^{-1} N12 ]
        [    I_{n-h}    ]

Proof. By applying the proof of Theorem 1, part (1), to the matrix N, we obtain N V = Z_{m×(n-h)}. The remainder of the proof then follows from

    M U = M ( Q V ) = ( P^t N Q^t )( Q V ) = P^t ( N V ) = P^t Z_{m×(n-h)} = Z_{m×(n-h)}.

Remark 1. If the matrix N in Corollary 1 is obtained from M by using only row operations (i.e., Q = I_n), then U = V.

Eigenvectors. Let λ be an eigenvalue of the n × n matrix A with geometric multiplicity m, and let U_A(λ) be an n × m matrix whose columns represent m linearly independent eigenvectors of A corresponding to the eigenvalue λ. Since the eigenvectors of A corresponding to λ are the solutions of the homogeneous linear system ( A - λ I_n ) u = θ, by replacing the matrix M with A_λ = A - λ I_n and h with n - m in Theorem 1 or Corollary 1, one may obtain U_A(λ).

Remark 2. With the conventional method, in order to obtain m linearly independent eigenvectors of a matrix corresponding to an eigenvalue, one must first solve a homogeneous system of linear equations which has infinitely many solutions, and then verify that the chosen solutions are linearly independent. In contrast, the matrix U_A(λ) gives all the linearly independent eigenvectors associated with the eigenvalue at once. Also, the higher the geometric multiplicity of λ, the easier the task of finding linearly independent eigenvectors becomes, since the non-singular block to be inverted is smaller.

Generalized Eigenvectors. Since generalized eigenvectors are solutions of some homogeneous linear systems, Theorem 1 or Corollary 1 may be used to find them. Here is how:

Step 1. Compute the matrix A_λ^k = ( A - λ I_n )^k.

Step 2. Find r_k, the rank of the matrix A_λ^k.

Step 3. If r_k = 0 (i.e., the matrix A_λ^k is the zero matrix), then any vector v satisfying A_λ^{k-1} v ≠ θ is a generalized eigenvector of A of order k. If not, replace the matrix M with A_λ^k and h with r_k in Theorem 1 or Corollary 1; any column v of U_A(λ, k) satisfying A_λ^{k-1} v ≠ θ represents a generalized eigenvector of order k of A corresponding to the eigenvalue λ.
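The block construction of Theorem 1 / Corollary 1 and Steps 1-3 above can be sketched numerically. The following Python/NumPy helpers are our own illustration, not part of the paper (the names `null_basis_via_block` and `gen_eigvecs` are ours); a greedy choice of independent rows and columns plays the role of the permutation matrices P and Q, so that only an h × h block is ever inverted:

```python
import numpy as np

def null_basis_via_block(M, tol=1e-9):
    """n-h independent solutions of M X = 0 (Theorem 1 / Corollary 1):
    locate an h x h non-singular block via permutations, invert only it."""
    m, n = M.shape
    h = np.linalg.matrix_rank(M, tol)
    if h == n:
        return np.zeros((n, 0))       # full column rank: only X = 0
    if h == 0:
        return np.eye(n)              # M is the zero matrix
    # Greedily pick h independent rows, then h independent columns of those
    # rows; this plays the role of the permutations P and Q.
    rows = []
    for i in range(m):
        if np.linalg.matrix_rank(M[rows + [i], :], tol) > len(rows):
            rows.append(i)
        if len(rows) == h:
            break
    R = M[rows, :]
    cols = []
    for j in range(n):
        if np.linalg.matrix_rank(R[:, cols + [j]], tol) > len(cols):
            cols.append(j)
        if len(cols) == h:
            break
    other = [j for j in range(n) if j not in cols]
    N11, N12 = R[:, cols], R[:, other]
    V = np.vstack([-np.linalg.solve(N11, N12), np.eye(n - h)])
    U = np.zeros((n, n - h))
    U[cols, :], U[other, :] = V[:h, :], V[h:, :]   # U = Q V: undo permutation
    return U

def gen_eigvecs(A, lam, k):
    """Steps 1-3: basis of the null space of (A - lam I)^k."""
    n = A.shape[0]
    Ak = np.linalg.matrix_power(A - lam * np.eye(n), k)   # Step 1
    return null_basis_via_block(Ak)                       # Steps 2-3

# The paper's rank-2 example, where a column interchange is needed:
M = np.array([[1., 1., 0., 0.],
              [1., 1., 1., 1.],
              [0., 0., 1., 1.]])
U = null_basis_via_block(M)
print(np.allclose(M @ U, 0), np.linalg.matrix_rank(U))   # True 2
```

The greedy rank test is only a convenient stand-in for the paper's explicit choice of P and Q; any selection of h independent rows and columns works.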

Jordan Canonical Form. Any n × n defective matrix A can be put in Jordan canonical form by a similarity transformation, i.e.,

    M^{-1} A M = J = [ J_1             ]
                     [      J_2        ]
                     [           .     ]
                     [             J_m ]

where each block

    J_i = [ λ_i   1          ]
          [     λ_i   .      ]
          [          .    1  ]
          [             λ_i  ]

is of size n_i × n_i, with n_1 + n_2 + ... + n_m = n. The blocks J_1, J_2, ..., J_m are called Jordan blocks. The matrix M is called a generalized modal matrix for A and is obtained from eigenvectors and generalized eigenvectors of the matrix A. The order of the largest Jordan block of A corresponding to an eigenvalue λ is called the index of λ. It is the smallest value of k ∈ N such that

    rank ( A - λ I_n )^k = rank ( A - λ I_n )^{k+1}.

Let k be the smallest positive integer such that A_λ^k u = θ. Then the sequence

    A_λ^{k-1} u, A_λ^{k-2} u, ..., A_λ^2 u, A_λ u, u

is called a Jordan chain of linearly independent generalized eigenvectors of length k. With the conventional method, to find a Jordan chain of length k corresponding to a defective eigenvalue, one must solve the equation A_λ v = u. If there are several eigenvectors associated with the defective eigenvalue, then it is not always clear which eigenvector must be chosen to solve a non-homogeneous linear system in order to produce the generalized eigenvector. Instead of starting with an eigenvector, which may or may not produce a Jordan chain of length k, we start with the matrix A_λ^k, which has a smaller non-singular block than A_λ. The matrix U_A(λ, k) has a column j such that the j-th columns of A_λ^i U_A(λ, k), for i = 1, 2, ..., k-1, together with the j-th column of U_A(λ, k) itself, produce a Jordan chain of length k. We shall explain this procedure in more detail with two examples.

Example 1. Consider the matrix

    A = [ 5  -4  -6   9 ]
        [ 1   0  -6   9 ]
        [ 1  -4  -2   9 ]
        [ 1  -4  -6  13 ]

with characteristic polynomial K_A(λ) = ( 4 - λ )^4. Thus λ_1 = λ_2 = λ_3 = λ_4 = 4.

Let A_4 = A - 4 I_4. Then

    A_4 = [ 1  -4  -6  9 ]        and    A_4^2 = Z_{4×4}.
          [ 1  -4  -6  9 ]
          [ 1  -4  -6  9 ]
          [ 1  -4  -6  9 ]

Since the rank of A_4 is one, the geometric multiplicity of λ = 4 in A is three; so according to Theorem 1, we need a 1 × 1 non-singular block of A_4. Taking M11 = [ 1 ], we obtain

    U_A(4, 1) = [ 4  6  -9 ]
                [ 1  0   0 ]
                [ 0  1   0 ]
                [ 0  0   1 ]

so that u_1 = [ 4  1  0  0 ]^t, u_2 = [ 6  0  1  0 ]^t, and u_3 = [ -9  0  0  1 ]^t.

Now, to obtain a generalized modal matrix of A, we need a generalized eigenvector associated with one of the above eigenvectors. With the conventional method, we would have to solve one of the following matrix equations:

    ( A - 4 I_4 ) v = u_i    for i = 1, 2, 3.

Notice that all the rows of the matrix A_4 = A - 4 I_4 are the same, so all the entries of ( A - 4 I_4 ) v are equal, while no u_i has all its entries equal. This means that we must find another eigenvector in order to produce a generalized modal matrix. Since A_4^2 is a zero matrix, any non-zero vector which is not an eigenvector of A is a generalized eigenvector of order two. So we may choose, for example, the vector e_1 = [ 1  0  0  0 ]^t as our generalized eigenvector. Then from e_1, we get a new eigenvector v = A_4 e_1 = [ 1  1  1  1 ]^t. By using v, e_1, u_2, and u_3, we construct the generalized modal matrix M = [ v  e_1  u_2  u_3 ]. Then we have

    M = [ 1  1  6  -9 ]        with    M^{-1} A M = J(A) = [ 4  1  0  0 ]
        [ 1  0  0   0 ]                                    [ 0  4  0  0 ]
        [ 1  0  1   0 ]                                    [ 0  0  4  0 ]
        [ 1  0  0   1 ]                                    [ 0  0  0  4 ]

We conclude our paper with an example of a defective matrix with different eigenvalues.
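Example 1 can be checked numerically. The following NumPy sketch (our own verification, not part of the paper) rebuilds the generalized modal matrix and recovers the Jordan form:

```python
import numpy as np

# Example 1's matrix and its distinguished eigenvalue lambda = 4.
A = np.array([[5., -4., -6.,  9.],
              [1.,  0., -6.,  9.],
              [1., -4., -2.,  9.],
              [1., -4., -6., 13.]])
A4 = A - 4 * np.eye(4)

e1 = np.array([1., 0., 0., 0.])               # order-two generalized eigenvector
v = A4 @ e1                                   # new eigenvector [1, 1, 1, 1]
u2 = np.array([6., 0., 1., 0.])
u3 = np.array([-9., 0., 0., 1.])

M = np.column_stack([v, e1, u2, u3])          # generalized modal matrix
J = np.linalg.inv(M) @ A @ M
print(np.round(J).astype(int))                # one 2x2 block [[4,1],[0,4]], two 1x1 blocks [4]
```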

Example 2. Consider the matrix

    A = [ 1  0  0  0  0 ]
        [ 3  1  0  0  0 ]
        [ 4  3  2  0  0 ]
        [ 5  4  3  2  0 ]
        [ 6  5  4  3  2 ]

with characteristic polynomial K_A(λ) = ( 1 - λ )^2 ( 2 - λ )^3. Define

    A_1 = A - I_5 = [ 0  0  0  0  0 ]    and    A_2 = A - 2 I_5 = [ -1   0  0  0  0 ]
                    [ 3  0  0  0  0 ]                             [  3  -1  0  0  0 ]
                    [ 4  3  1  0  0 ]                             [  4   3  0  0  0 ]
                    [ 5  4  3  1  0 ]                             [  5   4  3  0  0 ]
                    [ 6  5  4  3  1 ]                             [  6   5  4  3  0 ]

The geometric multiplicities of both λ_1 = 1 and λ_2 = 2 are one. So we need

    A_1^2 = [  0   0   0  0  0 ]    and    A_2^3 = [  -1   0  0  0  0 ]
            [  0   0   0  0  0 ]                   [   9  -1  0  0  0 ]
            [ 13   3   1  0  0 ]                   [ -14   3  0  0  0 ]
            [ 29  13   6  1  0 ]                   [  -4  -5  0  0  0 ]
            [ 52  29  17  6  1 ]                   [  53   8  0  0  0 ]

By using Theorem 1 for A_1^2 and A_2^3, we obtain

    V = U_A(1, 2) = [    1    0 ]    and    W = U_A(2, 3) = [ 0  0  0 ]
                    [    0    1 ]                           [ 0  0  0 ]
                    [  -13   -3 ]                           [ 1  0  0 ]
                    [   49    5 ]                           [ 0  1  0 ]
                    [ -125   -8 ]                           [ 0  0  1 ]

We have

    A_1 V = [   0  0 ]    A_2 W = [ 0  0  0 ]    A_2^2 W = [ 0  0  0 ]
            [   3  0 ]            [ 0  0  0 ]              [ 0  0  0 ]
            [  -9  0 ]            [ 0  0  0 ]              [ 0  0  0 ]
            [  15  0 ]            [ 3  0  0 ]              [ 0  0  0 ]
            [ -24  0 ]            [ 4  3  0 ]              [ 9  0  0 ]
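As a numerical sanity check (our own NumPy sketch, not part of the paper), the columns of V and W above do annihilate A_1^2 and A_2^3, and A_1 V exposes the chain eigenvector for λ = 1:

```python
import numpy as np

A = np.array([[1., 0., 0., 0., 0.],
              [3., 1., 0., 0., 0.],
              [4., 3., 2., 0., 0.],
              [5., 4., 3., 2., 0.],
              [6., 5., 4., 3., 2.]])
A1, A2 = A - np.eye(5), A - 2 * np.eye(5)

V = np.array([[1., 0.], [0., 1.], [-13., -3.], [49., 5.], [-125., -8.]])
W = np.eye(5)[:, 2:]                  # columns e3, e4, e5

print(np.allclose(np.linalg.matrix_power(A1, 2) @ V, 0))   # True
print(np.allclose(np.linalg.matrix_power(A2, 3) @ W, 0))   # True
print((A1 @ V)[:, 0])                 # chain eigenvector [0, 3, -9, 15, -24]
```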

To construct a generalized modal matrix of A from V and W, we must use the first columns of A_1 V, V, A_2^2 W, A_2 W, and W, in that order:

    M = [   0     1  0  0  0 ]    with    M^{-1} A M = J(A) = [ 1  1  0  0  0 ]
        [   3     0  0  0  0 ]                                [ 0  1  0  0  0 ]
        [  -9   -13  0  0  1 ]                                [ 0  0  2  1  0 ]
        [  15    49  0  3  0 ]                                [ 0  0  0  2  1 ]
        [ -24  -125  9  4  0 ]                                [ 0  0  0  0  2 ]

Notice that no linear system was solved, and there was no need to find any eigenvector of the matrix A directly.

Received: April 29, 2018; Published: June 28, 2018
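Finally, the generalized modal matrix of Example 2 can be verified the same way (again a NumPy sketch of our own, not part of the paper):

```python
import numpy as np

A = np.array([[1., 0., 0., 0., 0.],
              [3., 1., 0., 0., 0.],
              [4., 3., 2., 0., 0.],
              [5., 4., 3., 2., 0.],
              [6., 5., 4., 3., 2.]])
A1, A2 = A - np.eye(5), A - 2 * np.eye(5)

v = np.array([1., 0., -13., 49., -125.])      # first column of V = U_A(1, 2)
w = np.array([0., 0., 1., 0., 0.])            # first column of W = U_A(2, 3)

# Jordan chains [A1 v, v] for lambda = 1 and [A2^2 w, A2 w, w] for lambda = 2.
M = np.column_stack([A1 @ v, v, A2 @ (A2 @ w), A2 @ w, w])
J = np.linalg.inv(M) @ A @ M
print(np.round(J).astype(int))                # diag(J_2(1), J_3(2))
```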