Lecture 2 INF-MAT 4350 2008: A boundary value problem and an eigenvalue problem; Block Multiplication; Tridiagonal Systems Tom Lyche Centre of Mathematics for Applications, Department of Informatics, University of Oslo August 30, 2008
Plan for the day

A two point boundary value problem: the finite difference scheme; the second derivative matrix T; weakly diagonally dominant tridiagonal matrices.
An eigenvalue problem: the finite difference method; eigenvalues and eigenvectors (eigenpairs) of T.
Block multiplication.
Properties of triangular matrices.
A boundary value problem

$$-u''(x) = f(x), \quad x \in [0,1], \qquad u(0) = 0, \quad u(1) = 0,$$

where f is a given continuous function on [0, 1]. Solve using a finite difference method:
Choose a positive integer m.
Define the discretization parameter h := 1/(m + 1).
Replace the interval [0, 1] by the grid points x_j := jh for j = 0, 1, ..., m + 1.
Replace the derivative with the finite difference approximation
$$\frac{u(x-h) - 2u(x) + u(x+h)}{h^2} = u''(x) + \frac{h^2}{12}\, u^{(4)}(\xi)$$
for some ξ ∈ (x − h, x + h).
Tridiagonal linear system

With v_j ≈ u(jh) for all j:
$$-v_{j-1} + 2v_j - v_{j+1} = h^2 f(jh), \qquad j = 1, \dots, m.$$

Linear system Tv = b, where
$$T := \begin{pmatrix} 2 & -1 & & & 0\\ -1 & 2 & -1 & & \\ & \ddots & \ddots & \ddots & \\ & & -1 & 2 & -1\\ 0 & & & -1 & 2 \end{pmatrix} \in \mathbb{R}^{m,m}$$
is called the second derivative matrix. T is not strictly diagonally dominant; it is weakly diagonally dominant.
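As a quick numerical check (an illustration, not part of the lecture), we can assemble T, solve Tv = h²f(x_j) for a model problem with known solution, and observe the O(h²) accuracy. The choice f(x) = π² sin(πx), with exact solution u(x) = sin(πx), is our own test case.

```python
import numpy as np

# Illustration: solve -u'' = f, u(0) = u(1) = 0 with the finite
# difference scheme T v = h^2 f(x_j). Test case (our assumption):
# f(x) = pi^2 sin(pi x), whose exact solution is u(x) = sin(pi x).

m = 50
h = 1.0 / (m + 1)
x = np.arange(1, m + 1) * h                       # interior grid points x_j = jh

# Second derivative matrix T = tridiag(-1, 2, -1)
T = 2 * np.eye(m) - np.eye(m, k=1) - np.eye(m, k=-1)

f = np.pi**2 * np.sin(np.pi * x)
v = np.linalg.solve(T, h**2 * f)                  # discrete solution v_j ≈ u(x_j)

u_exact = np.sin(np.pi * x)
err = np.max(np.abs(v - u_exact))                 # O(h^2) discretization error
```

For a dense solve this is fine at small m; in practice one would exploit the tridiagonal structure (e.g. the LU recurrence discussed below).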
Weak diagonal dominance

Definition. The tridiagonal matrix
$$A := \mathrm{tridiag}(a_i, d_i, c_i) = \begin{pmatrix} d_1 & c_1 & & & \\ a_2 & d_2 & c_2 & & \\ & \ddots & \ddots & \ddots & \\ & & a_{n-1} & d_{n-1} & c_{n-1}\\ & & & a_n & d_n \end{pmatrix}$$
is weakly diagonally dominant if
$$|d_1| > |c_1|, \qquad |d_n| > |a_n|, \qquad |d_k| \ge |a_k| + |c_k|, \quad k = 2, 3, \dots, n-1.$$
Irreducibility

T is weakly diagonally dominant, but is it nonsingular? The matrix
$$A_1 = \begin{pmatrix} 2 & -1 & 0\\ 0 & 0 & 0\\ 0 & -1 & 2 \end{pmatrix}$$
is weakly diagonally dominant and singular.

A tridiagonal matrix tridiag(a_i, d_i, c_i) is irreducible if and only if all the a_i and c_i are nonzero. A matrix which is not irreducible is called reducible. The matrix T is irreducible, while the matrix A_1 is reducible.

Theorem. Suppose A is tridiagonal, weakly diagonally dominant, and irreducible. Then A is nonsingular and has a unique LU-factorization A = LR.
Proof

We use the LU-factorization algorithm and show by induction on k that |r_k| > |c_k| for k = 1, 2, ..., n − 1 and that |r_n| > 0. Recall
$$r_1 = d_1, \qquad l_k = \frac{a_k}{r_{k-1}}, \quad r_k = d_k - l_k c_{k-1}, \qquad k = 2, 3, \dots, n.$$
Since |r_1| = |d_1| > |c_1|, the claim holds for k = 1. Suppose for some k ≤ n that |r_{k-1}| > |c_{k-1}|. Then
$$|r_k| = \left| d_k - \frac{a_k c_{k-1}}{r_{k-1}} \right| \ge |d_k| - |a_k| \frac{|c_{k-1}|}{|r_{k-1}|} > |d_k| - |a_k|,$$
so |r_k| > |c_k| for k ≤ n − 1 and |r_n| > 0 by weak diagonal dominance. Since both L and R exist and have nonzero diagonal entries, they are nonsingular; hence the product A = LR is nonsingular and the LU-factorization is unique.
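The recurrence in the proof is itself the factorization algorithm. The following sketch (our own, using the notation of the proof) computes the multipliers l_k and pivots r_k for a tridiagonal matrix and checks A = LR on the second derivative matrix:

```python
import numpy as np

# Sketch of the tridiagonal LU recurrence from the proof:
# r_1 = d_1,  l_k = a_k / r_{k-1},  r_k = d_k - l_k c_{k-1}.

def tridiag_lu(a, d, c):
    """a: subdiagonal (length n-1), d: diagonal (length n),
    c: superdiagonal (length n-1). Returns multipliers l, pivots r."""
    n = len(d)
    r = np.empty(n)
    l = np.empty(n - 1)
    r[0] = d[0]
    for k in range(1, n):
        l[k - 1] = a[k - 1] / r[k - 1]
        r[k] = d[k] - l[k - 1] * c[k - 1]
    return l, r

# Example: T = tridiag(-1, 2, -1) with n = 5
n = 5
a = -np.ones(n - 1); c = -np.ones(n - 1); d = 2 * np.ones(n)
l, r = tridiag_lu(a, d, c)

# Reassemble L (unit lower bidiagonal) and R (upper bidiagonal)
L = np.eye(n) + np.diag(l, k=-1)
R = np.diag(r) + np.diag(c, k=1)
A = np.diag(d) + np.diag(a, k=-1) + np.diag(c, k=1)
```

For T one can verify that the pivots are r_k = (k+1)/k, all positive, in agreement with the theorem.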
An eigenvalue problem

Consider a horizontal beam of length L located between 0 and L on the x-axis of the plane. We assume that the beam is fixed at x = 0 and x = L, and that a force F is applied at (L, 0) in the direction of the origin. Let y(x) be the vertical displacement of the beam at x.

Boundary value problem:
$$R\, y''(x) = -F\, y(x), \qquad y(0) = y(L) = 0,$$
where R is a constant determined by the rigidity of the beam.
The transformed system

Transform: define u : [0, 1] → ℝ by u(t) := y(tL). Eigenvalue problem:
$$-u''(t) = K u(t), \qquad u(0) = u(1) = 0, \qquad K := \frac{FL^2}{R}.$$
When F is increased it reaches a critical value where the beam buckles and maybe breaks. This critical value corresponds to the smallest eigenvalue of −u'' = Ku.

Approximate −u'' by T/h². Discrete eigenvalue problem: Tv = λv, where T is the second derivative matrix. Determine the eigenvalues of T.
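As a sanity check (our illustration, not from the slides): the smallest eigenvalue of the continuous problem −u'' = Ku with u(0) = u(1) = 0 is K = π², so the smallest eigenvalue of the discrete operator T/h² should approximate π² for large m:

```python
import numpy as np

# Illustration: smallest eigenvalue of T/h^2 approximates pi^2,
# the smallest eigenvalue of -u'' = K u, u(0) = u(1) = 0.

m = 100
h = 1.0 / (m + 1)
T = 2 * np.eye(m) - np.eye(m, k=1) - np.eye(m, k=-1)

# eigvalsh exploits symmetry and returns real eigenvalues in ascending order
lam_min = np.linalg.eigvalsh(T)[0] / h**2
```

The discretization error here is again O(h²), consistent with the truncation error of the difference scheme.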
Hermitian matrices

Complex number: $z = x + iy = r(\cos\varphi + i\sin\varphi) = re^{i\varphi}$.
Complex conjugate: $\bar z = x - iy = r(\cos\varphi - i\sin\varphi) = re^{-i\varphi}$.
A complex number z is real if and only if $\bar z = z$.
Absolute value: $|z| = \sqrt{z\bar z} = \sqrt{x^2 + y^2} = r$.
Hermitian transpose: if $A = [a_{ij}] \in \mathbb{C}^{m,n}$ then $A^H := [\bar a_{ji}] \in \mathbb{C}^{n,m}$; $(AB)^H = B^H A^H$. If $A \in \mathbb{R}^{m,n}$ then $A^T = A^H$.
A matrix is Hermitian if $A^H = A$ and symmetric if $A^T = A$.
Eigenpairs of Hermitian matrices

The eigenvalues of a Hermitian matrix are real, and eigenvectors corresponding to distinct eigenvalues are orthogonal.

Proof (real): Suppose $A^H = A$ and $Ax = \lambda x$ with $x \ne 0$. Then
$$\lambda = \frac{x^H A x}{x^H x}, \qquad \bar\lambda = \frac{(x^H A x)^H}{(x^H x)^H} = \frac{x^H A^H x}{x^H x} = \frac{x^H A x}{x^H x} = \lambda,$$
so λ is real.

Proof (orthogonal): Suppose $Ax = \lambda x$ and $Ay = \mu y$ with $\mu \ne \lambda$. Then
$$\lambda y^H x = y^H A x = (x^H A^H y)^H = (x^H A y)^H = (\mu x^H y)^H = \bar\mu\, y^H x = \mu\, y^H x.$$
Thus $(\lambda - \mu)\, y^H x = 0$ and $y^H x = 0$.
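Both claims are easy to observe numerically. The following sketch (our illustration) builds a random complex matrix, symmetrizes it into a Hermitian one, and checks that the eigenvalues are real and the eigenvectors mutually orthogonal:

```python
import numpy as np

# Illustration: eigenvalues of a Hermitian matrix are real; eigenvectors
# of distinct eigenvalues are orthogonal. A random Hermitian matrix has
# distinct eigenvalues with probability one.

rng = np.random.default_rng(0)
B = rng.standard_normal((4, 4)) + 1j * rng.standard_normal((4, 4))
A = (B + B.conj().T) / 2                      # A^H = A by construction

# Deliberately use the generic solver, which does not assume symmetry
lam, V = np.linalg.eig(A)
real_eigs = np.allclose(lam.imag, 0)

# Gram matrix of the eigenvectors: off-diagonal entries should vanish
G = V.conj().T @ V
orthogonal = np.allclose(G - np.diag(np.diag(G)), 0, atol=1e-6)
```

Using `np.linalg.eig` rather than `eigh` makes the point: realness and orthogonality come out of the matrix itself, not out of the solver.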
The Sine Matrix

$$S = \left[\sin\frac{jk\pi}{m+1}\right]_{j,k=1}^{m} \in \mathbb{R}^{m,m}.$$

For m = 3, with $t := \frac{1}{\sqrt 2}$:
$$S = \begin{pmatrix} \sin\frac{\pi}{4} & \sin\frac{2\pi}{4} & \sin\frac{3\pi}{4}\\[2pt] \sin\frac{2\pi}{4} & \sin\frac{4\pi}{4} & \sin\frac{6\pi}{4}\\[2pt] \sin\frac{3\pi}{4} & \sin\frac{6\pi}{4} & \sin\frac{9\pi}{4} \end{pmatrix} = \begin{pmatrix} t & 1 & t\\ 1 & 0 & -1\\ t & -1 & t \end{pmatrix}.$$

Columns $S = [s_1, \dots, s_m]$: here $s_1^T s_2 = s_1^T s_3 = s_2^T s_3 = 0$.
Eigenvalue Problem

$$C := \mathrm{tridiag}(a, b, a) = \begin{pmatrix} b & a & 0 & \cdots & 0\\ a & b & a & & \vdots\\ & \ddots & \ddots & \ddots & \\ \vdots & & a & b & a\\ 0 & \cdots & 0 & a & b \end{pmatrix} \in \mathbb{R}^{m,m}.$$

a = −1, b = 2: the second derivative matrix T. a = 1, b = 4: the spline matrix N_1. C is symmetric, $C^T = C$; $c_{k,j} = 0$ except for $c_{k,k-1} = c_{k,k+1} = a$ and $c_{k,k} = b$.

Show that in general we have
$$C s_j = \lambda_j s_j, \quad j = 1, \dots, m, \qquad \lambda_j = b + 2a\cos(j\pi h), \quad h = 1/(m+1).$$
Eigenpairs of C

Let $s_{k,j} = \sin(kj\pi h)$ be the kth entry of $s_j$. With $A := kj\pi h$ and $B := j\pi h$ we find
$$(C s_j)_k = \sum_{l=1}^{m} c_{k,l}\, s_{l,j} = \sum_{l=k-1}^{k+1} c_{k,l}\, s_{l,j} = a\, s_{k-1,j} + b\, s_{k,j} + a\, s_{k+1,j}$$
$$= a\sin\bigl((k-1)j\pi h\bigr) + b\sin\bigl(kj\pi h\bigr) + a\sin\bigl((k+1)j\pi h\bigr)$$
$$= a\sin(A - B) + b\sin A + a\sin(A + B) = 2a\cos B\,\sin A + b\sin A = (b + 2a\cos B)\sin A = \lambda_j s_{k,j},$$
and $\lambda_j = b + 2a\cos(j\pi h)$ follows. Since $j\pi h = j\pi/(m+1) \in (0, \pi)$ for j = 1, ..., m and the cosine function is strictly monotone on (0, π), the eigenvalues are distinct. Since C is symmetric, it follows from the lemma on Hermitian matrices that the eigenvectors $s_j$ are orthogonal.
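The eigenpair formula can be checked directly. This sketch (our illustration) uses a = −1, b = 2, i.e. the second derivative matrix T, and verifies C s_j = λ_j s_j for every j:

```python
import numpy as np

# Illustration: verify C s_j = lambda_j s_j with
# lambda_j = b + 2a cos(j*pi*h) for C = tridiag(a, b, a).
# The values a = -1, b = 2 give the second derivative matrix T.

m = 8
h = 1.0 / (m + 1)
a, b = -1.0, 2.0
C = b * np.eye(m) + a * np.eye(m, k=1) + a * np.eye(m, k=-1)

k = np.arange(1, m + 1)
ok = True
for j in range(1, m + 1):
    s_j = np.sin(k * j * np.pi * h)            # eigenvector: s_{k,j} = sin(kj*pi*h)
    lam_j = b + 2 * a * np.cos(j * np.pi * h)  # eigenvalue lambda_j
    ok = ok and np.allclose(C @ s_j, lam_j * s_j)
```

Note that the eigenvectors s_j do not depend on a and b; only the eigenvalues do, so the same check works for the spline matrix (a = 1, b = 4).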
Scaling of eigenvectors

$$\|s_j\|_2^2 := s_j^T s_j = \frac{m+1}{2} \qquad \text{for } j = 1, \dots, m.$$

Indeed,
$$s_j^T s_j = \sum_{k=1}^{m} \sin^2(kj\pi h) = \sum_{k=0}^{m} \sin^2(kj\pi h) = \frac{1}{2}\sum_{k=0}^{m}\bigl(1 - \cos(2kj\pi h)\bigr) = \frac{m+1}{2} - \frac{1}{2}\sum_{k=0}^{m} \cos(2kj\pi h).$$
The last cosine sum is zero. We show this by summing a geometric series of complex exponentials:
$$\sum_{k=0}^{m} \cos(2kj\pi h) + i\sum_{k=0}^{m} \sin(2kj\pi h) = \sum_{k=0}^{m} e^{2ikj\pi h} = \frac{e^{2i(m+1)j\pi h} - 1}{e^{2ij\pi h} - 1} = \frac{e^{2\pi i j} - 1}{e^{2ij\pi h} - 1} = 0.$$
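A consequence worth checking numerically (our illustration): since every column of the sine matrix has squared length (m+1)/2 and the columns are orthogonal, the scaled matrix √(2/(m+1)) S is orthogonal:

```python
import numpy as np

# Illustration: each eigenvector s_j satisfies ||s_j||^2 = (m+1)/2,
# so sqrt(2/(m+1)) * S has orthonormal columns.

m = 7
h = 1.0 / (m + 1)
jj, kk = np.meshgrid(np.arange(1, m + 1), np.arange(1, m + 1), indexing="ij")
S = np.sin(jj * kk * np.pi * h)               # sine matrix; column j is s_j

norms_sq = np.sum(S**2, axis=0)               # should all equal (m+1)/2
Q = np.sqrt(2.0 / (m + 1)) * S                # should satisfy Q^T Q = I
```

Since S is also symmetric, this means S² = ((m+1)/2) I, so S is, up to scaling, its own inverse.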
Column partition of a matrix product

Matrix-vector product:
$$Ax = \begin{pmatrix} \sum_j a_{1j} x_j\\ \vdots\\ \sum_j a_{mj} x_j \end{pmatrix} = \sum_{j=1}^{n} x_j \begin{pmatrix} a_{1j}\\ \vdots\\ a_{mj} \end{pmatrix} = \sum_{j=1}^{n} x_j a_j.$$
In particular, $Ae_j = a_j$, the jth column of A.

Product: $(AB)e_j = A(Be_j) = Ab_j$, so $AB = [Ab_1, \dots, Ab_n]$; AB is partitioned by columns.

Partition by rows:
$$AB = \begin{pmatrix} a_{1\cdot}^T B\\ a_{2\cdot}^T B\\ \vdots\\ a_{m\cdot}^T B \end{pmatrix}.$$
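Both views are easy to confirm with a small example (our illustration): building AB column by column as A b_j, and row by row as a_i^T B, gives the ordinary product.

```python
import numpy as np

# Illustration: column view AB = [A b_1, ..., A b_n] and
# row view AB = [a_1^T B; ...; a_m^T B] of a matrix product.

rng = np.random.default_rng(1)
A = rng.standard_normal((3, 4))
B = rng.standard_normal((4, 2))

# Column j of AB is A @ (column j of B)
AB_cols = np.column_stack([A @ B[:, j] for j in range(B.shape[1])])

# Row i of AB is (row i of A) @ B
AB_rows = np.vstack([A[i, :] @ B for i in range(A.shape[0])])
```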
Partitioned matrices

A rectangular matrix A can be partitioned into submatrices by drawing horizontal lines between selected rows and vertical lines between selected columns. For example,
$$A = \begin{pmatrix} 1 & 2 & 3\\ 4 & 5 & 6\\ 7 & 8 & 9 \end{pmatrix}$$
can be partitioned as
(i) a 2×2 block form $\begin{pmatrix} A_{11} & A_{12}\\ A_{21} & A_{22} \end{pmatrix}$,
(ii) a column partition $[a_{\cdot 1}, a_{\cdot 2}, a_{\cdot 3}]$,
(iii) a row partition $\begin{pmatrix} a_{1\cdot}^T\\ a_{2\cdot}^T\\ a_{3\cdot}^T \end{pmatrix}$,
(iv) a partition into two column blocks $[A_{11}, A_{12}]$.

The submatrices in a partition are often referred to as blocks, and a partitioned matrix is sometimes called a block matrix.
Block multiplication 1

If $B = [B_1, B_2]$ is partitioned by columns, then
$$AB = [Ab_1, \dots, Ab_r, Ab_{r+1}, \dots, Ab_n] = [AB_1, AB_2].$$
Block multiplication 2

If $A = [A_1, A_2]$ is partitioned by columns and $B = \begin{pmatrix} B_1\\ B_2 \end{pmatrix}$ conformally by rows, then $AB = A_1 B_1 + A_2 B_2$:
$$(AB)_{ij} = \sum_{k=1}^{p} a_{ik} b_{kj} = \sum_{k=1}^{s} a_{ik} b_{kj} + \sum_{k=s+1}^{p} a_{ik} b_{kj} = (A_1 B_1)_{ij} + (A_2 B_2)_{ij} = (A_1 B_1 + A_2 B_2)_{ij}.$$
The general case

If
$$A = \begin{pmatrix} A_{11} & \cdots & A_{1s}\\ \vdots & & \vdots\\ A_{p1} & \cdots & A_{ps} \end{pmatrix}, \qquad B = \begin{pmatrix} B_{11} & \cdots & B_{1q}\\ \vdots & & \vdots\\ B_{s1} & \cdots & B_{sq} \end{pmatrix},$$
and if all the matrix products in
$$C_{ij} = \sum_{k=1}^{s} A_{ik} B_{kj}, \qquad i = 1, \dots, p, \quad j = 1, \dots, q,$$
are well defined, then
$$AB = \begin{pmatrix} C_{11} & \cdots & C_{1q}\\ \vdots & & \vdots\\ C_{p1} & \cdots & C_{pq} \end{pmatrix}.$$
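The general rule can be tested on a conformal 2×2 blocking (our illustration): the blockwise sums C_ij = Σ_k A_ik B_kj reproduce the ordinary product.

```python
import numpy as np

# Illustration: block multiplication. Partition A and B conformally
# into 2x2 blocks and check C_ij = sum_k A_ik B_kj against A @ B.

rng = np.random.default_rng(2)
A = rng.standard_normal((5, 6))
B = rng.standard_normal((6, 4))

r, s, t = 2, 3, 1      # row split of A, inner split, column split of B
A11, A12 = A[:r, :s], A[:r, s:]
A21, A22 = A[r:, :s], A[r:, s:]
B11, B12 = B[:s, :t], B[:s, t:]
B21, B22 = B[s:, :t], B[s:, t:]

C = np.block([
    [A11 @ B11 + A12 @ B21, A11 @ B12 + A12 @ B22],
    [A21 @ B11 + A22 @ B21, A21 @ B12 + A22 @ B22],
])
```

The only requirement is conformality: the column split of A must match the row split of B.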
Products of triangular matrices

$$\begin{pmatrix} a_{11} & a_{12} & \cdots & a_{1n}\\ 0 & a_{22} & \cdots & a_{2n}\\ & & \ddots & \vdots\\ 0 & 0 & \cdots & a_{nn} \end{pmatrix} \begin{pmatrix} b_{11} & b_{12} & \cdots & b_{1n}\\ 0 & b_{22} & \cdots & b_{2n}\\ & & \ddots & \vdots\\ 0 & 0 & \cdots & b_{nn} \end{pmatrix} = \begin{pmatrix} c_{11} & c_{12} & \cdots & c_{1n}\\ 0 & c_{22} & \cdots & c_{2n}\\ & & \ddots & \vdots\\ 0 & 0 & \cdots & c_{nn} \end{pmatrix}$$

Lemma. The product $C = AB = (c_{ij})$ of two upper (lower) triangular matrices $A = (a_{ij})$ and $B = (b_{ij})$ is upper (lower) triangular, with diagonal entries $c_{ii} = a_{ii} b_{ii}$ for all i.

Proof. Exercise.
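The lemma is also quick to check numerically (our illustration, not the requested exercise proof):

```python
import numpy as np

# Illustration: the product of two upper triangular matrices is upper
# triangular, and its diagonal entries are c_ii = a_ii * b_ii.

rng = np.random.default_rng(3)
A = np.triu(rng.standard_normal((4, 4)))
B = np.triu(rng.standard_normal((4, 4)))
C = A @ B

is_upper = np.allclose(C, np.triu(C))                    # triangularity
diag_ok = np.allclose(np.diag(C), np.diag(A) * np.diag(B))
```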
Block-Triangular Matrices

Lemma. Suppose
$$A = \begin{pmatrix} A_{11} & A_{12}\\ 0 & A_{22} \end{pmatrix},$$
where A, A_{11}, and A_{22} are square matrices. Then A is nonsingular if and only if both A_{11} and A_{22} are nonsingular. In that case
$$A^{-1} = \begin{pmatrix} A_{11}^{-1} & -A_{11}^{-1} A_{12} A_{22}^{-1}\\ 0 & A_{22}^{-1} \end{pmatrix}. \tag{1}$$
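Formula (1) can be verified directly against a general-purpose inverse (our illustration; the diagonal shifts are only there to keep the random blocks well conditioned):

```python
import numpy as np

# Illustration: assemble the inverse of a block upper triangular matrix
# from formula (1) and compare with np.linalg.inv.

rng = np.random.default_rng(4)
A11 = rng.standard_normal((3, 3)) + 3 * np.eye(3)   # nonsingular blocks
A22 = rng.standard_normal((2, 2)) + 3 * np.eye(2)
A12 = rng.standard_normal((3, 2))

A = np.block([[A11, A12], [np.zeros((2, 3)), A22]])

A11inv = np.linalg.inv(A11)
A22inv = np.linalg.inv(A22)
Ainv_formula = np.block([
    [A11inv, -A11inv @ A12 @ A22inv],
    [np.zeros((2, 3)), A22inv],
])
```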
Proof

If A_{11} and A_{22} are nonsingular, then
$$\begin{pmatrix} A_{11}^{-1} & -A_{11}^{-1} A_{12} A_{22}^{-1}\\ 0 & A_{22}^{-1} \end{pmatrix} \begin{pmatrix} A_{11} & A_{12}\\ 0 & A_{22} \end{pmatrix} = \begin{pmatrix} I & 0\\ 0 & I \end{pmatrix} = I,$$
and A is nonsingular with the indicated inverse.
Proof (continued)

Conversely, let B be the inverse of the nonsingular matrix A. We partition B conformally with A and have
$$BA = \begin{pmatrix} B_{11} & B_{12}\\ B_{21} & B_{22} \end{pmatrix} \begin{pmatrix} A_{11} & A_{12}\\ 0 & A_{22} \end{pmatrix} = \begin{pmatrix} I & 0\\ 0 & I \end{pmatrix} = I.$$
Using block multiplication we find
$$B_{11} A_{11} = I, \qquad B_{21} A_{11} = 0, \qquad B_{21} A_{12} + B_{22} A_{22} = I.$$
The first equation implies that A_{11} is invertible; this in turn implies that B_{21} = 0 in the second equation, and then the third equation simplifies to B_{22} A_{22} = I. We conclude that A_{22} is also invertible.
The inverse

Consider now a triangular matrix.

Lemma. An upper (lower) triangular matrix $A = [a_{ij}] \in \mathbb{R}^{n,n}$ is nonsingular if and only if the diagonal entries $a_{ii}$, i = 1, ..., n, are nonzero. In that case the inverse is upper (lower) triangular with diagonal entries $a_{ii}^{-1}$, i = 1, ..., n.

Proof: We use induction on n. The result holds for n = 1: the 1-by-1 matrix $A = (a_{11})$ is invertible if and only if $a_{11} \ne 0$, and in that case $A^{-1} = (a_{11}^{-1})$. Suppose the result holds for n = k and let $A \in \mathbb{R}^{k+1,k+1}$ be upper triangular.
Proof (continued)

We partition A in the form
$$A = \begin{pmatrix} A_k & a_k\\ 0 & a_{k+1,k+1} \end{pmatrix}$$
and note that $A_k \in \mathbb{R}^{k,k}$ is upper triangular. By Lemma 1.1, A is nonsingular if and only if $A_k$ and $(a_{k+1,k+1})$ are nonsingular, and in that case
$$A^{-1} = \begin{pmatrix} A_k^{-1} & -A_k^{-1} a_k\, a_{k+1,k+1}^{-1}\\ 0 & a_{k+1,k+1}^{-1} \end{pmatrix}.$$
By the induction hypothesis, $A_k$ is nonsingular if and only if its diagonal entries $a_{11}, \dots, a_{kk}$ are nonzero, and in that case $A_k^{-1}$ is upper triangular with diagonal entries $a_{ii}^{-1}$, i = 1, ..., k. The result for A follows.
Unit Triangular Matrices

A matrix is unit triangular if it is triangular with 1's on the diagonal.

Lemma. For a unit upper (lower) triangular matrix $A \in \mathbb{R}^{n,n}$:
1. A is invertible and the inverse is unit upper (lower) triangular.
2. The product of two unit upper (lower) triangular matrices is unit upper (lower) triangular.

Proof. Part 1 follows from the lemma on inverses of triangular matrices, while the lemma on products of triangular matrices implies part 2.
Summary

Studied a boundary value problem and an eigenvalue problem; each leads to a tridiagonal matrix T.
Introduced the concepts of weak diagonal dominance and irreducibility.
Used LU-factorization to show that a tridiagonal, weakly diagonally dominant, irreducible matrix is nonsingular.
Found the eigenvalues and eigenvectors of T; the eigenvectors are orthogonal.
Block multiplication.
Properties of triangular matrices.