Lecture 2 INF-MAT : A boundary value problem and an eigenvalue problem; Block Multiplication; Tridiagonal Systems


Lecture 2 INF-MAT 4350 2008: A boundary value problem and an eigenvalue problem; Block Multiplication; Tridiagonal Systems Tom Lyche Centre of Mathematics for Applications, Department of Informatics, University of Oslo August 30, 2008

Plan for the day
- A two point boundary value problem
- The finite difference scheme
- The second derivative matrix T
- Weakly diagonally dominant tridiagonal matrices
- An eigenvalue problem
- The finite difference method
- Eigenvalues and eigenvectors (eigenpairs) of T
- Block multiplication
- Properties of triangular matrices

A boundary value problem

$$-u''(x) = f(x), \quad x \in [0, 1], \qquad u(0) = 0, \; u(1) = 0.$$

Here f is a given continuous function on [0, 1]. Solve using a finite difference method:
- Choose a positive integer m.
- Define the discretization parameter $h := 1/(m+1)$.
- Replace the interval [0, 1] by grid points $x_j := jh$ for $j = 0, 1, \ldots, m+1$.
- Replace the derivative with the finite difference approximation
$$\frac{u(x-h) - 2u(x) + u(x+h)}{h^2} = u''(x) + \frac{h^2}{12}\, u^{(4)}(\xi) \quad \text{for some } \xi \in (x-h, x+h).$$

Tridiagonal linear system

With $v_j \approx u(jh)$ for all j:
$$-v_{j-1} + 2v_j - v_{j+1} = h^2 f(jh), \qquad j = 1, \ldots, m.$$

This is the linear system $T\mathbf{v} = \mathbf{b}$, where
$$T := \begin{bmatrix} 2 & -1 & & & 0\\ -1 & 2 & -1 & & \\ & \ddots & \ddots & \ddots & \\ & & -1 & 2 & -1\\ 0 & & & -1 & 2 \end{bmatrix} \in \mathbb{R}^{m,m}$$
is called the second derivative matrix. T is not strictly diagonally dominant, but it is weakly diagonally dominant.
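As a sanity check, the scheme above can be sketched in Python (my illustration, not part of the lecture): build the right-hand side $h^2 f(jh)$, solve $T\mathbf{v} = \mathbf{b}$ by forward elimination and back substitution, and compare with a problem whose exact solution is known, here $-u'' = \pi^2 \sin(\pi x)$ with $u(x) = \sin(\pi x)$.

```python
import math

def solve_second_derivative_system(f, m):
    """Solve T v = b with T = tridiag(-1, 2, -1) and b_j = h^2 f(jh),
    approximating -u'' = f on [0, 1] with u(0) = u(1) = 0."""
    h = 1.0 / (m + 1)
    b = [h * h * f((j + 1) * h) for j in range(m)]
    r = [2.0] * m                      # pivots r_k after elimination
    for k in range(1, m):              # forward elimination, l_k = -1 / r_{k-1}
        l = -1.0 / r[k - 1]
        r[k] = 2.0 - l * (-1.0)        # r_k = d_k - l_k c_{k-1}
        b[k] -= l * b[k - 1]
    v = [0.0] * m                      # back substitution (c_k = -1)
    v[m - 1] = b[m - 1] / r[m - 1]
    for k in range(m - 2, -1, -1):
        v[k] = (b[k] + v[k + 1]) / r[k]
    return v

# exact solution u(x) = sin(pi x) for f(x) = pi^2 sin(pi x)
m = 99
v = solve_second_derivative_system(lambda x: math.pi ** 2 * math.sin(math.pi * x), m)
err = max(abs(v[j] - math.sin(math.pi * (j + 1) / (m + 1))) for j in range(m))
```

The maximum error is of order $h^2$, consistent with the truncation error of the central difference.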

Weak diagonal dominance

Definition. The tridiagonal matrix
$$A := \mathrm{tridiag}(a_i, d_i, c_i) = \begin{bmatrix} d_1 & c_1 & & & \\ a_2 & d_2 & c_2 & & \\ & \ddots & \ddots & \ddots & \\ & & a_{n-1} & d_{n-1} & c_{n-1}\\ & & & a_n & d_n \end{bmatrix}$$
is weakly diagonally dominant if
$$|d_1| > |c_1|, \qquad |d_n| > |a_n|, \qquad |d_k| \ge |a_k| + |c_k|, \quad k = 2, 3, \ldots, n-1.$$
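To make the definition concrete, here is a small Python sketch (my code, not from the notes) that tests the three conditions for a tridiagonal matrix given by its subdiagonal, diagonal, and superdiagonal:

```python
def weakly_diagonally_dominant(sub, diag, sup):
    """Check |d_1| > |c_1|, |d_n| > |a_n|, and |d_k| >= |a_k| + |c_k|
    for k = 2, ..., n-1.  sub = (a_2,...,a_n), sup = (c_1,...,c_{n-1})."""
    n = len(diag)
    if not abs(diag[0]) > abs(sup[0]):
        return False
    if not abs(diag[-1]) > abs(sub[-1]):
        return False
    return all(abs(diag[k]) >= abs(sub[k - 1]) + abs(sup[k])
               for k in range(1, n - 1))

# the second derivative matrix T (here m = 4) is weakly diagonally dominant
print(weakly_diagonally_dominant([-1, -1, -1], [2, 2, 2, 2], [-1, -1, -1]))
```

Note that the singular matrix on the next slide also passes this test, which is why weak diagonal dominance alone does not guarantee nonsingularity.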

Irreducibility

T is weakly diagonally dominant, but is it nonsingular? The matrix
$$A_1 = \begin{bmatrix} 2 & -1 & 0\\ 0 & 0 & 0\\ 0 & -1 & 2 \end{bmatrix}$$
is weakly diagonally dominant and singular.

A tridiagonal matrix $\mathrm{tridiag}(a_i, d_i, c_i)$ is irreducible if and only if all the $a_i, c_i$ are nonzero. A matrix which is not irreducible is called reducible. The matrix T is irreducible, while the matrix $A_1$ is reducible.

Theorem. Suppose A is tridiagonal, weakly diagonally dominant, and irreducible. Then A is nonsingular and has a unique LU-factorization A = LR.

Proof

We use the LU-factorization algorithm
$$r_1 = d_1, \qquad l_k = \frac{a_k}{r_{k-1}}, \quad r_k = d_k - l_k c_{k-1}, \quad k = 2, 3, \ldots, n,$$
and show by induction on k that $|r_k| > |c_k|$ for $k = 1, 2, \ldots, n-1$ and that $r_n \ne 0$.

For k = 1 we have $|r_1| = |d_1| > |c_1|$. Suppose for some $k \le n$ that $|r_{k-1}| > |c_{k-1}|$. Since $a_k \ne 0$ by irreducibility,
$$|r_k| = \left|d_k - \frac{a_k c_{k-1}}{r_{k-1}}\right| \ge |d_k| - \frac{|a_k|\,|c_{k-1}|}{|r_{k-1}|} > |d_k| - |a_k| \ge \begin{cases} |c_k|, & k \le n-1,\\ 0, & k = n.\end{cases}$$

Since both L and R exist and have nonzero diagonal entries they are nonsingular, so the product A = LR is nonsingular and the LU-factorization is unique.
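The recursion in this proof is itself the algorithm. A minimal Python sketch (my code, not from the notes) stores only the multipliers $l_k$ and pivots $r_k$; for the second derivative matrix T the pivots come out as $r_k = (k+1)/k$, all safely away from zero:

```python
def tridiag_lu(sub, diag, sup):
    """LU-factorization of tridiag(a_i, d_i, c_i) as in the proof:
    r_1 = d_1, l_k = a_k / r_{k-1}, r_k = d_k - l_k c_{k-1}.
    sub = (a_2,...,a_n), diag = (d_1,...,d_n), sup = (c_1,...,c_{n-1})."""
    n = len(diag)
    r = [diag[0]]
    l = []
    for k in range(1, n):
        lk = sub[k - 1] / r[k - 1]
        l.append(lk)
        r.append(diag[k] - lk * sup[k - 1])
    return l, r

# pivots of the 5-by-5 second derivative matrix T
l, r = tridiag_lu([-1] * 4, [2] * 5, [-1] * 4)
```

In exact arithmetic $r = (2, 3/2, 4/3, 5/4, 6/5)$, and $|r_k| > |c_k| = 1$ for $k < n$, exactly as the induction predicts.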

An eigenvalue problem

Consider a horizontal beam of length L located between 0 and L on the x-axis of the plane. We assume that the beam is fixed at x = 0 and x = L. A force F is applied at (L, 0) in the direction of the origin. Let y(x) be the vertical displacement of the beam at x.

Boundary value problem:
$$R\, y''(x) = -F\, y(x), \qquad y(0) = y(L) = 0,$$
where R is a constant defined by the rigidity of the beam.

The transformed system

Transform: define $u : [0, 1] \to \mathbb{R}$ by $u(t) := y(tL)$. This gives the eigenvalue problem
$$-u''(t) = K u(t), \quad u(0) = u(1) = 0, \qquad K := \frac{FL^2}{R}.$$

When F is increased it will reach a critical value where the beam will buckle and maybe break. This corresponds to the smallest eigenvalue of $-u'' = Ku$.

Approximate $-u''$ by $T/h^2$ to obtain the discrete eigenvalue problem $T\mathbf{v} = \lambda\mathbf{v}$, where T is the second derivative matrix. So we need to determine the eigenvalues of T.
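Anticipating the eigenvalue formula $\lambda_j = 2 - 2\cos(j\pi h)$ derived later in this lecture, a quick Python check (my illustration) shows that the smallest eigenvalue of $T/h^2$ converges to $\pi^2$, the smallest eigenvalue of the continuous problem $-u'' = Ku$:

```python
import math

approx = {}
for m in (10, 100, 1000):
    h = 1.0 / (m + 1)
    lam_min = 2 - 2 * math.cos(math.pi * h)   # smallest eigenvalue of T
    approx[m] = lam_min / h ** 2              # discrete estimate of K_min
```

The error behaves like $\pi^4 h^2 / 12$, again second order in h.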

Hermitian matrices

Complex number: $z = x + iy = r(\cos\phi + i\sin\phi) = re^{i\phi}$.
Complex conjugate: $\bar z = x - iy = r(\cos\phi - i\sin\phi) = re^{-i\phi}$.
A complex number z is real if and only if $\bar z = z$.
Absolute value: $|z| = \sqrt{z\bar z} = \sqrt{x^2 + y^2} = r$.
Hermitian transpose: if $A = [a_{ij}] \in \mathbb{C}^{m,n}$ then $A^H := [\bar a_{ji}] \in \mathbb{C}^{n,m}$, and $(AB)^H = B^H A^H$.
If $A \in \mathbb{R}^{m,n}$ then $A^H = A^T$.
A matrix is Hermitian if $A^H = A$ and symmetric if $A^T = A$.

Eigenpairs of Hermitian matrices

The eigenvalues of a Hermitian matrix are real, and eigenvectors corresponding to distinct eigenvalues are orthogonal.

Proof (real). Suppose $A^H = A$ and $Ax = \lambda x$ with $x \ne 0$. Then $\lambda = \dfrac{x^H A x}{x^H x}$ and
$$\bar\lambda = \lambda^H = \frac{(x^H A x)^H}{(x^H x)^H} = \frac{x^H A^H x}{x^H x} = \frac{x^H A x}{x^H x} = \lambda,$$
so λ is real.

Proof (orthogonal). Suppose $Ax = \lambda x$ and $Ay = \mu y$ with $\mu \ne \lambda$. Then
$$\lambda\, y^H x = y^H A x = (x^H A^H y)^H = (x^H A y)^H = (\mu\, x^H y)^H = \bar\mu\, y^H x = \mu\, y^H x,$$
since μ is real. Thus $(\lambda - \mu)\, y^H x = 0$ and $y^H x = 0$.
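A concrete check in Python (my example matrix, not from the notes): for the Hermitian matrix $A = \begin{bmatrix} 2 & i\\ -i & 2\end{bmatrix}$ the vector $x = (i, 1)^T$ satisfies $Ax = 3x$, and the Rayleigh quotient $x^H A x / x^H x$ comes out real:

```python
A = [[2 + 0j, 1j], [-1j, 2 + 0j]]
x = [1j, 1 + 0j]

# compute A x
Ax = [sum(A[i][k] * x[k] for k in range(2)) for i in range(2)]

# Rayleigh quotient x^H A x / x^H x (x^H is the conjugate transpose)
num = sum(x[i].conjugate() * Ax[i] for i in range(2))
den = sum(x[i].conjugate() * x[i] for i in range(2))
lam = num / den
```

Here `lam` equals the eigenvalue 3, with zero imaginary part, as the lemma guarantees.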

The Sine Matrix

$$S = \left[\sin\frac{jk\pi}{m+1}\right]_{j,k=1}^m \in \mathbb{R}^{m,m}.$$

For m = 3:
$$S = \begin{bmatrix} \sin\frac{\pi}{4} & \sin\frac{2\pi}{4} & \sin\frac{3\pi}{4}\\[2pt] \sin\frac{2\pi}{4} & \sin\frac{4\pi}{4} & \sin\frac{6\pi}{4}\\[2pt] \sin\frac{3\pi}{4} & \sin\frac{6\pi}{4} & \sin\frac{9\pi}{4} \end{bmatrix} = \begin{bmatrix} t & 1 & t\\ 1 & 0 & -1\\ t & -1 & t\end{bmatrix}, \qquad t := \frac{1}{\sqrt 2}.$$

The columns $S = [s_1, \ldots, s_m]$ are orthogonal: $s_1^T s_2 = s_1^T s_3 = s_2^T s_3 = 0$.

Eigenvalue Problem

$$C := \mathrm{tridiag}(a, b, a) = \begin{bmatrix} b & a & 0 & \cdots & 0\\ a & b & a & & \vdots\\ & \ddots & \ddots & \ddots & \\ \vdots & & a & b & a\\ 0 & \cdots & 0 & a & b\end{bmatrix} \in \mathbb{R}^{m,m}.$$

- $a = -1$, $b = 2$: the second derivative matrix T.
- $a = 1$, $b = 4$: the spline matrix $N_1$.

C is symmetric, $C^T = C$: we have $c_{k,j} = 0$ except for $c_{k,k-1} = c_{k,k+1} = a$ and $c_{k,k} = b$.

We show that in general $Cs_j = \lambda_j s_j$ for $j = 1, \ldots, m$, where
$$\lambda_j = b + 2a\cos(j\pi h), \qquad h = 1/(m+1).$$

Eigenpairs of C

Let $s_{k,j} = \sin(kj\pi h)$ be the kth entry of $s_j$. With $A := kj\pi h$ and $B := j\pi h$ we find
$$(Cs_j)_k = \sum_{l=1}^m c_{k,l}\, s_{l,j} = \sum_{l=k-1}^{k+1} c_{k,l}\, s_{l,j} = a\, s_{k-1,j} + b\, s_{k,j} + a\, s_{k+1,j}$$
$$= a\sin\big((k-1)j\pi h\big) + b\sin\big(kj\pi h\big) + a\sin\big((k+1)j\pi h\big)$$
$$= a\sin(A - B) + b\sin A + a\sin(A + B) = 2a\cos B\sin A + b\sin A = (b + 2a\cos B)\sin A = \lambda_j s_{k,j},$$
and $\lambda_j = b + 2a\cos(j\pi h)$ follows.

Since $j\pi h = j\pi/(m+1) \in (0, \pi)$ for $j = 1, \ldots, m$ and the cos function is monotone on $(0, \pi)$, the eigenvalues are distinct. Since C is symmetric, it follows from the lemma on Hermitian matrices that the eigenvectors $s_j$ are orthogonal.
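The computation can be verified numerically. The Python sketch below (my code, not from the notes) applies $C = \mathrm{tridiag}(a, b, a)$ to $s_j$ and measures the residual of $Cs_j = \lambda_j s_j$:

```python
import math

def eigenpair_residual(a, b, m, j):
    """Max-norm of C s_j - lambda_j s_j for C = tridiag(a, b, a),
    with s_j = (sin(k j pi h))_k and lambda_j = b + 2a cos(j pi h)."""
    h = 1.0 / (m + 1)
    s = [math.sin(k * j * math.pi * h) for k in range(1, m + 1)]
    lam = b + 2 * a * math.cos(j * math.pi * h)
    # tridiagonal matrix-vector product (boundary entries have no neighbor)
    Cs = [(a * s[k - 1] if k > 0 else 0.0)
          + b * s[k]
          + (a * s[k + 1] if k < m - 1 else 0.0)
          for k in range(m)]
    return max(abs(Cs[k] - lam * s[k]) for k in range(m))
```

For both the second derivative matrix ($a = -1$, $b = 2$) and the spline matrix ($a = 1$, $b = 4$) the residuals vanish to rounding error.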

Scaling of eigenvectors

$\|s_j\|_2^2 := s_j^T s_j = \frac{m+1}{2}$ for $j = 1, \ldots, m$:
$$s_j^T s_j = \sum_{k=1}^m \sin^2(kj\pi h) = \sum_{k=0}^m \sin^2(kj\pi h) = \frac12\sum_{k=0}^m \big(1 - \cos(2kj\pi h)\big) = \frac{m+1}{2} - \frac12\sum_{k=0}^m \cos(2kj\pi h).$$

The last cosine sum is zero. We show this by summing a geometric series of complex exponentials:
$$\sum_{k=0}^m \cos(2kj\pi h) + i\sum_{k=0}^m \sin(2kj\pi h) = \sum_{k=0}^m e^{2ikj\pi h} = \frac{e^{2i(m+1)j\pi h} - 1}{e^{2ij\pi h} - 1} = \frac{e^{2\pi ij} - 1}{e^{2ij\pi h} - 1} = 0.$$
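A direct numerical check of $s_j^T s_j = (m+1)/2$ in Python (my illustration):

```python
import math

m = 8
h = 1.0 / (m + 1)
# squared 2-norm of each eigenvector s_j
norms = [sum(math.sin(k * j * math.pi * h) ** 2 for k in range(1, m + 1))
         for j in range(1, m + 1)]
# every entry equals (m + 1) / 2 = 4.5 up to rounding
```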

Column partition of a matrix product

Matrix-vector product:
$$Ax = \begin{bmatrix} a_{11}x_1 + \cdots + a_{1n}x_n\\ \vdots\\ a_{m1}x_1 + \cdots + a_{mn}x_n \end{bmatrix} = \sum_{j=1}^n x_j a_j,$$
where $a_j$ is the jth column of A. In particular $Ae_j = a_j$.

Product: $(AB)e_j = A(Be_j) = Ab_j$, so
$$AB = [Ab_1, \ldots, Ab_n];$$
AB is partitioned by columns.

Partition by rows:
$$AB = \begin{bmatrix} a_{1.}^T B\\ a_{2.}^T B\\ \vdots\\ a_{m.}^T B \end{bmatrix},$$
where $a_{i.}^T$ is the ith row of A.

Partitioned matrices

A rectangular matrix A can be partitioned into sub-matrices by drawing horizontal lines between selected rows and vertical lines between selected columns. For example,
$$A = \begin{bmatrix} 1 & 2 & 3\\ 4 & 5 & 6\\ 7 & 8 & 9\end{bmatrix}$$
can be partitioned as
$$\text{(i)}\;\begin{bmatrix} A_{11} & A_{12}\\ A_{21} & A_{22}\end{bmatrix}, \qquad \text{(ii)}\;[a_{.1}, a_{.2}, a_{.3}], \qquad \text{(iii)}\;\begin{bmatrix} a_{1.}^T\\ a_{2.}^T\\ a_{3.}^T\end{bmatrix}, \qquad \text{(iv)}\;[A_{11}, A_{12}].$$

The submatrices in a partition are often referred to as blocks, and a partitioned matrix is sometimes called a block matrix.

Block multiplication 1

If $B = [B_1, B_2]$, where $B_1$ consists of the first r columns of B, then
$$AB = [Ab_1, \ldots, Ab_r, Ab_{r+1}, \ldots, Ab_n] = [AB_1, AB_2].$$

Block multiplication 2

If $A = [A_1, A_2]$, where $A_1$ has s columns, and $B = \begin{bmatrix} B_1\\ B_2\end{bmatrix}$, where $B_1$ has s rows, then
$$(AB)_{ij} = \sum_{k=1}^p a_{ik} b_{kj} = \sum_{k=1}^s a_{ik} b_{kj} + \sum_{k=s+1}^p a_{ik} b_{kj} = (A_1 B_1)_{ij} + (A_2 B_2)_{ij} = (A_1 B_1 + A_2 B_2)_{ij},$$
so $AB = A_1 B_1 + A_2 B_2$.

The general case

If
$$A = \begin{bmatrix} A_{11} & \cdots & A_{1s}\\ \vdots & & \vdots\\ A_{p1} & \cdots & A_{ps}\end{bmatrix}, \qquad B = \begin{bmatrix} B_{11} & \cdots & B_{1q}\\ \vdots & & \vdots\\ B_{s1} & \cdots & B_{sq}\end{bmatrix},$$
and if all the matrix products in
$$C_{ij} = \sum_{k=1}^s A_{ik} B_{kj}, \qquad i = 1, \ldots, p, \quad j = 1, \ldots, q,$$
are well defined, then
$$AB = \begin{bmatrix} C_{11} & \cdots & C_{1q}\\ \vdots & & \vdots\\ C_{p1} & \cdots & C_{pq}\end{bmatrix}.$$
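The rule $C_{ij} = \sum_k A_{ik} B_{kj}$ can be checked with a short pure-Python sketch (my code, not from the notes), comparing an ordinary 4-by-4 product with its 2-by-2-block computation:

```python
def matmul(A, B):
    """Ordinary matrix product of nested lists."""
    return [[sum(A[i][k] * B[k][j] for k in range(len(B)))
             for j in range(len(B[0]))] for i in range(len(A))]

def madd(A, B):
    """Entrywise sum of two matrices of the same size."""
    return [[A[i][j] + B[i][j] for j in range(len(A[0]))]
            for i in range(len(A))]

def block(M, r0, r1, c0, c1):
    """Submatrix with rows r0:r1 and columns c0:c1."""
    return [row[c0:c1] for row in M[r0:r1]]

A = [[1, 2, 3, 4], [5, 6, 7, 8], [9, 10, 11, 12], [13, 14, 15, 16]]
B = [[1, 0, 2, 0], [0, 1, 0, 2], [3, 0, 4, 0], [0, 3, 0, 4]]

C = matmul(A, B)  # ordinary product

# 2x2 block partition: C_11 = A_11 B_11 + A_12 B_21, etc.
A11, A12 = block(A, 0, 2, 0, 2), block(A, 0, 2, 2, 4)
A21, A22 = block(A, 2, 4, 0, 2), block(A, 2, 4, 2, 4)
B11, B21 = block(B, 0, 2, 0, 2), block(B, 2, 4, 0, 2)
C11 = madd(matmul(A11, B11), matmul(A12, B21))
```

The blockwise `C11` agrees exactly with the top-left 2-by-2 block of the full product `C`.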

Products of triangular matrices

$$\begin{bmatrix} a_{11} & a_{12} & \cdots & a_{1n}\\ 0 & a_{22} & \cdots & a_{2n}\\ \vdots & & \ddots & \vdots\\ 0 & 0 & \cdots & a_{nn}\end{bmatrix} \begin{bmatrix} b_{11} & b_{12} & \cdots & b_{1n}\\ 0 & b_{22} & \cdots & b_{2n}\\ \vdots & & \ddots & \vdots\\ 0 & 0 & \cdots & b_{nn}\end{bmatrix} = \begin{bmatrix} c_{11} & c_{12} & \cdots & c_{1n}\\ 0 & c_{22} & \cdots & c_{2n}\\ \vdots & & \ddots & \vdots\\ 0 & 0 & \cdots & c_{nn}\end{bmatrix}$$

Lemma. The product $C = AB = (c_{ij})$ of two upper (lower) triangular matrices $A = (a_{ij})$ and $B = (b_{ij})$ is upper (lower) triangular with diagonal entries $c_{ii} = a_{ii} b_{ii}$ for all i.

Proof. Exercise.

Block-Triangular Matrices

Lemma. Suppose
$$A = \begin{bmatrix} A_{11} & A_{12}\\ 0 & A_{22}\end{bmatrix},$$
where A, $A_{11}$ and $A_{22}$ are square matrices. Then A is nonsingular if and only if both $A_{11}$ and $A_{22}$ are nonsingular. In that case
$$A^{-1} = \begin{bmatrix} A_{11}^{-1} & -A_{11}^{-1} A_{12} A_{22}^{-1}\\ 0 & A_{22}^{-1}\end{bmatrix}. \tag{1}$$

Proof

If $A_{11}$ and $A_{22}$ are nonsingular, then
$$\begin{bmatrix} A_{11}^{-1} & -A_{11}^{-1} A_{12} A_{22}^{-1}\\ 0 & A_{22}^{-1}\end{bmatrix} \begin{bmatrix} A_{11} & A_{12}\\ 0 & A_{22}\end{bmatrix} = \begin{bmatrix} I & 0\\ 0 & I\end{bmatrix} = I,$$
and A is nonsingular with the indicated inverse.

Proof (continued)

Conversely, let B be the inverse of the nonsingular matrix A. We partition B conformally with A and have
$$BA = \begin{bmatrix} B_{11} & B_{12}\\ B_{21} & B_{22}\end{bmatrix} \begin{bmatrix} A_{11} & A_{12}\\ 0 & A_{22}\end{bmatrix} = \begin{bmatrix} I & 0\\ 0 & I\end{bmatrix} = I.$$
Using block multiplication we find
$$B_{11} A_{11} = I, \qquad B_{21} A_{11} = 0, \qquad B_{21} A_{12} + B_{22} A_{22} = I.$$
The first equation implies that $A_{11}$ is invertible; this in turn implies that $B_{21} = 0$ in the second equation, and then the third equation simplifies to $B_{22} A_{22} = I$. We conclude that $A_{22}$ is also invertible.
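Formula (1) is easy to test numerically. The Python sketch below (my code, with arbitrarily chosen example blocks) assembles a 4-by-4 block-triangular A from 2-by-2 blocks, builds the inverse from the block formula using a hand-written 2-by-2 inverse, and checks $AA^{-1} = I$:

```python
def matmul(A, B):
    return [[sum(A[i][k] * B[k][j] for k in range(len(B)))
             for j in range(len(B[0]))] for i in range(len(A))]

def inv2(M):
    """Inverse of a 2x2 matrix via the adjugate formula."""
    a, b = M[0]
    c, d = M[1]
    det = a * d - b * c
    return [[d / det, -b / det], [-c / det, a / det]]

def neg(M):
    return [[-x for x in row] for row in M]

A11 = [[2.0, 1.0], [0.0, 3.0]]
A12 = [[1.0, 4.0], [2.0, 0.0]]
A22 = [[1.0, 2.0], [0.0, 5.0]]

# assemble A = [[A11, A12], [0, A22]]
A = [A11[0] + A12[0], A11[1] + A12[1],
     [0.0, 0.0] + A22[0], [0.0, 0.0] + A22[1]]

# block formula (1): A^{-1} = [[A11^{-1}, -A11^{-1} A12 A22^{-1}], [0, A22^{-1}]]
I11 = inv2(A11)
I22 = inv2(A22)
T12 = neg(matmul(matmul(I11, A12), I22))
Ainv = [I11[0] + T12[0], I11[1] + T12[1],
        [0.0, 0.0] + I22[0], [0.0, 0.0] + I22[1]]

P = matmul(A, Ainv)  # should be the 4x4 identity
```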

The inverse

Consider now a triangular matrix.

Lemma. An upper (lower) triangular matrix $A = [a_{ij}] \in \mathbb{R}^{n,n}$ is nonsingular if and only if the diagonal entries $a_{ii}$, $i = 1, \ldots, n$, are nonzero. In that case the inverse is upper (lower) triangular with diagonal entries $a_{ii}^{-1}$, $i = 1, \ldots, n$.

Proof. We use induction on n. The result holds for n = 1: the 1-by-1 matrix $A = (a_{11})$ is invertible if and only if $a_{11} \ne 0$, and in that case $A^{-1} = (a_{11}^{-1})$. Suppose the result holds for n = k and let $A \in \mathbb{R}^{k+1,k+1}$ be upper triangular.

Proof (continued)

We partition A in the form
$$A = \begin{bmatrix} A_k & a_k\\ 0 & a_{k+1,k+1}\end{bmatrix}$$
and note that $A_k \in \mathbb{R}^{k,k}$ is upper triangular. By Lemma 1.1, A is nonsingular if and only if $A_k$ and $(a_{k+1,k+1})$ are nonsingular, and in that case
$$A^{-1} = \begin{bmatrix} A_k^{-1} & -A_k^{-1} a_k a_{k+1,k+1}^{-1}\\ 0 & a_{k+1,k+1}^{-1}\end{bmatrix}.$$
By the induction hypothesis, $A_k$ is nonsingular if and only if the diagonal entries $a_{11}, \ldots, a_{kk}$ of $A_k$ are nonzero, and in that case $A_k^{-1}$ is upper triangular with diagonal entries $a_{ii}^{-1}$, $i = 1, \ldots, k$. The result for A follows.
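The lemma can be illustrated with a short Python sketch (my code, not from the notes): invert an upper triangular matrix column by column with back substitution, and observe that the inverse is upper triangular with diagonal entries $1/a_{ii}$:

```python
def upper_triangular_inverse(U):
    """Invert upper triangular U by solving U x = e_j with back substitution."""
    n = len(U)
    X = [[0.0] * n for _ in range(n)]
    for j in range(n):
        for i in range(n - 1, -1, -1):
            s = (1.0 if i == j else 0.0)
            s -= sum(U[i][k] * X[k][j] for k in range(i + 1, n))
            X[i][j] = s / U[i][i]
    return X

U = [[2.0, 1.0, 3.0], [0.0, 4.0, 5.0], [0.0, 0.0, 8.0]]
X = upper_triangular_inverse(U)
```

The entries of X below the diagonal come out exactly zero, and the diagonal is $(1/2, 1/4, 1/8)$, matching the lemma.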

Unit Triangular Matrices

A matrix is unit triangular if it is triangular with 1's on the diagonal.

Lemma. For a unit upper (lower) triangular matrix $A \in \mathbb{R}^{n,n}$:
1. A is invertible and the inverse is unit upper (lower) triangular.
2. The product of two unit upper (lower) triangular matrices is unit upper (lower) triangular.

Proof. Part 1 follows from the lemma on inverses of triangular matrices, while the lemma on products of triangular matrices implies part 2.

Summary
- Studied a boundary value problem and an eigenvalue problem; each leads to a tridiagonal matrix T.
- Introduced the concepts of weak diagonal dominance and irreducibility.
- Used the LU-factorization to show that a tridiagonal, weakly diagonally dominant, irreducible matrix is nonsingular.
- Found the eigenvalues and eigenvectors of T; the eigenvectors are orthogonal.
- Block multiplication.
- Properties of triangular matrices.