Lecture 10: Eigenvectors and eigenvalues (Numerical Recipes, Chapter 11)


Lecture 10: Eigenvectors and eigenvalues (Numerical Recipes, Chapter 11) The eigenvalue problem, A x = λx, occurs in many, many contexts: classical mechanics, quantum mechanics, optics.

Eigenvectors and eigenvalues (Numerical Recipes, Chapter 11) The textbook solution is obtained from (A - λI) x = 0, which can hold for x ≠ 0 only if det(A - λI) = 0. This is a polynomial equation for λ of order N.
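To make the textbook route concrete, here is a minimal pure-Python sketch for a hypothetical 2x2 symmetric matrix, where det(A - λI) = 0 reduces to a quadratic in λ that can be solved in closed form:

```python
# Sketch: the "textbook" eigenvalue solution for a 2x2 matrix via the
# characteristic polynomial det(A - lambda*I) = 0 (a quadratic in lambda).
import math

A = [[2.0, 1.0],
     [1.0, 2.0]]  # hypothetical example matrix

# For a 2x2 matrix: det(A - lambda I) = lambda^2 - tr(A)*lambda + det(A)
tr = A[0][0] + A[1][1]                        # trace = 4
det = A[0][0] * A[1][1] - A[0][1] * A[1][0]   # determinant = 3
disc = math.sqrt(tr * tr - 4.0 * det)         # discriminant of the quadratic
eigenvalues = sorted([(tr - disc) / 2.0, (tr + disc) / 2.0])
print(eigenvalues)  # [1.0, 3.0]
```

Beyond the 2x2 case the characteristic-polynomial approach quickly becomes impractical, which is exactly the point made on the next slide.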

Eigenvectors and eigenvalues (Numerical Recipes, Chapter 11) This method of solution is:
(1) very slow
(2) inaccurate in the presence of roundoff error
(3) yields only eigenvalues, not eigenvectors
(4) of no practical value (except, perhaps, in the 2 x 2 case, where the solution to the resulting quadratic can be written in closed form)

Review of some basic definitions in linear algebra A matrix is said to be:
Symmetric if A = A^T (its transpose), i.e. if a_ij = a_ji
Hermitian if A = A† = (A^T)* (its Hermitian conjugate), i.e. if a_ij = a_ji*
Orthogonal if U^-1 = U^T
Unitary if U^-1 = U†

Review of some basic definitions in linear algebra All four types are normal, meaning that they obey the relations A†A = AA† and U†U = UU†.
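As a small pure-Python check (using a hypothetical 2x2 Hermitian matrix), we can verify the normality condition A†A = AA† directly:

```python
# Check that a Hermitian matrix is normal: A^dagger A == A A^dagger.
# The 2x2 matrix below is a hypothetical example (equal to its own
# conjugate transpose).
A = [[2.0 + 0.0j, 1.0 - 1.0j],
     [1.0 + 1.0j, 3.0 + 0.0j]]

def matmul(X, Y):
    return [[sum(X[i][k] * Y[k][j] for k in range(2)) for j in range(2)]
            for i in range(2)]

def dagger(X):  # conjugate transpose
    return [[X[j][i].conjugate() for j in range(2)] for i in range(2)]

left = matmul(dagger(A), A)
right = matmul(A, dagger(A))
print(left == right)  # True
```

For a Hermitian matrix the check is trivially true (A† = A, so both sides are A^2), but the same code applied to a generic non-normal matrix would print False.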

Eigenvectors and eigenvalues Hermitian matrices are of particular interest because they have:
real eigenvalues
orthonormal eigenvectors (x_i† x_j = δ_ij) (unless some eigenvalues are degenerate, in which case we can always construct an orthonormal set)

Diagonalization of Hermitian matrices Hermitian matrices can be diagonalized according to A = U D U^-1, where U is a unitary matrix whose columns are the eigenvectors and D is a diagonal matrix consisting of the eigenvalues.

Diagonalization of Hermitian matrices Clearly, if A = U D U^-1, then A U = U D, which is simply the eigenproblem (written out N times): the columns of U are the eigenvectors (mutually orthogonal) and the elements of D (non-zero only on the diagonal) are the corresponding eigenvalues.
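Here is a minimal pure-Python check of A U = U D for a hypothetical 2x2 symmetric matrix whose orthonormal eigenvectors are known in closed form:

```python
# Verify A U = U D for a small symmetric example. The eigenvectors of
# A = [[2,1],[1,2]] are (1,1)/sqrt(2) and (1,-1)/sqrt(2), with
# eigenvalues 3 and 1 (hypothetical example, not from the lecture).
import math

s = 1.0 / math.sqrt(2.0)
A = [[2.0, 1.0], [1.0, 2.0]]
U = [[s,  s],                  # columns are the orthonormal eigenvectors
     [s, -s]]
D = [[3.0, 0.0], [0.0, 1.0]]   # diagonal matrix of eigenvalues

def matmul(X, Y):
    return [[sum(X[i][k] * Y[k][j] for k in range(2)) for j in range(2)]
            for i in range(2)]

AU = matmul(A, U)
UD = matmul(U, D)
ok = all(abs(AU[i][j] - UD[i][j]) < 1e-12 for i in range(2) for j in range(2))
print(ok)  # True
```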

Non-Hermitian matrices We are often interested in non-symmetric real matrices:
the eigenvalues are real or come in complex-conjugate pairs
the eigenvectors are not orthonormal in general
the eigenvectors may not even span an N-dimensional space

Diagonalization of non-Hermitian matrices For a non-Hermitian matrix, we can identify two (different) types of eigenvectors. Right-hand eigenvectors are column vectors which obey A x_R = λ x_R. Left-hand eigenvectors are row vectors which obey x_L A = λ x_L. The eigenvalues are the same in both cases (the roots of det(A - λI) = 0), but in general the eigenvectors are not.

Diagonalization of non-Hermitian matrices Note: for a Hermitian matrix, the two types of eigenvectors are equivalent. Taking the Hermitian conjugate of A x_R = λ x_R gives x_R† A† = λ* x_R†; since A† = A and λ is real, this becomes x_R† A = λ x_R†. So if x_R is a right-hand eigenvector, then x_R† is a left-hand eigenvector.

Diagonalization of non-Hermitian matrices Let D be the diagonal matrix whose elements are the eigenvalues, D = diag(λ_i). Let R be the matrix whose columns are the right-hand eigenvectors. Let L be the matrix whose rows are the left-hand eigenvectors. Then the eigenvalue equations are A R = R D and L A = D L.

Diagonalization of non-Hermitian matrices The eigenvalue equations A R = R D and L A = D L imply L A R = L R D and L A R = D L R. Hence L R D = D L R: L R commutes with D, so (for distinct eigenvalues) L R is diagonal, and the rows of L are orthogonal to the columns of R.

Diagonalization of non-Hermitian matrices It follows that, normalizing so that L R = I (i.e. L = R^-1), any matrix with a complete set of eigenvectors can be diagonalized according to D = L A R = R^-1 A R, or equivalently D = L A L^-1. The special feature of a Hermitian matrix is that L and R are unitary.
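The left/right construction can be checked in a few lines of pure Python for a hypothetical non-symmetric 2x2 matrix whose eigenvectors are known exactly:

```python
# Hypothetical non-symmetric example A = [[2,1],[0,3]]: the right
# eigenvectors (columns of R) and left eigenvectors (rows of L) differ,
# but with the normalization L R = I we recover D = L A R = diag(2, 3).
A = [[2.0, 1.0],
     [0.0, 3.0]]
R = [[1.0, 1.0],   # columns: right eigenvectors for lambda = 2 and 3
     [0.0, 1.0]]
L = [[1.0, -1.0],  # rows: left eigenvectors, normalized so that L R = I
     [0.0,  1.0]]

def matmul(X, Y):
    n = len(X)
    return [[sum(X[i][k] * Y[k][j] for k in range(n)) for j in range(n)]
            for i in range(n)]

D = matmul(matmul(L, A), R)
print(D)  # [[2.0, 0.0], [0.0, 3.0]]
```

Note how row 1 of L is orthogonal to column 2 of R (and vice versa), as derived on the previous slide.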

Solving the eigenvalue equation The problem is entirely equivalent to figuring out how to diagonalize A according to A = R D R^-1. All methods proceed by making a series of similarity transformations, A = P_1^-1 M_1 P_1 = P_1^-1 P_2^-1 M_2 P_2 P_1 = ... = R D R^-1, where M_1, M_2, ... are successively more nearly diagonal.

Solving the eigenvalue equation There are many routines available: EISPACK + LINPACK, LAPACK (free), NAG, IMSL (expensive). Before using a totally general technique, consider what you want (eigenvalues alone, or eigenvectors as well?) and how special your case is: real symmetric & tridiagonal, real symmetric, real non-symmetric, complex Hermitian, complex non-Hermitian.

Solution for a real symmetric matrix: the Householder method (Recipes, 11.2) The Householder method makes use of a similarity transform based upon P = I - 2 w w^T, where w is a column vector for which w^T w = 1, and w w^T is an N x N symmetric matrix, i.e. (w w^T)_ij = (w w^T)_ji = w_i w_j.

The Householder method The Householder matrix P clearly has the following properties:
P is symmetric (being the difference of two symmetric matrices, I and 2 w w^T).
P is orthogonal: P^T P = (I - 2 w w^T)^T (I - 2 w w^T) = I - 4 w w^T + 4 w w^T w w^T = I (using w^T w = 1).

The Householder method Let a_1 be the first column of A, and let w = (a_1 - |a_1| e_1) / |a_1 - |a_1| e_1|, where e_1 = (1, 0, 0, ..., 0)^T.

The Householder method Let's consider the action of P = I - 2 w w^T upon the first column of A:
P a_1 = { I - 2 (a_1 - |a_1| e_1)(a_1 - |a_1| e_1)^T / |a_1 - |a_1| e_1|^2 } a_1
= a_1 - 2 (a_1 - |a_1| e_1) (|a_1|^2 - |a_1| e_1^T a_1) / (2 |a_1|^2 - 2 |a_1| e_1 . a_1)
= a_1 - (a_1 - |a_1| e_1)
= |a_1| e_1
P zeroes out all but the first element of a_1.
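A short pure-Python sketch of this zeroing step, for a hypothetical 3-vector a: building w from a as above and applying P = I - 2 w w^T sends a to |a| e_1.

```python
# Householder reflection acting on a vector: with w = (a - |a| e_1) / norm,
# P a = a - 2 w (w . a) should equal |a| e_1. Hypothetical example vector.
import math

a = [3.0, 4.0, 0.0]
norm_a = math.sqrt(sum(x * x for x in a))   # |a| = 5
u = [a[0] - norm_a] + a[1:]                 # a - |a| e_1
norm_u = math.sqrt(sum(x * x for x in u))
w = [x / norm_u for x in u]                 # unit vector, w^T w = 1

wa = sum(w[i] * a[i] for i in range(3))     # scalar w . a
Pa = [a[i] - 2.0 * w[i] * wa for i in range(3)]  # P a without forming P

ok = (abs(Pa[0] - norm_a) < 1e-12
      and abs(Pa[1]) < 1e-12 and abs(Pa[2]) < 1e-12)
print(ok)  # True
```

Note that P a is computed as a - 2 w (w . a) in O(N) operations, without ever forming the N x N matrix P; the same idea underlies the cost analysis later in the lecture.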

The Householder method Suppose A is symmetric, and form A -> P A -> M_1 = P A P^-1 = P A P^T. [The matrix diagrams illustrating the resulting pattern of zeros were lost in transcription.]


The Householder method Next step: choose a new Householder matrix of block form, with 1 in the top-left corner and I_{N-1} - 2 w_{N-1} w_{N-1}^T in the lower-right (N-1) x (N-1) block, to operate on M_1: M_2 = P' M_1 P'^-1.

The Householder method After N - 2 Householder transformations, the matrix is tridiagonal:
M = P_{N-2} ... P_2 P_1 A P_1^-1 P_2^-1 ... P_{N-2}^-1

The Householder method: computational cost You might think that each Householder transformation would require O(N^3) operations (two matrix multiplications), so the whole series would cost ~O(N^4) operations. However, we can rewrite the product P A P^-1 as follows:
P A P^-1 = P A P = (I - 2 w w^T) A (I - 2 w w^T) = (I - 2 w w^T)(A - 2 A w w^T) = A - 2 w w^T A - 2 A w w^T + 4 w w^T A w w^T

The Householder method: computational cost Now define an N x 1 vector v = A w. Then
P^-1 A P = A - 2 w w^T A - 2 A w w^T + 4 w w^T A w w^T = A - 2 w v^T - 2 v w^T + 4 (w^T v) w w^T
where w^T v = w^T A w is a scalar (note w^T A = v^T since A is symmetric). This calculation involves an O(N^2) operation to compute v, and a series of O(N^2) operations to compute, subtract, and add the various matrices. The reduction of the matrix to tridiagonal form therefore costs only O(N^3) operations, not O(N^4).
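The algebraic rewriting can be verified numerically. Below is a pure-Python consistency check, on a hypothetical 3x3 symmetric matrix, that the O(N^2) update formula agrees with the full O(N^3) product P A P:

```python
# Check: A - 2 w v^T - 2 v w^T + 4 (w^T v) w w^T equals P A P, where
# v = A w and P = I - 2 w w^T. Hypothetical symmetric test matrix.
import math

n = 3
A = [[4.0, 1.0, 2.0],
     [1.0, 3.0, 0.0],
     [2.0, 0.0, 1.0]]
w = [1.0 / math.sqrt(3.0)] * n   # any unit vector works for this identity

def matmul(X, Y):
    return [[sum(X[i][k] * Y[k][j] for k in range(n)) for j in range(n)]
            for i in range(n)]

P = [[(1.0 if i == j else 0.0) - 2.0 * w[i] * w[j] for j in range(n)]
     for i in range(n)]
full = matmul(matmul(P, A), P)   # the slow O(N^3) way

v = [sum(A[i][j] * w[j] for j in range(n)) for i in range(n)]  # v = A w
wv = sum(w[i] * v[i] for i in range(n))                        # scalar w^T v
fast = [[A[i][j] - 2.0 * w[i] * v[j] - 2.0 * v[i] * w[j]
         + 4.0 * wv * w[i] * w[j] for j in range(n)] for i in range(n)]

ok = all(abs(full[i][j] - fast[i][j]) < 1e-9
         for i in range(n) for j in range(n))
print(ok)  # True
```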

Diagonalization of a symmetric tridiagonal matrix (Recipes, 11.3) The diagonalization of a symmetric tridiagonal matrix can be accomplished by a series of ~N 2x2 rotations. Consider the rotation matrix R which equals the identity except for a 2x2 block
| C  S |
| -S C |
in the upper-left corner: a rotation through angle θ = cos^-1 C = sin^-1 S in the x_1-x_2 plane.

Diagonalization of a symmetric tridiagonal matrix Write the upper-left 2x2 block of the tridiagonal matrix M as
| t  u |
| u  v |
Applying this similarity transform, we obtain R^T M R, whose upper-left block is built from the elements t, u, v. [The full matrix diagram was lost in transcription.]

Diagonalization of a symmetric tridiagonal matrix To make the off-diagonal elements (R^T M R)_12 = (R^T M R)_21 zero, we require
t S C + u (C^2 - S^2) - v S C = 0
which gives
tan θ = [ (t - v) ± sqrt( (t - v)^2 + 4 u^2 ) ] / (2 u)
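As a final sketch, the rotation angle can be computed from the formula for tan θ and checked directly, using hypothetical block elements t, u, v:

```python
# One 2x2 rotation step: pick tan(theta) from the quadratic condition
# t*S*C + u*(C^2 - S^2) - v*S*C = 0, then verify the rotated off-diagonal
# element vanishes. t, u, v are hypothetical example values.
import math

t, u, v = 3.0, 1.0, 2.0   # upper-left block [[t, u], [u, v]]
x = ((t - v) + math.sqrt((t - v) ** 2 + 4.0 * u * u)) / (2.0 * u)  # tan(theta)
theta = math.atan(x)
C, S = math.cos(theta), math.sin(theta)

off_diag = t * S * C + u * (C * C - S * S) - v * S * C
print(abs(off_diag) < 1e-12)  # True
```

Either sign of the ± gives a valid rotation; in practice the root of smaller magnitude is usually preferred for numerical stability.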