Matrix Representation


Matrix representation: same basics as introduced already, a convenient method of working with vectors. Superposition: a complete set of vectors can be used to express any other vector, and a complete set of N orthonormal vectors can form other complete sets of N orthonormal vectors. For a Hermitian operator we can find a set of vectors satisfying A|u⟩ = λ|u⟩ (eigenvectors and eigenvalues). The matrix method finds the superposition of basis states that are eigenstates of a particular operator, and gets the eigenvalues.

Orthonormal basis set in an N-dimensional vector space: the |e_j⟩ are the basis vectors. Any N-dimensional vector can be written as

|x⟩ = Σ_{j=1}^{N} x_j |e_j⟩   with   x_j = ⟨e_j|x⟩.

To get this, project out of |x⟩ the piece that is along |e_j⟩,

|e_j⟩⟨e_j|x⟩,

then sum over all |e_j⟩:   |x⟩ = Σ_j |e_j⟩⟨e_j|x⟩.
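The expansion-by-projection recipe above is easy to check numerically. A minimal NumPy sketch, where the basis and the vector are made-up examples (the standard basis of C^3):

```python
import numpy as np

# An orthonormal basis of C^3: here simply the standard basis vectors e_j.
basis = [np.array([1.0, 0.0, 0.0]),
         np.array([0.0, 1.0, 0.0]),
         np.array([0.0, 0.0, 1.0])]

x = np.array([7.0, 5.0, 4.0])

# x_j = <e_j|x>: project out the piece of x along each basis vector...
coeffs = [np.vdot(e, x) for e in basis]

# ...then sum x_j |e_j> over all basis vectors to rebuild x.
x_rebuilt = sum(c * e for c, e in zip(coeffs, basis))

assert np.allclose(x_rebuilt, x)
```

The same two lines work for any orthonormal basis, e.g. a rotated one.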

Operator equation: |y⟩ = A|x⟩. Substituting the series in terms of basis vectors,

Σ_{j=1}^{N} y_j |e_j⟩ = A Σ_{j=1}^{N} x_j |e_j⟩ = Σ_{j=1}^{N} x_j A|e_j⟩.

Left-multiplying by ⟨e_i| gives

y_i = Σ_{j=1}^{N} ⟨e_i|A|e_j⟩ x_j.

The N² scalar products ⟨e_i|A|e_j⟩ are completely determined by A and the basis set {|e_j⟩}. There are N values of j for each y_i, and N different y_i.

Writing

a_ij = ⟨e_i|A|e_j⟩   (the matrix elements of A in the basis {|e_j⟩})

gives for the linear transformation

y_i = Σ_{j=1}^{N} a_ij x_j,   i = 1, 2, …, N.

In terms of the vector representatives

x = (x_1, x_2, …, x_N)ᵀ,   y = (y_1, y_2, …, y_N)ᵀ

(a set of numbers that gives you the vector when the basis is known), the set of N linear algebraic equations can be written as y = Ax (double underline means matrix). We know the a_ij because we know A and the {|e_j⟩}. Example:

Q = 7x̂ + 5ŷ + 4ẑ   is a vector;

its vector representative (must know the basis) is (7, 5, 4)ᵀ.

A is the array of coefficients, the matrix:

A = [ a_11  a_12  …  a_1N ]
    [ a_21  a_22  …  a_2N ]
    [  ⋮            ⋱  ⋮  ]
    [ a_N1  a_N2  …  a_NN ]

The a_ij are the elements of the matrix A. For vector representatives in a particular basis, y = Ax:

[ y_1 ]   [ a_11  a_12  …  a_1N ] [ x_1 ]
[ y_2 ] = [ a_21  a_22  …       ] [ x_2 ]
[  ⋮  ]   [  ⋮              ⋮   ] [  ⋮  ]
[ y_N ]   [ a_N1  a_N2  …  a_NN ] [ x_N ]

The product of the matrix A and the vector representative x is a new vector representative y with components

y_i = Σ_{j=1}^{N} a_ij x_j.

Matrix Properties, Definitions, and Rules. Two matrices A and B are equal, A = B, if a_ij = b_ij. The unit matrix 1 has elements (1)_ij = δ_ij, ones down the principal diagonal; it gives the identity transformation,

y_i = Σ_{j=1}^{N} δ_ij x_j = x_i.

The zero matrix 0 satisfies 0x = 0 and x + 0 = x.

Matrix multiplication. Consider the operator equations |y⟩ = A|x⟩ and |z⟩ = B|y⟩, so |z⟩ = BA|x⟩. Using the same basis for both transformations, z = By has components

z_k = Σ_{i=1}^{N} b_ki y_i,   B the matrix (b_ki).

Then z = By = BAx = Cx, where C = BA has elements

c_kj = Σ_{i=1}^{N} b_ki a_ij   (the law of matrix multiplication).

Example:

[ 2  3 ] [ 7  5 ]   [ 29  28 ]
[ 3  4 ] [ 5  6 ] = [ 41  39 ]
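The multiplication law and the 2×2 example above can be checked with NumPy, whose `@` operator computes exactly c_kj = Σ_i a_ki b_ij:

```python
import numpy as np

A = np.array([[2, 3],
              [3, 4]])
B = np.array([[7, 5],
              [5, 6]])

# c_kj = sum over i of a_ki b_ij -- the law of matrix multiplication.
C = A @ B
print(C)  # [[29 28]
          #  [41 39]]
```

Swapping the order (`B @ A`) gives a different result in general, illustrating that matrix multiplication is not commutative.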

Multiplication is associative: (AB)C = A(BC). Multiplication is NOT commutative except in special cases: in general AB ≠ BA. Matrix addition and multiplication by a complex number:

C = A + B   means   c_ij = a_ij + b_ij,   and   (cA)_ij = c a_ij.

Reciprocal of a product: (AB)⁻¹ = B⁻¹A⁻¹. For a matrix defined as A = (a_ij):

Transpose: Aᵀ = (a_ji), interchange rows and columns.
Complex conjugate: A* = (a*_ij), complex conjugate of each element.
Hermitian conjugate: A† = (a*_ji), complex conjugate transpose.
Inverse of a matrix: A⁻¹, with AA⁻¹ = A⁻¹A = 1 (the identity matrix), and

A⁻¹ = (transpose of the cofactor matrix, the matrix of signed minors) / det A.

If det A = 0, A is singular.

Rules:

(AB)ᵀ = BᵀAᵀ   the transpose of a product is the product of the transposes in reverse order
det Aᵀ = det A   the determinant of the transpose is the determinant
(AB)* = A*B*   the complex conjugate of a product is the product of the complex conjugates
det A* = (det A)*   the determinant of the complex conjugate is the complex conjugate of the determinant
(AB)† = B†A†   the Hermitian conjugate of a product is the product of the Hermitian conjugates in reverse order
det A† = (det A)*   the determinant of the Hermitian conjugate is the complex conjugate of the determinant

Definitions:

A = Aᵀ    symmetric
A = A†    Hermitian
A = A*    real
A = −A*   imaginary
A⁻¹ = A†  unitary
a_ij = a_ij δ_ij   diagonal

Powers of a matrix: A⁰ = 1, A¹ = A, A² = AA, …, and

e^A = 1 + A + A²/2! + ⋯

Column vector representative: a one-column matrix,

x = (x_1; x_2; …; x_N),

the vector representative in a particular basis. Then y = Ax becomes

[ y_1 ]   [ a_11  a_12  …  a_1N ] [ x_1 ]
[ y_2 ] = [ a_21  a_22  …       ] [ x_2 ]
[  ⋮  ]   [  ⋮              ⋮   ] [  ⋮  ]
[ y_N ]   [ a_N1  a_N2  …  a_NN ] [ x_N ]

The row vector x̃ = (x_1, x_2, …, x_N) is the transpose of the column vector. From y = Ax:

ỹ = x̃Aᵀ   (transpose)
y† = x†A†   (Hermitian conjugate)

Change of Basis. Given an orthonormal basis {|e_i⟩},

⟨e_i|e_j⟩ = δ_ij,   i, j = 1, 2, …, N,

a superposition can form N new, linearly independent vectors

|e_i′⟩ = Σ_{k=1}^{N} u_ik |e_k⟩,   i = 1, 2, …, N,

with the u_ik complex numbers.

The new basis is orthonormal,

⟨e_i′|e_j′⟩ = δ_ij,

if the matrix U = (u_ik) of the coefficients in the superposition |e_i′⟩ = Σ_{k=1}^{N} u_ik |e_k⟩ meets the condition

UU† = U†U = 1,   i.e.,   U† = U⁻¹:

U is unitary (Hermitian conjugate = inverse). Important result: the new basis {|e_i′⟩} will be orthonormal if U, the transformation matrix, is unitary (see book and Errata and Addenda, linear algebra book).

A unitary transformation substitutes one orthonormal basis for another. The vector

|x⟩ = Σ_i x_i |e_i⟩ = Σ_i x_i′ |e_i′⟩

is a line in space (which may be a high-dimensionality abstract space) written in terms of two basis sets: the same vector, different basis. The unitary transformation U can be used to change a vector representative of |x⟩ in one orthonormal basis set to its vector representative in another orthonormal basis set. With x the vector representative in the unprimed basis and x′ the vector representative in the primed basis:

x′ = Ux    change from unprimed to primed basis
x = U†x′   change from primed to unprimed basis

Example. Consider the basis {x̂, ŷ, ẑ}. The vector s is a line in real space; in terms of the basis,

s = 7x̂ + 7ŷ + 1ẑ,

with vector representative in the basis {x̂, ŷ, ẑ}

s = (7, 7, 1)ᵀ.

Change basis by rotating the axis system 45° around ẑ. We can find the new representative of s, s′ = Us, where U is the rotation matrix

U = [  cos θ   sin θ   0 ]
    [ −sin θ   cos θ   0 ]
    [  0       0       1 ]

For a 45° rotation around z,

U = [  √2/2   √2/2   0 ]
    [ −√2/2   √2/2   0 ]
    [  0      0      1 ]

Then

s′ = [  √2/2   √2/2   0 ] [ 7 ]   [ 7√2 ]
     [ −√2/2   √2/2   0 ] [ 7 ] = [ 0   ]
     [  0      0      1 ] [ 1 ]   [ 1   ]

is the vector representative of s in the basis {e′}: the same vector, but in the new basis. Its properties are unchanged, for example the length of the vector:

(s·s)^(1/2) = (49 + 49 + 1)^(1/2) = (99)^(1/2)
(s′·s′)^(1/2) = (2·49 + 0 + 1)^(1/2) = (99)^(1/2)
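The rotation example can be reproduced in a few lines of NumPy, confirming both the new representative and the invariance of the length:

```python
import numpy as np

theta = np.pi / 4  # 45 degrees about the z axis
U = np.array([[ np.cos(theta), np.sin(theta), 0.0],
              [-np.sin(theta), np.cos(theta), 0.0],
              [ 0.0,           0.0,           1.0]])

s = np.array([7.0, 7.0, 1.0])
s_prime = U @ s  # representative of the same vector in the rotated basis

# Same vector, new basis: components change, length does not.
assert np.allclose(s_prime, [7 * np.sqrt(2), 0.0, 1.0])
assert np.isclose(np.linalg.norm(s), np.sqrt(99))
assert np.isclose(np.linalg.norm(s_prime), np.sqrt(99))
```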

We can go back and forth between the representatives of a vector x:

x′ = Ux    change from unprimed to primed basis
x = U†x′   change from primed to unprimed basis

These are the components of x in the different bases.

Consider the linear transformation y = Ax (operator equation |y⟩ = A|x⟩). In the basis {e} we can write

y_i = Σ_j a_ij x_j,   or   y = Ax.

Change to the new orthonormal basis {e′} using U:

y′ = Uy = UAx = UAU†x′,

or y′ = A′x′, with the matrix A′ given by

A′ = UAU† = UAU⁻¹   (because U is unitary).

Extremely important:

A′ = UAU⁻¹

This changes the matrix representing an operator in one orthonormal basis into the equivalent matrix in a different orthonormal basis. It is called a similarity transformation.

Relations are unchanged by a change of basis. In the basis {e}: y = Ax, AB = C. Going into the basis {e′}: y′ = A′x′, A′B′ = C′. Example: from AB = C,

U(AB)U⁻¹ = UCU⁻¹.

We can insert U⁻¹U = 1 between A and B:

(UAU⁻¹)(UBU⁻¹) = UCU⁻¹.

Therefore A′B′ = C′.

There is an isomorphism between operators in the abstract vector space and their matrix representatives. Because of the isomorphism it is not necessary to distinguish abstract vectors and operators from their matrix representatives: the matrices (for operators) and the representatives (for vectors) can be used in place of the real things.

Hermitian Operators and Matrices. A Hermitian operator satisfies

⟨x|A|y⟩ = ⟨y|A|x⟩*.

A Hermitian matrix satisfies

A = A†

(complex conjugate transpose; Hermitian conjugate).

Theorem (for proof see Powell and Craseman, pp. 33–37, or a linear algebra book). For a Hermitian operator A in a linear vector space of N dimensions, there exists an orthonormal basis

|U¹⟩, |U²⟩, …, |Uᴺ⟩

relative to which A is represented by a diagonal matrix

A′ = [ λ_1           ]
     [     λ_2       ]
     [         ⋱     ]
     [           λ_N ]

The vectors |Uⁱ⟩ and the corresponding real numbers λ_i are the solutions of the eigenvalue equation

A|Uⁱ⟩ = λ_i|Uⁱ⟩,

and there are no others.

Application of the theorem. The operator A is represented by a matrix A in some basis {|e_i⟩}; the basis is any convenient basis. In general, the matrix will not be diagonal. There exists some new basis, the eigenvectors {|Uⁱ⟩}, in which A′ represents the operator and is diagonal, with the eigenvalues λ_i on the diagonal. Getting from {|e_i⟩} to {|Uⁱ⟩} is a unitary transformation:

A′ = UAU⁻¹.

A similarity transformation takes the matrix in an arbitrary basis into the diagonal matrix with the eigenvalues on the diagonal.
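In practice this diagonalization is done numerically. A minimal sketch using `numpy.linalg.eigh`, which for a Hermitian matrix returns real eigenvalues and a unitary matrix whose columns are orthonormal eigenvectors; the 2×2 Hermitian matrix here is a made-up example:

```python
import numpy as np

# A Hermitian matrix in some arbitrary orthonormal basis (made-up example).
A = np.array([[2.0, 1.0 - 1.0j],
              [1.0 + 1.0j, 3.0]])

eigvals, V = np.linalg.eigh(A)  # real eigenvalues, unitary eigenvector matrix

# The similarity transformation V^dagger A V is diagonal,
# with the eigenvalues on the diagonal.
A_diag = V.conj().T @ A @ V
assert np.allclose(A_diag, np.diag(eigvals))
assert np.allclose(V.conj().T @ V, np.eye(2))  # V is unitary
```

In the slide's notation the transformation matrix is U = V†, so that A′ = UAU⁻¹ is the same diagonal matrix.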

Matrices and Q.M. Previously we represented the state of the system by a vector in an abstract vector space and dynamical variables by linear operators; operators produce linear transformations, |y⟩ = A|x⟩. Real dynamical variables (observables) are represented by Hermitian operators, and observables are eigenvalues of Hermitian operators:

A|S⟩ = α|S⟩.

Solution of the eigenvalue problem gives the eigenvalues and eigenvectors.

Matrix representation: Hermitian operators are replaced by Hermitian matrix representations, A = A†. In the proper basis, A′ is the diagonal Hermitian matrix and the diagonal matrix elements are the eigenvalues (observables). A suitable transformation

A′ = UAU⁻¹

takes A (arbitrary basis) into A′ (diagonal; the eigenvector basis); U takes the arbitrary basis into the eigenvectors. Diagonalization of the matrix gives the eigenvalues and eigenvectors. The matrix formulation is another way of dealing with operators and solving eigenvalue problems.

All rules about kets, operators, etc., still apply. Example: two Hermitian matrices A and B can be simultaneously diagonalized by the same unitary transformation if and only if they commute. All of these ideas about matrices are also true for infinite-dimensional matrices.

Example: the Harmonic Oscillator. We have already solved this using the occupation number representation; in the kets and bras |n⟩ it is already diagonal. In reduced units,

H = ½(P² + x²) = ½(aa† + a†a),

with

a|n⟩ = √n |n−1⟩,   a†|n⟩ = √(n+1) |n+1⟩,

so the matrix elements of a are ⟨n−1|a|n⟩ = √n:

a = [ 0  √1  0   0   ⋯ ]
    [ 0  0   √2  0   ⋯ ]
    [ 0  0   0   √3  ⋯ ]
    [ 0  0   0   0   ⋯ ]
    [ ⋮              ⋱ ]

Similarly,

a† = [ 0   0   0   0  ⋯ ]
     [ √1  0   0   0  ⋯ ]
     [ 0   √2  0   0  ⋯ ]
     [ 0   0   √3  0  ⋯ ]
     [ ⋮              ⋱ ]

With H = ½(aa† + a†a), the first product is

aa† = diag(1, 2, 3, 4, …).

The second product in H = ½(aa† + a†a) is

a†a = diag(0, 1, 2, 3, …).

Adding the matrices aa† and a†a and multiplying by ½ gives H:

H = diag(1/2, 3/2, 5/2, 7/2, …).

The matrix is diagonal, with the eigenvalues on the diagonal. In normal units the matrix would be multiplied by ħω. This example shows the idea, but not how to diagonalize a matrix when you don't already know the eigenvectors.
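The ladder-operator matrices above can be built and combined numerically once the infinite matrices are truncated. A sketch in reduced units; the truncation size N = 6 is an arbitrary choice, and the last diagonal entry of H is corrupted by the truncation:

```python
import numpy as np

N = 6  # truncate the infinite matrices at 6 levels

# a has sqrt(1), sqrt(2), ... on the superdiagonal; a^dagger is its transpose.
a = np.diag(np.sqrt(np.arange(1, N)), k=1)
adag = a.T

# H = (1/2)(a a^dagger + a^dagger a) in reduced units
# (multiply by hbar*omega for normal units).
H = 0.5 * (a @ adag + adag @ a)

# H is diagonal with eigenvalues n + 1/2 on the diagonal;
# the final entry is wrong because the truncation cuts off a a^dagger.
assert np.allclose(H, np.diag(np.diag(H)))
assert np.allclose(np.diag(H)[:-1], np.arange(N - 1) + 0.5)
```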

Diagonalization. The eigenvalue equation A|u⟩ = λ|u⟩ becomes

Au = λu,

with A the matrix representing the operator, u the representative of the eigenvector, and λ the eigenvalue. In terms of the components,

Σ_{j=1}^{N} (a_ij − λδ_ij) u_j = 0,   i = 1, 2, …, N.

This represents a system of equations:

(a_11 − λ)u_1 + a_12 u_2 + a_13 u_3 + ⋯ = 0
a_21 u_1 + (a_22 − λ)u_2 + a_23 u_3 + ⋯ = 0
a_31 u_1 + a_32 u_2 + (a_33 − λ)u_3 + ⋯ = 0
⋮

We know the a_ij. We don't know λ (the eigenvalues) or the u_i (the vector representatives, one for each eigenvalue).

Besides the trivial solution

u_1 = u_2 = ⋯ = u_N = 0,

a solution only exists if the determinant of the coefficients of the u_i vanishes:

| a_11 − λ   a_12       a_13      ⋯ |
| a_21       a_22 − λ   a_23      ⋯ | = 0
| a_31       a_32       a_33 − λ  ⋯ |

Expanding the determinant gives an Nth-degree equation for the unknown λ's (the eigenvalues): we know the a_ij, we don't know the λ's. Then, substituting one eigenvalue at a time into the system of equations, the u_i (eigenvector representatives) are found. The N equations for the u's give only N − 1 conditions; use normalization,

u_1*u_1 + u_2*u_2 + ⋯ + u_N*u_N = 1.

Example: the Degenerate Two-State Problem. The basis consists of two time-independent, orthonormal kets |1⟩ and |2⟩; they are not eigenkets, and γ is the coupling. These equations define H:

H|1⟩ = E⁰|1⟩ + γ|2⟩
H|2⟩ = γ|1⟩ + E⁰|2⟩

The matrix elements are

H_11 = E⁰,   H_12 = γ,   H_21 = γ,   H_22 = E⁰,

and the Hamiltonian matrix is

H = [ E⁰  γ  ]
    [ γ   E⁰ ]

The corresponding system of equations is

(E⁰ − λ)c_1 + γ c_2 = 0
γ c_1 + (E⁰ − λ)c_2 = 0

These only have a solution if the determinant of the coefficients vanishes. Take the matrix

[ E⁰  γ  ]
[ γ   E⁰ ]

make it into a determinant, and subtract λ from the diagonal elements:

| E⁰ − λ   γ      |
| γ        E⁰ − λ | = 0

Expanding,

(E⁰ − λ)² − γ² = λ² − 2E⁰λ + (E⁰)² − γ² = 0,

with energy eigenvalues

λ± = E⁰ ± γ.

Dimer splitting (energy-level diagram): excited state E₊ = E⁰ + γ, ground state E₋ = E⁰ − γ.
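The eigenvalues E⁰ ± γ can be checked numerically; the values chosen for E⁰ and γ below are made up:

```python
import numpy as np

E0, gamma = 1.0, 0.2  # made-up values for the energy and the coupling

H = np.array([[E0, gamma],
              [gamma, E0]])

eigvals = np.linalg.eigvalsh(H)  # eigenvalues in ascending order
assert np.allclose(eigvals, [E0 - gamma, E0 + gamma])
```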

To obtain the eigenvectors, use the system of equations for each eigenvalue. Write

|+⟩ = a₊|1⟩ + b₊|2⟩
|−⟩ = a₋|1⟩ + b₋|2⟩

for the eigenvectors associated with λ₊ and λ₋. (a₊, b₊) and (a₋, b₋) are the vector representatives of |+⟩ and |−⟩ in the {|1⟩, |2⟩} basis set. We want to find these.

First, for the eigenvalue λ₊ = E⁰ + γ, write out the system of equations:

(H_11 − λ₊)a₊ + H_12 b₊ = 0
H_21 a₊ + (H_22 − λ₊)b₊ = 0

The matrix elements of H are H_11 = E⁰, H_12 = γ, H_21 = γ, H_22 = E⁰, so

(E⁰ − E⁰ − γ)a₊ + γ b₊ = 0
γ a₊ + (E⁰ − E⁰ − γ)b₊ = 0

The result is

−γ a₊ + γ b₊ = 0
γ a₊ − γ b₊ = 0

An equivalent way to get the equations is to use the matrix form:

[ E⁰  γ  ] [ a₊ ]        [ a₊ ]
[ γ   E⁰ ] [ b₊ ]  = λ₊  [ b₊ ]

Substitute λ₊ = E⁰ + γ. Multiplying the matrix by the column vector representative gives the equations

−γ a₊ + γ b₊ = 0
γ a₊ − γ b₊ = 0

The two equations

−γ a₊ + γ b₊ = 0,   γ a₊ − γ b₊ = 0

are identical: a₊ = b₊. We always get N − 1 conditions for the N unknown components; the normalization condition gives the necessary additional equation,

a₊² + b₊² = 1.

Then a₊ = b₊ = 1/√2, and

|+⟩ = (1/√2)|1⟩ + (1/√2)|2⟩

is the eigenvector in terms of the basis set.

For the eigenvalue λ₋ = E⁰ − γ, using the matrix form to write out the equations,

[ E⁰  γ  ] [ a₋ ]             [ a₋ ]
[ γ   E⁰ ] [ b₋ ]  = (E⁰ − γ) [ b₋ ]

Substituting gives

γ a₋ + γ b₋ = 0,   γ a₋ + γ b₋ = 0.

These equations give a₋ = −b₋. Using normalization, a₋ = 1/√2 and b₋ = −1/√2. Therefore

|−⟩ = (1/√2)|1⟩ − (1/√2)|2⟩.

Can diagonalize H by the transformation

H′ = U⁻¹HU,

taking H (not diagonal) into H′ (diagonal). The transformation matrix consists of the representatives of the eigenvectors in the original basis, as columns:

U = [ a₊  a₋ ]   [ 1/√2   1/√2 ]
    [ b₊  b₋ ] = [ 1/√2  −1/√2 ]

U⁻¹ = U† = [ 1/√2   1/√2 ]
           [ 1/√2  −1/√2 ]

(the complex conjugate transpose; here U is real and is its own inverse).

Then

H′ = [ 1/√2   1/√2 ] [ E⁰  γ  ] [ 1/√2   1/√2 ]
     [ 1/√2  −1/√2 ] [ γ   E⁰ ] [ 1/√2  −1/√2 ]

Factoring out 1/√2, one from each eigenvector matrix,

H′ = ½ [ 1   1 ] [ E⁰  γ  ] [ 1   1 ]
       [ 1  −1 ] [ γ   E⁰ ] [ 1  −1 ]

After matrix multiplication,

H′ = ½ [ 1   1 ] [ E⁰ + γ   E⁰ − γ ]
       [ 1  −1 ] [ γ + E⁰   γ − E⁰ ]

and more matrix multiplication gives

H′ = [ E⁰ + γ   0      ]
     [ 0        E⁰ − γ ]

diagonal, with the eigenvalues on the diagonal.
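The explicit similarity transformation worked out above can be verified numerically; the values of E⁰ and γ below are made up:

```python
import numpy as np

E0, gamma = 1.0, 0.2  # made-up values for the energy and the coupling

H = np.array([[E0, gamma],
              [gamma, E0]])

# Columns are the eigenvector representatives (a+, b+) and (a-, b-).
U = np.array([[1.0,  1.0],
              [1.0, -1.0]]) / np.sqrt(2)

H_prime = np.linalg.inv(U) @ H @ U  # here U is real and its own inverse

# Diagonal with the eigenvalues E0 + gamma, E0 - gamma on the diagonal.
assert np.allclose(H_prime, np.diag([E0 + gamma, E0 - gamma]))
```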