On common eigenbases of commuting operators


Paolo Glorioso

In this note we try to answer the question: given two commuting Hermitian operators A and B, is each eigenbasis of A also an eigenbasis of B? We take this occasion to review the mathematical results needed to explore the answer to such a question. Moreover, we will assume that the reader is familiar with the concepts of vector space, vector subspace, linear combination, linear independence, diagonalization, inner product, and basis. These concepts can be found in Sections 1.1, 1.2 and 1.4 of [1]; a less specific treatment of the following is given in Section 1.8 therein.

Consider an operator A, acting on vectors belonging to a vector space V. We will make use of the following definitions:

i) Eigenvalue: a constant λ ∈ C is called an eigenvalue of A if it satisfies the following equation:

    A v = λ v,    (1)

for some nonvanishing vector v ∈ V.

ii) Eigenvector: a nonvanishing vector v ∈ V is an eigenvector of A if it satisfies Equation (1) for some λ ∈ C. Depending on the context, v is also called an eigenstate or an eigenfunction.

iii) Nondegenerate eigenvalue/eigenvector: an eigenvalue λ of A is called nondegenerate if Equation (1) is satisfied by only one vector v, up to an overall complex number, i.e. all the solutions of Equation (1) are of the form αv, where α ∈ C. This is equivalent to saying that the eigenvalue λ corresponds to only one eigenstate of A. Similarly, an eigenvector is called nondegenerate if it is the only vector, up to an overall complex number, that satisfies Equation (1) for some λ. Degenerate eigenvalues/eigenvectors are those which do not satisfy this uniqueness property. Note that some references use the expression "degenerate state" only with respect to the energy eigenvalue equation Êv = εv, i.e. referring only to the energy operator; see for example [2], and see below for more details.

iv) Eigenbasis: a set of vectors V = {v_i}, such that each v_i is an eigenvector of A and V is a basis of V, is called an eigenbasis of V with respect to A.
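As a small numerical aside (not part of the original note), the definitions above can be checked directly with NumPy: numpy.linalg.eigh returns the eigenvalues and an orthonormal eigenbasis of a Hermitian matrix, and each column satisfies Equation (1). The matrix A below is just an arbitrary 2 × 2 Hermitian example.

```python
# Minimal numerical illustration of definitions (i)-(iv); not part of the
# original note.  We take a small Hermitian matrix A, ask NumPy for its
# eigenvalues and eigenvectors, and check Equation (1), A v = lambda v.
import numpy as np

A = np.array([[2.0, 1.0j],
              [-1.0j, 3.0]])          # Hermitian: A equals its conjugate transpose
assert np.allclose(A, A.conj().T)

eigvals, eigvecs = np.linalg.eigh(A)  # eigh is specialised to Hermitian matrices

for k in range(A.shape[0]):
    lam = eigvals[k]                  # eigenvalue lambda_k (real, since A is Hermitian)
    v = eigvecs[:, k]                 # corresponding eigenvector v_k
    assert np.allclose(A @ v, lam * v)    # Equation (1): A v = lambda v

# The columns of eigvecs form an orthonormal eigenbasis of C^2 with respect to A.
assert np.allclose(eigvecs.conj().T @ eigvecs, np.eye(2))
print(eigvals)
```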

In this note we will refer to Hermitian operators, where A is Hermitian if, for any u, v ∈ V, (u, Av) = (Au, v), and (u, v) is the scalar product in V. There are two reasons why we consider Hermitian operators. First, because in Quantum Mechanics all observables are postulated to be Hermitian (recall that this is implied by the requirement of having real eigenvalues). Second, because Hermitian operators are diagonalizable, i.e. they admit a basis in which they have a diagonal form, which is then an eigenbasis. See the corresponding theorem in Chapter 1 of [1] for this point.

Proposition 1. Let A be a Hermitian operator with only nondegenerate eigenvalues, and let V = {v_i} and W = {w_i} be two eigenbases of A. Then V is obtained from W by permutations and multiplications by complex numbers of the eigenvectors of W, i.e., for each v_i ∈ V, there are w_j ∈ W and α_j ∈ C such that v_i = α_j w_j. In other words, V and W contain the same eigenstates.

Proof. Let λ_i and µ_i be the eigenvalues of v_i and w_i, respectively, i.e.

    A v_i = λ_i v_i,    A w_i = µ_i w_i.    (2)

Since W is a basis, we can write any v_i ∈ V as a linear combination of the w_j's,

    v_i = Σ_j α_j w_j,    (3)

where α_j ∈ C. Then,

    λ_i v_i = A v_i = A Σ_j α_j w_j = Σ_j α_j A w_j = Σ_j α_j µ_j w_j,

where we used the linearity of A. Comparing the first and the last members of the above equation with Equation (3), we get

    Σ_j λ_i α_j w_j = Σ_j µ_j α_j w_j   ⇒   Σ_j (µ_j − λ_i) α_j w_j = 0,

and using the fact that the w_j's are linearly independent, we obtain

    (µ_j − λ_i) α_j = 0.

This is a set of equations, labelled by j. Each of them has two solutions: either α_j = 0, or µ_j = λ_i. Since by hypothesis the µ_j's are all different from each other, being nondegenerate eigenvalues, only one of the above equations can satisfy µ_j = λ_i. Thus all α_j's but one are zero, and we obtain, from Equation (3),

    v_i = α_j w_j,

for one value of j.
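The statement of Proposition 1 can also be checked numerically. The sketch below (an added illustration, not from the original note; the helper name is ours) computes two eigenbases of the same nondegenerate Hermitian matrix with two different routines and verifies that each eigenvector of one basis coincides with exactly one eigenvector of the other, up to an overall complex factor.

```python
# Numerical check of Proposition 1: for a Hermitian A with nondegenerate
# eigenvalues, any two eigenbases agree up to permutation and multiplication
# by complex numbers.
import numpy as np

A = np.array([[1.0, 0.5j, 0.0],
              [-0.5j, 2.0, 0.3],
              [0.0, 0.3, 4.0]])       # Hermitian, with three distinct eigenvalues

_, V = np.linalg.eigh(A)              # one eigenbasis (columns of V)
_, W = np.linalg.eig(A)               # another routine, generally with a different
                                      # ordering and normalisation of the eigenvectors

def same_up_to_factor(v, w, tol=1e-10):
    """True if v = alpha * w for some complex alpha (both vectors nonzero)."""
    alpha = (w.conj() @ v) / (w.conj() @ w)
    return np.allclose(v, alpha * w, atol=tol)

# Each column of V must coincide, up to an overall complex factor, with
# exactly one column of W: this is the content of Proposition 1.
for i in range(3):
    matches = [j for j in range(3) if same_up_to_factor(V[:, i], W[:, j])]
    assert len(matches) == 1
print("each eigenvector of V matches exactly one eigenvector of W")
```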

This tells us that each vector of V is also contained in W, up to an overall complex number, which in the above equation is given by α_j. Of course, we can also reverse the whole reasoning and show that each vector of W is contained in V. Therefore we can say that the two eigenbases are the same, up to permutations and multiplications by complex numbers of their vectors, and we are done.

Proposition 2. Suppose that A and B are Hermitian operators with vanishing commutator, i.e.

    [A, B] = 0.

Then A and B share a common eigenbasis.

Proof. Consider an eigenbasis of A, V = {v_i}, with λ_i the eigenvalue associated to v_i. Then, for any v_i,

    A B v_i = B A v_i = λ_i B v_i,    (4)

i.e., if B v_i ≠ 0, then B v_i is an eigenvector of A associated to the same eigenvalue λ_i as v_i. We have two cases to consider:

λ_i nondegenerate: B v_i can differ from v_i only by a constant factor, i.e. B v_i = µ_i v_i, and thus v_i is also an eigenvector of B, with eigenvalue µ_i.

λ_i degenerate: in this case there are several vectors associated to λ_i, which we denote by w_j, j = 1, ..., N, where N is the degeneracy of λ_i. Since B w_j is still an eigenvector of A with eigenvalue λ_i, we can write it as a linear combination of the w_j's. For this reason, the operator B can be seen as acting internally in the subspace spanned by the w_j's. Since B is Hermitian, it is Hermitian in particular in this subspace: indeed, for any u_1, u_2 belonging to such subspace, (u_1, B u_2) = (B u_1, u_2), just because u_1, u_2 belong also to the big vector space V. Now, since B is Hermitian in this subspace we can diagonalize it, or in other words we can choose a basis of eigenvectors of B which span this subspace. These new vectors are still eigenvectors of A, being linear combinations of the w_j's, all with eigenvalue λ_i, and thus the Proposition is proved.

To understand the last proof better, we can view B as an infinite matrix, in the sense explained by the following picture:

    B = ( (v_1, B v_1)   (v_1, B v_2)   ... )
        ( (v_2, B v_1)   (v_2, B v_2)   ... )
        (      ...            ...       ... ),

where the v_i's belong to the basis with respect to which B is represented.
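Before continuing with this block picture, note that the proof of Proposition 2 is constructive and translates directly into a short numerical routine. The sketch below is an added illustration (the function name and the example matrices are ours, not the note's): it groups the eigenvectors of A by eigenvalue and diagonalizes B inside each degenerate eigenspace.

```python
# Sketch of the construction in the proof of Proposition 2: given commuting
# Hermitian A and B, diagonalize B inside each eigenspace of A.
import numpy as np

def common_eigenbasis(A, B, tol=1e-9):
    """Columns of the returned matrix are joint eigenvectors of the commuting
    Hermitian matrices A and B."""
    assert np.allclose(A @ B, B @ A, atol=tol)   # [A, B] = 0
    eigvals_A, V = np.linalg.eigh(A)             # eigenbasis of A, eigenvalues sorted
    columns = []
    i, n = 0, len(eigvals_A)
    while i < n:
        j = i
        while j < n and abs(eigvals_A[j] - eigvals_A[i]) < tol:
            j += 1                               # group (numerically) equal eigenvalues
        P = V[:, i:j]                            # basis of the eigenspace of lambda_i
        B_block = P.conj().T @ B @ P             # B restricted to this subspace: Hermitian,
        _, U = np.linalg.eigh(B_block)           # hence diagonalizable
        columns.append(P @ U)                    # still eigenvectors of A, now also of B
        i = j
    return np.hstack(columns)

# A has a doubly degenerate eigenvalue; B is not diagonal in the standard basis.
A = np.diag([2.0, 2.0, 5.0])
B = np.array([[1.0, 1.0, 0.0],
              [1.0, 1.0, 0.0],
              [0.0, 0.0, 4.0]])
S = common_eigenbasis(A, B)
print(np.round(S.conj().T @ A @ S, 6))   # diagonal
print(np.round(S.conj().T @ B @ S, 6))   # diagonal too
```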

If we view B in the eigenbasis V introduced at the beginning of the proof, we have

    B = ( B_1   0    0   ... )
        (  0   B_2   0   ... )
        (  0    0   B_3  ... )
        ( ...  ...  ...  ... ),    (5)

where each block B_i is a submatrix representing the internal action of B on subspaces like the one spanned by the w_j's. Each block refers to an eigenvalue λ_i of A; if λ_i is nondegenerate the block is just a 1 × 1 matrix, and if λ_i is degenerate with degeneracy N then the block is N × N. What we did in the degenerate case of the proof was just to show that the corresponding block B_i is a Hermitian matrix, and thus diagonalizable.

Finally, note that if we know that A and B share a common eigenbasis, then their commutator is zero. Indeed, sharing a common eigenbasis means that in such a basis they are both represented as diagonal operators, and thus they commute. This consideration allows us to state a more powerful statement than the above Proposition:

Proposition 3. Let A and B be two Hermitian operators. Then the following two statements are equivalent:

i) A and B possess a common eigenbasis.

ii) A and B commute.

Armed with the mathematical results we have found, we shall now answer the following question: given two commuting Hermitian operators A and B, is each eigenbasis of A also an eigenbasis of B? The short answer is: it depends.

Consider the case where both A and B have only nondegenerate eigenvalues. Then, by virtue of Propositions 1 and 2, each eigenbasis of A is also an eigenbasis of B. Indeed, by Proposition 2 we can consider a common eigenbasis of A and B, which we denote by V. By Proposition 1 we know that we would exhaust all the eigenbases of A by permuting and multiplying by complex numbers the vectors of V, and the same holds for the eigenbases of B. Thus, in this case, the answer to the above question is YES.

In all the other cases, the answer is NO. Consider e.g. the case where A has some degenerate eigenvalues. Then, in some eigenbasis of A, B would look like Equation (5), which is not in diagonal form if some of the blocks B_i are nondiagonal. For a neat example, we can consider the following matrices:

    A = ( 3  0  0 )        B = ( 3  0   0 )
        ( 0  1  0 ),           ( 0  0  -i ).
        ( 0  0  1 )            ( 0  i   0 )

Note that [A, B] = 0, essentially because the lower diagonal block of A is a scalar matrix (i.e. a matrix proportional to the identity), and thus it must commute with the corresponding lower diagonal block of B. Moreover, B is Hermitian, and thus diagonalizable.
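As a quick numerical sanity check (an added aside, using the matrices as reconstructed above), one can verify the three claims just made: A and B commute, B is Hermitian, and yet B is not diagonal in this eigenbasis of A.

```python
# Check of the claims for the explicit example: [A, B] = 0, B is Hermitian,
# and B is not diagonal in this eigenbasis of A.
import numpy as np

A = np.diag([3.0, 1.0, 1.0]).astype(complex)
B = np.array([[3.0, 0.0,  0.0],
              [0.0, 0.0, -1.0j],
              [0.0, 1.0j, 0.0]])

assert np.allclose(A @ B, B @ A)                  # the commutator vanishes
assert np.allclose(B, B.conj().T)                 # B is Hermitian
assert not np.allclose(B, np.diag(np.diag(B)))    # ... but B is not diagonal here
print("[A, B] = 0, B Hermitian, B nondiagonal in this eigenbasis of A")
```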

Note that A and B are represented in terms of an eigenbasis of A, and that 1 is a degenerate eigenvalue of A. We denote this eigenbasis by V = {e_1, e_2, e_3}, where

    e_1 = (1, 0, 0)^T,   e_2 = (0, 1, 0)^T,   e_3 = (0, 0, 1)^T.

Following the proof of Proposition 2, we only need to find two linear combinations of e_2, e_3 such that the lower block of B assumes a diagonal form. Working out the standard diagonalization procedure, we find the common eigenbasis W = {e_1, v_2, v_3}, where

    v_2 = (1/√2)(e_2 + i e_3),   v_3 = (1/√2)(e_2 − i e_3),

and the above matrices assume the following form:

    A = ( 3  0  0 )        B = ( 3  0   0 )
        ( 0  1  0 ),           ( 0  1   0 ).
        ( 0  0  1 )            ( 0  0  -1 )

For a more physical example, consider the hydrogen atom. The energy eigenstates are commonly labelled by

    |n, l, m⟩,    (6)

where n is the energy level, l is the label associated to L², and m is the label associated to L_z through L_z ψ = mℏ ψ. In this basis, E, L² and L_z are all diagonal. Using the fact that the states (6) are degenerate, we want to construct a new basis where L_z is nondiagonal. One example is V = { |n, l, s⟩ }, where

    |n, l, s⟩ = (1/√(l + s + 1)) Σ_{m = −l}^{s} |n, l, m⟩,

and where s = −l, ..., l. Clearly, V is a complete set of eigenstates of E and L², but its elements are not eigenstates of L_z; indeed

    L_z |n, l, s⟩ = (1/√(l + s + 1)) Σ_{m = −l}^{s} L_z |n, l, m⟩ = (1/√(l + s + 1)) Σ_{m = −l}^{s} mℏ |n, l, m⟩,

which clearly does not correspond to Equation (1). Another simple construction we can make with the hydrogen atom is the following. Consider, instead of (6), the basis given by the eigenvectors of E, L², and L_x (instead of L_z), and denote it by

    W = { |n, l, m_x⟩ }.    (7)

Thus we have that L_x |n, l, m_x⟩ = m_x ℏ |n, l, m_x⟩.
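The sketch below (an added numerical aside, restricted to a single n and to the l = 1 multiplet, with ℏ set to 1) anticipates the point made next: it builds the standard l = 1 angular-momentum matrices, obtains the states (7) by diagonalizing L_x, and shows that L² remains diagonal in this basis while L_z does not.

```python
# Illustration of the basis (7), restricted to one n and the l = 1 multiplet;
# hbar is set to 1.
import numpy as np

hbar = 1.0
s2 = np.sqrt(2.0)
# Standard l = 1 angular-momentum matrices in the basis m = +1, 0, -1.
Lx = hbar / s2 * np.array([[0, 1, 0], [1, 0, 1], [0, 1, 0]], dtype=complex)
Ly = hbar / s2 * np.array([[0, -1j, 0], [1j, 0, -1j], [0, 1j, 0]])
Lz = hbar * np.diag([1.0, 0.0, -1.0]).astype(complex)
L2 = Lx @ Lx + Ly @ Ly + Lz @ Lz                  # = l(l+1) hbar^2 = 2 hbar^2 times identity

assert np.allclose(Lz @ Lx - Lx @ Lz, 1j * hbar * Ly)   # [L_z, L_x] = i hbar L_y

mx, U = np.linalg.eigh(Lx)        # columns of U are the |n, l, m_x> states of (7)

print(np.round(U.conj().T @ L2 @ U, 3))   # L^2 stays diagonal (2 times the identity)
print(np.round(U.conj().T @ Lz @ U, 3))   # L_z is no longer diagonal: zero diagonal,
                                          # off-diagonal entries of magnitude hbar/sqrt(2)
```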

However, L_z will not be diagonal in this basis. Indeed, suppose that, for all m_x, there exists α_{m_x} such that L_z |n, l, m_x⟩ = α_{m_x} |n, l, m_x⟩. Then, on any eigenstate of the form (7),

    iℏ L_y |n, l, m_x⟩ = [L_z, L_x] |n, l, m_x⟩ = (α_{m_x} m_x ℏ − m_x ℏ α_{m_x}) |n, l, m_x⟩ = 0,

which we know to be false. Therefore W is not an eigenbasis for L_z, but it keeps being an eigenbasis for E and L². In the basis given by the states (6), L_z is diagonal, with the eigenvalues mℏ on the diagonal, whereas in the basis (7) it is not: within an l = 1 multiplet, for instance, its diagonal entries vanish and it has entries of magnitude ℏ/√2 connecting neighbouring values of m_x.

References

[1] R. Shankar, Principles of Quantum Mechanics (Kluwer, 1994).

[2] D.J. Griffiths, Introduction to Quantum Mechanics (Pearson Prentice Hall, 2005).

MIT OpenCourseWare, Quantum Physics I, Spring term. For information about citing these materials or our Terms of Use, visit the MIT OpenCourseWare site.
