Assignment

Arfken 5.. Show that Stirling's formula is an asymptotic expansion. The series is

$$\ln\Gamma(x) \sim \left(x-\tfrac12\right)\ln x - x + \tfrac12\ln 2\pi + \sum_{n=1}^{N}\frac{B_{2n}}{2n(2n-1)\,x^{2n-1}} + R_N(x),$$

with the remainder of the order of the first omitted term,

$$R_N(x) \sim \frac{B_{2N+2}}{(2N+2)(2N+1)\,x^{2N+1}},$$

for some $N$. The condition for an asymptotic series,

$$\lim_{x\to\infty} x^{2N-1}R_N(x) = \lim_{x\to\infty} \frac{B_{2N+2}}{(2N+2)(2N+1)\,x^{2}} = 0,$$

is thus met. We should also check that the series formally diverges. We can do that by noting that, for fixed $x$, $\lim_{N\to\infty} x^{2N-1}R_N(x) \neq 0$, or just use the ratio test on the series (using the representation of the Bernoulli numbers given in equation 5.5, $B_{2n} = (-1)^{n-1}\,2(2n)!\,\zeta(2n)/(2\pi)^{2n}$):

$$\left|\frac{a_{n+1}}{a_n}\right| = \frac{|B_{2n+2}|}{(2n+2)(2n+1)\,x^{2n+1}}\cdot\frac{2n(2n-1)\,x^{2n-1}}{|B_{2n}|} = \frac{(2n+2)!\,(2\pi)^{2n}\,\zeta(2n+2)}{(2n)!\,(2\pi)^{2n+2}\,\zeta(2n)}\cdot\frac{2n(2n-1)}{(2n+2)(2n+1)\,x^{2}} = \frac{2n(2n-1)}{(2\pi x)^{2}}\cdot\frac{\zeta(2n+2)}{\zeta(2n)},$$

which obviously $\to\infty$ as $n\to\infty$ (note that $\lim_{n\to\infty}\zeta(2n) = 1$). Thus the Stirling series is an asymptotic expansion.

Arfken 5.. Let's do both Fresnel integrals together. But first note that the infinite version of both of these integrals is

$$\int_0^\infty \cos\frac{\pi u^2}{2}\,du = \int_0^\infty \sin\frac{\pi u^2}{2}\,du = \frac12$$

(getting this from a table). So use this to define an asymptotic series:

$$C(x) + iS(x) = \int_0^x e^{i\pi u^2/2}\,du = \frac{1+i}{2} - \int_x^\infty e^{i\pi u^2/2}\,du.$$

Substituting $z = \pi u^2/2$, so that $du = dz/\sqrt{2\pi z}$,

$$\int_x^\infty e^{i\pi u^2/2}\,du = \frac{1}{\sqrt{2\pi}}\int_{\pi x^2/2}^\infty \frac{e^{iz}}{z^{1/2}}\,dz,$$

and integrating by parts repeatedly,

$$\int_a^\infty \frac{e^{iz}}{z^{1/2}}\,dz = \frac{i\,e^{ia}}{a^{1/2}} + \frac{1}{2i}\int_a^\infty \frac{e^{iz}}{z^{3/2}}\,dz = \frac{i\,e^{ia}}{a^{1/2}} + \frac{e^{ia}}{2\,a^{3/2}} - \frac{3}{4}\int_a^\infty \frac{e^{iz}}{z^{5/2}}\,dz = \cdots$$

With $a = \pi x^2/2$ this gives

$$C(x) + iS(x) \sim \frac{1+i}{2} - \frac{i\,e^{i\pi x^2/2}}{\pi x} - \frac{e^{i\pi x^2/2}}{\pi^2 x^3} + \frac{3i\,e^{i\pi x^2/2}}{\pi^3 x^5} + \cdots,$$

so that

$$C(x) \sim \frac12 + \frac{1}{\pi x}\sin\frac{\pi x^2}{2} - \frac{1}{\pi^2 x^3}\cos\frac{\pi x^2}{2} - \frac{3}{\pi^3 x^5}\sin\frac{\pi x^2}{2} + \cdots,$$

$$S(x) \sim \frac12 - \frac{1}{\pi x}\cos\frac{\pi x^2}{2} - \frac{1}{\pi^2 x^3}\sin\frac{\pi x^2}{2} + \frac{3}{\pi^3 x^5}\cos\frac{\pi x^2}{2} + \cdots$$
Thus the real part of this is $C(x)$ and the imaginary part is $S(x)$.

Arfken 5..5 For the series

$$P_\nu(z) \sim 1 + \sum_{n=1}^\infty (-1)^n\,\frac{\prod_{s=1}^{2n}\left[4\nu^2-(2s-1)^2\right]}{(2n)!\,(8z)^{2n}}, \qquad Q_\nu(z) \sim \sum_{n=1}^\infty (-1)^{n+1}\,\frac{\prod_{s=1}^{2n-1}\left[4\nu^2-(2s-1)^2\right]}{(2n-1)!\,(8z)^{2n-1}},$$

the remainder terms are, respectively,

$$R_n^P(z) = \sum_{k=n+1}^\infty (-1)^k\,\frac{\prod_{s=1}^{2k}\left[4\nu^2-(2s-1)^2\right]}{(2k)!\,(8z)^{2k}}, \qquad R_n^Q(z) = \sum_{k=n+1}^\infty (-1)^{k+1}\,\frac{\prod_{s=1}^{2k-1}\left[4\nu^2-(2s-1)^2\right]}{(2k-1)!\,(8z)^{2k-1}},$$

and both quantities $z^{2n}R_n(z)$ approach zero as $z\to\infty$, since the $z$ terms go like $z^{2n-2k}$ and $z^{2n-2k+1}$ and $k > n$. To demonstrate that both are formally divergent series, use the ratio test for $P_\nu(z)$:

$$\left|\frac{a_{n+1}}{a_n}\right| = \frac{\prod_{s=1}^{2n+2}\left[4\nu^2-(2s-1)^2\right]}{(2n+2)!\,(8z)^{2n+2}}\cdot\frac{(2n)!\,(8z)^{2n}}{\prod_{s=1}^{2n}\left[4\nu^2-(2s-1)^2\right]} = \frac{\left[4\nu^2-(4n+1)^2\right]\left[4\nu^2-(4n+3)^2\right]}{(2n+1)(2n+2)\,(8z)^2},$$

which, for a fixed $z$, goes to $\infty$ as $n\to\infty$. A similar calculation follows for $Q_\nu(z)$.

Arfken 5..8 We want to expand the integral

$$\int_0^\infty \frac{e^{-xv}}{1+v^2}\,dv = \frac{1}{x}\int_0^\infty \frac{e^{-u}}{1+u^2/x^2}\,du, \qquad \text{where } u = xv,$$

$$= \frac{1}{x}\int_0^\infty e^{-u}\sum_{k=0}^n (-1)^k\,\frac{u^{2k}}{x^{2k}}\,du = \sum_{k=0}^n \frac{(-1)^k}{x^{2k+1}}\int_0^\infty e^{-u}\,u^{2k}\,du = \sum_{k=0}^n (-1)^k\,\frac{(2k)!}{x^{2k+1}}.$$

Note that the book would seem to have an error; my guess is that the exponent in the integral was misprinted.
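The asymptotic behavior derived in 5..8 can be sanity-checked numerically: truncate the series $\sum_k (-1)^k (2k)!/x^{2k+1}$ and compare against a direct quadrature of the integral. The error should be bounded by the first omitted term, while the terms themselves eventually grow, signaling the formal divergence. A minimal sketch (the Simpson-rule helper and the choice $x = 10$ are my own):

```python
import math

def integrand(v, x):
    return math.exp(-x * v) / (1 + v * v)

def simpson(f, a, b, n):
    """Composite Simpson's rule with n (even) subintervals."""
    h = (b - a) / n
    s = f(a) + f(b)
    for i in range(1, n):
        s += (4 if i % 2 else 2) * f(a + i * h)
    return s * h / 3

x = 10.0
# "Exact" value: the integrand is negligible past v = 5 when x = 10.
numeric = simpson(lambda v: integrand(v, x), 0.0, 5.0, 20000)

def partial_sum(x, n):
    """Partial sum  sum_{k=0}^{n} (-1)^k (2k)! / x^(2k+1)."""
    return sum((-1) ** k * math.factorial(2 * k) / x ** (2 * k + 1)
               for k in range(n + 1))

# For an alternating asymptotic series the truncation error is bounded
# by the magnitude of the first omitted term (here the k = 4 term).
err = abs(numeric - partial_sum(x, 3))
first_omitted = math.factorial(8) / x ** 9
```

At $x = 10$ the four-term partial sum already agrees with the integral to within the $k=4$ term, even though the series as a whole diverges.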
Arfken.. (a (b (c ( ( + ( ( 9 Arfken..6a D D 4 Arfken..4 D 4 6 4 5 ( 5 6 ( + ( 8 9 (a If complex numbers can be represented by matrices, then we should be able to reproduce the basic arithmetic operations in a consistent manner. For instance addition of two complex numbers: (a+ib+(c+id becomes a b c d a + c b + d + b a d c (b + d a + c which, translated back to complex numbers, is (a + c + i(b + d as we would expect for an isomorphic representation. Likewise, multiplication of two complex numbers (a + ib (c + id becomes a b c d ac bd bd + ac b a d c (bc + ad bd + ac
which, translated back to complex numbers, is $(ac-bd)+i(ad+bc)$, which is complex multiplication. Thus the algebra of complex numbers is isomorphic to the algebra of matrices which have the form

$$\begin{pmatrix} a & b \\ -b & a \end{pmatrix}.$$

(b) The matrix corresponding to $(a+ib)^{-1}$ is just the inverse of the standard matrix. Finding the inverse, we get

$$\frac{1}{a^2+b^2}\begin{pmatrix} a & -b \\ b & a \end{pmatrix},$$

which in complex notation is $\dfrac{a-ib}{a^2+b^2}$, as expected.

Arfken ..6a Starting with a general matrix

$$A = \begin{pmatrix} x & y \\ z & w \end{pmatrix},$$

the demand that $A^2 = 1$ leads to four equations:

$$x^2 + yz = 1, \qquad y(x+w) = 0, \qquad z(x+w) = 0, \qquad w^2 + yz = 1.$$

One solution is the trivial solution $y = z = 0$, $x = w = \pm 1$ (i.e. $A = \pm 1$), which we discard. The other solution is $w = -x$ and $y = (1-x^2)/z$, with $x$ and $z$ arbitrary. The matrix becomes

$$A = \begin{pmatrix} x & (1-x^2)/z \\ z & -x \end{pmatrix},$$

and if we make the redefinitions $x = ab$ and $z = a$, we get the form in the text.

Arfken ..4 From the previous problem, we have the anti-commutation relations and can deduce the commutation relations between the Pauli matrices:

$$\sigma_i\sigma_j + \sigma_j\sigma_i = 2\delta_{ij}, \qquad \sigma_i\sigma_j - \sigma_j\sigma_i = 2i\,\epsilon_{ijk}\sigma_k.$$

Adding these equations and dividing by 2, we get

$$\sigma_i\sigma_j = \delta_{ij} + i\,\epsilon_{ijk}\sigma_k.$$

Since $\vec\sigma$ is a matrix-valued vector (this is the same language as when we speak of a real-valued function), we can dot an ordinary vector (i.e. a real-valued vector) into it:

$$(\sigma_i a_i)(\sigma_j b_j) = a_i b_j\,\delta_{ij} + i\,\epsilon_{kij}\,\sigma_k\,a_i b_j,$$

or, in traditional vector notation,

$$(\vec\sigma\cdot\vec a)(\vec\sigma\cdot\vec b) = \vec a\cdot\vec b + i\,\vec\sigma\cdot(\vec a\times\vec b).$$
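The dotted-Pauli identity above is a finite matrix statement, so it can be checked by brute force on concrete vectors. A minimal sketch (the test vectors and helper names are my own choices):

```python
# Pauli matrices as 2x2 complex nested lists.
S1 = [[0, 1], [1, 0]]
S2 = [[0, -1j], [1j, 0]]
S3 = [[1, 0], [0, -1]]
ID = [[1, 0], [0, 1]]
SIGMA = [S1, S2, S3]

def mat_mul(m, n):
    """2x2 matrix product."""
    return [[sum(m[i][k] * n[k][j] for k in range(2)) for j in range(2)]
            for i in range(2)]

def sigma_dot(v):
    """sigma . v = v_x sigma_1 + v_y sigma_2 + v_z sigma_3."""
    return [[sum(v[k] * SIGMA[k][i][j] for k in range(3)) for j in range(2)]
            for i in range(2)]

def cross(a, b):
    return (a[1]*b[2] - a[2]*b[1],
            a[2]*b[0] - a[0]*b[2],
            a[0]*b[1] - a[1]*b[0])

a, b = (1, 2, 3), (4, 5, 6)
# Left side: (sigma . a)(sigma . b); right side: a.b 1 + i sigma.(a x b).
lhs = mat_mul(sigma_dot(a), sigma_dot(b))
dot = sum(x * y for x, y in zip(a, b))
sab = sigma_dot(cross(a, b))
rhs = [[dot * ID[i][j] + 1j * sab[i][j] for j in range(2)] for i in range(2)]
```

The two 2x2 matrices agree entry by entry, as the index gymnastics above requires.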
Arfken..6 (a M x, M y ] i i i i i i i i i i i im z with similar calculations for M y, M z ] im x and M z, M x ] im y. (b M M x + M y + M z i i + i i + i i i i i i i i + + (c Using just the commutation relations we get since the identity matrix commutes with everything. M, M i ], M i ], M i ] M z, L + ] M z, M x + im y ] M z, M x ] + im z, M y ] im y + i ( im x L + Lastly, L +, L ] M x + im y, M x im y ] M x, M x ] im x, M y ] + im y, M x ] + M y, M y ] i (im z + i ( im z + M z 5
Arfken.. The matrix A is diagonal and commutes with the matrix B. Show B is diagonal. The commutator in index notation is A ij B jk B ij A jk A ii B i k B ik A k k where our notation is a bit strange here. We are summing over i and k, however, since A is diagonal, there is only a single term in each of these sums. The notation is to emphasize the fact that i i and k k, again, because A is diagonal. Therefore, since i i and k k, B i k B ik and we can write B i k(a ii A k k which, if the diagonal elements of A are distinct (assumed implies that what is in parentheses is nonzero for i k and hence B i k for i k and B is diagonal. Arfken..8 The matrices A and B satisfy A B and {A, B} AB + BA Multiply the above anti-commutation relation by A and take the trace: tr (AAB + ABA tr (B + AAB tr (B where we have used A and the cyclic property of the trace. A virtually identical calculation shows the same thing for tr (A. Arfken.. The matrices A and B are orthogonal: A T A and B T B. Consider the transpose of their product in index notation (A B T (A ik B kj T ij Thus we can now write A jk B ki B ki A jk (B ik T (A kj T ( B T A T ij (A B T B T A T B A If we multiply this by AB from the left, we get the identity, which establishes (A B T as the inverse of AB and hence AB as an orthogonal matrix. Arfken..8 In index notation, we can write our symmetric matrix as (S ij s ij s ji and the anti-symmetric matrix as (A ij a ij a ji. The trace is then tr (S A s ij a ji (s ji ( a ij s ji a ij tr (S A and we have the trace equal to its negative, thus it is zero. 6
Arfken..9 For a similarity transformation, we have A B A B. Taking the trace of this, we have tra tr ( B A B tr ( B B A tr (A and the trace of a matrix is invariant under similarity transformations. Arfken.. For a similarity transformation, we have A B A B. Taking the determinant of this, we have det A det ( B A B (det B (det A ( det B (det B (det A (det B det A and the determinant of a matrix is invariant under similarity transformations. 7