Lecture 4: Entropy
Chapter I. Basic Principles of Stat Mechanics
A.G. Petukhov, PHYS 743
September 7, 2017
Subsystem in a Thermostat

We divide a macroscopic system A into two subsystems a and b. The probability dw_a that the energy of subsystem a falls into the interval (E_a, E_a + dE_a) is proportional to the phase volume dq_a dp_a between the surfaces of constant energy E_a and E_a + dE_a:

dw_a(q_a, p_a) = \rho_a(E_a)\, dq_a\, dp_a / h^s = \rho_a(E_a)\, d\gamma_a

We can calculate \rho_a(E_a) using the microcanonical distribution, integrating over all possible states of system b (the thermostat):

dw_a(q_a, p_a) = C\,(dq_a\, dp_a / h^s) \int \delta(E_0 - E_a - E_b)\, dq_b\, dp_b / h^s
             = C\, g_b(E_0 - E_a)\, d\gamma_a = C\, g_b(E_0 - E_a)\, g_a(E_a)\, dE_a.   (1)

Therefore

\rho_a(E_a) = C\, g_b(E_0 - E_a).   (2)
Sharpness of the Energy Distribution

Thus we represented dw_a as dw_a = W(E_a)\, dE_a, where the probability density

W(E_a) = g_a(E_a)\,\rho_a(E_a) \propto g_a(E_a)\, g_b(E_0 - E_a).   (3)

We know that g(E) \propto E^s is an extremely fast, monotonically increasing function of its argument. The second form in Eq. (3) shows that W(E_a) must have an extremely sharp maximum at some energy \bar{E}, because it is a product of an increasing function (g_a) and a decreasing function (g_b) of E_a. Conservation of energy then dictates that the energy of the subsystem lies in a very narrow interval \Delta E around the most probable value \bar{E}.

From now on we drop the index a for simplicity and approximate W(E) with a rectangular function:

W(E) = g(\bar{E})\,\rho(\bar{E}) for \bar{E} - \Delta E/2 \le E \le \bar{E} + \Delta E/2, and 0 otherwise.
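The sharpness of W(E) is easy to see numerically. Below is a minimal sketch with a toy model g_a(E) ~ E^{s_a}, g_b(E) ~ E^{s_b} (the exponents are illustrative stand-ins; real systems have s ~ 10^23, far beyond floating point, so we work with log W):

```python
import numpy as np

# Toy model for Eq. (3): W(E) ~ E^sa * (E0 - E)^sb on 0 < E < E0.
# We compute log W to avoid overflow and measure the full width at
# half maximum (FWHM) of the peak relative to E0.
def rel_width(sa, sb, E0=1.0, npts=200001):
    E = np.linspace(1e-9, E0 - 1e-9, npts)
    logW = sa * np.log(E) + sb * np.log(E0 - E)
    W = np.exp(logW - logW.max())          # normalize the peak to 1
    inside = E[W > 0.5]                    # points above half maximum
    return (inside[-1] - inside[0]) / E0

# The peak narrows roughly as 1/sqrt(s) as the exponents grow:
for s in (10, 1000, 100000):
    print(s, rel_width(s, s))
```

For macroscopic s the relative width is of order 10^{-12}, which is why replacing W(E) by a narrow rectangle around \bar{E} is harmless.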
Boltzmann's Definition of the Entropy

Normalization of W(E) (Area = 1) yields:

\rho(\bar{E})\, \Delta\Gamma = 1,   (4)

where \Delta\Gamma = g(\bar{E})\,\Delta E is the multiplicity of the subsystem, i.e. the total number of microstates accessible to the subsystem in the characteristic interval \Delta E around the average value \bar{E}. The quantity

S = k_B \log \Delta\Gamma   (5)

is called the entropy. This is the celebrated Boltzmann definition of the entropy. Here k_B = 1.38 \times 10^{-23} J/K is the Boltzmann constant. In many derivations we will follow L&L and set k_B = 1.

We argued previously that \log\rho(E) can be approximated by the linear term of its Taylor expansion: \log\rho(E) = \alpha - \beta E.
Gibbs Definition of the Entropy

The linear approximation for \log\rho(E) is the only way to represent its energy dependence that preserves additivity. Let us exploit this linearity:

\log\rho(\bar{E}) = \log\rho(\langle E\rangle) = \alpha - \beta\langle E\rangle = \langle \alpha - \beta E\rangle = \langle \log\rho \rangle

Using Eqs. (4) and (5) we obtain (k_B = 1):

S = -\langle \log\rho \rangle, or

S = -\int \log\rho(q,p)\,\rho(q,p)\, dq\, dp / h^s   (6)

Eq. (6) is the Gibbs definition of entropy, which is valid not only for equilibrium but also for non-equilibrium statistical mechanics. It is also at the core of information theory.
Entropy in Quantum Statistics

In Quantum Mechanics (QM) we deal with discrete energy levels E_n. We can then introduce a QM analog of \rho(q,p), denoted \rho_n (w_n in L&L). The quantity \rho_n (sometimes called the population or occupation number) is the probability that a QM system occupies the energy level E_n. The QM analog of Eq. (6) then reads:

S = -\sum_n \rho_n \log \rho_n   (7)

The quantities \rho_n describe a QM subsystem of a larger closed system. Such subsystems are mixtures that cannot be described by wave functions but rather by density matrices. It turns out that the density matrices relevant for equilibrium statistical mechanics are diagonal in the energy representation, \rho_{nm} = \rho_n \delta_{nm}, i.e. when the indices n and m enumerate the eigenstates of the Hamiltonian: \hat{H}|n\rangle = E_n|n\rangle.
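Eq. (7) can be sketched as a one-line function (k_B = 1; the two example distributions are illustrative):

```python
import numpy as np

# Eq. (7): S = -sum_n rho_n log(rho_n), with k_B = 1.
def entropy(rho):
    rho = np.asarray(rho, dtype=float)
    rho = rho[rho > 0]                   # convention: 0 * log(0) = 0
    return -np.sum(rho * np.log(rho))

s_pure = entropy([1.0, 0.0, 0.0])        # one level certainly occupied: S = 0
s_mixed = entropy([0.25] * 4)            # uniform mixture of 4 states: S = log 4
```

A pure state carries zero entropy, while the uniform mixture over \Delta\Gamma states gives S = \log\Delta\Gamma, recovering Boltzmann's Eq. (5).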
Density Matrix

In the more general, i.e. non-equilibrium, case the density matrix is non-diagonal. Let us introduce the density operator:

\hat\rho = \sum_{m,n} \rho_{mn}\, |m\rangle\langle n|

From QM we know that the expectation value of any operator \hat{A} can be obtained using the density operator \hat\rho:

\langle A \rangle = \sum_{m,n} \rho_{mn} A_{nm} = \sum_m \big(\hat\rho\hat{A}\big)_{mm} = \mathrm{Tr}\,(\hat\rho\hat{A})   (8)

Let us consider two extreme cases of density matrices. Pure state:

|\Psi\rangle = \sum_n c_n e^{-i\omega_n t}\, |n\rangle, \qquad \omega_n = E_n/\hbar

\hat\rho = \sum_{n,m} c_n c_m^* \exp[-i(\omega_n - \omega_m)t]\, |n\rangle\langle m|
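Eq. (8) is easy to verify numerically. A small sketch with a random 3-level pure state (the dimension and random seed are arbitrary):

```python
import numpy as np

# Check Eq. (8): <A> = Tr(rho A). For a pure state |Psi> = sum_n c_n |n>
# the density matrix is rho_mn = c_m c_n^*, and Tr(rho A) must reproduce
# the usual expectation value <Psi|A|Psi>.
rng = np.random.default_rng(0)
c = rng.normal(size=3) + 1j * rng.normal(size=3)
c /= np.linalg.norm(c)                      # normalize the state

rho = np.outer(c, c.conj())                 # rho_mn = c_m c_n^*
A = rng.normal(size=(3, 3))
A = A + A.T                                 # Hermitian observable

expval_trace = np.trace(rho @ A)
expval_direct = c.conj() @ A @ c
print(np.allclose(expval_trace, expval_direct))   # the two agree
```

A pure state also satisfies \hat\rho^2 = \hat\rho, which fails for a genuine mixture: that is the quickest numerical test of purity.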
Example of Pure States: Neutrino Oscillations

In general \hat\rho depends on time and satisfies the QM Liouville (von Neumann) equation:

\frac{\partial\hat\rho}{\partial t} = \frac{i}{\hbar}\,[\hat\rho, \hat{H}]

For pure states the non-diagonal elements of the density matrix (coherences) are of critical importance: they are responsible for coherent quantum oscillations.

Example: consider neutrino oscillations between \mu and \tau neutrinos. By all means neutrinos should be treated as non-interacting particles in pure states (the probability of a single interaction during a neutrino's travel through the Earth is ~10^{-15}!). These states, however, are not eigenstates of the Hamiltonian (mass matrix), and neutrinos constantly oscillate from one flavor to another. Let us describe this phenomenon using the density-matrix formalism. For simplicity we take into account only the \mu and \tau flavors. The corresponding states read:

|\mu\rangle = \cos\theta\, |\nu_1\rangle + \sin\theta\, |\nu_2\rangle
|\tau\rangle = -\sin\theta\, |\nu_1\rangle + \cos\theta\, |\nu_2\rangle
Neutrino Oscillations cont'd

where |\nu_i\rangle are the mass eigenstates and \theta is the mixing angle. If the neutrinos are initially in the \mu state, the density matrix reads:

\hat\rho(t) = \begin{pmatrix} \cos^2\theta & \sin\theta\cos\theta\, e^{i\omega t} \\ \sin\theta\cos\theta\, e^{-i\omega t} & \sin^2\theta \end{pmatrix}   (9)

We can also find the density matrix of the pure \tau-state:

\hat\rho_\tau = |\tau\rangle\langle\tau| = \begin{pmatrix} \sin^2\theta & -\sin\theta\cos\theta \\ -\sin\theta\cos\theta & \cos^2\theta \end{pmatrix}   (10)

Using Eqs. (8)-(10) we can obtain the probability to find the system in the state |\tau\rangle:

P_\tau(t) = \langle \hat\rho_\tau \rangle = \mathrm{Tr}\,[\hat\rho_\tau\, \hat\rho(t)] = \sin^2(2\theta)\,\sin^2(\omega t/2)   (11)
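Eq. (11) can be checked numerically: build \hat\rho(t) from Eq. (9), project onto \hat\rho_\tau from Eq. (10), and compare with the closed-form result (the mixing angle and frequency below are arbitrary toy values):

```python
import numpy as np

# Numerical check of Eq. (11): P_tau(t) = Tr[rho_tau rho(t)]
#                                       = sin^2(2 theta) sin^2(wt/2).
theta, w = 0.6, 2.0                       # illustrative mixing angle, frequency
s, c = np.sin(theta), np.cos(theta)

rho_tau = np.array([[s * s, -s * c],      # Eq. (10): |tau><tau| in mass basis
                    [-s * c, c * c]])

for t in np.linspace(0.0, 5.0, 11):
    rho_t = np.array([[c * c, s * c * np.exp(1j * w * t)],   # Eq. (9)
                      [s * c * np.exp(-1j * w * t), s * s]])
    P_num = np.trace(rho_tau @ rho_t).real
    P_formula = np.sin(2 * theta) ** 2 * np.sin(w * t / 2) ** 2
    assert np.isclose(P_num, P_formula)   # trace formula matches Eq. (11)
```

Note that P_\tau(0) = 0, as it must: the coherences (off-diagonal elements) are what make P_\tau(t) oscillate at later times.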
Statistical Density Matrix (Statistical Matrix, Statistical Operator)

[Figure: a ladder of energy levels 1, 2, ..., n occupied with probabilities \rho_1, \rho_2, ..., \rho_n within an interval \Delta E]

When a system approaches thermal equilibrium the oscillating non-diagonal matrix elements of \hat\rho tend to zero as \exp(-t/\tau) \to 0 (decoherence), and only the diagonal matrix elements of the density matrix in the energy representation survive:

\hat\rho = \sum_n |n\rangle\, \rho_n\, \langle n|   (12)

The statistical operator in Eq. (12) describes a mixture (ensemble) of systems in different energy states weighted with probabilities \rho_n. Thus \rho_n is the probability to find the system in the microstate |n\rangle with energy E_n. The most common statistical distributions are the microcanonical and the Gibbs, or canonical, distribution:

\rho_n = \frac{\exp(-E_n/T)}{Z} \quad \text{with} \quad Z = \sum_n \exp(-E_n/T)   (13)
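A minimal sketch of Eq. (13) for a toy system (k_B = 1; the four energy levels and the temperature are illustrative numbers only):

```python
import numpy as np

# Canonical (Gibbs) distribution of Eq. (13) for a toy 4-level system.
E = np.array([0.0, 1.0, 2.0, 3.0])   # illustrative level energies
T = 1.5                              # temperature, k_B = 1

weights = np.exp(-E / T)
Z = weights.sum()                    # partition function
rho = weights / Z                    # normalized populations rho_n

# Entropy of the mixture via Eq. (7): S = -sum_n rho_n log rho_n
S = -np.sum(rho * np.log(rho))
print(rho, S)                        # lower levels are more populated
```

At low T the population concentrates in the ground state (S → 0); at high T it spreads uniformly over the levels (S → log of the level count).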
Properties of the Entropy

The microcanonical distribution is invaluable for the general justification of statistical mechanics. However, it is not very practical due to the conservation-of-energy constraint. The Gibbs distribution, on the other hand, is very practical and powerful.

In QM the formulas of Stat Mech look simpler because all we need to do is counting. For instance, the number of states and the density of states read:

\Gamma(E) = \sum_{E_n \le E} 1 \quad \text{and} \quad g(E)\, dE = \sum_{E \le E_n \le E+dE} 1, \quad \text{i.e.} \quad g(E) = d\Gamma(E)/dE.

The values of the statistical weight (multiplicity) \Delta\Gamma and the entropy are insensitive to the choice of the arbitrary interval \Delta E. Indeed, we know that g(E) \propto E^s where s \sim 10^{23}. The interval \Delta E must be much larger than the spacing \delta E between the energy levels but still macroscopically small. Let us take \Delta E \sim s\,\delta E. Then:

S = \log\Delta\Gamma = \log[g(E)\,\Delta E] = \log g(E) + \log\Delta E \sim s + \log s \approx s

Here we used that s \sim 10^{23}, so \log s is of order 10^2, utterly negligible compared with s. If we change \Delta E over a wide range (many orders of magnitude) the result will not change! This insensitivity explains the success of statistical mechanics.
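The insensitivity argument above is pure arithmetic and can be sketched directly (the stand-in value for \log g(E) is illustrative; only the orders of magnitude matter):

```python
import math

# Insensitivity of S = log(Delta Gamma) = log[g(E) * dE]: with g(E) ~ E^s
# and s ~ 1e23, the term log g(E) ~ s dwarfs log(dE) for any sane dE.
s = 1e23
log_g = s * math.log(2.0)            # stand-in for log g(E), of order s

for dE in (1e-10, 1.0, 1e10):        # vary dE over 20 orders of magnitude
    S = log_g + math.log(dE)
    print(S / log_g)                 # 1.0 to machine precision every time
```

Changing \Delta E by twenty orders of magnitude shifts S by at most ~50, an utterly invisible correction to a number of order 10^{23}.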
Properties of Entropy cont'd

The entropy is additive. For a set of subsystems \Gamma = \prod_a \Delta\Gamma_a and

S = \sum_a \log\Delta\Gamma_a = \sum_a S_a

As we have already seen, for two subsystems in contact (i.e. able to exchange energy) the probability to find a subsystem outside of the narrow energy interval in the vicinity of its average energy \bar{E} is negligible. This argument can be generalized to many subsystems in contact. We write:

dw(E_1, E_2, \ldots) = C\,\delta\!\Big(E_0 - \sum_a E_a\Big)\, e^{\sum_a S_a} \prod_a dE_a \propto e^{S(E_1, E_2, \ldots, E_N)}

The probability and the entropy reach sharp maxima when the energy of each subsystem equals its average value \bar{E}_a. This is the state of thermal equilibrium: the entropy of a closed system reaches its maximum in the state of thermal equilibrium.
The Law of Increase of Entropy

For a closed system which is not in the state of thermodynamic equilibrium, the transition to equilibrium is a chain of more and more probable states. It means that both dw and the entropy S increase steadily until S reaches its maximum possible value, corresponding to the state of complete thermal equilibrium (the second law of thermodynamics). When transitions between the states of the system occur we can distinguish two types of processes: irreversible, dS/dt > 0, and reversible, dS/dt = 0.

Example: free expansion of a gas of N particles from volume V_i to V_f = 2V_i. Since \Gamma \propto V^N, we have \Gamma_i \propto V_i^N and \Gamma_f \propto (2V_i)^N, so \Gamma_f/\Gamma_i = 2^N: the expansion is irreversible, with \Delta S = N\log 2 > 0.
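The free-expansion example above in numbers (with k_B = 1; the particle number is an illustrative mole-sized value):

```python
import math

# Free expansion from the slide: Gamma ~ V^N, so doubling the volume
# gives Gamma_f / Gamma_i = 2^N and an entropy increase
# Delta S = log(Gamma_f / Gamma_i) = N log 2 > 0 (irreversible).
N = 6.02e23                          # roughly one mole, illustrative
dS = N * math.log(2.0)
print(dS)                            # a huge, strictly positive number
```

The multiplicities themselves (2^N with N ~ 10^23) are far beyond any floating-point range, which is exactly why one always works with the entropy, their logarithm.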