Lecture 4: Entropy. Chapter I. Basic Principles of Stat Mechanics. A.G. Petukhov, PHYS 743. September 7, 2017



Subsystem in a Thermostat

We divide a macroscopic system A into two subsystems, a and b. The probability dw_a that the energy of subsystem a falls into the interval (E_a, E_a + dE_a) is proportional to the phase volume dq_a dp_a between the surfaces of constant energy E_a and E_a + dE_a:

dw_a(q_a, p_a) = \rho_a(E_a)\, dq_a\, dp_a / h^s = \rho_a(E_a)\, d\Gamma_a

We can calculate \rho_a(E_a) using the microcanonical distribution, integrating over all possible states of subsystem b (the thermostat):

dw_a(q_a, p_a) = C\, (dq_a\, dp_a / h^s) \int \delta(E_0 - E_a - E_b)\, dq_b\, dp_b / h^s = C\, g_b(E_0 - E_a)\, d\Gamma_a,

so that, after integrating over the energy shell,

dw_a = C\, g_b(E_0 - E_a)\, g_a(E_a)\, dE_a.   (1)

Therefore

\rho_a(E_a) = C\, g_b(E_0 - E_a).   (2)


Sharpness of the Energy Distribution

Thus we have represented dw_a as dw_a = W(E_a)\, dE_a, where the probability density is

W(E_a) = g_a(E_a)\, \rho_a(E_a) \propto g_a(E_a)\, g_b(E_0 - E_a).   (3)

We know that g(E) \propto E^s is an extremely fast, monotonically increasing function of its argument. The second formula in Eq. (3) then shows that W(E) must have an extremely sharp maximum at some energy \bar{E}, because it is a product of an increasing function (g_a) and a decreasing function (g_b) of E_a. Conservation of energy dictates that the energy of the subsystem lies in a very narrow interval \Delta E around the most probable value \bar{E}. From now on we drop the index a for simplicity and approximate W(E) by a rectangular function:

W(E) = \begin{cases} g(\bar{E})\, \rho(\bar{E}), & \bar{E} - \Delta E/2 \le E \le \bar{E} + \Delta E/2 \\ 0, & \text{otherwise} \end{cases}
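The sharpness of the product g_a(E) g_b(E_0 - E) can be seen numerically. The following is a minimal sketch (not from the lecture), assuming a toy model g(E) \propto E^s with a modest s in place of the physical s ~ 10^23; the peak sits at E_0/2 and the relative width shrinks roughly as 1/\sqrt{s}:

```python
import numpy as np

# Toy model: g(E) ~ E^s for each subsystem, so
# log W(E) = s*log(E) + s*log(E0 - E) + const, with E0 = 1.
def rel_width(s, n=200001):
    E = np.linspace(1e-6, 1.0 - 1e-6, n)
    log_w = s * np.log(E) + s * np.log(1.0 - E)
    W = np.exp(log_w - log_w.max())      # normalize the peak to 1
    peak = E[np.argmax(W)]               # most probable energy, E0/2 here
    fwhm = np.ptp(E[W > 0.5])            # full width at half maximum
    return peak, fwhm / peak             # relative width ~ 1/sqrt(s)

for s in (100, 10000):
    print(s, rel_width(s))
```

Working in log space avoids overflow: for realistic s the value g(E) itself is far beyond floating-point range, but log W is perfectly tame.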

Boltzmann's Definition of the Entropy

Normalization of W(E) (area = 1) yields

\rho(\bar{E})\, \Delta\Gamma = 1,   (4)

where \Delta\Gamma = g(\bar{E})\, \Delta E is the multiplicity of the subsystem, i.e. the total number of microstates accessible to the subsystem in the characteristic interval \Delta E around the average value \bar{E}. The quantity

S = k_B \log \Delta\Gamma   (5)

is called the entropy. This is the celebrated Boltzmann definition of the entropy. Here k_B = 1.38 \times 10^{-23} J/K is Boltzmann's constant. In many derivations we will follow L&L and set k_B = 1. From Eq. (4), S = \log \Delta\Gamma = -\log \rho(\bar{E}). We argued previously that \log \rho(E) = \alpha - \beta E, i.e. that \log \rho(E) can be approximated by the linear term of its Taylor expansion.

Gibbs' Definition of the Entropy

The linear approximation for \log \rho(E) is the only meaningful way to represent its energy dependence and preserve additivity. Let us exploit this linearity:

\log \rho(\bar{E}) = \log \rho(\langle E \rangle) = \alpha - \beta \langle E \rangle = \langle \alpha - \beta E \rangle = \langle \log \rho \rangle

Using Eqs. (4) and (5) we obtain (k_B = 1):

S = -\langle \log \rho \rangle, \quad \text{or} \quad S = -\int \log \rho(q, p)\, \rho(q, p)\, dq\, dp / h^s.   (6)

Eq. (6) is the Gibbs definition of entropy, which is valid not only for equilibrium but also for non-equilibrium statistical mechanics. It is also at the core of information theory.

Entropy in Quantum Statistics

In quantum mechanics (QM) we deal with discrete energy levels E_n. We can then introduce a QM analog of \rho(q, p), denoted \rho_n (w_n in L&L). The quantity \rho_n (sometimes called a population or occupation number) is the probability that a QM system occupies the energy level E_n. The QM analog of Eq. (6) then reads

S = -\sum_n \rho_n \log \rho_n.   (7)

The quantities \rho_n describe a QM subsystem of a larger closed system. Such systems are mixtures, which cannot be described by wave functions but only by density matrices. It turns out that the density matrices relevant for equilibrium statistical mechanics are diagonal, \rho_{nm} = \rho_n \delta_{nm}, in the energy representation, i.e. when the indices n and m enumerate eigenstates of the Hamiltonian: \hat{H}|n\rangle = E_n |n\rangle.
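Eq. (7) is easy to evaluate for any set of populations. A minimal sketch (with k_B = 1, as in the lecture); the convention 0 log 0 = 0 is handled by dropping zero entries:

```python
import numpy as np

# Eq. (7): S = -sum_n rho_n log(rho_n) for discrete populations (k_B = 1).
def entropy(rho):
    rho = np.asarray(rho, dtype=float)
    rho = rho[rho > 0]                   # 0*log(0) contributes nothing
    return -np.sum(rho * np.log(rho))

print(entropy([1.0, 0.0, 0.0]))          # pure state: S = 0
print(entropy([0.25] * 4))               # 4 equally occupied levels: S = log 4
```

The two limiting cases bracket the general behavior: a pure state carries zero entropy, while equal occupation of G levels reproduces Boltzmann's S = log G.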

Density Matrix

In the more general, i.e. non-equilibrium, case the density matrix is non-diagonal. Let us introduce the density operator

\hat{\rho} = \sum_{m,n} \rho_{mn} |m\rangle\langle n|.

From QM we know that the expectation value of any operator \hat{A} can be obtained using the density operator \hat{\rho}:

\langle A \rangle = \sum_{m,n} \rho_{mn} A_{nm} = \sum_m (\hat{\rho}\hat{A})_{mm} = \mathrm{Tr}(\hat{\rho}\hat{A}).   (8)

Let us consider two extreme cases of density matrices. Pure state:

|\Psi\rangle = \sum_n c_n e^{-i\omega_n t} |n\rangle, \quad \omega_n = E_n/\hbar, \qquad \hat{\rho} = |\Psi\rangle\langle\Psi| = \sum_{n,m} c_n c_m^*\, e^{-i(\omega_n - \omega_m)t}\, |n\rangle\langle m|.

Example of Pure States: Neutrino Oscillations

In general \hat{\rho} depends on time and satisfies the QM Liouville equation:

\frac{\partial \hat{\rho}}{\partial t} = \frac{i}{\hbar} [\hat{\rho}, \hat{H}].

For pure states the non-diagonal elements of the density matrix (coherences) are of critical importance: they are responsible for coherent quantum oscillations. Example: consider neutrino oscillations between \mu and \tau neutrinos. By all means neutrinos should be treated as non-interacting particles in pure states (the probability of even a single interaction during a neutrino's travel through Earth is ~10^{-15}!). These states, however, are not eigenstates of the Hamiltonian (mass matrix), so neutrinos constantly oscillate from one flavor to another. Let us describe this phenomenon using the density-matrix formalism. For simplicity we take into account only the \mu and \tau flavors. The corresponding states read

|\mu\rangle = \cos\theta\, |\nu_1\rangle + \sin\theta\, |\nu_2\rangle, \qquad |\tau\rangle = -\sin\theta\, |\nu_1\rangle + \cos\theta\, |\nu_2\rangle,

Neutrino Oscillations cont'd

where |\nu_i\rangle are the mass eigenstates and \theta is the mixing angle. If a neutrino is initially in the |\mu\rangle state, the density matrix reads

\hat{\rho}(t) = \begin{pmatrix} \cos^2\theta & \sin\theta\cos\theta\, e^{i\omega t} \\ \sin\theta\cos\theta\, e^{-i\omega t} & \sin^2\theta \end{pmatrix}.   (9)

We can also find the density matrix of the pure \tau state:

\hat{\rho}_\tau = |\tau\rangle\langle\tau| = \begin{pmatrix} \sin^2\theta & -\sin\theta\cos\theta \\ -\sin\theta\cos\theta & \cos^2\theta \end{pmatrix}.   (10)

Using Eqs. (8)-(10) we obtain the probability to find the system in the state |\tau\rangle:

P_\tau(t) = \langle \hat{\rho}_\tau \rangle = \mathrm{Tr}[\hat{\rho}_\tau\, \hat{\rho}(t)] = \sin^2(2\theta)\, \sin^2(\omega t / 2).   (11)
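The trace in Eq. (11) can be checked numerically. A short sketch that builds the matrices of Eqs. (9)-(10) directly and compares Tr[\hat{\rho}_\tau \hat{\rho}(t)] with the closed form (the values of theta, omega, t below are arbitrary test inputs):

```python
import numpy as np

# Build rho(t) (Eq. 9) and rho_tau (Eq. 10), then evaluate Eq. (8)'s trace.
def P_tau(theta, w, t):
    c, s = np.cos(theta), np.sin(theta)
    rho_t = np.array([[c * c,                        s * c * np.exp(1j * w * t)],
                      [s * c * np.exp(-1j * w * t),  s * s]])
    rho_tau = np.array([[ s * s, -s * c],
                        [-s * c,  c * c]])
    return np.trace(rho_tau @ rho_t).real

theta, w, t = 0.6, 2.0, 1.3
print(P_tau(theta, w, t))
print(np.sin(2 * theta)**2 * np.sin(w * t / 2)**2)   # closed form of Eq. (11)
```

The two printed numbers agree, confirming that the oscillation amplitude sin^2(2θ) and frequency ω come entirely from the coherences (off-diagonal elements) of \hat{\rho}(t).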

Statistical Density Matrix (Statistical Matrix, Statistical Operator)

[Figure: energy levels E_1, E_2, ..., E_n inside the interval \Delta E, occupied with probabilities \rho_1, \rho_2, ..., \rho_n]

When a system approaches thermal equilibrium, the oscillating non-diagonal matrix elements of \hat{\rho} tend to zero as e^{-t/\tau} (decoherence), and only the diagonal matrix elements of the density matrix in the energy representation survive:

\hat{\rho} = \sum_n |n\rangle \rho_n \langle n|.   (12)

The statistical operator in Eq. (12) describes a mixture (ensemble) of systems in different energy states, weighted with probabilities \rho_n. Thus \rho_n is the probability to find the system in the microstate |n\rangle with energy E_n. The most common statistical distributions are the microcanonical distribution and the Gibbs, or canonical, distribution:

\rho_n = \frac{e^{-E_n/T}}{Z}, \qquad Z = \sum_n e^{-E_n/T}.   (13)
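Eq. (13) is straightforward to evaluate for a finite spectrum. A minimal sketch for a hypothetical three-level system (the level energies and temperature below are illustrative, with k_B = 1 so that T and E_n share units):

```python
import numpy as np

# Eq. (13): canonical populations rho_n = exp(-E_n/T) / Z.
def canonical(E, T):
    boltz = np.exp(-np.asarray(E, dtype=float) / T)
    Z = boltz.sum()                      # partition function
    return boltz / Z

rho = canonical([0.0, 1.0, 2.0], T=1.0)
print(rho, rho.sum())                    # normalized: probabilities sum to 1
print(-np.sum(rho * np.log(rho)))        # Gibbs entropy of the mixture, Eq. (7)
```

Lower-lying levels carry larger weight, and the entropy of this diagonal mixture follows directly from Eq. (7).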

Properties of the Entropy

The microcanonical distribution is invaluable for the general justification of statistical mechanics. However, it is not very practical because of the energy-conservation constraint. The Gibbs distribution, on the other hand, is very practical and powerful. In QM the formulas of statistical mechanics look simpler because all we need to do is count. For instance, the number of states and the density of states read

\Gamma(E) = \sum_{E_n \le E} 1, \qquad g(E)\, dE = \sum_{E \le E_n \le E + dE} 1, \quad \text{i.e.} \quad g(E) = d\Gamma(E)/dE.

The values of the statistical weight (multiplicity) \Delta\Gamma and the entropy are insensitive to the choice of the arbitrary interval \Delta E. Indeed, we know that g(E) \propto E^s, where s \sim 10^{23}. The interval \Delta E must be much larger than the spacing \delta E between the energy levels, but still macroscopically small. Let us take \Delta E \sim s\, \delta E. Then

S = \log \Delta\Gamma = \log[g(E)\, \Delta E] = \log g(E) + \log \Delta E \sim s + \log s \approx s.

Here we used that s \sim 10^{23}, so \log s \sim 10^2 \ll s. If we change \Delta E over a wide range (many orders of magnitude), the result does not change. This insensitivity explains the success of statistical mechanics.
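The insensitivity argument is stark when put in numbers. A sketch (assumed magnitudes, not from the lecture): with log g(E) ~ s ~ 10^23, varying \Delta E across twenty orders of magnitude shifts S by only ~\log \Delta E, which is so small relative to s that it vanishes even below double-precision resolution:

```python
import numpy as np

s = 1e23                                  # log g(E) ~ s for a macroscopic body
for log10_dE in (-10.0, 0.0, 10.0):       # Delta E spans 20 orders of magnitude
    S = s + log10_dE * np.log(10.0)       # S = log g + log(Delta E)
    print(S / s)                          # the log(Delta E) term is invisible
```

Each printed ratio is exactly 1.0: the additive term |log \Delta E| ~ 23 is smaller than the floating-point spacing (~10^7) around 10^23, a vivid illustration of why the arbitrary choice of \Delta E does not matter.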

Properties of the Entropy cont'd

The entropy is additive. For a set of subsystems, \Delta\Gamma = \prod_a \Delta\Gamma_a and

S = \sum_a \log \Delta\Gamma_a = \sum_a S_a.

As we have already seen, for two subsystems in contact (i.e. able to exchange energy) the probability to find a subsystem outside of the narrow energy interval in the vicinity of its average energy \bar{E} is negligible. This argument generalizes to many subsystems in contact. We write

dw(E_1, E_2, \ldots) = C\, \delta\!\left(E_0 - \sum_a E_a\right) e^{\sum_a S_a} \prod_a dE_a \propto e^{S(E_1, E_2, \ldots, E_N)}.

The probability and the entropy reach sharp maxima when the energy of each subsystem equals its average value \bar{E}_a. This is the state of thermal equilibrium. The entropy of a closed system reaches its maximum in the state of thermal equilibrium.

The Law of Increase of Entropy

For a closed system that is not in the state of thermodynamic equilibrium, the transition to equilibrium is a chain of more and more probable states. This means that both dw and the entropy S increase steadily until S reaches its maximum possible value, corresponding to the state of complete thermal equilibrium (the second law of thermodynamics). When transitions between the states of the system occur, we can distinguish two types of processes: irreversible, dS/dt > 0, and reversible, dS/dt = 0.

Example: free expansion of a gas of N particles from volume V_i to V_f = 2V_i. Since \Delta\Gamma \propto V^N, we have \Delta\Gamma_i \propto V_i^N and \Delta\Gamma_f \propto (2V_i)^N, so \Delta\Gamma_i / \Delta\Gamma_f = 2^{-N}: the expanded state is overwhelmingly more probable, and the process is irreversible.
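The expansion example above gives a concrete entropy change. A minimal sketch (with k_B = 1), using \Gamma \propto V^N so that \Delta S = \log(\Gamma_f / \Gamma_i) = N \log(V_f / V_i):

```python
import numpy as np

# Entropy change for free expansion: Gamma ~ V^N, so
# Delta S = N * log(V_f / V_i)  (k_B = 1).
def delta_S(N, V_i, V_f):
    return N * (np.log(V_f) - np.log(V_i))

N = 1e23                                 # macroscopic particle number
print(delta_S(N, 1.0, 2.0) / N)          # log 2 per particle, ~0.693
```

For a doubling of volume, \Delta S = N \log 2 > 0: the entropy increase per particle is of order unity, but multiplied by N ~ 10^23 it makes the reverse fluctuation (probability 2^{-N}) unobservable, which is the statistical content of irreversibility.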