A short introduction to statistical mechanics

Thermodynamics rests on the application of two principles: the first law, which says that heat is a form of energy, so that the law of conservation of energy must apply to it; and the second law, which says that there is no machine whose sole purpose is to do work by extracting heat from a reservoir. While it is not our aim to discuss thermodynamics in general, we treat the ideal gas in some detail. The ideal gas law says that

pV = kNT,

where p is the pressure, V the volume, T the absolute temperature and N the number of particles in the gas. The constant k is Boltzmann's constant. There is another equation which relates the temperature T to the energy contained in the gas,

U = γkTN,

where γ is a constant that depends on the type of gas. The first law says

dU = dQ − p dV + μ dN,

which is a confusing formula. (Here μ is the chemical potential.) The formula suggests that there exists a function U(Q, V, N) whose gradient is (1, −p, μ). This is not the case! There is, however, the second law, which states that there is an integrating factor 1/T so that

dS = (1/T) dQ

is indeed an exact one-form. Thus there exists a function U(S, V, N) so that

dU = T dS − p dV + μ dN.

In particular

∂U/∂S = T,   ∂U/∂V = −p,   ∂U/∂N = μ.

Note the sloppy notation, which is at the root of much of the confusion in thermodynamics: one does not make the distinction between U as a function of T and U as a function of S; both functions are represented by the same symbol. Returning to the ideal gas, we note that good variables would be U, V, N and not S, V, N. Since ∂U/∂S = T > 0 we can invert this relation and consider S(U, V, N); hence

∂S/∂U = 1/T,

∂S/∂V = p/T,   ∂S/∂N = −μ/T.

There is the fundamental requirement that S, U, V and N all scale the same way as we enlarge the system: if we have twice as many particles, twice the internal energy and twice the volume, then the entropy should be multiplied by two. In other words,

S(λU, λV, λN) = λ S(U, V, N).

Differentiating this relation at λ = 1 gets us

S(U, V, N) = U/T + pV/T − μN/T.

For the ideal gas

1/T = γkN/U,   p/T = kN/V,

and hence

S(U, V, N) = γkN log U + kN log V + N C(N)

for some unknown function C(N). By scaling,

λ(γkN log U + kN log V + N C(N)) = γkλN log(λU) + kλN log(λV) + λN C(λN),

or

C(N) = (γ + 1)k log λ + C(λN).

Differentiating this at λ = 1 yields

0 = (γ + 1)k + C′(N) N,

and hence C(N) = −(1 + γ)k log N up to an additive constant. Hence the entropy of an ideal gas is given by

S(U, V, N) = γkN log U + kN log V − (1 + γ)kN log N.

Gas with a piston

Imagine a cylinder of length one (in some units), insulated against heat transfer and filled with gas. Inside the cylinder is an ideal insulating piston at position 0 < x < 1. Half of the gas is on the left of the piston, where it has volume Ax (A is the cross section) and temperature T; the other half is on the right of the piston, where it has volume A(1 − x) and temperature T. Thus, for x < 1/2, the pressure on the left will be larger than on the right. The

piston is fixed with a pin. At some moment we pull the pin and let the piston move freely, which it does without friction. Eventually it comes to rest, and the question is where. Call the final position y. The pressures on both sides must then be equal; since each side contains N/2 particles, at volumes Ay and A(1 − y) respectively, this means

T_L / y = T_R / (1 − y).

Moreover, by the first law the energy must be conserved, i.e., γk(N/2)(T_L + T_R) = γkNT, so

T_L + T_R = 2T.

Therefore

T_L = 2Ty,   T_R = 2T(1 − y).

We know from the second law that the entropy must have gone up: the entropy at the end of the process is greater than the entropy of the system before. The total entropy before the process is given by (up to irrelevant constants)

2γ log T + log x + log(1 − x),

whereas the entropy after is

γ log(2Ty) + γ log(2T(1 − y)) + log y + log(1 − y).

Hence, applying the second law, we get the inequality

log(x(1 − x)) ≤ 2γ log 2 + (1 + γ) log(y(1 − y)).

A straightforward calculation leads to

y(1 − y) ≥ (x(1 − x))^{1/(1+γ)} 4^{−γ/(1+γ)}.

Note that when x = 1/2 the right side equals 1/4; since y(1 − y) ≤ 1/4 always, this correctly forces y = 1/2. What this result says is that the given macroscopic quantities do not suffice to calculate the exact outcome of this experiment. The second law, however, yields bounds for the position of the piston, and these bounds are determined entirely by the macroscopic quantities. Shocks and all sorts of nasty things might develop when the piston is released, but the second law sets absolute limits on the outcome.

The statistical theory of entropy

We consider a system of N interacting particles in a box. The system is specified by giving the positions X = (x_1, ..., x_N) and the momenta P = (p_1, ..., p_N). If we give a probability distribution ρ(X, P), then the Boltzmann entropy is given by

S(ρ) = −k ∫ dX dP ρ(X, P) log ρ(X, P).
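Before developing the statistical theory further, the piston bound derived above is easy to explore numerically. The following is a minimal sketch, not part of the original argument; the choice γ = 3/2 (the monatomic value) and the sample positions are illustrative assumptions:

```python
from math import isclose

gamma = 1.5  # assumed monatomic ideal gas, gamma = 3/2 (illustrative)

def lower_bound(x):
    """Second-law lower bound on y*(1-y) for initial piston position x."""
    return (x * (1 - x)) ** (1 / (1 + gamma)) * 4 ** (-gamma / (1 + gamma))

# The bound never exceeds 1/4, the maximum possible value of y*(1-y):
print(all(lower_bound(x) <= 0.25 + 1e-12 for x in [0.1, 0.25, 0.4, 0.5]))  # True

# At x = 1/2 the bound equals 1/4 exactly, forcing y = 1/2:
print(isclose(lower_bound(0.5), 0.25))   # True

# Interval of allowed resting positions y for, say, x = 0.3:
b = lower_bound(0.3)
disc = (1 - 4 * b) ** 0.5
print((1 - disc) / 2, (1 + disc) / 2)    # y must lie between these two values
```

For x away from 1/2 the bound leaves a whole interval of admissible y, which is exactly the statement in the text: the second law constrains but does not determine the outcome.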

Assuming that the dynamics of the system is given by a Hamiltonian H(X, P), one may ask: what is the state of maximal entropy, given that the system has total energy U? In other words, we have to maximize S(ρ) subject to

∫ H(X, P) ρ(X, P) dX dP = U   and   ∫ ρ(X, P) dX dP = 1.

To solve this problem we start with the elementary inequality

−x log x ≤ e^α − (α + 1)x,

which holds for all x > 0 and all α; it is the tangent-line inequality for the convex function x log x, so there is equality if and only if x = e^α. Applying it pointwise with α = −γH + δ, we have

−kρ log ρ ≤ k e^{−γH+δ} + k(γH − δ − 1)ρ,

where γ and δ are constants; again, there is equality if and only if ρ = e^{−γH+δ}. We now use the two free constants to fix the correct side conditions

∫ e^{−γH+δ} dX dP = 1   and   ∫ H e^{−γH+δ} dX dP = U.

The first says that we choose δ such that e^{−δ} = ∫ e^{−γH} dX dP; the second means that we have to choose γ such that

∫ H e^{−γH} dX dP / ∫ e^{−γH} dX dP = U.   (1)
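The structure of this maximization can be illustrated on a toy discrete "phase space". The energy levels, target energy U, and comparison distributions below are arbitrary illustrative choices; the bisection exploits the monotonicity of the left side of (1) that is proved in the next paragraph:

```python
from math import exp, log

# Toy energy levels standing in for H, and a target mean energy U.
E = [0.0, 1.0, 2.0, 5.0]
U = 1.2

def mean_H(g):
    """Left side of (1): the canonical average of H at parameter g."""
    w = [exp(-g * e) for e in E]
    return sum(e * wi for e, wi in zip(E, w)) / sum(w)

# mean_H is strictly decreasing in g, so mean_H(beta) = U has a unique root:
lo, hi = -50.0, 50.0
for _ in range(200):
    mid = (lo + hi) / 2
    lo, hi = (mid, hi) if mean_H(mid) > U else (lo, mid)
beta = (lo + hi) / 2

w = [exp(-beta * e) for e in E]
rho_canon = [wi / sum(w) for wi in w]

def entropy(p):
    """Discrete Boltzmann entropy -sum p log p (k = 1)."""
    return -sum(pi * log(pi) for pi in p if pi > 0)

# Other probability vectors with the same mean energy U = 1.2:
others = [[0.3, 0.5, 0.1, 0.1], [0.4, 0.0, 0.6, 0.0], [0.7, 0.0, 0.1, 0.2]]
for p in others:
    assert abs(sum(e * pi for e, pi in zip(E, p)) - U) < 1e-12
    print(entropy(p) < entropy(rho_canon))   # True for each competitor
```

Every competing distribution with the same mean energy has strictly smaller entropy than the canonical one, as the tangent-line inequality predicts.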

This is an equation for γ, and we have to show that it has a solution. Differentiating the left side of (1) with respect to γ, we get

−∫ H² e^{−γH} dX dP / ∫ e^{−γH} dX dP + [∫ H e^{−γH} dX dP / ∫ e^{−γH} dX dP]².

Since the left side of (1) is the expectation value of H with respect to the probability distribution

ρ = e^{−γH} / ∫ e^{−γH} dX dP,

we can write this as

−⟨H²⟩_ρ + (⟨H⟩_ρ)² = −⟨(H − ⟨H⟩_ρ)²⟩_ρ < 0.

This means that the left side of (1) is a strictly decreasing function of γ. Clearly, any allowed value of U must lie in the range of H. As γ → ∞ the left side of (1) converges to the minimal value of H. If we take our system to be of the form

H = Σ_{j=1}^N p_j²/(2m) + V(x_1, ..., x_N),

where the particles are constrained to a box, then we see that as γ → 0 the left side of (1) tends to ∞. Thus, in this case, for any U in the range of H the equation (1) has a unique solution. We call this solution β. Integrating the pointwise inequality and using the side conditions shows that S(ρ) ≤ kγU − kδ for every admissible ρ, with equality exactly at ρ = e^{−γH+δ}. Hence the optimizing ρ is given by

ρ_canon = e^{−βH} / ∫ e^{−βH} dX dP.

This is called the canonical ensemble, and Z = ∫ e^{−βH} dX dP is called the partition function. Since e^{−δ} = Z, we further get

S(ρ_canon) = kβ ⟨H⟩_canon + k log Z.   (2)

We shall see that this formula has a nice interpretation. For the case of an ideal gas the Hamiltonian is given by

H = (1/(2m)) Σ_{j=1}^N p_j²,

and the particles are confined to a volume V. Now

ρ_canon = (β/(2mπ))^{3N/2} e^{−(β/(2m)) Σ_{j=1}^N p_j²} Π_{j=1}^N χ_V(x_j)/V,

and

Z = (2mπ/β)^{3N/2} V^N.

The energy is

U = ⟨H⟩_canon = (3/2) N (1/β),   (3)

i.e., γ = 3/2, and we see that β = 1/(kT) is consistent with U = γkTN. Further, using (2),

S(ρ_canon) = (3/2) Nk + kN [(3/2) log(2mπ/β) + log V],

and, recalling that we have to write S in terms of the variables U, V, N, we can eliminate β using (3) and get

S = (3/2) kN log U + kN log V − (3/2) kN log N + (3/2) kN (1 + log(4mπ/3)).

The last term is just an additive constant and is irrelevant for thermodynamics. All the other terms fit our expectations except the penultimate one: we have the serious problem that the entropy is not extensive. We should really have −(1 + 3/2) kN log N instead of −(3/2) kN log N. Note that we can get out of this quagmire if we replace the partition function by

Z = (1/N!) ∫ e^{−βH} dX dP,

since by Stirling's formula log N! ≈ N log N − N, so this yields an additional term of order −N log N in the entropy, which renders the entropy extensive. This problem does not show up in quantum statistical mechanics! The way out in the classical case is to understand that the particles are indistinguishable: the configuration (x_1, x_2, x_3, ..., x_N) cannot be distinguished from the configuration (x_2, x_1, x_3, ..., x_N). In other words, we have to consider the density

ρ_canon = e^{−βH} χ_{S_N} / ∫ e^{−βH} χ_{S_N} dX dP,

where χ_{S_N} is the characteristic function of a fundamental domain of the permutation group acting on the positions. Such a set is somewhat complicated to describe when the underlying space is R³, but its contribution to the partition function is 1/N! times the whole one. (If the underlying space is one-dimensional, a fundamental domain is simply the region x_1 < x_2 < ... < x_N.) We shall see that the partition function contains, in some sense, all the information about our thermodynamic system.

In this formalism the natural variables are not U, V, N but T, V, N. The way to pass from one set of variables to the other is a Legendre transform. One performs it best by starting from the internal energy U(S, V, N): solve the equation ∂U/∂S = T to get S(T, V, N), and then consider the free energy

F(T, V, N) = U(S(T, V, N), V, N) − T S(T, V, N),

and note that

∂F/∂T = −S,   ∂F/∂V = −p,   ∂F/∂N = μ.

Returning to (2), we see that

F = −(1/β) log Z,   where Z = (1/N!) ∫ e^{−βH} dX dP.

Thus the log of the partition function yields the free energy.
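The chain of identities above can be checked numerically for the ideal gas: with F = −(1/β) log Z, finite differences should reproduce ∂F/∂V = −p = −NkT/V and ∂F/∂T = −S, with S given by (2). A minimal sketch, with m and k set to 1 and arbitrary state values for illustration:

```python
from math import log, lgamma, isclose, pi

k, m = 1.0, 1.0   # illustrative units

def logZ(T, V, N):
    """log of the ideal-gas partition function Z = (2m*pi/beta)^(3N/2) V^N / N!."""
    beta = 1.0 / (k * T)
    return 1.5 * N * log(2 * m * pi / beta) + N * log(V) - lgamma(N + 1)

def F(T, V, N):
    """Helmholtz free energy F = -(1/beta) log Z = -k T log Z."""
    return -k * T * logZ(T, V, N)

T, V, N, h = 2.0, 3.0, 10.0, 1e-6

# dF/dV should be -p = -N k T / V (the ideal gas law):
dFdV = (F(T, V + h, N) - F(T, V - h, N)) / (2 * h)
print(isclose(dFdV, -N * k * T / V, rel_tol=1e-6))   # True

# dF/dT should be -S, with S from (2): S = k*beta*U + k*logZ and U = (3/2) N k T:
dFdT = (F(T + h, V, N) - F(T - h, V, N)) / (2 * h)
S = k * (1.0 / (k * T)) * (1.5 * N * k * T) + k * logZ(T, V, N)
print(isclose(dFdT, -S, rel_tol=1e-5))               # True
```

Note that the 1/N! factor, entered here through lgamma(N + 1) = log N!, drops out of both derivatives; it matters only for the extensivity of S and F themselves.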