Entropy and Ergodic Theory Lecture 12: Introduction to statistical mechanics, I

1 Introduction

Somewhere near the beginning, most introductions to statistical physics contain a sentence along the following lines:

    The main aim of statistical mechanics is to derive macroscopic properties of matter from the laws governing the microscopic interactions of the individual particles. (Thompson, Mathematical Statistical Mechanics, p. 32)

This is in contrast with the older subject of thermodynamics, which proposes some basic laws that govern certain kinds of macroscopic behaviour without probing the microscopic phenomena that give rise to those laws. In principle, one should be able to derive thermodynamics from statistical mechanics. Working out this derivation in detail is an important theoretical task of statistical physics.

The first appearance of a notion of entropy was in the early thermodynamics of the nineteenth century. With the benefit of modern statistical physics, one finds that this appearance of entropy is really a consequence of the same underlying mathematical phenomena as the others we have met in this course.

This lecture: a rough introduction to statistical mechanics based on the theory around entropy that we have already met.

2 Systems

Statistical mechanics analyzes systems that consist of huge numbers of elementary components, usually particles of some kind. The behaviour of these is governed

by the basic laws of either classical or quantum mechanics, depending on the model.

A full introduction to statistical mechanics should start with a discussion of how one isolates a system, at least conceptually, and describes its relationship to the rest of the universe. That relationship can follow various rules, and these govern the kinds of model that one uses for the system. Here we omit most of this discussion, and focus instead on some simple examples. As a result, our introduction to statistical mechanics is immediately incomplete. But the examples are quite intuitive, and do illustrate some of the basic phenomena, particularly in relation to entropy.

To get started, imagine an open box V in three-dimensional space which contains n classical particles with masses m_i and positions r_i. Suppose the particles move in the presence of a potential energy U (such as gravitational energy), and that they interact with one another only in pairs which either attract or repel through a force given by a potential energy \varphi that depends on the distance between them (such as electrostatic repulsion). Then, according to Newton's laws of motion, the positions evolve according to

    m_i \frac{d^2 r_i}{dt^2} = \text{force on particle } i = -\nabla U(r_i) - \sum_{j \ne i} \nabla \varphi(\|r_i - r_j\|).    (1)

We do not make use of these equations in this lecture: they are placed here only to make the general case for statistical mechanics as a discipline in its own right.

For macroscopic systems, we may well have n \approx 10^{23}. It is impossible to imagine solving the above equations of motion with that many degrees of freedom. Even if they could be solved, one could never in practice have accurate enough information about the initial conditions of all the particles in the system to derive useful predictions from the solutions.

And yet, many systems in the real world display remarkably smooth and regular behaviour. Some theory should be available to explain this, based on better ideas than just solving the equations of motion.
Statistical mechanics is that theory. When it is successful, it describes the approximate behaviour of certain key macroscopic quantities by finding ways to ignore most of the microscopic detail in the system. The approximations rely on statistical phenomena, so they are good precisely because the system has so many constituents. In a system made up of, say, 10^{20} particles, these approximations tend to be fantastically accurate.
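To see concretely why direct simulation is hopeless at macroscopic scales, here is a minimal sketch of a velocity-Verlet integrator for equations of the form (1). Everything in it is illustrative and not from the lecture: the inverse-square pair force is a stand-in for the gradient of a pair potential \varphi, the external potential U is taken to be zero, and the O(n^2) force loop is the point of the example, since at n \approx 10^{23} even a single time step is unthinkable.

```python
import numpy as np

def pair_force(ri, rj):
    """Repulsive inverse-square force on particle i from particle j:
    an illustrative stand-in for -grad phi(||r_i - r_j||) in (1)."""
    d = ri - rj
    r = np.linalg.norm(d)
    return d / r**3

def verlet_step(pos, vel, masses, dt):
    """One velocity-Verlet step for n particles interacting only in pairs
    (U = 0 here).  The double loop costs O(n^2) per step, which is why
    direct integration fails utterly for macroscopic particle numbers."""
    n = len(pos)

    def forces(p):
        f = np.zeros_like(p)
        for i in range(n):
            for j in range(n):
                if i != j:
                    f[i] += pair_force(p[i], p[j])
        return f

    f0 = forces(pos)
    new_pos = pos + vel * dt + 0.5 * (f0 / masses[:, None]) * dt**2
    f1 = forces(new_pos)
    new_vel = vel + 0.5 * ((f0 + f1) / masses[:, None]) * dt
    return new_pos, new_vel
```

Because the pair forces obey Newton's third law, the scheme conserves total momentum to floating-point precision, which is a quick correctness check on any such integrator.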

3 Thermodynamic equilibrium

A macroscopic description of a system is given in terms of a few real-number parameters. For instance, in simple models of a gas confined to a bounded region, these might be the volume of the region V, the number n of particles in the gas, and their total energy E. Other systems, such as blocks of magnetic material, require different choices of parameters.

Such a description still leaves enormous freedom as to the microscopic state of the system. For instance, in the example of a gas, we haven't said how much of the total energy E is accounted for by the kinetic energies of each of the particles, or the potential energies of their interactions. So, within a given model of a system, there is a big set of possible microstates that are compatible with the chosen macrostate.

One of the key ideas of statistical physics is that, to a macroscopic observer, this microscopic variation is essentially invisible. Whatever other macroscopic parameter one thinks of, it will take very nearly the same value on most of the microstates that are compatible with the given macrostate.

In the first place, this is a fact of experience. If we have two identical mugs on the same table in a room, and they contain identical volumes of coffee at the same temperature, then we usually do not discern any other difference between the mugs of coffee, even though the motions of the individual molecules inside of one are not related to those inside the other in any way.

This indistinguishability of microstates is not supposed to hold for absolutely all possible compatible microstates, only for the overwhelming majority. Indeed, after a system first enters a state described by certain macroscopic parameters, it usually does undergo some further macroscopic changes for a while, before its macroscopic state appears to become still. Microscopically it is still extremely active: particles move around and bounce off of one another all the time.
But after a certain relaxation time the system reaches thermodynamic equilibrium, meaning that its macroscopic appearance stays constant. Moreover, it is precisely that microscopic activity which is responsible for the relaxation of the system into thermodynamic equilibrium: microscopically, the state keeps changing, and if the microscopic dynamics mixes things up sufficiently well, then after a while the initial state of disequilibrium is forgotten and equilibrium is achieved.

This, again, is principally a fact of experience. (Early attempts to justify it rigorously rest on the so-called ergodic hypothesis. We do not explore this here, but historically it is the origin of ergodic theory as a subject. We turn to that later in the course.)
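The relaxation to equilibrium driven by microscopic mixing can be illustrated with the classical Ehrenfest urn model, a standard toy example that is not part of this lecture's formal development: n balls sit in two urns, and at each step a uniformly random ball switches urns. Started in extreme disequilibrium, the occupancy of the first urn relaxes to roughly n/2 and then merely fluctuates around it.

```python
import random

def ehrenfest(n_balls, n_steps, seed=0):
    """Ehrenfest urn model: n_balls distributed between two urns; at each
    step a uniformly random ball is moved to the other urn.  Starting
    with every ball in urn 0, the occupancy of urn 0 relaxes towards
    n_balls / 2.  Returns the occupancy history of urn 0."""
    rng = random.Random(seed)
    in_urn_0 = [True] * n_balls      # start in extreme disequilibrium
    count = n_balls                  # number of balls currently in urn 0
    history = []
    for _ in range(n_steps):
        i = rng.randrange(n_balls)
        count += -1 if in_urn_0[i] else 1
        in_urn_0[i] = not in_urn_0[i]
        history.append(count)
    return history
```

The microscopic rule never stops acting, yet after the relaxation period the macroscopic quantity (the occupancy) looks constant: exactly the picture described above.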

This idea of equilibrium is really only ever an approximation to the behaviour of the real world; put another way, it is a matter of interpretation. In the hot-mug-of-coffee example, if we care about the behaviour of the coffee over the next few seconds, then maybe we can regard it as being in equilibrium. If we care about the whole of the next hour, then the coffee will noticeably lose heat to the air in the room, and we should not regard equilibrium as being attained until that process is essentially finished.

The convergence to thermodynamic equilibrium is one of the basic observations in the science of thermodynamics. One can describe it without reference to the microscopic state of a system. But in statistical physics, it becomes the first key step towards a new and powerful mathematical formalism.

4 The microcanonical ensemble

Conceptually, it is simplest to model systems that are isolated. This means that the constituent particles have negligible interaction with the rest of the universe. If a system is isolated, then the values of various physical quantities are constant: this is a consequence of the basic conservation laws that underlie all of physics. So a model of an isolated macroscopic system is constrained by its values of certain macroscopic parameters, such as volume, particle-number and energy.

Within such a model, the microscopic basis for the achievement of thermodynamic equilibrium is that, among all possible microstates compatible with those macroscopic parameters, the overwhelming majority look the same as regards any other macroscopically measurable quantity. With this in mind, we reach perhaps the first big idea in statistical mechanics:

    For any macroscopic quantity, its behaviour for a system in equilibrium may be predicted very accurately by its average value over all compatible microstates.

Mathematically, the plan is this: rather than tracking the exact trajectory of the system among all possible microstates, we will

1.
describe the set of all microstates that are compatible with the given macroscopic parameters;

2. endow that set with a natural probability measure;

3. predict that the values of other macroscopic observables are given by their averages with respect to that probability measure.
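For a toy discrete system these three steps can be carried out exactly by brute force. The sketch below is illustrative only: the two-state "spins" and the choice of energy function are assumptions, not anything fixed by the lecture. It enumerates the compatible microstates, puts the uniform measure on them, and averages an observable.

```python
from itertools import product

def microcanonical_average(n, E, energy, observable):
    """Steps 1-3 of the plan, for a toy system of n components each with
    individual states {0, 1}:
      1. enumerate all microstates in {0,1}^n with energy(state) == E;
      2. put the uniform measure on that set;
      3. return the average of the observable under that measure."""
    compatible = [s for s in product((0, 1), repeat=n) if energy(s) == E]
    assert compatible, "no microstate has this energy"
    return sum(observable(s) for s in compatible) / len(compatible)
```

For instance, with energy(s) = sum(s) (each up spin costs one unit) and the observable "value of the first spin", microcanonical_average(10, 4, sum, lambda s: s[0]) gives E/n = 0.4, as the symmetry between particles predicts.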

If a system is in equilibrium, it shouldn't matter exactly what microstate it has reached at any given moment: the prediction obtained from step 3 above will be a very accurate guess most of the time. In fact, it turns out to be an extremely accurate guess for an overwhelming fraction of the time.

The choice of the measure in step 2 depends on the model and the description in step 1. In most models there is a natural candidate for the uniform measure on all compatible microstates, and then this is the measure that we use. In physics it is called the microcanonical ensemble.

Once one has decided on a sensible microscopic model and on the microcanonical ensemble, it becomes a mathematical problem to decide whether other macroscopic quantities really are roughly constant with high probability. If this is the case, then the real test of our plan above is by agreement with experiment.

5 Simple models of isolated systems

In the model (1) from classical mechanics, the instantaneous state of the system is described by the locations and momenta of all n particles: that is, by a point in the set

    M := V^n \times (\mathbb{R}^3)^n \subseteq \mathbb{R}^{6n}.

These are the initial data needed to evolve the equation (1) (with some additional rules about the nature of collisions with the sides of the box). For any model, this set of possible states for the whole system is called the state space. Within it, for any real value E, there is an energy surface: in the model above this is the set

    \Omega(E) := \Big\{ (r_1, \dots, r_n, p_1, \dots, p_n) :
        \underbrace{\sum_{i=1}^n \frac{\|p_i\|^2}{2 m_i}}_{\text{total kinetic energy}}
        + \underbrace{\sum_{i=1}^n U(r_i) + \sum_{i<j} \varphi(\|r_i - r_j\|)}_{\text{total potential energy}}
        = E \Big\}.

In this model, assuming we have no other natural conserved quantities, the natural choice of microcanonical ensemble is the (6n-1)-dimensional surface measure on \Omega(E), normalized to be a probability measure.
Of course, we must choose a value of E for which \Omega(E) is nonempty, and worry about whether this set is regular enough to carry such a natural surface measure.
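As a concrete sanity check on the definition of \Omega(E), the energy functional it is built from can be evaluated directly for small microstates. The potentials U and \varphi below are placeholders supplied by the caller; nothing here fixes a particular physical system.

```python
import numpy as np

def classical_energy(r, p, masses, U, phi):
    """Energy of a classical microstate (r_1,...,r_n, p_1,...,p_n):
    total kinetic energy  sum_i ||p_i||^2 / (2 m_i)
    plus total potential energy  sum_i U(r_i) + sum_{i<j} phi(||r_i - r_j||).
    A microstate lies on the energy surface Omega(E) exactly when this
    function returns E."""
    n = len(r)
    kinetic = sum(np.dot(p[i], p[i]) / (2.0 * masses[i]) for i in range(n))
    external = sum(U(r[i]) for i in range(n))
    pair = sum(phi(np.linalg.norm(r[i] - r[j]))
               for i in range(n) for j in range(i + 1, n))
    return kinetic + external + pair
```

For example, two particles of masses 1 and 2 at distance 2, with momenta of norms 1 and 2, zero external potential, and the (assumed) pair potential phi(d) = 1/d have energy 1/2 + 1 + 1/2 = 2.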

We do not analyze such continuous models any further in this lecture. Instead we consider some even simpler discrete models, where the relevant statistical phenomena are more easily uncovered.

So now consider a system which consists of n generic particles, each of which has available to it a discrete set A = \{a_1, a_2, \dots\} of possible individual states. That set may be finite or countably infinite. The microstate of the whole system is specified by the n individual states of the particles: that is, by a point in A^n. Such a model with a discrete set of permitted states may seem like a drastic simplification of a mechanical model like (1), but it is actually quite close to the truth according to quantum mechanics.

Each microstate has a total energy. Let us assume further that our particles are identical and non-interacting. This means there is a fixed energy function \Phi_0 : A \to \mathbb{R}, called the Hamiltonian, such that the total energy of the microstate a = (a_1, \dots, a_n) is

    \Phi(a) = \sum_{i=1}^n \Phi_0(a_i).    (2)

(Notational remark: the most common symbol for a Hamiltonian is H; I am using \Phi instead because I want to keep using H for Shannon entropy. The choice of \Phi is quite common among more advanced pure math texts. On the other hand, the standard physical symbol for entropy is S, and I'll be using S in a related way shortly, too.)

The cases of finite and infinite A are slightly different once we enter the more detailed analysis below. If A is finite, then the tools we need are given almost immediately by our previous work on large deviations theory. One useful convention: it will be clear that nothing changes if we adjust the function \Phi_0 by an additive constant, so we may always assume that \min\{\Phi_0(a) : a \in A\} = 0. Having done so, any state a for which \Phi_0(a) = 0 is called a single-particle ground state. We do make one extra assumption here: that \Phi_0 takes at least two distinct values. This simply rules out degenerate cases.
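A minimal computational companion to (2) and the normalization convention; the three level names and their energies below are invented purely for illustration.

```python
def Phi(state, phi0):
    """Total energy of a microstate under the non-interacting Hamiltonian
    (2): Phi(a) = sum_i Phi_0(a_i), where phi0 maps each single-particle
    state to its energy."""
    return sum(phi0[a] for a in state)

def normalize(phi0):
    """Shift Phi_0 by an additive constant so that its minimum is 0, the
    convention adopted in the text.  States with shifted energy 0 are
    then the single-particle ground states."""
    m = min(phi0.values())
    return {a: e - m for a, e in phi0.items()}
```

With hypothetical levels {'g': 1.5, 'e1': 2.5, 'e2': 4.0}, normalization sends 'g' to energy 0, making it the single-particle ground state, and total energies of microstates shift by 1.5 per particle.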
If A is infinite, then we need to modify those large-deviations results slightly, and this is possible only if we make some (quite reasonable) extra assumptions about A.

Most importantly, if an individual particle has limited energy, then it should not have too many states available to it. The simplest assumption to make is simply that the set \{a \in A : \Phi_0(a) \le r\} is finite for every r \in \mathbb{R}. But it turns out that this is not quite enough for the theory to work correctly. We really need the following more quantitative version of this assumption: for any \beta > 0, we have

    \sum_{a \in A} e^{-\beta \Phi_0(a)} < \infty.    (3)

This is equivalent to assuming that the cardinality of \{a \in A : \Phi_0(a) \le r\} grows subexponentially in r. Such a precise mathematical assumption may seem surprising so early in the story, but the model below really can degenerate without it. It's also a very weak assumption, satisfied by all the usual models of basic physical systems.

Having made this assumption, then as in the case of finite A we may adjust \Phi_0 by a constant to assume that its minimum value is zero. The term ground state is used here as before.

5.1 An important subtlety

This is not an attempt to model a system which is truly non-interacting. If the particles were isolated and really didn't interact at all, then they could not exchange energy with one another, and then all of the individual energies \Phi_0(a_i), i = 1, 2, \dots, n, would be constant. In this case there would be no mechanism to drive this system into equilibrium. Rather, we are really using \Phi(a) as an approximation to a Hamiltonian of the form

    \underbrace{\sum_{i=1}^n \Phi_0(a_i)}_{\text{main term}} + \underbrace{\Phi_{\text{interaction}}(a)}_{\text{much smaller, but more complicated}}.    (4)

It is the interactions between particles that serve to mix up their different behaviours and bring the system as a whole towards the typical macroscopic behaviour of equilibrium. But the idea in a non-interacting model is that, although the presence of \Phi_{\text{interaction}}(a) is necessary to explain the movement towards equilibrium, it is small enough to be neglected when doing calculations in terms of the microcanonical ensemble.
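Assumption (3) above is easy to probe numerically. For an assumed evenly spaced spectrum \Phi_0(a_k) = k for k = 0, 1, 2, \dots (harmonic-oscillator-like levels, chosen here only as an example), the sum in (3) is a geometric series with value 1/(1 - e^{-\beta}), and truncating once the terms become negligible recovers that value.

```python
import math

def partition_sum(beta, phi0_values, tol=1e-12):
    """Numerically approximate the sum in (3), truncating once terms fall
    below tol.  Assumes phi0_values is an increasing sequence of levels,
    so later terms are smaller and the truncation is harmless whenever
    (3) actually holds."""
    total = 0.0
    for e in phi0_values:
        term = math.exp(-beta * e)
        total += term
        if term < tol:
            break
    return total
```

For levels growing exponentially fast the sum would still converge, while for levels accumulating too slowly (cardinality growing exponentially in r) it would diverge for small \beta: exactly the degeneration the text warns about.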

The microcanonical ensemble in this model is obtained by choosing a value E and considering the uniform distribution over the set of states for the system whose energy is En. In writing it this way, E is the specific energy of the system: the energy-per-particle. This is natural, because the total energy allowed in the model generally scales like the number of particles.

In fact, we do not ask precisely that the state a satisfy \Phi(a) = En, but only that \Phi(a) be in a small window around En, with an error that is small relative to the total number of particles. This approximation is needed to prevent certain pathologies. But it is also natural if we remember that \Phi is only an approximation to the true Hamiltonian (4) anyway.

So, in a model of this kind, the microcanonical ensemble at specific energy E and with error tolerance \varepsilon > 0 is the uniform distribution on the set

    \Omega_n(E, \varepsilon) := \{a \in A^n : |\Phi(a) - En| < \varepsilon n\}.

This should remind you of our various notions of typical sets in information theory. The main new feature of the present setting is that A can be infinite. But the set \Omega_n(E, \varepsilon) is finite for any given values of E and \varepsilon, as a consequence of (3).

6 Notes and remarks

Sources for this lecture: Our choice of first steps is heavily motivated by [Khi49], although Khinchin stays in the setting of continuous models. See [Ell06, Chapter III] for a much more careful account of an even more special case than the above. I like [B 06, Chapter 3] for a terse but accessible introduction to statistical physics for applied mathematicians.

Further reading: There are many good physics textbooks on statistical physics. I mostly looked at Pathria and Beale's Statistical Mechanics. Classic more advanced books include [Fey98] and [LL80]. But it's definitely worth shopping around for yourself.

Several authors have attempted a combined description of information theory and statistical mechanics, but with explanations based on physical ideas rather than mathematics.
Two places to start looking into this are [Jay89] and [Bri62].

References

[B 06]  Oliver Bühler. A brief introduction to classical, statistical, and quantum mechanics, volume 13 of Courant Lecture Notes in Mathematics. New York University, Courant Institute of Mathematical Sciences, New York; American Mathematical Society, Providence, RI, 2006.

[Bri62] Léon Brillouin. Science and Information Theory. Second edition. Academic Press, Inc., New York, 1962.

[Ell06] Richard S. Ellis. Entropy, Large Deviations, and Statistical Mechanics. Classics in Mathematics. Springer-Verlag, Berlin, 2006. Reprint of the 1985 original.

[Fey98] Richard Feynman. Statistical Mechanics: A Set of Lectures. Advanced Books Classics. Westview Press, revised edition, 1998.

[Jay89] E. T. Jaynes. E. T. Jaynes: Papers on Probability, Statistics and Statistical Physics, volume 50 of Pallas Paperbacks. Kluwer Academic Publishers Group, Dordrecht, 1989. Reprint of the 1983 original, edited and with an introduction by R. D. Rosenkrantz.

[Khi49] A. I. Khinchin. Mathematical Foundations of Statistical Mechanics. Dover Publications, Inc., New York, 1949. Translated by G. Gamow.

[LL80]  L. D. Landau and E. M. Lifschitz. Statistical Physics, Part 1. Course of Theoretical Physics. Butterworth-Heinemann, third edition, 1980.

TIM AUSTIN
Email: tim@cims.nyu.edu
URL: cims.nyu.edu/~tim