The Gibbs Approach to Statistical Thermodynamics

Addison Ault, Department of Chemistry, Cornell College, Mount Vernon, IA

There are at least two ways to think about statistical thermodynamics. The first way is usually attributed to the Austrian physicist Ludwig Boltzmann (1844-1906), and a second way has been attributed to the American chemist J. Willard Gibbs (1839-1903). My purpose here is to present Schrödinger's introduction to Gibbs's method. I do this by quoting excerpts from the first part of a collection of lectures by Erwin Schrödinger on Statistical Thermodynamics (1). It is good to hear the voice of authority, even though it makes considerable demands on the listener. I've tried to help by adding comments and explanations. With three exceptions, all the quoted material is from Schrödinger's lectures (1).

The two ways of thinking about statistical thermodynamics

The Boltzmann approach to statistical thermodynamics is to consider a single system, "... the one macroscopic device that is actually erected on our laboratory table." (Schrödinger, p. 3, Reference 1). The Gibbs approach is to use a collection, or assembly, or ensemble, of these systems. The characteristic features of this assembly or ensemble are that its members are imaginary in nature and infinite in number.

Ensembles

An ensemble is "a large number of imaginary replicas of the system under consideration" (LMS, p. 828, Reference 2). The reason for using the idea of an ensemble is "to make it easier to visualize the averaging arguments that are used in statistical mechanics" (LMS, p. 828). There are different types of ensembles, distinguished by the conditions specified for the component systems. In a microcanonical ensemble every component system has the same number of molecules, N, the same volume, V, and the same energy, E. In a canonical ensemble every component system has the same N, the same V, and the same temperature, T.

Assumptions of the ensemble approach

The first assumption that accompanies the ensemble approach is that there are properties of the systems of the ensemble whose average corresponds to experimentally observed properties (LMS, p. 829). For example, the average energy of the systems of a canonical ensemble (constant N, V, T) will equal the observed energy of the one actual system. The second assumption is that the individual systems of the ensemble are distributed, in ways that are consistent with the constraints, with equal probability over all possible quantum states.

Schrödinger's presentation of the Gibbs approach

The object of this seminar is to develop briefly one simple, unified standard method, capable of dealing, without changing the fundamental attitude, with all cases.... (S. p. 1). It is not a first introduction for newcomers to the subject, but rather a repetitorium. (S. p. 1)

There is, essentially, only one problem in statistical thermodynamics: the distribution of a given amount of energy E over N identical systems. Or perhaps better: to determine the distribution of an [ensemble] of N identical systems over the possible states in which this [ensemble] can find itself, given that the energy of the [ensemble] is a constant E. The idea is that there is weak interaction between them [the systems of the ensemble], so weak that the energy of interaction can be disregarded, that one can speak of the private energy of every one of them [the systems of the ensemble] and that the sum of their private energies has to equal E. (S. p. 1)
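
Before going further, it may help to make this one problem concrete. The following minimal sketch (in Python; the three evenly spaced levels, N = 3 systems, and total energy E = 3 are invented for illustration and are not from Schrödinger) enumerates every way the total energy can be shared among N weakly interacting systems, and groups the resulting states of the ensemble by their occupation numbers.

    from itertools import product
    from collections import Counter

    # Illustrative assumptions: each system can be in a state of energy 0, 1, or 2
    # (arbitrary units); the ensemble has N = 3 systems and total energy E = 3.
    energies = [0, 1, 2]   # private energies epsilon_l of the single-system states
    N = 3                  # number of systems in the (tiny) ensemble
    E_total = 3            # total energy to be distributed

    # A "state of the ensemble" assigns one single-system state to each system.
    ensemble_states = [
        assignment for assignment in product(energies, repeat=N)
        if sum(assignment) == E_total      # the private energies must sum to E
    ]

    # Classify ensemble states by occupation numbers (a_0, a_1, a_2): how many
    # systems are found in each single-system state.
    classes = Counter(
        tuple(assignment.count(e) for e in energies) for assignment in ensemble_states
    )

    for occupations, count in sorted(classes.items()):
        print(f"occupation numbers {occupations}: {count} ensemble state(s)")
    print("total ensemble states with energy E:", len(ensemble_states))

For this toy case there are seven ensemble states falling into two classes, (1, 1, 1) with six states and (0, 3, 0) with one. Classifying and counting the states of the ensemble in exactly this way is what Schrödinger sets up next.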

Schrödinger continues:

To determine the distribution means in principle to make oneself familiar with any possible distribution-of-the-energy (or state-of-the-assembly), to classify them [the states] in a suitable way, i.e. in the way suiting the purpose in question, and to count the numbers in the classes, so as to be able to judge of the probability of certain features or characteristics turning up in the [ensemble]. (S. p. 2). This is the mathematical problem -- always the same. (S. p. 2)

There are two ways to think about it

But there are two different attitudes as regards the physical application of the mathematical result. We shall later, for obvious reasons, decidedly favor one of them; for the moment, we must explain them both. The older and more naïve application is to N actually existing physical systems in actual physical interaction with each other, e.g. gas molecules or electrons or Planck oscillators or degrees of freedom... The N of them together represent the actual physical system under consideration. This original point of view is associated with the names of Maxwell, Boltzmann and others. But it suffices only for dealing with a very restricted class of physical systems -- virtually only with gases. It is not applicable to a system which does not consist of a great number of identical constituents with private energies. In a solid the interaction between neighboring atoms is so strong that you cannot mentally divide up its total energy into the private energies of its atoms. (S. pp. 2, 3)

Hence a second point of view (or, rather, a different application of the same mathematical results), which we owe to Willard Gibbs, has been developed. It has a particular beauty of its own, is applicable quite generally to every physical system, and has some advantages to be mentioned forthwith. Here the N identical systems are mental copies of the one system under consideration -- of the one macroscopic device that is actually erected on our laboratory table. Now what on earth could it mean, physically, to distribute a given amount of energy E over these N mental copies? The idea is, in my view, that you can, of course, imagine that you really had N copies of your system, that they really were in weak interaction with each other, but isolated from the rest of the world. Fixing your attention on one of them, you find it in a peculiar kind of heat-bath which consists of the N-1 others. (S. p. 3)

Now you have, on the one hand, the experience that in thermodynamical equilibrium the behavior of a physical system which you place in a heat-bath is always the same whatever be the nature of the heat-bath that keeps it at constant temperature, provided, of course, that the bath is chemically neutral towards your system, i.e. that there is nothing else but heat exchange between them. On the other hand, the statistical calculations do not refer to the mechanism of interaction; they only assume that it is purely mechanical, that it does not affect the nature of the single systems but merely transfers energy from one to the other. (S. pp. 3, 4)

These considerations suggest that we may regard the behavior of any one of those N systems as describing the one actually existing system when placed in a heat-bath of given temperature. Moreover, since the N systems are alike and under similar conditions, we can then obviously, from their simultaneous statistics, judge of the probability of finding our system, when placed in a heat-bath of given temperature, in one or other of its private states. Hence all questions concerning the system in a heat-bath can be answered. (S. p. 4)

We adopt this point of view in principle, though all the following considerations may, with due care, also be applied to the other.

The advantage consists not only in the general applicability, but also in the following two points: (i) N can be made arbitrarily large. In fact, in case of doubt, we always mean lim N → ∞ (infinitely large heat-bath). Hence the applicability, for example, of Stirling's formula for N!, or for the factorials of occupation numbers proportional to N (and thus going with N to infinity), need never be questioned. (ii) No question about the individuality of the members of the [ensemble] can ever arise... (S. p. 4)

The method of the most probable distribution

Here is where Schrödinger presents the Gibbs approach.

We are faced with an [ensemble] of N identical systems. We describe the nature of any one [system] by enumerating its possible states, which we label 1, 2, 3, 4, ..., l, .... (S. p. 5). [There are some qualifications here.] We shall always regard the state of the [ensemble] as determined by the indication that system No. 1 is in state, say, l_1, No. 2 in state l_2, ..., No. N in state l_N. We shall adhere to this, though the attitude is altogether wrong. (S. p. 5) [Wrong because states may not be pure but most likely are mixed. But, apparently, we don't need to worry about this because the results do not differ appreciably from those obtained from the simpler and more naïve point of view, which we have outlined above and now adopt.] (S. pp. 5, 6)

Now we get to the big idea

Thus a certain class of states of the [ensemble] will be indicated by saying that a_1, a_2, a_3, ..., a_l, ... of the N systems are in state 1, 2, 3, ..., l, ..., respectively, and all states of the [ensemble] are embraced without overlapping by the classes described by all different admissible sets of numbers a_l.

    [system] State No.   1     2     3     ...   l     ...
    Energy               ε_1   ε_2   ε_3   ...   ε_l   ...        (2.1)
    Occupation No.       a_1   a_2   a_3   ...   a_l   ...

The number of single states, belonging to this class, is obviously

    P = \frac{N!}{a_1! \, a_2! \, a_3! \cdots a_l! \cdots}        (2.2)

The set of numbers a_l must, of course, comply with the conditions

    \sum_l a_l = N \quad \text{and} \quad \sum_l \varepsilon_l a_l = E        (2.3)

(S. p. 6)
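
As a concrete check on (2.2) and (2.3), here is a minimal sketch; the four energy levels and the particular occupation numbers are invented for illustration. It evaluates P for one class and the two constrained sums N and E.

    from math import factorial

    def multiplicity(occupations):
        """P of Eq. (2.2): the number of single ensemble states in the class {a_l}."""
        P = factorial(sum(occupations))
        for a in occupations:
            P //= factorial(a)   # exact integer division; P is a multinomial coefficient
        return P

    # Illustrative assumptions: four single-system levels and one admissible class.
    epsilon = [0.0, 1.0, 2.0, 3.0]   # energies epsilon_l of states 1, 2, 3, 4 (arbitrary units)
    a       = [5,   3,   1,   1]     # occupation numbers a_l for this class

    N = sum(a)                                   # first condition of (2.3)
    E = sum(e * n for e, n in zip(epsilon, a))   # second condition of (2.3)
    print("N =", N, " E =", E, " P =", multiplicity(a))   # P = 10!/(5! 3! 1! 1!) = 5040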

The point

There are many, N, systems in the ensemble. Each system will be in a state, and each state is characterized by its energy. Systems with energy ε_1 are in the state called state 1, systems with energy ε_2 are in the state called state 2, and so on.

The ensemble also has states. The state of the ensemble is determined by the states of the N systems of which the ensemble is composed. One conceivable state of the ensemble is that every system has the same energy; this is unlikely. Another possibility is that every system has a different energy; this, too, is unlikely. What is more likely is that some systems will have energy ε_1, some will have energy ε_2, and so on. Thus states of the ensemble can be classified, or grouped together, according to the number of component systems that have each energy: a certain number, a_1, with energy ε_1, a certain number, a_2, with energy ε_2, and so on.

But not all conceivable combinations of systems (sets of occupation numbers) are admissible (acceptable); there are constraints. The occupation numbers must add up to N, the total number of systems in the ensemble, and the private energies of all the systems must add up to E, the total energy of the ensemble:

    \sum_l a_l = N \quad \text{and} \quad \sum_l \varepsilon_l a_l = E        (2.3)

Schrödinger continues...

Statements (2.2) and (2.3) really finish our counting. But in this form the result is wholly unsurveyable. (S. p. 6). The present method admits that, on account of the enormous largeness of the number N, the total number of distributions (i.e. the sum of all P's) is very nearly exhausted by the sum of those P's whose number sets a_l do not deviate appreciably from that set which gives P its maximum value (among those, of course, which comply with (2.3)). In other words, if we regard this set of occupation numbers as obtaining always, we disregard only a very small fraction of all possible distributions -- and this has a vanishing likelihood of ever being realized. This assumption is rigorously correct in the limit N → ∞ (thus: in the application to the mental or virtual assembly, where in dubio we always mean this limiting case, which corresponds physically to an infinite heat-bath; you see again the great superiority of the Gibbs point of view). Here we adopt this assumption without the proof, which will emerge later... (S. pp. 6, 7)

The most probable distribution is overwhelmingly probable

The number of single states, belonging to this class, is obviously

    P = \frac{N!}{a_1! \, a_2! \, a_3! \cdots a_l! \cdots}        (2.2)

(S. p. 6)

Schrödinger is saying that you can add the P's for every class and get the total number of single states of the ensemble, summed over all the classes. When the P for the most probable class (more precisely, the sum of "those P's whose number sets a_l do not deviate appreciably from that set which gives P its maximum value") is divided by this total, the fraction is essentially equal to 1. The most probable set a_l (sometimes represented by a*_l) is thus essentially the only set that matters; the state of the ensemble can always be represented by a*_l.

Schrödinger goes on to say:

For N large, but finite, the assumption is only approximately true. Indeed, in the application to the Boltzmann case, distributions with occupation numbers deviating from the maximum set must not be entirely disregarded. They give information on the thermodynamic fluctuations of the Boltzmann system, when kept at constant energy E, i.e. in perfect heat isolation. But we shall not work that out here... (S. p. 7)

Continuing...

Returning to (2.2) and (2.3), we choose the logarithm of P as the function whose maximum we determine, taking care of the accessory conditions in the usual way by Lagrange multipliers, λ and μ; i.e. we seek the unconditional maximum of

    \log P - \lambda \sum_l a_l - \mu \sum_l \varepsilon_l a_l        (2.4)

For the logarithms of the factorials we use Stirling's formula in the form

    \log(n!) = n(\log n - 1)        (2.5)

(S. p. 7)
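
The claim that the most probable class very nearly exhausts the sum of all the P's can be checked numerically. The sketch below is my own illustration, not Schrödinger's: it assumes three evenly spaced levels 0, 1, 2 and total energy E = N, so that the constraints (2.3) leave a single free integer k with occupation numbers (k, N - 2k, k). It enumerates every admissible class, finds the maximizing class, and shows how log P_max approaches the log of the full sum as N grows.

    from math import factorial, log, sqrt

    def class_weights(N):
        """P of Eq. (2.2) for every admissible class, with levels 0, 1, 2 and E = N."""
        fN = factorial(N)
        # Constraints (2.3) leave one free integer k: (a_0, a_1, a_2) = (k, N - 2k, k).
        return {k: fN // (factorial(k) ** 2 * factorial(N - 2 * k))
                for k in range(N // 2 + 1)}

    for N in (30, 300, 1500):
        weights = class_weights(N)
        total = sum(weights.values())
        k_star = max(weights, key=weights.get)        # the most probable class
        near = sum(w for k, w in weights.items() if abs(k - k_star) <= 5 * sqrt(N))
        print(f"N = {N:5d}   k* = {k_star:4d}   "
              f"log P_max / log(sum of all P) = {log(weights[k_star]) / log(total):.4f}   "
              f"share of the sum within 5*sqrt(N) of k*: {near / total:.6f}")

As N grows, the ratio of logarithms approaches 1 and the classes close to the maximum account for essentially all of the sum, which is the sense in which the most probable distribution is overwhelmingly probable.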

Skipping a little we get to

Calling E/N = U, the average share of energy of one system, we express our whole results thus:

    \frac{E}{N} = U = \frac{\sum_l \varepsilon_l e^{-\mu \varepsilon_l}}{\sum_l e^{-\mu \varepsilon_l}} = -\frac{\partial}{\partial \mu} \log \sum_l e^{-\mu \varepsilon_l}

    a_l = N \, \frac{e^{-\mu \varepsilon_l}}{\sum_l e^{-\mu \varepsilon_l}} = -\frac{N}{\mu} \, \frac{\partial}{\partial \varepsilon_l} \log \sum_l e^{-\mu \varepsilon_l}        (2.6)

The set of equations (2.6) indicates the distribution of our N systems over their energy levels. It may be said to contain, in a nutshell, the whole of thermodynamics, which hinges entirely on this basic distribution. The relation itself is very perspicuous: the exponential e^{-με_l} indicates the occupation number a_l as a fraction of the total number N of systems, the sum in the denominator [to be shown to be the partition function, Q, or Z] being only a normalizing factor. (S. p. 8)

A physical interpretation

To explain this without too many qualifications, we now definitively adopt the Gibbs point of view, namely, that we are dealing with a virtual ensemble, of which the single member is the system really under consideration. And since all the single members are of equal right, we may now, when it comes to physical interpretation, think of the a_l, or rather of the a_l/N, as the frequencies with which a single system, immersed in a large heat-bath, will be encountered in the state ε_l, while U is its average energy under these circumstances. (S. p. 9)

The meaning of μ

A little later, as usual, μ, often called β, is identified as 1/kT. (S. p. 13)
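
Equation (2.6) is easy to evaluate once the levels ε_l and the multiplier μ (= 1/kT) are given. The sketch below uses an invented set of four levels and a temperature of 300 K purely for illustration; it computes the occupation fractions a_l/N and the average energy U, and checks the identity U = -∂(log Σ e^{-με_l})/∂μ by a finite difference.

    import math

    k_B = 1.380649e-23            # Boltzmann constant, J/K
    T   = 300.0                   # K (illustrative choice)
    mu  = 1.0 / (k_B * T)         # Schrodinger's mu, usually written beta = 1/kT

    # Illustrative assumption: four nondegenerate levels spaced by roughly kT.
    epsilon = [0.0, 1.0e-21, 2.0e-21, 4.0e-21]    # joules

    def log_Z(mu):
        """log of the normalizing sum in (2.6), i.e. the partition function Q or Z."""
        return math.log(sum(math.exp(-mu * e) for e in epsilon))

    Z = math.exp(log_Z(mu))
    fractions = [math.exp(-mu * e) / Z for e in epsilon]     # a_l / N from (2.6)
    U = sum(e * f for e, f in zip(epsilon, fractions))       # average energy per system

    # Check U = -d(log Z)/d(mu) with a central finite difference.
    h = mu * 1e-6
    U_check = -(log_Z(mu + h) - log_Z(mu - h)) / (2 * h)

    for e, f in zip(epsilon, fractions):
        print(f"epsilon = {e:.2e} J   a_l/N = {f:.4f}")
    print(f"U (direct sum)       = {U:.4e} J")
    print(f"U (from -dlogZ/dmu)  = {U_check:.4e} J")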

Summary; Boltzmann versus Gibbs

This is the mathematical problem -- always the same... (S. p. 2)

Hence a second point of view (or, rather, a different application of the same mathematical results), which we owe to Willard Gibbs, has been developed. (S. p. 3)

The mathematical problem

There is, essentially, only one problem in statistical thermodynamics: the distribution of a given amount of energy E over N identical systems. (S. p. 1) The identical systems could be 10^23 atoms of helium at normal temperature and pressure. We can call this the Boltzmann approach.

Or perhaps better: to determine the distribution of an [ensemble] of N identical systems over the possible states in which this [ensemble] can find itself, given that the energy of the [ensemble] is a constant E. (S. p. 1) The N identical systems could be N replicas of the 10^23 atoms of helium at normal temperature and pressure. We can call this the Gibbs approach.

The Boltzmann representation of the math

The Boltzmann approach has been represented as finding the maximum of

    W = \frac{N!}{a_1! \, a_2! \, a_3! \cdots a_i! \cdots}        (Eq. 2)

by way of maximizing log W in the expression

    \log W - \alpha \sum_i a_i - \beta \sum_i a_i \varepsilon_i

It then turns out that the weight, W, or multiplicity, of the most probable distribution of particles of the system is essentially equal to Ω, the weight of all the distributions.

The Gibbs representation of the math

The Gibbs approach has been represented here as finding the maximum of

    P = \frac{N!}{a_1! \, a_2! \, a_3! \cdots a_l! \cdots}        (2.2)

by way of maximizing log P in the expression

    \log P - \lambda \sum_l a_l - \mu \sum_l \varepsilon_l a_l        (2.4)

It then turns out that P, the number of single states for the most probable distribution of the systems of the ensemble, is essentially equal to the total number of single states of the ensemble, summed over all admissible distributions (the ensemble analogue of Ω).
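
A final numerical illustration (my own, with invented levels; it is a sketch of the method, not part of either Schrödinger's or Gibbs's text): one can find the occupation set that maximizes P by brute force under the constraints (2.3) and compare it with the exponential form N e^{-με_l} / Σ e^{-με_l} of (2.6), with μ adjusted so that the mean energy per system equals E/N. For a small ensemble the two agree to within about one system per level.

    from math import factorial, exp

    # Illustrative assumptions: three levels, N systems, total energy E.
    epsilon = [0, 1, 2]
    N, E = 300, 240                      # average energy per system u = E/N = 0.8

    def P(a):
        """Multinomial weight of an occupation set, Eq. (2.2)."""
        out = factorial(sum(a))
        for n in a:
            out //= factorial(n)
        return out

    # Brute-force search over every occupation set satisfying (2.3):
    # with a_2 = k, the constraints give (a_0, a_1, a_2) = (N - E + k, E - 2k, k).
    best = max(
        ((N - E + k, E - 2 * k, k) for k in range(E // 2 + 1) if N - E + k >= 0),
        key=P,
    )

    # Exponential form from (2.6): choose mu so that the mean energy equals E/N.
    def mean_energy(mu):
        Z = sum(exp(-mu * e) for e in epsilon)
        return sum(e * exp(-mu * e) for e in epsilon) / Z

    lo, hi = -50.0, 50.0                 # bisection; mean energy decreases as mu increases
    for _ in range(100):
        mid = 0.5 * (lo + hi)
        lo, hi = (mid, hi) if mean_energy(mid) > E / N else (lo, mid)
    mu = 0.5 * (lo + hi)

    Z = sum(exp(-mu * e) for e in epsilon)
    predicted = [N * exp(-mu * e) / Z for e in epsilon]

    print("most probable occupation set (brute force):", best)
    print("N exp(-mu*eps)/Z with the fitted mu:       ",
          [round(x, 1) for x in predicted])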

Why do we need two ways?

First, it often is helpful to say the same thing in two different ways; perhaps this is the most important reason. A second reason is that there are very simple model systems for which the possibilities are easily counted, and an ensemble is not needed (Reference 3).

A final comment

My attitude here was: if Schrödinger said it, I should believe it. The problem for me, however, was being confident that I understood what he was saying. To this end I found pages 35 through 40 of Statistical Mechanics by Donald A. McQuarrie to be very helpful (Reference 4).

Literature Cited

1. Schrödinger, Erwin. Statistical Thermodynamics; Cambridge University Press, 1948.
2. Laidler, K. J.; Meiser, J. H.; Sanctuary, B. C. Physical Chemistry, 4th ed.; Houghton Mifflin: Boston, MA, 2003.
3. Dill, Ken A.; Bromberg, Sarina. Molecular Driving Forces: Statistical Thermodynamics in Chemistry and Biology; Garland Science, 2003.
4. McQuarrie, Donald A. Statistical Mechanics; University Science Books: Sausalito, CA, 2000.