Basic Concepts and Tools in Statistical Physics

Chapter 1
Basic Concepts and Tools in Statistical Physics

1.1 Introduction

Statistical mechanics provides general methods to study the properties of systems composed of a large number of particles. It establishes general formulas connecting the physical parameters of a system to various physical quantities; from these formulas, one can deduce the properties of a system once its parameters are known. In general, the microscopic mechanisms that give rise to the interaction parameters are provided by quantum mechanics. Statistical mechanics and quantum mechanics are at the heart of modern physics, which has allowed a microscopic understanding of the properties of matter and has led to spectacular technological innovations that have radically changed our daily life over the past 50 years. While quantum mechanics searches for microscopic structures and mechanisms that are not perceptible at the macroscopic scale, statistical mechanics studies the macroscopic properties of systems composed of a large number of particles, using the information provided by quantum mechanics on the interaction parameters. The progress made in the 20th century in the theory of condensed matter shows the power of the methods borrowed from statistical physics.

Statistical physics of systems at equilibrium is based on a single postulate, called the fundamental postulate, introduced for an isolated system at equilibrium. Using this postulate, one studies in chapter 2 the properties of isolated systems and recovers the results of thermodynamics. Moreover, as one sees in chapters 3 and 4, one can also study the properties of non-isolated systems in some particular situations, such as systems maintained at a constant temperature, and systems maintained at a constant temperature and a constant chemical potential, where the fundamental postulate can be used in each case to calculate the probability of a microscopic state, or microstate for short.

In this chapter one recalls basic mathematical tools and definitions which are used throughout this book.

1.2 Combinatory analysis

In the following, one recalls some useful definitions of combinatory analysis.

1.2.1 Number of permutations

Let N be the number of discernible objects. The number of permutations of these objects is given by

P = N!   (1.1)

To demonstrate this formula, let us consider an array of N boxes and N objects numbered from 1 to N:
- there are N ways to choose the first object to put into the first box,
- there are N − 1 ways to choose the second object to put into the second box, etc.

The total number of different configurations of the objects in the boxes is thus N(N − 1)(N − 2)···1 = N!.

Example: The number of permutations of 3 objects numbered from 1 to 3 is 3! = 6. They are (1 2 3), (1 3 2), (2 1 3), (2 3 1), (3 1 2), (3 2 1).

1.2.2 Number of arrangements

The number of arrangements of n objects taken among N objects is the number of ways to choose n objects among N, one by one, taking into account the order in which they are drawn. This number is given by

A_N^n = N!/(N − n)!   (1.2)

Example: Consider 4 objects numbered from 1 to 4. One chooses 2 objects among the 4. The possible configurations are (1,2), (1,3), (1,4), (2,1), (2,3), (2,4), (3,1), (3,2), (3,4), (4,1), (4,2) and (4,3), where the first and second numbers in the parentheses correspond respectively to the first and second draw. Since the drawing order is taken into account, (1,2) and (2,1) are considered to be different. There are thus 12 arrangements, as given by A_4^2 = 4!/(4 − 2)! = 12.
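As a quick check of these counting formulas, here is a minimal Python sketch (an illustration added here, not part of the original text) that enumerates permutations and arrangements with itertools and compares the counts with Eqs. (1.1) and (1.2):

```python
from itertools import permutations
from math import factorial

N, n = 4, 2

# Number of permutations of N objects: P = N!  (Eq. 1.1)
perms = list(permutations(range(1, N + 1)))
assert len(perms) == factorial(N)  # 4! = 24

# Number of arrangements of n objects among N: A = N!/(N-n)!  (Eq. 1.2)
# permutations() with a length argument enumerates the ordered draws.
arrangements = list(permutations(range(1, N + 1), n))
print(arrangements)       # (1,2), (1,3), ..., (4,3)
print(len(arrangements))  # 12 = 4!/2!
```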

1.2.3 Number of combinations

The number of combinations of n objects among N objects is the number of ways to choose n objects among N without taking into account the order in which they are drawn. This number is given by

C_N^n = N!/[(N − n)! n!]   (1.3)

One sees that C_N^n is equal to A_N^n divided by the number of permutations of the n chosen objects: C_N^n = A_N^n/n!. In the example given above, if (1,2) and (2,1) are counted only once, and the same for the other similar pairs, then one has 6 combinations, namely C_4^2 = 6.

1.3 Probability

Statistical mechanics is based on a probabilistic concept. As one sees in the following, the value of a physical quantity observed experimentally is interpreted as its most probable value; in terms of probabilities, this value corresponds to the maximum of the probability of finding this quantity. One sees below that the observed quantity is nothing but the mean value resulting from the statistical average over the microstates. Hereafter, one recalls the main properties of probability and presents some frequently used probability laws.

1.3.1 Definition

One distinguishes two cases: discrete events and continuous events.

1.3.1.1 Ensemble of discrete events

Discrete events can be distinguished from each other without error or ambiguity. Examples include rolling a die or flipping a coin, where the result of each trial is independent of the previous one. Let us perform N experiments, each of which gives a result. Let n_i be the number of times that an event of type i occurs among the N results. The probability of that event is defined by

P_i = lim_{N→∞} n_i/N   (1.4)

Example: One flips a coin N times and obtains n_1 heads and n_2 tails. If N is very large, one expects n_1 = n_2 = N/2. The ratios n_1/N and n_2/N are called the probabilities of obtaining heads and tails, respectively.

Remark: For all i, 0 ≤ P_i ≤ 1.

1.3.1.2 Ensemble of continuous events. Density of probability

Variables in continuous events are always given with an uncertainty (error). Example: the length of a table is measured with an uncertainty. Let ΔP(x) be the probability that the length of the table belongs to the interval [x, x + Δx], and let Δn(x) be the number of times among N measurements where x belongs to [x, x + Δx]. One has by definition

ΔP(x) = lim_{N→∞} Δn(x)/N   (1.5)

The density of probability is defined by

W(x) = lim_{Δx→0} ΔP(x)/Δx = dP(x)/dx   (1.6)

The probability of finding x in the interval [x, x + dx] is thus dP(x) = W(x)dx.

Remarks:
1) W(x) ≥ 0
2) For more than one variable, one writes dP(x, y, z) = W(x, y, z)dxdydz, or dP(r) = W(r)dr for a vector variable.

1.3.2 Fundamental properties

1.3.2.1 Normalization of probabilities

From their definition, one has, for the discrete and continuous cases, the following normalization relations:

Σ_i P_i = Σ_i n_i/N = 1   (1.7)

∫ W(x)dx = ∫ dP(x) = (1/N) ∫ dn(x) = 1   (1.8)
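To make the frequency definition (1.4) concrete, here is a small Python sketch (an added illustration with arbitrarily chosen N and seed, not from the original text) that estimates the probabilities of a die's faces from simulated rolls and checks the normalization (1.7):

```python
import random
from collections import Counter

random.seed(1)
N = 100_000

# Roll a fair die N times and count the occurrences n_i of each face i.
counts = Counter(random.randint(1, 6) for _ in range(N))

# Empirical probabilities P_i ~ n_i/N approach 1/6 as N grows (Eq. 1.4).
P = {face: n / N for face, n in sorted(counts.items())}
print(P)

# Normalization: the P_i sum to 1 (Eq. 1.7), up to floating-point rounding.
print(sum(P.values()))  # ~ 1.0
```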

1.3.2.2 Addition rule

Let e_1 and e_2 be two incompatible (mutually exclusive) discrete events of probabilities P(e_1) and P(e_2), respectively. The probability of finding e_1 or e_2 is given by

P(e_1 or e_2) = P(e_1) + P(e_2)   (1.9)

For continuous variables, the probability of finding x between a and b is

P(a ≤ x ≤ b) = ∫_a^b W(x)dx   (1.10)

If e_1 and e_2 are not incompatible, then one should remove the number of times where e_1 and e_2 take place simultaneously. One has

P(e_1 or e_2) = P(e_1) + P(e_2) − P(e_1 and e_2)   (1.11)

1.3.2.3 Multiplication rule

Let e_1 and e_2 be two independent events. The probability of finding e_1 and e_2 is given by

P(e_1 and e_2) = P(e_1)P(e_2)   (1.12)

Example: One rolls a die. The probability of obtaining face 1 or face 2 in one roll is 1/6 + 1/6 = 1/3. Rolling the die twice, the probability of obtaining face 1 on the first roll and face 2 on the second is (1/6)(1/6) = 1/36.
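Both rules can be verified by brute-force enumeration of the sample space. The following Python sketch (added here for illustration) checks the dice example above with exact fractions:

```python
from fractions import Fraction
from itertools import product

faces = range(1, 7)

# Addition rule for incompatible events: face 1 or face 2 in one roll (Eq. 1.9).
p_1_or_2 = Fraction(sum(1 for f in faces if f in (1, 2)), 6)
print(p_1_or_2)  # 1/3 = 1/6 + 1/6

# Multiplication rule for independent events: face 1 then face 2 (Eq. 1.12).
two_rolls = list(product(faces, repeat=2))  # the 36 equiprobable outcomes
p_1_then_2 = Fraction(sum(1 for r in two_rolls if r == (1, 2)), len(two_rolls))
print(p_1_then_2)  # 1/36 = (1/6)(1/6)
```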

1.3.3 Mean values

One defines the mean value of a quantity f by

⟨f⟩ = Σ_m P_m f_m   (1.13)

where f_m is the value of f in the microstate m of probability P_m. The sum is performed over all states. If the states (or events) are characterized by continuous variables, one has

⟨f⟩ = ∫ W(x)f(x)dx   (1.14)

where f(x) is the value of f in the state x.

If a value A_i occurs many times with different events m (or different states), one can write

⟨A⟩ = Σ_{A_i} P(A_i)A_i   (1.15)

where P(A_i) is the probability of finding A_i, namely

P(A_i) = Σ_{m, A_m = A_i} P_m   (1.16)

where the sum is made only over the events m having A_m = A_i. In the case of a continuous variable, one has

⟨A⟩ = ∫ W(A)A dA   (1.17)

where W(A) is the density of probability of finding A. The most probable value of A_i (or A) corresponds to the maximum of P(A_i) (or W(A)).

The variance is defined by

(Δf)² = ⟨(f − ⟨f⟩)²⟩   (1.18)

One can also write

(Δf)² = Σ_m P_m (f_m − ⟨f⟩)²
      = Σ_m P_m [f_m² + ⟨f⟩² − 2 f_m ⟨f⟩]
      = ⟨f²⟩ + ⟨f⟩² − 2⟨f⟩⟨f⟩ = ⟨f²⟩ − ⟨f⟩²   (1.19)

One calls Δf = √((Δf)²) the standard deviation. This quantity expresses the dispersion of the statistical distribution. In the same manner, one has for the continuous case

(ΔA)² = ∫ W(A)[A − ⟨A⟩]² dA = ⟨A²⟩ − ⟨A⟩²   (1.20)
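As an illustration (not from the original text; the distribution values are arbitrary), the following Python sketch computes the mean, variance, and standard deviation of a discrete distribution directly from Eqs. (1.13) and (1.19):

```python
import math

# A discrete distribution: values f_m with probabilities P_m (normalized).
f = [0, 1, 2, 3]
P = [0.1, 0.2, 0.4, 0.3]
assert abs(sum(P) - 1.0) < 1e-12  # normalization (Eq. 1.7)

mean_f = sum(p * x for p, x in zip(P, f))      # <f>  (Eq. 1.13)
mean_f2 = sum(p * x**2 for p, x in zip(P, f))  # <f^2>
variance = mean_f2 - mean_f**2                 # (Δf)^2 = <f^2> - <f>^2 (Eq. 1.19)
sigma = math.sqrt(variance)                    # standard deviation Δf

print(mean_f, variance, sigma)  # 1.9, 0.89, 0.943...
```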

1.4 Statistical distributions

1.4.1 Binomial distribution

When there are only two possible outcomes in an experiment, the distribution of the results in a series of experiments follows the binomial law. Example: flipping a coin, with its two possible results, heads or tails. Let P_A and P_B be the probabilities of the two types of outcome. The normalization of probability imposes P_A + P_B = 1. If after N experiments one obtains n times A and (N − n) times B, then the probability of finding n times A and (N − n) times B is given by the multiplication rule

P(N, n) = C_N^n P_A^n P_B^{N−n}   (1.21)

where the factor C_N^n = N!/[n!(N − n)!] is introduced to take into account the number of combinations in which n times A and (N − n) times B occur. One shows that P(N, n) is normalized:

Σ_{n=0}^{N} P(N, n) = Σ_{n=0}^{N} C_N^n P_A^n P_B^{N−n} = (P_A + P_B)^N = 1   (1.22)

where one has used Newton's binomial formula. One shows in the following some main results:

- the mean value of n is ⟨n⟩ = NP_A:

⟨n⟩ = Σ_{n=0}^{N} n P(N, n) = Σ_{n=0}^{N} n C_N^n P_A^n P_B^{N−n}
    = P_A ∂/∂P_A [Σ_{n=0}^{N} C_N^n P_A^n P_B^{N−n}] = P_A ∂/∂P_A (P_A + P_B)^N
    = P_A N(P_A + P_B)^{N−1} = NP_A   (1.23)

where P_A and P_B are treated as independent variables during the differentiation, and one has used in the last line P_A + P_B = 1.

- the variance: The variance is defined by (Δn)² = ⟨n²⟩ − ⟨n⟩². One calculates ⟨n²⟩ as follows:

⟨n²⟩ = Σ_{n=0}^{N} n² P(N, n) = Σ_{n=0}^{N} n² C_N^n P_A^n P_B^{N−n}
     = (P_A ∂/∂P_A)² [Σ_{n=0}^{N} C_N^n P_A^n P_B^{N−n}] = (P_A ∂/∂P_A)² (P_A + P_B)^N
     = P_A ∂/∂P_A [P_A N(P_A + P_B)^{N−1}]
     = NP_A + N(N − 1)P_A²   (1.24)

where one has used in the last equality P_A + P_B = 1. One finds

(Δn)² = ⟨n²⟩ − ⟨n⟩² = NP_A + N(N − 1)P_A² − (NP_A)² = NP_A(1 − P_A) = NP_A P_B   (1.25)

from which the standard deviation is Δn = √(NP_A P_B).

- the relative uncertainty or relative error: The relative error on n is given by

Δn/⟨n⟩ = √(NP_A P_B)/(NP_A) ∝ √N/N = 1/√N   (1.26)

The relative error (or uncertainty) thus decreases with increasing N. For N = 1000 (the standard sample size for polls), Δn/⟨n⟩ ≈ 3%.
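A quick numerical check of Eqs. (1.23), (1.25) and (1.26), as a Python sketch added here for illustration (the seed and number of trials are arbitrary):

```python
import random
from statistics import mean, pstdev

random.seed(0)
N, P_A = 1000, 0.5
trials = 5_000

# Each trial: count n, the number of successes in N Bernoulli experiments.
samples = [sum(random.random() < P_A for _ in range(N)) for _ in range(trials)]

print(mean(samples))                    # ~ N*P_A = 500            (Eq. 1.23)
print(pstdev(samples))                  # ~ sqrt(N*P_A*P_B) ~ 15.8 (Eq. 1.25)
print(pstdev(samples) / mean(samples))  # ~ 1/sqrt(N) ~ 3%         (Eq. 1.26)
```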

1.4.2 Gaussian distribution

The Gaussian distribution applies in the case of continuous events. The density of probability W(x) of the Gaussian distribution, or Gaussian law, is given by

W(x) = A exp[−(x − x_0)²/(2σ²)]   (1.27)

where A is a constant determined by the normalization of W(x), x_0 is the central value of the distribution, and σ characterizes the width of W(x) (the full width at half-maximum is 2σ√(2 ln 2) ≈ 2.35σ, see Fig. 1.1).

Fig. 1.1 Gaussian density of probability W(x), shown with x_0 = 1, σ = 0.4.

To calculate A, one writes

1 = ∫ W(x)dx = A ∫ exp[−(x − x_0)²/(2σ²)]dx = A ∫ exp[−u²/(2σ²)]du = A√(2πσ²)   (1.28)

where u = x − x_0 and where one has used in the last equality the Gauss integral (see Appendix A):

∫ exp[−au²]du = √(π/a)

One then has A = 1/(σ√(2π)). One shows some important results in the following:

- the mean value of x: with (1.27), one has

⟨x⟩ = ∫ xW(x)dx = A ∫ x exp[−(x − x_0)²/(2σ²)]dx
    = A ∫ (x − x_0) exp[−(x − x_0)²/(2σ²)]dx + A x_0 ∫ exp[−(x − x_0)²/(2σ²)]dx
    = 0 + x_0 A ∫ exp[−(x − x_0)²/(2σ²)]dx   (1.29)
    = x_0   (1.30)

where the first integral of the second line is zero (integral of an odd function between symmetric bounds) and where one has used, to go from (1.29) to (1.30), the normalization of the probability density W(x).

- the variance: Using ⟨x⟩ = x_0, the variance (Δx)² = ⟨(x − ⟨x⟩)²⟩ is

(Δx)² = A ∫ (x − x_0)² exp[−(x − x_0)²/(2σ²)]dx = A ∫ y² exp[−y²/(2σ²)]dy = A (1/2)√(π(2σ²)³)

where one has used in the last equality a formula of Appendix A. Replacing A by 1/(σ√(2π)), one obtains

(Δx)² = σ²   (1.31)

from which the standard deviation is Δx = σ.

One can also show (see Problem 1) that the binomial law becomes the Gaussian law when N >> n >> 1. This equivalence is known as the central limit theorem.

1.4.3 Poisson law

When the probability of an event is very small compared to 1, namely for a rare event, the probability of finding n events of the same kind is given by the Poisson law:

P(n) = μⁿ exp(−μ)/n!   (1.32)

where μ is a constant which is the mean value of n (see Problem 2). One shows that P(n) is normalized:

Σ_{n=0}^{∞} P(n) = Σ_{n=0}^{∞} μⁿ exp(−μ)/n! = exp(−μ) Σ_{n=0}^{∞} μⁿ/n! = exp(−μ) exp(μ) = 1   (1.33)

One can also show that the binomial law becomes the Poisson law when P_A << P_B and N >> n >> 1 (see Problem 2).
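The two limits just mentioned are easy to observe numerically. The following Python sketch (an added illustration; the parameter values are arbitrary) compares the binomial law (1.21) with its Gaussian approximation around the mean, and with the Poisson law (1.32) for rare events:

```python
from math import comb, exp, factorial, pi, sqrt

def binomial(N, n, pA):
    return comb(N, n) * pA**n * (1 - pA)**(N - n)

def gauss(x, x0, sigma):
    return exp(-(x - x0)**2 / (2 * sigma**2)) / (sigma * sqrt(2 * pi))

def poisson(n, mu):
    return mu**n * exp(-mu) / factorial(n)

# Gaussian limit: N large, n near the mean N*pA (central limit theorem).
N, pA = 1000, 0.5
x0, sigma = N * pA, sqrt(N * pA * (1 - pA))
print(binomial(N, 510, pA), gauss(510, x0, sigma))  # ~0.0208 vs ~0.0207

# Poisson limit: pA << 1 with mu = N*pA fixed (rare events).
N, pA = 10_000, 0.0005
print(binomial(N, 3, pA), poisson(3, N * pA))       # both ~0.140
```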

1.5 Microstates and macrostates

1.5.1 Microstates: enumeration

A microstate of a system is defined in general by the individual states of the particles which constitute the system. These individual states are characterized by the physical parameters which define the system. Let us mention some examples. The microstates of an atom are given by the states of the electrons of the atom: each microstate is defined by four quantum numbers (n, l, m_l, m_s). The microstates of an isolated system of energy E are given by the different distributions of E over the particles of the system. The microstates, at a given time, of a system of N classical particles of mass m are defined by the positions and the velocities of the particles at that time. The microstates of a system of independent quantum particles are given by the wave vectors of the particles.

It is obvious that for a given system there is a large number of microstates which satisfy the physical constraints imposed on the system (volume, energy, temperature, ...). These microstates are called realizable microstates. It is very important to know how to calculate the number of these states. Systems of classical and quantum independent particles are treated in detail in chapter 2. In the following, one gives some examples and formulas for enumerating the microstates of simple systems.

In general, one distinguishes the cases of indiscernible and discernible particles. Discernible particles are those one can identify individually, such as particles with different colors or particles each bearing a number. Indiscernible particles are identical particles impossible to distinguish from one another, such as the atoms of a mono-atomic gas or the conduction electrons in a metal.

Example: Consider an isolated system of total energy equal to 3 units. This system has 3 particles, supposed to be discernible and bearing the letters A, B and C. Each of the particles can occupy energy levels at 0, 1, 2, or 3 units. With the constraint that the sum of the energies of the particles is equal to 3, one has the 3 following categories of microstates (see Fig. 1.2):
- level 0: B, C; level 3: A (Fig. 1.2, left): permutations of A with B and of A with C give 3 microstates
- level 0: C; level 1: B; level 2: A (Fig. 1.2, middle): permutations of A, B and C give 6 microstates
- level 1: A, B, C (Fig. 1.2, right): 1 microstate.

Remark: Permutations between particles on the same level do not give rise to new microstates.

Fig. 1.2 Three categories of microstates (left, middle, right) according to the occupation numbers of the energy levels.

In the example given above, one can use the following formula to calculate the number of microstates of each category in Fig. 1.2:

ω = N!/(n_0! n_1! n_2! ...)   (1.34)

where n_i (i = 0, 1, 2, ...) is the number of particles occupying the i-th level and N the total number of particles. One can verify that this formula gives the number of microstates enumerated above in each category. Furthermore, one sees that the total number of microstates (all categories) is 10. This number can be calculated in the general case, with arbitrary N and E, by

Ω(E) = (N + E − 1)!/[(N − 1)! E!]   (1.35)

For E = 3 and N = 3, one recovers Ω = (3 + 3 − 1)!/[(3 − 1)! 3!] = 10. The demonstrations of Eqs. (1.34) and (1.35) are done in Problem 3.

Remark: In the above example, if the particles are indiscernible, the number of microstates is reduced to 3, because permutations of particles between the different levels do not yield new microstates.

It should be emphasized that the knowledge of the number of microstates allows one to calculate the principal physical properties of isolated systems, as one will see in chapter 2. Microstates having the same energy are called degenerate states. The number of degenerate states at a given energy is called the degeneracy. In the above example, the degeneracy is 10 for E = 3.
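These counts are small enough to verify by direct enumeration. Here is a Python sketch (added for illustration, not from the original text) that lists all microstates of 3 discernible particles sharing 3 energy units and checks Eq. (1.35):

```python
from itertools import product
from math import factorial

N, E = 3, 3  # three discernible particles, three energy units

# A microstate assigns an energy 0..E to each particle; keep those summing to E.
microstates = [m for m in product(range(E + 1), repeat=N) if sum(m) == E]
print(len(microstates))  # 10

# General formula: Omega(E) = (N + E - 1)!/((N - 1)! E!)  (Eq. 1.35)
omega = factorial(N + E - 1) // (factorial(N - 1) * factorial(E))
print(omega)  # 10
```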

Each microstate l has a probability P_l. If one knows P_l, one can calculate the mean values of physical quantities, as will be seen in the following chapters. That is why the main objective of statistical mechanics is to find a way to determine P_l according to the conditions imposed on the system.

Remark: Quantum particles, by their nature, are indiscernible. One distinguishes two kinds of particles: bosons and fermions. Bosons are particles having integer spins (0, 1, 2, ...) and fermions are those having half-integer spins (1/2, 3/2, 5/2, ...). The symmetry postulate of quantum mechanics states that the wave function of a system of bosons is invariant with respect to the permutation of two particles, while the wave function of a system of fermions changes its sign at each permutation. A consequence of this postulate is that an individual quantum state can be occupied by any number of bosons but by at most one fermion. This obviously affects the enumeration of microstates, as one will see in chapters 5 and 6.

1.5.2 Macroscopic states

A macroscopic state, or macrostate for short, is an ensemble of microstates having a common macroscopic property. An observed macroscopic property is considered as the mean value over a statistical mixture of the microstates of the ensemble. Its definition therefore depends on which macroscopic property one wants to observe: there are thus many ways to define a macrostate. In the example shown in Fig. 1.2, one can define a macrostate by the occupation numbers of the energy levels: there are then 3 macrostates, corresponding to the following occupation numbers of the first 4 levels: (2,0,0,1), (1,1,1,0), (0,3,0,0) (Fig. 1.2, from left to right). One can also define a macrostate as the one having an energy E = 3; in this case, the macrostate contains the ensemble of all 10 microstates, as the enumeration sketched below illustrates.
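Continuing the enumeration sketch above (again a hypothetical illustration), one can group the 10 microstates into macrostates by their occupation numbers; each group size is the ω of Eq. (1.34):

```python
from collections import Counter
from itertools import product

N, E = 3, 3
microstates = [m for m in product(range(E + 1), repeat=N) if sum(m) == E]

def occupations(m):
    """Occupation numbers (n_0, n_1, n_2, n_3) of the four energy levels."""
    return tuple(sum(1 for e in m if e == level) for level in range(E + 1))

# Group the 10 microstates by macrostate (common occupation numbers).
macrostates = Counter(occupations(m) for m in microstates)
print(macrostates)
# Counter({(1, 1, 1, 0): 6, (2, 0, 0, 1): 3, (0, 3, 0, 0): 1}) -- 6 + 3 + 1 = 10
```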

1.5.3 Statistical averages: ergodic hypothesis

When a system is at equilibrium, the mean values of its macroscopic variables no longer vary with time. However, the system often fluctuates between different microstates, giving rise to small variations around the mean values of the macroscopic variables. One defines the time average of a variable f by

⟨f⟩_t = lim_{τ→∞} (1/τ) ∫_{t_0}^{t_0+τ} f(t)dt   (1.36)

where τ is the averaging time. On the other hand, as seen above, the mean value of a quantity can be calculated over the ensemble of microstates. Let P_l be the probability of the microstate l in which the variable f is equal to f_l. One defines the so-called ensemble average of f by

⟨f⟩ = Σ_l P_l f_l   (1.37)

The ergodic hypothesis of statistical mechanics at equilibrium, introduced by Boltzmann in 1871, states that the time average (1.36) and the ensemble average (1.37) are equal. The validity of this principle has been experimentally verified for systems at equilibrium. Of course, for systems with extremely long relaxation times, such as glasses, this equivalence is not easy to prove.

1.6 Statistical entropy

In statistical mechanics, the fundamental quantity which allows one to calculate the principal properties of a system is the statistical entropy, introduced by Boltzmann near the end of the 19th century. It is defined by

S = −k_B Σ_l P_l ln P_l   (1.38)

where P_l is the probability of the microstate l of the system and k_B the Boltzmann constant. One will see in the following chapters that S coincides with the thermodynamic entropy, defined in a macroscopic manner by the second principle of classical thermodynamics. The advantage of the definition (1.38) is that the physical consequences associated with the entropy are clear, as seen below:

- As 0 ≤ P_l ≤ 1, one has S ≥ 0.
- When one of the events is certain, namely when one probability is equal to 1 (the other probabilities being zero by normalization), one sees that S is zero (minimum).
- When all probabilities are equal, namely when the uncertainty on the outcome is maximum, one can show that S is maximum (see Problem 8).
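A small Python sketch (an added illustration) of the definition (1.38), showing that a sure outcome gives S = 0 while the uniform distribution maximizes S (entropy is computed in units of k_B, i.e., with k_B = 1):

```python
from math import log

def entropy(P):
    """Statistical entropy S = -sum_l P_l ln(P_l) in units of k_B (Eq. 1.38)."""
    return -sum(p * log(p) for p in P if p > 0)  # terms with P_l = 0 contribute 0

print(entropy([1.0, 0.0, 0.0, 0.0]))      # 0.0    : one sure event, minimum S
print(entropy([0.7, 0.1, 0.1, 0.1]))      # ~0.94
print(entropy([0.25, 0.25, 0.25, 0.25]))  # ~1.386 = ln(4), the maximum
```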

In other words, S represents the uncertainty or the lack of information on the system. With the above definition, one understands easily why, in thermodynamics, the entropy S is said to express the disorder of the system.

Statistical entropy is additive: consider two independent ensembles of events {e_m; m = 1, ..., M} and {e′_m′; m′ = 1, ..., M′} of respective probabilities {P_m; m = 1, ..., M} and {P′_m′; m′ = 1, ..., M′}. The probability of finding e_m and e′_m′ is then P(m, m′) = P_m P′_m′. The entropy of these double events is

S(e, e′) = −k_B Σ_{m,m′} P_m P′_m′ ln(P_m P′_m′) = −k_B Σ_{m,m′} P_m P′_m′ [ln P_m + ln P′_m′]
         = −k_B Σ_m P_m ln P_m Σ_{m′} P′_m′ − k_B Σ_{m′} P′_m′ ln P′_m′ Σ_m P_m
         = S(e) + S(e′)   (1.39)

using Σ_m P_m = 1 and Σ_{m′} P′_m′ = 1. The total entropy is thus the sum of the partial entropies.

1.7 Conclusion

Since statistical mechanics is based on a probabilistic concept, the enumeration of microstates and the use of probabilities are very important. In this chapter, basic definitions and fundamental concepts on these points have been presented. In particular, the most frequently used probability laws, namely the binomial, Gaussian and Poisson distributions, have been presented. Microstates and macrostates were defined using examples for better illustration. Finally, the statistical entropy, which is the most important quantity in statistical mechanics, was defined and discussed in connection with the thermodynamic entropy. The above definitions and tools will be used throughout this book.

1.8 Problems

Problem 1. Central limit theorem:
Show that the binomial law becomes the Gaussian law when N >> n >> 1.

Problem 2. Poisson law (1.32):
a) Calculate ⟨n⟩.
b) Show that the binomial law becomes the Poisson law when P_A << P_B and N >> n >> 1.

Problem 3. Demonstrate the formulas (1.34) and (1.35).

Problem 4. Application of the binomial law:
Calculate the probability of finding the number of heads between 3 and 6 (boundaries included) when one flips a coin 10 times.

Problem 5. Random walk in one dimension:
Consider a particle moving on the x axis with a constant step length l, to the right or to the left with the same probability.
a) Calculate the probability for the particle to make n steps to the right and n′ = N − n steps to the left after N steps.
b) Calculate the averages ⟨n⟩ and ⟨n′⟩. Calculate the variance and the relative uncertainty on n.
c) Calculate the probability of finding the particle at the position x = ml from the origin. Calculate ⟨x⟩ and (Δx)².
d) Suppose that the step length x_i is not constant. What is the density of probability of finding the particle at X after N steps, with N >> 1?

Problem 6. Random walk in three dimensions:
Consider a particle moving in three dimensions with variable discrete step lengths l and a density of probability independent of the direction of the step. Each step corresponds to a change of direction of the particle's motion, due to a collision with another particle. This is a simple model of Brownian motion.
a) Calculate the Cartesian components (X, Y, Z) of the particle's position after N steps. Calculate the averages (⟨X⟩, ⟨Y⟩, ⟨Z⟩) and the corresponding variances (ΔX)², (ΔY)² and (ΔZ)². Deduce the average ⟨l²⟩.
b) What is the density of probability of finding the particle at (X, Y, Z) after N steps?
c) Putting (ΔX)² = 2Dt, where D is the diffusion coefficient and t the duration of the particle's motion, demonstrate that D = (1/6)(⟨l²⟩/τ), where τ is the average time between two collisions.

Problem 7. Exchange of energy:
Consider two isolated systems. The first one has N_1 = 2 particles and an energy equal to E_1 = 3 units. The second one has N_2 = 4 particles and E_2 = 5 units. Each particle can have 0, 1, 2, ... energy units.
a) Calculate the number of microstates of the total system composed of systems 1 and 2 separated by an insulating wall.
b) Remove the wall, keeping the total system isolated. Calculate the number of microstates of the total system in this new situation. Comment.
c) In the situation of the previous question, calculate the number of microstates in which there are at least 2 particles having 2 energy units each. Deduce the probability of this macrostate.

Problem 8. Statistical entropy:
Show that the statistical entropy S = −k_B Σ_l P_l ln P_l is maximum when all the probabilities P_l are equal.