Basic Concepts and Tools in Statistical Physics
Statistical Physics: Fundamentals and Application to Condensed Matter
Chapter 1

Basic Concepts and Tools in Statistical Physics

1.1 Introduction

Statistical mechanics provides general methods to study the properties of systems composed of a large number of particles. It establishes general formulas connecting physical parameters to various physical quantities, from which one can deduce the properties of a system if one knows the system's parameters. In general, the microscopic mechanisms leading to the interaction parameters are provided by quantum mechanics. Statistical mechanics and quantum mechanics are at the heart of modern physics, which has allowed a microscopic understanding of the properties of matter and spectacular technological innovations that have radically changed our daily life over the past fifty years.

While quantum mechanics searches for microscopic structures and mechanisms that are not perceptible at the macroscopic scale, statistical mechanics studies the macroscopic properties of systems composed of a large number of particles, using the information provided by quantum mechanics on the interaction parameters. The progress made in the 20th century in the theory of condensed matter shows the power of methods borrowed from statistical physics.

Statistical physics of systems at equilibrium is based on a single postulate, called the fundamental postulate, introduced in the case of an isolated system at equilibrium. Using this postulate, one studies in chapter 2 the properties of isolated systems and recovers the results of thermodynamics. Moreover, as one sees in chapters 3 and 4, one can also study the properties of non-isolated systems in some particular situations, such as systems maintained at a constant temperature, or at a constant temperature and a constant chemical potential, where the fundamental postulate
can be used in each case to calculate the probability of a microscopic state, or microstate for short.

In this chapter one recalls basic mathematical tools and definitions which are used throughout this book.

1.2 Combinatorial analysis

In the following, one recalls some useful definitions of combinatorial analysis.

Number of permutations

Let N be the number of discernible objects. The number of permutations of these N objects is given by

P = N!   (1.1)

To demonstrate this formula, let us consider an array of N boxes and N objects numbered from 1 to N:
- there are N ways to choose the first object to put into the first box
- there are N − 1 ways to choose the second object to put into the second box, etc.
The total number of different configurations of the objects in the boxes is thus N(N − 1)(N − 2)...1 = N!.

Example: The number of permutations of 3 objects numbered from 1 to 3 is 3! = 6. They are 1 2 3, 1 3 2, 2 1 3, 2 3 1, 3 1 2, 3 2 1.

Number of arrangements

The number of arrangements of n objects taken among N objects is the number of ways to choose n objects among N, one by one, taking into account the order of the choice. This number is given by

A_N^n = N!/(N − n)!   (1.2)

Example: Consider 4 objects numbered from 1 to 4. One chooses 2 objects among the 4. The possible configurations are (1,2), (1,3), (1,4), (2,1), (2,3), (2,4), (3,1), (3,2), (3,4), (4,1), (4,2) and (4,3), where the first and second numbers in the parentheses correspond respectively to the first and the second choice. If one takes into account the order of the choice, then (1,2) and
(2,1) are considered to be different. There are thus 12 arrangements, as given by A_4^2 = 4!/(4 − 2)! = 12.

Number of combinations

The number of combinations of n objects among N objects is the number of ways to choose n objects among N, without taking into account the order of the choice. This number is given by

C_N^n = N!/[(N − n)! n!]   (1.3)

One sees that C_N^n is equal to A_N^n divided by the number of permutations of the n chosen objects: C_N^n = A_N^n/n!. In the example given above, if (1,2) and (2,1) are counted only once, and the same for the other similar pairs, then one has 6 combinations, namely C_4^2 = 12/2! = 6.

1.3 Probability

Statistical mechanics is based on a probabilistic concept. As one sees in the following, the value of a physical quantity experimentally observed is interpreted as its most probable value. In terms of probabilities, this value corresponds to the maximum of the probability of finding this quantity. One sees below that the observed quantity is nothing but the mean value resulting from the statistical average over the microstates. Hereafter, one recalls the main properties of probabilities and presents some frequently used probability laws.

Definition

One distinguishes two cases: discrete events and continuous events.

Ensemble of discrete events

Discrete events can be distinguished from each other without error or ambiguity. Examples include rolling a die or flipping a coin, in which the result of each event is independent of the previous one. Let us realize N experiments. Each experiment gives a result. Let n_i be the number of times that an event of the type i occurs among the N results. The probability of that event is defined by
P_i = lim_{N→∞} n_i/N   (1.4)

Example: One flips a coin N times and obtains n_1 heads and n_2 tails. If N is very large, one expects n_1 = n_2 = N/2. The ratios n_1/N and n_2/N are called the probabilities to obtain heads and tails, respectively.

Remark: For all i, 0 ≤ P_i ≤ 1.

Ensemble of continuous events. Density of probability

Variables in continuous events are always given with an uncertainty (error). Example: the length of a table is measured with an uncertainty. Let P(x) be the probability that the length of the table belongs to the interval [x, x + Δx], and let n(x) be the number of times, among N measurements, where the result belongs to [x, x + Δx]. One has by definition

P(x) = lim_{N→∞} n(x)/N   (1.5)

The density of probability is defined by

W(x) = lim_{Δx→0} P(x)/Δx = dP(x)/dx   (1.6)

The probability of finding x in the interval [x, x + dx] is thus dP(x) = W(x)dx.

Remarks:
1) W(x) ≥ 0
2) For more than one variable one writes dP(x, y, z) = W(x, y, z)dxdydz, or dP(r) = W(r)dr.

Fundamental properties

Normalization of probabilities

From their definition, one has for the discrete and continuous cases the following normalization relations:

Σ_i P_i = Σ_i n_i/N = 1   (1.7)

∫ W(x)dx = ∫ dP(x) = (1/N) ∫ dn(x) = 1   (1.8)
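The frequency definition (1.4) and the normalization (1.7) can be illustrated with a short simulation. This is an illustrative sketch added by the editor, not part of the original text; the seed and sample size are arbitrary choices:

```python
import random
from collections import Counter

random.seed(42)                     # arbitrary seed, for reproducibility
N = 100_000                         # number of experiments
results = Counter(random.choice("HT") for _ in range(N))

# Empirical probabilities P_i = n_i / N, Eq. (1.4)
P = {face: n / N for face, n in results.items()}

# Normalization, Eq. (1.7): the P_i sum to 1 by construction
assert abs(sum(P.values()) - 1) < 1e-12

# For large N one expects n_1 ≈ n_2 ≈ N/2
assert abs(P["H"] - 0.5) < 0.02 and abs(P["T"] - 0.5) < 0.02
```

The tolerance 0.02 is generous: the standard deviation of the empirical frequency is 1/(2√N) ≈ 0.0016 here, as shown in the binomial section below.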
Addition rule

Let e_1 and e_2 be two incompatible discrete events of probabilities P(e_1) and P(e_2), respectively. The probability of finding e_1 or e_2 is given by

P(e_1 or e_2) = P(e_1) + P(e_2)   (1.9)

For continuous variables, the probability to find x between a and b is

P(a ≤ x ≤ b) = ∫_a^b W(x)dx   (1.10)

If e_1 and e_2 are not incompatible, then in N experiments one should remove the number of times where e_1 and e_2 take place simultaneously. One has

P(e_1 or e_2) = P(e_1) + P(e_2) − P(e_1 and e_2)   (1.11)

Multiplication rule

Let e_1 and e_2 be two independent events. The probability to find e_1 and e_2 is given by

P(e_1 and e_2) = P(e_1)P(e_2)   (1.12)

Example: One rolls a die twice. The probability to find face 1 or face 2 in one roll is 1/6 + 1/6 = 1/3, and the probability to find face 1 in the first roll and face 2 in the second is (1/6)(1/6) = 1/36.

Mean values

One defines the mean value of a quantity f by

⟨f⟩ = Σ_m P_m f_m   (1.13)

where f_m is the value of f in the microstate m of probability P_m. The sum is performed over all states. If the states (or events) are characterized by continuous variables, one has

⟨f⟩ = ∫ W(x)f(x)dx   (1.14)

where f(x) is the value of f in the state x.
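The addition rule (1.9), the multiplication rule (1.12) and the mean value (1.13) can all be checked by enumerating the 36 equally probable outcomes of two die rolls. A quick numerical sketch, not part of the original text:

```python
from itertools import product

outcomes = list(product(range(1, 7), repeat=2))   # all 36 equally likely pairs
p = 1 / len(outcomes)                             # probability of each pair

# Multiplication rule, Eq. (1.12): the two rolls are independent
p_1_then_2 = sum(p for a, b in outcomes if a == 1 and b == 2)
assert abs(p_1_then_2 - (1/6) * (1/6)) < 1e-12

# Addition rule, Eq. (1.9): faces 1 and 2 are incompatible on a single roll
p_1_or_2 = sum(1/6 for face in (1, 2))
assert abs(p_1_or_2 - 1/3) < 1e-12

# Mean value, Eq. (1.13), with f = sum of the two faces
f_mean = sum(p * (a + b) for a, b in outcomes)
assert abs(f_mean - 7.0) < 1e-12
```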
If a value A_i occurs many times with different events m (or different states), one can write

⟨A⟩ = Σ_i P(A_i) A_i   (1.15)

where P(A_i) is the probability to find A_i, namely

P(A_i) = Σ_{m : A_m = A_i} P_m   (1.16)

where the sum is made only over the events m having A_m = A_i. In the case of a continuous variable, one has

⟨A⟩ = ∫ W(A) A dA   (1.17)

where W(A) is the density of probability to find A. The most probable value of A_i (or A) corresponds to the maximum of P(A_i) (or W(A)).

The variance is defined by

(Δf)² = ⟨(f − ⟨f⟩)²⟩   (1.18)

One can also write

(Δf)² = Σ_m P_m (f_m − ⟨f⟩)²
      = Σ_m P_m [f_m² + ⟨f⟩² − 2 f_m ⟨f⟩]
      = ⟨f²⟩ + ⟨f⟩² − 2⟨f⟩⟨f⟩ = ⟨f²⟩ − ⟨f⟩²   (1.19)

One calls Δf = √((Δf)²) the standard deviation. This quantity expresses the dispersion of the statistical distribution. In the same manner, one has for the continuous case

(ΔA)² = ∫ W(A) [A − ⟨A⟩]² dA = ⟨A²⟩ − ⟨A⟩²   (1.20)
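Equation (1.19) states that the variance can be computed either directly from its definition or as ⟨f²⟩ − ⟨f⟩²; the two expressions agree for any discrete distribution. A minimal numerical check (the probabilities and values below are arbitrary, chosen only for illustration):

```python
probs = [0.2, 0.5, 0.3]     # P_m for three hypothetical microstates
f_vals = [1.0, 2.0, 4.0]    # f_m in each microstate
assert abs(sum(probs) - 1) < 1e-12   # normalization, Eq. (1.7)

f_mean = sum(p * f for p, f in zip(probs, f_vals))                     # Eq. (1.13)
f2_mean = sum(p * f**2 for p, f in zip(probs, f_vals))
var_direct = sum(p * (f - f_mean)**2 for p, f in zip(probs, f_vals))   # Eq. (1.18)

# Eq. (1.19): <(f - <f>)^2> = <f^2> - <f>^2
assert abs(var_direct - (f2_mean - f_mean**2)) < 1e-12
```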
Statistical distributions

Binomial distribution

When there are only two possible outcomes in an experiment, the distribution of the results in a series of experiments follows the binomial law. Example: flipping a coin, with two possible results, head or tail. Let P_A and P_B be the probabilities of the two types of outcome. The normalization of probabilities imposes P_A + P_B = 1. If after N experiments one obtains n times A and N − n times B, then the probability to find n times A and N − n times B is given by the multiplication rule:

P(N, n) = C_N^n P_A^n P_B^{N−n}   (1.21)

where the factor C_N^n = N!/[n!(N − n)!] is introduced to take into account the number of combinations in which n times A and N − n times B occur. One shows that P(N, n) is normalized:

Σ_{n=0}^{N} P(N, n) = Σ_{n=0}^{N} C_N^n P_A^n P_B^{N−n} = (P_A + P_B)^N = 1   (1.22)

where one has used Newton's binomial formula to go from the first to the second equality. One shows in the following some main results:

- The mean value of n is ⟨n⟩ = N P_A:

⟨n⟩ = Σ_{n=0}^{N} n P(N, n)
    = P_A ∂/∂P_A [Σ_{n=0}^{N} C_N^n P_A^n P_B^{N−n}]
    = P_A ∂/∂P_A (P_A + P_B)^N
    = P_A N (P_A + P_B)^{N−1} = N P_A   (1.23)

where one has used in the last line P_A + P_B = 1.
- The variance: the variance is defined by (Δn)² = ⟨n²⟩ − ⟨n⟩². One calculates ⟨n²⟩ as follows:

⟨n²⟩ = Σ_{n=0}^{N} n² P(N, n)
     = (P_A ∂/∂P_A)² [Σ_{n=0}^{N} C_N^n P_A^n P_B^{N−n}]
     = (P_A ∂/∂P_A)² (P_A + P_B)^N
     = P_A ∂/∂P_A [P_A N (P_A + P_B)^{N−1}]
     = N P_A + N(N − 1) P_A²   (1.24)

where one has used in the last equality P_A + P_B = 1. One finds

(Δn)² = ⟨n²⟩ − ⟨n⟩² = N P_A + N(N − 1) P_A² − (N P_A)² = N P_A (1 − P_A) = N P_A P_B   (1.25)

from which the standard deviation is Δn = √(N P_A P_B).

- The relative uncertainty (or relative error) on n is given by

Δn/⟨n⟩ = √(N P_A P_B)/(N P_A) = √(P_B/P_A) · 1/√N   (1.26)

The relative error (or uncertainty) thus decreases with increasing N. For N = 1000 (a standard sample size for polls), Δn/⟨n⟩ ≈ 3%.

Gaussian distribution

The Gaussian distribution applies in the case of continuous events. The density of probability W(x) of the Gaussian distribution, or Gaussian law, is given by

W(x) = A exp[−(x − x₀)²/(2σ²)]   (1.27)

where A is a constant determined by the normalization of W(x), x₀ is the central value of the distribution, and σ characterizes the width of W(x) (the full width at half-maximum is 2σ√(2 ln 2) ≈ 2.35σ; see Fig. 1.1).
Fig. 1.1 Gaussian density of probability W(x), shown with x₀ = 1, σ = 0.4.

To calculate A, one writes

1 = ∫ W(x)dx = A ∫ exp[−u²/(2σ²)]du = A √(2πσ²)   (1.28)

where u = x − x₀ and where one has used in the last equality the Gauss integral (see Appendix A):

∫ exp(−au²)du = √(π/a)

One then has A = 1/(σ√(2π)). One shows some important results in the following:

- The mean value of x: with (1.27), one has

⟨x⟩ = ∫ x W(x)dx = A ∫ x exp[−(x − x₀)²/(2σ²)]dx
    = A ∫ (x − x₀) exp[−(x − x₀)²/(2σ²)]dx + A x₀ ∫ exp[−(x − x₀)²/(2σ²)]dx
    = 0 + x₀ A ∫ exp[−(x − x₀)²/(2σ²)]dx   (1.29)
    = x₀   (1.30)

where the first integral of the second line is zero (integral of an odd function between two symmetric bounds), and one has
used the normalization of the probability density W(x) in the last equality.

- The variance: using ⟨x⟩ = x₀, the variance (Δx)² = ⟨(x − ⟨x⟩)²⟩ is

(Δx)² = A ∫ (x − x₀)² exp[−(x − x₀)²/(2σ²)]dx
      = A ∫ y² exp[−y²/(2σ²)]dy
      = A (1/2)√(π(2σ²)³)

where one has used in the last line a formula of Appendix A. Replacing A by 1/(σ√(2π)), one obtains

(Δx)² = σ²   (1.31)

from which the standard deviation is Δx = σ.

One can also show (see Problem 1) that the binomial law becomes the Gaussian law when N ≫ n ≫ 1. This equivalence is known as the central limit theorem.

Poisson law

When the probability of an event is very small with respect to 1, namely for a rare event, the probability to find n events of the same kind is given by the Poisson law:

P(n) = μⁿ exp(−μ)/n!   (1.32)

where μ is a constant which is the mean value of n (see Problem 2). One shows that P(n) is normalized:

Σ_{n=0}^{∞} P(n) = Σ_{n=0}^{∞} μⁿ exp(−μ)/n! = exp(−μ) Σ_{n=0}^{∞} μⁿ/n! = exp(−μ) exp(μ) = 1   (1.33)

One can also show that the binomial law becomes the Poisson law when P_A ≪ P_B and N ≫ n ≫ 1 (see Problem 2).
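The main results of this section, the binomial mean (1.23) and variance (1.25), the Poisson normalization (1.33), and the rare-event limit of the binomial law, can be verified numerically. A sketch added for illustration; the values of N, P_A and μ are arbitrary:

```python
from math import comb, exp, factorial

# Binomial law, Eq. (1.21)
N, pA = 20, 0.3
pB = 1 - pA
P_bin = [comb(N, n) * pA**n * pB**(N - n) for n in range(N + 1)]
assert abs(sum(P_bin) - 1) < 1e-12                     # normalization, Eq. (1.22)

n_mean = sum(n * p for n, p in enumerate(P_bin))
n2_mean = sum(n**2 * p for n, p in enumerate(P_bin))
assert abs(n_mean - N * pA) < 1e-9                     # Eq. (1.23)
assert abs(n2_mean - n_mean**2 - N * pA * pB) < 1e-9   # Eq. (1.25)

# Poisson law, Eq. (1.32)
mu = 2.0
P_poi = [mu**n * exp(-mu) / factorial(n) for n in range(100)]
assert abs(sum(P_poi) - 1) < 1e-12                     # Eq. (1.33)
assert abs(sum(n * p for n, p in enumerate(P_poi)) - mu) < 1e-12  # mean = mu

# Rare-event limit: a binomial with large N and small P_A = mu/N
# approaches the Poisson law with the same mean
Nbig = 10_000
pA_rare = mu / Nbig
binom5 = comb(Nbig, 5) * pA_rare**5 * (1 - pA_rare)**(Nbig - 5)
assert abs(binom5 - P_poi[5]) < 1e-3
```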
1.4 Microstates – Macrostates

Microstates – Enumeration

A microstate of a system is defined in general by the individual states of the particles which constitute the system. These individual states are characterized by the physical parameters which define the system. Let us mention some examples. The microstates of an atom are given by the states of its electrons: each microstate is defined by four quantum numbers (n, l, m_l, m_s). The microstates of an isolated system of energy E are given by the different distributions of E over the particles of the system. The microstates, at a given time, of a system of N classical particles of mass m are defined by the positions and the velocities of the particles at that time. The microstates of a system of independent quantum particles are given by the wave vectors of the particles.

It is obvious that for a given system there is a large number of microstates which satisfy the physical constraints imposed on the system (volume, energy, temperature, ...). These microstates are called realizable microstates. It is very important to know how to calculate the number of these states. Systems of classical and quantum independent particles are treated in detail in chapter 2. In the following, one gives some examples and formulas for enumerating the microstates of simple systems.

In general, one distinguishes the cases of indiscernible and discernible particles. Discernible particles are those one can identify individually, such as particles with different colors or particles each bearing a number. Indiscernible particles are identical particles impossible to distinguish from one another, such as the atoms in a mono-atomic gas or the conduction electrons in a metal.

Example: Consider an isolated system of total energy equal to 3 units. This system has 3 particles, supposed to be discernible and bearing the letters A, B and C. Each particle can occupy energy levels at 0, 1, 2, or 3 units.
With the constraint that the sum of the energies of the particles is equal to 3, one has the 3 following categories of microstates (see Fig. 1.2):
- level 0: B, C; level 3: A (Fig. 1.2, left). Permutations of A with B and of A with C give 3 microstates.
- level 0: C; level 1: B; level 2: A (Fig. 1.2, middle). Permutations of A, B and C give 6 microstates.
- level 1: A, B, C (Fig. 1.2, right): 1 microstate.

Remark: Permutations between particles on the same level do not give rise to new microstates.
Fig. 1.2 Three categories of microstates (left, middle, right) according to the occupation numbers of the energy levels.

In the example given above, one can use the following formula to calculate the number of microstates of each category in Fig. 1.2:

ω = N!/(n₀! n₁! n₂! ...)   (1.34)

where n_i (i = 0, 1, 2, ...) is the number of particles occupying the i-th level and N is the total number of particles. One can verify that this formula gives the number of microstates enumerated above in each category. Furthermore, one sees that the total number of microstates (all categories) is 10. This number can be calculated in the general case, with arbitrary N and E, by

Ω(E) = (N + E − 1)!/[(N − 1)! E!]   (1.35)

For E = 3 and N = 3, one recovers Ω = (3 + 3 − 1)!/[(3 − 1)! 3!] = 10. The demonstrations of Eqs. (1.34) and (1.35) are done in Problem 3.

Remark: In the above example, if the particles are indiscernible, the number of microstates is reduced to 3, because permutations of particles on the different levels do not yield new microstates.

It should be emphasized that the knowledge of the number of microstates allows one to calculate the principal physical properties of isolated systems, as one will see in chapter 2. The microstates having the same energy are called degenerate states. The number of degenerate states at a given energy is called the degeneracy. In the above example, the degeneracy is 10 for E = 3.
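The counting in this example can be reproduced by brute force: enumerate all distributions of the 3 energy units over the 3 discernible particles and compare with Eqs. (1.34) and (1.35). The enumeration strategy below is an editorial sketch, not the book's:

```python
from itertools import product
from math import factorial

N, E = 3, 3
# A microstate assigns an energy 0..E to each discernible particle (A, B, C),
# with the constraint that the energies sum to E
microstates = [c for c in product(range(E + 1), repeat=N) if sum(c) == E]
assert len(microstates) == 10

# Eq. (1.35): Omega(E) = (N + E - 1)! / [(N - 1)! E!]
omega = factorial(N + E - 1) // (factorial(N - 1) * factorial(E))
assert omega == len(microstates)

# Eq. (1.34): omega = N!/(n0! n1! n2! ...) for one category of occupations,
# e.g. occupations (2, 0, 0, 1) of levels 0..3 (left panel of Fig. 1.2)
occ = (2, 0, 0, 1)
w = factorial(N)
for n_i in occ:
    w //= factorial(n_i)
assert w == 3

# Indiscernible particles: permutations no longer count; only 3 microstates remain
assert len({tuple(sorted(c)) for c in microstates}) == 3
```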
Each microstate l has a probability P_l. If one knows P_l, one can calculate the mean values of physical quantities, as will be seen in the following chapters. That is why the main objective of statistical mechanics is to find a way to determine P_l according to the conditions imposed on the system.

Remark: Quantum particles, by their nature, are indiscernible. One distinguishes two kinds of particles: bosons and fermions. Bosons are particles having integer spins (0, 1, 2, ...) and fermions are those having half-integer spins (1/2, 3/2, 5/2, ...). The symmetry postulate of quantum mechanics states that the wave functions of bosons are invariant with respect to the permutation of two particles in their states, while the wave functions of fermions change their sign at each permutation. A consequence of this postulate is that a quantum microstate can contain any number of bosons, but only zero or one fermion. This obviously affects the enumeration of microstates, as one will see in chapters 5 and 6.

Macroscopic states

A macroscopic state, or macrostate for short, is an ensemble of microstates having a common macroscopic property. An observed macroscopic property is considered as the mean value over a statistical mixing of the microstates of the ensemble. The definition thus depends on which macroscopic property one wants to observe: there are many ways to define a macrostate. In the example shown in Fig. 1.2, one can define a macrostate by the occupation numbers of the energy levels: there are then 3 macrostates, corresponding to the following occupation numbers of the first 4 levels: (2,0,0,1), (1,1,1,0), (0,3,0,0) (Fig. 1.2, from left to right). One can also define a macrostate as the one having an energy E = 3.
In this case, the macrostate contains the ensemble of all 10 microstates.

1.5 Statistical averages – Ergodic hypothesis

When a system is at equilibrium, the mean values of its macroscopic variables do not vary with time. However, the system often fluctuates between different microstates, giving rise to small variations around the mean values of the macroscopic variables. One defines the time average of a variable f by
⟨f⟩_t = lim_{τ→∞} (1/τ) ∫_{t₀}^{t₀+τ} f(t)dt   (1.36)

where τ is the averaging time. On the other hand, as seen above, the mean value of a quantity can be calculated over the ensemble of microstates. Let P_l be the probability of the microstate l, in which the variable f is equal to f_l. One defines the so-called ensemble average of f by

⟨f⟩ = Σ_l P_l f_l   (1.37)

The ergodic hypothesis of equilibrium statistical mechanics, introduced by Boltzmann in 1871, states that the time and ensemble averages are equal. The validity of this principle has been experimentally verified for systems at equilibrium. Of course, for systems with extremely long relaxation times, such as glasses, this equivalence is not easy to prove.

1.6 Statistical entropy

In statistical mechanics, the fundamental quantity which allows one to calculate the principal properties of a system is the statistical entropy, introduced by Boltzmann near the end of the 19th century. It is defined by

S = −k_B Σ_l P_l ln P_l   (1.38)

where P_l is the probability of the microstate l of the system and k_B is the Boltzmann constant. One will see in the following chapters that S coincides with the thermodynamic entropy defined in a macroscopic manner by the second principle of classical thermodynamics. The advantage of the definition (1.38) is that the physical consequences associated with the entropy are clear, as seen below:

- As 0 ≤ P_l ≤ 1, one has S ≥ 0.
- When one of the events is sure, namely when one probability is equal to 1 (the other probabilities being zero by normalization), one sees that S is zero (minimum). When all probabilities are equal, namely when
the uncertainty on the outcome is maximum, one can show that S is maximum (see Problem 8). In other words, S represents the uncertainty, or the lack of information, on the system. With the above definition, one easily understands why in thermodynamics the entropy S is said to express the disorder of the system.

- Statistical entropy is additive: consider two independent ensembles of events {e_m; m = 1, ..., M} and {e′_{m′}; m′ = 1, ..., M′} of respective probabilities {P_m; m = 1, ..., M} and {P′_{m′}; m′ = 1, ..., M′}. The probability to find e_m and e′_{m′} is thus P(m, m′) = P_m P′_{m′}. The entropy of these double events is then

S(e, e′) = −k_B Σ_{m,m′} P_m P′_{m′} ln(P_m P′_{m′})
         = −k_B Σ_{m,m′} P_m P′_{m′} [ln P_m + ln P′_{m′}]
         = −k_B Σ_m P_m ln P_m Σ_{m′} P′_{m′} − k_B Σ_{m′} P′_{m′} ln P′_{m′} Σ_m P_m
         = S(e) + S(e′)   (1.39)

using Σ_m P_m = 1 and Σ_{m′} P′_{m′} = 1. The total entropy is thus the sum of the partial entropies.

1.7 Conclusion

Since statistical mechanics is based on a probabilistic concept, the enumeration of microstates and the use of probabilities are very important. In this chapter, basic definitions and fundamental concepts on these points have been presented. In particular, the most frequently used probability laws, namely the binomial, Gaussian and Poisson distributions, have been presented. Microstates and macrostates were defined using examples for better illustration. Finally, the statistical entropy, which is the most important quantity in statistical mechanics, was defined and discussed in connection with the thermodynamic entropy. The above definitions and tools will be used throughout this book.
1.8 Problems

Problem 1. Central limit theorem: Show that the binomial law becomes the Gaussian law when N ≫ n ≫ 1.

Problem 2. Poisson law (1.32):
a) Calculate ⟨n⟩.
b) Show that the binomial law becomes the Poisson law when P_A ≪ P_B and N ≫ n ≫ 1.

Problem 3. Demonstrate the formulas (1.34) and (1.35).

Problem 4. Application of the binomial law: Calculate the probability to find the number of heads between 3 and 6 (boundaries included) when one flips a coin 10 times.

Problem 5. Random walk in one dimension: Consider a particle moving on the x axis with a constant step length l, to the right or to the left with the same probability.
a) Calculate the probability for the particle to make n steps to the right and n′ steps to the left after N steps.
b) Calculate the averages ⟨n⟩ and ⟨n′⟩. Calculate the variance and the relative uncertainty on n.
c) Calculate the probability to find the particle at the position x = ml from the origin. Calculate ⟨x⟩ and (Δx)².
d) Suppose now that the step length x_i is not constant. What is the density of probability to find the particle at X after N steps, with N ≫ 1?

Problem 6. Random walk in three dimensions: Consider a particle moving in three dimensions with variable discrete step lengths l and a density of probability independent of the direction of the step. Each step corresponds to a change of direction of the particle's motion, due to a collision with another particle. This is a simple model of Brownian motion.
a) Calculate the Cartesian components (X, Y, Z) of the particle position after N steps. Calculate the averages (⟨X⟩, ⟨Y⟩, ⟨Z⟩) and the corresponding variances (ΔX)², (ΔY)² and (ΔZ)². Deduce the average ⟨l²⟩.
b) What is the density of probability to find the particle at (X, Y, Z) after N steps?
c) Putting (ΔX)² = 2Dt, where D is the diffusion coefficient and t the duration of the motion of the particle, show that D = (1/6) ⟨l²⟩/τ, where τ is the average time between two collisions of the particle.
Problem 7. Exchange of energy: Consider two isolated systems. The first one has N₁ = 2 particles and an energy equal to E₁ = 3 units. The second one has N₂ = 4 particles and E₂ = 5 units. Each particle can have 0, 1, 2, ... energy units.
a) Calculate the number of microstates of the total system composed of systems 1 and 2 separated by an insulating wall.
b) Remove the wall. The total system is kept isolated. Calculate the number of microstates of the total system in this new situation. Comment.
c) In the situation of the previous question, calculate the number of microstates in which there are at least 2 particles having 2 energy units each. Deduce the probability of this macrostate.

Problem 8. Statistical entropy: Show that the statistical entropy S = −k_B Σ_l P_l ln P_l is maximum when all the probabilities P_l are equal.
More informationPhysics 172H Modern Mechanics
Physics 172H Modern Mechanics Instructor: Dr. Mark Haugan Office: PHYS 282 haugan@purdue.edu TAs: Alex Kryzwda John Lorenz akryzwda@purdue.edu jdlorenz@purdue.edu Lecture 22: Matter & Interactions, Ch.
More informationELEMENTARY COURSE IN STATISTICAL MECHANICS
ELEMENTARY COURSE IN STATISTICAL MECHANICS ANTONIO CONIGLIO Dipartimento di Scienze Fisiche, Universita di Napoli Federico II, Monte S. Angelo I-80126 Napoli, ITALY April 2, 2008 Chapter 1 Thermodynamics
More information1 Phase Spaces and the Liouville Equation
Phase Spaces and the Liouville Equation emphasize the change of language from deterministic to probablistic description. Under the dynamics: ½ m vi = F i ẋ i = v i with initial data given. What is the
More informationChapter 2 Ensemble Theory in Statistical Physics: Free Energy Potential
Chapter Ensemble Theory in Statistical Physics: Free Energy Potential Abstract In this chapter, we discuss the basic formalism of statistical physics Also, we consider in detail the concept of the free
More informationChapter 8: An Introduction to Probability and Statistics
Course S3, 200 07 Chapter 8: An Introduction to Probability and Statistics This material is covered in the book: Erwin Kreyszig, Advanced Engineering Mathematics (9th edition) Chapter 24 (not including
More information5. Systems in contact with a thermal bath
5. Systems in contact with a thermal bath So far, isolated systems (micro-canonical methods) 5.1 Constant number of particles:kittel&kroemer Chap. 3 Boltzmann factor Partition function (canonical methods)
More informationIntroduction to Statistical Thermodynamics
Cryocourse 2011 Chichilianne Introduction to Statistical Thermodynamics Henri GODFRIN CNRS Institut Néel Grenoble http://neel.cnrs.fr/ Josiah Willard Gibbs worked on statistical mechanics, laying a foundation
More informationII. Probability. II.A General Definitions
II. Probability II.A General Definitions The laws of thermodynamics are based on observations of macroscopic bodies, and encapsulate their thermal properties. On the other hand, matter is composed of atoms
More informationThermodynamics: More Entropy
Thermodynamics: More Entropy From Warmup Yay for only having to read one section! I thought the entropy statement of the second law made a lot more sense than the other two. Just probability. I haven't
More informationStatistical Thermodynamics and Monte-Carlo Evgenii B. Rudnyi and Jan G. Korvink IMTEK Albert Ludwig University Freiburg, Germany
Statistical Thermodynamics and Monte-Carlo Evgenii B. Rudnyi and Jan G. Korvink IMTEK Albert Ludwig University Freiburg, Germany Preliminaries Learning Goals From Micro to Macro Statistical Mechanics (Statistical
More informationRemoving the mystery of entropy and thermodynamics. Part 3
Removing the mystery of entropy and thermodynamics. Part 3 arvey S. Leff a,b Physics Department Reed College, Portland, Oregon USA August 3, 20 Introduction In Part 3 of this five-part article, [, 2] simple
More informationLecture Notes Set 3a: Probabilities, Microstates and Entropy
Lecture Notes Set 3a: Probabilities, Microstates and Entropy Thus far.. In Sections 1 and 2 of the module we ve covered a broad variety of topics at the heart of the thermal and statistical behaviour of
More information1 Multiplicity of the ideal gas
Reading assignment. Schroeder, section.6. 1 Multiplicity of the ideal gas Our evaluation of the numbers of microstates corresponding to each macrostate of the two-state paramagnet and the Einstein model
More informationStatistical Methods in Particle Physics
Statistical Methods in Particle Physics Lecture 3 October 29, 2012 Silvia Masciocchi, GSI Darmstadt s.masciocchi@gsi.de Winter Semester 2012 / 13 Outline Reminder: Probability density function Cumulative
More informationThe Methodology of Statistical Mechanics
Chapter 4 The Methodology of Statistical Mechanics c 2006 by Harvey Gould and Jan Tobochnik 16 November 2006 We develop the basic methodology of statistical mechanics and provide a microscopic foundation
More informationalthough Boltzmann used W instead of Ω for the number of available states.
Lecture #13 1 Lecture 13 Obectives: 1. Ensembles: Be able to list the characteristics of the following: (a) icrocanonical (b) Canonical (c) Grand Canonical 2. Be able to use Lagrange s method of undetermined
More informationMASSACHUSETTS INSTITUTE OF TECHNOLOGY Physics Department Statistical Physics I Spring Term 2013 Notes on the Microcanonical Ensemble
MASSACHUSETTS INSTITUTE OF TECHNOLOGY Physics Department 8.044 Statistical Physics I Spring Term 2013 Notes on the Microcanonical Ensemble The object of this endeavor is to impose a simple probability
More information04. Random Variables: Concepts
University of Rhode Island DigitalCommons@URI Nonequilibrium Statistical Physics Physics Course Materials 215 4. Random Variables: Concepts Gerhard Müller University of Rhode Island, gmuller@uri.edu Creative
More informationStatistical Physics. How to connect the microscopic properties -- lots of changes to the macroscopic properties -- not changing much.
Statistical Physics How to connect the microscopic properties -- lots of changes to the macroscopic properties -- not changing much. We will care about: N = # atoms T = temperature V = volume U = total
More informationStatistical Mechanics Notes. Ryan D. Reece
Statistical Mechanics Notes Ryan D. Reece August 11, 2006 Contents 1 Thermodynamics 3 1.1 State Variables.......................... 3 1.2 Inexact Differentials....................... 5 1.3 Work and Heat..........................
More information1 Fluctuations of the number of particles in a Bose-Einstein condensate
Exam of Quantum Fluids M1 ICFP 217-218 Alice Sinatra and Alexander Evrard The exam consists of two independant exercises. The duration is 3 hours. 1 Fluctuations of the number of particles in a Bose-Einstein
More information. Find E(V ) and var(v ).
Math 6382/6383: Probability Models and Mathematical Statistics Sample Preliminary Exam Questions 1. A person tosses a fair coin until she obtains 2 heads in a row. She then tosses a fair die the same number
More informationChapter 2. Random Variable. Define single random variables in terms of their PDF and CDF, and calculate moments such as the mean and variance.
Chapter 2 Random Variable CLO2 Define single random variables in terms of their PDF and CDF, and calculate moments such as the mean and variance. 1 1. Introduction In Chapter 1, we introduced the concept
More informationG : Statistical Mechanics Notes for Lecture 3 I. MICROCANONICAL ENSEMBLE: CONDITIONS FOR THERMAL EQUILIBRIUM Consider bringing two systems into
G25.2651: Statistical Mechanics Notes for Lecture 3 I. MICROCANONICAL ENSEMBLE: CONDITIONS FOR THERMAL EQUILIBRIUM Consider bringing two systems into thermal contact. By thermal contact, we mean that the
More information1 Presessional Probability
1 Presessional Probability Probability theory is essential for the development of mathematical models in finance, because of the randomness nature of price fluctuations in the markets. This presessional
More information2m + U( q i), (IV.26) i=1
I.D The Ideal Gas As discussed in chapter II, micro-states of a gas of N particles correspond to points { p i, q i }, in the 6N-dimensional phase space. Ignoring the potential energy of interactions, the
More informationExercises with solutions (Set D)
Exercises with solutions Set D. A fair die is rolled at the same time as a fair coin is tossed. Let A be the number on the upper surface of the die and let B describe the outcome of the coin toss, where
More informationIdentical Particles. Bosons and Fermions
Identical Particles Bosons and Fermions In Quantum Mechanics there is no difference between particles and fields. The objects which we refer to as fields in classical physics (electromagnetic field, field
More informationLecture Notes 1 Probability and Random Variables. Conditional Probability and Independence. Functions of a Random Variable
Lecture Notes 1 Probability and Random Variables Probability Spaces Conditional Probability and Independence Random Variables Functions of a Random Variable Generation of a Random Variable Jointly Distributed
More informationChapter 6 - Random Processes
EE385 Class Notes //04 John Stensby Chapter 6 - Random Processes Recall that a random variable X is a mapping between the sample space S and the extended real line R +. That is, X : S R +. A random process
More informationNorthwestern University Department of Electrical Engineering and Computer Science
Northwestern University Department of Electrical Engineering and Computer Science EECS 454: Modeling and Analysis of Communication Networks Spring 2008 Probability Review As discussed in Lecture 1, probability
More information( ) ( )( k B ( ) ( ) ( ) ( ) ( ) ( k T B ) 2 = ε F. ( ) π 2. ( ) 1+ π 2. ( k T B ) 2 = 2 3 Nε 1+ π 2. ( ) = ε /( ε 0 ).
PHYS47-Statistical Mechanics and hermal Physics all 7 Assignment #5 Due on November, 7 Problem ) Problem 4 Chapter 9 points) his problem consider a system with density of state D / ) A) o find the ermi
More informationSummary of basic probability theory Math 218, Mathematical Statistics D Joyce, Spring 2016
8. For any two events E and F, P (E) = P (E F ) + P (E F c ). Summary of basic probability theory Math 218, Mathematical Statistics D Joyce, Spring 2016 Sample space. A sample space consists of a underlying
More informationAdvanced Thermodynamics. Jussi Eloranta (Updated: January 22, 2018)
Advanced Thermodynamics Jussi Eloranta (jmeloranta@gmail.com) (Updated: January 22, 2018) Chapter 1: The machinery of statistical thermodynamics A statistical model that can be derived exactly from the
More informationMath 122L. Additional Homework Problems. Prepared by Sarah Schott
Math 22L Additional Homework Problems Prepared by Sarah Schott Contents Review of AP AB Differentiation Topics 4 L Hopital s Rule and Relative Rates of Growth 6 Riemann Sums 7 Definition of the Definite
More informationIntroduction to Probability and Statistics (Continued)
Introduction to Probability and Statistics (Continued) Prof. icholas Zabaras Center for Informatics and Computational Science https://cics.nd.edu/ University of otre Dame otre Dame, Indiana, USA Email:
More informationTheoretical Statistical Physics
Theoretical Statistical Physics Prof. Dr. Christof Wetterich Institute for Theoretical Physics Heidelberg University Last update: March 25, 2014 Script prepared by Robert Lilow, using earlier student's
More informationLecture 8. The Second Law of Thermodynamics; Energy Exchange
Lecture 8 The Second Law of Thermodynamics; Energy Exchange The second law of thermodynamics Statistics of energy exchange General definition of temperature Why heat flows from hot to cold Reading for
More informationStatistical thermodynamics for MD and MC simulations
Statistical thermodynamics for MD and MC simulations knowing 2 atoms and wishing to know 10 23 of them Marcus Elstner and Tomáš Kubař 22 June 2016 Introduction Thermodynamic properties of molecular systems
More informationThe non-interacting Bose gas
Chapter The non-interacting Bose gas Learning goals What is a Bose-Einstein condensate and why does it form? What determines the critical temperature and the condensate fraction? What changes for trapped
More informationChapter 20 The Second Law of Thermodynamics
Chapter 20 The Second Law of Thermodynamics When we previously studied the first law of thermodynamics, we observed how conservation of energy provided us with a relationship between U, Q, and W, namely
More information(# = %(& )(* +,(- Closed system, well-defined energy (or e.g. E± E/2): Microcanonical ensemble
Recall from before: Internal energy (or Entropy): &, *, - (# = %(& )(* +,(- Closed system, well-defined energy (or e.g. E± E/2): Microcanonical ensemble & = /01Ω maximized Ω: fundamental statistical quantity
More informationDr.Salwa Alsaleh fac.ksu.edu.sa/salwams
Dr.Salwa Alsaleh Salwams@ksu.edu.sa fac.ksu.edu.sa/salwams Lecture 5 Basic Ideas of Statistical Mechanics General idea Macrostates and microstates Fundamental assumptions A simple illustration: tossing
More informationCMPSCI 240: Reasoning Under Uncertainty
CMPSCI 240: Reasoning Under Uncertainty Lecture 5 Prof. Hanna Wallach wallach@cs.umass.edu February 7, 2012 Reminders Pick up a copy of B&T Check the course website: http://www.cs.umass.edu/ ~wallach/courses/s12/cmpsci240/
More informationFirst and Last Name: 2. Correct The Mistake Determine whether these equations are false, and if so write the correct answer.
. Correct The Mistake Determine whether these equations are false, and if so write the correct answer. ( x ( x (a ln + ln = ln(x (b e x e y = e xy (c (d d dx cos(4x = sin(4x 0 dx xe x = (a This is an incorrect
More informationThermal and Statistical Physics Department Exam Last updated November 4, L π
Thermal and Statistical Physics Department Exam Last updated November 4, 013 1. a. Define the chemical potential µ. Show that two systems are in diffusive equilibrium if µ 1 =µ. You may start with F =
More informationWe already came across a form of indistinguishably in the canonical partition function: V N Q =
Bosons en fermions Indistinguishability We already came across a form of indistinguishably in the canonical partition function: for distinguishable particles Q = Λ 3N βe p r, r 2,..., r N ))dτ dτ 2...
More informationSample Spaces, Random Variables
Sample Spaces, Random Variables Moulinath Banerjee University of Michigan August 3, 22 Probabilities In talking about probabilities, the fundamental object is Ω, the sample space. (elements) in Ω are denoted
More informationCME 106: Review Probability theory
: Probability theory Sven Schmit April 3, 2015 1 Overview In the first half of the course, we covered topics from probability theory. The difference between statistics and probability theory is the following:
More informationProbability theory basics
Probability theory basics Michael Franke Basics of probability theory: axiomatic definition, interpretation, joint distributions, marginalization, conditional probability & Bayes rule. Random variables:
More informationSupplement: Statistical Physics
Supplement: Statistical Physics Fitting in a Box. Counting momentum states with momentum q and de Broglie wavelength λ = h q = 2π h q In a discrete volume L 3 there is a discrete set of states that satisfy
More informationComputer Applications for Engineers ET 601
Computer Applications for Engineers ET 601 Asst. Prof. Dr. Prapun Suksompong prapun@siit.tu.ac.th Random Variables (Con t) 1 Office Hours: (BKD 3601-7) Wednesday 9:30-11:30 Wednesday 16:00-17:00 Thursday
More informationReview of Probability Theory
Review of Probability Theory Arian Maleki and Tom Do Stanford University Probability theory is the study of uncertainty Through this class, we will be relying on concepts from probability theory for deriving
More informationStatistics and Econometrics I
Statistics and Econometrics I Random Variables Shiu-Sheng Chen Department of Economics National Taiwan University October 5, 2016 Shiu-Sheng Chen (NTU Econ) Statistics and Econometrics I October 5, 2016
More informationLecture 8. The Second Law of Thermodynamics; Energy Exchange
Lecture 8 The Second Law of Thermodynamics; Energy Exchange The second law of thermodynamics Statistics of energy exchange General definition of temperature Why heat flows from hot to cold Reading for
More informationIntroduction Statistical Thermodynamics. Monday, January 6, 14
Introduction Statistical Thermodynamics 1 Molecular Simulations Molecular dynamics: solve equations of motion Monte Carlo: importance sampling r 1 r 2 r n MD MC r 1 r 2 2 r n 2 3 3 4 4 Questions How can
More informationIntroduction to Stochastic Processes
Stat251/551 (Spring 2017) Stochastic Processes Lecture: 1 Introduction to Stochastic Processes Lecturer: Sahand Negahban Scribe: Sahand Negahban 1 Organization Issues We will use canvas as the course webpage.
More informationfiziks Institute for NET/JRF, GATE, IIT-JAM, JEST, TIFR and GRE in PHYSICAL SCIENCES
Content-Thermodynamics & Statistical Mechanics 1. Kinetic theory of gases..(1-13) 1.1 Basic assumption of kinetic theory 1.1.1 Pressure exerted by a gas 1.2 Gas Law for Ideal gases: 1.2.1 Boyle s Law 1.2.2
More informationStatistical Mechanics
Franz Schwabl Statistical Mechanics Translated by William Brewer Second Edition With 202 Figures, 26 Tables, and 195 Problems 4u Springer Table of Contents 1. Basic Principles 1 1.1 Introduction 1 1.2
More informationBrownian motion and the Central Limit Theorem
Brownian motion and the Central Limit Theorem Amir Bar January 4, 3 Based on Shang-Keng Ma, Statistical Mechanics, sections.,.7 and the course s notes section 6. Introduction In this tutorial we shall
More informationThe goal of equilibrium statistical mechanics is to calculate the diagonal elements of ˆρ eq so we can evaluate average observables < A >= Tr{Â ˆρ eq
Chapter. The microcanonical ensemble The goal of equilibrium statistical mechanics is to calculate the diagonal elements of ˆρ eq so we can evaluate average observables < A >= Tr{Â ˆρ eq } = A that give
More informationGrand Canonical Formalism
Grand Canonical Formalism Grand Canonical Ensebmle For the gases of ideal Bosons and Fermions each single-particle mode behaves almost like an independent subsystem, with the only reservation that the
More informationLecture 9 Examples and Problems
Lecture 9 Examples and Problems Counting microstates of combined systems Volume exchange between systems Definition of Entropy and its role in equilibrium The second law of thermodynamics Statistics of
More information1 The postulates of quantum mechanics
1 The postulates of quantum mechanics The postulates of quantum mechanics were derived after a long process of trial and error. These postulates provide a connection between the physical world and the
More information[S R (U 0 ɛ 1 ) S R (U 0 ɛ 2 ]. (0.1) k B
Canonical ensemble (Two derivations) Determine the probability that a system S in contact with a reservoir 1 R to be in one particular microstate s with energy ɛ s. (If there is degeneracy we are picking
More informationLecture 6. Statistical Processes. Irreversibility. Counting and Probability. Microstates and Macrostates. The Meaning of Equilibrium Ω(m) 9 spins
Lecture 6 Statistical Processes Irreversibility Counting and Probability Microstates and Macrostates The Meaning of Equilibrium Ω(m) 9 spins -9-7 -5-3 -1 1 3 5 7 m 9 Lecture 6, p. 1 Irreversibility Have
More information