STATISTICAL PHYSICS II

(Statistical Physics of Non-Equilibrium Systems)

Lecture notes, 2009

V.S. Shumeiko

Part I. CLASSICAL KINETIC THEORY

Lecture 1. Outline of equilibrium statistical physics

1.1 Definitions

Let us consider a classical system in 3 dimensions with $N$ particles. The microscopic state of the system is fully defined by the position and momentum of each of the $N$ particles. If we denote the position of the $i$-th particle by the 3-dimensional vector $\vec q_i$ and similarly denote its momentum by $\vec p_i$, the microstate of the system is specified by the vectors $\{\vec q_1, \vec p_1, \vec q_2, \vec p_2, \ldots, \vec q_N, \vec p_N\}$. The $6N$-dimensional space spanned by these vectors is known as the phase space $\Gamma$ of the system. For simplicity, let us denote the microstate of the system by the $6N$-dimensional vector $X$.

As the system evolves in time it traces a trajectory $X(t)$ in phase space. The simplest example is the time development of a set of non-interacting particles in the absence of external forces: from Newton's laws we know that the momenta of the particles do not change and the particles' positions change with constant velocities; consequently, the state of the system as a function of time is a straight line in phase space. If there are external or internal forces acting upon the system, the trajectory in phase space can be considerably more complicated and, under some circumstances, chaotic.

In statistical physics the system usually consists of a huge number of particles, and it is not possible to determine the positions and momenta of all of them. Instead, we define the macrostate of the system by specifying a few macroscopic quantities like pressure, volume and temperature. Usually, particular values of these macroscopic quantities can originate from any one of a large number of different microstates (the exception is if we specify $T = 0$, in which case the system will be in one of a (usually) small number of ground states).
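As a concrete (and entirely illustrative) sketch of these definitions, the following code represents a microstate $X$ as a $6N$-dimensional array and evolves non-interacting, force-free particles, checking that the phase-space trajectory is indeed a straight line; all names here are assumptions of the sketch, not notation from the notes.

```python
# Sketch: a microstate X in 6N-dimensional phase space, and free evolution.
import numpy as np

rng = np.random.default_rng(0)
N = 4                                 # number of particles (illustrative)
q = rng.normal(size=(N, 3))           # positions q_i
p = rng.normal(size=(N, 3))           # momenta  p_i
X0 = np.concatenate([q.ravel(), p.ravel()])   # the 6N-dimensional vector X

def free_evolution(X, t, m=1.0):
    """Force-free Newtonian evolution: q(t) = q(0) + (p/m) t, p(t) = p(0)."""
    n3 = X.size // 2
    q0, p0 = X[:n3], X[n3:]
    return np.concatenate([q0 + (p0 / m) * t, p0])

X1 = free_evolution(X0, 1.0)
X2 = free_evolution(X0, 2.0)
assert np.allclose(X2 - X1, X1 - X0)   # equal increments: a straight line in phase space
assert np.allclose(X1[3 * N:], X0[3 * N:])   # momenta are constants of the motion
```

The two assertions verify the statement in the text: the momenta do not change, and the trajectory $X(t) = X(0) + tV$ advances by a constant phase-space "velocity" $V = (\vec p/m, 0)$.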
To describe such macrostates statistically, we define a probability density $f_N(X,t)$ such that $f_N(X,t)\,\Delta X$ is the probability that at time $t$ the microstate of the system is in the neighborhood $\Delta X$ of the point $X$ in phase space. Since $f_N(X,t)$ is a probability density, it obeys the normalization condition

$$\int dX\, f_N(X,t) = 1, \qquad dX = \prod_i d^3q_i\, d^3p_i. \qquad (1.1)$$

A probability density is also often called a distribution function. In many cases it is convenient to use normalizations different from Eq. (1.1), for example with the integral equal to the number of particles $N$, or to the particle density $N/V$.

Any macroscopic variable $A(X)$ is characterized by various statistical averages (moments), which are also called ensemble averages. For example, the first moment is

$$\langle A\rangle = \int dX\, A(X)\, f(X), \qquad (1.2)$$

and, more generally, the $n$-th moment,

$$\langle A^n\rangle = \int dX\, A^n(X)\, f(X). \qquad (1.3)$$

Fluctuations around average values are characterized by cumulants, for example the second cumulant, or variance,

$$\operatorname{var} A = \langle (A - \langle A\rangle)^2\rangle = \langle A^2\rangle - \langle A\rangle^2. \qquad (1.4)$$

The correlation function of two variables $A(X)$ and $B(X)$ is defined as

$$\langle AB\rangle = \int dX\, A(X)\, B(X)\, f(X). \qquad (1.5)$$

Statistically independent variables satisfy the relation

$$\langle AB\rangle = \langle A\rangle\langle B\rangle. \qquad (1.6)$$

In classical physics, statistical correlations indicate interaction: non-interacting particles can be considered statistically independent. In quantum physics this is not the case: for example, the Pauli exclusion principle imposes quantum correlations on fermionic particles which persist even in the absence of any interaction.

1.2 Self-averaging

The goal of equilibrium statistical physics is to establish a bridge between the macroscopic variables introduced in thermodynamics (temperature, pressure, entropy, internal energy, quantity of heat, etc.) and the ensemble averages of the microscopic description. The first issue to address is the apparent qualitative difference between the two descriptions: random variables of the microscopic description always fluctuate, while thermodynamical quantities do not seem to. This discrepancy is resolved by the self-averaging property of macroscopic variables.

Consider, e.g., the total energy $E$ of an ideal gas, which is the sum of the energies of all molecules, $E = \sum_i^N \epsilon_i$, where $i$ counts the molecules. The random quantity $\epsilon_i$ has some average value, which we assume for simplicity to be independent of $i$, $\langle\epsilon_i\rangle = \epsilon$, and some variance, also $i$-independent, $\operatorname{var}\epsilon_i = \delta$. Since the molecules of an ideal gas do not interact, they may be considered statistically independent, hence $\langle\epsilon_i\epsilon_j\rangle = \langle\epsilon_i\rangle\langle\epsilon_j\rangle$ for $i\neq j$. Macroscopically, our gas is characterized by an internal energy $U = \langle E\rangle = N\epsilon$. Let us evaluate the variance of the internal energy,

$$\operatorname{var} E = \langle E^2\rangle - U^2. \qquad (1.7)$$

Using the microscopic energies, we rewrite this as

$$\operatorname{var} E = \sum_{ij}^N \left[\langle\epsilon_i\epsilon_j\rangle - \langle\epsilon_i\rangle\langle\epsilon_j\rangle\right] = \sum_i^N \left[\langle\epsilon_i^2\rangle - \langle\epsilon_i\rangle^2\right] + \sum_{i\neq j}^N \left[\langle\epsilon_i\epsilon_j\rangle - \langle\epsilon_i\rangle\langle\epsilon_j\rangle\right]. \qquad (1.8)$$
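As an aside, the outcome of this calculation, $\operatorname{var} E = N\delta$ and hence a relative fluctuation of order $1/\sqrt N$, is easy to check by direct sampling. The sketch below assumes (arbitrarily) exponentially distributed molecular energies; the choice of distribution does not matter for the scaling.

```python
# Numerical check of self-averaging: sqrt(var E)/U should scale as 1/sqrt(N).
import numpy as np

rng = np.random.default_rng(1)

def relative_fluctuation(N, samples=4000):
    # each row: one realization of N independent molecular energies eps_i
    eps = rng.exponential(scale=1.0, size=(samples, N))
    E = eps.sum(axis=1)              # total energy of each realization
    return E.std() / E.mean()        # sqrt(var E) / U

r100, r10000 = relative_fluctuation(100), relative_fluctuation(10000)
ratio = r100 / r10000
# increasing N by a factor of 100 shrinks the fluctuation by roughly 10
assert 5 < ratio < 20
```

With $N = 100$ the relative fluctuation is about $0.1$; with $N = 10^4$, about $0.01$, in line with the $1/\sqrt N$ law derived below Eq. (1.8).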

The second sum is equal to zero by virtue of the molecules' statistical independence, thus we get

$$\operatorname{var} E = N\delta. \qquad (1.9)$$

The fluctuation is quantified by the ratio of the square root of the variance to the average,

$$\frac{\sqrt{\operatorname{var} E}}{U} = \frac{\sqrt\delta}{\sqrt N\,\epsilon}. \qquad (1.10)$$

Thus we come to an important conclusion: the fluctuation is proportional to $1/\sqrt N$. For very large systems (the thermodynamic limit $N\to\infty$) the fluctuation is vanishingly small, $1/\sqrt N \to 0$. This property is called self-averaging. The same argument applies to any macroscopic variable that is a sum of many non-interacting microscopic ingredients. Clearly, the self-averaging property is a particular case of the central limit theorem of probability theory: the relative fluctuation of an additive function of many random variables is vanishingly small.

In fact, the self-averaging property also holds for thermodynamical systems with arbitrarily strong microscopic interactions, provided the interaction is short-ranged. To see this, we apply the following scaling argument: consider an infinitely large system (the thermodynamic limit!), and divide it into a very large (asymptotically infinite) number of blocks, each of them itself being a large thermodynamic system. Since the microscopic interactions are local, the interactions between the blocks are confined to the block interfaces, while the bulk parts of the blocks do not interact. The interface areas are vanishingly small compared to the bulk volume in the thermodynamic limit, so the blocks can be considered non-interacting and hence statistically independent. Repeating for any additive thermodynamic characteristic of the whole system the calculation we did for the energy of an ideal gas, we find that this characteristic is self-averaging.

1.3 Microcanonical ensemble

Thermodynamical equilibrium is a unique state of a system characterized by the maximum value of the entropy. What distribution function corresponds to this state?
Finding the form of such a distribution function is a central task of equilibrium statistical physics. The result must be consistent with the phenomenological laws of thermodynamics, in particular with the maximum entropy principle, i.e. the second law of thermodynamics. There are many ways to derive (better to say, postulate) the equilibrium distribution function. Here we will follow the derivation based on the microscopic expression for the entropy postulated by Gibbs, together with the maximum entropy principle. The relation between the entropy and the distribution function suggested by Gibbs has the form

$$S = -k\langle \ln f\rangle = -k\int dX\, f(X)\ln f(X), \qquad (1.11)$$

where $k$ is the Boltzmann constant. Now we have to solve a variational problem: find the distribution function that maximizes the integral defining the entropy.

Let us consider a closed system whose exact energy is fixed; this defines a $(6N-1)$-dimensional surface in phase space, $\Gamma_E$. Let us denote the area of this surface by $\Omega(E)$. The variational problem must be solved under the normalization constraint, which is taken into account by considering the extended variational problem

$$\delta\left[S - \alpha\langle 1\rangle\right] = 0, \qquad (1.12)$$

where $\alpha$ is a Lagrange multiplier, to be found from the normalization condition $\langle 1\rangle = 1$. Performing the explicit variation we get

$$\delta\int_{\Gamma_E} dX\,\left[-kf(X)\ln f(X) - \alpha f(X)\right] = \int_{\Gamma_E} dX\,\left[-k\ln f(X) - k - \alpha\right]\delta f(X) = 0. \qquad (1.13)$$

The integral in the above equation is equal to zero for an arbitrary variation of $f(X)$ if and only if the expression in the brackets is zero,

$$-k\ln f(X) - k - \alpha = 0. \qquad (1.14)$$

The solution to this equation is a constant function over the energy surface, $f(X) = \text{const}$. From the normalization condition we find this constant,

$$f(X) = \frac{1}{\Omega(E)}. \qquad (1.15)$$

This equilibrium distribution function for a closed system is known as the microcanonical ensemble. For this ensemble, the entropy is related to the available phase volume,

$$S(E) = k\ln\Omega(E), \quad\text{or}\quad \Omega(E) = e^{S(E)/k}. \qquad (1.16)$$

An intuitive feeling for the nature of such a distribution can be gained by picturing the chaotic motion of the system over the surface of constant energy, visiting every region of the surface equally often. Such behavior is called ergodic, and the corresponding assumption about the evolution of the system is known as the ergodicity hypothesis. The thermodynamic description is relevant for ergodic systems. It is generally believed that very large interacting systems are ergodic; however, it is not known exactly which macroscopic Hamiltonians generate ergodic evolution.
Mathematically, the ergodicity property is formulated in terms of a time average along the system trajectory $X(t)$,

$$\bar A_{X_0} = \lim_{T\to\infty}\frac{1}{T}\int_0^T dt\, A(X(t)), \qquad X(0) = X_0. \qquad (1.17)$$

For ergodic evolution the time average does not depend on the initial state, and it is equal to the ensemble average,

$$\bar A = \langle A\rangle. \qquad (1.18)$$
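The equality (1.18) can be illustrated with a deliberately trivial ergodic caricature (an assumed toy model, not from the notes): a particle hopping at random among $M$ discrete states visits all states equally often, so the time average of any observable along one long trajectory approaches the uniform ensemble average.

```python
# Toy ergodicity check: time average along one trajectory vs ensemble average.
import numpy as np

rng = np.random.default_rng(6)
M = 8
A = rng.normal(size=M)                # an arbitrary observable A(X) on M states
ensemble_avg = A.mean()               # average over the uniform distribution

steps = 400_000
traj = rng.integers(0, M, size=steps)    # an (trivially ergodic) random trajectory
time_avg = A[traj].mean()                # time average, Eq. (1.17) discretized

assert abs(time_avg - ensemble_avg) < 0.01
```

For a genuine dynamical system the trajectory is deterministic and the equality is far less obvious; here the hopping is independent at each step, which makes ergodicity automatic.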

Lecture 2. Equilibrium ensembles

2.1 Canonical ensemble

A more practical equilibrium distribution is derived for systems connected to a thermostat in such a way that the system and the thermostat can exchange energy. The exact total energy of the system therefore fluctuates. Let us suppose that the internal energy of the system is fixed, $U = \langle E(X)\rangle = \text{const}$. Such an ensemble is called the canonical ensemble, or the Gibbs ensemble.

The Gibbs distribution is derived using the same definition of entropy, Eq. (1.11), and requiring the maximum entropy value. However, now there is one more constraint to be fulfilled in addition to the distribution normalization: conservation of the internal energy. The corresponding variational problem therefore contains two Lagrange multipliers, $\alpha$ and $\beta$,

$$\delta\left[S - \alpha\langle 1\rangle - \beta\langle E\rangle\right] = 0. \qquad (2.1)$$

Going through a calculation similar to that of the previous section, we derive the following equation for the Gibbs distribution,

$$-k\ln f(X) - k - \alpha - \beta E(X) = 0. \qquad (2.2)$$

The solution has an exponential form,

$$f(X) = \exp\left(-\frac{k+\alpha}{k} - \frac{\beta E(X)}{k}\right). \qquad (2.3)$$

The connection between the constants in this solution and the thermodynamic parameters of the system is recovered by substituting the distribution Eq. (2.3) into the equation for the entropy,

$$S = -k\langle\ln f\rangle = (k+\alpha) + \beta U. \qquad (2.4)$$

This is to be compared with the thermodynamic relation $F(V,T) = U(V,S) - ST$ for the free energy $F$ (the Helmholtz free energy), giving $\beta = 1/T$, $k+\alpha = -F/T$. Hence the Gibbs distribution takes the form

$$f(X) = \exp\frac{F - E(X)}{kT}. \qquad (2.5)$$

Using the normalization condition, one connects the free energy to the partition function $Z$,

$$F(V,T) = -kT\ln Z, \qquad Z(V,T) = \int dX\, e^{-E(X)/kT}. \qquad (2.6)$$

This equation establishes the basis for the calculation of all macroscopic thermodynamical quantities from the microscopic Hamiltonian of the system, $H(X) = E(X)$.
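The machinery of Eq. (2.6) is easy to exercise on the simplest possible example, a two-level system with energies $0$ and $\epsilon$ (an assumed illustration, with the phase-space integral replaced by a sum over the two states and $k=1$ units). The sketch also checks the thermodynamic consistency $S = (U-F)/T$ against the Gibbs entropy of the occupation probabilities.

```python
# Canonical ensemble for a two-level system: Z, F = -kT ln Z, U = <E>.
import math

kT, eps = 1.0, 2.0                       # k = 1 units, so T = kT
Z = 1.0 + math.exp(-eps / kT)            # Z = sum over states of e^{-E/kT}
F = -kT * math.log(Z)                    # free energy, Eq. (2.6)
U = eps * math.exp(-eps / kT) / Z        # internal energy U = <E>

# Gibbs entropy of the two canonical occupation probabilities e^{(F-E)/kT}
probs = [1.0 / Z, math.exp(-eps / kT) / Z]
S_gibbs = -sum(p * math.log(p) for p in probs)

# thermodynamic identity F = U - TS  <=>  S = (U - F)/T
assert abs((U - F) / kT - S_gibbs) < 1e-12
assert F < 0.0                           # Z > 1, so F = -kT ln Z < 0 here
```

The identity holds exactly: with $p = e^{(F-E)/kT}$ one has $-\sum p\ln p = (U-F)/T$, which is just Eq. (2.4) rearranged.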

Let us apply the Gibbs distribution to an ideal gas. In this case, the total energy is the sum of the energies of the individual molecules, $E = \sum_i^N \epsilon_i$. Thus the Gibbs distribution factorizes,

$$f(\{\epsilon_i\}) = e^{F/kT}\prod_i e^{-\epsilon_i/kT}. \qquad (2.7)$$

Each factor in this product has the meaning of a (non-normalized) distribution function for an individual molecule. Assuming the molecular energy consists of the kinetic energy and the potential energy in an external field,

$$E = \sum_i^N\left(\frac{p_i^2}{2m} + U(\vec q_i)\right), \qquad (2.8)$$

we get

$$f(\vec p_i, \vec q_i) = C\exp\left(-\frac{p_i^2}{2mkT} - \frac{U(\vec q_i)}{kT}\right), \qquad (2.9)$$

where $C$ is a normalization constant. This is the Boltzmann distribution; Maxwell derived this distribution for $U = 0$.

We close this section with the remark that, since for large thermodynamical systems the energy fluctuation is negligibly small, calculations using the microcanonical and canonical ensembles give coinciding results (in the thermodynamic limit).

2.2 Grand canonical ensemble

Let us also allow the system to exchange particles with the thermostat, keeping the average number of particles constant, $\langle N\rangle = \text{const}$. The distribution for this system is known as the grand canonical ensemble. It can be derived (do it!) using the same arguments as before, adding the particle conservation constraint. The result reads

$$f(E, N) = \exp\frac{\Omega - E + \mu N}{kT}, \qquad (2.10)$$

where $\mu$ is the chemical potential, and $\Omega(V,T,\mu)$ is a thermodynamical potential related to the free energy via the Legendre transformation,

$$\Omega(V,T,\mu) = F(V,T,N) - \mu N. \qquad (2.11)$$

The difference between the thermodynamical potentials $F$ and $\Omega$ for systems with a variable number of particles is that $F$ depends on the average particle number, $dF = \ldots + \mu\,dN$, while $\Omega$ depends on the chemical potential, $d\Omega = \ldots - N\,d\mu$. Thus the equation $\partial\Omega/\partial\mu = -N$ in fact fixes the chemical potential for given $N$. The explicit equation for $\Omega$ through the partition function again follows from the normalization,

$$\Omega(V,T,\mu) = -kT\ln Z, \qquad Z(V,T,\mu) = \sum_N\int dX\, e^{(-E(X)+\mu N)/kT}, \qquad (2.12)$$

but the partition function now contains a sum over all particle numbers.

To explore the properties of the grand canonical ensemble, we apply it to the evaluation of the distribution function of a quantum ideal gas, which is given by the Fermi or Bose distribution rather than by the Boltzmann distribution Eq. (2.9). The difference from the classical case is that quantum particles are indistinguishable, so the factorized distribution cannot refer to individual particles. Another way to see this is to say that individual particles are not statistically independent. Instead, the occupations of individual quantum states are statistically independent, and it is possible to talk about average populations of individual quantum states.

Let us introduce the population $n_i$ of the $i$-th quantum state, and write the total energy and particle number of the gas in the form

$$E = \sum_i \epsilon_i n_i, \qquad N = \sum_i n_i, \qquad (2.13)$$

where $\epsilon_i$ is the energy of the $i$-th quantum state. The partition function has the form

$$Z = \sum_{\{n_i\}}\exp\left(-\sum_i\frac{(\epsilon_i-\mu)n_i}{kT}\right), \qquad (2.14)$$

where the sum is taken over all possible particle configurations $\{n_i\}$ occupying the quantum states. One can rewrite this sum in the equivalent form

$$\sum_{\{n_i\}} = \prod_i\sum_{n_i}. \qquad (2.15)$$

This allows us to factorize the partition function,

$$Z = \prod_i z_i, \qquad z_i = \sum_{n_i} e^{-(\epsilon_i-\mu)n_i/kT}, \qquad (2.16)$$

into a product of partial partition functions of the quantum states. Correspondingly, the thermodynamic potential of the gas consists of the contributions of the individual quantum states,

$$\Omega = \sum_i\Omega_i, \qquad \Omega_i = -kT\ln z_i. \qquad (2.17)$$

This allows us to define the average occupations of the quantum states, $\langle n_i\rangle$,

$$N = -\frac{\partial\Omega}{\partial\mu} = -\sum_i\frac{\partial\Omega_i}{\partial\mu} = \sum_i\langle n_i\rangle, \qquad \langle n_i\rangle = kT\frac{\partial}{\partial\mu}\ln z_i. \qquad (2.18)$$

Let us now evaluate the partial partition function in Eq. (2.16). In the case of Fermi particles, the occupation of a state can only be $n_i = 0, 1$, giving $z_i = 1 + e^{-(\epsilon_i-\mu)/kT}$. In the Bose case there is no constraint on the state occupation, thus $z_i = \left[1 - e^{-(\epsilon_i-\mu)/kT}\right]^{-1}$.

Taking this into account, we get for both cases

$$\langle n_i\rangle = \pm kT\frac{\partial}{\partial\mu}\ln\left[1\pm e^{-(\epsilon_i-\mu)/kT}\right] = \frac{1}{e^{(\epsilon_i-\mu)/kT}\pm 1}. \qquad (2.19)$$

The upper (lower) sign refers to Fermi (Bose) statistics.
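A short sketch of Eq. (2.19) (illustrative code; the function name and arguments are assumptions) checks the characteristic limits: the Fermi occupation is $1/2$ at $\epsilon = \mu$ and bounded by one, and both statistics reduce to the classical Boltzmann factor $e^{-(\epsilon-\mu)/kT}$ far above the chemical potential.

```python
# Average occupation of a quantum state, Eq. (2.19).
import math

def occupation(eps, mu, kT, statistics="fermi"):
    sign = +1 if statistics == "fermi" else -1   # upper sign: Fermi, lower: Bose
    return 1.0 / (math.exp((eps - mu) / kT) + sign)

# Fermi occupation equals 1/2 exactly at eps = mu
assert abs(occupation(0.0, 0.0, 1.0, "fermi") - 0.5) < 1e-12
# deep below mu the Fermi state is essentially filled (Pauli principle)
assert occupation(-10.0, 0.0, 1.0, "fermi") > 0.999
# classical (Boltzmann) limit for (eps - mu) >> kT, for both statistics
eps, mu, kT = 10.0, 0.0, 1.0
boltz = math.exp(-(eps - mu) / kT)
assert abs(occupation(eps, mu, kT, "fermi") - boltz) < 1e-6
assert abs(occupation(eps, mu, kT, "bose") - boltz) < 1e-6
```

The shared classical limit is the reason the Boltzmann distribution of Eq. (2.9) suffices for dilute gases, where all occupations are small.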

Lecture 3. Exploring the vicinity of equilibrium

3.1 Fluctuations around equilibrium

Consider some macroscopic variable $A(X)$ that is not exactly constant in equilibrium (for the microcanonical ensemble, for example, this cannot be the energy) but depends on the exact microscopic state. Many microscopic states contribute to the same value of the macroscopic variable; let us denote the corresponding area of the phase space by $\Omega(A)$. Since the microcanonical distribution function is constant over $\Gamma_E$, the probability of a certain value of the variable is proportional to the corresponding area, $P(A)\propto\Omega(A)$. Let us define a non-equilibrium entropy $S(A)$ similarly to the definition in Eq. (1.16); then

$$P(A)\propto e^{S(A)/k}. \qquad (3.1)$$

For a small deviation from the equilibrium value, $A = A_0 + x$, we can expand the entropy, remembering that it has a maximum at equilibrium,

$$S(A) = S_0 + \frac{1}{2}\frac{d^2S}{dA^2}x^2, \qquad \frac{d^2S}{dA^2} < 0. \qquad (3.2)$$

Introducing $g = -(1/k)(d^2S/dA^2)$, we write the probability density (distribution function) for a macroscopic fluctuation in the form

$$f(x) = Ce^{-(1/2)gx^2}. \qquad (3.3)$$

Thus the fluctuation of a macroscopic variable around equilibrium has a Gaussian form. This result is due to Einstein. The constant $C$ in Eq. (3.3) is defined by the normalization condition, $C = \sqrt{g/2\pi}$; the first moment is zero, while the second moment is $\langle x^2\rangle = 1/g$.

The Gaussian distribution has the remarkable property that all higher-order moments are expressed through the second one, i.e. through $g$. To see this, we use a method that is very useful for the calculation of moments and correlation functions of several variables. Let us introduce the generating function

$$J(h) = \frac{\int dx\,\exp\left(-\frac{1}{2}gx^2 + hx\right)}{\int dx\,\exp\left(-\frac{1}{2}gx^2\right)}. \qquad (3.4)$$

Then any moment can be calculated using the relation

$$\langle x^n\rangle = \left.\frac{d^nJ}{dh^n}\right|_{h=0}. \qquad (3.5)$$

Consider the transformation $-(1/2)gx^2 + hx = -(1/2)g(x - h/g)^2 + h^2/2g$. Changing the variable to $y = x - h/g$, we find that the integrals in the numerator and denominator of Eq. (3.4) cancel, giving the result

$$J(h) = e^{h^2/2g}. \qquad (3.6)$$
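As a numerical aside (an assumed Monte Carlo check, not part of the derivation): expanding $J(h) = e^{h^2/2g}$ gives $\langle x^2\rangle = 1/g$ and $\langle x^4\rangle = 3/g^2$, which can be compared with direct averages over samples of the distribution (3.3).

```python
# Moments of the Gaussian fluctuation distribution f(x) ~ exp(-g x^2 / 2).
import numpy as np

g = 2.0
rng = np.random.default_rng(3)
x = rng.normal(0.0, np.sqrt(1.0 / g), size=1_000_000)

assert abs(x.mean()) < 5e-3                      # first moment vanishes
assert abs((x**2).mean() - 1.0 / g) < 5e-3       # <x^2> = 1/g
assert abs((x**4).mean() - 3.0 / g**2) < 1.5e-2  # <x^4> = 3/g^2, fixed by g alone
```

The last assertion illustrates the statement that all even moments are determined entirely by $g$.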

It is clear from this equation that indeed $\langle x^2\rangle = 1/g$, and that all (even) higher-order moments depend entirely on $g$.

This derivation can be extended to the case of several variables,

$$P(A_1,\ldots,A_n)\propto e^{S(A_1,\ldots,A_n)/k}. \qquad (3.7)$$

Expanding the entropy with respect to small deviations from equilibrium, $x_1,\ldots,x_n$, and introducing the matrix

$$g_{ik} = -\frac{1}{k}\frac{\partial^2S}{\partial A_i\partial A_k}, \qquad g_{ik} = g_{ki}, \qquad (3.8)$$

we have $S(A_1,\ldots,A_n) = S_0 - (k/2)\,g_{ik}x_ix_k$. The probability density has the form

$$f(x_1,\ldots,x_n) = C\exp\left(-\frac{1}{2}g_{ik}x_ix_k\right). \qquad (3.9)$$

Now the normalization constant reads $C = \sqrt{\det g/(2\pi)^n}$, and for the second-order correlation function we have

$$\langle x_ix_j\rangle = (g^{-1})_{ij}. \qquad (3.10)$$

This result is easily obtained by using the generating function method (check!). In the following we shall use the so-called thermodynamic forces, defined as

$$X_i = -\frac{1}{k}\frac{\partial S}{\partial x_i} = g_{ij}x_j. \qquad (3.11)$$

Using the result of Eq. (3.10), we can find the correlation functions between the fluctuating variables and the thermodynamic forces,

$$\langle X_ix_k\rangle = g_{ij}(g^{-1})_{jk} = \delta_{ik}. \qquad (3.12)$$

3.2 Time correlation of fluctuations

Let us consider the evolution of small deviations from equilibrium, $x_i(t)$, in time. This evolution is characterized by a correlation function

$$\langle x_i(t)x_i(t')\rangle = \phi_i(t-t'). \qquad (3.13)$$

This correlation function depends only on the difference of the times, because in equilibrium the distribution is stationary. The time ordering in this equation does not matter, since $x_i$ is a classical (commuting) variable, thus $\phi_i(t-t') = \phi_i(t'-t)$. Similarly, we can introduce the time cross-correlation function

$$\langle x_i(t)x_j(t')\rangle = \phi_{ij}(t-t'). \qquad (3.14)$$

This correlation function has the obvious property

$$\phi_{ij}(t-t') = \phi_{ji}(t'-t). \qquad (3.15)$$

There is a more fundamental symmetry property of the cross-correlation function, related to the microscopic time-reversal symmetry:

$$\phi_{ij}(t-t') = \phi_{ij}(t'-t). \qquad (3.16)$$

One would expect a deviation from equilibrium to evolve towards equilibrium. But how, then, can a deviation from equilibrium spontaneously appear? As Paul Ehrenfest argued, a spontaneous deviation from equilibrium within the fluctuation region, $x \lesssim \sqrt{\operatorname{var}x}$, is due to microscopic processes, which are time-reversible. Thus the evolution of a fluctuation forward and backward in time is symmetric. Combining the two symmetry relations in Eqs. (3.15) and (3.16), we get

$$\phi_{ij}(\tau) = \phi_{ji}(\tau). \qquad (3.17)$$

This relation is valid for quantities that are themselves time-reversible (e.g. a position, not a velocity), and which evolve according to time-reversal-invariant Hamiltonians (e.g. not containing a magnetic field). If this is not the case, the symmetry relation must be appropriately generalized.

3.3 Evolution towards equilibrium

Let us now consider a large deviation from equilibrium, one that exceeds the fluctuation region, $x \gg \sqrt{\operatorname{var}x}$. In this case the fluctuation is negligible (self-averaging) and we need not distinguish $x$ and $\langle x\rangle$. The system will then tend to evolve towards equilibrium, so that at $t\to\infty$, $x_i = 0$ and $\dot x_i = 0$. The simplest imaginable equation describing such an evolution is a linear one, which for a single variable reads

$$\dot x(t) = -\lambda x(t), \qquad x(t) = x(0)\,e^{-\lambda t}. \qquad (3.18)$$

This kinetic equation describes exponential relaxation towards equilibrium, with a relaxation rate given by the constant $\lambda$. It is a phenomenological equation that can only be justified by comparison with experiment. Indeed, this type of evolution is common for chemical reactions, and also for many condensed matter problems, as we will see. It is relevant for weakly non-equilibrium states and for slow relaxation processes. In principle, there are more complex kinetic equations, e.g. non-linear ones, or integral equations containing time memory.
The derivation of macroscopic kinetic equations, including the values of the relaxation rates, must be carried out on more fundamental, microscopic grounds.

For several variables, the linear kinetic equation looks similar,

$$\dot x_i(t) = -\lambda_{ij}x_j(t), \qquad (3.19)$$

and its exponential solutions are now characterized by a set of relaxation rates given by the eigenvalues of the matrix $\lambda_{ij}$. More commonly this kinetic equation is written using the thermodynamic forces introduced in Eq. (3.11). Inverting that equation, we get

$$\dot x_i(t) = -\gamma_{ij}X_j(t), \qquad \gamma_{ij} = \lambda_{im}(g^{-1})_{mj}. \qquad (3.20)$$

This is the canonical form of the kinetic equation of non-equilibrium thermodynamics. It says that the thermodynamic forces generate thermodynamic flows ($\dot x$); the proportionality coefficients $\gamma_{ij}$ are called the kinetic or transport coefficients. Typical examples of thermodynamic forces are gradients of temperature or chemical potential, while examples of flows are the thermal current and the mass current. The corresponding kinetic coefficients are the thermal conductivity and the diffusion coefficient. The kinetic equation can be written in an equivalent form resembling a Hamiltonian equation,

$$\dot x_i(t) = \frac{1}{k}\gamma_{ij}\frac{\partial S}{\partial x_j}. \qquad (3.21)$$

However, this similarity is misleading: the Hamiltonian evolution and the one given by the kinetic equation are qualitatively different; the principal difference concerns time reversibility.

The kinetic equation Eq. (3.20) is a purely phenomenological one: neither the kinetic coefficients nor the region of applicability of the equation is known for any particular system. Theoretical verification of this equation can only be done on a microscopic level, and this is the central task of the Boltzmann transport theory. Amazingly, one general property of the kinetic coefficients can nevertheless be established without appealing to a microscopic theory: this is the famous symmetry relation formulated by Onsager,

$$\gamma_{ij} = \gamma_{ji}. \qquad (3.22)$$

This symmetry is a consequence of the microscopic time reversibility expressed by Eq. (3.17). To prove it, let us continue our kinetic equation into the fluctuation region. Here we have to remember that, rigorously speaking, our equation is valid for the averages $\langle x\rangle$ and $\langle X\rangle$. However, because of the linear form of the equation and the stationarity of the equilibrium distribution, the same equation is valid for the exact fluctuating quantities. Then we write the time correlation function Eq. (3.17),

$$\langle x_i(\tau)x_j(0)\rangle = \langle x_j(\tau)x_i(0)\rangle, \qquad (3.23)$$

and apply the time derivative to both sides of the equation,

$$\langle \dot x_i(\tau)x_j(0)\rangle = \langle \dot x_j(\tau)x_i(0)\rangle. \qquad (3.24)$$

With the help of the kinetic equation Eq. (3.20) we eliminate the time derivatives,

$$\gamma_{ik}\langle X_k(\tau)x_j(0)\rangle = \gamma_{jk}\langle X_k(\tau)x_i(0)\rangle. \qquad (3.25)$$

Now we put $\tau = 0$ and make use of the relation Eq. (3.12), which is valid for coinciding times,

$$\gamma_{ik}\delta_{kj} = \gamma_{jk}\delta_{ki}. \qquad (3.26)$$

This proves Onsager's symmetry relation, Eq. (3.22).
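The logic of this proof can be run in reverse numerically (an illustrative sketch with assumed $3\times 3$ matrices): for the linear relaxation $x(\tau) = e^{-\lambda\tau}x(0)$ with $\lambda = \gamma g$ and equal-time correlations $\langle x_ix_j\rangle = (g^{-1})_{ij}$, the two-time correlation matrix is $\phi(\tau) = e^{-\lambda\tau}g^{-1}$, and it is symmetric, i.e. satisfies Eq. (3.17), precisely when $\gamma$ is symmetric.

```python
# Onsager symmetry check: symmetric gamma  =>  phi_ij(tau) = phi_ji(tau).
import numpy as np

def mat_exp(M):
    # matrix exponential via eigendecomposition (enough for this sketch)
    w, V = np.linalg.eig(M)
    return (V @ np.diag(np.exp(w)) @ np.linalg.inv(V)).real

rng = np.random.default_rng(4)
A = rng.normal(size=(3, 3))
g = A @ A.T + 3.0 * np.eye(3)          # positive-definite stability matrix g_ij
gamma = np.array([[2.0, 0.5, 0.1],
                  [0.5, 1.5, 0.3],
                  [0.1, 0.3, 1.0]])    # symmetric kinetic coefficients
lam = gamma @ g                        # lambda = gamma g, from gamma = lambda g^-1
tau = 0.7
phi = mat_exp(-lam * tau) @ np.linalg.inv(g)   # <x_i(tau) x_j(0)>
assert np.allclose(phi, phi.T)                 # Eq. (3.17) holds
```

Algebraically this is the identity $(\gamma g)^n g^{-1} = g^{-1}(g\gamma)^n$ applied term by term to the exponential series, which requires both $g$ and $\gamma$ to be symmetric.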

Lecture 4. Stochastic processes

4.1 Definitions

We now change the scope of the discussion and approach the problem from the microscopic side. A good example of a system to keep in mind is a classical gas of hard particles scattered elastically by randomly distributed fixed impurities, see Fig. 1. We neglect scattering between the particles, so it suffices to consider the trajectory of a single particle. If we knew the exact positions of the impurity scatterers, and the exact initial position and momentum of our particle, we could, in principle, recover the trajectory. Without such precise knowledge, however, the trajectory looks completely chaotic, and we should think of the particle motion as a random process.

Let us focus on the momentum of the particle. In each collision event the absolute value of the momentum is conserved, but the momentum direction changes randomly. If we are interested in the calculation of single-time moments, $\langle p^n(t)\rangle$, we only need to know the probability $P(\vec p, t)$ for the particle to have momentum $\vec p$ at the time moment $t$. However, if we are interested in a two-time correlation function, $\langle\vec p(t_1)\vec p(t_2)\rangle$, we need to know more about the stochastic process, namely the joint probability $P(\vec p_1,t_1;\vec p_2,t_2)$.

Figure 1: Elastic scattering by pins.

A full description of a stochastic process during a certain time period, $t_1 < t_2 < \ldots < t_n$, is given by the joint probability

$$P(\vec p_1,t_1;\vec p_2,t_2;\ldots;\vec p_n,t_n). \qquad (4.1)$$

This is equivalent to defining a probability distribution on the space of all possible trajectories, $P(\{\vec p(t)\})$. Each trajectory is called a realization of the stochastic process. Any averages are then to be calculated using the rule

$$\langle\ldots\rangle = \sum_{\{\vec p(t)\}}P(\{\vec p(t)\})\,(\ldots), \qquad (4.2)$$

where the summation is done over all the realizations. In the case of continuous realizations, the sum over realizations is replaced by a functional integral,

$$\langle\ldots\rangle = \int Dx(t)\,P(\{x(t)\})\,(\ldots). \qquad (4.3)$$

To make our discussion more general, we drop the momentum notation and consider some continuous random variable $x(t)$, characterized by probability densities. We can define reduced probability densities, e.g. the $(n-1)$-point probability density

$$f(x_1,t_1;\ldots;x_{n-1},t_{n-1}) = \int dx_n\, f(x_1,t_1;\ldots;x_{n-1},t_{n-1};x_n,t_n). \qquad (4.4)$$

Further, we introduce a conditional probability density, $F(x_1,t_1;\ldots;x_{n-1},t_{n-1}|x_n,t_n)$, which defines the probability density of the value $x_n$ at the moment $t_n$, provided the values $x_1,\ldots,x_{n-1}$ at the moments $t_1,\ldots,t_{n-1}$ have been realized with certainty. The formal definition of the conditional probability is

$$f(x_1,t_1;\ldots;x_n,t_n) = F(x_1,t_1;\ldots;x_{n-1},t_{n-1}|x_n,t_n)\, f(x_1,t_1;\ldots;x_{n-1},t_{n-1}). \qquad (4.5)$$

From this we deduce the following useful relation,

$$f(x_n,t_n) = \int\!\ldots\!\int dx_1\ldots dx_{n-1}\, F(x_1,t_1;\ldots;x_{n-1},t_{n-1}|x_n,t_n)\, f(x_1,t_1;\ldots;x_{n-1},t_{n-1}). \qquad (4.6)$$

The conditional probability density obviously satisfies the normalization equation

$$\int dx_n\, F(x_1,t_1;\ldots;x_{n-1},t_{n-1}|x_n,t_n) = 1. \qquad (4.7)$$

In physics we often deal with simpler stochastic processes called Markov processes. A Markov process is defined by the relation

$$F(x_1,t_1;\ldots;x_{n-1},t_{n-1}|x_n,t_n) = F(x_{n-1},t_{n-1}|x_n,t_n), \qquad (4.8)$$

i.e. the conditional probability depends only on the realization value at the previous moment of time, not on all earlier values. A Markov process is therefore often referred to as a process without memory. Knowledge of the two-time conditional probabilities completely defines a Markov process. Indeed, suppose we know $f(x_1,t_1)$; then we know all the joint probabilities,

$$f(x_1,t_1;x_2,t_2) = F(x_1,t_1|x_2,t_2)\,f(x_1,t_1),$$
$$f(x_1,t_1;x_2,t_2;x_3,t_3) = F(x_2,t_2|x_3,t_3)\,f(x_1,t_1;x_2,t_2) = F(x_2,t_2|x_3,t_3)\,F(x_1,t_1|x_2,t_2)\,f(x_1,t_1), \qquad (4.9)$$

and so forth.

4.2 Master equation

Continuous Markov processes can be defined through a differential equation known as the master equation.
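Before the continuum case, the chain rule (4.9) is worth seeing concretely. The sketch below assumes a discrete two-state, two-step Markov chain (a toy model, not from the notes): all three-time joint probabilities are built from the one-step conditional probabilities alone.

```python
# Markov chain rule, Eq. (4.9), for a two-state process.
import numpy as np

F = np.array([[0.9, 0.1],     # F[a, b]: conditional probability of a -> b
              [0.3, 0.7]])    # each row sums to one, as in Eq. (4.7)
f1 = np.array([0.5, 0.5])     # single-time distribution f(x1, t1)

# joint[a, b, c] = F(b|c) F(a|b) f(a) -- the Markov chain rule
joint = np.einsum('a,ab,bc->abc', f1, F, F)
assert abs(joint.sum() - 1.0) < 1e-12         # properly normalized

# marginalizing the middle time (Eq. (4.4)) gives the two-step transition F @ F
f13 = joint.sum(axis=1)
assert np.allclose(f13, f1[:, None] * (F @ F))
```

The second assertion is the discrete analogue of Eq. (4.6): summing out the intermediate value composes the one-step conditional probabilities into a two-step one.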
Master equations play a central role in non-equilibrium statistical physics. Consider two close moments of time, $t$ and $t+\Delta t$. For many physical systems the conditional probability to make a transition from $x_1$ to $x_2\neq x_1$ is small for small time differences. Suppose it is proportional to $\Delta t$. Then we can introduce a transition rate $W(x_1,x_2,t)$,

$$F(x_1,t|x_2,t+\Delta t) = W(x_1,x_2,t)\,\Delta t, \qquad x_1\neq x_2. \qquad (4.10)$$

The small probability to leave the point $x_1$ implies that the probability to stay at the point $x_1$ is large. To find it we use Eq. (4.7) for $n=2$,

$$F(x_1,t|x_1,t+\Delta t) + \int dx_2\,W(x_1,x_2,t)\,\Delta t = 1. \qquad (4.11)$$

Let us write Eq. (4.6) for $n=2$,

$$f(x_2,t+\Delta t) = \int dx_1\,F(x_1,t|x_2,t+\Delta t)\,f(x_1,t), \qquad (4.12)$$

and substitute the transition rates according to Eq. (4.10). Then we get

$$f(x_2,t+\Delta t) = F(x_2,t|x_2,t+\Delta t)\,f(x_2,t) + \int dx_1\,W(x_1,x_2,t)\,\Delta t\,f(x_1,t). \qquad (4.13)$$

We eliminate the first term by using Eq. (4.11) with the indices $1\leftrightarrow 2$ interchanged; then we get

$$f(x_2,t+\Delta t) = \left[1 - \int dx_1\,W(x_2,x_1,t)\,\Delta t\right]f(x_2,t) + \int dx_1\,W(x_1,x_2,t)\,\Delta t\,f(x_1,t). \qquad (4.14)$$

Moving the term $f(x_2,t)$ to the left-hand side, dividing by $\Delta t$, and taking the limit $\Delta t\to 0$, we obtain a differential equation describing the time evolution of a Markovian distribution function, the master equation:

$$\frac{\partial f(x,t)}{\partial t} = -\int dx_1\left[W(x,x_1,t)\,f(x,t) - W(x_1,x,t)\,f(x_1,t)\right]. \qquad (4.15)$$

This equation has the typical form of a conservation equation: the change of the probability function is determined by the balance between an outflux, transitions from the point of interest $x$ to all other points (first term on the right-hand side), and an influx, transitions to the point $x$ from all other points.

4.3 H-theorem

In general, the master equation is a complex integro-differential equation. A stationary solution exists only if the transition rates are time-independent,

$$W(x,x_1)\,f_0(x) - W(x_1,x)\,f_0(x_1) = 0. \qquad (4.16)$$

The time evolution of an arbitrary initial distribution can be roughly guessed by formally ignoring the influx term; this model is called the relaxation time approximation. The equation then takes a differential form,

$$\frac{\partial\,\delta f(x,t)}{\partial t} = -\frac{1}{\tau(x)}\,\delta f(x,t), \qquad \frac{1}{\tau(x)} = \int dx_1\,W(x,x_1). \qquad (4.17)$$

The solution to this equation has an exponential form, resembling the relaxation of macroscopic variables according to the kinetic equation discussed earlier,

$$\delta f(x,t) = \delta f(x,0)\,e^{-t/\tau(x)}. \qquad (4.18)$$
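A discrete-state version of Eq. (4.15) makes the gain-loss structure explicit. The sketch below assumes three states with symmetric rates and integrates the master equation with a simple Euler step; probability is conserved exactly, and the distribution relaxes to the uniform stationary solution.

```python
# Discrete master equation: df_a/dt = sum_b [W_ba f_b - W_ab f_a].
import numpy as np

W = np.array([[0.0, 1.0, 0.5],    # W[a, b]: transition rate a -> b
              [1.0, 0.0, 0.2],
              [0.5, 0.2, 0.0]])   # symmetric rates, W_ab = W_ba
f = np.array([1.0, 0.0, 0.0])     # start far from equilibrium
dt = 0.01
for _ in range(5000):
    gain = W.T @ f                # influx:  sum_b W_ba f_b
    loss = f * W.sum(axis=1)      # outflux: f_a sum_b W_ab
    f = f + dt * (gain - loss)

assert abs(f.sum() - 1.0) < 1e-9              # probability is conserved
assert np.allclose(f, 1.0 / 3.0, atol=1e-3)   # symmetric W: uniform stationary f
```

Note that the actual relaxation is multi-exponential (here with two distinct rates, the nonzero eigenvalues of the gain-loss matrix); the single-rate form (4.18) is the approximation obtained by dropping the gain term.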

The relaxation time τ is now expressed through a microscopic characteristic of the system - the transition rate. At this point we can formulate two questions: How can we be sure that keeping the influx term will not qualitatively change the evolution? And what connection does this mathematical exercise have to thermodynamics and to relaxation towards equilibrium? The positive answer to both questions is given by the so-called H-theorem discovered by Boltzmann. The theorem states that evolution according to the master equation gives rise to an increase of the Gibbs entropy, which achieves its maximum value for the stationary distribution. Thus this equation has a physical meaning and indeed describes the evolution of a non-equilibrium system towards thermodynamic equilibrium.

We prove the H-theorem for our initial system - an ideal gas with impurity scattering. For this physical realization, an important symmetry property of the transition rates holds (here we suppress vector notation):

    W(p, p_1) = W(p_1, p).  (4.19)

This property is called the detailed balance equation, and it is a consequence of the microscopic time reversibility of the scattering process: the transition rate from state p to state p_1 is the same as the reversed transition rate. Additionally, due to conservation of the particle energy during the scattering event, the transition rate must contain an energy conservation factor, W(p, p_1) = w(p, p_1) δ(ε − ε_1). Thus, the master equation can be written in the form,

    ∂f(p, t)/∂t = − ∫ dp_1 w(p, p_1) δ(ε − ε_1) [f(p, t) − f(p_1, t)].  (4.20)

The rate of entropy production directly follows from the definition of the Gibbs entropy,

    S(t) = −k ∫ dp f(p, t) ln f(p, t),   Ṡ(t) = −k ∫ dp ḟ(p, t) [ln f(p, t) + 1].  (4.21)

The time derivative here is eliminated by using Eq. (4.20) (we skip the time argument),

    Ṡ = −k ∫ dp ∫ dp_1 W(p, p_1) [f(p_1) − f(p)] [ln f(p) + 1].  (4.22)

By interchanging the momenta, p ↔ p_1, and using the symmetry of W, we transform this equation to the form,

    Ṡ = −(1/2) k ∫ dp dp_1 W(p, p_1) [f(p_1) − f(p)] [ln f(p) − ln f(p_1)].  (4.23)

The sign of the integral depends on the factor,

    [f(p_1) − f(p)] [ln f(p) − ln f(p_1)] = f(p_1) (1 − z) ln z,   z = f(p)/f(p_1).  (4.24)

It is easy to check that the function (1 − z) ln z ≤ 0 is never positive, and that its maximum value, zero, is achieved at z = 1.
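The monotonic growth of S can be checked directly. In this sketch (my illustration, with an arbitrary symmetric rate matrix standing in for w(p, p_1) δ(ε − ε_1)), the discrete analogue of Eq. (4.20) is integrated and the Gibbs entropy is monitored along the trajectory:

```python
import math
import random

# Discrete analogue of Eq. (4.20): symmetric rates W[i][j] = W[j][i]
# (detailed balance), with arbitrary values. The Gibbs entropy
# S = -sum_i f_i ln f_i must grow monotonically (H-theorem).
random.seed(1)
n = 4
W = [[0.0] * n for _ in range(n)]
for i in range(n):
    for j in range(i + 1, n):
        W[i][j] = W[j][i] = random.uniform(0.1, 1.0)

def step(f, dt):
    return [f[i] + dt * sum(W[i][j] * (f[j] - f[i]) for j in range(n))
            for i in range(n)]

def entropy(f):
    return -sum(p * math.log(p) for p in f if p > 0)

f = [0.7, 0.2, 0.05, 0.05]      # arbitrary non-equilibrium start
history = [entropy(f)]
for _ in range(2000):
    f = step(f, 1e-3)
    history.append(entropy(f))

# S never decreases along the trajectory (up to rounding)
print(all(b >= a - 1e-12 for a, b in zip(history, history[1:])))
```

For a symmetric rate matrix the Euler update is a doubly stochastic map, so the entropy increase here is exact, not merely a numerical observation.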

Thus we conclude that during the evolution governed by the master equation the entropy grows, Ṡ ≥ 0, and that the equilibrium distribution satisfies the equation f_0(p) = f_0(p_1). Because of energy conservation, this can be written in the form,

    f_0(ε, n) = f_0(ε, n_1),   p = p n,  (4.25)

where n is the unit vector along the momentum. In other words, the equilibrium distribution is isotropic. The physical interpretation of this result is quite natural: multiple scattering by impurities tends to erase any anisotropy of an initial distribution, while the energy dependence is not affected, because of the microscopic energy conservation.

From this example we see that the particular form of an equilibrium distribution depends on the kind of scattering. For this system - an ideal gas with impurities - the true equilibrium given by the Gibbs distribution is never reached, because the particles do not exchange their energies. In this case we can only talk about a partial (local) equilibrium.

The H-theorem ensures that the system relaxes to an equilibrium in the thermodynamic sense, but the evolution seldom has a simple exponential form with a specific relaxation time. Usually it is a more complex functional dependence, which can be roughly characterized by some typical relaxation time (relaxation time approximation). There is nevertheless a physical example of an exponentially evolving system. If the impurity scatterers are spherically symmetric, the scattering depends only on the angle between the incoming and outgoing directions, w(p, p_1) = w(ε, n·n_1). Furthermore, the scattering can often be approximated as isotropic, i.e. w(ε, n·n_1) = w(ε). In this case, our master equation Eq. (4.20) takes the form,

    ∂f(ε, n, t)/∂t = − ∫ dO_1 ν(ε) w(ε) [f(ε, n, t) − f(ε, n_1, t)],  (4.26)

where we have performed the integration over the energy ε_1 and introduced the density of states ν(ε) = m√(2mε); the integration now goes over the solid angle in momentum space, dO = sin θ dθ dφ. Let us consider a purely anisotropic (not necessarily small!) deviation from equilibrium, f(ε, n, t) = f_0(ε) + δf(ε, n, t), with ∫ dO δf = 0; for this deviation the kinetic equation reads,

    ∂δf(ε, n, t)/∂t = −4π ν(ε) w(ε) δf(ε, n, t).  (4.27)

It describes an exponential relaxation with an energy-dependent relaxation time τ(ε) = 1/(4π ν(ε) w(ε)). What is important here is that a macroscopic kinetic characteristic of the system can be related to its microscopic characteristics.

5 Lecture: Boltzmann equation

5.1 Binary scattering

Let us consider another important scattering process: binary scattering of molecules in an ideal gas, see Fig. 2. Similarly to the previous lecture, we look at this process as a chain of random transitions between two-particle states, p_1, p_2 → p_3, p_4. Such a process is characterized by the two-particle distribution function f^(2)(p_1, p_2, t). Assuming the Markov property of the process, we introduce transition rates W(p_1, p_2; p_3, p_4; t). For a stationary process the rates are time-independent; moreover, they possess an obvious kinematic symmetry, W(p_1, p_2; p_3, p_4) = W(p_2, p_1; p_4, p_3). Time reversal symmetry of the collision process defines another symmetry - the detailed balance principle, W(p_1, p_2; p_3, p_4) = W(p_3, p_4; p_1, p_2). Finally, momentum and energy are conserved during the collision event, which is reflected by the delta-function factors in the collision rates,

    W(p_1, p_2; p_3, p_4) = w(p_1, p_2; p_3, p_4) δ(p_1 + p_2 − p_3 − p_4) δ(ε_1 + ε_2 − ε_3 − ε_4),  (5.1)

where ε = p²/2m. Using these definitions we can write down a master equation for the two-particle distribution function,

    ∂f^(2)(p_1, p_2, t)/∂t = ∫ dp_3 dp_4 W(p_1, p_2; p_3, p_4) [f^(2)(p_3, p_4, t) − f^(2)(p_1, p_2, t)].  (5.2)

Similarly to the case of impurity scattering, this scattering integral has outgoing and incoming parts.

Figure 2: Binary scattering (incoming momenta p_1, p_2; outgoing momenta p_3, p_4).

The two-particle distribution function is relevant for the evaluation of two-particle correlation functions. However, in practice we commonly calculate single-particle averages. Therefore it is desirable to derive a master equation for a simpler object - the single-particle distribution function. Using the relation

    f(p_1, t) = ∫ dp_2 f^(2)(p_1, p_2, t),  (5.3)

we could try to simplify Eq. (5.2); however, it is generally not possible to reduce the distribution functions in the integrand on the right-hand side, because of the essential momentum dependence of the transition rates. To overcome this difficulty we make a crucial assumption about the factorization of the two-particle distribution function,

    f^(2)(p_1, p_2, t) = f(p_1, t) f(p_2, t).  (5.4)

Then the master equation takes the form,

    ∂f(p_1, t)/∂t = ∫ dp_2 dp_3 dp_4 W(p_1, p_2; p_3, p_4) [f(p_3, t) f(p_4, t) − f(p_1, t) f(p_2, t)].  (5.5)

The motivation for such a factorization comes from the assumption of statistical independence of the particles between the collisions, since they do not interact. However, a rigorous justification for the factorization does not exist. There are two reasons to believe that this equation correctly describes physical systems: (i) its stationary solution corresponds to the Maxwell distribution, (ii) similarly to the case of impurity scattering, the binary scattering integral obeys the H-theorem.

Let us look for the stationary solution of Eq. (5.5). It corresponds to a vanishing square bracket in the integrand, i.e.

    f(p_3) f(p_4) = f(p_1) f(p_2).  (5.6)

It is easy to check that an exponential function, f(p) = C e^{−ε/kT}, satisfies this functional equation due to the energy conservation law in Eq. (5.1). Moreover, due to the momentum conservation, one can find a more general form of the stationary solution,

    f(p) = C exp[−(ε − p·V)/kT].  (5.7)

This solution has a simple physical meaning: it corresponds to the Maxwell distribution in a reference frame moving with velocity V. Indeed, since binary scattering does not change the total momentum of the gas, the physical equilibrium should not depend on the choice of the reference frame, p → p − mV, i.e. it respects Galilean invariance. Note that this is different from impurity scattering, where the impurities fix the laboratory reference frame.
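The approach to the stationary solution can be watched in a toy direct simulation (my construction, not an algorithm from the notes): pairs of equal-mass particles are picked at random and collide elastically, the relative velocity being scattered into a random direction. Each collision conserves momentum and energy exactly, as required by Eq. (5.1), and an initially non-Maxwellian distribution (all particles with the same speed) relaxes to a Maxwell distribution, Gaussian in each velocity component:

```python
import math
import random

# Toy simulation (illustrative): random elastic binary collisions of
# equal-mass particles. In the center-of-mass frame the relative
# velocity is rotated into a random direction, which conserves the
# total momentum and the total kinetic energy of the pair.
random.seed(2)
N = 4000

def random_dir():
    """Uniform random unit vector in 3D."""
    z = random.uniform(-1.0, 1.0)
    phi = random.uniform(0.0, 2.0 * math.pi)
    s = math.sqrt(1.0 - z * z)
    return (s * math.cos(phi), s * math.sin(phi), z)

# far from Maxwell: every particle has the same speed, random direction
v = [random_dir() for _ in range(N)]
E0 = sum(x * x + y * y + z * z for x, y, z in v)

for _ in range(80000):
    i, j = random.sample(range(N), 2)
    cm = tuple((a + b) / 2 for a, b in zip(v[i], v[j]))  # center of mass
    g = math.dist(v[i], v[j])                            # relative speed
    d = random_dir()
    v[i] = tuple(c + g * u / 2 for c, u in zip(cm, d))
    v[j] = tuple(c - g * u / 2 for c, u in zip(cm, d))

E1 = sum(x * x + y * y + z * z for x, y, z in v)
vx = [w[0] for w in v]
m2 = sum(x * x for x in vx) / N
m4 = sum(x ** 4 for x in vx) / N
print(m4 / m2 ** 2)   # kurtosis: 1.8 initially, approaches 3 (Gaussian)
```

The kurtosis ⟨v_x⁴⟩/⟨v_x²⟩² of a single velocity component serves as a crude "distance to Maxwell": it starts at 9/5 for the fixed-speed initial state and settles near the Gaussian value 3.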
The derivation of the H-theorem is completely analogous to the calculation done in the previous section: we start with the equation for the entropy production rate, eliminate the time derivative by means of the master equation, and then transform the integrand using the symmetry properties of the transition rates,

    [f(p_3, t) f(p_4, t) − f(p_1, t) f(p_2, t)] ln f(p_1)
    → (1/2) [f(p_3, t) f(p_4, t) − f(p_1, t) f(p_2, t)] [ln f(p_1) + ln f(p_2)]
    → (1/4) [f(p_3, t) f(p_4, t) − f(p_1, t) f(p_2, t)] [ln{f(p_1) f(p_2)} − ln{f(p_3) f(p_4)}].

The last function is never positive, so Ṡ ≥ 0, and it reaches its maximum value, zero, when the distribution is given by Eq. (5.7).

It is important to stress that the master equation for the binary gas gives an independent, and even more general, argument in favor of the Maxwell distribution as the description of equilibrium: it shows that an arbitrary non-equilibrium distribution evolves to a unique form, which is the stationary point of this equation. If we did not know anything about equilibrium statistical ensembles, we could derive the Maxwell distribution from the master equation.

5.2 Boltzmann equation

So far we have considered the evolution of physical systems in terms of completely stochastic processes. However, this is not the whole story: during the periods of time between the collisions, the particle motion is completely deterministic. Moreover, the important case of spatial inhomogeneity has been ignored. We repair this drawback by using the following, still phenomenological, argument. Deterministic particle dynamics in the absence of random scattering is described by the Liouville equation,

    df(r, p, t)/dt = ∂f(r, p, t)/∂t + v·(∂f(r, p, t)/∂r) − (dU(r)/dr)·(∂f(r, p, t)/∂p) = 0,  (5.8)

i.e. the full time derivative of the distribution function is zero. Scattering violates this identity, bringing additional channels for the time evolution of the distribution,

    df(r, p, t)/dt = (df(r, p, t)/dt)_coll ≡ I_coll({f}).  (5.9)

The right-hand side of this equation is given by a master equation of the form of Eq. (4.20), or Eq. (5.2), or both, depending on the kind of scattering. This combination of the Liouville equation describing deterministic evolution (often called the convection or drift term) and the master equation describing scattering constitutes the Boltzmann equation,

    ∂f(r, p, t)/∂t + v·(∂f(r, p, t)/∂r) − (dU(r)/dr)·(∂f(r, p, t)/∂p) = Σ_a I^(a)_coll({f}).  (5.10)

The sum includes all scattering mechanisms of importance. At this stage we may pose the following question: what is the meaning of combining the deterministic dynamical evolution governed by the Liouville equation with the random evolution governed by the master equation?
For what object is such a combined equation written? Indeed, the dynamical evolution of the exact distribution function conserves the phase volume occupied by the system, according to the Liouville theorem. The master equation evolution, on the contrary, leads to an increasing phase volume, in accord with the growing entropy, eventually leading to the microcanonical distribution evenly covering the whole surface Γ_E, see Fig. 3. The answer to this important conceptual question was given by P. Ehrenfest in the 1920s, who argued that an initial phase region occupied by the system develops in time into a region with an extremely complicated, irregular form, keeping constant volume, so that the inside points come arbitrarily close to any point of the outside region. Thus slightly washing

out the exact distribution (coarse graining) will lead to a phase region with a smooth shape and a larger volume, eventually covering the whole available phase space. A nice example of such a situation is given by a fine dye powder diluted in water: although the total volume of the dye particles is constant and small compared to the water volume, we do not distinguish individual particles and see a homogeneously colored liquid. Furthermore, Ehrenfest argued that the time-reversible evolution of the exact distribution function does not contradict the irreversible evolution of the coarse-grained distribution: if we managed to select the complexly shaped phase region identical to the one developed by the system during the forward time evolution, and reversed the time, we would recover the initial smooth phase region. However, the probability of this is negligibly small - any little error in our selection will lead to an even more complex region with a larger coarse-grained volume. Another interesting interpretation of the entropy growth is related to information: given the accuracy of our measurement (the shape of the coarse-grained phase volume), we increasingly lose information about the ever more complex shape of the exact distribution. This information loss is reflected by the growing entropy.

Figure 3: Upper: unstable particle trajectories - the distance between them grows exponentially with time. Lower: exact dynamical evolution of an initial distribution vs the coarse-grained distribution; the initial phase region develops a complex shape due to unstable particle trajectories.

5.3 Hydrodynamic equations

Here we consider a first application of the Boltzmann equation - the equations of hydrodynamics. These equations are an approximate form of the Boltzmann equation applied to a certain class of non-equilibrium macroscopic processes. The Boltzmann equation is a very complicated equation that is difficult to solve even numerically. However, several classes of problems can be identified for which the Boltzmann equation can be consistently

approximated by simpler, macroscopic equations. The purpose of this discussion, besides demonstrating the astonishing fact that hydrodynamics is just a consequence of the Boltzmann equation, is to give an illustration of such approximation methods.

We start by noticing that the integration of the collision integral I_coll(f) in Eq. (5.5) over p_1 gives zero,

    ∫ dp_1 I_coll(f) = ∫ dp_1 dp_2 dp_3 dp_4 W(p_1, p_2; p_3, p_4) [f(p_3, t) f(p_4, t) − f(p_1, t) f(p_2, t)] = 0.  (5.11)

This can be proven by re-labelling p_1, p_2 ↔ p_3, p_4 in, for instance, the outgoing (second) term and using the detailed balance equation. Furthermore, we notice that the conservation of microscopic momentum and energy in binary scattering, Eq. (5.1), gives rise to further constraints on the collision integral. Indeed, let us consider the equation,

    ∫ dp_1 p_1 I_coll(f) = ∫ dp_1 dp_2 dp_3 dp_4 p_1 W(p_1, p_2; p_3, p_4) [f(p_3, t) f(p_4, t) − f(p_1, t) f(p_2, t)].  (5.12)

By re-labelling p_1, p_3 ↔ p_2, p_4, and then p_1, p_2 ↔ p_3, p_4, and using the symmetries of the scattering rate, we get p_1 → (1/4)(p_1 + p_2 − p_3 − p_4), which equals zero by virtue of momentum conservation. Thus we get

    ∫ dp_1 p_1 I_coll(f) = 0.  (5.13)

Similarly, we easily show that energy conservation results in the relation,

    ∫ dp_1 ε_1 I_coll(f) = 0.  (5.14)

Now we consider the implications of these conservation laws for the Boltzmann equation. We multiply both sides of Eq. (5.10) consecutively by 1, p, and ε, and integrate over momentum. This cancels the right-hand side (rhs) terms, and in the first case we get,

    ∂/∂t ∫ dp f(r, p, t) + ∂/∂r · ∫ dp v f(r, p, t) = 0.  (5.15)

The integral of the last term of the Boltzmann equation is zero because the integrand is a full derivative with respect to momentum. This equation can be rewritten in a more familiar form by introducing a new normalization of the distribution function,

    ∫ dp f(r, p, t) = n(r, t),  (5.16)

where n(r, t) is the particle density.
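Equation (5.15) can be verified directly in a collisionless one-dimensional sketch (my construction; the Gaussian profile is an arbitrary choice), using the exact free-streaming solution f(x, p, t) = f(x − (p/m)t, p, 0):

```python
import math

# Collisionless 1D check of Eq. (5.15): f(x, p, t) = f(x - (p/m) t, p, 0)
# solves df/dt + v df/dx = 0 exactly. The moments are computed by
# quadrature and the derivatives by central differences.
m = 1.0

def f(x, p, t):
    x0 = x - (p / m) * t
    return math.exp(-x0 * x0) * math.exp(-p * p)   # arbitrary profile

ps = [-4.0 + 8.0 * k / 400 for k in range(401)]    # momentum grid
dp = 8.0 / 400

def n(x, t):
    return sum(f(x, p, t) for p in ps) * dp        # zeroth moment

def first_moment(x, t):
    return sum((p / m) * f(x, p, t) for p in ps) * dp

x, t, h = 0.3, 0.5, 1e-4
dn_dt = (n(x, t + h) - n(x, t - h)) / (2 * h)
dj_dx = (first_moment(x + h, t) - first_moment(x - h, t)) / (2 * h)
print(abs(dn_dt + dj_dx))   # ~ 0: the zeroth-moment balance holds
```

Because the free-streaming solution satisfies the transport equation exactly at every p, the residual is limited only by the finite-difference step, regardless of the quadrature accuracy.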
We also introduce the particle current density vector,

    j(r, t) = ∫ dp v f(r, p, t),  (5.17)

then Eq. (5.15) takes the form of the continuity equation for the particle density,

    ∂n(r, t)/∂t + div j(r, t) = 0.  (5.18)

Multiplication of the Boltzmann equation by p and integration over momentum gives,

    ∂j_α(r, t)/∂t + ∂P_αβ(r, t)/∂r_β = (1/m) F_α(r, t) n(r, t).  (5.19)

Here we introduced the index α = x, y, z for the vector components, and defined

    P_αβ(r, t) = ∫ dp v_α v_β f(r, p, t).  (5.20)

The last term in the Boltzmann equation was integrated by parts, taking into account the relation ∂p_α/∂p_β = δ_αβ (since the components of the momentum vector are independent variables). Equation (5.19) can be interpreted as a continuity equation for the current density, the rhs term describing the driving effect of an external force. Similarly, we get a continuity equation for the energy density,

    ∂u(r, t)/∂t + div j_E(r, t) = F(r, t)·j(r, t).  (5.21)

Here u is the internal energy density and j_E is the energy current density,

    u(r, t) = ∫ dp ε(p) f(r, p, t),   j_E(r, t) = ∫ dp v ε(p) f(r, p, t).  (5.22)

5.4 Local equilibrium

The conservation equations for macroscopic quantities derived in the previous section have a general character. Now we specify them for a certain choice of the distribution function describing a local equilibrium,

    f^(0) = C(r, t) exp[−(ε(p) − p·V(r, t))/kT(r, t)].  (5.23)

This function resembles the equilibrium distribution in Eq. (5.7), but with coefficients that are inhomogeneous in time and space. It describes an approximate solution of the Boltzmann equation in the situation where the inhomogeneities are small on the time scale of the relaxation time and on the spatial scale of the mean free path, namely,

    |dT/dt| ≪ T/τ,   |∇T| ≪ T/(vτ),  (5.24)

and similarly for C and V. If this is the case, then the left-hand side of the Boltzmann equation is small compared to the collision integral and can be neglected,

    0 = I_coll(f),  (5.25)

and the function in Eq. (5.23) gives the solution of this equation. To evaluate the conservation equations for the local equilibrium, we notice that Eq. (5.23) can be rewritten in the equivalent form,

    f^(0) = C(r, t) exp[−m(v − V(r, t))²/2kT(r, t)]  (5.26)

(the factor e^{−mV²/2kT} has been included in the normalization constant). This function is isotropic with respect to the variable v − V. Generally, for isotropic functions, the averages of the vector components are,

    ∫ dq q_α F(q²) = 0,   ∫ dq q_α q_β F(q²) = Q δ_αβ.  (5.27)

This general property allows us to write the current density in the form,

    j = ∫ dp [(v − V) + V] f^(0) = nV,  (5.28)

and also to simplify the quantity, P_αβ = (P/m) δ_αβ + n V_α V_β, where P = nkT is the pressure. With these simplifications, the equations (5.15) and (5.19) take, after some algebra, the form,

    ∂n/∂t + div(nV) = 0,
    ∂V/∂t + (V·∇)V + (1/mn)∇P = (1/m)F.  (5.29)

These are the hydrodynamic equations for an ideal liquid without any dissipation. Note that these equations contain only macroscopic quantities; the microscopic distribution function is left behind the scene. Equation (5.21) can also be transformed into a macroscopic equation for the temperature dynamics, by using the relation m⟨(v − V)²⟩ = 3kT and similar algebraic transformations. Of course, the local equilibrium distribution function is not an exact solution of the Boltzmann equation. There are non-equilibrium corrections, proportional to the (small) external forces and to the temperature and velocity gradients, which generate dissipative contributions to the flows, leading to thermal resistance, viscosity, etc. We will consider some examples of such dissipative effects in the next lectures.
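Equation (5.28), j = nV, is easy to check by quadrature for a single velocity component of the drifted Maxwellian (a sketch; all parameter values are arbitrary):

```python
import math

# Quadrature check of Eq. (5.28) for one velocity component of the
# local-equilibrium (drifted Maxwell) distribution. The values of
# m, kT, V and the normalization C are arbitrary.
m, kT, V, C = 1.0, 0.7, 0.4, 2.0

def f0(v):
    return C * math.exp(-m * (v - V) ** 2 / (2 * kT))

vs = [-10.0 + 20.0 * k / 4000 for k in range(4001)]
dv = 20.0 / 4000
n = sum(f0(v) for v in vs) * dv        # density
j = sum(v * f0(v) for v in vs) * dv    # particle current
print(j / n)                           # equals the drift velocity V = 0.4
```

The same quadrature confirms ⟨m(v − V)²⟩ = kT per component, i.e. the one-dimensional version of the relation m⟨(v − V)²⟩ = 3kT used for the temperature equation.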
A consistent mathematical procedure for constructing the dissipative terms in the hydrodynamic equations was devised by Enskog. This procedure leads, in particular, to the Navier-Stokes equation and its generalizations.

6 Lecture: Transport theory and kinetic coefficients

The goal of transport theory is the calculation of the non-equilibrium macroscopic currents induced by external fields, spatial gradients, and other thermodynamic forces, and the calculation of the corresponding kinetic coefficients. Usually the currents are evaluated in a linear approximation with respect to a small deviation from equilibrium, and the theory is then often called linear response theory. In solid state physics, typical currents of interest include the electric current for charged particles (or the mass current for neutral particles) and the heat current. The corresponding kinetic coefficients are the conductivity, the thermal conductivity, and the thermoelectric coefficients.

6.1 Current definitions

The electric current density for electrons in a solid is defined as

    j_e = e⟨v⟩ = e ∫ d³p v f(r, p, t);  (6.1)

the distribution function here is normalized to the electron density,

    n = N/V = ∫ d³p f(r, p, t).  (6.2)

The heat current density is defined as the difference between the energy flow density,

    j_E = ∫ d³p v ε f(r, p, t),  (6.3)

and the energy transferred by the particle flow, µ⟨v⟩,

    j_q = ∫ d³p (ε − µ) v f(r, p, t).  (6.4)

6.2 Linearized BE

Consider the Boltzmann equation, Eq. (5.10), with the collision integral corresponding to electron-electron scattering and the external force corresponding to a static electric field (the force on an electron of charge −e in the potential ϕ being e∇ϕ),

    ∂f(r, p, t)/∂t + v·(∂f(r, p, t)/∂r) + e∇ϕ·(∂f(r, p, t)/∂p) = I_coll({f}).  (6.5)

The equilibrium solution of this equation corresponds to (i) a stationary state, (ii) the absence of spatial gradients, (iii) the absence of external fields; then it reads,

    f^(0)(p) = exp[(−ε(p) + µ + p·V)/kT].  (6.6)

In the absence of macroscopic current flows V must be zero, and the equilibrium distribution function is isotropic and depends only on energy, f^(0)(ε).
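As a consistency check of these definitions (a sketch with arbitrary parameter values), an isotropic equilibrium distribution f^(0)(ε) carries no currents: the integrands of Eqs. (6.1) and (6.4) are odd in v, so both j_e and j_q vanish. In one momentum component:

```python
import math

# Sketch: for an isotropic equilibrium distribution f0(eps(p)),
# both j_e (6.1) and j_q (6.4) vanish. One momentum component;
# m, kT, mu are arbitrary, and e is set to 1.
m, kT, mu, e = 1.0, 0.5, 0.3, 1.0

def f0(p):
    eps = p * p / (2 * m)
    return math.exp((mu - eps) / kT)   # equilibrium solution (6.6), V = 0

ps = [-8.0 + 16.0 * k / 3200 for k in range(3201)]
dp = 16.0 / 3200
j_e = e * sum((p / m) * f0(p) for p in ps) * dp
j_q = sum((p * p / (2 * m) - mu) * (p / m) * f0(p) for p in ps) * dp
print(j_e, j_q)   # both ~ 0 by symmetry of the momentum grid
```

A non-zero current appears only once the distribution acquires an anisotropic part, which is the subject of the linearized Boltzmann equation.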


More information

Part II: Statistical Physics

Part II: Statistical Physics Chapter 7: Quantum Statistics SDSMT, Physics 2013 Fall 1 Introduction 2 The Gibbs Factor Gibbs Factor Several examples 3 Quantum Statistics From high T to low T From Particle States to Occupation Numbers

More information

Chapter 2 Ensemble Theory in Statistical Physics: Free Energy Potential

Chapter 2 Ensemble Theory in Statistical Physics: Free Energy Potential Chapter Ensemble Theory in Statistical Physics: Free Energy Potential Abstract In this chapter, we discuss the basic formalism of statistical physics Also, we consider in detail the concept of the free

More information

Physics 4230 Final Examination 10 May 2007

Physics 4230 Final Examination 10 May 2007 Physics 43 Final Examination May 7 In each problem, be sure to give the reasoning for your answer and define any variables you create. If you use a general formula, state that formula clearly before manipulating

More information

Statistical Thermodynamics and Monte-Carlo Evgenii B. Rudnyi and Jan G. Korvink IMTEK Albert Ludwig University Freiburg, Germany

Statistical Thermodynamics and Monte-Carlo Evgenii B. Rudnyi and Jan G. Korvink IMTEK Albert Ludwig University Freiburg, Germany Statistical Thermodynamics and Monte-Carlo Evgenii B. Rudnyi and Jan G. Korvink IMTEK Albert Ludwig University Freiburg, Germany Preliminaries Learning Goals From Micro to Macro Statistical Mechanics (Statistical

More information

Physics Dec The Maxwell Velocity Distribution

Physics Dec The Maxwell Velocity Distribution Physics 301 7-Dec-2005 29-1 The Maxwell Velocity Distribution The beginning of chapter 14 covers some things we ve already discussed. Way back in lecture 6, we calculated the pressure for an ideal gas

More information

HAMILTON S PRINCIPLE

HAMILTON S PRINCIPLE HAMILTON S PRINCIPLE In our previous derivation of Lagrange s equations we started from the Newtonian vector equations of motion and via D Alembert s Principle changed coordinates to generalised coordinates

More information

1. Introductory Examples

1. Introductory Examples 1. Introductory Examples We introduce the concept of the deterministic and stochastic simulation methods. Two problems are provided to explain the methods: the percolation problem, providing an example

More information

PHYSICS 715 COURSE NOTES WEEK 1

PHYSICS 715 COURSE NOTES WEEK 1 PHYSICS 715 COURSE NOTES WEEK 1 1 Thermodynamics 1.1 Introduction When we start to study physics, we learn about particle motion. First one particle, then two. It is dismaying to learn that the motion

More information

Thermodynamics of nuclei in thermal contact

Thermodynamics of nuclei in thermal contact Thermodynamics of nuclei in thermal contact Karl-Heinz Schmidt, Beatriz Jurado CENBG, CNRS/IN2P3, Chemin du Solarium B.P. 120, 33175 Gradignan, France Abstract: The behaviour of a di-nuclear system in

More information

Fractional exclusion statistics: A generalised Pauli principle

Fractional exclusion statistics: A generalised Pauli principle Fractional exclusion statistics: A generalised Pauli principle M.V.N. Murthy Institute of Mathematical Sciences, Chennai (murthy@imsc.res.in) work done with R. Shankar Indian Academy of Sciences, 27 Nov.

More information

Condensed Matter Physics Prof. G. Rangarajan Department of Physics Indian Institute of Technology, Madras

Condensed Matter Physics Prof. G. Rangarajan Department of Physics Indian Institute of Technology, Madras Condensed Matter Physics Prof. G. Rangarajan Department of Physics Indian Institute of Technology, Madras Lecture - 10 The Free Electron Theory of Metals - Electrical Conductivity (Refer Slide Time: 00:20)

More information

The fine-grained Gibbs entropy

The fine-grained Gibbs entropy Chapter 12 The fine-grained Gibbs entropy 12.1 Introduction and definition The standard counterpart of thermodynamic entropy within Gibbsian SM is the socalled fine-grained entropy, or Gibbs entropy. It

More information

This is a Gaussian probability centered around m = 0 (the most probable and mean position is the origin) and the mean square displacement m 2 = n,or

This is a Gaussian probability centered around m = 0 (the most probable and mean position is the origin) and the mean square displacement m 2 = n,or Physics 7b: Statistical Mechanics Brownian Motion Brownian motion is the motion of a particle due to the buffeting by the molecules in a gas or liquid. The particle must be small enough that the effects

More information

1 Fluctuations of the number of particles in a Bose-Einstein condensate

1 Fluctuations of the number of particles in a Bose-Einstein condensate Exam of Quantum Fluids M1 ICFP 217-218 Alice Sinatra and Alexander Evrard The exam consists of two independant exercises. The duration is 3 hours. 1 Fluctuations of the number of particles in a Bose-Einstein

More information

5. Systems in contact with a thermal bath

5. Systems in contact with a thermal bath 5. Systems in contact with a thermal bath So far, isolated systems (micro-canonical methods) 5.1 Constant number of particles:kittel&kroemer Chap. 3 Boltzmann factor Partition function (canonical methods)

More information

8.333: Statistical Mechanics I Fall 2007 Test 2 Review Problems

8.333: Statistical Mechanics I Fall 2007 Test 2 Review Problems 8.333: Statistical Mechanics I Fall 007 Test Review Problems The second in-class test will take place on Wednesday 10/4/07 from :30 to 4:00 pm. There will be a recitation with test review on Monday 10//07.

More information

Thermodynamic equilibrium

Thermodynamic equilibrium Statistical Mechanics Phys504 Fall 2006 Lecture #3 Anthony J. Leggett Department of Physics, UIUC Thermodynamic equilibrium Let s consider a situation where the Universe, i.e. system plus its environment

More information

Brownian Motion and Langevin Equations

Brownian Motion and Langevin Equations 1 Brownian Motion and Langevin Equations 1.1 Langevin Equation and the Fluctuation- Dissipation Theorem The theory of Brownian motion is perhaps the simplest approximate way to treat the dynamics of nonequilibrium

More information

8 Lecture 8: Thermodynamics: Principles

8 Lecture 8: Thermodynamics: Principles 8. LECTURE 8: THERMODYNMICS: PRINCIPLES 69 8 Lecture 8: Thermodynamics: Principles Summary Phenomenological approach is a respectable way of understanding the world, especially when we cannot expect microscopic

More information

Decoherence and the Classical Limit

Decoherence and the Classical Limit Chapter 26 Decoherence and the Classical Limit 26.1 Introduction Classical mechanics deals with objects which have a precise location and move in a deterministic way as a function of time. By contrast,

More information

MD Thermodynamics. Lecture 12 3/26/18. Harvard SEAS AP 275 Atomistic Modeling of Materials Boris Kozinsky

MD Thermodynamics. Lecture 12 3/26/18. Harvard SEAS AP 275 Atomistic Modeling of Materials Boris Kozinsky MD Thermodynamics Lecture 1 3/6/18 1 Molecular dynamics The force depends on positions only (not velocities) Total energy is conserved (micro canonical evolution) Newton s equations of motion (second order

More information

Linear Response and Onsager Reciprocal Relations

Linear Response and Onsager Reciprocal Relations Linear Response and Onsager Reciprocal Relations Amir Bar January 1, 013 Based on Kittel, Elementary statistical physics, chapters 33-34; Kubo,Toda and Hashitsume, Statistical Physics II, chapter 1; and

More information

Energy Barriers and Rates - Transition State Theory for Physicists

Energy Barriers and Rates - Transition State Theory for Physicists Energy Barriers and Rates - Transition State Theory for Physicists Daniel C. Elton October 12, 2013 Useful relations 1 cal = 4.184 J 1 kcal mole 1 = 0.0434 ev per particle 1 kj mole 1 = 0.0104 ev per particle

More information

9.1 System in contact with a heat reservoir

9.1 System in contact with a heat reservoir Chapter 9 Canonical ensemble 9. System in contact with a heat reservoir We consider a small system A characterized by E, V and N in thermal interaction with a heat reservoir A 2 characterized by E 2, V

More information

Information Theory in Statistical Mechanics: Equilibrium and Beyond... Benjamin Good

Information Theory in Statistical Mechanics: Equilibrium and Beyond... Benjamin Good Information Theory in Statistical Mechanics: Equilibrium and Beyond... Benjamin Good Principle of Maximum Information Entropy Consider the following problem: we have a number of mutually exclusive outcomes

More information

Hydrodynamics. Stefan Flörchinger (Heidelberg) Heidelberg, 3 May 2010

Hydrodynamics. Stefan Flörchinger (Heidelberg) Heidelberg, 3 May 2010 Hydrodynamics Stefan Flörchinger (Heidelberg) Heidelberg, 3 May 2010 What is Hydrodynamics? Describes the evolution of physical systems (classical or quantum particles, fluids or fields) close to thermal

More information

PHYS 705: Classical Mechanics. Hamiltonian Formulation & Canonical Transformation

PHYS 705: Classical Mechanics. Hamiltonian Formulation & Canonical Transformation 1 PHYS 705: Classical Mechanics Hamiltonian Formulation & Canonical Transformation Legendre Transform Let consider the simple case with ust a real value function: F x F x expresses a relationship between

More information

Thermodynamics of violent relaxation

Thermodynamics of violent relaxation UNIVERSITA DEGLI STUDI DI PADOVA Dipartimento di ASTRONOMIA Thermodynamics of violent relaxation Dr. Bindoni Daniele 13 of MAY 2011 Outlines The Aims Introduction Violent Relaxation Mechanism Distribution

More information

Liouville Equation. q s = H p s

Liouville Equation. q s = H p s Liouville Equation In this section we will build a bridge from Classical Mechanics to Statistical Physics. The bridge is Liouville equation. We start with the Hamiltonian formalism of the Classical Mechanics,

More information

Physics 172H Modern Mechanics

Physics 172H Modern Mechanics Physics 172H Modern Mechanics Instructor: Dr. Mark Haugan Office: PHYS 282 haugan@purdue.edu TAs: Alex Kryzwda John Lorenz akryzwda@purdue.edu jdlorenz@purdue.edu Lecture 22: Matter & Interactions, Ch.

More information

Lecture notes for QFT I (662)

Lecture notes for QFT I (662) Preprint typeset in JHEP style - PAPER VERSION Lecture notes for QFT I (66) Martin Kruczenski Department of Physics, Purdue University, 55 Northwestern Avenue, W. Lafayette, IN 47907-036. E-mail: markru@purdue.edu

More information

although Boltzmann used W instead of Ω for the number of available states.

although Boltzmann used W instead of Ω for the number of available states. Lecture #13 1 Lecture 13 Obectives: 1. Ensembles: Be able to list the characteristics of the following: (a) icrocanonical (b) Canonical (c) Grand Canonical 2. Be able to use Lagrange s method of undetermined

More information

THEODORE VORONOV DIFFERENTIABLE MANIFOLDS. Fall Last updated: November 26, (Under construction.)

THEODORE VORONOV DIFFERENTIABLE MANIFOLDS. Fall Last updated: November 26, (Under construction.) 4 Vector fields Last updated: November 26, 2009. (Under construction.) 4.1 Tangent vectors as derivations After we have introduced topological notions, we can come back to analysis on manifolds. Let M

More information

Stochastic Particle Methods for Rarefied Gases

Stochastic Particle Methods for Rarefied Gases CCES Seminar WS 2/3 Stochastic Particle Methods for Rarefied Gases Julian Köllermeier RWTH Aachen University Supervisor: Prof. Dr. Manuel Torrilhon Center for Computational Engineering Science Mathematics

More information

1 Foundations of statistical physics

1 Foundations of statistical physics 1 Foundations of statistical physics 1.1 Density operators In quantum mechanics we assume that the state of a system is described by some vector Ψ belonging to a Hilbert space H. If we know the initial

More information

Experimental Soft Matter (M. Durand, G. Foffi)

Experimental Soft Matter (M. Durand, G. Foffi) Master 2 PCS/PTSC 2016-2017 10/01/2017 Experimental Soft Matter (M. Durand, G. Foffi) Nota Bene Exam duration : 3H ecture notes are not allowed. Electronic devices (including cell phones) are prohibited,

More information

Sketchy Notes on Lagrangian and Hamiltonian Mechanics

Sketchy Notes on Lagrangian and Hamiltonian Mechanics Sketchy Notes on Lagrangian and Hamiltonian Mechanics Robert Jones Generalized Coordinates Suppose we have some physical system, like a free particle, a pendulum suspended from another pendulum, or a field

More information

Lecture 6: Ideal gas ensembles

Lecture 6: Ideal gas ensembles Introduction Lecture 6: Ideal gas ensembles A simple, instructive and practical application of the equilibrium ensemble formalisms of the previous lecture concerns an ideal gas. Such a physical system

More information

CONTENTS 1. In this course we will cover more foundational topics such as: These topics may be taught as an independent study sometime next year.

CONTENTS 1. In this course we will cover more foundational topics such as: These topics may be taught as an independent study sometime next year. CONTENTS 1 0.1 Introduction 0.1.1 Prerequisites Knowledge of di erential equations is required. Some knowledge of probabilities, linear algebra, classical and quantum mechanics is a plus. 0.1.2 Units We

More information

The Ginzburg-Landau Theory

The Ginzburg-Landau Theory The Ginzburg-Landau Theory A normal metal s electrical conductivity can be pictured with an electron gas with some scattering off phonons, the quanta of lattice vibrations Thermal energy is also carried

More information

Fluid equations, magnetohydrodynamics

Fluid equations, magnetohydrodynamics Fluid equations, magnetohydrodynamics Multi-fluid theory Equation of state Single-fluid theory Generalised Ohm s law Magnetic tension and plasma beta Stationarity and equilibria Validity of magnetohydrodynamics

More information

Thermodynamics, Gibbs Method and Statistical Physics of Electron Gases

Thermodynamics, Gibbs Method and Statistical Physics of Electron Gases Bahram M. Askerov Sophia R. Figarova Thermodynamics, Gibbs Method and Statistical Physics of Electron Gases With im Figures Springer Contents 1 Basic Concepts of Thermodynamics and Statistical Physics...

More information

Statistical Mechanics

Statistical Mechanics Franz Schwabl Statistical Mechanics Translated by William Brewer Second Edition With 202 Figures, 26 Tables, and 195 Problems 4u Springer Table of Contents 1. Basic Principles 1 1.1 Introduction 1 1.2

More information

Lecture 4: Entropy. Chapter I. Basic Principles of Stat Mechanics. A.G. Petukhov, PHYS 743. September 7, 2017

Lecture 4: Entropy. Chapter I. Basic Principles of Stat Mechanics. A.G. Petukhov, PHYS 743. September 7, 2017 Lecture 4: Entropy Chapter I. Basic Principles of Stat Mechanics A.G. Petukhov, PHYS 743 September 7, 2017 Chapter I. Basic Principles of Stat Mechanics A.G. Petukhov, Lecture PHYS4: 743 Entropy September

More information

2. As we shall see, we choose to write in terms of σ x because ( X ) 2 = σ 2 x.

2. As we shall see, we choose to write in terms of σ x because ( X ) 2 = σ 2 x. Section 5.1 Simple One-Dimensional Problems: The Free Particle Page 9 The Free Particle Gaussian Wave Packets The Gaussian wave packet initial state is one of the few states for which both the { x } and

More information

Landau s Fermi Liquid Theory

Landau s Fermi Liquid Theory Thors Hans Hansson Stockholm University Outline 1 Fermi Liquids Why, What, and How? Why Fermi liquids? What is a Fermi liquids? Fermi Liquids How? 2 Landau s Phenomenological Approach The free Fermi gas

More information

1 Quantum field theory and Green s function

1 Quantum field theory and Green s function 1 Quantum field theory and Green s function Condensed matter physics studies systems with large numbers of identical particles (e.g. electrons, phonons, photons) at finite temperature. Quantum field theory

More information

where (E) is the partition function of the uniform ensemble. Recalling that we have (E) = E (E) (E) i = ij x (E) j E = ij ln (E) E = k ij ~ S E = kt i

where (E) is the partition function of the uniform ensemble. Recalling that we have (E) = E (E) (E) i = ij x (E) j E = ij ln (E) E = k ij ~ S E = kt i G25.265: Statistical Mechanics Notes for Lecture 4 I. THE CLASSICAL VIRIAL THEOREM (MICROCANONICAL DERIVATION) Consider a system with Hamiltonian H(x). Let x i and x j be specic components of the phase

More information

7 The Navier-Stokes Equations

7 The Navier-Stokes Equations 18.354/12.27 Spring 214 7 The Navier-Stokes Equations In the previous section, we have seen how one can deduce the general structure of hydrodynamic equations from purely macroscopic considerations and

More information

Statistical. mechanics

Statistical. mechanics CHAPTER 15 Statistical Thermodynamics 1: The Concepts I. Introduction. A. Statistical mechanics is the bridge between microscopic and macroscopic world descriptions of nature. Statistical mechanics macroscopic

More information

I. Collective Behavior, From Particles to Fields

I. Collective Behavior, From Particles to Fields I. Collective Behavior, From Particles to Fields I.A Introduction The object of the first part of this course was to introduce the principles of statistical mechanics which provide a bridge between the

More information

Caltech Ph106 Fall 2001

Caltech Ph106 Fall 2001 Caltech h106 Fall 2001 ath for physicists: differential forms Disclaimer: this is a first draft, so a few signs might be off. 1 Basic properties Differential forms come up in various parts of theoretical

More information

Symmetry of the Dielectric Tensor

Symmetry of the Dielectric Tensor Symmetry of the Dielectric Tensor Curtis R. Menyuk June 11, 2010 In this note, I derive the symmetry of the dielectric tensor in two ways. The derivations are taken from Landau and Lifshitz s Statistical

More information

Mathematical Structures of Statistical Mechanics: from equilibrium to nonequilibrium and beyond Hao Ge

Mathematical Structures of Statistical Mechanics: from equilibrium to nonequilibrium and beyond Hao Ge Mathematical Structures of Statistical Mechanics: from equilibrium to nonequilibrium and beyond Hao Ge Beijing International Center for Mathematical Research and Biodynamic Optical Imaging Center Peking

More information

Markovian Description of Irreversible Processes and the Time Randomization (*).

Markovian Description of Irreversible Processes and the Time Randomization (*). Markovian Description of Irreversible Processes and the Time Randomization (*). A. TRZĘSOWSKI and S. PIEKARSKI Institute of Fundamental Technological Research, Polish Academy of Sciences ul. Świętokrzyska

More information

Definition 5.1. A vector field v on a manifold M is map M T M such that for all x M, v(x) T x M.

Definition 5.1. A vector field v on a manifold M is map M T M such that for all x M, v(x) T x M. 5 Vector fields Last updated: March 12, 2012. 5.1 Definition and general properties We first need to define what a vector field is. Definition 5.1. A vector field v on a manifold M is map M T M such that

More information

Markov Chain Monte Carlo The Metropolis-Hastings Algorithm

Markov Chain Monte Carlo The Metropolis-Hastings Algorithm Markov Chain Monte Carlo The Metropolis-Hastings Algorithm Anthony Trubiano April 11th, 2018 1 Introduction Markov Chain Monte Carlo (MCMC) methods are a class of algorithms for sampling from a probability

More information

84 My God, He Plays Dice! Chapter 12. Irreversibility. This chapter on the web informationphilosopher.com/problems/reversibility

84 My God, He Plays Dice! Chapter 12. Irreversibility. This chapter on the web informationphilosopher.com/problems/reversibility 84 My God, He Plays Dice! This chapter on the web informationphilosopher.com/problems/reversibility Microscopic In the 1870 s, Ludwig Boltzmann developed his transport equation and his dynamical H-theorem

More information

Physics 212: Statistical mechanics II Lecture IV

Physics 212: Statistical mechanics II Lecture IV Physics 22: Statistical mechanics II Lecture IV Our program for kinetic theory in the last lecture and this lecture can be expressed in the following series of steps, from most exact and general to most

More information