Principles of Statistical Mechanics


Note 1.1.1. Statistical mechanics did not develop in the way classical mechanics or electrodynamics did, which are well developed and formalized. Statistical mechanics, on the other hand, does not have a simple and straightforward formulation. For this reason there is no obvious choice of textbook. Consider checking out Landau and Lifshitz, Vol. 5.

Part I
Principles of Statistical Mechanics

1 Hamilton's Equations and Phase Space

We consider a classical mechanical system. Let L(q, q̇) be the Lagrangian, and take heed of the notation

    q ≡ {q_1, ..., q_f}      the generalized positions
    q̇ ≡ {q̇_1, ..., q̇_f}      the generalized velocities

Recall the action S[q, q̇] ≡ ∫ L(q, q̇, t) dt. By definition, the equations of motion are satisfied when the action is extremized. This condition gives the Euler-Lagrange equations

    d/dt (∂L/∂q̇_i) = ∂L/∂q_i.

The momenta of this system are given by

    p_i = (∂L/∂q̇_i)(q, q̇, t) ≡ f_i(q, q̇, t).

Note 1.1.2. The q are elements of an f-dimensional manifold M^f. The details of this manifold depend on the physical system under consideration.

If det(∂²L/∂q̇_i ∂q̇_j) ≠ 0, we can invert the f_i and solve for the q̇_i as functions of q, p, and t. From now on we assume this is the case, and thus we can state the following: there exist f continuously differentiable functions F_i, depending on the momenta and positions, such that the i-th velocity is q̇_i = F_i(q, p, t).

Definition 1.1.1. The Hamiltonian is defined by the Legendre transformation of the Lagrangian,

    H(q, p, t) ≡ Σ_i p_i F_i(q, p, t) − L(q, F(q, p, t), t).

Note 1.1.3. Given a Lagrangian one can compute the Hamiltonian; likewise, given a Hamiltonian one can recover the Lagrangian by another Legendre transformation,

    L(q, q̇, t) = Σ_i q̇_i f_i(q, q̇, t) − H(q, f(q, q̇, t), t).

Definition 1.1.2. The space spanned by (q, p) is called phase space and is denoted by Γ.

Theorem 1.1.1. For every path q(t) in M^f there exists a unique path (q(t), p(t)) in Γ.
Moreover, if p does not depend on q, the path is determined by the following system of first-order ODEs:

    q̇_i(t) = ∂H/∂p_i      (1)
    ṗ_i(t) = −∂H/∂q_i     (2)

Note 1.1.4. These are Hamilton's equations of motion, also called the canonical equations. They are equivalent to the Euler-Lagrange equations.

Proof. (a) q(t) being differentiable uniquely determines q̇(t), and thus p(t) = f(q(t), q̇(t), t) is also uniquely determined.

(b) Assume that the Euler-Lagrange equations are satisfied and p does not depend on q. Then it immediately follows that

    ∂H/∂p_i = F_i(q, p, t) + Σ_j p_j ∂F_j/∂p_i − Σ_j (∂L/∂q̇_j)(∂F_j/∂p_i) = F_i = q̇_i,

    ∂H/∂q_i = Σ_j p_j ∂F_j/∂q_i − ∂L/∂q_i − Σ_j (∂L/∂q̇_j)(∂F_j/∂q_i) = −∂L/∂q_i = −ṗ_i,

where the sums cancel because p_j = ∂L/∂q̇_j, and the last step uses the Euler-Lagrange equations. □

Note 1.1.5. Equations (1) and (2) are called Hamilton's or canonical equations.

Note 1.1.6. Hamilton's equations are equivalent to the Euler-Lagrange equations.

Note 1.1.7. For the Euler-Lagrange equations, one must solve f second-order ODEs. For Hamilton's equations, one must solve 2f first-order ODEs.

Note 1.1.8. The initial point (q_0, p_0) ∈ Γ uniquely determines the path (q(t), p(t)) by the theory of differential equations. Thus physical paths cannot cross in Γ: if there were a crossing at a point (q_c, p_c) ∈ Γ, we could use that point as an initial condition. The crossing would then imply that there are two solutions to the differential equations, a contradiction to uniqueness of solutions.

2 Phase Space and Flow

Recall from last time that a path in physical space uniquely specifies a path in phase space. In particular, the physical trajectories in phase space never cross. If they did, then the point at which they cross would be an initial condition which does not provide unique solutions to Hamilton's equations, which cannot happen. Recall, too, that we derived the equations of motion for the ps and qs, which we called Hamilton's equations of motion:

    q̇ = ∂H/∂p      (3)
    ṗ = −∂H/∂q     (4)

Corollary 2.1.1. The total derivative of the Hamiltonian w.r.t. time equals the partial derivative w.r.t. time. That is,

    d/dt H(q, p, t) = ∂H/∂t.

Proof.
    d/dt H(q, p, t) = ∂H/∂t + Σ_{i=1}^{f} ( (∂H/∂q_i) q̇_i + (∂H/∂p_i) ṗ_i )
                    = ∂H/∂t + Σ_{i=1}^{f} ( −ṗ_i q̇_i + q̇_i ṗ_i ) = ∂H/∂t. □

Note 2.1.1. Let our system be such that H(q, p, t) does not have explicit time dependence. In that case the partial derivative of H vanishes, and thus the total derivative of H w.r.t. t vanishes. Thus H is a constant of the motion. We call H the first integral of Hamilton's equations, or Jacobi's integral.

Definition 2.1.1. A system with a time-independent Hamiltonian is called conservative, and Jacobi's integral is called the energy. In this case the motion in phase space is restricted to a hypersurface of constant E, which is a (2f − 1)-dimensional surface. This is called the energy surface, energy shell, or accessible region. Configurations not on this surface violate energy conservation and thus are not physical. The accessible part of Γ is a set of measure zero.
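That the trajectory of a conservative system stays on its energy surface can be checked numerically. A minimal sketch for the one-dimensional harmonic oscillator treated below (the mass and spring constant are arbitrary illustrative values, not from the lecture):

```python
import math

# Hypothetical parameters for a 1d harmonic oscillator (illustration only).
m, k = 2.0, 8.0
omega = math.sqrt(k / m)          # angular frequency, omega^2 = k/m

def H(q, p):
    """Hamiltonian H = p^2/2m + k q^2/2 (Jacobi's integral)."""
    return p**2 / (2 * m) + k * q**2 / 2

def flow(q0, p0, t):
    """Exact solution of the canonical equations for the oscillator."""
    c, s = math.cos(omega * t), math.sin(omega * t)
    q = q0 * c + (p0 / (m * omega)) * s
    p = p0 * c - m * omega * q0 * s
    return q, p

q0, p0 = 1.0, 0.5
E = H(q0, p0)
for t in [0.0, 0.7, 2.3, 10.0]:
    q, p = flow(q0, p0, t)
    # H is a constant of the motion...
    assert abs(H(q, p) - E) < 1e-12
    # ...so the point stays on the ellipse p^2/(2mE) + (k/2E) q^2 = 1
    assert abs(p**2 / (2*m*E) + k * q**2 / (2*E) - 1.0) < 1e-12
```

The second assertion is just the energy-surface condition rewritten: for f = 1 the (2f − 1)-dimensional energy shell is a curve, here an ellipse.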

Example: 1d harmonic oscillator

For the one-dimensional harmonic oscillator we have one degree of freedom. The Lagrangian is

    L = (m/2) q̇² − (k/2) q².

The momentum is

    p = ∂L/∂q̇ = m q̇,

which allows us to solve for q̇ as follows:

    q̇ = p/m.

The Hamiltonian is then

    H = p q̇(p) − L = p²/2m + (k/2) q².

Now, what is our energy surface? The energy surface is given by a constant value of H. This is exactly when the following condition is satisfied:

    p²/(2mE) + (k/2E) q² = 1.

This equation specifies an ellipse in the 2-dimensional phase space. For completeness, note the following.

2.1.1 Canonical equations

    q̇ = ∂H/∂p = p/m
    ṗ = −∂H/∂q = −kq
    ⟹ q̈ = −(k/m) q ≡ −ω² q

2.1.2 E-L equations

    d/dt ∂L/∂q̇ = ∂L/∂q = −kq  ⟹  q̈ = −ω² q

3 Phase Flow

Phase flow refers to how points in phase space evolve as time goes by. Consider z(t) ≡ (q(t), p(t)) ∈ Γ. This point evolves via the canonical equations of motion. Considering all such z gives a mapping of phase space onto itself.

Definition 3.1.1. Let z(t_0) ∈ Γ. The set of mappings u of Γ onto itself such that

    u(t, t_0)[z(t_0)] = z(t)

is called the phase flow, or time evolution. We refer to u as the time evolution operator. Hamilton's equations of motion constrain the time evolution operator's form.

Proposition 3.1.1. The following are general properties of the time evolution operator.

1. u(t, t) = 1 ∀t.
2. u(t, t_1) ∘ u(t_1, t_2) = u(t, t_2) ∀ t, t_1, t_2.
3. The inverse exists and is (u(t, t_1))^{−1} = u(t_1, t).

Proof.
1. u(t, t)[z(t)] = z(t) by the definition of u.
2. (u(t, t_1) ∘ u(t_1, t_2))[z(t_2)] = u(t, t_1)[z(t_1)] = z(t) = u(t, t_2)[z(t_2)]. Since this is true for all elements of phase space, the operators themselves are equal.

3. u(t, t_1)[z(t_1)] = z(t). If an inverse exists, then u^{−1}(t, t_1)[z(t)] = z(t_1) = u(t_1, t)[z(t)] for all z(t). Thus u^{−1}(t, t_1) = u(t_1, t). □

Note 3.1.1. So, knowledge of the phase flow is equivalent to knowing the solution of the canonical equations.

Example: Harmonic Oscillator. The phase flow consists of (continuous) mappings of the ellipse onto itself.

4 Transformations of Phase Space

Definition 4.1.1. Let G be a set of mappings parameterized by a continuous index α, that is, G = {f_α, α ∈ ℝ}. We call G a one-parameter group of transformations (or mappings) if

1. G is a group under composition;
2. f_0 = 1 and f_{α+β} = f_α ∘ f_β.

Note 4.1.1. All such groups are isomorphic to (ℝ, +).

Theorem 4.1.1. The phase flow of any conservative system is a one-parameter group of transformations.

Note 4.1.2. This is a remarkable (and unexpected) connection between the theory of ordinary differential equations and group theory.

Proof. First we prove a lemma.

Lemma 4.1.1. u(t_1, t_2) = u(t_1 − t_2, 0) ≡ u(t_1 − t_2).

Note 4.1.3. This is a statement of time translation invariance.

Proof. Consider u(t_1 + t_3, t_2 + t_3) z(t_2 + t_3) = z(t_1 + t_3). Define z̃(t) ≡ z(t + t_3) for a fixed time t_3. The canonical equations give ż(t) = F(z(t)), where there is no explicit time dependence because the system is conservative. Now consider

    d/dt z̃(t) = d/dt z(t + t_3) = F(z(t + t_3)) = F(z̃(t)).

Thus z and z̃ have the same time evolution. This tells us that u(t_1, t_2) z̃(t_2) = z̃(t_1) = z(t_1 + t_3). But we can also write this as u(t_1 + t_3, t_2 + t_3) z(t_2 + t_3) = u(t_1 + t_3, t_2 + t_3) z̃(t_2), for all z̃(t_2). Thus this must be an operator identity, so u(t_1 + t_3, t_2 + t_3) = u(t_1, t_2) ∀ t_3. In particular, for t_3 = −t_2,

    u(t_1, t_2) = u(t_1 − t_2, 0). □

Now, by the lemma, G = {u(t)}. Obviously this forms a group, by Proposition 3.1.1. □

4.1 Liouville's Theorem

For D ⊂ Γ a bounded region of Γ, let its volume be denoted by V_D. Let D(t) = u(t) D. Consider z ∈ D, with z = (z_1, ..., z_{2f}), where the first f components are positions and the second f components are momenta.
By Theorem 1.1.1 we have ż(t) = F(z(t)), where F = (F_1, ..., F_{2f}) with

    F_i = ∂H/∂p_i           for 1 ≤ i ≤ f,
    F_i = −∂H/∂q_{i−f}      for f < i ≤ 2f.

Lemma 4.1.2.

    dV/dt = ∫_{D(t)} dz Σ_{i=1}^{2f} ∂F_i/∂z_i (z(t)).

Proof. Suppose V(t) is known. We wish to calculate V(t + ε) = ∫_{D(t+ε)} dz. Changing variables, we get

    V(t + ε) = ∫_{D(t)} dz det( ∂z(t + ε)/∂z(t) ),      (5)

where the Jacobian determinant can be expanded using Levi-Civita tensors.

Of course, z_j(t + ε) = z_j(t) + ε F_j, so the Jacobian matrix is ∂z_i(t + ε)/∂z_j(t) = δ_{ij} + ε ∂F_i/∂z_j. We just want the term linear in ε, that is, the time derivative. Expanding the determinant,

    det( δ_{ij} + ε ∂F_i/∂z_j ) = 1 + ε Σ_i ∂F_i/∂z_i + O(ε²),

so

    dV(t)/dt = ∫_{D(t)} dz Σ_i ∂F_i(t)/∂z_i(t). □

Theorem 4.1.2 (Liouville's Theorem). For a conservative system, phase flow is volume preserving, i.e. dV/dt = 0. That is to say, V_{D(t)} = V_{D(0)} ∀t.

Proof.

    Σ_{i=1}^{2f} ∂F_i/∂z_i = Σ_{i=1}^{f} ∂²H/∂q_i ∂p_i − Σ_{i=1}^{f} ∂²H/∂p_i ∂q_i = 0,

and the result follows immediately from the lemma. □

Example: f = 1. In this case Γ = ℝ². Consider a bounded region which starts off as a rectangle at t = 0, with corners at {(a, c), (a, d), (b, c), (b, d)}. Continuity tells us that the region must remain simply connected, and differentiability tells us that the corners must remain corners. In addition, if the system is conservative, Liouville's theorem guarantees that the volume remains the same.

Theorem 4.1.3 (Poincaré's Theorem, Recurrence Theorem). Let G be a group of volume-preserving, continuously invertible mappings of ℝⁿ onto itself. Let D ⊂ ℝⁿ be a bounded region with the property that ∀f ∈ G, f(D) ⊂ D. Then for every open subset U ⊂ D and for every f ∈ G there exist an n ∈ ℕ, n ≥ 1, and a V ⊂ U such that fⁿV ⊂ U.

What is this theorem saying? It tells us that every part of every subset of D returns (approximately) to itself eventually under repeated application of an element of G.

Proof. Let D ⊂ Γ be a bounded set such that GD ⊂ D. Consider an open subset U ⊂ D and an element g ∈ G. Now assume that ∀ N_1, N_2 ∈ ℕ, g^{N_1}U ∩ g^{N_2}U = ∅. Consider the volume of U, vol(U). By the fact that g is volume preserving, and by our assumption, it follows that

    lim_{N→∞} vol( ∪_{n=1}^{N} gⁿU ) = lim_{N→∞} N vol(U) = ∞.

Now, since GD ⊂ D, it must be that ∪_{n=1}^{N} gⁿU ⊂ D. This is a contradiction, as vol(D) is finite and the limit of our union clearly is not. Thus there exist k_1, k_2 ∈ ℕ such that g^{k_1}(U) ∩ g^{k_2}(U) ≠ ∅. We may assume without loss of generality k_1 < k_2.
Then there exist x, y ∈ U such that g^{k_1}(x) = g^{k_2}(y), i.e. x = g^{k_2 − k_1}(y) ∈ g^{k_2 − k_1}(U). Let us choose n = k_2 − k_1; then gⁿ(U) ∩ U ≠ ∅. Define Ṽ = gⁿ(U) ∩ U and let V = g^{−n}(Ṽ). So V is our subset of U which returns to the set U under repeated applications of g. □

Note 4.1.4. The following premises are very important:
1. the subset D ⊂ Γ is stable under the action of G;
2. every g ∈ G is volume preserving in Γ.

Corollary 4.1.1. Let a conservative mechanical system at time t_0 be in a state given by the point in phase space z(t_0) ∈ Γ. Then, under time evolution, the point z(t) will return arbitrarily close to its original location within a finite amount of time.

Corollary 4.1.2. The recurrence in the theorem above occurs arbitrarily often and arbitrarily accurately, since U can be chosen arbitrarily small.
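The volume preservation underlying the recurrence theorem can be illustrated numerically. A sketch using a symplectic-Euler step for the pendulum H = p²/2 − cos q (the discretization is my own illustrative choice, not from the lecture; it happens to preserve phase-space area exactly, so the Jacobian determinant of one step is 1):

```python
import math

def step(q, p, h=0.1):
    """One symplectic-Euler step for the pendulum H = p^2/2 - cos q."""
    p_new = p - h * math.sin(q)       # dp/dt = -dH/dq = -sin q
    q_new = q + h * p_new             # dq/dt =  dH/dp =  p
    return q_new, p_new

def jacobian_det(q, p, eps=1e-6):
    """Finite-difference determinant of d(q', p')/d(q, p) for one step."""
    dq = [(step(q + eps, p)[i] - step(q - eps, p)[i]) / (2 * eps) for i in (0, 1)]
    dp = [(step(q, p + eps)[i] - step(q, p - eps)[i]) / (2 * eps) for i in (0, 1)]
    # rows: (dq'/dq, dp'/dq) and (dq'/dp, dp'/dp)
    return dq[0] * dp[1] - dq[1] * dp[0]

# Liouville: the flow of a conservative system preserves phase-space volume,
# so the Jacobian determinant should be 1 at every phase-space point.
for q, p in [(0.0, 1.0), (1.3, -0.4), (2.9, 2.0)]:
    assert abs(jacobian_det(q, p) - 1.0) < 1e-6
```

Volume preservation holds even though the region is badly distorted by the nonlinear flow, which is exactly the situation the examples below describe.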

Example: Classical Gas

Consider a classical gas in a partitioned container. At time t = 0 all the gas is in the left-hand side of the container. At t = 0 a hole is punched in the partition. But by Poincaré's theorem there is a finite time at which all the particles will be back in the left-hand side of the container. But that is ludicrous. What is the solution?

1. Maybe the finite time is absurdly long?
2. Maybe physics doesn't actually satisfy the conditions for Poincaré's theorem?

4.2 The necessity of a statistical description of many-particle systems

Consider the time evolution of simple regions in phase space according to our two main theorems thus far.

Example: Free particle in d = 1. For a free particle in one dimension, the motion is given by

    p(t) = p_0,    q(t) = q_0 + (p_0/m) t.

Poincaré's theorem does not hold, because no bounded region of phase space is left invariant.

Example: Particle in a box. Let's instead consider this particle confined to a box in space. Now the equations are

    p(t) = ±p_0,    q(t) = q_0 + (p_0/m) t + inverted motion at the walls.

This leads to the following phase flow:

[Figure 1: Phase flow of a particle in a box — closed rectangular orbits in the (q, p) plane, bounded by q ∈ [0, L] and p = ±p_0.]

The particle just bounces back and forth in this box at a constant speed. A volume element in phase space stays within the rectangle bounded by the positive and negative values of the initial momenta and by the walls of the box. The map is not continuous (when the particle hits the walls of the box, its velocity changes discontinuously), and as t → ∞ the initial region stretches into infinitely thin discontinuous pieces, stretched out across the invariant region of phase space.

Example: Lorentz Gas

Now consider a particle scattered by hard spheres in d = 3; this is called a Lorentz gas. Consider a packet of particles traveling towards a hard sphere of radius σ, with central impact parameter b and uncertainty δb in the impact parameter. The particles scatter into a cone with some central angle α and an uncertainty δα. After some amount of time, the particles inhabit a region which increases with the square of the distance traveled: real-space volume is not preserved. Phase-space volume must be preserved, so the momentum-space volume must shrink. Assume that the angle α is very small; then δα = δb/σ. Now assume there is a mean free path l ≫ σ, which tells us that the real-space volume after one collision has grown by a factor of order (2l/σ)^{d−1}; in d dimensions,

    v_1 ≈ v_0 (2l/σ)^{d−1}

up to factors of order 1. By Liouville's theorem, the momentum-space volume must shrink by the same factor. Then, after N collisions, the momentum-space volume has shrunk by a factor (σ/2l)^{(d−1)N}. But there is a problem.

Problem: The Butterfly Effect

After N collisions, a simple volume in phase space has been transformed into a foliated structure with typical dimensions (Δp)_N/(Δp)_0 ≈ (σ/2l)^{(d−1)N}.

Example: Classical Gas. A hard-sphere gas with radii σ and density n has a mean free path l ≈ 1/nσ². Then σ/l ≈ nσ³ ≈ 10⁻³ for a typical gas. For d = 3 and N = 3 (that is, after three collisions) we have (Δp)_N/(Δp)_0 ≈ 10⁻¹⁸. WOW! This tells us that after only three collisions, our initial phase-space volumes are extremely long and thin. This means that incredible accuracy is needed in order to maintain the precise dynamics of a system.

Example: A real experiment. In a real experiment, the weakest force around is gravity. If gravity acts due to an object of mass M (and it does), we have a force per unit mass F/m = GM/r² ≈ 7 × 10⁻⁸ [cgs] × M/r². The momentum impulse is Δp = Fτ, or Δp/p ≈ (F/m) τ/v. For a typical classical gas, v ≈ 10⁵ cm/s and the typical time scale between collisions is τ ≈ 10⁻⁹ seconds.
Plugging this in, the relative momentum uncertainty is Δp/p ≈ 10⁻²¹ M/r² [cgs]; i.e., for a perturbing mass M at a distance r, this is the perturbation it imparts on your experiment. A butterfly outside the lab, with a mass of approximately a gram at a distance of approximately 10 meters, destroys the foliated structure after about 4 collisions... that is bad news. Well, let's say we find a very remote location, in some distant corner of the solar system, where there are no butterflies. A one-gram mass at a distance of one parsec (for example, a butterfly in the Andromeda galaxy) destroys the phase-space structure after about 9 collisions.

Note 4.2.1. Volumes of phase space represent errors in measurement; the actual state of a system of N particles is a single point in 6N-dimensional phase space (for particles in 3 dimensions). As the examples make clear, after a very small period of time, perturbations from the environment completely destroy our knowledge of the system's state. Even on a computer, rounding error makes precise predictions impossible. The conclusion is that a deterministic microscopic description of an N-particle system is practically impossible. On the other hand, time averages of macroscopic properties are perfectly deterministic. So, what we need is to develop a statistical description of many-particle systems instead of a microscopic one.
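The arithmetic behind these estimates can be sketched in a few lines; all numbers below are order-of-magnitude placeholders consistent with the estimates above, not the lecture's exact values:

```python
import math

G = 6.7e-8          # Newton's constant in cgs units
M, r = 1.0, 1e3     # a one-gram "butterfly" at 10 meters (10^3 cm)
tau, v = 1e-9, 1e5  # collision time [s] and thermal speed [cm/s], typical gas

# relative momentum perturbation imparted by the butterfly's gravity
dp_over_p = (G * M / r**2) * tau / v
assert 1e-28 < dp_over_p < 1e-26

# foliation thickness after N collisions, with sigma/(2l) ~ 5e-4 and d = 3
def shrink(N):
    return (5e-4) ** (2 * N)

# the perturbation swamps the foliated structure once shrink(N) < dp/p,
# which happens after only a handful of collisions
N = next(n for n in range(1, 20) if shrink(n) < dp_over_p)
assert 3 <= N <= 6
```

With these placeholder numbers the crossover lands at roughly the handful of collisions quoted in the text; the exact count depends on the order-one factors, which is why it is quoted only approximately.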

5 Crash-Course in Formal Statistics

Note. A suggested book on statistics is Hogg and Craig.

5.1 The definition of probability

Definition 5.1.1. Consider an experiment of such a nature that the outcome cannot be predicted with certainty, but the set of all possible outcomes S can be described prior to performing the experiment. Furthermore, we assume the experiment can be repeated arbitrarily often under identical conditions. Such an experiment is called a random experiment, and the set of outcomes S we call the sample space.

Definition 5.1.2. Given a random experiment, a set C of subsets C ⊂ S is called a set of classes or events if
1. S ∈ C, and ∀C ∈ C, C ⊂ S;
2. C_1, C_2 ∈ C ⟹ C_1 ∪ C_2 ∈ C;
3. C ∈ C ⟹ S∖C ∈ C.

Definition 5.1.3. We call two events C_1, C_2 mutually exclusive if C_1 ∩ C_2 = ∅.

Note 5.1.1. The above definition guarantees that ∅ = S∖S ∈ C.

Note 5.1.2. C is in general a subset of the power set of S. Usually it is the whole power set.

Definition 5.1.4. A function P: C → ℝ is called a probability set function, or probability, or statistical weight if
1. P(C) ≥ 0, ∀C ∈ C;
2. if C_1, C_2 ∈ C are mutually exclusive, then P(C_1 ∪ C_2) = P(C_1) + P(C_2);
3. P(S) = 1.

Proposition 5.1.1. P(C) = 1 − P(C′), where C′ ≡ S∖C.

Proof. For all C ∈ C, C ∩ C′ = ∅ and S = C ∪ C′. Then P(S) = 1 = P(C) + P(C′), so P(C) = 1 − P(C′). □

Corollary 5.1.1. P(∅) = 0.

Proposition 5.1.2. Let C_1, C_2 ∈ C and C_1 ⊂ C_2. Then P(C_1) ≤ P(C_2).

Proof. We may write C_2 = C_1 ∪ (C_1′ ∩ C_2). Furthermore, C_1 ∩ (C_1′ ∩ C_2) = ∅. So P(C_2) = P(C_1) + P(C_1′ ∩ C_2) ≥ P(C_1). □

Corollary 5.1.2. 0 ≤ P(C) ≤ 1.

Proposition 5.1.3. P(C_1 ∪ C_2) = P(C_1) + P(C_2) − P(C_1 ∩ C_2), ∀ C_1, C_2 ∈ C.

Proof. Observe: C_1 ∪ C_2 = C_1 ∪ (C_1′ ∩ C_2) and C_2 = (C_1 ∩ C_2) ∪ (C_1′ ∩ C_2), where C_1 ∩ (C_1′ ∩ C_2) = ∅ and (C_1 ∩ C_2) ∩ (C_1′ ∩ C_2) = ∅. This tells us that P(C_1 ∪ C_2) = P(C_1) + P(C_1′ ∩ C_2) and P(C_2) = P(C_1 ∩ C_2) + P(C_1′ ∩ C_2), so we get what we wanted. □

Examples:
1. Casting a fair die. In this case S = {1, 2, 3, 4, 5, 6} and C = {C_a | C_a ⊂ S}. Furthermore, P(C) = |C|/6.
2. Tossing two distinguishable coins. S = {(H,H), (H,T), (T,H), (T,T)}, C is defined as before, and P(C) = |C|/4.

Note. P(C) is a function which we assign based on reasonable assumptions.
Often, defining P(C) as the size of C divided by the size of S is reasonable. This is called the assumption of equal a priori probabilities.

Definition 5.1.5. The conditional probability of an event C_2 given the event C_1 is denoted by P(C_2|C_1). This is the probability that, given that the event C_1 has occurred, C_2 will occur. The conditional probability is given by

    P(C_2|C_1) = P(C_1 ∩ C_2)/P(C_1).

Note 5.1.3. In general, P(C_1 ∩ C_2) = P(C_1) P(C_2|C_1).

Definition 5.1.6. Two events are called statistically independent if P(C_2|C_1) = P(C_2).

Note 5.1.4. For statistically independent events C_1 and C_2, P(C_1 ∩ C_2) = P(C_1) P(C_2).
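The product rule P(C_1 ∩ C_2) = P(C_1) P(C_2|C_1) can be verified by brute-force enumeration on a small sample space; the bowl-of-chips example below (3 red, 5 blue, two ordered draws without replacement) is convenient:

```python
from fractions import Fraction
from itertools import permutations

# 8 distinguishable chips: indices 0-2 are red, 3-7 are blue.
chips = ['r'] * 3 + ['b'] * 5
outcomes = list(permutations(range(8), 2))   # ordered draws of two chips
prob = Fraction(1, len(outcomes))            # equal a priori probabilities

C1 = [o for o in outcomes if chips[o[0]] == 'r']        # first draw is red
C12 = [o for o in C1 if chips[o[1]] == 'b']             # red, then blue

P_C1 = prob * len(C1)
P_C12 = prob * len(C12)
P_C2_given_C1 = P_C12 / P_C1

assert P_C1 == Fraction(3, 8)
assert P_C2_given_C1 == Fraction(5, 7)
assert P_C12 == P_C1 * P_C2_given_C1 == Fraction(15, 56)
```

Using exact rational arithmetic (`fractions.Fraction`) makes the check an identity rather than a floating-point approximation.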

Example: One chip, two chip, red chip, blue chip. A bowl contains 3 red chips and 5 blue chips; what is the probability of pulling first a red chip and then a blue one? Assuming equal a priori probabilities, the probability of drawing a red chip first is P(C_1) = 3/8. The probability of drawing a blue chip after having drawn a red one is P(C_2|C_1) = 5/7. Finally, the probability of both these events occurring is P(C_1 ∩ C_2) = (3/8)(5/7) = 15/56.

Theorem 5.1.1. The conditional probability is a probability in the sense of Definition 5.1.4, with C_1 as the sample space.

Proof. There are a number of conditions in Definition 5.1.4 we must check for given C_1:
1. P(C_2|C_1) = P(C_1 ∩ C_2)/P(C_1) ≥ 0;
2. P(C_1|C_1) = P(C_1 ∩ C_1)/P(C_1) = 1;
and additivity over mutually exclusive events follows from that of P. □

5.2 Discrete random variables

Definition 5.2.1. Let S be a sample space. Let X: S → T ⊂ ℝ be a mapping from the sample space into the real numbers. Then we call X a random variable. Given an event C, the probability of A ≡ X(C) is P(A) ≡ P_r(A) ≡ P_X(A) ≡ P(C).

Definition 5.2.2. Let X: S → X(S) ≡ T be a random variable. Let C be an event with image A = X(C) under X. Furthermore, let T be a countable, possibly finite, subset of ℝ. If a function ρ: T → [0, 1] exists such that

    P_r(A) = Σ_{x∈A} ρ(x),

then X is called a discrete random variable, and ρ is called the probability density function (pdf) or the distribution of the random variable X.

Note 5.2.1. ρ(x) is the probability of the event C that gets mapped onto the number x by the function X. One often says (sloppily) that ρ(x) is the probability that the random variable X has the value x. Of course, it must be the case that Σ_{x∈T} ρ(x) = 1.

Example: independent coin tosses. In this case S consists of all sequences of tosses ending in the first heads: {h, th, tth, ...}. Let X be the number of tosses needed to get the first h. The space T = X(S) is ℕ. We want to know the probability P_r(X = x), where x is the number of tosses up to and including the first heads. What is P_r(X = 1)? Obviously it is 1/2. Also, P_r(X = 2) = 1/4. Clearly, P_r(X = x) = 1/2^x. So we now know our probability function: ρ(x) = 1/2^x.
Note 5.2.2. This pdf is indeed normalized, since Σ_{n=1}^{∞} 1/2ⁿ = 1.

Example: manufacturing. Consider a lot of 100 manufactured parts which contains 20 bad parts. Suppose we inspect the lot by testing 5 randomly chosen parts; the tested parts are not replaced. Our random variable X will now be the number of defective parts that show up in the test:

    ρ(x) = P_r(X = x) = C(20, x) C(80, 5 − x) / C(100, 5),

where C(n, k) denotes the binomial coefficient "n choose k".

Definition 5.2.3. The integrated distribution or cumulative distribution of our random variable X, denoted F_X, is defined as

    F_X(x) ≡ P_r(X ≤ x) = Σ_{y≤x} ρ(y).

Note 5.2.3. If we define A = {y ∈ T : y ≤ x}, then clearly F_X(x) = P_r(A) = P(C_{X∈A}). So F_X is indeed a probability.
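The manufacturing pdf and its cumulative distribution can be checked exactly; the sketch below assumes the lot sizes above (100 parts, 20 defective, 5 tested):

```python
from math import comb
from fractions import Fraction

def rho(x, bad=20, good=80, draws=5):
    """Hypergeometric pdf: x defectives among `draws` tested parts."""
    return Fraction(comb(bad, x) * comb(good, draws - x),
                    comb(bad + good, draws))

# normalization: the pdf sums to 1 over x = 0..5
assert sum(rho(x) for x in range(6)) == 1

def F(x):
    """Cumulative distribution F_X(x) = P(X <= x)."""
    return sum(rho(y) for y in range(x + 1))

assert F(0) == rho(0)     # F starts at rho(0)...
assert F(5) == 1          # ...and reaches 1, so F_X is indeed a probability
```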

5.3 Continuous Random Variables

Definition 5.3.1. Let X be a random variable. We call X a continuous random variable if T = X(S) is a union of intervals and, for C an event and A the image of the event under X, a function ρ: T → ℝ exists such that P_r(A) is equal to the integral of ρ(x) over all x ∈ A. We call ρ the pdf.

Note 5.3.1. Take note of a few facts:
- ∫_T ρ(x) dx = 1;
- ρ(x) is not a probability; it is a probability density. The value ρ(x)dx is the probability that the random variable X has a value between x and x + dx: ρ(x)dx = P_r(x ≤ X ≤ x + dx). In particular, P_r(X = x) = 0 for a continuous random variable.

Example: Manufacturing. Suppose a factory manufactures resistors with some nominal resistance R_0. The actual resistances are distributed according to some continuous pdf ρ(R), which is hopefully strongly centered around R_0.

Note 5.3.2. We can recover the discrete case from the continuous one by allowing ρ(x) to be a sum of delta functions.

5.4 Averages and Fluctuations

Let X: S → ℝ be a random variable with pdf ρ, and let f: ℝ → ℝ. Define the function Y = f ∘ X: S → ℝ (often written f(X)). Note that Y is a random variable.

Definition 5.4.1. The value

    ⟨Y⟩ ≡ E(Y) ≡ ∫_T f(x) ρ(x) dx      (or Σ_{x∈T} f(x) ρ(x) in the discrete case)

is called the mathematical expectation, or expected value, or expectation value, or the average, or the mean of the random variable Y.

Note 5.4.1. Averages are defined with respect to a particular distribution! Usually when averages are computed, some reasonable assumption is made about the distribution, but the distributions themselves are almost never known in practice.

Note 5.4.2. Suppose that we choose the function f(x) = xⁿ. Then the expectation value ⟨Xⁿ⟩ is called the n-th moment of ρ. Of course, the zeroth moment is just 1, by normalization.

Definition 5.4.2. Define ΔX ≡ ⟨(X − ⟨X⟩)²⟩^{1/2}, which is called the root mean square (rms) deviation or standard deviation. Standard notation is σ.
Definition 5.4.3. The generating function of a pdf ρ(x) is defined as

    Z(t) = ∫ dx e^{tx} ρ(x).

Note 5.4.3. This is just the Laplace transformation of ρ(x).

Note 5.4.4. The nth moment of a pdf can be expressed as

    ⟨Xⁿ⟩ = (dⁿ/dtⁿ) Z(t) |_{t=0} = ∫ dx xⁿ ρ(x).

Definition 5.4.4. The characteristic function of a pdf is defined as

    f(k) = Z(ik) = ∫ dx e^{ikx} ρ(x) = ⟨e^{ikX}⟩.

Note 5.4.5. The characteristic function is the Fourier transform of ρ(x).

Note 5.4.6. Of course, all of this can be applied to the discrete case by replacing integrals with sums, or by using a pdf which is a sum of delta functions.
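The moment-from-derivative relation can be sketched numerically for a Bernoulli variable (a hypothetical p = 0.3), whose generating function is Z(t) = (1 − p) + p e^t, using finite differences for the derivatives at t = 0:

```python
import math

p = 0.3
def Z(t):
    """Generating function of a Bernoulli pdf: <e^{tX}> = (1-p) + p e^t."""
    return (1 - p) + p * math.exp(t)

h = 1e-4
first = (Z(h) - Z(-h)) / (2 * h)              # Z'(0)  = <X>   = p
second = (Z(h) - 2 * Z(0) + Z(-h)) / h**2     # Z''(0) = <X^2> = p (X^2 = X)

assert abs(first - p) < 1e-7
assert abs(second - p) < 1e-6
```

Here both moments equal p because X takes only the values 0 and 1, so Xⁿ = X for n ≥ 1.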

5.5 The Binomial Distribution

Definition 5.5.1. An experiment is called a Bernoulli experiment or a Bernoulli trial when it is a random experiment with exactly two possible outcomes.

Note 5.5.1. The classic example is a coin toss. Any boolean variable can take the place of the outcome.

Proposition 5.5.1. Consider a Bernoulli experiment with a sample set S = {true, false}. Consider a random variable X such that X(true) = 1 and X(false) = 0. Let P(true) = P_r(1) = p, with 0 ≤ p ≤ 1. Then the pdf for X takes the form

    ρ_b(x) = p^x (1 − p)^{1−x},    (x = 0, 1).

Proof. The proof is left to the reader. □

Note 5.5.2. The pdf in the above proposition is called a Bernoulli distribution. The expected value of X is

    E(X) = Σ_{x=0,1} x p^x (1 − p)^{1−x} = p.

The variance is, similarly,

    σ² = Σ_{x=0,1} (x − p)² p^x (1 − p)^{1−x} = p(1 − p).

Often it is the case that a Bernoulli experiment is conducted many times over. In this case:

Definition 5.5.2. A sequence of Bernoulli experiments is a sequence of n Bernoulli experiments that are performed independently, so that P(true) = p for each experiment.

Proposition 5.5.2. Consider a sequence of Bernoulli experiments with sample space S of all ordered n-tuples of true and false. Consider the random variable X such that X(C) = (number of trues in C). Then the pdf for X is

    ρ_B(x) = C(n, x) p^x (1 − p)^{n−x}.

Proof. Obvious. □

Note 5.5.3. This pdf is called a binomial distribution. Let us check that it is normalized:

    Σ_{x=0}^{n} C(n, x) p^x (1 − p)^{n−x} = (p + 1 − p)^n = 1,

where the second-to-last step is given by the binomial theorem. Now, the expected value:

    E(X) = Σ_{x=0}^{n} x C(n, x) p^x (1 − p)^{n−x}
         = Σ_{x=1}^{n} (n!/((x−1)!(n−x)!)) p^x (1 − p)^{n−x}
         = np Σ_{x=1}^{n} C(n−1, x−1) p^{x−1} (1 − p)^{n−x} = np.

Finally, the variance is σ² = np(1 − p), and the relative fluctuation is σ²/E(X)² = (1 − p)/np.

Note 5.5.4. The relative fluctuation goes to zero as n → ∞. This is an example of the law of large numbers, which states that (usually) fluctuations go to zero in the large-n limit.

Note 5.5.5. An equivalent problem: suppose you have a string of boxes, each of which can be either occupied or empty.
Say we have n sites, each occupied by either 0 or 1 indistinguishable objects. The question is: what is the probability of finding x of our indistinguishable objects on the n sites if on average ⟨x⟩ = m? The pdf ρ(x) for this problem is given by the binomial distribution with p = m/n.
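The binomial moments E(X) = np and σ² = np(1 − p) can be checked by direct summation (n and p below are hypothetical illustrative values):

```python
from math import comb

n, p = 10, 0.3
rho = [comb(n, x) * p**x * (1 - p)**(n - x) for x in range(n + 1)]

# normalization via the binomial theorem: sum = (p + 1 - p)^n = 1
assert abs(sum(rho) - 1.0) < 1e-12

mean = sum(x * rho[x] for x in range(n + 1))
var = sum((x - mean)**2 * rho[x] for x in range(n + 1))

assert abs(mean - n * p) < 1e-9                 # E(X) = np
assert abs(var - n * p * (1 - p)) < 1e-9        # sigma^2 = np(1-p)
```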

5.6 The Gaussian distribution

Note 5.6.1. One rarely encounters a truly Gaussian experiment, but the Gaussian often appears in limiting cases or idealizations.

Definition 5.6.1. Let X be a random variable with pdf

    ρ_g(x) = (1/(√(2π) σ)) e^{−x²/2σ²}.

This is called a Gaussian or normal pdf.

Note 5.6.2. A mathematician colleague of the lecturer once told him that if a mathematician calls something normal, you can be sure that it is extremely rare and almost never happens.

Note 5.6.3. E(X) = 0, but this can be generalized by shifting x → x − x_0, and then E(X) = x_0. Of course, the σ in the definition is indeed the standard deviation; just do the integral.

Proposition 5.6.1. The nth moment of ρ_g is 0 if n is odd, and

    ⟨Xⁿ⟩ = n! σⁿ / (2^{n/2} (n/2)!)

if n is even.

Proof. Proof is left to the reader. □

Proposition 5.6.2. The characteristic function of ρ_g is f_g(k) = e^{−k²σ²/2}.

Proof. Proof is left to the reader. □

5.7 The Central Limit Theorem

Note 5.7.1. Professor Belitz believes that this is one of the most amazing results in all of mathematics.

Theorem 5.7.1. Suppose we have N independent random variables {X_i | i = 1, ..., N} with pdfs such that ⟨X_i⟩ = 0 and ⟨X_i²⟩ < ∞. Let Y = (1/N) Σ_{i=1}^{N} X_i, the arithmetic mean of the X_i. Then, for N → ∞, Y has a Gaussian pdf with a standard deviation σ ∝ 1/√N. In fact,

    ρ(y) = (1/(√(2π) σ)) e^{−y²/2σ²},    σ² = (1/N²) Σ_{i=1}^{N} ⟨X_i²⟩.

Note 5.7.2. Note that this is not the law of large numbers. The law of large numbers says that for large numbers of random experiments, or trials, the pdf is sharply peaked about the mean. This, on the other hand, says that the pdf of the arithmetic mean of many random variables limits to a Gaussian distribution.

Note 5.7.3. What is truly remarkable is the generality of this result. In particular, it works for any well-defined pdf of the X_i.

Note 5.7.4. The generalization to nonzero averages is obvious: we must just shift the Gaussian to have its peak at the arithmetic mean of the averages, and replace the mean square by the mean square deviation.

Proof.
We present more of a plausibility argument than a proof. Consider the characteristic function of Y:

    ⟨e^{ikY}⟩ = ⟨e^{i(k/N) Σ_{j=1}^{N} X_j}⟩ = Π_{j=1}^{N} ⟨e^{ikX_j/N}⟩      (by their independence)
              = exp( Σ_{j=1}^{N} log ⟨e^{ikX_j/N}⟩ ) ≡ exp( Σ_{j=1}^{N} A_j(k/N) ),    A_j(p) ≡ log ⟨e^{ipX_j}⟩.

For N → ∞, expand the A_j:

    A_j(k/N) = log( 1 + (ik/N)⟨X_j⟩ − (k²/2N²)⟨X_j²⟩ + O(1/N³) )
             = log( 1 − (k²/2N²)⟨X_j²⟩ + O(1/N³) )      (because we assume ⟨X_j⟩ = 0)
             = −(k²/2) ⟨X_j²⟩/N² + O(1/N³).

Now we have

    ⟨e^{ikY}⟩ = exp( −(k²/2)(1/N²) Σ_{j=1}^{N} ⟨X_j²⟩ ) = e^{−k²σ²/2},    σ² ≡ (1/N²) Σ_{j=1}^{N} ⟨X_j²⟩.
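The limiting behavior can also be checked by simulation; a sketch with hypothetical parameters, using uniform variables on (−1/2, 1/2) (variance 1/12) as the X_i:

```python
import math
import random
import statistics

random.seed(0)
N, samples = 100, 2000
var_x = 1.0 / 12.0                  # variance of uniform(-1/2, 1/2)

# Y = arithmetic mean of N zero-mean uniform variables, sampled many times
ys = [sum(random.uniform(-0.5, 0.5) for _ in range(N)) / N
      for _ in range(samples)]

sigma_pred = math.sqrt(var_x / N)   # sigma^2 = (1/N^2) sum <X_j^2> = <X^2>/N
sigma_obs = statistics.pstdev(ys)
assert abs(sigma_obs - sigma_pred) / sigma_pred < 0.15

# roughly 68% of samples fall within one sigma, as for a Gaussian
frac = sum(abs(y) < sigma_pred for y in ys) / samples
assert 0.60 < frac < 0.76
```

The tolerances are loose because this is a statistical check; with the fixed seed the assertions are deterministic.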

Now, recall Note 5.4.5: the characteristic function is the Fourier transform of the pdf, so inverting it we can write

    ρ(y) = (1/(√(2π) σ)) e^{−y²/2σ²}. □

Note 5.7.5. One big hole is the fact that we have not checked issues at higher order in the expansion of the A_j.

Example: Bernoulli Trials. ρ(x_i) = p^{x_i}(1 − p)^{1−x_i}. Consider Y = Σ_{i=1}^{n} X_i. Then ⟨X_i⟩ = p and ⟨(X_i − ⟨X⟩)²⟩ = p(1 − p) < ∞, so the central limit theorem applies. So, as n → ∞, Y/n has a Gaussian pdf,

    ρ(y/n) = (1/(√(2π) σ̃)) e^{−(y/n − p)²/2σ̃²},

with σ̃² = (1/n²) Σ_i ⟨(X_i − ⟨X_i⟩)²⟩ = p(1 − p)/n. Changing variables back to y, the variance is σ² = n²σ̃² = np(1 − p), and we get ρ(y) = (1/n) ρ(y/n) ≈ ρ_B(y): the binomial distribution approaches a Gaussian for large n.

6 Review of Thermodynamics

Recall that in Chapter 1 we saw that, in any fathomable practical terms, large numbers of particles cannot be described deterministically by solving equations of motion. This is true both classically and quantum mechanically. The solution to this issue is that we must adopt a statistical description. In this way, we distinguish between the microscopic state of a system and the macroscopic state of a system. Classically, the microstate is a particular point in phase space; quantum mechanically, we mean a particular wave function. The macrostate is specified by the values of certain macroscopic parameters (energy, temperature, pressure, ...).

Definition 6.1.1. For a given energy E, the "number" of microstates whose energy is less than or equal to E is denoted by Ω(E) and is called the integrated density of states (DOS); this is also called the total number of states.

Note 6.1.1. "Number" of microstates is in quotes because there are in general uncountably infinitely many microstates in a classical system. Really we mean the volume in phase space when speaking of classical systems, or the number of stationary states in quantum mechanics.

Definition 6.1.2. ω(E) ≡ dΩ/dE is called the density of states.

Note 6.1.2. ω has units of 1/E quantum mechanically.

Definition 6.1.3. For a system whose energy is constrained to an interval [E − ΔE, E] in energy space, the number of microstates with energies in that interval, denoted by Ω_{ΔE}(E), is called the number of accessible states.
Note 6.1.3. For macroscopic systems, any reasonable choice of the interval ΔE (that is, small, but not too small given the system) results in Ω(E) = Ω_{ΔE}(E) to tremendous accuracy. As a result, we often don't need to distinguish between the two.

Note 6.1.4. For large systems, Ω increases exponentially with N. In fact,

    Ω(E) = e^{Nφ(E/N)},    with N ≫ 1,    φ(E/N) = O(1),    φ′ > 0,    φ″ < 0.

As a result, we have a few fun facts:

    ω(E) = dΩ/dE = φ′(E/N) e^{Nφ(E/N)}

and

    ω′(E) = ( φ″(E/N)/N + (φ′(E/N))² ) e^{Nφ(E/N)} ≈ (φ′)² e^{Nφ} > 0    for N → ∞.

For a fixed E/N, Ω and ω are rapidly growing functions of E. You can find further discussion in Reif's book. The purpose of discussing this is to determine the pdfs of observables in a system with certain macroscopically defined conditions.

6.1 The Equilibrium State

Definition 6.1.4. We call a system isolated if it cannot exchange energy with its surroundings, except for the infinitesimal perturbations caused by the butterfly effect.

Note 6.1.5. A truly isolated system would behave very differently from the realistic isolated system above.

Definition 6.1.5. If the probability of finding a system in a particular microstate does not change with time, we say the system is in thermodynamic equilibrium.

Note 6.1.6. In equilibrium, all pdfs are time independent and therefore all average values of observables are time independent. However, the values themselves still fluctuate. In particular, equilibrium is not a static state!

We can now formulate two postulates underlying classical thermodynamics.

Postulate 1 (The postulate of equal a priori probabilities): An isolated system in equilibrium is equally likely to be in any of its accessible microstates.

Postulate 2 (Approach to equilibrium): An isolated system that is not in equilibrium will approach equilibrium if left undisturbed for a sufficiently long time.

Note 6.1.7. Postulate 2 can be shown to be false in the case of a truly (unrealistically) isolated system.

6.2 Interacting Systems

Consider two systems (1 and 2) in contact with one another but isolated from the surrounding environment. They can exchange energy in the form of heat Q and/or work W. The energy of each subsystem is not fixed, but the total energy is.

Definition 6.2.1. The average energy of a system that is a subsystem of a larger isolated system is called the internal energy and is denoted by U ≡ ⟨E⟩.

Note 6.2.1. If a system absorbs heat Q and does work W, then its internal energy changes by ΔU = Q − W. If this change is infinitesimal, then we have

    dU = δQ − δW,

where dU is an exact differential while δQ and δW are not! As a result, an energy change ΔU depends only on the system's initial and final states, while Q and W depend on the process by which the state was reached.
6.3 Reversible and Irreversible Processes

The concepts of reversible and irreversible processes are an idealization, but useful nonetheless.

Note. An example of a reversible process is the working of a car engine, and an example of an irreversible process is the expansion of a gas to fill a container.

Definition. The following provides a definition of reversible processes.

1. A process that is infinitely slow, so that the system remains in equilibrium at all times, is called quasistatic.
2. A process that does not involve any heat transfer is adiabatic.
3. A process that is both adiabatic and quasistatic is called reversible.

Note. Let the numbers of accessible states for the initial and final state be Ω_i and Ω_f. It can be shown that Ω_i = Ω_f for a reversible process.

Definition. Let X be an external parameter and let E_n(X) be the energy of the n-th microstate as a function of X. If the parameter X is changed quasistatically, the system will do work δW = −(∂E_n/∂X) dX ≡ F dX. F is the (generalized) force conjugate to X.

Example: Volume and Pressure

If the system's volume changes by dV then the system does work δW = p dV, where p = −∂E_n/∂V is the force conjugate to the volume, called pressure.

As one might imagine, when a process is not reversible, we call it irreversible.

Definition. A process that increases the number of accessible states is called irreversible.

Note. The final state of an irreversible process is more probable, and so the system will not spontaneously return to the initial state.

Example: Free Expansion of a Gas

The process of free expansion of a gas into vacuum is irreversible.

6.4 Energy, Temperature and Entropy

To define these concepts it is useful to think of two systems in thermal contact, one of which is small (1) and the other of which is large (2). Let the energies of these systems be E_1 and E_2 with total energy E_tot = E_1 + E_2. Let the numbers of accessible states be Ω_1 and Ω_2. Suppose we pick E_1 = E. What is the total number of accessible states available to the system? Well, Ω_tot(E) = Ω_1(E) Ω_2(E_tot − E). This gives the total number of accessible states for the whole system at fixed E_1 = E.

Note. The total number of accessible states for the whole system is Ω_tot = Σ_E Ω_tot(E), and the probability for the system to be in a state where E_1 = E is clearly Ω_tot(E)/Ω_tot.

Note. Recall that we remarked that Ω_tot(E) is very sharply peaked about the average energy ⟨E_1⟩ = U, the internal energy of system 1.

Definition. The entropy of a system is S = k_B log(Ω(E)). Recall that k_B = 1.38 × 10⁻¹⁶ erg/K.

Note. The entropy encodes the probability of a given macrostate. A few properties: entropy is additive, and in any process that takes an isolated system from one macrostate to another, the entropy change is non-negative. Now we can define a reversible process as one in which the entropy does not change.

Definition. The temperature T of a system is defined by the equation 1/T = ∂S/∂E |_{E=U}.
Definition. β ≡ 1/(k_B T).

Note. Equivalently, 1/T = k_B ∂log(Ω)/∂E |_{E=U}.

Note. How does this definition of temperature relate to our experience of temperature? Well, historically this is interesting. It took a long time for people to understand heat and what it is. It was eventually realized that heat is due to the microscopic motion of particles; this makes some sense, and the connection to our definition can be made explicit, though it is not at all obvious. As the course goes on, this connection will be at least partially made.

Note. For two systems in thermal contact, the equilibrium state is given by the requirement that the number of accessible states is maximized. This is just the product of the numbers of accessible states. This is to say that Ω_1(E_1) Ω_2(E_tot − E_1) is maximal, but then S_1(E_1) + S_2(E_tot − E_1) is maximal. Differentiating this, we see that the temperatures of two systems in thermal equilibrium are the same!

Note. If Ω(E, X) (X a tunable parameter) is the number of accessible states, then the generalized force F conjugate to X is given by F = (1/β) ∂log(Ω)/∂X. Note that this is really the same as our earlier definition, F = −⟨∂E_n/∂X⟩.

Note. The relation between T and the external parameters and their conjugate forces, T = T(X, F), is called the system's equation of state.

Note. Knowledge of Ω(E, X), or equivalently knowledge of the entropy, is sufficient for calculating all thermodynamic properties of the system:

β = ∂log(Ω)/∂E |_{E=U},  1/T = ∂S/∂E |_{E=U},  F = (1/β) ∂log(Ω)/∂X |_{X=X̄} = T ∂S/∂X |_{X=X̄}.
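The equal-temperature condition can be checked by brute force in a toy model. The model below is an assumption for illustration (it is not in the notes): two Einstein solids with N1 and N2 oscillators sharing q_tot quanta, where one solid's multiplicity is Ω(N, q) = C(q + N − 1, q). Maximizing Ω_1(q_1) Ω_2(q_tot − q_1) over the energy split puts the peak where the energy per oscillator, and hence the temperature, matches.

```python
from math import comb

# Toy model (an assumption): two Einstein solids sharing q_tot energy quanta.
# Multiplicity of a single solid with N oscillators and q quanta:
def omega(N, q):
    return comb(q + N - 1, q)

N1, N2, q_tot = 300, 600, 900

# Total multiplicity Omega_1(q1) * Omega_2(q_tot - q1) as the split q1 varies
omega_tot = [omega(N1, q1) * omega(N2, q_tot - q1) for q1 in range(q_tot + 1)]
q1_star = max(range(q_tot + 1), key=lambda q: omega_tot[q])

# The maximum sits where q1/N1 ~ q2/N2 (equal energy per oscillator, i.e.
# equal temperature), so q1_star lands very close to 300 here.
print(q1_star, q_tot - q1_star)
```

Up to discreteness, the most probable split gives each solid one quantum per oscillator, exactly the statement that ∂S_1/∂E_1 = ∂S_2/∂E_2 at the maximum.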

6.5 The Laws of Thermodynamics

1. The First Law of Thermodynamics: For an isolated system U is a constant. If a system is brought from one state to another by means of a process in which the system does work W and absorbs heat Q, then the internal energy changes by an amount ΔU = Q − W.

Note. The first law expresses energy conservation.

2. The Second Law of Thermodynamics: For any process that takes an isolated system from one macrostate to another, ΔS ≥ 0. Any quasistatic process by which the system absorbs an infinitesimal amount of heat δQ leads to a change in entropy dS = δQ/T.

Note. dS is an exact differential while δQ is not; the inverse temperature is an integrating factor.

Note. This law forbids perpetual motion machines of the second kind.

Note. The first and second law together imply that dU = T dS − δW; of course, if the work is purely mechanical, then δW = p dV and dU = T dS − p dV.

3. The Third Law of Thermodynamics: S(T → 0) → 0.

Note. This law took until the early 20th century to be formulated; part of the reason for this is because it is not true classically, only quantum mechanically. Dropping the distinction between Ω(E) and Ω_{δE}(E) (the classical limit) breaks the third law, since S must not depend on the arbitrary quantity δE. More generally, entropy cannot be consistently defined in a purely classical context. This will be discussed further when we come to the Gibbs paradox.

6.6 Thermodynamic Potentials

Definition. In addition to the internal energy U, one defines the enthalpy H ≡ U + pV, the Helmholtz free energy F ≡ U − TS, and the Gibbs free energy G ≡ H − TS.

Note. These are all Legendre transforms of the internal energy. From the definitions we see that the differentials obey the following relations:

dH = T dS + V dP,  dF = −S dT − P dV,  dG = −S dT + V dP.

Note. These relations imply that the temperature can be obtained by T = ∂H/∂S |_P or T = ∂U/∂S |_V. The pressure can be found from P = −∂F/∂V |_T.
There are also relations known as Maxwell relations, for instance ∂T/∂V |_S = −∂P/∂S |_V and ∂S/∂V |_T = ∂P/∂T |_V, among others.
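A Maxwell relation can be sanity-checked numerically. The free energy below is an assumed ideal-gas-like form F(T, V) = −N k T (ln V + 3/2 ln T), with additive constants dropped; it is only a stand-in to illustrate that (∂S/∂V)_T = (∂P/∂T)_V holds because both are mixed second derivatives of F.

```python
import math

# Assumed toy free energy (ideal-gas-like, constants dropped), in units N = k = 1.
N, k = 1.0, 1.0

def F(T, V):
    return -N * k * T * (math.log(V) + 1.5 * math.log(T))

h = 1e-4
S = lambda T, V: -(F(T + h, V) - F(T - h, V)) / (2 * h)  # S = -(dF/dT)_V
P = lambda T, V: -(F(T, V + h) - F(T, V - h)) / (2 * h)  # P = -(dF/dV)_T

T0, V0 = 2.0, 3.0
dS_dV = (S(T0, V0 + h) - S(T0, V0 - h)) / (2 * h)
dP_dT = (P(T0 + h, V0) - P(T0 - h, V0)) / (2 * h)
print(dS_dV, dP_dT)  # both approximate N k / V0 = 1/3
```

For this F one has P = N k T / V, so both sides equal N k / V₀; the finite-difference values agree to several digits.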

Example: The Ideal Classical Gas

An ideal gas with N particles has the following properties. The equation of state is pV = N k_B T. The internal energy is U = (3N/2) k_B T. The specific heat per mole is c_V = 3R/2 at fixed volume or c_P = 5R/2 at fixed pressure (this is valid for a monoatomic gas only). A quasistatic change from a state with an initial temperature T_i and initial volume V_i to a final temperature T_f and final volume V_f leads to a change in entropy given by ΔS = ν c_V log(T_f/T_i) + ν R log(V_f/V_i), where ν is the number of moles. To get all this, all we need is the number of accessible states, which can be derived for a classical ideal gas and turns out to be Ω(E) = V^N χ(E), where χ is a function of E only.

7 Statistical Ensembles

This may be review; often this is covered in an undergraduate course.

7.1 Gibbsian Ensembles

Consider a thermodynamic system in some particular macrostate.

Definition 7.1.1. A statistical ensemble consists of many identical systems ("identical" means the same number of particles governed by the same equations of motion). Let us assume that these identical systems are in the same macrostate (but perhaps different microstates).

Note. The value of this definition is that it makes averaging conceptually easier: rather than following the time evolution of a single system and averaging over long time periods, we can take the average over all these systems, and it is the same as the average of one system over a long period of time.

Example: Dice

Suppose we want to find, empirically, the probability of rolling a 6. We can do this in two ways: either we roll one die many times (a long-time average), or we roll many dice once and count the number of sixes (an ensemble of dice).

Note. This rests on a fundamental assumption, namely that the laws of physics are time-invariant!

Note. We need to find the probability of an ensemble member being in a particular microstate; this depends on the physical system.
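The dice example can be sketched as a tiny simulation: a long-time average for one die versus an ensemble average over many dice, each rolled once. Both estimates converge to 1/6.

```python
import random

# Long-time average (one die, many rolls) vs ensemble average (many dice, one roll).
random.seed(0)  # fixed seed so the run is reproducible

n = 60000
time_avg = sum(random.randint(1, 6) == 6 for _ in range(n)) / n      # one die, n rolls
ensemble_avg = sum(random.randint(1, 6) == 6 for _ in range(n)) / n  # n dice, one roll each

print(time_avg, ensemble_avg)  # both close to 1/6 ~ 0.1667
```

The two averages agree to within the statistical noise O(1/√n), which is the content of the ensemble assumption for this trivially ergodic system.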
There are a few canonical setups: the microcanonical ensemble, the canonical ensemble, and the grand canonical ensemble.

7.2 The Microcanonical Ensemble

This is a system which is thermally and chemically isolated from the environment. Consider an isolated system in equilibrium whose microstates have certain energies E(p, q) (classically) or E_n (quantum mechanically, with n a set of quantum numbers).

Definition. A macrostate is defined by demanding that the total energy of the system lies in an interval [E − δE, E]. Given our postulates, we can immediately write down the probability for the system to be in a given microstate.

Theorem. The probability for the system to be in a microstate with energy E(p, q) (or E_n) is

ρ(p, q) = { 1/V_E if E(p, q) ∈ [E − δE, E]; 0 else }  (6)

where V_E = ∫ dΓ is the volume of phase space filled by the energetically allowed configurations. Or, quantum mechanically,

ρ(E_n) = { 1/Ω if E_n ∈ [E − δE, E]; 0 else }  (7)

where Ω is the number of accessible microstates.

Proof. This follows trivially from the postulate of equal a priori probabilities.

Note. Note that the normalization indeed works, and we are assuming non-degenerate energy levels. The fix for degenerate energy levels is straightforward.

Definition. An ensemble with a pdf like the one above defines a microcanonical ensemble.

Note. This ensemble is very hard to work with; in particular, the definition of temperature is not clear.

7.3 The Canonical Ensemble

Definition. A heat bath is a system large enough that any heat absorbed from or given off to the system is negligible compared to its total internal energy.

Definition. The canonical partition function is defined quantum mechanically as

Z = Σ_n exp(−βE_n)  (8)

or classically, for 3N degrees of freedom,

Z = (1/(N! h^{3N})) ∫_Γ dΓ exp(−βE(Γ))  (9)

where Γ denotes phase space.

Note. Z is dimensionless; in particular, the factor of h^{3N} can be justified in the classical definition for dimensional reasons.

Note. The uncertainty principle tells us that we cannot know position and momentum arbitrarily accurately. For this reason, we cannot consider phase space to be a true continuum; it can only be broken into cells of size at least ℏ/2. This justifies to some extent the factor of h in the classical definition.

Note. More generally we can write

Z = ∫ dE ω(E) exp(−βE)  (10)

where ω(E) is the density of states. In the quantum case, ω becomes discrete.

Note. Recall that ω(E) = dΩ/dE, so in the classical case the integral above becomes an integral over states, ∫ dΩ.

Note. Originally the partition function was defined by Boltzmann, who called it (in German) the "sum over states" (Zustandssumme), which explains the origin of the label Z.
Theorem. For a system in thermal contact with a heat bath, the probability of being in a state with energy E_n (or the probability density of having energy E(q, p)) is

ρ_n = (1/Z) exp(−βE_n)  (11)

for a quantum system, and

ρ(q, p) = (1/(N! h^{3N} Z)) exp(−βE(q, p))  (12)

for a classical system, where β = 1/(k_B T), with T the temperature of the heat bath.
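A minimal numerical sketch of the canonical distribution, using an assumed two-level spectrum (energies 0 and 1, not from the notes): build Z, check the ρ_n normalize, and verify the identity ⟨E⟩ = −∂log Z/∂β (derived below via the lemma) with a finite difference.

```python
import math

# Assumed toy spectrum: a two-level system with energies 0 and 1.
energies = [0.0, 1.0]

def log_Z(beta):
    return math.log(sum(math.exp(-beta * E) for E in energies))

def probs(beta):
    """Canonical probabilities rho_n = exp(-beta * E_n) / Z."""
    w = [math.exp(-beta * E) for E in energies]
    Z = sum(w)
    return [wi / Z for wi in w]

beta = 1.0
rho = probs(beta)
u = sum(E * r for E, r in zip(energies, rho))          # direct average <E>
h = 1e-5
u_fd = -(log_Z(beta + h) - log_Z(beta - h)) / (2 * h)  # -d(log Z)/d(beta)
print(sum(rho), u, u_fd)  # normalization is 1; the two estimates of <E> agree
```

At β = 1 the excited-state weight is e⁻¹/(1 + e⁻¹) ≈ 0.269, and the finite-difference derivative of log Z reproduces the directly computed ⟨E⟩.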

Proof. The proof will be done for a quantum system. Denote the heat bath by index 2 and the system by index 1. Suppose we put the system in a state with energy E_n and let the heat bath have energy E_2. The first law tells us that the sum of the two energies had better be E_tot = E_n + E_2. By the postulate of equal a priori probabilities, ρ_n = const. Ω_2(E_tot − E_n). By definition, ρ_n = const. exp(S_2(E_tot − E_n)/k_B). Now we can use the fact that E_tot ≫ E_n to expand: ρ_n = const. exp([S_2(E_tot) − E_n ∂S_2/∂E |_{E_tot}]/k_B). We can pull the first factor out and absorb it into the constant, and then by the definition of temperature,

ρ_n = const. e^{−βE_n}  (13)

Definition. An ensemble with such a pdf is called canonical. The distribution above is the canonical distribution, also known as the Gibbs-Boltzmann distribution.

Note. This distribution applies in the case that a system is in contact with a heat bath, and in situations where a system has been prepared in such a way that the energy is known on average, but not precisely. In this case, the temperature is the temperature that a heat bath would need for the average energy to be maintained when the system is put in contact with it (for a discussion of this, see Reif's book).

The main difference between the microcanonical and canonical ensembles is that in the latter case the energy of the system can fluctuate via exchange with the heat bath. We would like to know how large these fluctuations are.

Proposition. The energy fluctuations in a large (i.e. N ≫ 1) system described by a canonical ensemble (i.e. in contact with a huge system) are Gaussian, and the relative energy fluctuations satisfy ΔE/⟨E⟩ = O(1/√N), where N is the particle number.

Lemma. In a canonical ensemble the mean energy and the r.m.s. energy fluctuations are given by the partition function as follows:

u ≡ ⟨E⟩ = −∂log(Z)/∂β  (14)
ΔE_{r.m.s.} = ⟨(E − ⟨E⟩)²⟩^{1/2} = (∂²log(Z)/∂β²)^{1/2}  (15)

Proof. Recall that Z = Σ_n e^{−βE_n}, so the right hand side of equation (14) is

−∂log(Z)/∂β = (1/Z) Σ_n E_n e^{−βE_n} = Σ_n E_n ρ_n = ⟨E⟩.

And the right hand side of (15) gives

∂²log(Z)/∂β² = −(1/Z²) (Σ_n E_n e^{−βE_n})² + (1/Z) Σ_n E_n² e^{−βE_n} = ⟨E²⟩ − ⟨E⟩² = ⟨(ΔE)²⟩.

Note. The specific heat is C_V = (∂u/∂T)_V = (∂β/∂T)(∂u/∂β)_V = (β/T) ⟨(ΔE)²⟩. So it appears that the energy fluctuations determine the specific heat. Cool!

Note. For a macroscopic system with N particles, u = O(N k_B T) and C_V = O(N k_B) (equipartition theorem).

Proof (of the Proposition). Divide the large system into n identical subsystems that are still macroscopic, with n ≫ 1. Let the energies of the subsystems be ε_i (i ∈ {1, ..., n}); the total energy we denote by E = Σ_i ε_i. If we look at any one of these subsystems, its energy will fluctuate as a random variable with pdf given by the canonical distribution (the rest of the system acts as a heat bath). Each subsystem has a partition function of its own, call it Z_i = Σ_n e^{−βε_n^{(i)}}, where the ε_n^{(i)} are the microscopic energy levels of the subsystems. The subsystems are identical, so Z_i ≡ z and Z = z^n. The canonical distribution has a finite second moment, and therefore the central limit theorem applies. So, we use the central limit theorem with x_i = ε_i, y = E and σ² = n ⟨(Δε)²⟩. Then, by the lemma, we can write this as σ² = n ∂²log(z)/∂β² =

∂²log(z^n)/∂β² = ⟨(ΔE)²⟩ = (T/β) C_V. Then σ²/⟨E⟩² = ⟨(ΔE)²⟩/u² = (T/β) C_V / u² = O(1/N).

Note. A macroscopic system has an extremely sharply peaked energy distribution.

Proposition. In a canonical ensemble, the Helmholtz free energy is given in terms of the partition function by the formula F = −k_B T log(Z).

Proof. Let x be an external parameter that characterizes the system. Then E_n = E_n(x), and the partition function is Z = Σ_n e^{−βE_n(x)} = Z(β, x). Let us change β and x quasistatically by increments dβ and dx. Then

d log(Z) = ∂log(Z)/∂x dx + ∂log(Z)/∂β dβ  (16)
= −β (1/Z) Σ_n (∂E_n/∂x) e^{−βE_n} dx − u dβ  (17) (the second term follows from the lemma)
= −β Σ_n (∂E_n/∂x) ρ_n dx − u dβ = −β ⟨∂E_n/∂x⟩ dx − u dβ  (18)
= β δW − u dβ  (19) (the first term follows from a previous remark)
= β δW − d(uβ) + β du  (20)
= β δQ − d(uβ)  (21) (by the first law)

d(log(Z) + βu) = β δQ = dS/k_B  (22) (the last step follows by the 2nd law)

S = k_B (log(Z) + βu) + const.  (23)

F = u − TS = −k_B T log(Z)  (24)

Note. From the Helmholtz free energy all other thermodynamic quantities follow. And if we know the energy levels (or Z), in principle we know the Helmholtz free energy. However, calculating it from the energy levels is a pain.

Corollary. The entropy is given by S = k_B log(Z) + k_B T ∂log(Z)/∂T.

Corollary. The pressure is given by P = k_B T ∂log(Z)/∂V.

7.4 The Grand Canonical Ensemble

Consider a system 1 in contact with a heat bath 2 that also allows for the exchange of particles in addition to heat.

Note. The particle number N is not fixed, and therefore the number of accessible states depends on the particle number in addition to the energy. In particular, Ω = Ω(E, N).

Definition. The chemical potential is defined as μ ≡ −(1/β) ∂log(Ω)/∂N.

Note. −βμ is related to N the same way that β is related to E.

Note. Many properties of β carry over to μ. In particular, two systems in equilibrium have the same chemical potential. A heat bath has a constant chemical potential no matter what.

Definition. Let E_n^{(N)} be the energy levels of system 1 when it contains N particles.
Now we define the grand canonical partition function as

Z = Σ_{N=0}^∞ e^{βμN} Σ_n e^{−βE_n^{(N)}}  (25)

Note. Here we restrict ourselves to quantum mechanics.

Theorem. The probability of system 1 being in a microstate with N particles and energy E_n^{(N)} is given by the pdf

ρ_n^{(N)} = (1/Z) e^{−β(E_n^{(N)} − μN)}  (26)

Proof. The proof generalizes straightforwardly from the previous theorem and is left to the reader.
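A minimal check of the grand canonical sums, using an assumed toy system not in the notes: a single orbital of energy ε that can hold N = 0 or 1 particles. The sum over N truncates, Z = 1 + e^{−β(ε − μ)}, and the mean occupancy computed from ρ_n^{(N)} reproduces the Fermi-Dirac factor 1/(e^{β(ε−μ)} + 1).

```python
import math

# Assumed toy system: one orbital of energy eps, occupancy N in {0, 1}.
# Grand partition function: Z = sum_N e^{beta*mu*N} e^{-beta*eps*N} = 1 + e^{-beta(eps-mu)}
def mean_occupancy(eps, mu, beta):
    boltz = math.exp(-beta * (eps - mu))
    Z = 1.0 + boltz
    return boltz / Z  # <N> = sum_N N * rho^(N) = rho^(1)

# agrees with the Fermi-Dirac form 1/(e^{beta(eps-mu)} + 1)
print(mean_occupancy(1.0, 0.5, 2.0))
```

This two-term example already shows the structure of equation (25): an outer sum over particle number weighted by e^{βμN}, and an inner canonical sum at fixed N.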


Temperature Fluctuations and Entropy Formulas Temperature Fluctuations and Entropy Formulas Does it or does not? T.S. Biró G.G. Barnaföldi P. Ván MTA Heavy Ion Research Group Research Centre for Physics, Budapest February 1, 2014 J.Uffink, J.van Lith:

More information

(i) T, p, N Gibbs free energy G (ii) T, p, µ no thermodynamic potential, since T, p, µ are not independent of each other (iii) S, p, N Enthalpy H

(i) T, p, N Gibbs free energy G (ii) T, p, µ no thermodynamic potential, since T, p, µ are not independent of each other (iii) S, p, N Enthalpy H Solutions exam 2 roblem 1 a Which of those quantities defines a thermodynamic potential Why? 2 points i T, p, N Gibbs free energy G ii T, p, µ no thermodynamic potential, since T, p, µ are not independent

More information

Grand Canonical Formalism

Grand Canonical Formalism Grand Canonical Formalism Grand Canonical Ensebmle For the gases of ideal Bosons and Fermions each single-particle mode behaves almost like an independent subsystem, with the only reservation that the

More information

Lecture 9 Examples and Problems

Lecture 9 Examples and Problems Lecture 9 Examples and Problems Counting microstates of combined systems Volume exchange between systems Definition of Entropy and its role in equilibrium The second law of thermodynamics Statistics of

More information

Nanoscale simulation lectures Statistical Mechanics

Nanoscale simulation lectures Statistical Mechanics Nanoscale simulation lectures 2008 Lectures: Thursdays 4 to 6 PM Course contents: - Thermodynamics and statistical mechanics - Structure and scattering - Mean-field approaches - Inhomogeneous systems -

More information

Physics 106b: Lecture 7 25 January, 2018

Physics 106b: Lecture 7 25 January, 2018 Physics 106b: Lecture 7 25 January, 2018 Hamiltonian Chaos: Introduction Integrable Systems We start with systems that do not exhibit chaos, but instead have simple periodic motion (like the SHO) with

More information

Part II: Statistical Physics

Part II: Statistical Physics Chapter 6: Boltzmann Statistics SDSMT, Physics Fall Semester: Oct. - Dec., 2013 1 Introduction: Very brief 2 Boltzmann Factor Isolated System and System of Interest Boltzmann Factor The Partition Function

More information

Lectures on Elementary Probability. William G. Faris

Lectures on Elementary Probability. William G. Faris Lectures on Elementary Probability William G. Faris February 22, 2002 2 Contents 1 Combinatorics 5 1.1 Factorials and binomial coefficients................. 5 1.2 Sampling with replacement.....................

More information

Multivariate Distributions (Hogg Chapter Two)

Multivariate Distributions (Hogg Chapter Two) Multivariate Distributions (Hogg Chapter Two) STAT 45-1: Mathematical Statistics I Fall Semester 15 Contents 1 Multivariate Distributions 1 11 Random Vectors 111 Two Discrete Random Variables 11 Two Continuous

More information

Elements of Statistical Mechanics

Elements of Statistical Mechanics Elements of Statistical Mechanics Thermodynamics describes the properties of macroscopic bodies. Statistical mechanics allows us to obtain the laws of thermodynamics from the laws of mechanics, classical

More information

Supplement: Statistical Physics

Supplement: Statistical Physics Supplement: Statistical Physics Fitting in a Box. Counting momentum states with momentum q and de Broglie wavelength λ = h q = 2π h q In a discrete volume L 3 there is a discrete set of states that satisfy

More information

Distributions of statistical mechanics

Distributions of statistical mechanics CHAPTER II Distributions of statistical mechanics The purpose of Statistical Mechanics is to explain the thermodynamic properties of macroscopic systems starting from underlying microscopic models possibly

More information

DS-GA 1002 Lecture notes 2 Fall Random variables

DS-GA 1002 Lecture notes 2 Fall Random variables DS-GA 12 Lecture notes 2 Fall 216 1 Introduction Random variables Random variables are a fundamental tool in probabilistic modeling. They allow us to model numerical quantities that are uncertain: the

More information

Part1B(Advanced Physics) Statistical Physics

Part1B(Advanced Physics) Statistical Physics PartB(Advanced Physics) Statistical Physics Course Overview: 6 Lectures: uesday, hursday only 2 problem sheets, Lecture overheads + handouts. Lent erm (mainly): Brief review of Classical hermodynamics:

More information

Physics 207 Lecture 25. Lecture 25, Nov. 26 Goals: Chapter 18 Understand the molecular basis for pressure and the idealgas

Physics 207 Lecture 25. Lecture 25, Nov. 26 Goals: Chapter 18 Understand the molecular basis for pressure and the idealgas Lecture 25, Nov. 26 Goals: Chapter 18 Understand the molecular basis for pressure and the idealgas law. redict the molar specific heats of gases and solids. Understand how heat is transferred via molecular

More information

Derivation of the GENERIC form of nonequilibrium thermodynamics from a statistical optimization principle

Derivation of the GENERIC form of nonequilibrium thermodynamics from a statistical optimization principle Derivation of the GENERIC form of nonequilibrium thermodynamics from a statistical optimization principle Bruce Turkington Univ. of Massachusetts Amherst An optimization principle for deriving nonequilibrium

More information

Lecture I: Constrained Hamiltonian systems

Lecture I: Constrained Hamiltonian systems Lecture I: Constrained Hamiltonian systems (Courses in canonical gravity) Yaser Tavakoli December 15, 2014 1 Introduction In canonical formulation of general relativity, geometry of space-time is given

More information

PHYS 705: Classical Mechanics. Hamiltonian Formulation & Canonical Transformation

PHYS 705: Classical Mechanics. Hamiltonian Formulation & Canonical Transformation 1 PHYS 705: Classical Mechanics Hamiltonian Formulation & Canonical Transformation Legendre Transform Let consider the simple case with ust a real value function: F x F x expresses a relationship between

More information

Thermodynamics, Statistical Mechanics and Entropy

Thermodynamics, Statistical Mechanics and Entropy entropy Article Thermodynamics, Statistical Mechanics and Entropy Robert H. Swendsen Physics Department, Carnegie Mellon University, Pittsburgh, PA 15213, USA; swendsen@cmu.edu; Tel.: +1-412-268-5211 Received:

More information

Noncollision Singularities in the n-body Problem

Noncollision Singularities in the n-body Problem Noncollision Singularities in the n-body Problem Danni Tu Department of Mathematics, Princeton University January 14, 2013 The n-body Problem Introduction Suppose we have n points in R 3 interacting through

More information

Part II: Statistical Physics

Part II: Statistical Physics Chapter 6: Boltzmann Statistics SDSMT, Physics Fall Semester: Oct. - Dec., 2014 1 Introduction: Very brief 2 Boltzmann Factor Isolated System and System of Interest Boltzmann Factor The Partition Function

More information

to satisfy the large number approximations, W W sys can be small.

to satisfy the large number approximations, W W sys can be small. Chapter 12. The canonical ensemble To discuss systems at constant T, we need to embed them with a diathermal wall in a heat bath. Note that only the system and bath need to be large for W tot and W bath

More information

8.044 Lecture Notes Chapter 4: Statistical Mechanics, via the counting of microstates of an isolated system (Microcanonical Ensemble)

8.044 Lecture Notes Chapter 4: Statistical Mechanics, via the counting of microstates of an isolated system (Microcanonical Ensemble) 8.044 Lecture Notes Chapter 4: Statistical Mechanics, via the counting of microstates of an isolated system (Microcanonical Ensemble) Lecturer: McGreevy 4.1 Basic principles and definitions.........................

More information

Thus, the volume element remains the same as required. With this transformation, the amiltonian becomes = p i m i + U(r 1 ; :::; r N ) = and the canon

Thus, the volume element remains the same as required. With this transformation, the amiltonian becomes = p i m i + U(r 1 ; :::; r N ) = and the canon G5.651: Statistical Mechanics Notes for Lecture 5 From the classical virial theorem I. TEMPERATURE AND PRESSURE ESTIMATORS hx i x j i = kt ij we arrived at the equipartition theorem: * + p i = m i NkT

More information

Liouville Equation. q s = H p s

Liouville Equation. q s = H p s Liouville Equation In this section we will build a bridge from Classical Mechanics to Statistical Physics. The bridge is Liouville equation. We start with the Hamiltonian formalism of the Classical Mechanics,

More information

Thermodynamics of phase transitions

Thermodynamics of phase transitions Thermodynamics of phase transitions Katarzyna Sznajd-Weron Department of Theoretical of Physics Wroc law University of Science and Technology, Poland March 12, 2017 Katarzyna Sznajd-Weron (WUST) Thermodynamics

More information

If the objects are replaced there are n choices each time yielding n r ways. n C r and in the textbook by g(n, r).

If the objects are replaced there are n choices each time yielding n r ways. n C r and in the textbook by g(n, r). Caveat: Not proof read. Corrections appreciated. Combinatorics In the following, n, n 1, r, etc. will denote non-negative integers. Rule 1 The number of ways of ordering n distinguishable objects (also

More information

1 Phase Spaces and the Liouville Equation

1 Phase Spaces and the Liouville Equation Phase Spaces and the Liouville Equation emphasize the change of language from deterministic to probablistic description. Under the dynamics: ½ m vi = F i ẋ i = v i with initial data given. What is the

More information

Statistical Thermodynamics and Monte-Carlo Evgenii B. Rudnyi and Jan G. Korvink IMTEK Albert Ludwig University Freiburg, Germany

Statistical Thermodynamics and Monte-Carlo Evgenii B. Rudnyi and Jan G. Korvink IMTEK Albert Ludwig University Freiburg, Germany Statistical Thermodynamics and Monte-Carlo Evgenii B. Rudnyi and Jan G. Korvink IMTEK Albert Ludwig University Freiburg, Germany Preliminaries Learning Goals From Micro to Macro Statistical Mechanics (Statistical

More information

MACROSCOPIC VARIABLES, THERMAL EQUILIBRIUM. Contents AND BOLTZMANN ENTROPY. 1 Macroscopic Variables 3. 2 Local quantities and Hydrodynamics fields 4

MACROSCOPIC VARIABLES, THERMAL EQUILIBRIUM. Contents AND BOLTZMANN ENTROPY. 1 Macroscopic Variables 3. 2 Local quantities and Hydrodynamics fields 4 MACROSCOPIC VARIABLES, THERMAL EQUILIBRIUM AND BOLTZMANN ENTROPY Contents 1 Macroscopic Variables 3 2 Local quantities and Hydrodynamics fields 4 3 Coarse-graining 6 4 Thermal equilibrium 9 5 Two systems

More information

Supplement on Lagrangian, Hamiltonian Mechanics

Supplement on Lagrangian, Hamiltonian Mechanics Supplement on Lagrangian, Hamiltonian Mechanics Robert B. Griffiths Version of 28 October 2008 Reference: TM = Thornton and Marion, Classical Dynamics, 5th edition Courant = R. Courant, Differential and

More information

1 INFO Sep 05

1 INFO Sep 05 Events A 1,...A n are said to be mutually independent if for all subsets S {1,..., n}, p( i S A i ) = p(a i ). (For example, flip a coin N times, then the events {A i = i th flip is heads} are mutually

More information

Basics of Statistical Mechanics

Basics of Statistical Mechanics Basics of Statistical Mechanics Review of ensembles Microcanonical, canonical, Maxwell-Boltzmann Constant pressure, temperature, volume, Thermodynamic limit Ergodicity (see online notes also) Reading assignment:

More information

Physics 562: Statistical Mechanics Spring 2003, James P. Sethna Homework 5, due Wednesday, April 2 Latest revision: April 4, 2003, 8:53 am

Physics 562: Statistical Mechanics Spring 2003, James P. Sethna Homework 5, due Wednesday, April 2 Latest revision: April 4, 2003, 8:53 am Physics 562: Statistical Mechanics Spring 2003, James P. Sethna Homework 5, due Wednesday, April 2 Latest revision: April 4, 2003, 8:53 am Reading David Chandler, Introduction to Modern Statistical Mechanics,

More information

Physics Oct A Quantum Harmonic Oscillator

Physics Oct A Quantum Harmonic Oscillator Physics 301 5-Oct-2005 9-1 A Quantum Harmonic Oscillator The quantum harmonic oscillator (the only kind there is, really) has energy levels given by E n = (n + 1/2) hω, where n 0 is an integer and the

More information

Caltech Ph106 Fall 2001

Caltech Ph106 Fall 2001 Caltech h106 Fall 2001 ath for physicists: differential forms Disclaimer: this is a first draft, so a few signs might be off. 1 Basic properties Differential forms come up in various parts of theoretical

More information

Discrete Mathematics and Probability Theory Fall 2014 Anant Sahai Note 15. Random Variables: Distributions, Independence, and Expectations

Discrete Mathematics and Probability Theory Fall 2014 Anant Sahai Note 15. Random Variables: Distributions, Independence, and Expectations EECS 70 Discrete Mathematics and Probability Theory Fall 204 Anant Sahai Note 5 Random Variables: Distributions, Independence, and Expectations In the last note, we saw how useful it is to have a way of

More information

1. Thermodynamics 1.1. A macroscopic view of matter

1. Thermodynamics 1.1. A macroscopic view of matter 1. Thermodynamics 1.1. A macroscopic view of matter Intensive: independent of the amount of substance, e.g. temperature,pressure. Extensive: depends on the amount of substance, e.g. internal energy, enthalpy.

More information

From the microscopic to the macroscopic world. Kolloqium April 10, 2014 Ludwig-Maximilians-Universität München. Jean BRICMONT

From the microscopic to the macroscopic world. Kolloqium April 10, 2014 Ludwig-Maximilians-Universität München. Jean BRICMONT From the microscopic to the macroscopic world Kolloqium April 10, 2014 Ludwig-Maximilians-Universität München Jean BRICMONT Université Catholique de Louvain Can Irreversible macroscopic laws be deduced

More information

Metric spaces and metrizability

Metric spaces and metrizability 1 Motivation Metric spaces and metrizability By this point in the course, this section should not need much in the way of motivation. From the very beginning, we have talked about R n usual and how relatively

More information

Statistical thermodynamics for MD and MC simulations

Statistical thermodynamics for MD and MC simulations Statistical thermodynamics for MD and MC simulations knowing 2 atoms and wishing to know 10 23 of them Marcus Elstner and Tomáš Kubař 22 June 2016 Introduction Thermodynamic properties of molecular systems

More information

Lecture 6: Ideal gas ensembles

Lecture 6: Ideal gas ensembles Introduction Lecture 6: Ideal gas ensembles A simple, instructive and practical application of the equilibrium ensemble formalisms of the previous lecture concerns an ideal gas. Such a physical system

More information

Basics of Radiation Fields

Basics of Radiation Fields Basics of Radiation Fields Initial questions: How could you estimate the distance to a radio source in our galaxy if you don t have a parallax? We are now going to shift gears a bit. In order to understand

More information

Basics of Statistical Mechanics

Basics of Statistical Mechanics Basics of Statistical Mechanics Review of ensembles Microcanonical, canonical, Maxwell-Boltzmann Constant pressure, temperature, volume, Thermodynamic limit Ergodicity (see online notes also) Reading assignment:

More information

213 Midterm coming up

213 Midterm coming up 213 Midterm coming up Monday April 8 @ 7 pm (conflict exam @ 5:15pm) Covers: Lectures 1-12 (not including thermal radiation) HW 1-4 Discussion 1-4 Labs 1-2 Review Session Sunday April 7, 3-5 PM, 141 Loomis

More information

Gauge Fixing and Constrained Dynamics in Numerical Relativity

Gauge Fixing and Constrained Dynamics in Numerical Relativity Gauge Fixing and Constrained Dynamics in Numerical Relativity Jon Allen The Dirac formalism for dealing with constraints in a canonical Hamiltonian formulation is reviewed. Gauge freedom is discussed and

More information

Why study probability? Set theory. ECE 6010 Lecture 1 Introduction; Review of Random Variables

Why study probability? Set theory. ECE 6010 Lecture 1 Introduction; Review of Random Variables ECE 6010 Lecture 1 Introduction; Review of Random Variables Readings from G&S: Chapter 1. Section 2.1, Section 2.3, Section 2.4, Section 3.1, Section 3.2, Section 3.5, Section 4.1, Section 4.2, Section

More information

Concepts in Materials Science I. StatMech Basics. VBS/MRC Stat Mech Basics 0

Concepts in Materials Science I. StatMech Basics. VBS/MRC Stat Mech Basics 0 StatMech Basics VBS/MRC Stat Mech Basics 0 Goal of Stat Mech Concepts in Materials Science I Quantum (Classical) Mechanics gives energy as a function of configuration Macroscopic observables (state) (P,

More information

Lecture 1: Historical Overview, Statistical Paradigm, Classical Mechanics

Lecture 1: Historical Overview, Statistical Paradigm, Classical Mechanics Lecture 1: Historical Overview, Statistical Paradigm, Classical Mechanics Chapter I. Basic Principles of Stat Mechanics A.G. Petukhov, PHYS 743 August 23, 2017 Chapter I. Basic Principles of Stat MechanicsLecture

More information

Statistical Physics. The Second Law. Most macroscopic processes are irreversible in everyday life.

Statistical Physics. The Second Law. Most macroscopic processes are irreversible in everyday life. Statistical Physics he Second Law ime s Arrow Most macroscopic processes are irreversible in everyday life. Glass breaks but does not reform. Coffee cools to room temperature but does not spontaneously

More information

Course: ESO-209 Home Work: 1 Instructor: Debasis Kundu

Course: ESO-209 Home Work: 1 Instructor: Debasis Kundu Home Work: 1 1. Describe the sample space when a coin is tossed (a) once, (b) three times, (c) n times, (d) an infinite number of times. 2. A coin is tossed until for the first time the same result appear

More information