Physics 562: Statistical Mechanics Spring 2003, James P. Sethna Homework 2, due Wednesday Feb. 12 Latest revision: February 10, 2003, 12:22 am
Reading

Pathria, chapters 3, 4.

Problems

(2.1) Entropy of Glasses.

See "Entropy of Glasses", Stephen A. Langer and James P. Sethna, Phys. Rev. Lett. 61, 570 (1988), although M. Goldstein noticed the bounds earlier.

Glasses aren't really in equilibrium. In particular, they do not obey the third law of thermodynamics, which states that the entropy S goes to zero at zero temperature. Experimentalists measure a residual entropy by subtracting the entropy change from the known entropy of the equilibrium liquid at a temperature T_l at or above the crystalline melting temperature T_c:

    S_residual = S_liquid(T_l) - \int_0^{T_l} (1/T) (dQ/dT) dT,    (2.1.1)

where Q is the net heat flow out of the bath into the glass.

If you put a glass in an insulated box, it will warm up (very slowly) because of microscopic atomic rearrangements which lower the potential energy. So glasses don't have a well-defined temperature or specific heat. In particular, the heat flow upon cooling and on heating, dQ/dT(T), won't precisely match (although their integrals will agree, by conservation of energy).

Thomas and Parks, in the figure shown, are making the approximation that the specific heat of the glass is dQ/dT, the measured heat flow out of the glass divided by the temperature change of the heat bath. They find that the specific heats defined in this way on cooling and on heating disagree. Consider the second cooling curve and the final heating curve, from 325°C to room temperature and back. Assume that the liquid at 325°C is in equilibrium both before cooling and after heating (and so has the same liquid entropy S_liquid).

(a) Is the residual entropy, equation (2.1.1), larger on heating or on cooling? (Hint: use the fact that the integrals under the curves, \int_0^{T_l} (dQ/dT) dT, give the heat flow, which by conservation of energy must be the same on heating and cooling.
The heating curve shifts weight to higher temperatures: will that increase or decrease the integral in (2.1.1)?)

(b) By using the second law (entropy can only increase), show that when cooling and then heating from an equilibrium liquid, the residual entropy measured on cooling must always be less than the residual entropy measured on heating.
[Figure: Specific heat of B_2 O_3 glass measured while heating and cooling. The glass was first rapidly cooled from the melt (500°C to 50°C in half an hour), then heated from 33°C to 345°C in 14 hours (solid curve with squares), cooled from 345°C to room temperature in 18 hours (dotted curve with diamonds), and finally heated from 35°C to 325°C (solid curve with crosses). Data from S. B. Thomas and G. S. Parks, J. Phys. Chem. 35, 2091 (1931); revisited in "Nonequilibrium Entropy and Entropy Distributions", S. A. Langer, E. R. Grannan, and J. P. Sethna, Phys. Rev. B 41, 2261 (1990).]

The fact that the energy lags the temperature near the glass transition, in linear response, leads to the study of specific heat spectroscopy (N. O. Birge and S. R. Nagel, Phys. Rev. Lett. 54, 2674 (1985)).

The residual entropy of a typical glass is about k_B per molecular unit. It's a measure of how many different glassy configurations of atoms the material can freeze into.

(c) In a molecular dynamics simulation with one hundred atoms, and assuming that the residual entropy is k_B log 2 per atom, what is the probability that two coolings to zero energy will arrive at equivalent atomic configurations (up to, e.g., translations and rotations)? In a macroscopic system with residual entropy k_B log 2 per molecular unit, how many coolings would be needed to arrive at the original configuration again, with probability 1/2?
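The arithmetic behind part (c) is worth seeing once: a residual entropy of k_B log 2 per atom means e^(S/k_B) = 2^N equally likely glassy configurations (a rough counting model, not the full answer to the problem), so the chance that a second cooling repeats the first is 2^(-N). A minimal sketch:

```python
# Rough model: residual entropy kB*log(2) per atom means Omega = 2**N
# equally likely frozen-in configurations.
n_atoms = 100
omega = 2 ** n_atoms
p_same = 1 / omega  # probability a second cooling lands in the same configuration
print(f"Omega = {omega:.3e}, P(repeat) = {p_same:.3e}")
# Omega = 1.268e+30, P(repeat) = 7.889e-31
```

Even for a hundred atoms the repeat probability is already astronomically small; for a macroscopic sample it is smaller still.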
(2.2) Shannon entropy.

See Claude E. Shannon, "A Mathematical Theory of Communication", The Bell System Technical Journal 27, pp. 379-423 and 623-656, July and October 1948.

Entropy can be viewed as a measure of the lack of information you have about a system. Claude Shannon realized, back in the 1940s, that communication over telephone wires amounts to reducing the listener's uncertainty about the sender's message, and introduced a definition of an information entropy.

Most natural languages (voice, written English) are highly redundant; the number of intelligible fifty-letter sentences is many fewer than 26^50, and the number of ten-second phone conversations is far smaller than the number of sound signals that could be generated with frequencies between 60 and 20,000 Hz.

Shannon, knowing statistical mechanics, defined the entropy of an ensemble of messages: if there are N possible messages that can be sent in one package, and message m is being transmitted with probability p_m, then Shannon's entropy is

    S = -k \sum_{m=1}^{N} p_m log p_m,    (2.2.1)

where instead of Boltzmann's constant, Shannon picked k = 1/log 2 (so that the entropy is measured in bits).

This immediately suggests a theory for signal compression. If you can recode the alphabet so that common letters and common sequences of letters are abbreviated, while infrequent combinations are spelled out in lengthy fashion, you can dramatically reduce the channel capacity needed to send the data. (This is lossless compression, like zip and gz and gif.)

An obscure language, A bç!, for long-distance communication has only three sounds: a hoot represented by A, a slap represented by B, and a click represented by C. In a typical message, hoots and slaps occur equally often (p = 1/4 each), but clicks are twice as common (p = 1/2). Assume the messages are otherwise random.

(a) What is the Shannon entropy in this language? More specifically, what is the Shannon entropy rate (entropy per sound, or letter, transmitted)?
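You can check your answer to part (a) numerically; a minimal sketch, plugging the given probabilities into equation (2.2.1) with k = 1/log 2 (i.e. using base-2 logarithms):

```python
import math

# Letter probabilities in the language: hoot A, slap B, click C.
p = {"A": 0.25, "B": 0.25, "C": 0.5}

# Shannon entropy with k = 1/log 2, i.e. entropy measured in bits per letter.
S = -sum(pm * math.log2(pm) for pm in p.values())
print(S)  # 1.5
```

So the entropy rate is 1.5 bits per letter, less than the log2(3) ≈ 1.58 bits that three equally likely sounds would carry.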
(b) Show that a communication channel transmitting bits (ones and zeros) can transmit no more than one unit of Shannon entropy per bit. (Hint: this should follow by showing that, for N = 2^m messages, equation (2.2.1) is maximized by p_m = 1/N. You needn't prove it's a global maximum: check that it is a local extremum. You'll need either a Lagrange multiplier or will need to explicitly set p_N = 1 - \sum_{m=1}^{N-1} p_m.)

(c) In general, argue that the Shannon entropy gives the minimum number of bits needed to transmit the ensemble of messages. (Hint: compare the Shannon entropy of the N original messages with the Shannon entropy of the N (shorter) encoded messages.) Calculate the minimum number of bits per letter on average needed to transmit messages for the particular case of an A bç! communication channel.

(d) Find a compression scheme (a rule that converts an A bç! message to zeros and ones, and that can be inverted to give back the original message) that is optimal, in the sense that it saturates the bound you derived in part (b).

Shannon also developed a measure of the channel capacity of a noisy wire, and discussed error correction codes...
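One scheme of the kind part (d) asks for (a Huffman-style prefix code; this particular assignment of codewords is an illustration, one of several optimal choices) assigns the common click a single bit and the rarer sounds two bits each:

```python
# A prefix-free code for the three sounds (one optimal choice, not the only one):
code = {"C": "0", "A": "10", "B": "11"}
decode = {v: k for k, v in code.items()}

def compress(msg):
    return "".join(code[c] for c in msg)

def decompress(bits):
    out, buf = [], ""
    for b in bits:
        buf += b
        if buf in decode:      # prefix-free: decode greedily, no lookahead needed
            out.append(decode[buf])
            buf = ""
    return "".join(out)

msg = "CACBCCAB"
assert decompress(compress(msg)) == msg   # invertible, as required

# Average bits per letter = 1*p(C) + 2*p(A) + 2*p(B):
avg = sum(len(code[c]) * p for c, p in {"A": 0.25, "B": 0.25, "C": 0.5}.items())
print(avg)  # 1.5
```

The average of 1.5 bits per letter exactly matches the Shannon entropy rate from part (a), so the code saturates the bound of part (b).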
(2.3) Dyson's "Time without End": Life and the Heat Death of the Universe.

See Freeman J. Dyson, "Time without end: Physics and biology in an open universe", Rev. Mod. Phys. 51, 447 (1979).

Freeman Dyson discusses how living things might evolve to cope with the cooling and dimming we expect during the heat death of the universe. Dyson models an intelligent being as a heat engine that creates a fixed entropy S per thought.

(a) Energy needed per thought. Assume that the being draws heat Q from a hot reservoir at T_1 and radiates it away to a cold reservoir at T_2. What is the minimum energy Q needed per thought, in terms of S and T_2? (You may take T_1 very large.)

Time needed per thought to radiate energy. Dyson shows, using theory not important here, that the power radiated by our intelligent being-as-heat-engine is no larger than C T_2^3, a constant times the cube of the cold temperature. (The constant scales with the number of electrons in the being, so we can think of our answer t as the time per thought per mole of electrons.)

(b) Write an expression for the maximum rate of thoughts per unit time dH/dt (the inverse of the time t per thought), in terms of S, C, and T_2.

Number of thoughts for an ecologically efficient being. Our universe is expanding: the radius R grows roughly linearly in time t. The microwave background radiation has a characteristic temperature Θ(t) ∝ R^{-1}, which is getting lower as the universe expands: this red-shift is due to the Doppler effect. An ecologically efficient being would naturally try to use as little heat as possible, and so wants to choose T_2 as small as possible. It cannot radiate heat at a temperature below T_2 = Θ(t) = A/t.

(c) How many thoughts H can an ecologically efficient being, radiating at a fixed multiple of Θ(t), have between now and time infinity, in terms of S, C, A, and the current time t_0?

Time without end: Greedy beings.
Dyson would like his beings to be able to think an infinite number of thoughts before the universe ends, but consume a finite amount of energy. He proposes that his beings need to be profligate in order to get their thoughts in before the world ends: they should radiate at a temperature T_2(t) ∝ t^{-3/8} which falls with time, but not as fast as Θ(t) ∝ t^{-1}.

(d) Show that with Dyson's radiation schedule, the total number of thoughts H is infinite, but the total energy consumed U is finite.
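The scaling in part (d) can be checked numerically. With all constants set to one (an assumption for illustration, not Dyson's actual units), part (b) gives a thought rate dH/dt ∝ T_2^2 ∝ t^{-3/4}, whose integral diverges, while the power consumed is ∝ T_2^3 ∝ t^{-9/8}, whose integral converges:

```python
# Dyson's schedule T2(t) ~ t**(-3/8), with all prefactors set to 1.
# Thought rate dH/dt ~ T2**2 ~ t**(-3/4); power consumed ~ T2**3 ~ t**(-9/8).

def thoughts(T):
    # H(T) = integral_1^T t**(-3/4) dt = 4*(T**(1/4) - 1): grows without bound
    return 4 * (T ** 0.25 - 1)

def energy(T):
    # U(T) = integral_1^T t**(-9/8) dt = 8*(1 - T**(-1/8)): bounded by 8
    return 8 * (1 - T ** (-1 / 8))

for T in (1e4, 1e8, 1e16):
    print(T, thoughts(T), energy(T))
# thoughts keeps growing; energy saturates toward 8 (in these arbitrary units)
```

The lower limit of integration is set to 1 here purely for convenience; only the behavior at large t matters for the convergence question.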
(2.4) Thermodynamics Review II: Gibbs-Duhem relation and the Clausius-Clapeyron equation.

(a) Using the fact that the entropy S(N, V, E) is extensive, show that

    N (∂S/∂N)|_{V,E} + V (∂S/∂V)|_{N,E} + E (∂S/∂E)|_{N,V} = S.

Show from this that in general

    S = (E + pV - μN)/T    (2.4.1)

and hence E = TS - pV + μN. This is Euler's equation.

(b) As a state function, S is supposed to depend only on E, V, and N. But equation (2.4.1) seems to show explicit dependence on T, p, and μ as well: how can this be?

(c) One answer is to write the latter three as functions of E, V, and N. Do this explicitly for the ideal gas, using last week's equation (1.2.2),

    S(N, V, E) = (5/2) N k_B + N k_B log[ (V/(N h^3)) (4πmE/(3N))^{3/2} ],    (1.2.2)

and your (or the grader's) results for problem 1.2(c), and verify equation (2.4.1) in that case.

Another answer is to consider a small shift of all six variables. We know that dE = T dS - p dV + μ dN, but if we shift all six variables in Euler's equation we get dE = T dS - p dV + μ dN + S dT - V dp + N dμ. This implies the Gibbs-Duhem relation

    0 = S dT - V dp + N dμ.    (2.4.2)

It means that the intensive variables T, p, and μ are not all independent.

[Figure: generic phase diagram, from itl/2045 s99/lectures/lec f.html]
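The verification asked for in part (c) can also be done numerically: compute T, p, and μ from equation (1.2.2) by (finite-difference) derivatives of S, then check that Euler's equation (2.4.1) holds. A sketch, with units chosen so that k_B = h = m = 1 and arbitrary illustrative values of N, V, E:

```python
import math

kB = h = m = 1.0  # units chosen so the constants drop out

def S(N, V, E):
    # Ideal-gas entropy, equation (1.2.2), with kB = h = m = 1
    return 2.5 * N * kB + N * kB * math.log(
        V / (N * h**3) * (4 * math.pi * m * E / (3 * N)) ** 1.5)

N, V, E = 100.0, 50.0, 75.0
d = 1e-6

T  = 1.0 / ((S(N, V, E + d) - S(N, V, E - d)) / (2 * d))   # 1/T  = dS/dE
p  =  T *  (S(N, V + d, E) - S(N, V - d, E)) / (2 * d)     # p/T  = dS/dV
mu = -T *  (S(N + d, V, E) - S(N - d, V, E)) / (2 * d)     # mu/T = -dS/dN

lhs = S(N, V, E)
rhs = (E + p * V - mu * N) / T                             # equation (2.4.1)
print(lhs, rhs)  # should agree to several digits
```

Since (1.2.2) is homogeneous of degree one in (N, V, E), the agreement is exact up to finite-difference error; this is exactly the content of the extensivity argument in part (a).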
Clausius-Clapeyron equation. Consider the phase diagram above. Along an equilibrium phase boundary, the temperatures, pressures, and chemical potentials of the two phases must agree: otherwise a flat interface between the two phases would transmit heat, shift sideways, or leak particles, respectively (violating the assumption of equilibrium).

(d) Apply the Gibbs-Duhem relation to both phases, for a small temperature shift along the phase boundary. Let s_1, v_1, s_2, and v_2 be the molecular entropies and volumes (s = S/N, v = V/N for each phase); derive the Clausius-Clapeyron equation for the slope of the coexistence line on the phase diagram,

    dp/dT = (s_1 - s_2)/(v_1 - v_2).    (2.4.3)

(e) Latent Heat. It's hard to experimentally measure the entropies per particle: we don't have an entropy thermometer. But, as you will remember, the entropy difference upon a phase transformation ΔS = Q/T is related to the heat flow Q needed to induce the phase change. Let the latent heat L be the heat flow per molecule: write a formula for dp/dT not involving the entropy.
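To get a feel for the magnitudes in part (e), here is a rough numerical illustration for boiling water at one atmosphere, using approximate textbook values (per kilogram rather than per molecule; the numbers are mine, not from the problem set):

```python
# Clausius-Clapeyron slope dp/dT = L / (T * (v_gas - v_liq)),
# for water boiling at 1 atm, with approximate per-kg values:
L     = 2.26e6   # latent heat of vaporization, J/kg (approximate)
T     = 373.0    # boiling temperature, K
v_gas = 1.67     # specific volume of steam, m^3/kg (approximate)
v_liq = 1.0e-3   # specific volume of liquid water, m^3/kg

dp_dT = L / (T * (v_gas - v_liq))
print(dp_dT)  # roughly 3.6e3 Pa/K
```

A slope of a few kPa per kelvin is why the boiling point shifts noticeably with altitude and why pressure cookers work.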
(2.5) Excited Atoms, Negative Temperature, and Microcanonical vs. Canonical.

A system of N atoms can be in the ground state or in an excited state. For convenience, we set the zero of energy exactly in between, so the energies of the two states of an atom are ±ε/2. The atoms are isolated from the outside world. There are only weak couplings between the atoms, sufficient to bring them into internal equilibrium but without other effects.

[Figure: Entropies and energy fluctuations, microcanonical and canonical, for this problem with N = 50: the microcanonical entropy S_micro(E), the canonical entropy S_c(T(E)), and the canonical probability distribution ρ(E), plotted against the energy E. The canonical probability distribution shown is for ⟨E⟩ = -10ε and k_B T = 1.207ε.] You may wish to check some of your answers against this plot.

(a) Microcanonical Entropy. If the net energy is E (corresponding to a number of excited atoms m = E/ε + N/2), what is the microcanonical entropy S_micro(E) of our system? Simplify your expression using Stirling's formula, log n! ≈ n log n - n.

(b) Negative Temperature. Find the temperature, using your simplified expression from part (a). (Why is it tricky to do it without the approximation?) What happens to the temperature when E > 0?

Having energy E > 0 is a kind of population inversion. Population inversion is the driving mechanism for lasers. Microcanonical simulations can also lead to states with negative specific heats.

For many quantities, the thermodynamic derivatives have natural interpretations when viewed as sums over states. It's easiest to see this in small systems.

(c) Canonical Ensemble: Explicit traces and thermodynamics. (i) Take one of our atoms and couple it to a heat bath of temperature k_B T = 1/β. Write explicit formulas for Z_canon, E_canon, and S_canon in the canonical ensemble, as a trace (or sum) over the two states of the atom.
(⟨E⟩ should be the sum of the energy of each state multiplied by the probability ρ_n of that state; S should be -k_B times the trace of ρ_n log ρ_n.) (ii) Compare the results with what you get by using the thermodynamic relations. Using Z from the trace over states, calculate the Helmholtz free energy A, S as a derivative of A, and E from A = E - TS. Do the thermodynamically derived formulas you get agree with the statistical traces? (iii) To remove some of the mystery from the thermodynamic relations,
consider the thermodynamically valid formula ⟨E⟩ = -∂ log Z/∂β = -(1/Z) ∂Z/∂β. Write out Z as a sum over energy states, and see that this formula follows naturally.

(d) What happens to ⟨E⟩ in the canonical ensemble as T → ∞? Can you get into the E > 0 regime discussed in part (b)?

(e) Canonical-Microcanonical Correspondence. Find the entropy in the canonical distribution for N of our atoms coupled to the outside world, from your answer to part (c). How can you understand the value of S(T = ∞) - S(T = 0) simply? Using the approximate form of the entropy from part (a) and the temperature from part (b), show that the canonical and microcanonical entropies agree, S_micro(E) = S_canon(T(E)). (Perhaps useful: arctanh(x) = (1/2) log((1 + x)/(1 - x)).) Notice that the two are not equal in the figure above: the form of Stirling's formula we used in part (a) is not very accurate for N = 50. In a simple way, explain why the microcanonical entropy is smaller than the canonical entropy.

(f) Fluctuations. Show in general that the mean-square fluctuation of the energy in the canonical distribution, ⟨(E - ⟨E⟩)^2⟩ = ⟨E^2⟩ - ⟨E⟩^2, is related to the specific heat C = ∂⟨E⟩/∂T. (I find it helpful to use the formula from part (c.iii), ⟨E⟩ = -∂ log Z/∂β.) Calculate the root-mean-square energy fluctuations for N of our atoms. Evaluate it at T(E) from part (b): it should have a particularly simple form. For large N, are the fluctuations in E small compared to ⟨E⟩?
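The fluctuation relation of part (f), ⟨(E - ⟨E⟩)^2⟩ = k_B T^2 C, can be checked numerically for this two-state system. A sketch with k_B = ε = 1, evaluated at the temperature k_B T = 1.207ε quoted in the figure caption (the finite-difference derivatives are mine, for checking only):

```python
import math

# N two-state atoms with energies +/- eps/2; kB = 1, eps = 1.
eps, N = 1.0, 50

def logZ(beta):
    # N independent atoms: log Z = N log(2 cosh(beta*eps/2))
    return N * math.log(2 * math.cosh(beta * eps / 2))

def mean_E(beta):
    # <E> = -d(log Z)/d(beta), here in closed form
    return -N * (eps / 2) * math.tanh(beta * eps / 2)

beta = 1 / 1.207     # the temperature in the figure caption
d = 1e-5

print(mean_E(beta))  # about -9.8, consistent with <E> = -10*eps in the figure

# <(E - <E>)^2> = d^2(log Z)/d(beta)^2, by a second finite difference:
fluct = (logZ(beta + d) - 2 * logZ(beta) + logZ(beta - d)) / d**2

# Specific heat C = d<E>/dT, with T = 1/beta:
T = 1 / beta
C = (mean_E(1 / (T + d)) - mean_E(1 / (T - d))) / (2 * d)

print(fluct, T**2 * C)  # the two should agree: <dE^2> = kB T^2 C
```

The RMS fluctuation here is of order sqrt(N) ε while ⟨E⟩ is of order N ε, illustrating the large-N answer to part (f).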
(2.6) Specific Heat: Strings and Defects.

One-dimensional Phonons. A nano-string of length L with mass per unit length μ under tension τ has a vertical, transverse displacement u(x, t). The kinetic energy density is (μ/2)(∂u/∂t)^2 and the potential energy density is (τ/2)(∂u/∂x)^2.

(a) Write the kinetic energy and the potential energy in new variables, changing from u(x, t) to normal modes q_{k_n}(t) with

    u(x, t) = Σ_n q_{k_n}(t) sin(k_n x),    k_n = nπ/L.

Show in these variables that the system is a sum of decoupled harmonic oscillators. Calculate the density of states per unit frequency, g(ω) = Σ_n δ(ω - ω_{k_n}), and the specific heat of the string c(T) per unit length in the limit L → ∞, treating the oscillators quantum mechanically. What is the specific heat of the classical string?

Defects in Crystals. A defect in a crystal has one on-center configuration with energy zero, and M off-center configurations with energy ε, with no significant quantum tunneling between the states. The Hamiltonian can be approximated by the (M+1) × (M+1) diagonal matrix

    H = diag(0, ε, ε, ..., ε).

There are N defects in the crystal, which can be assumed stuck in position (and hence distinguishable) and assumed not to interact with one another.

(b) Write the canonical partition function Z(T), the mean energy E(T), the fluctuations in the energy, the entropy S(T), and the specific heat C(T) as a function of temperature. Plot the specific heat per defect C(T)/N for M = 6; set the unit of energy equal to ε and k_B = 1 for your plot. Derive a simple relation between M and the change in entropy between zero and infinite temperature. Check this relation using your formula for S(T).

(Note: the density of states here is the number of normal modes in a frequency range (ω, ω + Δω) divided by Δω. In the next homework, we'll study the density of energy eigenstates g(E) in a trap with precisely three frequencies (g(ω) will be δ(ω - ω_0) + 2δ(ω - ω_1)). Don't confuse the two.)
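The entropy relation in part (b) can be checked numerically for the defect system. With one state at energy 0 and M at energy ε, the single-defect partition function is z = 1 + M e^{-βε}, and the entropy per defect should climb from 0 at T = 0 to log(M+1) at T = ∞. A sketch with k_B = ε = 1 (the helper name `entropy_per_defect` is mine):

```python
import math

# One defect: one on-center state (energy 0) and M off-center states (energy eps).
# kB = 1, eps = 1; N independent defects just multiply everything by N.
M, eps = 6, 1.0

def entropy_per_defect(T):
    beta = 1.0 / T
    z = 1 + M * math.exp(-beta * eps)            # single-defect partition function
    E = M * eps * math.exp(-beta * eps) / z      # mean energy per defect
    return math.log(z) + beta * E                # S = (E - A)/T = log z + beta*<E>

dS = entropy_per_defect(1e6) - entropy_per_defect(1e-2)
print(dS, math.log(M + 1))  # both close to log 7 = 1.9459...
```

The change in entropy per defect between zero and infinite temperature is log(M + 1), counting the states made available by heating.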
(2.7) Laplace (1749-1827), Lagrange (1736-1813), and Legendre (1752-1833).

(a) Laplace Transform. Show (or note) that the canonical partition function can be written as the Laplace transform of the microcanonical partition function. (I've never noticed that this was useful.)

(b) Lagrange Multipliers. (See Pathria, problem 3.6.)

(i) Microcanonical: Show, using a Lagrange multiplier (as you did in problem 2.2(b)), that the probability distribution on the energy surface that extremizes the entropy

    S = -k_B Tr(ρ log ρ) = -k_B ∫ ρ(X, P) log ρ(X, P) dX dP,

with the constraint that the density integrates to one, Tr(ρ) = ∫ ρ(X, P) dX dP = 1, is a constant (the microcanonical distribution).

(ii) Canonical: Integrating over all X and P, use another Lagrange multiplier to fix the mean energy ⟨E⟩ = ∫ dX dP H(X, P) ρ(X, P). Show that maximizing the entropy given the two constraints gives you the canonical distribution.

(iii) Grand Canonical: Summing over different numbers of particles N, and adding the constraint that the average number is ⟨N⟩ = Σ_N ∫ dX dP N ρ_N(X, P), show that you get the grand canonical distribution.

(c) Legendre Transforms. What thermodynamic potentials correspond to the microcanonical, canonical, and grand canonical ensembles? What variables are they naturally functions of? Give the Legendre transformations which lead us from one to the other.
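For part (b.ii), the variational bookkeeping can be written out explicitly. A sketch (the multiplier names λ_1, λ_2 are my notation, not the problem's):

```latex
% Extremize S subject to normalization and fixed mean energy,
% with Lagrange multipliers \lambda_1, \lambda_2:
\begin{align*}
  \Phi[\rho] &= -k_B \int \rho \log\rho
      \;-\; \lambda_1\Big(\int \rho - 1\Big)
      \;-\; \lambda_2\Big(\int H\rho - \langle E\rangle\Big),\\
  0 = \frac{\delta \Phi}{\delta \rho}
     &= -k_B\bigl(\log\rho + 1\bigr) - \lambda_1 - \lambda_2 H(X,P)\\
  \Rightarrow\quad \rho &= e^{-1-\lambda_1/k_B}\, e^{-\lambda_2 H/k_B}
     \;\equiv\; \frac{e^{-\beta H}}{Z},
\end{align*}
```

with β = λ_2/k_B fixed by the energy constraint and Z by normalization; dropping the energy constraint recovers the constant (microcanonical) distribution of part (b.i).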
(2.8) Does Entropy Increase?

The second law of thermodynamics says that entropy always increases. Perversely, it's easy to show that in an isolated system, no matter what non-equilibrium condition it starts in, entropy as precisely defined stays constant in time.

Entropy is Constant: Classical. Liouville's theorem tells us that the total derivative of the probability density is zero: following the trajectory of a system, the local probability density never changes. The equilibrium states have probability densities that only depend on energy and number. Clearly something is wrong: if the density starts non-uniform, how can it become uniform?

(a) Let ρ(P, Q) be the probability density in phase space (where P = (p_1, ..., p_{3N}) are the momenta and Q = (q_1, ..., q_{3N}) are the positions). Show, for any function f(ρ) with f(0) = 0, that

    ∂f(ρ)/∂t = -∇·[f(ρ) v] = -Σ_α [ ∂/∂p_α (f(ρ) ṗ_α) + ∂/∂q_α (f(ρ) q̇_α) ],

where v = (Ṗ, Q̇) is the 6N-dimensional velocity in phase space. Hence (by Gauss's theorem in 6N dimensions) show that ∂/∂t ∫ f(ρ) dP dQ = 0, assuming that the probability density vanishes at large momenta and positions. Show, thus, that the entropy S = -k_B ∫ ρ log ρ is constant in time.

(b) Entropy is Constant: Quantum. Prove that S = -k_B Tr(ρ̂ log ρ̂) is time-independent, where ρ̂ is any density matrix.

The Arnol'd Cat. Why do we think entropy increases? First, points in phase space don't just swirl in circles: they get stretched and twisted and folded back in complicated patterns, especially in systems where statistical mechanics seems to hold! Arnol'd, in a takeoff on Schrödinger's cat, suggested the following analogy. Instead of a continuous transformation of phase space onto itself preserving 6N-dimensional volume, let's think of an area-preserving mapping of the plane into itself. (You might think of the mapping as a Poincaré section, remembering last problem set.) Consider the mapping

    Γ(x, y) = (x + y, x + 2y) mod n.

See the map illustrated below.

(c) Check that Γ preserves area (what is the determinant?). Show that it takes an n × n square (or a picture of n × n pixels) and maps it into itself with periodic boundary conditions. (With less cutting and pasting, you can view it as a map from the torus into itself.) As a linear map, find the eigenvalues and eigenvectors. Argue that a small neighborhood (say, a circle in the center of the picture) will initially be stretched along an irrational direction into a thin strip. When this thin strip hits the boundary, it gets split into two; in the case of an n × n square, further iterations stretch and chop our original circle into a line uniformly covering the square. In the pixel case, there are always exactly the same number of pixels that are black, white, and each shade of gray: they just get so kneaded together that everything looks
a uniform color. So, by putting a limit to the resolution of our measurement (rounding errors on the computer, for example), or by introducing any tiny coupling to the external world, the final state can be seen to rapidly approach equilibrium, proofs to the contrary notwithstanding!

[Figure: the Arnol'd Cat Transform; see the movie at sander/movies/arnold.html. Notice, however, that the cat comes back! This system exhibits Poincaré recurrence (which we haven't covered) much earlier than most systems.]
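The claims in part (c) are easy to check computationally. Since det [[1,1],[1,2]] = 1, the map is a bijection on the n × n pixel grid (nothing is lost or doubled), and every orbit eventually returns: that is the pixel-level Poincaré recurrence. A sketch (grid size and starting pixel are arbitrary choices):

```python
# Arnol'd cat map on an n x n grid of pixels: (x, y) -> (x + y, x + 2y) mod n.
# The matrix [[1,1],[1,2]] has determinant 1 (area preserving) and
# eigenvalues (3 +/- sqrt(5))/2, with irrational eigendirections.
def cat(x, y, n):
    return (x + y) % n, (x + 2 * y) % n

n = 101
pixels = {(x, y) for x in range(n) for y in range(n)}
assert {cat(x, y, n) for (x, y) in pixels} == pixels  # a bijection: no collisions

# Recurrence: iterate one pixel until it returns home.
period, x, y = 0, 1, 0
while True:
    x, y = cat(x, y, n)
    period += 1
    if (x, y) == (1, 0):
        break
print("the orbit of (1, 0) closes after", period, "steps")
```

The orbit period is tiny compared to the n^2 pixels being shuffled, which is why the cat "comes back" so much sooner than generic Poincaré recurrence estimates would suggest.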
(2.9) Quantum Stat Mech: Density Matrices and Density of States.

(a) Density matrices for photons. Write the density matrix for a photon traveling along z and linearly polarized along x̂, in the basis where (1, 0) and (0, 1) are polarized along x̂ and ŷ. Write the density matrix for a right-handed polarized photon, (1/√2, i/√2), and the density matrix for unpolarized light. Calculate Tr(ρ̂), Tr(ρ̂^2), and S = -k_B Tr(ρ̂ log ρ̂). Interpret the values of the three traces physically: one is a check for pure states, one is a measure of information, and one is a normalization.

(b) Density matrices for a spin. (Adapted from Halperin's course, 1976.) Let the Hamiltonian for a spin be

    H = (ℏ/2) B⃗·σ⃗,

where σ⃗ = (σ_x, σ_y, σ_z) are the three Pauli spin matrices, and B⃗ may be interpreted as a magnetic field, in units where the gyromagnetic ratio is unity. Remember that σ_i σ_j - σ_j σ_i = 2i ε_{ijk} σ_k. Show that any 2 × 2 density matrix may be written in the form

    ρ̂ = (1/2)(1̂ + p⃗·σ⃗).

Show that the equations of motion for the density matrix, iℏ ∂ρ̂/∂t = [H, ρ̂], can be written as dp⃗/dt = B⃗ × p⃗.

In classical mechanics, the entropy goes to minus infinity as the temperature is lowered to zero; in quantum mechanics the entropy goes to zero, because states are quantized. This is Nernst's theorem, the third law of thermodynamics. The classical phase-space volume has units of ((momentum) × (distance))^{3N}. It's a little perverse to take the logarithm of a quantity with units, and the obvious candidate with these dimensions is Planck's constant: h^{3N}. Or should it be ℏ^{3N}?

(c) Arbitrary zero of the classical entropy. Show that the choice of units changes the entropy per particle, and hence that Nernst's theorem depends on getting this choice correct.

(d) Fixing the zero of entropy: quantum Maxwell-Boltzmann ideal gas. Consider N particles in an L × L × L box with periodic boundary conditions. Show that the single-particle energy eigenstates exp(i k·x) form a regular grid in k-space: what is the spacing Δk?
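The three traces of part (a) can be computed explicitly for the three photon density matrices. A sketch with 2 × 2 matrices as nested lists (helper names `tr`, `mul`, `det`, `entropy` are mine; the entropy uses the eigenvalues of a 2 × 2 Hermitian matrix, obtained from its trace and determinant, with the convention 0 log 0 = 0 and k_B = 1):

```python
import math

def tr(m):   return (m[0][0] + m[1][1]).real
def det(m):  return (m[0][0] * m[1][1] - m[0][1] * m[1][0]).real
def mul(a, b):
    return [[sum(a[i][k] * b[k][j] for k in range(2)) for j in range(2)]
            for i in range(2)]

def entropy(rho):
    # S/kB = -sum_i lambda_i log lambda_i, eigenvalues from trace and det
    t, d = tr(rho), det(rho)
    disc = max(t * t - 4 * d, 0.0)
    lams = [(t + math.sqrt(disc)) / 2, (t - math.sqrt(disc)) / 2]
    return -sum(l * math.log(l) for l in lams if l > 1e-12)

rho_x   = [[1, 0], [0, 0]]               # linear polarization along x: pure
rho_R   = [[0.5, -0.5j], [0.5j, 0.5]]    # right-handed (1, i)/sqrt(2): pure
rho_mix = [[0.5, 0], [0, 0.5]]           # unpolarized light: maximally mixed

for name, rho in [("x", rho_x), ("R", rho_R), ("mix", rho_mix)]:
    print(name, tr(rho), tr(mul(rho, rho)), entropy(rho))
# pure states: Tr rho = 1, Tr rho^2 = 1, S = 0
# unpolarized: Tr rho = 1, Tr rho^2 = 1/2, S = log 2
```

Tr ρ = 1 is the normalization check, Tr ρ² = 1 flags a pure state (less than 1 means mixed), and S measures the missing information: log 2 for completely unpolarized light.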
(Fixed boundary conditions would give a grid for only positive k, with a different spacing.) Show that the many-particle energy eigenstates form a regular grid in 3N-dimensional k-space. Show that the constant-energy surfaces are spheres in this space. Assume that the particles are indistinguishable, but classical: only distinct fillings of single-particle eigenstates are different. Calculate the entropy per particle in the limit L → ∞, in analogy to the fixed-boundary-condition calculation in the text. Which version of Planck's constant is the right one to normalize the units of phase space?

We'll see soon that real quantum mechanics gives symmetry constraints on the many-body wave function, which changes the statistics from Maxwell-Boltzmann to either Fermi-Dirac or Bose-Einstein. This calculation, however, remains valid at low densities and high temperatures, where multiply occupied states are rare.
More informationHomework Hint. Last Time
Homework Hint Problem 3.3 Geometric series: ωs τ ħ e s= 0 =? a n ar = For 0< r < 1 n= 0 1 r ωs τ ħ e s= 0 1 = 1 e ħω τ Last Time Boltzmann factor Partition Function Heat Capacity The magic of the partition
More informationPhysics 607 Exam 2. ( ) = 1, Γ( z +1) = zγ( z) x n e x2 dx = 1. e x2
Physics 607 Exam Please be well-organized, and show all significant steps clearly in all problems. You are graded on your work, so please do not just write down answers with no explanation! Do all your
More informationThe Methodology of Statistical Mechanics
Chapter 4 The Methodology of Statistical Mechanics c 2006 by Harvey Gould and Jan Tobochnik 16 November 2006 We develop the basic methodology of statistical mechanics and provide a microscopic foundation
More informationStatistical Mechanics
42 My God, He Plays Dice! Statistical Mechanics Statistical Mechanics 43 Statistical Mechanics Statistical mechanics and thermodynamics are nineteenthcentury classical physics, but they contain the seeds
More informationThermodynamics of phase transitions
Thermodynamics of phase transitions Katarzyna Sznajd-Weron Department of Theoretical of Physics Wroc law University of Science and Technology, Poland March 12, 2017 Katarzyna Sznajd-Weron (WUST) Thermodynamics
More informationAn Outline of (Classical) Statistical Mechanics and Related Concepts in Machine Learning
An Outline of (Classical) Statistical Mechanics and Related Concepts in Machine Learning Chang Liu Tsinghua University June 1, 2016 1 / 22 What is covered What is Statistical mechanics developed for? What
More informationChE 210B: Advanced Topics in Equilibrium Statistical Mechanics
ChE 210B: Advanced Topics in Equilibrium Statistical Mechanics Glenn Fredrickson Lecture 1 Reading: 3.1-3.5 Chandler, Chapters 1 and 2 McQuarrie This course builds on the elementary concepts of statistical
More informationA Deeper Look into Phase Space: The Liouville and Boltzmann Equations
A Deeper Look into Phase Space: The Liouville and Boltzmann Equations Brian Petko Statistical Thermodynamics Colorado School of Mines December 7, 2009 INTRODUCTION Phase Space, in a sense, can be a useful
More information...Thermodynamics. Lecture 15 November 9, / 26
...Thermodynamics Conjugate variables Positive specific heats and compressibility Clausius Clapeyron Relation for Phase boundary Phase defined by discontinuities in state variables Lecture 15 November
More informationt = no of steps of length s
s t = no of steps of length s Figure : Schematic of the path of a diffusing molecule, for example, one in a gas or a liquid. The particle is moving in steps of length s. For a molecule in a liquid the
More informationUNIVERSITY OF MISSOURI-COLUMBIA PHYSICS DEPARTMENT. PART I Qualifying Examination. January 20, 2015, 5:00 p.m. to 8:00 p.m.
UNIVERSITY OF MISSOURI-COLUMBIA PHYSICS DEPARTMENT PART I Qualifying Examination January 20, 2015, 5:00 p.m. to 8:00 p.m. Instructions: The only material you are allowed in the examination room is a writing
More informationIntroduction. Chapter The Purpose of Statistical Mechanics
Chapter 1 Introduction 1.1 The Purpose of Statistical Mechanics Statistical Mechanics is the mechanics developed to treat a collection of a large number of atoms or particles. Such a collection is, for
More informationChapter 5 - Systems under pressure 62
Chapter 5 - Systems under pressure 62 CHAPTER 5 - SYSTEMS UNDER PRESSURE 5.1 Ideal gas law The quantitative study of gases goes back more than three centuries. In 1662, Robert Boyle showed that at a fixed
More informationPreliminary Examination - Day 2 August 16, 2013
UNL - Department of Physics and Astronomy Preliminary Examination - Day August 16, 13 This test covers the topics of Quantum Mechanics (Topic 1) and Thermodynamics and Statistical Mechanics (Topic ). Each
More informationsummary of statistical physics
summary of statistical physics Matthias Pospiech University of Hannover, Germany Contents 1 Probability moments definitions 3 2 bases of thermodynamics 4 2.1 I. law of thermodynamics..........................
More informationChE 503 A. Z. Panagiotopoulos 1
ChE 503 A. Z. Panagiotopoulos 1 STATISTICAL MECHANICAL ENSEMLES 1 MICROSCOPIC AND MACROSCOPIC ARIALES The central question in Statistical Mechanics can be phrased as follows: If particles (atoms, molecules,
More informationPhysics Nov Bose-Einstein Gases
Physics 3 3-Nov-24 8- Bose-Einstein Gases An amazing thing happens if we consider a gas of non-interacting bosons. For sufficiently low temperatures, essentially all the particles are in the same state
More informationStatistical Mechanics in a Nutshell
Chapter 2 Statistical Mechanics in a Nutshell Adapted from: Understanding Molecular Simulation Daan Frenkel and Berend Smit Academic Press (2001) pp. 9-22 11 2.1 Introduction In this course, we will treat
More information1. Thermodynamics 1.1. A macroscopic view of matter
1. Thermodynamics 1.1. A macroscopic view of matter Intensive: independent of the amount of substance, e.g. temperature,pressure. Extensive: depends on the amount of substance, e.g. internal energy, enthalpy.
More informationto satisfy the large number approximations, W W sys can be small.
Chapter 12. The canonical ensemble To discuss systems at constant T, we need to embed them with a diathermal wall in a heat bath. Note that only the system and bath need to be large for W tot and W bath
More informationPhysics 607 Final Exam
Physics 607 Final Exam Please be well-organized, and show all significant steps clearly in all problems You are graded on your work, so please do not ust write down answers with no explanation! o state
More information3. Quantum Mechanics in 3D
3. Quantum Mechanics in 3D 3.1 Introduction Last time, we derived the time dependent Schrödinger equation, starting from three basic postulates: 1) The time evolution of a state can be expressed as a unitary
More informationStatistical Mechanics Victor Naden Robinson vlnr500 3 rd Year MPhys 17/2/12 Lectured by Rex Godby
Statistical Mechanics Victor Naden Robinson vlnr500 3 rd Year MPhys 17/2/12 Lectured by Rex Godby Lecture 1: Probabilities Lecture 2: Microstates for system of N harmonic oscillators Lecture 3: More Thermodynamics,
More informationHAMILTON S PRINCIPLE
HAMILTON S PRINCIPLE In our previous derivation of Lagrange s equations we started from the Newtonian vector equations of motion and via D Alembert s Principle changed coordinates to generalised coordinates
More informationPhysics 4230 Final Exam, Spring 2004 M.Dubson This is a 2.5 hour exam. Budget your time appropriately. Good luck!
1 Physics 4230 Final Exam, Spring 2004 M.Dubson This is a 2.5 hour exam. Budget your time appropriately. Good luck! For all problems, show your reasoning clearly. In general, there will be little or no
More informationCHAPTER 9 Statistical Physics
CHAPTER 9 Statistical Physics 9.1 Historical Overview 9.2 Maxwell Velocity Distribution 9.3 Equipartition Theorem 9.4 Maxwell Speed Distribution 9.5 Classical and Quantum Statistics 9.6 Fermi-Dirac Statistics
More informationThe physical limits of communication or Why any sufficiently advanced technology is indistinguishable from noise. Abstract
The physical limits of communication or Why any sufficiently advanced technology is indistinguishable from noise Michael Lachmann, 1,4 M. E. J. Newman, 2,4 and Cristopher Moore 3,4 1 Max Planck Institute
More informationNanoscale Energy Transport and Conversion A Parallel Treatment of Electrons, Molecules, Phonons, and Photons
Nanoscale Energy Transport and Conversion A Parallel Treatment of Electrons, Molecules, Phonons, and Photons Gang Chen Massachusetts Institute of Technology OXFORD UNIVERSITY PRESS 2005 Contents Foreword,
More informationFigure 1: Doing work on a block by pushing it across the floor.
Work Let s imagine I have a block which I m pushing across the floor, shown in Figure 1. If I m moving the block at constant velocity, then I know that I have to apply a force to compensate the effects
More informationHandout 10. Applications to Solids
ME346A Introduction to Statistical Mechanics Wei Cai Stanford University Win 2011 Handout 10. Applications to Solids February 23, 2011 Contents 1 Average kinetic and potential energy 2 2 Virial theorem
More informationUnit 7 (B) Solid state Physics
Unit 7 (B) Solid state Physics hermal Properties of solids: Zeroth law of hermodynamics: If two bodies A and B are each separated in thermal equilibrium with the third body C, then A and B are also in
More informationAST1100 Lecture Notes
AST11 Lecture Notes Part 1G Quantum gases Questions to ponder before the lecture 1. If hydrostatic equilibrium in a star is lost and gravitational forces are stronger than pressure, what will happen with
More information(a) Write down the total Hamiltonian of this system, including the spin degree of freedom of the electron, but neglecting spin-orbit interactions.
1. Quantum Mechanics (Spring 2007) Consider a hydrogen atom in a weak uniform magnetic field B = Bê z. (a) Write down the total Hamiltonian of this system, including the spin degree of freedom of the electron,
More information01. Equilibrium Thermodynamics I: Introduction
University of Rhode Island DigitalCommons@URI Equilibrium Statistical Physics Physics Course Materials 2015 01. Equilibrium Thermodynamics I: Introduction Gerhard Müller University of Rhode Island, gmuller@uri.edu
More informationConcepts in Materials Science I. StatMech Basics. VBS/MRC Stat Mech Basics 0
StatMech Basics VBS/MRC Stat Mech Basics 0 Goal of Stat Mech Concepts in Materials Science I Quantum (Classical) Mechanics gives energy as a function of configuration Macroscopic observables (state) (P,
More informationIV. Classical Statistical Mechanics
IV. Classical Statistical Mechanics IV.A General Definitions Statistical Mechanics is a probabilistic approach to equilibrium macroscopic properties of large numbers of degrees of freedom. As discussed
More informationPhysics 106a/196a Problem Set 7 Due Dec 2, 2005
Physics 06a/96a Problem Set 7 Due Dec, 005 Version 3, Nov 7, 005 In this set we finish up the SHO and study coupled oscillations/normal modes and waves. Problems,, and 3 are for 06a students only, 4, 5,
More informationM02M.1 Particle in a Cone
Part I Mechanics M02M.1 Particle in a Cone M02M.1 Particle in a Cone A small particle of mass m is constrained to slide, without friction, on the inside of a circular cone whose vertex is at the origin
More informationConcepts for Specific Heat
Concepts for Specific Heat Andreas Wacker 1 Mathematical Physics, Lund University August 17, 018 1 Introduction These notes shall briefly explain general results for the internal energy and the specific
More informationList of Comprehensive Exams Topics
List of Comprehensive Exams Topics Mechanics 1. Basic Mechanics Newton s laws and conservation laws, the virial theorem 2. The Lagrangian and Hamiltonian Formalism The Lagrange formalism and the principle
More informationCONTENTS 1. In this course we will cover more foundational topics such as: These topics may be taught as an independent study sometime next year.
CONTENTS 1 0.1 Introduction 0.1.1 Prerequisites Knowledge of di erential equations is required. Some knowledge of probabilities, linear algebra, classical and quantum mechanics is a plus. 0.1.2 Units We
More informationStatistical Mechanics
Statistical Mechanics Newton's laws in principle tell us how anything works But in a system with many particles, the actual computations can become complicated. We will therefore be happy to get some 'average'
More informationMicrocanonical Ensemble
Entropy for Department of Physics, Chungbuk National University October 4, 2018 Entropy for A measure for the lack of information (ignorance): s i = log P i = log 1 P i. An average ignorance: S = k B i
More informationGraduate Preliminary Examination, Thursday, January 6, Part I 2
Graduate Preliminary Examination, Thursday, January 6, 2011 - Part I 2 Section A. Mechanics 1. ( Lasso) Picture from: The Lasso: a rational guide... c 1995 Carey D. Bunks A lasso is a rope of linear mass
More information3.012 PS Issued: Fall 2003 Graded problems due:
3.012 PS 4 3.012 Issued: 10.07.03 Fall 2003 Graded problems due: 10.15.03 Graded problems: 1. Planes and directions. Consider a 2-dimensional lattice defined by translations T 1 and T 2. a. Is the direction
More informationThe goal of equilibrium statistical mechanics is to calculate the diagonal elements of ˆρ eq so we can evaluate average observables < A >= Tr{Â ˆρ eq
Chapter. The microcanonical ensemble The goal of equilibrium statistical mechanics is to calculate the diagonal elements of ˆρ eq so we can evaluate average observables < A >= Tr{Â ˆρ eq } = A that give
More informationIdentical Particles. Bosons and Fermions
Identical Particles Bosons and Fermions In Quantum Mechanics there is no difference between particles and fields. The objects which we refer to as fields in classical physics (electromagnetic field, field
More information424 Index. Eigenvalue in quantum mechanics, 174 eigenvector in quantum mechanics, 174 Einstein equation, 334, 342, 393
Index After-effect function, 368, 369 anthropic principle, 232 assumptions nature of, 242 autocorrelation function, 292 average, 18 definition of, 17 ensemble, see ensemble average ideal,23 operational,
More informationElements of Statistical Mechanics
Elements of Statistical Mechanics Thermodynamics describes the properties of macroscopic bodies. Statistical mechanics allows us to obtain the laws of thermodynamics from the laws of mechanics, classical
More informationPhysics Oct A Quantum Harmonic Oscillator
Physics 301 5-Oct-2005 9-1 A Quantum Harmonic Oscillator The quantum harmonic oscillator (the only kind there is, really) has energy levels given by E n = (n + 1/2) hω, where n 0 is an integer and the
More informationPart II: Statistical Physics
Chapter 7: Quantum Statistics SDSMT, Physics 2013 Fall 1 Introduction 2 The Gibbs Factor Gibbs Factor Several examples 3 Quantum Statistics From high T to low T From Particle States to Occupation Numbers
More informationPart II Statistical Physics
Part II Statistical Physics Theorems Based on lectures by H. S. Reall Notes taken by Dexter Chua Lent 2017 These notes are not endorsed by the lecturers, and I have modified them (often significantly)
More informationG : Statistical Mechanics Notes for Lecture 3 I. MICROCANONICAL ENSEMBLE: CONDITIONS FOR THERMAL EQUILIBRIUM Consider bringing two systems into
G25.2651: Statistical Mechanics Notes for Lecture 3 I. MICROCANONICAL ENSEMBLE: CONDITIONS FOR THERMAL EQUILIBRIUM Consider bringing two systems into thermal contact. By thermal contact, we mean that the
More informationPhysics PhD Qualifying Examination Part I Wednesday, January 21, 2015
Physics PhD Qualifying Examination Part I Wednesday, January 21, 2015 Name: (please print) Identification Number: STUDENT: Designate the problem numbers that you are handing in for grading in the appropriate
More informationCHEM-UA 652: Thermodynamics and Kinetics
1 CHEM-UA 652: Thermodynamics and Kinetics Notes for Lecture 2 I. THE IDEAL GAS LAW In the last lecture, we discussed the Maxwell-Boltzmann velocity and speed distribution functions for an ideal gas. Remember
More informationMASSACHUSETTS INSTITUTE OF TECHNOLOGY Physics Department Statistical Physics I Spring Term 2013
MASSACHUSETTS INSTITUTE OF TECHNOLOGY Physics Department 8.044 Statistical Physics I Spring Term 2013 Problem 1: The Big Bang Problem Set #9 Due in hand-in box by 4;00 PM, Friday, April 19 Early in the
More informationCHAPTER 9 Statistical Physics
CHAPTER 9 Statistical Physics 9.1 9.2 9.3 9.4 9.5 9.6 9.7 Historical Overview Maxwell Velocity Distribution Equipartition Theorem Maxwell Speed Distribution Classical and Quantum Statistics Fermi-Dirac
More informationLecture 25 Goals: Chapter 18 Understand the molecular basis for pressure and the idealgas
Lecture 5 Goals: Chapter 18 Understand the molecular basis for pressure and the idealgas law. redict the molar specific heats of gases and solids. Understand how heat is transferred via molecular collisions
More informationData Provided: A formula sheet and table of physical constants are attached to this paper.
Data Provided: A formula sheet and table of physical constants are attached to this paper. DEPARTMENT OF PHYSICS AND ASTRONOMY Spring Semester (2016-2017) From Thermodynamics to Atomic and Nuclear Physics
More informationIntroduction to Thermodynamic States Gases
Chapter 1 Introduction to Thermodynamic States Gases We begin our study in thermodynamics with a survey of the properties of gases. Gases are one of the first things students study in general chemistry.
More informationCoherent states, beam splitters and photons
Coherent states, beam splitters and photons S.J. van Enk 1. Each mode of the electromagnetic (radiation) field with frequency ω is described mathematically by a 1D harmonic oscillator with frequency ω.
More informationFinal Exam for Physics 176. Professor Greenside Wednesday, April 29, 2009
Print your name clearly: Signature: I agree to neither give nor receive aid during this exam Final Exam for Physics 76 Professor Greenside Wednesday, April 29, 2009 This exam is closed book and will last
More informationSecond Law of Thermodynamics: Concept of Entropy. While the first law of thermodynamics defines a relation between work and heat, in terms of
Chapter 4 Second Law of Thermodynamics: Concept of Entropy The law that entropy always increases, holds, I think, the supreme position among the laws of Nature. If someone points out to you that your pet
More informationOutline Review Example Problem 1. Thermodynamics. Review and Example Problems: Part-2. X Bai. SDSMT, Physics. Fall 2014
Review and Example Problems: Part- SDSMT, Physics Fall 014 1 Review Example Problem 1 Exponents of phase transformation : contents 1 Basic Concepts: Temperature, Work, Energy, Thermal systems, Ideal Gas,
More informationBrief review of Quantum Mechanics (QM)
Brief review of Quantum Mechanics (QM) Note: This is a collection of several formulae and facts that we will use throughout the course. It is by no means a complete discussion of QM, nor will I attempt
More informationIntroduction. Statistical physics: microscopic foundation of thermodynamics degrees of freedom 2 3 state variables!
Introduction Thermodynamics: phenomenological description of equilibrium bulk properties of matter in terms of only a few state variables and thermodynamical laws. Statistical physics: microscopic foundation
More informationLecture Notes 2014March 13 on Thermodynamics A. First Law: based upon conservation of energy
Dr. W. Pezzaglia Physics 8C, Spring 2014 Page 1 Lecture Notes 2014March 13 on Thermodynamics A. First Law: based upon conservation of energy 1. Work 1 Dr. W. Pezzaglia Physics 8C, Spring 2014 Page 2 (c)
More informationInternational Physics Course Entrance Examination Questions
International Physics Course Entrance Examination Questions (May 2010) Please answer the four questions from Problem 1 to Problem 4. You can use as many answer sheets you need. Your name, question numbers
More information