Entropy, Shannon's Measure of Information and Boltzmann's H-Theorem


Arieh Ben-Naim

Department of Physical Chemistry, The Hebrew University of Jerusalem, Jerusalem 91904, Israel; arieh@fh.huji.ac.il

Academic Editors: Geert Verdoolaege and Kevin H. Knuth

Received: 3 November 2016; Accepted: 1 January 2017; Published: 4 January 2017

Entropy 2017, 19, 48; doi:10.3390/e19020048

Abstract: We start with a clear distinction between Shannon's Measure of Information (SMI) and the Thermodynamic Entropy. The first is defined on any probability distribution, and therefore it is a very general concept. On the other hand Entropy is defined on a very special set of distributions. Next we show that the Shannon Measure of Information (SMI) provides a solid and quantitative basis for the interpretation of the thermodynamic entropy. The entropy measures the uncertainty in the distribution of the locations and momenta of all the particles, as well as two corrections due to the uncertainty principle and the indistinguishability of the particles. Finally we show that the H-function as defined by Boltzmann is an SMI but not entropy. Therefore, much of what has been written on the H-theorem is irrelevant to entropy and the Second Law of Thermodynamics.

Keywords: entropy; Shannon's measure of information; Second Law of Thermodynamics; H-theorem

1. Introduction

The purpose of this article is to revisit an old problem, the relationship between entropy and Shannon's measure of information. An even older problem is the question about the subjectivity of entropy which arose from the association of entropy with the general concept of information. Finally, we discuss the H-theorem; its meaning, its criticism, and its relationship with the Second Law of Thermodynamics.

The paper is organized in four parts. In Section 2, we present a brief introduction to the concept of SMI. In Section 3, we derive the thermodynamic entropy as a special case of SMI. In Section 4, we revisit the Boltzmann H-theorem. In light of the SMI-based interpretation of entropy, it will become clear that the function H(t) is identical with the SMI of the velocity distribution. The entropy is obtained from H(t) after taking the limit t → ∞, i.e., the value of H(t) at equilibrium.

Because of its central importance we state our conclusion here: It is absolutely necessary to distinguish between SMI and entropy. Failing to make such a distinction has led to many misinterpretations of entropy and the Second Law, as well as to assigning properties of SMI to entropy, and in particular to misunderstanding the H-theorem, discussed in Section 4.

In 1948, Shannon sought and found a remarkable measure of information, of uncertainty [1,2] and unlikelihood. It was not a measure of any information, not of any uncertainty about any proposition, and not of the unlikelihood about the occurrence of any event. However, because the quantity he found has the same mathematical form as the entropy in statistical mechanics, he called his measure, as allegedly suggested by von Neumann, "entropy". This proved to be a grievous mistake which has caused great confusion in both information theory and thermodynamics.

The SMI is defined for any probability distribution. The entropy is defined on a tiny subset of all possible distributions. Calling SMI "entropy" leads to many awkward statements such as: "The value of the maximal entropy at equilibrium is the entropy of the system."

The correct statement concerning the entropy of an isolated system is as follows: An isolated system at equilibrium is characterized by a fixed energy E, volume V and number of particles N (assuming a one-component system). For such a system, the entropy is determined by the variables E, V, N. In this system the entropy is fixed. It is not a function of time, it does not change with time, and it does not tend to a maximum.

Similarly, one can define the entropy for any other well-defined thermodynamic system at equilibrium [3,4]. This is exactly what is meant by the statement that entropy is a state function.

For any isolated system not at equilibrium one can define the SMI on the probability distributions of locations and velocities of all the particles. This SMI changes with time [2]. At equilibrium, it attains a maximal value. The maximal value of the SMI, attained at equilibrium, is related to the entropy of the system [1,2]. In this article, whenever we talk about SMI we use the logarithm to the base 2, but in thermodynamics we use, for convenience, the natural logarithm log_e x. To convert to SMI we need to multiply by log_2 e, i.e., log_2 x = log_2 e · log_e x.

Parts of this article have been published before in [3–5]. Specifically, the derivation of the entropy function of an ideal gas based on the SMI was published by the author in 2008 [3]. The discussion of the Boltzmann H-Theorem in terms of the SMI is new. We do not discuss the relations with the huge field of thermodynamics of irreversible processes. This whole field is based on the assumption of local equilibrium, which, in the author's opinion, was never fully justified. Therefore, in this article we use the concept of entropy only for macroscopic equilibrium systems, while SMI may be used for any system.

2. A Brief Introduction to the Concept of SMI

In this section, we present a very simple definition of the SMI. We then discuss its various interpretations. For any random variable X (or an experiment, or a game, see below), characterized by a probability distribution p_1, p_2, ..., p_n, we define the SMI as:

H = −∑_{i=1}^{n} p_i log_2 p_i    (1)

If X is an experiment having n outcomes, then p_i is the probability associated with the occurrence of the outcome i. We now discuss briefly the three interpretations of SMI. The first is an average uncertainty about the outcome of an experiment; the second, a measure of unlikelihood; and the third, a measure of information. It is ironic that the informational interpretation of SMI is the least straightforward one, and as a result it is also the one most commonly misused. Note that SMI has the form of an average quantity. However, this is a very special average. It is an average of the quantity −log p_i using the probability distribution p_1, ..., p_n.

2.1. The Uncertainty Meaning of SMI

The interpretation of H as an average uncertainty is very popular. This interpretation is derived directly from the meaning of the probability distribution [2,5,6]. Suppose that we have an experiment yielding n possible outcomes with probability distribution p_1, ..., p_n. If, say, p_i = 1, then we are certain that the outcome i occurred or will occur. For any other value of p_i, we are less certain about the occurrence of the event i. Less certainty can be translated to more uncertainty. Therefore, the larger the value of −log p_i, the larger the extent of uncertainty about the occurrence of the event i. Multiplying −log p_i by p_i, and summing over all i, we get an average uncertainty about all the possible outcomes of the experiment [6].
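As a quick numerical illustration of Equation (1) (a minimal sketch added for concreteness, not part of the original article), the SMI can be computed directly from any probability distribution; the helper below also checks the base-2/natural-log conversion mentioned above. The function name smi and the example distributions are arbitrary choices.

```python
import math

def smi(probs, base=2.0):
    """Shannon's measure of information (SMI) of a probability distribution.

    Terms with p = 0 contribute nothing, consistent with p*log(p) -> 0.
    """
    assert abs(sum(probs) - 1.0) < 1e-9, "probabilities must sum to 1"
    return -sum(p * math.log(p, base) for p in probs if p > 0)

# A fair coin carries 1 bit; a fair die carries log2(6) ~ 2.585 bits.
print(smi([0.5, 0.5]))        # 1.0
print(smi([1/6] * 6))         # ~2.585

# Conversion between bits and natural-log units: H_nat = H_bits * ln 2.
h_bits = smi([1/6] * 6, base=2)
h_nat = smi([1/6] * 6, base=math.e)
print(abs(h_nat - h_bits * math.log(2)) < 1e-12)   # True
```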

We should add here that when p_i = 0, we are certain that the event i will not occur. It would be awkward to say in this case that the uncertainty in the occurrence of i is zero. Fortunately, this awkwardness does not affect the value of H. Once we form the product p_i log p_i, we get zero either when p_i = 1 or when p_i = 0.

Yaglom and Yaglom [7] suggest referring to −p_i log p_i as the uncertainty in the event i. In this view, the SMI (referred to as "entropy" by Yaglom and Yaglom) is a sum over all the uncertainties in the outcomes of the experiment.

This interpretation is invalid for the following reason. As we noted above, it is plausible to interpret −log p_i as a measure of the extent of uncertainty with respect to the occurrence of the outcome i. Since −log p_i is a monotonically decreasing function of p_i, Figure 1a, a larger p_i, or a smaller −log p_i, means smaller uncertainty (or larger certainty). In this view, the SMI is an average uncertainty over all possible outcomes of the experiment.

Figure 1. The functions (a) −Log(p) and (b) −p Log(p).

The quantity −p_i log p_i, on the other hand, is not a monotonic function of p_i, Figure 1b. Therefore, one cannot use this quantity to measure the extent of uncertainty with respect to the occurrence of the outcome i.
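The non-monotonicity stressed above is easy to check numerically (an illustrative sketch, not a reproduction of Figure 1): −log2(p) falls steadily as p grows, whereas −p log2(p) rises, peaks at p = 1/e, and then falls back toward zero.

```python
import math

for p in (0.05, 0.1, 1/math.e, 0.5, 0.7, 0.9, 0.99):
    neg_log = -math.log2(p)         # monotonically decreasing in p
    p_neg_log = -p * math.log2(p)   # not monotonic: maximum at p = 1/e ~ 0.368
    print(f"p={p:.3f}  -log2(p)={neg_log:.3f}  -p*log2(p)={p_neg_log:.3f}")
```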
2.2. The Unlikelihood Interpretation

A slightly different but still useful interpretation of H is in terms of likelihood or expectedness. These two are also derived from the meaning of probability. When p_i is small, the event i is unlikely to occur, or its occurrence is less expected. When p_i approaches one, we can say that the occurrence of i is more likely. Since log p_i is a monotonically increasing function of p_i, we can say that the larger the value of log p_i, the larger the likelihood or the larger the expectedness of the event. Since 0 ≤ p_i ≤ 1, we have log p_i ≤ 0. The quantity −log p_i is thus a measure of the unlikelihood or the unexpectedness of the event i. Therefore, the quantity H = −∑ p_i log p_i is a measure of the average unlikelihood, or unexpectedness, of the entire set of outcomes of the experiment.

2.3. The Meaning of SMI as a Measure of Information

As we have seen, both the uncertainty and the unlikelihood interpretations of H are derived from the meaning of the probabilities p_i. The interpretation of H as a measure of information is a little trickier and less straightforward. It is also more interesting since it conveys a different kind of information on Shannon's measure of information. As we already emphasized, the SMI is not information [8]. It is also not a measure of any piece of information, but of a very particular kind of information. The confusion of SMI with information is almost the rule, not the exception, among both scientists and non-scientists.

Some authors assign to the quantity −log p_i the meaning of information (or self-information) associated with the event i.

The idea is that if an event is rare, i.e., p_i is small and hence −log p_i is large, then one gets "more information" when one knows that the event has occurred. Consider the probabilities of the outcomes of a die as shown in Figure 2a. We see that outcome 1 is less probable than outcome 2. We may say that we are less uncertain about the outcome 2 than about 1. We may also say that outcome 1 is less likely to occur than outcome 2. However, when we are informed that outcome 1 or 2 occurred, we cannot claim that we have received more or less information. When we know that an event i has occurred, we have got the information on the occurrence of i. One might be surprised to learn that a rare event has occurred, but the size of the information one gets when the event i occurs is not dependent on the probability of that event.

Both p_i and −log p_i are measures of the uncertainty about the occurrence of an event. They do not measure information about the events. Therefore, we do not recommend referring to −log p_i as information (or self-information) associated with the event i. Hence, H should not be interpreted as the average information associated with the experiment. Instead, we assign the informational meaning directly to the quantity H, rather than to the individual events.

Figure 2. Two possible distributions of an unfair die.

It is sometimes said that removing the uncertainty is tantamount to obtaining information. This is true for the entire experiment, i.e., for the entire probability distribution, not for the individual events. Suppose that we have an unfair die with probabilities p_1 = 1/10, p_2 = 1/10, p_3 = 1/10, p_4 = 1/10, p_5 = 1/10 and p_6 = 1/2, Figure 2b. Clearly, the uncertainty we have regarding the outcome i = 6 is less than the uncertainty we have regarding any outcome i ≠ 6. When we carry out the experiment and find the result, say i = 3, we removed the uncertainty we had about the outcome before carrying out the experiment. However, it would be wrong to argue that the amount of information we got is larger or smaller than if another outcome had occurred. Note also that we talk here about the amount of information, not the information itself. If the outcome is i = 3, the information we got is: "The outcome is 3". If the outcome is i = 6, the information is: "The outcome is 6". These are different pieces of information, but one cannot claim that one is larger or smaller than the other.

We emphasize again that the interpretation of H as an average uncertainty or average unlikelihood is derived from the meaning of each term −log p_i. The interpretation of H as a measure of information is not associated with the meaning of each probability p_i, but with the entire distribution p_1, ..., p_n.
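The point can be made quantitative for the two dice of Figure 2 (an illustrative calculation added here, not taken from the paper): the biased die of Figure 2b has a smaller SMI than a fair die, even though its rare outcomes have a larger −log2(p_i).

```python
import math

def smi(probs):
    return -sum(p * math.log2(p) for p in probs if p > 0)

fair = [1/6] * 6
biased = [1/10, 1/10, 1/10, 1/10, 1/10, 1/2]   # the die of Figure 2b

print(smi(fair))     # ~2.585 bits
print(smi(biased))   # ~2.161 bits: the whole distribution carries less SMI

# -log2(p_i) is larger for a rare outcome, but it measures unlikelihood,
# not the "amount of information received" when that outcome occurs.
print(-math.log2(biased[0]))   # ~3.32 for p = 1/10
print(-math.log2(biased[5]))   # 1.0  for p = 1/2
```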

We now describe in a qualitative way the meaning of H as a measure of information associated with the entire experiment.

Consider any experiment or game having n outcomes with probabilities p_1, ..., p_n. For concreteness, suppose we throw a dart at a board, Figure 3. The board is divided into n regions, of areas A_1, ..., A_n. We know that the dart hit one of these regions. We assume that the probability of hitting the ith region is p_i = A_i/A, where A is the total area of the board.

Figure 3. A board divided into five unequal regions.

Now the experiment is carried out, and you have to find out where the dart hit the board. You know that the dart hit the board, and you know the probability distribution p_1, ..., p_n. Your task is to find out in which region the dart is by asking binary questions, i.e., questions answerable by Yes or No.

Clearly, since you do not know where the dart is, you lack information on the location of the dart. To acquire this information you ask questions. We are interested in the amount of information contained in this experiment. One way of measuring this amount of information is by the number of questions you need to ask in order to obtain the required information.

As everyone who has played the 20-question (20Q) game knows, the number of questions you need to ask depends on the strategy for asking questions. Here we shall not discuss what constitutes a strategy for asking questions [9]. Here we are only interested in a measure of the amount of information contained in this experiment. It turns out that the quantity H, which we referred to as Shannon's measure of information (SMI), provides us with a measure of this information in terms of the minimum number of questions one needs to ask in order to find the location of the dart, given the probability distribution of the various outcomes [2,8].

For a general experiment with n possible outcomes, having probabilities p_1, ..., p_n, the quantity H is a measure of how difficult it is to find out which outcome has occurred, given that an experiment was carried out. It is easy to see that for experiments having the same total number of outcomes n, but with different probability distributions, the amount of information (measured in terms of the number of questions) is different. In other words, knowing the probability distribution gives us a "hint", or some partial information, on the outcomes. This is the reason why we refer to H as a measure of the amount of information contained in, or associated with, a given probability distribution. We emphasize again that the SMI is a measure of information associated with the entire distribution, not with the individual probabilities.
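The "number of binary questions" reading can be illustrated with a short sketch (added here for illustration; the article deliberately does not discuss particular questioning strategies). The questions below follow a Huffman code, one standard near-optimal strategy, so the expected number of yes/no questions lies between H and H + 1, and equals H exactly for dyadic probabilities. The dart-board probabilities are invented for the example.

```python
import heapq, math

def smi(probs):
    return -sum(p * math.log2(p) for p in probs if p > 0)

def avg_questions(probs):
    """Expected number of yes/no questions when the questions follow a Huffman code."""
    # heap entries: (subtree probability, unique tie-break, list of (leaf index, depth))
    heap = [(p, i, [(i, 0)]) for i, p in enumerate(probs)]
    heapq.heapify(heap)
    counter = len(probs)
    while len(heap) > 1:
        p1, _, l1 = heapq.heappop(heap)
        p2, _, l2 = heapq.heappop(heap)
        merged = [(i, d + 1) for i, d in l1 + l2]
        heapq.heappush(heap, (p1 + p2, counter, merged))
        counter += 1
    _, _, leaves = heap[0]
    return sum(probs[i] * d for i, d in leaves)

board = [0.5, 0.25, 0.125, 0.0625, 0.0625]   # hypothetical dart-board areas p_i = A_i/A
print(smi(board), avg_questions(board))      # 1.875 bits, 1.875 questions (dyadic case)

die = [1/6] * 6
print(smi(die), avg_questions(die))          # ~2.585 bits, ~2.667 questions: H <= <Q> < H + 1
```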
3. Derivation of the Entropy Function for an Ideal Gas

In this section we derive the entropy function for an ideal gas. We start with the SMI, which is definable for any probability distribution [9,10]. We apply the SMI to two molecular distributions: the locational and the momentum distribution. Next, we calculate the distribution which maximizes the SMI. We refer to this distribution as the equilibrium distribution. Finally, we apply two corrections to the SMI, one due to the Heisenberg uncertainty principle, the second due to the indistinguishability of the particles.

The resulting SMI is, up to a multiplicative constant, equal to the entropy of the gas, as calculated by Sackur and Tetrode based on the Boltzmann definition of entropy [11,12].

In a previous publication [2,13], we discussed several advantages of the SMI-based definition of entropy. For our purpose in this article the most important aspect of this definition is the following: The entropy is defined as the maximum value of the SMI. As such, it is not a function of time. We shall discuss the implication of this conclusion for the Boltzmann H-theorem in Section 4.

3.1. The Locational SMI of a Particle in a 1D Box of Length L

Suppose we have a particle confined to a one-dimensional (1D) box of length L. Since there are infinitely many points at which the particle can be within the interval (0, L), the corresponding locational SMI must be infinity. However, we can define, as Shannon did, the following quantity by analogy with the discrete case:

H(X) = −∫ f(x) log f(x) dx    (2)

This quantity might either converge or diverge, but in any case, in practice we shall use only differences of this quantity. It is easy to calculate the density which maximizes the locational SMI, H(X) in (2), which is [1,2]:

f_eq(x) = 1/L    (3)

The use of the subscript eq (for equilibrium) will be made clear later, and the corresponding SMI calculated by (2) is:

H(locations in 1D) = log L    (4)

We acknowledge that the location X of the particle cannot be determined with absolute accuracy, i.e., there exists a small interval h_x within which we do not care where the particle is. Therefore, we must correct Equation (4) by subtracting log h_x. Thus, we write instead of (4):

H(X) = log L − log h_x    (5)

We recognize that in (5) we effectively defined H(X) for a finite number of intervals n = L/h. Note that when h_x → 0, H(X) diverges to infinity. Here, we do not take the mathematical limit, but we stop at h_x small enough but not zero. Note also that in writing (5) we do not have to specify the units of length, as long as we use the same units for L and h_x.
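Equation (5) can be read as the SMI of a uniform distribution over n = L/h_x distinguishable intervals, which is easy to confirm numerically (an illustrative sketch with arbitrary values for L and h_x):

```python
import math

L, hx = 10.0, 0.01    # box length and resolution, in the same (arbitrary) units
n = int(L / hx)       # number of distinguishable intervals

# Discrete SMI of the uniform distribution over n intervals ...
h_discrete = -sum((1 / n) * math.log2(1 / n) for _ in range(n))

# ... equals log2(L) - log2(hx), i.e., Equation (5) in bits.
h_eq5 = math.log2(L) - math.log2(hx)
print(h_discrete, h_eq5)   # both ~9.966 bits
```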

3.2. The Velocity SMI of a Particle in a 1D Box of Length L

Next, we calculate the probability distribution that maximizes the continuous SMI, subject to the two conditions:

∫ f(x) dx = 1    (6)

∫ x² f(x) dx = σ² = constant    (7)

The result is the Normal distribution [1,2]:

f_eq(x) = exp[−x²/2σ²] / √(2πσ²)    (8)

The subscript eq for equilibrium will be made clear later. Applying this result to a classical particle having average kinetic energy m⟨v_x²⟩/2, and identifying the standard deviation σ² with the temperature of the system:

σ² = k_B T / m    (9)

we get the equilibrium velocity distribution of one particle in a 1D system:

f_eq(v_x) = √(m / 2π k_B T) exp[−m v_x² / 2 k_B T]    (10)

where k_B is the Boltzmann constant, m is the mass of the particle, and T the absolute temperature. The value of the continuous SMI for this probability density is:

H_max(velocity in 1D) = (1/2) log(2πe k_B T / m)    (11)

Similarly, we can write the momentum distribution in 1D, by transforming from v_x to p_x = m v_x, to get:

f_eq(p_x) = (1 / √(2πm k_B T)) exp[−p_x² / 2m k_B T]    (12)

and the corresponding maximal SMI:

H_max(momentum in 1D) = (1/2) log(2πe m k_B T)    (13)

As we have noted in connection with the locational SMI, the quantities (11) and (13) were calculated using the definition of the continuous SMI. Again, recognizing the fact that there is a limit to the accuracy within which we can determine the velocity, or the momentum, of the particle, we correct the expression in (13) by subtracting log h_p, where h_p is a small, but finite, interval:

H_max(momentum in 1D) = (1/2) log(2πe m k_B T) − log h_p    (14)

Note again that if we choose the units of h_p (of momentum: mass × length/time) the same as those of √(m k_B T), then the whole expression under the logarithm will be a pure number.

3.3. Combining the SMI for the Location and Momentum of One Particle in a 1D System

In the previous two sections, we derived expressions for the locational and the momentum SMI of one particle in a 1D system. We now combine the two results. Assuming that the location and the momentum (or velocity) of the particle are independent events, we write:

H_max(location and momentum) = H_max(location) + H_max(momentum) = log[ L √(2πe m k_B T) / (h_x h_p) ]    (15)

Recall that h_x and h_p were chosen to eliminate the divergence of the SMI for continuous random variables: location and momentum.

In (15) we assume that the location and momentum of the particle are independent. However, quantum mechanics imposes a restriction on the accuracy in determining both the location x and the corresponding momentum p_x. In Equations (5) and (14), h_x and h_p were introduced because we did not care to determine the location and the momentum with an accuracy greater than h_x and h_p, respectively. Now, we must acknowledge that nature imposes upon us a limit on the accuracy with which we can determine both the location and the corresponding momentum. Thus, in Equation (15), h_x and h_p cannot both be arbitrarily small, but their product must be of the order of Planck's constant h = 6.626 × 10⁻³⁴ J·s. Thus we set:

h_x h_p ≈ h    (16)

and instead of (15), we write:

H_max(location and momentum) = log[ L √(2πe m k_B T) / h ]    (17)
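To get a feel for the magnitudes in Equations (14)–(17), the sketch below (an illustrative aside, with an argon-like mass chosen only as an example) evaluates the 1D momentum SMI of Equation (14) with the resolution h_p set equal to Planck's constant, the choice implied by Equation (16), so that the argument of the logarithm is a pure number.

```python
import math

kB = 1.380649e-23      # J/K, Boltzmann constant
h = 6.62607015e-34     # J*s, Planck constant
m = 6.634e-26          # kg, roughly the mass of an argon atom (assumed example)
T = 300.0              # K

# Equation (14) with h_p = h, so the logarithm's argument is dimensionless.
H_p = 0.5 * math.log2(2 * math.pi * math.e * m * kB * T) - math.log2(h)
print(H_p)   # ~36.6 bits of momentum SMI for one particle in 1D
```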

3.4. The SMI of a Particle in a Box of Volume V

We consider again one simple particle in a box of volume V. We assume that the locations of the particle along the three axes x, y and z are independent. Therefore, we can write the SMI of the location of the particle in a cube of edge L and volume V as:

H(location in 3D) = 3 H_max(location in 1D)    (18)

Similarly, for the momentum of the particle we assume that the momentum (or the velocity) along the three axes x, y and z are independent. Hence, we write:

H_max(momentum in 3D) = 3 H_max(momentum in 1D)    (19)

We combine the SMI of the locations and momenta of one particle in a box of volume V, taking into account the uncertainty principle. The result is:

H_max(location and momentum in 3D) = 3 log[ L √(2πe m k_B T) / h ]    (20)

3.5. The SMI of Locations and Momenta of N Independent Particles in a Box of Volume V

The next step is to proceed from one particle in a box to N independent particles in a box of volume V. Given the location (x, y, z) and the momentum (p_x, p_y, p_z) of one particle within the box, we say that we know the microstate of the particle. If there are N particles in the box, and if their microstates are independent, we can write the SMI of N such particles simply as N times the SMI of one particle, i.e.:

SMI(N independent particles) = N × SMI(one particle)    (21)

This equation would have been correct if the microstates of all the particles were independent. In reality, there are always correlations between the microstates of all the particles; one is due to the intermolecular interactions between the particles, the second is due to the indistinguishability of the particles. We shall discuss these two sources of correlation separately.

(i) Correlation Due to Indistinguishability

Recall that the microstate of a single particle includes the location and the momentum of that particle. Let us focus on the location of one particle in a box of volume V. We have written the locational SMI as:

H_max(location) = log V    (22)

Recall that this result was obtained for the continuous locational SMI. This result does not take into account the divergence of the limiting procedure. In order to explain the source of the correlation due to indistinguishability, suppose that we divide the volume V into a very large number M of small cells, each of volume V/M. We are not interested in the exact location of each particle, but only in which cell each particle is. The total number of cells is M, and we assume that the total number of particles is N ≪ M. If each cell can contain at most one particle, then there are M possibilities to put the

first particle in one of the cells, and there are M − 1 possibilities to put the second particle in the remaining empty cells. Altogether, we have M(M − 1) possible microstates, or configurations, for two particles.

The probability that one particle is found in cell i, and the second in a different cell j, is:

Pr(i, j) = 1 / [M(M − 1)]    (23)

The probability that a particle is found in cell i is:

Pr(j) = Pr(i) = 1/M    (24)

Therefore, we see that even in this simple example, there is correlation between the events "one particle in i" and "one particle in j":

g(i, j) = Pr(i, j) / [Pr(i) Pr(j)] = M² / [M(M − 1)] = 1 / (1 − 1/M)    (25)

Clearly, this correlation can be made as small as we wish, by taking M ≫ 1 (or in general, M ≫ N). There is another correlation which we cannot eliminate and is due to the indistinguishability of the particles.

Note that in counting the total number of configurations we have implicitly assumed that the two particles are labeled, say red and blue. In this case we count the two configurations in Figure 4a as different configurations: "blue particle in cell i and red particle in cell j", and "blue particle in cell j and red particle in cell i".

Atoms and molecules are indistinguishable by nature; we cannot label them. Therefore, the two microstates (or configurations) in Figure 4b are indistinguishable. This means that the total number of configurations is not M(M − 1), but:

number of configurations = M(M − 1)/2 ≈ M²/2, for large M    (26)

Figure 4. Two different configurations are reduced to one when the particles are indistinguishable.

For very large M we have a correlation between the events "particle in i" and "particle in j":

g(i, j) = Pr(i, j) / [Pr(i) Pr(j)] = (2/M²) / (1/M²) = 2    (27)

For N particles distributed in M cells, we have a correlation function (for M ≫ N):

g(i_1, i_2, ..., i_N) = M^N / (M^N / N!) = N!    (28)

This means that for N indistinguishable particles we must divide the number of configurations M^N by N!. Thus, in general, by removing the "labels" on the particles the number of configurations is reduced by a factor N!. For two particles, the two configurations shown in Figure 4a reduce to the one shown in Figure 4b.

Now that we know that there are correlations between the events "one particle in i_1", "one particle in i_2", ..., "one particle in i_N", we can define the mutual information corresponding to this correlation. We write this as:

I(1; 2; ...; N) = ln N!    (29)

The SMI for N particles will be:

H(N particles) = ∑_{i=1}^{N} H(one particle) − ln N!    (30)

For the definition of the mutual information, see [2].

Using the SMI for the location and momentum of one particle in (20), we can write the final result for the SMI of N indistinguishable (but non-interacting) particles as:

H(N indistinguishable) = N log[ V (2πme k_B T / h²)^{3/2} ] − log N!    (31)

Using the Stirling approximation for log N! in the form (note again that we use the natural logarithm):

log N! ≈ N log N − N    (32)

we have the final result for the SMI of N indistinguishable particles in a box of volume V, and temperature T:

H(1, 2, ..., N) = N log[ (V/N) (2πm k_B T / h²)^{3/2} ] + (5/2) N    (33)

By multiplying the SMI of N particles in a box of volume V at temperature T by the factor (k_B log_e 2), one gets the entropy, the thermodynamic entropy, of an ideal gas of simple particles. This equation was derived by Sackur and by Tetrode in 1912, by using the Boltzmann definition of entropy.

One can convert this expression into the entropy function S(E, V, N), by using the relationship between the total energy of the system and the total kinetic energy of all the particles:

E = N m⟨v²⟩/2 = (3/2) N k_B T    (34)

The explicit entropy function of an ideal gas is:

S(E, V, N) = N k_B ln[ (V/N) (E/N)^{3/2} ] + (3/2) k_B N [ 5/3 + ln(4πm / 3h²) ]    (35)
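As a numerical sanity check of Equation (35) (an illustrative sketch with assumed inputs, not a table from the paper), the expression reproduces the familiar Sackur–Tetrode value of roughly 155 J/K for one mole of argon at 298 K and 1 bar, close to the tabulated standard entropy of argon of about 154.8 J/(K·mol).

```python
import math

kB = 1.380649e-23    # J/K
h = 6.62607015e-34   # J*s
NA = 6.02214076e23   # 1/mol

# Assumed example: one mole of argon treated as an ideal gas at 298.15 K and 1 bar.
T = 298.15
P = 1.0e5
N = NA
m = 39.948 * 1.66053907e-27    # kg per argon atom
V = N * kB * T / P             # ideal-gas volume, ~0.0248 m^3
E = 1.5 * N * kB * T           # Equation (34): total kinetic energy

# Equation (35): the explicit entropy function S(E, V, N) of an ideal gas.
S = (N * kB * math.log((V / N) * (E / N) ** 1.5)
     + 1.5 * kB * N * (5.0 / 3.0 + math.log(4 * math.pi * m / (3 * h ** 2))))
print(S)   # ~155 J/K per mole
```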

(ii) Correlation Due to Intermolecular Interactions

In Equation (35) we got the entropy of a system of non-interacting simple particles (ideal gas). In any real system of particles, there are some interactions between the particles. One of the simplest interaction energy potential functions is shown in Figure 5. Without getting into any details of the function U(r) shown in the Figure, it is clear that there are two regions of distances, 0 ≤ r ≤ σ and σ ≤ r < ∞, where the slope of the function U(r) is negative and positive, respectively. A negative slope corresponds to repulsive forces between the pair of particles when they are at a distance smaller than σ. This is the reason why σ is sometimes referred to as the effective diameter of the particles. For larger distances, r ≳ σ, we observe attractive forces between the particles.

Figure 5. The general form of the pair-potential between two particles.

Intuitively, it is clear that interactions between the particles induce correlations between the locational probabilities of the two particles. For hard-sphere particles there is an infinitely strong repulsive force between two particles when they approach a distance r ≤ σ. Thus, if we know the location R_1 of one particle, we can be sure that a second particle, at R_2, is not in a sphere of diameter σ around the point R_1. This repulsive interaction may be said to introduce negative correlation between the locations of the two particles.

On the other hand, two argon atoms attract each other at distances r ≈ 4 Å. Therefore, if we know the location of one particle, say at R_1, the probability of observing a second particle at R_2 is larger than the probability of finding the particle at R_2 in the absence of a particle at R_1. In this case we get positive correlation between the locations of the two particles.

We can conclude that in both cases (attraction and repulsion) there are correlations between the particles. These correlations can be cast in the form of mutual information which reduces the SMI of a system of N simple particles in an ideal gas. The mathematical details of these correlations are discussed in Ben-Naim [3]. Here, we show only the form of the mutual information (MI) at very low density. At this limit, we can assume that there are only pair correlations, and neglect all higher order correlations. The MI due to these correlations is:

I(due to correlations in pairs) = (N(N − 1)/2) ∫ p(R_1, R_2) log g(R_1, R_2) dR_1 dR_2    (36)

where g(R_1, R_2) is defined by:

g(R_1, R_2) = p(R_1, R_2) / [p(R_1) p(R_2)]    (37)

Note again that log g can be either positive or negative, but the average in (36) must be positive.

3.6. Conclusions

We summarize the main steps leading from the SMI to the entropy. We started with the SMI associated with the locations and momenta of the particles. We calculated the distribution of the locations and momenta that maximizes the SMI. We referred to this distribution as the equilibrium distribution. Let us denote this distribution of the locations and momenta of all the particles by f_eq(R, p).

Next, we use the equilibrium distribution to calculate the SMI of a system of N particles in a volume V, and at temperature T. This SMI is, up to a multiplicative constant (k_B ln 2), identical with the entropy of an ideal gas at equilibrium. This is the reason we referred to the distribution which maximizes the SMI as the equilibrium distribution.

It should be noted that in the derivation of the entropy, we used the SMI twice; first, to calculate the distribution that maximizes the SMI, then in evaluating the maximum SMI corresponding to this distribution. The distinction between the concepts of SMI and entropy is essential. Referring to SMI (as many do) as entropy inevitably leads to such an awkward statement: the maximal value of the "entropy" (meaning the SMI) is the entropy (meaning the thermodynamic entropy). The correct statement is that the SMI associated with locations and momenta is defined for any system; small or large, at equilibrium or far from equilibrium. This SMI, not the entropy, evolves into a maximum value when the system reaches equilibrium. At this state, the SMI becomes proportional to the entropy of the system.

Since the entropy is a special case of an SMI, it follows that whatever interpretation one accepts for the SMI will be automatically applied to the concept of entropy. The most important conclusion is that entropy is not a function of time. Entropy does not change with time, and entropy does not have a tendency to increase.

We said that the SMI may be defined for a system with any number of particles, including the case N = 1. This is true for the SMI. When we talk about the entropy of a system we require that the system be very large. The reason is that only for such systems is the entropy-formulation of the Second Law of Thermodynamics valid. This topic is discussed in the next section.

3.7. The Entropy Formulation of the Second Law

In the previous section, we derived and interpreted the concept of entropy. Knowing what entropy is leaves the question of why entropy always increases unanswered. This question is considered to be one of the most challenging ones. This property of entropy is also responsible for the mystery surrounding the concept of entropy. In this section, we discuss very briefly the origin of the increase in entropy in one specific process. The correct answer to the question of why entropy always increases removes much of the mystery associated with entropy. In this section, we derive the correct answer to the correct questions: when and why does the entropy of a system increase?

Consider the following process. We have a system characterized by E, V, N. (This means N particles, in a volume V, having total energy E.) We assume that all the energy of the system is due to the kinetic energy of the particles. We neglect any interactions between the particles, and if the particles have any internal energies (say, vibrational, rotational, electronic, nuclear, etc.), these will not change in the process.

We now remove a partition between the two compartments, as in Figure 6, and observe what happens. Experience tells us that once we remove the partition, the gas will expand to occupy the entire system of volume 2V. Furthermore, if both the initial and the final states are equilibrium states, then we can apply the entropy function to calculate the change in entropy in this process, i.e.:

ΔS(V → 2V) = N k_B ln(2V/V) = N k_B ln 2    (38)

Figure 6. Expansion of an ideal gas from V to 2V.
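For one mole of gas, Equation (38) gives a concrete number (a trivial worked illustration, not from the paper): ΔS = N_A k_B ln 2 = R ln 2.

```python
import math

R = 8.314462618            # J/(mol*K), gas constant, equal to NA * kB
print(R * math.log(2))     # ~5.76 J/(mol*K) of entropy gained in the expansion V -> 2V
```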

Note carefully that this entropy change corresponds to the difference in the entropy of the system at two equilibrium states, the initial and the final states, Figure 6. The informational interpretation of this quantity can be obtained by dividing ΔS by the constant factor k_B ln 2, and we get:

ΔH(V → 2V) = ΔS / (k_B ln 2) = N    (39)

This means that the SMI of the system increased by N bits. The reason is simple. Initially, we know that all N particles are in a volume V, and after the removal of the partition we lost one bit per particle. We need to ask one question to find out where a particle is: in the right (R), or the left (L) compartment.

Now that we understand the meaning of this entropy change, we turn to study the cause of this entropy change. Specifically, we ask: Why does the entropy of this process increase? Before we answer this question we will try to answer the more fundamental question: Why did this process occur at all? We shall see that an answer to the second question leads to an answer to the first question.

Clearly, if the partition separating the two compartments is not removed, nothing will happen; the gas will not expand and the entropy of the system will not change. We can tentatively conclude that for a system characterized by (E, V, N) the entropy is fixed and will not change with time.

Let us examine what will happen when we remove the partition separating the two compartments in Figure 6. Instead of removing the entire partition, we open a small window between the two compartments. This will allow us to follow the process in small steps. If the window is small enough, we can expect only one particle at a time to pass through it.

Starting with all N particles in the left compartment, we open the window and observe what happens. Clearly, the first particle which crosses the window will be from the left (L) to the right (R) compartment. This is clear simply because there are no particles in the R compartment. After some time, some particles will have moved from L to R. Denote the number of particles in R by n and the number in L by N − n. The pair of numbers (N − n, n) may be referred to as a distribution of particles in the two compartments. Dividing by N, we get a pair of numbers (p_L, p_R) = (1 − p, p) where p = n/N. Clearly, this pair of numbers is a probability distribution (p_L, p_R ≥ 0, p_L + p_R = 1). We can refer to it as a temporary probability distribution, or simply the state distribution. (More precisely, this is the locational state distribution of the particles. Since we have an ideal gas, the energy, the temperature, and the velocity or momentum distribution of the particles will not change in this process.)

For each state distribution (1 − p, p) we can define the corresponding SMI by:

H(p) = −p log p − (1 − p) log(1 − p)    (40)

Note that p changes with time, and as a result H(p) will also change with time. If we follow the change of the SMI we will observe a nearly monotonically increasing function of time. For actual simulations, see Ben-Naim (2008, 2010) [3,9]. The larger N is, the more monotonic the curve will be, and once n reaches the value N/2, the value of the SMI will stay there "forever". For any N, there will be fluctuations, both on the way up to the maximum, as well as after reaching the maximum. However, for very large N these fluctuations will be unnoticeable.
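The nearly monotonic rise of H(p) described here is easy to reproduce with a toy simulation (a sketch in the spirit of the simulations cited above as Ben-Naim (2008, 2010) [3,9], not the published code). At each step one randomly chosen particle crosses the window, and H(p) of Equation (40) climbs toward its maximum of one bit per particle, after which it only fluctuates around n = N/2.

```python
import math, random

def H(p):
    """Equation (40): SMI (in bits) of the state distribution (1 - p, p)."""
    if p in (0.0, 1.0):
        return 0.0
    return -p * math.log2(p) - (1 - p) * math.log2(1 - p)

random.seed(1)
N = 10_000          # particles, all initially in the left compartment L
n_right = 0         # particles currently in the right compartment R
checkpoints = {100, 1_000, 3_000, 10_000, 30_000, 100_000}

for step in range(1, 100_001):
    # one randomly chosen particle crosses the small window per step
    if random.randrange(N) < n_right:
        n_right -= 1    # it was in R, it moves to L
    else:
        n_right += 1    # it was in L, it moves to R
    if step in checkpoints:
        p = n_right / N
        print(f"step {step:6d}  p = {p:.3f}  H(p) = {H(p):.4f} bits")

# H(p) rises nearly monotonically toward 1 bit per particle and then merely
# fluctuates around n = N/2, as described in the text.
```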

After some time we reach an equilibrium state. The equilibrium state is reached when the locational distribution is such that it maximizes the SMI, namely:

p_eq = n_eq / N = 1/2    (41)

and the corresponding SMI is:

H_max = N [ −(1/2) log(1/2) − (1/2) log(1/2) ] = N    (42)

Note again that here we are concerned with the locational distribution with respect to being either in L or in R. The momentum distribution does not change in this process.

Once we have reached the equilibrium state, we can ask: What is the probability of finding the system such that there are N − n particles in L, and n in R? Since the probability of finding a specific particle in either L or R is 1/2, the probability of finding the distribution (N − n, n) is:

Pr(N − n, n) = [N! / (n!(N − n)!)] (1/2)^{N−n} (1/2)^n = (N choose n) (1/2)^N    (43)

It is easy to show that this probability function has a maximum at n = N/2. Clearly, if we sum over all n, and use the Binomial theorem, we get, as expected:

∑_{n=0}^{N} Pr(N − n, n) = ∑_{n=0}^{N} (N choose n) (1/2)^N = 2^N (1/2)^N = 1    (44)

We now use the Stirling approximation:

ln N! ≈ N ln N − N    (45)

to rewrite (43) as:

ln Pr(1 − p, p) ≈ −N ln 2 − N[(1 − p) ln(1 − p) + p ln p]    (46)

or equivalently, after dividing by ln 2, we get:

Pr(1 − p, p) ≈ (1/2)^N 2^{N H(p)}    (47)

If we use, instead of the approximation (45), the following approximation:

ln N! ≈ N ln N − N + (1/2) ln(2πN)    (48)

we get instead of (47) the approximation:

Pr(1 − p, p) ≈ (1/2)^N 2^{N H(p)} / √(2πNp(1 − p))    (49)

Note that in general the probability Pr of finding the distribution (1 − p, p) is related to the SMI of that distribution.
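The quality of the approximation (49) is easy to verify (a short numerical check, with N = 100 chosen arbitrarily): away from the edges of the distribution the Stirling-based estimate agrees with the exact binomial probability of Equation (43) to within a few percent.

```python
import math

def H(p):
    if p in (0.0, 1.0):
        return 0.0
    return -p * math.log2(p) - (1 - p) * math.log2(1 - p)

N = 100
for n in (10, 30, 50, 70, 90):
    p = n / N
    exact = math.comb(N, n) * 0.5 ** N     # Equation (43)
    approx = 0.5 ** N * 2 ** (N * H(p)) / math.sqrt(2 * math.pi * N * p * (1 - p))  # Equation (49)
    print(f"n={n:3d}  exact={exact:.3e}  approx={approx:.3e}")
```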

We now compare the probability of finding the state distribution (1/2, 1/2) with the probability of finding the state distribution (1, 0). From (49) we have:

Pr(1/2, 1/2) = √(2/πN)    (50)

For the state (1, 0) we can use the exact expression (43):

Pr(1, 0) = (1/2)^N    (51)

The ratio of these two probabilities is:

Pr(1/2, 1/2) / Pr(1, 0) = √(2/πN) · 2^N    (52)

Note carefully that Pr(1/2, 1/2) decreases with N. However, the ratio of the two probabilities in (52) increases with N. The corresponding difference in the SMI is:

H(1/2, 1/2) − H(1, 0) = N − 0 = N    (53)

What about the entropy change? To calculate the entropy difference in this process, let us denote by S_i = S(E, V, N) the entropy of the initial state. The entropy at the final state, S_f = S(E, 2V, N), may be obtained by multiplying (53) by k_B ln 2, and adding it to S_i:

S_f = S_i + (k_B ln 2) N    (54)

The change in entropy is therefore:

ΔS = S_f − S_i = N k_B ln 2    (55)

which agrees with (38). It should be emphasized that the ratio of probabilities (52) and the difference in entropies in (55) are computed for different states of the same system. In (55), S_f and S_i are the entropies of the system at the final and initial equilibrium states, respectively. These two equilibrium states are S(E, 2V, N) and S(E, V, N), respectively. In particular, S(E, V, N) is the entropy of the system before removing the partition. On the other hand, the ratio of the probabilities in (52) is calculated at equilibrium after removing the partition.

We can now answer the question posed in the beginning of this section. After the removal of the partition, the gas will expand and attain a new equilibrium state. The reason for the change from the initial to the final state is probabilistic. The probability of the final state (1/2, 1/2) is overwhelmingly larger than the probability of the initial state (1, 0) immediately after the removal of the partition. As a result of the monotonic relationship between the probability Pr(1 − p, p) and the SMI, whenever the probability increases, the SMI increases too. At the state for which the SMI is maximal, we can calculate the change in entropy, which is larger by N k_B ln 2 relative to the entropy of the initial equilibrium state S_i, i.e., before the removal of the partition. We can say that the process of expansion occurs because of the overwhelmingly larger probability of the final equilibrium state. The increase in entropy of the system is a result of the expansion process, not the cause of the process.

3.8. Caveat

Quite often, one might find in textbooks the Boltzmann definition of entropy in terms of the number of states:

S = k_B ln W    (56)

W, in this equation, is often referred to as a probability. Of course, W cannot be a probability, which by definition is a number between zero and one. More careful writers will tell you that the ratio of the numbers of states is the ratio of the probabilities, i.e., for the final and initial states, one writes:

W_f / W_i = Pr(f) / Pr(i)    (57)

This is true, but one must be careful to note that while W_i is the number of states of the system before the removal of the partition, the corresponding probability Pr(i) pertains to the same system after the removal of the partition.

Very often you might find the erroneous statement of the second law based on Equation (56) as follows: "the number of states of the system tends to increase, therefore the entropy tends to increase too." This statement is not true; both W and S in (56) are defined for an equilibrium state, and neither has a tendency to increase with time!

4. Boltzmann's H-Theorem

Before we discuss Boltzmann's H-theorem, we summarize here the most important conclusion regarding the SMI. In Section 3, we saw that the entropy is obtained from the SMI in four steps. We also saw that the entropy of a thermodynamic system is related to the maximum value of the SMI defined on the distribution of locations and velocities of all the particles in the system:

S = K Max SMI(locations and velocities)    (58)

where K is a constant (K = k_B ln 2). We know that every system tends to an equilibrium state at very long times, therefore we identify the Max SMI with the time limit of the SMI, i.e.:

S = K lim_{t→∞} SMI(locations and velocities)    (59)

The derivation of the entropy from the SMI is a very remarkable result. But what is more important is that this derivation reveals at the same time the relationship between entropy and SMI on one hand, and the fundamental difference between the two concepts on the other hand.

Besides the fact that the SMI is a far more general concept than entropy, we found that even when the two concepts apply to the distribution of locations and velocities, they are different. The SMI can evolve with time and reaches a limiting value (for large systems) at t → ∞. The entropy is proportional to the maximum value of the SMI obtained at equilibrium. As such, the entropy is not, and cannot be, a function of time. Thus, the well-known mystery about "entropy always increasing with time" disappears. With this removal of the mystery, we also arrive at the resolution of the paradoxes associated with the Boltzmann H-theorem.

In 1877 Boltzmann defined a function H(t) [14–16]:

H(t) = ∫ f(v, t) log[f(v, t)] dv    (60)

and proved a remarkable theorem known as Boltzmann's H-theorem. Boltzmann made the following assumptions:

1. Ignoring the molecular structure of the walls (ideal perfect smooth walls).
2. Spatial homogeneity of the system, or uniform locational distribution.
3. Assuming binary collisions, conserving momentum and kinetic energy.
4. No correlations between location and velocity (assumption of molecular chaos).

Then, Boltzmann proved that:

dH(t)/dt ≤ 0    (61)

and at equilibrium, i.e., t → ∞:

dH(t)/dt = 0    (62)
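The qualitative content of the theorem, H(t) decreasing toward a stationary equilibrium value, can be illustrated with a much simpler toy model (an illustrative sketch only: it replaces Boltzmann's binary-collision dynamics with a BGK-style relaxation toward a Maxwellian of equal density and energy, which is an assumption made here, not the H-theorem itself).

```python
import math

# Toy relaxation of a 1D velocity distribution toward a Maxwellian with the
# same density and energy; Boltzmann's H(t) = integral of f ln f dv decreases.
M = 400
vs = [-10 + 20 * i / (M - 1) for i in range(M)]
dv = vs[1] - vs[0]

def gauss(v, mu, sig):
    return math.exp(-(v - mu) ** 2 / (2 * sig ** 2)) / math.sqrt(2 * math.pi * sig ** 2)

# initial non-equilibrium distribution: two counter-streaming beams
f = [0.5 * gauss(v, -3.0, 1.0) + 0.5 * gauss(v, 3.0, 1.0) for v in vs]
var = sum(fi * v * v for fi, v in zip(f, vs)) * dv            # energy to be preserved
feq = [gauss(v, 0.0, math.sqrt(var)) for v in vs]             # target Maxwellian

def H_func(f):
    return sum(fi * math.log(fi) for fi in f if fi > 0) * dv  # Equation (60), discretized

dt, tau = 0.05, 1.0
for step in range(61):
    if step % 10 == 0:
        print(f"t = {step * dt:4.1f}   H(t) = {H_func(f):+.4f}")
    f = [fi + dt / tau * (fe - fi) for fi, fe in zip(f, feq)]

# H(t) decreases monotonically and levels off at its equilibrium (minimum) value;
# -H(t) is the corresponding SMI, which increases to its maximum.
```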

Boltzmann believed that the behavior of the function −H(t) is the same as that of the entropy, i.e., the entropy always increases with time, and at equilibrium it reaches a maximum. At this time, the entropy does not change with time. This theorem drew a great amount of criticism, the most well-known being:

I. The Reversal Paradox states: "The H-theorem singles out a preferred direction of time. This is inconsistent with the time-reversal invariance of the equations of motion." This is not a paradox because the statement that "H(t) always changes in one direction" is false.

II. The Recurrence Paradox, based on Poincare's theorem, states: "After a sufficiently long time, an isolated system with fixed E, V, N will return to an arbitrarily small neighborhood of almost any given initial state."

If we assume that dH/dt < 0 at all t, then obviously H cannot be a periodic function of time.

Both paradoxes have been with us ever since. Furthermore, most popular science books identify the Second Law, or the behavior of entropy, with the so-called arrow of time. Some even go to the extreme of identifying entropy with time [8,17,18].

Both paradoxes seem to arise from the conflict between the reversibility of the equations of motion on one hand, and the apparent irreversibility of the Second Law on the other, namely that the H-function decreases monotonically with time. Boltzmann rejected the criticism by claiming that H does not always decrease with time, but only with high probability. The irreversibility of the Second Law is not absolute; violations are merely highly improbable. The answer to the recurrence paradox follows from the same argument. Indeed, the system can return to the initial state. However, the recurrence time is so large that this is never observed, not in our lifetime, not even in the lifetime of the universe.

Notwithstanding Boltzmann's correct answers to his critics, Boltzmann and his critics made an enduring mistake about the H-theorem, a lingering mistake that has hounded us ever since. This is the very identification of H(t) with the behavior of the entropy. This error has been propagated in the literature until this day.

It is clear, from the very definition of the function H(t), that H(t) is an SMI. And if one identifies SMI with entropy, then we go back to Boltzmann's identification of the function H(t) with entropy. Fortunately, thanks to the recent derivation of the entropy function, i.e., the function S(E, V, N), or the Sackur–Tetrode equation for the entropy based on the SMI, it becomes crystal clear that the SMI is not entropy! The entropy is obtained from the SMI when we apply it to the distribution of locations and momenta, then take the limit t → ∞, and only in this limit do we get the entropy function, which has no trace of time dependence.

Translating our findings of Section 3 to the H-theorem, we can conclude that H(t) is the SMI based on the velocity distribution. Clearly, one cannot identify H(t) with the entropy. To obtain the entropy one must first define the H(t) function based on the distribution of both locations and momenta, i.e.:

H(t) = −∫ f(R, p, t) log f(R, p, t) dR dp    (63)

This is a proper SMI. It may be defined for a system at equilibrium, or very far from equilibrium. To obtain the entropy one must take the limit t → ∞, i.e., the limit of H(t) at equilibrium, i.e.:

lim_{t→∞} H(t) = Max SMI (at equilibrium)    (64)

At this limit we obtain the entropy (up to a multiplicative constant), which is clearly not a function of time.

Thus, once it is understood that the function H(t) is an SMI and not entropy, it becomes clear that the criticism of Boltzmann's H-Theorem was addressed to the evolution of the SMI and not of the entropy. At the same time, Boltzmann was right in defending his H-theorem when viewed

as a theorem on the evolution of the SMI, but he was wrong in his interpretation of the quantity H(t) as entropy.

Conflicts of Interest: The author declares no conflict of interest.

References

1. Shannon, C.E.; Weaver, W. The Mathematical Theory of Communication; The University of Illinois Press: Chicago, IL, USA, 1949.
2. Ben-Naim, A. Information Theory; World Scientific: Singapore.
3. Ben-Naim, A. A Farewell to Entropy: Statistical Thermodynamics Based on Information; World Scientific: Singapore, 2008.
4. Ben-Naim, A.; Casadei, D. Modern Thermodynamics; World Scientific: Singapore.
5. Ben-Naim, A. Entropy and the Second Law. Interpretation and Misss-Interpretationsss; World Scientific: Singapore.
6. Ben-Naim, A. Discover Probability. How to Use It, How to Avoid Misusing It, and How It Affects Every Aspect of Your Life; World Scientific: Singapore.
7. Yaglom, A.M.; Yaglom, I.M. Probability and Information; Jain, V.K., Ed.; D. Reidel/Springer Science & Business Media: Berlin/Heidelberg, Germany.
8. Ben-Naim, A. Information, Entropy, Life and the Universe. What We Know and What We Do Not Know; World Scientific: Singapore.
9. Jaynes, E.T. Information theory and statistical mechanics. Phys. Rev. 1957, 106, 620–630.
10. Jaynes, E.T. Information theory and statistical mechanics. II. Phys. Rev. 1957, 108, 171–190.
11. Ben-Naim, A. The entropy of mixing and assimilation: An information-theoretical perspective. Am. J. Phys. 2006, 74.
12. Ben-Naim, A. An Informational-Theoretical Formulation of the Second Law of Thermodynamics. J. Chem. Educ. 2009, 86.
13. Ben-Naim, A. Entropy, the Truth, the Whole Truth, and Nothing but the Truth; World Scientific: Singapore.
14. Boltzmann, L. Lectures on Gas Theory; Dover Publications: New York, NY, USA.
15. Brush, S.G. The Kind of Motion We Call Heat. A History of the Kinetic Theory of Gases in the 19th Century, Book 2: Statistical Physics and Irreversible Processes; North-Holland Publishing Company: Amsterdam, The Netherlands.
16. Brush, S.G. Statistical Physics and the Atomic Theory of Matter, from Boyle and Newton to Landau and Onsager; Princeton University Press: Princeton, NJ, USA.
17. Ben-Naim, A. Discover Entropy and the Second Law of Thermodynamics. A Playful Way of Discovering a Law of Nature; World Scientific: Singapore.
18. Ben-Naim, A. Entropy: Order or Information. J. Chem. Educ. 2011, 88.

© 2017 by the author; licensee MDPI, Basel, Switzerland. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC BY) license (http://creativecommons.org/licenses/by/4.0/).


More information

UNDERSTANDING BOLTZMANN S ANALYSIS VIA. Contents SOLVABLE MODELS

UNDERSTANDING BOLTZMANN S ANALYSIS VIA. Contents SOLVABLE MODELS UNDERSTANDING BOLTZMANN S ANALYSIS VIA Contents SOLVABLE MODELS 1 Kac ring model 2 1.1 Microstates............................ 3 1.2 Macrostates............................ 6 1.3 Boltzmann s entropy.......................

More information

Introduction. Chapter The Purpose of Statistical Mechanics

Introduction. Chapter The Purpose of Statistical Mechanics Chapter 1 Introduction 1.1 The Purpose of Statistical Mechanics Statistical Mechanics is the mechanics developed to treat a collection of a large number of atoms or particles. Such a collection is, for

More information

Stochastic Histories. Chapter Introduction

Stochastic Histories. Chapter Introduction Chapter 8 Stochastic Histories 8.1 Introduction Despite the fact that classical mechanics employs deterministic dynamical laws, random dynamical processes often arise in classical physics, as well as in

More information

Introduction Statistical Thermodynamics. Monday, January 6, 14

Introduction Statistical Thermodynamics. Monday, January 6, 14 Introduction Statistical Thermodynamics 1 Molecular Simulations Molecular dynamics: solve equations of motion Monte Carlo: importance sampling r 1 r 2 r n MD MC r 1 r 2 2 r n 2 3 3 4 4 Questions How can

More information

From the microscopic to the macroscopic world. Kolloqium April 10, 2014 Ludwig-Maximilians-Universität München. Jean BRICMONT

From the microscopic to the macroscopic world. Kolloqium April 10, 2014 Ludwig-Maximilians-Universität München. Jean BRICMONT From the microscopic to the macroscopic world Kolloqium April 10, 2014 Ludwig-Maximilians-Universität München Jean BRICMONT Université Catholique de Louvain Can Irreversible macroscopic laws be deduced

More information

Physics 172H Modern Mechanics

Physics 172H Modern Mechanics Physics 172H Modern Mechanics Instructor: Dr. Mark Haugan Office: PHYS 282 haugan@purdue.edu TAs: Alex Kryzwda John Lorenz akryzwda@purdue.edu jdlorenz@purdue.edu Lecture 22: Matter & Interactions, Ch.

More information

Engineering Physics 1 Dr. B. K. Patra Department of Physics Indian Institute of Technology-Roorkee

Engineering Physics 1 Dr. B. K. Patra Department of Physics Indian Institute of Technology-Roorkee Engineering Physics 1 Dr. B. K. Patra Department of Physics Indian Institute of Technology-Roorkee Module-05 Lecture-02 Kinetic Theory of Gases - Part 02 (Refer Slide Time: 00:32) So, after doing the angular

More information

PHYSICS 715 COURSE NOTES WEEK 1

PHYSICS 715 COURSE NOTES WEEK 1 PHYSICS 715 COURSE NOTES WEEK 1 1 Thermodynamics 1.1 Introduction When we start to study physics, we learn about particle motion. First one particle, then two. It is dismaying to learn that the motion

More information

Intersecting Two Lines, Part Two

Intersecting Two Lines, Part Two Module 1.5 Page 149 of 1390. Module 1.5: Intersecting Two Lines, Part Two In this module you will learn about two very common algebraic methods for intersecting two lines: the Substitution Method and the

More information

Lecture 6. Statistical Processes. Irreversibility. Counting and Probability. Microstates and Macrostates. The Meaning of Equilibrium Ω(m) 9 spins

Lecture 6. Statistical Processes. Irreversibility. Counting and Probability. Microstates and Macrostates. The Meaning of Equilibrium Ω(m) 9 spins Lecture 6 Statistical Processes Irreversibility Counting and Probability Microstates and Macrostates The Meaning of Equilibrium Ω(m) 9 spins -9-7 -5-3 -1 1 3 5 7 m 9 Lecture 6, p. 1 Irreversibility Have

More information

5. Systems in contact with a thermal bath

5. Systems in contact with a thermal bath 5. Systems in contact with a thermal bath So far, isolated systems (micro-canonical methods) 5.1 Constant number of particles:kittel&kroemer Chap. 3 Boltzmann factor Partition function (canonical methods)

More information

College Algebra Through Problem Solving (2018 Edition)

College Algebra Through Problem Solving (2018 Edition) City University of New York (CUNY) CUNY Academic Works Open Educational Resources Queensborough Community College Winter 1-25-2018 College Algebra Through Problem Solving (2018 Edition) Danielle Cifone

More information

Statistical Mechanics

Statistical Mechanics 42 My God, He Plays Dice! Statistical Mechanics Statistical Mechanics 43 Statistical Mechanics Statistical mechanics and thermodynamics are nineteenthcentury classical physics, but they contain the seeds

More information

Basic Concepts and Tools in Statistical Physics

Basic Concepts and Tools in Statistical Physics Chapter 1 Basic Concepts and Tools in Statistical Physics 1.1 Introduction Statistical mechanics provides general methods to study properties of systems composed of a large number of particles. It establishes

More information

MP203 Statistical and Thermal Physics. Jon-Ivar Skullerud and James Smith

MP203 Statistical and Thermal Physics. Jon-Ivar Skullerud and James Smith MP203 Statistical and Thermal Physics Jon-Ivar Skullerud and James Smith October 3, 2017 1 Contents 1 Introduction 3 1.1 Temperature and thermal equilibrium.................... 4 1.1.1 The zeroth law of

More information

- HH Midterm 1 for Thermal and Statistical Physics I Hsiu-Hau Lin (Nov 5, 2012)

- HH Midterm 1 for Thermal and Statistical Physics I Hsiu-Hau Lin (Nov 5, 2012) - HH0063 - Midterm 1 for Thermal and Statistical Physics I Hsiu-Hau Lin hsiuhau.lin@gmail.com (Nov 5, 2012) Binary model (A) The thermal entropy is logarithm of the multiplicity function, (N,U) =logg =logg

More information

2m + U( q i), (IV.26) i=1

2m + U( q i), (IV.26) i=1 I.D The Ideal Gas As discussed in chapter II, micro-states of a gas of N particles correspond to points { p i, q i }, in the 6N-dimensional phase space. Ignoring the potential energy of interactions, the

More information

A Deeper Look into Phase Space: The Liouville and Boltzmann Equations

A Deeper Look into Phase Space: The Liouville and Boltzmann Equations A Deeper Look into Phase Space: The Liouville and Boltzmann Equations Brian Petko Statistical Thermodynamics Colorado School of Mines December 7, 2009 INTRODUCTION Phase Space, in a sense, can be a useful

More information

Probability and the Second Law of Thermodynamics

Probability and the Second Law of Thermodynamics Probability and the Second Law of Thermodynamics Stephen R. Addison January 24, 200 Introduction Over the next several class periods we will be reviewing the basic results of probability and relating probability

More information

CHAPTER 1: Functions

CHAPTER 1: Functions CHAPTER 1: Functions 1.1: Functions 1.2: Graphs of Functions 1.3: Basic Graphs and Symmetry 1.4: Transformations 1.5: Piecewise-Defined Functions; Limits and Continuity in Calculus 1.6: Combining Functions

More information

Stochastic Quantum Dynamics I. Born Rule

Stochastic Quantum Dynamics I. Born Rule Stochastic Quantum Dynamics I. Born Rule Robert B. Griffiths Version of 25 January 2010 Contents 1 Introduction 1 2 Born Rule 1 2.1 Statement of the Born Rule................................ 1 2.2 Incompatible

More information

Let s start by reviewing what we learned last time. Here is the basic line of reasoning for Einstein Solids

Let s start by reviewing what we learned last time. Here is the basic line of reasoning for Einstein Solids Chapter 5 In this chapter we want to review the concept of irreversibility in more detail and see how it comes from the multiplicity of states. In addition, we want to introduce the following new topics:

More information

[S R (U 0 ɛ 1 ) S R (U 0 ɛ 2 ]. (0.1) k B

[S R (U 0 ɛ 1 ) S R (U 0 ɛ 2 ]. (0.1) k B Canonical ensemble (Two derivations) Determine the probability that a system S in contact with a reservoir 1 R to be in one particular microstate s with energy ɛ s. (If there is degeneracy we are picking

More information

Physics Dec The Maxwell Velocity Distribution

Physics Dec The Maxwell Velocity Distribution Physics 301 7-Dec-2005 29-1 The Maxwell Velocity Distribution The beginning of chapter 14 covers some things we ve already discussed. Way back in lecture 6, we calculated the pressure for an ideal gas

More information

The goal of equilibrium statistical mechanics is to calculate the diagonal elements of ˆρ eq so we can evaluate average observables < A >= Tr{Â ˆρ eq

The goal of equilibrium statistical mechanics is to calculate the diagonal elements of ˆρ eq so we can evaluate average observables < A >= Tr{Â ˆρ eq Chapter. The microcanonical ensemble The goal of equilibrium statistical mechanics is to calculate the diagonal elements of ˆρ eq so we can evaluate average observables < A >= Tr{Â ˆρ eq } = A that give

More information

Practical Algebra. A Step-by-step Approach. Brought to you by Softmath, producers of Algebrator Software

Practical Algebra. A Step-by-step Approach. Brought to you by Softmath, producers of Algebrator Software Practical Algebra A Step-by-step Approach Brought to you by Softmath, producers of Algebrator Software 2 Algebra e-book Table of Contents Chapter 1 Algebraic expressions 5 1 Collecting... like terms 5

More information

1 Multiplicity of the ideal gas

1 Multiplicity of the ideal gas Reading assignment. Schroeder, section.6. 1 Multiplicity of the ideal gas Our evaluation of the numbers of microstates corresponding to each macrostate of the two-state paramagnet and the Einstein model

More information

Algebra. Here are a couple of warnings to my students who may be here to get a copy of what happened on a day that you missed.

Algebra. Here are a couple of warnings to my students who may be here to get a copy of what happened on a day that you missed. This document was written and copyrighted by Paul Dawkins. Use of this document and its online version is governed by the Terms and Conditions of Use located at. The online version of this document is

More information

Basic Thermodynamics Prof. S. K. Som Department of Mechanical Engineering Indian Institute of Technology, Kharagpur

Basic Thermodynamics Prof. S. K. Som Department of Mechanical Engineering Indian Institute of Technology, Kharagpur Basic Thermodynamics Prof. S. K. Som Department of Mechanical Engineering Indian Institute of Technology, Kharagpur Lecture - 09 Second Law and its Corollaries-IV Good afternoon to all of you to this session

More information

213 Midterm coming up

213 Midterm coming up 213 Midterm coming up Monday April 8 @ 7 pm (conflict exam @ 5:15pm) Covers: Lectures 1-12 (not including thermal radiation) HW 1-4 Discussion 1-4 Labs 1-2 Review Session Sunday April 7, 3-5 PM, 141 Loomis

More information

1 Phase Spaces and the Liouville Equation

1 Phase Spaces and the Liouville Equation Phase Spaces and the Liouville Equation emphasize the change of language from deterministic to probablistic description. Under the dynamics: ½ m vi = F i ẋ i = v i with initial data given. What is the

More information

Elements of Statistical Mechanics

Elements of Statistical Mechanics Elements of Statistical Mechanics Thermodynamics describes the properties of macroscopic bodies. Statistical mechanics allows us to obtain the laws of thermodynamics from the laws of mechanics, classical

More information

Gibbs Paradox Solution

Gibbs Paradox Solution Gibbs Paradox Solution James A. Putnam he Gibbs paradox results from analyzing mixing entropy as if it is a type of thermodynamic entropy. It begins with an adiabatic box divided in half by an adiabatic

More information

Stephen F Austin. Exponents and Logarithms. chapter 3

Stephen F Austin. Exponents and Logarithms. chapter 3 chapter 3 Starry Night was painted by Vincent Van Gogh in 1889. The brightness of a star as seen from Earth is measured using a logarithmic scale. Exponents and Logarithms This chapter focuses on understanding

More information

Before the Quiz. Make sure you have SIX pennies

Before the Quiz. Make sure you have SIX pennies Before the Quiz Make sure you have SIX pennies If you have more than 6, please share with your neighbors There are some additional pennies in the baskets up front please be sure to return them after class!!!

More information

Chapter 3. Estimation of p. 3.1 Point and Interval Estimates of p

Chapter 3. Estimation of p. 3.1 Point and Interval Estimates of p Chapter 3 Estimation of p 3.1 Point and Interval Estimates of p Suppose that we have Bernoulli Trials (BT). So far, in every example I have told you the (numerical) value of p. In science, usually the

More information

Solutions to Quiz 2. i 1 r 101. ar = a 1 r. i=0

Solutions to Quiz 2. i 1 r 101. ar = a 1 r. i=0 Massachusetts Institute of Technology 6.042J/18.062J, Fall 02: Mathematics for Computer Science Prof. Albert Meyer and Dr. Radhika Nagpal Solutions to Quiz 2 Problem 1 (15 points). Chuck Vest is planning

More information

Lecture 5 - Information theory

Lecture 5 - Information theory Lecture 5 - Information theory Jan Bouda FI MU May 18, 2012 Jan Bouda (FI MU) Lecture 5 - Information theory May 18, 2012 1 / 42 Part I Uncertainty and entropy Jan Bouda (FI MU) Lecture 5 - Information

More information

If the objects are replaced there are n choices each time yielding n r ways. n C r and in the textbook by g(n, r).

If the objects are replaced there are n choices each time yielding n r ways. n C r and in the textbook by g(n, r). Caveat: Not proof read. Corrections appreciated. Combinatorics In the following, n, n 1, r, etc. will denote non-negative integers. Rule 1 The number of ways of ordering n distinguishable objects (also

More information

Last Update: March 1 2, 201 0

Last Update: March 1 2, 201 0 M ath 2 0 1 E S 1 W inter 2 0 1 0 Last Update: March 1 2, 201 0 S eries S olutions of Differential Equations Disclaimer: This lecture note tries to provide an alternative approach to the material in Sections

More information

Definite Integral and the Gibbs Paradox

Definite Integral and the Gibbs Paradox Acta Polytechnica Hungarica ol. 8, No. 4, 0 Definite Integral and the Gibbs Paradox TianZhi Shi College of Physics, Electronics and Electrical Engineering, HuaiYin Normal University, HuaiAn, JiangSu, China,

More information

The Methodology of Statistical Mechanics

The Methodology of Statistical Mechanics Chapter 4 The Methodology of Statistical Mechanics c 2006 by Harvey Gould and Jan Tobochnik 16 November 2006 We develop the basic methodology of statistical mechanics and provide a microscopic foundation

More information

the time it takes until a radioactive substance undergoes a decay

the time it takes until a radioactive substance undergoes a decay 1 Probabilities 1.1 Experiments with randomness Wewillusethetermexperimentinaverygeneralwaytorefertosomeprocess that produces a random outcome. Examples: (Ask class for some first) Here are some discrete

More information

G : Statistical Mechanics Notes for Lecture 3 I. MICROCANONICAL ENSEMBLE: CONDITIONS FOR THERMAL EQUILIBRIUM Consider bringing two systems into

G : Statistical Mechanics Notes for Lecture 3 I. MICROCANONICAL ENSEMBLE: CONDITIONS FOR THERMAL EQUILIBRIUM Consider bringing two systems into G25.2651: Statistical Mechanics Notes for Lecture 3 I. MICROCANONICAL ENSEMBLE: CONDITIONS FOR THERMAL EQUILIBRIUM Consider bringing two systems into thermal contact. By thermal contact, we mean that the

More information

Entropy, free energy and equilibrium. Spontaneity Entropy Free energy and equilibrium

Entropy, free energy and equilibrium. Spontaneity Entropy Free energy and equilibrium Entropy, free energy and equilibrium Spontaneity Entropy Free energy and equilibrium Learning objectives Discuss what is meant by spontaneity Discuss energy dispersal and its relevance to spontaneity Describe

More information

Lecture 8. The Second Law of Thermodynamics; Energy Exchange

Lecture 8. The Second Law of Thermodynamics; Energy Exchange Lecture 8 The Second Law of Thermodynamics; Energy Exchange The second law of thermodynamics Statistics of energy exchange General definition of temperature Why heat flows from hot to cold Reading for

More information

Discrete Mathematics and Probability Theory Fall 2014 Anant Sahai Note 15. Random Variables: Distributions, Independence, and Expectations

Discrete Mathematics and Probability Theory Fall 2014 Anant Sahai Note 15. Random Variables: Distributions, Independence, and Expectations EECS 70 Discrete Mathematics and Probability Theory Fall 204 Anant Sahai Note 5 Random Variables: Distributions, Independence, and Expectations In the last note, we saw how useful it is to have a way of

More information

4.1 Constant (T, V, n) Experiments: The Helmholtz Free Energy

4.1 Constant (T, V, n) Experiments: The Helmholtz Free Energy Chapter 4 Free Energies The second law allows us to determine the spontaneous direction of of a process with constant (E, V, n). Of course, there are many processes for which we cannot control (E, V, n)

More information

Lab 0 Appendix C L0-1 APPENDIX C ACCURACY OF MEASUREMENTS AND TREATMENT OF EXPERIMENTAL UNCERTAINTY

Lab 0 Appendix C L0-1 APPENDIX C ACCURACY OF MEASUREMENTS AND TREATMENT OF EXPERIMENTAL UNCERTAINTY Lab 0 Appendix C L0-1 APPENDIX C ACCURACY OF MEASUREMENTS AND TREATMENT OF EXPERIMENTAL UNCERTAINTY A measurement whose accuracy is unknown has no use whatever. It is therefore necessary to know how to

More information

Chapter 2 Ensemble Theory in Statistical Physics: Free Energy Potential

Chapter 2 Ensemble Theory in Statistical Physics: Free Energy Potential Chapter Ensemble Theory in Statistical Physics: Free Energy Potential Abstract In this chapter, we discuss the basic formalism of statistical physics Also, we consider in detail the concept of the free

More information

Rate of Heating and Cooling

Rate of Heating and Cooling Rate of Heating and Cooling 35 T [ o C] Example: Heating and cooling of Water E 30 Cooling S 25 Heating exponential decay 20 0 100 200 300 400 t [sec] Newton s Law of Cooling T S > T E : System S cools

More information

E = hν light = hc λ = ( J s)( m/s) m = ev J = ev

E = hν light = hc λ = ( J s)( m/s) m = ev J = ev Problem The ionization potential tells us how much energy we need to use to remove an electron, so we know that any energy left afterwards will be the kinetic energy of the ejected electron. So first we

More information

Advanced Topics in Equilibrium Statistical Mechanics

Advanced Topics in Equilibrium Statistical Mechanics Advanced Topics in Equilibrium Statistical Mechanics Glenn Fredrickson 2. Classical Fluids A. Coarse-graining and the classical limit For concreteness, let s now focus on a fluid phase of a simple monatomic

More information

Phase Transitions. µ a (P c (T ), T ) µ b (P c (T ), T ), (3) µ a (P, T c (P )) µ b (P, T c (P )). (4)

Phase Transitions. µ a (P c (T ), T ) µ b (P c (T ), T ), (3) µ a (P, T c (P )) µ b (P, T c (P )). (4) Phase Transitions A homogeneous equilibrium state of matter is the most natural one, given the fact that the interparticle interactions are translationally invariant. Nevertheless there is no contradiction

More information

Chapter 2. Mathematical Reasoning. 2.1 Mathematical Models

Chapter 2. Mathematical Reasoning. 2.1 Mathematical Models Contents Mathematical Reasoning 3.1 Mathematical Models........................... 3. Mathematical Proof............................ 4..1 Structure of Proofs........................ 4.. Direct Method..........................

More information

Physics 505 Homework No.2 Solution

Physics 505 Homework No.2 Solution Physics 55 Homework No Solution February 3 Problem Calculate the partition function of a system of N noninteracting free particles confined to a box of volume V (i) classically and (ii) quantum mechanically

More information

Discrete Mathematics and Probability Theory Spring 2016 Rao and Walrand Note 16. Random Variables: Distribution and Expectation

Discrete Mathematics and Probability Theory Spring 2016 Rao and Walrand Note 16. Random Variables: Distribution and Expectation CS 70 Discrete Mathematics and Probability Theory Spring 206 Rao and Walrand Note 6 Random Variables: Distribution and Expectation Example: Coin Flips Recall our setup of a probabilistic experiment as

More information

AQI: Advanced Quantum Information Lecture 6 (Module 2): Distinguishing Quantum States January 28, 2013

AQI: Advanced Quantum Information Lecture 6 (Module 2): Distinguishing Quantum States January 28, 2013 AQI: Advanced Quantum Information Lecture 6 (Module 2): Distinguishing Quantum States January 28, 2013 Lecturer: Dr. Mark Tame Introduction With the emergence of new types of information, in this case

More information

Introduction to Thermodynamic States Gases

Introduction to Thermodynamic States Gases Chapter 1 Introduction to Thermodynamic States Gases We begin our study in thermodynamics with a survey of the properties of gases. Gases are one of the first things students study in general chemistry.

More information

Special Theory Of Relativity Prof. Shiva Prasad Department of Physics Indian Institute of Technology, Bombay

Special Theory Of Relativity Prof. Shiva Prasad Department of Physics Indian Institute of Technology, Bombay Special Theory Of Relativity Prof. Shiva Prasad Department of Physics Indian Institute of Technology, Bombay Lecture - 6 Length Contraction and Time Dilation (Refer Slide Time: 00:29) In our last lecture,

More information

Microcanonical Ensemble

Microcanonical Ensemble Entropy for Department of Physics, Chungbuk National University October 4, 2018 Entropy for A measure for the lack of information (ignorance): s i = log P i = log 1 P i. An average ignorance: S = k B i

More information

Chapter 5 - Systems under pressure 62

Chapter 5 - Systems under pressure 62 Chapter 5 - Systems under pressure 62 CHAPTER 5 - SYSTEMS UNDER PRESSURE 5.1 Ideal gas law The quantitative study of gases goes back more than three centuries. In 1662, Robert Boyle showed that at a fixed

More information

We are going to discuss what it means for a sequence to converge in three stages: First, we define what it means for a sequence to converge to zero

We are going to discuss what it means for a sequence to converge in three stages: First, we define what it means for a sequence to converge to zero Chapter Limits of Sequences Calculus Student: lim s n = 0 means the s n are getting closer and closer to zero but never gets there. Instructor: ARGHHHHH! Exercise. Think of a better response for the instructor.

More information

Time-Dependent Statistical Mechanics 5. The classical atomic fluid, classical mechanics, and classical equilibrium statistical mechanics

Time-Dependent Statistical Mechanics 5. The classical atomic fluid, classical mechanics, and classical equilibrium statistical mechanics Time-Dependent Statistical Mechanics 5. The classical atomic fluid, classical mechanics, and classical equilibrium statistical mechanics c Hans C. Andersen October 1, 2009 While we know that in principle

More information

...Thermodynamics. Lecture 15 November 9, / 26

...Thermodynamics. Lecture 15 November 9, / 26 ...Thermodynamics Conjugate variables Positive specific heats and compressibility Clausius Clapeyron Relation for Phase boundary Phase defined by discontinuities in state variables Lecture 15 November

More information

Lecture 11: Information theory THURSDAY, FEBRUARY 21, 2019

Lecture 11: Information theory THURSDAY, FEBRUARY 21, 2019 Lecture 11: Information theory DANIEL WELLER THURSDAY, FEBRUARY 21, 2019 Agenda Information and probability Entropy and coding Mutual information and capacity Both images contain the same fraction of black

More information

Lecture 8. The Second Law of Thermodynamics; Energy Exchange

Lecture 8. The Second Law of Thermodynamics; Energy Exchange Lecture 8 The Second Law of Thermodynamics; Energy Exchange The second law of thermodynamics Statistics of energy exchange General definition of temperature Why heat flows from hot to cold Reading for

More information

PHYS 352 Homework 2 Solutions

PHYS 352 Homework 2 Solutions PHYS 352 Homework 2 Solutions Aaron Mowitz (, 2, and 3) and Nachi Stern (4 and 5) Problem The purpose of doing a Legendre transform is to change a function of one or more variables into a function of variables

More information

Continuum Probability and Sets of Measure Zero

Continuum Probability and Sets of Measure Zero Chapter 3 Continuum Probability and Sets of Measure Zero In this chapter, we provide a motivation for using measure theory as a foundation for probability. It uses the example of random coin tossing to

More information

Quantum Information Types

Quantum Information Types qitd181 Quantum Information Types Robert B. Griffiths Version of 6 February 2012 References: R. B. Griffiths, Types of Quantum Information, Phys. Rev. A 76 (2007) 062320; arxiv:0707.3752 Contents 1 Introduction

More information

9.1 System in contact with a heat reservoir

9.1 System in contact with a heat reservoir Chapter 9 Canonical ensemble 9. System in contact with a heat reservoir We consider a small system A characterized by E, V and N in thermal interaction with a heat reservoir A 2 characterized by E 2, V

More information

Information in Biology

Information in Biology Information in Biology CRI - Centre de Recherches Interdisciplinaires, Paris May 2012 Information processing is an essential part of Life. Thinking about it in quantitative terms may is useful. 1 Living

More information

August 20, Review of Integration & the. Fundamental Theorem of Calculus. Introduction to the Natural Logarithm.

August 20, Review of Integration & the. Fundamental Theorem of Calculus. Introduction to the Natural Logarithm. to Natural Natural to Natural August 20, 2017 to Natural Natural 1 2 3 Natural 4 Incremental Accumulation of Quantities to Natural Natural Integration is a means of understanding and computing incremental

More information

Seminaar Abstrakte Wiskunde Seminar in Abstract Mathematics Lecture notes in progress (27 March 2010)

Seminaar Abstrakte Wiskunde Seminar in Abstract Mathematics Lecture notes in progress (27 March 2010) http://math.sun.ac.za/amsc/sam Seminaar Abstrakte Wiskunde Seminar in Abstract Mathematics 2009-2010 Lecture notes in progress (27 March 2010) Contents 2009 Semester I: Elements 5 1. Cartesian product

More information

with a proper choice of the potential U(r). Clearly, we should include the potential of the ions, U ion (r):.

with a proper choice of the potential U(r). Clearly, we should include the potential of the ions, U ion (r):. The Hartree Equations So far we have ignored the effects of electron-electron (e-e) interactions by working in the independent electron approximation. In this lecture, we shall discuss how this effect

More information

MATH2206 Prob Stat/20.Jan Weekly Review 1-2

MATH2206 Prob Stat/20.Jan Weekly Review 1-2 MATH2206 Prob Stat/20.Jan.2017 Weekly Review 1-2 This week I explained the idea behind the formula of the well-known statistic standard deviation so that it is clear now why it is a measure of dispersion

More information

Appendix 1: Normal Modes, Phase Space and Statistical Physics

Appendix 1: Normal Modes, Phase Space and Statistical Physics Appendix : Normal Modes, Phase Space and Statistical Physics The last line of the introduction to the first edition states that it is the wide validity of relatively few principles which this book seeks

More information

IV. Classical Statistical Mechanics

IV. Classical Statistical Mechanics IV. Classical Statistical Mechanics IV.A General Definitions Statistical Mechanics is a probabilistic approach to equilibrium macroscopic properties of large numbers of degrees of freedom. As discussed

More information

Classification and Regression Trees

Classification and Regression Trees Classification and Regression Trees Ryan P Adams So far, we have primarily examined linear classifiers and regressors, and considered several different ways to train them When we ve found the linearity

More information

Chemistry 2000 Lecture 9: Entropy and the second law of thermodynamics

Chemistry 2000 Lecture 9: Entropy and the second law of thermodynamics Chemistry 2000 Lecture 9: Entropy and the second law of thermodynamics Marc R. Roussel January 23, 2018 Marc R. Roussel Entropy and the second law January 23, 2018 1 / 29 States in thermodynamics The thermodynamic

More information

Math 138: Introduction to solving systems of equations with matrices. The Concept of Balance for Systems of Equations

Math 138: Introduction to solving systems of equations with matrices. The Concept of Balance for Systems of Equations Math 138: Introduction to solving systems of equations with matrices. Pedagogy focus: Concept of equation balance, integer arithmetic, quadratic equations. The Concept of Balance for Systems of Equations

More information

23 The Born-Oppenheimer approximation, the Many Electron Hamiltonian and the molecular Schrödinger Equation M I

23 The Born-Oppenheimer approximation, the Many Electron Hamiltonian and the molecular Schrödinger Equation M I 23 The Born-Oppenheimer approximation, the Many Electron Hamiltonian and the molecular Schrödinger Equation 1. Now we will write down the Hamiltonian for a molecular system comprising N nuclei and n electrons.

More information

Remember next exam is 1 week from Friday. This week will be last quiz before exam.

Remember next exam is 1 week from Friday. This week will be last quiz before exam. Lecture Chapter 3 Extent of reaction and equilibrium Remember next exam is week from Friday. his week will be last quiz before exam. Outline: Extent of reaction Reaction equilibrium rxn G at non-standard

More information

Treatment of Error in Experimental Measurements

Treatment of Error in Experimental Measurements in Experimental Measurements All measurements contain error. An experiment is truly incomplete without an evaluation of the amount of error in the results. In this course, you will learn to use some common

More information

Information in Biology

Information in Biology Lecture 3: Information in Biology Tsvi Tlusty, tsvi@unist.ac.kr Living information is carried by molecular channels Living systems I. Self-replicating information processors Environment II. III. Evolve

More information

DR.RUPNATHJI( DR.RUPAK NATH )

DR.RUPNATHJI( DR.RUPAK NATH ) Contents 1 Sets 1 2 The Real Numbers 9 3 Sequences 29 4 Series 59 5 Functions 81 6 Power Series 105 7 The elementary functions 111 Chapter 1 Sets It is very convenient to introduce some notation and terminology

More information

Lecture 13. Multiplicity and statistical definition of entropy

Lecture 13. Multiplicity and statistical definition of entropy Lecture 13 Multiplicity and statistical definition of entropy Readings: Lecture 13, today: Chapter 7: 7.1 7.19 Lecture 14, Monday: Chapter 7: 7.20 - end 2/26/16 1 Today s Goals Concept of entropy from

More information

Review of Basic Probability

Review of Basic Probability Review of Basic Probability Erik G. Learned-Miller Department of Computer Science University of Massachusetts, Amherst Amherst, MA 01003 September 16, 2009 Abstract This document reviews basic discrete

More information

Quantum Thermodynamics

Quantum Thermodynamics Quantum Thermodynamics In this chapter we wish to give some insights to quantum thermodynamics. It can be seen as an introduction to the topic, however not a broad one but rather an introduction by examples.

More information