Gibbs Paradox Solution


James A. Putnam

The Gibbs paradox results from analyzing mixing entropy as if it were a type of thermodynamic entropy. It begins with an adiabatic box divided in half by an adiabatic removable partition. There are two ideal gases, at equal temperatures and pressures, distinguishable as gas A and gas B, each contained in its own half of the box. The partition is removed and the two gases mix. Mixing entropy theory predicts a significant change in temperature for the gases due to mixing. However, experimental results show that the mixing process produces no detectable change in temperature. The solution presented in this essay introduces new explanations for both thermodynamic entropy and mixing entropy. It is shown that the paradox is not real. The prediction of mixing entropy is illusory due to an incorrect assumption: mixing entropy is not like Clausius' thermodynamic entropy. The subject of this essay was chosen to demonstrate the negative consequences of theorists bypassing an understanding of what Clausius' thermodynamic entropy is.

What is Thermodynamic Entropy?

Entropy is a theoretical pathway for moving from It to Bit. Clausius discovered entropy and defined it precisely as thermodynamic entropy. What is thermodynamic entropy? It is something whose nature should be easily established, because its derivation is part of the operation of the simple Carnot engine. The answer can be found in the operation of the Carnot engine.

The Carnot engine is theoretically the most efficient engine. Its efficiency is independent of the nature of the working medium, in this case a gas. The efficiency depends only upon the values of the high and low temperatures in degrees Kelvin. Degrees Kelvin are used because the Kelvin temperature scale is derived from the Carnot cycle. The engine's equation of efficiency and the definition of the Kelvin temperature scale are the basis for the derivation of the equation:

$\Delta S = \frac{\Delta Q}{T}$

Something very important happens during this derivation that establishes a definite rate of operation for the Carnot cycle. The engine is defined as operating quasi-statically. The general requirement for this to be true is that the engine operate so slowly that the temperature of the working medium always measures the same at any point within the medium. This is a condition that must be met for a system to be described as operating infinitesimally close to equilibrium. There are a number of rates of operation that will satisfy this condition; however, there is one specific rate above which the equilibrium will be lost. Any slower rate will work fine. The question is: What is this rate of operation that separates equilibrium from disequilibrium? It is important to know this because it is the rate that becomes fixed into the derivation of the Carnot engine. This occurs because the engine is defined such that the ratio of its heat absorbed to its heat rejected equals the ratio of the temperatures of the high and low heat sources:

$\frac{Q_{high}}{Q_{low}} = \frac{T_{high}}{T_{low}}$
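As a numerical illustration of these two relations (not part of the original essay), the short Python sketch below checks, for arbitrarily chosen reservoir temperatures and an arbitrarily chosen quantity of absorbed heat, that the Carnot efficiency follows from the temperatures alone and that the entropy given up by the hot reservoir equals the entropy received by the cold reservoir in the reversible cycle.

```python
# Illustrative check of the Carnot relations quoted above.
# The temperatures and the heat value are arbitrary example numbers.

T_high = 500.0   # kelvin
T_low = 300.0    # kelvin
Q_high = 1000.0  # joules absorbed from the hot reservoir per cycle

# Carnot efficiency depends only on the two temperatures.
efficiency = 1.0 - T_low / T_high

# Heat rejected follows from Q_high / Q_low = T_high / T_low.
Q_low = Q_high * T_low / T_high
work = Q_high - Q_low

# Entropy received from the hot source equals entropy delivered to the cold sink.
dS_high = Q_high / T_high
dS_low = Q_low / T_low

print(f"efficiency     = {efficiency:.3f}")
print(f"work per cycle = {work:.1f} J")
print(f"Q_high/T_high  = {dS_high:.4f} J/K")
print(f"Q_low/T_low    = {dS_low:.4f} J/K  (equal for the reversible cycle)")
```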

Temperature is proportional to the rate of exchange of energy between molecules. It is not quantitatively the same, because temperature is assigned arbitrary units of measurement. Temperature is assigned units of degrees Kelvin, and its scale is arbitrarily fitted to the freezing and boiling points of water. The temperature difference between these points is set at 100 degrees. For this reason, the quantitative measurement of temperature is not the same as the quantitative measurement of the exchange of energy between molecules. However, this discrepancy can be moderated with the introduction of a constant of proportionality k_t:

$T_m = k_t T = \frac{dE}{dt}$

The ratio dE/dt is the definition of the modified temperature. Multiplying by dt:

$k_t T\,dt = dE$

This equation shows that the differential of entropy appears in the above equation as:

$dS = \frac{dE}{T} = k_t\,dt$

Both dE and dt are variables. It is necessary to determine a value for the constant k_t. This value might be contained in the ideal gas law:

$E = \frac{3}{2}nkT$

where k is Boltzmann's constant and n is the number of molecules. When n is reduced to a single molecule, the equation gives the kinetic energy of a single molecule. In the case of a single molecule, E becomes ΔE, an incremental value of energy:

$\Delta E = \frac{3}{2}kT$

This suggests that for an ideal gas molecule:

$\Delta S = \frac{\Delta E}{T} = \frac{3}{2}k$

In other words, the thermodynamic entropy of a single ideal gas molecule is a constant. Substituting for Boltzmann's constant:

$\Delta S = \frac{3}{2}\left(1.38 \times 10^{-23}\right) = 2.07 \times 10^{-23}$

Entropy, from five steps above as differentials, now in incremental form is:

$\Delta S = \frac{\Delta E}{T} = k_t\,\Delta t$

Therefore, I write:

$\frac{3}{2}k = k_t\,t$

If I could establish a value for t, then I could calculate k_t. Since this calculation applies to a single ideal gas molecule and is a constant value, I assume that t is a fundamental increment of time or is directly proportional to a fundamental increment of time. There is one immensely useful fundamental increment of time, introduced in the essay Electric Charge and Universal Time. It is:

$t_c = 1.602 \times 10^{-19}\ \text{seconds}$

Substituting and solving for k_t:

$k_t = \frac{3}{2}\frac{k}{t_c}$

The body of work, Reference (4), supporting this essay includes theoretical changes beginning right from the start of physics theory. The units change (Appendix B). Substituting the derived empirical units and dropping the molecule indicator:

$k_t = \frac{3}{2}\left(\frac{1.38 \times 10^{-23}\ \text{seconds}}{1.602 \times 10^{-19}\ \text{seconds}}\right) = 1.292 \times 10^{-4}$

The value is a unit-free constant of proportionality. It follows, from seven steps above, that Boltzmann's constant is:

$k = \frac{2}{3}k_t\,t_c$

For the ideal gas, the thermodynamic entropy of each molecule is a constant:

$\Delta S = \frac{3}{2}k = k_t\,t_c$

Substituting this expression for entropy into the defining equation for thermodynamic entropy:

$\Delta E = \Delta S\,T = k_t\,t_c\,T$

Recognizing that the increment of energy represents an increment of heat and solving for ΔS:

$\Delta S = \frac{\Delta Q}{T} = k_t\,t_c$

Heat is energy in transit. Solving for t_c:

$t_c = \frac{\Delta Q}{k_t\,T}$

This period of time t_c would have been definable as thermodynamic entropy if temperature had been defined as the rate of transit of energy between molecules. The arbitrary temperature units make it necessary to include the constant k_t in the definition of a modified thermodynamic entropy ΔS_m. The equation showing this is:

$\Delta S_m = \frac{\Delta Q}{k_t\,T} = t_c$
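The numerical relations among k, k_t, and t_c stated above are easy to check arithmetically. The Python sketch below only redoes that arithmetic using the essay's stated values; it does not test the physical interpretation placed on them.

```python
# Arithmetic check of the constants as stated in the text.
k = 1.38e-23       # Boltzmann's constant (the essay reads its unit as seconds/molecule)
t_c = 1.602e-19    # the essay's fundamental increment of time, in seconds

# k_t = (3/2) k / t_c should reproduce the quoted unit-free value 1.292e-4.
k_t = 1.5 * k / t_c
print(f"k_t = {k_t:.4e}")

# Consistency: k recovered from (2/3) k_t t_c, and the per-molecule entropy constant.
print(f"(2/3) k_t t_c = {(2.0 / 3.0) * k_t * t_c:.3e}")
print(f"(3/2) k       = {1.5 * k:.3e}")
print(f"k_t * t_c     = {k_t * t_c:.3e}")
```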

For an ideal gas receiving energy from a high temperature reservoir:

$\Delta S_{m\,high} = \frac{\Delta Q_{high}}{k_t\,T_{high}} = t_{c\,high}$

For a Carnot engine, the modified entropy equals expressions for heat both received and released:

$\Delta S_m = \frac{\Delta Q_{high}}{k_t\,T_{high}} = \frac{\Delta Q_{low}}{k_t\,T_{low}}$

Substituting from two steps above:

$\Delta S_{m\,high} = \Delta S_{m\,low}$

$t_{c\,high} = t_{c\,low}$

And the increments of time for the rates of transit of energy are equivalent. The time periods for average molecular kinetic energy entering the engine and leaving the engine are the same. The rates of exchange of kinetic energy are exactly what they need to be for the modified entropy to remain constant. This is why the increase in entropy is exactly the opposite of the decrease in entropy for the Carnot engine. Energy entering the engine carries the positive sign, and energy leaving the engine carries the negative sign.

Temperature is proportional to average kinetic energy because it is proportional to the rate at which average kinetic energy is transferred between individual molecules. The numerator of the modified temperature is the average kinetic energy. The denominator is the constant t_c. The modified temperature establishes the point where equilibrium exists. Equilibrium exists when kinetic energy is exchanged at the rate set by the modified temperature.

Next, I consider an engine that has heat loss that does not result in work. The heat successfully converted into work can be represented by a series of Carnot engines. For the series of Carnot engines, the change in entropy per cycle is zero. The lost heat just passes through the engines unnoticed. The series of engines is an unaffected pathway for the lost heat to travel to the low temperature sink. The lost heat becomes energy no longer available for producing work by the series of Carnot engines. The entropies that are affected are those of the high heat source and the low heat sink. Their entropies are measures of the time required for the lost heat to be released by the high heat source and later absorbed by the low heat sink. The net change in thermodynamic entropy is:

$\Delta S_m = \frac{Q_{lost}}{k_t\,T_{low}} - \frac{Q_{lost}}{k_t\,T_{high}} = t_{c\,low} - t_{c\,high} > 0$

The quantity of heat is the same in both cases. The rates at which energy is transferred are different. The low temperature represents a slower rate of exchange of heat than the high temperature does. This means it takes longer for the low temperature sink to absorb the quantity of lost heat than it does for the high temperature source to supply it. This time difference is the cause of the greater-than sign in Clausius' definition of thermodynamic entropy. The high heat source loses entropy because it requires extra time for the lost heat to leave the source. The low heat sink gains entropy because it requires extra time to absorb the heat that is simply passing through the engine without being converted into work. This time difference is the origin of thermodynamic entropy. Thermodynamic entropy, referred to as an arrow of time, is an arrow of time.
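To put example numbers on the sign argument above (a sketch only; the heat and temperature values are arbitrary), the snippet below evaluates the net change both in the ordinary Clausius form and in the essay's modified, time-valued form using the essay's constant k_t.

```python
# Net entropy change for a quantity of heat that bypasses the work-producing engines.
Q_lost = 50.0     # joules of heat not converted into work (example value)
T_high = 500.0    # kelvin (example value)
T_low = 300.0     # kelvin (example value)
k_t = 1.292e-4    # the essay's unit-free constant of proportionality

# Ordinary Clausius form: positive, as the second law requires.
dS = Q_lost / T_low - Q_lost / T_high
print(f"Clausius form: {dS:+.4f} J/K")

# The essay's modified form divides each term by k_t, turning it into a time.
t_absorb = Q_lost / (k_t * T_low)    # time for the cold sink to absorb the heat
t_release = Q_lost / (k_t * T_high)  # time for the hot source to release it
print(f"time to absorb at T_low:   {t_absorb:.0f} s (essay's units)")
print(f"time to release at T_high: {t_release:.0f} s (essay's units)")
print(f"difference:                {t_absorb - t_release:+.0f} s")
```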

Boltzmann's Entropy

This essay introduces the idea that a consequence of defining thermodynamic entropy using an ideal gas is that, as the pressure approaches zero, the exchanges of energy between molecules theoretically reduce down to single exchanges. A point is reached where exchanges occur at a rate that can be modeled as one at a time without delay between them. That is an equilibrium point where the temperature is infinitesimally close to a constant value. Clausius' thermodynamic entropy applies to that low pressure where the exchanges that occur can be ideally represented as each molecule taking its turn, without delay, to pass on average molecular kinetic energy. This process can be modeled by considering all of the gas molecules lined up single file, with the average molecular kinetic energy of one of them transferred down the line from molecule to molecule until the energy has been transferred to the last molecule. The time required to complete this process is internal thermodynamic entropy.

Temperature is proportional to the rate of transfer of average molecular kinetic energy between molecules. The modified temperature is the rate at which energy is transferred between molecules. The numerator of the modified temperature is average molecular kinetic energy. The average kinetic energy of an ideal gas depends upon temperature only. It was shown above that the average kinetic energy divided by the modified temperature equals t_c:

$\frac{E_{avg}}{T_m} = t_c$

In the equation below, Boltzmann's constant is defined in this paper as the first equal term and by thermodynamics as the second equal term:

$k = \frac{2}{3}k_t\,t_c = \frac{R}{N_0}$

N_0 is Avogadro's number, the number of molecules in a mole of gas. R is the universal gas constant. Solving for R and substituting the appropriate values, for one mole of gas and dropping the molecule indicator:

$R = \frac{2}{3}\left(6.02 \times 10^{23}\right)\left(1.292 \times 10^{-4}\right)\left(1.602 \times 10^{-19}\ \text{seconds}\right) = 8.31\ \text{seconds}$

The universal gas constant is directly proportional to the total time required for a mole of ideal gas to transfer average molecular kinetic energy from molecule to molecule, without delay between exchanges, until the number of molecules in a mole of gas is reached. The solution above is not that time. The actual time requires the use of modified temperature. R is defined using degrees Kelvin. R must be divided by k_t so that it becomes defined using modified temperature. Another adjustment that is needed is to multiply by 3/2 to remove the 2/3 that resulted from the kinetic theory of an ideal gas:

$\frac{3}{2}\frac{R}{k_t} = N_0\,t_c = \left(6.02 \times 10^{23}\ \text{molecules/mole}\right)\left(1.602 \times 10^{-19}\ \text{seconds/molecule}\right) = 96{,}440\ \text{seconds/mole} = 1{,}607\ \text{minutes/mole} = 26.8\ \text{hours/mole}$
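The two numerical results above are plain arithmetic and can be reproduced directly; the sketch below simply recomputes them from the values stated in the text (N_0, k_t, and t_c as given by the essay).

```python
# Recompute the essay's values for R and for N_0 * t_c.
N_0 = 6.02e23      # Avogadro's number, molecules per mole
k_t = 1.292e-4     # the essay's unit-free constant of proportionality
t_c = 1.602e-19    # the essay's fundamental increment of time, seconds

R = (2.0 / 3.0) * N_0 * k_t * t_c
print(f"R = {R:.2f} seconds (8.31 in the text, in the essay's units)")

total_time = N_0 * t_c            # equals (3/2) R / k_t
print(f"N_0 * t_c = {total_time:,.0f} seconds/mole")
print(f"          = {total_time / 60:,.0f} minutes/mole")
print(f"          = {total_time / 3600:.1f} hours/mole")
print(f"(3/2) R / k_t = {1.5 * R / k_t:,.0f} seconds/mole (the same number)")
```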

Boltzmann's constant is the time period represented by the universal gas constant reduced to single molecule status:

$k = \frac{R}{N_0} = \frac{8.312\ \text{seconds/mole}}{6.02 \times 10^{23}\ \text{molecules/mole}} = 1.38 \times 10^{-23}\ \text{seconds/molecule}$

Strictly speaking, the units of degrees should have been included in the two equations above. I took the liberty of not showing them for reasons of readability. Boltzmann's constant is directly proportional to the period of time necessary for a single exchange of average kinetic energy to occur between two molecules of an ideal gas, independent of temperature. The actual time period, devoid of the molecule indicator, is given by:

$t_c = \frac{3}{2}\frac{k}{k_t} = 1.602 \times 10^{-19}\ \text{seconds}$

The reason for eliminating the kinetic theory of an ideal gas fraction of 2/3 is that it pertains to macroscopic properties, while the time of exchange of kinetic energy between individual molecules is a microscopic property.

The number of possible arrangements for a mole of ideal gas is infinite. Boltzmann's entropy requires there to be a limited number of possible arrangements. His entropy assumed that the volume of the mole of gas could be divided into a limited number of cells available to be occupied. In quantum theory, there is a naturally limited number of available arrangements. Instead of arbitrary cells, there are microstates which particles might occupy. If the concept of microstates is idealized so that all microstates are equally likely to be occupied, then I can write:

$S = k\,\Omega$

This is not the definition of Boltzmann's entropy even though Ω is the number of microstates. The inclusion of Boltzmann's constant causes this calculation to be analogous to that of thermodynamic entropy. The number of microstates simulates a number of ideal gas molecules. The entropy calculation simulates the calculation of the internal entropy of an ideal gas. The solution is proportional to the time period required for the simulated ideal gas molecules to transfer their simulated individual average kinetic energies from one molecule to the next, without delay, until the number of simulated molecules equals Ω. The calculation of the entropy for any number of microstates will yield a solution identical to an analogous calculation for an equal number of ideal gas molecules. However, Boltzmann's entropy is defined as:

$S = k \ln \Omega$

Therefore, Boltzmann's entropy is proportional to the time period of a single transfer of ideal gas molecule energy times the logarithm of the number of microstates. Boltzmann's entropy is not an expression of simulated internal thermodynamic entropy. The entropy is no longer a direct measure of time. The units of seconds carried along by Boltzmann's constant have become irrelevant. Boltzmann's constant can be set to unity without units. Its connection to thermodynamic entropy is already lost.
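To illustrate how differently the two expressions contrasted above behave, the sketch below evaluates the linear form and the logarithmic form for a few example microstate counts; the comparison is purely numerical and adds no interpretation beyond what the text states.

```python
import math

# Compare the linear expression S = k * Omega with Boltzmann's S = k * ln(Omega).
k = 1.38e-23   # Boltzmann's constant (J/K in SI; seconds/molecule in the essay's units)

for omega in (10, 1_000, 1_000_000):
    s_linear = k * omega           # analogous to the essay's internal entropy of omega molecules
    s_boltzmann = k * math.log(omega)
    print(f"Omega = {omega:>9,}:  k*Omega = {s_linear:.3e}   k*ln(Omega) = {s_boltzmann:.3e}")
```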

Gibbs Paradox Solution

The Gibbs paradox problem will be analyzed here in its original form, including its discontinuity problem pertaining to its interpretation of distinct versus indistinct, or distinguishable versus indistinguishable. There is a rectangular adiabatic container divided in half by a removable adiabatic partition. In one half of the container there is the ideal gas A, and in the other half there is the ideal gas B. The molecules of both gases are idealized as inelastic spheres. The gas A molecules have mass m and size s. The gas B molecules have mass 2m and size 2s. Both gases are at the same temperature and pressure. The partition prevents either gas from being affected by the other. There are two moles of gas A, and, because the molecules of gas B are twice as large, there is one mole of gas B.

Temperature is identified as being directly proportional to the rate at which energy is transferred from molecule to molecule. The molecules' temperatures are the same, so both gases are transferring energy between molecules at the same constant rate. The initial thermodynamic entropy of each of the gases is internal thermodynamic entropy. The internal thermodynamic entropy of gas A is the time required for one molecule to transfer energy to another multiplied by twice Avogadro's number. Avogadro's number is the number of molecules in a mole of gas. Internal thermodynamic entropy is calculated as if the molecules were lined up single file with energy being passed from one to the other down the line until the last molecule receives the energy. This treatment is dictated by the definition of an ideal gas, specifically the requirement that pressure approaches zero. There is an initial point of equilibrium reached as pressure approaches zero. That equilibrium establishes the internal thermodynamic entropy of an ideal gas. Therefore, the internal thermodynamic entropy of gas A is twice the internal thermodynamic entropy of gas B even though they are at the same temperature. That is because there are twice as many molecules of gas A as of gas B.

In contrast to the Gibbs description, the following is what the author would expect to be the case. The dividing partition is removed, allowing the molecules of both gases to mix. The temperature remains constant for both gases. The rate of exchange of energy between any two molecules is always the same value. There is no heat either gained or lost. The temperature remains at the same initial constant value for the duration of the mixing process. The internal thermodynamic entropy of the two gases, immediately after the partition is removed, is equal to the sum of the two separate internal thermodynamic entropies. It remains this same value throughout the mixing process.

The Gibbs interpretation predicts that the mixing process increases the internal thermodynamic entropy of the combined gases. It is assumed that thermodynamic properties such as heat would change. If this were true, then there would be a significant change in temperature due to the mixing of the gases. The experimental results show that there is no detectable change in temperature. This next example will demonstrate that there is no change in internal thermodynamic entropy as a result of mixing two distinguishable gases having equal initial temperatures and pressures. It is proposed that the choice to treat mixing entropy like internal thermodynamic entropy caused the apparent paradox. In the Gibbs treatment, the property of distinguishable was interpreted as being absolute.
It was concluded that there are no grades of distinguishable. The gases were either distinguishable or they weren't. Distinguishable was as opposite from indistinguishable as binary 1 is from 0. Before the gases are mixed, the mixing entropy is zero. As the gases mixed, the mixing entropy was expected to add to the internal thermodynamic entropy of the two gases. Since a large increase in mixing entropy was expected, based upon its statistical analysis, a large change in internal thermodynamic entropy was predicted. A corresponding change of temperature was also predicted. Because a lowering of temperature is associated with an increase in internal thermodynamic entropy, mixing entropy will be treated here in the same manner. If mixing entropy is predicted to increase internal thermodynamic entropy, then a corresponding drop in temperature should be observed. Because of the assumption of absolute opposites for unmixed versus mixed, the fully mixed gases should have reached their maximum possible mixing entropy. That maximum would occur at the lowest possible temperature.
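For reference, the quantity that the statistical analysis predicts is the standard textbook entropy of mixing; the sketch below evaluates that textbook expression for the essay's arrangement of two moles of gas A and one mole of gas B. It is shown only to make the size of the predicted increase concrete; the essay's argument is that this quantity is not thermodynamic entropy.

```python
import math

# Standard textbook entropy-of-mixing expression for ideal gases,
# evaluated for the essay's setup of 2 mol of gas A and 1 mol of gas B.
R = 8.314            # universal gas constant, J/(mol K)
n_A, n_B = 2.0, 1.0  # moles of gas A and gas B
n_total = n_A + n_B

x_A = n_A / n_total  # mole fractions after mixing
x_B = n_B / n_total

dS_mix = -R * (n_A * math.log(x_A) + n_B * math.log(x_B))
print(f"Textbook mixing-entropy prediction: {dS_mix:.1f} J/K")
```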

This absolute opposites treatment of mixed versus unmixed is modeled here as being analogous to the mixed gases having an initial temperature T_1 and a final temperature of approximately zero degrees Kelvin. The thermodynamic entropy at a temperature of approximately zero degrees Kelvin is very large. The thermodynamic entropy of the two gases together is equal to the time required for energy to be transferred from one molecule to another multiplied by three times Avogadro's number. At near absolute zero temperature, the rate of exchange of energy between molecules is extremely slow. Therefore the value of mixing entropy, when treated as being like internal thermodynamic entropy, should be extremely large. Such a large change in internal thermodynamic entropy would be detectable as a large drop in temperature. The experimental results show that there is no change in temperature.

It was shown that internal thermodynamic entropy for an ideal gas is independent of temperature. If the mixing process changed the gases' temperature, the internal thermodynamic entropy would not change. However, there is no change in temperature due to mixing. Even a single gas undergoes constant mixing of its molecules. Although mixing entropy's mathematical expression borrows Boltzmann's constant and the name entropy, the mixing entropy is not internal thermodynamic entropy. The theoretical thermodynamic pathway between entropy and the microstates does not exist.
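As a closing numerical sketch of the essay's own model (not standard thermodynamics), the snippet below treats the internal thermodynamic entropy of each gas as the number of its molecules multiplied by the single-exchange time t_c, so that the combined value before and after mixing is the same three times N_0 times t_c quoted above.

```python
# The essay's "internal thermodynamic entropy" expressed as a time.
N_0 = 6.02e23      # Avogadro's number, molecules per mole
t_c = 1.602e-19    # the essay's single-exchange time, seconds

S_A = 2 * N_0 * t_c    # gas A: two moles
S_B = 1 * N_0 * t_c    # gas B: one mole

S_before = S_A + S_B      # separate halves, partition in place
S_after = 3 * N_0 * t_c   # mixed gases: same molecules, same per-exchange time

print(f"gas A: {S_A:,.0f} s   gas B: {S_B:,.0f} s")
print(f"before mixing: {S_before:,.0f} s   after mixing: {S_after:,.0f} s (unchanged)")
```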