An Elementary Notion and Measurement of Entropy


(Source: Wikipedia: Entropy; Information entropy)

Entropy is a macroscopic property of a system that measures the microscopic disorder within the system. It is an important part of the second law of thermodynamics. There are many ways of stating the second law, but all are equivalent in the sense that each form logically implies every other form; the theorems of thermodynamics can therefore be proved using any form of the second law together with the third law. The formulation of the second law that refers to entropy directly is as follows: a process that occurs in a system will tend to increase the total entropy of the universe. Thus, while a system can go through some physical process that decreases its own entropy, the entropy of the universe (which includes the system and its surroundings) must increase overall. (An exception to this rule is a reversible, or "isentropic", process, such as frictionless adiabatic compression.) Processes that decrease the total entropy of the universe are impossible. If a system is at equilibrium, by definition no spontaneous processes occur, and therefore the system is at maximum entropy.

Thus, the second law of thermodynamics states that, in general, the total entropy of a system will not decrease other than by increasing the entropy of some other system. Hence, in a system isolated from its environment, the entropy of that system will tend not to decrease. It follows that heat will not flow from a colder body to a hotter body without the application of work (the imposition of order) on the colder body. Secondly, it is impossible for any device operating on a cycle to produce net work from a single temperature reservoir; the production of net work requires a flow of heat from a hotter reservoir to a colder reservoir. As a result, there is no possibility of a "perpetual motion" machine. Finally, it follows that a reduction in the entropy increase of a specified process, such as a chemical reaction, means that the process is energetically more efficient.
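To make the entropy bookkeeping behind these statements concrete, here is a minimal Python sketch with assumed illustrative numbers (not taken from the source): when a quantity of heat Q flows from a hot reservoir at temperature T_h to a cold reservoir at T_c, the hot side loses entropy Q/T_h while the cold side gains Q/T_c, and because T_c < T_h the total change is positive. Reversing the flow would make the total negative, which the second law forbids unless work is supplied.

def reservoir_entropy_change(q, t_hot, t_cold):
    # Entropy changes (J/K) when heat q (J) flows from the hot to the cold reservoir.
    ds_hot = -q / t_hot     # hot reservoir gives up heat q at temperature t_hot
    ds_cold = q / t_cold    # cold reservoir absorbs heat q at temperature t_cold
    return ds_hot, ds_cold, ds_hot + ds_cold

ds_hot, ds_cold, ds_total = reservoir_entropy_change(q=1000.0, t_hot=500.0, t_cold=300.0)
print(ds_hot, ds_cold, ds_total)   # -2.0, +3.33, +1.33 J/K: the total entropy increases
# The reverse flow (cold to hot) would give -1.33 J/K overall, so it cannot
# occur spontaneously; a refrigerator must supply work to accomplish it.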

Approaches to understanding entropy

Order and Disorder: Entropy has often been loosely associated with the amount of order, disorder, and/or chaos in a thermodynamic system. The traditional qualitative description of entropy is that it refers to changes in the status quo of the system and is a measure of "molecular disorder" and of the amount of energy wasted in a dynamical transformation from one state or form to another. In recent years a number of authors have derived exact entropy formulas to account for and measure disorder and order in atomic and molecular assemblies. One of the simpler entropy order/disorder formulas is that derived in 1984 by the thermodynamic physicist Peter Landsberg, based on a combination of thermodynamics and information-theory arguments. Landsberg argues that when constraints operate on a system, such that it is prevented from entering one or more of its possible or permitted states, as contrasted with its forbidden states, the measure of the total amount of disorder in the system is given by

Disorder = C_D / C_I,

and the total amount of order in the system by

Order = 1 - C_O / C_I,

in which C_D is the "disorder" capacity of the system (the entropy of the parts contained in the permitted ensemble), C_I is the "information" capacity of the system (an expression similar to Shannon's channel capacity), and C_O is the "order" capacity of the system.

Energy Dispersal: The concept of entropy can also be described qualitatively as a measure of energy dispersal at a specific temperature. Similar terms have been in use since early in the history of classical thermodynamics, and with the development of statistical thermodynamics and quantum theory, entropy changes have been described in terms of the mixing or "spreading" of the total energy of each constituent of a system over its particular quantized energy levels. Ambiguities in the terms disorder and chaos, which usually carry meanings directly opposed to equilibrium, contribute to widespread confusion and hamper comprehension of entropy for most students. As the second law of thermodynamics shows, in an isolated system internal portions at different temperatures will tend to adjust to a single uniform temperature and thus produce equilibrium. A recently developed educational approach avoids such ambiguous terms and describes this spreading out of energy as dispersal, which leads to loss of the differentials required for work even though the total energy remains constant in accordance with the first law of thermodynamics (compare the discussion in the next section). Physical chemist Peter Atkins, for example, who previously wrote of dispersal leading to a disordered state, now writes that "spontaneous changes are always accompanied by a dispersal of energy".

Relating entropy to energy usefulness: Following on from the above, it is possible (in a thermal context) to regard entropy as an indicator or measure of the effectiveness or usefulness of a particular quantity of energy. This is because energy supplied at a high temperature (i.e. with low entropy) tends to be more useful than the same amount of energy available at room temperature. Mixing a hot parcel of a fluid with a cold one produces a parcel of intermediate temperature, in which the overall increase in entropy represents a loss that can never be replaced, as the sketch below illustrates. Thus, the fact that the entropy of the universe is steadily increasing means that its total energy is becoming less useful: eventually, this will lead to the "heat death of the Universe".
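The mixing loss just described can be illustrated with a short Python sketch (the parcel masses, specific heat and temperatures are assumed for illustration and are not from the source). Each parcel's entropy change on going from temperature T_i to the final temperature T_f at constant pressure is m*c*ln(T_f/T_i); the hot parcel's entropy falls, the cold parcel's rises by more, and the net change is positive even though no energy has been lost.

import math

m = 1.0                         # kg, each parcel
c = 4186.0                      # J/(kg K), specific heat of water (taken as constant)
t_hot, t_cold = 360.0, 280.0    # K
t_final = (t_hot + t_cold) / 2  # equal masses of the same fluid -> arithmetic mean

ds_hot = m * c * math.log(t_final / t_hot)    # negative: the hot parcel cools
ds_cold = m * c * math.log(t_final / t_cold)  # positive: the cold parcel warms
print(ds_hot, ds_cold, ds_hot + ds_cold)      # about -493, +559, +66 J/K
# The +66 J/K is the irreversible entropy production: the temperature
# difference that could have driven work is gone, though the energy remains.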

Entropy and life

For nearly a century and a half, beginning with Clausius' 1863 memoir "On the Concentration of Rays of Heat and Light, and on the Limits of Its Action", much writing and research has been devoted to the relationship between thermodynamic entropy and the evolution of life. The argument that life feeds on negative entropy, or negentropy, asserted in the 1944 book What is Life? by physicist Erwin Schrödinger, served as a further stimulus to this research. Recent writings have used the concept of Gibbs free energy to elaborate on this issue. In the 1982 textbook Principles of Biochemistry, for example, the American biochemist Albert Lehninger argues that the "order" produced within cells as they grow and divide is more than compensated for by the "disorder" they create in their surroundings in the course of growth and division. In short, according to Lehninger, "living organisms preserve their internal order by taking from their surroundings free energy, in the form of nutrients or sunlight, and returning to their surroundings an equal amount of energy as heat and entropy."

Evolution-related definitions:

Negentropy - a shorthand colloquial phrase for negative entropy.
Ectropy - a measure of the tendency of a dynamical system to do useful work and grow more organized.
Extropy - a metaphorical term defining the extent of a living or organizational system's intelligence, functional order, vitality, energy, life, experience, and capacity and drive for improvement and growth.
Ecological entropy - a measure of biodiversity in the study of biological ecology.

In a study titled "Natural selection for least action", published in the Proceedings of the Royal Society A, Ville Kaila and Arto Annila of the University of Helsinki describe how the second law of thermodynamics can be written as an equation of motion to describe evolution, showing how natural selection and the principle of least action can be connected by expressing natural selection in terms of chemical thermodynamics. In this view, evolution explores possible paths to level differences in energy densities and so increase entropy most rapidly. Thus an organism serves as an energy transfer mechanism, and beneficial mutations allow successive organisms to transfer more energy within their environment.

Sociological definitions

The concept of entropy has also entered the domain of sociology, generally as a metaphor for chaos, disorder or dissipation of energy, rather than as a direct measure of thermodynamic or information entropy:

Corporate entropy - energy wasted as red tape and business-team inefficiency, i.e. energy lost to waste. (This definition is comparable to von Clausewitz's concept of friction in war.)
Economic entropy - a semi-quantitative measure of the irrevocable dissipation and degradation of natural materials and available energy with respect to economic activity.
Entropology - the study or discussion of entropy, or the name sometimes given to thermodynamics without differential equations.
Psychological entropy - the distribution of energy in the psyche, which tends to seek equilibrium or balance among all the structures of the psyche.
Social entropy - a measure of social system structure, having both theoretical and statistical interpretations, i.e. society (macrosocietal variables) measured in terms of how the individual functions in society (microsocietal variables); also related to social equilibrium.

Information-Theoretic Measures of Entropy

The Shannon entropy: Shannon's entropy, which relates to Boltzmann's statistical concept of entropy, is the measure of the amount of information that is missing before reception, and is sometimes simply referred to as the Shannon entropy. It is a broad and general concept that finds applications in information theory as well as thermodynamics. It was originally devised by Claude Shannon in 1948 to study the amount of information in a transmitted message. The definition of the information entropy is, however, quite general, and is expressed in terms of a discrete set of probabilities p_i:

H = - Σ_i p_i log_2 p_i

In the case of transmitted messages, these probabilities were the probabilities that a particular message was actually transmitted, and the entropy of the message system was a measure of how much information was in the message. For the case of equal probabilities (i.e. each message is equally probable), the Shannon entropy (in bits) is just the number of yes/no questions needed to determine the content of the message.
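A short Python sketch (the message set and probabilities below are assumed for illustration, not from the source) evaluates this definition and confirms the equal-probability case: with 8 equally likely messages, H = log_2 8 = 3 bits, i.e. three yes/no questions identify the message.

import math

def shannon_entropy(probs):
    # H = -sum(p * log2 p) in bits; zero-probability terms contribute nothing.
    return -sum(p * math.log2(p) for p in probs if p > 0)

print(shannon_entropy([1/8] * 8))                    # 3.0 bits: eight equally likely messages
print(shannon_entropy([0.7, 0.1, 0.1, 0.05, 0.05]))  # about 1.46 bits: a skewed source is more predictable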

The Rényi entropy: A generalisation of the Shannon entropy, the Rényi entropy is one of a family of functionals for quantifying the diversity, uncertainty or randomness of a system. It is named after Alfréd Rényi. The Rényi entropy of order α, where α ≥ 0 and α ≠ 1, is defined as

H_α(X) = (1 / (1 - α)) log_2 ( Σ_i p_i^α ),

where the p_i are the probabilities of {x_1, x_2, ..., x_n} and the logarithm is in base 2. If the probabilities are all the same, then all the Rényi entropies of the distribution are equal, with H_α(X) = log_2 n. Otherwise the entropies are weakly decreasing as a function of α. Higher values of α, approaching infinity, give a Rényi entropy that is increasingly determined by only the highest-probability events. Lower values of α, approaching zero, give a Rényi entropy that weights all possible events more nearly equally, regardless of their probabilities. The limiting case α = 1 gives the Shannon entropy, which has special properties. The Rényi entropies are important in ecology and statistics as indices of diversity. The Rényi entropy is also important in quantum information, where it can be used as a measure of entanglement; in the XY Heisenberg spin chain, for example, the Rényi entropy has been calculated explicitly as a function of α in terms of modular functions. Rényi entropies also lead to a spectrum of indices of fractal dimension.

The Tsallis entropy: A generalization of the standard Boltzmann-Gibbs entropy, the Tsallis entropy was put forward by Constantino Tsallis in 1988. It is defined as

S_q[p] = (1 / (q - 1)) ( 1 - ∫ p(x)^q dx ),

or, in the discrete case,

S_q = (1 / (q - 1)) ( 1 - Σ_i p_i^q ).

In this case, p denotes the probability distribution of interest and q is a real parameter; in the limit q → 1, the normal Boltzmann-Gibbs entropy is recovered. The parameter q is a measure of the non-extensivity of the system of interest. There are continuous and discrete versions of this entropic measure, as written above.
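The following Python sketch (with an assumed example distribution, not taken from the source) evaluates the discrete Rényi and Tsallis entropies and checks the limiting behaviour described above: as α → 1 the Rényi entropy approaches the Shannon entropy (in bits, since base-2 logarithms are used), and as q → 1 the Tsallis entropy approaches -Σ p_i ln p_i, the Boltzmann-Gibbs/Shannon form in nats.

import math

def shannon(probs, base=2.0):
    return -sum(p * math.log(p, base) for p in probs if p > 0)

def renyi(probs, alpha):
    # Discrete Renyi entropy of order alpha (base-2 logs), alpha >= 0, alpha != 1.
    return math.log2(sum(p ** alpha for p in probs)) / (1.0 - alpha)

def tsallis(probs, q):
    # Discrete Tsallis entropy, q != 1; its q -> 1 limit is -sum(p ln p) in nats.
    return (1.0 - sum(p ** q for p in probs)) / (q - 1.0)

p = [0.5, 0.25, 0.125, 0.125]    # assumed example distribution
print(shannon(p))                # 1.75 bits
print(renyi(p, 0.999))           # ~1.75: close to the Shannon value as alpha -> 1
print(renyi(p, 2.0))             # ~1.54: higher alpha weights the most probable events
print(tsallis(p, 1.001))         # ~1.21: close to the q -> 1 limit
print(shannon(p, base=math.e))   # ~1.21 nats, the Boltzmann-Gibbs/Shannon value for comparison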