THE PHYSICS OF STUFF: WHY MATTER IS MORE THAN THE SUM OF ITS PARTS

THE UNIVERSITY OF CHICAGO, ENRICO FERMI INSTITUTE ARTHUR H. COMPTON LECTURES, 71st SERIES THE PHYSICS OF STUFF: WHY MATTER IS MORE THAN THE SUM OF ITS PARTS JUSTIN C. BURTON, APRIL 3RD - JUNE 12TH, 2010

WELCOME! The purpose of the Compton Lectures is to share with the public some of the exciting research and new concepts in physics. This Spring's lectures will focus on the physics of stuff: matter we interact with on a daily basis, and its complex and emergent properties. Lectures will be held every Saturday at 11am until June 12th; there is no lecture on May 29th (Memorial Day weekend). Make sure to grab a lecture handout; all handouts and notes will be available on the website http://home.uchicago.edu/~jcburton/comptonlectures

OUTLINE
1) The matter we know: from the ordinary to the exotic
2) Solids: crystals and symmetry
3) Fluids and interfacial physics
4) Phase transitions: a universal theme
5) Super-stuff: quantum matter
6) Disorder and glassiness
7) From the old to the new: soft matter I
8) From the old to the new: soft matter II
9) Let's put it to use: materials science past and present
10) Much more than the sum of its parts: living matter and evolution

SOME PRELIMINARY CONSIDERATIONS What is matter? A substance or object that occupies space and has mass. Can we be more specific (reductionist)? Are there common components? Yes: molecules.

SOME PRELIMINARY CONSIDERATIONS [Figure: a molecule and its constituent atoms, with protons labeled p+.] Different atoms = elements. Periodic Table (Dmitri Mendeleev, 1869).

WHAT MAKES UP AN ATOM? Atom, from átomos (Greek): uncuttable. Yet nuclear fission breaks atoms into smaller parts, usually lighter atoms (e.g. barium-141 and krypton-92). [Figure: an atom of electrons (e-), protons (p+), and neutrons (n) splitting into fragments.] The first controlled nuclear chain reaction took place here at UofC, designed by physicist Enrico Fermi. Fermi worked in the Metallurgical Laboratory, directed by Arthur Compton. How far can this go? What determines an elementary particle?

ELEMENTARY PARTICLES Our standard model of particle physics is composed of elementary particles and force carriers. These particles obey the modern laws of physics (quantum mechanics, general relativity...). All of the observable matter in the universe is composed of some combination of these particles. What do we mean by observable?

THE STUFF WE CAN'T SEE (BUT KNOW IS THERE) Dark matter is the name given to mass in the universe that we cannot detect with optical instruments (i.e. it does not interact with light). Dark matter does, however, interact gravitationally (it has mass), so we have indirect methods of detection (gravitational lensing). The distribution of dark matter can be determined from images, such as those from the Hubble Space Telescope. ~4% of the matter-energy in the universe is visible (the stuff we are made of). ~22% of the matter-energy in the universe is dark matter. The nature of dark matter still eludes us and is an open and very active question in physics!

STATES OF MATTER
Gas, Liquid, Solid: connected by phase transitions (week 4)
Plasma: reached by ionization
Quantum Condensates: -superfluid (week 5) -superconductor -Bose-Einstein condensate
Exotic Forms: -dark matter -core of a neutron star -quark-gluon plasma/fluid -transparent aluminum

FROM THE BOTTOM UP? Can the complex properties of emergent phenomena be derived solely from the properties of elementary particles? In principle, maybe. In practice, it seems nearly impossible.

BREAKING DOWN THE SCIENCE There currently exists a scientific and philosophical debate about emergence and reductionism (the idea that the properties of large systems can be derived from the properties of their constituents). However, empirical evidence implies, for example, that Darwin's theory of evolution does not require the existence of elementary particle physics. In addition, it is likely impossible to derive Darwin's theory starting from particle physics. "Science X obeys the laws of science Y" (P.W. Anderson, Science, 1972). Although science X must obey the laws of science Y, science X requires new formalisms and theories to explain emergent phenomena. Or: X is more (and different) than the sum of its parts.

HOW LARGE IS LARGE? 1 particle... a few particles... 10^23 particles! How big is 10^23? There are about 10^14 cells in the human body. 10^14 cells x 10^9 humans ≈ 10^23 human cells in the world! How can we possibly study such large systems?

THE PHYSICS OF MANY BODIES What do we need to describe a single particle? 3 position coordinates (where?) and 3 velocity coordinates (how fast? and which direction?). This gives 6 total degrees of freedom. What about N particles (N ~ 10^23)? 6 degrees of freedom per particle x N particles = 6N total degrees of freedom! That's a lot to keep track of! This was the dilemma in the later part of the 19th century: how to get from Isaac Newton's laws for individual particles to the bulk properties of matter.
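
To get a feel for why tracking every degree of freedom is hopeless, here is a minimal back-of-the-envelope sketch (my own illustration, not from the lecture) estimating the memory needed just to store one snapshot of such a system, assuming one 8-byte floating-point number per coordinate.

```python
# Back-of-the-envelope estimate (illustrative only): memory needed to store a
# single snapshot of positions and velocities for a macroscopic sample.
N = 1e23                 # number of particles in a few grams of matter
dof_per_particle = 6     # 3 position + 3 velocity coordinates
bytes_per_value = 8      # one double-precision float per coordinate

total_bytes = N * dof_per_particle * bytes_per_value
print(f"{total_bytes:.1e} bytes, i.e. about {total_bytes / 1e21:.0f} zettabytes")
# -> 4.8e+24 bytes (~4800 zettabytes): far beyond the world's total storage,
#    and that is just one instant in time, before solving any equations of motion.
```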

A HISTORICAL PRIMER By the late 19th century the principles of thermodynamics had been established: energy conservation, work, heat, entropy (Sadi Carnot, Robert Mayer, William Thomson, Hermann Helmholtz, Rudolf Clausius, James Joule). This provided a framework in which to define basic macroscopic ideas such as heat flow, thermal equilibrium, and the efficiency of an engine. How do we connect thermodynamics to the underlying material, namely a large collection of atoms? atomistics -> statistical mechanics -> thermodynamics

A HISTORICAL PRIMER Statistical mechanics uses the mathematical tools of probability and statistics to describe thermodynamic phenomena. If the number of particles N is very large, then the dynamics of the system can be subject to a statistical interpretation. 1860s: James Clerk Maxwell derives the velocity distribution of particles in an ideal gas (coins the term "statistical mechanics"). 1870s: Ludwig Boltzmann gives entropy a probabilistic interpretation, develops the kinetic theory of gases. 1890s: Josiah Willard Gibbs perfects the theory of statistical mechanics and writes the "bible" of stat. mech.; develops the ideas of phase space and phase transitions; the 1st American theorist.

WHAT IS ENTROPY? Entropy is typically defined as a measure of how organized or disorganized a system is: equilibrium = maximum entropy, non-equilibrium = less entropy. Second Law of Thermodynamics: The entropy of an isolated system which is not in equilibrium will tend to increase over time, approaching a maximum value at equilibrium. "Any method involving the notion of entropy, the very existence of which depends on the second law of thermodynamics, will doubtless seem to many far-fetched, and may repel beginners as obscure and difficult of comprehension." - J. Willard Gibbs. "The law that entropy always increases holds, I think, the supreme position among the laws of Nature... if your theory is found to be against the second law of thermodynamics I can give you no hope; there is nothing for it but to collapse in deepest humiliation." - Sir Arthur Stanley Eddington

STATISTICAL MECHANICS - ENTROPY Probabilistic Interpretation: The entropy of a system depends on the number of possible states that are accessible. What do we mean by a system state? Let's say we flip a coin 1 time. We can either get heads or tails: there are 2 possible states. What about flipping it 2 times? There are 4 possible states: hh, ht, th, tt. What about flipping it 3 times? There are 8 possible states: hhh, hht, hth, htt, thh, tht, tth, ttt. What about flipping it 4 times? There are 16 possible states: hhhh, hhht, hhth, hhtt, hthh, htht, htth, httt, thhh, thht, thth, thtt, tthh, ttht, ttth, tttt.
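
As a quick sanity check on this counting, here is a small sketch (my own illustration, not from the lecture) that enumerates every sequence of n coin flips and confirms there are 2^n of them.

```python
from itertools import product

def coin_flip_states(n):
    """Return every possible outcome of n coin flips, e.g. 'hht'."""
    return ["".join(seq) for seq in product("ht", repeat=n)]

for n in range(1, 5):
    states = coin_flip_states(n)
    print(f"{n} flip(s): {len(states)} states", states if n <= 2 else "")
# 1 flip(s): 2 states ['h', 't']
# 2 flip(s): 4 states ['hh', 'ht', 'th', 'tt']
# 3 flip(s): 8 states
# 4 flip(s): 16 states
```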

STATISTICAL MECHANICS - ENTROPY
coin flips:        1  2  3  4  ...  N
number of states:  2  4  8  16 ...  2^N
The number of states grows exponentially in this system. A very powerful result of statistical mechanics is that the entropy is proportional to the exponent in the number of states: if we have 2^N states, then the entropy S ∝ N. Entropy has a logarithmic dependence on the (huge!) number of states, the same as the Richter scale for measuring earthquake strength.
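
The slide does not write the formula out, but the standard relation it is alluding to is Boltzmann's entropy formula; stated compactly and applied to the coin-flip example:

```latex
% Boltzmann's entropy formula: entropy is k_B times the log of the state count.
S \;=\; k_B \ln \Omega, \qquad \Omega = \text{number of accessible states}

% For N coin flips, \Omega = 2^N, so the entropy is proportional to N:
S \;=\; k_B \ln\!\bigl(2^{N}\bigr) \;=\; N\, k_B \ln 2 \;\propto\; N
```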

EQUILIBRIUM AND ENTROPY We can agree that if we flip a coin N times, we will on average get heads N/2 times (since the probability of heads is 1/2). For N coin flips, we will call the N/2 state "equilibrium". What does the distribution of states look like as we increase N? [Figure: probability density vs. number of heads (from 0 to N) for N = 10, 32, 100, 317, 1000; the peak at N/2 grows ever narrower.] As N approaches 10^23 (macroscopic size), it becomes outrageously unlikely to find the system in anything but the equilibrium state!
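
A minimal sketch (my own, not from the lecture) of why the peak narrows: for N fair coin flips the fraction of heads has a standard deviation of 1/(2*sqrt(N)), so the relative spread around the equilibrium value 1/2 shrinks as N grows.

```python
import math

# For N fair coin flips the fraction of heads has mean 1/2 and standard
# deviation 1/(2*sqrt(N)), so the distribution collapses onto the
# "equilibrium" state of N/2 heads as N grows.
for N in (10, 100, 10_000, 10**23):
    spread = 1 / (2 * math.sqrt(N))
    print(f"N = {N:.0e}: typical deviation of the heads fraction from 1/2 = {spread:.1e}")
# N = 1e+01: ... 1.6e-01
# N = 1e+02: ... 5.0e-02
# N = 1e+04: ... 5.0e-03
# N = 1e+23: ... 1.6e-12   (outrageously unlikely to see anything but equilibrium)
```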

TEMPERATURE Until this point I have not mentioned temperature. Temperature adds stochastic or random fluctuations to a system. These fluctuations allow a system to jump between different energy states. [Figure: two energy levels, E_low and E_high, separated by an energy gap ΔE.] At a given temperature, what is the probability of finding the system in a certain energy state?

PROBABILITY OF STATES Boltzmann showed that the relative probability P of a system being in a state with a given energy E_high is P ∝ exp(-ΔE / k_B T), where T = temperature, ΔE = E_high - E_low, and k_B = Boltzmann's constant = 1.38 x 10^-23 Joules/Kelvin. [Figure: probability vs. energy, showing exponential decay.] As the temperature decreases, it becomes less likely to find the system in the high-energy state E_high. As ΔE increases, it also becomes less likely to find the system in the high-energy state E_high.

Question: what is the pressure at the top of Mt. Everest? Answer: about 1/3 of atmospheric pressure. The potential energy of a nitrogen molecule in gravity = mass x height x acceleration of gravity; since gravitational potential energy increases with altitude, the Boltzmann factor makes high-altitude states less likely, and the air thins accordingly.
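
To make the Everest estimate concrete, here is a short worked sketch (my own numbers, not from the lecture) that plugs the gravitational potential energy of a nitrogen molecule into the Boltzmann factor; with a representative atmospheric temperature it indeed gives roughly one third of sea-level pressure.

```python
import math

# Boltzmann-factor estimate of the pressure ratio at the top of Mt. Everest.
k_B = 1.38e-23          # Boltzmann's constant, J/K
m_N2 = 28 * 1.66e-27    # mass of a nitrogen molecule, kg
g = 9.81                # gravitational acceleration, m/s^2
h = 8849.0              # height of Mt. Everest, m
T = 260.0               # rough average temperature of the air column, K

delta_E = m_N2 * g * h               # extra potential energy at the summit
ratio = math.exp(-delta_E / (k_B * T))
print(f"P(summit) / P(sea level) ~ {ratio:.2f}")   # ~ 0.32, i.e. about 1/3
```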

IDEAL GAS Ideal gas: the simplest large collection of particles. Consider N particles confined in a box of volume V. Assumptions:
-particles are not interacting
-the system is at high temperature, dominated by kinetic energy
-particles are indistinguishable
This model works amazingly well for many gases at room temperature, especially monatomic gases such as argon.

COUNTING STATES AND THERMODYNAMICS For this system of non-interacting particles (ideal gas), it can be shown that the number of states is proportional to the Nth power of the volume: # of states ∝ V^N. It turns out that if this relation is true, then the system will obey the ideal gas law: pressure x volume = number of particles x temperature x constant, i.e. PV = N k_B T. The point of all this is to show that the thermodynamics and physics of a collection of particles boils down to a counting problem. Statistical mechanics provides a pathway between the distribution of energy states of a system and its macroscopic properties:
-heat capacity (how much heat is needed to change the temperature?)
-compressibility (speed of sound)
-thermal conductivity (how well does it conduct heat?)
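
The slide states this result without derivation; a compact sketch of the standard argument connecting the state count to the ideal gas law (using the thermodynamic identity P/T = ∂S/∂V at fixed energy, which the lecture does not spell out) looks like this:

```latex
% Number of states and entropy for N non-interacting particles in a volume V:
\Omega(V) \;\propto\; V^{N}
\quad\Longrightarrow\quad
S \;=\; k_B \ln \Omega \;=\; N k_B \ln V + \text{(terms independent of } V\text{)}

% Thermodynamic identity relating pressure to how entropy changes with volume:
\frac{P}{T} \;=\; \left(\frac{\partial S}{\partial V}\right)_{E,N}
            \;=\; \frac{N k_B}{V}
\quad\Longrightarrow\quad
P V \;=\; N k_B T
```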

NOW THE INTERESTING STUFF There is no liquid phase in the ideal gas model. There is no freezing and crystallization in the ideal gas model. Real matter is composed of particles that interact in many different ways, and these interactions are what give rise to the vast array of properties of ordinary and exotic states of matter. For the past 100 years physicists have been developing experiments and theoretical tools to study large systems and their emergent properties. These lectures are intended to provide a look into how we understand these complex phenomena. (Photo by Wilson Bentley.)

ARROW OF TIME? (SOMETHING TO THINK ABOUT) Nearly all the laws of physics are time-reversible, meaning they have no preference for the direction of time. The fact that entropy will always tend to increase (2nd law) means that there is a preferred arrow of time (increasing entropy). Is this related to the psychological arrow of time? To the expanding universe? Maxwell's Demon: imagine an external influence able to open a door between two volumes of gas and only let fast particles move to one side. One side becomes hot and the other cold!? Does this violate the second law of thermodynamics? NO! We must include the demon in the calculation of entropy: the demon must do work to store information (memories), so the entropy of the whole system will still increase. The connection between entropy and information is part of the basis for modern information theory (developed by Claude Shannon, 1949).
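
As a pointer to that last connection (my addition, not part of the lecture handout): Shannon's information entropy has the same logarithmic form as the statistical-mechanics entropy used above, which is why a single bit of information corresponds to a thermodynamic entropy of order k_B ln 2.

```latex
% Shannon entropy of a source with outcome probabilities p_i (measured in bits):
H \;=\; -\sum_i p_i \log_2 p_i
\qquad\text{(fair coin: } p_h = p_t = \tfrac{1}{2} \;\Rightarrow\; H = 1 \text{ bit)}

% Thermodynamic entropy associated with one bit of information:
S_{\text{bit}} \;=\; k_B \ln 2 \;\approx\; 9.6 \times 10^{-24} \text{ J/K}
```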