Statistical Mechanics and Information Theory


Multi-User Information Theory 2                                          Oct 31, 2013

Statistical Mechanics and Information Theory

Lecturer: Dror Vinkler                                          Scribe: Dror Vinkler

I. INTRODUCTION TO STATISTICAL MECHANICS

In order to see the need for a statistical approach to mechanics, let us look at a system of N particles, each having a property called spin, which can be ±m. Each of these particles contributes to the total magnetic moment of the system, such that the total moment is given by

    M = \sum_{i=1}^{N} m_i.                                                        (1)

Definition 1 (Spin Excess). For a system of N particles with spins ±m, the spin excess is the number of particles with spin +m in excess of N/2, i.e. S = N_+ - N/2, where N_+ denotes the number of particles with spin +m.

From Definition 1 and Equation (1) it follows that the total magnetic moment is given by M = 2mS.

Theoretically, one could calculate the system's magnetic moment at some point in time by using Newton's laws for each particle (given the forces that act on it). However, as there are about 10^23 particles in a gram of matter, this is not very practical. A different approach is to use statistical tools to find the probability of having a certain magnetic moment at some point in time. Since the total magnetic moment is a deterministic function of the spin excess, this is equivalent to finding the probability of a certain spin excess.

Definition 2 (Multiplicity Function). The multiplicity function states how many microscopic states (specific sets of spins) result in the same macroscopic state (a certain magnetic moment). It is denoted g(S, N).

Lemma 1. The multiplicity function is given by

    g(S,N) = \binom{N}{N/2+S} = \frac{N!}{(N/2+S)!\,(N/2-S)!}.                     (2)
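
As a quick sanity check of Lemma 1, the short Python script below (a sketch of my own, not part of the original notes) enumerates every microscopic state of a small spin system, groups the states by spin excess, and compares the counts with the closed form of Equation (2).

    # Brute-force check of the multiplicity function (my own sketch, small N only).
    from itertools import product
    from math import comb

    N = 10                                       # number of spin-1/2 particles
    counts = {}                                  # spin excess S -> number of microscopic states
    for spins in product((+1, -1), repeat=N):    # each microscopic state is a sequence of signs
        S = spins.count(+1) - N // 2             # spin excess, Definition 1
        counts[S] = counts.get(S, 0) + 1

    for S, c in sorted(counts.items()):
        assert c == comb(N, N // 2 + S)          # matches g(S,N) of Equation (2)
    print(sum(counts.values()) == 2**N)          # True: 2^N microscopic states in total

Brute-force enumeration is only feasible for very small N, which is precisely why the closed form (2) and its Gaussian approximation below are useful.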

Looking at this system of spins as a sequence of bits, a microscopic state is equivalent to a specific sequence x^n and a macroscopic state is equivalent to some type P. The multiplicity function then becomes the size of the type class defined by S, i.e.

    P = \left(\tfrac{1}{2} + \tfrac{S}{N},\; \tfrac{1}{2} - \tfrac{S}{N}\right)  and  g(S,N) = |T(P)|.

Lemma 2. Under the assumption that every microscopic state is equally probable, and noticing that there are a total of 2^N possible microscopic states, the probability of finding the system with a given spin excess is

    P_N(S) = \frac{g(S,N)}{2^N}.                                                   (3)

Lemma 3. For S \ll N, P_N(S) is a normal distribution with mean 0 and variance N/4.

Proof: For S \ll N the Stirling approximation \ln n! \approx n\ln n - n applies (the linear terms cancel), and it follows that

    \ln\frac{P_N(S)}{P_N(0)} = \ln\frac{(N/2)!\,(N/2)!}{(N/2+S)!\,(N/2-S)!}
    = 2\cdot\frac{N}{2}\ln\frac{N}{2} - \left(\frac{N}{2}+S\right)\ln\left(\frac{N}{2}+S\right) - \left(\frac{N}{2}-S\right)\ln\left(\frac{N}{2}-S\right)
    = N\ln\frac{N}{2} - \left(\frac{N}{2}+S\right)\ln\left(\frac{N}{2}+S\right) - \left(\frac{N}{2}-S\right)\ln\left(\frac{N}{2}-S\right)
    = -\left(\frac{N}{2}+S\right)\ln\left(1+\frac{2S}{N}\right) - \left(\frac{N}{2}-S\right)\ln\left(1-\frac{2S}{N}\right)
    \overset{(a)}{\approx} -\left(\frac{N}{2}+S\right)\left(\frac{2S}{N}-\frac{2S^2}{N^2}\right) + \left(\frac{N}{2}-S\right)\left(\frac{2S}{N}+\frac{2S^2}{N^2}\right)
    = -\frac{2S^2}{N},                                                             (4)

where (a) follows from the Taylor expansion of the logarithms around 2S/N = 0. Thus

    P_N(S) = P_N(0)\, e^{-2S^2/N}.                                                 (5)

From Lemma 3 it follows that the system will most likely have zero magnetic moment, and that there is a very low chance of finding the system with a spin excess that is large relative to the number of particles.
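
To get a feel for how accurate the Gaussian approximation is, the sketch below (again my own, with an arbitrary choice of N) compares the exact probability P_N(S) = g(S,N)/2^N of Lemma 2 with the approximation P_N(0) e^{-2S^2/N} of Lemma 3.

    # Exact distribution of the spin excess vs. its Gaussian approximation (my own sketch).
    from math import comb, exp

    N = 1000                                     # moderate N; the S values below satisfy S << N
    P0 = comb(N, N // 2) / 2**N                  # exact P_N(0)
    for S in (0, 5, 10, 20):
        exact = comb(N, N // 2 + S) / 2**N       # Lemma 2
        approx = P0 * exp(-2 * S**2 / N)         # Lemma 3
        print(S, exact, approx)                  # the two columns agree closely

For S comparable to N the approximation breaks down, consistent with the S \ll N assumption in the proof.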

We will now use the insights gained so far to analyze systems which are a bit more complex. The analysis is done by holding certain parameters of the system constant and studying the behaviour of the system under these conditions. Depending on which parameters are held constant, the system is said to be in a different thermodynamic ensemble.

II. THE MICROCANONICAL ENSEMBLE

In the microcanonical ensemble, the constant parameters are the system's total energy U, volume V, and number of particles N. In order to visualize such a system, let us consider a rigid box completely isolated from its environment. This box is divided into two parts by a divider which allows only energy to flow from one part to the other (see Fig. 1). In other words, N_1, N_2, V_1, V_2 are all constant, and only U_1 can change over time. Notice that, since U is constant, we can write U_2 = U - U_1 and use only one variable to describe our system over time.

Fig. 1. The microcanonical ensemble model system: a system isolated from its environment and divided into two subsystems, (N_1, V_1, U_1) and (N_2, V_2, U - U_1), where only energy can flow from one subsystem to the other.

Suppose this box contains spin particles as discussed in the previous section, and there is an external magnetic field B which affects the system's energy according to the formula U = MB. Since U is constant, M must also be constant, which means the macroscopic state of the box as a whole is constant. However, the macroscopic state of each side of the box can change, with the only limitation being that M_1 + M_2 = M = const.

Lemma 4. The probability of finding the left part of the box with some energy U_1 is

given by

    P(U_1) = \frac{g(U_1,N_1)\, g(U-U_1,N_2)}{g(U,N)}.                             (6)

Proof: If the left part of the box has an energy of U_1, then it has spin excess S_1 = U_1/(2mB), and the right part of the box has an energy of U - U_1. According to Equation (3), the probability of this is given by the number of microscopic states that yield this macroscopic result, divided by the number of all possible microscopic states.

As this is a product of Gaussians, the expected value is the same as the most probable value. The most probable value of U_1 is given by the point where the derivative of P(U_1) is zero, but for ease of calculation the derivative of \ln P(U_1) will be considered. Setting

    \frac{\partial}{\partial U_1} \ln\left[g(U_1,N_1)\, g(U-U_1,N_2)\right] = 0    (7)

yields the following equality:

    \frac{\partial}{\partial U_1} \ln g(U_1,N_1) = \frac{\partial}{\partial (U-U_1)} \ln g(U-U_1,N_2).   (8)

Definition 3 (Entropy). The entropy of a system is denoted \sigma and defined as the logarithm of the multiplicity function, i.e.

    \sigma = \ln g(U,N).                                                           (9)

Definition 4 (Temperature). The temperature of a system is denoted \tau and defined through the derivative of the system's entropy with respect to the system's energy, in the following manner:

    \frac{1}{\tau} = \frac{\partial \sigma}{\partial U}.                           (10)

Theorem 1. When two systems are isolated from the environment and attached to one another such that only energy can flow between them, the expected state is the one in which both systems have the same temperature.

Proof: Follows directly from Equation (8) and the definition of temperature.

Looking at our system again as a sequence of equiprobable bits, and remembering that the multiplicity function is now the size of the type class, we see that the definition of entropy given here agrees with what we know about the size of type classes, namely that |T(P)| \approx 2^{nH(X)}. Also, this microcanonical setting can be seen as taking a sequence

of bits and dividing it randomly into two subsets of constant lengths N_1, N_2. Substituting g(\cdot) in Equation (8) with the Gaussian found in Lemma 3, and replacing U with its equivalent S, one finds that the expected state is that both sequences have the same type.

Theorem 2 (The First Law of Thermodynamics). In a microcanonical system where the entropy depends on the volume V and the energy U, the full differential of the energy is

    dU = \tau\, d\sigma - p\, dV,                                                  (11)

where p is the pressure.

Theorem 3 (The Second Law of Thermodynamics). In an isolated system, over a long enough period of time, the entropy always rises.

III. THE CANONICAL ENSEMBLE

In the canonical ensemble, the constant parameters are the system's temperature \tau, volume V, and number of particles N. The way to keep the system, denoted S, at a constant temperature is to put it in contact with a much larger system, called a heat bath and denoted R (see Fig. 2).

Fig. 2. The canonical ensemble model system: an isolated system divided in two, a very big heat bath R and a small system S.

The system and the heat bath together form a microcanonical ensemble with constant energy U_0. Denoting the system's energy by E, and noting that E \ll U_0 - E, it is easy to see that the system's energy has a negligible effect on the heat bath's energy, meaning the heat bath's temperature is constant. According to Theorem 1, this means the system's temperature will be constant as well, thus realizing the canonical ensemble.
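
Before deriving the canonical distribution, here is a quick numerical look back at the microcanonical result above. The sketch (my own construction; the subsystem sizes and the total spin excess are arbitrary choices) maximizes the product g(S_1,N_1) g(S - S_1,N_2) appearing in Lemma 4 and confirms Theorem 1 in its type-class form: at the most probable split both halves have (nearly) the same type, S_1/N_1 ≈ S_2/N_2.

    # Most probable energy split between two spin subsystems (my own sketch).
    from math import comb

    N1, N2 = 400, 600                            # subsystem sizes
    S_total = 100                                # fixed total spin excess (fixed total energy)

    def g(S, N):                                 # multiplicity function, Equation (2)
        k = N // 2 + S
        return comb(N, k) if 0 <= k <= N else 0

    # Weight of each split S1, S2 = S_total - S1, proportional to P(U_1) of Lemma 4.
    weights = {S1: g(S1, N1) * g(S_total - S1, N2) for S1 in range(-N1 // 2, N1 // 2 + 1)}
    S1_star = max(weights, key=weights.get)
    print(S1_star, S1_star / N1, (S_total - S1_star) / N2)   # the two ratios nearly coincide

With these numbers the maximizer comes out at S_1 = 40, i.e. both halves sit at a spin excess of 0.1 per particle, which is the equal-temperature condition of Theorem 1.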

Theorem 4. The probability of the system being in a certain microscopic state m is given by

    P_S(m) = \frac{1}{Z}\, e^{-E(m)/\tau},

where Z is a normalization factor and E(m) is the energy of the system when its microscopic state is m.

Proof: For a given microscopic state of the system, the number of possible states overall is given only by the number of possible states of the heat bath. Thus,

    P_S(m) = \frac{1}{Z}\, g_R(U_0 - E)
           = \frac{1}{Z}\, e^{\ln g_R(U_0 - E)}
           = \frac{1}{Z}\, e^{\sigma_R(U_0 - E)}
           \overset{(a)}{=} \frac{1}{Z} \exp\left(\sigma_R\big|_{U_0} - E\,\frac{\partial \sigma_R}{\partial U}\Big|_{U_0}\right)
           = \frac{1}{Z} \exp\left(\sigma_R\big|_{U_0} - \frac{E}{\tau}\right)
           = \frac{1}{Z}\, e^{-E/\tau},                                            (12)

where (a) follows from E \ll U_0 and the first-order Taylor expansion of \sigma_R around U_0 (the constant factor e^{\sigma_R(U_0)} is absorbed into the normalization Z).

Definition 5 (Partition Function). The normalization factor Z is called the partition function of the system, and is given by

    Z = \sum_m e^{-E(m)/\tau}.                                                     (13)

Lemma 5. Much like Shannon's entropy, the statistical-mechanical entropy can be calculated using the formula

    \sigma = -\sum_m P(m)\ln P(m).                                                 (14)

Proof:

    Z = \sum_m e^{-E(m)/\tau} = \sum_E g(E)\, e^{-E/\tau} \overset{(a)}{\approx} g(\langle E \rangle)\, e^{-\langle E \rangle/\tau}\,\sqrt{N},   (15)

where \langle E \rangle is the expected value of E and (a) follows from the fact that g(E) is a Gaussian of width proportional to \sqrt{N}. Taking the logarithm of both sides yields

    \ln Z = \ln g(\langle E \rangle) - \frac{\langle E \rangle}{\tau} + \frac{1}{2}\ln N \overset{(a)}{\approx} \sigma - \frac{\langle E \rangle}{\tau},   (16)

where (a) follows from the fact that \ln N \ll \sigma. Thus,

    \sigma = \ln Z + \frac{\langle E \rangle}{\tau}
           = \frac{1}{Z}\sum_m e^{-E(m)/\tau}\,\ln Z + \frac{1}{\tau}\,\frac{1}{Z}\sum_m E(m)\, e^{-E(m)/\tau}
           = \frac{1}{Z}\sum_m e^{-E(m)/\tau}\left(\ln Z - \ln e^{-E(m)/\tau}\right)
           = -\sum_m \frac{1}{Z}\, e^{-E(m)/\tau}\,\ln\left(\frac{1}{Z}\, e^{-E(m)/\tau}\right)
           = -\sum_m P(m)\ln P(m).                                                 (17)
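
The final identity (17) is an exact algebraic property of the Boltzmann distribution; the large-N approximations in (15) and (16) are only needed to identify \ln Z + \langle E \rangle/\tau with the microcanonical entropy. The sketch below (my own; the temperature and the list of state energies are arbitrary choices) verifies the identity directly for a small set of microscopic states.

    # Direct check that -sum_m P(m) ln P(m) = ln Z + <E>/tau for Boltzmann probabilities.
    from math import exp, log

    tau = 1.5                                    # temperature in energy units
    energies = [0.0, 0.3, 0.7, 1.2, 2.0]         # energies E(m) of a few microscopic states

    Z = sum(exp(-E / tau) for E in energies)               # partition function, Equation (13)
    P = [exp(-E / tau) / Z for E in energies]              # Boltzmann probabilities, Theorem 4
    E_avg = sum(p * E for p, E in zip(P, energies))        # <E>
    print(abs((-sum(p * log(p) for p in P)) - (log(Z) + E_avg / tau)) < 1e-12)     # True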