
Intro

Take N particles, e.g. with energies 5 5 5 5 5 5. Each particle has an energy that we assume to be an integer E_i (above, all have 5). Particle pairs can exchange energy, E_i -> E_i + 1 and E_j -> E_j - 1 (above: 5 4 5 6 5 5). The total number of particles and the total energy are conserved. Any single-particle energy is equally probable for exchange, except zero: we assume E_i >= 0.

What is the most likely distribution of the single-particle energies?

Simulation

Simple process: each particle starts with the same energy. Take a random pair and change the energies, E_i -> E_i + 1 and E_j -> E_j - 1:

  for j=1:N, % loop over particles
    particle1=randi(N);
    if E(particle1)==0, continue; end   % continue avoids negative energies
    particle2=randi(N);
    E(particle1)=E(particle1)-1;
    E(particle2)=E(particle2)+1;
  end

[Plot: single-particle energies vs. particle index, sorted by energy, at the start and at the end of the run.]

This is only a short simulation. What is the probability distribution if we simulate longer? (A runnable sketch of the full simulation is given below.)
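For completeness, a minimal runnable sketch of the whole simulation; the values of N, the starting energy and n_sweeps are illustrative choices, not from the slides:

  % Minimal sketch: N particles repeatedly exchange integer units of energy.
  N = 100;                 % number of particles (illustrative)
  E = 30*ones(1,N);        % every particle starts with the same energy (illustrative value)
  n_sweeps = 10000;        % number of sweeps (illustrative)

  for sweep = 1:n_sweeps
    for j = 1:N                              % one sweep = N attempted exchanges
      particle1 = randi(N);
      if E(particle1)==0, continue; end      % cannot give energy away from E = 0
      particle2 = randi(N);
      E(particle1) = E(particle1) - 1;
      E(particle2) = E(particle2) + 1;
    end
  end

  plot(sort(E));                             % energies sorted by size, as in the plot above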

Now we also collect statistics of the single-particle energies:

  for j=1:N, % loop over particles
    particle1=randi(N);
    if E(particle1)==0, continue; end
    particle2=randi(N);
    E(particle1)=E(particle1)-1;
    E(particle2)=E(particle2)+1;
  end
  % Also collecting statistics (p(k) counts how often energy k-1 occurs;
  % p is initialised to zeros beforehand):
  for j=1:N,
    p(E(j)+1)=p(E(j)+1)+1;
  end

Results

[Plot: measured probability vs. energy with an exponential fit, on linear and logarithmic scales.]

The probability distribution seems to follow an exponential law,

  p_i ∝ exp(-β E_i).

This is the Boltzmann factor you should have seen in statistical physics!

Statistical physics

We had a special case, but similar things hold in the general case as well. Consider N particles with possible energy states E_1, E_2, E_3, .... The number of particles in state i is n_i. The total number of particles is found by summing the n_i,

  Σ_i n_i = N,

and the average total energy is

  Σ_i n_i E_i = U.

What is the most probable distribution of the n_i for a given N and U? (Assuming every state is equally probable.)
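As a toy illustration of this question (an addition, not from the slides): take N = 6 particles sharing U = 6 units of energy and compare the multiplicity N!/(n_0! n_1! n_2! ...) of a few candidate occupations {n_k} of the levels E = 0, 1, 2, 3. The candidate rows below are illustrative choices.

  N = 6;                    % particles, sharing U = 6 units of energy
  candidates = [0 6 0 0;    % all particles at E = 1
                3 1 1 1;    % exponential-looking occupation of E = 0,1,2,3
                2 2 2 0];   % flat occupation of E = 0,1,2
  for r = 1:size(candidates,1)
    n = candidates(r,:);
    fprintf('multiplicity = %d\n', factorial(N)/prod(factorial(n)));
  end

The multiplicities come out as 1, 120 and 90: the exponential-looking occupation is the most probable one, anticipating the general result derived next.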

Most probable distribution

The number of ways a given set {n_i} can be realised is

  P[{n_i}] = N! / (n_1! n_2! n_3! ...),

and maximizing P (with N and U held fixed via Lagrange multipliers) is straightforward; it results in

  n_i / N = exp(-β E_i) / Σ_i exp(-β E_i),

where β is determined from

  U = Σ_i E_i exp(-β E_i) / Σ_i exp(-β E_i).

Thermodynamics tells us that β = 1/kT (k is the Boltzmann constant). exp(-β E_i) is called the Boltzmann factor. Why do high-energy states have a smaller probability, even though we assumed every state to be equally probable?

Plugging in numbers (needed later)

Example: one particle, two states, with energies -ϵ and +ϵ. The probabilities are

  exp(β ϵ) / (2 cosh(β ϵ))    for the state with energy -ϵ,
  exp(-β ϵ) / (2 cosh(β ϵ))   for the state with energy +ϵ,

and the energy is

  U = -ϵ tanh(β ϵ).

[Plots: the two probabilities and U/ϵ as functions of kT/ϵ.]
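For reference, a short sketch of the maximization behind the "most probable distribution" result above (standard Stirling and Lagrange-multiplier steps, not spelled out on the slide; α and β are the multipliers of the two constraints):

  \ln P \approx N\ln N - \sum_i n_i \ln n_i
  \qquad \text{(Stirling: } \ln n! \approx n\ln n - n\text{)}

  \frac{\partial}{\partial n_i}\Big[\ln P - \alpha\Big(\sum_j n_j - N\Big)
        - \beta\Big(\sum_j n_j E_j - U\Big)\Big]
    = -\ln n_i - 1 - \alpha - \beta E_i = 0
    \;\Rightarrow\; n_i = e^{-1-\alpha}\, e^{-\beta E_i}

  % \sum_i n_i = N fixes e^{-1-\alpha} = N / \sum_i e^{-\beta E_i},
  % so n_i/N = e^{-\beta E_i} / \sum_i e^{-\beta E_i}; the constraint on U fixes \beta.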

Definitions

The partition function is defined as

  Z = Σ_i exp(-β E_i),

so, e.g., the probability of a state i is exp(-β E_i)/Z. The partition function of the previous example is

  Z = exp(-β ϵ) + exp(β ϵ) = 2 cosh(β ϵ).

If the number of states with energy U is Ω(N,V,U), we can define the entropy

  S(N,V,U) = k ln Ω(N,V,U).

Note also that

  Z = Σ_U e^{-βU} Ω(U) = Σ_U e^{-β(U - TS)},

so states with minimum F = U - TS dominate. F is called the free energy. This links statistical physics to thermodynamics.

Observables

The expectation value of an observable A is

  ⟨A⟩ = Σ_i A_i exp(-β E_i) / Z.

The average energy as a derivative:

  U = -∂ ln Z / ∂β.

Response functions: the specific heat is

  C_V = (∂U/∂T)_V,

and thus it is also

  C_V = (1/kT²) ∂² ln Z / ∂β²,

or a fluctuation of the energy,

  C_V = (1/kT²) (⟨E²⟩ - ⟨E⟩²).

We have kept N, V and T fixed; this is called the canonical ensemble.
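As a quick numerical sanity check of these definitions (an addition, not from the slides), a minimal MATLAB sketch for the two-state example: it compares the average energy with -ϵ tanh(βϵ) and with the finite-difference derivative of ln Z, and evaluates C_V from the fluctuation formula. The chosen values of ϵ and T are illustrative.

  eps0 = 1; k = 1; T = 2; beta = 1/(k*T);   % illustrative units and temperature
  E_states = [-eps0, eps0];                 % the two energy levels
  Z = sum(exp(-beta*E_states));             % partition function, equals 2*cosh(beta*eps0)
  p = exp(-beta*E_states)/Z;                % state probabilities
  U = sum(p.*E_states);                     % average energy

  % U = -d(ln Z)/d(beta), checked with a central finite difference:
  db = 1e-6;
  lnZ = @(b) log(sum(exp(-b*E_states)));
  U_num = -(lnZ(beta+db) - lnZ(beta-db))/(2*db);

  % C_V from the energy fluctuation:
  C_V = (sum(p.*E_states.^2) - U^2)/(k*T^2);

  disp([U, -eps0*tanh(beta*eps0), U_num, C_V]);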

Microstates and all that

Everything looks roughly the same if we replace the single-particle states with configurations of all the (non-interacting) particles. These configurations are commonly called microstates.

Example: take the one-particle, two-state model discussed above and generalise it to N non-interacting particles with two possible energies. For N = 3, with one-body states u and d, the microstates are

  (uuu), (uud), (udu), ..., (ddd).

The energy of a microstate is (N_u - N_d) ϵ, where N_x is the number of particles in state x (u or d). N_u + N_d = N, and the total number of microstates is 2^N.

Expectation values are sums over microstates, e.g.

  U = Σ_i E_i exp(-β E_i) / Σ_i exp(-β E_i).

Loop over microstates

We can use the bits of an integer to encode whether each particle is u or d: take the integers from zero to 2^N - 1, with bit value 1 meaning u and bit value 0 meaning d.

  N=15;           % Number of particles
  T=1;            % Temperature
  sum_E=0;        % accumulates E*exp(-E/T); <E> = sum_E/partition_f
  partition_f=0;  % partition function
  for i=1:2^N,
    Nu=double(sum(bitget(uint32(i-1),1:N)));  % double() so that E below can go negative
    Nd=N-Nu;
    E=Nu-Nd;
    boltzmann=exp(-E/T);
    sum_E=sum_E+E*boltzmann;
    partition_f=partition_f+boltzmann;
  end

Here the energy is scaled so that ϵ = 1. The energy per spin follows the -tanh(1/T) formula! (A two-line check is added below.)

[Plot: energy per spin vs. temperature.]

(Matlab has different variable types hidden inside it: try class(uint32(1)) and class(1).)

But this was stupid, as we could have separated the sum over microstates into N independent sums over two states. Yes, but that can NOT be done once the particles start to interact (NEXT)!
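The comparison claimed above can be made explicit with a couple of extra lines (an addition, not from the slides), reusing sum_E, partition_f, N and T from the loop:

  E_per_spin = (sum_E/partition_f)/N;   % <E>/N from the brute-force sum over microstates
  E_exact = -tanh(1/T);                 % analytic two-state result with eps = 1, k = 1
  disp([E_per_spin, E_exact]);          % the two numbers should agree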

Ising model, 1D

N spins on a 1D lattice (sites 1, 2, 3, ..., N), with an interaction between neighbours:

  H = -J Σ_{i=1}^{N} σ_i σ_{i+1},

where σ_i = ±1 is the spin at site i. J > 0, and periodic boundary conditions (σ_{N+1} = σ_1) have been used. At T = 0 the ground state has all spins up or all spins down.

Like before, expectation values are sums over microstates:

  ⟨A⟩ = Σ_{j=1}^{2^N} A(X_j) exp(-β H(X_j)) / Σ_j exp(-β H(X_j)),

where X contains all the spins {σ_i}_{i=1}^{N}.

Ising model, 1D: Matlab summing over microstates

  sum_E=0;
  partition_f=0;
  for i=1:2^N,
    s=int32(bitget(uint32(i-1),1:N))*2-1;            % spins -1/+1 from the bits of i-1
    E=double(-J*(sum(s(1:N-1).*s(2:N))+s(N)*s(1)));  % H = -J * sum of neighbour products (PBC)
    boltzmann=exp(-E/T);
    sum_E=sum_E+E*boltzmann;
    partition_f=partition_f+boltzmann;
  end

This can be done for various N.

[Plot: computational time for one temperature as a function of N.]
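An independent cross-check (an addition, based on the standard transfer-matrix result for the zero-field 1D Ising chain with periodic boundaries): the exact partition function is Z = λ+^N + λ-^N with λ+ = 2cosh(βJ) and λ- = 2sinh(βJ). A minimal sketch comparing it with the brute-force sums above, reusing J, T, N, sum_E and partition_f:

  beta = 1/T;                               % units with k = 1, as in the code above
  lam_p = 2*cosh(beta*J);                   % transfer-matrix eigenvalues
  lam_m = 2*sinh(beta*J);
  Z_exact = lam_p^N + lam_m^N;
  U_exact = -N*J*(lam_p^(N-1)*2*sinh(beta*J) + lam_m^(N-1)*2*cosh(beta*J))/Z_exact;  % -d(ln Z)/d(beta)
  disp([partition_f, Z_exact; sum_E/partition_f, U_exact]);   % brute force vs. exact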

Fortran and Matlab working on bits

Fortran is around 100 times faster, but shows the same 2^N scaling of the CPU time!

[Plot: CPU time vs. N for the Fortran and Matlab implementations.]

Brute force summation, 2D case

[Figure: 2D lattice; horizontal couplings are marked along the green line, vertical couplings in yellow (couplings across the boundary not shown).]

Dead end?

The 1D Ising model seems to be too hard to calculate by direct summation: any such method doubles the computer time when we add one more spin to the system (for N = 100 spins there are already 2^100 ≈ 10^30 terms). In addition, the analytic results for the 1D Ising model tell us that it is not very interesting. However, the 2D version turns out to have a phase transition!

  H = -J Σ_{⟨i,j⟩} σ_i σ_j - H Σ_i σ_i

J couples nearest-neighbour spins, and the notation ⟨i,j⟩ counts all nearest-neighbour pairs once. Again σ = ±1, the sites i live on a lattice, and various boundary conditions are possible. H is the external magnetic field. But we need a more clever algorithm for the simulations.

Monte Carlo in statistical physics

Our example: N spins pointing up or down. The expectation value of an observable A is

  ⟨A⟩ = Σ_{j=1}^{2^N} A(X_j) exp(-β H(X_j)) / Σ_j exp(-β H(X_j)),

where X contains all the spins {σ_i}_{i=1}^{N}. We cannot sum all the terms for large N: the scaling is too bad. Instead we find the important terms with the Metropolis algorithm, sampling X from the correct distribution:

  ⟨A⟩ ≈ (1/M) Σ_{i=1}^{M} A(X_i),   with the X_i drawn from exp(-β H(X_i))/Z.

Here β is the inverse temperature and H is the Hamiltonian (energy) of the system. Generalisation to continuum variables is possible, e.g. ...

Metropolis

0. Choose an initial state X.
1. Pick a random site i.
2. Calculate the energy change ΔE if the spin σ_i is flipped.
3. Generate a random number r (between zero and one).
4. If r < exp(-ΔE/kT), flip the spin.
5. Measure observables.
6. Go to 1.

The same code works for any lattice: just modify the couplings (they define the neighbours). A sketch of such a routine is given below.

Difference between one and two dimensions

[Plots: energy and specific heat C_V = ∂U/∂T vs. temperature for the one-dimensional and the two-dimensional square lattice; the two-dimensional curves are steep.]
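A minimal MATLAB sketch of the recipe above for the 2D square lattice with periodic boundaries, zero field and J = 1 (the function name, its arguments and the magnetization measurement are illustrative choices, not from the slides):

  function m_avg = ising2d_metropolis(L, T, n_sweeps)
  % Metropolis sketch for the 2D Ising model, H = -sum_<ij> s_i s_j (J = 1, k = 1).
  % Returns the average |magnetization| per spin over the sweeps.
    s = sign(rand(L,L) - 0.5);  s(s==0) = 1;   % 0. random initial +-1 configuration
    up = [L, 1:L-1];  dn = [2:L, 1];           % periodic-neighbour index tables
    m_sum = 0;
    for sweep = 1:n_sweeps
      for attempt = 1:L*L
        i = randi(L);  j = randi(L);           % 1. pick a random site
        nn = s(up(i),j) + s(dn(i),j) + s(i,up(j)) + s(i,dn(j));
        dE = 2*s(i,j)*nn;                      % 2. energy change if s(i,j) is flipped
        if rand < exp(-dE/T)                   % 3.-4. Metropolis acceptance
          s(i,j) = -s(i,j);
        end
      end
      m_sum = m_sum + abs(sum(s(:)))/(L*L);    % 5. measure the magnetization
    end
    m_avg = m_sum/n_sweeps;
  end

For example, ising2d_metropolis(20, 1.5, 5000) should give a magnetization close to 1, while well above the transition (T around 3.5) it should come out small.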

Phases of the Ising model

1. If T = 0, all spins are either up or down, and no spin flips are possible since exp(-ΔE/kT) -> 0. This state has magnetization m = (1/N) Σ_i σ_i = ±1.
2. If β = 0 (T = ∞), all states have the same weight and the spins are random: no magnetization, m = 0.

What happens between these limits?

1D: a smooth transition to lower magnetization.
2D: a sudden transition, a phase transition?

[Plot (Metropolis, 2D Ising model): magnetization vs. temperature; this plot is for 2D.]

You can get a visual understanding from the Java applets available online, like:
http://physics.ucsc.edu/~peter/ising/ising.html
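Such a magnetization-vs-temperature curve could be produced by scanning the temperature with the ising2d_metropolis sketch from above (again illustrative parameters; equilibration sweeps are skipped for brevity):

  L = 20;                           % linear lattice size (illustrative)
  temperatures = 1.0:0.1:4.0;       % scan across the expected transition region
  m = zeros(size(temperatures));
  for it = 1:numel(temperatures)
    m(it) = ising2d_metropolis(L, temperatures(it), 5000);
  end
  plot(temperatures, m, 'o-');
  xlabel('Temperature');  ylabel('|m| per spin');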

Mean-field analysis

  H = -J Σ_{⟨i,j⟩} σ_i σ_j  ->  -J Σ_{i nn of j} ⟨σ_i⟩ σ_j = -Jzm σ_j

Approximate the neighbours of a spin by the mean spin m (the magnetization); the number of nearest neighbours is z. This is the problem of one spin with two states of energies ±Jzm, solved in the statistical-physics introduction above. The expectation value of spin σ_j is therefore

  ⟨σ_j⟩ = tanh(zJm/kT).

But ⟨σ_j⟩ should be m itself! We get

  m = tanh(zJm/kT).

A self-consistent solution is needed: plug m in on the RHS, get a new m on the LHS, and continue until converged. (A short iteration sketch is given below.)

[Plot: mean-field magnetization vs. temperature for 2D (z = 4).]

At small x, tanh(x) ≈ x - x³/3, so that

  m = (zJm/kT) - (1/3)(zJm/kT)³.

The solutions are m = 0 and m ∝ (zJ/k - T)^{1/2}, so T_C = zJ/k, and the exponent 1/2 (= β) is a critical exponent.
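A minimal sketch of that self-consistent iteration (z, J, the temperature grid and the iteration limits are illustrative; units with k = 1):

  z = 4;  J = 1;                      % 2D square lattice, k = 1 units
  temperatures = 0.1:0.1:8;
  m_mf = zeros(size(temperatures));
  for it = 1:numel(temperatures)
    T = temperatures(it);
    m = 1;                            % start from the fully ordered value
    for iter = 1:10000                % fixed-point iteration m <- tanh(zJm/kT)
      m_new = tanh(z*J*m/T);
      if abs(m_new - m) < 1e-10, break; end
      m = m_new;
    end
    m_mf(it) = m;
  end
  plot(temperatures, m_mf);
  xlabel('Temperature');  ylabel('Magnetization');   % m vanishes at T_C = zJ/k = 4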