Chapter 1

Introduction

1.1 The Purpose of Statistical Mechanics

Statistical mechanics is the mechanics developed to treat a collection of a large number of atoms or particles. Such a collection is, for example, a solid made up of N ≈ 10^23 atoms, or a liquid or a gas of 10^23 molecules. Ordinary mechanics, classical or quantum, is suited to treating the behavior of one, two or at most a few bodies. There we set up an equation of motion for the body (Newton's equations in classical mechanics, for example) and solve this equation to find how the momentum and position of the body vary with time. With this equation of motion, once we have specified the initial state of the body by giving its position and momentum at some initial time, we can predict the behavior of the body at all future times.

If we attempted this approach for 10^23 atoms we would have not only an impossible task; the result would not even be useful. It does not help us much to know the individual motion of each atom, and we cannot describe the properties of the collection as an enormous list of individual motions. What we want instead are the average or macroscopic properties of the 10^23 atoms and how these are related to the microscopic interatomic interactions. We want to know particularly how the atoms are distributed over the possible states available to them, from which we can construct average values. These average values constitute the macroscopic properties of the liquid, solid or gas.

With this in mind we may say that statistical mechanics has two purposes. Firstly, given an assembly of N identical systems, how are these systems distributed over the states available to them? This might be the distribution over the possible momentum values or, in the case of an external field, the distribution of the systems in position. Strictly, only the distribution over energy (the number of systems in a given energy interval) is required, since the other distributions may be obtained from the energy distribution. Once we have this distribution, which is often of specific physical interest itself, we may evaluate average properties such as the energy, pressure and specific heat which make up the thermodynamic properties of the assembly of systems. A small numerical sketch of this averaging step follows.
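The sketch below is not from the text; the three-level system and its occupation numbers N(s) are invented purely to illustrate how a distribution of systems over energy states yields a macroscopic average.

```python
import numpy as np

# Hypothetical three-level system: the energies and the occupation
# numbers N(s) are invented to illustrate the averaging step.
energies = np.array([0.0, 1.0, 2.0])        # E(s), in units of kT
occupations = np.array([50.0, 30.0, 20.0])  # N(s): systems found in state s

# The macroscopic average is the occupation-weighted mean,
# <E> = sum_s N(s) E(s) / sum_s N(s).
mean_energy = np.sum(occupations * energies) / np.sum(occupations)
print(f"<E> = {mean_energy:.2f} kT")  # 0.70 kT for these numbers
```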

This leads to the second role of statistical mechanics. It provides the link between the microscopic properties of a many-body system and its macroscopic character. It is the link between the microscopic particle dynamics and interactions, on the one hand, and the thermodynamic properties of the system made up of these particles on the other. This link is made through the averaging process via the distributions discussed above.

How do we specify microscopically the state of a large, many-particle system? If we had one or two particles, we know from classical mechanics that we need only specify the position x_i and momentum p_i of the particles; this completely determines (with the equations of motion) the present and future behavior. We can regard x_i and p_i as vectors to points in a three dimensional configuration space and a three dimensional momentum space, respectively. If we combine the two spaces, we have a single point in a six dimensional space. The combined position and momentum space is referred to as PHASE SPACE. If the body we wish to specify contained only a single particle, this point in the six dimensional phase space would specify the state of the system completely.

We specify the state of a many-body system in the same way, by specifying the x_i and p_i of each of the i = 1 to N_S bodies. We denote the number of bodies in the system by N_S to distinguish it from the number N of systems in the assembly. The dimension of the phase space is now 6N_S, six dimensions for each of the N_S particles. A point in this 6N_S dimensional space completely represents the state of the N_S body system. In statistical mechanics we seek the distribution of a collection of identical such systems over the possible energy states available to the systems. This distribution can be represented by a distribution of representative points in the 6N_S dimensional phase space or Γ space, as the sketch below illustrates.
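As a concrete illustration of this bookkeeping (not from the text; the particle number, mass, time step and random initial conditions are all invented), the following sketch stores a point in Γ space as arrays of positions and momenta and advances it with Newton's equations for free particles.

```python
import numpy as np

# A point in Gamma space for N_S particles: 3*N_S positions and 3*N_S
# momenta. All values here are invented to illustrate the representation.
N_S = 4                                    # number of particles in the system
m = 1.0                                    # particle mass (arbitrary units)
rng = np.random.default_rng(0)
x = rng.uniform(0.0, 1.0, size=(N_S, 3))   # positions: one 3-vector per particle
p = rng.normal(0.0, 1.0, size=(N_S, 3))    # momenta:   one 3-vector per particle

# The full microscopic state is a single 6*N_S-dimensional point (x, p).
state = np.concatenate([x.ravel(), p.ravel()])
print("dimension of Gamma space:", state.size)  # 6 * N_S = 24

# For free (non-interacting) particles, Newton's equations move the
# representative point through phase space: dx/dt = p/m, dp/dt = 0.
dt = 0.01
x = x + (p / m) * dt   # one time step of the trajectory
```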

1.2 The Mathematical Model

We have been discussing an assembly of N identical systems. This concept of an assembly of N systems and their distribution over the elements ∆Γ of phase space is central to statistical mechanics. We have, however, not really said what these systems are. Fundamentally, the assembly of systems is a mathematical tool which we may interpret physically in different ways. There are two basic interpretations, one due to Ludwig Boltzmann and the other, later, to J. Willard Gibbs. The two interpretations are conceptually quite different, but the mathematical method of treating the assembly (and there are a number of methods) is identical. Hence the concept of an assembly of N identical systems should be viewed as a mathematical model. Only the physical interpretation of the systems differs between the Gibbs and Boltzmann pictures of statistical mechanics.

We solve the model here using the method of the most probable distribution introduced by Boltzmann. We seek, mathematically, the distribution of the N systems over the possible energy states available to the systems, given that the total energy of the assembly is fixed. The number N of systems in the assembly is also fixed. In doing this we will assume that initially there is an equal probability of finding a system in any energy state or in any region of phase space. That is, there is no a priori reason to choose one region over another to begin our probability arguments. This assumption is discussed in more detail in section 1.5.

1.3 The Gibbs Interpretation: Time and Space Averages

In the Gibbs interpretation, the system is the actual solid, liquid or gas containing many atoms that we wish to study. This system may, however, be in a number of states. We naturally want to include and describe all possible states of the system. To do this we construct an assembly of mental copies of the system in its different states. Clearly, to represent all possible states in this way, each possible state of the system must appear at least once in the assembly. Thus, here the assembly is the collection of mental copies. This assembly is also often referred to as an ENSEMBLE in the Gibbs method.

We might picture constructing our assembly of mental copies in the following way. Consider a large block of solid, say of copper, containing 10^23 atoms. We could partition this block into 10^5 or 10^6 equal parts, with each part remaining large enough that it still represents the character of a block of copper; that is, each part has the same character as the initial system. Each part can then be taken as the system. The idea is to select one of these parts as our system and regard all the other parts as mental copies of this system. As energy fluctuations take place in the block, energy is exchanged between the one system selected and the mental copies. We assume loose mechanical contact but good thermal contact between the copies, so that the temperature will be uniform in the block. In this way we may imagine our system immersed in a heat bath made up of identical mental copies of the initial system. Clearly again we must have enough mental copies so that each possible state, however rare, is represented at least once in the assembly. The number of copies appearing in a given state gives the probability of seeing the selected system in that state. This last sentence is basically the statement of the ERGODIC HYPOTHESIS.

If we focus first on the selected system, we could watch it for a long time. During this time heat will be exchanged with the heat bath (the remainder of the block) and the system's energy will fluctuate. If we observe the system for a long enough time T we should observe it in all its possible states. We could then give the average value of a property A, say, of the system as

    \bar{A} = \frac{1}{T} \int_0^T dt_s \, A(t_s)    (1.1)

where A(t_s) is the value of A at time t_s when the system is in state s.

If we counted the number of times, N(s), that we observed the system in state s during the time T, we could also write this time average as

    \bar{A} = \frac{\sum_s N(s) A(s)}{\sum_s N(s)}    (1.2)

where A(s) is the value of A in the state s observed at time t. We now turn to our N mental copies of the selected system and count, at one instant, the number of copies in state s. If we regard the mental copies of the system as constructed by partitioning a large body (e.g. our block of copper) into identical systems, then this second counting is a space average over the larger body, or an ensemble average over the N mental copies. The ergodic hypothesis states that these two averages are identical. In this second counting, then, the average of A in (1.2) is interpreted as a sum over all the states appearing among the N mental copies. Such a theorem cannot be, and has never been, proved. However, it is intuitively reasonable that, provided T is long enough and provided the number of copies N is large enough, the time average (1.1) should equal the space average (1.2). The latter we can write as

    \bar{A} = \frac{\sum_s N(s) A(s)}{\sum_s N(s)} = \sum_s \frac{N(s)}{N} A(s) = \sum_s \rho(s) A(s)    (1.3)

where ρ(s) is the fraction of copies in the ensemble observed in state s. The last average may be regarded as a space average over the whole block, since we could convert it to an integral over the block,

    \bar{A} = \frac{1}{V} \int d\mathbf{r} \, A(\mathbf{r})    (1.4)

where V is the volume of the block. The ergodic hypothesis states that the time and space (ensemble) averages in (1.1) and (1.4) are identical. A toy numerical check of this statement follows.
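The sketch below is not from the text; the two-state system and its flip probability are invented to show numerically that a long time average over one trajectory agrees with an ensemble average over many copies at one instant.

```python
import numpy as np

# Toy system: two states s = 0, 1 with property values A(s); at each time
# step the system hops to the other state with probability 0.3. All of
# these numbers are invented for illustration.
rng = np.random.default_rng(1)
A = np.array([0.0, 1.0])   # value of the property A in each state
p_flip = 0.3

def trajectory(n_steps, s0=0):
    """Return the sequence of states visited over n_steps time steps."""
    s, states = s0, np.empty(n_steps, dtype=int)
    for t in range(n_steps):
        if rng.random() < p_flip:
            s = 1 - s
        states[t] = s
    return states

# Time average (1.1): follow ONE system for a long time.
time_avg = A[trajectory(50_000)].mean()

# Ensemble average (1.2): look at MANY copies at one (late) instant.
copies = np.array([trajectory(100)[-1] for _ in range(20_000)])
ensemble_avg = A[copies].mean()

print(f"time average     = {time_avg:.3f}")      # both ~0.5, as the
print(f"ensemble average = {ensemble_avg:.3f}")  # ergodic hypothesis asserts
```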

How do we specify the states s of our system, which is now composed of a large number N_S of atoms or molecules? To do this we go back to dynamics, as noted above, where we saw that we may specify the state of a single particle by its momentum p_i and position x_i. Once the initial momentum and position of a single particle are given then, classically at least, the complete motion is specified for all times by the equation of motion the particle satisfies. We also take x_i and p_i as independent. This is obviously true in classical mechanics, but in quantum mechanics x_i and p_i can be specified only within Heisenberg's uncertainty limit, ∆p_i ∆x_i ≥ h^3 for the three-dimensional vectors (i.e. ∆p_{iα} ∆x_{iα} ≥ h for each component α). In this way we specify the state of a particle by locating p_i and x_i within a six dimensional cube of side h^{1/2} (volume h^3), where h = 6.626 × 10^{-27} erg s is Planck's constant. We will take this as the limit to which we attempt to specify x_i and p_i, and outside this limit x_i and p_i are regarded as statistically independent.

The instantaneous state of our system of N_S particles is then specified by giving the position x_i and momentum p_i of each constituent. If the constituents are atoms (which we regard as units without internal degrees of freedom) then we have a total of 6N_S coordinates to specify. Once this is done the state of the system may be represented by a point in the 6N_S dimensional phase space or Γ space. This state point lies in the volume element

    d\Gamma = \prod_{i=1}^{N_S} d\vec{p}_i \, d\vec{x}_i

of phase space. The motion of the particles in the system can then be described by the motion of this point through phase space. In particular, if the N_S x_i and p_i are fixed then the total energy of the system,

    E = \sum_{i=1}^{N_S} \frac{p_i^2}{2m_i} + V(\vec{x}_1, \ldots, \vec{x}_{N_S}),

is fixed. The motion of the point in Γ describes the redistribution of energy between kinetic and potential as the particles (e.g. atoms) interact, and as the system loses (or gains) total energy through contact and exchange of energy with the heat bath surrounding it. If the constituents making up the system are N_S molecules having f degrees of freedom, then we need to fix 2f coordinates to locate each molecule, and a 2fN_S dimensional phase space to specify the system.

Finally, the great power of the Gibbs interpretation of statistical mechanics is that we have said nothing about the interaction between the atoms, molecules or more elementary particles that make up the system. This means that the averages we obtain will be valid for arbitrary interaction among the particles. The Boltzmann interpretation to follow, although simpler, is valid only for very weakly interacting particles, which means it is restricted to dilute gases in practice.

1.4 Boltzmann's Statistical Mechanics

Ludwig Boltzmann (1844-1906) was interested primarily in the statistics and thermodynamics of gases. In this case the particles making up the gas are nearly independent, interacting only rarely with other particles. For example, in a gas at standard temperature and pressure (STP) the average interparticle spacing is about 40 Å, which is more than 10 times a typical atomic or molecular radius (a quick numerical estimate follows below). Thus collisions will be rare, and particularly for a very dilute gas we may regard each atom as a non-interacting particle. The limit of an infinitely dilute gas, in which the particles do not interact at all, is called a perfect or ideal gas. In this case each particle is clearly independent. For a gas, Boltzmann identifies the system with the particles, and the assembly of the N identical systems (particles) with the gas under study.
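A quick check of the dilute-gas number quoted above, sketched under the standard ideal-gas relation n = P/kT (the constants are CGS values; this sketch is ours, not the text's):

```python
# Estimate the mean interparticle spacing in an ideal gas at STP,
# r ~ (V/N)^(1/3) = (kT/P)^(1/3). CGS units throughout.
k_B = 1.381e-16   # Boltzmann constant (erg/K)
T = 273.15        # standard temperature (K)
P = 1.013e6       # standard pressure (dyn/cm^2 = 1 atm)

volume_per_molecule = k_B * T / P              # cm^3 per molecule
r = volume_per_molecule ** (1.0 / 3.0)         # mean spacing (cm)
print(f"mean spacing = {r * 1e8:.0f} Angstrom")  # ~34 Angstrom

# A typical atomic radius is ~1-2 Angstrom, so the spacing is indeed
# more than ten times the size of the atoms and collisions are rare,
# consistent with the ~40 Angstrom figure quoted in the text.
```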

In this case we are seeking the distribution of the N = N_S particles over the possible energy or momentum or velocity states. Once we have one distribution (e.g. over energy) we have the others, since for a free particle the energy, momentum and velocity are simply related by

    \epsilon_i = \frac{p_i^2}{2m} = \frac{1}{2} m v_i^2.

We are able to regard the particles as the systems in a gas since they are statistically independent units.

1.5 The Basic Assumption of Statistical Mechanics

In discussing Gibbs statistical mechanics we saw that the state of an N_S particle system can be represented or specified by a point in an element ∆Γ = \prod_i ∆p_i ∆x_i of the 6N_S dimensional phase space Γ. This identifies the momentum p_i and position x_i of each of the N_S particles in the system. The ∆Γ are called CELLS in phase space, and we select a cell size small enough that we can distinguish the different cells. We saw also that, due to Heisenberg's uncertainty principle, we cannot locate the point more precisely than ∆p_{iα} ∆x_{iα} ≥ h for each component within ∆Γ. In this case we ought to choose each cell at least as big as h^{3N_S}, since we cannot locate a state more accurately than this limit.

We did not, however, decide how large the cells in Γ space should actually be. Physically, we could make the cell large. The upper limit would be reached when the properties of the system changed substantially as we moved the point within the cell. This would tell us that we must divide the cell so that different states are represented by different cells. In other words, if there is more than one distinguishable state within the cell we must reduce the size of the cell to obtain a precise description of the system. In statistical mechanics we choose the size of the cell as h^{3N_S}. That is, each cell of this size is taken to contain one state of the system. The number of states in a volume dΓ of phase space is then dΓ / h^{3N_S}. This choice is really arbitrary and most properties, we shall see, are independent of this size. The absolute value of the entropy does, however, depend upon this choice of cell size. We can also show that for the microscopic distributions we seek, and particularly in classical cases, the cell size is not too large.

In Boltzmann's statistical mechanics the cell size is h^3, since each system is a single particle in a gas. In fact this cell is too small to contain a large number of atoms. For a gas at STP there is only about one chance in 100,000 that a cell contains an atom. This means that we will have large statistical fluctuations in the occupations of the cells (from 1 to 0, with rarely more than 1 in a cell). Boltzmann's combinatorial method depends mathematically on a uniform variation of occupation among adjacent cells. This apparent contradiction between the cell size and the mathematical requirements of Boltzmann's combinatorial method led many to criticize and dismiss his statistical mechanics. The problem can, however, be overcome by combining cells into groups large enough to contain a large number of particles. This is valid since the observable particle energy ε = p^2/2m varies slowly over the larger elements.
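The emptiness of the h^3 cells can be estimated with the degeneracy parameter n λ^3, which (up to factors of order one) is the mean number of molecules per cell of volume h^3. This sketch is ours; treating air as N2 is our simplifying assumption, and the result is even smaller than the one-in-100,000 figure above because the exact value depends on how the cell's momentum extent is chosen. Either way, the typical cell is empty.

```python
import math

# Mean occupation of a phase-space cell of volume h^3 for N2 at STP,
# estimated as n * lambda^3 with lambda = h / sqrt(2 pi m k T) the
# thermal de Broglie wavelength. CGS constants.
h = 6.626e-27        # Planck constant (erg s)
k_B = 1.381e-16      # Boltzmann constant (erg/K)
T = 273.15           # temperature (K)
P = 1.013e6          # pressure (dyn/cm^2)
m = 28.0 * 1.66e-24  # mass of an N2 molecule (g)

n = P / (k_B * T)                               # number density (cm^-3)
lam = h / math.sqrt(2 * math.pi * m * k_B * T)  # thermal wavelength (cm)
occupancy = n * lam**3
print(f"n * lambda^3 = {occupancy:.1e}")  # ~2e-7: cells are almost always empty
```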

Given that the state of a system is represented by a point in phase space, in what cell do we place this point? That is, how do we weight or choose the probability that a cell is occupied? The basic assumption of statistical mechanics is that each region or cell of phase space is intrinsically equally likely to be occupied. This is often referred to as the Equal A Priori Probability of all regions of phase space. This assumption is based on the fact that phase space is everywhere the same, so that we have no reason to choose or favor one region over another. It is based on a symmetry argument. By analogy, if we had a symmetric die, with each of its six faces identical, we would say that each face is equally likely to appear facing up when the die is thrown. Only if we weighted the faces with some additional conditions would any face be favored. We shall see that when we weight our statistical arguments with external conditions, certain regions of phase space become more heavily occupied than others. But before these conditions are imposed, we assume that each region is intrinsically equally likely to be occupied. Equally, before these conditions are imposed, each energy state of the system is equally likely to be occupied.

1.6 The Three Statistics: Classical, Fermi and Bose

To introduce the idea of particle statistics, we consider an assembly of non-interacting systems. A physical example to keep in mind is a perfect gas, an assembly of non-interacting atoms (the systems). Since we have taken the interactions to be negligible, there remains only the intrinsic character of the particles to distinguish one gas from another. We then have two cases (a numerical comparison of the two regimes follows the list):

1. The Classical Case: Here the particles are large enough, or the temperature of the gas is high enough, that classical mechanics accurately describes the particles. In this case the de Broglie wavelength (the spread of each particle in space) is much less than the inter-particle spacing (λ << r), so that each particle can be clearly distinguished from another. With this distinguishability we can trace the path of a single particle through the gas and identify it at each point. In this case we can distinguish between states in which two particles have interchanged positions.

2. The Quantum Case: Here the particle de Broglie wavelength is comparable to or greater than the inter-particle spacing (λ ≳ r) and, due fundamentally to Heisenberg's uncertainty principle, we cannot trace the path of a particle in the gas. We thus cannot distinguish between particles. This means that in enumerating the states of a system we cannot distinguish between states in which two particles are interchanged. We must count them as the same state.
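The following sketch compares the thermal de Broglie wavelength λ = h / sqrt(2πmkT) with the interparticle spacing r = n^(-1/3) for two illustrative helium cases; the densities and temperatures are our chosen examples, not data from the text.

```python
import math

# Decide which statistics apply by comparing the thermal de Broglie
# wavelength with the interparticle spacing. CGS units.
h, k_B = 6.626e-27, 1.381e-16
m_he = 6.64e-24  # mass of a 4He atom (g)

def wavelength(m, T):
    return h / math.sqrt(2 * math.pi * m * k_B * T)

# (temperature in K, number density in cm^-3); illustrative values
cases = {
    "He gas, 300 K, STP density": (300.0, 2.45e19),
    "liquid 4He, 2 K":            (2.0,   2.2e22),
}
for name, (T, n) in cases.items():
    lam, r = wavelength(m_he, T), n ** (-1.0 / 3.0)
    regime = "classical (lambda << r)" if lam < 0.1 * r else "quantum (lambda ~ r or larger)"
    print(f"{name}: lambda = {lam*1e8:.2f} A, r = {r*1e8:.1f} A -> {regime}")
```

The gas case gives λ ≈ 0.5 Å against r ≈ 34 Å (classical), while liquid 4He at 2 K gives λ ≈ 6 Å against r ≈ 3.6 Å (quantum).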

In addition, in quantum mechanics the probability of observing a particular distribution of the N particles in the gas is given by

    |\psi(x_1, x_2, \ldots, x_N)|^2.

All observable properties depend upon the square of the wave function ψ. Since the particles are indistinguishable, that is, we cannot tell when two particles are interchanged, this |ψ|^2 must be symmetric with respect to interchange of two particles. This symmetry of |ψ|^2 will be maintained if ψ is symmetric with respect to interchange, or if ψ is anti-symmetric with respect to interchange. In the latter case ψ changes sign when two particles are interchanged, so that |ψ|^2 remains unchanged.

To investigate the properties of this symmetry further, consider a system of two particles. If the particles are non-interacting we could write the total wave function of the pair as ψ(1, 2) = φ_a(1) φ_b(2), which is a product of two single-particle functions. This ψ(1, 2) does not, however, have the correct symmetry, since if we interchange particles 1 and 2 we get neither (a) ψ(2, 1) = ψ(1, 2) nor (b) ψ(2, 1) = -ψ(1, 2), corresponding to the symmetric and anti-symmetric cases respectively. We can, however, construct from φ_a and φ_b a pair function which is symmetric and one which is anti-symmetric, i.e.,

    \psi_{sym}(1, 2) = \phi_a(1) \phi_b(2) + \phi_a(2) \phi_b(1)    (1)

for which ψ_sym(2, 1) = ψ_sym(1, 2), and

    \psi_{anti}(1, 2) = \phi_a(1) \phi_b(2) - \phi_b(1) \phi_a(2)    (2)

for which ψ_anti(2, 1) = -ψ_anti(1, 2). The ψ_sym(1, 2) and ψ_anti(1, 2) have the correct symmetry, and each pair of particles must have a wave function of one of these forms. It is an observed property that particles having integer spin (s = 0, 1, 2, ...) have symmetric wave functions, while particles having half-integral spin have anti-symmetric wave functions. The integral spin particles are called BOSONS and the half-integral spin particles are called FERMIONS. In most cases we will consider only spin-0 Bosons and spin-1/2 Fermions.

If we now try to put two particles in the same state, for example φ_a = φ_b in (2), we see that ψ_anti(1, 2) vanishes. That is, for Fermi particles, having anti-symmetric wave functions, we can put only one Fermion in each single-particle state. Thus in placing particles in the possible states available we will be able to assign at most one particle per state. For the Boson case there is no such restriction, and we may assign any number of particles to a single-particle state. The restriction on Fermions leads to Fermi-Dirac statistics.
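A small symbolic check of equations (1) and (2) (our sketch, using sympy; the function and variable names are ours) confirms the two symmetries and the vanishing of ψ_anti for equal states:

```python
import sympy as sp

# phi_a, phi_b are arbitrary single-particle functions; x1, x2 label
# the coordinates of the two particles.
x1, x2 = sp.symbols("x1 x2")
phi_a, phi_b = sp.Function("phi_a"), sp.Function("phi_b")

psi_sym  = phi_a(x1) * phi_b(x2) + phi_a(x2) * phi_b(x1)
psi_anti = phi_a(x1) * phi_b(x2) - phi_a(x2) * phi_b(x1)

# Interchange the two particles (simultaneous swap of x1 and x2).
swap = {x1: x2, x2: x1}
print(sp.simplify(psi_sym.xreplace(swap) - psi_sym))    # 0: symmetric
print(sp.simplify(psi_anti.xreplace(swap) + psi_anti))  # 0: anti-symmetric

# Pauli exclusion: put both particles in the SAME state, phi_a = phi_b.
print(sp.simplify(psi_anti.subs(phi_b, phi_a)))  # 0: psi_anti vanishes
```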

In summary we have three cases:

1. Classical: particles distinguishable; leads to Maxwell-Boltzmann statistics.

2. Bosons (integer-spin particles): indistinguishable, any number of particles allowed in one state; leads to Bose-Einstein statistics.

3. Fermions (half-integer-spin particles): indistinguishable, only one particle allowed per state; leads to Fermi-Dirac statistics.

Finally, we note that we have considered a system of non-interacting particles to introduce the ideas of statistics. The three statistics hold, however, for interacting particles as well. For classical particles, the particles remain distinguishable when they interact. In the quantum case, the states a particle can occupy are modified by the inter-particle interaction, yet the state of the whole system is unchanged if we interchange two particles in their states. The only change is that the states are more complicated to work out, and they are certainly not independent single-particle states. The wave function describing the whole system must again be either symmetric or anti-symmetric w.r.t. particle interchange.

Secondly, a key idea is that of statistical independence. The mathematical model we will treat is an assembly of N weakly interacting and statistically independent systems. We will find that the probability of observing such a system in energy state E is proportional to e^{-E/kT}, the Boltzmann factor. What constitutes a statistically independent system? Firstly, classical particles can be independent because their wave functions are well localized compared to the inter-particle spacing. That is, their de Broglie wavelengths λ are much less than the inter-particle spacing. Thus they are distinguishable and well isolated from one another. If they also interact weakly or rarely, then they can be regarded as statistically independent. On the other hand, quantum particles have widely spread wave functions which overlap with the wave functions of other particles. The de Broglie wavelength is long compared to the inter-particle spacing. In this case the quantum particles can never be statistically independent, even if the inter-particle force is weak. They interact effectively via the overlap of their wave functions. In this case the probability of observing the quantum particles having energy ε is not given simply by the Boltzmann factor. Rather, this probability is given by the Fermi-Dirac distribution for Fermions and by the Bose-Einstein distribution for Bosons. For cases in which the particles of a body are not statistically independent, either because they are classical and strongly interacting or because they are quantum, we seek modes or excitations of the body which are independent. These are then described by the Boltzmann factor. Examples are phonons in solids or in liquid 4He, or the quasi-particles of liquid 3He.
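As a forward-looking sketch (the three distributions are only named, not derived, in this chapter; the standard textbook forms are quoted here, and the sample values of x = (ε - μ)/kT are arbitrary illustrative choices), the mean occupation of a single-particle state of energy ε differs among the three statistics as follows:

```python
import numpy as np

# Mean occupation n(eps) of a single-particle state in the three
# statistics, written in terms of x = (eps - mu)/kT. Standard forms,
# quoted here for illustration only.
def maxwell_boltzmann(x):
    return np.exp(-x)

def bose_einstein(x):        # requires x > 0 (eps above mu)
    return 1.0 / np.expm1(x)

def fermi_dirac(x):
    return 1.0 / (np.exp(x) + 1.0)

for x in [0.5, 1.0, 3.0, 6.0]:
    print(f"x = {x:>4}: MB = {maxwell_boltzmann(x):.4f}, "
          f"BE = {bose_einstein(x):.4f}, FD = {fermi_dirac(x):.4f}")
# For x >> 1 all three agree (the dilute, classical limit); as x -> 0
# the BE occupation diverges while the FD occupation saturates at 1,
# reflecting the Pauli restriction of one Fermion per state.
```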