Kinetic Theory 1 / Probabilities

Kinetic Theory 1 / Probabilities
1. Motivations: statistical mechanics and fluctuations
2. Probabilities
3. Central limit theorem

Reading check. Main concept introduced in the first half of this chapter:
A) Temperature
B) Pressure
C) Entropy
D) Diffusion
E) Probabilistic description

Reading check. Most important: the probabilistic description (answer E).

The need for statistical mechanics

How to describe large systems. In this course we will try to make the connection between the macroscopic description of thermodynamics and the microphysics behind it. The prototype we will use is a volume of gas. http://cosmology.berkeley.edu/classes/s2009/phys112/diffusion/GaussDiffusion.html
Our systems are large! Avogadro's number N = 6.02 × 10²³ per mole (= 22.4 liters of gas = 12 grams of carbon). Very large number of particles: ~10²⁸ in 500 m³.
At finite temperature the particles are not at rest: $\frac{1}{2} m \langle v^2 \rangle = \frac{3}{2} k_B T$ gives velocities of order 300 m/s for air molecules at 300 K, with $k_B = 1.38 \times 10^{-23}$ J/K and $m = 32 \times 10^{-3}\,\mathrm{kg} / 6.02 \times 10^{23}$.
Constant collisions: the velocity magnitude and direction change at each one. E.g., for the gas in this room the mean free path is $\lambda = \frac{1}{\sigma n} \approx \frac{1}{\pi d^2 n} \approx 0.1\,\mu\mathrm{m}$, using $d = 3 \times 10^{-10}$ m and $n = 6.02 \times 10^{23} / 0.0224\ \mathrm{m}^{-3}$.
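
As a quick sanity check on these numbers, here is a minimal Python sketch (assuming O₂ molecules treated as hard spheres; the variable names are mine, not from the lecture):

```python
import math

k_B = 1.38e-23          # Boltzmann constant, J/K
T = 300.0               # room temperature, K
m = 32e-3 / 6.02e23     # mass of one O2 molecule, kg
d = 3e-10               # effective molecular diameter, m
n = 6.02e23 / 0.0224    # number density at standard conditions, m^-3

# (1/2) m <v^2> = (3/2) k_B T  =>  rms speed
v_rms = math.sqrt(3 * k_B * T / m)

# mean free path: lambda = 1 / (sigma * n), with sigma ~ pi d^2
lam = 1.0 / (math.pi * d**2 * n)

print(f"v_rms  ~ {v_rms:.0f} m/s")     # a few hundred m/s
print(f"lambda ~ {lam * 1e6:.2f} um")  # ~0.1 micrometer
```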

Ideal Gas. What is an ideal gas?
A) An ordinary gas such as air
B) Something that obeys PV = NkT
C) Particle probability distributions are independent
D) Independent particles without interactions
E) Particle systems where interaction time << mean free time between collisions

Ideal gas. Answers: C or E.

Statistical Mechanics. Probabilistic description.
1 particle. Classically, we have to track both position and velocity (or momentum). We will introduce $f(\mathbf{x}, \mathbf{p})\,d^3x\,d^3p$, a probability distribution on phase space = position × momentum. Quantum mechanically: the probability $p_i$ of a single particle to be in state $i$.
N particles. In many cases a good approximation is to treat them as independent; we will call this an Ideal Gas. http://cosmology.berkeley.edu/classes/s2009/phys112/diffusion/GaussDiffusion.html In terms of formalism, this corresponds to taking the product of the single-particle probability distributions: $\prod_{\text{particles}\ i} f(\mathbf{x}_i, \mathbf{p}_i)\,d^3x_i\,d^3p_i$.
In general there are correlations between particles => not independent. Examples: Van der Waals gas, liquids.

Temperature. What is temperature?
A: A measurement of how hot a system is (as measured by a thermometer)
B: A measurement of the random kinetic energy of particles in the system
C: A measurement of the mean energy of excitations in the system
D: $\frac{1}{2} m \langle v^2 \rangle = \frac{3}{2} k_B T$
E: A measurement of the kicks to the system

Temperature. Closest answer: C, with $\langle \varepsilon \rangle \sim k_B T = \tau$. Two reasons:
1) There are usually more degrees of freedom than the spatial ones: spin (e.g., ferromagnetism); for diatomic molecules, rotation and vibration. [Figure: $C_V/k_B$ of diatomic molecules vs. temperature, showing the rotation and vibration steps; cf. Kittel & Kroemer fig. 3.9]
2) Other effects. Quantum gas: the Fermi-Dirac exclusion principle means that we have to stack particles in energy. [Figure: Fermi-Dirac occupation $f(\varepsilon, \tau)$ vs. $\varepsilon$, falling from 1 to 0 around the Fermi energy $\varepsilon_F$ over a width $\sim \tau$]

Thermodynamics: Macroscopic Quantities.
Temperature: a measure of the mean energy of excitations of the system. E.g., in a classical gas the excitation is the random velocity of the particles: $\frac{1}{2} m \langle v^2 \rangle = \frac{3}{2} k_B T$. We do not care about coherent motion, e.g., the bulk velocity of the gas. There are many more degrees of freedom: spins, rotation, vibration, broken Cooper pairs. We are only interested in what changes when we heat up the system. => We will have to understand how to calculate the mean energy $\langle \varepsilon \rangle = \int_V \varepsilon(\mathbf{p})\, f(\mathbf{x}, \mathbf{p})\,d^3x\,d^3p$.
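
A minimal Monte Carlo check of such a mean (assuming the classical Maxwell distribution introduced later, where each velocity component is Gaussian with variance τ/M, so $\langle \frac{1}{2}Mv^2 \rangle$ should come out to $\frac{3}{2}\tau$; illustrative units with $k_B = 1$):

```python
import numpy as np

rng = np.random.default_rng(0)
M, tau = 1.0, 3.0                     # illustrative units, k_B = 1
N = 1_000_000

# Maxwell gas: each velocity component ~ Normal(0, sqrt(tau / M))
v = rng.normal(0.0, np.sqrt(tau / M), size=(N, 3))
eps = 0.5 * M * np.sum(v**2, axis=1)  # kinetic energy of each particle

print(eps.mean(), 1.5 * tau)          # both ~ 4.5: <eps> = (3/2) tau
```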

Pressure. What is pressure? What is the most accurate answer?
A: A measure of the kicks that particles give each other during collisions
B: $NkT/V$
C: The average force per unit area on the walls of the container
D: A measure of the energy density
E: The force per unit area on the walls of the container

Pressure. C/D are the closest answers. B is a formula of limited applicability. A and D are related to the microphysics of collisions. C is better than E because of the fluctuations. We will show quite generally:
non-relativistic: $u = \frac{3}{2}\frac{N}{V} k_B T = \frac{3}{2}\frac{N}{V}\tau$ and $P = \frac{N}{V}\tau$, the same pressure as the thermodynamic definition $P = \tau \left(\frac{\partial \sigma}{\partial V}\right)_{U,N}$;
ultra-relativistic: $pv = \varepsilon$ => $P = \frac{1}{3} u$.

Macroscopic quantities 2: Pressure. Two possible definitions.
1) The simplest: the average force per unit area on the walls of the container. http://cosmology.berkeley.edu/classes/s2009/phys112/idealgas/idealgas.html
2) A measure of the kicks that particles give each other per unit time, related to the mean energy density:
non-relativistic: $u = \frac{3}{2}\frac{N}{V} k_B T = \frac{3}{2}\frac{N}{V}\tau$, $P = \frac{N}{V}\tau$;
ultra-relativistic: $pv = \varepsilon$ => $P = \frac{1}{3} u$.

Entropy. What is the entropy of a system?
A: A measure of the state of disorder of a system
B: A measure of the quantity of heat in the system
C: A measure of the disordered energy in the system

Entropy. Answer: A. With the modern definition $\sigma = -\sum_i p_i \log p_i$ ($p_i$ = probability of state $i$), the entropy is 0 if the system is in one definite state and maximal if all states are equiprobable. B and C: no; heat is related to entropy only through $\delta Q = T\,dS$.

Macroscopic quantities: the thermodynamic identity. At equilibrium
$dU = T\,dS - p\,dV + \mu\,dN = \tau\,d\sigma - p\,dV + \mu\,dN$
with U = energy of the system, V = volume, N = number of particles, $\tau = k_B T$ = temperature, $\sigma = S / k_B$ = entropy, and µ = chemical potential.
Entropy: a measure of the disorder. We will define it as $\sigma = -\sum_i p_i \log p_i$ ($p_i$ = probability of state $i$). Note that this requires discrete states. It measures how unequal the probabilities are, and is maximum when all probabilities are equal: $p_i = 1/g$ => $\sigma = \log g$, where g = number of states in the system.
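
This definition is easy to explore numerically; a small sketch (the helper name `entropy` is mine) showing σ = 0 for a definite state and the maximum log g for the uniform distribution:

```python
import numpy as np

def entropy(p):
    """sigma = -sum_i p_i log p_i, skipping zero-probability states."""
    p = np.asarray(p, dtype=float)
    p = p[p > 0]
    return -np.sum(p * np.log(p))

g = 4  # number of states
print(entropy([1, 0, 0, 0]))                  # 0.0: one definite state
print(entropy([0.5, 0.3, 0.1, 0.1]))          # between 0 and log g
print(entropy(np.full(g, 1 / g)), np.log(g))  # maximum: log g ~ 1.386
```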

How does a system reach equilibrium? Diffusion. Consider the gas in this room and introduce a puff of CO gas in the center of the room: it diffuses out <= collisions. Starting from a narrow initial curve, the density n(x) evolves towards a uniform distribution. http://cosmology.berkeley.edu/classes/s2009/phys112/diffusion/GaussDiffusion.html
The same is true in probability space: an isolated system tends to evolve towards maximum flatness. All states equally probable = maximum entropy!
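
The linked applet can be mimicked in a few lines: a "puff" of particles starts at the center, takes random kicks, and the density histogram flattens. A rough sketch, not the course applet (all parameters illustrative):

```python
import numpy as np

rng = np.random.default_rng(1)
n_particles, n_steps, L = 10_000, 2_000, 100.0

x = np.full(n_particles, L / 2)  # puff of CO released at the center
for _ in range(n_steps):
    x += rng.choice([-1.0, 1.0], n_particles)  # random kick per collision
    x = np.clip(x, 0.0, L)                     # crude walls of the room

counts, _ = np.histogram(x, bins=10, range=(0, L))
print(counts / n_particles)  # drifts toward the flat value 0.1 per bin
```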

Fluctuations. Because the number of particles is finite, there are fluctuations. Example: the pressure of an ideal gas. http://cosmology.berkeley.edu/classes/s2009/phys112/idealgas/idealgas.html Note that as the number of particles increases, the fluctuations become relatively smaller: $\frac{\Delta P}{P} \sim \frac{1}{\sqrt{N}}$ (central limit theorem). Thermodynamics is a good approximation.
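
The $1/\sqrt{N}$ scaling is easy to see by treating a "pressure reading" as the average of N random kicks and measuring its relative spread over many trials; a minimal sketch with illustrative numbers:

```python
import numpy as np

rng = np.random.default_rng(2)
trials = 1_000
for N in (100, 1_000, 10_000):
    # one "pressure reading" = average of N random molecular kicks
    P = rng.exponential(1.0, size=(trials, N)).mean(axis=1)
    print(f"N={N:>6}  dP/P ~ {P.std() / P.mean():.4f}"
          f"  1/sqrt(N) = {N ** -0.5:.4f}")
```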

Probabilities. See e.g.:
http://en.wikipedia.org/wiki/probability
http://mathworld.wolfram.com/probability.html
http://wiki.stat.ucla.edu/socr/index.php/ebook

Question 1: Roulette. In the US roulette has 38 slots. What is the probability that the ball lands in one particular slot?
A: Don't know
B: 1/35
C: 1/38

Question 1: Roulette. Answer: C, 1/38. Note that you get only 35 times your bet, which is where the house makes its profit.

Probability, Moments. A probabilistic event occurs in an "uncontrolled" way and is described by a random variable; the underlying parameter is fixed (even if unknown).
1) Discrete case: consider bins, i.e., n boxes labelled s (e.g., roulette). In a "drawing" of N objects, $N_s$ land in box s. Probability = a measure on sets, defined by the limit of the ratio of the number of outcomes s to the total number of draws:
$P(s) = \lim_{N \to \infty} \frac{N_s}{N} \geq 0$, normalized: $\sum_s P(s) = 1$.
An important example, histograms: make boxes of width $\Delta x_i$; as the total number of events $N \to \infty$, $\frac{N_i}{N} \to P_i$. See e.g.: http://cosmology.berkeley.edu/classes/s2009/phys112/diffusion/GaussDiffusion.html

Probability: overlapping possible outcomes. With $A = \{s_i\}$ and $B = \{s_j\}$, $P(A) = \sum_i P(s_i)$ and $P(B) = \sum_j P(s_j)$:
OR: $P(A \cup B) = P(A) + P(B) - P(A \cap B)$;
AND: $P(A \cap B) = \sum_{\text{common}\ i=j} P(s_i)$, = 0 if there is no common region of outcomes.
Combined with positivity and normalization => probability is a measure on sets.

Probabilities. Can we determine probabilities experimentally?
A: Yes
B: No

Probabilities. The answer is B: NO. $P(s) = \lim_{N \to \infty} N_s / N$ is normalized ($\sum_s P(s) = 1$), but at finite N the frequency $N_s / N$ is only an estimator of $p_s$: $N_s / N \to p_s$, with an error of order $\sqrt{\frac{p_s (1 - p_s)}{N}}$. We can predict probabilities theoretically, e.g., by symmetry arguments, but the system may be imperfect!
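
A short simulation of this point, using the American roulette wheel from Question 1 (p = 1/38; the names are mine): the frequency estimate hovers around p within the error quoted above.

```python
import numpy as np

rng = np.random.default_rng(3)
p = 1 / 38                         # true probability of one slot
for N in (1_000, 100_000, 10_000_000):
    hits = rng.random(N) < p       # N spins of the wheel
    p_hat = hits.mean()            # the estimator N_s / N
    err = np.sqrt(p * (1 - p) / N)
    print(f"N={N:>10}  p_hat = {p_hat:.5f}  expected error ~ {err:.5f}")
```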

Conditions on probabilities. Consider 5 possible outcomes. Which sequence of numbers can be a set of probabilities for these outcomes?
A: 0.1, 0.5, 0.3, 0.2, 0.1
B: 0.1, 0.7, 0.05, 0.1, 0.05
C: 0.3, -0.1, 0.06, 0.15, 0.005

Conditions on probabilities. B is correct: probabilities have to be positive, $P(s) \geq 0$, and normalized, $\sum_s P(s) = 1$. A is not normalized (it sums to 1.2) and C contains a negative entry.

Probability, Moments 2.
Moments: consider a real function y(s) defined on the random outcomes s.
Mean = expectation value ("experimental average"): $\langle y \rangle = \sum_s y(s)\,P(s)$.
Variance: $\sigma_y^2 = \langle (y - \langle y \rangle)^2 \rangle = \langle y^2 \rangle - \langle y \rangle^2$; the root mean square (r.m.s.) $\sigma_y = \sqrt{\sigma_y^2}$ = standard deviation, a measurement of the width of the distribution.
Independence: consider several random variables x, y (in general they have some causal connection): $P(x, y) = P(x)\,P(y/x) = P(y)\,P(x/y)$. They are independent if $P(x, y) = P(x)\,P(y)$ <=> $P(x/y) = P(x)$ <=> $P(y/x) = P(y)$.
Correlation: $\rho_{xy} = \frac{\langle (x - \langle x \rangle)(y - \langle y \rangle) \rangle}{\sigma_x \sigma_y}$; = 0 if independent (but in general not the reverse!).
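
These definitions translate directly into sample estimates; a small sketch (toy variables of my own construction, with y deliberately built to depend on x):

```python
import numpy as np

rng = np.random.default_rng(4)
n = 100_000
x = rng.normal(size=n)
y = 0.6 * x + 0.8 * rng.normal(size=n)  # built to be correlated with x

print(x.mean(), x.var())                # <x> ~ 0, sigma_x^2 ~ 1
rho = np.mean((x - x.mean()) * (y - y.mean())) / (x.std() * y.std())
print(rho)                              # ~ 0.6, as constructed
```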

Independence. [Figure: four scatter plots of (x, y), labelled A, B, C, D.] Which of these distributions appear to be independent? Which of these distributions seem to have zero correlation coefficient?

Independence. Which of these distributions appear to be independent? C. Which seem to have zero correlation coefficient? C and D. We say "appear" or "seem" because we can only have an estimate.

Discrete Probabilities: 2 important examples.
Poisson distribution. Applies when we count the number of independent random events in a drawing or an experiment (e.g., the number of events in a given time interval, when the events are independent and occur at a constant rate, for example radioactive decays). The probability of obtaining exactly n events when we expect m is
$\mathrm{Prob}(n) = \frac{m^n}{n!} e^{-m}$, with $\mu = m$, $\sigma^2 = m$ (statistical fluctuations).
Binomial distribution. Applies when there are two possible outcomes, 1 and 2, with probabilities p and 1 - p (e.g., the number of events in a histogram bin: an event falls either in the bin or outside; or, for spin-1/2 particles, the number of spins up). The probability of obtaining exactly n outcomes 1 in N trials (N = total number of events) is
$\mathrm{Prob}(n/N) = \frac{N!}{n!\,(N-n)!}\, p^n (1-p)^{N-n}$, with $\mu = Np$, $\sigma^2 = Np(1-p)$ (statistical fluctuations).
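
Both results are easy to verify by simulation; a minimal sketch checking the sample mean and variance against µ = m, σ² = m (Poisson) and µ = Np, σ² = Np(1-p) (binomial):

```python
import numpy as np

rng = np.random.default_rng(5)

m = 4.2                                      # expected number of events
n_pois = rng.poisson(m, size=1_000_000)
print(n_pois.mean(), n_pois.var())           # both ~ m = 4.2

N, p = 100, 0.3                              # trials, probability of outcome 1
n_binom = rng.binomial(N, p, size=1_000_000)
print(n_binom.mean(), n_binom.var())         # ~ Np = 30 and Np(1-p) = 21
```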

Probability, Moments 3.
2) Continuous case: infinitesimal box dx, $P(dx) = f(x)\,dx$, normalized: $\int f(x)\,dx = 1$. Then
$\langle y \rangle = \int y(x)\,f(x)\,dx$, $\langle y^2 \rangle = \int y^2(x)\,f(x)\,dx$, $\sigma_y^2 = \int (y(x) - \langle y \rangle)^2 f(x)\,dx$.
Examples:
1) Path length of a particle with mean free path λ. Probability of interaction between l and l + dl: $f(l)\,dl = \frac{1}{\lambda} \exp\left(-\frac{l}{\lambda}\right) dl$.
2) Gaussian (= normal): $f(x)\,dx = \frac{1}{\sqrt{2\pi}\,\sigma}\, e^{-\frac{(x-\mu)^2}{2\sigma^2}}\,dx$; µ and σ² are indeed the mean and variance.
3) Maxwell: ideal gas => the probability for a given particle to have velocity v in a solid-angle cone dΩ is
$f(v)\,dv\,d\Omega = \left(\frac{M}{2\pi\tau}\right)^{3/2} \exp\left(-\frac{Mv^2}{2\tau}\right) v^2\,dv\,d\Omega$, with $d\Omega = d\cos\theta\,d\varphi$.
This implicitly assumes polar coordinates: $v_x = v\sin\theta\cos\varphi$, $v_y = v\sin\theta\sin\varphi$, $v_z = v\cos\theta$.
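
Since the Maxwell distribution is equivalent to three independent Gaussian velocity components (shown as an example under Independence/Correlation below), it can be sampled that way; a minimal sketch checking the mean speed against the analytic $\langle v \rangle = \sqrt{8\tau/\pi M}$ (a standard result, not derived on this slide):

```python
import numpy as np

rng = np.random.default_rng(6)
M, tau = 1.0, 3.0  # the values used in the quiz below

# v_x, v_y, v_z ~ Normal(0, sqrt(tau/M))  =>  |v| is Maxwell-distributed
v = np.linalg.norm(rng.normal(0.0, np.sqrt(tau / M), size=(1_000_000, 3)),
                   axis=1)

print(v.mean(), np.sqrt(8 * tau / (np.pi * M)))  # both ~ 2.76
print(v.std())                                   # ~ 1.16, cf. the quiz below
```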

Summary of Moments.
Moments: $\mu_n = \langle x^n \rangle = \int x^n f(x)\,dx$; in particular $\mu_1 = \langle x \rangle$.
Centered moments: $\mu'_n = \langle (x - \langle x \rangle)^n \rangle = \int (x - \langle x \rangle)^n f(x)\,dx$; $\mu'_2 = \langle x^2 \rangle - \langle x \rangle^2 = \sigma^2$, $\mu'_3 = \langle (x - \langle x \rangle)^3 \rangle$, etc.
Cumulants: $\kappa_n$ = n-th cumulant; $\kappa_1 = \langle x \rangle$, $\kappa_2 = \langle x^2 \rangle - \langle x \rangle^2 = \sigma^2$, $\kappa_3 = \langle (x - \langle x \rangle)^3 \rangle$ with $\frac{\kappa_3}{\sigma^3}$ = skewness, $\kappa_4 = \langle (x - \langle x \rangle)^4 \rangle - 3\sigma^4$ with $\frac{\kappa_4}{\sigma^4}$ = kurtosis.

Normal distribution. $f(x)\,dx = \frac{1}{\sqrt{2\pi}\,\sigma}\, e^{-\frac{(x-\mu)^2}{2\sigma^2}}\,dx$. What is the mean µ? [Plot: normal PDF over -10..10, with candidate positions A, B, C marked.]

Normal distribution, same $f(x)$ as above. What is the mean µ? The mean sits at the peak, the center of symmetry of the curve. [Plot: as above, with the answer marked.]

Normal distribution, same $f(x)$ as above. What is the standard deviation? [Plot: normal PDF with candidate widths A, B, C marked.]

Normal distribution, same $f(x)$ as above. Note that A is the full width at half maximum: FWHM = $2\sqrt{2\ln 2}\,\sigma \approx 2.35\sigma$, so the standard deviation is noticeably smaller than the FWHM. [Plot: as above.]

Maxwell distribution (M = 1, τ = 3). Integrated over angles, $f(v)\,dv = 4\pi \left(\frac{M}{2\pi\tau}\right)^{3/2} v^2 \exp\left(-\frac{Mv^2}{2\tau}\right) dv$. What is the mean µ? [Plot: Maxwell speed PDF vs. v, with candidate positions A, B, C marked.]

Maxwell distribution (M = 1, τ = 3), same $f(v)$ as above. What is the mean? Analytically $\langle v \rangle = \sqrt{8\tau/\pi M} \approx 2.76$ for these values. [Plot: as above, with the answer marked.]

Maxwell distribution (M = 1, τ = 3), same $f(v)$ as above. What is the standard deviation? [Plot: Maxwell speed PDF with candidate widths A, B, C marked.]

Maxwell distribution (M = 1, τ = 3), same $f(v)$ as above. The standard deviation is C = 1.16 (analytically $\sigma_v = \sqrt{(3 - 8/\pi)\,\tau/M} \approx 1.16$). [Plot: as above.]

Probability, Moments 4: Independence/Correlation. Two variables are said to be independent if and only if $f(x, y)\,dx\,dy = g(x)\,h(y)\,dx\,dy$.
Example: the Maxwell distribution, $g(v_x, v_y, v_z)\,dv_x\,dv_y\,dv_z = \left(\frac{M}{2\pi\tau}\right)^{3/2} \exp\left(-\frac{M (v_x^2 + v_y^2 + v_z^2)}{2\tau}\right) dv_x\,dv_y\,dv_z$, factorizes => $v_x, v_y, v_z$ are independent.
Correlation, example: the correlated normal distribution $f(x, y)\,dx\,dy = \frac{1}{2\pi\sigma^2 \sqrt{1 - \rho^2}} \exp\left(-\frac{x^2 - 2\rho x y + y^2}{2\sigma^2 (1 - \rho^2)}\right) dx\,dy$.
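
A quick sketch generating the correlated normal pair above (taking σ = 1; the construction y = ρx + √(1-ρ²)·noise is a standard trick, not from the slide) and confirming that the measured correlation matches the ρ put in:

```python
import numpy as np

rng = np.random.default_rng(7)
rho, n = 0.7, 1_000_000

# (x, y) drawn from the bivariate normal above, with sigma = 1
x = rng.normal(size=n)
y = rho * x + np.sqrt(1 - rho**2) * rng.normal(size=n)

print(np.corrcoef(x, y)[0, 1])  # ~ 0.7 = rho
```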

Successive approximations in statistics.
Successive moments: macroscopic quantities are usually means; the root mean square dispersion describes the fluctuations around the mean. Note the scaling $\mu'_n(\lambda x) = \langle (\lambda x - \lambda \langle x \rangle)^n \rangle = \lambda^n \mu'_n(x)$.
Characteristic function and cumulants (optional). Fourier transform of $f(x)\,dx$: $\phi(t) = \int \exp(-itx)\,f(x)\,dx$. Expanding $\exp(-itx)$,
$\phi(t) = 1 - it\langle x \rangle + \frac{(-it)^2}{2!}\langle x^2 \rangle + \frac{(-it)^3}{3!}\langle x^3 \rangle + \dots$
If we take the log,
$\log \phi(t) = -it\,\kappa_1 + \frac{(-it)^2}{2!}\kappa_2 + \frac{(-it)^3}{3!}\kappa_3 + \dots$
with $\kappa_n$ = n-th cumulant: $\kappa_1 = \langle x \rangle$, $\kappa_2 = \langle x^2 \rangle - \langle x \rangle^2 = \sigma^2$, $\kappa_3 = \langle (x - \langle x \rangle)^3 \rangle$ ($\kappa_3/\sigma^3$ = skewness), $\kappa_4 = \langle (x - \langle x \rangle)^4 \rangle - 3\sigma^4$ ($\kappa_4/\sigma^4$ = kurtosis). For a Gaussian, $\kappa_n = 0$ for $n \geq 3$.

Sum of random variables (proofs optional).
Sum and scaling: $\langle y + z \rangle = \langle y \rangle + \langle z \rangle$; $\sigma_{y+z}^2 = \sigma_y^2 + \sigma_z^2 + 2\rho\,\sigma_y \sigma_z$; if independent, ρ = 0 and $\sigma_{y+z}^2 = \sigma_y^2 + \sigma_z^2$.
More generally, if x = y + z with y, z independent, y distributed as $f(y)\,dy$ and z distributed as $g(z)\,dz$, the distribution of x is
$h(x)\,dx = \left[\int f(y)\,g(x - y)\,dy\right] dx = (f * g)(x)\,dx$, a convolution.
Proof (optional): convolution in the original space is equivalent to a product in Fourier space. Hence for a sum the characteristic functions multiply and the logs of the characteristic functions add => the cumulants add!
Multiplying by a constant: $\langle (\lambda y)^n \rangle = \lambda^n \langle y^n \rangle$, $\sigma_{\lambda y}^2 = \lambda^2 \sigma_y^2$, $\kappa_n(\lambda y) = \lambda^n \kappa_n(y)$.
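
Both the convolution rule and the addition of variances can be checked with two independent uniform variables, whose sum has the well-known triangular density; a minimal sketch:

```python
import numpy as np

rng = np.random.default_rng(8)
n = 1_000_000
y = rng.uniform(0, 1, n)
z = rng.uniform(0, 1, n)
x = y + z  # sum of two independent variables

print(x.var(), y.var() + z.var())  # variances add: both ~ 1/6

# the density of x is the convolution f*g: a triangle peaking at x = 1
counts, edges = np.histogram(x, bins=20, range=(0, 2), density=True)
print(counts.round(2))
```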

Central Limit Theorem. Consider N independent random variables $x_i$ with $\langle x_i \rangle = \mu_i$ and $\sigma_{x_i}^2 = \sigma_i^2$, and consider the average $A = \frac{1}{N}\sum_{i=1}^N x_i$.
Theorem: asymptotically, the random variable A has a Gaussian distribution,
$f_N(A)\,dA \xrightarrow{N \to \infty} \frac{1}{\sqrt{2\pi}\,\sigma}\, e^{-\frac{(A-\mu)^2}{2\sigma^2}}\,dA$, with $\mu = \frac{1}{N}\sum \mu_i$, $\sigma^2 = \frac{1}{N^2}\sum \sigma_i^2$.
Proof (optional): look at the cumulants, which add; $\kappa_n \to 0$ as $1/N^{n-1}$.
Consequences:
- The initial distribution is not important.
- Experimental errors are Gaussian, provided they are the sum of a large number of small errors: $m = g(m_1, m_2, m_3, \dots) = g(\mathrm{true}) + \sum_i \frac{\partial g}{\partial m_i}\,\delta m_i$.
- Distributions become very narrow: if the $\sigma_i$ are bounded, $\sigma^2 \propto \frac{1}{N}$.
Simulation: http://cosmology.berkeley.edu/classes/s2009/phys112/centrallimit/MultipleCoinToss.html
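
The linked simulation is essentially the following: average N draws from a distinctly non-Gaussian (uniform) distribution and watch the mean and variance of A approach the CLT prediction µ = 1/2, σ² = (1/12)/N. A minimal version:

```python
import numpy as np

rng = np.random.default_rng(9)
for N in (1, 10, 100):
    # many realizations of A = (1/N) sum_{i=1}^{N} x_i, x_i ~ Uniform(0, 1)
    A = rng.uniform(0, 1, size=(100_000, N)).mean(axis=1)
    print(f"N={N:>3}  mean = {A.mean():.4f}  var = {A.var():.5f}"
          f"  CLT prediction = {1 / (12 * N):.5f}")
```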

This applies also to sums. Consider $B = \sum_{i=1}^N x_i$ with the $\{x_i\}$ independent, of mean $\mu_0$ and standard deviation $\sigma_0$. What is the asymptotic distribution of B?
A: $f_N(B)\,dB \xrightarrow{N \to \infty} \frac{1}{\sqrt{2\pi}\,\sigma}\, e^{-\frac{(B-\mu)^2}{2\sigma^2}}\,dB$ with $\mu = \mu_0 / N$, $\sigma^2 = \sigma_0^2 / N^2$
B: the same Gaussian form with $\mu = \mu_0$, $\sigma^2 = \sigma_0^2 / N$
C: the same Gaussian form with $\mu = N\mu_0$, $\sigma^2 = N\sigma_0^2$
D: No definite answer (B -> ± infinity!)

This applies also to sums. Answer C: $f_N(B)\,dB \xrightarrow{N \to \infty} \frac{1}{\sqrt{2\pi}\,\sigma}\, e^{-\frac{(B-\mu)^2}{2\sigma^2}}\,dB$ with $\mu = N\mu_0$, $\sigma^2 = N\sigma_0^2$.
B goes to ± infinity, but the relative width of its distribution decreases with N. Indeed, with $A = \frac{1}{N}\sum_{i=1}^N x_i$ and $B = NA$, the central limit theorem gives $f_N(A)\,dA \xrightarrow{N \to \infty} \frac{1}{\sqrt{2\pi}\,\sigma_1}\, e^{-\frac{(A-\mu_1)^2}{2\sigma_1^2}}\,dA$ with $\mu_1 = \mu_0$, $\sigma_1^2 = \sigma_0^2 / N$; scaling by N gives the result.