Kinetic Theory 1 / Probabilities
1. Motivations: statistical mechanics and fluctuations
2. Probabilities
3. Central limit theorem
The need for statistical mechanics
How to describe large systems
In this course we will make the connection between the macroscopic description of thermodynamics and the microphysics behind it. The prototype we will use is a volume of gas.
http://cosmology.berkeley.edu/classes/s2009/phys112/diffusion/GaussDiffusion.html
Our systems are large!
  Avogadro's number N_A = 6.02 × 10²³ per mole (= 22.4 liters of gas at STP, = 12 grams of carbon)
  Very large number of particles: ~10²⁸ in 500 m³
At finite temperature, the particles are not at rest:
  ⟨(1/2)mv²⟩ = (3/2)k_B T  =>  v of order a few hundred m/s for air molecules at T ≈ 300 K
  with k_B = 1.38 × 10⁻²³ J/K and m = 32 × 10⁻³ kg / 6.02 × 10²³ for O₂
Constant collisions: the velocity magnitude and direction change at each collision.
  e.g., gas in this room: mean free path λ = 1/(σn) ≈ 1/(πd²n) ≈ 0.1 µm
  with d = 3 × 10⁻¹⁰ m and n = 6.02 × 10²³ / 0.0224 m⁻³
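The orders of magnitude quoted above can be checked directly. A minimal numerical sketch using the slide's values (the variable names are ours):

```python
import math

# Values quoted on the slide (SI units)
k_B = 1.38e-23        # Boltzmann constant, J/K
N_A = 6.02e23         # Avogadro's number, 1/mol
T = 300.0             # room temperature, K
m = 32e-3 / N_A       # mass of one O2 molecule, kg
d = 3e-10             # molecular diameter, m
n = N_A / 0.0224      # number density of a gas at STP, 1/m^3

# Thermal speed from <(1/2) m v^2> = (3/2) k_B T
v_rms = math.sqrt(3 * k_B * T / m)

# Mean free path lambda = 1/(sigma n), with cross-section sigma ~ pi d^2
lam = 1.0 / (math.pi * d**2 * n)

print(f"v_rms ~ {v_rms:.0f} m/s")       # a few hundred m/s
print(f"lambda ~ {lam * 1e6:.2f} um")   # of order 0.1 micrometer
```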
Statistical Mechanics
Probabilistic description.
1 particle
  Classically: we have to track both position and velocity (or momentum). We introduce the probability distribution f(x, p) d³x d³p on phase space = position × momentum.
  Quantum: probability p_i for a single particle to be in state i.
N particles: in many cases a good approximation is to treat them as independent. We will call this an Ideal Gas.
http://cosmology.berkeley.edu/classes/s2009/phys112/diffusion/GaussDiffusion.html
In terms of formalism, this corresponds to taking the product of the single-particle probability distributions: ∏_i f(x_i, p_i) d³x_i d³p_i
In general there are correlations between particles => not independent. Examples: Van der Waals gas, liquids.
Thermodynamics: Macroscopic Quantities
Temperature
  A measure of the mean energy of the excitations of the system.
  e.g., in a classical gas, excitation = random velocity of the particles: ⟨(1/2)mv²⟩ = (3/2)k_B T
  We do not care about coherent motion, e.g., the bulk velocity of the gas.
  Many more degrees of freedom: spins, rotation, vibration, broken Cooper pairs...
  We are only interested in what changes when we heat up the system.
  => We will have to understand how to calculate the mean energy ⟨ε⟩ = ∫_V ε(p) f(x, p) d³x d³p
Pressure
What is pressure? What is the most accurate answer?
A: A measure of the kicks that particles give each other during collisions
B: NkT/V
C: The average force per unit area on the walls of the container
D: A measure of the energy density
E: The force per unit area on the walls of the container
Pressure
C/D are the closest answers.
B: a formula of limited applicability.
A <-> D: related to the microphysics of collisions.
C is better than E because of the fluctuations.
We will show that quite generally:
  non-relativistic: u = (3/2)(N/V)k_B T ≡ (3/2)(N/V)τ and P = (N/V)τ
    = same pressure as the thermodynamic definition P = T(∂S/∂V)_{U,N}
  ultra-relativistic (ε = pc): P = (1/3)u
Macroscopic Quantities 2
Pressure: 2 possible definitions
1) The simplest: the average force per unit area on the walls of the container.
http://cosmology.berkeley.edu/classes/s2009/phys112/idealgas/idealgas.html
2) A measure of the kicks that particles give each other per unit time = related to the mean energy density:
  non-relativistic: u = (3/2)(N/V)k_B T ≡ (3/2)(N/V)τ and P = (N/V)τ
  ultra-relativistic (ε = pc): P = (1/3)u
Macroscopic Quantities
Thermodynamic identity
At equilibrium: dU = TdS − pdV + µdN = τdσ − pdV + µdN
with U = energy of the system, V = volume, N = number of particles, τ ≡ k_B T = temperature, σ ≡ S/k_B = entropy, and µ = chemical potential.
Entropy
A measure of the disorder. We will define it as σ = −Σ_i p_i log p_i, with p_i = probability of state i.
Note that this requires discrete states.
Measures how unequal the probabilities are.
Maximum when all probabilities are equal: p_i = 1/g => σ = log g, with g = number of states in the system.
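The claim that σ = −Σ p_i log p_i is maximal for equal probabilities, where it equals log g, is easy to check numerically (a small sketch; the example probabilities are ours):

```python
import math

def entropy(probs):
    """sigma = -sum_i p_i log p_i (natural log); zero-probability states contribute nothing."""
    return -sum(p * math.log(p) for p in probs if p > 0)

g = 4
uniform = [1.0 / g] * g            # all g states equally probable: p_i = 1/g
unequal = [0.7, 0.1, 0.1, 0.1]     # same number of states, unequal probabilities

s_max = entropy(uniform)
s_less = entropy(unequal)
print(s_max, math.log(g))   # the uniform case gives sigma = log g
print(s_less)               # any unequal distribution gives a smaller sigma
```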
How does a system reach equilibrium?
Diffusion
Consider the gas in this room. Introduce a puff of CO gas in the center of the room: it diffuses out <= collisions.
The density n(x), starting from the blue curve, evolves towards a uniform distribution.
http://cosmology.berkeley.edu/classes/s2009/phys112/diffusion/GaussDiffusion.html
The same is true in probability space: an isolated system tends to evolve towards maximum flatness.
All states equally probable! Maximum entropy!
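The diffusion of the puff can be mimicked by a simple random walk, in the spirit of the GaussDiffusion applet linked above (a toy sketch, not the applet's code; step length and counts are ours):

```python
import random
import statistics

random.seed(1)

# Toy 1-D diffusion: every particle of the "puff" starts at x = 0 and takes
# one +1/-1 step per collision (step length ~ one mean free path).
n_particles, n_steps = 5000, 400
positions = [0] * n_particles
for _ in range(n_steps):
    positions = [x + random.choice((-1, 1)) for x in positions]

# The cloud stays centred on the release point but spreads diffusively,
# with a width growing like sqrt(number of steps).
mean = statistics.fmean(positions)
spread = statistics.pstdev(positions)
print(mean, spread)   # mean ~ 0, spread ~ sqrt(400) = 20
```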
Fluctuations
Because of the finite number of particles, there are fluctuations.
Example: pressure of an ideal gas
http://cosmology.berkeley.edu/classes/s2009/phys112/idealgas/idealgas.html
Note that when the number of particles increases, the fluctuations become relatively smaller: ΔP/P ∝ 1/√N (central limit theorem)
=> Thermodynamics is a good approximation.
Probabilities
See e.g.,
http://en.wikipedia.org/wiki/probability
http://mathworld.wolfram.com/probability.html
http://wiki.stat.ucla.edu/socr/index.php/ebook
Probability, Moments
Probabilistic event: occurs in an "uncontrolled" way; described by a random variable (a parameter is fixed, even if unknown).
1) Discrete case: consider bins
n boxes labelled s (e.g., roulette). "Drawing" of N objects: N_s "land" in box s.
Probability = measure on sets, defined as the limit of the ratio of the number of outcomes s to the total number of draws:
  P(s) = lim_{N→∞} N_s / N
Normalized: Σ_s P(s) = 1
An important example, histograms: make boxes of width Δx_i; as the total number of events N → ∞, N_i / N → P_i.
See e.g.: http://cosmology.berkeley.edu/classes/s2009/phys112/diffusion/GaussDiffusion.html
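The frequency definition P(s) = lim N_s/N can be illustrated with a few lines of code (a sketch; the 6-box "roulette" is our example):

```python
import random
from collections import Counter

random.seed(0)

# A 6-box "roulette": estimate P(s) = N_s / N and watch the estimate
# converge to the true value 1/6 as the number of draws N grows.
for N in (100, 10_000, 1_000_000):
    counts = Counter(random.randrange(6) for _ in range(N))
    est = counts[0] / N
    print(N, est)   # estimates of P(0); true value 1/6 ~ 0.1667

# Normalization: the box counts always sum to the total number of draws.
assert sum(counts.values()) == N
```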
Probability
Overlapping possible outcomes, with A = {s_i} and B = {s_j}:
OR: P(A ∪ B) = P(A) + P(B) − P(A ∩ B), where P(A) = Σ_i P(s_i) and P(B) = Σ_j P(s_j)
AND: P(A ∩ B) = Σ_{common s_i} P(s_i) = 0 if there is no common region of outcomes
Combined with positivity and normalization => measure on sets.
Probability, Moments 2
Moments
Consider a real function y(s) defined on the random outcomes s.
Mean = expectation value ("experimental average"): ⟨y⟩ = Σ_s y(s) P(s)
Variance: σ_y² = ⟨(y − ⟨y⟩)²⟩ = ⟨y²⟩ − ⟨y⟩²
Root mean square (r.m.s.) σ_y = standard deviation: a measure of the width of the distribution.
Independence
Consider several random variables x, y (in general they have some causal connection): P(x, y) = P(x) P(y/x) = P(y) P(x/y)
Independence if and only if P(x, y) = P(x) P(y) <=> P(x/y) = P(x) <=> P(y/x) = P(y)
Correlation: ρ_xy = ⟨(x − ⟨x⟩)(y − ⟨y⟩)⟩ / (σ_x σ_y) = 0 if independent (but in general not the reverse!)
Discrete Probabilities
2 important examples
Poisson distribution
Applies when we count the number of independent random events in a drawing or an experiment (e.g., the number of events in a given time interval, when the events are independent and occur at a constant rate, for example radioactive decays).
The probability of obtaining exactly n events when we expect m is
  Prob(n) = m^n e^{−m} / n!   with µ = m, σ² = m  => statistical fluctuations
Binomial distribution
Applies when there are two possible outcomes 1 and 2, with probabilities p and 1 − p (e.g., the number of events in a histogram bin: the event falls either in the bin or outside; or, for spin-1/2 particles, the number of spins up).
The probability of obtaining exactly n outcomes 1 in N trials (N = total number of events) is
  Prob(n/N) = N! / [n!(N − n)!] p^n (1 − p)^{N−n}   with µ = Np, σ² = Np(1 − p)  => statistical fluctuations
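The stated means and variances of both distributions can be verified by sampling (a sketch; N, p, m and the Knuth sampling trick are our illustrative choices, not part of the slides):

```python
import math
import random
import statistics

random.seed(2)

# Binomial: count "outcome 1" in N trials with probability p each.
N, p, draws = 100, 0.3, 20_000
binom = [sum(random.random() < p for _ in range(N)) for _ in range(draws)]
b_mean = statistics.fmean(binom)
b_var = statistics.pvariance(binom)
print(b_mean, b_var)   # ~ Np = 30 and Np(1-p) = 21

def poisson(m):
    """Knuth's method: count uniform factors until their product drops below e^{-m}."""
    threshold, k, prod = math.exp(-m), 0, random.random()
    while prod > threshold:
        k += 1
        prod *= random.random()
    return k

pois = [poisson(5.0) for _ in range(20_000)]
p_mean = statistics.fmean(pois)
p_var = statistics.pvariance(pois)
print(p_mean, p_var)   # mean = variance = m = 5 for a Poisson distribution
```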
Probability, Moments 3
2) Continuous case
Infinitesimal box dx: P(dx) = f(x)dx, normalized so that ∫ f(x)dx = 1
  ⟨y⟩ = ∫ y(x) f(x)dx,  ⟨y²⟩ = ∫ y²(x) f(x)dx,  σ_y² = ∫ (y(x) − ⟨y⟩)² f(x)dx
Examples:
1) Path length of a particle with mean free path λ: Prob(interaction between l and l + dl) = f(l)dl = (1/λ) exp(−l/λ) dl
2) Gaussian (= normal): f(x)dx = [1/(√(2π)σ)] exp(−(x − µ)²/(2σ²)) dx; µ and σ² are indeed the mean and variance.
3) Maxwell: ideal gas => the probability for a given particle to have velocity v in the solid-angle cone dΩ is
  f(v) dv dΩ = (M/2πτ)^{3/2} exp(−Mv²/2τ) v² dv dΩ  with dΩ = d cosθ dφ
This implicitly assumes polar coordinates: v_x = v sinθ cosφ, v_y = v sinθ sinφ, v_z = v cosθ
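Example 1 (the exponential path-length law) is easy to sample by inverse transform, confirming that λ really is the mean free path (a sketch; the value of λ is illustrative):

```python
import math
import random
import statistics

random.seed(3)
lam = 0.1   # mean free path (illustrative value, in micrometres)

# Inverse-transform sampling of f(l) dl = (1/lambda) exp(-l/lambda) dl:
# if u is uniform on (0, 1), then l = -lambda ln(u) follows the path-length law.
paths = [-lam * math.log(random.random()) for _ in range(100_000)]

mean_l = statistics.fmean(paths)
std_l = statistics.pstdev(paths)
print(mean_l, std_l)   # both ~ lambda: the exponential has std = mean
```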
Summary of Moments
Moments: µ_n = ⟨x^n⟩ = ∫ x^n f(x)dx; in particular µ_1 = ⟨x⟩
Centered moments: µ'_n = ⟨(x − ⟨x⟩)^n⟩ = ∫ (x − ⟨x⟩)^n f(x)dx
  µ'_2 = ⟨x²⟩ − ⟨x⟩² = σ², µ'_3 = ⟨(x − ⟨x⟩)³⟩, etc.
Cumulants: κ_i = i-th cumulant
  κ_1 = ⟨x⟩, κ_2 = ⟨x²⟩ − ⟨x⟩² = σ², κ_3 = ⟨(x − ⟨x⟩)³⟩, κ_4 = ⟨(x − ⟨x⟩)⁴⟩ − 3σ⁴
  skewness = κ_3 / σ³, kurtosis = κ_4 / σ⁴
Probability, Moments 4
Independence/Correlation
Two variables are said to be independent if and only if: f(x, y)dxdy = g(x)h(y)dxdy
Example: Maxwell distribution
  g(v_x, v_y, v_z) dv_x dv_y dv_z = (M/2πτ)^{3/2} exp(−M(v_x² + v_y² + v_z²)/2τ) dv_x dv_y dv_z
  => v_x, v_y, v_z are independent
Correlation example: bivariate normal distribution with correlation ρ
  f(x, y)dxdy = [1/(2πσ²√(1 − ρ²))] exp(−(x² − 2ρxy + y²)/(2σ²(1 − ρ²))) dxdy
Successive approximation in Statistics
Successive moments
  Macroscopic quantities are usually means.
  Root mean square dispersion: fluctuations around the mean.
  Note the scaling of the centered moments: µ_n(λx) = ⟨(λx − λ⟨x⟩)^n⟩ = λ^n µ_n(x)
Characteristic function and cumulants (optional)
  Fourier transform: φ(t) = ⟨exp(itx)⟩ = ∫ exp(itx) f(x)dx
  Expanding: φ(t) = 1 + it⟨x⟩ + (it)²/2! ⟨x²⟩ + (it)³/3! ⟨x³⟩ + ...
  If we take the log: log φ(t) = itκ_1 + (it)²/2! κ_2 + (it)³/3! κ_3 + ...  with κ_i = i-th cumulant
  κ_1 = ⟨x⟩, κ_2 = ⟨x²⟩ − ⟨x⟩² = σ², κ_3 = ⟨(x − ⟨x⟩)³⟩, κ_4 = ⟨(x − ⟨x⟩)⁴⟩ − 3σ⁴
  skewness = κ_3/σ³, kurtosis = κ_4/σ⁴
  For a Gaussian, κ_n = 0 for n ≥ 3.
Sum and Scaling
Sum of random variables
  ⟨y + z⟩ = ⟨y⟩ + ⟨z⟩
  σ_{y+z}² = σ_y² + σ_z² + 2ρσ_yσ_z; if independent (ρ = 0): σ_{y+z}² = σ_y² + σ_z²
More generally, if x = y + z with y, z independent, y distributed as f(y)dy and z distributed as g(z)dz, the distribution of x is the convolution:
  h(x)dx = dx ∫ f(y) g(x − y) dy = (f * g)(x) dx
Proof (optional): convolution in the original space is equivalent to a product in Fourier space. Hence for a sum the characteristic functions multiply, the logs of the characteristic functions add, and therefore the cumulants add!
Multiplying by a constant: ⟨(λy)^n⟩ = λ^n⟨y^n⟩, σ_{λy}² = λ²σ_y², κ_n(λy) = λ^n κ_n(y)
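The rule that variances of independent variables add can be checked by sampling; the sum of two uniforms is distributed as their convolution (a sketch; the uniform example is ours):

```python
import random
import statistics

random.seed(4)

# y and z independent, each uniform on (0, 1) with variance 1/12.
n = 100_000
y = [random.random() for _ in range(n)]
z = [random.random() for _ in range(n)]
x = [a + b for a, b in zip(y, z)]   # x = y + z, distributed as the convolution f*g

var_x = statistics.pvariance(x)
print(var_x, 1/12 + 1/12)   # independent => variances add: var(x) = 1/6
```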
Central Limit Theorem
Consider N independent random variables x_i with ⟨x_i⟩ = µ_i and σ_{x_i}² = σ_i².
Consider the average A = (1/N) Σ_{i=1}^N x_i.
Theorem: asymptotically (N → ∞), the random variable A has a Gaussian distribution
  f_N(A)dA → [1/(√(2π)σ)] exp(−(A − µ)²/(2σ²)) dA  with µ = (1/N) Σ µ_i and σ² = (1/N²) Σ σ_i²
Proof (optional): look at the cumulants, which add; for the average, κ_n → 0 as 1/N^{n−1}.
Consequences
  The initial distribution is not important.
  Experimental errors are Gaussian, provided they are the sum of a large number of small errors: m = g(m_1, m_2, m_3, ...) = g(true) + Σ_i (∂g/∂m_i) δm_i
  Distributions become very narrow: if the σ_i are bounded, σ² ∝ 1/N.
Simulation: http://cosmology.berkeley.edu/classes/s2009/phys112/centrallimit/MultipleCoinToss.html
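The narrowing predicted by the theorem can be reproduced with coin tosses, in the spirit of the MultipleCoinToss applet linked above (a sketch, not the applet's code; sample sizes are ours):

```python
import math
import random
import statistics

random.seed(5)

# Average A of N coin tosses (x_i = 0 or 1): the CLT predicts a narrow,
# nearly Gaussian distribution with sigma = sqrt(p(1-p)/N) = 1/(2 sqrt(N)).
def average_of_tosses(N):
    return sum(random.randint(0, 1) for _ in range(N)) / N

for N in (10, 100, 1000):
    samples = [average_of_tosses(N) for _ in range(2000)]
    sd = statistics.pstdev(samples)
    print(N, sd, 0.5 / math.sqrt(N))   # measured spread vs CLT prediction
```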
Consequences for Thermodynamics
Because of the large number of particles, averages are very peaked around the mean.
The system is well characterized by the mean = macroscopic quantity.
Fluctuations are very small.
Not a lot of difference between the most probable value, the mean value, and the experimental value.