Kinetic Theory 1 / Probabilities


Kinetic Theory 1 / Probabilities
1. Motivations: statistical mechanics and fluctuations
2. Probabilities
3. Central limit theorem

The need for statistical mechanics

How to describe large systems

In this course we will try to make the connection between the macroscopic description of thermodynamics and the microphysics behind it. The prototype that we will use is a volume of gas.
http://cosmology.berkeley.edu/classes/s2009/phys112/diffusion/GaussDiffusion.html

Our systems are large! Avogadro's number N = 6.02x10^23 per mole (= 22.4 liters of gas, = 12 grams of carbon) means a very large number of particles: ~10^28 in 500 m^3.

At finite temperature the particles are not at rest: e.g. velocities of order 300 m/s for air molecules at 300 K, from

    (1/2) m <v^2> = (3/2) k_B T

with T ~ 300 K, k_B = 1.38x10^-23 J/K and m = 32x10^-3 / 6.02x10^23 kg.

There are constant collisions: the velocity magnitude and direction change at each one. E.g. for the gas in a room, the mean free path is λ ~ 0.1 µm:

    λ = 1/(σ n) ~ 1/(π d^2 n)

with d = 3x10^-10 m and n = 6.02x10^23 / 0.0224 m^-3.
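These orders of magnitude are easy to check numerically. A minimal Python sketch, plugging in the slide's values for an O2-like air molecule (the variable names are my own):

```python
import math

k_B = 1.38e-23            # Boltzmann constant, J/K
T = 300.0                 # K
m = 32e-3 / 6.02e23       # kg per molecule (32 g/mol, O2-like)
d = 3e-10                 # m, molecular diameter
n = 6.02e23 / 0.0224      # m^-3, number density at standard conditions

# (1/2) m <v^2> = (3/2) k_B T  =>  rms speed
v_rms = math.sqrt(3 * k_B * T / m)

# mean free path: lambda = 1 / (pi d^2 n)
mean_free_path = 1.0 / (math.pi * d**2 * n)

print(f"v_rms ~ {v_rms:.0f} m/s")                         # a few hundred m/s
print(f"mean free path ~ {mean_free_path * 1e6:.2f} um")  # ~0.1 um
```

Running this gives a few hundred m/s and about a tenth of a micron, consistent with the numbers quoted above.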

Statistical Mechanics: Probabilistic description

1 particle. Classically, we have to track both position and velocity (or momentum). We will introduce

    f(x, p) d^3x d^3p

a probability distribution on phase space = position x momentum. Quantum mechanically: probability p_i of a single particle to be in state i.
http://cosmology.berkeley.edu/classes/s2009/phys112/diffusion/GaussDiffusion.html

N particles. In many cases, a good approximation is to treat them as independent; we will call this an Ideal Gas. In terms of formalism, this corresponds to taking the product of the single-particle probability distributions:

    Prod_i f(x_i, p_i) d^3x_i d^3p_i

In general, there are correlations between particles => not independent. Examples: Van der Waals gas, liquids.

Thermodynamics: Macroscopic Quantities

Temperature: a measure of the mean energy of excitations of the system. E.g., in a classical gas the excitation is the random velocity of the particles:

    (1/2) m <v^2> = (3/2) k_B T

We do not care about coherent motion, e.g., gas bulk velocity. There are many more degrees of freedom: spins, rotation, vibration, broken Cooper pairs... We are only interested in what changes when we heat up the system. => We will have to understand how to calculate the mean energy

    <ε> = ∫_V ε(p) f(x, p) d^3x d^3p

Pressure

What is pressure? What is the most accurate answer?
A: A measure of the kicks that particles give each other during collisions
B: NkT/V
C: The average force per unit area on the walls of the container
D: A measure of the energy density
E: The force per unit area on the walls of the container

Pressure

C/D are the closest answers. B is a formula of limited applicability. A and D are related to the microphysics of collisions. C is better than E because of the fluctuations.

We will show quite generally that, non-relativistically,

    u = (3/2) (N/V) k_B T ≡ (3/2) (N/V) τ   and   P = (N/V) τ

the same pressure as the thermodynamic definition P = τ (∂σ/∂V)_{U,N}. For an ultra-relativistic gas (for a single particle pv = ε):

    P = (1/3) u

Macroscopic quantities 2

Pressure: 2 possible definitions.
1) The simplest: the average force per unit area on the walls of the container.
http://cosmology.berkeley.edu/classes/s2009/phys112/idealgas/idealgas.html
2) A measure of the kicks that particles give each other per unit time, related to the mean energy density:

    non-relativistic: u = (3/2) (N/V) k_B T ≡ (3/2) (N/V) τ,   P = (N/V) τ
    ultra-relativistic (pv = ε): P = (1/3) u
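As a sanity check on the non-relativistic formulas, a short sketch evaluating u and P for air at standard conditions (the number density is taken from the earlier slide):

```python
k_B = 1.38e-23             # J/K
T = 300.0                  # K
n = 6.02e23 / 0.0224       # particles per m^3 at standard conditions

P = n * k_B * T            # P = (N/V) k_B T = (N/V) tau
u = 1.5 * n * k_B * T      # u = (3/2)(N/V) k_B T, mean kinetic energy density

print(f"P = {P:.3g} Pa")   # about atmospheric pressure, ~1.1e5 Pa
```

Note that u = (3/2) P by construction: the energy density and the pressure carry the same information for an ideal gas.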

Macroscopic quantities

Thermodynamic identity: at equilibrium

    dU = T dS - p dV + µ dN = τ dσ - p dV + µ dN

with U = energy of the system, V = volume, N = number of particles, τ ≡ k_B T = temperature, σ ≡ S/k_B = entropy, and µ = chemical potential.

Entropy: a measure of the disorder. We will define it as

    σ = -∑_i p_i log p_i

where p_i is the probability of state i. Note that this requires discrete states. It measures how unequal the probabilities are, and is maximum when all probabilities are equal: if p_i = 1/g for each of the g states in the system, σ = log g.
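The claim that σ = -∑ p_i log p_i is maximized by equal probabilities can be illustrated directly (a minimal sketch; the example probabilities are my own arbitrary choice):

```python
import math

def entropy(ps):
    """sigma = -sum_i p_i log(p_i), natural log, skipping zero-probability states."""
    return -sum(p * math.log(p) for p in ps if p > 0)

g = 4
uniform = [1.0 / g] * g             # all states equally probable
peaked = [0.97, 0.01, 0.01, 0.01]   # very unequal probabilities

s_uniform = entropy(uniform)        # equals log(g) for equal probabilities
s_peaked = entropy(peaked)
print(s_uniform, s_peaked)
```

The uniform case gives exactly log 4, while any peaked distribution over the same states gives less.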

How does a system reach equilibrium?

Diffusion: consider the gas in this room. Introduce a puff of CO gas in the center of the room; it diffuses out (<= collisions), the density n(x) starting from a peaked curve and evolving towards a uniform distribution.
http://cosmology.berkeley.edu/classes/s2009/phys112/diffusion/GaussDiffusion.html

The same is true in probability space: an isolated system tends to evolve towards maximum flatness, all states equally probable: maximum entropy!

Fluctuations

Because of the finite number of particles, there are fluctuations. Example: pressure of an ideal gas.
http://cosmology.berkeley.edu/classes/s2009/phys112/idealgas/idealgas.html

Note that when the number of particles increases, the fluctuations become relatively smaller:

    ΔP/P ∝ 1/√N   (central limit theorem)

=> Thermodynamics is a good approximation.
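The 1/√N scaling of relative fluctuations can be seen in a toy simulation, summing N random "kicks" many times and comparing the relative spread for two values of N (this setup is illustrative, not a real pressure calculation):

```python
import random
random.seed(0)

def relative_fluctuation(n_particles, trials=500):
    """Relative r.m.s. fluctuation of the sum of n_particles random 'kicks'."""
    sums = [sum(random.random() for _ in range(n_particles)) for _ in range(trials)]
    mean = sum(sums) / trials
    var = sum((s - mean) ** 2 for s in sums) / trials
    return var ** 0.5 / mean

r_small = relative_fluctuation(100)
r_large = relative_fluctuation(2500)
print(r_small / r_large)   # ~ sqrt(2500 / 100) = 5
```

Increasing N by a factor of 25 shrinks the relative fluctuation by about a factor of 5, as the 1/√N law predicts.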

Probabilities

See e.g.:
http://en.wikipedia.org/wiki/probability
http://mathworld.wolfram.com/probability.html
http://wiki.stat.ucla.edu/socr/index.php/ebook

Probability, Moments

Probabilistic event: occurs in an "uncontrolled" way, described by a random variable; a parameter is fixed (even if unknown).

1) Discrete case: consider bins, n boxes labelled s (e.g. roulette). In a "drawing" of N objects, N_s "land" in box s. Probability is a measure on sets, defined by the limit of the ratio of the number of outcomes s to the total number of draws:

    P(s) = lim_{N→∞} N_s/N ≥ 0

Normalized: ∑_s P(s) = 1.

An important example, histograms: make boxes of width Δx_i; as the total number of events N → ∞, N_i/N → P_i.
See e.g.: http://cosmology.berkeley.edu/classes/s2009/phys112/diffusion/GaussDiffusion.html
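The frequency definition P(s) = lim N_s/N can be demonstrated with a simulated die, where every box should converge to P(s) = 1/6 (a quick sketch):

```python
import random
random.seed(1)

N = 60_000
counts = {s: 0 for s in range(1, 7)}    # six boxes, like a die
for _ in range(N):
    counts[random.randint(1, 6)] += 1   # one object "lands" in box s

freqs = {s: counts[s] / N for s in counts}  # N_s / N -> P(s) = 1/6
total = sum(freqs.values())                 # normalization: must sum to 1
```

With 60,000 draws each frequency sits within about a percent of 1/6, and the frequencies sum to 1 exactly (normalization).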

Probability

Overlapping possible outcomes.

OR:  P(A ∪ B) = P(A) + P(B) - P(A ∩ B), with A = {s_i}, P(A) = ∑_i P(s_i), and B = {s_j}, P(B) = ∑_j P(s_j).

AND: P(A ∩ B) = ∑_{common i=j} P(s_i) = 0 if there is no common region of outcomes.

Combined with positivity and normalization => a measure on sets.

Probability, Moments 2

Moments: consider a real function y(s) defined on the random sets s.

Mean = expectation value ("experimental average"):

    <y> = ∑_s y(s) P(s)

Variance:

    σ_y^2 = <(y - <y>)^2> = <y^2> - <y>^2

σ_y = root mean square (r.m.s.) = standard deviation, a measurement of the width of the distribution.

Independence: consider several random variables x, y (in general they have some causal connection):

    P(x, y) = P(x) P(y/x) = P(y) P(x/y)

They are independent if and only if

    P(x, y) = P(x) P(y)  <=>  P(x/y) = P(x)  <=>  P(y/x) = P(y)

Correlation:

    ρ_xy = <(x - <x>)(y - <y>)> / (σ_x σ_y)

ρ_xy = 0 if independent (but in general not the reverse!).

Discrete Probabilities

Two important examples.

Poisson distribution: applies to the case where we count the number of independent random events in a drawing or an experiment (e.g. the number of events in a given time interval, when the events are independent and occur at a constant rate, for example radioactive decays). The probability of obtaining exactly n events when we expect m is

    Prob(n) = m^n e^{-m} / n!,   with µ = m, σ^2 = m

Binomial distribution: applies to the case where there are two possible outcomes 1 and 2, of probabilities p and 1-p (e.g. the number of events in a histogram bin: either the event falls in the bin or outside; or, for spin-1/2 particles, the number of spins up). The probability of obtaining exactly n outcomes 1 in N trials (N = total number of events) is

    Prob(n/N) = [N! / (n! (N-n)!)] p^n (1-p)^{N-n},   with µ = Np, σ^2 = Np(1-p)

These distributions set the size of the statistical fluctuations.
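The quoted means and variances can be verified directly from the two formulas (a sketch; m, N and p are arbitrary example values, and the Poisson sum is truncated where the terms are negligible):

```python
import math

def poisson(n, m):
    """Prob(n) = m^n e^-m / n!"""
    return m**n * math.exp(-m) / math.factorial(n)

def binomial(n, N, p):
    """Prob(n/N) = N!/(n!(N-n)!) p^n (1-p)^(N-n)"""
    return math.comb(N, n) * p**n * (1 - p)**(N - n)

m = 4.0
mean_p = sum(n * poisson(n, m) for n in range(150))
var_p = sum((n - mean_p)**2 * poisson(n, m) for n in range(150))

N, p = 50, 0.3
mean_b = sum(n * binomial(n, N, p) for n in range(N + 1))
var_b = sum((n - mean_b)**2 * binomial(n, N, p) for n in range(N + 1))
```

The sums reproduce µ = m, σ^2 = m for the Poisson case and µ = Np = 15, σ^2 = Np(1-p) = 10.5 for the binomial case, to floating-point accuracy.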

Probability, Moments 3

2) Continuous case: infinitesimal box dx,

    P(dx) = f(x) dx,   with ∫ f(x) dx = 1
    <y> = ∫ y(x) f(x) dx,   <y^2> = ∫ y^2(x) f(x) dx,   σ_y^2 = ∫ (y(x) - <y>)^2 f(x) dx

Examples:
1) Path length of a particle with mean free path λ. Probability of interaction between l and l + dl:

    f(l) dl = (1/λ) exp(-l/λ) dl

2) Gaussian (= Normal):

    f(x) dx = [1/(√(2π) σ)] exp(-(x - µ)^2 / (2σ^2)) dx

µ and σ^2 are indeed the mean and variance.
3) Maxwell: ideal gas => the probability for a given particle to have velocity v in the solid-angle cone dΩ is

    f(v) dv dΩ = (M/(2πτ))^{3/2} exp(-Mv^2/(2τ)) v^2 dv dΩ,   with dΩ = d cosθ dφ

This implicitly assumes polar coordinates: v_x = v sinθ cosφ, v_y = v sinθ sinφ, v_z = v cosθ.
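For example 1, inverse-transform sampling reproduces the exponential path-length distribution, whose mean is λ and variance λ^2 (a sketch with an arbitrary λ):

```python
import math
import random
random.seed(2)

lam = 0.1                     # mean free path, arbitrary units
n_samples = 200_000

# inverse-transform sampling of f(l) dl = (1/lam) exp(-l/lam) dl:
# if u is uniform on [0,1), then l = -lam log(1 - u) is exponential
paths = [-lam * math.log(1.0 - random.random()) for _ in range(n_samples)]

mean_l = sum(paths) / n_samples
var_l = sum((l - mean_l)**2 for l in paths) / n_samples
print(mean_l, var_l)   # -> lam and lam^2
```

With 200,000 samples both moments match the analytic values to a fraction of a percent.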

Summary of Moments

Moments: µ_i = i-th moment, µ_i = ∫ x^i f(x) dx. In particular µ_1 = <x>, µ_2 = <x^2>.

Centered moments: µ'_i = i-th centered moment, µ'_i = ∫ (x - <x>)^i f(x) dx. In particular µ'_2 = <x^2> - <x>^2 = σ^2, µ'_3 = ∫ (x - <x>)^3 f(x) dx, etc.

Cumulants: κ_i = i-th cumulant:

    κ_1 = <x>
    κ_2 = <x^2> - <x>^2 = σ^2
    κ_3 = <(x - <x>)^3>,   κ_3 / σ^3 = κ_3 / κ_2^{3/2} = skewness
    κ_4 = <(x - <x>)^4> - 3σ^4,   κ_4 / σ^4 = kurtosis
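The skewness measures asymmetry: computed exactly here for a small, deliberately asymmetric discrete distribution (the example values are my own):

```python
def central_moment(xs, ps, k):
    """k-th centered moment of a discrete distribution {x_i: p_i}."""
    mean = sum(x * p for x, p in zip(xs, ps))
    return sum((x - mean)**k * p for x, p in zip(xs, ps))

# a deliberately asymmetric distribution: long tail to the right
xs = [0, 1, 2, 5]
ps = [0.4, 0.3, 0.2, 0.1]

var = central_moment(xs, ps, 2)
skew = central_moment(xs, ps, 3) / var**1.5       # kappa_3 / sigma^3
kurt = central_moment(xs, ps, 4) / var**2 - 3.0   # kappa_4 / sigma^4

# a symmetric distribution has zero third centered moment, hence zero skewness
skew_sym = central_moment([-1, 0, 1], [0.25, 0.5, 0.25], 3)
```

The right-tailed distribution has positive skewness, while the symmetric one has exactly zero.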

Probability, Moments 4

Independence / Correlation. Two variables are said to be independent if and only if

    f(x, y) dx dy = g(x) h(y) dx dy

Example: Maxwell distribution,

    g(v_x, v_y, v_z) dv_x dv_y dv_z = (M/(2πτ))^{3/2} exp(-M(v_x^2 + v_y^2 + v_z^2)/(2τ)) dv_x dv_y dv_z

=> v_x, v_y, v_z are independent.

Correlation, example: the bivariate normal distribution

    f(x, y) dx dy = [1/(2πσ^2 √(1 - ρ^2))] exp(-(x^2 - 2ρxy + y^2)/(2σ^2 (1 - ρ^2))) dx dy
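Because the Maxwell distribution factorizes, the velocity components are independent. Sampling each component as an independent Gaussian indeed gives negligible correlation, while a deliberately constructed correlated pair does not (a sketch, with τ/M set to 1):

```python
import math
import random
random.seed(3)

def corr(xs, ys):
    """Sample correlation coefficient rho_xy = cov(x,y) / (sigma_x sigma_y)."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / n
    sx = math.sqrt(sum((x - mx)**2 for x in xs) / n)
    sy = math.sqrt(sum((y - my)**2 for y in ys) / n)
    return cov / (sx * sy)

n = 100_000
vx = [random.gauss(0, 1) for _ in range(n)]   # independent Gaussian components
vy = [random.gauss(0, 1) for _ in range(n)]
rho_indep = corr(vx, vy)                      # ~ 0

rho = 0.8                                     # build a correlated partner
y_corr = [rho * x + math.sqrt(1 - rho**2) * random.gauss(0, 1) for x in vx]
rho_corr = corr(vx, y_corr)                   # ~ 0.8
```

The construction in the last lines is the standard way to sample a bivariate normal with a prescribed ρ from two independent Gaussians.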

Successive approximation in Statistics

Successive moments: macroscopic quantities are usually means; the root mean square dispersion describes the fluctuations around the mean. Note the scaling µ_n(λx) = <(λx - λ<x>)^n> = λ^n µ_n(x).

Characteristic function and cumulants (optional): the Fourier transform of the distribution,

    φ(t) = ∫ exp(itx) f(x) dx

Expanding exp(itx):

    φ(t) = 1 + it <x> + [(it)^2/2!] <x^2> + [(it)^3/3!] <x^3> + ...

If we take the log:

    log φ(t) = it κ_1 + [(it)^2/2!] κ_2 + [(it)^3/3!] κ_3 + ...

with κ_i = i-th cumulant: κ_1 = <x>, κ_2 = <x^2> - <x>^2 = σ^2, κ_3 = <(x - <x>)^3> (κ_3/σ^3 = skewness), κ_4 = <(x - <x>)^4> - 3σ^4 (κ_4/σ^4 = kurtosis). For a Gaussian, κ_n = 0 for n ≥ 3.

Sum of random variables

Sum and scaling:

    <y + z> = <y> + <z>
    σ_{y+z}^2 = σ_y^2 + σ_z^2 + 2ρ σ_y σ_z
    if independent (ρ = 0): σ_{y+z}^2 = σ_y^2 + σ_z^2

More generally, if x = y + z with y, z independent, y distributed as f(y) dy and z distributed as g(z) dz, the distribution of x is the convolution

    h(x) dx = [∫ f(y) g(x - y) dy] dx = (f * g)(x) dx

Proof (optional): convolution in the original space is equivalent to a product in Fourier space. Hence for a sum the characteristic functions multiply, the logs of the characteristic functions add, and therefore the cumulants add!

Multiplying by a constant:

    <(λy)^n> = λ^n <y^n>,   σ_{λy}^2 = λ^2 σ_y^2,   κ_n(λy) = λ^n κ_n(y)
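That variances (second cumulants) add for independent variables can be checked by direct sampling (a sketch; the uniform and exponential distributions are arbitrary choices with known variances 1/12 and 1/4):

```python
import random
random.seed(4)

def variance(vals):
    m = sum(vals) / len(vals)
    return sum((v - m)**2 for v in vals) / len(vals)

n = 200_000
ys = [random.uniform(0, 1) for _ in range(n)]      # variance 1/12
zs = [random.expovariate(2.0) for _ in range(n)]   # variance 1/4
xs = [y + z for y, z in zip(ys, zs)]               # independent sum

v_sum = variance(xs)
v_parts = variance(ys) + variance(zs)   # independent => variances add
print(v_sum, v_parts)
```

Both numbers agree with each other and with the analytic value 1/12 + 1/4 ≈ 0.333 to within sampling noise.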

Central Limit Theorem

Consider N independent random variables x_i with <x_i> = µ_i and σ_{x_i}^2 = σ_i^2, and consider the average

    A = (1/N) ∑_{i=1}^N x_i

Theorem: asymptotically (N → ∞), the random variable A has a Gaussian distribution

    f_N(A) dA → [1/(√(2π) σ)] exp(-(A - µ)^2/(2σ^2)) dA

with µ = (1/N) ∑ µ_i and σ^2 = (1/N^2) ∑ σ_i^2.

Proof (optional): look at the cumulants, which add; κ_n → 0 as 1/N^{n-1}.

Consequences:
- The initial distribution is not important.
- Experimental errors are Gaussian, provided they are the sum of a large number of small errors: m = g(m_1, m_2, m_3, ...) = g(true) + ∑_i (∂g/∂m_i) δm_i.
- The distributions get very narrow: if the σ_i are bounded, σ^2 ∝ 1/N.

Simulation: http://cosmology.berkeley.edu/classes/s2009/phys112/centrallimit/MultipleCoinToss.html
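The 1/N narrowing of the averages can be simulated directly, e.g. by averaging draws from a skewed exponential distribution (a sketch; the two values of N are arbitrary):

```python
import random
random.seed(5)

def averages(N, trials=5000):
    """Average A of N draws from a skewed (exponential, mean 1) distribution."""
    return [sum(random.expovariate(1.0) for _ in range(N)) / N
            for _ in range(trials)]

def variance(vals):
    m = sum(vals) / len(vals)
    return sum((v - m)**2 for v in vals) / len(vals)

a_small = averages(10)
a_large = averages(160)
ratio = variance(a_small) / variance(a_large)   # sigma^2 ~ 1/N => ratio ~ 16
mean_large = sum(a_large) / len(a_large)        # -> true mean, 1.0
```

Increasing N by a factor of 16 shrinks the variance of the average by about the same factor, while the mean stays pinned at the true value, even though the underlying distribution is far from Gaussian.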

Consequences for Thermodynamics

Because of the large number of particles, averages are very peaked around the mean:
- The system is well characterized by its mean = macroscopic quantity.
- Fluctuations are very small.
- There is not much difference between the most probable value, the mean value, and the experimental value.