
Review of probability (ChE210D)

Today's lecture: basics of probability and statistics; both discrete and continuous distributions.

Probability and statistics

Probability distributions

The behavior of systems at equilibrium is described by molecular statistical distribution functions. Therefore, we now briefly review some properties of statistics.

Let m be a discrete variable. Denote by P(m) the probability that m will take a given value. This distribution is normalized such that

\sum_m P(m) = 1

where the sum proceeds over all allowable, distinct values of m. Here, P(m) is dimensionless since it returns a probability.

On the other hand, let x be a continuous variable. Then we define a probability distribution P(x) such that

\int P(x) \, dx = 1

Note that, in order to be dimensionally consistent, P(x) must have inverse dimensions of x, due to the presence of the dx term. Thus P(x) becomes a probability density. In this interpretation, we think of the combined expression P(x) dx as the probability (dimensionless) that x takes on a value in the range x - dx/2 to x + dx/2.

In simulation, we are often interested in measuring distribution functions. Typically this is accomplished using histograms. A histogram is simply a count of the number of times that a variable is observed to take each of its different values. This is straightforward for discrete variables. For continuous distributions, however, we must generate a discrete approximation to them by creating a histogram with bins of finite width. For example, we may make a histogram of the energy with a bin width of 0.5 kcal/mol. The histogram bin for the range 0-0.5 kcal/mol would count the number of times we observed an energy in that range, and similarly for 0.5-1.0 kcal/mol, and so on.

M. S. Shell 2012 1/6 last modified 3/30/2012
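As a minimal illustrative sketch (not from the notes), the histogram procedure above can be carried out with NumPy. The Gaussian "energies" below are a made-up stand-in for simulation output, and the 0.5 bin width mirrors the kcal/mol example in the text:

```python
import numpy as np

# Hypothetical sample data: pretend these are energies recorded during a
# simulation (arbitrary units; a Gaussian is used purely for illustration).
rng = np.random.default_rng(0)
energies = rng.normal(loc=5.0, scale=1.0, size=100_000)

# Histogram with a finite bin width of 0.5, as in the text's example
bin_width = 0.5
edges = np.arange(0.0, 10.0 + bin_width, bin_width)
counts, _ = np.histogram(energies, bins=edges)

# Discrete approximation: fraction of observations falling in each bin
p_discrete = counts / counts.sum()

# Approximate probability density: dividing by the bin width makes
# sum(p_density * bin_width) equal 1, matching the normalization integral
p_density = p_discrete / bin_width
```

Shrinking `bin_width` makes the estimate more continuous-like but puts fewer counts in each bin, which is exactly the tradeoff described above.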

Let the histogram bin width be \Delta x and let c(x) be the number of times we observe x with values between x - \Delta x/2 and x + \Delta x/2. Then our discrete distribution is

\tilde{P}(x) = \frac{c(x)}{\sum_x c(x)}

Here, the use of the tilde indicates that the probability distribution is a discrete approximation to a continuous one. We can recover the continuous one in the limit:

P(x) = \lim_{\Delta x \to 0} \frac{\tilde{P}(x)}{\Delta x}

There is always a tradeoff in simulations: as the histogram bin size \Delta x grows smaller, our discrete approximation becomes increasingly continuous-like. However, the number of counts in any one histogram bin also becomes very small, and statistical accuracy grows poor unless we extend the length of our simulation.

Multivariate distributions

The joint probability distribution for multiple variables indicates the probability that all variables will simultaneously attain given values. For example, for two discrete variables m and n, P(m = m', n = n') gives the joint probability that a measurement will occur in which it is found that m has value m' and n has value n'. If we consider continuous variables x and y, then P(x, y) \, dx \, dy is the joint probability that we see an event with x in the range x \pm dx/2 and y in the range y \pm dy/2. Notice that P(x, y) has inverse dimensions of both x and y.

We can extract single-variable distributions from multivariable ones by summing or integrating over the possibilities for the other variables:

P(m) = \sum_n P(m, n)

P(x) = \int P(x, y) \, dy

Notice that P(x) now has inverse dimensions of x only, as we would expect.
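The marginalization rule can be sketched concretely for a small discrete case. The joint distribution below is hypothetical, chosen only so that the entries sum to one:

```python
import numpy as np

# Made-up discrete joint distribution P(m, n), stored as a 2D array with
# rows indexed by m and columns by n; entries sum to 1.
P_joint = np.array([[0.10, 0.05, 0.05],
                    [0.20, 0.30, 0.05],
                    [0.05, 0.10, 0.10]])

# Single-variable (marginal) distributions: sum over the other variable
P_m = P_joint.sum(axis=1)   # P(m) = sum_n P(m, n)
P_n = P_joint.sum(axis=0)   # P(n) = sum_m P(m, n)
```

Each marginal is itself normalized, since summing it simply completes the double sum over the full joint distribution.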

If two variables are completely independent, then their joint probability is simply the product of their individual probabilities:

P(x, y) = P(x) P(y)    (independent variables)

The conditional probability is the probability of a variable value given that some other observation has been made. For example, P(m | n) is the probability distribution for m given a particular observed value of n. This distribution can be related to the joint and single-variable distributions by:

P(m | n) = \frac{P(m, n)}{P(n)}

Identical expressions apply to continuous variables.

Distribution moments

We can compute the average value of a distribution variable. For example,

\langle x \rangle = \int x P(x) \, dx

The variance is given by:

\sigma_x^2 = \langle (x - \langle x \rangle)^2 \rangle = \langle x^2 - 2 x \langle x \rangle + \langle x \rangle^2 \rangle = \langle x^2 \rangle - 2 \langle x \rangle^2 + \langle x \rangle^2 = \langle x^2 \rangle - \langle x \rangle^2

What is the expectation value of some variable A that is a function of an observable x? To compute such a property, we integrate over the distribution:

\langle A \rangle = \int A(x) P(x) \, dx

The variance in A is similarly found:

\sigma_A^2 = \langle (A - \langle A \rangle)^2 \rangle = \langle A^2 \rangle - \langle A \rangle^2
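These moment formulas can be sketched numerically for a discrete distribution. The values and probabilities below are made up for illustration, and the choice A(x) = exp(-x) is an arbitrary example of a function of the observable:

```python
import numpy as np

# Hypothetical discrete distribution: values x and probabilities P (sum to 1)
x = np.array([0.0, 1.0, 2.0, 3.0])
P = np.array([0.1, 0.4, 0.3, 0.2])

mean_x = np.sum(x * P)                    # <x>
var_x = np.sum(x**2 * P) - mean_x**2      # <x^2> - <x>^2

# Expectation and variance of a function A(x) of the observable
A = np.exp(-x)                            # example choice A(x) = exp(-x)
mean_A = np.sum(A * P)                    # <A>
var_A = np.sum(A**2 * P) - mean_A**2      # <A^2> - <A>^2
```

Note that the weights are always the probabilities of x itself; only the quantity being averaged changes when we move from x to A(x).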

If we compute \bar{x} using only a finite number N of independent observations, such that

\bar{x} = \frac{1}{N} \sum_{i=1}^{N} x_i

we can also compute the expected variance in \bar{x}, \sigma_{\bar{x}}^2. This is called the standard error of the mean. As N \to \infty, we expect \sigma_{\bar{x}}^2 \to 0. That is, as we take more and more samples, we expect our approximation to the underlying distribution average to grow more and more accurate. Under the assumption of independent observations,

\langle \bar{x} \rangle = \langle x \rangle

\sigma_{\bar{x}}^2 = \langle \bar{x}^2 \rangle - \langle \bar{x} \rangle^2

Working through the math leads to

\sigma_{\bar{x}}^2 = \frac{\sigma_x^2}{N}

Thus, if we are trying to find the average of a distribution, our accuracy increases as the square root of the number of independent samples we take.

Typically we take measurements at different points in time in a molecular simulation. If we take them too frequently, the measurements will not be independent but will be highly correlated. The above expression can be generalized to:

\sigma_{\bar{x}}^2 = \frac{\sigma_x^2}{N} \left( 1 + \frac{2 \tau}{\Delta t} \right)

Here, \tau is the relaxation time, or correlation time, for the variable x, and \Delta t is the spacing in time at which we take measurements. This equation shows that the standard error of the mean can be much larger than what we would expect for independent measurements if the time intervals are not appreciably larger than the relaxation time.
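A short sketch of the generalized standard-error formula, under the assumption that the correlation time is supplied as an input (in practice one would estimate it from the autocorrelation function of the time series; the data here are made up):

```python
import numpy as np

def sem_correlated(samples, tau, dt):
    """Standard error of the mean for a time series with correlation
    time tau and sampling interval dt:
        sigma_xbar^2 = (sigma_x^2 / N) * (1 + 2*tau/dt)
    For tau = 0 this reduces to the independent-sample result."""
    n = len(samples)
    var = np.var(samples)                          # sigma_x^2
    var_mean = (var / n) * (1.0 + 2.0 * tau / dt)  # sigma_xbar^2
    return np.sqrt(var_mean)

# Hypothetical uncorrelated data, just to exercise the function
rng = np.random.default_rng(1)
data = rng.normal(size=10_000)
sem_indep = sem_correlated(data, tau=0.0, dt=1.0)  # sigma_x / sqrt(N)
sem_corr = sem_correlated(data, tau=5.0, dt=1.0)   # inflated by correlation
```

Setting tau comparable to or larger than dt inflates the error estimate, which is the quantitative statement of the warning above about sampling too frequently.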

Microstate probability distributions in classical systems

Basic concepts

In the classical description, we describe a system by the positions and momenta of all of the atomic nuclei:

r^N = (x_1, y_1, z_1, x_2, y_2, z_2, \ldots, x_N, y_N, z_N)

p^N = (p_{x,1}, p_{y,1}, p_{z,1}, p_{x,2}, \ldots, p_{y,N}, p_{z,N})

A microstate is just one configuration of the system. In a classical system, one microstate is characterized by a list of the 3N positions r^N and 3N momenta p^N, for a total of 6N pieces of information. For a microstate m we might use the notation (p_m^N, r_m^N) to indicate specific values of these variables.

At equilibrium, the bulk properties of a system are time-invariant. We might construct a microscopic probability distribution function that describes the probability with which we might see a given classical microstate m at any one instant in time. We have that P(p_m^N, r_m^N) \, dp^N \, dr^N gives the probability of microstate m. That is,

P(p_m^N, r_m^N) \, dp_1 \cdots dp_N \, dr_1 \cdots dr_N

is proportional to the continuous joint probability that the system is at a microstate that lies between p_{x,1} - dp_{x,1}/2 and p_{x,1} + dp_{x,1}/2, between p_{y,1} - dp_{y,1}/2 and p_{y,1} + dp_{y,1}/2, and so on. In other words, the probability corresponds to a microstate within a differential element dp_1 \cdots dp_N \, dr_1 \cdots dr_N centered around (p_m^N, r_m^N).

The particular form of the distribution function P(p^N, r^N) depends on the conditions at which a system is maintained, i.e., the particular statistical mechanical ensemble. Systems can be at:

- constant E or constant T (exchanging energy with a heat bath / energy reservoir)
- constant V or constant P (exchanging volume with a volume reservoir)
- constant N or constant \mu (exchanging particles with a particle bath)

We will discuss these ensembles in more depth as we progress through different simulation methods.
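As a small illustrative sketch (not from the notes), one classical microstate can be stored as two arrays of shape (N, 3), holding the 3N positions and 3N momenta. The box size and momentum scale below are arbitrary placeholders:

```python
import numpy as np

N = 100  # hypothetical particle count

# One microstate: 3N positions r^N and 3N momenta p^N, i.e. 6N numbers.
# Random values stand in for a real configuration here.
rng = np.random.default_rng(2)
r = rng.uniform(0.0, 10.0, size=(N, 3))   # positions in an arbitrary box
p = rng.normal(0.0, 1.0, size=(N, 3))     # momenta in arbitrary units
```

Which microstates this pair of arrays visits over a simulation, and with what probability, is exactly what the ensemble's distribution function P(p^N, r^N) prescribes.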

Other references for statistical mechanics

Extensive online lecture notes for CHE210A, Thermodynamics and Statistical Mechanics, are available at: www.engr.ucsb.edu/~shell/che210a

The following two classic statistical mechanics texts are also highly recommended:

Statistical Mechanics
Donald A. McQuarrie, University Science Books, 2000 (2nd edition)

This is a classic statistical mechanics text, and McQuarrie does an excellent job of laying out clear, concise explanations of the subject material. It serves well both as an introduction to the subject and as a reference for specific models and theoretical techniques. In this course, we will cover material in parts of Chapters 1-3, 5, 7, and 9. This book is also frequently used as a primary text in ChE 210B.

An Introduction to Statistical Thermodynamics
Terrell Hill, Dover Books, 1987

This very inexpensive paperback is a tour de force in laying out the foundations and early theoretical advancements of statistical mechanics. Hill takes care to discuss many of the subtleties that other texts gloss over, and provides detailed derivations for major theories. The density of material in the book often necessitates careful study and re-reading at first, but it is an essential reference for anyone involved in research broadly related to molecular thermodynamics. In this course, we will cover material in parts of Chapters 1-4 and 6.