CHAPTER 1  Atoms, Molecules, Thermodynamics, Statistical Thermodynamics (S.T.)

S.T. is the key to understanding driving forces; e.g., it determines whether a process proceeds spontaneously. Let's start with entropy: probabilities are important for understanding entropy.

Suppose there are several different outcomes (A, B, C, ...) for some event. The probability of outcome A is

  p_A = n_A / N = (# of outcomes giving A) / (total # of outcomes),  with 0 ≤ p_A ≤ 1

Probability of getting a 4 when rolling a die = 1/6.

Now consider 3 rolls:
  prob of 3, 3, 4 in that order?  (1/6)^3 = 1/216
  prob of rolling two 3's and a 4 in any order?  3 × 1/216 = 1/72

Terminology:
  mutually exclusive: e.g., if A occurs, B does not
  collectively exhaustive: A_1, A_2, ..., A_t give all possible outcomes
  independent: A_1, A_2, A_3, ... are independent if the outcome of each is unrelated to the outcomes of the others
  multiplicity: total # of ways different outcomes can be realized
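These dice answers are easy to verify by brute force. Here is a minimal Python sketch (our addition, not part of the original notes) that enumerates all 6^3 equally likely three-roll sequences and counts the favorable ones:

```python
from itertools import product
from collections import Counter

rolls = list(product(range(1, 7), repeat=3))   # all 216 equally likely sequences of 3 rolls

# 3, 3, 4 in that exact order (a sequence question)
n_ordered = sum(1 for r in rolls if r == (3, 3, 4))

# two 3's and one 4 in any order (a composition question)
n_any_order = sum(1 for r in rolls if Counter(r) == Counter({3: 2, 4: 1}))

print(n_ordered, "/", len(rolls))     # 1 / 216
print(n_any_order, "/", len(rolls))   # 3 / 216 = 1/72
```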

Multiplicity:
  n_A = # of outcomes of A
  n_B = # of outcomes of B
  W = n_A × n_B = multiplicity

Example: 2 models × 3 colors = 6 different outcomes.

Addition Rule

If outcomes A, B, ... are mutually exclusive and occur with probabilities p_A = n_A/N, p_B = n_B/N, ..., then the probability of seeing A or B is

  p(A or B) = p_A + p_B

n_A, n_B, ... are statistical weights. If outcomes A, B, ..., E are collectively exhaustive and mutually exclusive,

  n_A + n_B + ... + n_E = N   and   p_A + p_B + ... + p_E = 1

Multiplication Rule

If outcomes A, B, ..., E are independent, the probability of seeing A and B and ... and E is

  p(A and B and ... and E) = p_A p_B ... p_E

Examples:
  Rolling a die: prob of 1 or 4 = 1/6 + 1/6 = 1/3
  Roll a die twice: prob of 1 and then 4 = 1/6 × 1/6 = 1/36
  Prob of five heads in a row in coin flips: (1/2)^5 = 1/32
  Prob of two heads, one tail, two heads on successive coin flips: p_H^2 p_T p_H^2 = 1/32, one of the 32 possible five-flip sequences

Independent events with probabilities p_A, p_B:
  prob that both happen = p_A p_B
  prob A happens, B does not = p_A (1 - p_B)
  prob neither happens = (1 - p_A)(1 - p_B)
  prob that something happens (A or B or both) = 1 - prob(not A and not B) = 1 - (1 - p_A)(1 - p_B) = p_A + p_B - p_A p_B

Example: prob of 1 on the first roll of a die and 4 on the 2nd roll = 1/36.
Prob of 1 on the first roll or 4 on the 2nd roll = ?

By counting all 36 possibilities (favorable outcomes marked *):
  (1,1)* (1,2)* (1,3)* (1,4)* (1,5)* (1,6)*
  (2,1)  (2,2)  (2,3)  (2,4)* (2,5)  (2,6)
  (3,1)  (3,2)  (3,3)  (3,4)* (3,5)  (3,6)
  (4,1)  (4,2)  (4,3)  (4,4)* (4,5)  (4,6)
  (5,1)  (5,2)  (5,3)  (5,4)* (5,5)  (5,6)
  (6,1)  (6,2)  (6,3)  (6,4)* (6,5)  (6,6)
prob = 11/36. Counting all possibilities works here, but it is not a viable approach in general.

One approach: split the favorable outcomes into mutually exclusive classes.
  class A: 1 first and anything but 4 second: (1/6)(5/6) = 5/36
  class B: anything but 1 first and 4 second: (5/6)(1/6) = 5/36
  class C: 1 first and 4 second: 1/36
  p(1 or 4) = 5/36 + 5/36 + 1/36 = 11/36

Another approach is the following:
  prob(1) = 6/36, prob(4) = 6/36, prob(1 and 4) = 1/36
  prob(1 or 4) = 6/36 + 6/36 - 1/36 = 11/36
Essentially all questions can be re-expressed in terms of a combination of "and" and "or" operations.

Here is a third approach to the above problem:
  p(fail) = p(not 1 first) × p(not 4 second) = (5/6)(5/6) = 25/36
  p(success) = 1 - p(fail) = 11/36
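The 11/36 result and all three approaches can be cross-checked by enumeration; the short Python sketch below is our own illustration (the variable names pA, pB are ours):

```python
from itertools import product
from fractions import Fraction

pairs = list(product(range(1, 7), repeat=2))          # all 36 (first, second) roll outcomes
hits = [r for r in pairs if r[0] == 1 or r[1] == 4]   # "1 on the first roll OR 4 on the second"
direct = Fraction(len(hits), len(pairs))

pA = Fraction(1, 6)                   # 1 on the first roll
pB = Fraction(1, 6)                   # 4 on the second roll
overlap = pA + pB - pA * pB           # second approach: add, then subtract the double-counted "1 and 4"
complement = 1 - (1 - pA) * (1 - pB)  # third approach: 1 - p(fail)

print(direct, overlap, complement)    # 11/36 11/36 11/36
```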

Correlated Events: the outcome of an event depends on the outcomes of other events.

Example: drawing balls from a barrel containing one red (R) and two green (G) balls.
  prob of G = 2/3 on the 1st try; prob of R = 1/3 on the 1st try
  The probabilities on the second try depend on whether the ball is put back after the first try:
    if put back: uncorrelated
    if not put back: correlated

If one does not return balls to the barrel, the three possible sequences (move 1, move 2, move 3) and their probabilities are:
  R G G:  1/3 × 1 × 1   = 1/3
  G R G:  2/3 × 1/2 × 1 = 1/3
  G G R:  2/3 × 1/2 × 1 = 1/3
  Sum over all possibilities = 1
Note: we have to renormalize the probabilities on moves after the 1st.
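The renormalization on later moves can be made explicit with a small recursive enumeration; this sketch is ours (the function name sequence_probs is invented for the illustration):

```python
from fractions import Fraction
from collections import Counter

def sequence_probs(balls):
    """Probability of each draw order when all balls are drawn without replacement."""
    probs = {}

    def draw(remaining, seq, prob):
        if not remaining:
            probs[seq] = probs.get(seq, Fraction(0)) + prob
            return
        for color, n in Counter(remaining).items():
            rest = list(remaining)
            rest.remove(color)
            # renormalize on each move: prob of this color = (# of that color left) / (# of balls left)
            draw(rest, seq + color, prob * Fraction(n, len(remaining)))

    draw(list(balls), "", Fraction(1))
    return probs

print(sequence_probs("RGG"))   # {'RGG': Fraction(1, 3), 'GRG': Fraction(1, 3), 'GGR': Fraction(1, 3)}
```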

Gambling Equation

5 horses A, B, C, D, E with probabilities of winning p_A, p_B, p_C, p_D, p_E (e.g., odds from Las Vegas).

Suppose C wins; what are the probabilities for A, B, D, E to come in second?

  p(A 2nd | C first) = p_A / (p_A + p_B + p_D + p_E) = p_A / (1 - p_C),  etc.

Probability that C is first, A is second, and B is third:

  p_C × [p_A / (1 - p_C)] × [p_B / (1 - p_C - p_A)] = p_C p_A p_B / [(1 - p_C)(1 - p_C - p_A)]
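The same sequential renormalization is easy to code; the sketch below is ours, and the win probabilities are made-up numbers (the notes give none), chosen only so that they sum to 1:

```python
from fractions import Fraction

# Hypothetical win probabilities for horses A..E; these values are our own example.
p = {"A": Fraction(3, 10), "B": Fraction(1, 4), "C": Fraction(1, 5),
     "D": Fraction(3, 20), "E": Fraction(1, 10)}          # sums to 1

def order_prob(order, p):
    """P(the listed horses finish 1st, 2nd, 3rd, ...), renormalizing after each finisher."""
    remaining = Fraction(1)
    prob = Fraction(1)
    for horse in order:
        prob *= p[horse] / remaining
        remaining -= p[horse]
    return prob

# p_C * p_A/(1 - p_C) * p_B/(1 - p_C - p_A)
print(order_prob("CAB", p))
print(p["C"] * p["A"] / (1 - p["C"]) * p["B"] / (1 - p["C"] - p["A"]))   # same value
```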

Combinatorics: composition rather than sequence of events is important.

Consider coin tosses:
  prob of seeing HTHH? a sequence question
  prob of seeing 3H, 1T regardless of order? a composition question

Prob of 3H, 1T: there are 16 total arrangements, distributed among the compositions 4H, 3H1T, 2H2T, 1H3T, 4T. The number of ways of getting 3H, 1T is 4 (so prob = 4/16 = 1/4). We will show that the answer is given by 4!/(3!1!) = 4.

Consider the letters x, y, z, w:
  4 possibilities at the first draw, 3 possibilities at the 2nd, 2 at the 3rd, 1 at the 4th: 4! = 24 sequences.
  For the letters of the alphabet: 26! sequences.

Distinguishable and indistinguishable letters: AHA vs. A_1 H_2 A_3. With distinguishable letters there are 3! = 6 arrangements; with the two A's indistinguishable there are 3!/2! = 3 arrangements. The 2! accounts for the fact that we cannot distinguish the two A's.

In general, the # of distinguishable permutations is

  W = N! / (n_1! n_2! ... n_t!)

where n_1 is the number of indistinguishable members of category 1, n_2 of category 2, ..., and n_t of category t. When there are only two categories, n_1 and n_2 = N - n_1,

  W = N! / (n_1! n_2!) = N! / [n_1! (N - n_1)!] = C(N, n_1),  the binomial coefficient "N choose n_1"

So for the 3H, 1T coin flip example, the # of distinguishable arrangements is 4!/(3!1!) = 4.

Flip a coin 117 times: how many arrangements have 36 heads?

  W = 117! / (36! 81!) ≈ 1.84 × 10^30
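Both counts are one-liners with the standard library; this snippet is our addition (the helper name multiplicity is ours):

```python
import math

def multiplicity(*counts):
    """W = N! / (n_1! n_2! ... n_t!) for the given category sizes."""
    W = math.factorial(sum(counts))
    for n in counts:
        W //= math.factorial(n)
    return W

print(multiplicity(3, 1))     # 4!/(3!1!) = 4 distinguishable arrangements of 3H, 1T
print(math.comb(117, 36))     # 117!/(36! 81!) ≈ 1.84e30 arrangements with 36 heads
print(multiplicity(36, 81) == math.comb(117, 36))   # True: two categories reduce to a binomial coefficient
```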

6 coin tosses:
  6H:   6!/(6!0!) = 1
  5H1T: 6!/(5!1!) = 6
  4H2T: 6!/(4!2!) = 15
  3H3T: 6!/(3!3!) = 20
  2H4T: 15
  1H5T: 6
  0H6T: 1
The most probable situation is an equal # of heads and tails.

Roll a die 15 times: how many ways can we achieve three 1's, one 2, one 3, five 4's, two 5's, and three 6's?

  15! / (3!1!1!5!2!3!) = 151,351,200 sequences

Prob of a royal flush in poker (Ace, King, Queen, Jack, 10 of the same suit, from 52 cards):

  4 × (5·4·3·2·1)/(52·51·50·49·48) ≈ 1.54 × 10^-6

The factor of 4 accounts for the four suits.
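The die-roll count and the royal flush probability can both be reproduced with math.factorial, math.comb, and math.perm; this check is ours:

```python
import math
from fractions import Fraction

# Ways to roll three 1's, one 2, one 3, five 4's, two 5's, three 6's in 15 rolls
counts = (3, 1, 1, 5, 2, 3)
W = math.factorial(sum(counts))
for n in counts:
    W //= math.factorial(n)
print(W)                                   # 151,351,200 sequences

# Royal flush: one specific unordered 5-card set per suit, out of C(52, 5) possible hands
p_unordered = Fraction(4, math.comb(52, 5))
# Equivalent sequence view: 4 * 5!/(52*51*50*49*48)
p_ordered = 4 * Fraction(math.factorial(5), math.perm(52, 5))
print(float(p_unordered), float(p_ordered))   # both ≈ 1.54e-06
```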

Bose-Einstein statistics: the # of ways n indistinguishable particles can be put in M boxes, with no constraint on the # in a box, is

  W(n, M) = (M + n - 1)! / [(M - 1)! n!]

(Rather than M boxes, consider arranging the n particles and the M - 1 walls between boxes.)

To check: 4 particles in 4 boxes:

  W(4, 4) = 7! / (3!4!) = 35

The occupancy patterns contribute 4 + 12 + 6 + 12 + 1 = 35 total arrangements (corresponding to patterns of the type (4,0,0,0), (3,1,0,0), (2,2,0,0), (2,1,1,0), and (1,1,1,1), respectively).
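A brute-force check of the stars-and-bars formula, counting occupancy vectors directly (this enumeration is our illustration):

```python
import math
from itertools import product

def bose_einstein(n, M):
    """W(n, M) = (M + n - 1)! / ((M - 1)! n!): ways to put n identical particles in M boxes."""
    return math.comb(M + n - 1, n)

def count_by_enumeration(n, M):
    """Count occupancy vectors (n_1, ..., n_M) of non-negative integers summing to n."""
    return sum(1 for occ in product(range(n + 1), repeat=M) if sum(occ) == n)

print(bose_einstein(4, 4), count_by_enumeration(4, 4))   # 35 35
```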

Distribution functions: discrete and continuous.

In general, probability distributions should be normalized.
  If discrete: Σ_{i=1}^{t} p(i) = 1
  If continuous, given g(x) on the interval [a, b]: g_0 = ∫_a^b g(x) dx and p(x) = g(x)/g_0
  Note: ∫_a^b p(x) dx = 1, normalized as advertised.

Binomial and multinomial distributions

Binomial: two outcomes (e.g., heads, tails), with p_1 and p_2 = 1 - p_1. For a coin toss, p_H = p_T = 1/2.

All combinations of two events:

  p_1^2 + p_1 p_2 + p_2 p_1 + p_2^2 = p_1^2 + p_1(1 - p_1) + (1 - p_1)p_1 + (1 - p_1)^2
                                    = p_1^2 + 2p_1 - 2p_1^2 + 1 - 2p_1 + p_1^2 = 1

The probability of n occurrences of one event (each with probability p) in N trials, regardless of order, is

  p(n, N) = [N! / (n!(N - n)!)] p^n (1 - p)^(N - n)

where N!/(n!(N - n)!) is the number of sequences with that composition.

The coefficients N!/(n!(N - n)!) form Pascal's triangle:
  N = 0:  1                (0!/(0!0!) = 1)
  N = 1:  1 1              (1!/(0!1!) = 1, 1!/(1!0!) = 1)
  N = 2:  1 2 1            (2!/(0!2!) = 1, 2!/(1!1!) = 2, 2!/(2!0!) = 1)
  N = 3:  1 3 3 1
  N = 4:  1 4 6 4 1
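A minimal sketch (ours) that builds p(n, N) from math.comb, prints the Pascal's-triangle coefficients, and confirms that each distribution sums to 1:

```python
import math
from fractions import Fraction

def binomial_pmf(n, N, p):
    """p(n, N) = C(N, n) p^n (1 - p)^(N - n)."""
    return math.comb(N, n) * p**n * (1 - p)**(N - n)

p = Fraction(1, 2)                                          # fair coin
for N in range(5):
    row = [math.comb(N, n) for n in range(N + 1)]           # Pascal's triangle row
    total = sum(binomial_pmf(n, N, p) for n in range(N + 1))
    print(N, row, "sum =", total)                           # sum = 1 for every N
```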

2 coins (p = 1/2):
  HH: p^2 (1 - p)^0 = 1/4
  HT: p(1 - p) = (1/2)(1/2) = 1/4
  TT: p^0 (1 - p)^2 = 1/4

3 coins:
  HHH: p^3 (1 - p)^0 = 1/8
  HHT: (1/2)^2 (1/2) = 1/8
  etc.

[Figure: p(n_H) for 4 coin flips plotted vs. n_H = 0, 1, 2, 3, 4, with values 1/16, 1/4, 3/8, 1/4, 1/16.]

Multinomial probability distribution:

  p(n_1, n_2, ..., n_t) = [N! / (n_1! n_2! ... n_t!)] p_1^{n_1} p_2^{n_2} ... p_t^{n_t},  with  Σ_i n_i = N

Averages

If continuous:
  <x^n> = ∫_a^b x^n p(x) dx = n-th moment
  <f(x)> = ∫_a^b f(x) p(x) dx

If discrete:
  <i> = Σ_{i=1}^{t} i p(i)
  <f(i)> = Σ_{i=1}^{t} f(i) p(i)

assuming the distribution is normalized; <x> = mean, or average.

If p(x) is not normalized, compute g = ∫_a^b p(x) dx and use p'(x) = p(x)/g. This converts p(x) to a normalized distribution p'(x).

Average of 3, 3, 2, 2, 2, 1, 1:

  <i> = (sum of numbers)/(# of samples) = 14/7 = 2

or, using the probabilities p_1 = 2/7, p_2 = 3/7, p_3 = 2/7:

  <i> = 1(2/7) + 2(3/7) + 3(2/7) = (2 + 6 + 6)/7 = 2

Variance (σ^2) = width of the distribution. With a = <x> (the mean):

  σ^2 = <(x - a)^2> = <x^2 - 2ax + a^2> = <x^2> - 2a<x> + a^2 = <x^2> - a^2 = <x^2> - <x>^2

σ = √(σ^2) = standard deviation
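The same numbers fall out whether one averages the raw samples or weights the outcomes by their probabilities p_i = n_i/N; a short sketch of ours:

```python
from fractions import Fraction
from collections import Counter

samples = [3, 3, 2, 2, 2, 1, 1]
N = len(samples)

# Directly from the samples
mean = Fraction(sum(samples), N)                      # 14/7 = 2
mean_sq = Fraction(sum(x * x for x in samples), N)    # <x^2> = 32/7
variance = mean_sq - mean**2                          # sigma^2 = <x^2> - <x>^2 = 4/7

# From the probabilities p_i = n_i / N
probs = {x: Fraction(n, N) for x, n in Counter(samples).items()}   # {3: 2/7, 2: 3/7, 1: 2/7}
mean_from_p = sum(x * q for x, q in probs.items())

print(mean, mean_from_p, variance)   # 2 2 4/7
```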

If you do four coin flips many times, what will you find for the average # of heads? (Obviously 2.) How to compute this mathematically:

  <n_H> = Σ_{n_H=0}^{4} n_H p(n_H, N)
        = 0(1/16) + 1(4/16) + 2(6/16) + 3(4/16) + 4(1/16)
        = (4 + 12 + 12 + 4)/16 = 2

  <n_H^2> = 5,  so  σ^2 = <n_H^2> - <n_H>^2 = 5 - 2^2 = 1  and  σ = 1

Uniform distribution: p(x) = 1/a for 0 ≤ x ≤ a

  <x> = a/2,  <x^2> = a^2/3,  σ^2 = a^2/12
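To reproduce these moments numerically (a sketch we added; the step count and a = 3.0 are arbitrary choices for the check):

```python
import math

# <n_H> and <n_H^2> for N = 4 fair coin flips
N = 4
p_nH = [math.comb(N, n) / 2**N for n in range(N + 1)]   # [1/16, 4/16, 6/16, 4/16, 1/16]
mean = sum(n * p for n, p in enumerate(p_nH))           # <n_H> = 2
mean_sq = sum(n * n * p for n, p in enumerate(p_nH))    # <n_H^2> = 5
print(mean, mean_sq, mean_sq - mean**2)                 # 2.0 5.0 1.0  (sigma^2 = 1)

# Uniform distribution p(x) = 1/a on [0, a]; midpoint-rule integration
a, steps = 3.0, 100_000
dx = a / steps
xs = [(i + 0.5) * dx for i in range(steps)]
mean_x = sum(x / a for x in xs) * dx
mean_x2 = sum(x * x / a for x in xs) * dx
print(mean_x, a / 2)          # <x>   ~ a/2   = 1.5
print(mean_x2, a**2 / 3)      # <x^2> ~ a^2/3 = 3.0
```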

Exponential distribution:
  g_0 = ∫_0^∞ e^{-ax} dx = 1/a,  so  p(x) = a e^{-ax},  0 ≤ x < ∞
  <x> = a ∫_0^∞ x e^{-ax} dx = 1/a

Gaussian:
  p(x) = [1/(σ_x √(2π))] e^{-x²/(2σ_x²)}

Poisson:
  p(n) = a^n e^{-a} / n!

Lorentzian:
  p(x) = (1/π) a / [(x - <x>)² + a²]

Power law:
  p(x) ∝ 1/x^q,  q is a constant
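As a final numerical sanity check (our addition; the rate a = 2.0 and the integration limits are assumptions for the illustration), the exponential distribution's normalization and mean <x> = 1/a:

```python
import math

a = 2.0                             # assumed rate constant
upper, steps = 50.0 / a, 200_000    # integrate well into the tail
dx = upper / steps

norm = mean = 0.0
for i in range(steps):
    x = (i + 0.5) * dx              # midpoint rule
    px = a * math.exp(-a * x)       # p(x) = a e^{-ax}
    norm += px * dx
    mean += x * px * dx

print(norm)         # ~ 1.0, normalized as advertised
print(mean, 1 / a)  # <x> ~ 1/a = 0.5
```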