Chapter 1. Probability


Microscopic properties of matter: quantum mechanics, atomic and molecular properties.
Macroscopic properties of matter: thermodynamics, E, H, C_V, C_p, S, A, G.
How do we relate these two sets of properties? Statistical thermodynamics (statistical mechanics).

Basic Probability Theory

Variables: quantities that can change in value throughout the course of an experiment or series of events, e.g., the side of a coin observed after tossing the coin.

Discrete variables: assume only a limited number of specific values, e.g., the outcome of a coin toss has two values (head or tail). The sample space of a variable is the set of possible values the variable can assume, e.g., the outcome of a coin toss is {+1, −1} (heads or tails).

Continuous variables: assume any value within a certain range, e.g., temperature, 0 < T < ∞.

Imagine a lottery in which balls numbered 1 to 50 are randomly mixed. The probability of selecting ball 1 is 1/50. (Strictly, establishing such a probability as a frequency would require an infinite number of experiments.)

Consider a variable x whose sample space consists of M values, denoted {x_1, x_2, ..., x_M}. The probability P_i that the variable will assume the value x_i satisfies

0 ≤ P_i ≤ 1,  i = 1, 2, ..., M

The sum of the probabilities for selecting each individual ball must equal 1:

P_1 + P_2 + ... + P_M = Σ_{i=1}^{M} P_i = 1

Consider the probability associated with a given outcome for a series of experiments, i.e., the event probability. Imagine tossing a coin four times. What is the probability that at least two heads are observed after four tosses?

Figure 1. Potential outcomes after tossing a coin four times. Red signifies heads and blue signifies tails.

Of the 16 equally likely outcomes, 11 contain at least two heads, so the probability is 11/16.

In general, the probability P_E that the outcome or event of interest, E, occurs is the number of outcomes E corresponding to that event divided by the total number N of outcomes in the sample space:

P_E = E / N
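A minimal Python sketch that enumerates the 16 equally likely outcomes confirms the 11/16 result:

    from itertools import product

    # Enumerate all 2**4 = 16 equally likely outcomes of four coin tosses
    outcomes = list(product("HT", repeat=4))
    at_least_two_heads = [o for o in outcomes if o.count("H") >= 2]

    print(len(at_least_two_heads), "of", len(outcomes))   # 11 of 16
    print(len(at_least_two_heads) / len(outcomes))        # 0.6875 = 11/16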

The Fundamental Counting Principle

For a series of manipulations {M_1, M_2, ..., M_j} having {n_1, n_2, ..., n_j} ways to accomplish each manipulation, the total number of ways to perform the entire series of manipulations, Total_M, is the product of the number of ways to perform each manipulation, under the assumption that the manipulations are independent:

Total_M = (n_1)(n_2)(n_3)...(n_j)

Example: Assemble 30 students in a line. How many arrangements of the students are possible? The total number of ways is

W = 30 × 29 × 28 × ... × 1 = 30! ≈ 2.65 × 10^32

Example: How many five-card arrangements are possible from a standard deck of 52 cards?
Solution: Total_M = (n_1)(n_2)(n_3)(n_4)(n_5) = (52)(51)(50)(49)(48) = 311,875,200
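Both counts are easy to verify with a short Python check (standard library only):

    import math

    # Line of 30 students: 30 * 29 * ... * 1 = 30!  (about 2.65e32)
    print(math.factorial(30))

    # Five ordered cards drawn from a 52-card deck: 52 * 51 * 50 * 49 * 48
    print(52 * 51 * 50 * 49 * 48)   # 311875200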

Permutations

How many permutations are possible if only a subset of the objects is employed in constructing the permutation? P(n, j) is the number of permutations possible using a subset of j objects from the total group of n:

P(n, j) = n(n − 1)(n − 2)...(n − j + 1)
        = [n(n − 1)(n − 2)...(2)(1)] / [(n − j)(n − j − 1)...(2)(1)]
        = n! / (n − j)!

Example: The coach of a basketball team has 12 players on the roster, but can only field 5 players at one time. How many 5-player arrangements are possible using the 12-player roster?
Solution: P(n, j) = P(12, 5) = 12! / (12 − 5)! = 95,040
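A Python sketch of the same count (math.perm requires Python 3.8 or later):

    import math

    # Ordered arrangements of 5 players chosen from a 12-player roster: 12!/(12 - 5)!
    print(math.perm(12, 5))                               # 95040
    print(math.factorial(12) // math.factorial(12 - 5))   # same result, directly from the formula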

Configurations

Permutations = the number of ordered arrangements; configurations = the number of unordered arrangements.

Figure 2. Illustration of configurations and permutations using four colored balls. The left-hand column presents the four possible three-color configurations, and the right-hand column presents the six permutations corresponding to each configuration.

C(n, j) = the number of configurations that are possible using a subset of j objects from a total number of n objects:

C(n, j) = P(n, j) / j! = n! / [j!(n − j)!]

Example: How many possible 5-card combinations or hands are there from a standard 52-card deck?
Solution: C(52, 5) = 52! / [5!(52 − 5)!] = 2,598,960
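The corresponding Python check, dividing the permutation count by the j! equivalent orderings:

    import math

    # Configurations (unordered hands): C(52, 5) = 52!/(5! * 47!)
    print(math.comb(52, 5))                        # 2598960
    print(math.perm(52, 5) // math.factorial(5))   # permutations / 5! gives the same count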

Binomial Probabilities

Define the complement of P_E as the probability of an outcome other than that associated with the event of interest, denoted P_EC:

P_E + P_EC = 1

Bernoulli trial: the outcome of a given experiment will be either a success (i.e., the outcome of interest) or a failure (i.e., not the outcome of interest).
Binomial experiment: a collection of Bernoulli trials.

The probability of observing heads every time when a coin is tossed four times is P_E = (1/2)^4 = 1/16.

The probability of obtaining j successes in a binomial experiment consisting of n Bernoulli trials, in which the probability of success for a single trial is P_E, is

P(j) = C(n, j) (P_E)^j (1 − P_E)^(n−j) = [n! / (j!(n − j)!)] (P_E)^j (1 − P_E)^(n−j)

where C(n, j) is the number of configurations that are possible using a subset of j successes in n trials.

Example: Toss a coin 50 times. What are the probabilities of having the coin land heads up 10 times and 25 times?
Solution: Using P(j) = C(n, j) (P_E)^j (1 − P_E)^(n−j) with P_E = 1/2,

P(10) = C(50, 10) (1/2)^10 (1/2)^40 = [50! / (10! 40!)] (1/2)^50 ≈ 9.1 × 10^−6

P(25) = C(50, 25) (1/2)^25 (1/2)^25 = [50! / (25! 25!)] (1/2)^50 ≈ 0.11
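A small Python function for P(j), assuming a fair coin (P_E = 0.5), reproduces both numbers:

    import math

    def binomial_probability(n, j, p_e=0.5):
        # P(j) = C(n, j) * p_e**j * (1 - p_e)**(n - j)
        return math.comb(n, j) * p_e**j * (1 - p_e)**(n - j)

    print(f"{binomial_probability(50, 10):.2e}")   # ~9.1e-06
    print(f"{binomial_probability(50, 25):.3f}")   # ~0.112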

Stirling's Approximation

Calculations involving factorials quickly become extremely large; for example, 100! = 9.3 × 10^157. We therefore need an approximation, Stirling's approximation:

ln N! ≈ N ln N − N

Derivation:

ln N! = ln[(N)(N − 1)(N − 2)...(2)(1)]
      = ln N + ln(N − 1) + ln(N − 2) + ... + ln 2 + ln 1
      = Σ_{n=1}^{N} ln n
      ≈ ∫_1^N ln n dn = N ln N − N − (1 ln 1 − 1) ≈ N ln N − N
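A quick numerical comparison of ln N! with N ln N − N shows why the approximation is reserved for large N:

    import math

    # Compare exact ln(N!) with Stirling's approximation N ln N - N
    for N in (10, 100, 1000):
        exact = math.lgamma(N + 1)          # ln(N!)
        stirling = N * math.log(N) - N
        print(N, f"relative error {abs(exact - stirling) / exact:.1%}")
        # error drops from about 14% at N = 10 to below 0.1% at N = 1000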

Probability Distribution Functions

Probability of observing a given number of heads after tossing a coin 50 times (P_E = 0.5):

Number of Heads    Probability        Number of Heads    Probability
0                  8.88 × 10^−16      30                 0.042
1                  4.44 × 10^−14      35                 2.00 × 10^−3
2                  1.09 × 10^−12      40                 9.1 × 10^−6
5                  1.88 × 10^−9       45                 1.88 × 10^−9
10                 9.1 × 10^−6        48                 1.09 × 10^−12
15                 2.00 × 10^−3       49                 4.44 × 10^−14
20                 0.042              50                 8.88 × 10^−16
25                 0.11

This information can be presented graphically by plotting the probability as a function of outcome.

Figure 3. Plot of the probability of the number of heads being observed after flipping a coin 50 times. The red curve represents the distribution of probabilities for P_E = 0.5, the blue curve for P_E = 0.3, and the yellow curve for P_E = 0.7.

In general, the probability of observing j successful trials out of n total trials is

P(j) = [n! / (j!(n − j)!)] (P_E)^j (1 − P_E)^(n−j),   j = 0, 1, 2, ..., n
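The table above and the curves of Figure 3 can be regenerated from P(j); a minimal sketch, assuming matplotlib is available for the plot:

    import math
    import matplotlib.pyplot as plt

    n = 50
    heads = range(n + 1)
    # One curve per event probability, matching the three curves of Figure 3
    for p_e, color in [(0.5, "red"), (0.3, "blue"), (0.7, "gold")]:
        probs = [math.comb(n, j) * p_e**j * (1 - p_e)**(n - j) for j in heads]
        plt.plot(heads, probs, color=color, label=f"P_E = {p_e}")
    plt.xlabel("Number of heads, j")
    plt.ylabel("P(j)")
    plt.legend()
    plt.show()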

A probability distribution function f_i represents the relative probability of a variable x_i having a given value, with the probability given by

P(x_i) = C f_i

where C is a normalization constant. Because the probabilities must sum to 1,

Σ_{i=1}^{M} P(x_i) = Σ_{i=1}^{M} C f_i = C f_1 + C f_2 + ... + C f_M = C Σ_{i=1}^{M} f_i = 1

so that C = 1 / Σ_{i=1}^{M} f_i, and therefore

P(x_i) = f_i / Σ_{i=1}^{M} f_i
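In computational terms the normalization amounts to dividing each weight by the sum of all weights; a minimal sketch with an arbitrary, made-up set of weights f_i:

    # Arbitrary unnormalized weights f_i (hypothetical values for illustration)
    f = [1.0, 2.0, 5.0]

    C = 1 / sum(f)              # normalization constant
    P = [C * fi for fi in f]    # P(x_i) = f_i / sum of f

    print(P)                    # [0.125, 0.25, 0.625]
    print(sum(P))               # 1.0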

Probability Distributions Involving Discrete and Continuous Variables

If the variable is continuous, P(x) dx is the probability that the variable has a value in the range x to x + dx:

P(x) dx = C f(x) dx

where f(x) is a distribution function not yet specified. Normalization over the full range of x requires

∫ P(x) dx = C ∫ f(x) dx = 1,   so   C = 1 / ∫ f(x) dx

and therefore

P(x) dx = C f(x) dx = f(x) dx / ∫ f(x) dx

Characterizing Distribution Functions: Average Values

Consider a function g(x) whose value depends on x. Its average value is

⟨g(x)⟩ = Σ_{i=1}^{M} g(x_i) P(x_i) = Σ_{i=1}^{M} g(x_i) f_i / Σ_{i=1}^{M} f_i

Distribution moments: ⟨x^n⟩

⟨x⟩: the first moment of the distribution function
⟨x²⟩: the second moment of the distribution function
⟨x²⟩^(1/2): the root-mean-squared (rms) value

Example: P(x) = C e^(−ax), 0 ≤ x < ∞. Are the mean and rms values for this distribution the same?
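A sketch of the answer: normalizing over 0 ≤ x < ∞ gives C = a, so ⟨x⟩ = ∫_0^∞ x a e^(−ax) dx = 1/a, while ⟨x²⟩ = ∫_0^∞ x² a e^(−ax) dx = 2/a². The rms value is therefore ⟨x²⟩^(1/2) = √2/a, which is not equal to the mean; the two differ by a factor of √2.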

Variance

The variance, σ², is a measure of the width of a distribution, defined as the average squared deviation from the mean of the distribution:

σ² = ⟨(x − ⟨x⟩)²⟩ = ⟨x² − 2x⟨x⟩ + ⟨x⟩²⟩

Note that averaging is linear: ⟨b(x) + d(x)⟩ = ⟨b(x)⟩ + ⟨d(x)⟩ and ⟨c b(x)⟩ = c⟨b(x)⟩. Therefore

σ² = ⟨x²⟩ − 2⟨x⟩⟨x⟩ + ⟨x⟩² = ⟨x²⟩ − ⟨x⟩²
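As a quick numerical illustration, for a single roll of a fair six-sided die ⟨x⟩ = 3.5 and ⟨x²⟩ = (1 + 4 + 9 + 16 + 25 + 36)/6 = 91/6 ≈ 15.17, so σ² = ⟨x²⟩ − ⟨x⟩² ≈ 15.17 − 12.25 = 2.92.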

Gaussian distribution: the bell-shaped curve

P(x) dx = [1 / (2πσ²)^(1/2)] e^(−(x − δ)²/2σ²) dx

where δ is the mean value of x and σ is the width of the distribution.

Figure 4. The influence of variance on Gaussian probability distribution functions. Notice that an increase in the variance corresponds to an increase in the width of the distribution.
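A standard property of the Gaussian distribution, easily checked numerically with the error function, is that a fixed fraction of the total probability lies within a given number of widths σ of the mean:

    import math

    # Fraction of a Gaussian distribution lying within +/- k*sigma of the mean
    for k in (1, 2, 3):
        print(k, f"{math.erf(k / math.sqrt(2)):.4f}")   # 0.6827, 0.9545, 0.9973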

Example: For the distribution P(x) = C e^(−ax²), what is σ²?
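A sketch of the solution, assuming the variable ranges over −∞ < x < ∞: normalization gives C = (a/π)^(1/2); by symmetry ⟨x⟩ = 0, and ⟨x²⟩ = ∫ x² C e^(−ax²) dx = 1/(2a), so σ² = ⟨x²⟩ − ⟨x⟩² = 1/(2a). Comparing with the Gaussian form above, this corresponds to a = 1/(2σ²).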