
Physics 433: Computational Physics, Lecture 6 - Random Number Distributions

Generation of random numbers from various distribution functions: normal distribution, Poisson distribution, arbitrary functions.

Random Number Generation Algorithms

Uniform deviate: linear congruential algorithm, I_new = (a I_old + b) mod M
- Random integers I range in value from 0 to M - 1
- Scale by M to get real random numbers R in [0,1)
- M large; the choice of a < M, b < M is critical

Other functional forms of distributions
Triangular distribution: y = R_1 + R_2, with
  P_y(y) = y for 0 <= y <= 1, and P_y(y) = 2 - y for 1 <= y <= 2
Uniform deviates can be used to generate random deviates that fit other probability distribution functions (p.d.f.s).
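A minimal sketch of the linear congruential recurrence above. The slide only requires a < M and b < M; the specific constants a = 1664525, b = 1013904223, M = 2^32 used here are one well-known choice, not values from the lecture.

```cpp
#include <cstdint>

// Linear congruential generator: I_new = (a * I_old + b) mod M.
// With M = 2^32, the "mod M" happens implicitly via uint32_t wraparound.
struct Lcg {
    uint32_t state;
    explicit Lcg(uint32_t seed) : state(seed) {}

    // Random integer in [0, M - 1].
    uint32_t next_int() {
        state = 1664525u * state + 1013904223u;  // mod 2^32 is implicit
        return state;
    }

    // Scale by M to get a real random number R in [0, 1).
    double next_uniform() {
        return next_int() / 4294967296.0;  // 2^32
    }
};
```

Successive calls walk the full recurrence, so the same seed always reproduces the same sequence.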

Sums of Uniform Deviates

double uniformdistribution()
{ return ((double) random()) / 2147483648.0; }   // random() returns values in [0, 2^31 - 1]

double sum2r()   // triangular distribution
{ return (uniformdistribution() + uniformdistribution()); }

double sum3r()
{ return (uniformdistribution() + uniformdistribution() + uniformdistribution()); }

double sum4r()
{ return (uniformdistribution() + uniformdistribution() +
          uniformdistribution() + uniformdistribution()); }

...

For large n, the sum of n uniform deviates becomes a Gaussian (normal) distribution - the Central Limit Theorem (CLT).

Central Limit Theorem

Convolutions of any distribution tend toward the normal (Gaussian) distribution.

Probability Distributions

Probability Density Function (p.d.f.): a functional form that describes the probability density.

Uniform density - all values in the interval are equally probable:
  P_R(R) = 1 for 0 <= R <= 1, and 0 otherwise

Exponential density - the probability falls off exponentially as the random variable value increases:
  P_u(u) = (1/lambda) e^(-u/lambda), 0 <= u

The integral of the probability density over the complete range is unity.

Cumulative Distribution Function (c.d.f.): the integral of the probability between the minimum and a point in the range:
  F_x(x) = integral from a to x of P_x(y) dy, where P_x(x) is the p.d.f. and a <= x < b,
i.e., F_x(a) = 0 and F_x(b) = 1.

Inversion Technique

p.d.f. - c.d.f. mapping: map the interval [a,b) to [0,1).
Where the slope of F is small, the interval dR maps to a large interval dx, so a uniform deviate interval yields the correct F(x).
One needs to evaluate F_x(x) and invert it to obtain x = F_x^(-1)(R).

Inversion Technique - II

Applicable for simple distribution functions:
- Use the normalized distribution function (p.d.f.)
- Integrate the p.d.f. P(x) analytically to obtain the c.d.f. F(x), i.e., the probability of choosing a value less than x
- Equate this to a uniform random variable R and solve for x; the result is distributed according to the original p.d.f.
The method is fully efficient, since each uniformly distributed random number R gives an x value.
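The three steps above, worked for a p.d.f. of my own choosing (not one from the slides): P(x) = 2x on [0,1]. Integrating gives F(x) = x^2; setting F(x) = R and solving gives x = sqrt(R), which is distributed according to P(x).

```cpp
#include <cstdlib>
#include <cmath>

// Uniform deviate in [0, 1).
double uniform_deviate() {
    return std::rand() / (RAND_MAX + 1.0);
}

// Inversion for the illustrative p.d.f. P(x) = 2x on [0, 1]:
// c.d.f. F(x) = x^2, so x = F^{-1}(R) = sqrt(R).
double linear_ramp_deviate() {
    return std::sqrt(uniform_deviate());
}
```

Every uniform deviate R produces exactly one x, which is what "fully efficient" means on the slide.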

Exponential Distribution

An invertible distribution. Exponential distribution:
  P_u(u) = (1/lambda) e^(-u/lambda), where 0 <= u
  mean <u> = lambda, variance <u^2> - <u>^2 = lambda^2

New random variable: R = 1 - e^(-u/lambda), so dR = (1/lambda) e^(-u/lambda) du and
  P_u(u) du = 1 . dR = P_R(R) dR,
i.e., R is distributed uniformly.

An exponential distribution with mean and standard deviation lambda is obtained from uniformly distributed random variables R simply by computing u = -lambda ln(1 - R).

double exponentialdistribution(double lambda = 1.)
{ return (-lambda * log(1. - uniformdistribution())); }

Central Limit Theorem - II

The central limit theorem works with the exponential distribution as well!

Normal Distribution

Normal or Gaussian distribution:
  P_x(x) dx = (1 / (sqrt(2 pi) sigma)) e^(-(x - mu)^2 / (2 sigma^2)) dx,  mean mu, standard deviation sigma

Convolutions of other distributions lead to this - CLT.
But this distribution is also invertible, using a simple trick. Take the product of two independent Gaussians:
  P_x(x) P_y(y) dx dy = (1 / (2 pi sigma^2)) e^(-[(x - mu)^2 + (y - mu)^2] / (2 sigma^2)) dx dy

Changing to polar coordinates, x = rho cos(theta) + mu, y = rho sin(theta) + mu:
  P_rho(rho) P_theta(theta) rho drho dtheta = (1 / (2 pi sigma^2)) e^(-rho^2 / (2 sigma^2)) rho drho dtheta

Another change of variables, u = rho^2 / (2 sigma^2), R = theta / (2 pi):
  P_u(u) P_R(R) du dR = (e^(-u) du)(1 . dR)

The normal distribution is thus obtained by using exponential and uniform distributions.

Normal Distribution

Generation of the normal distribution uses exponential and uniformly distributed random numbers via the inversion formula x = sigma sqrt(2u) cos(2 pi R) + mu:

double normaldistribution(double mean = 0., double sd = 1.)
{ return (sd * sqrt(2. * exponentialdistribution()) * cos(2. * pi * uniformdistribution()) + mean); }
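A self-contained version of the generator chain above (uniform -> exponential -> normal), with the helpers spelled out so it compiles on its own; the pi constant is written explicitly rather than relying on a non-standard M_PI.

```cpp
#include <cstdlib>
#include <cmath>

const double kPi = 3.14159265358979323846;

// Uniform deviate in [0, 1).
double uniform_deviate() {
    return std::rand() / (RAND_MAX + 1.0);
}

// Exponential deviate with lambda = 1: u = -ln(1 - R).
double exponential_deviate() {
    return -std::log(1.0 - uniform_deviate());
}

// Normal deviate via the polar-coordinate trick from the slide:
// rho = sd * sqrt(2u), theta = 2 * pi * R, x = rho * cos(theta) + mean.
double normal_deviate(double mean, double sd) {
    double rho = sd * std::sqrt(2.0 * exponential_deviate());
    double theta = 2.0 * kPi * uniform_deviate();
    return rho * std::cos(theta) + mean;
}
```

The sample mean and standard deviation of many deviates should reproduce the requested mu and sigma.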

Discrete Distributions

Coin flip - two outcomes, head or tail: P(head) = P(tail) = 0.5.
Given a uniform deviate R in [0,1): if R < 0.5, head; otherwise tail.

Die roll - six outcomes, 1-6: P(1) = P(2) = P(3) = P(4) = P(5) = P(6) = 1/6.
Given a uniform deviate R in [0,1): if R < 1/6, x = 1; if 1/6 <= R < 1/3, x = 2; ...

Binomial distribution (with two equal-probability outcomes)
Probability of getting k heads when flipping N coins:
  P_k(k) = [N! / (k! (N - k)!)] (1/2)^N, where k = 0, 1, ..., N
Algorithm: simply use the coin-flip algorithm for N trials.

Any discrete distribution
The algorithm for selecting a random variable i with distribution P_i(i) is to find the value of i that satisfies:
  sum_{j=a}^{i-1} P_i(j) <= R < sum_{j=a}^{i} P_i(j), with the sum up to a - 1 defined as 0

Discrete Distributions - II

Discrete cumulative distribution:
  F_i(i) = sum_{j=a}^{i} P_i(j)

Compute F_i(i) just once for all i; then, for each random number requested, pick the value based on a uniform deviate R in [0,1):
  F_i(i - 1) <= R < F_i(i)

This is more efficient - even better if the search algorithm is O(ln N).
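A sketch of the table-lookup method above: precompute the cumulative sums once, then locate the first i with R < F_i(i). std::upper_bound on the sorted table provides the O(ln N) search mentioned on the slide. The fair-die probabilities in the test are just an example distribution.

```cpp
#include <vector>
#include <algorithm>

// Table-lookup sampler for an arbitrary discrete distribution P_i.
struct DiscreteSampler {
    std::vector<double> cdf;  // F_i(i): running sums of P_i(j)

    explicit DiscreteSampler(const std::vector<double>& p) {
        double s = 0.0;
        for (double pi : p) cdf.push_back(s += pi);
    }

    // Given a uniform deviate r in [0, 1), return the first index i
    // with r < F_i(i); binary search makes this O(ln N).
    int sample(double r) const {
        return (int)(std::upper_bound(cdf.begin(), cdf.end(), r) - cdf.begin());
    }
};
```

Building the table costs O(N) once; every subsequent deviate costs only a binary search.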

Poisson Distribution

Example: radioactive decay.
The probability to observe one decay in time Delta-t is p = beta Delta-t, where beta = alpha N, N is the total number of parent nuclei, and Delta-t is small compared to the time T in which observations are made, i.e., beta Delta-t << 1.

The probability of observing n decays in time T, with m = T / Delta-t trials, is binomial:
  P = [m! / ((m - n)! n!)] p^n (1 - p)^(m - n)

In the limit Delta-t -> 0 (i.e., m -> infinity), with mu = beta T and n << m:
  (1 - beta T / m)^(m - n) -> e^(-beta T),  m! / (m - n)! -> m^n
so
  P = (mu^n / n!) e^(-mu),  the Poisson distribution, with mean = mu and variance = mu.

If n events are observed, the standard deviation is sqrt(n).

Examples: number of observed events when the efficiency is constant; number of entries in a histogram bin.

Poisson vs Gaussian

The Gaussian (normal) distribution is a reasonable approximation of the Poisson distribution even for mu as small as 5. The Poisson distribution is asymmetric for small mu.

Acceptance-Rejection Technique

Generation of random numbers from an arbitrary distribution function P_x(x):
- Use a trial number x_try chosen at random
- Accept it with probability proportional to P_x(x_try)

1. Determine P_x^max >= P_x(x)
2. Select x_try = a + (b - a) R_1
3. Compute P_x(x_try)
4. Accept x_try as the generated random number if P_x(x_try) / P_x^max >= R_2

The method also yields the integral:
  I = integral from a to b of P(x) dx ~ (n_accept / n_try) P_x^max (b - a)

But the method is inefficient in generating a random number series if there are sharp peaks in P_x(x), because many trial events will not be accepted.
