Generating Random Variates 2 (Chapter 8, Law)

B. Maddah    ENMG 6 Simulation    /5/08

Generating Random Variates (Chapter 8, Law)

Generating random variates from U(a, b)

Recall that a random variable X which is uniformly distributed on the interval [a, b], X ~ U(a, b), has the distribution function

    F(x) = 0,                  if x < a,
    F(x) = (x - a)/(b - a),    if a <= x <= b,
    F(x) = 1,                  if x > b.

To apply the inverse transform method, set u = F(x) for a <= x <= b and solve for x:

    u = (x - a)/(b - a)  =>  x = a + u(b - a).

The algorithm for generating X ~ U(a, b) is then:

1. Generate U ~ U(0, 1).
2. Set X = a + U(b - a).

Generating random variates from triang(a, b, m)

A r.v. X with a triangular distribution, X ~ triang(a, b, m), where a <= m <= b is the mode, has the following distribution function:

    F_X(x) = (x - a)^2 / [(b - a)(m - a)],      if a <= x <= m,
    F_X(x) = 1 - (b - x)^2 / [(b - a)(b - m)],  if m <= x <= b.

Fact. If X ~ triang(a, b, m), then X = a + (b - a)Y, where Y ~ triang(0, 1, m') and m' = (m - a)/(b - a).

Proof. Note that

    P{Y <= y} = y^2 / m',                  if 0 <= y <= m',
    P{Y <= y} = 1 - (1 - y)^2 / (1 - m'),  if m' <= y <= 1.

Then,

    P{a + (b - a)Y < x} = P{Y < (x - a)/(b - a)}

    = [(x - a)/(b - a)]^2 / m',                if 0 <= (x - a)/(b - a) <= m',
    = 1 - [1 - (x - a)/(b - a)]^2 / (1 - m'),  if m' <= (x - a)/(b - a) <= 1.

Substituting m' = (m - a)/(b - a), the conditions become a <= x <= m and m <= x <= b, and simplifying gives

    = (x - a)^2 / [(b - a)(m - a)],      if a <= x <= m,
    = 1 - (b - x)^2 / [(b - a)(b - m)],  if m <= x <= b,

which equals P{X < x}.

To generate Y using the inverse transform method, solve

    u = F_Y(y) = y^2 / m',                  if 0 <= y <= m',
    u = F_Y(y) = 1 - (1 - y)^2 / (1 - m'),  if m' <= y <= 1.

Solving for y gives

    y = sqrt(m' u),                 if 0 <= u <= m',
    y = 1 - sqrt((1 - u)(1 - m')),  if m' <= u <= 1.

Finally, this is the algorithm for generating triang(a, b, m):

1. Set m' = (m - a)/(b - a).
2. Generate U ~ U(0, 1).
3. If U < m', set Y = sqrt(m' U). Otherwise, set Y = 1 - sqrt((1 - U)(1 - m')).
4. Set X = a + (b - a)Y.

Generating random variates from N(μ, σ^2)

Recall that if X ~ N(μ, σ^2), then X = μ + σZ, where Z ~ N(0, 1). Once Z is generated, X can be generated as follows:

1. Generate Z ~ N(0, 1).
2. Set X = μ + σZ.

In the following we present two algorithms for generating Z. The first algorithm is based on the central limit theorem, which implies that if the U_i are iid U(0, 1), then

    Σ_{i=1}^{12} U_i - 6  ≈  N(0, 1).
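The uniform and triangular inverse-transform algorithms above can be sketched in Python (the function names here are illustrative, not from the text):

```python
import math
import random

def uniform_variate(a, b):
    """Inverse transform for U(a, b): X = a + U(b - a)."""
    u = random.random()            # U ~ U(0, 1)
    return a + u * (b - a)

def triangular_variate(a, b, m):
    """Inverse transform for triang(a, b, m), mode m, a <= m <= b."""
    m1 = (m - a) / (b - a)         # m' = (m - a)/(b - a)
    u = random.random()
    if u < m1:
        y = math.sqrt(m1 * u)      # left piece: y = sqrt(m' u)
    else:                          # right piece: y = 1 - sqrt((1 - u)(1 - m'))
        y = 1.0 - math.sqrt((1.0 - u) * (1.0 - m1))
    return a + (b - a) * y         # rescale Y ~ triang(0, 1, m') to [a, b]
```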

This (approximate) algorithm works as follows:

1. Generate U_1, U_2, ..., U_12 ~ U(0, 1).
2. Set Z = Σ_{i=1}^{12} U_i - 6.

The second (Box-Muller) algorithm is based on the following fact.

Fact. Let Z_1 and Z_2 be independent N(0, 1) rvs, and let

    R^2 = Z_1^2 + Z_2^2,    θ = tan^{-1}(Z_2 / Z_1).

Then R^2 is an exponential rv with mean 2 (R^2 ~ exp(1/2)), and θ ~ U(0, 2π).

[Figure: the point (Z_1, Z_2) in polar coordinates (R, θ).]

Noting that Z_1 = R cos(θ) and Z_2 = R sin(θ), pairs of Z can be generated as follows:

1. Generate U_1, U_2 ~ U(0, 1).
2. Set Z_1 = sqrt(-2 ln(U_1)) cos(2π U_2) and Z_2 = sqrt(-2 ln(U_1)) sin(2π U_2).
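Both normal generators can be sketched as follows (illustrative names; the CLT version is approximate, while Box-Muller is exact):

```python
import math
import random

def z_clt():
    """Approximate N(0, 1): sum of 12 iid U(0, 1) variates minus 6."""
    return sum(random.random() for _ in range(12)) - 6.0

def z_box_muller():
    """Exact pair of independent N(0, 1) variates via Box-Muller."""
    u1, u2 = random.random(), random.random()
    # R = sqrt(-2 ln U); 1 - u1 is also U(0, 1) and avoids log(0)
    r = math.sqrt(-2.0 * math.log(1.0 - u1))
    return r * math.cos(2.0 * math.pi * u2), r * math.sin(2.0 * math.pi * u2)

def normal_variate(mu, sigma):
    """X = mu + sigma * Z, using one Box-Muller deviate."""
    z, _ = z_box_muller()
    return mu + sigma * z
```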

Generating random variates from LN(μ, σ^2)

A rv X is said to have a lognormal distribution, X ~ LN(μ, σ^2), if Y = ln(X) ~ N(μ, σ^2). Then X can be generated as follows:

1. Generate Y ~ N(μ, σ^2).
2. Set X = e^Y.

Generating random variates from gamma(α, β)

A rv X with a gamma distribution, X ~ gamma(α, β), has the following density and distribution function:

    f_X(x) = β^{-α} x^{α-1} e^{-x/β} / Γ(α),  x > 0,

    F_X(x) = ∫_0^x β^{-α} t^{α-1} e^{-t/β} / Γ(α) dt,

where

    Γ(α) = ∫_0^∞ t^{α-1} e^{-t} dt.

Since F_X(x) generally has no closed form, the inverse transform method cannot be applied except numerically.

When α is an integer, X is an Erlang r.v. representing the sum of α exponential random variables, each having mean β; i.e., X ~ α-Erlang(1/β). In this case the convolution method for generating Erlang variates can be used. If α is not an integer, or α is a large integer, the acceptance-rejection method can be used.
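The lognormal transformation and the convolution method for integer α can be sketched as follows (a minimal sketch using Python's standard-library generators random.gauss and random.expovariate):

```python
import math
import random

def lognormal_variate(mu, sigma):
    """X ~ LN(mu, sigma^2): generate Y ~ N(mu, sigma^2), set X = e^Y."""
    y = random.gauss(mu, sigma)
    return math.exp(y)

def erlang_variate(alpha, beta):
    """Convolution method for gamma(alpha, beta) with integer alpha:
    X is the sum of alpha iid exponentials with mean beta (rate 1/beta)."""
    return sum(random.expovariate(1.0 / beta) for _ in range(alpha))
```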

Since X ~ gamma(α, β) can be written as X = βY, where Y ~ gamma(α, 1), the focus is on generating Y. To generate Y via acceptance-rejection, consider two cases: α < 1 and α > 1. (If α = 1, then Y ~ exp(1).)

If α < 1, then the following majorizing function is used:

    t(x) = x^{α-1} / Γ(α),  if 0 < x <= 1,
    t(x) = e^{-x} / Γ(α),   if x > 1.

[Figure: the density f_X(x) and the majorizing function t(x) for α < 1.]

If α > 1, then the following majorizing function is used:

    t(x) = c λ μ x^{λ-1} / (μ + x^λ)^2,

where λ = sqrt(2α - 1), c = 4 α^α e^{-α} / (λ Γ(α)), and μ = α^λ.
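For the α < 1 case, a standard acceptance-rejection scheme built on this two-piece majorizing function is the Ahrens-Dieter method; the sketch below assumes its usual textbook form rather than the exact steps in Law:

```python
import math
import random

def gamma_variate_small_alpha(alpha):
    """Acceptance-rejection for gamma(alpha, 1) with 0 < alpha < 1
    (Ahrens-Dieter), using t(x) ~ x^(alpha-1) on (0, 1] and e^(-x) on (1, inf)."""
    assert 0.0 < alpha < 1.0
    b = (math.e + alpha) / math.e          # normalizing constant of t(x)
    while True:
        p = b * random.random()            # pick a piece of t(x) at random
        u = random.random()
        if p <= 1.0:                       # candidate from the x^(alpha-1) piece
            y = p ** (1.0 / alpha)
            if u <= math.exp(-y):          # accept with probability f(y)/t(y)
                return y
        else:                              # candidate from the e^(-x) piece
            y = -math.log((b - p) / alpha)
            if u <= y ** (alpha - 1.0):    # accept with probability f(y)/t(y)
                return y
```

For general β, return beta * gamma_variate_small_alpha(alpha), using X = βY as noted above.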

[Figure: the density f_X(x) and the majorizing function t(x) for α > 1.]

Generating from the density function r(x) based on t(x) in both cases can be done with the inverse transform method (see Law for details).

Generating random variates from beta(α_1, α_2)

If X ~ beta(α_1, α_2) (see Law for a description), then X can be written as X = Y_1/(Y_1 + Y_2), where Y_1 ~ gamma(α_1, 1) and Y_2 ~ gamma(α_2, 1). Then X can be generated as follows:

1. Generate Y_1 ~ gamma(α_1, 1) and Y_2 ~ gamma(α_2, 1).
2. Set X = Y_1/(Y_1 + Y_2).

Generating random variates from a discrete uniform rv

An integer-valued rv X is said to be discrete uniform if it is equally likely to take on any integer between the integers i and j, i < j.

Note that the pmf of X is

    P{X = k} = 1/(j - i + 1),  k = i, i+1, ..., j.

Applying the inverse transform method gives the following algorithm:

1. Generate U ~ U(0, 1).
2. Set X = i + floor((j - i + 1)U), where floor(x) is the largest integer <= x (e.g., floor(1.3) = 1).

Generating random variates from a Bernoulli rv

Recall that a Bernoulli rv takes on the values 1 or 0 with probabilities p and 1 - p. Applying the inverse transform method gives the following algorithm:

1. Generate U ~ U(0, 1).
2. If U <= p, set X = 1. Otherwise, set X = 0.

Generating random variates from a binomial rv

Recall that if X has a binomial distribution with parameters n and p, then

    X = Σ_{i=1}^{n} Y_i,

where the Y_i are iid Bernoulli rvs with parameter p. Then X can be generated as follows:

1. Generate Y_1, Y_2, ..., Y_n ~ Bernoulli(p).
2. Set X = Σ_{i=1}^{n} Y_i.
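The beta ratio and the three discrete algorithms above can be sketched as follows (illustrative names; random.gammavariate is Python's standard-library gamma generator):

```python
import math
import random

def beta_variate(a1, a2):
    """X = Y1/(Y1 + Y2), Y1 ~ gamma(a1, 1), Y2 ~ gamma(a2, 1)."""
    y1 = random.gammavariate(a1, 1.0)
    y2 = random.gammavariate(a2, 1.0)
    return y1 / (y1 + y2)

def discrete_uniform_variate(i, j):
    """Inverse transform: X = i + floor((j - i + 1)U), equally likely on i..j."""
    return i + math.floor((j - i + 1) * random.random())

def bernoulli_variate(p):
    """Inverse transform: X = 1 if U <= p, else 0."""
    return 1 if random.random() <= p else 0

def binomial_variate(n, p):
    """Convolution: X is the sum of n iid Bernoulli(p) variates."""
    return sum(bernoulli_variate(p) for _ in range(n))
```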

Generating from a geometric rv

The distribution function of the geometric rv is

    F_X(i) = P{X <= i} = 1 - (1 - p)^i,  i = 1, 2, ...

Let q = 1 - p. The inverse transform method can then be applied as follows to generate X:

1. Generate U ~ U(0, 1).
2. If q^i <= 1 - U < q^{i-1}, set X = i.

Note that in step 2, X is the smallest i such that U <= F(i) = 1 - q^i, which implies that X is the smallest i such that q^i <= 1 - U. Solving q^i = 1 - U for i gives i = ln(1 - U)/ln(q). Then, since q^i is decreasing in i, the smallest value of i such that q^i <= 1 - U is

    X = ceil(ln(1 - U)/ln(q)),

where ceil(x) is the smallest integer >= x. This leads to the following algorithm for generating X:

1. Generate U ~ U(0, 1).
2. Set X = ceil(ln(1 - U)/ln(q)).
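The closed-form geometric generator can be sketched as:

```python
import math
import random

def geometric_variate(p):
    """Inverse transform for the geometric rv on {1, 2, ...}:
    X = ceil(ln(1 - U) / ln(q)) with q = 1 - p."""
    q = 1.0 - p
    u = random.random()
    x = math.ceil(math.log(1.0 - u) / math.log(q))
    return max(1, x)    # guard the U = 0 edge case, where the ratio is 0
```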

Generating from a Poisson rv

Recall that the pmf of the Poisson rv with parameter λ is

    p_X(i) = P{X = i} = e^{-λ} λ^i / i!,  i = 0, 1, ...

Note that

    p_X(i+1) = [λ/(i + 1)] p_X(i)    and    F_X(i) = Σ_{j=0}^{i} p_X(j).

Then, since F_X(i) is increasing in i, the smallest value of i for which U <= F_X(i) can be found by first comparing U with F_X(0) = p_X(0) = e^{-λ}, and then, if U > F_X(0), comparing U with F_X(1) = p_X(0) + λp_X(0), and so on. The algorithm works as follows:

1. Set p = e^{-λ}, F = p, i = 0.
2. Generate U ~ U(0, 1).
3. If U < F, set X = i and stop.
4. Set p = pλ/(i + 1), F = F + p, i = i + 1.
5. Go to step 3.
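The five steps above can be sketched as:

```python
import math
import random

def poisson_variate(lam):
    """Inverse transform search for Poisson(lam), walking up the cdf
    with the recursion p(i+1) = lam/(i + 1) * p(i)."""
    p = math.exp(-lam)     # p_X(0)
    f = p                  # F_X(0)
    i = 0
    u = random.random()
    while u >= f:          # stop at the smallest i with U < F_X(i)
        p *= lam / (i + 1)
        f += p
        i += 1
    return i
```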