Ch3. Generating Random Variates with Non-Uniform Distributions


ST4231, Semester I, Ch3. Generating Random Variates with Non-Uniform Distributions

This chapter focuses on methods for generating non-uniform random numbers based on the built-in standard uniform random number generator.

Outline of the chapter:
- Inversion Method
- Rejection Method
- Composition Method
- Polar Method
- Multivariate Random Variable Generation

1 Inversion Method

Proposition 1.1 (The Foundation Theory of the Inversion Method) Let $F$ be the cdf of a random variable, and let $U$ be a random variable with the $U[0,1]$ distribution. Then $F^{-1}(U) \sim F$.

Proof: Let $F_X$ denote the distribution of $X = F^{-1}(U)$. Then
$$F_X(x) = P\{X \le x\} = P\{F^{-1}(U) \le x\}.$$
Since $F$ is a distribution function, $F(x)$ is a monotone increasing function of $x$, so the inequality $a \le b$ is equivalent to the inequality $F(a) \le F(b)$. Hence, from the equation above,
$$F_X(x) = P\{F(F^{-1}(U)) \le F(x)\} = P\{U \le F(x)\} = F(x),$$
since $U$ is uniform on $(0,1)$.

Remark: When $F : \mathbb{R} \to [0,1]$ is continuous and strictly increasing, $F^{-1} : [0,1] \to \mathbb{R}$ is also continuous and strictly increasing. More generally, we only know that $F$ is right-continuous and non-decreasing; in that case we define $F^{-1}$ by
$$F^{-1}(u) = \inf\{z \in \mathbb{R} : F(z) \ge u\}, \quad u \in [0,1].$$

1.1 Discrete Random Number Generators

Suppose we want to generate the value of a discrete random variable $X$ having probability mass function
$$P(X = x_j) = p_j, \quad j = 0, 1, \ldots, \qquad \sum_j p_j = 1.$$

Algorithm:
1. Generate a random number $U$.
2. If $U < p_0$, set $X = x_0$ and stop.
3. If $U < p_0 + p_1$, set $X = x_1$ and stop.
4. If $U < p_0 + p_1 + p_2$, set $X = x_2$ and stop.
5. ...
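
A minimal Python sketch of this discrete inversion algorithm, assuming Python's standard `random` module as the source of $U(0,1)$ draws; the function and argument names are illustrative, not part of the notes.

```python
import random

def discrete_inverse(values, probs):
    """Sample one value from P(X = values[j]) = probs[j] by inversion."""
    u = random.random()          # U ~ U(0, 1)
    cum = 0.0
    for x, p in zip(values, probs):
        cum += p                 # running sum p_0 + ... + p_j
        if u < cum:              # U falls in [F(x_{j-1}), F(x_j))
            return x
    return values[-1]            # guard against round-off in the last interval

# Example: P(X=1)=0.2, P(X=2)=0.15, P(X=3)=0.25, P(X=4)=0.4
sample = discrete_inverse([1, 2, 3, 4], [0.2, 0.15, 0.25, 0.4])
```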

Remark: If the $x_i$, $i \ge 0$, are ordered so that $x_0 < x_1 < \cdots$ and if we let $F$ denote the distribution function of $X$, then
$$F(x_k) = \sum_{i=0}^{k} p_i,$$
and so $X$ will be equal to $x_j$ if
$$F(x_{j-1}) \le U < F(x_j).$$
In other words, after generating a random number $U$, we determine the value of $X$ by finding the interval $[F(x_{j-1}), F(x_j))$ in which $U$ lies [or, equivalently, by finding $F^{-1}(U)$]. It is for this reason that the above is called the inversion method for generating $X$.

Example 1. A Simple Discrete Random Number Generator

Suppose we want to simulate a random variable $X$ such that $p_1 = 0.2$, $p_2 = 0.15$, $p_3 = 0.25$, $p_4 = 0.4$, where $p_j = P(X = j)$. We could generate $U$ and do the following:
If $U < 0.2$, set $X = 1$ and stop.
If $U < 0.35$, set $X = 2$ and stop.
If $U < 0.6$, set $X = 3$ and stop.
Otherwise set $X = 4$.

However, a more efficient procedure (checking the most probable values first) is the following:
If $U < 0.4$, set $X = 4$ and stop.
If $U < 0.65$, set $X = 3$ and stop.
If $U < 0.85$, set $X = 1$ and stop.
Otherwise set $X = 2$.

Example 2. Geometric Random Number Generator

$$P(X = i) = p q^{i-1}, \quad i \ge 1, \text{ where } q = 1 - p.$$
Since
$$\sum_{i=1}^{j-1} P(X = i) = 1 - P(X > j-1) = 1 - P(\text{the first } j-1 \text{ trials are all failures}) = 1 - q^{j-1},$$
we can generate the value of $X$ by generating a random number $U$ and setting $X$ equal to that value $j$ for which
$$1 - q^{j-1} \le U < 1 - q^{j}$$
or, equivalently, for which
$$q^{j} < 1 - U \le q^{j-1}.$$

Generator 1. Thus, we can define $X$ by
$$X = \min\{j : q^j < 1 - U\} = \min\{j : j \log(q) < \log(1-U)\} = \min\Big\{j : j > \frac{\log(1-U)}{\log(q)}\Big\}.$$
Hence, we can express $X$ as
$$X = \mathrm{int}\Big(\frac{\log(1-U)}{\log(q)}\Big) + 1,$$
where $\mathrm{int}(x)$ denotes the integer part of $x$.

Generator 2. We can also write $X$ as
$$X = \mathrm{int}\Big(\frac{\log(U)}{\log(q)}\Big) + 1,$$
because $1 - U$ is also uniformly distributed on $(0,1)$.
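
A short Python sketch of Generator 2, assuming $0 < p < 1$; the function name is illustrative.

```python
import math
import random

def geometric(p):
    """Geometric(p) on {1, 2, ...} via X = int(log(U) / log(q)) + 1 with q = 1 - p."""
    q = 1.0 - p
    u = 1.0 - random.random()   # a U(0,1] draw, avoids log(0)
    return int(math.log(u) / math.log(q)) + 1
```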

Example 3. Poisson Random Number Generator

The random variable $X \sim \mathrm{Poisson}(\lambda)$ has
$$p_i = P(X = i) = e^{-\lambda} \frac{\lambda^i}{i!}, \quad i = 0, 1, \ldots$$
For $p_i$ and $p_{i+1}$, we have the following recursive relationship:
$$p_{i+1} = \frac{\lambda}{i+1}\, p_i, \quad i \ge 0.$$

Generator Algorithm:
1. Generate a random number $U$.
2. Set $i = 0$, $p = e^{-\lambda}$, $F = p$.
3. If $U < F$, set $X = i$ and stop.
4. Set $p = \lambda p / (i+1)$, $F = F + p$, $i = i + 1$.
5. Go to step 3.
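
A direct Python transcription of this Poisson inversion algorithm, presented as a sketch; the name `poisson` is illustrative.

```python
import math
import random

def poisson(lam):
    """Poisson(lam) by inversion, using the recursion p_{i+1} = lam * p_i / (i + 1)."""
    u = random.random()
    i = 0
    p = math.exp(-lam)          # p_0
    F = p                       # F(0)
    while u >= F:
        p = lam * p / (i + 1)   # p_{i+1}
        F += p
        i += 1
    return i
```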

Remark: Based on a property of the Poisson distribution, if $\lambda$ is very large a more efficient generation algorithm is as follows. Let $I = \mathrm{int}(\lambda)$; first generate a random number to determine whether $X$ is larger or smaller than $I$, then search downward starting from $X = I$ in the case where $X \le I$, and upward starting from $X = I + 1$ otherwise.
$$\text{Average number of searches} = 1 + E\big[|X - \lambda|\big] = 1 + \sqrt{\lambda}\, E\Big[\frac{|X - \lambda|}{\sqrt{\lambda}}\Big] \approx 1 + \sqrt{\lambda}\, E[|Z|], \quad \text{where } Z \sim N(0,1),$$
so the expected number of searches grows like $\sqrt{\lambda}$ (since $E[|Z|] = \sqrt{2/\pi} \approx 0.798$) rather than like $\lambda$.

Example 4. Binomial Random Number Generator

$X \sim \mathrm{Binomial}(n, p)$, with
$$P(X = i) = \frac{n!}{i!\,(n-i)!}\, p^i (1-p)^{n-i}, \quad i = 0, 1, \ldots, n.$$
The recursive identity:
$$P(X = i+1) = \frac{n-i}{i+1}\, \frac{p}{1-p}\, P(X = i).$$

Generator Algorithm:
1. Generate a random number $U$.
2. Set $c = p/(1-p)$, $i = 0$, $pr = (1-p)^n$, $F = pr$.
3. If $U < F$, set $X = i$ and stop.
4. Set $pr = \frac{c(n-i)}{i+1}\, pr$, $F = F + pr$, $i = i + 1$.
5. Go to step 3.

Remark: As in the Poisson case, when the mean $np$ is large it is better to first determine whether the generated value is less than or equal to $I = \mathrm{int}(np)$ or larger than $I$, and then search downward or upward accordingly.
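
A Python sketch of the binomial inversion algorithm above; the `i < n` guard is an added safeguard against floating-point round-off, not part of the notes.

```python
import random

def binomial(n, p):
    """Binomial(n, p) by inversion, using
    P(X = i+1) = (n - i)/(i + 1) * p/(1 - p) * P(X = i)."""
    c = p / (1.0 - p)
    u = random.random()
    i = 0
    pr = (1.0 - p) ** n        # P(X = 0)
    F = pr                     # running cdf
    while u >= F and i < n:    # i < n protects against round-off in F
        pr = c * (n - i) / (i + 1) * pr   # P(X = i + 1)
        F += pr
        i += 1
    return i
```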

1.2 Continuous Random Number Generators

Example 1. A Simple Continuous Random Number Generator

Suppose we want to generate $X$ from the distribution
$$F(x) = x^n, \quad 0 < x < 1.$$
Solution: Let $x = F^{-1}(u)$; then $u = F(x) = x^n$ or, equivalently, $x = u^{1/n}$. Hence we have the following algorithm for generating a random variable from $F(x)$:
1. Generate a random number $U \sim U(0,1)$.
2. Set $X = U^{1/n}$.

Example 2. Exponential Random Number Generator

If $X$ is an exponential random variable with rate 1, then its distribution function is given by
$$F(x) = 1 - e^{-x}.$$
Let $x = F^{-1}(u)$; we have $u = F(x) = 1 - e^{-x}$ or, taking logarithms, $x = -\log(1-u)$. Hence we can generate an exponential with rate 1 by generating a random number $U$ and then setting
$$X = F^{-1}(U) = -\log(1-U),$$
or equivalently $X = -\log(U)$, since $1-U$ is also uniform on $(0,1)$. In general, an exponential random variable with rate $\lambda$ can be generated by generating a random number $U$ and setting $X = -\frac{1}{\lambda}\log(U)$.
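
A brief Python sketch of the two continuous inversion examples above ($F(x) = x^n$ and the exponential); function names are illustrative, and a $U(0,1]$ draw is used where a logarithm is taken to sidestep the (measure-zero) chance of $\log(0)$.

```python
import math
import random

def power_rv(n):
    """F(x) = x^n on (0, 1): inversion gives X = U^(1/n)."""
    return random.random() ** (1.0 / n)

def exponential(rate):
    """Exponential with the given rate: X = -(1/rate) * log(U)."""
    u = 1.0 - random.random()        # a U(0,1] draw, avoids log(0)
    return -math.log(u) / rate
```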

Example 3. Gamma Random Number Generator

Suppose we want to generate the value of a gamma$(n, \lambda)$ random variable, with
$$F(x) = \int_0^x \frac{\lambda e^{-\lambda y} (\lambda y)^{n-1}}{(n-1)!}\, dy.$$

Remark 1: It is not possible to give a closed-form expression for the inverse $F^{-1}$, so we cannot use the inverse transform method directly here.

Remark 2: Using the result that a gamma$(n, \lambda)$ random variable $X$ can be regarded as the sum of $n$ independent exponentials, each with rate $\lambda$, we can make use of Example 2 to generate $X$.

Generator Algorithm: Based on the above idea, we can generate a gamma$(n, \lambda)$ random variable by generating $n$ random numbers $U_1, \ldots, U_n$ and then setting
$$X = -\frac{1}{\lambda}\log U_1 - \cdots - \frac{1}{\lambda}\log U_n = -\frac{1}{\lambda}\log(U_1 \cdots U_n),$$
where the use of the identity $\sum_{i=1}^n \log(x_i) = \log(x_1 \cdots x_n)$ is computationally time saving in that it requires only one rather than $n$ logarithm computations.
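
A Python sketch of this gamma generator for integer shape $n$; the function name is illustrative.

```python
import math
import random

def gamma_integer(n, rate):
    """Gamma(n, rate) for integer shape n, as a sum of n independent exponentials:
    X = -(1/rate) * log(U_1 * U_2 * ... * U_n), using a single logarithm."""
    prod = 1.0
    for _ in range(n):
        prod *= 1.0 - random.random()   # a U(0,1] draw, avoids log(0)
    return -math.log(prod) / rate
```

For very large $n$ the product of uniforms can underflow to zero, in which case summing the logarithms (at the cost of $n$ logarithm evaluations) is the safer variant.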

2 Rejection Method

The Problem: Suppose we want to generate a $d$-dimensional random vector $X$ from a known target pdf $f(x)$ on $\mathbb{R}^d$. Assume that $g(x)$ is another pdf on $\mathbb{R}^d$ satisfying two conditions:
- We already know how to generate a random vector $V$ from $g(x)$, and
- There is a constant $\alpha$ such that $f(x) \le \alpha g(x)$ for every $x \in \mathbb{R}^d$.

Then we can apply the following general Rejection Method Algorithm:
(1) Generate $V \sim g$.
(2) Generate $Y \sim U[0, \alpha g(V)]$.
(3) If $Y \le f(V)$, then accept: set $X = V$ and stop. Otherwise, reject: return to step (1).

We call $g$ the trial or proposal pdf.
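
A generic Python sketch of steps (1)-(3), assuming the caller supplies a sampler for the proposal $g$, the two densities, and the bound $\alpha$; all names are illustrative.

```python
import random

def rejection_sample(f, sample_g, g, alpha):
    """Generic rejection sampler: requires f(x) <= alpha * g(x) for all x.
    sample_g() returns one draw V ~ g; the function returns one draw X ~ f."""
    while True:
        v = sample_g()                          # step (1): V ~ g
        y = random.uniform(0.0, alpha * g(v))   # step (2): Y ~ U[0, alpha * g(V)]
        if y <= f(v):                           # step (3): accept
            return v
```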

Theorem 2.1 (The Foundation Theory of the Rejection Method) If $X$ is generated via steps 1-3 of the rejection method above, then $X \sim f$.

Proof: Define the following two subsets of $\mathbb{R}^d \times \mathbb{R}$:
$$S = \{(x, y) : 0 \le y \le \alpha g(x)\} \quad \text{and} \quad B = \{(x, y) : 0 \le y \le f(x)\},$$
which are the regions below the graphs of $\alpha g$ and $f$, respectively. Our first observation is that steps 1 and 2 generate a random point $(V, Y)$ that is uniformly distributed on $S$. Indeed, letting $h(x, y)$ be the joint density of $(V, Y)$, we have
$$h(x, y) = g(x)\, h(y \mid x) = \begin{cases} \dfrac{1}{\alpha} & \text{if } (x, y) \in S, \\ 0 & \text{otherwise.} \end{cases}$$
Let $(V^*, Y^*)$ be the accepted point, i.e., the first $(V, Y)$ that is in $B$. Then our first observation implies that $(V^*, Y^*)$ is uniformly distributed on $B$; i.e., its pdf is identically 1 on $B$ (since the volume of $B$ is 1). Hence, the marginal pdf of $X = V^*$ is
$$k(x) = \int_0^{f(x)} 1\, dy = f(x).$$

Efficiency of the General Rejection Method

For each proposal $(V, Y)$ obtained via steps 1 and 2,
$$P\{(V, Y) \text{ is accepted}\} = \frac{\mathrm{area}(B)}{\mathrm{area}(S)} = \frac{1}{\alpha}.$$
Therefore, the expected number of proposals needed is $\alpha$. In fact, the number of proposals needed has the geometric distribution with parameter $1/\alpha$. Thus, in the interests of efficiency, we would like to choose $g$ so that $\alpha$ is small (i.e., close to 1). Clearly, taking $\alpha = \sup_x f(x)/g(x)$ is the optimal $\alpha$, given $f$ and $g$.

Example 1. Simulate a distribution with probabilities
$$P = \{0.11,\ 0.12,\ 0.09,\ 0.08,\ 0.12,\ 0.10,\ 0.09,\ 0.09,\ 0.10,\ 0.10\}.$$
Whereas one possibility is to use the inverse transform algorithm, a better approach is to use the rejection method with $q$ being the discrete uniform density on $\{1, 2, \ldots, 10\}$, that is, $q_j = 1/10$, $j = 1, 2, \ldots, 10$. Choose $c = \max_j\{p_j / q_j\} = 1.2$, so the algorithm is as follows:
1. Generate a random number $U_1$ and set $Y = \mathrm{int}(10 U_1) + 1$.
2. Generate a second random number $U_2$.
3. If $U_2 \le p_Y / (c\, q_Y) = 10 p_Y / 1.2$, set $X = Y$ and stop. Otherwise return to step 1.
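
A Python sketch of this discrete rejection example; the function name is illustrative.

```python
import random

p = [0.11, 0.12, 0.09, 0.08, 0.12, 0.10, 0.09, 0.09, 0.10, 0.10]
c = 1.2   # max_j p_j / q_j with q_j = 1/10

def discrete_rejection():
    """Rejection sampling of X in {1,...,10} with a discrete uniform proposal."""
    while True:
        y = int(10 * random.random()) + 1              # Y uniform on {1,...,10}
        if random.random() <= p[y - 1] / (c * 0.1):    # accept with prob p_Y / (c * q_Y)
            return y
```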

Example 2. Use the rejection method to generate a random variable having density function
$$f(x) = 20x(1-x)^3, \quad 0 < x < 1.$$

Step 1: Specify the proposal pdf. Since this random variable (Beta(2,4)) is concentrated on the interval $(0,1)$, let us consider the rejection method with $g(x) = 1$, $0 < x < 1$.

Step 2: Find the optimal $c$. To determine the constant $c$, we maximize
$$\frac{f(x)}{g(x)} = 20x(1-x)^3, \qquad \frac{d}{dx}\big(20x(1-x)^3\big) = 20\big[(1-x)^3 - 3x(1-x)^2\big].$$

Step 3: Specify the rejection function. Setting this derivative equal to 0 shows that the maximal value is attained when $x = 1/4$, and thus
$$\frac{f(x)}{g(x)} \le 20\,(0.25)(1 - 0.25)^3 = \frac{135}{64} = c.$$
Hence,
$$\frac{f(x)}{c\, g(x)} = \frac{256}{27}\, x(1-x)^3.$$

Step 4: Write down the generator algorithm. The rejection procedure is as follows:
1. Generate random numbers $U_1$ and $U_2$.
2. If $U_2 \le \frac{256}{27} U_1 (1-U_1)^3$, stop and set $X = U_1$. Otherwise, return to step 1.

The average number of times that step 1 will be performed is $c = 135/64 \approx 2.11$.
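
A Python sketch of this Beta(2,4) rejection sampler; the function name is illustrative.

```python
import random

def beta_2_4():
    """Rejection sampler for f(x) = 20 x (1 - x)^3 on (0,1) with a U(0,1) proposal."""
    while True:
        u1 = random.random()
        u2 = random.random()
        if u2 <= (256.0 / 27.0) * u1 * (1.0 - u1) ** 3:   # f(U1) / (c g(U1)), c = 135/64
            return u1
```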

Example 3. Suppose we want to generate a random variable having the Gamma(1.5, 1) density
$$f(x) = K x^{1/2} e^{-x}, \quad x > 0,$$
where $K = 1/\Gamma(1.5) = 2/\sqrt{\pi}$. Because such a random variable is concentrated on the positive axis and has mean 1.5, it is natural to try the rejection technique with an exponential random variable with the same mean. Hence, let
$$g(x) = \frac{2}{3} e^{-2x/3}, \quad x > 0.$$
We have
$$\frac{f(x)}{g(x)} = \frac{3}{2} K x^{1/2} e^{-x/3}.$$
Maximizing the ratio, we get $c = 3^{3/2}(2\pi e)^{-1/2}$, and
$$\frac{f(x)}{c\, g(x)} = (2e/3)^{1/2}\, x^{1/2} e^{-x/3}.$$
So the algorithm is as follows:
1. Generate a random number $U_1$, and set $Y = -\frac{3}{2} \log U_1$.
2. Generate a random number $U_2$.
3. If $U_2 < (2eY/3)^{1/2} e^{-Y/3}$, set $X = Y$; otherwise, return to step 1.
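
A Python sketch of this Gamma(1.5, 1) rejection sampler; the function name is illustrative.

```python
import math
import random

def gamma_three_halves():
    """Rejection sampler for the Gamma(1.5, 1) density, with an exponential
    proposal of mean 3/2 (rate 2/3)."""
    while True:
        y = -1.5 * math.log(1.0 - random.random())   # Y ~ exponential with mean 3/2
        u2 = random.random()
        if u2 < math.sqrt(2.0 * math.e * y / 3.0) * math.exp(-y / 3.0):
            return y
```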

3 The Composition Method

Take a pdf $f$ and divide the region under the graph of $f$ into a finite number of subregions, say $S_1, \ldots, S_M$, with respective areas $\alpha_1, \ldots, \alpha_M$, so that $\sum_{i=1}^{M} \alpha_i = 1$. To generate $X \sim f$ via the composition method:
(1) Generate $I \in \{1, \ldots, M\}$ with pmf $(\alpha_1, \ldots, \alpha_M)$.
(2) Generate $(V, W)$ uniformly on $S_I$.
(3) Set $X = V$.

One can also describe this method by expressing the target pdf as a mixture of other pdfs $f_1, \ldots, f_M$, that is,
$$f = \sum_{i=1}^{M} \alpha_i f_i.$$
For example, suppose we want to simulate a random variable from the distribution
$$P(X = j) = \alpha P^{(1)}_j + (1 - \alpha) P^{(2)}_j,$$
where $0 < \alpha < 1$. The algorithm is as follows:
1. Generate a random number $U$.
2. If $U < \alpha$, generate $X$ from $P^{(1)}$ and stop.
3. Otherwise ($U \ge \alpha$), generate $X$ from $P^{(2)}$ and stop.

Example 1. Simulate a random variable from the following distribution:
$$p_j = P(X = j) = \begin{cases} 0.05 & \text{for } j = 1, 2, 3, 4, 5, \\ 0.15 & \text{for } j = 6, 7, 8, 9, 10. \end{cases}$$
Note that $p_j = 0.5\, p^{(1)}_j + 0.5\, p^{(2)}_j$, where
$$p^{(1)}_j = 0.1, \quad j = 1, \ldots, 10, \qquad p^{(2)}_j = \begin{cases} 0 & \text{for } j = 1, 2, 3, 4, 5, \\ 0.2 & \text{for } j = 6, 7, 8, 9, 10. \end{cases}$$
The algorithm is as follows:
1. Generate a random number $U_1$.
2. Generate a random number $U_2$.
3. If $U_1 < 0.5$, set $X = \mathrm{int}(10 U_2) + 1$; otherwise, set $X = \mathrm{int}(5 U_2) + 6$.
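
A Python sketch of this composition example; the function name is illustrative.

```python
import random

def composition_example():
    """Composition sampler for p_j = 0.05 (j = 1..5), 0.15 (j = 6..10), written as the
    equal-weight mixture of Uniform{1,...,10} and Uniform{6,...,10}."""
    u1 = random.random()
    u2 = random.random()
    if u1 < 0.5:
        return int(10 * u2) + 1   # draw from p^(1): uniform on {1,...,10}
    return int(5 * u2) + 6        # draw from p^(2): uniform on {6,...,10}
```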

4 The Polar Method for Generating Normal R.V.

Let $X$ and $Y$ be independent standard normal random variables, and let $R$ and $\theta$ denote the polar coordinates of the vector $(X, Y)$. That is,
$$R^2 = X^2 + Y^2, \qquad \tan\theta = \frac{Y}{X}.$$
Since $X$ and $Y$ are independent, their joint density is
$$f(x, y) = \frac{1}{\sqrt{2\pi}} e^{-x^2/2} \cdot \frac{1}{\sqrt{2\pi}} e^{-y^2/2} = \frac{1}{2\pi} e^{-(x^2 + y^2)/2}.$$

To determine the joint density of $R^2$ and $\theta$, we make the change of variables
$$r = x^2 + y^2, \qquad \theta = \tan^{-1}(y/x).$$
We have (with the transformation Jacobian $J = 2$)
$$f(r, \theta) = \frac{1}{4\pi} e^{-r/2}, \quad 0 < r < \infty,\ 0 < \theta < 2\pi. \tag{1}$$
However, as this is equal to the product of an exponential density having mean 2 (namely, $\frac{1}{2} e^{-r/2}$) and the uniform density on $(0, 2\pi)$, it follows that $R^2$ and $\theta$ are independent, with $R^2$ being exponential with mean 2 and $\theta$ being uniformly distributed over $(0, 2\pi)$.

We can now generate a pair of independent standard normal random variables $X$ and $Y$ by using (1) to first generate their polar coordinates and then transforming back to rectangular coordinates. The algorithm is as follows.

Box-Muller Algorithm
1. Generate random numbers $U_1$ and $U_2$.
2. Set $R^2 = -2\log(U_1)$ and $\theta = 2\pi U_2$.
3. Let
$$X = R\cos\theta = \sqrt{-2\log U_1}\,\cos(2\pi U_2), \qquad Y = R\sin\theta = \sqrt{-2\log U_1}\,\sin(2\pi U_2). \tag{2}$$

The above transformation is known as the Box-Muller transformation. However, this algorithm is not very efficient; the reason is the need to compute the sine and cosine trigonometric functions. There is a way to get around this time-consuming difficulty by an indirect computation of the sine and cosine of a random angle. The algorithm is as follows.
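
A Python sketch of the Box-Muller algorithm (2); the function name is illustrative, and a $U(0,1]$ draw is used for $U_1$ to avoid $\log(0)$.

```python
import math
import random

def box_muller():
    """One pair of independent standard normals via the Box-Muller transformation."""
    u1 = 1.0 - random.random()   # a U(0,1] draw, avoids log(0)
    u2 = random.random()
    r = math.sqrt(-2.0 * math.log(u1))
    theta = 2.0 * math.pi * u2
    return r * math.cos(theta), r * math.sin(theta)
```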

An Efficient Generator

Generate $U_1$ and $U_2$ from $U(0,1)$, and set $V_1 = 2U_1 - 1$, $V_2 = 2U_2 - 1$. Then $(V_1, V_2)$ is uniformly distributed in the square of area 4 centered at $(0, 0)$. Suppose now that we continually generate such pairs $(V_1, V_2)$ until we obtain one that is contained in the circle of radius 1 centered at $(0, 0)$, that is, until $V_1^2 + V_2^2 \le 1$. It follows that such a pair $(V_1, V_2)$ is uniformly distributed in the circle. If we let $R$ and $\theta$ denote the polar coordinates of this pair, then it is not difficult to verify that $R$ and $\theta$ are independent, with $R^2$ being uniformly distributed on $(0,1)$ and $\theta$ being uniformly distributed over $(0, 2\pi)$. Since $\theta$ is thus a random angle, we can generate the sine and cosine of a random angle by generating a random point $(V_1, V_2)$ in the circle and setting
$$\sin\theta = \frac{V_2}{R} = \frac{V_2}{(V_1^2 + V_2^2)^{1/2}}, \qquad \cos\theta = \frac{V_1}{R} = \frac{V_1}{(V_1^2 + V_2^2)^{1/2}}.$$
Following the Box-Muller transformation, we can generate independent unit normals as
$$X = (-2\log U)^{1/2}\, \frac{V_1}{(V_1^2 + V_2^2)^{1/2}}, \qquad Y = (-2\log U)^{1/2}\, \frac{V_2}{(V_1^2 + V_2^2)^{1/2}}. \tag{3}$$

Since $S = V_1^2 + V_2^2$ is itself uniformly distributed over $(0,1)$ and is independent of the random angle $\theta$, we can use it as the random number $U$ needed in equation (3). Therefore,
$$X = (-2\log(S)/S)^{1/2}\, V_1, \qquad Y = (-2\log(S)/S)^{1/2}\, V_2 \tag{4}$$
are independent unit normals when $(V_1, V_2)$ is a randomly chosen point in the circle of radius 1 centered at the origin and $S = V_1^2 + V_2^2$. Summing up, the algorithm is as follows.

The Improved Box-Muller Algorithm
1. Generate random numbers $U_1$ and $U_2$.
2. Set $V_1 = 2U_1 - 1$, $V_2 = 2U_2 - 1$, $S = V_1^2 + V_2^2$.
3. If $S > 1$, return to step 1.
4. Return the independent unit normals
$$X = (-2\log(S)/S)^{1/2}\, V_1, \qquad Y = (-2\log(S)/S)^{1/2}\, V_2.$$
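
A Python sketch of the improved (polar) algorithm; the function name is illustrative, and the `s > 0` check is an added guard against the degenerate draw $V_1 = V_2 = 0$.

```python
import math
import random

def polar_normals():
    """One pair of independent standard normals via the improved Box-Muller (polar) algorithm."""
    while True:
        v1 = 2.0 * random.random() - 1.0
        v2 = 2.0 * random.random() - 1.0
        s = v1 * v1 + v2 * v2
        if 0.0 < s <= 1.0:                           # (V1, V2) falls inside the unit circle
            factor = math.sqrt(-2.0 * math.log(s) / s)
            return factor * v1, factor * v2
```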

5 Multivariate Distributions

5.1 Multivariate Normal Distribution

$X \sim N_d(\mu, \Sigma)$ has the following density function:
$$p(x) = \frac{1}{(2\pi)^{d/2} |\Sigma|^{1/2}} \exp\Big\{-\frac{(x-\mu)' \Sigma^{-1} (x-\mu)}{2}\Big\}.$$
A direct way of generating random vectors from this distribution is to generate a $d$-vector of i.i.d. standard normal deviates $z = (z_1, z_2, \ldots, z_d)'$ and then form the vector
$$x = T z + \mu,$$
where $T$ is a $d \times d$ matrix such that $T T' = \Sigma$. ($T$ could be a Cholesky factor of $\Sigma$, for example.) Then $x$ has a $N_d(\mu, \Sigma)$ distribution.

Another approach for generating the $d$-vector $x$ from $N_d(\mu, \Sigma)$ is to generate $x_1$ from $N_1(\mu_1, \sigma_{11})$, generate $x_2$ conditionally on $x_1$, generate $x_3$ conditionally on $x_1$ and $x_2$, and so on.
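
A sketch of the Cholesky-based construction, assuming NumPy is available for the matrix algebra; the mean and covariance in the example are illustrative.

```python
import numpy as np

rng = np.random.default_rng()

def multivariate_normal(mu, sigma):
    """One draw from N_d(mu, sigma) via x = T z + mu, where T is the
    lower-triangular Cholesky factor of sigma, so that T T' = sigma."""
    mu = np.asarray(mu, dtype=float)
    T = np.linalg.cholesky(np.asarray(sigma, dtype=float))
    z = rng.standard_normal(mu.shape[0])   # d i.i.d. standard normals
    return T @ z + mu

# Example with an illustrative 2-dimensional mean and covariance
x = multivariate_normal([0.0, 1.0], [[2.0, 0.5], [0.5, 1.0]])
```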

5.2 Multinomial Distribution

The probability function for the $d$-variate multinomial distribution is
$$p(x) = \frac{n!}{\prod_j x_j!} \prod_j \pi_j^{x_j}$$
for $\sum_j \pi_j = 1$, $x_j \ge 0$, and $\sum_j x_j = n$. To generate a multinomial, a simple way is to work with the marginal distributions, which are binomials. The generation is done sequentially; each succeeding conditional marginal is binomial. For efficiency, the first marginal considered would be the one with the largest probability.
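
A sketch of the sequential-binomial construction, assuming NumPy's binomial sampler for each conditional marginal; for brevity the cells are taken in the given order rather than sorted by probability, and all names are illustrative.

```python
import numpy as np

rng = np.random.default_rng()

def multinomial_sequential(n, pi):
    """One draw from the d-variate multinomial(n, pi) by sampling the cells one at a
    time: X_j given the earlier cells is Binomial(n - counts so far, pi_j / remaining prob)."""
    x = []
    remaining_n = n
    remaining_p = 1.0
    for p_j in pi[:-1]:
        if remaining_n == 0 or remaining_p <= 0.0:
            x.append(0)
            continue
        x_j = int(rng.binomial(remaining_n, min(1.0, p_j / remaining_p)))
        x.append(x_j)
        remaining_n -= x_j
        remaining_p -= p_j
    x.append(remaining_n)   # the last cell receives whatever count remains
    return x

# Example: n = 20 trials over an illustrative probability vector
counts = multinomial_sequential(20, [0.5, 0.3, 0.2])
```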
