
2011 Page 1

Simulating Gaussian Random Variables
Monday, February 14, 2011, 2:00 PM
Readings: Kloeden and Platen Sec. 1.3

Why does the Box-Muller method work? How was it derived? The basic idea involves a trick: the joint probability distribution of two independent standard Gaussian random variables takes on a nice form in polar coordinates. We are given the probability density for the Cartesian coordinates:

f_{X_1,X_2}(x_1, x_2) = (1/(2π)) e^{-(x_1² + x_2²)/2}

We want to infer from this PDF the PDF of the random variables (R, Θ), which are functionally related to (X_1, X_2) through X_1 = R cos Θ, X_2 = R sin Θ. One could use the CDF method as described last time (compute the CDF for X_1, X_2 and use the functional relationship to infer the CDF for (R, Θ)), but this can be somewhat painful to implement when the mapping is between multiple random variables. So we will instead use the Jacobian method for inferring the PDF of functionally related random variables; see Bertsekas & Tsitsiklis Sec. 4.1.

Let us first consider the case (which applies to our present circumstance) in which the mapping between the random variables, say Y = g(X), is 1-1. Then we can say, for any nice (Borel) subset B:

∫_B f_Y(y) dy = P(Y ∈ B) = P(X ∈ g^{-1}(B)) = ∫_{g^{-1}(B)} f_X(x) dx = ∫_B f_X(g^{-1}(y)) |det Dg^{-1}(y)| dy,

where the last step is the change-of-variables formula for integrals.

2011 Page 2

But this is true for all nice (Borel) sets B, so the integrands (assuming they are piecewise continuous) must also be equal, almost everywhere, up to a set of measure zero, which has no consequence for a PDF:

f_Y(y) = f_X(g^{-1}(y)) |det Dg^{-1}(y)|

for the case where g is a 1-1 mapping. If you repeat the argument for a general mapping g that need not be 1-1, decomposing its domain into regions on which it is 1-1:

2011 Page 3

and then the same reasoning gives:

f_Y(y) = Σ_{x : g(x) = y} f_X(x) / |det Dg(x)|

This Jacobian technique only works for absolutely continuously distributed random variables, which will be most of our concern. For more general random variables, one should resort to another technique, for example falling back on the CDF approach.

Let's apply this Jacobian method to our Cartesian-to-polar coordinate transformation. The inverse mapping (x_1, x_2) = (r cos θ, r sin θ) has Jacobian determinant r, so

f_{R,Θ}(r, θ) = f_{X_1,X_2}(r cos θ, r sin θ) · r = (1/(2π)) r e^{-r²/2} for r ≥ 0, 0 ≤ θ < 2π.

This density factors into a function of r alone times a function of θ alone, so we see that the polar coordinate representation (radius, angle) of two independent standard Gaussian random variables is again a pair of independent random variables. We see that the angle variable is clearly uniformly distributed on [0, 2π), and so we can simulate it by simply multiplying a standard uniform random variable: Θ = 2πU.
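As a quick numerical sanity check (a Python/NumPy sketch of my own, not part of the notes), we can generate Gaussian pairs with NumPy's built-in generator and confirm the polar facts just derived: the angle is uniform on [0, 2π) (mean π), the radius has density r e^{-r²/2} (mean √(π/2) ≈ 1.2533), and the two are essentially uncorrelated.

```python
import numpy as np

# Numerical check of the derived polar density, using NumPy's own
# Gaussian generator as a reference.
rng = np.random.default_rng(1)
x1, x2 = rng.standard_normal((2, 400_000))

r = np.hypot(x1, x2)                       # radius R
theta = np.arctan2(x2, x1) % (2 * np.pi)   # angle Theta, mapped to [0, 2*pi)

print(theta.mean())                  # ≈ pi (uniform angle)
print(r.mean())                      # ≈ sqrt(pi/2) ≈ 1.2533
print(np.corrcoef(r, theta)[0, 1])   # ≈ 0 (independence)
```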

2011 Page 4

Let's try the CDF method to simulate R:

F_R(r) = P(R ≤ r) = ∫_0^r s e^{-s²/2} ds = 1 - e^{-r²/2}, r ≥ 0.

So, setting F_R(R) = U with U a standard uniform random variable and solving for R (using the fact that 1 - U is also standard uniform):

R = √(-2 ln U).

And we recall that R, Θ are independent of each other. This then gives the Box-Muller simulation formula:

X_1 = √(-2 ln U_1) cos(2πU_2), X_2 = √(-2 ln U_1) sin(2πU_2),

with U_1, U_2 independent standard uniform random variables. We remark that an equivalent way of deriving the Box-Muller formula is to recognize that R² is exponentially distributed with mean 2. (Show by the CDF method or the Jacobian method.)

An alternative method for simulating Gaussian random variables is the Marsaglia polar method. The basic reason for this modification of the Box-Muller method is that although Box-Muller is much faster than the inverse transform method (which would involve inverting erf), it still involves somewhat expensive trigonometric operations. The idea of the Marsaglia polar method is to simulate the uniform angle in a more direct way that obviates these trigonometric operations. The procedure is to begin by sampling a point (V_1, V_2) uniformly from the unit disc.
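Before moving on to the polar method, here is a minimal implementation sketch of the Box-Muller formula above (Python/NumPy; the function name and interface are my own):

```python
import numpy as np

def box_muller(n, seed=None):
    """Return 2n standard Gaussian samples via Box-Muller:
    X1 = sqrt(-2 ln U1) cos(2 pi U2), X2 = sqrt(-2 ln U1) sin(2 pi U2)."""
    rng = np.random.default_rng(seed)
    u1 = 1.0 - rng.random(n)           # in (0, 1]; avoids log(0)
    u2 = rng.random(n)
    r = np.sqrt(-2.0 * np.log(u1))     # radius: R^2 ~ Exponential(mean 2)
    theta = 2.0 * np.pi * u2           # angle: uniform on [0, 2*pi)
    return np.concatenate([r * np.cos(theta), r * np.sin(theta)])

samples = box_muller(200_000, seed=0)
print(samples.mean(), samples.var())   # ≈ 0 and ≈ 1
```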

2011 Page 5

Let S = V_1² + V_2². We note then that the point (V_1, V_2), expressed in polar coordinates, has radius √S and a uniformly distributed angle Φ, with

cos Φ = V_1/√S, sin Φ = V_2/√S.

What is the PDF for S? For a point uniform on the unit disc, the radius ρ = √S has density f_ρ(ρ) = 2ρ on 0 ≤ ρ ≤ 1 (the probability of landing in a thin annulus is proportional to its area), so by the Jacobian method S = ρ² is uniformly distributed on [0, 1]. And the radial and angle variables are clearly independent of each other. So we can instead use, in the Box-Muller formula, S in place of U_1 and (V_1/√S, V_2/√S) in place of (cos 2πU_2, sin 2πU_2). That would amount to simulating the two standard Gaussian random variables by:

2011 Page 6

X_1 = V_1 √(-2 ln S / S), X_2 = V_2 √(-2 ln S / S)

will give two independent, standard Gaussian random variables provided (V_1, V_2) are sampled uniformly from the unit disc, and S = V_1² + V_2².

But how do we simulate the uniform random point within the unit disc? Rejection method: we want to sample uniformly from a region A that can have an irregular shape. What's easy is to simulate points uniformly within a rectangle (or its higher-dimensional analog) containing A; we then keep only those points that fall in A. The reason the rejection method works is that, for any nice subregion B ⊆ A,

P(accepted point ∈ B) = P(point ∈ B | point ∈ A) = area(B)/area(A),

since the point is uniform on the containing rectangle; this is exactly the uniform distribution on A.
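Putting the pieces together, here is a sketch of the Marsaglia polar method (Python/NumPy, vectorized; names and structure are my own). It rejection-samples (V_1, V_2) from the bounding square [-1, 1]² as described above, keeps the points inside the disc, and applies the trig-free formulas:

```python
import numpy as np

def marsaglia_polar(n, seed=None):
    """Return 2n standard Gaussians with no trigonometric calls:
    X1 = V1 sqrt(-2 ln S / S), X2 = V2 sqrt(-2 ln S / S),
    with (V1, V2) uniform on the unit disc and S = V1^2 + V2^2."""
    rng = np.random.default_rng(seed)
    out = np.empty(0)
    while out.size < 2 * n:
        v = rng.uniform(-1.0, 1.0, size=(n, 2))   # candidates in the square
        s = (v ** 2).sum(axis=1)
        keep = (s > 0.0) & (s < 1.0)              # accept points inside the disc
        v, s = v[keep], s[keep]
        factor = np.sqrt(-2.0 * np.log(s) / s)
        out = np.concatenate([out, (v * factor[:, None]).ravel()])
    return out[: 2 * n]

samples = marsaglia_polar(100_000, seed=0)
print(samples.mean(), samples.var())   # ≈ 0 and ≈ 1
```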

2011 Page 7

Clearly, in practice, one wants the containing rectangle to be as small as possible to minimize the need to resample the random variable. For the polar Marsaglia method, clearly we use the bounding square with side length 2, and we waste a fraction

1 - π/4 ≈ 0.215

of the sampled points. The hope is that the wasted effort in generating random variables that fall outside the desired circle is more than compensated by the quickness of the sampling.

The rejection method has very wide application in random variable simulation and Monte Carlo integration. Random variable simulation from an arbitrary PDF p_X: choose a constant c ≥ 1 and a random variable Y with PDF p_Y such that

p_X(y) ≤ c p_Y(y) for all y,

where Y is easy to simulate. In practice, this rejection method for sampling from the PDF p_X goes as follows:

a. Simulate Y.
b. Simulate a uniformly distributed random variable U on [0, 1].
c. Accept the sampled value for Y provided U ≤ p_X(Y) / (c p_Y(Y)).
d. Otherwise reject and repeat the cycle until the simulated value for Y is accepted.
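The steps above can be sketched as follows (Python/NumPy; the target density 6x(1-x) on [0, 1], the uniform proposal, and the envelope constant c = 1.5 are an illustrative example of mine, not from the notes):

```python
import numpy as np

def rejection_sample(n, target_pdf, c, sample_y, pdf_y, seed=None):
    """Steps a-d: draw Y from an easy proposal, accept the draw
    with probability target_pdf(Y) / (c * pdf_y(Y))."""
    rng = np.random.default_rng(seed)
    out = []
    while len(out) < n:
        y = sample_y(rng)                        # a. simulate Y
        u = rng.random()                         # b. simulate uniform U
        if u <= target_pdf(y) / (c * pdf_y(y)):  # c. accept if U small enough
            out.append(y)                        # d. otherwise loop again
    return np.array(out)

# Target p(x) = 6 x (1 - x) on [0, 1] (max value 1.5 at x = 1/2),
# proposal Y ~ Uniform(0, 1), so c = 1.5 gives p(x) <= c * 1.
samples = rejection_sample(
    50_000,
    target_pdf=lambda x: 6.0 * x * (1.0 - x),
    c=1.5,
    sample_y=lambda rng: rng.random(),
    pdf_y=lambda y: 1.0,
    seed=0,
)
print(samples.mean())   # this target has mean 1/2
```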

2011 Page 8

Back to Gaussian random variables. We've described how to simulate standard Gaussian random variables. What about general Gaussian random variables? What if we want to simulate a collection of Gaussian random variables that may be correlated with each other, say, a Gaussian random vector X with mean vector μ and covariance matrix Σ?

Generate a vector of independent standard Gaussian random variables Z, and set

X = μ + AZ, where A is any matrix with AAᵀ = Σ (for example, the Cholesky factor of Σ).

Why does this work? Since Z is a Gaussian random vector, so is X, because it is just a linear transformation of the Gaussian random vector Z. So we just need to check its mean and covariance:

E[X] = μ + A E[Z] = μ, Cov(X) = A Cov(Z) Aᵀ = A I Aᵀ = Σ.
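A sketch of this recipe (Python/NumPy; the particular μ and Σ below are illustrative, not from the notes):

```python
import numpy as np

# Illustrative mean vector and covariance matrix.
mu = np.array([1.0, -2.0])
sigma = np.array([[2.0, 0.6],
                  [0.6, 1.0]])

rng = np.random.default_rng(0)
a = np.linalg.cholesky(sigma)             # A with A A^T = Sigma
z = rng.standard_normal((2, 200_000))     # independent standard Gaussians Z
x = mu[:, None] + a @ z                   # X = mu + A Z, column per sample

print(x.mean(axis=1))   # ≈ mu
print(np.cov(x))        # ≈ sigma
```

The same recipe works in any dimension; the Cholesky factor is just the most common choice of A.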
