Simulating Gaussian Random Variables
Monday, February 14, 2011, 2:00 PM
Readings: Kloeden and Platen, Sec. 1.3

Why does the Box-Muller method work? How was it derived? The basic idea involves a trick: noting that the joint probability distribution of two independent standard Gaussian random variables takes a nice form in polar coordinates. We are given the probability density for the Cartesian coordinates:

p_{X1,X2}(x1, x2) = (1/(2 pi)) exp(-(x1^2 + x2^2)/2).

We want to infer from this PDF the PDF of the random variables (R, Theta), which are functionally related to (X1, X2). One could use the CDF method described last time (compute the CDF for X1, X2 and use the functional relationship to infer the CDF for R), but this can be somewhat painful to implement when the mapping is between multiple random variables. So we will instead use the Jacobian method for inferring the PDF of functionally related random variables; see Bertsekas & Tsitsiklis, Sec. 4.1.

Let us first consider the case (which applies to our present circumstance) in which the mapping g between the random variables, Y = g(X), is one-to-one. Then we can say, for any nice (Borel) subset B,

P(Y in B) = P(X in g^{-1}(B)) = Integral over g^{-1}(B) of p_X(x) dx
          = Integral over B of p_X(g^{-1}(y)) |det Dg^{-1}(y)| dy,

using the change of variables x = g^{-1}(y).
But this is true for all nice (Borel) sets B, so the integrands (assuming they are piecewise continuous) must also be equal almost everywhere, i.e., up to a set of measure zero, which has no consequence for a PDF:

p_Y(y) = p_X(g^{-1}(y)) |det Dg^{-1}(y)|

for the case where g is a one-to-one mapping. If you repeat the argument for a general mapping g that need not be one-to-one:
then the same reasoning gives:

p_Y(y) = sum over all preimages x with g(x) = y of p_X(x) / |det Dg(x)|.

This Jacobian technique only works for absolutely continuously distributed random variables, which will be most of our concern. For more general random variables one should resort to another technique, for example falling back on the CDF approach.

Let's apply this Jacobian method to our Cartesian-to-polar coordinate transformation

X1 = R cos(Theta), X2 = R sin(Theta),

whose Jacobian determinant is r, so that

p_{R,Theta}(r, theta) = (1/(2 pi)) * r exp(-r^2/2), for r >= 0, 0 <= theta < 2 pi.

Since this density factors into a function of r alone times a function of theta alone, we see that the polar coordinate representation (radius, angle) of two independent standard Gaussian random variables consists of random variables that are also independent of each other. We see that the angle variable is clearly uniformly distributed on [0, 2 pi), so we can simulate it by simply multiplying a standard uniform random variable U:

Theta = 2 pi U.
Let's try the CDF method to simulate R. From p_R(r) = r exp(-r^2/2), r >= 0, we get

F_R(r) = 1 - exp(-r^2/2).

So, inverting u = F_R(r) gives R = sqrt(-2 ln(1 - U)), or equivalently R = sqrt(-2 ln U), since 1 - U is also standard uniform. And we recall R, Theta are independent of each other. This then gives the Box-Muller simulation formula:

X1 = sqrt(-2 ln U1) cos(2 pi U2), X2 = sqrt(-2 ln U1) sin(2 pi U2),

with U1, U2 independent standard uniform random variables. We remark that an equivalent way of deriving the Box-Muller formula is to recognize that R^2 is exponentially distributed with mean 2. (Show by the CDF method or the Jacobian method.)

An alternative method for simulating Gaussian random variables is the Marsaglia polar method. The basic reason for this modification of the Box-Muller method is that although Box-Muller is much faster than the inverse transform method (which would involve inverting erf), it still involves somewhat expensive trigonometric operations. The idea of the Marsaglia polar method is to simulate the uniform angle in a more direct way that obviates these trigonometric operations. The procedure is to begin by sampling a point uniformly from the unit disc:
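The Box-Muller formula above can be sketched in a few lines of Python (the function name is my own):

```python
import math
import random

def box_muller(u1, u2):
    """Box-Muller transform: map two independent standard uniform
    samples (u1, u2) to two independent standard Gaussian samples."""
    r = math.sqrt(-2.0 * math.log(u1))   # radius; R^2 ~ Exponential(mean 2)
    theta = 2.0 * math.pi * u2           # angle uniform on [0, 2*pi)
    return r * math.cos(theta), r * math.sin(theta)

# Quick sanity check: mean ~ 0, variance ~ 1.
random.seed(0)
samples = []
for _ in range(50000):
    # Use 1 - random.random() so u1 lies in (0, 1], avoiding log(0).
    x1, x2 = box_muller(1.0 - random.random(), random.random())
    samples.append(x1)
    samples.append(x2)
```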
We note then that, writing the uniform point (V1, V2) on the unit disc in polar coordinates, its angle Theta is uniform on [0, 2 pi) and

cos(Theta) = V1 / sqrt(W), sin(Theta) = V2 / sqrt(W), where W = V1^2 + V2^2.

What is the PDF for W? By the CDF method, P(W <= w) = (area of disc of radius sqrt(w)) / pi = w, so W is uniformly distributed on [0, 1]. And the radial and angle variables are clearly independent of each other. So we can instead use, in the Box-Muller formula, W in place of U1 and V1/sqrt(W), V2/sqrt(W) in place of cos(2 pi U2), sin(2 pi U2). That would amount to simulating the two standard Gaussian random variables by:

X1 = V1 sqrt(-2 ln(W) / W), X2 = V2 sqrt(-2 ln(W) / W).
These formulas give two independent, standard Gaussian random variables provided (V1, V2) is sampled uniformly from the unit disc and W = V1^2 + V2^2. But how do we simulate the uniform random point within the unit disc?

Rejection method: we want to sample uniformly from a region A that can have an irregular shape. What's easy is to simulate points uniformly at random within a rectangle (or its higher-dimensional analog) containing A. The reason the rejection method works is that for any nice subregion B of A,

P(point in B | point accepted, i.e., point in A) = |B| / |A|,

which is exactly the uniform distribution on A.
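Putting the pieces together, a minimal sketch of the Marsaglia polar method (rejection sampling from the bounding square, then the transformed Box-Muller formula; the function name is my own):

```python
import math
import random

def marsaglia_polar():
    """Return two independent standard Gaussian samples,
    avoiding trigonometric function calls."""
    while True:
        # Sample (v1, v2) uniformly from the square [-1, 1]^2 ...
        v1 = 2.0 * random.random() - 1.0
        v2 = 2.0 * random.random() - 1.0
        w = v1 * v1 + v2 * v2
        # ... and accept only points strictly inside the unit disc.
        # (Also reject w == 0 to avoid division by zero.)
        if 0.0 < w < 1.0:
            factor = math.sqrt(-2.0 * math.log(w) / w)
            return v1 * factor, v2 * factor

random.seed(1)
vals = []
for _ in range(25000):
    x1, x2 = marsaglia_polar()
    vals.append(x1)
    vals.append(x2)
```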
Clearly, in practice, one wants the containing rectangle to be as small as possible to minimize the need to resample the random variable. For the Marsaglia polar method, clearly we use the bounding square with side length 2, and we waste a fraction

1 - pi/4 ~ 21.5%

of the sampled points. The hope is that the wasted effort in generating random variables that fall outside the desired circle is more than compensated by the cheapness of each sample. The rejection method has very wide application in random variable simulation and Monte Carlo integration.

Random variable simulation from an arbitrary PDF p_X: choose a PDF p_Y, where Y is easy to simulate, and a constant c such that p_X(y) <= c p_Y(y) for all y. In practice, this rejection method for sampling from the PDF p_X goes as follows:
a. Simulate Y.
b. Simulate a uniformly distributed random variable U, independent of Y.
c. Accept the sampled value for Y provided U <= p_X(Y) / (c p_Y(Y)).
d. Otherwise reject, and repeat the cycle until the simulated value for Y is accepted.
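As an illustration of steps a-d (the target and proposal densities here are my own choices, not from the lecture): sample from the semicircle density p_X(x) = (2/pi) sqrt(1 - x^2) on [-1, 1], using the easy proposal Y ~ Uniform(-1, 1) with p_Y = 1/2 and envelope constant c = 4/pi, so that p_X <= c p_Y everywhere:

```python
import math
import random

def semicircle_density(x):
    """Target PDF: p_X(x) = (2/pi) * sqrt(1 - x^2) on [-1, 1]."""
    return (2.0 / math.pi) * math.sqrt(1.0 - x * x)

def rejection_sample():
    """Rejection sampling, steps a-d from the notes."""
    c = 4.0 / math.pi                # envelope constant: p_X <= c * p_Y
    p_y = 0.5                        # proposal density on [-1, 1]
    while True:
        y = 2.0 * random.random() - 1.0          # a. simulate Y
        u = random.random()                      # b. simulate U
        if u <= semicircle_density(y) / (c * p_y):   # c. accept?
            return y
        # d. otherwise reject and repeat

random.seed(2)
xs = [rejection_sample() for _ in range(20000)]
```

The acceptance test simplifies here to U <= sqrt(1 - Y^2), so on average a fraction pi/4 of proposals is accepted.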
Back to Gaussian random variables. We've described how to simulate standard Gaussian random variables. What about general Gaussian random variables? What if we want to simulate a collection of Gaussian random variables that may be correlated with each other, say a Gaussian random vector X with mean vector mu and covariance matrix C?

Generate a vector of independent standard Gaussian random variables Z, and set

X = mu + A Z, where A is any matrix satisfying A A^T = C (for example, the Cholesky factor of C).

Why does this work? Since Z is a Gaussian random vector, so is X, because it is just a linear transformation of the Gaussian random vector Z. So we just need to check its mean and covariance:

E[X] = mu + A E[Z] = mu, Cov(X) = A Cov(Z) A^T = A A^T = C.
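A short NumPy sketch of this construction, with A taken as the Cholesky factor (the function name and the 2-d example numbers are illustrative):

```python
import numpy as np

def sample_gaussian_vector(mu, C, rng):
    """Sample X ~ N(mu, C) via X = mu + A Z, with A A^T = C."""
    A = np.linalg.cholesky(C)            # lower-triangular, A @ A.T == C
    Z = rng.standard_normal(len(mu))     # independent standard Gaussians
    return mu + A @ Z

# Example: a 2-d Gaussian with correlation 0.8.
rng = np.random.default_rng(0)
mu = np.array([1.0, -2.0])
C = np.array([[1.0, 0.8],
              [0.8, 1.0]])
samples = np.array([sample_gaussian_vector(mu, C, rng) for _ in range(20000)])
```

Note that the Cholesky factorization requires C to be symmetric positive definite; for a merely positive semidefinite covariance, one can instead build A from an eigendecomposition of C.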