Chapter 2 The Monte Carlo Method

The Monte Carlo Method stands for a broad class of computational algorithms that rely on random sampling. It is often used in physical and mathematical problems and is most useful when it is difficult or impossible to use other algorithms. There are mainly three distinct classes of applications: numerical integration, optimization, and generating draws from a probability distribution. (https://cse.sc.edu/~terejanu/files/tutorialmc.pdf)

Section 2.1 Computing Integrals

Simple Monte Carlo for finding the min and max of y = f(x): randomly generate points x_1, x_2, ...; set Min = Max = f(x_1); if f(x_2) < f(x_1), then replace Min = f(x_2), Max = f(x_1); continue in this way.

Consider the area below the curve y = f(x) inside a rectangle of area R. We randomly generate a point (x_1, y_1) inside the rectangle. If y_1 ≤ f(x_1), then this point is below the curve. If in N trials, M points are below the curve, then

    area under the curve ≈ (M/N) R


In theory, as N → ∞, the limit is exactly the area.

Monte Carlo Method for integrals: Recall that I = ∫_a^b f(x) dx is the signed area between the curve y = f(x) and the x-axis. If f(x) ≥ 0 it is the area. We may use the above method to approximate the integral as follows:

Set c = min{f(x) : a ≤ x ≤ b}, d = max{f(x) : a ≤ x ≤ b}; update c = min{c, 0}, d = max{d, 0}.
N is the total number of trials; initialize N = M = I = 0.
Each trial, randomly generate a ≤ x_1 ≤ b and c ≤ y_1 ≤ d.
If 0 ≤ y_1 ≤ f(x_1), then M = M + 1.
If f(x_1) ≤ y_1 ≤ 0, then M = M - 1.
The above two steps can be combined as: if (f(x_1) - y_1) y_1 ≥ 0, then M = M + sign(y_1).
Finally, I ≈ (M/N)(b - a)(d - c).

For double integrals, we use volumes instead. Read the Example on page 22.

Exercise 1: Use Monte Carlo to find the following integral within an error of 10^(-5). What is N?

    ∫_0^2 e^(-x²) cos x dx
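As a concrete illustration, the counting scheme above can be sketched in a few lines of Python (a sketch, not the course's Matlab routine; here the rectangle bounds c and d are estimated by sampling f on a grid rather than found exactly):

```python
import math
import random

def mc_integral(f, a, b, n_trials, seed=0):
    """Hit-or-miss Monte Carlo estimate of the signed area between y = f(x) and the x-axis."""
    rng = random.Random(seed)
    # Estimate the bounding rectangle [a,b] x [c,d], forcing c <= 0 <= d as in the notes.
    grid = [a + (b - a) * k / 1000 for k in range(1001)]
    c = min(0.0, min(f(x) for x in grid))
    d = max(0.0, max(f(x) for x in grid))
    m = 0
    for _ in range(n_trials):
        x = rng.uniform(a, b)
        y = rng.uniform(c, d)
        if 0 <= y <= f(x):      # point between the x-axis and a positive part of the curve
            m += 1
        elif f(x) <= y <= 0:    # point between a negative part of the curve and the x-axis
            m -= 1
    return (m / n_trials) * (b - a) * (d - c)

# Exercise 1's integrand (with far fewer trials than a 10^-5 error would need):
estimate = mc_integral(lambda x: math.exp(-x * x) * math.cos(x), 0.0, 2.0, 200_000)
```

For f ≥ 0 this reduces to the (M/N) R rule above; the ± counting handles sign changes of f.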

Experiment: try using the normal distribution instead of the uniform distribution, and compare results.

Monte Carlo is not very efficient. It is used only for very complicated situations.

Section 2.2 Mean Time between Failures

Consider a system that has several components. Each has a failure time T_i that is normally distributed with mean µ_i and variance σ_i². Then the system failure time is

    T = min{T_1, T_2, ..., T_n}

Analytic solution for E[T]: Let F_i be the CDF of T_i. Note that T(ω) ≤ T_i(ω), and

    T(ω) > t  ⟺  T_i(ω) > t for i = 1, 2, ..., n

so

    {ω : T(ω) > t} = ⋂_{i=1}^n {ω : T_i(ω) > t}

The CDF F of T is (assuming independence)

    F(t) = P(T ≤ t) = 1 - P(T > t) = 1 - P(⋂_{i=1}^n {ω : T_i(ω) > t}) = 1 - ∏_{i=1}^n P(ω : T_i(ω) > t) = 1 - ∏_{i=1}^n (1 - F_i(t))

and

    dF(t) = -d ∏_{i=1}^n (1 - F_i(t)) = ∑_{k=1}^n [ ∏_{i≠k} (1 - F_i(t)) ] dF_k(t)

so that

    E[T] = ∫ t dF(t) = ∑_{k=1}^n ∫ t [ ∏_{i≠k} (1 - F_i(t)) ] dF_k(t)

This integral is very difficult. Monte Carlo for the mean and variance:
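Rather than evaluating that integral, Monte Carlo estimates E[T] by sampling each component's failure time directly and taking the minimum. A minimal Python sketch (the two-component parameters here are hypothetical, purely for illustration):

```python
import random
import statistics

def mc_min_failure_time(means, sds, n_samples, seed=0):
    """Estimate E[T] and sd(T) for T = min(T_1, ..., T_n), T_i independent N(mu_i, sd_i^2)."""
    rng = random.Random(seed)
    samples = [min(rng.gauss(mu, sd) for mu, sd in zip(means, sds))
               for _ in range(n_samples)]
    return statistics.fmean(samples), statistics.pstdev(samples)

# Hypothetical two-component system: T_1 ~ N(10, 1^2), T_2 ~ N(12, 2^2).
# E[T] comes out a bit below 10, since the minimum is dragged down by either component.
mean_t, sd_t = mc_min_failure_time([10.0, 12.0], [1.0, 2.0], 100_000)
```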

The sample mean converges to the mean:

    µ = E[X] = lim_{N→∞} (X(ω_1) + X(ω_2) + ... + X(ω_N))/N = lim_{N→∞} (X(ω_1)/N + X(ω_2)/N + ... + X(ω_N)/N)

and for the variance,

    σ² = E[(X - µ)²] = E[X² - 2µX + µ²] = E[X²] - 2µE[X] + µ² = E[X²] - µ²

Exercise #2: MC for E[T] (see the Routine on page 23).

Section 2.3 Servicing Requests

Monte Carlo for random variables, means/variances.

Lemma 1: Suppose that the CDF F_X(x) of a random variable X is continuous and strictly increasing. Then Y = F_X(X) ~ U(0,1), i.e., Y has the uniform distribution on [0,1].

Proof: Use the graph of y = F_X(x). For any 0 < y_0 < 1, there is a unique x_0 such that y_0 = F_X(x_0). Since F_X is strictly increasing,

    {x : F_X(x) ≤ y_0} = (-∞, x_0]

By definition, y_0 = P(X ≤ x_0). From the above, we see that if ω ∈ {ω : F_X(X(ω)) ≤ y_0},

then

    X(ω) ∈ {x : F_X(x) ≤ y_0} = (-∞, x_0],  i.e.,  X(ω) ≤ x_0

So ω ∈ {ω : X(ω) ≤ x_0}. Consequently {ω : F_X(X(ω)) ≤ y_0} ⊂ {ω : X(ω) ≤ x_0}. Since F_X is increasing, we actually have {ω : F_X(X(ω)) ≤ y_0} = {ω : X(ω) ≤ x_0}. So

    F_Y(y_0) = P(F_X(X) ≤ y_0) = P(X ≤ x_0) = y_0

Now, for any given F(x), we first randomly generate a sample y ~ U(0,1), then calculate x = F^(-1)(y) as a sample from F.

Estimate E[g(X)] for a given CDF F_X(x) or PDF ρ_X: for any n, generate a sequence of random samples x_1, ..., x_n as above with CDF F_X, and

compute

    (1/n) ∑_{i=1}^n g(x_i) ≈ E[g(X)] = ∫ g(x) dF_X = ∫ g(x) ρ_X(x) dx

In particular,

    (1/n) ∑_{i=1}^n x_i ≈ E[X]

    (1/n) ∑_{i=1}^n x_i² - ((1/n) ∑_{i=1}^n x_i)² ≈ E[X²] - E[X]² = σ²

Central Limit Theorem: the average of a large number of independent random variables converges to a normal distribution. More precisely, let the X_i be independent random variables with mean µ and variance σ²; then

    (1/(√n σ)) ∑_{i=1}^n (X_i - µ) → N(0,1),  as n → ∞
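These two ingredients, inverse-CDF sampling (Lemma 1) and the sample averages above, fit together as follows. The exponential CDF F(x) = 1 - e^(-λx), whose inverse -ln(1 - y)/λ is explicit, serves as the test case (a Python sketch, not the course routine):

```python
import math
import random
import statistics

def sample_by_inverse_cdf(f_inv, n, seed=0):
    """Draw y ~ U(0,1) and return x = F^{-1}(y), repeated n times."""
    rng = random.Random(seed)
    return [f_inv(rng.random()) for _ in range(n)]

# Exponential with rate lam: F(x) = 1 - exp(-lam x), so F^{-1}(y) = -ln(1 - y)/lam.
lam = 2.0
xs = sample_by_inverse_cdf(lambda y: -math.log(1.0 - y) / lam, 100_000)

mean_est = statistics.fmean(xs)                                  # (1/n) sum x_i, near 1/lam = 0.5
var_est = statistics.fmean(x * x for x in xs) - mean_est ** 2    # E[X^2] - E[X]^2, near 1/lam^2 = 0.25
```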

The error is O(1/√n):

    (1/n) ∑_{i=1}^n g(X_i) - E[g(X)] = (1/n) ∑_{i=1}^n (g(X_i) - E[g(X)])
        = (σ/√n) · (1/(√n σ)) ∑_{i=1}^n (g(X_i) - E[g(X)]) ≈ (σ/√n) N(0,1)

Consider checkout time in a library or grocery store. Assume the time T between requests for service has the exponential distribution

    ρ(t) = (1/a) e^(-t/a),  for t > 0

Assume there are m checkout lines. If the first one is busy, then the request is handed to the second, and so on. If all lines are busy, the request is rejected. Assume that each line processes each request in a time S that is normally distributed with mean µ and variance σ².
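The servicing model just described is straightforward to simulate: track the time at which each line becomes free, hand each arrival to the first free line, and count rejections. A Python sketch with hypothetical parameters (as a simplification, negative normal service times are truncated to zero):

```python
import math
import random

def rejection_rate(m, a, mu, sigma, n_requests, seed=0):
    """Fraction of requests rejected with m lines, exponential inter-arrival times of
    mean a (sampled by the inverse CDF as -a ln(1 - Y)), and N(mu, sigma^2) service times."""
    rng = random.Random(seed)
    free_at = [0.0] * m        # time at which each line becomes free
    t = 0.0
    rejected = 0
    for _ in range(n_requests):
        t += -a * math.log(1.0 - rng.random())   # next arrival time
        for i in range(m):                       # first line that is free takes the request
            if free_at[i] <= t:
                free_at[i] = t + max(0.0, rng.gauss(mu, sigma))
                break
        else:                                    # all m lines busy: reject
            rejected += 1
    return rejected / n_requests

# Lightly and heavily loaded systems (hypothetical numbers):
light = rejection_rate(m=2, a=1.0, mu=0.1, sigma=0.02, n_requests=50_000)
heavy = rejection_rate(m=2, a=1.0, mu=10.0, sigma=1.0, n_requests=50_000)
```

With two lines and service much faster than arrivals, almost nothing is rejected; with service ten times slower than arrivals, most requests are.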

The CDF for T is

    y = F(t) = ∫_0^t (1/a) e^(-x/a) dx = 1 - e^(-t/a)

The inverse function F^(-1) is

    t = -a ln(1 - y)

So, given a uniform random variable Y ~ U[0,1],

    T = -a ln(1 - Y)

See the Routine on page 25.

Homework:

Exercise 1: Use MC to find the integral (see the Example on page 22) within an error of 10^(-5). What is the smallest N?

Exercise 2: Run the MC Routine on page 23. Then write your own routine for the following cases:
(a) There are five devices T_1, ..., T_5, which have normal distributions with means 1, 2, 3, 4, 5 and standard deviations 1, 2, 3, 4, 5, respectively.
(b) Repeat (a) with uniform distributions instead (with the same means and standard deviations). (Hint: U[µ - σ√3, µ + σ√3] has mean µ and variance σ².)

Exercise 3: Write an MC routine to compute E[X] and σ²[X] from a given CDF of X. Then test your MC routine using the exponential distribution with λ = 2 and λ = 5.

Exercise 4: Write a Matlab routine to generate N(0,1) samples (e.g., find the mean and variance) using the "rand" command only (you cannot use the "randn" command) and the Lemmas on page 25. Test your results on 2(a).

Optional: Develop a Matlab routine for MC integration of a general integral ∫∫_D f(x,y) dxdy for any function f(x,y) and any domain D. Note that D is not necessarily a rectangular domain; for instance, D could be the unit disk. Then (accurate up to 10^(-3)):

(i) use your program to compute ∫_0^2 dy ∫_{-1}^2 e^(-(x²-y)) sin x cos y dx

(ii) let B be the unit disk, i.e., B : x² + y² ≤ 1, and use MC to find ∫∫_B e^(-(x²-y)) sin x cos y dxdy
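For the optional exercise, one standard approach encloses D in a bounding rectangle, samples the rectangle uniformly, and averages f times the indicator of D, scaled by the rectangle's area. A Python sketch of that idea (checked below on ∫∫_B 1 dxdy = π, the area of the unit disk):

```python
import random

def mc_integrate_2d(f, in_domain, xlim, ylim, n_samples, seed=0):
    """MC estimate of the integral of f over D = {(x,y) : in_domain(x,y)},
    using a bounding rectangle xlim x ylim that must contain D."""
    rng = random.Random(seed)
    (x0, x1), (y0, y1) = xlim, ylim
    box_area = (x1 - x0) * (y1 - y0)
    total = 0.0
    for _ in range(n_samples):
        x = rng.uniform(x0, x1)
        y = rng.uniform(y0, y1)
        if in_domain(x, y):     # points outside D contribute 0
            total += f(x, y)
    return box_area * total / n_samples

# Sanity check on the unit disk B: the integral of 1 over B is pi.
in_disk = lambda x, y: x * x + y * y <= 1.0
area = mc_integrate_2d(lambda x, y: 1.0, in_disk, (-1.0, 1.0), (-1.0, 1.0), 400_000)
```

Hitting 10^(-3) accuracy this way needs a very large N, since the error shrinks like 1/√N.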