
SPRING 2008, CSC KTH - Numerical methods for SDEs, Szepessy, Tempone

Monte Carlo Euler for SDEs

Consider the stochastic differential equation

    dX(t) = a(t, X(t))\,dt + b(t, X(t))\,dW(t), \quad t_0 \le t \le T.

How can one compute the value E[g(X(T))]? The Monte Carlo method is based on the approximation

    E[g(X(T))] \approx \frac{1}{M} \sum_{j=1}^M g(\bar X(T; \omega_j)),

where \bar X is an approximation of X, here computed by the Euler method.
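As an illustration (not part of the original slides), a minimal Python sketch of the Monte Carlo Euler approach: it applies the Euler method to an assumed geometric Brownian motion, for which E[X(T)] = X(0) e^{rT} is known in closed form, so the estimate can be checked. The drift, diffusion, and parameter values are assumptions for the example.

```python
import math
import random

def euler_maruyama(a, b, x0, T, N, rng):
    """One Euler path of dX = a(t,X) dt + b(t,X) dW on [0,T] with N steps."""
    dt = T / N
    t, x = 0.0, x0
    for _ in range(N):
        dW = rng.gauss(0.0, math.sqrt(dt))   # Brownian increment ~ N(0, dt)
        x += a(t, x) * dt + b(t, x) * dW
        t += dt
    return x

def mc_expectation(g, a, b, x0, T, N, M, seed=0):
    """Monte Carlo estimate of E[g(X(T))] from M independent Euler paths."""
    rng = random.Random(seed)
    return sum(g(euler_maruyama(a, b, x0, T, N, rng)) for _ in range(M)) / M

# Assumed test case: geometric Brownian motion dX = r X dt + s X dW,
# for which E[X(T)] = x0 * exp(r * T) exactly.
r, s, x0, T = 0.05, 0.2, 1.0, 1.0
est = mc_expectation(lambda x: x, lambda t, x: r * x, lambda t, x: s * x,
                     x0, T, N=50, M=5000)
```

The estimate carries both a time discretization error (from the Euler steps) and a statistical error (from the finite M), exactly the splitting studied next.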

The error in the Monte Carlo method splits as

    E[g(X(T))] - \frac{1}{M} \sum_{j=1}^M g(\bar X(T; \omega_j))
      = E[g(X(T)) - g(\bar X(T))]                                              (28)
      + \frac{1}{M} \sum_{j=1}^M \big( E[g(\bar X(T))] - g(\bar X(T; \omega_j)) \big).   (29)

In the right-hand side of the error representation (28)-(29), the first part is the time discretization error, which we will consider later, and the second part is the statistical error, which we study here.

Monte Carlo Statistical Error

Goal: approximate the expected value E[Y] by a sample average of M i.i.d. samples,

    \frac{1}{M} \sum_{j=1}^M Y(\omega_j),

and choose M sufficiently large to control the statistical error

    E[Y] - \frac{1}{M} \sum_{j=1}^M Y(\omega_j).

For M independent samples of Y, denote the sample average by A(Y; M) and the sample standard deviation by S(Y; M):

    A(Y; M) \equiv \frac{1}{M} \sum_{j=1}^M Y(\omega_j),
    S(Y; M) \equiv \big[ A(Y^2; M) - (A(Y; M))^2 \big]^{1/2}.

Let \sigma_Y \equiv \big\{ E\big[ |Y - E[Y]|^2 \big] \big\}^{1/2}.
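The two sample quantities just defined can be sketched directly in Python (an illustration, not from the slides):

```python
import math

def sample_average(y):
    """A(Y;M) = (1/M) sum_j Y(omega_j)."""
    return sum(y) / len(y)

def sample_std(y):
    """S(Y;M) = [A(Y^2;M) - A(Y;M)^2]^(1/2)."""
    a1 = sample_average(y)
    a2 = sample_average([v * v for v in y])
    return math.sqrt(max(a2 - a1 * a1, 0.0))

ys = [1.0, 2.0, 3.0, 4.0]
avg = sample_average(ys)   # A = 10/4 = 2.5
std = sample_std(ys)       # [30/4 - 2.5^2]^(1/2) = sqrt(1.25)
```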

Exercise 13. Compute the integral I = \int_{[0,1]^d} f(x)\,dx by the Monte Carlo method, where f : [0,1]^d \to R. We have

    I = \int_{[0,1]^d} f(x)\,dx
      = \int_{[0,1]^d} f(x)\, p(x)\,dx      (where p is the uniform pdf)
      = E[f(x)]                             (where x is uniformly distributed in [0,1]^d)
      \approx I_M \equiv \frac{1}{M} \sum_{j=1}^M f(x(\omega_j)).

The values \{x(\omega_j)\} are sampled uniformly in the cube [0,1]^d by sampling the components x_i(\omega_j) independently and uniformly on the interval [0,1].

Remark 15 (Random number generators). One can generate approximate random numbers, so-called pseudo random numbers; see the lecture notes. By using transformations, one can also generate more complicated distributions in terms of simpler ones.

Example 8. Let Y be a given real-valued random variable with P(Y \le x) = F_Y(x). Suppose that we want to sample i.i.d. copies of Y and that we can cheaply compute F_Y^{-1}(u), u \in [0,1]. Then take U uniformly distributed in [0,1] and let Y(\omega) = F_Y^{-1}(U(\omega)). We then have

    P(Y \le x) = P(F_Y^{-1}(U) \le x) = P(U \le F_Y(x)) = F_Y(x),

as we wanted!
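A sketch of this inverse-transform construction in Python, using the exponential distribution F_Y(x) = 1 - e^{-\lambda x} as an assumed example, for which F_Y^{-1}(u) = -\log(1-u)/\lambda:

```python
import math
import random

def sample_exponential(lam, rng):
    """Inverse-transform sample: F_Y(x) = 1 - exp(-lam x) on x >= 0,
    so F_Y^{-1}(u) = -log(1 - u)/lam for u in [0, 1)."""
    u = rng.random()
    return -math.log(1.0 - u) / lam

rng = random.Random(1)
samples = [sample_exponential(2.0, rng) for _ in range(50000)]
mean = sum(samples) / len(samples)   # should approach E[Y] = 1/lam = 0.5
```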

Acceptance-rejection sampling

It generates samples from an arbitrary pdf \rho_Y(x) by using an auxiliary pdf \rho_X(x).

Assumptions:
(i) it is simple to sample from \rho_X;
(ii) there exists 0 < \epsilon \le 1 such that \epsilon\, \rho_Y(x)/\rho_X(x) \le 1 for all x.

Idea: rejection sampling is usually used in cases where the form of \rho_Y makes sampling difficult. Instead of sampling directly from \rho_Y, we use samples from \rho_X. These samples from \rho_X are probabilistically accepted or rejected.

Acceptance-rejection sampling

The steps below generate a single realization of Y with pdf \rho_Y.

Step 1. Set k = 1.
Step 2. Sample two independent random variables: X_k from \rho_X and U_k \sim U(0,1).
Step 3. If U_k \le \epsilon\, \rho_Y(X_k)/\rho_X(X_k), then accept Y = X_k as a sample from \rho_Y. Otherwise reject X_k, increment k by 1 and go to Step 2.
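The three steps can be sketched in Python. The target here is an assumed example: the Beta(2,2) density \rho_Y(x) = 6x(1-x) on [0,1] with a uniform proposal \rho_X = 1, so \sup \rho_Y/\rho_X = 3/2 and \epsilon = 2/3 satisfies assumption (ii).

```python
import random

def accept_reject(rho_y, sample_x, rho_x, eps, rng):
    """Steps 1-3: draw X_k from rho_X and U_k ~ U(0,1); accept the first
    X_k with U_k <= eps * rho_y(X_k) / rho_x(X_k)."""
    while True:                      # k = 1, 2, ...
        x = sample_x(rng)            # Step 2: proposal from rho_X
        u = rng.random()             # Step 2: independent uniform
        if u <= eps * rho_y(x) / rho_x(x):
            return x                 # Step 3: accept Y = X_k

# Assumed example: Beta(2,2) target, uniform proposal, eps = 2/3.
rho_y = lambda x: 6.0 * x * (1.0 - x)
rng = random.Random(0)
samples = [accept_reject(rho_y, lambda r: r.random(), lambda x: 1.0,
                         2.0 / 3.0, rng) for _ in range(20000)]
mean = sum(samples) / len(samples)   # Beta(2,2) has mean 1/2
```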

Let us see that Y sampled by acceptance-rejection has indeed density \rho_Y. We have the acceptance probability

    P\Big( U_k \le \epsilon\, \frac{\rho_Y(X_k)}{\rho_X(X_k)} \Big)
      = \int \int_0^{\epsilon \rho_Y(x)/\rho_X(x)} du\, \rho_X(x)\,dx
      = \epsilon \int \frac{\rho_Y(x)}{\rho_X(x)}\, \rho_X(x)\,dx
      = \epsilon \int \rho_Y(x)\,dx = \epsilon.

Let K(\omega) be the first value of k for which X_k is accepted as a realization of Y. We want to show that X_K has the desired density \rho_Y. Consider an open set B:

    P(X_K \in B) = \sum_{k \ge 1} P(X_k \in B,\, K = k)
      = \sum_{k \ge 1} P\Big( X_k \in B,\; U_k \le \epsilon \frac{\rho_Y(X_k)}{\rho_X(X_k)} \Big)
          \prod_{m=1}^{k-1} P\Big( U_m > \epsilon \frac{\rho_Y(X_m)}{\rho_X(X_m)} \Big)
      = P\Big( X_k \in B,\; U_k \le \epsilon \frac{\rho_Y(X_k)}{\rho_X(X_k)} \Big)
          \sum_{k \ge 1} (1-\epsilon)^{k-1},

where the acceptance probability does not depend on k, each rejection has probability 1 - \epsilon, and \sum_{k \ge 1} (1-\epsilon)^{k-1} = 1/\epsilon.

To finish, compute

    P\Big( X_k \in B,\; U_k \le \epsilon \frac{\rho_Y(X_k)}{\rho_X(X_k)} \Big)
      = \int_B \int_0^{\epsilon \rho_Y(x)/\rho_X(x)} du\, \rho_X(x)\,dx
      = \epsilon \int_B \frac{\rho_Y(x)}{\rho_X(x)}\, \rho_X(x)\,dx
      = \epsilon \int_B \rho_Y(x)\,dx,

which implies

    P(X_K \in B) = \int_B \rho_Y(x)\,dx,

as we claimed.

Remark 16 (Acceptance-rejection cost). Compute the expected number of samples per accepted one:

    E[K] = \sum_{k \ge 1} k\, P(K = k) = \sum_{k \ge 1} k\, (1-\epsilon)^{k-1} \epsilon = 1/\epsilon.

Can you interpret this result?
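A quick empirical check of E[K] = 1/\epsilon, with an assumed Beta(2,2) target and uniform proposal for which \epsilon = 2/3 satisfies assumption (ii), so the average number of proposals per accepted sample should approach 1.5:

```python
import random

def trials_until_accept(rho_y, eps, rng):
    """Return K, the number of proposals drawn until acceptance,
    for a uniform proposal rho_X = 1 on [0,1]."""
    k = 0
    while True:
        k += 1
        x = rng.random()                       # proposal X_k
        if rng.random() <= eps * rho_y(x):     # rho_X(x) = 1
            return k

rho_y = lambda x: 6.0 * x * (1.0 - x)   # Beta(2,2) density on [0,1]
rng = random.Random(0)
M = 20000
avg_k = sum(trials_until_accept(rho_y, 2.0 / 3.0, rng)
            for _ in range(M)) / M      # should approach 1/eps = 1.5
```

The interpretation: the smaller \epsilon is (i.e. the worse \rho_X fits \rho_Y), the more proposals are wasted per accepted sample.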

Monte Carlo: numerical example

Consider the computation of the integral

    1 = \int_{[0,1]^N} \exp\Big( \sum_{n=1}^N x_n \Big)\, dx_1 \cdots dx_N \,/\, (e-1)^N.

MATLAB code:

    M = 1e6;  % Max. number of realizations
    N = 20;   % Dimension of the problem
    U = rand(M,N);
    f = exp(sum(U,2));
    run_aver = cumsum(f)./(((1:M)')*(exp(1)-1)^N);
    figure, plot(1:M, run_aver), xlabel('M')
    figure, plot(1:M, run_aver - 1), xlabel('M')
    figure, semilogy(1:M, abs(run_aver - 1)), xlabel('M')

[Figures: the running average, its error, and the absolute error on a semilog scale, as functions of the number of realizations M, for the example above.]

Monte Carlo error analysis

Consider the scaled random variable

    Z_M \equiv \sqrt{M}\, \frac{A(Y; M) - E[Y]}{\sigma_Y}

with cumulative distribution function F_{Z_M}(x) \equiv P(Z_M \le x), x \in R.

The Central Limit Theorem is the fundamental result for understanding the statistical error of Monte Carlo methods.

Theorem 10 (The Central Limit Theorem). Assume \xi_j, j = 1, 2, 3, ..., are independent, identically distributed (i.i.d.) with E[\xi_j] = 0 and E[\xi_j^2] = 1. Then

    \sum_{j=1}^M \xi_j / \sqrt{M} \rightharpoonup \nu,   (30)

where \nu is N(0,1) and \rightharpoonup denotes convergence of the distributions, also called weak convergence; i.e. the convergence (30) means E[g(\sum_{j=1}^M \xi_j/\sqrt{M})] \to E[g(\nu)] for all bounded and continuous functions g.

Characteristic function

Let X be a random variable; then f(t) = E[e^{itX}] is called the characteristic function of X. This function completely identifies the distribution of X, namely:

Theorem 11. Two distributions having the same characteristic function are identical.

Example: consider a standard normal distribution, X \sim N(0,1). Then

    f(t) = E[e^{itX}] = e^{-t^2/2}.

In fact, we have inversion formulas closely related to the Fourier transform.

Theorem 12. Let x_1, x_2 be continuity points of F_X. Then

    F(x_2) - F(x_1) = \frac{1}{2\pi} \int_{-\infty}^{+\infty} \frac{e^{-itx_1} - e^{-itx_2}}{it}\, f(t)\,dt.

See [Petrov].

Proof. Consider the characteristic function f(t) = E[e^{it\xi_j}]. Then its derivatives satisfy

    f^{(m)}(t) = E[i^m \xi_j^m e^{it\xi_j}].   (31)

For the sample average of the \xi_j variables we have

    E\big[ e^{it \sum_{j=1}^M \xi_j/\sqrt{M}} \big] = \Big( f\big( \tfrac{t}{\sqrt{M}} \big) \Big)^M
      = \Big( f(0) + \frac{t}{\sqrt{M}}\, f'(0) + \frac{1}{2}\, \frac{t^2}{M}\, f''(0) + o\big( \tfrac{t^2}{M} \big) \Big)^M.

The representation (31) implies f(0) = E[1] = 1, f'(0) = i E[\xi_j] = 0, f''(0) = -E[\xi_j^2] = -1. Therefore

    E\big[ e^{it \sum_{j=1}^M \xi_j/\sqrt{M}} \big] = \Big( 1 - \frac{t^2}{2M} + o\big( \tfrac{t^2}{M} \big) \Big)^M \to e^{-t^2/2}, \quad \text{as } M \to \infty,

and

    e^{-t^2/2} = \int_R e^{itx}\, \frac{e^{-x^2/2}}{\sqrt{2\pi}}\, dx,   (32)

and we conclude that the Fourier transform of the pdf

(i.e. the characteristic function) of \sum_{j=1}^M \xi_j/\sqrt{M} converges to the Fourier transform of the standard normal distribution. Therefore,

    E\Big[ g\Big( \sum_{j=1}^M \xi_j/\sqrt{M} \Big) \Big]
      = \int_R g(x)\, \rho_{\sum_j \xi_j/\sqrt{M}}(x)\,dx
      \overset{\text{Parseval}}{=} \int_R f(t)\, F(g)(t)\,dt
      \to \int_R e^{-t^2/2}\, F(g)(t)\,dt
      \overset{\text{Parseval}}{=} E[g(\nu)].

Exercise 14. What is the error I_M - I in Exercise 13? Let the error \epsilon_M be defined by

    \epsilon_M = \frac{1}{M} \sum_{j=1}^M f(x_j) - \int_{[0,1]^d} f(x)\,dx
               = \frac{1}{M} \sum_{j=1}^M \big( f(x_j) - E[f(x)] \big).

By the Central Limit Theorem, \sqrt{M}\, \epsilon_M \rightharpoonup \sigma\nu, where \nu is

N(0,1) and

    \sigma^2 = \int_{[0,1]^d} f^2(x)\,dx - \Big( \int_{[0,1]^d} f(x)\,dx \Big)^2
             = \int_{[0,1]^d} \Big( f(x) - \int_{[0,1]^d} f(y)\,dy \Big)^2 dx.

In practice, \sigma^2 is approximated by

    \hat\sigma^2 = \frac{1}{M-1} \sum_{j=1}^M \Big( f(x_j) - \frac{1}{M} \sum_{m=1}^M f(x_m) \Big)^2.

Approximate error bound: C_\alpha \hat\sigma / \sqrt{M}. Here C_\alpha = 3.
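A sketch of this error bound in use, for an assumed test integrand f(x) = x^2 on [0,1] with exact value I = 1/3: estimate I_M, form \hat\sigma, and compute the bound 3\hat\sigma/\sqrt{M}.

```python
import math
import random

# Assumed test integrand f(x) = x^2 on [0,1], exact integral I = 1/3.
rng = random.Random(0)
M = 100000
fs = [rng.random() ** 2 for _ in range(M)]

I_M = sum(fs) / M                                  # Monte Carlo estimate
sigma2_hat = sum((f - I_M) ** 2 for f in fs) / (M - 1)
bound = 3.0 * math.sqrt(sigma2_hat / M)            # C_alpha = 3
err = abs(I_M - 1.0 / 3.0)                         # actual error
```

With probability close to one the actual error lies below the bound, and both shrink like 1/\sqrt{M}.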

Theorem 13 (Berry-Esseen). Assume

    \lambda \equiv \frac{ \big( E\big[ |Y - E[Y]|^3 \big] \big)^{1/3} }{ \sigma_Y } < +\infty.

Then we have a uniform estimate in the central limit theorem:

    | F_{Z_M}(x) - \Phi(x) | \le \frac{ C_{BE}\, \lambda^3 }{ (1 + |x|)^3 \sqrt{M} }.

Here \Phi is the distribution function of N(0,1),

    \Phi(x) = \frac{1}{\sqrt{2\pi}} \int_{-\infty}^x \exp\Big( -\frac{s^2}{2} \Big)\, ds,   (33)

and C_{BE} is an absolute constant.

By the Berry-Esseen theorem, the statistical error

    E_S(Y; M) \equiv E[Y] - A(Y; M)

satisfies, for any c_0 > 0,

    P\Big( |E_S(Y; M)| \le c_0\, \frac{\sigma_Y}{\sqrt{M}} \Big) \ge 2\Phi(c_0) - 1 - \frac{ C_{BE}\, \lambda^3 }{ (1 + c_0)^3 \sqrt{M} }.

In practice, choose c_0 such that 2\Phi(c_0) - 1 is close to 1; then the event

    |E_S(Y; M)| \le \tilde E_S(Y; M) \equiv c_0\, \frac{S(Y; M)}{\sqrt{M}}   (34)

has probability close to one, which involves the additional step of approximating \sigma_Y by S(Y; M). Thus, in the computations, \tilde E_S(Y; M) is a good approximation of the statistical error E_S(Y; M).

Numerical example: taking c_0 = 3 yields 2\Phi(c_0) - 1 \approx 0.9973 and

    P\Big( |E_S(Y; M)| \le 3\, \frac{\sigma_Y}{\sqrt{M}} \Big) \ge 0.9973 - \frac{ C_{BE}\, \lambda^3 }{ 64 \sqrt{M} }.

In particular, if Y is a uniform random variable, then \lambda^3 = (1/32)/(1/12)^{3/2} \approx 1.3 and we have the bound

    P\Big( |E_S(Y; M)| \le 3\, \frac{\sigma_Y}{\sqrt{M}} \Big) \ge 0.9973 - \frac{ 1.3\, C_{BE} }{ 64 \sqrt{M} }.

Obs: the last term on the right determines the confidence level as a function of M.
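The two constants quoted above can be recomputed directly; a small Python check (the uniform-distribution moments are elementary):

```python
import math

# For Y uniform on [0,1]: E|Y - 1/2|^3 = 2 * (1/2)^4 / 4 = 1/32
# and sigma_Y^2 = 1/12, so lambda^3 = (1/32) / (1/12)^(3/2).
lambda3 = (1.0 / 32.0) / (1.0 / 12.0) ** 1.5     # ~ 1.299

# Confidence level 2*Phi(3) - 1, with Phi written via the error function.
Phi = lambda x: 0.5 * (1.0 + math.erf(x / math.sqrt(2.0)))
conf = 2.0 * Phi(3.0) - 1.0                      # ~ 0.9973
```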

Numerical example: consider a binomial random variable with parameter p = 1/2,

    X = \sum_{i=1}^M Y_i,

with Y_i i.i.d. Bernoulli random variables and \sigma^2 = p(1-p). Let

    Z = \frac{X - Mp}{\sigma \sqrt{M}};

then we compare its cdf (computed exactly) with the CLT approximation \Phi(z). We do it for several values of M.
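This comparison can be sketched exactly, since the binomial cdf is computable in closed form; for the assumed case M = 100, the maximal deviation from \Phi is of order 1/\sqrt{M}:

```python
import math

def binom_cdf(k, M, p=0.5):
    """Exact P(X <= k) for X ~ Binomial(M, p)."""
    return sum(math.comb(M, j) * p ** j * (1.0 - p) ** (M - j)
               for j in range(k + 1))

Phi = lambda x: 0.5 * (1.0 + math.erf(x / math.sqrt(2.0)))

M, p = 100, 0.5
sigma = math.sqrt(p * (1.0 - p))
# cdf of Z = (X - Mp)/(sigma sqrt(M)) versus Phi, checked at the jump points.
max_diff = max(abs(binom_cdf(k, M, p) - Phi((k - M * p) / (sigma * math.sqrt(M))))
               for k in range(M + 1))
```

For M = 100 the largest gap sits at the central jump and is roughly half the pmf at k = M/2, about 0.04, consistent with the O(1/\sqrt{M}) Berry-Esseen rate.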

[Figures: the exact cdf of Z versus the normal approximation \Phi(z) for several values of M.]

Adaptive Monte Carlo

For a given TOL_S > 0, the goal is to find M such that \tilde E_S(Y; M) \le TOL_S. The following algorithm adaptively finds the number of realizations M to compute the sample average A(Y; M) as an approximation to E[Y]. With probability close to one, depending on c_0, the statistical error in the approximation is then bounded by TOL_S.

routine Monte-Carlo(TOL_S, Y, M_0; E_Y)
  Set the batch counter m = 1, M[1] = M_0 and E_S[1] = 2 TOL_S.
  Do while (E_S[m] > TOL_S)
    Compute M[m] new samples of Y, along with the sample average
      E_Y ≡ A(Y; M[m]), the sample standard deviation S[m] ≡ S(Y; M[m])
      and the error estimate E_S[m+1] ≡ \tilde E_S(Y; M[m]).
    Compute M[m+1] by change_M(M[m], S[m], TOL_S; M[m+1]).
    Increase m by 1.
  end-do
end of Monte-Carlo

routine change_M(M_in, S_in, TOL_S; M_out)
  M* = min{ integer part of (c_0 S_in / TOL_S)^2 , M_CH M_in }
  n = integer part(log_2 M*) + 1
  M_out = 2^n.   (35)
end of change_M
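A Python transcription of change_M (a sketch; c_0 = 3 and M_CH = 2 follow the surrounding slides, and the guard max(., 1) is an added assumption to keep log_2 defined for tiny sample standard deviations):

```python
import math

def change_M(M_in, S_in, TOL_S, c0=3.0, M_CH=2):
    """M* = min(int((c0 * S_in / TOL_S)^2), M_CH * M_in), then round up
    to the next power of two: M_out = 2^(int(log2 M*) + 1)."""
    M_star = max(min(int((c0 * S_in / TOL_S) ** 2), M_CH * M_in), 1)
    return 2 ** (int(math.log2(M_star)) + 1)

# The CLT target (3 * 1.0 / 0.01)^2 ~ 90000 samples is capped at
# M_CH * M_in = 2000 when the current batch has only 1000 samples:
new_M = change_M(1000, 1.0, 0.01)   # min(~90000, 2000) -> 2000 -> 2^11 = 2048
```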

Remark 17 (Parameters for change_M). Here M_0 is a given initial value for M, and M_CH > 1 is a positive integer parameter introduced to avoid a large new number of realizations in the next batch due to a possibly inaccurate sample standard deviation S[m]. Indeed, M[m+1] cannot be greater than M_CH M[m]. We will use M_CH = 2 in the next example.
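Putting the two routines together, a Python sketch of the adaptive algorithm (the batch logic follows the pseudocode above; change_M is restated so the block is self-contained, and the uniform test variable is an assumed example):

```python
import math
import random

def change_M(M_in, S_in, TOL_S, c0=3.0, M_CH=2):
    # Restated from the change_M routine; guard max(., 1) keeps log2 defined.
    M_star = max(min(int((c0 * S_in / TOL_S) ** 2), M_CH * M_in), 1)
    return 2 ** (int(math.log2(M_star)) + 1)

def monte_carlo(sample_Y, TOL_S, M0=100, c0=3.0, seed=0):
    """Adaptive batches: stop once the estimate c0 * S / sqrt(M) <= TOL_S."""
    rng = random.Random(seed)
    M = M0
    while True:
        ys = [sample_Y(rng) for _ in range(M)]
        E_Y = sum(ys) / M                                   # A(Y; M)
        S = math.sqrt(max(sum(y * y for y in ys) / M - E_Y ** 2, 0.0))
        if c0 * S / math.sqrt(M) <= TOL_S:                  # E_S <= TOL_S
            return E_Y, M
        M = change_M(M, S, TOL_S, c0)                       # next batch size

# Assumed test variable: Y uniform on [0,1], so E[Y] = 1/2 and sigma_Y ~ 0.289;
# 3 * sigma_Y / sqrt(M) <= 0.01 needs M ~ 7500, reached by doubling batches.
est, M_final = monte_carlo(lambda r: r.random(), TOL_S=0.01)
```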

Numerical example: adaptive MC with TOL = 10^{-2} for

    1 = \int_{[0,1]^N} \exp\Big( \sum_{n=1}^N x_n \Big)\, dx_1 \cdots dx_N \,/\, (e-1)^N, \quad N = 20.

[Table: for each batch of the adaptive algorithm, the number of realizations M, the sample average, the sample standard deviation, the computed error estimate, and the true error.]

Question: can you compute the confidence level corresponding to the above computations, as a function of M, using the Berry-Esseen theorem?

Large Deviations

Theory for rare events, deep in the distribution tails. Remember the CLT and the Berry-Esseen theorem: assume \xi_j, j = 1, 2, 3, ..., are independent, identically distributed (i.i.d.) with E[\xi_j] = 0 and E[\xi_j^2] = 1. Then

    \sum_{j=1}^M \xi_j / \sqrt{M} \rightharpoonup \nu.   (36)


More information

Chapter 5: Monte Carlo Integration and Variance Reduction

Chapter 5: Monte Carlo Integration and Variance Reduction Chapter 5: Monte Carlo Integration and Variance Reduction Lecturer: Zhao Jianhua Department of Statistics Yunnan University of Finance and Economics Outline 5.2 Monte Carlo Integration 5.2.1 Simple MC

More information

Slides 8: Statistical Models in Simulation

Slides 8: Statistical Models in Simulation Slides 8: Statistical Models in Simulation Purpose and Overview The world the model-builder sees is probabilistic rather than deterministic: Some statistical model might well describe the variations. An

More information

Stochastic Processes and Monte-Carlo Methods. University of Massachusetts: Spring 2018 version. Luc Rey-Bellet

Stochastic Processes and Monte-Carlo Methods. University of Massachusetts: Spring 2018 version. Luc Rey-Bellet Stochastic Processes and Monte-Carlo Methods University of Massachusetts: Spring 2018 version Luc Rey-Bellet April 5, 2018 Contents 1 Simulation and the Monte-Carlo method 3 1.1 Review of probability............................

More information

Lecture 2: Review of Probability

Lecture 2: Review of Probability Lecture 2: Review of Probability Zheng Tian Contents 1 Random Variables and Probability Distributions 2 1.1 Defining probabilities and random variables..................... 2 1.2 Probability distributions................................

More information

STA205 Probability: Week 8 R. Wolpert

STA205 Probability: Week 8 R. Wolpert INFINITE COIN-TOSS AND THE LAWS OF LARGE NUMBERS The traditional interpretation of the probability of an event E is its asymptotic frequency: the limit as n of the fraction of n repeated, similar, and

More information

Monte-Carlo MMD-MA, Université Paris-Dauphine. Xiaolu Tan

Monte-Carlo MMD-MA, Université Paris-Dauphine. Xiaolu Tan Monte-Carlo MMD-MA, Université Paris-Dauphine Xiaolu Tan tan@ceremade.dauphine.fr Septembre 2015 Contents 1 Introduction 1 1.1 The principle.................................. 1 1.2 The error analysis

More information

ISyE 6644 Fall 2014 Test 3 Solutions

ISyE 6644 Fall 2014 Test 3 Solutions 1 NAME ISyE 6644 Fall 14 Test 3 Solutions revised 8/4/18 You have 1 minutes for this test. You are allowed three cheat sheets. Circle all final answers. Good luck! 1. [4 points] Suppose that the joint

More information

Homework # , Spring Due 14 May Convergence of the empirical CDF, uniform samples

Homework # , Spring Due 14 May Convergence of the empirical CDF, uniform samples Homework #3 36-754, Spring 27 Due 14 May 27 1 Convergence of the empirical CDF, uniform samples In this problem and the next, X i are IID samples on the real line, with cumulative distribution function

More information

Exam P Review Sheet. for a > 0. ln(a) i=0 ari = a. (1 r) 2. (Note that the A i s form a partition)

Exam P Review Sheet. for a > 0. ln(a) i=0 ari = a. (1 r) 2. (Note that the A i s form a partition) Exam P Review Sheet log b (b x ) = x log b (y k ) = k log b (y) log b (y) = ln(y) ln(b) log b (yz) = log b (y) + log b (z) log b (y/z) = log b (y) log b (z) ln(e x ) = x e ln(y) = y for y > 0. d dx ax

More information

Week 12-13: Discrete Probability

Week 12-13: Discrete Probability Week 12-13: Discrete Probability November 21, 2018 1 Probability Space There are many problems about chances or possibilities, called probability in mathematics. When we roll two dice there are possible

More information

What to do today (Nov 22, 2018)?

What to do today (Nov 22, 2018)? What to do today (Nov 22, 2018)? Part 1. Introduction and Review (Chp 1-5) Part 2. Basic Statistical Inference (Chp 6-9) Part 3. Important Topics in Statistics (Chp 10-13) Part 4. Further Topics (Selected

More information

Math 576: Quantitative Risk Management

Math 576: Quantitative Risk Management Math 576: Quantitative Risk Management Haijun Li lih@math.wsu.edu Department of Mathematics Washington State University Week 11 Haijun Li Math 576: Quantitative Risk Management Week 11 1 / 21 Outline 1

More information

Chapter 6 - Random Processes

Chapter 6 - Random Processes EE385 Class Notes //04 John Stensby Chapter 6 - Random Processes Recall that a random variable X is a mapping between the sample space S and the extended real line R +. That is, X : S R +. A random process

More information

Joint Probability Distributions and Random Samples (Devore Chapter Five)

Joint Probability Distributions and Random Samples (Devore Chapter Five) Joint Probability Distributions and Random Samples (Devore Chapter Five) 1016-345-01: Probability and Statistics for Engineers Spring 2013 Contents 1 Joint Probability Distributions 2 1.1 Two Discrete

More information

Probability and Measure

Probability and Measure Part II Year 2018 2017 2016 2015 2014 2013 2012 2011 2010 2009 2008 2007 2006 2005 2018 84 Paper 4, Section II 26J Let (X, A) be a measurable space. Let T : X X be a measurable map, and µ a probability

More information

Lecture 1: August 28

Lecture 1: August 28 36-705: Intermediate Statistics Fall 2017 Lecturer: Siva Balakrishnan Lecture 1: August 28 Our broad goal for the first few lectures is to try to understand the behaviour of sums of independent random

More information

16 : Markov Chain Monte Carlo (MCMC)

16 : Markov Chain Monte Carlo (MCMC) 10-708: Probabilistic Graphical Models 10-708, Spring 2014 16 : Markov Chain Monte Carlo MCMC Lecturer: Matthew Gormley Scribes: Yining Wang, Renato Negrinho 1 Sampling from low-dimensional distributions

More information

Monte Carlo Integration. Computer Graphics CMU /15-662, Fall 2016

Monte Carlo Integration. Computer Graphics CMU /15-662, Fall 2016 Monte Carlo Integration Computer Graphics CMU 15-462/15-662, Fall 2016 Talk Announcement Jovan Popovic, Senior Principal Scientist at Adobe Research will be giving a seminar on Character Animator -- Monday

More information

IEOR 6711: Stochastic Models I SOLUTIONS to the First Midterm Exam, October 7, 2008

IEOR 6711: Stochastic Models I SOLUTIONS to the First Midterm Exam, October 7, 2008 IEOR 6711: Stochastic Models I SOLUTIONS to the First Midterm Exam, October 7, 2008 Justify your answers; show your work. 1. A sequence of Events. (10 points) Let {B n : n 1} be a sequence of events in

More information

ECE353: Probability and Random Processes. Lecture 7 -Continuous Random Variable

ECE353: Probability and Random Processes. Lecture 7 -Continuous Random Variable ECE353: Probability and Random Processes Lecture 7 -Continuous Random Variable Xiao Fu School of Electrical Engineering and Computer Science Oregon State University E-mail: xiao.fu@oregonstate.edu Continuous

More information

Stochastic Differential Equations

Stochastic Differential Equations Chapter 5 Stochastic Differential Equations We would like to introduce stochastic ODE s without going first through the machinery of stochastic integrals. 5.1 Itô Integrals and Itô Differential Equations

More information

SDE Coefficients. March 4, 2008

SDE Coefficients. March 4, 2008 SDE Coefficients March 4, 2008 The following is a summary of GARD sections 3.3 and 6., mainly as an overview of the two main approaches to creating a SDE model. Stochastic Differential Equations (SDE)

More information

Monte Carlo Integration

Monte Carlo Integration Monte Carlo Integration SCX5005 Simulação de Sistemas Complexos II Marcelo S. Lauretto www.each.usp.br/lauretto Reference: Robert CP, Casella G. Introducing Monte Carlo Methods with R. Springer, 2010.

More information

n! (k 1)!(n k)! = F (X) U(0, 1). (x, y) = n(n 1) ( F (y) F (x) ) n 2

n! (k 1)!(n k)! = F (X) U(0, 1). (x, y) = n(n 1) ( F (y) F (x) ) n 2 Order statistics Ex. 4. (*. Let independent variables X,..., X n have U(0, distribution. Show that for every x (0,, we have P ( X ( < x and P ( X (n > x as n. Ex. 4.2 (**. By using induction or otherwise,

More information

Semester , Example Exam 1

Semester , Example Exam 1 Semester 1 2017, Example Exam 1 1 of 10 Instructions The exam consists of 4 questions, 1-4. Each question has four items, a-d. Within each question: Item (a) carries a weight of 8 marks. Item (b) carries

More information

Spring 2012 Math 541B Exam 1

Spring 2012 Math 541B Exam 1 Spring 2012 Math 541B Exam 1 1. A sample of size n is drawn without replacement from an urn containing N balls, m of which are red and N m are black; the balls are otherwise indistinguishable. Let X denote

More information

Continuous distributions

Continuous distributions CHAPTER 7 Continuous distributions 7.. Introduction A r.v. X is said to have a continuous distribution if there exists a nonnegative function f such that P(a X b) = ˆ b a f(x)dx for every a and b. distribution.)

More information

Selected Exercises on Expectations and Some Probability Inequalities

Selected Exercises on Expectations and Some Probability Inequalities Selected Exercises on Expectations and Some Probability Inequalities # If E(X 2 ) = and E X a > 0, then P( X λa) ( λ) 2 a 2 for 0 < λ

More information

Simulation - Lectures - Part I

Simulation - Lectures - Part I Simulation - Lectures - Part I Julien Berestycki -(adapted from François Caron s slides) Part A Simulation and Statistical Programming Hilary Term 2017 Part A Simulation. HT 2017. J. Berestycki. 1 / 66

More information

Probability Models. 4. What is the definition of the expectation of a discrete random variable?

Probability Models. 4. What is the definition of the expectation of a discrete random variable? 1 Probability Models The list of questions below is provided in order to help you to prepare for the test and exam. It reflects only the theoretical part of the course. You should expect the questions

More information

MA/ST 810 Mathematical-Statistical Modeling and Analysis of Complex Systems

MA/ST 810 Mathematical-Statistical Modeling and Analysis of Complex Systems MA/ST 810 Mathematical-Statistical Modeling and Analysis of Complex Systems Review of Basic Probability The fundamentals, random variables, probability distributions Probability mass/density functions

More information

COMPSCI 240: Reasoning Under Uncertainty

COMPSCI 240: Reasoning Under Uncertainty COMPSCI 240: Reasoning Under Uncertainty Andrew Lan and Nic Herndon University of Massachusetts at Amherst Spring 2019 Lecture 20: Central limit theorem & The strong law of large numbers Markov and Chebyshev

More information

1.1 Review of Probability Theory

1.1 Review of Probability Theory 1.1 Review of Probability Theory Angela Peace Biomathemtics II MATH 5355 Spring 2017 Lecture notes follow: Allen, Linda JS. An introduction to stochastic processes with applications to biology. CRC Press,

More information

A Review of Basic Monte Carlo Methods

A Review of Basic Monte Carlo Methods A Review of Basic Monte Carlo Methods Julian Haft May 9, 2014 Introduction One of the most powerful techniques in statistical analysis developed in this past century is undoubtedly that of Monte Carlo

More information

Outline. Simulation of a Single-Server Queueing System. EEC 686/785 Modeling & Performance Evaluation of Computer Systems.

Outline. Simulation of a Single-Server Queueing System. EEC 686/785 Modeling & Performance Evaluation of Computer Systems. EEC 686/785 Modeling & Performance Evaluation of Computer Systems Lecture 19 Outline Simulation of a Single-Server Queueing System Review of midterm # Department of Electrical and Computer Engineering

More information