
IMPORTANCE SAMPLING

Background: let $x = (x_1, \ldots, x_n)$,
$$\theta = E[h(X)] = \int h(x) f(x)\,dx \approx \frac{1}{N}\sum_{i=1}^N h(X_i) = \hat\Theta,$$
if $X_i \sim F(x)$, where $F(x)$ is the cdf for $f(x)$. For many problems, $F(x)$ is difficult to sample from and/or $\mathrm{Var}(h)$ is large. If a related, easily sampled pdf $g(x)$ is available, one could use
$$\theta = E_g\!\left[\frac{h(X)f(X)}{g(X)}\right] = \int \frac{h(x)f(x)}{g(x)}\,g(x)\,dx \approx \frac{1}{N}\sum_{i=1}^N \frac{h(X_i)f(X_i)}{g(X_i)},$$
with $X_i \sim G(x)$, for the associated cdf $G(x)$. Importance sampling: if $\mathrm{Var}\!\left(\frac{h(X)f(X)}{g(X)}\right)$ is small, the $g(x)$ samples are concentrated where $h(x)f(x)$ is "important".
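As a minimal sketch of the estimator above (not from the notes; $h$, $f$, and $g$ here are hypothetical choices): take $h(x) = x^2$, $f$ the Exp(1) pdf, and $g$ the exponential pdf with mean 2, so that $\theta = E_f[X^2] = 2$.

% Minimal sketch (hypothetical h, f, g): theta = E_f[X^2] = 2 with f = Exp(1).
N = 1e5; U = rand(1,N);
X = -2*log(U);                      % X_i ~ G, exponential with mean 2
w = exp(-X)./(0.5*exp(-X/2));       % importance weights f(X)/g(X)
T = (X.^2).*w;                      % h(X) f(X)/g(X)
disp([mean(T) 2*std(T)/sqrt(N)])    % estimate of theta = 2, with 2-SE bound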

Importance Sampling Example: $\theta = \int_0^1 e^{x^2}\,dx$. Try $g(x) \propto e^x$; so $\theta = \int_0^1 e^{x^2 - x}\,e^x\,dx$. To find $G(x)$, note $\int_0^1 e^x\,dx = e - 1$, so $G(x) = \frac{1}{e-1}\int_0^x e^t\,dt = (e^x - 1)/(e - 1)$; then, using $X_i = \ln(1 + (e-1)U_i)$,
$$\theta = (e-1)\int_0^1 e^{x^2 - x}\,\frac{e^x}{e-1}\,dx \approx \frac{e-1}{N}\sum_{i=1}^N e^{X_i^2 - X_i}.$$

N = 10000; U = rand(1,N); Y = exp(U.^2);
disp([mean(Y) 2*std(Y)/sqrt(N)])        % simple MC
e = exp(1); X = log(1+(e-1)*U);
T = (e-1)*exp(X.*(X-1));
disp([mean(T) 2*std(T)/sqrt(N)])        % importance

Error reduction by 1/…
[Figure: curves $e^x$, $e^{x^2}$, and $1 + x^2 + x^4/2$ plotted against $x$.]
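For comparison, one could also evaluate this one-dimensional integral by deterministic quadrature; a quick sketch (not in the notes) using Matlab's built-in integral:

% Quadrature reference value for theta = int_0^1 exp(x^2) dx (sketch).
theta_ref = integral(@(x) exp(x.^2), 0, 1);
disp(theta_ref)                     % compare with the MC estimates above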

Alternative: $g(x) \propto 1 + x^2$; what is $G(x)$? Since $\int_0^1 (1+x^2)\,dx = 4/3$, the normalized pdf is $3(1+x^2)/4$ and $G(x) = \frac{3}{4}x + \frac{1}{4}x^3$, so
$$\theta = \int_0^1 \frac{4\,e^{x^2}}{3(1+x^2)}\,\frac{3(1+x^2)}{4}\,dx \approx \frac{4}{3N}\sum_{i=1}^N \frac{e^{X_i^2}}{1+X_i^2},$$
with $X_i \sim G(x) = \frac{3}{4}x + \frac{1}{4}x^3$.

N = 10000; U = rand(1,N); I = rand(1,N) < 3/4;
X = I.*U + (1-I).*U.^(1/3);
T = 4*exp(X.^2)./(3*(1+X.^2));
disp([mean(T) 2*std(T)/sqrt(N)])        % importance

Better: $g(x) \propto 1 + x^2 + x^4/2$, with $\int_0^1 (1+x^2+x^4/2)\,dx = 43/30$, so $G(x) = \frac{30}{43}x + \frac{10}{43}x^3 + \frac{3}{43}x^5$;
$$\theta = \int_0^1 \frac{43\,e^{x^2}}{30(1+x^2+x^4/2)}\,\frac{30(1+x^2+x^4/2)}{43}\,dx \approx \frac{43}{30N}\sum_{i=1}^N \frac{e^{X_i^2}}{1+X_i^2+X_i^4/2},$$
with $X_i \sim G(x) = \frac{30}{43}x + \frac{10}{43}x^3 + \frac{3}{43}x^5$.

U = rand(1,N); V = rand(1,N);
I = V < 30/43; J = V > 40/43;
X = I.*U + (1-I).*(1-J).*U.^(1/3) + J.*U.^(1/5);
T = 43*exp(X.^2)./(30*(1+X.^2+X.^4/2));
disp([mean(T) 2*std(T)/sqrt(N)])        % importance
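Both samplers above use the composition method: pick a component of $G$ with its weight, then invert that component's cdf. A small sketch (not in the notes) checking that the first sampler really has cdf $G(x) = (3x + x^3)/4$:

% Sketch: empirical vs. theoretical cdf for the composition sampler.
N = 1e5; U = rand(1,N); I = rand(1,N) < 3/4;
X = I.*U + (1-I).*U.^(1/3);
xs = [0.25 0.5 0.75];
emp = arrayfun(@(a) mean(X <= a), xs);
disp([emp; (3*xs + xs.^3)/4])       % rows: empirical, theoretical G(xs)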

Importance Sampling Example: $\theta = \int_{-\infty}^{\infty} \frac{1}{\sqrt{2\pi}} e^{-z^2/2}\, z^3 e^z\,dz$.

N = 100000; Z = randn(1,N);             % N assumed
Y = Z.^3.*exp(Z);
disp([mean(Y) 2*std(Y)/sqrt(N)])        % simple MC

Try $g(z) = \frac{1}{\sqrt{2\pi}} e^{-(z-1)^2/2} = \frac{1}{\sqrt{2\pi}} e^{-w^2/2}$, with $w = z - 1$; so
$$\theta = e^{1/2}\int z^3\,g(z)\,dz = \frac{e^{1/2}}{\sqrt{2\pi}}\int (w+1)^3\,e^{-w^2/2}\,dw.$$
Simulation uses $\theta \approx \frac{e^{1/2}}{N}\sum_{i=1}^N (Z_i + 1)^3$, with $Z_i \sim \mathrm{Normal}(0,1)$.

Z = randn(1,N); Y = exp(1/2)*(Z+1).^3;
disp([mean(Y) 2*std(Y)/sqrt(N)])        % importance

Error reduction by 1/4. Note $\theta = \frac{e^{1/2}}{\sqrt{2\pi}}\int (3w^2 + 1)\,e^{-w^2/2}\,dw = 4e^{1/2}$.
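A sketch (not in the notes) comparing the standard errors of the two integrands at a common, assumed sample size, alongside the closed form $4e^{1/2}$:

% Sketch: standard errors of the two integrands at an assumed common N,
% together with the exact value 4*exp(1/2).
N = 1e5; Z = randn(1,N);
Y0 = Z.^3.*exp(Z);                  % simple MC integrand
Y1 = exp(1/2)*(Z+1).^3;             % importance (shifted-normal) integrand
disp([std(Y0)/sqrt(N) std(Y1)/sqrt(N) 4*exp(1/2)])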

Higher dimensional problems: often $f(x)$ or $g(x)$ factors, $g(x) = g_1(x_1)g_2(x_2)\cdots g_n(x_n)$, so samples come from a sequence of 1-d samples.

2-d example: $\theta = \int_0^1\!\int_0^1 e^{(x_1+x_2)^2}\,dx$; if $g(x) \propto e^{x_1}e^{x_2}$, then $\theta = \int\!\int e^{(x_1+x_2)^2 - x_1 - x_2}\,e^{x_1+x_2}\,dx$. After scaling, with $X_{ij} = \ln(1 + (e-1)U_{ij})$,
$$\theta = (e-1)^2 \int_0^1\!\int_0^1 e^{(x_1+x_2)^2 - x_1 - x_2}\,\frac{e^{x_1+x_2}}{(e-1)^2}\,dx \approx \frac{(e-1)^2}{N}\sum_{i=1}^N e^{(X_{1i}+X_{2i})^2 - X_{1i} - X_{2i}}.$$

N = 10000; U = rand(2,N); T = exp(sum(U).^2);
disp([mean(T) 2*std(T)/sqrt(N)])        % simple MC
e = exp(1); X = log(1+(e-1)*U);
T = (e-1)^2*exp(sum(X).^2-sum(X));
disp([mean(T) 2*std(T)/sqrt(N)])        % importance

Better: $g(x) \propto e^{2x_1}e^{2x_2}$, with $g(1,1) = f(1,1)$? Then $G_i(x) = \frac{e^{2x}-1}{e^2-1}$, $X_{ij} = \ln(1 + (e^2-1)U_{ij})/2$, and
$$\theta = \frac{(e^2-1)^2}{4}\int_0^1\!\int_0^1 e^{(x_1+x_2)^2 - 2(x_1+x_2)}\,\frac{4\,e^{2(x_1+x_2)}}{(e^2-1)^2}\,dx \approx \frac{(e^2-1)^2}{4N}\sum_{i=1}^N e^{(X_{1i}+X_{2i})^2 - 2(X_{1i}+X_{2i})}.$$

e = exp(1); X = log(1+(e^2-1)*U)/2;
T = (e^2-1)^2*exp(sum(X).^2-2*sum(X))/4;
disp([mean(T) 2*std(T)/sqrt(N)])        % importance

Better still: $g(x) \propto 1 + (x_1 + x_2)^2$?
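As with the 1-d example, the 2-d integral can be checked by deterministic quadrature; a sketch (not in the notes) using Matlab's integral2:

% Sketch: 2-d quadrature reference value.
theta_ref = integral2(@(x1,x2) exp((x1+x2).^2), 0, 1, 0, 1);
disp(theta_ref)                     % compare with the estimates above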

Tilted Densities: given a pdf $f(x)$, let $M(t) = \int e^{tx} f(x)\,dx$ (the moment generating function). The tilted density for $f(x)$ is
$$f_t(x) = \frac{e^{tx} f(x)}{M(t)}.$$

Examples

Exponential densities: if $f(x) = \lambda e^{-\lambda x}$, $x \in [0,\infty)$, then $f_t(x) = (\lambda - t)e^{-(\lambda - t)x}$, $t < \lambda$.

Bernoulli pmf's: $f(x) = p^x(1-p)^{1-x}$, $x = 0, 1$. $M(t) = E_f[e^{tX}] = e^t p + (1-p)$, so
$$f_t(x) = \frac{e^{tx} p^x (1-p)^{1-x}}{e^t p + (1-p)} = \left(\frac{e^t p}{e^t p + (1-p)}\right)^{\!x}\left(\frac{1-p}{e^t p + (1-p)}\right)^{\!1-x},$$
a Bernoulli RV with $p_t = \frac{e^t p}{e^t p + (1-p)}$. So $f/f_t = e^{-tx}\,(e^t p + (1-p))$.

Generalization: if $f(x)$ is a Binomial$(n, p)$ pmf, $f_t(x)$ is Binomial$(n, p_t)$ with $p_t = \frac{e^t p}{e^t p + 1 - p}$, and $M(t) = (e^t p + 1 - p)^n$.

Normal densities: if $f(x) = \frac{1}{\sqrt{2\pi}} e^{-x^2/2}$, $x \in (-\infty, \infty)$, then $e^{tx} f(x) = \frac{1}{\sqrt{2\pi}} e^{tx - x^2/2} = \frac{1}{\sqrt{2\pi}} e^{-(x-t)^2/2}\,e^{t^2/2}$, so $f_t(x) = \frac{1}{\sqrt{2\pi}} e^{-(x-t)^2/2}$, Normal$(t, 1)$, with $M(t) = e^{t^2/2}$.

Generalization: if $f(x)$ is a Normal$(\mu, \sigma^2)$ pdf, then $f_t(x)$ is a Normal$(\mu + \sigma^2 t, \sigma^2)$ pdf.

Choosing $t$: pick $t$ with small $\mathrm{Var}\!\left(\frac{h(X) f(X)}{f_t(X)}\right)$. Text heuristic for exponentials and Bernoullis: if $h = I\{\sum X_i > a\}$, choose $t = t^*$ with $E_{t^*}[\sum X_i] \approx a$.
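A small numerical sketch of the Bernoulli tilt (not in the notes; it uses the values $p = .4$ and $e^t = 6$ that appear in the next example, and the variable names are hypothetical):

% Sketch: check p_t = e^t p/(e^t p + 1 - p) and M(t) = e^t p + 1 - p
% against simulation from the tilted pmf.
p = 0.4; t = log(6);
pt = exp(t)*p/(exp(t)*p + 1 - p);   % tilted success probability
Mt = exp(t)*p + 1 - p;              % moment generating function at t
Xt = rand(1,1e5) < pt;              % draws from the tilted Bernoulli
disp([pt mean(Xt) Mt])              % p_t, its empirical estimate, and M(t)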

Examples

1. Bernoulli RV Example: if the $X_i$'s are independent Bernoulli$(p_i)$ RV's and $\theta = E[I\{\sum_{i=1}^n X_i > a\}] = E[I\{S > a\}]$, then
$$\hat\theta = I\{S > a\}\,e^{-tS}\prod_{i=1}^n \big(e^t p_i + (1-p_i)\big), \qquad \text{with } E_t\!\left[\sum_{i=1}^n X_i\right] = \sum_{i=1}^n \frac{e^t p_i}{e^t p_i + (1-p_i)}.$$
Example with $n = 20$, $p_i = .4$, $a = 16$; choose $t$ so that $E_t[S] = 20\,\frac{.4e^t}{.4e^t + .6} = 16$, with solution $e^t = 6$; then $p_t = .8$, $e^t p + (1-p) = 3$, and the estimator is
$$\hat\theta = I\{\textstyle\sum X_i > a\}\,6^{-S}\,3^{20} = 3^{20 - S}\,I\{\textstyle\sum X_i > a\}/2^{S}.$$

Matlab

N = 100000; p = .4; n = 20;             % N assumed
I = sum( rand(n,N) < p ) > 16;          % simple MC
disp([mean(I) 2*std(I)/sqrt(N)])
S = sum( rand(n,N) < .8 );              % importance: sample from the tilted pmf
I = 3.^(20-S).*( S > 16 )./2.^S;
disp([mean(I) 2*std(I)/sqrt(N)])

Note: $\theta = \sum_{i=17}^{20}\binom{20}{i}(.4)^i(.6)^{20-i}$; the importance-sampling standard error is far smaller than that of simple MC.
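Since $n = 20$ is small, the exact tail probability can be summed directly; a sketch in base Matlab:

% Sketch: the exact tail probability, summed directly.
theta = 0;
for i = 17:20
    theta = theta + nchoosek(20,i)*0.4^i*0.6^(20-i);
end
disp(theta)                         % exact theta for comparison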

2. Exponential RV Example: if $X_i \sim \mathrm{Exp}\!\left(\frac{1}{i+2}\right)$, $i = 1, \ldots, 4$, and $S(X) = \sum_{i=1}^4 X_i$, find
$$\theta = \int_0^\infty\!\!\cdots\!\int_0^\infty h(x)\,\prod_{i=1}^4 \frac{1}{i+2}\,e^{-x_i/(i+2)}\,dx,$$
with $h(x) = S(x)\,I\{S(x) > 62\}$. Raw simulation uses $X_{ij} \sim \mathrm{Exp}\!\left(\frac{1}{i+2}\right)$ to estimate $\theta \approx \frac{1}{N}\sum_{j=1}^N h(X_j)$.

Matlab

N = 100000; U = rand(4,N);              % N assumed
X = -diag([3:6])*log(1-U);              % X_i exponential with mean i+2
S = sum(X); h = S.*( S > 62 );
disp([mean(h) 2*std(h)/sqrt(N)])

Note: to find $E[S \mid S > 62]$, divide by $E[I(S > 62)]$, since $E[S\,I(S>62)]/E[I(S>62)] = E[S \mid S>62]$:

E = h/mean((S>62));
disp([mean(E) 2*std(E)/sqrt(N)])
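A quick sketch (not in the notes) confirming that the inversion step -diag([3:6])*log(1-U) produces exponentials with the intended means:

% Sketch: check that inversion gives exponentials with means 3,...,6.
N = 1e5; U = rand(4,N);
X = -diag([3:6])*log(1-U);
disp(mean(X,2)')                    % should be close to [3 4 5 6]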

For the tilted density, use a common tilt parameter $t$, so that $X_i \sim \mathrm{Exp}\!\left(\frac{1}{i+2} - t\right)$; then
$$\theta = C\int_{[0,\infty)^4} h(x)\,e^{-tS(x)}\,f_t(x)\,dx \approx \frac{C}{N}\sum_{j=1}^N h(X_j)\,e^{-tS(X_j)},$$
where $f_t(x) = \prod_{i=1}^4\left(\frac{1}{i+2} - t\right)e^{-(\frac{1}{i+2}-t)x_i}$ and $C = \prod_{i=1}^4 \frac{1}{1 - (i+2)t}$.

The text estimates a good $t \approx .14$ by approximately solving
$$\sum_{i=1}^4 E_t[X_i] = \frac{3}{1-3t} + \frac{4}{1-4t} + \frac{5}{1-5t} + \frac{6}{1-6t} = 62;$$
but guess-and-check with Matlab finds a better $t \approx .136$.

Matlab tests:

t = .14; Cd = 1./(1-[3:6]*t); C = prod(Cd);
St = -([3:6].*Cd)*log(1-U);             % tilted X_i have means (i+2)*Cd(i)
ht = C*St.*( St > 62 ).*exp(-t*St);
disp([mean(ht) 2*std(ht)/sqrt(N)])
t = .136; Cd = 1./(1-[3:6]*t); C = prod(Cd);
St = -([3:6].*Cd)*log(1-U);
ht = C*St.*( St > 62 ).*exp(-t*St);
disp([mean(ht) 2*std(ht)/sqrt(N)])
E = ht/mean(C*(St > 62).*exp(-t*St));
disp([mean(E) 2*std(E)/sqrt(N)])        % expected value E[S | S > 62]

Note the smaller standard errors compared to raw sampling.
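The moment-matching equation above can also be solved numerically; a sketch (not in the notes) using fzero, with a bracket chosen to stay below the pole at $t = 1/6$:

% Sketch: solving the moment-matching equation numerically with fzero.
g = @(t) 3./(1-3*t) + 4./(1-4*t) + 5./(1-5*t) + 6./(1-6*t) - 62;
t_star = fzero(g, [0.1 0.16]);      % bracket avoids the pole at t = 1/6
disp(t_star)                        % compare with t = .14 and t = .136 above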

Tilting for Normal Densities: if $f(x) = \frac{1}{\sqrt{2\pi}}e^{-(x-\mu)^2/2}$, the tilted density $f_t(x) = \frac{f(x)\,e^{xt}}{M(t)}$ is a shifted normal.

Example: $\theta = \int \frac{1}{\sqrt{2\pi}} e^{-z^2/2}\,z^3 e^z\,dz$. Pick $t$ to match the point (mode) where the integrand $a(z) = K e^{-(z-1)^2/2}\,z^3$ is maximized.

[Figure: plot of the integrand $e^{x - x^2/2}\,x^3/\sqrt{2\pi}$ versus $x$.]
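Setting the derivative of $\log a(z)$ to zero gives $-(z-1) + 3/z = 0$, i.e. $z^2 - z - 3 = 0$, whose positive root is the mode tilt used on the next slide; a sketch (not in the notes) checking this numerically:

% Sketch: the mode solves z^2 - z - 3 = 0, giving t = (1+sqrt(13))/2.
t_mode = (1 + sqrt(13))/2;
a = @(z) (z-1).^2/2 - 3*log(z);     % negative log of the integrand (z > 0)
disp([t_mode fminsearch(a, 2)])     % analytic root vs. numerical minimizer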

Or, pick $t$ to match the mean, at $t = 2.5$. For either case,
$$\theta = \int \frac{1}{\sqrt{2\pi}} e^{-(z-t)^2/2}\,z^3\,e^{z - zt + t^2/2}\,dz = \int \frac{1}{\sqrt{2\pi}} e^{-w^2/2}\,(w+t)^3\,e^{(w+t)(1-t) + t^2/2}\,dw,$$
using $z = w + t$.

Matlab tests:

N = 100000; W = randn(1,N);             % N assumed
t = (1+sqrt(13))/2;
Y = (W+t).^3.*exp((W+t)*(1-t)+t^2/2);
disp([mean(Y) 2*std(Y)/sqrt(N)])        % mode tilt
t = 2.5;
Y = (W+t).^3.*exp((W+t)*(1-t)+t^2/2);
disp([mean(Y) 2*std(Y)/sqrt(N)])        % mean tilt

Compare:

t = 1;
Y = (W+t).^3.*exp((W+t)*(1-t)+t^2/2);
disp([mean(Y) 2*std(Y)/sqrt(N)])        % tilt t = 1
t = 0;
Y = (W+t).^3.*exp((W+t)*(1-t)+t^2/2);
disp([mean(Y) 2*std(Y)/sqrt(N)])        % t = 0, raw MC
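The "mean" choice $t = 2.5$ is $\int z\,a(z)\,dz / \int a(z)\,dz = E[Z^4 e^Z]/E[Z^3 e^Z] = 10e^{1/2}/(4e^{1/2}) = 2.5$, with $Z \sim \mathrm{Normal}(0,1)$; a sketch (not in the notes) estimating it by simple MC:

% Sketch: estimating the mean tilt t = E[Z a(Z)]/E[a(Z)] by simple MC.
N = 1e5; Z = randn(1,N); a = Z.^3.*exp(Z);
disp([sum(Z.*a)/sum(a) 2.5])        % estimate vs. the exact value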

Tilting for Multidimensional Normal Density Problems: use a vector $t$. Choice of $t$? Try to make $\mathrm{Var}\!\left(\frac{h(x)f(x)}{f(x-t)}\right)$ small:
a) choose the point $t$ where $h(x)f(x)$ is maximum (mode), or
b) choose $t = E[x\,h(x)]/E[h(x)]$ (mean).

Asian Option example: this has $S_m = S_{m-1}\,e^{(r - \sigma^2/2)\delta + \sigma\sqrt{\delta}\,Z_m}$, with $\delta = T/M$, $Z_m \sim \mathrm{Normal}(0,1)$, and expected profit
$$\theta = E\!\left[e^{-rT}\max\!\Big(\frac{1}{M}\sum_{i=1}^M S_i(Z) - K,\,0\Big)\right] = \int h(z)\,\frac{e^{-\sum_{i=1}^M z_i^2/2}}{(2\pi)^{M/2}}\,dz,$$
with $h(z) = e^{-rT}\max\!\big(\frac{1}{M}\sum_{i=1}^M S_i(z) - K,\,0\big)$.

For method a), find $t$ to maximize $h(z)\,e^{-\sum z_i^2/2}$. For method b), $t$ can be estimated from data. Given $t$, use
$$\hat\theta = \int h(z)\,\frac{e^{-\sum z_i^2/2}}{e^{-\sum (z_i - t_i)^2/2}}\,\frac{e^{-\sum (z_i - t_i)^2/2}}{(2\pi)^{M/2}}\,dz = \int h(y+t)\,\frac{e^{-\sum (y_i + t_i)^2/2}}{e^{-\sum y_i^2/2}}\,\frac{e^{-\sum y_i^2/2}}{(2\pi)^{M/2}}\,dy.$$
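The weight in the last integral is what the code on the next slide computes as exp(sum(y.*y-z.*z)/2); a one-vector sketch (with an arbitrary shift $t$) checking the identity:

% Sketch: with z = y + t, f(z)/f(z-t) = exp(sum(y.^2 - z.^2)/2) for f = N(0,I).
M = 16; t = 0.1*ones(M,1); y = randn(M,1); z = y + t;
w1 = exp(-sum(z.^2)/2)/exp(-sum((z-t).^2)/2);
w2 = exp(sum(y.^2 - z.^2)/2);
disp([w1 w2])                       % the two expressions agree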

Example with $M = 16$, $S_0 = K = 50$, $T = 1$, $r = .05$, $\sigma = .1$. Matlab test using method b):

M = 16; S0 = 50; K = 50; T = 1; dlt = T/M;
r = 0.05; s = 0.1; rd = ( r - s^2/2 )*dlt;
N = 10000; z = randn(M,N);              % simple MC
S = S0*exp(cumsum(rd + s*sqrt(dlt)*z));
h = exp(-r*T)*max( mean(S)-K, 0 );
disp([mean(h) var(h) 2*std(h)/sqrt(N)])
t = z*h'/sum(h);                        % approx. mean tilt vector
y = z; z = y + t*ones(1,N);
S = S0*exp(cumsum(rd + s*sqrt(dlt)*z));
h = exp(-r*T)*max( mean(S)-K, 0 );
ht = h.*exp(sum(y.*y-z.*z)/2);          % importance
disp([mean(ht) var(ht) 2*std(ht)/sqrt(N)])

Notice the variance reduction from tilted sampling.

Using method a) with Matlab fminsearch to find $t$:

Sf = @(z) S0*exp(cumsum(rd + s*sqrt(dlt)*z));   % price path for one column z
hf = @(z) exp(-z'*z/2)*max(mean(Sf(z))-K,0);    % h(z) times the N(0,I) kernel
t = fminsearch(@(z)-hf(z), ones(M,1) );         % tilt t
z = y + t*ones(1,N);                    % reuse the standard normal draws y
S = S0*exp(cumsum(rd + s*sqrt(dlt)*z));
h = exp(-r*T)*max( mean(S)-K, 0 );
ht = h.*exp(sum(y.*y-z.*z)/2);          % importance
disp([mean(ht) var(ht) 2*std(ht)/sqrt(N)])
