S6880 #7. Generate Non-uniform Random Number #1
Outline
1. Inversion Method: Examples; Application to Discrete Distributions; Using the Inversion Method
2. Composition Method
3. Rejection Method
4. Statistical Theoretic Methods: Theories Connecting Distributions; Transformation Methods
(WMU) S6880, Class Notes #7
Inverse CDF
If X is continuous, F(X) ~ U(0,1); that is, X = F^{-1}(U). In general, define F^{-1}(u) = min{x : F(x) >= u}; then X = F^{-1}(U) has c.d.f. F for U ~ U(0,1).
Exponential
X ~ E(θ), where θ = E(X) is the expected waiting time. That is,
f(x) = (1/θ) e^{-x/θ} I(x>0),  F(x) = (1 - e^{-x/θ}) I(x>0).
Then Y = F(X) = 1 - e^{-X/θ} ~ U(0,1), and hence X = -θ ln(1-Y). That is, generate Y ~ U(0,1), then set
(*) X = -θ ln(1-Y).
Note: (*) can be replaced by X = -θ ln Y, since Y ~ U(0,1) implies 1-Y ~ U(0,1).
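The exponential inversion recipe above can be sketched in Python with NumPy (the function name rexp_inv is ours, not from the notes):

```python
import numpy as np

rng = np.random.default_rng(0)

def rexp_inv(n, theta, rng):
    """Exponential(theta) by inversion.

    Uses the shortcut from the note: since 1-U ~ U(0,1) when U ~ U(0,1),
    X = -theta * log(U) has the same distribution as -theta * log(1-U).
    """
    u = rng.uniform(size=n)
    return -theta * np.log(u)

x = rexp_inv(100_000, theta=2.0, rng=rng)
```

The sample mean should be close to θ = 2, the expected waiting time.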
Weibull
X ~ W(β). That is,
f(x) = β x^{β-1} e^{-x^β} I(x>0),  F(x) = (1 - e^{-x^β}) I(x>0).
Then Y = F(X) = 1 - e^{-X^β} ~ U(0,1), and hence X = [-ln(1-Y)]^{1/β} = (-ln U)^{1/β}, where U = 1-Y ~ U(0,1).
Application to Discrete Distributions
Denote the support of X by x_1 < x_2 < ... (i.e., p(x_i) = P(X = x_i) > 0), and let x_0 < x_1 be arbitrarily chosen. Generate Y ~ U(0,1), then take X = x_i where F(x_{i-1}) < Y <= F(x_i).
Discrete Uniform
f(k) = (1/n) I_{1,2,...,n}(k), F(k) = k/n at k = 1, 2, ..., n (i.e., k = n F(k)). So generate Y ~ U(0,1) and take X = ⌈nY⌉, where ⌈x⌉ (the ceiling function) is the smallest integer >= x.
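Both discrete recipes can be sketched in Python: the general search over the c.d.f., and the ceiling shortcut for the discrete uniform (function names are illustrative):

```python
import numpy as np

rng = np.random.default_rng(1)

def rdiscrete(n, support, probs, rng):
    """General discrete inversion: X = x_i where F(x_{i-1}) < Y <= F(x_i)."""
    cdf = np.cumsum(probs)
    y = rng.uniform(size=n)
    idx = np.searchsorted(cdf, y)  # smallest i with F(x_i) >= y
    return np.asarray(support)[idx]

def rdunif(n_draws, n, rng):
    """Discrete uniform on {1,...,n} via the ceiling shortcut X = ceil(n*Y)."""
    return np.ceil(n * rng.uniform(size=n_draws)).astype(int)

x = rdiscrete(50_000, [1, 2, 3], [0.2, 0.3, 0.5], rng)
k = rdunif(50_000, 6, rng)
```

np.searchsorted performs exactly the interval search F(x_{i-1}) < Y <= F(x_i) in vectorized form.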
Using the Inversion Method
Evaluating F^{-1} can be costly, or its closed form may not exist, in which case an approximation of F^{-1} must be employed; for instance, qnorm in R for generating normal random numbers. Note: RNGkind()[2] in R defaults to "Inversion" (for normal.kind).
Composition Method (by mixture of distributions)
Idea: decompose the c.d.f. into a linear combination of easier-to-generate c.d.f.s:
F(x) = Σ_{j=1}^k p_j F_j(x),  p_j > 0,  Σ_{j=1}^k p_j = 1.
Method: generate from F_i with probability p_i to get F.
Implementation hint: find a decomposition in which p_i is large for the F_i that are easy to generate.
Implementation: let P_m = p_1 + ... + p_m, with 0 = P_0 <= P_1 <= P_2 <= ... <= P_{k-1} <= P_k = 1.
1. Generate U ~ U(0,1).
2. If P_{i-1} < U <= P_i for some i, then generate X from F_i.
Example: Triangular Distribution
f(x) = (2 - 2x) I[0,1](x) = (1/2) f_1(x) + (1/4) f_2(x) + (1/4) f_3(x), where
f_1(x) = 2 I[0,1/2](x),
f_2(x) = (4 - 8x) I[0,1/2](x),
f_3(x) = (8 - 8x) I(1/2,1](x).
Note: here F(x) = (1/2) F_1(x) + (1/4) F_2(x) + (1/4) F_3(x).
[Figure: f(x) = 2 - 2x with its components f_1/2, f_2, f_3; divide and conquer.]
A Note About the Triangular Distribution
Note that y = F(x) = [1 - (1-x)^2] I[0,1](x) + I(1,∞)(x), so F^{-1}(y) = 1 - √(1-y). So a triangular r.v. can be generated as 1 - √U, U ~ U(0,1), which requires a √ operation. In the mixture above, √ is needed only when f_2 or f_3 is selected, i.e., 1/2 the time on average. The x in f_2(x) can be generated as (1 - √U)/2, and the x in f_3(x) as (2 - √U)/2, with U ~ U(0,1).
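The triangular mixture can be sketched in Python, using the component probabilities 1/2, 1/4, 1/4 and the per-component generators from the note (the function name is ours):

```python
import numpy as np

rng = np.random.default_rng(2)

def rtriangular(n, rng):
    """f(x) = 2 - 2x on [0,1] as the mixture (1/2)f1 + (1/4)f2 + (1/4)f3."""
    u = rng.uniform(size=n)   # component selector
    v = rng.uniform(size=n)   # driver for the chosen component
    x = np.empty(n)
    m1 = u <= 0.5                  # f1: uniform on [0, 1/2], no sqrt needed
    m2 = (u > 0.5) & (u <= 0.75)   # f2: x = (1 - sqrt(V))/2
    m3 = u > 0.75                  # f3: x = (2 - sqrt(V))/2
    x[m1] = v[m1] / 2
    x[m2] = (1 - np.sqrt(v[m2])) / 2
    x[m3] = (2 - np.sqrt(v[m3])) / 2
    return x

x = rtriangular(100_000, rng)
```

Note the square root fires only for components f_2 and f_3, i.e. half the draws on average, which is the point of the decomposition.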
Rejection Method (or Acceptance Sampling Method): Idea
[Figure: a point (c,d) chosen at random under the curve f(x).]
Key: if a point (c,d) is selected at random from the area under the curve f(x), then the x-coordinate c of that point is a random variable having density f(x). Conversely, if c is a random variable having density f(x) and d is uniformly distributed on (0, f(c)), then (c,d) is a point uniformly distributed over the area under the curve f(x).
Method
Given g(y), from which y is easily generated, with c f(x) <= g(x) for all x and some c ∈ (0,1):
(a) Generate x* from g(x).
(b) Generate b ~ U(0, g(x*)); (x*, b) is now uniform over the area under g(x).
(c) If b <= c f(x*), return x = x*. Otherwise, start over from (a).
[Figure: g(x) dominating c f(x); at x*, compare b against c f(x*).]
Proof of the Rejection Method
Mathematically, one can show that the returned x value has p.d.f. f:
P(x <= x_0) = P(x* <= x_0 | return x = x*) = P(x* <= x_0, b <= c f(x*)) / P(b <= c f(x*)).
Now, since b | x* has density 1/g(x*) on (0, g(x*)),
P(b <= c f(x*)) = ∫ ∫_0^{c f(x*)} (1/g(x*)) db g(x*) dx* = ∫ c f(x*) dx* = c,
and
P(x* <= x_0, b <= c f(x*)) = ∫_{-∞}^{x_0} ∫_0^{c f(x*)} (1/g(x*)) db g(x*) dx* = c ∫_{-∞}^{x_0} f(x*) dx*,
so P(x <= x_0) = ∫_{-∞}^{x_0} f(x*) dx*.
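Steps (a)-(c) can be sketched in Python for an illustrative target not in the notes: f(x) = 6x(1-x) on (0,1), dominated by g = U(0,1) with c = 2/3 (so that c f(x) <= 1 = g(x) everywhere):

```python
import numpy as np

rng = np.random.default_rng(3)

def rejection_sample(n, rng):
    """Rejection sampling for f(x) = 6x(1-x) on (0,1) with proposal g = U(0,1)."""
    c = 2.0 / 3.0   # max of f is 1.5, so c*f(x) <= 1 = g(x) for all x
    out = []
    while len(out) < n:
        x_star = rng.uniform()        # step (a): draw x* from g
        b = rng.uniform(0.0, 1.0)     # step (b): b ~ U(0, g(x*)) with g = 1
        if b <= c * 6 * x_star * (1 - x_star):  # step (c): accept or retry
            out.append(x_star)
    return np.array(out)

x = rejection_sample(20_000, rng)
```

As the proof shows, the acceptance probability equals c, here 2/3, so about 1.5 proposals are consumed per accepted draw.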
Examples: Location-Scale Gamma
X ~ Gamma(m, β, α):
f(x) = (x - m)^{α-1} exp[-(x - m)/β] / (Γ(α) β^α) I(m,∞)(x), where α, β > 0.
α: shape parameter; m: location parameter; β: scale parameter. λ = 1/β is the occurrence rate of a Poisson process (and β is the mean waiting time for an occurrence).
Examples of sub-families (let m = 0): exponential (α = 1); chi-square with ν d.o.f. (α = ν/2, β = 2).
Can generate x from the standard gamma, m = 0, β = 1 (rgamma in R), and relocate and rescale!
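The relocate-and-rescale step is a one-liner; a Python sketch using NumPy's built-in standard gamma generator in place of R's rgamma (the function name is ours):

```python
import numpy as np

rng = np.random.default_rng(4)

def rgamma_ls(n, alpha, beta, m, rng):
    """Location-scale gamma: draw standard Gamma(alpha), then rescale by beta
    and relocate by m, so that X = m + beta * Z with Z ~ Gamma(alpha)."""
    return m + beta * rng.gamma(alpha, 1.0, size=n)

x = rgamma_ls(100_000, alpha=3.0, beta=2.0, m=1.0, rng=rng)
```

The mean should be m + αβ = 1 + 6 = 7, and the support starts at m = 1.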
X ~ Gamma(α): Standard Gamma
f(x) = (1/Γ(α)) x^{α-1} e^{-x} I(0,∞)(x), α > 0.
Then X is the waiting time to the α-th event (for α a positive integer) in a Poisson process with parameter 1 (occurrence rate 1 per unit time, mean waiting time 1 unit of time).
For small integer α: generate α independent Exponential(1) (E(1)) variables and add them up.
For other α > 0, the rejection method works: the Cauchy serves as the dominating p.d.f. (nice thick tails!):
g(y) = 1/(π(1 + y²)) I_R(y),  G(y) = 1/2 + (1/π) tan^{-1}(y).
Can generate y using the inversion method.
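The small-integer-α route can be sketched in Python, combining inversion for Exponential(1) with the sum (the function name is illustrative):

```python
import numpy as np

rng = np.random.default_rng(5)

def rgamma_int(n, alpha, rng):
    """Standard Gamma(alpha) for a small positive integer alpha:
    the sum of alpha independent Exponential(1) variables, each by inversion."""
    u = rng.uniform(size=(n, alpha))
    return -np.log(u).sum(axis=1)

x = rgamma_int(100_000, 4, rng)
```

For Gamma(4) the mean and variance are both 4, matching the Poisson waiting-time interpretation.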
Standard Gamma, cont'd
Note that if Y ~ Cauchy, then (for U ~ U(0,1))
Y = tan[π(U - 1/2)] = tan(πU - π/2) = -cot(πU).
Could use Y = tan(πU), since -Y and 1/Y are both Cauchy if Y is Cauchy. Could also use a location-scale Cauchy Y' = kY + b to dominate c·gamma.
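The Cauchy inversion step can be sketched in Python (the function name is ours):

```python
import numpy as np

rng = np.random.default_rng(6)

def rcauchy(n, rng):
    """Standard Cauchy by inversion: Y = tan(pi*(U - 1/2)), U ~ U(0,1),
    i.e., the inverse of G(y) = 1/2 + (1/pi) * arctan(y)."""
    u = rng.uniform(size=n)
    return np.tan(np.pi * (u - 0.5))

y = rcauchy(100_000, rng)
```

The Cauchy has no mean, so checks rely on the median (0) and on P(|Y| < 1) = 1/2.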
Use Known Statistical Theories: Normal and Chi-Square
(i) N(0,1) → χ²₁: generate z ~ N(0,1); then x = z² is a χ²₁ r.v. (How do you get χ²_k?)
(ii) χ²₁ → N(0,1): generate x ~ χ²₁ and u ~ U(0,1); set z = sign(u - 1/2) √x; then z ~ N(0,1).
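Both directions can be sketched in Python; for (i), summing k independent squared normals answers the χ²_k question in the note:

```python
import numpy as np

rng = np.random.default_rng(7)

# (i) N(0,1) -> chi-square: square one normal for chi2_1; sum k squares for chi2_k
z = rng.standard_normal(size=(100_000, 3))
chi2_3 = (z ** 2).sum(axis=1)          # chi-square with 3 d.o.f.

# (ii) chi2_1 -> N(0,1): attach a random sign to sqrt(x)
x = rng.chisquare(1, size=100_000)
u = rng.uniform(size=100_000)
z2 = np.sign(u - 0.5) * np.sqrt(x)     # should be N(0,1)
```

chi2_3 should have mean 3, and z2 should have mean 0 and variance 1.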
Generate Normal from Uniforms
Generate u_1, ..., u_n ~ U(0,1). By the CLT,
z = (ū - 0.5) / √(1/(12n)) ≈ N(0,1)
for sufficiently large n. It is not bad even for small n, since U(0,1) is symmetric. E.g., for n = 12, z = Σ_{i=1}^{12} u_i - 6.
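The n = 12 special case can be sketched in Python (the function name is ours):

```python
import numpy as np

rng = np.random.default_rng(8)

def rnorm_clt(n, rng):
    """Approximate N(0,1) via the CLT with n = 12 uniforms:
    z = sum of 12 U(0,1) draws minus 6 (mean 0, variance 12 * 1/12 = 1)."""
    return rng.uniform(size=(n, 12)).sum(axis=1) - 6.0

z = rnorm_clt(100_000, rng)
```

This is only an approximation: the support is bounded to (-6, 6), so the tails are too light compared with a true normal.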
Transformation Methods
- Probability integral transformation (inverse CDF, a.k.a. the inversion method)
- General transformation methods
General Transformation Methods
Assume: we can generate y with p.d.f. g(y) and c.d.f. G(y).
Objective: generate x with p.d.f. f(x) and c.d.f. F(x).
How: find a transformation h(y) such that x = h(y) has the required distribution.
Change of variable:
g(y) dy = g(h^{-1}(x)) |dy/dx| dx = g(h^{-1}(x)) [h'(h^{-1}(x))]^{-1} dx = f(x) dx,
i.e., find h such that f(x) = g(h^{-1}(x)) / h'(h^{-1}(x)).
Example: g(y) = 1 (i.e., U(0,1)) and x = h(y). Then f(x) = 1/h'(h^{-1}(x)) = 1/h'(y) = 1/(dx/dy), so f(x) dx = dy, y = F(x), x = F^{-1}(y), and h(y) = F^{-1}(y): the probability integral transform!
Multivariate Transformations
(i) Higher-dimensional result:
f(x) dx = f(x_1, ..., x_n) dx_1 ··· dx_n = g(y_1, ..., y_n) |J| dx,
where J is the Jacobian ∂y/∂x. Given y and g, find the transformation to get f.
(ii) Example: the Box-Müller method for getting N(0,1) variates from independent uniforms y_1, y_2 ~ U(0,1):
x_1 = √(-2 ln y_1) cos(2π y_2),
x_2 = √(-2 ln y_1) sin(2π y_2).
One can show that |J| = φ(x_1) φ(x_2), where φ(·) is the standard normal p.d.f. Therefore, x_1 and x_2 are independent N(0,1).
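The Box-Müller pair can be sketched directly in Python (the function name is ours):

```python
import numpy as np

rng = np.random.default_rng(9)

def box_muller(n, rng):
    """Box-Muller: two independent N(0,1) vectors from two independent
    U(0,1) vectors, via x1 = r*cos(theta), x2 = r*sin(theta)."""
    y1 = rng.uniform(size=n)
    y2 = rng.uniform(size=n)
    r = np.sqrt(-2.0 * np.log(y1))
    theta = 2.0 * np.pi * y2
    return r * np.cos(theta), r * np.sin(theta)

x1, x2 = box_muller(100_000, rng)
```

Each output should be standard normal, and the two should be uncorrelated.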
Marsaglia's Polar Method (= Box-Müller + rejection method)
1. Repeat: generate v_1, v_2 ~ U(-1,1), until w = v_1² + v_2² < 1 (within the unit circle in the v_1-v_2 plane).
2. Return (i.i.d. N(0,1)):
x_1 = √(-2 ln w / w) v_1,
x_2 = √(-2 ln w / w) v_2.
[Figure: unit circle inscribed in the square (-1,1) × (-1,1); accept points inside the circle, reject those outside.]
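A vectorized Python sketch of the polar method (the function name is ours; the loop simply retries until enough pairs fall inside the circle):

```python
import numpy as np

rng = np.random.default_rng(10)

def polar_method(n, rng):
    """Marsaglia's polar method: Box-Muller plus rejection, no trig calls."""
    out = np.empty(0)
    while out.size < n:
        v1 = rng.uniform(-1, 1, size=n)
        v2 = rng.uniform(-1, 1, size=n)
        w = v1 ** 2 + v2 ** 2
        keep = (w < 1) & (w > 0)       # accept points inside the unit circle
        factor = np.sqrt(-2.0 * np.log(w[keep]) / w[keep])
        out = np.concatenate([out, factor * v1[keep], factor * v2[keep]])
    return out[:n]

z = polar_method(100_000, rng)
```

The acceptance rate is the circle-to-square area ratio, π/4 ≈ 0.785, and each accepted pair yields two normals.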
Cauchy from Ratio of Uniforms
From the polar method: if (U, V) is uniform on the unit circle, then V/U ~ (ratio of two i.i.d. N(0,1)) ~ Cauchy. Hence a Cauchy can be generated:
1. Repeat: generate independent U_1 ~ U(0,1), U_2 ~ U(-1,1), until U_1² + U_2² < 1.
2. Return X = U_2/U_1.
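A vectorized Python sketch of this Cauchy generator (the function name is ours):

```python
import numpy as np

rng = np.random.default_rng(11)

def rcauchy_ratio(n, rng):
    """Cauchy as U2/U1 with (U1, U2) uniform over the right half unit disk."""
    out = np.empty(0)
    while out.size < n:
        u1 = rng.uniform(0, 1, size=n)
        u2 = rng.uniform(-1, 1, size=n)
        keep = (u1 ** 2 + u2 ** 2 < 1) & (u1 > 0)   # inside the half disk
        out = np.concatenate([out, u2[keep] / u1[keep]])
    return out[:n]

x = rcauchy_ratio(100_000, rng)
```

Again, validation uses the median and P(|X| < 1) = 1/2, since the Cauchy has no mean.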
Ratio of Uniforms (Kinderman and Monahan, 1977)
Let h(x) >= 0 for all x with ∫ h(x) dx < ∞, and define
C_h = {(u, v) : 0 <= u <= √(h(v/u))}.
Let a = sup_x √(h(x)), b_+ = sup_{x>=0} x √(h(x)), b_- = inf_{x<=0} x √(h(x)).
1. Repeat: generate independent U ~ U(0, a), V ~ U(b_-, b_+), until (U, V) ∈ C_h.
2. Return X = V/U; then X has p.d.f. h(x)/∫ h(t) dt.
Ratio of Uniforms, cont'd: An Example
Let h(x) = e^{-x} I(0,∞)(x). Then a = 1, b_- = 0, b_+ = 2/e, and (u, v) ∈ C_h is equivalent to v <= -2u ln u. Hence:
1. Repeat: generate independent U ~ U(0,1), V ~ U(0, 2/e), until V <= -2U ln U.
2. Return X = V/U.
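The exponential example can be sketched in Python (the function name is ours):

```python
import numpy as np

rng = np.random.default_rng(12)

def rexp_rou(n, rng):
    """Exponential(1) by ratio of uniforms: U ~ U(0,1), V ~ U(0, 2/e),
    accept when V <= -2 U ln U, return X = V/U."""
    out = np.empty(0)
    while out.size < n:
        u = rng.uniform(0, 1, size=n)
        v = rng.uniform(0, 2 / np.e, size=n)
        keep = v <= -2.0 * u * np.log(u)
        out = np.concatenate([out, v[keep] / u[keep]])
    return out[:n]

x = rexp_rou(100_000, rng)
```

The output should match Exponential(1): mean 1 and variance 1.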
More informationStatistics Ph.D. Qualifying Exam: Part II November 9, 2002
Statistics Ph.D. Qualifying Exam: Part II November 9, 2002 Student Name: 1. Answer 8 out of 12 problems. Mark the problems you selected in the following table. 1 2 3 4 5 6 7 8 9 10 11 12 2. Write your
More informationAPPM/MATH 4/5520 Solutions to Exam I Review Problems. f X 1,X 2. 2e x 1 x 2. = x 2
APPM/MATH 4/5520 Solutions to Exam I Review Problems. (a) f X (x ) f X,X 2 (x,x 2 )dx 2 x 2e x x 2 dx 2 2e 2x x was below x 2, but when marginalizing out x 2, we ran it over all values from 0 to and so
More informationDefinition 1.1 (Parametric family of distributions) A parametric distribution is a set of distribution functions, each of which is determined by speci
Definition 1.1 (Parametric family of distributions) A parametric distribution is a set of distribution functions, each of which is determined by specifying one or more values called parameters. The number
More informationChapter 4: Continuous Probability Distributions
Chapter 4: Continuous Probability Distributions Seungchul Baek Department of Statistics, University of South Carolina STAT 509: Statistics for Engineers 1 / 57 Continuous Random Variable A continuous random
More informationISyE 3044 Fall 2017 Test #1a Solutions
1 NAME ISyE 344 Fall 217 Test #1a Solutions This test is 75 minutes. You re allowed one cheat sheet. Good luck! 1. Suppose X has p.d.f. f(x) = 4x 3, < x < 1. Find E[ 2 X 2 3]. Solution: By LOTUS, we have
More informationProbability Theory. Patrick Lam
Probability Theory Patrick Lam Outline Probability Random Variables Simulation Important Distributions Discrete Distributions Continuous Distributions Most Basic Definition of Probability: number of successes
More informationNon-parametric Inference and Resampling
Non-parametric Inference and Resampling Exercises by David Wozabal (Last update. Juni 010) 1 Basic Facts about Rank and Order Statistics 1.1 10 students were asked about the amount of time they spend surfing
More informationLecture 4. Continuous Random Variables and Transformations of Random Variables
Math 408 - Mathematical Statistics Lecture 4. Continuous Random Variables and Transformations of Random Variables January 25, 2013 Konstantin Zuev (USC) Math 408, Lecture 4 January 25, 2013 1 / 13 Agenda
More informationContinuous Distributions
Chapter 3 Continuous Distributions 3.1 Continuous-Type Data In Chapter 2, we discuss random variables whose space S contains a countable number of outcomes (i.e. of discrete type). In Chapter 3, we study
More information3. Probability and Statistics
FE661 - Statistical Methods for Financial Engineering 3. Probability and Statistics Jitkomut Songsiri definitions, probability measures conditional expectations correlation and covariance some important
More information40.530: Statistics. Professor Chen Zehua. Singapore University of Design and Technology
Singapore University of Design and Technology Lecture 9: Hypothesis testing, uniformly most powerful tests. The Neyman-Pearson framework Let P be the family of distributions of concern. The Neyman-Pearson
More informationMoments. Raw moment: February 25, 2014 Normalized / Standardized moment:
Moments Lecture 10: Central Limit Theorem and CDFs Sta230 / Mth 230 Colin Rundel Raw moment: Central moment: µ n = EX n ) µ n = E[X µ) 2 ] February 25, 2014 Normalized / Standardized moment: µ n σ n Sta230
More information0, otherwise. U = Y 1 Y 2 Hint: Use either the method of distribution functions or a bivariate transformation. (b) Find E(U).
1. Suppose Y U(0, 2) so that the probability density function (pdf) of Y is 1 2, 0 < y < 2 (a) Find the pdf of U = Y 4 + 1. Make sure to note the support. (c) Suppose Y 1, Y 2,..., Y n is an iid sample
More information4. Distributions of Functions of Random Variables
4. Distributions of Functions of Random Variables Setup: Consider as given the joint distribution of X 1,..., X n (i.e. consider as given f X1,...,X n and F X1,...,X n ) Consider k functions g 1 : R n
More informationNotes 9 : Infinitely divisible and stable laws
Notes 9 : Infinitely divisible and stable laws Math 733 - Fall 203 Lecturer: Sebastien Roch References: [Dur0, Section 3.7, 3.8], [Shi96, Section III.6]. Infinitely divisible distributions Recall: EX 9.
More information( ) ( ) Monte Carlo Methods Interested in. E f X = f x d x. Examples:
Monte Carlo Methods Interested in Examples: µ E f X = f x d x Type I error rate of a hypothesis test Mean width of a confidence interval procedure Evaluating a likelihood Finding posterior mean and variance
More informationSlides 5: Random Number Extensions
Slides 5: Random Number Extensions We previously considered a few examples of simulating real processes. In order to mimic real randomness of events such as arrival times we considered the use of random
More information18.175: Lecture 15 Characteristic functions and central limit theorem
18.175: Lecture 15 Characteristic functions and central limit theorem Scott Sheffield MIT Outline Characteristic functions Outline Characteristic functions Characteristic functions Let X be a random variable.
More informationTransformations and Expectations
Transformations and Expectations 1 Distributions of Functions of a Random Variable If is a random variable with cdf F (x), then any function of, say g(), is also a random variable. Sine Y = g() is a function
More informationMaster s Written Examination
Master s Written Examination Option: Statistics and Probability Spring 016 Full points may be obtained for correct answers to eight questions. Each numbered question which may have several parts is worth
More informationChapter 3 sections. SKIP: 3.10 Markov Chains. SKIP: pages Chapter 3 - continued
Chapter 3 sections 3.1 Random Variables and Discrete Distributions 3.2 Continuous Distributions 3.3 The Cumulative Distribution Function 3.4 Bivariate Distributions 3.5 Marginal Distributions 3.6 Conditional
More informationIndependent Events. Two events are independent if knowing that one occurs does not change the probability of the other occurring
Independent Events Two events are independent if knowing that one occurs does not change the probability of the other occurring Conditional probability is denoted P(A B), which is defined to be: P(A and
More informationMTH739U/P: Topics in Scientific Computing Autumn 2016 Week 6
MTH739U/P: Topics in Scientific Computing Autumn 16 Week 6 4.5 Generic algorithms for non-uniform variates We have seen that sampling from a uniform distribution in [, 1] is a relatively straightforward
More informationSTAT 450: Statistical Theory. Distribution Theory. Reading in Casella and Berger: Ch 2 Sec 1, Ch 4 Sec 1, Ch 4 Sec 6.
STAT 45: Statistical Theory Distribution Theory Reading in Casella and Berger: Ch 2 Sec 1, Ch 4 Sec 1, Ch 4 Sec 6. Basic Problem: Start with assumptions about f or CDF of random vector X (X 1,..., X p
More informationThree hours. To be supplied by the Examinations Office: Mathematical Formula Tables and Statistical Tables THE UNIVERSITY OF MANCHESTER.
Three hours To be supplied by the Examinations Office: Mathematical Formula Tables and Statistical Tables THE UNIVERSITY OF MANCHESTER EXTREME VALUES AND FINANCIAL RISK Examiner: Answer QUESTION 1, QUESTION
More informationSTAT 509 Section 3.4: Continuous Distributions. Probability distributions are used a bit differently for continuous r.v. s than for discrete r.v. s.
STAT 509 Section 3.4: Continuous Distributions Probability distributions are used a bit differently for continuous r.v. s than for discrete r.v. s. A continuous random variable is one for which the outcome
More informationQualifying Exam in Probability and Statistics. https://www.soa.org/files/edu/edu-exam-p-sample-quest.pdf
Part : Sample Problems for the Elementary Section of Qualifying Exam in Probability and Statistics https://www.soa.org/files/edu/edu-exam-p-sample-quest.pdf Part 2: Sample Problems for the Advanced Section
More informationECE 4400:693 - Information Theory
ECE 4400:693 - Information Theory Dr. Nghi Tran Lecture 8: Differential Entropy Dr. Nghi Tran (ECE-University of Akron) ECE 4400:693 Lecture 1 / 43 Outline 1 Review: Entropy of discrete RVs 2 Differential
More information1 Review of Probability and Distributions
Random variables. A numerically valued function X of an outcome ω from a sample space Ω X : Ω R : ω X(ω) is called a random variable (r.v.), and usually determined by an experiment. We conventionally denote
More informationFinal Examination. STA 711: Probability & Measure Theory. Saturday, 2017 Dec 16, 7:00 10:00 pm
Final Examination STA 711: Probability & Measure Theory Saturday, 2017 Dec 16, 7:00 10:00 pm This is a closed-book exam. You may use a sheet of prepared notes, if you wish, but you may not share materials.
More informationStatistical Methods in Particle Physics
Statistical Methods in Particle Physics Lecture 3 October 29, 2012 Silvia Masciocchi, GSI Darmstadt s.masciocchi@gsi.de Winter Semester 2012 / 13 Outline Reminder: Probability density function Cumulative
More informationActuarial models. Proof. We know that. which is. Furthermore, S X (x) = Edward Furman Risk theory / 72
Proof. We know that which is S X Λ (x λ) = exp S X Λ (x λ) = exp Furthermore, S X (x) = { x } h X Λ (t λ)dt, 0 { x } λ a(t)dt = exp { λa(x)}. 0 Edward Furman Risk theory 4280 21 / 72 Proof. We know that
More informationSTAT 801: Mathematical Statistics. Distribution Theory
STAT 81: Mathematical Statistics Distribution Theory Basic Problem: Start with assumptions about f or CDF of random vector X (X 1,..., X p ). Define Y g(x 1,..., X p ) to be some function of X (usually
More informationAsymptotic Statistics-III. Changliang Zou
Asymptotic Statistics-III Changliang Zou The multivariate central limit theorem Theorem (Multivariate CLT for iid case) Let X i be iid random p-vectors with mean µ and and covariance matrix Σ. Then n (
More informationTest Problems for Probability Theory ,
1 Test Problems for Probability Theory 01-06-16, 010-1-14 1. Write down the following probability density functions and compute their moment generating functions. (a) Binomial distribution with mean 30
More information1 Exercises for lecture 1
1 Exercises for lecture 1 Exercise 1 a) Show that if F is symmetric with respect to µ, and E( X )
More informationStat410 Probability and Statistics II (F16)
Stat4 Probability and Statistics II (F6 Exponential, Poisson and Gamma Suppose on average every /λ hours, a Stochastic train arrives at the Random station. Further we assume the waiting time between two
More information6.1 Moment Generating and Characteristic Functions
Chapter 6 Limit Theorems The power statistics can mostly be seen when there is a large collection of data points and we are interested in understanding the macro state of the system, e.g., the average,
More informationMathematical Methods for Neurosciences. ENS - Master MVA Paris 6 - Master Maths-Bio ( )
Mathematical Methods for Neurosciences. ENS - Master MVA Paris 6 - Master Maths-Bio (2014-2015) Etienne Tanré - Olivier Faugeras INRIA - Team Tosca October 22nd, 2014 E. Tanré (INRIA - Team Tosca) Mathematical
More informationSampling Distributions
In statistics, a random sample is a collection of independent and identically distributed (iid) random variables, and a sampling distribution is the distribution of a function of random sample. For example,
More informationProbability Midterm Exam 2:15-3:30 pm Thursday, 21 October 1999
Name: 2:15-3:30 pm Thursday, 21 October 1999 You may use a calculator and your own notes but may not consult your books or neighbors. Please show your work for partial credit, and circle your answers.
More information