Actuarial models — Edward Furman, Risk theory


Proof. We know that
$$S_{X|\Lambda}(x\,|\,\lambda) = \exp\left\{-\int_0^x h_{X|\Lambda}(t\,|\,\lambda)\,dt\right\},$$
which is
$$S_{X|\Lambda}(x\,|\,\lambda) = \exp\left\{-\lambda\int_0^x a(t)\,dt\right\} = \exp\{-\lambda A(x)\},$$
where $A(x) = \int_0^x a(t)\,dt$. Furthermore,
$$S_X(x) = \mathrm{E}[S_{X|\Lambda}(x\,|\,\Lambda)] = \mathrm{E}[e^{-\Lambda A(x)}] = M_\Lambda(-A(x)),$$
as required.

Sometimes the conditional hazard function can be very simple. Take $X|\Lambda \sim \mathrm{Exp}(\Lambda)$. Then $a(x) = 1$.

Example 1.9. Let $\Lambda \sim \mathrm{Ga}(\tau, \alpha)$ and $X|\Lambda \sim \mathrm{Wei}(\lambda, \gamma)$ with ddf $S(x) = e^{-\lambda x^{\gamma}}$. Use the frailty mixture model to find the ddf of $X$.

Solution. We showed that $S_X(x) = M_\Lambda(-A(x))$. In this case $\Lambda$ is a gamma rv, thus $M_\Lambda(t) = (1 - \alpha t)^{-\tau}$. Also, for the Weibull with the ddf above, $a(x) = \gamma x^{\gamma - 1}$.

Solution (cont.) Thus $A(x) = x^{\gamma}$, and
$$S_X(x) = M_\Lambda(-A(x)) = (1 + \alpha x^{\gamma})^{-\tau}.$$

Definition 1.8 (Spliced distributions). An $n$-component spliced distribution has the following pdf:
$$f(x) = \begin{cases} \alpha_1 f_1(x), & c_0 < x < c_1 \\ \alpha_2 f_2(x), & c_1 < x < c_2 \\ \vdots & \\ \alpha_n f_n(x), & c_{n-1} < x < c_n, \end{cases}$$
where for $j = 1, \dots, n$ we have $\alpha_j > 0$, $\alpha_1 + \cdots + \alpha_n = 1$, and every $f_j$ is a legitimate pdf with support on $(c_{j-1}, c_j)$.
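The closed form $S_X(x) = (1 + \alpha x^{\gamma})^{-\tau}$ can be sanity-checked numerically by averaging the conditional Weibull ddf over simulated gamma frailties. A minimal Python sketch (the parameter values below are arbitrary illustrations, not taken from the lecture):

```python
import math
import random

# Lambda ~ Ga(shape tau, scale alpha), so M_Lambda(t) = (1 - alpha*t)^(-tau);
# tau, alpha, gamma_ are arbitrary illustrative values.
tau, alpha, gamma_ = 2.0, 0.5, 1.5

def sx_closed(x):
    # S_X(x) = M_Lambda(-A(x)) with A(x) = x^gamma
    return (1.0 + alpha * x ** gamma_) ** (-tau)

def sx_monte_carlo(x, n=200_000, seed=1):
    # Average the conditional ddf exp(-lam * x^gamma) over simulated frailties.
    rng = random.Random(seed)
    a_x = x ** gamma_
    return sum(math.exp(-rng.gammavariate(tau, alpha) * a_x) for _ in range(n)) / n

err = abs(sx_monte_carlo(1.3) - sx_closed(1.3))
```

With 200,000 simulated frailties the Monte Carlo estimate agrees with the closed form to well within one percent.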

Example 1.10. Show that the pdf below is a spliced pdf:
$$f(x) = \begin{cases} 0.01, & 0 \le x < 50 \\ 0.02, & 50 \le x < 75. \end{cases}$$

Solution. Indeed,
$$f(x) = \begin{cases} 0.5 \cdot \tfrac{1}{50}, & 0 \le x < 50 \\ 0.5 \cdot \tfrac{1}{25}, & 50 \le x < 75, \end{cases}$$
where both $f_1$ and $f_2$ are uniform densities on different intervals.

The splicing approach is in a way similar to the mixing one. Here, however, the processes generating losses differ with respect to loss amounts; this is not true for the mixing approach.
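As a quick check that the splicing weights in Example 1.10 are legitimate, one can verify that the density integrates to one and that the weights sum to one (a small stdlib-only sketch):

```python
# Spliced density from Example 1.10: a piecewise-constant pdf.
def f(x):
    if 0 <= x < 50:
        return 0.01
    if 50 <= x < 75:
        return 0.02
    return 0.0

# Exact total mass of a piecewise-constant density: sum of height * width.
total_mass = 0.01 * 50 + 0.02 * 25

# The splicing weights are the masses of the two pieces.
alpha1 = 0.01 * 50   # weight on (0, 50), where f_1 is uniform with density 1/50
alpha2 = 0.02 * 25   # weight on (50, 75), where f_2 is uniform with density 1/25
```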

In addition to the methods described above, we can obtain new distributions as limiting cases of other ones.

Let $X \sim \mathrm{Ga}(\gamma, 1)$. Then $Y = \theta X^{1/\tau}$, $\tau > 0$, has a generalized (actuaries call it transformed) gamma distribution. The density is
$$f(y) = \frac{\tau y^{\tau\gamma - 1}}{\theta^{\tau\gamma}\,\Gamma(\gamma)} \exp\left\{-\left(\frac{y}{\theta}\right)^{\tau}\right\}, \quad y > 0.$$

For $a > 0$, $b > 0$, let the beta function be
$$B(a, b) = \int_0^1 t^{a-1}(1-t)^{b-1}\,dt = \frac{\Gamma(a)\Gamma(b)}{\Gamma(a+b)}.$$
Then let $X \sim \mathrm{Be}(\tau, \alpha)$ be a beta random variable with the pdf
$$f_X(x) = \frac{1}{B(\tau, \alpha)}\, x^{\tau-1}(1-x)^{\alpha-1}, \quad 0 \le x \le 1.$$

11 Let Y = θ(x/(1 X)) 1/γ. Then, we have that the pdf of Y is the generalized beta of the second kind (actuaries call it the transformed beta) f Y (y) = 1 γ(x/θ) γτ Be(τ, α) x(1 + (x/θ) γ, y 0. ) α+τ Example 1.11 The generalized (transformed) gamma distribution is limiting case of the generalized (transformed) beta distribution when θ,α,θ/α 1/γ ξ. Solution The well-known Stirling s approximation says that for large α s, Γ(α) = e α α α 0.5 2π. Edward Furman Risk theory / 72

Solution (cont.) Thus, for large $\alpha$, setting $\theta = \xi\alpha^{1/\gamma}$ and using the pdf of the generalized beta rv, we have
$$\begin{aligned}
f_Y(y) &= \frac{\Gamma(\alpha+\tau)}{\Gamma(\alpha)\Gamma(\tau)} \cdot \frac{\gamma (y/\theta)^{\gamma\tau}}{y\left(1 + (y/\theta)^{\gamma}\right)^{\alpha+\tau}} \\
&= \frac{e^{-(\tau+\alpha)}(\tau+\alpha)^{\tau+\alpha-0.5}\sqrt{2\pi}\;\gamma y^{\gamma\tau-1}}{e^{-\alpha}\alpha^{\alpha-0.5}\sqrt{2\pi}\;\Gamma(\tau)\,\theta^{\gamma\tau}\left(1 + y^{\gamma}\theta^{-\gamma}\right)^{\alpha+\tau}} \\
&= \frac{e^{-\tau}(\tau+\alpha)^{\tau+\alpha-0.5}\;\gamma y^{\gamma\tau-1}}{\alpha^{\alpha-0.5}\,\Gamma(\tau)\,\theta^{\gamma\tau}\left(1 + y^{\gamma}\theta^{-\gamma}\right)^{\alpha+\tau}} \\
&= \frac{e^{-\tau}(\tau+\alpha)^{\tau+\alpha-0.5}\;\gamma y^{\gamma\tau-1}}{\alpha^{\tau+\alpha-0.5}\,\Gamma(\tau)\,\xi^{\gamma\tau}\left(1 + (y/\xi)^{\gamma}/\alpha\right)^{\alpha+\tau}} \\
&= \frac{e^{-\tau}(\tau/\alpha+1)^{\tau+\alpha-0.5}\;\gamma y^{\gamma\tau-1}}{\Gamma(\tau)\,\xi^{\gamma\tau}\left(1 + (y/\xi)^{\gamma}/\alpha\right)^{\alpha+\tau}}.
\end{aligned}$$

Solution (cont.) We also know that
$$\lim_{\alpha\to\infty}\left(1 + \frac{\tau}{\alpha}\right)^{\alpha+\tau-0.5} = e^{\tau},$$
and similarly
$$\lim_{\alpha\to\infty}\left(1 + \frac{(y/\xi)^{\gamma}}{\alpha}\right)^{\alpha+\tau} = e^{(y/\xi)^{\gamma}}.$$
Hence
$$\lim_{\alpha\to\infty} f_Y(y) = \frac{\gamma y^{\gamma\tau-1}\, e^{-(y/\xi)^{\gamma}}}{\xi^{\gamma\tau}\,\Gamma(\tau)},$$
which is the pdf of a generalized (transformed) gamma rv, as needed.
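The limit in Example 1.11 can be checked numerically by comparing log-densities for a large $\alpha$ with $\theta = \xi\alpha^{1/\gamma}$. A sketch with arbitrary illustrative parameter values (log-gamma is used to avoid overflow):

```python
import math

# Arbitrary illustrative parameters; theta = xi * alpha^(1/gamma) as alpha grows.
tau, gamma_, xi = 2.0, 1.5, 1.0
y = 0.8

def log_gb2_pdf(y, alpha):
    # log-density of the generalized beta of the second kind
    theta = xi * alpha ** (1.0 / gamma_)
    return (math.lgamma(alpha + tau) - math.lgamma(alpha) - math.lgamma(tau)
            + math.log(gamma_) + gamma_ * tau * math.log(y / theta) - math.log(y)
            - (alpha + tau) * math.log1p((y / theta) ** gamma_))

def log_tg_pdf(y):
    # log-density of the limiting transformed gamma
    return (math.log(gamma_) + (gamma_ * tau - 1) * math.log(y)
            - (y / xi) ** gamma_ - gamma_ * tau * math.log(xi) - math.lgamma(tau))

gap = abs(log_gb2_pdf(y, 1e7) - log_tg_pdf(y))
```

The gap shrinks roughly like $1/\alpha$, so at $\alpha = 10^7$ the two log-densities are numerically indistinguishable.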

Figure: Beta family of distributions.

Definition 1.9 (Linear exponential family). A random variable $X$ has a distribution belonging to the linear exponential family if its pdf can be reformulated in terms of a parameter $\theta$ as
$$f(x; \theta) = \frac{p(x)\, e^{r(\theta)x}}{q(\theta)}.$$
Here $p(x)$ does not depend on $\theta$, the function $q(\theta)$ is a normalizing constant, and $r(\theta)$ is the canonical parameter. The support of $X$ must be independent of $\theta$.

Example 1.12. The gamma distribution is in the linear exponential family, i.e.,
$$f(x; \theta) = \frac{e^{-x/\theta}\, x^{\gamma-1}}{\theta^{\gamma}\,\Gamma(\gamma)}$$
is of the above form with $r(\theta) = -1/\theta$, $q(\theta) = \theta^{\gamma}$ and $p(x) = x^{\gamma-1}/\Gamma(\gamma)$.

Example 1.13. A normal random variable $X \sim N(\theta, \nu)$ is in the linear exponential family. Indeed,
$$\begin{aligned}
f(x; \theta) &= (2\pi\nu)^{-1/2} \exp\left\{-\frac{1}{2\nu}(x - \theta)^2\right\} \\
&= (2\pi\nu)^{-1/2} \exp\left\{-\frac{x^2}{2\nu} + \frac{\theta}{\nu}x - \frac{\theta^2}{2\nu}\right\} \\
&= \frac{(2\pi\nu)^{-1/2} \exp\left\{-\frac{x^2}{2\nu}\right\}\exp\left\{\frac{\theta}{\nu}x\right\}}{\exp\left\{\frac{\theta^2}{2\nu}\right\}},
\end{aligned}$$
that is, linear exponential family with $r(\theta) = \theta/\nu$, $p(x) = (2\pi\nu)^{-1/2}\exp\{-x^2/(2\nu)\}$, $q(\theta) = \exp\{\theta^2/(2\nu)\}$.

Proposition 1.7. Let $X \sim \mathrm{LEF}(\theta)$. Then its expected value is
$$\mathrm{E}[X] = \frac{q'(\theta)}{r'(\theta)\, q(\theta)}.$$

Proof. First note that
$$\log f(x; \theta) = \log p(x) + r(\theta)x - \log q(\theta).$$
Also,
$$\begin{aligned}
\frac{\partial}{\partial\theta} f(x; \theta) &= \frac{1}{q(\theta)^2}\left(p(x)e^{r(\theta)x}\, r'(\theta)\, x\, q(\theta) - p(x)e^{r(\theta)x}\, q'(\theta)\right) \\
&= f(x; \theta)\left(r'(\theta)x - q'(\theta)/q(\theta)\right).
\end{aligned}$$

Proof (cont.) We will now integrate the above with respect to $x$. Then
$$\int \frac{\partial}{\partial\theta} f(x; \theta)\,dx = r'(\theta)\int x f(x; \theta)\,dx - \frac{q'(\theta)}{q(\theta)}\int f(x; \theta)\,dx.$$
Note that the support of $X$ is independent of $\theta$ by the definition of the family, and so is the range of integration. Therefore
$$\frac{\partial}{\partial\theta}\int f(x; \theta)\,dx = r'(\theta)\int x f(x; \theta)\,dx - \frac{q'(\theta)}{q(\theta)}\int f(x; \theta)\,dx,$$
which is equivalent to saying that
$$\frac{\partial}{\partial\theta}(1) = 0 = r'(\theta)\,\mathrm{E}[X] - \frac{q'(\theta)}{q(\theta)}.$$
Or, in other words:

Proof (cont.)
$$\mathrm{E}[X] = \frac{q'(\theta)}{r'(\theta)\, q(\theta)} = \mu(\theta),$$
which completes the proof.

Proposition 1.8. Let $X \sim \mathrm{LEF}(\theta)$. Then its variance is
$$\mathrm{Var}[X] = \frac{\mu'(\theta)}{r'(\theta)}.$$

Proof. We will use the same technique as before. We have already shown that
$$\frac{\partial}{\partial\theta} f(x; \theta) = r'(\theta)\left(x - \mu(\theta)\right) f(x; \theta).$$

Proof (cont.) Differentiating again with respect to $\theta$ and using the already obtained first derivative of $f(x; \theta)$, we have
$$\begin{aligned}
\frac{\partial^2 f(x; \theta)}{\partial\theta^2} &= r''(\theta)(x - \mu(\theta)) f(x; \theta) + r'(\theta)(x - \mu(\theta)) \frac{\partial}{\partial\theta} f(x; \theta) - r'(\theta)\mu'(\theta) f(x; \theta) \\
&= r''(\theta)(x - \mu(\theta)) f(x; \theta) + (r'(\theta))^2 (x - \mu(\theta))^2 f(x; \theta) - r'(\theta)\mu'(\theta) f(x; \theta).
\end{aligned}$$
Also, because $\mu(\theta)$ is the mean (so the $r''(\theta)$ term integrates to zero),
$$\int \frac{\partial^2}{\partial\theta^2} f(x; \theta)\,dx = (r'(\theta))^2\,\mathrm{Var}[X] - r'(\theta)\mu'(\theta).$$
Recalling that the range of integration does not depend on $\theta$ because of the support of $X$, we obtain:

Proof (cont.)
$$0 = \frac{\partial^2}{\partial\theta^2}\int f(x; \theta)\,dx = (r'(\theta))^2\,\mathrm{Var}[X] - r'(\theta)\mu'(\theta),$$
from which
$$\mathrm{Var}[X] = \frac{\mu'(\theta)}{r'(\theta)},$$
as required.

We shall further calculate the CTE of $X$ at the confidence level $x_q = \mathrm{VaR}_q[X]$.

Proposition 1.9. The $\mathrm{CTE}_q[X]$ is given by
$$\mathrm{CTE}_q[X] = \mu(\theta) + \frac{1}{r'(\theta)} \frac{\partial}{\partial\theta} \log S(x_q; \theta).$$
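Propositions 1.7 and 1.8 can be checked numerically for the gamma member of the family: with $r(\theta) = -1/\theta$ and $q(\theta) = \theta^{\gamma}$, the formulas should reproduce the known gamma moments $\gamma\theta$ and $\gamma\theta^2$. A sketch using finite differences (the parameter values are arbitrary illustrations):

```python
# Gamma as a member of the LEF (Example 1.12): r(theta) = -1/theta,
# q(theta) = theta^gamma.  Check E[X] = q'/(r' q) and Var[X] = mu'/r'.
gamma_, theta = 2.5, 1.7      # arbitrary illustrative parameters
h, H = 1e-5, 1e-3             # inner/outer finite-difference steps

def r(t): return -1.0 / t
def q(t): return t ** gamma_

def d(fn, t):
    # central-difference approximation of fn'(t)
    return (fn(t + h) - fn(t - h)) / (2 * h)

def mu(t):
    # mu(t) = q'(t) / (r'(t) q(t))  -- Proposition 1.7
    return d(q, t) / (d(r, t) * q(t))

mean_lef = mu(theta)                                               # should be gamma*theta
var_lef = (mu(theta + H) - mu(theta - H)) / (2 * H) / d(r, theta)  # Proposition 1.8
```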

Proof.
$$\begin{aligned}
\frac{\partial}{\partial\theta} \log S(x_q; \theta) &= \frac{1}{S(x_q; \theta)} \frac{\partial}{\partial\theta} S(x_q; \theta) \\
&= \frac{1}{S(x_q; \theta)} \frac{\partial}{\partial\theta} \int_{x_q}^{\infty} f(x; \theta)\,dx \\
&= \frac{1}{S(x_q; \theta)} \int_{x_q}^{\infty} \frac{\partial}{\partial\theta} f(x; \theta)\,dx.
\end{aligned}$$
We have already shown in Proposition 1.7 that
$$\frac{\partial}{\partial\theta} f(x; \theta) = f(x; \theta)\left(r'(\theta)x - q'(\theta)/q(\theta)\right).$$
Thus we have that
$$\frac{\partial}{\partial\theta} \log S(x_q; \theta) = \frac{1}{S(x_q; \theta)} \int_{x_q}^{\infty} f(x; \theta)\left(r'(\theta)x - q'(\theta)/q(\theta)\right)dx.$$

Proof (cont.) The latter equation reduces to
$$\frac{\partial}{\partial\theta} \log S(x_q; \theta) = r'(\theta)\,\mathrm{CTE}_q[X] - \frac{q'(\theta)}{q(\theta)},$$
from which we easily deduce that
$$\mathrm{CTE}_q[X] = \frac{1}{r'(\theta)} \frac{\partial}{\partial\theta} \log S(x_q; \theta) + \frac{q'(\theta)}{r'(\theta)\, q(\theta)},$$
which completes the proof.

Example 1.14 (CTE for gamma as a member of the LEF). Let $X \sim \mathrm{Ga}(\gamma, \theta)$. Use the fact that it is in the LEF to derive the CTE risk measure.

Solution. Gamma is in the LEF with $r(\theta) = -1/\theta$, $p(x) = x^{\gamma-1}/\Gamma(\gamma)$ and $q(\theta) = \theta^{\gamma}$. Thus we have
$$r'(\theta) = \theta^{-2},$$
and also
$$\mu(\theta) = \frac{q'(\theta)}{r'(\theta)\, q(\theta)} = \frac{\gamma\theta^{\gamma-1}}{\theta^{\gamma-2}} = \gamma\theta.$$
Furthermore, for the cdf of gamma $G(\cdot\,; \gamma, \theta)$,
$$S(x_q; \theta) = 1 - G(x_q; \gamma, \theta) = 1 - G(x_q/\theta; \gamma),$$
because gamma is in the scale family with scale parameter $\theta$. Furthermore,
$$\frac{\partial}{\partial\theta} \int_0^{x_q/\theta} \frac{e^{-x}\, x^{\gamma-1}}{\Gamma(\gamma)}\,dx = -\frac{e^{-x_q/\theta}(x_q/\theta)^{\gamma-1}}{\Gamma(\gamma)} \cdot \frac{x_q}{\theta^2}.$$

Solution (cont.) After all, we have the CTE risk measure given by
$$\begin{aligned}
\mathrm{CTE}_q[X] &= \frac{1}{r'(\theta)} \frac{\partial}{\partial\theta} \log S(x_q; \theta) + \frac{q'(\theta)}{r'(\theta)\, q(\theta)} \\
&= \gamma\theta + x_q\, \frac{1}{S(x_q/\theta; \gamma)} \cdot \frac{e^{-x_q/\theta}(x_q/\theta)^{\gamma-1}}{\Gamma(\gamma)} \\
&= \gamma\theta + x_q\, h(x_q/\theta),
\end{aligned}$$
with $h(\cdot)$ the hazard function of a $\mathrm{Ga}(\gamma, 1)$ random variable.

Which random variable have we already seen to possess a CTE of a form similar to the one above? We have proved that for $X \sim \mathrm{Ga}(\gamma, \theta)$,
$$\mathrm{CTE}_q[X] = \mathrm{E}[X]\, \frac{1 - G(x_q; \gamma+1, \theta)}{1 - G(x_q; \gamma, \theta)}.$$
Make sure you can show the equivalence of the two.
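The identity $\mathrm{CTE}_q[X] = \gamma\theta + x_q h(x_q/\theta)$ can be checked by simulation: compare the empirical mean of the losses beyond the empirical $\mathrm{VaR}_q$ with the formula. A sketch with arbitrary illustrative parameters (the survival probability inside the hazard is estimated by the tail fraction of the same sample):

```python
import math
import random

# Arbitrary illustrative parameters.
gamma_, theta, q = 2.0, 1.0, 0.95

rng = random.Random(7)
n = 400_000
sample = sorted(rng.gammavariate(gamma_, theta) for _ in range(n))

k = int(q * n)
x_q = sample[k]                      # empirical VaR_q
tail = sample[k:]                    # losses beyond the VaR
cte_mc = sum(tail) / len(tail)       # empirical CTE

# CTE_q = gamma*theta + x_q * h(x_q/theta), with h the hazard of Ga(gamma, 1).
u = x_q / theta
pdf_u = math.exp(-u) * u ** (gamma_ - 1) / math.gamma(gamma_)
surv_u = len(tail) / n               # ~ S(x_q/theta; gamma)
cte_formula = gamma_ * theta + x_q * pdf_u / surv_u
```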

Definition 1.10 (Univariate elliptical family). We shall say that $X \sim E(a, b)$ is a univariate elliptical random variable if, when $X$ has a density, it is given by the form
$$f(x) = \frac{c}{b}\, g\!\left(\frac{1}{2}\left(\frac{x-a}{b}\right)^2\right), \quad x \in \mathbb{R},$$
where $g: [0, \infty) \to [0, \infty)$ is such that
$$\int_0^{\infty} x^{-1/2}\, g(x)\,dx < \infty.$$
Both the expectation and the variance can be infinite. For instance, to assure that the expectation is finite it must hold that
$$\int_0^{\infty} g(x)\,dx < \infty$$
(change of variables in the integral for the mean).

Proposition 1.10. Let $X \sim E(a, b)$ have a density as before. The normalizing constant is then
$$c = \left(\sqrt{2}\int_0^{\infty} x^{-1/2}\, g(x)\,dx\right)^{-1}.$$

Proof. From the fact that $f(x)$ is a density it must be that
$$\int f(x)\,dx = \frac{c}{b}\int g\!\left(\frac{1}{2}\left(\frac{x-a}{b}\right)^2\right)dx = c\int g\!\left(\frac{1}{2}u^2\right)du = 2c\int_0^{\infty} g\!\left(\frac{1}{2}u^2\right)du,$$
noticing that we have an integral of an even function.

Proof (cont.) By the simple change of variables $z = u^2/2$, we have
$$\int f(x)\,dx = 2c\int_0^{\infty} (2z)^{-1/2}\, g(z)\,dz = \sqrt{2}\, c\int_0^{\infty} z^{-1/2}\, g(z)\,dz = 1,$$
from which
$$c = \left(\sqrt{2}\int_0^{\infty} z^{-1/2}\, g(z)\,dz\right)^{-1},$$
as required.

Example 1.15 (Normal rv as a member of the elliptical family). Let $g(x) = e^{-x}$; then
$$f(x) = \frac{c}{b}\exp\left\{-\frac{1}{2}\left(\frac{x-a}{b}\right)^2\right\}, \quad x \in \mathbb{R},$$
which is a normal density, and the normalizing constant is
$$c = \left(\sqrt{2}\int_0^{\infty} x^{-1/2}\, e^{-x}\,dx\right)^{-1} = \left(\sqrt{2}\,\Gamma\!\left(\tfrac{1}{2}\right)\right)^{-1} = \frac{1}{\sqrt{2\pi}}.$$
Hint: to calculate $\Gamma(1/2)$, change variables to $x = t^2/2$ and reduce to the normal integral.

Example 1.16. Let $g(x) = \left(1 + \frac{x}{k_p}\right)^{-p}$, where $p > 1/2$ and $k_p$ is a constant such that
$$k_p = \begin{cases} (2p-3)/2, & p > 3/2 \\ 1/2, & 1/2 < p \le 3/2. \end{cases}$$
The density is then
$$f(x) = \frac{c}{b}\left(1 + \frac{(x-a)^2}{2 k_p b^2}\right)^{-p}, \quad x \in \mathbb{R}.$$
To find $c$, we have to evaluate the integral
$$\int_0^{\infty} z^{-1/2}\, g(z)\,dz = \int_0^{\infty} z^{-1/2}\left(1 + z/k_p\right)^{-p}dz,$$
which after a change of variables becomes:

Solution.
$$\int_0^{\infty} z^{-1/2}\, g(z)\,dz = k_p^{1/2}\int_0^{\infty} u^{-1/2}(1+u)^{-p}\,du = k_p^{1/2}\int_0^{\infty} \frac{u^{1/2-1}}{(1+u)^{1/2 + (p - 1/2)}}\,du = k_p^{1/2}\, B\!\left(\tfrac{1}{2},\; p - \tfrac{1}{2}\right).$$
Thus the normalizing constant is
$$c = \left(\sqrt{2 k_p}\; B\!\left(\tfrac{1}{2},\; p - \tfrac{1}{2}\right)\right)^{-1}.$$
The density of a Student-$t$ rv is then:

Solution (cont.)
$$f(x) = \frac{1}{b\sqrt{2 k_p}\; B\!\left(\tfrac{1}{2},\; p - \tfrac{1}{2}\right)}\left(1 + \frac{(x-a)^2}{2 k_p b^2}\right)^{-p}, \quad x \in \mathbb{R}.$$
Elliptical distributions belong to the location-scale family (check at home). Note that for $p \to \infty$, the Student-$t$ distribution above becomes a normal distribution. Also, for $p = 1$ we have the Cauchy distribution, which does not have a mean.

Let $G(x) = c\int_0^x g(y)\,dy$ and let $\overline{G}(x) = G(\infty) - G(x)$. Then we have the following proposition, which establishes the form of the CTE risk measure with $x_q = \mathrm{VaR}_q[X]$.
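The normalizing constant $c = (\sqrt{2 k_p}\, B(1/2, p - 1/2))^{-1}$ can be verified by numerically integrating the Student-$t$ density to one. A sketch for one arbitrary choice of $p$ (log-gamma is used for the beta function):

```python
import math

# Check the normalizing constant by integrating the elliptical
# Student-t density to 1; a, b, p are arbitrary illustrative values.
a, b, p = 0.0, 1.0, 2.0
k_p = (2 * p - 3) / 2 if p > 1.5 else 0.5

def beta_fn(x, y):
    # B(x, y) via log-gamma to avoid overflow
    return math.exp(math.lgamma(x) + math.lgamma(y) - math.lgamma(x + y))

c = 1.0 / (math.sqrt(2 * k_p) * beta_fn(0.5, p - 0.5))

def f(x):
    return (c / b) * (1 + (x - a) ** 2 / (2 * k_p * b * b)) ** (-p)

# Trapezoidal rule on a wide grid; for p = 2 the tails decay like |x|^(-4).
n, lo, hi = 200_000, -1000.0, 1000.0
step = (hi - lo) / n
mass = step * (0.5 * (f(lo) + f(hi)) + sum(f(lo + i * step) for i in range(1, n)))
```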

Proposition 1.11. Let $X \sim E(\mu, \beta)$ (i.e., it has finite mean $\mu$). Then
$$\mathrm{CTE}_q[X] = \mu + \lambda\beta^2,$$
where
$$\lambda = \frac{1}{\beta}\, \frac{\overline{G}\!\left(\frac{1}{2}\left(\frac{x_q - \mu}{\beta}\right)^2\right)}{S_X(x_q)}.$$

Proof. By definition, for $z_q = (x_q - \mu)/\beta$, we have
$$\mathrm{CTE}_q[X] = \frac{1}{S_X(x_q)}\int_{x_q}^{\infty} x\, \frac{c}{\beta}\, g\!\left(\frac{1}{2}\left(\frac{x-\mu}{\beta}\right)^2\right)dx = \frac{1}{S_X(x_q)}\int_{z_q}^{\infty} c(\mu + \beta z)\, g\!\left(\frac{1}{2}z^2\right)dz.$$

Proof (cont.) Further,
$$\begin{aligned}
\mathrm{CTE}_q[X] &= \mu + \frac{1}{S_X(x_q)}\int_{z_q}^{\infty} c\beta z\, g\!\left(\frac{1}{2}z^2\right)dz \\
&= \mu + \frac{1}{S_X(x_q)}\int_{z_q^2/2}^{\infty} c\beta\, g(u)\,du \\
&= \mu + \frac{\beta}{S_X(x_q)}\int_{z_q^2/2}^{\infty} c\, g(u)\,du \\
&= \mu + \beta^2\, \frac{1/\beta}{S_X(x_q)}\, \overline{G}\!\left(\frac{1}{2}z_q^2\right),
\end{aligned}$$
which completes the proof.
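For the normal member (Example 1.15, $g(x) = e^{-x}$), $\overline{G}(x) = e^{-x}/\sqrt{2\pi}$, and Proposition 1.11 reduces to the classical formula $\mathrm{CTE}_q[X] = \mu + \beta\,\varphi(z_q)/(1 - \Phi(z_q))$. A Monte Carlo sketch checking this (the parameter values are arbitrary illustrations):

```python
import math
import random

# Arbitrary illustrative location, scale and confidence level.
mu, beta, q = 1.0, 2.0, 0.9

def phi(z): return math.exp(-z * z / 2) / math.sqrt(2 * math.pi)
def Phi(z): return 0.5 * (1 + math.erf(z / math.sqrt(2)))

z_q = 1.2815515655446004            # standard normal 90% quantile
x_q = mu + beta * z_q
cte_formula = mu + beta * phi(z_q) / (1 - Phi(z_q))

# Monte Carlo check of the tail expectation E[X | X > x_q].
rng = random.Random(3)
sample = [mu + beta * rng.gauss(0, 1) for _ in range(400_000)]
tail = [x for x in sample if x > x_q]
cte_mc = sum(tail) / len(tail)
```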

We have so far seen distributions and families of distributions which are used to describe risk amounts. It is also important to model the number of risks received by an insurer. For this purpose we shall turn to counting distributions. Recall that the pgf is
$$P(z) = \mathrm{E}[z^X] = \sum_x z^x p(x)$$
and
$$\frac{d^m}{dz^m} P(z) = \mathrm{E}\left[\frac{d^m}{dz^m} z^X\right] = \mathrm{E}\left[X(X-1)\cdots(X-m+1)\, z^{X-m}\right].$$

Example 1.17 (Poisson distribution). We consider the Poisson rv $X \sim \mathrm{Po}(\lambda)$, $\lambda > 0$. The pmf is
$$P[X = x] = \frac{e^{-\lambda}\lambda^x}{x!}, \quad x = 0, 1, 2, \dots$$
The pgf is
$$P(z) = \mathrm{E}[z^X] = \sum_{x=0}^{\infty} \frac{e^{-\lambda}(z\lambda)^x}{x!} = \exp\{-\lambda(1-z)\}.$$
Also:
$$\mathrm{E}[X] = \frac{d}{dz}P(z)\Big|_{z=1} = \lambda, \qquad \mathrm{E}[X(X-1)] = \frac{d^2}{dz^2}P(z)\Big|_{z=1} = \lambda^2,$$
$$\mathrm{Var}[X] = \mathrm{E}[X(X-1)] + \mathrm{E}[X] - \mathrm{E}^2[X] = \lambda^2 + \lambda - \lambda^2 = \lambda.$$

Proposition 1.12. Let $X_1, \dots, X_n$ be independent Poisson rv's with parameters $\lambda_1, \dots, \lambda_n$. Then $S = X_1 + \cdots + X_n$ is Poisson with $\lambda = \lambda_1 + \cdots + \lambda_n$.

Proof.
$$P_S(z) = \mathrm{E}[z^S] = \mathrm{E}[z^{X_1 + \cdots + X_n}] = \prod_{j=1}^{n} \mathrm{E}[z^{X_j}] = \prod_{j=1}^{n} \exp\{\lambda_j(z-1)\} = \exp\left\{\sum_{j=1}^{n}\lambda_j (z-1)\right\},$$
which completes the proof.
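Proposition 1.12 can also be checked by exact convolution of the pmfs on a truncated support. A sketch with arbitrary rates:

```python
import math

def poisson_pmf(lam, x):
    return math.exp(-lam) * lam ** x / math.factorial(x)

lams = [1.2, 0.7, 2.1]        # arbitrary rates

# Convolve the pmfs of X_1, X_2, X_3 on a truncated support {0, ..., N}.
N = 60
pmf_s = [1.0] + [0.0] * N     # point mass at 0: the sum of zero variables
for lam in lams:
    comp = [poisson_pmf(lam, x) for x in range(N + 1)]
    new = [0.0] * (N + 1)
    for i, pi in enumerate(pmf_s):
        for j in range(N + 1 - i):
            new[i + j] += pi * comp[j]
    pmf_s = new

# Compare against Po(lam_1 + lam_2 + lam_3) on the first 30 points.
total_lam = sum(lams)
max_gap = max(abs(pmf_s[x] - poisson_pmf(total_lam, x)) for x in range(30))
```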

Proposition 1.13. Let the number of risks be $X \sim \mathrm{Po}(\lambda)$. Also, let each risk be classified into one of $m$ types with probabilities $p_j$, $j = 1, \dots, m$, independently of all others. Then we have that $X_j \sim \mathrm{Po}(\lambda p_j)$, $j = 1, \dots, m$, and they are independent.

Proof. We have that
$$P[X_1 = x_1, \dots, X_m = x_m] = P[X_1 = x_1, \dots, X_m = x_m \mid X = x]\, P[X = x].$$
Here the conditional probability is multinomial by its definition; thus, for $x = x_1 + \cdots + x_m$,
$$P[X_1 = x_1, \dots, X_m = x_m] = \frac{x!}{x_1! \cdots x_m!}\, p_1^{x_1} \cdots p_m^{x_m}\, \frac{e^{-\lambda}\lambda^x}{x!}.$$

Proof (cont.) Hence
$$P[X_1 = x_1, \dots, X_m = x_m] = \frac{1}{x_1! \cdots x_m!}\, p_1^{x_1} \cdots p_m^{x_m}\, e^{-\lambda}\lambda^{x_1 + \cdots + x_m} = \frac{1}{x_1! \cdots x_m!}\, (\lambda p_1)^{x_1} \cdots (\lambda p_m)^{x_m}\, e^{-\sum_{j=1}^m \lambda p_j} = \prod_{j=1}^{m} \frac{e^{-\lambda p_j}(\lambda p_j)^{x_j}}{x_j!},$$
which factorizes into the product of $\mathrm{Po}(\lambda p_j)$ pmf's, establishing both the stated marginals and the independence.
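The thinning result can be illustrated by simulation: draw a Poisson total, classify each risk independently, and check the marginal means and that the type counts are uncorrelated. A stdlib-only sketch (rate, type probability and seed are arbitrary illustrative choices):

```python
import math
import random

# Thin a Po(lam) risk count into m = 2 types with probabilities p1, p2.
lam, p1 = 5.0, 0.3
p2 = 1.0 - p1
rng = random.Random(11)
n = 200_000

x1s, x2s = [], []
for _ in range(n):
    # Sample the total count X ~ Po(lam) by cdf inversion.
    u, k, term = rng.random(), 0, math.exp(-lam)
    cdf = term
    while u > cdf:
        k += 1
        term *= lam / k
        cdf += term
    # Classify each of the k risks independently as type 1 or type 2.
    n1 = sum(1 for _ in range(k) if rng.random() < p1)
    x1s.append(n1)
    x2s.append(k - n1)

mean1 = sum(x1s) / n                 # should be close to lam * p1
mean2 = sum(x2s) / n                 # should be close to lam * p2
cov = sum(a * b for a, b in zip(x1s, x2s)) / n - mean1 * mean2  # ~ 0
```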


More information

Brief Review of Probability

Brief Review of Probability Maura Department of Economics and Finance Università Tor Vergata Outline 1 Distribution Functions Quantiles and Modes of a Distribution 2 Example 3 Example 4 Distributions Outline Distribution Functions

More information

Exponential Families

Exponential Families Exponential Families David M. Blei 1 Introduction We discuss the exponential family, a very flexible family of distributions. Most distributions that you have heard of are in the exponential family. Bernoulli,

More information

Slides 8: Statistical Models in Simulation

Slides 8: Statistical Models in Simulation Slides 8: Statistical Models in Simulation Purpose and Overview The world the model-builder sees is probabilistic rather than deterministic: Some statistical model might well describe the variations. An

More information

Stat 426 : Homework 1.

Stat 426 : Homework 1. Stat 426 : Homework 1. Moulinath Banerjee University of Michigan Announcement: The homework carries 120 points and contributes 10 points to the total grade. (1) A geometric random variable W takes values

More information

Chapter 3, 4 Random Variables ENCS Probability and Stochastic Processes. Concordia University

Chapter 3, 4 Random Variables ENCS Probability and Stochastic Processes. Concordia University Chapter 3, 4 Random Variables ENCS6161 - Probability and Stochastic Processes Concordia University ENCS6161 p.1/47 The Notion of a Random Variable A random variable X is a function that assigns a real

More information

Random Variables. Random variables. A numerically valued map X of an outcome ω from a sample space Ω to the real line R

Random Variables. Random variables. A numerically valued map X of an outcome ω from a sample space Ω to the real line R In probabilistic models, a random variable is a variable whose possible values are numerical outcomes of a random phenomenon. As a function or a map, it maps from an element (or an outcome) of a sample

More information

March 10, 2017 THE EXPONENTIAL CLASS OF DISTRIBUTIONS

March 10, 2017 THE EXPONENTIAL CLASS OF DISTRIBUTIONS March 10, 2017 THE EXPONENTIAL CLASS OF DISTRIBUTIONS Abstract. We will introduce a class of distributions that will contain many of the discrete and continuous we are familiar with. This class will help

More information

UC Berkeley Department of Electrical Engineering and Computer Sciences. EECS 126: Probability and Random Processes

UC Berkeley Department of Electrical Engineering and Computer Sciences. EECS 126: Probability and Random Processes UC Berkeley Department of Electrical Engineering and Computer Sciences EECS 6: Probability and Random Processes Problem Set 3 Spring 9 Self-Graded Scores Due: February 8, 9 Submit your self-graded scores

More information

BMIR Lecture Series on Probability and Statistics Fall 2015 Discrete RVs

BMIR Lecture Series on Probability and Statistics Fall 2015 Discrete RVs Lecture #7 BMIR Lecture Series on Probability and Statistics Fall 2015 Department of Biomedical Engineering and Environmental Sciences National Tsing Hua University 7.1 Function of Single Variable Theorem

More information

STA 256: Statistics and Probability I

STA 256: Statistics and Probability I Al Nosedal. University of Toronto. Fall 2017 My momma always said: Life was like a box of chocolates. You never know what you re gonna get. Forrest Gump. Exercise 4.1 Let X be a random variable with p(x)

More information

Chapter 3 sections. SKIP: 3.10 Markov Chains. SKIP: pages Chapter 3 - continued

Chapter 3 sections. SKIP: 3.10 Markov Chains. SKIP: pages Chapter 3 - continued Chapter 3 sections 3.1 Random Variables and Discrete Distributions 3.2 Continuous Distributions 3.3 The Cumulative Distribution Function 3.4 Bivariate Distributions 3.5 Marginal Distributions 3.6 Conditional

More information

t x 1 e t dt, and simplify the answer when possible (for example, when r is a positive even number). In particular, confirm that EX 4 = 3.

t x 1 e t dt, and simplify the answer when possible (for example, when r is a positive even number). In particular, confirm that EX 4 = 3. Mathematical Statistics: Homewor problems General guideline. While woring outside the classroom, use any help you want, including people, computer algebra systems, Internet, and solution manuals, but mae

More information

S6880 #7. Generate Non-uniform Random Number #1

S6880 #7. Generate Non-uniform Random Number #1 S6880 #7 Generate Non-uniform Random Number #1 Outline 1 Inversion Method Inversion Method Examples Application to Discrete Distributions Using Inversion Method 2 Composition Method Composition Method

More information

Chapter 5. Chapter 5 sections

Chapter 5. Chapter 5 sections 1 / 43 sections Discrete univariate distributions: 5.2 Bernoulli and Binomial distributions Just skim 5.3 Hypergeometric distributions 5.4 Poisson distributions Just skim 5.5 Negative Binomial distributions

More information

Exam C. Exam C. Exam C. Exam C. Exam C

Exam C. Exam C. Exam C. Exam C. Exam C cumulative distribution function distribution function cdf survival function probability density function density function probability function probability mass function hazard rate force of mortality

More information

Review for the previous lecture

Review for the previous lecture Lecture 1 and 13 on BST 631: Statistical Theory I Kui Zhang, 09/8/006 Review for the previous lecture Definition: Several discrete distributions, including discrete uniform, hypergeometric, Bernoulli,

More information

PCMI Introduction to Random Matrix Theory Handout # REVIEW OF PROBABILITY THEORY. Chapter 1 - Events and Their Probabilities

PCMI Introduction to Random Matrix Theory Handout # REVIEW OF PROBABILITY THEORY. Chapter 1 - Events and Their Probabilities PCMI 207 - Introduction to Random Matrix Theory Handout #2 06.27.207 REVIEW OF PROBABILITY THEORY Chapter - Events and Their Probabilities.. Events as Sets Definition (σ-field). A collection F of subsets

More information

) ) = γ. and P ( X. B(a, b) = Γ(a)Γ(b) Γ(a + b) ; (x + y, ) I J}. Then, (rx) a 1 (ry) b 1 e (x+y)r r 2 dxdy Γ(a)Γ(b) D

) ) = γ. and P ( X. B(a, b) = Γ(a)Γ(b) Γ(a + b) ; (x + y, ) I J}. Then, (rx) a 1 (ry) b 1 e (x+y)r r 2 dxdy Γ(a)Γ(b) D 3 Independent Random Variables II: Examples 3.1 Some functions of independent r.v. s. Let X 1, X 2,... be independent r.v. s with the known distributions. Then, one can compute the distribution of a r.v.

More information

Distributions of Functions of Random Variables. 5.1 Functions of One Random Variable

Distributions of Functions of Random Variables. 5.1 Functions of One Random Variable Distributions of Functions of Random Variables 5.1 Functions of One Random Variable 5.2 Transformations of Two Random Variables 5.3 Several Random Variables 5.4 The Moment-Generating Function Technique

More information

ECE353: Probability and Random Processes. Lecture 18 - Stochastic Processes

ECE353: Probability and Random Processes. Lecture 18 - Stochastic Processes ECE353: Probability and Random Processes Lecture 18 - Stochastic Processes Xiao Fu School of Electrical Engineering and Computer Science Oregon State University E-mail: xiao.fu@oregonstate.edu From RV

More information

Statistics 1B. Statistics 1B 1 (1 1)

Statistics 1B. Statistics 1B 1 (1 1) 0. Statistics 1B Statistics 1B 1 (1 1) 0. Lecture 1. Introduction and probability review Lecture 1. Introduction and probability review 2 (1 1) 1. Introduction and probability review 1.1. What is Statistics?

More information

MFM Practitioner Module: Quantitative Risk Management. John Dodson. September 23, 2015

MFM Practitioner Module: Quantitative Risk Management. John Dodson. September 23, 2015 MFM Practitioner Module: Quantitative Risk Management September 23, 2015 Mixtures Mixtures Mixtures Definitions For our purposes, A random variable is a quantity whose value is not known to us right now

More information

THE QUEEN S UNIVERSITY OF BELFAST

THE QUEEN S UNIVERSITY OF BELFAST THE QUEEN S UNIVERSITY OF BELFAST 0SOR20 Level 2 Examination Statistics and Operational Research 20 Probability and Distribution Theory Wednesday 4 August 2002 2.30 pm 5.30 pm Examiners { Professor R M

More information

3 Continuous Random Variables

3 Continuous Random Variables Jinguo Lian Math437 Notes January 15, 016 3 Continuous Random Variables Remember that discrete random variables can take only a countable number of possible values. On the other hand, a continuous random

More information

2 Continuous Random Variables and their Distributions

2 Continuous Random Variables and their Distributions Name: Discussion-5 1 Introduction - Continuous random variables have a range in the form of Interval on the real number line. Union of non-overlapping intervals on real line. - We also know that for any

More information

Math 362, Problem set 1

Math 362, Problem set 1 Math 6, roblem set Due //. (4..8) Determine the mean variance of the mean X of a rom sample of size 9 from a distribution having pdf f(x) = 4x, < x

More information

Exponential Distribution and Poisson Process

Exponential Distribution and Poisson Process Exponential Distribution and Poisson Process Stochastic Processes - Lecture Notes Fatih Cavdur to accompany Introduction to Probability Models by Sheldon M. Ross Fall 215 Outline Introduction Exponential

More information

2 Random Variable Generation

2 Random Variable Generation 2 Random Variable Generation Most Monte Carlo computations require, as a starting point, a sequence of i.i.d. random variables with given marginal distribution. We describe here some of the basic methods

More information

18.440: Lecture 28 Lectures Review

18.440: Lecture 28 Lectures Review 18.440: Lecture 28 Lectures 18-27 Review Scott Sheffield MIT Outline Outline It s the coins, stupid Much of what we have done in this course can be motivated by the i.i.d. sequence X i where each X i is

More information

MAS113 Introduction to Probability and Statistics. Proofs of theorems

MAS113 Introduction to Probability and Statistics. Proofs of theorems MAS113 Introduction to Probability and Statistics Proofs of theorems Theorem 1 De Morgan s Laws) See MAS110 Theorem 2 M1 By definition, B and A \ B are disjoint, and their union is A So, because m is a

More information

2 Functions of random variables

2 Functions of random variables 2 Functions of random variables A basic statistical model for sample data is a collection of random variables X 1,..., X n. The data are summarised in terms of certain sample statistics, calculated as

More information

1 Random Variable: Topics

1 Random Variable: Topics Note: Handouts DO NOT replace the book. In most cases, they only provide a guideline on topics and an intuitive feel. 1 Random Variable: Topics Chap 2, 2.1-2.4 and Chap 3, 3.1-3.3 What is a random variable?

More information

Continuous Random Variables

Continuous Random Variables MATH 38 Continuous Random Variables Dr. Neal, WKU Throughout, let Ω be a sample space with a defined probability measure P. Definition. A continuous random variable is a real-valued function X defined

More information

MAS223 Statistical Inference and Modelling Exercises

MAS223 Statistical Inference and Modelling Exercises MAS223 Statistical Inference and Modelling Exercises The exercises are grouped into sections, corresponding to chapters of the lecture notes Within each section exercises are divided into warm-up questions,

More information

Topic 4: Continuous random variables

Topic 4: Continuous random variables Topic 4: Continuous random variables Course 003, 2018 Page 0 Continuous random variables Definition (Continuous random variable): An r.v. X has a continuous distribution if there exists a non-negative

More information

Chapter 5 continued. Chapter 5 sections

Chapter 5 continued. Chapter 5 sections Chapter 5 sections Discrete univariate distributions: 5.2 Bernoulli and Binomial distributions Just skim 5.3 Hypergeometric distributions 5.4 Poisson distributions Just skim 5.5 Negative Binomial distributions

More information

Midterm Examination. STA 205: Probability and Measure Theory. Thursday, 2010 Oct 21, 11:40-12:55 pm

Midterm Examination. STA 205: Probability and Measure Theory. Thursday, 2010 Oct 21, 11:40-12:55 pm Midterm Examination STA 205: Probability and Measure Theory Thursday, 2010 Oct 21, 11:40-12:55 pm This is a closed-book examination. You may use a single sheet of prepared notes, if you wish, but you may

More information

ECE353: Probability and Random Processes. Lecture 7 -Continuous Random Variable

ECE353: Probability and Random Processes. Lecture 7 -Continuous Random Variable ECE353: Probability and Random Processes Lecture 7 -Continuous Random Variable Xiao Fu School of Electrical Engineering and Computer Science Oregon State University E-mail: xiao.fu@oregonstate.edu Continuous

More information

Math 576: Quantitative Risk Management

Math 576: Quantitative Risk Management Math 576: Quantitative Risk Management Haijun Li lih@math.wsu.edu Department of Mathematics Washington State University Week 11 Haijun Li Math 576: Quantitative Risk Management Week 11 1 / 21 Outline 1

More information

Final Examination a. STA 532: Statistical Inference. Wednesday, 2015 Apr 29, 7:00 10:00pm. Thisisaclosed bookexam books&phonesonthefloor.

Final Examination a. STA 532: Statistical Inference. Wednesday, 2015 Apr 29, 7:00 10:00pm. Thisisaclosed bookexam books&phonesonthefloor. Final Examination a STA 532: Statistical Inference Wednesday, 2015 Apr 29, 7:00 10:00pm Thisisaclosed bookexam books&phonesonthefloor Youmayuseacalculatorandtwo pagesofyourownnotes Do not share calculators

More information

Review 1: STAT Mark Carpenter, Ph.D. Professor of Statistics Department of Mathematics and Statistics. August 25, 2015

Review 1: STAT Mark Carpenter, Ph.D. Professor of Statistics Department of Mathematics and Statistics. August 25, 2015 Review : STAT 36 Mark Carpenter, Ph.D. Professor of Statistics Department of Mathematics and Statistics August 25, 25 Support of a Random Variable The support of a random variable, which is usually denoted

More information

MAS113 Introduction to Probability and Statistics. Proofs of theorems

MAS113 Introduction to Probability and Statistics. Proofs of theorems MAS113 Introduction to Probability and Statistics Proofs of theorems Theorem 1 De Morgan s Laws) See MAS110 Theorem 2 M1 By definition, B and A \ B are disjoint, and their union is A So, because m is a

More information

Lecture 2: Repetition of probability theory and statistics

Lecture 2: Repetition of probability theory and statistics Algorithms for Uncertainty Quantification SS8, IN2345 Tobias Neckel Scientific Computing in Computer Science TUM Lecture 2: Repetition of probability theory and statistics Concept of Building Block: Prerequisites:

More information

Renewal theory and its applications

Renewal theory and its applications Renewal theory and its applications Stella Kapodistria and Jacques Resing September 11th, 212 ISP Definition of a Renewal process Renewal theory and its applications If we substitute the Exponentially

More information

1 Review of Probability

1 Review of Probability 1 Review of Probability Random variables are denoted by X, Y, Z, etc. The cumulative distribution function (c.d.f.) of a random variable X is denoted by F (x) = P (X x), < x

More information

18.440: Lecture 28 Lectures Review

18.440: Lecture 28 Lectures Review 18.440: Lecture 28 Lectures 17-27 Review Scott Sheffield MIT 1 Outline Continuous random variables Problems motivated by coin tossing Random variable properties 2 Outline Continuous random variables Problems

More information

Sampling Distributions

Sampling Distributions In statistics, a random sample is a collection of independent and identically distributed (iid) random variables, and a sampling distribution is the distribution of a function of random sample. For example,

More information

1.1 Review of Probability Theory

1.1 Review of Probability Theory 1.1 Review of Probability Theory Angela Peace Biomathemtics II MATH 5355 Spring 2017 Lecture notes follow: Allen, Linda JS. An introduction to stochastic processes with applications to biology. CRC Press,

More information

Nonlife Actuarial Models. Chapter 4 Risk Measures

Nonlife Actuarial Models. Chapter 4 Risk Measures Nonlife Actuarial Models Chapter 4 Risk Measures Learning Objectives 1. Risk measures based on premium principles 2. Risk measures based on capital requirements 3. Value-at-Risk and conditional tail expectation

More information

Mathematical Statistics

Mathematical Statistics Mathematical Statistics Chapter Three. Point Estimation 3.4 Uniformly Minimum Variance Unbiased Estimator(UMVUE) Criteria for Best Estimators MSE Criterion Let F = {p(x; θ) : θ Θ} be a parametric distribution

More information

STAT Chapter 5 Continuous Distributions

STAT Chapter 5 Continuous Distributions STAT 270 - Chapter 5 Continuous Distributions June 27, 2012 Shirin Golchi () STAT270 June 27, 2012 1 / 59 Continuous rv s Definition: X is a continuous rv if it takes values in an interval, i.e., range

More information

Discrete Distributions Chapter 6

Discrete Distributions Chapter 6 Discrete Distributions Chapter 6 Negative Binomial Distribution section 6.3 Consider k r, r +,... independent Bernoulli trials with probability of success in one trial being p. Let the random variable

More information

Pattern Recognition and Machine Learning. Bishop Chapter 2: Probability Distributions

Pattern Recognition and Machine Learning. Bishop Chapter 2: Probability Distributions Pattern Recognition and Machine Learning Chapter 2: Probability Distributions Cécile Amblard Alex Kläser Jakob Verbeek October 11, 27 Probability Distributions: General Density Estimation: given a finite

More information

Lecture 1: August 28

Lecture 1: August 28 36-705: Intermediate Statistics Fall 2017 Lecturer: Siva Balakrishnan Lecture 1: August 28 Our broad goal for the first few lectures is to try to understand the behaviour of sums of independent random

More information

Chapter 2 Continuous Distributions

Chapter 2 Continuous Distributions Chapter Continuous Distributions Continuous random variables For a continuous random variable X the probability distribution is described by the probability density function f(x), which has the following

More information

Analysis of Engineering and Scientific Data. Semester

Analysis of Engineering and Scientific Data. Semester Analysis of Engineering and Scientific Data Semester 1 2019 Sabrina Streipert s.streipert@uq.edu.au Example: Draw a random number from the interval of real numbers [1, 3]. Let X represent the number. Each

More information

SDS 321: Introduction to Probability and Statistics

SDS 321: Introduction to Probability and Statistics SDS 321: Introduction to Probability and Statistics Lecture 14: Continuous random variables Purnamrita Sarkar Department of Statistics and Data Science The University of Texas at Austin www.cs.cmu.edu/

More information

Non-parametric Inference and Resampling

Non-parametric Inference and Resampling Non-parametric Inference and Resampling Exercises by David Wozabal (Last update. Juni 010) 1 Basic Facts about Rank and Order Statistics 1.1 10 students were asked about the amount of time they spend surfing

More information

Final Solutions Fri, June 8

Final Solutions Fri, June 8 EE178: Probabilistic Systems Analysis, Spring 2018 Final Solutions Fri, June 8 1. Small problems (62 points) (a) (8 points) Let X 1, X 2,..., X n be independent random variables uniformly distributed on

More information

p. 6-1 Continuous Random Variables p. 6-2

p. 6-1 Continuous Random Variables p. 6-2 Continuous Random Variables Recall: For discrete random variables, only a finite or countably infinite number of possible values with positive probability (>). Often, there is interest in random variables

More information

9 Bayesian inference. 9.1 Subjective probability

9 Bayesian inference. 9.1 Subjective probability 9 Bayesian inference 1702-1761 9.1 Subjective probability This is probability regarded as degree of belief. A subjective probability of an event A is assessed as p if you are prepared to stake pm to win

More information

STA 2201/442 Assignment 2

STA 2201/442 Assignment 2 STA 2201/442 Assignment 2 1. This is about how to simulate from a continuous univariate distribution. Let the random variable X have a continuous distribution with density f X (x) and cumulative distribution

More information

ST5215: Advanced Statistical Theory

ST5215: Advanced Statistical Theory Department of Statistics & Applied Probability Wednesday, October 19, 2011 Lecture 17: UMVUE and the first method of derivation Estimable parameters Let ϑ be a parameter in the family P. If there exists

More information

3 Modeling Process Quality

3 Modeling Process Quality 3 Modeling Process Quality 3.1 Introduction Section 3.1 contains basic numerical and graphical methods. familiar with these methods. It is assumed the student is Goal: Review several discrete and continuous

More information

PROBABILITY DISTRIBUTIONS

PROBABILITY DISTRIBUTIONS Review of PROBABILITY DISTRIBUTIONS Hideaki Shimazaki, Ph.D. http://goo.gl/visng Poisson process 1 Probability distribution Probability that a (continuous) random variable X is in (x,x+dx). ( ) P x < X

More information

1. Let A be a 2 2 nonzero real matrix. Which of the following is true?

1. Let A be a 2 2 nonzero real matrix. Which of the following is true? 1. Let A be a 2 2 nonzero real matrix. Which of the following is true? (A) A has a nonzero eigenvalue. (B) A 2 has at least one positive entry. (C) trace (A 2 ) is positive. (D) All entries of A 2 cannot

More information

1.6 Families of Distributions

1.6 Families of Distributions Your text 1.6. FAMILIES OF DISTRIBUTIONS 15 F(x) 0.20 1.0 0.15 0.8 0.6 Density 0.10 cdf 0.4 0.05 0.2 0.00 a b c 0.0 x Figure 1.1: N(4.5, 2) Distribution Function and Cumulative Distribution Function for

More information