SOLUTIONS TO MATH68181 EXTREME VALUES AND FINANCIAL RISK EXAM


Solutions to Question A1

a) The marginal cdfs of
$$F_{X,Y}(x, y) = \left[1 + \exp(-x) + \exp(-y) + (1-\alpha)\exp(-x-y)\right]^{-1}$$
are $F_X(x) = F_{X,Y}(x, \infty) = [1 + \exp(-x)]^{-1}$ and $F_Y(y) = F_{X,Y}(\infty, y) = [1 + \exp(-y)]^{-1}$.

b) $F_X$ belongs to the Gumbel max domain of attraction since
$$\lim_{t \to w(F_X)} \frac{1 - F_X(t + x)}{1 - F_X(t)} = \lim_{t \to \infty} \frac{1 - \left[1 + e^{-t-x}\right]^{-1}}{1 - \left[1 + e^{-t}\right]^{-1}} = \lim_{t \to \infty} \frac{1 - \left[1 - e^{-t-x}\right]}{1 - \left[1 - e^{-t}\right]} = e^{-x}.$$

c) $F_Y$ belongs to the Gumbel max domain of attraction since
$$\lim_{t \to w(F_Y)} \frac{1 - F_Y(t + y)}{1 - F_Y(t)} = \lim_{t \to \infty} \frac{1 - \left[1 + e^{-t-y}\right]^{-1}}{1 - \left[1 + e^{-t}\right]^{-1}} = \lim_{t \to \infty} \frac{1 - \left[1 - e^{-t-y}\right]}{1 - \left[1 - e^{-t}\right]} = e^{-y}.$$

SEEN

d) Use the rule $a_n = \gamma\left(F_X^{-1}(1 - n^{-1})\right)$ and $b_n = F_X^{-1}(1 - n^{-1})$. From part (b), $\gamma(t) = 1$. Inverting
$$F_X(x) = [1 + \exp(-x)]^{-1} = 1 - n^{-1},$$
we obtain
$$F_X^{-1}\left(1 - n^{-1}\right) = \log(n - 1),$$
so $a_n = 1$ and $b_n = \log(n - 1)$.

e) Use the rule $c_n = \gamma\left(F_Y^{-1}(1 - n^{-1})\right)$ and $d_n = F_Y^{-1}(1 - n^{-1})$. From part (c), $\gamma(t) = 1$. Inverting
$$F_Y(y) = [1 + \exp(-y)]^{-1} = 1 - n^{-1},$$
we obtain
$$F_Y^{-1}\left(1 - n^{-1}\right) = \log(n - 1),$$
so $c_n = 1$ and $d_n = \log(n - 1)$.

g) The limiting cdf is
$$\lim_{n \to \infty} F_{X,Y}^n(a_n x + b_n, c_n y + d_n) = \lim_{n \to \infty}\left[1 + \exp\left(-x - \log(n-1)\right) + \exp\left(-y - \log(n-1)\right) + (1-\alpha)\exp\left(-x - y - 2\log(n-1)\right)\right]^{-n}$$
$$= \lim_{n \to \infty}\left[1 + \frac{\exp(-x)}{n-1} + \frac{\exp(-y)}{n-1} + (1-\alpha)\frac{\exp(-x-y)}{(n-1)^2}\right]^{-n} = \exp\left\{-\exp(-x) - \exp(-y)\right\}.$$

h) Yes, the extremes are completely independent since
$$\lim_{n \to \infty} F_{X,Y}^n(a_n x + b_n, c_n y + d_n) = \exp\left\{-\exp(-x)\right\}\exp\left\{-\exp(-y)\right\} = \lim_{n \to \infty} F_X^n(a_n x + b_n) \cdot \lim_{n \to \infty} F_Y^n(c_n y + d_n).$$

(5 marks)
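As an informal numerical check of parts (d) and (g), not part of the marking scheme, the sketch below raises the logistic marginal to the $n$-th power at the normalised point $x + \log(n-1)$ and compares it with the Gumbel limit $\exp\{-e^{-x}\}$; the helper names are ours.

```python
import numpy as np

def F_X(x):
    """Logistic marginal cdf F_X(x) = 1 / (1 + exp(-x))."""
    return 1.0 / (1.0 + np.exp(-x))

def gumbel(x):
    """Gumbel limit exp(-exp(-x))."""
    return np.exp(-np.exp(-x))

x = np.linspace(-2, 4, 7)
for n in [10, 100, 10_000, 1_000_000]:
    b_n = np.log(n - 1)          # norming constants from part (d): a_n = 1, b_n = log(n-1)
    approx = F_X(x + b_n) ** n   # cdf of the normalised maximum
    err = np.max(np.abs(approx - gumbel(x)))
    print(f"n = {n:>9}: max |F_X^n(x + b_n) - Gumbel(x)| = {err:.2e}")
```

The printed error should shrink as $n$ grows, consistent with the limit in part (g).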

Solutions to Question A2

$C(u_1, u_2)$ is a valid copula if $C(u, 0) = 0$, $C(0, u) = 0$, $C(1, u) = u$, $C(u, 1) = u$, $\frac{\partial}{\partial u_1} C(u_1, u_2) \ge 0$ and $\frac{\partial}{\partial u_2} C(u_1, u_2) \ge 0$.

SEEN

a) For the copula defined by $C(u_1, u_2) = \left[\alpha\left(\min(u_1, u_2)\right)^m + (1-\alpha)u_1^m u_2^m\right]^{1/m}$, we have
$$C(u, 0) = [\alpha \cdot 0 + (1-\alpha)\cdot 0]^{1/m} = 0, \qquad C(0, u) = [\alpha \cdot 0 + (1-\alpha)\cdot 0]^{1/m} = 0,$$
$$C(1, u) = [\alpha u^m + (1-\alpha)u^m]^{1/m} = u, \qquad C(u, 1) = [\alpha u^m + (1-\alpha)u^m]^{1/m} = u,$$
$$\frac{\partial}{\partial u_1} C(u_1, u_2) = \begin{cases} \left[\alpha + (1-\alpha)u_2^m\right]^{1/m} \ge 0, & \text{if } u_1 \le u_2, \\ (1-\alpha)u_1^{m-1} u_2 \left[\alpha + (1-\alpha)u_1^m\right]^{1/m - 1} \ge 0, & \text{if } u_2 \le u_1, \end{cases}$$
and
$$\frac{\partial}{\partial u_2} C(u_1, u_2) = \begin{cases} (1-\alpha)u_1 u_2^{m-1} \left[\alpha + (1-\alpha)u_2^m\right]^{1/m - 1} \ge 0, & \text{if } u_1 \le u_2, \\ \left[\alpha + (1-\alpha)u_1^m\right]^{1/m} \ge 0, & \text{if } u_2 \le u_1. \end{cases}$$

b) For the copula defined by $C(u_1, u_2) = \max(u_1 + u_2 - 1, 0)$, we have
$$C(u, 0) = \max(u + 0 - 1, 0) = 0, \qquad C(0, u) = \max(0 + u - 1, 0) = 0,$$
$$C(1, u) = \max(1 + u - 1, 0) = u, \qquad C(u, 1) = \max(u + 1 - 1, 0) = u,$$
and
$$\frac{\partial}{\partial u_1} C(u_1, u_2) = \begin{cases} 1, & \text{if } u_1 + u_2 \ge 1, \\ 0, & \text{if } u_1 + u_2 < 1, \end{cases} \qquad \frac{\partial}{\partial u_2} C(u_1, u_2) = \begin{cases} 1, & \text{if } u_1 + u_2 \ge 1, \\ 0, & \text{if } u_1 + u_2 < 1. \end{cases}$$

c) For the copula defined by $C(u_1, u_2) = \min\left(u_1^a, u_2^b\right)\min\left(u_1^{1-a}, u_2^{1-b}\right)$, we have
$$C(u, 0) = \min(u^a, 0)\min\left(u^{1-a}, 0\right) = 0, \qquad C(0, u) = \min\left(0, u^b\right)\min\left(0, u^{1-b}\right) = 0,$$
$$C(1, u) = \min\left(1, u^b\right)\min\left(1, u^{1-b}\right) = u, \qquad C(u, 1) = \min(u^a, 1)\min\left(u^{1-a}, 1\right) = u,$$
and
$$\frac{\partial}{\partial u_1} C(u_1, u_2) = \begin{cases} 1, & \text{if } u_1^a \le u_2^b \text{ and } u_1^{1-a} \le u_2^{1-b}, \\ a u_1^{a-1} u_2^{1-b}, & \text{if } u_1^a \le u_2^b \text{ and } u_1^{1-a} > u_2^{1-b}, \\ (1-a) u_1^{-a} u_2^{b}, & \text{if } u_1^a > u_2^b \text{ and } u_1^{1-a} \le u_2^{1-b}, \\ 0, & \text{if } u_1^a > u_2^b \text{ and } u_1^{1-a} > u_2^{1-b}, \end{cases}$$
$$\frac{\partial}{\partial u_2} C(u_1, u_2) = \begin{cases} 0, & \text{if } u_1^a \le u_2^b \text{ and } u_1^{1-a} \le u_2^{1-b}, \\ (1-b) u_1^{a} u_2^{-b}, & \text{if } u_1^a \le u_2^b \text{ and } u_1^{1-a} > u_2^{1-b}, \\ b u_1^{1-a} u_2^{b-1}, & \text{if } u_1^a > u_2^b \text{ and } u_1^{1-a} \le u_2^{1-b}, \\ 1, & \text{if } u_1^a > u_2^b \text{ and } u_1^{1-a} > u_2^{1-b}, \end{cases}$$
all of which are non-negative.

d) For the copula defined by $C(u_1, u_2) = \exp\left\{-\left[(-\log u_1)^\theta + (-\log u_2)^\theta\right]^{1/\theta}\right\}$, we have
$$C(u, 0) = \exp\left\{-\left[(-\log u)^\theta + \infty\right]^{1/\theta}\right\} = 0, \qquad C(0, u) = \exp\left\{-\left[\infty + (-\log u)^\theta\right]^{1/\theta}\right\} = 0,$$
$$C(1, u) = \exp\left\{-\left[0 + (-\log u)^\theta\right]^{1/\theta}\right\} = u, \qquad C(u, 1) = \exp\left\{-\left[(-\log u)^\theta + 0\right]^{1/\theta}\right\} = u,$$
$$\frac{\partial}{\partial u_1} C(u_1, u_2) = u_1^{-1}(-\log u_1)^{\theta - 1}\left[(-\log u_1)^\theta + (-\log u_2)^\theta\right]^{1/\theta - 1}\exp\left\{-\left[(-\log u_1)^\theta + (-\log u_2)^\theta\right]^{1/\theta}\right\} \ge 0$$
and
$$\frac{\partial}{\partial u_2} C(u_1, u_2) = u_2^{-1}(-\log u_2)^{\theta - 1}\left[(-\log u_1)^\theta + (-\log u_2)^\theta\right]^{1/\theta - 1}\exp\left\{-\left[(-\log u_1)^\theta + (-\log u_2)^\theta\right]^{1/\theta}\right\} \ge 0.$$
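A small numerical check of part (d), not required in the exam: the sketch below evaluates the Gumbel-Hougaard copula on a grid, confirms the boundary conditions, and verifies that finite-difference approximations of both partial derivatives are non-negative. The function names and the choice $\theta = 2.5$ are ours.

```python
import numpy as np

def C(u1, u2, theta=2.5):
    """Gumbel-Hougaard copula from part (d)."""
    u1, u2 = np.asarray(u1, float), np.asarray(u2, float)
    out = np.zeros(np.broadcast(u1, u2).shape)
    pos = (u1 > 0) & (u2 > 0)            # C is 0 as soon as one argument is 0
    s = ((-np.log(u1[pos]))**theta + (-np.log(u2[pos]))**theta)**(1 / theta)
    out[pos] = np.exp(-s)
    return out

u = np.linspace(0.01, 0.99, 50)
assert np.allclose(C(np.ones_like(u), u), u)      # C(1, u) = u
assert np.allclose(C(u, np.ones_like(u)), u)      # C(u, 1) = u
assert np.all(C(u, np.zeros_like(u)) == 0)        # C(u, 0) = 0

# finite-difference partial derivatives on an interior grid
h = 1e-6
U1, U2 = np.meshgrid(u, u)
d1 = (C(U1 + h, U2) - C(U1, U2)) / h
d2 = (C(U1, U2 + h) - C(U1, U2)) / h
print("min dC/du1:", d1.min(), " min dC/du2:", d2.min())   # both should be >= 0
```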

Solutions to Question A3

a) We can write
$$F(x, y) = \exp\left\{-(x + y)\left[1 - (\theta + \phi)\frac{y}{x + y} + \frac{\theta y^2}{(x + y)^2} + \frac{\phi y^3}{(x + y)^3}\right]\right\}$$
for $x > 0$, $y > 0$, $\theta \ge 0$, $\phi \ge 0$, $\theta + 3\phi \ge 0$, $\theta + \phi \le 1$ and $\theta + 2\phi \le 1$. This is in the form of
$$F(x, y) = \exp\left[-(x + y)A\left(\frac{y}{x + y}\right)\right]$$
with $A(w) = 1 - (\theta + \phi)w + \theta w^2 + \phi w^3$. We now check the conditions for $A(\cdot)$.

Clearly, $A(0) = 1$ and $A(1) = 1$. Also $A(w) \ge 0$ since $\theta + \phi \le 1$ implies $1 - (\theta + \phi)w \ge 1 - w \ge 0$ for all $w \in [0, 1]$. Also $A(w) \le 1$ since
$$A(w) \le 1 \iff 1 - (\theta + \phi)w + \theta w^2 + \phi w^3 \le 1 \iff \theta\left(w^2 - w\right) + \phi\left(w^3 - w\right) \le 0 \iff \theta w(w - 1) + \phi w\left(w^2 - 1\right) \le 0 \iff (\theta + \phi + \phi w)\, w (w - 1) \le 0,$$
which holds since $w(w - 1) \le 0$ for $w \in [0, 1]$.

Note that $\max(w, 1 - w) = w$ if $w \in [1/2, 1]$. So, for $w \in [1/2, 1]$,
$$A(w) \ge \max(w, 1 - w) \iff A(w) \ge w \iff 1 - (\theta + \phi + 1)w + \theta w^2 + \phi w^3 \ge 0.$$
Let $g(w) = 1 - (\theta + \phi + 1)w + \theta w^2 + \phi w^3$. Note that $g'(w) = 2\theta w + 3\phi w^2 - \theta - \phi - 1 \le 0$ for all $w \in [1/2, 1]$ (since $g'$ is nondecreasing and $g'(1) = \theta + 2\phi - 1 \le 0$). But $g(1) = 0$, so $g(w) \ge 0$ and hence $A(w) \ge \max(w, 1 - w)$ for all $w \in [1/2, 1]$.

Note that $\max(w, 1 - w) = 1 - w$ if $w \in [0, 1/2]$. So, for $w \in [0, 1/2]$,
$$A(w) \ge \max(w, 1 - w) \iff A(w) \ge 1 - w \iff (1 - \theta - \phi)w + \theta w^2 + \phi w^3 \ge 0,$$
which holds since $\theta + \phi \le 1$.

$A(\cdot)$ is convex since
$$A'(w) = 2\theta w + 3\phi w^2 - \theta - \phi$$
and
$$A''(w) = 2\theta + 6\phi w \ge 0 \quad \text{for all } w \in [0, 1], \text{ since } \theta \ge 0 \text{ and } \theta + 3\phi \ge 0.$$

(6 marks)

b) The joint cdf is
$$F(x, y) = 1 - \exp(-x) - \exp(-y) + \exp\left\{-(x + y)\left[1 - (\theta + \phi)\frac{y}{x + y} + \frac{\theta y^2}{(x + y)^2} + \frac{\phi y^3}{(x + y)^3}\right]\right\}.$$

c) The derivative of the joint cdf with respect to $x$ is
$$\frac{\partial F(x, y)}{\partial x} = \exp(-x) + F(x, y)\left[\frac{\theta y^2}{(x + y)^2} + \frac{2\phi y^3}{(x + y)^3} - 1\right]$$
(here and in parts (d) and (e), $F(x, y)$ on the right-hand side denotes the exponential term from part (a)), so the conditional cdf of $Y$ given $X = x$ is
$$F(y \mid x) = 1 + \exp(x)\, F(x, y)\left[\frac{\theta y^2}{(x + y)^2} + \frac{2\phi y^3}{(x + y)^3} - 1\right].$$

d) The derivative of the joint cdf with respect to $y$ is
$$\frac{\partial F(x, y)}{\partial y} = \exp(-y) + F(x, y)\left[\frac{2\phi y^3}{(x + y)^3} + \frac{(\theta - 3\phi)y^2}{(x + y)^2} - \frac{2\theta y}{x + y} + \theta + \phi - 1\right],$$
so the conditional cdf of $X$ given $Y = y$ is
$$F(x \mid y) = 1 + \exp(y)\, F(x, y)\left[\frac{2\phi y^3}{(x + y)^3} + \frac{(\theta - 3\phi)y^2}{(x + y)^2} - \frac{2\theta y}{x + y} + \theta + \phi - 1\right].$$

e) The derivative of the joint cdf with respect to $x$ and $y$ is
$$f(x, y) = \frac{\partial^2 F(x, y)}{\partial x\, \partial y} = F(x, y)\left[\frac{\theta y^2}{(x + y)^2} + \frac{2\phi y^3}{(x + y)^3} - 1\right]\left[\frac{2\phi y^3}{(x + y)^3} + \frac{(\theta - 3\phi)y^2}{(x + y)^2} - \frac{2\theta y}{x + y} + \theta + \phi - 1\right]$$
$$\qquad + F(x, y)\left[\frac{2\theta y}{(x + y)^2} + \frac{2(3\phi - \theta)y^2}{(x + y)^3} - \frac{6\phi y^3}{(x + y)^4}\right].$$
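As a sanity check on parts (b) and (e), not part of the marking scheme, the following sketch compares the stated density with a central finite-difference approximation of $\partial^2 F / \partial x\, \partial y$ at one point. Here $S$ denotes the exponential term from part (a), and the parameter values are arbitrary choices satisfying the stated constraints.

```python
import numpy as np

theta, phi = 0.4, 0.2          # example values satisfying the constraints in part (a)

def S(x, y):
    """Exponential term from part (a)."""
    w = y / (x + y)
    return np.exp(-(x + y) * (1 - (theta + phi) * w + theta * w**2 + phi * w**3))

def F(x, y):
    """Joint cdf from part (b)."""
    return 1 - np.exp(-x) - np.exp(-y) + S(x, y)

def f_formula(x, y):
    """Joint pdf from part (e)."""
    s = x + y
    bx = theta * y**2 / s**2 + 2 * phi * y**3 / s**3 - 1
    by = 2 * phi * y**3 / s**3 + (theta - 3 * phi) * y**2 / s**2 - 2 * theta * y / s + theta + phi - 1
    extra = 2 * theta * y / s**2 + 2 * (3 * phi - theta) * y**2 / s**3 - 6 * phi * y**3 / s**4
    return S(x, y) * bx * by + S(x, y) * extra

# central finite-difference approximation of the mixed partial derivative of F
x, y, h = 1.3, 0.7, 1e-4
num = (F(x + h, y + h) - F(x + h, y - h) - F(x - h, y + h) + F(x - h, y - h)) / (4 * h**2)
print(num, f_formula(x, y))    # the two numbers should be very close
```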

Solutions to Question B1

a) The mgf of $X_i$ is
$$M_{X_i}(t) = \int_0^\infty \lambda \exp\left[-(\lambda - t)x\right] dx = \left[-\frac{\lambda \exp\left[-(\lambda - t)x\right]}{\lambda - t}\right]_0^\infty = \frac{\lambda}{\lambda - t}$$
for $t < \lambda$.

SEEN

b) The mgf of $T$ conditional on $N = n$ is
$$M_{T \mid N = n}(t) = E\left\{\exp\left[t\left(X_1 + \cdots + X_n\right)\right]\right\} = E\left\{\exp(t X_1)\cdots\exp(t X_n)\right\} = E\left\{\exp(t X_1)\right\}\cdots E\left\{\exp(t X_n)\right\} = M_{X_1}(t)\cdots M_{X_n}(t) = \left(\frac{\lambda}{\lambda - t}\right)^n.$$

c) $\left(\frac{\lambda}{\lambda - t}\right)^n$ is the mgf of a gamma random variable with parameters $\lambda$ and $n$. So, the conditional distribution of $T$ given $N = n$ is gamma with parameters $\lambda$ and $n$.

d) The conditional pdf of $T$ is
$$f_{T \mid N = n}(x) = \frac{\lambda^n x^{n-1}\exp(-\lambda x)}{\Gamma(n)}.$$

So, the unconditional pdf of $T$ is
$$f_T(x) = \sum_{n=1}^\infty f_{T \mid N = n}(x)\,\theta(1 - \theta)^{n-1} = \sum_{n=1}^\infty \frac{\lambda^n x^{n-1}\exp(-\lambda x)}{\Gamma(n)}\,\theta(1 - \theta)^{n-1} = \theta\lambda\exp(-\lambda x)\sum_{n=1}^\infty \frac{\left[\lambda(1 - \theta)x\right]^{n-1}}{(n - 1)!} = \theta\lambda\exp(-\lambda x)\exp\left[\lambda x(1 - \theta)\right] = \theta\lambda\exp(-\theta\lambda x),$$
an exponential pdf with parameter $\theta\lambda$.

e) The mean is $1/(\theta\lambda)$ and the variance is $1/(\theta\lambda)^2$.

f) The unconditional cdf of $T$ is $F_T(x) = 1 - \exp(-\theta\lambda x)$. Inverting $1 - \exp(-\theta\lambda x) = p$, we obtain
$$\mathrm{VaR}_p(T) = -\frac{1}{\theta\lambda}\log(1 - p).$$
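A quick simulation check of part (d), ours rather than part of the solution: draw $N$ from the geometric distribution with success probability $\theta$, sum $N$ independent $\mathrm{Exp}(\lambda)$ variables, and compare the sample mean and variance of $T$ with $1/(\theta\lambda)$ and $1/(\theta\lambda)^2$.

```python
import numpy as np

rng = np.random.default_rng(0)
theta, lam, reps = 0.3, 2.0, 200_000

# N ~ Geometric(theta) on {1, 2, ...}; given N = n, T is a sum of n iid Exp(lam) variables,
# which is the same as one Gamma(n, scale = 1/lam) draw
N = rng.geometric(theta, size=reps)
T = rng.gamma(shape=N, scale=1.0 / lam)

print("sample mean", T.mean(), " theory", 1 / (theta * lam))
print("sample var ", T.var(),  " theory", 1 / (theta * lam) ** 2)
```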

g) The expected shortfall of $T$ is
$$\mathrm{ES}_p(T) = \frac{1}{p}\int_0^p \mathrm{VaR}_v(T)\, dv = -\frac{1}{\theta\lambda p}\int_0^p \log(1 - t)\, dt = -\frac{1}{\theta\lambda p}\left\{\left[t\log(1 - t)\right]_0^p + \int_0^p \frac{t}{1 - t}\, dt\right\}$$
$$= -\frac{1}{\theta\lambda p}\left\{p\log(1 - p) + \int_0^p \frac{t}{1 - t}\, dt\right\} = -\frac{1}{\theta\lambda p}\left\{p\log(1 - p) - p + \int_0^p \frac{1}{1 - t}\, dt\right\}$$
$$= -\frac{1}{\theta\lambda p}\left\{p\log(1 - p) - p + \left[-\log(1 - t)\right]_0^p\right\} = -\frac{1}{\theta\lambda p}\left\{p\log(1 - p) - p - \log(1 - p)\right\}.$$
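The closed form for the expected shortfall can be checked numerically; the sketch below (ours) averages the quantile function over $(0, p)$ by a midpoint rule and compares it with the expression just derived.

```python
import numpy as np

theta, lam, p = 0.3, 2.0, 0.05

# average VaR_v(T) = -log(1 - v)/(theta*lam) over v in (0, p) by midpoint integration
v = (np.arange(100_000) + 0.5) / 100_000 * p
es_numeric = np.mean(-np.log(1 - v) / (theta * lam))

es_formula = -(p * np.log(1 - p) - p - np.log(1 - p)) / (theta * lam * p)
print(es_numeric, es_formula)   # should agree to several decimal places
```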

Solutions to Question B2

If there are norming constants $a_n > 0$, $b_n$ and a nondegenerate $G$ such that the cdf of a normalized version of $M_n$ converges to $G$, i.e.
$$\Pr\left(\frac{M_n - b_n}{a_n} \le x\right) = F^n(a_n x + b_n) \to G(x) \qquad (1)$$
as $n \to \infty$, then $G$ must be of the same type (cdfs $G$ and $G_*$ are of the same type if $G_*(x) = G(a x + b)$ for some $a > 0$, $b$ and all $x$) as one of the following three classes:
$$\text{I}: \quad \Lambda(x) = \exp\left\{-\exp(-x)\right\}, \quad x \in \mathbb{R};$$
$$\text{II}: \quad \Phi_\alpha(x) = \begin{cases} 0, & \text{if } x < 0, \\ \exp\left\{-x^{-\alpha}\right\}, & \text{if } x \ge 0 \end{cases} \quad \text{for some } \alpha > 0;$$
$$\text{III}: \quad \Psi_\alpha(x) = \begin{cases} \exp\left\{-(-x)^\alpha\right\}, & \text{if } x < 0, \\ 1, & \text{if } x \ge 0 \end{cases} \quad \text{for some } \alpha > 0.$$

SEEN (6 marks)

The necessary and sufficient conditions for the three extreme value distributions are:
$$\text{I}: \quad \exists\, \gamma(t) > 0 \text{ such that } \lim_{t \to w(F)} \frac{1 - F(t + x\gamma(t))}{1 - F(t)} = \exp(-x), \quad x \in \mathbb{R},$$
$$\text{II}: \quad w(F) = \infty \text{ and } \lim_{t \to \infty} \frac{1 - F(tx)}{1 - F(t)} = x^{-\alpha}, \quad x > 0,$$
$$\text{III}: \quad w(F) < \infty \text{ and } \lim_{t \downarrow 0} \frac{1 - F(w(F) - tx)}{1 - F(w(F) - t)} = x^{\alpha}, \quad x > 0.$$

SEEN (6 marks)

First, suppose that $G$ belongs to the max domain of attraction of the Gumbel extreme value distribution. Then, there must exist a strictly positive function, say $h(t)$, such that
$$\lim_{t \to w(G)} \frac{1 - G(t + x h(t))}{1 - G(t)} = e^{-x}$$
for every $x \in (-\infty, \infty)$. But
$$\lim_{t \to w(F)} \frac{1 - F(t + x h(t))}{1 - F(t)} = \lim_{t \to w(F)} \left\{\frac{1 - \left\{1 - \left[1 - G(t + x h(t))\right]^a\right\}^b}{1 - \left\{1 - \left[1 - G(t)\right]^a\right\}^b}\right\}^{\theta} = \lim_{t \to w(F)} \left[\frac{1 - G(t + x h(t))}{1 - G(t)}\right]^{a\theta} = e^{-a\theta x}$$
for every $x \in (-\infty, \infty)$, assuming $w(F) = w(G)$. So, it follows that $F$ also belongs to the max domain of attraction of the Gumbel extreme value distribution with
$$\Pr\left(\frac{M_n - b_n}{a_n} \le x\right) \to \exp\left[-\exp(-a\theta x)\right]$$
as $n \to \infty$, for some suitable norming constants $a_n > 0$ and $b_n$.

Second, suppose that $G$ belongs to the max domain of attraction of the Fréchet extreme value distribution. Then, there must exist a $\beta > 0$ such that
$$\lim_{t \to \infty} \frac{1 - G(tx)}{1 - G(t)} = x^{-\beta}$$
for every $x > 0$. But
$$\lim_{t \to \infty} \frac{1 - F(tx)}{1 - F(t)} = \lim_{t \to \infty} \left\{\frac{1 - \left\{1 - \left[1 - G(tx)\right]^a\right\}^b}{1 - \left\{1 - \left[1 - G(t)\right]^a\right\}^b}\right\}^{\theta} = \lim_{t \to \infty} \left[\frac{1 - G(tx)}{1 - G(t)}\right]^{a\theta} = x^{-\beta a\theta}$$
for every $x > 0$. So, it follows that $F$ also belongs to the max domain of attraction of the Fréchet extreme value distribution with
$$\Pr\left(\frac{M_n - b_n}{a_n} \le x\right) \to \exp\left(-x^{-\beta a\theta}\right)$$
as $n \to \infty$, for some suitable norming constants $a_n > 0$ and $b_n$.

Third, suppose that $G$ belongs to the max domain of attraction of the Weibull extreme value distribution. Then, there must exist a $\beta > 0$ such that
$$\lim_{t \downarrow 0} \frac{1 - G(w(G) - tx)}{1 - G(w(G) - t)} = x^{\beta}$$
for every $x > 0$. But
$$\lim_{t \downarrow 0} \frac{1 - F(w(F) - tx)}{1 - F(w(F) - t)} = \lim_{t \downarrow 0} \left\{\frac{1 - \left\{1 - \left[1 - G(w(F) - tx)\right]^a\right\}^b}{1 - \left\{1 - \left[1 - G(w(F) - t)\right]^a\right\}^b}\right\}^{\theta} = \lim_{t \downarrow 0} \left[\frac{1 - G(w(F) - tx)}{1 - G(w(F) - t)}\right]^{a\theta} = x^{\beta a\theta}$$
for every $x > 0$, assuming $w(F) = w(G)$. So, it follows that $F$ also belongs to the max domain of attraction of the Weibull extreme value distribution with
$$\Pr\left(\frac{M_n - b_n}{a_n} \le x\right) \to \exp\left(-(-x)^{\beta a\theta}\right)$$
as $n \to \infty$, for some suitable norming constants $a_n > 0$ and $b_n$.
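As an illustration (ours, under the tail relation used in the derivation above), the sketch below takes $G$ to be the standard exponential cdf, which lies in the Gumbel domain with $h(t) = 1$, forms the transformed survival function $1 - F(t) = \left[1 - \{1 - (1 - G(t))^a\}^b\right]^{\theta}$, and checks numerically that the tail ratio approaches $e^{-a\theta x}$.

```python
import numpy as np

a, b, theta = 2.0, 1.5, 0.7

def surv_G(t):
    """Survival function of the standard exponential, 1 - G(t) = exp(-t)."""
    return np.exp(-t)

def surv_F(t):
    """1 - F(t) = [1 - (1 - (1 - G(t))^a)^b]^theta, the transformed tail used above."""
    return (1.0 - (1.0 - surv_G(t) ** a) ** b) ** theta

x = 1.0
for t in [1.0, 2.0, 4.0, 8.0]:
    ratio = surv_F(t + x) / surv_F(t)
    print(f"t = {t:4.1f}: ratio = {ratio:.6f},  target exp(-a*theta*x) = {np.exp(-a * theta * x):.6f}")
```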

Solutions to Question B3

a) Note that $w(F) = 1$. Then
$$\lim_{t \downarrow 0} \frac{1 - F(1 - tx)}{1 - F(1 - t)} = \lim_{t \downarrow 0} \frac{x f(1 - tx)}{f(1 - t)} = \lim_{t \downarrow 0} \frac{x (1 - tx)^{\alpha - 1}(tx)^{\beta - 1}}{(1 - t)^{\alpha - 1} t^{\beta - 1}} = x^{\beta}.$$
So, $f(x) = C x^{\alpha - 1}(1 - x)^{\beta - 1}$ belongs to the Weibull domain of attraction.

b) Note that
$$F(k) = \begin{cases} 1/2, & \text{if } 0 \le k < 1, \\ 1, & \text{if } k \ge 1, \\ 0, & \text{otherwise} \end{cases}
\qquad \text{and} \qquad
1 - F(k - 1) = \begin{cases} 1, & \text{if } k \le 0, \\ 1/2, & \text{if } 1 \le k < 2, \\ 0, & \text{otherwise.} \end{cases}$$
So,
$$\frac{p(k)}{1 - F(k - 1)} = \begin{cases} 1/2, & \text{if } k = 0, \\ 1, & \text{if } k = 1, \\ 0, & \text{if } k < 2,\ k \ne 0, 1, \\ \text{undefined}, & \text{otherwise.} \end{cases}$$
Since this ratio does not tend to zero at the upper endpoint, there can be no non-degenerate limit.

c) Note that $w(F) = \infty$. Then
$$\lim_{t \to \infty} \frac{1 - F(tx)}{1 - F(t)} = \lim_{t \to \infty} \frac{x f(tx)}{f(t)} = \lim_{t \to \infty} \frac{x\left(1 + t^2 x^2\right)^{-1}}{\left(1 + t^2\right)^{-1}} = x^{-1}.$$
So, $f(x) = \pi^{-1}\left(1 + x^2\right)^{-1}$ belongs to the Fréchet domain of attraction.

d) Note that $w(F) = \infty$. It is easy to show that the corresponding cdf is
$$F(x) = \begin{cases} 0.5 e^{x}, & \text{if } x < 0, \\ 1 - 0.5 e^{-x}, & \text{if } x \ge 0. \end{cases}$$
Take $g(t) = 1$. Then
$$\lim_{t \to \infty} \frac{1 - F(t + x g(t))}{1 - F(t)} = \lim_{t \to \infty} \frac{e^{-t - x}}{e^{-t}} = e^{-x}.$$
So, $f(x) = 0.5 e^{-|x|}$ belongs to the Gumbel domain of attraction.

e) Note that $w(F) = \infty$. Then
$$\lim_{t \to \infty} \frac{1 - F(tx)}{1 - F(t)} = \lim_{t \to \infty} \frac{1 - \exp\left(-(tx)^{-1}\right)}{1 - \exp\left(-t^{-1}\right)} = \lim_{t \to \infty} \frac{1 - \left[1 - (tx)^{-1}\right]}{1 - \left[1 - t^{-1}\right]} = \lim_{t \to \infty} \frac{(tx)^{-1}}{t^{-1}} = x^{-1}.$$
So, the cdf $F(x) = \exp\left(-x^{-1}\right)$ belongs to the Fréchet domain of attraction.
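A numerical illustration of the tail-ratio conditions used in parts (c) and (e), ours rather than part of the solution: both ratios should approach $x^{-1}$ as $t$ grows.

```python
import numpy as np

def surv_cauchy(t):
    """1 - F(t) for the density in part (c), f(x) = 1 / (pi (1 + x^2))."""
    return 0.5 - np.arctan(t) / np.pi

def surv_e(t):
    """1 - F(t) for part (e), F(x) = exp(-1/x)."""
    return 1.0 - np.exp(-1.0 / t)

x = 2.0
for t in [10.0, 100.0, 1000.0]:
    r_c = surv_cauchy(t * x) / surv_cauchy(t)
    r_e = surv_e(t * x) / surv_e(t)
    print(f"t = {t:7.1f}:  part (c) ratio = {r_c:.5f},  part (e) ratio = {r_e:.5f},  target 1/x = {1/x:.5f}")
```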

Solutions to Question B4

(a) If $X$ is an absolutely continuous random variable with cdf $F(\cdot)$ then
$$\mathrm{VaR}_p(X) = F^{-1}(p) \quad \text{and} \quad \mathrm{ES}_p(X) = \frac{1}{p}\int_0^p F^{-1}(v)\, dv.$$

SEEN

(b) (i) The corresponding cdf is
$$F(x) = \int_K^x a K^a y^{-a - 1}\, dy = \left[-K^a y^{-a}\right]_K^x = 1 - K^a x^{-a}$$
for $x > K$.

SEEN

(b) (ii) Inverting
$$F(x) = 1 - K^a x^{-a} = p,$$
we obtain $\mathrm{VaR}_p(X) = K(1 - p)^{-1/a}$.

SEEN

(b) (iii) The expected shortfall is
$$\mathrm{ES}_p(X) = \frac{K}{p}\int_0^p (1 - v)^{-1/a}\, dv = \frac{K}{p}\left[-\frac{(1 - v)^{1 - 1/a}}{1 - 1/a}\right]_0^p = \frac{a K}{p(1 - a)}\left[(1 - p)^{1 - 1/a} - 1\right].$$

SEEN

(c) (i) The joint likelihood function of $a$ and $K$ is
$$L(a, K) = \prod_{i=1}^n \left[a K^a x_i^{-a - 1} I\{x_i \ge K\}\right] = a^n K^{na}\left(\prod_{i=1}^n x_i\right)^{-a - 1} I\{\min x_i \ge K\}.$$

(1 mark)

(c) (ii) Note that $L(a, K)$ is an increasing function of $K$ over $K \in [0, \min x_i]$ and is zero elsewhere. So, the mle of $K$ is $\hat{K} = \min x_i$.

(c) (iii) The log-likelihood function corresponding to $L(a, K)$ is
$$\log L(a, K) = n \log a + n a \log K - (a + 1)\sum_{i=1}^n \log x_i + \log I\{\min x_i \ge K\}.$$
So,
$$\frac{d \log L}{d a} = \frac{n}{a} + n \log K - \sum_{i=1}^n \log x_i.$$
Setting this to zero gives
$$\hat{a} = n\left[\sum_{i=1}^n \log x_i - n \log \min x_i\right]^{-1}.$$

(c) (iv) The mle of VaR is $\widehat{\mathrm{VaR}}_0(X) = \hat{K} = \min x_i$. By L'Hôpital's rule, the mle of ES is also $\widehat{\mathrm{ES}}_0(X) = \hat{K} = \min x_i$.

(c) (v) Let $Z = \min x_i$. The cdf of $Z$ is
$$F_Z(z) = \Pr(Z \le z) = 1 - \Pr(Z > z) = 1 - \Pr\left(\min X_i > z\right) = 1 - \Pr^n(X > z) = 1 - \left(\frac{K}{z}\right)^{na}$$
for $z > K$. The corresponding pdf is
$$f_Z(z) = n a K^{na} z^{-na - 1}$$
for $z > K$, so $Z$ is a Pareto random variable. The expected value of $Z$ is
$$E(Z) = \int_K^\infty z \cdot n a K^{na} z^{-na - 1}\, dz = n a K^{na}\int_K^\infty z^{-na}\, dz = n a K^{na}\left[\frac{z^{1 - na}}{1 - na}\right]_K^\infty = \frac{n a K}{n a - 1} \ne K.$$
So, $\hat{K}$, and hence $\widehat{\mathrm{VaR}}_0(X)$ and $\widehat{\mathrm{ES}}_0(X)$, are biased.
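A small simulation (ours) illustrating parts (c)(ii)-(v): it computes the MLEs $\hat{K} = \min x_i$ and $\hat{a}$ on repeated Pareto samples and compares the average of $\hat{K}$ with the exact expectation $n a K / (n a - 1)$.

```python
import numpy as np

rng = np.random.default_rng(1)
a, K, n, reps = 3.0, 2.0, 50, 20_000

# Pareto(a, K) draws via inverse transform: X = K * (1 - U)^(-1/a)
x = K * (1 - rng.random((reps, n))) ** (-1.0 / a)

K_hat = x.min(axis=1)                                    # MLE of K in each sample
a_hat = n / (np.log(x).sum(axis=1) - n * np.log(K_hat))  # MLE of a in each sample

print("mean of K_hat:", K_hat.mean(), " exact E[K_hat] = naK/(na-1):", n * a * K / (n * a - 1))
print("mean of a_hat:", a_hat.mean(), " true a:", a)
```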

Solutions to Question B5

a) The cdf of $Y$ is
$$F_Y(y) = \Pr(Y \le y) = \Pr\left(\max(X_1, \ldots, X_m) \le y\right) = \Pr(X_1 \le y, \ldots, X_m \le y) = \Pr(X_1 \le y)\cdots\Pr(X_m \le y) = \left(\frac{y - a}{b - a}\right)\cdots\left(\frac{y - a}{b - a}\right) = \left(\frac{y - a}{b - a}\right)^m$$
for $a \le y \le b$.

b) The corresponding pdf is
$$f_Y(y) = \frac{m}{b - a}\left(\frac{y - a}{b - a}\right)^{m - 1}$$
for $a \le y \le b$.

c) The $n$th moment of $Y$ can be calculated as
$$E(Y^n) = \int_a^b y^n \frac{m}{b - a}\left(\frac{y - a}{b - a}\right)^{m - 1} dy = m(b - a)^{-m}\int_a^b \left[(y - a) + a\right]^n (y - a)^{m - 1}\, dy$$
$$= m(b - a)^{-m}\int_a^b \sum_{k=0}^n \binom{n}{k} a^{n - k}(y - a)^{k + m - 1}\, dy = m(b - a)^{-m}\sum_{k=0}^n \binom{n}{k} a^{n - k}\int_a^b (y - a)^{k + m - 1}\, dy$$
$$= m(b - a)^{-m}\sum_{k=0}^n \binom{n}{k} a^{n - k}\left[\frac{(y - a)^{k + m}}{k + m}\right]_a^b = m(b - a)^{-m}\sum_{k=0}^n \binom{n}{k} a^{n - k}\frac{(b - a)^{k + m}}{k + m}.$$

So,
$$E(Y) = m(b - a)^{-m}\sum_{k=0}^1 \binom{1}{k} a^{1 - k}\frac{(b - a)^{k + m}}{k + m}$$
and
$$\mathrm{Var}(Y) = m(b - a)^{-m}\sum_{k=0}^2 \binom{2}{k} a^{2 - k}\frac{(b - a)^{k + m}}{k + m} - E^2(Y).$$

d) Setting
$$\left(\frac{y - a}{b - a}\right)^m = p$$
gives $\mathrm{VaR}_p(Y) = a + (b - a)p^{1/m}$.

e) The expected shortfall is
$$\mathrm{ES}_p(Y) = \frac{1}{p}\int_0^p \left[a + (b - a)v^{1/m}\right] dv = \frac{1}{p}\left[a v + (b - a)\frac{v^{1 + 1/m}}{1 + 1/m}\right]_0^p = \frac{1}{p}\left[a p + (b - a)\frac{p^{1 + 1/m}}{1 + 1/m}\right] = a + (b - a)\frac{p^{1/m}}{1 + 1/m}.$$
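A quick Monte Carlo check of parts (d) and (e), ours rather than part of the solution: simulate $Y$ as the maximum of $m$ uniforms on $(a, b)$, then compare the empirical $p$-quantile and the empirical lower-tail average with the closed forms.

```python
import numpy as np

rng = np.random.default_rng(2)
a, b, m, p, reps = 1.0, 4.0, 5, 0.1, 500_000

y = rng.uniform(a, b, size=(reps, m)).max(axis=1)   # Y = max of m uniforms on (a, b)

var_mc = np.quantile(y, p)
es_mc = y[y <= var_mc].mean()                        # average of the lowest 100p% of outcomes

var_formula = a + (b - a) * p ** (1 / m)
es_formula = a + (b - a) * p ** (1 / m) / (1 + 1 / m)
print("VaR:", var_mc, var_formula)
print("ES :", es_mc, es_formula)
```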

f) The likelihood and log-likelihood functions are
$$L(a, b) = \prod_{i=1}^n \left[\frac{m}{b - a}\left(\frac{y_i - a}{b - a}\right)^{m - 1} I\{a < y_i < b\}\right] = m^n (b - a)^{-mn}\left[\prod_{i=1}^n (y_i - a)\right]^{m - 1} I\{\min y_i > a,\ \max y_i < b\}$$
and
$$\log L(a, b) = n \log m - m n \log(b - a) + (m - 1)\sum_{i=1}^n \log(y_i - a) + \log I\{\min y_i > a,\ \max y_i < b\}.$$
The likelihood function is a decreasing function of $b$, so its MLE is $\hat{b} = \max y_i$. The partial derivative of the log-likelihood with respect to $a$ is
$$\frac{\partial \log L(a, b)}{\partial a} = \frac{m n}{b - a} - (m - 1)\sum_{i=1}^n \frac{1}{y_i - a}.$$
So, the MLE of $a$ is the root of
$$\frac{m n}{\hat{b} - a} = (m - 1)\sum_{i=1}^n \frac{1}{y_i - a}.$$

(6 marks)
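A sketch (ours) of how the root in part (f) could be found numerically: with $\hat{b} = \max y_i$ fixed, the score in $a$ is bracketed and bisected on $(-\infty, \min y_i)$. The data are simulated only for illustration.

```python
import numpy as np

rng = np.random.default_rng(3)
a_true, b_true, m, n = 1.0, 4.0, 5, 200
y = rng.uniform(a_true, b_true, size=(n, m)).max(axis=1)   # simulated data for illustration

b_hat = y.max()

def score(a):
    """Derivative of log L with respect to a, with b fixed at b_hat."""
    return m * n / (b_hat - a) - (m - 1) * np.sum(1.0 / (y - a))

# bracket the root: the score tends to -inf as a -> min(y) and is positive far enough below
hi = y.min() - 1e-9
lo = y.min() - 1.0
while score(lo) <= 0:
    lo -= 1.0

for _ in range(200):            # plain bisection
    mid = 0.5 * (lo + hi)
    if score(mid) > 0:
        lo = mid
    else:
        hi = mid

print("a_hat =", 0.5 * (lo + hi), " b_hat =", b_hat, " (true a, b):", a_true, b_true)
```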
