THE ROYAL STATISTICAL SOCIETY GRADUATE DIPLOMA

THE ROYAL STATISTICAL SOCIETY

2004 EXAMINATIONS SOLUTIONS

GRADUATE DIPLOMA PAPER I STATISTICAL THEORY & METHODS

The Society provides these solutions to assist candidates preparing for the examinations in future years and for the information of any other persons using the examinations. The solutions should NOT be seen as "model answers". Rather, they have been written out in considerable detail and are intended as learning aids. Users of the solutions should always be aware that in many cases there are valid alternative methods. Also, in the many cases where discussion is called for, there may be other valid points that could be made. While every care has been taken with the preparation of these solutions, the Society will not be responsible for any errors or omissions. The Society will not enter into any correspondence in respect of these solutions.

RSS 2004

Graduate Diploma, Statistical Theory & Methods, Paper I, 2004

Question 1

$X$ and $Y$ are independent, each with probability function given by $P(W = w) = (1-\theta)^{w-1}\theta$, for $w = 1, 2, \dots$ and where $0 < \theta < 1$.

(i) Let $U = X + Y$. $U$ may take the values $2, 3, \dots$. By the independence of $X$ and $Y$,

$$P(U = u) = \sum_{x=1}^{u-1} P(X = x \text{ and } Y = u - x) \quad \text{for } u = 2, 3, \dots$$

Hence

$$P(U = u) = \sum_{x=1}^{u-1} \left\{(1-\theta)^{x-1}\theta\right\}\left\{(1-\theta)^{u-x-1}\theta\right\} = \sum_{x=1}^{u-1} \theta^2(1-\theta)^{u-2} = (u-1)\,\theta^2(1-\theta)^{u-2},$$

for $u = 2, 3, \dots$. [This is a negative binomial distribution.]

(ii) If $U\,(= X + Y) = u$, $X$ must take values in the range $1, 2, \dots, u-1$. Then

$$P(X = x \mid U = u) = \frac{P(X = x \text{ and } Y = u - x)}{P(U = u)} = \frac{\left\{(1-\theta)^{x-1}\theta\right\}\left\{(1-\theta)^{u-x-1}\theta\right\}}{(u-1)\,\theta^2(1-\theta)^{u-2}} = \frac{1}{u-1} \quad (\text{for } x = 1, 2, \dots, u-1).$$

This is a discrete uniform distribution on the integers $1, 2, \dots, u-1$.

(iii) Suppose Andy takes $X$ attempts and Bob takes $Y$ attempts. Then $X$ and $Y$ are independent geometric random variables with $\theta = 0.4$. Thus from part (i),

$$P(X + Y = 6) = (6-1)(0.4)^2(0.6)^4 = 0.1037.$$

From part (ii),

$$P(X < Y \mid X + Y = 6) = P(X = 1 \text{ or } 2 \mid X + Y = 6) = \frac{1}{5} + \frac{1}{5} = \frac{2}{5}.$$
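
The answers in part (iii) are easy to spot-check numerically. The following short Python sketch (not part of the original solutions; the function name and seed are ours) simulates the two geometric variables and estimates both probabilities by Monte Carlo.

```python
import random

random.seed(1)

def geometric(theta):
    """Number of Bernoulli(theta) trials up to and including the first success."""
    n = 1
    while random.random() >= theta:
        n += 1
    return n

theta, trials = 0.4, 1_000_000
sum6 = x_less_y_given_6 = 0
for _ in range(trials):
    x, y = geometric(theta), geometric(theta)
    if x + y == 6:
        sum6 += 1
        if x < y:
            x_less_y_given_6 += 1

print(sum6 / trials)            # approx 0.1037 = 5 * 0.4^2 * 0.6^4
print(x_less_y_given_6 / sum6)  # approx 0.4 = 2/5
```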

Graduate Diploma, Statistical Theory & Methods, Paper I, 2004

Question 2

(i) (a) We have $P(U = u) = 1/m$, for $u = 1, 2, \dots, m$. So

$$E[U] = \sum u\,P(U = u) = \frac{1}{m}\left(1 + 2 + \dots + m\right) = \frac{1}{m}\cdot\frac{m(m+1)}{2} = \frac{m+1}{2}.$$

Also,

$$E[U^2] = \sum u^2\,P(U = u) = \frac{1}{m}\left(1^2 + 2^2 + \dots + m^2\right) = \frac{1}{m}\cdot\frac{m(m+1)(2m+1)}{6} = \frac{(m+1)(2m+1)}{6},$$

and therefore

$$\mathrm{Var}(U) = E[U^2] - \left\{E[U]\right\}^2 = \frac{(m+1)(2m+1)}{6} - \left(\frac{m+1}{2}\right)^2 = \frac{m+1}{12}\left(4m + 2 - 3m - 3\right) = \frac{m^2 - 1}{12}.$$

(b) $V = U + k$, so $E[V] = \dfrac{m+1}{2} + k$ and $\mathrm{Var}(V) = \dfrac{m^2-1}{12}$.

(ii) Given that $Y = y$, the first $y - 1$ rolls must have been 6s and the final roll not a 6. Thus $X$ is a discrete uniform random variable with, using the notation of part (i)(b), $k = 6(y-1)$ and $m = 5$. Hence

$$E[X \mid Y = y] = 3 + 6(y-1) = 6y - 3 \quad\text{and}\quad \mathrm{Var}(X \mid Y = y) = \frac{24}{12} = 2.$$

These are the values conditional on $Y = y$.

Now, $Y$ is a geometric random variable with $E[Y] = \dfrac{1}{5/6} = \dfrac{6}{5}$ and $\mathrm{Var}(Y) = \dfrac{1/6}{(5/6)^2} = \dfrac{6}{25}$.

$$E[X] = E\left[E(X \mid Y)\right] = E[6Y - 3] = 6E[Y] - 3 = \frac{36}{5} - 3 = \frac{21}{5}.$$

We also need $E\left[\mathrm{Var}(X \mid Y)\right] = E[2] = 2$ and $\mathrm{Var}\left(E(X \mid Y)\right) = \mathrm{Var}(6Y - 3) = 36\,\mathrm{Var}(Y) = \dfrac{216}{25}$, from which (using the result quoted in the question)

$$\mathrm{Var}(X) = 2 + \frac{216}{25} = \frac{266}{25}.$$
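
As an informal check of part (ii), the experiment can be simulated directly under the setup recovered above (roll a fair die until the first non-6; $X$ is the total score). A minimal Python sketch:

```python
import random, statistics

random.seed(2)

def total_score():
    """Roll a fair die until the first non-6; return the total of all rolls."""
    total = 0
    while True:
        roll = random.randint(1, 6)
        total += roll
        if roll != 6:
            return total

xs = [total_score() for _ in range(1_000_000)]
print(statistics.fmean(xs))      # approx 21/5 = 4.2
print(statistics.pvariance(xs))  # approx 266/25 = 10.64
```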

Graduate Diploma, Statistical Theory & Methods, Paper I, 2004

Question 3

(i)

$$E[U^m] = \int_0^\infty u^m\,\frac{\theta^\alpha u^{\alpha-1} e^{-\theta u}}{\Gamma(\alpha)}\,du = \frac{\theta^\alpha}{\Gamma(\alpha)}\int_0^\infty u^{\alpha+m-1} e^{-\theta u}\,du \quad [\text{put } t = \theta u]$$

$$= \frac{\theta^\alpha}{\Gamma(\alpha)\,\theta^{\alpha+m}}\int_0^\infty t^{\alpha+m-1} e^{-t}\,dt = \frac{\Gamma(\alpha+m)}{\Gamma(\alpha)\,\theta^m}.$$

Hence $E[U] = \alpha/\theta$. Also $E[U^2] = \dfrac{\alpha(\alpha+1)}{\theta^2}$, and so $\mathrm{Var}(U) = E[U^2] - \{E[U]\}^2 = \dfrac{\alpha}{\theta^2}$.

(ii) [The joint pdf in the question is evidently $f(x, y) = 8x e^{-2y}$ for $0 < x < y$.]

$$f_X(x) = \int_x^\infty 8x e^{-2y}\,dy = 8x\left[-\tfrac{1}{2}e^{-2y}\right]_x^\infty = 4x e^{-2x} \quad (\text{for } x > 0).$$

This is gamma with $\alpha = 2$ and $\theta = 2$.

$$f_Y(y) = \int_0^y 8x e^{-2y}\,dx = 8e^{-2y}\cdot\frac{y^2}{2} = 4y^2 e^{-2y} \quad (\text{for } y > 0).$$

This is gamma with $\alpha = 3$ and $\theta = 2$. Therefore from part (i), $E[X] = 1$ and $\mathrm{Var}(X) = 1/2$, and $E[Y] = 3/2$ and $\mathrm{Var}(Y) = 3/4$.

$$E[XY] = \int_0^\infty\!\!\int_0^y xy\cdot 8x e^{-2y}\,dx\,dy = \int_0^\infty 8y e^{-2y}\left[\int_0^y x^2\,dx\right]dy = \frac{8}{3}\int_0^\infty y^4 e^{-2y}\,dy \quad [\text{put } t = 2y]$$

$$= \frac{8}{3}\cdot\frac{1}{2^5}\int_0^\infty t^4 e^{-t}\,dt = \frac{8}{3}\cdot\frac{\Gamma(5)}{32} = \frac{8}{3}\cdot\frac{4!}{32} = 2.$$

$$\mathrm{Cov}(X, Y) = E[XY] - E(X)E(Y) = 2 - \frac{3}{2} = \frac{1}{2},$$

and the correlation between $X$ and $Y$ is

$$\frac{\mathrm{Cov}(X, Y)}{\sqrt{\mathrm{Var}(X)\,\mathrm{Var}(Y)}} = \frac{1/2}{\sqrt{\frac{1}{2}\cdot\frac{3}{4}}} = \sqrt{\frac{2}{3}} \approx 0.816.$$
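
A Monte Carlo check of part (ii), assuming the joint pdf $f(x,y) = 8xe^{-2y}$ on $0 < x < y$ recovered above: the conditional pdf of $X$ given $Y = y$ is $2x/y^2$, so its conditional cdf is $x^2/y^2$ and $X$ can be generated as $y\sqrt{U}$. The sketch below (ours, not part of the solutions) estimates the means, covariance and correlation.

```python
import math, random

random.seed(3)
n = 500_000
xs, ys = [], []
for _ in range(n):
    y = random.gammavariate(3, 0.5)     # gamma with alpha=3, scale=1/theta=1/2
    x = y * math.sqrt(random.random())  # X | Y=y has cdf x^2/y^2 on (0, y)
    xs.append(x)
    ys.append(y)

mx, my = sum(xs) / n, sum(ys) / n
cov = sum(x * y for x, y in zip(xs, ys)) / n - mx * my
vx = sum(x * x for x in xs) / n - mx * mx
vy = sum(y * y for y in ys) / n - my * my
print(mx, my)                        # approx 1 and 1.5
print(cov, cov / math.sqrt(vx * vy)) # approx 0.5 and sqrt(2/3) = 0.816
```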

Graduate Diploma, Statistical Theory & Methods, Paper I, 2004

Question 4

$X$ and $Y$ are uniform(0, 1). $U = X - Y$, $V = X + Y$. So $X = \frac{1}{2}(U + V)$ and $Y = \frac{1}{2}(V - U)$.

(i) The limits of $U$ are $(-1, 1)$ and of $V$ are $(0, 2)$. [A diagram in the original shows the region of the $(u, v)$-plane on which $f(u, v)$ is non-zero: the square with vertices $(0, 0)$, $(1, 1)$, $(0, 2)$ and $(-1, 1)$.]

The Jacobian of the transformation is

$$J = \begin{vmatrix} \partial x/\partial u & \partial x/\partial v \\ \partial y/\partial u & \partial y/\partial v \end{vmatrix} = \begin{vmatrix} \tfrac{1}{2} & \tfrac{1}{2} \\ -\tfrac{1}{2} & \tfrac{1}{2} \end{vmatrix} = \frac{1}{2}.$$

Therefore, for $(u, v)$ in the region shown, and using the independence of $X$ and $Y$,

$$f(u, v) = |J|\,f(x, y) = |J|\,f(x)f(y) = \frac{1}{2}\cdot 1 = \frac{1}{2}.$$

(ii) For $-1 \le u \le 0$: $f(u) = \int_{-u}^{2+u} \frac{1}{2}\,dv = 1 + u$.

For $0 \le u \le 1$: $f(u) = \int_{u}^{2-u} \frac{1}{2}\,dv = 1 - u$.

For $0 \le v \le 1$: $f(v) = \int_{-v}^{v} \frac{1}{2}\,du = v$.

For $1 \le v \le 2$: $f(v) = \int_{v-2}^{2-v} \frac{1}{2}\,du = 2 - v$.

($f(u)$ and $f(v)$ are zero elsewhere.) These distributions are identical except for a shift of one unit in the $x$-direction.

(iii) We now have $W = a + tX$ and $Z = b + tY$ ($t > 0$). So $W + Z = (a + b) + t(X + Y) = (a + b) + tV$, where $V$ is as in parts (i) and (ii). So, using the result for $V$ in part (ii),

$$P\left(W + Z < a + b + \frac{t}{2}\right) = P\left(tV < \frac{t}{2}\right) = P\left(V < \frac{1}{2}\right) = \int_0^{1/2} v\,dv = \left[\frac{v^2}{2}\right]_0^{1/2} = \frac{1}{8}.$$
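
Since the answer in part (iii) reduces to $P(X + Y < \tfrac{1}{2})$ for independent uniforms, a one-line simulation confirms it (a sketch, not part of the solutions):

```python
import random

random.seed(4)
n = 1_000_000
count = sum(random.random() + random.random() < 0.5 for _ in range(n))
print(count / n)  # approx 1/8 = 0.125
```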

Graduate Diploma, Statistical Theory & Methods, Paper I, 2004

Question 5

(i) The method is to complete the square in the exponent in the integral, and then recognise that this re-creates the pdf of a Normal distribution, so that the value of the integral is simply 1.

$$M_X(t) = E\left[e^{tX}\right] = \int_{-\infty}^{\infty} e^{tx}\,\frac{1}{\sigma\sqrt{2\pi}}\exp\left\{-\frac{(x-\mu)^2}{2\sigma^2}\right\}dx.$$

The exponent is

$$tx - \frac{(x-\mu)^2}{2\sigma^2} = -\frac{1}{2\sigma^2}\left(x^2 - 2\mu x + \mu^2 - 2\sigma^2 tx\right) = -\frac{1}{2\sigma^2}\left\{\left[x - (\mu + \sigma^2 t)\right]^2 - 2\mu\sigma^2 t - \sigma^4 t^2\right\}.$$

Hence

$$M_X(t) = \exp\left(\mu t + \tfrac{1}{2}\sigma^2 t^2\right)\int_{-\infty}^{\infty}\frac{1}{\sigma\sqrt{2\pi}}\exp\left\{-\frac{\left[x - (\mu + \sigma^2 t)\right]^2}{2\sigma^2}\right\}dx = \exp\left(\mu t + \tfrac{1}{2}\sigma^2 t^2\right).$$

(ii) $M_{aX+b}(t) = E\left[e^{t(aX+b)}\right] = e^{bt}E\left[e^{atX}\right] = e^{bt}M_X(at)$.

Taking $a = 1/\sigma$ and $b = -\mu/\sigma$ we have $Z$, and so

$$M_Z(t) = e^{-\mu t/\sigma}\exp\left(\frac{\mu t}{\sigma} + \frac{\sigma^2 t^2}{2\sigma^2}\right) = e^{t^2/2}.$$

Putting $\mu = 0$ and $\sigma = 1$ in the result of part (i), we see that this is the moment generating function of the standard Normal distribution. Therefore (assuming without proof the uniqueness property between distributions and their moment generating functions) $Z$ follows the standard Normal distribution.

(iii) We expand $M_Z(t)$ as a power series in $t$ and use the result that the $r$th moment of $Z$ about the origin, $E[Z^r]$, is the coefficient of $t^r/r!$ in this expansion. [Alternatively, use the result that it is given by the $r$th derivative evaluated at $t = 0$.]

$$M_Z(t) = 1 + \frac{t^2}{2} + \frac{(t^2/2)^2}{2!} + \frac{(t^2/2)^3}{3!} + \dots = 1 + \frac{t^2}{2} + \frac{t^4}{8} + \frac{t^6}{48} + \dots$$

We need $E[Z^2]$ and $E[Z^4]$, which we see are 1 and 3 respectively. So

$$\mathrm{Var}(Z^2) = E[Z^4] - \left\{E[Z^2]\right\}^2 = 3 - 1 = 2.$$
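
A quick simulation check of part (iii): sampling $Z$ from $N(0, 1)$ and looking at $Z^2$ should give a mean of about 1 and a variance of about 2 (a sketch, not part of the solutions).

```python
import random, statistics

random.seed(5)
z2 = [random.gauss(0.0, 1.0) ** 2 for _ in range(1_000_000)]
print(statistics.fmean(z2))      # approx E[Z^2] = 1
print(statistics.pvariance(z2))  # approx Var(Z^2) = 2
```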

Graduate Diploma, Statistical Theory & Methods, Paper I, 2004

Question 6

(i)

$$F(x) = P(X \le x) = P(X \le x \mid \text{boy})P(\text{boy}) + P(X \le x \mid \text{girl})P(\text{girl}) = \frac{1}{2}\left\{F_1(x) + F_2(x)\right\}.$$

Hence the pdf of $X$ is $f(x) = \dfrac{dF(x)}{dx} = \dfrac{1}{2}\left\{f_1(x) + f_2(x)\right\}$, and

$$E[X] = \int x f(x)\,dx = \frac{1}{2}\int x f_1(x)\,dx + \frac{1}{2}\int x f_2(x)\,dx = \frac{\mu_1 + \mu_2}{2}.$$

(ii)

$$E[X^2] = \int x^2 f(x)\,dx = \frac{1}{2}\int x^2 f_1(x)\,dx + \frac{1}{2}\int x^2 f_2(x)\,dx \quad [\text{now use } \sigma^2 = E[X^2] - \{E[X]\}^2]$$

$$= \frac{1}{2}\left\{\left(\sigma^2 + \mu_1^2\right) + \left(\sigma^2 + \mu_2^2\right)\right\} = \sigma^2 + \frac{\mu_1^2 + \mu_2^2}{2},$$

and thus

$$\mathrm{Var}(X) = \sigma^2 + \frac{\mu_1^2 + \mu_2^2}{2} - \left(\frac{\mu_1 + \mu_2}{2}\right)^2 = \sigma^2 + \frac{(\mu_1 - \mu_2)^2}{4}.$$

(iii) As the sample is large, we may use the central limit theorem to say that the distribution of $\bar{X}_r$ is approximately Normal, whatever the distributions $f_1$ and $f_2$. The mean of $\bar{X}_r$ is equal to the mean of $X$, i.e. $\frac{1}{2}(\mu_1 + \mu_2)$, and

$$\mathrm{Var}\left(\bar{X}_r\right) = \frac{1}{n}\mathrm{Var}(X) = \frac{\sigma^2}{n} + \frac{(\mu_1 - \mu_2)^2}{4n}.$$

(iv) Let the (independent) sample means for boys and girls be $\bar{X}_1$ and $\bar{X}_2$. Their means are $\mu_1$ and $\mu_2$, and their variances are each $\sigma^2/n$. So $\bar{X}_{st} = \frac{1}{2}\left(\bar{X}_1 + \bar{X}_2\right)$ has mean $\frac{1}{2}(\mu_1 + \mu_2)$ and variance

$$\frac{1}{4}\left(\frac{\sigma^2}{n} + \frac{\sigma^2}{n}\right) = \frac{\sigma^2}{2n}.$$

Thus both $\bar{X}_r$ and $\bar{X}_{st}$ are unbiased, but the latter has smaller variance; it is more precise.
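
The comparison in parts (iii) and (iv) can be illustrated by simulation. The sketch below uses illustrative values $\mu_1 = 10$, $\mu_2 = 14$, $\sigma = 2$, $n = 50$ (these numbers are ours, not from the question) and Normal strata; the empirical variances should be near $\sigma^2/n + (\mu_1 - \mu_2)^2/(4n) = 0.16$ and $\sigma^2/(2n) = 0.04$.

```python
import random, statistics

random.seed(6)
mu1, mu2, sigma, n, reps = 10.0, 14.0, 2.0, 50, 20_000

rand_means, strat_means = [], []
for _ in range(reps):
    # Random sample of n children: each is a boy or a girl with probability 1/2.
    rand_means.append(statistics.fmean(
        random.gauss(mu1 if random.random() < 0.5 else mu2, sigma)
        for _ in range(n)))
    # Stratified sample: n boys and n girls, then average the two sample means.
    b = statistics.fmean(random.gauss(mu1, sigma) for _ in range(n))
    g = statistics.fmean(random.gauss(mu2, sigma) for _ in range(n))
    strat_means.append(0.5 * (b + g))

print(statistics.pvariance(rand_means))   # approx 0.16
print(statistics.pvariance(strat_means))  # approx 0.04
```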

Graduate Diploma, Statistical Theory & Methods, Paper I, 2004

Question 7

(i) (a) The distribution of $X$ is

x:        0        1        2        3
P(x):     64/125   48/125   12/125   1/125
cdf F(x): 0.512    0.896    0.992    1.000

0.3612 < 0.512 and so corresponds to $x = 0$.
0.6789 is between 0.512 and 0.896 and so corresponds to $x = 1$.
0.3552 < 0.512 and so corresponds to $x = 0$.
0.2898 < 0.512 and so corresponds to $x = 0$.

So the sample is 0, 1, 0, 0.

(b)

$$F(x) = \int_0^x 2t^3 e^{-t^4/2}\,dt = \left[-e^{-t^4/2}\right]_0^x = 1 - e^{-x^4/2}.$$

Thus the inverse cdf method gives $u = 1 - e^{-x^4/2}$, where $u$ is the random number, so $x = \left[-2\log(1-u)\right]^{1/4}$.

u        1 - u    -2 log(1 - u)   [-2 log(1 - u)]^(1/4)
0.3612   0.6388   0.8963          0.9730
0.6789   0.3211   2.2720          1.2277
0.3552   0.6448   0.8776          0.9679
0.2898   0.7102   0.6844          0.9096

The final column gives the required random numbers.

(ii) Inter-arrival times, $X$, are exponential with parameter ½. Given $u$ (as above), the inverse cdf method leads to $u = 1 - e^{-x/2}$, i.e. $x = -2\log(1-u)$ [as is tabulated in column 3 above].

First arrival time is at 0.8963. Service starts immediately and lasts (uniform distribution on (1.5, 2.5)) 1.5 + 0.6789 = 2.1789, and thus ends at 0.8963 + 2.1789 = 3.0752.

Second arrival time is at 0.8963 + 0.8776 = 1.7739. Service cannot start until 3.0752 and then lasts 1.5 + 0.2898 = 1.7898, and thus ends at 3.0752 + 1.7898 = 4.8650.

Expressing the simulated results to the nearest minute after 9 am (one time unit above is 10 minutes), the first customer arrives at 9:09 and leaves at 9:31; the second arrives at 9:18 and leaves at 9:49.
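
The whole of this question reproduces in a few lines of Python (a sketch, not part of the solutions). It applies the inverse-cdf transformations to the four random numbers used above and then runs the two-customer queue, printing times in minutes after 9 am.

```python
import math

# The four uniform random numbers used in the solution.
us = [0.3612, 0.6789, 0.3552, 0.2898]

# (i)(b): inverse-cdf sampling from F(x) = 1 - exp(-x^4 / 2).
xs = [(-2.0 * math.log(1.0 - u)) ** 0.25 for u in us]
print([round(x, 4) for x in xs])  # [0.973, 1.2277, 0.9679, 0.9096]

# (ii): single-server queue, time in units of 10 minutes after 9 am.
interarrival = [-2.0 * math.log(1.0 - u) for u in (us[0], us[2])]  # exponential, rate 1/2
service = [1.5 + u for u in (us[1], us[3])]                        # uniform on (1.5, 2.5)

arrive1 = interarrival[0]
leave1 = arrive1 + service[0]               # server is free on arrival
arrive2 = arrive1 + interarrival[1]
leave2 = max(leave1, arrive2) + service[1]  # second customer may have to queue
for t in (arrive1, leave1, arrive2, leave2):
    print(round(10 * t))  # minutes after 9 am: 9, 31, 18, 49
```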

Graduate Diploma, Statistical Theory & Methods, Paper I, 2004

Question 8

(i) Consider the number of balls in urn A, an integer from 0 to $M$. Define the transition probability $p_{ij} = P(j \text{ balls in urn A at step } n+1 \mid i \text{ balls in urn A at step } n)$. Then we have

$$p_{01} = 1, \quad p_{0j} = 0 \text{ for } j \ne 1;$$

$$\text{for } i = 1, 2, \dots, M-1: \quad p_{i,i-1} = \frac{i}{M}, \quad p_{i,i+1} = \frac{M-i}{M}, \quad p_{ij} = 0 \text{ for } j \ne i-1 \text{ or } i+1;$$

$$p_{M,M-1} = 1, \quad p_{Mj} = 0 \text{ for } j \ne M-1.$$

(ii) Let the stationary probabilities be $\pi_0, \pi_1, \dots, \pi_M$. We have $\pi_j = \sum_{i=0}^{M} p_{ij}\,\pi_i$. Hence

$$\pi_0 = \frac{1}{M}\pi_1; \qquad \pi_j = \frac{M-j+1}{M}\pi_{j-1} + \frac{j+1}{M}\pi_{j+1} \quad (j = 1, 2, \dots, M-1); \qquad \pi_M = \frac{1}{M}\pi_{M-1}.$$

It is required to show that $\pi_j = \binom{M}{j}\left(\frac{1}{2}\right)^M$ ($j = 0, 1, \dots, M$) satisfy these equations. Immediately, $\binom{M}{1} = M\binom{M}{0}$ and $\binom{M}{M-1} = M\binom{M}{M}$; thus $\pi_1 = M\pi_0$ and $\pi_{M-1} = M\pi_M$, as required. To confirm that the general equation is satisfied, consider

$$\frac{M-j+1}{M}\binom{M}{j-1} + \frac{j+1}{M}\binom{M}{j+1} = \frac{M-j+1}{M}\cdot\frac{M!}{(j-1)!\,(M-j+1)!} + \frac{j+1}{M}\cdot\frac{M!}{(j+1)!\,(M-j-1)!}$$

$$= \frac{(M-1)!}{(j-1)!\,(M-j)!} + \frac{(M-1)!}{j!\,(M-j-1)!} = \frac{(M-1)!\,j}{j!\,(M-j)!} + \frac{(M-1)!\,(M-j)}{j!\,(M-j)!} = \frac{M!}{j!\,(M-j)!} = \binom{M}{j}.$$

Multiplying by $\left(\frac{1}{2}\right)^M$, we see that the general equation ($j = 1, 2, \dots, M-1$) is satisfied.

(iii) Let $X$ be the number of balls in urn A. The given probabilities form a binomial distribution with parameters (60, ½). So $E(X) = 60 \times \frac{1}{2} = 30$ and $\mathrm{Var}(X) = 60 \times \frac{1}{2} \times \frac{1}{2} = 15$. $P(X = 34)$ can be approximated by $P(33.5 < N(30, 15) < 34.5)$. Standardising to $N(0, 1)$ gives

$$P(0.9037 < N(0, 1) < 1.1619) = 0.8774 - 0.8169 = 0.0605.$$
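
As a check on part (iii), the sketch below (ours, not part of the solutions) simulates the urn chain with $M = 60$ and estimates the long-run proportion of steps with 34 balls in urn A, comparing it with the exact stationary probability $\binom{60}{34}\left(\frac{1}{2}\right)^{60}$.

```python
import math, random

random.seed(8)
M, steps = 60, 2_000_000
x, hits = M // 2, 0  # start with 30 balls in urn A
for _ in range(steps):
    # One ball is picked uniformly at random and moved to the other urn:
    # with probability x/M it leaves urn A, otherwise it enters urn A.
    x += -1 if random.random() < x / M else 1
    hits += (x == 34)

print(hits / steps)                      # long-run proportion, approx 0.06
print(math.comb(60, 34) * 0.5 ** 60)     # exact 0.0606, close to the Normal approx 0.0605
```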