EXAMINATIONS OF THE ROYAL STATISTICAL SOCIETY

GRADUATE DIPLOMA, 2016

MODULE 1 : Probability distributions

Time allowed: Three hours

Candidates should answer FIVE questions. All questions carry equal marks. The number of marks allotted for each part-question is shown in brackets.

Graph paper and Official tables are provided.

Candidates may use calculators in accordance with the regulations published in the Society's "Guide to Examinations" (document Ex1).

The notation log denotes logarithm to base e. Logarithms to any other base are explicitly identified, e.g. $\log_{10}$. Note also that $\binom{n}{r}$ is the same as $^{n}C_{r}$.

This examination paper consists of 12 printed pages. This front cover is page 1. Question 1 starts on page 2. There are 8 questions altogether in the paper.

GD Module 1 2016

RSS 2016

1. (i) The discrete random variable U is equally likely to take any of the integer values 1, 2, ..., k (for $k \geq 1$). Show that

$E(U) = \dfrac{k+1}{2}$  and  $\mathrm{Var}(U) = \dfrac{k^2 - 1}{12}$.

[You may use without proof the result that $\sum_{i=1}^{k} i^2 = \dfrac{k(k+1)(2k+1)}{6}$.]

(ii) X and Y are independent geometric random variables with probability distributions

$P(X = x) = \theta(1-\theta)^{x-1}$, $x = 1, 2, \ldots$,  and  $P(Y = y) = \theta(1-\theta)^{y-1}$, $y = 1, 2, \ldots$,

where $0 < \theta < 1$.

(a) Derive the probability distribution of the random variable S = X + Y.

(b) Find $P(X = x \mid S = s)$ for all possible values of x, when $s \geq 2$ is a fixed integer. Write down expressions for $E(X \mid S = s)$ and $\mathrm{Var}(X \mid S = s)$. (8)
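
A short Python sketch of the kind of numerical cross-check a reader might run on part (ii); the value $\theta = 0.3$ and the sample size are illustrative assumptions, not part of the question. It compares the simulated distribution of S = X + Y with the form $(s-1)\theta^2(1-\theta)^{s-2}$ that part (a) leads to, and checks that X given S = 5 looks uniform on {1, 2, 3, 4}.

```python
import numpy as np

rng = np.random.default_rng(0)
theta, n = 0.3, 200_000              # theta and n are assumed, illustrative values

# Independent geometric variables on {1, 2, ...} with success probability theta
x = rng.geometric(theta, n)
y = rng.geometric(theta, n)
s = x + y

# Empirical P(S = s) versus (s - 1) * theta^2 * (1 - theta)^(s - 2)
for val in (2, 3, 5, 8):
    print(val, np.mean(s == val), (val - 1) * theta**2 * (1 - theta)**(val - 2))

# Given S = 5, X should be (approximately) uniform on {1, 2, 3, 4}
xs = x[s == 5]
print(np.bincount(xs, minlength=5)[1:] / xs.size)
```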

2. [In this question, you may use without proof the result that, for any non-negative integer k, $\int_0^\infty t^k e^{-t}\,dt = k!$.]

(i) The continuous random variable U has the gamma distribution with probability density function

$f(u) = \dfrac{\theta^n u^{n-1} e^{-\theta u}}{(n-1)!}$, $u > 0$,

where n is a positive integer and $\theta > 0$. Show that $E(U) = \dfrac{n}{\theta}$ and $\mathrm{Var}(U) = \dfrac{n}{\theta^2}$.

(ii) The continuous random variables X and Y have joint probability density function

$f(x, y) = \begin{cases} 4x\,e^{-(x+y)}, & 0 < x < y, \\ 0, & \text{otherwise.} \end{cases}$

(a) Derive the marginal probability density function of X. Use part (i) to deduce E(X) and Var(X).

(b) Derive the conditional probability density function of Y given X = x. Find $E(Y \mid X)$.

(c) Hence find E(Y). (3)
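
A quick numerical cross-check of part (ii), under the joint density as reconstructed above: the Python sketch below integrates the density over the region $0 < x < y$ to confirm that it integrates to 1 and to evaluate E(X), Var(X) and E(Y), which can be compared against the values obtained in parts (a) to (c).

```python
import numpy as np
from scipy import integrate

def pdf(x, y):
    """Joint density of Question 2(ii): 4 x exp(-(x + y)) on 0 < x < y."""
    return 4.0 * x * np.exp(-(x + y))

def expect(g):
    """E[g(X, Y)] by double integration; inner variable y runs from x to infinity."""
    val, _ = integrate.dblquad(lambda y, x: g(x, y) * pdf(x, y),
                               0.0, np.inf, lambda x: x, lambda x: np.inf)
    return val

print(expect(lambda x, y: 1.0))                  # total probability: should be 1
ex = expect(lambda x, y: x)
print(ex, expect(lambda x, y: x**2) - ex**2)     # E(X) and Var(X)
print(expect(lambda x, y: y))                    # E(Y)
```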

3. The continuous random variable U, which may take only non-negative values, has cumulative distribution function F(u) and probability density function f(u). The hazard function of U is defined to be

$h(u) = \dfrac{f(u)}{1 - F(u)}$, $u \geq 0$.

(i) The random variable Z has a Weibull distribution with probability density function given by

$f(z) = \begin{cases} \alpha\lambda z^{\alpha-1}\exp(-\lambda z^{\alpha}), & z > 0, \\ 0, & \text{otherwise,} \end{cases}$

where $\alpha > 0$ and $\lambda > 0$. Derive the hazard function of Z. Write down conditions on $\alpha$ that give

(a) a hazard function that is constant for all $z \geq 0$,

(b) a hazard function that decreases as z increases.

(ii) A system consists of two components, $C_1$ and $C_2$. Failures occur in the two components independently. Let the random variable $X_i$ (i = 1, 2) be the time to first failure of $C_i$, where $X_i$ has cumulative distribution function $F_i(x_i)$ and probability density function $f_i(x_i)$. Let Y be the time to first failure of the system.

$C_1$ and $C_2$ are connected in series. This means that the system fails as soon as either of the two components fails. Show that Y has cumulative distribution function

$G(y) = F_1(y) + F_2(y) - F_1(y)F_2(y)$, $y \geq 0$.

Deduce that the hazard function of Y is the sum of the hazard functions of $X_1$ and $X_2$. Hence or otherwise show that, if $X_i$ (i = 1, 2) follows a Weibull distribution with parameters $\lambda_i$ and $\alpha$, then Y also follows a Weibull distribution. (8)

(iii) The components $C_1$ and $C_2$ are now connected in parallel to create a new system. The time to first failure of this system is the maximum of the times to first failure of the two components. Explain why the time to first failure of this system, Y, has cumulative distribution function

$G(y) = F_1(y)F_2(y)$, $y \geq 0$.

In the case where the two components are identical, and therefore have identically distributed times to first failure, obtain an expression for the hazard function of the system in terms of the cumulative distribution function and probability density function of an individual component. Show that the hazard for the system, h(y), is no greater than the hazard for an individual component, for all values of y.
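
A brief Python check of parts (ii) and (iii), under the Weibull parametrisation used in part (i) above and with assumed illustrative parameter values ($\lambda_1 = 0.7$, $\lambda_2 = 1.3$, $\alpha = 2$): the series-system hazard computed from first principles should equal the sum of the component hazards and should again be of Weibull form, and the parallel-system hazard should never exceed that of a single component.

```python
import numpy as np

# Weibull parametrisation of part (i): f(z) = a*l*z^(a-1)*exp(-l*z^a), survival S(z) = exp(-l*z^a)
def w_pdf(z, l, a): return a * l * z**(a - 1) * np.exp(-l * z**a)
def w_sf(z, l, a):  return np.exp(-l * z**a)

z = np.linspace(0.2, 3.0, 8)
l1, l2, a = 0.7, 1.3, 2.0                      # illustrative values only

# Series system: survival = S1*S2 and density = f1*S2 + f2*S1 (from G = F1 + F2 - F1*F2)
sf_series  = w_sf(z, l1, a) * w_sf(z, l2, a)
pdf_series = w_pdf(z, l1, a) * w_sf(z, l2, a) + w_pdf(z, l2, a) * w_sf(z, l1, a)
h_series   = pdf_series / sf_series

h_sum = w_pdf(z, l1, a) / w_sf(z, l1, a) + w_pdf(z, l2, a) / w_sf(z, l2, a)
print(np.allclose(h_series, h_sum))                                       # hazards add
print(np.allclose(h_series, w_pdf(z, l1 + l2, a) / w_sf(z, l1 + l2, a)))  # Weibull again

# Parallel system of two identical components: G = F^2, hazard = 2*F*f / (1 - F^2)
F, f = 1 - w_sf(z, l1, a), w_pdf(z, l1, a)
print(np.all(2 * F * f / (1 - F**2) <= f / (1 - F)))                      # never exceeds component hazard
```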

4. (a) The continuous random variables $X_1$ and $X_2$ jointly have a bivariate Normal distribution. $X_1$ has expected value 50 and standard deviation 8. $X_2$ has expected value 45 and standard deviation 10. The correlation between $X_1$ and $X_2$ is 0.25.

(i) Obtain the expectation vector and covariance matrix of the random vector $X = (X_1, X_2)^T$.

(ii) In different physical units, the same quantities can be recorded as $Y_1 = 0.555(X_1 - 32)$ and $Y_2 = 0.447X_2$. Obtain the distribution of the random vector $Y = (Y_1, Y_2)^T$. (3)

(iii) Use this example to illustrate one reason why correlation is often preferred to covariance as a measure of association between two random variables. (2)

(b) In a longitudinal study of child growth, measurements $X_1$, $X_2$ and $X_3$ are made of a baby's length at ages 1 month, 2 months and 3 months respectively. These random variables are modelled by a multivariate Normal distribution with

$E(X) = \mu$,  $\mathrm{Cov}(X) = \begin{pmatrix} \sigma^2 & \rho\sigma^2 & \rho^2\sigma^2 \\ \rho\sigma^2 & \sigma^2 & \rho\sigma^2 \\ \rho^2\sigma^2 & \rho\sigma^2 & \sigma^2 \end{pmatrix}$,

where $\sigma^2 > 0$ and $0 < \rho < 1$.

(i) Write down the correlations between all possible pairs of these measurements. (2)

(ii) The random variable Y is the mean length of an individual child at these three ages. Specify the distribution of Y.

(iii) Measurements are to be made, independently, on a sample of n children. The random vector of measurements on the i-th child (i = 1, 2, ..., n) is $X_i$. State the distribution of the sample mean vector $\bar{X} = \dfrac{1}{n}\sum_{i=1}^{n} X_i$, giving its expectation vector and covariance matrix. (3)
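
For part (a), the required moments follow from $E(Y) = A\,E(X) + b$ and $\mathrm{Cov}(Y) = A\,\mathrm{Cov}(X)\,A^T$ for a linear transformation $Y = AX + b$. The Python sketch below carries out that arithmetic for the numbers given; it also shows the point behind part (iii), that the correlation (unlike the covariance) is unchanged by the change of units.

```python
import numpy as np

# Part (a): moments of X = (X1, X2)^T as stated in the question
mu_x = np.array([50.0, 45.0])
sd1, sd2, rho = 8.0, 10.0, 0.25
Sigma_x = np.array([[sd1**2,          rho * sd1 * sd2],
                    [rho * sd1 * sd2, sd2**2]])

# Y1 = 0.555 (X1 - 32), Y2 = 0.447 X2, i.e. Y = A X + b
A = np.array([[0.555, 0.0],
              [0.0,   0.447]])
b = np.array([-0.555 * 32.0, 0.0])

mu_y = A @ mu_x + b            # E(Y) = A E(X) + b
Sigma_y = A @ Sigma_x @ A.T    # Cov(Y) = A Cov(X) A^T
print(mu_y)
print(Sigma_y)

# Correlation of Y1 and Y2: still 0.25, although the covariance has changed scale
print(Sigma_y[0, 1] / np.sqrt(Sigma_y[0, 0] * Sigma_y[1, 1]))
```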

5. The independent random variables $X_1$, $X_2$ and $X_3$ each have the uniform distribution on the interval 0 to 1. Let W denote the median of $X_1$, $X_2$ and $X_3$, and V denote the maximum of $X_1$, $X_2$ and $X_3$.

(i) The joint probability density function of two order statistics, $X_{(i)}$ and $X_{(j)}$, of a sample of n independent and identically distributed random variables, each with probability density function f(x) and distribution function F(x), is given by

$g(x_{(i)}, x_{(j)}) = \dfrac{n!}{(i-1)!\,(j-i-1)!\,(n-j)!}\,[F(x_{(i)})]^{i-1}\,[F(x_{(j)}) - F(x_{(i)})]^{j-i-1}\,[1 - F(x_{(j)})]^{n-j}\,f(x_{(i)})\,f(x_{(j)})$.

Use this expression to show that the joint probability density function of W and V is

$g(w, v) = 6w$, $0 < w < v < 1$. (3)

(ii) Show that, for non-negative integers k and m,

$E(W^k V^m) = \dfrac{6}{(k+2)(k+m+3)}$.

Hence or otherwise, find E(W), Var(W), E(V), Var(V) and Cov(W, V). (12)

(iii) In a certain experiment, the same unknown quantity $\mu$ is to be measured three times. One researcher intends to estimate $\mu$ by the median of the three measurements, while another researcher prefers to estimate $\mu$ by the maximum of the three measurements. Assuming that the measurements are $X_1 + \mu - \tfrac{1}{2}$, $X_2 + \mu - \tfrac{1}{2}$ and $X_3 + \mu - \tfrac{1}{2}$, find the expected value and variance of the difference between these two estimators.
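
A Monte Carlo sketch in Python that cross-checks the moment formula of part (ii) for a few (k, m) pairs, together with the implied covariance of W and V; the sample size and seed are arbitrary choices.

```python
import numpy as np

rng = np.random.default_rng(1)
samples = np.sort(rng.random((500_000, 3)), axis=1)
w, v = samples[:, 1], samples[:, 2]          # W = median, V = maximum of three U(0,1) variables

# Compare Monte Carlo estimates of E(W^k V^m) with 6 / ((k+2)(k+m+3))
for k, m in [(1, 0), (0, 1), (2, 0), (0, 2), (1, 1)]:
    print((k, m), np.mean(w**k * v**m), 6.0 / ((k + 2) * (k + m + 3)))

# Sample covariance of W and V, for comparison with E(WV) - E(W)E(V)
print(np.cov(w, v)[0, 1])
```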

6. (i) The discrete random variable X has the binomial distribution

$P(X = x) = \dbinom{m}{x}\theta^x(1-\theta)^{m-x}$, $x = 0, 1, \ldots, m$,

where m is a positive integer and $0 < \theta < 1$. Show that X has moment generating function

$M_X(t) = \left(1 - \theta + \theta e^t\right)^m$.

(ii) Use this moment generating function to find the expected value and variance of X. (9)

(iii) $X_1, X_2, \ldots, X_n$ are independent random variables, each with the binomial distribution given in part (i) in the special case where m = 1. Use moment generating functions to prove that $S = X_1 + X_2 + \cdots + X_n$ is also a binomial random variable. State the Central Limit Theorem. Use it, along with the results from parts (i) and (ii), to prove that the binomial distribution with parameters n and $\theta$ can be approximated by a Normal distribution for large enough n. State clearly the parameters of this Normal distribution.
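
The Python sketch below (with assumed illustrative values n = 100, $\theta = 0.3$ and an arbitrary evaluation point t = 0.1) checks the moment generating function of part (i) by simulation, and compares binomial probabilities with a Normal approximation of the kind part (iii) is about; the continuity correction used here is an extra choice, not something the question requires.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(3)
n, theta = 100, 0.3                               # illustrative values only

# Check M_X(t) = (1 - theta + theta*e^t)^n by simulation at one point t
t = 0.1
x = rng.binomial(n, theta, size=200_000)
print(np.mean(np.exp(t * x)), (1 - theta + theta * np.exp(t))**n)

# Normal approximation with mean n*theta and variance n*theta*(1 - theta)
k = np.arange(15, 46)
mu, sd = n * theta, np.sqrt(n * theta * (1 - theta))
approx = stats.norm.cdf(k + 0.5, mu, sd) - stats.norm.cdf(k - 0.5, mu, sd)   # continuity correction
print(np.max(np.abs(stats.binom.pmf(k, n, theta) - approx)))                 # small for large n
```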

7. (a) The following values are a random sample from the uniform distribution on (0, 1).

0.0885    0.4096    0.7370    0.9384

Use these values to generate four random variates from each of the following distributions, carefully explaining the method you use in each case.

(i) Poisson: $P(X = x) = \dfrac{e^{-2.5}(2.5)^x}{x!}$, $x = 0, 1, 2, \ldots$.

(ii) Shifted exponential: $f(x) = 10e^{-10(x-3)}$, $x \geq 3$.

(b) Beetle is a party game played using a standard die. The player attempts to complete a rough sketch of a beetle, consisting of a body, a head, a tail, four legs, two antennae and two eyes [this is a special beetle for the game, with only four legs]. On each turn, the player rolls the die once and may draw one part of the beetle depending on the score on the die, as indicated in the following table.

    6 = body      3 = leg
    5 = head      2 = antenna
    4 = tail      1 = eye

If the player has already drawn all of a particular body part (for example, four legs), no body part is added on subsequent turns when the corresponding score (for example, 3) is obtained on the die. In addition, the body itself must be drawn before any other part may be drawn. The head, tail or legs may then be attached to the body, but antennae and eyes may only be added after the head. This means that a turn results in at most one body part being added to the sketch of the beetle, but often no body part is added.

Use the following sequence of random digits, which should be read from left to right across each row, to carry out one simulation of a player's attempt to draw a beetle. Explain your method carefully. State clearly the total number of random digits and the total number of simulated rolls of a die required to complete the beetle.

    5 2 1 0 9 0 0 9 6 8
    6 8 0 8 9 8 6 3 6 4
    2 8 8 4 5 4 1 9 3 9
    0 0 6 2 6 6 2 0 6 8
    7 0 5 4 9 9 6 7 5 6
    5 3 9 4 4 3 5 1 0 9

Describe how you would extend this simulation to find a plausible range of values for the number of rolls of the die that a player would require in order to complete a beetle. (4)
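
For part (a), one standard approach is the inversion method; the Python sketch below applies it to the four given uniform values, stepping through the Poisson cumulative probabilities for (i) and using the closed-form inverse cumulative distribution function of the shifted exponential for (ii). This is offered as one possible method, not the only acceptable one.

```python
import numpy as np

u = np.array([0.0885, 0.4096, 0.7370, 0.9384])     # the given U(0, 1) values

# (i) Poisson(2.5) by inversion: return the smallest x whose cumulative probability is >= u
def poisson_by_inversion(ui, lam=2.5):
    x = 0
    p = np.exp(-lam)          # P(X = 0)
    cdf = p
    while cdf < ui:
        x += 1
        p *= lam / x          # P(X = x) from P(X = x - 1)
        cdf += p
    return x

print([poisson_by_inversion(ui) for ui in u])

# (ii) Shifted exponential: F(x) = 1 - exp(-10(x - 3)) for x >= 3, so F^{-1}(u) = 3 - log(1 - u)/10
print(3 - np.log(1 - u) / 10)
```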

8. The continuous random variables X and Y have joint probability density function

$f(x, y) = \begin{cases} \dfrac{\Gamma(\alpha+\beta+\gamma)}{\Gamma(\alpha)\Gamma(\beta)\Gamma(\gamma)}\, x^{\alpha-1} y^{\beta-1} (1-x-y)^{\gamma-1}, & 0 < x < 1,\ 0 < y < 1,\ x + y < 1, \\ 0, & \text{otherwise,} \end{cases}$

where $\alpha > 0$, $\beta > 0$ and $\gamma > 0$ are parameters and $\Gamma(.)$ is the gamma function.

(i) Obtain the joint probability density function of $U = 1 - X$, $V = \dfrac{Y}{1 - X}$. (9)

(ii) Explain how you know that U and V are independent random variables. Obtain their marginal probability density functions and identify these marginal distributions. (8)

(iii) Deduce the marginal distributions of X, Y and $Z = 1 - X - Y$. (3)
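
A Monte Carlo cross-check in Python, with assumed illustrative parameter values $\alpha = 2$, $\beta = 3$, $\gamma = 1.5$: the joint density above is that of the first two coordinates of a Dirichlet($\alpha, \beta, \gamma$) vector, so one can sample (X, Y), form U and V, and compare them with the Beta distributions that the algebra of part (ii) leads to. The specific Beta parameters used below are therefore part of the answer being checked, not given in the question.

```python
import numpy as np
from scipy import stats

a, b, c = 2.0, 3.0, 1.5                     # alpha, beta, gamma: illustrative values only
rng = np.random.default_rng(4)

# (X, Y, 1 - X - Y) ~ Dirichlet(alpha, beta, gamma) has the joint density given above
xyz = rng.dirichlet([a, b, c], size=200_000)
x, y = xyz[:, 0], xyz[:, 1]

u = 1 - x
v = y / (1 - x)

print(np.corrcoef(u, v)[0, 1])                              # near 0, consistent with independence
print(stats.kstest(u, 'beta', args=(b + c, a)).statistic)   # U against Beta(beta + gamma, alpha): small
print(stats.kstest(v, 'beta', args=(b, c)).statistic)       # V against Beta(beta, gamma): small
```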
