Qualifying Exam in Probability and Statistics. https://www.soa.org/files/edu/edu-exam-p-sample-quest.pdf


Part 1: Sample Problems for the Elementary Section of Qualifying Exam in Probability and Statistics: https://www.soa.org/files/edu/edu-exam-p-sample-quest.pdf

Part 2: Sample Problems for the Advanced Section of Qualifying Exam in Probability and Statistics

Probability

1. The Pareto distribution, with parameters $\alpha$ and $\beta$, has pdf
$$f(x) = \frac{\beta \alpha^\beta}{x^{\beta+1}}, \quad \alpha < x < \infty, \ \alpha > 0, \ \beta > 0.$$
(a) Verify that $f(x)$ is a pdf.
(b) Derive the mean and variance of this distribution.
(c) Prove that the variance does not exist if $\beta \le 2$.

2. Let $U_i$, $i = 1, 2, \ldots$, be independent uniform$(0,1)$ random variables, and let $X$ have distribution
$$P(X = x) = \frac{c}{x!}, \quad x = 1, 2, 3, \ldots,$$
where $c = 1/(e-1)$. Find the distribution of $Z = \min\{U_1, \ldots, U_X\}$.

3. A point is generated at random in the plane according to the following polar scheme. A radius $R$ is chosen, where the distribution of $R^2$ is $\chi^2$ with 2 degrees of freedom. Independently, an angle $\theta$ is chosen, where $\theta \sim$ uniform$(0, 2\pi)$. Find the joint distribution of $X = R\cos\theta$ and $Y = R\sin\theta$.

4. Let $X$ and $Y$ be iid $N(0,1)$ random variables, and define $Z = \min(X, Y)$. Prove that $Z^2 \sim \chi^2_1$.

5. Suppose that $\mathcal{B}$ is a $\sigma$-field of subsets of $\Omega$ and suppose that $P: \mathcal{B} \to [0,1]$ is a set function satisfying:
(a) $P$ is finitely additive on $\mathcal{B}$;
(b) $0 \le P(A) \le 1$ for all $A \in \mathcal{B}$, and $P(\Omega) = 1$;
(c) if $A_i \in \mathcal{B}$ are disjoint and $\bigcup_{i=1}^\infty A_i = \Omega$, then $\sum_{i=1}^\infty P(A_i) = 1$.
Show that $P$ is a probability measure on $(\Omega, \mathcal{B})$.

6. Suppose that $\{X_n\}_{n=1}^\infty$ is a sequence of i.i.d. random variables and $\{c_n\}$ is an

increasing sequence of positive real numbers such that for all $\alpha < 1$ we have
$$\sum_{n=1}^\infty P[X_n > \alpha c_n] = \infty,$$
and for all $\alpha > 1$,
$$\sum_{n=1}^\infty P[X_n > \alpha c_n] < \infty.$$
Prove that
$$P\left[\limsup_{n\to\infty} \frac{X_n}{c_n} = 1\right] = 1.$$

7. Suppose for $n \ge 1$ that $X_n \in L_1$ are random variables such that $\sup_n E(X_n) < \infty$. Show that if $X_n \uparrow X$, then $X \in L_1$ and $E(X_n) \to E(X)$.

8. Let $X$ be a random variable with distribution function $F(x)$.
(a) Show that $\int_{\mathbb{R}} (F(x+a) - F(x))\,dx = a$.
(b) If $F$ is continuous, then $E[F(X)] = 1/2$.

9. (a) Suppose that $X_n \xrightarrow{P} X$ and $g$ is a continuous function. Prove that $g(X_n) \xrightarrow{P} g(X)$.
(b) If $X_n \xrightarrow{P} 0$, then for any $r > 0$,
$$\frac{|X_n|^r}{1 + |X_n|^r} \xrightarrow{P} 0 \quad\text{and}\quad E\left[\frac{|X_n|^r}{1 + |X_n|^r}\right] \to 0.$$

10. Suppose that $\{X_n, n \ge 1\}$ are independent non-negative random variables satisfying $E(X_n) = \mu_n$, $\mathrm{Var}(X_n) = \sigma_n^2$. Define for $n \ge 1$, $S_n = \sum_{i=1}^n X_i$, and suppose that $\sum_{n=1}^\infty \mu_n = \infty$ and $\sigma_n^2 \le c\mu_n$ for some $c > 0$ and all $n$. Show that $S_n / E(S_n) \xrightarrow{P} 1$.
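Problem 3 above is the classical polar (Box-Muller) construction: with $R^2 \sim \chi^2_2$ and $\theta \sim$ uniform$(0, 2\pi)$ independent, $X = R\cos\theta$ and $Y = R\sin\theta$ come out as independent $N(0,1)$ variables. The sketch below is only a Monte Carlo sanity check of that claim, not the requested derivation; it uses the fact that a $\chi^2_2$ variable equals $-2\log U$ for $U \sim$ uniform$(0,1)$.

```python
import math
import random

random.seed(0)
N = 200_000

xs, ys = [], []
for _ in range(N):
    # R^2 ~ chi-square with 2 df, i.e. Exponential(mean 2): R^2 = -2 log U
    u = 1.0 - random.random()          # in (0, 1], avoids log(0)
    r = math.sqrt(-2.0 * math.log(u))
    theta = random.uniform(0.0, 2.0 * math.pi)
    xs.append(r * math.cos(theta))
    ys.append(r * math.sin(theta))

mean_x = sum(xs) / N
mean_y = sum(ys) / N
var_x = sum(v * v for v in xs) / N - mean_x ** 2
cov_xy = sum(a * b for a, b in zip(xs, ys)) / N - mean_x * mean_y

# Empirically: mean ~ 0, variance ~ 1, covariance ~ 0, as the joint
# density factors into two standard normal marginals.
print(mean_x, var_x, cov_xy)
```

The same change of variables run in reverse is how normal random number generators are often implemented.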

11. (a) If $X_n \to X$ and $Y_n \to Y$ in probability, then $X_n + Y_n \to X + Y$ in probability.
(b) Let $\{X_i\}$ be iid, $E(X_i) = \mu$ and $\mathrm{Var}(X_i) = \sigma^2$. Set $\bar{X}_n = \frac{1}{n}\sum_{i=1}^n X_i$. Show that
$$\frac{1}{n}\sum_{i=1}^n (X_i - \bar{X}_n)^2 \to \sigma^2$$
in probability.

12. Suppose that the sequence $\{X_n\}$ is fundamental in probability, in the sense that for every $\varepsilon > 0$ there exists an $N_\varepsilon$ such that $P[|X_n - X_m| > \varepsilon] < \varepsilon$ for $m, n > N_\varepsilon$.
(a) Prove that there is a subsequence $\{X_{n_k}\}$ and a random variable $X$ such that $\lim_k X_{n_k} = X$ with probability 1 (i.e. almost surely).
(b) Show that $f(X_n) \to f(X)$ in probability if $f$ is a continuous function.

Statistics

1. Suppose that $X = (X_1, \ldots, X_n)$ is a sample from the probability distribution $P_\theta$ with density
$$f(x \mid \theta) = \begin{cases} \theta (1+x)^{-(1+\theta)}, & \text{if } x > 0, \\ 0, & \text{otherwise}, \end{cases}$$
for some $\theta > 0$.
(a) Is $\{f(x \mid \theta), \theta > 0\}$ a one-parameter exponential family? (Explain your answer.)
(b) Find a sufficient statistic $T(X)$ for $\theta > 0$.

2. Suppose that $X_1, \ldots, X_n$ is a sample from a population with density
$$p(x, \theta) = \theta a x^{a-1} \exp(-\theta x^a), \quad x > 0, \ \theta > 0, \ a > 0.$$
(a) Find a sufficient statistic for $\theta$ with $a$ fixed.
(b) Is the sufficient statistic in part (a) minimal sufficient? Give reasons for your answer.

3. Let $X_1, \ldots, X_n$ be a random sample from a gamma$(\alpha, \beta)$ population.
(a) Find a two-dimensional sufficient statistic for $(\alpha, \beta)$.
(b) Is the sufficient statistic in part (a) minimal sufficient? Explain your answer.
(c) Find the moment estimator of $(\alpha, \beta)$.
(d) Let $\alpha$ be known. Find the best unbiased estimator of $\beta$.
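For part (c) of the gamma problem, one standard answer (under the parameterization with mean $\alpha\beta$ and variance $\alpha\beta^2$) equates the first two sample moments to the population moments, giving $\hat{\alpha} = \bar{X}^2/\hat{\sigma}^2$ and $\hat{\beta} = \hat{\sigma}^2/\bar{X}$. The sketch below is only a numerical illustration of those moment estimators under that assumed parameterization:

```python
import random

random.seed(1)
alpha, beta = 3.0, 2.0        # true shape and scale; mean = 6, variance = 12
n = 100_000
sample = [random.gammavariate(alpha, beta) for _ in range(n)]

xbar = sum(sample) / n                              # first sample moment
s2 = sum((x - xbar) ** 2 for x in sample) / n       # second central moment

# Method-of-moments estimators: solve xbar = a*b, s2 = a*b^2 for (a, b)
alpha_hat = xbar ** 2 / s2
beta_hat = s2 / xbar

print(alpha_hat, beta_hat)   # should be near (3, 2) for this sample size
```

With $n = 10^5$ draws the estimates land within a few percent of the true $(\alpha, \beta)$, which is what the method-of-moments consistency argument predicts.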

4. Let $X_1, \ldots, X_n$ be iid Bernoulli random variables with parameter $\theta$ (the probability of success for each Bernoulli trial), $0 < \theta < 1$. Show that $T(X) = \sum_{i=1}^n X_i$ is minimal sufficient.

5. Suppose that the random variables $Y_1, \ldots, Y_n$ satisfy $Y_i = \beta x_i + \varepsilon_i$, $i = 1, \ldots, n$, where $x_1, \ldots, x_n$ are fixed constants, and $\varepsilon_1, \ldots, \varepsilon_n$ are iid $N(0, \sigma^2)$ with $\sigma^2$ unknown.
(a) Find a two-dimensional sufficient statistic for $(\beta, \sigma^2)$.
(b) Find the MLE of $\beta$, and show that it is an unbiased estimator of $\beta$.
(c) Show that $\left[\sum_{i=1}^n (Y_i / x_i)\right]/n$ is also an unbiased estimator of $\beta$.

6. Let $X_1, \ldots, X_n$ be iid $N(\theta, \theta^2)$, $\theta > 0$. For this model both $\bar{X}$ and $cS$ are unbiased estimators of $\theta$, where
$$c = \frac{\sqrt{n-1}\,\Gamma((n-1)/2)}{\sqrt{2}\,\Gamma(n/2)}.$$
(a) Prove that for any number $a$ the estimator $a\bar{X} + (1-a)(cS)$ is an unbiased estimator of $\theta$.
(b) Find the value of $a$ that produces the estimator with minimum variance.
(c) Show that $(\bar{X}, S^2)$ is a sufficient statistic for $\theta$ but not a complete sufficient statistic.

7. Let $X_1, \ldots, X_n$ be i.i.d. with pdf
$$f(x \mid \theta) = \frac{2x}{\theta} \exp\left\{-\frac{x^2}{\theta}\right\}, \quad x > 0, \ \theta > 0.$$
(a) Find the Fisher information
$$I(\theta) = E_\theta\left[\left(\frac{\partial}{\partial\theta} \log f(\mathbf{X} \mid \theta)\right)^2\right],$$
where $f(\mathbf{X} \mid \theta)$ is the joint pdf of $\mathbf{X} = (X_1, \ldots, X_n)$.
(b) Show that $\frac{1}{n}\sum_{i=1}^n X_i^2$ is a UMVUE of $\theta$.

8. Let $X_1, \ldots, X_n$ be a random sample from a $N(\theta, \sigma^2)$ population, $\sigma^2$ known. Consider estimating $\theta$ using squared error loss. Let $\pi(\theta)$ be a $N(\mu, \tau^2)$ prior distribution on $\theta$ and let $\delta^\pi$ be the Bayes estimator of $\theta$. Verify the following formulas for the risk

function, Bayes estimator and Bayes risk.
(a) For any constants $a$ and $b$, the estimator $\delta(X) = a\bar{X} + b$ has risk function
$$R(\theta, \delta) = a^2 \frac{\sigma^2}{n} + (b - (1-a)\theta)^2.$$
(b) Show that the Bayes estimator of $\theta$ is given by
$$\delta^\pi(X) = E(\theta \mid X) = \frac{\tau^2}{\tau^2 + \sigma^2/n}\,\bar{X} + \frac{\sigma^2/n}{\tau^2 + \sigma^2/n}\,\mu.$$
(c) Let $\eta = \sigma^2/(n\tau^2 + \sigma^2)$. The risk function for the Bayes estimator is
$$R(\theta, \delta^\pi) = (1-\eta)^2 \frac{\sigma^2}{n} + \eta^2 (\theta - \mu)^2.$$
(d) The Bayes risk for the Bayes estimator is $B(\pi, \delta^\pi) = \tau^2 \eta$.

9. Suppose that $X = (X_1, \ldots, X_n)$ is a sample from the normal distribution $N(\mu, \sigma^2)$ with $\mu = \mu_0$ known.
(a) Show that $\hat{\sigma}_0^2 = \frac{1}{n}\sum_{i=1}^n (X_i - \mu_0)^2$ is a uniformly minimum variance unbiased estimate (UMVUE) of $\sigma^2$.
(b) Show that $\hat{\sigma}_0^2$ converges to $\sigma^2$ in probability as $n \to \infty$.
(c) If $\mu_0$ is not known and the true distribution of $X_i$ is $N(\mu, \sigma^2)$ with $\mu \ne \mu_0$, find the bias of $\hat{\sigma}_0^2$.

10. Let $X_1, \ldots, X_n$ be i.i.d. as $X = (Z, Y)^T$, where $Y = Z + \lambda W$, $\lambda > 0$, and $Z$ and $W$ are independent $N(0,1)$.
(a) Find the conditional density of $Y$ given $Z = z$.
(b) Find the best predictor of $Y$ given $Z$ and calculate its mean squared prediction error (MSPE).
(c) Find the maximum likelihood estimate (MLE) of $\lambda$.
(d) Find the mean and variance of the MLE.

11. Let $X_1, \ldots, X_n$ be a sample from a distribution with density
$$p(x, \theta) = \theta x^{\theta - 1}\,\mathbf{1}\{x \in (0,1)\}, \quad \theta > 0.$$
(a) Find the most powerful (MP) test for testing $H: \theta = 1$ versus $K: \theta = 2$ with $\alpha = 0.01$ when $n = 1$.

(b) Find the MP test for testing $H: \theta = 1$ versus $K: \theta = 2$ with $\alpha = 0.05$ when $n \ge 2$.

12. Let $X_1, \ldots, X_n$ be a random sample from a $N(\mu_1, \sigma_1^2)$ distribution, and let $Y_1, \ldots, Y_m$ be an independent random sample from a $N(\mu_2, \sigma_2^2)$ distribution. We would like to test $H: \mu_1 = \mu_2$ versus $K: \mu_1 \ne \mu_2$ under the assumption that $\sigma_1^2 = \sigma_2^2$.
(a) Derive the likelihood ratio test (LRT) for these hypotheses. Show that the LRT can be based on the statistic
$$T = \frac{\bar{X} - \bar{Y}}{\sqrt{S_p^2\left(\frac{1}{n} + \frac{1}{m}\right)}},$$
where
$$S_p^2 = \frac{1}{n+m-2}\left(\sum_{i=1}^n (X_i - \bar{X})^2 + \sum_{j=1}^m (Y_j - \bar{Y})^2\right).$$
(b) Show that, under $H$, $T$ has a $t_{n+m-2}$ distribution.
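The pooled two-sample statistic in Problem 12 of the Statistics section is easy to sanity-check numerically. The sketch below (an illustration on made-up data, not the requested LRT derivation) computes $S_p^2$ and $T$ directly from the formulas above:

```python
import math

# Two small, hypothetical samples (illustrative values only)
x = [5.1, 4.9, 6.2, 5.5, 5.8, 5.3]
y = [4.8, 5.0, 5.4, 4.7, 5.2]
n, m = len(x), len(y)

xbar = sum(x) / n
ybar = sum(y) / m

# Pooled variance: S_p^2 = (SS_x + SS_y) / (n + m - 2)
ssx = sum((v - xbar) ** 2 for v in x)
ssy = sum((v - ybar) ** 2 for v in y)
sp2 = (ssx + ssy) / (n + m - 2)

# T = (xbar - ybar) / sqrt(S_p^2 (1/n + 1/m)); t with n+m-2 df under H
t_stat = (xbar - ybar) / math.sqrt(sp2 * (1.0 / n + 1.0 / m))
print(t_stat)
```

Under $H$ the statistic would be compared against $t_{n+m-2}$ critical values; the equal-variance assumption is what makes the pooled estimator $S_p^2$ the right denominator.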

Part 3: Required Proofs for Probability & Stats Qualifying Exam

1. Chebyshev's inequality: statement and proof.
2. Weak law of large numbers: statement and proof.
3. Borel-Cantelli lemma: statements and proofs (for both cases, $\sum P(A_n) < \infty$ and $\sum P(A_n) = \infty$).
4. Hölder's inequality: statement and proof.
5. Cauchy-Schwarz inequality: statement and proof.
6. Prove: if $X_i \to X_0$ weakly and $X_0$ is a constant, then $X_i \to X_0$ in probability.
7. Prove: if $X_i \to X_0$ a.s., then $X_i \to X_0$ in probability.
8. Prove that $X_i \to X_0$ in probability does not imply $X_i \to X_0$ a.s.
9. Prove: if $X_i \to X_0$ in probability, then $X_i \to X_0$ weakly.
10. Prove that $X_i \to X_0$ weakly does not imply $X_i \to X_0$ in probability.
11. Prove: if $X_i \to X_0$ in $L_1$, then $X_i \to X_0$ in probability.
12. Prove: if $X$ and $Y$ are independent and jointly normal random variables with means $\mu_1$ and $\mu_2$ and standard deviations $\sigma_1$ and $\sigma_2$ respectively, then $Z = X + Y$ is also normally distributed, with mean $\mu = \mu_1 + \mu_2$ and standard deviation $\sigma = \sqrt{\sigma_1^2 + \sigma_2^2}$.
13. Prove: if $X$ and $Y$ are independent Poisson random variables with means $\mu_1$ and $\mu_2$, then $Z = X + Y$ is also a Poisson random variable.
14. Continuous mapping theorem: statement and proof.
15. Slutsky's theorem: statement and proof.
16. Prove: if $X_n \to X_0$ weakly and the cumulative distribution function $F(t) = P(X_0 \le t)$ is continuous, then
$$\lim_{n\to\infty} \sup_t |P(X_n \le t) - F(t)| = 0.$$
17. Consider the linear regression model $Y = X\beta + e$, where $Y$ is an $n \times 1$ vector of observations, $X$ is the $n \times p$ design matrix of the levels of the regression variables, $\beta$ is a $p \times 1$ vector of regression coefficients, and $e$ is an $n \times 1$ vector of random errors. Prove that the least squares estimator of $\beta$ is $\hat{\beta} = (X'X)^{-1}X'Y$.
18. Prove that if $X$ follows an $F$ distribution $F(n_1, n_2)$, then $1/X$ follows $F(n_2, n_1)$.

19. Let $X_1, \ldots, X_n$ be a random sample of size $n$ from a normal distribution $N(\mu, \sigma^2)$. We would like to test the hypothesis $H_0: \mu = \mu_0$ versus $H_1: \mu \ne \mu_0$. When $\sigma$ is known, show that the power function of the test with type I error $\alpha$, under true population mean $\mu = \mu_1$, is
$$\Phi\left(z_{\alpha/2} + \frac{(\mu_1 - \mu_0)\sqrt{n}}{\sigma}\right),$$
where $\Phi(\cdot)$ is the cumulative distribution function of a standard normal distribution and $\Phi(z_{\alpha/2}) = \alpha/2$.

20. Let $X_1, \ldots, X_n$ be a random sample of size $n$ from a normal distribution $N(\mu, \sigma^2)$. Prove that
(a) the sample mean $\bar{X}$ and the sample variance $S^2$ are independent;
(b) $(n-1)S^2/\sigma^2$ follows a chi-squared distribution $\chi^2(n-1)$.
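The closed form in item 17, $\hat{\beta} = (X'X)^{-1}X'Y$, can be checked on a tiny example by solving the normal equations $X'X\hat{\beta} = X'Y$ directly. The sketch below does this for a hypothetical two-parameter model (intercept plus slope), inverting the $2 \times 2$ matrix by hand so no linear-algebra library is needed; it is an illustration, not the requested proof.

```python
# Tiny least-squares check of beta_hat = (X'X)^{-1} X'Y with p = 2
xs = [1.0, 2.0, 3.0, 4.0]            # regressor values (made up)
X = [[1.0, x] for x in xs]           # n x 2 design matrix with intercept
Y = [2.1, 4.0, 6.1, 7.9]             # observations, roughly 2*x

# Normal-equation pieces: X'X (2x2) and X'Y (2x1)
xtx = [[sum(r[i] * r[j] for r in X) for j in range(2)] for i in range(2)]
xty = [sum(r[i] * y for r, y in zip(X, Y)) for i in range(2)]

# Invert the 2x2 matrix and multiply: beta_hat = (X'X)^{-1} X'Y
det = xtx[0][0] * xtx[1][1] - xtx[0][1] * xtx[1][0]
b0 = (xtx[1][1] * xty[0] - xtx[0][1] * xty[1]) / det
b1 = (xtx[0][0] * xty[1] - xtx[1][0] * xty[0]) / det
print(b0, b1)   # fitted intercept and slope
```

For these numbers the normal equations give $\hat{\beta} = (0.15, 1.95)$; any residual vector $Y - X\hat{\beta}$ produced this way is orthogonal to the columns of $X$, which is exactly the first-order condition behind the proof.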

Qualifying Exam on Probability and Statistics
Spring, January 9, 2016

Instructions: You have 3 hours to complete the exam. You are required to show all work for all problems. There are three parts to the exam; please budget your time wisely for all three parts. There are 10 problems in the Elementary part, 3 problems in the Challenging part, and 3 proof problems. The suggested passing grades for the three parts are: Elementary part 80%, Challenging part 50%, and Proofs 80%.

Elementary part

(1). The number of injury claims per month is modeled by a random variable $N$ with
$$P(N = n) = \frac{1}{(n+1)(n+2)}$$
for non-negative integers $n$. Calculate the probability of at least one claim during a particular month, given that there have been at most four claims during that month.

(2). Let $X$ be a continuous random variable with density function $f(x) = |x|/10$ for $x \in [-2, 4]$ and $f(x) = 0$ otherwise. Calculate $E(X)$.

(3). A device that continuously measures and records seismic activity is placed in a remote region. The time to failure of this device, $T$, is exponentially distributed with mean 3 years. Since the device will not be monitored during its first two years of service, the time to discovery of its failure is $X = \max(T, 2)$. Calculate $E(X)$.

(4). The time until failure, $T$, of a product is modeled by a uniform distribution on $[0, 10]$. An extended warranty pays a benefit of 100 if failure occurs between $t = 1.5$ and $t = 8$. The present value, $W$, of this benefit is $W = 100e^{-0.04T}$ for $T \in [1.5, 8]$ and zero otherwise. Calculate $P(W < 79)$.

(5). On any given day, a certain machine has either no malfunctions or exactly one malfunction. The probability of a malfunction on any given day is 0.4. Machine malfunctions on different days are mutually independent. Calculate the probability that the machine has its third malfunction on the fifth day, given that the machine has not had three malfunctions in the first three days.

(6). Two fair dice are rolled. Let $X$ be the absolute value of the difference between the two numbers on the dice. Calculate $P(X < 3)$.

(7). A driver and a passenger are in a car accident. Each of them independently has probability 0.3 of being hospitalized. When a hospitalization occurs, the loss is uniformly distributed on $[0, 1]$. When two hospitalizations occur, the losses are independent. Calculate the expected number of people in the car who are hospitalized, given that the total loss due to hospitalization is less than 1.

(8). Let $X$ and $Y$ be independent and identically distributed random variables such that the moment generating function of $X + Y$ is
$$M(t) = 0.09e^{-2t} + 0.24e^{-t} + 0.34 + 0.24e^{t} + 0.09e^{2t}, \quad t \in (-\infty, \infty).$$
Calculate $P(X \le 0)$.

(9). The number of workplace injuries, $N$, occurring in a factory on any given day is Poisson distributed with mean $\lambda$. The parameter $\lambda$ is itself a random variable that is determined by the level of activity in the factory and is uniformly distributed on the interval $[0, 3]$. Calculate $\mathrm{Var}(N)$.

(10). Let $X$ and $Y$ be continuous random variables with joint density function
$$f(x, y) = \begin{cases} 24xy, & \text{for } 0 < y < 1 - x, \ x \in (0, 1); \\ 0, & \text{otherwise}. \end{cases}$$
Calculate $P(Y < X \mid X = 1/3)$.

Challenging Part

(1). Let $Y$ be a non-negative random variable. Show that
$$EY \le \sum_{k=0}^\infty P(Y > k) \le EY + 1.$$

(2). Let $X_n$ be a sequence of random variables such that $\sqrt{n}(X_n - \mu) \to N(0, \sigma^2)$ in distribution. For a given function $g$ and a specific $\mu$, suppose that $g'(\mu)$ exists and $g'(\mu) \ne 0$. Then prove that
$$\sqrt{n}(g(X_n) - g(\mu)) \to N(0, \sigma^2 [g'(\mu)]^2)$$
in distribution.

(3). Let $\{X_n\}$ be a sequence of random variables with $E(X_n) = 0$, $\mathrm{Var}(X_n) \le C$ ($C$ a constant), $E(X_i X_j) \le \rho(i - j)$ for any $i > j$, and $\rho(n) \to 0$ as $n \to \infty$. Show that
$$\frac{1}{n}\sum_{i=1}^n X_i \to 0 \quad \text{in probability}.$$

Proofs

(1). Let $\{X_n\}$ be a sequence of independent and identically distributed random variables with $E|X_n| < \infty$. Prove or disprove the following statement:
$$\frac{1}{n}\sum_{k=1}^n X_k \to EX_1 \quad \text{in probability as } n \to \infty.$$

(2). Let $X_n: \Omega \to \mathbb{R}^d$ be such that $X_n$ converges weakly (in distribution) to a random vector $Z$. Let $F: \mathbb{R}^d \to \mathbb{R}$ be a continuous function and let $Y_n = F(X_n)$. Prove or disprove the following statement: $Y_n \to F(Z)$ weakly (in distribution) as $n \to \infty$.

(3). Consider the linear regression model $Y = X\beta + e$, where $Y$ is an $n \times 1$ vector of observations, $X$ is the $n \times p$ design matrix of the levels of the regression variables, $\beta$ is a $p \times 1$ vector of regression coefficients, and $e$ is an $n \times 1$ vector of random errors. Show that the least squares estimator of $\beta$ is $\hat{\beta} = (X'X)^{-1}X'Y$.
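Challenging problem (1) is the tail-sum bound $EY \le \sum_{k\ge 0} P(Y > k) \le EY + 1$. For an Exponential($\lambda$) variable both sides have closed forms ($EY = 1/\lambda$ and, since the tail sum is geometric, $\sum_{k\ge 0} e^{-\lambda k} = 1/(1 - e^{-\lambda})$), so the inequality can be verified numerically; the sketch below is only such an illustration, not a proof.

```python
import math

# For Y ~ Exponential(lam): P(Y > k) = exp(-lam * k) and EY = 1 / lam.
# Tail sum over integers k >= 0 is geometric: 1 / (1 - exp(-lam)).
for lam in [0.25, 0.5, 1.0, 2.0, 5.0]:
    ey = 1.0 / lam
    tail_sum = 1.0 / (1.0 - math.exp(-lam))
    # Sandwich EY <= sum_{k>=0} P(Y > k) <= EY + 1 holds for every lam
    assert ey <= tail_sum <= ey + 1.0, (lam, ey, tail_sum)
    print(lam, ey, tail_sum)
```

The same sandwich is exactly what the usual proof gives by integrating the tail function $P(Y > t)$ against the integer partition of $[0, \infty)$; for integer-valued $Y$ the lower bound is an equality.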