Math 362, Problem set 1


1. (4..8) Determine the mean and variance of the mean X̄ of a random sample of size 9 from a distribution having pdf f(x) = 4x³, 0 < x < 1, zero elsewhere.

We have that

E[X̄] = E[(1/n) Σᵢ Xᵢ] = (1/n)·n·E[Xᵢ] = E[Xᵢ],    var(X̄) = var(Σᵢ Xᵢ)/n² = var(Xᵢ)/n.

Using the pdf,

E[X] = ∫₀¹ 4x⁴ dx = 4/5,    E[X²] = ∫₀¹ 4x⁵ dx = 2/3,

so

var(Xᵢ) = 2/3 − (4/5)² = 2/75,

and therefore E[X̄] = 4/5 and var(X̄) = 2/675.

2. (4..5) Let S² = (n − 1)⁻¹ Σᵢ (Xᵢ − X̄)² denote the sample variance of a random sample from a distribution with variance σ² > 0. Since E[S²] = σ², why isn't E[S] = σ? Note: There is a hint in the book that gives it away, but maybe think about it for a second before looking there.

This follows from the fact that f(x) = x² is strictly convex (that is, f''(x) = 2 > 0): from Jensen's inequality,

σ² = E[S²] > (E[S])²,

so E[S] < σ. Note: some of you also (cleverly) noticed this follows from the fact that var(S) > 0, since then (E[S])² = E[S²] − var(S) < σ².
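If you want to double-check both answers numerically, here is a quick simulation sketch (assuming NumPy is available; the seed and replication count are arbitrary). It samples from f(x) = 4x³ by the inverse transform (F(x) = x⁴, so X = U^{1/4}), compares the empirical mean and variance of X̄ with 4/5 and 2/675, and shows E[S] landing strictly below σ even though E[S²] ≈ σ².

import numpy as np

rng = np.random.default_rng(0)
n, reps = 9, 200_000

# Inverse-transform sampling: F(x) = x^4 on (0, 1), so X = U**(1/4).
x = rng.random((reps, n)) ** 0.25
xbar = x.mean(axis=1)
print(xbar.mean(), 4 / 5)            # E[X-bar] = 4/5
print(xbar.var(), 2 / 675)           # var(X-bar) = 2/675, about 0.00296

# Jensen's inequality in action: E[S^2] = sigma^2 but E[S] < sigma.
s = x.std(axis=1, ddof=1)            # sample standard deviation S
sigma = np.sqrt(2 / 75)              # true sd of the parent distribution
print((s ** 2).mean(), sigma ** 2)   # close to each other
print(s.mean(), sigma)               # first value strictly smaller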

3. (5..4) Let X₁, ..., Xₙ be a random sample from the Γ(2, θ) distribution, where θ is unknown. (Recall: look back at the earlier chapter if you forget the specifics of the Γ distribution.) Let Y = Σᵢ₌₁ⁿ Xᵢ.

(a) Find the distribution of Y and determine c so that cY is an unbiased estimator of θ.

(b) If n = 5, show that P(9.59 < 2Y/θ < 34.2) = 0.95.

(c) Using (b), show that if y is the value of Y once the sample is drawn, then the interval (2y/34.2, 2y/9.59) is a 95% confidence interval for θ.

(d) Suppose the sample results in the values

44.8079    1.5215    12.1929    12.5734    43.2305

Based on these data, obtain the point estimate of θ as described in part (a) and the computed 95% confidence interval in part (c). What does the confidence interval mean?

For (a), Y ~ Γ(2n, θ) by the additive property of the Gamma distribution. Thus E[Y] = 2nθ, and we should take c = 1/(2n).

For (b), we will simply set up the integral (with n = 5, Y ~ Γ(10, θ)):

P(9.59 < 2Y/θ < 34.2) = P(4.795θ < Y < 17.1θ)
  = ∫_{4.795θ}^{17.1θ} (1/(9! θ¹⁰)) x⁹ e^{−x/θ} dx
  = ∫_{4.795}^{17.1} (u⁹/9!) e^{−u} du ≈ 0.95,

substituting u = x/θ. Note: The important part here is that this does not depend on θ.

For (c), note that

P(9.59 < 2Y/θ < 34.2) = P(2Y/34.2 < θ < 2Y/9.59) = 0.95,

so the given interval is a 95% confidence interval.

For (d), y = Σ xᵢ = 114.33 (so ȳ = 22.865), giving the point estimate θ̂ = y/(2n) = 11.43 and the computed 95% confidence interval (2y/34.2, 2y/9.59) ≈ (6.69, 23.84). The interval means that it was produced by a procedure that captures the true θ in 95% of samples: before the data are drawn, the random interval (2Y/34.2, 2Y/9.59) contains θ with probability 0.95.
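The probability in part (b) and the numbers in part (d) are easy to verify with SciPy (a sketch, assuming scipy.stats is available; the data list is the sample given above). Since Y ~ Γ(2n, θ), the quantity 2Y/θ has a chi-square distribution with 4n = 20 degrees of freedom, so chi2.ppf recovers the cutoffs 9.59 and 34.2.

from scipy.stats import chi2

df = 20                                           # 2Y/theta ~ chi-square(4n), n = 5
print(chi2.ppf(0.025, df), chi2.ppf(0.975, df))   # about 9.59 and 34.17
print(chi2.cdf(34.2, df) - chi2.cdf(9.59, df))    # about 0.95

data = [44.8079, 1.5215, 12.1929, 12.5734, 43.2305]
y = sum(data)
print(y / 10)                                     # point estimate y/(2n), about 11.43
print(2 * y / 34.2, 2 * y / 9.59)                 # 95% CI, about (6.69, 23.84)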

4. (5..5) Suppose the number of customers X that enter a store between the hours of 9AM and 10AM follows a Poisson distribution with parameter θ. Suppose a random sample of the number of customers for 10 days results in the values

9    7    9    15    10    13    11    7    2    12

Based on these data, obtain an unbiased point estimate of θ. Explain the meaning of this estimate in terms of the number of customers.

Since E[X] = θ for a Poisson(θ) random variable, the sample mean X̄ is unbiased, and we have a point estimate of 9.5 customers per day. (Of course it is impossible to actually observe 9.5 customers on any given day, but it is a perfectly good estimate of the average; it would even be perfectly fine if it were the true mean of the distribution.)

5. (5..) Obtain the probability that an observation is a potential outlier for the following distributions.

(a) The underlying distribution is normal.

(b) The underlying distribution is logistic; in other words, it has pdf

f(x) = e^{−x} / (1 + e^{−x})²,    −∞ < x < ∞.

(c) The underlying distribution is Laplace, with pdf

f(x) = (1/2) e^{−|x|},    −∞ < x < ∞.

Recall that an observation is a potential outlier if it falls outside the fences LF = Q₁ − 1.5h and UF = Q₃ + 1.5h, where h = Q₃ − Q₁ is the interquartile range.

For (a), we find (from the back of the book) that Q₁ = −0.675 and Q₃ = 0.675. Thus h = 1.35, UF = 2.7 and LF = −2.7. We have P(Z > UF) = 0.00347, so P(LF < Z < UF) = 0.99307, and the probability of a potential outlier is 2(0.00347) ≈ 0.007.

For (b), we have that

∫_{−∞}^{Q₁} e^{−x} / (1 + e^{−x})² dx = 1/4,

and similarly for Q₃. We thus get Q₁ = −ln(3) and Q₃ = ln(3). Thus h = 2 ln(3), UF = 4 ln(3) and LF = −4 ln(3). We then have

P(LF < X < UF) = ∫_{−4 ln 3}^{4 ln 3} e^{−x} / (1 + e^{−x})² dx = 40/41,

so the probability of a potential outlier is 1/41 ≈ 0.0244.

For (c), we have that ∫_{Q₃}^{∞} (1/2) e^{−x} dx = 1/4. Solving, we find Q₃ = ln(2) and likewise Q₁ = −ln(2). Thus h = 2 ln(2), and we have UF = 4 ln(2) and LF = −4 ln(2). Since

∫_{−4 ln 2}^{4 ln 2} (1/2) e^{−|x|} dx = 15/16,

the probability of a potential outlier is 1/16 = 0.0625.
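Both calculations are quick to reproduce; here is a short script (assuming SciPy is available, whose standard norm, logistic and laplace objects use exactly the pdfs written above, and reusing the customer counts listed in problem 4).

from scipy import stats

# Problem 4: the unbiased estimate is just the sample mean.
counts = [9, 7, 9, 15, 10, 13, 11, 7, 2, 12]
print(sum(counts) / len(counts))                 # 9.5

# Problem 5: probability of falling outside the fences.
for name, dist in [("normal", stats.norm),
                   ("logistic", stats.logistic),
                   ("Laplace", stats.laplace)]:
    q1, q3 = dist.ppf(0.25), dist.ppf(0.75)
    h = q3 - q1
    lf, uf = q1 - 1.5 * h, q3 + 1.5 * h
    print(name, dist.cdf(lf) + dist.sf(uf))
# about 0.0070, 0.0244 (= 1/41) and 0.0625 (= 1/16)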

6. (5..5) Let Y₁ < Y₂ < Y₃ < Y₄ be the order statistics of a random sample of size 4 from the distribution having pdf f(x) = e^{−x}, 0 < x < ∞, zero elsewhere. Find P(Y₄ ≥ 3).

We have that F(x) = ∫₀ˣ e^{−t} dt = 1 − e^{−x}. Using the formula we derived in class, the pdf of the maximum Y₄ is 4[F(t)]³ f(t), so

P(Y₄ ≥ 3) = ∫₃^∞ 4(1 − e^{−t})³ e^{−t} dt = 1 − (1 − e^{−3})⁴.
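As a quick sanity check (a sketch, assuming NumPy is available; the seed and replication count are arbitrary), simulating the maximum of four Exponential(1) draws agrees with 1 − (1 − e^{−3})⁴ ≈ 0.185.

import numpy as np

rng = np.random.default_rng(1)
# Maximum of 4 iid Exponential(1) draws, compared with the exact answer.
y4 = rng.exponential(size=(200_000, 4)).max(axis=1)
print((y4 >= 3).mean(), 1 - (1 - np.exp(-3)) ** 4)   # both about 0.185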

7. (5..) Let Y₁ < Y₂ < Y₃ be the order statistics of a random sample of size 3 from a distribution having the pdf f(x) = 2x, 0 < x < 1, zero elsewhere. Show that Z₁ = Y₁/Y₂, Z₂ = Y₂/Y₃ and Z₃ = Y₃ are mutually independent.

We have that Y₁ = Z₁Z₂Z₃, Y₂ = Z₂Z₃ and Y₃ = Z₃. Thus the Jacobian is

J = det | z₂z₃  z₁z₃  z₁z₂ |
        |  0     z₃    z₂  |
        |  0     0     1   |
  = z₂ z₃².

We have

f_{Z₁,Z₂,Z₃}(z₁, z₂, z₃) = f_{Y₁,Y₂,Y₃}(z₁z₂z₃, z₂z₃, z₃) · z₂z₃²
  = 3!(2z₁z₂z₃)(2z₂z₃)(2z₃)(z₂z₃²)
  = (2z₁)(4z₂³)(6z₃⁵),

where 0 < zᵢ < 1. This is the product of the marginal pdfs of Z₁, Z₂ and Z₃, which shows that the variables are mutually independent.

8. (5..) Let X₁, X₂, ..., Xₙ be a random sample. A measure of spread is Gini's mean difference

G = Σ_{j=2}^{n} Σ_{i=1}^{j−1} |Xᵢ − Xⱼ| / C(n, 2).

(a) If n = 10, find a₁, ..., a₁₀ so that G = Σ_{i=1}^{10} aᵢ Yᵢ, where Y₁, Y₂, ..., Y₁₀ are the order statistics of the sample.

(b) Show that E[G] = 2σ/√π if the sample arises from the normal distribution N(µ, σ²).

Hint: This looks worse than it is. Think about the fact that |Xᵢ − Xⱼ| is either Xᵢ − Xⱼ or Xⱼ − Xᵢ, depending on which is larger.

The Yᵢ appear in the sum G, as each is some Xⱼ. The key observation is that |Yᵢ − Yⱼ| = Yᵢ − Yⱼ if i > j, and |Yᵢ − Yⱼ| = Yⱼ − Yᵢ if i < j. Thus Yᵢ appears with a positive sign i − 1 times, and with a negative sign n − i times. In all, Yᵢ has coefficient

aᵢ = [(i − 1) − (n − i)] / C(n, 2).

Since n = 10 and C(10, 2) = 45, we have, for i = 1, 2, ..., 10,

aᵢ = (2i − 11)/45.

For (b), note that E[G] = E[|Xᵢ − Xⱼ|]. Since Xᵢ − Xⱼ ~ N(0, 2σ²), we have that

E[|Xᵢ − Xⱼ|] = ∫_{−∞}^{∞} |x| (1/√(2π · 2σ²)) exp(−x²/(4σ²)) dx
  = (2σ/√π) ∫₀^∞ e^{−u} du    (substituting u = x²/(4σ²))
  = 2σ/√π.
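Both results can be checked numerically; the sketch below (assuming NumPy is available; the seed and sample counts are arbitrary) compares the simulated means of Z₁, Z₂, Z₃ with the means 2/3, 4/5 and 6/7 of the claimed marginals 2z, 4z³ and 6z⁵, verifies on one sample that the coefficients aᵢ = (2i − 11)/45 reproduce G, and estimates E[G] against 2σ/√π ≈ 1.128.

import numpy as np

rng = np.random.default_rng(2)
N = 200_000

# Problem 7: samples of size 3 from f(x) = 2x via inverse transform, F(x) = x^2.
y = np.sort(rng.random((N, 3)) ** 0.5, axis=1)
z1, z2, z3 = y[:, 0] / y[:, 1], y[:, 1] / y[:, 2], y[:, 2]
print(z1.mean(), z2.mean(), z3.mean())        # about 2/3, 4/5, 6/7
print(np.corrcoef([z1, z2, z3]).round(3))     # close to the identity matrix

# Problem 8: Gini's mean difference for n = 10 normal observations.
n, sigma = 10, 1.0

def gini(x):
    d = np.abs(x[:, None] - x[None, :])
    return d[np.triu_indices(len(x), k=1)].mean()   # average over the C(n,2) pairs

x = rng.normal(size=n)
a = (2 * np.arange(1, n + 1) - 11) / 45
print(gini(x), np.sort(x) @ a)                # identical, confirming part (a)
g = [gini(rng.normal(scale=sigma, size=n)) for _ in range(20_000)]
print(np.mean(g), 2 * sigma / np.sqrt(np.pi)) # both about 1.128

The near-zero off-diagonal correlations are consistent with, though do not by themselves prove, the mutual independence shown above.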