
Stats3D03 Assignment 4, Fall 2017

1. Exercise 3.1.2 on Page 146: If the mgf of a random variable $X$ is $\left(\frac{1}{3} + \frac{2}{3}e^t\right)^5$, find $P(X = 2 \text{ or } 3)$.

Since the mgf of $X$ is $M(t) = \left(\frac{1}{3} + \frac{2}{3}e^t\right)^5$, $X$ has a binomial distribution with $n = 5$ and $p = \frac{2}{3}$. The probability mass function of the binomial distribution is
$$p(x) = \binom{n}{x} p^x (1-p)^{n-x}, \quad x = 0, 1, 2, \ldots, n; \qquad 0 \text{ elsewhere.}$$
Thus,
$$P(X = 2 \text{ or } X = 3) = P(X = 2) + P(X = 3) = \binom{5}{2}\left(\frac{2}{3}\right)^2\left(\frac{1}{3}\right)^3 + \binom{5}{3}\left(\frac{2}{3}\right)^3\left(\frac{1}{3}\right)^2 = \frac{40}{243} + \frac{80}{243} = \frac{40}{81}.$$
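As a quick numerical check, the same probability can be evaluated with base R's dbinom (R is the package mentioned in the note to Exercise 3.1.17 below); this is only a verification sketch, not part of the required solution:

    # X ~ Binomial(n = 5, p = 2/3); P(X = 2 or 3)
    dbinom(2, size = 5, prob = 2/3) + dbinom(3, size = 5, prob = 2/3)
    # 0.4938272, which equals 40/81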

2. Exercise 3.1.17 on Page 149: Consider a shipment of 1000 items into a factory. Suppose the factory can tolerate about 5% defective items. Let $X$ be the number of defective items in a sample without replacement of size $n = 10$. Suppose the factory returns the shipment if $X \geq 2$.
(a) Obtain the probability that the factory returns a shipment of items which has 5% defective items.
(b) Suppose the shipment has 10% defective items. Obtain the probability that the factory returns such a shipment.
(c) Obtain approximations to the probabilities in parts (a) and (b) using appropriate binomial distributions.
Note: If you do not have access to a computer package with a hypergeometric command, obtain the answer to (c) only. This is what would have been done in practice 20 years ago. If you have access to R, then the command dhyper(x, D, N - D, n) returns the probability in expression (3.1.7).

The expression (3.1.7) is
$$p(x) = \frac{\binom{N-D}{n-x}\binom{D}{x}}{\binom{N}{n}}, \quad x = 0, 1, \ldots, n.$$
In the following, let $X$ be the number of defective items.

(a) From the given information, $N = 1000$, $n = 10$, and $D = 1000 \times 5\% = 50$. Since $X$ has a hypergeometric distribution,
$$P(X \geq 2) = 1 - P(X = 0) - P(X = 1) = 1 - \frac{\binom{950}{10}\binom{50}{0}}{\binom{1000}{10}} - \frac{\binom{950}{9}\binom{50}{1}}{\binom{1000}{10}} \approx 0.0853.$$

(b) Since $D = 1000 \times 10\% = 100$, we have
$$P(X \geq 2) = 1 - P(X = 0) - P(X = 1) = 1 - \frac{\binom{900}{10}\binom{100}{0}}{\binom{1000}{10}} - \frac{\binom{900}{9}\binom{100}{1}}{\binom{1000}{10}} \approx 0.2637.$$

(c) With $n = 10$ and $p = 0.05$, the binomial approximation gives
$$P(X \geq 2) = 1 - P(X = 0) - P(X = 1) = 1 - \binom{10}{0}(0.05)^0(0.95)^{10} - \binom{10}{1}(0.05)(0.95)^9 \approx 0.0861.$$
With $n = 10$ and $p = 0.1$, the binomial approximation gives
$$P(X \geq 2) = 1 - P(X = 0) - P(X = 1) = 1 - \binom{10}{0}(0.1)^0(0.9)^{10} - \binom{10}{1}(0.1)(0.9)^9 \approx 0.2639.$$
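Following the problem's own pointer to R, a minimal sketch of the whole computation is given below; dhyper takes the number of defectives $D$ and non-defectives $N - D$ as its second and third arguments, exactly as in the note:

    # (a) and (b): exact hypergeometric tail probabilities P(X >= 2)
    1 - sum(dhyper(0:1, 50, 950, 10))      # (a) about 0.0853
    1 - sum(dhyper(0:1, 100, 900, 10))     # (b) about 0.2637
    # (c): binomial approximations
    1 - sum(dbinom(0:1, 10, 0.05))         # about 0.0861
    1 - sum(dbinom(0:1, 10, 0.10))         # about 0.2639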

3. Exercise 3.2.3 on Page 154: In a lengthy manuscript, it is discovered that only 13.5 percent of the pages contain no typing errors. If we assume that the number of errors per page is a random variable with a Poisson distribution, find the percentage of pages that have exactly one error.

Since the Poisson pmf is
$$P(X = x) = \frac{e^{-\lambda}\lambda^x}{x!}, \quad x = 0, 1, 2, \ldots,$$
we have $P(X = 0) = \frac{e^{-\lambda}\lambda^0}{0!} = e^{-\lambda}$. From the given information, $P(X = 0) = 13.5\% = 0.135$, so $\lambda = -\ln(0.135) \approx 2$. Thus,
$$P(X = 1) = \frac{e^{-\lambda}\lambda^1}{1!} = 2e^{-2} \approx 0.27,$$
so about 27% of the pages have exactly one error.
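The same answer in a couple of lines of R (a verification sketch using base R's dpois):

    lambda <- -log(0.135)     # about 2
    dpois(1, lambda)          # about 0.27, i.e. roughly 27% of pages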

4. Exercise 3.2.4 on Page 155: Let $X_1$ and $X_2$ be two independent random variables. Suppose that $X_1$ and $Y = X_1 + X_2$ have Poisson distributions with means $\mu_1$ and $\mu > \mu_1$, respectively. Find the distribution of $X_2$.

The mgfs of $X_1$ and $Y$, which have Poisson distributions, are respectively
$$M_{X_1}(t) = \exp\{\mu_1(e^t - 1)\} \quad \text{and} \quad M_Y(t) = \exp\{\mu(e^t - 1)\}.$$
Since $X_1$ and $X_2$ are independent random variables,
$$M_Y(t) = M_{X_1}(t)\,M_{X_2}(t), \quad \text{i.e.,} \quad \exp\{\mu(e^t - 1)\} = \exp\{\mu_1(e^t - 1)\}\,M_{X_2}(t).$$
Therefore,
$$M_{X_2}(t) = \exp\{(\mu - \mu_1)(e^t - 1)\}.$$
Thus $X_2$ has a Poisson distribution with mean $\mu - \mu_1$.
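The mgf conclusion can be sanity-checked by simulation; in the sketch below the values $\mu_1 = 2$, $\mu = 5$, the seed, and the sample size are illustrative choices, not part of the problem. If $X_2$ really is Poisson($\mu - \mu_1$), then $X_1 + X_2$ should behave like Poisson($\mu$):

    set.seed(1)
    mu1 <- 2; mu <- 5
    x1 <- rpois(1e5, mu1)
    x2 <- rpois(1e5, mu - mu1)    # the claimed distribution of X2
    y  <- x1 + x2
    c(mean(y), var(y))            # both should be close to mu = 5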

5. Exercise 3.3.6 on Page 164: Let $X_1, X_2, X_3$ be iid random variables, each with pdf $f(x) = e^{-x}$, $0 < x < \infty$, zero elsewhere.
(a) Find the distribution of $Y = \min(X_1, X_2, X_3)$. Hint: $P(Y \leq y) = 1 - P(Y > y) = 1 - P(X_i > y,\ i = 1, 2, 3)$.
(b) Find the distribution of $Y = \max(X_1, X_2, X_3)$.

From the given information, the cumulative distribution function of each $X_i$ is $F(x) = 1 - e^{-x}$, $x > 0$.

(a)
$$F_Y(y) = P(Y \leq y) = 1 - P(Y > y) = 1 - P(X_1 > y, X_2 > y, X_3 > y) = 1 - [1 - P(X_1 \leq y)][1 - P(X_2 \leq y)][1 - P(X_3 \leq y)] = 1 - \left[1 - (1 - e^{-y})\right]^3 = 1 - e^{-3y}.$$
Thus the distribution of $Y$ is
$$F_Y(y) = \begin{cases} 1 - e^{-3y}, & y > 0 \\ 0, & \text{elsewhere.} \end{cases}$$

(b)
$$F_Y(y) = P(Y \leq y) = P(X_1 \leq y)\,P(X_2 \leq y)\,P(X_3 \leq y) = (1 - e^{-y})(1 - e^{-y})(1 - e^{-y}) = (1 - e^{-y})^3.$$
Thus the distribution of $Y$ is
$$F_Y(y) = \begin{cases} (1 - e^{-y})^3, & y > 0 \\ 0, & \text{elsewhere.} \end{cases}$$
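A small simulation sketch comparing the empirical cdfs of the minimum and maximum with the derived expressions; the sample size, seed, and the test point $y_0 = 0.5$ are arbitrary illustrative choices:

    set.seed(1)
    x <- matrix(rexp(3 * 1e5), ncol = 3)
    ymin <- apply(x, 1, min)
    ymax <- apply(x, 1, max)
    y0 <- 0.5
    c(mean(ymin <= y0), 1 - exp(-3 * y0))    # both about 0.777
    c(mean(ymax <= y0), (1 - exp(-y0))^3)    # both about 0.061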

6. Exercise 3.4.5 on Page 176: Let $X$ be a random variable such that $E(X^{2m}) = (2m)!/(2^m m!)$, $m = 1, 2, 3, \ldots$ and $E(X^{2m-1}) = 0$, $m = 1, 2, 3, \ldots$. Find the mgf and the pdf of $X$.

The moment generating function of $X$ is
$$M_X(t) = E\left(e^{tX}\right) = E\left[\sum_{n=0}^{\infty} \frac{(tX)^n}{n!}\right] = \sum_{n=0}^{\infty} \frac{E(X^n)\,t^n}{n!} = \sum_{m=0}^{\infty} \frac{E(X^{2m})\,t^{2m}}{(2m)!} \quad \left(\text{since } E(X^{2m-1}) = 0\right)$$
$$= \sum_{m=0}^{\infty} \frac{(t^2/2)^m}{m!} = e^{t^2/2}.$$
Thus $X$ follows $N(0, 1)$, and the pdf is
$$f(x) = \frac{1}{\sqrt{2\pi}}\exp\left(-\frac{x^2}{2}\right), \quad -\infty < x < \infty.$$

7. Exercise 3.6.2 on Page 196: Show that
$$Y = \frac{1}{1 + (r_1/r_2)W},$$
where $W$ has an $F$-distribution with parameters $r_1$ and $r_2$, has a beta distribution.

Let $U$ and $V$ be independent chi-square random variables with $r_1$ and $r_2$ degrees of freedom, respectively, and let $W = (U/r_1)/(V/r_2)$. Then $W$ has an $F$-distribution with parameters $r_1$ and $r_2$, and
$$Y = \frac{1}{1 + (r_1/r_2)W} = \frac{1}{1 + \dfrac{r_1}{r_2}\cdot\dfrac{r_2 U}{r_1 V}} = \frac{1}{1 + U/V} = \frac{V}{U + V}.$$
Since $U$ and $V$ are independent chi-square variables with $r_1$ and $r_2$ degrees of freedom, $V/(U+V)$ has a beta distribution. Thus $Y$ has a beta distribution.
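A numerical sketch of Exercise 3.6.2: the degrees of freedom $r_1 = 3$, $r_2 = 5$, the seed, and the comparison point are illustrative choices, and the beta parameters $(r_2/2, r_1/2)$ come from the standard fact that $V/(U+V)$ with independent $\chi^2(r_1)$ and $\chi^2(r_2)$ components is Beta$(r_2/2, r_1/2)$:

    set.seed(1)
    r1 <- 3; r2 <- 5
    w <- rf(1e5, r1, r2)
    y <- 1 / (1 + (r1 / r2) * w)
    y0 <- 0.6
    c(mean(y <= y0), pbeta(y0, r2 / 2, r1 / 2))   # the two values should nearly agree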

8. Exercise 3.6.4 on Page 196: Let $X_1, X_2, X_3$ be three independent chi-square variables with $r_1, r_2, r_3$ degrees of freedom, respectively.
(a) Show that $Y_1 = X_1/X_2$ and $Y_2 = X_1 + X_2$ are independent and that $Y_2$ is $\chi^2(r_1 + r_2)$.
(b) Deduce that
$$\frac{X_1/r_1}{X_2/r_2} \quad \text{and} \quad \frac{X_3/r_3}{(X_1 + X_2)/(r_1 + r_2)}$$
are independent $F$-variables.

(a) Since $Y_1 = X_1/X_2$ and $Y_2 = X_1 + X_2$, we obtain $X_1 = Y_1 Y_2/(1 + Y_1)$ and $X_2 = Y_2/(1 + Y_1)$. The determinant of the Jacobian is
$$J = \begin{vmatrix} \dfrac{y_2}{(y_1 + 1)^2} & \dfrac{y_1}{y_1 + 1} \\[2ex] -\dfrac{y_2}{(y_1 + 1)^2} & \dfrac{1}{y_1 + 1} \end{vmatrix} = \frac{y_2}{(y_1 + 1)^2}.$$
The joint pdf of $X_1$ and $X_2$ is
$$f(x_1, x_2) = \frac{1}{2^{(r_1 + r_2)/2}\,\Gamma\!\left(\frac{r_1}{2}\right)\Gamma\!\left(\frac{r_2}{2}\right)}\, x_1^{r_1/2 - 1}\, x_2^{r_2/2 - 1}\, e^{-(x_1 + x_2)/2}.$$
Hence the joint pdf of $Y_1$ and $Y_2$ is
$$f(y_1, y_2) = \frac{1}{2^{(r_1 + r_2)/2}\,\Gamma\!\left(\frac{r_1}{2}\right)\Gamma\!\left(\frac{r_2}{2}\right)} \left(\frac{y_1 y_2}{y_1 + 1}\right)^{r_1/2 - 1} \left(\frac{y_2}{y_1 + 1}\right)^{r_2/2 - 1} e^{-y_2/2}\, \frac{y_2}{(y_1 + 1)^2} = \frac{1}{2^{(r_1 + r_2)/2}\,\Gamma\!\left(\frac{r_1}{2}\right)\Gamma\!\left(\frac{r_2}{2}\right)}\, \frac{y_1^{r_1/2 - 1}}{(y_1 + 1)^{(r_1 + r_2)/2}}\, y_2^{(r_1 + r_2)/2 - 1}\, e^{-y_2/2}.$$
This joint pdf factors into two pieces, one involving $y_1$ alone and one involving $y_2$ alone, so $Y_1$ and $Y_2$ are independent. Moreover, the $y_2$ factor is proportional to a $\chi^2(r_1 + r_2)$ density, so $Y_2$ has a chi-square distribution with $r_1 + r_2$ degrees of freedom.

(b) Since $X_1, X_2, X_3$ are independent chi-square variables with $r_1, r_2, r_3$ degrees of freedom,
$$W = \frac{X_1/r_1}{X_2/r_2} \sim F(r_1, r_2) \quad \text{and} \quad Y_2 = X_1 + X_2 \sim \chi^2(r_1 + r_2).$$
Since the ratio of two independent chi-square variables, each divided by its degrees of freedom, follows an $F$-distribution,
$$V = \frac{X_3/r_3}{(X_1 + X_2)/(r_1 + r_2)} \sim F(r_3, r_1 + r_2).$$
By part (a), $X_1/X_2$ and $Y_2 = X_1 + X_2$ are independent, and $X_3$ is independent of both, so $W$ (a function of $X_1/X_2$ alone) and $V$ (a function of $X_3$ and $X_1 + X_2$) are independent. Thus
$$\frac{X_1/r_1}{X_2/r_2} \quad \text{and} \quad \frac{X_3/r_3}{(X_1 + X_2)/(r_1 + r_2)}$$
are independent $F$-variables.
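A simulation sketch of both parts; the degrees of freedom $r_1 = 4$, $r_2 = 6$, $r_3 = 8$, the seed, and the sample size are illustrative choices, and near-zero sample correlations are only consistent with independence, not a proof of it:

    set.seed(1)
    r1 <- 4; r2 <- 6; r3 <- 8
    x1 <- rchisq(1e5, r1); x2 <- rchisq(1e5, r2); x3 <- rchisq(1e5, r3)
    y1 <- x1 / x2; y2 <- x1 + x2
    cor(y1, y2)                                # near 0, consistent with part (a)
    c(mean(y2), r1 + r2)                       # sample mean matches the chi-square(r1 + r2) mean
    w <- (x1 / r1) / (x2 / r2)                 # F(r1, r2)
    v <- (x3 / r3) / ((x1 + x2) / (r1 + r2))   # F(r3, r1 + r2)
    cor(w, v)                                  # near 0, consistent with part (b)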