Stat 426 : Homework 1. Moulinath Banerjee, University of Michigan.

Announcement: The homework carries 120 points and contributes 10 points to the total grade.

(1) A geometric random variable $W$ takes values in $\{1, 2, 3, \ldots\}$ with $P(W = j) = \theta\,(1-\theta)^{j-1}$, where $0 < \theta < 1$.

(a) Prove that for any two positive integers $i, j$,
$$P(W > i + j \mid W > i) = P(W > j).$$

(b) Indeed, the converse is also true. We show that if $W$ is a discrete random variable taking values in $\{1, 2, 3, \ldots\}$ with probabilities $\{p_1, p_2, p_3, \ldots\}$ and satisfies the memoryless property, then $W$ must follow a geometric distribution. Follow these steps to establish that $W$ is geometric. Using the fact that $W$ has the memoryless property, show that $P(W > m) = (P(W > 1))^m$ for any $m \geq 2$; as a first step towards proving this, show that $P(W > 2) = (P(W > 1))^2$. Define $\theta = P(W = 1)$, so that $1 - \theta = P(W > 1)$. You now have $P(W > m) = (1-\theta)^m$ for any $m \geq 2$. Use this to show that for any $m \geq 2$, $P(W = m) = \theta\,(1-\theta)^{m-1}$. For $m = 1$, $P(W = m) = P(W = 1) = \theta = \theta\,(1-\theta)^{m-1}$ trivially, and the proof is complete. (5 + 5 = 10 points)
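
For readers who want a numerical sanity check of the identity in (1)(a), the sketch below estimates both sides by simulation. It assumes NumPy is available, and the values of theta, i and j are arbitrary illustrative choices, not part of the assignment.

```python
# Monte Carlo sanity check of the memoryless property in (1)(a).
import numpy as np

rng = np.random.default_rng(0)
theta, i, j = 0.3, 4, 2
n = 1_000_000

# NumPy's geometric lives on {1, 2, 3, ...} with P(W = k) = theta * (1 - theta)**(k - 1)
w = rng.geometric(theta, size=n)

lhs = np.mean(w[w > i] > i + j)        # estimate of P(W > i + j | W > i)
rhs = np.mean(w > j)                   # estimate of P(W > j)
print(lhs, rhs, (1 - theta) ** j)      # all three should be close to (1 - theta)**j
```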

(2) If $X$ is a random variable with distribution function $F$ and continuous, non-vanishing density $f$, obtain the density function of the random variable $Y = X^k$, for an even integer $k$. Hint: Express the probability of the event $(X^k \leq y)$ in terms of the distribution function $F$ of $X$ and proceed from there. (5 points)

(3) Let $T$ be an exponential random variable with parameter $\beta$ and let $W$ be a random variable, independent of $T$, which assumes the value $1$ with probability $2/3$ and the value $-1$ with probability $1/3$. Find the density of $X = W\,T$. Hint: It would help to split up the event $\{X \leq x\}$ as the union of $\{X \leq x, W = 1\}$ and $\{X \leq x, W = -1\}$. (7 points)

(4) Consider a population of $N$ voters who will vote either for Democrats or for Republicans (i.e., no one abstains from voting or cancels their vote). The goal is to estimate the proportion $p$ of Democrats in the population. Sampling without replacement: a sample of size $n$ is collected at random without replacement from this population. Let $X_i$ denote the affiliation ($1$ if Democrat, $0$ if Republican) of the $i$-th individual in the sample.

(i) Write down the p.m.f. of $X_1$ and the joint p.m.f. of $(X_1, X_2)$. Compute the p.m.f. of $X_2$ and show that it is identical to the p.m.f. of $X_1$. Also show that $X_1$ and $X_2$ are not independent. In general, $X_1, X_2, \ldots, X_n$ all have identical marginal distributions but are dependent. Also, the joint p.m.f. of $(X_i, X_j)$ is the same for all pairs $(i, j)$.

(ii) Consider the special case where the sample size $n$ equals the population size $N$, so that your random sample is $(X_1, X_2, \ldots, X_N)$. Compute $E(S)$ and $\mathrm{Var}(S)$, where $S = X_1 + X_2 + \ldots + X_N$.

(iii) Let $\hat{p} = n^{-1}(X_1 + X_2 + \ldots + X_n)$ be the natural estimate of $p$. Compute $E(\hat{p})$ and $\mathrm{Var}(\hat{p})$. (4 + 4 + 5 = 13 points)

(5) (a) Let $(U, V)$ be distributed jointly with a spherically symmetric density. In other words, let their joint density be $f(x, y) = C\,g(x^2 + y^2)$, for some non-negative function $g$. Show that $(\epsilon_1 U, \epsilon_2 V)$ has the same distribution as $(U, V)$, where $\epsilon_1$ and $\epsilon_2$ are each either $1$ or $-1$. Deduce that $U$ and $V$ are uncorrelated.

(b) Now consider i.i.d. random variables $(W_1, W_2)$ that are independent of $(U, V)$ in (a), where each $W_i$ assumes the value $1$ or $-1$ with probability $0.5$. Let $Z_1 = W_1 U$ and $Z_2 = W_2 V$. Show that $(Z_1, Z_2)$ again has the same distribution as $(U, V)$. What if $W_1$ and $W_2$ still assume the values $1$ or $-1$ and are still independent of $(U, V)$, but the joint distribution of the vector $(W_1, W_2)$ is arbitrary, i.e., $W_1$ and $W_2$ are not required to be i.i.d.?
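
The following simulation sketch illustrates (5)(a)-(b) numerically, taking the standard bivariate normal as one concrete spherically symmetric density (here $g(s) \propto e^{-s/2}$). NumPy is assumed, and the particular event and seed used for comparison are arbitrary illustrative choices.

```python
# Simulation sketch: sign flips preserve a spherically symmetric distribution.
import numpy as np

rng = np.random.default_rng(1)
n = 500_000
u, v = rng.standard_normal(n), rng.standard_normal(n)

# independent random signs W1, W2 in {-1, +1}, each with probability 1/2
w1 = rng.choice([-1, 1], size=n)
w2 = rng.choice([-1, 1], size=n)
z1, z2 = w1 * u, w2 * v

# (Z1, Z2) should match (U, V): compare a joint probability and the correlation
event = lambda a, b: np.mean((a > 0.5) & (b < 1.0))
print(event(u, v), event(z1, z2))                           # nearly equal
print(np.corrcoef(u, v)[0, 1], np.corrcoef(z1, z2)[0, 1])   # both near 0
```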

(c) Consider transforming $(U, V)$ above to $(W_1, W_2)$, where
$$W_1 = \frac{cU + dV}{\sqrt{c^2 + d^2}} \quad \text{and} \quad W_2 = \frac{dU - cV}{\sqrt{c^2 + d^2}}.$$
Here $c$ and $d$ are any two numbers.

(i) Show that $(W_1, W_2)^T = P\,(U, V)^T$ for a $2 \times 2$ matrix $P$ that is orthogonal, i.e., $P^T P = I$. Also show that $U^2 + V^2 = W_1^2 + W_2^2$.

(ii) Show that $(U, V)$ and $(W_1, W_2)$ have the same joint distribution.

(iii) Let $(U, V)$ be expressed in terms of polar coordinates $(R, \Theta)$, so that $U = R\cos\Theta$ and $V = R\sin\Theta$. Compute the joint and marginal distributions of $(R, \Theta)$. Are they independent? (5 + 5 + 10 = 20 points)

(6) This exercise outlines the construction of what is known as the Poisson process. As a warm-up, let us do some calculus first.

(i) Recall that, for $\alpha > 0$, the gamma function is defined as
$$\Gamma(\alpha) = \int_0^{\infty} e^{-x} x^{\alpha - 1}\,dx.$$
Show that $\Gamma(\alpha + 1) = \alpha\,\Gamma(\alpha)$. Hint: Use integration by parts.

(ii) Deduce that $\Gamma(n) = (n-1)!$.

(iii) If $X_1, X_2, X_3, \ldots$ is a sequence of mutually independent random variables, each distributed as exponential with parameter $\lambda$, show that the partial sums $S_n = X_1 + X_2 + \ldots + X_n$ are distributed as Gamma$(n, \lambda)$; in other words, show that the density of $S_n$ is
$$f_n(x) = \frac{\lambda^n}{\Gamma(n)}\,e^{-\lambda x}\,x^{n-1}.$$
Hint: One way to do this is to prove the result for $n = 2$ using the formula for the density of the sum of two independent random variables, as derived in the book (or from first principles). Then use induction: assume that the result holds for $n - 1$, write $S_n = S_{n-1} + X_n$, and derive the density of $S_n$ using the densities of $S_{n-1}$ and $X_n$ and the fact that $S_{n-1}$ and $X_n$ are independent.
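
A numerical check of the claim in (6)(iii) can be sketched as follows, assuming NumPy and SciPy are available; the values of lambda and n are arbitrary illustrative choices. It compares simulated partial sums of i.i.d. exponentials with a Gamma(n, lambda) reference, noting that rate lambda corresponds to scale 1/lambda in SciPy's parameterization.

```python
# Check that S_n = X_1 + ... + X_n with X_i i.i.d. Exp(lambda) looks Gamma(n, lambda).
import numpy as np
from scipy import stats

rng = np.random.default_rng(2)
lam, n_terms, n_sims = 2.0, 5, 200_000

x = rng.exponential(scale=1.0 / lam, size=(n_sims, n_terms))
s_n = x.sum(axis=1)

gamma_ref = stats.gamma(a=n_terms, scale=1.0 / lam)   # rate lambda <-> scale 1/lambda
print(s_n.mean(), gamma_ref.mean())                   # both approximately n / lambda
print(s_n.var(), gamma_ref.var())                     # both approximately n / lambda**2
print(stats.kstest(s_n, gamma_ref.cdf))               # KS statistic should be close to 0
```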

Now define a family of random variables in the following way: for each $t$, let
$$N(t) = \text{the largest integer } n \text{ such that } S_n \leq t.$$
Here is a physical motivation. Suppose customers arrive at a counter starting from some fixed time. The inter-arrival times of the customers are assumed to be distributed independently and identically as $\exp(\lambda)$; in other words, the time it takes for the first customer to arrive is an $\exp(\lambda)$ random variable, the time between the first and the second customer is an $\exp(\lambda)$ random variable independent of the first, and so on. Note then that the time taken for the $n$-th customer to arrive is precisely $S_n$. The number of customers arriving in time $t$ is then counted by the random variable $N(t)$. The family $\{N(t)\}$ forms a Poisson process; each $N(t)$ is distributed as Poi$(\lambda t)$. Furthermore, if $t_1 < t_2 < \ldots < t_k$, then $N(t_1), N(t_2) - N(t_1), \ldots, N(t_k) - N(t_{k-1})$ are mutually independent and $N(t_i) - N(t_{i-1})$ follows Poi$(\lambda (t_i - t_{i-1}))$. Here we will only prove that the marginal distribution of $N(t)$ is Poisson with parameter $\lambda t$.

(iv) Show that the event $\{S_n \leq t\}$ is equivalent to the event $\{N(t) \geq n\}$. Hence show that
$$P(N(t) = n) = P(S_n \leq t) - P(S_{n+1} \leq t).$$

(v) Now write
$$P(S_n \leq t) = \int_0^{t} \frac{\lambda^n}{\Gamma(n)}\,e^{-\lambda x}\,x^{n-1}\,dx$$
and use integration by parts to show that
$$P(S_n \leq t) = \frac{e^{-\lambda t}(\lambda t)^n}{n!} + P(S_{n+1} \leq t).$$
It follows immediately that
$$P(N(t) = n) = \frac{e^{-\lambda t}(\lambda t)^n}{n!}.$$
(3 + 3 + 4 + 5 + 5 = 20 points)
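
The construction above is easy to probe by simulation. The sketch below (assuming NumPy is available; lambda, t, and the cap on the number of inter-arrival times are arbitrary illustrative choices) builds $N(t)$ from cumulative sums of i.i.d. $\exp(\lambda)$ inter-arrival times and compares its empirical p.m.f. with Poi$(\lambda t)$.

```python
# Simulate N(t) from i.i.d. exponential inter-arrival times and compare with Poi(lambda * t).
import numpy as np
from math import exp, factorial

rng = np.random.default_rng(3)
lam, t, n_sims = 1.5, 4.0, 200_000

# generate more inter-arrival times than can plausibly fit in [0, t]
max_arrivals = 60
gaps = rng.exponential(scale=1.0 / lam, size=(n_sims, max_arrivals))
arrival_times = gaps.cumsum(axis=1)            # S_1, S_2, ..., S_max for each run
n_t = (arrival_times <= t).sum(axis=1)         # N(t) = largest n with S_n <= t

for k in range(8):
    empirical = np.mean(n_t == k)
    poisson = exp(-lam * t) * (lam * t) ** k / factorial(k)
    print(k, round(empirical, 4), round(poisson, 4))   # the two columns should agree
```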

(7) A random variable $V$ is said to have a distribution symmetric about $0$ if the distribution of $-V$ is the same as that of $V$. Let $V$ be a continuous random variable with continuous density function $f$.

(i) Show that $V$ is distributed symmetrically about $0$ if and only if $f(t) = f(-t)$ for every $t$; in other words, $f$ is an even function.

(ii) Suppose that $V$ has an expectation; in other words, $\int |x|\,f(x)\,dx < \infty$. Then show that $E(V) = 0$.

(iii) Let $B$ be a Bernoulli random variable with probability of success $\theta$, where $0 < \theta < 1$. Find the density function of the random variable $\tilde{V} = (2B - 1)\,V$. (4 + 4 + 4 = 12 points)

(8) (i) If $P_1$ and $P_2$ are independent Poisson random variables with parameters $\lambda_1$ and $\lambda_2$, show that $P_1 + P_2$ is also Poisson with parameter $\lambda_1 + \lambda_2$. Recall that if $W$ follows Poisson$(\theta)$, then the p.m.f. of $W$ is
$$P(W = m) = \frac{e^{-\theta}\,\theta^m}{m!}.$$
Hint: Write $P(P_1 + P_2 = m)$ as $\sum_{i=0}^{m} P(P_1 = i, P_2 = m - i)$ and proceed.

(ii) Use this result repeatedly to show that if $X_1, X_2, \ldots, X_n$ are independent Poisson random variables with parameters $\lambda_1, \lambda_2, \ldots, \lambda_n$ respectively, then $X_1 + X_2 + \ldots + X_n$ follows Poisson with parameter $\lambda_1 + \lambda_2 + \ldots + \lambda_n$.

(iii) Show from first principles that the conditional distribution of the random vector $(X_1, X_2, \ldots, X_n)$, given that the sum $X_1 + X_2 + \ldots + X_n = K$ for some integer $K$, is multinomial with parameters $(K, p_1, p_2, \ldots, p_n)$, where each $p_i$ is given by $\lambda_i / (\sum_{j=1}^{n} \lambda_j)$. (4 + 3 + 6 = 13 points)

(9) (a) A line is drawn through the point $(-1, 0)$ in a random direction. Let $(0, Y)$ denote the point at which it cuts the $Y$ axis. Show that $Y$ has the standard Cauchy density:
$$f_Y(y) = \frac{1}{\pi}\,\frac{1}{1 + y^2}, \quad y \in \mathbb{R}.$$

(b) Let $X$ and $Y$ be i.i.d. $N(0,1)$ random variables. Show that $X/Y$ also has the standard Cauchy density.

(c) Deduce (without further calculation) that if $V$ is standard Cauchy, so is $1/V$.

(d) What is the distribution of the random variable $|X|/Y$? (6 + 7 + 3 + 4 = 20 points)
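
As a closing numerical illustration of (9)(b), the sketch below (assuming NumPy is available; the seed, sample size, and evaluation points are arbitrary illustrative choices) compares the empirical CDF of $X/Y$ for simulated i.i.d. $N(0,1)$ pairs with the standard Cauchy CDF $F(y) = 1/2 + \arctan(y)/\pi$.

```python
# Compare the empirical CDF of X/Y (X, Y i.i.d. standard normal) with the Cauchy CDF.
import numpy as np

rng = np.random.default_rng(4)
n = 500_000
x, y = rng.standard_normal(n), rng.standard_normal(n)
ratio = x / y

for q in (-3.0, -1.0, 0.0, 1.0, 3.0):
    empirical = np.mean(ratio <= q)
    cauchy_cdf = 0.5 + np.arctan(q) / np.pi
    print(q, round(empirical, 4), round(cauchy_cdf, 4))   # the two columns should agree
```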