
Problem 1: (8 points) Let $X$ be a Poisson random variable of parameter $\lambda$.

1. (3 points) Compute $E[X]$.

$$E[X] = \sum_{k \ge 0} k\,P(X = k) = \sum_{k \ge 1} k\, e^{-\lambda}\frac{\lambda^k}{k!} = \lambda e^{-\lambda} \sum_{k \ge 1} \frac{\lambda^{k-1}}{(k-1)!} = \lambda e^{-\lambda} \sum_{k' \ge 0} \frac{\lambda^{k'}}{k'!} = \lambda e^{-\lambda} e^{\lambda} = \lambda.$$

2. (3 points) Compute $g(r) = E[r^X]$, for any real number $r > 0$.

$$E[r^X] = \sum_{k \ge 0} r^k e^{-\lambda}\frac{\lambda^k}{k!} = e^{-\lambda} \sum_{k \ge 0} \frac{(r\lambda)^k}{k!} = e^{-\lambda} e^{r\lambda} = e^{\lambda(r-1)}.$$

3. (2 points) Take for a fact that $g'(1) = E[X]$. Verify your answer to question 1.

$g'(r) = \lambda e^{\lambda(r-1)}$, so $g'(1) = \lambda = E[X]$, which agrees with question 1.

(You'll get 2 extra points if you actually prove that $g'(1) = E[X]$.) Differentiating the series term by term: $g(r) = \sum_{k \ge 0} r^k e^{-\lambda}\frac{\lambda^k}{k!}$, so $g'(r) = \sum_{k \ge 1} k r^{k-1} e^{-\lambda}\frac{\lambda^k}{k!}$, and $g'(1) = \sum_{k \ge 1} k\, e^{-\lambda}\frac{\lambda^k}{k!} = E[X]$.
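As an added sanity check (not part of the graded solution), here is a minimal Monte Carlo sketch in Python; the values $\lambda = 2.5$ and $r = 1.3$ are arbitrary choices.

```python
import numpy as np

rng = np.random.default_rng(seed=0)
lam, r = 2.5, 1.3  # arbitrary test values
x = rng.poisson(lam, size=1_000_000)

print(x.mean())               # close to E[X] = lam = 2.5
print((r ** x).mean())        # empirical E[r^X]
print(np.exp(lam * (r - 1)))  # closed form e^{lam(r-1)}, should match above
```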

Problem 2: counting squirrels (9 points) We want to estimate the number $N$ of squirrels on campus. To do that, we capture $k$ squirrels, put a mark on one of their legs, and release them. Then we wait a little bit, capture a squirrel, see if it has a mark, and release it. We wait a little bit, capture another squirrel, see if it has a mark, and release it, etc. We do that $l$ times, and we count the number of marks that we saw. We assume that each time we capture a squirrel, we pick it uniformly at random among the $N$ squirrels, independently of the other times we have captured a squirrel (note that among the $l$ ones we checked, we could have picked the same one twice, three times, etc.).

1. (3 points) What is the probability that the first squirrel we capture is marked? Find the probability $p(m)$ that we find $m$ marks (it should depend on $N$, $k$, $l$).

There are $k$ marked squirrels among the $N$. The probability that the first one we pick is marked is $k/N$. The probability of finding $m$ marks among $l$ picks is
$$p(m) = \binom{l}{m} (k/N)^m (1 - k/N)^{l-m}.$$

2. (4 points) We have carried out the experiment: $k$, $l$, $m$ are data. To estimate $N$, we choose the value that maximizes $p(m)$ as a function of $N$. Find this value, for any given $k$, $l$, $m$. Hint: compute the derivative of $p(m)$ with respect to $N$. We give the derivative of $f(x) = (1-x)^{l-m} x^m$ with respect to $x$: $f'(x) = (m - lx)(1-x)^{l-m-1} x^{m-1}$.

Write $y = k/N$, so $p(m) = \binom{l}{m} y^m (1-y)^{l-m}$ and
$$\frac{d}{dy} p(m) = \binom{l}{m} (m - ly)(1-y)^{l-m-1} y^{m-1}.$$
Hence $p(m)$ is maximal for $y = m/l$, that is, for $N = kl/m$.

3. (2 points) If we marked 20 squirrels, captured 20 squirrels, and found only 5 marked ones, what is the estimated number of squirrels on campus?

Here $k = 20$, $l = 20$, $m = 5$. The estimate from question 2 is $N = kl/m = 80$.
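As an illustration (added here, not part of the graded solution), the maximum-likelihood estimate can be checked numerically by scanning integer values of $N$; the search range up to 1000 is an arbitrary choice.

```python
from math import comb

k, l, m = 20, 20, 5  # the data of question 3

def likelihood(N):
    """p(m) as a function of N: probability of m marks in l uniform picks."""
    y = k / N
    return comb(l, m) * y**m * (1 - y) ** (l - m)

# Scan integer candidates N >= k (at least the k marked squirrels exist).
best = max(range(k, 1001), key=likelihood)
print(best)  # 80, matching the closed form N = k*l/m
```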

Problem 3: (2 pages!) gambling or not gambling (16 points) There are 10 balls in an urn: 6 Green balls, 3 Red balls, and 1 Black ball. You pick 3 balls from the urn, and your gain depends on the number of Red balls you get: with one Red ball you win $10, with two red balls $25, and if you get the three red balls you win the jackpot, $240! But careful: if you get the Black ball, you don't win anything...

1. (2 points) What is the probability of winning the jackpot?

Only one draw out of $\binom{10}{3}$ contains the three red balls, so $P(\text{jackpot}) = 1/\binom{10}{3} = 1/120$.

2. (3 points) What is the probability of getting the black ball?

$B$ = getting the black ball: pick the black ball, and choose the two remaining balls among the other 9, $\binom{9}{2}$ ways:
$$P(B) = \binom{9}{2}\Big/\binom{10}{3} = \frac{36}{120} = \frac{3}{10}.$$

3. (4 points) Show that the probability that you don't win anything is 7/15 (give events some notations: for example, $B$ is the event that you get the black ball, etc.).

$N$ = getting NO red ball. You don't win anything on the event $N \cup B$, and $P(N \cup B) = P(N) + P(B) - P(N \cap B)$. No red balls: all 3 balls are picked among the remaining 7, so $P(N) = \binom{7}{3}/\binom{10}{3} = 7/24$. No red ball and the black ball: pick the black ball, then choose the remaining two outside the red ones, $\binom{6}{2}$ choices, so $P(N \cap B) = \binom{6}{2}/\binom{10}{3} = 1/8$. Then
$$P(N \cup B) = \frac{7}{24} + \frac{3}{10} - \frac{1}{8} = \frac{7}{15}.$$
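Since there are only $\binom{10}{3} = 120$ equally likely draws, the three probabilities above can be checked by brute-force enumeration; the sketch below is an added verification, not part of the original solution.

```python
from fractions import Fraction
from itertools import combinations

balls = ["G"] * 6 + ["R"] * 3 + ["B"]     # 6 Green, 3 Red, 1 Black
draws = list(combinations(range(10), 3))  # all 120 equally likely picks

def prob(event):
    """Exact probability of an event over the uniform draws."""
    hits = sum(1 for d in draws if event([balls[i] for i in d]))
    return Fraction(hits, len(draws))

print(prob(lambda b: b.count("R") == 3))              # 1/120 (jackpot)
print(prob(lambda b: "B" in b))                       # 3/10  (black ball)
print(prob(lambda b: b.count("R") == 0 or "B" in b))  # 7/15  (win nothing)
```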

Problem 3 (page 2):

4. (5 points) We denote by $G$ the gain one gets at this game. Find the probability mass function of the random variable $G$. (Verify that it sums up to 1...)

Questions 3 and 1 give, respectively, $P(G = 0) = 7/15$ and $P(G = 240) = 1/120$.

$P(G = 10) = P(\text{1 red ball and no black ball}) = \binom{3}{1}\binom{6}{2}\big/\binom{10}{3} = \frac{45}{120} = \frac{3}{8}$ (choose the red ball you pick, $\binom{3}{1}$, then the two other balls among the 6 green ones, $\binom{6}{2}$).

$P(G = 25) = P(\text{2 red balls and no black ball}) = \binom{3}{2}\binom{6}{1}\big/\binom{10}{3} = \frac{18}{120} = \frac{3}{20}$ (choose the two red balls you pick, $\binom{3}{2}$, then the other ball among the 6 green ones, $\binom{6}{1}$).

It is easy to verify that this sums up to 1 (put everything over 120): $56 + 45 + 18 + 1 = 120$.

5. (3 points) What is the expected value of the gain?

$$E[G] = 0 \cdot P(G=0) + 10 \cdot P(G=10) + 25 \cdot P(G=25) + 240 \cdot P(G=240) = 0 + \frac{15}{4} + \frac{15}{4} + 2 = 9.5.$$
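Extending the same enumeration (again an added check, not part of the graded solution), one can tabulate the pmf of $G$ exactly and recover $E[G] = 9.5$:

```python
from fractions import Fraction
from itertools import combinations

balls = ["G"] * 6 + ["R"] * 3 + ["B"]

def gain(drawn):
    """The game's payoff: the black ball voids any win, else pay by red count."""
    if "B" in drawn:
        return 0
    return {0: 0, 1: 10, 2: 25, 3: 240}[drawn.count("R")]

draws = [[balls[i] for i in d] for d in combinations(range(10), 3)]
pmf = {}
for d in draws:
    g = gain(d)
    pmf[g] = pmf.get(g, Fraction(0)) + Fraction(1, len(draws))

print(pmf)                                 # {0: 7/15, 10: 3/8, 25: 3/20, 240: 1/120}
print(sum(pmf.values()))                   # 1, as required
print(sum(g * p for g, p in pmf.items()))  # 19/2 = 9.5
```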

Problem 4: (7 points) There are 5 Red balls, 3 Green balls and 2 Yellow balls in an urn (again!). Some bouncing balls may have been introduced... The Red balls have a probability .2 of being bouncing, the Green balls a probability .4, and the Yellow balls a probability .9. You pick 1 ball at random.

1. (3 points) What is the probability that you get a bouncing ball?

Write $B$ = bouncing, $R$ = Red, $G$ = Green, $Y$ = Yellow. By the law of total probability,
$$P(B) = P(R)P(B \mid R) + P(G)P(B \mid G) + P(Y)P(B \mid Y) = \tfrac{5}{10}(.2) + \tfrac{3}{10}(.4) + \tfrac{2}{10}(.9) = .10 + .12 + .18 = .40.$$

2. (4 points) Given that you got a bouncing ball, what is the conditional probability that it was a Red ball? A Green ball? A Yellow ball?

By Bayes' rule:
$$P(R \mid B) = \frac{P(R)P(B \mid R)}{P(B)} = \frac{.10}{.40} = \frac{1}{4}, \quad P(G \mid B) = \frac{P(G)P(B \mid G)}{P(B)} = \frac{.12}{.40} = \frac{3}{10}, \quad P(Y \mid B) = \frac{P(Y)P(B \mid Y)}{P(B)} = \frac{.18}{.40} = \frac{9}{20}.$$
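A quick exact-arithmetic check of the two computations above (added, not part of the solution); the three posteriors must sum to 1:

```python
from fractions import Fraction as F

# Priors P(color) and likelihoods P(bouncing | color), from the statement.
prior = {"R": F(5, 10), "G": F(3, 10), "Y": F(2, 10)}
bounce = {"R": F(2, 10), "G": F(4, 10), "Y": F(9, 10)}

p_b = sum(prior[c] * bounce[c] for c in prior)              # total probability
posterior = {c: prior[c] * bounce[c] / p_b for c in prior}  # Bayes' rule

print(p_b)                      # 2/5 = .4
print(posterior)                # {'R': 1/4, 'G': 3/10, 'Y': 9/20}
print(sum(posterior.values()))  # 1
```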

Problem 5: collecting toys (10 points) Each week, you buy your favorite pack of cereals. It is your favorite because each pack contains a toy, and you try to collect all $n$ different toys. We assume that the toy you get each week is chosen uniformly among the $n$ possible toys, independently of what happens the other weeks. We denote by $T_i$ the number of weeks it takes to reach a collection of $i$ different toys, and by $X_i = T_i - T_{i-1}$ the number of weeks it takes to get from a collection of $i-1$ different toys to a collection of $i$ different toys.

1. (3 points) Show that $X_i$ is a Geometric random variable. What is its parameter?

You have $i-1$ different toys, so there are $n-(i-1)$ that you don't have. You get a new toy next week with probability $p := \frac{n-i+1}{n}$. The probability that you wait exactly $k$ weeks before you get a new toy is therefore $(1-p)^{k-1} p$ (you fail the first $k-1$ weeks and succeed in the last one, the weeks being independent). This is exactly the probability mass function of a geometric random variable with parameter $p = \frac{n-i+1}{n}$.

2. (3 points) $T_{n/2}$ is the time to complete half a collection. Write $T_{n/2}$ as a sum of $X_i$'s, and compute $E[T_{n/2}]$ (you don't have to simplify the sum).

$T_{n/2} = X_1 + X_2 + \dots + X_{n/2} = \sum_{i=1}^{n/2} X_i$, so that
$$E[T_{n/2}] = \sum_{i=1}^{n/2} E[X_i] = \sum_{i=1}^{n/2} \frac{n}{n-i+1}.$$
We used the fact that the expectation of a geometric random variable of parameter $p$ is $1/p$ (here $X_i$ is a geometric r.v. with parameter $(n-i+1)/n$).

3. (4 points) Show that $E[T_{n/2}] < n$ (Hint: give a bound for each term in the sum). Deduce that, on average, getting the last missing toy (which takes a time $X_n$) takes longer than completing half a collection.

Since the index in the sum satisfies $i \le n/2$, one gets $n-i+1 > n/2$, so that $\frac{n}{n-i+1} < 2$. One gets
$$E[T_{n/2}] = \sum_{i=1}^{n/2} \frac{n}{n-i+1} < \sum_{i=1}^{n/2} 2 = 2 \cdot \frac{n}{2} = n.$$
The time to get the last missing toy is $X_n$. It is a geometric random variable with parameter $1/n$, so the average time to get the last missing toy is $E[X_n] = n > E[T_{n/2}]$, the average time to complete half a collection.
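A short simulation (illustrative only; the choice $n = 20$ is arbitrary) confirms the comparison: the exact value of $E[T_{n/2}]$ stays below $n$, while the last missing toy alone takes $n$ weeks on average.

```python
import random

def half_collection_time(n):
    """Weeks until n//2 distinct toys have been drawn, each week uniform on n."""
    seen, weeks = set(), 0
    while len(seen) < n // 2:
        weeks += 1
        seen.add(random.randrange(n))
    return weeks

n, trials = 20, 100_000
empirical = sum(half_collection_time(n) for _ in range(trials)) / trials
exact = sum(n / (n - i + 1) for i in range(1, n // 2 + 1))

print(empirical)  # close to the exact value below
print(exact)      # about 13.4 for n = 20, indeed < n
print(n)          # E[X_n] = n = 20 weeks for the last missing toy
```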