
Introduction to Probability
Due: August 8th, 2011

Solutions of Final Exam

Solve all the problems.

1. (15 points) You have three coins, showing Head with probabilities p_1, p_2 and p_3. You perform two different experiments:
1. You choose one coin at random and toss it repeatedly.
2. You repeatedly choose a coin at random and toss it.
In both cases calculate the average number of Heads among the first n tosses, and the average time you have to wait for the first Head.

Solution.

1. (a) Let N be the number of Heads among the first n tosses. Conditioning on which coin was chosen,
EN = \sum_{i=1}^{3} E[N \mid p = p_i] \, P(p = p_i) = \frac{1}{3} \sum_{i=1}^{3} n p_i = \frac{n}{3}(p_1 + p_2 + p_3).

(b) Let T be the waiting time for the first Head. Given p = p_i, T is geometric with parameter p_i, so
ET = \sum_{i=1}^{3} E(T \mid p = p_i) \, P(p = p_i) = \frac{1}{3} \sum_{i=1}^{3} \sum_{n=1}^{\infty} n (1-p_i)^{n-1} p_i = \frac{1}{3} \sum_{i=1}^{3} \frac{p_i}{p_i^2} = \frac{1}{3}\Big(\frac{1}{p_1} + \frac{1}{p_2} + \frac{1}{p_3}\Big).

2. (a) Let X_i = 1 if the coin shows a Head on the ith toss, and zero otherwise, for i = 1, \dots, n. Then N = X_1 + \dots + X_n, so
EN = E(X_1 + \dots + X_n) = EX_1 + \dots + EX_n.
Since EX_k = \sum_{i=1}^{3} E(X_k \mid p = p_i) \, P(p = p_i) = \frac{1}{3} \sum_{i=1}^{3} p_i for every k,
EN = n \cdot \frac{1}{3} \sum_{i=1}^{3} p_i = \frac{n}{3}(p_1 + p_2 + p_3).

(b) Since P(X_i = 1) = \sum_{j=1}^{3} P(X_i = 1 \mid p = p_j) \, P(p = p_j) = \frac{p_1 + p_2 + p_3}{3} does not depend on i, and the X_i are independent, T is a geometric random variable with parameter p = \frac{p_1 + p_2 + p_3}{3}. Therefore
ET = \frac{1}{p} = \frac{3}{p_1 + p_2 + p_3}.
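A quick Monte Carlo check (added here as an illustration, not part of the original solution) confirms the formulas above; the coin biases p_1 = 0.2, p_2 = 0.5, p_3 = 0.8 and the toss count n = 10 below are arbitrary choices.

```python
import random

P_COINS = [0.2, 0.5, 0.8]   # hypothetical values of p_1, p_2, p_3, for illustration only
TRIALS = 200_000
n = 10                      # number of tosses used for E[N] in experiment 2

def wait_fixed_coin():
    """Experiment 1: choose one coin once, then toss it until the first Head."""
    p = random.choice(P_COINS)
    t = 1
    while random.random() >= p:
        t += 1
    return t

def wait_fresh_coin():
    """Experiment 2: choose a coin uniformly at random before every toss."""
    t = 1
    while random.random() >= random.choice(P_COINS):
        t += 1
    return t

def heads_fresh_coin():
    """Experiment 2: number of Heads among the first n tosses."""
    return sum(random.random() < random.choice(P_COINS) for _ in range(n))

est_T1 = sum(wait_fixed_coin() for _ in range(TRIALS)) / TRIALS
est_T2 = sum(wait_fresh_coin() for _ in range(TRIALS)) / TRIALS
est_N2 = sum(heads_fresh_coin() for _ in range(TRIALS)) / TRIALS

print("Exp. 1, E[T]:", est_T1, " theory:", sum(1 / p for p in P_COINS) / 3)
print("Exp. 2, E[T]:", est_T2, " theory:", 3 / sum(P_COINS))
print("Exp. 2, E[N]:", est_N2, " theory:", n * sum(P_COINS) / 3)
```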

2. (15 points) Let X_1, \dots, X_4 be four independent random variables, and let g_i : R^2 \to R be functions for i = 1, 2. Show that Y_1 = g_1(X_1, X_2) and Y_2 = g_2(X_3, X_4) are independent.

Proof. We want to show that
P(Y_1 \in F, Y_2 \in G) = P(Y_1 \in F) \, P(Y_2 \in G)
for any events F and G. There are sets A_1, A_2 such that
\{Y_1 \in F\} := \{g_1(X_1, X_2) \in F\} = \{X_1 \in A_1, X_2 \in A_2\}.
Similarly, there are sets A_3, A_4 such that
\{Y_2 \in G\} := \{g_2(X_3, X_4) \in G\} = \{X_3 \in A_3, X_4 \in A_4\}.
Therefore, using the independence of X_1, \dots, X_4,
P(Y_1 \in F, Y_2 \in G) = P(X_i \in A_i \text{ for } i = 1, \dots, 4) = P(X_1 \in A_1, X_2 \in A_2) \, P(X_3 \in A_3, X_4 \in A_4) = P(Y_1 \in F) \, P(Y_2 \in G).
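The factorization can also be seen numerically. The sketch below is an added illustration, not part of the exam: it fixes one arbitrary choice of g_1, g_2 and of the events F, G, and checks that the joint frequency matches the product of the marginal frequencies when the four inputs are independent.

```python
import random

TRIALS = 200_000
cnt_joint = cnt_F = cnt_G = 0

for _ in range(TRIALS):
    # Four independent random variables (uniform on [0, 1] here, just for illustration).
    x1, x2, x3, x4 = (random.random() for _ in range(4))
    y1 = x1 + x2          # g1(X1, X2)
    y2 = max(x3, x4)      # g2(X3, X4)
    in_F = y1 > 1.0       # the event {Y1 in F}
    in_G = y2 < 0.5       # the event {Y2 in G}
    cnt_F += in_F
    cnt_G += in_G
    cnt_joint += in_F and in_G

print("P(Y1 in F, Y2 in G)  ~", cnt_joint / TRIALS)
print("P(Y1 in F) P(Y2 in G) ~", (cnt_F / TRIALS) * (cnt_G / TRIALS))
```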

3. (20 points) Let Z_1, Z_2, \dots be a sequence of independent Bernoulli(p) random variables. Let X_1, X_2, \dots denote the times of the first, second, \dots success. (E.g. for the outcome 0, 1, 1, 0, 0, 1, 0, 1, \dots we have X_1 = 2, X_2 = 3, X_3 = 6, \dots)
1. Find the joint PMF of X_1, \dots, X_n, i.e., P(X_1 = k_1, \dots, X_n = k_n).
2. Use (a) to calculate the PMF of X_n, the time of the n-th success.
3. Use (a) to calculate the PMF of the RVs Y_1 = X_1, Y_2 = X_2 - X_1, \dots, Y_n = X_n - X_{n-1}, and show that they are independent geometric RVs.

Solution.

1. Since X_1 < \dots < X_n, we have P(X_1 = k_1, \dots, X_n = k_n) = 0 if k_1 < \dots < k_n doesn't hold. If k_1 < \dots < k_n, then
P(X_1 = k_1, \dots, X_n = k_n)
= P(Z_1 = 0, \dots, Z_{k_1 - 1} = 0, Z_{k_1} = 1, \dots, Z_{k_{n-1}+1} = 0, \dots, Z_{k_n - 1} = 0, Z_{k_n} = 1)
= P(Z_1 = 0) \cdots P(Z_{k_1 - 1} = 0) \, P(Z_{k_1} = 1) \cdots P(Z_{k_{n-1}+1} = 0) \cdots P(Z_{k_n - 1} = 0) \, P(Z_{k_n} = 1)
= (1-p)^{k_n - n} p^n.

2. Let \Lambda = \{(k_1, \dots, k_{n-1}) \in N^{n-1} : k_1 < \dots < k_{n-1} < k\}. Then
P(X_n = k) = \sum_{(k_1, \dots, k_{n-1}) \in \Lambda} P(X_1 = k_1, \dots, X_{n-1} = k_{n-1}, X_n = k)
= \sum_{(k_1, \dots, k_{n-1}) \in \Lambda} (1-p)^{k - n} p^n = (1-p)^{k - n} p^n \, \#\Lambda = \binom{k-1}{n-1} (1-p)^{k - n} p^n,
where N^{n-1} is the set of (n-1)-tuples with coordinates in N and \#\Lambda denotes the number of elements in \Lambda.

3. P(Y_1 = l_1, Y_2 = l_2, \dots, Y_n = l_n) = P(X_1 = l_1, X_2 = l_1 + l_2, \dots, X_n = l_1 + \dots + l_n)
= (1-p)^{l_1 + \dots + l_n - n} p^n = p(1-p)^{l_1 - 1} \, p(1-p)^{l_2 - 1} \cdots p(1-p)^{l_n - 1}.
Next we compute P(Y_n = l_n), as follows.
P(Y_n = l_n) = \sum_{l_1, \dots, l_{n-1}} P(Y_1 = l_1, \dots, Y_n = l_n)
= \sum_{l_1 = 1}^{\infty} \cdots \sum_{l_{n-1} = 1}^{\infty} (1-p)^{l_1 + \dots + l_n - n} p^n
= p(1-p)^{l_n - 1} \, p^{n-1} \sum_{l_1 = 1}^{\infty} (1-p)^{l_1 - 1} \cdots \sum_{l_{n-1} = 1}^{\infty} (1-p)^{l_{n-1} - 1}
= p(1-p)^{l_n - 1}.

This shows that Y_n is a geometric random variable with parameter p, and by the same computation so is each Y_i. Furthermore, we can write
P(Y_1 = l_1, Y_2 = l_2, \dots, Y_n = l_n) = P(Y_1 = l_1) \cdots P(Y_n = l_n).
Therefore Y_1, \dots, Y_n are independent.
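As an added numerical sanity check (with p and n chosen arbitrarily, not part of the exam solution), one can simulate the Bernoulli sequence and compare the empirical distribution of X_n with the PMF \binom{k-1}{n-1} p^n (1-p)^{k-n} derived in part 2.

```python
import random
from math import comb

p, n = 0.3, 3          # arbitrary success probability and success index
TRIALS = 200_000

def time_of_nth_success():
    """Toss Bernoulli(p) variables until the n-th success; return the time of that success."""
    successes, t = 0, 0
    while successes < n:
        t += 1
        successes += random.random() < p
    return t

samples = [time_of_nth_success() for _ in range(TRIALS)]

for k in range(n, n + 6):
    empirical = samples.count(k) / TRIALS
    exact = comb(k - 1, n - 1) * p**n * (1 - p)**(k - n)
    print(f"P(X_{n} = {k}):  empirical {empirical:.4f}   formula {exact:.4f}")
```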

4. (20 points) Suppose that every day can be described by either S (sunny) or R (rainy). The probability for a day to be sunny is 4/8 if the preceding day was rainy, 6/8 if the preceding day was sunny, but 7/8 if both of the preceding days were sunny.
1. Let S = {S, R}, and let X_n = the weather on the nth day. Is X_n a Markov chain?
2. Define a Markov chain on the state space S = {SS, SR, RS, RR} that describes the above weather model, and determine the corresponding transition matrix.
3. Given that the weekend (Saturday and Sunday) was sunny, calculate the probability that the next five days are SSRSS.

Solution.

1. No: given today's weather, tomorrow's weather is not independent of yesterday's weather (the chance of sun depends on whether the two preceding days were both sunny), so X_n is not a Markov chain on S = {S, R}.

2. Let the state be the weather on the two most recent days (yesterday, today). [State diagram of the chain on {SS, SR, RS, RR} omitted.] Ordering the states as SS, SR, RS, RR, the transition matrix is
P = \begin{pmatrix} 7/8 & 1/8 & 0 & 0 \\ 0 & 0 & 4/8 & 4/8 \\ 6/8 & 2/8 & 0 & 0 \\ 0 & 0 & 4/8 & 4/8 \end{pmatrix}.

3. P(SSRSS \mid SS) = P(X_1 = SS, X_2 = SS, X_3 = SR, X_4 = RS, X_5 = SS \mid X_0 = SS)
= \frac{7}{8} \cdot \frac{7}{8} \cdot \frac{1}{8} \cdot \frac{4}{8} \cdot \frac{6}{8} = \frac{1176}{32768} \approx 0.036.
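The number in part 3 is just a product of entries of the transition matrix, so it is easy to recompute. The short script below is added for verification only (the state list, matrix layout, and function name are simply one way to encode the chain): it walks the state sequence SS, SS, SS, SR, RS, SS and multiplies the corresponding transition probabilities.

```python
# Transition matrix on states (SS, SR, RS, RR); rows = current state, columns = next state.
states = ["SS", "SR", "RS", "RR"]
P = [
    [7/8, 1/8, 0,   0  ],   # from SS
    [0,   0,   4/8, 4/8],   # from SR
    [6/8, 2/8, 0,   0  ],   # from RS
    [0,   0,   4/8, 4/8],   # from RR
]
idx = {s: i for i, s in enumerate(states)}

def path_probability(path):
    """Probability of following the given sequence of states, starting from path[0]."""
    prob = 1.0
    for a, b in zip(path, path[1:]):
        prob *= P[idx[a]][idx[b]]
    return prob

# A sunny weekend means starting in SS; the next five days SSRSS visit the states below.
print(path_probability(["SS", "SS", "SS", "SR", "RS", "SS"]))   # ~0.0359
```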

5. (10 points) How often do you have to roll a fair die, on average, until you have seen all possible values? (Hint: Let X_n denote the number of different values you have seen at time n.)

Solution. Let T = inf{n : X_n = 6} be the hitting time of the state 6, and let t_i = E_i T, where E_i denotes the expectation with respect to the conditional probability law P_i(A) := P(A \mid X_0 = i). From state i the chain stays at i with probability i/6 and moves to i + 1 with probability (6 - i)/6. [Transition diagram of the chain on states 0, 1, \dots, 6 omitted.] Conditioning on the first step, and using the Markov property (after one step the expected remaining time is t_i if X_1 = i and t_{i+1} if X_1 = i + 1),
t_i = E_i(T) = 1 + t_i \, P_i(X_1 = i) + t_{i+1} \, P_i(X_1 = i + 1) = 1 + \frac{i}{6} t_i + \frac{6-i}{6} t_{i+1}.
Therefore we have
t_0 = 1 + t_1
t_1 = 1 + (1/6) t_1 + (5/6) t_2
t_2 = 1 + (2/6) t_2 + (4/6) t_3
t_3 = 1 + (3/6) t_3 + (3/6) t_4
t_4 = 1 + (4/6) t_4 + (2/6) t_5
t_5 = 1 + (5/6) t_5 + (1/6) t_6
t_6 = 0.
Solving this system gives t_0 = 14.7.
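As a cross-check (added here, not part of the exam solution), each equation rearranges to t_i = 6/(6 - i) + t_{i+1}, so the system can be unwound backwards from t_6 = 0; this reproduces t_0 = 14.7.

```python
from fractions import Fraction

# t_i = expected number of additional rolls when i distinct values have been seen; t_6 = 0.
t = Fraction(0)
for i in range(5, -1, -1):
    # t_i = 1 + (i/6) t_i + ((6 - i)/6) t_{i+1}  rearranges to  t_i = 6/(6 - i) + t_{i+1}.
    t = Fraction(6, 6 - i) + t

print(t, "=", float(t))   # 147/10 = 14.7
```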

6. (20 points) Consider the following dice game. A pair of dice is rolled. If the sum is 7 then the game ends and you win 0. If the sum is not 7, then you have the option of either stopping the game and receiving an amount equal to that sum, or starting over again. For each value of i, i = 1, \dots, 12, find your expected return if you employ the strategy of stopping the first time that a value at least as large as i appears. What value of i leads to the largest expected return? (Hint: Let X_i denote the return when you use the critical value i. To compute EX_i, condition on the first sum.)

Solution. Let E_i G denote the expected gain when our strategy is to stop as soon as the sum is at least i, where the possible i's are 2, \dots, 12. Let's calculate E_i G by conditioning on the first sum S:
E_i G = E_i[G \mid S < i] \, P(S < i) + E_i[G \mid S \ge i] \, P(S \ge i)
= \sum_{k=2, k \ne 7}^{i-1} E_i[G \mid S = k] \, P(S = k) + \sum_{k=i, k \ne 7}^{12} E_i[G \mid S = k] \, P(S = k)
= E_i[G] \sum_{k=2, k \ne 7}^{i-1} P(S = k) + \sum_{k=i, k \ne 7}^{12} k \, P(S = k),
where we exclude k = 7 because E_i[G \mid S = k] = 0 when k = 7. We can solve this equation for E_i G:
E_i G = \frac{\sum_{k=i, k \ne 7}^{12} k \, P(S = k)}{1 - \sum_{k=2, k \ne 7}^{i-1} P(S = k)}.
The numerator and the denominator are both finite sums, and P(S = k) is easy to calculate, so the numerical value of E_i G is computable. The result is that E_i G attains its largest value when i = 7 and i = 8 (these two critical values describe the same strategy, since a first sum of 7 ends the game).
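Since P(S = k) = (6 - |k - 7|)/36, the closed-form expression for E_i G is easy to evaluate for every critical value. The sketch below (added for illustration, not part of the exam) does so with exact rational arithmetic and confirms that the maximum, 20/3 ≈ 6.67, occurs at i = 7 and i = 8.

```python
from fractions import Fraction

# P(S = k) for the sum of two fair dice, k = 2, ..., 12.
pS = {k: Fraction(6 - abs(k - 7), 36) for k in range(2, 13)}

def expected_return(i):
    """E_i G for the strategy: stop the first time the sum is >= i (a first sum of 7 ends the game with 0)."""
    numer = sum(k * pS[k] for k in range(i, 13) if k != 7)
    denom = 1 - sum(pS[k] for k in range(2, i) if k != 7)
    return numer / denom

for i in range(2, 13):
    print(i, expected_return(i), float(expected_return(i)))
# The output peaks at i = 7 and i = 8, where E_i G = 20/3.
```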

7. (20 points) The number of accidents that a person has in a given year is a Poisson random variable with mean λ. However, suppose that the value of λ varies from person to person. Assume the proportion of the population having a value of λ less than x is equal to 1 - e^{-x}. If a person is chosen at random, what is the probability that he will have
1. 0 accidents in a year;
2. 0 accidents in a year, given he had no accidents the preceding year?
(Hint: instead of P(λ < x) = 1 - e^{-x}, first solve the problem for an easier case where P(λ = 2) = 0.3 and P(λ = 2.5) = 0.7.)

Solution.

1. Let L denote the number of accidents of this person. We don't know the mean λ associated to this person, but given λ = x, L is a Poisson random variable N with mean x. Therefore,
P(L = 0 \mid λ = x) = P(N = 0) = e^{-x}.   (0.1)
By the total probability formula for the continuous random variable λ,
P(L = 0) = \int_0^{\infty} P(L = 0 \mid λ = x) \, p_λ(x) \, dx,
where p_λ(x) is the density of λ. Since P(λ < x) = 1 - e^{-x}, λ is an exponential random variable with mean 1, so p_λ(x) = e^{-x}. We also know that given λ = x, N is a Poisson random variable with mean x, therefore
P(L = 0) = \int_0^{\infty} e^{-x} e^{-x} \, dx = 0.5.

2. Define two events A and B as follows:
A := {this driver will not have an accident this year},
B := {this driver didn't have any accident last year}.
Notice that P(A \mid λ = x) = P(B \mid λ = x) = e^{-x}, as in (0.1). Then
P(A \mid B) = \frac{P(A \cap B)}{P(B)}.
Observe that A and B are not independent (if somebody had 10 accidents last year, he probably has a larger λ, which means he is likely to have a few accidents this year too); however, conditionally on λ they are independent (compare this with Example 1.21 on page 37 of your text). Therefore
P(A \cap B) = \int_0^{\infty} P(A \cap B \mid λ = x) \, p_λ(x) \, dx = \int_0^{\infty} P(A \mid λ = x) \, P(B \mid λ = x) \, p_λ(x) \, dx = \int_0^{\infty} e^{-3x} \, dx = \frac{1}{3}.
Similarly, P(B) = 1/2 (this is exactly part 1), so the final answer is
P(A \mid B) = \frac{1/3}{1/2} = \frac{2}{3}.
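Both answers are easy to confirm by simulating the two-stage experiment: draw a rate λ from the Exp(1) population, then, given λ, let each year be accident-free with probability e^{-λ}, independently across years. The sketch below is an added illustration, not part of the exam solution.

```python
import random
from math import exp

TRIALS = 500_000
none_this_year = 0
none_last_year = 0
none_both_years = 0

for _ in range(TRIALS):
    lam = random.expovariate(1.0)    # this person's accident rate; Exp(1) across the population
    # Given lambda, "no accidents in a year" happens with probability e^{-lambda},
    # independently for the two years.
    this_year_clear = random.random() < exp(-lam)
    last_year_clear = random.random() < exp(-lam)
    none_this_year += this_year_clear
    none_last_year += last_year_clear
    none_both_years += this_year_clear and last_year_clear

print("P(0 accidents this year)            ~", none_this_year / TRIALS, "(theory 1/2)")
print("P(0 accidents | 0 accidents last yr) ~", none_both_years / none_last_year, "(theory 2/3)")
```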