

CHAPTER 6

Solution to Problem 6.1. (a) The random variable R is binomial with parameters p and n. Hence,

$$p_R(r) = \binom{n}{r} (1-p)^{n-r} p^r, \quad \text{for } r = 0, 1, \ldots, n,$$

E[R] = np, and var(R) = np(1 − p).

(b) Let A be the event that the first item to be loaded ends up being the only one on its truck. This event is the union of two disjoint events: (i) the first item is placed on the red truck and the remaining n − 1 are placed on the green truck, and (ii) the first item is placed on the green truck and the remaining n − 1 are placed on the red truck. Thus,

$$P(A) = p(1-p)^{n-1} + (1-p)p^{n-1}.$$

(c) Let B be the event that at least one truck ends up with a total of exactly one package. The event B occurs if exactly one or both of the trucks end up with exactly 1 package, so

$$P(B) = \begin{cases} 1, & \text{if } n = 1, \\ 2p(1-p), & \text{if } n = 2, \\ \binom{n}{1}(1-p)^{n-1}p + \binom{n}{n-1}p^{n-1}(1-p), & \text{if } n = 3, 4, 5, \ldots \end{cases}$$

(d) Let D = R − G = R − (n − R) = 2R − n. We have E[D] = 2E[R] − n = 2np − n. Since D = 2R − n, where n is a constant,

$$\text{var}(D) = 4\,\text{var}(R) = 4np(1-p).$$

(e) Let C be the event that each of the first 2 packages is loaded onto the red truck. Given that C occurred, the random variable R becomes

$$2 + X_3 + X_4 + \cdots + X_n.$$

Hence,

$$E[R \mid C] = E[2 + X_3 + X_4 + \cdots + X_n] = 2 + (n-2)E[X_i] = 2 + (n-2)p.$$

Similarly, the conditional variance of R is

$$\text{var}(R \mid C) = \text{var}(2 + X_3 + X_4 + \cdots + X_n) = (n-2)\,\text{var}(X_i) = (n-2)p(1-p).$$
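The closed-form answers in parts (c) and (d) are easy to check by exhaustive enumeration. The following sketch (ours, not part of the original solutions; all Python names are made up) sums over all 2^n loadings for n = 5, p = 0.3 and compares against the formulas above:

```python
from itertools import product
from math import isclose

def enumerate_loadings(n, p):
    """Enumerate all 2^n red/green assignments; x_i = 1 means item i goes red."""
    mean_d = 2 * n * p - n          # E[D] = 2np - n, from part (d)
    p_b = 0.0                       # probability some truck holds exactly one package
    var_d = 0.0
    for x in product([0, 1], repeat=n):
        prob = 1.0
        for xi in x:
            prob *= p if xi else (1 - p)
        r = sum(x)                  # packages on the red truck
        d = 2 * r - n               # D = R - G
        var_d += prob * (d - mean_d) ** 2
        if r == 1 or n - r == 1:    # disjoint cases when n >= 3
            p_b += prob
    return p_b, var_d

n, p = 5, 0.3
p_b, var_d = enumerate_loadings(n, p)
p_b_formula = n * p * (1 - p) ** (n - 1) + n * p ** (n - 1) * (1 - p)
var_d_formula = 4 * n * p * (1 - p)
print(p_b, p_b_formula, var_d, var_d_formula)
```

The enumeration and the closed forms agree to floating-point precision.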

Finally, given that the first two packages are loaded onto the red truck, the probability that a total of r packages are loaded onto the red truck is equal to the probability that r − 2 of the remaining n − 2 packages go to the red truck:

$$p_{R\mid C}(r) = \binom{n-2}{r-2}(1-p)^{n-r}p^{r-2}, \quad \text{for } r = 2, \ldots, n.$$

Solution to Problem 6.2. (a) Failed quizzes are a Bernoulli process with parameter p = 1/4. The desired probability is given by the binomial formula:

$$\binom{6}{2} p^2 (1-p)^4 = \frac{6!}{2!\,4!}\left(\frac{1}{4}\right)^2\left(\frac{3}{4}\right)^4.$$

(b) The expected number of quizzes up to the third failure is the expected value of a Pascal random variable of order three, with parameter 1/4, which is 3 · 4 = 12. Subtracting the number of failures, we have that the expected number of quizzes that Dave will pass is 12 − 3 = 9.

(c) The event of interest is the intersection of the following three independent events:

A: there is exactly one failure in the first seven quizzes.
B: quiz eight is a failure.
C: quiz nine is a failure.

We have

$$P(A) = \binom{7}{1}\left(\frac{1}{4}\right)\left(\frac{3}{4}\right)^6, \qquad P(B) = P(C) = \frac{1}{4},$$

so the desired probability is

$$P(A \cap B \cap C) = 7\left(\frac{1}{4}\right)^3\left(\frac{3}{4}\right)^6.$$

(d) Let B be the event that Dave fails two quizzes in a row before he passes two quizzes in a row. Let us use F and S to indicate quizzes that he has failed or passed, respectively. We then have

$$P(B) = P(FF \cup SFF \cup FSFF \cup SFSFF \cup FSFSFF \cup SFSFSFF \cup \cdots)$$
$$= P(FF) + P(SFF) + P(FSFF) + P(SFSFF) + P(FSFSFF) + P(SFSFSFF) + \cdots$$
$$= \left(\frac{1}{4}\right)^2\left[1 + \frac{1}{4}\cdot\frac{3}{4} + \left(\frac{1}{4}\cdot\frac{3}{4}\right)^2 + \cdots\right] + \frac{3}{4}\left(\frac{1}{4}\right)^2\left[1 + \frac{1}{4}\cdot\frac{3}{4} + \left(\frac{1}{4}\cdot\frac{3}{4}\right)^2 + \cdots\right]$$
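The two geometric series in part (d) can be summed exactly with rational arithmetic. A quick independent check (ours, not part of the original text):

```python
from fractions import Fraction

fail = Fraction(1, 4)        # probability of failing a quiz
succ = 1 - fail              # probability of passing
ratio = fail * succ          # common ratio of both geometric series
# First series starts at P(FF), second at P(SFF):
p_event = fail**2 / (1 - ratio) + succ * fail**2 / (1 - ratio)
print(p_event)               # → 7/52
```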

Therefore, P(B) is the sum of two infinite geometric series, and

$$P(B) = \frac{(1/4)^2}{1 - \frac{1}{4}\cdot\frac{3}{4}} + \frac{\frac{3}{4}(1/4)^2}{1 - \frac{1}{4}\cdot\frac{3}{4}} = \frac{7}{52}.$$

Solution to Problem 6.3. The answers to these questions are found by considering suitable Bernoulli processes and using the formulas of Section 6.1. Depending on the specific question, however, a different Bernoulli process may be appropriate. In some cases, we associate trials with slots. In other cases, it is convenient to associate trials with busy slots.

(a) During each slot, the probability of a task from user 1 is given by p₁ = p_B p_{1|B} = (5/6) · (2/5) = 1/3. Tasks from user 1 form a Bernoulli process and

$$P(\text{first user 1 task occurs in slot 4}) = p_1(1-p_1)^3 = \frac{1}{3}\left(\frac{2}{3}\right)^3.$$

(b) This is the probability that slot 11 was busy and slot 12 was idle, given that 5 out of the 10 first slots were idle. Because of the fresh-start property, the conditioning information is immaterial, and the desired probability is

$$p_B \cdot p_I = \frac{5}{6}\cdot\frac{1}{6}.$$

(c) Each slot contains a task from user 1 with probability p₁ = 1/3, independent of other slots. The time of the 5th task from user 1 is a Pascal random variable of order 5, with parameter p₁ = 1/3. Its mean is given by

$$\frac{5}{p_1} = \frac{5}{1/3} = 15.$$

(d) Each busy slot contains a task from user 1 with probability p_{1|B} = 2/5, independent of other slots. The random variable of interest is a Pascal random variable of order 5, with parameter p_{1|B} = 2/5. Its mean is

$$\frac{5}{p_{1|B}} = \frac{5}{2/5} = \frac{25}{2}.$$

(e) The number T of tasks from user 2 until the 5th task from user 1 is the same as the number B of busy slots until the 5th task from user 1, minus 5. The number of busy slots ("trials") until the 5th task from user 1 ("success") is a Pascal random variable of order 5, with parameter p_{1|B} = 2/5. Thus,

$$p_B(t) = \binom{t-1}{4}\left(\frac{2}{5}\right)^5\left(\frac{3}{5}\right)^{t-5}, \quad t = 5, 6, \ldots$$
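The Pascal facts used in parts (c)-(e) can be confirmed numerically by summing the PMF directly. A small independent sketch (ours) for order 5 and parameter 2/5, whose mean should be 5/(2/5) = 12.5:

```python
from math import comb

p, k = 2 / 5, 5

def pascal_pmf(t):
    """P(the k-th success occurs on trial t) for a Bernoulli(p) process."""
    return comb(t - 1, k - 1) * p**k * (1 - p) ** (t - k)

# The tail beyond t = 400 is negligible, so a finite sum suffices.
total = sum(pascal_pmf(t) for t in range(k, 400))
mean = sum(t * pascal_pmf(t) for t in range(k, 400))
print(total, mean)   # close to 1.0 and k/p = 12.5
```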

Since T = B − 5, we have p_T(t) = p_B(t + 5), and we obtain

$$p_T(t) = \binom{t+4}{4}\left(\frac{2}{5}\right)^5\left(\frac{3}{5}\right)^{t}, \quad t = 0, 1, \ldots$$

Using the formulas for the mean and the variance of the Pascal random variable B, we obtain

$$E[T] = E[B] - 5 = \frac{25}{2} - 5 = 7.5,$$

and

$$\text{var}(T) = \text{var}(B) = \frac{5\big(1 - (2/5)\big)}{(2/5)^2} = \frac{75}{4}.$$

Solution to Problem 6.8. The total number of accidents between 8 am and 11 am is the sum of two independent Poisson random variables with parameters 5 and 3 · 2 = 6, respectively. Since the sum of independent Poisson random variables is also Poisson, the total number of accidents has a Poisson PMF with parameter 5 + 6 = 11.

Solution to Problem 6.9. As long as the pair of players is waiting, all five courts are occupied by other players. When all five courts are occupied, the time until a court is freed up is exponentially distributed with mean 40/5 = 8 minutes. For our pair of players to get a court, a court must be freed up k + 1 times. Thus, the expected waiting time is 8(k + 1).

Solution to Problem 6.10. (a) This is the probability of no arrivals in 2 hours. It is given by

$$P(0, 2) = e^{-0.6 \cdot 2} = 0.301.$$

For an alternative solution, this is the probability that the first arrival comes after 2 hours:

$$P(T > 2) = \int_2^\infty f_T(t)\,dt = \int_2^\infty 0.6 e^{-0.6t}\,dt = e^{-0.6 \cdot 2} = 0.301.$$

(b) This is the probability of zero arrivals between time 0 and 2, and of at least one arrival between time 2 and 5. Since these two intervals are disjoint, the desired probability is the product of the probabilities of these two events, which is given by

$$P(0, 2)\big(1 - P(0, 3)\big) = e^{-0.6 \cdot 2}\big(1 - e^{-0.6 \cdot 3}\big) = 0.251.$$

For an alternative solution, the event of interest can be written as {2 ≤ T ≤ 5}, and its probability is

$$\int_2^5 f_T(t)\,dt = \int_2^5 0.6 e^{-0.6t}\,dt = e^{-0.6 \cdot 2} - e^{-0.6 \cdot 5} = 0.251.$$

(c) If he catches at least two fish, he must have fished for exactly two hours. Hence, the desired probability is equal to the probability that the number of fish caught in the first two hours is at least two, i.e.,

$$\sum_{k=2}^{\infty} P(k, 2) = 1 - P(0, 2) - P(1, 2) = 1 - e^{-0.6 \cdot 2} - (0.6 \cdot 2)e^{-0.6 \cdot 2} = 0.337.$$
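The numerical values 0.301, 0.251, and 0.337 in Problem 6.10 can be reproduced directly, including the Erlang-integral variant of part (c). An independent check (ours, using a plain Riemann sum rather than library quadrature):

```python
from math import exp

lam = 0.6
p_a = exp(-lam * 2)                                   # part (a): 0.301
p_b = exp(-lam * 2) - exp(-lam * 5)                   # part (b): 0.251
p_c = 1 - exp(-lam * 2) - (lam * 2) * exp(-lam * 2)   # part (c): 0.337

# Part (c) again, via the Erlang-2 density of the second arrival time,
# integrated over [0, 2] with a left Riemann sum:
dy, steps = 1e-5, 200_000
integral = 0.0
for i in range(steps):
    y = i * dy
    integral += lam**2 * y * exp(-lam * y) * dy
print(round(p_a, 3), round(p_b, 3), round(p_c, 3), round(integral, 3))
```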

For an alternative approach, note that the event of interest occurs if and only if the time Y₂ of the second arrival is less than or equal to 2. Hence, the desired probability is

$$P(Y_2 \le 2) = \int_0^2 f_{Y_2}(y)\,dy = \int_0^2 (0.6)^2\, y\, e^{-0.6y}\,dy.$$

This integral can be evaluated by integrating by parts, but this is more tedious than the first approach.

(d) The expected number of fish caught is equal to the expected number of fish caught during the first two hours (which is 2λ = 2 · 0.6 = 1.2), plus the expectation of the number N of fish caught after the first two hours. We have N = 0 if he stops fishing at two hours, and N = 1 if he continues beyond the two hours. The event {N = 1} occurs if and only if no fish are caught in the first two hours, so that E[N] = P(N = 1) = P(0, 2) = 0.301. Thus, the expected number of fish caught is 1.2 + 0.301 = 1.501.

(e) Given that he has been fishing for 4 hours, the future fishing time is the time until the first fish is caught. By the memorylessness property of the Poisson process, the future time is exponential, with mean 1/λ. Hence, the expected total fishing time is 4 + (1/0.6) = 5.667.

Solution to Problem 6.11. We note that the process of departures of customers who have bought a book is obtained by splitting the Poisson process of customer departures, and is itself a Poisson process, with rate pλ.

(a) This is the time until the first customer departure in the split Poisson process. It is therefore exponentially distributed with parameter pλ.

(b) This is the probability of no customers in the split Poisson process during an hour, and using the result of part (a), equals e^{−pλ}.

(c) This is the expected number of customers in the split Poisson process during an hour, and is equal to pλ.

Solution to Problem 6.12. Let X be the number of different types of pizza ordered. Let X_i be the random variable defined by

$$X_i = \begin{cases} 1, & \text{if a type } i \text{ pizza is ordered by at least one customer,} \\ 0, & \text{otherwise.} \end{cases}$$

We have X = X₁ + ⋯ + X_n, and E[X] = nE[X₁]. We can think of the customers arriving as a Poisson process, with each customer independently choosing whether to order a type 1 pizza (this happens with probability 1/n) or not. This is the situation encountered in splitting of Poisson processes, and the number of type 1 pizza orders, denoted Y₁, is a Poisson random variable with parameter λ/n. We have

$$E[X_1] = P(Y_1 > 0) = 1 - P(Y_1 = 0) = 1 - e^{-\lambda/n},$$

so that

$$E[X] = nE[X_1] = n\big(1 - e^{-\lambda/n}\big).$$
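The splitting argument in Problem 6.12 is easy to test by simulation: draw each type's order count as an independent Poisson(λ/n) and count the types that appear. A seeded Monte Carlo sketch (ours, not part of the original solutions; the values of n and λ are illustrative, and the Poisson sampler uses Knuth's multiplication method):

```python
import random
from math import exp

def poisson(lam, rng):
    """Sample Poisson(lam) via Knuth's multiplication method (fine for small lam)."""
    threshold, k, prod = exp(-lam), 0, 1.0
    while True:
        prod *= rng.random()
        if prod <= threshold:
            return k
        k += 1

rng = random.Random(0)
n, lam, trials = 5, 4.0, 20000       # n pizza types, overall order rate lam (assumed values)
total_types = 0
for _ in range(trials):
    # Splitting: the number of orders of each type is an independent Poisson(lam / n).
    total_types += sum(poisson(lam / n, rng) > 0 for _ in range(n))
mc_estimate = total_types / trials
exact = n * (1 - exp(-lam / n))      # E[X] = n(1 - e^{-lam/n}) from the solution
print(mc_estimate, exact)
```

With 20,000 seeded trials the Monte Carlo estimate lands within a few hundredths of the closed form.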

Solution to Problem 6.13. (a) Let R be the total number of messages received during an interval of duration t. Note that R is a Poisson random variable with arrival rate λ_A + λ_B. Therefore, the probability that exactly nine messages are received is

$$P(R = 9) = \frac{\big((\lambda_A + \lambda_B)t\big)^9\, e^{-(\lambda_A+\lambda_B)t}}{9!}.$$

(b) Let R be defined as in part (a), and let W_i be the number of words in the ith message. Then,

$$N = W_1 + W_2 + \cdots + W_R,$$

which is a sum of a random number of random variables. Thus,

$$E[N] = E[W]E[R] = \left(1\cdot\frac{2}{6} + 2\cdot\frac{3}{6} + 3\cdot\frac{1}{6}\right)(\lambda_A+\lambda_B)t = \frac{11(\lambda_A+\lambda_B)t}{6}.$$

(c) Three-word messages arrive from transmitter A in a Poisson manner, with average rate λ_A p_W(3) = λ_A/6. Therefore, the random variable of interest is Erlang of order 8, and its PDF is given by

$$f(x) = \frac{(\lambda_A/6)^8\, x^7\, e^{-\lambda_A x/6}}{7!}.$$

(d) Every message originates from either transmitter A or B, and can be viewed as an independent Bernoulli trial. Each message has probability λ_A/(λ_A + λ_B) of originating from transmitter A (view this as a "success"). Thus, the number of messages from transmitter A (out of the next twelve) is a binomial random variable, and the desired probability is equal to

$$\binom{12}{8}\left(\frac{\lambda_A}{\lambda_A+\lambda_B}\right)^8\left(\frac{\lambda_B}{\lambda_A+\lambda_B}\right)^4.$$

Solution to Problem 6.14. (a) Let X be the time until the first bulb failure. Let A (respectively, B) be the event that the first bulb is of type A (respectively, B). Since the two bulb types are equally likely, the total expectation theorem yields

$$E[X] = E[X \mid A]P(A) + E[X \mid B]P(B) = 1\cdot\frac{1}{2} + \frac{1}{3}\cdot\frac{1}{2} = \frac{2}{3}.$$

(b) Let D be the event of no bulb failures before time t. Using the total probability theorem, and the exponential distributions for bulbs of the two types, we obtain

$$P(D) = P(D \mid A)P(A) + P(D \mid B)P(B) = \frac{1}{2}e^{-t} + \frac{1}{2}e^{-3t}.$$
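The expectations in Problem 6.13(b) and Problem 6.14(a)/(d) are small rational computations, so they can be verified exactly. An independent check (ours, not part of the original text):

```python
from fractions import Fraction as F

# Problem 6.13(b): E[W] under p_W(1) = 2/6, p_W(2) = 3/6, p_W(3) = 1/6.
ew = 1 * F(2, 6) + 2 * F(3, 6) + 3 * F(1, 6)

# Problem 6.14(a)/(d): equally likely exponential bulb types with rates 1 and 3.
ex = F(1, 2) * 1 + F(1, 2) * F(1, 3)       # E[X]   = 2/3
ex2 = F(1, 2) * 2 + F(1, 2) * F(2, 9)      # E[X^2] = 10/9
var_x = ex2 - ex**2
print(ew, ex, var_x)   # → 11/6 2/3 2/3
```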

(c) We have

$$P(A \mid D) = \frac{P(A \cap D)}{P(D)} = \frac{\tfrac{1}{2}e^{-t}}{\tfrac{1}{2}e^{-t} + \tfrac{1}{2}e^{-3t}} = \frac{1}{1 + e^{-2t}}.$$

(d) We first find E[X²]. We use the fact that the second moment of an exponential random variable T with parameter λ is equal to E[T²] = E[T]² + var(T) = 1/λ² + 1/λ² = 2/λ². Conditioning on the two possible types of the first bulb, we obtain

$$E[X^2] = E[X^2 \mid A]P(A) + E[X^2 \mid B]P(B) = \frac{1}{2}\cdot 2 + \frac{1}{2}\cdot\frac{2}{9} = \frac{10}{9}.$$

Finally, using the fact E[X] = 2/3 from part (a),

$$\text{var}(X) = E[X^2] - E[X]^2 = \frac{10}{9} - \frac{4}{9} = \frac{2}{3}.$$

(e) This is the probability that out of the first 11 bulbs, exactly 3 were of type A and that the 12th bulb was of type A. It is equal to

$$\binom{11}{3}\left(\frac{1}{2}\right)^{12}.$$

(f) This is the probability that out of the first 12 bulbs, exactly 4 were of type A, and is equal to

$$\binom{12}{4}\left(\frac{1}{2}\right)^{12}.$$

(g) The PDF of the time between failures is (e^{−x} + 3e^{−3x})/2, for x ≥ 0, and the associated transform is

$$\frac{1}{2}\left(\frac{1}{1-s} + \frac{3}{3-s}\right).$$

Since the times between successive failures are independent, the transform associated with the time until the 12th failure is given by

$$\left[\frac{1}{2}\left(\frac{1}{1-s} + \frac{3}{3-s}\right)\right]^{12}.$$

(h) Let Y be the total period of illumination provided by the first two type-B bulbs. This has an Erlang distribution of order 2, and its PDF is

$$f_Y(y) = 9ye^{-3y}, \quad y \ge 0.$$

Let T be the period of illumination provided by the first type-A bulb. Its PDF is

$$f_T(t) = e^{-t}, \quad t \ge 0.$$
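Parts (c), (e), and (f) can be verified mechanically: the two algebraic forms of P(A | D) agree, and the binomial counts evaluate directly. An independent check (ours):

```python
from math import comb, exp

# Part (c): the conditional probability in its two equivalent forms.
for t in (0.1, 0.5, 1.0, 2.0):
    lhs = exp(-t) / (exp(-t) + exp(-3 * t))
    rhs = 1 / (1 + exp(-2 * t))
    assert abs(lhs - rhs) < 1e-12

# Parts (e) and (f): binomial counts over 12 equally likely bulb types.
p_e = comb(11, 3) * 0.5**12
p_f = comb(12, 4) * 0.5**12
print(p_e, p_f)
```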

We are interested in the event T < Y. We have

$$P(T < Y \mid Y = y) = 1 - e^{-y}, \quad y \ge 0.$$

Thus,

$$P(T < Y) = \int_0^\infty f_Y(y)\,P(T < Y \mid Y = y)\,dy = \int_0^\infty 9ye^{-3y}\big(1 - e^{-y}\big)\,dy = \frac{7}{16},$$

as can be verified by carrying out the integration.

We now describe an alternative method for obtaining the answer. Let T_A be the period of illumination of the first type-A bulb. Let T_{B1} and T_{B2} be the periods of illumination provided by the first and second type-B bulbs, respectively. We are interested in the event {T_A < T_{B1} + T_{B2}}. We have

$$P(T_A < T_{B1} + T_{B2}) = P(T_A < T_{B1}) + P(T_A \ge T_{B1})\,P(T_A < T_{B1} + T_{B2} \mid T_A \ge T_{B1})$$
$$= \frac{1}{1+3} + P(T_A \ge T_{B1})\,P(T_A - T_{B1} < T_{B2} \mid T_A \ge T_{B1})$$
$$= \frac{1}{4} + \frac{3}{4}\,P(T_A - T_{B1} < T_{B2} \mid T_A \ge T_{B1}).$$

Given the event T_A ≥ T_{B1}, and using the memorylessness property of the exponential random variable T_A, the remaining time T_A − T_{B1} until the failure of the type-A bulb is exponentially distributed, so that

$$P(T_A - T_{B1} < T_{B2} \mid T_A \ge T_{B1}) = P(T_A < T_{B2}) = P(T_A < T_{B1}) = \frac{1}{4}.$$

Therefore,

$$P(T_A < T_{B1} + T_{B2}) = \frac{1}{4} + \frac{3}{4}\cdot\frac{1}{4} = \frac{7}{16}.$$

(i) Let V be the total period of illumination provided by type-B bulbs while the process is in operation. Let N be the number of light bulbs, out of the first 12, that are of type B. Let X_i be the period of illumination from the ith type-B bulb. We then have

$$V = X_1 + \cdots + X_N.$$

Note that N is a binomial random variable, with parameters n = 12 and p = 1/2, so that

$$E[N] = 6, \qquad \text{var}(N) = 12\cdot\frac{1}{2}\cdot\frac{1}{2} = 3.$$

Furthermore, E[X_i] = 1/3 and var(X_i) = 1/9. Using the formulas for the mean and variance of the sum of a random number of random variables, we obtain

$$E[V] = E[N]E[X_i] = 6\cdot\frac{1}{3} = 2,$$

and

$$\text{var}(V) = \text{var}(X_i)E[N] + E[X_i]^2\,\text{var}(N) = \frac{1}{9}\cdot 6 + \frac{1}{9}\cdot 3 = 1.$$
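The integral giving 7/16 can be confirmed numerically without the integration by parts. A quick independent check (ours; a plain Riemann sum, no library quadrature):

```python
from math import exp

# Integrate 9 y e^{-3y} (1 - e^{-y}) over [0, 40] with a left Riemann sum;
# the integrand is negligible beyond y = 40.
dy, steps = 1e-4, 400_000
total = 0.0
for i in range(steps):
    y = i * dy
    total += 9 * y * exp(-3 * y) * (1 - exp(-y)) * dy
print(total)   # close to 7/16 = 0.4375
```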

(j) Using the notation in parts (a)-(c), and the result of part (c), we have

$$E[T \mid D] = t + E[T - t \mid D \cap A]P(A \mid D) + E[T - t \mid D \cap B]P(B \mid D)$$
$$= t + 1\cdot\frac{1}{1+e^{-2t}} + \frac{1}{3}\cdot\frac{e^{-2t}}{1+e^{-2t}} = t + \frac{1}{3} + \frac{2}{3\big(1+e^{-2t}\big)}.$$

Solution to Problem 6.15. (a) The total arrival process corresponds to the merging of two independent Poisson processes, and is therefore Poisson with rate λ = λ_A + λ_B = 7. Thus, the number N of jobs that arrive in a given three-minute interval is a Poisson random variable, with E[N] = 3λ = 21, var(N) = 21, and PMF

$$p_N(n) = \frac{(21)^n e^{-21}}{n!}, \quad n = 0, 1, 2, \ldots$$

(b) Each of these 10 jobs has probability λ_A/(λ_A + λ_B) = 3/7 of being of type A, independently of the others. Thus, the binomial PMF applies and the desired probability is equal to

$$\binom{10}{3}\left(\frac{3}{7}\right)^3\left(\frac{4}{7}\right)^7.$$

(c) Each future arrival is of type A with probability λ_A/(λ_A + λ_B) = 3/7, independently of other arrivals. Thus, the number K of arrivals until the first type A arrival is geometric with parameter 3/7. The number of type B arrivals before the first type A arrival is equal to K − 1, and its PMF is similar to a geometric, except that it is shifted by one unit to the left. In particular,

$$p_{K-1}(k) = \frac{3}{7}\left(\frac{4}{7}\right)^k, \quad k = 0, 1, 2, \ldots$$

(d) The fact that at time 0 there were two type A jobs in the system simply states that there were exactly two type A arrivals between time −1 and time 0. Let X and Y be the arrival times of these two jobs. Consider splitting the interval [−1, 0] into many small time slots of length δ. Since each time instant is equally likely to contain an arrival, and since the arrival times are independent, it follows that X and Y are independent uniform random variables. We are interested in the PDF of Z = max{X, Y}. We first find the CDF of Z. We have, for z ∈ [−1, 0],

$$P(Z \le z) = P(X \le z \text{ and } Y \le z) = (1 + z)^2.$$

By differentiating, we obtain

$$f_Z(z) = 2(1 + z), \quad -1 \le z \le 0.$$

(e) Let T be the arrival time of this type B job. We can express T in the form T = −K + X, where K is a nonnegative integer and X lies in [0, 1]. We claim that X
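The density f_Z(z) = 2(1 + z) in Problem 6.15(d) can be checked by simulating the maximum of two independent uniforms on [−1, 0]. A seeded sketch (ours, not part of the original solutions):

```python
import random

rng = random.Random(1)
trials = 100_000
z_values = [max(rng.uniform(-1, 0), rng.uniform(-1, 0)) for _ in range(trials)]
# Compare the empirical CDF with (1 + z)^2 at a few points:
for z in (-0.75, -0.5, -0.25):
    empirical_cdf = sum(v <= z for v in z_values) / trials
    assert abs(empirical_cdf - (1 + z) ** 2) < 0.01
print("empirical CDF matches (1 + z)^2")
```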

clear; clc;

%% (b)
tau = 0.5/(60*60);            % half a second, converted to hours
lamda = 0:1:60;               % arrival rate, per hour
ave_x = lamda .* (1 - exp(-2 * lamda * tau));   % given formula (iv)
figure(1);
plot(lamda, ave_x)

%% (c)
N = 10000;
VEC_TEW = zeros(6, N+1);      % cumulative arrival times, east-west process
VEC_TNS = zeros(6, N+1);      % cumulative arrival times, north-south process
para = [1/10, 1/20, 1/30, 1/40, 1/50, 1/60];    % mean interarrival times, in hours
for j = 1:6
    for i = 1:10000
        VEC_TEW(j, i+1) = VEC_TEW(j, i) + exprnd(para(j));
        VEC_TNS(j, i+1) = VEC_TNS(j, i) + exprnd(para(j));
    end
end

%% c(1) and c(2)
figure(3);
offset1 = [1, 4, 7, 10, 13, 16];   % vertical plot offsets (assumed; digits lost in transcription)
offset2 = [2, 5, 8, 11, 14, 17];
for a = 1:6
    plot(VEC_TEW(a, :), offset1(a) * ones(1, N+1), 'xr');
    hold on;
    plot(VEC_TNS(a, :), offset2(a) * ones(1, N+1), 'xb');
    hold on;
end
hold off;
xlim([0 1]);

%% c(3) and c(4)
counter = zeros(6, 1);
for a = 1:6
    for i = 1:N
        for ii = 1:N
            if abs(VEC_TEW(a, i) - VEC_TNS(a, ii)) <= 0.5/(60*60)
                counter(a, 1) = counter(a, 1) + 1;
            end
        end
    end
end

%% c(5) and c(6)
figure(4);
hold on;
parameter = [10, 20, 30, 40, 50, 60];   % arrival rates, per hour
for a = 1:6
    plot(parameter(a), parameter(a) * (1 - exp(-2 * parameter(a) * tau)), 'or')
    plot(parameter(a), counter(a, 1) / VEC_TEW(a, N+1), 'xr')
end