CHAPTER 6
Solution to Problem 6.1. (a) The random variable R is binomial with parameters p and n. Hence,

p_R(r) = C(n, r) (1 - p)^(n-r) p^r, for r = 0, 1, ..., n,

E[R] = np, and var(R) = np(1 - p).

(b) Let A be the event that the first item to be loaded ends up being the only one on its truck. This event is the union of two disjoint events: (i) the first item is placed on the red truck and the remaining n - 1 items are placed on the green truck, and (ii) the first item is placed on the green truck and the remaining n - 1 items are placed on the red truck. Thus,

P(A) = p(1 - p)^(n-1) + (1 - p)p^(n-1).

(c) Let B be the event that at least one truck ends up with a total of exactly one package. The event B occurs if exactly one or both of the trucks end up with exactly 1 package, so

P(B) = 1, if n = 1,
     = 2p(1 - p), if n = 2,
     = n(1 - p)^(n-1) p + n p^(n-1) (1 - p), if n = 3, 4, 5, ....

(d) Let D = R - G = R - (n - R) = 2R - n. We have E[D] = 2E[R] - n = 2np - n. Since D = 2R - n, where n is a constant,

var(D) = 4 var(R) = 4np(1 - p).

(e) Let C be the event that each of the first 2 packages is loaded onto the red truck. Given that C occurred, the random variable R becomes

2 + X_3 + X_4 + ... + X_n.

Hence,

E[R | C] = E[2 + X_3 + X_4 + ... + X_n] = 2 + (n - 2)E[X_i] = 2 + (n - 2)p.

Similarly, the conditional variance of R is

var(R | C) = var(2 + X_3 + X_4 + ... + X_n) = (n - 2) var(X_i) = (n - 2)p(1 - p).
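As an illustrative cross-check (not part of the original solution), the closed-form answers of parts (a)-(d) can be verified by brute-force enumeration of all 2^n loadings; the Python sketch below uses arbitrary test values n = 5, p = 0.3.

```python
from itertools import product
from math import isclose

def loading_stats(n, p):
    """Enumerate all 2^n red/green assignments; bit 1 means 'red truck'."""
    ER = ER2 = PA = PB = 0.0
    for bits in product([0, 1], repeat=n):
        r = sum(bits)                      # packages on the red truck
        prob = p**r * (1 - p)**(n - r)     # probability of this loading
        ER += prob * r
        ER2 += prob * r * r
        if (bits[0] == 1 and r == 1) or (bits[0] == 0 and r == n - 1):
            PA += prob                     # first item ends up alone on its truck
        if r == 1 or n - r == 1:
            PB += prob                     # some truck holds exactly one package
    return ER, ER2, PA, PB

n, p = 5, 0.3
ER, ER2, PA, PB = loading_stats(n, p)
var_R = ER2 - ER**2                        # also gives var(D) = 4 var(R)
assert isclose(ER, n * p)                                        # E[R] = np
assert isclose(var_R, n * p * (1 - p))                           # var(R) = np(1-p)
assert isclose(PA, p * (1 - p)**(n - 1) + (1 - p) * p**(n - 1))  # part (b)
assert isclose(PB, n * (1 - p)**(n - 1) * p + n * p**(n - 1) * (1 - p))  # (c), n >= 3
```

The enumeration is exponential in n, so it is only a sanity check for small n, which is exactly what is needed here.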
Finally, given that the first two packages are loaded onto the red truck, the probability that a total of r packages are loaded onto the red truck is equal to the probability that r - 2 of the remaining n - 2 packages go to the red truck:

p_{R|C}(r) = C(n - 2, r - 2) (1 - p)^(n-r) p^(r-2), for r = 2, 3, ..., n.

Solution to Problem 6.2. (a) Failed quizzes are a Bernoulli process with parameter p = 1/4. The desired probability is given by the binomial formula:

C(6, 2) p^2 (1 - p)^4 = (6! / (2! 4!)) (1/4)^2 (3/4)^4.

(b) The expected number of quizzes up to the third failure is the expected value of a Pascal random variable of order three, with parameter 1/4, which is 3 · 4 = 12. Subtracting the number of failures, we have that the expected number of quizzes that Dave will pass is 12 - 3 = 9.

(c) The event of interest is the intersection of the following three independent events:

A: there is exactly one failure in the first seven quizzes;
B: quiz eight is a failure;
C: quiz nine is a failure.

We have

P(A) = C(7, 1) (1/4) (3/4)^6, P(B) = P(C) = 1/4,

so the desired probability is

P(A ∩ B ∩ C) = 7 (1/4)^3 (3/4)^6.

(d) Let B be the event that Dave fails two quizzes in a row before he passes two quizzes in a row. Let us use F and S to indicate quizzes that he has failed or passed, respectively. We then have

P(B) = P(FF ∪ SFF ∪ FSFF ∪ SFSFF ∪ FSFSFF ∪ SFSFSFF ∪ ...)
     = P(FF) + P(SFF) + P(FSFF) + P(SFSFF) + P(FSFSFF) + P(SFSFSFF) + ...
     = (1/4)^2 + (3/4)(1/4)^2 + (1/4)(3/4)(1/4)^2 + (3/4)(1/4)(3/4)(1/4)^2 + ...
     = (1/4)^2 [1 + (3/4)(1/4) + ((3/4)(1/4))^2 + ...]
       + (3/4)(1/4)^2 [1 + (3/4)(1/4) + ((3/4)(1/4))^2 + ...].
Therefore, P(B) is the sum of two infinite geometric series, and

P(B) = (1/4)^2 / (1 - (3/4)(1/4)) + (3/4)(1/4)^2 / (1 - (3/4)(1/4)) = (1/16 + 3/64) / (13/16) = 7/52.

Solution to Problem 6.3. The answers to these questions are found by considering suitable Bernoulli processes and using the formulas of Section 6.1. Depending on the specific question, however, a different Bernoulli process may be appropriate. In some cases, we associate trials with slots. In other cases, it is convenient to associate trials with busy slots.

(a) During each slot, the probability of a task from user 1 is given by p_1 = p_B p_{1|B} = (5/6)(2/5) = 1/3. Tasks from user 1 form a Bernoulli process, and

P(first user 1 task occurs in slot 4) = p_1 (1 - p_1)^3 = (1/3)(2/3)^3.

(b) This is the probability that slot 11 was busy and slot 12 was idle, given that 5 out of the 10 first slots were idle. Because of the fresh-start property, the conditioning information is immaterial, and the desired probability is p_B p_I = (5/6)(1/6).

(c) Each slot contains a task from user 1 with probability p_1 = 1/3, independent of other slots. The time of the 5th task from user 1 is a Pascal random variable of order 5, with parameter p_1 = 1/3. Its mean is given by 5/p_1 = 5/(1/3) = 15.

(d) Each busy slot contains a task from user 1 with probability p_{1|B} = 2/5, independent of other busy slots. The random variable of interest is a Pascal random variable of order 5, with parameter p_{1|B} = 2/5. Its mean is 5/p_{1|B} = 5/(2/5) = 12.5.

(e) The number T of tasks from user 2 until the 5th task from user 1 is the same as the number B of busy slots until the 5th task from user 1, minus 5. The number of busy slots ("trials") until the 5th task from user 1 ("success") is a Pascal random variable of order 5, with parameter p_{1|B} = 2/5. Thus,

p_B(t) = C(t - 1, 4) (2/5)^5 (3/5)^(t-5), for t = 5, 6, ....
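The value 7/52 can be confirmed independently of the series summation by a first-step (Markov-chain) analysis; the following Python sketch, not part of the original solution, solves the two-state recursion exactly with rational arithmetic.

```python
from fractions import Fraction

def p_two_fails_first(pf):
    """P(FF occurs before SS) for i.i.d. quizzes failing w.p. pf.
    Condition on the last outcome: x_F = pf + ps*x_S and x_S = pf*x_F."""
    ps = 1 - pf
    xF = pf / (1 - ps * pf)     # solve x_F = pf + ps * pf * x_F
    xS = pf * xF
    return pf * xF + ps * xS    # condition on the outcome of the first quiz

assert p_two_fails_first(Fraction(1, 4)) == Fraction(7, 52)
assert p_two_fails_first(Fraction(1, 2)) == Fraction(1, 2)  # symmetric sanity case
```

The geometric-series route and the first-step route agree, which is a useful consistency check on the enumeration of F/S sequences.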
Since T = B - 5, we have p_T(t) = p_B(t + 5), and we obtain

p_T(t) = C(t + 4, 4) (2/5)^5 (3/5)^t, for t = 0, 1, 2, ....

Using the formulas for the mean and the variance of the Pascal random variable B, we obtain

E[T] = E[B] - 5 = 5/(2/5) - 5 = 7.5,

and

var(T) = var(B) = 5 (1 - (2/5)) / (2/5)^2 = 18.75.

Solution to Problem 6.8. The total number of accidents between 8 a.m. and 11 a.m. is the sum of two independent Poisson random variables with parameters 5 and 3 · 2 = 6, respectively. Since the sum of independent Poisson random variables is also Poisson, the total number of accidents has a Poisson PMF with parameter 5 + 6 = 11.

Solution to Problem 6.9. As long as the pair of players is waiting, all five courts are occupied by other players. When all five courts are occupied, the time until a court is freed up is exponentially distributed with mean 40/5 = 8 minutes. For our pair of players to get a court, a court must be freed up k + 1 times. Thus, the expected waiting time is 8(k + 1).

Solution to Problem 6.10. (a) This is the probability of no arrivals in 2 hours. It is given by

P(0, 2) = e^(-0.6·2) = 0.301.

For an alternative solution, this is the probability that the first arrival comes after 2 hours:

P(T > 2) = ∫_2^∞ f_T(t) dt = ∫_2^∞ 0.6 e^(-0.6t) dt = e^(-0.6·2) = 0.301.

(b) This is the probability of zero arrivals between time 0 and time 2, and of at least one arrival between time 2 and time 5. Since these two intervals are disjoint, the desired probability is the product of the probabilities of these two events, which is given by

P(0, 2) (1 - P(0, 3)) = e^(-0.6·2) (1 - e^(-0.6·3)) = 0.251.

For an alternative solution, the event of interest can be written as {2 ≤ T ≤ 5}, and its probability is

∫_2^5 f_T(t) dt = ∫_2^5 0.6 e^(-0.6t) dt = e^(-0.6·2) - e^(-0.6·5) = 0.251.

(c) If he catches at least two fish, he must have fished for exactly two hours. Hence, the desired probability is equal to the probability that the number of fish caught in the first two hours is at least two, i.e.,

Σ_{k=2}^∞ P(k, 2) = 1 - P(0, 2) - P(1, 2) = 1 - e^(-0.6·2) - (1.2) e^(-0.6·2) = 0.337.
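The numerical Poisson answers above (Problems 6.8 and 6.10) lend themselves to a quick sanity check; the following Python sketch, not part of the original solutions, recomputes them directly from the Poisson PMF.

```python
from math import exp, factorial, isclose

def poisson_pmf(k, mu):
    """P(k arrivals in an interval) for a Poisson variable with mean mu."""
    return mu**k * exp(-mu) / factorial(k)

# Problem 6.8: merging -- convolving Poisson(5) and Poisson(6) gives Poisson(11)
conv = sum(poisson_pmf(j, 5) * poisson_pmf(9 - j, 6) for j in range(10))
assert isclose(conv, poisson_pmf(9, 11))

lam = 0.6                      # catch rate, fish per hour

# Problem 6.10(a): no fish in the first two hours
p_a = poisson_pmf(0, lam * 2)
assert round(p_a, 3) == 0.301

# 6.10(b): no fish in [0, 2] and at least one in (2, 5]
p_b = p_a * (1 - poisson_pmf(0, lam * 3))
assert round(p_b, 3) == 0.251

# 6.10(c): at least two fish in the first two hours
p_c = 1 - poisson_pmf(0, lam * 2) - poisson_pmf(1, lam * 2)
assert round(p_c, 3) == 0.337
```

The convolution identity in the first check is exactly the "sum of independent Poissons is Poisson" fact used in Problem 6.8.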
For an alternative approach, note that the event of interest occurs if and only if the time Y_2 of the second arrival is less than or equal to 2. Hence, the desired probability is

P(Y_2 ≤ 2) = ∫_0^2 f_{Y_2}(y) dy = ∫_0^2 (0.6)^2 y e^(-0.6y) dy.

This integral can be evaluated by integrating by parts, but this is more tedious than the first approach.

(d) The expected number of fish caught is equal to the expected number of fish caught during the first two hours (which is 2λ = 2 · 0.6 = 1.2), plus the expectation of the number N of fish caught after the first two hours. We have N = 0 if he stops fishing at two hours, and N = 1 if he continues beyond the two hours. The event {N = 1} occurs if and only if no fish are caught in the first two hours, so that

E[N] = P(N = 1) = P(0, 2) = 0.301.

Thus, the expected number of fish caught is 1.2 + 0.301 = 1.501.

(e) Given that he has been fishing for 4 hours, the future fishing time is the time until the first fish is caught. By the memorylessness property of the Poisson process, the future time is exponential, with mean 1/λ. Hence, the expected total fishing time is 4 + (1/0.6) = 5.667.

Solution to Problem 6.11. We note that the process of departures of customers who have bought a book is obtained by splitting the Poisson process of customer departures, and is itself a Poisson process, with rate pλ.

(a) This is the time until the first customer departure in the split Poisson process. It is therefore exponentially distributed with parameter pλ.

(b) This is the probability of no customers in the split Poisson process during an hour, and using the result of part (a), equals e^(-pλ).

(c) This is the expected number of customers in the split Poisson process during an hour, and is equal to pλ.

Solution to Problem 6.12. Let X be the number of different types of pizza ordered. Let X_i be the random variable defined by

X_i = 1, if a type i pizza is ordered by at least one customer,
      0, otherwise.

We have X = X_1 + ... + X_n, and E[X] = nE[X_1]. We can think of the customers arriving as a Poisson process, and with each customer independently
choosing whether to order a type 1 pizza (this happens with probability 1/n) or not. This is the situation encountered in splitting of Poisson processes, and the number of type 1 pizza orders, denoted Y, is a Poisson random variable with parameter λ/n. We have

E[X_1] = P(Y > 0) = 1 - P(Y = 0) = 1 - e^(-λ/n),

so that

E[X] = nE[X_1] = n(1 - e^(-λ/n)).
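The splitting answer can also be reached without the splitting argument, by conditioning on the total number of customers and using the fact that, given K orders, each type is missed with probability (1 - 1/n)^K. The Python sketch below (not part of the original solution; λ = 10 and n = 4 are arbitrary test values) checks that the two routes agree.

```python
from math import exp, isclose

def expected_distinct_types(lam, n, kmax=200):
    """E[number of distinct types] by conditioning on the number of customers
    K ~ Poisson(lam); given K = k orders, a given type is missed w.p. (1-1/n)^k."""
    total, pk = 0.0, exp(-lam)          # pk = P(K = k), starting at k = 0
    for k in range(kmax):
        total += pk * n * (1 - (1 - 1/n)**k)
        pk *= lam / (k + 1)             # Poisson PMF recursion
    return total

lam, n = 10.0, 4
direct = expected_distinct_types(lam, n)
splitting = n * (1 - exp(-lam / n))     # formula from the solution
assert isclose(direct, splitting, rel_tol=1e-9)
```

The agreement is exact in the limit because E[(1 - 1/n)^K] = e^(-λ/n) for Poisson K, which is precisely the transform fact underlying the splitting result.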
Solution to Problem 6.13. (a) Let R be the total number of messages received during an interval of duration t. Note that R is a Poisson random variable with parameter (λ_A + λ_B)t. Therefore, the probability that exactly nine messages are received is

P(R = 9) = ((λ_A + λ_B)t)^9 e^(-(λ_A + λ_B)t) / 9!.

(b) Let R be defined as in part (a), and let W_i be the number of words in the ith message. Then,

N = W_1 + W_2 + ... + W_R,

which is a sum of a random number of random variables. Thus,

E[N] = E[W] E[R] = (11/6)(λ_A + λ_B)t.

(c) Three-word messages arrive from transmitter A in a Poisson manner, with average rate λ_A p_W(3) = λ_A/6. Therefore, the random variable of interest (the time until the eighth three-word message) is Erlang of order 8, and its PDF is given by

f(x) = (λ_A/6)^8 x^7 e^(-λ_A x/6) / 7!.

(d) Every message originates from either transmitter A or B, and can be viewed as an independent Bernoulli trial. Each message has probability λ_A/(λ_A + λ_B) of originating from transmitter A (view this as a "success"). Thus, the number of messages from transmitter A (out of the next twelve) is a binomial random variable, and the desired probability is equal to

C(12, 8) (λ_A/(λ_A + λ_B))^8 (λ_B/(λ_A + λ_B))^4.

Solution to Problem 6.14. (a) Let X be the time until the first bulb failure. Let A (respectively, B) be the event that the first bulb is of type A (respectively, B). Since the two bulb types are equally likely, the total expectation theorem yields

E[X] = E[X | A]P(A) + E[X | B]P(B) = (1/2) · 1 + (1/2) · (1/3) = 2/3.

(b) Let D be the event of no bulb failures before time t. Using the total probability theorem, and the exponential distributions for bulbs of the two types, we obtain

P(D) = P(D | A)P(A) + P(D | B)P(B) = (1/2) e^(-t) + (1/2) e^(-3t).
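Parts (a) and (b) of Problem 6.14 describe an equal mixture of exp(1) and exp(3) lifetimes. As an illustrative check, not part of the original solution, the Python sketch below integrates the mixture density numerically and compares against the closed forms.

```python
from math import exp, isclose

def failure_pdf(x):
    """PDF of the time to first failure: bulb is exp(1) or exp(3), each w.p. 1/2."""
    return 0.5 * exp(-x) + 0.5 * 3 * exp(-3 * x)

dx, xmax = 1e-3, 50.0
grid = [i * dx for i in range(int(xmax / dx))]

# part (a): E[X] = (1/2)*1 + (1/2)*(1/3) = 2/3
mean = sum(x * failure_pdf(x) * dx for x in grid)
assert isclose(mean, 2/3, rel_tol=1e-4)

# part (b): P(no failure before t) = (1/2)e^{-t} + (1/2)e^{-3t}, checked at t = 0.7
t = 0.7
survival = sum(failure_pdf(x) * dx for x in grid if x >= t)
assert isclose(survival, 0.5 * exp(-t) + 0.5 * exp(-3 * t), rel_tol=2e-3)
```

The check point t = 0.7 is arbitrary; any t > 0 would do, since the survival function of the mixture is the mixture of the survival functions.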
(c) We have

P(A | D) = P(A ∩ D) / P(D) = (1/2) e^(-t) / ((1/2) e^(-t) + (1/2) e^(-3t)) = 1 / (1 + e^(-2t)).

(d) We first find E[X^2]. We use the fact that the second moment of an exponential random variable T with parameter λ is equal to E[T^2] = (E[T])^2 + var(T) = 1/λ^2 + 1/λ^2 = 2/λ^2. Conditioning on the two possible types of the first bulb, we obtain

E[X^2] = E[X^2 | A]P(A) + E[X^2 | B]P(B) = (1/2) · 2 + (1/2) · (2/9) = 10/9.

Finally, using the fact E[X] = 2/3 from part (a),

var(X) = E[X^2] - (E[X])^2 = 10/9 - 4/9 = 2/3.

(e) This is the probability that out of the first 11 bulbs, exactly 3 were of type A, and that the 12th bulb was of type A. It is equal to

C(11, 3) (1/2)^11 · (1/2) = C(11, 3) (1/2)^12.

(f) This is the probability that out of the first 12 bulbs, exactly 4 were of type A, and is equal to

C(12, 4) (1/2)^12.

(g) The PDF of the time between failures is (e^(-x) + 3e^(-3x))/2, for x ≥ 0, and the associated transform is

(1/2) (1/(1 - s) + 3/(3 - s)).

Since the times between successive failures are independent, the transform associated with the time until the 12th failure is given by

[(1/2) (1/(1 - s) + 3/(3 - s))]^12.

(h) Let Y be the total period of illumination provided by the first two type-B bulbs. This has an Erlang distribution of order 2, and its PDF is

f_Y(y) = 9y e^(-3y), for y ≥ 0.

Let T be the period of illumination provided by the first type-A bulb. Its PDF is

f_T(t) = e^(-t), for t ≥ 0.
We are interested in the event T < Y. We have

P(T < Y | Y = y) = 1 - e^(-y), for y ≥ 0.

Thus,

P(T < Y) = ∫_0^∞ f_Y(y) P(T < Y | Y = y) dy = ∫_0^∞ 9y e^(-3y) (1 - e^(-y)) dy = 7/16,

as can be verified by carrying out the integration.

We now describe an alternative method for obtaining the answer. Let T_A be the period of illumination of the first type-A bulb. Let T_B1 and T_B2 be the periods of illumination provided by the first and second type-B bulbs, respectively. We are interested in the event {T_A < T_B1 + T_B2}. We have

P(T_A < T_B1 + T_B2) = P(T_A < T_B1) + P(T_A ≥ T_B1) P(T_A < T_B1 + T_B2 | T_A ≥ T_B1)
= 1/(1 + 3) + P(T_A ≥ T_B1) P(T_A - T_B1 < T_B2 | T_A ≥ T_B1)
= 1/4 + (3/4) P(T_A - T_B1 < T_B2 | T_A ≥ T_B1).

Given the event T_A ≥ T_B1, and using the memorylessness property of the exponential random variable T_A, the remaining time T_A - T_B1 until the failure of the type-A bulb is exponentially distributed, so that

P(T_A - T_B1 < T_B2 | T_A ≥ T_B1) = P(T_A < T_B2) = 1/(1 + 3) = 1/4.

Therefore,

P(T_A < T_B1 + T_B2) = 1/4 + (3/4)(1/4) = 7/16.

(i) Let V be the total period of illumination provided by type-B bulbs while the process is in operation. Let N be the number of light bulbs, out of the first 12, that are of type B. Let X_i be the period of illumination from the ith type-B bulb. We then have

V = X_1 + ... + X_N.

Note that N is a binomial random variable, with parameters n = 12 and p = 1/2, so that

E[N] = 6, var(N) = 12 · (1/2) · (1/2) = 3.

Furthermore, E[X_i] = 1/3 and var(X_i) = 1/9. Using the formulas for the mean and variance of the sum of a random number of random variables, we obtain

E[V] = E[N] E[X_i] = 2,

and

var(V) = var(X_i) E[N] + (E[X_i])^2 var(N) = 6/9 + 3/9 = 1.
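Both the integral in part (h) and the random-sum formulas in part (i) can be checked numerically; the Python sketch below is an illustrative verification, not part of the original solution.

```python
from math import comb, exp, isclose

# part (h): P(T < Y) = integral of 9 y e^{-3y} (1 - e^{-y}) dy over [0, inf) = 7/16
dy = 1e-3
total = 0.0
for i in range(1, 30000):               # integrate up to y = 30; the tail is negligible
    y = i * dy
    total += 9 * y * exp(-3 * y) * (1 - exp(-y)) * dy
assert isclose(total, 7/16, rel_tol=1e-3)

# the memorylessness route gives the same number
assert 1/4 + (3/4) * (1/4) == 7/16

# part (i): condition on N ~ binomial(12, 1/2); given N, V is Erlang of order N
# with parameter 3, so E[V|N] = N/3 and var(V|N) = N/9
pN = [comb(12, k) * 0.5**12 for k in range(13)]
EV = sum(p * k / 3 for k, p in enumerate(pN))
EV2 = sum(p * (k / 9 + (k / 3)**2) for k, p in enumerate(pN))   # E[V^2|N] = var + mean^2
assert isclose(EV, 2.0)
assert isclose(EV2 - EV**2, 1.0)
```

The part (i) check is just the law of total expectation/variance written out over the thirteen possible values of N, and it reproduces E[V] = 2 and var(V) = 1.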
(j) Using the notation in parts (a)-(c), and the result of part (c), we have

E[T | D] = t + E[T - t | D ∩ A] P(A | D) + E[T - t | D ∩ B] P(B | D)
= t + 1 · (1/(1 + e^(-2t))) + (1/3) · (e^(-2t)/(1 + e^(-2t))).

Solution to Problem 6.15. (a) The total arrival process corresponds to the merging of two independent Poisson processes, and is therefore Poisson with rate λ = λ_A + λ_B = 7. Thus, the number N of jobs that arrive in a given three-minute interval is a Poisson random variable, with E[N] = 3λ = 21, var(N) = 21, and PMF

p_N(n) = (21)^n e^(-21) / n!, for n = 0, 1, 2, ....

(b) Each of these 10 jobs has probability λ_A/(λ_A + λ_B) = 3/7 of being of type A, independently of the others. Thus, the binomial PMF applies and the desired probability is equal to

C(10, 3) (3/7)^3 (4/7)^7.

(c) Each future arrival is of type A with probability λ_A/(λ_A + λ_B) = 3/7, independently of other arrivals. Thus, the number K of arrivals until the first type A arrival is geometric with parameter 3/7. The number of type B arrivals before the first type A arrival is equal to K - 1, and its PMF is similar to a geometric PMF, except that it is shifted by one unit to the left. In particular,

p_{K-1}(k) = (3/7)(4/7)^k, for k = 0, 1, 2, ....

(d) The fact that at time 0 there were two type A jobs in the system simply states that there were exactly two type A arrivals between time -1 and time 0. Let X and Y be the arrival times of these two jobs. Consider splitting the interval [-1, 0] into many small time slots of length δ. Since each time slot is equally likely to contain an arrival and since the arrival times are independent, it follows that X and Y are independent uniform random variables over [-1, 0]. We are interested in the PDF of Z = max{X, Y}. We first find the CDF of Z. We have, for z ∈ [-1, 0],

P(Z ≤ z) = P(X ≤ z and Y ≤ z) = (1 + z)^2.

By differentiating, we obtain

f_Z(z) = 2(1 + z), for -1 ≤ z ≤ 0.

(e) Let T be the arrival time of this type B job. We can express T in the form T = -K + X, where K is a nonnegative integer and X lies in [0, 1]. We claim that X
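The shifted-geometric PMF of part (c) and the max-of-uniforms CDF of part (d) can be sanity-checked directly; the Python sketch below is illustrative and not part of the original solution.

```python
from math import isclose

# 6.15(c): PMF of the number of type B arrivals before the first type A arrival
p = 3/7
pmf = [p * (1 - p)**k for k in range(300)]      # truncated; tail is negligible
assert isclose(sum(pmf), 1.0)                   # the PMF sums to 1
mean = sum(k * q for k, q in enumerate(pmf))
assert isclose(mean, (1 - p) / p, rel_tol=1e-9) # E[K - 1] = 1/p - 1 = 4/3

# 6.15(d): CDF (1 + z)^2 of Z = max{X, Y} for X, Y uniform on [-1, 0]
def cdf_z(z):
    return (1 + z)**2

assert cdf_z(-1) == 0 and cdf_z(0) == 1
# the PDF 2(1 + z) is the slope of the CDF; check by central difference at z = -0.5
h = 1e-6
slope = (cdf_z(-0.5 + h) - cdf_z(-0.5 - h)) / (2 * h)
assert isclose(slope, 2 * (1 - 0.5), rel_tol=1e-6)
```

The mean 4/3 is the expected number of type B arrivals wasted before the first type A arrival, consistent with the geometric mean 1/p = 7/3 for K itself.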
clear; clc;
% (b)
tau = 0.5/(60*60);        % half a second, converted to hours
lamda = 0:1:60;           % arrival rate, in events per hour
ave_x = lamda .* (1 - exp(-2 * lamda * tau));   % given formula (iv)
figure();
plot(lamda, ave_x)
%%
% (c)
N = 10000;
VEC_TEW = zeros(6, N+1);
VEC_TNS = zeros(6, N+1);
para = [1/10, 1/20, 1/30, 1/40, 1/50, 1/60];    % mean interarrival times, in hours
for j = 1:6
    for i = 1:N
        VEC_TEW(j,i+1) = VEC_TEW(j,i) + exprnd(para(j));
        VEC_TNS(j,i+1) = VEC_TNS(j,i) + exprnd(para(j));
    end
end
%%
% c(1)
figure(3);
offset1 = [1, 4, 7, 10, 13, 16];
offset2 = [2, 5, 8, 11, 14, 17];
for a = 1:6
    plot(VEC_TEW(a,:), offset1(a)*ones(1, N+1), 'xr'); hold on;
    plot(VEC_TNS(a,:), offset2(a)*ones(1, N+1), 'xb'); hold on;
end
hold off;
xlim([0 1]);
%% c(3) and c(4)
counter = zeros(6, 1);
for a = 1:6
    for i = 1:N
        for ii = 1:N
            if abs(VEC_TEW(a,i) - VEC_TNS(a,ii)) <= 0.5/(60*60)
                counter(a,1) = counter(a,1) + 1;
            end
        end
    end
end
%% c(5) and c(6)
figure();
hold on;
parameter = [10, 20, 30, 40, 50, 60];           % rates corresponding to para
for a = 1:6
    plot(parameter(a), parameter(a) * (1 - exp(-2 * parameter(a) * tau)), 'or')
    plot(parameter(a), counter(a,1) / VEC_TEW(a, N+1), 'xr')
end
More informationChapter 2 Queueing Theory and Simulation
Chapter 2 Queueing Theory and Simulation Based on the slides of Dr. Dharma P. Agrawal, University of Cincinnati and Dr. Hiroyuki Ohsaki Graduate School of Information Science & Technology, Osaka University,
More informationUC Berkeley Department of Electrical Engineering and Computer Science. EE 126: Probablity and Random Processes. Problem Set 4: Solutions Fall 2007
UC Berkeley Department of Electrical Engineering and Computer Science EE 6: Probablity and Random Processes Problem Set 4: Solutions Fall 007 Issued: Thursday, September 0, 007 Due: Friday, September 8,
More informationContinuous-time Markov Chains
Continuous-time Markov Chains Gonzalo Mateos Dept. of ECE and Goergen Institute for Data Science University of Rochester gmateosb@ece.rochester.edu http://www.ece.rochester.edu/~gmateosb/ October 23, 2017
More informationPart IA Probability. Definitions. Based on lectures by R. Weber Notes taken by Dexter Chua. Lent 2015
Part IA Probability Definitions Based on lectures by R. Weber Notes taken by Dexter Chua Lent 2015 These notes are not endorsed by the lecturers, and I have modified them (often significantly) after lectures.
More informationSTAT/MATH 395 A - PROBABILITY II UW Winter Quarter Moment functions. x r p X (x) (1) E[X r ] = x r f X (x) dx (2) (x E[X]) r p X (x) (3)
STAT/MATH 395 A - PROBABILITY II UW Winter Quarter 07 Néhémy Lim Moment functions Moments of a random variable Definition.. Let X be a rrv on probability space (Ω, A, P). For a given r N, E[X r ], if it
More informationDiscrete random variables and probability distributions
Discrete random variables and probability distributions random variable is a mapping from the sample space to real numbers. notation: X, Y, Z,... Example: Ask a student whether she/he works part time or
More informationMITOCW MIT6_041F11_lec15_300k.mp4
MITOCW MIT6_041F11_lec15_300k.mp4 The following content is provided under a Creative Commons license. Your support will help MIT OpenCourseWare continue to offer high quality educational resources for
More information(Ch 3.4.1, 3.4.2, 4.1, 4.2, 4.3)
3 Probability Distributions (Ch 3.4.1, 3.4.2, 4.1, 4.2, 4.3) Probability Distribution Functions Probability distribution function (pdf): Function for mapping random variables to real numbers. Discrete
More informationECE 302 Division 1 MWF 10:30-11:20 (Prof. Pollak) Final Exam Solutions, 5/3/2004. Please read the instructions carefully before proceeding.
NAME: ECE 302 Division MWF 0:30-:20 (Prof. Pollak) Final Exam Solutions, 5/3/2004. Please read the instructions carefully before proceeding. If you are not in Prof. Pollak s section, you may not take this
More informationChapter 3: Random Variables 1
Chapter 3: Random Variables 1 Yunghsiang S. Han Graduate Institute of Communication Engineering, National Taipei University Taiwan E-mail: yshan@mail.ntpu.edu.tw 1 Modified from the lecture notes by Prof.
More informationDiscrete Random Variable
Discrete Random Variable Outcome of a random experiment need not to be a number. We are generally interested in some measurement or numerical attribute of the outcome, rather than the outcome itself. n
More informationCommon Discrete Distributions
Common Discrete Distributions Statistics 104 Autumn 2004 Taken from Statistics 110 Lecture Notes Copyright c 2004 by Mark E. Irwin Common Discrete Distributions There are a wide range of popular discrete
More informationTom Salisbury
MATH 2030 3.00MW Elementary Probability Course Notes Part V: Independence of Random Variables, Law of Large Numbers, Central Limit Theorem, Poisson distribution Geometric & Exponential distributions Tom
More informationName of the Student: Problems on Discrete & Continuous R.Vs
Engineering Mathematics 08 SUBJECT NAME : Probability & Random Processes SUBJECT CODE : MA645 MATERIAL NAME : University Questions REGULATION : R03 UPDATED ON : November 07 (Upto N/D 07 Q.P) (Scan the
More informationLecture 13. Poisson Distribution. Text: A Course in Probability by Weiss 5.5. STAT 225 Introduction to Probability Models February 16, 2014
Lecture 13 Text: A Course in Probability by Weiss 5.5 STAT 225 Introduction to Probability Models February 16, 2014 Whitney Huang Purdue University 13.1 Agenda 1 2 3 13.2 Review So far, we have seen discrete
More informationEE 505 Introduction. What do we mean by random with respect to variables and signals?
EE 505 Introduction What do we mean by random with respect to variables and signals? unpredictable from the perspective of the observer information bearing signal (e.g. speech) phenomena that are not under
More informationTHE QUEEN S UNIVERSITY OF BELFAST
THE QUEEN S UNIVERSITY OF BELFAST 0SOR20 Level 2 Examination Statistics and Operational Research 20 Probability and Distribution Theory Wednesday 4 August 2002 2.30 pm 5.30 pm Examiners { Professor R M
More informationNotes 12 Autumn 2005
MAS 08 Probability I Notes Autumn 005 Conditional random variables Remember that the conditional probability of event A given event B is P(A B) P(A B)/P(B). Suppose that X is a discrete random variable.
More informationStochastic Models of Manufacturing Systems
Stochastic Models of Manufacturing Systems Ivo Adan Organization 2/47 7 lectures (lecture of May 12 is canceled) Studyguide available (with notes, slides, assignments, references), see http://www.win.tue.nl/
More informationEECS 126 Probability and Random Processes University of California, Berkeley: Fall 2014 Kannan Ramchandran September 23, 2014.
EECS 126 Probability and Random Processes University of California, Berkeley: Fall 2014 Kannan Ramchandran September 23, 2014 Midterm Exam 1 Last name First name SID Rules. DO NOT open the exam until instructed
More informationLectures on Elementary Probability. William G. Faris
Lectures on Elementary Probability William G. Faris February 22, 2002 2 Contents 1 Combinatorics 5 1.1 Factorials and binomial coefficients................. 5 1.2 Sampling with replacement.....................
More information4 Branching Processes
4 Branching Processes Organise by generations: Discrete time. If P(no offspring) 0 there is a probability that the process will die out. Let X = number of offspring of an individual p(x) = P(X = x) = offspring
More informationContinuous random variables
Continuous random variables Continuous r.v. s take an uncountably infinite number of possible values. Examples: Heights of people Weights of apples Diameters of bolts Life lengths of light-bulbs We cannot
More informationMath 564 Homework 1. Solutions.
Math 564 Homework 1. Solutions. Problem 1. Prove Proposition 0.2.2. A guide to this problem: start with the open set S = (a, b), for example. First assume that a >, and show that the number a has the properties
More informationRandom variables. DS GA 1002 Probability and Statistics for Data Science.
Random variables DS GA 1002 Probability and Statistics for Data Science http://www.cims.nyu.edu/~cfgranda/pages/dsga1002_fall17 Carlos Fernandez-Granda Motivation Random variables model numerical quantities
More informationContinuous Random Variables
Continuous Random Variables Recall: For discrete random variables, only a finite or countably infinite number of possible values with positive probability. Often, there is interest in random variables
More information15 Discrete Distributions
Lecture Note 6 Special Distributions (Discrete and Continuous) MIT 4.30 Spring 006 Herman Bennett 5 Discrete Distributions We have already seen the binomial distribution and the uniform distribution. 5.
More informationEdexcel GCE A Level Maths Statistics 2 Uniform Distributions
Edexcel GCE A Level Maths Statistics 2 Uniform Distributions Edited by: K V Kumaran kumarmaths.weebly.com 1 kumarmaths.weebly.com 2 kumarmaths.weebly.com 3 kumarmaths.weebly.com 4 1. In a computer game,
More informationChapter 3: Random Variables 1
Chapter 3: Random Variables 1 Yunghsiang S. Han Graduate Institute of Communication Engineering, National Taipei University Taiwan E-mail: yshan@mail.ntpu.edu.tw 1 Modified from the lecture notes by Prof.
More information1.1 Review of Probability Theory
1.1 Review of Probability Theory Angela Peace Biomathemtics II MATH 5355 Spring 2017 Lecture notes follow: Allen, Linda JS. An introduction to stochastic processes with applications to biology. CRC Press,
More informationDiscrete Distributions
Discrete Distributions STA 281 Fall 2011 1 Introduction Previously we defined a random variable to be an experiment with numerical outcomes. Often different random variables are related in that they have
More informationSTAT/MATH 395 PROBABILITY II
STAT/MATH 395 PROBABILITY II Chapter 6 : Moment Functions Néhémy Lim 1 1 Department of Statistics, University of Washington, USA Winter Quarter 2016 of Common Distributions Outline 1 2 3 of Common Distributions
More informationSystem Simulation Part II: Mathematical and Statistical Models Chapter 5: Statistical Models
System Simulation Part II: Mathematical and Statistical Models Chapter 5: Statistical Models Fatih Cavdur fatihcavdur@uludag.edu.tr March 20, 2012 Introduction Introduction The world of the model-builder
More informationThe exponential distribution and the Poisson process
The exponential distribution and the Poisson process 1-1 Exponential Distribution: Basic Facts PDF f(t) = { λe λt, t 0 0, t < 0 CDF Pr{T t) = 0 t λe λu du = 1 e λt (t 0) Mean E[T] = 1 λ Variance Var[T]
More information