Moments and Discrete Random Variables. OPRE 7310 Lecture Notes by Metin Çakanyıldırım. Compiled at 15:07 on Wednesday 22nd November, 2017.
1 Random Variables

In a probability space (Ω, F, P), the element ω ∈ Ω is not necessarily a numerical value. In the experiment of tossing a coin once, ω is either H or T. It is useful to convert the outcomes of an experiment to numerical values, because numerical values can be used in the usual algebraic operations of addition, subtraction, etc. To convert each ω to a numerical value (a real number), we use a function X(ω) : Ω → R; that is, X is a function that maps the sample space to the real numbers.

Example: Let Ω_n = {H, T} be the sample space for the nth toss of a coin. We can define a random variable as the indicator function X_n(ω_n) = 1I_{ω_n = H}. Then we can count the number of heads in N experiments as Σ_{n=1}^N X_n(ω_n), or we can consider the average (1/N) Σ_{n=1}^N X_n(ω_n). We can also consider the difference between the number of heads obtained in even-numbered and odd-numbered experiments among the first 2N experiments: Σ_{n=1}^N X_{2n}(ω_{2n}) − Σ_{n=1}^N X_{2n−1}(ω_{2n−1}).

Example: A truck will arrive at a distribution center within the next hour; the arrival time ω, measured in minutes, belongs to Ω = [0, 60]. In this case the outcomes are real numbers, so we can use them directly as random variables: X(ω) = ω. Consider another truck arriving in the same time window with arrival time Y ∈ [0, 60]. The earliness of the second truck with respect to the first is (X − Y)^+ = max{X − Y, 0}, which is yet another random variable. As illustrated above, a function of random variable(s) is another random variable. Since random variables are real numbers, we can average them to obtain means and variances.

Example: A truck arrives at a distribution center over 7 days, each day within a one-hour time window [0, 1]. Let X_n(ω_n) denote the arrival time on the nth day for n = 1, ..., 7. The average arrival time is Σ_{n=1}^7 X_n / 7.
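The coin-toss indicator construction above can be sketched in a few lines of Python; this is a minimal illustration (the fair coin and the toss count N are assumptions, not from the notes):

```python
import random

random.seed(0)  # reproducible sketch

# Model X_n = 1 if the nth toss is heads, 0 otherwise (an indicator random
# variable), then count and average heads over N tosses of a fair coin.
N = 10_000
tosses = [random.choice("HT") for _ in range(N)]
X = [1 if w == "H" else 0 for w in tosses]

num_heads = sum(X)         # the sum of indicators counts the heads
avg_heads = num_heads / N  # sample average, near 1/2 for a fair coin
```

The sum of the indicators counts heads exactly as Σ X_n(ω_n) does in the text, and the average settles near 1/2 as N grows.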
A random variable can be conditioned on another random variable or on an event. When such conditioning happens, the convention is to talk about the functions characterizing the conditioned random variable rather than writing X | Y or X | A for random variables X, Y and event A. One such function is the cumulative distribution function (cdf) F_X of a random variable X. The cdf maps the real numbers to the interval [0, 1] and is defined through the probability measure: F_X(a) := P(X(ω) ∈ (−∞, a]). Since (−∞, a] ⊆ (−∞, b] for a ≤ b, we have F_X(a) ≤ F_X(b), so F_X is a non-decreasing function. Recalling also P(Ω) = 1 and P(∅) = 0, we obtain three properties of the cdf:
i) F_X is non-decreasing.
ii) lim_{a→∞} F_X(a) = 1.
iii) lim_{a→−∞} F_X(a) = 0.

Example: Consider the experiment of tossing a fair coin twice with the sample space Ω = {HT, TH, HH, TT}.
Let X(ω) be the number of heads in two tosses, so that X(ω) ∈ {0, 1, 2}. For the cdf F_X we have

F_X(a) = 0/4 if a < 0; 1/4 if 0 ≤ a < 1; 3/4 if 1 ≤ a < 2; 4/4 if 2 ≤ a.

We can consider the right-hand side limits of F_X(a) and use the continuity of the probability measure to obtain

lim_{x→a+} F_X(x) = lim_{x→a+} P(X(ω) ∈ (−∞, x]) = P(∩_{x→a+} (−∞, x]) = P(X(ω) ∈ (−∞, a]) = F_X(a).

Note that (−∞, x] as x → a+ is a decreasing sequence of closed sets; their limit is the intersection of these sets, and an intersection of infinitely many closed sets is closed. In particular, a ∈ ∩_{n=1}^∞ (−∞, a + 1/n] = ∩_{x→a+} (−∞, x]. Hence cdfs are right-continuous for all random variables. For the left-hand side limits, we obtain

lim_{x→a−} F_X(x) = lim_{x→a−} P(X(ω) ∈ (−∞, x]) = P(∪_{x→a−} (−∞, x]) = P(X(ω) ∈ (−∞, a)) ≤ F_X(a).

Note that (−∞, x] as x → a− is an increasing sequence of closed sets; their limit is the union of these sets, and a union of infinitely many closed sets is not necessarily closed. In particular, a ∉ (−∞, a − 1/n] for any n and so a ∉ ∪_{n=1}^∞ (−∞, a − 1/n] = ∪_{x→a−} (−∞, x]. So cdfs are not necessarily left-continuous, as illustrated by the F_X in the last example.

Example: 4 people including you and a friend line up at random. Let X be the number of people between you and your friend; find F_X. X = 0 means that you are standing together. Putting the two of you together as a pair, we have 2 other people and a pair to order in 3! ways. By reordering you and your friend within the pair we obtain 2 · 3! orderings. The total number of ways of ordering 4 people is 4!, so P(X = 0) = 2 · 3!/4! = 1/2. X = 1 means that there is exactly one person between you two. Writing A and B for you and your friend and c, d for the other two people, the permutations with one of you at the front of the line are AcBd, AdBc, BcAd, BdAc. Another 4 permutations place one of you second in the line: cAdB, dAcB, cBdA, dBcA. The total number of permutations with X = 1 is 8, so P(X = 1) = 8/4! = 1/3.
X = 2 means exactly two people between you; all four such permutations are AcdB, AdcB, BcdA, BdcA. The total number of permutations with X = 2 is 4, so P(X = 2) = 4/4! = 1/6. Hence

F_X(a) = 0 if a < 0; 3/6 if 0 ≤ a < 1; 5/6 if 1 ≤ a < 2; 6/6 if 2 ≤ a.

Example: Suppose a fair coin is tossed 4 times and we examine the head runs. A head run is a consecutive appearance of heads. Let X be the length of the longest head run; find P(X = x) for x ≤ 4. The sample space is Ω = {HHHH, HHHT, HHTH, HTHH, THHH, HHTT, HTHT, THHT, HTTH, THTH, TTHH, HTTT, THTT, TTHT, TTTH, TTTT}. X(HHHH) = 4, X(HHHT) = 3, X(THHH) = 3, X(HHTH) = 2, X(HTHH) = 2, X(HHTT) = 2, X(THHT) = 2, X(TTHH) = 2, X(TTTT) = 0, and all of the remaining 7 outcomes yield X = 1. So P(X = 0) = 1/16, P(X = 1) = 7/16, P(X = 2) = 5/16, P(X = 3) = 2/16, P(X = 4) = 1/16.
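The head-run pmf can be verified by brute-force enumeration of the 16 equally likely outcomes; a minimal sketch (the helper name `longest_head_run` is ours, not from the notes):

```python
from itertools import product
from collections import Counter

def longest_head_run(seq):
    """Length of the longest run of consecutive H's in a toss sequence."""
    best = run = 0
    for c in seq:
        run = run + 1 if c == "H" else 0
        best = max(best, run)
    return best

# Enumerate all 2^4 equally likely outcomes of 4 fair-coin tosses.
counts = Counter(longest_head_run(seq) for seq in product("HT", repeat=4))
pmf = {x: counts[x] / 16 for x in sorted(counts)}
# pmf == {0: 1/16, 1: 7/16, 2: 5/16, 3: 2/16, 4: 1/16}
```

This reproduces the probabilities 1/16, 7/16, 5/16, 2/16, 1/16 obtained by hand above.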
2 Discrete Random Variables

Most of the examples above had countable sample spaces. A discrete random variable has a countable sample space. As the sample space is countable, we can write it as Ω = {x_1, x_2, x_3, ...}. The probability measure is nonzero only at the elements of Ω and zero elsewhere. At the elements of Ω, we can define the probability mass function (pmf) p_X as p_X(a) = P(X = a). Then we have

F_X(a) = Σ_{x_i ≤ a} p_X(x_i).

In real life, averages are used to summarize empirical observations: the average fuel consumption of my car is 32 miles per gallon; the average August temperature in Dallas is 88 degrees Fahrenheit. Expected values, on the other hand, summarize theoretical probability distributions. The expectation of a discrete random variable X with pmf p_X is given by

E(X) = Σ_{x: p_X(x) > 0} x p_X(x).

Expectation is a weighted average where the weights are p_X(x). Note that a function of a random variable is another random variable, so we can consider the expected value of this function. That is, starting with a random variable X, define the new random variable Y = g(X). Then

E(Y) = Σ_{x: p_X(x) > 0} g(x) p_X(x).

For a given random variable X, the function can in particular be g(x) = (x − E(X))^2, which defines the new random variable Y = g(X) = (X − E(X))^2. The expected value of this new random variable is the variance of X:

V(X) = Σ_{x: p_X(x) > 0} (x − E(X))^2 p_X(x).

The variance of a random variable is the expected value of the square of the deviations from its mean. Another particular g function is g(x) = x^k for an integer k; it defines the kth moment of the random variable X:

E(X^k) = Σ_{x: p_X(x) > 0} x^k p_X(x).

Example: Consider the example of 4 people lining up and X the number of people between you and your friend. The pmf of X is p(0) = 3/6, p(1) = 2/6 and p(2) = 1/6. Then the expected number of people between you is E(X) = 0(3/6) + 1(2/6) + 2(1/6) = 2/3.
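A minimal sketch of these weighted-average formulas applied to the lineup pmf, using exact fractions to avoid rounding:

```python
from fractions import Fraction as F

# pmf of X, the number of people between you and your friend in a 4-person line.
pmf = {0: F(3, 6), 1: F(2, 6), 2: F(1, 6)}

mean = sum(x * p for x, p in pmf.items())               # E(X) = 2/3
var = sum((x - mean) ** 2 * p for x, p in pmf.items())  # V(X), weighted squared deviations
```

The same two sums implement E(X) and V(X) for any finite pmf given as a dictionary.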
The variance of the number of people between you is

V(X) = (0 − 2/3)^2 (3/6) + (1 − 2/3)^2 (2/6) + (2 − 2/3)^2 (1/6) = (4/9)(3/6) + (1/9)(2/6) + (16/9)(1/6) = 30/54 = 5/9.

Example: Consider the example of 4 people lining up and X the number of people between you and your friend. Suppose you want to pass a glass of water to your friend, but 1/4 of the water spills each time the glass moves from one person to another. What is the expected percentage of water remaining in the glass when your friend receives it? Let g(x) be the fraction of water left in the glass after x + 1 passes; note that x people between you causes x + 1 passes. Then g(0) = 3/4, g(1) = (3/4)^2, g(2) = (3/4)^3. So

E(g(X)) = (3/4)(3/6) + (3/4)^2 (2/6) + (3/4)^3 (1/6) = 81/128 = 0.6328.
That is, 63.28% of the water remains on average when your friend receives the glass. This example can be extended to model the amount of information loss over a telecommunication network as the information is sent from one node to another.

Expectations and variances have some properties that are straightforward to show:
i) Linearity of expectation: For a constant c, E(c g(X)) = c E(g(X)). For functions g_1, g_2, ..., g_N, E(g_1(X) + g_2(X) + ... + g_N(X)) = E(g_1(X)) + E(g_2(X)) + ... + E(g_N(X)).
ii) For a constant c, E(c) = c and V(c) = 0.
iii) Nonlinearity of variance: For a constant c, V(c g(X)) = c^2 V(g(X)).
iv) V(X) = E(X^2) − (E(X))^2.

3 Independent Discrete Random Variables

Let X and Y be two random variables and consider the two events A := [X ≤ a] and B := [Y ≤ b] for real numbers a, b. Suppose that these two events are independent, so we have the middle equality below:

P(X ≤ a, Y ≤ b) = P(A ∩ B) = P(A)P(B) = P(X ≤ a)P(Y ≤ b).

We extend this equality to all numbers a, b to define the independence of random variables. If we have P(X ≤ a, Y ≤ b) = P(X ≤ a)P(Y ≤ b) for all a, b ∈ R, the random variables X and Y are said to be independent. To indicate independence, we write X ⊥ Y.

Starting from P(X ≤ a, Y ≤ b) = P(X ≤ a)P(Y ≤ b), we can obtain P(X = a, Y = b) for integer-valued random variables X, Y in terms of their pmfs:

P(X = a, Y = b) = P(X ≤ a, Y ≤ b) − P(X ≤ a, Y ≤ b − 1) − (P(X ≤ a − 1, Y ≤ b) − P(X ≤ a − 1, Y ≤ b − 1))
= F_X(a)F_Y(b) − F_X(a)F_Y(b − 1) − (F_X(a − 1)F_Y(b) − F_X(a − 1)F_Y(b − 1))
= F_X(a)p_Y(b) − F_X(a − 1)p_Y(b) = p_X(a)p_Y(b).

This can be extended to non-integer-valued random variables, but a new notation is needed to express the largest value of X smaller than a. If you adopt some notation for this purpose, the steps above can be repeated to obtain P(X = a, Y = b) = p_X(a)p_Y(b) for two independent discrete random variables X and Y.
Example: For two independent random variables, we check the equality E(X_1 X_2) = E(X_1)E(X_2):

E(X_1 X_2) = Σ_{x_1, x_2: P(X_1 = x_1, X_2 = x_2) > 0} (x_1 x_2) P(X_1 = x_1, X_2 = x_2) = Σ_{x_1, x_2: p_{X_1}(x_1), p_{X_2}(x_2) > 0} x_1 x_2 p_{X_1}(x_1) p_{X_2}(x_2)
= (Σ_{x_1: p_{X_1}(x_1) > 0} x_1 p_{X_1}(x_1)) (Σ_{x_2: p_{X_2}(x_2) > 0} x_2 p_{X_2}(x_2)) = E(X_1)E(X_2).

The first equality above uses the fact that the probability of the event [X_1 = x_1, X_2 = x_2] is p_{X_1}(x_1) p_{X_2}(x_2).

Example: For two independent random variables, we check the equality V(X_1 + X_2) = V(X_1) + V(X_2):

V(X_1 + X_2) = E(X_1^2 + 2 X_1 X_2 + X_2^2) − E(X_1 + X_2)E(X_1 + X_2)
= E(X_1^2) − E(X_1)^2 + E(X_2^2) − E(X_2)^2 + 2E(X_1 X_2) − 2E(X_1)E(X_2) = V(X_1) + V(X_2).
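Both identities can be verified numerically for a concrete independent pair; here is a minimal sketch using two independent fair dice (an example of our choosing, not from the notes):

```python
from fractions import Fraction as F
from itertools import product

# Two independent fair dice: the joint pmf factorizes as p(x1) * p(x2).
p = {x: F(1, 6) for x in range(1, 7)}

def E(f):
    """Expectation of f(X1, X2) under the independent joint pmf."""
    return sum(f(x1, x2) * p[x1] * p[x2] for x1, x2 in product(p, p))

e1 = E(lambda x1, x2: x1)        # E(X1) = 7/2 for a fair die
e12 = E(lambda x1, x2: x1 * x2)  # E(X1 X2)
v_sum = E(lambda x1, x2: (x1 + x2) ** 2) - E(lambda x1, x2: x1 + x2) ** 2
```

For a fair die V(X) = 35/12, so additivity gives V(X1 + X2) = 35/6, and the product rule gives E(X1 X2) = (7/2)^2.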
The equalities established in the last two examples are summarized below:
v) For two independent random variables X_1 and X_2, E(X_1 X_2) = E(X_1)E(X_2).
vi) Additivity of variance: for two independent random variables X_1 and X_2, V(X_1 + X_2) = V(X_1) + V(X_2).

4 Common Discrete Random Variables

Some commonly used discrete random variables are introduced below.

4.1 Bernoulli Random Variable

An experiment is said to consist of Bernoulli trials if the outcome of each trial can take two values, which can be interpreted as, say, success and failure; each trial is independent of the other trials; and each trial is identical to the others in terms of probability, so for all trials P(Success) = p = 1 − P(Failure). If an experiment has a single Bernoulli trial, the number of successes is called a Bernoulli random variable.

Example: Let us consider the experiment of tossing a coin once. Let success = head, so P(Success) = P(Head) = p, where p is the probability that the coin yields head, and P(Failure) = P(Tail) = 1 − p. Let X be the number of heads on a single toss. E(X) = 1·p + 0·(1 − p) = p and V(X) = E(X^2) − p^2 = 1^2·p + 0^2·(1 − p) − p^2 = p(1 − p).

Example: Let us consider the price of a stock, which can increase or decrease over a day. Let success = price increase, so P(Success) = P(Price increase) and P(Failure) = P(Price decrease). Suppose that the price either increases by $1 or decreases by $1. Then the expected price change over a day is

(Today's price + 1·P(Price increase) − 1·P(Price decrease)) − Today's price = P(Price increase) − P(Price decrease) = 2P(Price increase) − 1.

The price increases on average if P(Price increase) > 1/2. This probability of a price increase can be estimated statistically for stocks.

4.2 Binomial Random Variable

The number of successes in n independent Bernoulli trials, each with success probability p, is a Binomial random variable with parameters (n, p).

Example: Let us consider the experiment of tossing a coin many times. On each toss, P(Success) = P(Head) = p and P(Failure) = P(Tail) = 1 − p.
Let X_i be the number of heads on the ith toss; each X_i is a Bernoulli random variable. Consider the total number X of heads in n tosses. Then X is a Binomial random variable and

X = Σ_{i=1}^n X_i.

To denote a Binomial random variable, we write X ∼ B(n, p). To obtain the probability mass function of X, consider the event [X = m] of having m successes among n trials for m ≤ n. If the first m trials are successes, we get the string

S S ... S S F F ... F F  (m successes followed by n − m failures).
The probability of this string is p^m (1 − p)^{n−m}. If we take a permutation of this string, the probability remains p^m (1 − p)^{n−m}. The number of permutations that we can make up from m S's and n − m F's is C(n, m). Hence,

P(X = m) = C(n, m) p^m (1 − p)^{n−m} for m = 0, 1, 2, ..., n.

Example: For X_i a Bernoulli random variable with P(Success) = p, we have E(X_i) = p and V(X_i) = p(1 − p). By using X = Σ_{i=1}^n X_i, linearity of expectation and additivity of variance for independent random variables, E(X) = np and V(X) = np(1 − p).

Example: An urn contains 6 white balls and 2 black balls. A ball is drawn and put back into the urn. The process is repeated 4 times. Let X be the number of black balls drawn out of the 4 draws. Find P(X = 2), E(X) and V(X). Convince yourself that X ∼ B(4, 2/8), so P(X = 2) = C(4, 2)(2/8)^2 (1 − 2/8)^2, E(X) = 4(2/8) and V(X) = 4(2/8)(6/8).

Suppose that B(n_j, p) is the number of successes in n_j trials. Then Σ_j B(n_j, p) is the number of successes in Σ_j n_j Bernoulli trials with success probability p. This in turn leads to Σ_j B(n_j, p) ∼ B(Σ_j n_j, p), where ∼ indicates having the same distribution. For example, B(n_1, p) + B(n_2, p) is another Binomial random variable with n_1 + n_2 trials and success probability p, i.e., B(n_1, p) + B(n_2, p) ∼ B(n_1 + n_2, p).

4.3 Geometric Random Variable

The number of Bernoulli trials until the first success is a Geometric random variable. When counting the number of trials, we include the trial that resulted in the first success. For example, the first success in the string FFFFSFFSFS is at the fifth trial. Some books do not include the trial in which the success happened when counting the number of trials until the first success, and would say that it takes four trials to get a success in FFFFSFFSFS. This convention changes the pmf, cdf, mean and variance formulas, so one has to be careful when reading other books. With our convention of counting trials, a Geometric random variable X takes values in {1, 2, 3, ...}.
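Returning to the urn example above, the Binomial pmf and moments can be checked with a short sketch in exact arithmetic:

```python
from math import comb
from fractions import Fraction as F

def binom_pmf(m, n, p):
    """P(X = m) for X ~ B(n, p): C(n, m) p^m (1 - p)^(n - m)."""
    return comb(n, m) * p**m * (1 - p) ** (n - m)

# Urn example: 2 black balls among 8, drawn 4 times with replacement.
n, p = 4, F(2, 8)
prob2 = binom_pmf(2, n, p)  # C(4,2)(1/4)^2(3/4)^2 = 27/128
mean = n * p                # E(X) = np = 1
var = n * p * (1 - p)       # V(X) = np(1-p) = 3/4
```

Using `Fraction` for p keeps every pmf value exact rather than a rounded float.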
For the Geometric random variable X to be equal to n, we need n − 1 failures first and a success afterwards:

F F ... F F S ? ? ... ?  (n − 1 failures, then a success; the later trials are irrelevant).

Recalling that p is the success probability, the pmf of the Geometric random variable is

P(X = n) = (1 − p)^{n−1} p.

Unlike the Binomial pmf, the Geometric pmf does not have a combinatorial term because the sequence of failures and the success cannot be reordered.

Example: Consider a die rolling experiment repeated until 6 comes up. The experiment can potentially have infinitely many trials and so a countable but infinite sample space. Let success = "6 appears" and failure = "1, 2, 3, 4 or 5 appears", so P(Success) = 1/6 and P(Failure) = 5/6. Then the probability of having 6 for the first time in the nth trial is P(X = n) = (5/6)^{n−1}(1/6).

The cumulative distribution function of the Geometric random variable has a simple form:

F_X(n) = P(X ≤ n) = Σ_{i=1}^n (1 − p)^{i−1} p = p Σ_{i=0}^{n−1} (1 − p)^i = p (1 − (1 − p)^n)/(1 − (1 − p)) = 1 − (1 − p)^n.
Hence F̄ := 1 − F, the complementary cdf, is also simple: F̄_X(n) = 1 − F_X(n) = P(X > n) = (1 − p)^n.

Example: A graduate student applies to jobs one by one. Within a few days after each job interview, he is told whether or not he gets the position. Each interview can be thought of as a Bernoulli trial where success is obtaining a position. The chance of getting a position after an interview is estimated to be 0.2. What is the probability that the student cannot get a job after the first five interviews? We want P(X > 5) for a Geometric random variable X with success probability 0.2. So P(X > 5) = (1 − 0.2)^5.

If the first m trials result in failures, what should the probability be that n ≥ m trials result in failures as well? In particular, we want to see whether the probability that the remaining n − m trials turn out to be failures depends on the failures of the first m trials. In other words, is the outcome of the last n − m trials independent of the first m trials? This can be answered formally by examining the probability of the event [X > n | X > m]:

P(X > n | X > m) = P(X > n, X > m)/P(X > m) = P(X > n)/P(X > m) = (1 − p)^n/(1 − p)^m = (1 − p)^{n−m}.

On the other hand, (1 − p)^{n−m} is the probability of failures in n − m trials, so (1 − p)^{n−m} = P(X > n − m). Hence the outcome of the last n − m trials does not depend on the knowledge (memory) of the first m trials. We have the memoryless property of the Geometric random variable:

P(X > n | X > m) = P(X > n − m).

Example: What other discrete random variable ranging over {1, 2, ...} has the memoryless property? The memoryless property can be written as P(X > n + m) = P(X > m)P(X > n) for nonnegative integers m, n. In terms of the complementary cdf, we have F̄(m + n) = F̄(m)F̄(n). Also F̄(0) = P(X > 0) = 1. We can set q = F̄(1) for 0 ≤ q ≤ 1. Then we recursively obtain F̄(n) = q^n. Interpreting q as the failure probability and defining p = 1 − q as the success probability, we obtain F_X(n) = 1 − (1 − p)^n, which is the cdf of the Geometric random variable.
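The complementary cdf and the memoryless property from the interview example can be sketched as follows (the helper name `geom_tail` and the values of m, n are ours):

```python
from fractions import Fraction as F

def geom_tail(n, p):
    """P(X > n) = (1 - p)^n for a Geometric(p) variable counting trials
    up to and including the first success."""
    return (1 - p) ** n

p = F(1, 5)  # interview example: success probability 0.2

no_job = geom_tail(5, p)  # P(X > 5) = (4/5)^5 = 1024/3125

# Memoryless property: P(X > n | X > m) = P(X > n - m).
m, n = 3, 8
cond = geom_tail(n, p) / geom_tail(m, p)
assert cond == geom_tail(n - m, p)
```

The assertion holds for any nonnegative integers m ≤ n, which is exactly the memoryless identity derived above.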
As a result, the only discrete random variable ranging over {1, 2, ...} and satisfying the memoryless property is the Geometric random variable.

4.4 Hypergeometric Random Variable

One way to obtain a Binomial random variable is to consider an urn with s success balls and f failure balls; we pick a ball from the urn, record it, and return it to the urn. Each ball can be either a success or a failure, with success probability s/(s + f). The probability of m successes after n repetitions of this experiment is given by the Binomial probability C(n, m)(s/(s + f))^m (f/(s + f))^{n−m}.

In the urn experiment above, each ball is returned to the urn. If we do not return the balls to the urn, the success and failure probabilities are no longer constant. Suppose we again pick n balls, now without replacement, and wonder about the probability of m balls being successes (and equivalently n − m balls being failures). For this event, we need m ≤ s and n − m ≤ f (or n ≤ s + f). The number of ways of choosing m success balls out of s success balls is C(s, m), the number of ways of choosing n − m failure balls out of f failure balls is C(f, n − m), and the number of ways of choosing n balls out of s + f balls is C(s + f, n). Let X be the number of successes in this experiment. Then the probability of m success balls is

P(X = m) = C(s, m) C(f, n − m) / C(s + f, n).
This probability, obtained under no replacement, is different from the Binomial probability obtained under replacement. So the random variable X deserves a new name: the Hypergeometric random variable. The difference between the Binomial and Hypergeometric random variables is caused by the replacement of the balls. Replacement changes the success probability significantly if n is close to s + f. Otherwise, success probabilities do not change much by taking a ball out of the urn. For example, with s = 300 and f = 700, the success probability for the first ball is 300/1000; for the second it is either 300/999 or 299/999, both of which are very close to 300/1000. This inspires us to study the Hypergeometric random variable as N := s + f → ∞. We also define p = s/N, so s = Np and f = N(1 − p). With this reparametrization, we can show that the Hypergeometric pmf converges to the Binomial pmf as N increases:

C(Np, m) C(N(1 − p), n − m) / C(N, n) = (Np)! (N(1 − p))! n! (N − n)! / ((Np − m)! m! (N(1 − p) − n + m)! (n − m)! N!)
= C(n, m) [(Np)(Np − 1)...(Np − m + 1)] [N(1 − p)(N(1 − p) − 1)...(N(1 − p) − n + m + 1)] / [N(N − 1)...(N − n + 1)]
= C(n, m) [(p)(p − 1/N)...(p − (m − 1)/N)] [(1 − p)((1 − p) − 1/N)...((1 − p) − (n − m − 1)/N)] / [1(1 − 1/N)...(1 − (n − 1)/N)]
→ C(n, m) p^m (1 − p)^{n−m} as N → ∞.

Example: A package of light bulbs has 6 bulbs; 4 of these are non-defective and 2 are defective. 3 bulbs are taken from the package for inspection. Let X be the number of defective bulbs observed during the inspection. Find P(X = m) for m ∈ {0, 1, 2}.

P(X = 0) = C(2, 0) C(4, 3) / C(6, 3), P(X = 1) = C(2, 1) C(4, 2) / C(6, 3), P(X = 2) = C(2, 2) C(4, 1) / C(6, 3).

Example: The sum of the Hypergeometric pmf must yield 1. Let us check this sum:

Σ_{m=0}^s P(X = m) = Σ_{m=0}^s C(s, m) C(f, n − m) / C(s + f, n).

We claim that C(s + f, n) = Σ_{m=0}^s C(s, m) C(f, n − m). To convince yourself of this equality, consider C(s + f, n), the number of n-element subsets of an (s + f)-element set. An indirect way to obtain these subsets is to split the (s + f)-element set into two sets with s and f elements.
To obtain n-element subsets, we can pick m elements from the s-element set and n − m elements from the f-element set. The indirect way also gives us all the n-element subsets, in Σ_{m=0}^s C(s, m) C(f, n − m) ways. So C(s + f, n) = Σ_{m=0}^s C(s, m) C(f, n − m), and the Hypergeometric pmf sums to 1.

Example: Find the expected value of the Hypergeometric random variable X.

E(X) = Σ_{m=0}^s m C(s, m) C(f, n − m) / C(s + f, n) = Σ_{m=1}^s s C(s − 1, m − 1) C(f, n − m) / C(s + f, n)
= Σ_{m=1}^s (sn/(s + f)) C(s − 1, m − 1) C(f, n − m) / C(s + f − 1, n − 1)
= (sn/(s + f)) Σ_{m=0}^{s−1} C(s − 1, m) C(f, n − 1 − m) / C(s + f − 1, n − 1) = sn/(s + f).
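The light-bulb pmf, the sum-to-one check, and the mean formula sn/(s + f) can all be verified in a short sketch (the helper name `hyper_pmf` is ours):

```python
from math import comb
from fractions import Fraction as F

def hyper_pmf(m, n, s, f):
    """P(X = m): m successes among n balls drawn without replacement
    from an urn with s success balls and f failure balls."""
    return F(comb(s, m) * comb(f, n - m), comb(s + f, n))

# Light-bulb example: s = 2 defective ("success"), f = 4 good, n = 3 inspected.
probs = [hyper_pmf(m, 3, 2, 4) for m in range(3)]
total = sum(probs)                                # sums to 1 (Vandermonde identity)
mean = sum(m * pr for m, pr in enumerate(probs))  # equals sn/(s + f) = 2*3/6 = 1
```

Here `comb(f, n - m)` returns 0 automatically when n − m > f, matching the convention used in the Vandermonde sum.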
4.5 Poisson Random Variable

Starting with a Binomial random variable B(n, p), what happens to B(n, p) as n → ∞ and p → 0 while np remains constant? These limits can be considered by setting λ = np, so that n → ∞ implies p = λ/n → 0:

lim_{n→∞} P(B(n, λ/n) = m) = lim_{n→∞} C(n, m)(λ/n)^m (1 − λ/n)^{n−m}
= (λ^m/m!) lim_{n→∞} [n(n − 1)...(n − m + 1)/n^m] [(1 − λ/n)^n] / [(1 − λ/n)^m],

where the first bracket tends to 1, the second to exp(−λ) and the third to 1, so the limit is exp(−λ) λ^m/m! for m = 0, 1, 2, .... The limiting random variable has the pmf p(m) = exp(−λ) λ^m/m!; it is called the Poisson distribution and is denoted Po(λ).

Example: Classical FM plays about 1 Led Zeppelin song every 10 hours. Suppose that the number of LZ songs played every 10 hours has a Po(1) distribution.
P(No LZ song played in the last 10 hours) = P(Po(1) = 0) = exp(−1) 1^0/0! = exp(−1) ≈ 0.3679.
P(1 LZ song played in the last 10 hours) = P(Po(1) = 1) = exp(−1) 1^1/1! = exp(−1) ≈ 0.3679.
P(2 or more LZ songs played in the last 10 hours) = P(Po(1) ≥ 2) = 1 − P(Po(1) ≤ 1) = 1 − exp(−1) − exp(−1) ≈ 0.2642.

The expected value of Po(λ) is λ:

E(Po(λ)) = Σ_{i=0}^∞ i exp(−λ) λ^i/i! = λ Σ_{i=1}^∞ exp(−λ) λ^{i−1}/(i − 1)! = λ Σ_{j=0}^∞ exp(−λ) λ^j/j! = λ.

Example: A customer enters a store in any given second with probability p = 0.01; no customer enters the store in that second with probability 0.99. Suppose that we consider n = 100 seconds. What is the probability of having exactly 1 customer in 100 seconds? This probability is P(B(100, 0.01) = 1) = C(100, 1)(0.01)^1(0.99)^99 ≈ 0.3697. We can also use B(100, 0.01) ≈ Po(1) and obtain P(Po(1) = 1) = exp(−1) ≈ 0.3679.

A Poisson random variable Po(λ) can be interpreted as the number of arrivals in a time interval. Suppose that Po(λ_1) and Po(λ_2) are the numbers of type 1 and type 2 customers arriving, independently of each other, in an interval. The total number of customers arriving in the interval is Po(λ_1) + Po(λ_2). This is the limiting distribution of B(n_1, p) + B(n_2, p) for λ_1 = n_1 p and λ_2 = n_2 p. Since B(n_1, p) + B(n_2, p) ∼ B(n_1 + n_2, p) → Po(λ_1 + λ_2), we obtain Po(λ_1) + Po(λ_2) ∼ Po(λ_1 + λ_2). That is, the sum of independent Poisson random variables is another Poisson random variable.
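The quality of the Poisson approximation in the store example can be checked directly; a minimal sketch comparing the exact Binomial probability with its Poisson limit:

```python
from math import comb, exp, factorial

def pois_pmf(m, lam):
    """P(Po(lam) = m) = exp(-lam) * lam^m / m!."""
    return exp(-lam) * lam**m / factorial(m)

# Store example: n = 100 seconds, p = 0.01 per second, so lambda = np = 1.
n, p = 100, 0.01
exact = comb(n, 1) * p**1 * (1 - p) ** (n - 1)  # Binomial P(X = 1)
approx = pois_pmf(1, n * p)                      # Poisson approximation e^(-1)
```

The two values differ by less than 0.002 here, illustrating why Po(np) is a convenient stand-in for B(n, p) when n is large and p is small.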
Formally, for an integer m,

P(B(n_1, p) + B(n_2, p) = m) = P(B(n_1 + n_2, p) = m),
P(B(n_1, p) + B(n_2, p) = m) → P(Po(λ_1) + Po(λ_2) = m),
P(B(n_1 + n_2, p) = m) → P(Po(λ_1 + λ_2) = m).

If two sequences are equal and converge, they must converge to the same number: P(Po(λ_1) + Po(λ_2) = m) = P(Po(λ_1 + λ_2) = m). Note that the convergence used here is the convergence of sequences of real numbers.

5 Solved Examples

1. A broadcasting company runs 4 different commercials for a car company and 96 commercials for other companies. The total of 100 commercials are shown on TV one by one in random order.
a) What is the expected number of commercials to see until (and including) the first commercial of the car company?
ANSWER The probability that a given commercial is a car commercial is 4/100, so it takes an expected 1/(4/100) = 25 commercials until a car commercial is seen.

b) What is the expected number of commercials to see until (and including) all of the commercials of the car company are seen?
ANSWER The probability of a car commercial is 4/100, so it takes an expected 1/(4/100) = 25 commercials until the first car commercial is seen. From then on, the probability of seeing a new distinct car commercial is 3/100, so it takes 1/(3/100) = 100/3 commercials until the second distinct car commercial is seen. Then the probability of seeing a new distinct car commercial is 2/100, so it takes 1/(2/100) = 50 commercials until the third. Finally, the probability of seeing the last car commercial is 1/100, so it takes 1/(1/100) = 100 commercials. In total it takes an expected number of 100/4 + 100/3 + 100/2 + 100/1 commercials to see all car commercials.
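The coupon-collector style sum in part b) can be evaluated exactly with a one-line sketch:

```python
from fractions import Fraction as F

# Solved example 1b: 4 distinct car ads among 100 commercials shown at random.
# While k distinct car ads remain unseen, each commercial is a new car ad with
# probability k/100, a geometric wait with mean 100/k.
expected = sum(F(100, k) for k in range(1, 5))
# expected = 100/1 + 100/2 + 100/3 + 100/4 = 625/3, about 208.33 commercials
```

So on average a viewer must watch roughly 208 commercials before all four car commercials have appeared.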
6 Exercises

1. 5 people including you and a friend line up at random. Let X be the number of people between you and your friend; find F_X.

2. N people including you and a friend line up at random. Let X be the number of people between you and your friend; find P(X = 0) and P(X = N − 2).

3. Suppose a fair coin is tossed 5 times. Let X be the length of the longest head run and find P(X = x) for x ≤ 5.

4. In the case of the Geometric random variable, we have Bernoulli trials with success probability p and we are interested in the number of the trial in which the first success happens. Let X(i, p) be the number of the Bernoulli trial in which the ith success happens in a series of Bernoulli trials with success probability p. Clearly, X(1, p) is the Geometric random variable. Find the pmf of X(i, p).

5. Let X(i, p) be the number of the Bernoulli trial in which the ith success happens in a series of Bernoulli trials with success probability p. When X(1, p) is the Geometric random variable, Y(1, p) := X(1, p) − 1 is the number of failures until the first success. Find the pmf of Y(1, p) and F_{Y(1,p)}(n). Does Y(1, p) have the memoryless property?

6. Consider a series of Bernoulli trials with success probability p and let X(i, p) be the number of the Bernoulli trial in which the ith success happens. Recall that B(n, p) is the number of successes in n trials. For an integer a, we have P(X(i, p) > n) = P(B(n, p) < i + a). What should a be to make this equality valid? After finding this a, check that P(X(i, p) > n) = P(B(n, p) < i + a) holds for i > n.

7. Let X(i_j, p) be the number of the Bernoulli trial in which the i_j th success happens in a series of Bernoulli trials with success probability p. Suppose that the X(i_j, p)'s are independent. Establish that

Σ_j X(i_j, p) ∼ X(Σ_j i_j, p),

where ∼ indicates having the same distribution.

8. For a random variable X parametrized by (r, p) we have the pmf P(X(r, p) = k) = C(k + r − 1, k) p^k (1 − p)^r for k = 0, 1, 2, ....
Show that the mean of this distribution is rp/(1 − p).

9. For a random variable X parametrized by (r, p) we have the pmf

P(X(r, p) = k) = (Γ(k + r)/(k! Γ(r))) p^k (1 − p)^r.

What happens to this pmf as r → ∞ and p → 0 while the mean rp/(1 − p) remains constant? Identify the limiting pmf and determine which common random variable it corresponds to.

10. Let X(i, p) be the number of Bernoulli trials required to get the ith success in a series of Bernoulli trials with success probability p and i ≥ 1. Let Y(i, p) be the number of Bernoulli trials resulting in failure until the ith success happens. What is the relation between X(i, p) and Y(i, p)? Find the pmf of Y(i, p).
11. Show that the variance of the Hypergeometric random variable is (ns/(s + f))(1 − s/(s + f))((s + f − n)/(s + f − 1)).

12. Use the parametrization N = s + f, s = Np and f = N(1 − p) in the Hypergeometric variance (ns/(s + f))(1 − s/(s + f))((s + f − n)/(s + f − 1)) and simplify this variance as N → ∞. Relate this limiting variance to other variances that you may know.

13. For fixed parameters n ≥ 1 and 0 < q < 1, the pmf of a random variable X whose range is {n, n + 1, n + 2, ...} satisfies

p(n) = (1 − q)^n and p(m)/p(m − 1) = (m − 1) q/(m − n) for m > n.

Find the expected value of X. You may use Σ_{i=0}^∞ C(n + i, i) q^i = (1 − q)^{−n−1}.

14. In a social experiment to measure the honesty of several cities, Reader's Digest reporters¹ intentionally dropped 12 wallets one by one in a central area in each of the following cities and counted the number of wallets returned to their owners by the people who found them.

City: Lisbon, Madrid, Prague, Zurich, Rio, Bucharest, Warsaw, London
# returned:
City: Ljubljana, Berlin, Amsterdam, Moscow, New York, Budapest, Mumbai, Helsinki
# returned:

a) Let X_{i,j} = 1 if the jth dropped wallet is returned in city i; otherwise, it is zero. What are the ranges of values for i and j in the Reader's Digest experiment? For distinct integers i, j, k from the appropriate ranges, consider the pairs of random variables (X_{i,j}, X_{i,j+1}), (X_{i,j}, X_{i,j+2}), (X_{i,j}, X_{i+1,j}) and (X_{i,j}, X_{i+2,j}). Which of these pairs include random variables that are more likely to be independent within the pair?

b) Consider the random variables Y_j = Σ_i X_{i,j} and Z_i = Σ_j X_{i,j}, where the sums are taken over all possible index values of i and j. Express these sums in English.

c) If possible, list the assumptions needed to conclude that Y_3 is a Binomial random variable and specify the parameters of the associated Binomial distribution. Otherwise, explain why this is not possible.
d) If possible, list the assumptions needed to conclude that Z_5 is a Binomial random variable and specify the parameters of the associated Binomial distribution. Otherwise, explain why this is not possible.

e) If possible, list the assumptions needed to conclude that Y_3 + Z_5 is a Binomial random variable and specify the parameters of the associated Binomial distribution. Otherwise, explain why this is not possible.

15. 5 people take an elevator from the bottom floor of a building to the higher floors: first, second, third and fourth. Each person on the elevator is independently going to one of the floors with equal probability. Let i = 0 denote the bottom floor and let S_i be 1 if the elevator stops on floor i = 1, 2, 3, 4; otherwise it is 0.

a) What is P(Elevator does not stop on floor i) for i = 1, 2, 3, 4? What is the distribution of S_i?

b) When the elevator arrives at the first floor, two people get off at the first floor and 3 remain. Subject to this information, find the range of S_2 + S_3 + S_4 and P(S_2 + S_3 + S_4 = k) for the appropriate values of k.

¹ Most Honest Cities: The Reader's Digest Lost Wallet Test.
Is the distribution of S_2 + S_3 + S_4 binomial or not? Do you expect it to be binomial, and why? Hint: Part c) is shorter, so you may solve that first.

c) When the elevator arrives at the second floor, one person gets off at the second floor and 2 remain. Subject to this information, find the range of S_3 + S_4 and P(S_3 + S_4 = k) for the appropriate values of k.

d) When the elevator arrives at the second floor, one person gets off at the second floor and 2 remain. But these two are talking about a joint project and are going to the same floor. Subject to this information, find the range of S_3 + S_4 and P(S_3 + S_4 = k) for the appropriate values of k.

16. An urn contains 10 marbles, one of which is defective.

a) A marble is picked from the urn. If it is the defective one, we stop. Otherwise, we return the picked marble to the urn and we pick another marble. We continue to pick marbles from the urn until the defective one is found. Let M be the number of picks; find the range of M and P(M = k) for the appropriate values of k.

b) A marble is picked from the urn. If it is the defective one, we stop. Otherwise, we put the picked marble aside (outside the urn) and we pick another marble. We continue to pick marbles from the urn until the defective one is found. Let N be the number of picks; find the range of N and P(N = k) for the appropriate values of k.

17. An urn contains 10 marbles, two of which are defective.

a) Two marbles are picked at once from the urn. If these are the defective ones, we stop and record 1 pick. Otherwise, we return the picked marbles to the urn and we pick two other marbles. We continue to pick two marbles from the urn until both defective marbles are found. In this scheme, every pick from the urn involves picking 2 marbles. Let M be the number of picks (of two marbles) rather than the number of marbles picked. Find the range of M and P(M = k) for the appropriate values of k.

b) A marble is picked from the urn. Then another marble is picked.
If these are the defective ones, we stop by recording 2 picks. Otherwise, 1 or 2 of the picked marbles are non-defective. We put the non-defective marbles aside (outside the urn). If 1 marble is put aside (for being non-defective), we pick 1 marble from the urn to have two marbles in hand. If 2 marbles are put aside (for being non-defective), we pick 2 marbles at once from the urn to have two marbles in hand. After these picks, we have 2 marbles in hand. If both marbles in hand are defective, we stop. Otherwise, we continue to pick marbles from the urn until both defectives are found. Let N be the number of marbles picked from the urn. Find the range of N and P(N = k) for appropriate values of k.

18. For real numbers x, y, we always have max{x, y} + min{x, y} = x + y, which is inherited by random variables X, Y as max{X, Y} + min{X, Y} = X + Y. Does this equality hold when written in terms of probabilities as P(max{X, Y} = a) + P(min{X, Y} = a) = P(X = a) + P(Y = a) for every real number a? Either prove the last equality or provide a counterexample that fails it.
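Before deriving the pmfs in Problem 16, the two sampling schemes can be sanity-checked by simulation. The sketch below is not part of the exercise and does not give the pmfs away; the function names are my own, and the code only assumes what the problem states: an urn of 10 marbles, exactly one defective, equally likely picks.

```python
import random

def picks_with_replacement(n_marbles=10, rng=random):
    """Scheme 16a: return non-defective picks to the urn; count picks until the defective is found."""
    picks = 0
    while True:
        picks += 1
        if rng.randrange(n_marbles) == 0:  # label the defective marble as 0
            return picks

def picks_without_replacement(n_marbles=10, rng=random):
    """Scheme 16b: set non-defective picks aside; count picks until the defective is found."""
    remaining = n_marbles
    picks = 0
    while True:
        picks += 1
        if rng.randrange(remaining) == 0:  # defective equally likely among the remaining marbles
            return picks
        remaining -= 1  # a non-defective marble goes outside the urn

random.seed(0)
trials = 100_000
m = [picks_with_replacement() for _ in range(trials)]
n = [picks_without_replacement() for _ in range(trials)]
print("average picks, with replacement   :", sum(m) / trials)
print("average picks, without replacement:", sum(n) / trials)
print("largest pick count without replacement:", max(n))
```

Without replacement the number of picks can never exceed 10, while with replacement it is unbounded; the two empirical averages hint at the shapes of the ranges and pmfs you are asked to derive.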
7 Appendix: Negatives in Factorials and in Binomial Coefficients

For a nonnegative integer n, n! = n(n−1)(n−2)···1 and 0! = 1. For positive non-integers as well as negative real numbers, this definition does not help. Instead, the factorial is defined through the Gamma function Γ:

n! = Γ(n+1) = ∫_0^∞ x^n e^{−x} dx.

Note that 0! = Γ(1) = ∫_0^∞ e^{−x} dx = 1. Also, integration by parts with u(x) = x^n (or du = n x^{n−1} dx) and v(x) = −e^{−x} (or dv = e^{−x} dx) confirms

Γ(n+1) = ∫_0^∞ x^n e^{−x} dx = [−x^n e^{−x}]_0^∞ + ∫_0^∞ n x^{n−1} e^{−x} dx = n ∫_0^∞ x^{n−1} e^{−x} dx = nΓ(n).

Because of Γ(n+1) = nΓ(n), knowing Γ over (0,1) is sufficient to find Γ for any positive real number. For example, Γ(7/2) = (5/2)(3/2)(1/2)Γ(1/2). We can use the substitution u := √x to obtain

Γ(1/2) = ∫_0^∞ e^{−x} x^{−1/2} dx = ∫_0^∞ e^{−u²} (1/u) 2u du = 2 ∫_0^∞ e^{−u²} du.

Then

Γ(1/2)² = 4 ∫_0^∞ e^{−u²} du ∫_0^∞ e^{−v²} dv = 4 ∫_0^∞ ∫_0^∞ e^{−(u²+v²)} du dv.

Using polar coordinates,

Γ(1/2)² = 4 ∫_0^{π/2} ∫_0^∞ e^{−r²} r dr dθ = 4 (π/2) ∫_0^∞ e^{−r²} r dr = 2π ∫_0^∞ e^{−t} (1/2) dt = π.

Hence, Γ(1/2) = √π. We also obtain

Γ(0) = ∫_0^∞ x^{−1} e^{−x} dx ≥ lim_{n→∞} ∫_{1/n}^1 x^{−1} e^{−x} dx ≥ e^{−1} lim_{n→∞} ∫_{1/n}^1 x^{−1} dx = e^{−1} lim_{n→∞} (ln(1) − ln(1/n)) = ∞,

so Γ is undefined for 0. By Γ(n) = Γ(n+1)/n, Γ is undefined for all negative integers as well. Γ takes the value −∞ or ∞ depending on whether we approach 0 or a negative integer from the left or from the right. But in general, we should remember that n! = Γ(n+1) = ±∞ for negative integer n. The Γ function is plotted in Figure 1; see that the function diverges at nonpositive integers such as 0, −1, −2, −3, −4.

Having presented negative integer factorials, we consider the binomial coefficients C^n_m for nonnegative integers n, m but n < m:

C^n_m = n(n−1)···(m+1)/(n−m)! = 0 for n < m,

because (n−m)! = ±∞ for n−m < 0. With this understanding, we can write the binomial formula as

(1+x)^n = Σ_{i=0}^n C^n_i x^i = Σ_{i=0}^∞ C^n_i x^i.

Would this binomial formula hold when n < 0? For integers n < 0 and m > 0, we need to define C^n_m:

C^n_m := (n)(n−1)···(n−m+2)(n−m+1)/m! for n < 0 < m,
Figure 1: Γ(n) for n ∈ {−4.975, −4.925, ..., 4.975}, produced by the R commands x <- (1:200)/20 - 5.025; y <- gamma(x); plot(x, y).

where the factorial notation is applied only onto the nonnegative m. This binomial coefficient can also be written as

C^n_m = (−1)^m (−n+m−1)(−n+m−2)···(−n+1)(−n)/m! = (−1)^m C^{−n+m−1}_m for n < 0 < m.

Then the negative binomial expansion for n < 0 and |x| < 1 (a technical condition for convergence) is

(1+x)^n = Σ_{i=0}^∞ C^n_i x^i = Σ_{i=0}^∞ (−1)^i C^{−n+i−1}_i x^i.

Interestingly, the formula (1+x)^n = Σ_i C^n_i x^i applies both for n ≥ 0 and n < 0 with the appropriate interpretation of the binomial coefficients for 0 ≤ n < m and for n < 0 < m.

Example: Find (1−x)^{−n−1} for n ≥ 0 and 0 < x < 1.

(1−x)^{−n−1} = Σ_{i=0}^∞ C^{−n−1}_i (−x)^i = Σ_{i=0}^∞ (−1)^i C^{n+1+i−1}_i (−1)^i x^i = Σ_{i=0}^∞ C^{n+i}_i x^i.

We then obtain the following useful identity:

1/(1−x)^{n+1} = Σ_{i=0}^∞ C^{n+i}_i x^i.

8 Some Identities Involving Combinations

C^n_m + C^n_{m+1} = C^{n+1}_{m+1}. When making up (m+1)-element subsets from n+1 elements, you can fix one element of the n+1 elements. If this fixed element is in your subset, you can choose the remaining m elements from the other n elements in C^n_m ways. Otherwise, you have to choose all m+1 elements from those n elements in C^n_{m+1} ways.

Σ_{i=0}^k C^m_i C^n_{k−i} = C^{m+n}_k.
A combination of k objects from a group of m+n objects must have i ≤ k objects from the group of m objects and the remaining k−i objects from the group of n objects.

Σ_{i=m}^n C^i_m = C^{n+1}_{m+1} for integers m ≤ n.

Σ_{i=m}^n C^i_m = C^{m+1}_{m+1} + C^{m+1}_m + C^{m+2}_m + C^{m+3}_m + ··· + C^n_m = C^{m+2}_{m+1} + C^{m+2}_m + C^{m+3}_m + ··· + C^n_m, where we use C^{m+2}_{m+1} = C^{m+1}_{m+1} + C^{m+1}_m. Then C^{m+2}_{m+1} + C^{m+2}_m + C^{m+3}_m + ··· + C^n_m = C^{m+3}_{m+1} + C^{m+3}_m + ··· + C^n_m, where we use C^{m+3}_{m+1} = C^{m+2}_{m+1} + C^{m+2}_m. Continuing this way, we eventually obtain Σ_{i=m}^n C^i_m = C^n_{m+1} + C^n_m = C^{n+1}_{m+1}.

If you know of some other useful identities, you can ask me to list them here.
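These identities, together with the appendix's conventions for C^n_m outside 0 ≤ m ≤ n, are easy to spot-check numerically. A minimal sketch: the helper C below is my own wrapper around Python's math.comb, returning 0 for 0 ≤ n < m and (−1)^m C^{−n+m−1}_m for a negative upper index, exactly as derived above.

```python
from math import comb

def C(n, m):
    """Binomial coefficient C^n_m with the appendix's conventions."""
    if m < 0:
        return 0
    if n >= 0:
        return comb(n, m) if n >= m else 0   # (n-m)! = +/-infinity gives 0 for n < m
    return (-1) ** m * comb(-n + m - 1, m)   # negative upper index n < 0 <= m

# Pascal's rule: C^n_m + C^n_{m+1} = C^{n+1}_{m+1}
assert C(7, 3) + C(7, 4) == C(8, 4)

# Vandermonde: sum_{i=0}^k C^m_i C^n_{k-i} = C^{m+n}_k
m_, n_, k = 5, 6, 4
assert sum(C(m_, i) * C(n_, k - i) for i in range(k + 1)) == C(m_ + n_, k)

# Hockey stick: sum_{i=m}^n C^i_m = C^{n+1}_{m+1}, here with m = 2, n = 8
assert sum(C(i, 2) for i in range(2, 9)) == C(9, 3)

# Negative binomial series: (1+x)^n for n < 0, i.e. 1/(1-x)^{n+1} = sum_i C^{n+i}_i x^i
x, n = 0.25, 3
partial = sum(C(-n - 1, i) * (-x) ** i for i in range(60))
print(partial, (1 - x) ** (-n - 1))
```

The partial sum of the series agrees with (1−x)^{−n−1} to within floating-point accuracy, because the coefficients C^{−n−1}_i (−1)^i reduce to C^{n+i}_i as in the example above.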