Chapter 4 : Discrete Random Variables


STAT/MATH 394 A - PROBABILITY I
UW Autumn Quarter 2015
Néhémy Lim

Chapter 4 : Discrete Random Variables

1 Random variables

Objectives of this section.
- To learn the formal definition of a random variable.
- To understand the difference between a discrete and a continuous random variable.

Motivating example. Select three fans randomly among those attending a tennis match between Roger Federer and Novak Djokovic. A Federer fan is denoted by F and a Djokovic fan by D. This experiment yields the following sample space:

Ω = {DDD, DDF, DFD, DFF, FFD, FDF, FDD, FFF}

Now, let us define the following events :
E : none of the three fans likes Federer
F : exactly one of the three fans likes Federer
G : exactly two of the three fans like Federer
H : all of the three fans like Federer

Here, we can remark that a specific number can be assigned to each of the four events. Let us define some variable X that represents the number of Federer fans selected. The possible values of X are, therefore, 0, 1, 2, or 3. In this case, events E, F, G, H are assigned respectively the values 0, 1, 2, 3, and we would intuitively write the corresponding probabilities P(X = 0), P(X = 1), P(X = 2), P(X = 3). Let us suppose that 80% of the audience attending the game are Federer fans. Then, by independence, we have :

P(X = 0) = P(E) = P({DDD}) = 0.2 × 0.2 × 0.2 = 0.008
P(X = 1) = P(F) = P({DDF}) + P({DFD}) + P({FDD}) = 3 × (0.8 × 0.2²) = 0.096
since the three outcomes are mutually exclusive
P(X = 2) = P(G) = P({FFD}) + P({FDF}) + P({DFF}) = 3 × (0.8² × 0.2) = 0.384

P(X = 3) = P(H) = P({FFF}) = 0.8³ = 0.512

There are a few things to note here:
- The results make sense! Given that 80% of the fans in the stands are Federer fans, it should not seem surprising that we would be most likely to select 2 or 3 Federer fans.
- The probabilities P(X = x_i) behave well in that (1) the probabilities are all nonnegative, that is, P(X = x_i) ≥ 0, and (2) the probability of the sample space is 1, that is, P(Ω) = P(X = 0) + P(X = 1) + P(X = 2) + P(X = 3) = 1.
- Because the values that X takes on are random, the variable X is called a random variable.

Formally, the set of values that X can take on should be endowed with a σ-algebra. In this course, we will always consider random variables that take values in ℝ. In this case, the σ-algebra that we will work with is called the Borel σ-algebra on ℝ and is defined as follows.

Definition 1.1 (Borel σ-algebra on ℝ). The Borel σ-algebra on ℝ is the smallest (in the sense of set inclusion) σ-algebra that contains all open intervals in ℝ. The Borel σ-algebra on ℝ is denoted B(ℝ).

Reminder: Open intervals in ℝ are of three kinds : (a, b), with a, b ∈ ℝ; (a, ∞), with a ∈ ℝ; (−∞, b), with b ∈ ℝ.

Proposition 1.1. B(ℝ) also contains all closed intervals.

Proof. Show that for any a, b ∈ ℝ,

[a, b] = ∩_{n=1}^∞ (a − 1/n, b + 1/n)

Definition 1.2 (Random variable). Let (Ω, A) be a measurable space of events on the sample space Ω. A real-valued random variable (r.r.v.) X is a function with domain Ω, i.e. X : Ω → ℝ, such that for any Borel set B ∈ B(ℝ):

{ω ∈ Ω | X(ω) ∈ B} ∈ A     (1)

The event {ω ∈ Ω | X(ω) ∈ B} is simply denoted {X ∈ B}.

Notations. For a, b ∈ ℝ, we have the following commonly used notations :
- X ∈ (a, b) is denoted a < X < b
- X ∈ [a, b) is denoted a ≤ X < b
- X ∈ (a, b] is denoted a < X ≤ b
- X ∈ [a, b] is denoted a ≤ X ≤ b
- X ∈ (−∞, b) is denoted X < b
- X ∈ (−∞, b] is denoted X ≤ b
- X ∈ (a, ∞) is denoted X > a
- X ∈ [a, ∞) is denoted X ≥ a
- X ∈ {a} is denoted X = a

Definition 1.3 (Discrete random variable). A real-valued random variable X is said to be discrete if X can take :
- either a finite number of values : X(Ω) = {x_i ∈ ℝ, i = 1, ..., n} for a given n ∈ ℕ
- or a countably infinite number of values : X(Ω) = {x_i ∈ ℝ, i ∈ I} for a given subset I ⊆ ℕ.

If X is not discrete, we say that X is continuous. The following examples illustrate the difference between the two kinds of random variables.
- X : number of siblings of a person. X is discrete since X(Ω) = {0, 1, ...} = ℕ.
- Y : time spent working on STAT394, in hours. Y can take any positive value and these values cannot be listed or indexed. Y takes an uncountably infinite number of values : Y(Ω) = ℝ₊. Thus Y is not discrete but continuous.

We focus on discrete real-valued random variables in this chapter.

2 Probability Mass Function and Distribution Function

Objectives of this section.
- To learn the formal definition of a discrete probability mass function.
- To learn the formal definition of a distribution function.

Definition 2.1 (Probability mass function). Let X be a discrete r.r.v. on probability space (Ω, A, P) that takes values in X(Ω) = {x_i ∈ ℝ, i ∈ I} for a given subset I ⊆ ℕ. The probability mass function (p.m.f.) p_X of X is the function with domain X(Ω) defined by :

p_X(x_i) = P(X = x_i), for i ∈ I     (2)

In the previous example with Federer and Djokovic fans, the probability mass function p_X of random variable X is given by :

p_X(0) = P(X = 0) = 0.008
p_X(1) = P(X = 1) = 0.096
p_X(2) = P(X = 2) = 0.384
p_X(3) = P(X = 3) = 0.512

Representations of a discrete random variable. A discrete r.r.v. can be represented by:
- a table
- a bar graph or histogram

Below are the representations for random variable X, number of Federer fans :

x_i       | 0     | 1     | 2     | 3
p_X(x_i)  | 0.008 | 0.096 | 0.384 | 0.512

Table 1: Tabular form of p_X

[Figure 1: Histogram of p_X]

Theorem 2.1. A probability mass function completely determines the distribution of a discrete real-valued random variable.

Remark : This theorem means that two discrete real-valued random variables X and Y that have exactly the same probability mass functions (same values

on all points of their domain) are identically distributed. We will see other functions that also completely determine the distribution of a random variable.

Be careful! If X and Y are identically distributed, this does not imply that X and Y are equal.

Example. Consider the experiment of tossing a fair coin three times. Define the random variables X, the number of heads observed, and Y, the number of tails observed. Show that X and Y have the same p.m.f.s but are not equal.

Property 2.1. Let X be a discrete r.r.v. on probability space (Ω, A, P) that takes values in X(Ω) = {x_i ∈ ℝ, i ∈ I} for a given subset I ⊆ ℕ, with p.m.f. p_X. Then for any subset J ⊆ I,

P(X ∈ {x_j ∈ ℝ, j ∈ J}) = Σ_{j∈J} p_X(x_j)

Proof. We have the following :

P(X ∈ {x_j ∈ ℝ, j ∈ J}) = P(∪_{j∈J} {X = x_j})
= Σ_{j∈J} P(X = x_j)   since the events {X = x_j} are mutually exclusive
= Σ_{j∈J} p_X(x_j)

The previous property justifies the following calculation : the probability that no fan or two fans of Federer are selected is

P(X ∈ {0, 2}) = p_X(0) + p_X(2) = 0.008 + 0.384 = 0.392

Property 2.2. Let X be a discrete r.r.v. that takes values in X(Ω) = {x_i ∈ ℝ, i ∈ I}, with I ⊆ ℕ and p.m.f. p_X. Then the following holds :
- p_X(x_i) ≥ 0, for i ∈ I
- Σ_{i∈I} p_X(x_i) = 1

Proof.
- It is obvious that p_X(x_i) = P(X = x_i) ≥ 0 since P is a probability measure.
- Σ_{i∈I} p_X(x_i) = P(X ∈ X(Ω)) = P(Ω) = 1.

Definition 2.2. Let I ⊆ ℕ. A real-valued function p with domain {x_i ∈ ℝ, i ∈ I} is said to be a valid p.m.f. if the following holds :

- p(x_i) ≥ 0, for i ∈ I
- Σ_{i∈I} p(x_i) = 1

This means that if p is a valid p.m.f., then there exists some discrete r.r.v. X that admits p as its p.m.f.

Example. Let p(x) = cx² for x = 1, 2, 3. Determine the constant c so that the function p satisfies the conditions of being a valid probability mass function.
- The first condition implies that c is nonnegative, since p(x) ≥ 0 and x² ≥ 0.
- The second condition is that p should sum to 1 :

Σ_{x=1}^{3} p(x) = 1
c · 1² + c · 2² + c · 3² = 1
c(1 + 4 + 9) = 1
c = 1/14

Example. Determine the constant c so that the following function p satisfies the conditions of being a valid probability mass function :

p(x) = c (1/4)^x, for x ∈ ℕ

Hint: If (u_n) is a geometric sequence with first term u_0 = a and common ratio r, that is, u_{n+1} = r u_n with −1 < r < 1, then the sum of the geometric series is : Σ_{n=0}^∞ u_n = a/(1 − r).
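Both normalization examples can be checked with a short computation. The sketch below is not part of the original notes; for the second example it assumes the chapter's earlier convention that ℕ = {0, 1, 2, ...}:

```python
from fractions import Fraction

# Example 1: p(x) = c * x**2 for x = 1, 2, 3.
# The normalization condition c * (1 + 4 + 9) = 1 gives c = 1/14.
c1 = Fraction(1, sum(x**2 for x in (1, 2, 3)))

# Example 2: p(x) = c * (1/4)**x for x in N = {0, 1, 2, ...}.
# The geometric series sums to 1 / (1 - 1/4) = 4/3, hence c = 3/4.
c2 = Fraction(3, 4)

# Numerical sanity check: a long truncated sum of the second p.m.f.
# should be extremely close to 1.
total = sum(float(c2) * (1 / 4) ** x for x in range(200))
```

If ℕ were instead taken to start at 1, the same calculation would give c = 3.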

Definition 2.3 (Cumulative distribution function). Let (Ω, A, P) be a probability space. The (cumulative) distribution function (c.d.f.) of a real-valued random variable X is the function F_X given by

F_X(x) = P(X ≤ x), for all x ∈ ℝ     (3)

Property 2.3. Let F_X be the distribution function of a random variable X. Following are some properties of F_X :
- F_X is increasing : x ≤ y implies F_X(x) ≤ F_X(y)
- lim_{x→∞} F_X(x) = 1 and lim_{x→−∞} F_X(x) = 0
- F_X is càdlàg :
  - F_X is right continuous : lim_{x↓x_0} F_X(x) = F_X(x_0), for x_0 ∈ ℝ
  - F_X has left limits : lim_{x↑x_0} F_X(x) exists, for x_0 ∈ ℝ

Property 2.4. Let X be a discrete r.r.v. that takes its values in X(Ω) = {x_i ∈ ℝ, i ∈ I}, with I ⊆ ℕ, and let F_X be the distribution function of X. Then F_X is piecewise constant and discontinuous at the points x_i ∈ X(Ω). Precisely, if the points x_i are sorted in increasing order (x_0 < x_1 < ...), then for any i ∈ I,

F_X(x) = Σ_{k=0}^{i} P(X = x_k) = Σ_{k=0}^{i} p_X(x_k), for x ∈ [x_i, x_{i+1})     (4)

Example. Consider the experiment of tossing a fair coin three times. Let X be the number of heads observed. We previously saw that the corresponding probability mass function p_X is given by the table :

x_i       | 0   | 1   | 2   | 3
p_X(x_i)  | 1/8 | 3/8 | 3/8 | 1/8

For instance, F_X(2) = P(X ≤ 2) = p_X(0) + p_X(1) + p_X(2) = 1/8 + 3/8 + 3/8 = 7/8.
- What is F_X(−2)? By definition, it is F_X(−2) = P(X ≤ −2), but X cannot take any value below −2. Therefore F_X(−2) = 0.
- What is F_X(1.74)? By definition, it is F_X(1.74) = P(X ≤ 1.74) = p_X(0) + p_X(1) = 1/8 + 3/8 = 1/2.
- What is F_X(4)? By definition, it is F_X(4) = P(X ≤ 4) = p_X(0) + p_X(1) + p_X(2) + p_X(3) = 1/8 + 3/8 + 3/8 + 1/8 = 1. There are no more values beyond 3, so the distribution function remains constant, equal to 1.

The function F_X is thus defined as follows :

F_X(x) = 0      for x < 0
       = 1/8    for 0 ≤ x < 1
       = 1/2    for 1 ≤ x < 2
       = 7/8    for 2 ≤ x < 3
       = 1      for x ≥ 3

[Figure: plot of the distribution function F_X]

Theorem 2.2. A distribution function completely determines the distribution of a real-valued random variable.

3 Mathematical Expectation

Objectives of this section.
- To get a general understanding of the mathematical expectation of a discrete random variable.
- To learn a formal definition of the expected value of a function of a discrete random variable.
- To understand that the expected value of a discrete random variable may not exist.
- To learn and be able to apply the properties of mathematical expectation.

Motivating example. Toss a fair, six-sided die many times, say 100,000 times. How do you compute the average (or mean) of the tosses? Assume that the first elements of the resulting sequence are : 1, 5, 4, 1, 2, 3, 6, 2, 5, 4, 2, .... Then the mean would be :
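The step-function behaviour of F_X can be reproduced directly from the p.m.f.; here is a minimal sketch for the three-coin example above:

```python
from fractions import Fraction

# p.m.f. of the number of heads in three fair-coin tosses
pmf = {0: Fraction(1, 8), 1: Fraction(3, 8), 2: Fraction(3, 8), 3: Fraction(1, 8)}

def cdf(x):
    """F_X(x) = P(X <= x): sum the p.m.f. over all values not exceeding x."""
    return sum(p for xi, p in pmf.items() if xi <= x)
```

Evaluating `cdf` at −2, 1.74, 2 and 4 reproduces the four values worked out above (0, 1/2, 7/8 and 1).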

mean = (1 + 5 + 4 + 1 + 2 + 3 + 6 + 2 + 5 + 4 + 2 + ...) / 100,000
     = (1 × (number of 1s obtained) + 2 × (number of 2s obtained) + 3 × (number of 3s obtained)
        + 4 × (number of 4s obtained) + 5 × (number of 5s obtained) + 6 × (number of 6s obtained)) / 100,000
     = 1 × (frequency of 1) + 2 × (frequency of 2) + 3 × (frequency of 3)
        + 4 × (frequency of 4) + 5 × (frequency of 5) + 6 × (frequency of 6)

Remarks:
- In reality, one-sixth of the tosses will equal x_i ∈ {1, ..., 6} only in the long run : frequencies converge to probabilities.
- The mean is an average of the values weighted by their respective individual frequencies.

Definition 3.1 (Expected Value). Let X be a discrete r.r.v. that takes its values in X(Ω) = {x_i ∈ ℝ, i ∈ I}, with I ⊆ ℕ. Let p_X be the associated p.m.f. If Σ_{i∈I} x_i p_X(x_i) is absolutely convergent, i.e.

Σ_{i∈I} |x_i| p_X(x_i) < ∞,

then we say that X is integrable. In that case, the mathematical expectation (or expected value or mean) of X exists, is denoted by E[X] and is defined as follows :

E[X] = Σ_{i∈I} x_i p_X(x_i)     (5)

Definition 3.2 (Expected Value of a Function of a Random Variable). Let X be a discrete r.r.v. that takes its values in X(Ω) = {x_i ∈ ℝ, i ∈ I}, with I ⊆ ℕ. Let p_X be the associated p.m.f. and let g : X(Ω) → ℝ be a piecewise continuous function. If the random variable g(X) is integrable, then the mathematical expectation of g(X) exists, is denoted by E[g(X)] and is defined as follows :

E[g(X)] = Σ_{i∈I} g(x_i) p_X(x_i)     (6)

Example. A roulette wheel contains 38 numbers: zero (0), double zero (00), and the numbers 1, 2, 3, ..., 36. Let X denote the number on which the ball lands and g(X) denote the amount of money paid to the gambler, such that:

g(X) = $5 if X = 0
g(X) = $10 if X = 00

g(X) = $1 if X is even
g(X) = $2 if X is odd

If I run a casino, how much would I have to charge each gambler to play in order to ensure that I made some money?

Example. Let X be a discrete r.r.v. with the following probability mass function:

p_X(x) = c / x², for x = 1, 2, 3, ...

1. Determine the constant c.
2. What is the expected value of X?

Example. Suppose the p.m.f. p_X of a discrete random variable X is given by the table :

x_i   p_X(x_i)

1. What is E[2]?
2. What is E[X]?
3. What is E[2X]?

Property 3.1. Let X be a discrete r.r.v. that takes its values in X(Ω) = {x_i ∈ ℝ, i ∈ I}, with I ⊆ ℕ, with associated p.m.f. p_X.
- For all c ∈ ℝ, E[c] = c.
- If c ∈ ℝ and g : X(Ω) → ℝ is a piecewise continuous function such that g(X) is integrable, then :

E[cg(X)] = cE[g(X)]     (7)

- If g : X(Ω) → ℝ is a nonnegative piecewise continuous function such that g(X) is integrable, then :

E[g(X)] ≥ 0     (8)

- If g_1 : X(Ω) → ℝ and g_2 : X(Ω) → ℝ are piecewise continuous functions such that g_1(X) and g_2(X) are integrable and g_1 ≤ g_2, then :

E[g_1(X)] ≤ E[g_2(X)]     (9)
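For the roulette question, the casino breaks even at the expected payout E[g(X)], which equation (6) lets us compute directly; a quick sketch:

```python
from fractions import Fraction

# Payouts from the roulette example: 38 equally likely pockets:
# 0, 00, and 1..36 (18 even and 18 odd numbers among 1..36).
payout = {"0": 5, "00": 10}
for k in range(1, 37):
    payout[str(k)] = 1 if k % 2 == 0 else 2

p = Fraction(1, 38)  # each pocket is equally likely
expected_payout = sum(g * p for g in payout.values())
# expected_payout = (5 + 10 + 18*1 + 18*2) / 38 = 69/38
```

Since E[g(X)] = 69/38 ≈ $1.82, charging anything above that amount per play gives the casino a positive expected profit.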

Proof. Let us first show that for all c ∈ ℝ, E[c] = c. Here, we consider the function g equal to the constant c :

E[c] = Σ_{i∈I} c p_X(x_i) = c Σ_{i∈I} p_X(x_i) = c,

since Σ_{i∈I} p_X(x_i) = 1 because p_X is a p.m.f.

Now, let us consider a piecewise continuous function g : X(Ω) → ℝ such that g(X) is integrable :

E[cg(X)] = Σ_{i∈I} c g(x_i) p_X(x_i) = c Σ_{i∈I} g(x_i) p_X(x_i) = cE[g(X)]

The last two statements are left as an exercise.

Example. Let us return to the same previous discrete random variable X:

x_i   p_X(x_i)

1. What is E[X²]?
2. What is E[2X + 3X²]?

Property 3.2. Let X be a discrete r.r.v. that takes its values in X(Ω) = {x_i ∈ ℝ, i ∈ I}, with I ⊆ ℕ, with associated p.m.f. p_X. If c_1, c_2 ∈ ℝ and g_1 : X(Ω) → ℝ and g_2 : X(Ω) → ℝ are piecewise continuous functions such that g_1(X) and g_2(X) are integrable, then we have :

E[c_1 g_1(X) + c_2 g_2(X)] = c_1 E[g_1(X)] + c_2 E[g_2(X)]     (10)

Proof. Do it by yourself!

Example. Using results from the previous example,
1. What is E[4X²]?
2. What is E[3X + 2X²]?

4 Variance

Objectives of this section.
- To get a general understanding of the variance of a random variable.
- To learn a formal definition of the variance and standard deviation of a discrete random variable.
- To learn and be able to apply a shortcut formula for the variance of a discrete random variable.
- To be able to calculate the variance of a linear function of a discrete random variable.

Motivating example. Consider two discrete random variables X and Y with respective probability mass functions p_X and p_Y, given in tabular form :

x   p_X(x)        x   p_Y(x)

Show that the mean of X and the mean of Y are the same. Draw bar graphs corresponding to the two p.m.f.s and observe the variability of the two distributions.

Definition 4.1 (Variance, Standard Deviation). Let X be a real-valued random variable. When E[X²] exists, the variance of X is defined as follows :

Var(X) = E[(X − E[X])²]     (11)

Var(X) is sometimes denoted σ_X². The positive square root of the variance is called the standard deviation of X and is denoted σ_X. That is:

σ_X = √Var(X)     (12)

Let us return to the previous example. What is the variance and standard deviation of X? How does it compare to the variance and standard deviation of Y? As you can see, the expected variation in the random variable Y, as quantified by its variance and standard deviation, is much larger than the expected variation in the random variable X. Given the p.m.f.s of the two random variables, this result should not be surprising.

Property 4.1. The variance of a real-valued random variable X satisfies the following properties :

- Var(X) ≥ 0
- If a, b ∈ ℝ are two constants, then Var(aX + b) = a² Var(X)

Proof. The nonnegativity of the variance comes from the fact that the function g defined by g(x) = (x − E[X])² is nonnegative. Let a, b ∈ ℝ be two constants; then we have :

Var(aX + b) = E[(aX + b − E[aX + b])²]
= E[(aX + b − (aE[X] + b))²]
= E[(aX − aE[X])²]
= E[(a(X − E[X]))²]
= E[a²(X − E[X])²]
= a² E[(X − E[X])²]
= a² Var(X)

The formula for the variance of a discrete random variable can be quite cumbersome to use. There is a slightly easier-to-work-with alternative formula.

Theorem 4.1 (König-Huygens formula). Let X be a real-valued random variable. When E[X²] exists, the variance of X is also given by :

Var(X) = E[X²] − (E[X])²     (13)

Proof. We have the following :

Var(X) = E[(X − E[X])²]
= E[X² − 2XE[X] + (E[X])²]
= E[X²] − E[2XE[X]] + E[(E[X])²]
= E[X²] − 2E[X]E[X] + (E[X])²
= E[X²] − 2(E[X])² + (E[X])²
= E[X²] − (E[X])²

Example. Use the alternative formula to verify that the variance of the random variable X is 0.6, as we calculated earlier.

Example. The mean temperature in Victoria, B.C. is 50 degrees Fahrenheit with standard deviation 8 degrees Fahrenheit. What is the mean temperature in degrees Celsius? What is the standard deviation in degrees Celsius?
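The two variance formulas (11) and (13), together with the rule Var(aX + b) = a² Var(X), can be cross-checked on a small p.m.f. The values below are purely illustrative, not the ones from the motivating example:

```python
from fractions import Fraction

# A small illustrative p.m.f. (hypothetical values, not from the notes).
pmf = {0: Fraction(1, 4), 1: Fraction(1, 2), 2: Fraction(1, 4)}

E  = sum(x * p for x, p in pmf.items())        # E[X]
E2 = sum(x**2 * p for x, p in pmf.items())     # E[X^2]

var_def      = sum((x - E)**2 * p for x, p in pmf.items())  # definition (11)
var_shortcut = E2 - E**2                                    # formula (13)

# Var(aX + b) = a^2 Var(X): check with a = 3, b = 7
a, b = 3, 7
pmf_lin = {a * x + b: p for x, p in pmf.items()}
E_lin = sum(y * p for y, p in pmf_lin.items())
var_lin = sum((y - E_lin)**2 * p for y, p in pmf_lin.items())
```

Exact `Fraction` arithmetic makes the two formulas agree identically, with no floating-point slack.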

Recall that the conversion from Fahrenheit (F) to Celsius (C) is: C = (5/9)(F − 32).

5 Common Discrete Distributions

Objectives of this section.
- To discover popular distributions of discrete real-valued random variables.
- To understand the derivation of the formulas for probability mass functions.
- To verify that the derived p.m.f.s are valid p.m.f.s.
- To learn in which situations these distributions apply.
- To derive formulas for the mean and variance of the different distributions.

5.1 Discrete Uniform Distribution

Example. I toss a fair, six-sided die. What is the probability that the die lands on a specific side?

Definition 5.1 (Discrete Uniform Distribution). Let (Ω, A, P) be a probability space and let X be a random variable that can take n ∈ ℕ values on X(Ω) = {1, ..., n}. X is said to have a discrete uniform distribution U_n if its probability mass function is given by :

p_X(i) = P(X = i) = 1/n, for i = 1, ..., n     (14)

Proof. Let us prove that the p.m.f. of a discrete uniform distribution is actually a valid p.m.f. :
- p_X(i) = 1/n ≥ 0, for i = 1, ..., n ;
- Does the p.m.f. sum to 1?

Σ_{i=1}^{n} p_X(i) = Σ_{i=1}^{n} 1/n = n · (1/n) = 1

Property 5.1 (Mean and Variance for a Discrete Uniform Distribution). If X follows a discrete uniform distribution U_n, then

- its expected value is given by :

E[X] = (n + 1)/2     (15)

- its variance is given by :

Var(X) = (n² − 1)/12     (16)

Proof. Expectation :

E[X] = Σ_{i=1}^{n} i p_X(i) = (1/n) Σ_{i=1}^{n} i = (1/n) · n(n + 1)/2 = (n + 1)/2

Variance : Var(X) = E[X²] − (E[X])², where E[X²] is given by :

E[X²] = Σ_{i=1}^{n} i² p_X(i) = (1/n) Σ_{i=1}^{n} i² = (1/n) · n(n + 1)(2n + 1)/6 = (n + 1)(2n + 1)/6

Therefore, Var(X) = (n + 1)(2n + 1)/6 − (n + 1)²/4 = (n² − 1)/12.

5.2 Bernoulli distribution

Example. Remember the example provided at the beginning of the chapter. Select three fans randomly among 10,000 people that attend a tennis match between Roger Federer and Novak Djokovic. A Federer fan is denoted by F and a Djokovic fan by D. Assume that p = 80% of the audience attending the game are Federer fans. Let X be the random variable defined as the number of Federer fans. We previously calculated the following probabilities :

P(X = 0) = P({DDD}) = 0.2³ = 0.008
P(X = 1) = P({DDF}) + P({DFD}) + P({FDD}) = 3 × (0.8 × 0.2²) = 0.096
P(X = 2) = P({FFD}) + P({FDF}) + P({DFF}) = 3 × (0.8² × 0.2) = 0.384
P(X = 3) = P({FFF}) = 0.8³ = 0.512

Questions. Which assumptions did we make to compute those probabilities? What pattern can we identify in the calculations?
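Formulas (15) and (16) are easy to confirm against the defining sums; a short check over several values of n:

```python
from fractions import Fraction

# Check E[X] = (n+1)/2 and Var(X) = (n^2 - 1)/12 for U_n by direct summation.
for n in range(1, 20):
    p = Fraction(1, n)                                    # p_X(i) = 1/n
    mean = sum(i * p for i in range(1, n + 1))            # E[X]
    var = sum((i - mean)**2 * p for i in range(1, n + 1)) # Var(X), definition (11)
    assert mean == Fraction(n + 1, 2)
    assert var == Fraction(n**2 - 1, 12)
```

For the fair six-sided die (n = 6) this gives E[X] = 7/2 and Var(X) = 35/12.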

Definition 5.2 (Bernoulli process). A Bernoulli or binomial process has the following features :
1. We repeat n ∈ ℕ identical trials.
2. A trial can result in only two possible outcomes, that is, a certain event E, called success, occurs with probability p, and the event E^c, called failure, occurs with probability 1 − p.
3. The probability of success p remains constant trial after trial. In this case, the process is said to be stationary.
4. The trials are mutually independent.

Are the following experiments Bernoulli processes?
- A coin is weighted in such a way that there is a 70% chance of getting a head on any particular toss. Toss the coin, in exactly the same way, 100 times.
- A fair coin is tossed until it lands on heads.
- An urn contains 5 white balls and 5 black balls. We draw 6 balls from the urn without replacement. We are interested in the number of black balls drawn.
- 8,000 Federer fans and 2,000 Djokovic fans attend a tennis match. Select three fans randomly. We are interested in the number of Federer fans selected.

Rigorously, the last example is not a Bernoulli process. However, when the sample size n is small in relation to the population size N, the approximation by a Bernoulli process is tolerated.

Definition 5.3 (Bernoulli Distribution). Let (Ω, A, P) be a probability space. Let E ∈ A be an event labeled as success, that occurs with probability p. If the random variable X is the indicator function of event E, that is X = 1 if E occurs and X = 0 if E does not occur, then X is said to have a Bernoulli distribution Ber(p) and its probability mass function is given by :

p_X(1) = P(X = 1) = p and p_X(0) = P(X = 0) = 1 − p     (17)

Proof. Let us prove that the p.m.f. of a Bernoulli distribution is actually a valid p.m.f. :
- p_X(1) = p ≥ 0 and p_X(0) = 1 − p ≥ 0 ;

- Does the p.m.f. sum to 1?

Σ_{i=0}^{1} p_X(i) = (1 − p) + p = 1

Property 5.2 (Mean and Variance for a Bernoulli Distribution). If X follows a Bernoulli distribution Ber(p), then
- its expected value is given by :

E[X] = p     (18)

- its variance is given by :

Var(X) = p(1 − p)     (19)

Proof. Expectation :

E[X] = Σ_{i=0}^{1} i p_X(i) = 0 × (1 − p) + 1 × p = p

Variance : Var(X) = E[X²] − (E[X])². Note that X² = X since X ∈ {0, 1}. Therefore E[X²] = E[X] = p and Var(X) = p − p² = p(1 − p).

5.3 Binomial Distribution

Definition 5.4 (Binomial Distribution). Let (Ω, A, P) be a probability space. Let E ∈ A be an event labeled as success, that occurs with probability p. If n ∈ ℕ trials are performed according to a Bernoulli process, then the random variable X defined as the number of successes among the n trials is said to have a binomial distribution Bin(n, p) and its probability mass function is given by :

p_X(x) = P(X = x) = C(n, x) p^x (1 − p)^(n−x), for x = 0, ..., n     (20)

where C(n, x) = n!/(x!(n − x)!) denotes the binomial coefficient.

Proof. Let us prove that the p.m.f. of a binomial distribution is actually a valid p.m.f. :
- p_X(x) = C(n, x) p^x (1 − p)^(n−x) ≥ 0 for x = 0, ..., n ;

- Does the p.m.f. sum to 1?

Σ_{x=0}^{n} p_X(x) = Σ_{x=0}^{n} C(n, x) p^x (1 − p)^(n−x) = (p + (1 − p))^n = 1

(Binomial theorem seen in Chapter 1.)

Property 5.3 (Mean and Variance for a Binomial Distribution). If X follows a binomial distribution Bin(n, p), then
- its expected value is given by :

E[X] = np     (21)

- its variance is given by :

Var(X) = np(1 − p)     (22)

Proof. Expectation :

E[X] = Σ_{x=0}^{n} x p_X(x)
= Σ_{x=0}^{n} x C(n, x) p^x (1 − p)^(n−x)
= Σ_{x=1}^{n} n!/((x − 1)!(n − x)!) p^x (1 − p)^(n−x)
= Σ_{k=0}^{n−1} n!/(k!(n − 1 − k)!) p^(k+1) (1 − p)^(n−1−k)     (setting k = x − 1)
= np Σ_{k=0}^{n−1} C(n − 1, k) p^k (1 − p)^(n−1−k)
= np

since we recognize inside the sum the p.m.f. of Bin(n − 1, p), which sums to 1.

Variance : Var(X) = E[X²] − (E[X])². We use the following trick : we subtract and add E[X]. We thus obtain : Var(X) = E[X²] − E[X] + E[X] − (E[X])² = E[X(X − 1)] + E[X] − (E[X])², where E[X(X − 1)] is given by :

E[X(X − 1)] = Σ_{x=0}^{n} x(x − 1) p_X(x)
= Σ_{x=0}^{n} x(x − 1) C(n, x) p^x (1 − p)^(n−x)
= Σ_{x=2}^{n} n!/((x − 2)!(n − x)!) p^x (1 − p)^(n−x)
= Σ_{k=0}^{n−2} n!/(k!(n − 2 − k)!) p^(k+2) (1 − p)^(n−2−k)     (setting k = x − 2)
= n(n − 1)p² Σ_{k=0}^{n−2} C(n − 2, k) p^k (1 − p)^(n−2−k)
= n(n − 1)p²

since we recognize inside the sum the p.m.f. of Bin(n − 2, p). Hence,

Var(X) = n(n − 1)p² + np − (np)² = np(1 − p)

Example. A student attends STAT394 three days a week. Assume that he oversleeps with probability p on any given class day.
- What is the probability that he misses one class in a week?
- What is the probability that he misses three classes in a month (12 classes)?
- What is the probability that he misses at least two classes in a month?
- What is the probability that he misses four classes in total in a given month if he already missed two classes in that same month?
- How many classes does the instructor of STAT394 expect that student to miss by the end of the quarter (30 classes)? What is the corresponding variance?

5.4 Geometric Distribution

Example. I draw a card from a standard deck of 52 cards. If the card I draw is not an ace, I put it back in the deck, shuffle the cards and draw a new card. I repeat the process until I get an ace. What is the probability that I draw an ace for the first time at the fourth trial?
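The student questions are direct applications of the binomial p.m.f. (20) and of formulas (21) and (22). A sketch, assuming for illustration an oversleep probability of 0.1 per class (the notes leave this value unspecified):

```python
from math import comb

def binom_pmf(x, n, p):
    """P(X = x) for X ~ Bin(n, p), equation (20)."""
    return comb(n, x) * p**x * (1 - p)**(n - x)

p = 0.1  # assumed oversleep probability (illustrative, not given in the notes)

# probability of missing exactly one class in a week (n = 3)
miss_one_week = binom_pmf(1, 3, p)

# probability of missing at least two classes in a month (n = 12)
at_least_two = 1 - binom_pmf(0, 12, p) - binom_pmf(1, 12, p)

# expected misses and variance over a quarter (n = 30)
mean_quarter = 30 * p            # np, formula (21)
var_quarter = 30 * p * (1 - p)   # np(1-p), formula (22)
```

With p = 0.1, for instance, the probability of missing exactly one class in a week is 3 × 0.1 × 0.9² = 0.243.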

Definition 5.5 (Geometric Distribution). Let (Ω, A, P) be a probability space. Let E ∈ A be an event labeled as success, that occurs with probability p. If all the assumptions of a Bernoulli process are satisfied, except that the number of trials is not preset, then the random variable X defined as the number of trials until the first success is said to have a geometric distribution G(p) and its probability mass function is given by :

p_X(x) = P(X = x) = (1 − p)^(x−1) p, for x = 1, 2, 3, ...     (23)

Proof. Let us prove that the p.m.f. of a geometric distribution is actually a valid p.m.f. :
- p_X(x) = (1 − p)^(x−1) p ≥ 0 for x = 1, 2, 3, ... ;
- Does the p.m.f. sum to 1?

Σ_{x=1}^∞ p_X(x) = Σ_{x=1}^∞ (1 − p)^(x−1) p = p Σ_{x=1}^∞ (1 − p)^(x−1) = p · 1/(1 − (1 − p)) = 1

Example. I draw a card from a standard deck of 52 cards. If the card I draw is not an ace, I put it back in the deck, shuffle the cards and draw a new card. I repeat the process until I get an ace. What is the probability that I have not drawn an ace yet after 6 trials?

Property 5.4 (Distribution function for a Geometric Distribution). If X follows a geometric distribution G(p), then the distribution function of X is given by :

F_X(x) = 1 − (1 − p)^x     (24)

Proof. For a given integer x ≥ 1, we have that :

F_X(x) = P(X ≤ x) = 1 − P(X > x) = 1 − P(X ≥ x + 1)

since X can only take on integer values.

Let us calculate P(X ≥ x + 1) :

P(X ≥ x + 1) = Σ_{k=x+1}^∞ p_X(k)
= Σ_{k=x+1}^∞ (1 − p)^(k−1) p
= p Σ_{k=x+1}^∞ (1 − p)^(k−1)
= p · (1 − p)^x / (1 − (1 − p))
= (1 − p)^x

Hence, the result holds.

Property 5.5 (Mean and Variance for a Geometric Distribution). If X follows a geometric distribution G(p), then
- its expected value is given by :

E[X] = 1/p     (25)

- its variance is given by :

Var(X) = (1 − p)/p²     (26)

Proof. Expectation :

E[X] = Σ_{x=1}^∞ x p_X(x) = Σ_{x=1}^∞ x(1 − p)^(x−1) p = p Σ_{x=1}^∞ x(1 − p)^(x−1)

Here, we notice that x(1 − p)^(x−1) is, up to sign, the derivative of (1 − p)^x

with respect to p : d/dp (1 − p)^x = −x(1 − p)^(x−1). Therefore, we have that :

E[X] = −p · d/dp Σ_{x=1}^∞ (1 − p)^x
= −p · d/dp ((1 − p)/p)
= −p · (−1/p²)
= 1/p

since Σ_{x=1}^∞ (1 − p)^x = (1 − p)/(1 − (1 − p)) = (1 − p)/p.

According to a previously used trick, the variance can be written as follows : Var(X) = E[X(X − 1)] + E[X] − (E[X])², where E[X(X − 1)] is given by :

E[X(X − 1)] = Σ_{x=1}^∞ x(x − 1) p_X(x)
= Σ_{x=1}^∞ x(x − 1)(1 − p)^(x−1) p
= p(1 − p) Σ_{x=1}^∞ x(x − 1)(1 − p)^(x−2)

Here, we notice that x(x − 1)(1 − p)^(x−2) is the second derivative of (1 − p)^x with respect to p. Therefore, we have that :

E[X(X − 1)] = p(1 − p) · d²/dp² Σ_{x=1}^∞ (1 − p)^x
= p(1 − p) · d²/dp² ((1 − p)/p)
= p(1 − p) · 2/p³
= 2(1 − p)/p²

Hence, Var(X) = 2(1 − p)/p² + 1/p − 1/p² = (1 − p)/p².
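Instead of the derivative trick, formulas (25) and (26) can also be verified numerically by truncating the series at a large cutoff:

```python
# Numerically check E[X] = 1/p and Var(X) = (1-p)/p^2 for X ~ G(p).
p = 0.25

def pmf(x):
    """Geometric p.m.f., equation (23)."""
    return (1 - p) ** (x - 1) * p

# Truncate the infinite sums: the tail beyond x = 2000 is negligible.
xs = range(1, 2000)
mean = sum(x * pmf(x) for x in xs)
second = sum(x * (x - 1) * pmf(x) for x in xs)
var = second + mean - mean**2   # Var = E[X(X-1)] + E[X] - (E[X])^2
```

For p = 0.25 this reproduces E[X] = 4 and Var(X) = 0.75/0.0625 = 12 to floating-point accuracy.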

Example. A representative from the National Football League's Marketing Division randomly selects people on a random street in Seattle until he finds a person who attended the last home football game. Let p denote the probability that he succeeds in finding such a person, and let X denote the number of people he selects until he finds his first success.
- What is the probability that the marketing representative must select 4 people before he finds one who attended the last home football game?
- What is the probability that the marketing representative must select more than 6 people before he finds one who attended the last home football game?
- How many people should we expect (in the long run) the marketing representative to select before he finds one who attended the last home football game? What is the corresponding variance?

5.5 Hypergeometric Distribution

Example. A wallet contains three $100 bills and five $1 bills. You randomly choose four bills. What is the probability that you will choose exactly two $100 bills?

Definition 5.6 (Hypergeometric Distribution). Let (Ω, A, P) be a probability space. Let E ∈ A be an event labeled as success. If the experiment consists in drawing a sample of n items, without replacement, from a finite population of size N that contains exactly m successes, then the random variable X defined as the number of successes among the n draws is said to have a hypergeometric distribution HG(N, n, m) and its probability mass function is given by :

p_X(x) = P(X = x) = C(m, x) C(N − m, n − x) / C(N, n), for x = 0, ..., min(n, m)     (27)

Remark: As mentioned previously, when the sample size n is small in relation to the population size N, a hypergeometric distribution HG(N, n, m) can be approximated by a binomial distribution Bin(n, m/N). Verify that approximation in the Federer/Djokovic example : 8,000 Federer fans and 2,000 Djokovic fans attend a tennis match. If three fans are randomly selected, compute the exact and approximate probability that two of them are Federer fans.

Property 5.6 (Mean and Variance for a Hypergeometric Distribution). If X follows a hypergeometric distribution HG(N, n, m), then
- its expected value is given by :

E[X] = n · m/N     (28)
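The Federer/Djokovic verification asked for above can be sketched as follows; with N = 10,000 and n = 3, the exact hypergeometric probability is very close to the binomial value 0.384 computed at the start of the chapter:

```python
from math import comb

def hypergeom_pmf(x, N, n, m):
    """P(X = x) for X ~ HG(N, n, m), equation (27)."""
    return comb(m, x) * comb(N - m, n - x) / comb(N, n)

# Federer/Djokovic: N = 10,000 fans, m = 8,000 Federer fans, select n = 3.
exact = hypergeom_pmf(2, 10_000, 3, 8_000)

# Bin(3, m/N) approximation: C(3, 2) * 0.8^2 * 0.2 = 0.384
approx = comb(3, 2) * 0.8**2 * 0.2
```

The approximation is good precisely because n is tiny compared to N, so drawing without replacement barely changes the success probability from trial to trial.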

- its variance is given by :

Var(X) = ((N − n)/(N − 1)) · n · (m/N)(1 − m/N)     (29)

Example. In Hold'em Poker, players make the best hand they can by combining the two cards in their hand with the 5 cards (community cards) eventually turned up on the table. The deck has 52 cards and there are 13 of each suit (hearts, clubs, spades, diamonds). Assume a player has 2 clubs in the hand and there are 3 cards showing on the table, 2 of which are also clubs.
- What is the probability that neither of the next two cards turned is a club?
- What is the probability that one of the next two cards turned is a club?
- What is the probability that both of the next two cards turned are clubs?

5.6 Poisson Distribution

Let the discrete random variable X denote the number of times an event occurs in an interval of time or space. Then X may be a Poisson random variable with x ∈ ℕ. Here is a list of examples of random variables that might obey the Poisson probability law :
- the number of typos on a printed page. (This is an example of an interval of space, the space being the printed page.)
- the number of cars passing through the intersection of 8th Avenue NE and NE 50th St in one minute. (This is an example of an interval of time, the time being one minute.)
- the number of customers at an ATM in 10-minute intervals.
- the number of students arriving during office hours.

Definition 5.7 (Approximate Poisson process). Let X denote the number of events in a given continuous interval. Then X follows an approximate Poisson process with parameter λ > 0 if :
(1) The numbers of events occurring in non-overlapping intervals are independent.
(2) The probability of exactly one event in a short interval of length h = 1/n is approximately λh = λ/n.
(3) The probability of exactly two or more events in a short interval is essentially zero.

From the approximate Poisson process to the Poisson distribution. Let X denote the number of events in a given continuous interval. Assume that X follows an approximate Poisson process with parameter λ > 0. Properties (2) and (3) imply that X obeys a binomial law where the number of trials n corresponds to the number of short intervals in the given continuous interval and the probability of success is p = λ/n, that is, the probability of exactly one event in a short interval. Therefore the p.m.f. of X is given by :

P(X = x) = C(n, x) (λ/n)^x (1 − λ/n)^(n−x)

Now, let us see how that p.m.f. behaves when n tends towards infinity, that is, when the short interval gets smaller and smaller. To this end, let us rewrite the terms in the p.m.f. :

P(X = x) = n!/(x!(n − x)!) · (λ^x / n^x) · (1 − λ/n)^n (1 − λ/n)^(−x)
= (λ^x / x!) · (n!/((n − x)! n^x)) · (1 − λ/n)^n (1 − λ/n)^(−x)
= (λ^x / x!) · (n/n) · ((n − 1)/n) · ... · ((n − x + 1)/n) · (1 − λ/n)^n (1 − λ/n)^(−x)

As n → ∞ :
- λ^x / x! is a constant with respect to n ;
- (n/n) · ((n − 1)/n) · ... · ((n − x + 1)/n) → 1 ;
- (1 − λ/n)^(−x) → (1 − 0)^(−x) = 1 ;
- according to a classic result of calculus, (1 − λ/n)^n → e^(−λ).

Hence, we obtain the following result :

lim_{n→∞} P(X = x) = e^(−λ) λ^x / x!

Definition 5.8 (Poisson Distribution). Let (Ω, A, P) be a probability space. A random variable X is said to have a Poisson distribution P(λ), with λ > 0, if its probability mass function is given by :

p_X(x) = P(X = x) = e^(−λ) λ^x / x!, for x ∈ ℕ     (30)

Proof. Let us prove that the p.m.f. of a Poisson distribution is actually a valid p.m.f. :
- p_X(x) = e^(−λ) λ^x / x! ≥ 0 for x ∈ ℕ ;

Does the p.m.f. sum to 1?

Σ_{x∈N} p_X(x) = Σ_{x=0}^∞ e^{−λ} λ^x / x! = e^{−λ} Σ_{x=0}^∞ λ^x / x! = e^{−λ} e^λ = 1,

where we recognize the exponential series.

Property 5.7 (Mean and Variance for a Poisson Distribution). If X follows a Poisson distribution P(λ), then

its expected value is given by: E[X] = λ    (31)
its variance is given by: Var(X) = λ    (32)

Proof. Expectation:

E[X] = Σ_{x=0}^∞ x p_X(x) = Σ_{x=1}^∞ x e^{−λ} λ^x / x!
     = e^{−λ} Σ_{x=1}^∞ λ^x / (x − 1)!
     = e^{−λ} Σ_{k=0}^∞ λ^{k+1} / k!
     = λ e^{−λ} Σ_{k=0}^∞ λ^k / k!
     = λ e^{−λ} e^λ = λ

According to a previously used trick, the variance can be written as follows:

Var(X) = E[X(X − 1)] + E[X] − (E[X])^2,

where E[X(X − 1)] is given by:
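Both facts, the p.m.f. summing to 1 and E[X] = λ, can be sanity-checked numerically by truncating the series, since the terms e^{−λ} λ^x/x! decay factorially fast. A small sketch (λ = 4 is an arbitrary choice):

```python
from math import exp, factorial

lam = 4.0
# Truncate the series at x = 99; the remaining tail is negligible for small lam.
pmf = [exp(-lam) * lam**x / factorial(x) for x in range(100)]

total = sum(pmf)                               # should be ~1
mean = sum(x * p for x, p in enumerate(pmf))   # should be ~lam
print(f"sum of p.m.f. = {total:.12f}")
print(f"E[X]          = {mean:.12f}")
```

Truncation is only needed because a computer cannot sum infinitely many terms; the proofs above show the untruncated sums are exactly 1 and λ.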

E[X(X − 1)] = Σ_{x=0}^∞ x(x − 1) p_X(x) = Σ_{x=2}^∞ x(x − 1) e^{−λ} λ^x / x!
            = e^{−λ} Σ_{x=2}^∞ λ^x / (x − 2)!
            = e^{−λ} Σ_{k=0}^∞ λ^{k+2} / k!
            = λ^2 e^{−λ} Σ_{k=0}^∞ λ^k / k!
            = λ^2 e^{−λ} e^λ = λ^2

Hence, Var(X) = λ^2 + λ − λ^2 = λ.

Example. Assume that a professor expects to be asked for an average of 3 recommendations per quarter.
(1) What is the probability that 3 students ask for a recommendation in a given quarter?
(2) What is the probability that the professor is asked for at least 2 recommendations in 2 quarters?

Example. Five percent of the Christmas tree light bulbs manufactured by a company are defective. The company's Quality Control Manager is quite concerned and therefore randomly samples 100 bulbs coming off of the assembly line. Let X denote the number of bulbs in the sample that are defective. What is the probability that the sample contains at most three defective bulbs?

Answer. X is a binomial random variable with parameters n = 100 (the sample size) and p = 0.05 (the probability that a bulb is defective). The desired probability is thus:

P(X ≤ 3) = Σ_{x=0}^3 p_X(x) = Σ_{x=0}^3 (100 choose x) 0.05^x 0.95^{100−x}
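The identities E[X(X − 1)] = λ² and Var(X) = λ can be checked numerically with the same truncated-series idea, and part (1) of the professor example falls out along the way (λ = 3; reading part (1) as P(X = 3) is our interpretation of the question):

```python
from math import exp, factorial

lam = 3.0
pmf = [exp(-lam) * lam**x / factorial(x) for x in range(100)]

e_x = sum(x * p for x, p in enumerate(pmf))
e_xx1 = sum(x * (x - 1) * p for x, p in enumerate(pmf))
var = e_xx1 + e_x - e_x**2            # the trick used in the proof
print(f"E[X(X-1)] = {e_xx1:.6f}")     # ~ lambda^2 = 9
print(f"Var(X)    = {var:.6f}")       # ~ lambda   = 3

# Professor example, part (1): P(X = 3) with lambda = 3
print(f"P(X = 3)  = {pmf[3]:.4f}")    # e^{-3} * 27 / 6 ~ 0.2240
```

For part (2), one standard reading is that counts over the two quarters add, giving a Poisson variable with λ = 6, so the answer would be 1 − e^{−6}(1 + 6).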

Many standard calculators would have trouble calculating that probability using the p.m.f. But if you recall the way that we derived the Poisson distribution, it seems reasonable to approximate the binomial distribution with the Poisson distribution whose parameter λ is the expected number of defective bulbs, that is λ = np = 100 × 0.05 = 5. This approximation holds as long as the number of trials n is large (and therefore p is small, since p = λ/n). The exact calculation gives:

P(X ≤ 3) ≈ 0.2578

Now, let us see how good the Poisson approximation is:

Σ_{x=0}^3 e^{−λ} λ^x / x! ≈ 0.2650,

which is not too bad of an approximation.

Property 5.8. Let X be a random variable that follows a binomial distribution Bin(n, p). Then, when n is large and p is small enough to make np moderate, X is approximately a Poisson random variable with parameter λ = np. In general, the above approximation works well if n ≥ 20 and p ≤ 0.05, or if n ≥ 100 and p ≤ 0.10.
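The bulb example can be reproduced directly (no calculator trouble in software): both the exact binomial value and its Poisson approximation are one-line sums.

```python
from math import comb, exp, factorial

n, p = 100, 0.05
lam = n * p  # expected number of defective bulbs = 5

exact = sum(comb(n, x) * p**x * (1 - p)**(n - x) for x in range(4))
approx = sum(exp(-lam) * lam**x / factorial(x) for x in range(4))
print(f"exact binomial P(X <= 3) = {exact:.4f}")   # ~ 0.2578
print(f"Poisson approximation    = {approx:.4f}")  # ~ 0.2650
```

The two values differ by less than 0.01 here, consistent with the rule of thumb in Property 5.8 (n = 100 and p = 0.05 satisfy both conditions).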


Chapter 4a Probability Models Chapter 4a Probability Models 4a.2 Probability models for a variable with a finite number of values 297 4a.1 Introduction Chapters 2 and 3 are concerned with data description (descriptive statistics) where

More information

MAS113 Introduction to Probability and Statistics. Proofs of theorems

MAS113 Introduction to Probability and Statistics. Proofs of theorems MAS113 Introduction to Probability and Statistics Proofs of theorems Theorem 1 De Morgan s Laws) See MAS110 Theorem 2 M1 By definition, B and A \ B are disjoint, and their union is A So, because m is a

More information

Probability Experiments, Trials, Outcomes, Sample Spaces Example 1 Example 2

Probability Experiments, Trials, Outcomes, Sample Spaces Example 1 Example 2 Probability Probability is the study of uncertain events or outcomes. Games of chance that involve rolling dice or dealing cards are one obvious area of application. However, probability models underlie

More information

Lecture 08: Poisson and More. Lisa Yan July 13, 2018

Lecture 08: Poisson and More. Lisa Yan July 13, 2018 Lecture 08: Poisson and More Lisa Yan July 13, 2018 Announcements PS1: Grades out later today Solutions out after class today PS2 due today PS3 out today (due next Friday 7/20) 2 Midterm announcement Tuesday,

More information

Basic Probability. Introduction

Basic Probability. Introduction Basic Probability Introduction The world is an uncertain place. Making predictions about something as seemingly mundane as tomorrow s weather, for example, is actually quite a difficult task. Even with

More information

Discrete Mathematics and Probability Theory Spring 2016 Rao and Walrand Note 14

Discrete Mathematics and Probability Theory Spring 2016 Rao and Walrand Note 14 CS 70 Discrete Mathematics and Probability Theory Spring 2016 Rao and Walrand Note 14 Introduction One of the key properties of coin flips is independence: if you flip a fair coin ten times and get ten

More information

Introduction to Probability

Introduction to Probability Introduction to Probability Salvatore Pace September 2, 208 Introduction In a frequentist interpretation of probability, a probability measure P (A) says that if I do something N times, I should see event

More information

Lecture 4: Probability and Discrete Random Variables

Lecture 4: Probability and Discrete Random Variables Error Correcting Codes: Combinatorics, Algorithms and Applications (Fall 2007) Lecture 4: Probability and Discrete Random Variables Wednesday, January 21, 2009 Lecturer: Atri Rudra Scribe: Anonymous 1

More information