Notes slides from before lecture. CSE 21, Winter 2017, Section A00. Lecture 16 Notes. Class URL: http://vlsicad.ucsd.edu/courses/cse21-w17/


1 Notes slides from before lecture CSE 21, Winter 2017, Section A00 Lecture 16 Notes Class URL: http://vlsicad.ucsd.edu/courses/cse21-w17/

2 Notes slides from before lecture Notes March 8 (1) This week: Days 21, 22, 23 of posted slides (probability distributions, conditional probability, expected value, Bayes theorem). HW7 is due tonight. HW8 will be due next Wednesday. The MT2 3-day (= 72-hour) window for regrade requests ends tomorrow; please ask questions about MT2 and grading in OHs. Discussion section this week will spend some time going over MT2. Poll for Final exam reviews: 3 top choices = F 5-7, Sa 1-3, Su 1-3; second round of polling to choose between F 5-7 and Su 1-3; please vote! Any logistic or other issues?

3 Expected Value Examples Rosen p. 460, 478 The expectation (average, expected value) of random variable X on sample space S is E(X) = Σ_{s in S} P(s) X(s). Calculate the expected sum of two 6-sided dice. A. 6 B. 7 C. 8 D. 9 E. None of the above.
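For reference, working this example from the definition: summing X(s) P(s) over the 36 equally likely outcomes gives 7 (each die alone has expectation (1 + 2 + 3 + 4 + 5 + 6)/6 = 3.5, and the expected sum works out to 3.5 + 3.5 = 7).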

4 Properties of Expectation Rosen p. 460, 478 E(X) may not be a value that X can actually take. But m <= E(X) <= M, where m is the minimum possible value of X and M is the maximum possible value of X.

5 Useful trick 1: Case analysis Rosen p. 460, 478 Conditional Expectation: the expectation can be computed by conditioning on an event and its complement. Theorem: For any random variable X and event A, E(X) = P(A) E(X | A) + P(A^c) E(X | A^c), where A^c is the complement of A.

6 Useful trick 1: Case analysis Example: If X is the number of pairs of consecutive Hs when we flip a fair coin three times, what is the expectation of X? e.g., X(HHT) = 1, X(HHH) = 2.

7 Useful trick 1: Case analysis Example: If X is the number of pairs of consecutive Hs when we flip a fair coin three times, what is the expectation of X? Solution: Directly from definition. For each of the eight possible outcomes, find the probability and the value of X: HHH (P(HHH) = 1/8, X(HHH) = 2), HHT, HTH, HTT, THH, THT, TTH, TTT, etc.
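Carrying that direct computation through: X equals 2 on HHH, 1 on HHT and THH, and 0 on the other five outcomes, so E(X) = (1/8)(2) + (2/8)(1) + (5/8)(0) = 4/8 = 1/2.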

8 Useful trick 1: Case analysis Example: If X is the number of pairs of consecutive Hs when we flip a fair coin three times, what is the expectation of X? Solution: Using conditional expectation Let A be the event "The middle flip is H". Which subset of S is A? A. { HHH } B. { THT } C. { HHT, THH} D. { HHH, HHT, THH, THT} E. None of the above.

9 Useful trick 1: Case analysis Example: If X is the number of pairs of consecutive Hs when we flip a fair coin three times, what is the expectation of X? Solution: Using conditional expectation. Let A be the event "The middle flip is H". E(X) = P(A) E(X | A) + P(A^c) E(X | A^c)

10 Useful trick 1: Case analysis Example: If X is the number of pairs of consecutive Hs when we flip a fair coin three times, what is the expectation of X? Solution: Using conditional expectation. Let A be the event "The middle flip is H". P(A) = 1/2, P(A^c) = 1/2. E(X) = P(A) E(X | A) + P(A^c) E(X | A^c)

11 Useful trick 1: Case analysis Example: If X is the number of pairs of consecutive Hs when we flip a fair coin three times, what is the expectation of X? Solution: Using conditional expectation. Let A be the event "The middle flip is H". P(A) = 1/2, P(A^c) = 1/2. E(X) = P(A) E(X | A) + P(A^c) E(X | A^c). E(X | A^c): If the middle flip isn't H, there can't be any pairs of consecutive Hs.

12 Useful trick 1: Case analysis Example: If X is the number of pairs of consecutive Hs when we flip a fair coin three times, what is the expectation of X? Solution: Using conditional expectation. Let A be the event "The middle flip is H". P(A) = 1/2, P(A^c) = 1/2. E(X) = P(A) E(X | A) + P(A^c) E(X | A^c). E(X | A^c): If the middle flip isn't H, there can't be any pairs of consecutive Hs. E(X | A): If the middle flip is H, # pairs of consecutive Hs = # Hs in first & last flips.

13 Useful trick 1: Case analysis Example: If X is the number of pairs of consecutive Hs when we flip a fair coin three times, what is the expectation of X? Solution: Using conditional expectation. Let A be the event "The middle flip is H". P(A) = 1/2, P(A^c) = 1/2. E(X | A^c) = 0. E(X | A) = ¼ · 0 + ½ · 1 + ¼ · 2 = 1. E(X) = P(A) E(X | A) + P(A^c) E(X | A^c)

14 Useful trick 1: Case analysis Example: If X is the number of pairs of consecutive Hs when we flip a fair coin three times, what is the expectation of X? Solution: Using conditional expectation. Let A be the event "The middle flip is H". P(A) = 1/2, P(A^c) = 1/2. E(X | A^c) = 0, E(X | A) = ¼ · 0 + ½ · 1 + ¼ · 2 = 1. E(X) = P(A) E(X | A) + P(A^c) E(X | A^c) = ½ (1) + ½ (0) = 1/2

15 Useful trick 1: Case analysis Examples: Ending condition Each time I play solitaire I have a probability p of winning. I play until I win a game. Each time a child is born, it has probability p of being left-handed. I keep having children until I have a left-handed one. Let X be the number of games [equivalently, #children] until ending condition is met. What's E(X)? A. 1. B. Some big number that depends on p. C. 1/p. D. None of the above.

16 Useful trick 1: Case analysis Ending condition Let X be the number of games / number of kids until the ending condition is met. Solution: Directly from definition. Need to compute the sum, over all possible i, of i · P(X = i).

17 Useful trick 1: Case analysis Ending condition Let X be the number of games / number of kids until the ending condition is met. Solution: Directly from definition. Need to compute the sum, over all possible i, of i · P(X = i). P(X = i) = probability that we don't stop the first i-1 times and do stop at the i-th time = (1-p)^{i-1} p

18 Useful trick 1: Case analysis Ending condition Let X be the number of games / number of kids until the ending condition is met. Solution: Directly from definition. Need to compute the sum, over all possible i, of i · P(X = i). P(X = i) = probability that we don't stop the first i-1 times and do stop at the i-th time = (1-p)^{i-1} p. Math 20B?
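One way to evaluate this series (the calculus fact alluded to by "Math 20B?"): for |x| < 1, Σ_{i≥1} i x^{i-1} = 1/(1-x)^2. Taking x = 1 - p gives E(X) = Σ_{i≥1} i (1-p)^{i-1} p = p · 1/p^2 = 1/p.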

19 Useful trick 1: Case analysis Ending condition Let X be the number of games / number of kids until the ending condition is met. Solution: Using conditional expectation. E(X) = P(A) E(X | A) + P(A^c) E(X | A^c)

20 Useful trick 1: Case analysis Ending condition Let X be the number of games / number of kids until the ending condition is met. Solution: Using conditional expectation. Let A be the event "success at first try". E(X) = P(A) E(X | A) + P(A^c) E(X | A^c)

21 Useful trick 1: Case analysis Ending condition Let X be the number of games / number of kids until the ending condition is met. Solution: Using conditional expectation. Let A be the event "success at first try". E(X) = P(A) E(X | A) + P(A^c) E(X | A^c). P(A) = p, P(A^c) = 1 - p.

22 Useful trick 1: Case analysis Ending condition Let X be the number of games / number of kids until the ending condition is met. Solution: Using conditional expectation. Let A be the event "success at first try". E(X) = P(A) E(X | A) + P(A^c) E(X | A^c). P(A) = p, P(A^c) = 1 - p. E(X | A) = 1, because we stop after the first try.

23 Useful trick 1: Case analysis Ending condition Let X be the number of games / number of kids until the ending condition is met. Solution: Using conditional expectation. Let A be the event "success at first try". E(X) = P(A) E(X | A) + P(A^c) E(X | A^c). P(A) = p, P(A^c) = 1 - p. E(X | A) = 1. E(X | A^c) = 1 + E(X), because we have used one try and are then back in the same situation as at the start.

24 Useful trick 1: Case analysis Ending condition Let X be the number of games / number of kids until the ending condition is met. Solution: Using conditional expectation. Let A be the event "success at first try". E(X) = P(A) E(X | A) + P(A^c) E(X | A^c). P(A) = p, P(A^c) = 1 - p. E(X | A) = 1, E(X | A^c) = 1 + E(X). E(X) = p(1) + (1 - p)(1 + E(X))

25 Useful trick 1: Case analysis Ending condition Let X be the number of games / number of kids until the ending condition is met. Solution: Using conditional expectation. Let A be the event "success at first try". E(X) = p(1) + (1 - p)(1 + E(X)). Solving for E(X): E(X) = 1 + (1 - p) E(X), so p E(X) = 1 and E(X) = 1/p.
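As a sanity check on E(X) = 1/p, here is a minimal simulation sketch (illustrative code, not from the slides; the function name and the value p = 0.25 are arbitrary choices):

import random

def average_tries(p, trials=100000):
    # Estimate E(X), where X = number of attempts until the first success
    # and each attempt succeeds independently with probability p.
    total = 0
    for _ in range(trials):
        tries = 1
        while random.random() >= p:  # this attempt failed (probability 1 - p)
            tries += 1
        total += tries
    return total / trials

print(average_tries(0.25))  # should print a value close to 1/p = 4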

26 Useful trick 2: Linearity of expectation Theorem: If X_i are random variables on S and if a and b are real numbers, then E(X_1 + ... + X_n) = E(X_1) + ... + E(X_n) and E(aX + b) = aE(X) + b. Rosen p. The expectation of the sum is the sum of the expectations. The random variables X_i can be dependent on each other or otherwise interrelated; the Theorem still holds!

27 Useful trick 2: Linearity of expectation Example: Expected number of pairs of consecutive Hs when we flip a fair coin n times? A. 1. B. 2. C. n. D. None of the above. E. ??

28 Useful trick 2: Linearity of expectation Example: Expected number of pairs of consecutive Hs when we flip a fair coin n times? Solution: Define X_i = 1 if both the i-th and (i+1)-st flips are H; X_i = 0 otherwise. Looking for E(X) where X = X_1 + X_2 + ... + X_{n-1}. For each i, what is E(X_i)? A. 0. B. ¼. C. ½. D. 1. E. It depends on the value of i.

29 Useful trick 2: Linearity of expectation Example: Expected number of pairs of consecutive Hs when we flip a fair coin n times? Solution: Define X_i = 1 if both the i-th and (i+1)-st flips are H; X_i = 0 otherwise. Looking for E(X) where X = X_1 + X_2 + ... + X_{n-1}.

30 Useful trick 2: Linearity of expectation Example: Expected number of pairs of consecutive Hs when we flip a fair coin n times? Solution: Define X_i = 1 if both the i-th and (i+1)-st flips are H; X_i = 0 otherwise. Looking for E(X) where X = X_1 + X_2 + ... + X_{n-1}. Indicator variables: 1 if pattern occurs, 0 otherwise.
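Completing the calculation: E(X_i) = P(flips i and i+1 are both H) = (1/2)(1/2) = 1/4, and there are n - 1 indicator variables X_1, ..., X_{n-1}, so by linearity E(X) = (n - 1)/4.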

31 Example: Thank-You Notes A child has just had a birthday party. He has written 20 thank-you notes and addressed 20 corresponding envelopes. But then he gets distracted, and puts the letters randomly into the envelopes (one letter per envelope). What is the expected number of letters that are in their correct envelopes? Define r.v. X = # letters in correct envelopes. We want E(X) = Σ_{i=0..20} i · P(X = i). How about: r.v.'s X_j = 1 if letter j is in envelope j, and X_j = 0 otherwise? We want E( Σ_{j=1..20} X_j ).
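Finishing the indicator-variable approach: each E(X_j) = P(letter j lands in envelope j) = 1/20, so by linearity E(X) = Σ_{j=1..20} E(X_j) = 20 · (1/20) = 1, even though the X_j are not independent.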

32 Example: Baseball Cards A bubble gum company puts one random major league baseball player's card into every pack of its gum. There are N major leaguers in the league this year. If X is the number of packs of gum that a collector must buy until she has at least one of each player's card, what is E(X)?

33 Other functions? Rosen p. 460, 478 Expectation does not in general commute with other functions: E(f(X)) ≠ f(E(X)). For example, let X be a random variable with P(X = 0) = ½, P(X = 1) = ½. What's E(X)? What's E(X^2)? What's (E(X))^2?

34 Other functions? Rosen p. 460, 478 Expectation does not in general commute with other functions: E(f(X)) ≠ f(E(X)). For example, let X be a random variable with P(X = 0) = ½, P(X = 1) = ½. What's E(X)? What's E(X^2)? What's (E(X))^2? Answers: E(X) = (½)(0) + (½)(1) = ½; E(X^2) = (½)(0^2) + (½)(1^2) = ½; (E(X))^2 = (½)^2 = ¼.

35 One More Example: Hashing Definition: A hash function maps data of variable length to data of a fixed length. The values returned by a hash function are called hash values or simply hashes. Open Hashing: Each item hashes to a particular location in an array, and locations can hold more than one item. Multiple input data values can have the same hash value. These are called collisions!

36 Number of Items Per Location Suppose: n items, hash to table of size k. What is expected number of items per location? n independent trials with outcome = location in table (i.e., hash value). Let r.v. X = # inputs that hash to location 1. Let's write X as the sum of X_i's: X_i = 1 if the i-th input hashes to location 1, X_i = 0 otherwise. X = X_1 + X_2 + ... + X_n. E(X) = E(X_1) + E(X_2) + ... + E(X_n) = n · (1/k) = n/k
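A small simulation sketch of this calculation (illustrative code, not from the slides; the values n = 20 and k = 5 are arbitrary, and uniform hashing is assumed):

import random

def avg_items_at_location_one(n, k, trials=10000):
    # Empirically estimate the expected number of the n items that
    # hash to one fixed location of a size-k table.
    total = 0
    for _ in range(trials):
        total += sum(1 for _ in range(n) if random.randrange(k) == 0)
    return total / trials

print(avg_items_at_location_one(20, 5))  # should print a value close to n/k = 4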

37 Number of Empty Locations What is expected number of empty locations? Define r.v. Y = # empty locations. We want E(Y). Example: 20 items into table of size 5: E(Y) = ? Example: 20 items into table of size 10: E(Y) = ? Apply linearity of expectation! (worked out at the end of the slide deck)

38 Independence, Concentration, Bayes Theorem CSE21 Winter 2017, Day 23 (B00), Day 15 (A00) March 10, 2017 http://vlsicad.ucsd.edu/courses/cse21-w17

39 Independent Events Rosen p. 457 Two events E and F are independent iff P(E ∩ F) = P(E) P(F). Problem: Suppose E is the event that a randomly generated bitstring of length 4 starts with a 1, and F is the event that this bitstring contains an even number of 1s. Are E and F independent if all bitstrings of length 4 are equally likely? Are they disjoint? First impressions? A. E and F are independent and disjoint. B. E and F are independent but not disjoint. C. E and F are disjoint but not independent. D. E and F are neither disjoint nor independent.

40 Independent Events Rosen p. 457 Two events E and F are independent iff P(E ∩ F) = P(E) P(F). Problem: Suppose E is the event that a randomly generated bitstring of length 4 starts with a 1, and F is the event that this bitstring contains an even number of 1s. Are E and F independent if all bitstrings of length 4 are equally likely? Are they disjoint?

41 Independent Random Variables Rosen p. 485 Let X and Y be random variables over the same sample space. X and Y are called independent random variables if, for all possible values of v and u, P(X = v and Y = u) = P(X = v) P(Y = u). Which of the following pairs of random variables on the sample space of sequences of H/T when a coin is flipped four times are independent? A. X_12 = # of H in first two flips, X_34 = # of H in last two flips. B. X = # of H in the sequence, Y = # of T in the sequence. C. X_12 = # of H in first two flips, X = # of H in the sequence. D. None of the above.

42 Independence Rosen p. 486 Theorem: If X and Y are independent random variables over the same sample space, then E(XY) = E(X) E(Y). Note: This is not necessarily true if the random variables are not independent!

43 Concentration - Definitions Rosen Section 7.4 How close (on average) will we be to the average / expected value? Let X be a random variable with E(X) = E. The unexpectedness of X is the random variable U = |X - E|. The average unexpectedness of X is AU(X) = E(|X - E|) = E(U). The variance of X is V(X) = E(|X - E|^2) = E(U^2). The standard deviation of X is σ(X) = ( E(|X - E|^2) )^{1/2} = V(X)^{1/2}.

44 Concentration How close (on average) will we be to the average / expected value? Let X be a random variable with E(X) = E. The unexpectedness of X is the random variable U = |X - E|. The average unexpectedness of X is AU(X) = E(|X - E|) = E(U); this weights all differences from the mean equally. The variance of X is V(X) = E(|X - E|^2) = E(U^2); this weights large differences from the mean more. The standard deviation of X is σ(X) = ( E(|X - E|^2) )^{1/2} = V(X)^{1/2}.

45 Concentration How close (on average) will we be to the average / expected value? Let X be a random variable with E(X) = E. The unexpectedness of X is the random variable U = |X - E|. The average unexpectedness of X is AU(X) = E(|X - E|) = E(U). The variance of X is V(X) = E(|X - E|^2) = E(U^2). The standard deviation of X is σ(X) = ( E(|X - E|^2) )^{1/2} = V(X)^{1/2}. Does the average unexpectedness equal the standard deviation? A. Yes B. No C. ??

46 Concentration How close (on average) will we be to the average / expected value? Let X be a random variable with E(X) = E. The variance of X is V(X) = E(|X - E|^2) = E(U^2). Example: X_1 is a random variable with distribution P(X_1 = -2) = 1/5, P(X_1 = -1) = 1/5, P(X_1 = 0) = 1/5, P(X_1 = 1) = 1/5, P(X_1 = 2) = 1/5. X_2 is a random variable with distribution P(X_2 = -2) = 1/2, P(X_2 = 2) = 1/2. Which is true? A. E(X_1) ≠ E(X_2) B. V(X_1) < V(X_2) C. V(X_1) > V(X_2) D. V(X_1) = V(X_2)

47 Concentration How close (on average) will we be to the average / expected value? Let X be a random variable with E(X) = E. The variance of X is V(X) = E(|X - E|^2) = E(U^2). Theorem: V(X) = E(X^2) - (E(X))^2

48 Concentration How close (on average) will we be to the average / expected value? Let X be a random variable with E(X) = E. The variance of X is V(X) = E(|X - E|^2) = E(U^2). Theorem: V(X) = E(X^2) - (E(X))^2. Proof: V(X) = E((X - E)^2) = E(X^2 - 2XE + E^2) = E(X^2) - 2E·E(X) + E^2 [linearity of expectation] = E(X^2) - 2E^2 + E^2 = E(X^2) - E^2 = E(X^2) - (E(X))^2.

49 Concentration How close (on average) will we be to the average / expected value? Let X be a random variable with E(X) = E. The variance of X is V(X) = E(|X - E|^2) = E(U^2). Theorem: V(X) = E(X^2) - (E(X))^2. What does it mean for the variance of X to be 0? A. The value of the random variable X must always be 0. B. The value of the random variable must be constant. C. E(X^2) = E(X) D. E(X^2) = E(U^2) E. E(U) = 0

50 Recall: Conditional probabilities Rosen p. 456 Probability of an event may change if we have additional information about outcomes. Suppose E and F are events, and P(F) > 0. Then P(E | F) = P(E ∩ F) / P(F), i.e., P(E ∩ F) = P(E | F) P(F).

51 Conditional Probability Bayes Theorem Rosen p. 470, Theorem 1 Probability of E and F, i.e., P(E ∩ F): P(E ∩ F) = P(F | E) P(E) (probability of F given E, times probability of E). P(E ∩ F) = P(E | F) P(F) (symmetrical observation). Therefore P(F | E) P(E) = P(E | F) P(F), so P(F | E) = P(E | F) P(F) / P(E). Bayes Theorem.
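In the plug-in steps below, the denominator P(E) is expanded with the law of total probability, giving the form that is actually used: P(F | E) = P(E | F) P(F) / [ P(E | F) P(F) + P(E | F^c) P(F^c) ].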

52 Bayes' Theorem Rosen Section 7.3 Based on previous knowledge about how probabilities of two events relate to one another, how does knowing that one event occurred impact the probability that the other event occurred?

53 Bayes' Theorem: Example 1 Rosen Section 7.3 A manufacturer claims that its drug test will detect steroid use 95% of the time. What the company does not tell you is that 15% of all steroid-free individuals also test positive (the false positive rate). 10% of the Tour de France bike racers use steroids. Your favorite cyclist just tested positive. What's the probability that he used steroids? Your first guess? A. Close to 95% B. Close to 85% C. Close to 50% D. Close to 15% E. Close to 10%

54 Bayes' Theorem: Example 1 Rosen Section 7.3 A manufacturer claims that its drug test will detect steroid use 95% of the time. What the company does not tell you is that 15% of all steroid-free individuals also test positive (the false positive rate). 10% of the Tour de France bike racers use steroids. Your favorite cyclist just tested positive. What's the probability that he used steroids? Define events: we want P(used steroids | tested positive)

55 Bayes' Theorem: Example 1 Rosen Section 7.3 A manufacturer claims that its drug test will detect steroid use 95% of the time. What the company does not tell you is that 15% of all steroid-free individuals also test positive (the false positive rate). 10% of the Tour de France bike racers use steroids. Your favorite cyclist just tested positive. What's the probability that he used steroids? Define events: we want P(used steroids | tested positive), so let E = Tested positive, F = Used steroids

56 Bayes' Theorem: Example 1 Rosen Section 7.3 A manufacturer claims that its drug test will detect steroid use 95% of the time. What the company does not tell you is that 15% of all steroid-free individuals also test positive (the false positive rate). 10% of the Tour de France bike racers use steroids. Your favorite cyclist just tested positive. What's the probability that he used steroids? Define events: we want P(used steroids | tested positive). E = Tested positive, P(E | F) = 0.95. F = Used steroids

57 Bayes' Theorem: Example 1 Rosen Section 7.3 A manufacturer claims that its drug test will detect steroid use 95% of the time. What the company does not tell you is that 15% of all steroid-free individuals also test positive (the false positive rate). 10% of the Tour de France bike racers use steroids. Your favorite cyclist just tested positive. What's the probability that he used steroids? Define events: we want P(used steroids | tested positive). E = Tested positive, P(E | F) = 0.95. F = Used steroids, P(F) = 0.1, P(F^c) = 0.9

58 Bayes' Theorem: Example 1 Rosen Section 7.3 A manufacturer claims that its drug test will detect steroid use 95% of the time. What the company does not tell you is that 15% of all steroid-free individuals also test positive (the false positive rate). 10% of the Tour de France bike racers use steroids. Your favorite cyclist just tested positive. What's the probability that he used steroids? Define events: we want P(used steroids | tested positive). E = Tested positive, P(E | F) = 0.95, P(E | F^c) = 0.15. F = Used steroids, P(F) = 0.1, P(F^c) = 0.9

59 Bayes' Theorem: Example 1 Rosen Section 7.3 A manufacturer claims that its drug test will detect steroid use 95% of the time. What the company does not tell you is that 15% of all steroid-free individuals also test positive (the false positive rate). 10% of the Tour de France bike racers use steroids. Your favorite cyclist just tested positive. What's the probability that he used steroids? Define events: we want P(used steroids | tested positive). E = Tested positive, P(E | F) = 0.95, P(E | F^c) = 0.15. F = Used steroids, P(F) = 0.1, P(F^c) = 0.9. Plug in: ≈ 41%
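Writing out the plug-in step with the values above: P(F | E) = (0.95)(0.1) / [ (0.95)(0.1) + (0.15)(0.9) ] = 0.095 / 0.23 ≈ 0.413, i.e., about 41%.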

60 Bayes' Theorem: Example 2 Rosen Section 7.3 Suppose we have found that the word Rolex occurs in 250 of 2000 messages known to be spam and in 5 out of 1000 messages known not to be spam. Estimate the probability that an incoming message containing the word Rolex is spam, assuming that it is equally likely that an incoming message is spam or not spam. Your first guess? A. Close to 95% B. Close to 85% C. Close to 50% D. Close to 15% E. Close to 10%

61 Bayes' Theorem: Example 2 Rosen Section 7.3 Suppose we have found that the word Rolex occurs in 250 of 2000 messages known to be spam and in 5 out of 1000 messages known not to be spam. Estimate the probability that an incoming message containing the word Rolex is spam, assuming that it is equally likely that an incoming message is spam or not spam. We want: P(spam | contains "Rolex"). So define the events E = contains "Rolex", F = spam

62 Bayes' Theorem: Example 2 Rosen Section 7.3 Suppose we have found that the word Rolex occurs in 250 of 2000 messages known to be spam and in 5 out of 1000 messages known not to be spam. Estimate the probability that an incoming message containing the word Rolex is spam, assuming that it is equally likely that an incoming message is spam or not spam. We want: P(spam | contains "Rolex"). So define the events E = contains "Rolex", F = spam. What is P(E | F)? A. 0.125 B. 0.005 C. 0.5 D. Not enough info

63 Bayes' Theorem: Example 2 Rosen Section 7.3 Suppose we have found that the word Rolex occurs in 250 of 2000 messages known to be spam and in 5 out of 1000 messages known not to be spam. Estimate the probability that an incoming message containing the word Rolex is spam, assuming that it is equally likely that an incoming message is spam or not spam. We want: P(spam | contains "Rolex"). Training set: establish probabilities. E = contains "Rolex": P(E | F) = 250/2000 = 0.125, P(E | F^c) = 5/1000 = 0.005. F = spam

64 Bayes' Theorem: Example 2 Rosen Section 7.3 Suppose we have found that the word Rolex occurs in 250 of 2000 messages known to be spam and in 5 out of 1000 messages known not to be spam. Estimate the probability that an incoming message containing the word Rolex is spam, assuming that it is equally likely that an incoming message is spam or not spam. We want: P(spam | contains "Rolex"). E = contains "Rolex": P(E | F) = 250/2000 = 0.125, P(E | F^c) = 5/1000 = 0.005. F = spam: P(F) = P(F^c) = 0.5

65 Bayes' Theorem: Example 2 Rosen Section 7.3 Suppose we have found that the word Rolex occurs in 250 of 2000 messages known to be spam and in 5 out of 1000 messages known not to be spam. Estimate the probability that an incoming message containing the word Rolex is spam, assuming that it is equally likely that an incoming message is spam or not spam. We want: P(spam | contains "Rolex"). E = contains "Rolex": P(E | F) = 250/2000 = 0.125, P(E | F^c) = 5/1000 = 0.005. F = spam: P(F) = P(F^c) = 0.5. Plug in: ≈ 96%
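Writing out the plug-in step with the values above: P(F | E) = (0.125)(0.5) / [ (0.125)(0.5) + (0.005)(0.5) ] = 0.0625 / 0.065 ≈ 0.96, i.e., about 96%.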

66 Example 3: Market Research An electronics company commissions a marketing report for each new product that predicts the success or failure of the product. Of the new products that have been introduced by the company, 60% have been successes. Furthermore, 70% of the successful products were predicted to be successes, while 40% of the failed products were predicted to be successes. Find the probability that a new smart phone will be successful, given that success is predicted. Define: Event A = Event B = Exercise: Should get P(A | B) ≈ 0.724

67 Announcements Some extra slides follow. OHs and 1-1 sessions: plenty available! HW #8 due Wednesday, March 15, 11:59pm

68 Your list of important definitions/concepts = ? Sample space, probability space; Bernoulli trial; Outcome; Event; Conditional probability P(A | B); Random variable; Distribution function of r.v.; Binomial distribution; Expectation; Variance; Use of an indicator variable; Linearity of expectation (and its application); Independence; Bayes theorem / rule; ... (Advice: Have such a list.)

69 More: Independence, Conditional Probability Fact: P(B | S) = P(B). See Rosen p. 486. Conditioned on everything, we don't get any new knowledge. Fact: A and B are independent iff P(B | A) = P(B). Three slides ago: A and B are independent iff P(A ∩ B) = P(A) P(B). Also: P(A ∩ B) = P(B | A) P(A). Divide by P(A) to get the Fact. Independence means that conditioning on new information doesn't change probability.

70 Linearity of Expectation: Baseball Cards A bubble gum company puts one random major league baseball player's card into every pack of its gum. There are N major leaguers in the league this year. If X is the number of packs of gum that a collector must buy until she has at least one of each player's card, what is E(X)? Think about the card collecting in phases. Phase 1: collect the first unique card. Phase 2: collect the second unique card. ... Phase k-1: collect the (k-1)-st unique card. Phase k: collect the k-th unique card. ... In Phase k: P(next pack ends this phase) = (N - k + 1)/N; E(# packs to buy in this phase) = N / (N - k + 1).

71 Linearity of Expectation: Baseball Cards A bubble gum company puts one random major league baseball player's card into every pack of its gum. There are N major leaguers in the league this year. If X is the number of packs of gum that a collector must buy until she has at least one of each player's card, what is E(X)? Define r.v. X_k = # packs bought in Phase k. E(X) = E( Σ_{k=1..N} X_k ) = Σ_{k=1..N} E(X_k) [linearity of expectation!] = Σ_{k=1..N} N / (N - k + 1) = N (1/1 + 1/2 + 1/3 + 1/4 + ... + 1/N) = N H_N. H_N = the N-th harmonic number, so ~ N log N packs.

72 Market Research An electronics company commissions a marketing report for each new product that predicts the success or failure of the product. Of the new products that have been introduced by the company, 60% have been successes. Furthermore, 70% of the successful products were predicted to be successes, while 40% of the failed products were predicted to be successes. Find the probability that a new smart phone will be successful, given that success is predicted. Define: Event A = smartphone is a success. Event B = success is predicted. Exercise: Should get P(A | B) ≈ 0.724
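Working the exercise with the same plug-in form: P(A | B) = P(B | A) P(A) / [ P(B | A) P(A) + P(B | A^c) P(A^c) ] = (0.7)(0.6) / [ (0.7)(0.6) + (0.4)(0.4) ] = 0.42 / 0.58 ≈ 0.724.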

73 Number of Empty Locations What is expected number of empty locations? Define r.v. Y = # empty locations. We want E(Y). Example: 20 items into table of size 5: E(Y) = 5 (4/5)^20. Example: 20 items into table of size 10: E(Y) = 10 (9/10)^20. Understand why this is an application of linearity of expectation!
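The linearity argument behind these numbers: let Y_j = 1 if location j is empty and Y_j = 0 otherwise. Each of the n items misses location j with probability 1 - 1/k, so E(Y_j) = P(location j empty) = (1 - 1/k)^n, and E(Y) = Σ_{j=1..k} E(Y_j) = k (1 - 1/k)^n. Numerically, 5 (4/5)^20 ≈ 0.058 and 10 (9/10)^20 ≈ 1.22.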

74 Random/Review Question (see next slide) A student knows 80% of the material on an exam. What is the probability that she answers a given true-false question correctly? Knows material: answers T/F correctly with P = 1. Doesn't know material: answers T/F correctly with P = 0.5.

75 Random/Review Question A student knows 80% of the material on an exam. What is the probability that she answers a given true-false question correctly? Knows material: answers T/F correctly with P = 1. Doesn't know material: answers T/F correctly with P = 0.5. Material she knows (P = 0.8): Right 1.0, Wrong 0. Material she doesn't know (P = 0.2): Right 0.5, Wrong 0.5.
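Combining the two branches of the tree with the law of total probability: P(correct) = (0.8)(1.0) + (0.2)(0.5) = 0.8 + 0.1 = 0.9.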
