Chapter 1 sections (we will SKIP a number of sections):
- Set theory (SKIP real number uncountability)
- Definition of probability
- Finite sample spaces
- Counting methods
- Combinatorial methods (SKIP tennis tournament)
- Multinomial coefficients
- SKIP probability of unions (but understand Venn diagrams)
- SKIP matching problem
- SKIP statistical swindles
Chapter 2 sections (we will SKIP a number of sections):
- Conditional probability (SKIP game of craps)
- Independent events
- Bayes' theorem (SKIP posterior probability computation in more than one step)
- Conditionally independent events
- SKIP gambler's ruin
Axiomatic Foundations of Probability
The foundations of modern probability theory, its axiomatic basis, were laid by Kolmogorov in 1933. The axiomatic approach is not concerned with the interpretation of probabilities; it is concerned only that probabilities are defined by a function satisfying the axioms.

Definition: Probability Function
Given a sample space S, a probability function is a function P with domain B (a collection of subsets of S) that satisfies
1. P(A) ≥ 0 for all A ∈ B.
2. P(S) = 1.
3. If A_1, A_2, ... ∈ B are pairwise disjoint, then P(∪_{i=1}^∞ A_i) = Σ_{i=1}^∞ P(A_i).
Fair Coin
Fair coin: P(H) = P(T). We have P(H ∪ T) = 1, and H and T are disjoint, so P(H ∪ T) = P(H) + P(T). Hence P(H) = P(T) = 1/2.

Dart Board
Assume that the board is always hit (!) and that any point on the board is equally likely. Then
P(scoring i points) = (area of region i) / (area of dart board),
P(scoring i points) = ((6 − i)² − (5 − i)²) / 5².
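As a sanity check, the dart-board probabilities can be tabulated in a few lines of Python. The ring geometry in the comments is my reading of the formula: scoring region i lies between the circles of radius (5 − i) and (6 − i), on a board of radius 5.

```python
from fractions import Fraction

# Dart board: 5 concentric rings on a board of radius 5; region i
# (i = 1, ..., 5) lies between radii (5 - i) and (6 - i), so its area,
# relative to the whole board, is ((6 - i)^2 - (5 - i)^2) / 5^2.
def p_score(i):
    return Fraction((6 - i) ** 2 - (5 - i) ** 2, 5 ** 2)

probs = {i: p_score(i) for i in range(1, 6)}
print(probs)                 # P(1 point) = 9/25, ..., P(5 points) = 1/25
print(sum(probs.values()))   # the five regions partition the board: total is 1
```

The five probabilities summing to exactly 1 reflects axiom 3: the scoring regions are pairwise disjoint and cover the board.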
Theorem
If P is a probability function and A is any set in B:
- P(∅) = 0 (empty set)
- P(A) ≤ 1
- P(Aᶜ) = 1 − P(A)

Theorem
If P is a probability function and A and B are any sets in B:
- P(B ∩ Aᶜ) = P(B) − P(A ∩ B)
- P(A ∪ B) = P(A) + P(B) − P(A ∩ B)
- If A ⊂ B, then P(A) ≤ P(B)

Bonferroni's Inequality
P(A ∩ B) ≥ P(A) + P(B) − 1
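These identities are easy to sanity-check numerically. A sketch on the equally-likely two-dice sample space (the events A and B are my own examples, not from the slides):

```python
import itertools
from fractions import Fraction

# Equally likely outcomes of two dice: P(E) = |E| / 36.
space = set(itertools.product(range(1, 7), repeat=2))

def P(event):
    return Fraction(len(event), len(space))

A = {w for w in space if w[0] + w[1] == 7}   # the sum is 7
B = {w for w in space if w[0] % 2 == 0}      # the first die is even

assert P(space - A) == 1 - P(A)              # complement rule
assert P(B - A) == P(B) - P(A & B)           # P(B ∩ Aᶜ) = P(B) − P(A ∩ B)
assert P(A | B) == P(A) + P(B) - P(A & B)    # inclusion-exclusion
assert P(A & B) >= P(A) + P(B) - 1           # Bonferroni's inequality
```

Exact `Fraction` arithmetic makes the equalities hold exactly rather than up to floating-point error.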
Terminology
Finite vs. infinite: the number of people in this room is finite, but the amount of time it takes for everyone to love statistics may be infinite.
Discrete vs. continuous: the minute hand of an analog clock moves continuously, but the minutes on a digital clock move discretely.
Countable vs. uncountable: N and Q are countable; R is uncountable.

Definition: Partition
If A_1, A_2, ... are pairwise disjoint and ∪_{i=1}^∞ A_i = S, where S is the whole space, then the collection A_1, A_2, ... forms a partition of S.
Counting
How likely is it to win the lottery? What is the probability of winning a knock-out tournament with 16 participants?

Fundamental Theorem of Counting
If a job consists of k tasks, the i-th of which can be done in n_i ways, then the job can be done in n_1 · n_2 · ... · n_k ways.

Permutations and combinations
In how many ways can 5 people form a team of 3? This is a combination. What if they also have to decide who is president, vice president, and helper? This is a permutation.
Methods of counting
- Ordered, without replacement
- Ordered, with replacement
- Unordered, without replacement
- Unordered, with replacement

Definitions
n! = n · (n − 1) · (n − 2) · ... · 2 · 1
(n choose r) = n! / (r!(n − r)!)
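The four methods of counting can be checked by brute-force enumeration. A sketch for n = 5 people taken r = 3 at a time (the team-of-3 example above), comparing each enumerated count with its standard closed-form formula:

```python
import itertools
from math import comb, factorial

# Count arrangements of r = 3 items from n = 5 under each sampling scheme,
# and compare with the closed-form counts.
n, r = 5, 3
items = range(n)

counts = {
    "ordered, without replacement":   len(list(itertools.permutations(items, r))),
    "ordered, with replacement":      len(list(itertools.product(items, repeat=r))),
    "unordered, without replacement": len(list(itertools.combinations(items, r))),
    "unordered, with replacement":    len(list(itertools.combinations_with_replacement(items, r))),
}

assert counts["ordered, without replacement"] == factorial(n) // factorial(n - r)   # n!/(n-r)! = 60
assert counts["ordered, with replacement"] == n ** r                                # n^r = 125
assert counts["unordered, without replacement"] == comb(n, r)                       # C(n, r) = 10
assert counts["unordered, with replacement"] == comb(n + r - 1, r)                  # C(n+r-1, r) = 35
```

Note how this answers the slide's questions: the plain team of 3 is the unordered count C(5, 3) = 10, while assigning the three distinct roles is the ordered count 5!/2! = 60.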
Examples
You are in a room of 23 people. What are the odds that at least 2 of them have the same birthday? The probability that all birthdays are different is
(365/365) · (364/365) · ... · (343/365) ≈ 0.493,
so the probability that at least 2 are the same is 1 − P(all different) ≈ 0.507 > 0.50.

You are playing a hand of poker (5-card stud). What are the odds you are dealt 4 aces? All possible hands: (52 choose 5). Hands containing 4 aces: 48. The probability is 48 / (52 choose 5), less than 1 in 50,000.
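Both examples can be computed exactly; a short sketch:

```python
from fractions import Fraction
from math import comb, prod

# Birthday problem: exact probability that 23 birthdays are all different,
# i.e. the product (365/365)(364/365)...(343/365).
p_all_diff = prod(Fraction(365 - k, 365) for k in range(23))
p_match = 1 - p_all_diff
print(float(p_all_diff))   # ~0.4927
print(float(p_match))      # ~0.5073, so a match is (just) more likely than not

# Poker: 48 of the C(52, 5) equally likely 5-card hands contain all 4 aces
# (the fifth card is any of the remaining 48 cards).
p_four_aces = Fraction(48, comb(52, 5))
print(p_four_aces)         # 1/54145, indeed less than 1 in 50,000
```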
END OF CHAPTER 1
Conditional probability
Definition: For events A and B with P(B) > 0, the conditional probability of A given B is
P(A | B) = P(A ∩ B) / P(B).
Is this a probability function in the sense of Kolmogorov's axioms?

Poker example
4 cards are dealt. What is the probability you are dealt 4 aces given that you have already been dealt 1 ace? How about 2 or 3, or generally i aces? The answer is
P(4 aces | i aces so far) = 1 / (52 − i choose 4 − i).

Monty Hall problem (similar to the Three Prisoners problem).
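Taking the formula above at face value (my reading: the remaining 4 − i cards are drawn from the 52 − i cards left, and all of them must be aces), the conditional probabilities can be computed directly:

```python
from fractions import Fraction
from math import comb

# Given i aces already dealt, the remaining 4 - i cards come from the
# 52 - i cards left in the deck; exactly one of the C(52 - i, 4 - i)
# choices consists entirely of aces.
def p_four_aces_given(i):
    return Fraction(1, comb(52 - i, 4 - i))

for i in range(4):
    print(i, p_four_aces_given(i))
# i = 0 gives 1/270725; each known ace improves the odds dramatically:
# 1/20825, then 1/1225, then 1/49.
```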
Law of Total Probability
For a partition C_1, C_2, ... of the space,
P(B) = Σ_{i=1}^∞ P(B | C_i) P(C_i).

Bayes' rule
For a partition A_1, A_2, ...:
P(A | B) = P(B | A) P(A) / P(B)
P(A_i | B) = P(B | A_i) P(A_i) / Σ_{j=1}^∞ P(B | A_j) P(A_j)
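A hypothetical numeric illustration of both rules (the disease/test numbers below are my own, not from the slides): a disease with 1% prevalence, a test with 95% sensitivity and a 5% false-positive rate, using the partition {D, Dᶜ}.

```python
from fractions import Fraction

# Hypothetical numbers for illustration only.
p_d = Fraction(1, 100)             # P(D): prevalence
p_pos_given_d = Fraction(95, 100)  # P(+ | D): sensitivity
p_pos_given_dc = Fraction(5, 100)  # P(+ | Dᶜ): false-positive rate

# Law of total probability over the partition {D, Dᶜ}:
#   P(+) = P(+ | D) P(D) + P(+ | Dᶜ) P(Dᶜ)
p_pos = p_pos_given_d * p_d + p_pos_given_dc * (1 - p_d)

# Bayes' rule: P(D | +) = P(+ | D) P(D) / P(+)
p_d_given_pos = p_pos_given_d * p_d / p_pos
print(p_d_given_pos, float(p_d_given_pos))  # 19/118, about 0.161
```

Even with a seemingly accurate test, a positive result leaves the probability of disease at only about 16%, because the partition element Dᶜ is so much larger than D.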
(Statistical) Independence
Two events A and B are statistically independent if P(A ∩ B) = P(A)P(B).

Theorem
If A and B are independent, so are the following pairs:
- A and Bᶜ
- Aᶜ and B
- Aᶜ and Bᶜ

Mutual Independence
A collection of events A_1, ..., A_n is mutually independent if for any subcollection A_{i_1}, ..., A_{i_k} we have
P(∩_{j=1}^k A_{i_j}) = Π_{j=1}^k P(A_{i_j}).
P(A ∩ B ∩ C) = P(A)P(B)P(C) does not imply mutual independence!
Example: roll two dice and define the following events:
- A: doubles appear, i.e. (1, 1), (2, 2), ..., (6, 6), so that P(A) = 1/6.
- B: 7 ≤ sum ≤ 10, so that P(B) = 1/2.
- C: the sum is 2, 7 or 8, so that P(C) = 1/3.
We also have P(A ∩ B ∩ C) = 1/36 = P(A)P(B)P(C). BUT P(B ∩ C) = 11/36 ≠ P(B)P(C).
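The counterexample can be verified by brute-force enumeration of the 36 equally likely outcomes; a sketch:

```python
import itertools
from fractions import Fraction

# All 36 equally likely outcomes of two dice.
space = list(itertools.product(range(1, 7), repeat=2))

def P(event):
    return Fraction(len(event), len(space))

A = {w for w in space if w[0] == w[1]}             # doubles
B = {w for w in space if 7 <= w[0] + w[1] <= 10}   # 7 <= sum <= 10
C = {w for w in space if w[0] + w[1] in (2, 7, 8)} # sum is 2, 7 or 8

assert (P(A), P(B), P(C)) == (Fraction(1, 6), Fraction(1, 2), Fraction(1, 3))
assert P(A & B & C) == P(A) * P(B) * P(C) == Fraction(1, 36)  # triple factors...
assert P(B & C) == Fraction(11, 36) != P(B) * P(C)            # ...but a pair fails
```

The only outcome in A ∩ B ∩ C is the double (4, 4), whose sum, 8, happens to satisfy both B and C.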
Pairwise independence does not imply mutual independence!
A box contains four balls, numbered 1 through 4. One ball is selected at random from this box. Let
- X_1 = 1 if ball 1 or ball 2 is drawn, 0 otherwise,
- X_2 = 1 if ball 1 or ball 3 is drawn, 0 otherwise,
- X_3 = 1 if ball 2 or ball 3 is drawn, 0 otherwise.
Show that any two of the random variables X_1, X_2, X_3 are independent, but the three together are not.
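A sketch of the requested verification, again by enumeration over the four equally likely draws:

```python
from fractions import Fraction
from itertools import combinations

balls = [1, 2, 3, 4]

# Indicator events from the slide: X_i = 1 exactly on the listed balls.
X = {
    1: lambda b: b in (1, 2),
    2: lambda b: b in (1, 3),
    3: lambda b: b in (2, 3),
}

def P(pred):
    return Fraction(sum(pred(b) for b in balls), len(balls))

# Pairwise independent: each pair shares exactly one ball, so
# P(X_i = 1, X_j = 1) = 1/4 = (1/2)(1/2) = P(X_i = 1) P(X_j = 1).
for i, j in combinations(X, 2):
    assert P(lambda b: X[i](b) and X[j](b)) == P(X[i]) * P(X[j]) == Fraction(1, 4)

# Not mutually independent: no single ball is in all three events,
# so the triple intersection has probability 0, not (1/2)^3 = 1/8.
assert P(lambda b: X[1](b) and X[2](b) and X[3](b)) == 0 != Fraction(1, 8)
```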