Lectures 5-11: Conditional Probability and Independence

Lectures 5-11: Conditional Probability and Independence

Purpose: Calculate probabilities under restrictions, conditions or partial information on the random experiment. Break down complex probabilistic analyses into manageable steps.

Example 1 (Roll of Two Fair Dice): What is the long run relative frequency of a sum of seven given that we rolled at least one six? The relative frequencies of the shaded boxes (outcomes with at least one six) should be roughly equal in the long run. Thus the conditional relative frequency of the light shaded boxes among the shaded boxes should be about 2/11:

P(sum of 7 | at least one 6) = (2/36)/(11/36) = 2/11.

Think of the original probabilities of the shaded region (adding to 11/36) being prorated so that the new conditional probabilities of the individual shaded boxes add to one. This is accomplished by dividing the original shaded box probabilities by this total shaded (conditioning) region probability, i.e., by 11/36, thus giving us the desired (11/36)/(11/36) = 1, the probability of our new sample space (shaded region).
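Example 1 can be checked by brute-force enumeration of the 36 equally likely outcomes. A quick sketch (the variable names are ours); exact fraction arithmetic keeps the answer clean:

```python
from fractions import Fraction
from itertools import product

# All 36 equally likely outcomes of rolling two fair dice.
outcomes = list(product(range(1, 7), repeat=2))

# Conditioning event: at least one six (the "shaded" region, 11 outcomes).
at_least_one_six = [o for o in outcomes if 6 in o]

# Event of interest within the condition: the sum equals seven.
sum_seven = [o for o in at_least_one_six if sum(o) == 7]

# P(sum of 7 | at least one 6) = (2/36)/(11/36) = 2/11
p_cond = Fraction(len(sum_seven), len(at_least_one_six))
print(p_cond)  # 2/11
```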

Definition: If P(F) > 0, then

P(E|F) = P(EF)/P(F);

this is undefined when P(F) = 0!

Long Run Frequency Interpretation of P(E|F): By the long run frequency paradigm we would, in a large number N of repeated experiments, see about a proportion P(F) of the experiments result in the event F, i.e., we would expect to see event F occur about N·P(F) times, and similarly event EF about N·P(EF) times. Thus, when we focus on experiments that result in F (i.e., given F), the proportion of such restricted experiments that also result in E is approximately

N·P(EF)/(N·P(F)) = P(EF)/P(F) = P(E|F).

Example 2 (Makeup Exam): A student is given a makeup exam and one hour to finish it. The exam is designed so that a fraction x/2 of students would finish it in less than x hours, i.e., about half would finish in the allotted time. Given that the student (when viewed as a random choice from all students) is still working on the exam after 40 minutes, what is the chance that the student will use the full hour (finished or not)? Let X denote the time (in hours) that the student needs to finish the exam. We want

P(X ≥ 1 | X ≥ 40/60) = 1 − P(40/60 ≤ X < 1 | X ≥ 40/60) = 1 − (P(X < 1) − P(X < 40/60))/(1 − P(X < 40/60)) = 1 − (1/2 − 1/3)/(1 − 1/3) = 1 − 1/4 = 3/4.

Here X < 1 is shorthand for the event that the student finishes prior to 1 hour, and similarly for the other usages. The chance that the student will use the full hour (finished or not) is .75.

Prorated probabilities: Note that in the case of equally likely outcomes it is often easier to work with the reduced sample space, treating the remaining outcomes as equally likely. In general, outcome probabilities are prorated: P({e}|F) = P({e})/P(F) if e ∈ F, and P({e}|F) = 0 otherwise. When the P({e}) are all the same, then the P({e})/P(F) are all the same for all e ∈ F.

Example 3 (Coin Flips): Two fair coins are flipped. All outcomes of the sample space S = {(H,H), (H,T), (T,H), (T,T)} are equally likely.
What is the conditional probability that both coins are heads given a) that the first coin shows heads and b) that at least one of the coins shows heads? Let B = {(H,H)}, F = {(H,H), (H,T)} and A = {(H,T), (T,H), (H,H)}. Then for a)

P(B|F) = P(BF)/P(F) = P(B)/P(F) = (1/4)/(1/2) = 1/2,

while for b) we get

P(B|A) = P(AB)/P(A) = P(B)/P(A) = (1/4)/(3/4) = 1/3.
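The reduced-sample-space view of Example 3 translates directly into code. A minimal sketch (the helper `cond_prob` is ours): restrict to the conditioning event and count, since all outcomes are equally likely:

```python
from fractions import Fraction

# Sample space of two fair coin flips, all equally likely.
S = [("H", "H"), ("H", "T"), ("T", "H"), ("T", "T")]

def cond_prob(event, given):
    """P(event | given) on the equally likely space S, via the reduced sample space."""
    reduced = [o for o in S if given(o)]
    return Fraction(sum(1 for o in reduced if event(o)), len(reduced))

both_heads = lambda o: o == ("H", "H")
p_a = cond_prob(both_heads, lambda o: o[0] == "H")  # given: first coin heads
p_b = cond_prob(both_heads, lambda o: "H" in o)     # given: at least one head
print(p_a, p_b)  # 1/2 1/3
```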

This takes many by surprise, and it is often phrased in terms of the boy/girl problem in a family with two children. If you are told that at least one of the children is a girl, the answer again is 1/3 for the probability that both are girls. Note however the lengthy but readable discussion in Example 3m in Section 3.3 of Ross.

Here comes the kicker. Suppose you are told that at least one of the children is a girl and she was born on a Sunday. Assuming all gender/weekday combinations (G_1, D_1, G_2, D_2) are equally likely, the answer changes to 13/27. We will generalize this from days D_i to attributes A_i which can take any of n values. For example, for n = 365 it could be the day in the year. If I tell you that at least one of the children is a girl with attribute value k, then the given information reduces the sample space to

{(G_1,k, G_2,i), i = 1,...,n} ∪ {(G_1,i, G_2,k), i = 1,...,n, i ≠ k} ∪ {(G_1,k, B_2,i), i = 1,...,n} ∪ {(B_1,i, G_2,k), i = 1,...,n}

with altogether n + (n−1) + n + n = 4n − 1 equally likely outcomes. Of these, n + (n−1) = 2n − 1 qualify as having two girls. Thus the conditional chance is (2n−1)/(4n−1) ≈ 1/2 for large n, while (2n−1)/(4n−1) = 1/3 for n = 1, and 13/27 for n = 7. Lec6 ends

The product formula: A reformulation of the definition of P(E|F):

P(EF) = P(E|F) P(F),

useful in computing P(EF) when P(E|F) is more transparent.

The Law of Total Probability: Suppose the sample space S can be represented as the disjoint union of M events F_i, i = 1,...,M, with F_i F_j = ∅ for i ≠ j. Then for any event E we have (given as (3.4) by Ross, but pulled forward to clean up the next example):

P(E) = P(ES) = P(E (∪_{i=1}^M F_i)) = P(∪_{i=1}^M EF_i) = Σ_{i=1}^M P(EF_i) = Σ_{i=1}^M P(E|F_i) P(F_i).

Special Case: E = EF ∪ EF^c, a union of mutually exclusive events, so

P(E) = P(EF) + P(EF^c) = P(E|F)P(F) + P(E|F^c)P(F^c),

with the hope that P(F), P(E|F) and P(E|F^c) are easier to ascertain than P(E).

Example 4 (Bridge): In the card game bridge 52 cards are dealt, with equal chance for all sets of hands of 13 each to East, West, North and South.
Given that North and South have a total of 8 spades, what is the chance that East has 3 of the remaining 5 spades? Let F_i be the event that specifies the i-th distinct deal to North and South in which they have a total of 8 spades (so East and West have the other 5 spades), the deal to East and West being otherwise unspecified. How many such disjoint events F_i there are is not important; say there are M ≥ 1. However, it is easy to realize that each F_i contains the same number of deals to East and West, i.e., P(F_1) = ... = P(F_M) = P(F)/M, where F is the event that North and South have 8 spades among themselves, with

P(F_i) = (1/M) · C(13,8) C(39,18) / C(52,26),

i.e., 8 spades and 18 non-spades are chosen to make the two hands of 13 for N and S.
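The count behind P(F) can be evaluated numerically. A quick check (a sketch; the variable name is ours):

```python
from math import comb

# P(F): North-South jointly hold exactly 8 spades. Their 26 cards form a
# random 26-subset of the deck: choose 8 of the 13 spades and 18 of the
# 39 non-spades, out of all 26-subsets.
pF = comb(13, 8) * comb(39, 18) / comb(52, 26)
print(round(pF, 3))
```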

Conditionally, given F, we can view F_1 ∪ ... ∪ F_M = F as our reduced sample space, with all deals in it being equally likely. Let E be the event that East gets exactly 3 spades. Then

P(E|F_i) = C(5,3) C(21,10) / C(26,13) ≈ .339.

Note that which 5 spades and which 21 non-spades are involved in making the two hands for East and West depends on F_i, the hands specified for North and South. However, the probability P(E|F_i) is always the same. Using the same idea as presented in the law of total probability we have (see diagram)

P(E|F) = P(EF)/P(F) = P(E(F_1 ∪ ... ∪ F_M))/P(F) = Σ_{i=1}^M P(EF_i)/P(F) = Σ_{i=1}^M P(E|F_i)P(F_i)/P(F) = P(E|F_i) ≈ .339.

Example 5 (Urn): An urn contains k blue balls and n−k red ones. If we draw the balls out one by one in random order (all n! orders equally likely), what is the chance that the first ball is blue? Distinguish the balls by labels 1, ..., k, k+1, ..., n, with the first k corresponding to blue. Then there are k ways to make the first choice so that it is blue, and then (n−1)! ways to make the remaining choices. On the other hand there are n! ways to make all n choices, without restrictions. Thus the desired probability is k(n−1)!/n! = k/n, intuitively quite evident when we just focus on the first choice. But the same argument works and gives the same answer when we ask: what is the chance that the last ball (or the ball in any given position) is blue?

Now suppose the urn contains b blue balls and r red balls. You randomly draw n balls out, one by one. Given that there are k blue balls among the n drawn, what is the chance that the first one is blue? It is exactly the same as in the previous setting, since the condition reduces the sample space to equally likely sequences of n balls with k blue ones and n−k red ones, i.e., it is as though we draw n balls from an urn with k blue and n−k red balls.
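The hypergeometric value P(E|F_i) from Example 4 is a one-liner to verify (a sketch using the counts named in the text):

```python
from math import comb

# East's 13 cards are a random 13-subset of the 26 cards held by East-West,
# which contain 5 spades and 21 non-spades. P(East has exactly 3 spades):
p = comb(5, 3) * comb(21, 10) / comb(26, 13)
print(round(p, 3))  # 0.339
```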

Example 6 (An Ace in Each Bridge Hand): When we deal out 4 hands of 13 cards each, what is the chance that each hand has an ace? Define the following four events:

E_1 = {the ace of spades is in any one of the hands}
E_2 = {the aces of spades and hearts are in different hands}
E_3 = {the aces of spades, hearts and diamonds are in different hands}
E_4 = {all four aces are in different hands},

then the desired probability is

P(E_1E_2E_3E_4) = P(E_1)P(E_2|E_1)P(E_3|E_1E_2)P(E_4|E_1E_2E_3) = 1 × (39/51) × (26/50) × (13/49) ≈ .105.

The fractions become clear by viewing the deal as though the aces are dealt one by one as the first 4 cards to the 52 positions (positions 1, ..., 13 making the first hand, positions 14, ..., 26 making the second hand, etc.). With that view, 39/51 is the chance that the ace of hearts goes to one of the 39 positions out of the 51 open positions, where 39 counts the positions in the hands different from the hand where the ace of spades wound up. And so on. This reasoning assumes (correctly) that this way of dealing out cards makes all such hands equally likely. In a normal deal, clockwise one card to each player repeatedly from a shuffled deck, we could track which of the 52 slots the 4 aces got dealt into, then which slots the 4 kings got dealt to, and so on. This kind of tracking should make clear that the same kinds of hands are possible either way, all equally likely. In both cases the shuffled deck determines the random sequence of dealing, with the same result.

Another path involves just simple counting, without conditional probabilities. We can deal the cards in 52! orders. Assume that the first 13 cards go to player 1, the second 13 to player 2, etc. The ace of spades can land in any of the 52 deal positions, the ace of hearts then has 39 positions left so that it lands in a different hand, etc. After the 4 ace positions in different hands have been determined, there are 48! ways to deal out the other cards. Thus

P(E) = (52 × 39 × 26 × 13 × 48!)/52! = (39 × 26 × 13)/(51 × 50 × 49) ≈ .105.
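Both paths in Example 6 can be computed side by side to confirm they agree (a sketch; the variable names are ours):

```python
from math import factorial

# Conditional (sequential) argument: deal the four aces into the 52 slots one
# at a time; each successive ace must avoid the hand(s) already holding an ace.
p_sequential = 1 * (39 / 51) * (26 / 50) * (13 / 49)

# Pure counting argument: 52*39*26*13 placements of the aces in distinct
# hands, 48! orders for the remaining cards, out of 52! orders in total.
p_counting = 52 * 39 * 26 * 13 * factorial(48) / factorial(52)

print(round(p_sequential, 4))  # 0.1055
```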
Example 7 (Tree Diagram: Rolling 5 Dice to Get Different Faces): 5 dice are rolled at most twice in order to achieve 5 distinct faces. On the second roll, dice with duplicate faces are rolled again. What is the chance of the event D of finishing with 5 distinct faces? First we work out the probabilities for the events E_1, ..., E_5, where E_i means that we get exactly i distinct faces on the first roll. We have (recall the birthday problem)

p_5 = P(E_5) = (6·5·4·3·2)/6^5 = 720/7776,
p_4 = P(E_4) = C(5,2)·6·5·4·3/6^5 = 3600/7776,
p_3 = P(E_3) = (C(5,3)·6·5·4 + C(5,1)·6·3·5·4)/6^5 = (1200 + 1800)/7776 = 3000/7776, Lec7 ends
p_2 = P(E_2) = (C(5,3)·6·5 + C(5,4)·6·5)/6^5 = (300 + 150)/7776 = 450/7776,
p_1 = P(E_1) = 6/7776.

Note that 720 + 3600 + 3000 + 450 + 6 = 7776 = 6^5, confirming a proper count in the disjoint sets. We only comment on P(E_3), which can come about as triple, single, single, e.g., (4, 1, 3, 4, 4), and

as two doubles and a single, e.g., (2, 4, 6, 6, 4). While the numerator count for the former should be clear, the count for the latter is obtained by choosing the position for the singleton and filling it in 6 possible ways, i.e., C(5,1)·6 ways, then taking the left-most free position as the left-most position of the left-most pair and combining it with any of the 3 remaining positions for the right-most position of that pair. The remaining slots define the other pair. These pairs are then filled with 5 and 4 choices respectively, giving C(5,1)·6·3·5·4 = 1800.

The probability P(D) can then be obtained by following all branches in the tree diagram below that lead to the event of interest, i.e., all 5 faces distinct, multiplying the probabilities along each such branch, and adding up all these products. The probability at each branch segment represents the conditional probability of traversing this segment, conditional on having arrived at the segment from the root node. Again, an application of the law of total probability:

P(D) = P(E_5)P(D|E_5) + P(E_4)P(D|E_4) + P(E_3)P(D|E_3) + P(E_2)P(D|E_2) + P(E_1)P(D|E_1).

Bayes Formula

Example 8 (Insurance): In a population there are 10% accident prone people and the rest are not accident prone. An accident prone person has a 20% chance of having an accident in a given

year, whereas for a normal person that chance is 10%. What is the chance that a randomly chosen person will have an accident during the next year? If F is the event that the chosen person is accident prone and A is the event that the chosen person will have an accident next year, then

P(A) = P(A|F)P(F) + P(A|F^c)P(F^c) = .2 × .1 + .1 × .9 = .11.

If the chosen person had an accident within that year, what is the chance that the person is accident prone?

P(F|A) = P(AF)/P(A) = P(A|F)P(F)/P(A) = P(A|F)P(F)/(P(A|F)P(F) + P(A|F^c)P(F^c)) = .02/.11 = 2/11 ≈ .18,

i.e., the chance has almost doubled, from 1/10 to 2/11. This is an instance of Bayes formula.

Example 9 (Multiple Choice Tests): What is the chance of the event K that the student knew the answer, given that the student answered the question correctly (event C)? Assume that there are m choices and the a priori chance of the student knowing the answer is p. When the student does not know the answer, it is chosen randomly.

P(K|C) = P(KC)/P(C) = P(C|K)P(K)/(P(C|K)P(K) + P(C|K^c)P(K^c)) = p/(p + (1/m)(1−p)) = mp/(mp + 1 − p).

With p = .5 and m = 4 we get P(K|C) = .8. Note: P(K|C) → 1 as m → ∞.

Example 10 (Blood Test for Disease): A test is 95% effective on persons with the disease (P(E|D) = .95) and has a 1% false alarm rate. Suppose that the prevalence of the disease in the population is .5%. What is the chance that the person actually has the disease (event D), given that the test is positive (event E)?

P(D|E) = P(DE)/P(E) = P(E|D)P(D)/(P(E|D)P(D) + P(E|D^c)P(D^c)) = (.95 × .005)/(.95 × .005 + .01 × .995) ≈ .32.

Even with P(E|D) = .99 we would get P(D|E) ≈ .33, and with P(E|D) = 1 still only P(D|E) ≈ .33 (Bayes formula).

The following long run type argument makes the surprising answer more transparent. Out of 1000 people, roughly 995 will have no disease, and about 10 of them will give a false positive E. 5 will have the disease and about all of them will give a true positive, so about 5/(10 + 5) = .333 of the positives actually have the disease. Such illustrations can counter the possible psychological damage arising from routine tests.
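Bayes computations like Examples 8-10 share one formula; a small sketch (the helper `posterior` is ours, not from the notes):

```python
def posterior(prior, sens, false_pos):
    """P(D | positive test) by Bayes formula:
    sens = P(E|D), false_pos = P(E|D^c), prior = P(D)."""
    return sens * prior / (sens * prior + false_pos * (1 - prior))

# 95% sensitive test, 1% false alarms, 0.5% prevalence (Example 10):
p = posterior(0.005, 0.95, 0.01)
print(round(p, 3))  # 0.323

# Even a perfect detection rate barely helps -- the false alarms dominate:
print(round(posterior(0.005, 1.0, 0.01), 3))  # 0.334
```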
General Bayes Formula: Let F_1, F_2, F_3, ..., F_n be mutually exclusive events whose union is S. Then

P(E) = Σ_{i=1}^n P(EF_i) = Σ_{i=1}^n P(E|F_i) P(F_i)   (law of total probability). Lec8 ends

Hence

P(F_j|E) = P(F_j E)/P(E) = P(E|F_j)P(F_j) / Σ_{i=1}^n P(E|F_i)P(F_i)   (Bayes formula).

So far we have seen it when S was split into two mutually exclusive events, e.g., F and F^c.

Example 11 (Three Cards): One card has both sides black (BB), one card has both sides red (RR) and one card has a red side and a black side (RB). The cards are mixed and one randomly selected card is randomly flipped and placed on the ground. You don't see the flip. If the color facing up is red (R_u), what is the chance that it is RR?

P(RR|R_u) = P(R_u|RR)P(RR) / (P(R_u|RR)P(RR) + P(R_u|BB)P(BB) + P(R_u|RB)P(RB)) = (1 × 1/3)/(1 × 1/3 + 0 × 1/3 + (1/2) × 1/3) = 2/3.

Independence

Definition of Independence: Two events E and F are called independent if

P(EF) = P(E)P(F);

otherwise they are called dependent. Motivate through P(E|F) = P(E). This relationship appears one sided, but it is symmetric if P(EF) > 0, i.e., P(F) > 0, P(E) > 0. The definition of independence does not require P(F) > 0. An event F with P(F) = 0 is always independent of any other event.

Example 12 (2 Dice): The number on the first die is independent of the number on the second die. Let F_4 be the event that the first die is 4, S_6 the event that the sum is 6 and S_7 the event that the sum is 7. Is F_4 independent of S_6 (S_7)?

P(F_4) = 1/6, P(S_6) = 5/36, P(S_7) = 1/6, P(F_4 S_7) = 1/36 = P(F_4)P(S_7), but P(F_4 S_6) = 1/36 ≠ P(F_4)P(S_6).

Example 13 (Cards): If we draw a card at random then the event A that the card is an ace is independent of the event C that the card is a club. However, this breaks down as soon as the king of diamonds is missing from the deck, but not when all kings are missing.

Theorem: Independence of E, F implies independence of E, F^c, of E^c, F and of E^c, F^c.

Example 14 (Independence in 3 Events?): If E is independent of F and also independent of G, is E then independent of FG? Not necessarily! In a throw of two dice let E be the event that the sum is 7, F the event that the first die is a 4 and G the event that the second die is a 3.
Then P(E) = 1/6, P(F) = 1/6, P(G) = 1/6, P(EF) = 1/36, P(EG) = 1/36, P(FG) = 1/36, but

P(EFG) = 1/36 ≠ (1/6)(1/36) = P(E)P(FG).

All events are pairwise independent but E and FG are not.

Definition of Independence of 3 Events: The 3 events E, F, and G are called independent if all the following relations hold:

P(EFG) = P(E)P(F)P(G), P(EF) = P(E)P(F), P(EG) = P(E)P(G), P(FG) = P(F)P(G).

If E, F and G are independent then E is independent of any event formed from F and G, e.g., E is then independent of FG, F^c G, F ∪ G, F ∪ G^c, etc.
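Example 14's counterexample can be verified exactly by enumeration (a sketch; `prob` is our helper):

```python
from fractions import Fraction
from itertools import product

dice = list(product(range(1, 7), repeat=2))  # 36 equally likely outcomes

def prob(event):
    """Exact probability of an event (a predicate on outcomes) on two fair dice."""
    return Fraction(sum(1 for o in dice if event(o)), len(dice))

E = lambda o: sum(o) == 7   # sum is 7
F = lambda o: o[0] == 4     # first die is 4
G = lambda o: o[1] == 3     # second die is 3
FG = lambda o: F(o) and G(o)

# Pairwise independence holds ...
assert prob(lambda o: E(o) and F(o)) == prob(E) * prob(F)
assert prob(lambda o: E(o) and G(o)) == prob(E) * prob(G)
# ... but E is not independent of FG: P(EFG) = 1/36 != (1/6)(1/36).
print(prob(lambda o: E(o) and FG(o)), prob(E) * prob(FG))
```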

Definition (Extension of Independence to Many Events): E_1, E_2, ..., E_n are called independent if

P(E_{i_1} E_{i_2} ... E_{i_k}) = P(E_{i_1})P(E_{i_2}) ... P(E_{i_k})

for any subcollection of events E_{i_1}, E_{i_2}, ..., E_{i_k} (1 ≤ k ≤ n) taken from E_1, E_2, ..., E_n.

Subexperiments, Repeated Experiments: Often an experiment is made up of many subexperiments (say n) such that the outcome in each subexperiment is not affected by the outcomes in the other subexperiments. In such a case of physical independence we may then reasonably assume the (probabilistic) independence of the events E_1, E_2, ..., E_n, provided E_i is completely described by the outcomes in the i-th subexperiment. If S_i represents the sample space of the i-th subexperiment and e_i one of its typical outcomes, then an outcome e of the full experiment could be described by e = (e_1, e_2, ..., e_n) and its sample space is

S = S_1 × S_2 × ... × S_n = {(e_1, e_2, ..., e_n) : e_1 ∈ S_1, e_2 ∈ S_2, ..., e_n ∈ S_n}.

If the sample spaces of the subexperiments are all the same and if the probability function defined on the events of each subexperiment is the same, then the subexperiments are called trials. Note that by prescribing the probability function for each subexperiment and assuming independence of the events from these subexperiments, it is possible to construct a probability function on the events described in terms of the outcomes (e_1, e_2, ..., e_n) of the overall experiment such that it is consistent with the probability functions on the subexperiments. We assume this without proof.

Example 15 (Finite and Infinite Independent Trials): A finite number n or an infinite sequence of independent trials is performed. For each trial we distinguish whether a certain event E occurs or not. If E occurs we call the result a success, otherwise a failure. Let p = P(E) denote the probability of success in a single trial. Let E_1 be the event of at least one success in the first n trials,
E_k be the event of exactly k successes in the first n trials, and E_∞ be the event that all trials are successes. Find P(E_1), P(E_k) and P(E_∞). Lec9 ends

P(E_1) = 1 − P(E_1^c) = 1 − (1−p)^n,  P(E_k) = C(n,k) p^k (1−p)^{n−k},

and, since E_∞ requires success in each of the first n trials, P(E_∞) ≤ p^n for all n, so that

P(E_∞) = 0 for p < 1 and P(E_∞) = 1 for p = 1.

Example 16 (Parallel System): A system is composed of n separate components (relays, artificial horizons in a cockpit) such that the system works as long as at least one of the components works. Such a system is called a parallel system. Suppose that during a given time period the chance that component i works (functions) is p_i, i = 1, 2, ..., n, and assume that the functioning of a component does not depend on that of any of the other components, i.e., we may assume probabilistic independence of (failure) events pertaining to separate components. Let E

denote the event that the system functions, i.e., at least one of the components functions during the given time period. Let E_i denote the event that component i functions. Then

P(E) = 1 − P(E^c) = 1 − P(E_1^c E_2^c ... E_n^c) = 1 − Π_{i=1}^n P(E_i^c) = 1 − Π_{i=1}^n (1 − p_i),

thus achieving arbitrarily high reliability (= probability of functioning) through redundancy. This is how one can achieve, on paper, a probability of failure on the order of 10^{-9}: a triply redundant system with components of reliability .999 each fails with probability (.001)^3 = 10^{-9}.

Example 17 (Infinite Trials, E Before F): Suppose we perform a potentially infinite number of independent trials and for each trial we note whether an event E or an event F or neither of the two events occurs. It is assumed that E and F are mutually exclusive. Their respective probabilities of occurrence in any given trial are denoted by P(E) and P(F). What is the probability that E occurs before F in these trials? Let G denote the event consisting of all those trial sequences, i.e., outcomes, in which the first E occurs before the first F. Let E_1 denote the event that the first trial results in the event E, let F_1 denote the event that the first trial results in F, and N_1 the event that neither E nor F occurs in the first trial. Then

P(G) = P(GE_1) + P(GF_1) + P(GN_1) = P(G|E_1)P(E_1) + P(G|F_1)P(F_1) + P(G|N_1)P(N_1) = 1·P(E) + 0·P(F) + P(G)(1 − P(E) − P(F)),

i.e.,

P(G) = P(E)/(P(E) + P(F)).

Discuss intuition.

P(·|F) is a Probability: The Power of the Axiomatic Approach! For fixed F with P(F) > 0, the function P(E|F) is a probability function defined for events E ⊂ S, i.e., it satisfies the 3 axioms of probability:

1. 0 ≤ P(E|F) ≤ 1
2. P(S|F) = 1
3. P(∪_{i=1}^∞ E_i | F) = Σ_{i=1}^∞ P(E_i|F) for mutually exclusive events E_1, E_2, E_3, ....

Consequences: All the consequences derived from the original axiom set hold as well for the conditional probability function Q(E) = P(E|F), e.g.
Q(E_1 ∪ E_2) = Q(E_1) + Q(E_2) − Q(E_1E_2), i.e.,

P(E_1 ∪ E_2 | F) = P(E_1|F) + P(E_2|F) − P(E_1E_2|F).

Also:

Q(E|G) = Q(EG)/Q(G) = P(EG|F)/P(G|F) = (P(EGF)/P(F)) / (P(GF)/P(F)) = P(EGF)/P(GF) = P(E|GF)

and

P(E|F) = Q(E) = Q(E|G)Q(G) + Q(E|G^c)Q(G^c) = P(E|FG)P(G|F) + P(E|FG^c)P(G^c|F).

Conditional Independence: Two events E_1 and E_2 are conditionally (given F) independent if

Q(E_1E_2) = P(E_1E_2|F) = P(E_1|F)P(E_2|F) = Q(E_1)Q(E_2).

Conditional independence of E_1 and E_2 does not imply the (unconditional) independence of E_1 and E_2. Randomly pick one of two boxes, one holding mostly red balls and the other mostly blue ones, and then draw 2 balls with replacement from that box. Given the chosen box the two draws are independent, e.g., for red draws R_1, R_2 we have P(R_1R_2 | box i) = P(R_1 | box i)P(R_2 | box i), but unconditionally P(R_1R_2) ≠ P(R_1)P(R_2): a red first draw makes the mostly-red box the more likely source and thus raises the chance of a red second draw.

Conversely, unconditional independence of E_1 and E_2 does not imply their conditional independence given F. Rolling two dice, let E_1 = {sum of 7}, E_2 = {first die is a 6} and F = {at least one 6}. Then

P(E_1E_2) = 1/36 = P(E_1)P(E_2), while P(E_1E_2|F) = 1/11 ≠ (2/11)(6/11) = P(E_1|F)P(E_2|F).

However, the independence of E_1, E_2 and F implies the conditional independence of E_1 and E_2 given F (exercise). This notion of conditional independence of two events can easily be generalized to a corresponding notion of conditional independence of three or more events, as was done for the unconditional case.

Example 18 (Insurance Revisited): Let F be the event that a randomly selected person is accident-prone, with P(F) = .1. Let A_1 be the event that this person has an accident in the first year and A_2 the event that this person has an accident in the second year. An accident-prone person has chance .2 of having an accident in any given year, whereas for a non-accident-prone person that chance is .1. Finally we assume that A_1 and A_2 are conditionally independent given that an accident-prone (or non-accident-prone) person was selected, i.e., given F (or F^c). What is P(A_2|A_1)?
Lec10 ends

P(A_2|A_1) = P(A_2A_1)/P(A_1) = (P(A_2A_1|F)P(F) + P(A_2A_1|F^c)P(F^c))/P(A_1) = (P(A_2|F)P(A_1|F)P(F) + P(A_2|F^c)P(A_1|F^c)P(F^c))/P(A_1) = (.2² × .1 + .1² × .9)/.11 ≈ .118 > .11 = P(A_2).

It also shows that conditional independence does not necessarily imply unconditional independence. Similarly one shows

P(A_3^c|A_1^cA_2^c) = P(A_3^cA_1^cA_2^c)/P(A_1^cA_2^c) = (P(A_3^cA_1^cA_2^c|F)P(F) + P(A_3^cA_1^cA_2^c|F^c)P(F^c))/(P(A_1^cA_2^c|F)P(F) + P(A_1^cA_2^c|F^c)P(F^c)) = (.8³ × .1 + .9³ × .9)/(.8² × .1 + .9² × .9) ≈ .892.

Compare this with P(A_3^c) = .89 and P(A_3^c|F) = .8 and P(A_3^c|F^c) = .9.
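Example 18's two computations take only a few lines (a sketch; the variable names are ours):

```python
pF, pFc = 0.1, 0.9   # P(accident prone), P(not accident prone)
a, b = 0.2, 0.1      # yearly accident chance given F, given F^c

# P(A1) by total probability; P(A2 A1) via conditional independence given F / F^c.
p_a1 = a * pF + b * pFc
p_a1a2 = a * a * pF + b * b * pFc
p_a2_given_a1 = p_a1a2 / p_a1
print(round(p_a2_given_a1, 4))  # 0.1182, slightly above P(A2) = 0.11

# Chance of a third accident-free year given two accident-free years:
num = (1 - a) ** 3 * pF + (1 - b) ** 3 * pFc
den = (1 - a) ** 2 * pF + (1 - b) ** 2 * pFc
print(round(num / den, 3))  # 0.892
```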

The Gambler's Ruin Problem: Two gamblers, A and B, with respective fortunes of i and N−i units, keep flipping a coin (probability of heads p). Each time a head turns up A collects one unit from B; each time a tail turns up B collects one unit from A. The game ends when one of the players goes broke, i.e., gets ruined. What is the probability of the event E that A is the ultimate winner? Let P_i = P(E), the subscript i emphasizing the dependence on the fortune of player A. With H denoting the event of a head on the first toss, note that

P_i = P(E) = P(E|H)P(H) + P(E|H^c)P(H^c) = p P_{i+1} + (1−p) P_{i−1}.

Since also P_i = p P_i + (1−p) P_i, this gives

P_{i+1} − P_i = ((1−p)/p)(P_i − P_{i−1}).

Using the boundary condition P_0 = 0 we get

P_2 − P_1 = ((1−p)/p)(P_1 − P_0) = ((1−p)/p) P_1
P_3 − P_2 = ((1−p)/p)(P_2 − P_1) = ((1−p)/p)² P_1
...
P_i − P_{i−1} = ((1−p)/p)(P_{i−1} − P_{i−2}) = ((1−p)/p)^{i−1} P_1
...
P_N − P_{N−1} = ((1−p)/p)(P_{N−1} − P_{N−2}) = ((1−p)/p)^{N−1} P_1,

and adding these equations we get

P_i = P_1 [1 + ((1−p)/p) + ((1−p)/p)² + ... + ((1−p)/p)^{i−1}]

or

P_i = P_1 (1 − ((1−p)/p)^i)/(1 − (1−p)/p) if (1−p)/p ≠ 1, i.e., p ≠ 1/2, and P_i = i P_1 if p = 1/2.

Using P_N = 1 we get

P_1 = (1 − (1−p)/p)/(1 − ((1−p)/p)^N) if p ≠ 1/2, and P_1 = 1/N if p = 1/2,

and hence

P_i = (1 − ((1−p)/p)^i)/(1 − ((1−p)/p)^N) if p ≠ 1/2, and P_i = i/N if p = 1/2.

If Q_i is the probability that player B wins ultimately, starting with N−i units, we get by symmetry:

Q_i = (1 − (p/(1−p))^{N−i})/(1 − (p/(1−p))^N) if p ≠ 1/2, and Q_i = (N−i)/N if p = 1/2,

and note P_i + Q_i = 1, i.e., the chance that play will go on forever is 0. See illustrations on the last 4 pages.
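The closed form for P_i can be coded and checked against both the recursion and the P_i + Q_i = 1 identity (a sketch; `ruin_prob` is our name, and the float comparison with 0.5 is good enough for an illustration):

```python
def ruin_prob(i, N, p):
    """P_i: probability that A, starting with i of N total units, ruins B."""
    if p == 0.5:
        return i / N
    r = (1 - p) / p
    return (1 - r**i) / (1 - r**N)

# Fair coin: P_i = i/N.
print(ruin_prob(5, 10, 0.5))  # 0.5

i, N, p = 3, 10, 0.6
# P_i + Q_i = 1: B plays with success probability 1-p and capital N-i.
assert abs(ruin_prob(i, N, p) + ruin_prob(N - i, N, 1 - p) - 1) < 1e-12
# The recursion P_i = p*P_{i+1} + (1-p)*P_{i-1} is satisfied:
assert abs(ruin_prob(i, N, p)
           - (p * ruin_prob(i + 1, N, p) + (1 - p) * ruin_prob(i - 1, N, p))) < 1e-12
```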

Example 19 (Drug Testing): Two drugs are tested on pairs of patients, one drug per patient in a pair. Drug A has cure probability P_A and drug B has cure probability P_B. The test for each pair of patients constitutes a trial, and the score S_A of A is increased by one each time drug A results in a cure of its patient. Similarly the score S_B of B goes up each time B effects a cure. The trials are stopped as soon as S_A − S_B reaches either M or −M, where M is some predetermined number. If we eliminate all those trials which result in no change of the score difference S_A − S_B, then the remaining trials are again independent, and the outcome of such a remaining trial is that S_A − S_B increases or decreases by one with probability

P = P_A(1−P_B)/(P_A(1−P_B) + P_B(1−P_A))  or  1 − P = P_B(1−P_A)/(P_A(1−P_B) + P_B(1−P_A)),

respectively. This is exactly the gambler's ruin problem, where both players start out with M units, i.e., N = 2M. Note: S_A − S_B = M iff A had M more wins than B, i.e., A wins if both start with M betting units. When P_A > P_B, the probability that drug B comes out ahead ({B > A}) is

P({B > A}) = 1 − P({A > B}) = 1 − (1 − ((1−P)/P)^M)/(1 − ((1−P)/P)^{2M}) = 1/(1 + γ^M),

where γ = P/(1−P) = P_A(1−P_B)/(P_B(1−P_A)). For P_A = .6 and P_B = .4 and M = 5 we get P({B > A}) ≈ .017, and for M = 10 we get P({B > A}) ≈ .0003. γ is also called the odds ratio, of the odds P_A/(1−P_A) over the odds P_B/(1−P_B).
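Example 19 reduces to the ruin formula with N = 2M; a small sketch (the function name is ours) reproduces the .017 figure:

```python
def p_wrong_drug_wins(pA, pB, M):
    """P({B > A}) when pA > pB, via the gambler's ruin with N = 2M."""
    P = pA * (1 - pB) / (pA * (1 - pB) + pB * (1 - pA))  # step-up probability
    gamma = P / (1 - P)  # odds ratio pA(1-pB) / (pB(1-pA))
    return 1 / (1 + gamma**M)

print(round(p_wrong_drug_wins(0.6, 0.4, 5), 3))  # 0.017
# Doubling M makes declaring the inferior drug the winner far less likely:
print(p_wrong_drug_wins(0.6, 0.4, 10))
```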

[Figures, last 4 pages: plots of P_i, the probability that player A with capital i ruins player B with capital N−i, as a function of p = P(A wins one unit per game), for the ratios r = i/N = .05, .25, .50, .95, each with N = 100, 200, 500, 1000, 5000, 10000.]


More information

Chapter 2 Class Notes

Chapter 2 Class Notes Chapter 2 Class Notes Probability can be thought of in many ways, for example as a relative frequency of a long series of trials (e.g. flips of a coin or die) Another approach is to let an expert (such

More information

Introduction to Probability, Fall 2009

Introduction to Probability, Fall 2009 Introduction to Probability, Fall 2009 Math 30530 Review questions for exam 1 solutions 1. Let A, B and C be events. Some of the following statements are always true, and some are not. For those that are

More information

Probability & Random Variables

Probability & Random Variables & Random Variables Probability Probability theory is the branch of math that deals with random events, processes, and variables What does randomness mean to you? How would you define probability in your

More information

Mathematical Foundations of Computer Science Lecture Outline October 18, 2018

Mathematical Foundations of Computer Science Lecture Outline October 18, 2018 Mathematical Foundations of Computer Science Lecture Outline October 18, 2018 The Total Probability Theorem. Consider events E and F. Consider a sample point ω E. Observe that ω belongs to either F or

More information

the time it takes until a radioactive substance undergoes a decay

the time it takes until a radioactive substance undergoes a decay 1 Probabilities 1.1 Experiments with randomness Wewillusethetermexperimentinaverygeneralwaytorefertosomeprocess that produces a random outcome. Examples: (Ask class for some first) Here are some discrete

More information

3.2 Probability Rules

3.2 Probability Rules 3.2 Probability Rules The idea of probability rests on the fact that chance behavior is predictable in the long run. In the last section, we used simulation to imitate chance behavior. Do we always need

More information

I - Probability. What is Probability? the chance of an event occuring. 1classical probability. 2empirical probability. 3subjective probability

I - Probability. What is Probability? the chance of an event occuring. 1classical probability. 2empirical probability. 3subjective probability What is Probability? the chance of an event occuring eg 1classical probability 2empirical probability 3subjective probability Section 2 - Probability (1) Probability - Terminology random (probability)

More information

Chapter. Probability

Chapter. Probability Chapter 3 Probability Section 3.1 Basic Concepts of Probability Section 3.1 Objectives Identify the sample space of a probability experiment Identify simple events Use the Fundamental Counting Principle

More information

Outline Conditional Probability The Law of Total Probability and Bayes Theorem Independent Events. Week 4 Classical Probability, Part II

Outline Conditional Probability The Law of Total Probability and Bayes Theorem Independent Events. Week 4 Classical Probability, Part II Week 4 Classical Probability, Part II Week 4 Objectives This week we continue covering topics from classical probability. The notion of conditional probability is presented first. Important results/tools

More information

Probability Experiments, Trials, Outcomes, Sample Spaces Example 1 Example 2

Probability Experiments, Trials, Outcomes, Sample Spaces Example 1 Example 2 Probability Probability is the study of uncertain events or outcomes. Games of chance that involve rolling dice or dealing cards are one obvious area of application. However, probability models underlie

More information

Tutorial 3: Random Processes

Tutorial 3: Random Processes August 18 Semester 2018-II Tutorial 3: Random Processes Lecturer: Utpal Mukherji/ Parimal Parag Prepared by: Prathamesh Mayekar Note: LaTeX template courtesy of UC Berkeley EECS dept. 3.1 Conditional Probability

More information

Random processes. Lecture 17: Probability, Part 1. Probability. Law of large numbers

Random processes. Lecture 17: Probability, Part 1. Probability. Law of large numbers Random processes Lecture 17: Probability, Part 1 Statistics 10 Colin Rundel March 26, 2012 A random process is a situation in which we know what outcomes could happen, but we don t know which particular

More information

Statistical Theory 1

Statistical Theory 1 Statistical Theory 1 Set Theory and Probability Paolo Bautista September 12, 2017 Set Theory We start by defining terms in Set Theory which will be used in the following sections. Definition 1 A set is

More information

1 Preliminaries Sample Space and Events Interpretation of Probability... 13

1 Preliminaries Sample Space and Events Interpretation of Probability... 13 Summer 2017 UAkron Dept. of Stats [3470 : 461/561] Applied Statistics Ch 2: Probability Contents 1 Preliminaries 3 1.1 Sample Space and Events...........................................................

More information

Chapter 2. Conditional Probability and Independence. 2.1 Conditional Probability

Chapter 2. Conditional Probability and Independence. 2.1 Conditional Probability Chapter 2 Conditional Probability and Independence 2.1 Conditional Probability Example: Two dice are tossed. What is the probability that the sum is 8? This is an easy exercise: we have a sample space

More information

Lecture 8: Conditional probability I: definition, independence, the tree method, sampling, chain rule for independent events

Lecture 8: Conditional probability I: definition, independence, the tree method, sampling, chain rule for independent events Lecture 8: Conditional probability I: definition, independence, the tree method, sampling, chain rule for independent events Discrete Structures II (Summer 2018) Rutgers University Instructor: Abhishek

More information

PROBABILITY.

PROBABILITY. PROBABILITY PROBABILITY(Basic Terminology) Random Experiment: If in each trial of an experiment conducted under identical conditions, the outcome is not unique, but may be any one of the possible outcomes,

More information

1. When applied to an affected person, the test comes up positive in 90% of cases, and negative in 10% (these are called false negatives ).

1. When applied to an affected person, the test comes up positive in 90% of cases, and negative in 10% (these are called false negatives ). CS 70 Discrete Mathematics for CS Spring 2006 Vazirani Lecture 8 Conditional Probability A pharmaceutical company is marketing a new test for a certain medical condition. According to clinical trials,

More information

The probability of an event is viewed as a numerical measure of the chance that the event will occur.

The probability of an event is viewed as a numerical measure of the chance that the event will occur. Chapter 5 This chapter introduces probability to quantify randomness. Section 5.1: How Can Probability Quantify Randomness? The probability of an event is viewed as a numerical measure of the chance that

More information

Chapter 7: Section 7-1 Probability Theory and Counting Principles

Chapter 7: Section 7-1 Probability Theory and Counting Principles Chapter 7: Section 7-1 Probability Theory and Counting Principles D. S. Malik Creighton University, Omaha, NE D. S. Malik Creighton University, Omaha, NE Chapter () 7: Section 7-1 Probability Theory and

More information

Announcements. Lecture 5: Probability. Dangling threads from last week: Mean vs. median. Dangling threads from last week: Sampling bias

Announcements. Lecture 5: Probability. Dangling threads from last week: Mean vs. median. Dangling threads from last week: Sampling bias Recap Announcements Lecture 5: Statistics 101 Mine Çetinkaya-Rundel September 13, 2011 HW1 due TA hours Thursday - Sunday 4pm - 9pm at Old Chem 211A If you added the class last week please make sure to

More information

4. Probability of an event A for equally likely outcomes:

4. Probability of an event A for equally likely outcomes: University of California, Los Angeles Department of Statistics Statistics 110A Instructor: Nicolas Christou Probability Probability: A measure of the chance that something will occur. 1. Random experiment:

More information

5.3 Conditional Probability and Independence

5.3 Conditional Probability and Independence 28 CHAPTER 5. PROBABILITY 5. Conditional Probability and Independence 5.. Conditional Probability Two cubical dice each have a triangle painted on one side, a circle painted on two sides and a square painted

More information

Intermediate Math Circles November 8, 2017 Probability II

Intermediate Math Circles November 8, 2017 Probability II Intersection of Events and Independence Consider two groups of pairs of events Intermediate Math Circles November 8, 017 Probability II Group 1 (Dependent Events) A = {a sales associate has training} B

More information

Monty Hall Puzzle. Draw a tree diagram of possible choices (a possibility tree ) One for each strategy switch or no-switch

Monty Hall Puzzle. Draw a tree diagram of possible choices (a possibility tree ) One for each strategy switch or no-switch Monty Hall Puzzle Example: You are asked to select one of the three doors to open. There is a large prize behind one of the doors and if you select that door, you win the prize. After you select a door,

More information

(a) Fill in the missing probabilities in the table. (b) Calculate P(F G). (c) Calculate P(E c ). (d) Is this a uniform sample space?

(a) Fill in the missing probabilities in the table. (b) Calculate P(F G). (c) Calculate P(E c ). (d) Is this a uniform sample space? Math 166 Exam 1 Review Sections L.1-L.2, 1.1-1.7 Note: This review is more heavily weighted on the new material this week: Sections 1.5-1.7. For more practice problems on previous material, take a look

More information

Conditional Probability and Independence

Conditional Probability and Independence Conditional Probability and Independence September 3, 2009 1 Restricting the Sample Space - Conditional Probability How do we modify the probability of an event in light of the fact that something is known?

More information

Introduction and basic definitions

Introduction and basic definitions Chapter 1 Introduction and basic definitions 1.1 Sample space, events, elementary probability Exercise 1.1 Prove that P( ) = 0. Solution of Exercise 1.1 : Events S (where S is the sample space) and are

More information

Chapter 13, Probability from Applied Finite Mathematics by Rupinder Sekhon was developed by OpenStax College, licensed by Rice University, and is

Chapter 13, Probability from Applied Finite Mathematics by Rupinder Sekhon was developed by OpenStax College, licensed by Rice University, and is Chapter 13, Probability from Applied Finite Mathematics by Rupinder Sekhon was developed by OpenStax College, licensed by Rice University, and is available on the Connexions website. It is used under a

More information

What is the probability of getting a heads when flipping a coin

What is the probability of getting a heads when flipping a coin Chapter 2 Probability Probability theory is a branch of mathematics dealing with chance phenomena. The origins of the subject date back to the Italian mathematician Cardano about 1550, and French mathematicians

More information

Conditional Probability

Conditional Probability Conditional Probability When we obtain additional information about a probability experiment, we want to use the additional information to reassess the probabilities of events given the new information.

More information

Probability 5-4 The Multiplication Rules and Conditional Probability

Probability 5-4 The Multiplication Rules and Conditional Probability Outline Lecture 8 5-1 Introduction 5-2 Sample Spaces and 5-3 The Addition Rules for 5-4 The Multiplication Rules and Conditional 5-11 Introduction 5-11 Introduction as a general concept can be defined

More information

Probability (Devore Chapter Two)

Probability (Devore Chapter Two) Probability (Devore Chapter Two) 1016-345-01: Probability and Statistics for Engineers Fall 2012 Contents 0 Administrata 2 0.1 Outline....................................... 3 1 Axiomatic Probability 3

More information

Chapter 6: Probability The Study of Randomness

Chapter 6: Probability The Study of Randomness Chapter 6: Probability The Study of Randomness 6.1 The Idea of Probability 6.2 Probability Models 6.3 General Probability Rules 1 Simple Question: If tossing a coin, what is the probability of the coin

More information

Notes Week 2 Chapter 3 Probability WEEK 2 page 1

Notes Week 2 Chapter 3 Probability WEEK 2 page 1 Notes Week 2 Chapter 3 Probability WEEK 2 page 1 The sample space of an experiment, sometimes denoted S or in probability theory, is the set that consists of all possible elementary outcomes of that experiment

More information

Axioms of Probability

Axioms of Probability Sample Space (denoted by S) The set of all possible outcomes of a random experiment is called the Sample Space of the experiment, and is denoted by S. Example 1.10 If the experiment consists of tossing

More information

Probability Notes (A) , Fall 2010

Probability Notes (A) , Fall 2010 Probability Notes (A) 18.310, Fall 2010 We are going to be spending around four lectures on probability theory this year. These notes cover approximately the first three lectures on it. Probability theory

More information

1 of 14 7/15/2009 9:25 PM Virtual Laboratories > 2. Probability Spaces > 1 2 3 4 5 6 7 5. Independence As usual, suppose that we have a random experiment with sample space S and probability measure P.

More information

Conditional Probability

Conditional Probability Conditional Probability Idea have performed a chance experiment but don t know the outcome (ω), but have some partial information (event A) about ω. Question: given this partial information what s the

More information

Chapter 3 : Conditional Probability and Independence

Chapter 3 : Conditional Probability and Independence STAT/MATH 394 A - PROBABILITY I UW Autumn Quarter 2016 Néhémy Lim Chapter 3 : Conditional Probability and Independence 1 Conditional Probabilities How should we modify the probability of an event when

More information

CSC Discrete Math I, Spring Discrete Probability

CSC Discrete Math I, Spring Discrete Probability CSC 125 - Discrete Math I, Spring 2017 Discrete Probability Probability of an Event Pierre-Simon Laplace s classical theory of probability: Definition of terms: An experiment is a procedure that yields

More information

STAT Chapter 3: Probability

STAT Chapter 3: Probability Basic Definitions STAT 515 --- Chapter 3: Probability Experiment: A process which leads to a single outcome (called a sample point) that cannot be predicted with certainty. Sample Space (of an experiment):

More information

STA Module 4 Probability Concepts. Rev.F08 1

STA Module 4 Probability Concepts. Rev.F08 1 STA 2023 Module 4 Probability Concepts Rev.F08 1 Learning Objectives Upon completing this module, you should be able to: 1. Compute probabilities for experiments having equally likely outcomes. 2. Interpret

More information

Math , Fall 2012: HW 5 Solutions

Math , Fall 2012: HW 5 Solutions Math 230.0, Fall 202: HW 5 Solutions Due Thursday, October 4th, 202. Problem (p.58 #2). Let X and Y be the numbers obtained in two draws at random from a box containing four tickets labeled, 2, 3, 4. Display

More information

Discrete Structures for Computer Science

Discrete Structures for Computer Science Discrete Structures for Computer Science William Garrison bill@cs.pitt.edu 6311 Sennott Square Lecture #24: Probability Theory Based on materials developed by Dr. Adam Lee Not all events are equally likely

More information

STAT 516: Basic Probability and its Applications

STAT 516: Basic Probability and its Applications Lecture 3: Conditional Probability and Independence Prof. Michael September 29, 2015 Motivating Example Experiment ξ consists of rolling a fair die twice; A = { the first roll is 6 } amd B = { the sum

More information

Conditional Probability and Bayes

Conditional Probability and Bayes Conditional Probability and Bayes Chapter 2 Lecture 5 Yiren Ding Shanghai Qibao Dwight High School March 9, 2016 Yiren Ding Conditional Probability and Bayes 1 / 13 Outline 1 Independent Events Definition

More information

Conditional probability

Conditional probability CHAPTER 4 Conditional probability 4.1. Introduction Suppose there are 200 men, of which 100 are smokers, and 100 women, of which 20 are smokers. What is the probability that a person chosen at random will

More information

4. Conditional Probability

4. Conditional Probability 1 of 13 7/15/2009 9:25 PM Virtual Laboratories > 2. Probability Spaces > 1 2 3 4 5 6 7 4. Conditional Probability Definitions and Interpretations The Basic Definition As usual, we start with a random experiment

More information

Lecture Lecture 5

Lecture Lecture 5 Lecture 4 --- Lecture 5 A. Basic Concepts (4.1-4.2) 1. Experiment: A process of observing a phenomenon that has variation in its outcome. Examples: (E1). Rolling a die, (E2). Drawing a card form a shuffled

More information

LECTURE 1. 1 Introduction. 1.1 Sample spaces and events

LECTURE 1. 1 Introduction. 1.1 Sample spaces and events LECTURE 1 1 Introduction The first part of our adventure is a highly selective review of probability theory, focusing especially on things that are most useful in statistics. 1.1 Sample spaces and events

More information

If S = {O 1, O 2,, O n }, where O i is the i th elementary outcome, and p i is the probability of the i th elementary outcome, then

If S = {O 1, O 2,, O n }, where O i is the i th elementary outcome, and p i is the probability of the i th elementary outcome, then 1.1 Probabilities Def n: A random experiment is a process that, when performed, results in one and only one of many observations (or outcomes). The sample space S is the set of all elementary outcomes

More information

Stat 225 Week 2, 8/27/12-8/31/12, Notes: Independence and Bayes Rule

Stat 225 Week 2, 8/27/12-8/31/12, Notes: Independence and Bayes Rule Stat 225 Week 2, 8/27/12-8/31/12, Notes: Independence and Bayes Rule The Fall 2012 Stat 225 T.A.s September 7, 2012 1 Monday, 8/27/12, Notes on Independence In general, a conditional probability will change

More information

Conditional Probability & Independence. Conditional Probabilities

Conditional Probability & Independence. Conditional Probabilities Conditional Probability & Independence Conditional Probabilities Question: How should we modify P(E) if we learn that event F has occurred? Definition: the conditional probability of E given F is P(E F

More information

Topic 3: Introduction to Probability

Topic 3: Introduction to Probability Topic 3: Introduction to Probability 1 Contents 1. Introduction 2. Simple Definitions 3. Types of Probability 4. Theorems of Probability 5. Probabilities under conditions of statistically independent events

More information

Topic 2 Probability. Basic probability Conditional probability and independence Bayes rule Basic reliability

Topic 2 Probability. Basic probability Conditional probability and independence Bayes rule Basic reliability Topic 2 Probability Basic probability Conditional probability and independence Bayes rule Basic reliability Random process: a process whose outcome can not be predicted with certainty Examples: rolling

More information

Denker FALL Probability- Assignment 6

Denker FALL Probability- Assignment 6 Denker FALL 2010 418 Probability- Assignment 6 Due Date: Thursday, Oct. 7, 2010 Write the final answer to the problems on this assignment attach the worked out solutions! Problem 1: A box contains n +

More information

14 - PROBABILITY Page 1 ( Answers at the end of all questions )

14 - PROBABILITY Page 1 ( Answers at the end of all questions ) - PROBABILITY Page ( ) Three houses are available in a locality. Three persons apply for the houses. Each applies for one house without consulting others. The probability that all the three apply for the

More information

Properties of Probability

Properties of Probability Econ 325 Notes on Probability 1 By Hiro Kasahara Properties of Probability In statistics, we consider random experiments, experiments for which the outcome is random, i.e., cannot be predicted with certainty.

More information

Conditional Probability 2 Solutions COR1-GB.1305 Statistics and Data Analysis

Conditional Probability 2 Solutions COR1-GB.1305 Statistics and Data Analysis Conditional Probability 2 Solutions COR-GB.305 Statistics and Data Analysis The Birthday Problem. A class has 50 students. What is the probability that at least two students have the same birthday? Assume

More information

Probability, Random Processes and Inference

Probability, Random Processes and Inference INSTITUTO POLITÉCNICO NACIONAL CENTRO DE INVESTIGACION EN COMPUTACION Laboratorio de Ciberseguridad Probability, Random Processes and Inference Dr. Ponciano Jorge Escamilla Ambrosio pescamilla@cic.ipn.mx

More information

MAT2377. Ali Karimnezhad. Version September 9, Ali Karimnezhad

MAT2377. Ali Karimnezhad. Version September 9, Ali Karimnezhad MAT2377 Ali Karimnezhad Version September 9, 2015 Ali Karimnezhad Comments These slides cover material from Chapter 1. In class, I may use a blackboard. I recommend reading these slides before you come

More information

1. Discrete Distributions

1. Discrete Distributions Virtual Laboratories > 2. Distributions > 1 2 3 4 5 6 7 8 1. Discrete Distributions Basic Theory As usual, we start with a random experiment with probability measure P on an underlying sample space Ω.

More information

Econ 113. Lecture Module 2

Econ 113. Lecture Module 2 Econ 113 Lecture Module 2 Contents 1. Experiments and definitions 2. Events and probabilities 3. Assigning probabilities 4. Probability of complements 5. Conditional probability 6. Statistical independence

More information

Announcements. Topics: To Do:

Announcements. Topics: To Do: Announcements Topics: In the Probability and Statistics module: - Sections 1 + 2: Introduction to Stochastic Models - Section 3: Basics of Probability Theory - Section 4: Conditional Probability; Law of

More information

Previous Exam Questions, Chapter 2

Previous Exam Questions, Chapter 2 ECE 302: Probabilistic Methods in Electrical and Computer Engineering Instructor: Prof. A. R. Reibman Previous Exam Questions, Chapter 2 Reibman (compiled September 2018) These form a collection of 36

More information

Independence. P(A) = P(B) = 3 6 = 1 2, and P(C) = 4 6 = 2 3.

Independence. P(A) = P(B) = 3 6 = 1 2, and P(C) = 4 6 = 2 3. Example: A fair die is tossed and we want to guess the outcome. The outcomes will be 1, 2, 3, 4, 5, 6 with equal probability 1 6 each. If we are interested in getting the following results: A = {1, 3,

More information

Section F Ratio and proportion

Section F Ratio and proportion Section F Ratio and proportion Ratio is a way of comparing two or more groups. For example, if something is split in a ratio 3 : 5 there are three parts of the first thing to every five parts of the second

More information

7.1 What is it and why should we care?

7.1 What is it and why should we care? Chapter 7 Probability In this section, we go over some simple concepts from probability theory. We integrate these with ideas from formal language theory in the next chapter. 7.1 What is it and why should

More information

Review Basic Probability Concept

Review Basic Probability Concept Economic Risk and Decision Analysis for Oil and Gas Industry CE81.9008 School of Engineering and Technology Asian Institute of Technology January Semester Presented by Dr. Thitisak Boonpramote Department

More information

Conditional Probability

Conditional Probability Conditional Probability Terminology: The probability of an event occurring, given that another event has already occurred. P A B = ( ) () P A B : The probability of A given B. Consider the following table:

More information

Business Statistics MBA Pokhara University

Business Statistics MBA Pokhara University Business Statistics MBA Pokhara University Chapter 3 Basic Probability Concept and Application Bijay Lal Pradhan, Ph.D. Review I. What s in last lecture? Descriptive Statistics Numerical Measures. Chapter

More information

Lecture 8: Probability

Lecture 8: Probability Lecture 8: Probability The idea of probability is well-known The flipping of a balanced coin can produce one of two outcomes: T (tail) and H (head) and the symmetry between the two outcomes means, of course,

More information

Introduction to Probability Theory

Introduction to Probability Theory Introduction to Probability Theory 1 1.1. Introduction Any realistic model of a real-world phenomenon must take into account the possibility of randomness. That is, more often than not, the quantities

More information

1 Combinatorial Analysis

1 Combinatorial Analysis ECE316 Notes-Winter 217: A. K. Khandani 1 1 Combinatorial Analysis 1.1 Introduction This chapter deals with finding effective methods for counting the number of ways that things can occur. In fact, many

More information

Chance, too, which seems to rush along with slack reins, is bridled and governed by law (Boethius, ).

Chance, too, which seems to rush along with slack reins, is bridled and governed by law (Boethius, ). Chapter 2 Probability Chance, too, which seems to rush along with slack reins, is bridled and governed by law (Boethius, 480-524). Blaise Pascal (1623-1662) Pierre de Fermat (1601-1665) Abraham de Moivre

More information

STAT 516 Answers Homework 2 January 23, 2008 Solutions by Mark Daniel Ward PROBLEMS. = {(a 1, a 2,...) : a i < 6 for all i}

STAT 516 Answers Homework 2 January 23, 2008 Solutions by Mark Daniel Ward PROBLEMS. = {(a 1, a 2,...) : a i < 6 for all i} STAT 56 Answers Homework 2 January 23, 2008 Solutions by Mark Daniel Ward PROBLEMS 2. We note that E n consists of rolls that end in 6, namely, experiments of the form (a, a 2,...,a n, 6 for n and a i

More information

Conditional Probability & Independence. Conditional Probabilities

Conditional Probability & Independence. Conditional Probabilities Conditional Probability & Independence Conditional Probabilities Question: How should we modify P(E) if we learn that event F has occurred? Definition: the conditional probability of E given F is P(E F

More information

Compound Events. The event E = E c (the complement of E) is the event consisting of those outcomes which are not in E.

Compound Events. The event E = E c (the complement of E) is the event consisting of those outcomes which are not in E. Compound Events Because we are using the framework of set theory to analyze probability, we can use unions, intersections and complements to break complex events into compositions of events for which it

More information

MATH/STATS 425 : Introduction to Probability. Boaz Slomka

MATH/STATS 425 : Introduction to Probability. Boaz Slomka MATH/STATS 425 : Introduction to Probability Boaz Slomka These notes are not proofread, and may contain typos and errors. Last update: April 10, 2018. LECTURE 1 Counting (1.2 Basic multiplication principle:

More information