Module 2: Conditional Probability


Module 2: Conditional Probability
Ruben Zamar, Department of Statistics, UBC
January 16, 2017

MOTIVATION

The outcome could be any element of the sample space, Ω. Sometimes the range of possibilities is restricted because of partial information.

Examples:

Number of shots. Partial info: we know it wasn't an ace.
ELEC 321 final grade. Partial info: we know it is at least a B.

CONDITIONING EVENT

The event B representing the partial information is called the conditioning event. Denote by A the event of interest.

Example (Number of Shots):
B = {2, 3, ...} = {not an ace}     (conditioning event)
A = {1, 3, 5, ...} = {server wins} (event of interest)

Example (Final Grade):
B = [70, 100] = {at least a B}     (conditioning event)
A = [80, 100] = {an A}             (event of interest)

DEFINITION OF CONDITIONAL PROBABILITY

Suppose that P(B) > 0. Then

P(A | B) = P(A ∩ B) / P(B)

The left-hand side is read as "the probability of A given B".

Useful formulas:

P(A ∩ B) = P(B) P(A | B) = P(A) P(B | A)
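The definition can be checked numerically on a small equally likely sample space; the die-roll events below are illustrative choices, not from the slides:

```python
from fractions import Fraction

# One roll of a fair die: an equally likely sample space
omega = {1, 2, 3, 4, 5, 6}
A = {2, 4, 6}   # event of interest: "roll is even"
B = {4, 5, 6}   # conditioning event: "roll is at least 4"

def prob(event):
    """P(E) under the equally likely model."""
    return Fraction(len(event & omega), len(omega))

# Definition: P(A | B) = P(A and B) / P(B), valid since P(B) > 0
p_A_given_B = prob(A & B) / prob(B)
print(p_A_given_B)   # 2/3: of {4, 5, 6}, the even outcomes are {4, 6}
```

The useful formula P(A ∩ B) = P(B) P(A | B) follows immediately by rearranging the definition.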

CONDITIONAL PROBABILITY

P(A | B), as a function of A (and for B fixed), satisfies all the probability axioms:

- P(Ω | B) = P(Ω ∩ B) / P(B) = P(B) / P(B) = 1

- P(A | B) ≥ 0

- If {A_i} are disjoint, then

  P(∪ A_i | B) = P[(∪ A_i) ∩ B] / P(B) = P[∪ (A_i ∩ B)] / P(B) = Σ P(A_i ∩ B) / P(B) = Σ P(A_i | B)

EXAMPLE: NUMBER OF SHOTS

For simplicity, suppose that points are decided in at most 8 shots, with probabilities:

Shots   Prob

Using the table above:

P(Server wins | Not an ace) = P({3, 5, 7}) / P({2, 3, 4, 5, 6, 7, 8})

EXAMPLE: FINAL GRADE

Suppose that

P(Grade is larger than x) = (100 − x) / 100 = 1 − x/100

Using the formula above:

P(an A | at least a B) = P([80, 100]) / P([70, 100]) = 0.20 / 0.30 = 2/3 ≈ 0.667
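The same calculation can be verified in a few lines of Python (the grade model is exactly the one on the slide):

```python
from fractions import Fraction

def p_at_least(x):
    """P(Grade >= x) under the model P(Grade > x) = 1 - x/100 on [0, 100]."""
    return Fraction(100 - x, 100)

# P(an A | at least a B) = P([80, 100]) / P([70, 100])
p = p_at_least(80) / p_at_least(70)
print(p)   # 2/3
```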

SCREENING TESTS

Items are submitted to a screening test before shipment. The screening test can result in either:

POSITIVE (indicating that the item may have a defect)
NEGATIVE (indicating that the item doesn't have a defect)

Screening tests face two types of errors:

FALSE POSITIVE
FALSE NEGATIVE

SCREENING TESTS (continued)

For each item we have 4 possible events.

Item true status:
D = {item is defective}
D^c = {item is not defective}

Test result:
B = {test is positive}
B^c = {test is negative}

SCREENING TESTS (continued)

The following conditional probabilities are normally known.

Sensitivity of the test: P(B | D) = 0.95 (say)
Specificity of the test: P(B^c | D^c) = 0.99 (say)

which implies

P(B^c | D) = 0.05 and P(B | D^c) = 0.01

The proportion of defective items is also normally known:

P(D) = 0.02 (say)

TEST PERFORMANCE

The following questions may be of interest:

What is the probability that a randomly chosen item tests positive?
What is the probability of defective given that the test resulted negative?
What is the probability of defective given that the test resulted positive?
What is the probability of a screening error?

We will compute these probabilities.

PROBABILITY OF TESTING POSITIVE

P(B) = P(B ∩ D) + P(B ∩ D^c)
     = P(D) P(B | D) + P(D^c) P(B | D^c)
     = 0.02 × 0.95 + (1 − 0.02) × 0.01
     = 0.0288

PROB OF DEFECTIVE GIVEN A POSITIVE TEST

P(D | B) = P(D ∩ B) / P(B)
         = P(D) P(B | D) / P(B)
         = 0.02 × 0.95 / 0.0288
         ≈ 0.6597

PROB OF DEFECTIVE GIVEN A NEGATIVE TEST

P(D | B^c) = P(D ∩ B^c) / P(B^c)
           = P(D) P(B^c | D) / (1 − P(B))
           = 0.02 × 0.05 / 0.9712
           ≈ 0.0010

SCREENING ERROR

P(Error) = P(D ∩ B^c) + P(D^c ∩ B)
         = P(D) P(B^c | D) + P(D^c) P(B | D^c)
         = 0.02 × (1 − 0.95) + (1 − 0.02) × 0.01
         = 0.0108
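The four screening quantities above follow directly from the stated sensitivity, specificity, and defect rate; a short script reproduces them:

```python
# Screening-test calculations with the slide's numbers
sens = 0.95   # sensitivity P(B | D)
spec = 0.99   # specificity P(B^c | D^c)
p_d = 0.02    # prevalence P(D)

# Total probability: P(B) = P(D) P(B|D) + P(D^c) P(B|D^c)
p_pos = p_d * sens + (1 - p_d) * (1 - spec)

# Bayes' rule for the two posteriors
p_def_given_pos = p_d * sens / p_pos
p_def_given_neg = p_d * (1 - sens) / (1 - p_pos)

# Screening error: false negative or false positive
p_error = p_d * (1 - sens) + (1 - p_d) * (1 - spec)

print(round(p_pos, 4))            # 0.0288
print(round(p_def_given_pos, 4))  # 0.6597
print(round(p_def_given_neg, 4))  # 0.001
print(round(p_error, 4))          # 0.0108
```

Note how a positive test only raises the probability of a defect to about 0.66, because defects are rare to begin with.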

BAYES FORMULA

The formula

P(D | B) = P(D ∩ B) / P(B) = P(B | D) P(D) / [P(B | D) P(D) + P(B | D^c) P(D^c)]

is the simple form of Bayes' formula. This was used in the screening example presented before.

BAYES FORMULA (continued)

The general form of Bayes' formula is given by

P(D_i | B) = P(D_i ∩ B) / P(B) = P(B | D_i) P(D_i) / Σ_{j=1}^{k} P(B | D_j) P(D_j)

where D_1, D_2, ..., D_k is a partition of the sample space Ω:

Ω = D_1 ∪ D_2 ∪ ... ∪ D_k
D_i ∩ D_j = ∅ for i ≠ j
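The general formula is easy to code once the partition's priors and likelihoods are listed; a minimal sketch, using the two-event partition {D, D^c} from the earlier screening example as a check:

```python
def bayes(priors, likelihoods):
    """General Bayes formula over a partition D_1, ..., D_k.

    priors[j]      = P(D_j)       (must sum to 1)
    likelihoods[j] = P(B | D_j)
    Returns the posteriors P(D_j | B).
    """
    # Total probability: P(B) = sum over j of P(B | D_j) P(D_j)
    p_b = sum(p * l for p, l in zip(priors, likelihoods))
    return [p * l / p_b for p, l in zip(priors, likelihoods)]

posteriors = bayes([0.02, 0.98], [0.95, 0.01])
print(round(posteriors[0], 4))   # P(D | B) = 0.6597
```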

EXAMPLE: THREE PRISONERS

Prisoners A, B and C are to be executed. The governor has selected one of them at random to be pardoned. The warden knows who is pardoned, but is not allowed to tell.

Prisoner A begs the warden to let him know which one of the other two prisoners is not pardoned.

Prisoner A tells the warden: "Since I already know that one of the other two prisoners is not pardoned, you could just tell me who that is."

Prisoner A adds: "If B is pardoned, you could give me C's name. If C is pardoned, you could give me B's name. And if I'm pardoned, you could flip a coin to decide whether to name B or C."

The warden is convinced by prisoner A's arguments and tells him: "B is not pardoned."

Result: given the information provided by the warden, C is now twice as likely to be pardoned as A! Why? Check the derivations below:

NOTATION:

A = {A is pardoned}
B = {B is pardoned}
C = {C is pardoned}
b = {the warden says "B is not pardoned"}

Clearly

P(A) = P(B) = P(C) = 1/3
P(b | B) = 0     (warden never lies)
P(b | A) = 1/2   (warden flips a coin)
P(b | C) = 1     (warden cannot name A)

By Bayes' formula:

P(A | b) = P(b | A) P(A) / [P(b | A) P(A) + P(b | B) P(B) + P(b | C) P(C)]
         = (1/2)(1/3) / [(1/2)(1/3) + 0 × (1/3) + 1 × (1/3)]
         = (1/6) / (1/2)
         = 1/3

Hence

P(C | b) = 1 − P(A | b) = 1 − 1/3 = 2/3
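The derivation can be double-checked with exact arithmetic:

```python
from fractions import Fraction

# Priors and the warden's reporting probabilities from the slide
third = Fraction(1, 3)
priors = {"A": third, "B": third, "C": third}
p_b_given = {"A": Fraction(1, 2),   # warden flips a coin
             "B": Fraction(0),      # warden never lies
             "C": Fraction(1)}      # warden cannot name A

p_b = sum(priors[x] * p_b_given[x] for x in priors)            # P(b) = 1/2
posterior = {x: priors[x] * p_b_given[x] / p_b for x in priors}
print(posterior["A"], posterior["C"])   # 1/3 2/3
```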

SCREENING EXAMPLE II

The tested items have two components: c_1 and c_2. Suppose

D_1 = {only component c_1 is defective},    P(D_1) = 0.01
D_2 = {only component c_2 is defective},    P(D_2) =
D_3 = {both components are defective},      P(D_3) =
D_4 = {both components are non-defective},  P(D_4) = 0.98

SCREENING EXAMPLE II (continued)

Let

B = {screening test is positive}

Suppose

P(B | D_1) = 0.95
P(B | D_2) = 0.96
P(B | D_3) = 0.99
P(B | D_4) = 0.01

SOME QUESTIONS OF INTEREST

The following questions may be of interest:

What is the probability of testing positive?
What is the probability that component c_i (i = 1, 2) is defective when the test resulted positive?
What is the probability that the item is defective when the test resulted negative?
What is the probability both components are defective when the test resulted positive?
What is the probability of a testing error?

We will compute these probabilities.

PROB OF TESTING POSITIVE

P(B) = P(B ∩ D_1) + P(B ∩ D_2) + P(B ∩ D_3) + P(B ∩ D_4)
     = P(D_1) P(B | D_1) + P(D_2) P(B | D_2) + P(D_3) P(B | D_3) + P(D_4) P(B | D_4)

Notice that the probability of defective is

P(D) = P(D_1) + P(D_2) + P(D_3) = 0.02

POSITIVE TEST

By Bayes' formula, for i = 1, 2, 3, 4:

P(D_i | B) = P(D_i) P(B | D_i) / P(B)

PROB OF TESTING NEGATIVE

P(B^c) = P(B^c ∩ D_1) + P(B^c ∩ D_2) + P(B^c ∩ D_3) + P(B^c ∩ D_4)
       = P(D_1) P(B^c | D_1) + P(D_2) P(B^c | D_2) + P(D_3) P(B^c | D_3) + P(D_4) P(B^c | D_4)
       = 1 − P(B)

A NEGATIVE TEST

By Bayes' formula, for i = 1, 2, 3, 4:

P(D_i | B^c) = P(D_i) P(B^c | D_i) / P(B^c)

and

P(defective | B^c) = P(D_1 | B^c) + P(D_2 | B^c) + P(D_3 | B^c)
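The numerical answers for this example are not recoverable above, and the individual values of P(D_2) and P(D_3) are not shown; the script below therefore ASSUMES the split P(D_2) = P(D_3) = 0.005 (consistent with the stated total P(defective) = 0.02) purely for illustration:

```python
# Screening Example II, end to end. P(D_1) = 0.01 and P(D_4) = 0.98 are from
# the slides; P(D_2) = P(D_3) = 0.005 is an assumed split for illustration.
priors = [0.01, 0.005, 0.005, 0.98]
lik_pos = [0.95, 0.96, 0.99, 0.01]   # P(B | D_i)

p_pos = sum(p * l for p, l in zip(priors, lik_pos))            # P(B)
post_pos = [p * l / p_pos for p, l in zip(priors, lik_pos)]    # P(D_i | B)

p_neg = 1 - p_pos                                              # P(B^c)
post_neg = [p * (1 - l) / p_neg for p, l in zip(priors, lik_pos)]

# Item defective given a negative test: sum over D_1, D_2, D_3
p_def_given_neg = sum(post_neg[:3])
print(round(p_pos, 5), round(p_def_given_neg, 5))
```

Under these assumed priors a negative test drives the probability of a defect down to well under one in a thousand.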

INDEPENDENCE

DEFINITION: Events A and B are independent if

P(A ∩ B) = P(A) P(B)

If A and B are independent then

P(A | B) = P(A ∩ B) / P(B) = P(A) P(B) / P(B) = P(A)

and

P(B | A) = P(A ∩ B) / P(A) = P(A) P(B) / P(A) = P(B)

DISCUSSION

If P(A) = 1, then A is independent of every B:

P(B) = P(A ∩ B) + P(A^c ∩ B) = P(A ∩ B)     (since P(A^c ∩ B) = 0)

so

P(A ∩ B) = P(B) = P(A) P(B)     (since P(A) = 1)

DISCUSSION (continued)

Suppose that A and B are non-trivial events (0 < P(A) < 1 and 0 < P(B) < 1).

If A and B are mutually exclusive (A ∩ B = ∅), then they cannot be independent because

P(A ∩ B) = 0 < P(A) P(B)

If A ⊂ B, then they cannot be independent because

P(A | B) = P(A ∩ B) / P(B) = P(A) / P(B) > P(A)

DISCUSSION (continued)

Suppose Ω = {1, 2, 3, 4, 5, 6, 7, 8, 9, 10} and the numbers are equally likely. Let

A = {1, 2, 3, 4, 5} and B = {2, 4, 6, 8}

Then

P(A ∩ B) = P({2, 4}) = 0.20 and P(A) P(B) = 0.5 × 0.4 = 0.20

Hence, A and B are independent. In terms of probabilities, A is half of Ω; on the other hand, A ∩ B is half of B.

DISCUSSION (continued)

What happens if P(i) = i/55?

P(A ∩ B) = P({2, 4}) = 6/55 ≈ 0.109
P(A) P(B) = (15/55) × (20/55) ≈ 0.099

Hence, A and B are not independent in this case.
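Both cases can be checked exactly:

```python
from fractions import Fraction

def is_independent(pmf, A, B):
    """Check the definition P(A and B) == P(A) P(B) for a pmf on Omega."""
    p = lambda E: sum(pmf[w] for w in E)
    return p(A & B) == p(A) * p(B)

omega = range(1, 11)
A, B = {1, 2, 3, 4, 5}, {2, 4, 6, 8}

uniform = {i: Fraction(1, 10) for i in omega}    # equally likely
weighted = {i: Fraction(i, 55) for i in omega}   # P(i) = i/55

print(is_independent(uniform, A, B))    # True
print(is_independent(weighted, A, B))   # False
```

Independence is a property of the probability measure, not just of the events: the same pair A, B is independent under one assignment of probabilities and dependent under another.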

MORE THAN TWO EVENTS

Definition: We say that the events A_1, A_2, ..., A_n are independent if

P(A_{i_1} ∩ A_{i_2} ∩ ... ∩ A_{i_k}) = P(A_{i_1}) P(A_{i_2}) ... P(A_{i_k})

for all 1 ≤ i_1 < i_2 < ... < i_k ≤ n, and all 1 ≤ k ≤ n.

For example, if n = 3, then

P(A_1 ∩ A_2) = P(A_1) P(A_2)
P(A_1 ∩ A_3) = P(A_1) P(A_3)
P(A_2 ∩ A_3) = P(A_2) P(A_3)
P(A_1 ∩ A_2 ∩ A_3) = P(A_1) P(A_2) P(A_3)

SYSTEM OF INDEPENDENT COMPONENTS

In series: components a, b, c connected one after another (the system works only if all of them work).

In parallel: components a, b, c connected side by side (the system works if at least one of them works).

NOTATION

A = {component a works}
B = {component b works}
C = {component c works}

INDEPENDENT COMPONENTS

We assume that A, B and C are independent, that is:

P(A ∩ B ∩ C) = P(A) P(B) P(C)
P(A ∩ B) = P(A) P(B), P(B ∩ C) = P(B) P(C), P(A ∩ C) = P(A) P(C)

RELIABILITY CALCULATION

Problem 1: Given P(A) = P(B) = P(C), calculate the reliability of the series system a - b - c.

Solution:

P(System works) = P(A ∩ B ∩ C) = P(A) P(B) P(C)

PRACTICE

Problem 2: Given P(A) = P(B) = P(C), calculate the reliability of the parallel system with components a, b, c.

PROBLEM 2 (Solution)

P(System works) = 1 − P(System fails)
               = 1 − P(A^c ∩ B^c ∩ C^c)
               = 1 − P(A^c) P(B^c) P(C^c)
               = 1 − (1 − P(A)) (1 − P(B)) (1 − P(C))

PRACTICE

Problem 3: Given P(A) = P(B) = P(C) = P(D), calculate the reliability of the system consisting of subsystem I (components a and b in parallel) in series with subsystem II (components c and d in parallel).

PROBLEM 3 (Solution)

P(System works) = P(subsys I works ∩ subsys II works)
               = P(subsys I works) P(subsys II works)
               = [1 − P(subsys I fails)] [1 − P(subsys II fails)]
               = [1 − P(A^c ∩ B^c)] [1 − P(C^c ∩ D^c)]
               = [1 − P(A^c) P(B^c)] [1 − P(C^c) P(D^c)]
               = [1 − (1 − P(A)) (1 − P(B))] [1 − (1 − P(C)) (1 − P(D))]
               = [1 − (1 − P(A))^2]^2     (since the four component reliabilities are equal)
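A small script covers all three reliability problems; the component reliability p = 0.9 is an assumed value, since the problems' numbers are not shown above:

```python
# Reliability of systems of independent components (Problems 1-3)

def series(*ps):
    """All components must work: multiply the reliabilities."""
    out = 1.0
    for p in ps:
        out *= p
    return out

def parallel(*ps):
    """At least one must work: 1 minus the product of failure probabilities."""
    out = 1.0
    for p in ps:
        out *= (1 - p)
    return 1 - out

p = 0.9   # assumed component reliability
print(series(p, p, p))                          # Problem 1: p^3
print(parallel(p, p, p))                        # Problem 2: 1 - (1-p)^3
print(series(parallel(p, p), parallel(p, p)))   # Problem 3: [1 - (1-p)^2]^2
```

Composing `series` and `parallel` this way mirrors the slide's decomposition into subsystems I and II.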

CONDITIONAL INDEPENDENCE

Definition: We say that the events T_1, T_2, ..., T_n are conditionally independent given the event B if

P(T_{i_1} ∩ T_{i_2} ∩ ... ∩ T_{i_k} | B) = P(T_{i_1} | B) P(T_{i_2} | B) ... P(T_{i_k} | B)

for all 1 ≤ i_1 < i_2 < ... < i_k ≤ n, and all 1 ≤ k ≤ n.

For example, if n = 3, then

P(T_1 ∩ T_2 | B) = P(T_1 | B) P(T_2 | B)
P(T_1 ∩ T_3 | B) = P(T_1 | B) P(T_3 | B)
P(T_2 ∩ T_3 | B) = P(T_2 | B) P(T_3 | B)
P(T_1 ∩ T_2 ∩ T_3 | B) = P(T_1 | B) P(T_2 | B) P(T_3 | B)

Notes

Conditional independence doesn't imply unconditional independence, and vice versa.
Conditional independence given B doesn't imply conditional independence given B^c.
However, both conditional independences are usually assumed together in applications.
We will apply this concept in Bayesian probability updating.

Sequential Bayes Formula

Let S_i be the outcome of the i-th test. For instance:

S_1 = {the 1st test is positive}
S_2 = {the 2nd test is negative}
S_3 = {the 3rd test is negative}

and so on.

The outcomes S_i (i = 1, 2, ..., n) become available in a sequential fashion. Let

I_k = S_1 ∩ S_2 ∩ ... ∩ S_k     (data available at step k)

and set

π_0 = P(E)
π_1 = P(E | I_1) = P(E | S_1)
π_2 = P(E | I_2) = P(E | S_1 ∩ S_2)
π_3 = P(E | I_3) = P(E | S_1 ∩ S_2 ∩ S_3)

and so on.

Conditional Independence Assumption

Assume that the S_i (i = 1, 2, ..., n) are independent given E and also given E^c. Then, for k = 1, 2, ..., n:

π_k = P(S_k | E) π_{k−1} / [P(S_k | E) π_{k−1} + P(S_k | E^c) (1 − π_{k−1})]

Proof

π_k = P(I_k | E) π_0 / [P(I_k | E) π_0 + P(I_k | E^c) (1 − π_0)]

    = P(I_{k−1} ∩ S_k | E) π_0 / [P(I_{k−1} ∩ S_k | E) π_0 + P(I_{k−1} ∩ S_k | E^c) (1 − π_0)]

    = P(S_k | E) P(I_{k−1} | E) π_0 / [P(S_k | E) P(I_{k−1} | E) π_0 + P(S_k | E^c) P(I_{k−1} | E^c) (1 − π_0)]

Proof (continued)

Dividing numerator and denominator by P(I_{k−1}):

π_k = P(S_k | E) P(I_{k−1} | E) π_0 / [P(S_k | E) P(I_{k−1} | E) π_0 + P(S_k | E^c) P(I_{k−1} | E^c) (1 − π_0)]

    = [P(S_k | E) P(I_{k−1} | E) π_0 / P(I_{k−1})] / {[P(S_k | E) P(I_{k−1} | E) π_0 + P(S_k | E^c) P(I_{k−1} | E^c) (1 − π_0)] / P(I_{k−1})}

    = P(S_k | E) π_{k−1} / [P(S_k | E) π_{k−1} + P(S_k | E^c) (1 − π_{k−1})]

since π_{k−1} = P(E | I_{k−1}) = P(I_{k−1} | E) π_0 / P(I_{k−1})

Pseudo Code

Input:

(S_1, S_2, S_3, ..., S_n) = (1, 0, 1, ..., 0)   (outcomes for the n tests)
π = P(E)   (prob of the event of interest, for instance E = "the part is defective")
p_k = P(S_k = + | E), k = 1, 2, ..., n     (sensitivity of the k-th test)
q_k = P(S_k = − | E^c), k = 1, 2, ..., n   (specificity of the k-th test)

Output:

π_k = P(E | S_1 ∩ S_2 ∩ ... ∩ S_k), k = 1, 2, ..., n

Pseudo Code

Example of input: n = 4, π = 0.05, Test Results = (1, 1, 0, 1)

k   p_k = P(1 | Defective)   1 − q_k = P(1 | Non-Defective)
1   p_1 =                    q_1 =
2   p_2 =                    q_2 =
3   p_3 =                    q_3 =
4   p_4 =                    q_4 = 0.15

Pseudo Code - Computation

Computation of π_k:

1) Initialization: set π_0 = π.

2) k-th step:
   If S_k = 1, set a = p_k and b = 1 − q_k.
   If S_k = 0, set a = 1 − p_k and b = q_k.

3) Computing π_k:

   π_k = a π_{k−1} / [a π_{k−1} + b (1 − π_{k−1})]
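The pseudocode translates directly into Python; the prior, sensitivities, specificities, and test results below are illustrative values, since the numeric table from the input example is not available:

```python
def sequential_bayes(prior, sens, spec, results):
    """Sequential Bayes update, following the pseudocode.

    prior      : pi = P(E)
    sens[k]    : p_k = P(S_k = 1 | E)
    spec[k]    : q_k = P(S_k = 0 | E^c)
    results[k] : observed S_k (1 = positive, 0 = negative)
    Returns [pi_1, ..., pi_n].
    """
    pis = []
    pi = prior                                    # 1) initialization
    for p_k, q_k, s_k in zip(sens, spec, results):
        if s_k == 1:                              # 2) pick a and b
            a, b = p_k, 1 - q_k
        else:
            a, b = 1 - p_k, q_k
        pi = a * pi / (a * pi + b * (1 - pi))     # 3) update pi_k
        pis.append(pi)
    return pis

# Illustrative run: n = 4 tests, prior 0.05, results (1, 1, 0, 1)
posteriors = sequential_bayes(0.05, [0.9] * 4, [0.85] * 4, [1, 1, 0, 1])
print([round(x, 4) for x in posteriors])
```

Each positive result pushes π up and each negative result pushes it down, exactly as the update formula dictates.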

Spam Detection

When you receive an email, your spam filter uses Bayes' rule to decide whether it is spam or not.

Basic spam filters check whether some pre-specified words appear in the email; e.g.

{diplomat, lottery, money, inheritance, president, sincerely, huge, ...}

We consider n events W_i telling us whether the i-th pre-specified word is in the message.

Let

E = {email is spam}
W_i = {word i is in the message}, i = 1, 2, ..., n

Assume that W_1, W_2, ..., W_n are conditionally independent given E and also given E^c.

Human examination of a large number of messages is used to estimate

π_0 = P(E)

The training data is also used to estimate

p_i = P(W_i | E) and 1 − q_i = P(W_i | E^c)

Let I_n = S_1 ∩ S_2 ∩ ... ∩ S_n, where S_i is either W_i or W_i^c. The spam filter assumes that the W_i are conditionally independent (given E and given E^c) to compute

P(E | I_n) = P(I_n | E) P(E) / [P(I_n | E) P(E) + P(I_n | E^c) P(E^c)]

Sequential Updating

The posterior probabilities π_k = P(E | I_k) (k = 1, 2, ..., n) can be computed sequentially using the formula

π_k = P(E | I_k)
    = P(S_k | E) P(E | I_{k−1}) / [P(S_k | E) P(E | I_{k−1}) + P(S_k | E^c) P(E^c | I_{k−1})]
    = P(S_k | E) π_{k−1} / [P(S_k | E) π_{k−1} + P(S_k | E^c) (1 − π_{k−1})]

An early decision to classify the email as spam can be made if P(E | I_k) becomes too large (or too small).
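As a concrete sketch (with made-up training estimates, since the numeric table for the example is not available), the sequential spam update might look like:

```python
def update(pi, in_message, p_i, one_minus_q_i):
    """One sequential update step for word i.

    pi            : current posterior P(E | I_{i-1})
    in_message    : True if word i appears (S_i = W_i), else S_i = W_i^c
    p_i           : estimated P(W_i | E)
    one_minus_q_i : estimated P(W_i | E^c)
    """
    if in_message:
        a, b = p_i, one_minus_q_i
    else:
        a, b = 1 - p_i, 1 - one_minus_q_i
    return a * pi / (a * pi + b * (1 - pi))

# Prior P(spam) = 0.10 as on the slide; the per-word estimates are ASSUMED
pi = 0.10
observations = [(True, 0.50, 0.05),    # word 1 appears in the message
                (False, 0.40, 0.10),   # word 2 is absent
                (True, 0.30, 0.02)]    # word 3 appears in the message
for in_msg, p_i, one_minus_q_i in observations:
    pi = update(pi, in_msg, p_i, one_minus_q_i)
print(round(pi, 4))   # posterior P(spam | observed words)
```

If π crosses a chosen threshold partway through the word list, the filter can stop early and classify the message.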

Numerical Example

For a simple numerical example, consider a case with n = 8 words, P(Spam) = 0.10, and conditional probabilities:

Word   P(Word | Spam)   P(Word | No Spam)
W_1
W_2
W_3
W_4
W_5
W_6
W_7
W_8

Word   Word Status   P(Spam | I_k)
W_1
W_2
W_3
W_4
W_5
W_6
W_7
W_8


Probability COMP 245 STATISTICS. Dr N A Heard. 1 Sample Spaces and Events Sample Spaces Events Combinations of Events... Probability COMP 245 STATISTICS Dr N A Heard Contents Sample Spaces and Events. Sample Spaces........................................2 Events........................................... 2.3 Combinations

More information

Vibhav Gogate University of Texas, Dallas

Vibhav Gogate University of Texas, Dallas Review of Probability and Statistics 101 Elements of Probability Theory Events, Sample Space and Random Variables Axioms of Probability Independent Events Conditional Probability Bayes Theorem Joint Probability

More information

Sequential Decisions

Sequential Decisions Sequential Decisions A Basic Theorem of (Bayesian) Expected Utility Theory: If you can postpone a terminal decision in order to observe, cost free, an experiment whose outcome might change your terminal

More information

Introduction to Machine Learning CMU-10701

Introduction to Machine Learning CMU-10701 Introduction to Machine Learning CMU-10701 2. MLE, MAP, Bayes classification Barnabás Póczos & Aarti Singh 2014 Spring Administration http://www.cs.cmu.edu/~aarti/class/10701_spring14/index.html Blackboard

More information

Joint, Conditional, & Marginal Probabilities

Joint, Conditional, & Marginal Probabilities Joint, Conditional, & Marginal Probabilities The three axioms for probability don t discuss how to create probabilities for combined events such as P [A B] or for the likelihood of an event A given that

More information

COS 424: Interacting with Data. Lecturer: Dave Blei Lecture #11 Scribe: Andrew Ferguson March 13, 2007

COS 424: Interacting with Data. Lecturer: Dave Blei Lecture #11 Scribe: Andrew Ferguson March 13, 2007 COS 424: Interacting with ata Lecturer: ave Blei Lecture #11 Scribe: Andrew Ferguson March 13, 2007 1 Graphical Models Wrap-up We began the lecture with some final words on graphical models. Choosing a

More information

Discrete Mathematics and Probability Theory Spring 2014 Anant Sahai Note 10

Discrete Mathematics and Probability Theory Spring 2014 Anant Sahai Note 10 EECS 70 Discrete Mathematics and Probability Theory Spring 2014 Anant Sahai Note 10 Introduction to Basic Discrete Probability In the last note we considered the probabilistic experiment where we flipped

More information

Topic -2. Probability. Larson & Farber, Elementary Statistics: Picturing the World, 3e 1

Topic -2. Probability. Larson & Farber, Elementary Statistics: Picturing the World, 3e 1 Topic -2 Probability Larson & Farber, Elementary Statistics: Picturing the World, 3e 1 Probability Experiments Experiment : An experiment is an act that can be repeated under given condition. Rolling a

More information

Probabilistic models

Probabilistic models Kolmogorov (Andrei Nikolaevich, 1903 1987) put forward an axiomatic system for probability theory. Foundations of the Calculus of Probabilities, published in 1933, immediately became the definitive formulation

More information

Conditional Probability & Independence. Conditional Probabilities

Conditional Probability & Independence. Conditional Probabilities Conditional Probability & Independence Conditional Probabilities Question: How should we modify P(E) if we learn that event F has occurred? Definition: the conditional probability of E given F is P(E F

More information

BASICS OF PROBABILITY CHAPTER-1 CS6015-LINEAR ALGEBRA AND RANDOM PROCESSES

BASICS OF PROBABILITY CHAPTER-1 CS6015-LINEAR ALGEBRA AND RANDOM PROCESSES BASICS OF PROBABILITY CHAPTER-1 CS6015-LINEAR ALGEBRA AND RANDOM PROCESSES COMMON TERMS RELATED TO PROBABILITY Probability is the measure of the likelihood that an event will occur Probability values are

More information

Independence, Variance, Bayes Theorem

Independence, Variance, Bayes Theorem Independence, Variance, Bayes Theorem Russell Impagliazzo and Miles Jones Thanks to Janine Tiefenbruck http://cseweb.ucsd.edu/classes/sp16/cse21-bd/ May 16, 2016 Resolving collisions with chaining Hash

More information

Announcements. Topics: To Do:

Announcements. Topics: To Do: Announcements Topics: In the Probability and Statistics module: - Sections 1 + 2: Introduction to Stochastic Models - Section 3: Basics of Probability Theory - Section 4: Conditional Probability; Law of

More information

Recursive Estimation

Recursive Estimation Recursive Estimation Raffaello D Andrea Spring 08 Problem Set : Bayes Theorem and Bayesian Tracking Last updated: March, 08 Notes: Notation: Unless otherwise noted, x, y, and z denote random variables,

More information

STOR 435 Lecture 5. Conditional Probability and Independence - I

STOR 435 Lecture 5. Conditional Probability and Independence - I STOR 435 Lecture 5 Conditional Probability and Independence - I Jan Hannig UNC Chapel Hill 1 / 16 Motivation Basic point Think of probability as the amount of belief we have in a particular outcome. If

More information

Sequential Decisions

Sequential Decisions Sequential Decisions A Basic Theorem of (Bayesian) Expected Utility Theory: If you can postpone a terminal decision in order to observe, cost free, an experiment whose outcome might change your terminal

More information

CMPSCI 240: Reasoning about Uncertainty

CMPSCI 240: Reasoning about Uncertainty CMPSCI 240: Reasoning about Uncertainty Lecture 2: Sets and Events Andrew McGregor University of Massachusetts Last Compiled: January 27, 2017 Outline 1 Recap 2 Experiments and Events 3 Probabilistic Models

More information

Be able to define the following terms and answer basic questions about them:

Be able to define the following terms and answer basic questions about them: CS440/ECE448 Fall 2016 Final Review Be able to define the following terms and answer basic questions about them: Probability o Random variables o Axioms of probability o Joint, marginal, conditional probability

More information

Lecture 3 - Axioms of Probability

Lecture 3 - Axioms of Probability Lecture 3 - Axioms of Probability Sta102 / BME102 January 25, 2016 Colin Rundel Axioms of Probability What does it mean to say that: The probability of flipping a coin and getting heads is 1/2? 3 What

More information

CSC Discrete Math I, Spring Discrete Probability

CSC Discrete Math I, Spring Discrete Probability CSC 125 - Discrete Math I, Spring 2017 Discrete Probability Probability of an Event Pierre-Simon Laplace s classical theory of probability: Definition of terms: An experiment is a procedure that yields

More information

Basics on Probability. Jingrui He 09/11/2007

Basics on Probability. Jingrui He 09/11/2007 Basics on Probability Jingrui He 09/11/2007 Coin Flips You flip a coin Head with probability 0.5 You flip 100 coins How many heads would you expect Coin Flips cont. You flip a coin Head with probability

More information

COMP 551 Applied Machine Learning Lecture 19: Bayesian Inference

COMP 551 Applied Machine Learning Lecture 19: Bayesian Inference COMP 551 Applied Machine Learning Lecture 19: Bayesian Inference Associate Instructor: (herke.vanhoof@mcgill.ca) Class web page: www.cs.mcgill.ca/~jpineau/comp551 Unless otherwise noted, all material posted

More information

Naïve Bayes. Vibhav Gogate The University of Texas at Dallas

Naïve Bayes. Vibhav Gogate The University of Texas at Dallas Naïve Bayes Vibhav Gogate The University of Texas at Dallas Supervised Learning of Classifiers Find f Given: Training set {(x i, y i ) i = 1 n} Find: A good approximation to f : X Y Examples: what are

More information

Randomized Algorithms

Randomized Algorithms Randomized Algorithms Prof. Tapio Elomaa tapio.elomaa@tut.fi Course Basics A new 4 credit unit course Part of Theoretical Computer Science courses at the Department of Mathematics There will be 4 hours

More information

Today. Statistical Learning. Coin Flip. Coin Flip. Experiment 1: Heads. Experiment 1: Heads. Which coin will I use? Which coin will I use?

Today. Statistical Learning. Coin Flip. Coin Flip. Experiment 1: Heads. Experiment 1: Heads. Which coin will I use? Which coin will I use? Today Statistical Learning Parameter Estimation: Maximum Likelihood (ML) Maximum A Posteriori (MAP) Bayesian Continuous case Learning Parameters for a Bayesian Network Naive Bayes Maximum Likelihood estimates

More information

Chapter 2 Class Notes

Chapter 2 Class Notes Chapter 2 Class Notes Probability can be thought of in many ways, for example as a relative frequency of a long series of trials (e.g. flips of a coin or die) Another approach is to let an expert (such

More information

Mean, Median and Mode. Lecture 3 - Axioms of Probability. Where do they come from? Graphically. We start with a set of 21 numbers, Sta102 / BME102

Mean, Median and Mode. Lecture 3 - Axioms of Probability. Where do they come from? Graphically. We start with a set of 21 numbers, Sta102 / BME102 Mean, Median and Mode Lecture 3 - Axioms of Probability Sta102 / BME102 Colin Rundel September 1, 2014 We start with a set of 21 numbers, ## [1] -2.2-1.6-1.0-0.5-0.4-0.3-0.2 0.1 0.1 0.2 0.4 ## [12] 0.4

More information

STAT200 Elementary Statistics for applications

STAT200 Elementary Statistics for applications STAT200 Elementary Statistics for applications Lecture # 12 Dr. Ruben Zamar Winter 2011 / 2012 http://www4.agr.gc.ca/aafc-aac/display-afficher.do?id=1256763623482 Randomness Randomness is unpredictable

More information

2.4. Conditional Probability

2.4. Conditional Probability 2.4. Conditional Probability Objectives. Definition of conditional probability and multiplication rule Total probability Bayes Theorem Example 2.4.1. (#46 p.80 textbook) Suppose an individual is randomly

More information

Lecture 1 Introduction to Probability and Set Theory Text: A Course in Probability by Weiss

Lecture 1 Introduction to Probability and Set Theory Text: A Course in Probability by Weiss Lecture 1 to and Set Theory Text: A Course in by Weiss 1.2 2.3 STAT 225 to Models January 13, 2014 to and Whitney Huang Purdue University 1.1 Agenda to and 1 2 3 1.2 Motivation Uncertainty/Randomness in

More information

Probabilistic Systems Analysis Spring 2018 Lecture 6. Random Variables: Probability Mass Function and Expectation

Probabilistic Systems Analysis Spring 2018 Lecture 6. Random Variables: Probability Mass Function and Expectation EE 178 Probabilistic Systems Analysis Spring 2018 Lecture 6 Random Variables: Probability Mass Function and Expectation Probability Mass Function When we introduce the basic probability model in Note 1,

More information

Computer vision: models, learning and inference

Computer vision: models, learning and inference Computer vision: models, learning and inference Chapter 2 Introduction to probability Please send errata to s.prince@cs.ucl.ac.uk Random variables A random variable x denotes a quantity that is uncertain

More information

Chapter 2.5 Random Variables and Probability The Modern View (cont.)

Chapter 2.5 Random Variables and Probability The Modern View (cont.) Chapter 2.5 Random Variables and Probability The Modern View (cont.) I. Statistical Independence A crucially important idea in probability and statistics is the concept of statistical independence. Suppose

More information

STATISTICAL INDEPENDENCE AND AN INVITATION TO THE Art OF CONDITIONING

STATISTICAL INDEPENDENCE AND AN INVITATION TO THE Art OF CONDITIONING STATISTICAL INDEPENDENCE AND AN INVITATION TO THE Art OF CONDITIONING Tutorial 2 STAT1301 Fall 2010 28SEP2010, MB103@HKU By Joseph Dong Look, imagine a remote village where there has been a long drought.

More information

Probability theory basics

Probability theory basics Probability theory basics Michael Franke Basics of probability theory: axiomatic definition, interpretation, joint distributions, marginalization, conditional probability & Bayes rule. Random variables:

More information

Axiomatic Foundations of Probability. Definition: Probability Function

Axiomatic Foundations of Probability. Definition: Probability Function Chapter 1 sections We will SKIP a number of sections Set theory SKIP Real number uncountability Definition of probability Finite sample spaces Counting methods Combinatorial methods SKIP tennis tournament

More information

DIFFERENT APPROACHES TO STATISTICAL INFERENCE: HYPOTHESIS TESTING VERSUS BAYESIAN ANALYSIS

DIFFERENT APPROACHES TO STATISTICAL INFERENCE: HYPOTHESIS TESTING VERSUS BAYESIAN ANALYSIS DIFFERENT APPROACHES TO STATISTICAL INFERENCE: HYPOTHESIS TESTING VERSUS BAYESIAN ANALYSIS THUY ANH NGO 1. Introduction Statistics are easily come across in our daily life. Statements such as the average

More information

Introduction to Probability Theory, Algebra, and Set Theory

Introduction to Probability Theory, Algebra, and Set Theory Summer School on Mathematical Philosophy for Female Students Introduction to Probability Theory, Algebra, and Set Theory Catrin Campbell-Moore and Sebastian Lutz July 28, 2014 Question 1. Draw Venn diagrams

More information

4. Probability of an event A for equally likely outcomes:

4. Probability of an event A for equally likely outcomes: University of California, Los Angeles Department of Statistics Statistics 110A Instructor: Nicolas Christou Probability Probability: A measure of the chance that something will occur. 1. Random experiment:

More information

Conditional Probability 2 Solutions COR1-GB.1305 Statistics and Data Analysis

Conditional Probability 2 Solutions COR1-GB.1305 Statistics and Data Analysis Conditional Probability 2 Solutions COR-GB.305 Statistics and Data Analysis The Birthday Problem. A class has 50 students. What is the probability that at least two students have the same birthday? Assume

More information

When working with probabilities we often perform more than one event in a sequence - this is called a compound probability.

When working with probabilities we often perform more than one event in a sequence - this is called a compound probability. + Independence + Compound Events When working with probabilities we often perform more than one event in a sequence - this is called a compound probability. Compound probabilities are more complex than

More information

4 Lecture 4 Notes: Introduction to Probability. Probability Rules. Independence and Conditional Probability. Bayes Theorem. Risk and Odds Ratio

4 Lecture 4 Notes: Introduction to Probability. Probability Rules. Independence and Conditional Probability. Bayes Theorem. Risk and Odds Ratio 4 Lecture 4 Notes: Introduction to Probability. Probability Rules. Independence and Conditional Probability. Bayes Theorem. Risk and Odds Ratio Wrong is right. Thelonious Monk 4.1 Three Definitions of

More information

Machine Learning. Hal Daumé III. Computer Science University of Maryland CS 421: Introduction to Artificial Intelligence 8 May 2012

Machine Learning. Hal Daumé III. Computer Science University of Maryland CS 421: Introduction to Artificial Intelligence 8 May 2012 Machine Learning Hal Daumé III Computer Science University of Maryland me@hal3.name CS 421 Introduction to Artificial Intelligence 8 May 2012 g 1 Many slides courtesy of Dan Klein, Stuart Russell, or Andrew

More information

BAYESIAN DECISION THEORY

BAYESIAN DECISION THEORY Last updated: September 17, 2012 BAYESIAN DECISION THEORY Problems 2 The following problems from the textbook are relevant: 2.1 2.9, 2.11, 2.17 For this week, please at least solve Problem 2.3. We will

More information

Elementary Discrete Probability

Elementary Discrete Probability Elementary Discrete Probability MATH 472 Financial Mathematics J Robert Buchanan 2018 Objectives In this lesson we will learn: the terminology of elementary probability, elementary rules of probability,

More information

ECE 450 Lecture 2. Recall: Pr(A B) = Pr(A) + Pr(B) Pr(A B) in general = Pr(A) + Pr(B) if A and B are m.e. Lecture Overview

ECE 450 Lecture 2. Recall: Pr(A B) = Pr(A) + Pr(B) Pr(A B) in general = Pr(A) + Pr(B) if A and B are m.e. Lecture Overview ECE 450 Lecture 2 Recall: Pr(A B) = Pr(A) + Pr(B) Pr(A B) in general = Pr(A) + Pr(B) if A and B are m.e. Lecture Overview Conditional Probability, Pr(A B) Total Probability Bayes Theorem Independent Events

More information

An AI-ish view of Probability, Conditional Probability & Bayes Theorem

An AI-ish view of Probability, Conditional Probability & Bayes Theorem An AI-ish view of Probability, Conditional Probability & Bayes Theorem Review: Uncertainty and Truth Values: a mismatch Let action A t = leave for airport t minutes before flight. Will A 15 get me there

More information

10/18/2017. An AI-ish view of Probability, Conditional Probability & Bayes Theorem. Making decisions under uncertainty.

10/18/2017. An AI-ish view of Probability, Conditional Probability & Bayes Theorem. Making decisions under uncertainty. An AI-ish view of Probability, Conditional Probability & Bayes Theorem Review: Uncertainty and Truth Values: a mismatch Let action A t = leave for airport t minutes before flight. Will A 15 get me there

More information

Probabilistic Classification

Probabilistic Classification Bayesian Networks Probabilistic Classification Goal: Gather Labeled Training Data Build/Learn a Probability Model Use the model to infer class labels for unlabeled data points Example: Spam Filtering...

More information

Chapter 1 (Basic Probability)

Chapter 1 (Basic Probability) Chapter 1 (Basic Probability) What is probability? Consider the following experiments: 1. Count the number of arrival requests to a web server in a day. 2. Determine the execution time of a program. 3.

More information

Statistics 1 - Lecture Notes Chapter 1

Statistics 1 - Lecture Notes Chapter 1 Statistics 1 - Lecture Notes Chapter 1 Caio Ibsen Graduate School of Economics - Getulio Vargas Foundation April 28, 2009 We want to establish a formal mathematic theory to work with results of experiments

More information

9/6. Grades. Exam. Card Game. Homework/quizzes: 15% (two lowest scores dropped) Midterms: 25% each Final Exam: 35%

9/6. Grades. Exam. Card Game. Homework/quizzes: 15% (two lowest scores dropped) Midterms: 25% each Final Exam: 35% 9/6 Wednesday, September 06, 2017 Grades Homework/quizzes: 15% (two lowest scores dropped) Midterms: 25% each Final Exam: 35% Exam Midterm exam 1 Wednesday, October 18 8:50AM - 9:40AM Midterm exam 2 Final

More information