Information Science 2 Probability Theory: An Overview Week 12 College of Information Science and Engineering Ritsumeikan University
Agenda Terms and concepts from Week 11 Basic concepts of probability Probability of events Simple events Combinations of events Independent and dependent events Bayes' theorem (also called rule or law) Test 2
Recall terms from Week 11 Algorithm efficiency, Space and Time Algorithm complexity: Best case, Average case, Worst case Big O Frequency count Asymptotic notation Dominate (dominating, dominant) Constant time, Linear time, Quadratic time, n log n time, Exponential time 3
This lecture's objective Overview the fundamentals of Probability Theory After this lecture and study, you must be able to: Understand the concepts of random events and probability Understand and calculate the probabilities of independent and dependent events, using Bayes' formula 4
Uncertainty, variability, chance The notion of chance has existed for centuries: Scripts excavated from Egyptian tombs (pyramids) date from around 2000 BC (~4000 years ago) Card and board games were invented, at the latest, in the 14th century In a random experiment, we repeat the same action, process, or measurement, and get an outcome which is uncertain (i.e. unpredictable) in advance. The outcome may vary at every trial, and before the experiment is finished, we can only talk about the chances (or probability) of obtaining a certain outcome 5
Probability: Basic concepts To determine the chances of different outcomes, we: 1. Build an exhaustive list of all possible outcomes 2. Make sure the listed outcomes are mutually exclusive A list of outcomes which meets conditions (1) and (2) is called a sample space Sample space: A sample space of a random experiment is a list of all the possible outcomes of the experiment. The outcomes must be mutually exclusive and exhaustive [Figure: faces of a six-sided die as example outcomes] Simple events: The individual outcomes are called simple (or elementary) events. Simple events cannot be further decomposed into constituent ("smaller") outcomes Event: An event is any collection of one or more simple events 6
Probability: Definition The probability of an event Eᵢ, which is a subset of a finite sample space S = {E₁, E₂, …, Eₙ} of equally likely outcomes, is given by P(Eᵢ) = (number of simple events in Eᵢ) / (total number of simple events in S), such that: 1) 0 ≤ P(Eᵢ) ≤ 1 for each i 2) Σᵢ P(Eᵢ) = 1, where i = 1, 2, …, n The probability P(A) of event A is, thus, the sum of the probabilities assigned to the simple events contained in A 7
Probability: Example For ten days, check the weather each day. Each time we check, it is a random experiment with a sample space W = {Sunny, Rainy, Snow} After 10 days, we got the following sequence of outcomes: S S S S S S R R R W, i.e. 6 days were sunny, 3 days were rainy, and 1 time we got snow in that period of time, so P(Sunny) = .6, P(Rainy) = .3, P(Snow) = .1 The probability of precipitation (i.e. rain or snow) is P(precipitation) = P(Rainy) + P(Snow) = .4 8
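The relative-frequency estimates on this slide can be reproduced with a short Python sketch (illustrative only; the outcome labels S/R/W follow the slide's sequence):

```python
from collections import Counter

# The ten observed outcomes from the slide: S = sunny, R = rainy, W = snow
outcomes = ["S", "S", "S", "S", "S", "S", "R", "R", "R", "W"]

counts = Counter(outcomes)
total = len(outcomes)

# Relative frequencies used as probability estimates
p = {w: counts[w] / total for w in ("S", "R", "W")}

# Precipitation means rain or snow: P(Rainy) + P(Snow)
p_precipitation = p["R"] + p["W"]
print(p)                # {'S': 0.6, 'R': 0.3, 'W': 0.1}
print(p_precipitation)  # 0.4
```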
Probability: Combinations of events If A and B are two events, then 1) P(A ∪ B) = P(A occurs OR B occurs OR both) 2) P(A ∩ B) = P(A AND B both occur) 3) P(Ā) = P(NOT(A)) = P(A does not occur) (Ā is the complement event of A) 4) P(A | B) = P(A occurs given that B has already occurred) (the conditional probability of A given B) 9
10 Combinations of events Suppose that we are observing the number of spots turning up when a six-sided die is tossed. Let us then define the following events: A: The number observed is at most 2 B: The number observed is an even number C: The number observed is 4
Combinations of events: Simple events Question: Define the sample space for the random experiment with the 6-sided die and assign probabilities to the simple events [Venn diagram: S containing 1, 2, 3, 4, 5, 6] Answer: S = {1, 2, 3, 4, 5, 6} Each simple event is equally likely to occur, i.e. the events are equiprobable: P(1) = P(2) = … = P(6) = 1/6 11
12 Combinations of events Question: Find P(A) [Venn diagram: A = {1, 2} inside S] Answer: A = {1, 2}, so P(A) = P(1) + P(2) = 2/6
13 Combinations of events: Complement Question: Find P(Ā) [Venn diagram: Ā is everything in S outside A] Answer: Ā = {3, 4, 5, 6}, so P(Ā) = P(3) + P(4) + P(5) + P(6) = 1 − P(A) = 4/6
Combinations of events: Mutually exclusive Question: Are events A and C mutually exclusive? [Venn diagram: A = {1, 2} and C = {4} do not overlap] Answer: There is no overlap between the two regions, therefore events A and C are mutually exclusive: they cannot occur simultaneously (i.e. at the same time) 14
Combinations of events: Union Question: Find P(A ∪ C) [Venn diagram: A = {1, 2} and C = {4} inside S] Answer: P(A ∪ C) = P({1, 2, 4}) = 1/6 + 1/6 + 1/6 = 3/6, or, because A and C are mutually exclusive, P(A ∪ C) = P(A) + P(C) = 2/6 + 1/6 = 3/6 15
16 Combinations of events: Intersection Question: Find P(A ∩ B) [Venn diagram: A = {1, 2} and B = {2, 4, 6} overlap at 2] Answer: P(A ∩ B) = P(2) = 1/6
Combinations of events: Condition Question: Find P(C | B) [Venn diagram: C = {4} inside B = {2, 4, 6}] Answer: P(the number is 4 | the number is even) = 1/3 17
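The die examples on slides 11-17 can be checked in a few lines of Python (a sketch, not part of the slides; exact arithmetic via the standard fractions module):

```python
from fractions import Fraction

S = {1, 2, 3, 4, 5, 6}   # sample space of a fair six-sided die
A = {1, 2}               # the number observed is at most 2
B = {2, 4, 6}            # the number observed is even
C = {4}                  # the number observed is 4

def prob(event):
    # Equally likely outcomes: P(E) = |E| / |S|
    return Fraction(len(event), len(S))

print(prob(A))                 # 1/3  (= 2/6)
print(prob(A | C))             # 1/2  (union; A and C are mutually exclusive)
print(prob(A & B))             # 1/6  (intersection)
print(prob(C & B) / prob(B))   # 1/3  (conditional: P(C|B) = P(C ∩ B)/P(B))
```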
18 Conditional probability P(C | B) is read as the conditional probability that event C occurs, given that event B has occurred P(C | B) gives the probability of an event when partial knowledge (e.g. B) about the outcome of an experiment is known [Venn diagram: C = {4} inside B = {2, 4, 6}] P(C | B) = P(C ∩ B) / P(B)
Bayes' theorem (rule, law) Bayes' theorem: Let S = {A₁, A₂, …, Aₙ}, with 0 < P(Aᵢ) ≤ 1 and Σᵢ P(Aᵢ) = 1 (the normalization condition), and let B be any event such that P(B) > 0. Then for i = 1, 2, …, n: P(Aᵢ | B) = P(Aᵢ) P(B | Aᵢ) / Σⱼ P(Aⱼ) P(B | Aⱼ) The proof of this theorem is straightforward (try to prove it yourself) 19
20 Interpretation of the theorem P(Aᵢ | B) = P(Aᵢ) P(B | Aᵢ) / Σⱼ P(Aⱼ) P(B | Aⱼ) P(Aᵢ | B) is the posterior probability of Aᵢ given B: the probability of event Aᵢ after B has been observed P(Aᵢ) is the prior probability of Aᵢ: it summarizes our beliefs about the probability of event Aᵢ before Aᵢ or B are observed P(B | Aᵢ) is the likelihood of event B given Aᵢ The denominator Σⱼ P(Aⱼ) P(B | Aⱼ) is the normalizing constant, which equals P(B) when the events Aⱼ form a partition of the sample space (by the law of total probability)
Bayes' theorem: Example What is the probability that a student did not study for an exam, given that this student took but did not pass this exam? Let us assume that 10% of the students do not study for the exam; we denote this as P(I) = .10 Let us then assume that 95% of the students who did not study for the exam do not pass it; we denote this as P(F | I) = .95 We will also assume that 5% of the students who did study for the exam still could not pass it; we denote this as P(F | Ī) = .05 So, we need to find P(I | F): P(I | F) = P(I) P(F | I) / (P(I) P(F | I) + P(Ī) P(F | Ī)) = (.1)(.95) / ((.1)(.95) + (.9)(.05)) = .095 / .14 ≈ .68 Based on the obtained result, think about whether the fact that a student did not pass the exam can be used to conclude that the student had not prepared for this exam 21
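The exam calculation can be verified with a quick Python sketch (the function name bayes_posterior is ours, not from the slides):

```python
def bayes_posterior(priors, likelihoods, i):
    # P(A_i | B) = P(A_i) P(B|A_i) / sum_j P(A_j) P(B|A_j)
    evidence = sum(p * l for p, l in zip(priors, likelihoods))
    return priors[i] * likelihoods[i] / evidence

p_I = 0.10             # P(I): the student did not study
p_F_given_I = 0.95     # P(F|I): fails, given no study
p_F_given_notI = 0.05  # P(F|notI): fails, despite studying

# Index 0 selects the "did not study" hypothesis
posterior = bayes_posterior([p_I, 1 - p_I],
                            [p_F_given_I, p_F_given_notI], 0)
print(round(posterior, 2))  # 0.68
```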
Independent and dependent events Two events A and B are said to be independent if P(A | B) = P(A) or P(B | A) = P(B). Otherwise, the events are dependent Note that independent events and mutually exclusive events are not the same! For example, consider: A and B are two mutually exclusive events, and A can take place, that is, P(A) > 0. Can A and B be independent? The conditional probability that A occurs given that B has occurred is zero, that is, P(A | B) = 0, because P(A ∩ B) = 0. However, P(A) > 0, therefore A and B cannot be independent 22
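Independence can equivalently be tested in product form, P(A ∩ B) = P(A)·P(B). A Python sketch using the die events defined earlier (note that A and B from the die example happen to be independent, while the mutually exclusive pair A, C is dependent):

```python
from fractions import Fraction

S = {1, 2, 3, 4, 5, 6}   # fair six-sided die
A = {1, 2}               # at most 2
B = {2, 4, 6}            # even
C = {4}                  # exactly 4

def prob(event):
    # Equally likely outcomes: P(E) = |E| / |S|
    return Fraction(len(event), len(S))

# Product form of independence: P(A ∩ B) == P(A) P(B)
print(prob(A & B) == prob(A) * prob(B))   # True  -> A and B are independent
# Mutually exclusive events with positive probabilities are always dependent:
print(prob(A & C) == prob(A) * prob(C))   # False -> A and C are dependent
```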
Rules to remember Axiom 1: For any event E, 0 ≤ P(E) ≤ 1 Axiom 2: If X is a certain event, then P(X) = 1 Axiom 3: If E₁ ∩ E₂ = ∅, then P(E₁ ∪ E₂) = P(E₁) + P(E₂) Rule 1: P(E) = 1 − P(Ē) Rule 2: P(∅) = 0 Rule 3: If B ⊆ A, then P(B) ≤ P(A) Rule 4: P(A ∪ B) = P(A) + P(B) − P(A ∩ B) (addition rule) Rule 5: P(A ∩ B) = P(A | B) P(B) (multiplication rule) Bayes' rule (see slide 20) 23
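The addition and multiplication rules can be verified on the die sample space with a small Python sketch (the events A and B here are arbitrary illustrative choices, not the ones from the earlier slides):

```python
from fractions import Fraction

S = {1, 2, 3, 4, 5, 6}   # fair six-sided die

def prob(event):
    # Equally likely outcomes: P(E) = |E| / |S|
    return Fraction(len(event), len(S))

A = {1, 2, 3}   # arbitrary illustrative events
B = {2, 4}

# Rule 4 (addition rule): P(A ∪ B) = P(A) + P(B) − P(A ∩ B)
assert prob(A | B) == prob(A) + prob(B) - prob(A & B)

# Rule 5 (multiplication rule): P(A ∩ B) = P(A | B) P(B),
# where P(A | B) = P(A ∩ B) / P(B)
p_A_given_B = prob(A & B) / prob(B)
assert prob(A & B) == p_A_given_B * prob(B)

print("addition and multiplication rules hold")
```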
24 Summary of this lecture After this class, you are expected to know the following: Probability simple event / event / sample space combinations of events complement union intersection conditional probability, Bayes theorem independent and dependent events Axioms and rules of probability
25 Homework Read these slides Do the self-preparation assignments Learn the English terms new for you
26 Next class Random variables and Monte Carlo simulation The concept of the probability distribution function Random variables Expectation and variance of a random variable Random number generators Monte Carlo methods and their applications
Test 04 27