Computing Probability


James H. Steiger
October 22, 2003

1 Goals for this Module

In this module, we will

1. Develop a general rule for computing probability, and a special-case rule applicable when elementary events are equally likely.

2. Define and discuss

   (a) Joint Events
   (b) Conditional Probability
   (c) Independence

3. Derive the rule for computing the probability of a sequence, and a special-case rule applicable when events are independent.

2 Computing Probability

2.1 The General Rule

Any event A in the sample space Ω is the union of elementary events, which are non-intersecting. Consequently, to compute the probability of A, simply sum the probabilities of the elementary events in A.

Example 1. You have an unfair die, with probabilities as listed in the following table. Find the probability of the event E = {2, 4, 6}.

X    Pr(X)
6    .3
5    .2
4    .1
3    .1
2    .2
1    .1

Solution. Simply add the probabilities of the events {2}, {4}, and {6}, obtaining .2 + .1 + .3 = .6 as the answer.

Example 2. Given the probabilities in the preceding table, find the probability that the die will be even and greater than 3.

Solution. This event is the set {4, 6}. Adding the probabilities for {4} and {6}, we obtain .1 + .3 = .4 as the answer.

2.2 Equally Likely Elementary Events

Since the probabilities of elementary events must sum to 1, when elementary events are equally likely each elementary event has probability 1/N, where N is the total number of elementary events in Ω. Consequently, any event A composed of N_A elementary events must have probability given by

Pr(A) = N_A / N

Example 3. Suppose a die is fair. Then each side is equally likely to come up. There are 6 sides, and for their probabilities to add to 1, each side must have probability 1/6. In this case N = 6. Suppose our event of interest, call it A, is that the number is even and less than 6. To compute the probability of a number being even and less than 6, we simply ask how many elementary events are in A. Since A = {2, 4}, N_A = 2, and

Pr(A) = 2/6 = 1/3
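To make the general rule and the equally-likely special case concrete, here is a minimal Python sketch. The probability table mirrors the unfair-die table above; the function name pr_event and the variable names are my own, not something from the notes.

```python
# Probabilities of the elementary events for the unfair die (table above).
unfair_die = {1: 0.1, 2: 0.2, 3: 0.1, 4: 0.1, 5: 0.2, 6: 0.3}

def pr_event(event, elementary_probs):
    """General rule: sum the probabilities of the elementary events in the event."""
    return sum(elementary_probs[outcome] for outcome in event)

# Example 1: Pr({2, 4, 6}) = .2 + .1 + .3 = .6
print(pr_event({2, 4, 6}, unfair_die))       # 0.6 (up to floating-point rounding)

# Example 2: even and greater than 3, i.e. the event {4, 6}
print(pr_event({4, 6}, unfair_die))          # 0.4 (up to floating-point rounding)

# Equally likely case (fair die): Pr(A) = N_A / N
fair_die = {outcome: 1 / 6 for outcome in range(1, 7)}
A = {2, 4}                                   # even and less than 6 (Example 3)
print(pr_event(A, fair_die), len(A) / 6)     # both 1/3
```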

3 Joint Events

So far, we have been discussing probabilities when the events of interest involve a single process. In many cases, however, we are interested in the simultaneous behavior of more than one process. In that case, we are interested in joint events.

Definition 4 (Joint Events). Joint events are intersections of outcomes on two different processes.

Example 5. You toss a fair coin and throw a fair die. A joint event is the intersection of the outcome on the coin and the outcome on the die. The following table, which is also a Venn diagram, shows the possibilities.

           Die
Coin       1        2        3        4        5        6
H        H ∩ 1    H ∩ 2    H ∩ 3    H ∩ 4    H ∩ 5    H ∩ 6
T        T ∩ 1    T ∩ 2    T ∩ 3    T ∩ 4    T ∩ 5    T ∩ 6

Definition 6 (Marginal Events). The events on the margins of the above table are called marginal events. They are the unions of the events in the respective row or column.

Example 7. One of the marginal events in the table is the event {3}. Note that the only way a 3 can occur is if you throw either a head and a 3 or a tail and a 3. So {3} = (H ∩ 3) ∪ (T ∩ 3). Notice in passing that any marginal event is partitioned by the events in its row or column. Consequently, by the third axiom of probability theory, the probabilities of the marginal events in any row or column can be obtained by adding all the entries in the cells of the respective row or column. We examine this notion further below.

3.1 Probabilities in a Joint Probability Table

The table below shows probabilities for the joint events in the interior of the table, and probabilities for the marginal events are listed on the margins.

Die outcome:        1      2      3      4      5      6
Die marginal:      1/6    1/6    1/6    1/6    1/6    1/6
Coin H (1/2):      1/6     0     1/6     0     1/6     0
Coin T (1/2):       0     1/6     0     1/6     0     1/6
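The joint table can also be represented directly in code. The sketch below stores the nonzero joint probabilities and recovers the marginal probabilities by summing rows and columns, as described in Example 7; the names pr_joint, pr_coin, and pr_die are my own.

```python
from fractions import Fraction

# Nonzero joint probabilities Pr(coin outcome and die outcome) from the table above.
sixth = Fraction(1, 6)
joint = {
    ('H', 1): sixth, ('H', 3): sixth, ('H', 5): sixth,
    ('T', 2): sixth, ('T', 4): sixth, ('T', 6): sixth,
}

def pr_joint(coin, die):
    """Joint probability of a cell; cells not listed in the table are zero."""
    return joint.get((coin, die), Fraction(0))

def pr_coin(coin):
    """Marginal probability of a coin outcome: add the entries in its row."""
    return sum(pr_joint(coin, die) for die in range(1, 7))

def pr_die(die):
    """Marginal probability of a die outcome: add the entries in its column."""
    return sum(pr_joint(coin, die) for coin in ('H', 'T'))

print(pr_coin('H'), pr_die(3))   # 1/2 and 1/6, matching the margins of the table
```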

How would you characterize in words the performance of the coin and the die as depicted by the probabilities in the above table? Consider their behavior both separately and together, i.e., their joint and marginal behavior.

4 Conditional Probability

Definition 8 (Conditional Probability). The conditional probability of A given B, denoted Pr(A | B), is the probability of A within the reduced sample space defined by B.

By way of comment: to compute conditional probabilities, you go into the reduced sample space of events in which B occurs, and compute the probability of obtaining A within that reduced sample space.

Example 9. Consider again the table. Find the following:

Pr(1 | H), the probability of a 1 given a head.
Pr(H | 1), the probability of a head given a 1.
Pr(E | T), the probability of an even number given a tail.

Die outcome:        1      2      3      4      5      6
Die marginal:      1/6    1/6    1/6    1/6    1/6    1/6
Coin H (1/2):      1/6     0     1/6     0     1/6     0
Coin T (1/2):       0     1/6     0     1/6     0     1/6

Notice that, to compute a conditional probability, you went to a group of cells characterized by a row or a column, and restandardized the joint probabilities in that row or column so that they added up to 1. Perhaps without realizing it, you performed this standardization by dividing each entry in the row or column by the sum of the entries in that row or column. For example, to compute Pr(1 | H), you went into the row labeled H and noted that, in that reduced sample space, only 3 values ever occur for the die, and they occur equally often. So the conditional probability of a 1 given a head must be 1/3. In other words, you divided Pr(1 ∩ H) by Pr(H), obtaining (1/6)/(1/2) = 1/3.
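The restandardization just described is a one-line computation once the joint table is in hand. This sketch assumes the pr_joint, pr_coin, and pr_die helpers from the previous sketch are in scope, and reproduces the three conditional probabilities asked for in Example 9.

```python
# Conditional probability by restandardizing a row or column of the joint table:
# divide each joint probability by the marginal probability of the conditioning event.

def pr_die_given_coin(die, coin):
    return pr_joint(coin, die) / pr_coin(coin)

def pr_coin_given_die(coin, die):
    return pr_joint(coin, die) / pr_die(die)

print(pr_die_given_coin(1, 'H'))    # Pr(1 | H) = (1/6) / (1/2) = 1/3
print(pr_coin_given_die('H', 1))    # Pr(H | 1) = (1/6) / (1/6) = 1
print(sum(pr_die_given_coin(d, 'T') for d in (2, 4, 6)))   # Pr(even | T) = 1
```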

This type of operation leads to the following formula for conditional probability.

Definition 10 (Conditional Probability Formula).

Pr(A | B) = Pr(A ∩ B) / Pr(B)

5 Independence

Probabilistic independence is a key idea in probability theory. Notice that, in the example of the preceding table, knowing the outcome on the coin provides information about the outcome on the die, and vice versa, in the sense that the conditional probabilities are different from the marginal probabilities. For example, before knowing the outcome on the coin, the probability of an odd number on the die is 1/2. On the other hand, once you know that the coin came up heads, the probability of an odd number becomes one (and the probability of an even number becomes zero). Two marginal events are independent if the status of one does not provide information about the other, in the sense of changing the probability distribution of the other. This leads to the following definition.

Definition 11 (Probabilistic Independence). Two events A and B are independent if

Pr(A | B) = Pr(A)

or, alternatively, if

Pr(A ∩ B) = Pr(A) Pr(B)
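Definition 11 can be checked numerically for the coin-and-die table above. Continuing with the same helper functions (again assumed to be in scope), the check below confirms that the coin and the die in that table are not independent.

```python
# Independence check: A and B are independent iff Pr(A and B) = Pr(A) * Pr(B).
# Here A = "coin shows H" and B = "die shows 1".
lhs = pr_joint('H', 1)            # Pr(H and 1)   = 1/6
rhs = pr_coin('H') * pr_die(1)    # Pr(H) * Pr(1) = 1/2 * 1/6 = 1/12
print(lhs, rhs, lhs == rhs)       # 1/6  1/12  False -> not independent

# Equivalently, the conditional probability differs from the marginal probability:
print(pr_die_given_coin(1, 'H'), pr_die(1))   # 1/3 vs. 1/6
```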

6 Sequences

Many interesting problems in probability involve sequences of events. For example, if you are playing 5-card stud poker, you might be interested in the probability of the sequence A_1 A_2, i.e., drawing an ace on the first card and an ace on the second card. In general, sequences may be viewed as intersections of events in time, so that the probability we are interested in is actually Pr(A_1 ∩ A_2). If we look carefully at the definition of conditional probability, we can see a general rule for the probability of a sequence. We begin by deriving a general rule for the probability of the intersection of two events.

Theorem 12 (Probability of an Intersection). The probability of the intersection of two events A and B is given by

Pr(A ∩ B) = Pr(A) Pr(B | A)

Proof. Consider the definition of conditional probability, but reverse the roles of A and B. One obtains

Pr(B | A) = Pr(B ∩ A) / Pr(A) = Pr(A ∩ B) / Pr(A)

and the result immediately follows.

The preceding theorem leads immediately to a rule for calculating the probability of a sequence.

Corollary 13 (Probability of a Sequence). The probability of the sequence of events A_1 A_2 A_3 ... A_N is the product of the probabilities of the events at each point in the sequence, conditional on everything that happened previously, i.e.,

Pr(A_1 A_2 A_3 ... A_N) = Pr(A_1) Pr(A_2 | A_1) Pr(A_3 | A_1 A_2) Pr(A_4 | A_1 A_2 A_3) ... Pr(A_N | A_1 A_2 ... A_{N-1})

The result is simpler when the events are independent, because conditioning has no effect when events are independent.

Corollary 14 (Product Rule). Consider a sequence of independent events A_1 A_2 A_3 ... A_N. The probability of the sequence is given by

Pr(A_1 A_2 ... A_N) = Pr(A_1) Pr(A_2) ... Pr(A_N)

Example 15 (Drawing Two Aces). We wish to compute the probability of drawing two consecutive aces off the top of a perfectly shuffled deck in stud poker. In poker there are 52 cards, and 4 of them are aces. Poker involves sampling without replacement, because what happens on one draw affects the probabilities for subsequent draws. Suppose cards are dealt completely at random. What is the probability of drawing an ace on the first card?

Solution 16. We need Pr(A_1 A_2), which can be calculated as Pr(A_1) Pr(A_2 | A_1). Since there are 52 cards in the sample space, 4 of them are aces, and elementary events are assumed to be equally likely, Pr(A_1) = 4/52 = 1/13. What is Pr(A_2 | A_1)? Once the first card has been drawn, there are 51 cards in the deck and 3 of them are aces. So Pr(A_2 | A_1) = 3/51 = 1/17. Consequently, Pr(A_1 A_2) = (1/13)(1/17) = 1/221.

Example 17 (Throwing Two Sixes). If you throw 2 independent fair dice, what is the probability of throwing two sixes?

Solution 18. If we can assume the dice are fair, the probability of a six on either die is 1/6. If we can assume independence, the product rule holds, and the probability of two sixes is the product of the individual probabilities, i.e., (1/6)(1/6) = 1/36.
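As a closing sketch, Examples 15 and 17 can be computed with the chain rule and the product rule, with an optional simulation as a sanity check on the two-ace probability. The variable names and the simulation are my own additions, not part of the original notes.

```python
from fractions import Fraction
import random

# Chain rule for the two-ace sequence: Pr(A1 A2) = Pr(A1) * Pr(A2 | A1).
pr_two_aces = Fraction(4, 52) * Fraction(3, 51)
print(pr_two_aces)                 # 1/221

# Product rule for two independent fair dice: Pr(six, six) = (1/6) * (1/6).
pr_two_sixes = Fraction(1, 6) ** 2
print(pr_two_sixes)                # 1/36

# Optional sanity check: simulate dealing two cards without replacement.
deck = ['A'] * 4 + ['x'] * 48      # 4 aces in a 52-card deck
trials = 100_000
hits = sum(random.sample(deck, 2) == ['A', 'A'] for _ in range(trials))
print(hits / trials)               # close to 1/221, i.e. about 0.0045
```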