
Section 7.2

Section Summary
- Assigning Probabilities
- Probabilities of Complements and Unions of Events
- Conditional Probability
- Independence
- Bernoulli Trials and the Binomial Distribution
- Random Variables

Assigning Probabilities. Laplace's definition assumes that all outcomes are equally likely. Now we introduce a more general definition of probabilities that avoids this restriction. Let S be a sample space of an experiment with a finite number of outcomes. We assign a probability p(s) to each outcome s, so that: (i) 0 ≤ p(s) ≤ 1 for each s ∈ S, and (ii) Σ_{s∈S} p(s) = 1. The function p from the set of all outcomes of the sample space S to the interval [0, 1] is called a probability distribution.

Assigning Probabilities. Ex: A trick coin is biased so that, when flipped, heads comes up twice as often as tails. What probabilities should we assign to the outcomes H (heads) and T (tails) when the biased coin is flipped? Solution: We are given that p(H) = 2p(T). Because p(H) + p(T) = 1, it follows that 2p(T) + p(T) = 3p(T) = 1. Hence, p(T) = 1/3 and p(H) = 2/3.
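The arithmetic above can be checked with exact rational arithmetic; a minimal Python sketch (the variable names p_H and p_T are ours, not from the slides):

```python
from fractions import Fraction

# Solve p(H) = 2*p(T) together with p(H) + p(T) = 1, so 3*p(T) = 1.
p_T = Fraction(1, 3)
p_H = 2 * p_T
assert p_H + p_T == 1  # a valid probability distribution
print(p_H, p_T)  # 2/3 1/3
```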

Uniform Distribution. Definition: Suppose that S is a set with n elements. The uniform distribution assigns the probability 1/n to each element of S. (Note that we could have used Laplace's definition here.) Example: Consider again the coin-flipping example, but with a fair coin. Now p(H) = p(T) = 1/2.

Probability of an Event. Definition: The probability of the event E is the sum of the probabilities of the outcomes in E: p(E) = Σ_{s∈E} p(s). Note that now no assumption is being made about the distribution.

Example: Suppose that a 6-sided die is biased so that 3 appears twice as often as each other number, but the other five outcomes are equally likely. What is the probability that an odd number appears when we roll this die? Solution: We want the probability of the event E = {1, 3, 5}. We have p(3) = 2/7 and p(1) = p(2) = p(4) = p(5) = p(6) = 1/7. Hence, p(E) = p(1) + p(3) + p(5) = 1/7 + 2/7 + 1/7 = 4/7.
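The same answer can be obtained by writing the biased distribution out explicitly; a sketch in Python:

```python
from fractions import Fraction

# Biased die: face 3 is twice as likely as each of the other five faces.
p = {face: Fraction(1, 7) for face in range(1, 7)}
p[3] = Fraction(2, 7)
assert sum(p.values()) == 1  # a valid probability distribution

p_odd = sum(p[face] for face in (1, 3, 5))  # event E = {1, 3, 5}
print(p_odd)  # 4/7
```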

Probabilities of Complements and Unions of Events. Complements: p(Ē) = 1 − p(E) still holds: since each outcome is in either E or Ē, but not both, 1 = Σ_{s∈S} p(s) = p(E) + p(Ē). Unions: p(E1 ∪ E2) = p(E1) + p(E2) − p(E1 ∩ E2) also still holds under the new definition.

Combinations of Events. Theorem: If E1, E2, … is a sequence of pairwise disjoint events in a sample space S, then p(∪i Ei) = Σi p(Ei). (See Exercises 36 and 37 for the proof.)

Conditional Probability. Events can be dependent, which means they can be affected by previous events. Definition: Let E and F be events with p(F) > 0. The conditional probability of E given F, denoted by p(E | F), is defined as: p(E | F) = p(E ∩ F) / p(F).

Conditional Probability. Ex: A bit string of length four is generated at random so that each of the 16 bit strings is equally likely. What is the probability that it contains at least two consecutive 0s, given that its first bit is a 0? Solution: Let E be the event that the bit string contains at least two consecutive 0s, and F be the event that the first bit is a 0. Since E ∩ F = {0000, 0001, 0010, 0011, 0100}, p(E ∩ F) = 5/16. Because 8 bit strings of length 4 start with a 0, p(F) = 8/16 = 1/2. Hence, p(E | F) = p(E ∩ F) / p(F) = (5/16) / (1/2) = 5/8.
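The conditional probability can be verified by enumerating all 16 bit strings; a sketch (the set names E and F follow the example):

```python
from fractions import Fraction
from itertools import product

strings = [''.join(bits) for bits in product('01', repeat=4)]  # 16 equally likely strings
E = {s for s in strings if '00' in s}    # at least two consecutive 0s
F = {s for s in strings if s[0] == '0'}  # first bit is 0

p_EF = Fraction(len(E & F), len(strings))  # p(E ∩ F) = 5/16
p_F = Fraction(len(F), len(strings))       # p(F) = 1/2
print(p_EF / p_F)  # 5/8
```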

[ https://xkcd.com/795/ ]

Conditional Probability. Ex: What is the conditional probability that a family with two children has two boys, given that they have at least one boy? Assume that each of the possibilities BB, BG, GB, and GG is equally likely (where B represents a boy and G represents a girl). Solution: Let E be the event that the family has two boys and let F be the event that the family has at least one boy. Then E = {BB}, F = {BB, BG, GB}, and E ∩ F = {BB}. It follows that p(F) = 3/4 and p(E ∩ F) = 1/4. Hence, p(E | F) = (1/4) / (3/4) = 1/3.
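Counting outcomes directly gives the same result; a small sketch:

```python
from fractions import Fraction

outcomes = ['BB', 'BG', 'GB', 'GG']      # equally likely
E = {s for s in outcomes if s == 'BB'}   # two boys
F = {s for s in outcomes if 'B' in s}    # at least one boy

# With a uniform distribution, p(E | F) reduces to |E ∩ F| / |F|.
p_E_given_F = Fraction(len(E & F), len(F))
print(p_E_given_F)  # 1/3
```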

Independence. Events can be independent, which means the occurrence of one event gives no information about the probability of another event. That is, p(E | F) = p(E): the occurrence of F has no impact on the probability of E. Definition: The events E and F are independent if and only if p(E ∩ F) = p(E)p(F).

Independence. Ex: Suppose E is the event that a randomly generated bit string of length four begins with a 1, and F is the event that this bit string contains an even number of 1s. Are E and F independent if the 16 bit strings of length four are equally likely? Solution: There are 8 bit strings of length four that begin with a 1, and 8 bit strings of length four that contain an even number of 1s. Since the number of bit strings of length 4 is 16, p(E) = p(F) = 8/16 = 1/2. Since E ∩ F = {1111, 1100, 1010, 1001}, p(E ∩ F) = 4/16 = 1/4. We conclude that E and F are independent, because p(E ∩ F) = 1/4 = (1/2)(1/2) = p(E)p(F).
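Independence here can also be checked mechanically, by comparing p(E ∩ F) with p(E)p(F) over all 16 strings; a sketch:

```python
from fractions import Fraction
from itertools import product

strings = [''.join(bits) for bits in product('01', repeat=4)]
E = {s for s in strings if s[0] == '1'}            # begins with a 1
F = {s for s in strings if s.count('1') % 2 == 0}  # even number of 1s

n = len(strings)
p_E = Fraction(len(E), n)
p_F = Fraction(len(F), n)
p_EF = Fraction(len(E & F), n)
print(p_EF == p_E * p_F)  # True: E and F are independent
```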

Independence. Ex: Assume (as in the previous example) that each of the four ways a family can have two children (BB, GG, BG, GB) is equally likely. Are the events E, that a family with two children has two boys, and F, that a family with two children has at least one boy, independent? Solution: E = {BB}, so p(E) = 1/4. We saw previously that p(F) = 3/4 and p(E ∩ F) = 1/4. The events E and F are not independent, since p(E)p(F) = 3/16 ≠ 1/4 = p(E ∩ F).

Pairwise and Mutual Independence. Definition: The events E1, E2, …, En are pairwise independent if and only if p(Ei ∩ Ej) = p(Ei)p(Ej) for all pairs i and j with 1 ≤ i < j ≤ n. That is, every pair of the events is independent. Definition: The events are mutually independent if p(Ei1 ∩ Ei2 ∩ … ∩ Eim) = p(Ei1)p(Ei2)⋯p(Eim) whenever ij, j = 1, 2, …, m, are integers with 1 ≤ i1 < i2 < … < im ≤ n and m ≥ 2. That is, every subset of m ≥ 2 of the events satisfies the product rule.

Bernoulli Trials (James Bernoulli, 1654–1705). Definition: Suppose an experiment can have only two possible outcomes, e.g., the flipping of a coin or the random generation of a bit. Each performance of the experiment is called a Bernoulli trial. One outcome is called a success and the other a failure. If p is the probability of success and q the probability of failure, then p + q = 1. Many problems involve determining the probability of k successes when an experiment consists of n mutually independent Bernoulli trials.

Bernoulli Trials. Ex: A fair coin is flipped 3 times, so p(H) = p(T) = 1/2. What is the probability that we get three, two, one, or no heads? Solution: There are 2^3 = 8 possible outcomes: HHH, HHT, HTH, HTT, THH, THT, TTH, TTT.
p(three heads) = C(3,3)/8 = 1/8
p(two heads) = C(3,2)/8 = 3/8
p(one head) = C(3,1)/8 = 3/8
p(zero heads) = C(3,0)/8 = 1/8

Bernoulli Trials. Ex: A coin is biased so that the probability of heads is 2/3. What is the probability that exactly four heads occur when the coin is flipped seven times? Solution: There are 2^7 = 128 possible outcomes. The number of ways four of the seven flips can be heads is C(7,4). The probability of each outcome with four heads and three tails is (2/3)^4 (1/3)^3, since the seven flips are independent. Hence, the probability that exactly four heads occur is C(7,4) (2/3)^4 (1/3)^3 = 560/2187.
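The computation can be reproduced exactly with math.comb and fractions; a minimal sketch:

```python
from fractions import Fraction
from math import comb

p = Fraction(2, 3)                     # probability of heads
prob = comb(7, 4) * p**4 * (1 - p)**3  # C(7,4) * (2/3)^4 * (1/3)^3
print(prob)  # 560/2187
```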

Probability of k Successes in n Independent Bernoulli Trials. Theorem 2: The probability of exactly k successes in n independent Bernoulli trials, with probability of success p and probability of failure q = 1 − p, is C(n,k) p^k q^(n−k). Proof: The outcome of n Bernoulli trials is an n-tuple (t1, t2, …, tn), where each ti is either S (success) or F (failure). The probability of each outcome of n trials consisting of k successes and n − k failures (in any order) is p^k q^(n−k). Because there are C(n,k) n-tuples of S's and F's that contain exactly k S's, the probability of k successes is C(n,k) p^k q^(n−k).

Probability of k Successes in n Independent Bernoulli Trials. Theorem 2: The probability of exactly k successes in n independent Bernoulli trials, with probability of success p and probability of failure q = 1 − p, is C(n,k) p^k q^(n−k). We denote by b(k; n, p) the probability of k successes in n independent Bernoulli trials with probability of success p. Viewed as a function of k, b(k; n, p) is the binomial distribution. By Theorem 2, b(k; n, p) = C(n,k) p^k q^(n−k).
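The formula translates directly into code; a sketch (the function name b mirrors the slide's notation b(k; n, p)):

```python
from fractions import Fraction
from math import comb

def b(k, n, p):
    """Binomial distribution: probability of exactly k successes in n trials."""
    return comb(n, k) * p**k * (1 - p)**(n - k)

# Sanity checks against the earlier examples.
assert b(2, 3, Fraction(1, 2)) == Fraction(3, 8)       # two heads in three fair flips
assert b(4, 7, Fraction(2, 3)) == Fraction(560, 2187)  # four heads in seven biased flips
assert sum(b(k, 7, Fraction(2, 3)) for k in range(8)) == 1
```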

Random Variables. Definition: A random variable is a function from the sample space of an experiment to the set of real numbers. That is, a random variable assigns a real number to each possible outcome. A random variable is a function; it is not a variable, and it is not random! In the late 1940s, W. Feller and J. L. Doob flipped a coin to see whether both would use "random variable" or the more fitting "chance variable." Unfortunately, Feller won, and the term "random variable" has been used ever since.

Random Variables. Definition: The distribution of a random variable X on a sample space S is the set of pairs (r, p(X = r)) for all r ∈ X(S), where p(X = r) is the probability that X takes the value r. Ex: Suppose that a coin is flipped three times. Let X(t) be the random variable that equals the number of heads that appear when t is the outcome. Then X(t) takes on the following values: X(HHH) = 3, X(HHT) = X(HTH) = X(THH) = 2, X(TTH) = X(THT) = X(HTT) = 1, and X(TTT) = 0. Each of the eight possible outcomes has probability 1/8. So the distribution of X(t) is p(X = 3) = 1/8, p(X = 2) = 3/8, p(X = 1) = 3/8, and p(X = 0) = 1/8.
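The distribution of X can be tabulated by enumerating the sample space; a sketch using Counter:

```python
from fractions import Fraction
from itertools import product
from collections import Counter

outcomes = list(product('HT', repeat=3))          # 8 equally likely outcomes
counts = Counter(o.count('H') for o in outcomes)  # value of X for each outcome
dist = {r: Fraction(c, len(outcomes)) for r, c in counts.items()}
print(dist[3], dist[2], dist[1], dist[0])  # 1/8 3/8 3/8 1/8
```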