
Chapter 7

Chapter Summary 7.1 Discrete Probability 7.2 Probability Theory 7.3 Bayes' Theorem 7.4 Expected Value and Variance

Section 7.1

Introduction. Probability theory dates back to 1526, when the Italian mathematician, physician, and gambler Girolamo Cardano wrote the first known systematic treatment of the subject in his book Liber de Ludo Aleae (Book on Games of Chance). This section deals with experiments that have a finite number of outcomes.

Finite probability. Experiment: a procedure that yields one of a given set of possible outcomes. Sample space of the experiment: the set of possible outcomes. Event: a subset of the sample space. If S is a finite nonempty sample space of equally likely outcomes, and E is an event, that is, a subset of S, then the probability of E is p(E) = |E| / |S|.

Finite probability. What is the probability that, when two dice are rolled, the sum of the numbers on the two dice is 7? Solution: Sample space: all 36 ordered pairs of numbers that can appear on the two dice, (1,1), (1,2), ..., (6,5), (6,6). Event: the two dice show numbers whose sum is 7, that is, (1,6), (2,5), (3,4), (4,3), (5,2), (6,1). Probability: p = 6/36 = 1/6.
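As a quick check, the count can be verified by enumerating the sample space; a minimal Python sketch:

    from itertools import product
    from fractions import Fraction

    # Sample space: all 36 ordered pairs (i, j) from rolling two dice.
    sample_space = list(product(range(1, 7), repeat=2))

    # Event: the sum of the two dice is 7.
    event = [(i, j) for (i, j) in sample_space if i + j == 7]

    p = Fraction(len(event), len(sample_space))
    print(p)  # 1/6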

Finite probability. In a lottery, players win a large prize when they pick four digits that match, in the correct order, four digits selected by a random mechanical process. A smaller prize is won if only three digits are matched. What is the probability that a player wins the large prize? What is the probability that a player wins the smaller prize? Solution 1: There is only one way to choose all four digits correctly. By the product rule, there are 10*10*10*10 = 10,000 ways to choose four digits. The probability that a player wins the large prize is 1/10,000 = 0.0001. Solution 2: To match exactly three digits, exactly one of the four positions must be wrong, and there are 9 wrong digits for that position; by the sum rule there are 9+9+9+9 = 36 such ways. The probability that a player wins the smaller prize is 36/10,000 = 0.0036.

Finite probability. There are many lotteries now that award enormous prizes to people who correctly choose a set of six numbers out of the first n positive integers, where n is usually between 30 and 60. What is the probability that a person picks the correct six numbers out of 40? Solution: There is only one winning combination. The total number of ways to choose six numbers out of 40 is C(40,6) = 40!/(34! 6!) = 3,838,380, so the probability of picking a winning combination is 1/3,838,380.
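A short Python sketch (illustrative, using the standard library's math.comb) that reproduces these lottery probabilities:

    from math import comb

    # Large prize in the four-digit lottery: 1 winning string out of 10^4.
    p_large = 1 / 10**4          # 0.0001

    # Smaller prize: exactly three of the four digits correct.
    # Choose which digit is wrong (4 ways) and its wrong value (9 ways).
    p_small = (4 * 9) / 10**4    # 0.0036

    # Six-number lottery: one winning combination out of C(40, 6).
    p_six = 1 / comb(40, 6)      # 1/3838380
    print(p_large, p_small, p_six)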

Probabilities of complements and unions of events. Let E be an event in a sample space S. The probability of the event Ē = S − E, the complementary event of E, is given by p(Ē) = 1 − p(E). Example: A sequence of 10 bits is randomly generated. What is the probability that at least one of these bits is 0? The event "at least one bit is 0" is the complement of the event "none of the 10 bits is 0", so p = 1 − 1/2^10 = 1023/1024.

Probabilities of complements and unions of events. Let E1 and E2 be events in the sample space S. Then p(E1 ∪ E2) = p(E1) + p(E2) − p(E1 ∩ E2). Example: What is the probability that a positive integer selected at random from the set of positive integers not exceeding 100 is divisible by either 2 or 5? E1: the integer selected at random is divisible by 2. E2: the integer selected at random is divisible by 5. p(E1 ∪ E2) = p(E1) + p(E2) − p(E1 ∩ E2) = 50/100 + 20/100 − 10/100 = 3/5.
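The inclusion-exclusion count can be verified directly; a small Python sketch:

    # Verify the counts on the integers 1..100.
    S = range(1, 101)
    E1 = {n for n in S if n % 2 == 0}   # divisible by 2
    E2 = {n for n in S if n % 5 == 0}   # divisible by 5

    print(len(E1) / 100, len(E2) / 100, len(E1 & E2) / 100)  # 0.5 0.2 0.1
    print(len(E1 | E2) / 100)                                # 0.6 = 3/5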

Section 7.2

Introduction. If S is a finite nonempty sample space of equally likely outcomes, and E is an event, that is, a subset of S, then the probability of E is p(E) = |E| / |S|. How do we deal with experiments whose outcomes are not equally likely? For example, a coin may be biased so that it comes up heads twice as often as tails. Conditional probability: Suppose that a fair coin is flipped four times, and the first time it comes up heads. Given this information, what is the probability that heads comes up three times? How do we determine whether events are dependent or independent?

Assigning Probabilities. Let S be the sample space of an experiment with a finite or countable number of outcomes. We assign a probability p(s) to each outcome s such that: 1. 0 ≤ p(s) ≤ 1 for each s ∈ S, and 2. ∑_{s∈S} p(s) = 1. When there are n possible outcomes x1, x2, ..., xn: 1. 0 ≤ p(xi) ≤ 1 for i = 1, 2, ..., n, and 2. ∑_{i=1}^{n} p(xi) = 1. The function p from the set of all outcomes of the sample space S is called a probability distribution.

Example. What probabilities should we assign to the outcomes H (heads) and T (tails) when a fair coin is flipped? p(H) = p(T) and p(H) + p(T) = 1, so p(H) = p(T) = 1/2. What probabilities should we assign to these outcomes when the coin is biased so that heads comes up twice as often as tails? p(H) = 2p(T) and p(H) + p(T) = 1, so p(H) = 2/3 and p(T) = 1/3.

Assigning Probabilities. Definition 1: Suppose that S is a set with n elements. The uniform distribution assigns the probability 1/n to each element of S. The experiment of selecting an element from a sample space with a uniform distribution is called selecting an element of S at random. Definition 2: The probability of the event E is the sum of the probabilities of the outcomes in E. That is, p(E) = ∑_{s∈E} p(s).

Example. Suppose that a die is biased so that 3 appears twice as often as each other number, but that the other five outcomes are equally likely. What is the probability that an odd number appears when we roll this die? Solution: E = {1, 3, 5}. p(1) = p(2) = p(4) = p(5) = p(6) = (1/2)p(3) and p(1) + p(2) + p(3) + p(4) + p(5) + p(6) = 1, so p(1) = p(2) = p(4) = p(5) = p(6) = 1/7 and p(3) = 2/7. p(E) = p(1) + p(3) + p(5) = 4/7.
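A small Python sketch of Definition 2 applied to this biased die (illustrative only):

    from fractions import Fraction

    # Biased die: 3 is twice as likely as each of the other faces.
    p = {face: Fraction(1, 7) for face in range(1, 7)}
    p[3] = Fraction(2, 7)
    assert sum(p.values()) == 1

    # p(E) is the sum of the probabilities of the outcomes in E (Definition 2).
    E = {1, 3, 5}
    print(sum(p[face] for face in E))  # 4/7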

Probabilities of complements and unions of events. p(Ē) = 1 − p(E). p(E1 ∪ E2) = p(E1) + p(E2) − p(E1 ∩ E2). THEOREM 1: If E1, E2, ... is a sequence of pairwise disjoint events in a sample space S, then p(⋃_i Ei) = ∑_i p(Ei).

Conditional Probability. Example: What is the conditional probability that a family with two children has two boys, given that they have at least one boy? Assume that each of the possibilities BB, BG, GB, and GG is equally likely, where B represents a boy and G represents a girl. Solution: E: the event that a family with two children has two boys. F: the event that a family with two children has at least one boy. E = {BB}, F = {BB, BG, GB}, and E ∩ F = {BB}. Because the four possibilities are equally likely, p(F) = 3/4 and p(E ∩ F) = 1/4. p(E|F) = p(E ∩ F)/p(F) = (1/4)/(3/4) = 1/3.

Conditional Probability. Definition 3: Let E and F be events with p(F) > 0. The conditional probability of E given F, denoted by p(E|F), is defined as p(E|F) = p(E ∩ F)/p(F). Example: A bit string of length four is generated at random so that each of the 16 bit strings of length four is equally likely. What is the probability that it contains at least two consecutive 0s, given that its first bit is a 0? (We assume that 0 bits and 1 bits are equally likely.) Solution: E: the event that a bit string of length four contains at least two consecutive 0s. F: the event that the first bit of a bit string of length four is a 0. The probability that a bit string of length four has at least two consecutive 0s, given that its first bit is a 0, equals p(E|F) = p(E ∩ F)/p(F). E ∩ F = {0000, 0001, 0010, 0011, 0100}, so p(E ∩ F) = 5/16. Because there are eight bit strings of length four that start with a 0, p(F) = 8/16 = 1/2. Consequently, p(E|F) = (5/16)/(1/2) = 5/8.
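A minimal Python sketch that enumerates the 16 bit strings and confirms the conditional probability:

    from itertools import product
    from fractions import Fraction

    # All 16 equally likely bit strings of length four.
    strings = [''.join(bits) for bits in product('01', repeat=4)]

    E = {s for s in strings if '00' in s}      # at least two consecutive 0s
    F = {s for s in strings if s[0] == '0'}    # first bit is 0

    p_F = Fraction(len(F), 16)                 # 1/2
    p_EF = Fraction(len(E & F), 16)            # 5/16
    print(p_EF / p_F)                          # 5/8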

Independence. The events E and F are independent if and only if p(E ∩ F) = p(E)p(F). Example: Suppose E is the event that a randomly generated bit string of length four begins with a 1 and F is the event that this bit string contains an even number of 1s. Are E and F independent, if the 16 bit strings of length four are equally likely? 0000, 0001, 0010, 0011, 0100, 0101, 0110, 0111, 1000, 1001, 1010, 1011, 1100, 1101, 1110, 1111. p(E) = p(F) = 8/16 = 1/2. E ∩ F = {1111, 1100, 1010, 1001}, so p(E ∩ F) = 4/16 = 1/4. Since p(E ∩ F) = p(E)p(F), E and F are independent.
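The independence check can be done the same way; an illustrative sketch:

    from itertools import product
    from fractions import Fraction

    strings = [''.join(bits) for bits in product('01', repeat=4)]

    E = {s for s in strings if s[0] == '1'}            # begins with a 1
    F = {s for s in strings if s.count('1') % 2 == 0}  # even number of 1s

    p_E = Fraction(len(E), 16)
    p_F = Fraction(len(F), 16)
    p_EF = Fraction(len(E & F), 16)
    print(p_EF == p_E * p_F)  # True, so E and F are independent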

Independence. Example: Are the events E, that a family with three children has children of both sexes, and F, that this family has at most one boy, independent? Assume that the eight ways a family can have three children are equally likely. Solution: The eight ways a family can have three children are BBB, BBG, BGB, BGG, GBB, GBG, GGB, and GGG, each with probability 1/8. E = {BBG, BGB, BGG, GBB, GBG, GGB}, F = {BGG, GBG, GGB, GGG}, and E ∩ F = {BGG, GBG, GGB}. p(E) = 6/8 = 3/4, p(F) = 4/8 = 1/2, and p(E ∩ F) = 3/8. Since p(E)p(F) = (3/4)(1/2) = 3/8, it follows that p(E ∩ F) = p(E)p(F), so E and F are independent.

Independence. Example: Assume each of the four ways a family can have two children is equally likely. Are the events E, that a family with two children has two boys, and F, that a family with two children has at least one boy, independent? Solution: Two children: BB, BG, GB, GG. E = {BB}, so p(E) = 1/4. F = {BB, BG, GB}, so p(F) = 3/4. E ∩ F = {BB}, so p(E ∩ F) = 1/4. But p(E)p(F) = (1/4)(3/4) = 3/16. Therefore p(E ∩ F) ≠ p(E)p(F), so the events E and F are not independent.

Bernoulli trials & the binomial distribution. Bernoulli trial: an experiment with only two possible outcomes. A possible outcome of a Bernoulli trial is called a success or a failure. If p is the probability of a success and q is the probability of a failure, then p + q = 1. Example: A coin is biased so that the probability of heads is 2/3. What is the probability that exactly four heads come up when the coin is flipped seven times, assuming that the flips are independent? There are 2^7 = 128 possible outcomes when the coin is flipped seven times, and C(7,4) choices of which four of the seven flips are heads. The probability is C(7,4)(2/3)^4(1/3)^3 = 35*16/3^7 = 560/2187.

Bernoulli trials & the binomial distribution. Theorem 2: The probability of exactly k successes in n independent Bernoulli trials, with probability of success p and probability of failure q = 1 − p, is b(k; n, p) = C(n, k) p^k q^(n−k). Considered as a function of k, this function is called the binomial distribution. Example: Suppose that the probability that a 0 bit is generated is 0.9, that the probability that a 1 bit is generated is 0.1, and that bits are generated independently. What is the probability that exactly eight 0 bits are generated when 10 bits are generated? Solution: Take generating a 0 as a success and generating a 1 as a failure. Then b(8; 10, 0.9) = C(10, 8)(0.9)^8(0.1)^2 ≈ 0.1937.
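A small Python sketch of the binomial distribution b(k; n, p), applied to both examples above (the helper name binomial_pmf is just for illustration):

    from math import comb

    def binomial_pmf(k, n, p):
        """b(k; n, p) = C(n, k) * p^k * (1 - p)^(n - k)."""
        return comb(n, k) * p**k * (1 - p)**(n - k)

    # Four heads in seven flips of a coin with p(heads) = 2/3.
    print(binomial_pmf(4, 7, 2/3))   # 560/2187 ≈ 0.2561

    # Exactly eight 0 bits among 10 independently generated bits, p(0) = 0.9.
    print(binomial_pmf(8, 10, 0.9))  # ≈ 0.1937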

Random variables Definition 6: A random variable is a function from the sample space of an experiment to the set of real numbers. That is, a random variable assigns a real number to each possible outcome. A random variable is a function, not a variable Example: Suppose that a coin is flipped three times. Let X(t) be the random variable that equals the number of heads that appear when t is the outcome. Then X(HHH)=3 X(HHT)=X(HTH)=X(THH)=2 X(TTH)=X(THT)=X(HTT)=1 X(TTT)=0

Random variables. Definition 7: The distribution of a random variable X on a sample space S is the set of pairs (r, p(X = r)) for all r in X(S), where p(X = r) is the probability that X takes the value r. The set of pairs in this distribution is determined by the probabilities p(X = r) for all r in X(S).

Example. Let X be the sum of the numbers that appear when a pair of dice is rolled. What are the values of this random variable for the 36 possible outcomes (i, j), where i and j are the numbers that appear on the first die and the second die, respectively, when these two dice are rolled? X((1,1))=2; X((1,2))=X((2,1))=3; X((1,3))=X((2,2))=X((3,1))=4; X((1,4))=X((2,3))=X((3,2))=X((4,1))=5; X((1,5))=X((2,4))=X((3,3))=X((4,2))=X((5,1))=6; X((1,6))=X((2,5))=X((3,4))=X((4,3))=X((5,2))=X((6,1))=7; X((2,6))=X((3,5))=X((4,4))=X((5,3))=X((6,2))=8; X((3,6))=X((4,5))=X((5,4))=X((6,3))=9; X((4,6))=X((5,5))=X((6,4))=10; X((5,6))=X((6,5))=11; X((6,6))=12.
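A short Python sketch that tabulates the distribution of X, the sum of two dice:

    from itertools import product
    from collections import Counter
    from fractions import Fraction

    # X((i, j)) = i + j for the 36 equally likely outcomes of rolling two dice.
    counts = Counter(i + j for i, j in product(range(1, 7), repeat=2))

    distribution = {r: Fraction(c, 36) for r, c in sorted(counts.items())}
    for r, prob in distribution.items():
        print(r, prob)   # e.g. 2 -> 1/36, 7 -> 1/6, 12 -> 1/36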

Section 7.3

Why do we need Bayes' theorem? To assess the probability that a particular event occurs on the basis of partial evidence. Suppose we want the probability p(F) that an event F occurs, and we know that an event E has occurred. Then the conditional probability that F occurs given that E occurs, p(F|E), is a more realistic estimate than p(F).

Example. There are two boxes; the first contains 7 red balls and 2 green balls, and the second contains 3 red balls and 4 green balls. Bob selects a ball by first choosing one of the two boxes at random, and then selecting one of the balls in that box at random. If Bob selected a red ball, what is the probability that he selected a ball from the first box?

Solution. E: a red ball is selected. Ē: a green ball is selected. F: the ball is selected from the first box. F̄: the ball is selected from the second box. We want p(F|E) = p(F ∩ E)/p(E) (conditional probability). p(E|F) = p(E ∩ F)/p(F) = 7/9 and p(F) = p(F̄) = 1/2, so p(E ∩ F) = 7/18. p(E|F̄) = p(E ∩ F̄)/p(F̄) = 3/7, so p(E ∩ F̄) = 3/14. E = (E ∩ F) ∪ (E ∩ F̄), so p(E) = p(E ∩ F) + p(E ∩ F̄) = 7/18 + 3/14 = 38/63. Therefore p(F|E) = p(F ∩ E)/p(E) = (7/18)/(38/63) = 49/76 ≈ 0.645.
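A minimal Python sketch of this computation, assuming the box contents above (7 red and 2 green in the first box, 3 red and 4 green in the second):

    from fractions import Fraction

    p_F = Fraction(1, 2)             # first box chosen
    p_E_given_F = Fraction(7, 9)     # red given first box
    p_E_given_notF = Fraction(3, 7)  # red given second box

    p_E = p_E_given_F * p_F + p_E_given_notF * (1 - p_F)   # 38/63
    p_F_given_E = p_E_given_F * p_F / p_E                  # 49/76
    print(p_E, p_F_given_E, float(p_F_given_E))            # ≈ 0.645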

Bayes' Theorem. Suppose that E and F are events from a sample space S such that p(E) ≠ 0 and p(F) ≠ 0. Then p(F|E) = p(E|F)p(F) / (p(E|F)p(F) + p(E|F̄)p(F̄)). Note: F and F̄ are mutually exclusive and cover the entire sample space S, that is, F ∪ F̄ = S.

Example. Suppose one person in 100,000 has a particular rare disease for which there is a fairly accurate diagnostic test. This test is correct 99% of the time when given to someone with the disease; it is correct 99.5% of the time when given to someone who does not have the disease. (a) What is the probability that someone who tests positive actually has the disease? (b) What is the probability that someone who tests negative does not have the disease? F: a person has the disease. E: a person tests positive. For (a): p(F|E) = p(E|F)p(F) / (p(E|F)p(F) + p(E|F̄)p(F̄)) = (0.99)(0.00001) / ((0.99)(0.00001) + (0.005)(0.99999)) ≈ 0.002. Only about 0.2% of people who test positive actually have the disease.

Example (continued). For (b), with F and E as above, we want p(F̄|Ē) = p(Ē|F̄)p(F̄) / (p(Ē|F̄)p(F̄) + p(Ē|F)p(F)) = (0.995)(0.99999) / ((0.995)(0.99999) + (0.01)(0.00001)) ≈ 0.9999999. So 99.99999% of the people who test negative really do not have the disease.
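A small Python sketch of both computations (variable names are illustrative):

    # Bayes' theorem applied to the rare-disease test.
    p_disease = 1 / 100_000      # p(F): one person in 100,000
    p_pos_given_disease = 0.99   # test correct on someone with the disease
    p_neg_given_healthy = 0.995  # test correct on someone without the disease

    p_pos_given_healthy = 1 - p_neg_given_healthy
    p_neg_given_disease = 1 - p_pos_given_disease

    # (a) p(disease | positive)
    num = p_pos_given_disease * p_disease
    p_a = num / (num + p_pos_given_healthy * (1 - p_disease))
    print(p_a)   # ≈ 0.002

    # (b) p(no disease | negative)
    num = p_neg_given_healthy * (1 - p_disease)
    p_b = num / (num + p_neg_given_disease * p_disease)
    print(p_b)   # ≈ 0.9999999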