Lecture 3: Probability

28th of October 2015

Summary of previous lecture

- Define chance experiment, sample space and event
- Introduce the concept of the probability of an event
- Show how to manipulate intersections, unions and complements of events
- Show how to compute probabilities analytically for simple chance experiments
- Define independence between events

Example: Snails

In a population of a particular species of snail, individuals exhibit different forms. It is known that
- 45% have a pink background colouring,
- 30% of individuals are striped, and
- 20% of the population are pink and striped.

Is the presence or absence of striping independent of background colour?

Let A = a snail has a pink background colouring and S = a snail has stripes. We know that P(A) = 0.45, P(S) = 0.3, and P(A ∩ S) = 0.2. Note that

0.2 = P(A ∩ S) ≠ 0.135 = P(A)P(S),

so the events A and S are not independent.
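As a quick numeric check of this comparison, here is a minimal Python sketch (the variable names are just illustrative labels, not part of the lecture):

```python
# Snail example: are "pink background" (A) and "striped" (S) independent?
p_A = 0.45          # P(A)
p_S = 0.30          # P(S)
p_A_and_S = 0.20    # P(A ∩ S)

print(p_A * p_S)                 # ≈ 0.135
print(p_A_and_S == p_A * p_S)    # False, so A and S are not independent
```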

Conditional Probability Laws

A = a snail has a pink background colouring, S = a snail has stripes.

What is the probability that a snail has stripes GIVEN that it is pink? This is a conditional probability. We write this probability as P(S | A) (pronounced "probability of S given A").

Think of P(B | A) as how much of A is taken up by B.

[Venn diagram: the sample space S divided into the four regions A ∩ B, A ∩ B^c, A^c ∩ B and A^c ∩ B^c]

Then we see that

P(B | A) = P(A ∩ B) / P(A)

Example (continued)

A = a snail has a pink background colouring, S = a snail has stripes.

What is the probability that a snail has stripes GIVEN that it is pink?

P(S | A) = P(S ∩ A) / P(A) = 0.2 / 0.45 ≈ 0.44 > P(S) = 0.3

Thus, knowledge that a snail has a pink background colouring increases the probability that it is striped.
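The same conditional probability can be computed directly; a short Python sketch (illustrative variable names):

```python
# P(S | A) = P(S ∩ A) / P(A) for the snail example
p_A = 0.45
p_S_and_A = 0.20

p_S_given_A = p_S_and_A / p_A
print(p_S_given_A)    # ≈ 0.444, larger than P(S) = 0.3
```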

Independence of Events

Definition: Two events A and B are said to be independent if P(A ∩ B) = P(A)P(B).

Note that in this case, if A and B are independent then (provided P(B) > 0)

P(A | B) = P(A ∩ B) / P(B) = P(A)P(B) / P(B) = P(A),

and similarly P(B | A) = P(B) (provided P(A) > 0). So for independent events, knowledge that one of the events has occurred does not change our assessment of the probability that the other event has occurred. Equivalently, if the probability that one event A occurs doesn't affect the probability that the event B also occurs, then we say that A and B are independent.

General Multiplication Law

We can rearrange the conditional probability law to obtain a general Multiplication Law:

P(B | A) = P(A ∩ B) / P(A)   implies   P(B | A)P(A) = P(A ∩ B)

Similarly, P(A | B)P(B) = P(A ∩ B). Then

P(A ∩ B) = P(B | A)P(A) = P(A | B)P(B)

The Partition Law

Suppose we know that P(A ∩ B) = 0.52 and P(A ∩ B^c) = 0.14. What is P(A)?

[Venn diagram: the event A split into the two disjoint pieces A ∩ B and A ∩ B^c]

So we have the rule

P(A) = P(A ∩ B) + P(A ∩ B^c) = 0.52 + 0.14 = 0.66

More generally, events are mutually exclusive if at most one of the events can occur in a given experiment. Suppose E_1, ..., E_n are mutually exclusive events which together form the whole sample space: E_1 ∪ E_2 ∪ ... ∪ E_n = S. (In other words, every possible outcome is in exactly one of the E_i.) Then

P(A) = Σ_{i=1}^{n} P(A ∩ E_i) = Σ_{i=1}^{n} P(A | E_i) P(E_i)
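Read as code, the Partition Law looks as follows. This is only an illustrative sketch: the helper total_probability and the numbers in the last line are made up for the demonstration.

```python
# Partition Law / law of total probability:
#   P(A) = sum_i P(A | E_i) * P(E_i),  where E_1, ..., E_n partition S.
def total_probability(p_A_given_E, p_E):
    """Illustrative helper: p_A_given_E[i] = P(A | E_i), p_E[i] = P(E_i)."""
    return sum(pa * pe for pa, pe in zip(p_A_given_E, p_E))

# Hypothetical two-event example: P(A | E_1) = 0.6, P(A | E_2) = 0.1,
# P(E_1) = 0.3, P(E_2) = 0.7.
print(total_probability([0.6, 0.1], [0.3, 0.7]))   # 0.25
```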

Example: Mendelian segregation

At a particular locus in humans, there are two alleles A and B, and it is known that the population frequencies of the genotypes AA, AB and BB are 0.49, 0.42 and 0.09, respectively. An AA man has a child with a woman whose genotype is unknown. What is the probability that the child will have genotype AB?

We assume that, as far as her genotype at this locus is concerned, the woman is chosen randomly from the population.

Use the partition rule, where the partition corresponds to the three possible genotypes for the woman. Then

P(child AB) = P(child AB and mother AA) + P(child AB and mother AB) + P(child AB and mother BB)
            = P(mother AA)P(child AB | mother AA) + P(mother AB)P(child AB | mother AB) + P(mother BB)P(child AB | mother BB)
            = 0.49 × 0 + 0.42 × 0.5 + 0.09 × 1
            = 0.3
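A small Python check of this sum (the dictionaries and their keys are just an illustrative way of laying out the numbers from the slide):

```python
# Mendelian example: partition over the mother's genotype (the father is AA).
p_mother = {"AA": 0.49, "AB": 0.42, "BB": 0.09}        # P(mother's genotype)
p_child_AB_given = {"AA": 0.0, "AB": 0.5, "BB": 1.0}   # P(child AB | mother's genotype)

p_child_AB = sum(p_mother[g] * p_child_AB_given[g] for g in p_mother)
print(p_child_AB)   # 0.3 (up to floating-point rounding)
```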

Bayes Rule

Bayes Rule is a very powerful probability law. An example from medicine: we want to calculate the probability that a person has a disease D given that they have a specific symptom S, i.e. we want to calculate P(D | S). This is a difficult task, as we would need to take a random sample of people from the population with the symptom. A probability that is much easier to calculate is P(S | D), i.e. the probability that a person with the disease has the symptom. This probability can be assigned much more easily, as medical records for people with serious diseases are kept. The power of Bayes Rule is its ability to take P(S | D) and calculate P(D | S).

We have already seen a version of Bayes Rule:

P(B | A) = P(A ∩ B) / P(A)

Using the Multiplication Law we can rewrite this as

P(B | A) = P(A | B)P(B) / P(A)

This is Bayes Rule.

Suppose P(S | D) = 0.12, P(D) = 0.01 and P(S) = 0.03. Then

P(D | S) = (0.12 × 0.01) / 0.03 = 0.04
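A numeric check of this example in Python (the variable names are just labels for the quantities on the slide):

```python
# Bayes Rule: P(D | S) = P(S | D) * P(D) / P(S)
p_S_given_D = 0.12
p_D = 0.01
p_S = 0.03

print(p_S_given_D * p_D / p_S)   # ≈ 0.04
```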

Example: Genetic testing

A gene has two possible types, A_1 and A_2. 75% of the population have A_1. B is a disease that has 3 forms: B_1 (mild), B_2 (severe) and B_3 (lethal). A_1 is a protective gene, with the probabilities of having the three forms given A_1 being 0.9, 0.1 and 0 respectively. People with A_2 are unprotected and have the three forms with probabilities 0, 0.5 and 0.5 respectively. What is the probability that a person has gene A_1 given they have the severe disease?

The first thing to do with such a question is to decode the information, i.e. write it down in a compact form we can work with.

P(A_1) = 0.75,  P(A_2) = 0.25
P(B_1 | A_1) = 0.9,  P(B_2 | A_1) = 0.1,  P(B_3 | A_1) = 0
P(B_1 | A_2) = 0,    P(B_2 | A_2) = 0.5,  P(B_3 | A_2) = 0.5

We want P(A_1 | B_2).

From Bayes Rule we know that

P(A_1 | B_2) = P(B_2 | A_1)P(A_1) / P(B_2)

We know P(B_2 | A_1) and P(A_1), but what is P(B_2)? We can use the Partition Law, since A_1 and A_2 are mutually exclusive and together cover the whole sample space.

P(B_2) = P(B_2 ∩ A_1) + P(B_2 ∩ A_2)              (Partition Law)
       = P(B_2 | A_1)P(A_1) + P(B_2 | A_2)P(A_2)  (Multiplication Law)
       = 0.1 × 0.75 + 0.5 × 0.25 = 0.2

We can now use Bayes Rule to calculate P(A_1 | B_2):

P(A_1 | B_2) = (0.1 × 0.75) / 0.2 = 0.375
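Putting the Partition Law and Bayes Rule together for this example, as a short Python sketch (illustrative variable names):

```python
# Genetic testing example: P(A1 | B2) via the Partition Law and Bayes Rule.
p_A1, p_A2 = 0.75, 0.25
p_B2_given_A1, p_B2_given_A2 = 0.1, 0.5

p_B2 = p_B2_given_A1 * p_A1 + p_B2_given_A2 * p_A2   # Partition Law: 0.2
p_A1_given_B2 = p_B2_given_A1 * p_A1 / p_B2          # Bayes Rule: 0.375
print(p_B2, p_A1_given_B2)
```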

Permutations and Combinations (probabilities of patterns)

In some situations we observe a specific pattern from a large number of possible patterns. To calculate the probability of the pattern we need to count the total number of patterns. This is why we need to learn about permutations and combinations.

Permutations of n objects

Consider 2 objects: A B
Q. How many ways can they be arranged, i.e. how many permutations are there?
A. 2 ways: AB BA

Consider 3 objects: A B C
Q. How many ways can they be arranged (permuted)?
A. 6 ways: ABC ACB BCA BAC CAB CBA

Consider 4 objects: A B C D
Q. How many ways can they be arranged (permuted)?
A. 24 ways

There is a pattern emerging here.

No. of objects:        2   3   4    5     6    ...
No. of permutations:   2   6   24   120   720  ...

Can we find a formula for the number of permutations of n objects?

Suppose we have 5 objects. How many different ways can we place them into 5 boxes?

There are 5 choices of object for the first box: 5
There are now only 4 objects to choose from for the second box: 5 × 4
There are 3 choices for the 3rd box, 2 for the 4th and 1 for the 5th box: 5 × 4 × 3 × 2 × 1

Thus, the number of permutations of 5 objects is 5 × 4 × 3 × 2 × 1.

In general, the number of permutations of n objects is

n(n-1)(n-2) ... (3)(2)(1)

We write this as n! (pronounced "n factorial"). There should be a button on your calculator that calculates factorials.
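A quick way to check the factorial formula against an explicit count of arrangements, using Python's standard library:

```python
import math
from itertools import permutations

n = 5
print(math.factorial(n))                    # 120, from the formula n!
print(len(list(permutations("ABCDE"))))     # 120, counting the arrangements explicitly
```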

Permutations of r objects from n

Now suppose we have 4 objects and only 2 boxes. How many permutations of 2 objects are there when we have 4 to choose from?

There are 4 choices for the first box and 3 choices for the second box: 4 × 3

So there are 12 permutations of 2 objects from 4. We write this as 4P2 = 12.

We say there are nPr permutations of r objects chosen from n. The formula for nPr is given by

nPr = n! / (n - r)!

To see why this works, consider the example above, 4P2. Using the formula we get

4P2 = 4! / 2! = (4 × 3 × 2 × 1) / (2 × 1) = 4 × 3 = 12
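The nPr formula written as a short Python function; the helper name n_P_r is just for illustration, and the result is checked against the standard-library function math.perm (available from Python 3.8):

```python
import math

def n_P_r(n, r):
    # Illustrative helper: number of ordered arrangements of r objects chosen from n.
    return math.factorial(n) // math.factorial(n - r)

print(n_P_r(4, 2))        # 12
print(math.perm(4, 2))    # 12, using the built-in helper (Python 3.8+)
```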

Combinations of r objects from n

Now consider the number of ways of choosing 2 objects from 4 when the order doesn't matter. We just want to count the number of possible combinations.

We know that there are 12 permutations when choosing 2 objects from 4. These are

AB AC AD BC BD CD
BA CA DA CB DB DC

Notice how the permutations are grouped in 2s which are the same combination of letters. Thus there are 12/2 = 6 possible combinations:

AB AC AD BC BD CD

We write this as 4C2 = 6.

We say there are nCr combinations of r objects chosen from n. The formula for nCr is given by

nCr = n! / ((n - r)! r!)
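Likewise for nCr; the helper name n_C_r is illustrative, and the result is checked against math.comb (Python 3.8+):

```python
import math

def n_C_r(n, r):
    # Illustrative helper: number of unordered selections of r objects chosen from n.
    return math.factorial(n) // (math.factorial(n - r) * math.factorial(r))

print(n_C_r(4, 2))        # 6
print(math.comb(4, 2))    # 6, using the built-in helper (Python 3.8+)
```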

Example: the National Lottery

In the National Lottery you need to choose 6 balls from 49. What is the probability that I choose all 6 balls correctly?

There are 2 ways of answering this question:
(i) using permutations and combinations
(ii) using a tree diagram

Method 1 - using permutations and combinations

P(6 correct) = (no. of ways of choosing the 6 correct balls) / (no. of ways of choosing 6 balls)
             = 6P6 / 49P6
             = (6! / 0!) / (49! / 43!)
             = (6 × 5 × 4 × 3 × 2 × 1) / (49 × 48 × 47 × 46 × 45 × 44)
             = 0.0000000715112 (about 1 in 14 million)
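The same lottery probability in Python, using the fact that 6P6 / 49P6 equals 1 / 49C6 (standard-library calls, Python 3.8+):

```python
import math

# Method 1: P(6 correct) = 6P6 / 49P6, which is the same as 1 / 49C6.
p = math.perm(6, 6) / math.perm(49, 6)
print(p)                      # ≈ 7.15e-08
print(1 / math.comb(49, 6))   # same value: 1 / 13,983,816 (about 1 in 14 million)
```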

Method 2 - using a tree diagram

Consider the first ball I choose: the probability it is correct is 6/49. The second ball I choose is correct with probability 5/48. The third ball I choose is correct with probability 4/47, and so on. Thus the probability that I get all 6 balls correct is

(6/49) × (5/48) × (4/47) × (3/46) × (2/45) × (1/44) = 0.0000000715112 (about 1 in 14 million)

Puzzle 2

You have reached the final of a game show. The host shows you three doors and tells you that there is a prize behind one of the doors. You pick a door. The host then opens one of the doors you did not pick: there is no prize behind this door. He then asks you if you want to change from the door you chose to the other remaining door. Is it to your advantage to switch your choice?

Assume that you picked the first door. The prize has equal probability of being behind each of the three doors. Let's study the three cases (depending on where the prize is) individually and investigate whether it is better to keep your door choice or to switch.

Case 1: the prize is behind the first door.

The host has two choices: he can tell you that the prize is not behind the second door, or tell you that it is not behind the third door. In this case, a player who stays with his initial choice wins, and a player who changes his initial choice loses.

Case 2: the prize is behind the second door.

The host has no choice: he can only tell you that the prize is not behind the third door. In this case, a player who stays with his initial choice loses, and a player who changes his initial choice wins.

Case 3: the prize is behind the third door.

Again, the host has no choice: he can only tell you that the prize is not behind the second door. In this case, a player who stays with his initial choice loses, and a player who changes his initial choice wins.

To summarize:

Case 1: the winning strategy is to stay with the initial choice
Case 2: the winning strategy is to change door choice
Case 3: the winning strategy is to change door choice

A player who stays with the initial choice wins in only one out of three of these equally likely possibilities, while a player who switches wins in two out of three. It is possible to obtain the same solution more formally using conditional probabilities and Bayes formula. See, for example, https://en.wikipedia.org/wiki/Monty_Hall_problem.
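A simple simulation also confirms the 1/3 versus 2/3 answer. This is just a sketch: the function play and its parameters are made up for the illustration.

```python
import random

def play(switch, trials=100_000):
    # Simulate the game repeatedly and return the fraction of wins.
    wins = 0
    for _ in range(trials):
        prize = random.randrange(3)
        pick = random.randrange(3)
        # The host opens a door that is neither the player's pick nor the prize.
        opened = next(d for d in range(3) if d != pick and d != prize)
        if switch:
            # Switch to the one remaining closed door.
            pick = next(d for d in range(3) if d != pick and d != opened)
        wins += (pick == prize)
    return wins / trials

print(play(switch=False))   # ≈ 0.33
print(play(switch=True))    # ≈ 0.67
```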