This work is licensed under a Creative Commons Attribution-NonCommercial-ShareAlike License. Your use of this material constitutes acceptance of that license and the conditions of use of materials on this site. Copyright 2006, The Johns Hopkins University and Karl Broman. All rights reserved. Use of these materials permitted only in accordance with license rights granted. Materials provided AS IS ; no representations or warranties provided. User assumes all responsibility for use, and all liability related thereto, and must independently review all materials for accuracy and efficacy. May contain materials owned by others. User is responsible for obtaining permissions for use from third parties as needed.

The setup

Random experiment: a well-defined process with an uncertain outcome (for example, toss three fair coins).
Sample space, S: the set of possible outcomes (e.g., {HHH, HHT, HTH, THH, HTT, THT, TTH, TTT}).
Event: a set of outcomes (a subset of S) (e.g., A = {exactly one head} = {HTT, THT, TTH}).
An event is said to have occurred if one of the outcomes it contains occurs.

Basic rules of probability
1. 0 ≤ Pr(A) ≤ 1, for any event A.
2. Pr(S) = 1, for the sample space S.
3. If A and B are mutually exclusive, Pr(A or B) = Pr(A) + Pr(B).

More rules
Pr(not A) = 1 - Pr(A)
Pr(A or B) = Pr(A) + Pr(B) - Pr(A and B)

Conditional probability
Pr(A | B) = probability of A given B = Pr(A and B) / Pr(B), provided Pr(B) > 0.

Independence
Events A and B are independent if Pr(A and B) = Pr(A) Pr(B); equivalently, Pr(A | B) = Pr(A) and Pr(B | A) = Pr(B).

Still more rules
Pr(A and B) = Pr(B) Pr(A | B) = Pr(A) Pr(B | A)
Pr(A) = Pr(A and B) + Pr(A and not B) = Pr(B) Pr(A | B) + Pr(not B) Pr(A | not B)

Bayes' rule
Pr(A | B) = Pr(A) Pr(B | A) / Pr(B) = Pr(A) Pr(B | A) / [Pr(A) Pr(B | A) + Pr(not A) Pr(B | not A)]
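For a finite sample space with equally likely outcomes, all of these rules can be checked by direct enumeration. Here is a minimal Python sketch using the three-coin example (the helper `pr` and the event `B` are ours, introduced for illustration):

```python
from itertools import product
from fractions import Fraction

# Enumerate the sample space for three fair coin tosses: 8 equally likely outcomes.
S = ["".join(t) for t in product("HT", repeat=3)]

def pr(event):
    """Probability of an event (a set of outcomes) under equal likelihood."""
    return Fraction(len(event), len(S))

A = {s for s in S if s.count("H") == 1}   # exactly one head
B = {s for s in S if s[0] == "H"}         # first toss is a head (our example event)

# Addition rule: Pr(A or B) = Pr(A) + Pr(B) - Pr(A and B)
assert pr(A | B) == pr(A) + pr(B) - pr(A & B)

# Conditional probability: Pr(A | B) = Pr(A and B) / Pr(B)
pr_A_given_B = pr(A & B) / pr(B)

# Bayes' rule: Pr(A | B) = Pr(A) Pr(B | A) / Pr(B)
pr_B_given_A = pr(A & B) / pr(A)
assert pr_A_given_B == pr(A) * pr_B_given_A / pr(B)

print(pr(A), pr(B), pr_A_given_B)  # 3/8 1/2 1/4
```

Using `Fraction` rather than floats keeps every probability exact, so the identities hold with `==` rather than approximately.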

Statistics for laboratory scientists, Lecture 6

Examples

1. Mendel's peas. Mendel's peas had either purple or white flowers. Flower color was due to a single gene with the purple allele (A) dominant to the white allele (a). The F1 hybrid (obtained by crossing two pure-breeding lines, one with purple and one with white flowers) has purple flowers and is heterozygous, Aa.

(a) Self the F1, pick an F2 seed at random and grow it up.
i. Pr(F2 plant has white flowers) = (1/2) × (1/2) = 1/4 = 25%.
ii. Pr(F2 plant has purple flowers) = 1 - (1/4) = 3/4 = 75%.
iii. Pr(F2 plant has genotype AA) = 1/4 = 25%.
iv. Pr(F2 plant has genotype AA | it has purple flowers) = Pr(F2 has genotype AA and purple flowers) / Pr(F2 has purple flowers) = (1/4) / (3/4) = 1/3 ≈ 33%.

(b) Self the F2. Grow up ten of the F3 seeds at random. To save writing, define the following events:
P2 = {F2 has purple flowers}
Ho = {F2 has genotype AA}
He = {F2 has genotype Aa}
P3 = {all ten F3 plants have purple flowers}
Note that P2 = He or Ho. You may want to write out the following using words; the notation makes things compact but also somewhat harder to follow.
i. Pr(P2) = 1 - (1/4) = 3/4 = 75%.
ii. Pr(P3 | Ho) = 100%.
iii. Pr(P3 | He) = (3/4)^10 ≈ 5.6%.
iv. Pr(P3 | P2) = Pr(P3 and P2) / Pr(P2)
= {Pr(P3 and Ho) + Pr(P3 and He)} / Pr(P2)
= {Pr(Ho) Pr(P3 | Ho) + Pr(He) Pr(P3 | He)} / Pr(P2)
= {(1/4) × 1 + (1/2) × (3/4)^10} / (3/4) ≈ 37%
Alternatively,
Pr(P3 | P2) = Pr(P3 and Ho | P2) + Pr(P3 and He | P2)
= Pr(Ho | P2) Pr(P3 | Ho and P2) + Pr(He | P2) Pr(P3 | He and P2)
= (1/3) × 1 + (2/3) × (3/4)^10 ≈ 37%
v. Pr(Ho | P3, P2) = Pr(Ho and P3 | P2) / Pr(P3 | P2)
= Pr(P3 | Ho and P2) Pr(Ho | P2) / Pr(P3 | P2)
= 1 × (1/3) / [1 × (1/3) + (3/4)^10 × (2/3)] ≈ 90%.
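The law-of-total-probability calculation in part (b) can be reproduced exactly in a few lines of Python (the dictionary names here are ours, not from the notes):

```python
from fractions import Fraction

F = Fraction

# F2 genotype probabilities from selfing the Aa F1 hybrid.
pr_geno = {"AA": F(1, 4), "Aa": F(1, 2), "aa": F(1, 4)}

# Pr(all ten F3 plants have purple flowers | F2 genotype):
# an AA parent gives only purple offspring; an Aa parent gives purple w.p. 3/4 each.
pr_P3_given = {"AA": F(1), "Aa": F(3, 4) ** 10, "aa": F(0)}

# Pr(P3 and P2): only AA and Aa F2 parents have purple flowers.
pr_P3_and_P2 = sum(pr_geno[g] * pr_P3_given[g] for g in ("AA", "Aa"))
pr_P2 = pr_geno["AA"] + pr_geno["Aa"]                    # 3/4

pr_P3_given_P2 = pr_P3_and_P2 / pr_P2                    # about 37%
pr_Ho_given_P3_P2 = pr_geno["AA"] / pr_P3_and_P2         # about 90%

print(float(pr_P3_given_P2), float(pr_Ho_given_P3_P2))
```

Note how observing that all ten F3 plants are purple pushes the probability that the F2 parent is homozygous from 1/3 up to about 90%.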

2. Suppose a test for HIV correctly gives a positive result, if a person is infected, with probability 99.5%, and correctly gives a negative result, if a person is not infected, with probability 98%.

(a) Suppose that 0.1% of a population are infected with HIV. Consider drawing a person at random and testing him or her for HIV infection. Calculate Pr(infected | test is positive).
Let I = {the person is infected} and P = {the person tests positive}. From the information above on the sensitivity and specificity of the test, we have Pr(P | I) = 0.995 and Pr(not P | not I) = 0.98. From this information, Pr(not P | I) = 1 - Pr(P | I) = 0.005 and Pr(P | not I) = 1 - Pr(not P | not I) = 0.02. Note further that Pr(I) = 0.001, and so Pr(not I) = 0.999.
We seek to calculate Pr(I | P). We use Bayes' rule. (Why use Bayes' rule here? Because we want to turn around the conditioning: we want to write Pr(I | P) in terms of things like Pr(P | I).)
Pr(I | P) = Pr(I) Pr(P | I) / [Pr(I) Pr(P | I) + Pr(not I) Pr(P | not I)]
= 0.001 × 0.995 / (0.001 × 0.995 + 0.999 × 0.02) ≈ 4.7%

(b) Consider a person drawn from a high-risk group, so that they have, a priori, probability 30% of being infected. Calculate Pr(infected | test is positive).
In this case, we set Pr(I) = 0.3 and Pr(not I) = 1 - Pr(I) = 0.7. Thus,
Pr(I | P) = 0.3 × 0.995 / (0.3 × 0.995 + 0.7 × 0.02) ≈ 96%
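Since parts (a) and (b) differ only in the prior, the whole calculation can be packaged as one small function; this Python sketch is ours (the function name and its defaults are not from the notes, though the defaults match the test described above):

```python
def posterior_infected(prior, sensitivity=0.995, specificity=0.98):
    """Pr(infected | positive test) via Bayes' rule.

    sensitivity = Pr(positive | infected)
    specificity = Pr(negative | not infected)
    """
    false_positive_rate = 1 - specificity  # Pr(positive | not infected)
    numerator = prior * sensitivity
    return numerator / (numerator + (1 - prior) * false_positive_rate)

print(posterior_infected(0.001))  # general population: about 0.047
print(posterior_infected(0.3))    # high-risk group: about 0.955
```

This makes the lecture's point concrete: with a rare disease, even a very accurate test yields mostly false positives, and the posterior is driven largely by the prior.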

3. The Monty Hall problem. You're on a game show, and are presented with three doors. One hides a car; the other two hide goats. You're allowed to choose a door. Monty Hall then opens one of the other two doors to reveal a goat. You are now allowed either to stick with the door you chose initially or to switch to the other closed door. What should you do (assuming that you are hoping to get the car, and not a goat): stick with your original choice, or switch?
The answer depends on Monty's behavior.
(a) If you choose the door with the car, Monty opens one of the other two doors at random. If you choose a door hiding a goat, Monty opens the other door with a goat.
(b) Monty Hall opens one of the other two doors at random. If he had revealed the car, you would have lost, but you happen to be in the situation where he revealed a goat.
Let us call the doors A, B and C, and let us assume (without loss of generality, as mathematicians like to say) that you choose door A. Moreover, let us initially not condition on Monty revealing the goat. There are six possible outcomes of the experiment:
{ C[G]G, CG[G], G[C]G, GC[G], G[G]C, GG[C] }
(The three letters denote the objects behind the three doors, so that CGG means the car is behind door A. The brackets indicate which door Monty opens. Since you chose door A, he won't be opening that door, but one of the other two.)
In scenario (a), Monty always opens a door with a goat. Thus, if the door you chose hid the car (that is, if the car was behind door A), he'll choose at random between doors B and C. On the other hand, if the car is behind door B, he'll always open door C, and vice versa. Thus, under scenario (a), the events {C[G]G} and {CG[G]} each have probability 1/6; the events {GC[G]} and {G[G]C} each have probability 1/3; and the events {G[C]G} and {GG[C]} each have probability 0.
In scenario (b), in which Monty chooses at random between the two remaining doors, the six outcomes listed above each have probability 1/6.
Now, define the events
A = {the car is behind door A} = {C[G]G, CG[G]},
B = {the car is behind door B} = {G[C]G, GC[G]},
C = {the car is behind door C} = {G[G]C, GG[C]}, and
D = {Monty opens a door with a goat} = {C[G]G, CG[G], GC[G], G[G]C}.
Our objective: we want to know the chance that the door we originally chose hides the car, given that Monty has opened a door revealing a goat. This is Pr(A | D). If this probability is < 1/2, we should switch doors; if it is > 1/2, we should stick with our first choice; and if it is = 1/2, it doesn't matter whether we switch or stick.
The event "A and D" (the intersection of A and D) is simply event A, and Pr(A) = 1/3 in either case (a) or (b), so Pr(A and D) = 1/3.
In case (a), Pr(D) = 1, and so Pr(A | D) = 1/3. We should switch doors.
In case (b), Pr(D) = 2/3, and so Pr(A | D) = 1/2. It doesn't matter what we do.
The appropriate action to take depends on what Monty is doing!
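Both scenarios can be checked by simulation. In this Python sketch (ours, not from the notes), you always pick door 0, and we estimate the probability that sticking wins; in the random-Monty scenario we condition on a goat being revealed by discarding the trials where he exposes the car:

```python
import random

def play(monty_random, trials=100_000, seed=1):
    """Estimate Pr(original door hides the car | Monty revealed a goat).

    monty_random=False: Monty knowingly opens a goat door (scenario a).
    monty_random=True:  Monty opens door 1 or 2 at random; trials where
                        he reveals the car are discarded (scenario b).
    """
    rng = random.Random(seed)
    stick_wins = kept = 0
    for _ in range(trials):
        car = rng.randrange(3)          # door hiding the car; you chose door 0
        if monty_random:
            opened = rng.choice([1, 2])
            if opened == car:           # he revealed the car: not our situation
                continue
        # (if Monty knows, he always manages to open a goat door)
        kept += 1
        stick_wins += (car == 0)        # sticking wins iff door 0 hides the car
    return stick_wins / kept

print(play(monty_random=False))  # about 1/3: switching is better
print(play(monty_random=True))   # about 1/2: it doesn't matter
```

The simulation agrees with the calculation: the value of switching depends entirely on whether Monty's reveal was informed or lucky.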

4. Mendel, revisited. Mendel's peas had either purple or white flowers; flower color is due to a single gene, for which the purple allele (A) is dominant to the white allele (a). We cross two pure-breeding lines (one purple and one white) to produce the F1 hybrid. We self the F1 and choose an F2 seed at random. We grow and self the F2 and choose two F3 seeds at random. Consider the following events.
P = {F2 has purple flowers}
O = {F2 is homozygous AA}
E = {F2 is heterozygous Aa}
W = {F2 has white flowers}
A1 = {F3 number 1 has purple flowers}
A2 = {F3 number 2 has purple flowers}
Are A1 and A2 independent? A1 and A2 are not independent! A1 and A2 are conditionally independent, given the genotype of the F2 plant. But, as we will see, in the absence of information about the F2 parent's genotype, the flower colors of the F3 plants are not independent.
First, note that the events E, O and W are mutually exclusive, with Pr(E) = 1/2, Pr(O) = 1/4 and Pr(W) = 1/4, so that Pr(E) + Pr(O) + Pr(W) = 1.
Pr(A1) = Pr(A2) = Pr(A1 | E) Pr(E) + Pr(A1 | O) Pr(O) + Pr(A1 | W) Pr(W) = (3/4) × (1/2) + 1 × (1/4) + 0 × (1/4) = 5/8 ≈ 63%.
Pr(A1 and A2) = Pr(A1 and A2 | E) Pr(E) + Pr(A1 and A2 | O) Pr(O) + Pr(A1 and A2 | W) Pr(W) = (3/4)^2 × (1/2) + 1 × (1/4) + 0 × (1/4) = 17/32 ≈ 53%.
Thus Pr(A2 | A1) = Pr(A1 and A2) / Pr(A1) = (17/32) / (5/8) = 17/20 = 85%, which is considerably greater than Pr(A2). And so, A1 and A2 are not independent.
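The key step is that, given the F2 genotype, the two F3 flower colors are independent, so the joint probability is a mixture of squared conditionals. A short exact check in Python (the variable names are ours):

```python
from fractions import Fraction

F = Fraction

# F2 genotype distribution, and Pr(an F3 offspring is purple | F2 genotype).
pr_geno = {"AA": F(1, 4), "Aa": F(1, 2), "aa": F(1, 4)}
pr_purple = {"AA": F(1), "Aa": F(3, 4), "aa": F(0)}

# Marginal: Pr(A1), mixing over the F2 genotype.
pr_A1 = sum(pr_geno[g] * pr_purple[g] for g in pr_geno)              # 5/8

# Joint: the F3 colors are conditionally independent given the genotype,
# so each mixture component contributes pr_purple(g) squared.
pr_A1_and_A2 = sum(pr_geno[g] * pr_purple[g] ** 2 for g in pr_geno)  # 17/32

print(pr_A1, pr_A1_and_A2, pr_A1 * pr_A1)  # 5/8 17/32 25/64
print(pr_A1_and_A2 == pr_A1 * pr_A1)       # False: A1 and A2 are not independent
```

Since 17/32 > 25/64, the two events are positively associated: a purple sibling makes it more likely the shared F2 parent carries two purple-friendly alleles, and hence more likely the other sibling is purple too.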