Basic Probabilities

The probabilities that we'll be learning about build on the set theory that we learned last class, only this time the sets are specifically sets of events. What are events? Roughly, events are ways things can go. So, if I'm tossing a coin, there are two ways things can go: namely, it can land heads or it can land tails. The total list of possible ways things can go with respect to a given occurrence is known as the event space. In other words, the event space is equal to the set of all ways things can go. So, in the case of a single coin toss, this would be equal to the set {H, T}. I like to visualise the event space as a table (the shaded area is the entire event space):

    Toss 1
    +---+---+
    | H | T |
    +---+---+
      Fig. 1

I emphasised "single" above to raise the following point: events are not simply identical with their outcomes; it matters how the events came about. To illustrate, imagine I'm performing two tosses of a single coin. It is tempting to think that the event space should look like this, {(H, H), (T, T), (H, T)}, i.e. that there are only three possible ways things can go:

    +--------+--------+--------+
    | (H, H) | (T, T) | (H, T) |
    +--------+--------+--------+
               Fig. 2

After all, what difference does the order in which the results land make? Isn't it just the same if the coin lands heads first as if it lands tails first? In a word, no. It's not at all the same where probabilities are concerned! What is important is that there are two different ways in which the (H, T) outcome can come about. This is why I called events "ways things can go": this includes how they get there.

Thus, the event space for two coin tosses is accurately as follows: {(H, H), (H, T), (T, H), (T, T)}

                    Toss 2
                    H        T
                +--------+--------+
    Toss 1   H  | (H, H) | (H, T) |
                +--------+--------+
             T  | (T, H) | (T, T) |
                +--------+--------+
                     Fig. 3

Another way to visualise the event space that might help make the previous point clear is in terms of branching possibilities. Consider a single coin toss. In the following, the vertical axis marks the progression of time. Imagining yourself at a time prior to the toss, we have two possible results:

    Time         TOSS
      |         /    \
      |        /      \
      v     HEADS    TAILS

             Fig. 4

Now, let's imagine ourselves at a time prior to the toss again, but this time take into consideration the second toss. We don't yet know how the first toss will go, so we can't eliminate either of those branches yet. We have to reason as follows: supposing the first toss lands heads, the second could land either heads or tails; or, supposing the first toss lands tails, the second could land either heads or tails. Visually, we represent this as follows:

    Time              TOSS
      |              /    \
      |             /      \
      |         HEADS      TAILS
      |         /   \      /   \
      v     HEADS  TAILS HEADS  TAILS

                   Fig. 5
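If you like checking this sort of thing mechanically, here is a small Python sketch (purely illustrative; nothing in the notes depends on it, and the variable names just mirror the terms used above). It builds the same four-outcome event space that the branching chart arrives at:

    from itertools import product

    # Each toss can land heads ("H") or tails ("T").
    single_toss = ["H", "T"]

    # Every ordered pair of results: exactly the four leaves
    # of the branching chart in Fig. 5.
    event_space = list(product(single_toss, repeat=2))

    print(event_space)
    # [('H', 'H'), ('H', 'T'), ('T', 'H'), ('T', 'T')]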

Using either the table or the branching chart can help to make calculating probabilities much easier. (NB: the branching chart will not be feasible for more cumbersome probabilities, like the Monday Girl case.) In general, when we ask for the probability of some A here, we are asking for the ratio of the number of events where A comes about to the number of events in the event space; i.e.

     # of events where A occurs            # of ways A can come about
    ------------------------------   OR   ----------------------------
    # of events in the event space          # of ways things can go

                                 Fig. 6

So, for instance, in the case of the coin tossed twice, the probability of the coin's landing heads at least once can be calculated by looking at all of the events in our event space, and tallying up the number of instances in which there is at least one heads. Recall Fig. 3, where green represented the total event space:

    [Fig. 3, repeated: the full two-toss event space (H, H), (H, T), (T, H), (T, T), shaded green]

Now, consider Fig. 7, where orange represents all of the events with which we're concerned (i.e. all those that satisfy "heads lands at least once"):

    [Fig. 7: the same table, with the events satisfying "heads lands at least once" -- (H, H), (H, T) and (T, H) -- shaded orange]

So, the probability of heads landing at least once is going to be equal to:

    # of orange squares (i.e. # of events satisfying the description)
    ------------------------------------------------------------------
    # of green squares (i.e. # of events in the event space)

which equals 3/4.
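The same tally can be done mechanically. Here is a small Python sketch (again purely illustrative) that counts the orange squares against the green ones:

    from itertools import product

    event_space = list(product("HT", repeat=2))   # the green squares (Fig. 3)

    # The orange squares: events in which heads lands at least once.
    at_least_one_heads = [e for e in event_space if "H" in e]

    print(len(at_least_one_heads) / len(event_space))   # 0.75, i.e. 3/4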

Conditional Probabilities

Conditional probabilities are really no different to basic probabilities, for our purposes; the only thing that changes is the reference class, or in more practical terms, the denominator of our fractions from above. In these cases, instead of asking about the probability of A given all the ways things can go, we are concerned with the probability of A given some subset of the ways things can go. In practice, this means counting a different set of events for our denominator (usually one that is a proper subset of, or some selection of, the event space). So, imagine we want to know the probability of heads landing at least once given that tails lands at least once. We begin, as before, by returning to Fig. 3:

    [Fig. 3, repeated: the full two-toss event space, shaded green]

But this time, we need to check the number of events that satisfy the "given that" clause first! In this example, we want to know the probability of something given that tails lands at least once. So let's find all of the events in which tails lands at least once:

    [Fig. 8: the same table, with the events in which tails lands at least once -- (H, T), (T, H) and (T, T) -- shaded a lighter green]

I've shaded these boxes in a lighter green deliberately. If you like, with respect to the conditional probability we are trying to calculate, this is our new, restricted event space. We're now really only concerned with the following:

    [Fig. 8.1: only the restricted event space (H, T), (T, H), (T, T)]

Given this new restricted "given that" space, we now want to know how many of these events satisfy the description "heads lands at least once". And this time, our answer is 2 (rather than 3, as in the first example).

    [Fig. 9: within the restricted space, the events satisfying "heads lands at least once" -- (H, T) and (T, H) -- shaded orange]

So, the conditional probability of heads landing at least once given that tails lands at least once is equal to

    # of orange squares (i.e. # of events satisfying the description)
    ------------------------------------------------------------------
    # of green squares (i.e. # of events in the "given that" space)

which equals 2/3.
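And here is a matching sketch (illustrative only) for the restricted count: first cut the event space down to the "given that" events, then count within it:

    from itertools import product

    event_space = list(product("HT", repeat=2))

    # The restricted "given that" space: events with at least one tails (Fig. 8).
    given_that_space = [e for e in event_space if "T" in e]

    # Within that restricted space, the events with at least one heads (Fig. 9).
    heads_within = [e for e in given_that_space if "H" in e]

    print(len(heads_within) / len(given_that_space))   # 0.666..., i.e. 2/3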

Conditional Probabilities, Pt. 2: The Formula

The formula you were given for conditional probabilities looks like this:

    Pr(B | A) = Pr(A ∩ B) / Pr(A)

But why? Let's try to make sense of this using more diagrams. Imagine the following is our event space:

    [Fig. 10: a rectangle representing the entire event space]

When we ask for the probability of some A, we want to know the ratio of the red area below (i.e. the set of all A events) to the entire rectangle area (i.e. the set of all possible events). This represents the ratio of the number of events satisfying A to the number of events in the event space:

    [Fig. 11: the set of A events shaded red inside the event-space rectangle]

Now let's suppose we're also concerned with another kind of event, B:

    [Fig. 12: two overlapping regions for A and B inside the event-space rectangle]

If each of the shaded areas represents a set of events, then the area shaded red represents (A ∪ B), i.e. all the events that are either A or B. Remember what was said before about conditional probabilities and restricting the event space?

Here's how we can represent that using Venn diagrams. Suppose we are interested in the conditional probability of B given that A (i.e. Pr(B | A)). It was the "given that" clause that constituted the restricted "given that" space; that is to say, it was the "given that" clause that described what figures as the denominator in our fraction (see the previous section). Thus, the denominator in our formula for conditional probabilities is the probability of the "given that" event, in this case A. Visually, then, the denominator is constituted by the area shaded in YELLOW below:

    [Fig. 13: the A region shaded yellow]

Now, return to the original formula for conditional probabilities:

    Pr(B | A) = Pr(A ∩ B) / Pr(A)

We've just figured out why the denominator is such as it is. Now let's figure out why the numerator consists of Pr(A ∩ B). Remember, in a conditional probability, we wanted to know how many events in the restricted "given that" space (in this case, the A space in yellow) satisfy the description, in this case B. In other words, we need first to figure out how many A's are also B; then we can calculate the ratio of the # of A's that are B to the # of A's. In our Venn diagrams, the shaded areas represent sets of events. If we want to know the number of events that are A's and B's, then we want to know which events fall into both sets. In other words, we are interested in the intersection of A and B. Hence Pr(A ∩ B)! Visually, we are concerned with the area in orange below:

    [Fig. 14: the intersection A ∩ B shaded orange]
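To tie the formula back to the coin example, here is an illustrative check (my choice of interpretation, not part of the worked example above): take A to be "tails lands at least once" and B to be "heads lands at least once". The formula and the direct restricted count give the same answer:

    from itertools import product

    event_space = set(product("HT", repeat=2))
    total = len(event_space)

    A = {e for e in event_space if "T" in e}   # "tails lands at least once"
    B = {e for e in event_space if "H" in e}   # "heads lands at least once"

    pr_A = len(A) / total                      # Pr(A)
    pr_A_and_B = len(A & B) / total            # Pr(A ∩ B)

    print(pr_A_and_B / pr_A)                   # the formula: 2/3
    print(len(A & B) / len(A))                 # the direct restricted count: also 2/3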

Bayes' Theorem

Bayes' Theorem is the formula we use when we want to know about certain kinds of conditional probabilities. First we'll go through how to get the simple version of the theorem by building on what we just learned about conditional probabilities. First, note that Bayes' Theorem is also concerned with Pr(B | A). Now, recall the formula for conditional probabilities from above:

    Pr(B | A) = Pr(A ∩ B) / Pr(A)

In words, the probability of B given that A is equal to the probability of A and B (i.e. A intersection B), divided by the probability of A. First, let's multiply both sides by Pr(A)...

    Pr(B | A) · Pr(A) = [Pr(A ∩ B) / Pr(A)] · Pr(A)

which equals

    Pr(B | A) · Pr(A) = Pr(A ∩ B)

Using the original formula for conditional probability, we can also say that:

    Pr(A | B) = Pr(A ∩ B) / Pr(B)

In words, the probability of A given that B is equal to the probability of A and B, divided by the probability of B. Once again, we can multiply both sides by the denominator:

    Pr(A | B) · Pr(B) = [Pr(A ∩ B) / Pr(B)] · Pr(B)

which equals

    Pr(A | B) · Pr(B) = Pr(A ∩ B)

Notice that we now have two different expressions that are equal to Pr(A ∩ B):

    Pr(A ∩ B) = Pr(A | B) · Pr(B) = Pr(B | A) · Pr(A)

Now, recall again that Bayes' Theorem concerns Pr(B | A). Given this, let's rearrange the above to isolate Pr(B | A) on one side of the equals sign. Focus on this part:

    Pr(A | B) · Pr(B) = Pr(B | A) · Pr(A)

And now, dividing both sides by Pr(A), we're left with...

    Pr(B | A) = Pr(A | B) · Pr(B) / Pr(A)

This is the basic formulation of Bayes' Theorem. Why would we ever use this instead of the original formula for conditional probabilities? In this version of the theorem, it's not immediately obvious, since we need at least as much information to solve for Pr(B | A) here as we did with our original formula for conditional probabilities; that is to say, in both cases we need to know Pr(A), and in addition either Pr(A ∩ B) or both Pr(A | B) and Pr(B). Compare both formulas:

    Bayes' Theorem:  Pr(B | A) = Pr(A | B) · Pr(B) / Pr(A)

    Cond. Prob.:     Pr(B | A) = Pr(A ∩ B) / Pr(A)
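As one last illustrative check on the coin example (with A as "tails lands at least once" and B as "heads lands at least once", as in the earlier sketch), Bayes' Theorem gives back exactly the 2/3 we counted directly:

    from itertools import product

    event_space = set(product("HT", repeat=2))
    total = len(event_space)

    A = {e for e in event_space if "T" in e}   # "tails lands at least once"
    B = {e for e in event_space if "H" in e}   # "heads lands at least once"

    pr_A = len(A) / total                      # 3/4
    pr_B = len(B) / total                      # 3/4
    pr_A_given_B = len(A & B) / len(B)         # 2/3

    # Bayes' Theorem: Pr(B | A) = Pr(A | B) * Pr(B) / Pr(A)
    print(pr_A_given_B * pr_B / pr_A)          # 0.666..., the same 2/3 as before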