Chapter 2.5 Random Variables and Probability: The Modern View (cont.)

I. Statistical Independence

A crucially important idea in probability and statistics is the concept of statistical independence. Suppose that you have two random variables X1 and X2. These two random variables are pairwise statistically independent if the realized value X1 takes on does not affect the probability of X2 taking on any possible value, and vice versa. Loosely speaking, movements in X1 cannot affect movements in X2. If the realized value of X1 somehow affected the probability that X2 would take on a particular value, then the two variables would be dependent. Independence greatly simplifies probability calculations for X1, since we need not worry about what is happening to X2.

This is still somewhat murky, since we have not made clear how we know that two random variables are independent. But a simple theorem called the factorization theorem (or factorization principle) makes things much simpler. Let A and B be random events. Then P(A and B) = P(A)P(B) if and only if A and B are independent.

So, suppose that we are flipping a coin. This is a Bernoulli event. Assume the probability of a head is p, or P[H] = p. Now, flip it twice. What is the probability of first getting a head and then getting another head? That's easy. Since the two flips are independent, P[H1 and H2] = p². But this is just P[H1]P[H2], so P[H1 and H2] = P[H1]P[H2] and factorization works. Now, try a tail on the first flip and a tail on the second. This is P[T1 and T2] = (1-p)² = P[T1]P[T2]; factorization occurs again. What about a head on the first flip and a tail on the second? This also factorizes, as does a tail on the first and a head on the second. Factorization is a very useful way of looking at statistical independence.
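The factorization principle is easy to check by simulation. The following minimal sketch is not part of the original notes; the value p = 0.3 and the sample size are arbitrary illustrative choices. It estimates P[H1 and H2] from simulated pairs of flips and compares it with the product P[H1]P[H2]:

    import random

    random.seed(1)
    p = 0.3          # P[Head] on a single flip; arbitrary illustrative value
    n = 200_000      # number of simulated pairs of flips

    both = first = second = 0
    for _ in range(n):
        h1 = random.random() < p    # first flip: True if head
        h2 = random.random() < p    # second flip: True if head
        first += h1
        second += h2
        both += h1 and h2           # both flips heads

    # If the flips are independent, these three numbers should agree closely.
    print("P[H1 and H2] estimate:", both / n)
    print("P[H1]*P[H2] estimate: ", (first / n) * (second / n))
    print("theoretical p*p:      ", p * p)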

Next, consider flipping the coin twice again, with P[Head] = p. The first flip is independent of the second flip. Let A = 1 if the first flip is a head, and zero otherwise. Let B = 1 if the second flip is a head, and zero otherwise. Now, how do we compute the probability distribution of A + B? Obviously, A + B can be 0, 1, or 2.

To get A + B = 0, both A and B must be zero. Therefore, by the factorization theorem we multiply to get P[A+B = 0] = P[A = 0 and B = 0] = (1-p)². Similarly, for A + B to be 2, both A = 1 and B = 1 must occur. Again employing the factorization idea, we get P[A+B = 2] = P[A = 1 and B = 1] = p². However, there are two separate ways to get A + B = 1. The first way is A = 1 and B = 0, while the second way is A = 0 and B = 1. These are two mutually exclusive ways of getting A + B = 1, and in this case we ADD their probabilities together. Thus,

P[A+B = 1] = P[A = 1 and B = 0] + P[A = 0 and B = 1] = P[A=1]P[B=0] + P[A=0]P[B=1] = p(1-p) + (1-p)p = 2p(1-p).

The probabilities of mutually exclusive ways of an event occurring add together (called the addition principle), while the probabilities of independent events multiply together (called the factorization principle).
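These three probabilities form the complete pdf of A + B (it is the binomial distribution with n = 2). As a sanity check, here is a small sketch, not from the original notes, that builds the pdf exactly using fractions; p = 1/3 is an arbitrary illustrative choice:

    from fractions import Fraction

    p = Fraction(1, 3)   # P[Head]; arbitrary illustrative value

    # Independent events multiply (factorization principle);
    # mutually exclusive ways add (addition principle).
    pmf = {
        0: (1 - p) * (1 - p),          # tail then tail
        1: p * (1 - p) + (1 - p) * p,  # head-tail OR tail-head
        2: p * p,                      # head then head
    }
    for value in sorted(pmf):
        print(f"P[A+B = {value}] = {pmf[value]}")
    print("total:", sum(pmf.values()))  # should be exactly 1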

II. Statistical Dependence

Sometimes we must deal with two or more random variables that are not statistically independent. The probability of one is affected by what happens to the other. Let's try to understand this with a simple example. Think of the following highly contrived set of circumstances for random variables A and B.

A: You flip a coin. A = 1 if a head, zero otherwise, where P[A = 1] = p.

B: If A is a head, you flip the coin again, and B = 1 if that second flip is a head, zero otherwise. If A is a tail, you roll a die and B is the number showing, where P[B = i] = 1/6 for i = 1, ..., 6. This is the second event.

Calculate the pdf for A + B.

Here is the way we can calculate the pdf of A + B. [Graph of the pdf of A + B not reproduced in this transcription.] The important point to note is that the probability of B taking a value such as 1, 3, or 4 depends on the value A assumes. For example, if A = 1, then P[B = 3] = 0, whereas if A = 0, then P[B = 3] = 1/6.

For sure, the world is complicated, and these dependent random variables show that randomness can be complicated also. Not all sets of random variables follow simple rules of combination. Some require us to think carefully about how probability is determined. Sometimes this involves counting arguments. In all cases we look at two important criteria to help us in calculating probability. First, what are the different, mutually exclusive ways of getting a particular outcome? Second, for each of these ways, are they composed of independent events?

Consider the outcome A + B = 1. There are two ways this can happen: A = 1 and B = 0, or A = 0 and B = 1. Note that P[A = 1 and B = 0] = p(1-p) and P[A = 0 and B = 1] = (1-p)/6. These are two mutually exclusive ways of getting A + B = 1, so we add them together to get P[A+B = 1] = p(1-p) + (1-p)/6. It will be useful for you to work through the remaining values of A + B in the same way.
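Since the graph itself is not reproduced here, the following sketch enumerates the full pdf of A + B by walking the two branches of the experiment. It is not part of the original notes; the function name pmf_a_plus_b and the choice p = 1/2 are illustrative assumptions.

    from fractions import Fraction

    def pmf_a_plus_b(p):
        """pdf of A + B: A ~ coin flip (1 if head, prob p); if A = 1,
        B is a second flip (1 if head, else 0); if A = 0, B is a fair
        die roll in {1, ..., 6}."""
        pmf = {}
        def add(value, prob):
            pmf[value] = pmf.get(value, Fraction(0)) + prob
        # Branch A = 1 (probability p): flip the coin again.
        add(1 + 1, p * p)          # head, then head
        add(1 + 0, p * (1 - p))    # head, then tail
        # Branch A = 0 (probability 1 - p): roll the die.
        for i in range(1, 7):
            add(0 + i, (1 - p) * Fraction(1, 6))
        return pmf

    for value, prob in sorted(pmf_a_plus_b(Fraction(1, 2)).items()):
        print(f"P[A+B = {value}] = {prob}")
    # With p = 1/2: P[1] = 1/3, P[2] = 1/3, and P[3] = ... = P[6] = 1/12.

Note that P[A+B = 1] comes out as p(1-p) + (1-p)/6, exactly as derived above, because the two branches are mutually exclusive ways of reaching the same value.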

Questions:

#1. Consider the case where you draw two cards from a 52-card deck without replacement. Now consider the random event that you have drawn two red cards. Explain why the first and second draws are not statistically independent.

#2. Assume A and B are independent random variables. Therefore, we know that P[A and B] = ?

#3. Flip a coin twice. Let E1 be the outcome from the first flip and E2 be the outcome from the second flip. Use the factorization principle to prove that E1 and E2 are statistically independent.

#4. Let A and B be random variables. But suppose B becomes certain and is no longer random. Show that A and B must now be statistically independent.

#5. Consider the following Venn diagram showing the probabilities of A and B. Is the equation as it is written correct? Draw the diagram again assuming A and B are independent. How does the equation below the diagram change? [Diagram not reproduced.]

#6. Use the following Venn diagram to calculate P[A], P[B], P[A & B], and P[A or B]. [Diagram not reproduced.]

#7. Use the following Venn diagram to calculate P[A], P[B], P[C], P[A & B & C], P[(A & B) or C], and P[A & C]. [Diagram not reproduced.]

#8. In blackjack (or the card game 21), it is claimed that people can raise the odds of winning by counting cards. Does this mean that draws at a blackjack table are not independent? Explain.