Math 224 Fall 2017 Homework 1, Drew Armstrong. Problems from Probability and Statistical Inference (9th ed.) by Hogg, Tanis and Zimmerman. Section 1.1, Exercises 4, 5, 6, 7, 9, 12.

Solutions to Book Problems.

1.1-4. A fair coin is tossed four times and the sequence of heads and tails is observed.

(a) The sample space is

S = {TTTT, HTTT, THTT, TTHT, TTTH, HHTT, HTHT, HTTH, THHT, THTH, TTHH, THHH, HTHH, HHTH, HHHT, HHHH}.

(b) Consider the following events:

A = {at least 3 heads} = {THHH, HTHH, HHTH, HHHT, HHHH}
B = {at most 2 heads} = {TTTT, HTTT, THTT, TTHT, TTTH, HHTT, HTHT, HTTH, THHT, THTH, TTHH}
C = {heads on the third toss} = {TTHT, HTHT, THHT, TTHH, THHH, HTHH, HHHT, HHHH}
D = {1 head and 3 tails} = {HTTT, THTT, TTHT, TTTH}.

We note that #S = 16, #A = 5, #B = 11, #C = 8 and #D = 4, so that

P(A) = 5/16, P(B) = 11/16, P(C) = 8/16 and P(D) = 4/16.

We are also asked to consider the following events:

A ∩ B = {at least 3 heads AND at most 2 heads} = ∅
A ∩ C = {at least 3 heads AND heads on the third toss} = {THHH, HTHH, HHHT, HHHH}
A ∪ C = {at least 3 heads OR heads on the third toss} = (never mind)
B ∩ D = {at most 2 heads AND 1 head and 3 tails} = {1 head and 3 tails} = D.

We observe that #(A ∩ B) = 0, #(A ∩ C) = 4 and #(B ∩ D) = #D = 4, so that

P(A ∩ B) = 0/16, P(A ∩ C) = 4/16 and P(B ∩ D) = 4/16.

I said "never mind" for the set A ∪ C because we don't need to list all the elements. Indeed, we already know that P(A) = 5/16, P(C) = 8/16 and P(A ∩ C) = 4/16, so that

P(A ∪ C) = P(A) + P(C) − P(A ∩ C) = (5 + 8 − 4)/16 = 9/16.
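With only 16 outcomes, the counting above is easy to double-check by brute force. Here is a minimal Python sketch (my addition, not part of the assignment) that enumerates the sample space and recomputes each probability:

```python
from itertools import product

# Enumerate all 16 sequences of four coin tosses as strings like "HTTH".
S = ["".join(seq) for seq in product("HT", repeat=4)]

A = {s for s in S if s.count("H") >= 3}   # at least 3 heads
B = {s for s in S if s.count("H") <= 2}   # at most 2 heads
C = {s for s in S if s[2] == "H"}         # heads on the third toss
D = {s for s in S if s.count("H") == 1}   # 1 head and 3 tails

n = len(S)  # 16
print(len(A)/n, len(B)/n, len(C)/n, len(D)/n)    # 0.3125 0.6875 0.5 0.25
print(len(A & B)/n, len(A & C)/n, len(B & D)/n)  # 0.0 0.25 0.25
print(len(A | C)/n)  # 0.5625 = 9/16, matching inclusion-exclusion
```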

1.1-5. We roll a fair six-sided die until we see a 3. Consider the events

A = {we get a 3 on the first roll}
B = {at least two rolls are required to see a 3}.

If the die is fair then we observe that P(A) = 1/6. We also observe that the events A and B are complementary, because they are mutually exclusive and they exhaust all the possible outcomes. Therefore we conclude that P(A ∪ B) = 1 and

P(B) = 1 − P(A) = 1 − 1/6 = 5/6.

1.1-6. Consider two events A, B such that

P(A) = 0.4, P(B) = 0.5 and P(A ∩ B) = 0.3.

(a) Then we have

P(A ∪ B) = P(A) + P(B) − P(A ∩ B) = 0.4 + 0.5 − 0.3 = 0.6.

(b) Note that we can decompose the event A into two disjoint pieces by looking at the stuff that's inside B or outside B:

A = (A ∩ B) ∪ (A ∩ B′).

[You should draw a Venn diagram to get a feeling for this.] Thus we have

P(A) = P(A ∩ B) + P(A ∩ B′)
P(A) − P(A ∩ B) = P(A ∩ B′)
0.4 − 0.3 = P(A ∩ B′)
0.1 = P(A ∩ B′).

(c) De Morgan's law says that (A ∩ B)′ = A′ ∪ B′. That is, the stuff that is not in (A AND B) is the same as the stuff that is (not in A) OR (not in B). [You should draw a Venn diagram to get a feeling for this.] Thus we have

P(A′ ∪ B′) = 1 − P(A ∩ B) = 1 − 0.3 = 0.7.

1.1-7. Consider two events A, B such that

P(A ∪ B) = 0.76 and P(A ∪ B′) = 0.87.

We want to find P(A). How can we do this? Well, we saw in the previous problem that A can be decomposed by the stuff inside/outside of B:

A = (A ∩ B) ∪ (A ∩ B′).

And we can do the same trick to divide up A′ in terms of B and B′:

A′ = (A′ ∩ B) ∪ (A′ ∩ B′).

It follows from this that

P(A) = 1 − P(A′) = 1 − [P(A′ ∩ B) + P(A′ ∩ B′)].

So what? Well, de Morgan's law also tells us that (A ∪ B)′ = A′ ∩ B′ and (A ∪ B′)′ = A′ ∩ B, so that

P(A′ ∩ B′) = 1 − P(A ∪ B) and P(A′ ∩ B) = 1 − P(A ∪ B′).

Finally, putting everything together gives

P(A) = 1 − [P(A′ ∩ B) + P(A′ ∩ B′)]
= 1 − [(1 − P(A ∪ B′)) + (1 − P(A ∪ B))]
= P(A ∪ B′) + P(A ∪ B) − 1
= 0.87 + 0.76 − 1 = 0.63.
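The identity we just derived, P(A) = P(A ∪ B′) + P(A ∪ B) − 1, holds for any two events A, B. If you don't trust the De Morgan gymnastics, here is a quick sketch (my addition) that checks the identity on randomly generated events in a small equally likely sample space:

```python
import random

# Check P(A) = P(A ∪ B) + P(A ∪ B') - 1 on random events A, B
# inside a 20-point sample space with equally likely outcomes.
S = set(range(20))
for _ in range(1000):
    A = {x for x in S if random.random() < 0.5}
    B = {x for x in S if random.random() < 0.5}
    Bc = S - B  # the complement B'
    lhs = len(A) / len(S)
    rhs = len(A | B) / len(S) + len(A | Bc) / len(S) - 1
    assert abs(lhs - rhs) < 1e-12
print("identity holds on 1000 random examples")
```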

1.1-9. We roll a fair six-sided die 3 times. Consider the following events:

A₁ = {1 or 2 on the first roll}
A₂ = {3 or 4 on the second roll}
A₃ = {5 or 6 on the third roll}.

Luckily we don't have to analyze this experiment ourselves, because the book just tells us that:

P(Aᵢ) = 1/3 for all i,
P(Aᵢ ∩ Aⱼ) = (1/3)^2 for all i ≠ j,
P(A₁ ∩ A₂ ∩ A₃) = (1/3)^3.

Now we are asked to find P(A₁ ∪ A₂ ∪ A₃). At this point you can just quote Theorem 1.1-6 from the book. However, I'll do it myself from scratch. First we have

P(A₁ ∪ A₂ ∪ A₃) = P(A₁ ∪ (A₂ ∪ A₃)) = P(A₁) + P(A₂ ∪ A₃) − P(A₁ ∩ (A₂ ∪ A₃))

and

P(A₂ ∪ A₃) = P(A₂) + P(A₃) − P(A₂ ∩ A₃).

Then by rewriting A₁ ∩ (A₂ ∪ A₃) = (A₁ ∩ A₂) ∪ (A₁ ∩ A₃) we have

P(A₁ ∩ (A₂ ∪ A₃)) = P((A₁ ∩ A₂) ∪ (A₁ ∩ A₃))
= P(A₁ ∩ A₂) + P(A₁ ∩ A₃) − P((A₁ ∩ A₂) ∩ (A₁ ∩ A₃))
= P(A₁ ∩ A₂) + P(A₁ ∩ A₃) − P(A₁ ∩ A₂ ∩ A₃).

Putting everything together gives

P(A₁ ∪ A₂ ∪ A₃) = P(A₁) + P(A₂ ∪ A₃) − P(A₁ ∩ (A₂ ∪ A₃))
= P(A₁) + [P(A₂) + P(A₃) − P(A₂ ∩ A₃)] − [P(A₁ ∩ A₂) + P(A₁ ∩ A₃) − P(A₁ ∩ A₂ ∩ A₃)]

or, in other words, P(A₁ ∪ A₂ ∪ A₃) equals

P(A₁) + P(A₂) + P(A₃) − P(A₁ ∩ A₂) − P(A₁ ∩ A₃) − P(A₂ ∩ A₃) + P(A₁ ∩ A₂ ∩ A₃).

Finally, since we know the value of each term in this sum, we get

P(A₁ ∪ A₂ ∪ A₃) = 3·(1/3) − 3·(1/3)^2 + (1/3)^3.
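Before simplifying, we can confirm this answer by direct enumeration, since three die rolls give only 6^3 = 216 equally likely outcomes. A short sketch (my addition):

```python
from fractions import Fraction
from itertools import product

# Enumerate all 6^3 = 216 equally likely outcomes of three die rolls
# and count those landing in A1 ∪ A2 ∪ A3.
outcomes = list(product(range(1, 7), repeat=3))
hits = [o for o in outcomes
        if o[0] in (1, 2) or o[1] in (3, 4) or o[2] in (5, 6)]
print(Fraction(len(hits), len(outcomes)))  # 19/27
```

This agrees with the simplification below.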

Wow, that really reminds me of the third row of Pascal's triangle. Indeed, note that

(1 − 1/3)^3 = 1 − 3·(1/3) + 3·(1/3)^2 − (1/3)^3,

and hence we have

P(A₁ ∪ A₂ ∪ A₃) = 1 − (1 − 1/3)^3 = 1 − (2/3)^3 = 1 − 8/27 = 19/27.

[Remark: Maybe there's a shorter way to do this, but it was good practice to do it the long way.]

1.1-12. This one is just a thinking problem, since we aren't told precisely what "selected randomly" means in this case. Just use your intuition. Suppose a real number x is selected randomly from the closed interval [0, 1]. We are supposed to assume that all possible choices are "equally likely," whatever that means. Since there are infinitely many possible choices, this suggests that the probability of any particular x is 1/∞, or 0. OK, I guess. We know that P([0, 1]) = 1 because [0, 1] is the whole sample space. It also seems intuitively clear that P([0, 1/2]) = P([1/2, 1]) = 1/2. (The number is equally likely to be in the left half or the right half of the interval.) More generally, it seems that the probability that x lies in a particular line segment is just the length of the line segment (whether or not the endpoints of the line segment are included). So here are my answers:

(a) P({x : 0 ≤ x ≤ 1/3}) = 1/3,
(b) P({x : 1/3 ≤ x ≤ 1}) = 2/3,
(c) P({x : x = 1/3}) = 0,
(d) P({x : 1/2 < x < 5}) = P({x : 1/2 < x ≤ 1}) + P({x : 1 < x < 5}) = 1/2 + 0 = 1/2.

[Remark: We will discuss continuous probability distributions in detail later.]

Additional Problems.

1. Consider a biased coin with P(heads) = p and P(tails) = 1 − p. Suppose that you flip the coin n times and let X be the number of heads that you get. Compute P(X ≥ 1). [Hint: Observe that P(X ≥ 1) + P(X = 0) = 1.]

Solution: Recall from the course notes that the probability of getting exactly k heads is

P(X = k) = C(n, k) · p^k · (1 − p)^(n − k),

where C(n, k) is the binomial coefficient "n choose k." Thus we are asked to compute the following sum:

P(X ≥ 1) = Σ_{k=1}^{n} C(n, k) p^k (1 − p)^(n − k).

That seems really hard, so instead we use the formula

P(X ≥ 1) = 1 − P(X = 0) = 1 − C(n, 0) p^0 (1 − p)^n = 1 − (1 − p)^n.

Alternatively, we can compute P(X = 0) by observing that X = 0 corresponds to the event {TT⋯T}. Since the probability of each tail is (1 − p) and since the coin flips are independent, we see that

P(X = 0) = P(TT⋯T) = P(T)P(T)⋯P(T) = (1 − p)(1 − p)⋯(1 − p) = (1 − p)^n.

[Remark: Compare the formula P(X ≥ 1) = 1 − (1 − p)^n to Exercise 1.1-9 above. Can you see how to obtain the answer to 1.1-9 by plugging in p = 1/3 and n = 3? We can think of each die roll as a fancy coin flip with P(heads) = 1/3. The definition of "heads" changes on each roll, but I guess that doesn't matter.]
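If you are skeptical that the complement shortcut agrees with the hard sum, here is a quick numerical check (my addition) comparing the two formulas for several values of n and p:

```python
from math import comb

# Sum of the binomial pmf for k = 1, ..., n.
def p_at_least_one(n, p):
    return sum(comb(n, k) * p**k * (1 - p)**(n - k) for k in range(1, n + 1))

# Compare against the complement shortcut 1 - (1 - p)^n.
for n, p in [(4, 1/6), (24, 1/36), (10, 0.3)]:
    assert abs(p_at_least_one(n, p) - (1 - (1 - p)**n)) < 1e-12
    print(n, round(1 - (1 - p)**n, 4))  # 4 0.5177, 24 0.4914, 10 0.9718
```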

2. Suppose that you roll a pair of fair six-sided dice.

(a) Write down the elements of the sample space S. What is #S? Are the outcomes equally likely?
(b) Compute the probability of getting a "double six." [Hint: Let E ⊆ S be the subset of outcomes that correspond to getting a double six. Compute P(E) = #E/#S.]

Solution: (a) Writing ij for the outcome "i on the first die and j on the second die," the sample space is

S = {11, 12, 13, 14, 15, 16,
21, 22, 23, 24, 25, 26,
31, 32, 33, 34, 35, 36,
41, 42, 43, 44, 45, 46,
51, 52, 53, 54, 55, 56,
61, 62, 63, 64, 65, 66}.

Therefore we have #S = 6 · 6 = 36. If the dice are fair, I guess that all 36 possible outcomes are equally likely. Therefore we can use the formula P(E) = #E/#S.

(b) The event "double six" corresponds to E = {66}, so that #E = 1. Thus

P(double six) = P(E) = #E/#S = 1/36.
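Again this is small enough to verify by enumeration. A tiny sketch (my addition):

```python
from itertools import product

# All 36 equally likely outcomes for a pair of fair dice.
S = list(product(range(1, 7), repeat=2))
E = [o for o in S if o == (6, 6)]  # the "double six" event
print(len(S), len(E), len(E) / len(S))  # 36 1 0.02777... = 1/36
```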

3. The Chevalier de Méré considered the following two games/experiments:

(1) Roll a fair six-sided die 4 times.
(2) Roll a pair of fair six-sided dice 24 times.

For the first experiment, let X be the number of sixes that you get. Apply counting and the axioms of probability to compute P(X ≥ 1). For the second experiment, let Y be the number of double sixes that you get. Apply similar ideas to compute P(Y ≥ 1). Which of these two events is more likely? [Hint: You can think of a fair six-sided die as a biased coin with heads = "six" and tails = "not six," so that P(heads) = 1/6 and P(tails) = 5/6. You will find that it is easier to compute P(X = 0) and P(Y = 0).]

Solution: All of the work has been done. For game (1) we think of rolling a fair six-sided die as a fancy coin flip with heads = "six" and tails = "not six." Roll the die n = 4 times and let X = the number of sixes we get. Since p = P(heads) = 1/6, Problem 1 tells us that

P(at least one six) = P(X ≥ 1) = 1 − P(X = 0) = 1 − (1 − p)^n = 1 − (5/6)^4 ≈ 0.5177 = 51.77%.

For game (2) we think of rolling a pair of fair six-sided dice as a fancy coin flip with heads = "double six" and tails = "not double six." Roll the fancy coin n = 24 times and let Y = the number of double sixes we get. From Problem 2 we know that p = P(double six) = 1/36, hence Problem 1 tells us that

P(at least one double six) = P(Y ≥ 1) = 1 − P(Y = 0) = 1 − (35/36)^24 ≈ 0.4914 = 49.14%.

So the first event is (slightly) more likely. [Remark: The Chevalier's mathematical intuition told him that these two events should be equally likely, but his gambling experience told him that X ≥ 1 happened more often than Y ≥ 1. The subject of mathematical probability was born when Fermat and Pascal came up with a mathematical theory (the one we just used) that does agree with experience. The most important thing is to make accurate predictions.]
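We can replay the Chevalier's gambling experience with a Monte Carlo simulation (my addition; the exact values above are the authority, this just mimics the empirical side of the story):

```python
import random

def game1():
    # At least one six in 4 rolls of one die?
    return any(random.randint(1, 6) == 6 for _ in range(4))

def game2():
    # At least one double six in 24 rolls of a pair of dice?
    return any(random.randint(1, 6) == 6 and random.randint(1, 6) == 6
               for _ in range(24))

trials = 100_000
print(sum(game1() for _ in range(trials)) / trials)  # ≈ 0.5177
print(sum(game2() for _ in range(trials)) / trials)  # ≈ 0.4914
```

With 100,000 trials the empirical frequencies typically land within a few tenths of a percentage point of the exact answers, which is enough to separate 51.77% from 49.14%, just as the Chevalier's long run at the tables did.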