University of California, Berkeley, Statistics 134: Concepts of Probability. Michael Lugo, Spring 2011, Exam 1


University of California, Berkeley, Statistics 134: Concepts of Probability
Michael Lugo, Spring 2011
Exam 1
February 16, 2011, 11:10 am - 12:00 noon

Name: Solutions
Student ID:

This exam consists of seven pages: this cover page; five pages, each containing one problem with several parts; and a table of the normal distribution. You may use a calculator, and notes on one side of a standard 8.5-by-11-inch sheet of paper which you have written by hand, yourself. You must show all work other than basic arithmetic. Write your name at the top of each page. Each problem is worth 20 points, for a total of 100. Weights of problem subparts are indicated on the problems.

DO NOT WRITE BELOW THIS LINE
1.   2.   3.   4.   5.   TOTAL

1. Three ordinary, six-sided dice are rolled. (Each has the numbers 1, 2, 3, 4, 5 and 6 on its faces.) What is the probability that:

(a) All three dice show different numbers? [5 points]

$\frac{6}{6} \cdot \frac{5}{6} \cdot \frac{4}{6} = \frac{120}{216} = \frac{5}{9}$. The probability that the second die doesn't show the same number as the first die is 5/6; given that this happens, the probability that the third die doesn't show the same number as either of the first two dice is 4/6.

(b) The sum of the numbers showing on the dice is 16? [5 points]

16 can be obtained as two 5s and one 6, or as one 4 and two 6s. Either of these can occur in three ways: (5, 5, 6), (5, 6, 5), (6, 5, 5), (4, 6, 6), (6, 4, 6), (6, 6, 4). There are a total of $6^3 = 216$ possible outcomes for rolling three dice, so the answer is 6/216, or 1/36.

(c) The largest of the three numbers shown is 5? [5 points]

Let $L_n$ be the event that the largest number shown is $n$, and let $M_n$ be the event that the largest number shown is less than or equal to $n$. Then $P(M_n) = (n/6)^3$, since for the largest number shown to be at most $n$, all three numbers shown must be at most $n$. Now, $L_5 = M_5 M_4^c$: for the largest number shown to be 5, it must be less than or equal to 5, but not less than or equal to 4. So $P(L_5) = P(M_5 M_4^c) = P(M_5) - P(M_5 M_4)$. But $M_5 M_4$ is just $M_4$. So the answer is $P(M_5) - P(M_4) = (5^3 - 4^3)/6^3 = 61/216$. (This is written out in a bit more detail than was necessary on the exam. Also, a lot of people gave $5^3/6^3$ as their answer.)

(d) The sum of the numbers shown is divisible by 6, that is, equal to 6, 12, or 18? For part (d), to get full credit you must explain your answer. [5 points]

This was a bit of a trick question. The answer is 1/6, because regardless of what you obtain on the first two rolls, there is exactly one outcome for the third roll that makes the sum divisible by 6. Nobody saw this trick. A lot of people tried to compute the number of ways to get 6, 12, and 18 by hand; if you did this correctly, that's right, but it's a lot of work.
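All four answers can be double-checked by brute force, since there are only $6^3 = 216$ equally likely outcomes. A minimal Python sketch (not required on the exam, just a sanity check):

```python
from itertools import product
from fractions import Fraction

outcomes = list(product(range(1, 7), repeat=3))  # all 6^3 = 216 equally likely rolls
total = len(outcomes)

def prob(event):
    """Probability of an event, given as a predicate on one (d1, d2, d3) outcome."""
    return Fraction(sum(1 for o in outcomes if event(o)), total)

print(prob(lambda o: len(set(o)) == 3))   # (a) all different: 5/9
print(prob(lambda o: sum(o) == 16))       # (b) sum is 16: 1/36
print(prob(lambda o: max(o) == 5))        # (c) largest is 5: 61/216
print(prob(lambda o: sum(o) % 6 == 0))    # (d) sum divisible by 6: 1/6
```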

2. I roll a die with two red faces and four blue faces 300 times.

(a) Give an expression for the exact probability that a red face appears exactly 100 times. You do not need to give a numerical value. [8 points]

The number of red faces appearing is binomial with $n = 300$, $p = 1/3$, so the answer is
$$\binom{300}{100}\left(\frac{1}{3}\right)^{100}\left(\frac{2}{3}\right)^{200}.$$

(b) Estimate the probability from part (a) using an appropriate approximation. Here you should give a numerical value. [8 points]

Most people went with the normal approximation to the binomial here. We have $\mu = (300)(1/3) = 100$ and $\sigma = \sqrt{(300)(1/3)(2/3)} \approx 8.16$. By the usual formulas, the probability of getting exactly 100 hits is approximately
$$\Phi\left(\frac{100 + 0.5 - \mu}{\sigma}\right) - \Phi\left(\frac{100 - 0.5 - \mu}{\sigma}\right)$$
with $\mu$ and $\sigma$ as above. That's $\Phi(0.06) - \Phi(-0.06) = 2\Phi(0.06) - 1 = 2(0.5239) - 1 = 0.0478$, to the accuracy of our table.

The approximation I had in mind was just $1/(\sigma\sqrt{2\pi})$, which is the value of the normal curve at its maximum; that gives $\sqrt{3/(400\pi)}$, which is about 0.0489.

A few brave souls used Stirling's approximation, $n! \approx \sqrt{2\pi n}\,(n/e)^n$. If you do this you get
$$\frac{300!}{100!\,200!}\left(\frac{100}{300}\right)^{100}\left(\frac{200}{300}\right)^{200} \approx \frac{\sqrt{600\pi}\,(300/e)^{300}}{\sqrt{200\pi}\,(100/e)^{100}\,\sqrt{400\pi}\,(200/e)^{200}}\left(\frac{100}{300}\right)^{100}\left(\frac{200}{300}\right)^{200}.$$
The factors of 100, 200, 300 and $e$ all cancel, leaving $\frac{\sqrt{600\pi}}{\sqrt{200\pi}\,\sqrt{400\pi}}$, which is in fact just $\sqrt{3/(400\pi)}$.

(c) Without doing further calculation, what is the probability of getting exactly 400 red faces in 1200 rolls of this die? (If you could not answer part (b), assume the answer to part (b) is $x$ and give your answer in terms of $x$.) [4 points]

The answer is $x/2$, where $x$ is your answer from (b). This follows from the square root law; since we have four times as many trials, the most likely result is $1/\sqrt{4} = 1/2$ times as likely.
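For comparison, the exact binomial probability and both approximations can be computed directly. A Python sketch (not part of the exam), using the error function to evaluate $\Phi$:

```python
from math import comb, sqrt, pi, erf

n, p, k = 300, 1/3, 100

# Exact binomial probability from part (a)
exact = comb(n, k) * p**k * (1 - p)**(n - k)

# Normal approximation with continuity correction
mu, sigma = n * p, sqrt(n * p * (1 - p))
Phi = lambda z: 0.5 * (1 + erf(z / sqrt(2)))
normal_approx = Phi((k + 0.5 - mu) / sigma) - Phi((k - 0.5 - mu) / sigma)

# Height of the normal density at its peak, the intended quick estimate
peak_height = 1 / (sigma * sqrt(2 * pi))

print(exact, normal_approx, peak_height)
# all approximately 0.049, in line with the estimates above
```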

3. I have two coins which look the same. One is fair, and one is twice as likely to come up heads as tails. When selecting a coin at random, each of the coins is equally likely to be selected.

(a) I select one of the coins at random and flip it once. What is the probability it comes up heads? [6 points]

Let $C_b$ denote the event that the biased coin is selected, and let $C_f$ denote the event that the fair coin is selected. Let $H$, $T$ denote getting heads and tails respectively. Then we have
$$P(H) = P(HC_b) + P(HC_f) = P(H \mid C_b)P(C_b) + P(H \mid C_f)P(C_f) = (2/3)(1/2) + (1/2)(1/2) = 7/12.$$

(b) I select one of the coins at random and flip it once. It comes up heads. What is the probability that it is the fair coin? [6 points]

The problem asks for $P(C_f \mid H)$. From the definition of conditional probability, $P(C_f \mid H) = P(C_f H)/P(H)$. We know both of these from (a): $P(C_f H) = (1/2)(1/2) = 1/4$ and $P(H) = 7/12$. So the answer is $(1/4)/(7/12) = 3/7$.

(c) I select one of the coins at random and flip it three times. I get heads, tails, and heads again. What is the probability that it is the fair coin? [8 points]

Let $H_1, H_2, H_3$ denote getting heads on the first, second, and third coin tosses; similarly $T_1, T_2, T_3$ for tails. Then we want
$$P(C_f \mid H_1 T_2 H_3) = \frac{P(H_1 T_2 H_3 \mid C_f)P(C_f)}{P(H_1 T_2 H_3 \mid C_f)P(C_f) + P(H_1 T_2 H_3 \mid C_b)P(C_b)}.$$
Now, $P(H_1 T_2 H_3 \mid C_f) = (1/2)(1/2)(1/2) = 1/8$ and $P(H_1 T_2 H_3 \mid C_b) = (2/3)(1/3)(2/3) = 4/27$. So we get
$$P(C_f \mid H_1 T_2 H_3) = \frac{(1/8)(1/2)}{(1/8)(1/2) + (4/27)(1/2)} = \frac{27}{59}.$$
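These posterior calculations are easy to verify with exact rational arithmetic. A short Python sketch (not part of the exam), with the priors and heads probabilities taken from the problem statement:

```python
from fractions import Fraction as F

# Prior on which coin was picked, and each coin's probability of heads
prior = {"fair": F(1, 2), "biased": F(1, 2)}
p_heads = {"fair": F(1, 2), "biased": F(2, 3)}

# (a) Total probability of heads on one flip
p_h = sum(prior[c] * p_heads[c] for c in prior)
print(p_h)  # 7/12

# (b) Posterior P(fair | heads)
print(prior["fair"] * p_heads["fair"] / p_h)  # 3/7

# (c) Posterior P(fair | H, T, H): likelihood of the observed sequence under each coin
def likelihood(coin, flips="HTH"):
    prob = F(1)
    for f in flips:
        prob *= p_heads[coin] if f == "H" else 1 - p_heads[coin]
    return prob

num = prior["fair"] * likelihood("fair")
den = sum(prior[c] * likelihood(c) for c in prior)
print(num / den)  # 27/59
```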

4. A (poorly proofread) book has 300 pages. The number of mistakes on each page is Poisson-distributed with parameter $\mu = 6$.

(a) What is the probability that a randomly chosen page has no mistakes? Give a numerical answer. [6 points]

This is just the probability $P_6(0) = e^{-6} 6^0/0! = e^{-6} \approx 0.0025$.

(b) What is the probability that a randomly chosen page has at least three mistakes? Give a numerical answer. [7 points]

This is $\sum_{k=3}^{\infty} P_6(k)$. But we don't want to do the infinite sum, so recall that $\sum_{k=0}^{\infty} P_6(k) = 1$ since $P_6$ is a probability distribution. The answer is then
$$1 - \sum_{k=0}^{2} P_6(k) = 1 - \left(\frac{e^{-6} 6^0}{0!} + \frac{e^{-6} 6^1}{1!} + \frac{e^{-6} 6^2}{2!}\right),$$
which is $1 - 25e^{-6}$, or about 0.938. (A lot of people found the probability that a randomly chosen page has exactly three mistakes, or more than three mistakes; be careful!)

(c) Estimate the probability that there are zero or one pages in the book with no mistakes. [7 points]

The number of pages with no mistakes is binomial with $n = 300$ and $p = e^{-6}$. You can compute the probability that this is 0 or 1 exactly: it's $(1-p)^{300} + 300p(1-p)^{299}$. But this is a bit hard to evaluate. So we recall that when $p$ is small, binomial($n$, $p$) is approximately Poisson($np$); therefore the number of pages with no mistakes is approximately Poisson with $\mu = 300e^{-6}$. The probability of zero or one pages with no mistakes is therefore
$$P_{300e^{-6}}(0) + P_{300e^{-6}}(1) = e^{-300e^{-6}}\left(1 + 300e^{-6}\right) \approx 0.83.$$
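The Poisson calculations, and the exact binomial value mentioned in part (c), can be checked numerically. A brief Python sketch (not part of the exam):

```python
from math import exp, factorial

def poisson_pmf(k, mu):
    """P(X = k) for X ~ Poisson(mu)."""
    return exp(-mu) * mu**k / factorial(k)

mu = 6.0

# (a) probability a page has no mistakes
p0 = poisson_pmf(0, mu)
print(p0)  # about 0.0025

# (b) probability a page has at least three mistakes
print(1 - sum(poisson_pmf(k, mu) for k in range(3)))  # about 0.938

# (c) number of mistake-free pages: exact binomial vs. Poisson(300 * p0) approximation
n = 300
exact = (1 - p0)**n + n * p0 * (1 - p0)**(n - 1)
approx = poisson_pmf(0, n * p0) + poisson_pmf(1, n * p0)
print(exact, approx)  # both about 0.83
```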

5. Let $A$ and $B$ be events with $P(A) = 0.5$, $P(B) = 0.3$, $P(AB) = 0.2$. Find:

(a) $P(A \cup B)$ [3 points]

By inclusion-exclusion, $P(A \cup B) = P(A) + P(B) - P(AB) = 0.5 + 0.3 - 0.2 = 0.6$.

(b) $P(A^c B)$ [3 points]

We have $P(B) = P(AB) + P(A^c B)$; solve for $P(A^c B)$ to get $P(A^c B) = P(B) - P(AB) = 0.3 - 0.2 = 0.1$.

(c) $P(A \mid B)$ [3 points]

By the definition of conditional probability, $P(A \mid B) = P(AB)/P(B) = 0.2/0.3 = 2/3$.

(d) $P(A^c B^c)$ [3 points]

We have $A^c B^c = (A \cup B)^c$, so $P(A^c B^c) = 1 - P(A \cup B) = 1 - 0.6 = 0.4$. (A lot of people didn't write this out explicitly, but drew a Venn diagram.)

A third event, $C$, is independent of $A$ and $B$, and $P(C) = 0.6$.

(e) Find $P(ABC)$. [3 points]

Since $C$ is independent of $A$ and $B$, this is $P(AB)P(C) = (0.2)(0.6) = 0.12$. (Note that $A$ and $B$ are not independent of each other, so we can't further reduce this to $P(A)P(B)P(C)$.)

(f) Find the probability that exactly one of $A$, $B$, and $C$ occurs. [5 points]

The event that exactly one of $A$, $B$, and $C$ occurs can be written as $AB^cC^c \cup A^cBC^c \cup A^cB^cC$. These three events are disjoint, so $P(\text{exactly one}) = P(AB^cC^c) + P(A^cBC^c) + P(A^cB^cC)$. By independence of $C$ from $A$ and $B$, this is $P(AB^c)P(C^c) + P(A^cB)P(C^c) + P(A^cB^c)P(C)$. We have $P(A^cB) = 0.1$ from (b), and $P(A^cB^c) = 0.4$ from (d). We can find $P(AB^c) = P(A) - P(AB) = 0.3$ analogously to (b), and $P(C^c) = 1 - 0.6 = 0.4$. Plugging this all in gives $(0.3)(0.4) + (0.1)(0.4) + (0.4)(0.6) = 0.4$.
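One way to check all six answers is to build an explicit joint distribution consistent with the given numbers: the four $(A, B)$ cell probabilities derived above, with $C$ independent of both and $P(C) = 0.6$. A Python sketch (not part of the exam), using exact fractions:

```python
from fractions import Fraction as F
from itertools import product

# (A, B) cell probabilities from the solutions; C independent with P(C) = 0.6
p_ab = {(True, True): F(2, 10), (True, False): F(3, 10),
        (False, True): F(1, 10), (False, False): F(4, 10)}
p_c = {True: F(6, 10), False: F(4, 10)}

# Probability of every atom (a, b, c)
atoms = {(a, b, c): p_ab[(a, b)] * p_c[c] for (a, b), c in product(p_ab, p_c)}

def prob(event):
    """Sum atom probabilities over outcomes where the predicate holds."""
    return sum(p for outcome, p in atoms.items() if event(*outcome))

print(prob(lambda a, b, c: a or b))                             # (a) 3/5
print(prob(lambda a, b, c: (not a) and b))                      # (b) 1/10
print(prob(lambda a, b, c: a and b) / prob(lambda a, b, c: b))  # (c) 2/3
print(prob(lambda a, b, c: (not a) and (not b)))                # (d) 2/5
print(prob(lambda a, b, c: a and b and c))                      # (e) 3/25
print(prob(lambda a, b, c: a + b + c == 1))                     # (f) 2/5
```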