CSCI2244-Randomness and Computation First Exam with Solutions

March 1, 2018

Each part of each problem is worth 5 points. There are actually two parts to Problem 2, since you are asked to compute two probabilities. Problems 1 and 4 require numerical answers, except as indicated, but as always, you have to show your work: if the right answer is 7, then just "7" is the wrong answer! The calculations are entirely rudimentary, involving arithmetic with one-digit integers for the most part, so you do not need to use a calculator.

1 A Little Box of Chocolates

A little box of chocolates contains five chocolate truffles: two dark chocolates, the kind I like, and three milk chocolates, the kind I don't care for. I close my eyes and grab two of the truffles at random.

(a) Model the outcomes of the experiment as the set S of two-element subsets of {1, 2, 3, 4, 5}, where 1 and 2 represent the dark chocolate truffles. What is $|S|$? (Give an exact numerical answer, as well as an expression involving binomial coefficients.)

Solution. $|S| = \binom{5}{2} = 10$.

(b) Let E be the event "at least one of the chocolates I pull out is a dark chocolate". Write the event E explicitly as a subset of S. What is $|E|$?

Solution.

$E = \{\{1, 2\}, \{1, 3\}, \{1, 4\}, \{1, 5\}, \{2, 3\}, \{2, 4\}, \{2, 5\}\},$

so $|E| = 7$.
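Although the exam asks you to count by hand, a brute-force enumeration is an easy sanity check. The following snippet is my own addition, not part of the original solutions:

```python
# Enumerate the two-element subsets of {1,...,5} and count those that
# contain a dark chocolate (labelled 1 or 2).
from itertools import combinations

S = list(combinations(range(1, 6), 2))      # all two-element subsets
E = [s for s in S if 1 in s or 2 in s]      # at least one dark chocolate

print(len(S), len(E), len(E) / len(S))      # 10 7 0.7
```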

(c) Assume that each outcome in S has equal probability. What is P(E)?

Solution. $P(E) = \frac{|E|}{|S|} = 0.7$.

Most common errors: This was supposed to be a gift problem, but surprisingly there were a lot of errors, even on the part of students who did well on what I considered harder problems. The most common error was to confuse notation for sets versus ordered pairs. Strictly speaking, writing the solution to (b) as

$E = \{(1, 2), (1, 3), (1, 4), (1, 5), (2, 3), (2, 4), (2, 5)\}$

is not correct. If the error was limited to this, and you were consistent about treating a subset with two elements as identical to a sorted ordered pair with two elements, I did not take points off.

The idea here was that you could solve this problem knowing nothing beyond the definitions. If you wanted to be really fancy about it, you could compute the answer to part (b) using the technique for Problem 2 and write

$|E| = \binom{2}{1}\binom{3}{1} + \binom{2}{2}\binom{3}{0} = 2 \cdot 3 + 1 \cdot 1 = 7,$

but this seems like an instance of the art of making simple things difficult!

2 A Bigger Box of Chocolates

Same problem, scaled up: A bigger box of chocolates contains twenty truffles, 8 of which are dark chocolate and 12 of which are milk chocolate. I reach in and pull out ten truffles. What is the probability that exactly five of them are dark chocolate? What is the probability that at least five of them are dark chocolate? Your answer should contain binomial coefficients and summations, but you do not need to evaluate these.

Solution. Exactly 5:

$\frac{\binom{8}{5}\binom{12}{5}}{\binom{20}{10}}.$

At least 5:

$\frac{1}{\binom{20}{10}} \sum_{j=5}^{8} \binom{8}{j}\binom{12}{10-j}.$
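For reference (my own check, not something the exam requires), both hypergeometric expressions can be evaluated exactly with integer arithmetic:

```python
# Evaluate the Problem 2 answers: exactly five and at least five dark
# chocolates when drawing 10 truffles from 8 dark + 12 milk.
from math import comb

total = comb(20, 10)

p_exactly_5 = comb(8, 5) * comb(12, 5) / total
p_at_least_5 = sum(comb(8, j) * comb(12, 10 - j) for j in range(5, 9)) / total

print(round(p_exactly_5, 4), round(p_at_least_5, 4))   # 0.2401 0.3249
```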

You could also use the complementary probability here:

$1 - \frac{1}{\binom{20}{10}} \sum_{j=0}^{4} \binom{8}{j}\binom{12}{10-j}.$

Most common error. By far the most common error on the exam was to treat this sampling-without-replacement problem (hypergeometric distribution) as one of sampling with replacement (binomial distribution), leading to the incorrect answer

$\binom{10}{5}\, 0.4^5\, 0.6^5.$

This evaluates to 0.20, while the correct answer is 0.24, so it is not even a particularly close approximation. Worse still was an odd mix of the two that I observed on a number of papers: the same expression written with $\binom{20}{10}$ rather than $\binom{10}{5}$.

Some students wrote the summation in the second part as $\sum_{j=5}^{10}$ rather than $\sum_{j=5}^{8}$. This means that you are including terms that contain the binomial coefficients $\binom{8}{9}$ and $\binom{8}{10}$. While we officially defined $\binom{n}{k}$ only in the case where $0 \le k \le n$, in fact $\binom{8}{9}$ has a perfectly reasonable interpretation as the number of 9-element subsets of an 8-element set. This number is 0, of course, so the formula remains correct when you extend the summation in this way. I accordingly accepted this answer as correct. Other students, presumably uncomfortable with the notation, wrote out all four complicated terms of the sum. I accepted this answer as well, because it is perfectly correct, but make sure you get comfortable with $\sum$!

3 The Chocolate Factory

The Chocolate Factory makes millions of chocolate truffles every day. Approximately one of every thousand chocolates that leave the factory contains a golden ticket. You order a shipment of 2500 chocolates.

(a) What is the probability that your shipment contains NO golden tickets? Give a (nearly) exact answer, based on treating this as a problem of sampling with replacement, along with an easy-to-compute approximation.

Solution. $0.999^{2500}$ for the exact value, $e^{-2.5}$ for the approximate value, since $0.999 \approx e^{-0.001}$.
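To see how close the approximation comes (again my own aside, not part of the solution), compare the two numbers directly:

```python
# Compare the exact binomial probability of zero golden tickets with the
# Poisson approximation, using lambda = n * p = 2500 * 0.001 = 2.5.
from math import exp

exact = 0.999 ** 2500
approx = exp(-2.5)

print(exact, approx)   # both are roughly 0.082
```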

(b) What is the probability that your shipment contains at least two golden tickets? Just write the easy-to-compute approximation.

Solution. $1 - e^{-\lambda}(1 + \lambda) = 1 - 3.5\, e^{-2.5}$. (This is $P(X > 1)$ for a Poisson random variable with $\lambda = 2.5$.)

(c) You repeat this experiment many times, and keep track of how many golden tickets are in each shipment. What is the average number of golden tickets that you get in a shipment? Justify your answer; you can cite any result that was taught in class.

Solution. 2.5. You can get this either from the expected value $\lambda$ of a Poisson distribution, or from the expected value $np$ of a binomial distribution with $n$ trials and success probability $p$.

Common errors. If you treat the problem as sampling with replacement, then the exact distribution is binomial, and the approximate distribution is Poisson. A few students apparently thought that anything with $e$ in it is hard to compute, but I maintain that $e^{-2.5}$ is an awful lot easier to compute than $0.999^{2500}$. Be that as it may, it should still be clear which one is the exact answer and which one is the approximation.

There were a lot of correct, but infelicitously expressed, answers for which I did not take off points. For example, I accepted

$\binom{2500}{0}\, 0.001^{0}\, 0.999^{2500}$

for the exact answer in (a). However, I did take off a point for writing

$\sum_{j=2}^{\infty} e^{-2.5}\, \frac{2.5^{j}}{j!}$

for the answer in (b), since using the complementary probability here is an essential key to computing the answer easily.
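A quick numerical evaluation of the part (b) answer (my addition), using exactly that complementary probability:

```python
# P(X >= 2) for a Poisson random variable with lambda = 2.5,
# computed as 1 - P(X = 0) - P(X = 1).
from math import exp

lam = 2.5
p_at_least_two = 1 - exp(-lam) * (1 + lam)

print(round(p_at_least_two, 4))   # about 0.7127
```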

4 I couldn't figure out how to make this problem about chocolate, so it's about dice.

Roll two fair 6-sided dice. Let the random variable $X_1$ denote the number of spots showing on the first die, and let $X_2$ be the number of spots showing on the second die. Let $Z = |X_1 - X_2|$; that is, Z is the absolute value of the difference between the two dice. (For instance, Z = 5 if you roll a 1 and a 6.)

(a) Formally, Z is a real-valued function on the sample space $S = \{1, 2, 3, 4, 5, 6\} \times \{1, 2, 3, 4, 5, 6\}$. What is Z((2, 4))?

Solution. $|2 - 4| = 2$.

(b) What is $P(Z = 0)$? What is $P_Z(1)$? What is $F_Z(2)$? (Recall that $P_Z$ denotes the PMF of Z, and $F_Z$ the CDF of Z.)

Solution. There are 6 outcomes $\{(1, 1), (2, 2), \ldots, (6, 6)\}$ for which Z = 0, so $P(Z = 0) = \frac{6}{36} = \frac{1}{6}$. There are 10 outcomes

$\{(j, j + 1) : 1 \le j \le 5\} \cup \{(j + 1, j) : 1 \le j \le 5\}$

for which Z = 1, so $P_Z(1) = \frac{10}{36} = \frac{5}{18}$. A similar enumeration gives 8 outcomes for which Z = 2, so $P_Z(2) = \frac{8}{36} = \frac{2}{9}$, and thus

$F_Z(2) = \frac{6 + 10 + 8}{36} = \frac{2}{3}.$

(c) Determine all the values of $P_Z$ and sketch the graph of this function. (The preceding problem will get you halfway there.)

Solution. The remaining counts of outcomes are 6, 4, and 2, respectively, for Z = 3, 4, 5, found just as in the tabulation above for Z = 1. This gives the plot in the figure below.

(d) Find E(Z). You can leave the answer as an unevaluated expression. (The numerical answer is about 1.944, which should help tell you if you did the problem correctly.)
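Before the figure and the remaining solutions, here is a brute-force enumeration of all 36 equally likely outcomes (my own sketch, not part of the exam) that reproduces the PMF counts from parts (b) and (c) and the two expected values that appear in parts (d) and (e):

```python
# Tabulate the PMF of Z = |X1 - X2| over the 36 outcomes and compute
# E(Z) and E(max(X1, X2)) exactly.
from collections import Counter
from fractions import Fraction

outcomes = [(x1, x2) for x1 in range(1, 7) for x2 in range(1, 7)]

counts = Counter(abs(x1 - x2) for x1, x2 in outcomes)
print(dict(counts))                       # {0: 6, 1: 10, 2: 8, 3: 6, 4: 4, 5: 2}

e_z = sum(Fraction(abs(x1 - x2), 36) for x1, x2 in outcomes)
e_w = sum(Fraction(max(x1, x2), 36) for x1, x2 in outcomes)
print(e_z, float(e_z))                    # 35/18, about 1.944
print(e_w, float(e_w))                    # 161/36, about 4.472
```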

Figure 1: PMF of Z

Solution. Linearity of expectation is not terribly helpful here; there's not really a simpler strategy than going directly to the definition of expected value:

$E(Z) = \frac{1}{36}(0 \cdot 6 + 1 \cdot 10 + 2 \cdot 8 + 3 \cdot 6 + 4 \cdot 4 + 5 \cdot 2) = \frac{35}{18}.$

(e) Let W be the random variable denoting the larger of the values on the two dice; that is, $W = \max(X_1, X_2)$. Determine E(W), by applying the handy fact

$\max(x, y) = \frac{x + y + |x - y|}{2}.$

You can use the answer to the preceding question, even if you did not solve it.

Solution. Now we can use linearity, along with the simple fact, shown in class, that $E(X_1) = E(X_2) = 3.5$:

$E(W) = \frac{1}{2}\bigl(E(X_1) + E(X_2) + E(Z)\bigr) = \frac{1}{2}\left(\frac{7}{2} + \frac{7}{2} + \frac{35}{18}\right) = \frac{161}{36} \approx 4.472.$

Most common incorrect answer. Apart from a lot of arithmetic errors, the biggest issue was students who correctly computed the expected value of W in the last part of the problem by brute-force application of the definition, without using linearity and the already-computed value of E(Z). I considered the connection between these two an essential part of the problem. At least one student did apply linearity, but took it too far, writing, incorrectly,

$E(|X_1 - X_2|) = E(X_1) - E(X_2) = 3.5 - 3.5 = 0. \quad \text{WRONG!}$

5 How'd You Do?

Aw, it was too easy. If you got 50 or above (26 out of the 35 students who took the exam), you're ready to move on to the more challenging material ahead, but make sure that you understand what you did wrong.

Figure 2: Distribution of grades. The median was 59/65 (91%); the mean was 53.3/65 (82%).

If your score was in the 40s, you're probably missing more than one or two fundamental concepts. You'll need to review this material carefully and work similar problems to be sure you've mastered it. A score below 40 is cause for serious concern. Come talk to me!