
Midterm 1 - Math 5010 - Spring 2016

Name: Firas Rassoul-Agha

Solve the following 4 problems. You have to clearly explain your solution. The answer carries no points. Only the work does. CALCULATORS ARE NOT ALLOWED.

Problem 1: A fair coin is tossed. If it comes up heads the number 1 is recorded. However, if it comes up tails then a die is rolled and its outcome is recorded.

(a) Let X be the number we record. What is its mass function?
(b) Given we recorded a 2, what is the probability the coin landed heads?
(c) Given we recorded a 1, what is the probability the coin landed heads?

(a) X takes the values 1 through 6. It takes the value 1 if the coin comes up heads or if it comes up tails and the die rolls a 1. This has probability 1/2 + (1/2)(1/6) = 7/12. It takes the value 2 only if the coin comes up tails and the die rolls a 2. This has probability (1/2)(1/6) = 1/12. The same holds for the values 3 through 6: each of them is taken with probability (1/2)(1/6) = 1/12.

(b) If we know a 2 was recorded then we know the die must have been rolled and the coin came up tails. So the probability of heads given we recorded a 2 is zero.

(c) Let A be the event that the coin landed heads and B the event that we recorded a 1. We want to compute

P(A | B) = P(A ∩ B) / P(B).

A ∩ B is the event that the coin landed heads and we recorded a 1. But if the coin lands heads we will record a 1, so recording a 1 is redundant information. In other words, A ⊂ B. Therefore, A ∩ B = A. We know P(A) = 1/2, thus P(A ∩ B) = 1/2. As for P(B), this is the probability of recording a 1, which we already computed in part (a) to be 1/2 + (1/2)(1/6) = 7/12. In summary,

P(A | B) = P(A ∩ B) / P(B) = (1/2) / (7/12) = 6/7.
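As a quick sanity check (not part of the exam), the mass function and the conditional probability in (c) can be computed by enumeration with exact fractions:

```python
from fractions import Fraction

half = Fraction(1, 2)

# Build the mass function of X: tails (prob 1/2) rolls a fair die and
# records its face (prob 1/6 each); heads (prob 1/2) records a 1.
pmf = {k: half * Fraction(1, 6) for k in range(1, 7)}
pmf[1] += half

print(pmf[1])         # 7/12
print(pmf[2])         # 1/12
# (c) P(heads | recorded 1) = P(heads) / P(recorded 1), since heads
# guarantees that a 1 is recorded (A is a subset of B).
print(half / pmf[1])  # 6/7
```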

Problem 2: Let F be the function defined by:

F(x) = 0 if x < 0,
F(x) = (x + 1)/8 if 0 ≤ x < 1,
F(x) = x/3 if 1 < x < 2,
F(x) = x/6 + 1/3 if 2 ≤ x < 4,
F(x) = 1 if x ≥ 4.

(a) What value should F have at x = 1 for it to have a chance at being a cumulative distribution function?
(b) Verify that with your answer to (a), F is indeed a cumulative distribution function.
(c) Let X be a random variable which corresponds to F. Is X discrete or continuous?
(d) Compute P{X = 0} and P{X = 1/2}.
(e) Compute P{X < 1}.
(f) Compute P{X = 1 or 1/2 ≤ X < 3/2}.
(g) Compute P{0 ≤ X ≤ 2}.

(a) F needs to be right-continuous. So to get its value at 1 we take the limit of F(x) as x converges to 1 from the right. This means we plug x = 1 into x/3 and get F(1) = 1/3.

(b) The functions making up the different parts of F are all nondecreasing. F has two jumps: at 0 it goes from 0 to 1/8, and at 1 it goes from 1/4 to 1/3. In both cases it goes up. Hence, F itself is nondecreasing. With our choice of F(1) we made F right-continuous at 1. It is also right-continuous at 0, since F(0) is the limit of F(x) as x → 0 from the right. Other than at 0 and at 1, the function F is continuous. Hence, F is right-continuous. Since for x < 0 we have F(x) = 0 and for x > 4 we have F(x) = 1, we do have lim_{x → −∞} F(x) = 0 and lim_{x → ∞} F(x) = 1. These are the requirements needed for a CDF.

(c) Since F is not piecewise constant, it is not the CDF of a discrete random variable. Since F has jumps, it is not the CDF of a continuous random variable either.

(d) P{X = 0} is the size of the jump at 0, which is 1/8 − 0 = 1/8. Similarly, P{X = 1/2} is the size of the jump at 1/2. Since the function is continuous at 1/2, this probability is zero.

(e) P{X < 1} = P{X ≤ 1} − P{X = 1}. Now P{X ≤ 1} = F(1) = 1/3 (from part (a)). On the other hand, P{X = 1} is the size of the jump at 1, which is 1/3 − 1/4. Hence, P{X < 1} = 1/4. In other words, to compute P{X < a} we take the limit of F(x) from the left at x = a.

(f) X = 1 implies 1/2 ≤ X < 3/2. So their union is the bigger event 1/2 ≤ X < 3/2. In other words, the probability in question equals P{1/2 ≤ X < 3/2}. In turn, we have

P{1/2 ≤ X < 3/2} = P{X < 3/2} − P{X < 1/2}
= (P{X ≤ 3/2} − P{X = 3/2}) − (P{X ≤ 1/2} − P{X = 1/2}).

Again, we now use P{X ≤ a} = F(a) and that P{X = a} is the jump at a to compute the above probabilities and get

P{1/2 ≤ X < 3/2} = ((3/2)/3 − 0) − ((1/2 + 1)/8 − 0) = 1/2 − 3/16 = 5/16.

(g) Using P{X ≤ 0} = F(0) and P{X = 0} from (d) we get

P{0 ≤ X ≤ 2} = P{X ≤ 2} − P{X < 0} = P{X ≤ 2} − (P{X ≤ 0} − P{X = 0}) = 2/3 − (1/8 − 1/8) = 2/3.
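The computations in (d)-(g) can be checked with exact arithmetic. The sketch below (my own helper names, not from the exam) codes F and computes left limits by linear extrapolation, which is exact here because every piece of F is linear:

```python
from fractions import Fraction as Fr

def F(x):
    """The CDF from Problem 2, with F(1) = 1/3 as chosen in part (a)."""
    x = Fr(x)
    if x < 0:
        return Fr(0)
    if x < 1:
        return (x + 1) / 8
    if x < 2:
        return x / 3
    if x < 4:
        return x / 6 + Fr(1, 3)
    return Fr(1)

def F_left(a, eps=Fr(1, 10**9)):
    """Left limit of F at a, via exact linear extrapolation from two
    points just below a (valid since each piece of F is linear)."""
    y1, y2 = F(Fr(a) - 2 * eps), F(Fr(a) - eps)
    return y2 + (y2 - y1)

# P{X = a} is the jump at a; P{X < a} is the left limit of F at a.
print(F(0) - F_left(0))                      # (d) P{X = 0}   -> 1/8
print(F(Fr(1, 2)) - F_left(Fr(1, 2)))        # (d) P{X = 1/2} -> 0
print(F_left(1))                             # (e) P{X < 1}   -> 1/4
print(F_left(Fr(3, 2)) - F_left(Fr(1, 2)))   # (f)            -> 5/16
print(F(2) - F_left(0))                      # (g)            -> 2/3
```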

Problem 3: Consider a random variable X with probability density function f(x) = (1/2)e^{−|x|}. Calculate the probability that X² > 2X.

First note that X² > 2X is equivalent to X(X − 2) > 0, which happens only when both factors are of the same sign. This happens exactly when X > 2 or X < 0. The probability of this is then equal to

∫_{−∞}^0 f(x) dx + ∫_2^∞ f(x) dx.

On the second integral we have x ≥ 2 and thus |x| = x. So

∫_2^∞ f(x) dx = (1/2) ∫_2^∞ e^{−x} dx = e^{−2}/2.

For the first integral we have x < 0 and so |x| = −x. The integral is then

∫_{−∞}^0 f(x) dx = (1/2) ∫_{−∞}^0 e^{x} dx = 1/2.

The probability in question is thus equal to 1/2 + e^{−2}/2.

Alternatively, one could compute the probability as one minus the probability of 0 ≤ X ≤ 2, which equals

1 − ∫_0^2 f(x) dx = 1 − (1/2) ∫_0^2 e^{−x} dx = 1 − (1/2)(1 − e^{−2}).
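As a sanity check (not part of the exam), this density is the standard Laplace distribution, so the answer can be verified by simulation: a Laplace(0, 1) draw is an Exp(1) magnitude with a uniformly random sign.

```python
import math
import random

random.seed(0)

# Exact answer from above: 1/2 + e^(-2)/2
exact = 0.5 + math.exp(-2) / 2
print(round(exact, 4))  # 0.5677

# Monte Carlo estimate of P(X^2 > 2X) for X ~ Laplace(0, 1)
n = 200_000
hits = 0
for _ in range(n):
    x = random.expovariate(1.0) * random.choice((-1.0, 1.0))
    if x * x > 2 * x:
        hits += 1

print(abs(hits / n - exact) < 0.01)  # True: estimate matches the exact value
```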

Problem 4: A store has 10,000 table tennis balls in a big box, 6000 of which are white and 4000 are orange. You pull 30 of the balls from the box at random. We want to estimate the probability you will have at least half the balls white.

(a) Of course, the balls are pulled without replacement. Write a formula for the probability you have 15 or fewer orange balls. I expect your answer to be a long sum and I do not expect that you will be able to give a final number.
(b) Write a formula for the probability of getting 15 or fewer orange balls, if instead the balls are chosen with replacement. This is an estimate of your answer to (a), since 10,000 balls is so large that things do not really change much when we take out 30 balls. I again expect a long sum but still not a final number.
(c) If X is the random variable that indicates the number of orange balls among the 30 chosen with replacement, then what is the distribution of X? Is it a Bernoulli, Binomial, Geometric, Poisson, Uniform, Exponential, Cauchy, Normal, or none of these?
(d) Use the central limit theorem to approximate your answer to (b) (which in turn would approximate your answer to (a)). Now, I expect a number. (You can use the attached normal table and that 3/√7.2 is about 1.12.)

(a) The probability in question is the sum of the probabilities of getting exactly 0 orange balls, exactly one orange ball, ..., exactly 15 orange balls. To get exactly k orange balls without replacement we need to pick k of the 4,000 orange balls and 30 − k of the 6,000 white ones. The number of ways to do this is C(4000, k) · C(6000, 30 − k). The total number of ways to pick 30 balls out of 10,000 is C(10,000, 30). So the probability of picking exactly k orange balls equals C(4000, k) · C(6000, 30 − k) / C(10,000, 30). The probability in question then equals

Σ_{k=0}^{15} C(4000, k) C(6000, 30 − k) / C(10,000, 30).

(b) Again, the probability in question is the sum of the probabilities of getting exactly 0 orange balls, exactly one orange ball, ..., exactly 15 orange balls. Since we are now picking with replacement, the probability of a given ball being orange is 0.4 and the probability of it being white is 0.6. There are C(30, k) ways to pick which of the 30 balls will be the k orange ones. Therefore, the probability of picking exactly k orange balls equals C(30, k) (0.4)^k (0.6)^{30−k}. The probability in question then equals

Σ_{k=0}^{15} C(30, k) (0.4)^k (0.6)^{30−k}.

(c) If we consider getting an orange ball a success, then X is a Binomial random variable with parameters 30 (number of trials) and 0.4 (probability of success).

(d) A Binomial random variable can be approximated by a Normal random variable with parameters µ = np and σ² = np(1 − p). In other words, if we compute 30 · 0.4 = 12 and 30 · 0.4 · 0.6 = 7.2 and rewrite

P(X ≤ 15) = P((X − 12)/√7.2 ≤ (15 − 12)/√7.2),

then we can pretend (X − 12)/√7.2 is a standard Normal and this will approximate the above probability for us. Hence,

P(X ≤ 15) ≈ Φ(3/√7.2) ≈ Φ(1.12) ≈ 0.8686.
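The three answers can be compared numerically (not part of the exam; this just confirms that the binomial sum approximates the hypergeometric one, and that the normal approximation is in the same ballpark):

```python
from math import comb, sqrt
from statistics import NormalDist

# (a) Without replacement: hypergeometric sum
hyper = sum(comb(4000, k) * comb(6000, 30 - k) for k in range(16)) / comb(10_000, 30)

# (b) With replacement: Binomial(30, 0.4), P(X <= 15)
binom = sum(comb(30, k) * 0.4**k * 0.6**(30 - k) for k in range(16))

# (d) Normal approximation with mu = 12, sigma = sqrt(7.2)
approx = NormalDist().cdf(3 / sqrt(7.2))

print(round(approx, 4))            # 0.8682, close to the table value 0.8686
print(abs(hyper - binom) < 0.005)  # True: (a) and (b) nearly agree
```

The normal approximation here uses no continuity correction, matching the solution above; with it, the approximation would land closer to the exact binomial sum.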