MATH 3670 First Midterm, February 17. No books or notes. No cellphone or wireless devices. Write clearly and show your work for every answer.

Name:

Question:   1    2    3    4   Total
Points:    30   20   20   40     110
Score:

1. The following numbers $x_i$, $i = 1, \ldots, 17$, represent a sample of size $n = 17$ from a given population.

   6.46 6.33 6.93 7.12 7.26 11.12 7.03 6.96 5.48 5.35 6.49 7.87 6.02 6.75 5.67 4.01 5.82

(a) (10 points) Compute the sample median and fourth spread.

Solution: After ordering the data you obtain

   4.01 5.35 5.48 5.67 5.82 6.02 6.33 6.46 6.49 6.75 6.93 6.96 7.03 7.12 7.26 7.87 11.12

so that

   $\tilde{x} = 6.49$, $\mathrm{lf} = 5.82$, $\mathrm{uf} = 7.03$, $f_s = 7.03 - 5.82 = 1.21$.

(b) (10 points) Knowing that $\sum_{i=1}^{17} x_i = 112.67$ and $\sum_{i=1}^{17} x_i^2 = 781.35$, compute the sample mean and variance.

Solution:

   $\bar{x} = \frac{112.67}{17} = 6.63$

   $s^2 = \frac{1}{16}\left(781.35 - \frac{112.67^2}{17}\right) = 2.16$
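As a quick sanity check (not part of the original exam), the summary statistics in (a) and (b) can be recomputed from the raw data; the minimal Python sketch below assumes Devore's convention that, for odd $n$, the median belongs to both halves when computing the fourths.

   # Sketch: recompute the Problem 1 summary statistics (not part of the exam).
   data = [6.46, 6.33, 6.93, 7.12, 7.26, 11.12, 7.03, 6.96, 5.48,
           5.35, 6.49, 7.87, 6.02, 6.75, 5.67, 4.01, 5.82]

   x = sorted(data)
   n = len(x)                                   # n = 17

   median = x[n // 2]                           # middle ordered value -> 6.49
   lower_half = x[: (n + 1) // 2]               # smallest 9 values, median included
   upper_half = x[n // 2:]                      # largest 9 values, median included
   lower_fourth = lower_half[len(lower_half) // 2]   # 5.82
   upper_fourth = upper_half[len(upper_half) // 2]   # 7.03
   fourth_spread = upper_fourth - lower_fourth       # 1.21

   mean = sum(x) / n                                          # ~6.63
   s2 = (sum(v * v for v in x) - sum(x) ** 2 / n) / (n - 1)   # ~2.16

   print(median, lower_fourth, upper_fourth, round(fourth_spread, 2))
   print(round(mean, 2), round(s2, 2))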

(c) (10 points) After checking for outliers, draw a box plot of the data.

Solution: Observe that $4.01 > 5.82 - 1.5 \cdot 1.21 = 4.005$, so there are no outliers in the lower part of the sample. In the upper part we have $11.12 > 7.03 + 3 \cdot 1.21 = 10.66$ while $7.87 < 7.03 + 1.5 \cdot 1.21 = 8.845$, so 11.12 is the only outlier and it is an extreme outlier.
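The outlier classification can be checked the same way; this short sketch (again not from the exam) plugs in the fourths found in part (a) and evaluates the mild and extreme fences.

   # Sketch: mild and extreme outlier fences for Problem 1(c),
   # using the fourths computed in part (a).
   data = [6.46, 6.33, 6.93, 7.12, 7.26, 11.12, 7.03, 6.96, 5.48,
           5.35, 6.49, 7.87, 6.02, 6.75, 5.67, 4.01, 5.82]
   lf, uf, fs = 5.82, 7.03, 1.21        # lower fourth, upper fourth, fourth spread

   mild = [v for v in data
           if uf + 1.5 * fs < v <= uf + 3 * fs or lf - 3 * fs <= v < lf - 1.5 * fs]
   extreme = [v for v in data if v > uf + 3 * fs or v < lf - 3 * fs]
   print(mild, extreme)                 # expect [] and [11.12]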

2. Let A, B and C be three events.

(a) (10 points) Assume that A, B and C are mutually independent. Show that A and $B \cap C$ are independent.

Solution: We need to show that

   $P(A \cap (B \cap C)) = P(A)\,P(B \cap C)$.

Observe that by hypothesis

   $P(A \cap (B \cap C)) = P(A \cap B \cap C) = P(A)P(B)P(C)$ and $P(B \cap C) = P(B)P(C)$,

from which the thesis follows immediately.

(b) (10 points) Flip a fair coin twice and consider the three events A = {the first flip is Heads}, B = {the second flip is Heads}, and C = {exactly one flip is Heads}. Show that A and B are independent, A and C are independent, and B and C are independent. Are A, B and C independent?

Solution: Observe that $P(A) = P(B) = P(C) = 0.5$. Moreover $A \cap B$ = {both flips are Heads}, $A \cap C$ = {the first flip is Heads and the second is Tails}, and $B \cap C$ = {the first flip is Tails and the second is Heads}. Thus $P(A \cap B) = P(A \cap C) = P(B \cap C) = 0.25$, which in each case equals the product of the individual probabilities, so A and B are independent, A and C are independent, and B and C are independent. Finally $A \cap B \cap C = \emptyset$, so $P(A \cap B \cap C) = 0 \neq P(A)P(B)P(C) = 0.125$. Thus A, B and C are not mutually independent.
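Part (b) can be made concrete by enumeration; the sketch below (not part of the exam, with the event functions named for convenience) lists the four equally likely outcomes of two fair flips and compares each joint probability with the corresponding product.

   # Sketch for Problem 2(b): enumerate the 4 equally likely outcomes of two fair flips.
   from itertools import product

   outcomes = list(product("HT", repeat=2))        # each outcome has probability 1/4

   def prob(event):
       return sum(1 for o in outcomes if event(o)) / len(outcomes)

   A = lambda o: o[0] == "H"                       # first flip is Heads
   B = lambda o: o[1] == "H"                       # second flip is Heads
   C = lambda o: (o[0] == "H") != (o[1] == "H")    # exactly one flip is Heads

   print(prob(lambda o: A(o) and B(o)), prob(A) * prob(B))   # 0.25 0.25
   print(prob(lambda o: A(o) and C(o)), prob(A) * prob(C))   # 0.25 0.25
   print(prob(lambda o: B(o) and C(o)), prob(B) * prob(C))   # 0.25 0.25
   print(prob(lambda o: A(o) and B(o) and C(o)),
         prob(A) * prob(B) * prob(C))                        # 0.0 0.125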

3. Consider the circuit depicted in the figure.

   [Circuit diagram: element A in series with elements B and C in parallel.]

You know that the entire circuit works if and only if element A and at least one of elements B and C work. Assume that the probability that A works is 0.9, while the probability that B works is 0.7 and the probability that C works is 0.7. Each element works or does not work independently of the others.

(a) (10 points) Compute the probability that the entire circuit works.

Solution: Call A, B and C the events that element A, B or C (respectively) works, and call I the event that the entire circuit works. We have

   $P(I) = P(A)\,P(B \cup C) = 0.9\,P(B \cup C)$.

Observe that

   $P(B \cup C) = 1 - P(B' \cap C') = 1 - P(B')P(C') = 1 - (1 - P(B))(1 - P(C)) = 1 - 0.3 \cdot 0.3$.

Thus we have

   $P(I) = 0.9(1 - 0.3^2) = 0.819$.

(b) (10 points) What is the probability that element A does not work given that the circuit does not work?

Solution: Since $A' \subseteq I'$ (if A fails, the circuit fails), we have $P(I' \cap A') = P(A')$, so

   $P(A' \mid I') = \frac{P(I' \cap A')}{P(I')} = \frac{1 - P(A)}{1 - P(I)} = \frac{0.1}{1 - 0.9(1 - 0.3^2)} = 0.552$.
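Both answers in Problem 3 can be double-checked by summing over the eight work/fail states of the three elements; the sketch below is not part of the exam and simply weights each state by its probability.

   # Sketch for Problem 3: enumerate the 2^3 work/fail states of A, B, C.
   from itertools import product

   p = {"A": 0.9, "B": 0.7, "C": 0.7}

   p_works = 0.0              # P(circuit works)
   p_fail_and_A_fail = 0.0    # P(circuit fails and A fails)

   for a, b, c in product([True, False], repeat=3):
       weight = ((p["A"] if a else 1 - p["A"])
                 * (p["B"] if b else 1 - p["B"])
                 * (p["C"] if c else 1 - p["C"]))
       works = a and (b or c)             # A in series with the parallel pair B, C
       if works:
           p_works += weight
       elif not a:
           p_fail_and_A_fail += weight

   print(round(p_works, 3))                               # 0.819
   print(round(p_fail_and_A_fail / (1 - p_works), 3))     # ~0.552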

4. In a bucket there are 1000 red balls and 2000 blue balls. You extract 1 ball and then flip a fair coin. If the result is Heads you reinsert the ball in the bucket; if the result is Tails you keep it. You repeat the above procedure 20 times. Call N the number of balls you have kept at the end of the 20 extractions and R the number of red balls among them.

(a) (10 points) Recognize what kind of r.v. N is and write its p.m.f. (Hint: relate N to the number of Tails you got.)

Solution: N is a binomial r.v. with parameters 20 and 0.5. Thus

   $P(N = n) = \binom{20}{n} 0.5^n\, 0.5^{20-n} = \binom{20}{n} 0.5^{20}$.

(b) (10 points) What is the probability that R = 3 given that N = 10, that is, find $P(R = 3 \mid N = 10)$? (Hint: once you know the value of N, what kind of r.v. is R?)

Solution: If N = 10 then R is a hypergeometric r.v. with parameters 10, 1000, 2000, so that

   $P(R = 3 \mid N = 10) = \frac{\binom{1000}{3}\binom{2000}{7}}{\binom{3000}{10}}$.
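The p.m.f. values in (a) and (b) are easy to evaluate numerically; the sketch below is not part of the exam, and the helper names p_N and p_R_given_N are just illustrative.

   # Sketch for Problem 4(a)-(b): evaluate the binomial and hypergeometric p.m.f.s.
   from math import comb

   def p_N(n):
       # N ~ Binomial(20, 0.5)
       return comb(20, n) * 0.5 ** 20

   def p_R_given_N(r, n):
       # Given N = n, R is hypergeometric: r red out of 1000, n - r blue out of 2000.
       return comb(1000, r) * comb(2000, n - r) / comb(3000, n)

   print(p_N(10))               # ~0.176
   print(p_R_given_N(3, 10))    # ~0.26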

(c) (10 points) Find an expression for the probability that R = 3. (Hint: use the above results and the law of total probability.)

Solution: We have

   $P(R = 3) = \sum_{n=3}^{20} P(R = 3 \mid N = n)\,P(N = n) = \sum_{n=3}^{20} \frac{\binom{1000}{3}\binom{2000}{n-3}}{\binom{3000}{n}} \binom{20}{n} 0.5^{20}$.

(d) (10 points) (Bonus) Observing that $20 \ll 1000$ and $20 \ll 2000$, find an approximate p.m.f. for R. (Hint: what is the probability that you extract a red ball and you keep it?)

Solution: We can assume that each ball, upon extraction, is red with probability $\frac{1000}{3000} = \frac{1}{3}$ and blue with probability $\frac{2000}{3000} = \frac{2}{3}$. Moreover we can assume that different extractions are independent. After flipping the coin we have 3 possible outcomes: the ball is reinserted (with probability $\frac{1}{2}$), the ball is kept and is blue (with probability $\frac{2}{3} \cdot \frac{1}{2} = \frac{1}{3}$), or the ball is kept and is red (with probability $\frac{1}{3} \cdot \frac{1}{2} = \frac{1}{6}$). Thus we can say that R is approximately a binomial r.v. with parameters 20 and $\frac{1}{6}$, that is

   $P(R = r) \approx \binom{20}{r}\left(\frac{1}{6}\right)^r \left(\frac{5}{6}\right)^{20-r}$.
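The exact total-probability sum in (c) and the Binomial(20, 1/6) approximation in (d) can be compared numerically; the sketch below is not part of the exam and restates the helpers from the previous sketch so it runs on its own.

   # Sketch for Problem 4(c)-(d): exact P(R = 3) via total probability
   # versus the Binomial(20, 1/6) approximation.
   from math import comb

   def p_N(n):
       return comb(20, n) * 0.5 ** 20

   def p_R_given_N(r, n):
       return comb(1000, r) * comb(2000, n - r) / comb(3000, n)

   exact = sum(p_R_given_N(3, n) * p_N(n) for n in range(3, 21))
   approx = comb(20, 3) * (1 / 6) ** 3 * (5 / 6) ** 17
   print(exact, approx)         # both should be close to 0.24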

Useful Formulas

If A and B are events, then the probability of A given B is

   $P(A \mid B) = \frac{P(A \cap B)}{P(B)}$.

If X is a binomial r.v. with parameters N and p, then

   $\mathrm{bin}(x; N, p) = P(X = x) = \binom{N}{x} p^x (1 - p)^{N - x}$.

If X is a hypergeometric r.v. with parameters N, M and n, then

   $h(x; n, N, M) = \frac{\binom{M}{x}\binom{N}{n - x}}{\binom{N + M}{n}}$.

If X is a Poisson r.v. with parameter $\lambda$, then

   $p(x; \lambda) = P(X = x) = e^{-\lambda}\,\frac{\lambda^x}{x!}$.