MATH 3510: PROBABILITY AND STATS June 15, 2011 MIDTERM EXAM


YOUR NAME: KEY: Answers in Blue

Show all your work. Answers out of the blue and without any supporting work may receive no credit even if they are right! Write clearly. You may use a calculator. Box your final answers. No notes allowed.

DO NOT WRITE IN THIS BOX!
Points by problem: 1: 9 pts, 2: 9 pts, 3: 10 pts, 4: 10 pts, 5: 9 pts, 6: 10 pts, 7: 6 pts, 8: 7 pts, 9: 10 pts, 10: 10 pts, 11: 10 pts, 12: 10 pts. TOTAL: 100 pts.

1. (9 pts) State the Kolmogorov axioms for a probability measure.

1. For every event A, P(A) ≥ 0.
2. P(Ω) = 1, where Ω denotes the sample space.
3. For a sequence of mutually exclusive events A_1, A_2, ..., P(∪_{i=1}^∞ A_i) = Σ_{i=1}^∞ P(A_i).

2. (9 pts) Define the following terms. Use full sentences and be as precise as possible.

(a) sample space: The sample space is the set of all outcomes of a chance experiment.
(b) event: An event is a (nice) subset of the sample space.
(c) random variable: A random variable X is a function from the sample space Ω to the set of real numbers.

3. (10 pts) Recall that the covariance of random variables X and Y is defined to be cov(X, Y) = E[(X − E(X))(Y − E(Y))]. Prove that σ²(X + Y) = σ²(X) + σ²(Y) + 2 cov(X, Y). (If you wish, you may assume the fact that E(aX + bY) = aE(X) + bE(Y) without proof.)

Proof: We compute

σ²(X + Y) = E[(X + Y − E(X + Y))²]
= E[(X + Y − E(X) − E(Y))²]
= E[((X − E(X)) + (Y − E(Y)))²]
= E[(X − E(X))² + 2(X − E(X))(Y − E(Y)) + (Y − E(Y))²]
= E[(X − E(X))²] + 2E[(X − E(X))(Y − E(Y))] + E[(Y − E(Y))²]
= σ²(X) + σ²(Y) + 2 cov(X, Y).

4. (10 pts) Let A and B be events where the set A is a subset of the set B. Show that P(A) ≤ P(B). You may use the Kolmogorov axioms and any other basic rules that we've seen in class. If a rule requires a certain hypothesis (such as two events being disjoint), be sure to explain why the hypothesis is satisfied.

Proof: Let C = B \ A. Since A is a subset of B, we have B = A ∪ C, and A and C are disjoint. Since A and C are disjoint, we have by Rule 7.1 that P(B) = P(A) + P(C). Also, by Axiom 1, P(C) ≥ 0, so that P(B) ≥ P(A).
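The identity in problem 3 can be sanity-checked numerically; below is a short Python sketch (the correlated sample data is illustrative, not from the exam). With sample moments all computed with the same denominator, the identity holds exactly up to floating-point error.

```python
import random

# Empirical check of var(X + Y) = var(X) + var(Y) + 2 cov(X, Y)
random.seed(0)
xs = [random.gauss(0, 1) for _ in range(100_000)]
ys = [x + random.gauss(0, 0.5) for x in xs]  # Y deliberately correlated with X

def mean(v):
    return sum(v) / len(v)

def var(v):
    m = mean(v)
    return sum((a - m) ** 2 for a in v) / len(v)

def cov(u, v):
    mu, mv = mean(u), mean(v)
    return sum((a - mu) * (b - mv) for a, b in zip(u, v)) / len(u)

lhs = var([x + y for x, y in zip(xs, ys)])
rhs = var(xs) + var(ys) + 2 * cov(xs, ys)
assert abs(lhs - rhs) < 1e-9  # identity holds exactly for sample moments
```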

5. (9 pts) Roll two dice and let X be the larger of the numbers rolled.

(a) Find the probability mass function of X.
P(X = 1) = 1/36, P(X = 2) = 3/36, P(X = 3) = 5/36, P(X = 4) = 7/36, P(X = 5) = 9/36, P(X = 6) = 11/36. More compactly, P(X = k) = (2k − 1)/36 for k = 1, ..., 6.

(b) Find the expected value of X.
E(X) = Σ_{k=1}^{6} k P(X = k) = 161/36 ≈ 4.472.

(c) Find the standard deviation of X.
E(X²) = Σ_{k=1}^{6} k² P(X = k) = 791/36, so var(X) = 791/36 − (161/36)² = 2555/1296 ≈ 1.971. Thus, σ(X) = √var(X) ≈ 1.404.
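The pmf and moments in problem 5 can be verified by enumerating all 36 equally likely rolls; a quick Python check in exact arithmetic:

```python
from fractions import Fraction

# Enumerate all 36 outcomes of two dice; X is the larger of the two numbers.
counts = {k: 0 for k in range(1, 7)}
for a in range(1, 7):
    for b in range(1, 7):
        counts[max(a, b)] += 1

pmf = {k: Fraction(c, 36) for k, c in counts.items()}
# Matches the closed form P(X = k) = (2k - 1)/36
assert all(pmf[k] == Fraction(2 * k - 1, 36) for k in pmf)

EX = sum(k * p for k, p in pmf.items())        # 161/36
EX2 = sum(k * k * p for k, p in pmf.items())   # 791/36
varX = EX2 - EX ** 2                           # 2555/1296
```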

6. (10 pts) In the problem below, be sure to include any necessary parameters (n, p, λ, etc.)

(a) Find the expected value and variance for a Bernoulli distribution (that is, a random variable that equals 1 on the success of a Bernoulli experiment and 0 otherwise).
Let I be Bernoulli with parameter p. Then E(I) = 1·p + 0·(1 − p) = p and var(I) = E(I²) − E(I)² = (1²·p + 0²·(1 − p)) − p² = p − p² = p(1 − p).

(b) Find the expected value and variance for a binomial distribution. (You probably want to use your answers from (a).)
Let X be binomial with parameters n and p. Write X = I_1 + ... + I_n, where I_1, ..., I_n are independent Bernoulli trials with parameter p. Then E(X) = E(I_1) + ... + E(I_n) = np and var(X) = var(I_1) + ... + var(I_n) = np(1 − p).

7. (6 pts) Let A and B be events with P(A) = 0.8 and P(B) = 0.4. If the probability that both A and B occur is 0.3, what is the probability that neither A nor B occurs?
P(Aᶜ ∩ Bᶜ) = P((A ∪ B)ᶜ) = 1 − P(A ∪ B) = 1 − (P(A) + P(B) − P(AB)) = 1 − (0.8 + 0.4 − 0.3) = 0.1.
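The binomial formulas from 6(b) and the complement computation in problem 7 can both be checked directly; a short Python sketch (the values n = 10, p = 0.3 are arbitrary, chosen only for the check):

```python
from math import comb

# Check E(X) = np and var(X) = np(1 - p) against the exact binomial pmf.
n, p = 10, 0.3
pmf = [comb(n, k) * p**k * (1 - p) ** (n - k) for k in range(n + 1)]
EX = sum(k * q for k, q in enumerate(pmf))
VX = sum(k * k * q for k, q in enumerate(pmf)) - EX**2
assert abs(EX - n * p) < 1e-9
assert abs(VX - n * p * (1 - p)) < 1e-9

# Problem 7: P(neither A nor B) = 1 - (P(A) + P(B) - P(AB))
neither = 1 - (0.8 + 0.4 - 0.3)
assert abs(neither - 0.1) < 1e-12
```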

8. (7 pts) Suppose the random variables X_1, ..., X_n are independent and identically distributed with expected value µ and standard deviation σ. Find the expected value and standard deviation of the random variable X = X_1 + ... + X_n in terms of µ and σ. If n is large, what can you say about the distribution of the variable X?

E(X) = E(X_1) + ... + E(X_n) = µ + ... + µ = nµ.
σ(X) = √var(X) = √(var(X_1) + ... + var(X_n)) = √(σ² + ... + σ²) = σ√n.
For n large, the Central Limit Theorem tells us that X will have an approximately normal distribution: X is approximately N(nµ, σ²n).

9. (10 pts) What are the chances of getting at least three sixes in one throw of 18 dice?

Let X be the number of sixes in one throw of 18 dice. Then X ~ B(18, 1/6), and
P(X ≥ 3) = 1 − P(X ≤ 2) = 1 − P(X = 2) − P(X = 1) − P(X = 0)
= 1 − C(18, 2)(1/6)²(5/6)¹⁶ − C(18, 1)(1/6)¹(5/6)¹⁷ − C(18, 0)(1/6)⁰(5/6)¹⁸ ≈ 0.597.
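The answer to problem 9 can be reproduced with an exact binomial computation; a brief Python check:

```python
from fractions import Fraction
from math import comb

# Exact P(X >= 3) for X ~ Binomial(18, 1/6): at least three sixes in 18 dice.
p = Fraction(1, 6)

def binom_pmf(n, k, p):
    return comb(n, k) * p**k * (1 - p) ** (n - k)

prob = 1 - sum(binom_pmf(18, k, p) for k in range(3))
assert abs(float(prob) - 0.597) < 0.001
```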

10. (10 pts) What is the probability that a hand of 13 cards contains a four of a kind?

We will use the Principle of Inclusion-Exclusion. For the 13 ranks, let A_i be the event that the hand contains all four cards of rank i, 1 ≤ i ≤ 13. We compute P(A_i) = C(48, 9)/C(52, 13) (as four of the thirteen cards in the hand are fixed, so that the other 9 must be chosen from the 48 that remain in the deck), and similarly that P(A_i ∩ A_j) = C(44, 5)/C(52, 13) and P(A_i ∩ A_j ∩ A_k) = C(40, 1)/C(52, 13). Since the hand only contains 13 cards, it cannot possibly contain more than three four-of-a-kinds. So,

P(∪_{i=1}^{13} A_i) = Σ_{k=1}^{3} (−1)^{k+1} C(13, k) C(52 − 4k, 13 − 4k)/C(52, 13) ≈ 0.0342.
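The inclusion-exclusion sum in problem 10 is easy to evaluate exactly; a quick Python check:

```python
from math import comb

# P(a 13-card hand contains a four of a kind), by inclusion-exclusion
# over the 13 ranks; at most three four-of-a-kinds fit in 13 cards.
total = comb(52, 13)
numerator = sum(
    (-1) ** (k + 1) * comb(13, k) * comb(52 - 4 * k, 13 - 4 * k)
    for k in range(1, 4)
)
prob = numerator / total
assert abs(prob - 0.0342) < 0.0005
```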

11. (10 pts) In a particular area, the number of traffic accidents hovers around an average of 1,050. Last year, however, the number of accidents plunged drastically to 920. Authorities suggest that the decrease is the result of new traffic safety measures that have been in effect for one year. Statistically speaking, is there cause to doubt this explanation?

Let X be the number of accidents in a year. Then X is Poisson distributed with λ = 1050. Since λ ≥ 25, we may approximate X by the normal distribution N(1050, 1050). The probability that there will be 920 or fewer accidents in a year is

P(X ≤ 920) = P((X − 1050)/√1050 ≤ (920 − 1050)/√1050) ≈ P(Z ≤ −4.01) = Φ(−4.01)

(where Z denotes a standard normal). With a statistical calculator, we can compute that this probability is about 0.00003. But even without a calculator we know that this probability will be very small, as 920 is more than four standard deviations below the mean. Thus, it is unlikely that the drop in accidents is due to a chance variation, and so there is no statistical reason to doubt this explanation. (Note, however, that our calculation only considered chance variation, so there may be non-statistical reasons to doubt this explanation.)
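The z-score and tail probability in problem 11 can be reproduced with the standard normal CDF written in terms of the error function; a short Python check:

```python
from math import sqrt, erf

# Normal approximation N(1050, 1050) to Poisson(1050):
# z-score for 920 accidents and the corresponding lower-tail probability.
z = (920 - 1050) / sqrt(1050)
phi = 0.5 * (1 + erf(z / sqrt(2)))  # standard normal CDF Phi(z) via erf

assert abs(z - (-4.01)) < 0.01
assert phi < 1e-4  # roughly 0.00003: over four standard deviations out
```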

12. (Extra Credit, 10 pts) Let X be Poisson distributed with parameter λ. Calculate the expected value and the standard deviation of X.

We calculate:

E(X) = Σ_{k=0}^∞ k P(X = k)
= Σ_{k=1}^∞ k e^{−λ} λ^k / k!
= λ e^{−λ} Σ_{k=1}^∞ λ^{k−1} / (k−1)!
= λ e^{−λ} Σ_{k=0}^∞ λ^k / k!
= λ e^{−λ} e^λ = λ.

Note that var(X) = E(X²) − E(X)². We calculate:

E(X²) = Σ_{k=0}^∞ k² P(X = k)
= Σ_{k=0}^∞ (k(k−1) + k) e^{−λ} λ^k / k!   (as k(k−1) + k = k² − k + k = k²)
= e^{−λ} Σ_{k=2}^∞ k(k−1) λ^k / k! + e^{−λ} Σ_{k=0}^∞ k λ^k / k!
= λ² e^{−λ} Σ_{k=2}^∞ λ^{k−2} / (k−2)! + E(X)
= λ² e^{−λ} Σ_{k=0}^∞ λ^k / k! + λ
= λ² e^{−λ} e^λ + λ = λ² + λ,

so that var(X) = λ² + λ − λ² = λ and σ(X) = √var(X) = √λ.