Math 426: Probability MWF 1pm, Gasson 310 Exam 3 SOLUTIONS


Name: ANSWER KEY
Math 426: Probability, MWF 1pm, Gasson 310
Exam 3 SOLUTIONS

[Score table: Problems 1-6, BONUS, Total]

Please write neatly. You may leave answers below unsimplified. Have fun and write your name above!

1. Suppose that the heights of soybean plants are normally distributed, with mean µ = 4 meters and standard deviation σ = 5/4 meters. Given a random soybean plant, use the table attached to do the following:

a) Estimate the probability that the soybean plant was taller than 5 meters.

b) Estimate the probability that the soybean plant was between 2 and 4 meters.

Let X indicate the height of a random soybean plant, and note that Z = (X − 4)/(5/4) is a standard normal random variable. We have

P(X > 5) = P(Z = (X − 4)/(5/4) > (5 − 4)/(5/4) = 0.8),

and

P(2 < X < 4) = P((2 − 4)/(5/4) < (X − 4)/(5/4) < (4 − 4)/(5/4)) = P(−1.6 < Z < 0).

Each of these can be expressed in terms of the cumulative distribution function Φ of a standard normal random variable:

P(Z > 0.8) = 1 − Φ(0.8), and P(−1.6 < Z < 0) = Φ(0) − Φ(−1.6) = 0.5 − Φ(−1.6).

Each of the values Φ(0.8) and Φ(−1.6) can be ascertained from the attached table, and we find Φ(0.8) ≈ 0.7881 and Φ(−1.6) ≈ 0.0548. Thus

P(X > 5) ≈ 1 − 0.7881 = 0.2119, and P(2 < X < 4) ≈ 0.5 − 0.0548 = 0.4452.
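As a quick numerical sanity check (not part of the original exam), the table lookups in Problem 1 can be reproduced with Python's math.erf, which gives the standard normal CDF directly rather than from a table:

```python
from math import erf, sqrt

def phi(z):
    """Standard normal CDF, Phi(z), computed via the error function."""
    return 0.5 * (1.0 + erf(z / sqrt(2.0)))

mu, sigma = 4, 5 / 4  # heights are N(4, (5/4)^2)

p_taller = 1 - phi((5 - mu) / sigma)                       # P(X > 5) = 1 - Phi(0.8)
p_between = phi((4 - mu) / sigma) - phi((2 - mu) / sigma)  # Phi(0) - Phi(-1.6)

print(round(p_taller, 4))   # approximately 0.2119
print(round(p_between, 4))  # approximately 0.4452
```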

2. Suppose that X and Y are jointly continuous with joint pdf

f(x, y) = 2/(x^2 y^2) for y > x > 1,

and zero otherwise.

a) Find the marginal density f_X.

For x > 1 the marginal density is given by

f_X(x) = ∫ f(x, y) dy = ∫_x^∞ 2/(x^2 y^2) dy = [−2/(x^2 y)]_{y=x}^∞ = 0 + 2/x^3 = 2/x^3.

If x ≤ 1, then f_X(x) = ∫ f(x, y) dy = 0.

b) Find the marginal density f_Y.

For y > 1 the marginal density is given by

f_Y(y) = ∫ f(x, y) dx = ∫_1^y 2/(x^2 y^2) dx = [−2/(x y^2)]_{x=1}^y = (2/y^2)(1 − 1/y) = 2(y − 1)/y^3.

If y ≤ 1, then f_Y(y) = ∫ f(x, y) dx = 0.

c) Are X and Y independent? Justify your answer.

For values of x, y satisfying y > x > 1, we have

f_X(x) f_Y(y) = 4(y − 1)/(x^3 y^3).

The latter is equal to f(x, y) only if 4(y − 1)/(x^3 y^3) = 2/(x^2 y^2), i.e. 2xy = 4(y − 1). This condition fails for most values of x, y satisfying y > x > 1, so X and Y are not independent.
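The marginals and the non-independence claim in Problem 2 can be spot-checked numerically (a sketch, not part of the original solution): recompute f_X(2) by a Riemann sum of the joint density, and compare f(x, y) with f_X(x) f_Y(y) at a sample point in the region.

```python
def f(x, y):
    """Joint pdf: 2/(x^2 y^2) on the region y > x > 1, zero otherwise."""
    return 2.0 / (x * x * y * y) if y > x > 1 else 0.0

def f_X(x):
    """Marginal density of X derived above."""
    return 2.0 / x**3 if x > 1 else 0.0

def f_Y(y):
    """Marginal density of Y derived above."""
    return 2.0 * (y - 1) / y**3 if y > 1 else 0.0

# Recompute f_X(2) by a midpoint Riemann sum of f(2, y) over y in (2, 200);
# the tail beyond 200 contributes negligibly.
dy = 0.001
approx = sum(f(2.0, 2.0 + (k + 0.5) * dy) * dy for k in range(int(198 / dy)))
print(round(approx, 2), f_X(2.0))        # both near 0.25

# At (x, y) = (2, 4) the product of the marginals differs from the joint
# density, consistent with X and Y being dependent.
print(f(2.0, 4.0), f_X(2.0) * f_Y(4.0))  # 0.03125 vs 0.0234375
```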

3. Suppose that X and Y are jointly continuous random variables with joint probability density function

f(x, y) = x^5 y if 0 < x < 1 and 0 < y < mx, and 0 otherwise,

for some constant m > 0.

a) Find m. (Hint: The probability of the whole sample space is 1.)

Because the probability of the whole sample space is 1, we have

1 = P(S) = ∫∫ f(x, y) dy dx = ∫_0^1 ∫_0^{mx} x^5 y dy dx = ∫_0^1 x^5 · (mx)^2/2 dx = (m^2/2) ∫_0^1 x^7 dx = (m^2/2) [x^8/8]_0^1 = m^2/16.

Thus m^2 = 16, and m = 4.

b) Find E[XY].

We compute:

E[XY] = ∫∫ xy f(x, y) dy dx = ∫_0^1 ∫_0^{4x} x^6 y^2 dy dx = ∫_0^1 x^6 · (4x)^3/3 dx = (64/3) ∫_0^1 x^9 dx = (64/3)(1/10) = 32/15.
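Both integrals in Problem 3 reduce to rational arithmetic, so they can be double-checked exactly with Python's fractions module (a verification sketch, not part of the original solution):

```python
from fractions import Fraction

# Normalization: the inner integral of x^5 y over 0 < y < m x is (m^2/2) x^7,
# and integrating x^7 over (0, 1) gives 1/8, so total mass is m^2/16.
# With m = 4 this should equal 1.
m = 4
total_mass = Fraction(m * m, 2) * Fraction(1, 8)
print(total_mass)  # 1

# E[XY]: the inner integral of x^6 y^2 over 0 < y < 4x is (64/3) x^9,
# and integrating x^9 over (0, 1) gives 1/10.
e_xy = Fraction(64, 3) * Fraction(1, 10)
print(e_xy)  # 32/15
```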

4. Prove the following or give a counterexample: If cov(X, Y) = 0, then X and Y are independent.

This statement is false. Let X be a uniform random variable on [−1/2, 1/2], and let Y = X^2. Then we have

E[X] = ∫_{−1/2}^{1/2} x dx = 0, and E[XY] = E[X^3] = ∫_{−1/2}^{1/2} x^3 dx = 0,

so that cov(X, Y) = E[XY] − E[X]E[Y] = 0.

On the other hand, since X > 1/4 already implies Y = X^2 > 1/16, we have

P(X > 1/4, Y > 1/16) = P(X > 1/4) = 1/2 − 1/4 = 1/4,

while

P(X > 1/4) P(Y > 1/16) = (1/4) P(X^2 > 1/16) = (1/4) · 2P(X > 1/4) = (1/4)(1/2) = 1/8.

That is, P(X > 1/4, Y > 1/16) ≠ P(X > 1/4) P(Y > 1/16), so X and Y are not independent.

[For a discrete example, you could let X be uniform on {−1, 0, 1}, and Y = 1 if X = 0, and Y = 0 otherwise. Then E[XY] = 0 = E[X], so cov(X, Y) = 0, but P(X = 1, Y = 1) = 0 while P(X = 1) = P(Y = 1) = 1/3.]
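The counterexample in Problem 4 is easy to confirm by simulation (a sketch, not part of the original solution): the sample covariance is near zero, yet the joint probability of the two events is visibly different from the product of their probabilities.

```python
import random

random.seed(0)
n = 200_000
xs = [random.uniform(-0.5, 0.5) for _ in range(n)]  # X uniform on [-1/2, 1/2]
ys = [x * x for x in xs]                            # Y = X^2

def mean(v):
    return sum(v) / len(v)

# Sample covariance of X and Y: should be close to 0.
cov = mean([x * y for x, y in zip(xs, ys)]) - mean(xs) * mean(ys)

# Joint probability vs. product of marginal probabilities for the events used
# in the solution.
p_joint = sum(1 for x in xs if x > 0.25 and x * x > 1 / 16) / n
p_prod = (sum(1 for x in xs if x > 0.25) / n) * (sum(1 for y in ys if y > 1 / 16) / n)

print(round(cov, 3))      # near 0
print(round(p_joint, 2))  # near 1/4
print(round(p_prod, 2))   # near 1/8
```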

5. Suppose that X and Y are jointly continuous with joint density function

f(x, y) = x + y if 0 < x < 1, 0 < y < 1, and 0 otherwise.

a) Find the conditional density f_{X|Y}(x|y).

b) Compute P(X < 1/2 | Y = 1/3).

The conditional density f_{X|Y}(x|y) is given by f(x, y)/f_Y(y), so for values of y ∈ (0, 1) we have

f_Y(y) = ∫ f(x, y) dx = ∫_0^1 (x + y) dx = [x^2/2 + xy]_{x=0}^1 = 1/2 + y.

Thus

f_{X|Y}(x|y) = (x + y)/(1/2 + y) for x, y ∈ (0, 1),

and f_{X|Y}(x|y) = 0 otherwise.

Since P(X ∈ A | Y = b) = ∫_A f_{X|Y}(x|b) dx for every b and A, we have

P(X < 1/2 | Y = 1/3) = ∫_0^{1/2} f_{X|Y}(x|1/3) dx = ∫_0^{1/2} (x + 1/3)/(1/2 + 1/3) dx = (6/5) ∫_0^{1/2} (x + 1/3) dx

= (6/5) [x^2/2 + x/3]_0^{1/2} = (6/5)(1/8 + 1/6) = (6/5)(7/24) = 7/20.
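Since every step in Problem 5 is rational, the final conditional probability can be verified exactly (a check, not part of the original solution):

```python
from fractions import Fraction

# f_Y(1/3) = 1/2 + 1/3 = 5/6, so f_{X|Y}(x | 1/3) = (x + 1/3) / (5/6).
# Integrating over 0 < x < 1/2 gives (6/5) * (x^2/2 + x/3) at x = 1/2,
# i.e. (6/5) * (1/8 + 1/6).
prob = Fraction(6, 5) * (Fraction(1, 8) + Fraction(1, 6))
print(prob)  # 7/20
```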

6. You stand on a number line at position 0, facing the positive direction, and play a game as follows:

1) Roll a fair six-sided die.

2) If the result is even, you walk that many steps forward (so, for example, if the result is 2 you step forward 2 steps) and go back to step 1).

3) If the result is odd (but not 1), you walk backwards that many steps (so, for example, if the result is 3 you step backwards 3 steps), and go back to step 1).

4) If the result is 1, you stop the game.

Find the expected position when you finish.

Let X indicate the position on the number line at the end of the game, and let T indicate the result of the first roll. We compute E[X] by conditioning on T:

E[X] = E[E[X|T]] = Σ_{k=1}^6 E[X | T = k] P(T = k).

Because the die is fair, P(T = k) = 1/6 for each k = 1, ..., 6. Let E[X] = e. Each of the values E[X | T = k] may be rewritten: E[X | T = 1] = 0, because if T = 1 then we stop the game while standing at 0; E[X | T = 2] = 2 + e, because if T = 2 then we restart the game while standing at 2; similarly E[X | T = 3] = −3 + e, E[X | T = 4] = 4 + e, E[X | T = 5] = −5 + e, and E[X | T = 6] = 6 + e. Thus the above equation becomes

e = (1/6)(0 + (2 + e) + (−3 + e) + (4 + e) + (−5 + e) + (6 + e)) = (1/6)(5e + 4),

so that e = 4.
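The conditioning argument in Problem 6 can be corroborated by simulating the game directly (a sketch, not part of the original solution); the empirical average position should come out near 4.

```python
import random

random.seed(1)

def play():
    """Play one full game from Problem 6 and return the final position."""
    pos = 0
    while True:
        roll = random.randint(1, 6)
        if roll == 1:
            return pos                            # roll of 1 ends the game
        pos += roll if roll % 2 == 0 else -roll   # even: forward, odd: backward

n = 100_000
avg = sum(play() for _ in range(n)) / n
print(round(avg, 1))  # near 4
```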

BONUS) Let Y be a uniform random variable on [0, 1], and let X be a uniform random variable on the interval [1, e^Y]. (That is, for a given outcome ω ∈ S, X(ω) is a random number chosen uniformly from the interval [1, e^{Y(ω)}].) Find E[X]. (Hint: E[X] = E[E[X|Y]].)

We have E[X] = E[E[X|Y]], and the right-hand side may be computed by viewing E[X|Y] as a random variable that is formed by composing the function φ : y ↦ E[X | Y = y] with Y:

E[X] = E[E[X|Y]] = ∫ φ(y) f_Y(y) dy = ∫_0^1 E[X | Y = y] f_Y(y) dy = ∫_0^1 E[X | Y = y] dy.

Given that Y = y, X is uniform on [1, e^y], and the expected value of a uniform random variable on [a, b] is the average (a + b)/2. That is, E[X | Y = y] = (1 + e^y)/2. This implies that

E[X] = ∫_0^1 (1 + e^y)/2 dy = (1/2) [y + e^y]_0^1 = (1/2)((1 + e) − (0 + 1)) = e/2.
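The iterated-expectation answer e/2 ≈ 1.359 can be checked by sampling the two-stage experiment directly (a sketch, not part of the original solution):

```python
import math
import random

random.seed(2)
n = 400_000
total = 0.0
for _ in range(n):
    y = random.random()                        # draw Y uniform on [0, 1]
    total += random.uniform(1.0, math.exp(y))  # then X uniform on [1, e^Y]

estimate = total / n
print(round(estimate, 3), round(math.e / 2, 3))  # both near 1.359
```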