Math 510 midterm 3 answers

Problem 1 (10 pts) Suppose X and Y are independent exponential random variables, both with parameter λ = 1. Find the probability that Y < 7X.

P(Y < 7X) = \int_0^\infty \int_0^{7x} f(x, y) dy dx
          = \int_0^\infty \int_0^{7x} e^{-x} e^{-y} dy dx
          = \int_0^\infty e^{-x} [-e^{-y}]_0^{7x} dx
          = \int_0^\infty (-e^{-x} e^{-7x} + e^{-x}) dx
          = \int_0^\infty (-e^{-8x} + e^{-x}) dx
          = [ (1/8) e^{-8x} - e^{-x} ]_0^\infty
          = -1/8 + 1 = 7/8

Problem 2 (10 pts) Suppose X and Y have the mass distributions below. Find the joint mass distribution function for X and Y so that X and Y are independent.

X   Prob.        Y   Prob.
1   .4           1   .1
2   .6           2   .5
                 3   .4

Since X and Y are independent, P(X = i, Y = j) = P(X = i) P(Y = j), and therefore we can make the following table for the joint mass distribution function:

        X = 1   X = 2
Y = 1   .04     .06
Y = 2   .2      .3
Y = 3   .16     .24
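As a quick sanity check on the answer 7/8 in Problem 1 (an illustration of mine, not part of the original exam), here is a small Monte Carlo sketch in Python, assuming NumPy is available and using an arbitrary seed and sample size:

    import numpy as np

    # Independent Exp(1) samples for X and Y; estimate P(Y < 7X).
    rng = np.random.default_rng(0)
    n = 1_000_000
    x = rng.exponential(scale=1.0, size=n)
    y = rng.exponential(scale=1.0, size=n)
    print((y < 7 * x).mean())  # should land close to 7/8 = 0.875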

Problem 3 (10 pts) Suppose X and Y are independent random variables, and suppose X is binomial with n = 10 and p = .4, while Y is binomial with n = 12 and p = .2. Find the expected value and variance of 2X + 3Y. (Recall that a binomial random variable has expected value np and variance npq.)

Expected value:

E[2X + 3Y] = 2E[X] + 3E[Y] = 2(10)(.4) + 3(12)(.2) = 8 + 7.2 = 15.2

Variance:

Var(2X + 3Y) = 4 Var(X) + 9 Var(Y) = 4(10)(.4)(.6) + 9(12)(.2)(.8) = 9.6 + 17.28 = 26.88

Problem 4 (10 pts) True or false:

TRUE   The correlation ρ(X, Y) is always a number in the range [-1, 1].
FALSE  For any random variables X and Y, we have Var(X + Y) = Var(X) + Var(Y).
FALSE  If E[XY] = E[X] E[Y], then X and Y are independent.
TRUE   If X and Y are independent, then M_{X+Y}(t) = M_X(t) M_Y(t).
TRUE   If Var(X + Y) = Var(X) + Var(Y), then ρ(X, Y) = 0.
FALSE  If E[XY | Z] = E[X | Z] E[Y | Z], then E[XY] = E[X] E[Y].
TRUE   If the moment generating function for a random variable X is e^{t^2}, then X must be normal.
FALSE  Let X and Y be continuous random variables. The density function for X and the density function for Y determine uniquely the joint density function for (X, Y).
FALSE  If X and Y are independent, then SD(X + Y) = SD(X) + SD(Y).
FALSE  If E[X + Y] = E[X] + E[Y], then X and Y are independent.
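The numbers in Problem 3 are easy to verify by simulation; the following sketch (mine, not from the exam, with an arbitrary seed and sample size) draws the two binomials and compares the sample mean and variance of 2X + 3Y with 15.2 and 26.88:

    import numpy as np

    # X ~ Binomial(10, 0.4) and Y ~ Binomial(12, 0.2), independent.
    rng = np.random.default_rng(1)
    n = 1_000_000
    w = 2 * rng.binomial(10, 0.4, size=n) + 3 * rng.binomial(12, 0.2, size=n)
    print(w.mean(), w.var())  # approximately 15.2 and 26.88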

Problem 5 (5 pts) Find the moment generating function M(t) for the random variable given in the table below:

X   Prob.
0   .3
1   .5
2   .2

M(t) = E[e^{tX}] = .3 + .5 e^t + .2 e^{2t}

Problem 6 (5 pts) Find an example of two random variables X and Y so that ρ(X, Y) = 1.

Let X be a non-constant random variable (say, a standard normal random variable) and let Y = X.

Problem 7 (5 pts) Suppose a random variable X has the moment generating function given by the series below. Find the third moment of X.

M_X(t) = 1 + 5t + 9t^2 + 8t^3 + 3t^4 + ...

Since

M_X(t) = 1 + E[X] t + (E[X^2]/2!) t^2 + (E[X^3]/3!) t^3 + ...

we can match terms to find E[X^3]/3! = 8, and therefore E[X^3] = 3! · 8 = 48.

Problem 8 (10 pts) Suppose X and Y are normal random variables, both with mean 4 and standard deviation 10. Suppose ρ(X, Y) = 1/5. Find the variance of X + Y.

Since ρ(X, Y) = Cov(X, Y) / (SD(X) SD(Y)), we have 1/5 = Cov(X, Y) / (10 · 10), and therefore Cov(X, Y) = (1/5) · 10 · 10 = 20. Now

Var(X + Y) = Cov(X + Y, X + Y) = Cov(X, X) + 2 Cov(X, Y) + Cov(Y, Y) = 10^2 + 2 · 20 + 10^2 = 240
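A similar check works for Problem 8 (again, an illustration of mine): with standard deviation 10 and correlation 1/5, the covariance matrix of (X, Y) has 100 on the diagonal and 20 off the diagonal, so the sample variance of X + Y should land near 240; the mean plays no role in the variance.

    import numpy as np

    # Jointly normal X, Y with SD 10 and correlation 1/5.
    rng = np.random.default_rng(2)
    cov = [[100.0, 20.0], [20.0, 100.0]]
    xy = rng.multivariate_normal(mean=[4.0, 4.0], cov=cov, size=1_000_000)
    print(xy.sum(axis=1).var())  # approximately 240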

Problem 9 (10 pts) Two people are playing a game. A fair die is rolled. If it turns up even, player A pays player B a number of dollars equal to the amount shown on the die. If it turns up odd, player B pays $3 to player A. The procedure is repeated until the die comes up 1, 2, 4, or 5, at which point the game ends. Find the expected winnings of player A.

Example: The die is rolled and comes up 6. Then player A pays $6 to player B. The die is rolled again and comes up 1. Then player B pays $3 to player A, and the game is over. Player A's winnings here are then -6 + 3 = -3 dollars.

Let X be player A's winnings at the end of the game, and let Y be the first roll of the die. Then E[X] = E[E[X | Y]], and we consider the following table:

Y   prob.   E[X | Y = y]
1   1/6     3
2   1/6     -2
3   1/6     3 + E[X]
4   1/6     -4
5   1/6     3
6   1/6     -6 + E[X]

Therefore

E[X] = E[E[X | Y]]
     = (1/6)(3) + (1/6)(-2) + (1/6)(3 + E[X]) + (1/6)(-4) + (1/6)(3) + (1/6)(-6 + E[X])
     = (2/6) E[X] - 3/6
     = (1/3) E[X] - 1/2

so (2/3) E[X] = -1/2 and therefore E[X] = -3/4.

Problem 10 (5 pts) 5 people go to a fancy dinner, where there are 5 seats. The guests seat themselves randomly, and only later do they realize there are name cards, indicating that the seating was actually assigned beforehand. What is the expected number of people who are sitting at their correct seat?

Let X_i = 1 if the i-th guest sits at the correct seat, and X_i = 0 otherwise. Then the number of people who sit at the correct seat is

N = X_1 + ... + X_5

and its expected value is

E[N] = E[X_1] + ... + E[X_5].

Since E[X_i] is the probability that the i-th guest sits at the correct seat, and this is 1/5, we have E[N] = 5 · (1/5) = 1.
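The conditional-expectation argument in Problem 9 is easy to second-guess, so here is a direct simulation of the game (an illustrative sketch of mine, not part of the solutions); it also tallies the fixed-point count from Problem 10 for a random seating of 5 guests:

    import random

    random.seed(3)

    def play_once():
        # One full game: an odd roll pays A $3, an even roll costs A the face
        # value; the game stops as soon as the die shows 1, 2, 4, or 5.
        winnings = 0
        while True:
            roll = random.randint(1, 6)
            winnings += 3 if roll % 2 == 1 else -roll
            if roll in (1, 2, 4, 5):
                return winnings

    n = 200_000
    print(sum(play_once() for _ in range(n)) / n)  # approximately -3/4

    def fixed_points():
        seats = random.sample(range(5), 5)  # a uniformly random seating
        return sum(1 for guest, seat in enumerate(seats) if guest == seat)

    print(sum(fixed_points() for _ in range(n)) / n)  # approximately 1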

Problem 11 (10 pts) Prove that Cov(X + Y, Z) = Cov(X, Z) + Cov(Y, Z).

Cov(X + Y, Z) = E[(X + Y - E[X + Y])(Z - E[Z])]
             = E[(X + Y - E[X] - E[Y])(Z - E[Z])]
             = E[((X - E[X]) + (Y - E[Y]))(Z - E[Z])]
             = E[(X - E[X])(Z - E[Z]) + (Y - E[Y])(Z - E[Z])]
             = E[(X - E[X])(Z - E[Z])] + E[(Y - E[Y])(Z - E[Z])]
             = Cov(X, Z) + Cov(Y, Z)

Problem 12 (5 pts) Let X be exponential with parameter 1 and Y be exponential with parameter 2. Suppose X and Y are independent. Find the density function for X + Y.

f_{X+Y}(t) = \int_{-\infty}^{\infty} f_X(s) f_Y(t - s) ds,

where f_X(s) f_Y(t - s) = e^{-s} · 2 e^{-2(t - s)} if s > 0 and t - s > 0, and 0 otherwise. So we are really integrating over the values of s where s > 0 and s < t. This only happens when t > 0, and then s ranges over the interval [0, t]. Thus f_{X+Y}(t) = 0 for t ≤ 0, and for t > 0,

f_{X+Y}(t) = \int_0^t e^{-s} · 2 e^{-2(t - s)} ds = 2 e^{-2t} \int_0^t e^{s} ds = 2 e^{-2t} (e^t - 1) = 2 e^{-t} - 2 e^{-2t}.
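To double-check the density found in Problem 12, the sketch below (mine, with an arbitrary seed and sample size) compares the empirical CDF of X + Y with the integral of f_{X+Y}(t) = 2 e^{-t} - 2 e^{-2t}, namely F(c) = 1 - 2 e^{-c} + e^{-2c} for c > 0:

    import numpy as np

    # X ~ Exp(rate 1), Y ~ Exp(rate 2); NumPy's exponential takes the scale,
    # i.e. 1/rate, so Y uses scale = 0.5.
    rng = np.random.default_rng(4)
    n = 1_000_000
    s = rng.exponential(1.0, n) + rng.exponential(0.5, n)
    for c in (0.5, 1.0, 2.0):
        print((s <= c).mean(), 1 - 2 * np.exp(-c) + np.exp(-2 * c))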

Problem 13 (5 pts) Prove that ρ(X, Y) ≤ 1.

Since a variance is always nonnegative,

0 ≤ Var(X/σ_X - Y/σ_Y)
  = Var(X/σ_X) - 2 Cov(X/σ_X, Y/σ_Y) + Var(Y/σ_Y)
  = (1/σ_X^2) Var(X) - (2/(σ_X σ_Y)) Cov(X, Y) + (1/σ_Y^2) Var(Y)
  = 1 - 2 ρ(X, Y) + 1
  = 2 (1 - ρ(X, Y)).

Therefore 0 ≤ 1 - ρ(X, Y), and so ρ(X, Y) ≤ 1.
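The identity at the heart of Problem 13 can also be seen numerically: for any sample, the variance of X/SD(X) - Y/SD(Y) equals 2(1 - ρ) up to rounding, which is exactly what forces ρ ≤ 1. A small sketch of mine with an arbitrary correlated pair:

    import numpy as np

    rng = np.random.default_rng(5)
    x = rng.normal(size=1_000_000)
    y = 0.6 * x + rng.normal(size=1_000_000)  # an arbitrary correlated pair
    rho = np.corrcoef(x, y)[0, 1]
    print(np.var(x / x.std() - y / y.std()), 2 * (1 - rho))  # these agree closely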