TEST #3 STA 5326 December 4, 2014

Name:

Please read the following directions. DO NOT TURN THE PAGE UNTIL INSTRUCTED TO DO SO.

Directions

This exam is closed book and closed notes. (You will have access to a copy of the Table of Common Distributions given in the back of the text.)

Partial credit is available. (If you know part of a solution, write it down. If you know an approach to a problem, but cannot carry it out, write down this approach. If you know a useful result, write it down.)

The different problems are not related. The different parts of a problem are sometimes unrelated. If you cannot solve part of a problem, you should still go on to look at the later parts.

Show and explain your work (including your calculations) for all problems unless you are explicitly told otherwise. No credit is given without work. But don't get carried away! Show enough work so that what you have done is clearly understandable. The grader should be able to see how you got from one step to the next. If you needed scratch paper to work something out, make sure to transfer your work to the exam.

If you need more space for a problem than what is given, use the back of the same page and write "Work on Back" to indicate you have done so.

On problems where work is shown, circle your answer. (On these problems, you should give only one answer!)

All the work on the exam should be your own. No cooperation is allowed.

Arithmetic does not have to be done completely unless requested otherwise. Answers can be left as fractions or products. You do not have to evaluate binomial coefficients, factorials, or large powers. Answers can be left as summations (unless there is a simple closed form, such as when summing a geometric or exponential series).

You need only pens, pencils, erasers and a calculator. (You will be supplied with scratch paper.)

Do not quote homework results. If you wish to use a result from homework in a solution, you must prove this result.

The exam has 11 pages. There are a total of 100 points.

Problem 1. (10%) Prove that $EY = E[E(Y \mid X)]$. (Assume that $(X, Y)$ has a joint density $f_{X,Y}(x, y)$ and that $EY$ exists.)

See notes 11, page 2. If students exactly reproduce the argument in the notes, give them full credit. But you should look carefully: in going from line 2 to line 3 of the proof, two things happen: $y$ is brought inside the inner integral (over $x$), and then the order of integration is changed.
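For the grader's reference, a sketch of the standard double-integral argument (the version in the notes may arrange or label the steps slightly differently):

\begin{align*}
EY &= \int y\, f_Y(y)\, dy
    = \int y \left[ \int f_{X,Y}(x,y)\, dx \right] dy
    = \int\!\!\int y\, f_{X,Y}(x,y)\, dx\, dy \\
   &= \int\!\!\int y\, f_{X,Y}(x,y)\, dy\, dx
    = \int \left[ \int y\, f_{Y \mid X}(y \mid x)\, dy \right] f_X(x)\, dx
    = \int E(Y \mid X = x)\, f_X(x)\, dx = E[E(Y \mid X)].
\end{align*}

The interchange of the order of integration is justified because $EY$ is assumed to exist.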

Problem 2. (10%) Suppose that
$$ f_X(x) = \begin{cases} 6x(1-x) & \text{for } 0 < x < 1 \\ 0 & \text{otherwise} \end{cases} $$
and that for $0 < x < 1$,
$$ f_{Y \mid X}(y \mid x) = \begin{cases} \dfrac{2y}{x^2} & \text{for } 0 < y < x \\ 0 & \text{otherwise.} \end{cases} $$
Find $\mathrm{Var}(Y \mid X = x)$ for $0 < x < 1$.

See notes9, page 19. The calculation is similar to (but simpler than) the calculation of $\mathrm{Var}(Y \mid X = x)$ on notes9, page 22 (which uses $E(Y \mid X = x)$ on notes9, page 17). Note that $f_X(x)$ is never used.
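For reference, a sketch of the intended calculation (this is not printed on the exam copy):

\begin{align*}
E(Y \mid X = x) &= \int_0^x y \,\frac{2y}{x^2}\, dy = \frac{2x}{3}, \qquad
E(Y^2 \mid X = x) = \int_0^x y^2 \,\frac{2y}{x^2}\, dy = \frac{x^2}{2}, \\
\mathrm{Var}(Y \mid X = x) &= \frac{x^2}{2} - \left(\frac{2x}{3}\right)^2 = \frac{x^2}{18}.
\end{align*}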

Problem 3. (12%) Prove that if the joint cumulative distribution function (cdf) satisfies
$$ F_{X,Y}(x, y) = F_X(x)\, F_Y(y), $$
then for any pair of intervals $(a, b)$ and $(c, d)$,
$$ P(a \le X \le b,\; c \le Y \le d) = P(a \le X \le b)\, P(c \le Y \le d). $$

This is exercise 4.9. The solution in the manual is not quite right; it handles the lower endpoints of the intervals incorrectly. But if students reproduce this solution, give them full credit.
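For reference, a sketch of one correct route (half-open rectangles first, then a limit for the lower endpoints, which is the step the manual glosses over):

\begin{align*}
P(a < X \le b,\; c < Y \le d)
 &= F_{X,Y}(b,d) - F_{X,Y}(a,d) - F_{X,Y}(b,c) + F_{X,Y}(a,c) \\
 &= [F_X(b) - F_X(a)]\,[F_Y(d) - F_Y(c)]
  = P(a < X \le b)\, P(c < Y \le d).
\end{align*}

Applying this with $a' < a$ and $c' < c$ and letting $a' \uparrow a$, $c' \uparrow c$, the events $\{a' < X \le b,\; c' < Y \le d\}$ decrease to $\{a \le X \le b,\; c \le Y \le d\}$, so continuity of probability gives the statement for the closed intervals.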

Problem 4. (12%) Let X = the number of trials to get the first head and Y = the number of trials to get the first tail in repeated tosses of a fair coin. Are X and Y independent? (Answer Yes or No and give a detailed justification of your answer.)

This is similar to exercise 4.11. The answer is No. The simplest justification is to note that $P(X = 1, Y = 1) = 0$ and $P(X = 1) = P(Y = 1) = 1/2$, so that $P(X = 1, Y = 1) \ne P(X = 1)P(Y = 1)$, which violates independence.

Another way to justify No is to note that the support of the distribution of (X, Y) is not a product set. The support consists of the possibilities marked with a plus sign below (rows are values of X, columns are values of Y):

        y=1  y=2  y=3  y=4  y=5  ...
  x=1         +    +    +    +   ...
  x=2    +
  x=3    +
  x=4    +
  x=5    +
  ...

This follows by noting that either X or Y must be 1 (but not both), and whichever is not 1 can take any larger value. For students who give this type of solution, it is not sufficient to just say the support is not a product set. To get full credit, students must actually specify the support or give a specific example demonstrating the support is not a product set (e.g. they could note that (1, 1) is not in the support of (X, Y), but that 1 is in the support of both X and Y, which violates the product set condition).
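Not part of the exam or the official solution: a quick Monte Carlo sanity check of the "No" answer, assuming X and Y are read off the same sequence of fair-coin tosses (the seed and sample size are arbitrary).

    import random

    random.seed(0)
    N = 100_000
    count_x1 = count_y1 = count_both = 0
    for _ in range(N):
        x = y = None          # x = index of first head, y = index of first tail
        t = 0
        while x is None or y is None:
            t += 1
            if random.random() < 0.5:   # heads
                if x is None: x = t
            else:                        # tails
                if y is None: y = t
        count_x1 += (x == 1)
        count_y1 += (y == 1)
        count_both += (x == 1 and y == 1)

    print(count_both / N)                   # P(X=1, Y=1): exactly 0
    print((count_x1 / N) * (count_y1 / N))  # P(X=1) P(Y=1): about 0.25

The gap between 0 and roughly 1/4 is exactly the violation of independence used in the simplest justification above.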

Problem 5. (12%) Let (X, Y) be a bivariate random vector with joint pdf f(x, y) which is positive on the entire plane $\mathbb{R}^2$. Let $U = a e^X + b$ and $V = c e^Y + d$ where a, b, c, and d are fixed constants with $a > 0$ and $c > 0$. Find the joint pdf of (U, V). (Make sure to specify the support of the joint pdf.)

This is similar to exercise 4.22. The support of (X, Y) is $A = \mathbb{R}^2$, the entire plane. Since the pair $(e^X, e^Y)$ can be any two positive values, the support of (U, V) is
$$ B = \{(u, v) : u > b,\ v > d\} = (b, \infty) \times (d, \infty). $$
The inverse transformation is
$$ X = \log\!\left(\frac{u - b}{a}\right) = \log(u - b) - \log(a), \qquad
   Y = \log\!\left(\frac{v - d}{c}\right) = \log(v - d) - \log(c). $$
The Jacobian of the inverse transformation is
$$ J = \begin{vmatrix} \dfrac{1}{u - b} & 0 \\[4pt] 0 & \dfrac{1}{v - d} \end{vmatrix}
     = \frac{1}{(u - b)(v - d)} > 0 \quad \text{for } (u, v) \in B. $$
Therefore
$$ f_{U,V}(u, v) = f\!\left( \log\!\left(\frac{u - b}{a}\right),\, \log\!\left(\frac{v - d}{c}\right) \right) \frac{1}{(u - b)(v - d)} \quad \text{for } u > b,\ v > d. $$
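Not part of the exam: a symbolic check of the inverse transformation and its Jacobian using sympy, with symbol names matching the problem (assumes sympy is available).

    import sympy as sp

    a, b, c, d, u, v = sp.symbols('a b c d u v', positive=True)
    x = sp.log((u - b) / a)          # inverse transformation for X
    y = sp.log((v - d) / c)          # inverse transformation for Y
    J = sp.Matrix([[sp.diff(x, u), sp.diff(x, v)],
                   [sp.diff(y, u), sp.diff(y, v)]]).det()
    print(sp.simplify(J))            # should print 1/((u - b)*(v - d))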

Problem 6. Suppose that X has density $f_X(x) = 2x$ for $0 < x < 1$, and $Y \mid X$ has a binomial distribution with n trials and success probability X.

This is a modification of exercise 4.31. The solution is essentially the same.

(a) (5%) Find EY.

$$ EY = E\,E(Y \mid X) = E(nX) = n\,E(X) = n \int_0^1 x \cdot 2x\, dx = \frac{2n}{3} $$

[ Problem 6 continued ]

(b) (7%) Find Var(Y).

\begin{align*}
\mathrm{Var}(Y) &= \mathrm{Var}\big(E(Y \mid X)\big) + E\,\mathrm{Var}(Y \mid X)
 = \mathrm{Var}(nX) + E\big(nX(1 - X)\big) \\
 &= n^2\,\mathrm{Var}(X) + n\,E\big(X(1 - X)\big)
 = n^2 \left\{ EX^2 - (EX)^2 \right\} + n \int_0^1 x(1 - x)\, 2x\, dx \\
 &= n^2 \left\{ \int_0^1 x^2 \cdot 2x\, dx - \left(\frac{2}{3}\right)^2 \right\} + 2n \int_0^1 x^2 (1 - x)\, dx \\
 &= n^2 \left\{ \frac{1}{2} - \frac{4}{9} \right\} + 2n \left( \frac{1}{3} - \frac{1}{4} \right)
 = \frac{n^2}{18} + \frac{n}{6}
\end{align*}

Var(X) may also be found by noting that $X \sim \mathrm{Beta}(2, 1)$ and using the formula for the variance of the Beta distribution given in the appendix. The calculation of $E(X(1 - X))$ may also be done using the Beta integral. If students use the Beta integral in their calculations, they might leave Gamma functions in their answer. That is OK.
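Not part of the exam: a Monte Carlo sanity check of the answers to (a) and (b), using the fact that the density $2x$ on $(0,1)$ is the Beta(2, 1) density (the choices n = 10, the seed, and the sample size are arbitrary).

    import numpy as np

    rng = np.random.default_rng(0)
    n, N = 10, 1_000_000
    X = rng.beta(2, 1, size=N)      # f_X(x) = 2x on (0, 1)
    Y = rng.binomial(n, X)          # Y | X ~ Binomial(n, X)

    print(Y.mean(), 2 * n / 3)              # both about 6.667
    print(Y.var(), n**2 / 18 + n / 6)       # both about 7.222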

[ Problem 6 continued ]

(c) (6%) Find the marginal mass function (pmf) of Y.

\begin{align*}
f_{X,Y}(x, y) &= f_X(x)\, f_{Y \mid X}(y \mid x)
 = 2x\, I_{(0,1)}(x) \binom{n}{y} x^y (1 - x)^{n - y}\, I_{\{0,1,\dots,n\}}(y) \\
 &= 2 \binom{n}{y} x^{y+1} (1 - x)^{n - y}\, I_{(0,1)}(x)\, I_{\{0,1,\dots,n\}}(y)
\end{align*}

\begin{align*}
f_Y(y) &= \int f_{X,Y}(x, y)\, dx
 = 2 \binom{n}{y} \int_0^1 x^{y+1} (1 - x)^{n - y}\, dx
 = 2 \binom{n}{y} B(y + 2,\, n - y + 1) \\
 &= 2 \binom{n}{y} \frac{\Gamma(y + 2)\,\Gamma(n - y + 1)}{\Gamma(n + 3)}
 = 2\, \frac{n!\,(y + 1)!\,(n - y)!}{y!\,(n - y)!\,(n + 2)!}
 = \frac{2(y + 1)}{(n + 2)(n + 1)} \quad \text{for } y = 0, 1, \dots, n.
\end{align*}

The support 0, 1, ..., n must be specified in some way to get full credit. Students may leave Gamma functions in their answer and still receive full credit. They don't have to simplify it down to the final form given above.
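Not part of the exam: a quick numerical check that the marginal pmf in (c) matches simulated frequencies under the same hierarchical model (n = 5, the seed, and the sample size are arbitrary).

    import numpy as np

    rng = np.random.default_rng(1)
    n, N = 5, 1_000_000
    X = rng.beta(2, 1, size=N)      # f_X(x) = 2x on (0, 1)
    Y = rng.binomial(n, X)

    emp = np.bincount(Y, minlength=n + 1) / N
    pmf = 2 * (np.arange(n + 1) + 1) / ((n + 2) * (n + 1))
    print(np.round(emp, 4))
    print(np.round(pmf, 4))         # pmf sums to 1 and should match the frequencies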

Problem 7. (12%) Let $X_i$, $i = 1, 2$, be independent Gamma($\alpha_i$, 1) random variables. Find the density (pdf) of $Z = X_1 / X_2$.

Students can use formula (2) on page 16 of notes1.pdf to answer this question, if they happen to remember it. (They don't have to re-derive it.) If they don't remember it, they can re-derive it following the argument on page 17 of notes1.pdf. This problem is similar to exercise 4.19(b), but requires a different bivariate transformation. The calculations are similar to those in 4.19(b).

For those who remember formula (2) on page 16 of notes1.pdf, you simply plug into the formula using $Z = Y/X$ with $Y = X_1$ and $X = X_2$, leading to:

\begin{align*}
f_Z(z) &= \int_0^\infty x\, f_{X,Y}(x, zx)\, dx
 = \int_0^\infty x\, f_{X_2, X_1}(x, zx)\, dx
 = \int_0^\infty x\, \frac{x^{\alpha_2 - 1} e^{-x}}{\Gamma(\alpha_2)}\, \frac{(zx)^{\alpha_1 - 1} e^{-zx}}{\Gamma(\alpha_1)}\, dx \\
 &= \frac{z^{\alpha_1 - 1}}{\Gamma(\alpha_1)\Gamma(\alpha_2)} \int_0^\infty x^{\alpha_1 + \alpha_2 - 1} e^{-(1 + z)x}\, dx
 \qquad \text{(Let } u = (1 + z)x.\text{)} \\
 &= \frac{\Gamma(\alpha_1 + \alpha_2)}{\Gamma(\alpha_1)\Gamma(\alpha_2)}\, \frac{z^{\alpha_1 - 1}}{(1 + z)^{\alpha_1 + \alpha_2}} \int_0^\infty \frac{u^{\alpha_1 + \alpha_2 - 1} e^{-u}}{\Gamma(\alpha_1 + \alpha_2)}\, du
 = \frac{\Gamma(\alpha_1 + \alpha_2)}{\Gamma(\alpha_1)\Gamma(\alpha_2)}\, \frac{z^{\alpha_1 - 1}}{(1 + z)^{\alpha_1 + \alpha_2}}, \qquad z > 0.
\end{align*}

We know in advance that the support must be $(0, \infty)$ since $Z = X_1/X_2$ can be any positive value.
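Not part of the exam: the density obtained above is (up to parametrization) the beta prime density, which scipy exposes as betaprime($\alpha_1$, $\alpha_2$), so a Kolmogorov-Smirnov comparison against simulated ratios is a convenient check (the values $\alpha_1 = 3$, $\alpha_2 = 4$, the seed, and the sample size are arbitrary).

    import numpy as np
    from scipy import stats

    rng = np.random.default_rng(0)
    a1, a2, N = 3.0, 4.0, 500_000
    Z = rng.gamma(a1, 1.0, N) / rng.gamma(a2, 1.0, N)   # X1/X2, independent Gamma(a_i, 1)

    # Large p-value indicates the sample is consistent with the derived density.
    print(stats.kstest(Z, stats.betaprime(a1, a2).cdf))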

Problem 8. Suppose (X, Y) have the joint density (pdf)
$$ f(x, y) = \frac{\sqrt{3}}{2\pi} \exp\left\{ -\left( x^2 - xy + y^2 \right) \right\}. $$
This is a bivariate normal density with $\mu_X = \mu_Y = 0$, $\sigma^2_X = \sigma^2_Y = 2/3$, and $\rho = 1/2$.

(a) (10%) Calculate $f_{Y \mid X}(y \mid x)$, the conditional density of Y given X = x.

This is a special case of exercise 4.45(b). We know that $f_{Y \mid X}(y \mid x)$ is a density function in y for any fixed value of x. Think of $f_{Y \mid X}(y \mid x)$ as a function of y with x being a fixed value. Any quantity not depending on y is a constant. We may ignore multiplicative constants in our work since, in the end, the conditional density must integrate to 1, and this condition will determine the appropriate normalizing constant.

\begin{align*}
f_{Y \mid X}(y \mid x) &= \frac{f(x, y)}{f_X(x)} \propto f(x, y)
 \propto \exp\left\{ -\left( x^2 - xy + y^2 \right) \right\}
 \propto \exp\left\{ -\left( y^2 - xy \right) \right\} \\
 &= \exp\left\{ -(y - x/2)^2 + x^2/4 \right\}
 \propto \exp\left\{ -(y - x/2)^2 \right\}
 \propto \frac{1}{\sqrt{2\pi(1/2)}} \exp\left\{ -\frac{(y - x/2)^2}{2(1/2)} \right\}
\end{align*}

The above is the correct answer to part (a). For part (b), we note that the final expression is the $N(\mu = x/2,\ \sigma^2 = 1/2)$ density.

The answer to part (a) can be written in many algebraically equivalent ways, all of which are correct:
$$ f_{Y \mid X}(y \mid x) = \frac{1}{\sqrt{2\pi(1/2)}} \exp\left\{ -\frac{(y - x/2)^2}{2(1/2)} \right\}
 = \frac{1}{\sqrt{\pi}} \exp\left\{ -(y - x/2)^2 \right\}
 = \frac{1}{\sqrt{\pi}} \exp\left\{ -y^2 + xy - x^2/4 \right\}. $$

(b) (4%) What is the conditional distribution of Y given X = x? Name the distribution whose density you found above, and give the values of any parameters. (No work is required.)

The answer is Normal(mean = x/2, variance = 1/2). Give two points for stating Normal, and one point each for the correct values of $\mu$ and $\sigma^2$. Students should get zero points for stating bivariate normal, which is very wrong.
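Not part of the exam: a numerical check that the stated joint density really yields the N(x/2, 1/2) conditional, dividing f(x, y) by a numerically computed marginal of X (the test point x = 0.7, the y values, and the use of scipy are arbitrary choices for illustration).

    import numpy as np
    from scipy.integrate import quad
    from scipy.stats import norm

    f = lambda x, y: np.sqrt(3) / (2 * np.pi) * np.exp(-(x**2 - x * y + y**2))

    x = 0.7
    fX, _ = quad(lambda y: f(x, y), -np.inf, np.inf)    # marginal density of X at x
    for y in (-1.0, 0.0, 0.5, 2.0):
        # conditional density vs. the claimed N(x/2, 1/2) pdf: the columns should agree
        print(f(x, y) / fX, norm.pdf(y, loc=x / 2, scale=np.sqrt(0.5)))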