University of Chicago Graduate School of Business. Business 41901: Probability Final Exam Solutions

Name:

University of Chicago Graduate School of Business
Business 41901: Probability Final Exam Solutions

Special Notes:

1. This is a closed-book exam. You may use an 8½ × 11 piece of paper for the formulas.
2. Throughout this exam, N(µ, σ²) will denote a normal distribution with mean µ and variance σ².
3. This is a ___-hour exam.

Honor Code: By signing my name below, I pledge my honor that I have not violated the Booth Honor Code during this examination.

Signature:

Problem A. True or False: please explain your answers in detail. Partial credit will be given. (60 points)

1. (Normal) Let X be normal with mean zero; then the distribution of X² is a Rayleigh distribution.

Solution: False. Taking unit variance, X² ~ χ²₁, with PDF

    f(x) = (1/√(2π)) x^(−1/2) e^(−x/2),  x > 0,

whereas the Rayleigh distribution has PDF

    f(x) = (x/σ²) e^(−x²/(2σ²)),  x > 0.

2. (Russian Roulette) I have a revolver (six chambers) loaded with one bullet. I pull the trigger once, click, and hand you the revolver. Do you want to spin it first, or shoot directly?

Solution: True, you should spin. With a spin, we have P(survival) = 5/6. Without a spin, P(survival) = 4/5. Hence it is better to spin.

3. (Children and Apples) You have three children, but only one apple. You want to toss a fair coin to decide who gets the apple, and you want to be fair. You flip a coin twice, assign HH, HT, and TT to the children, and ignore TH if you throw it. This is a fair allocation.

Solution: True. Let Y₁, Y₂ ~ Bern(1/2) be i.i.d. over {H, T}. Each element of the space Z = {HH, HT, TT, TH} has probability

    P(Z = z) = P(Y₁ = y₁, Y₂ = y₂) = P(Y₁ = y₁) P(Y₂ = y₂) = 1/4.

If we exclude TH, then p(z) is simply renormalized; thus each of the three remaining outcomes still has equal probability 1/3, and the allocation is fair.

4. (Raisins per Cookie) A baker put 500 raisins into his dough and made 100 cookies. You take a random cookie. The probability of finding exactly 2 raisins in it is about 8%.

Solution: True. The count is binomially distributed: imagine the raisins being thrown in one after the other, so each raisin lands in your cookie with probability p = 1/100, with n = 500 raisins in total. Then λ = np = 5 for the corresponding Poisson approximation, and P(X = 2) = e^(−5) 5²/2! ≈ 8%.
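The binomial and Poisson figures in question 4 are easy to confirm numerically. Below is a minimal Python sketch (the variable names are ours; both values come out near 0.084, i.e. about 8%):

    from math import comb, exp, factorial

    n, p, k = 500, 1/100, 2        # raisins, chance a given raisin lands in the chosen cookie, count asked about
    lam = n * p                    # Poisson rate, lambda = 5

    exact = comb(n, k) * p**k * (1 - p)**(n - k)   # exact Binomial(500, 1/100) probability
    approx = exp(-lam) * lam**k / factorial(k)     # Poisson(5) approximation

    print(f"Binomial: {exact:.4f}  Poisson: {approx:.4f}")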

5. (Bayes) If (y | θ) ~ N(θ, 1) and θ has a Jeffreys flat uniform prior p(θ) ∝ 1, then the posterior distribution is (θ | y) ~ N(y, 1).

Solution: True.

    p(θ | y) ∝ p(y | θ) p(θ) ∝ e^(−(y−θ)²/2) = e^(−(θ−y)²/2).

Thus, as a function of θ, we have θ | y ~ N(y, 1).

6. (Cauchy) The reciprocal of a Cauchy random variable has finite mean and variance.

Solution: False. 1/C has the same distribution as C: we can write C = X/Y with X, Y ~ N(0, 1) independent, so 1/C = Y/X is again Cauchy, and a Cauchy random variable has neither a finite mean nor a finite variance.

7. (Order Statistics) Let X₁, ..., Xₙ be a random sample from an exponential distribution with mean one. Then the probability density of any order statistic is also exponential.

Solution: False. The density of the k-th order statistic of n samples from a continuous distribution X is

    f_{X_(k)}(x) = n!/((k−1)!(n−k)!) · (F_X(x))^(k−1) (1 − F_X(x))^(n−k) f_X(x),

so when X ~ Exp(1) and x ≥ 0,

    f_{X_(k)}(x) = n!/((k−1)!(n−k)!) · (1 − e^(−x))^(k−1) (e^(−x))^(n−k) e^(−x)
                 = n!/((k−1)!(n−k)!) · (1 − e^(−x))^(k−1) e^(−(n−k+1)x),

which is not exponentially distributed (only the minimum, k = 1, is, with X_(1) ~ Exp(n)).
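Question 7 can also be seen by simulation: for an exponential random variable the mean equals the standard deviation, and this fails for order statistics beyond the minimum. A small Python sketch (n = 5 and k = 2 are illustrative choices of ours):

    import numpy as np

    rng = np.random.default_rng(0)
    n, k, reps = 5, 2, 200_000

    samples = np.sort(rng.exponential(scale=1.0, size=(reps, n)), axis=1)  # rows of sorted Exp(1) draws
    x1, xk = samples[:, 0], samples[:, k - 1]                              # minimum and k-th order statistic

    print("X_(1): mean %.3f  sd %.3f" % (x1.mean(), x1.std()))  # both near 1/n = 0.2, consistent with Exp(n)
    print("X_(2): mean %.3f  sd %.3f" % (xk.mean(), xk.std()))  # mean about 0.45 but sd about 0.32: not exponential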

8. (Transformations) Let Y = e^X have a Pareto distribution. Then X has a log-normal distribution.

Solution: False. The Pareto density is

    f_Y(y; α, β) = αβ^α / y^(α+1) for y ≥ β, and 0 otherwise.

The transformation x = ln y has inverse y = g^(−1)(x) = e^x with derivative dy/dx = e^x, so

    f_X(x; α, β) = f_Y(e^x) e^x = αβ^α e^(−αx) for x ≥ ln(β), and 0 otherwise,

which is proportional to an exponential density on its support. The log-normal density is

    f(x) = 1/(x√(2π)σ) · e^(−(ln x − µ)²/(2σ²)) for x > 0, σ > 0.

The two distributions do not share the same form.

9. (Brownian Motion) Let B_t be a standard Brownian motion. Then we can calculate the mean E(√|B_t − B_s|) = (t − s)^(1/4) for t > s.

Solution: False. Z = B_t − B_s ~ N(0, t − s), and Y = √|Z| has density

    f_Y(y; t − s) = 2y √(2/(π(t − s))) e^(−y⁴/(2(t − s))) for y > 0, and 0 otherwise,

with

    E[Y] = (2^(1/4)/√π) Γ(3/4) (t − s)^(1/4) ≈ 0.82 (t − s)^(1/4) ≠ (t − s)^(1/4).

10. (Moments) The function 1/(1 − t)² cannot be a moment generating function, M_X(t), of any random variable X.

Solution: False. It is the moment generating function of a Gamma distribution. For X ~ Gamma(α, β) and t < β,

    M_X(t) = E[e^(tX)] = (β^α/Γ(α)) ∫₀^∞ e^(tx) x^(α−1) e^(−βx) dx = (β^α/Γ(α)) ∫₀^∞ x^(α−1) e^(−(β−t)x) dx,

where the integrand is another Gamma kernel with β replaced by β − t, so

    M_X(t) = (β^α/Γ(α)) · Γ(α)/(β − t)^α = β^α/(β − t)^α.

Now let α = 2 and β = 1, which gives M_X(t) = 1/(1 − t)².
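As an illustrative check of question 10, a Monte Carlo estimate of E[e^(tX)] for X ~ Gamma(α = 2, β = 1) can be compared against 1/(1 − t)². A Python sketch (the t values are ours, kept below 1/2 so the estimator has finite variance):

    import numpy as np

    rng = np.random.default_rng(1)
    alpha, beta = 2.0, 1.0                           # shape alpha, rate beta (numpy's gamma takes scale = 1/beta)
    x = rng.gamma(shape=alpha, scale=1/beta, size=1_000_000)

    for t in (0.1, 0.25, 0.4):
        mc = np.exp(t * x).mean()                    # Monte Carlo estimate of the MGF at t
        closed_form = (beta / (beta - t)) ** alpha   # beta^alpha / (beta - t)^alpha = 1/(1 - t)^2
        print(f"t = {t}: simulated {mc:.3f} vs closed form {closed_form:.3f}")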

11. (Martingales) A submartingale has the property that E(X_t) > X_0 for any t.

Solution: True. By definition E[X_t | X_s] > X_s, hence by iterated expectation

    E[X_t] = E[ E[X_t | X_s] ] > E[X_s] for all s < t,

including s = 0, which proves the statement.

12. (Odds) Consider betting x against Secretariat and y against Zenyatta. Crazy Eddie takes in x + y in bets and, given his beliefs, offers the payouts in the following table:

    Horse        | Eddie's belief that the horse loses | Fair odds | If the horse loses, Eddie pays
    Secretariat  | 1/3                                 | 2 : 1     | x + 2x = 3x
    Zenyatta     | 1/2                                 | 1 : 1     | y + y = 2y

    Table 1: Eddie's payouts

Then Eddie will always lose money if we bet x = 2 and y = 3 against him.

Solution: True. Eddie takes in x + y = 5. If Secretariat loses he pays 3x = 6; if Zenyatta loses he pays 2y = 6. At least one of the two horses must lose, so in every outcome Eddie pays out at least 6 > 5 and loses money. (His implied probabilities for the two "loses" events, 1/3 + 1/2 = 5/6, sum to less than one even though at least one of those events must occur, which is what makes the Dutch book possible.)
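Question 12 can be checked mechanically by enumerating the outcomes under the payout table above (a small Python sketch; the outcome labels are ours):

    x, y = 2, 3                       # bets against Secretariat and against Zenyatta
    taken_in = x + y                  # Eddie collects 5

    # Payouts from the table: 2:1 odds against Secretariat, even odds against Zenyatta.
    outcomes = {
        "Secretariat loses only": 3 * x,
        "Zenyatta loses only":    2 * y,
        "both lose":              3 * x + 2 * y,
    }
    for outcome, paid in outcomes.items():
        print(f"{outcome}: Eddie nets {taken_in - paid}")   # negative in every case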

Problem B. (20 points) Let X and Y be independent with a joint distribution given by

    f_{X,Y}(x, y) = 1/(2π√(xy)) · exp(−(x + y)/2), x, y > 0.

Identify the following distributions.

The marginal distribution of X.

Solution:

    f_X(x) = ∫₀^∞ 1/(2π√(xy)) e^(−(x+y)/2) dy = 1/√(2πx) · e^(−x/2), x > 0,

which is a χ²₁ density (and, by symmetry, Y ~ χ²₁ as well).

Compute the joint distribution of U = X and V = X + Y.

Solution: We perform the substitutions x = u and y = v − u, which have Jacobian

    |∂(x, y)/∂(u, v)| = |det [[1, 0], [−1, 1]]| = 1.

Thus the transformed distribution is

    f_{U,V}(u, v) = 1/(2π√(u(v − u))) · e^(−v/2), 0 < u < v.

Compute the marginal distribution of V.

Solution:

    f_V(v) = ∫₀^v 1/(2π√(u(v − u))) e^(−v/2) du = (1/2) e^(−v/2), v > 0,

using ∫₀^v du/√(u(v − u)) = π; that is, V = X + Y ~ χ²₂, an exponential distribution with mean 2.
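The identifications above (χ²₁ marginals and a χ²₂ sum) can be sanity-checked by simulating squares of independent standard normals, which have exactly the joint density in Problem B. A minimal Python sketch comparing moments:

    import numpy as np

    rng = np.random.default_rng(2)
    z1, z2 = rng.standard_normal(1_000_000), rng.standard_normal(1_000_000)
    x, y = z1**2, z2**2              # independent chi-squared(1) draws
    v = x + y                        # V = X + Y

    print("X: mean %.3f  var %.3f" % (x.mean(), x.var()))  # chi-squared(1) has mean 1, variance 2
    print("V: mean %.3f  var %.3f" % (v.mean(), v.var()))  # chi-squared(2), i.e. Exp with mean 2: mean 2, variance 4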

Problem C. (20 points)

(Hit and Run Taxi) A certain town has two taxi companies: Blue Birds, whose cabs are blue, and Night Owls, whose cabs are black. Blue Birds has 15 taxis in its fleet, and Night Owls has 75. Late one night, there is a hit-and-run accident involving a taxi. The town's taxis were all on the streets at the time of the accident. A witness saw the accident and claims that a blue taxi was involved. At the request of the police, the witness undergoes a vision test under conditions similar to those on the night in question. Presented repeatedly with a blue taxi and a black taxi, in random order, he shows he can successfully identify the colour of the taxi 4 times out of 5. Which company is more likely to have been involved in the accident?

Solution: Let W = "the witness says blue" and B = "the cab was a Blue Birds cab". By Bayes' rule,

    P(B | W) = P(W | B) P(B) / P(W).

Weighting the fleet sizes by the witness's accuracy,

    P(B | W) = (0.8 × 15) / (0.8 × 15 + 0.2 × 75) = 12 / (12 + 15) = 4/9 ≈ 44%,

so despite the witness's testimony, it is still slightly more likely that a black Night Owls cab was involved.

(Children) I tell you that I have two children, and that one of them is a girl (I say nothing about the other). What is the probability that I have two girls?

Solution: Let X = "two girls", Y = "told that one of them is a girl", and Z = "a girl opens the door" (used in the next part). We are looking for

    P(X | Y) = (1/4) / (3/4) = 1/3.
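A quick simulation of the taxi problem, under the 15 blue / 75 black fleet and the 80%-reliable witness stated above (a Python sketch; the seed and sample size are arbitrary):

    import numpy as np

    rng = np.random.default_rng(3)
    reps = 1_000_000

    cab_is_blue = rng.random(reps) < 15 / 90          # 15 of the town's 90 cabs are blue
    correct_id = rng.random(reps) < 0.80              # witness is right 4 times out of 5
    says_blue = np.where(correct_id, cab_is_blue, ~cab_is_blue)

    posterior = cab_is_blue[says_blue].mean()         # P(blue cab | witness says blue)
    print(round(posterior, 3))                        # close to 4/9 = 0.444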

You come to visit, and a girl (who is my daughter) opens the door. What is the probability that I have two girls?

Solution: Now we are looking for

    P(X | Y, Z) = (1/4) / (1/4 + 1/4) = 1/2,

since P(a girl opens the door and both children are girls) = 1/4, while P(a girl opens the door and exactly one child is a girl) = (1/2)(1/2) = 1/4.
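The contrast between the two answers (1/3 versus 1/2) is easy to reproduce by simulation. A short Python sketch of both conditioning events (variable names are ours):

    import numpy as np

    rng = np.random.default_rng(4)
    reps = 1_000_000

    kids = rng.integers(0, 2, size=(reps, 2))        # each child: 1 = girl, 0 = boy
    two_girls = kids.sum(axis=1) == 2

    at_least_one_girl = kids.max(axis=1) == 1        # "one of them is a girl"
    opener = kids[np.arange(reps), rng.integers(0, 2, size=reps)]  # a random child answers the door
    girl_opens = opener == 1

    print(two_girls[at_least_one_girl].mean())       # about 1/3
    print(two_girls[girl_opens].mean())              # about 1/2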