ISyE 3044 Fall 2017 Test #1a Solutions

NAME ____________________

This test is 75 minutes. You're allowed one cheat sheet. Good luck!

1. Suppose X has p.d.f. f(x) = 4x^3, 0 < x < 1. Find E[2/X^2 - 3].

Solution: By LOTUS, we have

E[1/X^2] = ∫_0^1 (1/x^2) · 4x^3 dx = ∫_0^1 4x dx = 2.

This implies that

E[2/X^2 - 3] = 2 E[1/X^2] - 3 = 1.

2. Suppose that X has p.d.f. f(x) = 2x, 0 ≤ x ≤ 1. What's the p.d.f. of the random variable e^X?

Solution: Let Y = e^X. The c.d.f. of Y is

G(y) = P(Y ≤ y) = P(e^X ≤ y) = P(X ≤ ln(y)) = ∫_0^{ln(y)} 2x dx = [ln(y)]^2.

This implies that the p.d.f. of Y is

g(y) = (d/dy) G(y) = 2 ln(y)/y, 1 ≤ y ≤ e.

3. Toss two dice and observe their sum. What is the expected number of tosses until you observe a sum of 7?

Solution: X ~ Geom(1/6), so E[X] = 6.
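As a quick sanity check on Problem 3 (not part of the exam), a short simulation sketch, assuming NumPy is available, gives an average close to 6:

    import numpy as np

    # Problem 3 check: average number of two-dice tosses until the sum is 7.
    rng = np.random.default_rng(0)
    counts = []
    for _ in range(100_000):
        n = 1
        while rng.integers(1, 7) + rng.integers(1, 7) != 7:   # integers(1, 7) gives 1..6
            n += 1
        counts.append(n)
    print(np.mean(counts))   # close to 6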

4. Consider a Poisson process with rate λ = 2. What is the distribution of the time between the 5th and 7th arrivals?

Solution: The times between Poisson arrivals are i.i.d. Exp(2). Thus, the sum of the 2 interarrival times between the 5th and 7th arrivals is Erlang_2(2).

5. YES or NO? If X and Y are independent normal random variables, are they necessarily uncorrelated?

Solution: Yes.

6. Fill in the blank. A Poisson process has i.i.d. ________ interarrivals.

Solution: exponential.

7. Suppose that X and Y have joint p.d.f. f(x, y) = 8xy for 0 < x < y < 1. Find E[X].

Solution: We have

f_X(x) = ∫_R f(x, y) dy = ∫_x^1 8xy dy = 4(x - x^3), 0 < x < 1,

so that

E[X] = ∫_R x f_X(x) dx = ∫_0^1 4(x^2 - x^4) dx = 8/15.

8. Let X and Y be i.i.d. Exponential with rate 2. Find P(X + Y ≥ 1).

Solution: X + Y ~ Erlang_2(2). So

P(X + Y ≥ 1) = Σ_{i=0}^{k-1} e^{-λx} (λx)^i / i! = Σ_{i=0}^{1} e^{-2(1)} (2(1))^i / i! = 3e^{-2}.
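A small simulation sketch for Problem 8 (assuming NumPy) agrees with the closed-form value 3e^{-2} ≈ 0.406:

    import numpy as np

    # X, Y i.i.d. Exponential(rate 2); NumPy's exponential() takes scale = 1/rate.
    rng = np.random.default_rng(1)
    x = rng.exponential(scale=0.5, size=1_000_000)
    y = rng.exponential(scale=0.5, size=1_000_000)
    print(np.mean(x + y >= 1.0))   # simulation estimate of P(X + Y >= 1)
    print(3 * np.exp(-2))          # exact value, about 0.406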

9. If X and Y are i.i.d. Nor(0, 1) random variables, find P(Y + 1 < 2X).

Solution: Note that E[Y - 2X] = 0 and Var(Y - 2X) = Var(Y) + 4 Var(X) = 5, so that Y - 2X ~ Nor(0, 5). Thus,

P(Y + 1 < 2X) = P(Y - 2X < -1) = P(Nor(0, 5) < -1) = P(Nor(0, 1) < -1/√5) = Φ(-1/√5) = Φ(-0.447) = 0.327.

10. If X and Y are both Nor(4, 10) with Cov(X, Y) = 6, find Var(X - Y).

Solution: Var(X - Y) = Var(X) + Var(Y) - 2 Cov(X, Y) = 8.

11. BONUS: What does Dave really enjoy?

(a) An intellectual discussion with a typical UGA student.
(b) A lovely cup of piping hot coffee to start the day.
(c) Attending a Justin Bieber concert.
(d) Attending a Justin Bieber concert at UGA.

Solution: (b).

12. Suppose that the Atlanta Hawks play i.i.d. games, each of which has win probability 0.7. Let X be the number of games until the Hawks achieve their first win. Find the smallest x such that P(X ≤ x) ≥ 0.95.

Solution: The number of games until the first win is X ~ Geom(0.7), so that P(X = x) = q^{x-1} p = (0.3)^{x-1}(0.7). It can also be shown that the c.d.f. is F(x) = P(X ≤ x) = 1 - q^x = 1 - (0.3)^x (though you don't need to know this if you do a trial-and-error argument to find the smallest x). Now, F(x) = 1 - (0.3)^x ≥ 0.95 iff 0.05 ≥ (0.3)^x, which is achieved by x = 3.
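The trial-and-error search in Problem 12 takes only a few lines of Python (a sketch, not required on the exam):

    # Geom(p = 0.7): F(x) = 1 - 0.3**x.  Find the smallest x with F(x) >= 0.95.
    q = 0.3
    x = 1
    while 1 - q**x < 0.95:
        x += 1
    print(x)   # 3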

13. Suppose that X_1, ..., X_100 are i.i.d. with values -1 and 1, each with probability 0.5. (This is a simple random walk.) Find the approximate probability that the sum Σ_{i=1}^{100} X_i will be at least -10.

Solution: Note that E[X_i] = 0 and Var(X_i) = E[X_i^2] - (E[X_i])^2 = 1. Then the Central Limit Theorem implies that Σ_{i=1}^{100} X_i ≈ Nor(0, 100), and so

P(Σ_{i=1}^{100} X_i ≥ -10) ≈ P(Z ≥ -10/10) = P(Z > -1) = 0.8413.

14. Consider the linear congruential pseudo-random number generator X_{i+1} = (3 X_i + 1) mod 8. Using X_0 = 1, calculate the fifth pseudo-random number U_5.

Solution: We have X_1 = 4, X_2 = 5, X_3 = 0, X_4 = 1, and so X_5 = 4. Thus, U_5 = 4/8 = 0.5.

15. If U ~ Unif(0, 1), what's the distribution of -3 ln(U^2)?

Solution: Note that -3 ln(U^2) = -6 ln(U) ~ Exp(1/6).

16. If U_1, U_2, and U_3 are i.i.d. Unif(0, 1), what's the distribution of -3 ln(U_1 U_2 (1 - U_3))?

Solution: Note that -3 ln(U_1 U_2 (1 - U_3)) ~ -3 ln(U_1 U_2 U_3) ~ Erlang_3(1/3) (or gamma).

17. If U_1 and U_2 are i.i.d. Unif(0, 1), what's the distribution of 2 √(-2 ln(U_1)) cos(2π U_2)?

Solution: By a homework problem (Box–Muller), this is Nor(0, 4).
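The generator recursion from Problem 14 can be traced with a short sketch (not part of the exam):

    # Linear congruential generator X_{i+1} = (3*X_i + 1) mod 8, seeded with X_0 = 1.
    x = 1
    seq = []
    for _ in range(5):
        x = (3 * x + 1) % 8
        seq.append(x)
    print(seq)           # [4, 5, 0, 1, 4]
    print(seq[-1] / 8)   # U_5 = 0.5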

18. Suppose that you want to estimate the integral I = ∫_0^1 [1 - cos(πx)] dx. The following numbers are a Unif(0,1) sample:

0.419   0.109   0.732   0.893

Use the Monte Carlo method from class to approximate the integral via the estimator Ī_4.

Solution:

Ī_4 = ((b - a)/n) Σ_{i=1}^n g(a + (b - a)U_i) = (1/4) Σ_{i=1}^4 g(U_i) = (1/4) Σ_{i=1}^4 [1 - cos(π U_i)] = 1.104.

19. Consider the differential equation f'(x) = (x - 3) f(x) with f(0) = 1. Use Euler's method with increment h = 0.01 to find the approximate value of f(0.02).

Solution: You can actually get the true answer using separation of variables, and it turns out to be f(x) = exp{x^2/2 - 3x}. But our job is to use Euler to come up with an iterative approximation, so here goes. As usual, we start with

f(x + h) ≈ f(x) + h f'(x) = f(x) + h(x - 3) f(x) = f(x)[1 + h(x - 3)],

from which we obtain the following table.

x       Euler approx    true f(x)
0.00    1.000           1.000
0.01    0.970           0.9705
0.02    0.941           0.942

Thus, the desired Euler approximation for f(0.02) is 0.941.
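A few lines of Python reproduce the Euler iteration and the true solution from Problem 19 (a sketch, assuming only the standard library):

    import math

    # Euler's method for f'(x) = (x - 3) f(x), f(0) = 1, step h = 0.01.
    h, x, f = 0.01, 0.0, 1.0
    for _ in range(2):
        f = f * (1 + h * (x - 3))          # Euler update
        x += h
        true_f = math.exp(x**2 / 2 - 3*x)  # separation-of-variables solution
        print(x, round(f, 4), round(true_f, 4))
    # prints roughly: 0.01 0.97 0.9705  and  0.02 0.941 0.942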

20. Joey works at a chocolate store. Starting at time 0, we have the following 4 customer interarrival times (in minutes):

0   2   5   2

Customers are served in LIFO fashion. The 4 customers order the following numbers of chocolate products, respectively:

6   2   3   1

Suppose it takes Joey 3 minutes to prepare each chocolate product. Further suppose that he charges $3/chocolate. Unfortunately, the customers are unruly and each customer causes $0.50 in damage for every minute the customer has to wait in line.

(a) What is the average number of customers in the system during the first 10 minutes?

(b) How much money will Joey make or lose with the above 4 customers?

Solution: Consider the following table.

cust   interarrival   arrival time   serve start   serve time   depart   wait   sys time
1      0              0              0             18           18       0      18
2      2              2              30            6            36       28     34
3      5              7              21            9            30       14     23
4      2              9              18            3            21       9      12

(a) Let X_i denote the amount of time Customer i spends in the system during the time interval [0, 10]. In particular, X_1 = 10, X_2 = 10 - 2 = 8, X_3 = 3, and X_4 = 1. The average number of customers in the system during the first 10 minutes is

(total customer time)/10 = (10 + 8 + 3 + 1)/10 = 22/10 = 2.2.

(b) Joey makes 3(6 + 2 + 3 + 1) - 0.5(0 + 28 + 14 + 9) = $10.50.
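The hand simulation in Problem 20 can also be checked with a small LIFO single-server script (a sketch, not required on the exam; the list names are just illustrative):

    # LIFO single-server simulation of Problem 20.
    arrivals = [0, 2, 7, 9]        # cumulative arrival times
    serve    = [18, 6, 9, 3]       # 3 minutes per chocolate ordered (6, 2, 3, 1)
    n = len(arrivals)
    start, depart = [None] * n, [None] * n

    t, done = 0.0, 0
    while done < n:
        # customers who have arrived but not yet started service
        waiting = [i for i in range(n) if arrivals[i] <= t and start[i] is None]
        if not waiting:                       # server idle: jump to next arrival
            t = min(a for i, a in enumerate(arrivals) if start[i] is None)
            continue
        i = max(waiting, key=lambda j: arrivals[j])   # LIFO: latest arrival first
        start[i], depart[i] = t, t + serve[i]
        t = depart[i]
        done += 1

    waits = [start[i] - arrivals[i] for i in range(n)]
    time_in_system_first_10 = sum(min(depart[i], 10) - min(arrivals[i], 10) for i in range(n))
    print(waits)                                     # [0, 28, 14, 9]
    print(time_in_system_first_10 / 10)              # 2.2 customers on average
    print(3 * (6 + 2 + 3 + 1) - 0.5 * sum(waits))    # 10.5 dollars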