IEOR 3106: Introduction to Operations Research: Stochastic Models. Professor Whitt. SOLUTIONS to Homework Assignment 2


More Probability Review: In the Ross textbook, Introduction to Probability Models, read Sections 3.1-3.5 up to (not including) Example 3.24 on p. 125 (up to Example 3.22 on p. 123 of the 9th edition). To keep the reading under control, Examples 3.14-3.16 on pages 111-117 (Examples 3.13-3.15 on pages 110-116 of the 9th edition) can be omitted. (The rest of Chapter 3 has interesting material, but we will not cover it.)

Problems from Chapter 3 of Ross, plus two extra ones. In all homework, show your work.

Problem 3.3

p(X = i | Y = j) = P(X = i, Y = j) / P(Y = j), where P(Y = j) = Σ_i p(X = i, Y = j), and

E[X | Y = j] = Σ_i i p(X = i | Y = j).

So

E[X | Y = 1] = 2, E[X | Y = 2] = 5/3, E[X | Y = 3] = 12/5.

Problem 3.4

For X and Y to be independent, we need

p(X = i, Y = j) = p(X = i) p(Y = j) for all i and j.

This property does not hold here. For example, p(X = 1) = 2/9 and P(Y = 1) = 5/9, but p(X = 1, Y = 1) = 1/9. Note that 1/9 ≠ (2/9)(5/9).

Problem 3.7

Note that

p(X = 1, Z = 1 | Y = 2) = 1/5, p(X = 1, Z = 2 | Y = 2) = 0, p(X = 2, Z = 1 | Y = 2) = 0, p(X = 2, Z = 2 | Y = 2) = 4/5,

so that p(X = 1 | Y = 2) = 1/5 and p(X = 2 | Y = 2) = 4/5, and

E[X | Y = 2] = 1 (1/5) + 2 (4/5) = 9/5.

Now turn to the second question. First,

P(X = 1 | Y = 2, Z = 1) = 1,

so E[X | Y = 2, Z = 1] = 1.

Problem 3.11

First get the pdf of Y:

f_Y(y) = ∫_{-y}^{y} f(x, y) dx = ∫_{-y}^{y} (y^2 - x^2) e^{-y} / 8 dx = y^3 e^{-y} / 6.

Recognize that the pdf of Y is gamma with parameters λ = 1 and α = 4; see Section 2.3.3. Next get the conditional pdf:

f_{X|Y}(x | y) = f(x, y) / f_Y(y) = (3/4)(y^2 - x^2) / y^3, -y ≤ x ≤ y, for y > 0.

To check, notice that the area under the conditional pdf f_{X|Y}(x | y) is indeed 1.

Now

E[X | Y = y] = ∫_{-y}^{y} x f_{X|Y}(x | y) dx = (3/4) ∫_{-y}^{y} x (y^2 - x^2) / y^3 dx = (3/4) ∫_{-y}^{y} [x/y - x^3/y^3] dx = 0.

You can see the result in advance by noticing that the conditional pdf f_{X|Y}(x | y) is symmetric about 0.

Problem 3.12

First find the conditional density of X given that Y = y. See Section 3.3. The general formula is

f_{X|Y}(x | y) = f_{X,Y}(x, y) / f_Y(y).

To get the density of Y, we need to integrate:

f_Y(y) = ∫_0^∞ f_{X,Y}(x, y) dx = ∫_0^∞ (e^{-x/y} e^{-y} / y) dx = e^{-y},

which we recognize as the pdf of an exponential random variable with mean 1. Hence

f_{X|Y}(x | y) = (e^{-x/y} e^{-y} / y) / e^{-y} = e^{-x/y} / y,

which we recognize as the pdf of an exponential random variable with mean y. Hence E[X | Y = y] = y.

Problem 3.14

You can see that the conditional distribution is uniform over [0, 1/2], so that the conditional mean must be 1/4. You can also proceed more deliberately. Construct the conditional density:

f_{X | X < 1/2}(x) = f_X(x) / P(X < 1/2) = 1 / (1/2) = 2, 0 ≤ x ≤ 1/2.

Now compute the expected value with respect to this new density:

E[X | X < 1/2] = ∫_0^{1/2} x f_{X | X < 1/2}(x) dx = ∫_0^{1/2} 2x dx = 1/4.
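The two checks claimed for Problem 3.11 (that the conditional pdf integrates to 1, and that its mean is 0 by symmetry) are easy to confirm numerically. A minimal sketch, where the midpoint rule and the test value y = 2.0 are arbitrary choices, not part of the solution:

```python
# Numerical sanity check for Problem 3.11.
def f_cond(x, y):
    # conditional pdf f_{X|Y}(x|y) = (3/4)(y^2 - x^2)/y^3 on -y <= x <= y
    return 0.75 * (y ** 2 - x ** 2) / y ** 3

def integrate(g, a, b, n=10_000):
    # simple midpoint rule
    h = (b - a) / n
    return sum(g(a + (i + 0.5) * h) for i in range(n)) * h

y = 2.0
total = integrate(lambda x: f_cond(x, y), -y, y)      # area under the pdf, ~1
mean = integrate(lambda x: x * f_cond(x, y), -y, y)   # E[X | Y = y], ~0
print(total, mean)
```

The second integral vanishes because the integrand is odd over a symmetric interval, which is exactly the symmetry argument in the text.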

Problem 3.15

Since the joint density is independent of x, you can see that the conditional density must be uniform in x over the allowed range, which is the interval [0, y]. Again we can proceed more deliberately. Start by computing the conditional density of X given Y = y:

f_{X|Y=y}(x) = f_{X,Y}(x, y) / f_Y(y), where f_{X,Y}(x, y) = (1/y) e^{-y}, 0 < x < y,

and

f_Y(y) = ∫_0^y f_{X,Y}(x, y) dx = ∫_0^y (1/y) e^{-y} dx = e^{-y}, 0 < y < ∞.

Hence,

f_{X|Y=y}(x) = (1/y) e^{-y} / e^{-y} = 1/y, 0 < x < y,

and

E[X^2 | Y = y] = (1/y) ∫_0^y x^2 dx = y^2 / 3.

Problem 3.37

E[X] = (1/3)(2.6) + (1/3)(3.0) + (1/3)(3.4) = 9.0/3 = 3.0.

Essentially the same logic applies with the second moment. Recall that the variance of a Poisson random variable equals the mean. Hence the second moment of a Poisson random variable is the mean plus the square of the mean. We use the fact that the second moment of a mixture is the mixture of the second moments:

E[X^2] = (1/3)[(2.6) + (2.6)^2] + (1/3)[(3.0) + (3.0)^2] + (1/3)[(3.4) + (3.4)^2] = 36.32/3 ≈ 12.107,

so that

Var(X) = E[X^2] - (E[X])^2 ≈ 12.107 - 9.0 = 3.107.

Problem 3.40 (a)

Let X denote the number of the door chosen on the first try. Let N be the total number of days spent in jail. The idea is to condition on X, using the fact that the problem repeats when the prisoner returns to his cell:

E[N | X = 1] = 2 + E[N], E[N | X = 2] = 3 + E[N], E[N | X = 3] = 0,

so that

E[N] = (0.5)(2 + E[N]) + (0.3)(3 + E[N]) + (0.2)(0),

from which we deduce that E[N] = 9.5 days.
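The conditioning argument in Problem 3.40(a) can be checked by simulating the prisoner's situation directly. A minimal sketch, where the seed and the trial count are arbitrary choices: door 1 (probability 0.5) costs 2 days and returns the prisoner to his cell, door 2 (probability 0.3) costs 3 days and returns him, and door 3 (probability 0.2) frees him immediately.

```python
import random

def days_in_jail(rng):
    # One realization of N: repeat until the freedom door is chosen.
    days = 0
    while True:
        u = rng.random()
        if u < 0.5:
            days += 2        # door 1: 2 days, then start over
        elif u < 0.8:
            days += 3        # door 2: 3 days, then start over
        else:
            return days      # door 3: freedom

rng = random.Random(2024)
n = 200_000
est = sum(days_in_jail(rng) for _ in range(n)) / n
print(est)  # should be close to E[N] = 9.5
```

The simulation restarts the day count only implicitly: the loop structure mirrors the "problem repeats when the prisoner returns to his cell" step in the conditioning argument.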

Extra Problem 1. Suppose that you flip a fair coin 1,000,000 times. What is the approximate probability of getting at least 501,500 heads?

The exact distribution of the number of heads is binomial with parameters n = 1,000,000 and p = 1/2, but you should use a normal distribution approximation, based on the central limit theorem. The mean number of heads is 500,000, and the standard deviation is 500. So we are asking about the probability of exceeding the mean by at least 3 standard deviations. From the table for the normal distribution on page 81, we see that the probability is 1 - 0.9987 = 0.0013 (about 1/1000).

The law of large numbers tells us that the proportion of heads gets closer and closer to 1/2 as the sample size grows. The central limit theorem allows us to be much more precise, and to see how likely actual outcomes are.

Extra Problem 2. Suppose that you own shares of stock in the company Lewser, Inc. Suppose that the current (initial) stock price is $100 per share. Suppose that we regard the daily changes in stock price as independent and identically distributed random variables (a simple additive random walk model). Suppose that the expected daily change is -$0.10, or -10 cents per day (as one would expect from a stock with that name). Using units of dollars, suppose that the variance of the daily change is 0.25. Suppose that the distribution of the daily change is a gamma distribution. What is the approximate probability that the stock price has not dropped (is at least $100) after 100 days?

The total change in 100 days is the sum of 100 IID random variables, each distributed as the single daily change. The exact distribution is complicated, but again we can apply the central limit theorem and get a normal approximation. We do not use the fact that the distribution is gamma. All we care about is that the daily change has a finite mean and variance. And of course we care about the values of the mean and variance.

Let S_100 be the change in the stock price over 100 days. We would write

S_100 = X_1 + X_2 + ... + X_100,

where X_j is the change on day j. Here are the mean and variance. The mean of the sum is (always) the sum of the means, so that

E[S_100] = E[X_1 + X_2 + ... + X_100] = 100 E[X_1] = -10 (dollars),

and, because of the IID assumption, the variance of the sum is also the sum of the variances, so that

Var(S_100) = Var(X_1) + ... + Var(X_100) = 100 Var(X_1) = 25,

so that the standard deviation of S_100 is exactly 5 (the square root of the variance). Hence

P(100 + S_100 ≥ 100) = P(S_100 ≥ 0) = P((S_100 + 10)/5 ≥ (0 + 10)/5) ≈ P(N(0,1) ≥ 2) = 1 - 0.9772 = 0.0228.

See pages 79-83 of the book and the class lecture notes for more explanation.
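Both extra problems reduce to a standard normal tail probability, which can be computed without the table via the identity P(N(0,1) ≥ z) = erfc(z / sqrt(2)) / 2. A short sketch of the two calculations:

```python
import math

def normal_tail(z):
    # P(N(0,1) >= z), via the complementary error function
    return 0.5 * math.erfc(z / math.sqrt(2))

# Extra Problem 1: 501,500 heads is 3 standard deviations above the mean.
print(round(normal_tail(3.0), 4))  # → 0.0013

# Extra Problem 2: P(S_100 >= 0) = P(N(0,1) >= 2).
print(round(normal_tail(2.0), 4))  # → 0.0228
```

These match the table values 1 - 0.9987 and 1 - 0.9772 quoted above.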