
HW7 Solutions

1. (15 pts.) James Bond

James Bond, my favorite hero, has again jumped off a plane. The plane is traveling from base A to base B, a distance of 100 km apart. Now suppose the plane takes off from the ground at base A, climbs at an angle of 45 degrees to an altitude of 10 km, flies at that altitude for a while, and then descends at an angle of 45 degrees to land at base B. All along, the plane is assumed to fly at the same speed. James Bond jumps off at a time uniformly distributed over the duration of the journey. You can assume that James Bond, being who he is, violates the laws of physics and descends vertically after he jumps. Compute the expectation and variance of his landing position. How do they compare to the case when the plane flies at constant velocity (with no ascending and descending)?

Answer: The climb and the descent each cover 10 km of horizontal distance along a flight path of length 10√2 km, and the level stretch covers 80 km, so the total path length is 80 + 20√2 km. Since the jump time is uniform over the journey and the speed is constant, the jump point is uniform along the flight path. Converting path length to horizontal distance, the density of the position X (in km from base A) where James Bond lands is

f(x) = √2 / (80 + 20√2)   if x ∈ [0, 10),
f(x) = 1 / (80 + 20√2)    if x ∈ [10, 90),
f(x) = √2 / (80 + 20√2)   if x ∈ [90, 100],
f(x) = 0                  otherwise.

(A sketch of this piecewise-constant density appeared here.)
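Not part of the original solution: as a quick sanity check of this density (and of the mean and variance computed below), here is a small Monte Carlo sketch, assuming Python with numpy is available. It draws the jump point uniformly along the flight path and converts it to a horizontal position.

import numpy as np

rng = np.random.default_rng(0)

# Path lengths in km: 45-degree climb to 10 km altitude, level cruise, 45-degree descent.
climb = 10 * np.sqrt(2)      # covers 10 km of horizontal distance
cruise = 80.0                # covers 80 km of horizontal distance
descent = 10 * np.sqrt(2)    # covers 10 km of horizontal distance
total = climb + cruise + descent

# A uniformly random jump time at constant speed is a uniformly random point along the path.
s = rng.uniform(0, total, size=1_000_000)
x = np.where(s < climb, s / np.sqrt(2),
    np.where(s < climb + cruise, 10 + (s - climb),
             90 + (s - climb - cruise) / np.sqrt(2)))

print(x.mean(), x.var())     # should come out near 50 and roughly 925

A histogram of x reproduces the three-level shape of the density above.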

The expectation will be

E[X] = ∫ x f(x) dx
     = (√2/(80 + 20√2)) ∫_0^10 x dx + (1/(80 + 20√2)) ∫_10^90 x dx + (√2/(80 + 20√2)) ∫_90^100 x dx
     = (√2 · 50 + 4000 + √2 · 950) / (80 + 20√2)
     = 1000 (√2 + 4) / (20 (√2 + 4))
     = 50.

(A sanity check: the distribution is symmetric about the midpoint of the journey, 50 km, so it makes sense that the expectation is 50.) The expectation is the same as when the plane travels at the same altitude for the entire trip.

The variance will be Var(X) = E[X²] − E[X]², where

E[X²] = ∫ x² f(x) dx
      = (√2/(80 + 20√2)) ∫_0^10 x² dx + (1/(80 + 20√2)) ∫_10^90 x² dx + (√2/(80 + 20√2)) ∫_90^100 x² dx
      = (√2 · 1000/3 + 728000/3 + √2 · 271000/3) / (80 + 20√2)
      ≈ 3425.1,

so

Var(X) ≈ 3425.1 − 50² ≈ 925.

In the uniform case where the plane flies at a constant speed at the same altitude, the variance is 100²/12 ≈ 833. The variance is greater here because the ascent/descent increases the probability that Bond jumps close to base A or B, i.e., the probability that he jumps far away from the expectation (the middle).

2. (20 pts.) Laplace Distribution

A continuous random variable X is said to have a Laplace distribution with parameter λ if its pdf is given by

f(x) = A exp(−λ|x|),   −∞ < x < ∞,

for some constant A.

a) (2 pts.) Can the parameter λ be negative? Can λ be zero? Explain.

Answer: No. No. We know that for a valid pdf,

∫_{−∞}^{∞} f(x) dx = 1.

If we plug in for f(x), we get

∫_{−∞}^{∞} A exp(−λ|x|) dx = 2A ∫_0^∞ exp(−λx) dx = 2A [−exp(−λx)/λ]_0^∞.

If λ ≤ 0, this evaluates to infinity, and then there is no way to choose A so that the area under the pdf is 1. Therefore, it must be that λ > 0.

b) (3 pts.) Compute the constant A in terms of λ. Sketch the pdf.

Answer: Following part (a), we know

∫_{−∞}^{∞} A exp(−λ|x|) dx = 2A ∫_0^∞ exp(−λx) dx.

Also,

2A ∫_0^∞ exp(−λx) dx = 2A [−exp(−λx)/λ]_0^∞ = 2A (0 − (−1/λ)) = 2A/λ.

Therefore, we must have 2A/λ = 1, or equivalently, A = λ/2. (A sketch of the pdf, a symmetric two-sided exponential peaked at x = 0, appeared here.)

c) (4 pts.) Compute the mean and variance of X in terms of λ.

Answer: Using the definition of the expectation:

E[X] = ∫_{−∞}^{∞} x (λ/2) exp(−λ|x|) dx
     = ∫_{−∞}^0 x (λ/2) exp(λx) dx + ∫_0^∞ x (λ/2) exp(−λx) dx
     = −∫_0^∞ x (λ/2) exp(−λx) dx + ∫_0^∞ x (λ/2) exp(−λx) dx
     = 0.
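As a quick numerical check of the normalizing constant from part (b) and of this mean (not part of the original solution; a sketch assuming Python with numpy, with λ = 1.5 as an arbitrary test value):

import numpy as np

lam = 1.5                            # arbitrary positive test value for lambda
A = lam / 2                          # the normalizing constant derived in part (b)

x, dx = np.linspace(-60, 60, 1_200_001, retstep=True)   # tails beyond +/-60 are negligible here
pdf = A * np.exp(-lam * np.abs(x))

print(np.sum(pdf) * dx)              # area under the pdf; should be very close to 1
print(np.sum(x * pdf) * dx)          # numerical E[X]; should be very close to 0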

(Alternatively, we could note that the distribution is symmetric around x = 0, from which it follows that the mean is zero.)

For the variance, we have

Var(X) = E[X²] − E[X]² = E[X²]
       = ∫_{−∞}^{∞} x² (λ/2) exp(−λ|x|) dx
       = ∫_0^∞ x² λ exp(−λx) dx
       = [−(λ²x² + 2λx + 2) exp(−λx) / λ²]_0^∞
       = 2/λ².

(The integral in the 3rd line is done using integration by parts.)

d) (4 pts.) Compute the cumulative distribution function (cdf) of X.

Answer: When x ≤ 0,

P(X ≤ x) = ∫_{−∞}^x f(u) du
         = ∫_{−∞}^x (λ/2) exp(−λ|u|) du
         = ∫_{−∞}^x (λ/2) exp(λu) du
         = [(1/2) exp(λu)]_{−∞}^x
         = (1/2) exp(λx).

When x > 0,

P(X ≤ x) = 1/2 + ∫_0^x (λ/2) exp(−λu) du
         = 1/2 + [−(1/2) exp(−λu)]_0^x
         = 1/2 + (−(1/2) exp(−λx) + 1/2)
         = 1 − (1/2) exp(−λx).

e) (3 pts.) For s, t > 0, compute P[X ≥ s + t | X ≥ s].

Answer: We begin with the definition of conditional probability:

P(X ≥ s + t | X ≥ s) = P((X ≥ s + t) ∩ (X ≥ s)) / P(X ≥ s).

Since t > 0, if X ≥ s + t then X ≥ s, so we have

P(X ≥ s + t | X ≥ s) = P(X ≥ s + t) / P(X ≥ s).

Also, since s and t are both positive,

P(X ≥ s + t) = (1/2) exp(−λ(s + t))   and   P(X ≥ s) = (1/2) exp(−λs).

Therefore,

P(X ≥ s + t | X ≥ s) = P(X ≥ s + t) / P(X ≥ s) = exp(−λ(s + t)) / exp(−λs) = exp(−λt).

f) (4 pts.) Let Y = |X|. Compute the pdf of Y. (Hint: you may consider starting from the cdf of X that you have computed in part (d).)

Answer: Note that for y ≥ 0,

P(Y ≤ y) = 1 − P(Y > y) = 1 − (P(X > y) + P(X < −y)).

Thus,

P(Y ≤ y) = 1 − ((1/2) exp(−λy) + (1/2) exp(−λy)) = 1 − exp(−λy).

This is the cumulative distribution function for Y. Finally, we differentiate P(Y ≤ y) to get the pdf of Y, noting that

d/dy (1 − exp(−λy)) = λ exp(−λy).

Consequently, the pdf for Y is given by

f(y) = λ exp(−λy) if y ≥ 0,   and   f(y) = 0 if y < 0.

3. (15 pts.) Bus waiting times

You show up at a bus stop at a random time. There are three buses running on a periodic schedule. They all go to the same destination (which is where you want to go). Each of the three buses arrives at your bus stop once every 10 minutes, but at different offsets from each other. The offsets of the three buses are uniformly random and independent. You will get on the first bus to arrive at your bus stop. Let the random variable T denote the number of minutes you have to wait until the first bus arrives.

a) (6 pts.) Compute an expression for the probability density function (pdf) and the cumulative distribution function (cdf) for T.

Answer: Let X_1, X_2, X_3 be random variables denoting the number of minutes you have to wait for bus 1, 2, or 3 respectively. They have a uniform distribution: X_1, X_2, X_3 ~ Uniform(0, 10). Also T = min(X_1, X_2, X_3). Let f_1, f_2, f_3 be the probability density functions for X_1, X_2, X_3, and let F_1, F_2, F_3 be the cumulative distribution functions. Let g be the probability density function for T, and G the cumulative distribution function for T. Then,

f_i(x) = 1/10   and   F_i(x) = x/10   for x ∈ [0, 10].

To obtain the probability density function g(t), we first compute the cumulative distribution function G(t) = P(T ≤ t), and then we calculate its derivative. Suppose t ∈ [0, 10]. To compute P(T ≤ t), we compute the probability of the complement event:

P(T > t) = P(min(X_1, X_2, X_3) > t) = P(X_1 > t and X_2 > t and X_3 > t) = P(X_1 > t) P(X_2 > t) P(X_3 > t) = (1 − t/10)³.

Therefore,

G(t) = P(T ≤ t) = 1 − P(T > t) = 1 − (1 − t/10)³   for t ∈ [0, 10].

Now the pdf is the derivative of the cdf:

g(t) = d/dt G(t) = 3 (1 − t/10)² · (1/10) = (3/10) (1 − t/10)²   for t ∈ [0, 10].

b) (3 pts.) Plot the pdf and cdf.

Answer: (Plots of the pdf and cdf appeared here.)

c) (6 pts.) Compute E[T] and Var(T).

Answer:

E[T] = ∫_0^10 t g(t) dt = ∫_0^10 t (3/10)(1 − t/10)² dt
     = (3/10) ∫_0^10 (t − t²/5 + t³/100) dt
     = (3/10) [t²/2 − t³/15 + t⁴/400]_0^10
     = (3/10) (50 − 1000/15 + 25)
     = 5/2.
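Before continuing to the variance below, here is a small simulation check of this mean (not part of the original solution; a sketch assuming Python with numpy):

import numpy as np

rng = np.random.default_rng(0)
# Three independent bus offsets, each uniform on [0, 10] minutes; you wait for the earliest one.
waits = rng.uniform(0, 10, size=(1_000_000, 3)).min(axis=1)
print(waits.mean(), waits.var())     # should come out near 5/2 = 2.5 and 15/4 = 3.75 respectively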

E[T²] = ∫_0^10 t² g(t) dt = ∫_0^10 t² (3/10)(1 − t/10)² dt
      = (3/10) ∫_0^10 (t² − t³/5 + t⁴/100) dt
      = (3/10) [t³/3 − t⁴/20 + t⁵/500]_0^10
      = (3/10) (1000/3 − 500 + 200)
      = 10.

Var(T) = E[T²] − E[T]² = 10 − (5/2)² = 15/4.

4. (20 pts.) Exponential Distribution

We begin by proving two very useful properties of the exponential distribution. We then use them to solve a problem in photography.

a) (4 pts) Let r.v. X have a geometric distribution with parameter p. Show that, for any positive m, n, we have P(X > m + n | X > m) = P(X > n). This is the memoryless property of the geometric distribution. Why do you think this property is called memoryless?

Answer: We first calculate the probability P(X > m) for a geometric random variable X with parameter p; we know that X > m exactly when the first m trials were all failures; therefore, we get:

P(X > m) = (1 − p)^m.

Using this, we can now compute the conditional probability:

P(X > m + n | X > m) = P((X > m + n) ∩ (X > m)) / P(X > m)
                     = P(X > m + n) / P(X > m)   [since (X > m + n) ∩ (X > m) = (X > m + n)]
                     = (1 − p)^{m+n} / (1 − p)^m
                     = (1 − p)^n
                     = P(X > n).

This is exactly what we wanted to show. This property is called memoryless because even knowing that we have waited m trials and have not yet seen a success, the probability of seeing a success in the next n trials is exactly the same as if we had not seen any trials at all.
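A quick empirical check of this memoryless property (not part of the original solution; a sketch assuming Python with numpy, with p, m, n as arbitrary test values):

import numpy as np

rng = np.random.default_rng(0)
p, m, n = 0.3, 4, 6
x = rng.geometric(p, size=1_000_000)     # number of trials up to and including the first success

lhs = np.mean(x[x > m] > m + n)          # estimate of P(X > m + n | X > m)
rhs = np.mean(x > n)                     # estimate of P(X > n)
print(lhs, rhs, (1 - p) ** n)            # all three should be close to each other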

b) (4 pts) Let r.v. X have an exponential distribution with parameter λ. Show that, for any positive s, t, we have P(X > s + t | X > t) = P(X > s). [This is the memoryless property of the exponential distribution.]

Answer: We first calculate the probability P(X > t) for an exponential random variable X with parameter λ:

P(X > t) = ∫_t^∞ λ e^{−λx} dx = e^{−λt}.

Using this, it is now easy to compute the given conditional probability:

P(X > s + t | X > t) = P((X > s + t) ∩ (X > t)) / P(X > t)
                     = P(X > s + t) / P(X > t)   [since (X > s + t) ∩ (X > t) = (X > s + t)]
                     = e^{−λ(s+t)} / e^{−λt}
                     = e^{−λs}
                     = P(X > s).

c) (4 pts) Let r.v.s X_1, X_2 be independent and exponentially distributed with parameters λ_1, λ_2. Show that the r.v. Y = min{X_1, X_2} is exponentially distributed with parameter λ_1 + λ_2. [Hint: work with cdfs.]

Answer: Using the hint given, we will work with the cdf; from part (b), remember that if X is exponential with parameter λ, then P(X ≤ t) = 1 − P(X > t) = 1 − e^{−λt}. So, if we can show that the cdf of Y is also of this form, then we can conclude that Y is also an exponential random variable. We get:

P(Y ≤ t) = 1 − P(Y > t)
         = 1 − P(min{X_1, X_2} > t)
         = 1 − P((X_1 > t) ∩ (X_2 > t))
         = 1 − P(X_1 > t) P(X_2 > t)   [since X_1, X_2 are independent]
         = 1 − e^{−λ_1 t} e^{−λ_2 t}
         = 1 − e^{−(λ_1 + λ_2) t}.

This is exactly the cdf of an exponential random variable with parameter λ_1 + λ_2.

d) (4 pts) You have a digital camera that requires two batteries to operate. You purchase n batteries, labeled 1, 2, ..., n, each of which has a lifetime that is exponentially distributed with parameter λ and is independent of all the other batteries. Initially you install batteries 1 and 2. Each time a battery fails, you replace it with the lowest-numbered unused battery. At the end of this process you will be left with just one working battery. What is the expected total time until the end of the process? Justify your answer.

Answer: Let T be the time when we are left with only one working battery. We want to compute E[T]. To facilitate the calculation, we can break up T into the intervals

T = T_1 + T_2 + ... + T_{n−1},

where T_i is the time between the (i−1)st failure and the i-th failure. After n − 1 failures, we are left with only one battery and the process ends. Note that the i-th failure need not correspond to the battery labeled i. We can calculate E[T] as

E[T] = E[T_1] + E[T_2] + ... + E[T_{n−1}].

It now remains to find E[T_i] for each i. To calculate E[T_i], note that just after the (i−1)st failure there are two working batteries installed in the camera: the most recently installed one and an older one. The lifetime of the newly installed battery is distributed as an exponential random variable with parameter λ. Also, because of the memoryless property proved in part (b), the remaining lifetime of the other installed battery, conditioned on it having survived until the (i−1)st failure, is also distributed as an exponential random variable with parameter λ. Hence, T_i is the minimum of two independent exponential random variables, each with parameter λ. By part (c), T_i is exponentially distributed with parameter 2λ. This gives E[T_i] = 1/(2λ) and hence

E[T] = E[T_1] + E[T_2] + ... + E[T_{n−1}] = (n − 1)/(2λ).

e) (4 pts) In the scenario of part (d), what is the probability that battery i is the last remaining working battery, as a function of i?

Answer: We first calculate: if at any point the battery labeled i and one labeled j (for some j ≠ i) are installed, what is the probability that the one labeled j fails first? If neither of the two batteries has failed yet, then conditioned on this fact, the distributions of their remaining lifetimes are identical (exponential with parameter λ). Hence, by symmetry, the probability that the one labeled j fails first is 1/2.

For batteries 2, 3, ..., n, the battery labeled i is installed after i − 2 batteries have failed. The probability that it is the last one left is the probability that in each of the n − i + 1 failures that remain, it is the other installed battery in the camera which fails first, and not the battery labeled i. From the previous calculation, each of these events happens with probability 1/2, conditioned on the history. Hence, the probability that the battery labeled i is the last one left is (1/2)^{n−i+1}.

However, this formula is slightly off for the battery labeled 1; like battery 2, it also needs to survive n − 1 failures, not n failures, and so we get:

P(last battery is battery i) = (1/2)^{n−1}      if i = 1,
P(last battery is battery i) = (1/2)^{n−i+1}    if i = 2, 3, ..., n.

5. (15 pts.) Misprints

A textbook has on average one misprint per page.

a) (7 pts) What is the chance that you see exactly 4 misprints on page 1?

Answer: We assume that misprints are rare events that follow the Poisson distribution. Let X be a Poisson random variable representing the number of misprints on page 1. Since a Poisson random variable with parameter λ has expectation λ, we know X ~ Poiss(1). Hence

P(X = 4) = e^{−1} · 1⁴ / 4! = e^{−1}/4! ≈ 0.0153.

b) (8 pts) What is the chance that you see exactly 4 misprints on some page in the textbook, if the textbook is 250 pages long?

Answer: Let X_i be a Poisson random variable representing the number of misprints on page i. By the independence of X_1, ..., X_250,

P(X_i = 4 for some i) = 1 − P(X_i ≠ 4 for all i) = 1 − (1 − e^{−1}/4!)^250 ≈ 0.979.

6. (15 pts.) Functions of a r.v.

a) (5 pts.) Suppose X is a discrete random variable uniform in {1, 2, ..., n}. What is the pmf of the random variable Y = X²? Is it uniform on its range?

Answer: The range of Y is {1, 4, 9, ..., n²}. We have

P(Y = m) = P(X² = m) = P(X = √m) = 1/n

whenever there exists i ∈ {1, 2, ..., n} such that m = i², and P(Y = m) = 0 for all other values of m. Hence Y is also uniform on its range. (A plot of the pmf appeared here.)

b) (5 pts.) Suppose X is a continuous random variable uniform in the interval [0, n]. What is the pdf of the random variable Y = X²? Is it uniform on its range? Sketch the pdf, and give an intuitive explanation of the shape of the curve.

Answer: The range of Y is the interval [0, n²]. We start by computing the cdf of Y. If y < 0, P(Y ≤ y) = 0. If y > n², P(Y ≤ y) = 1. For y ∈ [0, n²] we have

F_Y(y) = P(Y ≤ y) = P(X² ≤ y) = P(X ≤ √y) = ∫_0^{√y} (1/n) dx = √y / n.

Therefore, f_Y(y) = 1/(2n√y), which is not uniform on its range. The pdf of Y is decreasing, meaning that larger values have smaller probability. Intuitively, this is because equal-length intervals of X get mapped to larger and larger intervals of Y. (A plot of the pdf appeared here.)

c) (5 pts.) Suppose X is a continuous random variable uniform in the interval [−n, n]. What is the pdf of Y = X²? Sketch the pdf.

Answer: The range of Y is the interval [0, n²]. If y < 0, P(Y ≤ y) = 0. If y > n², P(Y ≤ y) = 1. For y ∈ [0, n²] we have

F_Y(y) = P(Y ≤ y) = P(X² ≤ y) = P(−√y ≤ X ≤ √y) = ∫_{−√y}^{√y} (1/(2n)) dx = √y / n.

Therefore, f_Y(y) = 1/(2n√y).

(A plot of the pdf appeared here.)
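To close, a simulation sketch (not part of the original solution; assuming Python with numpy, and using n = 6 purely as a test value) that compares the empirical cdf of Y = X² against √y/n, the cdf implied by the pdf 1/(2n√y) derived in parts (b) and (c):

import numpy as np

rng = np.random.default_rng(0)
n = 6                                    # arbitrary test value for n
y = np.linspace(0.25, n**2, 50)
target = np.sqrt(y) / n                  # F_Y(y) = sqrt(y)/n

# Part (b): X uniform on [0, n].
yb = np.sort(rng.uniform(0, n, size=500_000) ** 2)
emp_b = np.searchsorted(yb, y) / yb.size

# Part (c): X uniform on [-n, n]; Y = X^2 has the same cdf.
yc = np.sort(rng.uniform(-n, n, size=500_000) ** 2)
emp_c = np.searchsorted(yc, y) / yc.size

print(np.max(np.abs(emp_b - target)))    # should be small (on the order of 1e-3)
print(np.max(np.abs(emp_c - target)))    # should be small as well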