EE178: Probabilistic Systems Analysis, Spring 2018. Final Solutions (Fri, June 8)


1. Small problems (62 points)

(a) (8 points) Let $X_1, X_2, \ldots, X_n$ be independent random variables uniformly distributed on $[0, 1]$. Let $Z = \max(X_1, X_2, \ldots, X_n)$. Find the pdf of $Z$.

Consider the cdf of $Z$:

$$F_Z(z) = \Pr(Z \le z) = \Pr(X_1 \le z, X_2 \le z, \ldots, X_n \le z) = \prod_{i=1}^n \Pr(X_i \le z) = \prod_{i=1}^n F_{X_i}(z) = (F_{X_1}(z))^n = \begin{cases} 0 & \text{if } z < 0 \\ z^n & \text{if } z \in [0, 1] \\ 1 & \text{if } z > 1 \end{cases}$$

The pdf is the derivative of the cdf of $Z$:

$$f_Z(z) = \frac{d}{dz} F_Z(z) = \begin{cases} n z^{n-1} & \text{if } z \in [0, 1] \\ 0 & \text{otherwise} \end{cases}$$

(b) (8 points) I have 10 hats in a box. Each time I go out with my friends, I pick one of the hats at random and wear it. When I come back home, I put the hat back in the box. If I went out 20 times during this quarter, what is the expected number of hats that I never used?

Introduce indicator random variables $X_i$, $i = 1, \ldots, 10$:

$$X_i = \begin{cases} 1 & \text{if hat } i \text{ was never used} \\ 0 & \text{otherwise} \end{cases}$$

A hat is never used if it is not picked in any of the 20 independent outings. The probability of picking a specific hat on a specific day is $p = 1/10$, thus

$$\Pr(X_i = 1) = (1 - p)^{20} = (0.9)^{20} \approx 0.1216$$

The expected number of unused hats is

$$E\left[\sum_{i=1}^{10} X_i\right] = 10\, E[X_1] = 10 \Pr(X_1 = 1) = 10 \cdot (0.9)^{20} \approx 1.216$$

(c) Let $X$ and $Y$ denote the $x$ and $y$ coordinates of a point that is uniformly chosen on the circumference of a unit circle.

(i) (2 points) Are $X$ and $Y$ independent?

No. Since $X^2 + Y^2 = 1$, given $Y = 0$ the point is $(\pm 1, 0)$, so by symmetry $\Pr(X = 1 \mid Y = 0) = \Pr(X = -1 \mid Y = 0) = 0.5$, while $\Pr(X = 1) = 0$ because $X$ is continuous on $[-1, 1]$. Thus $\Pr(X = 1 \mid Y = 0) \ne \Pr(X = 1)$, which implies dependence.
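Both answers are easy to sanity-check by simulation. A minimal Monte Carlo sketch for parts (a) and (b), assuming NumPy is available; this is illustrative and not part of the original solution:

```python
import numpy as np

rng = np.random.default_rng(0)
trials = 200_000

# (a) Z = max of n uniforms has pdf n z^(n-1) on [0, 1], so E[Z] = n/(n+1).
n = 5
Z = rng.random((trials, n)).max(axis=1)
print(Z.mean(), n / (n + 1))              # both ~0.833

# (b) Expected number of never-used hats: 10 hats, 20 outings.
hats, outings = 10, 20
picks = rng.integers(0, hats, (trials, outings))
used = np.zeros((trials, hats), dtype=bool)
used[np.arange(trials)[:, None], picks] = True
print((hats - used.sum(axis=1)).mean(), 10 * 0.9**20)   # both ~1.216
```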

(ii) (2 points) Are $X$ and $Y$ identically distributed?

Yes. By symmetry, a rotation does not change the distribution of the point, and $X$ turns into $Y$ under a rotation by $\pi/2$; thus they are identically distributed.

(iii) (6 points) What is the variance of $X$? Hint: no need for lengthy calculations, try to build on your answer for part (ii).

$X$ is symmetric around the origin, thus $E[X] = 0$. According to part (ii), $X$ and $Y$ have the same distribution, thus $E[X^2] = E[Y^2]$. As the points belong to the unit circumference,

$$E[X^2 + Y^2] = 1 \implies E[X^2] + E[Y^2] = 1 \implies E[X^2] = 0.5$$

The variance:

$$\mathrm{Var}(X) = E[X^2] - (E[X])^2 = 0.5$$

(d) You are trying to call your best friend. She answers the phone with probability $p$ and does not answer with probability $(1 - p)$. If she answers the phone you engage in a conversation whose duration is an exponential random variable with rate $\lambda$. If she does not answer you wait for 5 min and call again. The probability that she answers is again $p$. If she does not answer you give up; otherwise you engage in a conversation whose duration is again an exponential random variable with rate $\lambda$. Let $X$ (in minutes) be the total duration of the experiment, i.e., the total time you spend until you either give up calling your friend or hang up the phone. Recall that the pdf of an exponential random variable with rate $\lambda$ is given by $f_Y(y) = \lambda e^{-\lambda y}$, $y \ge 0$.

(i) (8 points) What are the probabilities of the events $\{X < 5\}$, $\{X \le 5\}$, $\{X = 5\}$ and $\{X = 6\}$?

Let the indicator random variables for the two calls, $i = 1, 2$, be

$$X_i = \begin{cases} 1 & \text{if friend answers call } i \\ 0 & \text{otherwise} \end{cases}, \qquad X_i \sim \mathrm{Bern}(p),$$

and let $T \sim \mathrm{Exp}(\lambda)$ be the length of the conversation (if it was initiated). Note that $F_T(t) = \int_0^t f_T(x)\,dx = 1 - e^{-\lambda t}$, and $\Pr(T = t) = 0$ for any $t$ since $T$ is a continuous random variable.

$$\Pr(X < 5) = \Pr(X_1 = 1, T < 5) = p F_T(5) = p(1 - e^{-5\lambda})$$

$$\Pr(X \le 5) = \Pr(X_1 = 1, T \le 5) + \Pr(X_1 = 0, X_2 = 0) + \Pr(X_1 = 0, X_2 = 1, T = 0) = p F_T(5) + (1 - p)^2 + 0 = p(1 - e^{-5\lambda}) + (1 - p)^2$$

$$\Pr(X = 5) = \Pr(X \le 5) - \Pr(X < 5) = (1 - p)^2$$

$$\Pr(X = 6) = \Pr(X_1 = 1, T = 6) + \Pr(X_1 = 0, X_2 = 1, T = 1) = 0$$

(ii) (10 points) Find and plot the cdf of $X$.

For $x < 5$:

$$F_X(x) = \Pr(X \le x) = \Pr(X_1 = 1, T \le x) = p F_T(x) = p(1 - e^{-\lambda x})$$

For $x > 5$:

$$F_X(x) = \Pr(X \le x) = \Pr(X_1 = 1, T \le x) + \Pr(X_1 = 0, X_2 = 1, T \le x - 5) + \Pr(X_1 = X_2 = 0)$$
$$= p F_T(x) + p(1 - p) F_T(x - 5) + (1 - p)^2 = p(1 - e^{-\lambda x}) + p(1 - p)(1 - e^{-\lambda(x - 5)}) + (1 - p)^2 = 1 - p e^{-\lambda x} - p(1 - p) e^{5\lambda} e^{-\lambda x}$$
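The circle and phone-call answers can be checked the same way. A short simulation sketch, assuming NumPy; the values $p = 0.3$ and $\lambda = 0.2$ are arbitrary illustrative choices:

```python
import numpy as np

rng = np.random.default_rng(1)
trials = 500_000

# (c)(iii) Variance of the x-coordinate of a uniform point on the unit circle.
theta = rng.uniform(0, 2 * np.pi, trials)
print(np.var(np.cos(theta)))                        # ~0.5

# (d) Total duration X of the calling experiment.
p, lam = 0.3, 0.2
a1 = rng.random(trials) < p                         # first call answered?
a2 = rng.random(trials) < p                         # second call answered?
T = rng.exponential(1 / lam, trials)                # conversation length
X = np.where(a1, T, np.where(a2, 5 + T, 5.0))
print((X == 5).mean(), (1 - p) ** 2)                # atom at 5: both ~0.49
print((X <= 4).mean(), p * (1 - np.exp(-4 * lam)))  # F_X(4) = p(1 - e^{-4 lam})
```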

For $x = 5$:

$$F_X(5) = \Pr(X \le 5) = p(1 - e^{-5\lambda}) + (1 - p)^2$$

Thus,

$$F_X(x) = p(1 - e^{-\lambda x}) + (1 - p)\left(1 - p\, e^{-\lambda(x - 5)}\right) 1_{x \ge 5},$$

where $1_A$ is the indicator function of the event $A$.

[Plot: the cdf $F_X(x)$ rises from 0 as $p(1 - e^{-\lambda x})$ for $x < 5$, reaches $p(1 - e^{-5\lambda})$, jumps by $(1 - p)^2$ to $p(1 - e^{-5\lambda}) + (1 - p)^2$ at $x = 5$, and then increases continuously toward 1.]

(e) (8 points) You choose a point at random on a stick of length $L$ and break the stick into two pieces at this point. What is the probability that the ratio of the shorter to the longer piece is less than $1/3$?

Let $X$ be the length of the left piece; then $L - X$ is the length of the right piece, and the ratio of the shorter piece to the longer piece is

$$Y = \min\left\{\frac{X}{L - X}, \frac{L - X}{X}\right\}.$$

Also note that $X$ is uniformly distributed on the stick, thus $F_X(x) = x/L$ for $x \in [0, L]$. Then $\Pr(Y < 1/3) = 1 - \Pr(Y \ge 1/3)$, and

$$\Pr(Y \ge 1/3) = \Pr\left(\frac{X}{L - X} \ge \frac{1}{3},\ \frac{L - X}{X} \ge \frac{1}{3}\right) = \Pr(4X \ge L,\ 3L \ge 4X) = \Pr(X \in [L/4, 3L/4])$$
$$= F_X(3L/4) - F_X(L/4) = 0.5,$$

so

$$\Pr(Y < 1/3) = 1 - \Pr(Y \ge 1/3) = 0.5.$$

(f) (10 points) Assume now that after you break the stick into two at a random point you throw away the piece in your left hand. Then you take the right-hand piece and break it once again into two at random. What is the pdf of the length of the middle piece you end up with (the one that does not have any of the original end points)?

Let $Z$ be the length of the middle piece and $X$ be the length of the piece in your right hand. Then $X$ is uniformly distributed on $[0, L]$ and, given $X = x$, $Z$ is uniformly distributed on $[0, x]$, thus:

$$f_X(x) = \frac{1}{L}\, 1_{x \in [0, L]}, \qquad f_{Z|X}(z \mid x) = \frac{1}{x}\, 1_{z \in [0, x]}.$$

Finally, for $z \in (0, L]$:

$$f_Z(z) = \int_0^L f_{Z|X}(z \mid x) f_X(x)\, dx = \int_0^L \frac{1_{z \in [0, x]}}{xL}\, dx = \int_z^L \frac{dx}{xL} = \frac{\ln(L/z)}{L}.$$
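The stick-breaking answers also admit a quick numerical check. A sketch assuming NumPy, with $L = 1$; the density comparison point $z_0 = 0.25$ is an arbitrary choice:

```python
import numpy as np

rng = np.random.default_rng(2)
trials, L = 500_000, 1.0

# (e) Pr(shorter/longer < 1/3) should be 0.5.
X = rng.uniform(0, L, trials)
ratio = np.minimum(X, L - X) / np.maximum(X, L - X)
print((ratio < 1 / 3).mean())                       # ~0.5

# (f) The right-hand piece has uniform length Xr on [0, L];
# given Xr, the middle piece Z is uniform on [0, Xr].
Xr = rng.uniform(0, L, trials)
Z = rng.uniform(0, Xr)
z0, h = 0.25, 0.01                                  # estimate f_Z near z0
print((np.abs(Z - z0) < h).mean() / (2 * h), np.log(L / z0) / L)  # both ~1.386
```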

2. Communication over an Optical Channel (38 points)

Consider the problem of communicating one bit of information $B$, equally likely to be 0 or 1, across an optical fiber. If the bit $B = 1$, we switch on an LED and its light is carried across the optical fiber to a photodetector at the receiver side. The photodetector outputs the number of photons $Y \in \mathbb{N}$ it detects. If the bit $B = 0$, we keep the LED off. What makes the problem interesting is that even if the LED is off, the detector is likely to detect some photons (e.g., due to ambient light). A good model is that $Y$ is Poisson distributed with an intensity that depends on whether the LED is on or off. Assume that when $B = 1$ (the LED is on), the number of photons counted by the photodetector is a Poisson random variable with parameter 200, and when $B = 0$ (the LED is off), it is a Poisson random variable with parameter 50. Remember that a Poisson random variable $Y$ with parameter $\lambda$ has mean and variance $\lambda$, and the following probability distribution:

$$\Pr(Y = k) = \frac{\lambda^k e^{-\lambda}}{k!}, \quad k = 0, 1, 2, \ldots$$

(a) (10 points) Apply the ML rule to decide on the transmitted bit when the photodetector detects $Y = y$ photons. Reduce the decision rule to a simple threshold form and specify the value of the threshold.

Let $\lambda_1 = 200$, $\lambda_0 = 50$. The decision rule:

$$\hat{B} = 0 \iff \Pr(Y = y \mid B = 1) < \Pr(Y = y \mid B = 0)$$
$$\iff \frac{\lambda_1^y e^{-\lambda_1}}{y!} < \frac{\lambda_0^y e^{-\lambda_0}}{y!}$$
$$\iff y \ln\lambda_1 - \lambda_1 < y \ln\lambda_0 - \lambda_0$$
$$\iff y < \frac{\lambda_1 - \lambda_0}{\ln\lambda_1 - \ln\lambda_0} = \frac{150}{\ln 4} \approx 108.2$$

Since $y$ is an integer,

$$\hat{B} = \begin{cases} 0 & \text{if } y \le 108 \\ 1 & \text{if } y \ge 109 \end{cases}$$

(b) (4 points) Given that the transmitted bit is $B = 0$, what is the probability that we erroneously declare the bit to be 1, i.e., $\hat{B} = 1$? Hint: you can express the result as a summation.

$$\Pr(\hat{B} = 1 \mid B = 0) = \Pr(Y \ge 109 \mid B = 0) = \sum_{k=109}^{\infty} \frac{\lambda_0^k e^{-\lambda_0}}{k!} = 1 - \sum_{k=0}^{108} \frac{50^k e^{-50}}{k!}$$

(c) (7 points) Use Chebyshev's inequality to provide a bound on the probability in part (b).

Let $Z$ be a Poisson random variable with parameter $\lambda_0$. Then $E[Z] = \lambda_0$ and $\mathrm{Var}(Z) = \lambda_0$. Given $B = 0$, $Y$ has the same distribution as $Z$, thus

$$\Pr(\hat{B} = 1 \mid B = 0) = \Pr(Z \ge 109) = \Pr(Z - E[Z] \ge 109 - E[Z]) \le \Pr(|Z - E[Z]| \ge 109 - E[Z])$$
$$\le \frac{\mathrm{Var}(Z)}{(109 - E[Z])^2} = \frac{\lambda_0}{(109 - \lambda_0)^2} = \frac{50}{59^2} \approx 0.0144$$

(d) (7 points) Use the central limit theorem to provide an approximate estimate of the probability in part (b). Hint: recall that the sum of two independent Poisson random variables with parameters $\lambda$ and $\mu$ is Poisson with parameter $\lambda + \mu$.

If $X_1, X_2, \ldots, X_{50}$ are i.i.d. Poisson with parameter 1, then $Z = \sum_{i=1}^{50} X_i$ has a Poisson distribution with parameter 50. Thus,

$$\Pr(\hat{B} = 1 \mid B = 0) = \Pr(Z \ge 109) = \Pr\left(\frac{Z - 50}{\sqrt{50}} \ge \frac{59}{\sqrt{50}}\right) \approx Q\left(\frac{59}{\sqrt{50}}\right) \approx 3.60 \times 10^{-17}$$
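These numbers are straightforward to reproduce. A short numerical check of parts (a) through (d), assuming NumPy and SciPy are available:

```python
import numpy as np
from scipy import stats

lam0, lam1 = 50, 200

# (a) ML threshold: decide B = 1 iff y exceeds (lam1 - lam0) / ln(lam1/lam0).
print((lam1 - lam0) / np.log(lam1 / lam0))      # ~108.2, so B=1 iff y >= 109

# (b) Exact error probability Pr(Y >= 109 | B = 0) for Y ~ Poisson(50).
print(stats.poisson.sf(108, lam0))              # ~3.8e-13

# (c) Chebyshev bound and (d) CLT approximation Q(59/sqrt(50)).
print(lam0 / (109 - lam0) ** 2)                 # ~0.0144
print(stats.norm.sf(59 / np.sqrt(50)))          # ~3.6e-17
```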

(e) (10 points) Assume that now the transmission is repeated $N$ times. That is, if $B = 1$ the receiver observes $Y_1, \ldots, Y_N$, where the $Y_i$ are i.i.d. Poisson(200) random variables; if $B = 0$ the receiver observes $Y_1, \ldots, Y_N$, where the $Y_i$ are i.i.d. Poisson(50). Use the ML rule to decide on the value of the transmitted bit given an observation $Y_1 = y_1, \ldots, Y_N = y_N$ at the receiver.

$$\hat{B} = 0 \iff \Pr\left(\bigcap_{i=1}^N \{Y_i = y_i\} \,\Big|\, B = 1\right) < \Pr\left(\bigcap_{i=1}^N \{Y_i = y_i\} \,\Big|\, B = 0\right)$$
$$\iff \prod_{i=1}^N \frac{\lambda_1^{y_i} e^{-\lambda_1}}{y_i!} < \prod_{i=1}^N \frac{\lambda_0^{y_i} e^{-\lambda_0}}{y_i!}$$
$$\iff \ln\lambda_1 \sum_{i=1}^N y_i - N\lambda_1 < \ln\lambda_0 \sum_{i=1}^N y_i - N\lambda_0$$
$$\iff \frac{1}{N} \sum_{i=1}^N y_i < \frac{\lambda_1 - \lambda_0}{\ln\lambda_1 - \ln\lambda_0} = \frac{150}{\ln 4} \approx 108.2$$

The decision rule: $\hat{B} = 0$ if $\frac{1}{N} \sum_{i=1}^N y_i < 150/\ln 4 \approx 108.2$, and $\hat{B} = 1$ otherwise.
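A minimal sketch of this decision rule, assuming NumPy and SciPy; it also uses the hint from part (d) (a sum of $N$ i.i.d. Poisson(50) variables is Poisson($50N$)) to compute exactly how fast the error probability under $B = 0$ falls as $N$ grows:

```python
import numpy as np
from scipy import stats

lam0, lam1 = 50, 200
thr = (lam1 - lam0) / np.log(lam1 / lam0)         # 150 / ln(4) ~ 108.2

def ml_decide(y):
    """ML estimate of B from i.i.d. photon counts y_1, ..., y_N."""
    return 0 if np.mean(y) < thr else 1

# Error probability Pr(B_hat = 1 | B = 0) = Pr(Poisson(50N) >= N * thr).
for N in (1, 2, 3):
    k = int(np.ceil(N * thr))                     # smallest total count giving B_hat = 1
    print(N, stats.poisson.sf(k - 1, lam0 * N))   # shrinks rapidly with N
```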