UCSD ECE 153 Handout #20 Prof. Young-Han Kim Thursday, April 24, 2014
Solutions to Homework Set #3 (Prepared by TA Fatemeh Arbabjolfaei)


1. Time until the n-th arrival. Let the random variable N(t) be the number of packets arriving during time (0, t]. Suppose N(t) is Poisson with pmf

    p_N(n) = (λt)^n e^{−λt} / n!   for n = 0, 1, 2, ....

Let the random variable Y be the time to get the n-th packet. Find the pdf of Y.

Solution: To find the pdf f_Y(t) of the random variable Y, note that the event {Y ≤ t} occurs iff the time of the n-th packet is in [0, t], that is, iff the number N(t) of packets arriving in [0, t] is at least n. Alternatively, {Y > t} occurs iff {N(t) < n}. Hence, the cdf F_Y(t) of Y is given by

    F_Y(t) = P{Y ≤ t} = P{N(t) ≥ n} = ∑_{k=n}^∞ (λt)^k e^{−λt} / k!.

Differentiating F_Y(t) with respect to t, we get the pdf

    f_Y(t) = ∑_{k=n}^∞ [ −λe^{−λt} (λt)^k / k! + λe^{−λt} (λt)^{k−1} / (k−1)! ]
           = λe^{−λt} (λt)^{n−1} / (n−1)! + ∑_{k=n+1}^∞ λe^{−λt} (λt)^{k−1} / (k−1)! − ∑_{k=n}^∞ λe^{−λt} (λt)^k / k!
           = λe^{−λt} (λt)^{n−1} / (n−1)!

for t > 0, since the last two sums cancel term by term.

Alternatively, since the time interval T between packet arrivals is an exponential random variable with pdf

    f_T(t) = λe^{−λt}   for t ≥ 0, and 0 otherwise,

let T_i denote the i.i.d. exponential interarrival times, so that Y = T_1 + T_2 + ··· + T_n. By convolving f_T(t) with itself n − 1 times, which can also be computed via its Fourier transform (characteristic function), we can show that the pdf of Y is

    f_Y(t) = λe^{−λt} (λt)^{n−1} / (n−1)!   for t ≥ 0, and 0 otherwise.
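Both derivations give the same (Erlang) pdf, so a quick Monte Carlo check is easy to write. The sketch below (with illustrative values λ = 2 and n = 5, not from the handout) simulates Y as a sum of interarrival times and compares a histogram bin with the formula.

```python
import math
import random

# Sanity check of the Erlang result (illustrative values lam = 2, n = 5):
# simulate Y as a sum of n i.i.d. Exp(lam) interarrival times and compare an
# empirical histogram bin with f_Y(t) = lam e^{-lam t} (lam t)^{n-1} / (n-1)!.
lam, n, trials = 2.0, 5, 200_000
samples = [sum(random.expovariate(lam) for _ in range(n)) for _ in range(trials)]

def erlang_pdf(t):
    return lam * math.exp(-lam * t) * (lam * t) ** (n - 1) / math.factorial(n - 1)

t0, dt = 2.0, 0.1  # estimate the density on the bin [t0, t0 + dt)
empirical = sum(t0 <= y < t0 + dt for y in samples) / (trials * dt)
print(f"empirical density near t = {t0}: {empirical:.4f}")
print(f"Erlang pdf at bin center:       {erlang_pdf(t0 + dt / 2):.4f}")
```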

2. Diamond distribution. Consider the random variables X and Y with the joint pdf

    f_{X,Y}(x, y) = c   if |x| + |y| ≤ 1/√2, and 0 otherwise,

where c is a constant.

(a) Find c.
(b) Find f_X(x) and f_{X|Y}(x|y).
(c) Are X and Y independent random variables? Justify your answer.

Solution:

(a) The integral of the pdf f_{X,Y}(x, y) over −∞ < x < ∞, −∞ < y < ∞ is c times the area of the diamond |x| + |y| ≤ 1/√2, which is 2 (1/√2)² = 1. The integral is therefore c, and by the definition of a joint density c = 1.

(b) The marginal pdf is obtained by integrating the joint pdf with respect to y. For 0 ≤ x ≤ 1/√2,

    f_X(x) = ∫_{−(1/√2 − x)}^{1/√2 − x} c dy = 2(1/√2 − x) = √2 − 2x,

and for −1/√2 ≤ x ≤ 0,

    f_X(x) = ∫_{−(1/√2 + x)}^{1/√2 + x} c dy = 2(1/√2 + x) = √2 + 2x.

So the marginal pdf may be written as

    f_X(x) = √2 − 2|x|   for |x| ≤ 1/√2, and 0 otherwise.

Now since f_{X,Y}(x, y) is symmetric in x and y, f_Y(y) = f_X(y). Thus, for |y| < 1/√2,

    f_{X|Y}(x|y) = f_{X,Y}(x, y) / f_Y(y) = 1/(√2 − 2|y|)   for |x| ≤ 1/√2 − |y|, and 0 otherwise.

(c) X and Y are not independent since

    f_{X,Y}(x, y) ≠ f_X(x) f_Y(y).

Alternatively, X and Y are not independent since f_{X|Y}(x|y) depends on the value of y.
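The marginal can be checked numerically. The following minimal sketch (parameters such as the test point x0 are illustrative) samples the diamond by rejection from its bounding square and compares the empirical density of X with √2 − 2|x|.

```python
import random

# Sample (X, Y) uniformly on the diamond |x| + |y| <= 1/sqrt(2) by rejection
# from the bounding square, then compare the empirical marginal density of X
# near a test point x0 with f_X(x0) = sqrt(2) - 2|x0|.
a = 2 ** -0.5  # 1/sqrt(2)
pts = []
while len(pts) < 100_000:
    x, y = random.uniform(-a, a), random.uniform(-a, a)
    if abs(x) + abs(y) <= a:
        pts.append((x, y))

x0, dx = 0.25, 0.02
emp = sum(x0 <= x < x0 + dx for x, _ in pts) / (len(pts) * dx)
print(f"empirical f_X({x0}) ~ {emp:.3f}, exact {2 ** 0.5 - 2 * abs(x0):.3f}")
```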

3. First available teller. Consider a bank with two tellers. The service times for the tellers are independent exponentially distributed random variables X_1 ~ Exp(λ_1) and X_2 ~ Exp(λ_2), respectively. You arrive at the bank and find that both tellers are busy but that nobody else is waiting to be served. You are served by the first available teller once he/she is free. What is the probability that you are served by the first teller?

Solution: By the memoryless property of the exponential distribution, the remaining service times for the tellers are also independent exponentially distributed random variables with parameters λ_1 and λ_2, respectively. The probability that you will be served by the first teller is the probability that the first teller finishes the service before the second teller does. Thus,

    P{X_1 < X_2} = ∫∫_{x_2 > x_1} f_{X_1,X_2}(x_1, x_2) dx_2 dx_1
                 = ∫_{x_1=0}^∞ ∫_{x_2=x_1}^∞ λ_1 e^{−λ_1 x_1} λ_2 e^{−λ_2 x_2} dx_2 dx_1
                 = ∫_{x_1=0}^∞ λ_1 e^{−(λ_1+λ_2) x_1} dx_1
                 = λ_1 / (λ_1 + λ_2).

4. Coin with random bias. You are given a coin but are not told what its bias (probability of heads) is. You are told instead that the bias is the outcome of a random variable P ~ Unif[0, 1]. To get more information about the coin bias, you flip it independently 10 times. Let X be the number of heads you get. Thus X ~ B(10, P). Assuming that X = 9, find and sketch the a posteriori probability of P, i.e., f_{P|X}(p|9).

Solution: In order to find the conditional pdf of P, apply the Bayes rule for mixed random variables to get

    f_{P|X}(p|x) = p_{X|P}(x|p) f_P(p) / p_X(x) = p_{X|P}(x|p) f_P(p) / ∫_0^1 p_{X|P}(x|p′) f_P(p′) dp′.

Now it is given that X = 9. Thus for 0 ≤ p ≤ 1,

    f_{P|X}(p|9) = p^9 (1 − p) / ∫_0^1 p′^9 (1 − p′) dp′ = p^9 (1 − p) / (1/110) = 110 p^9 (1 − p),

where the binomial coefficient C(10, 9) cancels between the numerator and the denominator. Figure 1 compares the unconditional and the conditional pdfs of P. It may be seen that given the information that 10 independent tosses resulted in 9 heads, the pdf is shifted towards the value 9/10.

[Figure 1: Comparison of the a priori pdf f_P(p) and the a posteriori pdf f_{P|X}(p|9).]
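The posterior 110 p^9 (1 − p) can be verified by conditioning a simulation on the observed count, as in the short sketch below (the test point p0 and sample sizes are illustrative).

```python
import random

# Draw P ~ Unif[0,1], flip a coin with bias P ten times, keep only the runs
# with exactly 9 heads, and compare the empirical density of P near p0 with
# the derived posterior f_{P|X}(p0|9) = 110 p0^9 (1 - p0).
kept = []
while len(kept) < 50_000:
    p = random.random()
    heads = sum(random.random() < p for _ in range(10))
    if heads == 9:
        kept.append(p)

p0, dp = 0.85, 0.02
emp = sum(p0 <= p < p0 + dp for p in kept) / (len(kept) * dp)
print(f"empirical posterior at {p0}: {emp:.2f}, "
      f"exact {110 * p0**9 * (1 - p0):.2f}")
```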

5. Optical communication channel. Let the signal input to an optical channel be

    X = 1 with probability 1/2, and 0 with probability 1/2.

The conditional pmf of the output of the channel is Y | {X = 1} ~ Poisson(1), i.e., Poisson with intensity λ = 1, and Y | {X = 0} ~ Poisson(10). Show that the MAP rule reduces to

    D(y) = 1 if y < y*, and 0 otherwise.

Find y* and the corresponding probability of error.

Solution: The MAP rule

    D(y) = 1 if p_{X|Y}(1|y) > p_{X|Y}(0|y), and 0 otherwise

minimizes the probability of decoding error. Since the a priori probabilities for the two X values are equal, the MAP rule is equivalent to the ML rule

    D(y) = 1 if p_{Y|X}(y|1) > p_{Y|X}(y|0), and 0 otherwise.

Now,

    p_{Y|X}(y|1) / p_{Y|X}(y|0) = (e^{−1} 1^y / y!) / (e^{−10} 10^y / y!) = e^{9 − y ln(10)}.

This ratio is greater than 1 if y < 9/ln(10). Therefore

    D(y) = 1 if y < 9/ln(10), and 0 otherwise,

and

    y* = 9/ln(10) ≈ 3.91.

The probability of error is

    P_e = P{D(Y) ≠ X}
        = P{Y > y* | X = 1} P{X = 1} + P{Y < y* | X = 0} P{X = 0}
        = 0.5 ∑_{y=4}^∞ e^{−1} 1^y / y! + 0.5 ∑_{y=0}^3 e^{−10} 10^y / y!
        ≈ 0.0147.

6. Iocane or Sennari. An absent-minded chemistry professor forgets to label two identically looking bottles. One contains a chemical named Iocane and the other contains a chemical named Sennari. It is well known that the radioactivity level of Iocane has the Unif[0, 1] distribution, while the radioactivity level of Sennari has the Exp(1) distribution.

(a) Let X be the radioactivity level measured from one of the bottles. What is the optimal decision rule (based on the measurement X) that maximizes the chance of correctly identifying the content of the bottle?
(b) What is the associated probability of error?

Solution: Let Θ = 0 denote the case in which the content of the bottle is Iocane and let Θ = 1 denote the case in which the content of the bottle is Sennari. Implicit in the problem statement is that P(Θ = 0) = P(Θ = 1) = 1/2.

(a) The optimal MAP rule is equivalent to the ML rule

    D(x) = 0 if f_{X|Θ}(x|0) > f_{X|Θ}(x|1), and 1 otherwise.

Since the Unif(0, 1) pdf f_{X|Θ}(x|0) = 1 is larger than the Exp(1) pdf f_{X|Θ}(x|1) = e^{−x} for 0 < x < 1, we have

    D(x) = 0 if 0 < x < 1, and 1 otherwise.

(b) The probability of error is given by

    P(Θ ≠ D(X)) = P(Θ ≠ D(X) | Θ = 0) P(Θ = 0) + P(Θ ≠ D(X) | Θ = 1) P(Θ = 1)
                = (1/2) P(X ≥ 1 | Θ = 0) + (1/2) P(0 < X < 1 | Θ = 1)
                = (1/2) · 0 + (1/2)(1 − e^{−1})
                = (1 − e^{−1})/2.
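Both error probabilities can be checked numerically. The sketch below computes the optical-channel P_e exactly from the Poisson pmfs and estimates the Iocane/Sennari P_e by simulation (sample sizes are illustrative).

```python
import math
import random

# Problem 5: Y|{X=1} ~ Poisson(1), Y|{X=0} ~ Poisson(10), equal priors;
# decide X = 1 iff y < y* = 9/ln(10) ~ 3.91, i.e., iff y <= 3.
def poisson_pmf(y, lam):
    return math.exp(-lam) * lam ** y / math.factorial(y)

p_e = 0.5 * (1 - sum(poisson_pmf(y, 1) for y in range(4))) \
    + 0.5 * sum(poisson_pmf(y, 10) for y in range(4))
print(f"optical channel: P_e = {p_e:.4f}")  # ~0.0147

# Problem 6: decide Iocane (Theta = 0) iff 0 < x < 1.
N, errors = 200_000, 0
for _ in range(N):
    theta = random.randint(0, 1)
    x = random.random() if theta == 0 else random.expovariate(1.0)
    errors += (0 < x < 1) != (theta == 0)  # error iff decision != truth
print(f"Iocane/Sennari: P_e ~ {errors / N:.3f}, "
      f"exact {(1 - math.exp(-1)) / 2:.3f}")
```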

7. Two independent uniform random variables. Let X and Y be independently and uniformly drawn from the interval [0, 1].

(a) Find the pdf of U = max(X, Y).
(b) Find the pdf of V = min(X, Y).
(c) Find the pdf of W = U − V.
(d) Find the probability P{|X − Y| ≤ 1/2}.

Solution:

(a) We have

    F_U(u) = P{U ≤ u} = P{max(X, Y) ≤ u} = P{X ≤ u, Y ≤ u} = P{X ≤ u} P{Y ≤ u} = u²

for 0 ≤ u ≤ 1. Hence,

    f_U(u) = 2u   for 0 ≤ u ≤ 1, and 0 otherwise.

(b) Similarly,

    1 − F_V(v) = P{V > v} = P{min(X, Y) > v} = P{X > v, Y > v} = P{X > v} P{Y > v} = (1 − v)²,

or equivalently, F_V(v) = 1 − (1 − v)² for 0 ≤ v ≤ 1. Hence,

    f_V(v) = 2(1 − v)   for 0 ≤ v ≤ 1, and 0 otherwise.

(c) First note that W = U − V = |X − Y|. (Why?) Hence,

    P{W ≤ w} = P{|X − Y| ≤ w} = P{−w ≤ X − Y ≤ w}.

Since X and Y are uniformly distributed over [0, 1], this probability is equal to the area of the shaded region in the following figure:

[Figure: the unit square 0 ≤ x, y ≤ 1 with the band |x − y| ≤ w, between the lines y = x − w and y = x + w, shaded.]

The area can be easily calculated as 1 − (1 − w)² for 0 ≤ w ≤ 1. Hence F_W(w) = 1 − (1 − w)² and

    f_W(w) = 2(1 − w)   for 0 ≤ w ≤ 1, and 0 otherwise.

(d) From the figure above,

    P{|X − Y| ≤ 1/2} = P{W ≤ 1/2} = 1 − (1/2)² = 3/4.

8. Waiting time at the bank. Consider a bank with two tellers. The service times for the tellers are independent exponentially distributed random variables X_1 ~ Exp(λ_1) and X_2 ~ Exp(λ_2), respectively. You arrive at the bank and find that both tellers are busy but that nobody else is waiting to be served. You are served by the first available teller once he/she becomes free. Let the random variable Y denote your waiting time. Find the pdf of Y.

Solution: First observe that Y = min(X_1, X_2). Since

    P{Y > y} = P{X_1 > y, X_2 > y} = P{X_1 > y} P{X_2 > y} = e^{−λ_1 y} e^{−λ_2 y} = e^{−(λ_1+λ_2) y}

for y ≥ 0, Y is an exponential random variable with pdf

    f_Y(y) = (λ_1 + λ_2) e^{−(λ_1+λ_2) y}   for y ≥ 0, and 0 otherwise.
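A short simulation confirms both answers; the rates λ_1 = 1, λ_2 = 2 and the test point y0 below are illustrative choices, not from the handout.

```python
import math
import random

# Problem 7(d): with X, Y i.i.d. Unif[0,1], check P{|X - Y| <= 1/2} = 3/4.
N = 200_000
close = sum(abs(random.random() - random.random()) <= 0.5 for _ in range(N))
print(f"P(|X - Y| <= 1/2) ~ {close / N:.3f} (exact 0.750)")

# Problem 8: Y = min(X1, X2) with X1 ~ Exp(l1), X2 ~ Exp(l2) independent
# should satisfy P{Y > y} = e^{-(l1 + l2) y}.
l1, l2, y0 = 1.0, 2.0, 0.4
tail = sum(min(random.expovariate(l1), random.expovariate(l2)) > y0
           for _ in range(N))
print(f"P(min > {y0}) ~ {tail / N:.3f} "
      f"(exact {math.exp(-(l1 + l2) * y0):.3f})")
```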

Additional Exercises

Do not turn in solutions to these problems.

1. Independence. Let X ∈ 𝒳 and Y ∈ 𝒴 be two independent discrete random variables.

(a) Show that for any two sets A ⊆ 𝒳 and B ⊆ 𝒴, the events {X ∈ A} and {Y ∈ B} are independent.
(b) Show that any two functions of X and Y separately are independent; that is, if U = g(X) and V = h(Y), then U and V are independent.

Solution:

(a) Recall that the probability of the event {X ∈ A} is given by P{X ∈ A} = ∑_{x∈A} p_X(x). Because of the independence of X and Y, we have

    P{X ∈ A, Y ∈ B} = ∑_{x∈A} ∑_{y∈B} p_{X,Y}(x, y)
                    = ∑_{x∈A} ∑_{y∈B} p_X(x) p_Y(y)
                    = ∑_{x∈A} p_X(x) ∑_{y∈B} p_Y(y)
                    = P{X ∈ A} P{Y ∈ B}.

Therefore the events {X ∈ A} and {Y ∈ B} are independent.

(b) Let A_u = {x : g(x) ≤ u} and B_v = {y : h(y) ≤ v}. Then the joint cdf of U and V is

    F_{U,V}(u, v) = P{U ≤ u, V ≤ v} = P{g(X) ≤ u, h(Y) ≤ v} = P{X ∈ A_u, Y ∈ B_v}.

However, because of the independence of X and Y (part (a)),

    F_{U,V}(u, v) = P{X ∈ A_u} P{Y ∈ B_v} = P{g(X) ≤ u} P{h(Y) ≤ v} = P{U ≤ u} P{V ≤ v} = F_U(u) F_V(v),

so that U and V are independent.

2. Family planning. Alice and Bob choose a number X at random from the set {2, 3, 4} (so the outcomes are equally probable). If the outcome is X = x, they decide to have children until they have a girl or x children, whichever comes first. Assume that each child is a girl with probability 1/2 (independent of the number of children and the gender of the other children). Let Y be the number of children they will have.

(a) Find the conditional pmf p_{Y|X}(y|x) for all possible values of x and y.
(b) Find the pmf of Y.

Solution:

(a) Note that Y ∈ {1, 2, 3, 4}. The conditional pmf is as follows:

    p_{Y|X}(1|2) = 1/2,  p_{Y|X}(2|2) = 1/2,
    p_{Y|X}(1|3) = 1/2,  p_{Y|X}(2|3) = 1/4,  p_{Y|X}(3|3) = 1/4,
    p_{Y|X}(1|4) = 1/2,  p_{Y|X}(2|4) = 1/4,  p_{Y|X}(3|4) = 1/8,  p_{Y|X}(4|4) = 1/8.

(b) The pmf of Y is

    p_Y(1) = ∑_{x=2}^4 p_X(x) p_{Y|X}(1|x) = 1/2,
    p_Y(2) = ∑_{x=2}^4 p_X(x) p_{Y|X}(2|x) = 1/3,
    p_Y(3) = ∑_{x=3}^4 p_X(x) p_{Y|X}(3|x) = 1/8,
    p_Y(4) = p_X(4) p_{Y|X}(4|4) = 1/24.

3. Joint cdf or not. Consider the function

    G(x, y) = 1 if x + y ≥ 0, and 0 otherwise.

Can G be a joint cdf for a pair of random variables? Justify your answer.

Solution: No. Note that for every x,

    lim_{y→∞} G(x, y) = 1.

Since this limit would have to equal the marginal cdf F_X(x), we would need F_X(x) = 1 for every x. But for any genuine marginal cdf,

    lim_{x→−∞} F_X(x) = 0.

Therefore G(x, y) is not a cdf. Alternatively, assume that G(x, y) is a joint cdf for X and Y. Then

    P{−1 < X ≤ 1, −1 < Y ≤ 1} = G(1, 1) − G(1, −1) − G(−1, 1) + G(−1, −1) = 1 − 1 − 1 + 0 = −1.

But this violates the property that the probability of any event must be nonnegative.
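The family-planning pmf is easy to confirm by direct simulation, as in this minimal sketch (the sample size is an arbitrary illustrative choice).

```python
import random
from collections import Counter

# Simulate the family-planning process: X is uniform on {2, 3, 4}; children
# arrive until a girl is born or X children are reached. Y counts children.
counts, N = Counter(), 300_000
for _ in range(N):
    x = random.choice([2, 3, 4])
    y = 0
    for _ in range(x):
        y += 1
        if random.random() < 0.5:  # this child is a girl; stop
            break
    counts[y] += 1

exact = {1: 1 / 2, 2: 1 / 3, 3: 1 / 8, 4: 1 / 24}
for y in sorted(exact):
    print(f"p_Y({y}) ~ {counts[y] / N:.4f} (exact {exact[y]:.4f})")
```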

[Figure 2: The conditional pdfs f_{Y|S}(y|−1), f_{Y|S}(y|0), and f_{Y|S}(y|+1) for λ = 1.]

4. Ternary signaling. Let the signal S be a random variable defined as follows:

    S = −1 with probability 1/3, 0 with probability 1/3, and +1 with probability 1/3.

The signal is sent over a channel with additive Laplacian noise Z, i.e., Z is a Laplacian random variable with pdf

    f_Z(z) = (λ/2) e^{−λ|z|},   −∞ < z < ∞.

The signal S and the noise Z are assumed to be independent and the channel output is their sum Y = S + Z.

(a) Find f_{Y|S}(y|s) for s = −1, 0, +1. Sketch the conditional pdfs on the same graph.
(b) Find the optimal decoding rule D(y) for deciding whether S is −1, 0, or +1. Give your answer in terms of ranges of values of Y.
(c) Find the probability of decoding error for D(y) in terms of λ.

Solution:

(a) We use a trick here that is used several times in the lecture notes. Since Y = S + Z and Z and S are independent, the conditional pdf is

    f_{Y|S}(y|s) = f_Z(y − s) = (λ/2) e^{−λ|y−s|}.

The plots are shown for λ = 1 in Figure 2.

(b) The optimal decoding rule is the MAP rule: D(y) = s, where s maximizes

    p(s|y) = f(y|s) p_S(s) / f(y).

Since p_S(s) is the same for s = −1, 0, +1, the MAP rule becomes the maximum-likelihood decoding rule

    D(y) = argmax_s f(y|s).

The conditional pdfs are plotted in Figure 2. By inspection, the ML rule reduces to

    D(y) = −1 if y < −1/2,  0 if −1/2 ≤ y ≤ +1/2,  +1 if y > +1/2.

(c) Inspection of Figure 2 shows how to calculate the probability of error:

    P_e = ∑_i P{error | i sent} P{i sent}
        = (1/3) ∑_i P{error | i sent}
        = (1/3)(1 − P{−1/2 < S + Z < +1/2 | S = 0}) + (1/3) P{S + Z > −1/2 | S = −1} + (1/3) P{S + Z < +1/2 | S = +1}
        = (1/3)(1 − P{−1/2 < Z < +1/2}) + (1/3) P{Z > +1/2} + (1/3) P{Z < −1/2}
        = (1/3)(P{Z < −1/2} + P{Z > +1/2}) + (1/3) P{Z > +1/2} + (1/3) P{Z < −1/2}
        = (2/3)(P{Z < −1/2} + P{Z > +1/2})
        = (4/3) P{Z > +1/2}   (by symmetry)
        = (4/3) ∫_{1/2}^∞ (λ/2) e^{−λz} dz
        = (2/3) e^{−λ/2}.
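A Monte Carlo check of P_e = (2/3) e^{−λ/2} is sketched below (the value λ = 2 is an illustrative choice). It uses the fact that a Laplacian(λ) sample can be generated as an Exp(λ) magnitude with a random sign.

```python
import math
import random

# Simulate the ternary channel with Laplacian(lam) noise and the threshold
# decoder at -1/2 and +1/2; the error rate should approach (2/3) e^{-lam/2}.
lam, N, errors = 2.0, 300_000, 0
for _ in range(N):
    s = random.choice([-1, 0, 1])
    z = random.expovariate(lam) * random.choice([-1, 1])  # Laplacian(lam)
    y = s + z
    d = -1 if y < -0.5 else (0 if y <= 0.5 else 1)
    errors += d != s
print(f"P_e ~ {errors / N:.4f} (exact {2 / 3 * math.exp(-lam / 2):.4f})")
```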

5. Signal or no signal. Consider a communication system that is operated only from time to time. When the communication system is in the normal mode (denoted by M = 1), it transmits a random signal S = X with

    X = +1 with probability 1/2, and −1 with probability 1/2.

When the system is in the idle mode (denoted by M = 0), it does not transmit any signal (S = 0). Both normal and idle modes occur with equal probability. Thus

    S = X with probability 1/2, and 0 with probability 1/2.

The receiver observes Y = S + Z, where the ambient noise Z ~ U[−1, 1] is independent of S.

(a) Find and sketch the conditional pdf f_{Y|M}(y|1) of the receiver observation Y given that the system is in the normal mode.
(b) Find and sketch the conditional pdf f_{Y|M}(y|0) of the receiver observation Y given that the system is in the idle mode.
(c) Find the optimal decoder D(y) for deciding whether the system is normal or idle. Provide the answer in terms of intervals of y.
(d) Find the associated probability of error.

Solution:

(a) If M = 1,

    Y = 1 + Z with probability 1/2, and −1 + Z with probability 1/2.

Hence, we have

    f_{Y|M}(y|1) = (1/2) f_Z(y − 1) + (1/2) f_Z(y + 1) = 1/4   for −2 ≤ y ≤ 2, and 0 otherwise.

(b) If M = 0, then Y = Z, so

    f_{Y|M}(y|0) = f_Z(y) = 1/2   for −1 ≤ y ≤ 1, and 0 otherwise.

(c) Since both modes are equally likely, the optimal MAP decoding rule reduces to the ML rule

    D(y) = 0 if f_{Y|M}(y|0) > f_{Y|M}(y|1), and 1 otherwise,

which simplifies to

    D(y) = 0 if −1 < y < 1, and 1 otherwise.

(d) The probability of error is given by

    P{M ≠ D(Y)} = P{M = 1, −1 < Y < 1}
                = P{M = 1} P{−1 < Y < 1 | M = 1}
                = (1/2)(1/2) = 1/4.

6. Function of uniform random variables. Let X and Y be two independent Unif[0, 1] random variables. Find the probability density function (pdf) of Z = (X + Y) mod 1 (i.e., Z = X + Y if X + Y ≤ 1 and Z = X + Y − 1 if X + Y > 1).

Solution: Let W = X + Y. Since X and Y are independent, the pdf of W is simply the convolution of the pdf of X and the pdf of Y. The convolution of these two uniform distributions is the triangular pdf

    f_W(w) = w if 0 ≤ w ≤ 1,  2 − w if 1 ≤ w ≤ 2,  and 0 otherwise.

Since Z = W mod 1, for 0 ≤ z ≤ 1 we have f_Z(z) = f_W(z) + f_W(z + 1) = z + (1 − z) = 1, so Z ~ Unif[0, 1].

7. Maximal correlation.

(a) For any pair of random variables (X, Y), show that

    F_{X,Y}(x, y) ≤ min{F_X(x), F_Y(y)}.

Now let F and G be continuous and invertible cdfs and let X ~ F.

(b) Find the distribution of

    Y = G^{−1}(F(X)).

(c) Show that

    F_{X,Y}(x, y) = min{F(x), G(y)}.

Solution:

(a) We have

    F_{X,Y}(x, y) = P{X ≤ x, Y ≤ y} ≤ P{X ≤ x} = F_X(x),

and similarly, F_{X,Y}(x, y) ≤ F_Y(y). Thus,

    F_{X,Y}(x, y) ≤ min{F_X(x), F_Y(y)}.

(b) We have

    F_Y(y) = P{Y ≤ y} = P{G^{−1}(F(X)) ≤ y} = P{F(X) ≤ G(y)} = P{X ≤ F^{−1}(G(y))} = F(F^{−1}(G(y))) = G(y),

so Y has cdf G.

(c) We have

    F_{X,Y}(x, y) = P(X ≤ x, Y ≤ y)
                  = P(X ≤ x, X ≤ F^{−1}(G(y)))
                  = P(X ≤ min{x, F^{−1}(G(y))})
                  = min{F(x), F(F^{−1}(G(y)))}
                  = min{F(x), G(y)}.

By part (a), this is the maximal joint cdf for any pair (X, Y) with the given marginal cdfs F(x) and G(y).
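Part (b) can be illustrated with a concrete pair of marginals; the choices below (F the Exp(1) cdf and G the Exp(2) cdf) are hypothetical, not from the handout.

```python
import math
import random

# With X ~ Exp(1), F(x) = 1 - e^{-x}, and G(y) = 1 - e^{-2y} the Exp(2) cdf
# with inverse G^{-1}(u) = -ln(1 - u)/2, the transform Y = G^{-1}(F(X))
# should satisfy P{Y <= y} = G(y).
def G_inv(u):
    return -math.log(1 - u) / 2

N, y0, hits = 200_000, 0.5, 0
for _ in range(N):
    x = random.expovariate(1.0)
    y = G_inv(1 - math.exp(-x))  # = G^{-1}(F(x)); here it simplifies to x/2
    hits += y <= y0
print(f"P(Y <= {y0}) ~ {hits / N:.3f} (exact {1 - math.exp(-2 * y0):.3f})")
```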