UCSD ECE 153 Handout #20, Prof. Young-Han Kim, Thursday, April 24, 2014. Solutions to Homework Set #3 (Prepared by TA Fatemeh Arbabjolfaei)
1. Time until the n-th arrival. Let the random variable N(t) be the number of packets arriving during time (0,t]. Suppose N(t) is Poisson with pmf

    p_N(n) = (λt)^n e^{-λt} / n!   for n = 0, 1, 2, ....

Let the random variable Y be the time to get the n-th packet. Find the pdf of Y.

Solution: To find the pdf f_Y(t) of the random variable Y, note that the event {Y ≤ t} occurs iff the time of the n-th packet is in [0,t], that is, iff the number N(t) of packets arriving in [0,t] is at least n. Equivalently, {Y > t} occurs iff {N(t) < n}. Hence, the cdf F_Y(t) of Y is given by

    F_Y(t) = P{Y ≤ t} = P{N(t) ≥ n} = Σ_{k=n}^∞ (λt)^k e^{-λt} / k!.

Differentiating F_Y(t) with respect to t, we get the pdf

    f_Y(t) = Σ_{k=n}^∞ [ λe^{-λt} (λt)^{k-1}/(k-1)! − λe^{-λt} (λt)^k/k! ]
           = λe^{-λt} (λt)^{n-1}/(n-1)! + Σ_{k=n+1}^∞ λe^{-λt} (λt)^{k-1}/(k-1)! − Σ_{k=n}^∞ λe^{-λt} (λt)^k/k!
           = λe^{-λt} (λt)^{n-1}/(n-1)!

for t > 0, since the last two sums cancel term by term.

Alternatively, we know that the time interval T between packet arrivals is an exponential random variable with pdf f_T(t) = λe^{-λt} for t ≥ 0, and 0 otherwise. Letting T_1, T_2, ... denote the i.i.d. exponential interarrival times, we have Y = T_1 + T_2 + ··· + T_n. Convolving f_T(t) with itself n − 1 times, which can also be computed via the Fourier transform (characteristic function), shows that the pdf of Y is

    f_Y(t) = λe^{-λt} (λt)^{n-1} / (n-1)!   for t ≥ 0,

and 0 otherwise.
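As a numerical sanity check (an illustrative sketch, not part of the original solution; the parameter values are arbitrary), the Erlang pdf derived above can be integrated and compared with the Poisson tail probability P{N(t) ≥ n}:

```python
import math

def poisson_tail(lam, t, n):
    """P{N(t) >= n} for N(t) ~ Poisson(lam * t), via the complement."""
    return 1.0 - sum((lam * t) ** k * math.exp(-lam * t) / math.factorial(k)
                     for k in range(n))

def erlang_pdf(lam, t, n):
    """f_Y(t) = lam e^{-lam t} (lam t)^{n-1} / (n-1)!."""
    return lam * math.exp(-lam * t) * (lam * t) ** (n - 1) / math.factorial(n - 1)

lam, n, T = 2.0, 3, 1.5            # arbitrary example values
steps = 100_000
dt = T / steps
# midpoint-rule integral of the pdf over (0, T] should equal F_Y(T) = P{N(T) >= n}
cdf = sum(erlang_pdf(lam, (i + 0.5) * dt, n) for i in range(steps)) * dt
assert abs(cdf - poisson_tail(lam, T, n)) < 1e-6
```

The agreement holds for any choice of λ, n, and t, which is exactly the identity F_Y(t) = P{N(t) ≥ n} used in the derivation.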
2. Diamond distribution. Consider the random variables X and Y with the joint pdf

    f_{X,Y}(x,y) = c if |x| + |y| ≤ 1/√2, and 0 otherwise,

where c is a constant.

(a) Find c.
(b) Find f_X(x) and f_{X|Y}(x|y).
(c) Are X and Y independent random variables? Justify your answer.

Solution:

(a) The diamond |x| + |y| ≤ 1/√2 has area 2(1/√2)² = 1, so the integral of the pdf f_{X,Y}(x,y) over −∞ < x < ∞, −∞ < y < ∞ is c. By the definition of a joint density this integral must equal 1, and therefore c = 1.

(b) The marginal pdf is obtained by integrating the joint pdf with respect to y. For 0 ≤ x ≤ 1/√2,

    f_X(x) = ∫_{−(1/√2 − x)}^{1/√2 − x} c dy = √2 − 2x,

and for −1/√2 ≤ x ≤ 0,

    f_X(x) = ∫_{−(1/√2 + x)}^{1/√2 + x} c dy = √2 + 2x.

So the marginal pdf may be written as

    f_X(x) = √2 − 2|x| for |x| ≤ 1/√2, and 0 otherwise.

Now since f_{X,Y}(x,y) is symmetric in x and y, f_Y(y) = f_X(y). Thus

    f_{X|Y}(x|y) = f_{X,Y}(x,y) / f_Y(y) = 1/(√2 − 2|y|) if |x| + |y| ≤ 1/√2 (with |y| < 1/√2), and 0 otherwise.

(c) X and Y are not independent since

    f_{X,Y}(x,y) ≠ f_X(x) f_Y(y).

Alternatively, X and Y are not independent since f_{X|Y}(x|y) depends on the value of y.
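The normalization c = 1 and the triangular marginal are easy to verify numerically (a quick sketch, not part of the original solution; the grid size is arbitrary):

```python
import math

a = 1.0 / math.sqrt(2.0)   # the diamond is |x| + |y| <= 1/sqrt(2)
N = 200_000
dx = 2 * a / N

# f_X(x) = integral of c = 1 over |y| <= a - |x|, i.e. 2(a - |x|) = sqrt(2) - 2|x|;
# integrating this marginal over [-a, a] must give total probability 1
total = sum((math.sqrt(2.0) - 2.0 * abs(-a + (i + 0.5) * dx)) * dx for i in range(N))
assert abs(total - 1.0) < 1e-6
```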
3. First available teller. Consider a bank with two tellers. The service times for the tellers are independent exponentially distributed random variables X_1 ~ Exp(λ_1) and X_2 ~ Exp(λ_2), respectively. You arrive at the bank and find that both tellers are busy but that nobody else is waiting to be served. You are served by the first available teller once he/she is free. What is the probability that you are served by the first teller?

Solution: By the memoryless property of the exponential distribution, the remaining service times for the tellers are also independent exponentially distributed random variables with parameters λ_1 and λ_2, respectively. The probability that you are served by the first teller is the probability that the first teller finishes service before the second teller does. Thus

    P{X_1 < X_2} = ∫∫_{x_2 > x_1} f_{X_1,X_2}(x_1,x_2) dx_2 dx_1
                 = ∫_0^∞ λ_1 e^{-λ_1 x_1} ( ∫_{x_1}^∞ λ_2 e^{-λ_2 x_2} dx_2 ) dx_1
                 = ∫_0^∞ λ_1 e^{-(λ_1+λ_2) x_1} dx_1
                 = λ_1 / (λ_1 + λ_2).

4. Coin with random bias. You are given a coin but are not told its bias (probability of heads). You are told instead that the bias is the outcome of a random variable P ~ Unif[0,1]. To get more information about the coin bias, you flip it independently 10 times. Let X be the number of heads you get. Thus X ~ Binom(10, P). Assuming that X = 9, find and sketch the a posteriori probability density of P, i.e., f_{P|X}(p|9).

Solution: To find the conditional pdf of P, apply Bayes rule for mixed random variables to get

    f_{P|X}(p|x) = p_{X|P}(x|p) f_P(p) / p_X(x) = p_{X|P}(x|p) f_P(p) / ∫_0^1 p_{X|P}(x|p') f_P(p') dp'.

Now it is given that X = 9, so for 0 ≤ p ≤ 1 (the binomial coefficient cancels between numerator and denominator):

    f_{P|X}(p|9) = p^9 (1−p) / ∫_0^1 p^9 (1−p) dp = p^9 (1−p) / (1/110) = 110 p^9 (1−p).

Figure 1 compares the unconditional and the conditional pdfs of P. It may be seen that given the information that 10 independent tosses resulted in 9 heads, the pdf is shifted toward large values of p, with the posterior mode at p = 9/10.
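The closed form λ_1/(λ_1 + λ_2) in problem 3 can be checked against a direct numerical evaluation of the last integral above (an illustrative sketch; the rates are arbitrary choices of mine):

```python
import math

lam1, lam2 = 1.3, 2.7              # arbitrary example rates
# P{X1 < X2} = integral_0^inf lam1 e^{-(lam1+lam2) x} dx, truncated deep in the tail
steps, T = 200_000, 40.0
dx = T / steps
p = sum(lam1 * math.exp(-(lam1 + lam2) * ((i + 0.5) * dx)) for i in range(steps)) * dx
assert abs(p - lam1 / (lam1 + lam2)) < 1e-6
```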
[Figure 1: Comparison of the a priori pdf f_P(p) and the a posteriori pdf f_{P|X}(p|9).]

5. Optical communication channel. Let the signal input to an optical channel be

    X = 1 with probability 1/2, and X = 0 with probability 1/2.

The conditional pmf of the output of the channel is Y | {X = 1} ~ Poisson(1), i.e., Poisson with intensity λ = 1, and Y | {X = 0} ~ Poisson(10). Show that the MAP rule reduces to

    D(y) = 1 if y < y*, and 0 otherwise.

Find y* and the corresponding probability of error.

Solution: The MAP rule

    D(y) = 1 if p_{X|Y}(1|y) > p_{X|Y}(0|y), and 0 otherwise,

minimizes the probability of decoding error. Since the a priori probabilities of the two values of X are equal, the MAP rule is equivalent to the ML rule

    D(y) = 1 if p_{Y|X}(y|1) > p_{Y|X}(y|0), and 0 otherwise.

Now,

    p_{Y|X}(y|1) / p_{Y|X}(y|0) = (e^{-1} 1^y / y!) / (e^{-10} 10^y / y!) = e^{9 − y ln(10)}.
This ratio is greater than 1 if and only if y < 9/ln(10). Therefore

    D(y) = 1 if y < 9/ln(10), and 0 otherwise,

and

    y* = 9/ln(10) ≈ 3.9.

The probability of error is

    P_e = P{D(Y) ≠ X}
        = P{Y > y* | X = 1} P{X = 1} + P{Y < y* | X = 0} P{X = 0}
        = (1/2) Σ_{y=4}^∞ e^{-1}/y! + (1/2) Σ_{y=0}^{3} e^{-10} 10^y / y!
        ≈ 0.0147.

6. Iocane or Sennari. An absent-minded chemistry professor forgets to label two identically looking bottles. One contains a chemical named Iocane and the other contains a chemical named Sennari. It is well known that the radioactivity level of Iocane has the Unif[0,1] distribution, while the radioactivity level of Sennari has the Exp(1) distribution.

(a) Let X be the radioactivity level measured from one of the bottles. What is the optimal decision rule (based on the measurement X) that maximizes the chance of correctly identifying the content of the bottle?
(b) What is the associated probability of error?

Solution: Let Θ = 0 denote the case in which the content of the bottle is Iocane and let Θ = 1 denote the case in which it is Sennari. Implicit in the problem statement is that P(Θ = 0) = P(Θ = 1) = 1/2.

(a) The optimal MAP rule is equivalent to the ML rule

    D(x) = 0 if f_{X|Θ}(x|0) > f_{X|Θ}(x|1), and 1 otherwise.

Since the Unif(0,1) pdf f_{X|Θ}(x|0) = 1 is larger than the Exp(1) pdf f_{X|Θ}(x|1) = e^{-x} for 0 < x < 1, we have

    D(x) = 0 if 0 < x < 1, and 1 otherwise.

(b) The probability of error is given by

    P(Θ ≠ D(X)) = (1/2) P(Θ ≠ D(X) | Θ = 0) + (1/2) P(Θ ≠ D(X) | Θ = 1)
                = (1/2) P(X ≥ 1 | Θ = 0) + (1/2) P(0 < X < 1 | Θ = 1)
                = (1/2) · 0 + (1/2)(1 − e^{-1})
                = (1 − e^{-1})/2.
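The threshold and error probability of problem 5 can be evaluated directly (a quick numerical sketch, not part of the original handout; the infinite Poisson(1) tail is computed via its complement):

```python
import math

def poisson_pmf(lam, y):
    """P{Y = y} for Y ~ Poisson(lam)."""
    return lam ** y * math.exp(-lam) / math.factorial(y)

ystar = 9.0 / math.log(10.0)       # MAP/ML threshold
assert 3 < ystar < 4               # so decide X = 1 exactly when y <= 3

p_e = (0.5 * (1.0 - sum(poisson_pmf(1.0, y) for y in range(4)))
       + 0.5 * sum(poisson_pmf(10.0, y) for y in range(4)))
assert abs(p_e - 0.01466) < 1e-4   # P_e is about 1.5%
```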
7. Two independent uniform random variables. Let X and Y be independently and uniformly drawn from the interval [0,1].

(a) Find the pdf of U = max(X,Y).
(b) Find the pdf of V = min(X,Y).
(c) Find the pdf of W = U − V.
(d) Find the probability P{|X − Y| ≤ 1/2}.

Solution:

(a) We have

    F_U(u) = P{U ≤ u} = P{max(X,Y) ≤ u} = P{X ≤ u, Y ≤ u} = P{X ≤ u} P{Y ≤ u} = u²

for 0 ≤ u ≤ 1. Hence f_U(u) = 2u for 0 ≤ u ≤ 1, and 0 otherwise.

(b) Similarly,

    1 − F_V(v) = P{V > v} = P{min(X,Y) > v} = P{X > v, Y > v} = P{X > v} P{Y > v} = (1 − v)²,

or equivalently, F_V(v) = 1 − (1 − v)² for 0 ≤ v ≤ 1. Hence f_V(v) = 2(1 − v) for 0 ≤ v ≤ 1, and 0 otherwise.

(c) First note that W = U − V = |X − Y|. (Why?) Hence

    P{W ≤ w} = P{|X − Y| ≤ w} = P(−w ≤ X − Y ≤ w).

Since X and Y are uniformly distributed over [0,1], the above probability is equal to the area of the shaded region in the following figure:
[Figure: the unit square in the (x,y) plane with the band |x − y| ≤ w shaded.]

The area can be easily calculated as 1 − (1 − w)² for 0 ≤ w ≤ 1. Hence F_W(w) = 1 − (1 − w)² and f_W(w) = 2(1 − w) for 0 ≤ w ≤ 1, and 0 otherwise.

(d) From the figure above,

    P{|X − Y| ≤ 1/2} = P{W ≤ 1/2} = 1 − (1/2)² = 3/4.

8. Waiting time at the bank. Consider a bank with two tellers. The service times for the tellers are independent exponentially distributed random variables X_1 ~ Exp(λ_1) and X_2 ~ Exp(λ_2), respectively. You arrive at the bank and find that both tellers are busy but that nobody else is waiting to be served. You are served by the first available teller once he/she becomes free. Let the random variable Y denote your waiting time. Find the pdf of Y.

Solution: First observe that Y = min(X_1, X_2). Since

    P{Y > y} = P{X_1 > y, X_2 > y} = P{X_1 > y} P{X_2 > y} = e^{-λ_1 y} e^{-λ_2 y} = e^{-(λ_1+λ_2) y}

for y ≥ 0, Y is an exponential random variable with pdf

    f_Y(y) = (λ_1 + λ_2) e^{-(λ_1+λ_2) y} for y ≥ 0, and 0 otherwise.
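Both the probability in problem 7(d) and the Exp(λ_1 + λ_2) claim in problem 8 are easy to spot-check by simulation (an illustrative sketch; the seed, sample size, and rates are arbitrary choices of mine):

```python
import random

random.seed(0)
n = 200_000

# Problem 7(d): P{|X - Y| <= 1/2} = 3/4 for independent X, Y ~ Unif[0, 1]
hits = sum(abs(random.random() - random.random()) <= 0.5 for _ in range(n))
assert abs(hits / n - 0.75) < 0.01

# Problem 8: min(X1, X2) with Xi ~ Exp(lam_i) is Exp(lam1 + lam2),
# so its sample mean should be close to 1 / (lam1 + lam2)
lam1, lam2 = 1.5, 2.5
m = sum(min(random.expovariate(lam1), random.expovariate(lam2)) for _ in range(n)) / n
assert abs(m - 1.0 / (lam1 + lam2)) < 0.01
```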
Additional Exercises

Do not turn in solutions to these problems.

1. Independence. Let X ∈ 𝒳 and Y ∈ 𝒴 be two independent discrete random variables.

(a) Show that any two events {X ∈ A} and {Y ∈ B}, where A ⊆ 𝒳 and B ⊆ 𝒴, are independent.
(b) Show that any two functions of X and Y separately are independent; that is, if U = g(X) and V = h(Y), then U and V are independent.

Solution:

(a) Recall that the probability of any event {X ∈ A}, A ⊆ 𝒳, is given by P{X ∈ A} = Σ_{x ∈ A} p_X(x). Because of the independence of X and Y, we have

    P{X ∈ A, Y ∈ B} = Σ_{x ∈ A} Σ_{y ∈ B} p_{X,Y}(x,y)
                    = Σ_{x ∈ A} Σ_{y ∈ B} p_X(x) p_Y(y)
                    = ( Σ_{x ∈ A} p_X(x) ) ( Σ_{y ∈ B} p_Y(y) )
                    = P{X ∈ A} P{Y ∈ B}.

Therefore the events {X ∈ A} and {Y ∈ B} are independent.

(b) Let A_u = {x : g(x) ≤ u} and B_v = {y : h(y) ≤ v}. Then the joint cdf of U and V is

    F_{U,V}(u,v) = P{U ≤ u, V ≤ v} = P{g(X) ≤ u, h(Y) ≤ v} = P{X ∈ A_u, Y ∈ B_v}.

By the independence of X and Y,

    F_{U,V}(u,v) = P{X ∈ A_u} P{Y ∈ B_v} = P{g(X) ≤ u} P{h(Y) ≤ v} = P{U ≤ u} P{V ≤ v} = F_U(u) F_V(v),

so that U and V are independent.

2. Family planning. Alice and Bob choose a number X at random from the set {2,3,4} (so the outcomes are equally probable). If the outcome is X = x, they decide to have children until they have a girl or x children, whichever comes first. Assume that each child is a girl with probability 1/2 (independent of the number of children and the gender of the other children). Let Y be the number of children they will have.

(a) Find the conditional pmf p_{Y|X}(y|x) for all possible values of x and y.
(b) Find the pmf of Y.
Solution:

(a) Note that Y ∈ {1,2,3,4}. The conditional pmf is as follows:

    p_{Y|X}(1|2) = 1/2,  p_{Y|X}(2|2) = 1/2,
    p_{Y|X}(1|3) = 1/2,  p_{Y|X}(2|3) = 1/4,  p_{Y|X}(3|3) = 1/4,
    p_{Y|X}(1|4) = 1/2,  p_{Y|X}(2|4) = 1/4,  p_{Y|X}(3|4) = 1/8,  p_{Y|X}(4|4) = 1/8.

(b) The pmf of Y is

    p_Y(1) = Σ_{x=2}^{4} p_X(x) p_{Y|X}(1|x) = 1/2,
    p_Y(2) = Σ_{x=2}^{4} p_X(x) p_{Y|X}(2|x) = 1/3,
    p_Y(3) = Σ_{x=3}^{4} p_X(x) p_{Y|X}(3|x) = 1/8,
    p_Y(4) = p_X(4) p_{Y|X}(4|4) = 1/24.

3. Joint cdf or not. Consider the function

    G(x,y) = 1 if x + y ≥ 0, and 0 otherwise.

Can G be a joint cdf for a pair of random variables? Justify your answer.

Solution: No. Note that for every x,

    lim_{y→∞} G(x,y) = 1,

which would force the marginal cdf to satisfy F_X(x) = 1 for every x. But any genuine marginal cdf satisfies

    lim_{x→−∞} F_X(x) = 0.

Therefore G(x,y) is not a cdf.

Alternatively, assume that G(x,y) is a joint cdf for X and Y. Then

    P{−1 < X ≤ 1, −1 < Y ≤ 1} = G(1,1) − G(1,−1) − G(−1,1) + G(−1,−1) = 1 − 1 − 1 + 0 = −1.

But this violates the property that the probability of any event must be nonnegative.
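The pmf of Y in the family planning problem can be double-checked by exact enumeration over the equally likely gender sequences (a small sketch using exact fractions, not part of the original solution):

```python
from collections import defaultdict
from fractions import Fraction
from itertools import product

half = Fraction(1, 2)
pmf = defaultdict(Fraction)
for x in (2, 3, 4):                      # X is uniform on {2, 3, 4}
    for seq in product("GB", repeat=x):  # each length-x gender sequence has prob (1/2)^x
        # number of children: stop at the first girl, or after x children
        y = next((i + 1 for i, c in enumerate(seq) if c == "G"), x)
        pmf[y] += Fraction(1, 3) * half ** x

assert pmf[1] == Fraction(1, 2)
assert pmf[2] == Fraction(1, 3)
assert pmf[3] == Fraction(1, 8)
assert pmf[4] == Fraction(1, 24)
```

Using `Fraction` keeps the arithmetic exact, so the check confirms the pmf values 1/2, 1/3, 1/8, 1/24 (which indeed sum to 1) rather than approximating them.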
[Figure 2: the conditional pdfs f_{Y|S}(y|−1), f_{Y|S}(y|0), and f_{Y|S}(y|+1) on the same graph.]

4. Ternary signaling. Let the signal S be a random variable defined as follows:

    S = −1 with probability 1/3, 0 with probability 1/3, and +1 with probability 1/3.

The signal is sent over a channel with additive Laplacian noise Z, i.e., Z is a Laplacian random variable with pdf

    f_Z(z) = (λ/2) e^{-λ|z|},  −∞ < z < ∞.

The signal S and the noise Z are assumed to be independent, and the channel output is their sum Y = S + Z.

(a) Find f_{Y|S}(y|s) for s = −1, 0, +1. Sketch the conditional pdfs on the same graph.
(b) Find the optimal decoding rule D(y) for deciding whether S is −1, 0, or +1. Give your answer in terms of ranges of values of Y.
(c) Find the probability of decoding error for D(y) in terms of λ.

Solution:

(a) We use a trick here that is used several times in the lecture notes. Since Y = S + Z and Z and S are independent, the conditional pdf is

    f_{Y|S}(y|s) = f_Z(y − s) = (λ/2) e^{-λ|y−s|}.

The plots are shown in Figure 2.
(b) The optimal decoding rule is the MAP rule: D(y) = s, where s maximizes

    p(s|y) = f(y|s) p(s) / f(y).

Since p_S(s) is the same for s = −1, 0, +1, the MAP rule becomes the maximum-likelihood decoding rule

    D(y) = argmax_s f(y|s).

The conditional pdfs are plotted in Figure 2. By inspection, the ML rule reduces to

    D(y) = −1 if y < −1/2, 0 if −1/2 ≤ y ≤ 1/2, and +1 if y > 1/2.

(c) Inspection of Figure 2 shows how to calculate the probability of error:

    P_e = Σ_i P{error | i sent} P{i sent}
        = (1/3)(1 − P{−1/2 < S + Z < 1/2 | S = 0}) + (1/3) P{S + Z > −1/2 | S = −1} + (1/3) P{S + Z < 1/2 | S = +1}
        = (1/3)(1 − P{−1/2 < Z < 1/2}) + (1/3) P{Z > 1/2} + (1/3) P{Z < −1/2}
        = (1/3)(P{Z < −1/2} + P{Z > 1/2}) + (1/3) P{Z > 1/2} + (1/3) P{Z < −1/2}
        = (2/3)(P{Z < −1/2} + P{Z > 1/2})
        = (4/3) P{Z > 1/2}   (by symmetry)
        = (4/3) ∫_{1/2}^∞ (λ/2) e^{-λz} dz
        = (2/3) e^{-λ/2}.

5. Signal or no signal. Consider a communication system that is operated only from time to time. When the communication system is in the normal mode (denoted by M = 1), it transmits a random signal S = X with

    X = +1 with probability 1/2, and X = −1 with probability 1/2.

When the system is in the idle mode (denoted by M = 0), it does not transmit any signal (S = 0). Both normal and idle modes occur with equal probability. Thus

    S = X with probability 1/2, and S = 0 with probability 1/2.

The receiver observes Y = S + Z, where the ambient noise Z ~ Unif[−1,1] is independent of S.
(a) Find and sketch the conditional pdf f_{Y|M}(y|1) of the receiver observation Y given that the system is in the normal mode.
(b) Find and sketch the conditional pdf f_{Y|M}(y|0) of the receiver observation Y given that the system is in the idle mode.
(c) Find the optimal decoder D(y) for deciding whether the system is normal or idle. Provide the answer in terms of intervals of y.
(d) Find the associated probability of error.

Solution:

(a) If M = 1, then

    Y = 1 + Z with probability 1/2, and Y = −1 + Z with probability 1/2.

Hence, we have

    f_{Y|M}(y|1) = (1/2) f_Z(y − 1) + (1/2) f_Z(y + 1) = 1/4 for −2 ≤ y ≤ 2, and 0 otherwise.

(b) If M = 0, then Y = Z, so

    f_{Y|M}(y|0) = f_Z(y) = 1/2 for −1 ≤ y ≤ 1, and 0 otherwise.

(c) Since both modes are equally likely, the optimal MAP decoding rule reduces to the ML rule

    d(y) = 0 if f_{Y|M}(y|0) > f_{Y|M}(y|1), and 1 otherwise,
         = 0 if −1 < y < 1, and 1 otherwise.

(d) The probability of error is given by

    P{M ≠ d(Y)} = P{M = 1, −1 < Y < 1}
                = P{M = 1} P{−1 < Y < 1 | M = 1}
                = (1/2)(1/2) = 1/4.

6. Function of uniform random variables. Let X and Y be two independent Unif[0,1] random variables. Find the probability density function (pdf) of Z = (X + Y) mod 1 (i.e., Z = X + Y if X + Y ≤ 1 and Z = X + Y − 1 if X + Y > 1).

Solution: Let W = X + Y. Since X and Y are independent, the pdf of W is simply the convolution of the pdf of X and the pdf of Y. The convolution of these two uniform densities is the triangular pdf

    f_W(w) = w if 0 ≤ w ≤ 1, 2 − w if 1 ≤ w ≤ 2, and 0 otherwise.

Since Z = W mod 1, for 0 ≤ z < 1 we have f_Z(z) = f_W(z) + f_W(z + 1) = z + (1 − z) = 1. Hence Z ~ Unif[0,1].
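The uniformity of Z = (X + Y) mod 1 is also easy to see empirically (a seeded simulation sketch; seed and sample size are arbitrary choices of mine):

```python
import random

random.seed(1)
n = 200_000
z = [(random.random() + random.random()) % 1.0 for _ in range(n)]

# For Z ~ Unif[0, 1]: the mean is 1/2 and the cdf is F_Z(t) = t
assert abs(sum(z) / n - 0.5) < 0.01
for t in (0.25, 0.5, 0.75):
    assert abs(sum(v <= t for v in z) / n - t) < 0.01
```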
13 w if 0 w f W (w) = w if w 0 otherwise. Since Z = W mod, it is easy to see that Z U[0,]. 7. Maximal correlation. (a) For any pair of random variables (X,Y), show that F X,Y (x,y) minf X (x),f Y (y)}. Now let F and G be continuous and invertible cdf s and let X F. (b) Find the distribution of (c) Show that Solution: (a) We have Y = G (F(X)). F X,Y (x,y) = minf(x),g(y)}. F X,Y (x,y) = PX x,y y} PX x} = F X (x), and similarly, F X,Y F Y (y). Thus, (b) We have (c) We have F X,Y minf X (x),f Y (y)}. F Y (y) = PY y} = PG (F(X)) y} = PF(X) G(y)} = PX F (G(y))} = F(F (G(y))) = G(y). F X,Y (x,y) = P(X x,y y) = P(X x,x F (G(y))) = P(X minx,f (G(y))) = minf(x),f(f (G(y)))} = minf(x),g(y)}. From part (a), this is the maximal joint cdf for any (X,Y) with the given marginal cdf s F(x) and G(y). 3
More informationIntroduction to Probability and Statistics (Continued)
Introduction to Probability and Statistics (Continued) Prof. icholas Zabaras Center for Informatics and Computational Science https://cics.nd.edu/ University of otre Dame otre Dame, Indiana, USA Email:
More information18.440: Lecture 28 Lectures Review
18.440: Lecture 28 Lectures 18-27 Review Scott Sheffield MIT Outline Outline It s the coins, stupid Much of what we have done in this course can be motivated by the i.i.d. sequence X i where each X i is
More informationDS-GA 1002 Lecture notes 2 Fall Random variables
DS-GA 12 Lecture notes 2 Fall 216 1 Introduction Random variables Random variables are a fundamental tool in probabilistic modeling. They allow us to model numerical quantities that are uncertain: the
More informationSystem Simulation Part II: Mathematical and Statistical Models Chapter 5: Statistical Models
System Simulation Part II: Mathematical and Statistical Models Chapter 5: Statistical Models Fatih Cavdur fatihcavdur@uludag.edu.tr March 20, 2012 Introduction Introduction The world of the model-builder
More information5. Conditional Distributions
1 of 12 7/16/2009 5:36 AM Virtual Laboratories > 3. Distributions > 1 2 3 4 5 6 7 8 5. Conditional Distributions Basic Theory As usual, we start with a random experiment with probability measure P on an
More information1 Solution to Problem 2.1
Solution to Problem 2. I incorrectly worked this exercise instead of 2.2, so I decided to include the solution anyway. a) We have X Y /3, which is a - function. It maps the interval, ) where X lives) onto
More information6.041/6.431 Fall 2010 Quiz 2 Solutions
6.04/6.43: Probabilistic Systems Analysis (Fall 200) 6.04/6.43 Fall 200 Quiz 2 Solutions Problem. (80 points) In this problem: (i) X is a (continuous) uniform random variable on [0, 4]. (ii) Y is an exponential
More informationMAT 271E Probability and Statistics
MAT 271E Probability and Statistics Spring 2011 Instructor : Class Meets : Office Hours : Textbook : Supp. Text : İlker Bayram EEB 1103 ibayram@itu.edu.tr 13.30 16.30, Wednesday EEB? 10.00 12.00, Wednesday
More informationEXAM # 3 PLEASE SHOW ALL WORK!
Stat 311, Summer 2018 Name EXAM # 3 PLEASE SHOW ALL WORK! Problem Points Grade 1 30 2 20 3 20 4 30 Total 100 1. A socioeconomic study analyzes two discrete random variables in a certain population of households
More informationChapter 3: Random Variables 1
Chapter 3: Random Variables 1 Yunghsiang S. Han Graduate Institute of Communication Engineering, National Taipei University Taiwan E-mail: yshan@mail.ntpu.edu.tw 1 Modified from the lecture notes by Prof.
More informationLecture 3: Random variables, distributions, and transformations
Lecture 3: Random variables, distributions, and transformations Definition 1.4.1. A random variable X is a function from S into a subset of R such that for any Borel set B R {X B} = {ω S : X(ω) B} is an
More informationLecture 8: Channel Capacity, Continuous Random Variables
EE376A/STATS376A Information Theory Lecture 8-02/0/208 Lecture 8: Channel Capacity, Continuous Random Variables Lecturer: Tsachy Weissman Scribe: Augustine Chemparathy, Adithya Ganesh, Philip Hwang Channel
More informationChapter 3, 4 Random Variables ENCS Probability and Stochastic Processes. Concordia University
Chapter 3, 4 Random Variables ENCS6161 - Probability and Stochastic Processes Concordia University ENCS6161 p.1/47 The Notion of a Random Variable A random variable X is a function that assigns a real
More informationFinal Solutions Fri, June 8
EE178: Probabilistic Systems Analysis, Spring 2018 Final Solutions Fri, June 8 1. Small problems (62 points) (a) (8 points) Let X 1, X 2,..., X n be independent random variables uniformly distributed on
More informationUNIT Define joint distribution and joint probability density function for the two random variables X and Y.
UNIT 4 1. Define joint distribution and joint probability density function for the two random variables X and Y. Let and represent the probability distribution functions of two random variables X and Y
More information1 Introduction. P (n = 1 red ball drawn) =
Introduction Exercises and outline solutions. Y has a pack of 4 cards (Ace and Queen of clubs, Ace and Queen of Hearts) from which he deals a random of selection 2 to player X. What is the probability
More informationC-N M406 Lecture Notes (part 4) Based on Wackerly, Schaffer and Mendenhall s Mathematical Stats with Applications (2002) B. A.
Lecture Next consider the topic that includes both discrete and continuous cases that of the multivariate probability distribution. We now want to consider situations in which two or more r.v.s act in
More informationECE 302 Division 2 Exam 2 Solutions, 11/4/2009.
NAME: ECE 32 Division 2 Exam 2 Solutions, /4/29. You will be required to show your student ID during the exam. This is a closed-book exam. A formula sheet is provided. No calculators are allowed. Total
More informationMidterm Exam 1 Solution
EECS 126 Probability and Random Processes University of California, Berkeley: Fall 2015 Kannan Ramchandran September 22, 2015 Midterm Exam 1 Solution Last name First name SID Name of student on your left:
More informationStat 5101 Notes: Algorithms
Stat 5101 Notes: Algorithms Charles J. Geyer January 22, 2016 Contents 1 Calculating an Expectation or a Probability 3 1.1 From a PMF........................... 3 1.2 From a PDF...........................
More informationEECS 126 Probability and Random Processes University of California, Berkeley: Spring 2018 Kannan Ramchandran February 14, 2018.
EECS 126 Probability and Random Processes University of California, Berkeley: Spring 2018 Kannan Ramchandran February 14, 2018 Midterm 1 Last Name First Name SID You have 10 minutes to read the exam and
More informationMAS1302 Computational Probability and Statistics
MAS1302 Computational Probability and Statistics April 23, 2008 3. Simulating continuous random behaviour 3.1 The Continuous Uniform U(0,1) Distribution We have already used this random variable a great
More informationIntroduction to Systems Analysis and Decision Making Prepared by: Jakub Tomczak
Introduction to Systems Analysis and Decision Making Prepared by: Jakub Tomczak 1 Introduction. Random variables During the course we are interested in reasoning about considered phenomenon. In other words,
More informationSTAT 516: Basic Probability and its Applications
Lecture 4: Random variables Prof. Michael September 15, 2015 What is a random variable? Often, it is hard and/or impossible to enumerate the entire sample space For a coin flip experiment, the sample space
More informationIEOR 4703: Homework 2 Solutions
IEOR 4703: Homework 2 Solutions Exercises for which no programming is required Let U be uniformly distributed on the interval (0, 1); P (U x) = x, x (0, 1). We assume that your computer can sequentially
More information[Chapter 6. Functions of Random Variables]
[Chapter 6. Functions of Random Variables] 6.1 Introduction 6.2 Finding the probability distribution of a function of random variables 6.3 The method of distribution functions 6.5 The method of Moment-generating
More information2. Suppose (X, Y ) is a pair of random variables uniformly distributed over the triangle with vertices (0, 0), (2, 0), (2, 1).
Name M362K Final Exam Instructions: Show all of your work. You do not have to simplify your answers. No calculators allowed. There is a table of formulae on the last page. 1. Suppose X 1,..., X 1 are independent
More informationChapter 2 Random Variables
Stochastic Processes Chapter 2 Random Variables Prof. Jernan Juang Dept. of Engineering Science National Cheng Kung University Prof. Chun-Hung Liu Dept. of Electrical and Computer Eng. National Chiao Tung
More informationSampling Distributions
Sampling Distributions In statistics, a random sample is a collection of independent and identically distributed (iid) random variables, and a sampling distribution is the distribution of a function of
More informationPart IA Probability. Definitions. Based on lectures by R. Weber Notes taken by Dexter Chua. Lent 2015
Part IA Probability Definitions Based on lectures by R. Weber Notes taken by Dexter Chua Lent 2015 These notes are not endorsed by the lecturers, and I have modified them (often significantly) after lectures.
More informationStat 426 : Homework 1.
Stat 426 : Homework 1. Moulinath Banerjee University of Michigan Announcement: The homework carries 120 points and contributes 10 points to the total grade. (1) A geometric random variable W takes values
More information1 Probability and Random Variables
1 Probability and Random Variables The models that you have seen thus far are deterministic models. For any time t, there is a unique solution X(t). On the other hand, stochastic models will result in
More informationEECS 126 Probability and Random Processes University of California, Berkeley: Spring 2015 Abhay Parekh February 17, 2015.
EECS 126 Probability and Random Processes University of California, Berkeley: Spring 2015 Abhay Parekh February 17, 2015 Midterm Exam Last name First name SID Rules. You have 80 mins (5:10pm - 6:30pm)
More information