ECE 541: Stochastic Signals and Systems                                Fall 2018

Lecture 2 - September 3, 2018

Dr. Salim El Rouayheb                 Scribes: Peiwen Tian, Lu Liu, Ghadir Ayache

Chapter 2: Random Variables

Example 1. Toss a fair coin twice: Ω = {HH, HT, TH, TT}. For any ω ∈ Ω, define X(ω) = number of heads in ω. Then X(ω) is a random variable.

Definition 1. A random variable (RV) is a function X: Ω → R.

Example 2. Let w be the temperature in °F at 3:00 pm on Thursday afternoon, and let X be the RV giving the same temperature in °C. Then

    X = (5/9)(w − 32).

Definition 2 (Cumulative distribution function (CDF)).

    F_X(x) = P(X ≤ x).                                                    (1)

[Figure 1: Cumulative distribution function of X, a staircase with jumps at x = 0, 1, 2.]

Example 3. For the RV X of Example 1 (the number of heads in two fair tosses), the CDF is (Figure 1)

    F_X(x) = 0,      x < 0,
             1/4,    0 ≤ x < 1,
             3/4,    1 ≤ x < 2,
             1,      x ≥ 2.
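As an aside (not part of the original notes), the pmf and staircase CDF of this two-toss experiment can be checked by enumerating Ω; the helper name `cdf` is our own:

```python
from fractions import Fraction
from itertools import product

# Sample space of two fair coin tosses; each outcome has probability 1/4.
omega = list(product("HT", repeat=2))

# X(w) = number of heads in w; accumulate the pmf of X.
pmf = {}
for w in omega:
    k = w.count("H")
    pmf[k] = pmf.get(k, Fraction(0)) + Fraction(1, 4)

def cdf(x):
    """F_X(x) = P(X <= x)."""
    return sum(p for k, p in pmf.items() if k <= x)

print(cdf(-1), cdf(0.5), cdf(1.5), cdf(2))  # 0 1/4 3/4 1
```

The printed values match the four pieces of the staircase CDF in Example 3.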
Lemma 1 (Properties of the CDF).

(1) lim_{x→−∞} F_X(x) = 0,                                               (2)

(2) lim_{x→+∞} F_X(x) = 1,                                               (3)

(3) F_X(x) is non-decreasing: x_1 ≤ x_2 ⟹ F_X(x_1) ≤ F_X(x_2),           (4)

(4) F_X(x) is continuous from the right:

    lim_{ε→0⁺} F_X(x + ε) = F_X(x),  ε > 0.                              (5)

Consequently,

    P(a ≤ X ≤ b) = P(X ≤ b) − P(X ≤ a) + P(X = a)                        (6)
                 = F_X(b) − F_X(a) + P(X = a),                           (7)

    P(X = a) = F_X(a) − lim_{ε→0⁺} F_X(a − ε),  ε > 0.                   (8)

Definition 3. If the random variable X takes a finite or countable number of values, X is called discrete.

Remark 1. A set S is countable if its elements can be indexed, i.e., if we can find an injective function f: S → N.

Example 4. Non-countable example: R.

Example 5. Countable example: the number of tosses we need until we get a head, S = {1, 2, 3, ...}.

Definition 4. X is continuous if F_X(x) is continuous.

Definition 5 (Probability density function (pdf)). For X continuous,

    f_X(x) = dF_X(x)/dx.                                                 (9)

Example 6 (Normal/Gaussian distribution). A Gaussian random variable has, by definition, the pdf

    f_X(x) = (1/√(2πσ²)) e^{−(x−µ)²/(2σ²)}.
[Figure 2: Gaussian pdf, a bell curve centered at µ with width σ.]

Therefore,

    F_X(a) = P(X ≤ a) = ∫_{−∞}^{a} f_X(x) dx,

and we should always have

    ∫_{−∞}^{+∞} f_X(x) dx = 1.

Definition 6 (Mean and variance of an RV X). For the continuous case:

    E(X) = µ = ∫_{−∞}^{+∞} x f_X(x) dx,
    V(X) = σ² = ∫_{−∞}^{+∞} (x − µ)² f_X(x) dx.

For the discrete case:

    E(X) = µ = Σ_i x_i P(X = x_i),
    V(X) = σ² = Σ_i (x_i − µ)² P(X = x_i).

Example 7. X uniformly distributed on [0, 1]:

    F_X(x) = 0,               x < 0,
             ∫_0^x dt = x,    0 ≤ x < 1,
             1,               x ≥ 1.

    E(X) = ∫_0^1 x dx = 1/2,
    V(X) = ∫_0^1 (x − 1/2)² dx = 1/12.
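The uniform mean and variance of Example 7 can be confirmed with a short Monte Carlo sketch of ours:

```python
import random

random.seed(0)
n = 200_000
xs = [random.random() for _ in range(n)]  # X ~ Uniform[0, 1]

mean = sum(xs) / n
var = sum((x - mean) ** 2 for x in xs) / n  # sample variance

# Should be close to E(X) = 1/2 and V(X) = 1/12 ≈ 0.0833.
print(mean, var)
```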
Lemma 2 (Common probability density functions).

(1) Uniform distribution, X uniform over [a, b]:

    f_X(x) = 1/(b − a)   if a ≤ x ≤ b,
             0           otherwise.                                      (10)

[Figure 3: Uniform distribution pdf.¹]

(2) Gaussian distribution:

    f_X(x) = (1/√(2πσ²)) e^{−(x−µ)²/(2σ²)}.                              (11)

[Figure 4: Gaussian pdfs for several values of µ and σ².²]

¹Figure from Wikipedia: https://en.wikipedia.org/wiki/Uniform_distribution_(continuous)
²Figure from Wikipedia: https://en.wikipedia.org/wiki/Normal_distribution
(3) Exponential distribution: the probability distribution of the waiting time between events in a Poisson process, in which events occur continuously and independently at a constant average rate (see Poisson processes in later lectures):

    f_X(x) = λ e^{−λx}   if x ≥ 0,
             0           if x < 0.                                       (12)

The mean, by integration by parts:

    E[X] = ∫_0^∞ x f_X(x) dx = ∫_0^∞ x λ e^{−λx} dx
         = [−x e^{−λx}]_0^∞ + ∫_0^∞ e^{−λx} dx
         = 0 + 1/λ = 1/λ.

Homework: find the variance of the exponential distribution.

Answer:

    E[X²] = ∫_0^∞ x² λ e^{−λx} dx
          = [−x² e^{−λx}]_0^∞ + 2 ∫_0^∞ x e^{−λx} dx
          = (2/λ) ∫_0^∞ x λ e^{−λx} dx = (2/λ) E[X] = 2/λ²,

so V(X) = E[X²] − (E[X])² = 2/λ² − 1/λ² = 1/λ².
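A quick numerical check of E[X] = 1/λ and E[X²] = 2/λ² (our sketch; it uses inverse-CDF sampling, a standard trick not covered in the notes):

```python
import math
import random

random.seed(1)
lam = 2.0
n = 200_000

# If U ~ Uniform(0,1), then -ln(1 - U)/lam has CDF 1 - e^{-lam x}, i.e. Exp(lam).
xs = [-math.log(1.0 - random.random()) / lam for _ in range(n)]

mean = sum(xs) / n                    # should approach 1/lam = 0.5
second = sum(x * x for x in xs) / n   # should approach 2/lam^2 = 0.5
var = second - mean ** 2              # should approach 1/lam^2 = 0.25
print(mean, second, var)
```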
[Figure 5: Exponential distribution pdf.³]

(4) Rayleigh distribution:

    f_X(x) = (x/σ²) e^{−x²/(2σ²)},  x ≥ 0.                               (13)

[Figure 6: Rayleigh distribution pdf for several values of σ.⁴]

(5) Laplacian distribution:

    f_X(x) = (1/(√2 σ)) e^{−√2 |x|/σ}.                                   (14)

³Figure from Wikipedia: https://en.wikipedia.org/wiki/Exponential_distribution
⁴Figure from Wikipedia: https://en.wikipedia.org/wiki/Rayleigh_distribution
[Figure 7: Laplacian distribution pdf for several location/scale parameters.⁵]

1 Examples of Discrete Random Variables

1.1 Bernoulli RV

Flip a coin with P(H) = p, P(T) = 1 − p, and set

    X = 1 if a head occurs,
    X = 0 if a tail occurs,

so that P(X = 1) = p and P(X = 0) = 1 − p. The CDF of a Bernoulli RV is shown in Figure 8:

    F_X(x) = 0,        x < 0,
             1 − p,    0 ≤ x < 1,
             1,        x ≥ 1.

[Figure 8: Cumulative distribution function of a Bernoulli random variable.]

⁵Figure from Wikipedia: https://en.wikipedia.org/wiki/Laplace_distribution
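A minimal simulation sketch (ours, with an arbitrary example value p = 0.3) confirming P(X = 1) = p:

```python
import random

random.seed(4)
p = 0.3
n = 100_000

# X = 1 if a p-biased coin lands heads, else 0.
xs = [1 if random.random() < p else 0 for _ in range(n)]

p_hat = sum(xs) / n  # empirical P(X = 1); should be close to p
print(p_hat)
```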
1.2 Binomial distribution

Toss a coin n times, with P(H) = p, P(T) = 1 − p, and let X be the number of heads, X ∈ {0, 1, ..., n}. Then

    P(X = k) = C(n, k) p^k (1 − p)^{n−k},

where C(n, k) = n!/(k!(n−k)!) is the binomial coefficient.

Remark 2. Let Y_i ∈ {0, 1} denote the outcome of the i-th toss, so that

    X = Y_1 + Y_2 + ... + Y_n,

i.e., a Binomial RV can be thought of as the sum of n independent Bernoulli RVs.

Example 8 (Random graph). On a graph with n nodes, each edge exists with probability p. Let X be the number of neighbors of node 1 (Figure 9), and let

    Y_i = 1 if node 1 is connected to node i + 1,
          0 otherwise.

Then X = Y_1 + Y_2 + ... + Y_{n−1}, so X follows the Binomial distribution.

[Figure 9: A random graph on n nodes.]

Example 9 (BSC). Suppose we are transmitting a file of length n over a binary symmetric channel (BSC) in which each bit is flipped with probability p and received correctly with probability 1 − p (Figure 10). What is the probability that we have k errors?

    P(k errors) = C(n, k) p^k (1 − p)^{n−k}.
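The BSC error-count probabilities can be tabulated directly. This sketch of ours uses arbitrary example values n = 20, p = 0.1 and checks that the pmf sums to 1:

```python
from math import comb

n, p = 20, 0.1  # file length and crossover probability (example values)

def p_errors(k):
    """P(k errors) = C(n, k) p^k (1 - p)^(n - k)."""
    return comb(n, k) * p**k * (1 - p) ** (n - k)

pmf = [p_errors(k) for k in range(n + 1)]
print(sum(pmf))     # should be 1, by the binomial theorem
print(p_errors(0))  # probability of an error-free transmission, 0.9^20
```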
[Figure 10: Binary symmetric channel with probability of error P_e = p.]

Let X represent the number of errors. What is E(X)?

    E(X) = Σ_{k=0}^{n} k P(X = k)
         = Σ_{k=1}^{n} k C(n, k) p^k (1 − p)^{n−k}
         = np Σ_{k=1}^{n} C(n−1, k−1) p^{k−1} (1 − p)^{n−k}      [using k C(n, k) = n C(n−1, k−1)]
         = np Σ_{j=0}^{n−1} C(n−1, j) p^j (1 − p)^{n−1−j}
         = np,

where the last sum equals (p + (1 − p))^{n−1} = 1 by the binomial theorem:

    (x + y)^m = Σ_{k=0}^{m} C(m, k) x^k y^{m−k}.

Theorem 1. For any two RVs X_1 and X_2, if Y = X_1 + X_2, then

    E(Y) = E(X_1) + E(X_2).                                              (15)

It does not matter whether X_1 and X_2 are independent or not.

1.3 Geometric distribution

You keep tossing a coin until you observe a head. Let X be the number of times you have to toss the coin. Then X ∈ {1, 2, ...} and

    P(X = k) = (1 − p)^{k−1} p.
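A simulation sketch (ours) of the geometric pmf: toss a p-biased coin until the first head and compare the empirical frequencies against (1 − p)^{k−1} p:

```python
import random

random.seed(2)
p = 0.25
n = 100_000

def tosses_until_head():
    """Toss a p-biased coin until the first head; return the toss count."""
    k = 1
    while random.random() >= p:
        k += 1
    return k

samples = [tosses_until_head() for _ in range(n)]
for k in range(1, 5):
    empirical = samples.count(k) / n
    exact = (1 - p) ** (k - 1) * p
    print(k, round(empirical, 3), round(exact, 3))
```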
[Figure 11: Binary erasure channel with probability of erasure P_e = p.]

Example 10 (Binary erasure channel). Suppose you have a BEC with feedback: when you get an erasure, you ask the sender to retransmit (Figure 11). Suppose you pay one dollar for each transmission, and let X be the amount of money you pay per delivered bit. Each transmission succeeds with probability 1 − p, so X is geometric and on average you pay E(X) = 1/(1 − p) dollars; for example, with p = 0.1 this is 1/0.9 ≈ $1.1.

For the geometric distribution with success probability p,

    E(X) = 1/p,

i.e., E(X) = 1/P(H) is the number of coin flips needed on average. Intuition: you have to make E(X) trials, and in these E(X) trials the success happens once, at the last trial.

Proof.

    E(X) = Σ_{k=1}^{∞} k P(X = k)                                        (16)
         = Σ_{k=1}^{∞} k (1 − p)^{k−1} p.                                (17)

Recall that for |x| < 1,

    Σ_{k=0}^{∞} x^k = 1/(1 − x),                                         (19)

and differentiating with respect to x,

    Σ_{k=1}^{∞} k x^{k−1} = 1/(1 − x)².                                  (20)

Taking x = 1 − p,

    Σ_{k=1}^{∞} k (1 − p)^{k−1} = 1/p².                                  (21)
So

    E(X) = p · (1/p²)                                                    (22)
         = 1/p.                                                          (23)

1.4 Poisson distribution

Suppose a server receives λ searches per second on average. Model the probability that the server receives k searches in a given second as

    P(X = k) = C λ^k / k!,   k = 0, 1, 2, ....                           (24)

To find C:

    Σ_{k=0}^{∞} P(X = k) = C Σ_{k=0}^{∞} λ^k / k! = C e^λ = 1  ⟹  C = e^{−λ}.

Then the pmf of the Poisson distribution with an average of λ arrivals per time unit is

    P(X = k) = e^{−λ} λ^k / k!,   k = 0, 1, 2, ....                      (25)

The mean is E(X) = λ.

Example 11 (Interpretation of the Poisson distribution as an arrival experiment). Suppose customers arrive at a server at an average rate of λ per second, and suppose the server goes down if it receives too many requests in one second, so we want to find P(X = k). We divide the one second into n intervals, each of length 1/n second. The probability of getting a request in a small interval is λ/n.
[Figure 12: One second divided into n intervals.]

Now we can treat each small interval as a Bernoulli trial with p = λ/n, so

    P(X = k) = C(n, k) p^k (1 − p)^{n−k}                                 (26)
             = C(n, k) (λ/n)^k (1 − λ/n)^{n−k}.                          (27)

For large n,

    P(X = k) ≈ (n^k / k!) p^k (1 − p)^n                                  (28)
             ≈ ((np)^k / k!) e^{−np}                                     (29)
             = (λ^k / k!) e^{−λ}                                         (30)
             = λ^k e^{−λ} / k!.                                          (31)

We get (28) because

    C(n, k) = n(n−1)···(n−k+1) / k!                                      (32)
            ≈ n^k / k!   (k constant, n → ∞),                            (33)

and (29) because (1 − λ/n)^n → e^{−λ}. This means we can approximate Binomial(n, p) by a Poisson with λ = np when n is very large.

2 Two Random Variables

Example 12. Let X and Y be Bernoulli random variables with the joint pmf of Table 1. If Y = 1, we know X must equal 0, so X and Y are dependent.

             Y = 0    Y = 1
    X = 0     1/2      1/4
    X = 1     1/4       0

    Table 1: Joint probability mass function of X and Y.

The marginals are P(X = 0) = 3/4, P(X = 1) = 1/4. Here is an example in which X and Y are independent, while X has the same marginal distribution.
             Y = 0    Y = 1
    X = 0     3/8      3/8
    X = 1     1/8      1/8

    Table 2: Joint probability mass function of X and Y.

2.1 Marginalization

Given the joint distribution P_{X,Y}(x, y), the marginals are

    P_X(x_j) = Σ_y P_{X,Y}(x_j, y),                                      (34)
    P_Y(y_j) = Σ_x P_{X,Y}(x, y_j).                                      (35)

Definition 7. If X and Y are continuous random variables, the joint CDF is

    F_{X,Y}(x, y) = P(X ≤ x, Y ≤ y).                                     (36)

Given the joint CDF F_{X,Y}(x, y),

    F_X(x) = F_{X,Y}(x, +∞).                                             (37)

Definition 8. When the CDF is differentiable, the joint pdf is defined as

    f_{X,Y}(x, y) = ∂² F_{X,Y}(x, y) / (∂x ∂y),                          (38)

and the marginal pdfs are

    f_X(x) = ∫_{−∞}^{+∞} f_{X,Y}(x, y) dy,                               (39)
    f_Y(y) = ∫_{−∞}^{+∞} f_{X,Y}(x, y) dx.                               (40)

Definition 9. X and Y are independent if and only if

    F_{X,Y}(x, y) = F_X(x) F_Y(y),                                       (41)
    f_{X,Y}(x, y) = f_X(x) f_Y(y).                                       (42)

Definition 10. The conditional CDF is

    F_{X|Y}(x | y) = P(X ≤ x | Y = y)                                    (43)
                   = P(X ≤ x, Y = y) / P(Y = y).                         (44)

Example 13. X and Y are two random variables with the joint pdf

    f_{X,Y}(x, y) = (1 / (2πσ² √(1 − ρ²))) exp[ −(x² + y² − 2ρxy) / (2σ²(1 − ρ²)) ].
What is f_X(x)?

    f_X(x) = ∫_{−∞}^{+∞} f_{X,Y}(x, y) dy
           = (1 / (2πσ² √(1 − ρ²))) ∫_{−∞}^{+∞} exp[ −(x² + y² − 2ρxy) / (2σ²(1 − ρ²)) ] dy.

Completing the square in y,

    x² + y² − 2ρxy = (y − ρx)² + (1 − ρ²) x²,

so

    f_X(x) = (1 / (2πσ² √(1 − ρ²))) e^{−x²/(2σ²)} ∫_{−∞}^{+∞} e^{−(y − ρx)² / (2σ²(1 − ρ²))} dy.

Because ∫_{−∞}^{+∞} (1/√(2πσ̃²)) e^{−(y−µ)²/(2σ̃²)} dy = 1, the remaining integral equals √(2πσ²(1 − ρ²)) (take σ̃² = σ²(1 − ρ²) and µ = ρx), and therefore

    f_X(x) = (1/√(2πσ²)) e^{−x²/(2σ²)}.

Similarly, f_Y(y) = (1/√(2πσ²)) e^{−y²/(2σ²)}.

If ρ = 0, then f_{X,Y}(x, y) = f_X(x) f_Y(y), so X and Y are independent. If ρ ≠ 0, X and Y are not independent.
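To see Example 13 concretely, here is a simulation sketch of ours: build (X, Y) with the stated joint density by setting X = Z1 and Y = ρZ1 + √(1 − ρ²) Z2 for independent N(0, σ²) variables Z1, Z2 (a standard construction, not from the notes), then check that each marginal has variance σ² while X and Y stay correlated when ρ ≠ 0:

```python
import math
import random

random.seed(3)
rho, sigma, n = 0.6, 1.0, 200_000  # example parameter values

xs, ys = [], []
for _ in range(n):
    z1 = random.gauss(0, sigma)
    z2 = random.gauss(0, sigma)
    xs.append(z1)
    ys.append(rho * z1 + math.sqrt(1 - rho**2) * z2)

mx, my = sum(xs) / n, sum(ys) / n
vx = sum((x - mx) ** 2 for x in xs) / n   # should be near sigma^2
vy = sum((y - my) ** 2 for y in ys) / n   # should be near sigma^2
cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / n
corr = cov / math.sqrt(vx * vy)           # should be near rho
print(vx, vy, corr)
```

With rho = 0 the sample correlation drops to roughly zero, matching the independence conclusion above.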