Chapter 2: Random Variables


ECE 541: Stochastic Signals and Systems, Fall 2018
Lecture 2, September 3, 2018
Dr. Salim El Rouayheb. Scribes: Peiwen Tian, Lu Liu, Ghadir Ayache.

Example 1. Toss a fair coin twice: $\Omega = \{HH, HT, TH, TT\}$. Define, for any $\omega \in \Omega$, $X(\omega) = $ number of heads in $\omega$. Then $X(\omega)$ is a random variable.

Definition 1. A random variable (RV) is a function $X : \Omega \to \mathbb{R}$.

Example 2. Let $w$ be the temperature in degrees Fahrenheit at 3:00 pm on Thursday afternoon, and let $X$ be the RV giving the temperature in degrees Celsius. Then $X = \frac{5}{9}(w - 32)$.

Definition 2 (Cumulative distribution function (CDF)).
$$F_X(x) = P(X \leq x).$$

[Figure 1: Cumulative distribution function of $X$: a staircase with jumps at $x = 0, 1, 2$.]

Example 3. The cumulative distribution function of the $X$ of Example 1 is (Figure 1)
$$F_X(x) = \begin{cases} 0 & x < 0, \\ \tfrac{1}{4} & 0 \leq x < 1, \\ \tfrac{3}{4} & 1 \leq x < 2, \\ 1 & x \geq 2. \end{cases}$$
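To make the mapping $\omega \mapsto X(\omega)$ and the staircase CDF concrete, here is a minimal Python sketch (my own illustration, not part of the original notes) that enumerates the sample space of Example 1 and evaluates $F_X$ at a few points:

```python
from itertools import product

# Sample space for two fair coin tosses; all four outcomes are equally likely.
omega = list(product("HT", repeat=2))        # [('H','H'), ('H','T'), ('T','H'), ('T','T')]
X = {w: w.count("H") for w in omega}         # random variable: number of heads in omega

def cdf(x):
    """F_X(x) = P(X <= x), computed by counting equally likely outcomes."""
    return sum(1 for w in omega if X[w] <= x) / len(omega)

for x in (-1, 0, 0.5, 1, 2, 3):
    print(f"F_X({x}) = {cdf(x)}")            # 0.0, 0.25, 0.25, 0.75, 1.0, 1.0
```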

Lemma 1 (Properties of the CDF).
(1) $\lim_{x \to -\infty} F_X(x) = 0$ and $\lim_{x \to +\infty} F_X(x) = 1$.
(2) $F_X(x)$ is non-decreasing: $x_1 \leq x_2 \Rightarrow F_X(x_1) \leq F_X(x_2)$.
(3) $F_X(x)$ is continuous from the right: $\lim_{\epsilon \to 0} F_X(x + \epsilon) = F_X(x)$, $\epsilon > 0$.
(4) $P(a \leq X \leq b) = P(X \leq b) - P(X \leq a) + P(X = a) = F_X(b) - F_X(a) + P(X = a)$.
(5) $P(X = a) = \lim_{\epsilon \to 0} \left[ F_X(a) - F_X(a - \epsilon) \right]$, $\epsilon > 0$.

Definition 3. If a random variable $X$ takes a finite or countable number of values, $X$ is called discrete.

Remark 1. A set $S$ is countable if its elements can be indexed, i.e., we can find an injective function from $S$ to the natural numbers (for an infinite $S$, equivalently a bijection $f : S \to \mathbb{N}$).

Example 4. Uncountable example: $\mathbb{R}$.

Example 5. Countable example: the number of tosses we need until we get a head.

Definition 4. $X$ is continuous if $F_X(x)$ is continuous.

Definition 5 (Probability density function (pdf)). If $X$ is continuous,
$$f_X(x) = \frac{dF_X(x)}{dx}.$$

Example 6 (Gaussian random variable). Normal/Gaussian distribution. By definition,
$$f_X(x) = \frac{1}{\sqrt{2\pi\sigma^2}} \, e^{-\frac{(x - \mu)^2}{2\sigma^2}}.$$
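As a quick numerical sanity check of Definition 5 and Example 6 (my own sketch, not from the lecture; $\mu$ and $\sigma$ are arbitrary), the following verifies that the Gaussian pdf integrates to 1 and that differentiating the numerically computed CDF recovers the pdf:

```python
import numpy as np

mu, sigma = 1.0, 2.0                              # arbitrary illustrative parameters
x = np.linspace(mu - 10 * sigma, mu + 10 * sigma, 200001)
pdf = np.exp(-(x - mu) ** 2 / (2 * sigma ** 2)) / np.sqrt(2 * np.pi * sigma ** 2)

print(np.trapz(pdf, x))                           # pdf integrates to ~1.0

# F_X is the running integral of the pdf; its derivative gives the pdf back.
cdf = np.cumsum(pdf) * (x[1] - x[0])
print(np.max(np.abs(np.gradient(cdf, x) - pdf)))  # small finite-difference error
```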

[Figure 2: Gaussian pdf, centered at $\mu$ with spread $\sigma$.]

Therefore,
$$F_X(a) = P(X \leq a) = \int_{-\infty}^{a} f_X(x)\,dx = \int_{-\infty}^{a} \frac{1}{\sqrt{2\pi\sigma^2}} \, e^{-\frac{(x - \mu)^2}{2\sigma^2}}\,dx.$$
We should always have
$$\int_{-\infty}^{+\infty} f_X(x)\,dx = 1.$$

Definition 6 (Mean and variance of a RV $X$). For the continuous case:
$$E(X) = \mu = \int_{-\infty}^{+\infty} x f_X(x)\,dx, \qquad V(X) = \sigma^2 = \int_{-\infty}^{+\infty} (x - \mu)^2 f_X(x)\,dx.$$
For the discrete case:
$$E(X) = \mu = \sum_{i} x_i P(X = x_i), \qquad V(X) = \sigma^2 = \sum_{i} (x_i - \mu)^2 P(X = x_i).$$

Example 7. $X$ is uniformly distributed on $[0, 1]$. Then
$$F_X(x) = \begin{cases} 0 & x < 0, \\ \int_0^x 1\,dt = x & 0 \leq x < 1, \\ 1 & x \geq 1, \end{cases}$$
and
$$E(X) = \int_0^1 x\,dx = \frac{1}{2}, \qquad V(X) = \int_0^1 \left(x - \frac{1}{2}\right)^2 dx = \frac{1}{12}.$$
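The uniform calculation in Example 7 is easy to confirm by simulation; this is a small sketch of my own, not part of the notes:

```python
import numpy as np

rng = np.random.default_rng(0)
samples = rng.uniform(0.0, 1.0, size=1_000_000)   # X ~ Uniform[0, 1]

# Sample mean and variance should be close to 1/2 and 1/12 = 0.0833...
print(samples.mean(), samples.var())
```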

Lemma 2 (Common probability density functions).

(1) Uniform: $X$ uniform over $[a, b]$:
$$f_X(x) = \begin{cases} \frac{1}{b - a} & \text{if } a \leq x \leq b, \\ 0 & \text{otherwise.} \end{cases}$$

[Figure 3: Uniform distribution pdf, constant at $\frac{1}{b-a}$ on $[a, b]$. Figure from Wikipedia: https://en.wikipedia.org/wiki/Uniform_distribution_(continuous)]

(2) Gaussian distribution:
$$f_X(x) = \frac{1}{\sqrt{2\pi\sigma^2}} \, e^{-\frac{(x - \mu)^2}{2\sigma^2}}.$$

[Figure 4: Gaussian pdfs $\varphi_{\mu,\sigma^2}(x)$ for several values of $\mu$ and $\sigma^2$. Figure from Wikipedia: https://en.wikipedia.org/wiki/Normal_distribution]

(3) Exponential distribution: it is the probability distribution of the waiting time between events in a Poisson process, in which events occur continuously and independently at a constant average rate (check Poisson processes in later lectures):
$$f_X(x) = \begin{cases} \lambda e^{-\lambda x} & \text{if } x \geq 0, \\ 0 & \text{if } x < 0. \end{cases}$$

The mean (integrating by parts):
$$E[X] = \int_{-\infty}^{+\infty} x f_X(x)\,dx = \int_{0}^{\infty} x \lambda e^{-\lambda x}\,dx = \Big[-x e^{-\lambda x}\Big]_{0}^{\infty} + \int_{0}^{\infty} e^{-\lambda x}\,dx = 0 + \frac{1}{\lambda} = \frac{1}{\lambda}.$$

Homework: find the variance of the exponential distribution.

Answer:
$$E[X^2] = \int_{0}^{\infty} x^2 \lambda e^{-\lambda x}\,dx = \Big[-x^2 e^{-\lambda x}\Big]_{0}^{\infty} + \int_{0}^{\infty} 2x e^{-\lambda x}\,dx = 0 + \frac{2}{\lambda} E[X] = \frac{2}{\lambda^2},$$
so
$$V(X) = E[X^2] - (E[X])^2 = \frac{2}{\lambda^2} - \frac{1}{\lambda^2} = \frac{1}{\lambda^2}.$$
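A quick simulation check of the mean and of the homework answer (my own sketch, with an arbitrary rate; note that NumPy's exponential sampler takes the scale $1/\lambda$, not $\lambda$):

```python
import numpy as np

lam = 2.5                                            # arbitrary rate parameter
rng = np.random.default_rng(1)
x = rng.exponential(scale=1 / lam, size=1_000_000)   # Exponential(lambda) samples

print(x.mean(), 1 / lam)          # sample mean vs. 1/lambda
print(x.var(), 1 / lam ** 2)      # sample variance vs. 1/lambda^2
```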

[Figure 5: Exponential distribution pdfs for several values of $\lambda$. Figure from Wikipedia: https://en.wikipedia.org/wiki/Exponential_distribution]

(4) Rayleigh distribution:
$$f_X(x) = \frac{x}{\sigma^2} \, e^{-\frac{x^2}{2\sigma^2}}, \qquad x \geq 0.$$

[Figure 6: Rayleigh distribution pdfs for several values of $\sigma$. Figure from Wikipedia: https://en.wikipedia.org/wiki/Rayleigh_distribution]

(5) Laplacian distribution:
$$f_X(x) = \frac{1}{\sqrt{2}\,\sigma} \, e^{-\frac{\sqrt{2}\,|x|}{\sigma}}.$$
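As a sanity check on the Rayleigh and Laplacian densities as written above (my own sketch, arbitrary $\sigma$), both should integrate to 1 under this parameterization:

```python
import numpy as np

sigma = 1.3                                      # arbitrary scale parameter

x = np.linspace(0.0, 60 * sigma, 400001)
rayleigh = (x / sigma ** 2) * np.exp(-x ** 2 / (2 * sigma ** 2))
print(np.trapz(rayleigh, x))                     # ~1.0

t = np.linspace(-60 * sigma, 60 * sigma, 800001)
laplace = np.exp(-np.sqrt(2) * np.abs(t) / sigma) / (np.sqrt(2) * sigma)
print(np.trapz(laplace, t))                      # ~1.0
```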

[Figure 7: Laplacian distribution pdfs for several values of the location $\mu$ and scale $b$. Figure from Wikipedia: https://en.wikipedia.org/wiki/Laplace_distribution]

1 Examples of Discrete Random Variables

1.1 Bernoulli RV

Flip a coin with $P(H) = p$ and $P(T) = 1 - p$, and set $X = 1$ if a head occurs and $X = 0$ if a tail occurs. Then
$$P(X = 1) = p, \qquad P(X = 0) = 1 - p.$$
The CDF of a Bernoulli RV is shown in Figure 8.

[Figure 8: Cumulative distribution function of a Bernoulli random variable: $0$ for $x < 0$, $1 - p$ for $0 \leq x < 1$, and $1$ for $x \geq 1$.]
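To tie Figure 8 to simulation, here is a tiny sketch (mine, with an arbitrary $p$) that estimates the Bernoulli CDF empirically; the printed values should approach $0$, $1 - p$, and $1$:

```python
import numpy as np

p = 0.3                                          # arbitrary success probability
rng = np.random.default_rng(2)
x = (rng.random(1_000_000) < p).astype(int)      # X = 1 with probability p, else 0

for t in (-0.5, 0.0, 0.5, 1.0):
    print(t, np.mean(x <= t))                    # ~0, 0.7, 0.7, 1.0
```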

1.2 Binomial distribution

Toss a coin $n$ times with $P(H) = p$ and $P(T) = 1 - p$. Let $X$ be the number of heads, $x \in \{0, 1, \dots, n\}$. Then
$$P(X = k) = \binom{n}{k} p^k (1 - p)^{n - k}.$$

Remark 2. Let $Y_i \in \{0, 1\}$ denote the outcome of the $i$-th toss. Then $X = Y_1 + Y_2 + \dots + Y_n$, i.e., a Binomial RV can be thought of as the sum of $n$ independent Bernoulli RVs.

Example 8 (Random graph). Each edge exists with probability $p$, and $X$ is the number of neighbors of node 1 (Figure 9). Let
$$Y_i = \begin{cases} 1 & \text{if node 1 is connected to node } i + 1, \\ 0 & \text{otherwise.} \end{cases}$$
So $X = Y_1 + Y_2 + \dots + Y_n$ follows the Binomial distribution.

[Figure 9: Random graph: node 1 and its potential neighbors 2, 3, 4, 5, 6, ...]

Example 9 (BSC). Suppose we are transmitting a file of length $n$. Consider a binary symmetric channel (BSC) where the probability of error is $p$ and the probability of receiving the correct bit is $1 - p$ (Figure 10). What is the probability that we have $k$ errors?
$$P(k \text{ errors}) = \binom{n}{k} p^k (1 - p)^{n - k}.$$
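Example 9 is straightforward to evaluate numerically; the following sketch (my own, with illustrative $n$ and $p$ not taken from the notes) computes $P(k \text{ errors})$ for a few $k$ and checks that the probabilities sum to 1:

```python
from math import comb

def p_k_errors(n, k, p):
    """P(k errors) when each of n transmitted bits flips independently with prob. p."""
    return comb(n, k) * p ** k * (1 - p) ** (n - k)

n, p = 10, 0.1                                   # illustrative file length and error prob.
print([round(p_k_errors(n, k, p), 4) for k in range(4)])    # P(0), P(1), P(2), P(3) errors
print(sum(p_k_errors(n, k, p) for k in range(n + 1)))       # total probability: 1.0
```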

[Figure 10: Binary symmetric channel with probability of error $P_e = p$.]

Let $X$ represent the number of errors. What is $E(X)$?
$$E(X) = \sum_{k=0}^{n} k\, P(X = k) = \sum_{k=1}^{n} k \binom{n}{k} p^k (1 - p)^{n - k} = np \sum_{k=1}^{n} \binom{n - 1}{k - 1} p^{k - 1} (1 - p)^{n - k} = np,$$
using $k \binom{n}{k} = n \binom{n - 1}{k - 1}$ and the binomial theorem
$$(x + y)^{n} = \sum_{k=0}^{n} \binom{n}{k} x^k y^{n - k} \quad \Rightarrow \quad \sum_{k=1}^{n} \binom{n - 1}{k - 1} p^{k - 1} (1 - p)^{n - k} = (p + 1 - p)^{n - 1} = 1.$$

Theorem 1. For any two RVs $X_1$ and $X_2$ and $Y = X_1 + X_2$,
$$E(Y) = E(X_1) + E(X_2).$$
It does not matter whether $X_1$ and $X_2$ are independent or not.

1.3 Geometric distribution

You keep tossing a coin until you observe a head. $X$ is the number of times you have to toss the coin: $X \in \{1, 2, \dots\}$, and
$$P(X = k) = (1 - p)^{k - 1} p.$$
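The mean $E(X) = np$ can be confirmed directly from the pmf; a brief sketch with illustrative parameters (my own, not from the notes):

```python
from math import comb

n, p = 20, 0.3                                   # illustrative parameters
pmf = [comb(n, k) * p ** k * (1 - p) ** (n - k) for k in range(n + 1)]

# E(X) = sum_k k * P(X = k) should equal n * p, matching the derivation above.
print(sum(k * pk for k, pk in enumerate(pmf)), n * p)   # both 6.0 (up to float error)
```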

[Figure 11: Binary erasure channel with probability of erasure $P_e$.]

Example 10 (Binary erasure channel). Suppose you have a BEC with feedback: when you get an erasure, you ask the sender to retransmit (Figure 11). Suppose you pay one dollar for each transmission, and let $X$ be the amount of money you pay per successfully delivered bit. $X$ is geometric with success probability $p = 1 - P_e$, so
$$E(X) = \frac{1}{p};$$
for example, with $p = 0.9$ you pay about $1/0.9 \approx 1.1$ dollars per bit on average. Here $p = P(H)$ plays the role of the probability of success per trial, and $E(X)$ is the number of coin flips (transmissions) needed on average. Intuition: you have to make $E(X)$ trials, and in these $E(X)$ trials the success happens exactly once, at the last trial, so $p \cdot E(X) = 1$.

Proof. Recall that for $|x| < 1$,
$$\sum_{k=0}^{\infty} x^k = \frac{1}{1 - x}, \qquad \sum_{k=1}^{\infty} k x^{k - 1} = \frac{d}{dx}\,\frac{1}{1 - x} = \frac{1}{(1 - x)^2}.$$
Then
$$E(X) = \sum_{k=1}^{\infty} k\, P(X = k) = \sum_{k=1}^{\infty} k (1 - p)^{k - 1} p = p \cdot \frac{1}{(1 - (1 - p))^2} = \frac{p}{p^2} = \frac{1}{p}.$$
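A simulation check of $E(X) = 1/p$ (my own sketch, illustrative $p$); NumPy's geometric sampler counts the successful trial itself, matching the convention used here:

```python
import numpy as np

p = 0.9                                          # illustrative success probability
rng = np.random.default_rng(3)
x = rng.geometric(p, size=1_000_000)             # number of transmissions until success

print(x.mean(), 1 / p)                           # both ~1.11 dollars per delivered bit
```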

So, E(X) p p 2, (22) p. (23).4 Poisson distribution Suppose a server receives λ searches per second on average. The probability that the server receives searches for this second is To find C: P (X ) C λ,,, 2,...,. (24)! P (X ) C λ! C C e λ. λ! Then the pdf of the poisson distribution for an average of λ arrivals per time unit is: The mean is: P (X ) e λ λ,,, 2,...,. (25)! E(X) λ. Example (Interpretation of poisson distribution as an arrival experiment). Suppose average of arrival cumtomers per second is λ. Suppose server goes down if X. We want to find the probability of P (X ). P (server going down) P (X ). We divide the one second to n intervals, each length of the interval is n of getting requests in small interval is λ n. second. The probability p

[Figure 12: One second divided into $n$ small intervals $1, 2, 3, 4, 5, \dots, n$.]

Now we can consider each small interval to be a Bernoulli trial with $p = \frac{\lambda}{n}$, so that
$$P(X = k) = \binom{n}{k} p^k (1 - p)^{n - k} = \binom{n}{k} \left(\frac{\lambda}{n}\right)^k \left(1 - \frac{\lambda}{n}\right)^{n - k}.$$
For large $n$,
$$\binom{n}{k} p^k (1 - p)^{n - k} \approx \frac{n^k}{k!}\, p^k (1 - p)^{n - k} \approx \frac{(np)^k}{k!}\, e^{-np} = \frac{\lambda^k}{k!}\, e^{-\lambda},$$
because
$$\binom{n}{k} = \frac{n(n - 1)\cdots(n - k + 1)}{k!} \approx \frac{n^k}{k!} \quad (k \text{ constant, } n \to \infty),$$
and $\left(1 - \frac{\lambda}{n}\right)^{n - k} \to e^{-\lambda}$. This means we can approximate Binomial($n$, $p$) by a Poisson distribution with $\lambda = np$ (if $n$ is very large).

2 Two Random Variables

Example 12. Let $X$ and $Y$ be Bernoulli random variables with the joint pmf in Table 1. If $Y = 0$, we know $X$ must equal $0$, so $X$ and $Y$ are dependent.

Table 1: Joint probability mass function of $X$ and $Y$.

          Y = 0    Y = 1
  X = 0    1/2      1/4
  X = 1     0       1/4

The marginals of $X$ are $P(X = 0) = \frac{3}{4}$ and $P(X = 1) = \frac{1}{4}$.
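To make the dependence claim in Example 12 concrete, here is a short sketch (my own, not from the notes) that computes the marginals of the Table 1 pmf and tests the factorization condition $P(x, y) = P_X(x) P_Y(y)$; rerunning it with the Table 2 entries below (3/8, 3/8, 1/8, 1/8) prints True instead:

```python
from itertools import product

# Joint pmf of Table 1 (the dependent example); keys are (x, y) pairs.
joint = {(0, 0): 1/2, (0, 1): 1/4, (1, 0): 0.0, (1, 1): 1/4}

p_x = {x: sum(joint[(x, y)] for y in (0, 1)) for x in (0, 1)}   # marginal of X
p_y = {y: sum(joint[(x, y)] for x in (0, 1)) for y in (0, 1)}   # marginal of Y
print(p_x, p_y)          # {0: 0.75, 1: 0.25} {0: 0.5, 1: 0.5}

# Independence requires P(x, y) = P_X(x) * P_Y(y) for every cell.
print(all(abs(joint[(x, y)] - p_x[x] * p_y[y]) < 1e-12
          for x, y in product((0, 1), repeat=2)))               # False: dependent
```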

Here is an example in which $X$ and $Y$ are independent but have the same marginal distributions as in Table 1.

Table 2: Joint probability mass function of $X$ and $Y$.

          Y = 0    Y = 1
  X = 0    3/8      3/8
  X = 1    1/8      1/8

2.1 Marginalization

Given the joint distribution $P_{X,Y}(x, y)$, the marginals are
$$P_X(x_0) = \sum_{y} P_{X,Y}(x_0, y), \qquad P_Y(y_0) = \sum_{x} P_{X,Y}(x, y_0).$$

Definition 7. If $X$ and $Y$ are continuous random variables, then the joint CDF is
$$F_{X,Y}(x, y) = P(X \leq x, Y \leq y).$$
Given the joint CDF $F_{X,Y}(x, y)$,
$$F_X(x_0) = F_{X,Y}(x_0, +\infty).$$

Definition 8. When the CDF is differentiable, the joint pdf is defined as
$$f_{X,Y}(x, y) = \frac{\partial^2 F_{X,Y}(x, y)}{\partial x\, \partial y},$$
with marginals
$$f_X(x) = \int_{-\infty}^{+\infty} f_{X,Y}(x, y)\,dy, \qquad f_Y(y) = \int_{-\infty}^{+\infty} f_{X,Y}(x, y)\,dx.$$

Definition 9. $X$ and $Y$ are independent if and only if
$$F_{X,Y}(x, y) = F_X(x) F_Y(y), \qquad f_{X,Y}(x, y) = f_X(x) f_Y(y).$$

Definition 10. The conditional CDF is
$$F_{X|Y}(x \mid y) = P(X \leq x \mid Y = y) = \frac{P(X \leq x, Y = y)}{P(Y = y)}.$$

Example 13. $X$ and $Y$ are two random variables given by the joint pdf
$$f_{X,Y}(x, y) = \frac{1}{2\pi\sigma^2 \sqrt{1 - \rho^2}} \exp\!\left[-\frac{1}{2\sigma^2 (1 - \rho^2)}\left(x^2 + y^2 - 2\rho x y\right)\right].$$
What is $f_X(x)$? Completing the square, $x^2 + y^2 - 2\rho x y = (y - \rho x)^2 + (1 - \rho^2) x^2$, so
$$f_X(x) = \int_{-\infty}^{+\infty} f_{X,Y}(x, y)\,dy = \frac{1}{2\pi\sigma^2 \sqrt{1 - \rho^2}} \, e^{-\frac{(1 - \rho^2) x^2}{2\sigma^2 (1 - \rho^2)}} \int_{-\infty}^{+\infty} e^{-\frac{(y - \rho x)^2}{2\sigma^2 (1 - \rho^2)}}\,dy.$$
Because $\int_{-\infty}^{+\infty} \frac{1}{\sqrt{2\pi s^2}} e^{-\frac{(t - \mu)^2}{2 s^2}}\,dt = 1$, the remaining integral equals $\sqrt{2\pi\sigma^2 (1 - \rho^2)}$, and so
$$f_X(x) = \frac{1}{\sqrt{2\pi\sigma^2}}\, e^{-\frac{x^2}{2\sigma^2}}.$$
Similarly, $f_Y(y) = \frac{1}{\sqrt{2\pi\sigma^2}}\, e^{-\frac{y^2}{2\sigma^2}}$. If $\rho = 0$, then $f_{X,Y}(x, y) = f_X(x) f_Y(y)$, so $X$ and $Y$ are independent. If $\rho \neq 0$, $X$ and $Y$ are not independent.
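The marginalization in Example 13 can be checked numerically by integrating the joint pdf over $y$ and comparing with the $\mathcal{N}(0, \sigma^2)$ density; this is my own sketch with illustrative $\sigma$ and $\rho$:

```python
import numpy as np

sigma, rho = 1.5, 0.6                            # illustrative parameters
y = np.linspace(-12 * sigma, 12 * sigma, 100001)

def joint(x, y):
    """Joint pdf f_{X,Y}(x, y) from Example 13."""
    norm = 2 * np.pi * sigma ** 2 * np.sqrt(1 - rho ** 2)
    return np.exp(-(x ** 2 + y ** 2 - 2 * rho * x * y)
                  / (2 * sigma ** 2 * (1 - rho ** 2))) / norm

for x0 in (-1.0, 0.0, 2.0):
    marginal = np.trapz(joint(x0, y), y)         # integrate out y numerically
    target = np.exp(-x0 ** 2 / (2 * sigma ** 2)) / np.sqrt(2 * np.pi * sigma ** 2)
    print(x0, round(marginal, 6), round(target, 6))   # the two columns should agree
```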

What is f X (x)? f X (x) f X,Y (x, y)dy, 2πσ 2 ρ 2 exp[ x2 2σ 2 ( ρ 2 ) ] 2πσ 2 ρ 2 exp[ x2 +ρ2 x2 2σ 2 ( ρ 2 ) ] 2πσ 2 σρ 2 ( ρ2 )x2 exp[ 2σ 2 ( ρ 2 ) ] 2πσ exp[ 2σ 2 ( ρ 2 ) (x2 + y 2 2ρxy)]dy, exp[ 2σ 2 ( ρ 2 ) (y2 2ρxy + ρx 2 ρ 2 x 2 )]dy, ρ 2 x e (y ρx) 2σ 2 ( ρ 2 ) dy, 2πσ e (y ρx) 2σ 2 ( ρ 2 ) dy. Because 2πσ e (x µ)2 2σ dx. So if ρ, f X (x) 2πσ e x2 2σ 2. Similarly, f Y (y) 2πσ e y2 2σ 2. We can have f X,Y (x, y) f X (x)f Y (y). So X and Y are independent. If ρ, X and Y are not independent. 4