Chapter 3: Random Variables 1


Yunghsiang S. Han
Graduate Institute of Communication Engineering, National Taipei University, Taiwan
E-mail: yshan@mail.ntpu.edu.tw
(Modified from the lecture notes by Prof. Mao-Ching Chiu.)

3.1 The Notion of a Random Variable

A random variable X is a function that assigns a real number, X(ζ), to each outcome ζ in the sample space.

Example: Toss a coin three times. S = {HHH, HHT, HTH, THH, HTT, THT, TTH, TTT}. Let X be the number of heads.

ζ:     HHH  HHT  HTH  THH  HTT  THT  TTH  TTT
X(ζ):   3    2    2    2    1    1    1    0

X is a random variable with S_X = {0, 1, 2, 3}.
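The mapping in this example can be checked with a short sketch (Python is not part of the original notes; the enumeration below is purely illustrative):

```python
from itertools import product

# Enumerate the sample space of three coin tosses and map each
# outcome to X = number of heads, as in the table above.
outcomes = list(product("HT", repeat=3))
pmf = {}
for omega in outcomes:
    x = omega.count("H")          # X(zeta) = number of heads
    pmf[x] = pmf.get(x, 0) + 1 / len(outcomes)

print(pmf)  # {3: 1/8, 2: 3/8, 1: 3/8, 0: 1/8}
```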

S: sample space of a random experiment
X: random variable, X : S → S_X
S_X is a new sample space.
Let B ⊂ S_X and A = {ζ : X(ζ) ∈ B}. Then
P[B] = P[A] = P[{ζ : X(ζ) ∈ B}].
A and B are equivalent events.


3.2 Cumulative Distribution Function

The cumulative distribution function (cdf) is
F_X(x) = P[X ≤ x] for −∞ < x < ∞.
In the underlying sample space, F_X(x) = P[{ζ : X(ζ) ≤ x}]. F_X(x) is a function of the variable x.

Properties of the cdf:
1. 0 ≤ F_X(x) ≤ 1.
2. lim_{x→∞} F_X(x) = 1.
3. lim_{x→−∞} F_X(x) = 0.
4. If a < b, then F_X(a) ≤ F_X(b).
5. F_X(x) is continuous from the right, i.e., for h > 0, F_X(b) = lim_{h→0} F_X(b + h) = F_X(b⁺).
6. P[a < X ≤ b] = F_X(b) − F_X(a), since {X ≤ a} ∪ {a < X ≤ b} = {X ≤ b}.

7. P[X = b] = F_X(b) − F_X(b⁻).
8. P[X > x] = 1 − F_X(x).

X: the number of heads in three tosses of a fair coin. Let δ be a small positive number. Then
F_X(1 − δ) = P[X ≤ 1 − δ] = P[0 heads] = 1/8,
F_X(1) = P[X ≤ 1] = P[0 or 1 heads] = 1/8 + 3/8 = 1/2,
F_X(1 + δ) = P[X ≤ 1 + δ] = P[0 or 1 heads] = 1/2.
In terms of the unit step function
u(x) = { 0, x < 0; 1, x ≥ 0 },
F_X(x) = (1/8)u(x) + (3/8)u(x − 1) + (3/8)u(x − 2) + (1/8)u(x − 3).
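The step-function form of this cdf is easy to evaluate directly; the sketch below (not part of the original notes) reproduces the three values computed above:

```python
def u(x):
    """Unit step: 0 for x < 0, 1 for x >= 0."""
    return 1.0 if x >= 0 else 0.0

def F_X(x):
    # cdf of the number of heads in three fair-coin tosses,
    # written as a weighted sum of unit steps
    return 0.125*u(x) + 0.375*u(x - 1) + 0.375*u(x - 2) + 0.125*u(x - 3)

print(F_X(0.9), F_X(1.0), F_X(1.1))  # 0.125 0.5 0.5
```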


Example: The transmission time X of messages in a communication system obeys the exponential probability law with parameter λ, i.e., P[X > x] = e^(−λx), x > 0. Find the cdf of X and P[T < X ≤ 2T], where T = 1/λ.
F_X(x) = P[X ≤ x] = 1 − P[X > x] = { 0, x < 0; 1 − e^(−λx), x ≥ 0 }.
P[T < X ≤ 2T] = F_X(2T) − F_X(T) = (1 − e^(−2)) − (1 − e^(−1)) = e^(−1) − e^(−2).
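Note that the answer e^(−1) − e^(−2) does not depend on λ, since T = 1/λ rescales with the rate. A quick numerical sketch (illustrative only, with an arbitrary λ):

```python
import math

lam = 2.0        # any positive rate works; T = 1/lam
T = 1 / lam

def F(x):
    """cdf of an exponential random variable with rate lam."""
    return 0.0 if x < 0 else 1 - math.exp(-lam * x)

# P[T < X <= 2T] = F(2T) - F(T) = e^-1 - e^-2, independent of lam
p = F(2 * T) - F(T)
print(p, math.exp(-1) - math.exp(-2))
```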


Types of random variables:

Discrete random variable, S_X = {x_0, x_1, ...}:
F_X(x) = Σ_k p_X(x_k) u(x − x_k),
where x_k ∈ S_X and p_X(x_k) = P[X = x_k] is the probability mass function (pmf) of X.

Continuous random variable:
F_X(x) = ∫_{−∞}^{x} f_X(t) dt.

Random variable of mixed type:
F_X(x) = p F_1(x) + (1 − p) F_2(x).

3.3 Probability Density Function

The probability density function (pdf) of X is
f_X(x) = dF_X(x)/dx.


Properties of the pdf:
1. f_X(x) ≥ 0.
2. P[a ≤ X ≤ b] = ∫_a^b f_X(x) dx.
3. F_X(x) = ∫_{−∞}^x f_X(t) dt.
4. ∫_{−∞}^{∞} f_X(t) dt = 1.

The pdf of the uniform random variable:
f_X(x) = { 1/(b − a), a ≤ x ≤ b; 0, otherwise }.
F_X(x) = { 0, x < a; (x − a)/(b − a), a ≤ x ≤ b; 1, x > b }.


Pdf for a discontinuous cdf:
Unit step function: u(x) = { 0, x < 0; 1, x ≥ 0 }.
Delta function δ(t): u(x) = ∫_{−∞}^x δ(t) dt.
Cdf for a discrete random variable:
F_X(x) = Σ_k p_X(x_k) u(x − x_k) = ∫_{−∞}^x f_X(t) dt,

where f_X(x) = Σ_k p_X(x_k) δ(x − x_k).

Conditional cdf's and pdf's:
Conditional cdf of X given A:
F_X(x | A) = P[{X ≤ x} ∩ A] / P[A], if P[A] > 0.
Conditional pdf of X given A:
f_X(x | A) = (d/dx) F_X(x | A).

3.4 Some Important Random Variables

Bernoulli random variable: Let A be an event. The indicator function for A is
I_A(ζ) = { 0, ζ ∉ A; 1, ζ ∈ A }.
I_A is the Bernoulli random variable. Ex: toss a coin.

Binomial random variable: Let X be the number of times an event A occurs in n independent trials. Let I_j be the indicator function for event A in the jth trial. Then
X = I_1 + I_2 + ··· + I_n

and P[X = k] = (n choose k) p^k (1 − p)^(n−k), where p = P[I_j = 1].
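A minimal sketch of the binomial pmf (not part of the original notes), using the standard-library binomial coefficient:

```python
import math

def binom_pmf(k, n, p):
    """P[X = k] = C(n, k) p^k (1-p)^(n-k) for a binomial(n, p) variable."""
    return math.comb(n, k) * p**k * (1 - p)**(n - k)

# With n = 3 and p = 1/2 this matches the coin-toss example:
print(binom_pmf(2, 3, 0.5))                            # 3/8 = 0.375
print(sum(binom_pmf(k, 10, 0.3) for k in range(11)))   # pmf sums to 1
```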

Geometric random variable: Count the number M of independent Bernoulli trials until the first success of event A. Then
P[M = k] = (1 − p)^(k−1) p, k = 1, 2, ...,
where p = P[A].
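As an illustrative check (not in the original notes), the geometric probabilities sum to 1 over k = 1, 2, ...; truncating the sum at a large k is numerically exact:

```python
def geom_pmf(k, p):
    """P[M = k] = (1 - p)**(k - 1) * p, for k = 1, 2, ..."""
    return (1 - p)**(k - 1) * p

# the pmf sums to 1; the tail beyond k = 200 is negligible for p = 0.25
total = sum(geom_pmf(k, 0.25) for k in range(1, 201))
print(total)
```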

Uniform random variable (pdf and cdf given in Section 3.3).

Exponential random variable:
f_X(x) = { 0, x < 0; λe^(−λx), x ≥ 0 },
F_X(x) = { 0, x < 0; 1 − e^(−λx), x ≥ 0 }.
λ: the rate at which events occur.


Gaussian (normal) random variable: The pdf is
f_X(x) = (1/(√(2π) σ)) e^(−(x − m)²/2σ²), −∞ < x < ∞,
where m and σ are real numbers. The cdf is
P[X ≤ x] = (1/(√(2π) σ)) ∫_{−∞}^x e^(−(t − m)²/2σ²) dt.
Change variables, t = (x − m)/σ, to obtain
F_X(x) = (1/√(2π)) ∫_{−∞}^{(x−m)/σ} e^(−t²/2) dt = Φ((x − m)/σ),

where Φ(x) = (1/√(2π)) ∫_{−∞}^x e^(−t²/2) dt.


The Q-function is defined by
Q(x) = 1 − Φ(x) = (1/√(2π)) ∫_x^∞ e^(−t²/2) dt.
The Q-function is the probability of the tail of the pdf. Q(0) = 1/2 and Q(−x) = 1 − Q(x). Q(x) can be obtained from look-up tables.
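In place of a look-up table, Φ and Q can be evaluated with the standard-library error function (a sketch, not part of the original notes):

```python
import math

def Phi(x):
    """Standard normal cdf via the error function."""
    return 0.5 * (1 + math.erf(x / math.sqrt(2)))

def Q(x):
    """Tail probability Q(x) = 1 - Phi(x), via the complementary erf."""
    return 0.5 * math.erfc(x / math.sqrt(2))

print(Q(0))           # 0.5
print(Q(-1) + Q(1))   # 1.0, since Q(-x) = 1 - Q(x)
```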

Example: A communication system accepts a positive voltage V as input and outputs a voltage Y = αV + N, where α = 10⁻² and N is a Gaussian random variable with parameters m = 0 and σ = 2. Find the value of V that gives P[Y < 0] = 10⁻⁶.
Sol:
P[Y < 0] = P[αV + N < 0] = P[N < −αV] = Φ(−αV/σ) = Q(αV/σ) = 10⁻⁶.
From the Q-function table, αV/σ = 4.753. Thus, V = (4.753)σ/α = 950.6.
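The table look-up can be reproduced numerically by inverting Q; the bisection below is an illustrative sketch (not the method in the notes), exploiting that Q is decreasing for t > 0:

```python
import math

def Q(x):
    """Gaussian tail probability via the complementary error function."""
    return 0.5 * math.erfc(x / math.sqrt(2))

alpha, sigma, target = 1e-2, 2.0, 1e-6

# Bisect Q(t) = target on [0, 10]; Q is strictly decreasing there.
lo, hi = 0.0, 10.0
for _ in range(100):
    mid = (lo + hi) / 2
    if Q(mid) > target:
        lo = mid      # tail still too heavy, move right
    else:
        hi = mid
t = (lo + hi) / 2     # about 4.753, matching the table value
V = t * sigma / alpha
print(t, V)           # roughly 4.753 and 950.6
```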

3.5 Functions of a Random Variable

Let X be a random variable and define another random variable Y = g(X).
Example: the function h(x) = (x)⁺ defined by
(x)⁺ = { 0, x < 0; x, x ≥ 0 }.
Let B = {x : g(x) ∈ C}. The probability of event C is
P[Y ∈ C] = P[g(X) ∈ C] = P[X ∈ B].
Three types of equivalent events are useful in determining the cdf and pdf:

1. Discontinuity case: {g(X) = y_k};
2. cdf: {g(X) ≤ y};
3. pdf: {y < g(X) ≤ y + h}.

Example: Let X be a sample voltage of a speech waveform, and suppose that X has a uniform distribution in the interval [−4d, 4d]. Let Y = q(X), where the quantizer input-output characteristic is shown below. Find the pmf of Y.

Sol: The event {Y = q} for q in S_Y is equivalent to the event {X ∈ I_q}, where I_q is the interval of points mapped into the representation point q. The pmf of Y is
P[Y = q] = ∫_{I_q} f_X(t) dt = 1/8 for all q.

Example: Let the random variable Y be defined by Y = aX + b, where a is a nonzero constant. Suppose that X has cdf F_X(x); find F_Y(y).
Sol: {Y ≤ y} and A = {aX + b ≤ y} are equivalent events. If a > 0, then A = {X ≤ (y − b)/a}, and thus
F_Y(y) = P[X ≤ (y − b)/a] = F_X((y − b)/a), a > 0.
If a < 0, then A = {X ≥ (y − b)/a} and
F_Y(y) = P[X ≥ (y − b)/a] = 1 − F_X((y − b)/a).

Therefore, we have
f_Y(y) = { (1/a) f_X((y − b)/a), a > 0; −(1/a) f_X((y − b)/a), a < 0 } = (1/|a|) f_X((y − b)/a).


Example: Let X be a Gaussian random variable with mean m and standard deviation σ:
f_X(x) = (1/(√(2π) σ)) e^(−(x − m)²/2σ²), −∞ < x < ∞.
Let Y = aX + b. Find the pdf of Y.
Sol: From the previous example,
f_Y(y) = (1/(√(2π) |a| σ)) e^(−(y − b − am)²/2(aσ)²).
Y also has a Gaussian distribution, with mean am + b and standard deviation |a|σ.

Example: Let the random variable Y be defined by Y = X², where X is a continuous random variable. Find the cdf and pdf of Y.
Sol: The event {Y ≤ y} occurs when {X² ≤ y}, or equivalently {−√y ≤ X ≤ √y}, for y nonnegative. The event is null when y is negative. Then
F_Y(y) = { 0, y < 0; F_X(√y) − F_X(−√y), y ≥ 0 }
and
f_Y(y) = f_X(√y)/(2√y) + f_X(−√y)/(2√y), y > 0.
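The cdf formula F_Y(y) = F_X(√y) − F_X(−√y) can be sanity-checked by simulation; the Monte Carlo sketch below (not in the original notes) takes X standard normal as a concrete case:

```python
import math
import random

random.seed(0)

def Phi(x):
    """Standard normal cdf."""
    return 0.5 * (1 + math.erf(x / math.sqrt(2)))

# X standard normal, Y = X^2; then F_Y(y) = Phi(sqrt(y)) - Phi(-sqrt(y))
y = 1.5
analytic = Phi(math.sqrt(y)) - Phi(-math.sqrt(y))
samples = [random.gauss(0, 1)**2 for _ in range(200_000)]
empirical = sum(s <= y for s in samples) / len(samples)
print(analytic, empirical)   # should agree to about two decimal places
```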


Consider Y = g(X) as shown below. Consider the event C_y = {y < Y < y + dy} and let B_x be its equivalent event on the x-axis. As shown in the figure, g(x) = y has three solutions, and
B_x = {x_1 < X < x_1 + dx_1} ∪ {x_2 < X < x_2 + dx_2} ∪ {x_3 < X < x_3 + dx_3}.
Thus,
P[C_y] = f_Y(y)|dy| = P[B_x] = f_X(x_1)|dx_1| + f_X(x_2)|dx_2| + f_X(x_3)|dx_3|.
In general, we have
f_Y(y) = Σ_k f_X(x)/|dy/dx| |_{x=x_k} = Σ_k f_X(x) |dx/dy| |_{x=x_k}.

Example: Let Y = X². For y ≥ 0, the equation y = x² has two solutions, x_0 = √y and x_1 = −√y. Since dy/dx = 2x, we have
f_Y(y) = f_X(√y)/(2√y) + f_X(−√y)/(2√y).

Example: Let Y = cos(X), where X is uniformly distributed in the interval (0, 2π]. Find the pdf of Y.

Sol: There are two solutions in the interval, x_0 = cos⁻¹(y) and x_1 = 2π − x_0.
|dy/dx| |_{x_0} = |−sin(x_0)| = |sin(cos⁻¹(y))| = √(1 − y²).

Since f_X(x) = 1/(2π),
f_Y(y) = 1/(2π√(1 − y²)) + 1/(2π√(1 − y²)) = 1/(π√(1 − y²)) for −1 < y < 1.
The cdf of Y is
F_Y(y) = { 0, y < −1; 1/2 + sin⁻¹(y)/π, −1 ≤ y ≤ 1; 1, y > 1 }.
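This arcsine-type cdf can be checked by simulation; the sketch below (illustrative, not from the notes) compares the formula against samples of cos(X):

```python
import math
import random

random.seed(1)

# Y = cos(X) with X uniform on (0, 2*pi]; F_Y(y) = 1/2 + arcsin(y)/pi
y = 0.3
analytic = 0.5 + math.asin(y) / math.pi
xs = [random.uniform(0, 2 * math.pi) for _ in range(200_000)]
empirical = sum(math.cos(x) <= y for x in xs) / len(xs)
print(analytic, empirical)   # should agree to about two decimal places
```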

[Figure: plot of f_Y(y) = 1/(π√(1 − y²)) for −1 < y < 1; the pdf grows without bound as y → ±1.]

3.6 Expected Value of Random Variables

The Expected Value of X

The expected value or mean of a random variable X is defined by
E[X] = ∫_{−∞}^{∞} t f_X(t) dt.
If X is a discrete random variable, then
E[X] = Σ_k x_k p_X(x_k).
Note that the integral or sum defining E[X] may not converge.

The mean of a uniform random variable between a and b is
E[X] = ∫_a^b t/(b − a) dt = (a + b)/2,
so E[X] is the midpoint of the interval [a, b].
If the pdf of X is symmetric about a point m, then E[X] = m. That is, when f_X(m − x) = f_X(m + x), we have
0 = ∫_{−∞}^{∞} (m − t) f_X(t) dt = m − ∫_{−∞}^{∞} t f_X(t) dt.

The pdf of a Gaussian random variable is symmetric about x = m. Therefore, E[X] = m.

Exercise: Show that if X is a nonnegative random variable, then
E[X] = ∫_0^∞ (1 − F_X(t)) dt, if X is continuous and nonnegative,
and
E[X] = Σ_{k=0}^{∞} P[X > k], if X is nonnegative and integer-valued.
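The integer-valued tail-sum identity can be checked numerically for a geometric random variable, where P[M > k] = (1 − p)^k and E[M] = 1/p (a sketch, not part of the exercise's intended proof):

```python
p = 0.3

# direct definition of the mean: sum of k * P[M = k], truncated where
# the terms are numerically negligible
mean_direct = sum(k * (1 - p)**(k - 1) * p for k in range(1, 500))

# tail-sum formula: E[M] = sum over k >= 0 of P[M > k] = (1 - p)^k
mean_tail = sum((1 - p)**k for k in range(500))

print(mean_direct, mean_tail, 1 / p)   # all approximately equal
```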

Expected value of Y = g(X)

Let Y = g(X), where X is a random variable with pdf f_X(x). Y is also a random variable, and the mean of Y is
E[Y] = ∫_{−∞}^{∞} g(x) f_X(x) dx.

Variance of X

The variance of the random variable X is defined by
VAR[X] = E[(X − E[X])²].
The standard deviation of X,
STD[X] = VAR[X]^(1/2),
is a measure of the spread of a distribution.
Simplification:
VAR[X] = E[X² − 2E[X]X + E[X]²] = E[X²] − 2E[X]E[X] + E[X]² = E[X²] − E[X]².

Example: Find the variance of the random variable X that is uniformly distributed in the interval [a, b]. Since E[X] = (a + b)/2,
VAR[X] = (1/(b − a)) ∫_a^b (x − (a + b)/2)² dx.
Let y = x − (a + b)/2. Then
VAR[X] = (1/(b − a)) ∫_{−(b−a)/2}^{(b−a)/2} y² dy = (b − a)²/12.
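Both moments of the uniform distribution can be verified by a simple midpoint-rule integration (an illustrative numerical sketch with an arbitrary interval, not part of the notes):

```python
# Numerical check of E[X] = (a + b)/2 and VAR[X] = (b - a)^2 / 12
# for the uniform density 1/(b - a) on [a, b].
a, b = 2.0, 5.0
n = 200_000
dx = (b - a) / n
xs = [a + (i + 0.5) * dx for i in range(n)]      # midpoints of subintervals
mean = sum(x * dx / (b - a) for x in xs)
var = sum((x - mean)**2 * dx / (b - a) for x in xs)
print(mean, var)   # approximately 3.5 and 0.75
```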

Example: Find the variance of a Gaussian random variable. Multiply the integral of the pdf of X by √(2π) σ to obtain
∫_{−∞}^{∞} e^(−(x − m)²/2σ²) dx = √(2π) σ.
Differentiate both sides with respect to σ to get
∫_{−∞}^{∞} ((x − m)²/σ³) e^(−(x − m)²/2σ²) dx = √(2π).
Then
VAR[X] = (1/(√(2π) σ)) ∫_{−∞}^{∞} (x − m)² e^(−(x − m)²/2σ²) dx = σ².

Properties: Let c be a constant. Then
VAR[c] = 0, VAR[X + c] = VAR[X], VAR[cX] = c² VAR[X].
The nth moment of the random variable X is given by
E[Xⁿ] = ∫_{−∞}^{∞} xⁿ f_X(x) dx.

3.7 Markov and Chebyshev Inequalities

Markov inequality: Suppose X is a nonnegative random variable with mean E[X]. Then for a > 0,
P[X ≥ a] ≤ E[X]/a,
since
E[X] = ∫_0^a t f_X(t) dt + ∫_a^∞ t f_X(t) dt ≥ ∫_a^∞ t f_X(t) dt ≥ ∫_a^∞ a f_X(t) dt = a P[X ≥ a].

Chebyshev inequality: Consider a random variable X with E[X] = m and VAR[X] = σ². Then
P[|X − m| ≥ a] ≤ σ²/a².
Proof: Let D² = (X − m)². The Markov inequality applied to D² gives
P[D² ≥ a²] ≤ E[(X − m)²]/a² = σ²/a².
Since {D² ≥ a²} and {|X − m| ≥ a} are equivalent events, the result follows.
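The Chebyshev bound is typically loose; a quick empirical sketch for a Gaussian (illustrative only, not from the notes) makes this visible:

```python
import random

random.seed(2)

# Empirical check of Chebyshev's bound for a standard Gaussian
# (m = 0, sigma = 1): P[|X - m| >= a] <= sigma^2 / a^2.
m, sigma, a = 0.0, 1.0, 2.0
samples = [random.gauss(m, sigma) for _ in range(100_000)]
empirical = sum(abs(s - m) >= a for s in samples) / len(samples)
bound = sigma**2 / a**2
print(empirical, bound)   # roughly 0.046 versus the bound 0.25
```

The true value P[|X| ≥ 2] ≈ 0.0455 for a Gaussian, so the bound 0.25 overestimates it by a factor of about five; Chebyshev trades tightness for distribution-free generality.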