Lecture Notes 2: Random Variables. Discrete Random Variables: Probability mass function (pmf)


Lecture Notes 2: Random Variables

Definition
Discrete Random Variables: Probability mass function (pmf)
Continuous Random Variables: Probability density function (pdf)
Mean and Variance
Cumulative Distribution Function (cdf)
Functions of Random Variables

Corresponding pages from the B&T textbook: 72-83, 86, 88, 90, 40-44, 46-50, 52-57, 79-86.

EE 178/278A: Random Variables

Random Variable

A random variable is a real-valued variable that takes on values randomly. This sounds nice, but it is not terribly precise or useful.

Mathematically, a random variable (r.v.) X is a real-valued function X(ω) over the sample space Ω of a random experiment, i.e., X : Ω → R.

Randomness comes from the fact that outcomes are random; X(ω) itself is a deterministic function.

Notation: Always use upper-case letters for random variables (X, Y, ...) and lower-case letters for their values: X = x means that the random variable X takes on the value x.

Examples:

1. Flip a coin n times. Here Ω = {H,T}ⁿ. Define the random variable X ∈ {0, 1, 2, ..., n} to be the number of heads.

2. Roll a 4-sided die twice.
(a) Define the random variable X as the maximum of the two rolls.
[Figure: the 16 outcome pairs (1st roll, 2nd roll) mapped to points on the real line]
(b) Define the random variable Y to be the sum of the outcomes of the two rolls.
(c) Define the random variable Z to be 0 if the sum of the two rolls is odd and 1 if it is even.

3. Flip a coin until the first heads shows up. Define the random variable X ∈ {1, 2, ...} to be the number of flips until the first heads.

4. Let Ω = R. Define the two random variables
(a) X = ω
(b) Y = +1 for ω ≥ 0, and −1 otherwise

5. n packets arrive at a node in a communication network. Here Ω is the set of arrival time sequences (t₁, t₂, ..., tₙ) ∈ (0, ∞)ⁿ.
(a) Define the random variable N to be the number of packets arriving in the interval (0, 1].
(b) Define the random variable T to be the first interarrival time.

Why do we need random variables?

A random variable can represent the gain or loss in a random experiment, e.g., the stock market, or a measurement over a random experiment, e.g., the noise voltage on a resistor.

In most applications we care more about these costs/measurements than about the underlying probability space, and very often we work directly with random variables without knowing (or caring to know) that space.

Specifying a Random Variable

Specifying a random variable means being able to determine the probability that X ∈ A for any event A ⊂ R, e.g., any interval.

To do so, we consider the inverse image of the set A under X(ω), i.e., {ω : X(ω) ∈ A}.

So X ∈ A iff ω ∈ {ω : X(ω) ∈ A}, and thus P({X ∈ A}) = P({ω : X(ω) ∈ A}), written in short as P{X ∈ A} = P{ω : X(ω) ∈ A}.

Example: Roll a fair 4-sided die twice independently. Define the r.v. X to be the maximum of the two rolls. What is P{0.5 < X ≤ 2}? The event corresponds to max ∈ {1, 2}, i.e., to 4 of the 16 equally likely outcome pairs, so the probability is 4/16 = 1/4.

We classify r.v.s as:

Discrete: X can assume only one of a countable number of values. Such an r.v. can be specified by a probability mass function (pmf). Examples 1, 2, 3, 4(b), and 5(a) are discrete r.v.s.

Continuous: X can assume one of a continuum of values, and the probability of each value is 0. Such an r.v. can be specified by a probability density function (pdf). Examples 4(a) and 5(b) are continuous r.v.s.

Mixed: X is neither discrete nor continuous. Such an r.v. (as well as discrete and continuous r.v.s) can be specified by a cumulative distribution function (cdf).

Discrete Random Variables

A random variable is said to be discrete if there is a countable set X = {x₁, x₂, ...} ⊂ R such that P{X ∈ X} = 1. Examples 1, 2, 3, 4(b), and 5(a) are discrete random variables.

Here X(ω) partitions Ω into the sets {ω : X(ω) = xᵢ} for i = 1, 2, .... Therefore, to specify X, it suffices to know P{X = xᵢ} for all i.

A discrete random variable is thus completely specified by its probability mass function (pmf):

p_X(x) = P{X = x} for all x ∈ X.

Clearly p_X(x) ≥ 0 and ∑_{x∈X} p_X(x) = 1.

Example: Roll a fair 4-sided die twice independently, and define the r.v. X to be the maximum of the two rolls. Counting outcome pairs gives p_X(1) = 1/16, p_X(2) = 3/16, p_X(3) = 5/16, p_X(4) = 7/16.
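As a quick sanity check, this pmf can be computed by brute-force enumeration of the 16 equally likely outcome pairs (a minimal Python sketch; the variable names are ours, not from the notes):

```python
from fractions import Fraction
from itertools import product

# Enumerate all 16 equally likely outcomes of two independent rolls of a
# fair 4-sided die and tabulate the pmf of X = maximum of the two rolls.
pmf = {x: Fraction(0) for x in range(1, 5)}
for r1, r2 in product(range(1, 5), repeat=2):
    pmf[max(r1, r2)] += Fraction(1, 16)

# The result matches p_X(x) = (2x - 1)/16 for x = 1, 2, 3, 4.
```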

Note that p_X(x) can be viewed as a probability measure over a discrete sample space (even though the original sample space may be continuous, as in examples 4(b) and 5(a)).

The probability of any event A ⊂ R is given by P{X ∈ A} = ∑_{x ∈ A∩X} p_X(x).

For the previous example, P{1 < X ≤ 2.5 or X ≥ 3.5} = p_X(2) + p_X(4) = 3/16 + 7/16 = 5/8.

Notation: We use X ∼ p_X(x), or simply X ∼ p(x), to mean that the discrete random variable X has pmf p_X(x) or p(x).

Famous Probability Mass Functions

Bernoulli: X ∼ Bern(p) for 0 ≤ p ≤ 1 has pmf p_X(1) = p and p_X(0) = 1 − p.

Geometric: X ∼ Geom(p) for 0 ≤ p ≤ 1 has pmf

p_X(k) = p(1 − p)^(k−1) for k = 1, 2, ....

This r.v. represents, for example, the number of coin flips until the first heads shows up (assuming independent coin flips).

Binomial: X ∼ B(n, p) for integer n > 0 and 0 ≤ p ≤ 1 has pmf

p_X(k) = (n choose k) p^k (1 − p)^(n−k) for k = 0, 1, 2, ..., n.

The maximum of p_X(k) is attained at k* = (n+1)p and (n+1)p − 1 if (n+1)p is an integer, and at k* = ⌊(n+1)p⌋ otherwise.

The binomial r.v. represents, for example, the number of heads in n independent coin flips (see Lecture Notes 1).

Poisson: X ∼ Poisson(λ) for λ > 0 (called the rate) has pmf

p_X(k) = (λ^k / k!) e^(−λ) for k = 0, 1, 2, ....

The maximum of p_X(k) is attained at k* = λ and λ − 1 if λ is an integer, and at k* = ⌊λ⌋ otherwise.

The Poisson r.v. often represents the number of random events, e.g., arrivals of packets, photons, customers, etc., in some time interval, e.g., (0, 1].
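The binomial pmf and its mode formula are easy to check numerically (a small sketch; the particular n and p below are our choice for illustration):

```python
import math

def binom_pmf(n, p, k):
    # p_X(k) = C(n, k) p^k (1 - p)^(n - k)
    return math.comb(n, k) * p**k * (1 - p)**(n - k)

n, p = 10, 0.3
pmf = [binom_pmf(n, p, k) for k in range(n + 1)]
mode = max(range(n + 1), key=lambda k: pmf[k])
# (n + 1) p = 3.3 is not an integer, so the mode should be floor(3.3) = 3.
```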

Poisson is the limit of Binomial with p = λ/n as n → ∞.

To show this, let Xₙ ∼ B(n, λ/n) for λ > 0. For any fixed nonnegative integer k,

p_{Xₙ}(k) = (n choose k) (λ/n)^k (1 − λ/n)^(n−k)
= [n(n−1)···(n−k+1) / k!] (λ^k / n^k) (1 − λ/n)^n (1 − λ/n)^(−k)
= [n(n−1)···(n−k+1) / (n−λ)^k] (λ^k / k!) (1 − λ/n)^n
→ (λ^k / k!) e^(−λ) as n → ∞,

since n(n−1)···(n−k+1)/(n−λ)^k → 1 and (1 − λ/n)^n → e^(−λ).

Continuous Random Variables

Suppose an r.v. X can take on a continuum of values, each with probability 0. Examples: pick a number between 0 and 1; measure the voltage across a heated resistor; measure the phase of a random sinusoid; ....

How do we describe probabilities of interesting events?

Idea: For a discrete r.v., we sum a pmf over the points in a set to find its probability. For a continuous r.v., we integrate a probability density over a set to find its probability, analogous to mass density in physics (integrate mass density to get the mass).
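The convergence above can be observed numerically: for a fixed k, the B(n, λ/n) pmf approaches the Poisson(λ) pmf as n grows (a sketch; λ = 2 and k = 3 are our illustrative choices):

```python
import math

lam, k = 2.0, 3
poisson = lam**k / math.factorial(k) * math.exp(-lam)  # Poisson(2) pmf at k = 3

def binom_pmf(n, p, k):
    return math.comb(n, k) * p**k * (1 - p)**(n - k)

# The approximation error |B(n, lam/n)(k) - Poisson(lam)(k)| shrinks with n.
errors = [abs(binom_pmf(n, lam / n, k) - poisson) for n in (10, 100, 10_000)]
```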

Probability Density Function

A continuous r.v. X can be specified by a probability density function f_X(x) (pdf) such that, for any event A,

P{X ∈ A} = ∫_A f_X(x) dx.

For example, for A = (a, b], the probability can be computed as

P{X ∈ (a, b]} = ∫_a^b f_X(x) dx.

Properties of f_X(x):
1. f_X(x) ≥ 0.
2. ∫_{−∞}^{∞} f_X(x) dx = 1.

Important note: f_X(x) should not be interpreted as the probability that X = x. In fact, f_X(x) is not a probability measure, e.g., it can be > 1.

We can relate f_X(x) to a probability using the mean value theorem for integrals: fix x and some Δx > 0. Then, provided f_X is continuous over (x, x + Δx],

P{X ∈ (x, x + Δx]} = ∫_x^{x+Δx} f_X(α) dα = f_X(c) Δx for some c ∈ (x, x + Δx].

Now, if Δx is sufficiently small, then P{X ∈ (x, x + Δx]} ≈ f_X(x) Δx.

Notation: X ∼ f_X(x) means that X has pdf f_X(x).

Famous Probability Density Functions

Uniform: X ∼ U[a, b] for b > a has pdf

f_X(x) = 1/(b − a) for a ≤ x ≤ b, and 0 otherwise.

The uniform r.v. is commonly used to model quantization noise and finite-precision computation error (roundoff error).

Exponential: X ∼ Exp(λ) for λ > 0 has pdf

f_X(x) = λ e^(−λx) for x ≥ 0, and 0 otherwise.

The exponential r.v. is commonly used to model the interarrival time in a queue, i.e., the time between two consecutive packet or customer arrivals, the service time in a queue, the lifetime of a particle, etc.

Example: Let X ∼ Exp(0.1) be the customer service time at a bank (in minutes). The person ahead of you has been served for 5 minutes. What is the probability that you will wait another 5 minutes or more before getting served? We want to find P{X > 10 | X > 5}.

Solution: By the definition of conditional probability,

P{X > 10 | X > 5} = P{X > 10, X > 5} / P{X > 5} = P{X > 10} / P{X > 5}
= (∫_{10}^{∞} 0.1 e^(−0.1x) dx) / (∫_{5}^{∞} 0.1 e^(−0.1x) dx) = e^(−1) / e^(−0.5) = e^(−0.5).

But P{X > 5} = e^(−0.5), i.e., the conditional probability of waiting more than 5 additional minutes, given that you have already waited more than 5 minutes, is the same as the unconditional probability of waiting more than 5 minutes!

This is because the exponential r.v. is memoryless, which in general means that for any 0 < x₁ < x₂,

P{X > x₂ | X > x₁} = P{X > x₂ − x₁}.

To show this, consider

P{X > x₂ | X > x₁} = P{X > x₂, X > x₁} / P{X > x₁} = P{X > x₂} / P{X > x₁}
= (∫_{x₂}^{∞} λe^(−λx) dx) / (∫_{x₁}^{∞} λe^(−λx) dx) = e^(−λx₂) / e^(−λx₁) = e^(−λ(x₂−x₁)) = P{X > x₂ − x₁}.
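Memorylessness in the bank example can also be seen by simulation: the conditional tail probability P{X > 10 | X > 5} and the unconditional P{X > 5} come out numerically equal (a Monte Carlo sketch; the seed and sample size are arbitrary choices):

```python
import math
import random

random.seed(1)
lam = 0.1  # service rate from the bank example (mean service time 10 min)

# Draw exponential service times and estimate both probabilities.
samples = [random.expovariate(lam) for _ in range(200_000)]
over5 = [x for x in samples if x > 5]
cond = sum(x > 10 for x in over5) / len(over5)        # P{X > 10 | X > 5}
uncond = sum(x > 5 for x in samples) / len(samples)   # P{X > 5}
theory = math.exp(-0.5)                               # both should be near this
```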

Gaussian: X ∼ N(µ, σ²) has pdf

f_X(x) = (1/√(2πσ²)) e^(−(x−µ)²/(2σ²)) for −∞ < x < ∞,

where µ is the mean and σ² is the variance.

Gaussian r.v.s are frequently encountered in nature, e.g., thermal and shot noise in electronic devices are Gaussian, and they are very frequently used in modelling various social, biological, and other phenomena.

Mean and Variance

A discrete (continuous) r.v. is completely specified by its pmf (pdf). It is often desirable, however, to summarize the r.v. or predict its outcome in terms of one or a few numbers. What do we expect the value of the r.v. to be? What range of values around the mean do we expect the r.v. to take? Such information can be provided by the mean and standard deviation of the r.v.

First we consider discrete r.v.s. Let X ∼ p_X(x). The expected value (or mean) of X is defined as

E(X) = ∑_{x∈X} x p_X(x).

Interpretations: If we view probabilities as relative frequencies, the mean is the sum of the values weighted by their relative frequencies. If we view probabilities as point masses, the mean is the center of mass of the set of mass points.

Example: If the weather is good, which happens with probability 0.6, Alice walks the 2 miles to class at a speed of 5 miles/hr; otherwise she rides a bus at a speed of 30 miles/hr. What is the expected time to get to class?

Solution: Define the discrete r.v. T to take the value (2/5) hr with probability 0.6 and (2/30) hr with probability 0.4. The expected value of T is

E(T) = (2/5) × 0.6 + (2/30) × 0.4 = 4/15 hr.

The second moment (or mean square or average power) of X is defined as

E(X²) = ∑_{x∈X} x² p_X(x).

For the previous example, the second moment is

E(T²) = (2/5)² × 0.6 + (2/30)² × 0.4 = 22/225 hr².

The variance of X is defined as

Var(X) = E[(X − E(X))²] = ∑_{x∈X} (x − E(X))² p_X(x).

Note that Var(X) ≥ 0. The variance has the interpretation of the moment of inertia about the center of mass for a set of mass points.

The standard deviation of X is defined as σ_X = √Var(X).

Variance in terms of the mean and second moment: Expanding the square and using the linearity of the sum, we obtain

Var(X) = E[(X − E(X))²] = ∑ (x − E(X))² p_X(x)
= ∑ (x² − 2x E(X) + [E(X)]²) p_X(x)
= ∑ x² p_X(x) − 2E(X) ∑ x p_X(x) + [E(X)]² ∑ p_X(x)
= E(X²) − 2[E(X)]² + [E(X)]² = E(X²) − [E(X)]².

Since Var(X) ≥ 0 for any r.v., it follows that E(X²) ≥ (E(X))².

So, for our example, Var(T) = 22/225 − (4/15)² = 2/75 ≈ 0.02667 hr².
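The moments of Alice's travel time can be checked exactly with rational arithmetic (a short sketch of the definitions above applied to this two-point pmf):

```python
from fractions import Fraction

# Travel-time r.v. T from the example: 2/5 hr w.p. 0.6, 2/30 hr w.p. 0.4.
pmf = {Fraction(2, 5): Fraction(3, 5), Fraction(2, 30): Fraction(2, 5)}

mean = sum(t * p for t, p in pmf.items())        # E(T)
second = sum(t**2 * p for t, p in pmf.items())   # E(T^2)
var = second - mean**2                           # Var(T) = E(T^2) - E(T)^2
```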

Mean and Variance of Famous Discrete RVs

Bernoulli r.v. X ∼ Bern(p): The mean is

E(X) = p × 1 + (1 − p) × 0 = p.

The second moment is

E(X²) = p × 1² + (1 − p) × 0² = p.

Thus the variance is

Var(X) = E(X²) − (E(X))² = p − p² = p(1 − p).

Binomial r.v. X ∼ B(n, p): It is not easy to find the mean and variance using the definition. Later we use a much simpler method to show that E(X) = np and Var(X) = np(1 − p): just n times the mean (variance) of a Bernoulli!

Geometric r.v. X ∼ Geom(p): The mean is

E(X) = ∑_{k=1}^{∞} k p(1 − p)^(k−1) = (p/(1 − p)) ∑_{k=1}^{∞} k(1 − p)^k = (p/(1 − p)) ∑_{k=0}^{∞} k(1 − p)^k = (p/(1 − p)) × (1 − p)/p² = 1/p.

Note: We found the series sum using the following trick. Recall that for |a| < 1, the sum of the geometric series is ∑_{k=0}^{∞} a^k = 1/(1 − a).

Suppose we want to evaluate ∑_{k=0}^{∞} k a^k. Multiply both sides of the geometric series sum by a to obtain

∑_{k=0}^{∞} a^(k+1) = a/(1 − a).

Now differentiate both sides with respect to a. The LHS gives

(d/da) ∑_{k=0}^{∞} a^(k+1) = ∑_{k=0}^{∞} (k + 1) a^k = ∑_{k=0}^{∞} a^k + ∑_{k=0}^{∞} k a^k = 1/(1 − a) + ∑_{k=0}^{∞} k a^k.

The RHS gives

(d/da) [a/(1 − a)] = 1/(1 − a)².

Thus we have

∑_{k=0}^{∞} k a^k = 1/(1 − a)² − 1/(1 − a) = a/(1 − a)².

The second moment can be similarly evaluated to obtain E(X²) = (2 − p)/p². The variance is thus

Var(X) = E(X²) − (E(X))² = (2 − p)/p² − 1/p² = (1 − p)/p².
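The geometric moments derived above can be confirmed with partial sums of the defining series (a sketch; p = 0.25 is our illustrative choice, and the series is truncated where its tail is negligible):

```python
# Partial-sum check of E(X) = 1/p and Var(X) = (1 - p)/p^2 for X ~ Geom(p).
p = 0.25
mean = sum(k * p * (1 - p) ** (k - 1) for k in range(1, 2000))
second = sum(k**2 * p * (1 - p) ** (k - 1) for k in range(1, 2000))
var = second - mean**2
# Expected: mean = 4, second = (2 - p)/p^2 = 28, var = 12.
```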

Poisson r.v. X ∼ Poisson(λ): The mean is given by

E(X) = ∑_{k=0}^{∞} k (λ^k/k!) e^(−λ) = e^(−λ) ∑_{k=1}^{∞} k λ^k/k! = e^(−λ) ∑_{k=1}^{∞} λ^k/(k−1)! = λ e^(−λ) ∑_{k=1}^{∞} λ^(k−1)/(k−1)! = λ e^(−λ) ∑_{k=0}^{∞} λ^k/k! = λ e^(−λ) e^λ = λ.

One can show that the variance is also equal to λ.

Mean and Variance for Continuous RVs

Now consider a continuous r.v. X ∼ f_X(x). The expected value (or mean) of X is defined as

E(X) = ∫_{−∞}^{∞} x f_X(x) dx.

This has the interpretation of the center of mass for a mass density.

The second moment and variance are similarly defined as

E(X²) = ∫_{−∞}^{∞} x² f_X(x) dx,
Var(X) = E[(X − E(X))²] = E(X²) − (E(X))².

Uniform r.v. X ∼ U[a, b]: The mean, second moment, and variance are given by

E(X) = ∫ x f_X(x) dx = ∫_a^b (x/(b − a)) dx = (b² − a²)/(2(b − a)) = (a + b)/2,
E(X²) = ∫ x² f_X(x) dx = ∫_a^b (x²/(b − a)) dx = (b³ − a³)/(3(b − a)),
Var(X) = E(X²) − (E(X))² = (b³ − a³)/(3(b − a)) − (a + b)²/4 = (b − a)²/12.

Thus, for X ∼ U[0, 1], E(X) = 1/2 and Var(X) = 1/12.

Exponential r.v. X ∼ Exp(λ): The mean and variance are given by

E(X) = ∫_0^∞ x λe^(−λx) dx = [−x e^(−λx)]_0^∞ + ∫_0^∞ e^(−λx) dx = 1/λ (integration by parts),
E(X²) = ∫_0^∞ x² λe^(−λx) dx = 2/λ²,
Var(X) = 2/λ² − 1/λ² = 1/λ².

For a Gaussian r.v. X ∼ N(µ, σ²), the mean is µ and the variance is σ² (we will show this later using transforms).
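The uniform moments above can be checked by numerical integration of the defining integrals (a midpoint-rule sketch; the interval [2, 5] and step count are our illustrative choices):

```python
# Midpoint-rule check of the U[a, b] mean and variance formulas.
a, b = 2.0, 5.0
n = 100_000
dx = (b - a) / n
xs = [a + (i + 0.5) * dx for i in range(n)]  # midpoints of the subintervals
density = 1 / (b - a)                        # constant pdf on [a, b]

mean = sum(x * density * dx for x in xs)       # approximates E(X)
second = sum(x * x * density * dx for x in xs) # approximates E(X^2)
var = second - mean**2
# Expected: mean = (a + b)/2 = 3.5, var = (b - a)^2 / 12 = 0.75.
```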

Mean and Variance for Famous r.v.s

Random Variable | Mean    | Variance
Bern(p)         | p       | p(1 − p)
Geom(p)         | 1/p     | (1 − p)/p²
B(n, p)         | np      | np(1 − p)
Poisson(λ)      | λ       | λ
U[a, b]         | (a+b)/2 | (b − a)²/12
Exp(λ)          | 1/λ     | 1/λ²
N(µ, σ²)        | µ       | σ²

Cumulative Distribution Function (cdf)

For discrete r.v.s we use pmfs; for continuous r.v.s we use pdfs. Many real-world r.v.s are mixed, that is, have both discrete and continuous components.

Example: A packet arrives at a router in a communication network. If the input buffer is empty (which happens with probability p), the packet is serviced immediately. Otherwise, the packet must wait for a random amount of time as characterized by a pdf. Define the r.v. X to be the packet service time. X is neither discrete nor continuous.

There is a third probability function that characterizes all random variable types (discrete, continuous, and mixed): the cumulative distribution function, or cdf, F_X(x) of a random variable, defined by

F_X(x) = P{X ≤ x} for x ∈ (−∞, ∞).

Properties of the cdf:

Like the pmf (but unlike the pdf), the cdf is the probability of something. Hence 0 ≤ F_X(x) ≤ 1.

The normalization axiom implies that F_X(∞) = 1 and F_X(−∞) = 0.

F_X(x) is monotonically nondecreasing, i.e., if a > b then F_X(a) ≥ F_X(b).

The probability of any event can be easily computed from a cdf, e.g., for an interval (a, b],

P{X ∈ (a, b]} = P{a < X ≤ b} = P{X ≤ b} − P{X ≤ a} (additivity) = F_X(b) − F_X(a).

The probability of any single outcome a is P{X = a} = F_X(a) − F_X(a⁻).

If an r.v. is discrete, its cdf consists of a set of steps, with a jump of p_X(x) at each x in its range.

If X is a continuous r.v. with pdf f_X(x), then

F_X(x) = P{X ≤ x} = ∫_{−∞}^x f_X(α) dα.

So the cdf of a continuous r.v. X is continuous.
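For a discrete r.v., the staircase cdf and the interval formula F_X(b) − F_X(a) are easy to illustrate with the running max-of-two-rolls example (a short sketch; the function names are ours):

```python
from fractions import Fraction

# cdf of the discrete r.v. X = max of two fair 4-sided die rolls,
# whose pmf is p_X(x) = (2x - 1)/16 for x = 1, 2, 3, 4.
pmf = {x: Fraction(2 * x - 1, 16) for x in range(1, 5)}

def cdf(t):
    # F_X(t) = P{X <= t}: sum the jumps at all mass points <= t.
    return sum(p for x, p in pmf.items() if x <= t)

# P{1 < X <= 3} = F_X(3) - F_X(1)
prob_1_to_3 = cdf(3) - cdf(1)
```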

In fact, the precise way to define a continuous random variable is through the continuity of its cdf. Further, if F_X(x) is differentiable (almost everywhere), then

f_X(x) = dF_X(x)/dx = lim_{Δx→0} (F_X(x + Δx) − F_X(x))/Δx = lim_{Δx→0} P{x < X ≤ x + Δx}/Δx.

The cdf of a mixed r.v. has a general form combining jumps (the discrete part) with continuously increasing segments (the continuous part).

Examples

cdf of a uniform r.v.:

F_X(x) = 0 if x < a; ∫_a^x (1/(b − a)) dα = (x − a)/(b − a) if a ≤ x ≤ b; and 1 if x > b.

cdf of an exponential r.v.: F_X(x) = 0 for x < 0, and F_X(x) = 1 − e^(−λx) for x ≥ 0.

cdf of a mixed r.v.: Let X be the service time of a packet at a router. If the buffer is empty (which happens with probability p), the packet is serviced immediately. If it is not empty, the service time is described by an exponential pdf with parameter λ > 0.

The cdf of X is

F_X(x) = 0 if x < 0; p if x = 0; and p + (1 − p)(1 − e^(−λx)) if x > 0.

cdf of a Gaussian r.v.: There is no nice closed form for the cdf of a Gaussian r.v., but there are many published tables for the cdf of the standard normal N(0, 1), the Φ function:

Φ(x) = ∫_{−∞}^x (1/√(2π)) e^(−ξ²/2) dξ.

More commonly, the tables are for the function Q(x) = 1 − Φ(x), or for the complementary error function, erfc(x) = 2Q(√2 x) for x > 0.

As we shall see, the Q(·) function can be used to quickly compute P{X > a} for any Gaussian r.v. X.

Functions of a Random Variable

We are often given an r.v. X with a known distribution (pmf, cdf, or pdf) and some function y = g(x) of x, e.g., x², |x|, cos x, etc., and wish to specify Y = g(X), i.e., find its pmf if it is discrete, its pdf if it is continuous, or its cdf.

Each such function is a random variable defined over the original experiment as Y(ω) = g(X(ω)). However, since we do not assume knowledge of the sample space or the probability measure, we need to specify Y directly from the pmf, pdf, or cdf of X.

First assume X ∼ p_X(x), i.e., a discrete random variable. Then Y is also discrete and can be described by a pmf p_Y(y). To find it, we find the probability of the inverse image {ω : Y(ω) = y} for every y.
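Instead of tables, the relation between Q and erfc gives a quick numerical evaluator via the standard library (a sketch using Python's math.erfc):

```python
import math

def Q(x):
    # Q(x) = 1 - Phi(x) = 0.5 * erfc(x / sqrt(2)) for a standard normal.
    return 0.5 * math.erfc(x / math.sqrt(2))

# Symmetry checks: Q(0) = 1/2 and Q(-x) = 1 - Q(x).
half = Q(0.0)
```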

Thus

p_Y(y) = P{ω : Y(ω) = y} = ∑_{ω: g(X(ω))=y} P{ω} = ∑_{x: g(x)=y} ∑_{ω: X(ω)=x} P{ω} = ∑_{x: g(x)=y} p_X(x).

We can therefore derive p_Y(y) directly from p_X(x) without going back to the original random experiment.

Example: Let the r.v. X be the maximum of two independent rolls of a four-sided die. Define a new random variable Y = g(X), where g(x) = 1 if x ≥ 3 and 0 otherwise. Find the pmf of Y.

Solution:

p_Y(1) = ∑_{x: x ≥ 3} p_X(x) = 5/16 + 7/16 = 3/4,
p_Y(0) = 1 − p_Y(1) = 1/4.
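The derived-pmf rule p_Y(y) = ∑_{x: g(x)=y} p_X(x) translates directly into code for this example (a minimal sketch; variable names are ours):

```python
from fractions import Fraction

# X = max of two fair 4-sided die rolls; Y = g(X) with g(x) = 1{x >= 3}.
p_X = {x: Fraction(2 * x - 1, 16) for x in range(1, 5)}

def g(x):
    return 1 if x >= 3 else 0

# Accumulate p_X(x) over the inverse image {x : g(x) = y} for each y.
p_Y = {}
for x, p in p_X.items():
    y = g(x)
    p_Y[y] = p_Y.get(y, Fraction(0)) + p
```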

Derived Densities

Let X be a continuous r.v. with pdf f_X(x), and let Y = g(X) be such that Y is a continuous r.v. We wish to find f_Y(y).

Recall the derived-pmf approach: given p_X(x) and a function Y = g(X), the pmf of Y is given by p_Y(y) = ∑_{x: g(x)=y} p_X(x), i.e., p_Y(y) is the sum of p_X(x) over all x that yield g(x) = y.

This idea does not extend immediately to deriving pdfs, since pdfs are not probabilities and we cannot add probabilities of points. But the basic idea does extend to cdfs.

We can first calculate the cdf of Y as

F_Y(y) = P{g(X) ≤ y} = ∫_{x: g(x) ≤ y} f_X(x) dx,

and then differentiate to obtain the pdf

f_Y(y) = dF_Y(y)/dy.

The hard part is typically getting the limits on the integral correct. Often they are obvious, but sometimes they are more subtle.

Example: Linear Functions

Let X ∼ f_X(x) and Y = aX + b for some a > 0 and b. (The case a < 0 is left as an exercise.) To find the pdf of Y, we use the above procedure:

F_Y(y) = P{Y ≤ y} = P{aX + b ≤ y} = P{X ≤ (y − b)/a} = F_X((y − b)/a),
f_Y(y) = dF_Y(y)/dy = (1/a) f_X((y − b)/a).

One can show that for general a ≠ 0,

f_Y(y) = (1/|a|) f_X((y − b)/a).

Example: X ∼ Exp(λ), i.e., f_X(x) = λe^(−λx) for x ≥ 0. Then

f_Y(y) = (λ/a) e^(−λ(y−b)/a) if (y − b)/a ≥ 0, and 0 otherwise.

Example: X ∼ N(µ, σ²), i.e., f_X(x) = (1/√(2πσ²)) e^(−(x−µ)²/(2σ²)). Again setting Y = aX + b,

f_Y(y) = (1/|a|) f_X((y − b)/a) = (1/(|a|√(2πσ²))) e^(−((y−b)/a − µ)²/(2σ²)) = (1/√(2π(aσ)²)) e^(−(y−b−aµ)²/(2a²σ²)) = (1/√(2πσ_Y²)) e^(−(y−µ_Y)²/(2σ_Y²)) for −∞ < y < ∞.

Therefore Y ∼ N(aµ + b, a²σ²), with µ_Y = aµ + b and σ_Y² = a²σ².

This result can be used to compute probabilities for an arbitrary Gaussian r.v. from knowledge of the distribution of an N(0, 1) r.v. Suppose Y ∼ N(µ_Y, σ_Y²) and we wish to find P{Y > y}. From the above result, we can express Y = σ_Y X + µ_Y, where X ∼ N(0, 1). Thus

P{Y > y} = P{σ_Y X + µ_Y > y} = P{X > (y − µ_Y)/σ_Y} = Q((y − µ_Y)/σ_Y).
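The standardization formula P{Y > y} = Q((y − µ_Y)/σ_Y) can be cross-checked against a direct simulation (a sketch; the parameters µ_Y = 3, σ_Y = 2, y = 5 and the seed are our illustrative choices):

```python
import math
import random

def Q(x):
    # Standard normal tail: Q(x) = 0.5 * erfc(x / sqrt(2)).
    return 0.5 * math.erfc(x / math.sqrt(2))

mu, sigma, y = 3.0, 2.0, 5.0
analytic = Q((y - mu) / sigma)  # P{Y > 5} for Y ~ N(3, 4), i.e., Q(1)

random.seed(0)
n = 200_000
mc = sum(random.gauss(mu, sigma) > y for _ in range(n)) / n
```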

Example: A Nonlinear Function

John is driving a distance of 180 miles at a constant speed that is uniformly distributed between 30 and 60 miles/hr. What is the pdf of the duration of the trip?

Solution: Let X be John's speed; then

f_X(x) = 1/30 if 30 ≤ x ≤ 60, and 0 otherwise.

The duration of the trip is Y = 180/X.

To find f_Y(y) we first find F_Y(y). Note that {Y ≤ y} = {X ≥ 180/y}, thus

F_Y(y) = 0 if y ≤ 3; ∫_{180/y}^{60} f_X(x) dx = 2 − 6/y if 3 < y ≤ 6; and 1 if y > 6.

Differentiating, we obtain

f_Y(y) = 6/y² if 3 ≤ y ≤ 6, and 0 otherwise.
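The derived cdf F_Y(y) = 2 − 6/y on (3, 6] can be verified by simulating the trip (a Monte Carlo sketch; the seed and sample size are arbitrary choices):

```python
import random

random.seed(0)

# Simulate Y = 180/X for X ~ U[30, 60] and compare the empirical cdf
# with the derived F_Y(y) = 2 - 6/y on (3, 6].
samples = [180 / random.uniform(30, 60) for _ in range(200_000)]

def F_emp(y):
    # Empirical cdf: fraction of simulated trip durations <= y.
    return sum(s <= y for s in samples) / len(samples)

analytic_at_4 = 2 - 6 / 4  # F_Y(4) = 1/2
```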

Monotonic Functions

Let g(x) be a monotonically increasing and differentiable function over its range. Then g is invertible, i.e., there exists a function h such that y = g(x) if and only if x = h(y). Often this is written as h = g⁻¹.

If g has these properties, then g(X) ≤ y iff X ≤ h(y), and we can write

F_Y(y) = P{g(X) ≤ y} = P{X ≤ h(y)} = F_X(h(y)) = ∫_{−∞}^{h(y)} f_X(x) dx.

Thus

f_Y(y) = dF_Y(y)/dy = f_X(h(y)) (dh/dy)(y).

Generalizing the result to both monotonically increasing and decreasing functions yields

f_Y(y) = f_X(h(y)) |dh(y)/dy|.

Example: Recall the X ∼ U[30, 60] example with Y = g(X) = 180/X. The inverse is X = h(Y) = 180/Y. Applying the above formula in the region of interest y ∈ [3, 6] (f_Y is 0 outside) yields

f_Y(y) = f_X(h(y)) |dh(y)/dy| = (1/30)(180/y²) = 6/y² for 3 ≤ y ≤ 6.

Another Example: Suppose X has a pdf that is nonzero only in [0, 1], and define Y = g(X) = X². In the region of interest the function is invertible, with X = h(Y) = √Y. Applying the pdf formula, we obtain

f_Y(y) = f_X(h(y)) |dh(y)/dy| = f_X(√y) / (2√y) for 0 < y ≤ 1.

Personally, I prefer the more fundamental cdf approach, since I often forget this formula or mess up the signs.
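As a concrete instance of this last formula (our choice of pdf, not the notes'), take X ∼ U[0, 1]. Then f_Y(y) = 1/(2√y) on (0, 1], so F_Y(y) = √y, which a simulation reproduces:

```python
import random

random.seed(0)

# For X ~ U[0, 1] and Y = X^2, the derived density gives F_Y(y) = sqrt(y).
samples = [random.random() ** 2 for _ in range(200_000)]

def F_emp(y):
    # Empirical cdf of the simulated Y values.
    return sum(s <= y for s in samples) / len(samples)
```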