OKAN UNIVERSITY FACULTY OF ENGINEERING AND ARCHITECTURE MATH 256 Probability and Random Processes 04 Multiple Random Variables Fall 2012 Yrd. Doç. Dr. Didem Kivanc Tureli didem@ieee.org didem.kivanc@okan.edu.tr

Mode and Median

MODE: the value which occurs most often. If there are two, three, or more values that have a high probability of occurring, the distribution is bimodal, trimodal, or multimodal.

MEDIAN: the value of x for which P(X < x) ≤ 0.5 and P(X > x) ≤ 0.5. In the case of a discrete distribution a unique median may not exist.
Moments

If X is a random variable with mean μ = E[X], then:

the k-th raw moment of X is  m_k = E[X^k]
the k-th central moment of X is  μ_k = E[(X − μ)^k]

Moment Generating Function

If X is a random variable, then the moment generating function of X is defined as

M_X(t) = E[e^{tX}] = Σ_i e^{t x_i} p(x_i)      (discrete case)
M_X(t) = E[e^{tX}] = ∫ e^{tx} f_X(x) dx        (continuous case)

Expanding the exponential as a power series,

M_X(t) = E[e^{tX}] = E[1 + tX + (tX)²/2! + … + (tX)^k/k! + …]
       = 1 + t E[X] + t² E[X²]/2! + … + t^k E[X^k]/k! + …
Why do we care about the Moment Generating Function?

M_X(t) = E[e^{tX}] = E[1 + tX + (tX)²/2! + … + (tX)^k/k! + …]
       = 1 + t E[X] + t² E[X²]/2! + … + t^k E[X^k]/k! + …

To find the moments, take the k-th derivative of the moment generating function and evaluate it at t = 0:

m_k = E[X^k] = M_X^{(k)}(0) = (d^k/dt^k) M_X(t) |_{t=0},   k = 1, 2, …

Characteristic Function

Ψ_X(ω) = E[e^{jωX}] = Σ_i e^{jω x_i} p(x_i)    (discrete case)
Ψ_X(ω) = E[e^{jωX}] = ∫ e^{jωx} f_X(x) dx      (continuous case)

where ω is a real variable and j is the square root of −1. Notice that if you know the moment generating function, you can find the characteristic function by setting t = jω. The probability density function is the inverse Fourier transform of the characteristic function:

f(x) = (1/2π) ∫ Ψ_X(ω) e^{−jωx} dω
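The derivative rule above can be checked symbolically. A minimal sketch, assuming sympy is available and using a fair-coin random variable X ∈ {0, 1} chosen purely for illustration (it is not an example from these slides):

```python
import sympy as sp

# Hypothetical discrete rv: fair coin X in {0, 1}, chosen for illustration.
t = sp.symbols('t')
pmf = {0: sp.Rational(1, 2), 1: sp.Rational(1, 2)}

# M_X(t) = E[e^{tX}] = sum_i e^{t x_i} p(x_i)
M = sum(sp.exp(t * xi) * p for xi, p in pmf.items())

# k-th raw moment: m_k = k-th derivative of M evaluated at t = 0
m1 = sp.diff(M, t, 1).subs(t, 0)   # E[X]   = 1/2
m2 = sp.diff(M, t, 2).subs(t, 0)   # E[X^2] = 1/2
print(m1, m2)                      # 1/2 1/2
```

Both moments come out to 1/2, as expected for a {0, 1} coin: E[X] = 0·½ + 1·½ and E[X²] = 0²·½ + 1²·½.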
Joint Distribution of Random Variables

Sometimes you want to study the relationship of two different random variables to each other. For instance, is there a correlation between height and grades? Is there a correlation between the number of hours someone sleeps and the amount of food they eat?

Independent Random Variables

Random variables X_1, …, X_n are independent if, for any two of the variables, the value of one gives us no information about the value of the other.

Example: X = height of a random person on the street, T = temperature outside. X and T are independent random variables.

The math (formally): X and Y are independent if for any two sets of numbers A and B,

P{X ∈ A, Y ∈ B} = P{X ∈ A} P{Y ∈ B}
Joint Probability Mass Function

The joint probability mass function of the discrete random variables X and Y, denoted f_XY(x, y), satisfies

(1) f_XY(x, y) ≥ 0
(2) Σ_x Σ_y f_XY(x, y) = 1
(3) f_XY(x, y) = P(X = x, Y = y)

Joint Probability Density Function

The joint probability density function of the continuous random variables X and Y, denoted f_XY(x, y), satisfies

(1) f_XY(x, y) ≥ 0
(2) ∫∫_{R²} f_XY(x, y) dx dy = 1
(3) For any region R of two-dimensional space, P((X, Y) ∈ R) = ∫∫_R f_XY(x, y) dx dy
Independent Random Variables

For independent random variables,

f_XY(x, y) = f_X(x) f_Y(y)

Joint Cumulative Distribution Function

The joint cumulative distribution function (or joint cdf) of X and Y, denoted F_XY(x, y), is the function defined by

F_XY(x, y) = P{X ≤ x, Y ≤ y}

If the events A and B below are independent for every x and y, then the random variables X and Y are independent:

A = {ζ ∈ S : X(ζ) ≤ x}
B = {ζ ∈ S : Y(ζ) ≤ y}
Marginal Probability Density Function

If the joint probability density function of random variables X and Y is f_XY(x, y), the marginal probability density functions of X and Y are

f_X(x) = ∫ f_XY(x, y) dy   and   f_Y(y) = ∫ f_XY(x, y) dx

Example

Let the random variable X denote the time until a computer server connects to your machine (in milliseconds) and let Y denote the time until the server authorizes you as a valid user (in milliseconds). Each of these random variables measures the wait from a common starting time, and X < Y. Assume that the joint probability density function for X and Y is

f_XY(x, y) = 6 × 10⁻⁶ exp(−0.001x − 0.002y)   for x < y.

Check that the integral of f_XY(x, y) is 1 over the entire region. Find P(X ≤ 1000, Y ≤ 2000). Find P(Y ≤ 2000).
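The first two parts of the example can be checked numerically. A sketch, assuming numpy and scipy are available (note scipy's `dblquad` takes the inner variable first, so the integrand is written as a function of (y, x)):

```python
import numpy as np
from scipy.integrate import dblquad

# Joint density from the example: 6e-6 * exp(-0.001x - 0.002y) on 0 < x < y.
f = lambda y, x: 6e-6 * np.exp(-0.001 * x - 0.002 * y)

# Total probability: x runs over (0, inf), and for each x, y runs over (x, inf).
total, _ = dblquad(f, 0, np.inf, lambda x: x, lambda x: np.inf)
print(round(total, 6))   # ≈ 1.0

# P(X <= 1000, Y <= 2000): x in (0, 1000), y in (x, 2000).
p, _ = dblquad(f, 0, 1000, lambda x: x, lambda x: 2000)
print(round(p, 4))       # ≈ 0.915
```

The first integral confirms the density is properly normalized; the second gives the requested joint probability.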
Conditional Probability Density Function

Given continuous random variables X and Y with joint probability density function f_XY(x, y), the conditional probability density function of Y given X = x is

f_{Y|x}(y) = f_XY(x, y) / f_X(x)   for f_X(x) > 0

Example continued: Find the conditional probability density function of Y given X = x, then calculate P(Y > 2000 | X = 1500).
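The continued example can also be checked numerically by building the conditional density directly from its definition. A sketch, assuming numpy and scipy are available:

```python
import numpy as np
from scipy.integrate import quad

# Joint density from the example, valid for x < y.
f_XY = lambda x, y: 6e-6 * np.exp(-0.001 * x - 0.002 * y)

x0 = 1500.0
# Marginal f_X(x0): integrate the joint density over y from x0 to infinity.
f_X, _ = quad(lambda y: f_XY(x0, y), x0, np.inf)

# P(Y > 2000 | X = 1500) = integral of f_XY(x0, y) / f_X(x0) over y > 2000.
p, _ = quad(lambda y: f_XY(x0, y) / f_X, 2000, np.inf)
print(round(p, 4))   # ≈ 0.3679, i.e. e^{-1}
```

Analytically the conditional density works out to a shifted exponential, 0.002 e^{−0.002(y − x)} for y > x, so the probability is e^{−0.002·500} = e^{−1}, matching the numeric result.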
Independent Random Variables

For independent random variables,

f_{Y|x}(y) = f_XY(x, y) / f_X(x) = f_X(x) f_Y(y) / f_X(x) = f_Y(y)

Conditional Mean and Variance

The conditional mean of Y given X = x, denoted E(Y | x) or μ_{Y|x}, is

E(Y | x) = ∫ y f_{Y|x}(y) dy

The conditional variance of Y given X = x, denoted V(Y | x) or σ²_{Y|x}, is

V(Y | x) = ∫ (y − μ_{Y|x})² f_{Y|x}(y) dy = ∫ y² f_{Y|x}(y) dy − μ²_{Y|x}
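Both formulas can be exercised on the server example. A sketch, assuming numpy and scipy are available, and using the conditional density 0.002 e^{−0.002(y − x)} for y > x (obtained by dividing the example's joint density by the marginal f_X(x) = 0.003 e^{−0.003x}):

```python
import numpy as np
from scipy.integrate import quad

# Conditional density of Y given X = x for the server example (y > x).
f_cond = lambda y, x: 0.002 * np.exp(-0.002 * (y - x))

x0 = 1500.0
# Conditional mean: E(Y | x0) = integral of y * f_{Y|x0}(y) dy over y > x0.
mu, _ = quad(lambda y: y * f_cond(y, x0), x0, np.inf)
# Conditional variance via the second form: E(Y^2 | x0) - mu^2.
m2, _ = quad(lambda y: y**2 * f_cond(y, x0), x0, np.inf)
var = m2 - mu**2

print(round(mu, 2))    # 2000.0 = x0 + 1/0.002
print(round(var, 1))   # 250000.0 = 1/0.002**2
```

Since the conditional distribution is an exponential with rate 0.002 shifted to start at x₀, the mean is x₀ + 1/0.002 and the variance is 1/0.002², which the quadrature reproduces.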
Example

Joint probability mass function of X and Y, where X is the number of bars of signal strength on your cellphone and Y is the number of times you need to repeat the name of a city to the airline operator.

        x = 1   x = 2   x = 3
y = 4   0.15    0.10    0.05
y = 3   0.02    0.10    0.05
y = 2   0.02    0.03    0.20
y = 1   0.01    0.02    0.25

The joint probability mass function is also sometimes called the joint distribution, so you had better be careful.

The sum of all of the probabilities is:
0.15 + 0.02 + 0.02 + 0.01 = 0.20
0.10 + 0.10 + 0.03 + 0.02 = 0.25
0.05 + 0.05 + 0.20 + 0.25 = 0.55
0.20 + 0.25 + 0.55 = 1
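The column sums above can be verified in a couple of lines. A sketch, assuming numpy is available:

```python
import numpy as np

# Joint pmf from the slide: rows are y = 4, 3, 2, 1; columns are x = 1, 2, 3.
p = np.array([[0.15, 0.10, 0.05],
              [0.02, 0.10, 0.05],
              [0.02, 0.03, 0.20],
              [0.01, 0.02, 0.25]])

col_sums = p.sum(axis=0)      # sum down each column (over y), one value per x
print(np.round(col_sums, 2))  # ≈ [0.20, 0.25, 0.55], as on the slide
print(round(p.sum(), 10))     # all entries together sum to 1
```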
Find the Joint Cumulative Distribution Function

Joint probability mass function (y = number of times city name is stated; x = number of bars of signal strength):

        x = 1   x = 2   x = 3
y = 4   0.15    0.10    0.05
y = 3   0.02    0.10    0.05
y = 2   0.02    0.03    0.20
y = 1   0.01    0.02    0.25

Each entry of the joint cdf is the sum of the pmf entries with x' ≤ x and y' ≤ y; for example F_XY(1, 4) = 0.20 and F_XY(2, 2) = 0.08.

Joint probability distribution function:

        x = 1   x = 2   x = 3
y = 4   0.20    0.45    1.00
y = 3   0.05    0.20    0.70
y = 2   0.03    0.08    0.53
y = 1   0.01    0.03    0.28
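The whole cdf table can be generated at once with cumulative sums. A sketch, assuming numpy is available; since the pmf rows are listed with y decreasing, the array is flipped before cumulating and flipped back afterwards:

```python
import numpy as np

# Joint pmf from the slide: rows are y = 4, 3, 2, 1; columns are x = 1, 2, 3.
pmf = np.array([[0.15, 0.10, 0.05],
                [0.02, 0.10, 0.05],
                [0.02, 0.03, 0.20],
                [0.01, 0.02, 0.25]])

# F(x, y) = P(X <= x, Y <= y): cumulate over y (rows) and over x (columns).
cdf = np.flipud(np.cumsum(np.cumsum(np.flipud(pmf), axis=0), axis=1))

print(np.round(cdf, 2))
# Top row (y = 4) is [0.20, 0.45, 1.00]; bottom row (y = 1) is [0.01, 0.03, 0.28].
```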
Find the Marginal Probability Mass Function

        x = 1   x = 2   x = 3   Pr(Y = y)
y = 4   0.15    0.10    0.05
y = 3   0.02    0.10    0.05
y = 2   0.02    0.03    0.20    ← Pr(Y = 2)
y = 1   0.01    0.02    0.25
                ↑ Pr(X = 2)     Pr(X = x)

The sum of the probabilities across the row y = 2 gives the probability that Y = 2. Likewise, the sum of the probabilities down the column x = 3 gives the probability that X = 3.
Find the Marginal Probability Mass Function

          x = 1   x = 2   x = 3   Pr(Y = y)
y = 4     0.15    0.10    0.05    0.30
y = 3     0.02    0.10    0.05    0.17
y = 2     0.02    0.03    0.20    0.25
y = 1     0.01    0.02    0.25    0.28
Pr(X = x) 0.20    0.25    0.55

The row sums give the marginal probability mass function p_Y(y); the column sums give the marginal probability mass function p_X(x).

Find the Conditional Probability Mass Function p_{X|Y}(x | Y = y)

Divide each row of the joint pmf by Pr(Y = y):

          x = 1       x = 2       x = 3       Pr(Y = y)
y = 4     0.15/0.30   0.10/0.30   0.05/0.30   0.30
y = 3     0.02/0.17   0.10/0.17   0.05/0.17   0.17
y = 2     0.02/0.25   0.03/0.25   0.20/0.25   0.25
y = 1     0.01/0.28   0.02/0.28   0.25/0.28   0.28
Pr(X = x) 0.20        0.25        0.55
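Both steps above are one-liners on the pmf array. A sketch, assuming numpy is available; broadcasting divides each row by its own marginal:

```python
import numpy as np

# Joint pmf from the slide: rows are y = 4, 3, 2, 1; columns are x = 1, 2, 3.
pmf = np.array([[0.15, 0.10, 0.05],
                [0.02, 0.10, 0.05],
                [0.02, 0.03, 0.20],
                [0.01, 0.02, 0.25]])

p_Y = pmf.sum(axis=1)   # row sums: marginal of Y -> [0.30, 0.17, 0.25, 0.28]
p_X = pmf.sum(axis=0)   # column sums: marginal of X -> [0.20, 0.25, 0.55]

# Conditional pmf p_{X|Y}(x | y): divide each row by its Pr(Y = y).
p_X_given_Y = pmf / p_Y[:, None]
print(np.round(p_X_given_Y, 4))   # each row now sums to 1
```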
Find the Conditional Probability Mass Function p_{Y|X}(y | X = x)

Divide each column of the joint pmf by Pr(X = x):

          x = 1       x = 2       x = 3       Pr(Y = y)
y = 4     0.15/0.20   0.10/0.25   0.05/0.55   0.30
y = 3     0.02/0.20   0.10/0.25   0.05/0.55   0.17
y = 2     0.02/0.20   0.03/0.25   0.20/0.55   0.25
y = 1     0.01/0.20   0.02/0.25   0.25/0.55   0.28
Pr(X = x) 0.20        0.25        0.55

Find the Conditional Mean E(X | Y = y)

Using p_{X|Y}(x | Y = y):

E[X | Y = 4] = (0.15/0.30)·1 + (0.10/0.30)·2 + (0.05/0.30)·3 = 1.6667
E[X | Y = 3] = (0.02/0.17)·1 + (0.10/0.17)·2 + (0.05/0.17)·3 = 2.1765
E[X | Y = 2] = (0.02/0.25)·1 + (0.03/0.25)·2 + (0.20/0.25)·3 = 2.7200
E[X | Y = 1] = (0.01/0.28)·1 + (0.02/0.28)·2 + (0.25/0.28)·3 = 2.8571
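These conditional means can be computed for all y at once. A sketch, assuming numpy is available:

```python
import numpy as np

# Joint pmf from the slide: rows are y = 4, 3, 2, 1; columns are x = 1, 2, 3.
pmf = np.array([[0.15, 0.10, 0.05],
                [0.02, 0.10, 0.05],
                [0.02, 0.03, 0.20],
                [0.01, 0.02, 0.25]])
x_vals = np.array([1, 2, 3])

# E[X | Y = y] = sum_x x * p(x, y) / Pr(Y = y), one value per row.
p_Y = pmf.sum(axis=1)
E_X_given_Y = (pmf * x_vals).sum(axis=1) / p_Y

print(np.round(E_X_given_Y, 4))
# y = 4, 3, 2, 1 -> [1.6667, 2.1765, 2.72, 2.8571]
```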
Find the Conditional Mean E(Y | X = x)

Using p_{Y|X}(y | X = x):

E[Y | X = 1] = (0.15/0.20)·4 + (0.02/0.20)·3 + (0.02/0.20)·2 + (0.01/0.20)·1 = 3.5500
E[Y | X = 2] = (0.10/0.25)·4 + (0.10/0.25)·3 + (0.03/0.25)·2 + (0.02/0.25)·1 = 3.1200
E[Y | X = 3] = (0.05/0.55)·4 + (0.05/0.55)·3 + (0.20/0.55)·2 + (0.25/0.55)·1 = 1.8182
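The same calculation works column-wise for E(Y | X = x). A sketch, assuming numpy is available and computing directly from the joint pmf table as printed:

```python
import numpy as np

# Joint pmf from the slide: rows are y = 4, 3, 2, 1; columns are x = 1, 2, 3.
pmf = np.array([[0.15, 0.10, 0.05],
                [0.02, 0.10, 0.05],
                [0.02, 0.03, 0.20],
                [0.01, 0.02, 0.25]])
y_vals = np.array([4, 3, 2, 1])

# E[Y | X = x] = sum_y y * p(x, y) / Pr(X = x), one value per column.
p_X = pmf.sum(axis=0)
E_Y_given_X = (pmf * y_vals[:, None]).sum(axis=0) / p_X

print(np.round(E_Y_given_X, 4))
# x = 1, 2, 3 -> [3.55, 3.12, 1.8182]
```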