Chapter: The Random Variable

The outcome of a random experiment need not be a number; for example, tossing a coin or selecting a colored ball from a box. However, we are usually interested not in the outcome itself, but rather in some measurement or numerical attribute of the outcome.

Examples:
- In tossing a coin 3 times we may be interested in the total number of heads, and not in the specific order in which heads and tails occur.
- In selecting a student name from an urn (box) we may be interested in the weight of the student.

In each of these examples, a numerical value is assigned to the outcome. Consider the experiment of tossing a coin 3 times and observing the number of heads (which is a numeric value).
The Random Variable Concept

A random variable X is a function that assigns a real number X(ε) to each outcome ε in the sample space.

Example: Consider the experiment of rolling a die and tossing a coin. Let X be a random variable that maps each head outcome to the positive number corresponding to the dots showing on the die, and each tail outcome to the negative number equal in magnitude to twice the number showing on the die.

Random variables can be:
- Discrete: flipping a coin 3 times and counting the number of heads; selecting a number from the positive integers; the number of cars arriving at gas station A.
- Continuous: selecting a number between 0 and 6, i.e. from the interval {0 ≤ x ≤ 6}.
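As a sketch, the die-and-coin mapping above can be simulated directly. The function name sample_X and the use of Python's random module are illustrative assumptions, not part of the text:

```python
import random

def sample_X():
    """One trial of the joint experiment: roll a die and toss a coin.

    A head maps to +dots, a tail maps to -(2 * dots), as in the example.
    """
    dots = random.randint(1, 6)       # die roll: 1..6
    heads = random.random() < 0.5     # fair coin
    return dots if heads else -2 * dots

# The range of X is {1,...,6} together with {-2,-4,...,-12}
values = {sample_X() for _ in range(10_000)}
print(sorted(values))
```

Note that X is single-valued (condition discussed below): each outcome (coin face, die face) maps to exactly one real number, although no outcome here maps to, say, 0.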
Conditions for a Function to be a Random Variable

The random variable may be any function that satisfies the following conditions:

(1) The random variable function may map more than one point in S into the same point on the real line, but it cannot be multivalued.

(2) For every x, the set {X ≤ x} corresponds to a set of points in S:
    {X ≤ x} = {ε_i : X(ε_i) ≤ x}.
For instance, if the equivalent event is {ε_i, ε_j, ε_k}, then
    P{X ≤ x} = P{ε_i, ε_j, ε_k} = P(ε_i) + P(ε_j) + P(ε_k).

Example: Toss a coin 3 times and observe the number of heads X.
    Equivalent event: {X ≤ 1} = {HTT, THT, TTH, TTT}
    Equality: P{X ≤ 1} = P(HTT) + P(THT) + P(TTH) + P(TTT)
(3) P{X = ∞} = 0 and P{X = −∞} = 0.
This condition does not prevent X(ε) from being +∞ or −∞ for some values of ε in S; it only requires that the probability of the set of those ε be zero.

Distribution Function

A random variable maps the outcomes of a random experiment to numeric values on the real line.
Example: Flipping a coin 3 times and counting the number of heads.
We can describe the distribution of the R.V. X using the probabilities P{X = x}.
We will define two more functions of the random variable which will finally help us calculate probabilities.

Distribution Function

We define the cumulative probability distribution function
    F_X(x) = P{X ≤ x},   −∞ < x < ∞.

In flipping the coin 3 times and counting the number of heads,
    F_X(2) = P{X ≤ 2} = P{X = 0} + P{X = 1} + P{X = 2} = 1/8 + 3/8 + 3/8 = 7/8.

Example: Let X take the values {0, 1, 2, 3} with
    P(X = 0) = P(X = 3) = 1/8 and P(X = 1) = P(X = 2) = 3/8.
Then
    F_X(0) = P(X ≤ 0) = 1/8
    F_X(1) = P(X ≤ 1) = P(X = 0) + P(X = 1) = 1/8 + 3/8 = 1/2
    F_X(2) = P(X ≤ 2) = P(X = 0) + P(X = 1) + P(X = 2) = 1/8 + 3/8 + 3/8 = 7/8
    F_X(3) = P(X ≤ 3) = P(X = 0) + P(X = 1) + P(X = 2) + P(X = 3) = 1/8 + 3/8 + 3/8 + 1/8 = 1
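A minimal sketch of this calculation, using Python's fractions module for exact arithmetic (the dictionary pmf and the function name F are illustrative choices, not part of the text):

```python
from fractions import Fraction

# Probabilities P{X = k} for X = number of heads in 3 fair coin tosses
pmf = {0: Fraction(1, 8), 1: Fraction(3, 8), 2: Fraction(3, 8), 3: Fraction(1, 8)}

def F(x, pmf=pmf):
    """Cumulative distribution F_X(x) = P{X <= x}: sum the mass at or below x."""
    return sum(p for k, p in pmf.items() if k <= x)

print(F(2))  # 7/8, matching the hand calculation above
```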
Distribution Function Properties

1. F_X(−∞) = 0
2. F_X(∞) = 1
3. 0 ≤ F_X(x) ≤ 1
4. F_X(x1) ≤ F_X(x2) if x1 < x2  (non-decreasing)
5. P{x1 < X ≤ x2} = F_X(x2) − F_X(x1)
6. F_X(x+) = F_X(x)  (right continuous)

The example above (X = {0, 1, 2, 3} with P(X = 0) = P(X = 3) = 1/8 and P(X = 1) = P(X = 2) = 3/8) illustrates these properties: F_X is non-decreasing, rising from 1/8 at x = 0 up to 1 at x = 3.
Let us consider the experiment of tossing the coin 3 times and observing the number of heads X. The probabilities and the distribution function are shown below.

The stair-step distribution function can be written as a sum of weighted unit-step functions:
    F_X(x) = P(X = 0) u(x) + P(X = 1) u(x − 1) + P(X = 2) u(x − 2) + P(X = 3) u(x − 3).
In general,
    F_X(x) = Σ_{i=1}^{N} P(X = x_i) u(x − x_i),
where N can be infinite.

Density Function

We define the derivative of the distribution function F_X(x) as the probability density function f_X(x):
    f_X(x) = dF_X(x)/dx.
We call f_X(x) the density function of the R.V. X.

For our discrete R.V., since
    F_X(x) = Σ_{i=1}^{N} P(X = x_i) u(x − x_i),
we get
    f_X(x) = d/dx Σ_{i=1}^{N} P(X = x_i) u(x − x_i)
           = Σ_{i=1}^{N} P(X = x_i) d/dx u(x − x_i)
           = Σ_{i=1}^{N} P(X = x_i) δ(x − x_i).
Properties of Density Function

1. f_X(x) ≥ 0 for all x  (the density is the non-negative derivative of a non-decreasing function)
2. ∫_{−∞}^{∞} f_X(x) dx = 1

Properties (1) and (2) are used to test whether a given function can be a valid density function.

3. F_X(x) = ∫_{−∞}^{x} f_X(ξ) dξ  (follows from the definition, since F_X(−∞) = 0)
4. P{x1 < X ≤ x2} = F_X(x2) − F_X(x1)
                  = ∫_{−∞}^{x2} f_X(x) dx − ∫_{−∞}^{x1} f_X(x) dx
                  = ∫_{x1}^{x2} f_X(x) dx
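These properties can be checked numerically for a concrete density. The sketch below uses the exponential density f(x) = e^(−x) for x ≥ 0 (chosen here purely for illustration) and a simple midpoint-rule integrator:

```python
import math

# Density under test: exponential, f(x) = e^{-x} for x >= 0 (illustrative choice)
f = lambda x: math.exp(-x) if x >= 0 else 0.0

def integrate(g, a, b, n=100_000):
    """Midpoint-rule numerical integration of g over [a, b]."""
    h = (b - a) / n
    return sum(g(a + (i + 0.5) * h) for i in range(n)) * h

total = integrate(f, 0.0, 50.0)               # property (2): should be ~1
F = lambda x: integrate(f, 0.0, max(x, 0.0))  # property (3): F built from f
p = F(2.0) - F(1.0)                           # property (4): P{1 < X <= 2}
print(total, p)
```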
The Gaussian Random Variable

A random variable X is called Gaussian if its density function has the form
    f_X(x) = (1/√(2πσ²)) e^{−(x−a)²/(2σ²)},
where σ > 0 and −∞ < a < ∞ are real constants (we will see their significance when we discuss the mean and variance later). The spread about the point x = a is related to σ.

The Gaussian density is the most important of all densities. It accurately describes many practical and significant real-world quantities such as noise.

The distribution function is found from
    F_X(x) = ∫_{−∞}^{x} f_X(ξ) dξ = (1/√(2πσ²)) ∫_{−∞}^{x} e^{−(ξ−a)²/(2σ²)} dξ.
This integral has no known closed-form solution and must be evaluated by numerical or approximation methods.
However, to evaluate
    F_X(x) = (1/√(2πσ²)) ∫_{−∞}^{x} e^{−(ξ−a)²/(2σ²)} dξ
numerically for a given x, we need σ and a.

Example: Let σ = 3 and a = 5; then
    F_X(x) = (1/√(2π·3²)) ∫_{−∞}^{x} e^{−(ξ−5)²/(2·3²)} dξ.
We can then construct a table for various values of x, e.g.
    F_X(0) = (1/√(2π·3²)) ∫_{−∞}^{0} e^{−(ξ−5)²/(2·3²)} dξ   (evaluate numerically)
    F_X(6) = (1/√(2π·3²)) ∫_{−∞}^{6} e^{−(ξ−5)²/(2·3²)} dξ   (evaluate numerically)

However, there is a problem: the resulting table works only for the Gaussian distribution with σ = 3 and a = 5, and not all Gaussian distributions have these values. Since the combinations of a and σ are infinite (uncountably infinite), we would need uncountably many tables — an impractical method.
Instead, the general distribution function
    F_X(x) = (1/√(2πσ²)) ∫_{−∞}^{x} e^{−(ξ−a)²/(2σ²)} dξ
can be found in terms of the normalized Gaussian distribution function F(x), i.e. the case a = 0, σ = 1:
    f(x) = (1/√(2π)) e^{−x²/2},   F(x) = (1/√(2π)) ∫_{−∞}^{x} e^{−ξ²/2} dξ.
Making the variable change u = (ξ − a)/σ in F_X(x), we obtain
    F_X(x) = (1/√(2π)) ∫_{−∞}^{(x−a)/σ} e^{−u²/2} du,
which is clearly equivalent to
    F_X(x) = F((x − a)/σ).
Thus a single table of the normalized F(x) suffices for every a and σ.
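A quick numerical sketch of this normalization, implementing the normalized F(x) with Python's math.erf (the function names F_std and F_gauss are my own, not from the text):

```python
import math

def F_std(x):
    """Normalized Gaussian CDF (a = 0, sigma = 1), via the error function."""
    return 0.5 * (1.0 + math.erf(x / math.sqrt(2.0)))

def F_gauss(x, a, sigma):
    """General Gaussian CDF through the change of variable u = (x - a) / sigma."""
    return F_std((x - a) / sigma)

# The sigma = 3, a = 5 example: one normalized table serves every (a, sigma)
print(F_gauss(5.0, 5.0, 3.0))  # 0.5 by symmetry, since x = a
print(F_gauss(8.0, 5.0, 3.0) == F_std(1.0))  # one sigma above the center
```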
Other Distribution and Density Examples

Binomial

Let 0 < p < 1 and N = 1, 2, ...; then the function
    f_X(x) = Σ_{k=0}^{N} (N choose k) p^k (1 − p)^{N−k} δ(x − k)
is called the binomial density function, where
    (N choose k) = N! / (k!(N − k)!)
is the binomial coefficient.

The binomial density can be applied to:
- the Bernoulli trial experiment,
- games of chance,
- detection problems in radar and sonar.

(Figure: binomial density function for N = 6 and p = 0.5.)
The binomial density applies to experiments that have only two possible outcomes ({H, T}, {0, 1}, {Target, No Target}) on any given trial. Given N independent trials of such an experiment, it answers the question: what is the probability of k successes out of these N trials?

The binomial distribution function is
    F_X(x) = Σ_{k=0}^{N} (N choose k) p^k (1 − p)^{N−k} u(x − k).

(Figure: binomial density and distribution functions for N = 6 and p = 0.5.)
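The impulse weights of the binomial density are easy to tabulate; a short sketch using math.comb (the variable names are illustrative):

```python
from math import comb

def binom_pmf(k, N, p):
    """P{X = k}: the weight of the impulse at x = k in the binomial density."""
    return comb(N, k) * p**k * (1 - p)**(N - k)

N, p = 6, 0.5
pmf = [binom_pmf(k, N, p) for k in range(N + 1)]
cdf = [sum(pmf[:k + 1]) for k in range(N + 1)]  # stair-step F_X at x = 0..N
print(pmf)       # weights of the 7 impulses; they sum to 1
print(cdf[-1])   # F_X(N) = 1
```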
Poisson

The Poisson random variable X has density and distribution
    f_X(x) = e^{−b} Σ_{k=0}^{∞} (b^k / k!) δ(x − k)
    F_X(x) = e^{−b} Σ_{k=0}^{∞} (b^k / k!) u(x − k),
where b > 0 is a real constant.

The Poisson density is the limit of the binomial density as N → ∞ and p → 0 while Np = b remains constant.

The Poisson RV applies to a wide variety of counting-type applications:
- the number of defective units in a production line,
- the number of telephone calls made during a period of time,
- the number of electrons emitted from a small section of a cathode in a given time interval.
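The limiting relationship can be observed numerically: holding Np = b fixed while N grows, the binomial weights approach the Poisson weights. A sketch (b = 2 is an arbitrary illustrative choice):

```python
from math import comb, exp, factorial

def binom_pmf(k, N, p):
    return comb(N, k) * p**k * (1 - p)**(N - k)

def poisson_pmf(k, b):
    return exp(-b) * b**k / factorial(k)

b = 2.0
diffs = []
for N in (10, 100, 1000):
    p = b / N    # keep Np = b constant as N grows and p shrinks
    diffs.append(max(abs(binom_pmf(k, N, p) - poisson_pmf(k, b))
                     for k in range(10)))
print(diffs)     # the maximum difference shrinks as N increases
```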
Rayleigh

The Rayleigh density and distribution functions are
    f_X(x) = (2/b)(x − a) e^{−(x−a)²/b}   for x ≥ a,  and 0 for x < a,
    F_X(x) = 1 − e^{−(x−a)²/b}            for x ≥ a,  and 0 for x < a,
for real constants −∞ < a < ∞ and b > 0.

Conditional Distribution and Density Functions

For two events A and B, the conditional probability of event A given that event B has occurred was defined as
    P(A|B) = P(A ∩ B) / P(B).
We now extend the concept of conditional probability to include random variables.
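Before moving on, the Rayleigh pair above can be sanity-checked numerically: integrating the density by the midpoint rule should reproduce the closed-form distribution. The constants a = 0 and b = 2 are arbitrary illustrative choices:

```python
import math

def rayleigh_pdf(x, a=0.0, b=2.0):
    """Rayleigh density; a = 0 and b = 2 chosen only for this check."""
    return (2.0 / b) * (x - a) * math.exp(-((x - a) ** 2) / b) if x >= a else 0.0

def rayleigh_cdf(x, a=0.0, b=2.0):
    return 1.0 - math.exp(-((x - a) ** 2) / b) if x >= a else 0.0

# Midpoint integration of f_X over [0, 3] should match F_X(3)
n, hi = 200_000, 3.0
h = hi / n
approx = sum(rayleigh_pdf((i + 0.5) * h) for i in range(n)) * h
print(approx, rayleigh_cdf(hi))  # the two values agree closely
```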
Conditional Distribution

Let X be a random variable and define the event A = {X ≤ x}. We define the conditional distribution function
    F_X(x|B) = P{X ≤ x | B} = P{X ≤ x ∩ B} / P(B).

Properties of the Conditional Distribution

(1) F_X(−∞|B) = 0
    Proof: F_X(−∞|B) = P{X ≤ −∞ ∩ B} / P(B) = 0 / P(B) = 0.
(2) F_X(∞|B) = 1
    Proof: F_X(∞|B) = P{X ≤ ∞ ∩ B} / P(B) = P(B) / P(B) = 1.
(3) 0 ≤ F_X(x|B) ≤ 1
(4) F_X(x1|B) ≤ F_X(x2|B) if x1 < x2  (non-decreasing)
(5) P{x1 < X ≤ x2 | B} = F_X(x2|B) − F_X(x1|B)
(6) F_X(x+|B) = F_X(x|B)  (right continuous)
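A small numeric sketch of this definition, conditioning the three-coin-toss variable on the event B = {X ≥ 1} (at least one head). The choice of B and all names here are illustrative assumptions:

```python
from fractions import Fraction

# X = number of heads in 3 fair tosses; condition on B = {X >= 1}
pmf = {0: Fraction(1, 8), 1: Fraction(3, 8), 2: Fraction(3, 8), 3: Fraction(1, 8)}
P_B = sum(p for k, p in pmf.items() if k >= 1)   # P(B) = 7/8

def F_cond(x):
    """F_X(x | B) = P{X <= x and B} / P(B)."""
    return sum(p for k, p in pmf.items() if 1 <= k <= x) / P_B

print(F_cond(1), F_cond(3))  # 3/7 and 1
```

Note how conditioning renormalizes the remaining mass so that property (2) still holds: F_cond reaches 1 at the largest value of X in B.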
Conditional Density Functions

We define the conditional density function of the random variable X as the derivative of the conditional distribution function:
    f_X(x|B) = dF_X(x|B)/dx.
If F_X(x|B) contains step discontinuities, as when X is discrete or mixed (continuous and discrete), then f_X(x|B) will contain impulse functions.

Properties of the Conditional Density

(1) f_X(x|B) ≥ 0
(2) ∫_{−∞}^{∞} f_X(x|B) dx = 1
(3) F_X(x|B) = ∫_{−∞}^{x} f_X(ξ|B) dξ
(4) P{x1 < X ≤ x2 | B} = ∫_{x1}^{x2} f_X(x|B) dx
Next we define the conditioning event as B = {X ≤ b}, where b is a real number with P{X ≤ b} ≠ 0. Then
    F_X(x|B) = P{X ≤ x | X ≤ b} = P{X ≤ x ∩ X ≤ b} / P{X ≤ b}.

Case 1: x ≥ b. Here {X ≤ x} ∩ {X ≤ b} = {X ≤ b}, so
    F_X(x|B) = P{X ≤ b} / P{X ≤ b} = 1.

Case 2: x < b. Here {X ≤ x} ∩ {X ≤ b} = {X ≤ x}, so
    F_X(x|B) = P{X ≤ x} / P{X ≤ b} = F_X(x) / F_X(b).

Combining the two cases:
    F_X(x|B) = F_X(x)/F_X(b)   for x < b,
    F_X(x|B) = 1               for x ≥ b.
Thus the conditional distribution F_X(x|B) is never smaller than the ordinary distribution F_X(x):
    F_X(x|B) ≥ F_X(x).

The conditional density function follows by differentiation:
    f_X(x|B) = dF_X(x|B)/dx
             = f_X(x)/F_X(b) = f_X(x) / ∫_{−∞}^{b} f_X(ξ) dξ   for x < b,
             = 0                                                for x ≥ b.
Similarly, wherever it is nonzero the conditional density is never smaller than the ordinary density:
    f_X(x|B) ≥ f_X(x)   for x < b.
Example: Two boxes contain red, green, and blue balls:

             B1     B2    Totals
  Red         5     80      85
  Green      35     60      95
  Blue       60     10      70
  Totals    100    150     250

Assume P(B1) = 2/10 and P(B2) = 8/10; B1 and B2 are mutually exclusive events.

Example: Let X be a random variable with an exponential probability density function given as
    f_X(x) = e^{−x}   for x ≥ 0,  and 0 for x < 0.
Find the probability P{X < 1 | X ≤ 2}.

Since
    f_X(x | X ≤ 2) = f_X(x) / ∫_{−∞}^{2} f_X(x) dx = e^{−x} / (1 − e^{−2})   for x < 2,  and 0 for x ≥ 2,
we get
    P{X < 1 | X ≤ 2} = ∫_{0}^{1} f_X(x | X ≤ 2) dx
                     = ∫_{0}^{1} e^{−x} dx / ∫_{0}^{2} e^{−x} dx
                     = (1 − e^{−1}) / (1 − e^{−2})
                     ≈ 0.731
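The final number checks out directly, since both integrals of e^(−x) have closed forms:

```python
import math

# P{X < 1 | X <= 2} for the exponential density f_X(x) = e^{-x}, x >= 0
numer = 1.0 - math.exp(-1.0)   # integral of e^{-x} over (0, 1)
denom = 1.0 - math.exp(-2.0)   # integral of e^{-x} over (0, 2), i.e. F_X(2)
print(numer / denom)           # approximately 0.731
```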